Synaptic-Flow
Algorithm 1 loop
Hello, I have some questions about your code.
-
I could not figure out where the loop from Algorithm 1 of the paper lives in the code. Is it the loop at line 53 of multishot.py, or the prune_loop function? Both loops compute a sparsity exponentiated by a term of the form k/n.
-
My goal is to remove 90% of the parameters from my custom network with SynFlow. What should the values of compression_list, level_list, and prune_epochs be? pre_epochs should be 0, right? I ran a test that just prints the sparsity values from multishot.py, using compression_list = [1] (assuming 10^1 in your notation equals 90%), level_list = [20], and prune_epochs = 1. Looking at the printed sparsity values, it seemed this would remove more than 90% (by the third iteration it had already gone past that). My test and results are below:
```python
schedule = "exponential"  # default
compression_list = [1]    # 10^1 or 90%
level_list = [20]         # default = []
prune_epochs = 1          # default

W = 1000
for compression in compression_list:
    for level in level_list:
        for l in range(level):
            # Prune Model
            sparsity = (10 ** (-float(compression))) ** ((l + 1) / level)
            # Prune model (summarized prune_loop function)
            for epoch in range(prune_epochs):
                if schedule == "exponential":
                    sparse = sparsity ** ((epoch + 1) / prune_epochs)
                elif schedule == "linear":
                    sparse = 1.0 - (1.0 - sparsity) * ((epoch + 1) / prune_epochs)
                prune = round(W * sparse)
                W_ = W - prune
                print("W: {:.0f}\tSparsity: {:.2f}\tRemoving: {:.0f}\tRemaining:{:.0f}".format(W, sparse, prune, W_))
                W = W_
```

Output:

```
W: 1000  Sparsity: 0.89  Removing: 891  Remaining:109
W: 109   Sparsity: 0.79  Removing: 87   Remaining:22
W: 22    Sparsity: 0.71  Removing: 16   Remaining:6
W: 6     Sparsity: 0.63  Removing: 4    Remaining:2
W: 2     Sparsity: 0.56  Removing: 1    Remaining:1
W: 1     Sparsity: 0.50  Removing: 1    Remaining:0
W: 0     Sparsity: 0.45  Removing: 0    Remaining:0
W: 0     Sparsity: 0.40  Removing: 0    Remaining:0
W: 0     Sparsity: 0.35  Removing: 0    Remaining:0
W: 0     Sparsity: 0.32  Removing: 0    Remaining:0
W: 0     Sparsity: 0.28  Removing: 0    Remaining:0
W: 0     Sparsity: 0.25  Removing: 0    Remaining:0
W: 0     Sparsity: 0.22  Removing: 0    Remaining:0
W: 0     Sparsity: 0.20  Removing: 0    Remaining:0
W: 0     Sparsity: 0.18  Removing: 0    Remaining:0
W: 0     Sparsity: 0.16  Removing: 0    Remaining:0
W: 0     Sparsity: 0.14  Removing: 0    Remaining:0
W: 0     Sparsity: 0.13  Removing: 0    Remaining:0
W: 0     Sparsity: 0.11  Removing: 0    Remaining:0
W: 0     Sparsity: 0.10  Removing: 0    Remaining:0
```
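For comparison, here is the same schedule under the other interpretation I can think of: that `sparsity` is the fraction of the *original* parameter count that is kept at each outer step, rather than a fraction removed from the currently remaining weights. This is only my assumption about what prune_loop intends, not something I confirmed from the code, but under this reading the final step lands at exactly 10% remaining, i.e. 90% removed:

```python
# Sketch assuming `sparsity` = fraction of the ORIGINAL weights kept
# at each step (not compounded onto the remaining weights).
W = 1000            # toy total of prunable parameters
compression = 1.0   # 10^-1 -> keep 10% at the end
level = 20

for l in range(level):
    sparsity = (10 ** (-compression)) ** ((l + 1) / level)
    remaining = round(W * sparsity)  # keep this fraction of the original W
    print("Step {:2d}\tSparsity: {:.2f}\tRemaining: {:d}".format(
        l + 1, sparsity, remaining))

# The last step uses sparsity = 0.10, so 100 of 1000 weights remain.
```

If this is the intended semantics, compression_list = [1] with level_list = [20] would end at exactly 90% removed, and my test above over-prunes only because it re-applies each step's sparsity to the already-pruned count.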