How to increase # of Layers with CfC or LTC Cell
Hi, this is my model where I want to use two CfC layers. It works fine with one layer, but when I add a second layer the code below gives an error. Please tell me how I can stack the layers.
Build the CfC model
wiring = wirings.FullyConnected(25, 6)  # 25 total neurons and 6 outputs
model = tf.keras.models.Sequential([
    tf.keras.layers.InputLayer(input_shape=(4, 30)),    # 4 lookback time steps, 30 features
    CfC(wiring, return_sequences=True),                  # CfC Layer 1
    CfC(wiring, return_sequences=False),                 # CfC Layer 2
    tf.keras.layers.Dense(units=6, activation='relu')    # Output 6 predictions (for 6 wind sites)
])
Compile the model
model.compile(optimizer=tf.keras.optimizers.Adam(0.001), loss='mean_squared_error')
model.summary()
Error:
File ~\anaconda\Lib\site-packages\ncps\wirings\wirings.py:39, in Wiring.build(self, input_dim)
     37 def build(self, input_dim):
     38     if not self.input_dim is None and self.input_dim != input_dim:
---> 39         raise ValueError(
     40             "Conflicting input dimensions provided. set_input_dim() was called with {} but actual input has dimension {}".format(
     41                 self.input_dim, input_dim
     42             )
     43         )
     44     if self.input_dim is None:
     45         self.set_input_dim(input_dim)

ValueError: Conflicting input dimensions provided. set_input_dim() was called with 30 but actual input has dimension 6
The error occurs because both layers share the same wiring object: the first CfC layer builds it with input dimension 30, and the second layer then tries to build the same wiring with the 6-dimensional output of the first layer. You can fix this by using two wirings. If you want to stack them, I'd suggest the following setup (though you could also keep the original structure of 25 units and 6 outputs for wiring1):
wiring1 = wirings.FullyConnected(25)     # 25 total neurons, 25 outputs (defaults to the number of units)
wiring2 = wirings.FullyConnected(25, 6)  # 25 total neurons, 6 outputs

model = tf.keras.models.Sequential([
    tf.keras.layers.InputLayer(input_shape=(4, 30)),    # 4 lookback time steps, 30 features
    CfC(wiring1, return_sequences=True),                 # CfC Layer 1
    CfC(wiring2, return_sequences=False),                # CfC Layer 2
    tf.keras.layers.Dense(units=6, activation='relu')    # Output 6 predictions (for 6 wind sites)
])
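For reference, here is a minimal end-to-end sketch of the stacked setup, assuming the usual ncps imports (from ncps import wirings, from ncps.tf import CfC) and random dummy data just to confirm the shapes flow through:

import numpy as np
import tensorflow as tf
from ncps import wirings
from ncps.tf import CfC

wiring1 = wirings.FullyConnected(25)     # 25 neurons, 25 outputs
wiring2 = wirings.FullyConnected(25, 6)  # 25 neurons, 6 outputs

model = tf.keras.models.Sequential([
    tf.keras.layers.InputLayer(input_shape=(4, 30)),    # 4 lookback time steps, 30 features
    CfC(wiring1, return_sequences=True),                 # output: (batch, 4, 25)
    CfC(wiring2, return_sequences=False),                # output: (batch, 6)
    tf.keras.layers.Dense(units=6, activation='relu')    # 6 predictions (for 6 wind sites)
])

model.compile(optimizer=tf.keras.optimizers.Adam(0.001), loss='mean_squared_error')
model.summary()

# Sanity check: input (batch, 4, 30) -> output (batch, 6)
x = np.random.rand(8, 4, 30).astype(np.float32)
print(model(x).shape)  # (8, 6)

Because each layer now builds its own wiring, the input-dimension conflict from the traceback no longer occurs.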