RSNNS
Jordan segfault when used with more than one layer
When called with more than one hidden layer, e.g. size=c(2,2), the function exits with a segmentation fault. It should instead report an error stating that Jordan networks with more than one hidden layer don't make much sense, as their feedback comes directly from the output layer.
library(RSNNS)
data(snnsData)
inputs <- snnsData$laser_1000.pat[,inputColumns(snnsData$laser_1000.pat)]
outputs <- snnsData$laser_1000.pat[,outputColumns(snnsData$laser_1000.pat)]
patterns <- splitForTrainingAndTest(inputs, outputs, ratio=0.15)
modelJordan <- jordan(patterns$inputsTrain, patterns$targetsTrain,
                      size=c(2,2), learnFuncParams=c(0.1), maxit=100,
                      inputsTest=patterns$inputsTest, targetsTest=patterns$targetsTest,
                      linOut=FALSE)
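As a possible caller-side workaround until the package validates its input, the sketch below guards against multi-element size vectors before calling jordan(), assuming the segfault only occurs when length(size) > 1 as described above; the hiddenSize variable is just illustrative and reuses the patterns object from the reproduction code.

# Hypothetical guard: fall back to a single hidden layer, since a Jordan
# network's feedback comes directly from the output layer anyway.
hiddenSize <- c(2,2)
if (length(hiddenSize) > 1) {
  warning("Jordan networks use a single hidden layer; keeping only the first size element.")
  hiddenSize <- hiddenSize[1]
}
modelJordan <- jordan(patterns$inputsTrain, patterns$targetsTrain,
                      size=hiddenSize, learnFuncParams=c(0.1), maxit=100,
                      inputsTest=patterns$inputsTest, targetsTest=patterns$targetsTest,
                      linOut=FALSE)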