
Outputs are being mutated when they shouldn't

christianechevarria opened this issue 6 years ago • 14 comments

Describe the bug

Output is not remaining normalized.

To Reproduce

Current settings:

Methods.mutation.MOD_ACTIVATION.mutateOutput = false;
Methods.mutation.SWAP_NODES.mutateOutput = false;

Expected behavior

  • Outputs remain normalized when mutateOutput is set to false.
  • Default behavior is to keep outputs normalized.

Screenshots

(screenshot attached: Screenshot 2019-05-04 at 10 26 12 AM)

christianechevarria avatar May 04 '19 14:05 christianechevarria

Looks like this is happening because there's currently no check or differentiation between input / hidden nodes and output nodes. For example, this is the current code inside network.mutate:

case mutation.MOD_ACTIVATION:
        // Has no effect on input nodes, so they (should be, but aren't necessarily) excluded
        if (!method.mutateOutput && this.input + this.output === this.nodes.length) {
          if (config.warnings) console.warn('No nodes that allow mutation of activation function');
          break;
        }

        var index = Math.floor(Math.random() * (this.nodes.length - (method.mutateOutput ? 0 : this.output) - this.input) + this.input);
        var node = this.nodes[index];

        node.mutate(method);
        break;

We're randomly picking an index and not checking to see if the selected node is an input or output neuron which, of course, results in some probability of output neurons being mutated.

For reference here is the SWAP_NODES case too:

case mutation.SWAP_NODES:
        // Has no effect on input node, so they (should be) excluded
        if ((method.mutateOutput && this.nodes.length - this.input < 2) ||
          (!method.mutateOutput && this.nodes.length - this.input - this.output < 2)) {
          if (config.warnings) console.warn('No nodes that allow swapping of bias and activation function');
          break;
        }

        var index = Math.floor(Math.random() * (this.nodes.length - (method.mutateOutput ? 0 : this.output) - this.input) + this.input);
        var node1 = this.nodes[index];
        index = Math.floor(Math.random() * (this.nodes.length - (method.mutateOutput ? 0 : this.output) - this.input) + this.input);
        var node2 = this.nodes[index];

        var biasTemp = node1.bias;
        var squashTemp = node1.squash;

        node1.bias = node2.bias;
        node1.squash = node2.squash;
        node2.bias = biasTemp;
        node2.squash = squashTemp;
        break;
    }

Some possible solutions:

  • Add a check for node.type and retry with a new index whenever an output node is selected. Easy to implement, but not the most performant.
  • Re-architect Network to house input & output neurons within Network.input & Network.output respectively. This cuts down on checks during various for loops and still allows us to treat all 3 arrays (input, output, and hidden) as one for situations that require that behavior. More performant, but also more complex to implement and maintain.

Any input on this would be highly appreciated!

christianechevarria avatar May 04 '19 17:05 christianechevarria

This one should fix it: ea01662193472e69b41cd0088db9d344aa90b68e

There was a stray location variable that was removed in: 026202076f51b8f83d3c368bfc005f723cbb5537

christianechevarria avatar May 06 '19 20:05 christianechevarria

@christianechevarria I'm still getting values greater than 1, so it's not totally fixed.

dan-ryan avatar May 08 '19 09:05 dan-ryan

@dan-ryan Are you still having this issue?

I can't replicate it on my end anymore.

Code snippet?

luiscarbonell avatar May 22 '19 18:05 luiscarbonell

Isn't the code I gave before doing it? I didn't hit it last night with my latest script. I'll try to reproduce it.

dan-ryan avatar May 22 '19 20:05 dan-ryan

Hey guys! Just wanted to see if there were any updates on the status of this as a bug?

christianechevarria avatar Jun 03 '19 03:06 christianechevarria

I'll try the latest changes and see if it has changed anything.

dan-ryan avatar Jun 03 '19 03:06 dan-ryan

Since I can't run the latest build without crashing while evolving, testing will have to wait.

dan-ryan avatar Jun 06 '19 01:06 dan-ryan

@dan-ryan The tests are running fine on our end now - are you still running into the issue?

luiscarbonell avatar Jun 10 '19 19:06 luiscarbonell

Yes: https://github.com/liquidcarrot/carrot/issues/115

dan-ryan avatar Jun 18 '19 11:06 dan-ryan

@dan-ryan wanted to circle back on this one as part of the issues dependent on #115. Is this still happening?

We're currently unable to reproduce it

christianechevarria avatar Aug 20 '19 16:08 christianechevarria

I had some issues with yesterday's testing with values not being normalised. But it might be because I was still having trouble with #115

dan-ryan avatar Aug 28 '19 22:08 dan-ryan

Circling back on this: it's possible that output neurons are not actually being mutated, but that the network is outputting un-normalized values anyway due to issues caused by #34

I've added a fix for #34 in 71301ba7c4ac68c86f934cf7efac3604982bfce3 and I'd be curious to find out if it solves this issue too

christianechevarria avatar Sep 04 '19 15:09 christianechevarria

While writing tests, I just discovered an ADD_NODE mutation bug that converts output nodes into hidden nodes:

https://github.com/liquidcarrot/carrot/issues/163#issuecomment-536233679

christianechevarria avatar Sep 28 '19 23:09 christianechevarria