jameschch
Glad you found a use for it. I am using different ML libraries and so it makes sense for me not to use the parameter optimizer that comes with LEAN....
I get quite satisfactory performance with the useSharedAppDomain flag, which means some initialization is not repeated for each run. This still needs some attention but basically works. My main...
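For anyone wiring this up, a minimal sketch of where such a flag could sit in the optimizer's JSON config. Only useSharedAppDomain is taken from the post above; the surrounding fields are illustrative guesses rather than the actual schema, and the // annotations would need stripping for a strict JSON parser:

```json
{
  "algorithmTypeName": "MyAlgorithm",  // illustrative field, not confirmed
  "useSharedAppDomain": true,          // the flag discussed above
  "generations": 10,                   // illustrative optimizer setting
  "populationSize": 12                 // illustrative optimizer setting
}
```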
I can take a hint. The port to .net5 is now done. [https://github.com/jameschch/LeanParameterOptimization](https://github.com/jameschch/LeanParameterOptimization) I introduced fewer bugs than I expected. But you now have to run the Optimizer with my...
> Thanks again
>
> At the moment it's giving us ideas about how best to run Lean as a batch process rather than using parallel methods.
>
> Core...
Unfortunately the Python runtime does not expose much useful information. It is possible to recover more for debugging if it is compiled from source. I have seen this error when the build...
Yes, I have no doubt you have your hands full just concentrating on Neural Nets. It's refreshing to use your library, as it's terse and to the point. For RL...
These are all valuable suggestions. Sorry if I've not picked up exactly what you are trying to put across. Some of these issues are quite subtle, so feel free to...
You expressed interest in walk-forward optimization before. Thanks to your encouragement, I have just merged my effort at this into the master branch of my .net core port of the...
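For readers new to the technique, here is a minimal sketch of the core idea behind walk-forward optimization: the full date range is split into rolling in-sample windows (used to optimize parameters), each followed by an out-of-sample window (used to score them). The names and window logic below are illustrative, not the port's actual implementation:

```python
from datetime import date, timedelta

def walk_forward_windows(start, end, in_sample_days, out_sample_days):
    """Yield (in_sample_start, in_sample_end, out_sample_end) triples.

    Parameters are optimized on each in-sample window and then
    evaluated on the out-of-sample window that follows it.
    """
    in_len = timedelta(days=in_sample_days)
    step = timedelta(days=out_sample_days)
    cursor = start
    while cursor + in_len + step <= end:
        in_sample_end = cursor + in_len
        yield cursor, in_sample_end, in_sample_end + step
        cursor += step  # roll forward by one out-of-sample period

for window in walk_forward_windows(date(2018, 1, 1), date(2019, 1, 1), 180, 30):
    print(window)
```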
I've been reflecting on the current version of the walk-forward and think it is suitable for walk-forward **validation**, but it needs a multi-level cost function over the cost of each...
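To make the validation/optimization distinction concrete, here is a sketch of one plausible reading of the truncated post: instead of scoring each window in isolation, an outer cost function aggregates the per-window costs so that a single parameter set is judged across all windows at once. The aggregation below (mean cost plus a dispersion penalty) is an illustrative assumption, not the repository's design:

```python
from statistics import mean, pstdev

def walk_forward_cost(per_window_costs):
    """Combine per-window costs into a single outer objective.

    Penalizing dispersion favours parameter sets that perform
    consistently across windows rather than well on average but
    erratically. (Illustrative aggregation only; the actual scheme
    is exactly the open question in the post above.)
    """
    return mean(per_window_costs) + pstdev(per_window_costs)

# Hypothetical per-window costs for two candidate parameter sets:
print(walk_forward_cost([0.8, 0.9, 1.1]))  # consistent performer scores better
print(walk_forward_cost([0.2, 0.5, 2.1]))  # same mean, erratic, scores worse
```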
I have had this same problem with a multitude of parameters. This is one of the reasons I made a UI for editing configuration here: [https://optimizer.ml/Config](https://optimizer.ml/Config). Maybe you can let...