
Packaging all the featurizers within a single library

Open pierrelux opened this issue 9 years ago • 6 comments

In order to avoid code duplication, it might be useful if you could package all of your existing featurizers within a single library.

For the moment, I have copied all of the following files into the source directory of the Python wrapper that I wrote:

$ ls src/
Background.cpp  Background.hpp  BasicFeatures.cpp  BasicFeatures.hpp  BASSFeatures.cpp  BASSFeatures.hpp  BlobTimeFeatures.cpp  BlobTimeFeatures.hpp  Features.cpp  Features.hpp  Parameters.cpp  Parameters.hpp

which I then compile as a library:

$(CXX) -shared -Wl,-soname,libshallowale.so -o [...]

It'd be cleaner if your code were maintained in its own repo as a library, and the Python wrapper simply linked against it.
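
For illustration, here is a minimal sketch of what "linking" from Python could look like, assuming the featurizers were built into libshallowale.so as above and exposed a C-compatible entry point. The function name get_active_features and its signature are hypothetical, purely for the sake of the example, and this is not necessarily how an actual wrapper would do it:

```python
import ctypes

# Hypothetical example only: the library name comes from the build command
# above, but the exported symbol "get_active_features" and its signature
# are assumptions, not part of the actual b-pro code.
lib = ctypes.CDLL("./libshallowale.so")

lib.get_active_features.argtypes = [
    ctypes.POINTER(ctypes.c_ubyte),  # raw screen buffer
    ctypes.c_int,                    # buffer size
    ctypes.POINTER(ctypes.c_int),    # output: active feature indices
    ctypes.c_int,                    # capacity of the output buffer
]
lib.get_active_features.restype = ctypes.c_int  # number of indices written

def active_features(screen_bytes, max_out=4096):
    # Convert the Python bytes into a ctypes buffer and collect the
    # indices of the active binary features reported by the library.
    buf = (ctypes.c_ubyte * len(screen_bytes)).from_buffer_copy(screen_bytes)
    out = (ctypes.c_int * max_out)()
    n = lib.get_active_features(buf, len(screen_bytes), out, max_out)
    return list(out[:n])
```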

pierrelux avatar Feb 12 '16 17:02 pierrelux

@pierrelux I am planning to refactor the whole codebase this summer. I'll let you know when I'm done with it. Thanks for the suggestion!

mcmachado avatar Jul 08 '16 01:07 mcmachado

Is there a Python wrapper for this code that you could share with me?

pum-purum-pum-pum avatar Feb 05 '17 11:02 pum-purum-pum-pum

I wrote a wrapper in the other branch. It is quite fast; there is only a small performance drop compared to the C++ version. However, I never managed to obtain an equally fast implementation of Sarsa. I suspect I was never able to exploit the sparsity of the representation properly in Python.

If you are interested, let me know. I can help you with the Python wrapper, and maybe you will be able to exploit the sparsity properly in Python.
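
To make the sparsity point concrete, here is a rough sketch (an illustration, not code from this repo) of the kind of step that should stay fast in Python when a state is represented only by the indices of its active binary features; num_features, alpha, and gamma are illustrative values:

```python
import numpy as np

# Sketch of a sparse linear Sarsa(0) step for binary features, where a
# state-action pair is given by the indices of its active features only.
num_features = 10_000_000        # the binary feature spaces here can be huge
theta = np.zeros(num_features)   # weight vector

def q_value(active_idx):
    # With binary features, theta . phi reduces to summing the weights
    # at the active indices -- no dense dot product over num_features.
    return theta[active_idx].sum()

def sarsa0_step(active_idx, reward, next_active_idx, alpha=0.1, gamma=0.99):
    td_error = reward + gamma * q_value(next_active_idx) - q_value(active_idx)
    # The update touches only the active entries, so the cost per step is
    # proportional to the number of active features.
    theta[active_idx] += alpha * td_error
```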

mcmachado avatar Feb 07 '17 19:02 mcmachado

@mcmachado Does the current Sarsa implementation work? I can help implement sparsity properly in Python if you need help.

xkianteb avatar Jul 21 '19 22:07 xkianteb

The C++ Sarsa implementation works. I was not able to test the Python implementation because the code is too slow. I'd be happy to incorporate a fast, working Python implementation in this repo, but I don't have the capacity to do that now. If you are interested in doing this, it would be great :-)

mcmachado avatar Jul 25 '19 20:07 mcmachado

When I wrote this issue more than two years ago, it was in the context of this wrapper that I had written: https://bitbucket.org/rllabmcgill/shallowpy/src/master/

It may still be useful for you. I was able to get Sarsa(0) to work by relying on Linux's memory over-commitment, which lets you "np.zeros" huge arrays without actually using that memory until you write to it. This means that if you write your Sarsa(0) with sparse updates through indexing only, it should run pretty fast.

The problem arises when you want to implement Sarsa(lambda) with eligibility traces and need to decay the trace: you end up writing to a lot of entries and the memory usage explodes. That's when @mcmachado and I started discussing porting the special "merging" logic from the C++ code into Python. It never quite happened, but it would be great if someone could do it.
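
For anyone picking this up, here is a rough sketch of the general workaround being discussed: keep the eligibility trace itself sparse, so that the decay never touches the full weight-sized array. This is an illustration of the idea, not the "merging" logic from the C++ code, and the thresholds and parameter values are made up:

```python
import numpy as np

# Sketch: Sarsa(lambda) with a sparse, truncated eligibility trace.
# Instead of decaying a dense trace array of size num_features, keep only
# the nonzero traces in a dict and drop the tiny ones.
num_features = 10_000_000
theta = np.zeros(num_features)   # over-committed memory: pages are only
                                 # materialized for indices we actually write

def sarsa_lambda_step(trace, active_idx, reward, next_active_idx,
                      alpha=0.1, gamma=0.99, lam=0.9, min_trace=1e-3):
    td_error = (reward + gamma * theta[next_active_idx].sum()
                - theta[active_idx].sum())
    # Decay only the traces we are actually storing, dropping small ones;
    # this keeps both time and memory proportional to the number of
    # recently active features rather than to num_features.
    for i in list(trace):
        trace[i] *= gamma * lam
        if trace[i] < min_trace:
            del trace[i]
    # Replacing traces for the currently active (binary) features.
    for i in active_idx:
        trace[i] = 1.0
    # Apply the TD update only where the trace is nonzero.
    for i, e in trace.items():
        theta[i] += alpha * td_error * e
    return trace
```

The trace starts out as an empty dict (trace = {}) and is threaded through successive steps, so the dense trace array, and the memory blow-up that comes with decaying it, never appears.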

pierrelux avatar Aug 05 '19 21:08 pierrelux