João Felipe Santos
Even though restricted Boltzmann machines (and DBMs/DBNs) and autoencoders (DAE, CAE, stacked autoencoders) are based on a different principle, since they are unsupervised, having an implementation that follows the Mocha architecture could...
You are right about autoencoders being trained with SGD, just like MLPs. There are some "special" things, though:

1. Specific regularizers/cost functions (e.g., for contractive and sparse autoencoders).
2. Tied weights:...
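To illustrate what tied weights mean in practice, here is a minimal NumPy sketch (not Mocha's actual implementation, and the parameter names are made up): the decoder reuses the transpose of the encoder matrix, so a single SGD update on the shared `W` accumulates gradient contributions from both passes.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical tied-weight autoencoder: the decoder reuses W.T,
# so only W, b_enc, and b_dec are learned parameters.
n_in, n_hid = 8, 4
W = rng.normal(0, 0.1, (n_hid, n_in))
b_enc = np.zeros(n_hid)
b_dec = np.zeros(n_in)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(x):
    h = sigmoid(W @ x + b_enc)        # encoder
    x_hat = sigmoid(W.T @ h + b_dec)  # decoder shares W (tied weights)
    return h, x_hat

x = rng.normal(size=n_in)
h, x_hat = forward(x)

# One SGD step on squared reconstruction error; the gradient w.r.t. W
# sums contributions from the decoder and the encoder pass.
lr = 0.1
err = x_hat - x
d_dec = err * x_hat * (1 - x_hat)      # delta at decoder pre-activation
d_enc = (W @ d_dec) * h * (1 - h)      # delta backpropagated into encoder
grad_W = np.outer(d_enc, x) + np.outer(h, d_dec)  # tied: both terms hit W
W -= lr * grad_W
b_dec -= lr * d_dec
b_enc -= lr * d_enc
```

The untied version would keep a separate decoder matrix and update it with only the second term of `grad_W`.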
Auto sample rate conversion would be really nice for files, but we would still have problems when trying to play arrays. That would force the user to make all arrays...
I think the implementation looks nice, and this is a useful function. The way it's written right now, it returns a vector instead of an instance of a filter...
It's not in the plans, but it could be an interesting addition. However, the license they used for their code is not compatible with ours, so we cannot simply translate...
@JayKickliter has an initial implementation of a least-squares solution to this in #109, so I'm closing this.
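For context on what a least-squares FIR design does, here is the same idea sketched with Scipy's `firls` (illustrative parameters only; the implementation in #109 may expose a different interface):

```python
import numpy as np
from scipy.signal import firls

# Least-squares FIR lowpass: passband up to 0.3*Nyquist,
# stopband from 0.4*Nyquist (normalized frequencies).
numtaps = 31  # firls requires an odd number of taps
h = firls(numtaps, [0.0, 0.3, 0.4, 1.0], [1.0, 1.0, 0.0, 0.0])
# Linear-phase design: the resulting taps are symmetric.
```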
No problem. The [Scipy implementation](https://github.com/scipy/scipy/blob/v0.14.0/scipy/signal/fir_filter_design.py#L311) seems simple enough to be ported to DSP.jl without much work. Would you like to submit a pull request? Thanks!
I have been away for ages, but I'm defending my thesis soon and planning on doing some Julia development again. I will be able to help in the near future.
I think this looks good. Scipy has `buttord`, `cheb1ord`, `cheb2ord`, and `ellipord` implementations that we could use as a reference. Should we pass `Butterworth` as a type instead of...
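As a reference point for the behavior we'd want, this is how the Scipy functions mentioned above are used: `buttord` finds the minimum filter order meeting a ripple/attenuation spec, and its output feeds the actual design routine (the spec values below are arbitrary examples):

```python
from scipy.signal import butter, buttord

# Minimum Butterworth order for: <= 3 dB ripple up to 0.2*Nyquist,
# >= 40 dB attenuation from 0.3*Nyquist (normalized frequencies).
order, wn = buttord(wp=0.2, ws=0.3, gpass=3, gstop=40)

# The returned order and natural frequency go straight into the design.
b, a = butter(order, wn)
```

`cheb1ord`, `cheb2ord`, and `ellipord` follow the same pattern for the other prototypes, which is why dispatching on a type like `Butterworth` could let us expose a single order-selection function instead of four.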
As far as I know nobody is working on this, so feel free to propose something :)