Non-negativity constraints
I'd like to apply the PolynomialNetworkRegressor to a domain in which the predicted values are constrained to be non-negative. Does polylearn support non-negativity constraints?
Hi Anmol,
This is a very interesting question. Usually, it's the weights of a model that are constrained to be non-negative, not the outputs.
In the case of factorization machines with even degree, if the lambdas (output layer weights) are all positive, I believe the output is also non-negative. For polynomial nets I'm not sure, but I think the reparametrization breaks this argument (because the lambdas are absorbed into U and fitted implicitly). I'll have to think a bit more about whether there's an easy way to get this behavior with the current implementation.
For even-degree PNs, I think constraining the parameters to be non-negative leads to non-negative outputs for arbitrary inputs. If the inputs are non-negative too, this should hold for any degree. However, the constraint might be too restrictive. @mblondel, do you have any suggestions?
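To illustrate the even-degree argument, here is a minimal NumPy sketch. It assumes an explicit parametrization ŷ(x) = Σ_s λ_s (u_s · x)^m (as discussed above, polylearn's actual implementation absorbs the lambdas into U, so this is not the library's API — the names `pn_predict`, `U`, and `lams` are hypothetical). For even m, each term (u_s · x)^m is non-negative regardless of the signs of u_s or x, so non-negative lambdas yield non-negative outputs:

```python
import numpy as np

rng = np.random.default_rng(0)

def pn_predict(U, lams, X, degree):
    """Toy polynomial-network prediction: sum_s lams[s] * (u_s . x)^degree."""
    return ((X @ U.T) ** degree) @ lams

# Random data, including negative feature values and sign-unconstrained U.
X = rng.normal(size=(100, 5))
U = rng.normal(size=(3, 5))            # component vectors, any sign
lams = rng.uniform(0.1, 1.0, size=3)   # output weights constrained >= 0

y = pn_predict(U, lams, X, degree=2)   # even degree
print(y.min() >= 0)                    # each (u_s . x)^2 >= 0, so the sum is too
```

With an odd degree the same construction can produce negative outputs unless the inputs themselves are non-negative, matching the caveat above.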
Check out this blog post: http://maggotroot.blogspot.ch/2013/11/constrained-linear-least-squares-in.html — it allows for constraints on the outputs. It works in a similar manner to lsqlin in MATLAB.
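For the linear case, the lsqlin-style idea can be sketched in SciPy: solve least squares subject to the linear inequality constraints X @ w >= 0, which forces non-negative predictions at the training points. This is a generic sketch, not polylearn functionality, and the data here is synthetic:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
y = np.abs(X @ np.array([1.0, -2.0, 0.5])) + 0.1 * rng.normal(size=50)

def loss(w):
    # 0.5 * ||X w - y||^2, the ordinary least-squares objective
    r = X @ w - y
    return 0.5 * r @ r

def grad(w):
    return X.T @ (X @ w - y)

# Elementwise inequality constraints: X @ w >= 0 at every training point.
cons = {"type": "ineq", "fun": lambda w: X @ w}
res = minimize(loss, x0=np.zeros(3), jac=grad, method="SLSQP", constraints=cons)

pred = X @ res.x
print(pred.min() >= -1e-6)  # predictions non-negative up to solver tolerance
```

Note this only guarantees non-negativity at the constrained points; off-sample predictions can still go negative unless the constraint structure covers them.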
This is interesting, thank you @equialgo!