pykan
I think "hence beating curse of dimensionality" P8 a little unconvincing
Obviously, the constant $C$ in Theorem 2.1 depends on the dimension. Remarkably, there is a contradiction between the red box and the blue box.
studying ...
Sorry, by $\ell = C N^{-\alpha}$ we meant that the scaling exponent $\alpha$ is independent of the dimension, but $C$ is dependent on it.
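To make that distinction concrete, here is a minimal sketch (plain NumPy with made-up numbers, not pykan's API) of fitting $\ell = C N^{-\alpha}$ to synthetic loss data: on a log-log plot, $\alpha$ is the slope and $C$ only shifts the curve vertically, so a dimension-dependent $C$ does not change the fitted exponent.

```python
# Minimal sketch: fit loss = C * N**(-alpha) on synthetic data (illustrative values only)
import numpy as np

rng = np.random.default_rng(0)
N = np.logspace(1, 4, 20)                      # model sizes / grid points (made up)
true_alpha, true_C = 4.0, 3.0                  # made-up ground truth
loss = true_C * N ** (-true_alpha) * np.exp(0.05 * rng.standard_normal(N.size))

# log(loss) = log(C) - alpha * log(N): a straight line in log-log space
slope, intercept = np.polyfit(np.log(N), np.log(loss), 1)
alpha_hat, C_hat = -slope, np.exp(intercept)
print(f"fitted alpha ≈ {alpha_hat:.2f} (slope), C ≈ {C_hat:.2f} (prefactor)")
```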
Do you have a characterization of the size of the smooth KA representation, e.g., for an arbitrary Hölder/Sobolev/Besov function? If not, then I think this result is not directly comparable with the classical MLP or spline approximation theories, since you have to assume a "constant-size" smooth KA representation.
Sorry, by $\ell = C N^{-\alpha}$ we meant that the scaling exponent $\alpha$ is independent of the dimension, but $C$ is dependent on it.
You did a good job! But I hope to see a more serious underlying mathematical theory from you. Discussion and debate are a favourite way of learning.
https://zhuanlan.zhihu.com/p/695869050?utm_campaign=shareopn&utm_medium=social&utm_psn=1770095693690294273&utm_source=wechat_session
studying
Sorry, by $\ell = C N^{-\alpha}$ we meant that the scaling exponent $\alpha$ is independent of the dimension, but $C$ is dependent on it.
Here I give you a counterexample:
Fix $\alpha = 1$. When $C = 10$, the inequality $x^2 \le C x^{\alpha}$ holds for $0 < x < 10$, but as the upper end of the range increases, $C$ has to increase as well; in fact, the minimal $C$ is $x$ itself. We then have $x^2 \le C_1 \, x^{\alpha+1}$ with the constant $C_1 = 1$, and it follows immediately from this inequality that the exponent of $x$ on the right-hand side (which stands for your dimension) has increased!
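A quick numerical check of this counterexample (illustrative values only, not taken from the paper): the smallest admissible $C$ in $x^2 \le C x^{\alpha}$ with $\alpha = 1$ grows with the range of $x$, and absorbing that growth into the exponent gives $x^2 \le 1 \cdot x^{\alpha+1}$.

```python
import numpy as np

alpha = 1.0
for X in (10, 100, 1000):
    x = np.linspace(1e-6, X, 100_000)
    C_min = np.max(x**2 / x**alpha)    # smallest C with x^2 <= C * x^alpha on (0, X]
    print(f"on (0, {X}]: minimal C ≈ {C_min:.1f}")   # grows like X

# Absorbing the growth into the exponent: x^2 <= 1 * x^(alpha + 1) holds with constant 1
x = np.linspace(1e-6, 1000, 100_000)
assert np.all(x**2 <= 1.0 * x ** (alpha + 1) + 1e-9)
```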
https://zhuanlan.zhihu.com/p/695932311