Gradient sketching-based PCA
Hello,
Would you be interested in merging an implementation of PCA based on gradient descent with gradient sketching? The main ideas are:
- Formulating PCA with a quadratic loss as an optimization problem
- Partitioning the input data into batches, as in stochastic gradient descent
- Gradient sketching inspired by https://arxiv.org/abs/1809.03054
I've uploaded a sketch at this link: https://gist.github.com/severinson/914c49809bc58e1fafb5a100948a6fc9
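For readers who don't want to open the gist, the ideas above can be sketched roughly as follows. This is a minimal illustration, not the gist implementation: it runs projected gradient ascent on tr(V'X'XV), cycles over row batches as in SGD, and applies a Johnson-Lindenstrauss-style random projection as the gradient sketch. All parameter names (`nbatches`, `sketchdim`, `stepsize`) are illustrative assumptions.

```julia
using LinearAlgebra, Random

# Sketched-gradient PCA (illustrative): find an orthonormal V whose
# columns approximately span the top-k principal subspace of X.
function sketched_pca(X::AbstractMatrix, k::Integer;
                      nbatches::Integer=10, sketchdim::Integer=50,
                      iters::Integer=200, stepsize::Real=1e-3)
    m, n = size(X)
    V = Matrix(qr(randn(n, k)).Q)                # random orthonormal start
    batches = [i:nbatches:m for i in 1:nbatches] # partition rows, as in SGD
    for t in 1:iters
        Xb = view(X, batches[mod1(t, nbatches)], :)
        G = Xb' * (Xb * V)                       # minibatch gradient of tr(V'X'XV)/2
        S = randn(sketchdim, n) ./ sqrt(sketchdim)
        G = S' * (S * G)                         # random-projection sketch of the gradient
        V = Matrix(qr(V .+ stepsize .* G).Q)     # ascent step + re-orthonormalize
    end
    return V
end
```

Usage, e.g. `V = sketched_pca(randn(1000, 50), 5)`; the explained variance is then `sum(abs2, X * V) / sum(abs2, X)`. The QR re-orthonormalization after each step keeps the iterate on the Stiefel manifold, so the method behaves like a noisy orthogonal iteration.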
Performance:

```
main(n=10000, m=1000, k=100, atol=1e-2)
Sketched gradient descent
  0.094326 seconds (31 allocations: 9.928 MiB)
Explained variance 0.12975954414003324
LowRankApprox.jl
  0.224748 seconds (128 allocations: 56.419 MiB)
Explained variance 0.1095444998727487

main(n=10000, m=1000, k=100, atol=1e-3)
Sketched gradient descent
  0.360975 seconds (31 allocations: 9.928 MiB)
Explained variance 0.1510703546595912
LowRankApprox.jl
  0.220870 seconds (128 allocations: 56.419 MiB)
Explained variance 0.10934845047524705
```
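For context on the numbers above, "explained variance" here presumably means the fraction of the total variance of X captured by the returned basis. A one-line definition under that assumption (centered data `X`, orthonormal basis `V`):

```julia
using LinearAlgebra

# Fraction of total variance captured by an orthonormal basis V
# (assumed definition of the metric reported in the benchmark).
explained_variance(X, V) = sum(abs2, X * V) / sum(abs2, X)
```

With this definition the metric lies in (0, 1], and for the exact top-k right singular vectors it equals the sum of the top-k squared singular values over the sum of all squared singular values.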
Platform info:

```
julia> versioninfo()
Julia Version 1.5.1
Commit 697e782ab8 (2020-08-25 20:08 UTC)
Platform Info:
  OS: Linux (x86_64-pc-linux-gnu)
  CPU: AMD EPYC 7451 24-Core Processor
  WORD_SIZE: 64
  LIBM: libopenlibm
  LLVM: libLLVM-9.0.1 (ORCJIT, znver1)
Environment:
  JULIA_NUM_THREADS = 48
```