If you haven't clicked through to read the commit message above, please do. It is amazing: a glimpse into the mind of a mad scientist (and I respect and love mad scientists more than anything, and aspire to be one myself).
I'm quite happy to see random projections getting some love, but I hope more people start using Choromanski et al.'s 2016 Structured Orthogonal Random Features, which provably achieves higher accuracy while reducing runtime from quadratic to linearithmic and memory from quadratic to linear (or constant). I've verified this experimentally in my implementation here [0]. As a shameless plug, it's quite fast, is written in C++, and comes with Python bindings for both kernel projections and orthogonal JL transforms.
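The core trick in structured orthogonal random features is to replace a dense Gaussian projection matrix with a product of sign-flip diagonals and Hadamard transforms, which behaves like a random orthogonal matrix. Here is a minimal numpy sketch of that transform (not the linked library's API, just an illustration; a dense Hadamard matrix is used for clarity, where a real implementation would use an O(d log d) fast Walsh–Hadamard transform to get the linearithmic runtime):

```python
import numpy as np
from scipy.linalg import hadamard

def sorf_transform(X, seed=0):
    """Apply a structured orthogonal transform H D3 H D2 H D1 to the rows of X.

    X: (n, d) array, with d a power of two.
    Returns an (n, d) array whose row norms equal those of X, since each
    factor (sign-flip diagonal or orthonormal Hadamard) is orthogonal.
    """
    n, d = X.shape
    rng = np.random.default_rng(seed)
    H = hadamard(d) / np.sqrt(d)            # orthonormal Hadamard matrix
    D = rng.choice([-1.0, 1.0], size=(3, d))  # three Rademacher diagonals
    Z = X
    for i in range(3):
        Z = (Z * D[i]) @ H                  # diagonal sign flip, then mix
    return Z
```

For a JL-style embedding into k dimensions, one would then keep a random subset of k coordinates of the output, scaled by sqrt(d / k).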
The reason I am using random projections in the latest test is that I am testing an algorithm that iteratively calculates the inverse Cholesky factor of the covariance matrix, and I am testing it on MNIST images. The covariance matrices built from raw MNIST images are non-invertible, but projecting the images down to a much smaller dimension lets me actually test the algorithm on non-synthetic data.
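The projection step matters because flattened MNIST images have many (near-)constant pixels, so the sample covariance is rank-deficient and has no Cholesky factor. A minimal numpy sketch with synthetic stand-in data (not the commenter's iterative algorithm; the dimensions and the Gaussian projection are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for flattened MNIST: only 100 of 784 coordinates carry signal,
# so the sample covariance is singular and Cholesky would fail on it.
n, d, k = 500, 784, 64
X = np.zeros((n, d))
X[:, :100] = rng.normal(size=(n, 100))

cov = np.cov(X, rowvar=False)
assert np.linalg.matrix_rank(cov) < d     # rank-deficient, non-invertible

# A random projection down to k dimensions restores full rank
# (with high probability), so the Cholesky factorization succeeds.
P = rng.normal(size=(d, k)) / np.sqrt(k)
Y = X @ P
cov_k = np.cov(Y, rowvar=False)
L = np.linalg.cholesky(cov_k)             # lower-triangular, cov_k = L @ L.T
L_inv = np.linalg.inv(L)                  # the inverse Cholesky factor
assert np.allclose(L_inv @ cov_k @ L_inv.T, np.eye(k), atol=1e-6)
```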
I do not actually need more than I have, but I'll keep your link in mind if I ever need random projections again.
[0]: https://github.com/dnbaker/frp