top | item 17519886

kgbier|7 years ago

The commit messages across the entire project are... alarmingly insightful. There's some detailed stream of consciousness stuff in there.

jessep|7 years ago

If you haven't clicked through to read the above commit message, please do. It is amazing. This is a glimpse into the mind of a mad scientist (and I respect and love mad scientists more than anything, and aspire to be one).

jamesmcm|7 years ago

The funny part is the diff.

agumonkey|7 years ago

What should we call this new form of web log? C[ommit]log? V[ersion]log? G[it]log?

stochastic_monk|7 years ago

I’m quite happy to see random projections getting some love, but I hope more people start using Choromanski et al.’s 2016 Structured Orthogonal Random Features, which have provably higher accuracy while cutting runtime from quadratic to linearithmic and memory from quadratic to linear (or constant). I’ve verified this experimentally in my implementation here [0]. As a shameless plug, it’s quite fast, is written in C++, and comes with Python bindings for both kernel projections and orthogonal JL transforms.

[0]: https://github.com/dnbaker/frp
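For readers who haven't met these techniques: below is a minimal NumPy sketch, my own illustration and not code from the linked frp library (the helper names `jl_project` and `orthogonal_project` are hypothetical), contrasting a plain dense Gaussian JL projection with an orthogonalized one. The structured (SORF) construction referenced above replaces the dense matrix with fast Hadamard-diagonal block products to reach linearithmic runtime; the QR version here stays dense and only illustrates the variance-reduction idea.

```python
import numpy as np

def jl_project(X, k, seed=0):
    """Dense Gaussian Johnson-Lindenstrauss projection of the rows of X
    down to k dimensions -- the quadratic-time baseline."""
    rng = np.random.default_rng(seed)
    R = rng.standard_normal((X.shape[1], k)) / np.sqrt(k)
    return X @ R

def orthogonal_project(X, k, seed=0):
    """Orthogonal variant: orthonormalize the Gaussian matrix via QR
    (requires k <= d).  Orthogonal directions lower the variance of the
    distance estimates; the structured (SORF) construction achieves a
    similar effect with Hadamard-diagonal blocks in O(d log d) time."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    Q, _ = np.linalg.qr(rng.standard_normal((d, k)))
    return (X @ Q) * np.sqrt(d / k)  # rescale so norms stay unbiased

# Pairwise distances approximately survive the 512 -> 128 reduction:
rng = np.random.default_rng(42)
X = rng.standard_normal((20, 512))
d_orig = np.linalg.norm(X[0] - X[1])
Y, Z = jl_project(X, 128), orthogonal_project(X, 128)
ratio_jl = np.linalg.norm(Y[0] - Y[1]) / d_orig
ratio_orth = np.linalg.norm(Z[0] - Z[1]) / d_orig
```

Both ratios land near 1, with the orthogonal version typically showing less spread across pairs.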

abstractcontrol|7 years ago

The reason I am using random projections in the latest test is that I am testing an algorithm that iteratively calculates the inverse Cholesky factor of the covariance matrix, and I am testing it on MNIST images. The covariance matrices made from raw MNIST images are non-invertible, but projecting them down to a much smaller dimension allows me to actually test the algorithm on non-synthetic data.

I do not actually need more than I have, but I'll keep your link in mind if I ever need random projections.
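The singular-covariance situation described above can be sketched in a few lines of NumPy. This is my own illustration using Gaussian data in MNIST's 784 dimensions, not the author's actual algorithm or data: with fewer samples than dimensions the covariance is rank-deficient and has no Cholesky factor, but after a random projection to a much smaller dimension both the factor and its inverse exist.

```python
import numpy as np

rng = np.random.default_rng(0)

# 100 flattened 28x28 "images" stand in for MNIST: with fewer samples
# than dimensions, the 784x784 covariance has rank at most 99 and is
# therefore singular, so np.linalg.cholesky would reject it.
X = rng.standard_normal((100, 784))
cov = np.cov(X, rowvar=False)
rank = np.linalg.matrix_rank(cov)        # 99, far below 784

# Random-project to a much smaller dimension; there the covariance is
# full rank, so the Cholesky factor and its inverse both exist.
R = rng.standard_normal((784, 32)) / np.sqrt(32)
cov_small = np.cov(X @ R, rowvar=False)
L = np.linalg.cholesky(cov_small)        # lower-triangular, 32x32
L_inv = np.linalg.solve(L, np.eye(32))   # inverse Cholesky factor
```

Whether this matches the author's actual pipeline is a guess; the point is only why the projection makes the covariance invertible.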

wild_preference|7 years ago

How is this relevant to the upstream comment?

caseymarquis|7 years ago

I'm inspired. I'm going to start writing little things about my day in my commit messages.

HerrMonnezza|7 years ago

`git commit -m` as a blogging platform...