Libbum's comments
Libbum | 2 months ago | on: Cassette tapes are making a comeback?
Libbum | 2 years ago | on: Whisper.api: Open-source, self-hosted speech-to-text with fast transcription
Libbum | 2 years ago | on: Ask HN: What's the coolest physical thing you've made?
Libbum | 3 years ago | on: A Julia package for high-throughput manipulation of structured signal data
Libbum | 3 years ago | on: Outside the safe operating space of a new planetary boundary for PFAS
Since that's a catch-all category covering many things (nuclear waste, other synthetic chemicals; there's even debate among the original authors of the framework over whether artificial intelligence should be considered), the boundary value itself is not currently defined.
One school of thought is that the boundary value for this category should be zero - since any amount of a synthetic substance is more than nature generates.
Regardless, papers like this one are helpful for piecing together all of the novel-entities research and getting a better picture of how this boundary interacts with the rest of the Earth system.
Libbum | 5 years ago | on: Cozy is a modern audiobook player for Linux
Libbum | 5 years ago | on: Show HN: A proactive way of securing your Linux server
Libbum | 5 years ago | on: Differentiable Control Problems
The answer is quite simple, really. Classical basis functions suffer from the curse of dimensionality: if you take a tensor product of polynomial basis functions (or something like a Fourier basis) with N basis functions in each direction, you need N^d parameters to handle every combination: `sin(x) + sin(2x) + ... + sin(y) + sin(2y) + ... + sin(x)sin(y) + sin(2x)sin(y) + ...`
Neural networks only grow polynomially with dimension, so at around 8 dimensions they become more efficient. In fact, this is why we have https://diffeqflux.sciml.ai/dev/layers/BasisLayers/
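A quick back-of-the-envelope sketch of the two growth rates. This is a hypothetical illustration, not SciML code: `tensor_basis_params` counts coefficients for a tensor-product basis, and `mlp_params` counts weights and biases for an assumed fixed-width fully connected network, where only the first layer touches the input dimension.

```python
def tensor_basis_params(N, d):
    # Tensor-product basis: N functions per axis -> N**d coefficients.
    return N ** d

def mlp_params(d, width=32, depth=2, out=1):
    # Fully connected net d -> width -> ... -> width -> out.
    # Parameter count is linear in d: only the first layer sees the input.
    sizes = [d] + [width] * depth + [out]
    return sum(a * b + b for a, b in zip(sizes, sizes[1:]))

for d in (2, 4, 8):
    print(d, tensor_basis_params(10, d), mlp_params(d))
```

With N = 10 the basis needs 10^8 coefficients at d = 8, while this small network stays at 1377 parameters; the exact crossover point depends on N and the network width, so "around 8" is a rule of thumb, not a constant.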
Libbum | 5 years ago | on: Differentiable Control Problems
These comments are appreciated - I think a discussion like this is lacking in the SciML docs (or at least not visible enough). Will have a chat with some of the devs and see if there's something we can add.
Libbum | 5 years ago | on: Differentiable Control Problems
If F is completely unknown, perhaps you start training with a 10-dimensional polynomial basis. What is the (computational) cost of obtaining your solution? Once you have it, will this polynomial accurately represent your system in any real-world manner? Perhaps higher-order terms are needed to approximate trigonometric functions - are you able to easily add such functions to your training basis? If not, then your basis could be too restrictive to provide you with a minimal implementation of your control variable.
It looks like you work with this stuff far more than I have, so perhaps that's not an adequate answer.
Another way to look at this, though: if you only wanted to characterise your system with polynomials, UODEs + SINDy can do this for you - the NN is simply the optimisation method in place of any other optimisation algorithm.
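To put a rough number on the cost question above - reading the "10-dimensional polynomial basis" as a total-degree-10 basis, which is an assumption on my part - the number of monomials of total degree at most k in d variables is the binomial coefficient C(d+k, k). A minimal sketch:

```python
from math import comb

def poly_basis_size(d, k):
    # Number of monomials of total degree <= k in d variables:
    # choose how the degree budget k is split across d variables.
    return comb(d + k, k)

for d in (2, 4, 6):
    print(d, poly_basis_size(d, 10))
```

Even before training cost, the basis itself balloons: 66 terms for 2 variables becomes 8008 terms for 6 variables at the same degree.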
Libbum | 5 years ago | on: Differentiable Control Problems
`FastChain(FastDense(3,32,tanh), FastDense(32,32,tanh), FastDense(32,2))` (from [0]) takes three inputs from your basis, runs them through two hidden tanh layers of width 32, and gives you two trained outputs.
This example [1] also uses two hidden layers; it's one of the more complex solutions I've seen so far. To move to this complexity from a simpler chain, we first make sure our solution is not stuck in a local minimum [2], then proceed to increase the parameter count if the NN fails to converge.
[0] https://diffeqflux.sciml.ai/dev/FastChain/
[1] https://github.com/ChrisRackauckas/universal_differential_eq...
[2] https://diffeqflux.sciml.ai/dev/examples/local_minima/
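As a sanity check on the size of that chain, here's a hypothetical Python snippet (not the DiffEqFlux API) counting the weights and biases of a 3 → 32 → 32 → 2 fully connected network:

```python
def dense_params(n_in, n_out):
    # One fully connected layer: a weight matrix plus a bias vector.
    return n_in * n_out + n_out

# Layer sizes matching the FastChain quoted above: 3 -> 32 -> 32 -> 2.
sizes = [3, 32, 32, 2]
total = sum(dense_params(a, b) for a, b in zip(sizes, sizes[1:]))
print(total)  # 128 + 1056 + 66 = 1250 trainable parameters
```

So this "more complex" chain is still only 1250 parameters - small by NN standards, which is part of why these control problems train quickly.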
Libbum | 5 years ago | on: Differentiable Control Problems
However, this method is just one small aspect of the SciML [0] ecosystem now. The article is a little outdated in that sense.
Once you've obtained your NN control parameter, it's now possible to use Sparse Identification of Nonlinear Dynamics (SINDy) on that parameter to recover the equations of motion governing it [1].
The real promise of these methods is to use the universal-approximation power of NNs to get around the 'curse of dimensionality' and uncover presently unknown representations of motion within any system. Take a look at [2] for a more detailed description.
[0]: https://sciml.ai/
[1]: https://datadriven.sciml.ai/dev/sparse_identification/sindy/
[2]: https://arxiv.org/abs/2001.04385
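For a feel of what SINDy does under the hood, here's a minimal sketch of sequentially thresholded least squares (STLSQ, the regression at the heart of SINDy) in Python with NumPy - a hypothetical illustration of the idea, not the DataDrivenDiffEq API:

```python
import numpy as np

def stlsq(Theta, dx, threshold=0.1, iters=10):
    # Sequentially thresholded least squares: fit dx = Theta @ xi,
    # repeatedly zeroing small coefficients and refitting the rest.
    xi = np.linalg.lstsq(Theta, dx, rcond=None)[0]
    for _ in range(iters):
        small = np.abs(xi) < threshold
        xi[small] = 0.0
        big = ~small
        if big.any():
            xi[big] = np.linalg.lstsq(Theta[:, big], dx, rcond=None)[0]
    return xi

# Toy data: dx/dt = 2x - 0.5x^3, sampled on a grid.
x = np.linspace(-2, 2, 50)
dx = 2 * x - 0.5 * x**3

# Candidate library of terms: [1, x, x^2, x^3].
Theta = np.column_stack([np.ones_like(x), x, x**2, x**3])
xi = stlsq(Theta, dx)
print(xi)  # expect ~[0, 2, 0, -0.5]: the sparse dynamics are recovered
```

The sparsity is the point: instead of a dense NN, you end up with a short, human-readable equation built from the library terms that survived thresholding.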
Libbum | 5 years ago | on: Ask HN: How do you read long PDFs?
Libbum | 5 years ago | on: Ask HN: How do you read long PDFs?
Libbum | 5 years ago | on: Ask HN: How do you read long PDFs?
Very nice hardware, just awful software.
Libbum | 6 years ago | on: Ancient Academia: the life of a Mesopotamian scholar in the seventh century B.C
Libbum | 6 years ago | on: Ancient Academia: the life of a Mesopotamian scholar in the seventh century B.C
Libbum | 6 years ago | on: Free Audio Books: Download Great Books for Free
Libbum | 6 years ago | on: Show HN: Creepyface – A JavaScript library to make your face look at the pointer
Libbum | 6 years ago | on: Elm 0.19.1