top | item 40484863

frankling_ | 1 year ago

Well, the most common ML problems can be expressed as optimization over smooth functions (or reformulated that way manually). We might have to convince the ML world that branches do matter :) On the other hand, there are gradient-free approaches that solve problems with jumps in other ways, like many reinforcement learning algorithms, or metaheuristics such as genetic algorithms in simulation-based optimization. The jury's still out on "killer apps" where gradient descent can outperform these approaches reliably, but we're hoping to add to that body of knowledge...
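A minimal sketch (my own illustration, not from the thread) of why a hard branch defeats gradient descent while a smooth relaxation does not: a step function has zero gradient almost everywhere, so a gradient-based optimizer gets no signal about the jump, whereas a sigmoid surrogate (the `temperature` parameter here is a hypothetical knob) carries useful gradient information near the branch point.

```python
import math

def f(x):
    # hard branch: the gradient is 0 everywhere except at the
    # (measure-zero) jump at x = 0, so gradient descent sees nothing
    return 1.0 if x > 0 else 0.0

def f_smooth(x, temperature=0.1):
    # smooth relaxation of the branch via a sigmoid; as temperature
    # shrinks, this approaches the hard step
    return 1.0 / (1.0 + math.exp(-x / temperature))

def grad_central(fn, x, h=1e-5):
    # central finite-difference estimate of the derivative
    return (fn(x + h) - fn(x - h)) / (2 * h)

# The hard branch has zero gradient away from x = 0 ...
print(grad_central(f, 0.5))         # 0.0
# ... while the relaxation has a nonzero gradient an optimizer can follow.
print(grad_central(f_smooth, 0.5))  # > 0
```

Gradient-free methods (RL, genetic algorithms) sidestep this by never differentiating `f` at all, at the cost of many more function evaluations.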

YeGoblynQueenne | 1 year ago

>> We might have to convince the ML world that branches do matter :)

Easy: tell them about automata.