mota7 | 1 year ago
So it turns (1+a)(1+b) into 1+a+b, dropping the ab cross term, which is definitely not the same! But it turns out machine guessing apparently doesn't care much about the difference.
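For illustration, a minimal Python sketch of how much the dropped ab term costs, assuming a and b are fractional mantissa parts in [0, 1) (that range is an assumption inferred from the floating-point context, not stated in the comment):

```python
import random

def exact(a, b):
    """Full product of the two mantissas."""
    return (1 + a) * (1 + b)

def approx(a, b):
    """Approximation discussed above: drop the a*b cross term."""
    return 1 + a + b

# The absolute error is exactly a*b, so the relative error is
# a*b / ((1+a)*(1+b)). For a, b in [0, 1) it stays below 25%,
# is largest when both are near 1, and vanishes when either is near 0.
errs = [
    abs(exact(a, b) - approx(a, b)) / exact(a, b)
    for a, b in ((random.random(), random.random()) for _ in range(100_000))
]
print(f"max relative error observed:  {max(errs):.4f}")
print(f"mean relative error observed: {sum(errs) / len(errs):.4f}")
```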
amelius|1 year ago
Am I missing something?
dotnet00|1 year ago
tommiegannert|1 year ago
Feels like multiplication shouldn't be needed for convergence, just monotonicity? I wonder how well it would perform if the model was actually trained the same way.
dsv3099i|1 year ago
You're going to have tolerance on the result anyway, so what's a little more error? :)