
Multivariate Temporal Autoencoder for Predictive Reconstruction of Deep Sequence

18 points | shivinski | 5 years ago | altumintelligence.com

9 comments

mdifrgechd | 5 years ago
Someone who works in time series forecasting once told me (to paraphrase): you know a newly published method isn't all that good, because if it were, the authors would have started a hedge fund instead.
abeppu | 5 years ago
It's not my area, but isn't forecasting financial time series harder than forecasting for systems whose components aren't actively trying to outsmart one another based on some of the same data? I.e., a method could be good enough for systems that aren't about the interaction of smart agents (e.g. what will my service's request volume look like next week), but still fail to be useful in markets?
clircle | 5 years ago
Sounds like you need a neural network to help you write a better title for your post
jwiley | 5 years ago
Yeah, absolutely, I want all my research papers to have snappy, attention-grabbing headlines instead of clear descriptions of the subject they are addressing. How about:

"Leading temporal autoencoder expert reveals the No. 1 trick to deep sequence reconstruction you're NOT doing."

"The $$$ secret to temporal autoencoders that AI researchers are hiding from you."

"20 something invents impossible autoencoder technology in their garage that could change almost everything."

spyder | 5 years ago
Uh... no comparison with any other method?
guerrilla | 5 years ago
This title reads like Star Trek mumbo-jumbo. Surely a title built from the summary at the top of the article would make more sense.
nwsm | 5 years ago
"This article does not exist" type headline.