niles | 1 year ago
It kinda feels like you're using the LLM to assign "weights" to the important properties of an algo and then directly translating the basic arithmetic accounting of those factors into a prediction. What I'd expect is that the LLM would also read all past news to find similar patterns, then create time slices where its weights could be tested against a control. It could then backtest its own weights to better tune which factors really led to an outcome, and expose this refinement as part of the prediction.
News Data Sources:
https://www.gdeltproject.org/
https://credibilitycoalition.org
https://data.worldbank.org/
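The backtest-and-refine loop described above could be sketched roughly like this. This is a toy illustration under my own assumptions, not anything from the original system: the "events" are synthetic factor vectors standing in for past news slices, the LLM-assigned weights are just a starting guess, and the refinement is a simple greedy coordinate search against backtest accuracy:

```python
import random

def backtest(weights, events):
    """Fraction of historical slices where the weighted score predicts the outcome."""
    correct = 0
    for factors, outcome in events:
        score = sum(w * f for w, f in zip(weights, factors))
        prediction = 1 if score > 0.5 else 0
        correct += prediction == outcome
    return correct / len(events)

def refine(weights, events, step=0.1, rounds=20):
    """Greedy tuning: keep any single-weight tweak that improves backtest accuracy."""
    best = backtest(weights, events)
    for _ in range(rounds):
        for i in range(len(weights)):
            for delta in (step, -step):
                trial = list(weights)
                trial[i] += delta
                acc = backtest(trial, events)
                if acc > best:
                    weights, best = trial, acc
    return weights, best

random.seed(0)
# Synthetic "past news" time slices: three factors per event, but only
# factor 0 actually drives the outcome -- the refinement should discover that.
events = []
for _ in range(200):
    factors = [random.random(), random.random(), random.random()]
    events.append((factors, 1 if factors[0] > 0.5 else 0))

llm_weights = [0.3, 0.3, 0.3]                # hypothetical naive LLM-assigned weights
control = backtest([0.0, 0.0, 0.0], events)  # control: always predicts 0
tuned, acc = refine(llm_weights, events)
print(f"control={control:.2f} initial={backtest(llm_weights, events):.2f} tuned={acc:.2f}")
```

The point of the control is that the tuned weights only earn trust if they beat both the untuned LLM weights and a trivial baseline on held-out slices; a real version would split the slices into train/test periods rather than tuning and scoring on the same data.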