top | item 42694020

orasis | 1 year ago

Check out improve.ai if you want to see this taken to the next level. We combined Thompson Sampling with XGBoost to build a multi-armed bandit that learns to choose the best arm given the context. MIT license.
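The comment describes pairing Thompson Sampling with an XGBoost scoring model. As a minimal illustration of the Thompson Sampling half alone, here is a context-free Beta-Bernoulli sketch (improve.ai's actual implementation uses a gradient-boosted model to score arms in context; the `ThompsonBandit` class and the simulated click-through rates below are purely illustrative assumptions):

```python
import random

class ThompsonBandit:
    """Beta-Bernoulli Thompson Sampling: keep a Beta(alpha, beta)
    posterior over each arm's reward rate, sample one draw per arm,
    and pull the arm with the highest draw."""

    def __init__(self, n_arms):
        self.alpha = [1.0] * n_arms  # 1 + observed successes per arm
        self.beta = [1.0] * n_arms   # 1 + observed failures per arm

    def select(self):
        # Sample from each arm's posterior; exploration falls out naturally
        # because uncertain arms occasionally produce the highest draw.
        draws = [random.betavariate(a, b)
                 for a, b in zip(self.alpha, self.beta)]
        return max(range(len(draws)), key=draws.__getitem__)

    def update(self, arm, reward):
        # Binary reward updates the Beta posterior's success/failure counts.
        if reward:
            self.alpha[arm] += 1
        else:
            self.beta[arm] += 1

# Simulate three arms with hidden reward rates; pulls should concentrate
# on the best arm (index 2) as evidence accumulates.
random.seed(0)
true_rates = [0.2, 0.5, 0.8]
bandit = ThompsonBandit(len(true_rates))
counts = [0, 0, 0]
for _ in range(2000):
    arm = bandit.select()
    counts[arm] += 1
    bandit.update(arm, random.random() < true_rates[arm])
```

A contextual version would replace the per-arm Beta posteriors with a learned model (such as XGBoost) that scores each arm given context features, with sampling injected to preserve exploration.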
