
OpenTSLM: Language models that understand time series

280 points | rjakob | 5 months ago | opentslm.com

Paper: https://www.opentslm.com/OpenTSLM-whitepaper.pdf

Repo: https://github.com/StanfordBDHG/OpenTSLM

Foundation models excel at text, images, audio, and video, but lack temporal reasoning capabilities over time-series data streams that run the real world: vitals, prices, telemetry, grid loads, clickstreams, machine logs, business processes.

Time Series Language Models (TSLMs) are open foundation models that support time series as a native modality alongside text, letting users ask questions and get explanations and recommendations in natural language.

The OpenTSLM White Paper released today demonstrates state-of-the-art temporal reasoning performance. Unlike prior approaches, its cross-attention architecture remains computationally viable on long time series.

The results:

- Sleep staging: 4.4× accuracy with a model 200× smaller (~880× efficiency gain)

- Activity recognition: ~6× accuracy with a model 200× smaller (~1,000× efficiency gain)

- ECG interpretation: ~2× accuracy with a model 200× smaller (~400× efficiency gain)

It is also the first model to process 12-lead ECG signals and text simultaneously, with chain-of-thought reasoning validated by cardiologists.

For the first time, foundation models can handle multiple time-series streams of varying lengths concurrently, integrate them with textual context, and produce interpretable explanations verified by domain experts (clinicians).

This work is the result of a growing collaboration between researchers from Stanford, ETH Zurich, UIUC, University of St. Gallen, University of Washington, Google, and Amazon.

It points to the next foundation model frontier: temporal intelligence that unlocks proactive healthcare, adaptive robotics, resilient infrastructure, and new forms of human-AI collaboration.

80 comments


copypaper|5 months ago

I understand this provides a way to interact with ts data via natural language, but is there any benefit to this over tool calling to a library that uses signal processing and/or rule based algos (or using machine learning if the data is noisy/variable)?

For example, you ask an off-the-shelf LLM to analyze your ECG data. The LLM uses a tool to call out to your ECG ts analysis library. The library iterates over the data and finds stats & ECG events. It returns something like "Average heart rate: 60bpm, AFib detected at <time>, etc...". The LLM has all the info it needs to give an accurate analysis at a fraction of computational cost.
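The tool-calling flow described above can be sketched in a few lines. Everything here is illustrative: `analyze_ecg` is a hypothetical stand-in for a real signal-processing library, and the peak detector is a toy, but it shows the shape of what the LLM would receive back instead of the raw waveform.

```python
import statistics

def analyze_ecg(samples, fs=250):
    """Hypothetical stand-in for a signal-processing ECG library.

    Returns summary stats an LLM could consume via a tool call,
    instead of ingesting the raw waveform itself.
    """
    # Toy R-peak detection: local maxima above a fixed threshold.
    peaks = [i for i in range(1, len(samples) - 1)
             if samples[i] > samples[i - 1]
             and samples[i] > samples[i + 1]
             and samples[i] > 0.5]
    # RR intervals in seconds -> average heart rate in bpm.
    rr = [(b - a) / fs for a, b in zip(peaks, peaks[1:])]
    bpm = 60 / statistics.mean(rr) if rr else 0
    return {"avg_hr_bpm": round(bpm), "n_beats": len(peaks)}

# Synthetic trace: one spike per second at 250 Hz for 10 s -> 60 bpm.
fs = 250
trace = [1.0 if i % fs == 0 else 0.0 for i in range(10 * fs)]
summary = analyze_ecg(trace, fs)
print(f"Average heart rate: {summary['avg_hr_bpm']} bpm")
```

The LLM only ever sees the one-line summary string, which is the "fraction of computational cost" argument in miniature.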

On top of that, this requires a large annotated dataset and a pre-trained model. And correct me if I'm wrong, but I don't think it's possible to have a "general" model that could handle arbitrary time series data. I.e. a model that is trained on ECG data would not be compatible with stock market data. And there isn't a way to have a model that understands both stock market data and ECG data.

manquer|5 months ago

You couldn’t run that on the edge, though.

The point is to run it reliably on the edge. Nobody sane would want their heart rate monitor to depend on the cloud, with the uptime and reliability that come with any remote service, plus the extra challenges of LLM inference.

The goal would be running on the edge in addition to the standard rules-based detection these machines already have, adding the advanced pattern detection LLMs can provide to reduce alert fatigue and to detect new classes of complex patterns these sensors typically miss.

SebastianSosa|5 months ago

I understand this provides a conversation interface for interacting with internet scale data (ChatGPT), but is there any benefit to this over searching in Google then clicking on the top link, (avoiding the ad) clicking accept my cookies, reading the header, scrolling down, Xing out of premium subscription, reading rest of article, repeat for the 4 next links?

Ok bro.

let_tim_cook_|5 months ago

"Stanford Repo Released Sep 31, 2025" Seems like something sampled from a distribution with non-zero probability that the day after Sep 30, 2025 would be the 31st....

rjakob|5 months ago

Thanks for the note. Ironically, the post is about models built to understand time.

lomase|5 months ago

They fixed it already.

Animats|5 months ago

The underlying work is something called "Flamingo".[1] This is a system for understanding interleaved text and images in sequence. So it can process two "modalities" that are both sequential. This new work seems to put some kind of time token in one "modality" channel, leading to more awareness of time.
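A minimal sketch of that Flamingo-style mechanism, with text tokens as queries cross-attending over time-series patch embeddings. This is generic single-head attention in numpy, not the actual OpenTSLM code; dimensions and weights are made up.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8                    # shared embedding width
n_text, n_ts = 5, 120    # 5 text tokens, 120 time-series patches

text = rng.normal(size=(n_text, d))  # text-token embeddings (queries)
ts = rng.normal(size=(n_ts, d))      # time-series patch embeddings (keys/values)

Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
Q, K, V = text @ Wq, ts @ Wk, ts @ Wv

scores = Q @ K.T / np.sqrt(d)                       # (n_text, n_ts)
attn = np.exp(scores - scores.max(axis=1, keepdims=True))
attn /= attn.sum(axis=1, keepdims=True)             # softmax over the series
fused = text + attn @ V                             # residual update of text tokens
print(fused.shape)       # text length unchanged regardless of series length
```

The key property for scaling: the text sequence the LLM processes stays the same length no matter how long the time series is; only the cross-attention itself grows with the series.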

(The web site is too cute. Applying a left to right gradient on text is a bit much.)

[1] https://arxiv.org/pdf/2204.14198

FilosofumRex|5 months ago

A fun litmus test for it would be to de-trend the S&P 500 into its individual components and identify and rank the contributions of all 500 stocks. But that alone would not get it a job at Rentec or the NSA.

Unlike most commercial & medical applications where signals are stationary with white (uncorrelated) noise, the NSA & Rentec mostly deal with non-stationary signals with regime changes and correlated noise, which can't be denoised without loss of information.

The idea is not so much to predict the next stock price tick or to decipher an intercepted signal (most likely encrypted anyway), but rather to detect "regime changes", i.e. the quickest detection of a change of pattern in non-stationary signals. The detected pattern is then matched to known trading patterns for a particular stock or to expected spy activity.
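Quickest change detection of the kind described is classically done with a CUSUM statistic (Page's test); a one-sided sketch, with parameters that are purely illustrative, not tuned:

```python
def cusum(xs, target=0.0, drift=0.5, threshold=5.0):
    """One-sided CUSUM (Page's test): flag the first index where the
    cumulative deviation above `target` (minus an allowed `drift`)
    exceeds `threshold`."""
    s = 0.0
    for i, x in enumerate(xs):
        s = max(0.0, s + (x - target - drift))
        if s > threshold:
            return i
    return None

# Mean shifts from 0 to 2 at index 50; detection lags a few steps behind.
series = [0.0] * 50 + [2.0] * 50
print(cusum(series))
```

The lag between the true change point and the alarm is the detection-delay/false-alarm trade-off that `drift` and `threshold` control.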

brandonb|5 months ago

This is very cool! From the paper, this technique seems to work well for question answering in time-series.

In medical AI, IMO, the most exciting work is detecting disease signals too subtle for humans: for example, estimating ejection fraction from an ECG (which cardiologists can't do, but algorithms can, and they have been tested in RCTs: https://www.nature.com/articles/s41591-021-01335-4 ).

Since OpenTSLM tokenizes time-series into an LLM embedding space, would that process prevent capturing such subtle signals? Or could the approach be extended to handle that use case?

RealLast|5 months ago

OpenTSLM models are exactly made to capture these subtle signals. That was one of the original motivations. The model integrates the raw time series data via cross attention, with concrete time series representations learned by a raw time series encoder.

aerugo_|4 months ago

What I really want to know is: how do LLMs understand time series at all? Admittedly, even the best LLMs are not fantastic at analyzing time series and tabular data, but they also don't completely suck at it. Why is that? They seem better at it than my intuition tells me they should be.

In my opinion we need a multi-modal model that is great at both tabular datasets and text analysis. Most analytical work in economics, policy, public health, medicine etc. requires crosschecking between both. Current-gen LLMs are not good enough at generating novel insights by looking at tables and text at the same time. I also haven't found any data on this, so please serve it to me on a plate if I'm wrong.

esafak|5 months ago

Wouldn't it be better to have the model write a script that calls a TS library and give it access to an interpreter to run it? That's how a human would do it. I'm not convinced of the need to bake this into the model. What can you do with native TS capability that you can't by tool calling?

sync|5 months ago

Anthropic is encouraging the "have the model write a script" technique as well, buried in their latest announcement on Claude Agent SDK, this stuck with me:

> The Claude Agent SDK excels at code generation—and for good reason. Code is precise, composable, and infinitely reusable, making it an ideal output for agents that need to perform complex operations reliably.

> When building agents, consider: which tasks would benefit from being expressed as code? Often, the answer unlocks significant capabilities.

https://www.anthropic.com/engineering/building-agents-with-t...
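As a concrete instance of the "have the model write a script" pattern, here is the kind of short, disposable analysis script an agent might emit and then run in an interpreter. The helper name and thresholds are made up for illustration:

```python
import statistics

def describe_series(xs, z=3.0):
    """Summarize a series and flag points more than z standard
    deviations from the mean -- a typical one-shot agent script."""
    mean, stdev = statistics.mean(xs), statistics.pstdev(xs)
    anomalies = [i for i, x in enumerate(xs) if abs(x - mean) > z * stdev]
    return {"mean": mean, "stdev": stdev, "anomalies": anomalies}

# Flat series with one spike at index 20.
xs = [10.0] * 20 + [90.0] + [10.0] * 20
report = describe_series(xs)
print(report["anomalies"])
```

The agent then feeds only `report` back into its context, exactly the division of labor esafak describes.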

ForHackernews|5 months ago

Does it actually have a concept of time? Does it understand causality?

RealLast|5 months ago

I think you missed the point. Would you call an image analysis library to describe an image or reason over a sequence of images? Check out some of the plots in the paper to see what these models can do.

pks016|5 months ago

Looks promising! I'll try it once I get home today.

I work with a large amount of audio time series data (not words, and all with subtle variation). It would be interesting to see how it compares to traditional statistical methods.

resters|5 months ago

It would be nice if claude code could monitor a time series of my heart rate to realize when it is soiling the bed.

lsh0|5 months ago

fwiw, I'm finding claude2 released a few days ago to be a lot less infuriating

LudwigNagasena|5 months ago

As I understand it, the model is trained for classification and interpretation of time series data, but have you tried benchmarking it at forecasting? Explanation and recommendations are often deeply intertwined with forecasts, so there must be at least some effect there?

amelius|5 months ago

If you view a byte sequence as a time series then I suppose this could be a good file compression algorithm.

lacoolj|5 months ago

Like hitting a thumb tack with a sledge hammer

zubairov|5 months ago

This is very cool! Amazing work guys!

llmslave|5 months ago

Guaranteed there are hedge funds with language models that can predict time series. A lot of really good time series research has never been published, and is locked in some guy's head that lives in a 20 million dollar apartment in NYC

fogzen|5 months ago

When I worked at an ML hedge fund 6 years ago, t-SNE performed the best and momentum was the feature that best predicted stock movements.

The actual algorithms for predicting price movement were fairly simplistic; most of the work was around strategies for dealing with overfitting and how to execute the trades. Accuracy was around 51-55% (a bit better than a coin toss), so it was a big challenge to actually execute the trades and still make a profit after fees and other nonsense. Finding alpha is what ML is used for, but that's just the first step.
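The momentum feature mentioned above is typically just the relative price change over a lookback window; a minimal sketch (lookback and prices are made-up examples):

```python
def momentum(prices, lookback=3):
    """Simple momentum feature: relative price change over `lookback`
    steps. A naive rule goes long when the value is positive."""
    return [p / q - 1.0 for p, q in zip(prices[lookback:], prices[:-lookback])]

prices = [100, 101, 103, 106, 110, 109]
print([round(m, 3) for m in momentum(prices)])  # [0.06, 0.089, 0.058]
```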

1980phipsi|5 months ago

One of the difficulties with these models would be backtesting investment strategies. You always need to make sure that you are only using data that would have been available at the time to avoid look-ahead bias.
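A walk-forward loop is the simplest guard against that bias: at each step the strategy is handed only the prices observable up to that time. A toy sketch (the trading rule is made up):

```python
def backtest(prices, signal_fn):
    """Walk forward through the series, giving the strategy only data
    that would have been available at each step -- no look-ahead."""
    pnl = 0.0
    for t in range(1, len(prices) - 1):
        visible = prices[: t + 1]        # nothing from the future
        position = signal_fn(visible)    # +1 long, -1 short, 0 flat
        pnl += position * (prices[t + 1] - prices[t])
    return pnl

# Naive momentum rule: long if the last observed step was up.
rule = lambda hist: 1 if hist[-1] > hist[-2] else -1
print(backtest([100, 101, 102, 101, 100], rule))
```

With feature pipelines and model retraining the same principle applies: every artifact used at time t must be computed from data timestamped at or before t.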

reactordev|5 months ago

Can confirm, kdb+ exists… and you'll probably never be able to get your hands on it. There are lots of models that use it. And they are indeed locked inside some guy's head high up in the towers of midtown.

fmbb|5 months ago

Why would they use LLM for this?

constantcrying|5 months ago

This isn't (just) time series forecasting, it is about interacting with time series data through natural language.

senorrib|5 months ago

I doubt those are language models.

yawnxyz|5 months ago

would be cool to use this to predict series of passages for directed evolution, e.g. appelman protocol or similar, in phage/host interactions

qwe----3|5 months ago

“Researchers from Google” (did an internship)

pdntspa|5 months ago

OF COURSE the good stuff is proprietary....

ivape|5 months ago

Seems like MIT?

syntaxing|5 months ago

How many parameters are a basic model?

ivape|5 months ago

You'd be fine tuning a base model, and they suggested 1B and 3B variants, possibly bigger.

t_mann|5 months ago

> Read the White Paper

> A universal TSLM will power proactive healthcare, adaptive robotics, resilient infrastructure, and new forms of human-AI collaboration.

> scientists, engineers, and builders from ETH, Stanford, Harvard, Cambridge, TUM, CDTM, Google, Meta, AWS, and beyond

What's with all this fuss? Why not just upload your paper to arXiv? Time series models are interesting enough, but from the abstract it's not even clear whether they are using transformers or a recurrent architecture like xLSTM - arguably a more intuitive choice for time series - or something else. This website is barely distinguishable from a crypto/DeFi pitch.

RealLast|5 months ago

The full paper is on the website. The arXiv release of the exact same paper is pending. Click the "Read the White Paper" button to get the full paper.

ghc|5 months ago

> Few studies use cross-attention to integrate time series into LLMs

I mean, sure, but why would you need a study for that? There's plenty of prior work using cross-attention to integrate time series dynamics into non-LLM transformer models, right? Or maybe I'm assuming that integrating a time series embedding with an LLM is easier than it is.

Looking at the repo, the training data seems extremely health-focused. I guess I would have to tune the model with my own datasets if I want it to answer questions about multi-source sensor data?

orbifold|5 months ago

This is a terrible idea and direction but it will not stop people from pursuing it and as soon as they have a critical mass of people reviewing each other it will go on for quite a while. Transformers for time series is one of those things that seems to make sense but not really.

EGreg|5 months ago

Can you elaborate as to why, actually? What specifically makes this the case

iLoveOncall|5 months ago

You don't need specially trained LLMs for this. My team has been successfully using Claude 3.5 for a year to analyze huge time series data sets (close to the max context window), without anything special beyond a prompt describing the task at hand.

nowittyusername|5 months ago

I agree, LLMs are capable of doing this right out of the box if you provide grounding data like the current time and a few other things in the system prompt. It's really odd that this is getting any attention.
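One way to provide that grounding is to serialize the series with explicit timestamps and state the current time at the top of the prompt. A sketch; the task wording and sampling step are hypothetical:

```python
from datetime import datetime, timedelta, timezone

def series_to_prompt(values, start, step_s=60):
    """Serialize a time series into plain text an off-the-shelf LLM can
    read, with per-point timestamps and an explicit 'current time'."""
    lines = [f"{(start + timedelta(seconds=i * step_s)).isoformat()} {v}"
             for i, v in enumerate(values)]
    now = start + timedelta(seconds=len(values) * step_s)
    return (f"Current time: {now.isoformat()}\n"
            "Task: describe trends and anomalies in the series below.\n"
            + "\n".join(lines))

start = datetime(2025, 1, 1, tzinfo=timezone.utc)
prompt = series_to_prompt([1.0, 1.1, 9.9, 1.2], start)
print(prompt.splitlines()[0])
```

The whole series must fit in the context window, which is the main limit the parent comment runs into.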

NwtnsMthd|5 months ago

This sounds very interesting, would you be able to share a little more about your process? What works and what doesn't?

RLAIF|5 months ago

[deleted]

Imad_mkdm|5 months ago

[deleted]

esafak|5 months ago

You know this adds nothing.

posidoli|5 months ago

That is outstanding work and will revolutionize the approaches in this topic!

Y_Y|5 months ago

Bad bot