item 32056189

AllenNLP will be unmaintained in December

62 points | lgessler | 3 years ago | github.com

18 comments

[+] lol1lol | 3 years ago
It's one of those overly abstracted libraries. Too hard to tweak anything. HuggingFace Transformers did a better job of keeping things simple.
[+] schmmd | 3 years ago
AllenNLP started before transformers, so it provided high-level abstractions for experimenting with model architectures, which is where much of NLP research was happening at the time. Transformers definitely changed the playing field, as they became the basis for most models!
[+] ta988 | 3 years ago
What's a good alternative?
[+] make3 | 3 years ago
By far, most work in NLP now uses pretrained models, so people use HuggingFace Transformers. https://huggingface.co/docs/transformers/main/en/index

HuggingFace Transformers is a huge, high-quality open-source repository of pretrained models and associated code. People usually combine it with PyTorch Lightning or Fairseq, as far as I know.
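
A minimal sketch of that pretrained-model workflow, assuming the `transformers` package is installed (the default sentiment-analysis checkpoint is downloaded on first use; the input sentence is just an example):

```python
from transformers import pipeline

# pipeline() wraps model download, tokenization, and inference
# behind a single callable.
classifier = pipeline("sentiment-analysis")

result = classifier("AllenNLP served the NLP community well.")
print(result)  # e.g. [{'label': 'POSITIVE', 'score': 0.99...}]
```

This is the kind of one-liner access to pretrained models that shifted the ecosystem away from architecture-experimentation frameworks.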

[+] marvinalone | 3 years ago
It depends on what you use AllenNLP for. AllenNLP has a ton of functionality for vectorizing text. Most of the tokenizer/indexer/embedder stuff is about that. But these days we all use transformers for that, so there isn't much of a need to experiment with ways to vectorize.

If you like the trainer, or the configuration language, or some of the other components, you should check out Tango (https://github.com/allenai/tango). One of Tango's origins is the question "What if AllenNLP supported workflow steps other than read -> train -> evaluate?". We noticed that a lot of work in NLP no longer fit that simple pattern, so we needed a new tool that could support more complex experiments.

If you like the metrics, try torchmetrics. Torchmetrics has almost exactly the same API as AllenNLP metrics.

If you like any of the nn components, please get in touch with the Tango team (on GitHub). We recently had some discussion around rescuing a few of those, since there seems to be some excitement.