I think Raku is better than Python for agent-based systems for a few reasons:
- You don't have to think about concurrency or multithreading the way you do in Python. There is no GIL to worry about, and built-in support for things like Supply and hyper-operators is available right in the language. It is really easy to hook up disparate parts of a distributed agent without reaching for async frameworks or actor libraries the way you would in Python.
- Something I prefer is the OOP abstractions in Raku. They are much richer than Python's. YMMV, depending on what you prefer.
- Better out-of-the-box support for gradual typing and constraints in Raku.
Python wins on the AI ecosystem though :)
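To make the concurrency and typing points above concrete, here is a minimal, self-contained sketch in plain Raku (no external modules; the names are my own toy examples, nothing from TallMountain):

```raku
# A Supply is a built-in asynchronous message stream; no actor library required.
my $supplier = Supplier.new;
my $supply   = $supplier.Supply;
$supply.tap: -> $msg { say "agent received: $msg" };
$supplier.emit($_) for <ping pong>;

# Hyper-operators apply an operation element-wise, and may parallelise it.
my @doubled = (1..5) »*» 2;   # (2 4 6 8 10)

# Gradual typing: a subset type adds a constraint checked at the signature.
subset Prompt of Str where *.chars > 0;
sub ask(Prompt $p) { "you asked: $p" }
say ask("hello");
```

The nice part is that all of this ships with the language, so wiring agent components together with streams and typed boundaries needs no third-party dependency.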
I started messing around with this code several years ago, when the LLM libs in Raku were not as rich as they are today. I thought I needed a specific kind of LLM message-handling structure that could be extended to do tool handling and some Letta-style memory management (which I never got around to!). I had some Python libs of my own, so I ported them. I suspect if I were starting now, I would use what is available in the community. This version of TallMountain is the last of a long series of prototypes, so I never rewrote those parts.
Nice to see others who think that Raku is a good fit for LLMs. I have had some success integrating LLM::DWIM (a Raku command-line LLM client built on LLM::Functions etc.) with a DSL approach to make a command-line calculator based on Raku Grammars:
> crag
> ?^<elephant mass in kg> / ?^<mouse mass in kg> #300000
> ?^<speed of a flying swallow in mph> #30mph
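For anyone curious what the Grammars-plus-DSL angle looks like, here is a toy illustration (this is not crag's actual grammar, just a minimal sketch of the technique):

```raku
# A tiny grammar for "NUM op NUM" expressions.
grammar Calc {
    rule  TOP { <num> <op> <num> }
    token num { \d+ ['.' \d+]? }
    token op  { '/' | '*' | '+' | '-' }
}

# Action class: attach a computed value to the parse tree with `make`.
class CalcActions {
    method TOP($/) {
        my ($a, $b) = +$<num>[0], +$<num>[1];
        make do given ~$<op> {
            when '/' { $a / $b }
            when '*' { $a * $b }
            when '+' { $a + $b }
            when '-' { $a - $b }
        }
    }
}

say Calc.parse('300000 / 30', actions => CalcActions).made;  # 10000
```

In a real tool like crag, the interesting part is that the grammar rules can dispatch to anything, including an LLM call to resolve a phrase like "elephant mass in kg" into a number before the arithmetic runs.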
looking4advice|5 months ago
librasteve|5 months ago
PS. Raku has Inline::Python for when you need a lib from the Python ecosystem (which I am sure you know, but in case others are curious).
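A minimal sketch of what that looks like (assumes Inline::Python is installed via zef and a Python with a shared library is available on the host; the `:eval` flag for getting values back is from my memory of the module's API, so check its README):

```raku
use Inline::Python;   # zef install Inline::Python

# Headline usage: hand a string of Python source to the embedded interpreter.
EVAL 'print("hello from Python")', :lang<Python>;

# A persistent interpreter object lets you build up state across calls.
my $py = Inline::Python.new;
$py.run('import math');
say $py.run('math.sqrt(2)', :eval);   # returns the value as a Raku object
```

So even when an agent needs numpy or an ML lib, the surrounding orchestration can stay in Raku.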
antononcube|5 months ago
BTW, several years ago the LLM revolution hadn't happened yet. Raku started to have sound LLM packages circa March-May 2023.