top | item 47138993


leoedin | 5 days ago

I can't even imagine how many joules would be used per function call!

As an experiment, it's kind of cool. I'm kind of at a loss as to what useful software you'd build with it, though. Surely once you've run the AI function once, it would be much simpler to cache the resulting code than to repeatedly re-generate it?

Can anyone think of any uses for this?

ryancoleman | 5 days ago

They're handy for situations where it would be impractical to anticipate the ways your input might vary. Say you want to accept invoices or receipts in a variety of file formats where the data structure varies, but you can rely on the LLM to parse and organize them. AI Functions lets you describe how that logic should be generated on demand for the input received, with post-conditions (another Python function the dev writes) that define what a successful outcome looks like. Morgan wrote about the receipt parser scenario here: https://dev.to/morganwilliscloud/the-python-function-that-im... (FYI I'm on the Strands Agents team)
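A rough sketch of the post-condition idea in plain Python (not the actual Strands Agents API; `fake_llm_parse` stands in for the on-demand generated code):

```python
from dataclasses import dataclass

@dataclass
class Receipt:
    vendor: str
    total: float

def post_condition(receipt: Receipt) -> bool:
    # The dev-written check: defines a successful outcome
    # regardless of what input format arrived.
    return bool(receipt.vendor) and receipt.total > 0

def fake_llm_parse(raw: str) -> Receipt:
    # Stand-in for LLM-generated parsing logic; a real AI function
    # would generate this on demand for the format it receives.
    vendor, total = raw.split(",")
    return Receipt(vendor=vendor.strip(), total=float(total))

def parse_receipt(raw: str) -> Receipt:
    receipt = fake_llm_parse(raw)
    if not post_condition(receipt):
        raise ValueError("generated parser produced an invalid receipt")
    return receipt
```

The post-condition is the stable contract; only the parsing logic inside varies with the input.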

simsla | 5 days ago

I've used stuff like this for a hobby project where "effort to write it" vs "times I'm going to use it" is heavily skewed [0]. For production use cases, I can only see it being worth it for things that require using an ML model anyway, like "summarize this document".

[0] e.g. something like the below which I expect to use maybe a dozen times total.

Main routine: In folder X are a bunch of ROM files (iso, bin, etc) and a JSON file with game metadata for each. Look for missing entries, and call [subroutine] once per file (can be called in parallel). When done, summarise the results (successes/failures) based on the now updated metadata.

Subroutine: (...) update XYZ, use metacritic to find metadata, fall back to Google.
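The main-routine shape above (skip existing entries, fan out per file, summarise) can be sketched like this, with `update_entry` as a stub for the real subroutine (actual lookups against Metacritic/Google are assumptions left out here):

```python
from concurrent.futures import ThreadPoolExecutor

def update_entry(filename: str) -> bool:
    # Stand-in for the subroutine: the real version would fetch
    # metadata from Metacritic, fall back to Google, and write
    # the entry back to the JSON file.
    return not filename.startswith("corrupt_")

def fill_missing(rom_files: list[str], metadata: dict) -> dict:
    missing = [f for f in rom_files if f not in metadata]
    # Each file is independent, so the per-file calls can run in parallel.
    with ThreadPoolExecutor() as pool:
        results = list(pool.map(update_entry, missing))
    return {"successes": sum(results),
            "failures": len(results) - sum(results)}
```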

amelius | 5 days ago

You just tell the AI: use as little energy as possible, by whatever means necessary!

pphysch | 5 days ago

Anthropic announces deal to buy 100% of Idaho's potato crop, in return for options, in new energy efficiency push

re-thc | 5 days ago

> run the AI function once it would be much simpler to cache the resulting code than repeatedly re-generate it?

Surely, you'll run a function that does an AI call to cache the resulting code.

ryancoleman | 5 days ago

The initial version on GitHub does not implement caching or memoization, but it's possible and is where the project will likely head. (FYI I'm on the Strands Agents team.)