top | item 47174492

londons_explore | 4 days ago

I think we aren't far from AI being able to solve this sort of problem too.

Imagine you are Apple and could just set an LLM loose on the codebase for a weekend with the task of reducing the RAM usage of every component by 50%...

al_borland | 4 days ago

From everything I’ve seen, LLMs aren’t exactly known for writing extremely optimized code.

Also, what happens to the stability and security of my phone after they let an LLM loose on the entire code base for a weekend?

There are 1.5 billion iPhones out there. It’s not a place to play fast and loose with bleeding edge tech known for hallucinations and poor architecture.

teeray | 4 days ago

> LLMs aren’t exactly known for writing extremely optimized code.

They are trained on everything, and as a result they write code like the average developer on the Internet.

rescbr | 4 days ago

If you ask an LLM to code whatever, it definitely won’t produce optimized code.

If you direct it to a specific task, finding memory and CPU optimization points based on perf metrics, then it's a completely different world.

robinwassen | 4 days ago

They kind of do if you prompt them. As a POC, I had mine reimplement the Windows calculator (almost fully feature-complete) in Rust, running in 2 MB of RAM instead of the ~40 MB the Windows 11 version uses.

A handwritten C implementation would most likely be better, but there is so much to gain just from slaughtering the abstraction bloat that it doesn't really matter.

alpaca128 | 4 days ago

LLMs are trained on currently existing code.