kovezd | 9 months ago

There are places where: a) weather predictions are unreliable, and b) water is scarce. Just making the right decision about what hour to water yields a huge monthly water saving.


1over137|9 months ago

None of which need AI hype crap. Some humidity sensors, photosensors, etc. will do the job.
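The sensor-driven approach the comment describes can be sketched with a simple threshold rule: water only when the soil is dry and light levels are low (early morning or night), so less is lost to evaporation. The thresholds and normalized readings below are illustrative assumptions, not calibrated values.

```python
def should_water(soil_moisture: float, light_level: float,
                 moisture_threshold: float = 0.30,
                 light_threshold: float = 0.20) -> bool:
    """Decide whether to irrigate from two sensor readings.

    Readings are assumed normalized to 0..1: soil_moisture from a
    humidity sensor, light_level from a photosensor. Water only when
    the soil is drier than the threshold AND it is dark enough that
    evaporation losses are low. Threshold values are illustrative.
    """
    return soil_moisture < moisture_threshold and light_level < light_threshold


# Dry soil at night: water now.
print(should_water(soil_moisture=0.25, light_level=0.05))  # True
# Dry soil under midday sun: wait for evening.
print(should_water(soil_moisture=0.25, light_level=0.80))  # False
```

No model, trained or otherwise, is involved; the whole policy is two comparisons, which is the point the comment is making.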

kovezd|9 months ago

"Need" is a very strong word. We don't need a lot of what we have today.

But as a hobbyist I would rather prompt an LLM than learn a bunch of algorithms and sensor-reading details. It's also very close to how I would think about the problem myself, which makes it easier to debug.

mnky9800n|9 months ago

I think there are two schools of thought. One: the models will get so big that everyone everywhere will use them for everything, and the money will come from API calls. Two: inference will get so cheap that running a model at the edge will cost nothing, so an LLM will be in everything. Then every computational device will have one, as long as you pay a license fee to the people who trained it.

ithkuil|9 months ago

Does it have to be computed at the edge by every person?

kovezd|9 months ago

As the other comment said, "have to" is a very strong word. But there are benefits to it: a) adaptability to local weather patterns, b) no WiFi access on large properties.