sgtnoodle | 10 months ago

I asked Gemini to format some URLs into an XML format. It got halfway through and gave up. I asked if it had truncated the output, and it said yes and then told _me_ to write a Python script to do it.
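For reference, the task itself is trivial: this is a minimal sketch of the kind of script Gemini punted to the user, wrapping a list of URLs in XML with the standard library. The element names (`urls`, `url`) are assumptions, since the original post doesn't say what format was requested.

```python
# Hypothetical sketch: wrap URLs in XML elements using the stdlib.
# Element names "urls"/"url" are assumed; the actual target format
# from the original post is unknown.
import xml.etree.ElementTree as ET

def urls_to_xml(urls):
    root = ET.Element("urls")
    for u in urls:
        ET.SubElement(root, "url").text = u
    return ET.tostring(root, encoding="unicode")

print(urls_to_xml(["https://example.com/a", "https://example.com/b"]))
```

A file of a few hundred kilobytes of URLs is no obstacle to a script like this, which is part of what makes the truncation frustrating.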

queenkjuul | 10 months ago

On the one hand, it did better than ChatGPT at understanding what I wanted and actually transforming my data.

On the other, truncating my dataset halfway through is nearly as worthless as not doing it at all (and I was working with a single file, maybe hundreds of kilobytes).

walls | 10 months ago

This is my most common experience with Gemini: ask it to do something, and it'll tell you how you can do it yourself, then stop.

edoloughlin | 10 months ago

Given that Gemini seems to have frequent availability issues, I wonder if this is a strategy to offload low-hanging fruit (from a human-effort point of view) to the user. If it is, I think that's still kinda impressive.

ASalazarMX | 10 months ago

Somehow I like this. I hate that current LLMs act like yes-men; you can't trust them to give unbiased answers. If one told me my approach was stupid, and why, I would appreciate it.

GoToRO | 10 months ago

That's a different kind of push back.