item 45925781

hexage1814 | 3 months ago

This is legitimately useful! I've already been using LLMs to write userscripts, but your extension makes the whole process 100x easier, especially because it's already running in the browser, instead of you having to go back and forth copying and pasting code into VSCode or some chatbot.

jmadeano | 3 months ago

That's the goal! This whole project started because I was originally just asking ChatGPT to write userscripts. For generic ones, it was okay, but the feedback loop was slow: I had to save the page archive, manually feed it to the LLM, write a detailed prompt, generate a script (that probably didn't work initially), and then repeat that process until I was happy.

Now I can do that all in seconds (and iterate just as quickly). I also love the ability to easily share scripts I made with friends. Hopefully it's useful for you!

hexage1814 | 3 months ago

Thanks for replying. I wasn't expecting a reply since the thread was so popular and there were a bunch of comments :) Since I have your attention, I would like to ask something that isn't quite about the project itself, but rather about the ease of use that projects like this will bring.

Do you think pages in the future might start locking things down and making it harder for users to customize them? Sort of like DRM? For instance, on Cloudflare captcha check pages, I often have to disable my userscript extension because some of my scripts interfere with the captcha or something.

And as some people pointed out, I'm somewhat skeptical sites would like users modifying their pages, not because those custom modifications wouldn't be useful to their users and make their experience better, but because those sites do not want a better experience for their users. Hell, if they wanted or were okay with that, YouTube Premium would offer you an API so you could watch your stuff in your preferred alternative front-end without being bothered by their horrible official one.

So I'm just curious what your take on that is.

Again, loved the extension. One small suggestion: have the extension store the prompts that generated each script locally, like the conversation history. I'm not sure if this already exists—at least I wasn't able to find it—but I think it could be useful.
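Persisting the generating conversation alongside each script could be as simple as serializing it to JSON under a per-script key. A minimal sketch in plain JavaScript — the function names, the `conv:` key scheme, and the message shape are all hypothetical, not part of the extension; the store object stands in for whatever backend (localStorage, chrome.storage, etc.) an extension would actually use:

```javascript
// Hypothetical sketch: keep the prompt history that produced a script
// next to the script itself, keyed by script id. The storage backend is
// passed in, so it could be localStorage, chrome.storage, or a plain object.

function saveConversation(store, scriptId, messages) {
  // messages: [{ role: "user" | "assistant", content: string }, ...]
  store[`conv:${scriptId}`] = JSON.stringify(messages);
}

function loadConversation(store, scriptId) {
  const raw = store[`conv:${scriptId}`];
  // Return an empty history when nothing has been saved for this script.
  return raw ? JSON.parse(raw) : [];
}

// Example: record the prompt that generated a (made-up) script.
const store = {}; // stand-in for localStorage
saveConversation(store, "hide-sidebar", [
  { role: "user", content: "Hide the sidebar on example.com" },
  { role: "assistant", content: "// generated userscript source here" },
]);
```

With something like this, regenerating or tweaking a script later just means reloading its conversation and appending a new user message.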