top | item 45386935


smjburton | 5 months ago

> ... is it primarily by inputting code snippets or abstract context into something like a Claude or ChatGPT?

I usually provide the initial context by describing the app I'm working on (language, framework, etc.) and the feature I want to build, then attach the relevant files (either as snippets or uploads), including any includes or other files the feature will integrate with.

This keeps the chat context focused, and the LLM still has access to the code it needs to build the feature without seeing the full code base. If it needs more context (sometimes I'll ask the LLM whether it wants access to other files), I'll provide additional code until it seems to have enough to work with to produce a solution.

It's a little tedious, but once I have the context set up, it works well to provide solutions that are (mostly) bug free and integrate well with the rest of my code.
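The manual part of this workflow can be semi-automated. Here's a minimal sketch of a helper that assembles the kind of focused prompt described above (a project description plus only the relevant files); the function name and prompt wording are hypothetical, not anything the commenter actually uses:

```python
from pathlib import Path

def build_context_prompt(description: str, file_paths: list[str]) -> str:
    """Assemble a focused prompt: a short project/feature description
    plus only the files relevant to the feature, rather than the
    entire code base."""
    parts = [description.strip(), ""]
    for path in file_paths:
        p = Path(path)
        # Label each file so the LLM can refer to it by name.
        parts.append(f"--- {p.name} ---")
        parts.append(p.read_text())
        parts.append("")
    parts.append(
        "Build the feature described above using these files. "
        "If you need to see any other files first, ask before answering."
    )
    return "\n".join(parts)
```

The closing line mirrors the commenter's trick of inviting the model to request more files instead of guessing, which keeps the context window small while still letting the model pull in what it needs.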

I primarily work with Perplexity Pro so I have access to, and can switch between, all the pro-level models (Claude, ChatGPT, Grok, etc.), plus Google search results for the most up-to-date information.


nadis | 5 months ago

Thanks! This is a different approach from what I was imagining so really appreciate you explaining.

I haven’t used Perplexity (Pro or otherwise) much at all yet but will have to try.

smjburton | 5 months ago

You're very welcome! Good luck.