item 46845633 | songodongo | 28 days ago

Does Copilot include system prompts at the extension level or the API level?

verdverm | 28 days ago

Copilot's prompts live in the extension; the extension builds the system prompt field that you send in the API request to the LLM.
The underlying models also have various guardrails and alignment that you cannot trivially work around.

For Copilot, you can look at the code on GitHub. It is JSX-based, which is interesting: they pass context-budget info around.

For Claude Code, and many others, you can find the extracted prompts online.

At this point, they are all dynamically generated from fragments and contextual data (like what files or language you're working with).
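To make "dynamically generated from fragments" concrete, here is a minimal sketch of the idea: prompt pieces carry a priority, and an assembler keeps the highest-priority ones until a token budget runs out. All names here (`PromptFragment`, `assemblePrompt`) are invented for illustration; this is not Copilot's actual code or the prompt-tsx API, just the general technique.

```typescript
// Hypothetical fragment-based prompt assembly under a context budget.
// NOT Copilot's implementation -- an illustrative sketch only.

interface PromptFragment {
  text: string;
  priority: number; // higher = more important, kept first when budget is tight
}

// Crude token estimate: roughly 4 characters per token.
function estimateTokens(text: string): number {
  return Math.ceil(text.length / 4);
}

// Include fragments in priority order until the context budget is spent.
function assemblePrompt(fragments: PromptFragment[], budgetTokens: number): string {
  const kept: string[] = [];
  let used = 0;
  for (const f of [...fragments].sort((a, b) => b.priority - a.priority)) {
    const cost = estimateTokens(f.text);
    if (used + cost > budgetTokens) continue; // drop what doesn't fit
    kept.push(f.text);
    used += cost;
  }
  return kept.join("\n");
}

// Contextual fragments, e.g. the active file and language:
const prompt = assemblePrompt(
  [
    { text: "You are a coding assistant.", priority: 100 },
    { text: "Active file: src/app.ts (TypeScript).", priority: 50 },
    { text: "Recently opened: README.md, package.json.", priority: 10 },
  ],
  20, // tight budget: the lowest-priority fragment gets dropped
);
console.log(prompt);
```

The real systems are fancier (nested components, per-child budgets), but the shape is the same: contextual data goes in as fragments, and the budget decides what survives.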