top | item 47161118


bojo | 5 days ago

Perhaps? I just used it to analyze one of my 96k Zig codebases with Claude Code, and here is (part of) what came back. (I snipped out the deeper analysis above, as it exposes my private project, but it was all correct.)

  Head-to-Head

  ┌──────────────┬─────────┬─────────────┬────────────┐
  │    Metric    │  Opty   │ Traditional │   Ratio    │
  ├──────────────┼─────────┼─────────────┼────────────┤
  │ Input tokens │ ~13,500 │ ~39,408     │ 2.9x fewer │
  ├──────────────┼─────────┼─────────────┼────────────┤
  │ Tool calls   │ 21      │ 61          │ 2.9x fewer │
  ├──────────────┼─────────┼─────────────┼────────────┤
  │ Round trips  │ 5       │ 9           │ 1.8x fewer │
  └──────────────┴─────────┴─────────────┴────────────┘
I had it run the analysis separately with traditional tooling vs. opty and count the actual tool calls and input tokens. My prompt was basically, "do a full analysis of this entire codebase."
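For anyone who wants to sanity-check the "Ratio" column, it falls out of simple division of the reported counts. A minimal sketch (the numbers are the ones from the table above; the labels are just names for the two runs):

```python
# Recompute the ratio column from the reported (opty, traditional) counts.
metrics = {
    "Input tokens": (13_500, 39_408),
    "Tool calls":   (21, 61),
    "Round trips":  (5, 9),
}

for name, (opty, traditional) in metrics.items():
    ratio = traditional / opty
    print(f"{name}: {ratio:.1f}x fewer with opty")
```

This prints 2.9x, 2.9x, and 1.8x, matching the table.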


verdverm | 5 days ago

You're focused on quantity, but that's yesterday's problem: tokens are getting cheaper and contexts are getting longer.

try quality instead