top | item 47189084

A1aM0 | 3 days ago

This is a very fair critique, and I agree with it.

You’re right that many strong engineers can’t legally share employer code, and “has OSS time” is not a universal signal. So I’m now thinking of this as a dual-path system:

1) Public evidence path (for people with OSS/public technical work), where existing contributions are treated as reusable evidence.

2) Structured assessment path (for people without public artifacts), using scoped tasks, pair debugging, and incident reasoning mapped to the same rubric.

So OSS should be an advantage when present, but never a requirement.

Also agree on AI confounders: raw public activity can’t be trusted at face value anymore. We need to weight traceable process signals (review back-and-forth, bug-to-fix chain, consistency over time) higher than easy-to-generate text/code volume.
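To make the weighting idea concrete, here's a minimal sketch (all weights and signal names are illustrative, not a real system): process signals that are hard to fake get large weights, raw volume signals get small ones.

```python
# Hypothetical weights: traceable process signals dominate
# easy-to-generate volume signals. Numbers are illustrative only.
PROCESS_WEIGHTS = {
    "review_back_and_forth": 3.0,   # sustained review dialogue on real PRs
    "bug_to_fix_chain": 3.0,        # issue -> diagnosis -> fix -> test, linked
    "consistency_over_time": 2.0,   # activity spread over months, not one burst
}
VOLUME_WEIGHTS = {
    "loc_added": 0.2,               # raw code volume: cheap to generate now
    "comment_count": 0.2,           # raw text volume: cheap to generate now
}

def evidence_score(signals: dict) -> float:
    """Weighted sum of observed signals; unknown keys score zero."""
    weights = {**PROCESS_WEIGHTS, **VOLUME_WEIGHTS}
    return sum(weights.get(k, 0.0) * v for k, v in signals.items())
```

Under this weighting, one traced bug-to-fix chain outweighs a large pile of generated code, which is the point.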

If you were hiring with this, what would be your minimum bar for “credible evidence”?


apothegm | 3 days ago

Structured assessments aren’t exactly anything new in tech hiring.

A1aM0 | 3 days ago

You’re right: structured assessments are old news.

The thing I’m testing is not “new tests,” but a tighter system: derive the rubric from real team tasks, apply the same rubric to both public evidence and live assessments, and make every score traceable (with lower weight on easy-to-fake AI-era signals).
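A toy sketch of the "one rubric, two evidence paths" idea (dimension names and the 0-3 scale are made up for illustration): both public evidence and live assessments are mapped onto the same rubric, with the path recorded so every score stays traceable.

```python
# Hypothetical shared rubric: every candidate is scored on the same
# dimensions, whichever path the evidence came from.
RUBRIC = ("debugging", "design_tradeoffs", "communication")

def score_candidate(evidence: dict, path: str) -> dict:
    """Clamp raw per-dimension evidence to a 0-3 scale and tag the path."""
    assert path in ("public", "assessment")
    return {
        "path": path,  # kept for traceability of where each score came from
        "scores": {dim: min(3, max(0, evidence.get(dim, 0))) for dim in RUBRIC},
    }
```

Because both paths produce the same score shape, public-evidence and live-assessment candidates can be compared directly instead of through two incommensurable processes.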

If that doesn’t improve consistency/speed/quality vs current hiring loops, then it’s just old process with new branding.