I interviewed a number of people for a few positions, and I never told them that I had detected them using ChatGPT. We structured our interviews in two parts. The first was finding a bug; the first clue that someone was using AI was that they would solve it instantly. The second part was writing something related to our work that had a definitive start and end. Candidates using AI could often get something out, but they had no foundation to reason about it or modify it, and they would quickly become lost. We always said they could use whatever "helps" as long as they showed what they were doing on screen. For some reason, only one person openly showed that they were using AI, and only because they couldn't figure out how to turn it off in the UI. We didn't disqualify anyone for using AI; we disqualified them for dishonesty. If you can't trust someone in an interview, how can you trust them in a remote environment?
consp|2 years ago
Long story short: asking them to make small changes and then tell us what would happen was a surefire way to distinguish the true cheaters from the merely lazy.
I also fondly remember triggering float errors in loops so you'd get an extra iteration, due to the counter ending at 0.999… instead of 1.0.
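A minimal Python sketch of the kind of trap described (the exact loop from the interview isn't shown; this is an illustrative reconstruction using the classic 0.1 accumulation error):

```python
# Counting to 1.0 in steps of 0.1 looks like it should take 10 iterations,
# but 0.1 has no exact binary representation: after ten additions the
# accumulator is 0.9999999999999999, still below 1.0, so the loop runs once more.
count = 0
x = 0.0
while x < 1.0:
    x += 0.1
    count += 1

print(count)  # 11, not 10
```

A candidate who pasted this into an AI without understanding floating point would struggle to predict the extra iteration when asked "what does this print?"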
KronisLV|2 years ago
Honestly, the realistic style of work that's close to how one would actually approach problems in their day to day is pretty much ideal. In my case that would be using a nice IDE, some AI as a glorified autocomplete, IntelliSense and all that as well, in addition to Googling stuff along the way, if needed.
That should be enough to let them know both how I think and how I solve problems and reason about those solutions. Heck, maybe even give me a simple task to build a CRUD app and then talk about the choices I made, if they're serious about hiring me and want to actually see what's inside my brain.
But of course, many places won't have that: they want to put candidates in a situation where they have just a barebones text editor and still expect them to produce good results. Blergh.
yieldcrv|2 years ago
that was interesting, upvoting that employer for honesty and pragmatism
d-cc|2 years ago
Radical honesty has been a core cultural component of many a strong team, and I'm glad to see somebody else mention this. There seems to be something unique about the relationship between coding and the concepts of transparency, honesty, and truth more broadly.
Or maybe that's just a consequence of version control :)
lazide|2 years ago
Chernobyl being one prominent example.
At least in a field like engineering where actual successful results/working output matters, anyway.
There are other fields where the same dynamics are not in play.
One cannot solve (or even avoid) a problem that one refuses to acknowledge exists, after all.
dennis_jeeves2|2 years ago
And what is worse than lies is self-delusion, even if honest. To nitpick on radical honesty: my observation is that most people won't tolerate it; plain honesty appears to be the sweet spot in most cases.
saiya-jin|2 years ago
There is no bigger warning sign than outright lying. A normal, mature person would simply ask beforehand whether AI is allowed.