top | item 39212471


jhawk28 | 2 years ago

I interviewed a number of people for a few positions and I never told them that I detected them using ChatGPT. We structured our interviews in 2 parts. The first was finding a bug; the first clue that they were using AI was that they would solve it instantly. The second part was to write something related to our work with a definitive start/end. If they were using AI, they were often able to get something out, but they had no foundation to reason about it or modify it. They would quickly become lost.

We always said that they could use whatever "helps" as long as they showed what they were doing on screen. For some reason, only one person openly showed that they were using AI, and that was only because they couldn't figure out how to turn it off in the UI. We didn't disqualify anyone for using AI; we disqualified them for their dishonesty. If you can't trust someone in an interview, how can you trust them in a remote environment?


consp|2 years ago

This sounds like the "coding for engineers" course I was a TA for. Everybody copied everybody else's code, and depending on their effort they modified the variable names, the flow, or nothing at all (including the original author's name).

Long story short: asking them to make small changes and then tell us what would happen was a surefire way to detect the true cheaters rather than just the lazy people.

I also fondly remember triggering float errors in loops so you'd get an extra cycle, due to the counter ending at .999etc instead of exactly 0.
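The effect is easy to reproduce. A minimal sketch in Python (the course's language isn't stated here, so this is just an illustration of the same trap, counting up rather than down):

```python
# A loop that "should" run 10 times: accumulate 0.1 until reaching 1.0.
# In IEEE-754 doubles, ten additions of 0.1 sum to 0.9999999999999999,
# which is still < 1.0, so the loop runs an extra, 11th cycle.
x = 0.0
count = 0
while x < 1.0:
    x += 0.1
    count += 1

print(count)  # 11, not the 10 you might expect
```

The usual fixes are to loop over an integer counter or to compare against a tolerance (e.g. `while x < 1.0 - 1e-9`), rather than testing a float accumulator for an exact bound.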

drakonka|2 years ago

I did a live coding interview a while back where I was sharing my screen. I just pointed out that I'd been testing Copilot and offered to disable it in my IDE. The engineer just waved it off and said I should keep it on. Trying to hide it didn't even cross my mind - either they want to see how I work in a realistic environment with available tooling or they want to see what I can do in a "blind" setup. The company's approach here is actually a potentially good piece of information for the candidate's evaluation of the company as well. Either way, doesn't seem like something worth hiding.

KronisLV|2 years ago

> Trying to hide it didn't even cross my mind - either they want to see how I work in a realistic environment with available tooling or they want to see what I can do in a "blind" setup.

Honestly, a realistic style of work, close to how one would actually approach problems day to day, is pretty much ideal. In my case that would mean a nice IDE, some AI as a glorified autocomplete, IntelliSense and all that, plus Googling things along the way if needed.

That should be enough to show them both how I think and how I solve problems and reason about those solutions. Heck, maybe even give me a simple task to build a CRUD app and then talk through the choices I've made, if they're serious about hiring me and want to actually see what's inside my brain.

But of course, many places won't let that happen - they want to put candidates in front of a barebones text editor and expect them to produce good results. Blergh.

yieldcrv|2 years ago

I just did an interview where the collaborative coding session had an AI assistant built in, just a wrapper around ChatGPT.

That was interesting. Upvoting that employer for honesty and pragmatism.

d-cc|2 years ago

>We didn't disqualify anyone for using AI, we disqualified them because of their dishonesty. If you can't trust someone in an interview, how can you trust them in a remote environment?

Radical honesty has been a core cultural component of many a strong team; I'm glad to see somebody else mention this. There seems to be something unique about the relationship between coding and the concepts of transparency, honesty, and truth more broadly.

Or maybe that's just a consequence of version control :)

lazide|2 years ago

It’s a fundamental part of (reliable) engineering. Historically, many people have died in the ‘harder’ engineering disciplines because someone was hiding things; being able to acknowledge one's lack of knowledge is key to not getting into that state - and to being able to progress/grow at all, IMO.

Chernobyl being one prominent example.

At least in a field like engineering where actual successful results/working output matters, anyway.

There are other fields where the same dynamics are not in play.

One cannot solve (or even avoid) a problem that one refuses to acknowledge exists, after all.

93po|2 years ago

I don't think radical honesty would ever work in a workplace. A high degree of honesty, yes, but not "radical" as it's usually meant.

dennis_jeeves2|2 years ago

>There seems to be something unique about the relationship between coding and the concepts of transparency, honesty, and truth more broadly.

And what is worse than lies is self-delusion, even when honest. To nitpick on radical honesty, my observation is that most people won't tolerate it; plain honesty appears to be the sweet spot in most cases.

saiya-jin|2 years ago

Yes, basically when interviewing you should be looking for warning signs. The CV is what it is - you can't cover any larger area of it extensively in that short time, so you poke randomly and go deep.

There is no bigger warning sign than outright lying. A normal, mature person would simply ask beforehand whether AI is allowed.

isaacfrond|2 years ago

Oh, the horror of people finding bugs instantly. You surely don't want them around in your company.

josephg|2 years ago

> We didn't disqualify anyone for using AI, we disqualified them because of their dishonesty.