aurelian15 | 4 years ago

This is just one example of the "analogue hole" [1] problem shared by all anti-cheat/DRM systems. At least in theory, no technology can prevent exploits like this short of dystopian levels of surveillance and locking down computing devices even further. By that I mean encrypted communication on all computer buses (including USB and HDMI), access to those buses only via physically hardened "secure" enclaves, and, in the end game, big-brother-like surveillance (think electronic proctoring solutions). I think this is exactly the problem with such DRM schemes: the ensuing cat-and-mouse game will inevitably lead to trampling the user's freedoms, because locking down computing devices and environments to ridiculous levels is the only way in which DRM can be made to work.

Of course, for now, cheats like the one featured in the article should be fairly easy to detect (at least judging from the linked video). The bot's motion is extremely jerky; a simple rule-based system, or, if you want to be fancy, a neural-network-based anomaly detector, should be able to catch it.
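A minimal sketch of what such a rule-based check might look like: flag aim trajectories whose instantaneous angular speed or acceleration exceeds plausible human limits. The thresholds here are purely illustrative guesses, not tuned values, and real telemetry would be noisier.

```python
# Hypothetical rule-based jerkiness check. samples is a time-ordered list
# of (t_seconds, yaw_degrees) pairs; thresholds are illustrative only.

def is_jerky(samples, max_speed=720.0, max_accel=20000.0):
    """Return True if the aim trajectory looks inhumanly abrupt."""
    # Instantaneous angular speeds between consecutive samples (deg/s).
    speeds = []
    for (t0, y0), (t1, y1) in zip(samples, samples[1:]):
        dt = t1 - t0
        if dt <= 0:
            continue
        speeds.append((t1, (y1 - y0) / dt))
    # Flag any speed or speed-change (acceleration) beyond the thresholds.
    for (t0, v0), (t1, v1) in zip(speeds, speeds[1:]):
        dt = t1 - t0
        if dt <= 0:
            continue
        if abs(v1) > max_speed or abs(v1 - v0) / dt > max_accel:
            return True
    return False
```

For example, a steady 60 deg/s pan at 60 Hz passes, while a 90-degree snap within a single frame trips the speed threshold.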

On the cheat authors' side, this could be easily circumvented by including a "calibration phase", in which user input trains a simple neural network to stochastically emulate the dynamics of the user's sensor-action loop. The cheat could then act slightly faster than the user, giving them an edge while still matching their unique dynamics profile.
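As a much simpler stand-in for the neural network described above, the same idea can be illustrated by fitting a Gaussian to the user's observed reaction delays and having the bot sample from that distribution, slightly sped up. Everything here (class name, the 0.9 speedup, the 50 ms floor) is a made-up assumption for illustration.

```python
# Toy "calibration phase": record the user's reaction delays, then sample
# bot delays from the user's own distribution, scaled slightly faster.
import random
import statistics

class CalibratedTiming:
    def __init__(self, speedup=0.9):
        self.delays = []          # observed human reaction delays (seconds)
        self.speedup = speedup    # bot reacts slightly faster than the user

    def observe(self, delay):
        """Record one human reaction delay during calibration."""
        self.delays.append(delay)

    def sample_delay(self):
        """Draw a bot reaction delay matching the user's timing profile."""
        mu = statistics.mean(self.delays)
        sigma = statistics.stdev(self.delays) if len(self.delays) > 1 else 0.0
        # Clamp so the bot never reacts implausibly fast.
        return max(0.05, random.gauss(mu, sigma) * self.speedup)
```

A real cheat would model mouse trajectories as well as timing, but the principle — learn the user's distribution, then sample from it — is the same.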

I wonder where this will lead eventually, and I genuinely feel sorry for all the people who pour their heart and soul into competitive gaming; I don't think this kind of cheating can, or (see above) should, be prevented in the long run. The best outcome I can imagine is that online gaming becomes more cooperative, or once more converges back to small groups of people who know and trust each other.

[1] https://en.wikipedia.org/wiki/Analog_hole

Edit: Spelling, grammar, and clarity

ActorNightly | 4 years ago

The solution is really simple: make all competitive gaming events LANs with standardized hardware that players don't touch before the event starts.

For regular online gaming, you can train a neural net to detect cheats like this, biased by the player's score. If the cheat introduces enough error that the player is still killable, it's not ruining the experience for the rest of the players.

marcinzm | 4 years ago

>By that I mean encrypted communication on all computer buses (including USB, HDMI)

That only delays things since in the end you still need a human being to be able to play. So you can have a camera looking at the screen and a mouse/keyboard with some wires soldered to the key points.

runnerup | 4 years ago

Indeed. Or a robot arm moving the mouse. The analog hole will always exist. However, it may prove hard to make a computer move the mouse and type like a human. Heuristics will likely be able to separate human from bot input for quite a while still.

The game makers probably enjoy a large dataset-size advantage over the cheat makers.

voidnullnil | 4 years ago

>will inevitably lead to trampling the user's freedoms

People keep saying this, but it happened 20 years ago. This reminds me of shit like the postal service requiring photo ID to receive a package, and people complaining about the NSA years later. Now you need a phone to play a game, and some of the most popular titles require literal photo ID checks. Imagine sending your photo ID, which, if stolen, lets people steal your money, to a bunch of new grads running a game studio.

This is what people (kids and manchildren) accept to address the overstated problem of game cheating. I've played thousands of hours of games over 20 years, and the number of cheaters I ran into is around 10 or 20. Most players (including the ones who complain about "cheaters") don't even have a clue what a game cheat is. They think some guy has a cheat that only works in some weird scenario that happens 1 in 100 games. Can you guys stop making me need photo ID to play some stupid game? This is no different from every obnoxious statist concern that gets addressed by some charlatan who purports to be saving the world by ruining everyone's day (almost any time I install or configure a game my day is ruined; imagine typical dependency hell but 10x worse). And no, I haven't run into many cheaters not because of "sophisticated anticheat" (stuff like PunkBuster is extremely incompetent), but because public hacks simply get blacklisted once they become big enough to matter.

herbst | 4 years ago

On a much simpler level: back in the day I wrote bots for flash/browser games, basically by detecting specific pixels on the screen and acting on them. Sounds stupid, but with simple games this worked very well.

I never got 'detected'; how would they? And some of my bots easily became more skilled than I ever was.
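The core of such a pixel bot can be sketched in a few lines: scan a frame for a distinctive target color and return where to "click". Here the frame is just a 2D list of RGB tuples standing in for a screenshot; a real bot would grab the screen with something like Pillow's ImageGrab, which is omitted. The target color and tolerance are made-up values.

```python
# Toy pixel-matching bot core: find the first pixel close to a known
# target color. frame is a list of rows, each a list of (r, g, b) tuples.

TARGET = (255, 0, 0)  # hypothetical target/enemy color

def find_target(frame, tolerance=10):
    """Return (x, y) of the first pixel within tolerance of TARGET, or None."""
    for y, row in enumerate(frame):
        for x, px in enumerate(row):
            if all(abs(a - b) <= tolerance for a, b in zip(px, TARGET)):
                return (x, y)
    return None
```

Chained to a click/keypress call in a loop, this is already a complete bot for games simple enough that a single color identifies the target — which is also why it is invisible to server-side detection: it consumes exactly what the human sees.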

rightbyte | 4 years ago

LAN tournaments solve the problem, though. So it is not that bad.