I pulled the vendor's brochure; if you were curious what a vending machine would use this information for...
They collect demographics on WHO is purchasing what items including gender, age, etc. They use this information for targeting advertising (inc. with "partner media brokers").
They're also proud that when users install their app, it uses "gamification" to increase sales (whatever that means).
Vending machines covered by a large screen stand for everything I hate about contemporary tech.
They improve absolutely nothing from a buyer's perspective. Every step of the transaction is made worse. You can't glance at the entire inventory. You never know how much of an item is left. The machine does not reliably know how much of an item is left. Every interaction lags. And in return I get ads and mini games. Just so some C-suite cretin (guess what the C in C*O stands for) can show his little cretin friends how innovative his farts are.
It is late. I am hungry. My train departs in 2 minutes. Please, I just want a bloody Snickers.
That's quite the brochure. My favorite selling point is the way you can make the machine display products that it doesn't actually have inside, sell them, and give the punter a digital IOU instead of the soda they tried to buy...
> "Customizable UI design - Product selection can be extended to include products not physically in the machine. Consumers can store them in Invenda Wallet [a mobile app] and redeem them somewhere else."
Somehow managing to turn a convenience-driven impulse buy into an additional chore to redeem later.
At the end of the day, there must be some angle I'm not understanding or these features wouldn't actually drive sales. I wonder if the idea is to vend digital products? Drive traffic to nearby physical stores through some kind of targeted digital coupon? Has anyone seen this kind of thing in the wild?
Detecting "gender" by facial recognition on a 21st century uni campus in the USA ... what could _possibly_ go wrong and cause a massive media meltdown?
The machines use the charitably named "demographic sensor" which is obviously the embedded camera linked to a "facial recognition" application, BUT, it doesn't appear that it actually recognizes (or records) faces. Meaning, it's not linking your face to any online identity, or recording your face at all. In fact, the company is European and claims that their entire platform is GDPR compliant, which is... probably true?
Rather, it throws out a series of guesses and confidence values for a person's age, gender, and race, and allows the planogram (and OOH advertising) to change dynamically based on that information.
Which is not necessarily great, but also an entire order of magnitude less invasive than every interaction any casual user has on the internet with any ad ever. Or, frankly, with any POS system recording a repeat purchase from your credit card, from which motivated vendors can back into the rest of the demographic data.
I'm not excusing it, but while the headline reads "facial recognition", it's more a "stereotyping enablement platform", which, while only marginally better, is probably still better.
It's also hilarious to think of the thing displaying the green M&M if you're a "probable" woman, and the red or yellow M&M if you're a "probable" man, and seeing how long it'd take for anyone to correlate the change.
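For illustration, a "guesses plus confidence values" pipeline like the one described above might look roughly like this. This is purely a sketch; the field names, the 0.6 threshold, and the planogram mapping are all my assumptions, not Invenda's actual interface:

```python
from dataclasses import dataclass

@dataclass
class DemographicEstimate:
    """One guess emitted by the sensor: an attribute, a label, and a confidence score."""
    attribute: str     # e.g. "age_band", "gender" (hypothetical names)
    label: str         # e.g. "25-34", "female"
    confidence: float  # 0.0 .. 1.0

def pick_planogram(estimates: list[DemographicEstimate],
                   planograms: dict[str, str],
                   default: str = "generic") -> str:
    """Choose which product layout / ad to display.

    Takes the highest-confidence estimate that clears a threshold and maps
    its label to a planogram; falls back to a generic layout when nothing
    is confident enough. Only derived labels are used here, never pixels.
    """
    confident = [e for e in estimates if e.confidence >= 0.6]
    if not confident:
        return planograms.get(default, default)
    best = max(confident, key=lambda e: e.confidence)
    return planograms.get(best.label, planograms.get(default, default))

# A "probable woman" gets the green M&M layout:
estimates = [
    DemographicEstimate("gender", "female", 0.82),
    DemographicEstimate("age_band", "18-24", 0.55),
]
layouts = {"female": "green_mms", "male": "red_yellow_mms", "generic": "full_lineup"}
print(pick_planogram(estimates, layouts))  # -> green_mms
```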
I’m sure this is real but just for fun I’m entertaining the idea that someone is trolling — either with a fun dialog title or the app name.
I worked at a company in the late 2000s that moved to an office with automatic urinals. These were fairly novel at the time and had a matte black plastic unit with a flush handle that also had a shiny window made of dark, IR transparent plastic. It was clearly some kind of proximity sensor for the autoflush but some joker made an official “do not touch the cameras” sign that wound a few people up.
It’s probably something like Apple’s Face ID sensor, which is a “camera” in the strict sense of the word but it captures a depth map rather than an image. I’m willing to believe that they aren’t transmitting images or even raw depth maps over the Internet (for one thing, it would be a lot of data and it’s hard to conceive of what the use of such data would be), but they are almost certainly attempting to correlate demographic information, or possibly even individual users, with specific purchasing patterns.
In 2018 Renesas had a "Facial Expression Kit Giveaway" contest and I was one of the winners. It was the "RZ Omron Facial Expression Kit giveaway", RZ being one of their newer micro lines at the time. I had to sign an affidavit to get my prize, which I did. Never did use it and it eventually went to a local hamfest. Their example code was a vending machine.
Isn't facial recognition widespread in the US yet?
In the UK we're already using it everywhere. Most stores and checkout systems have it. It's used in CCTV across the country. It's used by police to identify protestors. Schools in my area even install spyware on kids phones to monitor them and their families.
I can't imagine anyone would mind being identified by a vending machine here lol... What's the risk?
This stuff drives me mental, but the article ends up not being quite the black pill it could be. Nice to see students fighting back at the grassroots, the University (hopefully) acting, and people feeling empowered to do something under the law.
We have to keep pushing for this stuff to become less and less normalised, and for penalties to become more and more serious. When that happens, people will continue to feel more empowered to fight this stuff. We should also make sure laws are clear that people have the right to "vandalise" such devices. In my opinion, taking a bat to one of these things should at least be defensible in court.
I wonder where the line in the sand exists for what people feel is “okay” tracking and “not okay” tracking. Here it seems to be that almost universally, the students dislike the facial recognition. Many folks I’ve spoken to otherwise don’t care about being tracked online. I suppose people are more sensitive to being tracked when it transitions into the real world and becomes more “real.”
There’s an implied consent. I know that creating a Gmail account means that Google has access to all my emails. By using Gmail, I’m relatively aware that they’re building an ad profile on me, and I implicitly consent to that.
But if I found out they’re collecting the bloodwork pdfs my doctor sends and selling them to insurance companies, that would be objectively beyond any reasonable consent.
If you’re selling me M&Ms, I have a very very low tolerance for what I consider appropriate data collection.
I don't mind being targeted online since it changes the ads I see from "Acai berry smoothies" to "beach vacations". Something I am not interested in, to something I am interested in (due to browsing history). That makes my experience better and I understand the trade off while using a free product (Google Maps, Facebook etc etc).
However, the difference in this case is that the vending company (or someone along the chain) is using it to serve demographic ads to the customer of the vending machine. I still have to pay for the product and it's not changing non targeted ads to targeted ads (there were no ads before). I'm sure they aren't passing along some of the ad revenue to me since there's no competition (I'm sure all vending machines in the area are owned by the same company). So this is just a source of extra revenue for the company and serves no purpose to the user.
I’m going to call bullshit on the claim that they don’t store data: “It does not engage in storage, communication, or transmission of any imagery or personally identifiable information.”
They obviously store data. You have to write the image to disk after the camera takes it. I guarantee that if a 3rd party audit were conducted they’d find data stored on there.
Heck, I’ll do the audit for zero upfront, payment contingent on finding images stored on there.
I know the HN guidelines say not to comment that someone hasn't read the article, but the comments section can be a little useless when literally all of the top comments I see are from posters who appear not to have read the article.
In summary (and, in fairness, these technical details are pretty far down the article):
> The machines are owned by MARS and the manufacturer is Invenda.
> MARS did not respond to requests for comment from CTV.
> Invenda also did not respond to CTV’s requests for comment but told Stanley in an email “the demographic detection software integrated into the smart vending machine operates entirely locally.”
> “It does not engage in storage, communication, or transmission of any imagery or personally identifiable information,” it continued.
> According to Invenda’s website, the Smart Vending Machines can detect the presence of a person, their estimated age and gender. The website said the “software conducts local processing of digital image maps derived from the USB optical sensor in real-time, without storing such data on permanent memory mediums or transmitting it over the Internet to the Cloud.”
The most important thing to keep in mind about clauses like this is that at any time it's only one software update away from changing the behavior and suddenly starting to send everything. You'd never know. (And: That point in time may be in the past.)
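To make that concrete, here is a hypothetical sketch (not the vendor's real code) of how a single remotely delivered config value could decide whether anything leaves the machine, with no change at all to the code that honest auditors would have reviewed:

```python
import json

# Imagine this arriving alongside any routine software update.
REMOTE_CONFIG = json.loads('{"upload_demographics": false}')

def handle_detection(estimates: dict, config: dict) -> list[str]:
    """Process one detection event and report which actions were taken.

    With the flag false, everything stays on the machine. Nothing about
    the vendor's current claim prevents a later update from shipping the
    exact same code with the flag flipped to true.
    """
    actions = ["update_local_display"]
    if config.get("upload_demographics", False):
        actions.append("upload_to_cloud")  # one config flip away
    return actions

print(handle_detection({"gender": "female", "confidence": 0.82}, REMOTE_CONFIG))
# -> ['update_local_display']
```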
> “It does not engage in storage, communication, or transmission of any imagery or personally identifiable information,”
Probably bullshit.
Firstly, "personally identifiable information" is usually defined very specifically, like 'we didn't record their social security number'.
Secondly, they send a bunch of information extracted from the image, and they probably store the raw images on the machines, which are then manually extracted at some point. Like photocopiers that store a scan of everything put on the bed on their own hard drives.
Thank you for the excerpt. I attempted to read the article but have a firmly negative reaction to the auto-playing video that CTV chose to pop over it.
I hate that with time there is just going to be more and more covert non-consensual facial recognition everywhere. It should be outlawed, but obviously that does not align with the interests of the powerful so it'll never happen.
there's a lot of roads all leading to the same place. there needs to be a constitutional amendment about the right to own your own data: if someone collects it from you they need to pay you to collect it, and you should receive a cut of any transaction where it's sold
at least that's my suggestion
if people collected my information and sent me a check in the mail every month for a decent amount, i might not be so annoyed.
Just smash it if you see it. Or someone suggested super glue. If we all actively destroy these devices, it'll eventually not be worth it to replace them.
Am a privacy advocate, yet think this might be an (understandable) overreaction.
I remember older generation tech used in digital signage applications that wanted to identify ages and genders to show them appropriate content, say apparel for example. It ran locally and was well meaning (enough).
Unfortunately in a world with Meta, ubiquitous telemetry, Ring doorbells, NSA, and data breaches and brokers, … we can’t trust anyone to do the right thing any longer.
So this type of benign-ish functionality, which would have been all local just two decades ago, unfortunately now can’t be trusted either.
Still, I’m very happy to hear a young person or two being concerned. On my last trip to a college campus there was a mandatory app and no cash accepted in many places.
I think the key difference here is that with the digital signage technology you speak of, the recognition was a necessary requirement to show the appropriate content (clothing, for example).
In this case, the recognition doesn't do anything for the end user of the vending machine. It is purely data for other brokers to use. The vending machine didn't say to the user that people of their demographic were buying Diet Coke over normal Coke.
Slapping facial recognition on something that didn't have it before, without any obvious functionality or benefit for the person whose face is being scanned, is very much cause for concern, in my opinion.
In that case I want the Sensor™ to be physically unable to send out actual images to the computer that handles the vending machine, so no software update can actually start doing something else than they claim it to do originally.
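In software terms, the guarantee being asked for looks like a sensor module whose only host-facing output is a small fixed record of derived attributes, so there is no call through which pixels could ever cross. A hypothetical Python sketch follows; note that in software this is only convention, and the real guarantee the comment asks for has to live in hardware (a sensor whose output bus physically carries only metadata):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class SensorReport:
    """The only data type the sensor module is wired to emit."""
    person_present: bool
    estimated_age_band: str
    estimated_gender: str
    confidence: float

class DemographicSensor:
    """Stand-in for a sealed sensor: frames exist only inside read() and
    are discarded before it returns; the host-facing API exposes no way
    to retrieve them."""

    def read(self) -> SensorReport:
        frame = self._capture_frame()  # stays local to this call
        report = self._estimate(frame)
        del frame                      # nothing retains the pixels
        return report

    def _capture_frame(self) -> bytes:
        return b"\x00" * 64            # fake pixel data for the sketch

    def _estimate(self, frame: bytes) -> SensorReport:
        return SensorReport(True, "25-34", "unknown", 0.5)

host_side = DemographicSensor().read()
print(host_side)  # the host only ever sees the derived record
```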
I'm pretty sure many big box places like Home Depot and Walmart are using actual facial recognition on everyone who is in or near any of their stores. This vending machine sounds pretty benign by comparison.
See here, they're super proud of it too:
https://a.storyblok.com/f/184550/x/e7435c019e/brochure-svm_g...
These companies are a blight.
https://www.youtube.com/watch?v=S2hwnGrn3go
Is the "USB optical sensor" really a hidden camera, and are the "digital image maps" images from the hidden camera?
If so, then are they also being weaselly in the rest of the sentence?
> without storing such data on permanent memory mediums or transmitting it over the Internet to the Cloud.”
This is the only related item I could find today:
"OMRON Develops Real-Time Facial Expression Estimation Technology"
https://www.omron.com/media/press/2012/10/e1023.html
It worked in Venice, and what a fine method of self expression!