I already don't want to have personal conversations on Teams. My tech-savvy colleagues, and the ones who can be convinced, are on Signal, where we talk about job offers and relationships. A few others do Instagram and get to see my art photography. And occasionally I'll bump into someone when we're both in the office and be able to say whatever, not looked over by AI. There's a real chilling effect on getting to know people.
> And occasionally I'll bump into someone when we're both in the office and be able to say whatever, not looked over by AI.
At my present workplace, we have cameras with microphones. They have also installed spyware on laptops and desktops so they can see employees' screens. They also go through emails and keep a log of all employees' web traffic.
Which is one of the reasons I handed in my resignation a few days ago.
Is that a recent thing? Or only in the US? Age related? Size of the company? The level of personal communication I've witnessed over such tools is pretty superficial, casual. But perhaps that's just my age, location, or the fact that the last time I worked for a large company, Skype was still new.
Your employer is not obliged to maintain a communication system so that you "get to know people". If you consider how much these tools cost to maintain, it's completely understandable that companies want to have 100% content control.
So now we can't use Teams to have the "water-cooler" moments that supervisors claim we need, but really we are having them on Signal or iOS, and they just can't measure that. Organizations really, really, really hate transparency.
They don't want you shooting the shit on company communication channels, because that has limited upside and much less limited potential downside.
Remember the famous "will the atom bomb test ignite the atmosphere" gentleman's bet those scientists had? Nobody actually thought it would, but they discussed it semi-seriously. Today, discussing some fanciful bad outcome like that (be it the mundane failure to deliver a product or something more interesting) is a liability when it's sitting in your company email servers. Even if that bad thing isn't what winds up happening, or the people speculating aren't in a position to have accurate info, the other side's lawyer or the regulator will try to construe it as proof that the company should have known ahead of time.
Or, more likely, say there's some sexual harassment or adultery kerfuffle between employees. It's way better for the company if none of that happened on company provided communications tools.
From the company's perspective it's avoidable risk to have work communication tools be used for informal BSing between employees. But they can't realistically prevent that so they introduce Skynet in order to make people watch their mouths and move those sensitive conversations elsewhere.
This is nothing new: corporations have scanned instant messages, emails and even recorded phone calls for decades, and will fire you based on that evidence for violations of corporate policy. And will sue you or call the cops if they detect potential crimes.
I’m kind of surprised so many people are shocked by this. I know of one company where dozens of people were fired because their email was scanned for external job interviews and the CIO had a report, which he used to prematurely cut staff when he needed to save budget.
The only difference now is that the tech is smarter and cheaper so that you don’t need to pay as many people to spy on their coworkers.
Your defence against this is to find a job where you're too valuable for them to do anything, as in any jurisdiction with at-will employment.
> The only difference now is that the tech is smarter and cheaper so that you don’t need to pay as many people to spy on their coworkers.
Your comment implies this isn't potentially an enormous difference. The difference is between having to pay people to spy on their coworkers, and having computers that do it passively, invisibly, continuously, in real time?
> I’m kind of surprised so many people are shocked by this. I know of one company where dozens of people were fired because their email was scanned for external job interviews and the CIO had a report, which he used to prematurely cut staff when he needed to save budget.
On a related note, if you were a Microsoft employee, how comfortable would you be talking with recruiters on LinkedIn?
The difference here is that "find a new job" might be accomplished by LinkedIn...which is also owned by Microsoft.
So Microsoft's cloud ecosystem generally owns your work email, and the site you use to find a job.
Honestly, I don't care what they say (because it'll be "we datamined LinkedIn to add data to our 'employee leaving' filter, but don't worry, we only used the public APIs and just bypassed rate limiting, so technically..."). Microsoft and LinkedIn, specifically, need to be forcibly broken up, given this sort of control over the full employee lifecycle.
"The leavers classifier detects messages that explicitly express intent to leave the organization, which is an early signal that may put the organization at risk of malicious or inadvertent data exfiltration upon departure".
In other words "how to promote and encourage paranoid behaviors from employers" :(
Once I discovered that every school-issued machine had a VNC server running on it I assumed that the contents of my screen were being recorded at every moment. Turns out I was half right, as I caught up with the IT guy afterwards and the principal (a paranoid sociopath who shouldn't be anywhere near kids) wanted the ability to catch kids when she thought they were looking at non-school related things.
It's fundamental safety in a society with these sorts of companies to assume: company infra = logged until you die. Once your company has come under a subpoena for information or under some kind of long term discovery, you write emails under the assumption they're going to be in court for everyone and your mother to see.
It's extremely revealing that this particular classifier is framed as "prevent data loss" not "intercept skills loss" or "figure out why your employees want to leave and then fix that".
This maybe isn't the intent, but the practical result is to use the private sector to implement CCP-like social credit scores, isn't it? By doing everything in the private sector, they get around all those pesky constitutional protections.
So the rights and freedoms only protect citizens from government oppression... if a private company does it, then it's fine, because corporations are also free people.
They're free people who somehow get to oppress and censor individual humans (otherwise the corporation is the one being oppressed), but let's pretend that we can punish them by "taking our dollars elsewhere", such that it's our own fault.
IMO, tracing this toward the root, I find along the way the grand system of royalties and other kinds of rent schemes. Nobody cares, because we prefer the promise (for the majority it is only a promise) that we can come up with something great, make it BIG, and then get to live off rent or other kinds of royalty payments.
Most large companies had "social credit scores" for decades. They're called performance reviews. Nothing new here. You just had unreasonable and naïve expectations. You now know MS Teams is monitored. You are free to seek employment in companies that don't use MS Teams if you dislike this so much.
Submitted title was "Office 365 implementing AI to detect employees colluding, leaving and more". That broke the site guidelines: "Please use the original title, unless it is misleading or linkbait; don't editorialize." - https://news.ycombinator.com/newsguidelines.html
The proper place to include that sort of interpretation is by adding it in a comment in the thread. Then your interpretation is on a level playing field with everyone else's (https://hn.algolia.com/?dateRange=all&page=0&prefix=false&so...). Also, a comment gives you room to actually substantiate your interpretation.
On the other hand, a thread like this probably wouldn't have gotten attention without the sensational title in the first place, so this kind of submission is a borderline case and at worst a venial sin. (We still change the title once it does make the frontpage though.)
The title "Microsoft Purview: Additional classifiers for Communication Compliance (preview)" sounds like nothing at all. It doesn't seem like an exaggeration to say that the reality is literally Big Brother in a corporate context. Seems like your changing the title is just going to have the effect of reducing attention given to something that really needs to be exposed in clear terms.

I tried to summarize the article in the title. Will follow the guidelines from now on.
I think if you have an E5 license there is already thoughtcrime functionality built in. I remember someone demoing this to me in a Teams user group, and no one seemed to think it was creepy at all. In addition to flagging keywords, it also used AI to detect undesirable thoughts and emotions, under the guise of anti-harassment and compliance. Unfortunately I can't remember the name of the feature, but I think it might be this:

https://docs.microsoft.com/en-us/microsoft-365/compliance/co...

So I think if Microsoft existed in the world of 1984, they would easily be the preferred tech vendor for IngSoc.

Side note, do you think this would also detect the money laundering and bribery going on within Microsoft itself?

https://www.theverge.com/2022/3/25/22995144/microsoft-foreig...
Side-side note: I think the reason that is still allowed to keep going on, given that the SEC knows about it and that there's ample evidence, has to do with national security.
It's extremely troubling that given all this corporate authoritarian AI tech they built that Microsoft is still trying to be the voice of reason about the dangers of AI.
> It's extremely troubling that given all this corporate authoritarian AI tech they built that Microsoft is still trying to be the voice of reason about the dangers of AI.
Just speculating, but this phenomenon could be explained either by 1) diverse internal opinion: the parts of Microsoft responsible for warning against AI are not the same parts pushing authoritarian AI software, or 2) moat-building/ladder-pulling: Microsoft is warning people of the danger of _other people's_ AI, but of course you can trust _their_ AI, because they're the ones warning you, after all!

Everything in corporate email has always been subject to being read by others; there is no expectation of privacy.

As we've seen from countless court cases, they range from boring nothingburgers to evidence of actual crimes.
There is no way they will be able to make an AI at this point that will
A) Be accurate
B) Work across multiple contexts
C) Run efficiently on billions of messages
This will just result in many false positives and unnecessary eavesdropping on employees' personal conversations.
Once it's revealed that an organization is using this, people will quickly move all conversations to another platform, even if policy forbids that, potentially resulting in an even greater security risk.
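The false-positive worry is easy to demonstrate with a toy sketch. Nothing below reflects Microsoft's actual classifier; it's just naive keyword matching in Python, roughly the crudest thing a "leaver" detector could be, and even these made-up patterns over-flag routine chatter:

```python
import re

# Toy "leaver" classifier: naive keyword matching (illustrative only;
# the patterns are invented, not taken from any real product).
LEAVER_PATTERNS = [
    r"\bresign\w*\b",
    r"\bleav\w*\b",
    r"\btwo weeks'? notice\b",
]

def flag_leaver(message: str) -> bool:
    """Return True if any 'intent to leave' pattern matches the message."""
    text = message.lower()
    return any(re.search(pattern, text) for pattern in LEAVER_PATTERNS)

# A real departure note is flagged...
assert flag_leaver("I'm resigning at the end of the month")
# ...but so is routine chatter, which is the false-positive problem:
assert flag_leaver("I'm leaving the meeting early today")
assert flag_leaver("Don't forget to book your parental leave")
# Benign messages pass, but only until they happen to contain a keyword.
assert not flag_leaver("lunch at noon?")
```

A smarter model narrows this gap but never closes it, which is why running it over billions of messages still buries reviewers in spurious hits.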
And as per usual, if Microsoft gets someone fired (e.g. comes in looking for money laundering, finds out the staff member is making fun of their boss), there will be no repercussions.
If you accidentally fire 10% of good people you still have 90% of them left, and if that lets you fire 80% of the staff who are committing thought-crime, it's probably a win.
Part of what makes stuff like this surprising is expectations of privacy. For example, if you start a video chat on Hangouts or Zoom, even (or maybe especially) on a work account, you don’t expect that meeting to be recorded or analyzed surreptitiously. I think in many places it would be illegal.
Because of this, one might feel like the same standard applies to other one-on-one and small group communication avenues, but it’s actually completely the opposite.
I honestly first thought this article was satire. It is so unreal to find myself in a world where this is acceptable. What's next? Installing cameras in restrooms to catch offline conversations?
I have zero confidence that this system is smart enough to differentiate between all these things and the legitimate variants thereof (e.g. collusion and cross team collaboration are basically indistinguishable) that companies actually want people doing or discussing and likely outnumber the bad by orders of magnitude.
Yeah and the sad thing is companies that don't know better will flip it on because "wow look at smart Microsoft's latest feature, we better use this!" and then inadvertently fire Sally in HR because her asking people to sign a card for the VP's birthday looked suspiciously like a violation of corporate gift policies.
This seems to be Office 365 implementing monitoring of official communications of employees & contractors for the office account? I don’t think it extends to a personal Office 365 account; at least it didn’t seem to.
Why is this exactly newsworthy? Any communication through official channels is the property of the employer anyway. To collude, leave & other stuff use personal channels maybe.
> Any communication through official channels is the property of the employer anyway.
Why is there always this attitude of "it's a private business, they can do what they want". Why does the fact that they can do something distract from criticism of them doing it? The fact that this tech exists is horrifyingly dystopian on its own merits. But it also has widespread consequences in a country with so many employment monopolies and opportunities for outright wage slavery. Heavy-handed workplace surveillance and heuristics-based crap are becoming increasingly difficult to simply opt-out of.
> Why is this exactly newsworthy? Any communication through official channels is the property of the employer anyway.
Pretty clear one of the major things they're going for here is detecting "jobsite troublemakers", ie employees who are upset with job conditions/agitating for improvements/discussing salaries/etc, which is given specific legal protection. It is explicitly legal and protected for employees to discuss labor conditions, organizing, or salaries regardless of whether you do it "on company property" or "on company chat". Just because the company owns it doesn't mean you have no legal rights - just like a company can dismiss you for no reason but they can't dismiss you for any reason.
They are wrapping it up with "think of the children" justifications like "employees who are discussing salary might be considering leaving and they might take nefarious action if they do so" but that's the core of the situation here - these are tools to detect and fight against legally-protected activities by employees.
> Workplace collusion: The workplace collusion classifier detects messages referencing secretive actions such as concealing information or covering instances of a private conversation, interaction, or information.
> "The leavers classifier detects messages that explicitly express intent to leave the organization, which is an early signal that may put the organization at risk of malicious or inadvertent data exfiltration upon departure"
Hypothetically, do you think it would be a good idea for Microsoft to build a classifier and provide managers with a list of potentially "religiously devout" employees, e.g. based on correlated work/away periods, language patterns, etc.? Sure, it's a legally protected classification, but there's an elevated risk of extremist activity, which surely presents a business risk, right? So why not?
> Why is this exactly newsworthy? Any communication through official channels is the property of the employer anyway.
Is this sentence meant as descriptive or normative? Because there are definitely jurisdictions where it is not that easy (e.g. the EU).
If it is meant normatively, then I wonder if you also think they "own" all conversations happening on corporate grounds? Should they be allowed to record anywhere on corporate property, and use what they record in any way?
Are there? I think it's much more likely there are engineers that were told to work on this, and thought that working on this would be fun, let them have an ML project, be a learning opportunity, give them something good for their performance packet, give them good experience, be good for visibility in the org, and many things like that.
I don't know why jumping to the most far-reaching evil option is the default in threads like this.
I am almost surprised it took this long to get to this point, but I suppose the recent resignation wave made it into a viable product offering. My last MBA class was an HR analytics class that, among other things, dealt with email sentiment analysis and similar techniques. Part of me was thinking the average HR person won't touch this stuff, but if a company just happened to offer something that would do it for them..
I've always had a preference against working with Microsoft products, but this is getting to the point where I'd find a new gig instead of being subjected to this stuff.
I think at least some of my staff will likely resign because mandatory deep inspection / network monitoring is being forced onto everyone's computers by the IT department. It's probably the only way to stop it at the moment. Unfortunately the buzzword "zero trust" has been bent toward meaning "spy on everything your employees do".
What is the effect on creative expression and sociability, between co-workers, when they know they're being analyzed by a computer to figure out if they should be fired?
This is absolutely going to be used against unionizers, which is what's really meant by "colluding". In the US this is going to get a lot of people fired. In other parts of the world, it's going to get them killed. This kind of software is Zyklon B for the 21st century.
We have come a long way now that we have these advanced classifiers. You would be surprised how low-tech the initial product was; by low-tech I mean devoid of any ML/AI. We went GA at the end of 2019.
We saw a lot of interesting use cases too, e.g. Japanese enterprises wanting to detect cases like suicidal intent; that is why we have multiple types of classifiers.
I worked on the infra side (not ML). That too was “low-tech”, or the more apt term would be “not the latest tech”. Core parts of the app were part of a monolith (think Exchange). We were also using a really old .NET Framework version for our MVC app. A lot of the storage technologies we used were very MS-specific as well. AFAIK, all of this is still valid today.
They have "workplace collusion" as a category, and even more dystopian shit, like:
EDIT: apparently these 2 are just jokes, sorry for not checking my sources!
`Negative emotions: Expressions of sadness, unhappiness, discontent, anger, rage, anguish, or existential ennui, as these may negatively affect team cohesion.
Joy: Language suggesting hopefulness, optimism, anticipation of a brighter future, faith in humankind and/or in a loving and benevolent creator, as these may imply that the user is thinking about topics other than the best interests of the organization.`
Seems to only apply to messages, for now. My understanding is that unless a call on Teams is explicitly recorded, there's no capability for the organization to monitor the content within.
Is this still accurate? Are there any features in the pipeline planning to change this?
Microsoft offering "communications compliance" within the same product is certainly chilling enough as it is. The reality where people lose their job as a result of previously-protected casual [voice] chat doesn't seem so crazy now. All it takes is missing a quietly-introduced feature update by a week before the organization flips the switch and doesn't tell anyone.
I’m sure automatic transcripts of all calls are just a few developments away - they’d be desirable for call centers, and I suspect they’ll be available to all eventually.
Sounds like it's time to set up a scheduled batch file that sends a bunch of messages around to trigger watchdogs like this, as well as the NSA PRISM keywords, just for funsies.
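For what it's worth, the chaff idea could look something like this. A tongue-in-cheek Python sketch: the phrase list is made up, and the Teams incoming-webhook URL is a placeholder you'd have to create yourself:

```python
import json
import random
import urllib.request

# Hypothetical phrases chosen to tickle a keyword-based watchdog.
TRIGGER_PHRASES = [
    "thinking about resigning",
    "let's keep this conversation off the record",
    "cover up the spreadsheet",
]

def build_payload() -> bytes:
    """Build a Teams incoming-webhook JSON body with a random phrase."""
    text = f"Chaff: {random.choice(TRIGGER_PHRASES)}"
    return json.dumps({"text": text}).encode("utf-8")

def send(webhook_url: str) -> None:
    # POST the payload to a Teams incoming webhook (URL is a placeholder).
    req = urllib.request.Request(
        webhook_url,
        data=build_payload(),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req)

# Scheduling would be left to cron / Task Scheduler, e.g.:
#   */30 9-17 * * 1-5  python3 chaff.py
```

Whether this is wise, given that it would presumably itself violate policy, is another question entirely.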
It's interesting looking at the way they try to sell this monitoring to the employees as a positive thing[1]. At least the wider population can experience what it's like to live under DTEX[2].
I hope the people implementing all these policies and technologies are seriously weighing the consequences of their actions. I suspect that they are not.
MS Office Home > Admin > Exchange Admin Center > Mail Flow > Rules > Click the plus sign for New Rule > Create New Rule > Apply this rule if > Subject or Body includes > Specify a word or phrase
How good the AI is depends on the flood of false positives the current system generates. If MS is true to form, getting anything useful will come at great expense.
The #1 thing they search for is notably missing from the list.
At one of my previous employers, they sold part of the company to an outsourcing enterprise, including the employees, and founded a new company to move the remaining employees into.
As part of the company being sold, when I wanted to interview with the new company, my manager-to-be sent me his phone number and advised me not to use Teams for any sensitive conversation.
Why is everything duplicated in this announcement? The list of classifier descriptions effectively appears twice, the first time with the text of the "What you need to do to prepare" (which, btw, says exactly nothing on how to prepare) appended to each item.
What even is this site? It looks like grade A content rehashing from various MS sites...
There are good and bad sides to these features. The only rule of thumb: never do non-work-related things in the workplace or in facilities provided by the company, no matter how good your performance is or how good your relationship with your superior is.
Hmm, I think I have a new reason not to use any Microsoft products in the office. I can even claim an ethics issue with interacting with them now. Unfortunately, the existence of this feature breaks my trust that management hasn't abused it; the only way to avoid this is by not engaging with Microsoft offerings such as Word or Excel in the office.
Wish I could read the page, but apparently my ad blocker is too offensive. I'd be fine with supporting the publisher through online ads, but I am really not okay with the tracking those advertisers do. Ditch the tracking and the annoying ads, and I'll ditch the ad blocker. Until then, we have to agree to disagree; the Faustian bargain of internet advertising is untenable.
This is no way to defend against insider threats. Any real threat will use other means of communication. Meanwhile, this just treats everybody as if they can't be trusted.
This is the whole point of culture and society. Mass surveillance didn’t/doesn’t work for the NSA/CIA and it sure isn’t going to work for corporate paymasters either.
Employees get what they need and give what they can..
But seriously, I always found it amusing that once you step into a corporation you can get food, drinks & other amenities for free.. almost like it's a socialist society.. But when said employees step outside, they are first in line for the capitalist agenda..
The only way this sort of thing changes is with labor organization ie unionization.
The government won’t save you from efforts like this. The government represents the interests of the capital owning class.
The demonization of unions is one of the most successful cases of propaganda in the last century. It's gone so far that people will die on the hill of defending Jeff Bezos against paying slightly more taxes, because everyone seems to think they'll be Jeff Bezos one day.
I see that phrase thrown around a lot. It's a variant of "you're never going to be a billionaire (so you shouldn't be against X)." Why do people assume that you have to think you'll be a billionaire to be against something that would affect billionaires negatively? Is something only wrong if you think you'll find yourself in that position one day?
Already saved from this sort of thing by just being self employed for the past 10 years. I get paid for every hour of work with no spying and just a weekly status update with my clients. It's a simple relationship and I am honestly not sure I could ever go back.
> The demonization of unions is one of the most successful cases of propaganda in the last century
Ever notice how unions are somehow all the same entity, and seem to have to answer for things completely different unions in completely different industries did?
Nobody treats corporations this way, even though (if you look at interlocking BoD membership) there's a more reasonable case to be made for collusion in some industries...
>The only way this sort of thing changes is with labor organization ie unionization.
>The government won’t save you from efforts like this. The government represents the interests of the capital owning class.
You realize that the power/existence of "labor organization ie unionization" is dependent on the government? Without government protection labor unions don't stand a chance.
Office 97 and Windows XP was something of a high point in personal computing. The internet has enabled entirely new product categories, but it has also badly eroded old ones with the solvent of MRR greed. Merely selling a thing just ain't good enough, especially if it's software. Even offline applications are SaaS, now, where the "service" are frequent updates that leave you at the perennial mercy of every company from which you purchase software (and every company with which they do business, recursively). I'm normally pretty sanguine about business models, but when I lay it out like this, I find it quite disturbing.
Two years ago I decided to fire up my Pentium 100 and write a technical plan on it using Office 2000, for my real-life corporate job. It worked magnificently, no fuss, and the plan presentation went fine. Faster on a 100 MHz machine with 16 MB of RAM than whatever monstrosity underlies O365 and Google Docs.
I'll happily use an old Office, just please splice in the "What do you want to do?" search bar so I don't have to hunt through nested menus/ribbons for some obscure formatting option I use once every 6 months.
If you've ever thought your employer isn't monitoring the chat, then you're a fool. I'd go as far as to say that if you think there is any form of electronic communication that isn't being monitored on some level, you're also being foolish.
There's a difference between monitoring and logging, and nobody is reading the chat logs or even paying attention to chat metrics in many workplaces because the value of doing so is dubious given the potential for employee backlash.
CPAhem|3 years ago
I tried to summarize the article in the title. Will follow the guidelines from now on.
jtbayly|3 years ago
behnamoh|3 years ago
[deleted]
4oo4|3 years ago
https://docs.microsoft.com/en-us/microsoft-365/compliance/co...
So I think if Microsoft existed in the world of 1984, they would easily be the preferred tech vendor for IngSoc.
Side note, do you think this would also detect the money laundering and bribery going on within Microsoft itself?
https://www.theverge.com/2022/3/25/22995144/microsoft-foreig...
Side-side note: given that the SEC knows about it and that there's ample evidence, I think the reason it's allowed to keep going has to do with national security.
It's extremely troubling that, given all this corporate authoritarian AI tech they built, Microsoft is still trying to be the voice of reason about the dangers of AI.
codemonkey-zeta|3 years ago
Just speculating, but this phenomenon could be explained either by 1.) diverse internal opinion: the parts of Microsoft responsible for warning against AI are not the same parts pushing authoritarian AI software, or 2.) moat-building/ladder-pulling: Microsoft is warning people of the danger of _other people's_ AI, but of course you can trust _their_ AI, because they're the ones warning you after all!
parasubvert|3 years ago
Everything in corporate email has always been subject to being read by others; there is no expectation of privacy.
As we’ve seen from countless court cases, they range from boring nothingburgers, to evidence of actual crimes.
unknown|3 years ago
[deleted]
bearjaws|3 years ago
To be useful at all, a classifier like this has to:
A) Be accurate
B) Work across multiple contexts
C) Run efficiently on billions of messages
This will just result in many false positives and unnecessary eavesdropping on employees' personal conversations.
Once it's revealed that an organization is using this, people will quickly move all conversations to another platform, even if policy forbids that, potentially resulting in an even greater security risk.
And as per usual, if Microsoft gets someone fired (e.g. comes in looking for money laundering, finds out the staff member is making fun of their boss), there will be no repercussions.
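To see why false positives dominate at this scale, here's a back-of-the-envelope sketch using Bayes' rule. All numbers (prevalence, sensitivity, false-positive rate) are illustrative assumptions, not figures from Microsoft:

```python
# Base-rate fallacy: even a seemingly accurate classifier drowns in
# false positives when the flagged behavior is rare among billions
# of messages. Every number here is a made-up illustration.

def flagged_precision(prevalence, sensitivity, false_positive_rate):
    """Probability that a flagged message is a true positive (Bayes' rule)."""
    true_pos = prevalence * sensitivity
    false_pos = (1 - prevalence) * false_positive_rate
    return true_pos / (true_pos + false_pos)

# Assume 1 in 10,000 messages is genuinely "non-compliant", and a
# classifier with 95% sensitivity and a 1% false-positive rate.
p = flagged_precision(prevalence=1e-4, sensitivity=0.95, false_positive_rate=0.01)
print(f"Share of flags that are real: {p:.2%}")  # ~0.94%: over 99% of flags are noise
```

Under these assumptions, compliance officers reviewing the flag queue would be reading innocent employees' messages more than 99% of the time.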
zmmmmm|3 years ago
If you accidentally fire 10% of good people, you still have 90% of them left; and if that lets you fire 80% of the staff who are committing thought-crime, it's probably a win.
jchw|3 years ago
Because of this, one might feel like the same standard applies to other one-on-one and small group communication avenues, but it’s actually completely the opposite.
lamontcg|3 years ago
Anyone using Teams is already a red flag.
duckmysick|3 years ago
smcleod|3 years ago
qw|3 years ago
throwaway0a5e|3 years ago
qbasic_forever|3 years ago
jll29|3 years ago
ma2rten|3 years ago
Angostura|3 years ago
HeckFeck|3 years ago
bitwize|3 years ago
jes|3 years ago
Echelon is one keyword I remember.
gautamdivgi|3 years ago
Why exactly is this newsworthy? Any communication through official channels is the property of the employer anyway. For collusion, leaving, and other such things, maybe use personal channels.
creatonez|3 years ago
Why is there always this attitude of "it's a private business, they can do what they want"? Why does the fact that they can do something detract from criticism of them doing it? The fact that this tech exists is horrifyingly dystopian on its own merits. But it also has widespread consequences in a country with so many employment monopolies and opportunities for outright wage slavery. Heavy-handed workplace surveillance and heuristics-based crap are becoming increasingly difficult to simply opt out of.
paulmd|3 years ago
Pretty clear one of the major things they're going for here is detecting "jobsite troublemakers", i.e. employees who are upset with job conditions, agitating for improvements, discussing salaries, etc., activity which is given specific legal protection. It is explicitly legal and protected for employees to discuss labor conditions, organizing, or salaries regardless of whether you do it "on company property" or "on company chat". Just because the company owns it doesn't mean you have no legal rights, just like a company can dismiss you for no reason but they can't dismiss you for any reason.
They are wrapping it up with "think of the children" justifications like "employees who are discussing salary might be considering leaving and they might take nefarious action if they do so" but that's the core of the situation here - these are tools to detect and fight against legally-protected activities by employees.
> Workplace collusion: The workplace collusion classifier detects messages referencing secretive actions such as concealing information or covering instances of a private conversation, interaction, or information.
> "The leavers classifier detects messages that explicitly express intent to leave the organization, which is an early signal that may put the organization at risk of malicious or inadvertent data exfiltration upon departure"
Hypothetically, do you think it would be a good idea for Microsoft to build a classifier and provide managers with a list of potential "religiously devout", eg based on correlated work/away periods, language patterns, etc? Sure, it's a legally protected classification, but there's an elevated risk of extremist activity, which surely presents a business risk, right? So why not?
Epa095|3 years ago
Is this sentence meant to be descriptive or normative? There are definitely jurisdictions where it is not that easy (e.g. the EU).
If it is meant normatively, then I wonder if you also think they "own" all conversations happening on corporate grounds. Should they be allowed to record anywhere on corporate property, and use what they record in any way?
bsedlm|3 years ago
outworlder|3 years ago
efitz|3 years ago
hyperhopper|3 years ago
I don't know why jumping to the most far-reaching evil option is the default in threads like this.
A4ET8a8uTh0|3 years ago
anm89|3 years ago
zmmmmm|3 years ago
danschumann|3 years ago
ornornor|3 years ago
21723|3 years ago
creatonez|3 years ago
> This kind of software is Zyklon B for the 21st century
is a bit of an over-the-top comparison
AbbeFaria|3 years ago
We have come a long way now that we have these advanced classifiers. You would be surprised how low-tech the initial product was; by low-tech I mean devoid of any ML/AI. We went GA at the end of 2019.
We also saw a lot of interesting use cases; e.g., Japanese enterprises wanting to detect suicide or suicidal intent, which is why we have multiple types of classifiers.
I worked on the infra side (not ML). That too was "low-tech", or the more apt term would be "not the latest tech". Core parts of the app were part of a monolith (think Exchange). We were also using a really old .NET Framework version for our MVC app. A lot of the storage technologies we used were very MS-specific as well. AFAIK, all of this is still valid today.
fartcannon|3 years ago
aaaaaaaaaaab|3 years ago
m-p-3|3 years ago
gigel82|3 years ago
EDIT: apparently these 2 are just jokes, sorry for not checking my sources!
> Negative emotions: Expressions of sadness, unhappiness, discontent, anger, rage, anguish, or existential ennui, as these may negatively affect team cohesion.
> Joy: Language suggesting hopefulness, optimism, anticipation of a brighter future, faith in humankind and/or in a loving and benevolent creator, as these may imply that the user is thinking about topics other than the best interests of the organization.
From https://old.reddit.com/r/sysadmin/comments/v3b2mn/microsoft_...
rl3|3 years ago
Is this still accurate? Are there any features in the pipeline planning to change this?
Microsoft offering "communications compliance" within the same product is chilling enough as it is. A reality where people lose their jobs over previously-protected casual [voice] chat doesn't seem so crazy now. All it takes is missing a quietly-introduced feature update by a week before the organization flips the switch and doesn't tell anyone.
bombcar|3 years ago
alliao|3 years ago
greyhair|3 years ago
I will say, however, that I don't use my personal phone to host any employer apps. It is my phone, not theirs. I pay the service fee.
So conversations I have on my phone are mine. My coworkers all operate the same way.
Arrath|3 years ago
cube00|3 years ago
[1]: https://www.microsoft.com/en-us/microsoft-viva/insights
[2]: https://www.dtexsystems.com
Havoc|3 years ago
Could someone head over to MS HQ and slap some sense into whoever thought blessing the world with this is a win?
alchemyromcom|3 years ago
BMc2020|3 years ago
How good the AI is depends on the flood of false positives the current system generates. If MS is true to form, getting anything useful will come at great expense.
The #1 thing they search for is notably missing from the list.
DeathArrow|3 years ago
As part of being at a company that was sold: when I wanted to interview with the new company, my future manager sent me his phone number and advised me not to use Teams for any sensitive conversation.
MauranKilom|3 years ago
What even is this site? It looks like grade A content rehashing from various MS sites...
cheinyeanlim|3 years ago
schmichael|3 years ago
userbinator|3 years ago
jandrusk|3 years ago
parasubvert|3 years ago
Unless you’re too valuable for them to care.
hsuduebc2|3 years ago
unknown|3 years ago
[deleted]
killjoywashere|3 years ago
lumost|3 years ago
HeckFeck|3 years ago
blinded|3 years ago
rkagerer|3 years ago
annoyingnoob|3 years ago
hatchoo|3 years ago
bombcar|3 years ago
rmbeard|3 years ago
creatonez|3 years ago
eyeareque|3 years ago
How should companies defend themselves from insider threats?
willismichael|3 years ago
the_optimist|3 years ago
leroman|3 years ago
But seriously, I always found it amusing that once you step into a corporation you can get food, drinks, and other amenities for free, almost like it's a socialist society. But when said employees step outside, they are first in line for the capitalist agenda.
jll29|3 years ago
karmasimida|3 years ago
amilich|3 years ago
jmyeet|3 years ago
The government won’t save you from efforts like this. The government represents the interests of the capital owning class.
The demonization of unions is one of the most successful cases of propaganda in the last century. It's gone so far that people will die on the hill of defending Jeff Bezos against paying slightly more taxes, because everyone seems to think they'll be Jeff Bezos one day.
daenz|3 years ago
I see that phrase thrown around a lot. It's a variant of "you're never going to be a billionaire (so you shouldn't be against X)." Why do people assume that you have to think you'll be a billionaire to be against something that would affect billionaires negatively? Is something only wrong if you think you'll find yourself in that position one day?
colechristensen|3 years ago
It is possible to see unions as both the source of some and solution to other forms of abuse.
nightski|3 years ago
jll29|3 years ago
_jal|3 years ago
Ever notice how unions are somehow all the same entity, and seem to have to answer for things completely different unions in completely different industries did?
Nobody treats corporations this way, even though (if you look at interlocking BoD membership) there's a more reasonable case to be made for collusion in some industries...
gruez|3 years ago
>The government won’t save you from efforts like this. The government represents the interests of the capital owning class.
You realize that the power and very existence of labor organization, i.e. unionization, depends on the government? Without government protection, labor unions don't stand a chance.
mc32|3 years ago
Like if a manager learns something and takes action because of it?
Or learning about employee behavior and sentiment and using that information to suppress promotions…
Or being informed of employee misbehavior and not taking action against it…
hereforphone|3 years ago
mrfusion|3 years ago
You just installed it locally off a disc and it just worked when you needed it. You didn’t even need internet.
javajosh|3 years ago
So I won't think about it.
progman32|3 years ago
Arrath|3 years ago
bee_rider|3 years ago
Office suites were a mistake. Return to text editor.
userbinator|3 years ago
Markoff|3 years ago
CPLX|3 years ago
SMAAART|3 years ago
wly_cdgr|3 years ago
speed_spread|3 years ago
humbleMouse|3 years ago
Closi|3 years ago
Mine doesn't. I know that because I am the 365 admin.
tpmx|3 years ago
faeriechangling|3 years ago
tyingq|3 years ago
draw_down|3 years ago
[deleted]
lucisferre|3 years ago
doctor_eval|3 years ago
paranoidrobot|3 years ago
Even if it doesn't work right, having it at all is going to result in all sorts of bullshit for employees where this is enabled.
Someone digging through your emails because you happened to mention some vaguely related keywords... yeah, no.
bootcat|3 years ago
rmbeard|3 years ago