
Microsoft Purview: Additional classifiers for Communication Compliance (preview)

325 points | CPAhem | 3 years ago | pupuweb.com

307 comments


totetsu|3 years ago

I'm already not wanting to have personal conversations on Teams. My tech-savvy colleagues and the ones who can be convinced are on Signal, where we talk about job offers and relationships. A few others do Instagram, and get to see my art photography. And occasionally I'll bump into someone when we're both in the office and be able to say whatever without being looked over by AI. There's a real chilling effect on getting to know people.

DeathArrow|3 years ago

> And occasionally I'll bump into someone when we're both in the office and be able to say whatever without being looked over by AI.

At my present workplace, we have cameras with microphones. They have also installed spyware on laptops and desktops to be able to see the screens of employees. They also go through emails and keep a list of all web traffic done by employees.

Which is one of the reasons I handed in my resignation a few days ago.

dj_mc_merlin|3 years ago

Why would anyone have personal conversations on a platform which is linked to your work?

tgv|3 years ago

Is that a recent thing? Or only in the US? Age related? Size of the company? The level of personal communication I've witnessed over such tools is pretty superficial, casual. But perhaps that's just my age, location, or the fact that the last time I worked for a large company, Skype was still new.

fartcannon|3 years ago

Signal and WhatsApp aren't 100% trustworthy though. Why not pick something you can host yourself?

california2077|3 years ago

Your employer is not obliged to maintain a communication system so that you "get to know people". If you consider how much these tools cost to maintain, it's completely understandable that companies want to have 100% content control.

librarianscott|3 years ago

So, now we can't use Teams to have the "water-cooler" moments that supervisors claim we need, but really we are having them on Signal or iOS and they just can't measure that. Organizations really, really, really hate transparency.

throwaway0a5e|3 years ago

They don't want you shooting the shit on company communication mediums because that has limited upside and much less limited potential downside.

Remember the famous "will the atom bomb test ignite the atmosphere" gentleman's bet those scientists had? Nobody actually thought it would, but they discussed it semi-seriously. Today, discussing some fanciful bad outcome like that (be it the mundane failure to deliver a product or something more interesting) is a liability when it's sitting in your company email servers. Even if that bad thing isn't what winds up happening, or the people speculating aren't in a position to have accurate info, the other side's lawyer or the regulator will try to construe it as proof that the company should have known ahead of time.

Or, more likely, say there's some sexual harassment or adultery kerfuffle between employees. It's way better for the company if none of that happened on company provided communications tools.

From the company's perspective it's avoidable risk to have work communication tools be used for informal BSing between employees. But they can't realistically prevent that so they introduce Skynet in order to make people watch their mouths and move those sensitive conversations elsewhere.

indymike|3 years ago

Nah. People like to talk and not get fired for saying the wrong keyword.

parasubvert|3 years ago

This is nothing new: corporations have scanned instant messages, emails, and even recorded phone calls for decades, and will fire you based on that evidence for violations of corporate policy. And they will sue you or call the cops if they detect potential crimes.

I’m kind of surprised so many people are shocked by this. I know of one company where dozens of people were fired because their email was scanned for external job interviews and the CIO had a report, which he used to prematurely cut staff when he needed to save budget.

The only difference now is that the tech is smarter and cheaper so that you don’t need to pay as many people to spy on their coworkers.

Your defence against this is to find a job where you're too valuable for them to do anything, as in any jurisdiction with at-will employment.

karaterobot|3 years ago

> The only difference now is that the tech is smarter and cheaper so that you don’t need to pay as many people to spy on their coworkers.

Your comment implies this isn't potentially an enormous difference. But it's the difference between having to pay people to spy on their coworkers and having computers that do it passively, invisibly, continuously, in real time.

RcouF1uZ4gsC|3 years ago

> I’m kind of surprised so many people are shocked by this. I know of one company where dozens of people were fired because their email was scanned for external job interviews and the CIO had a report, which he used to prematurely cut staff when he needed to save budget.

On a related note, if you were a Microsoft employee, how comfortable would you be talking with recruiters on LinkedIn?

bombcar|3 years ago

Why do people persist in using work emails for personal things like job interviews?

XorNot|3 years ago

The difference here is that "find a new job" might be accomplished by LinkedIn...which is also owned by Microsoft.

So Microsoft's cloud ecosystem generally owns your work email, and the site you use to find a job.

Honestly: I don't care what they say (because it'll be "we datamined LinkedIn to add data to our 'employee leaving' filter, but don't worry, we did it with only the public APIs and just bypassed rate limiting, so technically...") - Microsoft and LinkedIn, specifically, need to be forcibly broken up given this sort of control over the full employee lifecycle.

peterfield|3 years ago

"The leavers classifier detects messages that explicitly express intent to leave the organization, which is an early signal that may put the organization at risk of malicious or inadvertent data exfiltration upon departure". In other words "how to promote and encourage paranoid behaviors from employers" :(

indrora|3 years ago

Have you seen high schools in the US?

Once I discovered that every school-issued machine had a VNC server running on it I assumed that the contents of my screen were being recorded at every moment. Turns out I was half right, as I caught up with the IT guy afterwards and the principal (a paranoid sociopath who shouldn't be anywhere near kids) wanted the ability to catch kids when she thought they were looking at non-school related things.

It's fundamental safety in a society with these sorts of companies to assume: company infra = logged until you die. Once your company has come under a subpoena for information or under some kind of long term discovery, you write emails under the assumption they're going to be in court for everyone and your mother to see.

regularfry|3 years ago

It's extremely revealing that this particular classifier is framed as "prevent data loss" not "intercept skills loss" or "figure out why your employees want to leave and then fix that".

narrator|3 years ago

It seems like maybe not the intent, but isn't the practical result to use the private sector to implement CCP-like social credit scores? By doing everything in the private sector they get around all those pesky constitutional protections.

bsedlm|3 years ago

so the rights and freedoms are only protecting citizens from government oppression... if a private company does it, then it's fine cuz corporations are also free people.

they're free people who somehow are getting to oppress and censor individual humans (otherwise the corporation is who is being oppressed), but let's pretend that we can punish them by "taking our dollars elsewhere" such that it's our own fault

IMO, tracing this towards the root, I find along the way the grand system of royalties and other kinds of rent schemes. Nobody cares cuz we prefer the promise (for the majority is a promise) that we can come up with something great to make it BIG and then get to live from rent or other kinds of royalty payments

california2077|3 years ago

Most large companies had "social credit scores" for decades. They're called performance reviews. Nothing new here. You just had unreasonable and naïve expectations. You now know MS Teams is monitored. You are free to seek employment in companies that don't use MS Teams if you dislike this so much.

dang|3 years ago

Submitted title was "Office 365 implementing AI to detect employees colluding, leaving and more". That broke the site guidelines: "Please use the original title, unless it is misleading or linkbait; don't editorialize." - https://news.ycombinator.com/newsguidelines.html

The proper place to include that sort of interpretation is by adding it in a comment in the thread. Then your interpretation is on a level playing field with everyone else's (https://hn.algolia.com/?dateRange=all&page=0&prefix=false&so...). Also, a comment gives you room to actually substantiate your interpretation.

On the other hand, a thread like this probably wouldn't have gotten attention without the sensational title in the first place, so this kind of submission is a borderline case and at worst a venial sin. (We still change the title once it does make the frontpage though.)

davesque|3 years ago

I know it's policy to use original titles, but the editorialization in this case hardly seems sensational. Just look at the linked roadmap tickets:

* https://www.microsoft.com/en-my/microsoft-365/roadmap?filter...

* https://www.microsoft.com/en-my/microsoft-365/roadmap?filter...

* https://www.microsoft.com/en-my/microsoft-365/roadmap?filter...

* https://www.microsoft.com/en-my/microsoft-365/roadmap?filter...

* https://www.microsoft.com/en-my/microsoft-365/roadmap?filter...

* https://www.microsoft.com/en-my/microsoft-365/roadmap?filter...

* https://www.microsoft.com/en-my/microsoft-365/roadmap?filter...

The title "Microsoft Purview: Additional classifiers for Communication Compliance (preview)" sounds like nothing at all. It doesn't seem like an exaggeration to say that the reality is literally Big Brother in a corporate context. It seems like changing the title is just going to reduce the attention given to something that really needs to be exposed in clear terms.

CPAhem|3 years ago

OK, I will do so in future.

I tried to summarize the article in the title. I will follow the guidelines from now on.

jtbayly|3 years ago

Thanks for the clarification about this sort of catch 22. I had been wondering about this exact scenario that seems to happen fairly regularly.

behnamoh|3 years ago

[deleted]

4oo4|3 years ago

I think if you have an E5 license there is already thoughtcrime functionality built-in. I remember someone demoing this to me in a Teams user group, and no one seemed to think it was creepy at all. In addition to flagging keywords it also used AI to detect undesirable thoughts and emotions, under the guise of anti-harassment and compliance. Unfortunately I can't remember the name of the feature but I think it might be this:

https://docs.microsoft.com/en-us/microsoft-365/compliance/co...

So I think if Microsoft existed in the world of 1984, they would easily be the preferred tech vendor for IngSoc.

Side note, do you think this would also detect the money laundering and bribery going on within Microsoft itself?

https://www.theverge.com/2022/3/25/22995144/microsoft-foreig...

Side-side note: I think the reason that's still allowed to keep going on, given that the SEC knows about it and there's ample evidence, has to do with national security.

It's extremely troubling that given all this corporate authoritarian AI tech they built that Microsoft is still trying to be the voice of reason about the dangers of AI.

codemonkey-zeta|3 years ago

> It's extremely troubling that given all this corporate authoritarian AI tech they built that Microsoft is still trying to be the voice of reason about the dangers of AI.

Just speculating, but this phenomenon could be explained by either 1.) diverse internal opinion; the parts of Microsoft responsible for warning against AI are not the same parts pushing authoritarian AI software, or 2.) moat-building/ladder-pulling; Microsoft is warning people of the danger of _other people's_ AI, but of course you can trust _their_ AI, because they're the ones warning you after all!

parasubvert|3 years ago

Emails aren’t thoughtcrimes. This is nonsense.

Everything in corporate email has always been subject to being read by others; there is no expectation of privacy.

As we’ve seen from countless court cases, they range from boring nothingburgers, to evidence of actual crimes.

bearjaws|3 years ago

There is no way they will be able to make an AI at this point that will

A) Be accurate

B) Work across multiple contexts

C) Run efficiently on billions of messages

This will just result in many false positives and unnecessary eavesdropping on employees' personal conversations.

Once it's revealed an organization is using this, people will quickly move all conversations to another platform, even if policy forbids that, potentially resulting in an even greater security risk.

And as per usual, if Microsoft gets someone fired (e.g. comes in looking for money laundering, finds out the staff member is making fun of their boss), there will be no repercussions.

zmmmmm|3 years ago

Accuracy isn't a strict requirement though.

If you accidentally fire 10% of good people you still have 90% of them left, and if that lets you fire 80% of the staff that are committing thought-crime, it's probably a win.

jchw|3 years ago

Part of what makes stuff like this surprising is expectations of privacy. For example, if you start a video chat on Hangouts or Zoom, even (or maybe especially) on a work account, you don't expect that meeting to be recorded or analyzed surreptitiously. I think in many places it would be illegal.

Because of this, one might feel like the same standard applies to other one-on-one and small group communication avenues, but it’s actually completely the opposite.

lamontcg|3 years ago

Reinforces that during interviews candidates should be determining what the company uses for internal communications and choose accordingly.

Anyone using Teams is already a red flag.

duckmysick|3 years ago

What's a good alternative? I feel like it's a matter of time before a similar feature is added to Slack and co.

smcleod|3 years ago

Absolutely agree with this.

qw|3 years ago

I honestly first thought this article was satire. It is so unreal to find myself in a world where this is acceptable. What's next? Installing cameras in restrooms to catch offline conversations?

throwaway0a5e|3 years ago

I have zero confidence that this system is smart enough to differentiate between all these things and the legitimate variants thereof (e.g. collusion and cross-team collaboration are basically indistinguishable) that companies actually want people doing or discussing, and which likely outnumber the bad by orders of magnitude.
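To make the indistinguishability point concrete, here is a toy Python sketch. The phrase list and messages are invented for illustration, and this is nothing like Microsoft's actual model; it just shows why surface features alone can't separate secrecy from legitimate confidentiality:

```python
# Toy phrase-matching "collusion" detector. The phrases below are invented
# for illustration; real classifiers are ML models, not substring checks.
COLLUSION_PHRASES = [
    "keep this between us",
    "don't forward this",
    "off the record",
]

def flag_collusion(message: str) -> bool:
    """Flag a message if it contains any 'secretive' phrase."""
    text = message.lower()
    return any(phrase in text for phrase in COLLUSION_PHRASES)

# A genuinely suspicious message and a routine confidential one are
# indistinguishable at this level of analysis:
suspicious = "Keep this between us until the auditors are gone."
routine = "Keep this between us: the reorg isn't announced yet, per HR."

assert flag_collusion(suspicious)   # intended catch
assert flag_collusion(routine)      # false positive on legitimate work
assert not flag_collusion("See you at standup tomorrow.")
```

Any phrase list broad enough to catch real secrecy also catches the legitimate confidentiality that pervades normal work, which is the point above.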

qbasic_forever|3 years ago

Yeah and the sad thing is companies that don't know better will flip it on because "wow look at smart Microsoft's latest feature, we better use this!" and then inadvertently fire Sally in HR because her asking people to sign a card for the VP's birthday looked suspiciously like a violation of corporate gift policies.

jll29|3 years ago

Technology like this will sometimes work and many times not, and the false positives and false negatives will cause a lot of harm along the way.

ma2rten|3 years ago

NLP has improved a lot in the last 5 years. I believe that this is now technically possible with the right training data.

Angostura|3 years ago

Remember the good old days of Usenet, with signatures that deliberately contained keywords to try to DDoS the NSA's "line eater"?

HeckFeck|3 years ago

You can still find them in the email signature of any communication from one Richard M. Stallman!

jes|3 years ago

Yep.

Echelon is one keyword I remember.

gautamdivgi|3 years ago

This seems to be Office 365 implementing monitoring of employees' and contractors' official communications on the work account? I don't think it extends to a personal Office 365 account; at least it didn't seem to.

Why is this exactly newsworthy? Any communication through official channels is the property of the employer anyway. To collude, leave, and do other stuff, maybe use personal channels.

creatonez|3 years ago

> Any communication through official channels is the property of the employer anyway.

Why is there always this attitude of "it's a private business, they can do what they want". Why does the fact that they can do something distract from criticism of them doing it? The fact that this tech exists is horrifyingly dystopian on its own merits. But it also has widespread consequences in a country with so many employment monopolies and opportunities for outright wage slavery. Heavy-handed workplace surveillance and heuristics-based crap are becoming increasingly difficult to simply opt-out of.

paulmd|3 years ago

> Why is this exactly newsworthy? Any communication through official channels is the property of the employer anyway.

Pretty clear one of the major things they're going for here is detecting "jobsite troublemakers", ie employees who are upset with job conditions/agitating for improvements/discussing salaries/etc, which is given specific legal protection. It is explicitly legal and protected for employees to discuss labor conditions, organizing, or salaries regardless of whether you do it "on company property" or "on company chat". Just because the company owns it doesn't mean you have no legal rights - just like a company can dismiss you for no reason but they can't dismiss you for any reason.

They are wrapping it up with "think of the children" justifications like "employees who are discussing salary might be considering leaving and they might take nefarious action if they do so" but that's the core of the situation here - these are tools to detect and fight against legally-protected activities by employees.

> Workplace collusion: The workplace collusion classifier detects messages referencing secretive actions such as concealing information or covering instances of a private conversation, interaction, or information.

> "The leavers classifier detects messages that explicitly express intent to leave the organization, which is an early signal that may put the organization at risk of malicious or inadvertent data exfiltration upon departure"

Hypothetically, do you think it would be a good idea for Microsoft to build a classifier and provide managers with a list of potentially "religiously devout" employees, e.g. based on correlated work/away periods, language patterns, etc.? Sure, it's a legally protected classification, but there's an elevated risk of extremist activity, which surely presents a business risk, right? So why not?

Epa095|3 years ago

> Why is this exactly newsworthy? Any communication through official channels is the property of the employer anyway.

Is this sentence meant to be descriptive or normative? Because there are definitely jurisdictions where it is not that easy (e.g. the EU).

If it is meant to be normative, then I wonder if you also think they "own" all conversations happening on corporate grounds? Should they be allowed to record anywhere on corporate property, and use what they record in any way?

bsedlm|3 years ago

If the entirety of your being is the property of your employer during "clock time", that doesn't seem like employment to me.

outworlder|3 years ago

Except it's an AI making decisions and flagging people who may then be the recipient of adverse actions.

efitz|3 years ago

There are engineers who thought it would be a good idea to develop and train these models.

hyperhopper|3 years ago

Are there? I think it's much more likely there are engineers that were told to work on this, and thought that working on this would be fun, let them have an ML project, be a learning opportunity, give them something good for their performance packet, give them good experience, be good for visibility in the org, and many things like that.

I don't know why jumping to the most far-reaching evil option is the default in threads like this.

A4ET8a8uTh0|3 years ago

I am almost surprised it took this long to get to this point, but I suppose the recent resignation wave made it into a viable product offering. My last MBA class was an HR analytics class that, among other things, dealt with email sentiment analysis and the like. Part of me was thinking the average HR person won't touch this stuff, but if a company just happened to offer something that would do it for them..

anm89|3 years ago

I've always had a preference against working with Microsoft products, but this is getting to the point where I'd find a new gig instead of being subjected to this stuff.

zmmmmm|3 years ago

I think at least some of my staff will likely resign because mandatory deep inspection / network monitoring is being forced onto everyone's computers by the IT department. It's probably the only way to stop it from happening at the moment. Unfortunately the buzzword of "zero trust" has been bent towards meaning "spy on everything your employees do".

danschumann|3 years ago

What is the effect on creative expression and sociability, between co-workers, when they know they're being analyzed by a computer to figure out if they should be fired?

21723|3 years ago

This is absolutely going to be used against unionizers, which is what's really meant by "colluding". In the US this is going to get a lot of people fired. In other parts of the world, it's going to get them killed. This kind of software is Zyklon B for the 21st century.

creatonez|3 years ago

Sure, surveillance capitalism is pretty horrifying, but

> This kind of software is Zyklon B for the 21st century

is a bit of an over-the-top comparison

AbbeFaria|3 years ago

Fun fact: I used to work in this team.

We have come a long way now that we have these advanced classifiers. You would be surprised how low-tech the initial product was; by low-tech I mean devoid of any ML/AI. We went GA at the end of 2019.

Saw a lot of interesting use cases too, e.g. Japanese enterprises wanting to detect cases like suicide or suicidal intent; that is why we have multiple types of classifiers.

I worked on the infra side (not ML). That too was "low-tech", or the more apt term would be "not the latest tech". Core parts of the app were part of a monolith (think Exchange). We were also using a really old .NET Framework version for our MVC app. A lot of the storage technologies we used were very MS-specific as well. AFAIK, all of this is still valid today.

fartcannon|3 years ago

How, for the love of god, do you defend Microsoft after this?

aaaaaaaaaaab|3 years ago

B-but they write VS Code! They were supposed to have changed!

m-p-3|3 years ago

Next step will be to detect potential attempt at unionizing.

gigel82|3 years ago

They have "workplace collusion" as a category, and even more dystopian shit, like:

EDIT: apparently these 2 are just jokes, sorry for not checking my sources!

`Negative emotions: Expressions of sadness, unhappiness, discontent, anger, rage, anguish, or existential ennui, as these may negatively affect team cohesion.

Joy: Language suggesting hopefulness, optimism, anticipation of a brighter future, faith in humankind and/or in a loving and benevolent creator, as these may imply that the user is thinking about topics other than the best interests of the organization.`

From https://old.reddit.com/r/sysadmin/comments/v3b2mn/microsoft_...

rl3|3 years ago

Seems to only apply to messages, for now. My understanding is that unless a call on Teams is explicitly recorded, there's no capability for the organization to monitor the content within.

Is this still accurate? Are there any features in the pipeline planning to change this?

Microsoft offering "communications compliance" within the same product is certainly chilling enough as it is. The reality where people lose their job as a result of previously-protected casual [voice] chat doesn't seem so crazy now. All it takes is missing a quietly-introduced feature update by a week before the organization flips the switch and doesn't tell anyone.

bombcar|3 years ago

I’m sure automatic transcripts of all calls is just a few developments away - it’d be desirable for call centers and I suspect it’ll be available to all eventually.

alliao|3 years ago

Given how well voice recognition works nowadays, it's really a matter of implementation cost rather than if.

greyhair|3 years ago

That link was not happy about my pihole swallowing their ad links, so I could not read it.

I will say, however, that I don't use my personal phone to host any employer apps. It is my phone, not theirs. I pay the service fee.

So conversations I have on my phone are mine. My coworkers all operate the same way.

Arrath|3 years ago

Sounds like it's time to set up a scheduled batch file that sends a bunch of messages around that would trigger watchdogs like this, as well as the NSA PRISM keywords just for funsies.

Havoc|3 years ago

Oh great - AI thought police to make the corporate existence even bleaker.

Could someone head over to MS HQ and slap some sense into whoever thought blessing the world with this is a win?

alchemyromcom|3 years ago

I hope the people implementing all these policies and technologies are seriously weighing the consequences of their actions. I suspect that they are not.

BMc2020|3 years ago

MS Office Home > Admin > Exchange Admin Center > Mail Flow > Rules > Click the plus sign for New Rule > Create New Rule > Apply this rule if > Subject or Body includes > Specify a word or phrase

How good the AI is depends on the flood of false positives the current system generates. If MS is true to form, getting anything useful comes at great expense.

The #1 thing they search for is notably missing from the list.
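For a sense of what the Exchange rule path above amounts to, here is a minimal Python sketch of the same "Subject or Body includes a word or phrase" check. The watch-word list and messages are hypothetical, invented for illustration:

```python
# Minimal sketch of an Exchange-style "Subject or Body includes" rule.
# The watch-word list is hypothetical, not from any real deployment.
WATCH_WORDS = ["resign", "offer letter", "recruiter"]

def matches_rule(subject: str, body: str, words=WATCH_WORDS) -> list[str]:
    """Return the watch words found in the subject or body, case-insensitively."""
    haystack = f"{subject} {body}".lower()
    return [w for w in words if w in haystack]

# True positive:
assert matches_rule("Heads up", "I plan to resign next month.") == ["resign"]

# False positive: a message *about* recruiters, not a job interview.
assert matches_rule("Policy reminder",
                    "Forward any recruiter spam to IT.") == ["recruiter"]
```

Crude substring rules like this are exactly where the flood of false positives comes from; smarter classifiers can reduce that flood, but at the expense mentioned above.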

DeathArrow|3 years ago

At one of my previous employers, they sold part of the company to an outsourcing enterprise, including employees, and founded a new company to move the remaining employees to.

As part of the company being sold, when I wanted to interview at the new company, my future-to-be manager sent me his phone number and advised me not to use Teams for any sensitive conversation.

MauranKilom|3 years ago

Why is everything duplicated in this announcement? The list of classifier descriptions effectively appears twice, the first time with the text of the "What you need to do to prepare" section (which, btw, says exactly nothing about how to prepare) appended to each item.

What even is this site? It looks like grade A content rehashing from various MS sites...

cheinyeanlim|3 years ago

There are good and bad sides to these features. The only rule of thumb: never conduct non-work-related matters in a workplace or facility provided by the company, no matter how good your performance at the company or how good your relationship with your superior.

schmichael|3 years ago

> Microsoft Purview public preview provides policies protecting products, probably punting people’s privacy.

jandrusk|3 years ago

Employees will just start colluding using external tools once the first few suckers get fired.

parasubvert|3 years ago

You’ll then just get fired for using the external tools.

Unless you’re too valuable for them to care.

hsuduebc2|3 years ago

This is pretty dystopian.

killjoywashere|3 years ago

Someone is going to have to square this with executive-branch records law. You can't have a democracy and IngSoc. That's not how either of them works.

lumost|3 years ago

Hmm, I think I have a new reason not to use any Microsoft products in the office. I can even claim an ethics issue with interacting with them now. Unfortunately, the existence of this feature breaks trust that my management hasn't abused it; the only way to avoid this is by not engaging with Microsoft offerings such as Word or Excel in the office.

HeckFeck|3 years ago

We're slowly closing the gap with China, aren't we? What will happen when this technology is applied more widely?

blinded|3 years ago

Safe to say anything happening on a work computer or in work software is looked at, or can be looked at, by someone in the company.

rkagerer|3 years ago

What ever happened to hiring a good team of trustworthy employees and setting out to achieve a mission together?

annoyingnoob|3 years ago

Wish I could read the page but apparently my ad blocker is too offensive. Well I'd be fine with supporting the publisher through online ads but I am really not okay with the tracking those advertisers do. You ditch the tracking and any annoying ads and I'll ditch the ad blocker. Until then, we have to agree to disagree, the Faustian bargain of internet advertising is untenable.

hatchoo|3 years ago

Only when conversations are in English right? I can still use the local language? :D

bombcar|3 years ago

The transcripts are available in more and more languages, so perhaps it's time to learn Navajo.

eyeareque|3 years ago

I get the 1984 vibe this has.

How should companies defend themselves from insider threats?

willismichael|3 years ago

This is no way to defend against insider threats. Any real threats will use other means of communication. Meanwhile, this just treats everybody as if they can't be trusted.

the_optimist|3 years ago

This is the whole point of culture and society. Mass surveillance didn’t/doesn’t work for the NSA/CIA and it sure isn’t going to work for corporate paymasters either.

leroman|3 years ago

Employees get what they need and give what they can..

But seriously, I always found it amusing that once you step into a corporation you can get food, drinks, and other amenities for free, almost like it's a socialist society. But when said employees step outside, they are the first in line for the capitalist agenda.

jll29|3 years ago

One word: outrageous!

jmyeet|3 years ago

The only way this sort of thing changes is with labor organization ie unionization.

The government won’t save you from efforts like this. The government represents the interests of the capital owning class.

The demonization of unions is one of the most successful cases of propaganda in the last century. It's gone so far that people will die on the hill of opposing Jeff Bezos paying slightly more taxes, because everyone seems to think they'll be Jeff Bezos one day.

daenz|3 years ago

>think they’ll be Jeff Bezos one day.

I see that phrase thrown around a lot. It's a variant of "you're never going to be a billionaire (so you shouldn't be against X)." Why do people assume that you have to think you'll be a billionaire to be against something that would affect billionaires negatively? Is something only wrong if you think you'll find yourself in that position one day?

colechristensen|3 years ago

>The demonization of unions is one of the most successful cases of propaganda in the last century.

It is possible to see unions as both the source of some and solution to other forms of abuse.

nightski|3 years ago

Already saved from this sort of thing by just being self employed for the past 10 years. I get paid for every hour of work with no spying and just a weekly status update with my clients. It's a simple relationship and I am honestly not sure I could ever go back.

jll29|3 years ago

Emacs would never spy on you like that, would it?

_jal|3 years ago

> The demonization of unions is one of the most successful cases of propaganda in the last century

Ever notice how unions are somehow all the same entity, and seem to have to answer for things completely different unions in completely different industries did?

Nobody treats corporations this way, even though (if you look at interlocking BoD membership) there's a more reasonable case to be made for collusion in some industries...

gruez|3 years ago

>The only way this sort of thing changes is with labor organization ie unionization.

>The government won’t save you from efforts like this. The government represents the interests of the capital owning class.

You realize that the power/existence of "labor organization ie unionization" is dependent on the government? Without government protection labor unions don't stand a chance.

mc32|3 years ago

Doesn’t some of this stuff add some legal liability to organizations?

Like if a manager learns something and takes action because of it?

Or learning about employee behavior and sentiment and using that information to suppress promotions…

Or being informed of employee misbehavior and not taking action against it…

hereforphone|3 years ago

I shouldn't have to belong to a particular organization (especially a politically active one) in order to have a job in my field.

mrfusion|3 years ago

Anyone else miss Office 97?

You just installed it locally off a disc and it just worked when you needed it. You didn’t even need internet.

javajosh|3 years ago

Office 97 and Windows XP were something of a high point in personal computing. The internet has enabled entirely new product categories, but it has also badly eroded old ones with the solvent of MRR greed. Merely selling a thing just ain't good enough, especially if it's software. Even offline applications are SaaS now, where the "service" is frequent updates that leave you at the perennial mercy of every company from which you purchase software (and every company with which they do business, recursively). I'm normally pretty sanguine about business models, but when I lay it out like this, I find it quite disturbing.

So I won't think about it.

progman32|3 years ago

2 years ago I decided to fire up my Pentium 100 and write a technical plan on it using Office 2000, for my real life corporate job. It worked magnificently, no fuss, plan presentation went fine. Faster on a 100 MHz machine with 16 MB of RAM than whatever monstrosity underlies O365 and Google Docs.

Arrath|3 years ago

I'll happily use an old Office, just please splice in the "What do you want to do?" search bar so I don't have to hunt through nested menus/ribbons for some obscure formatting option I use once every 6 months.

bee_rider|3 years ago

Not really.

Office suites were a mistake. Return to text editor.

userbinator|3 years ago

Office 2003 was probably the peak, and then it started going downhill with 2007.

Markoff|3 years ago

You can still block its internet access in the firewall and activate it locally. I'm using the last offline-installable Office, 2019, so 365 can't touch my computer.
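One way to do what Markoff describes on Windows is per-executable outbound block rules in Windows Defender Firewall. This is only a sketch: the install path below is an assumption for a typical Office 2019 install and should be checked on your own machine.

```shell
:: Block outbound internet access for Office binaries (run as Administrator).
:: The program paths are assumptions; verify them on your install.
netsh advfirewall firewall add rule name="Block Word outbound" dir=out action=block program="C:\Program Files\Microsoft Office\root\Office16\WINWORD.EXE"
netsh advfirewall firewall add rule name="Block Excel outbound" dir=out action=block program="C:\Program Files\Microsoft Office\root\Office16\EXCEL.EXE"

:: List the rules to confirm they were created
netsh advfirewall firewall show rule name="Block Word outbound"
```

Activation still needs to happen once (by phone, or with the rules temporarily disabled); after that the apps run fully offline.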

CPLX|3 years ago

I just want a simple version of MS Access in the cloud.

SMAAART|3 years ago

What year is this? 1984?

wly_cdgr|3 years ago

1984 got nothing on 2022

speed_spread|3 years ago

"Just tell me what year you want me to believe we're in"

humbleMouse|3 years ago

If you've ever thought your employer isn't monitoring the chat, then you're a fool. I'd go as far as to say that if you think there is any form of electronic communication that isn't being monitored on some level, you're also being foolish.

Closi|3 years ago

> If you've ever thought your employer isn't monitoring the chat then you're a fool.

Mine doesn't. I know that because I am the 365 admin.

tpmx|3 years ago

One approach is to run/work for small companies with adult smart people who trust each other without surveillance.

faeriechangling|3 years ago

There's a difference between monitoring and logging. In many workplaces nobody is reading the chat logs or even paying attention to chat metrics, because the value of doing so is dubious given the potential for employee backlash.

tyingq|3 years ago

It's probably more common that they log it, and trawl it when there's some reason to. Still dystopian, but less work.

lucisferre|3 years ago

It's ok it's from Microsoft. Nothing in Office 365 works, this won't either.

doctor_eval|3 years ago

Reminds me of that old joke: “the first product Microsoft makes that doesn’t suck will be a vacuum cleaner”

paranoidrobot|3 years ago

I'm sure this is intended as a joke.

Even if it doesn't work right - having it at all is going to result in all sorts of bullshit for employees where this is enabled.

Someone digging through your emails because you happened to mention some vaguely related keywords... yeah, no.

bootcat|3 years ago

Google Docs seems way better, for now.