Flatterer3544|1 month ago
You would not be privy to their internal processes, and thus would not be able to prove wrongdoing. You would just have to hope for a new Snowden, and that the wrongdoing uncovered would actually be punished this time.
bko|1 month ago
For instance, if your job is to be on your feet all day and you can barely stand, then that job is not for you. I have never met employers so flush with candidates that they just randomly choose to exclude certain people.
And if it's insurance, there's a group rate. The only variables are which of your selected plans the employee chooses (why make a plan available if you don't want people to pick it?) and family size. It's illegal to discriminate based on family size, and that can add up to 10k extra on the employer side. But there are downsides to hiring young single people too, so things may balance out.
zopa|1 month ago
So less "the job requires you to stand all day," and more "once a week or so they ask you to make a binder of materials, and the hole puncher they want you to use dislocates your hands" (true story). Or it's a desk job, but you can't get from your desk to the bathroom in your wheelchair unless they widen the aisles between desks (hypothetical).
jjmarr|1 month ago
Read your policy!
rafterydj|1 month ago
If we circumvent those privacy laws, through user licenses or new technology, we remove the protections of ordinary citizens. The bad behavior we already decided as a society to ban can then be perpetrated again, perhaps under a fresh new word to dodge said old laws.
If I understand your comment, you are essentially wondering why those old laws existed in the first place. I would suggest that racism and other systemic issues, plus differences in insurance premiums, are more than enough to justify the existence of privacy laws. Take a normal office job as an example, as opposed to a manual-labor-intensive one: there is no reason at all that health conditions should impact it. The idea of not being hired because I have a young child, or a health condition that would raise the group rate, with the insurer passing the cost on to my employer (who would then have every interest in screening me out), is a terrible thought. And it happened before, and we banned the practice (or did our best to).
All this to say, I believe HIPAA helps people, and if ChatGPT is being used to partially or fully facilitate medical decision making, they should be bound under strict laws preventing the release of that data regardless of their existing user agreements.
pseudalopex|1 month ago
Insurers derive rates for each employer from each employer's costs where laws allow this. And many employers self fund medical insurance.
well_ackshually|1 month ago
Also, in some cases they absolutely do. Try to get hired at Palantir and see how much they know about your browsing history. Anything related to national security or requiring a clearance gets you investigated.
Aurornis|1 month ago
Anyone who has worked in hiring for any big company knows how much goes into ensuring hiring processes don't accidentally touch anything that could be construed as illegal discrimination. Employees are trained, policies and procedures are documented, and anyone who even accidentally says or does anything that comes too close to possibly running afoul of hiring laws will find themselves involved with HR.
The idea that these same companies also have a group of people buying private search information or ChatGPT conversations for individual applicants from somewhere (which nobody can link to) and then secretly making hiring decisions based on what they find is silly.
The arguments come with the usual array of conspiracy-theory defenses, like "how can you prove it's not happening?" or claims that the practice is well documented, though nobody can link to that documentation.
Aurornis|1 month ago
Under this conspiracy theory the data would have to be available for sale somewhere, right? Yet no journalist has ever picked up the story? Nobody has ever blown the whistle on their company buying Google searches and denying applicants for searching for naughty words?
Aurornis|1 month ago
The continued secrecy of the conspiracy would then depend on every person involved in orchestrating this privacy violation and illegal hiring scheme keeping it secret forever: nobody ever leaking it to the press, no disgruntled employees e-mailing their representatives, no concerned citizens slipping a screenshot to journalists, both during and after their employment with the company.
To make this profitable at all, the data would have to be secretly sold to a lot of companies for this purpose, and continuously updated to stay relevant: giant databases of your secret ChatGPT queries sold continuously in volume, with all employees at the sellers, the buyers, and the users of this information keeping it perfectly quiet, never leaking anything.
purrcat259|1 month ago
GDPR request. Ah wait, regulation bad.