It seems reasonable for a company to be fairly restrictive about installing some new AI tool on dev machines:
* they don’t know who’s behind it, or what it will output
* they don’t know the business model. Will it be used to exfiltrate code from the company, i.e. train on the company’s codebase? Or on other text files you open?
I’m not at all saying I think Cursor is doing that. Training on customer data would be a completely unethical business practice, bordering on malware, and the companies whose names get bounced around here usually aren’t that bad. But the hypothetical host company doesn’t know anything about them, so it is prudent to require some vetting.
I don't even use Copilot (yet) because the process of reviewing it through all the different gatekeepers (security, legal, budget) is in its 24th month or something. I think many people in traditional industries can relate.
osigurdson|1 year ago
bee_rider|1 year ago
alkonaut|1 year ago
pc86|1 year ago
Sometimes it causes issues, but 99% of the time it's fine.
yunohn|1 year ago
zxcvbnm69|1 year ago
[deleted]