fargle | 1 year ago
The contract between user and "maker" should be requirements. If a machine does not fulfill its requirements, that can be the maker's responsibility — for example, a flight instrument that failed to report correct information.
But if you choose to use a piece of software that says "we actually guarantee nothing", which is the vast majority of it, then it's definitely the user's responsibility for choosing to use such a tool.
kogus | 1 year ago
I'm sure there is a whole library full of legal precedent around who is liable, and when. My earlier point was really that "machines are not accountable" does not mean "machines cannot make decisions". It really means "we need to think about who is accountable when this machine makes a mistake."