“Computer says” is a pretty standard excuse for doing fucked up shit as it adds a complex form of indirection and obfuscation between the will of a human and the actual actions that result from that will.
Doesn’t work as an excuse with people who actually make the software that makes the computer “say” something (because for them the complexity of what is used is far lower, so they know what’s behind it and that the software is just an agent of somebody’s will), but it seems to work even with non-expert techies (technology fans), and more so with non-techies.
With AI, the people using the computer as an excuse have just doubled down on this, because in this case the software wasn’t even explicitly crafted to do what it does: it was trained (though in practice you can sorta guide it in one direction or another by choosing what you train it with). That further obscures the link between the output of the computer system and the will of the human who decided what it does (or at least decided which of the things it ended up doing after training are acceptable and which require changes to the training).
Considering that just about the entirety of the Justice System, Legislative System and Regulatory System is technically ignorant, using “computer says” as an excuse often results in profit-enhancing outcomes, incentivising “greed above all” people to use it to confuse, block or manipulate such systems.