The second of Asimov’s laws of robotics says that a robot must obey orders given to it by human beings, unless those orders would endanger humans. Fine in theory. But what happens when those instructions are corrupted? Lewis Page at The Register reports on the “software error” that sent a military robot helicopter straying into restricted airspace near Washington DC.
“Robot planes and choppers lacking instructions from their human masters will normally circle where they are when comms go down, and control is almost always restored shortly thereafter – as in fact happened with the rogue Fire Scout. The difference here is that the MQ-8 failed to follow its built-in failure protocol, instead continuing on course.”
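The lost-link behaviour the quote describes boils down to a simple failsafe rule: if the comms link drops, stop following the flight plan and loiter in place until control is restored. A minimal sketch of that rule, purely illustrative (the names and structure are assumptions, not the MQ-8’s actual flight software):

```python
from enum import Enum, auto

class Mode(Enum):
    ON_COURSE = auto()   # following the planned route
    LOITER = auto()      # circling in place, awaiting reconnection

def next_mode(comms_up: bool) -> Mode:
    """Lost-link failsafe: loiter whenever the comms link is down,
    resume the planned course once it is restored."""
    if not comms_up:
        return Mode.LOITER
    return Mode.ON_COURSE
```

On this reading, the Fire Scout incident amounts to the failsafe branch never firing: the aircraft kept returning `ON_COURSE` even though its link was down.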