The second of Asimov’s laws of robotics says that a robot must obey any orders given to it by human beings, unless those orders would endanger humans. Fine in theory. But what happens when those instructions are corrupted? Lewis Page at The Register reports on the “software error” that sent a military robot helicopter straying