Futurism: Social and Legal Rights of Robots
February 17, 2011
By R. A. Freitas
If we give rights to intelligent machines, either robots or computers, we’ll also have to hold them responsible for their own errors. Robots, by analogy to humans, must conform to a “reasonable computer” standard. Sentient computers and their software should be held to the standard of competence of all other data processing systems of the same technological generation. Thus, if all “sixth generation” computers ought to be smart enough to detect bogus input in some circumstances, then given that circumstance, a court will presume that a “sixth generation” computer knew or should have known the input data were bogus.
Exactly who or what would be the tortfeasor in these cases? Unlike a living being, whose mind and body are inseparable, a robot’s mind (software) and body (hardware) are severable and distinct. This is an important distinction. Robot rights most logically should reside in the mechanism’s software (the programs executing in the robot’s computer brain) rather than in its hardware.
This can get mighty complicated. Robots could be instantly reprogrammed, perhaps loading and running a new software applications package every hour. Consider a robot that commits a felony while running the aggressive “Personality A” program, but is running the mild-mannered “Personality M” when collared by the police. Is this a false arrest? Following conviction, are all existing copies of the criminal software package guilty too, and must they suffer the same punishment? (Guilt by association?) If not, is it double jeopardy to take another copy to trial? The robot itself could be released with its aggressive program excised from memory, but this may offend our sense of justice.
The bottom line is that it’s hard to apply human laws to robot persons. Let’s say a human shoots a robot, causing it to malfunction, lose power, and “die.” But the robot, once “murdered,” is rebuilt as good as new. If copies of its personality data are in safe storage, then the repaired machine’s mind can be reloaded and up and running in no time – no harm done and possibly even without memory of the incident. Does this convert murder into attempted murder? Temporary roboslaughter? Battery? Larceny of time? We’ll probably need a new class of felonies or “cruelty to robots” statutes to deal with this.
If robots are persons, will the Fifth Amendment protect them from self-incrimination? Under present law, a computer may be compelled to testify, even against itself, without benefit of the Fifth Amendment. Can a warrant be issued to search the mind of a legal person? If not, how can we hope to apprehend silicon-collar criminals in a world of electronic funds transfer and Quotron stock trading?
How should deviant robots be punished? Western penal systems assume that punishing the guilty body punishes the guilty mind – invalid for computers whose electromechanical body and software mind are separable. What is cruel and unusual punishment for a sentient robot? Does reprogramming a felonious computer person violate constitutional privacy or other rights?
Robots and software persons are entitled to protection of life and liberty. But does “life” imply the right of a program to execute, or merely to be stored? Denying execution would be like keeping a human in a permanent coma – which seems unconstitutional. Do software persons have a right to data they need in order to keep executing? Can robot citizens claim social benefits? Are unemployed robo-persons entitled to welfare? Medical care, including free tuneups at the government machine shop? Electricity stamps? Free education? Family and reproductive rights? Don’t laugh. A NASA technical study found that self-reproducing robots could be developed today in a 20-year Manhattan-Project-style effort costing less than $10 billion (NASA Conference Publication 2255, 1982).
In the far distant future, there may be a day when vociferous robo-lobbyists pressure Congress to fund more public memory banks, more national network microprocessors, more electronic repair centers, and other silicon-barrel projects. The machines may have enough votes to turn the rascals out, or even run for public office themselves. One wonders which political party the “robot bloc” will support, and which social class it will occupy.
In any case, the next time that Coke machine steals your quarter, better think twice before you kick it. Someday you may need a favor.