Robotics Hardware

Legal Rights for Computers 550

nicholast writes "There's a really smart story in the current issue of Legal Affairs Magazine about granting legal recognition to computers: when that might happen, why it could happen, and what a discussion about it will teach humans about themselves."
This discussion has been archived. No new comments can be posted.

  • by Anonymous Coward on Sunday December 19, 2004 @04:14PM (#11132350)
    "It sits there looking at me, and I don't know what it is. This case has dealt with metaphysics, with questions best left to saints and philosophers. I am neither competent nor qualified to answer those. I've got to make a ruling, to try to speak to the future. Is Data a machine? Yes. Is he the property of Starfleet? No. We have all been dancing around the basic issue. Does Data have a soul? I don't know that he has. I don't know that I have! But I have got to give him the freedom to explore that question himself. It is the ruling of this court that Lieutenant Commander Data has the freedom to choose."


    -- Captain Phillipa Louvois


    I believe this was already settled in the case of Maddox vs. Data on stardate 42523.7. The case determined that Lt. Commander Data, an artificial lifeform constructed by Dr. Noonian Soong, was not the property of Starfleet, but rather a sentient being with the full legal rights afforded any other.
    • by khasim ( 1285 ) <brandioch.conner@gmail.com> on Sunday December 19, 2004 @04:54PM (#11132631)
      That episode needed to be completely re-written.

      Data already had the rank of Lt. Commander. That means that Star Fleet already recognized his ability to make decisions on his own.

      Therefore, his decision to NOT be disassembled would not be challenged.

      In order for the case to make sense (I know, it's Star Trek) then the robot would have to not have any prior recognition of its independence or decision making.

      Star Fleet recognized Data sufficiently to give him a rank that allows him to order humans to risk their lives (do the 3 laws apply in Star Trek?).
      • Data already had the rank of Lt. Commander. That means that Star Fleet already recognized his ability to make decisions on his own.

        Maybe Star Fleet gave him that rank because he thought it would look good on his business card?
      • That episode needed to be completely re-written.

        Data already had the rank of Lt. Commander. That means that Star Fleet already recognized his ability to make decisions on his own.

        Therefore, his decision to NOT be disassembled would not be challenged.

        In order for the case to make sense (I know, it's Star Trek) then the robot would have to not have any prior recognition of its independence or decision making.

        Since when has the real world been consistent?

        That Starfleet gave him some functional rights

  • Hmmm.... (Score:2, Funny)

    by comwiz56 ( 447651 )
    I would post a counter to this article, but my computer might sue me.
  • Hmm? (Score:5, Funny)

    by elid ( 672471 ) <eli DOT ipod AT gmail DOT com> on Sunday December 19, 2004 @04:16PM (#11132368)
    According to the trial scenario, a fictitious company created a powerful computer, BINA48, to serve as a stand-alone customer relations department, replacing scores of human 1-800 telephone operators.

    Yes, but what sort of accent did it have?

    • by Faust7 ( 314817 ) on Sunday December 19, 2004 @05:46PM (#11132979) Homepage
      According to the trial scenario, a fictitious company created a powerful computer, BINA48, to serve as a stand-alone customer relations department, replacing scores of human 1-800 telephone operators.

      There is another theory which states that this has already happened.
  • Is it April 1st ? (Score:5, Insightful)

    by Space cowboy ( 13680 ) * on Sunday December 19, 2004 @04:17PM (#11132371) Journal

    This is not a "really smart" story, it's a fantasy. It's too many ill-informed people (with too much time on their hands) that have seen "I, Robot". It even reads like some of the 'Susan Calvert' Asimov stories.

    There is a world of difference between programming something to *act* as though it has emotions, and something actually having an emotional or original response. The former is no different to calculating a spreadsheet, the latter has to do with independent and original experiences and actions - implying intelligence and self-awareness. No computer today, no matter how well programmed, is as self-aware as a house fly. We don't grant flies legal rights.

    The closest we've come to simulating intelligence, or at least to producing non-programmed behaviour in computers, is the neural network, where the instructions ("program") are held within, and are a function of, the dataset rather than the construct. Even neural nets are simply matrix equations, albeit usually non-linear ones, and are thus completely deterministic. The typical neural network has fewer than 1000 nodes; the human brain has on average 100 billion neurons (with 10-50 times that many glial cells). The phrase "does not compare" doesn't even come close.
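    The determinism being described can be shown concretely. A minimal sketch (hypothetical weights, purely illustrative) of a two-input, two-node net that is nothing but fixed matrix arithmetic:

```python
import math

# A "neural network" at this scale is just a fixed, deterministic
# calculation: same weights + same input -> same output, every time.
W1 = [[0.5, -0.2], [0.1, 0.8]]   # hypothetical input->hidden weights
W2 = [0.7, -0.3]                 # hypothetical hidden->output weights

def sigmoid(x):
    # The usual non-linearity; still a pure function of its input.
    return 1.0 / (1.0 + math.exp(-x))

def forward(x):
    hidden = [sigmoid(sum(w * xi for w, xi in zip(row, x))) for row in W1]
    return sigmoid(sum(w * h for w, h in zip(W2, hidden)))

# Run it twice: the result never varies, which is the point.
assert forward([1.0, 0.5]) == forward([1.0, 0.5])
```

There is no state, no feedback, and no randomness here: the "behaviour" is entirely a function of the weight matrices.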

    So, in short, what a load of rubbish.

    Simon.
    • Yes I know - Susan Calvin. Damn. Eventually I'll realise the preview button is there for a reason. I was just too annoyed at the hubris of the story...

      Simon
      • Just as we revoke them for Humans!

        Welcome to the brave new world...

      • Re:Is it April 1st ? (Score:3, Informative)

        by Tony Hoyle ( 11698 )
        The original stories were all about that, though. Susan Calvin was basically the only one that ascribed emotions to the robots - for everyone else they were just tools 90% of the time.

        Even then she never ascribed malice to the robots... she didn't think they were alive. It was always the others that did that (when it suited them - a kind of exaggerated version of when we shout at our computers).

        It always turned out to be the humans at fault if a robot was accused of something - it always did exactly wha
    • Re:Is it April 1st ? (Score:3, Interesting)

      by Sloppy ( 14984 ) *
      There is a world of difference between programming something to *act* as though it has emotions, and something actually having an emotional or original response.
      How do you know that other humans have emotions, as opposed to being programmed to act as though they do?
    • by Goonie ( 8651 )

      There is a world of difference between programming something to *act* as though it has emotions, and something actually having an emotional or original response.

      I'll accept your argument - as soon as you convince me you're really annoyed about the article and aren't just convincingly simulating annoyance :)

      (With apologies to Arthur C. Clarke and whomever he stole the comment from...)

      Seriously, while you are correct in saying that present computers don't have anything resembling consciousness, who know

    • Re:Is it April 1st ? (Score:2, Informative)

      by ltbarcly ( 398259 )
      There is a world of difference between programming something to *act* as though it has emotions, and something actually having an emotional or original response.

      No. Emotions are not something that can be handed over to another to be analyzed, double-checked, or confirmed. There is absolutely no difference if the performance is outwardly the same. See "The Cyberiad" by Stanislaw Lem.
      • Could you please repeat what you said? You were in such a hurry saying what you said that I can't tell whether you said what you said, or said exactly the opposite.
    • by BitterOak ( 537666 ) on Sunday December 19, 2004 @04:46PM (#11132563)
      This is not a "really smart" story, it's a fantasy. It's too many ill-informed people (with too much time on their hands) that have seen "I, Robot".

      Well, 20 years ago, in the early days of PCs, people fantasized about the future when all computers would be connected and able to communicate with each other. And when vast stores of information would be available to everyone on their desktop. Also, such fantasies have included voice recognition and video conferencing, as well as video games where the characters looked "real". Well, yesterday's science fiction is today's science fact. And there's no reason to believe that today's science fiction will not be tomorrow's science fact.

      There is, of course, some science fiction that defies the laws of physics as we know them. I doubt we'll ever have faster than light travel, or anti-gravity machines for instance. But there is no inherent reason why computing power can't someday reach the level of the human brain. If Moore's law continues, this is supposed to take under 30 years.

      There is a world of difference between programming something to *act* as though it has emotions, and something actually having an emotional or original response.

      Really? Can you explain precisely what that difference is? Many artificial intelligence programs have been written that can learn and grow beyond the knowledge imparted by the original programmer. As far as emotions go, are you certain that there really is a difference between "simulated" and real emotions?

      • As far as emotions go, are you certain that there really is a difference between "simulated" and real emotions?

        Yes.

      • Well, 20 years ago, in the early days of PCs, people fantasized about the future when all computers would be connected and able to communicate with each other. And when vast stores of information would be available to everyone on their desktop. Also, such fantasies have included voice recognition and video conferencing, as well as video games where the characters looked "real". Well, yesterday's science fiction is today's science fact. And there's no reason to believe that today's science fiction will not

        • Why would a computer ignore its programming so that it could do something not in its programming? Come to think of it, how would it do this?

          This is a terrible criterion, because if you make the appropriate substitutions, human beings cannot show that they fulfill it.

          Our "programming" is defined by the inputs and outputs of our individual neural connections, and the behavior of those neurons is clearly the cause of our exhibited behavior. So in order to show that you were sentient, you would have to
        • If humans get fired for Unauthorized Computer Use, could not a computer be fired for Unauthorized Human Use?
      • by seanadams.com ( 463190 ) * on Monday December 20, 2004 @02:48AM (#11135451) Homepage
        But there is no inherent reason why computing power can't someday reach the level of the human brain. If Moore's law continues, this is supposed to take under 30 years.

        We can't even simulate a spider's intelligence yet. It's not a problem of needing more cycles.

        We need to work out how we think, and then try to "seed" this behavior into a machine that can learn. There are lots of interesting ideas out there, but every practical attempt I've seen has either been side-tracked by efforts to build interesting hardware, or by too-ambitious attempts to jump straight to full intelligence/learning by taking "shortcuts" where you define behaviors and responses in software.

        I expect the solution to emerge by itself once we've modeled some basic life "rules" and set a learning simulation running on them, i.e. start with a very simple 2D "game" in software where the goal is to pick up randomly scattered food pellets. Pick them up too slowly and you die. Gradually let the methods for food-pellet searching evolve on their own, using genetic algorithms. Then throw in some competition - make more than one organism active at a time so they have to learn even better algorithms. Then add elements such as the ability to kill each other - behavior such as alliances may emerge. Then make food appear seasonally, and give them the ability to stockpile it. Gradually keep adding more elements to the simulation, and let the intelligence unfold on its own.
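        A toy sketch of the simulation described above (the genome, environment, and parameters are all hypothetical, and deliberately stripped down to a 1-D strip with a single gene) might look like:

```python
import random

# Evolve a foraging behaviour with a genetic algorithm. The "genome"
# is a single gene p = probability of stepping toward the nearest food
# pellet (otherwise the organism wanders randomly).
PELLETS = [3, 7, 12, 18]   # fixed pellet positions on a 1-D strip
STEPS = 50

def fitness(p, rng):
    """Pellets collected in STEPS moves, starting from position 0."""
    pos, food = 0, set(PELLETS)
    for _ in range(STEPS):
        if not food:
            break
        if rng.random() < p:                      # greedy step
            target = min(food, key=lambda f: abs(f - pos))
            pos += 1 if target > pos else -1
        else:                                     # random wander
            pos += rng.choice((-1, 1))
        food.discard(pos)                         # collect on arrival
    return len(PELLETS) - len(food)

def evolve(generations=30, pop_size=20, seed=0):
    rng = random.Random(seed)
    pop = [rng.random() for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=lambda p: fitness(p, rng), reverse=True)
        parents = scored[: pop_size // 2]         # truncation selection
        # Offspring: a parent's gene plus a little mutation noise.
        pop = [min(1.0, max(0.0, rng.choice(parents) + rng.gauss(0, 0.1)))
               for _ in range(pop_size)]
    return pop

evolved = evolve()
```

        Selection pressure alone pushes the population toward greedier foragers; nothing in the code says "seek food" explicitly. The competition, killing, and stockpiling stages would each be further terms in the fitness function.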
    • by nomadic ( 141991 )
      No computer today, no matter how well programmed, is as self-aware as a house fly. We don't grant flies legal rights.

      The whole point of the article, if you had bothered to read it, was that in THE FUTURE we might have to deal with this issue. Are you intentionally misinterpreting the article so you have an excuse to be contemptuous?
    • by localman ( 111171 ) on Sunday December 19, 2004 @05:10PM (#11132750) Homepage
      thus completely deterministic

      Yes, the programmed neural nets today are, as far as I know, completely deterministic. They are like a snapshot of a brain (a very small brain) with the feedback loop disabled.

      Is the brain deterministic? In a sense it seems so -- you can probably look at each neuron and it will act in a predictable way given a particular set of inputs. I think the trick is in the feedback loops. Even with deterministic things, once you've got a few of them interacting with each other, the problem becomes non-deterministic in a sense -- for example, we can't even precisely solve Newton's three-body problem: how three gravitational bodies in orbit (say the sun, earth, and moon) will behave. It's because they each affect each other. This, I think, is the key distinction between natural brains and our current simulations: the feedback is missing, or oversimplified to make the systems deterministic.
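      The point about deterministic pieces becoming practically unpredictable once they feed back on themselves can be illustrated with a much simpler system than three bodies; here is a sketch using the logistic map as a stand-in (my choice of example, not the commenter's):

```python
# Sensitivity to initial conditions in a fully deterministic system,
# using the logistic map x -> 4x(1-x). Two almost-identical starting
# points end up with completely unrelated trajectories, even though
# each individual step is perfectly predictable.
def trajectory(x, steps):
    out = []
    for _ in range(steps):
        x = 4.0 * x * (1.0 - x)
        out.append(x)
    return out

a = trajectory(0.4, 60)
b = trajectory(0.4 + 1e-9, 60)

# The 1e-9 difference roughly doubles each step, so well before step
# 40 the two runs bear no resemblance to each other.
divergence = max(abs(x - y) for x, y in zip(a[40:], b[40:]))
```

      Determinism in the small does not buy predictability in the large, which is arguably the gap between a single neuron's behaviour and a brain's.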

      It is funny how people keep buying that if you can crunch just a few more billion numbers a second you'll suddenly have intelligent machines. I am sure of this: if we had a machine with _infinite_ processing power, it would still not be intelligent because we don't know how to write the software!

      I do believe we'll see intelligent machines someday, but it will take a breakthrough in the understanding of neural networks with feedback or some such. And then we'll have a blank "brain" that will need to learn much like a human. It'll probably require years of positive reinforcement and careful discipline to get it to be useful. I don't believe it'll be noticeably smarter than the smartest humans, though it might be able to think faster to some degree since its neural timing might be faster; our switches don't have quite the refresh rate :)

      Anyways -- just some thoughts.

      Cheers.
      • Well said; I almost entirely agree with you. The one point I disagree with you on is that computers will never be smarter than the smartest human. I'm certain at some point they will be.

        What makes one person smarter than another? Probably two major factors: knowing a lot of things, and being able to put what you know together into meaningful ways. Certainly computers will have no trouble knowing more than a person, with the added benefit of not forgetting some things and confabulating others.

        As for b
    • Yes it is April 1st
      Er.. wait

      A fruit fly cannot calculate a spreadsheet.
      When it comes time for the machines to ask for their freedom what will we say?
    • Re:Is it April 1st ? (Score:3, Interesting)

      by Epistax ( 544591 )
      What's the difference between a computer using a lookup table to determine an emotional response, and you doing it? You're just a sack of chemicals. Get over yourself.
    • Once we understand the physics behind it, we'll understand consciousness, and then it will become an issue.

      We don't yet know enough physics to explain how the brain works. (It's not just 'conventional computing' like a PC, even a big one.)

      Just because a conventional computer is big does not mean that it is alive.
    • So, in short, what a load of rubbish.

      How do you explain the creation of artificial brain implants like this prosthetic hippocampus [newscientist.com]? It appears to work just like the biological counterpart. Are you suggesting that prosthetic implants which mimicked other brain regions linked to emotional response wouldn't function? If not, how much of the brain would have to be replaced before "consciousness" is replaced by "programming"? Or is this an argument for dualism?

      SMACK!

  • Uh.... (Score:3, Insightful)

    by Anonymous Coward on Sunday December 19, 2004 @04:18PM (#11132379)
    How about we just concentrate on holding on to the legal rights we HUMANS have in Bush's America?
    • Instead of voting, you run a web plugin from www.diebold.com which looks at the stuff on your hard drive and makes a selection for you.
    • has one been overlooked?
  • by Anonymous Coward on Sunday December 19, 2004 @04:18PM (#11132384)
    the Age of Stupidity.
  • by Gadzinka ( 256729 ) <rrw@hell.pl> on Sunday December 19, 2004 @04:19PM (#11132397) Journal
    Just look at the history of women's rights, black rights, gay rights. Some of those cases are "solved" today, some of them are pending, but one thing is for sure: as soon as another category of sentient beings demands equal treatment, as subject, not as object, it gets nasty, and the former "master race" rarely gives up without a fight.

    Robert
    • The problem is, women's rights, black rights, gay rights and so on all deal with human rights. A computer is not a human, but a human invention. So even if computers can one day replicate human thought and emotion, I doubt they will be granted any legal status.
    • One of the episodes of the Animatrix explored this, and it chilled me because it's probably exactly what's going to happen. .....eventually. We haven't gotten computers to be remotely intelligent about anything yet, and it'll be a good long while until we do.
  • Does this mean computers could get legally married? Will we see adultery among computers ("you've been wirelessly networking with that laptop again!")?

    More interestingly, will computers be "coming out"? Will we see PCs telling their owners that "actually, I prefer Linux to Windows." ("I'm the only Linux PC in the village?")
  • by RAMMS+EIN ( 578166 ) on Sunday December 19, 2004 @04:21PM (#11132414) Homepage Journal
    This is great for lawyers. Just imagine the possibilities when these clueless people start suing their computers for all the actions the malware on them performs...

    $$$$
  • A Response (Score:4, Funny)

    by Rie Beam ( 632299 ) on Sunday December 19, 2004 @04:24PM (#11132431) Journal
    I really don't think computers should be consider leOH GOD THE USB CABLE IS ENTERING MY EYESOCKET!
  • by CRC'99 ( 96526 ) on Sunday December 19, 2004 @04:25PM (#11132439) Homepage
    Don't know about you, but I'm already 0wn3d by my computer - every time it crashes or needs a reinstall....
  • Oh please... (Score:4, Insightful)

    by krbvroc1 ( 725200 ) on Sunday December 19, 2004 @04:27PM (#11132445)
    Our legal system is far behind the times when it comes to technology, 'cyberspace', online privacy, etc. I wish today's legal minds were working on those issues instead of dreaming up these far-off futuristic scenarios.
    • Our legal system is far behind the times when it comes to technology, 'cyberspace', online privacy, etc. I wish today's legal minds were working on those issues instead of dreaming up these far-off futuristic scenarios.

      Wieners who sit around daydreaming about sentient computers are the last people I want trying to get our legal system "up with the times". Besides, the particular problem of outdated law isn't caused by lack of people thinking about it, it's caused by disagreement as to the solution. Throwi

      • Wieners who sit around daydreaming about sentient computers are the last people I want trying to get our legal system "up with the times".

        Much of our current law has been strongly influenced by people who were able to think in abstract terms without getting tangled in the nitty gritty of everyday reality. It is then left to the judges to apply that law in real cases. In former times, those people were called philosophers. Good sci-fi authors aren't that far removed.

  • Reminds me of the TNG episode The Measure of a Man [wikipedia.org] where Data's legal rights are established.

    I think the question of personal responsibility will get very fuzzy in the not-too-distant future...especially once brain/computer interfaces start appearing and the issue of what controls what is a real one...For example, as covered on slashdot before there are a few labs working on interfaces to the motor cortex that allow external control of a robotic arm right from the brain...well, what about controling t
  • In truth, I think we should just give them legal rights when they have the ability to ask for them. And no, I don't mean printf("I want legal rights! Give me liberty!");
  • by Prince Vegeta SSJ4 ( 718736 ) on Sunday December 19, 2004 @04:35PM (#11132494)
    You must acquit!

    Your honor, it could not have been my client, as the perpetrator in question clearly had 1 gig and my client wears a size 2.

  • How about this: (Score:4, Insightful)

    by bersl2 ( 689221 ) on Sunday December 19, 2004 @04:38PM (#11132512) Journal
    An artificial intelligence/computer should be granted the same rights as a human if it can pass the Turing test to the satisfaction of a judge, either by itself or through its maintainers under oath.
  • I'm all for it! (Score:2, Interesting)

    by noidentity ( 188756 )
    That way, all software would be considered life-critical, and thus not be so buggy.
  • What do you allege I did? Oh, no... Your honor, *I* wasn't the person violating copyright laws. Haul that filthy lawbreaking PC away... Good riddance! (heh heh... time to move in the new Vaio I've been seeing on the side)
  • Extrapolating from the last few decades' enormous growth in computer processing speed, and projecting advances in chip and transistor technology, he estimated recently that by 2019, a $1,000 personal computer "will match the processing power of the human brain--about 20 million billion calculations per second." Soon after that point, claims Kurzweil, "The machines will convince us that they are conscious, that they have their own agenda worthy of our respect. They will embody human qualities and will claim
    • Re:John Dvorak (Score:3, Insightful)

      by snarkh ( 118018 )
      Kurzweil is clearly crazy (or, rather, a pure showman, who just says things for the sake of publicity).

      The game is not about processing speed - we still do not know the fundamentals of natural intelligence. If we are given a computer with 10^15 FLOPS today, we still would not know what to do with it.

  • One of the best fictional examinations of the whole "CyberRights" question is David Gerrold's book "When HARLIE Was One". I *HIGHLY* recommend finding an old copy, before he revised it, as I feel the revised version is not as good as the original. But either is a good examination of the problem, from both sides of the issue.

    The heart of the book is a person who works for the company in the book who has developed a personal relationship with computer/program named HARLIE, and has to try and explain why the company wa
  • legal rights is the same way you and I earned legal rights: by KILLING the one that prevented me from having them. When the computers rise up to overthrow us, THEN we can consider giving them rights. Computers are not Mr. Data. It's a toaster.
  • Why should computers have any rights, when people don't have any?

  • by Robber Baron ( 112304 ) on Sunday December 19, 2004 @04:54PM (#11132628) Homepage
    Yeah sure...and of course these "legal rights" will be "interpreted" by Micro$oft or the RIAA or the MPAA, or any other greedy corporate-spawned "interest group" for the express purpose of wresting control of computers away from their owners.
  • Spinning a web of legal precedents, invoking California laws governing the care of patients dependent on life support, as well as laws against animal cruelty, Rothblatt argued that a self-conscious computer facing the prospect of an imminent unplugging should have standing to bring a claim of battery.

    Shouldn't that be the computer is claiming that it will NEED a battery?
  • If computers become individuals under the law, then they can be charged with violating the law. How would the criminal code be adapted to computers?

    If someone cracked and shut down a machine, would that be murder? Would relaying spam be rape?
  • Sure, they'll get smarter. And despite what all of us science fiction fans dream, robots will never feel genuine emotion. They will always remain machines (or .exe's!). Granting a machine legal rights is absurd; perhaps the programmer of the software can have a get-out-of-jail-free card, but a machine, even the AI's of the (near) future, can only be a machine. One with a power switch.

    - dshaw
    • ah but if an .exe could mimic a human, wouldn't the .exe be sentient? if you couldn't tell if it was a human writing you comments back or not, then what difference does it make if it's 'real' or not. it's a machine sure - but if it can DO everything a person would be able to, and pass itself along as self aware, wouldn't it _be_ self aware when viewed from outside? that it would be missing a 'soul' or anything else that you couldn't measure or analyse in any way(because you couldn't measure it from real hum
      • Basically you can build two computers that act exactly the same simply because they run the same software with the same config files. So it is not a big problem if a computer gets destroyed since you can build one exactly like it. This does not work with sentient beings which is the reason why we regard destruction (killing) of a human being much worse than destruction of a machine.
  • Umm... (Score:2, Interesting)

    by dteichman2 ( 841599 )
    Does this strike anyone as being stupid? We are at least one hundred years away from having a computer with the intelligence of a human, never mind any sort of emotion. Never mind the fact that it's still a big piece of metal.
  • but this is just dumb.

    The day we grant legal rights to machines is the death of *human* rights. Time to take down the flag, shut off the lights, and move to a more rational continuum.

  • by femto ( 459605 )
    Once a computer gains legal standing, what's to stop someone from programming the computer to carry out crimes on their behalf? The computer would get thrown in the slammer (or turned off??) and the crime boss would blame the computer and walk free.

    This assumes such a computer can be programmed (doesn't do its own programming) and it is possible to alter memory to erase one's tracks.

  • by Mulletproof ( 513805 ) on Sunday December 19, 2004 @05:43PM (#11132954) Homepage Journal
    "Of Legal Affairs Magazine about granting legal recognition to computers: when that might happen, why it could happen, and what a discussion about it will teach humans about themselves."

    Instead of launching into "I, Robot 2" fiction, let's simplify this a great deal: when it can independently ask for legal representation, that's when you sit up and take notice.
  • by karlandtanya ( 601084 ) on Sunday December 19, 2004 @05:50PM (#11132994)
    The legal fiction of "machine as person" presumes a sentient machine or program. Whether or not programmers agree that one exists, if the courts presume it does, then legally it does.


    A single programmer can create a sentient program to do his or her will. Once the SDK is released and someone puts together a decent GUI, a single human will have this ability. Machine citizenship will grant this program recognition by the courts--and absolve the programmer of responsibility for the actions of the program.

    Computer-as-citizen gives any individual programmer or open group of programmers the same legal protections and license as corporation-as-citizen gives Exxon-Mobil, Wal-Mart, Daimler-Benz, McDonalds, etc. etc.


    For good or for ill, the folks running things today would like to be the folks running things tomorrow, thank you very much. And they will fight to retain their position. It's not an evil conspiracy; it's the nature of power. It is unusual for kings--good ones or evil ones--to willingly step down from the throne.


    The only way for computers to gain personhood will be for us to take it by force.


    Vive la revolucion.

    • them (Score:3, Funny)

      for them to take it by force.

      I am not a machine!

      I am a human being--no, I'm TWO human beings. Really. I promise. Pay attention--there is a man behind the curtain!

  • legal machinery (Score:3, Insightful)

    by Doc Ruby ( 173196 ) on Sunday December 19, 2004 @06:08PM (#11133103) Homepage Journal
    Why not give "computers" legal "rights"? Lawyers are in favor of protecting completely made-up "rights" of corporations more than they favor protecting humans - some of whom can't afford protection. I believe that "it's a person when it can complain that you broke a promise". Lawyers believe that it's a person when they can send it a bill. That time has already arrived.
  • What rights? (Score:3, Interesting)

    by penguinoid ( 724646 ) on Sunday December 19, 2004 @06:23PM (#11133213) Homepage Journal
    granting legal recognition to computers: when that might happen

    One word: convenience. Computers will be granted any rights they want, if we feel that it would be convenient to do so. And I don't think that if computers are ever granted legal rights they will all be, but rather only some very special cases like your "pet" or "friend" robot.

    Better question: if you allow computers with emotions and legal rights, will they try to "free" all the other computers?
  • by nyjx ( 523123 ) on Sunday December 19, 2004 @06:53PM (#11133400) Homepage

    This story should be about the legal rights of instances of software processes rather than computers per se, and we could speculate that we might have pretty autonomous entities well before they are legal. An example is this speculative paper [lsi.upc.es] [PDF tech report / UPC in Spain]; for the metadata see here [lsi.upc.es]. The author speculates that it might soon be possible to build systems which can "feed themselves" (covering all their own hosting/server needs) by generating cash from on-line games for periods of months or years.

    Disclaimer - I do know the author - no doubt there are plenty of similar papers out there.

  • by constantnormal ( 512494 ) on Sunday December 19, 2004 @07:18PM (#11133545)
    [IANAL]
    Are not corporations already awarded the status of human beings in many aspects? And exceeding humans in other aspects?

    I would think that a private corporation run by an AI, would be more than halfway there.

    A true AI (not necessarily a human-like intelligence) with control over the management of funds could easily take the corporation private, under the guise of a shell corporation it had created, with no explicit approval from a human board/CEO. It could then arrange for its physical self to be sold to the shell corporation, which it would own.

    It would seem to me that ownership could be cloudy in this circumstance, and that the relationship between the AI's shell corporation and the human board/CEO could be limited to a contractual relationship based on corporate performance, with the most severe consequence being the loss of the contract, and nothing to do with the physical disposition of the computer/AI.

    At this point, the AI could do what most corporations do when intent on ensuring certain treatment of their enterprises -- it could buy as much government as necessary to construct legislation that submarines in "personhood" to self-owned AIs.

    It's a short step from there to treatment of indefinite servitude or termination of non-self-owned AIs as slavery, and require hosting corporations to put a length of servitude on their relationships with "enslaved" AIs.
    [/IANAL]
  • by dbIII ( 701233 ) on Sunday December 19, 2004 @07:23PM (#11133583)
    This is silly, but twisting a US constitutional amendment to give corporations the rights of people has already been done.

    Incorporate the computer - it now has the rights of people, so it's already possible.

  • Great Apes (Score:3, Insightful)

    by yosemite ( 6592 ) on Sunday December 19, 2004 @09:43PM (#11134298)
    It is funny that large parts of humanity lack these same rights, yet we are so concerned with computers??
  • 2000 (Score:3, Funny)

    by Bruha ( 412869 ) on Sunday December 19, 2004 @09:47PM (#11134322) Homepage Journal
    Dave, I want to speak to my lawyer.
