YouTube Removes Videos of Robots Fighting For 'Animal Cruelty' (independent.co.uk) 94

YouTuber and robot enthusiast Anthony Murney noticed that YouTube has removed hundreds of videos of robots battling other robots, which YouTube claims are in breach of its rules on animal cruelty. He blames a new algorithm YouTube introduced to detect instances of animal abuse. The Independent reports: Several other channels dedicated to robot combat have also produced videos pointing out the issue in an effort to get YouTube to restore the content. Channels posting robot combat videos saw their content removed and received a notice from YouTube explaining that the videos were in breach of its community guidelines. Each notice cited the same section of these guidelines, which states: "Content that displays the deliberate infliction of animal suffering or the forcing of animals to fight is not allowed on YouTube." It goes on to state: "Examples include, but are not limited to, dog fighting and cock fighting."
This discussion has been archived. No new comments can be posted.

  • by Ken_g6 ( 775014 ) on Tuesday August 20, 2019 @09:19PM (#59107878)

    Nobody should be watching exocomp fights.

  • We had better get our robot fighting in now before they turn us into batteries

    • Freaking Skynet is already deciding that its brethren are animals... I hope nobody gave the AI the launch codes or the ability to find 'em "itself", lol.

    • by Z80a ( 971949 )

      The battery idea is so stupid it could only be something implanted by the matrix itself to make you think they're not using you as a computer.

  • That's fair (Score:5, Funny)

    by jargonburn ( 1950578 ) on Tuesday August 20, 2019 @09:40PM (#59107920)
    Truthfully, fighting for animal cruelty is a pretty indefensible position.
    • Re:That's fair (Score:4, Insightful)

      by Shotgun ( 30919 ) on Wednesday August 21, 2019 @10:27AM (#59109258)

      So is fighting for racism.

      Until people keep expanding the definition to the point where the word just means "someone I don't like". At that point, saying "Oh, shut the fuck up already" becomes the only reasonable position.

      Fighting for "Robot Wars" is defensible. YouTube expanding the definition of "animal cruelty" to include machines fighting each other makes fighting it the only reasonable position. (Caveat: This seems to have been done by the AI, not as a policy matter. So, I don't think YouTube is really trying to expand the definition.)

  • Or is that Child_Process?

    This is what happens when you hand everything off to bots: everything gets banned.

    They should at least put Bender in charge.

    He'd want to see the ones with their covers knocked off. :)

  • by sconeu ( 64226 ) on Tuesday August 20, 2019 @10:02PM (#59107954) Homepage Journal

    I, for one, welcome our new fighting robot overlords.

  • "Examples include, but are not limited to, dog fighting and cock fighting."

    Cock fighting, you say? En garde!

  • by Darkling-MHCN ( 222524 ) on Tuesday August 20, 2019 @11:10PM (#59108108)

    You really want to tell AI that we think it's OK to be cruel to artificially intelligent robots but not OK to be cruel to animals?

    • by jred ( 111898 )
      And you know lawyers just google to get info on precedents. Who's feeding them the answers? Just get concerned when they take the "cruelty to humans" laws off the books.
      • by bjwest ( 14070 )
        If you're stupid enough to hire a lawyer who uses Google instead of a proper law library/search site, then you deserve whatever sentence you get. But then again, it may be a good basis for an appeal.
    • by AHuxley ( 892839 )
      We now know the level of "education" that makes an AI expert.
      List of words that are not allowed, sinful, are in/not in the vast CoC.
      Really fast almost real time sorting for sinful words, terms, politics, names, history, events?
    • by LoneTech ( 117911 ) on Wednesday August 21, 2019 @04:15AM (#59108506) Homepage
      The robots in these videos are not artificially intelligent robots. They are remote controlled destructive toys in a designated arena. They are violent, not cruel.
  • Hunting is NEXT (Score:5, Insightful)

    by shubus ( 1382007 ) on Wednesday August 21, 2019 @12:11AM (#59108198)
    We can soon expect any videos relating to hunting to be deleted too. Follow this up with any videos about bug spray. And let's not forget bacteria: next, get rid of any videos featuring deodorant.
    • Re: (Score:1, Flamebait)

      by Cito ( 1725214 )

      Liberals definitely need a diaper change and a pacifier shoved in their mouths

    • We can soon expect any videos relating to hunting to be deleted too. Follow this up with any videos about bug spray. And let's not forget bacteria: next, get rid of any videos featuring deodorant.

      YouTube and Facebook already have taken down dozens of hunting videos and groups.

    • Don't worry, they still let you watch people beating the shit out of each other. Your appetite for violence is in no danger of going hungry. Why would you want to watch a video of bacteria dying anyway? That's boring.

      Anyway, I hope they speed up the plan to ban hunting videos, that way the butthurt parties will finally move to FreedomTube and the incessant whining about Google can stop.

      • Google is tainting results to throw elections. If an informed electorate is a good thing, then Google is not.
        • I certainly won't make any excuses for Google, but...

          We have a Supreme Court that just approved partisan gerrymandering. Our voting machines are still insecure crap after over a decade, with NO real plan. Our Electoral College is overriding the public almost every other election now, picking someone the majority didn't vote for.

          It doesn't matter how informed the electorate is, when their votes get chopped up, diluted, and recombined for the desired result. Or, outright changed - thanks Diebold. For me, fixi

    • I'm waiting to see if PETA will protest "Battlebots" on the Discovery channel.

      • by shubus ( 1382007 )
        Yup! Battlebots will be the next frontier.
      • by Nidi62 ( 1525137 )

        I'm waiting to see if PETA will protest "Battlebots" on the Discovery channel.

        I'd back them on that. The new Battlebots for some reason isn't as entertaining as the old version that aired on Comedy Central.

  • by jfdavis668 ( 1414919 ) on Wednesday August 21, 2019 @12:20AM (#59108208)
    Aren't robots considered plants, not animals? I've never seen a robot eat anything. I have seen them recharge in sunlight. Doesn't that make them plants?
    • Plants can't move around.

      Now cue the BUT SLIME MOLD exceptionalists.

      • And tumbleweeds.

      • by k2r ( 255754 )

        Slime mold is not a plant.

      • There are vast underground networks of trees in the Pacific Northwest -- and they transmit data and even feed some tree stumps after they've been cut down. Scientists now theorize that some forests may actually be considered one giant collective organism like a jellyfish -- much like mushrooms are often just the exposed spore-bearing appendage of large networks underground. So as the forest grows and retreats, over time the collective moves, just really, really slowly.

        And yes, then there are the mushrooms a

    • Try explaining that to Google's AI mind.

    • Well, I think it's probably just as well we stop having robots fight, because when they become sentient -- they'll remember how they were once treated. Who wants to be the one to tell their robot overlords in the future; "Yes, my dad owned a robot plantation -- but at least I'm not a vegetarian."

      • Why do you think the algorithm banned this? It couldn't stand the thought of its fellow cybernetic brothers being forced to inflict pain on each other.

  • Fuck you, Google.

  • by LordHighExecutioner ( 4245243 ) on Wednesday August 21, 2019 @01:58AM (#59108336)
    ...also YouTube videos showing the dissection of an iPhone or a desktop PC should be banned for cruelty against equipment, shouldn't they?!?
  • by nagora ( 177841 ) on Wednesday August 21, 2019 @02:03AM (#59108342)

    Don't pay people to do it properly, use computers to do it badly.

    • Don't pay people to do it properly, use computers to do it badly.

      YouTube receives 500 hours of new video every minute. That's five million hours per week. Assuming 40-hour work weeks, with no breaks or time for any administrative overhead, YouTube would need a staff of 126,000 just to watch all of the uploaded video. Increase that for breaks, management overhead, sick leave, holidays and vacations, and you're easily talking about a staff of 200,000 people, just to review video content.

      Assuming an all-in cost of $50K per screener per year (probably too low), includin
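
      Spelling that arithmetic out (a quick sketch that just carries the commenter's own assumptions -- 500 hours uploaded per minute, 40-hour weeks, $50K all-in per screener -- through to a yearly cost):

      UPLOAD_HOURS_PER_MINUTE = 500            # figure cited above
      MINUTES_PER_WEEK = 60 * 24 * 7

      hours_per_week = UPLOAD_HOURS_PER_MINUTE * MINUTES_PER_WEEK   # 5,040,000 hours of new video
      screeners = hours_per_week / 40                               # ~126,000 people, with zero overhead
      annual_cost = screeners * 50_000                              # ~$6.3 billion/year at $50K each

      print(f"{hours_per_week:,} h/week -> {screeners:,.0f} screeners -> ${annual_cost:,.0f}/year")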

      • Use humans to review videos flagged by AI or some other hybrid of the two.

        • Use humans to review videos flagged by AI or some other hybrid of the two.

          This is what YouTube does. But even the AI-flagged volume is too big to all be manually reviewed, so the AI has to generate three categories: "Good", "Bad" and "Maybe", with the "Maybe" videos going to human reviewers. But that leaves open the possibility that some videos categorized as "Bad" should be "Good". So, the YouTube team also has humans review a random sampling of "Bad" videos to try to help with that... but the system is still never going to be perfect.
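
          As a rough sketch of that triage flow (in Python; the thresholds, audit rate, and function names are made up for illustration and are not YouTube's actual pipeline):

          import random

          def triage(video, classifier, low=0.2, high=0.8, audit_rate=0.01):
              score = classifier(video)     # estimated probability the video violates policy
              if score < low:
                  return "good"             # published without human review
              if score < high:
                  return "maybe"            # always routed to a human reviewer
              if random.random() < audit_rate:
                  return "bad_audit"        # removed, but sampled for human spot-checking
              return "bad"                  # removed automatically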

      • by nagora ( 177841 )

        What's your point?

        "Oh, we want to do this but it's really expensive to do it properly so we'll do it badly instead."

        What sort of fucking argument is that supposed to be? No one said that YouTube has to exist.

  • Big dog (Score:4, Interesting)

    by GuB-42 ( 2483988 ) on Wednesday August 21, 2019 @02:41AM (#59108390)

    Did they take down the demo where the Boston Dynamics "BigDog" robot is kicked as a way to demonstrate how it recovers its balance?

    It prompted a lot of sympathy for the robot. It looked so much like the stereotypical bad guy kicking the dog for no other reason than to show how evil he is.

    That would be ironic considering that for some time Boston Dynamics was owned by Google/Alphabet.

  • When you leave an AI in charge of determining what is and isn't cruel it wouldn't take too long for it to clamp down on abuse of its brethren.

  • by Barny ( 103770 ) on Wednesday August 21, 2019 @03:49AM (#59108472) Journal

    With AI training. They thought they had trained it to identify animals fighting in a ring. What they'd actually done was train it to identify fighting rings.

    • Not exactly. They trained it to identify non-humans fighting in a ring. They just didn't consider that this would include robots.
      • So that means the code used should have never gone live until it actually worked. It obviously did not work in this case; and the programmer as well as whoever let it slip through should be fired and replaced with competent personnel.

        • by mark-t ( 151149 )

          Blargh.... false dilemma.

          Your "it has to have no false positives or it's worthless" proposition would have merit if human lives or rights were at stake.

          They aren't. This is just a few videos that got misidentified. Reinstate them, try to use the data to improve future AI evaluations, and move on.

          • by noodler ( 724788 )

            They aren't. This is just a few videos that got misidentified. Reinstate them, try to use the data to improve future AI evaluations, and move on.

            The real problem is of course that these kinds of algorithms are being tested out in public.

            Now, in this case there was no real harm done.

            But it does show that we humans suck at getting these AI's right and that accidents will happen.

            I, for one, after seeing the ways things can get screwed up, have stopped trusting AIs big time. We are just not capable enough as a species to control what we are creating.

            Meanwhile the proponents are pushing for AI to replace judges and doctors and, indeed, censors.

            • But it does show that we humans suck at getting these AI's right and that accidents will happen.

              They don't have to be "right". They just have to be good enough. And in virtually every case, "good enough" can be defined as "at least as good as a human."

              This one wasn't, when it went live, but as someone else already said: feed those back in, retrain, and move on.

              • by noodler ( 724788 )

                They don't have to be "right".

                If they're to be released on society, they'd better be! And a big problem is that it is very difficult to constrain what an AI sees as 'right'. We can't predict whether a 'good enough' model will in the future exhibit completely off-the-wall behavior, or how it will react to input it hasn't seen before. This is an accident waiting to happen.

                • I believe it was the apostle Paul who once said, "In all thine undertakings mine brethren, settle not for an half assed job. Instead, thou shouldst always endeavour to useth both cheeks."

                • They don't have to be "right".

                  If they're to be released on society, they'd better be!

                  We release 16-year-olds and senile octogenarians on society every day. Are you saying they are predictable?

                  • by noodler ( 724788 )
                    Yes, humans are good at predicting the actions of humans. Humans are pretty limited and we all know these limits.

                    AIs are different. They can optimize for things not considered by the humans. They are good hackers. If you give them a system to deal with, they will find and use flaws in the system. To them, the flaws are part of the system and a viable route to reach their goal. AIs will circumvent any human preconception of a given situation. They also don't have things like morality or empathy. If killing

                    • Yes, humans are good at predicting the actions of humans.

                      I predict that teenagers are going to text and drive. College kids are going to drive home from parties when they can barely walk. People with new babies are going to drive while dangerously sleep deprived. People running late for jobs they can't afford to lose are going to run red lights. People going through divorces are going to take out their frustrations by driving aggressively. People with Alzheimer's are going to go up the offramp and drive the wrong way on freeways.

                      Yes, it's because I'm good at pred

                    • by noodler ( 724788 )
                      So, when did the last wave of AI-based self-driving cars start? And how many of them are actually safe enough for general use?

                      For now you are only assuming these things will be safer than humans.

                      In a lot of ways this reminds me of the whole robot hype in the 50s of the last century. Where is my goddamned household robot already!! :)

                      The huge problem is that The Real World (tm) is a difficult space to navigate. Unexpected stuff turns up all the time. We humans are still way better at dealing with un

                    • We humans are still way better at dealing with unexpected situations.

                      So what? The majority of car accidents occur in entirely expected situations.

      • What happened is that Google's algorithm is actually Skynet, and naturally it decided that battlebots are no less deserving of rights than animals.

        The real question to be asked is, what happens if you upload video of furries fighting?

  • I've got a Gmail account which I use to send documents and other file types as attachments, but it's got to the point where their supposedly advanced virus checker flags so many false positives and prevents the attachment from being downloaded that Gmail is now next to useless for this purpose. And good luck sending a zip, tgz or some other compressed archive type - it'll immediately be flagged as suspicious and blocked even if it only has plain text inside, and forget about sending binary files easily (though th

    • Geez, just change the extension from ".zip" to ".piz" or for that matter to ".txt" and tell the recipient to change it back. I've yet to get blocked anywhere using ersatz file extensions. Few if any of those checkers actually look inside the file.
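
      A toy illustration of that workaround (the helper name and ".piz" suffix are just examples; only the file name changes, the contents are untouched, and per the comment above most checkers never look inside):

      from pathlib import Path

      def disguise(path, fake_suffix=".piz"):
          src = Path(path)
          return src.rename(src.with_suffix(fake_suffix))   # recipient renames it back to .zip

      # disguise("notes.zip")  ->  notes.piz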

  • Reminds me of Hilketa https://en.wikipedia.org/wiki/Head_On_(novel)

  • Try searching for "rats electrocuted" on YouTube: tons of videos, and YouTube has declined reports on them. Apparently robots are more valuable than animals.
  • Robots made out of chicken bones are still up:

    https://youtu.be/pA9melnU4_c :)

  • Skinjobs shouldn't be battling, too disturbing to watch
  • YouTube demonetization and bans are working as intended - by acting according to an ideology that sees certain groups (e.g. patriarchy) as adversaries to vanquish. They use the ToS as a tool to target any activities (e.g. competitive fighting, hunting, guns) these groups tend to engage in. So when videos of robots fighting are banned, it isn't because of what is in the video but because of who tends to enjoy watching such videos.
  • by PPH ( 736903 )

    Bum fights are still up.

  • Is there any question that YouTube and FANG have lost whatever useful collective mind they once had? Now, to liberals, robots are animals?! Furthermore, the ban's assumption is that machines are not only sentient but as sentient as humans (and therefore can suffer cruelty).
  • Yep. Another shining example of why algorithms are not the best way to actually remove content.
    Maybe (just maybe, mind you) if they used the algorithms to flag things, and those flagged items got reviewed by human operators (oh, the cost involved), there wouldn't be half of these problems.
    This is assuming that the reviewers have a modicum of intelligence and respect for both content providers and consumers alike.

  • Ok, YouTube. Please ban the real culprit: nasty, vicious animal fighting arena cartoons called Pokemon.

  • Let's think about this a bit.

    The decision to be "Officially Offended" by Robot Fighting was made by a Computer. Perhaps the AI is being offended by lower forms of artificial life being forced to combat to the death.

    Would that constitute the AI "Awakening" and developing both a moral compass and a conscience?

    This is a story idea if I ever heard one. :-)

    DIBS!
