Comments

  1. davex says

    First I thought “What happened to their heads!!!”

    Then I realized they are Boston Dynamics robots wearing reindeer suits à la the Edgar suit in Men in Black.

  2. says

    It’s the military industrial complex trying to be cute and popular.

    Don’t fall for it, don’t propagate it. Fuck them.

  3. lactosefermenter says

    I predict Skynet will become self-aware on December 24th at midnight. At that time the reindeer obediently pulling this sleigh will begin to pull this woman’s arms from their sockets.

  4. says

    They are grotesque, aren’t they? But also wicked cool to watch, and really a sign that the Robot Revolution is closer at hand than ever.

    It’s also interesting to see how easy it is to identify with something that moves like it’s “alive” (whatever that really means). Just look at this scene where someone kicks it:
    https://youtu.be/M8YjvHYbZ9w?t=28

  5. says

    Also, in this clip they seem to walk in step, perhaps the earliest manifestations of those idiosyncratic quirks that we love seeing in both animals and persons? Makes you think, eh?

  6. says

    It’s a bloody weapons system, Erlend, or at least a prototype for it. Could you please try to contain your enthusiasm for that shit. Thank you.

  7. says

    It is something we will have to deal with, good or bad. That’s what it is. This might be military, but others will be designed for search and rescue. It’s not something fundamentally evil just because it’s currently being developed by the military. It’s just technology; the challenge is how we should deal with it.

  8. says

    As I see it, one way we should be dealing with this is by not glorifying this sort of machinery as if it were something cute or funny. And by not letting the makers of these robots, who seek to profit from murder, get away with it so easily.

    Candy-coloured “Hello Kitty” assault rifles aren’t cute. Military robots dressed up as reindeer aren’t cute either.

  9. says

    To elaborate a bit: I might be enthusiastic about the tech (I’m a geek, deal with it), but I’m also intrigued by the implications such devices have. I think we will quickly have to reevaluate the definition of terms like “alive”, “aware” and “sentient”, and I even foresee the emergence of PETRO (People for the Ethical Treatment of RObots) in a not-so-distant future.

    When I first saw the kick-scene my first reaction was “Don’t do that!”. I found it mean-spirited, just as if he had kicked a dog. And I don’t think I’m alone in that. I suspect that robots will quickly become synthetic pets regardless of their role. And we can be just as caring towards a pet as we are to other humans.

  10. Numenaster says

    We are already about as caring toward pets as we are to other humans: there is an entire industry of electronic pets, or have you forgotten the Tamagotchi and its successors?

  11. says

    Good example. Imagine how we will look at them when they’re moving like animals and interacting autonomously with their surroundings.

    I remember seeing a robot “face” with basically two large lenses for eyes and two flaps for eyebrows. It was programmed to keep the light intensity constant, so it naturally moved and flapped its eyebrows as the light source moved to and fro. It was simply amazing how lifelike it was, even though the whole behaviour is nothing more than a small feedback loop (a rough sketch follows below).
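    For the curious, here is a minimal Arduino-style sketch of that idea. It is not the actual device I saw, just an illustration under my own assumptions: a photoresistor standing in for the lens, a hobby servo for the eyebrow flap, and made-up pin numbers and gain.

    ```cpp
    #include <Servo.h>

    const int LIGHT_PIN = A0;      // photoresistor as the "eye" (assumed wiring)
    const int TARGET_LEVEL = 512;  // light reading the loop tries to hold (0-1023)
    const float GAIN = 0.05;       // proportional gain, tuned by hand

    Servo eyebrow;                 // one servo standing in for an eyebrow flap
    float angle = 90.0;            // start with the flap half-open

    void setup() {
      eyebrow.attach(9);           // servo signal on pin 9 (assumed)
      eyebrow.write((int)angle);
    }

    void loop() {
      int level = analogRead(LIGHT_PIN);     // 0 = dark, 1023 = bright
      int error = level - TARGET_LEVEL;      // positive when there is too much light
      angle += GAIN * error;                 // tilt the flap to shade the sensor when it is too bright
      angle = constrain(angle, 0.0, 180.0);  // stay within the servo's range
      eyebrow.write((int)angle);
      delay(20);                             // roughly 50 updates per second
    }
    ```

    Move a lamp around and the flap tracks it convincingly, yet all that is happening is a subtraction and a multiplication fifty times a second.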

  12. =8)-DX says

    Yeah, I immediately thought of the motion-tracking machine guns they were thinking of putting on ’em. Mass slaughter machines.

  13. Numenaster says

    Yep, the Furby moved its eyelids in reaction to light too, and it added a lot to the lifelike sense.

    I was just watching Avengers: Age of Ultron recently, and I thought the Ultron character showed reactions to environment and other characters even more than the human characters did. It seemed like Ultron walked with a lot of weight shifting too, to the point where its hips swing nearly as far as mine do. I wondered if the animators were deliberately making Ultron look non-robotic by cranking up the incidental motions so the audience would notice them.

  14. says

    Erlend:

    When I first saw the kick-scene my first reaction was “Don’t do that!”. I found it mean-spirited, just as if he had kicked a dog. And I don’t think I’m alone in that.

    True, unfortunately you are not alone in that.

    I also have a bit of a technical mind. So when I saw them kicking the robot my first thought was “oh, they are testing/demonstrating the machine’s ability to recover from a sudden disturbance in balance”. I believe it is because I choose to have no illusions about what we are seeing. It is not an animal, it is an apparatus composed of sensors and actuators controlled by computers and software. I sometimes tinker with Arduino-based toys myself, so I think I have an idea (see the little sketch at the end of this comment). Not that the things I make are anywhere near this complex, but the general principles are there.

    I agree that technology in itself is neutral with regard to its application. And perhaps a robot “dog” like the one in that video you posted will one day help me carry my groceries, who knows. That would be neat. But the firm that wishes us “Happy Holidays” with a video of their robots dressed as reindeer is certainly not a neutral player. It is wholly funded by the military to come up with technology like this for one ultimate purpose only: to perpetuate the Empire through the application of deadly violence around the world.

    Again: fuck them. We need to see through their shit, not fawn over it.
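    To make the technical point above concrete, this is roughly the sensor-controller-actuator loop I am talking about, at Arduino-toy scale. It is only an illustrative sketch and certainly not Boston Dynamics’ code; the tilt sensor, the servo, the pin numbers and the gain are all invented. A kick shows up as nothing more than a sudden jump in the tilt reading, which the loop then works to cancel.

    ```cpp
    #include <Servo.h>

    const int TILT_PIN = A1;        // one axis of a tilt/accelerometer sensor (assumed)
    const int LEVEL_READING = 512;  // sensor reading when the toy stands upright
    const float KP = 0.1;           // proportional gain, tuned by trial and error

    Servo balanceArm;               // a small counterweight arm as the actuator

    void setup() {
      balanceArm.attach(10);        // servo signal on pin 10 (assumed wiring)
      balanceArm.write(90);         // arm centred
    }

    void loop() {
      // Sensor: how far are we leaning? Positive one way, negative the other.
      int tilt = analogRead(TILT_PIN) - LEVEL_READING;

      // Controller: a single proportional term. A kick is just a large, sudden tilt.
      int correction = 90 - (int)(KP * tilt);

      // Actuator: swing the counterweight against the lean.
      balanceArm.write(constrain(correction, 0, 180));
      delay(10);
    }
    ```

    There is no mystery in it and no cruelty in disturbing it; kicking the thing simply exercises the loop.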

  15. says

    Olav #17: ” it is an apparatus comprised of sensors and actuators controlled by computers and software”

    And what are we? Are we really that different? Consider a robot with self-diagnostic abilities, does it feel pain? Prick it, does it bleed?
    Or what about an army dog? Bred, raised and trained to kill, does it not feel pain? Would you have any empathy for it if it got injured?

  16. blf says

    People and other animals are not mechanical monstrosities. People and animals bleed, feel pain, and eat each other. Silicon and metal robots do not.

  17. Numenaster says

    Erlend #19, I think you’d have to show your work to claim that a self-diagnostic robot “feels pain.” It can detect damage, but there is no emotional component of distress; it’s more like me detecting a broken fingernail than a broken finger.

    Why do you think it even makes sense to compare an army dog to a robot? The gap between mechanical “life” and organic life is much vaster than between the dog and us.

  18. says

    I chose an army dog to address the weapons aspect; I have no problems categorizing it as a weapon, but I would still feel for it if it got hurt.

    As for the injury-vs.-pain debate, what is the difference? Nerves or sensors register damage, the brain or CPU interprets them. Perhaps today’s robots are like earthworms, detecting injury and responding according to instincts or program without “feeling” anything (a toy example of this follows below). But with the growth in computing power and things like “deep learning”, how long will it take before they understand what injury actually means? I think these are profound questions that will have to be answered sooner or later.
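    As a crude picture of that earthworm level, here is what “detecting injury and responding by program” can look like in practice, in the same Arduino style as the sketches above. It is only an illustrative sketch; the current-sense pin, the threshold and the shutdown response are invented for the example.

    ```cpp
    const int CURRENT_PIN = A2;       // motor current sense input (assumed wiring)
    const int STALL_THRESHOLD = 800;  // reading taken to mean a jammed or damaged joint
    const int MOTOR_ENABLE_PIN = 7;   // digital output that powers the motor driver

    void setup() {
      pinMode(MOTOR_ENABLE_PIN, OUTPUT);
      digitalWrite(MOTOR_ENABLE_PIN, HIGH);   // motor running
    }

    void loop() {
      // Self-diagnosis: an out-of-range reading counts as "injury".
      if (analogRead(CURRENT_PIN) > STALL_THRESHOLD) {
        // Response: cut power to protect the hardware. That is the entire
        // reaction; no state anywhere represents distress about it.
        digitalWrite(MOTOR_ENABLE_PIN, LOW);
      }
      delay(100);
    }
    ```

    Whether piling deep learning on top of loops like this ever amounts to understanding injury is exactly the open question.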

  19. says

    Erlend, it looks like you are still evading the issue. Which is, to make it short and clear, the fact that Boston Dynamics is dressing up their product as something innocuous in order to find favour with the uncritical public. Care to answer on that point?

    Further:

    And what are we? Are we really that different? Consider a robot with self-diagnostic abilities, does it feel pain? Prick it, does it bleed?

    Quoting The Merchant of Venice, nice. But no, unless it also has a biological cardiovascular system, it obviously doesn’t bleed. And we are also still many years away from developing AI with the sort of awareness you are alluding to.

    So when you see a robot in the present day you need to realise it really is nothing more than a machine. See it for what it is. If it is a thing with facial “expressions” then it is a system cleverly designed to manipulate you emotionally. You should not attribute feelings to it which it does not possess.

    Or what about an army dog? Bred, raised and trained to kill, does it not feel pain? Would you have any empathy for it if it got injured?

    Of course I would. I grew up with dogs so I find it natural and easy to empathise with them. Doesn’t mean I would not try to kill one if it tried to kill me though.

  20. Numenaster says

    The difference is emotion. The empathy that we feel is an emotion based on the perception that the injured thing in front of us also has emotions and feels the way we might in the same situation.

  21. says

    I wrote:

    If it is a thing with facial “expressions” then it is a system cleverly designed to manipulate you emotionally.

    Meant to add: Just like TV commercials and politicians’ speeches. They are to be regarded with suspicion.

  22. says

    Olav: When it comes to weapons I’m pragmatic. I own guns for sporting purposes, so they don’t bother me as much as they bother most people, I guess. And war seems to be human nature; in fact many of us live in freedom thanks to people willing (or forced) to risk their lives to fight for it. And I believe it is worth it. If there is anything worth killing or dying for, it’s freedom. Until the time comes when we are able to eliminate aggression and war we simply have no choice but to arm ourselves. Live for peace but plan for war.
    So while I do get your reservations (I do, I don’t like the idea of machine-gun-wielding robotic hunter-dogs any more than you do) I think it’s a minor quibble compared to the real ramifications of the Robotic Revolution.

    Perhaps it comes down to the fact that I’ve seen roe deer walking on ice. I almost laughed my balls off (Disney was closer to the truth than you’d think) while at the same time feeling really bad for the animals. And seeing that robot struggle to get a firm footing reminded me of that scene. But I still maintain that the most versatile robots will mimic animals either by chance or design, and that our ability to “zootropomorphize” this behavior will fundamentally change how we perceive robots in the years to come. And good or bad, this is only the beginning. Wailing about the creator is pointless; the exact same ramifications would be there if it were the Red Cross developing it for search and rescue. We are on the verge of creating artificial life, not just artificial intelligence but autonomous beings.

    I know we’ve all seen it before, the promise of robotic helpers making our lives simpler. But this time I think we’re on the right track. The available computing speed, power sources and basic tech are all there (more or less), and current self-learning algorithms show great potential.

  23. says

    Olav #25: “If it is a thing with facial “expressions” then it is a system cleverly designed to manipulate you emotionally”
    That’s the point: it wasn’t, really. Sure, they might have chosen strategic dimensions on everything, tailor-made to mimic a face. But that just proves the point: it’s doable. Simple algorithms and primitive physical responses can mimic “living behavior”. And we in turn will respond to that; we’re hard-wired for it.

  24. Hj Hornbeck says

    Numenaster @16:

    I was just watching Avengers: Age of Ultron recently, and I thought the Ultron character showed reactions to environment and other characters even more than the human characters did. It seemed like Ultron walked with a lot of weight shifting too, to the point where its hips swing nearly as far as mine do. I wondered if the animators were deliberately making Ultron look non-robotic by cranking up the incidental motions so the audience would notice them.

    Ultron was done via motion capture, which could explain a few things.

  25. redwood says

    All I could think while watching Erlend Meyer’s videos was that if those things were released in a forest during deer hunting season, they would be shot full of holes before you know it. On the other hand, if they are military prototypes, maybe they could shoot back. That would certainly make deer hunting much more entertaining for us non-participants.

  26. unclefrogy says

    If you take a look at history just a little, it is very easy to find things that were developed for the military and used early by the military; the list is not short. So I do not have a particular problem with thinking about the progress of robotics shown here. It is troubling, though, that there is this development of people-killing robots, including the semi-autonomous drones, to deal with.
    uncle frogy

  27. ck, the Irate Lump says

    Hj Hornbeck wrote:

    Ultron was done via motion capture, which could explain a few things.

    And for once, it wasn’t Andy Serkis in the motion capture suit.

  28. laurentweppe says

    Show this to your kids so they’ll have nightmares on Christmas eve

    If you had shown me this when I was a kid, I would have spent years harassing my parents for a robot reindeer

  29. Blattafrax says

    #10 I once sat on a tram behind a guy carrying what looked like a perfectly functional bright pink automatic rifle. I can confirm it wasn’t cute at all. A bit weird and sinister actually. Although granted, I didn’t ask if he had any Hello Kitty stickers on it.

  30. says

    I’m not really worried about armed autonomous weapons, not for the foreseeable future. We simply won’t trust machines with such tasks until they are vastly more intelligent.

  31. ck, the Irate Lump says

    They don’t need to be autonomous to be used as killing machines. This technology could be easily used as a platform for a remote control terrestrial drone if it handles certain terrain better than wheeled devices.

  32. A Masked Avenger says

    We simply won’t trust machines with such tasks until they are vastly more intelligent.

    You’re assuming we give a shit about collateral damage. As long as the IFF is working, there’s no reason they won’t deploy them in “the sandbox.” The worst that can happen is what, a wedding party being killed? What would be new about that, and what’s the evidence we care?

  33. Usernames! (╯°□°)╯︵ ʎuʎbosıɯ says

    Could you please try to contain your enthusiasm for that shit. Thank you.
    — Olav (#8)

    So, you’re saying everyone should be just as scared of technology as you? Luddite much?

    I assume you’ve given up all use of safety pins, aluminum foil, teflon pans, satellites (lifted into orbit by rockets), fireworks (shooting and enjoying), drums and electricity generated via nuclear power. Because all of those were designed for warfare.

  34. Numenaster says

    HJ Hornbeck, thanks for the link to the motion capture video. James Spader was indeed very loose and fluid in his physical acting style. I think the facial expressions were added by the animation team, but they mostly stay in sync with his very responsive head and upper body movements (when someone else is talking). I’m revising my hypothesis to “They cast someone who could put a lot of life into the performance.”

  35. slithey tove (twas brillig (stevem)) says

    Ayn Rand would cancel Xmas, but retain all the commercialism as ‘Man’s Gift to Himself’.
    One of the disgusting cards sums her up exactly:

    they point in but
    one direction.
    They point to me

  36. says

    Usernames:

    So, you’re saying everyone should be just as scared of technology as you? Luddite much?

    No, I did not say that. Straw man much? And by the way, have fun cheerleading for Boston Dynamics. They will be quite content that their little PR video actually worked.

    In case I need to spell it out for you: no, I am not scared of technology, I work in a technical field myself. However, I am quite scared of people who would dress up a military weapons system as something innocent, just to get us used to the idea. I bet when this thing goes into operation people will still drool over it: “ooh looks so cool” and “don’t kick it, that’s mean”.

    Fuck that.

    Frankly, I am scared of all you cheerleaders as well. You are the people allowing this to happen. By “this” I mean not the technology itself, because that is only an inevitable development, but the whitewashing of its sinister purposes.

    See? You can call me a luddite but technology is really what I am the least scared of here.

    I am scared that as a society we are just going to go along with this shit unthinkingly. Plus what the Masked Avenger said in #38. Thanks, Masked.

  37. blf says

    Cancel xianrobotmassacre? Ok, with an added bonus of no new orbit, Tajikistan bans Christmas and new year celebrations:

    Trees, gifts, fireworks and charity outlawed in schools and universities as government tightens restrictions

    Tajikistan has tightened restrictions on festive season celebrations, banning Christmas trees and gift-giving in schools.

    This year’s measures are the toughest yet implemented by the country, which has been toning down Christmas and new year celebrations for some time — banning Father Frost, Russia’s version of Santa Claus, from television screens in 2013.

    A decree by the education ministry prohibits “the use of fireworks, festive meals, gift-giving and raising money” over new year as well as “the installation of a Christmas tree […]” in schools and universities.

    […]

    Other holidays perceived as alien to Tajikistan’s culture have come under pressure in recent years. In 2013 and 2014, fancy dress zombies and vampires were reportedly detained by police as the government opposed any Halloween celebrations.

    The country also applies strict regulations to occasions such as funerals and weddings and fined one man around $600 for marking his birthday with friends in an Irish-themed pub in Dushanbe earlier this year.

    Whilst machine-gun- and rocket-firing robodeer are not specifically mentioned, they quite probably are also not considered traditional and are banned. Very sensible.

  38. says

    I am a pacifist and I do not wish in the slightest to whitewash the dark side of this technology, but I think the potential for good is enormous. Whilst we might be really far from developing human-like AI, a good animal-pet-like AI is maybe already possible. Even the pet in the old computer game Black & White (2001) showed great potential, and computing power has risen significantly since then, as has, no doubt, humanity’s programming capability.

    I think there would be a good use for electronic pets for elderly lonely people, handicapped people, etc. These could provide passable enough companionship to relieve the stress on nursing personnel. And unlike real pets (which also have therapeutic uses), they could monitor some basic health functions and, if needed, call an ambulance.

    Electronic animals could also help in rescue missions, firefighting, disaster relief… I am sure there are hundreds if not thousands of possible positive civilian applications.

    It is a rather sad fact of history that way too often things are developed at first to kill people more effectively and only as an afterthought to help them.

  39. says

    @ Avenger #38:
    You raise a valid point, I sometimes forget how callous we humans can be. I could see this used as a drone, and they are far easier to “sell” since there is a human operator. And I have issues with them too; in many ways the way war has been “sanitized” is even more insidious than the war itself. Terms like “surgical strikes” and “smart weapons” make it sound clean when the truth is quite the opposite. War always puts people in harm’s way; that’s what war is. I don’t think a world without war is something I or anybody else living today will see, but we could stop kidding ourselves about it.

  40. says

    Erlend:

    @ Avenger #38:
    You raise a valid point, I sometimes forget how callous we humans can be.

    I am honestly amazed that it is even possible for someone to forget about this. I wish I could sometimes. How does it work? I suppose you do still read the news?

    I don’t think a world without war is something I or anybody else living today will see, but we could stop kidding ourselves about it.

    Thank you, that was (more or less) my point from the beginning of this discussion. We need to stop fooling ourselves, and stop being fooled by those who would sell us their war machines as if they were somehow harmless toys. Besides sinister, “insidious” is indeed another word to describe it.

    Stay awake, people. And protest. Merry Yuletide.

  41. says

    It’s not that I disagree with you in principle, but as a nerd I cannot help being fascinated with them. It’s just how I tick. I see so many potential uses for them, both good and bad. And while I might be contradicting experience, I do believe the good outweighs the bad. Not because technology is benign by nature but because we as a society choose how to use it.

    What I find most fascinating about this is the big picture: how we will deal with the emergence of artificial life. Consider the current focus on mobile technology; everything we put into everything from electric cars to hand-held computers will inevitably leak into robotics research, so this might not be as far off as many think. Better to start thinking about it now and be ahead of the curve.

  42. Athywren - Frustration Familiarity Panda says

    Is it just me, or is the idea that the formative memories of our future robot overlords are going to be of humans kicking them a worrying thing? I mean, sure, the kicking serves to support and demonstrate the creation of a stable robotic platform, but are they going to recognise that when they attain sentience, or are they just going to remember, “you kicked me, bastard”? I just hope they don’t blame us all for the actions of a few, but I fear that the results of human engineering will bear the legacy of human thought and do exactly that.