Zigzaggery


Curtin RoboFair

Last Sunday (24 November) was Curtin University’s RoboFair, an annual event where anyone who is interested can visit and find out more about robots and robotic engineering. I was invited to be a part of the event this year as a representative of the Centre for Culture and Technology (CCAT) at Curtin. There were lots of stands with interesting robots and a whole heap of interactive displays for children and adults to enjoy. In some ways the most exciting thing for me was my poster (sad, I know)! This is the first time I’ve ever had a poster describing the types of humanities perspectives that I work with in relation to robots and communication:

CCAT RoboFair Poster

I also decided to try to run a survey during the event, but this didn’t quite work out as planned… i.e. I didn’t really get anyone to complete the survey (except for about four people for whom I entered the answers myself). Note to self: don’t expect people wandering round an open-day-style event to scan QR codes or go to (even shortened) web addresses unless there’s a prize on offer!

However, in spite of this failure I did get to talk to a lot of great people, and certainly got the overall sense that many are interested in the relationship between technology and human culture/society, to the extent that they would like to attend seminars/workshops to think about and discuss the topic. I’m pretty sure that no-one I spoke to had ever heard the word MOOC though.

I did have a very brief wander around the rest of RoboFair, and I met a very interesting artist, Nathan Thompson, with a “robo-guitar briefcase” (yes, that’s how he describes it). I’d link to his website, but I think it’s having a few technical problems at the moment; however, assuming I catch up with Nathan and his robo-guitar again, I’ll write a post about this analogue-based device, which has recently toured Japan and which I’m hoping will perform in Perth sometime soon.

Sensing at cross-purposes

All this talk about perspectives, windows, maps and travelers etc. and no mention of robots… well, I’d better do something about that!

Alan, Brad, Clara and Daphne are “cybernetic machines” designed and built by the artist Jessica Field. They are all linked together to form an art installation, a system that is able to perceive human visitors. I saw these ‘robots’ when I was in Canada in 2007, at the Musée des beaux-arts de Montréal as part of the Communicating Vessels: New Technologies and Contemporary Art exhibition.

Semiotic Investigation into Cybernetic Behaviour from Jessica Field on Vimeo.

Alan, Brad, Clara and Daphne can’t move around, so they can’t really be thought of as travelers, but the ‘conversation’ between Alan and Clara offers an extreme illustration of interaction between beings that perceive the world from incommensurable perspectives. Alan is able to sense the motion of visitors over time, whereas Clara senses their distance from her in space. Alan and Clara’s perceptions of their environment are communicated to human visitors by the other two robots/computers in the system. Brad produces noises indicating particular aspects of their emotional state or “mood”, while Daphne translates their interactions into a conversational exchange in English. Although Alan and Clara aren’t really communicating with each other directly, their potential interaction is played out for visitors to the installation. As you move around the room you begin to ‘experiment’ with the robots (at least that’s what I did) in order to try to work out what their conversation means, what they can and cannot ‘see’.
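As an aside for anyone who likes to tinker, here is a tiny toy simulation of the kind of mismatch I’m describing. It is entirely my own sketch (the ‘visitor’, the thresholds and the dialogue lines are all invented, and it has nothing to do with Field’s actual system), but it shows how a motion-sensing agent and a proximity-sensing agent only occasionally agree that something is there:

```python
import random

# A toy sketch of two agents that sense the same space in incommensurable ways.
# Entirely invented for illustration; not Jessica Field's actual system.

def alan_senses(positions):
    """Alan notices motion: a change in position between successive moments."""
    return [abs(b - a) > 0.5 for a, b in zip(positions, positions[1:])]

def clara_senses(positions):
    """Clara notices proximity: whether something is close to her in space."""
    return [p < 2.0 for p in positions[1:]]

# A visitor wandering around the installation (distance from Clara, in metres).
visitor = [random.uniform(0.5, 5.0) for _ in range(10)]

for t, (a, c) in enumerate(zip(alan_senses(visitor), clara_senses(visitor))):
    if a and c:
        print(t, "WOW! YOU SAW IT TOO")   # a rare moment of agreement
    elif a:
        print(t, "Alan: something moved!   Clara: I see nothing.")
    elif c:
        print(t, "Clara: something is near! Alan: I see nothing.")
    else:
        print(t, "... silence ...")
```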

Alan and Clara’s conversation highlights the difficulty involved in discussing the world with an other that senses its environment in an entirely different way from you. They see the world through different windows, and most of the time they are unable to agree on what is happening. Occasionally, Alan and Clara both ‘catch sight’ of a visitor at almost the same moment, “WOW! YOU SAW IT TOO”, and they are able to agree that something is there, but for much of the time the conversation is one of confusion over what, if anything, is out there in the installation space.

The difficulty in their interaction unfolds in part because of the extreme difference in their perceptions, but also because Alan and Clara are unable to develop any strong sense of trust in each other or respect for the other’s judgement. This means that, while they appear to find their disagreements over what is in the room unsettling, they don’t take any steps to work together in developing a shared sense of what is happening around them. Of course, the installation is deliberately designed not to explore this idea, but rather to focus on the incommensurable nature of Alan and Clara’s ideas about the world. It offers a great illustration to help explain why I’m particularly interested in how trust and respect can develop between disparate team members who sense the world in different ways. Attaining a level of trust and respect is key in effective human-dog teams, for example, and I think it could also be vital in human-robot teams.

Questions about Baxter

Baxter is a new workplace robot developed by Rethink Robotics, a company run by Rodney Brooks (a very well-known robot designer, who also co-founded iRobot, the company famous for its Roomba robotic vacuum cleaners). Baxter is designed to work next to people in manufacturing environments, being human-like in form and consisting of a torso and two arms, together with a screen “face” (well, one consisting of eyes and eyebrows at least).

Most interesting to me is the way in which people communicate with Baxter, using touch first to get the robot’s attention and then to move the robot’s arms into particular positions. This reminds me of the touch of a yoga teacher, for example, in helping to position people into a particular pose. Baxter also has a graphical interface, displayed on the same screen that more often shows the robot’s eyes, which is controlled with buttons on each arm. In order to “program” Baxter to complete a task, a person can therefore show the robot what to do by moving its arms into position and choosing the appropriate action, e.g. to pick up or place down a part, from the user interface. Importantly, it is the way in which Baxter learns a movement from being placed into position that seems to separate it from various other manufacturing robots currently in production.
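To make this “show, don’t program” idea a little more concrete, here is a rough sketch of how such a record-and-replay workflow might look in code. It is purely my own illustration (the names Step, TaughtTask, move_to and gripper are all invented), and certainly not Rethink Robotics’ actual software interface:

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class Step:
    joint_angles: Tuple[float, ...]   # arm pose captured while the arm is guided by hand
    action: str                       # "pick", "place" or "move"

@dataclass
class TaughtTask:
    steps: List[Step] = field(default_factory=list)

    def record(self, joint_angles, action="move"):
        """Store a pose; imagined as the moment the user releases the arm and picks an action on the screen."""
        self.steps.append(Step(tuple(joint_angles), action))

    def replay(self, move_to, gripper):
        """Run the demonstrated sequence: move to each pose, then perform its action."""
        for step in self.steps:
            move_to(step.joint_angles)
            if step.action == "pick":
                gripper("close")
            elif step.action == "place":
                gripper("open")

# Example with stand-in hardware functions (just prints here):
task = TaughtTask()
task.record([0.1, 0.4, -0.2], action="pick")
task.record([0.6, 0.2, 0.3], action="place")
task.replay(move_to=lambda q: print("moving to", q),
            gripper=lambda cmd: print("gripper", cmd))
```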

As Rodney Brooks explains in the interview with IEEE Spectrum writers Erico Guizzo and Evan Ackerman, “[w]hen you hold the cuff, the robot goes into gravity-compensation, zero-force mode” such that “the arm is essentially floating”. This makes the robot easy to position, and as Mike Bugda notes in the video below, Baxter is therefore understood to be “very compliant to the user”. Although “compliant” is used here in part to emphasise that the robot is flexible and therefore able to deal with “an unstructured environment” (Matthew Williamson, in conversation with Guizzo and Ackerman), there is also a sense in which this robot is being positioned as a servant, or possibly even a slave, by virtue of its immediate compliance to a human’s touch. This design decision is probably a pragmatic way of making the robot easy to program in the workplace, but from my perspective it raises some issues, since this is clearly a robot designed to be read as human-like and, at the same time, as a compliant servant/slave.

The idea of Baxter as human-like is reinforced when Bugda (in the video below) explains that Baxter “exhibits a certain level of interaction, collaborative friendliness that makes it easy to understand and introduce into a manufacturing environment”. For example, when you touch its arm the robot stops what it is doing and looks towards you, acknowledging your presence, and once an instruction is complete the robot signals that it has understood with a nod of its “head”. Guizzo and Ackerman take this idea further when they suggest that while at work Baxter’s “animated face” displays “an expression of quiet concentration”, while a touch on the arm causes Baxter to “stop whatever it’s doing and look at you with the calm, confident eyes”.

Although this video is simply a demonstration, Baxter has obviously had some limited field testing, and Brooks (again in conversation with Guizzo and Ackerman) notes that after people have worked with the robot for a while “something interesting happens … People sort of personify the robot. They say, ‘It’s my buddy!’”. It is at this point that the perception of the robot as a friend is reinforced.

This type of reading of Baxter as a “buddy” or friend might be assumed to be closely linked to the robot’s human-like form. However, my research considering the ALAVs, the Fish-Bird project and also Guy Hoffman’s robotic desk lamp, AUR, along with anecdotal evidence from friends who own Roomba robotic vacuum cleaners, indicates that robots of pretty much any form encourage this kind of personification. In addition, for Baxter, I suspect that the use of touch between human and robot might also serve to support the perception of this robot as a social subject, and eventually a friend. The importance of touch in working with Baxter would seem to set this robot apart from others that I have considered in my research to date. This might also suggest that Baxter could be made more machine-like (and less human-like), a move that would reduce my discomfort at placing humanoid robots as servants/slaves, as I have suggested occurs with this robot worker.

IEEE Spectrum report and video of Baxter, “How Rethink Robotics Built Its New Baxter Robot Worker” (Erico Guizzo, Evan Ackerman, October 2012)

A bear (not a rabbit) called Thumper?

I couldn’t overlook this new robot, erm, pillow-bear.

Robot teddy bear to monitor your sleep

Jukusui-kun robot

While its communication skills might seem limited to pawing at your head, this robot listens for your snores while its cute companion monitors your blood oxygen. Should your sleep become less than silent, or your oxygen levels drop alarmingly, the pillow teddy will wake you with a gentle paw on your head (although sound sleepers might require a more energetic thump, I suppose).
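For what it’s worth, the decision the bear appears to be making could be written in a couple of lines; the thresholds below are my own guesses rather than anything the researchers have published:

```python
# A playful guess at the bear's decision rule; the thresholds are mine, not the researchers'.

SNORE_THRESHOLD_DB = 60     # assumed loudness above which a sound counts as a snore
SPO2_ALARM_PERCENT = 90     # assumed blood-oxygen level below which a nudge is warranted

def bear_should_nudge(sound_level_db: float, spo2_percent: float) -> bool:
    """Paw the sleeper if they are snoring or their blood oxygen drops too low."""
    return sound_level_db > SNORE_THRESHOLD_DB or spo2_percent < SPO2_ALARM_PERCENT

# A loud snore with otherwise normal oxygen still earns a gentle paw on the head:
print(bear_should_nudge(sound_level_db=72, spo2_percent=96))  # True
```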


It’s a little difficult to see from the video just how effective the tickling/thumping action might be, and it’s also a pity that the level of background noise makes it hard to relate the bear’s reaction to the snore of the presumably willing test subject (or more likely the creator of the robot).
