Last Sunday (24 November) was Curtin University’s RoboFair, an annual event where anyone who is interested can visit and find out more about robots and robotic engineering. I was invited to be a part of the event this year as a representative of the Centre for Culture and Technology (CCAT) at Curtin. There were lots of stands with interesting robots, and a whole heap of interactive displays for children and adults to enjoy. In some ways the most exciting thing for me was my poster (sad, I know)! This is the first time I’ve ever had a poster describing the types of humanities perspectives that I work with in relation to robots and communication:
I also decided to try to run a survey during the event, but this didn’t quite work out as planned… i.e. I didn’t really get anyone to complete the survey (except for about four people for whom I entered the answers myself). Note to self: don’t expect people wandering round an open-day style event to scan QR codes or go to (even shortened) web addresses unless there’s a prize on offer!
However, in spite of this failure I did get to talk to a lot of great people, and certainly got the overall sense that many are interested in the relationship between technology and human culture/society, to the extent that they would like to attend seminars/workshops to think about and discuss the topic. I’m pretty sure that no-one I spoke to had ever heard the word MOOC though.
I did have a very brief wander around the rest of RoboFair, and I met a very interesting artist, Nathan Thompson, with a “robo-guitar briefcase” (yes, that’s how he describes it). I’d link to his website, but I think it’s having a few technical problems at the moment; however, assuming I catch up with Nathan and his robo-guitar again I’ll write a post about this analogue-based device, which has toured Japan recently and I’m hoping will perform in Perth sometime soon.
Update: Following Slideshare’s removal of the slidecast functionality this presentation was moved to Penxy. The slides and audio are linked here.
My talk, “Send in the Robots”, for the Adventures in Culture & Technology seminar series arranged by the Centre for Culture & Technology at Curtin was yesterday, and I have just uploaded the slides and audio to SlideShare:
Tomorrow (11 June, 2013) I am giving a seminar at the Bristol Robotics Laboratory (BRL). It is five years since I last visited the lab, and I’m looking forward to seeing how the projects I saw back then have developed, as well as getting the opportunity to see new projects that have started more recently.
This is the title and overview of what I have planned for the seminar:
“Tempered” Anthropomorphism and/or Zoomorphism in Human-Robot Interactions
In this talk I consider how a level of “tempered” anthropomorphism and/or zoomorphism can facilitate perceptions of, and interactions between, overtly different communicators such as humans and non-humanoid robots.
My argument interrogates the tendency within social robotics simply to accept the ascription of human characteristics to machines as important in the facilitation of meaningful human-robot interactions. Many scientists and other academics might argue that this decision is flawed in a similar way to scholarship that attributes human characteristics to animals. In contrast, my analysis suggests that it is possible to adopt a “tempered” approach, in particular when the robot other is overtly non-humanoid. I suggest that a level of projection is unavoidable, and is quite possibly the only way to attempt to understand autonomous or semi-autonomous robots. However, being constantly reminded of the “otherness” of the machine is also vital, and is of practical value in creating effective multi-skilled teams consisting of humans and robots.
I will try to alter my talk as required in response to my audience’s reactions, since it can be quite challenging to present humanities-type research to a predominantly technical audience. My aim is to emphasise the practical use of reconsidering human-robot interactions in this way.
Baxter is a new workplace robot developed by a company called Rethink Robotics, run by Rodney Brooks (a very well-known robot designer, who also started the company iRobot, famous for developing Roomba robotic vacuum cleaners). Baxter is designed to work next to people in manufacturing environments, being human-like in form and consisting of a torso and two arms, together with a screen “face” (well, one consisting of eyes and eyebrows at least).
Most interesting to me is the way in which people communicate with Baxter, using touch first to get the robot’s attention and then to move the robot’s arms into particular positions. This reminds me of the touch of a yoga teacher, for example, in helping to position people into a particular pose. Baxter also has a graphical interface, displayed on the same screen that more often shows the robot’s eyes, which is controlled with buttons on each arm. In order to “program” Baxter to complete a task a person can therefore show the robot what to do by moving its arms into position and choosing the appropriate action, e.g. to pick up or place down a part, from the user interface. Importantly, it is the way in which Baxter learns a movement from being placed into position that seems to separate it from various other manufacturing robots currently in production.
As Rodney Brooks explains in the interview with IEEE Spectrum writers Erico Guizzo and Evan Ackerman, “[w]hen you hold the cuff, the robot goes into gravity-compensation, zero-force mode” such that “the arm is essentially floating”. This makes the robot easy to position, and as Mike Bugda notes in the video below, Baxter is therefore understood to be “very compliant to the user”. Although “compliant” is used here in part to emphasise that the robot is flexible and therefore able to deal with “an unstructured environment” (Matthew Williamson, in conversation with Guizzo and Ackerman), there is also a sense in which this robot is being placed as a servant, or possibly even a slave, by virtue of its immediate compliance to a human’s touch. This design decision in itself is probably a pragmatic response to making the robot easy to program in the workplace, but from my perspective it raises some issues since this is also clearly a robot designed to be read as human-like, but also as a compliant servant/slave.
The idea of Baxter as human-like is reinforced when Bugda (in the video below) explains that Baxter “exhibits a certain level of interaction, collaborative friendliness that makes it easy to understand and introduce into a manufacturing environment”. For example, when you touch its arm the robot stops what it is doing and looks towards you, acknowledging your presence, and once its instruction is complete the robot acknowledges that it understands with a nod of its “head”. Guizzo and Ackerman take this idea further, when they suggest that while at work Baxter’s “animated face” displays “an expression of quiet concentration”, while a touch on the arm causes Baxter to “stop whatever it’s doing and look at you with the calm, confident eyes”.
Although this video is simply a demonstration, Baxter has obviously had some limited field testing, and Brooks (again in conversations with Guizzo and Ackerman) notes that after people have worked with the robot for a while “something interesting happens … People sort of personify the robot. They say, ‘It’s my buddy!’”. It is at this point that the perception of the robot as a friend is reinforced.
This type of reading of Baxter as a “buddy” or friend might be assumed to be closely linked to the robot’s human-like form. However, my research considering the ALAVs, the Fish-Bird project and also Guy Hoffman’s robotic desk lamp, AUR, along with anecdotal evidence from friends who are owners of Roomba robotic vacuum cleaners, indicates that robots of pretty much any form encourage this kind of personification. In addition, for Baxter, I suspect that the use of touch between human and robot might also serve to support the perception of this robot as a social subject, and eventually a friend. The importance of touch in working with Baxter would seem to set this robot apart from others that I have considered in my research to date. This might also suggest that Baxter could be made more machine-like (and less human-like) in a move that would reduce my discomfort in placing humanoid robots as servants/slaves, as I have suggested occurs with this robot worker.
In this blog's profile image I appear with Keepon, Sparki and Chris Lynas' depiction of a drone from Iain M. Banks' Culture.
The photographs of Keepon and Sparki are my own and the drone image appears courtesy of Chris Lynas.