https://www.npr.org/player/embed/536043276/536505014
A friend of mine suggested this podcast to me during summer break. I’m not much of a podcast gal, but some of the topics are right up my alley, especially this one: “Can Robots Teach Us What It Means to Be Human?”
“Robots bent on our destruction remain the stuff of movies like “Terminator.” And robot sentience is still an idea that’s far off in the future. But there’s a lot we’re learning about smart machines, and there’s a lot that smart machines are teaching us about how we connect with the world around us and with each other. This week on HIDDEN BRAIN, can robots teach us what it means to be human?”
This reminds me again of the goal of my project, and of my current art practice in general. It’s not about creating the perfect robot like the ones in sci-fi and cartoons. It’s about making people aware of the everyday interactions we already have with pre-existing robots, the ones we often don’t even think of as robots, because media and pop culture have heavily shifted society’s definition of what a robot is.
This episode gave me such big inspiration to keep pursuing this topic. The research done by Kate Darling shows that we do project emotions and souls onto toy robots, in this case one that behaves pretty much like a pet dinosaur. Being aware of and open to interactions between humans and robots creates an emotion that we project onto the robot, making it seem like it comes from within.
In Darling’s research, the people who were willing to hurt the robots also tended to show less empathy toward other human beings. I’m guessing this has something to do with the level of empathy each person has. Using a robot as an empathy measurement for humans? Sounds legit to me. Now, the robot used in Darling’s study is designed to look friendly; it’s basically an expensive dinosaur pet robot toy called PLEO.

With such a cute-looking robot, of course people would be more open and willing to interact with it. I wonder if that would still be the case if the robots presented were just an assembly of mechanical arms. Would people still project the same emotions and souls onto them? Would they still feel empathy toward them?
The participants spent about 20 minutes with PLEO and were then asked if one of them could take a hammer and break the little friend they had just made. None of them could do it; they were too attached. Kate Darling thought the price and appearance of the robot might be a huge factor in that hesitation.
Then Darling did a follow-up study using HEXBUG toys instead of the expensive, adorable dinosaur, but the result stayed the same: most participants did not want to, or were hesitant to, smash the bug. Whether or not a participant would smash the bug says a lot about the participant’s own personality, that is, how they treat other people and how much empathy they feel for those around them.

“So the follow-up study that we did, not with the dinosaurs, we did with HEXBUGs, which are a very simple toy that moves around like an insect. And there, we were looking at people’s hesitation to hit the HEXBUG and whether they would hesitate more if we gave it a name and whether they would hesitate more if they had natural tendencies for empathy, for empathic concern. And, you know, we found that people with low empathic concern for other people, they didn’t much care about the HEXBUG and would hit it much more quickly. And people with high empathic concern would hesitate more. And some even refused to hit the HEXBUGs. ….Yeah. I think there’s a lot of projection happening there. I also think that before we get to the question of robot rights and consciousness, you know, we have to ask ourselves, how do robots fit into our lives when we perceive them as conscious? Because I think that’s when it starts to get morally messy and not when they actually inherently have some sort of consciousness.”
– Kate Darling
So, robots are warm only if we’re warm.