Tweenbot: The lost robot

Tweenbot is a project by Kacie Kinzer. The robot is made of cardboard, ten inches tall, always smiling, and only moves in one direction. But it needs to go somewhere: it carries a flag that says "help me get to MoMA!"

Because it only moves in one direction, it got caught on obstacles and headed off in dangerous directions. But every time this happened, there was always someone who saved the Tweenbot from destruction, and it eventually ended up at its destination. It is now part of MoMA's permanent collection, and the project is still ongoing.

So how did a 10-inch cardboard robot with no sense of direction get to where it needed to go? With help from the people around it. But why would anyone help a piece of rolling cardboard? It's just rolling cardboard, right? Well, if you read my post about the NPR podcast Hidden Brain, you'd get an idea why. People around the Tweenbot developed sympathy for it within the short time they ran into it. How could it gain people's sympathy so fast? Well, first, from my own (admittedly biased) observation, I think it's because it's cute. Just look at it, it's smiling!!


it looks so happy. adorable. 10/10.

Second, it is made of cardboard, a familiar recyclable material; there is nothing eccentric in the design of the robot. Third, it is helpless and very prone to danger, just like a helpless kitten on the street. Only this one really can't do anything other than roll forward, which makes it even more vulnerable.

My conclusion is that people develop sympathy relatively fast when the subject of sympathy is familiar, harmless, vulnerable, and has no potential to disadvantage the sympathizer.



NPR Hidden Brain: Can Robots Teach Us What It Means to Be Human?

A friend of mine suggested this podcast to me during summer break. I'm not much of a podcast gal, but some of the topics are right up my alley, especially this one: "Can Robots Teach Us What It Means to Be Human?"

Robots bent on our destruction remain the stuff of movies like “Terminator.” And robot sentience is still an idea that’s far off in the future. But there’s a lot we’re learning about smart machines, and there’s a lot that smart machines are teaching us about how we connect with the world around us and with each other. This week on HIDDEN BRAIN, can robots teach us what it means to be human?

This reminds me again of the goal of my project, and of my current art practice. It's not about creating the perfect robot like those in sci-fi and cartoons. It's about making people aware of the everyday interactions we already have with pre-existing robots that we sometimes don't think of as robots, because society, heavily influenced by media and pop culture, has shifted people's definition of a robot.

This episode gave me great inspiration to continue pursuing this topic. The research done by Professor Kate Darling shows that we do project emotions and souls onto toy robots, in this case one that behaves much like a pet dinosaur. Being aware of, and open to, interactions between humans and robots creates a false emotion that we project onto robots, making it seem as if it comes from within them.

In Darling's research, people who were likely to hurt the robots were also more likely to hurt another human being. I'm guessing this has something to do with each person's level of empathy. Using robots as an empathy measurement for humans? Sounds legit to me. Now, the robot used in Darling's study was designed to be friendly-looking: it's basically an expensive pet dinosaur robot toy called PLEO.


With this cute-looking robot, of course people would be more open and willing to interact with it. I wonder if that would be the case if the robots presented were just assemblies of mechanical arms. Would they still project the same emotions and souls onto them? Would they still feel empathy toward them?

The participants were exposed to the PLEO for about 20 minutes, then asked if one of them could take a hammer and break the little friend they had just made. None of them could do it; they were too attached. Kate Darling thought the price and appearance of the robot might be a huge factor in this.

Then Darling did a follow-up study using HEXBUG toys instead of the expensive, adorable dinosaur, but the result stayed the same: most participants did not want to, or were hesitant to, smash the bug. Whether or not a participant wanted to smash the bug says a lot about the participant themselves: how they treat other people and how much empathy they feel toward those around them.


“So the follow-up study that we did, not with the dinosaurs, we did with HEXBUGs, which are a very simple toy that moves around like an insect. And there, we were looking at people’s hesitation to hit the HEXBUG and whether they would hesitate more if we gave it a name and whether they would hesitate more if they had natural tendencies for empathy, for empathic concern. And, you know, we found that people with low empathic concern for other people, they didn’t much care about the HEXBUG and would hit it much more quickly. And people with high empathic concern would hesitate more. And some even refused to hit the HEXBUGs. ….Yeah. I think there’s a lot of projection happening there. I also think that before we get to the question of robot rights and consciousness, you know, we have to ask ourselves, how do robots fit into our lives when we perceive them as conscious? Because I think that’s when it starts to get morally messy and not when they actually inherently have some sort of consciousness.”

– Kate Darling

So, robots are warm only if we’re warm

Where My Mind Is Currently At: Warm Machines

Recently I have been thinking a lot about robots (as I have stated on my about page) and how society perceives them. Now let's take a step back and process what 'robot' means.


(source: dictionary definition of 'robot', via the Google search engine)

The way the first and second sub-definitions use one another to measure how human-like or robot-like each is was one of the first reasons this topic piqued my interest and why I chose to pursue it. It says that 1) a robot is a machine that resembles a human being and is able to replicate certain human movements, and 2) the word is also used to refer to a person who behaves in a mechanical or unemotional manner. But with modern technologies and inventions, human beings have come so far in making robots function as humanly as possible, and I don't think people are going to stop anytime soon. So with so many advanced robots running today, where do we draw the line between 'robot-like' and 'human-like'?


(source: Google, Gizmo, Amazon, Ubisoft, SONY, Disney, Adafruit)

Programmers, artists, scientists, and hobbyists have become more and more interested in making robots that move and function just like those we see in science fiction. Growing up in Tokyo, Japan during the second half of the '90s made me really accustomed to the friendly-robot trend that was being popularized mainly through media and commercial products at the time. However, when I went back to my hometown, Jakarta, Indonesia, I realized that this wasn't the case for most people there. I would say that in the early 2000s, technology and globalization were moving pretty slowly in Indonesia, with the internet still being new, so most people were more often exposed to traditional processes. There were fewer robot toys in the toy stores. Computerized toys and video games such as the Tamagotchi and the SEGA Saturn were less favored by parents, and took a couple of years to become as popular in Indonesia as they had been when first released in Japan.


(source: Mushi Production, Shin-Ei Animation, Bee-Train, Studio Bogey)

Of course I wasn't aware of this cultural resistance as a kid, but looking back now, there definitely was a form of rejection by this different group of consumers. Were they made uncomfortable by those games? Were they scared that our traditional games and toys would be replaced by imported foreign products? Why was there resistance toward those products? I'm sure there were many aspects, ranging from economics to politics, that might have influenced people's attitudes. I wonder if the fact that the very idea of these machines was foreign to them played a big role. If that was over a decade ago, how are things today worldwide? That's what I want to learn and discover more about.


(source: SONY, BANDAI)

I don't think we're at the point where people can simply accept robots working alongside human beings without rejection. I'm confident a lot of people still think of them as 'cold machines that have no feelings or will of their own, running on instructional language controlled by humans'. What I want to prove is not that one day robots will function and act fully like human beings, but rather that human perceptions of robots can change with modern technologies and inventions. Robots are cool, but they're not cold, hard machines; they're warm if you share a heart with them. I think they are a bunch of very warm machines.

I have an accumulation of reference materials and sources, gathered out of interest, that I could share here as well. I will put the relevant ones in separate posts in the future. For now, this is where I am.