Dan Chen: “Robots are bound to become a more intimate part of our lives”

https://www.facebook.com/plugins/video.php?href=https%3A%2F%2Fwww.facebook.com%2Fquartznews%2Fvideos%2F1782828751750819%2F&show_text=0&width=560

Dan Chen is a designer I recently found out about; his views on technology and robots are similar to mine. One day I want to make as many robots as I can, both simple and complicated, just like him.


Things to check and do: after consultations


When we talked on Wednesday, Margaret recommended that I watch this talk by Youngmoo Kim, director of the Expressive and Creative Interaction Technologies (ExCITe) Center and a professor at Drexel University. The topic is very relevant to my research, and I'm hoping to get in touch with him soon, so I thought this video deserved to open my journal entry.

I presented my Warm Robot idea to Ryan, Annet, Margaret, and Alan on Wednesday and got so much valuable feedback and direction. I decided to compile my notes and put them up on my CPJ as well:

Ryan & Annet

Artists to check out and learn from:

After consulting with Alan, it sounds like an inflatable might not be the best way to create a breathing pattern at the size of my robot; he said simple mechanical movement driven by a motor might work better. OMO is a robot that replicates a breathing motion after picking it up from its interaction with a breathing being. Studying how the breathing mechanism works in OMO might help me figure out the best way to create the breathing pattern in my own work.
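
To make "simple mechanical movement and a motor" concrete for myself, here is a minimal sketch of one way it could work: a hobby servo slowly easing a chest panel up and down. The pin, the angles, and the inhale/exhale timing are all placeholders I made up, not anything from OMO or from Alan:

#include <Servo.h>

Servo chest; // hypothetical servo pushing the robot's "chest" panel

void setup() {
  chest.attach(9); // servo signal wire on pin 9 (placeholder)
}

void loop() {
  // inhale: ease the panel up over roughly 1.5 seconds
  for (int angle = 60; angle <= 120; angle++) {
    chest.write(angle);
    delay(25);
  }
  // exhale: ease it back down a little slower, roughly 2 seconds
  for (int angle = 120; angle >= 60; angle--) {
    chest.write(angle);
    delay(35);
  }
}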

The fluid, gravity-dependent movements of Wolfson's animatronics are what express the emotions in the work. How do I implement this in my robot? How do I create fluid mechanical movement with such a limited skill set? Annet asked me, "Does it need to move on its own? Does it need human control?" And I think it does. I can definitely collaborate with it to create fluid communication between my robot and the viewers. My thesis GTI also recommended that I check out Tony Oursler's works to study the eyes he projects onto his sculptures, to figure out how to convey expression through the LCD screens (eye parts) of my robot. What kind of eye shapes and movements will convey its expression best?

  • John Bell, founder of Great Small Works

If the robot needs a mediator, we could work as puppeteer and puppet. John Bell wrote Strings, Hands, Shadows: A Modern Puppet History. If I'm going to perform as part of the piece, then I have work to do on learning puppetry. I think it's also a good chance to go back and watch videos of Alexander Calder's circus.

Collections of early science fiction short stories about the interactions between humans, robots, and morality. I read the summary, and it seems to be a Western fictional history of robotics, which would be interesting because my knowledge of robotics has always come from fiction told from an Eastern point of view. I definitely need context from a different perspective.

Margaret

Aside from the great lecture she recommended, here is another set of questions I should get around to:

  • How to engage my audience, how to keep them engaged
  • Who is my audience? Are there particular groups I want to target?
  • Listening to another person share their emotions is a good way to get someone to open up about their own. Does the robot need a narrative? A history or purpose to tell?

Alan

Engineering and programming reality check time!!

  • Breathing movement is difficult to mimic; keep scale in mind. It is still doable! An air pump is not practical, so cross out the inflatable. Mechanical movement and motors would do it better. Study OMO.
  • He wrote a simple algorithm for Erin's project last year that might be reusable for mine; it could handle the switching between heart-rate reactions. I should talk to Erin about it next week.
  • My robot seems to have too many elements to implement within the given deadline; prioritizing a couple would help the process of making and troubleshooting. Make a spreadsheet for my timeline!
  • A solenoid would be a better component than a vibrating motor for mimicking the heartbeat sensation: it gives a thump rather than a vibration (see the sketch after this list).
  • I need to research how fast a human heartbeat can change. Note: humans like something that's just a little off from the accurate number. Use this to my advantage.
  • This whole project would be far easier if I could tether it to a computer, so it isn't completely autonomous and doesn't rely on the Arduino as much. Tools like Node-RED, or Firefly, the Grasshopper extension for Rhino, would make my life so much easier.
  • Get permission from Lucas (my Intro to Robotics teacher) to make this a combined project for the rest of the class!!! His help would be extra, extra helpful.
  • Do a study and try out 1-bit DAC PWM on the Arduino.
  • I have to work fast: talk to Paul and Alan, make sure they both understand what's going on and what I'm trying to do with my project, and keep each other updated.
  • (From Beth, thesis GTI) Other software to check out for sound: Audacity and DM1.
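
To get a head start on the solenoid idea above, here is a minimal sketch of the "thump" timing, under some loud assumptions: the solenoid would sit behind a transistor and flyback diode on pin 9 (an Arduino pin can't power a coil directly), and the resting rate, thump lengths, and lub-dub gap are all numbers I made up, not anything Alan specified:

const int SOLENOID_PIN = 9; // drives a transistor + flyback diode, never the coil directly
int bpm = 70;               // placeholder resting rate; later this would come from the pulse sensor

void setup() {
  pinMode(SOLENOID_PIN, OUTPUT);
}

void loop() {
  unsigned long beatPeriod = 60000UL / bpm; // milliseconds between beats

  // "lub-dub": two short thumps per beat
  digitalWrite(SOLENOID_PIN, HIGH);
  delay(40);  // first thump
  digitalWrite(SOLENOID_PIN, LOW);
  delay(120);
  digitalWrite(SOLENOID_PIN, HIGH);
  delay(40);  // second thump
  digitalWrite(SOLENOID_PIN, LOW);

  // wait out the rest of the beat; per Alan's note, a rate that is
  // a little "off" from the real reading might even feel better
  if (beatPeriod > 200) {
    delay(beatPeriod - 200);
  }
}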

Warm robot: heartbeat sensor

From now on, I'm gonna refer to my final project as Warm Robot. It's just a temporary name, but it makes it easy to track the project's progress.

After purchasing the pulse sensor, I ran a test with it using an LED light as the output. It reads my heartbeat pretty well, especially when I hook it up to my ear.

IMG_2137

(it came with an ear clip!)

I wrote the code based on the analog serial reader example and added an LED as an output (plus an additional LED to test multiple outputs). Here's the code:


int Threshold = 520; // the turning point when the heart beats
int LED1 = 13;       // first LED, to indicate each beat
int LED2 = 5;        // another output corresponding to the first LED

void setup() {
  Serial.begin(9600);
  pinMode(LED1, OUTPUT); // both LED pins have to be declared as outputs
  pinMode(LED2, OUTPUT);
}

// the loop routine runs over and over again forever:
void loop() {
  int Pulse = analogRead(A0); // where I plugged in my pulse sensor

  // print out the value you read:
  Serial.println(Pulse);

  if (Pulse > Threshold) {
    digitalWrite(LED1, HIGH);
  } else {
    digitalWrite(LED1, LOW);
  }
  delay(10);

  // blink LED2 only while the pulse is above the threshold
  if (Pulse > Threshold) {
    digitalWrite(LED2, HIGH);
    delay(2);
    digitalWrite(LED2, LOW);
    delay(2);
    digitalWrite(LED2, HIGH);
    delay(2);
    digitalWrite(LED2, LOW);
    delay(4); /* I made sure the total delay is less than or
                 equal to the LED1 delay */
  } else {
    digitalWrite(LED2, LOW);
  }
}


Then I opened the serial plotter to see a graph of the values from the pulse sensor. I googled around and looked at code other people have written to find the best way to count each heartbeat, and so far setting a threshold seems like the simplest approach that works for me. I wonder if there's a way to count each +apex and -apex instead? Is that even possible? I think so? I'll need to consult someone about this.
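
While I wait to ask someone, here is my rough guess at how counting each +apex could work: instead of comparing against a fixed threshold, watch for the moment the signal stops rising and starts falling. The variable names and the minimum-height check are my own inventions, not from any of the code I found:

// count a beat at each +apex (local peak) instead of a fixed threshold
int previous = 0;    // last reading
bool rising = false; // was the signal climbing on the last sample?
int beatCount = 0;

void setup() {
  Serial.begin(9600);
}

void loop() {
  int current = analogRead(A0); // pulse sensor on A0

  if (current > previous) {
    rising = true; // still climbing toward the apex
  } else if (rising && current < previous) {
    // the signal just turned around: that was a +apex
    if (previous > 520) { // ignore small wiggles below my old threshold
      beatCount++;
      Serial.print("beat #");
      Serial.println(beatCount);
    }
    rising = false;
  }

  previous = current;
  delay(10); // sample roughly every 10 ms
}

Counting the -apex would just be the mirror image: watch for the signal to stop falling and start rising again.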

IMG_2138 copy

mmmm yaA i’m alive

IMG_2138

IMG_2134

It seems to be working with the LED lights. I tried a piezo for a sound output, but it doesn't seem to be working... I thought it would work if I just changed it from digital to analog. Regardless, it's a step forward!! Let's see what else I can do before class on Wednesday!
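
My guess about the piezo: it might need an actual oscillating signal rather than a steady on/off, which would explain why switching from digital to analog didn't help. Arduino's built-in tone() function generates a square wave at a given frequency, so something like this might work; the pin and the 880 Hz frequency are placeholders I picked:

int Threshold = 520;
const int PIEZO_PIN = 8; // placeholder pin for the piezo

void setup() {
  Serial.begin(9600);
}

void loop() {
  int Pulse = analogRead(A0);
  Serial.println(Pulse);

  if (Pulse > Threshold) {
    tone(PIEZO_PIN, 880, 20); // short 880 Hz beep while the signal is above the threshold
  }
  delay(10);
}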

Understanding Robots: dissection

I'll start with the good news: I got my Schengen visa a couple of days ago! That means I'll be traveling to the Netherlands, where I get to check out Dutch Design Week and collaborate with students at WDKA!!

So I managed to consult Ryan, Annet, Alan, and Margaret about my project. I got great feedback, and now I know which direction I should take. I will run into a lot of troubleshooting with the wiring and programming, but there's nothing stopping me from building robots, something I've always wanted to build.

However, I'm still nowhere remotely close to confident with electronics. Alan told me to just write down what I need to do and go at it. Today in thesis, my GTI told me the same thing. But I personally can't just 'go at it' without a general understanding of what does what! So today I decided to open up an old cassette recorder/player (found in the e-recycling bin) that's only half working, to see if I could fix it or harvest some parts from it.


I practiced harvesting parts from broken electronics in my Intro to Robotics class, but it was pretty hard to figure out which parts I could use, because most of the electronics I found use modern sensors instead of simple buttons, motors, and gears. So opening an old, familiar cassette player was helpful.

Everything was working fine except the motor assembly for the cassette player.


The rubber belt that moves the gears had become loose and wouldn't stay in place, and the rewind button didn't smoothly switch the slot that changes the direction of the gears. I figured they might sell these belts online, but finding a replacement isn't worth the time. The speakers, though, are working just fine.


From the front, I harvested two healthy stereo speakers and a board with small amplifiers and a potentiometer that I might desolder later. From the back, I got multiple gears, a motor, and a tape head that may or may not work.


This main circuit board with its bunch of chips is the part I'm not familiar with yet, but it's amazing how everything is packed so neatly in here, creating a functional device that plays and records music.


So in the end, this was my harvest:

  • two healthy stereo speakers
  • a DC motor
  • multiple gears and rods
  • a small potentiometer (on the board)
  • two JRC 2073b amplifiers (on the board)
  • a tape head (working or not???)
  • an empty player shell to repurpose!

I'm very happy with the harvest and with my better understanding of how simple robots work!

Tweenbot: The lost robot

Tweenbot is a project by Kacie Kinzer. The Tweenbot is made out of cardboard, is ten inches tall, is always smiling, and only moves in one direction, but it needs to go somewhere: it carries a flag that says “help me get to MoMA!”

Moving in only one direction, it got caught on many obstacles and headed in dangerous directions. But every time this happened, there was always someone who saved the Tweenbot from destruction, and it eventually ended up at its destination. It is now in MoMA's permanent collection, and the project is still ongoing.

So how did a ten-inch cardboard robot with no sense of direction get to where it needed to go? With the help of the people around it. But why would anyone help a piece of rolling cardboard? It's just rolling cardboard, right? Well, if you read my post about the NPR podcast Hidden Brain, you'd get an idea why. The people around the Tweenbot developed sympathy for it within the short time they encountered it. How could it gain people's sympathy so fast? Well, first, from my own observation and bias, I think it's because it's cute. Just look at it, it's smiling!!

4

It looks so happy. Adorable. 10/10.

Second, it is made out of cardboard, a familiar, recyclable material; there is nothing eccentric in the design of the robot. Third, it is helpless and very prone to danger. It's just like seeing a helpless kitten on the street, only this one really can't do anything other than roll forward, which makes it even more vulnerable.

My conclusion: I think people develop sympathy relatively fast when the subject of sympathy is familiar, harmless, vulnerable, and has no potential to disadvantage the sympathizer.


5 Senses and 5 Levels of Intimacy: Do we need all 5 stimuli to evoke emotional response?

I've had this thought for a while, ever since we discussed 'The Sympathy of Things' in class. Ryan mentioned that he went to a conference a while back and heard people talking about and working on the best, highest-definition, most realistic visual and audio simulations, but not much about the human sense of touch. Today, I think it's safe to say that people have been trying to achieve that too: through full-body virtual reality experiences, pressure, vibration, use of gravity, wearable technologies, etc. I think those are the beginning of it.

I remember when I was 11, I learned how to download music and movies illegally (since my brother never let me borrow his cassettes and CDs, which forced me to become tech savvy), and at some point I wondered if we would ever be able to download smells and have scent devices like MP3 players. It's been a decade, and I haven't heard a word about any research or development. If we can create visual and audio simulations, then what about our other senses: touch, smell, and taste?

The more I think about it, the more I think about our senses' relationships to our memories and emotions. If I were to list all five senses from least intimate to most intimate, it'd go like this:

  1. Sight
  2. Hearing
  3. Touch
  4. Smell
  5. Taste

So far we have devices that can simulate what we see and what we hear, and we can easily make digital copies and share them with whomever we want. We are currently trying to create simulations of touch. I wonder how long it'd take, if it's possible at all, to create simulations of smell and taste.

Both are such abstract concepts that even we have trouble describing them with today's language. They're also strong triggers for emotions and memories; if technology could one day make those simulations happen, it'd be such a powerful instrument.

I'm still curious about it. But anyway, back to my project:

Is it necessary to activate all five of our senses to trigger an emotional response? I don't think so. While I won't deny that some of them are stronger stimuli than others, I believe we can use what's readily available in today's technology to get an emotional response from an audience.

Based on a study conducted at Berkeley, human adults can only focus on and be conscious of about four tasks at a time. If someone were told to watch their breathing, be aware of their heartbeat, feel the texture under their fingertips, and listen to a voice, they would be lost in those four moments.

I can use this limit to my advantage to create the illusion of a fully immersive experience with a simple machine.

the uncanny valley: where sympathy is absent

The uncanny valley is definitely something I have to consider while planning this project. My goal is to make the people who interact with my project feel both comfort and discomfort at the same time, to give them space in their heads to question the emotional responses they feel.

While I have been going on and on about what I want to do with my project, I don’t think I’ve ever written here how I am going to approach it.

My idea is to make a robot that measures a person's anxiety or nervousness by receiving biofeedback from the viewer through a pulse sensor, and that interacts with that person based on the heart rate the sensor detects. How am I gonna do it? I don't know, but I have a couple of ideas, and I... might have gone and bought some materials I could use without making sure any of this would work. However, you never know until you try.
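
To make this less hand-wavy for myself, here is a rough sketch of the logic I'm imagining: time the gap between beats, turn it into BPM, and pick a behavior. The threshold, the BPM cutoff, and the two "states" are placeholder guesses, not a worked-out design:

int Threshold = 520;        // a beat = an upward crossing of this value
unsigned long lastBeat = 0; // time of the previous beat (ms)
bool wasAbove = false;
int bpm = 0;

void setup() {
  Serial.begin(9600);
}

void loop() {
  bool isAbove = (analogRead(A0) > Threshold);

  if (isAbove && !wasAbove) { // the signal just crossed upward: a beat
    unsigned long now = millis();
    if (lastBeat > 0) {
      bpm = 60000UL / (now - lastBeat); // gap between beats -> BPM
      if (bpm > 100) {
        Serial.println("viewer seems nervous -> slow the robot's breathing to calm them");
      } else {
        Serial.println("viewer seems calm -> mirror their rhythm");
      }
    }
    lastBeat = now;
  }
  wasAbove = isAbove;

  delay(10);
}

The real version would swap the Serial prints for the breathing motor and the solenoid thump.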

So how am I gonna build that robot? I'm no programmer or robotics engineer; I'm an artist. I work backwards: I create a character, the body, the case for the electronic components that are gonna run behind a cute character. Why a cute character, though?

My goal is to capture sympathy, or emotional attachment, from the viewer through things they're already familiar and comfortable with. We love cute stuff. We like things we can relate to, so if we go back to the uncanny valley chart, I want my robot to physically appeal at this point, the apex of familiarity and comfort:

uncanny1

Then, of course, there would be the discomfort of knowing that a non-human, half-inanimate object is trying to understand our emotions. Emotionally, that would fall inside the uncanny valley:

uncanny2

But if these two qualities are put together in one entity, it could create an interesting new point on the chart, a higher apex of appeal:

uncanny3

So I started with sketches of my character and the basics of how the input and output would look.

IMG_1971 IMG_1973 IMG_1975

I will explain more about this robot in my next CPJ update, with more refined details from the sketches! A lot of thinking and conversation happened during the brainstorming of this little guy that I didn't have time to put in the CPJ, so I'll make sure to fill in the gap later, when I get time to visualize and write things down.


NPR Hidden Brain: Can Robots Teach Us What It Means to Be Human?

https://www.npr.org/player/embed/536043276/536505014

A friend of mine suggested this podcast to me during summer break. I'm not much of a podcast gal, but some of the topics are right up my alley, especially this one: “Can Robots Teach Us What It Means to Be Human?”

Robots bent on our destruction remain the stuff of movies like “Terminator.” And robot sentience is still an idea that’s far off in the future. But there’s a lot we’re learning about smart machines, and there’s a lot that smart machines are teaching us about how we connect with the world around us and with each other. This week on HIDDEN BRAIN, can robots teach us what it means to be human?

This reminds me again of the goal of my project, and even of my current art practice. It's not about creating the perfect robot like the ones in sci-fi and cartoons. It's about making people aware of the everyday interactions we have with already-existing robots that we often don't think of as robots, because media and pop culture have heavily influenced society and shifted people's definition of 'robot'.

This episode gave me such big inspiration to keep pursuing this topic. The research done by Professor Kate Darling shows that we do project emotions and souls onto toy robots, in this case ones that pretty much behave like pet dinosaurs. Being aware of and open to interactions between us humans and robots creates a false emotion that we project onto the robot, making it seem like it comes from within.

In Darling's research, the people who were more willing to hurt the robots also showed lower empathic concern for other human beings. I'm guessing this has something to do with each person's level of empathy. Using a robot as an empathy measurement for humans? Sounds legit to me. Now, the robot used in Darling's study is designed to be friendly looking; it's basically an expensive pet dinosaur robot toy called PLEO.

B000RWEGCO-2-med

With this cute-looking robot, of course people would be more open and willing to interact with it. I wonder if that would still be the case if the robots presented were just assemblies of mechanical arms. Would people still project the same emotions and souls onto them? Would they feel empathy towards them?

The participants were exposed to the PLEO for about 20 minutes, then asked whether one of them could hammer and break the little friend they had just made. None of them could do it; they were too attached. Kate Darling thought the price and appearance of the robot might be huge factors in that decision.

Then Darling did a follow-up study using HEXBUG toys instead of the expensive, adorable dinosaur, but the result stayed the same: most participants did not want to, or were hesitant to, smash the bug. Whether or not a participant wanted to smash the bug says a lot about their personality, that is, how they treat other people and how much empathy they feel for those around them.

c26-B0051C0HGY-2-l

“So the follow-up study that we did, not with the dinosaurs, we did with HEXBUGs, which are a very simple toy that moves around like an insect. And there, we were looking at people’s hesitation to hit the HEXBUG and whether they would hesitate more if we gave it a name and whether they would hesitate more if they had natural tendencies for empathy, for empathic concern. And, you know, we found that people with low empathic concern for other people, they didn’t much care about the HEXBUG and would hit it much more quickly. And people with high empathic concern would hesitate more. And some even refused to hit the HEXBUGs. ….Yeah. I think there’s a lot of projection happening there. I also think that before we get to the question of robot rights and consciousness, you know, we have to ask ourselves, how do robots fit into our lives when we perceive them as conscious? Because I think that’s when it starts to get morally messy and not when they actually inherently have some sort of consciousness.”

– Kate Darling

So, robots are warm only if we’re warm

First try on electronics: bzzbzZ

I signed up for Unravel The Code with a lot of ideas in my head but not enough skill for what I want to do, so I thought it'd be smart to take an Intro to Robotics class at the same time (it was a smart choice). The class is taught by Lucas Haroldsen, who graduated from MICA's sculpture program and is pretty cool. On Tuesday, I did my first electronics tutorial in class, covering different ways to set up LED lights. After the tutorial, we were given an assignment to put together a circuit based on a couple of schematic drawing options, built from a handful of basic components (a transistor, a potentiometer, a couple of different motors, lights, and a photoresistor). I was so fired up about the idea of my first robot that I finished a week's worth of assignments in one night. I didn't talk to anyone. Am I turning into a robot? Find out at the end of the class!

So this was one of the schematics that I chose to do:

Screen Shot 2017-09-08 at 4.42.38 PM

Which is which, what is what? I had no idea what was going on. But it's an introductory class for a reason, so it came with a drawing:

Screen Shot 2017-09-08 at 4.43.09 PM

Nice, right? So I did my best and prototyped it on my breadboard:

IMG_1482

I had trouble with the transistor, but after a couple of tries, when everything finally worked out, it felt like Jesus was born again. It's also interesting to see components that look exactly the same but work differently and make the entire circuit work in a different way.

“By turning a small input current into a large output current, the transistor acts like an amplifier. But it also acts like a switch at the same time. When there is no current to the base, little or no current flows between the collector and the emitter. Turn on the base current and a big current flows.” (src: explainthatstuff.com)

So I used the PNP transistor to switch the current flowing from the emitter (which is connected to the photoresistor) to the collector (the output; here I chose a small motor that works like a vibrator). While in the 'day bug' (using an NPN transistor) the motor runs when the photoresistor is exposed to light, the PNP transistor switches it around, so current flows when the photoresistor does not receive any light. Honestly, I don't really know exactly how it works, but at least I can follow the flow of the current, so I can do my own troubleshooting.

After making sure the circuit was good on the breadboard, I put it together on a tiny copper-clad board. It took me a little while to get used to soldering, but it feels just like a tiny TIG welder:

IMG_1487

I'm still impressed by how all of these things fit on that tiny board (and by how I didn't burn my hand or anything nearby). I'm so proud of it.

That was my first baby step on the journey into making robots, and I will definitely go beyond the class and make more.

Where My Mind Is Currently At: Warm Machines

Recently I have been thinking a lot about robots (as I stated on my About page) and how society perceives them. Now let's take a step back and process what 'robot' means.

Screen Shot 2017-09-04 at 11.12.43 PM

(source: dictionary.com, through Google search engine)

The way the two sub-definitions use each other to measure how human-like or robot-like something is was one of the first reasons this topic piqued my interest and I chose to pursue it. It says that a robot is 1) a machine that resembles a human being and can replicate certain human movements, and 2) a term used to refer to a person who behaves in a mechanical or unemotional manner. But with modern technologies and inventions, human beings have come so far in making robots function as humanly as possible, and I don't think people are going to stop anytime soon. So with so many advanced robots running today, where do we draw the line between 'robot-like' and 'human-like'?

1111

(source: Google, Gizmo, Amazon, Ubisoft, SONY, Disney, Adafruit)

Programmers, artists, scientists, and hobbyists have become more and more interested in making robots that move and function just like the ones we see in science fiction. Growing up in Tokyo, Japan during the second half of the '90s made me really accustomed to the friendly-robot trend that was popularized mainly through media and commercial products at the time. However, when I went back to my hometown, Jakarta, Indonesia, I realized that wasn't the case for most people there. I would say that in the early 2000s, technology and globalization were moving pretty slowly in Indonesia, with the internet still being new, so most people were more often exposed to traditional processes. There were fewer robot toys in the toy stores. Computerized toys and video games such as the Tamagotchi and the SEGA Saturn were less favored by parents, and they took a couple of years to become as popular in Indonesia as they had been when first released in Japan.

Print

(source: Mushi Production, Shin-Ei Animation, Bee-Train, Studio Bogey)

Of course I wasn't aware of this cultural resistance as a kid, but looking back now, there definitely was a form of rejection by this different group of consumers. Were they uncomfortable with those games? Were they scared that our traditional games and toys would be replaced by imported foreign products? Why was there resistance to those products? I'm sure there were many factors, ranging from economics to politics, that might have influenced people's attitudes. I wonder if the fact that the idea of such machines was foreign to them played a big role. If that was over a decade ago, how are things today, worldwide? That's what I want to learn and discover.

3333

(source: SONY, BANDAI)

I don't think we're at the point where people can simply accept robots working alongside human beings without rejection. I'm confident a lot of people still think of them as 'cold machines that have no feelings or will of their own, running on instructional language controlled by humans'. What I want to prove is not that robots will one day function and act fully like human beings, but rather that human perceptions of robots can change with modern technologies and inventions. Robots are cool, but they're not cold, hard machines; they're warm if you share a heart with them. I think they are a bunch of very warm machines.

I have an accumulation of reference materials and sources, gathered out of interest, that I could share here as well. I will post the relevant ones separately in the future. For now, this is where I am.