A lot of updates: Veebo

Wednesday is the opening of our show at Open Works, and I’m wrapping up things for the show.

There are a lot of things that did not happen or did not go as smoothly as I’d like, like the TFT LCD screen and the Arduino code. A lot of changes needed to be made, and while Alan helped me with the coding, there wasn’t enough time to work with him and I’m not knowledgeable enough in programming to troubleshoot things on my own.

But there are still a lot of things that were accomplished! As shown in the previous update, I’m pretty pleased with where the breathing mechanism is at. And the physical body of the robot is almost there, too.

that’s the head of the robot

the body is still in the making


that’s the gear of the motorized breathing mechanism that goes into the robot’s chest


and this is the armature for the chest


tried to print the breathing mechanism on the Formlabs, but I didn’t provide enough support and this is what happened


this is the prototype of the chest movement

I also drew a diagram to show what component goes where:

(diagram of which component goes where in Veebo)

Warm Robot: More Progress

This week’s progress is on the robot’s breathing mechanism, heartbeat, and circuitry.

I tested the motors for the breathing mechanism and heartbeat with simpler prototypes, and tried out how much voltage each of them needs to create different ranges of breaths per minute and beats per minute; these are the notes that I wrote down:

(my handwritten notes from the motor tests)
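
If I end up redoing these tests from the Arduino side, a rough sketch could look like this. It’s only a sketch of the idea, not my actual test setup: I’m assuming the motor is driven through a transistor on PWM pin 9, and the pin number, duty steps, and timings are placeholders. The duty cycle stands in for “how much voltage,” and at each step I can count breaths per minute with a stopwatch and write the pair down.

int motorPin = 9; // assumed: the breathing motor behind a transistor on a PWM pin

void setup() {
  Serial.begin(9600);
  pinMode(motorPin, OUTPUT);
}

void loop() {
  // step the motor speed up in stages; at each stage I can count
  // breaths per minute with a stopwatch and note down the pair
  for (int duty = 80; duty <= 255; duty += 25) {
    analogWrite(motorPin, duty); // duty cycle stands in for motor voltage
    Serial.print("duty: ");
    Serial.println(duty);
    delay(15000); // hold each speed for 15 seconds
  }
  analogWrite(motorPin, 0); // rest before repeating the sweep
  delay(5000);
}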

I gathered the necessary speeds according to my prior research; below is a table of respiratory and heart rates for different species and states:

(table of respiratory and heart rates across species and states)

I also made a casing for the motor, as well as a smoother gear for the breathing movement

(I tested out different egg shapes for better timing of the breathing movement, and I found out that slight differences create a huge timing gap between them)

(the evolution of the prototypes)

That’s it for now, more updates to come.

Update: Process and progress + timeline

The trip to the Netherlands, Dutch Design Week, Willem de Kooning Academie, and Amsterdam was a B L A S T!! A report of the trip will be posted soon. For now, I want to keep my CPJ updated on my project!

It’s been a while since I last wrote about the progress of my project, so I think it’s a good idea to explain from the beginning again.

I think of robots as potential beings. Our technology is not quite there yet, but I believe it’s not far from now that robots and machines will be running autonomously. Today robots are not only tools for us; they are also our companions and collaborators. How do people feel about that? What exactly are our relationships to robots in the past, today, and in the future?

Therefore I am working on a companion robot that could potentially become a therapy robot for children with anxiety.

This little buddy, which has no name yet, will run on an Arduino Mega with a heart pulse sensor and several outputs, such as voice, breathing movement, heartbeat sound, and animated eyes, that react to your heart rate. The body will be made mainly of wood. Below are character design studies and technical sketches for fabrication.


list of things to do (a typed-out version was posted in a previous post)

 

After a lot of figuring out around the electronics, I finally started prototyping the body and the mechanical movements that are gonna be implemented in it.

I used insulation foam from Home Depot as the material for prototyping the main body parts. The planned size and shape feel right, and so far there has been no problem with the amount of room inside the body for the electronic parts. Next I need to figure out how to build the shape out of wood and mount the electronic pieces onto the body.

I milled the material I got so it’s ready to be laminated and then turned on the lathe to create the shape I’m going for.

The breathing mechanism is coming together!!!! The prototype started out as foam core, tape, and wire, then a burnt chipboard version that never got put together, and now a more presentable laser-cut plywood one to show and test out shapes for the appropriate breathing-in and breathing-out movement of the chest. The laser-cut breathing hinges seem to be working well to show the movement in the prototypes; however, they are not moving the way I want the chest part to expand. Annet suggested using neoprene for a more even stretch. Ryan also suggested creating an armature with foam to even out the distribution of pressure onto the heart-shaped chest.

And today I had a meeting with Alan to test out the code that he wrote to smooth out the heartbeat reading, set a debounce, and spread the signal to multiple different outputs. We’re still having trouble with the LCD screen. For some reason, it works just fine when it’s plugged directly onto the Arduino board but not when it’s connected through cables. Maybe the connection isn’t strong enough? Alan said there are multiple possible reasons this could be happening and he’s gonna help me figure it out. So now I’m gonna be working on my audio file and circuit, as well as the eye expressions, while he’s troubleshooting the LCD screen connection.
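
I’m not going to paste Alan’s actual code here, but the general idea of smoothing plus a debounce looks roughly like this simplified sketch of my own (the numbers are made up, not his): keep a rolling average of the last few sensor readings, and only count a new beat if enough time has passed since the previous one.

int Threshold = 520;                 // what counts as a beat
const int numReadings = 8;           // how many samples to average
int readings[numReadings];
int readIndex = 0;
long total = 0;
unsigned long lastBeat = 0;          // time of the last counted beat
const unsigned long debounce = 250;  // ignore new beats for 250 ms (made-up value)

void setup() {
  Serial.begin(9600);
  for (int i = 0; i < numReadings; i++) readings[i] = 0;
}

void loop() {
  // smoothing: rolling average of the last few readings
  total -= readings[readIndex];
  readings[readIndex] = analogRead(A0);
  total += readings[readIndex];
  readIndex = (readIndex + 1) % numReadings;
  int smoothed = total / numReadings;

  // debounce: only count a beat if enough time has passed since the last one
  if (smoothed > Threshold && millis() - lastBeat > debounce) {
    lastBeat = millis();
    Serial.println("beat"); // this is where the different outputs would get triggered
  }
  delay(10);
}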

I’m so psyched for this project! So far things are going pretty close to the original timeline, nothing going too wrong just yet.

UNRAVEL THE CODE, NILAM’S TIMELINE

Things to check and do: after consultations


After I talked to Margaret on Wednesday, she recommended I watch this talk by Youngmoo Kim, the director of Expressive and Creative Interactive Technology (EXCITE) and a professor at Drexel University. The topic is very relevant to the context of my research and I’m hoping to get in touch with him soon, so I thought this video deserved to open my journal entry.

I presented my Warm Robot idea to Ryan, Annet, Margaret, and Alan on Wednesday. I got so much valuable feedback and direction. I decided to compile my notes and put them up on my CPJ as well:

Ryan & Annet

Artists and works to check out and observe:

After consulting with Alan: for the size of my robot, an inflatable might not be the best way to create a breathing pattern; he said simple mechanical movements and a motor might work better. OMO is a robot that replicates breathing motion after it picks up the breathing of a being it interacts with. Studying how the breathing mechanism works in OMO might help me figure out the best way to do the breathing pattern in my work.

The fluid movements of Wolfson’s animatronics, which depend on gravity, are what express the emotions in the work. How do I implement this in my robot? How do I make fluid mechanical movements with such a limited skill set? Annet asked me, “Does it need to move on its own? Does it need human control?” And I think it does. I can definitely collaborate with it to create fluid communication between my robot and the viewers. My thesis GTI also recommended I check out Tony Oursler‘s works and study the eyes he projects onto his sculptures, to help me convey expressions through the LCD screens (the eye parts) of my robot. What kind of eye shapes and movements will convey its expression the best?

  • John Bell, founder of Great Small Works

If the robot needs a mediator, we could work as puppeteer and puppet. John Bell wrote Strings, Hands, Shadows: A Modern Puppet History. If I’m going to perform as a part of the piece, then I have work to do on learning puppetry. I think it’s also a good chance to go back and watch videos of Alexander Calder’s circus.

A collection of early science fiction short stories about the interactions between humans, robots, and morality. I read the summary and it seems to be a Western fictional history of robotics, which would be interesting because my knowledge of robotics has always been fictional and from the Eastern point of view. I definitely need context from a different point of view.

Margaret

Aside from the great lecture she recommended, another set of questions that I should get around to is:

  • How to engage my audience, how to keep them engaged
  • Who are my audiences? Are there particular parties I want to target?
  • To get someone to be open about their emotions, listening to another share theirs is a good evoker. Does the robot need a narrative? A history or purpose to tell?

Alan

Engineering and programming reality check time!!

  • Breathing movement is difficult to mimic; keep scale in mind. It is still doable! An air pump is not practical, so cross out the inflatable; mechanical movement and motors would do it better. Study OMO.
  • He generated a simple algorithm for Erin’s project last year that might be reusable for my project. It could be used to switch between heart rate reactions. I should talk to Erin about it next week.
  • My robot seems to have too many elements to implement within the given deadline; prioritizing a couple would help the process of making and troubleshooting. Make a spreadsheet for my timeline!
  • A solenoid would be a better component to use than a vibrating motor to mimic the heartbeat sensation. It gives a thump rather than a vibration (see the little sketch after this list).
  • I need to do research on how fast the human heartbeat can change. Note: humans like something that’s just a little off from the accurate number; use this to my advantage.
  • This whole project would be far easier if I could tether it to a computer so it’s not completely autonomous and doesn’t rely on the Arduino as much. Tools like Node-RED or the Rhino Grasshopper extension Firefly would make my life so much easier.
  • Get permission from Lucas (the Intro to Robotics teacher) to make this a combined project for the rest of the class!!! His help would be extra, extra helpful.
  • Do a study and try out 1-bit DAC PWM audio on the Arduino.
  • I have to work fast and talk to Paul and Alan; make sure they both understand what’s going on and what I’m trying to do with my project, and keep each other updated.
  • (From Beth, thesis GTI) other software to check out for sound: Audacity and DM1.
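
Here’s the little sketch mentioned in the solenoid note above, just to make the thump idea concrete for myself. It’s only a guess at how it could work: I’m assuming the solenoid sits on pin 7 behind a transistor driver, and the pin number and pulse length are not tested values. Fire a short pulse for the thump, then wait out the rest of the beat.

int solenoidPin = 7; // assumed: solenoid driven through a transistor
int bpm = 70;        // target heartbeat rate

void setup() {
  pinMode(solenoidPin, OUTPUT);
}

void loop() {
  unsigned long beatInterval = 60000UL / bpm; // ms between beats

  digitalWrite(solenoidPin, HIGH); // the "thump"
  delay(40);                       // short pulse so the solenoid doesn't overheat
  digitalWrite(solenoidPin, LOW);

  delay(beatInterval - 40);        // wait out the rest of the beat
}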

Warm robot: heartbeat sensor

From now on I’m gonna refer to my final project as Warm Robot; it’s just a temporary name, but it makes it easy to track the project’s progress.

After purchasing the pulse sensor, I did a test with it using an LED light as an output. It reads my heartbeat pretty well, especially when I hooked it up on my ear.


(it came with an ear clip!)

I wrote the code for the analog serial reader and added an LED as an output (and an additional LED to test out multiple outputs); here’s the code:


int Threshold = 520; // the turning point when the heart beats
int LED1 = 13;       // first LED, to indicate each beat
int LED2 = 5;        // another output in correspondence to the first LED

void setup() {
  Serial.begin(9600);
  pinMode(LED1, OUTPUT); // both LED pins need to be set as outputs
  pinMode(LED2, OUTPUT);
}

// the loop routine runs over and over again forever:
void loop() {
  int Pulse = analogRead(A0); // where I plugged in my pulse sensor

  // print out the value you read:
  Serial.println(Pulse);

  if (Pulse > Threshold) {
    digitalWrite(LED1, HIGH);
  } else {
    digitalWrite(LED1, LOW);
  }
  delay(10);

  if (Pulse > Threshold) { // while LED1 is on, blink LED2 a few times
    digitalWrite(LED2, HIGH);
    delay(2);
    digitalWrite(LED2, LOW);
    delay(2);
    digitalWrite(LED2, HIGH);
    delay(2);
    digitalWrite(LED2, LOW);
    delay(4); /* I made sure the total of the delays is less than or
                 the same as the LED1 delay */
  } else {
    digitalWrite(LED2, LOW);
  }
}

 

Then I opened the serial plotter to see the graph of the values from the pulse sensor. I googled and looked around at code people have written to find the best way to count each heartbeat, and so far putting in a threshold seems like the simplest approach that worked for me. I wonder if there’s a way to count each +apex and -apex? Is that even possible? I think so? I’ll need to consult someone about this.
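
One idea for the +apex question that I haven’t tried yet, just a sketch of the logic: only count the moment the signal crosses the threshold on its way up, and use the time between two crossings to get beats per minute.

int Threshold = 520;
bool aboveThreshold = false; // remembers whether the last reading was above the threshold
unsigned long lastCross = 0; // time of the previous rising crossing

void setup() {
  Serial.begin(9600);
}

void loop() {
  int Pulse = analogRead(A0);

  // a beat is counted only at the rising crossing (on the way up to the +apex)
  if (Pulse > Threshold && !aboveThreshold) {
    unsigned long now = millis();
    if (lastCross > 0) {
      int bpm = 60000UL / (now - lastCross); // beats per minute from the gap between crossings
      Serial.println(bpm);
    }
    lastCross = now;
  }
  aboveThreshold = (Pulse > Threshold);
  delay(10);
}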


mmmm yaA i’m alive


It seems to be working with LED lights. I tried a piezo for a sound output, but it doesn’t seem to be working.. I thought it would work if I just changed it from digital to analog. Regardless, it’s a step forward!! Let’s see what else I can do before class on Wednesday!
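
A note for next time with the piezo (I haven’t tried this yet): instead of switching from digitalWrite to analogWrite, Arduino’s tone() function can drive a piezo directly, so a short beep on every detected beat could look something like this sketch. Pin 8 is just a guess for where I’d plug the piezo in.

int Threshold = 520;
int piezoPin = 8; // assumed pin for the piezo

void setup() {
  Serial.begin(9600);
}

void loop() {
  int Pulse = analogRead(A0);

  if (Pulse > Threshold) {
    tone(piezoPin, 880, 50); // 880 Hz beep for 50 ms on each beat
  }
  delay(10);
}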

the uncanny valley: where sympathy is absent

The uncanny valley is definitely something I have to consider while planning this project. My goal is to make the people who interact with my project feel both comfort and discomfort at the same time, to give them space in their heads to question the emotional responses that they feel.

While I have been going on and on about what I want to do with my project, I don’t think I’ve ever written here how I am going to approach it.

My idea is to make a robot that measures one’s anxiety/nervousness by receiving biofeedback from the viewer through a pulse sensor, and that interacts with said person based on the heart rate the sensor detects. How am I gonna do it? I don’t know, but I have a couple of ideas, and I.. might have gone and bought a couple of materials that I could use without making sure whether this would work or not. However, you will never know until you try it.

So how am I gonna build that robot? I’m no programmer or robotics engineer; I’m an artist. I work backwards. I create a character, the body, the case for the electronic components that are gonna run behind a cute character. Why a cute character, though?

My goal is to capture sympathy or emotional attachment from the viewer through things that they’re already familiar and comfortable with. We love cute stuff. We like things we can relate to, so if we go back to our uncanny valley chart, I want my robot to physically appeal at this point, the apex of familiarity and comfort:

(uncanny valley chart, with the peak of familiarity and comfort marked)

Then of course there would be the discomfort of knowing that a non-human, half-inanimate object is trying to understand our emotions; emotionally, that would fall into the uncanny valley:

(uncanny valley chart, with the robot falling into the valley)

But if these two qualities are put together in one entity, it could create an interesting new point on the uncanny valley curve, a higher appealing apex:

(uncanny valley chart, with the new, higher apex marked)

So I started with sketches of my character and the basics of what the input and output would look like.

I will explain more about this robot in my next CPJ update, with more refined details about the sketches! There’s a lot of thinking and conversation that happened during the brainstorming of this little guy that I didn’t have time to put in the CPJ, so I’ll make sure to fill in the gap later when I get the time to visualize and write things.