Collaborama Pt.1

So I’m currently collaborating with my colleague and new friend, Lilian, on a project that uses our P5.js knowledge so far as our toolbox. The challenge is to create a group activity that uses 21 screens together.

After brainstorming a couple of ideas and possibilities within our limitations, Lilian and I came up with an idea about ‘unplugging’: giving our full attention to the people around us without the distraction of screens, except that it is facilitated by screens and our P5.js app. Our app creates a phone campfire.

From my research, a campfire is a casual ritual performed today at campsites to deter predators and pests, or simply to provide warmth and comfort. The idea came from bonfires, which are more ceremonial.

Google definition:

Bon·fire
/ˈbänˌfī(ə)r/

noun

noun: bonfire; plural noun: bonfires

  1. a large open-air fire used as part of a celebration, for burning trash, or as a signal.
    “the smell of burning leaves from a garden bonfire”

Origin

scrns9

late Middle English: from bone + fire. The term originally denoted a large open-air fire on which bones were burnt (sometimes as part of a celebration), also one for burning heretics or proscribed literature. Dr Johnson accepted the mistaken idea that the word came from French bon ‘good’.

The word was derived from bone and fire. A related tradition began in Great Britain: in 1605, the conspiracy to blow up the British Parliament was foiled. Guy Fawkes, the main suspect in the attempted plot, was then executed, and people have been celebrating with bonfires ever since.

There are many cultural traditions behind bonfires. In the Czech Republic, people light bonfires during a festival called “Burning the Witches,” a very old but still-observed folk custom and special holiday celebrating the coming of spring. In Nepal, a bonfire is almost synonymous with a campfire; people light them during the winter months. In India, especially in Punjab, people eat peanuts and sit around a bonfire during the festival of Lohri to celebrate the winter solstice. In Japan, people dance around a bonfire to mark the end of the O-Bon season.

Today, people start campfires at campsites to provide heat for cooking or to keep insects and predators away.

All of these traditions share the same purpose: bringing people together around fire. What is it about fire? Fire has always been an important part of human life. There is an interesting article about the human relationship with fire in the context of Western civilization on this page: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4874404/

On Lilian’s side of research of ‘unplugging’:

Digital Detox / Unplugging

We are both interested in the concept of bringing people together and away from technology. 

Our initial introduction to audio and mic input came from the self-portrait exercise. 

There is an ongoing trend to unplug, or “digital detox”: people are interested in experiencing “real life” and minimalism, such as Japanese minimalism or hygge.

https://www.countryliving.com/life/a41187/what-is-hygge-things-to-know-about-the-danish-lifestyle-trend/

Books:

scrns10

  • Strains on relationships for people who are too plugged in
  • The concept and popularity of unplugging
  • The history of the bonfire and coming together to discuss
  • Deeper relationships: looking people in the eyes, pushing your body forward and upright

Unplugging is a privilege in digitally divided and hyper-connected societies. The term “digital divide” implies that the worldwide, explosive growth of the Internet and data (Kitchin, 2014) is an uneven, multidimensional phenomenon. 

Unplugging is a subtle notion that is emerging as a contestation to the dominant technocratic mode of urban governance (Kitchin, 2014); that is, the so-called Smart City model, which demands a transition to overcome the social tensions and misalignments caused by hyper-connected societies.

https://www.tandfonline.com/doi/full/10.1080/10630732.2014.971535

  • The time spent online per week has more than doubled, from 8 hours to 18.9 hours (Ofcom, 2015)
  • Goldilocks Hypothesis: the “just right” amount of moderation and screen use
  • Not to deprive people of important social information and peer pursuits
  • Not to displace meaningful analogue pursuits
    • Differences between sedentary and non-sedentary activities (watching a movie or browsing social media vs. actively engaging with people online)

A Large Scale Test of the Goldilocks Hypothesis: Quantifying the Relations Between Digital Screens and the Mental Well-Being of Adolescents

(https://ora.ox.ac.uk/objects/uuid:672ebf78-4b9a-42d3-8e81-8bc2561dce11/download_file?safe_filename=Przbylski%2Band%2BWeinstein%252C%2BLarge%2Bscale%2Btest%2Bof%2Bthe%2BGoldilocks%2Bhypothesis%2B-%2BQuantifying%2Bthe%2Brelations%2Bbetween%2Bdigital%2Bscreens.pdf&file_format=application%2Fpdf&type_of_work=Journal+article)

Calming/Relating/Clearing your mind apps:

  1. AmbiPro
  2. Calm
  3. Headspace

And our combined journal of our progress so far:
(DISCLAIMER: we blended our journal together as a more collaborative approach, so some of these words are of Lilian Leung’s and some are mine)

Development

Inspiration
https://www.youtube.com/watch?v=D-CrRpQ80aw (Pause App – Inspiration)

Day XX – Sept XX

Originally we tried working with a gradient built on RGB, though while digging into controlling the gradient and switching values, I [Lilian] wasn’t quite comfortable yet working with multiple values once we needed them to change based on input.

scrns11

Instead, we began developing a set of gradients we could use as transparent PNGs. This gave us more control over how they looked and made the gradients more dynamic and easier to manipulate.

Initial testing of the gradients and proof of concept of having the gradient grow based on micInput. 

While Lilian was working on the gradients of the fire, I [Nilam] was trying to figure out how to add the microphone input and make the gradient respond to the volume of the mic input. So I used mapping.

scrns1

The louder the input volume, the higher the red value gets and the redder the screen becomes. This way we can just change the background to a raster image, and instead of lowering the RGB values to 0 to create black, it changes its opacity to 0 to reveal the darker gradient image behind it.
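As a rough sketch of what that mapping does (written as plain JavaScript so it runs outside p5.js; in the real sketch the level comes from p5’s mic.getLevel(), and the exact output ranges here are just illustrative guesses, not our final values):

```javascript
// Same formula as p5's map(): linear interpolation from one range to another.
function mapRange(value, inMin, inMax, outMin, outMax) {
  return outMin + ((value - inMin) * (outMax - outMin)) / (inMax - inMin);
}

// Hypothetical helper: turn a mic level (0–1) into a red value and an
// overlay opacity. Louder input -> redder screen, and the dark overlay
// fades out (alpha toward 0) so the brighter gradient shows through.
function micToFire(level) {
  const red = mapRange(level, 0, 1, 50, 255);
  const overlayAlpha = mapRange(level, 0, 1, 255, 0);
  return { red, overlayAlpha };
}
```

In the actual p5 sketch, draw() would call something like micToFire(mic.getLevel()) each frame and feed the results into tint() or fill().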

scrns7

scrns8

I [Nilam] made edits to Lilian’s version of the experiment and integrated my microphone input and mapping into the interface she had already developed.

Day XX – Friday, September 19, 2019

Our Challenges

We were still trying to figure out why mic and audio input and output were working on our laptops but not on our phones. The translation of mic input into increasing the size of the fire seemed laggy, even though we tried resizing our images. 

On our mobile devices, the deviceShaken function seemed to be working; while laggy on Firefox, playing the sketch in Chrome gave better, more responsive results.

Another issue was that once we started changing the transition of the tint in our sketch, the deviceShaken event would sometimes stop working entirely.

We wanted a less abrupt, smoother transition from the microphone input, so we tried to figure out whether there were functions like delay. We couldn’t find anything, so we decided to try using if statements instead of mapping.

We found out from our Google searches that a bug may have stopped certain p5.js functions like deviceShaken from working after the iOS update this past summer: while laggy, it still worked on Lilian’s Android phone, but it never worked at all on my [Nilam] iPhone.

Lilian – working on additional functions like mobile rotation and acceleration to finesse the functionality of the experiment.
Nilam – working on creating a smoother transition for the gradient fade by using if statements and acceleration instead of mapping.
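A tiny sketch of the “if statements instead of mapping” idea: instead of jumping straight to the mapped value, nudge the current value a little toward the target every frame. The function name and step size here are placeholders for illustration, not our actual code:

```javascript
// Move `current` toward `target` by at most `step` per call (per frame),
// clamping so it never overshoots. Calling this every draw() frame gives
// a gradual ease instead of an abrupt jump.
function stepToward(current, target, step) {
  if (current < target) {
    return Math.min(current + step, target);
  } else if (current > target) {
    return Math.max(current - step, target);
  }
  return current; // already at the target
}
```

In a p5 draw() loop this might look like fireSize = stepToward(fireSize, targetSize, 2), where targetSize is derived from the mic level.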

The rest of the project is to be continued in part 2.

A lot of updates: Veebo

Wednesday is the opening of our show at Open Works, and I’m wrapping things up for the show.

There are a lot of things that did not happen or go as smoothly as I’d have liked, like the TFT LCD screen and the Arduino code. A lot of changes needed to be made, and while Alan helped me with the coding, there wasn’t enough time to work with him, and I’m not knowledgeable enough in programming to troubleshoot things on my own.

But there are still a lot of things that were accomplished! As shown in a previous update, I’m pretty pleased with where the breathing mechanism is at. And the physical body of the robot is almost there, too.

that’s the head of the robot

the body is still currently in the making

1

that’s the gear of the motorized breathing mechanism that goes into the robot’s chest

3

and this is the armature for the chest

IMG_3449

tried to print the breathing mechanism on the Formlabs, but I didn’t give it enough supports, and this is what happened

IMG_3602

this is the prototype of the chest movement

I also drew a diagram to show what component goes where:

veebo

Warm Robot: More Progress

This week’s progress covers the robot’s breathing mechanism, heartbeat, and circuitry.

I tested the motors for the breathing mechanism and heartbeat with simpler prototypes, and tried out how much voltage each of them needs to produce different ranges of breaths per minute and beats per minute. These are the notes I wrote down:

IMG_3248

I gathered the necessary speeds according to my prior research; below is a table of respiratory and heart rates for different species and states:

Screen Shot 2017-11-12 at 8.41.45 PM

I also made a casing for the motor, as well as a smoother gear for the breathing movement.

(I tested different egg shapes for better timing of the breathing movement, and found that slight differences create a huge timing gap between them)

(evolutions of the prototypes)

That’s it for now, more updates to come.

Update: Process and progress + timeline

The trip to the Netherlands (Dutch Design Week, Willem de Kooning Academie, and Amsterdam) was a B L A S T!! A report of the trip will be posted soon. For now, I want to keep my CPJ updated on my project!

It’s been a while since I last wrote about the progress of my project, so I think it’s a good idea to explain from the beginning again.

I think of robots as potential beings. Our technology is not quite there yet, but I believe it won’t be long before robots and machines run autonomously. Today, robots are not only tools for us; they are also our companions and collaborators. How do people feel about that? What exactly are our relationships to robots in the past, today, and in the future?

Therefore, I am working on a companion robot that could potentially become a therapy robot for children with anxiety.

This little buddy, which has no name yet, will run on an Arduino Mega with a heart pulse sensor and several outputs, such as voice, breathing movement, heartbeat sound, and animated eyes that react to your heart rate. The body will be made mainly of wood. Below are character design studies and technical sketches for fabrication.

Screen Shot 2017-11-04 at 9.09.46 PMScreen Shot 2017-11-04 at 9.09.04 PMScreen Shot 2017-11-04 at 9.08.45 PMScreen Shot 2017-11-04 at 9.09.13 PM

list of things to do (typed out version was posted in previous post)

 

after a lot of figuring out around the electronics, I finally started prototyping the body and the mechanical movements that are going to be implemented in it

I used insulation foam from Home Depot as the material for prototyping the main body parts. The planned size and shape feel right, and so far there has been no problem with the amount of room inside the body for the electronic parts. Next, I need to figure out how to build the shape out of wood and mount the electronic pieces onto the body.

I milled the material I got, so it’s ready to be laminated and then turned on the lathe to create the shape I’m going for.

The breathing mechanism is coming together!!!! The prototype started out as foam core, tape, and wire, then a burnt chipboard that never got put together, and finally more presentable laser-cut plywood to show and test shapes for the appropriate breathing-in and breathing-out movement of the chest. The laser-cut breathing hinges seem to work well for showing the movement in the prototypes; however, they are not moving the way I want the chest to expand. Annet suggested using neoprene for a more even stretch. Ryan also suggested creating an armature with foam to even out the distribution of pressure onto the heart-shaped chest.

And today I had a meeting with Alan to test the code he wrote to smooth out the heartbeat reading, set a debounce, and spread the reading across multiple outputs. We’re still having trouble with the LCD screen. For some reason, it works just fine when it’s plugged directly into the Arduino board, but not when it’s connected through cables. Maybe the connection isn’t strong enough? Alan said there are multiple possible reasons this could be happening, and he’s going to help me figure it out. So now I’ll be working on my audio file and circuit, as well as the eye expressions, while he troubleshoots the LCD screen connection.
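To jot down the two ideas from that meeting in a form I can play with (this is my own JavaScript sketch, not Alan’s actual Arduino code; all names and numbers are placeholders):

```javascript
// Smooth a noisy sensor reading with a simple moving average: each sample
// is averaged with up to (windowSize - 1) samples before it.
function movingAverage(samples, windowSize) {
  return samples.map((_, i) => {
    const start = Math.max(0, i - windowSize + 1);
    const window = samples.slice(start, i + 1);
    return window.reduce((a, b) => a + b, 0) / window.length;
  });
}

// Debounce: returns a function that only fires (returns true) if at least
// minGapMs has passed since the last firing, so one heartbeat's worth of
// threshold crossings isn't counted as several beats.
function makeDebouncedTrigger(minGapMs) {
  let lastFired = -Infinity;
  return (nowMs) => {
    if (nowMs - lastFired >= minGapMs) {
      lastFired = nowMs;
      return true;
    }
    return false;
  };
}
```

On the Arduino, the same debounce idea would use millis() instead of a passed-in timestamp.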

I’m so psyched for this project! So far, things are going pretty close to the original timeline; nothing is going too wrong just yet.

UNRAVEL THE CODE, NILAM’S TIMELINE

Warm robot: heartbeat sensor

From now on, I’m gonna refer to my final project as Warm Robot; just a temporary name, but it makes it easy to track the project’s progress.

After purchasing the pulse sensor, I tested it with an LED as an output. It reads my heartbeat pretty well, especially when I hooked it up to my ear.

IMG_2137

(it came with an ear clip!)

I wrote the code for the analog serial reader and added an LED as an output (plus an additional LED to test multiple outputs). Here’s the code:


int Threshold = 520; // the turning point when the heart beats
int LED1 = 13;       // first LED, to indicate each beat
int LED2 = 5;        // another output in correspondence to the first LED

void setup() {
  Serial.begin(9600);
  pinMode(LED1, OUTPUT); // the LED pins need to be set as outputs
  pinMode(LED2, OUTPUT);
}

// the loop routine runs over and over again forever:
void loop() {
  int Pulse = analogRead(A0); // where I plugged in my pulse sensor

  // print out the value you read:
  Serial.println(Pulse);

  if (Pulse > Threshold) {
    digitalWrite(LED1, HIGH);
  } else {
    digitalWrite(LED1, LOW);
  }
  delay(10);

  // mirror LED1's condition; my earlier "if (LED1, HIGH)" was a bug,
  // since the comma expression is always true
  if (Pulse > Threshold) {
    digitalWrite(LED2, HIGH);
    delay(2);
    digitalWrite(LED2, LOW);
    delay(2);
    digitalWrite(LED2, HIGH);
    delay(2);
    digitalWrite(LED2, LOW);
    delay(4); /* I made sure the total of the delays is no more than
  LED1's delay */
  } else {
    digitalWrite(LED2, LOW);
  }
}

 

Then I opened the serial plotter to see a graph of the values from the pulse sensor. I googled and looked at code other people have written to find the best way to count each heartbeat, and so far setting a threshold seems like the simplest approach that worked for me. I wonder if there’s a way to count each +apex and -apex? Is that even possible? I think so? I’ll need to consult someone about this.
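One partial answer I sketched out (in JavaScript for testing, not the Arduino code above; countBeats is a made-up helper): instead of counting every sample above the threshold, count only the rising edge, the moment the signal first crosses it. That counts each +apex once.

```javascript
// Count heartbeats in a list of sensor samples: a beat is registered only
// when the signal crosses from below the threshold to above it (rising
// edge), so a long stretch above threshold still counts as one beat.
function countBeats(samples, threshold) {
  let beats = 0;
  let above = false; // whether the previous sample was above the threshold
  for (const s of samples) {
    if (s > threshold && !above) {
      beats++;      // first sample over the threshold: one new beat
      above = true;
    } else if (s <= threshold) {
      above = false; // reset once the signal falls back down
    }
  }
  return beats;
}
```

The same state-flag trick works on the Arduino with a boolean that persists across loop() iterations; counting the -apex would just be the mirrored falling-edge check.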

IMG_2138 copy

mmmm yaA i’m alive

IMG_2138

IMG_2134

It seems to be working with the LEDs. I tried a piezo for sound output, but it doesn’t seem to be working.. I thought it would work if I just changed it from digital to analog. Regardless, it’s a step forward!! Let’s see what else I can do before class on Wednesday!

Understanding Robots: dissection

I’ll start with the good news: I got my Schengen visa a couple of days ago! That means I’ll be traveling to the Netherlands to check out Dutch Design Week and collaborate with students at WDKA!!

So I managed to consult Ryan, Annet, Alan, and Margaret about my project. I got great feedback, and now I know which direction I can and should take. I will run into a lot of troubleshooting with the wiring and programming, but there’s nothing stopping me from building robots, something I’ve always wanted to build.

However, I’m still nowhere remotely close to confident with electronics. Alan told me to just write down what I need to do and go at it. Today in thesis, my GTI told me the same thing. But I personally can’t just ‘go at it’ without a general understanding of what does what! So today I decided to open an old cassette recorder/player (found in the e-recycling bin) that’s only half working, to see if I could fix it or harvest some parts from it.


I did practice harvesting parts from broken electronics in my intro to robotics class, but it was pretty hard to understand which parts I could use, because most of the electronics I found use more modern sensors instead of simple buttons, motors, and gears here and there. So opening an old, familiar cassette player was helpful.

So everything was working fine except the motor assembly of the cassette player.


The rubber band that moved the gears was becoming loose and wouldn’t stay in place, and the rewind button doesn’t switch the slot smoothly to change the direction of the gears. I figured they might sell these rubber bands online, but finding a replacement isn’t worth the time. The speakers, though, work just fine.


From the front, I harvested two healthy stereo speakers and a board with small amplifiers and a potentiometer that I might desolder later. From the back, I got multiple gears, a motor, and a tape head that might or might not work.


This main circuit board, with its bunch of microcontrollers, is something I’m not familiar with yet. But it’s amazing how everything is packed neatly in here, creating a functional device to play and record music.


So in the end, this was my harvest:

  • two healthy stereo speakers
  • a dc motor
  • multiple gears and rods
  • small potentiometer (on chipboard)
  • two 2073b JRC amplifiers (on chipboard)
  • a tape head (working or not???)
  • empty player shell for repurpose!

I’m very happy with the harvest, and I now have a better understanding of how simple robots work!

the uncanny valley: where sympathy is absent

The uncanny valley is definitely something I have to consider while planning this project. My goal is to make the people who interact with my project feel both comfort and discomfort at the same time, to give them space in their heads to question the emotional responses they feel.

While I have been going on and on about what I want to do with my project, I don’t think I’ve ever written here how I am going to approach it.

My idea is to make a robot that measures one’s anxiety/nervousness by receiving biofeedback from the viewer through a pulse sensor, and that interacts with that person based on the heart rate the sensor detects. How am I gonna do it? I don’t know, but I have a couple of ideas, and I.. might have gone and bought some materials I could use without making sure this would work or not. However, you never know until you try.

So how am I gonna build that robot? I’m no programmer or robotics engineer; I’m an artist. I work backwards. I create a character: the body, the case for the electronic components that are gonna run behind a cute character. Why a cute character, though?

So my goal is to capture sympathy, or emotional attachment, from the viewer through things they’re already familiar and comfortable with. We love cute stuff. We like things we can relate to, so if we go back to our uncanny valley chart, I want my robot to physically appeal at this point, the apex of familiarity and comfort:

uncanny1

Then, of course, there would be the discomfort of knowing that a non-human, half-inanimate object is trying to understand our emotions; emotionally, that would fall under the uncanny valley:

uncanny2

But if these two qualities are put together in one entity, it could create an interesting new point in the uncanny valley, a higher appealing apex:

uncanny3

So I started with sketches of my character and the basics of how the input and output would look.

IMG_1971IMG_1973IMG_1975I will explain more about this robot in my next CPJ update with a more refined details about the sketches! There’s a lot of thinking and conversation that happened during the brainstorming of this little guy that I realized I didn’t have time to put in the CPJ, so I’ll make sure I’ll fill in the gap later when I get the time to visualize and write things.