The Choir of 3D Printers

It was 2018, and I was working as a digital fabrication technician at a maker space in Baltimore, Maryland, where I had to tend to twelve 3D printers in a room no bigger than ten by ten feet. The room was clean and neatly organized: two computers and eight Ultimaker printers stacked on a movable shelf on the right side by the door, stacks of filament and four Prusa printers on the desk by the wall opposite the door, and a repair tool kit and one in-progress giant experimental printer to the left of the door.

To give you some context, this was one of the most depressing times of my life. I was fresh out of undergrad, working two part-time jobs and doing freelance work, uncertain about my graduate school applications and funding, worried sick about my ill mother at home, and desperately trying to find a design studio willing to give me a work visa within four months or else I would get kicked out of the country. Things were not so hot. Every time I clocked in, I dreaded having to deal with customers who often blamed the 3D printers’ failures on us, the technicians.

It was slow and quiet that day, but I was emotionally exhausted from all my personal baggage. So I rested my head on the desk, closed my eyes, and thought of nothing, when it hit me: they were singing. By that time, I had worked with 3D printers long enough to recognize the sound they make while printing, and I had made a few offhanded comments about it. However, it was not until that day that it felt like I was listening to the 3D printers’ performance. It was a robotic choir.

Perhaps it would be an insult to performers to call a group of unorganized sound-making machines a choir, but the sound those 3D printers were making was very soothing to me, and it felt like one. It might also be my bias and affinity toward machines that made me come to this conclusion. Confined in a small room amidst nothing but my depression and twelve moving 3D printers, I subconsciously looked for any kind of comfort the situation had to offer, and it was in the sound of twelve 3D printers that I found it.

I was aware that the sound was not an intended product of the 3D printers; it just is. Of course, they were not ‘singing’; it was my human-centric mind that projected the idea of singing onto those machines. They were just doing their jobs, commanded by us, printing filament layer by layer to bring a digital design into our physical life. It was my human mind that wanted to believe they were singing as a choir. They were not aware of anything they were doing.

But regardless of intention, it cheered me up. The twelve 3D printers could not see, hear, nor feel, yet the presence of each of them made me feel better about my life in that moment. It is perhaps romantic, and considering where we are today with technology and the direction it’s moving toward, it might as well be dangerous. I was projecting onto and seeking empathy from machines that cannot and will not reciprocate my human feelings, for they are not and will never be human.

Wood Breakage, Machine Fallibility

I think what makes humans so interesting is the fact that we’re all so flawed. Flaws might seem undesirable to most people, as many consider them faults or weaknesses. To me, flaws are endearing, whether in humans, wood, or machines. Just because one is incapable of surpassing their own limits at the time does not mean they are broken. The beautiful thing about being human is that our limits change over time; they fluctuate. People learn and fail. Fail, fail, succeed, and fail again. One’s success might look like failure to another, and vice versa.

About a week ago, I had a meeting with Stan Krzyzanowski, an undergraduate faculty member at OCAD University. He teaches first-year sculpture classes and introduction to wood, just like Ken Martin did at MICA. We met at his small office in the main building near the second floor cafe. He remarked that he had just gotten a new computer for his office.

He told me that he thought he was a furniture maker, and that was what he called himself for many years, until one day he realized it was not something he wanted to do. He was interested in the expansion and cracking of the material he was working with, which was wood. Then he started studying the breaking of the material. He showed me pictures and videos of his works, but some of the links and videos kept breaking on him because the new computer doesn’t play Flash, so he couldn’t show me the pieces he wanted to, one of them being this piece, a pine cone that showers itself with water the moment it dries out.

ezgif-1-d7b0311400ac
(Cone Oscillator by Stan Krzyzanowski)

I was talking to him about how I want to collaborate with my material; while I was working with wood as my material, I felt like I understood its limits, but I don’t feel that way with technology. Then I showed Stan a piece that I did in undergrad, Memento Mori. When I was working on this piece, I realized for the first time that I couldn’t manipulate wood into whatever I wanted forever; one day, it would tell me ‘no’, break, and spring back on me. Memento Mori Part II was the piece that taught me that.

cc5e7549763961.58be31a41d85f

(Memento Mori, Part II by Nilam Sari)

But Stan asked me if it was really a collaboration if I kept letting it do whatever it wants to do. To which I replied, “What do you mean?” Stan turned to his computer and clicked on the link that he knew was not working, showing a blank page. He turned to me and asked, “I mean, if the screen is not working, is it a collaboration with technology?” I just stared at it. It’s not, is it?

Then he showed me this piece, where he put together a bunch of simple recordings of a spinning metal faceplate from the lathe into HTML. All of the videos are of the same recording, but the limits of the old computer processor he had at the time played the videos at different speeds and timings.

spinning_lathe_faceplate_2_meta2_h-264
(Spinning Lathe Faceplate Grid Video by Stan Krzyzanowski)

And I realized that in my more recent undergraduate piece, “Permanent Address,” I worked with the wood carefully. I bent it but never broke it. I made it into something it was not, but I definitely did not force it to be what it did not want to be. I might not have realized it when I was talking to Stan, but I think I see it now.

dsc_0141

(Permanent Address by Nilam Sari)

And in regards to technology, I don’t think I have found its limits and breaking points yet. I like machines, but I don’t know why I like them yet. Why do I like it when machines glitch? Is it because I like flaws? Vulnerability? Can I tell the difference between a glitch and a bug yet? I don’t know, I don’t know yet. Stan told me that if a material speaks to me, then I’ve got to do something with it. I might not know today, but I will learn more and more every day from it, and that’s what makes it fun.

I think machines can be as flawed as human beings. That flaw perhaps isn’t something that can be programmed, but something that is found. If a machine does what it was told to do, then it is not flawed; it is imitating a flaw. Maybe I should get more attuned to everyday machines, get cheap robots, go to Best Buy and watch a Roomba or something. I might find more machine fallibility in everyday life.

I wonder what it is with me, my practice, and my obsession with finding a living quality in machines. Is finding these qualities in other human beings not enough for me? Why do I want machines to appear to be alive? That’s another topic for another post. I personally have not found the answers to these questions, but I do find joy when it happens. I think as an artist, it is my job and my joy to find something meaningful behind everyday mundane things.

bubble blower

I found this unique object today

 

it’s an old bubble blower

In my last post I was talking about observing my surroundings to see if I could find inanimate objects that have a living quality for my research. I find this old bubble blower to have that quality. The movement itself might seem mechanical, but the fact that it does not blow bubbles successfully from every hole, the odd unfamiliar shape, the size, and the worn, aging marks on the machine make it feel a little human.

What a weird little being, I thought. It also doesn’t help that it has two round fans that make it look like it has a pair of eyes.

I hope to keep observing and find more intriguing objects like this.

Collaborama Pt.1

So I’m currently in a collaboration with my colleague and new friend, Lilian, to work on a project together using our P5.js knowledge so far as our toolbox. The challenge is to create a group activity that utilizes 21 screens together.

After brainstorming a couple of ideas and possibilities within our limitations, Lilian and I came up with an idea about ‘unplugging’ and giving our full attention to the people around us without the distraction of screens, except that it is facilitated by screens and our P5.js app. Our app creates a phone campfire.

From my research on campfires: a campfire is a casual ritual performed today at campsites to deter predators and pests, or simply to provide warmth and comfort. The idea came from bonfires, which are more ceremonial.

Google definition:

Bon·fire
/ˈbänˌfī(ə)r/

noun

noun: bonfire; plural noun: bonfires

  1. a large open-air fire used as part of a celebration, for burning trash, or as a signal.
    “the smell of burning leaves from a garden bonfire”

Origin

scrns9

late Middle English: from bone + fire. The term originally denoted a large open-air fire on which bones were burnt (sometimes as part of a celebration), also one for burning heretics or proscribed literature. Dr Johnson accepted the mistaken idea that the word came from French bon ‘good’.

The word was derived from bone and fire. The best-known bonfire tradition began in Great Britain: in 1605, the conspiracy to blow up the British Parliament was foiled, and Guy Fawkes, the most famous of the conspirators, was executed. Since then people have celebrated Bonfire Night by lighting bonfires and burning effigies of him.

There are many cultural traditions around bonfires. In the Czech Republic, people start bonfires during a festival called “Burning of the Witches,” a very old but still observed folk custom and special holiday to celebrate the coming of spring. In Nepal, a bonfire is almost synonymous with a campfire; people light them during the winter months. In India, especially in Punjab, people eat peanuts and sit around the bonfire to celebrate the festival of Lohri, marking the winter solstice. In Japan, people dance around a bonfire to mark the end of the O-Bon season.

Today people start campfires at campsites to provide heat for cooking or to keep insects and predators away.

All of them share the same purpose, which is to bring people together around fire. What is it about fire? Fire has always been an important part of human lives. There is an interesting article about the human relationship with fire in the context of Western civilization on this page: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4874404/

On Lilian’s side of the research, on ‘unplugging’:

Digital Detox / Unplugging

Both interested in the concept of bringing people together and away from technology. 

Initial introduction to audio and mic input from self-portrait exercise. 

The ongoing trend to unplug or “digital detox”: people are interested in experiencing “real life” and minimalism, such as Japanese minimalism or hygge

https://www.countryliving.com/life/a41187/what-is-hygge-things-to-know-about-the-danish-lifestyle-trend/

Books:

scrns10

  • Strains on relationships for people who are too plugged in
  • The concept and popularity of unplugging
  • The history of the bonfire and coming together to discuss
  • Deeper relationships: looking people in the eyes, pushing your body forward and upright

Unplugging is a privilege in digitally divided and hyper-connected societies. The term “digital divide” implies that the worldwide, explosive growth of the Internet and data (Kitchin, 2014) is an uneven, multidimensional phenomenon.

Unplugging is a subtle notion that is emerging as a contestation to the dominant technocratic mode of urban governance (Kitchin, 2014)  that is, the so-called Smart City model that demands a transition to overcome the social tensions and misalignments caused by hyper-connected societies.

https://www.tandfonline.com/doi/full/10.1080/10630732.2014.971535

  • The time spent per week has more than doubled, from 8 hours to 18.9 hours (Ofcom, 2015)
  • Goldilocks Hypothesis: the “just right” amount of moderation and screen use
  • Not to deprive people of important social information and peer pursuits
  • Not to displace meaningful analogue pursuits
    • Differences between sedentary and non-sedentary activities (watching a movie or browsing social media vs. actively engaging with people online)

A Large Scale Test of the Goldilocks Hypothesis: Quantifying the Relations Between Digital Screens and the Mental Well-Being of Adolescents

(https://ora.ox.ac.uk/objects/uuid:672ebf78-4b9a-42d3-8e81-8bc2561dce11/download_file?safe_filename=Przbylski%2Band%2BWeinstein%252C%2BLarge%2Bscale%2Btest%2Bof%2Bthe%2BGoldilocks%2Bhypothesis%2B-%2BQuantifying%2Bthe%2Brelations%2Bbetween%2Bdigital%2Bscreens.pdf&file_format=application%2Fpdf&type_of_work=Journal+article)

Calming/Relating/Clearing your mind apps:

  1. AmbiPro
  2. Calm
  3. Headspace

And our combined journal of our progress so far:
(DISCLAIMER: we blended our journals together as a more collaborative approach, so some of these words are Lilian Leung’s and some are mine)

Development

Inspiration
https://www.youtube.com/watch?v=D-CrRpQ80aw (Pause App – Inspiration)

Day XX – Sept XX

Originally we tried working with a gradient built on RGB values, though while digging into controlling the gradient and switching values, I [Lilian] wasn’t quite comfortable yet with working with multiple values once we needed to have them change based on input.

scrns11

Instead we began developing a set of gradients we could use as transparent PNGs; this gave us more control over how they looked and made the gradients more dynamic and easier to manipulate.

Initial testing of the gradients and proof of concept of having the gradient grow based on micInput. 

While Lilian was working on the gradients of the fire, I [Nilam] was trying to figure out how to add the microphone input and make the gradient correspond to the volume of the mic input. So I used mapping.

scrns1

The louder the input volume, the higher the red value gets and the redder the screen becomes. This way we can just change the background to a raster image, and instead of lowering the RGB values to 0 to create black, the sketch changes the opacity to 0 to reveal the darker gradient image behind it.
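Since the code itself only lives in the screenshots here, a minimal sketch of the idea in p5.js looks roughly like this (not our exact code; it assumes the p5.sound library is loaded, and the PNG filenames are made up):

```javascript
// rough sketch of the mic-to-gradient idea, not our exact code
// 'dark.png' and 'fire.png' stand in for Lilian's gradient PNGs (assumed filenames)
let mic;
let fireImg, darkImg;

function preload() {
  fireImg = loadImage('fire.png'); // bright gradient on top
  darkImg = loadImage('dark.png'); // darker gradient behind it
}

function setup() {
  createCanvas(windowWidth, windowHeight);
  mic = new p5.AudioIn();
  mic.start();
}

function draw() {
  let vol = mic.getLevel(); // roughly 0.0 to 1.0
  // louder input -> redder tint, more opaque and taller flame
  let redValue = map(vol, 0, 0.3, 0, 255, true);
  let alpha    = map(vol, 0, 0.3, 0, 255, true);
  let flameH   = map(vol, 0, 0.3, height * 0.4, height, true);

  image(darkImg, 0, 0, width, height);   // dark gradient always in the back
  tint(redValue, 80, 40, alpha);         // opacity drops toward 0 when it's quiet
  image(fireImg, 0, height - flameH, width, flameH);
  noTint();
}
```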

scrns7

scrns8

I [Nilam] made edits to Lilian’s version of the experiment and integrated my microphone input and mapping into the interface she had already developed.

Day XX – Friday, September 19, 2019

Our Challenges

We were still trying to figure out why mic and audio input and output were working on our laptops but not on our phones. The translation of mic input into an increase in the size of the fire seemed laggy, even though we retried resizing our images.

On our mobile devices, the deviceShaken function seemed to be working; while it was laggy on Firefox, playing the sketch in Chrome gave better, more responsive results.

Another issue was that once we started changing the transition of the tint in our sketch, deviceShaken would sometimes stop working entirely.

We wanted a less abrupt, smoother transition from the microphone input, so we tried to figure out if there were functions like delay. We couldn’t find anything, so we decided to try using if statements instead of mapping.

We found out from our Google searches that there is possibly a bug that stopped certain p5.js functions like deviceShaken from working after the iOS update this past summer, because, while laggy, it still worked on Lilian’s Android phone but never worked at all on my [Nilam] iPhone.

Lilian – working on additional functions like mobile rotation and acceleration to finesse the functionality of the experiment.
Nilam – working on creating a smoother transition of the gradient fading by using if statements and acceleration instead of mapping (a rough sketch of that idea is below).
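For the record, the incremental if-statement idea looks roughly like this (a hedged sketch, not the finished code; the thresholds and step sizes are made up, and a plain shape stands in for the gradient PNG):

```javascript
// hedged sketch of the "if statements instead of mapping" smoothing idea:
// rather than jumping straight to the mapped value, the fire level creeps
// up or down a little every frame, which reads as a smoother fade
let mic;
let fireLevel = 0; // 0 to 255, drives the brightness/opacity of the fire

function setup() {
  createCanvas(windowWidth, windowHeight);
  mic = new p5.AudioIn();
  mic.start();
}

function draw() {
  let vol = mic.getLevel();

  if (vol > 0.05) {
    fireLevel += 3; // louder than the (made-up) threshold: grow gradually
  } else {
    fireLevel -= 1; // quiet: die down even more slowly
  }
  fireLevel = constrain(fireLevel, 0, 255);

  background(0);
  noStroke();
  fill(255, fireLevel * 0.4, 0, fireLevel); // stand-in for the gradient image
  ellipse(width / 2, height * 0.75, 200 + fireLevel, 300 + fireLevel);
}

// the shake/acceleration part would hook in here on mobile,
// e.g. shaking could knock the fire back down (we haven't settled on this yet)
function deviceShaken() {
  fireLevel = max(0, fireLevel - 50);
}
```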

The rest of the project is to be continued in part 2.

Ordinary Computation

Today we learned more about P5.js in class. It was a lot of information in one session, and I couldn’t grasp all of the material because I was still trying to understand the parts that were explained at the beginning. I personally think Nick was going too fast, though I understand why he needed to. We’ve only got 2 years in grad school!

I’ve had this conversation with a friend, Allan Doyle, before, but I just remembered again that learning coding is just like learning a language. And we’re trying to understand it over such a short amount of time. But I think the upper hand of being in a learning space is that we can consistently dedicate a huge chunk of our time to learning it.

Learn it like a language. Learn it as if it’s ordinary. I read the introduction of Ordinary Affects by Kathleen Stewart today. What is an ordinary affect? I tried to break it down by word definitions, as I usually do to understand a word or a phrase:

or·di·nar·y
/ˈôrdnˌerē/
adjective
adjective: ordinary
  1. with no special or distinctive features; normal.
    “he sets out to depict ordinary people”
    synonyms: usual, normal, standard, typical, stock, common, customary, habitual, accustomed, expected, wonted, everyday, regular, routine, day-to-day, daily, established, settled, set, fixed, traditional, quotidian, prevailing
    “the ordinary course of events”
    antonyms: abnormal
    • uninteresting; commonplace.
      “ordinary items of everyday wear”
      synonyms: average, normal, run-of-the-mill, standard, typical, middle-of-the-road, common, conventional, mainstream, unremarkable, unexceptional, unpretentious, modest, plain, simple, homely, homespun, workaday, undistinguished, nondescript, characterless, colorless, commonplace, humdrum, mundane, unmemorable, pedestrian, prosaic, quotidian, uninteresting, uneventful, dull, boring, uninspiring, bland, suburban, hackneyed, stale, mediocre, middling, indifferent
  2. (especially of a judge or bishop) exercising authority by virtue of office and not by delegation.
noun
noun: the ordinary; noun: ordinary; plural noun: ordinaries; noun: Ordinary; plural noun: Ordinaries
  1. what is commonplace or standard.
    “their clichés were vested with enough emotion to elevate them above the ordinary”
  2. BRITISH LAW
    a person, especially a judge, exercising authority by virtue of office and not by delegation.
    • US
      (in some US states) a judge of probate.
  3. those parts of a Roman Catholic service, especially the Mass, which do not vary from day to day.
  4. HERALDRY
    any of the simplest principal charges used in coats of arms (especially chief, pale, bend, fess, bar, chevron, and saltire).
  5. ARCHAIC
    a meal provided at a fixed time and price at an inn.
  6. HISTORICAL, NORTH AMERICAN
    another term for penny-farthing.

scrns3

And interestingly, the word affect has three different meanings in three different contexts:

af·fect1
/əˈfekt/
verb
verb: affect; 3rd person present: affects; past tense: affected; past participle: affected; gerund or present participle: affecting
  1. have an effect on; make a difference to.
    “the dampness began to affect my health”
    synonyms: influence, exert influence on, have an effect on, act on, work on, condition, touch, have an impact on, impact on, take hold of, attack, infect, strike, strike at, hit
    antonyms: be unaffected
    • touch the feelings of (someone); move emotionally.
      “the atrocities he witnessed have affected him most deeply”
      synonyms: upset, trouble, hit hard, overwhelm, devastate, damage, hurt, pain, grieve, sadden, distress, disturb, perturb, agitate, shake, shake up, stir
      antonyms: be unaffected, be indifferent to, unaffecting, unmoving

scrns4

af·fect2
/əˈfekt/
verb
verb: affect; 3rd person present: affects; past tense: affected; past participle: affected; gerund or present participle: affecting
  1. pretend to have or feel (something).
    “as usual I affected a supreme unconcern”
    synonyms: pretend, feign, fake, counterfeit, sham, simulate, fabricate, give the appearance of, make a show of, make a pretense of, play at, go through the motions of
    • use, wear, or assume (something) pretentiously or so as to make an impression on others.
      “an American who had affected a British accent”
      synonyms: assume, put on, take on, adopt, like, have a liking for, embrace, espouse
      “he deliberately affected a republican stance”
scrns5

af·fect3
/ˈafekt,əˈfekt/

noun

PSYCHOLOGY
noun: affect
  1. emotion or desire, especially as influencing behavior or action.

 

scrns6

Being in a class called “Affect And Emotions In Practice,” I went into the reading with the presumption that, within the context of the class, the affect mentioned in “Ordinary Affects” primarily means the third definition of the word. But I would miss the entire point of this reading if I took that as the only meaning of the word here.

Order, rules, fixed, not special, habitual, common, and normal. It is what it is supposed to be and just is: ordinary. Nothing is out of place; it’s just there where it is supposed to be. What is it? Affect. But what is affect? “To make a difference to,” “to move someone emotionally,” “pretend to feel,” “pretentiously,” or “desire or emotion”? Perhaps it is all of them. They might seem to mean different things, but they make sense together in “Ordinary Affects.” As Stewart wrote:

“Ordinary Affects is an experiment, not a judgement. Committed not to the demystification and uncovered truths that support a well-known picture of the world, but rather to speculation, curiosity, and the concrete, it tries to provoke attention to the forces that come into view as habit or shock, resonance or impact. Something throws itself together in a moment as an event and a sensation: a something both animated and inhabitable.”

But of what something? Gestures, was it, that we talked about in class? Maybe it’s the poetry of everyday movements: the way someone touches their hair, the way your parents lick their thumb to flip the page of a newspaper, the way trees grow in directions that are recorded in their grain pattern, the tick-tock of a clock. Characteristics of the universe that are always affected by and affecting one another, in continuous motion, keeping the world rotating and revolving.

I don’t know. I would love to hear what other people have to say in class on Monday. I’m very excited for the discussions in this class. But ordinary, ordinary…. I also want to make coding something that is ordinary to me.

So I did a little more practice this afternoon, and I’m probably gonna watch videos and do more tonight. (You know I’d prefer a night out dancing at some old man bar with friends, but I haven’t made many friends just yet, and this is okay too.)

I tried to create a prototype of our group project that reacts to microphone input, but somehow it’s not working on mobile the way we intended for the context of the piece.

scrns1

https://editor.p5js.org/nilampwns/sketches/UGF7GFVCM

I tried to find a solution to it, but Google wasn’t much help this time around.
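One thing I did come across while searching (though I haven’t confirmed it fixes our exact sketch) is that mobile browsers suspend the audio context until the user taps the screen, so the mic can silently stay off; p5 has userStartAudio() for that case. A minimal sketch of the workaround:

```javascript
// possible workaround for mic input not starting on mobile (unverified for our piece):
// mobile browsers suspend the AudioContext until a user gesture,
// so we resume it on the first touch
let mic;

function setup() {
  createCanvas(windowWidth, windowHeight);
  mic = new p5.AudioIn();
  mic.start();
}

function touchStarted() {
  userStartAudio(); // resumes the suspended AudioContext after a tap
}

function draw() {
  background(0);
  let vol = mic.getLevel();
  noStroke();
  fill(255, 50, 0);
  ellipse(width / 2, height / 2, 50 + vol * 500);
}
```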

But I also want to show this other thing I worked on for fun,

scrns2

https://editor.p5js.org/nilampwns/sketches/mvB3bq2BG

This is also part of my sketches-that-make-sketches series. The dot goes up and down based on microphone input while the horizontal movement stays steady. It’s almost as if it’s making a graph of the voice input. It was very fun to make!
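The idea behind it is simple enough to sketch out here (a rough reconstruction, not the exact code in the sketch linked above; it assumes the p5.sound library is loaded):

```javascript
// rough reconstruction of the voice-graph sketch:
// x moves steadily, y follows the mic level, so the dots trace a graph
let mic;
let x = 0;

function setup() {
  createCanvas(600, 400);
  background(255);
  mic = new p5.AudioIn();
  mic.start();
}

function draw() {
  let vol = mic.getLevel();
  let y = map(vol, 0, 0.5, height, 0, true); // louder = higher on the canvas

  noStroke();
  fill(0);
  ellipse(x, y, 5, 5);

  x += 2;            // steady horizontal movement
  if (x > width) {   // wrap around and clear once it reaches the edge
    x = 0;
    background(255);
  }
}
```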

P.S. We learned how to use the webcam in our p5 sketches as well, and we captured this in class.

liam

It took four of us collaborators to make this pic happen. Thanks Liam, Jessie, and Lilian.

In an attempt to understand creative coding

So in my previous post I ranted about how I don’t feel connected to my new medium yet. I’m finding simple ways to connect with it better, starting small. I started with making what I already know how to make off the top of my head: a small drawing tool. So I did, and I messed around with it. It’s like I’m starting to make small sketches out of code. And here are the results of those sketches:

result

Untitled

Untitled3

untitled4

It’s just a bunch of shapes that create patterns.

here is the code on p5.js ; here is the second one ; the third one ; & the fourth one

> left click to clear
> the fourth one is voice activated!!
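If the links don’t open for you, the first one boils down to roughly this (a simplified version, not the exact code; the fourth, voice-activated one adds a p5.AudioIn and uses the mic level to drive the size or color):

```javascript
// simplified version of the little drawing tool:
// shapes follow the mouse and leave a trail; click to clear the canvas
function setup() {
  createCanvas(600, 600);
  background(240);
}

function draw() {
  noStroke();
  // color and size shift with position and time so the trail makes a pattern
  fill(mouseX % 255, mouseY % 255, 200, 150);
  ellipse(mouseX, mouseY, 20 + (frameCount % 30), 20 + (frameCount % 30));
}

function mousePressed() {
  background(240); // left click to clear
}
```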

 

5 Senses and 5 Levels of Intimacy: Do we need all 5 stimuli to evoke emotional response?

I’ve had this thought for a while, since we discussed ‘The Sympathy of Things’ in class. Ryan mentioned something about how he went to a conference a while back and heard people talking about and working on making the best, highest-definition, most realistic visual and audio simulations, but not much about the human sense of touch. I think today it’s safe to say that people have been trying to achieve that, through full-body virtual reality experiences, pressure, vibration, use of gravity, wearable technologies, etc. I think those are the beginnings of it.

I remember when I was 11, I started to learn how to download music and movies illegally (since my brother never let me borrow his cassettes and CDs, this condition forced me to be tech savvy), and at some point I wondered if we would ever be able to download smells and have scent devices like mp3 players. It’s been a decade and I haven’t heard a word about such research or development. If we are able to create visual and audio simulations, then what about our other senses: touch, smell, and taste?

The more I think about it, the more I think about our senses’ relationships to our memories and emotions. If I were to list down all 5 senses starting from least intimate to most intimate, it’d be like this:

  1. Sight
  2. Hearing
  3. Touch
  4. Smell
  5. Taste

And so far we have devices that can simulate what we see and what we hear. We can easily make digital copies of those and share them with whoever we want. And we are currently trying to create simulations for touch. I wonder how long it’d take, if it’s possible at all, to create simulations for smell and taste.

Both are such abstract concepts that even we have trouble describing them with today’s linguistic expressions. They’re also strong triggers for emotions and memories; if one day technology could make those simulations happen, it’d be such a strong instrument.

I’m still curious about it. But anyway, back to my project:

Is it necessary to activate all 5 of our senses to trigger an emotional response? I don’t think so. While I won’t deny that some of them are stronger stimuli than others, I believe we can use other things that are readily available in today’s technology to get an emotional response from an audience.

Based on a study conducted at Berkeley, human adults can only focus on and be conscious of four tasks at a time. If one were told to watch their breathing, be aware of their heartbeat, feel the texture on their fingertips, and listen to a voice, they would be lost in those four moments.

I can use this limit to my advantage in creating a false sense of full immersion with a simple machine.

the uncanny valley: where sympathy is absent

The uncanny valley is definitely something I have to consider while planning this project. My goal is to make the people who interact with my project feel both comfort and discomfort at the same time, to give them space in their heads to question the emotional responses they feel.

While I have been going on and on about what I want to do with my project, I don’t think I’ve ever written here how I am going to approach it.

My idea is to make a robot that measures one’s anxiety or nervousness by receiving biofeedback from the viewer through a pulse sensor, and to interact with that person based on the heart rate the sensor detects. How am I gonna do it? I don’t know, but I have a couple of ideas, and I... might have gone and bought a couple of materials that I could use without making sure whether this would work or not. However, you will never know until you try it.
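To think it through, here is a very rough sketch of the decision logic I have in mind, written as p5.js-style JavaScript; readHeartRate() is a hypothetical stand-in for whatever the pulse sensor ends up feeding in (probably over serial from a microcontroller), and the BPM thresholds are guesses:

```javascript
// very rough sketch of the robot's decision logic, not a working build
// readHeartRate() is a hypothetical stand-in for the pulse sensor input
function readHeartRate() {
  // placeholder: wander around a resting heart rate for testing
  return 70 + sin(frameCount * 0.01) * 30;
}

function pickBehavior(bpm) {
  // made-up thresholds: calm viewer -> calm robot, nervous viewer -> soothing robot
  if (bpm < 80) {
    return 'idle';     // slow blinking, small movements
  } else if (bpm < 100) {
    return 'curious';  // leans toward the viewer, chirps
  } else {
    return 'soothing'; // slow breathing motion, soft light
  }
}

function setup() {
  createCanvas(400, 200);
  textSize(24);
}

function draw() {
  background(30);
  let bpm = readHeartRate();
  fill(255);
  text(round(bpm) + ' bpm -> ' + pickBehavior(bpm), 20, height / 2);
}
```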

So how am I gonna build that robot? I’m no programmer or robotics engineer; I’m an artist. I work backwards. I create a character: the body, the case for the electronic components that are gonna run behind a cute character. Why a cute character, though?

So my goal is to capture sympathy or emotional attachment from the viewer through things that they’re already familiar and comfortable with. We love cute stuff. We like things we can relate to, so if we go back to our uncanny valley chart, I want my robot to physically appeal at this point, the apex of familiarity and comfort:

uncanny1

Then of course there would be the discomfort of knowing that a non-human, half-inanimate object is trying to understand our emotions; emotionally, that would fall into the uncanny valley:

uncanny2

But if these two qualities are put together in one entity, it could create a new, interesting point on the uncanny valley chart, a higher appealing apex:

uncanny3

So I started with sketches of my character and the basics of how the input and the output would look.

IMG_1971 IMG_1973 IMG_1975

I will explain more about this robot in my next CPJ update with more refined details about the sketches! There’s a lot of thinking and conversation that happened during the brainstorming of this little guy that I didn’t have time to put in the CPJ, so I’ll make sure to fill in the gap later when I get the time to visualize and write things.

 

Where My Mind Is Currently At: Warm Machines

Recently I have been thinking a lot about robots (as I have stated on my about page) and how society perceives them. Now let’s take a step back and process what ‘robot’ means.

Screen Shot 2017-09-04 at 11.12.43 PM

(source: dictionary.com, through Google search engine)

How the first and second sub-definitions use one another as a way to measure how human-like and robot-like the two are was one of the first reasons this topic piqued my interest and I chose to pursue it. It says that 1) a robot is a machine that resembles a human being and is able to replicate certain human movements, and 2) the word is used to refer to a person who behaves in a mechanical or unemotional manner. But with modern technologies and inventions, human beings have come so far in making robots function as humanly as possible, and I don’t think people are going to stop anytime soon. So with so many advanced robots running today, where do we draw the line between being ‘robot-like’ and ‘human-like’?

1111

(source: Google, Gizmo, Amazon, Ubisoft, SONY, Disney, Adafruit)

Programmers, artists, scientists, and hobbyists have become more and more interested in making robots that move and function just like those we see in science fiction. Growing up in Tokyo, Japan during the second half of the ’90s made me really accustomed to the friendly-robot trend that was being popularized mainly through media and commercial products at the time. However, when I went back to my hometown, Jakarta, Indonesia, I realized that it wasn’t the case for most people there. I would say that in the early 2000s, technology and globalization were moving pretty slowly in Indonesia, with the internet still being new, so most people were more often exposed to traditional processes. There were fewer robot toys in the toy stores. Computerized toys and video games such as the Tamagotchi and the SEGA Saturn were less favored by parents, and they took a couple of years to become as popular in Indonesia as they were when they were first released in Japan.

Print

(source: Mushi Production, Shin-Ei Animation, Bee-Train, Studio Bogey)

Of course I wasn’t aware of this cultural resistance as a kid, but now, looking back, there definitely was a form of rejection by this different group of consumers. Were they uncomfortable with those games? Were they scared that our traditional games and toys would be replaced by imported foreign products? Why was there a resistance toward those products? I’m sure there were a lot of aspects, ranging from economics to politics, that might have influenced people’s attitudes. I wonder if the fact that the idea of these machines was foreign to them played a big role in it. If that was over a decade ago, then how are things today worldwide? That’s what I want to learn and discover more about.

3333

(source: SONY, BANDAI)

I don’t think we’re at the point where people can just accept robots working alongside human beings without rejection. I’m confident a lot of people still think of them as ‘cold machines that do not have feelings or a will of their own, running on instructional language controlled by humans’. What I want to prove is not that one day robots will function and act fully like human beings, but rather how human perceptions of robots can change with modern technologies and inventions. Robots are cool, but they’re not cold, hard machines; they’re warm if you share a heart with them. I think they are a bunch of very warm machines.

I have an accumulation of reference materials and sources I gathered out of interest that I could share here as well. I will post relevant ones in separate posts in the future. For now, this is where I am at.