Life on Vegetables; Interactive Design Between Micro-animals and Computer

by fe6710sc-s in Workshop > Science




This is a design and technology project conducted at LHÍ, the Iceland University of the Arts. The assignment was to learn about interaction technology, under the theme “Fruits and Vegetables”.


The project is inspired by the microbial life on fruits and vegetables and in the fermented foods we eat. It also draws on the quest to find life in outer space: parallels between the macro- and microcosmos create awareness of the vast spectrum of life that surrounds us.


With a microscope we explored microfauna found in our everyday life and recorded video footage of our findings. Using computer-aided processing, we then visualized the micro-animals we found and created a program that detects their movement and generates sound from it.


The project is largely about exploration, research, and learning ways to connect design with technology. It resulted in a small design study that aims to open a window through which we can catch glimpses of a rarely witnessed reality in our nearest surroundings: the microcosmos that we are all a part of.


Supplies

Materials: microscope, a camera that fits the microscope, laptop

Software: Processing, AmScope MuScope (for the MU1000 camera)

Samples: potato, fermented carrots, seaweed, moss, lichens

Find Life

IMG_0840.jpg

To detect life on vegetables and other organic surfaces, a microscope is needed. The micro-life is viewed through an AmScope microscope connected to the MuScope app on a computer; the app shows on screen what the microscope sees.


Install MuScope for free here on your computer: 

https://amscope.com/pages/software-downloads


Select the model of your microscope and install the app for that model. The microscope we used was a digital AmScope, model IAA 11 MSC-01.


Once the app is installed, connect your computer to the microscope. Turn on the microscope and open the app. You should now see the microscope's view in a window on your screen. At this point a sample can be placed under the microscope and examined on the computer.


Use a clean petri dish or a glass slide for the samples of choice. 


Samples of potato peel, fermented carrots, seaweed and moss are then put under the microscope. 


The sample of fermented carrots we used had been fermented in salt water for four weeks. 

We extracted some of the liquid from the carrots as well and added it to the sample. 


To use a sample of moss found outside, it is best to take a small sample and soak it overnight in a jar with about 1 cm of water. Then use a pipette to draw a sample from the moss water. Try to get some of the sediment at the bottom without disturbing the water too much.


It is important to place the samples on a light background, because it is contrast in luminosity that the computer detects, not the movement itself.


Also note that the surface of the sample should be kept relatively flat, because the microscope can only focus on one depth at a time.


At first the sample is put under the 4× objective, the microscope's lowest magnification, where focus is generally easy to find. Focus by adjusting the distance between the sample and the objective and by adjusting the backlight.


It is also generally easier to spot movement at a lower magnification because the camera covers a larger area. When movement is detected, it can be examined further at a higher magnification.


Then the magnification is increased to 10× and 40× by hand, without changing the position of the sample. To refocus, decrease the distance between the objective and the sample until the picture is sharp again.

Record Life

When movement is detected on the screen, record it with MuScope's screen recording. Stop the recording when the movement is lost or enough footage has been gathered, then trim the video as needed.


Crop the video so that the subject sits in the left half of the frame; this makes it fit the split-screen program later on.


To identify the life you find, upload footage to https://www.inaturalist.org, where fellow enthusiasts of the microcosmos can examine snapshots from the video and identify it. After a confirmed identification, the movements of the micro-animals can be traced through code.


Code: Tracing Movements of Life

For the coding part we used Processing, a flexible software sketchbook and a language for learning how to code. The code is written in Java. Processing can be downloaded for free here: https://processing.org/


In Processing, install “BlobDetection” and “Video Library for Processing 4”.

(Sketch > Import Library… > Manage Libraries…)


Open the BlobDetection webcam example. When you run this code you will see that it is connected to your webcam, detecting bright areas as “blobs”.

(File > Examples > BlobDetection > bd_webcam)


Open the Frames example from the video library. This code lets you step through an example video called “launch2.mp4” using the right arrow key.

(File > Examples > Video Library for Processing 4 > Movie > Frames)


By comparing the two sketches side by side and learning what each part does, we managed to merge the video player into the blob detection code, so that the blob detector works on data from a video instead of the webcam.

You can change which video is played simply by exchanging “launch2” for the name of the file you want to run, in setup():

	mov = new Movie(this, "launch2.mp4"); 

Just make sure the file is reachable for the program: place it in a folder named “data” inside the sketch folder, using your file explorer (Windows) or Finder (Mac).
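
For example, with a sketch named MySketch (a name used here only for illustration) and the video file from the base code below, the layout would be:

	MySketch/
	  MySketch.pde
	  data/
	    misterT.mp4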


Now you have the base of the code ready, and hopefully it marks the areas you want with a red rectangle and a green “blob” outline. From here you can play around with the sensitivity and the visual appearance of the program. Here is the base code:

import processing.video.*;
import blobDetection.*;


Movie mov;


BlobDetection theBlobDetection;
PImage img;


void setup() {
  size(1280, 960);
  background(0);
  mov = new Movie(this, "misterT.mp4");
  mov.play();


  // BlobDetection
  // img will be sent to detection (a smaller copy of the movie frame)
  img = new PImage(80,60); 
  theBlobDetection = new BlobDetection(img.width, img.height);
  theBlobDetection.setPosDiscrimination(true);
  theBlobDetection.setThreshold(0.5f); // will detect bright areas whose luminosity > 0.5f


}


// called by the video library whenever a new movie frame is ready
void movieEvent(Movie m) {
  m.read();
}


void draw() {
  image(mov, 0, 0, width, height);            // draw the current frame
  img.copy(mov, 0, 0, mov.width, mov.height,  // shrink the frame for detection
        0, 0, img.width, img.height);
  fastblur(img, 2);                           // blur to reduce pixel noise
  theBlobDetection.computeBlobs(img.pixels);  // find blobs in the small image
  drawBlobsAndEdges(true, true);
}




// ==================================================
// drawBlobsAndEdges()
// ==================================================
void drawBlobsAndEdges(boolean drawBlobs, boolean drawEdges)
{
  noFill();
  Blob b;
  EdgeVertex eA,eB;
  for (int n=0 ; n<theBlobDetection.getBlobNb() ; n++)
  {
    b=theBlobDetection.getBlob(n);
    if (b!=null)
    {
      // Edges
      if (drawEdges)
      {
        strokeWeight(3);
        stroke(0,255,0);
        for (int m=0;m<b.getEdgeNb();m++)
        {
          eA = b.getEdgeVertexA(m);
          eB = b.getEdgeVertexB(m);
          if (eA !=null && eB !=null)
            line(
              eA.x*width, eA.y*height, 
              eB.x*width, eB.y*height
              );
        }
      }


      // Blobs
      if (drawBlobs)
      {
        strokeWeight(1);
        stroke(255,0,0);
        rect(
          b.xMin*width,b.yMin*height,
          b.w*width,b.h*height
          );
      }


    }
  }
}


// ==================================================
// Super Fast Blur v1.1
// by Mario Klingemann 
// <http://incubator.quasimondo.com>
// ==================================================
void fastblur(PImage img,int radius)
{
 if (radius<1){
    return;
  }
  int w=img.width;
  int h=img.height;
  int wm=w-1;
  int hm=h-1;
  int wh=w*h;
  int div=radius+radius+1;
  int r[]=new int[wh];
  int g[]=new int[wh];
  int b[]=new int[wh];
  int rsum,gsum,bsum,x,y,i,p,p1,p2,yp,yi,yw;
  int vmin[] = new int[max(w,h)];
  int vmax[] = new int[max(w,h)];
  int[] pix=img.pixels;
  int dv[]=new int[256*div];
  for (i=0;i<256*div;i++){
    dv[i]=(i/div);
  }


  yw=yi=0;


  for (y=0;y<h;y++){
    rsum=gsum=bsum=0;
    for(i=-radius;i<=radius;i++){
      p=pix[yi+min(wm,max(i,0))];
      rsum+=(p & 0xff0000)>>16;
      gsum+=(p & 0x00ff00)>>8;
      bsum+= p & 0x0000ff;
    }
    for (x=0;x<w;x++){


      r[yi]=dv[rsum];
      g[yi]=dv[gsum];
      b[yi]=dv[bsum];


      if(y==0){
        vmin[x]=min(x+radius+1,wm);
        vmax[x]=max(x-radius,0);
      }
      p1=pix[yw+vmin[x]];
      p2=pix[yw+vmax[x]];


      rsum+=((p1 & 0xff0000)-(p2 & 0xff0000))>>16;
      gsum+=((p1 & 0x00ff00)-(p2 & 0x00ff00))>>8;
      bsum+= (p1 & 0x0000ff)-(p2 & 0x0000ff);
      yi++;
    }
    yw+=w;
  }


  for (x=0;x<w;x++){
    rsum=gsum=bsum=0;
    yp=-radius*w;
    for(i=-radius;i<=radius;i++){
      yi=max(0,yp)+x;
      rsum+=r[yi];
      gsum+=g[yi];
      bsum+=b[yi];
      yp+=w;
    }
    yi=x;
    for (y=0;y<h;y++){
      pix[yi]=0xff000000 | (dv[rsum]<<16) | (dv[gsum]<<8) | dv[bsum];
      if(x==0){
        vmin[y]=min(y+radius+1,hm)*w;
        vmax[y]=max(y-radius,0)*w;
      }
      p1=x+vmin[y];
      p2=x+vmax[y];


      rsum+=r[p1]-r[p2];
      gsum+=g[p1]-g[p2];
      bsum+=b[p1]-b[p2];


      yi+=w;
    }
  }


}


Code: Change Sensitivity and Appearance


The program detects “blobs” where the luminosity contrasts with the rest of the video. Its sensitivity can be adjusted through the detection threshold: a float between 0 and 1 that decides which pixels count as part of a blob.


By playing around with this number, the part of the video that gets detected changes. We generally had to raise it to somewhere between 0.5 and 0.9 to get the desired sensitivity. The threshold is set in setup():

	theBlobDetection.setThreshold(0.2f); // will detect bright areas whose luminosity > 0.2f;
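
Which side of the threshold counts is set by setPosDiscrimination(), the call just above it in setup(). A short sketch of the pair, with values in the range we ended up using:

	theBlobDetection.setPosDiscrimination(true); // true: detect areas BRIGHTER than the threshold
	                                             // (false would detect darker areas instead)
	theBlobDetection.setThreshold(0.7f);         // we ended up between 0.5 and 0.9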


We wanted to remove the red rectangle and keep only the shape that traces the outline of the micro-animal we found.


This is done in drawBlobsAndEdges(boolean drawBlobs, boolean drawEdges), where we disabled the part of the code that draws the red rectangle. That can be done by commenting it out with /* … */; the program skips whatever is written between the forward-slash-and-star markers. You can also delete this part, but that way it is much harder to undo.

/*
      // Blobs
      if (drawBlobs)
      {
        strokeWeight(1);
        stroke(255,0,0);
        rect(
          b.xMin*width,b.yMin*height,
          b.w*width,b.h*height
          );
      }
*/


To create the effect of tracing the micro-animal's movement, we also “comment out” the line of code in draw() that draws each frame of the video. That way only the “blob” is drawn, over and over again on an uncleared black background, creating the desired trail that tracks the animal's movement.


In draw():

	//image(mov, 0, 0, width, height);



Code: Sound Generated by Movement

After figuring out the visual tracing of the micro-animals, we wanted to add another observable dimension and give them “a voice”: sound generated by the movement that we can now trace.


This part is also done in Processing.


In Processing, install the “Minim” audio library and open the example file “CreateAnInstrument”.

The next step is to make the sound generated by the code “mouse controlled”, that is, to make the pitch of the output tones depend on the mouse's coordinates on the screen.


In setup(), disable (comment out) this entire part:

 /* when providing an Instrument, we always specify start time and duration
  out.playNote( 0.0, 0.9, new SineInstrument( 97.99 ) );
  out.playNote( 1.0, 0.9, new SineInstrument( 123.47 ) );
  
  // we can use the Frequency class to create frequencies from pitch names
  out.playNote( 2.0, 2.9, new SineInstrument( Frequency.ofPitch( "C3" ).asHz() ) );
  out.playNote( 3.0, 1.9, new SineInstrument( Frequency.ofPitch( "E3" ).asHz() ) );
  out.playNote( 4.0, 0.9, new SineInstrument( Frequency.ofPitch( "G3" ).asHz() ) );
  */


Move the out.playNote() calls to draw(), but replace the SineInstrument value with the mouse's X and Y coordinates:

out.playNote( 0.0, 0.9, new SineInstrument( mouseX ) );
out.playNote( 0.0, 0.9, new SineInstrument( mouseY ) );


This way we have the code for a simple “mouse-controlled instrument”.
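
For reference, here is a minimal, self-contained sketch of this stage. The SineInstrument class is the one from Minim's CreateAnInstrument example, slightly condensed; treat the whole thing as a sketch of the idea rather than our exact code:

import ddf.minim.*;
import ddf.minim.ugens.*;

Minim minim;
AudioOutput out;

void setup() {
  size(512, 200);
  minim = new Minim(this);
  out = minim.getLineOut();
}

void draw() {
  background(0);
  // one short note per frame; pitch follows the mouse position
  out.playNote(0.0, 0.9, new SineInstrument(mouseX));
  out.playNote(0.0, 0.9, new SineInstrument(mouseY));
}

// a sine wave with a fading amplitude envelope,
// as defined in Minim's CreateAnInstrument example
class SineInstrument implements Instrument {
  Oscil wave;
  Line  ampEnv;

  SineInstrument(float frequency) {
    wave   = new Oscil(frequency, 0, Waves.SINE);
    ampEnv = new Line();
    ampEnv.patch(wave.amplitude);
  }

  void noteOn(float duration) {
    ampEnv.activate(duration, 0.5f, 0);  // fade from half amplitude to silence
    wave.patch(out);
  }

  void noteOff() {
    wave.unpatch(out);
  }
}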


The next step is to make it controlled by the blobs' coordinates instead of the mouse's. This is done by comparing the BlobDetection code with the mouse-controlled instrument and merging the two.


Before setup(), the sketch needs the Minim imports and declarations. This standard Minim boilerplate sits at the top of the file:
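
import ddf.minim.*;
import ddf.minim.ugens.*;

Minim minim;       // library entry point
AudioOutput out;   // the line out that plays our notes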

In setup(), add:

	minim = new Minim(this);
	out = minim.getLineOut();


With those lines added, setup() in the merged sketch should look roughly like this:
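
void setup() {
  size(1280, 960);
  background(0);
  mov = new Movie(this, "misterT.mp4");
  mov.play();

  // sound
  minim = new Minim(this);
  out = minim.getLineOut();

  // BlobDetection
  img = new PImage(80, 60);
  theBlobDetection = new BlobDetection(img.width, img.height);
  theBlobDetection.setPosDiscrimination(true);
  theBlobDetection.setThreshold(0.5f);
}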

In drawBlobsAndEdges(), the out.playNote() call is added in place of the rectangle-drawing block we commented out earlier, and mouseX is substituted with the expression that describes the blob's position. We know that is the right spot because it is where BlobDetection drew the rectangle mentioned earlier.
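
A sketch of what this can look like, with the blob's x position driving the pitch (the exact divisor is a matter of taste, not a fixed value):

      // Blobs: instead of drawing the rectangle, play a note
      if (drawBlobs)
      {
        // pitch follows the blob's horizontal position;
        // dividing scales it into a range we liked
        out.playNote(0.0, 0.9, new SineInstrument((b.xMin * width) / 2));
      }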

Now the “instrument” is controlled by the blob detection and is therefore driven by the movement of the micro-animal in our footage.


By dividing the coordinate (as in the sketch above) we can regulate the pitch, and we played around with the divisor to generate a sound that we liked.

Code: Split Screen to See the Correlation


To be able to see the correlation between the movement in the video footage and the generated traces and sounds, we decided on a split-screen layout for the program's output.


This was simply done by setting the parameter that defines where on the screen the video is drawn to the x coordinate of the middle of the screen. Based on the size() set in setup(), divide the width by 2 to get that coordinate (in our case 1280 / 2 = 640). In draw(), change the x parameter accordingly:

void draw() {
  image(mov, 640, 0, width, height);  // draw the video starting at mid-screen
  img.copy(mov, 0, 0, mov.width, mov.height, 
     0, 0, img.width, img.height);
  fastblur(img, 2);
  theBlobDetection.computeBlobs(img.pixels);
  drawBlobsAndEdges(true, true);
}

And that's it!


Feel free to try this out and explore your own tardigrade-BlobDetection-drawing-machine-music-instrument!


Conclusion

The end result would have been fun to showcase live. For the exhibition of this project we showed a video of what we had done: we screen-recorded the program running on different pieces of footage and then edited the recordings together. A great next step would be to connect the code to the AmScope camera output instead of to video recordings. That way the code could track what is under the microscope in a live feed, creating the visuals and audio from whatever is found. The user of the microscope would then get a better sense of the excitement we felt when searching for something that moved in our samples, which was a big part of this project.
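
The bd_webcam example we started from already reads from a camera, so one way to sketch this is to swap the Movie back for a Capture. This assumes the AmScope camera shows up as an ordinary capture device, which we have not tested:

import processing.video.*;

Capture cam;

void setup() {
  size(1280, 960);
  // the microscope camera has to appear in this list for a live feed to work
  printArray(Capture.list());
  cam = new Capture(this, 640, 480);
  cam.start();
}

// called whenever a new camera frame is ready
void captureEvent(Capture c) {
  c.read();
}

void draw() {
  // feed cam into the same img.copy / fastblur / computeBlobs
  // pipeline instead of mov
  image(cam, 0, 0, width, height);
}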