Breath Installation Research

This project aims to create an interactive installation that uses heart rate or breath as input devices to produce a range of images, animations, video clips and sound within a specific environment. The installation will explore how sensory input (visual and audio) can affect a sense of bodily awareness in participants and promote a state of calm.

In the first instance I shall be exploring what technologies and viable software are available and, perhaps more importantly, how they are currently being used.

I attended two lectures at the Festival of Ideas in Cambridge. The first explored producing music from data collected by muscle sensors, and also from acceleration sensors (on a rowing machine). http://www.festivalofideas.cam.ac.uk/events/sound-body-movement In both cases the collected data is processed algorithmically (using Logic Pro software) to create a series of musical melodies. Dr Domenico Vicinanza is both a computer scientist and a composer, and has worked on creating music not only from human-centred data but also from data collected at CERN and even deep space.

The focus of their research is music; my interest is both auditory and visual, so I wonder whether the mechanics of the data collection could be applied to visualisations. I also began to think about the relationship between colour and musical notation. Some theories hold that specific colours can be expressed in the same way that musical notes are: there are seven colours in a rainbow and seven notes in a standard musical scale. Since both sound and colour can be described as frequencies (of pressure waves and of light waves respectively), it might be possible to assign specific colours to musical notes.

from https://xenophilius.wordpress.com/2008/11/30/what-color-is-middle-c-musical-pitch-related-to-color/
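
To make that idea concrete, here is a minimal Python sketch of one possible mapping (my own assumption, not taken from the article above): raise each note's frequency octave by octave until it reaches the visible-light band, then convert it to a wavelength and a rough colour name.

```python
# A sketch of one possible note-to-colour scheme (my own assumption,
# not taken from the article above): double each note's frequency
# octave by octave until it lands in the visible-light band, then
# convert that frequency to a wavelength and a rough colour name.

SPEED_OF_LIGHT = 3.0e8  # metres per second

# Approximate frequencies (Hz) of the seven notes of the C major scale.
NOTES = {
    "C": 261.63, "D": 293.66, "E": 329.63, "F": 349.23,
    "G": 392.00, "A": 440.00, "B": 493.88,
}

def note_to_wavelength_nm(freq_hz):
    """Double the frequency until it reaches the visible band
    (roughly 4e14-8e14 Hz), then return the wavelength in nanometres."""
    f = freq_hz
    while f < 4.0e14:
        f *= 2.0
    return SPEED_OF_LIGHT / f * 1e9

def wavelength_to_colour(nm):
    """Very coarse wavelength-to-colour bands."""
    if nm >= 620: return "red"
    if nm >= 590: return "orange"
    if nm >= 570: return "yellow"
    if nm >= 495: return "green"
    if nm >= 450: return "blue"
    if nm >= 430: return "indigo"
    return "violet"

for name, hz in NOTES.items():
    nm = note_to_wavelength_nm(hz)
    print(f"{name} ({hz} Hz) -> {nm:.0f} nm -> {wavelength_to_colour(nm)}")
```

With this particular scheme middle C lands in the green; different octave or band choices would shift every note, which is presumably why historical note-colour systems disagree.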

I also found this YouTube video that puts some of this into action. If a person triggered this kind of visualisation with their heart rate, it could make for a very immersive and engaging experience.

This project is based on heart rate variability, which Kaiser claims is affected by a meditative state; below is a comparative graph of the two states:

[Image: comparative heart rate variability graphs for the two states]

The project below is, functionally, exactly what I have in mind.

http://www.kpkaiser.com/entrepreneurship/building-a-meditation-controlled-orb/

Next up, research into meditative visualisations.

I found this video in which participants draw with their minds.

https://vimeo.com/157738873

Heart rate monitors are most commonly used as data input in exercise and fitness applications, where the aim is to increase heart rate rather than slow it down, as in my project. The popularity of games such as Wii Sports, Wii Fit and Dance Dance Revolution points to a broad interest in the form of exercise that video games can provide. Below is a link to ‘Heart Rate Control of Exercise Video Games’, which explores combining heart rate data with traditional console game play.

http://faculty.uoit.ca/kapralos/csci5530/Papers/kei1_GI2009.pdf

Part of the references in this paper led me (albeit indirectly) to Christopher Janney. Below is a video visualising musical instruments.

It would be interesting to simply get a heart rate monitor to trigger visualisations as a test for further developments.

Googling ‘meditative algorithm’

I found this website: http://www.moodmetric.com/#technology The company has developed a ‘mood ring’ that users wear, which connects to an app displaying stress levels. They promote the idea that meditation will become as ubiquitous as running, keep fit and the like. Recording stress levels may well be taken up by the general population, given how closely stress and illness are linked.

Thinking about the kind of imagery I might use for this project: this video record of work by Yayoi Kusama is inspiring me to make something that is truly immersive; VR will probably be required to achieve this.


Prana

https://vimeo.com/205993079

This is made by B-Reel and uses breath (captured by a XeThru sensor from Norway) to trigger LED lights to brighten and dim on the in and out breaths. It is a highly effective and immersive installation. The XeThru sensor uses radar technology to detect the presence of a person and their breathing. It would be perfect here, as it is both small and can be installed anywhere (i.e. not attached to a person), even hidden in a wall or the like.
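
To pin down the brighten/dim idea, here is a hedged Python sketch. It assumes, hypothetically, a sensor that streams one respiration-amplitude reading per line over a serial port; the real XeThru module has its own protocol, so this is only the shape of the logic.

```python
# A hedged sketch of breath-driven LED dimming. It assumes a sensor
# that streams one respiration-amplitude reading per line over serial
# (the real XeThru module has its own protocol; this is illustrative).
import serial  # pyserial

PORT = "/dev/ttyUSB0"  # hypothetical port name

def amplitude_to_brightness(amplitude, lo=-1.0, hi=1.0):
    """Map a breath amplitude in [lo, hi] to a 0-255 LED duty cycle."""
    clamped = max(lo, min(hi, amplitude))
    return int((clamped - lo) / (hi - lo) * 255)

def run(set_led_brightness):
    """Read amplitudes forever and push them to an LED-setting callback."""
    with serial.Serial(PORT, 115200, timeout=1) as link:
        while True:
            line = link.readline().decode("ascii", errors="ignore").strip()
            if not line:
                continue
            try:
                amplitude = float(line)
            except ValueError:
                continue  # skip malformed readings
            set_led_brightness(amplitude_to_brightness(amplitude))

if __name__ == "__main__":
    run(lambda duty: print(f"LED duty cycle: {duty}"))
```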

Here is a video of a less high-tech version of the XeThru idea, using an Arduino.

Here is a link to the tutorial behind the video:

https://www.build-electronic-circuits.com/arduino-radar-tutorial/

The cost of the sensor is $250, which is a bit of an investment, but I can see how using the breath as an interactive trigger is more immediate and obvious to the user. It could also have applications for creating interactive wallpaper (real wallpaper on a wall) that produces visualisations based on the breath. This could be applied to a range of situations: meditation spaces, exercise spaces, bedrooms, relaxation and many more! It also reminds me to look at the way music visualisation algorithms manipulate imagery; a similar methodology could apply here.

Another, slightly Heath Robinson, approach uses a temperature sensor inside the nose to detect the in and out breath (sketched below, after the link). This could be a good way to test without the investment, but it might be very time-consuming to get working, only to find that I need to start from scratch with a more sophisticated sensor.

http://www.kiddyhub.com/breath_rate_sensor.html
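
Here is the sketch mentioned above: a rough Python outline of the thermistor approach. Exhaled air is warmer than inhaled air, so a rising temperature trend suggests an out-breath and a falling trend an in-breath. read_temperature() is a hypothetical stand-in for whatever ADC the sensor is actually wired to.

```python
# A rough sketch of the nose-thermistor idea: exhaled air is warmer,
# so a rising smoothed-temperature trend suggests an out-breath and a
# falling trend an in-breath. read_temperature() is a hypothetical
# stand-in for the real ADC read.
import time

def detect_breath_phases(read_temperature, alpha=0.2, threshold=0.05):
    """Yield 'inhale'/'exhale' whenever the smoothed trend flips."""
    smoothed = read_temperature()
    phase = None
    while True:
        time.sleep(0.1)                  # ~10 samples per second
        sample = read_temperature()
        previous = smoothed
        smoothed = alpha * sample + (1 - alpha) * smoothed  # low-pass filter
        delta = smoothed - previous
        new_phase = ("exhale" if delta > threshold else
                     "inhale" if delta < -threshold else phase)
        if new_phase is not None and new_phase != phase:
            phase = new_phase
            yield phase

# Usage (with a real sensor): for phase in detect_breath_phases(adc.read): print(phase)
```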

Raspberry Pi

I went to a Raspberry Pi Jam in Cambridge since I have no experience with this technology. The Pi uses the Python programming language, which I also have no experience with, but it is pretty similar to ActionScript and JavaScript, with some syntax differences. I was able to programme a traffic light system (see the sketch below). I believe that attaching a heart rate monitor to the Pi is doable, and I have found this link explaining how to do it. I have now ordered a Raspberry Pi sensor starter kit and will play around with that before attaching a heart rate sensor to it. The Raspberry Pi people are incredibly helpful and answered my naive questions immediately.
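
For the record, a traffic light sequence like the one I built looks roughly like this with the gpiozero library (the BCM pin numbers below are assumptions; use whichever pins the LEDs are actually wired to).

```python
# A minimal traffic-light sequence with gpiozero. The BCM pin numbers
# are assumptions; substitute whichever pins the LEDs are wired to.
from gpiozero import LED
from time import sleep

red, amber, green = LED(17), LED(27), LED(22)

while True:
    red.on()
    sleep(3)                  # stop
    amber.on()
    sleep(1)                  # get ready
    red.off()
    amber.off()
    green.on()
    sleep(3)                  # go
    green.off()
    amber.on()
    sleep(1)                  # slowing
    amber.off()
```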

I may try out this kit after I have succeeded with other sensors.

https://www.parallax.com/product/28037

 

Technologies

I have tried (and slightly failed) to get the Raspberry Pi and a heart rate sensor to control some visuals on screen. Despite help from computer programming staff at Anglia Ruskin University (where I work), the heart rate reading is unstable and jumps all over the place. The code is written in Python, which I ‘borrowed’; I then tried to add a function that ran for each heart rate grouping (so 40-50 bpm, 50-60 bpm and so on). For some reason this disrupts the code that collects the heart rate data. Very annoying. I will try to post more about this later when I revisit the set-up.
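
I can't reproduce the borrowed code here, but one plausible culprit is the per-range function doing slow work inside the loop that samples the sensor. A hedged sketch of a structure that keeps the bucketing cheap and the sampling loop unblocked:

```python
# A guess at a safer structure (the borrowed code itself isn't shown
# here): smooth the raw readings with a moving average, derive the
# 10-bpm bucket with integer division, and only fire the visual
# callback when the bucket actually changes, so nothing slow runs
# inside the sampling loop itself.
from collections import deque

def bucket_for(bpm):
    return int(bpm) // 10 * 10          # e.g. 47 -> 40, 63 -> 60

def watch_heart_rate(read_bpm, on_bucket_change, window=5):
    recent = deque(maxlen=window)       # moving average to calm jitter
    current_bucket = None
    while True:
        recent.append(read_bpm())       # must stay fast / non-blocking
        smoothed = sum(recent) / len(recent)
        bucket = bucket_for(smoothed)
        if bucket != current_bucket:
            current_bucket = bucket
            on_bucket_change(bucket)    # hand off; keep this cheap too

# Example: watch_heart_rate(sensor.read, lambda b: print(f"{b}-{b+9} bpm"))
```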

 

In the meantime I am exploring other options. Since I am familiar with Adobe Animate and ActionScript 3, I thought I might concentrate on the visual part of the project and see if I could get the microphone to pick up breath and then use that to trigger some simple shape morphs. Success! Here it is: http://www.tinaburton.com/breath.html

 

Cymatics


Eddies

[Animation: a vortex street around a cylinder. This can occur around cylinders and spheres, for any fluid, cylinder size and fluid speed, provided that the flow has a Reynolds number in the range ~40 to ~1000.]

In fluid dynamics, an eddy is the swirling of a fluid and the reverse current created when the fluid is in a turbulent flow regime. The moving fluid creates a space devoid of downstream-flowing fluid on the downstream side of the object. Fluid behind the obstacle flows into the void, creating a swirl of fluid on each edge of the obstacle, followed by a short reverse flow of fluid behind the obstacle flowing upstream, toward the back of the obstacle.
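
For a sense of scale, the Reynolds number here is Re = uD/ν: flow speed times cylinder diameter, divided by the fluid's kinematic viscosity. A quick Python check, using a textbook viscosity for air (an assumption on my part, not from the quoted text):

```python
# Quick Reynolds-number check: Re = u * D / nu, where u is flow speed,
# D the cylinder diameter and nu the kinematic viscosity. The ~40-1000
# band quoted above is where a vortex street forms.
NU_AIR = 1.5e-5  # kinematic viscosity of air at ~20 C, m^2/s (textbook value)

def reynolds(u_m_per_s, d_m, nu=NU_AIR):
    return u_m_per_s * d_m / nu

# A 5 mm rod in a 1 m/s draught:
re = reynolds(1.0, 0.005)
print(f"Re = {re:.0f}", "-> vortex street likely" if 40 <= re <= 1000 else "")
```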

Visualisations

 

Data Visualisation Software

 

https://www.apexvj.com

http://haptic-data.com/toxiclibsjs/examples/poly-smooth-p5 (this one is interactive and makes jagged shapes smoother. I wonder what would happen if this were triggered by heart rate: each beat creates a new shape, and then the time between breaths morphs the shapes.)

http://haptic-data.com/toxiclibsjs/examples/mesh-align-to-axis-web-g-l

This one made me think of constructing a floating, orb-like object that changes colour/position/size depending on heart rate. Furthermore, the user could be tasked with controlling the structure, so that the heart rate is ‘forced’ to slow down.
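
A small Python sketch of how that orb mapping might work; all the names and ranges here are my own assumptions:

```python
# A sketch of the orb idea (ranges and names are my own assumptions):
# normalise bpm into 0-1, then interpolate the orb's size, hue and
# wobble from it, so a slowing heart visibly calms the object.
def lerp(a, b, t):
    return a + (b - a) * t

def orb_params(bpm, lo=50.0, hi=120.0):
    t = max(0.0, min(1.0, (bpm - lo) / (hi - lo)))   # 0 = calm, 1 = excited
    return {
        "radius": lerp(0.5, 1.5, t),     # calm orb is small and steady
        "hue_degrees": lerp(220, 0, t),  # blue when calm, red when excited
        "jitter": lerp(0.0, 0.3, t),     # positional wobble grows with bpm
    }

print(orb_params(55))    # near-calm values
print(orb_params(110))   # near-excited values
```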

 

The company above has lots of interesting screen savers.

8 Ways

She also did this experiment using a micro microphone.

Breath


Where the breath is

She stands alone onstage
and has no instrument.

She lays her palms upon her breast,
where the breath is born
and where it dies.

The palms do not sing
nor does the breast.

What sings is what stays silent.

Source: Adam Zagajewski, Selected Poems, Faber and Faber Ltd, 2004

 

Working with Adobe Animate using Microphone input

I have been experimenting with using the microphone to trigger interaction. The idea is to demonstrate that a sound (later to be a heartbeat) can produce visualisations. I discovered that ActionScript 3 can achieve this. Below is a first, very basic proof of concept. When you allow the microphone, check the ‘reduce echo’ box and turn the volume down a bit to reduce feedback. Once it is set up, make some noise!

http://www.tinaburton.com/microphoneShapeBlurVisualiser.html
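
The proof of concept above is in ActionScript 3, but the same trigger idea sketched in Python (using the sounddevice library) looks roughly like this; the threshold value is a guess to tune against your own microphone and room.

```python
# The same trigger idea sketched in Python with the sounddevice
# library: compute the RMS loudness of each microphone block and fire
# a callback when it crosses a threshold (the 0.02 figure is a guess
# to tune by ear against your own mic and room).
import numpy as np
import sounddevice as sd

THRESHOLD = 0.02

def on_breath():
    print("breath detected -> trigger shape morph here")

def callback(indata, frames, time, status):
    rms = float(np.sqrt(np.mean(indata ** 2)))  # loudness of this block
    if rms > THRESHOLD:
        on_breath()

with sd.InputStream(channels=1, samplerate=44100, callback=callback):
    sd.sleep(30_000)  # listen for 30 seconds
```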
