The Mediated Extension of Self
physical computing, data visualization
Design Brief: From Marx to McLuhan and beyond, the idea that technology, in the broadest sense, is what augments the human and sets us apart from the rest of the animal kingdom has permeated the discussion about the relationship between ourselves and our media. Tools augment our physical abilities; information storage and dissemination techniques extend the capacity of our minds. But in all those conversations the focus has been distinctly outward, toward the world outside of our selves and our bodies. I will ask you to see how technology can be pointed inward.
For this project we are going to explore how digital technology extends and augments our senses, our bodies, and our visions of ourselves. We'll start by creating devices that allow us to collect data from our immediate surroundings and that serve as extensions of our bodies and sensory organs. We will organize and distill this data into a meaningful set and explore ways of presenting it that could uncover possible underlying patterns.
Documentation: I began this process by thinking about moments of uncertainty in my interactions with the world. I have very sensitive skin and wear sunscreen every day, but I wasn't certain how much sun exposure I was actually getting, so I set out to measure and visualize it.
Construction: I began with a UV sensor and a light sensor connected on a breadboard to an ESP32. I powered the board with a battery until it died, then switched to a mobile phone battery pack so that I could carry the device with me. I found an old box with a clear cover, which was perfect for housing the device. To make sure the sensors were getting light, I stuffed the box with paper until the sensors pressed up against the clear cover.
Once I had my device constructed, I tested it out. I set up Adafruit IO and connected the ESP32 to it over WiFi. I tried to connect through a mobile hotspot so that I could carry the device with me, but unfortunately it didn't work. First, I set up the light sensor and the UV sensor separately. Using the serial monitor alongside Adafruit IO, I checked the readings to make sure that my sensors were connecting, and then combined the two connections into one sketch. I was getting errors in which the UV sensor mapped to the same range of values as the light sensor (0-4095) instead of its own (0-10). As I edited the code down, I realized that I could remove the sensorVal variable that stored the UV data, since the UV data already had its own variable, and the problem was fixed.
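The scaling bug and its fix can be sketched in plain JavaScript (the actual firmware ran on the ESP32; variable names and ranges here are illustrative, and mapRange reimplements the linear re-mapping that Arduino's map() and p5.js's map() perform):

```javascript
// Linear re-mapping, same semantics as Arduino's map() / p5.js's map().
function mapRange(value, inMin, inMax, outMin, outMax) {
  return outMin + ((value - inMin) * (outMax - outMin)) / (inMax - inMin);
}

// The bug: reusing one variable for both sensors meant the UV reading was
// scaled with the light sensor's input range (0-4095) instead of its own.
// The fix: keep each reading in its own variable with its own range.
const lightRaw = 2048; // 12-bit ADC reading, 0-4095 (illustrative value)
const uvIndex = 5;     // the UV sensor already reports an index on 0-10

const lightLevel = mapRange(lightRaw, 0, 4095, 0, 100); // percent of full scale
// uvIndex needs no re-mapping: it is already on its own 0-10 scale.
console.log(lightLevel.toFixed(1), uvIndex); // → 50.0 5
```

Keeping one variable per sensor, each with its own input range, is what removed the spurious 0-4095 mapping of the UV data.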
Once my device was accurately collecting data, I carried it around with me while I completed my daily tasks. I put it on my desk while I worked, carried it around the house, and brought it with me when I walked my dogs. One night, I left it in the kitchen while I cooked and did the dishes, and I got some really interesting readings: the UV light dropping off and the overall light decreasing, then increasing as I turned on the inside lights.
I wanted to try mapping the data I collected during the sunset to see how the correlation (or deviation) between UV, light, and time might look visually. I started coding in p5.js by setting up the API call for the light sensor feed. I tested the API connection with a draw function that drew a circle for each light sensor value, with an x/y position corresponding to the time and a size and fill color corresponding to the amount of light. When it looked like it was working, I narrowed the time window to a ten-minute period of sunset between 5:45pm and 5:55pm.
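The time-to-position and light-to-size mapping for those test circles can be sketched as a plain function (in the real sketch this would feed p5.js's ellipse() and fill() inside draw(); the field names follow Adafruit IO's feed-data JSON, but the canvas size and ranges are assumptions):

```javascript
// Linear re-mapping, same semantics as p5.js's map().
function mapRange(value, inMin, inMax, outMin, outMax) {
  return outMin + ((value - inMin) * (outMax - outMin)) / (inMax - inMin);
}

// Turn one light reading into circle parameters for a hypothetical
// 600x400 canvas. `value` and `created_at` mirror Adafruit IO feed data.
function circleFor(point, startMs, endMs) {
  const t = Date.parse(point.created_at);
  return {
    x: mapRange(t, startMs, endMs, 0, 600),             // time → horizontal position
    y: 200,                                             // fixed baseline
    size: mapRange(point.value, 0, 4095, 5, 80),        // light → diameter
    brightness: mapRange(point.value, 0, 4095, 0, 255), // light → fill
  };
}
```

In the p5.js sketch, draw() would loop over the fetched feed and call fill(c.brightness) and ellipse(c.x, c.y, c.size) for each point.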
I realized that I had been treating the data as four separate parts: UV values, UV times, light values, and light times. Having these four arrays made finding specific points very difficult, and I was getting a ton of data when I ran the code. As an alternative, I created a value-sets array in which each entry housed all four variables for a single point in time.
I changed the color mode from RGB to HSB so that I could work more with bright/dark levels in the drawing. I adjusted the rectangles so that the fill mapped the light sensor values to a 0 to two-pi range for hue, saturation, and brightness. The x position was mapped to the time in minutes, the width was refined, and the UV values were mapped to the heights of the rectangles.
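The rectangle mapping can be sketched as a pure function (in p5.js this would sit inside draw() after something like colorMode(HSB, TWO_PI); the canvas dimensions and fixed width here are assumptions):

```javascript
// Linear re-mapping, same semantics as p5.js's map().
function mapRange(value, inMin, inMax, outMin, outMax) {
  return outMin + ((value - inMin) * (outMax - outMin)) / (inMax - inMin);
}

// One value set → one rectangle. `set` holds {minute, uv, light} for a moment.
function rectFor(set, startMin, endMin, canvasW, canvasH) {
  return {
    x: mapRange(set.minute, startMin, endMin, 0, canvasW), // time → x position
    w: 10,                                                 // fixed, refined width
    h: mapRange(set.uv, 0, 10, 0, canvasH),                // UV index → height
    // Light → a 0 to two-pi value reused for hue, saturation, and brightness.
    fill: mapRange(set.light, 0, 4095, 0, Math.PI * 2),
  };
}
```

The p5.js sketch would then call fill(r.fill, r.fill, r.fill) and rect(r.x, 0, r.w, r.h) for each value set.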
When I was satisfied with how the data was represented as rectangles, I wanted to try a more abstract version. I used the same mappings as the rectangles, but used the draw function to make a stroked arc instead. When viewing the piece, you can switch between the two drawings by clicking on the canvas. By creating one version that was more figurative and one that was more abstract, I wanted to show that the same data can be mapped in completely different ways.
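The click-to-toggle logic is small: in p5.js, mousePressed() flips a flag and draw() picks the renderer. A minimal sketch of that idea in plain JavaScript (names are illustrative):

```javascript
// Toggle state shared between the input handler and the renderer.
let showAbstract = false;

// In p5.js this body would live inside mousePressed().
function onCanvasClick() {
  showAbstract = !showAbstract;
}

// In p5.js, draw() would branch on this to render either the
// figurative rectangles or the abstract stroked arcs of the same data.
function currentRenderer() {
  return showAbstract ? "arcs" : "rectangles";
}

onCanvasClick();
console.log(currentRenderer()); // → arcs
```

Both renderers read the same value-sets array; only the mapping from data to marks changes.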