Thursday, December 8, 2016

Artwork Brought to Life with Photon Sensor Data (#IoT)


I'm making progress toward my goal of creating a Wi-Fi-connected book that uses data to help tell a story.  As a proof of concept, I created a ThingHTTP app and a React app in ThingSpeak to trigger a Particle function (on one Photon) based on readings from a photocell (attached to a second Photon).  To get started, I created a ThingSpeak channel and a Particle webhook by following this tutorial.
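
For reference, the webhook definition from that tutorial looks roughly like the following. The event name prefix, field number, and template placeholders are assumptions based on the standard ThingSpeak/Particle setup; the Write API key itself is supplied by the publishing Photon.

{
  "event": "thingSpeakWrite_",
  "url": "https://api.thingspeak.com/update",
  "requestType": "POST",
  "form": {
    "api_key": "{{k}}",
    "field1": "{{1}}"
  },
  "mydevices": true,
  "noDefaults": true
}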

Once I created the webhook and connected my Photon to a light sensor, the Photon started logging data to ThingSpeak.  To make the data more useful, I modified the Photon code from the tutorial by adding the following two lines to the loop, just above the line containing the Write API key for the ThingSpeak channel.

value = map(value, 0, 4096, 0, 255); // scale the 12-bit analog reading (0-4095) down to a 0-255 range
value = constrain(value, 0, 255); // clamp the result so it always stays between 0 and 255
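
Putting it together, the sensor-side loop ends up looking roughly like this. This is a condensed sketch, assuming the photocell voltage divider is read on pin A0 and using the event-name prefix and key placeholder from the tutorial.

int photocell = A0;   // photocell voltage divider input
int value = 0;

void setup() {
}

void loop() {
    value = analogRead(photocell);        // 12-bit reading, 0-4095
    value = map(value, 0, 4096, 0, 255);  // scale the reading down to 0-255
    value = constrain(value, 0, 255);     // clamp to the 0-255 range
    // publish to the ThingSpeak webhook; replace YOUR_WRITE_API_KEY with the channel's Write API key
    String payload = "{ \"1\": \"" + String(value) + "\", \"k\": \"YOUR_WRITE_API_KEY\" }";
    Particle.publish("thingSpeakWrite_A0", payload, 60, PRIVATE);
    delay(20000);                         // ThingSpeak accepts roughly one update every 15 seconds
}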


Next, I uploaded this code to a second Photon and connected that Photon to a paper circuit using alligator clips. Lastly, I created the ThingHTTP and React apps.
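
The firmware on that second Photon boils down to a cloud-callable function that switches the paper circuit's LEDs. Here's a minimal sketch, assuming the LEDs sit on D0 and D1 and the function is registered under the (hypothetical) name "lights":

int owlEyes = D0;   // owl-eye LEDs in the paper circuit (pin choices are assumptions)
int candle = D1;    // candle-flame LED

int lights(String command);  // forward declaration so the function can be registered in setup()

void setup() {
    pinMode(owlEyes, OUTPUT);
    pinMode(candle, OUTPUT);
    Particle.function("lights", lights);  // expose the function to the Particle Cloud
}

void loop() {
    // nothing to do here; everything happens when the cloud calls lights()
}

// turn the LEDs on or off depending on the argument sent in the HTTP request
int lights(String command) {
    if (command == "on") {
        digitalWrite(owlEyes, HIGH);
        digitalWrite(candle, HIGH);
        return 1;
    }
    if (command == "off") {
        digitalWrite(owlEyes, LOW);
        digitalWrite(candle, LOW);
        return 0;
    }
    return -1;  // unrecognized argument
}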


The React app let me set up a trigger based on the data.  I configured it so that a reaction fires any time the sensor reports a value of 150 or greater (when it's cloudy or dark and the photocell's resistance increases) or 50 or lower (when it's bright or sunny).  The ThingHTTP app allowed me to post an HTTP request to Particle, triggering the Particle function that illuminates the owl's eyes and the candle flame.
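
The ThingHTTP request is simply a call to the Particle Cloud API's function endpoint. The settings look roughly like this, where DEVICE_ID, YOUR_ACCESS_TOKEN, and the "lights" function name are placeholders for your own values:

URL:          https://api.particle.io/v1/devices/DEVICE_ID/lights
Method:       POST
Content Type: application/x-www-form-urlencoded
Body:         access_token=YOUR_ACCESS_TOKEN&arg=on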



I later added LEDs to simulate falling snow.