Tuesday, April 4, 2017

CheerLights, Node-Red, & Raspberry Pi

I first experimented with CheerLights a few months ago while attempting to better understand the Internet of Things. Today, I'm using a Raspberry Pi Zero rather than a Photon microcontroller. One dazzling detail about CheerLights is that it lets people all over the world synchronize the color of their lights simultaneously. Anyone can change the color of lights connected through the CheerLights API by sending a tweet that mentions CheerLights along with the name of a color.
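You don't need special hardware to peek at the feed, either. Here's a minimal Python sketch of how the current CheerLights color might be fetched and translated to a hex value; it assumes the feed is still exposed as ThingSpeak channel 1417, and the hex values are the standard CSS codes for each color name in the CheerLights palette:

```python
import urllib.request

# The CheerLights color palette, mapped to standard CSS hex codes.
CHEERLIGHTS_COLORS = {
    "red": "#FF0000", "green": "#008000", "blue": "#0000FF",
    "cyan": "#00FFFF", "white": "#FFFFFF", "oldlace": "#FDF5E6",
    "purple": "#800080", "magenta": "#FF00FF", "yellow": "#FFFF00",
    "orange": "#FFA500", "pink": "#FFC0CB",
}

def color_to_hex(name):
    """Return the hex code for a CheerLights color name, or None if unknown."""
    return CHEERLIGHTS_COLORS.get(name.strip().lower())

def fetch_current_color(url="http://api.thingspeak.com/channels/1417/field/1/last.txt"):
    """Fetch the latest CheerLights color name (channel 1417 is assumed)."""
    with urllib.request.urlopen(url, timeout=10) as resp:
        return resp.read().decode("utf-8").strip()
```

A script like this, run on a loop, could drive an RGB LED the same way the Node-RED flow does.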

Using this Node-RED tutorial from The MagPi magazine, I was able to get CheerLights up and running on my Pi with relative ease. In the process, I learned a little bit about Node-RED, a browser-based, flow-based programming tool for wiring together hardware, APIs, and online services. Of note, I used a 10mm RGB LED with a common cathode, rather than the suggested LedBorg or three separate LEDs.

This diagram shows the flow used to trigger CheerLights on a Raspberry Pi using Node-RED.

I'm wondering what else might be achieved with Node-RED. The interface is not entirely intuitive, but it seems like it could be a useful tool for learning how smart objects talk and interact on the internet.

Sunday, March 5, 2017

Raspberry Pi & Weather Underground Data Experiments

Over the past couple of weeks, I've been playing around in an attempt to better understand and apply weather data that's available on the web.

As part of this journey, I hooked up my Raspberry Pi to a Sense HAT and used ThingSpeak to log the data. I lucked into finding this detailed tutorial, which helped me track my home's temperature and humidity levels (although I had to remove some strange characters from the code before it would run properly).
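The core of that setup is simple: read the sensors, then hit ThingSpeak's update endpoint with your channel's Write API key. A stripped-down sketch of the idea (the field numbers are my own choice, and the `sense_hat` import only works on the Pi itself):

```python
import time
import urllib.parse
import urllib.request

THINGSPEAK_UPDATE = "https://api.thingspeak.com/update"

def build_update_url(api_key, temperature, humidity):
    """Build a ThingSpeak update URL (field1 = temperature, field2 = humidity)."""
    params = urllib.parse.urlencode({
        "api_key": api_key,
        "field1": round(temperature, 2),
        "field2": round(humidity, 2),
    })
    return f"{THINGSPEAK_UPDATE}?{params}"

def log_forever(api_key, interval=60):
    """Read the Sense HAT and post to ThingSpeak every `interval` seconds."""
    from sense_hat import SenseHat  # only available on the Raspberry Pi
    sense = SenseHat()
    while True:
        url = build_update_url(api_key, sense.get_temperature(), sense.get_humidity())
        urllib.request.urlopen(url, timeout=10)
        time.sleep(interval)  # ThingSpeak rate-limits how often a channel can update
```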

I also completed raspberrypi.org's Fetching the Weather exercise to learn how to access data from a nearby Raspberry Pi Weather Station. I was surprised that the nearest RPi weather station is in Quebec, Canada; I was hoping to find a school in Vermont that was using one.
Weather data from Canada using a Raspberry Pi Weather Station

This discovery led me down a path that was unrelated to my Python exploration but splendidly related to weather data. Finding it hard to believe that the nearest weather station was in Canada, I stumbled upon the Weather Underground API, which connects to weather data in my home town. This prompted me to dig deeper into how I might use that data, perhaps in combination with a Photon microcontroller, as part of an art project I was working on.

To make a long story short, I signed up for a Weather Underground account and was given an API key that allowed me to tap into JSON data through a dedicated channel I created on ThingSpeak! While I still have a lot to learn, I was able to use the MATLAB analysis feature (which allows you to "Get Data From a Webpage") to harvest weather conditions by entering a unique URL containing my API key. I could then record things like wind speed and temperature, which meant I could use data ranges that I specified to trigger functions on my Photon! I know this is dorky, but the learning curve has been exhilarating!
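To give a feel for what "harvesting" that JSON looks like, here's a hedged Python sketch. The sample payload and field names (`current_observation`, `temp_f`, `wind_mph`) are my reconstruction of the kind of response the conditions endpoint returns, not an exact copy, and the trigger rule is just an example:

```python
import json

# A trimmed sample of the sort of JSON the Weather Underground
# conditions endpoint returns (field names reconstructed from memory).
SAMPLE = """{
  "current_observation": {
    "temp_f": 41.0,
    "wind_mph": 12.0,
    "weather": "Partly Cloudy"
  }
}"""

def parse_conditions(payload):
    """Pull the handful of fields I care about out of the full response."""
    obs = json.loads(payload)["current_observation"]
    return {"temp_f": obs["temp_f"], "wind_mph": obs["wind_mph"], "weather": obs["weather"]}

def should_trigger(conditions, wind_threshold=10.0):
    """Example trigger rule: fire a Photon function when it's windy enough."""
    return conditions["wind_mph"] >= wind_threshold
```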

A smattering of JSON data that may be harvested through Weather Underground

Thursday, December 8, 2016

Artwork Brought to Life with Photon Sensor Data (#IoT)

Art Brought to Life with Photon Sensor Data

I'm making progress toward my goal of creating a wifi-connected book that uses data to help tell a story. As a proof of concept, I created a ThingHTTP app and a React app in ThingSpeak to trigger a Particle function (on one Photon) based upon readings from a photocell (attached to a second Photon). To get started, I created a ThingSpeak channel and a Particle webhook by following this tutorial.

Once I created the webhook and connected my Photon to a light sensor, the Photon started logging data to ThingSpeak. To make the data more useful, I modified the Photon code in the tutorial by adding the following lines to the loop, just above the line containing the Write API key for the ThingSpeak channel.

value = map(value, 0, 4096, 0, 255); // rescale the 12-bit analog reading (0-4095) to 0-255
value = constrain(value, 0, 255);    // clamp the result so it never leaves the 0-255 range
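For anyone more at home in Python, those two Wiring calls can be sketched like this (assuming Arduino-style integer `map()` behavior):

```python
def arduino_map(value, in_min, in_max, out_min, out_max):
    """Integer re-mapping, matching Arduino/Wiring map()."""
    return (value - in_min) * (out_max - out_min) // (in_max - in_min) + out_min

def constrain(value, lo, hi):
    """Clamp value into the range [lo, hi]."""
    return max(lo, min(value, hi))

reading = 2048  # a raw 12-bit analog reading from the Photon (0-4095)
scaled = constrain(arduino_map(reading, 0, 4096, 0, 255), 0, 255)
```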

Next, I uploaded this code to a second Photon, connecting the Photon to a paper circuit using alligator clips. Lastly, I created the ThingHTTP and React apps.

The React app allowed me to set up a trigger related to the data. I set it up so that a reaction happens any time the sensor picks up a value of 150 or greater (when it's cloudy or dark and the resistance increases) or 50 and under (when it's bright or sunny). The ThingHTTP app allowed me to post an HTTP request to Particle, triggering the Particle function that illuminates the owl's eyes and the candle flame.
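The React rules boil down to a simple two-threshold decision. A sketch of the same logic in Python, with the thresholds from my setup:

```python
def reaction(value):
    """Mirror the React app's rules: 'dark' at >= 150, 'bright' at <= 50."""
    if value >= 150:
        return "dark"    # cloudy or dark: photocell resistance rises
    if value <= 50:
        return "bright"  # bright or sunny: resistance drops
    return None          # in-between readings trigger nothing
```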

I later added LEDs to simulate falling snow.

Monday, November 14, 2016

A Working Prototype (#IoT)

Merging Different Functions

I've reached the point in my exploration where I've got several different functions merged into one program and a working prototype that uses alligator clips, a servo, a buzzer, NeoPixels, and surface mounted LEDs. The trick is going to be figuring out how to streamline the circuit so that it can fit neatly inside of a book (that will be built to accommodate it), with different functions playing on different pages to help tell a story of some sort.  

I just ordered some conductive fabric tape strips to test out for the hinges. This tape is conductive on both sides, so I could conceivably use it on the hinges and adhere it to copper tape soldered to the Photon.

I have been unsuccessful getting a more compact servo to work with the code, so I may have to table the servo idea, unless I can come up with a way to conceal it.  The servo I'm currently using, the TowerPro SG90, is over an inch tall.  The one I want to use, the HK5320, is much smaller.  I can't figure out if the issue I'm having relates to voltage or something else.

SMD LED Function

Monday, November 7, 2016

Interactive NeoPixels with Photon and Twitter (#IoT)

Today, I spent time experimenting with code and playing around with If This Then That (IFTTT), which recently changed its set-up. My goal was to start merging programs on my Photon. So far, I've got a servo and NeoPixels running in one program, but I'd still like to add code for a buzzer and LEDs that will be triggered by a light sensor.

My most exciting breakthrough was figuring out how to use IFTTT's "New Tweet From Search" feature, which makes it possible to trigger a web request by filtering a search on Twitter. In my experiment, I created Applets (formerly referred to by IFTTT as recipes) that can control the colors of NeoPixels connected to my Photon, in much the same way that CheerLights works!

This could provide an interesting way to interact with a wifi-connected book. A reader could send a tweet to change the color of LEDs in the book, or scan a QR code to achieve the same effect by triggering a Maker Event (also set up in IFTTT). While I'd already figured out how to do this with my own tweets, I now know how to allow other people's tweets to interact with my Photon. My next step is to add code to the program so that a musical function is called in response to data received from a light sensor.
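For the QR-code path, the code behind the scan just needs to hit the Maker Event's web-request URL. A small Python sketch, assuming the usual Maker URL format and a made-up event name (`book_blue`):

```python
import urllib.request

def maker_event_url(event, key):
    """Build the web-request URL for an IFTTT Maker event (URL format assumed)."""
    return f"https://maker.ifttt.com/trigger/{event}/with/key/{key}"

def fire_event(event, key):
    """POST the event, e.g. from whatever page the QR code points at."""
    req = urllib.request.Request(maker_event_url(event, key), method="POST")
    return urllib.request.urlopen(req, timeout=10)
```

Scanning the QR code would then fire the event, and IFTTT would relay it on to the Photon.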

Friday, October 28, 2016

Photocell Data in ThingSpeak Can Trigger a Servo on Photon (#IoT)

In the process of learning about how the Internet of Things works, as I attempt to create a wifi-connected book, I wanted to figure out how to get sensor data from a ThingSpeak channel to trigger a Particle function, even if the data I'm using isn't super juicy.

In an earlier post, I set up my own private ThingSpeak channel and connected it to the Particle API via a webhook. Since then, I've updated my code so that it maps the analog data, constraining the readings to a range of 0-255 instead of 0-4095.
While playing around with ThingSpeak this time, I used the React and ThingHTTP apps to make the photocell data trigger a servo connected to a Photon. The React app allowed me to set up a trigger related to the data. I set it up so that a reaction would happen any time the sensor picked up a value of 30 or under (when it was sunny and the resistance dipped). Then, the ThingHTTP app allowed me to post an HTTP request to Particle, in much the same way you would if using the Maker Channel on If This Then That (IFTTT).

This breakthrough is exciting, because I have figured out how to control a physical object using my own data.  Now, if I wanted to, I could use the data from a photocell to trigger a piezo tune, a servo, or an LED light show, based upon whether it's sunny or dark outside.  I'm not yet sure how I'll use this new knowledge, but it's mine nonetheless!

Tuesday, October 25, 2016

Triggering a Rickroll on a Photon with a QR Code (#IoT)

Over the past couple of days, there has been a lot of news circulating about security issues present in gadgets connected to the Internet of Things.  Keeping that in mind, I'm cautiously continuing my experimentation with the Photon, trying to be mindful of the fact that there are privacy risks involved in transforming ordinary objects into smart objects.

Thinking about ways that music might contribute to the interactive book I'm going to be creating, I started tinkering with the code from SparkFun's Music Time tutorial, which plays Rick Astley's song "Never Gonna Give You Up." I tweaked the code by creating a new Particle function that only plays the music when it's triggered by a web call. I used the Maker Channel on If This Then That (IFTTT) to trigger it and linked it to a QR code (see previous post).
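Calling a Particle function from outside is a single POST to the Particle Cloud API. A hedged Python sketch, where the device ID and the function name (`playTune`) are placeholders for whatever you registered with `Particle.function()` in your firmware:

```python
import urllib.parse
import urllib.request

def particle_function_url(device_id, function_name):
    """Build the Particle Cloud URL for calling a registered device function."""
    return f"https://api.particle.io/v1/devices/{device_id}/{function_name}"

def call_particle_function(device_id, function_name, access_token, arg=""):
    """POST to the Particle Cloud to invoke the function (e.g. start the tune)."""
    url = particle_function_url(device_id, function_name)
    data = urllib.parse.urlencode({"access_token": access_token, "arg": arg}).encode()
    return urllib.request.urlopen(urllib.request.Request(url, data=data), timeout=10)
```

IFTTT's Maker Channel is effectively making this same request on your behalf when the QR-code trigger fires.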

Even though electronic buzzer music can be a little annoying, I'm thinking that it would be interesting if QR codes embedded into a book could trigger different tunes or sound effects to help advance a narrative.