I first experimented with CheerLights a few months ago while attempting to better understand the Internet of Things. Today, I'm using a Raspberry Pi Zero rather than a Photon microcontroller. One dazzling detail about CheerLights is that it lets people all over the world synchronize the color of their lights. Anyone can change the color of every light connected through the CheerLights API by sending a tweet that mentions CheerLights along with the name of a color.
Using this Node-RED tutorial from The MagPi magazine, I was able to get CheerLights up and running on my Pi with relative ease. In the process, I learned a little bit about Node-RED, a browser-based, flow-based programming tool for wiring together smart objects on the web. Of note, I used a 10mm RGB LED with a common cathode rather than the suggested LedBorg or three separate LEDs.
This diagram shows the flow used to trigger CheerLights on a Raspberry Pi using Node-RED.
I'm wondering what else might be achieved with Node-RED. The interface is not entirely intuitive, but it seems like it could be a useful tool for learning how smart objects talk to one another on the internet.
I'm making progress toward my goal of creating a wifi-connected book that uses data to help tell a story. As a proof of concept, I created a ThingHTTP app and a React app in ThingSpeak to trigger a Particle function (on one Photon) based upon readings from a photocell (attached to a second Photon). To get started, I created a ThingSpeak channel and a Particle webhook by following this tutorial.
Once I created the webhook and connected my Photon to a light sensor, the Photon started logging data to ThingSpeak. To make the data more useful, I modified the Photon code in the tutorial by adding the following lines to the loop, just above the line containing the Write API key for the ThingSpeak channel.
value = map(value, 0, 4095, 0, 255);   // map the 12-bit ADC reading (0-4095) to 0-255
value = constrain(value, 0, 255);      // constrain values between 0 and 255
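For context, here's a minimal sketch of how those lines might sit in the sensor-logging loop; the pin assignment and event name are my assumptions rather than the tutorial's exact code:

int photocellPin = A0;                        // assumed analog pin for the photocell

void setup() {
}

void loop() {
    int value = analogRead(photocellPin);     // the Photon's ADC returns 0-4095
    value = map(value, 0, 4095, 0, 255);      // map the reading to 0-255
    value = constrain(value, 0, 255);         // clamp any stray values
    // publish the reading; a Particle webhook subscribed to this event
    // forwards it to ThingSpeak using the channel's Write API key
    Particle.publish("light_reading", String(value), PRIVATE);
    delay(20000);                             // ThingSpeak accepts updates every 15+ seconds
}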
Next, I uploaded this code (the sketch for the paper circuit) to a second Photon, connecting the Photon to the circuit using alligator clips. Lastly, I created the ThingHTTP and React apps.
The React app allowed me to set up a trigger related to the data. I set it up so that a reaction happens any time the sensor picks up a value of 150 or greater (when it's cloudy or dark and the resistance increases) or 50 or under (when it's bright or sunny). The ThingHTTP app allowed me to post an HTTP request to Particle, triggering the Particle function that illuminates the owl's eyes and the candle flame.
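On the Photon side, ThingHTTP just needs a cloud function to call. Here's a minimal sketch of what that might look like; the function name, pin choices, and argument values are my assumptions:

int owlEyesPin = D0;                      // assumed pin for the owl's eyes
int candlePin  = D1;                      // assumed pin for the candle flame

int setScene(String command) {            // ThingHTTP passes this argument
    if (command == "dark") {              // React fired on a reading of 150 or greater
        digitalWrite(owlEyesPin, HIGH);
        digitalWrite(candlePin, HIGH);
        return 1;
    }
    if (command == "bright") {            // React fired on a reading of 50 or under
        digitalWrite(owlEyesPin, LOW);
        digitalWrite(candlePin, LOW);
        return 0;
    }
    return -1;                            // unrecognized command
}

void setup() {
    pinMode(owlEyesPin, OUTPUT);
    pinMode(candlePin, OUTPUT);
    Particle.function("scene", setScene); // the URL in ThingHTTP points at this function
}

void loop() {
}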
I've reached the point in my exploration where I've got several different functions merged into one program and a working prototype that uses alligator clips, a servo, a buzzer, NeoPixels, and surface-mount LEDs. The trick is going to be figuring out how to streamline the circuit so that it fits neatly inside a book (which will be built to accommodate it), with different functions playing on different pages to help tell a story of some sort.
I just ordered some conductive fabric tape strips to test out for the hinges. The tape is conductive on both sides, so I could conceivably use it on the hinges and adhere it to copper tape soldered to the Photon.
I have been unsuccessful in getting a more compact servo to work with the code, so I may have to table the servo idea unless I can come up with a way to conceal it. The servo I'm currently using, the TowerPro SG90, is over an inch tall. The one I want to use, the HK5320, is much smaller. I can't figure out whether the issue I'm having relates to voltage or something else.
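To help isolate whether the problem is code or power, a bare-bones sweep sketch like this takes everything else out of the equation (D0 is my assumed signal pin, and the servo should be powered from a supply that can handle its stall current rather than from the Photon itself):

Servo testServo;              // Servo support is built into the Particle firmware

void setup() {
    testServo.attach(D0);     // assumed PWM-capable pin
}

void loop() {
    testServo.write(0);       // sweep between the extremes
    delay(1000);
    testServo.write(180);
    delay(1000);
}

If the SG90 sweeps with this sketch but the HK5320 doesn't on the same pin, the problem is more likely power or signal level than firmware.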
Today, I spent time experimenting with code and playing around with If This Then That (IFTTT), which recently changed its setup. My goal was to start merging programs on my Photon. So far, I've got a servo and NeoPixels running in one program, but I'd still like to add in code for a buzzer and LEDs that will be triggered by a light sensor.
My most exciting breakthrough was figuring out how to use IFTTT's "New Tweet From Search" feature, which makes it possible to trigger a web request by filtering a search in Twitter. In my experiment, I created Applets (formerly referred to by IFTTT as recipes) that can control the colors of NeoPixels connected to my Photon, in much the same way that CheerLights works!
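The Applet's web request ultimately calls a Particle function on my Photon. Here's a minimal sketch of a color-setting function, assuming the neopixel library, eight pixels on pin D2, and a cloud function named "color" (all assumptions on my part):

#include "neopixel.h"                   // Particle port of the Adafruit NeoPixel library

#define PIXEL_COUNT 8                   // assumed strip length
#define PIXEL_PIN   D2                  // assumed data pin
#define PIXEL_TYPE  WS2812B

Adafruit_NeoPixel strip(PIXEL_COUNT, PIXEL_PIN, PIXEL_TYPE);

int setColor(String color) {            // the Applet's web request passes the color name
    uint32_t c;
    if      (color == "red")   c = strip.Color(255, 0, 0);
    else if (color == "green") c = strip.Color(0, 255, 0);
    else if (color == "blue")  c = strip.Color(0, 0, 255);
    else return -1;                     // unrecognized color name
    for (int i = 0; i < PIXEL_COUNT; i++) {
        strip.setPixelColor(i, c);
    }
    strip.show();
    return 1;
}

void setup() {
    strip.begin();
    strip.show();                       // start with all pixels off
    Particle.function("color", setColor);
}

void loop() {
}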
This could provide an interesting way to interact with a wifi-connected book. A reader could send a tweet to change the color of LEDs in the book, or scan a QR code to achieve the same effect by triggering a Maker Event (also set up in IFTTT). While I'd already figured out how to do this with my own tweets, I now know how to allow other people's tweets to interact with my Photon. My next step is to add code to the program so that a musical function is called in response to data received on a light sensor.
Over the past couple of days, there has been a lot of news circulating about security issues present in gadgets connected to the Internet of Things. Keeping that in mind, I'm cautiously continuing my experimentation with the Photon, trying to be mindful of the fact that there are privacy risks involved in transforming ordinary objects into smart objects.
Thinking about ways that music might contribute to the interactive book I'm going to be creating, I started tinkering with the code from SparkFun's Music Time tutorial, which plays Rick Astley's song "Never Gonna Give You Up." I tweaked the code by creating a new Particle function that only plays the music when the function is triggered by a web call. I used the Maker Channel on If This Then That to trigger it and linked it to a QR code (see previous post).
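The gating pattern itself is simple: the cloud function only sets a flag, and the tune actually plays from loop(). A minimal sketch, with the note-playing left to the tutorial's code:

bool playRequested = false;

int playSong(String command) {       // triggered by the web call
    playRequested = true;            // just set a flag; don't block the cloud call
    return 1;
}

void setup() {
    Particle.function("play", playSong);
}

void loop() {
    if (playRequested) {
        playRequested = false;
        // play the melody from the SparkFun Music Time tutorial here
    }
}

Keeping playback out of the cloud function prevents the web request from timing out while the song plays.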
Even though electronic buzzer music can be a little annoying, I'm thinking that it would be interesting if QR codes embedded into a book could trigger different tunes or sound effects to help advance a narrative.
In my last post, I got a servo running on a Photon by using web requests sent from Node.js and Twitter. After getting to that point, I wondered how I might automate the process so that someone else could trigger my servo, if I were to integrate one into a wifi-connected book. Today, I had an idea that this might be achieved by creating a QR code. I wasn't sure whether this was possible until I came across a blog by Trevor Fox. While his post was related to a completely different (albeit clever) use of the QR code, I noticed that he was using If This Then That (IFTTT), as I had when I created my web request for Twitter. This prompted me to do some experimenting, which ended up working!
The last time I played with my servo, I created a recipe that paired Twitter with an "Action" on the Maker Channel. The "Action" was a web POST. When paired with the hashtag #servo, it triggered a Particle function that made the servo move. This time, I needed to create a "Trigger" to receive a web request.
The "Trigger" became a unique URL that I could then paste into a QR code generator to produce a QR code.
After making a new recipe on IFTTT that used the "Trigger" to call the "Action," I was able to control the servo by scanning the QR code.
I'm continuing my exploration of the Photon today and am happy to report a couple new things that I've learned. My initial goal was to get a servo up and running and to control it via Twitter.
Thanks to some open source code from MIT, shared on the SparkFun website, I was able to get my servo to run at an established time and date, which is pretty cool! The code that I'm using was designed to have a servo tip fish food into a tank at a certain point in time, but an additional Particle function in the code also allows the servo to be moved via a web request.
Wiskers is the name of my device. Feed is the name of the Particle function.
Although I was able to quickly get the servo to work by calling the function from the command line tool I set up using Node.js, creating the web request was a little trickier. In order to call the function on If This Then That (IFTTT), I had to provide the device ID and access token for my Photon, as well as a unique URL linking to Particle. This took a while to figure out, but it's working now. I just have to remember to refresh my recipe in IFTTT whenever I change the code in the Particle app.
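For reference, here are the two ways I can now trigger the function: directly from the Node.js-based Particle CLI, and via the kind of POST request that IFTTT sends to the Particle cloud API (the device ID and access token below are placeholders):

particle call Wiskers feed

curl https://api.particle.io/v1/devices/{DEVICE_ID}/feed -d access_token={ACCESS_TOKEN}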
In any case, I am trying to imagine how I might embed a tiny servo into a book to help advance a narrative. It would have to be smaller than the one I'm using now. Off the top of my head, I can imagine a servo powering a tiny propeller in a steampunk book. Now, I need to figure out a way to allow the servo to be triggered by someone other than me. I'm not yet sure how to automate that, but I'll keep taking baby steps until I figure it out.
Taking inspiration from Jie Qi and Natalie Freed, I am embarking upon a quest to create a wifi-connected book of my own. While I'm only at the start of my inquiry, I've already got a few ideas swirling.
Photon Unboxing
After reading details about Natalie Freed's Tide Notebook at NEXMAP, I discovered that she used the Spark Core microcontroller to connect her book to the web. In particular, she accessed tide data and used it to control LEDs mounted inside. The newest iteration of the Spark Core, the Particle Photon microcontroller, can also send or receive data through Particle Build. Paired with ThingSpeak, a free open-source Internet of Things (IoT) platform, anyone can access or share data and use it to control physical objects, such as the LEDs in a book or a wearable...even without an MIT degree!
Hello World!
My Remembrall
One enticing project that uses a Photon, in combination with several cloud-based apps, is the Internet Connected Remembrall, a physical orb that glows red when its owner neglects something on their to-do list.
By following the directions and really thinking about what each app was communicating to the others, I began to slowly understand how the Internet of Things (IoT) works on a small scale.
Once I got my own Photon connected to the Internet, I was able to start taking the code apart and changing it. I literally squealed when my LED, connected to a hashtag on Twitter, turned on for the first time! I've started learning a bit about the role of webhooks and JSON (JavaScript Object Notation), both of which come into play when connecting physical objects with data. Thanks to the excellent documentation in the Remembrall Instructable, I now have a road map for making my first wifi-connected object tied to the data I choose.
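As it turns out, a Particle webhook definition is itself just a small piece of JSON. A minimal example, with an event name and URL that are only illustrative:

{
    "event": "light_reading",
    "url": "https://api.thingspeak.com/update",
    "requestType": "POST",
    "mydevices": true
}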
Another thing that I've been playing with is creating an object connected via Particle Build to ThingSpeak's CheerLights channel. In case you haven't heard of CheerLights yet (I only just did), data generated by people's tweets around the world can be harnessed to control the color of LEDs in your own physical objects. For example, if I send a tweet to @CheerLights and request a specific color, a webhook triggered by the Twitter feed will signal a change in the LED color on all of the objects connected to the data feed, including my own.
If I connect my own object to the data (in this case, a Photon with a SparkFun Wearable Shield attached to a NeoPixel), I can see the light color change in near real time!
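Under the hood, CheerLights is just a public ThingSpeak channel that anyone can poll; the most recent color request comes back as plain text from the channel's last-entry endpoint (channel 1417 is the CheerLights channel as far as I can tell, so check the CheerLights docs if this has changed):

https://api.thingspeak.com/channels/1417/field/1/last.txt

This returns a color name such as "blue", which a sketch can then match against its own color table.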