Concept of standing in front of a wall of light

Computer Science Bachelor's Capstone

Client  City of Syracuse Innovation Team

Date   August 2019 – January 2020

Purpose As the City of Syracuse transitions toward becoming a smart city, this project is a transparent way to understand some of the tech that could be used in a smart city.

Design/Project Document   https://docs.google.com/document/d/1rpWmhr3shepzdNnI8t1h5EmimbXuCRRqdOppr5mSHas/edit?usp=sharing

Project status   COMPLETE

Repository link https://github.com/EdwardDeaver/SyracuseInnovationLEDProject

IoT website (no longer receiving data) https://cityofsyracuse.github.io/PerformanceDashboard/

Programming languages used OpenFrameworks 0.11 (C++ creative coding framework), NodeJS (ExpressJS, SocketIO), Python3, Arduino (C++)

Programming languages explanation – OpenFrameworks was used to create an app that talks to a FadeCandy, receives serial input, creates a graphical dashboard of the serial distance data, and sends data to Python over a UDP socket. NodeJS was used for the Heroku web server, which receives data via a POST request and sends it to the web client via SocketIO. Python3 was used to establish a UDP socket, send POST data to the NodeJS server, and send data to AWS DynamoDB.

Cloud services used AWS DynamoDB, Heroku 

Cloud services explanation AWS DynamoDB was used to store distance sensor data. Heroku was used to host the NodeJS web server. 

Hardware used 2x Arduino ESP8266s, 2x ultrasonic distance sensors (3V or 5V, HC-SR04 compatible, RCWL-1601), 4x 30 LED/m RGB NeoPixel LED strips, 1x Raspberry Pi 4 (4GB)

Hardware Explanation The ESP8266s were used to obtain data from the ultrasonic sensors and send it over serial. The distance sensors were used to measure how far someone was from the installation. The NeoPixels were used to indicate how far away you were. The Raspberry Pi 4 was used to run the project.

Other applications used Adobe Illustrator  (for hardware illustration posters)

Hardware Illustration Posters https://www.behance.net/gallery/90824077/Capstone-Project-Definitions

Description: 

This was an interactive activation built to demonstrate smart tech in an approachable fashion.

Citizens' knowledge of "smart" things is important: vendors are currently trying to sell governments facial recognition software, ways to monetize citizens' data, and artificial intelligence solutions. This system by itself will not produce a complete understanding of "smart" things across the entire public, but it can provide a jumping-off point to explore ideas.

First, the project tries to help the understanding of "smart" through radical transparency about the system. Radical transparency is a business philosophy based on total openness; it is similar in some respects to the open source ethos. Second, I created technology definitions for parts of the system, written for the lowest technology literacy rate of a given area. Though I do not know how to measure technology literacy rates, I made sure the definitions use metaphors that relate to devices the reader has probably interacted with.

The project relies on these two parts because when transparency increases in a [governmental] system, it acts as a multiplier on preconceived notions of government, reinforcing those held ideas. Researchers at Utrecht University in the Netherlands conducted a study (N=658) to test the link between government transparency and trust; they found that, because of people's preexisting ideas about government, transparency did little to increase trust. It is possible to change these preconceived notions, though. Advertising agencies have been doing it for years: organizations like DoSomething.org and the Ad Council run advocacy and education campaigns. At the federal level, code.gov was created to show what open source projects the federal government has made and is currently sharing.

Video of the final product:


Video of interacting with the installation (2020)

Video of the OpenFrameworks app running:



Video of the NodeJS website receiving the sensor data:



The final posters:



Initial steps:

Initially the project was going to be a Twitter bot that manipulated images sent to it and displayed them on an LED wall. The process for applying for the Twitter API is straightforward, except if you have a government association. After stating what I was doing with the API, I received a follow-up email asking me to restate what I had said. By that time over a week had passed, and I moved the project in a different direction for interactivity.

These are quotes from the initial idea document:

I want interactivity as opposed to passive sensors (like a temperature sensor, because you can just ask your phone what the temperature is or look outside if it's snowing). So, let's make a bot (Twitter) that takes in photos via an @ mention, then cuts them into 3×3 grids (or higher-resolution grids: 8×8, 16×16), then finds the dominant color in each "quadrant," then exports that to an LED light grid, with a screen saying who sent the photo. Then a webcam on the other side of the room takes a photo of it and tweets the photo and @'s the person who sent it.

This bypasses sensors entirely and forces people to interact with technology to see a physical manifestation of digital actions. By creating physical experiences we can break down the mystification of technology and create magic. We will also have a plaque/website saying in plain English how this works.

Flow:

1. Tweet this image at the bot (photo by @unsplash):

Some artsy-looking building. Stock image from @unsplash.

2. The program will download the image then start to analyze it:

2.a First, split the image up into a 3×3 grid:

Splitting the image into a 3×3 grid.

2.b Then find the dominant color of each box in the grid (sketched in code after this list):

Dominant color of every cube in the grid.

3. Then represent those colors on a grid of LEDs, and put the tweeter’s name on a screen.

3.a Then use a webcam on the other side of the room to take a picture of it and tweet out the photo and @ the tweeter.
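
Steps 2.a and 2.b are easy to prototype. Below is a minimal sketch of them in Python using Pillow; it is illustrative only (it was never part of this project's code), and the 1×1-resize trick approximates the dominant color with a simple average rather than a true dominant-color calculation.

```python
# Illustrative sketch only: split an image into a grid and approximate
# each cell's dominant color. Assumes Pillow is installed.
from PIL import Image

def grid_colors(path, rows=3, cols=3):
    img = Image.open(path).convert("RGB")
    w, h = img.size
    cell_w, cell_h = w // cols, h // rows
    colors = []
    for r in range(rows):
        for c in range(cols):
            cell = img.crop((c * cell_w, r * cell_h,
                             (c + 1) * cell_w, (r + 1) * cell_h))
            # Resizing to 1x1 averages the pixels -- a cheap stand-in for a
            # real dominant-color method such as k-means clustering.
            colors.append(cell.resize((1, 1)).getpixel((0, 0)))
    return colors  # rows*cols (r, g, b) tuples, e.g. 9 for a 3x3 grid

print(grid_colors("building.jpg"))
```

Each returned tuple would then map to one LED (or block of LEDs) on the wall.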

Major pivot to the main idea – Midterm:

After the Twitter approval took too long, I moved to using an Arduino and sensors. I also wanted an LED wall like this:

My mock-up of the LED wall I wanted. Made with Blender by me.

By this point the project hardware had solidified into using a Raspberry Pi 4 as the master device, with Arduinos as sensor interfaces over USB. The Raspberry Pi would control NeoPixel LEDs using a FadeCandy, a Teensy-based NeoPixel control USB board:

Hardware data flow from my midterm slideshow presentation.

The initial application that ran on the Raspberry Pi 4 was made using Processing, a Java-based creative coding framework. I eventually moved away from it due to low FPS when reading from the serial input and writing to the FadeCandy. The Processing library for the FadeCandy works by establishing points on the canvas and sampling their colors; those colors are sent via the Open Pixel Control library to the FadeCandy server, which then sends the data on to the FadeCandy board.

Software data flow from my midterm slideshow presentation. It uses a Processing app GUI window to send data to the FadeCandy.
https://www.youtube.com/watch?v=mS5aIwXD1FE
Demo of the ultrasonic distance sensor being read by Processing to control an on-screen rectangle, whose data was sent to the FadeCandy.
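
For context, the Open Pixel Control protocol that the FadeCandy server speaks is simple enough to show directly. This is a hedged sketch, not project code: a raw OPC "set pixels" message sent to fcserver, which listens on TCP port 7890 by default (the project used the Processing/OpenFrameworks OPC libraries instead).

```python
# Illustrative raw Open Pixel Control client.
import socket
import struct

def send_pixels(pixels, host="127.0.0.1", port=7890, channel=0):
    """pixels: list of (r, g, b) tuples, one per LED."""
    data = bytes(c for p in pixels for c in p)
    # OPC header: channel byte, command byte (0 = set pixel colors),
    # then a 16-bit big-endian payload length.
    header = struct.pack(">BBH", channel, 0, len(data))
    with socket.create_connection((host, port)) as s:
        s.sendall(header + data)

send_pixels([(255, 0, 0)] * 30)  # light the first 30 LEDs solid red
```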

In order to achieve my dream of a large LED wall (a dream that still existed at this time), I planned on moving the application to a Mac Mini, increasing the number of FadeCandys to three, and increasing the number of NeoPixel strips to 24. Each FadeCandy can support 8 LED strips. This did not happen.

This is the hardware diagram I wanted: 3x Arduinos with 3x ultrasonic distance sensors, 3x FadeCandys, and 8 LED strips per FadeCandy. This never happened.

The Midterm to Final:

The move towards OpenFrameworks:

This began my journey from my comfortable Java world to C++. I started to remake the program in OpenFrameworks 0.10.0 using Xcode on a Mac. Because OpenFrameworks 0.10.0 did not support the Raspberry Pi 4, I had planned on using a Mac Mini as the main hardware platform. And because this is C++, not code compiled for the JVM, each compilation needs to be done on the specific device you're targeting. Eventually OpenFrameworks 0.11.0 would be released with Raspberry Pi 4 support, and the Pi ended up being the final platform for this project.

Eventually I got to a point where I could control the position of boxes in OpenFrameworks (on the Mac) using two ultrasonic sensors as X/Y controllers.

https://www.youtube.com/watch?v=3VLrgS0LWVo&feature=emb_title
Video of me using two ultrasonic distance sensors to control the X / Y position. 

During this time I also tried to make a standby animation using time as a variable to create a smooth gradient. This resulted in ~200 rectangles per rectangle. It was not added to the final product.

https://www.youtube.com/watch?v=MaJnSsrK23I&feature=emb_title
Time based change to create gradients in OpenFrameworks. 
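
The idea behind that animation can be sketched in a few lines of OpenFrameworks code: phase many thin rectangles by position and elapsed time so the gradient slowly drifts. This is a minimal reconstruction of the technique, not the original code; the slice count and hue speed are arbitrary.

```cpp
// Illustrative OpenFrameworks draw(): a time-driven gradient built from
// many thin rectangles.
void ofApp::draw(){
    float t = ofGetElapsedTimef();
    int slices = 200;
    float sliceW = ofGetWidth() / (float) slices;
    for(int i = 0; i < slices; i++){
        // Hue drifts with both the slice's position and elapsed time.
        float hue = fmodf((i / (float) slices) * 255.0f + t * 20.0f, 255.0f);
        ofSetColor(ofColor::fromHsb(hue, 200, 255));
        ofDrawRectangle(i * sliceW, 0, sliceW + 1, ofGetHeight());
    }
}
```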

Once the interaction between the sensors and the FadeCandy was complete, I tried to add more IoT features: writing the sensor data to a database and to a website.

For the database I chose to go with AWS DynamoDB, a NoSQL database. I chose DynamoDB because it had a very generous free tier and I wanted to get experience using AWS.

For the website I made a NodeJS Express POST endpoint that used an internal SocketIO connection to send newly received data from the POST request to the client. I also created a really smooth pulsating green light that updates when data is received. I wanted the site to give anyone unable to visit the installation another way to experience it.

https://www.youtube.com/watch?v=h2EaYBodnsk&feature=emb_title
Website receiving the sensor data. 

Final:

System data flow:

The data in the application is mainly gathered and passed around in the OpenFrameworks update() function. First, update the OPC connection. Then restart the stage, which essentially wipes the canvas. Then, if Serial 1 is available, get the data, convert the bytes to a float, and set it to newx1; if Serial 2 is available, do the same for newx2. If the timer is over 60 minutes, reset the timer and the stage. Next, play a predefined tone based on the newx1 and newx2 amounts, then draw the corresponding number of squares, scaled linearly from newx1 and newx2. Then read the stage pixels and send them to the FadeCandy over OPC. Finally, send the sensor data to the UDP socket.

System data flow for the OpenFrameworks update() function.
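
Condensed into code, the loop looks roughly like the sketch below. The member and helper names (readFloatFrom, playToneFor, drawBandsFor, and so on) are illustrative, not copied from the repo.

```cpp
// Condensed sketch of the update() flow; helper names are illustrative.
void ofApp::update(){
    opcClient.update();                        // 1. service the OPC/FadeCandy link
    resetStage();                              // 2. "restart the stage" (wipe canvas)

    if(serial1.available() > 0)
        newx1 = readFloatFrom(serial1);        // 3. sensor 1 bytes -> float
    if(serial2.available() > 0)
        newx2 = readFloatFrom(serial2);        // 4. sensor 2 bytes -> float

    if(ofGetElapsedTimef() - lastReset > 3600){
        lastReset = ofGetElapsedTimef();       // 5. hourly timer + stage reset
        resetStage();
    }

    playToneFor(newx1, newx2);                 // 6. predefined tone per distances
    drawBandsFor(newx1);                       // 7. distance-mapped rectangles
    drawBandsFor(newx2);
    sendStagePixelsToOpc();                    // 8. stage pixels -> FadeCandy

    std::string msg = ofToString(newx1) + "," + ofToString(newx2);
    udp.Send(msg.c_str(), msg.length());       // 9. forward readings to Python
}
```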

The color data flow:

When a number is input, the app checks the amount and determines how many levels of boxes will be engaged, and each engaged level draws a "rectangle." The lower the input amount, the more rectangles are drawn, because a low reading from the distance sensors means the user is very close. Each "rectangle" is made up of 70 rectangles to produce a gradient.

This shows both of the lights' effect.

Ex. Left lights:
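
Continuing the sketch from the previous section, the per-side mapping could look like this. The sensor range, eight-level split, and green palette are assumptions; the 70-slice gradient comes from the description above.

```cpp
// Hypothetical drawBandsFor(): closer readings engage more "rectangles",
// and each "rectangle" is 70 thin slices forming a gradient.
void ofApp::drawBandsFor(float distanceCm){
    int levels = (int) ofMap(distanceCm, 0, 200, 8, 0, true); // range assumed
    float bandH  = ofGetHeight() / 8.0f;
    float sliceW = ofGetWidth() / 70.0f;
    for(int level = 0; level < levels; level++){
        for(int i = 0; i < 70; i++){
            float b = ofMap(i, 0, 69, 60, 255);  // dim -> bright across the band
            ofSetColor(0, b, 0);                 // palette assumed
            ofDrawRectangle(i * sliceW, level * bandH, sliceW + 1, bandH);
        }
    }
}
```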

Network flow (OpenFrameworks):

The OpenFrameworks application sends a UDP message containing the data to a Python UDP socket. The Python app sends that data to the NodeJS app and to AWS DynamoDB. The NodeJS app receives the data and sends it to an internal SocketIO room that the client site connects to. Note that the GitHub repo does not have the AWS DynamoDB write code; unfortunately, I believe the only version exists on the Raspberry Pi.

This is the "network data flow" of the entire application. The dashed line marked the bounds of the localhost and world wide web.
This is the “network data flow” of the entire application. The dashed line marked the bounds of the localhost and world wide web. 
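
Since the DynamoDB write code was lost, here is a hedged reconstruction of the whole Python relay. The endpoint URL, shared key, UDP port, and table schema are all assumptions, not values from the repo.

```python
# Hypothetical reconstruction of the Python relay; URL, key names, port,
# and table schema are assumptions (the original write code was lost).
import socket
import time

import boto3
import requests

NODE_URL = "https://example.herokuapp.com/sendsensorsata"  # placeholder host
UDP_ADDR = ("127.0.0.1", 11999)                            # port assumed

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(UDP_ADDR)
table = boto3.resource("dynamodb").Table("SensorData")     # table name assumed

while True:
    data, _ = sock.recvfrom(1024)            # e.g. b"12.3,45.6" from OpenFrameworks
    sensor1, sensor2 = data.decode().split(",")
    requests.post(NODE_URL, json={           # forward to the Heroku/NodeJS app
        "key": "SHARED_SECRET", "sensor1": sensor1, "sensor2": sensor2,
    })
    table.put_item(Item={                    # archive the reading
        "timestamp": str(int(time.time())),  # partition key assumed
        "sensor1": sensor1,                  # stored as strings, sidestepping
        "sensor2": sensor2,                  # DynamoDB's float restriction
    })
```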

Network flow (NodeJS):

  • Uses Express to create a REST API (GET for the website)
  • GET (/): returns the website
  • POST (/sendsensorsata): receives the input JSON data from the Python app and validates the request – the key must match the key stored as an environment variable in NodeJS, and sensor1 and sensor2 must be defined and be numbers. If the request is valid, it sends the sensor data to the SocketIO client (a sketch follows below).
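
A minimal sketch of that endpoint might look like the following; the environment variable name, SocketIO event name, and file layout are assumptions, and the real repo code may differ.

```javascript
// Illustrative Express + SocketIO server; names marked "assumed" are not
// taken from the repo.
const express = require("express");
const app = express();
const http = require("http").createServer(app);
const io = require("socket.io")(http);

app.use(express.json());
app.get("/", (req, res) => res.sendFile(__dirname + "/index.html"));

app.post("/sendsensorsata", (req, res) => {
  const { key, sensor1, sensor2 } = req.body;
  if (key !== process.env.API_KEY) return res.sendStatus(401); // env var name assumed
  if (sensor1 === undefined || sensor2 === undefined ||
      isNaN(Number(sensor1)) || isNaN(Number(sensor2))) {
    return res.sendStatus(400);
  }
  io.emit("sensorData", { sensor1, sensor2 }); // event name assumed
  res.sendStatus(200);
});

http.listen(process.env.PORT || 3000);
```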

Wiring diagram:

The wiring diagram for the installation is very simple. The power distribution for the WS2812 NeoPixel LED strips is in parallel, and the data lines run to the FadeCandy. The ESP8266s are connected to the Raspberry Pi via USB-to-serial cables. The distance sensors have trigger and echo connections as well as power and ground connections.

The hardware wiring for the installation. The LED strips are run in parallel from a 5V 12A power supply. The data lines are connected to a FadeCandy. The ESP8266s are connected via USB-to-serial cables, and the distance sensors are connected to trigger and echo pins, plus power and ground.
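
The ESP8266 side of that wiring can be sketched as a standard HC-SR04-style reading loop; the pin numbers, baud rate, and polling interval below are assumptions.

```cpp
// Illustrative ESP8266/Arduino sketch for an HC-SR04-compatible sensor;
// pins, baud rate, and timing are assumptions.
const int TRIG_PIN = 12;
const int ECHO_PIN = 14;

void setup() {
  Serial.begin(9600);
  pinMode(TRIG_PIN, OUTPUT);
  pinMode(ECHO_PIN, INPUT);
}

void loop() {
  // 10-microsecond trigger pulse, then time the echo.
  digitalWrite(TRIG_PIN, LOW);
  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH);
  delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);

  long duration = pulseIn(ECHO_PIN, HIGH, 30000);  // 30 ms timeout
  float distanceCm = duration / 58.0;              // round-trip time -> cm
  Serial.println(distanceCm);                      // read by the Pi over USB serial
  delay(50);
}
```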

Understandable definitions:

These are the definitions that were used to make the posters and make the technology used more accessible.

NeoPixels: These lights are like Christmas lights. Each light can be changed to a different color.

Arduino: This is a "microcontroller." It's a device that is told to do one thing, and it does that thing forever. It's similar to your cell phone, but instead of being able to run multiple apps at once, it can only run one app.

Distance Sensor: This allows the Arduino to tell how close you are. This is like motion detectors on garage lights, or automatic doors at your favorite retail store.

FadeCandy: This tells the light strip which individual lights to light up and what color. It’s like a remote for the lights but controlled from the computer.

Raspberry Pi: This is a very tiny computer. It works just like your laptop but is super compact. The computer runs a program that listens for the Arduino to say something – then tells the lights to light up.

Github:

The main application: https://github.com/EdwardDeaver/SyracuseInnovationLEDProject
