Digital Wave is a project created during Hack the Arena, designed to be the next generation of "The Wave." Feel free to peruse this README; you can also find more information on this project's DevPost page.
Ideal Workflow
- Customers visit the site and enter their section, row, and seat number (see the registration sketch after this list)
- Customers point their phones' front-facing cameras at the Jumbotron
- The Jumbotron displays a single, clearly dominant vibrant color
- Phones then detect that color and display it on their screens, creating an orchestra of colors in the arena
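As an illustration of the first step, the client could send the entered seat information to the Server over Socket.io, so that colors could later be targeted per seat. This is only a sketch under assumptions: the form field IDs and the `register` event name are hypothetical and not part of the current codebase.

```js
// Hypothetical seat registration: read the section/row/seat fields and
// send them to the Server over Socket.io. Field IDs and the 'register'
// event name are assumptions for illustration.
const socket = io(); // socket.io-client, served by the Digital Wave server

document.querySelector('#seat-form').addEventListener('submit', (event) => {
  event.preventDefault();
  socket.emit('register', {
    section: document.querySelector('#section').value,
    row: document.querySelector('#row').value,
    seat: document.querySelector('#seat').value,
  });
});
```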
Our MVP targets Google Chrome and Firefox on Android devices (and, technically, any device with a camera). Unfortunately, Safari does not support getUserMedia, so we cannot interface with the cameras of iOS devices and Macs from the web (using Chrome on those devices may be a possibility). Native apps for those platforms could come later.
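Because of these browser limits, the client needs to check for camera support before doing anything else. A minimal check, assuming the standard `navigator.mediaDevices.getUserMedia` API, might look like this (the fallback message is illustrative):

```js
// Detect whether this browser exposes getUserMedia at all.
// Unsupported browsers (e.g. Safari at the time of writing) get a
// plain message instead of the camera-driven color display.
function cameraSupported() {
  return Boolean(navigator.mediaDevices && navigator.mediaDevices.getUserMedia);
}

if (!cameraSupported()) {
  document.body.textContent =
    'Sorry, your browser cannot share its camera with Digital Wave.';
}
```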
- The client has access to the user's camera and reads frames from it
- The browser processes each frame and determines the most vibrant color
- The client renders that color across the entire screen (see the sketch after this list)
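A minimal sketch of this pipeline is below. It assumes the front-facing camera (per the workflow above), downsamples each frame onto a small canvas, and scores pixels by saturation times brightness as a stand-in for "vibrance"; the actual scoring Digital Wave ends up using may differ.

```js
// Sketch of the client pipeline: open the front-facing camera, sample a
// downscaled frame, pick the most "vibrant" pixel (saturation * brightness
// is an assumed heuristic), and paint the whole screen with that color.
const video = document.createElement('video');
const canvas = document.createElement('canvas');
const ctx = canvas.getContext('2d');

async function start() {
  const stream = await navigator.mediaDevices.getUserMedia({
    video: { facingMode: 'user' }, // front-facing camera, as in the workflow
  });
  video.srcObject = stream;
  await video.play();
  canvas.width = 64;  // heavy downsampling: we only need one dominant color
  canvas.height = 48;
  requestAnimationFrame(tick);
}

function tick() {
  ctx.drawImage(video, 0, 0, canvas.width, canvas.height);
  const { data } = ctx.getImageData(0, 0, canvas.width, canvas.height);

  let best = { score: -1, r: 0, g: 0, b: 0 };
  for (let i = 0; i < data.length; i += 4) {
    const r = data[i], g = data[i + 1], b = data[i + 2];
    const max = Math.max(r, g, b);
    const min = Math.min(r, g, b);
    const saturation = max === 0 ? 0 : (max - min) / max;
    const score = saturation * (max / 255); // crude "vibrance" measure
    if (score > best.score) best = { score, r, g, b };
  }

  // Cover the entire screen with the detected color.
  document.body.style.backgroundColor = `rgb(${best.r}, ${best.g}, ${best.b})`;
  requestAnimationFrame(tick);
}

start().catch(console.error);
```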
After receiving a connection, the Server broadcasts color info via Socket.io to all devices (in the backup case).
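For the backup case, a bare-bones Socket.io server might look like the sketch below; the port, the `color` event name, and the payload shape are assumptions made for illustration.

```js
// Backup case: a minimal Socket.io server that pushes a color to every
// connected phone. Event name ('color') and payload shape are assumptions.
const http = require('http').createServer();
const io = require('socket.io')(http);

io.on('connection', (socket) => {
  console.log('device connected:', socket.id);
});

// Call this whenever the arena operator wants the crowd to change color.
function broadcastColor(hex) {
  io.emit('color', { hex });
}

// Example: turn the whole arena orange a few seconds after startup.
setTimeout(() => broadcastColor('#ff6600'), 5000);

http.listen(3000, () => console.log('Digital Wave server listening on :3000'));
```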
Note that if the best-case plan of attack is realized, the Server does nothing more than serve the computer vision code to the client; the client and Jumbotron then take full responsibility for Digital Wave.
The Server may eventually broadcast video to the Jumbotron.
The Jumbotron expects to read raw video information from the Server.
We interact with the camera via the browser and use basic computer vision to detect the most vibrant color.
If the best case does not work, it's also possible for the Server to directly broadcast color streams to clients.
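On the client side of that backup path, each phone would simply listen for the broadcast and paint its screen; the `color` event name and payload here mirror the assumed server sketch above.

```js
// Backup-case client: skip the camera entirely and just render whatever
// color the Server broadcasts. Mirrors the assumed 'color' event above.
const socket = io();

socket.on('color', ({ hex }) => {
  document.body.style.backgroundColor = hex;
});
```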