LiveShowy: Let Us Play

By Owen Bickford

Elevator Pitch

We know the BEAM and LiveView are powerful tools for making traditional applications more interactive. We know the BEAM is a player in crypto. But what about fun and creativity? With LiveView, the BEAM has superpowers that may open the door to entirely new use cases.

Description

By now, we have seen example after example of Phoenix LiveView making applications more interactive and easier to build. Forms with realtime feedback, tables and charts with updating values, realtime chat, interactive games… Phoenix can handle it all. What makes Phoenix unique is the relative ease with which you can build these kinds of features. During this talk, we will focus on using the BEAM to have fun, connect a group of people, and create something unique. A new application, LiveShowy, will be the test bed and playground for some fun and interesting ideas.

Notes

Goal

The goal of this talk is to demonstrate how LiveView could accept rapid input from several users (LePetit seats 325), input that is transformed into audio and visual representations during the talk. To minimize latency, the application may run on a local machine connected to a dedicated router, with attendees connecting via Wi-Fi.

If the event becomes virtual-only, the app would temporarily run on the web, allowing interaction from across the globe.

Technical Considerations

The demos will require audio, which could be provided via XLR from an audio interface (preferable) or HDMI from the laptop. If the event is live-streamed, virtual audience members will also need to hear the audio. The talk will ideally run entirely in the browser, including slides and demos.

In October, I demoed a proof of concept in which ~20 SmartLogic members each controlled an SVG circle on a shared “stage,” watching the other circles move in realtime. The app was hosted on my local MacBook Pro via Ngrok, and performance was surprisingly smooth. When my brother connected from Texas while I was in Michigan, latency was a stable 0–6 ms.
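LiveShowy's actual implementation isn't shown here, but the fan-out pattern behind that demo (every player's move broadcast to every connected LiveView process) can be sketched with nothing but `Registry` from OTP. Module and message names below are hypothetical; a real Phoenix app would likely reach for `Phoenix.PubSub` instead:

```elixir
# A minimal sketch of pub/sub fan-out on the BEAM using only OTP's Registry.
# Each LiveView process would subscribe via join/0; a player's move is then
# delivered to every subscriber as a plain message.
defmodule Stage do
  @registry Stage.Registry

  # Duplicate keys allow many processes to subscribe under the same topic.
  def start do
    Registry.start_link(keys: :duplicate, name: @registry)
  end

  # The calling process (e.g. a LiveView) subscribes to stage updates.
  def join do
    Registry.register(@registry, :stage, [])
  end

  # Broadcast one player's new position to all subscribers.
  def move(player_id, x, y) do
    Registry.dispatch(@registry, :stage, fn subscribers ->
      for {pid, _value} <- subscribers do
        send(pid, {:moved, player_id, x, y})
      end
    end)
  end
end
```

A subscribed LiveView would pick these messages up in `handle_info/2` and push the updated circle position to the browser. The point of the sketch is how little machinery the BEAM needs to fan one message out to hundreds of processes.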

I plan to start with a small “band” of volunteers who interact with the app, moving circles around the screen. Then I'll introduce a musical element, where each player controls notes on a shared instrument or plays a separate instrument.

Next, we'll move to a “choir” stage, where each person controls a voice. Part of the fun here will be directing the audience: Can we unite in harmony? How large can the choir be? Throughout, we'll monitor application metrics to gauge performance.

Wrap Up

After a few fun demos, we’ll walk through some key features and design decisions in the application and reflect on how the BEAM made it all possible. As time allows, we may address:

  • LiveView Hooks
  • SVG vs Canvas
  • PortMidi & the MIDI spec
  • Optimizations
  • Authentication & authorization
  • Real-world challenges: hardware, OS config, etc.
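As a taste of the PortMidi/MIDI territory, here is a small sketch of the pitch math the spec defines: note 69 is A4 (440 Hz), and each semitone step multiplies the frequency by 2^(1/12). The module name is illustrative, not from LiveShowy:

```elixir
# Sketch: converting MIDI note numbers to frequencies, per the MIDI spec's
# equal-tempered tuning (A4 = note 69 = 440 Hz).
defmodule Pitch do
  @a4_note 69
  @a4_freq 440.0

  # Valid MIDI note numbers are 0..127.
  def note_to_freq(note) when note in 0..127 do
    @a4_freq * :math.pow(2, (note - @a4_note) / 12)
  end
end

Pitch.note_to_freq(69) # 440.0 (A4)
Pitch.note_to_freq(60) # middle C, ≈ 261.63 Hz
```

Mapping each audience member's input onto note numbers like these is what would turn hundreds of LiveView events into an instrument.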

Finally, we’ll ponder other novel ways the BEAM could be used for collaboration and fun.