During the pandemic, events that normally draw large crowds have either been cancelled or been forced to find creative ways to host them online. For my town’s annual car show, the organizers went the creative route. The event normally draws thousands of people to Doylestown, PA each July. Streets are closed, and classic and antique cars line them. It is one of the town’s biggest events. Because of COVID-19, hosting an event like that this year would be impossible, so a twist on “virtual” was developed.
For the “COVID Edition Car Show,” cars would drive a designated route through the streets of town, interspersed with the regular traffic, and the public was encouraged to watch it all on a Facebook Live stream. With cars traversing a route spread over roughly a two-mile radius around town, how do you provide coverage of that? Many, many challenges. And since this is an all-volunteer event that supports charity, there was no budget, either!
Multiple cameras around the route were needed to cover the bursts of classic cars intermixed with the normal traffic. The stream itself would originate from my home office over FiOS Gigabit internet. For the “cameras,” I found a free iOS/Android camera app that broadcasts a standard RTMP stream over cell or Wi-Fi. The feeds were received via my TebWeb domain by a Mac mini running an RTMP server (also free software).
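To give a concrete picture of the plumbing, each phone publishes to its own RTMP URL on the receiving server, which listens on the default RTMP port 1935. A minimal sketch, with a hypothetical domain and stream-key naming (not the show’s actual setup):

```python
# Sketch: generate one RTMP publish URL per camera operator.
# "stream.example.com" and the "cameraN" keys are hypothetical
# placeholders, not the domain or keys actually used for the show.

def publish_urls(host: str, cameras: int, app: str = "live") -> list[str]:
    """Build rtmp:// URLs, one unique stream key per camera, on the default port 1935."""
    return [f"rtmp://{host}:1935/{app}/camera{n}" for n in range(1, cameras + 1)]

# Each operator pastes one of these into their broadcaster app.
for url in publish_urls("stream.example.com", 6):
    print(url)
```

Keeping one key per operator means a frozen or dropped phone never collides with another camera’s stream.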
I used the very popular and robust OBS Studio on my main iMac to handle the actual switching and stream encoding to Facebook. It, too, is free, open source software. I was able to add the received RTMP “camera feeds” to OBS. It worked! I could receive feeds from anywhere with a strong cell connection. I tested, and tested (and tested) to make sure this worked consistently, and it did! All an operator needed was the free Larix Broadcaster app and an RTMP URL to enter. It works!
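For reference, one common way to pull a feed like this into OBS is a Media Source with “Local File” unchecked and a network URL as the input; the URL below is illustrative:

```
OBS: Sources → + → Media Source
  Local File:   unchecked
  Input:        rtmp://localhost:1935/live/camera1   (a feed relayed by the RTMP server)
  Input Format: flv   (RTMP carries FLV; this can usually be left blank)
```

One Media Source per stream key gives you one switchable “camera” per operator in the OBS scene list.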
Another challenge was how to communicate with the “cameras.” It was met by a suggestion from someone on the car committee who was a big fan of Google Apps: “Why not use a Google Meet room?” This was genius, and it worked very well. Each camera team used two devices: one was the camera; the other ran a Google Meet room shared with the other “cameras” and myself. On my end, a separate iPad mini sitting next to me served this purpose. Now I could direct cameras, tell them when they were live, and so on.
The next challenge was how to provide commentary. I already had Sound Siphon on my iMac, which creates virtual audio inputs and outputs from other software. At first, I envisioned one announcer calling a Google Voice number I had and feeding that via the web browser and Sound Siphon into OBS. It worked fine, but allowed just one announcer. How to do more? I went back to the Google Meet concept that was already working as my intercom to the cameras. I had a second Google account, so I could make an “Announcer Room.” Leaving it open in the web browser, I could route its output into OBS. Now anyone who joined the room could provide commentary and talk among themselves. I also discovered that when I talked via the web browser, my voice was NOT heard in the stream. Beautiful: now I could cue and direct the announcers easily.
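The audio routing described above can be summarized as a signal chain (device and room names are illustrative):

```
Announcers’ phones/laptops
        │  (Google Meet “Announcer Room”)
        ▼
Web browser on the iMac ──► Sound Siphon virtual output ──► OBS audio source ──► Facebook Live
        ▲
        └── Host microphone: heard by the announcers in Meet for cues,
            but not routed into OBS, so it never reaches the stream
```

The key property is that only the browser’s *output* is captured, which is why the host’s cues stay off the air.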
It was time for the three-hour live stream. I met with my “camera ops” on site prior to the stream (good thing I live close by!). My main announcers would be at their homes. They were also moderating the Facebook comments and pulled from them to offer commentary. They would provide more “color commentary,” since the delayed Facebook output they watched from home was not in sync with what was happening on the street. We also had one of the committee people occasionally join the Announcer Room from the street to provide more play-by-play.
After deciding where the cameras would be on the route, I headed back to my home office (dubbed “The War Room” for the show). Now the nervous part: would I see camera feeds? I anxiously waited as my operators joined the Google Meet room and started their Larix app. I see them! Six live views from various points around town. Then I hear from the announcers in their Google Meet room. This looks like it is all really going to work!
WE’RE LIVE AT 5!
I started the feed with several minutes of pre-produced videos. OBS Studio really does a lot! Its workflow and quirks take a little getting used to, but it handled everything, even marking the live cameras with a bug. I even made a credit sequence with text, all inside OBS Studio. It really worked great. It took around the first hour for all of us to settle into the routine, including those directing the cars through town. It was not by any means a flawless live production: there were occasional frozen images, announcer audio cutting out from a weak cell connection, and so on. But as someone who has worked at all levels of video production, from cable access to national broadcast, I was pleased… we were pleased with the results. The technology worked! Techniques can be refined, but the underpinnings that made it technically possible were there. We pulled off a viable live, multi-camera, multi-location production, with mobile devices and open source software.
THE SOFTWARE LIST (Mac-based)
- OBS Studio (Open Broadcaster Software) — https://obsproject.com/
- Local RTMP Server (Mac) — https://github.com/sallar/mac-local-rtmp-server
- Larix Broadcaster (iOS/Android) — https://softvelum.com/larix/
- Google Meet — https://meet.google.com/
- Virtual Audio In/Out — https://github.com/ExistentialAudio/BlackHole
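One way to exercise the receive side of this chain without a phone at all is to push a test clip to the RTMP server with ffmpeg; the file name and stream key here are placeholders:

```shell
# Publish a local file to the RTMP server in real time, as if it were a camera.
#   -re      read input at its native frame rate (simulates a live source)
#   -c copy  pass the audio/video streams through without re-encoding
ffmpeg -re -i test.mp4 -c copy -f flv rtmp://localhost:1935/live/camera1
```

With that running, the feed should appear in OBS exactly as an operator’s phone would, which makes rehearsing scene switches possible before show day.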
The applications for smaller-scale, small-town events are big. This is a far better way to provide Facebook Live coverage than a single feed from someone’s phone. The “War Room” used for an event on the other side of town could just as easily be on the other side of the state, or further. And setting up everyone’s mobile devices was very easy using a free app. Having spent nearly 35 years in video production and technology, I still found it amazing that we were able to accomplish this, and with no budget. Even five years ago, this would not have been possible to do this way.