Slightly Stoopid Take Avolites Gear on Tour

Christian Wissmuller • Supplier Scene • October 21, 2019

To bring their concerts to life visually, Slightly Stoopid now turns to Pulse Lighting, which supplies a full complement of Avolites gear, including a Sapphire Touch console, Ai media server software, and the lighting manufacturer’s new Synergy feature set.

“Every design starts with a conversation. The more we interact with our artists, the better the design translates,” says Mikey Cummings of Pulse Lighting.

“A majority of our tours use the Sapphire Touch, Arena, or Tiger Touch II,” he says. “With lots of video, color-mixing lasers, and LEDs in Slightly Stoopid’s design, we knew this would be a great opportunity to dive into Avolites’ new Synergy software alongside the Ai software we were already planning to use.”

Cummings volunteered for the project when the band’s management approached Pulse President Paul Hoffman, the longtime lighting designer for jam-band pioneers Widespread Panic. Pulse’s first collaboration with Slightly Stoopid was the band’s signature Closer To The Sun festival in Mexico last year.

“We designed the rig for their Mexico festival. After its success, they asked us to do some one-off designs for festivals and shows in the spring. Basically, we were creating independent lighting designs for each show based upon what local vendors could provide. They were pleased with everything we had done, so they asked us to present a full lighting design for their summer tour.”

Providing a complete touring package from Pulse inventory let Cummings and programmer/lighting director Dan Grabus assemble a “dream” setup, with the Ai software and the new Synergy package integrating seamlessly with the Avolites Sapphire Touch console used for control. While the design was a little lighter on LED fixtures than Cummings initially preferred, it was heavy on lasers.

“Synergy works really well with the lasers, which we didn’t expect,” he says. “We have four 30-watt lasers upstage and six crowd-scanning Phenoms on the downstage truss, all provided by Lightwave. This achieves a 3D look between the lasers and video, which project the same color and intensity patterns, creating a lot of depth. One of the things that we really like about Ai is how easy it is to manipulate and merge clips together within a layer. Instead of having to crop things in some outboard, third-party software and then bring it back, we are able to manipulate static images and content right on board.”
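Merging clips within a layer is, at bottom, per-frame compositing. As a rough sketch of the idea in Python (an illustration only, not Ai’s internals; the function names, the additive blend, and the opacity parameter are assumptions):

```python
import numpy as np

def blend_in_layer(base: np.ndarray, clip: np.ndarray,
                   opacity: float = 0.5) -> np.ndarray:
    """Mix a second clip over a base clip within a single layer.

    base, clip: float32 RGB frames in [0, 1], same shape.
    opacity:    contribution of the second clip (0 = off, 1 = full).
    """
    return np.clip(base + opacity * clip, 0.0, 1.0)

def crop(frame: np.ndarray, x: int, y: int, w: int, h: int) -> np.ndarray:
    """Cropping a static image 'on board' amounts to array slicing."""
    return frame[y:y + h, x:x + w]
```

An additive mix suits laser-and-video looks, since overlapping highlights reinforce one another rather than occlude.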

Instead of touring physical media servers, they opted for the Ai software license dongle, which Grabus could easily transport to any gig.

“The great thing about Ai is that you can patch 256 universes of pixels—that’s an amazing amount of added fixtures,” says Cummings. “Then, when using Synergy, it becomes a virtual part of the effect engine, like it’s a plug-in in your pixel mapper. It’s seamless and works absolutely fantastic.”
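For a sense of scale (the arithmetic below is mine, not a figure quoted by Pulse): a DMX512 universe carries 512 channels, so 256 universes amount to 131,072 channels of pixel data.

```python
# Channel budget for 256 DMX universes of pixels.
UNIVERSES = 256
CHANNELS_PER_UNIVERSE = 512  # DMX512 standard

total_channels = UNIVERSES * CHANNELS_PER_UNIVERSE  # 131,072

for mode, ch_per_pixel in (("RGB", 3), ("RGBW", 4)):
    print(f"{mode}: {total_channels // ch_per_pixel:,} pixels")

# RGB: 43,690 pixels
# RGBW: 32,768 pixels
# In practice a pixel is usually not split across a universe boundary,
# which caps RGB at 170 pixels per universe (43,520 in total).
```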

Cummings and Grabus report that they were impressed overall with Synergy’s functionality during both programming and operation. Whatever Avolites did in designing the feature set, the Pulse team says, it did it very well.

“Using Synergy, the console is actually triggering the video server through an integrated plug-in type of relationship. Everything is on the console—you can see all of your video clips and everything else. If you want to add content, you can add it from the console. You don’t have to go to the server. The console allows you to upload directly.”
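In other words, the console sends the server lightweight commands rather than video. A hypothetical sketch of what such a trigger could look like (the message fields, port, and UDP transport here are invented for illustration and are not the actual Avolites console-to-Ai protocol):

```python
import json
import socket

def trigger_clip(server: tuple[str, int], layer: int, clip_id: str,
                 intensity: float = 1.0) -> None:
    """Send a small 'play this clip on this layer' message to a media
    server. Purely illustrative; real consoles use their own protocol."""
    msg = json.dumps({"layer": layer, "clip": clip_id,
                      "intensity": intensity}).encode()
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(msg, server)

trigger_clip(("10.0.1.20", 9000), layer=1, clip_id="sunrise_loop")
```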

“Typically, you have video walls with content running, with lights cued to the colors and balanced into it the best you can. But now imagine that the entire lighting rig is just running the content. So you have these big graphical lighting focuses with the video coming from behind the lights, pushing through color and content, and your rig completely following it. It’s amazing.

“And all you have to do is engage it in the pixel mapper. You don’t have to write this lengthy syntax-based effect. You simply bring up the pixel mapper in the effect engine, touch the group of your choosing, select a video clip, and that’s it. If you want to change the color of that video clip, you can either do that in Ai from the console or run a separate effect from the console over top of it. It’s fully integrated. It saved us a massive amount of time in programming. Within two weeks, we were able to fully cue the show, which would have taken us three to four weeks otherwise.”
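Conceptually, that is what a pixel mapper does: each video frame is sampled at every fixture’s position in the rig layout, and the sampled color becomes that fixture’s level. A minimal Python sketch of the sampling step (the function and the normalized-coordinate layout are assumptions for illustration, not Avolites’ implementation):

```python
import numpy as np

def sample_frame(frame: np.ndarray, fixture_uv: np.ndarray) -> np.ndarray:
    """Sample a video frame at each fixture's (u, v) position.

    frame:      (H, W, 3) uint8 RGB frame from the media server.
    fixture_uv: (N, 2) floats in [0, 1], fixture positions over the plot.
    Returns:    (N, 3) uint8 RGB levels, ready to pack into DMX channels.
    """
    h, w = frame.shape[:2]
    xs = np.clip((fixture_uv[:, 0] * (w - 1)).astype(int), 0, w - 1)
    ys = np.clip((fixture_uv[:, 1] * (h - 1)).astype(int), 0, h - 1)
    return frame[ys, xs]

# Three fixtures spread along an upstage truss:
frame = np.zeros((1080, 1920, 3), dtype=np.uint8)
frame[:, :, 0] = 255  # an all-red frame for the example
truss = np.array([[0.1, 0.2], [0.5, 0.2], [0.9, 0.2]])
print(sample_frame(frame, truss))  # one RGB triple per fixture
```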
