Avolites Media Server Illuminates Exterior of Rio, Vegas

An Avolites Ai media server is at the heart of the new lighting installation that wraps both towers of the Las Vegas landmark, the Rio Hotel & Casino. The server is used to map, help control, and schedule more than three miles and 351,032 pixels of "illuminative possibility" designed by Chris Kuroda and Andrew "Gif" Giffin, using ClearLED's X-Bar 25 mm product, which wraps 360 degrees around the buildings. Well known for their work as live music and entertainment lighting designers, Kuroda and Giffin programmed a series of elegant cues, scenes, and sequences that run automatically, bringing a unique lighting aesthetic to the architecture of this iconic Vegas hotel and casino.

Ruben Laine, founder and "chief nerd" of Australia- and US-based Creative Integration Studio, was asked to devise a control solution that treated video as lighting. This involved outputting lighting in a video-centric format, enabling micro-manageable levels of detail to be accessed for each vertical LED strip, some of which are more than 4,000 pixels long.

The Rio's lighting scheme is part of an ongoing multi-million-dollar refit of the resort being managed by Dreamscape Companies. The new LEDs replace 3.6 miles of old neon that had been in residence since the 1990s. The overall project is the brainchild of Marty Millman, VP of development and construction at Dreamscape. He specifically did not want new lighting that resembled any other generic, clinically pixel-mapped building installation fed with video content, instead calling for something unique, different, and standout.

A major Phish fan for many years, Millman reached out to the band's long-term lighting creative team of Kuroda and Giffin, challenging them to produce the specific look he envisioned for the Rio. Their work for Phish frequently uses linear stage/theatre-style light sources, such as Robe Tetra2s and TetraXs, as a dynamic structural base for their familiar rig of automated trusses, simultaneously adding another layer of kinetic movement. Kuroda and Giffin have programmed hundreds of thousands of lighting cues for assorted Phish tours and projects, using lighting consoles and effects engines, which give the animation a crisp and clearly defined appearance. This was exactly what Millman wanted.

Kuroda and Giffin quickly realized that the enormous number of pixels involved meant that DMX driven directly from a lighting console was not an option. Enter Laine, who immediately grasped that what was needed was video playback that did not involve video content. The Avolites Ai media server was one of Laine's first thoughts. "I have always been an Ai guy," he comments. He quickly moved to spec the product for the task, in combination with the real-time graphics rendering of Notch.

Laine collaborated with the Avolites team in the UK to add a new function to the Ai server's "follow-on" actions that allows for "randomized specificity" as a custom play mode, managing all the media, control, and scheduling through a Notch block that Laine built and giving lighting control across the entire surface of the buildings. This custom scheduling, allowing randomization, enables the playback of a long "base look" followed by a series of random sequences before returning to the base look again and repeating the process, which also means the same series of sequences is never repeated and the playback never becomes predictable.
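The logic behind such a randomized follow-on mode is easy to picture. Below is a minimal sketch in Python, not the Ai server's actual implementation: the clip pools, the play stub, and the default durations (mirroring the five-minute/one-minute cadence described below) are all illustrative.

```python
import random
import time

def pick_random(pool, last=None):
    """Choose a clip at random, avoiding an immediate repeat."""
    candidates = [clip for clip in pool if clip != last] or list(pool)
    return random.choice(candidates)

def play(clip):
    """Placeholder standing in for the media server's playback trigger."""
    print(f"now playing: {clip}")

def run_schedule(base_looks, shows, base_secs=300, show_secs=60):
    """Alternate a randomly chosen base look with a randomly chosen show,
    indefinitely, so the order of sequences never becomes predictable."""
    last_base = last_show = None
    while True:
        last_base = pick_random(base_looks, last_base)
        play(last_base)
        time.sleep(base_secs)
        last_show = pick_random(shows, last_show)
        play(last_show)
        time.sleep(show_secs)
```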
The programmed lighting scenes are divided into two categories: "base looks," which are subtly animated, and "shows," which are faster, bolder, and higher contrast. A base look plays for five minutes, followed by a one-minute show, all randomly selected, followed again by another randomly selected base look, then another one-minute show. "Being able to dictate a range of files to each clip, from which it would pick randomly for its next clip, was amazing," Kuroda says. The lighting programming itself was more loosely timed on a clip-by-clip basis, with no two clips the same length, which made scheduling tools like Calendar or Macro Script impossible to use. Kuroda, Giffin, and Laine were all impressed with the input from Avolites and, in particular, from Ai developers Simone Donadini and Terry Clark.

They started the lighting programming with the linear elements in Notch, treating each vertical line as its own layer or canvas, complete with dedicated intensity controls and a "form" allowing solids, gradients, or patterns, plus full transform controls like position and scale, as well as separate color and alpha controls. This meant a single layer could maneuver complex gradients using one element, and these layers were then stacked. A second, independently controlled layer allowed Giffin to get "really funky" with the lighting programming, stacking the two-dimensional controls. Rendering underneath the 200 linear layers was a set of 20 "super layers" covering the entire array, with similar but more complex controls and effects. Finally, animatable masks let the individual architectural segments and features of the buildings be highlighted, maintaining the Rio's architectural identity.

"We wanted to achieve this without the building getting lost in the glamour and glitz of its shiny, new technicolor veil," Kuroda says, adding that "the genius" of this control methodology "was that it allowed our familiar tools and lighting programming workflow to be used during the creative process." Ideas were discussed just like standard lighting cues, created and manipulated on the fly using a lighting console and lighting-console logic, relying on many of their concert lighting tricks: color wipes across the whole canvas, narrow bands of white leading in a new color from "rocket tips," or shapes built from negative space and animated into numerous forms. With around 50 or 60 slow-moving looks and another 50 or 60 fast-moving ones, they needed a server that would pick these at random over the course of a year, so that nothing repeated regularly.

This Notch and Q Series / Ai combination also crunches 2,000 universes of pixel data into eight DMX universes of externally exposed Art-Net channels. Each sequence is played back from the console over Art-Net, recorded into Notch, then rendered at 60 frames per second for the smoothest possible motion across every pixel of the Rio's facade. The Q Series media server outputs the rendered clips to ClearLED's signal processors, and the data is then pushed down a few miles of fiber optic cable.
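The scale of that data also makes clear why DMX driven directly from a lighting console was never an option. A rough back-of-envelope check, assuming plain RGB pixels at three DMX channels each (the article does not state the pixel format): a 512-channel DMX universe carries at most 170 whole pixels, so addressing the full facade directly needs on the order of 2,000 universes, whereas the console only ever sees eight universes of high-level Art-Net controls that the Notch block expands into full pixel data.

```python
PIXELS = 351_032                # pixel count quoted for the installation
CHANNELS_PER_PIXEL = 3          # assumption: plain RGB, 3 channels per pixel
UNIVERSE_CHANNELS = 512         # channels in one DMX512 universe
PIXELS_PER_UNIVERSE = UNIVERSE_CHANNELS // CHANNELS_PER_PIXEL   # 170

# Ceiling division: universes needed to address every pixel directly.
universes_needed = -(-PIXELS // PIXELS_PER_UNIVERSE)
print(universes_needed)         # 2065 -- consistent with the ~2,000 quoted
```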
"Q Series / Ai was, without a doubt, a crucial part of this adventure, from our original concept of running the show as live Notch blocks, through every creative, technical, and executive challenge, to the final execution. Using Q Series / Ai allowed us to effectively map the building in just a couple of hours," Laine says, adding that they probably spent more time driving around looking for a parking spot with a good view than actually doing the mapping.