Avolites Media server drives dazzling lighting install at The Rio Hotel & Casino


An Avolites Ai media server is at the heart of the new lighting installation that wraps both towers at the Rio Hotel & Casino. It is used to map, control and schedule over three miles – and 351,032 pixels – of Clear LED’s X-Bar 25 mm product, which wraps 360 degrees around the buildings, in a scheme designed by the creative lighting team of Chris Kuroda and Andrew “Gif” Giffin.

Well-known for their work as live music and entertainment lighting designers, Chris and Gif programmed a series of cues, scenes and sequences that run automatically, bringing an engineered lighting aesthetic to the architecture of this Vegas hotel and casino. 

Chief Nerd Ruben Laine, from the Australia- and US-based Creative Integration Studio, was asked to devise a control solution that treated video as lighting. This meant outputting lighting in a video-centric format, enabling micro-manageable levels of detail to be accessed for each vertical LED strip, some over 4,000 pixels long.
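One way to picture that video-centric approach is to treat each vertical strip as a single column of a rendered frame and sample it pixel by pixel. The sketch below is purely illustrative – the function name, frame layout and dimensions are invented, not the actual Ai/Notch pipeline:

```python
# Conceptual: sample a rendered frame column by column, one column
# per vertical LED strip, so the lighting is driven by video-style output.
def frame_to_strips(frame, strip_lengths):
    """frame: 2D list [row][col] of (r, g, b) tuples, one column per strip.
    strip_lengths: pixels per strip (some exceed 4,000 in the install)."""
    strips = []
    for col, length in enumerate(strip_lengths):
        strips.append([frame[row][col] for row in range(length)])
    return strips

# Tiny toy frame: 4 rows across 2 strips of different lengths
frame = [[(r, c, 0) for c in range(2)] for r in range(4)]
strips = frame_to_strips(frame, [4, 3])
print([len(s) for s in strips])  # [4, 3]
```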

The Rio’s lighting scheme is part of an ongoing multi-million-dollar refit to the resort being managed by Dreamscape Companies. The new LEDs replace 3.6 miles of old neon that had been in residence since the ’90s.

The overall project is the brainchild of Marty Millman, VP of Development and Construction at Dreamscape. He specifically didn’t want new lighting that resembled any other generic, clinically pixel-mapped building installation fed with video content. He wanted something unique, different and stand-out.

A major Phish fan for many years, Marty reached out to the band’s long-term lighting creative team of Chris and Gif, challenging them to produce the specific look he envisioned for The Rio, having been inspired by their lighting designs for the band. Their touring work frequently uses linear stage/theatre-style light sources – like Robe Tetra2s and TetraXs – as a dynamic structural base for their familiar rig of automated trusses, simultaneously adding another layer of kinetic movement.

Chris and Gif have programmed hundreds of thousands of lighting cues for the assorted Phish tours and projects, using lighting consoles and effects engines, which give the animation a special crisp and clearly defined appearance. This was exactly what Marty wanted, and a workflow that is second nature to Chris and Gif.

Chris and Gif quickly realized that the large number of pixels involved meant that DMX driven directly from a lighting console was not an option. Ruben immediately grasped that they needed ‘video playback’ that did not involve video content. Using the Avolites media server and Ai was one of Ruben’s first thoughts. 

“I have always been an Ai guy,” he comments, quickly moving to spec this product for the task, in combination with the real-time graphics rendering of Notch.

Ruben, who has used Avolites Ai media servers for over 10 years, collaborated with the Avolites team in the UK to add a new function to the Ai server’s ‘Follow-on’ actions: a custom play mode allowing “randomised specificity”. Together with a Notch block that Ruben built, this manages all the media, control and scheduling, giving lighting control across the entire surface of the buildings.

This custom scheduling enables the playback of a long ‘base look’ followed by a series of randomly selected sequences before returning to the base look and repeating the process, which means the same series of sequences is never repeated and never becomes predictable.

The programmed lighting scenes are divided into two categories: ‘base looks’, which are subtly animated, and shows, which are faster, bolder and higher contrast. A base look plays for five minutes, followed by a one-minute show – all randomly selected – followed again by another randomly selected base look, then another one-minute show.
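A minimal sketch of that alternating, randomised rotation might look like the following – clip names, pool sizes and the `schedule` helper are hypothetical, not the Rio’s actual media or Ai’s internals:

```python
import random

# Hypothetical clip pools; the real install has around 50-60 of each.
BASE_LOOKS = [f"base_{i:02d}" for i in range(60)]  # slow, subtly animated
SHOWS = [f"show_{i:02d}" for i in range(60)]       # fast, bold, high contrast

def schedule(cycles, seed=None):
    """Alternate a five-minute base look with a one-minute show,
    picking each clip at random so the rotation never becomes
    predictable."""
    rng = random.Random(seed)
    playlist = []
    for _ in range(cycles):
        playlist.append((rng.choice(BASE_LOOKS), 300))  # 5 minutes
        playlist.append((rng.choice(SHOWS), 60))        # 1 minute
    return playlist

for clip, seconds in schedule(cycles=2, seed=1):
    print(f"play {clip} for {seconds}s")
```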

“Being able to dictate a range of files to each clip, from which it would pick randomly for its next clip, was amazing,” Chris explains. 

The lighting programming itself was loosely timed on a clip-by-clip basis, with no two clips the same length, which made rigidly scheduled tools like the Calendar or Macro Script impossible to use.

Chris, Gif and Ruben were impressed with the input from Avolites and in particular with Ai developers Simone Donadini and Terry Clark. They started lighting programming with the linear elements in Notch, treating each vertical line as its own layer or canvas, complete with dedicated intensity controls and a “form” to allow for solids, gradients, or patterns, plus full transform controls like position and scale, as well as different color and alpha controls. 

This meant that a single layer could maneuver complex gradients using one element, and these layers were then stacked.

A second, independently controlled set of 20 ‘super layers’ allowed Gif to get “really funky” with the lighting programming, stacking two-dimensional controls across the entire array. These super layers render underneath the 200 linear layers, with similar but more complex controls and effects.

Finally, by including animatable masks, the individual architectural segments and features of the buildings could be highlighted, which maintained Rio’s architectural identity.
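The per-strip layer model described above could be represented roughly like this – the field names are illustrative assumptions, not Notch’s actual parameter set:

```python
from dataclasses import dataclass

@dataclass
class StripLayer:
    """One vertical LED strip treated as its own layer/canvas.
    Field names are illustrative, not Notch's actual parameters."""
    intensity: float = 1.0          # dedicated intensity control
    form: str = "solid"             # "solid", "gradient" or "pattern"
    position: float = 0.0           # transform controls
    scale: float = 1.0
    color: tuple = (255, 255, 255)  # colour control
    alpha: float = 1.0              # alpha control

# 200 linear layers, plus the 20 'super layers' stacked with them
linear_layers = [StripLayer() for _ in range(200)]
super_layers = [StripLayer(form="pattern") for _ in range(20)]
print(len(linear_layers), len(super_layers))  # 200 20
```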

“We wanted to achieve this without the building getting lost in the glamour and glitz of its shiny, new technicolor veil,” explains Chris, adding that “the genius” of this control methodology “was that it allowed our familiar tools and lighting programming workflow to be used during the creative process.”

Ideas were discussed just like they were standard lighting cues, creating and manipulating them on the fly using a lighting console and lighting console logic, relying on many of their concert lighting tricks like color wipes across the whole canvas, narrow bands of white leading in a new color from “rocket tips”, or creating shapes with the negative space and animating them into numerous forms.

With around 50 or 60 slow-moving looks and another 50 or 60 fast-moving ones, they needed a server that would pick these to play randomly over the course of a year, so that nothing was repeated regularly.

This Notch and Q Series / Ai combination also effectively crunches 2,000 universes of pixel data into 8 DMX universes of externally exposed ArtNet channels. Each sequence is played back from the console over ArtNet, recorded into Notch, then rendered at 60 frames per second for the smoothest possible motion across each pixel on The Rio’s facade.
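The scale of that reduction follows directly from the pixel count: at three colour channels per pixel and 512 channels per DMX universe, driving the array directly would need roughly 2,000 universes – a back-of-the-envelope check assuming RGB pixels, not a figure from Avolites:

```python
# Back-of-the-envelope check of the data reduction (assumes RGB pixels).
PIXELS = 351_032                 # pixel count quoted for the install
CHANNELS_PER_PIXEL = 3           # one channel each for red, green, blue
CHANNELS_PER_UNIVERSE = 512      # a DMX512 universe carries 512 channels

channels = PIXELS * CHANNELS_PER_PIXEL
universes = -(-channels // CHANNELS_PER_UNIVERSE)  # ceiling division
print(f"{channels:,} channels -> {universes:,} universes")
# -> 1,053,096 channels -> 2,057 universes
```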

The Q Series media server outputs the rendered clips into CLEAR LED’s signal processors which are then pushed down a few miles of fibre optic cable. “Q Series / Ai was without a doubt a crucial part of this adventure. From our original concept of running the show as live Notch blocks, through every creative, technical, and executive challenge, to the final execution. Using Q Series / Ai allowed us to effectively map the building in just a couple of hours,” comments Ruben.
