Linkin Park’s collaboration with Steve Aoki on ‘A Light That Never Comes’ is hypnotically realized in the music promo for the song - a cyberpunk-like journey into a digital world that features ‘glitchy’ re-creations of the band members. The video, directed by Linkin Park programmer and turntablist Joe Hahn, relies on visuals from Ghost Town Media, which developed a scanning methodology for the clip’s characters and crafted a complex world for them to inhabit. Here’s a step-by-step account of how they did it.
Step 1: Concept
‘A Light That Never Comes’ is a song written for Linkin Park’s second remix album, Recharged. Depicting that remix theme became the driving factor behind the video. “We wanted it to feel the same way a DJ samples audio cues and clips to make a larger piece,” explains Ghost Town Media principal/creative director Brandon Parvini. “It had to have that same fractured, remix feel. It also went with the notion of the band as ‘digital bastardizations’ or glitches of their original selves. Sometimes the glitches can be so beautiful that we try to find a soul inside of them.”
That idea led Parvini to further research into ‘4D’ capture systems. Played back as video files, 4D captures offered the right sense of jitter, sampling-like glitches and general chaos in the images (the individual frames are provided as OBJ files). “When you capture progressive OBJ sequences, the points don't share their placement across time,” notes Parvini.
“A single point at the tip of the nose won't necessarily be at the tip of the nose three frames later - it might shift all over the place. You have to think of it like throwing a trash bag over a form: as the form moves and shifts around inside the scrim, the wrapper, everything's inconsistent. I wanted to play with that and toy with that look.”
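The lack of frame-to-frame correspondence Parvini describes is easy to see once you parse the raw files. As a minimal sketch (with synthetic OBJ data standing in for real scans), comparing two consecutive frames shows the vertex count and ordering changing outright, so no per-vertex tracking is possible:

```python
def parse_obj_vertices(obj_text):
    """Return the list of (x, y, z) vertex positions in an OBJ string."""
    verts = []
    for line in obj_text.splitlines():
        parts = line.split()
        if parts and parts[0] == "v":
            verts.append(tuple(float(p) for p in parts[1:4]))
    return verts

# Two consecutive "frames" of the same surface from a 4D capture:
# counts and ordering differ, so vertex i in frame 0 is not the same
# surface point as vertex i in frame 1.
frame_0 = "v 0 0 0\nv 1 0 0\nv 0 1 0\nf 1 2 3\n"
frame_1 = "v 0 1 0\nv 0 0 0\nv 1 0 0\nv 1 1 0\nf 1 2 3 4\n"

v0 = parse_obj_vertices(frame_0)
v1 = parse_obj_vertices(frame_1)
print(len(v0), len(v1))  # 3 4 - the topology changes every frame
```

This per-frame independence is exactly what produces the “trash bag over a form” shimmer when the sequence plays back.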
Step 2: Scanning
The video features band members (in glitch-3D form), a central female presence who is shown in a higher level of fidelity, and several background performers. In order to scan these different characters, Ghost Town Media adopted a combination of Artec’s MHT scanner and two Microsoft Xbox Kinects, with the data running through Artec Studio 9.1.
One challenge Ghost Town Media had to overcome was acquiring the scans and turning them into usable pieces of footage that could still be manipulated. “Artec Studio is built to capture a solid object,” notes Parvini, “so if I want to scan a chair or a maquette, it’s constantly scanning at about 15 frames per second. It’s a locked number that varies only a little - 14.9 or 15.3. I realized that as you’re scanning all these items, that is actually a sequence - you’re effectively capturing an OBJ sequence. What it does is: you have the option to output the scans raw without merging them into a full mesh.”
“Once we realized it would do that,” adds Parvini, “we found out it would give us both the OBJ output and the RGB image it was capturing during the initial scan. So I would be able to have a matching still frame to match up against the OBJ file.”
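Pairing each exported OBJ frame with the RGB still captured alongside it could be automated with a small script. The sketch below assumes a hypothetical shared-frame-index naming scheme - Artec Studio’s actual export names may differ:

```python
import re

def pair_frames(filenames):
    """Group files like 'scan_0001.obj' / 'scan_0001.png' by frame index."""
    frames = {}
    for name in filenames:
        m = re.match(r".*?(\d+)\.(obj|png)$", name)
        if m:
            frames.setdefault(int(m.group(1)), {})[m.group(2)] = name
    # Keep only frames that have both a mesh and its matching texture still
    return {i: f for i, f in sorted(frames.items()) if len(f) == 2}

files = ["scan_0001.obj", "scan_0001.png",
         "scan_0002.obj", "scan_0002.png",
         "scan_0003.obj"]  # frame 3 has no RGB still, so it is dropped
print(pair_frames(files))
```

Each surviving entry gives the compositor an OBJ and the still frame to project back over it.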
Since the band members are shown singing in the video, Ghost Town had to deal with lip sync. “For all of the lip sync footage we basically slowed the song down to half speed,” says Parvini. “I could then speed it up and have it as 24p as an OBJ sequence and then still warp it and deal with it as a 3D file.”
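The retime arithmetic works out roughly like this: performing to the song at half speed doubles the scanner’s effective sample rate, and the captured frames are then conformed to a 24p timeline. A sketch, with the rates as approximations of the figures Parvini gives:

```python
SCAN_FPS = 15.0   # Artec Studio's roughly locked capture rate
SLOWDOWN = 2.0    # song played at half speed during capture
OUT_FPS = 24.0    # delivery frame rate

# Frames captured per second of real-time (full-speed) song
effective_fps = SCAN_FPS * SLOWDOWN
print(effective_fps)  # 30.0 - more than enough samples for 24p

def frames_for_24p(n_out_frames):
    """Which captured OBJ frame lands on each 24p output frame."""
    return [round(i * effective_fps / OUT_FPS) for i in range(n_out_frames)]

print(frames_for_24p(8))  # [0, 1, 2, 4, 5, 6, 8, 9] - a mild resample
```

The half-speed performance is what lifts the effective rate above 24, so speeding the footage back up yields smooth, in-sync lip movement rather than a starved frame rate.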
The Artec MHT provided a capture area of approximately 2 feet by 2 feet - enough, says Parvini, to show from mid-chest to head, a foot stepping down, or a face. The dual Kinects, on the other hand, enabled more full-body capture - essentially a 180 degree scan. The results were ultimately used for sprites and ‘glitchy’ bodies that run around in the clip.
“Originally I had looked at trying to blend the two of them together,” recalls Parvini, “but it kind of looked like you had a well-sculpted head on top of a blob. It wasn’t a really pleasing aesthetic. So I decided to fracture the two people out and have our girl - Judy - be more a high fidelity piece. I ended up taking some of the scans of her body and turning those into a quick-rigged body and then mounted the OBJ head inside a container to her head - to have her as more of a character.”
Step 3: Digital workflow
After scanning, the files were renamed and brought into either Element 3D for After Effects or into Cinema4D. “I could bring in an image sequence that the Artec captured as an OBJ sequence and the RGB images and project the video or essentially the image sequence of the face back over the capture and it would now match,” explains Parvini. “You’re getting essentially UV texture files that are just working.”
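Element 3D and Cinema4D expect an OBJ sequence as consistently numbered files, so the renaming step might look something like this dry-run sketch (the names, prefix and padding are illustrative assumptions, not Ghost Town’s actual pipeline):

```python
import os

def sequence_names(filenames, prefix="face_", pad=4):
    """Map arbitrary export names to an ordered, zero-padded sequence."""
    plan = {}
    for i, name in enumerate(sorted(filenames)):
        ext = os.path.splitext(name)[1]
        plan[name] = f"{prefix}{i:0{pad}d}{ext}"
    return plan

exports = ["mesh_a.obj", "mesh_c.obj", "mesh_b.obj"]
print(sequence_names(exports))
# {'mesh_a.obj': 'face_0000.obj', 'mesh_b.obj': 'face_0001.obj',
#  'mesh_c.obj': 'face_0002.obj'}
```

Running the same renaming over the RGB stills keeps mesh and texture indices aligned, so projecting the image sequence back over the OBJ sequence “just works”, as Parvini describes.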
Artists could then apply glitch and distortion set-ups and low-poly filter passes. “You can do that with reduced polygons or with a mograph approach,” says Parvini. “We were playing around with something that felt physical but glitchy - not too hologram-y. We wanted to keep that plastic feel that you can get from polygons. We tried to make sure the guys were recognizable but show them as flawed iterations of themselves.”
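A distortion pass of the kind described can be as simple as jittering a random fraction of vertices each frame, so the face stays recognizable but visibly fractured. A hedged sketch - the function and its parameters are guesses at the idea, not the actual Element 3D setup:

```python
import random

def glitch_vertices(verts, fraction=0.1, amount=0.05, seed=0):
    """Return a copy of verts with roughly `fraction` of points
    displaced by up to `amount` on each axis; the rest pass through."""
    rng = random.Random(seed)  # seeded so a frame re-renders identically
    out = []
    for v in verts:
        if rng.random() < fraction:
            out.append(tuple(c + rng.uniform(-amount, amount) for c in v))
        else:
            out.append(v)
    return out

verts = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]
print(glitch_vertices(verts))
```

Keeping `fraction` low preserves the likeness while still reading as a ‘flawed iteration’, and varying the seed per frame restores the chaotic, sampled feel.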
Step 4: World-building
In addition to the glitchy-type performances, the clip also features an elaborate world and several ‘stone giant’ versions of the band members - known as Altars - that were inspired by ‘fantastic realism’ artist Kris Kuksi’s architectural work. “When it came to the sculptures,” states Parvini, “we had already done full body scans of the guys last year for their Living Things album release cover art, which were scanned at Gentle Giant. That gave us a really good base to go off, because when we scanned again with the Artec we got even more detail, such as a wrinkle in the eyebrow. So we cleaned up the scans and made them into a full-body mesh.”

Watch a work-in-progress clip for one of the Altars.
The complex world was developed with a hybrid approach to allow for several changes made as the project continued. “Some of the scenes are completely 100 per cent Element 3D, even the backdrop, where everything is being animated and composited from start to finish inside Element,” says Parvini. “Then for some of the more grand shots and wider vistas, we used Cinema4D. To get the worlds to speak together there was a general schema that we needed to have there across all the items. We wanted this place to feel digital - if it felt too real and physical then we were starting to worry about the wrong things.”
Step 5: Merging it all together: a ‘digital wash’
Parvini says the entire video was its own world that “had to feel busy and overwhelming, like a digital wash hovering over you as if you were fighting to see into a place you hadn’t seen before.” That’s clearly the effect of the piece, one that Ghost Town Media now adds to its already impressive array of music promo and other projects, each tending to take a different angle to the norm. “I see us as kind of technical futurists,” adds Parvini. “We really love what’s next. If it doesn’t feel like it’s a new way of approaching the post, it doesn’t really intrigue us enough.”