An interactive love story: Making Jeff Buckley’s “Just Like a Woman”

Singer/songwriter Jeff Buckley’s sudden and mysterious passing in 1997 devastated his fans. In the years since then, he’s been celebrated for the small but soulful catalog of songs he wrote and performed during his short life.

To celebrate Buckley’s cover of Bob Dylan’s “Just Like a Woman” this year, Sony Music wanted to do something special. So they tapped Blind and Interlude to collaborate on an interactive music video.

We caught up with Creative Director Greg Gunn (Blind) and Co-Founder/Chief Product Officer Tal Zubalsky (Interlude) to learn more about this special project.

Interview with Blind’s Greg Gunn and Interlude’s Tal Zubalsky

In terms of the story, are these four animations essentially playing in tandem, with the user peeping into each via the panels?

Greg Gunn There are four unique stories in this, but it’s more like one story told four different ways.

Think of it like four parallel timelines happening simultaneously. The core story follows two people over the course of their day, from the time they wake up to the moment they go to sleep. It’s a seamless loop.

Still from the final video

As the viewer, you decide — in each moment — the emotional lens through which you experience the story: single woman, happy couple, unhappy couple and single man. The key story points don’t necessarily change, but the characters’ actions and color palette are unique to each timeline.

Tal Zubalsky The end result is actually achieved by compiling a video frame on the fly, depending on the user’s input, with the help of metadata that maps where each panel is positioned in each frame of the video. All of this happens while streaming the video just like any standard online video.

This is achieved using a new capability developed by Interlude, triggered specifically by the creative requirements of this piece.
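
To make that concrete, here is a minimal sketch of what such per-frame metadata and on-the-fly compositing could look like in TypeScript. The type and function names are hypothetical; Interlude has not published its actual player format.

// Hypothetical per-frame metadata: where each panel sits within a frame.
// Interlude's real format is not public; this shape is illustrative only.
interface PanelRegion {
  id: string;
  x: number;      // panel position within the frame, in pixels
  y: number;
  width: number;
  height: number;
}

interface FrameMetadata {
  frameIndex: number;
  panels: PanelRegion[];
}

// The four emotional lenses: single woman, happy couple, unhappy couple, single man.
type TimelineId = 0 | 1 | 2 | 3;

// Compile one output frame on the fly: copy each panel region out of whichever
// of the four streaming timelines the user has toggled that panel to.
function compositeFrame(
  ctx: CanvasRenderingContext2D,
  timelines: HTMLVideoElement[],       // the four videos, playing in lockstep
  meta: FrameMetadata,
  selection: Map<string, TimelineId>,  // the user's current choice per panel
): void {
  for (const panel of meta.panels) {
    const source = timelines[selection.get(panel.id) ?? 0];
    ctx.drawImage(
      source,
      panel.x, panel.y, panel.width, panel.height, // source rect in that timeline
      panel.x, panel.y, panel.width, panel.height, // same rect in the output frame
    );
  }
}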

This was a lot of animation to create! How did that affect your approach to developing the look and handling production?

Greg Yes. Yes it was. I remember briefing the animators about the idea and emphasizing the words “times four” every time we talked about something. In total we animated over 296 unique panels.

The spreadsheet to track progress for everything was nuts.

In terms of production, I knew that we’d have a ton of work to do. One of my concerns going into it was whether there’d be too much going on. At any given point there could be seven different panels on screen, each with its own looping animation, floating through the frame.

Throw in four parallel timelines and you start to wonder, “Will this just confuse people?”

With that in mind, the team and I put together a set of design rules that would help keep the visual language consistent and ease the pain of animating so many shots. For example, we limited the color palette of each timeline to keep things simple.

This forced us to think about how to use negative space and not feel that we had to draw each and every piece of the panel.

What’s the underlying basic scheme for handling the animations? Is each clickable element playing a sprite sheet that’s getting swapped out with each click?

Greg We tried several methods to achieve the desired interaction, but ultimately went with HTML’s <canvas> element.

Essentially, all four versions of the video are playing together, stacked on top of each other, and we use HTML to create masks for each panel that users interact with. If a user clicks/taps a panel, the timeline changes, but we only see it through that one panel.
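
As a rough sketch of that wiring, assuming the four timelines play in lockstep behind a single canvas and reusing the hypothetical compositeFrame helper from the earlier snippet (none of this is Interlude’s actual API):

// The four synced timeline videos and the per-frame metadata are assumed to
// exist already (see the earlier sketch); `declare` marks them as external.
declare const timelines: HTMLVideoElement[];
declare let currentFrameMeta: FrameMetadata;
const selection = new Map<string, TimelineId>();

const canvas = document.querySelector('canvas') as HTMLCanvasElement;
const ctx = canvas.getContext('2d') as CanvasRenderingContext2D;

canvas.addEventListener('click', (event) => {
  // Translate the click into canvas coordinates.
  const rect = canvas.getBoundingClientRect();
  const x = event.clientX - rect.left;
  const y = event.clientY - rect.top;

  // Hit-test the click against the panels in the current frame's metadata.
  for (const panel of currentFrameMeta.panels) {
    if (x >= panel.x && x <= panel.x + panel.width &&
        y >= panel.y && y <= panel.y + panel.height) {
      // Toggle only this panel to the next timeline; every other panel keeps
      // revealing whatever timeline it was set to before.
      const current = selection.get(panel.id) ?? 0;
      selection.set(panel.id, ((current + 1) % 4) as TimelineId);
      break;
    }
  }
});

// Redraw on every animation frame so the masked panels track the moving video.
function render(): void {
  compositeFrame(ctx, timelines, currentFrameMeta, selection);
  requestAnimationFrame(render);
}
requestAnimationFrame(render);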

Early sketches

And on desktop, the background color and panel borders are generated in real time, so we can control the color and animation based on the user’s actions.
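
A small illustration of that idea, building on the same hypothetical types as above: the “chrome” is drawn procedurally each frame, before the panels are composited on top, so it can react to the latest click. The palette values are placeholders, not the production colors.

// Draw the background and panel borders in code, keyed to the user's current
// timeline choices. Runs each frame, before the panels are composited on top.
const palette = ['#f2c4c4', '#f6e7b2', '#b8c9e8', '#cfe3cf']; // placeholders, one per timeline

function drawChrome(
  ctx: CanvasRenderingContext2D,
  meta: FrameMetadata,
  selection: Map<string, TimelineId>,
  lastTimeline: TimelineId,   // the timeline chosen by the most recent click
): void {
  // Background color reflects the most recently chosen timeline.
  ctx.fillStyle = palette[lastTimeline];
  ctx.fillRect(0, 0, ctx.canvas.width, ctx.canvas.height);

  // Each panel gets a border in its own timeline's color.
  for (const panel of meta.panels) {
    ctx.strokeStyle = palette[selection.get(panel.id) ?? 0];
    ctx.lineWidth = 4;
    ctx.strokeRect(panel.x, panel.y, panel.width, panel.height);
  }
}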

From an animation standpoint, this kept things simple—render four pre-animated videos that perfectly align!

Tal There are no sprites. What users are seeing is actual rendered video (possibly rendered in different formats for different devices). User clicks toggle which of the four rendered timelines the clicked panel is revealing.

As for the moments where the user can click an instrument — this seems like a bold creative choice. You’re giving the user control over the sound of a newly discovered Buckley track.

Whose idea was that, and was there a lot of discussion around it?

Tal The idea originated in discussions between Adam Block (President of Sony Music’s Legacy Recordings) and Yoni Bloch (Interlude’s CEO and Co-Founder).

Still from the interactive music video for Bob Dylan’s “Like a Rolling Stone”

Sony and Interlude had a very successful collaboration releasing Bob Dylan’s “Like a Rolling Stone” interactive video and were just waiting for the next opportunity to collaborate.

It wasn’t clear that it was going to be a click on an instrument panel, but the basic notion was there: adding layers of orchestration that support Jeff Buckley’s inherent emotional dynamic and that would be controlled by the user.

When Greg brought the creative idea for the infinite love story, and through a fruitful collaboration between Blind and Interlude, it became clear how we could tie the music, visuals, interactivity and technology together in a unique and innovative way.

What snags did you guys hit along the way and how did you handle them?

Greg The biggest creative challenge that we encountered was about user experience.

When I described the concept for the video to people, their eyes would kind of glaze over. It sounded interesting, but complicated.

A quiet moment together

The idea was crystal clear in my mind, but after trying to explain it, I realized that we’d need to spend time thinking about how people would interact with the video.

It was important to everyone involved that users not only follow and engage with the story, but also understand that their interactions have a direct effect on the characters’ feelings and on the underlying orchestration.

We spent a lot of time testing ideas for how to notify the user without encroaching on the story and experience.

It was a delicate balance that I applaud Tal and the team at Interlude for handling with the utmost grace. Kudos, gang.

Tal From a technical standpoint, the idea Greg brought for interactivity was fresh, something we’d never done before — and didn’t support out of the box. To us (at Interlude), that’s the best way to evolve our technology.

The first thing we did was brainstorm potential directions for solving the challenge, looking at pros and cons, narrowing them down, and eventually developing a short, 15-second proof of concept before any frame was even drawn.

That ended up being the direction we picked, and it eventually evolved into an amazing new feature enabling frame-level manipulation (meaning changing not only the entire frame on user interaction, but parts of it), while maintaining the very important cross-platform reach.

About the author

Justin Cone

/ justincone.com
Together with Carlos El Asmar, Justin co-founded Motionographer, F5 and The Motion Awards. He currently lives in Austin, Texas, with his wife, son and fluffball of a dog. Before taking on Motionographer full-time, Justin worked in various capacities at Psyop, NBC-Universal, Apple, Adobe and SCAD.
