The WhiteBox was a multi-purpose experiential prototyping platform from my time at DLab, the innovation lab at DIRECTV. It was meant to present complex ideas in a flexible, immersive environment, and to let us develop and iterate on ideas that would otherwise require a tremendous amount of engineering just to try out.

Get the experience right, and it can guide the engineering of actual products. 

I was the sole developer on the project, and also responsible for all 3D models, animation, and rendering, but I had tremendous support from my team in fabrication, character rigging, motion graphics, and static art.

I chose to design and build the WhiteBox in Derivative TouchDesigner, which let me easily manage media flows, projectors, and a constantly changing list of inputs and outputs. Over time, inputs to the WhiteBox included webcams, a Microsoft Kinect, Arduinos, NFC readers, a Leap Motion, a Pixycam, OSC over Wi-Fi, and more. The physical build of the WhiteBox was essentially a collapsible rectangular room made of a fabric we could project onto. We would then add whatever hardware the current project needed into the space.
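
To give a sense of how that rotating cast of devices can stay manageable, here is a minimal sketch, not the actual project code, of the kind of dispatcher pattern that routes heterogeneous input events (Kinect presence, NFC tags, and so on) to whatever scene logic is listening. All device names, event names, and handlers below are hypothetical.

```python
# Hypothetical sketch of routing heterogeneous WhiteBox inputs to scene logic.
# Event names, payloads, and handlers are illustrative, not the real project code.

from typing import Callable, Dict, List


class EventBus:
    """Routes named events with payloads to whichever scene is listening."""

    def __init__(self) -> None:
        self._handlers: Dict[str, List[Callable[[dict], None]]] = {}

    def on(self, event: str, handler: Callable[[dict], None]) -> None:
        self._handlers.setdefault(event, []).append(handler)

    def emit(self, event: str, payload: dict) -> None:
        for handler in self._handlers.get(event, []):
            handler(payload)


bus = EventBus()

# A scene subscribes only to the events it cares about.
bus.on("kinect/person_entered", lambda p: print("go live, person at", p["x"]))
bus.on("nfc/tag_read", lambda p: print("load profile for tag", p["tag_id"]))

# Input adapters (Kinect, NFC reader, OSC listener, ...) just emit named events.
bus.emit("kinect/person_entered", {"x": 1.2, "z": 0.8})
bus.emit("nfc/tag_read", {"tag_id": "04:A2:2F:19"})
```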

We spent a lot of time presenting to executives, as well as the company at large, about our expectations of how entertainment consumption, habits, and technology were changing, the challenges we would face, and the opportunities this would create. These efforts were often piecemeal, so my team and I wanted to take a holistic look at what that future would be like. We created our set of assumptions and projections and our personas, and then went to work crafting a storyline that would guide us through a day in the life of our family of the near future.

Called “Heart of the Home”, this was an evolving project that led to the development of the WhiteBox. At its inception, it was simply the room, a projector, and a TV mounted horizontally at about table height.

The TV presented what we called the “Dollhouse”: a top-down view into a 3D animated cutout house with multiple rooms and a driveway. Our 3D animated family would go about their day, moving between rooms, interacting with each other and the technology around them. To highlight the important interactions, we would pause the action on the television and then draw a glowing line up to the top of the screen. The action would continue on the far wall, which would redraw itself as the room, and then demonstrate the interaction at real-life scale. This mix of on-screen animation and projection-mapped environment was very effective at giving viewers both a high-level overview and a good approximation of the real-life experience. Since this was a guided presentation, I also added an iPad with a control panel that used OSC to send command signals to the system. This way, the presentation could be paused and sections repeated or skipped as needed in the moment.
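
The command channel itself can be as simple as a handful of OSC messages. The sketch below, written against the python-osc package, shows roughly how a control panel could send pause, resume, and section-jump commands to the playback system; the host address, port, and OSC address patterns are assumptions for illustration, not the ones actually used.

```python
# Rough sketch of the control-panel side of an OSC command channel.
# Host IP, port, and OSC address patterns are illustrative assumptions.

from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("192.168.1.50", 8000)  # WhiteBox host and port (assumed)

def pause() -> None:
    client.send_message("/show/pause", 1)

def resume() -> None:
    client.send_message("/show/pause", 0)

def jump_to_section(index: int) -> None:
    # Repeat or skip to a section of the presentation by index.
    client.send_message("/show/section", index)

pause()
jump_to_section(3)   # replay or skip ahead as needed in the moment
resume()
```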

The success of the project led to a sequel, at least in spirit. It was after the initial Heart of the Home presentation that we realized the potential of the platform, and the second iteration became an interactive demonstration both of our ideas and of the platform itself.

The approach this time was to create a set of interactive experiences that participants could try for themselves, getting a taste of what the future of entertainment might be. A touchscreen monitor was placed on a podium in the room, letting participants navigate the experience themselves and instructing them on how to proceed. My team and I shot videos of a model demonstrating each activity, and these videos, along with some motion graphics, would guide the user.

A Kinect camera was placed on the floor in front of the far wall; this gave the system the intelligence to know when to go live and when to reset, as well as a gestural interface for certain experiences. The touchscreen would present itself as an iPad when necessary, acting as the second screen in some scenarios. On the other side of the room, an Arduino with a Pixycam (a computer-vision camera sensor) was set up to simulate a sort of future refrigerator that knows what food is inside. We faked this with plastic fruit and veggies that users would place on a glass plate inside a small mockup of a refrigerator, which the Pixycam/Arduino would recognize and report to the system. It was very effective.
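
On the receiving end, the fridge data can arrive as simple serial lines from the Arduino. The sketch below, using pyserial, shows one plausible way to turn Pixycam color-signature IDs into food labels for the recommendation logic; the serial port, baud rate, line format, and signature-to-food mapping are all assumptions, not the project's actual protocol.

```python
# Plausible sketch of reading Pixycam detections forwarded by the Arduino.
# Serial port, baud rate, line format ("sig:<id>"), and the signature-to-food
# mapping are assumptions for illustration, not the actual protocol used.

import serial

# Pixycam color signatures taught for each piece of plastic food (assumed IDs).
FOOD_BY_SIGNATURE = {1: "tomato", 2: "banana", 3: "broccoli", 4: "carrot"}

ser = serial.Serial("/dev/ttyUSB0", 115200, timeout=1)

def read_fridge_contents() -> list:
    """Collect one batch of detections, e.g. everything currently on the glass plate."""
    contents = []
    for _ in range(20):  # read a short burst of lines
        line = ser.readline().decode("ascii", errors="ignore").strip()
        if line.startswith("sig:"):
            sig = int(line.split(":", 1)[1])
            food = FOOD_BY_SIGNATURE.get(sig)
            if food and food not in contents:
                contents.append(food)
    return contents

print(read_fridge_contents())  # e.g. ['tomato', 'broccoli'] -> feeds the show recommendations
```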

Users were given the opportunity to navigate their televisions via gestures, flick and pluck programming seamlessly between a tablet and a television, get lists of recommended cooking shows based on what food was in their fridge, and more.

Starting as a booth presentation at one of the lab’s open houses, the project grew and iterated into a set of videos, interactive immersive experiences, and a multiscreen presentation at the DIRECTV Revolution conference, and it was featured at South by Southwest as part of DIRECTV’s presence at the event. Its continuing influence spawned a series of other projects that explored aspects of our future vision, and it gave the thematic base for the lab’s final open house.