INTERACTIVE PSEUDO-HOLOGRAPHIC VIRTUAL AQUARIUMS

Built for the Australian National Maritime Museum's "One Ocean, Our Future" exhibition, in collaboration with the Schmidt Ocean Institute, I created five interactive, gesturally controlled pseudo-holographic virtual aquariums that allow museum guests to explore and engage with 3D visualizations of extraordinary deep-sea specimens revealed during the Schmidt Ocean Institute's 2020 circumnavigation of Australia aboard its research vessel Falkor. Development used a combination of LightWave 3D, Blender, and the Unity realtime game engine, with custom scripting written in C#.


E14-E15 MATTERPORT SCAN

I led a small team in creating a complete digital double of MIT buildings E14 and E15, home of the Media Lab, the Center for Bits and Atoms, Comparative Media Studies, the Program in Art, Culture + Technology, the Center for Advanced Urbanism, and the MIT List Visual Arts Center.


MEDIA LAB METAVERSE PANEL

The realization of a metaverse—the convergence of augmented reality, virtual reality, and other digital technologies with physical reality—could radically change the way we connect, perceive, and experience the world around us. My contributions to this panel discussion included virtual production and the creation of unique virtual 3D environments using the Spatial platform.


2021 FALL MEMBER EVENT

As creative producer and "Chief Imagineer" for the Media Lab's semi-annual Member Meeting, I led all aspects of production, including the live simulcast and pre-recorded "simulive" video segments. Drawing on principles of illusion and stage magic, we designed the simulive segments to integrate seamlessly with the event's livestream, creating the appearance of six live, on-site production crews when in fact there were only three.


As creative producer for Inflection PoinT, a student-focused, student-organized event that helps the Media Lab community connect, learn, and celebrate the uniqueness that makes us resilient and the curiosity that drives us forward, I led the production of Showcase21, a livestreamed, curated showcase of videos highlighting the work and lives of students, UROPs, and postdocs at the Media Lab. I also project-managed the development of a custom Gather.town installation, created by a team of Media Lab students, that provided additional opportunities for networking and virtual demonstrations.

Watch the trailer

An image of the MIT Media Lab with the words "MIT Media Lab Inflection PoinT: Showcase 21" overlaid

2021 FESTIVAL OF LEARNING: VIRTUAL ATRIUMS

The Media Lab's 3rd-floor atrium is an important community space that was largely inaccessible for more than a year due to the COVID-19 pandemic. For the 2021 Festival of Learning, an annual event by and for the Media Lab community, I led the development of four virtual atriums in Mozilla Hubs to serve as online gathering spaces for different activities: a Disco Atrium, an Underwater Atrium, an Outer Space Atrium, and the Hall of Mirrors, which served as a portal to scheduled sessions.

A panda avatar in a Space Invaders-themed environment

LILLI is an immersive, volumetric, reflective lightfield display that uses a hybrid of video and laser projection to visualize marine wildlife and ocean climate data. Science fiction literature and film abound with large holographic displays, often showing consoles surrounded by multiple people working together to manipulate floating images and data, yet one nagging question goes unanswered: how do multiple people interact with a large 3D scene?


This project is two-fold:

1. Create a large, interactive 3D display, and

2. Explore and establish a foundation for multiple users to interact with large 3D data


Using a combination of laser and video projectors, we designed and built a car-sized 3D display that can be viewed from all sides. Developed in collaboration with the MIT Media Lab's Open Ocean Initiative to help promote cleaner and more accessible ocean research, the display will be featured in 11th Hour Racing's traveling pavilion as it follows racing sailboats around the world.

Watch the demo video


#ZoomADay was a year-long project exploring the creation and use of synthetic characters and deepfakes for online telepresence and communications. It began as a series of Snapchat lenses, many of which are publicly available, and advanced to more complex, AI-synthesized characters built with the machine-learning toolkit Avatarify. Along the way, the project was used to create virtual regalia for the 2020 Media Lab/Program in Media Arts and Sciences graduates and for a virtual celebration of the announcement that Dava Newman had been named director of the Lab.

A man in an old-fashioned diving helmet underwater, in front of a submersible ship.

Dissertation, Massachusetts Institute of Technology, 2019. Programmable Synthetic Hallucinations describes the use of the brain's own biophysiological mechanics of hallucination to display virtual information directly in the visual field.

A woman wearing a helmet that covers her face and head, with several cables emerging from the back.

I've built several iterations of these Pepper's ghost-style displays, including Yo-Yo Vision, a human-sized, autostereoscopic (no 3D glasses required) lightfield display that allowed Yo-Yo Ma to perform at the Media Lab’s 30th Anniversary celebration from a remote location. Another version of this project, the 3D Telepresence Chair, was used to allow an executive from a Media Lab member company to attend a meeting remotely.


OCEAN BLUE

Ocean Blue was an immersive ocean data experience built for the Wandering Cricket Night Market, part of a network of ephemeral art experiences. Ocean data concerning climate change, ocean health, and marine conservation were juxtaposed with a crowd-sourced database of memories, hopes, and dreams about the ocean collected online through a Google Form.


Twin projectors beamed the ocean facts and memories into an infinity-mirror-effect moon pool installed in the back of a rented U-Haul moving truck. The interior of the box truck was decorated to simulate an underwater environment, and the moon pool's real water surface was perturbed by Arduino-controlled aquarium pumps, creating a ripple effect that caused the projected typography to appear as caustic light reflections throughout the environment.

 

After a short time, the microcontroller would pause the pumps, and as the water stilled to a mirror surface, the ocean facts and memories would coalesce and become legible on the walls surrounding the guests. After a few moments of reflection, the pumps would reactivate, the words would ripple back into flickering, meditative caustic patterns, and the sound of the moon pool would gently fill the truck. A new set of ocean facts or memories would then be chosen at random from the database, and the whole process would repeat.

 

The main projection application was written in Processing and the relay control of the aquarium pumps was coded in C in the Arduino IDE.
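The ripple/still cycle that drives the installation can be sketched as a simple timing predicate. This is an illustrative reconstruction in plain C rather than the original firmware, and the interval lengths are assumed values:

```c
#include <stdbool.h>

/* Illustrative intervals in ms; the original firmware's timings are unknown. */
#define RIPPLE_MS 20000UL  /* pumps on: typography ripples as caustics       */
#define STILL_MS  10000UL  /* pumps off: water settles, text becomes legible */
#define CYCLE_MS  (RIPPLE_MS + STILL_MS)

/* True when the pump relay should be energized at elapsed time t_ms. */
bool pump_on(unsigned long t_ms)
{
    return (t_ms % CYCLE_MS) < RIPPLE_MS;
}
```

On the installation itself, a predicate like this would gate the relay pin each pass through the Arduino loop, driven by the millisecond clock.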


Guests reported a sense of calm and wonder and delighted in the reveal of the text as the moon pool stilled. Several spontaneously offered their own memories of the ocean and discussed conservation and exploration; some regretted not having followed their dreams to work with whales or become oceanographers. A number of guests returned several times to discover new facts or memories, and many lingered in the meditative atmosphere. Several hundred people experienced this immersive, interactive data environment before the Night Market's closing call.

 

Many thanks to Jon Ferguson, Scott Berk, Anna Waldman-Brown, Chia Evers, and the other volunteers who helped build and host this immersive data experience.


Infinity-by-Nine expands the home-video viewing experience by generating imagery that extends the TV screen, giving the impression that the scene wraps completely around the viewer. It uses optical flow, color analysis, and heuristics to extrapolate beyond the screen edge in real time on standard microprocessors and GPUs.

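As a toy illustration of the extrapolation idea (a deliberate simplification, not the actual optical-flow pipeline), one could extend each scanline past the screen edge by holding and fading the edge color:

```c
#include <stdint.h>

/* Extend one grayscale scanline past the right screen edge by holding the
 * edge pixel's value and fading it linearly with distance -- a simplified
 * stand-in for optical-flow + color-analysis extrapolation. */
void extend_scanline(const uint8_t *row, int width, uint8_t *ext, int ext_width)
{
    uint8_t edge = row[width - 1];          /* sample the screen-edge pixel */
    for (int x = 0; x < ext_width; x++) {
        /* linear falloff toward black at the outer boundary */
        ext[x] = (uint8_t)((int)edge * (ext_width - x) / ext_width);
    }
}
```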

An immersive storytelling environment to augment creative play using texture, color, and image. Utilizes natural language processing to listen to and understand stories being told, and thematically augment the environment using color and images.

A drawing of an owl projected against a blue wall.
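At its simplest, the thematic-augmentation step could reduce to a keyword-to-color lookup after the language processing has identified a story theme. The keywords and palette below are invented for illustration; the system's actual pipeline is not documented here:

```c
#include <stdint.h>
#include <string.h>

/* Toy theme lookup: map a detected story keyword to an RGB wash color.
 * Keywords and colors are hypothetical examples. */
uint32_t theme_color(const char *keyword)
{
    if (strcmp(keyword, "ocean")  == 0) return 0x1060C0; /* deep blue    */
    if (strcmp(keyword, "forest") == 0) return 0x207830; /* leaf green   */
    if (strcmp(keyword, "dragon") == 0) return 0xC02010; /* fire red     */
    return 0x808080;                                     /* neutral gray */
}
```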

A fusion of exploratory data visualizations based on realtime location-aware computing and the aesthetics of Abstract Expressionism.

A Jackson Pollock-style digital painting.

A basketball net that incorporates segments of conductive fiber whose resistance changes with degree of stretch, allowing the system to calculate the force and speed of a basketball traveling through the net. The output is displayed via the in-arena Jumbotron and televised to the home audience. Premiered at the NBA 2012 All-Star Slam Dunk Contest; winner of a 2013 Effie Award (Bronze, Beverages—Non-alcohol).

A basketball net.
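The sensing idea can be sketched as follows: ball speed from the transit time between fiber segments, and force from fiber stretch via a Hooke's-law approximation. Every constant below is an assumed value for demonstration, not the production system's calibration:

```c
/* Hypothetical constants -- not the production calibration. */
#define NET_DEPTH_M    0.38   /* rim-to-net-bottom distance               */
#define GAUGE_FACTOR   2.0    /* fractional dR/R0 per unit strain         */
#define SEGMENT_LEN_M  0.10   /* length of one conductive fiber segment   */
#define SPRING_N_PER_M 900.0  /* effective net stiffness                  */

/* Ball speed (m/s) from the delay between the top and bottom fiber
 * segments registering a resistance change. */
double ball_speed(double transit_s)
{
    return NET_DEPTH_M / transit_s;
}

/* Stretch (m) of a segment from its resistance change, assuming a linear
 * strain-gauge response: dR/R0 = GAUGE_FACTOR * strain. */
double stretch_m(double r0_ohm, double r_ohm)
{
    return ((r_ohm - r0_ohm) / r0_ohm) / GAUGE_FACTOR * SEGMENT_LEN_M;
}

/* Peak force (N) on the net via a Hooke's-law approximation. */
double net_force(double stretch)
{
    return SPRING_N_PER_M * stretch;
}
```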

Transforms motion tracking and blob detection data collected from varying inputs, such as floating soap bubbles or live bacteria, into sound. BubbleSynth was a 2014 Guthman Musical Instrument Competition semi-finalist; SYNTHBacteria premiered at the Peabody Essex Museum in Salem, Massachusetts, on September 18, 2014, as part of the After Hours PEM PM Series.

A computer display showing bacteria under a microscope on the right and a column of code on the left

A data-driven social controller for visual exploration.

A ball-shaped touch display showing various movie posters.

4K/8K Comics applies the affordances of ultra-high-resolution screens to traditional print media such as comic books, graphic novels, and other sequential art forms. The comic panel becomes the entry point to the corresponding moment in the film adaptation, while scenes from the film indicate the source frames of the graphic novel. The relationships among comics, films, social media, parodies, and other support materials can be navigated using native touch screens, gestures, or novel wireless control devices. Big data techniques are used to sift, store, and explore vast catalogs of long-running titles, enabling sharing and remixing among friends, fans, and collectors.

A display showing page 6 of a Wolverine comic book, a related Wikipedia page, and a frame from a Wolverine film adaptation.

A sensor and actuator platform for Live Action Role Playing (LARP), immersive theater, theme park, and other transmedia experiences. The system is an open framework that allows any kind of sensor or actuator to be easily programmed and reprogrammed on the fly, letting novice LARP masters add magic props and special visual effects to their games. The system premiered at Tri-Wyrd in June 2012 and was integrated into "Veil Wars," a two-day, pervasive, locative experience layered onto the Wyrd Con convention, created by Scott Walker and loosely based on medieval Japanese mythology.

A "fire gate" created with a paper lantern and red LEDs.