As creative producer for Inflection PoinT, a student-focused, student-organized event helping the Media Lab community connect, learn, and celebrate the uniqueness that makes us resilient and the curiosity that drives us forward, I led the production of Showcase21, a livestreamed, curated showcase of videos highlighting the work and lives of students, UROPs, and postdocs at the Media Lab. I also project-managed the development of a custom Gather.town installation, built by a team of Media Lab students, to provide additional opportunities for networking and virtual demonstrations.

Watch the trailer

An image of the MIT Media Lab with the words "MIT Media Lab Inflection PoinT: Showcase 21" overlaid

2021 FESTIVAL OF LEARNING: VIRTUAL ATRIUMS

The Media Lab's 3rd-floor atrium is an important community space, but it was largely inaccessible for more than a year due to the Covid-19 pandemic. For the 2021 Festival of Learning, an annual event by and for the Media Lab community, I led the development of four virtual atriums in Mozilla Hubs to serve as online gathering spaces for different activities: a Disco Atrium, an Underwater Atrium, an Outer Space Atrium, and the Hall of Mirrors, which served as a portal to scheduled sessions.

A panda avatar in a Space Invaders-themed environment

LILLI is an immersive, volumetric, reflective lightfield display that uses a hybrid of video and laser projection to visualize marine wildlife and ocean climate data. Science fiction literature and film abound with large holographic displays: consoles surrounded by multiple people working together to manipulate floating images and data. Yet a nagging question goes unanswered: how do multiple people interact with a large 3D scene?


This project is twofold:

1. Create a large, interactive 3D display, and

2. Explore and establish a foundation for multi-user interaction with large 3D data


Using a combination of laser and video projectors, we designed and built a car-sized 3D display that can be viewed from all sides. Developed in collaboration with the MIT Media Lab's Open Ocean Initiative to help promote cleaner and more accessible ocean research, this display will be featured in 11th Hour Racing's traveling pavilion as they follow racing sailboats around the world.

Watch the demo video

The LILLI display.

#ZoomADay was a year-long project exploring the creation and use of synthetic characters and deepfakes for online telepresence and communications. It began as a series of Snapchat lenses, many of which are publicly available, and advanced to more complex, AI-synthesized characters built with a machine learning toolkit called Avatarify. Along the way, the project was used to create virtual regalia for the 2020 Media Lab/Program in Media Arts and Sciences graduates and for a virtual celebration of the announcement of Dava Newman as the Lab's new director.

A man in an old-fashioned diving helmet underwater, in front of a submersible ship.

Dissertation, Massachusetts Institute of Technology, 2019. Programmable Synthetic Hallucinations describes harnessing the bio-physiological mechanics of hallucination in the human brain to display virtual information directly in the visual field.

A woman wearing a helmet that covers her face and head, with several cables emerging from the back.

I've built several iterations of these Pepper's ghost-style displays, including Yo-Yo Vision, a human-sized, autostereoscopic (no 3D glasses required) lightfield display that allowed Yo-Yo Ma to perform at the Media Lab’s 30th Anniversary celebration from a remote location. Another version of this project, the 3D Telepresence Chair, was used to allow an executive from a Media Lab member company to attend a meeting remotely.

A 3D aerial display.

Infinity-by-Nine expands the home-video viewing experience by generating imagery that extends the TV screen, giving the impression that the scene wraps completely around the viewer. It uses optical flow, color analysis, and heuristics to extrapolate beyond the screen edges in real time on standard microprocessors and GPUs.

An Infinity-by-Nine installation, with generated imagery extending beyond the TV screen.
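To make the approach concrete, here is a minimal Python sketch of the screen-extension idea using OpenCV's Farneback optical flow. The band width, output width, and motion-to-blur heuristic are illustrative assumptions, not details of the actual system:

import cv2
import numpy as np

def extend_right_edge(prev_gray, frame, band=32, out_width=160):
    # Estimate motion in a narrow band at the frame's right edge.
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    flow = cv2.calcOpticalFlowFarneback(
        prev_gray[:, -band:], gray[:, -band:], None,
        0.5, 3, 15, 3, 5, 1.2, 0)
    speed = float(np.median(np.linalg.norm(flow, axis=2)))
    # Color analysis: average each row of the edge band and tile it
    # outward so the extension continues the on-screen colors.
    edge_colors = frame[:, -band:].astype(np.float32).mean(axis=1, keepdims=True)
    strip = np.tile(edge_colors, (1, out_width, 1))
    # Heuristic: faster on-screen motion -> softer, more diffuse fill,
    # suggesting the scene without distracting fine detail.
    strip = cv2.GaussianBlur(strip, (0, 0), sigmaX=3 + speed)
    return np.hstack([frame, strip.astype(np.uint8)]), gray

Run per frame, feeding each returned grayscale image back in as prev_gray; a mirrored version of the same routine would handle the left edge.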

An immersive storytelling environment that augments creative play using texture, color, and image. Utilizes natural language processing to listen to and understand stories as they are told, thematically augmenting the environment with color and imagery.

A drawing of an owl projected against a blue wall.
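As a toy illustration of the story-to-theme mapping, the sketch below matches spoken words against a small hand-made lexicon and returns a room color. The actual system's language understanding is richer; the themes, words, and colors here are invented for the example:

# Hypothetical theme lexicon invented for this sketch.
THEMES = {
    "ocean":  ({"sea", "fish", "wave", "boat"}, (30, 80, 160)),
    "forest": ({"tree", "owl", "woods", "bear"}, (40, 120, 60)),
    "night":  ({"moon", "star", "dark", "sleep"}, (20, 20, 60)),
}

def theme_for(utterance):
    # Pick the theme whose vocabulary best overlaps the spoken words.
    tokens = set(utterance.lower().split())
    best = max(THEMES, key=lambda t: len(THEMES[t][0] & tokens))
    if THEMES[best][0] & tokens:
        return best, THEMES[best][1]
    return None, None  # no match: leave the environment unchanged

print(theme_for("and then the owl flew deep into the woods"))
# -> ('forest', (40, 120, 60))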

A fusion of exploratory data visualization based on real-time, location-aware computing and the aesthetics of Abstract Expressionism.

A Jackson Pollock-style digital painting.

A basketball net that incorporates segments of conductive fiber whose resistance changes with degree of stretch, used to calculate the force and speed of a basketball traveling through the net. The output is displayed on the in-arena Jumbotron and televised to the home audience. Premiered at the 2012 NBA All-Star Slam Dunk Contest; winner of a 2013 Effie Award (Bronze, Beverages—Non-alcohol).

A basketball net.
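A rough sketch of the sensing arithmetic, with invented calibration constants (the production values were determined empirically): resistance maps to stretch, the duration of significant deformation gives the ball's transit time, and dividing the ball's diameter by that time yields speed:

import numpy as np

R_REST = 1000.0         # ohms: segment resistance with the net at rest (assumed)
OHMS_PER_MM = 4.0       # resistance rise per millimeter of stretch (assumed)
SAMPLE_HZ = 1000        # ADC sampling rate (assumed)
BALL_DIAMETER_M = 0.24  # regulation basketball, approximately

def ball_speed(resistance_trace):
    # Convert one segment's resistance-vs-time trace to stretch.
    stretch_mm = (np.asarray(resistance_trace, dtype=float) - R_REST) / OHMS_PER_MM
    # Transit window: samples where the net is significantly deformed.
    deformed = stretch_mm > 0.25 * stretch_mm.max()
    transit_s = deformed.sum() / SAMPLE_HZ
    return BALL_DIAMETER_M / transit_s  # meters per second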

Transforms motion tracking and blob detection data collected from varying inputs, such as floating soap bubbles or live bacteria, into sound. BubbleSynth was a 2014 Guthman Musical Instrument Competition semi-finalist; SYNTHBacteria premiered at the Peabody Essex Museum in Salem, Massachusetts, on September 18, 2014, as part of the After Hours PEM PM Series.

A computer display showing bacteria under a microscope on the right and a column of code on the left
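Below is a minimal sketch of one plausible mapping from detected blobs to synthesizer parameters, using OpenCV's blob detector. The pitch and amplitude mappings are assumptions for illustration; in practice the (frequency, amplitude) pairs would be sent on to a software synthesizer, for example over OSC:

import cv2

def blobs_to_notes(frame_gray, height):
    # Detect blobs (bubbles, bacteria, ...) in a grayscale frame.
    detector = cv2.SimpleBlobDetector_create()
    notes = []
    for kp in detector.detect(frame_gray):
        # Vertical position -> pitch over ~3 octaves above 220 Hz.
        freq_hz = 220.0 * 2 ** (3.0 * (1.0 - kp.pt[1] / height))
        # Blob size -> amplitude, clamped to [0, 1].
        amp = min(1.0, kp.size / 100.0)
        notes.append((freq_hz, amp))
    return notes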

A data-driven social controller for visual exploration.

A ball-shaped touch display showing various movie posters.

4K/8K Comics applies the affordances of ultra-high-resolution screens to traditional print media such as comic books, graphic novels, and other sequential art forms. The comic panel becomes the entry point to the corresponding moment in the film adaptation, while scenes from the film indicate the source frames of the graphic novel. The relationships among comics, films, social media, parodies, and other support materials can be navigated using native touch screens, gestures, or novel wireless control devices. Big data techniques are used to sift, store, and explore vast catalogs of long-running titles, enabling sharing and remixing among friends, fans, and collectors.

A display showing page 6 of a Wolverine comic book, a related Wikipedia page, and a frame from a Wolverine film.
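One way to picture the panel-to-film linkage is as a simple bidirectional index. The schema below is a hypothetical illustration, not the project's actual data model:

from dataclasses import dataclass

@dataclass
class PanelLink:
    # One link between a comic panel and a moment in the film adaptation.
    title: str       # series or graphic-novel title
    issue: int
    page: int
    panel: int
    film: str
    timecode_s: float

def moment_for_panel(links, issue, page, panel):
    # Panel as entry point: jump to the corresponding film moment.
    for ln in links:
        if (ln.issue, ln.page, ln.panel) == (issue, page, panel):
            return ln.film, ln.timecode_s
    return None

The same records, indexed by film and timecode, support the reverse direction: scenes from the film pointing back to their source panels.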

A sensor and actuator platform for Live Action Role Playing (LARP), immersive theater, theme park, and other transmedia experiences. The system is an open framework that allows any kind of sensor or actuator to be easily programmed and reprogrammed on the fly, so novice LARP masters can add magic props and special visual effects to their games. The system premiered at Tri-Wyrd in June 2012 and was integrated into "Veil Wars," a two-day, pervasive, locative experience layered onto the Wyrd Con convention, created by Scott Walker and loosely based on medieval Japanese mythology.

A "fire gate" created with a paper lantern and red LEDs.