
SINGAPORE - IMAGE TO VIDEO TECH DEMO

'Singapore' (featuring the music of Tom Waits) is an animated music video created solely using generative machine learning methodologies.

This technology demo is presented for research and educational purposes only and, as such, should be considered fair use.

REEFGEN - A DIFFUSION-BASED GENERATIVE ANIMATED CORAL REEF

ReefGen is an interactive, real-time application that uses diffusion-model-generated damselfish and coral reefs to explore the connection between data visualization, ocean health, and machine learning generative art.

Each damselfish is unique yet species-accurate, produced through a generative workflow. The reef's appearance, from healthy to unhealthy, can be generated at runtime from real-world data sources, and the number and species of damselfish adjust accordingly. More coral species and other reef inhabitants will be added over time; ReefGen is an ongoing work in progress.
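
To make the data-to-reef mapping concrete, here is a minimal Processing sketch of the idea; the health value, fish count, and colors are invented stand-ins for ReefGen's live data feeds and diffusion-generated artwork, not the actual implementation.

```java
// Illustrative sketch only (not ReefGen's code): a single "health" value
// stands in for a real-world data feed and drives how many damselfish
// sprites swim across the reef.

int maxFish = 40;          // population shown on a fully healthy reef
float reefHealth = 0.75;   // 0 = bleached, 1 = healthy; would come from live data
ArrayList<PVector> fish = new ArrayList<PVector>();

void setup() {
  size(800, 450);
  int count = round(reefHealth * maxFish);
  for (int i = 0; i < count; i++) {
    fish.add(new PVector(random(width), random(height)));
  }
}

void draw() {
  // reef color shifts from grey (unhealthy) to warm coral tones (healthy)
  background(lerpColor(color(90), color(255, 120, 90), reefHealth));
  noStroke();
  fill(255, 200, 0);
  for (PVector f : fish) {
    f.x = (f.x + random(0.5, 2)) % width;   // simple drift to the right
    ellipse(f.x, f.y, 14, 6);
  }
}
```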

SHADOW EMA - AN IMMERSIVE ESCAPE ROOM EXPERIENCE

STRANGER THINGS MEETS THE NEVERENDING STORY

"Story Lab II students were developing an AI model that, once complete, could tell the “perfect story”. But it developed more than any of the students would have imagined. It created a rift in the basement setting loose fictional characters and monsters into the real world. The students were too late to save themselves, so now it is up to the people visiting EMA to close the rift. The guests solve a series of puzzles based around stories both familiar and new. In the first room, players work together to unlock the safe containing a journal written by the Story Lab students and the blacklight which leads the way to the AI’s lair. The players are subsequently challenged in the puzzle hall where they must scour the room for the answer to the puppet’s riddles. If successful, players move into the final room, deactivating the machine by joining arms and working together. With the machine powered off, the players ascend the stairs with the revived guide, saving the EMA Center from certain doom."

The experience combined immersive theater with completely original puzzles created with p5.js, projection mapping, Arduino microcontroller programming, 3D printing, machine learning generative design, alternate reality game (ARG) elements, and good old-fashioned set, prop, and puppet construction.

ShadowEMA.png

KOIGEN - A GENERATIVE MODEL-BASED KOI POND

KoiGen is an interactive, real-time Processing sketch that uses diffusion-model-generated koi to explore the connection between data visualization and machine learning generative art.

Each koi is unique yet species-accurate, created through a generative workflow. KoiGen was the initial instantiation of this idea; work continues under the ReefGen project described above.

AI+CREATIVE FILMMAKING DESIGN HACKATHON AND WORKSHOPS

The Nebraska Technology and Governance Center, Cinema 16, and Ash Eliza Smith, Robert Twomey, and NovySan from the Johnny Carson Center for Emerging Media Arts organized and co-hosted a one-day intensive AI filmmaking hackathon in April 2023, where creatives offered speculative visions of the future and emergent system designs, suggesting clues and strategies for how we might shape an interspecies future. The day culminated with a dinner, guest speakers, and a film screening with awards.

Professor Dan Novy led two of the three pre-hackathon workshops, introducing students to the ethical use of emerging machine learning tools such as Stable Diffusion, Dreambooth, ControlNet, RunwayML, Midjourney, ChatGPT, DALL-E, EbSynth, Lens Studio, Avatarify, DeepFace Live, Eleven Labs, D-ID, and others.

https://vimeo.com/showcase/10352314

16BY9_TAKE_ONE_AI_HACKATHON.png

DENTICLES AND TENTACLES OCEAN EXPLORATION WORKSHOP

(Rarotonga, Cook Islands) From October 3 to 14, 2022, Sharks Pacific, Sharks Pacific Cook Islands Trust, and the Ocean Discovery League (ODL) hosted two workshops to train Cook Islands students in ocean exploration, engineering, and research techniques. Sharks Pacific and the Ocean Discovery League provided opportunities that had never before been available to young Cook Islanders interested in STEM. Several of the deep-sea cameras to be used during the expedition were initially developed at the MIT Media Lab.

Professor Dan Novy, University of Nebraska-Lincoln, co-developer of the Maka Niu deep-ocean imaging and sensor platform, also led workshops in storytelling for science communication and immersive data visualization for community outreach.

https://www.cookislandsnews.com/internal/opinion/diving-out-of-their-comfort-zone-for-deep-sea-capacity-building/

dentAndTents.jpeg

SPIRITS OF PLACE

On Halloween Day 2022, Assistant Professor Dan Novy and the first-year students in the Johnny Carson Center for Emerging Media Arts Story Lab 1 course premiered 'Spirits of Place,' forty-five unique geolocative immersive audio experiences placed in or around downtown Lincoln. Featuring vocal performances by actors from the Johnny Carson School of Theater & Film, 'Spirits of Place' are interactive audio walks that explore location as character and its uncanny ability to provide a deep sense of immersion and narrative enhancement. Stories range from long-dead ghosts wishing to be freed, to uncovering the stories behind the victims of a historic train wreck, to aiding a scientist from a collapsing parallel dimension to save our world. The immersive audio experiences were built using the free Echoes interactive sound walks app or MIT's App Inventor. Listeners wishing to explore can begin by visiting https://go.unl.edu/spirits-of-place to download the app and choose their first adventure.

spirits-of-place.png

INTERACTIVE PSEUDO-HOLOGRAPHIC VIRTUAL AQUARIUMS

For the Australian National Maritime Museum's "One Ocean, Our Future" exhibition, and in collaboration with the Schmidt Ocean Institute, I created five interactive, gesturally controlled pseudo-holographic virtual aquariums that let museum guests explore and engage with 3D visualizations of five extraordinary deep-sea specimens revealed during Schmidt Ocean Institute's 2020 circumnavigation of Australia aboard the research vessel Falkor. All development used a combination of LightWave 3D, Blender, and the Unity real-time game engine, with custom scripting written in C#.

anmm.gif

E14-E15 MATTERPORT SCAN

I led a small team in creating a complete digital double of MIT buildings E14 and E15, home of the Media Lab, the Center for Bits and Atoms, Comparative Media Studies, the Program in Art, Culture + Technology, the Center for Advanced Urbanism, and the MIT List Visual Arts Center.

matterportScan.png

MEDIA LAB METAVERSE PANEL

The realization of a metaverse—the convergence of augmented reality, virtual reality, and other digital technologies with physical reality—has the ability to radically change the way we connect, perceive, and experience the world around us. My contributions to this panel discussion included virtual production and the creation of unique virtual 3D environments using the Spatial platform.

Screen Shot 2022-01-22 at 1.54.27 PM.png

2021 FALL MEMBER EVENT

As creative producer and "Chief Imagineer" for the Media Lab's semi-annual Member Meeting, I led all aspects of production, including live simulcast and pre-recorded "simulive" video segments. Using principles of illusion and stage magic, the simulive segments were designed to seamlessly integrate with the event's livestream, creating the appearance that there were six on-site, live production crews when in fact there were only three.

memberweekFall2021.jpeg

As creative producer for Inflection PoinT, a student-focused, student-organized event to help the Media Lab community connect and learn, and celebrate the uniqueness that makes us resilient and the curiosity that drives us forward, I led the production of Showcase21, a livestreamed, curated showcase of videos that highlighted the work and lives of students, UROPs, and postdocs at the Media Lab. Additionally, I project-managed the development of a custom Gather.town installation created by a team of Media Lab students to provide additional opportunities for networking and virtual demonstrations.

Watch the trailer

An image of the MIT Media Lab with the words "MIT Media Lab Inflection PoinT: Showcase 21" overlaid

2021 FESTIVAL OF LEARNING: VIRTUAL ATRIUMS

The Media Lab's 3rd-floor atrium is an important community space that was largely inaccessible for more than a year due to the COVID-19 pandemic. For the 2021 Festival of Learning, an annual event by and for the Media Lab community, I led the development of four virtual atriums in Mozilla Hubs to serve as online gathering spaces for different activities: a Disco Atrium, an Underwater Atrium, an Outer Space Atrium, and the Hall of Mirrors, which served as a portal to scheduled sessions.

A panda avatar in a Space Invaders-themed environment

LILLI is an immersive, volumetric, reflective lightfield display that uses a hybrid of video and laser projection to visualize marine wildlife, ocean climate data, and more. Science fiction literature and film abound with large holographic displays, often showing consoles surrounded by multiple people working together to manipulate floating images and data, yet a nagging question goes unanswered: how do multiple people actually interact with a large 3D scene?


This project is twofold:

1. Create a large, interactive 3D display, and

2. Explore and establish a foundation for multi-user interaction with large 3D data.


Using a combination of laser and video projectors, we designed and built a car-sized 3D display that can be viewed from all sides. Developed in collaboration with the MIT Media Lab's Open Ocean Initiative to help promote cleaner and more accessible ocean research, this display will be featured in 11th Hour Racing's traveling pavilion as they follow racing sailboats around the world.

Watch the demo video

lilli.png

#ZoomADay was a year-long project exploring the creation and use of synthetic characters and deepfakes for online telepresence and communications. It began as a series of Snapchat lenses, many of which are publicly available, and advanced to include more complex, AI-synthesized characters using a machine learning toolkit called Avatarify. Along the way, the project was used to create virtual regalia for the 2020 Media Lab/Program in Media Arts and Sciences graduates and for a virtual celebration of the announcement that Dava Newman had become the director of the Lab.

A man in an old-fashioned diving helmet underwater, in front of a submersible ship.

Dissertation, Massachusetts Institute of Technology, 2019. Programmable Synthetic Hallucinations describes using the bio-physiological mechanics of hallucination in the human brain to display virtual information directly in the visual field.

A woman wearing a helmet that covers her face and head, with several cables emerging from the back.

I've built several iterations of Pepper's ghost-style displays, including Yo-Yo Vision, a human-sized, autostereoscopic (no 3D glasses required) lightfield display that allowed Yo-Yo Ma to perform at the Media Lab's 30th Anniversary celebration from a remote location. Another version of this project, the 3D Telepresence Chair, was used to allow an executive from a Media Lab member company to attend a meeting remotely.

3D_aerial_display.jpg_edited.jpg

OCEAN BLUE

Ocean Blue was an immersive ocean data experience built for the Wandering Cricket Night Market, part of a network of ephemeral art experiences. Ocean data concerning climate change, ocean health, and marine conservation were juxtaposed with a crowd-sourced database of memories, hopes, and dreams about the ocean collected online through a Google Form.


Twin projectors beamed the ocean facts or memories into an infinity-mirror-effect moon pool installed in the back of a rented U-Haul moving truck. The interior of the box truck was decorated to simulate an underwater environment, and the moon pool's real water surface was perturbed by Arduino-controlled aquarium pumps, creating a ripple effect that caused the projected typography to appear as caustic light reflections throughout the environment.

 

After a short amount of time, the microcontroller would pause the pumps, and as the water stilled to a mirror surface, the ocean facts and memories would coalesce and become legible on the walls surrounding the guests. After some moments of reflection, the pumps would reactivate, the words would ripple back into flickering, meditative caustic patterns, and the sound of the moon pool would gently fill the truck. A new set of ocean facts or memories would be randomly chosen from the database and the whole process would repeat.

 

The main projection application was written in Processing and the relay control of the aquarium pumps was coded in C in the Arduino IDE.
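
As a rough sketch of how the Processing side could sequence the ripple-and-still cycle described above (not the original code), the example below alternates the two phases on a timer and signals the Arduino pump relay over serial; the port setup, serial protocol, durations, and sample facts are all assumptions.

```java
// Illustrative Processing sketch of the ripple/still cycle (not the original code):
// alternate a "ripple" phase (pumps on, text unreadable) with a "still" phase
// (pumps off, text legible), telling the Arduino relay over serial when to switch.

import processing.serial.*;

Serial arduino;                 // assumed single-character serial protocol: '1' = pumps on, '0' = pumps off
String[] facts = {
  "The ocean produces over half of the world's oxygen.",
  "Less than twenty percent of the seafloor has been mapped."
};
boolean rippling = true;        // true = pumps running
int phaseStart;                 // millis() when the current phase began
int rippleMs = 20000, stillMs = 15000;
String currentFact;

void setup() {
  size(1280, 720);
  textAlign(CENTER, CENTER);
  textSize(40);
  // arduino = new Serial(this, Serial.list()[0], 9600); // uncomment with hardware attached
  currentFact = facts[int(random(facts.length))];
  phaseStart = millis();
}

void draw() {
  background(0, 30, 60);
  int elapsed = millis() - phaseStart;
  if (rippling && elapsed > rippleMs) {
    rippling = false;                      // pause the pumps, let the pool still
    if (arduino != null) arduino.write('0');
    phaseStart = millis();
  } else if (!rippling && elapsed > stillMs) {
    rippling = true;                       // restart the pumps, pick a new fact
    if (arduino != null) arduino.write('1');
    currentFact = facts[int(random(facts.length))];
    phaseStart = millis();
  }
  fill(255);
  text(currentFact, width * 0.5, height * 0.5);
}
```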


Guests reported a sense of calmness and wonder and delighted in the reveal of the text as the moon pool stilled. Several spontaneously offered their own memories of the ocean and discussed conservation and exploration; some regretted not having followed their dream to work with whales or become an oceanographer. A number of guests returned several times to discover new facts or memories, and many lingered for the meditative atmosphere. Several hundred people experienced this immersive and interactive data environment before the Night Market came to a close.

 

Many thanks to Jon Ferguson, Scott Berk, Anna Waldman-Brown, Chia Evers, and the other volunteers who helped build and host this immersive data experience.

OceanBlueText.jpg

Expands the home-video viewing experience by generating imagery to extend the TV screen and give the impression that the scene wraps completely around the viewer. Uses optical flow, color analysis, and heuristics to extrapolate beyond the screen edge in real time using standard microprocessors and GPUs.
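
A toy Processing sketch of the color-analysis heuristic alone (the real system also uses optical flow and runs on live video): edge pixels of a stand-in frame are smeared outward to fill the peripheral panels.

```java
// Simplified illustration only: pixels at each edge of the "screen" frame are
// smeared outward into side panels, a crude peripheral extension of the image.

PImage frame;   // stands in for the current video frame

void setup() {
  size(1200, 400);
  frame = createImage(400, 400, RGB);
  // fill the stand-in frame with a simple gradient so the smear is visible
  frame.loadPixels();
  for (int y = 0; y < frame.height; y++) {
    for (int x = 0; x < frame.width; x++) {
      frame.pixels[y * frame.width + x] = color(x * 255 / frame.width, 80, y * 255 / frame.height);
    }
  }
  frame.updatePixels();
}

void draw() {
  int panelW = 400;
  image(frame, panelW, 0);                       // centre: the original frame
  for (int y = 0; y < frame.height; y++) {
    color leftEdge  = frame.get(0, y);           // averaging a band would be smoother
    color rightEdge = frame.get(frame.width - 1, y);
    stroke(leftEdge);  line(0, y, panelW - 1, y);                    // left panel
    stroke(rightEdge); line(panelW + frame.width, y, width - 1, y);  // right panel
  }
}
```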

InfinityByNine_1280_square_300dpi.tif_edited.jpg

An immersive storytelling environment to augment creative play using texture, color, and image. Utilizes natural language processing to listen to and understand stories being told, and thematically augment the environment using color and images.
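
As a toy illustration of the augmentation step only, the Processing sketch below uses a hard-coded keyword match in place of real natural language processing: words detected in the story select the ambient color washed over the room.

```java
// Illustrative stand-in: the real system used NLP on live speech; here a
// fixed transcript and keyword match choose the room's ambient theme color.

color ambient;

void setup() {
  size(400, 400);
  ambient = color(40);                                              // neutral default
  String transcript = "once upon a time, deep in the forest, an owl woke up";
  if (transcript.contains("forest"))      ambient = color(20, 90, 40);   // woodland green
  else if (transcript.contains("ocean"))  ambient = color(10, 40, 90);   // deep blue
}

void draw() {
  background(ambient);   // wash the "walls" in the story's theme color
}
```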

A drawing of an owl projected against a blue wall.

A fusion of exploratory data visualizations based on realtime location-aware computing and the aesthetics of Abstract Expressionism.

A Jackson Pollock-style digital painting.

A basketball net that incorporates segments of conductive fiber whose resistance changes with degree of stretch, used to calculate the force and speed of a basketball traveling through the net. The output is displayed via the in-arena Jumbotron and televised to the home audience. Premiered at the NBA 2012 All-Star Slam Dunk Contest; winner of a 2013 Effie Award (Bronze, Beverages—Non-alcohol).
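
A back-of-the-envelope Processing sketch of the speed calculation (not the production code): if the top and bottom conductive segments are a known distance apart, the ball's speed through the net follows from the time between their stretch events. All numbers are invented stand-ins for real sensor readings.

```java
// Illustrative arithmetic only; timestamps would come from threshold crossings
// in the fiber resistance readings, not the hard-coded values used here.

void setup() {
  float segmentSeparation = 0.35;   // metres between the two sensing segments (assumed)
  long topEventMs = 1000;           // timestamp of the upper segment's stretch peak
  long bottomEventMs = 1046;        // timestamp of the lower segment's stretch peak
  float dt = (bottomEventMs - topEventMs) / 1000.0;
  float speed = segmentSeparation / dt;             // metres per second through the net
  println("speed through net: " + nf(speed, 0, 2) + " m/s");
}
```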

A basketball net.

Transforms motion tracking and blob detection data collected from varying inputs, such as floating soap bubbles or live bacteria, into sound. BubbleSynth was a 2014 Guthman Musical Instrument Competition semi-finalist; SYNTHBacteria premiered at the Peabody Essex Museum in Salem, Massachusetts, on September 18, 2014, as part of the After Hours PEM PM Series.
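
In the same spirit (illustrative only, not the original BubbleSynth code), this Processing sketch maps a tracked blob's position to pitch and loudness, with the mouse standing in for the blob detector; it assumes the Processing Sound library is installed.

```java
// Minimal sonification sketch: mouse position stands in for a tracked blob;
// horizontal position maps to pitch, vertical position to loudness.

import processing.sound.*;

SinOsc osc;

void setup() {
  size(600, 400);
  osc = new SinOsc(this);
  osc.play();
}

void draw() {
  background(20);
  // in the real system these values would come from blob detection on video
  float freq = map(mouseX, 0, width, 110, 880);   // blob x -> pitch (Hz)
  float amp  = map(mouseY, 0, height, 0.6, 0.0);  // blob y -> loudness
  osc.freq(freq);
  osc.amp(amp);
  ellipse(mouseX, mouseY, 30, 30);                // draw the "blob"
}
```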

A computer display showing bacteria under a microscope on the right and a column of code on the left

A data-driven social controller for visual exploration.

A ball-shaped touch display showing various movie posters.

4K/8K Comics applies the affordances of ultra-high-resolution screens to traditional print media such as comic books, graphic novels, and other sequential art forms. The comic panel becomes the entry point to the corresponding moment in the film adaptation, while scenes from the film indicate the source frames of the graphic novel. The relationships among comics, films, social media, parodies, and other support materials can be navigated using native touch screens, gestures, or novel wireless control devices. Big data techniques are used to sift, store, and explore vast catalogs of long-running titles, enabling sharing and remixing among friends, fans, and collectors.
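
One way to model the panel-to-film linkage described above is a simple lookup from a panel identifier to a timecode in the film adaptation; this Processing snippet is illustrative only, and the identifier and timecode are invented.

```java
// Illustrative only: tapping a panel would seek the film player to the
// timecode linked to that panel. Identifiers and seconds are made up.

import java.util.HashMap;

HashMap<String, Float> panelToFilmSeconds = new HashMap<String, Float>();

void setup() {
  panelToFilmSeconds.put("issue01_page06_panel3", 2712.5f);
  println("seek film to " + panelToFilmSeconds.get("issue01_page06_panel3") + " s");
}
```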

A display showing page 6 of a Wolverine comic book, a related Wikipedia page, and a frame from a Wolverine film

A sensor and actuator platform for Live Action Role Playing (LARP), immersive theater, theme parks, and other transmedia experiences. The system is an open framework that allows any kind of sensor or actuator to be easily programmed and reprogrammed on the fly, letting novice LARP masters add magic props and special visual effects to their games. The system premiered at Tri-Wyrd in June 2012 and was integrated into “Veil Wars,” a two-day, pervasive, locative experience layered onto the Wyrd Con convention, created by Scott Walker and loosely based on medieval Japanese mythology.

A "fire gate" created with a paper lantern and red LEDs.