What is clothing for? When we ask people why they choose the clothes they do, we find that the most popular reasons are not necessarily weather protection and comfort. Instead, people wear clothes to express aspects of themselves, including emotion, personality, and their perception of the occasion. A red jacket can express confidence, leather shoes can signal attendance at a wedding, and a cap can show allegiance to a basketball team. People use fashion as a temporal expression of themselves; they want to do more, to tell stories about themselves.
Fashion shows have traditionally meant an audience gazing at models walking a runway. It was not until Chanel staged its innovative fashion experiences that the world began to see supermarkets, airports, and feminist protests as settings for fashion shows. While those projects transformed the space of fashion, another inherent limitation has yet to be overcome: the static form of clothing, which offers no interaction in time and no extension into temporal expression. In this realm, work by Kailu Guan has sought to bridge the gap between the physical and the digital. Inspired by her work, we set out to create our own digital narrative experience, adapted to the space of the fashion show.
We propose a future of fashion that narrates dynamic visual ideas in 4D (3D form plus time) and lets designers shine in a 5D interaction space (adding audience participation): a new vision of the fashion show as a participatory practice in which viewers move around to see aspects of the clothing from their own perspectives, an interactive form of the fashion show experience.
As static movies are to interactive games, so the old fashion show is to a new fashion experience whose digital content unfolds in time and space. We therefore set out to integrate animated content with fashion, telling a story as audience members view it through their digital devices. The models become stationary while the audience moves around them. We implement this with markers on the garment that drive computer-vision-based augmented reality.
User and Material Testing
First we talked to cohorts of fashion designers and fashion-show goers to learn what kinds of augmented clothing serve the expressive needs of those who design fashion and those who consume it. We found that most people interested in augmented clothing fall along a spectrum from total customization to minimal enhancement, but all suggested that in a fashion show they want to see integrated pieces that tell an epic story, with lighting and projection that add to it. We therefore decided to create a dress that narrates the birth and death of the universe. We built it on a circular crinoline form so that viewers can walk around the model to watch the story unfold like a tapestry, or look from above to take in the story in one view. The dress form also needed panels flat enough for the marker considerations below. Finally, we needed to construct a 3D projection system that rotates in space in alignment with the story of the rise and fall of planets narrated on the dress.
To investigate the efficacy of computer-vision-based enhancements on clothing, we ran a series of experiments to see whether textures on fabric can be picked up as markers by the Vuforia system (see process video). Lighting is not a problem as long as there is some ambient brightness. Distance matters for initiating detection: the device must be close enough to resolve the pattern, but once a marker is detected, moving farther away is not an issue. A useful consequence is that viewers congregate around the model at first to find the markers, then disperse to their preferred viewpoints.
Viewing scale is also a consideration. Viewers can frame views that contain more than one model or more than one marker, so stories can trigger one another within a dress or carry over from one dress to the next. We used Unity scripting so that an animation on one marker triggers a different animation on another marker, letting one pattern of activity lead to the next in the narrative, like a spaceship flying between panels (video). Finally, the texture of the garment is a key consideration. We found that stretchable materials do not permit reliable marker detection, and that embroidered patterns, while beautiful, cannot cover enough area in a reasonable production time. After further consultation with designers and viewers, we settled on heat-transferred vinyl on a dark polyester fabric.
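As a sketch of how this cross-marker chaining might look in a Unity script (class and field names here are illustrative, not our exact project code), a component on one Vuforia image target can start its own animation when the marker is found, then fire a trigger on an Animator anchored to a second marker after a delay:

```csharp
using UnityEngine;
using Vuforia;

// Illustrative sketch: when this marker is detected, play its own animation,
// then hand off to an animation anchored to another marker.
public class ChainedMarkerAnimation : MonoBehaviour, ITrackableEventHandler
{
    public Animator ownAnimator;      // animation anchored to this marker
    public Animator nextAnimator;     // animation anchored to the next marker
    public float handoffDelay = 4f;   // seconds before the next story starts

    private TrackableBehaviour trackable;

    void Start()
    {
        trackable = GetComponent<TrackableBehaviour>();
        if (trackable != null)
            trackable.RegisterTrackableEventHandler(this);
    }

    public void OnTrackableStateChanged(TrackableBehaviour.Status previousStatus,
                                        TrackableBehaviour.Status newStatus)
    {
        if (newStatus == TrackableBehaviour.Status.DETECTED ||
            newStatus == TrackableBehaviour.Status.TRACKED)
        {
            ownAnimator.SetTrigger("Play");          // start this marker's story
            Invoke(nameof(HandOff), handoffDelay);   // schedule the next one
        }
    }

    void HandOff()
    {
        // e.g. the spaceship departing toward the next panel's scene
        nextAnimator.SetTrigger("Play");
    }
}
```

Both Animators would use a "Play" trigger parameter in their controllers; the same pattern extends across dresses by pointing nextAnimator at content on another model's marker.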
Prototyping and Construction
For Vuforia integration, traditional dress forms with darts and flares cannot carry markers because the ruffles are unstable. We therefore drafted a pattern that is completely flat on each side of the crinoline by draping the crinoline form in muslin. After making a prototype from the pattern, we taped patterns onto it to test with Vuforia. We then built each of the next two layers of the dress by draping onto the crinoline and sewing the layers together. The zipper opens at the back of the second layer and on the side of the first (augmented-reality) layer, because we found that the stories needed to span the front, back, and side panels to remain coherent.
Next we designed four sets of patterns, one for each side of the crinoline, narrating the birth of the universe in the Big Bang, star formation and expansion, multidimensional aggregation, and the end in the Big Crunch. We scanned in our draped pattern and produced the vinyl on a Roland cutter. After heat-pressing the cut-out pattern onto the fabric, we carefully weeded out the excess vinyl. The result was draped again over the mannequin, and further stitches reinforced the back side. Throughout the process we continued to improve our marker detection and animation in Unity. The high contrast of the vinyl on fabric makes the Vuforia integration robust.
We designed shapes and textures in Blender and Maya and deployed them to iPhone for this project. A wireframe shader conveys the "rising from the ground" concept in two of the scenes. A package for moving objects in Kepler orbits drives some of the planet spheres in the destruction scene. The main challenges were the placement and movement of the individual shapes with respect to the pattern on the clothing. We modified a camera-path-creator asset for Unity to move our ship and planet-like shapes around, and added particle systems from Unity. The result is a set of four animations, viewable from an iPhone, that narrate a tapestry of the birth and death of the universe.
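The orbital motion of the planet spheres can also be approximated without a third-party package. A minimal sketch (field names are illustrative), assuming each sphere is anchored to a center transform placed on the tracked marker, advances the sphere along a tilted elliptical path each frame:

```csharp
using UnityEngine;

// Illustrative sketch: move a planet-like sphere on an elliptical orbit
// around a center point anchored to the garment's marker.
public class SimpleOrbit : MonoBehaviour
{
    public Transform center;          // anchor placed on the tracked marker
    public float semiMajor = 0.3f;    // orbit radii in meters (marker space)
    public float semiMinor = 0.2f;
    public float period = 10f;        // seconds per revolution
    public float tiltDegrees = 30f;   // tilt of the orbital plane

    private float angle;

    void Update()
    {
        // advance the orbital angle at a constant rate
        angle += 2f * Mathf.PI * Time.deltaTime / period;
        Vector3 local = new Vector3(semiMajor * Mathf.Cos(angle),
                                    0f,
                                    semiMinor * Mathf.Sin(angle));
        // tilt the orbital plane, then offset from the marker-anchored center
        transform.position = center.position +
                             Quaternion.Euler(tiltDegrees, 0f, 0f) * local;
    }
}
```

A true Kepler orbit would additionally vary the angular rate with distance (equal areas in equal times); for a fashion animation the constant-rate ellipse reads the same to viewers.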
When we talked with viewers about animations on textiles, they repeatedly raised the possibility of a projection environment that complements the AR fashion by setting the scene of the narrative. We implemented this with three Cube RIF projectors that spin around a central gear driven by a 1 RPM gear motor. The projectors mount on ball heads that rotate to free angles, so they can trace orbits of various inclinations around the room. The result is a slowly rotating set of three videos (currently based on the animations we built earlier) that sweep around the room like planets, complementing the show itself, in which the audience orbits the model in similar fashion. The dress, the animated AR content, the projection in the room, and the audience-model engagement all converge on the theme of a cyclical evolution: planets appearing and disappearing in orbit in a universe that is born and then gone.
We have created a narrative fashion experience that runs contrary to traditional practice. Instead of viewers sitting static in a gallery gazing at walking models, we reverse the relationship, letting the audience go where they please and engage AR content when they please. Using the theme of the cyclical birth and death of the universe, we created a visual, interactive experience that uses 4D projection, the continuity of the dress form, and audience movement to show off fashion design in a temporally interactive manner. The future of fashion is to see beyond 2D fabric and 3D design into 4D custom animated content and 5D audience interaction. It remains to be seen what else fashion exhibition is capable of expressing.
This work was exhibited at NYC Media Lab 2018, Parsons School of Design.