Humans are implicitly affected by the arrangement of objects in the spaces we inhabit. Autonomous devices, however, interact with us through explicit gestures. Robots, for example, shape our perception and behavior by using human-like metaphors to convey impressions of recognition, disagreement, or understanding. How do these implicit and explicit influences work together to shape human behavior in the real world?
To design for the interaction of machine gestures with space, I used autonomous chairs presented as images, videos, and virtual environments. At Northeastern University CAMD, I investigated implicit influence by using eye-tracking to study perceptual attention across different chair arrangements. With Wendy Ju’s lab at Cornell Tech, I examined how explicit chair gestures affect us through videos of chair-human interactions. At Parsons School of Design, I tested human responses to both the arrangement and the interactive capabilities of chairs in VR experiences that prototype hypothetical scenarios. I found that machine gestural interactions affect human perception in particular spatial arrangements during free-form virtual exploration, demonstrating design strategies grounded in hypothetical situations that would be difficult to realize in real life.