Idea Generation

When you think of VR, where does your mind go? Is it an ultra-realistic video game delivering the complete entertainment experience? Or maybe it’s viewing the Grand Canyon while sitting in your living room. But what about putting on a headset and going to work?

The Cornell Virtual Embodiment Lab is asking questions about collaboration and competition, something anyone with a 9-5 job is all too familiar with.

So why should you care? Well, if you’ve ever had to travel for a work meeting, you may want to pay attention. Companies spend a total of $111.7 billion a year on domestic travel for conventions, meetings, and training. As VR technology becomes more advanced, however, some of these trips may no longer be needed: instead of flying to corporate headquarters, employees could simply meet in a virtual space, saving the time, money, and fuel that travel demands.

This study was headed by Yilu Sun, who came to the lab as an MPS student in Information Science. Her experiment was inspired by an earlier study that tracked the movements of participants in collaborating pairs to predict their success at a collaborative task.
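That earlier study’s analysis pipeline isn’t described in this post, but the core idea is easy to sketch: log each participant’s head position every frame, convert each track to frame-to-frame movement speeds, and correlate the two speed series to get a simple synchrony score for the pair. The TypeScript below is an illustrative stand-in (all names are ours), not the lab’s actual code.

```typescript
// Toy sketch of a pairwise movement-synchrony score (not the lab's code):
// turn each participant's per-frame head positions into frame-to-frame
// speeds, then correlate the two speed series.

type Vec3 = { x: number; y: number; z: number };

// Frame-to-frame displacement magnitudes for one participant's track.
function speeds(track: Vec3[]): number[] {
  const out: number[] = [];
  for (let i = 1; i < track.length; i++) {
    const dx = track[i].x - track[i - 1].x;
    const dy = track[i].y - track[i - 1].y;
    const dz = track[i].z - track[i - 1].z;
    out.push(Math.sqrt(dx * dx + dy * dy + dz * dz));
  }
  return out;
}

// Pearson correlation of two series, truncated to the shorter length.
function pearson(a: number[], b: number[]): number {
  const n = Math.min(a.length, b.length);
  const meanA = a.slice(0, n).reduce((s, v) => s + v, 0) / n;
  const meanB = b.slice(0, n).reduce((s, v) => s + v, 0) / n;
  let cov = 0, varA = 0, varB = 0;
  for (let i = 0; i < n; i++) {
    const da = a[i] - meanA;
    const db = b[i] - meanB;
    cov += da * db;
    varA += da * da;
    varB += db * db;
  }
  return cov / Math.sqrt(varA * varB);
}

// Higher correlation = the pair's movements rise and fall together.
const synchrony = (p1: Vec3[], p2: Vec3[]) => pearson(speeds(p1), speeds(p2));
```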

In her study, she manipulated avatar appearance and whether participants were competing or collaborating. She is currently analyzing the data, and hopes to contribute to knowledge about how common social interactions may occur in virtual reality. “When we see the trends we notice all of these fantasies we see in sci-fi coming within our grasp. VR is a tool that can transition us into a more globally connected group of people.”


The Moon Over Ithaca

You can’t hop on Elon Musk’s rocket, but the Communication Department has an alternative.

“That’s one small step for man, one giant leap for mankind,” said astronaut Neil Armstrong as he stepped onto the moon’s surface in 1969. If Armstrong were alive today, I believe he would be beside himself to see the headway virtual reality is making in allowing people to feel present in a virtual environment.

Teaching moon phases has challenged high school classrooms for years. While hands-on activities can be engaging, they also bring challenges: equipment can get misplaced or broken. Desktop simulations are accurate but perhaps less engaging. But what if we brought the moon right into your classroom? The Virtual Embodiment Lab is looking past cool visuals and video games and searching for ways VR can help us better understand and learn abstract concepts.

The idea is that by combining the immersive feel of VR with interactivity, students can have a more memorable experience when learning about the phases of the moon.

To keep Neil Armstrong and others like him interested, Professor Andrea Stevenson Won’s lab is currently working on a project that brings this hands-on activity into outer space. The project is a collaboration with Jonathan Schuldt, also in the Department of Communication, and Natasha Holmes in the Physics Department, and is funded by Oculus Education.

The graduate student leads for this project are Byungdoo Kim, Jack Madden, Swati Pandita, and Yilu Sun. Undergraduate team members include Philip Barrett, Caley Droof, Alice Nam, Dwyer Tschantz, and Kylie Youk. The environment was programmed with the assistance of Annie Hughey, Akhil Gopu, Anirudh Maddula, Frank Rodriguez, Albert Tsao, and Jason Wu.

The Virtual Embodiment Lab is located on the 4th floor of the Mann Library Building, in the Department of Communication.

Each graduate student has their own key role within the project. For instance, Kim and Pandita are running the experiments with assistance from undergraduate research assistants. This includes recruiting participants, assigning them to different conditions, helping them go through the experimental stimulus, measuring their responses, and analyzing the data.

Byungdoo’s overall goal for the project is to “earn more experience in research in the immersive virtual environment and its impact on attitudinal and behavioral change.”

Byungdoo Kim is a Ph.D. student in the Department of Communication interested in pro-environmental judgment and decision-making.

Jack Madden is a 4th year Ph.D. candidate in the Astronomy Department at Cornell and has been working with Professor Holmes and Professor Won on this moon phase project since the fall. The main goal of their work is to further explore how learning takes place in virtual reality.

His research on exoplanets made him uniquely qualified to create the models of the moon and earth used in the virtual environment.

Swati Pandita is a first-year Communication Ph.D. student advised by Dr. Andrea S. Won in the Virtual Embodiment Lab. Her interests lie in human-computer interaction and embodied cognition.

Her role is to oversee user experience (listening to user feedback about what can be improved in the interface and interactions) and to run the experiment.

She aims to evaluate whether VR is a more effective learning environment than traditional classroom formats (desktop interfaces or hands-on demos), as well as to provide a novel experience that is engaging and informative for students.

Yilu Sun graduated from Cornell with a Master’s degree in Information Science in December 2017. Sun’s research interests include nonverbal synchrony and avatar customization, as well as UX design and research in virtual reality.

Yilu says, “At the early stage of the project, I worked with Jason Wu to create low to medium fidelity prototypes and presented to the team for feedback. Then I collaborated with Jack Madden and a team of CS students on programming the presentation of the quiz questions in the head-mounted display. Recently I am leading the UX study to find the most user-friendly quiz question design.”

This virtual expedition won’t be around for long: this semester, the shuttle is coming back down to Earth!


Embodiment’s Effect on Behavior

Avatar creation is at the forefront of VR technology. Letting individuals embody an avatar they created themselves makes for a more immersive and engaging experience. But what about when that avatar doesn’t quite look like them? Senior research assistant Aishwariyah Dhyan Vimal aimed to answer this very question.


The Grocery Store

In her recent study, participants were asked to shop for one week’s worth of groceries in a virtual grocery store set in a “food desert.” Items varied in relative health benefits as well as price, and each participant had a budget of $60. The experimental variable was the assigned avatar, which was either slender or obese.

Aishy set out to find whether the embodied avatar would affect participants’ shopping habits.

In an effort to make the experience more authentic, each participant created a unique head for their avatar using facial generation technology. Some participants noticed their avatar’s relative obesity immediately, one even saying, “Whoa, I’m fat.”

When asked what was behind this project, Aishy turned to public policy. “I looked at the public policy to reduce obesity and food deserts. The obesity epidemic in the USA continues to worsen and the implementation of the public policy will help reduce this problem.” The research also looked specifically at “food deserts,” asking whether participants would become more supportive of public policy to reduce obesity.

Aishy noted the struggles she went through working on a senior honors thesis, having to learn many new skills to make everything work: “From creating a virtual reality environment in Unity, creating customizable avatar heads, Qualtrics surveys, data analysis, and conducting an actual lab experiment.” The growing researcher ended by acknowledging how happy she is that she pushed through it and finished.


Undergrad research assistants create new “Pit Demo” for VEL

The research team at Cornell University has recently created a “Pit Demo” to observe how the sense of “presence” affects us in a virtual world. Participants in the demo can move freely around a world that is an exact replica of the lab. Senior research assistant Sydney Smith modeled the rooms in 3DS Max and imported them into Unity 3D. Functionality was implemented by Jason Wu and Daniel Tagle, who wrote scripts that collapse the floor on a keypress, letting participants watch a “pit” open below their feet down to the floor of Mann Library, and that let them pick up objects from the room and throw them into the pit. Below, research assistant Claudia Morris tests out the pit demo. The plank she is standing on matches the digital model of the plank in the virtual scene, providing passive haptic feedback.
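The demo itself was built in Unity, and its scripts aren’t reproduced in this post. As a rough illustration of the trigger-and-animate pattern such a script follows, here is a minimal browser sketch in TypeScript using three.js; the scene contents, the ‘p’ key binding, and the animation speed are our own stand-ins, not the lab’s.

```typescript
// Minimal sketch of the "collapse the floor on keypress" idea. The lab's
// demo was built in Unity; this standalone three.js version just shows the
// trigger-and-animate pattern (npm install three).
import * as THREE from 'three';

const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(75, innerWidth / innerHeight, 0.1, 100);
camera.position.set(0, 1.6, 3); // roughly standing eye height

// A simple floor slab that will drop away to reveal the "pit" below.
const floor = new THREE.Mesh(
  new THREE.BoxGeometry(4, 0.1, 4),
  new THREE.MeshNormalMaterial()
);
scene.add(floor);

const renderer = new THREE.WebGLRenderer();
renderer.setSize(innerWidth, innerHeight);
document.body.appendChild(renderer.domElement);

let collapsing = false;
window.addEventListener('keydown', (e) => {
  if (e.key === 'p') collapsing = true; // 'p' for pit -- our choice, not the lab's
});

renderer.setAnimationLoop(() => {
  // Once triggered, ease the floor downward until it sits well below the user.
  if (collapsing && floor.position.y > -8) floor.position.y -= 0.05;
  renderer.render(scene, camera);
});
```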

Get to know the undergraduate researchers working on this project


Daniel Tagle (on left) is a senior studying Communication. He has worked on Perspective Taking in Virtual Reality and now leads the Pit Demo team. A research assistant in the lab since September 2016, Daniel originally became interested in working in VR through being an avid gamer. He is fascinated by how people can interact with others in virtual worlds, and is looking forward to working with virtual reality for many years to come.

Jason Wu (on right) is a junior majoring in Information Science, with a minor in Architecture. He is interested in the spatial qualities of virtual reality, as well as its potential in facilitating social experiences. He recently competed in HackReality NYC, where his project “Wanderlust” was awarded first place.

Sydney Smith is a senior majoring in Communication with a focus in media studies. Her main role in this project has been modeling the lab in 3DS Max, as well as the “Pit” in the demo. She hopes to continue modeling, creating more realistic worlds, and sharpening her skills as a 360 videographer.


Tracking nonverbal behavior in High Fidelity

Tracking the movements of participants in virtual environments is key to our research. The screenshot above shows the summed movements of two participants’ heads and hands as they converse in High Fidelity, a shared virtual environment that allows users in different locations to meet in virtual worlds. Omar Shaikh created the tracking visualizer, and Yilu Sun is conducting experiments using this platform.
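The visualizer’s own source isn’t shown in this post, but a “summed movement” measure reduces to simple bookkeeping: for each tracked point (head, left hand, right hand), accumulate the distance traveled between consecutive frames. The TypeScript sketch below illustrates that core; the class name and the point labels are our own inventions.

```typescript
// Illustrative sketch of "summed movement" (not the lab's tracker): for each
// tracked point, accumulate the total distance traveled across frames.

type Vec3 = [number, number, number];

class MovementAccumulator {
  private last = new Map<string, Vec3>();
  private total = new Map<string, number>();

  // Call once per frame per tracked point, e.g. update('p1:head', pos).
  update(point: string, pos: Vec3): void {
    const prev = this.last.get(point);
    if (prev) {
      const d = Math.hypot(pos[0] - prev[0], pos[1] - prev[1], pos[2] - prev[2]);
      this.total.set(point, (this.total.get(point) ?? 0) + d);
    }
    this.last.set(point, pos);
  }

  // Total path length so far for one tracked point.
  summed(point: string): number {
    return this.total.get(point) ?? 0;
  }
}
```

Feeding it one update per frame per tracked point yields a running total per head or hand that can be plotted or compared across participants.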