Development Update 3/29/2017

This is the current state of Just, now with small props added to populate the scene with further detail.  I textured 92 small props that share a single 4096 x 4096 atlas to minimize draw calls.  The majority of the props are physics objects that the viewer can pick up and interact with.
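
For the curious, here is a minimal C++ sketch of how one of these grabbable physics props could be set up in Unreal Engine.  The class and names are simplified placeholders of mine, and the real props may well be configured in the editor or Blueprints rather than in code.

```cpp
// SmallProp.h - hypothetical setup for one of the grabbable physics props.
// All 92 props can share one material that samples the 4096 x 4096 atlas,
// which keeps material switches (and therefore draw calls) to a minimum.

#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "Components/StaticMeshComponent.h"
#include "SmallProp.generated.h"

UCLASS()
class ASmallProp : public AActor
{
    GENERATED_BODY()

public:
    ASmallProp()
    {
        Mesh = CreateDefaultSubobject<UStaticMeshComponent>(TEXT("Mesh"));
        RootComponent = Mesh;

        // Let the prop tumble around and be picked up by the player.
        Mesh->SetSimulatePhysics(true);
        Mesh->SetCollisionProfileName(TEXT("PhysicsActor"));
    }

private:
    UPROPERTY(VisibleAnywhere)
    UStaticMeshComponent* Mesh;
};
```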

Preliminary Lighting Setup

After I finished modeling the lighting fixtures for the environment, I was able to add in actual lighting.  While I'm still playing with the settings, below is a video of a quick walkthrough of the lit environment.

Play Testing: Tutorial

Today I exported a rough draft of the experience and brought it to DAAP for play testing.  My goal for this test was to drop users into the environment and see how well they learned from the UI elements I had created.  It quickly became apparent just how much the wording of my instructions mattered.

The tutorial consisted of two lessons: Teleportation and Object Interaction.

Teleportation

Here is my initial wording: 

Press down on the trackpad to activate teleportation, release the trackpad to teleport to that spot, try it now to go down to that glowing ring.

This is accompanied by two animations: a hand pressing on the trackpad and the user teleporting across the space.

Some of the errors were:

  • Using both controllers to teleport when only one is necessary
  • Walking instead of teleporting
  • Not knowing how to properly point the controllers

My new wording:

Don’t Walk, Teleport!

Using either controller, press on the trackpad and point it where you want to go, release the trackpad to teleport to that spot.  Try it now to go down to that glowing ring!
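
To give a sense of what that instruction maps to under the hood, below is a rough C++ sketch of a press-and-release trackpad teleport on the player pawn.  The action name, functions, and variables here are placeholders of mine, not the project's actual implementation.

```cpp
// VRPawn.h - hypothetical pawn excerpt for the teleport tutorial step:
// pressing the trackpad starts aiming, releasing it commits the move.
// "Teleport" is assumed to be an action mapping bound to the trackpad press
// in Project Settings > Input.

#include "CoreMinimal.h"
#include "GameFramework/Pawn.h"
#include "Components/InputComponent.h"
#include "VRPawn.generated.h"

UCLASS()
class AVRPawn : public APawn
{
    GENERATED_BODY()

public:
    virtual void SetupPlayerInputComponent(UInputComponent* PlayerInputComponent) override
    {
        Super::SetupPlayerInputComponent(PlayerInputComponent);
        PlayerInputComponent->BindAction("Teleport", IE_Pressed, this, &AVRPawn::StartTeleportAim);
        PlayerInputComponent->BindAction("Teleport", IE_Released, this, &AVRPawn::ExecuteTeleport);
    }

private:
    void StartTeleportAim()
    {
        // Show the teleport arc / target ring while the trackpad is held down.
        bAimingTeleport = true;
    }

    void ExecuteTeleport()
    {
        // On release, move the pawn to the last valid aimed location.
        if (bAimingTeleport && bHasValidTarget)
        {
            SetActorLocation(TeleportTarget);
        }
        bAimingTeleport = false;
    }

    // These would be updated each frame by the (omitted) aiming trace.
    bool bAimingTeleport = false;
    bool bHasValidTarget = false;
    FVector TeleportTarget = FVector::ZeroVector;
};
```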


Object Interaction

Here is my initial wording: 

Squeeze the trigger on the back of the controller to pick up objects and open doors.  Try picking up the object below.  When you are done, open the door at the end of the hall.

This is accompanied by an animation of the trigger being pulled on the controller.

Some of the errors were:

  • Using both controllers to grab an object
  • Being afraid to reach too far forward to grab the object
  • Not reading all directions on the UI element
  • Picking up the object and instantly dropping it

My new wording:

Use the trigger on the back of either controller to pick up objects and open doors.  Try it now by picking up the cube in front of you.

After the user grabs the cube, the screen changes to:

Now that you’ve got the basic interactions down, teleport over to that door and open it to begin the experience.
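
As a rough illustration of the interaction this lesson teaches, here is a hypothetical sketch of trigger-driven grab and release using a simple attach-on-grab approach.  The class, components, and variables are my own placeholders, not the project's actual code.

```cpp
// VRHand.h - hypothetical controller "hand" showing trigger-driven grab/release.
// OnTriggerPressed/Released would be bound to the controller trigger the same way
// the teleport action is bound in the sketch above.

#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "Components/SphereComponent.h"
#include "VRHand.generated.h"

UCLASS()
class AVRHand : public AActor
{
    GENERATED_BODY()

public:
    AVRHand()
    {
        GrabSphere = CreateDefaultSubobject<USphereComponent>(TEXT("GrabSphere"));
        RootComponent = GrabSphere;
        GrabSphere->SetSphereRadius(8.f);
    }

    void OnTriggerPressed()
    {
        // Grab the first physics-simulating component overlapping the hand.
        TArray<UPrimitiveComponent*> Overlaps;
        GrabSphere->GetOverlappingComponents(Overlaps);

        for (UPrimitiveComponent* Comp : Overlaps)
        {
            if (Comp->IsSimulatingPhysics())
            {
                // Freeze physics and parent the object to the hand so it follows the controller.
                Comp->SetSimulatePhysics(false);
                Comp->AttachToComponent(GrabSphere, FAttachmentTransformRules::KeepWorldTransform);
                HeldComponent = Comp;
                break;
            }
        }
    }

    void OnTriggerReleased()
    {
        if (HeldComponent)
        {
            // Detach and re-enable physics so the object can be dropped or thrown.
            HeldComponent->DetachFromComponent(FDetachmentTransformRules::KeepWorldTransform);
            HeldComponent->SetSimulatePhysics(true);
            HeldComponent = nullptr;
        }
    }

private:
    UPROPERTY(VisibleAnywhere)
    USphereComponent* GrabSphere;

    UPROPERTY()
    UPrimitiveComponent* HeldComponent = nullptr;
};
```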

3D Development: Dining Furniture

Below is a timelapse of me texturing dining furniture using Quixel Suite.  The total time spent in Quixel Suite was around one hour and twenty minutes.  

Quixel is a crucial part of my capstone; it makes creating realistic, detailed textures fast and easy.

Game Engine Development

This past weekend I began to bring my assets into Unreal Engine and create interactions for them.  I used the VR Content Examples as a point of reference for implementing basic mechanics in the scene.  Below is a shot of one of the first demos, where I have set up teleportation as well as basic doors and drawers.

You can see the fridge handles glow orange when my controllers pass through them to signal to the user that they can interact with the object.  Figuring this out was difficult and frustrating at first, but it has gotten easier as I've spent more time in the game engine.
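
For reference, the highlight can be driven by an overlap volume on the handle and an emissive parameter in its material.  The real setup follows the VR Content Examples and may look quite different; the C++ below is just a simplified sketch of the idea, with placeholder names and an assumed "GlowStrength" material parameter.

```cpp
// FridgeHandle.h - simplified sketch of the overlap-driven highlight: when a motion
// controller enters the handle's trigger volume, an emissive material parameter is
// raised so the handle glows; it drops back to zero when the controller leaves.

#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "Components/StaticMeshComponent.h"
#include "Components/BoxComponent.h"
#include "Materials/MaterialInstanceDynamic.h"
#include "FridgeHandle.generated.h"

UCLASS()
class AFridgeHandle : public AActor
{
    GENERATED_BODY()

public:
    AFridgeHandle()
    {
        HandleMesh = CreateDefaultSubobject<UStaticMeshComponent>(TEXT("HandleMesh"));
        RootComponent = HandleMesh;

        TouchZone = CreateDefaultSubobject<UBoxComponent>(TEXT("TouchZone"));
        TouchZone->SetupAttachment(HandleMesh);
    }

protected:
    virtual void BeginPlay() override
    {
        Super::BeginPlay();

        // Dynamic material instance so the glow parameter can be changed at runtime.
        GlowMaterial = HandleMesh->CreateAndSetMaterialInstanceDynamic(0);

        TouchZone->OnComponentBeginOverlap.AddDynamic(this, &AFridgeHandle::OnTouchBegin);
        TouchZone->OnComponentEndOverlap.AddDynamic(this, &AFridgeHandle::OnTouchEnd);
    }

    UFUNCTION()
    void OnTouchBegin(UPrimitiveComponent* OverlappedComp, AActor* OtherActor,
                      UPrimitiveComponent* OtherComp, int32 OtherBodyIndex,
                      bool bFromSweep, const FHitResult& SweepResult)
    {
        // Assumes the controller components are tagged "MotionController" in the editor.
        if (OtherComp && OtherComp->ComponentHasTag(FName("MotionController")) && GlowMaterial)
        {
            GlowMaterial->SetScalarParameterValue(FName("GlowStrength"), 1.0f);
        }
    }

    UFUNCTION()
    void OnTouchEnd(UPrimitiveComponent* OverlappedComp, AActor* OtherActor,
                    UPrimitiveComponent* OtherComp, int32 OtherBodyIndex)
    {
        if (OtherComp && OtherComp->ComponentHasTag(FName("MotionController")) && GlowMaterial)
        {
            GlowMaterial->SetScalarParameterValue(FName("GlowStrength"), 0.0f);
        }
    }

private:
    UPROPERTY(VisibleAnywhere) UStaticMeshComponent* HandleMesh;
    UPROPERTY(VisibleAnywhere) UBoxComponent* TouchZone;
    UPROPERTY() UMaterialInstanceDynamic* GlowMaterial = nullptr;
};
```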

Beginning 3D Development

This past week I've begun creating the assets that will fill my experience and bring it to life.  I started with a stove because it was something I had a lot of reference for, drawing mainly from Jennifer's House (and most dirty college stoves).  The asset was modeled and UV'd in 3DS Max and textured in Quixel.

This is the first time I've used this workflow, and I am pleased with how easy it is to create a compelling story with textures alone.  I'll need to get faster as I go, but it's an encouraging start nonetheless.

Research: Books & Documentaries

To sum up the extent of my research, I'm grouping the rest of my materials together.

My books include:

A Framework for Understanding Poverty and Evicted: Poverty and Profit in the American City

The Documentaries include:

Rich Hill and Poverty in Chicago

Concluded Research:

Evicted, the first book I read, provided a lot of great reference and a clearer understanding of how the cycle of poverty works and why it is so hard to break out of.  The book consists mainly of stories from tenants facing eviction and provides great background on how they got into their current positions and how they manage to stay off the streets.

A Framework for Understanding Poverty is a great reference book that provided me with background information on how poverty persists in the U.S. It also has stories from people in poverty that help explain what it is like to be in their shoes.

Poverty in Chicago is a documentary that focuses mainly on the homeless and near-homeless population in Chicago.  It really opened my eyes to just how big a role disability can play in being able to find good work.  It also put into perspective a lot of what I learned from Steven Howe about just how close we all are to poverty: one debilitating injury can make a huge difference in your ability to make a living.

Rich Hill is a well-known documentary that follows three young boys living in Rich Hill, Missouri.  The documentary is very intimate, and it shows how living in poverty, as well as the public perception of poverty, affects their everyday lives.

Research: Jennifer's Family

Jennifer's Family is a photo documentary by Louisa Marie Summer that follows a family of seven in Rhode Island.  The photographer's images are uniquely intimate, and they are serving as the main visual inspiration for my environment.  From the photos I am studying things like what furniture the family owns and how they use it, how they use each of their rooms, and what they put on their walls, all of which helps me build an accurate environment.

Research: Steven Howe

Steven is the former Head of the School of Psychology at UC, and he has a number of published works documenting his research on poverty in Cincinnati.  As I did research specific to Cincinnati, Steven's name came up consistently, so I reached out to him and we talked for a little over an hour on the phone.

Interview Findings:

The Just World Hypothesis - When Steven first told me about this, a lot clicked in terms of understanding my audience.  He explained that people have a tendency to think the world is fair and just, and that people at the bottom must be there because of something they've done, when in fact this couldn't be further from the truth.  I understood that my main goal would be to get people outside of their comfort zones and have them confront any misconceptions they hold about poverty.

Another goal is showing people just how easily their own lives could take this turn, because there is in fact not much that separates someone in poverty from someone who is well off.  In reality, most of us are one job loss, injury, or huge medical expense away from being put in a rough spot.  By leveraging this, I can create empathy for those in poverty much more easily.

Research Methods - Steven and I discussed the research methods he has used in the past.  For my project, we agreed that field research may not be the best option; since I am not involved with any organizations in Cincinnati at the moment, it would be difficult for me to use them as a research outlet.  With this in mind, I will stick mostly to secondary accounts for my research: books, documentaries, and articles.

Research: Empathy in VR

As stated before, experiences that create empathy are a growing trend in VR.  The reason for this is that virtual reality as a platform offers immersion that other mediums cannot.  As a result, these experiences create lasting impressions with the stories that they tell.

The Machine to be Another

Project Syria VR

Two of the experiences that I researched early on are shown above.  On the left, The Machine to be Another is a simulation that places you in the body of another person to experience the world through their eyes; the project swapped participants across different genders, disabilities, and races.  On the right is Project Syria VR.  Project Syria, by Emblematic Group, is a simulation of life in a Syrian refugee camp where viewers are confronted with the violence and other everyday struggles that go on inside the camps.  Emblematic Group has created a number of great empathy projects that address things like homelessness, hunger, and police brutality.

Initial Proposal

This project began in my capstone prep class during the summer semester of 2016.  I went through a lot of ideas, all with the broad goal of using virtual reality to take viewers to places they had not yet been or could not go and to educate them about those places.  Some ideas included the inside of a beehive or the inside of a computer.  That led me to wanting to visualize intimidating statistics, for instance: what would it look like if ocean levels continue to rise at such a fast rate, or if the bee population were to die off?  Eventually I settled on visualizing poverty and the struggles of the working-class poor for a few reasons:

1. The struggles of the working-class poor often go unrecognized because, on the surface, they may look and behave like anyone else.

2. Poverty is something that a lot of people are very susceptible to without realizing it; whether it's a bad injury or getting laid off from work, just about anyone can fall into poverty shockingly easily.

3. Creating empathy is something VR is uniquely effective at because of the medium's ability to completely immerse the viewer in an experience; leveraging this helps drive home the overall message.

There are other reasons as well, but these are the main drivers.