Wonderland (VR Game Design)

As the lead designer and producer on the Wonderland project at Carnegie Mellon University’s Entertainment Technology Center, I led my team to find the right questions as we designed, pitched, prototyped, and playtested six short experiences for the Oculus Rift with Touch.

Our client was the director of the Alice program, which is responsible for the educational software Alice. Alice helps beginning computer science students (particularly grade school students) learn to code in an appealing, image-based way. The goal of the Wonderland project was to explore virtual reality as a potential avenue for expanding this style of education.

Ultimately, we developed six experiences that each illustrated one specific computer science concept. We playtested with teachers, who were enthusiastic and optimistic about the potential for these experiences, or others like them, to enhance their lessons. With their feedback, we concluded that virtual reality is most useful as a way to illustrate abstract concepts like parameter passing in a simple way that gives teachers and students a common vocabulary.

Finding the Question

A slide from our final presentation to faculty, where we explained the evolution of our understanding of our goal.

The biggest challenge of this project was framing the central question in a way that enabled us to break out of our own ideas of what computer science games look like. It took us a few tries to settle on the overall question our project was trying to answer, beyond what each individual prototype was trying to accomplish. We started with “How do we teach computer science in virtual reality?” However, the designs we made based on this question were largely re-creations of existing 2D computer science learning games.

Furthermore, as the semester went on it became clear that our individual ideas of what we were trying to accomplish were diverging. I met with each team member individually and got them to talk about what they felt we were trying to accomplish, then met with the client to get his version. Then I brought the team together and wrote each different version of our goal on our whiteboard, and we all agreed on the interpretation that aligned with our client’s. From then on, we were asking not “How do we teach computer science in VR?” but “What does VR have to offer to the world of computer science education?” That change opened up our designs and got us to start building teaching tools rather than lessons, which was far more effective.

Playtesting and Feedback

Technology teachers playtesting one of our prototypes.

Another challenge was figuring out how to evaluate something that we knew would not be a finished product by the end of the semester. The project was about prototyping and exploring, so our goal was never to release polished products. We would have loved to test in classrooms, but the fact was that equipment was limited and each prototype had usability quirks that required careful instruction to navigate, so attempting to lead several children through these experiences together was not a possibility. We needed a way to evaluate the potential of these games as teaching tools.

My solution to this problem was to go to teachers instead. After all, if teachers do not see the value in a teaching tool, the students will likely never see it. And working with a few teachers at a time meant that we could easily guide them through the games and make it clear that they were not experiencing finished products. At the beginning of the project, I sent out a survey to teachers who follow the Alice program, asking which computer science concepts they found most difficult to teach and why. We chose our topics based on the answers to this survey and designed our prototypes to address the problems the teachers described.

A group of teachers observing their colleague play one of our prototypes.

For playtesting the prototypes, I wrote a verbal questionnaire that asked about each teacher’s history with computer science and their familiarity with the concept behind the prototype they were about to experience. I took notes as one of my teammates led them through the game (and recorded when we had permission), and then asked follow-up questions about how well they felt the experience demonstrated the concept and how they might use it in their classroom. It turned out that some of the teachers were not familiar with some of the concepts we were demonstrating; in those cases I briefly explained the concept and checked whether they connected it to what they had just experienced and felt they could understand it. Finally, I asked teachers who experienced more than one of our prototypes to rank them in several categories, such as accuracy, enjoyment, and potential for classroom use.

 

A high school student playtesting one of our prototypes while I observe and guide her through it.

As it happened, we did get the opportunity to playtest with grade school students, and that presented another challenge. We did not have control over whether the students we playtested with were already familiar with computer science, and the settings we were playtesting in required us to move quickly and did not allow us to embed the experience in any kind of lesson. So how could we evaluate whether the prototypes were working? I decided that since most of our verification was coming from teachers, what we really needed to know from the students was whether they would play games like these and whether they could identify patterns in what they were doing within the games.

Based on that, I asked them if they were familiar with the key ideas the prototype would be trying to get across. If they were already familiar with the concept, after they played I asked them to identify what elements or actions in the prototype represented which computer science ideas. If they were not familiar with the concept, I asked them to explain what they were doing in the experience and what process they were following. If what they described corresponded to the concept we were trying to get across, I took that as evidence that the experience was on the right track.

Documentation

As the team member most comfortable with writing and communication, I was responsible for figuring out our documentation. We needed documentation that would enable someone else to pick up our project and continue based on our conclusions, our process, and our prototypes. We also needed documentation that would let teachers reasonably use our prototypes on their own, both to support the argument that the prototypes could fit into a lesson plan and to help our client continue demonstrating them if need be. Therefore, I split up our documentation by its intended audience: designers, developers, teachers, and general interested parties. I also wrote a one-page instruction document explaining what each section of the documentation was, so that anyone using it could find what they needed without having to search through everything.

The digital structure of our documentation.

The general documentation included my overall analysis of the project and our conclusions and high-level explanations of each prototype. The prototype explanations included the goals, the original pitched design, the final experience (with video), lessons we learned, and what we would do if we were to continue working on it. Those explanations also included where to find the code for anyone who wanted to play around with the experiences for themselves.

For the teachers, we wrote step-by-step walkthrough guides for each prototype, including common problems and how to either fix them or work around them. These guides also included suggestions for how to present the experience and verbally guide someone through it, informed by our own experiences playtesting. In addition, I had the programmers on the team write pseudo-code to go along with each prototype. The pseudo-code was not based on the code that ran the prototypes but was instead a simple example of what each concept looks like in code form. My reasoning was that the most likely way for these prototypes to be used was in conjunction with code, so that students could relate chunks of code to moments in the experience. Therefore, we provided an example of what code for a lesson like that might look like, annotated with what it corresponded to in the experience.
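
To give a flavor of what those handouts were like, here is a hypothetical sketch in that spirit. It is my own illustrative example in Python-style code, not one of the actual handouts; the function name and values are made up, and the concept shown is parameter passing, one of the topics our prototypes covered. The comments mark where a real handout would point students back to the experience.

# Hypothetical pseudo-code handout for the parameter-passing concept (illustration only;
# the real handouts were written by our programmers and annotated against each prototype).

def decorate(color, shape):
    # "color" and "shape" are parameters: placeholders that take on whatever
    # values are passed in when the function is called. In a handout, this line
    # would be annotated with the moment in the experience it corresponds to.
    print("Making a " + color + " " + shape + ".")

# Each call below passes different arguments through the same parameters,
# which is the pattern the annotations would tie back to the VR experience.
decorate("red", "heart")
decorate("blue", "star")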

A page from a prototype walkthrough guide showing a gesture used in the prototype.

The documentation for developers was standard technical documentation detailing how the code behind the prototypes worked, ways of programming them that we tried and discarded (no point in having future developers make the same mistakes we did), and elements we wanted to add if we had more time.

The documentation for designers included a design review where I wrote about what we learned, our advice for future development, and what we concluded about the role of VR in teaching computer science. I also detailed our process, how it evolved, and what could have been better, as well as the elements of the project that turned out to be more or less important or time-consuming than we expected.

The end result is detailed, extensive documentation with information relevant to multiple audiences, organized by the audience each document serves. I designed this system based on feedback from teachers about what would be most useful to them, and based on the idea that another project might need to pick up where we left off sometime in the future.

 

Button’s Journey (Narrative and Sound Designer)

I did the narrative and sound design for Button’s Journey, a game for the Kinect 2 that was developed in two weeks by a five-person team as part of the Building Virtual Worlds course at the Entertainment Technology Center. Working on this game helped me truly understand both the value and the difficulty of simplicity. In the game, guests play as a button who has been blown out of its home and away from its love by a fan and must navigate the outside world in order to return. It looks like this:

 

The biggest design challenge for this game was keeping things simple. Even when coming up with the basic story we tended to get bogged down by details we liked instead of settling on a core narrative. Finally, when we knew we wanted to use buttons but did not yet know what the story would be, my teammate asked me “What is the simplest story we could tell about these two buttons?” and I answered, “One of them wants to get to the other but things are in the way.” From that point on, we committed to keeping the story, and by extension the game, as simple as possible. As our narrative designer, I established a requirement that any new element had to support and enhance the story without distracting from it, and following that guideline effectively prevented us from adding too many random elements.

 

An image from our opening sequence. This included a small animation of the button rolling across the table.

There were two moments in the game where the level design presented narrative challenges. The first was getting the button out of the tailor shop in the first place. The first draft of the story had the button getting knocked into a garbage can, taken out with the trash, and then falling next to the dumpster when the trash was dumped out. When we playtested the game, we discovered that guests were not understanding that series of events at all, and that it was far more complicated than it needed to be. After that, we knew we wanted to change two things: we wanted to simplify the whole sequence, and we wanted the reason the button ended up outside to have something to do with the obstacles it faces on the way back. The most popular obstacle during playtests had been the vents that nearly blow the button off the roof, so I rewrote the beginning of the story to have the button blown out the window by a fan. That version was far easier to understand and ended up being quite effective.

 

This is the moment in the game where the button is carried away by a bird after getting close to the tailor shop for the first time.

The second moment that presented a narrative challenge was getting the button from the ground to the roof of the neighboring building. Our first iteration of the story had depended on the button seeing the surrounding landscape, so I had written the story to get the button up high rather than simply having it cross the street. We ended up removing the part of the story that required being up high, but we liked the fact that it made the gameplay more about balance and steering (trying not to fall from the edge of the roof) than about timing (dodging cars to cross the street) which we felt was more suitable for the Kinect. That meant that we still needed a way to get the button up to the roof, so I wrote in a bird that would grab the button and bring it to its nest on the roof about one third of the way through the game. Having the bird grab the button there actually turned out to be an even stronger moment than we had anticipated, because the bird grabs the button just as the tailor shop is coming into view for the first time and puts the button even further away.

 

In addition to being presented to our professors and classmates as part of the Building Virtual Worlds class, this game was showcased at the ETC Fall Festival in December 2016, where family, friends, alumni, and industry professionals were invited to come and experience this and other ETC projects. As part of the festival, each team was given a room to set up for their world. My team and I themed our room to look like a tailor shop and the street outside. This created an excellent opportunity for me to take advantage of my staging, construction, and prop-making skills in designing and implementing our theme.

 

The Button’s Journey team in our room for the Fall Festival. From left to right: Beizhen Hu, Peilin Li, Griva Patel, myself, and Daniel Hua.

When it came to decorating our room for the Fall Festival, the primary challenge was arranging the play space in such a way that guests could watch the person playing without being picked up by the Kinect and interfering with the game. To accomplish that, I suggested that we use the back two-thirds of the room as the play space, with the screen and Kinect perpendicular to the doorway, and use the front third of the room as a viewing area. To keep guests in the viewing area from wandering into the play area, I built a large window frame that blocked it off, leaving a gap for a single guest to enter at a time. The doorway into the room and the opening into the play space were on opposite sides of the room to make it harder for guests to accidentally walk straight into the play space when they entered. To make the whole room fit the theme of our game, we used props and decorations to make the play area look like the tailor shop and the viewing area look like the neighborhood outside, so that guests watching the game seemed to be looking in through the shop window. You can see the results below.

360 Video (Writer, Director)

In the fall of 2016, as part of the Visual Story class at the Entertainment Technology Center, I wrote and directed a 360-degree short film based on the Pixar prompt:

“Once upon a time…every day…until one day…because of that…because of that…until finally…and ever since then…”

We had a set of locations to choose from, so we decided to tell a sweet story that took place at a bus stop and a coffee shop. The narrative was our take on the simple “love at first sight” idea, using color to show connection and affection. I created the film with four other students: Michael Luan, Pradnesh Patil, Matthew Stone, and Anqi Yang. Anqi and I played the two characters ourselves. We used the song “Married Life” from the film Up.

Filming in 360 degrees presented some interesting challenges, especially coupled with the fact that we were not allowed to use dialogue at all. We needed to rely entirely on visuals, not only to convey the story effectively but also to get viewers looking in the right places to begin with. Luckily, I had some experience directing theater scenes with little or no dialogue, and knew some techniques used on stage to direct the audience’s gaze. The images on this page are scenes from the film in 360 degrees, so feel free to click and drag to look around the scene.

When writing the initial story beats, I made sure I was using simple, clear events and actions that could be easily communicated through body language, and I avoided trying to convey more than one idea at once. I used movement and gaze to direct the viewer’s eyes throughout the piece. For example, when a character is entering, the character already in the scene looks up to show the viewers that something is happening, and a scene shift happens when the characters walk past the camera, so that the viewer’s natural inclination to turn and follow them leads them to look in the right place in the next scene. In addition, I used the amount of color in the scene to help convey the tone of the story. As the characters become more connected, more and more of the scene is in color. When one of them has to leave, the color fades again.

The Illusion (Assistant Director)

In the spring of 2016 I assistant directed the Luther College Visual and Performing Arts production of The Illusion by Pierre Corneille and Tony Kushner. As the assistant director, I helped to come up with the staging of the show and direct the actors in their roles. Additionally, I was solely in charge of the direction of the penultimate scene of the show, meaning that I handled the blocking and direction of that scene from the beginning. This experience enabled me to develop my sense of spectacle and to hone my ability to adapt my work to the style of my collaborators.

The final moment of the penultimate scene of the show. Photo by Brittany Todd, www.photographybybrittany.com

In the scene I directed, the tone of the show shifts from farcical to dramatic as the protagonist finally has to face the fatal consequences of cheating on his wife with the prince’s wife. This is the scene where everything goes wrong, so I wanted it to feel sinister and not quite right. The idea was that by the climax of the scene, no one but the protagonist was surprised by his death. To accomplish that, I designed the blocking to echo moments from earlier scenes when things were happier, but with different outcomes. For example, in one of the first scenes the protagonist gets down on one knee to declare his love to the character who later becomes his wife, and she leans down to kiss him. In the scene I directed, he again knelt to convince her of his sincerity and she leaned towards him, but this time she pushed him away.

A moment in rehearsal when the actors were still wearing their red noses. Photo by Brittany Todd, www.photographybybrittany.com

Directing that scene presented some interesting challenges, the most obvious being the need to direct the scene in my own way without creating something that was noticeably different from the rest of the show. The primary way I accomplished this was by using the techniques favored by the director throughout the process, namely clowning tactics like red nose and gestures. That way, even though I was directing my own interpretation of the scene, the methods I was using were consistent with the rest of the show. I also dealt with this challenge by consciously echoing the blocking the director used in earlier scenes and twisting it to suit the new scene.

The moment before the curtain drops. Photo by Brittany Todd, www.photographybybrittany.com

The second major challenge was that the layout of the theater meant that the blocking had to be designed to ensure that the actors ended the scene in a very specific place so that they would be hidden behind the red curtain that would drop. To make sure that they ended up in the right place, I planned the blocking by starting at the end of the scene with the positions I knew they needed to get to and working backwards to the beginning of the scene. To keep the scene from looking cramped towards the end, I kept the character that would be exiting the scene before the curtain fell further away from the spot where the others were gathering.

Leading up to the final moments of the scene, with the character who needed to re-enter the scene on the right and the character that brought her in on the left. Photo by Brittany Todd, www.photographybybrittany.com

The third challenge of that scene was subtler, and it was one I did not actually solve until shortly before the show opened. One of the characters appeared to leave the scene and reenter later, but the script included no direction to that effect. She simply did not have lines during a conversation that clearly did not include her in any way, and then suddenly had lines again at the very end of the scene. The question was, what was she doing in the middle? Initially, I had the actress try hiding from the other two characters throughout the scene, as if she did not want to get involved in their fighting but could not quite bring herself to leave altogether. That kind of worked, but it was clearly awkward and gave the actress quite a bit of difficulty. When dress rehearsals started and I saw the scene on stage with the trees she was supposed to be hiding behind, I knew I had to come up with something else. There was no real reason for her character to stay, and it looked intensely awkward to have her hanging around upstage. At that point, the question became, “why does she come back?” Finally I realized that the solution was the character who entered just before the end of the scene and who had been hunting for one of the others. If I assumed that she ran into him off stage, he could force her back into the scene by making her lead him to the others. All of the actors much preferred that version, and it was ultimately much more effective on stage.

Lost and Found Application (Designer, Programmer)

For my senior project at Luther College, I worked with four other students to code a web application to help building administrators track “lost and found” items. This was my first experience with a long-term collaborative coding project, particularly one where we had complete control over our design and implementation. The application was primarily intended for building administrators, who would be able to easily add photographs of items that were turned in and look those items up later when students came looking for them. As both a designer and a programmer, I learned a lot about aesthetics and usability.

 

The login page of the application. Guests could click on “Lost an Item?” if they were not building administrators.

 

My primary task was our guest user functionality, which allowed students to look up where their lost item was most likely to be. This feature presented an interesting design question. Building administrators need full access to the database of items in order to know if they have what a student is looking for, but giving the same level of access to students, or anyone who happened upon our application, would enable them to “shop” through the returned items and pretend to be the one who lost something they wanted. We initially did not intend to include any sort of non-administrator access, but that feature was requested so often that we decided to work out a compromise.

An early iteration of the guest search results.

In the end, the solution I developed was to require non-administrators to input information about a specific item in return for information about which buildings similar items had been returned to. Students using the application could input as much information as they wanted about a specific item they were looking for without ever getting back anything more than likely buildings and contact information. By not telling anonymous users anything about the item entries being matched to their query, I hoped to prevent them from gathering enough information about any item to believably pretend to be its owner.
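
As a rough illustration of that compromise, the sketch below shows the kind of matching the guest search performed. The field names, matching rule, and data here are my own assumptions for the example, not the project's actual code or schema.

from dataclasses import dataclass

@dataclass
class LostItem:
    category: str   # e.g. "water bottle" (example field; the real schema may have differed)
    color: str
    building: str   # where the item was turned in
    contact: str    # who to ask at that building

def guest_search(category, color, items):
    # Match the guest's description against the stored items, but collapse the
    # results to (building, contact) pairs so no details about the items
    # themselves leak out to anonymous users.
    matches = {(item.building, item.contact)
               for item in items
               if item.category == category and item.color == color}
    return sorted(matches)

# Example usage with made-up data:
inventory = [
    LostItem("water bottle", "blue", "Main Hall", "mainhall@example.edu"),
    LostItem("water bottle", "blue", "Library", "library@example.edu"),
    LostItem("umbrella", "black", "Library", "library@example.edu"),
]
print(guest_search("water bottle", "blue", inventory))
# [('Library', 'library@example.edu'), ('Main Hall', 'mainhall@example.edu')]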

 

Myself, Jessica Tan, Sergei Hanka, Kirby Olson, and Ales Varabyou presenting our application at the Student Research Symposium.

 

At the end of the year my teammates and I presented our project at Luther’s Student Research Symposium, where other students and faculty came to find out what we had to show after a year of work. In the interest of keeping the presentation fun to watch, we decided to show a video of the application in action and then do a live demonstration of some of the additional features rather than verbally explain the entire application to the audience. I wrote, directed, sound designed, and edited the following video, starring Hannah Miller and Josh Weisenburger, two members of Luther’s improvisational acting troupe, along with Teresa Flinchbaugh, an Administrative Assistant for Luther’s Computer Science and Mathematics Departments.

Love and Information (Stage Manager)

In the fall of 2015 I stage managed a production of Love and Information by Caryl Churchill for the Luther College Visual & Performing Arts department. This was my first experience as a primary facilitator of a complex collaborative project, and I developed my organizational and problem-solving skills considerably in order to handle such an inherently complicated show. The script leaves the setting and context of each of the short scenes up to the production, so the cast, production staff, and technicians relied on me to ensure clear and consistent communication between everyone involved.

 

A section of the spreadsheet used by the designers and myself to keep everyone up to date on the design decisions being made throughout the process.

I used a Google spreadsheet to keep the costume, lighting, and sound designers informed of any decisions that were made or changed during each rehearsal. I recorded each day’s changes in a new color to make them easy to find, and I sent out reports after each rehearsal detailing what we worked on and what we decided or changed. This eased the bottleneck of designers waiting on the director and cast: instead of waiting for everything to be decided, the designers could start work on the scenes that were already set while the cast continued to experiment with the others.

 

A page of my script with cues written in.

 

During the performances, I shifted from facilitating the collaboration between the cast, director, and designers to coordinating the light board operator, the sound board operator, and two projector operators. To keep myself organized, I wrote out the order of the warnings and cues I needed to give the technicians in order to transition smoothly between scenes. I quickly learned to give the necessary cues in groups rather than individually, as there was no other way to ensure that everyone got their cue with enough time to act on it. I also started giving each cue’s number only with its warning, which I had ample time to say, and leaving the numbers off the actual cues so that I could call them more quickly.