OUTDOOR DISPLAY UTILIZATION PROJECT

The idea of utilizing displays in public areas came from my HCDE 419 final project. After a quarter of research reading HCI-related articles, I came up with an idea that brings the concepts of “ubiquitous computing” and the “sharing economy” into public space. The Outdoor Display Utilization (ODU) project aims to create a solution that turns any outdoor display into an extension screen of the user’s smartphone, allowing people to view customized content without taking their phones out. With ODU-enabled displays nearby, users can better enjoy their lives with fewer distractions.

With ODU, users can access information in a more convenient way. At the same time, the owners of the displays benefit from increased attention on their commercial content.

The original research can be found here.

Based on my original project, Owen Hu and I teamed up to push the idea forward, building multiple prototypes to develop it further.

Part 1: Paper Prototype

The first prototype is on paper. We brainstormed several kinds of content that could be shown on a display in a public area. As a result, we sketched two modes in which an outdoor display works (a small code sketch of the mode switching follows the list):

  1. When idle, the display automatically shows useful information other than just advertisements, such as a bus schedule or a map;
  2. When connected to the user’s smartphone, the display receives information from the phone and shows whatever the user has set it to display, such as workout data.
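To make the two modes concrete, here is a minimal Python sketch of the switching logic. It is only an illustration: the `NearbyPhone` class, the idle content, and the workout fields are hypothetical stand-ins, not part of our actual prototype, and a real system would depend on whatever pairing protocol ODU ends up using.

```python
# Minimal sketch of the two display modes described above.
# All names here (NearbyPhone, the idle content, the workout fields)
# are hypothetical; the real system would depend on ODU's pairing protocol.
from dataclasses import dataclass
from typing import Optional


@dataclass
class NearbyPhone:
    user: str
    content: dict  # whatever the supported app chose to share, e.g. workout data


def render(display_id: str, phone: Optional[NearbyPhone]) -> str:
    """Decide what an ODU display should show right now."""
    if phone is None:
        # Idle mode: show useful public information instead of only ads.
        return f"[{display_id}] Bus schedule / neighborhood map"
    # Connected mode: mirror whatever the user's app chose to share.
    distance = phone.content.get("distance_km", 0)
    pace = phone.content.get("pace", "-")
    return f"[{display_id}] {phone.user}: {distance} km at {pace} per km"


if __name__ == "__main__":
    print(render("display-01", None))
    print(render("display-01", NearbyPhone("runner", {"distance_km": 3.2, "pace": "5:20"})))
```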

To use the outdoor displays, users need to enable the feature in supported apps. This protects their privacy, because they can turn the feature off at any time. In our prototype, we used Nike+ Run Club as an example.

Part 2: Wireframe

The second phase of our prototype was to digitize the sketches so we could get better results from user testing. The following is our interactive prototype, created with Sketch:

On the display end, we created multiple screens for both vertical and horizontal displays. Here are two selected screens:

Part 3: Physical Prototype

In each phase, we built upon the previous prototypes and further increased the fidelity. After creating the wireframe prototype for the mobile end, we made a physical prototype of the outdoor display out of cardboard and paper. The following video demonstrates how our physical prototype works:

Part 4: Video Prototype

Finally, we concluded our project with a beautiful promotional video. We filmed this video at the University Village.

A typical day with ODU:

Feedback

During the final exhibit, we received helpful feedback from our peers, instructors, and visitors.

Likes

  • Very interesting concept. People like how they can access more screens in the public area;
  • People think this idea is especially good for runners because they don’t want to take out their phones when running;
  • Beautiful video at night.

Need Improvement

  • A visitor asked what happens if more than one user is near a display. Our current solution is to show one person’s content for a while and then switch to another person’s, but there is certainly room for a better solution, such as split screens (a rough sketch of the rotation idea follows this list);
  • One person raised privacy concerns. Although we explained that the system would be completely local, so the smartphone connects only to the display and not to the Internet, people were still uncomfortable letting others see their information. We would like to let users choose what information they feel comfortable showing on a public screen, but we certainly need stronger assurances to address their privacy concerns;
  • One problem we identified ourselves is how to roll the system out. We suspect that installing it on enough displays to make a difference would be difficult. Moreover, there would also need to be a sufficient number of supported apps for users to choose from before the system has a real effect.
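For the multi-user case mentioned in the first point, here is a rough Python sketch of the time-sliced rotation we described. It is purely illustrative; the time slice and the user names are made up.

```python
# Rough sketch of "show one person's content for a while, then switch".
# Purely illustrative; the time slice and the user list are made up.
import itertools
import time


def rotate_content(users, seconds_per_user=10):
    """Round-robin through nearby users, giving each a slice of screen time."""
    for user in itertools.islice(itertools.cycle(users), len(users) * 2):
        print(f"Showing {user}'s content for {seconds_per_user}s")
        time.sleep(seconds_per_user)


if __name__ == "__main__":
    rotate_content(["Alice", "Bob"], seconds_per_user=1)
```

A split-screen layout would replace the rotation with a partition of the display, but the bookkeeping of who is nearby would stay the same.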

MOTION PRESENTER: A WIZARD OF OZ PROTOTYPE

Motion Presenter is a Wizard of Oz prototype: it appears to work, but is actually faked. My team designed this prototype to make our participants believe we had built a functional product, while one of our team members, a.k.a. the wizard, was secretly controlling the effect.

Design

Motion Presenter is a motion capture device that recognizes the presenter’s gestures and controls the slides accordingly. The goal of the design is to give presenters the freedom to walk around and control their slides without going back to the computer or holding a controller.

Prototype

There are four members in my team:

  1. Facilitator: communicates with participants, making sure they follow our protocol;
  2. Camera man (me): films the testing session and edits the video demo;
  3. Note taker: takes notes of the testing session and asks follow up questions after each testing;
  4. Wizard: controls the slides.

We built this prototype by remotely connecting a laptop to the classroom computer. The specific steps were as follows:

  1. Find an empty classroom and take it;
  2. Download a remote control app, TeamViewer, on the classroom computer;
  3. Download TeamViewer on our laptop;
  4. Download the sample PowerPoint slides onto the classroom computer and open them;
  5. Set up the remote control.

After the initial setup, our wizard could use her laptop to control the classroom computer and advance the slides. We then invited three participants to the classroom for testing. We told them that we had installed a motion sensor on the classroom projector so they could use gestures to control the slides. When introducing our team, the facilitator told them the wizard was just a note taker. The facilitator also taught the participants which gestures they could use, then started the testing.

We designed three scenarios for participants to complete (a hypothetical gesture-to-action mapping is sketched after the list):

  1. Switch slides: go to the next slide and go back to the previous slide;
  2. Video control: pause/resume video, fast forward/go back, volume up/down;
  3. Canvas: draw/erase.
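If the system were real, the wizard’s job would essentially be a dispatch table from recognized gestures to presentation commands. The Python sketch below is only an illustration; the gesture names are invented, since in our test the “recognition” was actually performed by the wizard over TeamViewer.

```python
# Hypothetical gesture-to-command dispatch for the three scenarios.
# Gesture names are invented for illustration; our prototype never
# recognized gestures, the wizard simply watched and reacted.
ACTIONS = {
    "swipe_left":       "next slide",
    "swipe_right":      "previous slide",
    "palm_out":         "pause/resume video",
    "point_right":      "fast forward",
    "point_left":       "rewind",
    "raise_hand":       "volume up",
    "lower_hand":       "volume down",
    "finger_on_canvas": "draw",
    "fist_on_canvas":   "erase",
}


def handle(gesture: str) -> str:
    """Map a recognized gesture to a presentation command, or ignore it."""
    return ACTIONS.get(gesture, "ignore")


if __name__ == "__main__":
    for g in ("swipe_left", "fist_on_canvas", "wave"):
        print(g, "->", handle(g))
```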

Feedback

After going through the three scenarios with our sample slides, we revealed the truth to our participants. One participant was completely fooled and genuinely shocked when we told her that she had not been controlling the slides. Another participant said she had doubts from the beginning, because she did not believe we had actually installed a sensor on the projector. The last participant did not believe us at all, so when we revealed the truth she simply said, “Yeah, I know.”

Overall, this prototype was very successful. All three participants had no problem understanding and performing the first set of gestures. For the second set, however, one participant confused fast-forwarding the video with advancing the slides, because both used a similar hand gesture. Another participant adjusted the video volume so frequently that the device (the wizard) could not capture her motion in time, which led to unexpected volume jumps. As for the third set, two participants found it hard to control the drawing speed, and the lines did not appear where they drew. That was more of a technical challenge; the “finger on the canvas to draw, fist on the canvas to erase” mapping itself was easy for them to understand.

Future Improvements

  • Adding a “view all slides” and a “jump to a certain slide” function to the gesture control system;
  • Providing better feedback (or confirmation) after a control gesture is performed;
  • It would be nice to let users design their own control gestures.

We think using gesture control for presentations is a reasonable and practical solution, and we look forward to an opportunity to further develop this system in the future.

NIKE+ RUN CLUB VIDEO PROMO

This post is about a one-minute video prototype demonstrating one of Nike’s apps, Nike+ Run Club, which is designed for runners to track their activities.

Design

The video was designed to be as exciting as possible, since the purpose of the app is to encourage people to work out. I hoped the audience would feel excited and motivated to go for a run after watching it. A very important element of the video was the music, “Light ’Em Up.” The whole video was designed around this song, catering to its beats. The reason was simple: based on my video editing experience, adjusting the speed or duration of video clips is relatively easy, but editing music is very difficult, because the pitch changes when you change its speed. Therefore, to avoid touching the music during the editing phase, I lightly edited the song to fit the one-minute requirement before I drew the storyboard and planned the visuals, and then left it unchanged until the video was done.

There are five features demonstrated in the video:

Scenario 1:

  • Feed: the feature that allows the user to browse Nike’s news;
  • Activity: the history of workouts done by the user, including data and analysis;
  • Club: the place to check out and reserve Nike events;

Scenario 2:

  • Run: the core functionality of the app;

Scenario 3:

  • Auto-pause/resume: the feature that automatically pauses tracking when the user stops and resumes tracking when the user starts running again (a rough sketch of how this might work follows).
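The following is only my guess at how auto-pause might be implemented; Nike has not published the actual logic. A simple version would pause tracking whenever the measured speed stays below a small threshold, as this Python sketch shows.

```python
# Guess at how auto-pause/resume could work: pause tracking when the
# measured speed falls below a threshold. Not Nike's actual implementation,
# just an illustration of the feature shown in the video.
PAUSE_BELOW_M_PER_S = 0.5  # roughly "standing still"


def auto_pause_states(speeds_m_per_s):
    """Yield 'running' or 'paused' for each speed sample."""
    for speed in speeds_m_per_s:
        yield "paused" if speed < PAUSE_BELOW_M_PER_S else "running"


if __name__ == "__main__":
    samples = [2.8, 3.0, 0.2, 0.1, 2.9]  # runner stops at a crossing, then resumes
    print(list(auto_pause_states(samples)))
```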

Storyboard

Drawing a storyboard with preset music was fairly simple, because I did not have to plan the duration of each scene. I already had empty slots (the music beats), so all I needed to do was fill those slots with footage.

The storyboard was precisely planned; every second had its purpose. If you watch the video, you will find that it follows the storyboard almost exactly. The execution of the storyboard was very strict.

Filming and Editing

Filming the video was fun. Because the due date was on a Monday, I had to shoot over the weekend. However, Seattle was rainy that weekend, and I had no choice but to continue with the project. Here I would like to give special thanks to the runner who appears in the video, Jack Sun. He helped me film all the scenes on a cold, rainy day wearing only a shirt and a pair of shorts. Thanks to his dedication, the filming session went really well, and we got every scene I wanted in two hours.

Another interesting fact was the use of wheat flour. I planned one scene where the actor claps his hands and talcum powder flies everywhere (like LeBron James always does). However, nobody had talcum powder, so we ended up using wheat flour. It worked!

I used both my own iPhone and Apple Watch in the video.

(Photo: my Apple Watch)

You may have noticed that my Apple Watch was actually broken. I did not break it during the filming session; I broke it a while ago, but I still decided to use it because it looked stylish.

(Screenshot: the video clips imported from my camera)

Editing the video was not very hard, since I have a lot of experience editing videos. However, it was still a time-consuming process, as you can see from how many clips I imported from my camera.

(Photo: my camera)

Finally, a big salute to my dearest camera, the hero of this filming session. It endured two hours of rain while remaining fully functional.

Feedback

Likes

  • Perfect editing: people loved how organically the video and audio were combined;
  • Nice idea to pause the music when the person in the video paused running;
  • Very exciting music (people love Fall Out Boy).

Need Improvement

  • The UI demonstrations are short: people did not have time to take in each screen because the scenes switched too fast;
  • One person mentioned that she did not know what the app was until the ending credits. I might add an intro next time;
  • A little overexposed: I set the ISO a little too high, so some parts are blown out. I should have turned it down a bit.

By nature, this video is more “promotional” than a walk-through; it is more like a TV commercial that attracts people’s attention with cool visuals and exciting music. The intended audience is potential users. However, as a prototype, the video lacks elaboration on the app’s features and could be more descriptive about how the app operates. Although it is very difficult to thoroughly illustrate an app in one minute, I could have spent more time demonstrating the app rather than the scenarios.