Our smartphones allow us to connect with the rest of the world from anywhere. Ironically, however, they also tend to disconnect us from the world immediately around us. But rather than fight the creep of technology into our social spaces, there is the opportunity to reframe our personal devices as social devices.
As we start using our phones to interact with the people who are near us, there are a number of unique aspects of our interaction that mobile applications can capitalize on. Co-located collaborators can see each other, talk with each other, and share surrounding context. This is why it is easier to have a conversation with someone while driving if the other person is in the car with you and not on the phone. A person in the passenger seat can pause when the road requires your attention. A person on the phone can’t see when your attention gets diverted or the road conditions change.
Mobile applications should support face-to-face interaction by building on our existing persistent communication channels and shared context. They can do this by:
- Keeping users’ attention on their immediate surroundings,
- Augmenting existing physical signals, and
- Supporting existing verbal communication.
Keeping Attention on the Surroundings
One reason that mobile phone use is seen as rude is that it removes the user’s attention from their immediate surroundings. To avoid this, mobile devices should support input and output modalities that require minimal attention to the device, leaving the user’s attention free to focus on the people around them. Smartphones, tablets, and, more recently, wearable devices provide new sensors that can enable low-attention scenarios. A person, for example, can input information into their mobile phone merely by tapping their pocket, or get output from their phone via vibration.
As an example of how mobile devices might support low-attention input and output, we developed a system that allows meeting attendees to provide feedback from their mobile devices during a presentation. Audience feedback is aggregated and reflected back to attendees alongside the presenter’s PowerPoint slides. The feedback, however, is incidental to the audience’s primary goal, which is listening to the presentation. Using the touch screen and accelerometer, their phones can recognize their gestures and allow them to provide input without attending to the device at all. For example, in one implementation of the system, a phone that is face up provides positive feedback, while a phone that is face down provides negative feedback. Likewise, phones’ sensors could be used to identify hand raising or clapping, and that information could be used for feedback.
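A minimal sketch of the face-up/face-down idea, assuming gravity-normalized accelerometer readings in g units (the threshold, function name, and signal labels are illustrative, not taken from the published system):

```python
# Hypothetical sketch: classify audience feedback from a phone's
# accelerometer z-axis. Assumes readings normalized to g units, where
# z near +1 means the phone is lying flat with the screen facing up.

def classify_feedback(accel_z: float, threshold: float = 0.8) -> str:
    """Map device orientation to a feedback signal.

    Face up   (z near +1g) -> positive feedback
    Face down (z near -1g) -> negative feedback
    In between             -> no signal (phone likely being handled)
    """
    if accel_z >= threshold:
        return "positive"
    if accel_z <= -threshold:
        return "negative"
    return "none"

print(classify_feedback(0.97))   # phone flat, screen up  -> positive
print(classify_feedback(-0.95))  # phone flat, screen down -> negative
print(classify_feedback(0.2))    # phone tilted or in hand -> none
```

The dead zone between the thresholds matters: it keeps a phone that is being picked up or glanced at from flickering between positive and negative signals.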
While the system engages the audience and creates a sense of community among the people who use it, it can be hard for the presenter to keep track of what is occurring in the backchannel while they are talking. For this reason, when an interesting feedback event occurs, a summary notification is created and the presenter’s mobile phone vibrates to create awareness. Audience members’ phones could also vibrate, creating a communal awareness of feedback events and re-engaging people in the feedback experience.
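One way such an "interesting event" notification might be triggered is by checking whether any aggregated signal crosses a fraction of the audience; this sketch is an assumption about the mechanism, and the 0.5 threshold and function name are illustrative:

```python
# Hypothetical sketch: build a summary notification when aggregated
# feedback crosses an audience-fraction threshold. The threshold value
# and message format are illustrative assumptions.

def summarize_feedback(votes: dict, audience_size: int,
                       threshold: float = 0.5):
    """Return a notification string if any signal dominates, else None."""
    for signal, count in votes.items():
        if count / audience_size >= threshold:
            return f"{signal} feedback from {count}/{audience_size} attendees"
    return None

print(summarize_feedback({"positive": 12, "negative": 2}, 20))
print(summarize_feedback({"positive": 3, "negative": 4}, 20))  # None
```

Returning `None` for the common case means the presenter's phone stays silent unless a clear consensus emerges, which keeps the vibration channel low-attention.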
Augmenting Existing Physical Signals
In addition to not drawing our attention away from our immediate social surroundings, our phones should augment our existing interactions. One way to do this is to support and augment people’s physical cues.
We explored the use of physical signaling via a mobile search application. Although most mobile search tools are designed for individuals, many mobile searches involve people searching with others. During these searches, collaborators transition between individual and group search, moving closer to each other and sharing their screens while discussing search criteria and results. We developed a mobile application that explicitly supports this behavior by allowing users to physically signal their willingness to share. The core application provides standard mobile search functionality when the phone is used in an upright, vertical position. However, users can rotate their devices horizontally to indicate (to the device and others) that they are ready to collaborate (as shown in the picture above). The phone’s rotation causes it to enter screen-sharing mode; subsequent group members who rotate their phones are then able to interact with the sharer’s screen contents directly.
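The rotation-to-share behavior can be sketched as a small state machine, assuming a session that tracks who offered their screen first; the class, method names, and join/leave rules here are illustrative, not the application's actual implementation:

```python
# Hypothetical sketch: device orientation as a physical signal for
# collaboration. Rotating to landscape offers (or joins) a shared
# screen; the sharer rotating back to portrait ends the session.

from dataclasses import dataclass
from typing import Optional

@dataclass
class SearchSession:
    mode: str = "private"                      # "private" or "sharing"
    shared_screen_owner: Optional[str] = None  # who is sharing, if anyone

    def on_orientation_change(self, user: str, orientation: str) -> None:
        if orientation == "landscape":
            if self.mode == "private":
                # First rotation: this user offers their screen to the group.
                self.mode = "sharing"
                self.shared_screen_owner = user
            # Later rotations by others simply join the existing share.
        elif user == self.shared_screen_owner:
            # The sharer rotating back to portrait ends the session.
            self.mode = "private"
            self.shared_screen_owner = None

session = SearchSession()
session.on_orientation_change("alice", "landscape")
print(session.mode, session.shared_screen_owner)  # sharing alice
```

Because the same rotation is visible to nearby people, the gesture does double duty: it switches the software's mode and simultaneously tells the group that the user is ready to collaborate.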
The use of device orientation to control interaction is an active area of research. For example, Codex, a prototype dual-screen tablet computer, activated collaboration features upon detecting that the device was in an outward-facing position. Our mobile search application extends sensor-based interaction to multi-person, multi-device collaboration scenarios, in which a device’s orientation signals a desire to collaborate both to the controlling software and to co-located group members. Small adjustments to the position and orientation of paper or mobile devices can support subtle shifts in collaboration.
Supporting Existing Verbal Communication
Social mobile devices should also actively seek to increase interpersonal discussion and augment the positive aspects of having a verbal backchannel between co-located people, while reducing unnecessary coordination costs. Communication, and, in particular, voice communication, has been shown to have a significant positive impact on cooperation and trust, which is essential to positive social experiences.
As an example of how a mobile device might encourage verbal communication, we developed a mobile application that identifies and describes meeting attendees in order to foster social relationships. By observing the application in use across a number of enterprise meetings, we found that users valued being able to access information about the other people in the room, particularly when those people are unfamiliar. To help users engage with the people they were learning about while using their phones, we employed a gaming approach that asks trivia questions about the other attendees. This gameplay appeared to focus attention within the meeting context and spark conversations.
Similarly, the social mobile search application described earlier uses trivia questions to spark conversation during a search. For example, when collaborators search for a café, they might receive a pop-up asking, “What was the first retailer to offer freshly-brewed coffee in to-go cups?” The pop-up appears on everyone’s phones simultaneously and must be dismissed before participants can continue their search. Because everyone is interrupted with the same question at the same time, this can spark conversation. However, such interruptions can also distract users as they try to complete their task. For this reason, trivia questions are timed to minimize disruption and do not appear during activities like text entry.
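The timing logic can be thought of as an interruption gate: a question is shown only when no one is mid-task and enough time has passed since the last pop-up. This is a sketch under those assumptions; the class name, the typing check, and the 60-second spacing are all hypothetical:

```python
# Hypothetical sketch: gate trivia pop-ups so they avoid text entry
# and are spaced out over time. The 60-second minimum interval is an
# illustrative assumption, not the application's actual tuning.

class TriviaScheduler:
    def __init__(self, min_interval_s: float = 60.0):
        self.min_interval_s = min_interval_s
        self.last_shown = float("-inf")  # no question shown yet

    def may_interrupt(self, now: float, anyone_typing: bool) -> bool:
        """Allow a pop-up only outside text entry and not too often."""
        if anyone_typing:
            return False
        return now - self.last_shown >= self.min_interval_s

    def show_question(self, now: float) -> None:
        self.last_shown = now

sched = TriviaScheduler()
print(sched.may_interrupt(now=0.0, anyone_typing=True))   # False: typing
print(sched.may_interrupt(now=0.0, anyone_typing=False))  # True
sched.show_question(0.0)
print(sched.may_interrupt(now=30.0, anyone_typing=False))  # False: too soon
```

Checking the whole group's activity, rather than one user's, reflects the design goal: the interruption is only worthwhile if everyone can attend to it at the same moment.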
These are just some of the ways to capitalize on the unique aspects of face-to-face interaction to create a more social phone experience. If we can get it right, our hope is that eventually you will react positively, rather than negatively, when the person you are with pulls out their phone.
- J. Teevan, M.R. Morris and S. Azenkot. Supporting Interpersonal Interaction during Collaborative Mobile Search. IEEE Computer special issue on Collaborative Information Seeking, 47(3), 2014.
- J. Teevan, M.R. Morris and S. Azenkot. Using Physical Signaling to Support Collaborative Mobile Search. CSCW 2014.
- M. Böhmer, T.S. Saponas and J. Teevan. Smartphone Use Does Not Have to Be Rude: Making Phones a Collaborative Presence in Meetings. MobileHCI 2013.
- J. Teevan, D.J. Liebling, A. Paradiso, C. Garcia Jurado Suarez, C. von Veh and D. Gehring. Displaying Mobile Feedback during a Presentation. MobileHCI 2012.
- J. Teevan. Using Mobile Phones to Augment Face-to-Face Social Interaction. Talk at MISC, 2012.
- J. Teevan, A. Karlson, S. Amini, A.J. Bernheim Brush and J. Krumm. Understanding the Importance of Location, Time, and People in Mobile Local Search Behavior. MobileHCI 2011.