mobilized: An Insider's Guide to the Business and Future of Connected Technology - S.C. Moatti (2016)
Chapter 3. The Spirit Rule: The Best Mobile Products Give Us Meaning
TL;DR
Mobile products are the ultimate personal products. They are with us always and understand what matters to us. Meaning from mobile comes through personalization and community.
Personalization: Mobile products make us feel taken care of. They personalize everything to our mood and context.
Community: Mobile products establish social norms and rituals that make us feel like we belong.
Mobile products are companions that need to be aware of our context—what’s happening outside and how we feel inside. They work like an extension of our spirit.
Mobile companies use two types of filters to build products for meaning: internal and external.
Case studies discussed in this chapter: Facebook, Google Glass, Siri, Tinder, Trulia, and Yelp.
Theo, a depressed writer living alone in Los Angeles, is going through a painful divorce.48 As a way of coping with his feelings of isolation, he buys an AI assistant, similar to the iPhone’s talking app, Siri. The artificial assistant’s name: Samantha.
Samantha is an electronic device unlike any other. She’s sophisticated, adaptable, and continuously supportive. Samantha makes Theo feel special. Although she’s just a piece of software, Theo eventually falls in love with her.
Theo and Samantha are, of course, the main characters in Spike Jonze’s Academy Award-winning movie, Her, which is a Hollywood version of what the mobile revolution can bring about.
Samantha is Theo’s virtual soul mate. That at least is how he sees her. Everything he asks from her he gets. Samantha is thoroughly personalized to suit him. She’s his perfect match.
But when he confesses to his friends that he and Samantha are dating, he is confronted with ridicule and judgment. Soon, Samantha also reveals that Theo is not her only relationship. In fact, she’s promiscuous. Samantha is dating thousands of people. She’s also spending time with other AI assistants, whose presence she prefers to humans. Theo is shattered.
Now, you might find the dynamic at play in Her a bit creepy. I wouldn’t argue with that. But there’s an important lesson inside of the fantasy: Great mobile products are a lot like Samantha. They’re like an extension of our spirit.
One of the most disruptive aspects of mobile products is that they are the ultimate personal products. They are with us always. They know exactly what is meaningful to us. They personalize everything to our mood and context. They build trust and comfort—ingredients that naturally build attachment. We form emotional relationships with them.
On the flip side, relationships often get challenged when confronted with other people, our communities, and the world around us. Social norms and ritual apply.
What Samantha failed to grasp with Theo, successful mobile products need to understand: we are as much who we are inside, as individuals with unique feelings and emotions, as who we are outside, in our communities with their rules and compromises. And, we seek meaning from both inside and outside. Together, our inner and outer selves reflect the complete essence of our spirit.
To understand how our mobile products help us focus on the things that matter, let’s first look inside: meaning is personalization. Then we’ll look outside: meaning is community.
Meaning and Personalization: Feeling Cared For
We are all wired with anxieties that get triggered when we least expect them. In fact, psychology professor Roy Baumeister explains that it takes a lot of energy to keep this stress under control. He calls that energy willpower,49 which is also the title of his best-selling book.
“Some people imagine that willpower is something you only use once in a while, such as when you are tempted to do something wrong. The opposite is true,” he says. “Most people use their willpower many times a day, all day.”
It all adds up to depletion of energy. That’s when we most feel that we lose control.
“Depletion seems to be like turning up the volume on your life as a whole,” Baumeister says.
Good mobile products turn the volume down on our life, and they do it by knowing a lot about us. The more they know about us, the more personalized they get. The more personalized they get, the better able they are to cater to our individual wants and needs.
A few months ago, my friend Jennifer (not her real name) started using dating service Tinder.50 What does she enjoy about it most? It’s highly personalized. Unlike more established dating services, Tinder only exists on smartphones and doesn’t rely on impersonal algorithms to evaluate romantic potential between people.
Instead, it shows her people who are close to her geographically or socially, using her location and Facebook friends and interests to make her experience unique. It uses the smartphone’s GPS to show only matches located nearby. And Jennifer also appreciates that the service uses Facebook Connect to create member profiles. Photos feel more authentic, and mutual friends are featured. It feels welcoming and safe.
“The first time I swiped,” she says, “the screen of my phone was [immediately] inundated with an ever updating stream of male suitors: loafer-wearing Kip, 28, popping champagne on the deck of a boat (pretentious—swipe left!); shirtless Aaron, 31, winking at his reflection (bathroom-mirror selfie—swipe left!); tall, dimpled Peter, 30, smiling from a mountaintop (swipe right!).” Soon after, Peter liked her too. “I was hooked.”
She also isn’t assaulted by dozens of suitors sending the same feeble introductory e-mail to every girl. Tinder’s double-opt-in mechanism lets Jennifer interact only with the suitors she chooses: if she swipes right on someone and they swipe right on her, it’s a match, and the pair can exchange private messages in the app.
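Tinder hasn’t published its matching code, but the double-opt-in idea is simple enough to sketch. The class and names below are hypothetical, not Tinder’s actual implementation; the point is that a private conversation can only open once both people have independently opted in.

```python
# Minimal sketch of a double-opt-in mechanism (illustrative only, not
# Tinder's actual implementation). A conversation opens only once both
# people have independently swiped right on each other.

class MatchService:
    def __init__(self):
        self.right_swipes = set()   # recorded (swiper, swipee) pairs
        self.matches = set()        # frozensets of matched pairs

    def swipe_right(self, swiper: str, swipee: str) -> bool:
        """Record a right swipe; return True if it completes a match."""
        self.right_swipes.add((swiper, swipee))
        if (swipee, swiper) in self.right_swipes:
            self.matches.add(frozenset({swiper, swipee}))
            return True
        return False

    def can_message(self, a: str, b: str) -> bool:
        """Private messages are allowed only between matched users."""
        return frozenset({a, b}) in self.matches


service = MatchService()
service.swipe_right("Jennifer", "Peter")          # no match yet
print(service.swipe_right("Peter", "Jennifer"))   # True: it's a match
print(service.can_message("Jennifer", "Kip"))     # False: no mutual opt-in
```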
Jennifer is one of over 50 million singles who participate in the cultural phenomenon that Tinder has become in less than four years. The service is available in 30 languages and claims over six billion matches since its launch in 2012. Some will use it in their hometown, others to spice up a business trip.
By being constantly connected to our environment, our mobile products alleviate decision fatigue. They sort through the millions of information bits we are bombarded with to show us only the ones that matter right here, right now. We give them permission to make these decisions on our behalf because they know enough about us to personalize everything.
This personalization is essential to what makes mobile products successful. It puts us in complete control of the experience.
Sometimes, the experience we get from mobile is so personalized that we wouldn’t be able to reproduce it otherwise. Life suddenly gets easier, because we are no longer hampered by circumstances beyond our control. Our stress level goes down, as in this example.
Not too long ago, I had an important meeting with a major partner, and as I was leaving my apartment it started raining. I decided to hail a cab.
Of course, there was no cab in sight. It took me a while to finally find one and by then, I was soaked and already late for my meeting. On top of this, when it came time to pay the fare, I didn’t have enough cash so we had to stop by an ATM.
All I could think about was that I was going to lose my client. I blamed myself for not planning enough. I was upset at the rain for messing up the traffic. But really, I was afraid of losing a significant source of income. All because I couldn’t find a cab.
Now that I started using Lyft and Uber, I no longer get stressed when I need a ride. All I need to do is pull up the service on my phone when I’m getting ready to go somewhere, get in the car when I’m notified that it’s here to pick me up, and get out when I’ve arrived. It optimizes my itinerary in real time by routing around delays that before would have left me stuck in traffic. It even tells me ahead of time how much the fare will be. I no longer even need to “pay” in the traditional sense, because the fare is automatically charged to my credit card. I feel cared for, even pampered, because the service eliminates all the previous hassle of getting from point A to point B. It feels good.
Feeling taken care of in ways we cannot provide to ourselves is a reflection of what is important to us, of what has inner meaning to us. A bond naturally develops from this extreme personalization, similar to any relationship. This connection lifts our spirit, not unlike intense feelings such as love. And what gives us more meaning than being in love?
The meaning we create inside ourselves is what makes us unique. Outside, it’s all about compromises. It’s about how we live well in our communities. The best mobile products recognize this duality. We’ve looked inside, now let’s look outside.
Meaning and Community: Social Norms and Rituals
In any relationship, our bonds are most tested when we go out among other people in the world. Social dynamics can be affected, as we saw earlier with Theo and Samantha. And because mobile is still new in our lives, we have not yet established norms and rituals for it.
Consider the question on many people’s minds these days: Are we becoming too connected to our mobile devices? Some people (including myself at times) struggle to ever disconnect. We are always on our devices. Because every action on mobile is immediately recorded and broadcast, we compulsively obsess over every signal we get or don’t get. It’s been two hours already and no one commented on my post. Did I say something wrong? Why are people not paying attention? Maybe they don’t like me anymore? Maybe they never did?
“Some people need to get unhooked,” says best-selling author of Hooked, Nir Eyal, who wrote the foreword to this book.51 He’s certainly talking about the Theos of the world, who completely isolate themselves from others as their connection with their devices deepens. But what about the rest of us?
It’s almost become cliché to hear complaints about everyone having their faces glued to their screens, stifling meaningful person-to-person communication. But do our smartphones disengage us from those around us, or connect us with them?
Some argue that people who suddenly shut down from a group and pull out their smartphone are detrimental to communities. But most of the time, I see people do this because they want to move the conversation forward, to be helpful. They want to fact-check something, or share a funny video. I don’t think there’s anything wrong with that.
In fact, research shows that mobile products create deeper bonds between users and their communities.
A study by the University of Florida, for instance, illustrates how mobile products make people feel more connected to those around them.52 In 2013, a group of 339 undergraduates volunteered to answer questions about their smartphone activity. The researchers were interested in the frequency with which these students used their smartphones and what they were doing on them.
They found that participants spent two to three hours every day using their smartphone, of which about one hour was spent on social activities, including the use of social networks.
More importantly, the students were asked to answer questions about their social capital, which is the complex web of relationships that helps us live and work in our community. The results show that heavy smartphone users who use their mobile devices to connect with others and the world around them have stronger social capital. For example, these students were more involved in their local community, had more trusted people they could turn to for advice about making important decisions, and knew a greater number of people who could give them access to resources like professional publications.
With the mobile revolution, there is a lot more data about everything and everyone than ever before. And there is no going back. This abundance of information is mostly helpful, though sometimes it can expose our private lives—that inside world we’ve created through hyper-personalization of our mobile products—to a level of scrutiny that challenges our comfort level.
When we engage in mobile communities that operate by sharing intimate information about each other, we are in a sense observing each other—all the time. Research confirms that we tend to be on our best behavior when we know we are being observed.53 But this unprecedented level of personal exposure that the mobile revolution demands in order to function is relatively uncharted territory.
Beyond the fact that such a vast amount of our personal information is available to others, I have another concern: mobile companies have become, in fact, less business, more utility. They cannot go out of business because, just like our electricity and water, we expect our data to be around all the time.
Tech visionary and best-selling author of You Are Not a Gadget, Jaron Lanier, makes the case that companies that own massive amounts of data about us are effectively a public service. They should be nationalized, he argues, and their focus should shift from making a profit to protecting people.
But even government-run agencies need checks and balances. When government has had easy access to our personal information in the past, it has had nefarious consequences: surveillance states, “enemies” lists, persecution of dissidents. In places like China, it is still a reality today.
How do we protect ourselves from government overreach into our digital—and, therefore, personal—lives? We need to have the debate.
The same goes for commercial control of our personal data. The European debate around the “right to be forgotten” is an example of democracy in action, of people demanding more control of their digital selves.
So privacy loss is one of the biggest concerns when it comes to living in a mobile community. It’s a topic that divides generations. Most millennials don’t believe there is such a thing as privacy; many baby boomers feel that it’s a right. This isn’t saying that the former are naïve or the latter suspicious. This is saying that, in our own ways, we all want to feel in control. We want our mobile products to protect us, our loved ones, and our communities.
The disappointing launch of Google Glass is an example of what can happen when a mobile product fails to do so. It followed a typical losing playbook: inflated expectations, overpriced gadget ahead of its time, safety concerns, unclear use case, and lack of real-world testing. But none of those issues were its fatal flaw.
It didn’t help that the initial launch was poorly handled. Right out of the gate, the press trashed the technical shortcomings: the battery didn’t last, there were many crashes and bugs. One journalist even called it “the worst product of all time.” But soft-launching a product before it is ready for prime time in order to use the feedback to improve is not uncommon in Silicon Valley.
What doomed Google Glass is that its power was seen as scary and out of the hands of its users. People felt it was built primarily to serve Google’s own interests, that it was designed to collect data for the benefit of the tech giant rather than to serve the needs of its users. When exactly was it recording? Was it really turned off when it said it was? It triggered the same fear Theo experienced in Her, when Samantha revealed to him that she wasn’t his exclusive AI assistant. Was she sharing his secrets with others? Theo felt betrayed and out of control because Samantha wasn’t up front with him.
The distrust extended into communities. People wearing Google Glass were asked to leave bars, movie theaters, and casinos. Privacy concerns were raised. Eventually, the product was removed from the market at the beginning of 2015.
Google is now trying to reinvent Google Glass as a hands-free display that could, for instance, be used by surgeons as they operate on a patient. I think it could be very valuable in these types of controlled situations.
The failed launch of Google Glass shows that we care about privacy rules in our mobile products—they have meaning to us—and we notice and react negatively when they are not clear.
Meaning in Context
Things only make sense in context. Samantha was able to focus on what mattered to Theo only once she understood what it was that had meaning to him. This is a particularly difficult challenge for mobile products because there are so many signals coming through at any point in time, from both inside and outside.
Take Apple’s personal assistant, Siri. The reason it isn’t used to its full potential today is that many of its recommendations are not pertinent.54 It very often fails to contextualize and correctly interpret all the external information that’s relevant to a user’s request.
Siri was initially created to be a do engine that fulfills wishes. Unlike a search engine, which finds information but then leaves it up to you to take action, Siri was meant to do things for you, like buy movie tickets, place a confirmation call on your behalf, or reserve a table under your name.
But when it initially launched, it was far from being able to do all the things its creators wished it could do. Like many technologies, Siri’s genesis goes back to military research. The original company was founded in the early days of the mobile revolution by three members of the CALO project, the largest AI project in history.55 (CALO is an acronym for “Cognitive Assistant that Learns and Organizes.”) The goal was to build a virtual assistant that would help military commanders. It used a complex technology called natural language processing to interpret what we say and turn it into commands a computer can understand.
Siri’s secret sauce was to use natural language processing in a very sophisticated way. Instead of trying to guess what people say in a vacuum, which is a challenge for AI because there are so many variables and possibilities, Siri set out to understand sentences in context, using a smartphone’s GPS, address book, calendar, and more to interpret meaning.
Early on, Siri missed the point too often. Its responses to questions and requests weren’t contextual or even relevant.
As such, Siri never lived up to its hype. People see it mostly as an intriguing gadget with some amusing uses, or as a hands-free tool when they are driving or otherwise unable to use the keyboard. Some use it to manage their calendar, for example, when they need to schedule an alarm, set a reminder, or cancel a meeting. And that’s about it.
In Her, Samantha was a more evolved version of Siri—it was what Siri wanted to be. But that was a movie, and mobile products must function in reality. And the reality is that mobile products can help us focus on what matters only if they understand the context we’re in.
With Siri, we’ve looked at external context. Now let’s use Facebook to examine how our internal context is just as important.
Many people who wished to file a complaint on Facebook used to give up halfway. Someone would start reporting, say, a photo someone else had posted and then would change their mind at the last minute and abandon the process.
The Facebook team discovered that the reason people would not go through with reporting pictures or posts they initially found offensive was that they were afraid it would upset the person who posted it. That person was their friend most of the time, and they didn’t want to hurt them. They were placing their friend’s feelings above their own.
To solve the problem, Facebook called on a group of experts in behavioral economics. “[Unlike traditional economics,] behavioral economics does not assume that people are rational,” says Dan Ariely, best-selling author of Predictably Irrational56 and professor of behavioral economics at Duke University. Much of Ariely’s research revolves around how people make choices and the resulting effect on incentives.
The complaint form had been designed to be as neutral as possible. When people reported a photo, it asked them to select the reason they were reporting it: it’s insulting, or it’s blurry, or it tags me but I’m not actually in it, and so on. It appealed to people’s common sense and rational brain.
But reason is often misinterpreted as judgment. The tone of the abuse report form felt judgmental to users. Even though the form asked for a judgment on the picture, people were afraid it would be interpreted as a judgment on the person. Such a complaint could put the entire friendship at risk.
The behavioral experts suggested an alternative approach. Instead of listing neutral reasons for submitting a complaint about what the poster had done, why not offer a choice of what the user was feeling at the time they decided to file a report? Were they feeling insulted by the content? Were they concerned that the photo they got tagged in didn’t make them look good? Were they upset because their political views had been exposed?
The abuse report became a way for users to share how they felt about something that was bothering them, without incriminating, judging, or accusing their friends. They were able to express negative feelings without compromising their relationship.
The completion rate of the complaint form went from the single digits to almost 100 percent.
Facebook didn’t invent a new type of communication. In fact, many family therapists use this technique when they help couples improve their relationship. Facebook simply recognized that people think about and value their place within communities. It is another reflection of their inner selves, their spirit.
This deep understanding of our internal and external lives is powered under the hood by a gigantic personalization machine. This machine can be so effective that people become emotionally attached to their mobile products. We come to rely on them extensively because they know what’s meaningful to us as individuals. It’s an extraordinary relationship, when you think about it. Let’s look at the mechanics of how it is achieved.
Building for Meaning: Mobile Products as Extensions of Our Spirit
Because mobile products are always with us, they know exactly what is meaningful to us as individuals and as members of our communities. Mobile companies rely on two types of filters—internal and external—to satisfy the needs of both relationships. Internal filters enable personalization by learning about us, who we care about, and where we go. Once they understand what matters to us personally, external filters allow the experience to be shared and enjoyed with other people.
Internal filters can be as simple as our current location or our address book, but they also enable more subtle location-based services that effectively connect people to their environment without any effort on their part.
Take real estate marketplace Trulia, for instance. Most people looking for an apartment or home prefer to search on the go, while they’re actually in their favorite neighborhood, rather than from behind a computer screen. Trulia completely eliminates the tedious research part of the process.
Once Trulia knows a user’s criteria for a home, it sends them personalized push notifications whenever a suitable listing is available nearby. It even contacts property managers on their behalf and helps them be first in line to check desirable properties. Users no longer need to browse themselves; Trulia does it for them. It’s like a real-life version of Samantha, who personalized everything for Theo without even having to consult with him.
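To make the idea concrete, here is a rough sketch of the kind of internal filter that could power such an alert: match each new listing against the user’s saved criteria and current location, and notify only when a listing is worth interrupting them for. The names, fields, and distance check are illustrative assumptions, not Trulia’s actual system.

```python
# Hypothetical listing-alert filter: push a notification only when a new
# listing fits the user's saved criteria and is close to where they are.

from dataclasses import dataclass
from math import radians, sin, cos, asin, sqrt


@dataclass
class Listing:
    price: int
    bedrooms: int
    lat: float
    lon: float


@dataclass
class SearchCriteria:
    max_price: int
    min_bedrooms: int
    radius_km: float


def distance_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points (haversine formula)."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 6371 * 2 * asin(sqrt(a))


def should_notify(listing: Listing, criteria: SearchCriteria,
                  user_lat: float, user_lon: float) -> bool:
    """True if the listing fits the saved criteria and sits within the
    user's chosen radius of their current location."""
    return (listing.price <= criteria.max_price
            and listing.bedrooms >= criteria.min_bedrooms
            and distance_km(listing.lat, listing.lon, user_lat, user_lon)
                <= criteria.radius_km)
```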
We first tested this concierge feature during my time at Trulia and were concerned that people might not like it. After all, we were making big decisions on their behalf and acting as an automated rental agent. We thought many people would complain, so we made it very easy for them to opt out.
On the contrary, user satisfaction went up. The process of looking for a place to purchase or rent is so stressful that anything that made it easier was welcome.
As our mobile products come to understand what matters to us, we expect them to meet more than our individual needs. They also need to help us integrate in our communities. This is where external filters come in. Done right, external filters power healthy communities where everyone feels respected.
External filters can take the form of a privacy policy, which governs what a product can and cannot reveal about its users, or compliancy rules, which dictate what can and cannot be said or done about members of a community, or a double opt-in mechanism, which regulates how people meet others. They establish social norms and rituals for our mobile products.
Users feel that they belong in a community because the product itself helps them respect others’ needs and act in ways that the group will accept. This makes them feel like they matter because they are part of something bigger than themselves.
To be effective, external filters need to be very transparent, otherwise users feel manipulated, even betrayed. Recall how our friend Theo in Her felt when he no longer was certain what information Samantha was sharing with others. The policy itself matters less than how clearly it is communicated, as we saw in the real-life counterexample of Google Glass.
Building for Meaning: Internal and External Filters
Mobile companies use two types of filters to sort through the barrage of signals we constantly receive from our environment and create meaning out of it. Internal filters personalize our experience based on our mood, context, and preferences.57
Internal filters come in two forms:
Place filters, such as current location, home or work address, and other points of interest, power what are called location-based services. They keep us connected to our environment in ways we couldn’t be without them. Mobile companies that power these services have transformed our lives profoundly. People no longer need to decipher maps to go places or rely on local insiders to find cool restaurants.
People, such as address books and social plug-ins, make it easy for us to connect with our friends and loved ones and reach exactly whom we need when we need them. They make sure we have control over who sees what when. They protect us and our social circles. Mobile companies use people filters to let us share photos and videos with selected groups, or suggest that we send our friends a nice message on their birthday, or create custom groups to, for example, invite friends to a holiday bash or message our book club.
Because they are so personal, internal filters build an emotional connection between users and their mobile products. Just as with friends and loved ones, the relationship needs to survive the test of being out in the world. This is where external filters come in. To be effective, they need to be very transparent.
There are three types of external filters:
Policy filters, such as privacy rules, terms and conditions, and other legalese, govern how users’ information can and should be used. These filters protect members of a community from threats like government monitoring and nuisances like unsolicited advertising. Mobile companies are often forced by law to implement and enforce policy filters.
Popularity filters, such as reviews and ratings, establish and protect users’ reputations. They facilitate the informal vetting process we go through as we become accepted by a group or community. They express the respect others have for our skills, tastes, values, and opinions. They are often used by mobile companies to ensure that members of a community play by the rules.
Permission filters, such as opt-in/opt-out features and abuse reports, regulate how others can impact us. They are especially important because many of us never disconnect from our mobile products. Permission filters ensure that we can decide what others can and can’t do with our personal data, and set the boundaries for how they can communicate with us. Mobile companies must be very mindful of not taking users for granted.
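As a concrete illustration of the last of these, here is a minimal sketch of a permission filter, assuming a hypothetical product that checks a user’s explicit opt-ins before contacting them and lets an abuse report revoke another user’s access. It is not any company’s actual policy engine.

```python
# Hypothetical permission filter: contact is allowed only on channels a
# user has opted into, and never from someone the user has reported.

class PermissionFilter:
    def __init__(self):
        self.opt_ins = {}    # user -> set of channels they allow
        self.blocked = {}    # user -> set of users they have reported

    def opt_in(self, user: str, channel: str):
        self.opt_ins.setdefault(user, set()).add(channel)

    def opt_out(self, user: str, channel: str):
        self.opt_ins.get(user, set()).discard(channel)

    def report_abuse(self, user: str, offender: str):
        """An abuse report immediately cuts off the offender's access."""
        self.blocked.setdefault(user, set()).add(offender)

    def may_contact(self, sender: str, recipient: str, channel: str) -> bool:
        return (channel in self.opt_ins.get(recipient, set())
                and sender not in self.blocked.get(recipient, set()))


filters = PermissionFilter()
filters.opt_in("Jennifer", "push_notifications")
filters.report_abuse("Jennifer", "spammy_suitor")
print(filters.may_contact("Peter", "Jennifer", "push_notifications"))          # True
print(filters.may_contact("spammy_suitor", "Jennifer", "push_notifications"))  # False
```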
Mobile products are the ultimate personal products. They cater to our every need. Internal filters personalize everything to make us feel understood and taken care of as an individual. External filters create the social norms we need to feel like we belong to the communities we care about. This is how mobile products become extensions of their users’ spirit.
Building Facebook and Yelp for Meaning
Let’s look at how Facebook uses internal filters to personalize the experience of the most adept users.
Seasoned Facebook users tend to have more friends than average and as a result, there is a lot of content for them to browse, from profiles to pictures, posts, pages, and more. Most of the time, they will rely on the News Feed feature to give them a snapshot of what’s most relevant right here, right now.
When I worked at Facebook, I sat a few desks away from the team that was in control of the settings and parameters of the News Feed. One of its charges was to develop internal filters to better personalize the experience of Facebook users.
It is hard to imagine a more complex task than making sense of thousands of rankings and preferences in real time. But instead of monopolizing control of the process, the News Feed team made the deliberate choice to give users direct access to internal filters so that they could control the content of their own News Feed.
The Facebook team realized that advanced users wanted more than just better controls and settings, or a fancier News Feed. They wanted their voices heard. This is why internally they are called content producers, in contrast with less sophisticated users who consume content instead of producing it.
An average user may not notice this, but experienced users craft their Facebook posts to make sure they reach exactly whom they’re intended for. They often create multiple groups of friends based on shared interests or locations. They even have the ability to give those groups names: “amateur chess club,” or “Sally’s birthday bash,” or “San Francisco hikers.”
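Under the hood, that targeting is a people filter: named groups that determine exactly who a post reaches. The sketch below is a hypothetical illustration of the idea, not Facebook’s actual audience-selection code.

```python
# Hypothetical people filter: custom, named friend groups that control
# exactly who can see a post.

class PeopleFilter:
    def __init__(self):
        self.groups = {}   # group name -> set of friends

    def create_group(self, name, members):
        self.groups[name] = set(members)

    def audience(self, *group_names):
        """Union of the chosen groups: exactly who the post will reach."""
        return set().union(*(self.groups.get(g, set()) for g in group_names))


filters = PeopleFilter()
filters.create_group("amateur chess club", ["Ana", "Raj"])
filters.create_group("San Francisco hikers", ["Raj", "Mei", "Tom"])
print(filters.audience("San Francisco hikers"))   # only the hikers see the post
```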
Facebook doesn’t try to get the content producers to do anything new. It simply makes it easy for them to share the things that matter to them with their community.
Now let’s look at how crowdsourced reviews service Yelp uses external filters to build community.
Clay (not his real name) is a 53-year-old general contractor in Concord, California. He started his business more than 30 years ago and almost had to close it down during the last recession. Things suddenly picked up in 2011, when one of his customers asked if they could leave him a review on Yelp. Clay and his small crew had done a great job with their bathroom remodel.
Founded in 2004 by former PayPal employees Russel Simmons and Jeremy Stoppelman, Yelp58 publishes local business reviews through crowdsourcing. Over the past few years, I’ve been lucky to count them as one of my partner companies.
At the time, Clay wasn’t familiar with Yelp. He downloaded the app on his smartphone and created an entry for his business. Soon, other happy customers left him reviews. It felt really good to see the quality of his work acknowledged publicly. Yelp reviews and ratings are examples of popularity filters that, as in Clay’s case, make users feel valued and appreciated.
Within a few months, Clay noticed that he was receiving significantly more requests for quotes. Until then, new business had come through referrals (and there was never quite enough of them), but these seemed to be coming out of the blue. People would reach out and mention the great Yelp reviews.
Clay used to work with the same trusted specialists on his jobs, an electrician and a plumber, who over time had become his friends. He had been concerned that they might want to retire soon without finding the time to groom the next generation. When contracts started to increase, his business reached the capacity it needed to support a larger crew with younger members he could train himself.
Today, Clay’s business has more than doubled. He is not alone—Yelp has millions of reviews and has become a rainmaker for small businesses. A 2011 study published by Harvard Business School59 revealed that each additional star in a business’s rating affects its sales by 5 to 9 percent.
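To put that range in concrete terms, here is the back-of-the-envelope arithmetic; the $500,000 baseline is a made-up figure for illustration only.

```python
# Rough arithmetic on the 5-9 percent range reported by the study.
baseline_revenue = 500_000          # hypothetical annual revenue
for uplift in (0.05, 0.09):         # low and high ends of the range
    print(f"One extra star at {uplift:.0%}: "
          f"${baseline_revenue * uplift:,.0f} more per year")
# One extra star at 5%: $25,000 more per year
# One extra star at 9%: $45,000 more per year
```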
I asked Clay what he worries about most and his answer didn’t surprise me: the competition. But I failed to grasp his real meaning. Clay has more business than he needs, so he’s not concerned about losing bids to his competitors. Instead, he’s scared that one of them will leave him a fake negative review on Yelp.
Fake reviews, as you can imagine, are a common practice. To filter them out, Yelp has developed a proprietary tool that marks as many as one in four reviews as suspicious. Although Yelp has had to deal with lawsuits about review accuracy, the company’s filtering algorithm is one of the industry’s most effective. Because of that, Yelp has gained the trust of 85 million people60 who use its mobile app every month to find a nearby restaurant for lunch, a late-night bar to end a fun date, or a contractor like Clay for a home remodel.
External filters, such as popularity filters, can make or break a business. The judgments people make about us carry a lot of weight, and we have little control over them.
Remember and Share
One of the most disruptive aspects of mobile products is that they are with us always. They understand what matters to us both from inside, as individuals with unique feelings and emotions, and from outside, as members of communities with social rules and rituals. Meaning from mobile comes through personalization and community.
Personalization: The more mobile products know about us, the more personalized they get. So people come to rely on them extensively and often become emotionally attached to them.
Community: As our bond grows inside, so does the need for privacy and other social norms outside. We need big business and government to protect our personal data and treat it with respect.
To help us focus on the things that matter, mobile products need to be aware of our context, both what’s going on outside and how we feel inside.
Mobile companies use two types of filter to build for meaning: internal and external. Internal filters enable personalization by learning about us, who we care about, and where we go. Once they understand what matters to us personally, external filters allow the experience to be shared and enjoyed with other people.