Tuesday, August 27, 2013

News Analysis: Addicted to Apps

SAN FRANCISCO — IN Silicon Valley, that fairy-tale land of robots and driverless cars, a deeply held belief motivates all: If you build it, they will come.

If engineers can build something, the thinking goes, they do. Whether they should is beside the point. People will eventually adapt, engineers believe, just as they always have.

And they do adapt, most of the time. That uneasy feeling that often accompanies our first experience with a new technology quickly subsides, and we are won over. Sure, smartphones track us everywhere we go, but who worries about that when they’re so cool and useful?

It is one way to understand why we have so freely shared troves of personal information with big technology companies — as well as the muted public reaction now that we know more about how the government spies on our data.

Adapting to a new technology is like a love affair, said Ellen Ullman, a software engineer and a writer of essays and novels about the human element of computing. The devices, apps and tools seduce us, she said, and any doubts or fears we had melt away.

“It’s naïve confidence in the digital environment,” Ms. Ullman said. “People come to trust it because one, it’s fun, and two, it’s convenient.”

But we cannot rely on the makers of new technology to think about the moral and privacy implications, she said. “There is not a lot of internal searching among engineers,” she said. “They are not encouraged to say, ‘What does that mean for society?’ That job is left for others. And the law and social norms trail in dealing with the pace of technical changes right now.”

Take, for instance, Google Now. It is the most visible of an emerging crop of apps that perform what engineers call predictive search — sending you information they think you need even before you ask for it. Among everyday users, there is often a visceral negative reaction to such apps. They give too much information to advertisers or the government, people fear, and eliminate the unpredictability of human existence.

But Amit Singhal, Google’s senior vice president for search, predicted that we will eventually adjust.

“If it’s there with you all the time, you will get comfortable,” he said. “Our objective is to build that technology because, guess what, that does not exist. We’re just building the dream, and clearly users will have to get comfortable with it.”

The first day I used Google Now, my phone buzzed to tell me I needed to leave in 15 minutes for my restaurant reservation, because there was traffic on the way.

The thing is, I had never told my phone I had a reservation. Or where I was, or the route I planned to take. Google had spotted the OpenTable reservation in my Gmail in-box, knew my phone’s location and checked Google Maps for traffic conditions.
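For readers curious about the mechanics, here is a minimal sketch, in Python, of the kind of arithmetic such an alert implies: take the reservation time parsed from an e-mail confirmation, subtract an estimated drive time in current traffic plus a small cushion, and notify the user once the departure window gets close. The function, the buffer, and the numbers below are hypothetical stand-ins for illustration, not Google's actual code or data.

```python
from datetime import datetime, timedelta

def minutes_until_departure(reservation_time: datetime,
                            travel_minutes_with_traffic: int,
                            now: datetime,
                            buffer_minutes: int = 5) -> int:
    """Minutes from 'now' until the user should leave to arrive on time.

    A simplified, hypothetical model: departure time is the reservation
    time minus the traffic-adjusted drive time and a fixed cushion.
    """
    leave_at = reservation_time - timedelta(
        minutes=travel_minutes_with_traffic + buffer_minutes)
    return int((leave_at - now).total_seconds() // 60)

# Illustrative values: a 7:30 p.m. reservation spotted in an e-mail,
# a 40-minute drive in current traffic, and the present moment.
reservation = datetime(2013, 8, 27, 19, 30)
now = datetime(2013, 8, 27, 18, 30)
remaining = minutes_until_departure(reservation, 40, now)

if remaining <= 15:
    print(f"Leave in {remaining} minutes to make your reservation -- "
          "there is traffic on the way.")
```

The point of the sketch is only that each input comes from a different service the user already handed data to: the reservation from e-mail, the starting point from the phone's location, the drive time from a maps service.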

I was creeped out.

But a few days later, as I was beginning to pack for a trip to Portland, Ore., Google Now sent me an alert with the weather there. It had noticed the flight reservation in my e-mail and figured I would be packing a couple of days before. That somehow felt considerate.

Now I trust it to tell me whether there is a delay on my route to work (even though I never told it where I live or work), how many steps I walk each month, which recipes I should try, when my e-commerce packages have shipped and whether I need to remember to buy diapers next time I am at the store.

GOOGLE and other tech companies are intimately familiar with this arc of seduction — from distrust to dependence — and take advantage of it to ease us into the future they want to build, said Evgeny Morozov, who writes about the political and social implications of technology.

Remember the short-lived uproar when Gmail began scanning e-mail messages to show related ads? Google Now is simply the next step on that path, Mr. Morozov said.

Inside tech companies, engineers would rather set aside pesky impediments like government regulations, social mores and people’s fear of change.

“Maybe we should set aside a small part of the world,” somewhere like the proudly free-spirited Burning Man arts festival in the Black Rock Desert in Nevada, fantasized Larry Page, a Google founder and now its chief executive, during rare public remarks in June. He imagined “safe places where we can try out some new things and figure out what is the effect on society and what is the effect on people, without having to deploy them in the whole world.”

But Fred Turner, a Stanford professor who studies the ways technology and American culture shape one another, including the influence of Burning Man on Google, said that is a dangerous approach.

“Fantasizing about a place to try technology, where it would just be users and machines, is precisely what needs to not happen,” he said.

The blame lies not just with engineers, he said, but with technology’s users.

“The feeling is it’s really just us and our computer in the room, when really that’s not the case at all,” he said. “Both the engineer and the person are not thinking about all the institutions these devices connect us to. And the creepiness comes in when you start thinking about those institutions.”

Perhaps the solution is not to imagine technology free of societal constraints but to fully engage with all of its messy human implications.

In the recent movie “The Internship,” a comedic romp through Google-land starring Vince Vaughn and Owen Wilson, the gulf between engineers and the rest of the world is a recurring theme. The movie’s climax comes when a group of Google geeks leaves campus and interacts with real people.

By the end, the engineers have come around. The one in charge of Google search pays another character, a nonengineer, a high compliment: “You have a way with people. That’s a lost art.”

Now that is a Hollywood ending.

Claire Cain Miller is a technology reporter for The New York Times.
