Public’s loss of privacy translates into big profits for online companies
Unlike most aspects of the financial crisis, it’s fairly easy to understand the Goldman Sachs debacle. The company put its own interests ahead of its clients’ by encouraging them to invest in a mortgage product that the bank itself had bet would fail. But while politicians, the public, and the media take aim at the shenanigans of Wall Street, a similar breach of public trust is emanating from Silicon Valley amid a wash of money and public ignorance. The currency being manipulated there, though, is worth billions: your personal information.
In April, Facebook launched Open Graph, its newest social platform that aims to turn the web into a more social experience by sharing user information and preferences with other sites. Company founder Mark Zuckerberg heralded its launch as “the most transformative thing we’ve ever done for the web.” He’s right, but not for the reasons he says.
Facebook is chasing Google’s dominance of the online advertising market, which generated $22.9 billion for Google last year alone. With personal information the increasingly valuable currency of the online marketplace, technology companies are scrambling to collect and share personal data with advertisers and other third parties. In the process, your privacy is being compromised, with little public debate or awareness.
Over the past several years, as Facebook and other companies have launched new social tools and platforms, they have followed a fairly consistent rollout model: First, they automatically opt you into sharing your personal information. Then they give you the option to opt out, but only after your privacy has been violated. After all, once information is out on the web, it is impossible to contain. The privacy controls you think you have are thus worthless.
In an effort to mitigate potential public backlash, both Facebook and Google are attempting to convince you that privacy is dead. Last December, Google CEO Eric Schmidt declared: “If you have something you don’t want anyone to know, maybe you shouldn’t be doing it in the first place.” A month later, Zuckerberg said privacy was “no longer a social norm.” But despite what the CEOs say, the public’s desire for privacy is far from dead. A recent study conducted by the University of California, Berkeley, and the University of Pennsylvania found that 88 percent of people have refused to give out information to businesses because they felt it was too personal or unnecessary.
Those who don’t mind sharing personal information with other companies should know that this process can impact their friends simply by association. Two students at MIT last year demonstrated they could predict with 78 percent accuracy whether a Facebook user was gay based on the percentage of his friends who were gay.
Aggregating and publicizing information can, of course, be beneficial to society. Google Flu Trends, for example, can estimate flu activity as much as two weeks before the Centers for Disease Control and Prevention by compiling search queries about the flu. But the “public by default” policy pursued by both Facebook and Google over the last few years benefits the companies, not society.
Without sensible regulatory oversight and an informed public, technology giants like Facebook and Google will continue to whittle away at the institution of privacy while raking in staggering profits. Goldman Sachs and Wall Street lobbied for deregulation so that they could pursue wild-eyed schemes with less accountability. These companies asked us to trust them. Let’s not make the same mistake twice.
- Zeba Khan is an independent social media consultant and writer
Facebook is Betting Against its Users
Note: This piece was originally published at the Huffington Post.
Facebook is at it again: Here are our new privacy settings. Trust us, we will take care of you.
After its privacy practices were roundly criticized by the New York Times, a chorus of users, Silicon Valley insiders, and privacy advocates, Facebook is doubling down on its privacy bet with Open Graph, its new service that makes it possible to share your profile and friends with any website, anywhere, anytime. Forget the prior privacy fiasco of its Beacon service — boy, do we have a deal for you! By clicking the “Like” button, you can let your world — and a zillion marketers — know what you like. It is “open,” so that means it should be good, as in “open source,” what the good guys do to make software a transparent public good. Right?
Hardly. This is a company that does not have your best interests at heart, despite what CEO Mark Zuckerberg said in his recent PR-laced op-ed. Not that sharing information and connecting with online “friends” is a bad thing. But the way Facebook does it has a price: your privacy. The Faustian bargain is simple: you relinquish any effective control over your personal information, and hence your digital life and identity, and in turn you get to do all these cool things.
Privacy-shmivacy. “Who cares?” says CEO Mark of Facebook and CEO Eric of Google. Trust us. To paraphrase Google’s Eric Schmidt, if you don’t trust us, then you were probably doing something you shouldn’t. We want all the information we can get from you because that is how we make our billions. Just let go and go with the flow. Don’t be anti-social. This is social media. Be social. SHARE.
But who can blame them? It is a simple matter of THE BUSINESS MODEL. This is how they make money: selling information about you to “advertisers.” Who would want to deny Silicon Valley entrepreneurs their natural right to make as much money as fast as they can?
But there really is a problem. It is the Goldman Sachs problem, so exquisitely played out in the morality tale of the firm’s fateful congressional testimony last April. You couldn’t script it any better, with the fabulous French “Fab” and a CEO and CFO so embedded in their bubble of self-interest and self-importance that they make Marie Antoinette look like a social worker.
It is the business model, stupid. The more we know about you, and the less you, our customer/client, know about us, the more money we make. We “short” our customers because we are betting that they aren’t savvy enough to protect their own interests; the client, the user, is not so much a customer as a “mark.” Goldman says it has “sophisticated” customers who know the risks. Yeah, state pension fund employees and fund managers who are paid a fraction of Wall Street’s salaries and work with a fraction of its analytic resources, data, and street smarts.
The same asymmetry of information exists in the social media space. According to a Pew poll, 23% of Facebook users don’t even know about privacy settings — period. What about the Power Users of social media? Well, there is no better exemplar of the informed, seasoned, savvy Power User than Andrew McLaughlin, current Deputy Chief Technology Officer of the United States, former Head of Global Policy and Government Affairs at Google, Yale Law School grad, and former Senior Fellow at Harvard’s Berkman Center for Internet and Society. (Full disclosure: that is where I reside as well.) McLaughlin got himself in deep, deep water when he exposed his trusted Gmail account to Google’s “revolutionary” Buzz service, designed to enable many of the same great things that Open Graph promises. Out leaked his contact list to the world at large, and more specifically to eager Republican watchdogs such as RedState.com, who gleefully raised all sorts of embarrassing questions about whether McLaughlin was using his Gmail account for White House business and whether he was inappropriately communicating with his old Google colleagues. There were calls for full disclosure and invocations of the Freedom of Information Act, and comparisons were made to Dick Cheney’s protected list of White House visitors. If McLaughlin had it to do over again, I doubt he would have clicked on Buzz. But there are no do-overs: what is out is out. And that is the point about privacy leaks. Once the damage is done, it cannot be undone. And if an Andrew McLaughlin can be tricked and trapped, there really is no hope for the rest of us.
The issue is not with social media. Social media is great and here to stay. Moreover, as it goes mobile, it will only get more powerful and more useful. But it could also easily become Orwellian through the exploitation of personal information. It could become a means of total surveillance, next to which the costs and impacts of today’s breaches are a trifle. Think medical information, DNA, all financial and commercial transactions, what you do, where you are, and whom you talk to every minute of the day.
The point is that information-marketing companies should not be like those banks and credit card companies that make money by tricking and trapping, by obfuscation, and by betting against their “customers” under the guise of acting in their interest. This is not to say that information-marketing companies should not make money off of social media and customer data. They should. Indeed, by providing the proper safeguards, checks, and balances, more money can be made off of sensitive data, because it will be trusted, more readily shared, and relied upon.
What is needed is a kind of Glass-Steagall Act for the collection, use, storage, and sale of personal data, analogous to the way that act prohibited banks entrusted with safeguarding commercial accounts from also trading on those accounts. Fortunately, the FTC, the White House, the FCC, the GSA, and the DoD, along with several credit service providers, telecom carriers, and others, are showing more foresight in appreciating the importance of user control and the commercial value of trust and privacy than many financial services and social media companies. But even with their efforts, the technology, the market, and the money are moving faster than they are.
Now is the time to get the rules of the road right so that it is possible both to protect and to share valuable and sensitive information. This does NOT require government micro-regulation, but it does require SOME thoughtful regulation: principles and architectures that create the right checks and balances, and incentives that reward a race to the top rather than one to the bottom. One can look to the White House’s National Strategy for Secure Online Transactions, currently under development, as the beginning of a new privacy framework, one that thoughtfully tries to resolve the paradox of having both privacy and sharing in a way that is cognizant of current technology trends and new business models, so as to advance rather than undermine the public interest.
FTC Roundtable Explores Online Privacy
Can there be security and privacy online after the fact?
Can there be security and privacy online after the fact? That was the question posed at a March 17th public roundtable on consumer privacy sponsored by the Federal Trade Commission. The roundtable brought together academics, industry experts and government officials to discuss the challenges of building a secure and authenticated layer for the Internet on top of the original open and trust-based structure.
In her opening remarks, outgoing FTC Commissioner Pamela Jones Harbour cited the recent launch of Google Buzz and Facebook’s rollout of its new privacy settings as well as the 2007 release of Facebook Beacon as examples of irresponsible conduct by technology companies with respect to consumer privacy. In those instances, consumers were automatically signed up for the rollouts or launches and had to opt-out after the fact. “Unlike a lot of tech products, consumer privacy cannot be run in beta. Once data is shared, control is lost forever,” Harbour said.
In its early days, the Internet was used to facilitate communication among a small number of researchers at universities around the country. It was a small, known, and trusted environment. In the decades since, however, its nature has changed dramatically. An architectural layer was built on top, encompassing complex commercial enterprise, social networking, and search functionality. In time, a variety of popular services arose, many of which to this day encrypt only the initial log-in, leaving all subsequent data unencrypted. Experts say this practice exposes consumers to significant risk when they connect to popular cloud-based services over public wireless networks in coffee shops, airports, and other public areas: without encryption, hackers can easily intercept user data. As new technologies and business models continue to emerge, many experts are focused on how such privacy concerns, and the privacy challenges still to come, can be met.
One of the most significant issues in online security is the lack of an authentication layer within the architecture of the web. The current and cumbersome system of using usernames, passwords, and shared secrets is continuously threatened by the possibility of phishing and identity theft. “Personally identifying information can be constructed from non-identifying information,” said John Clippinger, co-director of the Law Lab at Harvard University’s Berkman Center for Internet & Society. “You have to have a user-centric, interoperable system that allows people to control information about themselves and have a chain of trust that can be traced back to the individual.”
The panel encouraged the use of already-developed protocols such as SSL encryption as a first step toward tackling current privacy issues. Looking to the future, several panelists pointed to ongoing work on new types of authentication technology that address the usability of privacy. One such technology is the information card, which allows users to sign into hundreds of websites with a single card and no usernames or passwords. The underlying technology provides a different personal identifier to each website, ensuring that no correlatable identifier is shared across those sites. New identifiers such as the I-Card will give consumers more control over their digital identities, allowing them to decide what and how much of their information is shared with other parties while protecting their privacy.
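The pairwise-identifier idea can be illustrated with a keyed hash: each site sees an identifier that is stable for that user on that site but useless for cross-site correlation. This is only a minimal sketch of the concept, not the actual information-card or I-Card protocol; the function names and the user-held master secret are hypothetical.

```python
import hashlib
import hmac

def pairwise_identifier(master_secret: bytes, site: str) -> str:
    """Derive a site-specific identifier from a user-held master secret.

    Each website receives a different but stable identifier, so two
    sites cannot link the same user by comparing identifiers.
    """
    digest = hmac.new(master_secret, site.encode("utf-8"), hashlib.sha256)
    return digest.hexdigest()

# Hypothetical example: the secret lives with the user's identity agent.
secret = b"user-master-secret"
id_shop = pairwise_identifier(secret, "shop.example.com")
id_news = pairwise_identifier(secret, "news.example.org")

assert id_shop != id_news  # no shared, correlatable identifier
assert id_shop == pairwise_identifier(secret, "shop.example.com")  # stable per site
```

Because the derivation runs one way, a site that stores its identifier learns nothing about the identifiers the same user presents elsewhere, which is the property the panelists described.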
Another topic of discussion was the lack of a clear directive from any regulatory body on consumer privacy protocols for technology companies. Some panelists felt that technology companies are learning harmful lessons from each other’s attempts to push the envelope, encouraging copycat behavior. With the emergence of business models based on aggregating information and making it available, sound business incentives and audit mechanisms will play increasingly important roles. “There’s great wealth and opportunity and things that could happen when you use this information effectively, so you don’t want to sequester it. But at the same time, you want to have governance principles that are enforced quickly, transparently, and effectively that grow with the technology,” Clippinger added. “Otherwise, it will get co-opted.”
The event was the last of three public roundtables sponsored by the FTC to explore the privacy challenges posed by technology and business practices that collect and use consumer data.