It’s business as usual for privacy at the US Chamber of Commerce and Internet Association

With the exception of a call for greater transparency around how companies collect and use data — a growing bipartisan, public-private sector bright spot in the American debate on privacy — the US Chamber of Commerce's ten new privacy principles and the Internet Association's almost identical principles released today reflect long-standing industry hostility towards effective government regulation and privacy more broadly. The principles are mostly an extension of the "trust us to do the right thing" argument industry has been making for years, an argument that has failed miserably.

The Chamber's very first principle, a call to prohibit state laws on the subject altogether, is a not-so-subtle swipe at California's popular new privacy law, which industry fought tooth and nail. While imperfect, the law marked an important watershed in the popular awakening to the abuses and dangers of the current "click here so we can own your data" model. The Chamber goes on to say in this first principle that "the United States already has a history of robust privacy protection," which, in addition to being downright cynical and wrong, signals a new round of opposition to meaningful government oversight or intervention.

Their principle on harm-focused enforcement is another clearly outdated and limited approach, as is the call to prohibit individuals from being able to bring an action based on an infringement of their privacy. Together, they completely marginalize us as citizens and consumers, and ask us to trust the system to work on our behalf.

Meanwhile, the Internet Association has loopholes and doublespeak galore. Almost all references to data rights are bounded by phrases like "personal information they have provided," which often amounts to less than 1% of the data collected or purchased by companies. The coup de grâce: "individuals should have meaningful controls over how personal information they provide to companies is collected, used, and shared, except where that information is necessary for the basic operation of the business…" When the entire business is predicated on advertising or personalized content and services, I'm not sure what is really left.

As a skeptic myself toward most prescriptive government regulations — I’d rather see innovative new tools and business models solve market and societal failures wherever possible — I spent years watching how utterly incapable industry is of reforming itself when it comes to data and privacy. There is simply too much money and power tied to them while all of the negative externalities fall on us as users — a textbook market failure.

That led me, in addition to my startup efforts on privacy, to work on a number of initiatives that helped create the principles and specifics for the new EU regulations known as GDPR (General Data Protection Regulation). These laws, also imperfect, not only aim to curb current abuses; they mandate far greater transparency and provide a roadmap for a fairer, more sustainable data and privacy model built around individuals' rights over how their data is used.

Criticized for stifling innovation, GDPR is actually doing the opposite — it is catalyzing the private sector to start building new services that empower people directly with their data, competing both over how much value they can create for users if given access to their data while also showing what good stewards they can be of that data. It’s turning the “race to the bottom” we’ve seen around data and privacy into a much more enlightened and compelling “race to the top.”

Not surprisingly, the Chamber and most US companies have not been fans of GDPR. The lip service given in the principles to “privacy innovation” is a far cry from the vision and efforts underway in Europe, and nowhere do they reference our rights as citizens or consumers. In fact, as mentioned earlier, they only seek to limit those rights.

The most concerning potential development is the use of regulation shaped by these industry lobbying groups to further entrench their power and disadvantage startups and newcomers. The Electronic Frontier Foundation and others have been sounding the alarm on that possibility, and my read on the recent Congressional hearings with Facebook and Twitter is that this is their new strategy. In fact, the degree to which these privacy principles mimic the principles of GDPR while undermining them at every turn is nothing short of dastardly.

To conclude on a positive note, transparency is the single most important key to addressing the worst abuses around privacy and to unlocking a private sector competition to do right by users and their data. Despite 20 years with the curtains drawn tight around data collection and exploitation by industry, it’s simply un-American to stand against greater transparency — which is why both Republicans and Democrats are in favor of it.

Embracing the Chamber’s and the Internet Association’s call for transparency is the perfect jujitsu opportunity for those of us who want to see a more pro-user, pro-privacy model emerge. The real battle will be over just how far it goes, over how much we truly get to see and understand how our data is collected and for what purpose. Once that genie is out of the bottle, we can expect the private sector to get back to what it does best — creating even more incredible data-driven services that truly meet our needs and interests.

Digi.me going prime time

I had the chance yesterday to speak with Paula Newton on CNN's Quest Means Business. I thought she was going to focus on the Congressional hearings earlier in the day with Sheryl Sandberg of Facebook and Jack Dorsey of Twitter, but she really wanted to understand how digi.me works. She's done quite a lot of stories on how our data and privacy are being abused by the big platforms, so it was refreshing to see her interest in solutions like ours.

We discussed our new app ecosystem, why it’s so interesting for developers, and how we empower people with their data if the data is already “out there” (a question I get all the time). You’ll have to watch the interview to learn more.

It was fun to visit the studio here in Washington. I was in the makeup room with Wolf Blitzer as the news of the mystery New York Times op-ed was breaking. Of course, the first tweet on my interview asked why CNN was talking about privacy and data given the other news. At least I didn’t get bumped!


Trump’s right on this — it’s time to rip open the black box at Google, Facebook and Twitter

 


It's not often that I find myself siding with President Trump and FCC Chairman Ajit Pai on technology policy. As we watch today's congressional hearings with Facebook COO Sheryl Sandberg and Twitter CEO Jack Dorsey — and the "empty seat" for Google, which refused to send a senior executive — they are dead right in their call for greater transparency. The stakes are just too high to continue to allow these mammoth platforms to decide behind closed doors how they collect data about us, filter the content we see and manipulate our decision making. Regulators must act, as they have done in Europe. So too must we as citizens.

I find it unlikely that these companies purposefully bias their search results and content feeds against Trump and Republicans. In fact, most evidence so far of the weaponization of Facebook by outside actors like the Russians and Cambridge Analytica shows that they have more often exploited the platforms to support Trump and his view of the world. But their algorithms certainly contain all kinds of biases that we need to understand, and the lack of transparency not only leaves such concerns unanswerable, it prevents government and us as individuals from responding effectively.

And, make no mistake, these platforms were designed from the start to influence our thinking and behavior. Click by click, terabyte of data by terabyte of data, they track our every move, building sophisticated profiles of each of us to make it easier for content and advertising to reach us. In fact, the first big Facebook breach of trust was an internal Facebook project to see if they could affect a user’s emotions by elevating posts with happy or sad content. They were so proud they published their findings for other data scientists to review. Rather than see the project as a psychological study with human subjects requiring clear consent of the participants, Facebook saw it, as one executive told me at the time, “as what we do every day with A/B testing in advertising.”

It's no accident that Mark Zuckerberg called the challenge confronting Facebook an "arms race" in his op-ed in today's Washington Post. Only the largest of organizations have the resources to even participate in such a vast and expensive exercise, structurally limiting the ability of new companies and ideas to emerge. Sheryl Sandberg's testimony is a laundry list of initiatives Facebook has undertaken recently to address these threats, most of which should have been undertaken years ago, when the company was warned about these problems but chose to ignore them because acting was bad for business. (I, like many others, met privately with Facebook in 2016 to express my concerns while also encouraging them to act publicly.)

The Electronic Frontier Foundation (EFF) and others have rightfully warned that the massive efforts by the big platforms to shape privacy and data policy are designed to ensure their long-term domination, especially around ownership and control of our data. I share this concern, and saw it first hand in Europe five years ago while leading a data initiative at the World Economic Forum. Thankfully European regulators, backed by citizens voicing their deep concerns, managed to hash out a forward-looking set of laws, known as GDPR, that came into effect this past May — predicated on transparency and on users getting access to their data to use however they choose, with absolutely clear consent.

Congress and the Administration must insist on the same here in the United States. There simply isn't any way we can continue on the current path, no matter how much Facebook, Twitter and Google say they can save us. Because "saving us" involves saving their business model, which created the problem in the first place. It's time for new ideas and new solutions.

Facebook ignored recommendations from 2016 internal study on their data and privacy problem

facebook_2015_logo_detail

In early 2016, well after it learned about Cambridge Analytica's massive-scale violations of its user data, Facebook sanctioned an internal study of its approach to data and privacy. Led by its Deputy Chief Privacy Officer, the company convened a series of off-the-record workshops with 175 privacy and data professionals around the world.

Most of us were already well known for our concerns about Facebook's approach to exploiting its vast troves of user data, but agreed to participate with the hope that we might help the company start acting more responsibly. The discussions were candid and hard-hitting. We focused on the ethical and business challenges Facebook would face if it were unable to reform itself. Many of us left encouraged.

Unlike with most internal studies, Facebook curiously decided to produce a public version of the report, which I wrote about in June of that year. You can download a copy of the report here.

Against the recommendations of many of my colleagues, I publicly commended Facebook for such a thoughtful report and highlighted its findings about embracing greater transparency and control of data by users. Many of the ideas centered around new concepts of empowering users with their data and giving them agency over how and when it was used. A number of companies (including my own) were working on tools and business models that made that vision increasingly possible, and it was exciting to see such a decentralized, user-centric model articulated by Facebook.

I knew the findings would be hard for Facebook to implement in the short term, but viewed the report as being an important statement of where the company could go. Facebook was actually well positioned to take advantage of a new collaborative relationship with its users around data. I also sensed that the report represented an emerging, mostly European viewpoint inside the company, and wanted to do all I could to further their cause.

I went so far as to challenge Mark Zuckerberg directly:

“I hope Mark Zuckerberg reads it and internalizes its many good recommendations, especially given the powerful catalyzing role Facebook could play to empower people with data. It’s not just the right thing to do, it would be great for the company’s long-term business (oh, and for that pesky regulatory problem).”

I knew from my interactions at Facebook, including with board members and senior product and policy leaders, that without Zuckerberg’s full support, ideas so core to Facebook’s future would be dead on arrival.

Within a few months, it became clear that the report had indeed missed its mark. Follow-up initiatives were either cancelled or redefined so narrowly that no one wanted to participate. People I reached out to at Facebook who should have known about the report said it hadn't even registered on their radar. When I shared the specifics, they simply responded, "that does not reflect Mark's thinking."

At such a critical moment in the company’s future, I would strongly encourage the company to revisit its own recommendations. While centralized systems and tightly controlled companies can be effective in many contexts, Facebook has simply become too intertwined with how we live our lives to continue to operate that way.

This article originally appeared on Medium at this link.

Today’s Facebook report on personal data & privacy gets a lot right

Is it a wolf in sheep’s clothing or a sign of enlightenment at the world’s largest collector of personal data?

wolf-in-sheep-image

I must admit I was more than a little wary when I was invited by Facebook’s Global Deputy Chief Privacy Officer, Stephen Deadman, to participate in an off-the-record roundtable on the future of personal data and privacy. The involvement of the UK consulting firm Ctrl-Shift helped convince me, given their long-time focus on building transparency and trust in this area. I’m glad I did.

Overshadowed by today's announcement of 500 million Instagram users, Facebook released a report this morning called "A New Paradigm for Personal Data: Five Shifts to Drive Trust and Growth." You can download it here: http://bit.ly/28L4HII or check out Deadman's op-ed here: http://bit.ly/28LMDB9.

I hope Mark Zuckerberg reads it and internalizes its many good recommendations, especially given the powerful catalyzing role Facebook could play to empower people with data. It’s not just the right thing to do, it would be great for the company’s long-term business (oh, and for that pesky regulatory problem).

While much of the report’s thinking has been articulated previously, including by Ctrl-Shift, the Personal Data Ecosystem Consortium (where Personal, Inc. was a founding member), the World Economic Forum’s Global Agenda Council on Data and The Aspen Institute’s Communications & Society Program (both of which I participated in), it matters that Facebook spent its time and energy to convene so many trusted experts — 175 in all across 21 global roundtables — and to publish such a thoughtful and balanced report.

Unlike regulators, privacy and security advocates or most any industry player, no matter how large, Facebook is in a unique position to put the tools directly into the hands of their users and provide powerful direct and indirect incentives for them to start becoming hubs for their data.

In this model, users could re-use their data in a permission-based way, and in infinite combinations, across the entire connected universe at home, work and everywhere in between. It would be the ultimate democratization of data in a fair and transparent ecosystem where individuals actively decide when, where and how to participate in a robust value exchange tied to their data.

So why would Facebook take such a risk when its current business model is built on its ownership and control of user data?

Deadman answers that question in the introduction to the new report:

My observation from the years I’ve spent working on privacy and data related issues is that the personal data debate has been largely grounded in a limiting premise – that the desire to innovate with data is generally incompatible with preserving individuals’ rights to privacy and self-determination.

This premise is entrenched by regulators, policymakers and industry, as we tend to talk in terms of trade-offs, as though these two equally desirable goals will always be in tension with each other, and our only choice is to balance them off against each other.

I firmly believe that such trade-off thinking is undesirable – it leads to suboptimal outcomes – and I also believe it’s unnecessary: we now have the skills, technology and motivation to transcend this supposed trade-off.

He goes further:

The debate also entrenches an assumption that only organisations can control data, ignoring the ability and potential of individuals to take a more active role, exercising agency, choice and control over their own data.

I don’t think the evidence supports this assumption. What is more, when people have more control over their own data, more growth, innovation and value can be created than when they don’t.

It’s this very last point that will win the day. There is simply more opportunity to innovate and create value when individuals are empowered in this way. No single company, or government for that matter, can ever match the competitive advantage of individuals (or teams of individuals) to aggregate and permission access to the constantly growing and changing data from across their lives — including their connected devices.

And those who try to keep the individual out of the equation risk being punished as this new model emerges. Data collection, use and monetization simply can’t be kept behind the curtains much longer. Deadman is right to draw Facebook’s attention to both the opportunity — and the risk — of not embracing the rightful role of users.

There is also a surprising set of security benefits to a model with fewer standalone copies of data in the world, especially when data is shared on a session basis and comes networked with real-time validation and authentication. The future would not only be more secure with this approach; it also happens to be in the interest of the world's largest identity provider.
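As a rough sketch of what session-based sharing with real-time validation might look like in practice, the fragment below issues a short-lived token instead of handing out a standalone copy, and validates every read against the token and its expiry. All class names, parameters and durations here are illustrative assumptions, not any real platform's API.

```python
# Hypothetical sketch: session-scoped data sharing. Instead of giving a
# requester a permanent copy, the holder issues a short-lived token and
# validates it on every read, so access ends when the session does.
import secrets
import time


class SessionGrant:
    def __init__(self, data, ttl_seconds=60):
        self.token = secrets.token_hex(16)       # opaque session token
        self._data = data
        self._expires = time.time() + ttl_seconds

    def read(self, token):
        # Real-time validation: the token must match and still be live.
        if token != self.token:
            raise PermissionError("invalid token")
        if time.time() > self._expires:
            raise PermissionError("session expired")
        return self._data


# The requester only ever holds a token, never a detached copy.
grant = SessionGrant({"email": "user@example.com"}, ttl_seconds=60)
assert grant.read(grant.token) == {"email": "user@example.com"}
```

Once the session lapses or the grant object is discarded, there is no orphaned copy of the data left to leak, which is the security benefit the paragraph above points to.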

In our own business, we are seeing this user-centric model starting to take root inside the workplace by and between employees. The enterprise is one of the few places where the need for individuals to practice active data management and data security is both understood and able to be mandated. It’s probably no accident that the Facebook at Work solution is one of the company’s biggest new initiatives.

The report finishes with grand brush strokes, painting a vision of a race to the top among companies who compete for access to user data based on trust, transparency and the value they can deliver. These market-based solutions have all the elements of the “digital enlightenment” many of us have been talking about for a long time.

For those of you worried that Facebook is simply trying to co-opt this new model before it is even established, or use it as a shield to avoid regulation, I understand the concern. But I really don’t think there will be any going back once it happens. As people wake up and experience the magic of having their data go to work for them, they will never be passive about their data or oblivious to its value again.

While Facebook has a lot to gain by being a leader, it has even more to lose by being seen by its community of users as holding them back. I applaud Deadman and his colleagues for taking such a bold position.

This post was originally published here on Medium.

Data as a Human Right

This post was originally published on the World Economic Forum Blog.

WEF-logo

Data has the power to transform our lives – collectively and individually. What is needed to unlock the profound opportunity data affords to improve the human condition – and to defend against a multitude of threats – is not technical, but an ethical framework for its use by and beyond those who initially collect it, including providing access to individuals.

At its most fundamental level, data about individuals represents a new kind of “digital self” that cannot be easily distinguished from the physical person. Some consider it a form of property; others a form of expression or speech. Those working in the area of genomics often view personal data as the DNA sequences that make us truly unique. Whatever lens one uses, it has become increasingly clear that the consequences of how personal data is used are every bit as real for people and society as any material, physical or economic force.

Properly harnessed by ethical practitioners, the principled use of “big data” sets can improve our economies, create jobs, reduce crime, increase public health, identify corruption and waste, predict and mitigate humanitarian crises, and lessen our impact on the environment. Similarly, empowering individuals with access to reusable copies of data collected by others, also called “small data”, can help them drastically improve the quality of their lives, from making better financial, education and health decisions, to saving time and reducing friction in discovering and accessing private and public sector services. Evidence of the positive impact of leveraging data, by both institutions and individuals, abounds.

However, data, like the technology that generates it, is in and of itself neutral. It can be used for good or ill. With a proper, ethical framework, data can – and should – be leveraged for the benefit of humankind, simultaneously at the societal, organizational and individual level. Misused, its power to harm and exploit is similarly unlimited.

In fact, what raises the ethical use and respect for data potentially into the realm of a fundamental human right is its ability to describe and reveal unique human identity, attributes and behaviors – and its power to affect a person’s, and a society’s, well-being as a result. Just as in the physical world, basic rights and opportunities must be preserved.

Indeed, it is already well recognized that invasions of our digital privacy can be exploited for repression, and that technologies for sharing data can be harnessed to support freedom. More fundamentally, though, we need to extend our core rights themselves into the digital world. For example, we must adapt our notion of freedom of thought to account for the new reality that much of our thinking goes on in digital spaces – as does the management and sharing of our most private information. Preserving individual freedom will now require protecting autonomy with respect to our own data.

Clearly, cultural and regional differences regarding human rights in the analog, physical world are sure to arise in this digital, data-oriented world. We do not seek to resolve those issues, but to develop a clear framework of principles to help provide data, data access and data use the protections they deserve.

The Era of Small Data Begins

This post was originally published under the same title on the Personal blog, A Personal Stand.

This is the first post in a series on the rise of “small data” and the new platforms, tools and rules to empower people with their data. It was written for “The Rise of Big Data” panel at the Stanford Graduate School of Business E-Conference on March 6, 2012.

Big data is big business

More data is created every year or so than has been created in all of human history. In this always-on, always-connected world, where even things are being plugged into the Web, the amount of data is growing exponentially.

The collection, storage, analysis, use and monetization of all that data is called “big data.” Corporations and governments are hyper-focused on becoming big data experts to avoid being permanently left behind. The first movers to master the art and science of big data are already changing the way we live, while disrupting industries and amassing fortunes at speeds never before seen.

Given the stakes, massive investments are being made every year to build the technology and expertise required to succeed in big data, optimized, of course, around the needs of companies and governments, not individuals. Industry experts have likened this big data boom to the early days of “big oil,” and refer to data as the “new oil.” Just as oil was essential to building the modern industrial economy, data has become the lifeblood of the new digital economy.

Companies must learn to compete in big data regardless of their industry, or else face obsolescence. This is a tough challenge and touches all aspects of the operations, strategy and culture of companies. At the same time, opportunities abound as entirely new industries are emerging around data as they did around oil — sourcing, extracting, refining, mining, analyzing, distributing, and selling large sets of data.

Big data creates big problems

With its insatiable appetite for digital bits and bytes on each of us, big data is driving a virtual arms race to capture and exploit information about our every move. Big data will log the life of a child born in 2012 in such a way that the person's activities can be reconstructed not just by the day, but by the hour or minute. In the hands of bad actors, the potential for wrongdoing with these permanent and growing archives of our lives is real and rightfully concerning.

Yet, until recently, people had virtually no idea of big data's existence, as its tools and marketplaces remained largely hidden. The next generation of tracking and data mining technologies is being created based on the assumption that individuals do not care enough to change their online and mobile behavior — an assumption that confuses the current lack of alternatives with a lack of interest.

But with privacy and security concerns now front-page news, and the financial triumphs of companies built entirely from personal data such as Facebook, Google and LinkedIn, people are waking up and starting to ask tough questions. While companies and government regulators negotiate over how to curb the most egregious risks and abuses, a new and more powerful model is emerging that is designed around the needs and interests of people, providing them a far better, more sustainable alternative to the status quo.

Enter small data

Small data puts the power and tools of big data into the hands of people. It is based on the assumption that people have a significant long-term competitive advantage over companies and governments at aggregating and curating the best and most complete set of structured, machine-readable data about themselves and their lives – the “golden copy”. With proper tools, protections and incentives, small data allows each person to become the ultimate gatekeeper and beneficiary of their own data.

Built on privacy-by-design and security-by-design principles, small data can help people become smarter and healthier, and make better, faster decisions. It can help people discover new experiences more easily, reclaim time in their busy lives, and enjoy deeper, more positive relationships with others.

Small data can also greatly improve the capacity and performance of governments and non-governmental institutions, from eliminating time-consuming forms and other inefficient data practices, to improving public health and education by leveraging the power of more accurate and complete data provided with an individual’s permission. Such institutions can also help share important data with individuals, allowing them to have a copy for their own use.

Applied to commerce, small data holds the promise of connecting people with the best and most relevant products and services in a safe and anonymous environment. It can transform advertising into a more respectful, less disruptive industry that rewards people for their time and engagement with their messages and for their purchases. Small data offers customers the opportunity to better balance and assert their interests with companies (some have called this model Vendor Relationship Management (VRM)). Companies who play by these new rules and earn the trust of individuals will be rewarded with access to rich and robust data otherwise unavailable, giving them instant competitive advantages over companies who choose to go it alone.

The first small data platform – a data vault, private network and apps

Personal has spent over two years designing, building and launching the first scalable small data platform. At its core is a secure data vault to aggregate and store structured and unstructured data from just about any source. A private, personal network sits on top to set permissions for data to enter or exit the vault. People are able to connect with other people through the network, and soon with companies, apps, and private or public institutions, and decide which, if any, of their data they are willing to grant them permission to access.

We have put equal weight on privacy and security, and on helping people leverage their own data in exciting new ways. These concepts are inextricably linked in small data, which requires a high degree of trust to function properly. Similarly, we have rewritten the legal rules of data ownership to protect and empower users, whom we call owners. And, because we know relationships can sometimes end, we have built what we believe are the most complete data portability and deletion capabilities of any data platform. Trust doesn't work unless you are truly free to leave.
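The mechanics described above can be sketched in a few lines: a vault that stores data, a permission layer that grants and revokes access per requester, and full export and deletion so the owner is always free to leave. This is a minimal illustration under my own naming assumptions, greatly simplified relative to any real platform.

```python
# Minimal sketch of a permissioned personal data vault: owner-controlled
# storage, per-requester grants, and full portability/deletion.
# All names here are hypothetical, not Personal's actual API.

class AccessDenied(Exception):
    pass


class DataVault:
    def __init__(self, owner):
        self.owner = owner
        self._store = {}    # category -> data
        self._grants = {}   # requester -> set of permitted categories

    def put(self, category, data):
        self._store[category] = data

    def grant(self, requester, category):
        self._grants.setdefault(requester, set()).add(category)

    def revoke(self, requester, category):
        self._grants.get(requester, set()).discard(category)

    def read(self, requester, category):
        # The owner always has access; others need an explicit grant.
        if requester == self.owner or category in self._grants.get(requester, set()):
            return self._store[category]
        raise AccessDenied(f"{requester} has no grant for {category}")

    def export_all(self):
        # Portability: the owner can always take a full copy of their data.
        return dict(self._store)

    def delete_all(self):
        # "Free to leave": wipe the data and every outstanding grant.
        self._store.clear()
        self._grants.clear()


vault = DataVault(owner="alice")
vault.put("health", {"steps": 9200})
vault.grant("fitness_app", "health")
assert vault.read("fitness_app", "health") == {"steps": 9200}
vault.revoke("fitness_app", "health")   # access ends the moment consent does
```

The design point is that permissions, portability and deletion live in one place under the owner's control, rather than being scattered across every company that ever collected a copy.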

In addition to launching our own apps in the coming months, we are inviting developers to apply for early access to build apps on our platform to show off the power and benefits of small data. Individuals have never imagined the magic of running apps on reusable, structured data about the most important things in their lives, and developers have never assumed they would have access to such high-quality data on which to innovate. The possibilities are limitless.

We are excited to help usher in this new era where permission, transparency and privacy become the norm, and where companies and governments have to align around new rules and provide clear and compelling benefits in order to earn access.

At Personal, we see the future through the lens of small data — and we think it will change everything.