Data trusts and data trust

Trust and integrity are key to nonprofits. They trade on these virtues. It's no accident that "trusts" is the name of one type of nonprofit enterprise. The defining aspect of the nonprofit corporate structure - the non-distribution constraint on the use of financial assets - codifies the use of those assets for mission, allowing the public to trust that the organization will be true to its social purpose.

In the 21st century, nonprofits will need to engender that same kind of trust regarding their use of digital assets (otherwise known as digital data).

This is a tremendous opportunity for the sector. Earning and keeping the trust of all (data) donors could become a defining quality for civil society organizations and help distinguish them from commercial enterprises and public agencies. Currently, many commercial operations and the government are trampling on the trust of their customers and constituents. Headlines from just this week:

- Uber: "Whose Privacy Will Uber Violate Next?"
- Class Dojo: "Privacy Concerns for Class Dojo and Other Tracking Apps for Schoolchildren"
- Government: "Survey: US Adults Feel They Are Losing Control of Their Data"

Nonprofits and philanthropy - all of civil society - should be using data in line with their missions and designing their organizational practices and policies with an eye toward earning, keeping, and sustaining the trust of the public. Good digital data governance policies will be key. There are early signs that "data trusts" will emerge as a new type of enterprise - but all civil society organizations should be working to maintain trust regarding data.

Philosophy Talk: Digital Activism

My laugh is not nearly as engaging as Tom Magliozzi's of Car Talk, but I'll do my best on December 14, when I'll be talking about digital civil society on Philosophy Talk. Here's the write-up about the show:

“Cyber-Activism” with Lucy Bernholz
Whether it’s making donations and signing petitions online, or using
social media to highlight political causes, cyber-activism has never
been easier. With a few clicks, we can make our voices heard around
the globe. But who’s listening, and is anything actually changing?
Does cyber-activism mobilize real-world action on the ground? Or does
it reduce political engagement to simple mouse-clicking, and
ultimately threaten the subversive nature of change? John and Ken get
active with Lucy Bernholz, co-author of "Disrupting Philanthropy: Technology and the Future of the Social Sector."
Tickets are available for the live show here. If you're not in the Bay Area, Philosophy Talk is broadcast on public radio stations around the country and available on the web.

New Power (or lessons for business from the social economy)


Henry Timms (founder of #GivingTuesday and my colleague via Stanford PACS) and Jeremy Heimans have a new article in the December issue of Harvard Business Review called "Understanding New Power." In it they discuss characteristics such as co-ownership and participatory governance, and they highlight values of the new power such as "opt-in decision making" and "open source collaboration."

In the requisite 2x2 matrix (this is HBR, after all), the precious terrain of the upper-right quadrant includes a mix of movements (Occupy), nonprofits (Wikipedia), benefit corporations (Etsy), and commercial enterprises.

In other words, several of the institutional forms that constitute what we've been calling the social economy embody the characteristics and values that Timms and Heimans pinpoint as a new type of power. Go read it and see what you think.

Apps and Ethics

I just got an alert from a trusted friend* to the existence of an app, Radar, designed to alert you when social media accounts start showing signs that your friends are in distress. The app is intended to help friends help friends in need. It was launched by Samaritans, a suicide crisis line in the UK.

But it's set off a (rightful) alarm about surveillance, privacy, and algorithmic alerts. To work, the app needs to constantly monitor all your accounts, infer emotions from content, and alert you if someone you follow is determined to be "in need." Problems abound - let's look at a few:

1. Not everyone who might follow you is necessarily your "friend." Many are probably bots. Worse, some may be stalkers.
2. Algorithmic determination of emotional states? The risk of false positives or negatives seems rather high. The app's own website notes that it's in beta and "won't get it right every time." Suicidal ideation on social media platforms full of trolls and troublemakers hardly seems like the place to take this chance.
3. Constant monitoring of all the accounts you follow means that consent is never asked of those whose accounts it's reading. And the app is storing data - does it need to?
I'm sure the app is well-intentioned. But practices around privacy and consent are precisely the issues that civil society organizations need to get right. This one seems to get them wrong.

*Thanks, Ben!

New questions for nonprofits and philanthropy

Among other things, the digital age is bringing us new kinds of nonprofits. I've been talking about this for several years, using examples of the Internet Archive, Mozilla, Creative Commons, and Wikimedia Foundation as "anchor institutions" of digital civil society. Each of these organizations is at least a decade old and each one exists to protect and promote some form of digital asset. If there weren't digital data and infrastructure, none of these nonprofits would (need to) exist.

There are other examples, newer ones, working on newer versions of shared social challenges in the digital age. One of those challenges is privacy. The founder of Privacy International announced today that he will launch a new organization, Code Red, in 2015, focused on protecting human rights advocates and whistleblowers in the digital age. This is an example of a social mission particular to the digital age.

Philanthropy is also challenged by attributes unique to the digital age. Take something like this effort to donate satellite imagery. How do we donate something and still own it, which is what happens with digital data? Who owns the data that get donated? Is it really a donation, or more of a loan? What licensing restrictions will make sense for the donated data? Who is liable for a use of the data that puts someone in danger? We've had answers to these questions when it comes to donating time or money; we need new answers for donating data.

There are also new challenges for longstanding social sector organizations. Domestic violence is one area where the dangers of digital surveillance are keenly felt. There are tools custom-built to facilitate stalking, and off-the-shelf digital capacities (find my phone, for example) that make tracking people much easier.

What we're facing are questions of how to obtain the public benefit (new medical breakthroughs, new datasets that can inform poverty eradication efforts, whole new resources like up-to-date satellite imagery) of these digital tools without compromising or endangering people. I don't think the math behind this is going to be as simple as weighing one kind of benefit (public) against another (private) - it's going to be some form of multivariable calculus that includes issues of consent, ownership, liability, perpetuity, privacy, and security.

These are the questions that interest me - the ones that represent fundamental shifts in how civil society, nonprofits, and philanthropy work. They're much more interesting and important than the latest fundraising challenge on social media.