Dan Gillmor · http://dangillmor.com · “Just in case you were still wondering…”

Athens
Thu, 22 Sep 2016 · http://dangillmor.com/2016/09/22/athens/

It’s a joy to be back in Athens, where I’m doing a keynote talk, a panel, and a workshop at a journalism conference created by the Open University of Cyprus. CNN Greece, which is digital-only (CNN’s first experiment of this kind), came by my hotel yesterday to chat about the future of media (and asked about the presidential race). This photo is from the roof cafe. Quite the set…


Photo by Noriko Takiguchi

Indie Web Camp
Sat, 27 Aug 2016 · http://dangillmor.com/2016/08/27/indie-web-camp/

I’m at the IndieWebCamp — a meeting of people who believe in “a people-focused alternative to the ‘corporate web'” — in New York City. This is a small but vital movement aimed at restoring (some) control of our data and communications to the people who create it at the edges of the countless networks that comprise the Internet.

I wrote about the Indie Web a couple of years ago, and it’s good to catch up with the impressive progress since then.

Witnessing, activism, journalism — and the boundaries of free speech in the Facebook age
Tue, 12 Jul 2016 · http://dangillmor.com/2016/07/12/witnessing-activism-journalism-and-the-boundaries-of-free-speech-in-the-facebook-age/

Diamond “Lavish” Reynolds changed our perception of media last week with her shocking and heartbreaking real-time web video of the last minutes of Philando Castile’s life. The couple, with her daughter riding in the back seat of their sedan, had been pulled over by local police in a Minneapolis suburb, and Reynolds had the astonishing presence of mind to send the aftermath of Castile’s shooting by a police officer — which included her arrest by cops who didn’t even try to save his life — to the world via Facebook’s “Live” video platform.

Countless articles, analyses, commentaries, and other posts have chronicled a media shift in those moments. The implications are real, and important. We are only beginning to confront the issues they raise. Among them:

  • Is this a revolutionary, or evolutionary, change in media creation, distribution, and access?
  • Does it represent a turning point for citizen journalism?
  • What responsibilities do you and I have in situations where we can witness important events and behavior, and where might that lead?
  • What can we trust, and what should we share?
  • Facebook seems to have been caught almost unaware of the likely consequences of offering a real-time video platform. Do Facebook and other centralized distribution platforms have editorial duties to perform?
  • More broadly, who will control what we can create and see in coming years? Facebook? Government? Or you and me?

Reynolds’ video prompted me to revisit something I wrote more than a decade ago, in my book, We the Media, which discussed the then-nascent idea of radically democratized media and one of its important offshoots, citizen journalism. I asked my readers to recall the media environment on Sept. 11, 2001, and then peer into an easily predictable future.

Our memories of that awful day stem largely from television: videos of airplanes slamming into the World Trade Center, the fireballs that erupted, people falling and jumping from the towers, the crumbling to earth of the structures. Individuals with video cameras captured parts of this story, and their work ended up on network TV as well. The big networks stopped showing most graphic videos fairly quickly. But those pictures are still on the Net for anyone who wants to see them.

We also learned, second-hand, that people in the airplanes and Trade Center towers phoned loved ones and colleagues that awful day. What would we remember if the people on the airplanes and in those buildings all had camera-phones? What if they’d been sending images and audio from the epicenter of the terrorists’ airborne arsenal, and from inside the towers that became coffins for so many? I don’t mean to be ghoulish, but I do suggest that our memories would be considerably different had images and sounds of that kind ricocheted around the globe.

Since then, a number of technologies (and uses of those tools) have become much more common. One of them is live-streaming, now so routine that we take it for granted as an offshoot of traditional broadcasting. Live-streaming from mobile phones has been around for some years, too.

In that context, Reynolds’ live video was anything but revolutionary. It was a logical extension of what came before. But the velocity of change is accelerating, and what she did had big implications.

Her video was a three-faceted act: witnessing, activism, and journalism*. Even though few people saw it in real time, she was saving it to the data cloud in real time, creating and — one hopes — preserving a record of what may or may not be judged eventually to have been a crime by a police officer. What Reynolds did was brave, and important for all kinds of reasons.

She also taught the rest of us something vital: We all have an obligation to witness and record some things even if we are not directly part of what’s happening. That’s what two people did as they captured videos of the killing of Alton Sterling in Baton Rouge, Louisiana, last week. They understood their duty when it comes to holding accountable the people we rely on to protect the public in honorable ways. (I still believe that the vast majority of police officers are honorable and trying to do their jobs right. But there’s also no question in my mind that the majority of officers at least tolerate the bad cops who are doing such harm to the reputation of law enforcement, and helping poison public trust.)

At this point I’m convinced, as Ethan Zuckerman says, that we have an obligation to use our cameras in these situations, among many others. These are times when a video record of what happened may not provide absolute clarity, but at least it can provide data. It may also deter the worst kinds of behavior by public officials in the line of duty — especially as governments that adopted body cameras for police then pass laws designed to prevent the videos from reaching the public.

I also worry, as I did in my book (and long before) about Big and Little Brother becoming the default. So we’re going to have to draw lines, individually and (hopefully) as societal norms: Some things we see, we get the video and post it. At other times, we may get the video, but we’ll just delete it. And we have to make it second nature to realize that some — most — things shouldn’t be captured at all. Pervasive surveillance by law enforcement and/or the rest of us chills free speech and assembly, ultimately deadens us.

We’ll also have to learn, individually and collectively, what we can trust. This takes practice, because the online world is awash with deceit and lies along with honor and truth. It takes practice by news organizations not to be faked out, but even more so by the rest of us, because we, not journalists, have to learn to be the final arbiters — and we have to do this collectively, because like it or not, our news organizations are demonstrating in general that they’re not up to the job. I hate saying that, but there it is.

This is why I spend so much time lately teaching “media literacy,” which asks the former audience — still consumers but also creators — to be active users of media, not passive readers/watchers/etc. This is, in the “consuming media” process, about being skeptical and using judgment; understanding our own biases and working to challenge them; listening to others who may disagree with us; asking questions; waiting before trusting what we see; and so much more. It’s also about recognizing our role as media creators. As we wield our cameras we are obliged, if we want to be trusted, to be honorable.

One element of danger for the citizen video maker — being challenged or arrested or worse by people in authority who don’t want you capturing what they do — is lessening. In fact, the “war on photography” by police and others in power could soon be moot, for several reasons. In the United States, at least, courts are increasingly recognizing a First Amendment right to capture videos of police in their public role. This won’t stop cops from breaking the law, as officers sometimes do by confiscating phones and deleting photos and videos they find objectionable. (Police departments and the politicians they report to don’t mind paying taxpayers’ money to plaintiffs who sue after abuses.)

Meanwhile, technology is reaching a point where police soon won’t realize they are being recorded. It’s been possible for years to buy cameras that become part of our clothing. Google Glass made people realize how trivial it will be to embed cameras in eyewear. Soon enough, we’ll be able to capture videos simply by looking at something; Google, Samsung and Sony (and certainly others) are working on camera/recording devices embedded in contact lenses.

Again, this technology will be used for bad purposes we can easily imagine. And that will inevitably lead to moves aimed at preventing those uses, which in turn threaten free speech and other essential liberties.

Surely the authorities are delighted to hear of Apple’s new patent that lets police (and presumably others, such as big-time musical acts and movie theater owners) block iPhone recording “in areas where picture or video capture is prohibited.” From a company as control-freakish as Apple, it’s no surprise to learn of such a thing. From the context of free expression, it’s potentially catastrophic — and your fears should grow in a world where huge, centralized companies, often working closely with governments, become the venues for expression.

We need to ask now, not tomorrow, who controls the media we create and consume. Increasingly, it’s not us.

Perhaps smart people will find ways around phone makers’ constraints. (I assume they will, actually.) But what will we have gained when we take videos of newsworthy events if the videos are then disappeared by Facebook or Google or Comcast or other giant platforms and telecom carriers?

Facebook is the most immediate threat, because it has become the default venue for conversation, and for news. It is also visibly unprepared for this role. Facebook hasn’t given a plausible explanation for its initial removal of Reynolds’ video soon after she posted it. Perhaps, as some smart observers suggest, the video was flagged by other Facebook users, prompting an automatic takedown while the company decided what to do about it (it went back up). Or perhaps the police who confiscated Reynolds’ phone took it down. Or perhaps Facebook itself decided initially to remove it. Or none of the above. The point is that the video remains visible because Facebook allows it to be visible. (Of course, in this highly visible case the video surely has been saved elsewhere and would be immediately reposted online if Facebook decided to remove it.)

The company’s policies on what videos — Live or not — and other material can stay online are incoherent. So, for that matter, are the policies at Twitter, Google’s YouTube and other user-created platforms. This is understandable, though obviously not good. At some level Facebook has no alternative but to make on-the-fly and contradictory, even hypocritical, decisions. But as Margaret Sullivan of the Washington Post has observed, Facebook has to recognize that it is “in the news business.” It’s making editorial decisions. So are the other platforms. I’ve called them “the new editors of the Internet,” and much as I wish that weren’t true, it is.

But Facebook is the behemoth, and the one making the key decisions at this point. This is wrong in so many ways. It’s enormously dangerous that a single, hugely powerful enterprise can decide what free speech will be. I don’t want a few people’s whims in Menlo Park overruling the First Amendment and other free speech “guarantees” (in quotes because those assurances are worthless in many other countries). So I don’t use Facebook for my speech. I’m posting this, among other places, here on my own website.

But I’m just one person, and approximately 1.6 billion other people have made a different choice. I hope they’ll reconsider someday, but I’m not counting on it.

At the very least, as Facebook becomes what amounts to a “common carrier,” we’ll need to treat it like one under the law. The government can’t stop people from saying anything they choose on the phone. This has to apply to companies like Facebook, or they will have far, far too much power over freedom of speech and assembly. Yet asking the government to intervene brings its own risks, which are visible in many other parts of the world where governments routinely order social media companies to disallow certain speech.

The answer, or part of it, is what World Wide Web creator Tim Berners-Lee has called re-decentralization. A few weeks ago I spent several days with technology pioneers and young activists who want to save the Web and, by extension, the wider Internet from being controlled by a few centralized entities.

While they’re working on this, we should be experimenting ourselves with tools that don’t require us to rely on Facebook et al. For example, the capabilities of Facebook Live have been available via a project called “Rhinobird”, which uses open Web standards including WebRTC. And as projects like the “Interplanetary File System” take root, we’ll be able to use URLs as names of web content, not addresses.
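The shift from location-based to content-based addressing that IPFS represents can be illustrated with a short sketch: content is named by the hash of its bytes, so the name verifies the content no matter which host delivered it. (Plain Python follows, with SHA-256 standing in for IPFS’s multihash-based CIDs; the `put`/`get` functions and in-memory store are hypothetical, not IPFS’s actual API.)

```python
import hashlib

# A toy content-addressed store: content is named by its own hash,
# not by the server that happens to host it.
store = {}

def put(content: bytes) -> str:
    """Store content under the hash of its bytes and return that name."""
    name = hashlib.sha256(content).hexdigest()
    store[name] = content
    return name

def get(name: str) -> bytes:
    """Fetch content by name, verifying it actually matches the name."""
    content = store[name]
    assert hashlib.sha256(content).hexdigest() == name, "corrupted content"
    return content

name = put(b"Hello, decentralized web")
# The same bytes always yield the same name, wherever they are stored.
assert put(b"Hello, decentralized web") == name
assert get(name) == b"Hello, decentralized web"
```

Because the name is derived from the content, anyone holding a copy can serve it and anyone fetching it can verify it, which is what makes the scheme decentralization-friendly.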

These projects, and many others, are inspiring. I’m going to do whatever I can to help them succeed, because the stakes are so high — for free speech and so much more.

***

*The Reynolds video broke somewhat new ground in citizen journalism, which came to notice a decade ago. Citizen journalism got trashed, early on, by just about everyone in the traditional media world, and the flaws in the concept were certainly clear enough after a number of small and large “false news” debacles. But it was always important for its potential, and the countless cases where it was an essential part of the news flow more than made up for the downside. If nothing else, the act of witnessing — directly and not through intermediaries who may miss the context or the meaning — grew into its own media form. The more we saw videos of police misconduct, for example, the more white Americans had to understand why black Americans feel they’ve been living in a different country.

 

Decentralized Web Summit — Day One
Wed, 08 Jun 2016 · http://dangillmor.com/2016/06/08/decentralized-web-summit-day-one-2/

(I’ll be updating this regularly during the day. New stuff will be at the bottom of the post, not the top, on the principle that most people reading this will read it only once or twice. Maybe I’m wrong, but that’s the method in my madness. For great to-the-moment info, follow Kevin Marks on Twitter.)

After an amazing Builders Day–a gathering of technologists who talked deep code about the potential to re-decentralize the Web and the larger Internet–it’s the official start of the Decentralized Web Summit. Brewster Kahle, founder of the Internet Archive, convened the event and the archive is hosting it. (My blogging from yesterday is here.)

Note: You can watch a live stream of the event.

Mitchell Baker, executive chair of the Mozilla Foundation and Corp., is launching the day. She has three guiding principles:

  • Immediate: safe, instant access to content via a universal address, without the need for an install.
  • Open: anyone can publish content without permission or barrier, and provide access as they see fit.
  • Agency: the user agent can choose how to interpret content provided by a service offering.

It’s not about a particular technology, she says. It’s about much more than that.

Vint Cerf, one of the genuine originators of the Internet, is calling himself the “chief Internet evangelist” in the room. Fair enough. (More below…)

He’s here to talk about a “self-archiving web”–and starts with some Internet lessons:

  • Collaborate and cooperate
  • Open design and evolution process.
  • Anyone can join if they follow the protocols.
  • Room for multiple business models.
  • Modular design, layered evolution.
  • E pluribus Unum

He’s thinking about archiving. Traditionally it’s meant storing a document somewhere. But “think about what we do with software,” with versioning capabilities. The Web is a “complex reference structure,” he notes, and archiving isn’t a trivial issue; it’s really hard.

The Internet Archive takes snapshots. Yet the web is a constantly changing entity or collection of entities. Not only do pages change rapidly, but they look different to different users.

“The web can hardly contain itself,” he notes. Links deteriorate (can we have a permanent link system, as Tim Berners-Lee has called for?). HTML rendering is a challenge because HTML itself changes. Then there are permissions, access controls, and copyright restrictions.

He suggests the Google Docs editing/storage/synchronization process may be a useful way to think about the challenge, especially in the automatic, cooperative replication of pages. Maybe Pub/Sub (publish/subscribe) can work? A definite need is metadata–information about information–and lots of it.
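A minimal sketch may make Vint’s pub/sub suggestion concrete: a page publishes each new edition to its subscribers, and an archive subscribes and keeps every version it is told about. (This is a toy illustration in plain Python; the class and method names are hypothetical, and a real system would use a protocol such as WebSub rather than in-process callbacks.)

```python
import hashlib

class Page:
    """A publisher: notifies every subscriber each time its content changes."""
    def __init__(self):
        self.subscribers = []
        self.content = b""

    def subscribe(self, callback):
        self.subscribers.append(callback)

    def publish(self, content: bytes):
        self.content = content
        for callback in self.subscribers:
            callback(content)

class Archive:
    """A subscriber: keeps every version it sees, keyed by content hash."""
    def __init__(self):
        self.versions = []

    def snapshot(self, content: bytes):
        self.versions.append((hashlib.sha256(content).hexdigest(), content))

page = Page()
archive = Archive()
page.subscribe(archive.snapshot)
page.publish(b"first edition")
page.publish(b"second edition")
assert len(archive.versions) == 2  # every edition was archived automatically
```

The point of the pattern is that archiving happens automatically upon publication, which is exactly the first property Vint lists below.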

We’ll need software libraries to handle a lot of this. And we’ll need the ability to run old software by emulating old hardware.

How will we surf the self-archiving web? Multiple sources and methods of access, to start.

Vint notes that print publications have editions, which are snapshots. A question, then, is how often we should make snapshots of things in the future.

He lists some useful properties of, and issues raised by, a self-archiving web:

  • Automatic archive upon publication.
  • Do we sign up for this? Cost? Who pays?
  • How do rendering engines and permissions work?
  • Filter malware?
  • How much fidelity to the original? Should links persist? Or just a surface-level look?
  • What’s the vocabulary for all this?

Are these official records? Once it’s archived, is a page “an indelible and unalterable instance” (lawyers, start your subpoena engines)? What about encrypted content? Can we put access controls on individual pieces; e.g., let people see it after 25 years?

What’s the role of containers? Virtual machines are sandboxes, but containers can interact with underlying OS and communicate with each other. Interesting potential.

A question about mobile, which is degrading the web as apps (read: Facebook for the most part) become a primary access-to-info method. Vint isn’t sure he knows the answer. He observes that Android apps now run natively inside the Chrome OS. This isn’t a fix, he acknowledges.

The app space, he says, is almost out of control: too many and an expectation/hope among developers that their specific app, which does one thing, will be what people want and use. He definitely wants Internet of Things not to go there. Web apps may be a useful alternative, an underlying structure. (He semi-ducks a question about business models.)

What about things in archives that we don’t want (right to be forgotten)? In order to remember to wipe something out of the index, we have to remember it, he says, or else it might come back. It’s not just stuff we generate; it’s the stuff other people generate. He suggests we need better social practices, and says there’s no simple ethical solution on the horizon.

***

Tim Berners-Lee, the actual creator of the World Wide Web, is up to discuss some of the strategic issues involved in re-decentralizing the web, something he’s advocated repeatedly in recent months.

“The objective was to make something that worked,” he says, connecting a variety of systems and knowledge. He talks about early standards and protocols, and how they had a way of helping get information from here to there. A way to locate pages was part of the process. HTTP was kind of a way to combine the idea of “headers” (e.g. SMTP) and HTML content. Web addresses (URLs) were another vital building block.

One result: anybody could publish and put ideas out there, and anyone could find and link to them. We had a web where we could have good, smart discussions.

He observes the vast creativity that has emerged since then, to turn the web into the huge thing it is where “you can do anything you like”–a fountain of innovative work.

The silo-ization of the web, into silos like Facebook etc., worries him a great deal. People talk in one silo, have pictures in another, and they’re frustrated. The idea that everyone can participate with their own domain, server, etc., is still true but less relevant. And advertising has become the favored way to make money.

This isn’t optimal, he says. If someone claims it is, that’s a myth.

He wants to bring back some of the early ethos, with our data and conversations no longer living inside other people’s silos–bring back a truly decentralized web. He’s working on a project called SOLID, a collection of ideas and technologies aimed at re-decentralizing, in part by separating data from the things applications do to the data. He’s not happy about using the current domain name system, but alternatives are, at the moment, not reliable enough.

Don’t think of URLs as places, he advises. Think of them as names of things you want to see and use. (Big benefits can derive from that including security.)

Again, he says how frustrated the silos have made him. He’s “excited” that we can re-decentralize the web.

How do we persuade non-technical people to not be siloed given how hard it is to use truly decentralized tech? “It’s a really important problem,” he says. He wants “really great designers” to be part of this process. PGP (encryption) is great software, for example, but horrible from a user-experience point of view–and almost no one is working to improve that.

Several follow-up questions are quite technical. Kevin Marks is translating well into English (link at top).

***

Brewster Kahle is up next. He wrote a pivotal essay last year called “Locking the Web Open,” which helped me crystallize some of my own thoughts about what is happening.

“The way we code the web will determine a lot about how we live our lives online,” he says.

The web is an essential system but it doesn’t have some of the basic structure that preserves privacy, liberty and so much more that we need, he says. And it’s not available everywhere, due to censorship (and digital divide issues).

It’s an ephemeral medium, he says. And it’s being used to spy on people.

Is it reliable? Sort of. Is it private? No. Is it fun? Yes. We get one out of three at the moment. We need all three: reliable, private, fun.

He distinguishes the web from the Internet. A key nature of the Internet as a whole is reliability through resiliency. (“Five guys locked into a room for a year” made it happen.)

He discusses Amazon’s cloud: a decentralized system under one owner’s control. He’d like to make AWS features available to everyone in the world but not under Amazon’s control.

“We want to make it reader-private,” so you don’t fear spies (and others) doing things based on what you read. Writer-private is easier than reader-private.

We also need to think media and how creators can more easily get paid.

Brewster is jazzed about how far JavaScript (JS) has come. Not only can we run old platforms (e.g. DOS) in our browsers–a 15-year-old computer living in your browser–but we can use browsers as computing platforms in their own right.

Encryption: We won that war in the 1990s, but we use it mostly for online commerce. We can do a lot better, he says. We can use crypto in our browsers in new ways. Blockchain could be part of it as well. Ditto peer-to-peer in new ways.

His proposal: “WordPress, but decentralized.” (Matt Mullenweg should be here…)

Could we do this? Goals:

  • Normal browser, no download or plugin.
  • Good names, e.g. http://brewsterblog.dweb
  • Good performance.
  • Fun to post/comment
  • User IDs with different roles
  • Payments/tips
  • Archives/versions

With JS as an operating system, it already works at the browser level, he says. He’s demo-ing on a distributed system I mentioned yesterday: IPFS. It worked. Amazing…

Easy names? There are people in the room who are working on this.

Performance needs ISPs and CDNs. Get the hashes closer to users.

Updates need to be decentralized. Mutable torrents, and other possible solutions.

Identity is complex. Bitcoin’s system is a possible model, maybe the basis. Bonus: tipping and commerce included.

“We can have WordPress, but decentralized,” he says. “A lot of the pieces actually exist.”

“We can bake the First Amendment into the code itself.”

**

Kevin Marks is moderating a panel on P2P (peer to peer), with people who are doing it in ways that are as far ahead of Napster as JavaScript is ahead of its earliest incarnation. (That would be way, way ahead.) I’ll be posting short videos with most of them, recorded yesterday at the “Science Fair” portion of the program.

Zooko Wilcox, from ZCash, a censorship-resistant digital money system (he’s also doing Tahoe-LAFS).

David Dias is part of the IPFS (interplanetary file system, an amazing achievement in decentralization).

Gavin Wood’s Ethcore is working on ways of improving and expanding on open-source blockchain systems.

Feross Aboukhadijeh works on WebTorrent, a client that runs natively in the browser. BitTorrent launched a hugely successful protocol, but it’s used via applications you install. He’s putting the torrent protocol into the browser directly. This has potential for all kinds of data sharing in highly useful ways.

A conversation about whether GitHub is centralized or decentralized. (The answer seems to be Yes.)

Apps are winning on mobile, but browsers are getting more competitive as the standards process (always slower than what a private company can do, says Aboukhadijeh) proceeds.

Wilcox: Facebook is winning in part because it can exclude people and add-on services. This can be good for the user experience.

A question: Who pays for all this? Several speakers cite the payment possibilities inherent in blockchain technology. Also, says Wilcox: With decentralization there are fewer risks from current hosting operations.

***

Chelsea Barabas from the MIT Media Lab leads a panel on naming and identity.

She asks for an anecdote illustrating the problem.

Jeremy Rand (Namecoin): Certificate authorities that have the power to tell us what websites are authentic can be incompetent, compromised or outright corrupt. If criminals or governments can impersonate a website, that’s “problematic.” Namecoin repurposes Bitcoin into a naming system; could it be a replacement for DNS? Maybe.

Joachim Lohkamp (Jolocom) is working on decoupling data from application, and providing authentication and control of data to users.

Muneeb Ali (Blockstack) asks you to consider how many trust points you navigate in launching Bitcoin’s website from your computer. A lot: more than 10. Something is wrong with that. Blockstack (which is also decentralizing the DNS) wants to fix that.

Christopher Allen has been in this field for many years. Centralization “has a way of creeping in,” he says. He’s working on projects that help the underprivileged–including refugees–have identity systems that give them more rights.

Mistakes from the early days of the web still haunt us, but there were some successes, too. TLS beat back Mastercard and Visa, not a small achievement, notes Allen. Microsoft tried Passport, but no one wanted it to control even a federated system.

Twitter/Facebook/etc. single sign-on systems are using what had been decentralized tech but don’t allow others to use it with them. A one-way process.

How do we break the cycle?

Lohkamp again stresses decoupling data from applications (Facebook the example of tightly coupled).

Ali: The biggest change is Bitcoin, which solved a hard problem, namely trust, via a neutral playing field where identity can be created and verified.

Rand: Lots of attempts to replace certificate authorities, with no success. He, too, likes Bitcoin’s ability to evade third-party control.

***

Cory Doctorow has been thinking as long and hard about technological control-freakery as anyone.

He starts with advice: Use the will-power you have now, when you’re strong, so you’ll not do the wrong thing when you have a moment of weakness. It goes to “how we build things that work well and fail well.”

The web is closed today because, just like everyone, its builders make compromises. Little compromises, one after another. We discount the future costs of present benefits.

Make it more expensive. Take options off the table now.

Pressure on browser vendors and other tech companies means they won’t block Google Docs. But the GPL is locked open, incapable of compromise, and because it’s indispensable it’s being used. Hence Linux, which uses the GPL.

Cory talks about a variety of projects that are more and less open and free in all senses of the word.

Systems that work well and fail badly die in flames. GPL is designed to fail well. No one wants to take the risk of suing and setting a bad precedent.

He turns to DRM, which he’s trying to destroy (in a project with the Electronic Frontier Foundation), pointing out that it doesn’t work well but that the Draconian DMCA (Digital Millennium Copyright Act) is a powerful weapon deterring people from tampering with it–including security researchers.

DRM has metastasized, he says, to control how purchasers can use or fix all kinds of products including cars. It’s in, um, rectal thermometers. Now it’s in web browsers, to prevent some videos from playing or being turned into screencasts.

No legislature has banned what companies are banning to stop user conduct. Worst of all, he says, is “turning browsers into no-go zones” and the impact on security research. When the W3C did this, it compromised in a small but terrible way, he says.

It’s the sum of a million tiny compromises. We thought if we refused to compromise, others down the road would do it.

Companies frequently abuse standards bodies to achieve control (via patents), he says. But W3C has a good policy on patents; they can’t be used in the process.

How do we keep DMCA from colonizing the open web? We participate in the open web. Take the control systems off the table, now, he says.

The EFF has proposed this at W3C, and in proposals for DMCA exceptions for medical implants.

We can go further, he says. Law is required. So is giving each other support.

We have to agree now what to take off the table, to prevent tomorrow’s compromises. Two principles:

One: When a computer receives conflicting instructions from its owner and a third party, the owner always wins. Systems should be designed so remote instructions cannot be followed without the owner’s consent. Otherwise, among other things, you create security risks.

(While we’re at it, throw away the Computer Fraud and Abuse Act).

Two: Never give corporations or the state the power to silence people who find flaws.
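Cory’s first principle amounts to a simple gate in a device’s design: every remote instruction must pass an owner-consent check before it runs. A minimal sketch (all names hypothetical; this illustrates the policy, not any real product’s API):

```python
class Device:
    """Sketch of principle one: remote instructions execute only with the
    owner's explicit consent; the owner's own instructions always win."""
    def __init__(self, owner):
        self.owner = owner
        self.consented = set()   # remote parties the owner has approved
        self.executed = []

    def grant_consent(self, party):
        self.consented.add(party)

    def instruct(self, sender, command):
        # The owner, or a party the owner approved, may run commands.
        if sender == self.owner or sender in self.consented:
            self.executed.append(command)
            return True
        return False             # unapproved third party: refused

phone = Device(owner="alice")
assert phone.instruct("alice", "record video")
assert not phone.instruct("vendor", "disable camera")  # refused by default
phone.grant_consent("vendor")
assert phone.instruct("vendor", "install update")      # now permitted
```

The security point is the default: a third party’s instruction is refused unless consent was granted first, rather than obeyed unless blocked.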

Be hard-liners on these principles, Cory says. If you don’t safeguard users from control, you will be remembered badly.

We can’t lock it open forever. We can leave behind the material to leave a better world.

David Reed asks why Librarian of Congress is the only person who can make exceptions to DMCA. Librarian can agree to exceptions, but can’t permit anyone to make the tools to do so.

EFF will change this in legislatures by increments, and with litigation challenging constitutionality of the DMCA altogether.

***

Panel: values and the decentralized web.

[Got derailed temporarily…back now.]

A wide-ranging discussion about the need to create infrastructure that contemplates good practices ahead of time. This resonates with Cory’s talk.

Wendy Seltzer from the W3C is talking about how standards follow layered principles on which the net itself was built, where most of innovation rides on top of the lower layers.

How do we build reader privacy into all this? Max Ogden (dat project) disagrees with people who say software isn’t political. “We have control over intent,” he says, encouraging encryption by default including transport. Support initiatives like that, he suggests.

(My net connection was down for a bit…

***

Now it’s “security in a world of black hats” — a key topic in thinking about a decentralized web.

Moderator Ross Schulman of the Open Technology Institute asks, what is your threat model?

Paige Peterson of Maidsafe (encrypted communications) talks about protecting against “large actor” (read: nation state), is talking having networks monitor themselves via various nodes.

Tor’s Mike Perry: A great number of threats, depending on the user, but everything from nation-states to adversaries trying to monitor the users. Approach: eliminating single oints of failure.

Brian Warner, working on Tahoe-LAFS: trying to allow users to use servers but not totally depend on them for security.

Van Jacobson (Google and Named Data Networking Project; see yesterday): If producers of info sign it, sig lets network check to see if it’s been corrupted. Receiver has the choices. For sender who wants message to be read, credentials needed.

What can we export from centralized systems re security for decentralized?

Warner: We don’t know much about users since we don’t monitor them, where centralized systems can report failures more easily.

Peterson talks about evolving codebase due to evolving features. Moved from C++ to Rust, aimed at developers looking for more modern languages and for longer term sustainability.

Jacobson says physical locality can be a factor in trust. He gives the example of his house, where devices might trust each other but nothing outside.

Perry talks about how the organization works to prevent single points of failure. Fascinating and complex mixture of machine and human rules. He’s describing multiple levels of protection.

Brewster wonders how Tor can help others provide better reader privacy. Warner says the more applications that run over Tor, the better. He notes that it’s too easy to leak information about IP address, etc. Think about protocol weaknesses early, not late, he says.

***

Final panel: How might we decentralize scientific journal articles.

Note: this is ultimately a demo that should be watched via the stream.

Juan Benet, IPFS (interplanetary file system), goal to make internet work across everything, everywhere. “We care about lifting the web from its location.”

Trent McConaghy, BigChainDB and IPDB (latter is interplanetary database): data existing in distributed ways. Ways of storing data are permeable. “It’s just there.”

Evan Schwartz, InterLedger: If we want to build payment systems, we have to pick specific existing networks. Already been solved for information: the Internet. Routing packets over disparate networks. This is about routing packets of money over disparate networks.

Denis Nazarov, MediaChain: lets participants describe media and ID it. Uses Content ID.

Karissa McKelvey, dat project: Decentralize data the way universities and labs already are. We help them share in P2P way.

Each of them demos what their projects. For today’s purposes, we’re seeing some potential interoperation. It’s not hard to imagine how these technologies could fit alongside and into each other.

***

Brewster Kahle sums up the day. The key question now: What do we do next?

Foundations can help us get over some of the humps. Do we need a bunch of conferences? Awards? Lock some of it open? Licensing requirements on distributed code?

He thanks everyone for joining this gathering.

Last word: “Let’s build the decentralized web.”

 

]]>
http://dangillmor.com/2016/06/08/decentralized-web-summit-day-one-2/feed/ 0
Decentralized Web Summit — Day One http://dangillmor.com/2016/06/08/decentralized-web-summit-day-one/ http://dangillmor.com/2016/06/08/decentralized-web-summit-day-one/#respond Wed, 08 Jun 2016 16:21:08 +0000 http://dangillmor.com/?p=3160

(I’ll be updating this regularly during the day. New stuff will be at the bottom of the post, not the top, on the principle that most people reading this will read it only once or twice. Maybe I’m wrong, but that’s the method in my madness. For great to-the-moment info, follow Kevin Marks on Twitter.)

After an amazing Builders Day–a gathering of technologists who talked deep code about the potential to re-decentralize the Web and the larger Internet–it’s the official start of the Decentralized Web Summit. Brewster Kahle, founder of the Internet Archive, convened the event and the archive is hosting it. (My blogging from yesterday is here.)

Note: You can watch a live stream of the event.

Mitchell Baker, executive chair of the Mozilla Foundation and Corp., is launching the day. She has three guiding principles:

  • Immediate. Safe, instant access to content via a universal address, without the need to install anything.
  • Open. Anyone can publish content without permission or barrier, and provide access as they see fit.
  • Agency. The user agent can choose how to interpret content provided by a service offering.

It’s not about a particular technology, she says. It’s about much more than that.

Vint Cerf, one of the genuine originators of the Internet, is calling himself the “chief Internet evangelist” in the room. Fair enough. (More below…)

He’s here to talk about a “self-archiving web”–and starts with some Internet lessons:

  • Collaborate and cooperate
  • Open design and evolution process.
  • Anyone can join if they follow the protocols.
  • Room for multiple business models.
  • Modular design, layered evolution.
  • E pluribus unum.

He’s thinking about archiving. Traditionally it’s meant storing a document somewhere. But “think about what we do with software,” with versioning capabilities. The Web is a “complex reference structure,” he notes, and archiving isn’t a trivial issue; it’s really hard.

The Internet Archive takes snapshots. Yet the web is a constantly changing entity or collection of entities. Not only do pages change rapidly, but they look different to different users.

“The web can hardly contain itself,” he notes. Links deteriorate (can we have a permanent link system, as Tim Berners-Lee has called for?). HTML rendering is a challenge because HTML itself changes. Then there are permissions, access controls, and copyright restrictions.

He suggests the Google Docs editing/storage/synchronization process may be a useful way to think about the challenge, especially in the automatic, cooperative replication of pages. Maybe Pub/Sub (publish/subscribe) can work? A definite need is metadata–information about information–and lots of it.
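Vint's Google Docs analogy, where every edition of a page is kept and synchronized, can be sketched as a toy content-addressed snapshot store: each published version is saved under its own hash, and the archive can answer "what did this page look like at time T?" Everything here (class and method names, the sample URL) is hypothetical illustration, not an existing system:

```python
import hashlib
import time

class SnapshotArchive:
    """Toy self-archiving store: each published version of a page is
    kept under its content hash, with a per-URL edition history."""

    def __init__(self):
        self.blobs = {}     # digest -> content bytes (deduplicated)
        self.history = {}   # url -> list of (timestamp, digest)

    def publish(self, url, content: bytes, ts=None):
        digest = hashlib.sha256(content).hexdigest()
        self.blobs.setdefault(digest, content)  # identical versions stored once
        self.history.setdefault(url, []).append((ts or time.time(), digest))
        return digest

    def editions(self, url):
        return self.history.get(url, [])

    def at(self, url, when):
        """Return the page as it looked at time `when` (latest edition <= when)."""
        candidates = [(t, d) for t, d in self.editions(url) if t <= when]
        return self.blobs[max(candidates)[1]] if candidates else None

arc = SnapshotArchive()
arc.publish("https://example.org/", b"v1", ts=100)
arc.publish("https://example.org/", b"v2", ts=200)
print(arc.at("https://example.org/", 150))  # b'v1'
```

Because the store is keyed by content hash, republishing an unchanged page costs nothing extra, which is one answer to Vint's "who pays?" question about snapshot frequency.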

We’ll need software libraries to handle a lot of this. And we’ll need to have ability to run the old software by emulating old hardware.

How will we surf the self-archiving web? Multiple sources and methods of access, to start.

Vint notes that print publications have editions, which are snapshots. A question, then, is how often should we be making snapshots of stuff in the future.

He lists some useful properties of, and issues raised by, a self-archiving web:

  • Automatic archive upon publication.
  • Do we sign up for this? Cost? Who pays?
  • How do rendering engines and permissions work?
  • Filter malware?
  • How much fidelity to the original? Should links persist? Or just a surface-level look?
  • What’s the vocabulary for all this?

Is this official records? Once it’s archived, is a page “an indelible and unalterable instance” (lawyers, start your subpoena engines?)? What about encrypted content? Can we put access controls on individual pieces; e.g. let people see it after 25 years?

What’s the role of containers? Virtual machines are sandboxes, but containers can interact with underlying OS and communicate with each other. Interesting potential.

A question about mobile, which is degrading the web as apps (read: Facebook for the most part) become a primary access-to-info method. Vint isn’t sure he knows the answer. He observes that Android apps now run natively inside the Chrome OS. This isn’t a fix, he acknowledges.

The app space, he says, is almost out of control: too many and an expectation/hope among developers that their specific app, which does one thing, will be what people want and use. He definitely wants Internet of Things not to go there. Web apps may be a useful alternative, an underlying structure. (He semi-ducks a question about business models.)

What about things in archives that we don’t want (right to be forgotten)? In order to remember to wipe something out of index we have to remember it, he says, or else it might come back. It’s not just stuff we generate; it’s the stuff other people generate. He suggests we need better social practices, and says there’s no simple ethical solution on the horizon.

***

Tim Berners-Lee, the actual creator of the World Wide Web, is up to discuss some of the strategic issues involved in re-decentralizing the web, something he’s advocated repeatedly in recent months.

“The objective was to make something that worked,” he says, connecting a variety of systems and knowledge. He talks about early standards and protocols, and how they had a way of helping get information from here to there. A way to locate pages was part of the process. HTTP was kind of a way to combine the idea of “headers” (e.g. SMTP) and HTML content. Web addresses (URLs) were another vital building block.

One result: anybody could publish and put ideas out there, and anyone could find and link to them. We had a web where we could have good, smart discussions.

He observes the vast creativity that has emerged since then, to turn the web into the huge thing it is where “you can do anything you like”–a fountain of innovative work.

The silo-ization of the web, into silos like Facebook etc., worries him a great deal. People talk in one silo, have pictures in another, and they’re frustrated. The idea that everyone can participate with their own domain, server, etc., is still true but less relevant. And advertising has become the favored way to make money.

This isn’t optimal, he says. If someone claims it is, that’s a myth.

He wants to bring back some of the early ethos, and not letting our data and conversations live inside other people’s silos–bring back a truly decentralized web. He’s working on a project called SOLID, a collection of ideas and technologies aimed at re-decentralizing, in part by separating data from the things applications do to the data. He’s not happy about using the current domain name system, but alternatives are, at the moment, not reliable enough.

Don’t think of URLs as places, he advises. Think of them as names of things you want to see and use. (Big benefits can derive from that including security.)
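One way to read "URLs as names of things" is content addressing: derive the name from a hash of the content, so any source, trusted or untrusted, can serve it and the reader can verify what arrives. A minimal sketch (the function names and the sample page are invented for illustration):

```python
import hashlib

def name_of(content: bytes) -> str:
    # The "name" is derived from the content itself, not from a location.
    return hashlib.sha256(content).hexdigest()

def fetch(name: str, untrusted_sources) -> bytes:
    # Any source may answer; the hash lets us verify the answer,
    # so trust moves from the server to the data.
    for source in untrusted_sources:
        data = source.get(name)
        if data is not None and name_of(data) == name:
            return data
    raise LookupError("no source had valid content for " + name)

page = b"<html>hello</html>"
n = name_of(page)
honest = {n: page}
liar = {n: b"<html>tampered</html>"}
print(fetch(n, [liar, honest]) == page)  # True: the liar's answer fails verification
```

This is the security benefit Berners-Lee alludes to: a tampering intermediary can lie about a location, but it cannot produce different content with the same name.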

Again, he says how frustrated the silos have made him. He’s “excited” that we can re-decentralize the web.

How do we persuade non-technical people to not be siloed given how hard it is to use truly decentralized tech? “It’s a really important problem,” he says. He wants “really great designers” to be part of this process. PGP (encryption) is great software, for example, but horrible from a user-experience point of view–and almost no one is working to improve that.

Several follow-up questions are quite technical. Kevin Marks is translating well into English (link at top).

***

Brewster Kahle is up next. He wrote a pivotal essay last year called “Locking the Web Open,” which helped me crystallize some of my own thoughts about what is happening.

“The way we code the web will determine a lot about how we live our lives online,” he says.

The web is an essential system but it doesn’t have some of the basic structure that preserves privacy, liberty and so much more that we need, he says. And it’s not available everywhere, due to censorship (and digital divide issues).

It’s an ephemeral medium, he says. And it’s being used to spy on people.

Is it reliable? Sort of. Is it private? No. Is it fun? Yes. We get one out of three at the moment. We need all three: reliable, private, fun.

He distinguishes the web from the Internet. A key nature of the Internet as a whole is reliability through resiliency. (“Five guys locked into a room for a year” made it happen.)

He discusses Amazon’s cloud: a decentralized system under one owner’s control. He’d like to make AWS features available to everyone in the world but not under Amazon’s control.

“We want to make it reader-private,” so you don’t fear spies (and others) doing things based on what you read. Writer-private is easier than reader-private.

We also need to think media and how creators can more easily get paid.

Brewster is jazzed about how far JavaScript (JS) has come. Not only can we run old platforms (e.g. DOS) in our browsers–a 15-year-old computer living in your browser–but we can use browsers as computing platforms in their own right.

Encryption: We won that war in the 1990s, but we use it mostly for online commerce. We can do a lot better, he says. We can use crypto in our browsers in new ways. Blockchain could be part of it as well. Ditto peer-to-peer in new ways.

His proposal: “WordPress, but decentralized.” (Matt Mullenweg should be here…)

Could we do this? Goals:

  • Normal browser, no download or plugin.
  • Good names, e.g. http://brewsterblog.dweb
  • Good performance.
  • Fun to post/comment
  • User IDs with different roles
  • Payments/tips
  • Archives/versions

With JS as an operating system, it already works at the browser level, he says. He’s demo-ing on a distributed system I mentioned yesterday: IPFS. It worked. Amazing…

Easy names? There are people in the room who are working on this.

Performance needs ISPs and CDNs. Get the hashes closer to users.

Updates need to be decentralized. Mutable torrents, and other possible solutions.

Identity is complex. Bitcoin’s system is a possible model, maybe the basis. Bonus: tipping and commerce included.

“We can have WordPress, but decentralized,” he says. “A lot of the pieces actually exist.”

“We can bake the First Amendment into the code itself.”

**

Kevin Marks is moderating a panel on P2P (peer to peer), with people who are doing it in ways that are as far ahead of Napster as JavaScript is ahead of its earliest incarnation. (That would be way, way ahead.) I’ll be posting short videos with most of them, recorded yesterday at the “Science Fair” portion of the program.

Zooko Wilcox, from Zcash, a censorship-resistant digital money system (he’s also doing Tahoe-LAFS).

David Dias is part of IPFS (the InterPlanetary File System, an amazing achievement in decentralization).

Gavin Wood’s Ethcore is working on ways of improving and expanding on open-source blockchain systems.

Feross Aboukhadijeh works on WebTorrent, a client that runs natively in the browser. BItTorrent launched a hugely successful protocol, but it’s used via applications you install. He’s putting torrent protocol into the browser directly. This has potential for all kinds of data sharing in highly useful ways.
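The torrent idea Aboukhadijeh is moving into the browser rests on piece hashing: a file is split into fixed-size pieces and each piece is hashed, so a downloader can verify every piece independently no matter which peer supplied it. A toy sketch (the tiny piece size is for the demo only; real torrents use pieces of 16 KiB to several MiB):

```python
import hashlib

PIECE_SIZE = 4  # tiny, for the demo

def make_piece_table(data: bytes):
    # The publisher splits the data and records one hash per piece.
    pieces = [data[i:i + PIECE_SIZE] for i in range(0, len(data), PIECE_SIZE)]
    return [hashlib.sha1(p).hexdigest() for p in pieces]

def verify_piece(index: int, piece: bytes, table) -> bool:
    # A downloader checks each arriving piece against the table,
    # so pieces can safely come from any untrusted peer.
    return hashlib.sha1(piece).hexdigest() == table[index]

data = b"hello, decentralized web"
table = make_piece_table(data)
print(verify_piece(0, b"hell", table))   # True
print(verify_piece(0, b"HELL", table))   # False
```

Because verification happens per piece, a swarm of strangers can collectively serve a file that no single one of them is trusted to provide.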

A conversation about whether GitHub is centralized or decentralized. (The answer seems to be Yes.)

Apps are winning on mobile, but browsers are getting more competitive as the standards process (always slower than what a private company can do, says Aboukhadijeh) proceeds.

Wilcox: Facebook is winning in part because it can exclude people and add-on services. This can be good for the user experience.

A question: Who pays for all this? Several speakers cite the payment possibilities inherent in blockchain technology. Also, says Wilcox: With decentralization there are fewer risks from current hosting operations.

***

Chelsea Barabas from the MIT Media Lab leads a panel on naming and identity.

She asks for an anecdote illustrating the problem.

Jeremy Rand (Namecoin): Certificate authorities that have the power to tell us what websites are authentic can be incompetent, compromised or outright corrupt. If criminals or governments can impersonate a website, that’s “problematic.” Namecoin repurposes Bitcoin into a naming system; could it be a replacement for DNS? Maybe.
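Rand's idea, repurposing a blockchain as a naming system, boils down to two rules: the first valid registration of a name wins, and records are hash-chained so history can't be silently rewritten. A toy sketch (this is not Namecoin's actual data model; the names and values are invented):

```python
import hashlib
import json

class NameChain:
    """Toy blockchain-flavored name registry: first come, first served,
    with each record chained to the previous one by hash."""

    def __init__(self):
        self.records = []
        self.owner = {}          # name -> registrant
        self.tip = "0" * 64      # hash of the latest record

    def register(self, name, registrant, value):
        if name in self.owner:
            return False         # first valid registration wins
        record = {"prev": self.tip, "name": name,
                  "registrant": registrant, "value": value}
        encoded = json.dumps(record, sort_keys=True).encode()
        self.tip = hashlib.sha256(encoded).hexdigest()
        self.records.append(record)
        self.owner[name] = registrant
        return True

    def resolve(self, name):
        for r in reversed(self.records):
            if r["name"] == name:
                return r["value"]
        return None

chain = NameChain()
print(chain.register("example.bit", "alice", "93.184.216.34"))  # True
print(chain.register("example.bit", "mallory", "10.0.0.1"))     # False
print(chain.resolve("example.bit"))                             # 93.184.216.34
```

The point of the hash chain is that no certificate authority sits in the middle: anyone replaying the records can detect tampering and recompute who owns what.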

Joachim Lohkamp (Jolocom) is working on decoupling data from application, and providing authentication and control of data to users.

Muneeb Ali (BlockStack) asks you to consider how many trust points you navigate in launching Bitcoin’s website from your computer. A lot: more than 10. Something is wrong with that. BlockStack (decentralizing the DNS as well) wants to fix it.

Christopher Allen has been in this field for many years. Centralization “has a way of creeping in,” he says. He’s working on projects that help the underprivileged–including refugees–have identity systems that give them more rights.

Mistakes from the early days of the web still haunt us, but there were some successes, too. TLS beat back Mastercard and Visa, not a small achievement, notes Allen. Microsoft tried Passport, but no one wanted it to control even a federated system.

Twitter/Facebook/etc. single sign-on systems are using what had been decentralized tech but don’t allow others to use it with them. A one-way process.

How do we break the cycle?

Lohkamp again stresses decoupling data from applications (Facebook the example of tightly coupled).

Ali: The biggest change is Bitcoin, which solved a hard problem, namely trust, via a neutral playing field where identity can be created and verified.

Rand: Lots of attempts to replace cert authorities, with no success. He, too, likes Bitcoin’s ability to evade third-party control.

***

Cory Doctorow has been thinking as long and hard about technological control-freakery as anyone.

He starts with advice: Use the will-power you have now, when you’re strong, so you’ll not do the wrong thing when you have a moment of weakness. It goes to “how we build things that work well and fail well.”

The web is closed today because its builders, “just like you,” made compromises. Little compromises, one after another. We discount the future costs of present benefits.

Make it more expensive. Take options off the table now.

Pressure on browser vendors and other tech companies means they won’t block Google Docs. But the GPL is locked open, incapable of compromise, and because it’s indispensable it’s being used. Hence Linux, which uses the GPL.

Cory talks about a variety of projects that are more and less open and free in all senses of the word.

Systems that work well and fail badly die in flames. GPL is designed to fail well. No one wants to take the risk of suing and setting a bad precedent.

He turns to DRM, which he’s trying to destroy (in a project with the Electronic Frontier Foundation), pointing out that it doesn’t work well but that the Draconian DMCA (Digital Millennium Copyright Act) is a powerful weapon deterring people from tampering with it–including security researchers.

DRM has metastasized, he says, to control how purchasers can use or fix all kinds of products including cars. It’s in, um, rectal thermometers. Now it’s in web browsers, to prevent some videos from playing or being turned into screencasts.

No legislature has banned what companies are banning to stop user conduct. Worse, he says, is how DRM is “turning browsers into no-go zones,” along with its impact on security research. When the W3C did this, it compromised in a small but terrible way, he says.

It’s the sum of a million tiny compromises. We thought if we refused to compromise, others down the road would do it.

Companies frequently abuse standards bodies to achieve control (via patents), he says. But W3C has a good policy on patents; they can’t be used in the process.

How do we keep DMCA from colonizing the open web? We participate in the open web. Take the control systems off the table, now, he says.

The EFF has proposed this at W3C, and in proposals for DMCA exceptions for medical implants.

We can go further, he says. Law is required. So is giving each other support.

We have to agree now what to take off the table, to prevent tomorrow’s compromises. Two principles:

One: When a computer receives conflicting instructions from its owner and a third party, the owner always wins. Systems should be designed so remote instructions cannot be followed without the owner’s consent. Otherwise, among other things, you create security risks.

(While we’re at it, throw away the Computer Fraud and Abuse Act).

Two: Never give corporations or the state the power to silence people who find flaws.
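Doctorow's first principle reads like a policy check a system could actually enforce: run an instruction from anyone other than the owner only with the owner's explicit consent. A minimal sketch (the instruction names and parties are invented for illustration):

```python
def apply_instruction(instruction, issued_by, owner, owner_consents):
    """Owner-wins policy: the owner's own instructions always run;
    anyone else's run only if the owner has consented."""
    if issued_by == owner:
        return True
    return owner_consents(instruction, issued_by)

# A device whose owner has consented only to security updates:
consents = lambda ins, who: ins == "install-security-update"
print(apply_instruction("install-security-update", "vendor", "alice", consents))  # True
print(apply_instruction("disable-feature", "vendor", "alice", consents))          # False
print(apply_instruction("disable-feature", "alice", "alice", consents))           # True
```

The contrast with DRM is the direction of the check: here the vendor needs the owner's permission, rather than the owner needing the vendor's.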

Be hard-liners on these principles, Cory says. If you don’t safeguard users from control, you will be remembered badly.

We can’t lock it open forever, but we can leave behind the material to build a better world.

David Reed asks why the Librarian of Congress is the only person who can make exceptions to the DMCA. The Librarian can agree to exceptions, but can’t permit anyone to make the tools to use them.

The EFF will work to change this in legislatures by increments, and with litigation challenging the constitutionality of the DMCA altogether.

***

Panel: values and the decentralized web.

[Got derailed temporarily…back now.]

A wide-ranging discussion about the need to create infrastructure that contemplates good practices ahead of time. This resonates with Cory’s talk.

Wendy Seltzer from the W3C is talking about how standards follow the layered principles on which the net itself was built, where most of the innovation rides on top of the lower layers.

How do we build reader privacy into all this? Max Ogden (dat project) disagrees with people who say software isn’t political. “We have control over intent,” he says, encouraging encryption by default including transport. Support initiatives like that, he suggests.

(My net connection was down for a bit…)

***

Now it’s “security in a world of black hats” — a key topic in thinking about a decentralized web.

Moderator Ross Schulman of the Open Technology Institute asks, what is your threat model?

Paige Peterson of Maidsafe (encrypted communications) talks about protecting against a “large actor” (read: nation-state), and about having networks monitor themselves via various nodes.

Tor’s Mike Perry: A great number of threats, depending on the user, but everything from nation-states to adversaries trying to monitor the users. Approach: eliminating single points of failure.

Brian Warner, working on Tahoe-LAFS: trying to allow users to use servers but not totally depend on them for security.

Van Jacobson (Google and Named Data Networking Project; see yesterday): If producers of info sign it, the signature lets the network check whether it’s been corrupted. The receiver has the choices. For a sender who wants a message to be read, credentials are needed.
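Jacobson's point, trusting the data rather than the channel, can be sketched with a signed named-data packet: the name and content travel with a signature over both, and any node can verify them. NDN uses public-key signatures; the sketch below substitutes an HMAC from Python's standard library so it stays self-contained (the name and key are invented):

```python
import hashlib
import hmac

def make_packet(name: str, content: bytes, key: bytes):
    # A data packet carries its name, its content, and a signature over both.
    sig = hmac.new(key, name.encode() + content, hashlib.sha256).hexdigest()
    return {"name": name, "content": content, "sig": sig}

def verify_packet(pkt, key: bytes) -> bool:
    # Any node holding the producer's verification key can check the
    # packet, no matter which cache or peer delivered it.
    expected = hmac.new(key, pkt["name"].encode() + pkt["content"],
                        hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, pkt["sig"])

key = b"producer-key"
pkt = make_packet("/example/ndn/intro", b"named data", key)
print(verify_packet(pkt, key))          # True
pkt["content"] = b"corrupted en route"
print(verify_packet(pkt, key))          # False
```

With real asymmetric signatures the verification key could be public, so receivers need not share a secret with the producer; the HMAC here is only a stand-in for that idea.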

What can we export from centralized systems, regarding security, to decentralized ones?

Warner: We don’t know much about users, since we don’t monitor them, whereas centralized systems can report failures more easily.

Peterson talks about the codebase evolving along with the features. They moved from C++ to Rust, aimed at developers looking for more modern languages and at longer-term sustainability.

Jacobson says physical locality can be a factor in trust. He gives the example of his house, where devices might trust each other but nothing outside.

Perry talks about how the organization works to prevent single points of failure. Fascinating and complex mixture of machine and human rules. He’s describing multiple levels of protection.

Brewster wonders how Tor can help others provide better reader privacy. Warner says the more applications that run over Tor, the better. He notes that it’s too easy to leak information about IP address, etc. Think about protocol weaknesses early, not late, he says.

***

Final panel: How might we decentralize scientific journal articles?

Note: this is ultimately a demo that should be watched via the stream.

Juan Benet, IPFS (interplanetary file system), goal to make internet work across everything, everywhere. “We care about lifting the web from its location.”

Trent McConaghy, BigchainDB and IPDB (the latter is the Interplanetary Database): data existing in distributed ways. Ways of storing data are permeable. “It’s just there.”

Evan Schwartz, InterLedger: If we want to build payment systems today, we have to pick specific existing networks. That problem has already been solved for information: the Internet routes packets over disparate networks. This is about routing packets of money over disparate networks.

Denis Nazarov, MediaChain: lets participants describe media and ID it. Uses Content ID.

Karissa McKelvey, dat project: Decentralize data the way universities and labs already are. We help them share in a P2P way.
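Schwartz's framing, routing packets of money across disparate ledgers, can be sketched with a toy connector that holds balances on two ledgers and relays value between them (exchange rates, fees and failure handling are ignored; all names and balances are invented):

```python
class Ledger:
    """A toy ledger: just named balances with a transfer operation."""
    def __init__(self, balances):
        self.balances = dict(balances)

    def transfer(self, src, dst, amount):
        assert self.balances.get(src, 0) >= amount, "insufficient funds"
        self.balances[src] -= amount
        self.balances[dst] = self.balances.get(dst, 0) + amount

def send_across(amount, sender, receiver, ledger_a, ledger_b, connector="connie"):
    # The connector holds accounts on both ledgers and relays value:
    # the sender pays it on ledger A, it pays the receiver on ledger B.
    ledger_a.transfer(sender, connector, amount)
    ledger_b.transfer(connector, receiver, amount)

usd = Ledger({"alice": 10, "connie": 0})
eur = Ledger({"connie": 10, "bob": 0})
send_across(3, "alice", "bob", usd, eur)
print(usd.balances["alice"], eur.balances["bob"])  # 7 3
```

Neither ledger needs to know about the other; like Internet routers, connectors only need a neighbor on each side, which is the analogy Schwartz draws to routing packets of information.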

Each of them demos their project. For today’s purposes, we’re seeing some potential interoperation. It’s not hard to imagine how these technologies could fit alongside and into each other.

***

Brewster Kahle sums up the day. The key question now: What do we do next?

Foundations can help us get over some of the humps. Do we need a bunch of conferences? Awards? Lock some of it open? Licensing requirements on distributed code?

He thanks everyone for joining this gathering.

Last word: “Let’s build the decentralized web.”

 

]]>
http://dangillmor.com/2016/06/08/decentralized-web-summit-day-one/feed/ 0
Decentralized Web Summit — Builders Day http://dangillmor.com/2016/06/07/decentralized-web-summit/ http://dangillmor.com/2016/06/07/decentralized-web-summit/#comments Tue, 07 Jun 2016 16:34:16 +0000 http://dangillmor.com/?p=3141 (I’ll be updating this regularly.)
Brewster Kahle, founder of the Internet Archive, has pulled together an amazing group of people for what he’s calling–with only a tiny amount of hyperbole–the “Decentralized Web Summit.”  Some of the “original architects” of this system–including Vint Cerf and Tim Berners-Lee–are here, or will be, along with the younger and deeply committed architects of what we all agree we want in a general way. I’m one of the participants, but I’m in awe of the people around me.

Why is this necessary? Because our technology and communications are being recentralized, and controlled, by governments and big companies. They often mean well. And we, the users, often choose the convenience (or supposed safety) that come with letting others control our communications.

Today is “Builders Day,” in which we try to figure out what we want and what’s already available. Tomorrow is a more conference-type program, and Thursday is a meetup.

Brewster started the day by asking three key questions:

  • How can we build a reliable web?
  • How can we make it more private?
  • And how do we keep it fun and evolving?

Mitchell Baker, who runs Mozilla, suggests three basic design principles:

  • Immediate. Safe, instant access to content via a universal address, without the need to install anything.
  • Open. Anyone can publish content without permission or barrier, and provide access as they see fit.
  • Agency. The user agent can choose how to interpret content provided by a service offering.

For some great live-tweeting, check out Kevin Marks’ feed at Twitter.

***

The Builders are identifying themselves and what they want out of the day. Some have macro goals. I described mine this way: We need tech and communications that let anyone speak, read, assemble and innovate without permission, and I want to help get that done. Others have more micro goals, such as fixing specific roadblocks to the decentralized net.

One of the best: “I want to see all the ones and zeros liberated forever,” says John Light of Bitseed.

You can see the participants here. This is why I say I’m in awe.

***

We broke into groups, looking for areas of agreement and disagreement, plus ideas on how we can build or push forward decentralization. Then we merged groups (twice) and boiled it all down again, in order to have specific items to work on this afternoon.

What’s crucial to realize is that this is not an easy problem. Even the definitions are nuanced and complex. For example, what do we mean, exactly, by decentralization in the first place? There have to be some kinds of control points in some contexts.

We started with groups of six. My group(s) talked about such things as identity, encryption, and censorship. Then we compared notes (literally post-it notes) with another group and settled on some essentials to pursue later. Five other groups did likewise, and a spokesperson from each reported out to the rest of the participants.

I made an incredibly amateurish mobile phone video of the recommendations and posted it to the Archive (not, ahem, YouTube), using the new and wonderful mobile app called OpenArchive, which runs on Android. (Here’s a link to the page where the video is hosted.)

***

Google’s Van Jacobson talked, in part, about the inherent problems with IP (Internet Protocol in this context, not “intellectual property”). It was a miraculous achievement. But it isn’t scaling as well as we need to a global (and someday interplanetary) scale.

Jacobson is working on the NDN (Named Data Networking) project, which aims to solve some of the growth issues. One key piece of this is where trust resides in the system. Today we get much of that trust from where the data originates, but perhaps we can get it from the data itself.

***

Zooko Wilcox (Zcash) isn’t enamored of the decentralize-everything mantra. He’s focused, he says, on a more fundamental goal: to promote human rights with technology.

He chides us for our one-time “technological determinism”–a belief that we could solve any problem with tech. If some of us thought the law, or at least judges, would come to see it our way, we were naive. We aren’t anymore.

People share resources for many reasons. One is money, and he says money creates stability in a key way. He likes “commercial structures” and open-source (“almost like science”) in different ways.

***

Tamas Kocsis is here from Hungary to talk about ZeroNet, a radically decentralized system that uses blockchain technology and BitTorrent to create “Open, free and uncensorable websites.” He shows a demo of ZeroBlog, one of the applications he’s created on his platform, with seamless editing and publishing on a network that lives on multiple, loosely connected machines, because it’s operating entirely peer-to-peer. He’s enabled chat, a bulletin board and more. The service can be connected to Tor for enhanced privacy.

A question from the audience, from someone who was “super-impressed” when he first looked at it: Is there a way of importing from existing applications (such as this WordPress blog)? He’s aware of the problem, but since there’s no back-end this is difficult. (In other words, no.)

Still, super-impressive, an understatement.

***

Much more TK…

]]>
http://dangillmor.com/2016/06/07/decentralized-web-summit/feed/ 2
Journalists: Stop complaining about Facebook, and do something about it http://dangillmor.com/2016/04/09/journalists-stop-complaining-about-facebook-and-do-something-about-it/ http://dangillmor.com/2016/04/09/journalists-stop-complaining-about-facebook-and-do-something-about-it/#comments Sat, 09 Apr 2016 14:07:12 +0000 http://dangillmor.com/?p=3128

(I’m on a panel at the International Journalism Festival later today, entitled “The capture of traditional media by Facebook.” I’m planning to say some of what you see below. What follows is an early draft of a section of a book chapter, and I’ll be revising it a lot.)

On the cover of this week’s Economist is a photo mashup of Facebook CEO Mark Zuckerberg as an emperor. It is a fitting image, given his company’s growing domination of online conversation.

It is also a sign, one of many in recent months, that people in journalism have awoken to a potentially existential threat to the craft, among many other consequences of Facebook’s reach and clout in the information world. At the International Journalism Festival in Perugia, Italy, where a Facebook representative stonewalled questions a year ago, more than one panel has been devoted to the issue of how journalism will work if big “platform” companies—especially Facebook—control distribution.

How should we respond? From my perspective, two primary schools of thought have emerged. One is to embrace that dominance, albeit with some unease, and fully participate in Facebook’s ecosystem. Another is to persuade Facebook to take seriously its growing responsibility to help get quality journalism in front of as many people as possible.

Both of those approaches assume that Facebook is too big, too powerful to resist—that we have no alternative but to capitulate to its dominance. But if that is true, the consequences will be disastrous. We will be living in the ecosystem of a company that has repeatedly demonstrated its untrustworthiness, an enterprise that would become the primary newsstand for journalism and would be free to pick the winners via special deals with media people and tweaks of its opaque algorithms. If this is the future, we are truly screwed.

I say: no. Let’s not give up so easily. Instead, let’s resist—and find a way out of this trap.

Before I explain how, let’s offer some due praise. You don’t have to trust Facebook, or approve of its “surveillance capitalism” approach to business, to recognize its staggering brilliance in other respects. The company is loaded with talent, and has become an entrepreneurial icon. It is innovative technically and quick to adapt to changing conditions. And I have no doubt that the vast majority of its employees, and some of its investors, want to do the right thing when it comes to free speech.

But Facebook is also becoming a monopoly, moving closer and closer to what Zuckerberg himself has called his goal—that Facebook should be “like electricity” in the sense of effectively being a public utility that we cannot do without.

And that’s where I’d start in helping journalists, and others, escape from its web. Here’s an early, and therefore rough, draft of the approach I’d suggest:

First, journalists should remember the proverbial first rule of getting out of a hole: stop digging. Sadly, with the advent of Facebook’s Instant Articles, a publishing platform with great allure in some ways, news organizations have abandoned their shovels and brought in heavy earth-moving machinery to dig themselves in even deeper. I’m not saying drop all connections to Facebook right now, but the dig-faster “strategy” is beyond short-sighted. It’s outright suicidal.

Second, journalism organizations should explain to the communities they serve how Facebook operates. Such as:

  • Invasion of privacy. The occasional articles we see about Facebook’s latest privacy intrusions barely begin to describe the massive scale at which this company (and other online advertising operations) creates unprecedentedly detailed dossiers on everyone, then uses that information in ways we can barely imagine. The ubiquitous “Like” button, found all over the Internet, is part of Facebook’s surveillance system.
  • Control of speech. Facebook decides what its users will see by manipulating their news feeds. It removes posts based on its puritan approach to sex, and reserves the right to determine what speech is acceptable, period. In America, Facebook’s terms of service overrule the First Amendment.
  • Becoming an alternate Internet. Facebook would be delighted if you never leave its embrace. In some countries, where it makes special deals with governments and (often government-controlled) telecom companies, it effectively is the Internet on mobile devices.
  • Evolving ethics. Facebook constantly pushes the boundaries of acceptable behavior, especially in the way it collects and handles data on its users. It changes its terms of service and privacy policy, often in ways that should alarm people.

Third, journalists should do what they have done many times before when they encountered threats to freedom of expression: ask people with political power to intervene. As Facebook takes on more and more of the trappings of monopoly and utility, we need antitrust officials and others in government to pay attention. Of course, Facebook isn’t the only threat in this regard. The telecom carriers are potentially just as dangerous to speech, given their wish to control how our information moves in the vital parts of the networks they control. There’s a long list of other threats, including pervasive government surveillance, and for the most part journalists have ignored these attacks on freedom of expression. I said in Perugia last year that journalists need to be activists on these fundamental issues of liberty, and I renew that plea here.

Fourth, once journalists have explained all this, they should help the communities they serve take action themselves. This should include technical countermeasures—how to block, to the extent possible, all that surveillance by corporations and government, by using encryption; browser plugins that block the online trackers; and more. Journalists should also tell people how they can campaign for change, such as contacting their elected representatives and regulators at the local, state and federal levels; support organizations that help preserve liberties; etc.
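The tracker-blocking plugins mentioned above mostly work from hostname blocklists: every outgoing request is matched against a curated list of known tracking domains and dropped on a match. A toy sketch of that matching logic, with a hypothetical two-entry blocklist (real plugins ship lists of thousands of domains):

```python
from urllib.parse import urlparse

# Hypothetical blocklist, for illustration only.
BLOCKLIST = {"tracker.example", "ads.example"}

def is_blocked(url: str) -> bool:
    """Block a request if its host, or any parent domain, is on the list."""
    host = urlparse(url).hostname or ""
    parts = host.split(".")
    # Check "pixel.tracker.example", then "tracker.example", then "example".
    return any(".".join(parts[i:]) in BLOCKLIST for i in range(len(parts)))

assert is_blocked("https://pixel.tracker.example/like-button.js")
assert not is_blocked("https://dangillmor.com/2016/04/09/post/")
```

Matching parent domains as well as exact hosts is what catches trackers that rotate subdomains to evade simple exact-match filters.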

Fifth, journalists should join and support the nascent efforts to counteract the centralization of technology and communications. It’s not practical to ask media people to create a decentralized, federated web that includes social connections as well as standard publishing. This is beyond their expertise. But they should be leaders in the push to get there, and give financial and other help to projects that further the goal. Moreover, they should lobby a key constituency that has taken only timid steps toward saving the open Web: philanthropists and NGOs. Foundations, in particular, need to put their considerable resources behind decentralized platform development, and news organizations can help convince them to do so.

Plainly, some of these strategies will be easier to pull off than others. But we have to try. The alternative looks grim.

]]>
http://dangillmor.com/2016/04/09/journalists-stop-complaining-about-facebook-and-do-something-about-it/feed/ 18
My media use http://dangillmor.com/2016/03/18/my-media-use/ http://dangillmor.com/2016/03/18/my-media-use/#respond Fri, 18 Mar 2016 10:18:51 +0000 http://dangillmor.com/?p=3105

For an online course I’m teaching, here’s an example of my media use. Note to students: I don’t expect your blog posts to be this long.

As a “consumer”:

My daily media consumption is enormous, because I do this for a living. Here’s what happened one recent day:

When I wake up I briefly check email and Twitter. If something seems super-urgent I may open an email or click through to a link. Usually I don’t.

At breakfast, using a tablet, I go to the homepages of the New York Times, Wall Street Journal, Washington Post, and Financial Times. All of those outlets have a world view, and I want to see what their editors–some of the best in journalism–believe is important. I also check my RSS newsreader, which collects stories and links from a variety of sources I’ve pre-selected.
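Under the hood, an RSS newsreader like this boils down to fetching each feed's XML and pulling out item titles and links. A minimal stdlib sketch, parsing an inline sample feed rather than fetching over the network (the feed content here is invented for illustration):

```python
import xml.etree.ElementTree as ET

SAMPLE_FEED = """<rss version="2.0"><channel>
  <title>Example Blog</title>
  <item><title>First post</title><link>http://example.com/1</link></item>
  <item><title>Second post</title><link>http://example.com/2</link></item>
</channel></rss>"""

def parse_feed(xml_text: str) -> list:
    """Return (title, link) pairs for every item in an RSS 2.0 feed."""
    root = ET.fromstring(xml_text)
    return [(item.findtext("title"), item.findtext("link"))
            for item in root.iter("item")]

items = parse_feed(SAMPLE_FEED)
assert items == [("First post", "http://example.com/1"),
                 ("Second post", "http://example.com/2")]
```

A real reader would fetch each subscribed feed URL on a schedule and merge the results, but the pre-selection of sources is the whole point: the reader only ever shows what you chose to follow.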

At my home-office desk:

— I check out a number of websites, including Reddit, BoingBoing, Ars Technica, National Review, TechDirt, and John Oliver’s commentary (when HBO posts it).

— I run Twitter and Google+ in separate browser tabs but don’t try to keep up with them all the time (though I confess I check them more often than I should). If an important story or some ridiculous meme is bubbling up, I’m likely to notice it among the people I follow. I also check five Twitter lists on these topics: journalism, the media business, technology, entrepreneurship and media literacy.

— Besides regular email, I subscribe to several mail lists on those topics, as well as a great daily list of five items from This.cm, a site that creates serendipity for me. I sort those separately in my email inbox, and read them one after the other. Many of the links have already shown up in Twitter, and many point to the traditional and other media sites I routinely scan.

During the day I’m constantly bouncing around to various media including videos (typically posted on YouTube and Vimeo), audio (NPR and others), and other websites.

After dinner I sometimes watch videos on our television, but almost never live TV. We subscribe to Netflix, Amazon Prime and satellite (Dish). I record some TV series (e.g. “Justified”) and watch when I have time, skipping through the commercials.

On my bedside table I have a hardcover book or two (one from the library and one I’ve bought, the latter almost always written by someone I know), and a Kindle Voyage e-reader. I read for a half hour or so before going to sleep.

Takeaways (similar to what I found when I did this several years ago):

I listen to or watch very little broadcast media apart from NPR (or super-important breaking news, very very rarely).

My main sources of trusted information resemble some of the ones from several decades ago, such as the New York Times (which, like all other media, I do not trust fully, since they do get things wrong from time to time). I get to them in some different ways, however.

In particular, several Twitter lists and Google+ circles (roughly the same thing; collections of people I follow about specific topics) have become filters of great value. I can generally depend on them to send me to information I need to know about. However, I know I’m missing some important things if I rely only on other people to flag things.

For me, media consumption is an evolving collection of people, sites, conversations, and entertainment. Much of it overlaps. It takes more effort on my part, but I believe I’m vastly better informed — and entertained.

As a creator:

I create a lot of media, too, though not nearly as much as I “consume” (I hate that word; as I’ve told students in digital media literacy, we should use media, not consume it).

On a given day, here’s roughly how I created media. I’m guessing it’s different from what students do these days. Most of what I create is text. Not all, but most.

In the morning, I answered a batch of email. I do this regularly during the day, because I get a lot and I try to keep up with it. I’ll never get to the fabled “inbox zero” but I’ll try. Occasionally I get and send several text messages, most often with my wife.

I post frequently on Twitter, and more occasionally on Google+. (I rarely use Facebook, for reasons I described in my book Mediactive.) 

Lately, I’ve been posting (too infrequently!) to This.cm, a wonderful new service that tries to collect–from a bunch of interesting users–just a few items per day that we all believe everyone should see. The site is in beta so I can’t invite all of you to join it, yet.

My blog doesn’t get enough love, though I do post there from time to time. On the day in question I wasted a lot of time responding to someone who was trying to convince me (actually, his own fans) that I’m wrong about net neutrality.

As a longtime photographer I take lots of pictures. I don’t post most of them, but when I do it’s usually to Flickr or Google+ or my blog. I need to do this more. I don’t have an Instagram account but probably should get one.

There’s a way I semi-create media that most of us don’t appreciate: individualized media via online services. Example: I wanted driving directions the other day and used Google Maps. It produced a page of directions and a map. This is media, too–but just for me.

My other media creation, on a regular basis, doesn’t get seen by anyone but me for some period of time: writing I’m doing for my columns and essays at Slate and Medium, as well as a new book. In a way, those are the most traditional forms of media I’ve been making.

There’s more, but you get the idea!

]]>
http://dangillmor.com/2016/03/18/my-media-use/feed/ 0
Help crowdfund better journalism on encryption http://dangillmor.com/2016/03/15/help-crowdfund-better-journalism-on-encryption/ http://dangillmor.com/2016/03/15/help-crowdfund-better-journalism-on-encryption/#respond Tue, 15 Mar 2016 21:34:06 +0000 http://dangillmor.com/?p=3095
(Photo: https://www.flickr.com/photos/yusamoilov/)

The “FBI-versus-Apple” story of recent weeks has brought a vital issue to the front burner: whether we will have secure technology in the future or not–or at least the chance to have secure technology.

In reality, this isn’t only about Apple or the FBI. It’s about the considerable weight of government in its zeal to have access to everything we say and do in the digital realm–which is to say, increasingly, almost everything we say and do.

The Obama administration, and governments around the world, believe they have an innate right to whatever information they want, whenever they want it. This is a law-enforcement-first mentality, and in many ways an understandable one in a sometimes dangerous environment. But governments also want something they assuredly cannot have: a way to crack open our devices and communications, willy-nilly, when we’re using encryption tools that make it difficult if not impossible to do so without users turning over the keys to their digital locks.

They call this a “privacy versus security” debate. It is, in fact, a “security versus security” issue: If they get backdoors into our devices, software and networks, they will–according to just about every reputable non-government security and encryption expert–guarantee that we will all be less secure in the end, because malicious hackers and criminals (some of whom work for government) will ultimately get access, too. Governments want magic math, and they can’t have it. It’s also a free speech issue, a huge one, because the government is telling Apple it has to write new code and sign it with a digital signature.

Sorry, this is binary. We have to choose. One choice is to acknowledge that bad guys have a way to have some secure conversations using encryption, thereby forcing law enforcement and spies to come up with other ways to find out what the bad guys are doing. The other choice is to reduce everyone’s security, on the principle that we simply can’t afford to let bad people use these tools.
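The "security versus security" point can be made concrete. Any exceptional-access scheme amounts to keeping a copy of everyone's keys in some escrow that law enforcement can query; whoever breaches that one database reads traffic they were never party to. A toy sketch of the problem — the cipher below is deliberately simplistic (a hash-based keystream, NOT real cryptography) and the escrow database is hypothetical:

```python
import hashlib
import secrets

def keystream_xor(key: bytes, nonce: bytes, data: bytes) -> bytes:
    """Toy stream cipher (NOT real crypto), used only to illustrate escrow risk."""
    stream = b""
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + nonce + counter.to_bytes(4, "big")).digest()
        counter += 1
    # XOR is its own inverse, so the same call encrypts and decrypts.
    return bytes(a ^ b for a, b in zip(data, stream))

# End-to-end: only the two parties hold the key.
alice_bob_key = secrets.token_bytes(32)

# "Exceptional access": a copy of every session key sits in one escrow database.
escrow_db = {"alice-bob": alice_bob_key}

nonce = secrets.token_bytes(16)
ciphertext = keystream_xor(alice_bob_key, nonce, b"meet at noon")

# Anyone who obtains the escrow database -- insider, spy, or criminal --
# can decrypt conversations they were never part of.
stolen_key = escrow_db["alice-bob"]
assert keystream_xor(stolen_key, nonce, ciphertext) == b"meet at noon"
```

The math doesn't distinguish between an authorized investigator and a thief: the escrowed key decrypts equally well for both, which is what security researchers mean when they say backdoors make everyone less safe.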

Sadly, the journalism about this has been reprehensibly bad, at least until recently, outside of the tech press. Traditional Big Media have basically parroted government officials, most recently President Obama himself, though they’re finally starting to wake up to what’s happening. John Oliver’s HBO program last Sunday was a sterling example of how media can treat this complex topic in a way that a) tells the truth; and b) explains things with great clarity.

Mike Masnick and his site, TechDirt, have been leaders in covering the way various liberties and technology intersect. Now they’re crowdfunding to add more coverage of encryption and its ramifications. I’m supporting this initiative and hope you’ll give it some thought as well. We need more such coverage, and we can depend on Mike and team to provide it.

]]>
http://dangillmor.com/2016/03/15/help-crowdfund-better-journalism-on-encryption/feed/ 0
Accelerating the mobile Web http://dangillmor.com/2016/02/28/accelerating-the-mobile-web/ http://dangillmor.com/2016/02/28/accelerating-the-mobile-web/#respond Sun, 28 Feb 2016 16:52:57 +0000 http://dangillmor.com/?p=3075

Google’s “Accelerated Mobile Pages” project launched for real this week, and it’s pretty amazing. I still worry about giant, centralized companies and their power–and AMP shouldn’t be needed in the first place. For more, check out my deep dive into AMP at Medium Backchannel. Key quote:

Before getting into details about what’s happening here, let’s be clear on something. AMP wouldn’t be necessary — assuming it is in the first place — if the news industry hadn’t so thoroughly poisoned its own nest.

Looking for money in a business that grows more financially troubled by the month, media companies have infested articles with garbage code, much of it on behalf of advertising/surveillance companies, to the extent that readers have quite reasonably rebelled. We don’t like slow-responding sites, period. On our mobile devices, which are taking over as the way we “consume” information, we despise having to download megabytes of crapware just to read something, because the carriers charge us for the privilege. That’s one reason why we use ad blockers. (The other, at least for me, is that we despise being spied on so relentlessly.) The news business could have solved this problem without racing into the arms of giant, centralized tech companies. But it didn’t, and here we are.

 

]]>
http://dangillmor.com/2016/02/28/accelerating-the-mobile-web/feed/ 0