In a recent interview, Tim Berners-Lee, who was instrumental in launching the World Wide Web, said: "I wouldn't say the internet has failed with a capital F, but it has failed to deliver the positive, constructive society many of us had hoped for."
The social media industry has attracted the ire of regulators for many years now. Initially, these concerns revolved around economic power and privacy. Increasingly, one of the biggest issues is the way that unmanaged, user-generated content manipulates society, and the way that it can in turn be manipulated to influence our political and social life.
This issue has attracted most attention in Europe and North America, but the effects extend worldwide: a recent bout of communal strife in Sri Lanka has been attributed to rumors spread on Facebook.
So far, the attempts to address the issues arising from social media content have been patchy and tentative. Some proposals are based on obvious misapprehensions of the way that social media works.
Facebook, for example, is a system made up of 2 billion moving, living, breathing members interacting in ways that cannot be predicted or monitored. It makes more sense to think of it as an organism than as a machine, and interventions in organisms are notoriously likely to produce unanticipated outcomes. Social media can be managed, but never controlled.
Evan Williams, co-founder of Twitter, has said: "One of the things we've seen in the past few years is that technology doesn't just accelerate and amplify human behavior . . . it creates feedback loops that can fundamentally change the nature of how people interact and societies move."
There is no obvious precedent for regulating content of this kind. No one can predict how social media might develop over the next week, let alone the coming years. The introduction of new controls, powered by artificial intelligence and ever-growing ranks of human censors, cannot overcome this unpredictability without curtailing speech in an arbitrary and arguably oppressive manner. For a historical analogy, look no further than Gosplan, the Soviet Union's central planning agency; a rules-based framework cannot fully anticipate the way an organism adapts and evolves.
The organic way that social media operates has clear and unpalatable implications for content regulators. If, as seems likely, social media companies will always be unable to fully control content on their platforms, there are two choices. Regulators and policy-makers can learn to tolerate forms of speech they currently consider intolerable. Alternatively, they can implement an intrusive and expensive system of control that will generate arbitrary outcomes, fueling the resentment that drives much of the problematic content in the first place.
We are in the middle of an experiment to test these alternatives. In rough terms, Europe and North America are traveling the first route; China is traveling the second. In the coming years, we will see who stays the course, and how effective it proves. Just don't ask anyone to predict the outcome.
Straight Talk is a weekly briefing from the desk of the Chief Research Officer. To receive the newsletter by email, please contact us.