Words and pictures are powerful weapons. Do not abuse them! So concludes the code of ethics of the Norwegian press.
If you apply those words to the global social networks, you could say they are freely distributing weapons to anyone who wants them while refusing to shoulder any responsibility when someone gets hurt.
Would the United States have ended up with a different president if millions of Americans had not been lied to and manipulated through social networks in 2016? Would there have been a majority in the United Kingdom voicing support for remaining in the European Union? Was the outcome in Norway’s local elections this past fall affected by manipulation?
Nobody knows the answers to these questions, but the mere fact that they are raised erodes trust in democracy and democratic institutions. Politicians and public servants are targeted in campaigns of lies, and they have no effective way of defending themselves. In the worst case, fewer people will be willing to take on positions of trust, and public institutions will be partially paralyzed.
Many people have described the problems well, but good solutions are hard to find. One complication is the need to balance a judicial and regulatory defense of democracy against the equally important need to protect freedom of speech.
At Schibsted, we have evaluated possible solutions intended to create this balance in the report “Ensuring democracy and freedom of speech online: a need for a balanced regulation of social networks.”
The social networks typically regard themselves as providers of tech without responsibility for the content carried by the network, just as the telecom companies provide distribution through cable or satellite. On the other hand, some people feel a solution could be to regulate the networks in the same fashion as media run by an editor.
We think both solutions have their weaknesses. Instead, we propose the networks be regulated as a new category of actors in the field of mass communication.
The networks offer more than a satellite or a cable. They offer functionality for liking and sharing, and they use algorithms, fed by enormous amounts of data about every single user, to steer content in the network in the “right” direction. At the same time, they cannot be called publicists, because their work is not publishing: the networks have not assumed any societal responsibility, they do not produce or edit content, and they do not run their businesses according to any common ethical rulebook.
The social networks have revolutionized people’s ability to communicate with each other and become part of a community. They have also given a voice to those suppressed by authoritarian leaders. The value of this to society is formidable, and we must not regulate in a way that destroys it. Forcing the networks to seek advance clearance for everything users post would totally ruin their core character.
The solution we support and wish to put forth is to require the networks to remove unlawful content once it has been reported. Germany has tried implementing such a system, but it does not seem very effective. Today’s rules are interpreted to mean the networks need only act against the single reported posting; they feel no obligation to act against identical postings made by others.
Those who want to instigate terror, for example, naturally exploit this, and the sanctions have little effect. This hole in the regulations must be closed so that the networks’ obligation to remove illicit content covers all postings with the same content.
We also propose a rule that users who have already been exposed to the illicit material be informed that it has been removed and why.
In the spirit of the law, we find it logical to give the networks accomplice liability for spreading illicit content they have not produced themselves. We also believe corporate fines should be imposed when the law is broken.
The European Union and countries such as France, the United Kingdom, and Australia have suggested how networks could be regulated when they spread dangerous content. A weakness in much of this work is the introduction of a new category of illicit content called “intentional disinformation.” The question is who determines a specific “intention” and who decides what constitutes “disinformation.”
In the hands of the wrong authority, such laws could substantially restrict freedom of speech. The U.S. president uses the term “fake news” for every piece of news he doesn’t like; it is easy to imagine a similar misuse of the term “disinformation.”
Lying itself should not be prohibited; what should be restricted is the systematic, large-scale spreading of lies that significantly damage democracy or democratic institutions.
As an alternative to creating new regulations related to the freedom of speech with the aim of limiting the networks, we propose using rules we already have in our penal code. There are a multitude of rules aimed at protecting free elections and democratic institutions. The problem is the networks’ responsibility in regard to these rules has not been defined and the rules have, to a large extent, been dormant.
In our view, it is crucial that future laws setting limits on freedom of speech be established and managed on a national level. In EU countries, the laws regulating freedom of speech vary widely, and any attempt to create a common set of laws will likely be a negative experience for the most open and liberal countries.
What the European Union should do, on the other hand, is create an over-arching set of rules establishing the networks as a new judicial category. These rules would establish the networks’ responsibility under national laws on free speech, demand transparency (for example, by clarifying the consequences of designing algorithms in certain ways), and require oversight enforced through national supervision. Each individual country should then review its regulations on illicit statements and make appropriate adjustments to clarify the networks’ accomplice liability.
As for content produced by editor-run media and distributed by the networks, we suggest the networks be free from responsibility for it and, correspondingly, barred from altering it. This is in line with the 2016 proposition from the Alliance of Independent Press Councils of Europe. The publishing enterprise is already regulated by law, and responsibility is clearly placed on editors. It makes no sense for networks to alter content for which editors have taken responsibility; doing so would also weaken trust in content from editor-managed media.
Scandinavian countries lead the world in freedom of speech and freedom of the press. This gives us a chance to speak knowledgeably on these topics internationally. For that to happen, however, politicians and others with a stake in how societies are governed must take part in the conversation. The growing threat to fundamental values will not disappear by itself.