The behaviour and content on which new legislation is concentrating
In the wake of rising international concern over “fake news” content shared via online platforms, a number of policymakers and legislators have sought to identify and prohibit such behaviour. The key factor weighing in the balance is the right to free expression, a right entrenched in the constitutions of many countries and reflected in international commitments, such as the ECHR in Europe and the UN Charter more widely.
While a number of specifically targeted initiatives have emerged in different jurisdictions, such as the French law of 22 December 2018 creating an injunction process to prevent the dissemination of false information during an election period, legislatures have faced difficult debates over whether any wider ‘disinformation law’ would disproportionately infringe the right to free speech. Such debates often identify existing, more general laws which could be applied in the current context, for example those in place to prevent propaganda, hate crime or defamation. Notably, earlier this year in France, the Avia law, which was directed at preventing online hate speech, was struck down by the Constitutional Council.
Concerns over child online safety have long been widespread in many countries. Regulation in this field is focussed on the protection of minors through the imposition of prohibitions on the sharing of certain materials which affect children, and on restricting the access of minors to prohibited materials. Many legislative initiatives are focussed on preventing the manipulation and exploitation of children by older individuals through the use of content-sharing platforms.
While many online platforms have policies and restrictions in place, legislators in many countries are putting forward more structured measures to restrict children’s access to unsafe content. However, harmonisation between the various national initiatives is unlikely, as a result of policy divergences, not least the fact that, across the EU, the minimum age at which children can consent to the processing of their personal data ranges between 13 and 16 years old.
In addition, while some governments are planning thorough changes (such as in the UK where a code of practice on tackling online child sexual exploitation is due to be published this year), others are relying on existing laws with the aim of applying them in the context of technologies that have developed and evolved since those laws were adopted.
Legislation prohibiting the sharing of content classed as “terrorist” exists in many jurisdictions, and many firms have moderation and filtering systems in place to restrict such content. However, a number of countries are seeking to set out clearer requirements in relation to such content, including the UK, which plans to publish this year a Code of Practice on the practical steps platforms must take to prevent such content being shared.
While some jurisdictions rely on their courts’ ability to apply existing laws, such as the provisions of the Spanish Criminal Code on terrorist activity, others have enacted legislation broadening the application of existing laws prohibiting the incitement of terrorist activity, such as the German NetzDG initiative. So far, there is no indication of a uniform approach across jurisdictions to tackling such behaviour.
While many countries have domestic measures in place to prevent harassment, both physical and emotional, this area remains subject to change and reform. In many cases, online harassment may be caught by existing laws; however, a number of countries are launching programmes of reform to clarify the law in relation to such behaviour taking place online. Such initiatives do not always have a direct effect on platform operators, though.
For example, the French law 2014-873 of 4 August 2014 created a specific offence of cyber harassment, but it applies to individuals rather than the platforms which may have hosted such activities. By contrast, in Germany, NetzDG will in some circumstances impose requirements on platforms to prevent harassment, where the latter is covered by the existing Criminal Code. In the UK, the planned legislative changes will impose requirements on platform operators to take specified steps where there is evidence that their users are being harassed or abused on their services. This is to be covered by a code of practice issued by the new regulator, with which companies will need to comply.
The sale of illegal goods and services is a topic that tends to be dealt with by domestic lawmakers in specific legislation relating to the type of product at hand. For instance, most countries have laws on selling dangerous substances, or use IP laws to crack down on the sale of counterfeit goods.
The majority of proposals discussed in this report tend to deal more explicitly with other types of harm, rather than illegal goods and services. Nevertheless, where laws or proposals aim to impose broad duties of care on online platforms to prevent harmful content, this may well extend to online listings for products or services which could cause harm. Indeed, the European Commission has hinted that protection against illegal goods and services sits alongside its efforts to prevent other types of harmful or illegal behaviour – in its Communication on Tackling Illegal Content Online, published in 2017, the Commission noted that the recommendations made were designed to complement various other existing and proposed legislative measures, including a Memorandum of Understanding on the Sale of Counterfeit Goods and a European Commission Notice on the Market Surveillance of Products Sold Online.