Austria Refers Facebook ‘Hate Speech’ Case to EU Court

The Supreme Court of Austria, the Oberste Gerichtshof (OGH), has asked the Court of Justice of the European Union (CJEU) for clarification on the scope of Article 15(1) of the E-Commerce Directive and the host provider privilege. Here are the facts of the case.

The plaintiff is Dr Eva Glawischnig-Piesczek, a Green politician from Austria. In April 2016, a Facebook user posting under the alias ‘Michaela Jaskova’ uploaded an image of Glawischnig-Piesczek together with some rude comments in German (“wretched traitor”, “corrupt clumsy oaf”, “member of a fascist party”) about the politician. Facebook was requested to delete the image and the comments in July 2016, but failed to do so.

Glawischnig-Piesczek then obtained a preliminary injunction against Facebook, which obliged the social network not only to delete the image and the specific comments (making them inaccessible worldwide), but also to delete any future uploads of the image if it was accompanied by comments that were identical or similar in meaning to the original comments. Upon being served the injunction, Facebook blocked access to the original image and comments (limited to Austria) and appealed the decision. The court of second instance upheld the first decision only in part: Facebook was now obligated to delete any future uploads of the image if it was accompanied by comments that were identical to the original wording or if the comments were similar in meaning and Facebook had actual knowledge of these comments (e.g. via a subsequent notice from the plaintiff or a third party).

Both parties appealed the court of appeal’s decision, which brought the case to Austria’s highest civil court, the OGH.

The judges begin their decision with a detailed analysis of the comments made by ‘Michaela Jaskova’ and find them to be unlawful due to their defamatory and offensive nature and the complete lack of any factual basis for the statements. The court also states that the unlawfulness was evident, even to a layman without any legal background. Consequently, Facebook was required to delete the image and the comments upon obtaining actual knowledge of the post in July 2016. Facebook’s failure to do so deprived the social network of its liability privilege as a hosting provider under Article 14 of the E-Commerce Directive, which is implemented in § 16 ECG (the Austrian ‘E-Commerce-Law’).

Facebook is considered by the court to be an abettor to the unlawful comments and, as such, is obliged under Austrian law to refrain from any repetition of the infringement. Austrian jurisprudence on defamatory statements allows the plaintiff and the court to include statements in the injunction that are not identical, but similar in wording or meaning. This makes sense and is handled similarly in Germany. If only the identical statement were forbidden, the offender could easily circumvent the court’s decision by slightly altering the statement, making a new court order necessary for each new statement.

However, according to the court, the claim for a ‘broad’ injunction that includes statements different from the original one might be in conflict with Article 15 of the E-Commerce Directive. Article 15, which is implemented in § 18 ECG, provides that Member States shall not impose on providers a general obligation to monitor the information which they transmit or store, nor a general obligation actively to seek facts or circumstances indicating unlawful activity.

Indeed, an obligation for Facebook to pro-actively identify every future infringing post, including those that are different in wording, but similar in meaning, could result in an obligation to monitor all information which Facebook stores.

The judges appear uncertain whether such a judgment would result in a general obligation in the sense of Article 15. They turn to recital 48 of the E-Commerce-Directive, which states:

This Directive does not affect the possibility for Member States of requiring service providers, who host information provided by recipients of their service, to apply duties of care, which can reasonably be expected from them and which are specified by national law, in order to detect and prevent certain types of illegal activities.

According to this recital, a judgment requiring the prevention of illegal activities is not completely excluded by Article 15(1) of the E-Commerce Directive, but can be made with regard to specific infringements.

Looking at McFadden, the court notes the CJEU’s holding there that monitoring all of the information transmitted must be excluded from the outset as contrary to Article 15(1). However, the judges consider this statement inapplicable to the case at hand, since McFadden dealt with an access provider rather than a hosting provider. This reasoning is a bit odd in its brevity, given that Article 15 applies to ‘mere conduit’ services as well, but the court provides no further explanation.

The judges appear more inclined to apply L’Oréal/eBay, which found that hosting providers can be ordered to take measures which contribute not only to bringing to an end infringements of intellectual property rights, but also to preventing further infringements of that kind. This leaves open the question of what constitutes infringements ‘of that kind’. Are only infringements identical to the original one ‘of that kind’, or are similar infringements included when it comes to ‘hate speech’?

The Vienna judges decided to ask the CJEU for clarification and have referred the following questions [this is not an official translation]:

Does Article 15(1) of Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market (‘Directive on electronic commerce’) preclude a national court from making an order requiring a hosting provider which has failed to expeditiously remove illegal information not only to remove the specific information, but also other information that is identical in wording?

With regard to the first question, the court further asks whether Article 15(1) precludes such an order requiring the hosting provider to remove such information (or block access to it) worldwide, or only in the relevant Member State.

The court further asks whether Article 15(1) precludes such an order that is limited to removing or blocking access to the illegal information only from the specific user who posted the content, and whether such an order would apply worldwide or only in the relevant Member State.

The court next asks, if the previous questions are answered in the negative: does the same answer apply to information that is not identical in wording, but similar in meaning?

Finally, the court asks: does the same answer apply to information that is not identical in wording, but similar in meaning, once the host provider has actual knowledge of the information?

The referral touches on some very relevant questions. In Canada, the Supreme Court ruled in Equustek v. Google that the search engine had to delist certain results from its search engine globally. In the case at hand, the plaintiff wants the offensive content removed from Facebook globally, while Facebook has so far only blocked access to it from Austria.

Also, the issue of pro-active monitoring/content filtering by hosting providers is at the centre of the discussion around Article 13 of the upcoming DSM Directive.

The original version of this article was posted on the IPKat by Mirko Brüß. It has been reproduced under a Creative Commons CC BY 2.0 UK Licence.

Launched in 2003 as a teaching aid for Intellectual Property Law students in London, the IPKat’s weblog has become a popular source of material, comment and amusement. IPKat covers copyright, patent, trade mark, info-tech and privacy/confidentiality issues from a mainly UK and European perspective.

The IPKat team is Neil J. Wilkof, Annsley Merelle Ward, Darren Smyth, Nicola Searle, Eleonora Rosati, Merpel and David Brophy.