ISSN (Online) - 2349-8846

Social Media Accountability

Lessons from Germany

Alok Prasanna Kumar is senior resident fellow at the Vidhi Centre for Legal Policy, and is based in Bengaluru.


On 31 December 2017, Germany’s Gesetz zur Verbesserung der Rechtsdurchsetzung in Sozialen Netzwerken (Network Enforcement Act) came into force, having been passed by the Bundestag in June 2017 (McKay 2018).1 The law applies to “social networks” (such as Facebook, Twitter, Google+, and others)2 and requires them to remove “unlawful content” within 24 hours of receiving a complaint about the content on their website. Failure to do so attracts heavy penalties, running into millions of euros. For the moment, the law applies only to social networks with more than two million users in Germany, and they are additionally required to put in place an “effective and transparent procedure” for handling complaints of unlawful content from users. It is important to note that the law itself does not define “unlawful content.” Rather, it relies on pre-existing laws that criminalise certain forms of speech, such as hate speech and incitement to violence.

Ever since it was first mooted, the law has been criticised on the ground that it amounts to a “privatisation” of censorship (Toor 2017). Critics include not just the social network platforms themselves, but also digital rights activists in Europe and around the world, who fear that it may end up increasing the power of social media platforms to determine what content they find acceptable and to censor accordingly. The heavy fines and short timeline mean that companies running such social networks will err on the side of caution and censor content simply on the basis of a complaint, rather than genuinely examine whether the content infringes the law; as one activist puts it, “they’ll shoot first and ask questions later” (Toor 2017). It is also contended that this effectively leaves the interpretation of laws concerning free speech to the discretion of private corporations rather than public institutions.

Are these valid criticisms? And even if they are, could this approach be adopted in Indian law if it succeeds in minimising the harmful influence of social media on public discourse?

From Liability to Responsibility

Since the mid-1990s, the approach to imposing liability on intermediaries, that is, those merely hosting or carrying content they did not create, has been to exempt them from liability for illegal content so long as they remove such content when its illegality is brought to their notice. This was put in place keeping in mind the needs of the then fledgling industry, to ensure that the growth of the internet was not hampered and innovation not stifled by a crushing legal regime (Frosio, forthcoming). While the German law does not subvert this paradigm, it does impose fundamentally higher obligations on the intermediary, which has prompted the concern that the government has “outsourced” its obligation to enforce the law to private entities. The move does raise serious questions about the rightness of allowing private entities to decide the legality of content.

While I have previously shared this concern (Kumar 2016), I must confess that I have changed my mind, going by the events of the last year or so. One key factor that has become more prominent, and will become even more so going ahead, is the control exercised by social networks over what users see. While Facebook was once the outlier in moving away from a purely chronological feed (latest posts first) to an algorithmic one (where the user sees what the social network’s algorithm thinks may be most relevant), this has since become the standard across all social networks (Kiberd 2016). This, as research has shown (Hern 2017), is hardly benign. There is serious potential for the algorithm to be gamed to spread fake news and other unlawful content. More importantly, social networks can no longer be considered the internet’s version of bulletin boards. Rather, the use of algorithms means that, in my view, they should be considered more akin to the editor of a newspaper. When they have the control and the power to choose what content is seen and what goes up, their legal liability must reflect this shift.

To that extent, Germany, with its history of hate speech, is perhaps not unjustified in taking this approach to regulating unlawful speech online. To be clear, the law itself imposes no norms concerning the content of online speech. The focus is on creating an enforcement mechanism alone and, though the first few applications of the law have already proved controversial (Oltermann 2018), it remains to be seen how effective these measures will be. What it presents, perhaps, is a counter to the Silicon Valley–driven belief in absolute freedom of speech as the default on the internet, a belief that reflects the demographics of Silicon Valley’s influencers more than the reality of public discourse around the world.

Intermediary Liability in India

Facebook is said to have about 24 crore users in India, by far its largest user base in any country (Livemint 2017). YouTube (owned by Google) is said to have a similar number of users in India, with usage set to double soon thanks to the spread of 4G and broadband internet (Narasimhan 2017). Twitter is said to have at least two crore users in India, though the company has not put out official figures (Huffington Post 2015). The spread of social networks has gone hand in hand with expanding internet use in India; it is no longer just a small urban elite that is online. The reach of social networks will only expand as all the major platforms make a big push to offer their services in India’s many languages (Kini 2017).

It goes without saying that India has had its problems with unlawful speech on social media platforms. In handling these, the Indian government’s approach has been ham-fisted and crude. Section 66-A of the Information Technology Act, 2000, which tried to address the problem, did so in such a poorly thought-out manner that it had to be struck down by the Supreme Court (Shreya Singhal v Union of India 2015). Yet, the problems of regulating speech online have not gone away. There is, simultaneously, under-regulation and over-regulation: hate speech against women, minorities, and disadvantaged communities continues unabated and mostly unpunished, while relatively innocuous content is taken down and criminal proceedings initiated because someone in power does not like it (Roy 2015).

As far as intermediary liability is concerned, Section 79 of the Information Technology Act, 2000 provides that intermediaries (such as social networks) will not be held liable for unlawful content uploaded by others onto their websites, provided that they remove such content “expeditiously” once its unlawfulness is communicated to them. This immunity may be what leads to under-regulation of unlawful content: in the absence of any definition of “expeditiously,” companies that run social media platforms do not feel the need to respond quickly to complaints of unlawful behaviour on social media.

Relevance for India

Could Germany’s approach in the Network Enforcement Act be adopted for Indian conditions? The mistake made with Section 66-A (one which the Supreme Court also pointed out) was to try to create a new definition of unlawful speech for the purposes of the internet. In contrast, the Network Enforcement Act is concerned only with the enforcement of existing laws. To that extent, the approach of the German law is certainly worth emulating: creating a meaningful obligation on the part of social networks to stop unlawful content on the internet.

There are, however, two further concerns to be addressed before adopting this approach: first, a much wider range of content restrictions is permitted under Indian law than under German law; and second, there is an absence of meaningful recourse in cases of misuse by the government or powerful groups. There is merit in the argument that the range of restrictions permitted under Indian law may not be something that private companies, even if advised by a large team of well-versed lawyers, will be able to understand and apply systematically. Potential misuse could, however, be addressed through transparency requirements (as contained in the German law) and penalties for malicious complaints.

It remains to be seen how this law will work in Germany, and any dismissal of it as a failure, or hailing of it as a success, in less than a year must be taken with a pinch of salt. Nonetheless, India’s lawmakers, activists, and citizens should keep an eye on Germany as it looks to address the problem of hate speech on the internet.


1 The full text of the law is available at publicationFile&v=1 and the English translation at

2 In this article I have used “social network” to mean both the platform itself and the company which owns such a platform.


Frosio, Giancarlo F (forthcoming): “Why Keep a Dog and Bark Yourself? From Intermediary Liability to Responsibility,” International Journal of Law and Information Technology, viewed on 13 January 2018.

Hern, Alex (2017): “How Social Media Filter Bubbles and Algorithms Influence the Election,” Guardian, 22 May, viewed on 13 January 2018, 2017/may/22/social-media-election-facebook-filter-bubbles.

Huffington Post (2015): “India Has 22.2 Million Twitter Users: Report,” 28 January, viewed on 11 January 2018, n_6562950.html.

Kiberd, Roisin (2016): “Why 2016 Was the Year of the Algorithmic Timeline,” Motherboard, 25 December, viewed on 13 January 2018.

Kini, Sahil (2017): “Mind Your Language on the Indian Internet,” Livemint, 27 November, viewed on 11 January 2018.

Kumar, Alok Prasanna (2016): “Securing Women’s Right to Free Speech on Social Media,”
Economic & Political Weekly, Vol 51, No 30,
pp 10–11.

Livemint (2017): “India Now Has Highest Number of Facebook Users, Beats US: Report,” 14 July, viewed on 11 January 2018, /Indians-largest-audience-country-for-Facebook-Report.html.

McKay, Tom (2018): “Germany’s New Social Media Hate Speech Law Is Now Being Enforced,” Gizmodo, 1 January, viewed on 11 January 2018.

Narasimhan, T E (2017): “YouTube Looks to Double User Base in India, Take It to 400 Mn from 200 Mn,” Business Standard, 2 August, viewed on 11 January 2018.

Oltermann, Philip (2018): “Tough New German Law Puts Tech Firms and Free Speech in Spotlight,” Guardian, 5 January, viewed on 13 January 2018, /2018/jan/05/tough-new-german-law-puts-tech-firms-and-free-speech-in-spotlight.

Roy, Prasanto K (2015): “Why Online Harassment Goes Unpunished in India,” BBC News, 17 July, viewed on 13 January 2018.

Shreya Singhal v Union of India (2015): 5 SCC 1.

Toor, Amar (2017): “Germany Wants to Fine Facebook over Hate Speech, Raising Fears of Censorship,” Verge, 23 June, viewed on 11 January 2018, 15852048/germany-hate-speech-facebook-twitter-fine-censorship.

Updated On : 31st Jan, 2018



