December 11th, 2020
By Rehan Piracha
In a new letter to Prime Minister Imran Khan, big technology firms have outlined their objections to Pakistan's social media rules, renewing calls for a credible consultation process with the government.
In the letter, sent to the PM on December 5, members of the Asia Internet Coalition (AIC) have urgently called for a credible consultation process through which AIC members can provide substantive input to the “Removal and Blocking of Unlawful Content (Procedure, Oversight and Safeguards) Rules 2020” to address crucial issues such as internationally recognised rights to individual expression and privacy. “AIC members are alarmed by the scope of Pakistan’s new Rules, as well as the opaque process by which these rules were finalized.”
However, there has been no official public response from the Prime Minister’s office and Information Technology Ministry over the objections and concerns shared by big technology firms.
In the letter, released to the press on December 9, the AIC says the extensive and broad-based consultation that the prime minister had promised stakeholders in February never occurred. The Pakistan Telecommunication Authority (PTA) had committed during bilateral meetings with the AIC and its member companies to share a draft copy of the Rules. “Furthermore, the Ministry of Information Technology and Telecommunication recently updated the Rules on their website without explanation or due process. Industry stakeholders have therefore lost trust in the consultation process because it is neither credible nor transparent,” the letter adds.
The letter again warns the prime minister that the big technology firms might not be able to provide services to users in Pakistan due to the social media rules. “However, the rules, as currently notified and gazetted, would make it extremely difficult for AIC Members to make their platforms and services available to Pakistani users and businesses,” the letter reads.
The AIC says that instead of clarifying the scope of the powers given to the PTA, these rules create further confusion for both users and online platforms in Pakistan. “Large portions of the Rules are not only unworkable for global internet platforms, they go beyond the scope of the Parent Act (PECA 2016), putting their legality into question.”
In particular, the data localization requirements in the Rules will prevent Pakistani citizens from accessing a free and open internet and shut Pakistan’s digital economy off from the rest of the world, the AIC warns in the letter. The coalition says the PTA’s powers have been expanded excessively, allowing it to force social media companies to violate established human rights norms on privacy and freedom of expression.
The AIC says that the coalition shares the Prime Minister’s vision of a dynamic digital economic ecosystem for Pakistan, where platforms continue to drive substantial economic growth. “We now need your full and direct support in ensuring that Pakistan does not go down a highly counter-productive path that could derail the efforts that your government and the ICT industry have painstakingly invested in for many years”, the letter concludes.
What are the objections and concerns of Big Tech?
In its letter to the prime minister, the AIC has shared the coalition’s objections and concerns over certain portions of the Removal and Blocking of Unlawful Content (Procedure, Oversight, and Safeguards) Rules 2020.
Fixed turnaround times for blocking content – Rule 6(2)
The AIC says the exact time frame for complying with a notice is not something that should be stipulated in the Rules, as it will vary from case to case, depending on the complexities and volume of content under consideration. The AIC suggests that takedown requests should be responded to within a reasonable timeframe, or ‘without undue delay’.
Thresholds for enforcement – Rules 6(5) and 8
The AIC says effective enforcement should focus on systemic, intentional failures. According to the coalition, social media companies need a clear understanding of what constitutes “systemic failure” so they have a reasonable path to action. The coalition proposes that PTA’s primary means of identifying systemic failures should be the transparency reports produced by social media companies.
Registration, permanent office, and data localization requirements – Rule 9(5)
The AIC says provisions on registration, permanent office, and data localization requirements fall outside the scope of the parent legislation, namely section 37 of PECA, which tasks PTA with developing rules on safeguards, transparent processes, and effective oversight mechanisms for the exercise of its powers to block certain types of content. A narrow scope will also allow PTA to define and draw clear lines between legal and illegal speech and content, based on evidence of harm consistent with both international norms and Articles 10A, 19, and 19A of the Constitution of Pakistan, the AIC says.
The requirement to provide user data in decrypted format – Rule 9(7)
According to the coalition, the requirement in the rules to provide user data in a decrypted format contravenes existing law on user data disclosure. Specifically, it contravenes the scheme of PECA, under which PTA may only seek the removal of unlawful online content, and agencies authorized under Section 29 may seek user data pursuant to an order from a competent court for the disclosure or production of documents, as per the procedure laid down under PECA. Further, any social media company that is a U.S.-domiciled entity is subject to U.S. laws that regulate the circumstances under which a U.S.-based electronic communications provider may disclose user information.
Prevention of Live Streaming – Rule 9(9)
The AIC says the proactive filtering obligations contemplated under this Rule are contrary to Section 38(5) of PECA, which expressly rejects imposition of any obligation on intermediaries or service providers to proactively monitor or filter material or content hosted, transmitted or made available on their platforms.
Provisions like Section 38(5) of PECA are appropriate in that they encourage platforms to take a balanced approach to content removals. This is also in line with the United Nations’ Joint Declaration on Freedom of Expression on the Internet, which affirms that “intermediaries should not be required to monitor user-generated content.” Mandating proactive monitoring, on the contrary, creates a risk that platforms will instead take a “better safe than sorry” approach, blocking content at upload or implementing a “take down first, ask questions later (or never)” policy.
The AIC says that, as recognized in the 2019 UK White Paper on Online Harms, an approach that forces companies to monitor and detect all problematic content on their services would be impossible for platforms hosting large amounts of content. The notice and takedown system is the only pragmatic approach to tackle online content that may be unlawful, as determined by the relevant competent authority, the coalition proposes.