Introduction:

As India prepares for the implementation of the Digital India Act, the Ministry of Electronics and Information Technology (MEITY) has taken a rigorous approach to consolidating and standardizing the current regulatory framework. The new legislation aims to address emerging technological trends and market developments. For further background on the Digital India Act, please refer to our previous publication "Digital Media Act FAQs - India", published in April 2023. A draft of the new law is expected to be released for public consultation in the coming weeks.

Key Provisions for Online Safety and Trust:

During the consultation process, the regulator made clear that the Digital India Act will include specific provisions for online safety and building trust. These provisions cover a range of issues, including revenge porn, cyber-flashing, dark web activities, the protection of women and children, defamation, cyber-bullying, doxing, salami slicing, and more. The proposed act is also expected to grant social media platforms moderate discretionary powers to tackle the spread of fake news.

Regulator's Commitment to Combating Cybercrimes:

On June 9, MEITY reaffirmed its commitment to strictly regulate the internet and combat new cybercrimes in India. The regulator released a list of eleven categories of content that will be considered objectionable and banned from social media platforms:

- unauthorized content belonging to another person;
- content inciting violence based on gender, race, ethnicity, religion, or caste;
- content harmful to children;
- infringement of intellectual property rights;
- dissemination of misleading or false information;
- impersonation;
- content threatening national unity, security, or sovereignty;
- content containing harmful computer code;
- unverified online games;
- advertisements or promotions of unauthorized online games; and
- content violating existing laws.

Current Regulatory Framework and International Trends:

India's current regulatory framework, the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021, governs online content and provides intermediaries with conditional immunity (safe harbour). While there is no blanket ban under the existing law, many of the items listed above already form part of the intermediary due-diligence obligations. Non-compliance by online platforms can lead to the loss of liability protection, and offenders may face criminal and civil charges. Additionally, the government has the authority to block material independently, without waiting for service providers to act.

Globally, there is an active trend among lawmakers to regulate intermediaries and online platforms. The EU Digital Services Act, which becomes fully applicable in February 2024, classifies intermediaries based on their role and size and imposes graded obligations on each type, with the most stringent rules reserved for very large online platforms and search engines. The UK Online Safety Bill sets out a list of illegal content that platform providers must remove and emphasizes measures to prevent children from accessing inappropriate content. Australia's Online Safety Act, in effect since January 2022, increases accountability for online service providers and establishes cyber-abuse schemes with separate provisions for adults and children. Countries such as China and Germany have implemented stringent provisions for regulating and monitoring content, backed by severe penalties for non-compliance.

Balancing Freedom of Expression and Regulation:

India's proposed Digital India Act raises crucial questions about the balance between freedom of expression and necessary content regulation. Striking the right balance will require careful deliberation, clear definitions, and transparent enforcement mechanisms. The regulator should also provide clear guidelines for the protection of minors' data. It is vital to avoid subjective interpretations and to ensure that content creators understand the compliance criteria. Effective enforcement will require collaborative efforts between government agencies, technology companies, and civil society. These efforts should complement initiatives such as promoting digital literacy, encouraging responsible online behaviour, and supporting platforms in their efforts to filter out objectionable content.

Preparing for Change: How Digital Intermediaries Can Adapt

As we await the finalization of the law, it is crucial for social media and online platform service providers to prepare themselves for the upcoming changes. While the specific requirements of the law remain to be seen, it is evident that companies will need to adjust their content monitoring practices.

Implementing systematic processes to monitor social media content and establishing clear policies will be essential steps. Additionally, companies should consider including appropriate disclaimers to protect their positions as service providers, depending on the final approach of the law.
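By way of illustration only, the sketch below shows what a first-pass, systematic screening step might look like in practice. It assumes a simple keyword/pattern-based flagging pipeline; the category names, patterns, and the screen_content helper are all hypothetical, and a production system would rely on trained classifiers and human review rather than regular expressions.

```python
import re
from dataclasses import dataclass, field

# Hypothetical content categories loosely echoing the objectionable-content
# list above; the patterns are illustrative placeholders, not real policy rules.
FLAG_PATTERNS = {
    "impersonation": re.compile(r"\bofficial account\b", re.IGNORECASE),
    "misleading_information": re.compile(
        r"\bmiracle cure\b|\bguaranteed returns\b", re.IGNORECASE
    ),
    "harmful_computer_code": re.compile(r"<script\b", re.IGNORECASE),
}

@dataclass
class ModerationResult:
    content_id: str
    flags: list = field(default_factory=list)

    @property
    def requires_review(self) -> bool:
        # Anything flagged is routed to a human moderator, not auto-removed.
        return bool(self.flags)

def screen_content(content_id: str, text: str) -> ModerationResult:
    """Run a post through the pattern checks and record each category it trips."""
    result = ModerationResult(content_id)
    for category, pattern in FLAG_PATTERNS.items():
        if pattern.search(text):
            result.flags.append(category)
    return result

if __name__ == "__main__":
    outcome = screen_content("post-123", "Invest today for guaranteed returns!")
    print(outcome.content_id, outcome.flags, outcome.requires_review)
    # post-123 ['misleading_information'] True
```

The point of such a sketch is the workflow, not the patterns: every piece of content passes through a documented, repeatable check, and anything flagged leaves an auditable trail for human review, which is the kind of systematic process and clear policy the paragraph above contemplates.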

Education will also play a significant role. Companies will likely be expected to take responsibility for educating users about responsible digital behaviour and promoting media literacy. Enhancing cybersecurity measures should also be a priority to ensure the safety of users and their data.

A collaborative approach, working in conjunction with regulators, will be instrumental in achieving the objectives of the proposed law. By engaging in open dialogue and cooperation, digital intermediaries can help shape the law's purpose and contribute to its effective implementation.

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.