On 13 July 2022, the Ministry of Communications and Information (“MCI”) launched a public consultation to seek feedback on proposed measures to enhance online safety for Singapore-based users of social media services. The consultation closes on 10 August 2022.
MCI observes that globally, there is widespread acceptance of the view that social media services distributing online content have a responsibility to keep their users safe from harm. While many social media services have made efforts to address harmful content, its prevalence remains a concern. Such content includes material that endorses acts of terrorism and extreme violence against certain communities, encourages suicide and self-harm, or threatens physical or mental well-being through harassment, bullying or the non-consensual sharing of sexual images.
To address the risks of harmful online content, MCI is considering issuing two new Codes:
- Code of Practice for Online Safety: Designated social media services with significant reach or impact will be required to have appropriate measures and safeguards to mitigate exposure to harmful online content for Singapore-based users. These include system-wide processes to enhance online safety for all users, with additional safeguards for young users (i.e. individuals below the age of 18); and
- Content Code for Social Media Services: MCI proposes to grant the Info-communications Media Development Authority (“IMDA”) the power to direct any social media service to disable access to content that is particularly harmful to Singapore society where such content has not been detected by the social media services themselves. An example of such content is content that incites racial or religious disharmony or intolerance.
This article highlights key aspects of the proposed Codes.
Code of Practice for Online Safety
MCI proposes that the Code of Practice for Online Safety address the following:
- User safety: Designated social media services will be required to have community standards for the following categories of harmful content, examples of which are set out in Annex A to the consultation paper:
- Sexual content
- Violent content
- Self-harm content
- Cyberbullying content
- Content endangering public health
- Content facilitating vice and organised crime
These designated social media services will be expected to moderate content to reduce users’ exposure to such harmful content, e.g. by disabling access to it when reported by users.
- Proactive detection and removal of certain content: Designated social media services will be required to proactively detect and remove child sexual exploitation and abuse material and terrorism content.
- Allowing users to manage exposure: Designated social media services could also provide users with tools and options to manage their own exposure to unwanted content and interactions, e.g. tools that allow users to hide unwanted comments on their feeds, and limit contact and interactions with other users.
- Safety information: Designated social media services will be required to provide safety information that is easily accessible to users, e.g. Singapore-based resources or contacts to local support centres. Relevant safety information, such as helplines and counselling information, may also be pushed to users that search for high-risk content, e.g. those related to self-harm and suicide.
- Additional safeguards for young users: Designated social media services will be required to put in place additional safeguards to protect young users, including stricter community standards for young users, and tools that allow young users or parents/guardians to manage and mitigate young users’ exposure to harmful content and unwanted interactions. Examples of such tools and how they operate are set out in the consultation paper.
- User reporting and resolution: Designated social media services will be required to provide an efficient and transparent user reporting and resolution process to enable users to alert these services to harmful content. The process should be easy to access and use, and should allow users to report harmful online content in the categories listed above (under User safety). The designated social media services should assess and take appropriate action on these user reports in a timely and diligent manner.
- Accountability: Designated social media services will be required to produce annual reports on their content moderation policies and practices, and the effectiveness of their measures in improving user safety. These reports would be made publicly available on the IMDA website, thus allowing users to better understand how their exposure to harmful content is reduced on the services they use.
Content Code for Social Media Services to deal with extremely harmful content
The proposed Content Code for Social Media Services will deal with instances where extremely harmful content relating to suicide and self-harm, sexual harm, public health, public security, or racial or religious disharmony or intolerance remains online. Examples of such content are set out in Annex B to the consultation paper. Given the concerns about the impact of such extremely harmful content, it is proposed that IMDA be allowed to direct any social media service to disable access to specified harmful content for users in Singapore, or to disallow specified online accounts on the social media service from communicating content and/or interacting with users in Singapore.