Singapore lays out proposals to shield young social media users from harmful content; seeks public feedback
www.channelnewsasia.com
File photo of mobile apps on a screen. (Photo: AFP/ARUN SANKAR)
13 Jul 2022 01:41PM (Updated: 13 Jul 2022 01:41PM)
SINGAPORE: A public consultation was launched on Wednesday (Jul 13) to seek views on the Government's proposed measures to enhance online safety for users of social media platforms, including young people.
Two sets of proposals were announced by the Ministry of Communications and Information (MCI) in June.
The first, a Code of Practice for Online Safety, will require social media services with significant reach or impact to have system-wide processes to mitigate exposure to harmful online content for Singapore-based users, including people below the age of 18.
The second, a Content Code for Social Media Services, will allow the Infocomm Media Development Authority (IMDA) to direct any social media service to disable local access to content that is deemed harmful to Singapore's society, such as online material that incites racial or religious disharmony.
"We recognise that some social media services have put in place measures to protect their users. However, such measures vary from service to service," said MCI in its public consultation paper.
"Additionally, when evaluating harmful content on social media services, Singapore’s unique socio-cultural context needs to be considered. Given the evolving nature of harmful online content, more can be done, especially to protect young users."
"These designated services will also be expected to moderate content to reduce users’ exposure to such harmful content, for example, to disable access to such content when reported by users," said MCI.
"For child sexual exploitation and abuse material, and terrorism content, these services will be required to proactively detect and remove such content."
For young users, MCI proposed additional safeguards such as including stricter community standards and tools that allow young people or their parents to manage their exposure to harmful content.
The ministry said these tools could include those that limit the visibility of young users’ accounts to others.
"The tools could be activated by default for services that allow users below 18 to sign up for an account," it added.
SAFEGUARDS FOR YOUNG PEOPLE
Under the Code of Practice for Online Safety, authorities are considering requiring designated social media services to have community standards for six categories of content: sexual content, violence, self-harm, cyberbullying, content endangering public health and content facilitating vice and organised crime.
"These designated services will also be expected to moderate content to reduce users’ exposure to such harmful content, for example, to disable access to such content when reported by users," said MCI.
"For child sexual exploitation and abuse material, and terrorism content, these services will be required to proactively detect and remove such content."
For young users, MCI proposed additional safeguards, such as stricter community standards and tools that allow young people or their parents to manage their exposure to harmful content.
The ministry said these tools could include those that limit the visibility of young users’ accounts to others.
"The tools could be activated by default for services that allow users below 18 to sign up for an account," it added.