Nikita Mishra
Jun 21, 2022

Singapore govt clamps down on harmful social media content

Singapore is stepping up online safety and scrutiny, as social media services are directed to remove harmful digital content.

(Unsplash)

Social media platforms in Singapore will soon have to act against “harmful online content”, such as sexual content, content that encourages self-harm, or content that threatens public health, under a proposed directive by the Infocomm Media Development Authority (IMDA) to protect consumers.

"Online safety is a growing concern and Singapore is not alone in seeking stronger safeguards for our people," said Josephine Teo, communications and information minister and minister-in-charge of smart nation and cybersecurity, in a Facebook post. "Over the years, social media services have put in place measures to ensure user safety on their platforms. Still, more can be done given the evolving nature of harms on these platforms and the socio-cultural context of our society."

Proposed guidelines

According to an online survey conducted in January this year by the Sunlight Alliance for Action, a public-private-people partnership that tackles online harms, nearly half of the 1,000 Singaporeans polled said they had negative personal experiences with online safety.

Teo announced that the government aims to introduce two codes of practice to bolster online safety and security for its people.

Under the first code, high-reach social media services will be expected to put in place enhanced safety standards to ensure users are not exposed to harmful or inappropriate content. Additional safeguards will have to be put in place for young users under the age of 18.

The second proposal would empower the IMDA to direct any social media service to remove specified types of “flagrant content”.

A public consultation exercise will commence next month.

Social media’s path forward

While the contribution of social media platforms and new technologies cannot be denied, their use has resulted in many unforeseen and unintended consequences. “More countries are pushing to enhance online safety, and many have enacted or are in the process of enacting laws around this,” noted Teo.

Under the new directive, action can be taken against social media platforms that fail to comply and close the online safety gap. Singaporeans are proactively encouraged to report child sexual exploitation material, abusive material, and terrorist content they encounter online. Networks must put in place robust accountability processes to handle and act on such complaints.

Source:
Campaign Asia
