Nikita Mishra
Jun 21, 2022

Singapore govt clamps down on harmful social media content

Online safety oversight is being stepped up in Singapore, as social media services are set to be directed to remove harmful digital content.

(Unsplash)

Social media platforms in Singapore will soon have to act against “harmful online content”, such as sexual content, material promoting self-harm, and content that threatens public health, under a proposed directive by the Infocomm Media Development Authority (IMDA) to protect consumers.

"Online safety is a growing concern and Singapore is not alone in seeking stronger safeguards for our people," said Josephine Teo, communications and information minister and minister-in-charge of smart nation and cybersecurity, in a Facebook post. "Over the years, social media services have put in place measures to ensure user safety on their platforms. Still, more can be done given the evolving nature of harms on these platforms and the socio-cultural context of our society."

Proposed guidelines

According to an online survey conducted in January this year by the Sunlight Alliance for Action, a public-private-people partnership that tackles online harms, nearly half of the 1,000 Singaporeans polled reported negative personal experiences with online harms.

Teo announced that the government aims to introduce two codes of practice to bolster online safety and security for its people.

Under the first code, social media services with high reach will be expected to put in place robust safety standards to ensure users are not exposed to harmful or inappropriate content. Additional safeguards will have to be put in place for young users under the age of 18.

The second code would empower IMDA to order any social media service to remove specified types of “flagrant content”.

A public consultation exercise will commence next month.

Social media’s path forward

While the contribution of social media platforms and new technologies cannot be denied, their use has also brought many unforeseen and unintended consequences. “More countries are pushing to enhance online safety, and many have enacted or are in the process of enacting laws around this,” Teo noted.

Under the new directive, action can be taken against social media platforms that fail to close the online safety gap. Singaporeans are actively encouraged to report child sexual exploitation and abuse material, as well as terrorist content, that they encounter online. Platforms must put in place robust accountability processes to handle and act on such complaints.

Source: Campaign Asia
