Clippings

Criminologist calls for new laws to curb AI porn


GEORGE TOWN: New laws and mechanisms are needed to curb the spread of Artificial Intelligence (AI)-generated pornography, said a criminologist.

Datuk Dr P. Sundramoorthy, from Universiti Sains Malaysia’s Centre for Policy Research, warned that such content posed serious risks to the mental health of victims, especially minors.

“The law is playing catch-up while technology is sprinting ahead,” he said.

“In the meantime, our youths are exposed, our victims are silenced, and our justice system is ill-equipped,” he told the New Straits Times.

He said the Penal Code was not designed for a world where “anyone with a smartphone can destroy a person’s reputation in minutes”.

“The fact that a teenager could so easily exploit his peers using off-the-shelf technology should terrify us,” he said. Sundramoorthy was referring to the arrest of a 16-year-old private school student in Johor for allegedly using AI to edit and sell lewd images of his female classmates.

He is being investigated under Section 292 of the Penal Code for selling and distributing obscene material and Section 233 of the Communications and Multimedia Act for improper use of network facilities.

Sundramoorthy said Malaysia needed laws that reflected the realities of digital abuse and empowered victims with protection and redress.

“Victims often suffer twice. Once from the abuse itself, and again when society shames them into silence. This mirrors the trauma of sexual assault, where victims are blamed instead of supported.”

He called for legislation that defines and criminalises AI-generated and deepfake pornography, and for fast-track content removal mechanisms on social media.

“This could be similar to the ‘Take it Down’ Act in the United States,” he said.

“The time for reform is now. Not after another teenager is victimised, not after another viral scandal.”

Women’s Aid Organisation (WAO) advocacy officer Tamyra Selvarajan said Malaysia’s reactive legal framework failed to address the long-term psychological and reputational harm faced by survivors.

“Our laws are growing, but they are still largely punitive, not preventive or restorative,” she said.

“There is no equivalent of the ‘Take it Down’ Act here. Survivors are often left waiting, reporting to authorities while harmful content remains live and circulating.”

Tamyra said survivors, particularly women and minors, often navigated a confusing legal and enforcement maze, with no clarity on which agency held jurisdiction.

“The problem is far from theoretical,” she said.

Tamyra said WAO handled a case this year involving a woman victimised by AI-generated porn.

In another case documented by the All Women’s Action Society (Awam) Telenita Helpline, a man was blackmailed with intimate videos recorded during a video call.

Tamyra said these cases highlighted key system gaps: no centralised reporting platform, no mandated takedown timelines and no survivor-centred support services.

She urged the government to consider a law akin to the US “Take it Down” Act, noting that existing Malaysian legislation, including the Penal Code, Communications and Multimedia Act, Anti-Sexual Harassment Act and the Sexual Offences Against Children Act, was not designed for the era of AI and deepfakes.

“If a country like the US, often criticised for its fragmented approach to tech regulation, can pass a federal law within months, Malaysia must at least begin with serious intent.

“What we’re asking for is not censorship, it’s dignity, safety, and justice,” she said.


This article first appeared on NST.