Rahat Kapur
Sep 15, 2023

Is the era of human-generated journalism officially over?

From AI anchors to articles churned out by ChatGPT, what’s the value today of a human journalist in a world where it costs only $400 to machine-generate the news? Plenty and then some, editor Rahat Kapur opines.

Photo: Getty Images.

On December 4, 2016, the Comet Ping Pong pizza shop in Washington D.C.’s Chevy Chase neighbourhood was bustling with families and hungry lunch-seekers, eager for a slice on a lazy Sunday afternoon. In the middle of the day, without warning, Edgar Maddison Welch of Salisbury, North Carolina entered the restaurant with a semi-automatic rifle in his hand and fired three rounds inside the building. He was arrested at the scene. In addition to the AR-15-style rifle, Welch was found to have a Colt .38 caliber handgun, a shotgun and a folding knife on his person and in his car. By a miraculous stroke of luck, no one was injured during the incident.

Shocked?

Maybe not so much, given the political context and state of today’s gun violence statistics in the U.S.

But, what if you found out this incident occurred as a direct result of misreported information?

Or that Welch’s actions were inextricably linked to widely circulated tweets spreading like wildfire across the internet, proclaiming that the peaceful pizza shop was the base of a paedophile sex ring involving Democratic presidential candidate Hillary Clinton, a theory spurred on by a leaked email and fodder spread by right-wing political opponents?

Or that Welch had gone to the pizza shop that day to “self-investigate” the conspiracy theory, take matters into his own hands and “rescue the children” he believed were trapped in the shop, all as a result of what he’d read on social media and on news bulletin board sites online?

Or even that, in spite of multiple debunkings by credible sources such as The New York Times, the FBI and the Metropolitan Police Department of the District of Columbia, the Comet Ping Pong restaurant has continued to receive death threats and violent Yelp reviews, and as recently as 2019 and 2020 has been victimised by believers of #PizzaGate (as it’s since been dubbed), suffering everything from arson to having its phone lines jammed for a day?

Are you shocked yet?

Welcome to the era of A-eye

There’s no denying that the landscape of information, and consequently journalism, has been irrevocably changed in the last two decades. Since the hysteria of a new millennium (and its associated Y2K madness), we’ve been awash with wave after wave of technological development: from the personalisation of devices to the thrills of mobility, to the galaxy-reverberating boom of social media and now, the official era of artificial intelligence (AI).

With the rise of automated content generation in major news agencies, including the likes of the Associated Press, The New York Times, The Washington Post and countless others, the business of news is no longer about the race for timeliness alone, but also a bid to deliver it in the most efficient and cost-effective manner possible.

Where journalists were once charged with arduous (if not mundane) but fundamental tasks, analysing data, identifying trends and uncovering patterns, there’s now a robot or machine-learning mechanism that can do that, and do it faster, cheaper and, in many cases, more accurately.

According to recent Glassdoor statistics, the average journalist in Singapore makes around SGD$8,725 a month (USD$6,405), amounting to an annual salary of around SGD$105,000 (USD$76,000). That’s a pretty penny, given that the newly cropped-up generative AI tools on the market can cost as little as USD$400 when combined into a powerful information machine, one capable of producing mass articles, tweets and even the daily news.

With statistics such as these in mind, what incentive do we have left to rely on the human eye anymore, when we’ve got the A-eye?

The empowerment argument

There are some common rebuttals offered to sceptics of the AI movement in the media space, the main one being that the use of artificial intelligence is not only inevitable, but also an unprecedented tool of enablement when it comes to empowering journalists.

Others include:

  • The less time journalists are forced to spend on tasks such as data processing, visualising and fact-checking volumes of information, the more time they can devote to investigation, source-searches, story validation and lead generation. After all, these are the key pillars of what makes a good and memorable piece of coverage.
     
  • Where press releases are concerned, AI tools can help with brevity, summarisation and re-writing of information that doesn’t require a great deal of processing and can simply be “churned” and posted in the interests of time.
     
  • AI is a supplementing technology to aid and assist journalists, not to replace them. Where leveraged appropriately and responsibly, it can result in better, quicker and more factually-accurate stories as long as there is sufficient human oversight.

The third point is a particularly pertinent one, as it is how most news organisations are currently justifying the embedding of AI into their fundamental delivery cycles.

The Associated Press announced in July 2023 that it would be pursuing a partnership with OpenAI, enabling the ChatGPT creator to use the news organisation’s archives going back to 1985 to train their AI models.

Also in July of this year, The New York Times reported that it, along with The Washington Post and News Corp (which owns The Wall Street Journal), had been pitched Google’s latest Genesis offering. The tool utilises AI technology to ingest current events and generate news content, acting as a kind of personal assistant that automates several tasks and frees up time for journalists to pursue other, more meaningful work. However, The New York Times also reported that some executives who saw the pitch found it “unsettling”, saying it took for granted the amount of work and effort that goes into the production of “accurate and artful” news stories.

Executive editor of the South China Morning Post (SCMP) Chungyan Chow echoes this sentiment in many ways. Chow joined the Post in 1998, working over two decades to rise through the ranks to his current role via the City, China and Business desks. He oversees the newsroom’s day-to-day operations, as well as its digital site and print publications, alongside supervising the SCMP’s China and US coverage.

SCMP's executive editor Chungyan Chow.

“Journalism is an extremely complicated job. AI can perform better and cheaper than any human journalist in some aspects, but they have their limitations as well,” Chow shares with Campaign.

“AI is great at assimilating information, analysing structured data, verifying information and generating content at a lightning speed. But there are other areas where it cannot perform so well, such as understanding human emotions, making ethical judgements, producing real insights and most importantly, building trust.”

And perhaps therein lies the issue with the empowerment argument. What use is brevity when the crux of a good story lies in the journalist’s ability to derive insight, even from the most minute threads of information?

“A core part of a journalist’s job is to connect with people, source information from different parties and construct a fair and balanced narrative. This requires the journalist to exercise professional judgment and understand the complex human emotions and motives at play. The journalist often needs to talk, preferably face to face, to the human actors involved in the story,” says Chow. “The kind of trust a reporter can build with the interviewees will be critical to the success of the story.”

But what about when it comes to more menial tasks like the aforementioned repurposing or churn of press release cycles? Or reporting the news itself, as a slew of new AI-generated anchors flood the market, be it China’s Xinhua anchor, India Today’s ‘Sana’, Odisha TV’s ‘Lisa’ or Indonesian channel TVOne’s multiple AI news anchors?

China's "Xinhua" news anchor who can work 24 hours a day.
 
India's "Sapna" AI-generated news anchor.

Chow remains unconvinced.

“AI anchors may become a fad because of sheer novelty, but I don’t think they will replace human anchors for good. In the end, a good news anchor wins the audience with her or his unique personality. Even the occasional bloopers are fondly remembered. Once the novelty of virtual anchors wash off, I think people will still want real human anchors to do news broadcasts,” he shares.

But for Chow, the argument extends beyond the realm of empowering reporters alone. He rightfully raises the importance of trust between the consumers of news and media outlets as a key factor of consideration when utilising AI.

“Readers must have trust in a news organisation. This kind of relationship building will require a human touch and understanding that are beyond today’s AI. What will happen in the future is impossible to tell. But at this stage, we don’t see how AI can replace human journalists at a large scale.”

The ethical dilemma

The aforementioned #PizzaGate is only one story illustrating the ethical and moral concerns that shroud the role of artificial intelligence in the media space.

The fact of the matter is that, as it stands today, there are critical and inherent limitations to AI that prevent it from fully assuming the role of human reporters. One such limitation is the lack of contextual understanding and judgment, as noted by Chow. The business of news is a nuanced one, and content that lacks that nuance can lead to inaccuracies or bias, which in turn can deceive and manipulate public perception.

The question of accountability is also a paramount one. Who is policing the dissemination of information, and consequently misinformation, on a global scale, especially when sovereign, country-led regulatory frameworks are already burdened with the task of managing borderless internet content in every corner of the world?

How do you hold AI accountable for its errors or misreporting in the way you can a human journalist? If news is misreported in the U.S. but incites an inflammatory act in Australia, which country regulates the act, and how do you establish a link that results in justice for the victimised parties involved?

We’ve already seen this scenario play out in several social media contexts, with the likes of Meta, Twitter, Google and YouTube all currently facing scrutiny and legal proceedings across Canada, Europe and Thailand.

Not to mention, all of this comes before we’ve even explored the significance of protecting privacy, data ownership and copyright. These topics sit at the heart of the debate as publishers battle it out with generative AI creators for royalties and payments in exchange for access to the valuable archives and information sets needed to train these artificial models. Their argument is clear and relevant: we cannot risk the disintegration of human creativity and its sanctity in the name of educating machines to replicate human-led tasks.

In excerpts from a News Corp earnings call earlier this year, CEO Robert Thomson made this exact point, acknowledging the paradigm shift in the AI-news relationship.

 "There have certainly been fundamental changes in the media landscape. We have led the quest for appropriate compensation for content from the big digital platforms, and that quest, begun publicly in 2007 when I testified before the House of Lords about the challenges for publishers and society in the internet age, has entered a new, fascinating phase with the rise of Generative AI,” said Thomson. “It is crucial for our societies that AI is replete with EI, that recomposition does not lead to the decomposition of creativity and integrity.”

Thomson also reiterated his worry about the misuse of information, taken out of context and unethically scraped and ingested to train AI engines, and like SCMP’s Chow, relayed the consequent implications for consumers in wider society.

“We have been characteristically candid about the AI challenge to publishers and to intellectual property. It is essentially a tech triptych. In the first instance, our content is being harvested and scraped and otherwise ingested to train AI engines. Ingestion should not lead to indigestion. Secondly, individual stories are being surfaced in specific searches. And, thirdly, original content can be synthesised and presented as distinct when it is actually an extracting of our editorial essence.”

News Corp CEO Robert Thomson.

"These super snippets, distilling the effort and insight of great journalism, are potentially designed so the reader will never visit a news site, thus fatally undermining journalism and damaging our societies.”

In September 2023, Reuters reported that News Corp was engaged in "various negotiations" with artificial intelligence companies over their use of its content, with Thomson quoted as saying:

"What you will see over time is a lot of litigation; some media companies have already begun those discussions. Personally, we're not interested in that at this stage. We're much more interested in negotiation."

So, what happens next?

No one can deny AI’s striking emergence in the modern newsroom, and its revolutionising of content generation, data analysis, and audience insight and engagement. Where applied correctly, there is value to be extracted.

"I believe AI will fundamentally change the news industry and journalism, but it will not mark the end of content generation by journalists. It will force journalists to focus on areas that are unique to humans and leave the other parts to artificial intelligence. In other words, journalists need to understand how to use AI to help their research and news production and identify what areas in their work that humans will hold unique advantages and can add value to," concludes SCMP's Chow.

But there still remains a critical imbalance in the overall integrity, accuracy and ethical standards of news reporting when it comes to AI. For now, there is simply no substitute for human-led quality. It might seem like mere news, but facets such as democracy, freedom of the press, justice and social equity rest on the foundations of this information, making the work of journalists not just a fundamental profession, but a key pillar of human existence itself. If we want to raise educated, emotionally intelligent, thriving societies, there is no stand-in for the immeasurable insight that human beings bring to the craft of reporting.

In the words of Warren Buffett: "The smarter the journalists are, the better off society is."

Source:
Campaign Asia
