Written by The Conversation
AI decides what we see online. It’s time digital platforms tell us exactly how they do it

If you suffer from information overload, or are unsure what to trust online, you’re not alone. Australians are increasingly disengaging from traditional news, turning instead to social media, influencers and – more recently – generative artificial intelligence (AI) chatbots and summaries.

It’s a murky, polluted world where opaque algorithms decide what you see. They’re known to have little regard for accuracy, quality or the evidence-based reporting we need for a safe and thriving community.

At the same time, local journalism is disappearing. Distrust in mainstream news is growing. This issue has escalated rapidly with “zero-click” AI search results. Instead of serving links, they show the information upfront. This decreases traffic to news websites, further reducing audience, subscription opportunities and revenue. The rapid spread of AI has pushed an already fragile news ecosystem closer to breaking point.

Earlier this year, the News Futures: Media Policy Roundtable brought together 45 leaders from industry, government, not-for-profit organisations, digital platforms and academia.

The attendees agreed that the opacity of algorithms on social media, search and AI platforms – which decide what is shown, ranked or omitted with little accountability – has become a core threat to journalism and audience trust. Published today, the resulting report proposes a paradigm shift in how we support and define journalism in Australia.

Misinformation is flourishing

Misinformation flourishes when there is high demand for information but insufficient verified evidence. A healthy (and prominent) supply of quality news and information can counterbalance misinformation. Our research shows a strong link between news consumption and people’s ability to verify misinformation.

For consumers, laws and civic education have not kept pace with AI content, such as deepfakes. There are no clear standards for showing where online content comes from or standard guidelines for checking if it’s real. Because many AI systems work like black boxes, it’s also hard to know who is responsible when they make mistakes or show bias.

Australians already have very low confidence in their ability to verify misinformation. Only about 40% are confident they can check if a website or social media post can be trusted, and only 43% are confident they can check if information they find online is true.

This problem is being compounded by the growing prevalence of AI slop and hallucinations (low-quality and erroneous content). In fact, Australians are among the most concerned about online misinformation globally.

Read more: Slopaganda wars: how (and why) the US and Iran are flooding the zone with viral AI-generated noise

People don’t know whom to trust

Experts at the roundtable were worried about low media and AI literacy among citizens. Many Australians struggle to verify information online, and are unsure where to turn for trusted sources.

When everything starts to look unreliable, switching off can feel like the safest option, which many Australians choose to do – 69% avoid news often, sometimes or occasionally.

The problem is digital platforms are an unreliable interface for news. Through algorithms, they make invisible and unaccountable choices that reshape the public’s access to information. In selecting where information is drawn from, these digital intermediaries can create new “winners” and “losers”, elevating some content above others with little regard for quality or accuracy.

But there is no impetus for platforms to explain how their algorithms work or when they change, how news is prioritised (or de-prioritised), or how AI-generated information is produced.

There is an urgent need for transparency in algorithmic curation and mandatory labelling of AI-generated content.

Where to from here?

The roundtable participants identified five priorities that, together, could drastically improve our information ecosystem. Three of those specifically target AI.

1. Greater transparency from big tech platforms. Australians deserve to know how algorithms curate news on search engines, social media, and AI chatbots. They also need to know when AI is involved in producing content. Clear labelling and disclosure rules would help rebuild trust and give users more control.

2. Fair rules for AI use of news. AI companies should not be able to take journalism for free. Industry-wide licensing agreements, copyright reform and stronger competition law could ensure news organisations are compensated when their work is used to train generative AI tools.

3. Prioritising media and AI literacy education across the nation. Educating people on how algorithms work and how to spot bias and misinformation is one of the fastest and most cost-effective interventions available. And it’s not just for schools – adults need ongoing opportunities to upskill too.

4. Journalism funding should reflect its role as a public good. One-off grants are not enough. Proposals such as a tax offset for journalists’ salaries offer a sustainable alternative that could support newsrooms directly, especially small and regional outlets, while remaining accountable.

5. Journalism training for news influencers, content creators and digital-first outlets. A common industry code is required to ensure the quality of the whole news ecosystem, and the industry needs to work on this together.

Society can’t afford an information environment in which invisible AI dictates what we see. Without action, the public interest journalism that underpins democracy and social cohesion will continue to crumble.

Read more: https://theconversation.com/ai-decides-what-we-see-online-its-time-digital-platforms-tell-us-exactly-how-they-do-it-281327
