Prof Nick Couldry from the LSE on how social media has failed society
Nick Couldry is Professor of Media, Communications and Social Theory Emeritus and Professorial Research Fellow in the Department of Media and Communications at the London School of Economics and Political Science. As a sociologist of media and culture, he approaches media and communications from the perspective of the symbolic power that has historically been concentrated in media institutions. He is interested in how media and communications institutions and infrastructures contribute to various types of order – social, political, cultural, economic, and ethical. Over the past 10 years, his work has increasingly focussed on data questions, and on the ethics, politics and deep social implications of Big Data and small data practices. He is the author or editor of 17 books and many journal articles and book chapters. He recently co-founded the Tierra Comun trilingual website (English, Spanish and Portuguese) to encourage networking with and among Latin American scholars and activists interested in data colonialism. Nick Couldry’s most recent book is The Space of the World: Can Human Solidarity Survive Social Media and What if it Can’t? It is the first of a three-book series titled Humanising the Future. We are at the International Communication Association’s 75th annual conference in Denver, Colorado, where we will discuss his most recent work.
--------
47:42
--------
Lizzie O'Shea on why digital rights are human rights
In recent years, many major companies, both in Australia and around the world, have conspicuously failed to protect their customers’ data, leading to personal details being shared on the dark web. Global platform companies have facilitated the spread of disinformation and misinformation, while their algorithms have contributed to the fragmentation and polarisation of society. But governments in some parts of the world have sought to force these companies to lift their game, imposing more rigorous standards that mandate the protection of privacy and user data. In Australia, the government has passed new laws including the creation of a tort for serious invasions of privacy and expanding the investigative and enforcement powers of the Office of the Australian Information Commissioner. Lizzie O’Shea is the founder and chair of Digital Rights Watch, an Australian non-government organisation which advocates for human rights protections in the digital world. Digital Rights Watch focuses on issues such as privacy, security, data rights, access to data and technology, and the role of journalism in holding technology companies to account. She also sits on the board of Blueprint for Free Speech and the Alliance for Gambling Reform. She’s a past recipient of the Davis Projects for Peace Prize and has been named a Human Rights Hero by Access Now. Her 2019 book Future Histories: What Ada Lovelace, Tom Paine, and the Paris Commune Can Teach Us About Digital Technology was shortlisted for the Victorian Premier’s Literary Award. Lizzie is also a regular panellist on the Burning Platforms podcast alongside Peter Lewis, who appeared on our first episode.
--------
54:14
--------
"A flashing warning light": discussing the 2025 Edelman Trust Barometer
The 25th anniversary edition of the Edelman Trust Barometer revealed that Australia has slipped into distrust territory, with a profound global shift towards acceptance of aggressive action and deepening fears giving rise to a widespread sense of grievance. We were delighted to co-host the launch of the 2025 Edelman Trust Barometer on 18 March. In collaboration with Edelman Australia and the Centre for AI, Trust and Governance, we brought together around 160 people at The Sybil Centre at the University of Sydney. This episode contains the panel discussion from the launch, facilitated by media commentator Tim Burrowes. Terry was joined on the panel by Kim Portrate, Milly Bannister and Jared Mondschein. This followed Terry's opening keynote address and a speech by Tom Robinson, CEO of Edelman Australia, which provided detailed analysis and insights from the Australian report – a separate conversation between Tom and Terry can be found in our previous episode. We highly recommend you download the report to find out more about the state of societal trust in Australia and how it may impact your industry.
--------
39:00
--------
Tom Robinson on trust's decline and the rise of grievance
As we’ve discussed extensively on the podcast, trust in public institutions is declining. But how do we know this, and how do we measure how much things have changed? The international communications firm Edelman has been tracking this issue for 25 years, and its Edelman Trust Barometer has become one of the most authoritative global sources on trust in society. This year, their study found that, globally, there has been what they term a profound shift to acceptance of aggressive action, with increased polarisation, deepening fears, and a widespread sense of grievance. The 2025 edition of the Edelman Trust Barometer was released a few weeks ago, and the CEO of Edelman Australia, Tom Robinson, joins Terry to explore its findings in detail. Before joining Edelman, Tom spent more than a decade at MediaCom, working with high-profile brands on their marketing and content strategies. He also has extensive experience with digital media.
--------
24:17
--------
Prof Heather Ford on how to build AI systems we can trust
As AI continues to make its way into more aspects of life, some interesting trends have been observed in how the public feels about these new, increasingly pervasive services. The developers of AI promise that their systems will produce reliable, comprehensive, and bias-free results. But national surveys consistently show that the public is sceptical towards AI. And yet experimental studies show that in practice, people trust AI more than one might suspect. Can increasing AI literacy help to overcome this deficit, teaching us what to trust when it comes to AI and where we’re right to be cautious? And if so, how should literacy initiatives balance learning how AI works in practice with imagining how AI could or should work in the future? Today’s guest, Professor Heather Ford, has been thinking extensively about these issues. She’s an ARC Future Fellow and Professor in the School of Communications at UTS. She is the Coordinator of the UTS Data and AI Ethics Cluster, an Affiliate of the UTS Data Science Institute, and an Associate of the UTS Centre for Media Transition. She was appointed to the International Panel on the Information Environment (IPIE) in 2023. Heather Ford is currently conducting research funded by the Australian Research Council and the Wikimedia Foundation on Wikipedia bias, question-answering technologies, digital literacy and the impact of generative AI on our information environment. Previously she has worked for global technology corporations and non-profits in the US, UK, South Africa and Kenya. Her research focuses on the social implications of media technologies and the ways in which they might be better designed to prevent misinformation, social exclusion, and harms resulting from algorithmic bias.
Governments, the economy and civil society depend on the public’s trust to work effectively – but this trust is declining in an age of polarisation and misinformation. The UN Secretary-General Antonio Guterres has warned that this “malady of mistrust” is as damaging as COVID or climate change. We don’t talk much about trust – but we certainly notice when it breaks down, in corporate scandals or political coups. But in a time when many are losing faith in our most vital institutions, how can the bonds of trust be rebuilt? In Time for Trust, Terry Flew will explore these themes with leading experts on trust, from academics and journalists to community leaders, both from Australia and around the world. Professor Flew holds a prestigious Laureate Fellowship from the Australian Research Council. He’s particularly interested in “mediated trust” – that is, forms of trust and mistrust as they are expressed in and through the digital media technologies we use to make sense of the world. From trust in news to trust in digital platforms, from trust in corporations and governments to trust in AI, “Time for Trust” will ask: who, and what, do we trust? Have we lost that trust, and can we get it back? And are technologies bringing us together or driving us apart? Join us for a fascinating journey through one of the most important issues facing people and societies everywhere. Because Billy Joel was right – it is a matter of trust. Time for Trust is brought to you by the Faculty of Arts and Social Sciences at the University of Sydney, and the Australian Research Council. It's produced by Dominic Knight, and recorded on unceded Gadigal Land.