Defending Ultra Low Emission Zones from disinformation
Deeper Truths Q&A with Amil Khan, founder of Valent Projects: a tech start-up using AI tools to address manipulation and disinformation
If you’re keen to hear from someone who’s on the ‘front lines’ of fighting disinformation, then this week’s Q&A with Amil should satisfy your appetite.
I’ll leave it to Amil to tell you about what he does and why, but what I will say is that while much noise has been made in the last few years about exposing and fighting influence operations, Amil and his team at Valent Projects have been quietly going about their business with interesting results.
If you’re interested in what the next big disinformation issue will be, keep an eye out for Amil’s two determinants of what makes something high-risk for disinformation blowback — it might just save your next campaign.
Who are you, what is Valent Projects and why does it exist?
I’m an ex-foreign correspondent whose career was bookended by the US invasion of Iraq (and its aftermath) and the Arab Spring uprisings. By 2013, I had seen how disinformation was being used to turn popular opinion in western countries against a democratic opposition movement.
What really struck me was the use of fake accounts, botnets and hashtag manipulation, which is much more technical than just lying. As I studied the issue, I realised I couldn’t find any organisations with the mix of skills I thought were needed to tackle the problem - mainly data, tech, social media and political science. So as Covid hit, I set up Valent to bring those skills under one roof.
You recently did some widely praised work on how disinformation was impacting the debate about Ultra Low Emission Zones. Can you summarise what you found?
As a team, we are constantly looking at which methodologies are being used by different malign actors.
The ULEZ debate caught our attention because it showed signs of being an influence campaign, such as very consistent cycling of narratives, but the classic techniques — e.g. hashtag spamming, bot farms etc — were not immediately visible. When we looked into it, we found the techniques being used were designed to make posts from certain accounts trigger platforms’ algorithms and appear more often on people’s timelines.
We had seen the same technique being used by a militia currently fighting Sudan’s army. Methodologies move across the world as they are adopted and adapted by different actors, so it’s important for us to keep up.
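(Note from Stefan: for readers who want a feel for what spotting this kind of coordinated amplification can involve, here is a minimal, hypothetical Python sketch that flags clusters of accounts posting near-identical text in a tight burst. The thresholds and data are illustrative assumptions, not Valent’s actual tooling.)

```python
# Hypothetical sketch: flag clusters of accounts that post near-identical
# text in a tight burst, one common signal of coordinated amplification
# aimed at triggering recommendation algorithms. Thresholds are assumptions.
from collections import defaultdict
from datetime import datetime, timedelta

WINDOW = timedelta(minutes=10)  # assumed: how tight a "burst" must be
MIN_ACCOUNTS = 5                # assumed: cluster size worth flagging

def normalise(text: str) -> str:
    """Crude normalisation so lightly edited copies still match."""
    return " ".join(text.lower().split())

def find_copypasta_clusters(posts):
    """posts: iterable of (account, timestamp, text) tuples."""
    by_text = defaultdict(list)
    for account, ts, text in posts:
        by_text[normalise(text)].append((ts, account))

    clusters = []
    for text, items in by_text.items():
        items.sort()  # order by timestamp
        accounts = {account for _, account in items}
        within_burst = items[-1][0] - items[0][0] <= WINDOW
        if len(accounts) >= MIN_ACCOUNTS and within_burst:
            clusters.append((text, sorted(accounts)))
    return clusters

# Toy data: six accounts posting the same line within six minutes.
posts = [(f"acct_{i}", datetime(2023, 8, 1, 9, i), "ULEZ is a cash grab #StopULEZ")
         for i in range(6)]
for text, accounts in find_copypasta_clusters(posts):
    print(len(accounts), "accounts posted:", text)
```

Real detection work layers many more signals on top (account age, follower graphs, posting cadence), but the core idea of looking for improbable coordination is the same.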
We've seen a lot of issues hijacked by disinformation in a way that we might describe as 'unexpected'. I'm thinking of 20mph speed limits, Oxford 'climate lockdown' protests or '15 minute cities' more generally. You have these apparently 'common sense' policy ideas that more technocratic politicians or policy wonks can see will produce big benefits and that seem fairly uncontroversial, and then, just before implementation, everything blows up.
What do these issues, which are easily hijacked by disinformation, have in common and what does this tell us about what the next big issue might be?
The pattern seems to suggest that there are two key determinants of whether an issue will become a magnet for disinformation and manipulation. These are:
Whether the discussion is calling for a change in the status quo that has the potential to impose unwelcome costs on a powerful actor (a government, sector, organisation etc.)
Whether there is potential to frame the discussion as a binary culture/identity issue
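(Note from Stefan: as a rough illustration, the two criteria can be read as a simple screening checklist for a planned policy or campaign. The sketch below is my own framing of Amil’s determinants, not a tool Valent has described building; the inputs are human judgment calls, not measurements.)

```python
# Illustrative only: a toy screen encoding the two determinants above.
# The boolean inputs are human judgment calls, not automated measurements.
def disinformation_risk(costs_a_powerful_actor: bool,
                        frameable_as_identity_binary: bool) -> str:
    """Rough risk label for a proposed policy or campaign."""
    if costs_a_powerful_actor and frameable_as_identity_binary:
        return "high: conditions for a polarising flame war"
    if costs_a_powerful_actor or frameable_as_identity_binary:
        return "moderate: watch for attempts to reframe the debate"
    return "low"

# e.g. a ULEZ-style charge: imposes costs on drivers and parts of industry,
# and is easily reframed as 'elites vs ordinary motorists'.
print(disinformation_risk(True, True))
```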
When those two criteria are met, we have the conditions for a polarising disinformation flame war. The thematic issues I worry about are around labour rights as AI transforms economies.
When it comes to climate-related policies, I think we are still very much in the foothills. The implementation of Net Zero in the UK, and of similar policies across the world, is going to keep fuelling reactions.
Also, 2024 will be the year of elections, and the tools and methods available to malign actors are vastly more powerful than they were just a couple of years ago. I fear that election integrity and monitoring mechanisms across the world won’t be able to keep up.
Here in the UK it looks like we'll have a general election next year. It's pretty clear to those who follow British politics closely that the incumbent Conservative Party - who have a mountain to climb to stay in power - have adopted a 'culture war' strategy, which at times flirts with disinformation about things like 15 minute cities.
Our political system is fairly resilient to non-mainstream populist parties, but its biggest vulnerability is one of the two mainstream parties being co-opted by populists and disinformation actors. How worried are you about the impact that disinformation could have on the long-term health of our political institutions?
There is definitely a change in the internal dynamics of political parties.
In the past, votes (and therefore success) were assumed to be in the middle. That has now changed, and the assumption is that occupying the margins is necessary to maintain relevance. I say “assumed” because disinformation plays a part in making decision makers think that is the case.
My worry is that in the UK our politics becomes more about the extremes arguing with (and feeding) each other while the disengaged middle becomes more and more disillusioned. A big part of the problem is that the current tools for measuring sentiment, such as polling, are not good at picking up how disinformation is skewing views.
What's the one piece of advice you'd give someone who finds themselves fighting a misinformation or disinformation outbreak?
Be clear about the real-world outcomes you care about and assess whether the malicious activity you are facing is actually having an impact on them.
Disinformation campaigns aren’t static. They seek to provoke their targets into a response, and that response is then used as source material to generate more malicious content. If you end up the target of a disinformation campaign, it is difficult not to spin into panic mode. But it is worth keeping in mind that provoking panic is in itself an objective.
Don’t underestimate the power of ignoring the attack - especially in the initial stages.
What's the next thing the person reading this should read/watch/listen to if they want to understand more about this subject?
The Great Hack, Netflix’s documentary on the Cambridge Analytica scandal, is a few years old now, but it remains a great, easy-to-understand primer on the nature of digital persuasion.
Also from a while ago, but just as relevant today, is the New York Times’ four-part primer on Russia’s approach to disinformation. Understanding the Kremlin’s influence strategy is useful because it forms the foundation of the methodologies that came after.
For anyone looking for more, I would suggest reading Peter Pomerantsev, particularly This is Not Propaganda. (Note from Stefan: I 100% second this recommendation.)
Thanks for reading this week’s edition of Deeper Truths, the newsletter fighting for a less divided and more informed world. If you know someone you think we should interview, then hit reply to this email and let us know!