Facts by Platform – War IRL
Algos, censors, and fractures
We may agree that we are not entitled to our own facts. We may also agree on precious little else these days. After all, we aren't even seeing the same "facts." Today, what counts as fact is increasingly decided for us, in near real time, by the platforms we engage with and the censorship we endure.
Global platforms like Instagram, TikTok, YouTube, WeChat, VKontakte, and Twitter are the battlegrounds for new-age propaganda. The world may be on a path leading us all toward (or away from) global military conflagration, torrents of refugees, shifting global alliances, and spiking energy and food prices. Yet the undisclosed, private decisions of nations, platforms, and apps are increasingly setting the terms for divergent and, very possibly, belligerently opposed realities.
For billions of social media users in China and Russia, the war in Ukraine is a smartphone-era, real-time, social media "war" fought between states with serious tech talent. Billions of people see versions of reality moderated by algorithms and censors. This is the shape of influence operations and propaganda in the digital age: highly curated, siloed, tailored, and controlled.
No matter which wars you see, or don't see, the realities on the ground are harsh, encompassing, and terribly fascinating, as death at scale always appears to be. Narratives of current events depend on the platform, at least for a time and for many people in far-flung locations. VKontakte and Odnoklassniki command the Russian landscape with over 100 million Russian-speaking users. The messaging app Telegram has reportedly surpassed Meta's WhatsApp, serving up stories and channels on the war in Ukraine to millions of Russians.
In China, Tencent's WeChat and QQ, along with Douyin and Baidu, dominate Chinese social engagement. These platforms reach over 1 billion users and over 80% of Chinese speakers. For these users, the war in Ukraine appears to be very much a security action and perhaps even a necessary conflict.
To put a finer point on it, global attention has divided over the "wars" in Ukraine. Yes, wars, not singular but plural. The walled gardens that national and private tech companies permit through their gates offer widely divergent "realities" about the war. Ukraine is, after all, a set of very different stories told by censored, curated, or raw tech platforms. For example, not admitting there is a war in Ukraine can get you bumped off some platforms. In the end, billions of people see their own version, moderated by algos and censors.
It has become axiomatic to say that we live in an age of platforms and applications. There are corporate platforms and national ones. Some nations restrict or close national platforms; others do not. Some platform companies apply content censoring, from light to heavy, along with promotion and demotion by policy, machine learning tuned to user preferences, and the collection of user data. These factors determine which war you are seeing, or, in the edge cases, whether you see a war at all.
In all these cases, we are living through an intense divergence of realities as delivered by platforms and apps. These platforms are now the dominant collectors and conveyors of reality and content, shaping, shading, and highlighting life, death, nations, justice, and the path ahead. While some good work has been done on the issues of online terrorism, that too is a very slippery problem set that remains fraught.
Not nearly enough attention has been paid to the issues of control, which appear to be growing larger and more intractable. Curating, blocking, and promoting content shapes passions. It also shapes decisions: fighting, aid, intervention, the welcome of refugees, and donors' giving. These decisions may elevate or crush hopes, provide balms and assistance, and, if we dare to dream, stability or even peace.
Most content creators and consumers can't, or don't, see the processes that shape curation and control. There is little awareness of the potency of apps and algorithms, and therefore little transparency and accountability.
From the 2016 US presidential election to COVID-19 vaccination to Ukraine, governments and platforms seek to maintain control and position, often as a priority well above simply serving up information.
Ours is a time of gnawing uncertainty and volatility. Yet in techland, people are presently debating the rules and rights of citizens on platforms and applications: who gets to inflect, promote, demote, shade, or highlight such issues as life, death, justice, and war. We, if there is still a "we," keep seeing the same issues arise with growing consequence. Clearly, we need to move quickly and decisively while we still have any possibility of appealing to something like a commonality.
This seems like a time to pause and consider the stakes. It might also be time to ask two questions: What does it mean to live in an age where platforms determine our "realities" and where those platforms and apps lack transparency and accountability? And who could, or should, do anything about it?