It’s Sunday night. I think it’s Sunday night. And I just read this on Buzzfeed. It says it’s Buzzfeed. It’s a story by a guy named Charlie Warzel. I guess his name is Charlie. And maybe he’s a person. Or maybe he’s a bot.
He, it, ledes with this:
“In mid-2016, Aviv Ovadya realized there was something fundamentally wrong with the internet — so wrong that he abandoned his work and sounded an alarm. A few weeks before the 2016 election, he presented his concerns to technologists in San Francisco’s Bay Area and warned of an impending crisis of misinformation in a presentation he titled ‘Infocalypse.’”
Oh my. And then. And then.
“The web and the information ecosystem that had developed around it were wildly unhealthy, Ovadya argued. The incentives that governed its biggest platforms were calibrated to reward information that was often misleading or polarizing, or both. Platforms like Facebook, Twitter, and Google prioritized clicks, shares, ads, and money over quality of information, and Ovadya couldn’t shake the feeling that it was all building toward something bad — a kind of critical threshold of addictive and toxic misinformation. The presentation was largely ignored by employees from the Big Tech platforms — including a few from Facebook who would later go on to drive the company’s News Feed integrity effort.”
Ovadya. That doesn’t sound like a real name. Or maybe it sounds so unreal that it must be real. Like mad scientist, maybe, real.
“Ovadya saw early what many — including lawmakers, journalists, and Big Tech CEOs — wouldn’t grasp until months later: Our platformed and algorithmically optimized world is vulnerable — to propaganda, to misinformation, to dark targeted advertising from foreign governments — so much so that it threatens to undermine a cornerstone of human discourse: the credibility of fact.”
Run for the hills, the Zeros and Ones are falling. Ha Ha.
“For Ovadya — now the chief technologist for the University of Michigan’s Center for Social Media Responsibility and a Knight News innovation fellow at the Tow Center for Digital Journalism at Columbia — the shock and ongoing anxiety over Russian Facebook ads and Twitter bots pales in comparison to the greater threat: Technologies that can be used to enhance and distort what is real are evolving faster than our ability to understand and control or mitigate it. The stakes are high and the possible consequences more disastrous than foreign meddling in an election — an undermining or upending of core civilizational institutions, an ‘infocalypse.’”
Wait, we’re all adults here. Sentient and sensible human beings, born and raised with an innate feel for what’s real. Fool me once, makes us all foolproof.
“As these tools become democratized and widespread, Ovadya notes that the worst case scenarios could be extremely destabilizing.”
“There’s ‘diplomacy manipulation,’ in which a malicious actor uses advanced technology to ‘create the belief that an event has occurred’ to influence geopolitics. Imagine, for example, a machine-learning algorithm (which analyzes gobs of data in order to teach itself to perform a particular function) fed on hundreds of hours of footage of Donald Trump or North Korean dictator Kim Jong Un, which could then spit out a near-perfect — and virtually impossible to distinguish from reality — audio or video clip of the leader declaring nuclear or biological war. ‘It doesn’t have to be perfect — just good enough to make the enemy think something happened, so that it provokes a knee-jerk and reckless response of retaliation.’”
Hooey. Phony to the point of being funny. Forget I even mentioned anything about all this bit and byte balderdash. I’m going to put it to bed, just as soon as I link the article so you too can share the yucks.
Nighty night. Don’t let the bit and byte bite.