Misinformation regulation is an XY problem
Imagine the following scenario: Bob, a Microsoft Windows Enterprise customer, calls the support hotline and asks:
Whenever I try to download the Windows Canon printer driver, I see the error message ‘Install failed due to outdated core printer firmware.’ How do I update my core printer firmware?
Peter, a novice customer support representative, spends an hour helping Bob update the core printer firmware. Afterward, they spend another hour installing the Canon printer driver. Bob, thankful for the help, then says something that makes Peter groan:
Perfect! I was trying to figure out how to print a document from my Windows laptop to my company’s Canon printer. Now I can!
Peter groans: if he had known that from the beginning, he could’ve simply sent Bob a help article explaining how to use the more reliable and easier-to-set-up “Cloud Print” system, no drivers needed.
The fictitious example above is an illustration of the XY problem: a phenomenon where people pose questions about issues with their attempted solution (the “X”) rather than the underlying problem itself (the “Y”). Eric S. Raymond described the pattern in his essay “How To Ask Questions The Smart Way”:
Q: How can I use X to do Y?
A: If what you want is to do Y, you should ask that question without pre-supposing the use of a method that may not be appropriate. Questions of this form often indicate a person who is not merely ignorant about X, but confused about what problem Y they are solving and too fixated on the details of their particular situation.
The XY problem is commonplace in software engineering and customer support: help providers attempt to debug a coworker’s or customer’s misguided attempt at a solution without ever learning that person’s true goal in the first place. XY problems make us lose the forest for the trees, and they routinely waste time and resources.
U.S. politicians' obsessive focus on combating misinformation and disinformation through more stringent regulation of social media platforms is a modern, dangerous example of the XY problem on a national scale. Like Bob, these officials are so singularly focused on their “potential solution” as to not realize its faults: one cannot simply “regulate away” the presence of mis- and disinformation, and more often than not, those efforts will slowly slide into the realm of censorship and limiting freedom of expression.
Mis- and disinformation will always be present in the world of digital communication. Or really, any type of communication: the Great Fire of Rome in 64 AD is an ancient example. After the fire, rumors spread that Nero himself had started it to clear space for the construction of more extravagant buildings. Nero, in turn, blamed the Christians, sparking a persecution campaign against them. The reality? Most likely an accidental fire that started among flammable goods in the market, fed by hot, dry summer weather. Any sufficiently large communication system will contain unreliable actors who look to manipulate information to serve their interests.
If that is the case, then the true problem to address – the Y – is increasing digital literacy levels. “Digital literacy” describes education aimed at helping people effectively find, evaluate and communicate information via digital platforms. It’s the internet-age equivalent of “teach a man to fish and you feed him for a lifetime.” Digital literacy inoculates us against clickbait, divisive fictions and blatant mistruths, and makes us harder targets for insidious influence by malicious third parties. Initiatives like Common Sense Media, The News Literacy Project, Media Literacy Now, and Google’s “Be Internet Awesome” promote digital literacy education, helping people learn how to effectively consume and communicate information.
In particular, increased investment must be made in the evaluation component of digital literacy. Most Americans today have no trouble accessing information online, but their ability to detect bias and fabrication is far weaker. In the private sector, there is promising progress on ceding the power to evaluate information back to individuals. The Community Notes feature on X (formerly Twitter) encourages individual users to add context to posts that might be wrong or misleading. Meta is adopting the same model, decreasing reliance on external, opaque content moderation teams. A government-led initiative to compile non-partisan resources on information evaluation (think healthcare.gov, but for digital literacy: a digital-literacy.gov) would complement these private-industry efforts, raise the quality of public discourse and quash the need for additional mis- and disinformation regulation.
But, a pause: is this a realistic suggestion? “Show me the incentive and I will show you the outcome,” said the late Berkshire Hathaway vice chairman and investor Charlie Munger, and we’d be wise to listen to him here. What incentive exists for the political apparatus to improve the digital literacy of its citizenry? A populace capable of effective information evaluation is far more dangerous – in the sense of independence from malicious influence – than a populace that cedes responsibility for that evaluation to regulators and opaque fact-checking systems. This suggests a diametrically opposite incentive: political administrations (independent of party) rely on tribalism and division to keep people distracted and easier to govern. But maybe my tin foil hat is on a little too tightly, and we should instead apply Hanlon’s Razor: “Never attribute to malice that which is adequately explained by stupidity.” There may be well-meaning politicians out there who see regulation as the best outcome for their constituents. Whatever the motivation, it is our responsibility to check those in power and ensure legislative action stays grounded in curing root causes, not merely symptoms.
Ultimately, our responsibility here is two-fold. One, we must speak out against undue mis- and disinformation regulation: the responsibility of evaluating digital information lies with individuals, and is outside the purview of regulators. Two, we must strive to increase our own digital literacy. In an increasingly digital world, it is more important than ever not simply to trust information, but to verify it. Together, these will help keep individual liberties safe and lead to a more independent, informed and cohesive world.