Here’s a little thread I wrote about why the electoral college should be reformed, rather than retired.
And, for purposes of today’s column, here are the three critical tweets:
“The system was originally created at the Constitutional Convention in 1787. Some people thought the President should be elected by a direct vote of the people, but others thought he (always he back then) should be appointed by Congress.
“See, back in the day, we didn’t have national television, or social media, or jet planes. So it was way harder to hear about candidates. With a popular vote, the candidate from the most populous state would always win, because you’d just vote for who you know.
“The electoral system was a compromise. By having each state vote for electors, and having the electors vote for President, all the states would be represented in the outcome.”
The purpose of the electoral college has always been to serve as a dampener, so no one group has an outsized influence on the outcome. This is an important function — but one that is unfortunately not being delivered in the college’s current incarnation.
Swing states have a massively outsized influence on the outcome, and therefore — if we still want the original function to be performed — the college must be reformed.
In other words, something that was fit for purpose in 1787 may not be fit for purpose today.
Last week, Annalee Newitz wrote in the New York Times about the brokenness of social media. “When she thinks about the future, [Safiya Umoja Noble, the author of ‘Algorithms of Oppression’] imagines a counterintuitive and elegantly simple solution to the algorithm problem. She calls it ‘slow media.’ As Ms. Noble said: ‘Right now, we know billions of items per day are uploaded into Facebook. With that volume of content, it’s impossible for the platform to look at all of it and determine whether it should be there or not.’”
The article goes on: “Instead of deploying algorithms to curate content at superhuman speeds, what if future public platforms simply set limits on how quickly content circulates? …That slowness would give human moderators or curators time to review content.”
You can instantly imagine the backlog. If we can’t keep up in real time, how could we keep up when we’re slowing the process down?
But this is the point.
Our algorithms cannot keep up with the volume of content we are sharing. And, if our algorithms can’t keep up, there’s no way humans can — or should. This week a former Facebook moderator filed suit against the platform, saying the work of reviewing threatening, hateful, bullying, or otherwise dangerous content had given him PTSD.
But even if we’re not dealing with the worst of the worst, humans are not equipped to deal with this flood of content. We’re not equipped to process it in real time, to determine sources, veracity, and credibility.
You might be offended by this. You might think you can easily spot whether content is legitimate. And maybe right now you can. But it’s only going to get harder as deepfakes and spoofing become more sophisticated.
Like the electoral college, the tools we use to determine credibility were designed to solve that problem in a different environment. And just as the function of the electoral college is still essential, so is the function of determining credibility.
I don’t know what the answer for that function is. But I know it can’t be a solution designed for a different time. Context matters.