The campaign and election of 2016 were unprecedented in countless ways: the size of the initial GOP field (US News called it "the largest primary field of the modern era"), televised debates that hit record viewership numbers, the deepening divide between the Democratic and Republican parties, and the level of dirt and finger-pointing (one Politico headline read "This Is the Dirtiest Presidential Race Since '72"). NPR lists 65 ways the campaign was unprecedented, and the list is worth a read, but I think the effect on the media and its reputation is the most worrisome shift for voters, with coverage of "fake news" continuing into this, the sixth week of Trump's presidency. Is social media to blame?
For the most part, colleagues and friends who went to the polls had little doubt for whom they would be voting. Let's face it, it was a polarizing time. I watched only two of the debates, not to help me decide on a candidate, but for what I knew would be their entertainment value. I was not swayed to any degree toward someone other than my originally chosen candidate; rather, my selection was resoundingly confirmed.
So why is that? What is happening in today's environment that keeps us from looking at things from a perspective other than our own? I believe much of it is due to social media, and I'm not alone there: it's widely reported that social media had a significant influence on how people got their news about politics this past election season. Pew Research found that 44 percent of the general population gets its news from Facebook.
We all have our daily routines. I regularly visit various social networking sites for work and personal use. However, recently my streams have taken a sharp turn away from personal news (Aunt Jenny posting a picture of the new baby), updates and invitations (curriculum night at the high school) and photos of food (which I personally didn't understand but enjoyed) to political opinions, celebrity gossip and tragic news - of the real and fake kind.
I may or may not want to see what shows up in my feeds, but I would definitely prefer to see certain things first. In comes that algorithm. Depending on which source you trust (and that is the crux of things, right?), you can learn a little about the algorithm through this Slate article (I've trusted Slate for years, and their awards are impressive) or this HubSpot blog (which I trust since it is a tool that helps me with successful social media strategies). But essentially, the algorithm looks at your posts and those of everyone and everything you follow, share, and like (friends, pages, accounts, etc.), and ranks them all according to what it "thinks" is most worthwhile to you personally. I reference the algorithm not because I think Facebook was at fault for swaying the election, but so that we know how it works and can make educated social media choices and changes.
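To make the idea concrete, here is a toy sketch of that kind of engagement-based ranking. This is emphatically not Facebook's actual algorithm; the signals, weights, and sample data are invented for illustration. It just shows the general mechanic: posts from people and topics you already interact with get scored higher, which is exactly how a feed can drift toward confirming what you already like.

```python
# Toy illustration of engagement-based feed ranking.
# NOT Facebook's real algorithm -- all signals and weights are invented.
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    topic: str
    likes: int
    shares: int

def rank_feed(posts, my_interactions):
    """Score each post by global engagement plus a bonus for authors
    and topics this user has interacted with before; highest first."""
    def score(post):
        s = post.likes + 2 * post.shares                # shares weighted higher (assumption)
        s += 10 * my_interactions.get(post.author, 0)   # personal affinity with the author
        s += 5 * my_interactions.get(post.topic, 0)     # personal affinity with the topic
        return s
    return sorted(posts, key=score, reverse=True)

feed = [
    Post("news_page", "politics", likes=500, shares=50),
    Post("aunt_jenny", "family", likes=12, shares=1),
    Post("celebrity", "gossip", likes=300, shares=20),
]
# Suppose I frequently like Aunt Jenny's posts and family topics:
my_interactions = {"aunt_jenny": 60, "family": 30}
ranked = rank_feed(feed, my_interactions)
# With these numbers, Aunt Jenny's modest post outranks the widely
# shared news post -- the feed reflects my habits, not raw popularity.
```

Notice the design consequence: because past interactions feed the score, the ranking reinforces itself over time, which is the mechanism behind the "filter bubble" concern discussed above.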
Facebook founder Mark Zuckerberg insists, in a comment responding to one of his status updates, that "News and media are not the primary things people do on Facebook ... I find it odd when people insist we call ourselves a news or media company in order to acknowledge its importance." While I agree that "Facebook is mostly about helping people stay connected with friends and family," I think it's a mistake to overlook the current climate and the studies indicating that many people use the site as a source of news. Entrepreneur and investor Dave McClure puts it bluntly: "It's clearly a source of news and information for billions of people. If that's not a media organization then I don't know what is."
Argument aside, I agree with McClure when he says companies like Facebook have a responsibility to deliver balance: "Maybe we need to mix in having ethics and principles and caring about the fact that people have a reasonable and rational experience of the information they process."
I agree. And maybe Facebook will get there. While I don't want to see or hear rants or unverified news from those who think vastly differently than I do, I do want to see factual information from diverse perspectives: "factual" is the key word. Let's face it, we all love to hear confirmation of and agreement with our ideas and choices (that darn ego), but we do ourselves a disservice if we completely ignore stories or ideas that conflict with our own. Even if we sometimes disagree, scroll past, unfriend, or unfollow, it is a citizen's responsibility to look at both sides.
Undoubtedly the debate about Facebook will continue, but the bottom line is that it should be just one source among many, and individuals are responsible for sifting through and sharing reliable, accurate information. Dan Abrams, CEO and founder of Abrams Media, Chief Legal Affairs Anchor for ABC News, and previously the co-anchor of Nightline, wrote it best on Mediaite: "After all, humans may be largely to blame for sharing error-riddled, often conspiratorial nonsense." So it's up to us right now to make sure we are hearing, seeing, and reading a variety of news from trusted sources.