In the only statement issued regarding his two-year investigation into whether Russia tried to influence the 2016 elections, the sphinx-like former FBI director and Special Counsel Robert Mueller stated, “Russian intelligence officers who were part of the Russian military launched a concerted attack on our political system.”
He ended his statement with, “There were multiple, systematic efforts to interfere in our election. That allegation deserves the attention of every American.”
Based on Mueller’s findings and reports from various experts, whenever an opportunity arose for the Russian military to sow more discord in America, it took it. Often, these opportunities revolved around race and America’s shakiest foundation: racism.
Asian-American actress Kelly Marie Tran, the first woman of color to star in an installment of the Star Wars franchise, was temporarily run off social media by the trolling she received. In his published paper, “Weaponizing The Haters: The Last Jedi and The Strategic Politicization Of Pop Culture Through Social Media Manipulation,” Morten Bay, a research fellow at the USC Annenberg School for Communication and Journalism, found that fifty-one percent of negative tweets about the 2017 installment of the franchise were “politically motivated or not even human.” He also stated, “Russian trolls sought to ‘weaponize’ Star Wars criticism as an instrument of information warfare with the purpose of pushing for political change.”
There is evidence that Russian trolls specifically targeted Black people on social media, posting divisive rhetoric on platforms such as Facebook and Twitter. From stirring “outrage” around Colin Kaepernick taking a knee during the National Anthem at football games, to posing as Black Americans to encourage Black voters not to turn out in 2016, Russian trolls have proven they’ll use any opportunity to be divisive. So when Black singer and actress Halle Bailey was cast as Ariel in Disney’s new live-action The Little Mermaid, and racists started tweeting #NotMyAriel, petitioning Disney and demanding that the role be recast with a red-headed white woman who looks like the fictional cartoon creature, it was worth investigating whether this was another example of Russian bots back at it again.
Shadow and Act reached out to Bay not long after #NotMyAriel started showing up on Twitter. “I would be surprised if they weren’t involved somehow,” Bay said. “The thing is that we know that Russian trolls, and at least to some extent state-sponsored actors, were infiltrating American popular culture discourse all over the place.”
Emerson Brooking, former Research Fellow at the Council on Foreign Relations and co-author of “LikeWar: The Weaponization of Social Media,” told Shadow and Act, “It is likely that the #NotMyAriel hashtag was boosted by inauthentic means. Moreover, the hashtag seems to have been engineered to provoke outrage among mainstream observers. This proved the case when media outlets and social media influencers gave the hashtag significant amplification, even if their intentions were to counter it.”
While The Little Mermaid may seem a trivial matter, pop culture has consistently proven able to powerfully shape the behavior of Americans. The fact that Donald Trump, a failed businessman and former reality TV host, is in the White House is proof of that.
Said Bay, “What they would basically just do is fan the flames that were already there so that the controversy looked bigger. Everything that was said on both sides was just amplified a lot. They obviously made both sides feel like they were being attacked more.”
But why would Russia even care about a mythical mermaid? Senator James Lankford (R-OK), who sits on the Senate Intelligence Committee, told NPR’s Mary Louise Kelly in the aftermath of the Parkland shooting, “What Russia seems to want is divisiveness everywhere else, and they try to get a competitive advantage by destabilizing every country around them. They’ve done it for years, and they’ve finally come here.”
The good news is that though the hashtag #NotMyAriel was pushed along enough to trend on Twitter, it stayed on the trending list for a relatively brief time. And though there are a number of petitions on change.org, they have not drawn enough signatures to realistically have an impact. Notably, no high-profile influencers have spoken out in support of the hashtag and against the casting, which could theoretically sway things much the way Trump, a global brand, was able to do. The original inflammatory posts being quote-tweeted across Twitter came from accounts with very few followers, despite drawing thousands of comments. On the contrary, Bailey has been flooded with support for the role: another famous Halle, Disney’s Freeform (the network that distributes Bailey’s Grown-ish), and the original Ariel voice actress have all stepped up to defend and celebrate her.
And, judging from racists’ supposed boycotts of Starbucks and Nike, whose profits increased substantially thereafter, it’s safe to say that racists aren’t always the most effective boycotters. But the job of bots is not just to put contrary, inflammatory opinions out there; it’s to make them reach as many people as possible. P.W. Singer, a strategist at New America, a consultant to the U.S. military and intelligence community, and co-author, with Brooking, of “LikeWar,” had this to say about bot activity: “What’s fascinating is that it’s not just about persuading the targets. It’s about hijacking the network’s own algorithms. So these botnets drive viral different points of view that then push that up into the news feed. It makes them trend, so people even outside those networks begin to see them.”
So what are the tools in the international social media warfare toolbox, and can we discern whether the #NotMyAriel hashtag is another front in that war?
First, there are the bots: automated social media accounts, many of them linked to Russia. According to the MIT Technology Review, there are a few basic ways to tell if you’re dealing with a bot:
1) The profile is rudimentary, with no picture, or the picture is stolen from somewhere else (which you can check by running a reverse image search in Google).
2) The linguistic syntax seems “off.” They miss the nuances of the language, don’t seem to get figures of speech, and jump too quickly to another subject. When you interact with them, they are talking “at you,” not “to you.”
3) At the other end of the spectrum from jumping quickly between subjects is a seeming inability to move on from a subject or conversation, regardless of the direction the conversation takes.
4) Take a look at the overall account. There should be a pattern of behavior that seems “normal.” If an account sends 100 tweets per day or only ten tweets per year, it is likely a bot.
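As a rough illustration, the checklist above can be turned into a toy scoring function. This is a minimal sketch under assumed field names (`profile_image`, `bio`, `tweets_per_day`), not any real platform’s API, and the thresholds simply mirror the rules of thumb above:

```python
# Toy sketch of the bot checklist above. Field names and thresholds
# are illustrative assumptions, not a real social media API.

def bot_likelihood_signals(account):
    """Return the heuristic red flags an account trips."""
    flags = []
    # 1) Rudimentary profile: no picture or an empty bio.
    if not account.get("profile_image") or not account.get("bio"):
        flags.append("rudimentary_profile")
    # 4) Abnormal posting volume: roughly 100+ tweets per day,
    #    or only a handful of tweets per year.
    per_day = account.get("tweets_per_day", 0)
    if per_day >= 100 or per_day * 365 <= 10:
        flags.append("abnormal_volume")
    return flags

suspect = {"profile_image": None, "bio": "", "tweets_per_day": 250}
print(bot_likelihood_signals(suspect))  # ['rudimentary_profile', 'abnormal_volume']
```

Checks 2 and 3, which concern linguistic nuance and conversational rigidity, resist simple rules and are omitted here; in practice they call for human judgment or language analysis.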
Then there are trolls. These are real people, sometimes acting in concert, who jump into conversations to push an agenda. Bay states, “If you can see that that account was, for instance, either started very recently, or that it started a long time ago, but doesn’t have a lot of tweets in between that time and what is happening now, that’s also a surefire indicator. Another surefire indicator is if you scroll far enough down the troll account, and you find different languages.”
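Bay’s two “surefire indicators,” a gappy account history and mixed languages deep in the timeline, can likewise be sketched as a toy check. The function name, fields, and cutoffs here are assumptions for illustration only:

```python
from datetime import date

# Hypothetical sketch of Bay's troll indicators. Cutoffs (30 days,
# 50 posts, 2 languages) are invented for illustration.

def troll_red_flags(created, total_posts, languages, today=date(2019, 7, 15)):
    """Return the indicators an account trips, per Bay's description."""
    flags = []
    age_days = (today - created).days
    # Brand-new account, or an old account with a long dormant gap
    # (very few posts relative to its age).
    if age_days < 30 or (age_days > 365 and total_posts < 50):
        flags.append("suspicious_history")
    # Posting in several unrelated languages down the timeline.
    if len(languages) > 2:
        flags.append("mixed_languages")
    return flags
```

For example, an account created two weeks before the hashtag trended would trip `suspicious_history`, while a years-old account tweeting in English, Russian, and Arabic would trip `mixed_languages`.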
Developing the ability to quickly spot bots and trolls will be key to living in a world where we aren’t constantly manipulated by fake news. Senator Lankford argues that Americans need to get to the point where they can identify real opinions on social media vs. fake ones.
Bay adds, “When I look at social media these days, I generally have the sort of assumption that the Russian trolls, and probably also at this point some other actors from other nations, are definitely getting involved, because they’ve seen how well it worked in 2016.” Increased public awareness of these tactics will lessen the damage they are able to inflict. “We’re now living in a new normal, where this is something people are aware of. So the minute people like you then report on it, a lot of people will be like, ‘Oh, okay, maybe we need to just step back for a second and see what’s actually the case here.’ And I think there’s also a growing awareness that Twitter and Facebook is not the real world.”
Brooking cautioned against developing a habit of automatically ascribing incidents like these to Russia. “By consistently blaming Russia for such information campaigns, we conveniently overlook the toxicity of our own domestic politics. If Russian bots amplified this controversy, they only did so inasmuch as they amplify any American cultural division,” said Brooking.