The Alliance for Securing Democracy just announced an interesting tool:
The Hamilton 68 dashboard, launching today as part of the Alliance for Securing Democracy, provides a near real-time look at Russian propaganda and disinformation efforts online. The top of the page shows tweets from official Russian propaganda outlets in English, and a short post discussing the themes of the day. This is Russia’s overt messaging.
But these disinformation networks also include bots and trolls that synchronize to promote Russian messaging themes, including attack campaigns and the spreading of disinformation. Some of these accounts are directly controlled by Russia; others are users who, on their own initiative, reliably repeat and amplify Russian themes. Our analysis is based on 600 Twitter accounts linked to Russian influence activities online, and the lower section of the dashboard features charts that display topics, hashtags, and links currently promoted by this network.
This may represent an incident in which bots and other automated systems were used as a meme-warfare tactic to control the political narrative in a specific venue. The cessation of this tactic here may be related to some shift in the political climate. Indeed, it comes at the same time the Russian government announced its disappointment in the Trump administration over new sanctions.
This switch in automated suppression affords an opportunity to study the effect of such suppression on the social/political dynamics in the affected forum. CoPsyCon will be observing and studying this as it unfolds.
Attention is currently focused on threat vectors of disinformation and psychological manipulation in social and traditional media as components of asymmetric hybrid warfare, but largely limited to threat models of:
1. Radicalizing and recruiting individuals for specific terrorist actions.
2. Undermining public trust in, and the effectiveness of, mainstream institutions, disrupting the defender’s ability to counter the attacker’s goals.
An insidious and aggressive extension of (2) is now proving very effective at moving beyond chaos and disruption, enabling attackers to actively rewrite the political narrative to serve their goals. In this scenario, disinformation is used to rally and unite a minority faction; then botnets, search-engine optimization (SEO), and other media trend-manipulation tactics are used to precisely choreograph the content and timing of that faction’s political expressions. This creates a controllable “weaponized demographic” that can be used as a lever to force the Overton window to shift in ways that would not occur organically, which the attacker hopes will lead to policy changes favorable to their goals, despite a lack of majority support.
This vector is available to attackers of any political affiliation, and would be destructive regardless of the specific policy goals in question. Currently, the largest active attacks are represented by the Brexit campaign and the Trump presidency. Both were well outside the Overton window a few years ago, but the window was rapidly shifted by this type of attack, and both became reality.
CoPsyCon.org is developing models and tools to identify in real time and selectively disrupt signals based not on their content, but on their function of uniting and controlling a weaponized demographic, thereby mitigating the threat to democracy while avoiding the risks of censorship and partisanship.
Currently, US democracy is under attack by malicious actors using technological tools to promote disinformation for the purpose of coordinating and controlling a significant subset of the population as a weaponized demographic. CoPsyCon will conduct research and development to build a technical and psychological knowledgebase of expertise on this attack vector, and develop tools to disrupt the effectiveness of this attack against democracy by disrupting the malicious actor’s ability to coordinate and control their victims.
What CoPsyCon does:
Develop expertise on the technical and psychological process of controlling and coordinating a weaponized demographic, and disrupt this at the process level, regardless of the specific malicious actor’s goals
Research the origin, characteristics, and propagation vectors of the memes used in this type of attack
Develop a combined technological and psychological model of the attack
Develop automated monitoring tools to enable a real-time dashboard for monitoring similar attacks
Develop memetics/counter-psyops protocols for building counter-deza (counter-disinformation) memes that will be effective at diluting and disrupting the monolithic deza meme signals used in the attacks
Develop technological tools to inject counter-deza memes ahead of a deza meme attack to dilute and disrupt the ability of the attack to control a weaponized demographic.
Apply this shield in an unbiased and neutral manner.
Equally counter foreign and domestic origins of attacks
Equally counter threats that originate from nation-states, non-state entities, powerful individuals, or loose ad-hoc collections of aligned interests
Operate with as much transparency as possible
Support rule of law, democracy, and all of the institutions that define and protect America
What CoPsyCon does not do:
Diverge from its focused mission
Advocate for political positions, issues or parties
Focus on prosecution of specific malicious actors
Use or advocate defenses based on banning, blocking, censoring, etc. of any communication channel, entity, idea, etc.
Evaluate or judge beliefs or values on any basis other than how those beliefs and values contribute to susceptibility to the threat of coordinated deza-meme attacks
Manipulate population beliefs and values in any way other than as an unavoidable side effect of our core mission of disruption of coordinated deza-meme attacks
Believe that this type of attack is uniquely used by Russia, or any other specific entity
Discriminate against groups or individuals on the basis of any criterion other than those that are logically necessary for our mission
This is a draft of the technical work plan for CoPsyCon.
Phase I. Observation and measurement of deza (disinformation) psyops
a) Retrospective timeline scans: Compile Twitter archives and Google Trends data. Identify and analyze info-deza event pairs, i.e. critical information events or “breaking news” followed by a triggered deza campaign. Describe technical patterns and processes of disinformation promulgation. Describe psychological patterns of deza memes. Identify deza origins, if possible.
b) Real-time feed scans: Set up server to monitor live feeds. Identify and analyze info-deza event pairs in near-real time.
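As a rough illustration of what detecting an info-deza event pair in (a) or (b) could look like: cross-correlate a news-event activity signal against a suspected deza signal and look for a short positive lag. This is a sketch with made-up daily counts, not CoPsyCon’s actual tooling; the function name and inputs are hypothetical.

```python
import numpy as np

def deza_lag_days(news_signal, deza_signal):
    """Estimate how many days a suspected deza campaign lags a news event.

    Both inputs are 1-D daily activity counts (e.g. tweet or search volume).
    Returns the lag (in days) at which the cross-correlation peaks; a small
    positive lag is consistent with a deza campaign triggered by the news.
    """
    # Standardize so the correlation isn't dominated by overall volume
    news = (news_signal - news_signal.mean()) / news_signal.std()
    deza = (deza_signal - deza_signal.mean()) / deza_signal.std()
    xcorr = np.correlate(deza, news, mode="full")
    lags = np.arange(-len(news) + 1, len(deza))
    return int(lags[np.argmax(xcorr)])

# Synthetic check: a deza spike 3 days after a news spike
news = np.zeros(30); news[10] = 1.0
deza = np.zeros(30); deza[13] = 1.0
print(deza_lag_days(news, deza))  # → 3
```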
Phase II. Counter-deza meme development (depends on the above)
a) Soft broadcast test: Compose counter-deza memes and measure propagation in the wild
b) Hard broadcast test: Hire botnets and measure counter-deza meme propagation in the wild with amplification.
c) Field test: Wait for critical information events and measure propagation of counter-deza memes in live “combat zone”
Phase III. Active counter-psyops deployment
a) Develop metrics for effectiveness of defenses.
b) Iterative development cycle of refining deza identification and counter-deza meme development.
c) Report results and coordinate with authorities.
This is a quick dump to get this out there; further discussion and documentation will be provided over time.
Google Trends doesn’t have a real public API; it’s heavily rate limited and so it’s hard to get even moderate amounts of data, and large amounts are right out. I worked with pytrends and scripted some delays and automation for downloading larger amounts of trend data for lists of search terms.
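The batching logic looks roughly like this. It’s a sketch, not my exact script: the `fetch_batch` callback stands in for a thin pytrends wrapper (`build_payload()` plus `interest_over_time()`), and the batch size of 5 reflects pytrends’ per-request keyword cap.

```python
import time
import pandas as pd

def download_in_batches(terms, fetch_batch, batch_size=5, delay=60):
    """Download interest-over-time for a long term list in small batches.

    `fetch_batch` is any callable taking a list of <= batch_size terms and
    returning a DataFrame (days x terms). Sleeping between requests keeps
    Google's rate limiter happy.
    """
    frames = []
    for i in range(0, len(terms), batch_size):
        frames.append(fetch_batch(terms[i:i + batch_size]))
        if i + batch_size < len(terms):
            time.sleep(delay)  # crude rate limiting between requests
    return pd.concat(frames, axis=1)  # days x terms matrix
```

One caveat if you try this: Trends normalizes each request to its own 0–100 scale, so magnitudes from different batches aren’t directly comparable without including a common anchor term in every batch.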
I used this to download just over a year’s worth of day-by-day trend data for (currently) 815 search terms, mostly related to current political events. Then I subjected this matrix to Independent Component Analysis (ICA) using Scikit-learn, which turns out to do a great job of separating out components with clear meaning on the events timeline. The components are displayed on a stacked plot with key political events labeled on the timeline. Some of the political events I entered a priori because they were obvious (like the election itself), and some I entered after researching archived news corresponding to peaks I saw in the plots. Note that many of the events in the latter category were things I wasn’t thinking about at all when I put together the list of search terms, but running ICA on the large set of terms brings out those peaks automatically.
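The ICA step itself is only a few lines of scikit-learn. Here’s a minimal sketch with a random stand-in for the real days × terms matrix (variable names are mine, not the repo’s):

```python
import numpy as np
from sklearn.decomposition import FastICA

# Stand-in for the real data: 365 days x 815 search terms of daily interest
rng = np.random.default_rng(0)
trends = rng.random((365, 815))

ica = FastICA(n_components=13, random_state=0, max_iter=1000)
components = ica.fit_transform(trends)  # (365, 13): one time series per component
mixing = ica.mixing_                    # (815, 13): each term's loading on each component
```

Each column of `components` is one of the stacked time series in the plot, and each column of `mixing` gives the per-term loadings used to rank the associated search terms.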
Here’s an example plot with 13 components (choosing number of components for ICA is more art than science, 13 seemed to work well but other numbers did too):
There are a few other interesting trends graphs I’ve seen online. Here I’ve placed two big ones overlaid on that plot, for comparison of timelines. The first one is the plot of Alfa bank DNS server logs, which may or may not have something to do with anything nefarious. The second one is from Echelon Insights’ annual promotional “year in news” article. Of course I don’t know their exact methodology (the point is that they’re selling their service, after all) but it’s likely that the trends displayed here are based on starting with the keywords labeling the peaks, rather than a component analysis of some sort. I would love to hear from someone who knows more.
The columns at the left are the top 15 terms, and bottom 15 terms, associated with that component. The positively associated search terms trend much more frequently when you see a positive spike in the associated component. The negatively associated search terms trend much more frequently when you see a negative spike in the component, or when the overall value of the component is low. The code automatically detects the spikes and rectifies the signal (ICA outputs are unpredictably, if not arbitrarily, scaled, so some ordering and rectifying are helpful) so the positively associated terms are usually more meaningful, but there are some interesting trends in the negative words too.
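The rectification step can be sketched like this (illustrative, not the exact code in the repo): flip the sign of any component whose largest excursion is negative, and flip its term loadings to match.

```python
import numpy as np

def rectify_components(components, mixing):
    """Flip the sign of each ICA component (and its term loadings) whose
    largest excursion is negative. ICA's output signs are arbitrary, so this
    makes the big spikes point upward, keeping the 'positively associated'
    terms as the ones that surge during those spikes."""
    comps = components.copy()
    loads = mixing.copy()
    for k in range(comps.shape[1]):
        if abs(comps[:, k].min()) > comps[:, k].max():
            comps[:, k] *= -1
            loads[:, k] *= -1
    return comps, loads
```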
To improve visualization and insight related to the search terms, the code also generates word clouds of the top positively and negatively associated terms. Here they are in the same order as on the left side of the plot.
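Pulling those top-loaded term lists out of the ICA mixing matrix is straightforward; here’s a sketch (the helper name is mine, and the resulting lists can then be fed to a word-cloud renderer such as the wordcloud package):

```python
import numpy as np

def top_terms(mixing, terms, k=15):
    """For each ICA component, list the k most positively and k most
    negatively loaded search terms (the columns beside each plot row)."""
    result = []
    for col in mixing.T:
        order = np.argsort(col)  # indices sorted ascending by loading
        result.append({
            "positive": [terms[i] for i in order[::-1][:k]],
            "negative": [terms[i] for i in order[:k]],
        })
    return result
```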
Although this is preliminary, I can make a few observations here.
This article by Roger Sollenberger on how SEO is used for political influence describes gaming Google search results with a lot of fake posts. It uses the specific example of how the Seth Rich story started to get pushed after Trump brought alleged Russian spies into the Oval Office. In component 9 (sorry they’re not labeled, you’ll have to count from the top) you can see a predominance of terms related to both “trump_russians_oval_office” and “seth_rich”, which is consistent with that article’s demonstration.
Trump’s “grab ’em by the pussy” tape leaked on October 7. Shortly afterwards, a massive amount of anti-Hillary trends started and didn’t let up until the election. They dropped off almost immediately after the election.
The stuff glommed together as “wikileaks/hacking” in the Echelon graph is not cleanly separated out in the components. Below, you can see what it looks like when I specified 15 components instead of 13. The Wikileaks email trend is clearer there, in component 10. Maybe I should have used that one as my main example instead of the 13-component plot.
In case it isn’t obvious, component 12 (13 in the 15-component plot below) represents the strong weekly periodicity in Google search trends due to the work week.
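You can check that kind of component numerically: the FFT of a daily series with a work-week rhythm peaks at a period of seven days. A minimal sketch with a synthetic weekly signal (the function is hypothetical, not from the repo):

```python
import numpy as np

def dominant_period_days(component):
    """Return the period (in days) of the strongest frequency in a daily
    component time series -- a work-week component should come out near 7."""
    spectrum = np.abs(np.fft.rfft(component - component.mean()))
    freqs = np.fft.rfftfreq(len(component), d=1.0)  # cycles per day
    peak = freqs[np.argmax(spectrum[1:]) + 1]       # skip the DC bin
    return 1.0 / peak

# Synthetic weekly signal over one year of daily samples
days = np.arange(364)
weekly = np.sin(2 * np.pi * days / 7)
print(round(dominant_period_days(weekly)))  # → 7
```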
Please consider this a preliminary post, I wanted to get these tools and data out there. You can access all the code and data at my GitHub profile here. (As of this writing, 2017 June 6, I haven’t added documentation yet, and I probably won’t have time to for a few weeks. Sorry!)
Not long after the unexpected results of the 2016 US election, the US intelligence community collectively published a declassified summary entitled Assessing Russian Activities and Intentions in Recent US Elections, a joint statement from the FBI, CIA, and NSA unequivocally stating that Russian state-sponsored actors had interfered in the election. Since then, it would be an understatement to say that the plot has thickened. Russia’s role has received copious media attention, and is the primary target of Robert Mueller’s special counsel probe. However, less attention has been given to other facets of the acute threat. In particular, the methods that take advantage of weaknesses in our political and media systems would not be remediated by identifying and punishing Russian perpetrators and their collaborators. Democracy, freedom, and the integrity of American systems would remain vulnerable to attack from other state or, more disturbingly, non-state actors. There is evidence that non-state actors, specifically mega-wealthy oligarchs, are currently using the same tools toward the same goal of disrupting America.
The first piece that alerted me to the threat of big-data-based marketing tools applied to political manipulation by non-state actors was The Data That Turned the World Upside Down, translated from German and published on 2017 Jan 28 by Vice Motherboard, which described Cambridge Analytica, botnets, and political manipulation. I already knew about work on big data and politics from colleagues at companies doing similar work, but this article suggested that the approach was already powerful, and being used by hostile actors against the US.
Paste Magazine published How the Trump-Russia Data Machine Games Google to Fool Americans on 2017 June 1, which describes another tool in the disinformation arsenal: search-engine optimization. The article also describes how a specific disinformation narrative was broadcast and amplified in a planned chronology to counter the real breaking news of Trump’s meeting with Russians in the Oval Office.
As early as 2005, the military was warned of the potential threat of “meme warfare”. Mike Prosser wrote a thesis titled MEMETICS—A GROWTH INDUSTRY IN US MILITARY OPERATIONS for the United States Marine Corps School of Advanced Warfighting. Later, in 2011 October at the Social Media for Defense Summit in Alexandria, Virginia, Dr. Robert Finkelstein presented a Tutorial on Military Memetics.
And yet, the Department of Defense Cyber Strategy, presented by USCYBERCOM in 2015 April, has no mention whatsoever of “meme”, “memetic”, “disinformation”, or anything about media manipulation, social media, or any other related topics.
CoPsyCon exists because this blind spot in the US defense system critically needs to be filled.