Honours Essay: ‘Why this topic? Why now?’

You can read the now complete Honours thesis here.

Less than a month before Donald Trump’s election victory in 2016, a series of controversial Clinton campaign images went viral on the Internet. Spread primarily via Twitter and Reddit under the trending hashtag #draftourdaughters, these images appeared to be leaked promotional material from the official Hillary Clinton campaign (Wall & Mitew 2018). As can be seen in Figure 1, they took the presidential candidate’s stated support for women registering for the draft to the extreme (Nelson 2016). Internet users were shocked by these images and took to social media to express their disappointment in Hillary Clinton, as demonstrated in Figure 2.


Figure 1: An example of campaign content (Smith 2016).


Figure 2: A collection of tweets reacting to the campaign content.

However, these images were not actually from the official Clinton campaign. They had been created and spread by users of the notoriously toxic online imageboard 4chan (Nagle 2017, p. 15). These individuals used official campaign fonts, colours and themes to make their propaganda appear authentic, and played on the existing anxieties surrounding Clinton in order to sway public opinion. Members of 4chan’s Politically Incorrect (/pol/) board gave each other constant feedback on each iteration, spreading only the ‘best’ memes via sockpuppet accounts on social media (Wall & Mitew 2018). Eventually revealed to be a false campaign, Draft Our Daughters opened up a new conversation about modern information warfare and its effects on public perception and behaviour. Although information warfare already existed, the impact of this campaign was unprecedented – especially given that it was initiated by globally dispersed Internet users, rather than legacy broadcast media.

In early 2017, I sat in a lecture theatre and listened to a digital media academic discuss the Draft Our Daughters campaign, which I had no previous knowledge of. I was blown away by the influence everyday Internet users could have on the public narrative – and that such influence could be achieved through something as seemingly simple as memes. Ever since that day, memetic warfare has remained my key area of interest. How and why does it work? In what ways does it affect legacy media and the broader audience? My thesis aims to address these questions and develop a systemic perspective on the key dynamics of targeted memetic warfare campaigns. Given that this is a relatively unstudied phenomenon, there are many gaps within the existing research.

Most existing literature addresses traditional information warfare and modern information warfare as separate entities, despite their many similarities (Rowett 2018, p. 4439). I believe it is important to study both forms together, because their core functions remain the same – we cannot understand one without understanding the other. Additionally, both phenomena are typically explored through a theoretical lens rather than a logistical one (Giesea 2017, p. 7). My research aims to combine the existing literature with my own case studies to form a coherent exposition of the key dynamics used in memetic warfare campaigns, and their direct impact on public perception and behaviour. Addressing this phenomenon from a systemic perspective will contribute greatly to the academic conversation surrounding new media effects, as well as provide a practical guide for media professionals interested in information warfare tactics. This piece will introduce and explore information warfare and memetic warfare as two parts which form a coherent whole.

The Mass Media Landscape

While the history of ‘mass media’ is relatively short, it has had a tremendous impact on the way humans think, communicate, and behave. From the invention of the printing press in the mid-15th century to the Internet in the present day, we can see a clear trajectory of the changes brought about by each new mass communication tool (Lee 2009). Arguably, the emergence of the Internet and the digital technologies that followed have had the most significant societal impact of all mass media tools. While each new legacy media tool boasted increased inclusivity and reach, they all operated on a centralised, one-to-many model of communication. According to the logic of this model, all of the information disseminated by mass media organisations is necessarily created and filtered by a small group of gatekeepers with concentrated power, hence the expression ‘one-to-many’. Paul Baran’s (1962) paper “On Distributed Communications Networks” establishes this model – as can be seen in Figure 3, legacy media operates on model A. It is highly organised, has a clear top-to-bottom hierarchy, and does not allow the audience to communicate back. It also typically incurs a high cost of production, which acts as a further gatekeeper.

While centralised networks are more efficient than their counterparts, they are also more fragile – if the central node is damaged or destroyed, the whole network collapses. When this was the only model available, audiences were nothing more than passive consumers with little to no control over the public narrative – it simply wasn’t accessible. Lacking an alternative, consumers often took what the mass media said at face value, rarely questioning its credibility or authority. Additionally, the majority of legacy mass media channels are owned by just six media conglomerates (WebFX n.d.). This means that almost all legacy media content is controlled by a very small minority, which does not leave much room for dissenting voices. Thus, the centralised model is the perfect foundation for information warfare – especially propaganda.

Then came the Internet, which operates on model C, demonstrated in Figure 3. This is the distributed, many-to-many model (Baran 1962). Every node has the ability to connect and communicate with any other node on the network. Clay Shirky’s book Here Comes Everybody (2008) expands on this, explaining how this shift has increased the power and efficiency of collective action. Coordinating group action online is free, and virtually limitless because it operates “outside the framework of traditional institutions and organisations” (p. 29). As Anderson (2008, p. 50) notes, it is “the democratisation of the tools of production”. Media production is now distributed in nature, meaning the range of perspectives that can be taken on a specific issue is far wider than ever before. Model C has very few gatekeepers, and the cost of production is entirely up to the individual creator. It also incurs much less risk than the other models – if one or more nodes fail, they are simply removed from the network without damaging it. This paradigm shift has transformed the audience from passive consumers into what Axel Bruns (2006, p. 2) labels active ‘produsers’, who have the power to alter the content being created for their consumption. This new audience of empowered users has driven the tremendous increase in user-generated content. More importantly, everyday produsers can now access and alter the public narrative and engage in their own forms of information warfare. This is why we are currently living in the ‘fake news’ era – if everyone is able to disseminate news content online, how can we possibly decide who to trust? The Internet has permanently changed the media landscape, and with it, the information warfare landscape has also been transformed (Rowett 2018, p. 4439). The ability to manipulate the perceptions and behaviours of the public no longer belongs to a small group, but to an entire network of online users. My thesis aims to systematise the mechanics that emerge when this entire network of users is able to engage in modern information warfare.


Figure 3: Paul Baran’s (1962) three network topologies: centralised, decentralised, and distributed.
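The fragility contrast between Baran’s models can be made concrete with a short simulation. The sketch below is purely a toy illustration of the argument above – the node count, link density, and choice of Python are my own arbitrary assumptions, not part of Baran’s paper. It removes the central node from each topology and measures the largest group of nodes that can still communicate:

```python
import random

def largest_component(nodes, edges):
    """Return the size of the largest connected group among `nodes`,
    ignoring any edges that touch nodes outside the surviving set."""
    adj = {n: set() for n in nodes}
    for a, b in edges:
        if a in adj and b in adj:
            adj[a].add(b)
            adj[b].add(a)
    seen, best = set(), 0
    for n in nodes:
        if n in seen:
            continue
        stack, comp = [n], set()
        while stack:  # depth-first flood fill of one component
            cur = stack.pop()
            if cur not in comp:
                comp.add(cur)
                stack.extend(adj[cur] - comp)
        seen |= comp
        best = max(best, len(comp))
    return best

N, HUB = 50, 0
# Model A (centralised): every node connects only to the central hub.
centralised = [(HUB, i) for i in range(1, N)]
# Model C (distributed): each node links to a few random peers.
random.seed(1)
distributed = [(i, random.randrange(N)) for i in range(N) for _ in range(3)]

# Knock out the hub and see what survives in each topology.
survivors = [n for n in range(N) if n != HUB]
print(largest_component(survivors, centralised))  # model A shatters into isolated nodes
print(largest_component(survivors, distributed))  # model C keeps most nodes connected
```

Removing any single peripheral node leaves either network largely intact; only the centralised model depends entirely on its hub – the very property that made legacy media simultaneously efficient and fragile.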

Information Warfare

Although information warfare (IW) has existed for as long as mass media, it still lacks a universally accepted definition. While Western definitions treat IW and cyberwarfare as separate, Russian definitions hold that the term encompasses cyberwarfare as well as all strategic uses of information and communications (Giles 2016, p. 7). As my focus is on the tactics used to change perceptions and behaviours, I will be adhering to the Western definitions – technological and cyberwarfare will remain outside the scope of my thesis. IW has a wide range of subsets, all of which have overlapping definitions (Scott 2018, p. 14). For this reason, I will be focusing on the most commonly discussed types of IW – propaganda and psychological operations.

Information warfare is the strategically planned manipulation of information to influence the perceptions and behaviours of a targeted audience. Contrary to traditional forms of battle, IW focuses on attacking the ideas, emotions, and attitudes of the opponent to produce a desired collective behaviour (Rowett 2018, p. 4438). Where bullets were once used, “word weapons” and “munitions of the mind” are now the weapons of choice (Taylor 1997, p. 149). The current paradigmatic shift from legacy IW to modern IW has effectively democratised the ability to influence the population. The existing research does not address the dynamics that appear when this happens, nor does it recommend ways for individuals to protect themselves from manipulation by modern IW. Much of modern IW takes place over the Internet, but it does not operate exclusively online; it also operates through legacy mass media tools. In the past, scholars agreed that IW only takes place in times of conflict, but this too is changing (Giles 2016, p. 4). Before introducing propaganda and psychological operations, we must first explore the mental mechanism that makes it possible for the human mind to be influenced by the mass media – framing.

Beginning with Frederic Bartlett (1932, p. 128), a number of cognitive studies have discovered that rather than perceiving complexity, the mind takes shortcuts when processing external stimuli. Our brains simply do not have the capacity to process every new piece of information individually, so we use our prior knowledge and experiences to understand new information. Framing describes the cognitive process where external stimuli are “classified and assigned to the categories of the previously acquired experience” (Pluwak 2015, p. 307). Frames consist of opinions, behaviours, events, symbols, language, and so on. Essentially, a frame acts as a narrative-like guide to help us perceive reality. More importantly, these frames within our minds are malleable and vulnerable to exploitation.

Mass media takes full advantage of this vulnerability, using its own visual and linguistic framing to reshape or reinforce our existing frames of reference. As Entman (1993, p. 52) explains, to frame is “to select some aspects of a perceived reality and make them more salient in a communicating text”, in a way that promotes the interests of the frame’s creator. This occurs in newspapers, books, educational materials, television and the Internet. An example of two different media frames can be seen in Figure 4, which depicts the same news story framed differently by Fox News Latino and Fox News Channel. The audience for the first story is more progressive, while the second audience is more conservative – Fox caters to this, framing the immigration issue completely differently for each audience. The goal of media framing is for the constructed frame to become incorporated into the audience’s long-term memory. The more often a frame’s elements are repeated, the more successful that frame becomes in establishing itself as the accepted version of reality in our minds. Over time, these frames influence the ways we think and act. If a new piece of information does not resonate with someone’s existing internal frames, it will not be processed into their long-term memory – even if this new information is better and more reliable (Heuer 1999, p. 23). This means that, once an individual takes a certain perspective on any issue, it is extremely difficult to make them see things another way. This is also why it is often difficult for older generations to see things from a new perspective – their internal frames have solidified over many years. In summary, framing is the core psychological exploit underpinning all forms of information warfare and is a key area of research pertinent to my thesis topic.


Figure 4: An image from Twitter comparing two different Fox channels covering the same story through different frames (Media Matters 2014).


Propaganda has been discussed extensively in academia for many years, but research has not been able to keep up with the accelerated changes brought about by the Internet. Edward Bernays is considered the primary pioneer of modern propaganda and public relations, having worked on a number of campaigns for major American corporations. In his foundational book Propaganda (1928), he describes propaganda as a “consistent, enduring effort to create or shape events to influence the relations of the public to an idea, enterprise, or group” (p. 25). This definition still holds true – it is the dynamics that have changed. Propaganda operates through framing as discussed earlier, although this strategy is not always intentional. French philosopher Jacques Ellul builds on the work of Bernays, zeroing in on the importance of the mass media and the mass audience in creating successful propaganda. He argues that individuals in groups are more emotional and more easily provoked into action, which is why successful propaganda is impossible without mass media (1973, p. 7). This, combined with the fact that every mass media organisation is owned by individuals with their own frames and biases, means that every piece of media is a form of propaganda.

However, it is important to note that propaganda is not employed exclusively to manipulate politics and social causes. It has a wide array of uses including advertising, health promotion, agenda-setting, and so on. Whether a piece of propaganda is ‘good’ or ‘bad’ is in the eye of the beholder (Bernays 1928, p. 48). The primary difference between propaganda and other forms of IW is that it operates constantly and is not always created with a set goal in mind. Propaganda is not possible without access to the public narrative, which was once only available to the owners of mass media organisations. The distributed nature of the Internet as demonstrated in Figure 3 means that this power has now been spread to every user on the network, and the mechanics behind these new forms of propaganda are what my thesis aims to uncover.

Psychological Operations

Unlike propaganda, psychological operations (PSYOPS) are always specific operations created with a clear outcome in mind. According to Findley & Goldstein (1996, p. 11), PSYOPS “reduce the morale and combat efficiency of enemy troops and creates dissidence and disaffection within their ranks”. This definition adheres to the outdated idea that they only occur during times of conflict, and are only targeted towards the opposing combatants (Taylor 1997, p. 150). As the IW arena widens, PSYOPS become possible in times of war and peace, and all societies are potentially being targeted – not just those involved in conflict. Similar to propaganda, PSYOPS use information and psychology to attack hearts and minds rather than physical bodies. Rowett (2018, p. 4439) notes that “by controlling or influencing a culture, the norms and rationale of the culture and society can be altered, subverted, and changed.” These changes typically result in an outcome favourable to the country or organisation heading the PSYOP. An example of this can be seen in the US invasion of Iraq in 2003, when George W. Bush used fear to reframe the idea of war. Following 9/11 and the beginning of ‘the war on terror’, American citizens were already concerned about the rising threat of terrorism. Bush escalated these concerns by making unsubstantiated claims that Saddam Hussein had links to Al-Qaeda and was harbouring ‘weapons of mass destruction’ (Kellner 2004, p. 330). With substantial media coverage, people began to adopt the media’s framing of Iraq and saw invasion as a necessary pre-emptive strike. It has since been speculated that the goal of this invasion was to gain control of Iraqi oil supplies. This is just one example of what can be achieved through a PSYOP: in this case, the government used media framing to play on existing fears, making the idea of invading Iraq more acceptable.

Modern PSYOPS are often discussed on the fringes of the Internet, with users attempting to conduct their own operations. It can be argued that the #draftourdaughters campaign was an organic PSYOP, perhaps the first of its kind. Instead of being headed by one powerful individual, this operation allowed anyone on 4chan to contribute. And, most importantly, the tools of choice were not those we see in traditional propaganda – they were memes.

Memetic Warfare

When you hear the word meme, what’s the first thing that comes to mind? Most likely it’s a humorous Internet image that you recently saw and enjoyed. This is the everyday Internet user’s definition of a meme – however, the term goes much deeper than this. It was originally coined by evolutionary biologist Richard Dawkins. In his book The Selfish Gene (1976, p. 142), Dawkins defines a meme as “a unit of cultural transmission, or a unit of imitation”, comparing memes’ spread to genetic evolution. At its core, a meme is a concept, or part of one (Scott 2018, p. 8). It’s a small piece of information that self-replicates and changes as it spreads from mind to mind, resulting in infinite variations of the same original idea.

A meme can be almost anything – fashion, language, religion, opinions, symbols, catchy songs, and so on. All memes start as concepts, but not all concepts become memes (Hancock 2010, p. 41). What differentiates memes from concepts is that memes “spread themselves indiscriminately, without regard to whether they are useful, neutral, or harmful to us” (Blackmore 1999, p. 7). This description applies both to conceptual memes and the seemingly harmless visual memes we see regularly. While the two are constantly converging, a distinction should be made between what I refer to as ‘conceptual’ and ‘visual’ memes. Visual memes are limited to those created and spread online, while a conceptual meme refers to the memetic concepts and behaviours spread by any non-genetic means (Dawkins 1976, p. 245). Both conceptual and visual memes act as a means of narrative construction.

Despite appearing small, visual memes are semantically loaded with meanings that influence the way we perceive reality (Gal et al. 2016, p. 1699). As can be seen on the Internet and in broader society, some memes are more successful than others – some spread across the globe, and some ‘die’ immediately. Christianity, for example, is one of the most successful conceptual memes of all time. The ‘survival value’ of a meme is based on its psychological appeal to the viewer (Dawkins 1976, p. 250). Essentially, if a meme does not resonate with the frames inside the audience’s minds, it will not survive for long. Figure 1 demonstrates a visual meme which resonated with its viewers and subsequently went viral. This false campaign material played on the existing Hillary Clinton frame – feminism, diversity and social justice – which is why its survival value was so high.

The study of memetics has been taking place for quite some time, and the research shows that memes have far more of an impact than just making us laugh. Similar to the frames discussed above, memes “both reflect norms and constitute a central practice in their formation” (Gal et al. 2016, p. 1700), meaning that they have a real impact on the way society thinks and acts. One of the key factors that allows memes to influence public perception is the speed with which they spread and mutate, like a virus of the mind (Hancock 2010, p. 41). Much like propaganda, memes enter our minds and potentially change our developed frames before we have time to critically evaluate them. If a meme resonates with our existing frames, it is more likely to be stored in our long-term memory and passed on to others (Heuer 1999, p. 23). Both legacy and modern information warfare operate this way. The key difference is that online memes are accessible to virtually everyone. This means that, for the first time, the public narrative is being altered by the audience, rather than only by concentrated areas of power (Rowett 2018, p. 4437). The dynamics and implications of memetic warfare have barely been touched by academia, much less through a systemic lens. This is what my thesis aims to address – memetic warfare is exerting mass influence every day, and the existing literature surrounding the phenomenon is quickly becoming outdated.
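The interaction between survival value and framing described above can also be sketched as a toy model. In the simulation below – entirely my own illustrative construction, with an arbitrary ‘resonance’ formula and made-up parameters rather than anything drawn from the memetics literature – a meme’s chance of being adopted rises with how closely it matches each mind’s existing frame, so a frame-resonant meme saturates the population while a dissonant one dies almost immediately:

```python
import random

def spread(meme_frame, population, steps=10, reach=5):
    """Toy memetic-spread model: each adopter shows the meme to a few
    random people per step; the probability of adoption is the meme's
    'resonance' with the target's pre-existing frame (<= 0 means no match)."""
    random.seed(42)  # fixed seed so the contact pattern is repeatable
    adopted = {0}    # a single initial carrier
    for _ in range(steps):
        for _carrier in list(adopted):
            for _ in range(reach):
                target = random.randrange(len(population))
                resonance = 1.0 - 3.0 * abs(meme_frame - population[target])
                if random.random() < resonance:
                    adopted.add(target)
    return len(adopted) / len(population)  # fraction of minds reached

# A population whose internal frames cluster around 0.8
# (think of a like-minded online community).
random.seed(7)
crowd = [random.gauss(0.8, 0.1) for _ in range(500)]
print(spread(0.8, crowd))  # resonant meme: adopted by most of the crowd
print(spread(0.1, crowd))  # dissonant meme: barely spreads at all
```

Crude as it is, the sketch captures the dynamic Dawkins and Heuer describe: the same replication mechanism produces either viral saturation or instant death depending solely on how well the meme fits the audience’s existing frames.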

Memetic warfare combines memetics with information warfare to create a new, digital form of public persuasion. It is currently being discussed as an emergent type of information/psychological operations, where memes are created and spread via social media in an attempt to alter the perceptions and behaviours of targeted groups and individuals (Bradshaw & Howard 2017, p. 12). While memetic warfare aims to attack the public narrative through framing just as traditional information warfare does, the dynamics are completely different. The information battleground is more accessible and diverse than ever before, which means that the competition for control over public perception is being fought by anyone who wishes to do so. As a result, control of the public narrative is in a constant state of flux. In 1970, renowned media scholar Marshall McLuhan wrote that “World War 3 is a guerilla information war with no division between military and civilian participation” (p. 66). Given that individuals from all backgrounds are participating in memetic warfare, it seems McLuhan’s prediction has come to fruition with this modern phenomenon.

Just as propaganda cannot exist without the mass media, memetic warfare cannot exist without online communities. Bernays’ (1928) notion of the ‘public mind’ is mirrored by online communities, which create their own alternative public minds. These online subcultures develop their own social norms and values through the content they create, and the sense of collective identity these communities harbour is one of the main driving forces behind memetic warfare (Gal et al. 2016). While online communities often foster creativity, inclusion, and a wide variety of content, they can also become hotbeds for what Adrienne Massanari (2017, p. 330) refers to as “toxic technocultures”. Her article focuses on Reddit, though the label applies equally to 4chan – online communities which are often behind controversial and sometimes unlawful content. An example of this toxicity can be seen in 4chan’s 2013 #cut4bieber campaign. In early 2013, photos emerged of Justin Bieber smoking marijuana, causing his younger fans great concern. 4chan users leveraged this existing anxiety to start a false trend of self-harming in the name of Bieber, accompanied by the hashtag #cut4bieber (Monde 2013). Graphic images – both real and false – spread all over Twitter, and the hashtag went viral before people discovered it was falsely propagated. This is one example of memetic warfare on a relatively small scale, which had rather unfortunate implications for its audience. While there is already a wealth of research surrounding online communities and the content they produce, there is very little about the influence they have on public perception and behaviour in the ‘real world’. If a random group of 4chan users can convince young girls to self-harm, it’s highly likely they can memetically impose even more harmful behaviours on society. Memetic warfare campaigns such as #cut4bieber expose the need for a systemic understanding of these phenomena.

Without a deeper understanding of modern information warfare, we will not be able to protect ourselves from its influence. This need is emphasised by the fact that multiple military organisations are already conducting their own research on memetic warfare, but are struggling to turn the abstract concept into reality (Giesea 2017, p. 7). My thesis aims to address this need, providing a systemic breakdown of the way memetic warfare operates, and its implications going forward.

Despite having major influence over the way society thinks and behaves, conversations about memetic warfare remain outside the mainstream. As discussed earlier, the mass media landscape is currently undergoing a paradigm shift which is resulting in everyday Internet users manipulating the public narrative. This shift has completely transformed the dynamics of information warfare, and consumers are now more vulnerable to manipulation than ever before. This is why furthering the academic conversation about modern information warfare is so important. While there is plenty of research on traditional information warfare and memetics, there is very little written about memetic warfare. Additionally, the existing memetic warfare literature fails to combine the theoretical with the practical. My thesis aims to address this gap from a systemic perspective, so that media researchers and intelligence professionals alike can recognise the need for further research into this phenomenon. Rather than discussing memetic warfare in an abstract sense, my thesis will analyse what it looks like in practice. In doing so, I aim to propose strategies that will help individuals protect themselves from being manipulated by such tactics. To achieve this, I will incorporate a range of media theories combined with my own case studies of two separate memetic warfare campaigns. With this combination, I hope to produce a thesis that has practical applications beyond academia.


Anderson, C 2008, The Long Tail: Why the Future of Business is Selling Less of More, Hyperion eBooks, New York.

Baran, P 1962, On Distributed Communications Networks, RAND Corporation, Santa Monica, viewed 25 September 2019, <http://www.rand.org/content/dam/rand/pubs/papers/2005/P2626.pdf>

Bartlett, FC 1932, Remembering: A Study in Experimental and Social Psychology, Cambridge University Press, New York.

Bernays, E 1928, Propaganda, Liveright, New York.

Blackmore, S 1999, The Meme Machine, Oxford University Press, Oxford.

Bradshaw, S & Howard, P 2017, ‘Troops, Trolls and Troublemakers: A Global Inventory of Organized Social Media Manipulation’, Computational Propaganda Research Project Working Paper, Oxford Internet Institute, Oxford, pp. 4-24.

Bruns, A 2006, ‘Towards Produsage: Futures for User-Led Content Production’, in Proceedings Cultural Attitudes towards Communication and Technology 2006, Tartu, Estonia, viewed 25 September 2019, <https://eprints.qut.edu.au/4863/1/4863_1.pdf>

Dawkins, R 1976, The Selfish Gene, Oxford University Press, Oxford.

Ellul, J 1973, Propaganda: The Formation of Men’s Attitudes, Random House, New York.

Entman, RM 1993, ‘Framing: Toward Clarification of a Fractured Paradigm’, Journal of Communication, vol. 43, no. 4, pp. 51-58.

Findley, BF & Goldstein, FL 1996, Psychological Operations: Principles and Case Studies, Defense Technical Information Center, Virginia.

Gal, N, Shifman, L & Kampf, Z 2016, ‘“It Gets Better”: Internet Memes and the Construction of Collective Identity’, New Media & Society, vol. 18, no. 8, pp. 1698-1714.

Giles, K 2016, Russian Handbook of Information Warfare, NATO Defense College, Rome.

Giesea, J 2017, ‘It’s Time to Embrace Memetic Warfare’, NATO Defence Strategic Communications Journal, vol. 1, no. 5.

Hancock, BJ 2010, ‘Memetic Warfare: The Future of War’, Military Intelligence Professional Bulletin, vol. 36, no. 2, pp. 41-47.

Heuer, R 1999, Psychology of Intelligence Analysis, Central Intelligence Agency, Virginia.

Kellner, D 2004, ‘Media Propaganda and Spectacle in the War on Iraq: A Critique of U.S. Broadcasting Networks’, Cultural Studies/Critical Methodologies, vol. 4, no. 3, pp. 329-38.

Kuzio, A 2014, Exploitation of Schemata in Persuasive and Manipulative Discourse in English, Polish and Russian, Cambridge Scholars Publishing, Newcastle upon Tyne, England.

Lee, LT 2009, ‘History and Development of Mass Communications’, Journalism and Mass Communication, vol. 1.

Massanari, A 2017, ‘#Gamergate and The Fappening: How Reddit’s Algorithm, Governance, and Culture Support Toxic Technocultures’, New Media & Society, vol. 19, no. 3, pp. 329-346.

McLuhan, M 1970, Culture is Our Business, Ballantine Books, New York.

Media Matters 2014, 9 August, viewed 27 September 2019, <https://twitter.com/mmfa/status/497856477802278912>

Monde, C 2013, ‘Justin Bieber Fans Draw Shock, Outrage With Gruesome ‘Cut4Bieber’ Trending Topic’, Daily News, 8 January, viewed 20 October 2019, <https://www.nydailynews.com/entertainment/gossip/cut4bieber-trending-topic-draws-shock-outrage-article-1.1235624>

Nagle, A 2017, Kill All Normies: Online Culture Wars From 4Chan And Tumblr To Trump And The Alt-Right, Zero Books, Winchester.

Nelson, L 2016, ‘Clinton: I Support Women Registering for the Draft’, Politico, viewed 16 October 2019, <https://www.politico.com/story/2016/06/hillary-clinton-women-draft-register-224390>

Pluwak, A 2015, ‘The Linguistic Aspect of Strategic Framing in Modern Political Campaigns’, Cognitive Studies, no. 11, p. 307.

Rowett, G 2018, ‘The Strategic Need to Understand Online Memes and Modern Information Warfare Theory’, paper presented to IEEE Conference on Big Data, Seattle, USA, 10-13 December, <https://www.researchgate.net/publication/330629162_The_strategic_need_to_understand_online_memes_and_modern_information_warfare_theory>

Scott, J 2018, Information Warfare: The Meme Is the Embryo of the Narrative Illusion, Center for Cyber-Influence Operations Studies, Washington, D.C.

Shirky, C 2008, Here Comes Everybody: The Power of Organizing Without Organizations, Penguin Books, New York.

Smith, J 2016, Will Hillary Draft Women? Inside #DraftOurDaughters, a Fake Ad Campaign From the Alt-Right, image, Mic, viewed 10 October 2019, <https://www.mic.com/articles/157933/will-hillary-draft-women-inside-draft-our-daughters-a-fake-ad-campaign-from-the-alt-right>

Taylor, P 1997, Global Communications, International Affairs and the Media Since 1945, Routledge, London.

Wall, T & Mitew, T 2018, ‘Swarm Networks and the Design Process of a Distributed Meme Warfare Campaign’, First Monday, vol. 23, no. 5, <https://firstmonday.org/ojs/index.php/fm/article/view/8290/7202>

WebFX n.d., ‘The 6 Companies That Own (Almost) All The Media’, WebFX, weblog post, viewed 10 October 2019, <https://www.webfx.com/blog/internet/the-6-companies-that-own-almost-all-media-infographic/>
