Educational Content, Openness and Surveillance in the Digital Ecology
“I remember, remembering without images”
Five Rings – Orla Barry
Almost everything we do today is mediated by some type of device. It is difficult to perform any task without data created by us, or data resulting from our actions, circulating through third-party computers. Permeated as we are by ‘free’ services – offered in the form of spaces for collaboration, production and sharing of content – it is worth asking: why are these services being offered at ‘no cost’ to educational institutions and used by administrators, teachers and students?
The world only recently learned, through Edward Snowden’s revelations, of the scale of information and the methods used by the United States National Security Agency (NSA) to collect data from individuals and organizations. What was once a niche discussion among researchers and ‘technophobic alarmists’ became a scandal that opened a Pandora’s box, leading to public discussion of privacy and surveillance. There are now regular accounts of the indiscriminate collection of data and personal information by corporations through devices, apps and services, as well as a general lack of care by companies and governments for data that citizens considered protected, confidential or private.
Still, for the vast majority of the population, the alarm did not bring about substantial change in behavior. Even the revelation that companies like Microsoft and Google were cooperating with the United States government through NSA’s PRISM program sounded perhaps too remote to elicit any concrete reaction from most users of these platforms.
The alarm rang once again with the revelation that data collected through Facebook had been used for political purposes by the British company Cambridge Analytica. The profiling of users on social networks to influence voters through targeted content, in several countries, made the privacy alarm ring very loudly for the billions of users of the largest social network on the planet. Many (including celebrities!) declared that they would leave the network. Ultimately, the impact on the number of Facebook accounts and on profits was negligible, nothing but a jab.
In the context of Brazil, the jab became a hard cross with the presidential election of 2018 and the climate of mistrust created by the circulation of fake news. The impact of the indiscriminate collection and use of data by surveillance platforms created an open wound in family discussion groups (which, of course, happen on these same platforms), a fear of open dialogue in work environments, and a marked polarization of daily life. The belief in the neutrality of technology and the certainty that social networks were bringing us closer together were upended in mainstream media and in bar conversations, overturning the principle (as Zuckerberg promised) that social network services like Facebook are naturally forces for good that promote the integration of humankind. For a large part of the population, the consequence of this awareness was not a radical change in behavior. Still, we have evidence that there is at least an interest in better understanding this uncharted territory.
In our interactions with various content and media platforms, we contribute content explicitly (images, videos, texts and the like), in which case we can establish some measure of protection by abiding by rules or following an ethic of sharing (photos of projects and student productions, yes; photos of students’ faces in class, no). But we also share data implicitly, as when data about our patterns of behavior, our interactions, our pauses and breaths are collected; everything is content and, for this market, everything generates value. This data is opaque and distant; it is not visible, tangible and memorable like the videos and photos in a timeline. Gaining access to, and understanding, the scale and scope of the information we share in this manner is not a simple task.
The electoral scandals and fake news blew up the neutrality argument of content platforms, which are regulated by people in corporations, but mainly by opaque algorithms. Even when dealing with data in the aggregate, searching for patterns, the algorithms select and promote content, recommending videos, photos, news and contacts. Algorithms can reinforce worldviews, radicalism and prejudices.
The scandals and problems mentioned above have at least one point in common: large businesses that act as ‘intermediaries’ of our experiences. Market consolidation means that most of our production, storage and sharing of content, the articulation of our research, and communication with our colleagues take place through these channels, whether we know it or not. These are the same companies involved in the leaks and scandals mentioned above.
The sharing of videos posted on YouTube; documents, data and sensitive information sent via Gmail or Drive (Google); the creation of student and teacher groups on WhatsApp, Instagram and Facebook (all owned by Facebook): these are just some examples of common practices in research institutions and in basic and higher education. The use of a non-institutional email address as an ‘official’ point of communication by educational institutions and other actors in the public sphere is also very common.
If in the social sphere this concern has stimulated discussion and debate, in the field of education the consternation is still incipient. We have held users responsible for making ‘conscious use’ of the networks, while ignoring the power and role of governments, institutions and large companies in building and defining this digital ecosystem. The biggest and most important problems in the relationship between educational institutions and the corporations associated with surveillance capitalism are only beginning to be illuminated.
Public universities and educational networks around the world (including Brazil) have established partnerships with companies such as Microsoft and Google, outsourcing services previously considered essential, such as email management and data storage. Personal data, private information in scanned documents, search strings, and a multitude of private data circulate opaquely through these connections, weaving a cozy relationship between institutions and businesses. In addition, the adoption of these services by institutions, teachers and administrators turns them into de facto communication and information structures within institutions, replacing the institutional apparatus.
When school administrators decide that the space for notifications and information exchange will be, for example, a Facebook group, what choice (besides using Facebook) does a parent or guardian have if she or he wants to keep abreast of what happens at school? If a teacher decides that students should join a WhatsApp group to coordinate classroom activities, what real choice does a student have to decline participation?
In these and other cases, recurrent in our institutions in Brazil (and, I believe, in many other, particularly poorer, countries), several dilemmas remain unexplored. First, there is no awareness or clarity about the meaning of ‘gratuity’ and the forms of control and surveillance that are in fact the business model behind these services. Second, there is a clear asymmetry of power between providers and users, which is rarely questioned and becomes, in certain cases, a form of coercion. Third, the opportunity is lost to counteract inertia and to teach how allegiance to these platforms has created qualitatively different, and often harmful, formats and spaces of communication. Finally, the opportunity to explore alternatives to surveillance services is simply missed.
We need deeper discussions in our schools and universities about how we should exercise control over our tools for work, communication and sharing, particularly when we promote open education. We have learned through these scandals that such choices matter, and that these platforms have a direct impact on the structure of our society. Fortunately, we already have several alternatives in the form of services, systems and models (both paid and free/libre) that allow us to glimpse other futures for our technological development. The task is urgent.
The Guardian, “The NSA Files”, 2019, https://www.theguardian.com/us-news/the-nsa-files.
Carole Cadwalladr and Emma Graham-Harrison, “Revealed: 50 Million Facebook Profiles Harvested for Cambridge Analytica in Major Data Breach”, The Guardian, March 17, 2018, sec. News, https://www.theguardian.com/news/2018/mar/17/cambridge-analytica-facebook-influence-us-election.
This of course, parallels the experience of other countries’ elections during the same period. For the case of Brazil, see: Cristina Tardáguila, Fabrício Benevenuto, and Pablo Ortellado, “Opinion | Fake News Is Poisoning Brazilian Politics. WhatsApp Can Stop It.”, The New York Times, October 17, 2018, Opinion section, https://www.nytimes.com/2018/10/17/opinion/brazil-election-fake-news-whatsapp.html.
Agência Brasil, “Mark Zuckerberg publica manifesto e defende ‘comunidade global’” [“Mark Zuckerberg publishes manifesto and defends ‘global community’”], February 17, 2017, http://agenciabrasil.ebc.com.br/internacional/noticia/2017-02/mark-zuckerberg-publica-manifesto-e-defende-comunidade-global.
Pew Research Center, “The State of Privacy in America”, Pew Research Center (blog), 2016, https://www.pewresearch.org/fact-tank/2016/09/21/the-state-of-privacy-in-america/.
Max Fisher and Amanda Taub, “How YouTube Radicalized Brazil”, The New York Times, August 11, 2019, sec. World, https://www.nytimes.com/2019/08/11/world/americas/youtube-brazil.html; Zeynep Tufekci, “Algorithmic Harms beyond Facebook and Google: Emergent Challenges of Computational Agency”, Colo. Tech. LJ 13 (2015): 203.
See our Education Under Surveillance Project at https://educacaovigiada.org.br. Also: Henrique Parra et al., “Infraestruturas, Economia e Política Informacional: O Caso Do Google Suite for Education”, Mediações 23, no. 1 (2018): 63–99, https://doi.org/10.5433/2176-6665.2018v23n1p63; Ribeiro da Cruz, Filipe Saraiva, and Tel Amiel, “Coletando Dados Sobre o Capitalismo de Vigilância Nas Instituições Públicas Do Ensino Superior Do Brasil”, in VI Simpósio Internacional LAVITS (LAVITS, Salvador, 2019), http://lavits.org/eventos/simposio-lavits-2019/.
Tel Amiel completed his PhD in Instructional Technology at the University of Georgia. He is currently professor at the School of Education at the University of Brasília, where he coordinates the UNESCO Chair in Distance Education. He was previously coordinator of the UNESCO Chair in Open Education (Unicamp), a visiting fellow at the University of Wollongong and Stanford University, and a visiting professor at Utah State University. He co-leads the Open Education Initiative, a grassroots collective dedicated to promoting open education policy and practice.
Amiel, T., ter Haar, E., Vieira, M. S., & Soares, T. C. (2020). Who Benefits from the Public Good? How OER Is Contributing to the Private Appropriation of the Educational Commons. In D. Burgos (Ed.), Radical Solutions and Open Science: An Open Approach to Boost Higher Education (pp. 69–89). Springer.
Amiel, T., & Soares, T. C. (2020). Advancing Open Education Policy in Brazilian Higher Education. In K. Zhang, C. J. Bonk, T. C. Reeves, & T. H. Reynolds (Eds.), MOOCs and Open Education in the Global South: Challenges, Successes, and Opportunities (pp. 229–235). Routledge.
This text was originally published in expanded form in Portuguese under a Creative Commons Attribution-NonCommercial 4.0 International License as: Amiel, T. (2019). Conteúdos educacionais, abertura e vigilância na ecologia digital. In F. J. de Almeida, G. Torrezan, L. Lima, & R. E. Catelli (Eds.), Cultura, educação e tecnologias em debate (3rd ed.). SESC São Paulo. Translated here with small adaptations by the author.