Talking to Children about Technology, Social Media, & Algorithms

Together with Katie Paciga, Elizabeth Stevens, and Kristin Turner, I submitted a proposal in response to this call for papers from the Journal of Design and Science. The call is a partnership between Wired Magazine, the MIT Media Lab, and the UC Irvine Connected Learning Lab.

The special issue invites both empirical and conceptual papers from diverse perspectives that explore different facets of this emerging area of inquiry. They also welcome submissions from children and young people.

Our proposal was reviewed and accepted. We are now conducting interviews with our children to talk about these issues. We’re also canvassing the field to identify best practices for having these discussions.

What age- and developmentally appropriate texts, tools, and questions do you use to talk to young kids (ages 3/4 to 9/10) about tech, privacy, security, computational thinking, etc.? Please share resources in the comments below. I’m also interested in conducting interviews and sharing them as podcasts to supplement the publication. Please reach out via email (hello@wiobyrne.com) or social networks.

I’ve shared our full proposal below to promote transparency and open scholarship.


Co-constructing Digital Futures: Parents and Children Becoming Thoughtful, Connected, and Critical Users of Digital Technologies

We are literacy researchers who understand that children live in and shape a connected world where the ability to consume and create is literally at their fingertips. We care deeply about preparing them to be lifelong learners with the skills they need to access, analyze, evaluate, create, and participate through digital technologies (Ito et al., 2013). We are also parents who must navigate the realities of a digital world: every time our children log into an app on a device they are using at school, they leave a data trail. We also know they often gain the affordances of digital technologies at the price of their privacy (Berson & Berson, 2006). At the same time, we know that developing digital literacy includes the understanding that algorithms drive users to particular content (Burrell, 2016). Children’s worldviews can be limited by geofencing and other algorithmic tools that are driven by for-profit purposes. How can parents help their children understand the challenges and opportunities in these trends, forces, and tensions? How can parents help children be more reflective about the activities in which they engage? This proposed article will explore these questions through dual perspectives: those of parents (literacy researchers) and youth.

As we look to future-proof digital rights for ourselves and our children, we must understand and contest how these technologies impact our privacy, autonomy, security, human dignity, justice, and power structures. Furthermore, our interactions across this digitized society help feed algorithms that track, learn, and predict our current and future behaviors (Willson, 2018). In order to safeguard our children and their agency in future contexts, we need to contextualize current and future technologies while preparing youth to be both hypervisible and invisible as they interact with these digital texts, tools, and spaces (Emejulu & McGregor, 2019). It is within these complex informational ecosystems that our children will grow up, and one-size-fits-all privacy, security, and literacy solutions do not equitably support everyone (Bertot, Jaeger, & Hansen, 2012). Marginalized and vulnerable populations bear a disproportionate burden of harm, so the ethics of these digitization practices, and of the entities that collect and curate our data, must be called into question (Dillon, 2010).

As parents with, perhaps, more knowledge about how algorithms and privacy work in a digital world, we sit at an interesting intersection (Garcia et al., 2014). This article will offer insight into how each of us navigated conversations with our own children, ages 3–12, about the realities of the digital world, both its affordances and limitations, particularly those associated with the interactions between algorithms and privacy. Our research documents four case studies and includes methods such as stimulated response, individual interviewing, and focus group interviewing.

In this writing, we propose a more collaborative approach than what has typically been adopted when thinking about children and technology. Rather than framing the problem as technology doing harm to children, we suggest that we can empower children to advocate for their own rights in an age of screentime (Turner et al., 2017). Our article will be co-authored with our children, who are certainly stakeholders, and potentially experts, in digital and social technologies. We believe that the voices of youth are often ignored in discussions about technology use and practice. In this article we seek to empower children to share their perceptions in order to identify and amplify their beliefs about privacy, security, and potential future threats resulting from these algorithmic methods. Our children’s perceptions may differ from an adult perspective, and it will be important for their voices to be heard throughout this process, including as contributors to the article.

Collaboratively with our children, we will outline activities that parents (and educators) can use in their own contexts in order to prepare youth to be thoughtful, perhaps skeptical, users of tools and spaces. We approach this work as educators, researchers, and most importantly parents. We hope that by documenting our collective strategies to learn more with our children about the ways algorithms drive and focus our digital experiences, we can empower other parents, educators, and researchers to do the same and, in turn, work to develop youth who are more critical and aware in their roles as consumers and creators of content.

References

Berson, I. R., & Berson, M. J. (2006). Children and their digital dossiers: Lessons in privacy rights in the digital age. International Journal of Social Education, 21(1), 135-147.

Bertot, J. C., Jaeger, P. T., & Hansen, D. (2012). The impact of policies on government social media usage: Issues, challenges, and recommendations. Government Information Quarterly, 29(1), 30-40.

Burrell, J. (2016). How the machine ‘thinks’: Understanding opacity in machine learning algorithms. Big Data & Society, 3(1). doi: 10.1177/2053951715622512

Dillon, R. S. (2010). Respect for persons, identity, and information technology. Ethics and Information Technology, 12(1), 17-28. doi:10.1007/s10676-009-9188-8 

Emejulu, A., & McGregor, C. (2019). Towards a radical digital citizenship in digital education. Critical Studies in Education, 60(1), 131-147. doi: 10.1080/17508487.2016.1234494

Garcia, A., Cantrill, C., Filipiak, D., Hunt, B., Lee, C., Mirra, N., & Peppler, K. (2014). Teaching in the connected learning classroom. Irvine, CA: Digital Media and Learning Research Hub.

Ito, M., Gutiérrez, K., Livingstone, S., Penuel, B., Rhodes, J., Salen, K., Schor, J., Sefton-Green, J., & Watkins, S. C. (2013). Connected learning: An agenda for research and design. Irvine, CA: Digital Media and Learning Research Hub.

Turner, K. H., Jolls, T., Hagerman, M. S., O’Byrne, W., Hicks, T., Eisenstock, B., & Pytash, K. (2017). Developing digital and media literacies in children and adolescents. Pediatrics, 140(Supplement 2), S122-S126.

Willson, M. (2018). Raising the ideal child? Algorithms, quantification and prediction. Media, Culture & Society. doi: 10.1177/0163443718798901

 
