My comments were included in the section on concerns about technology being used for control and exploitation.
Ian O’Byrne, an assistant professor of education at the College of Charleston, wrote, “I believe that technology, most notably these digital spaces and places we inhabit, has the potential to both help and harm our well-being over time. These technologies can prove to be helpful as they connect individuals on a global scale and allow people to inform and educate themselves on a level previously unavailable. Social connections across affinity spaces can be created to allow us to find like-minded individuals and learn together regardless of where we exist geographically. In many ways, I think we’ll move to a model (if we’re not there already) in which we have much more in common with connections in these digital, social spaces than we do with neighbors in our own local area.
The challenge is that these technologies can also be extremely harmful, and I believe they will only become worse. Many of these digital, social spaces are playing fast and loose with our data and privacy. Our social signals are sucked up by algorithms that double and triple down on them to give more of the same to our feed. Within this black box of an individual’s customized, aggregated feed is the potential for propaganda, hate and disinformation to spawn. Where family, friends and possibly community would previously be there to step in and try to normalize these perspectives, an individual can be indoctrinated online by their own bias and move against their well-being and the well-being of society, possibly without even knowing it.
Ultimately, whether this helps or hurts people in the long run will be determined by an individual’s willingness to understand, problematize and strive for balance as they interact with these technologies. We need education, specifically in critical thinking and evaluation of these texts and spaces, to empower individuals to ask questions and problematize their own thinking.”