Tagged: identity

  • Ian O'Byrne 1:18 pm on March 13, 2020 Permalink | Reply
    Tags: identity   

    How to Generate Infinite Fake Humans 


    Humankind has remedied the shock of modern life with pleasures from its reverberations. It is telling that the early commercial applications of AI similar to This Person Does Not Exist include stock photography and pornography, two domains in which the actual, lived experience of human beings is completely subordinated to their deployment as vessels for pure spectacle, banal on the one hand and lurid on the other. Whether or not someone “really exists” has been of little concern in most social …

  • Ian O'Byrne 12:26 pm on February 29, 2020 Permalink | Reply
    Tags: identity, securitybreach

    Exclusive: A high school student created a fake 2020 candidate. Twitter verified it – CNN 


    The blue checkmark is a hallmark of Twitter and one that was later copied by Facebook. It is often given to prominent accounts belonging to journalists, politicians, government agencies and businesses. The feature is central to Twitter’s goal of helping users find reliable information on the platform, often from verified newsmakers.

  • Ian O'Byrne 12:27 pm on February 13, 2020 Permalink | Reply
    Tags: identity

    Mark Zuckerberg’s privilege to be forgotten 


    Between his destroyed journal records and Facebook messages, we know now that Zuckerberg was able to keep significant parts of his life and his company’s history private. And between scandals like Cambridge Analytica and Clearview AI, in which huge troves of data were scraped from users without their knowledge or consent, the monster that Zuckerberg created may have foreclosed the possibility of privacy for billions of people. That feels more than just unequal; it seems historically cruel.

  • Ian O'Byrne 4:33 pm on January 24, 2020 Permalink | Reply
    Tags: identity

    Talking to Children About Digital Safety 


    The Internet provides an opportunity for children to learn, explore their
    world, and socialize with friends. By understanding the potential dangers
    your children face, you can educate them and help them have safer digital …

  • Ian O'Byrne 4:32 pm on January 24, 2020 Permalink | Reply
    Tags: identity

    Digital Sharing 


    Write a Character Sketch
    Ask students to write a character sketch about one or both of the characters in the video. Be as creative as you can. There are no wrong answers. Give these characters a life of their own, whatever you want it to be.

    Prompt with questions:

    Who is your character?
    What is his/her name?
    Who are his/her friends?
    How long have they known each other?
    Who are the people in his/her family? What are they like?
    What’s his/her backstory?
    Where does he/she live?
    Where did he/she u…

  • Ian O'Byrne 4:31 pm on January 23, 2020 Permalink | Reply
    Tags: identity

    Young Children (0-8) and Digital Technology 


    Families who live in houses with gardens and/or in child-friendly residential areas (i.e. no busy streets, neighbours with children, squares, playgrounds, etc.) talk more about playing outside and with their neighbours’ children. A playground in the vicinity also encourages outdoor play. In the case of the Gamma family, because neighbours knew each other quite well and the neighbourhood was very safe, children could walk in and out of their houses quite freely to visit their friends in the neigh…

  • Ian O'Byrne 4:31 pm on January 23, 2020 Permalink | Reply
    Tags: identity

    How to Model and Explain Digital Security to K-12 Students 


    Passwords, Privacy, Personal Information, Photographs, Property, Permission, Protection, Professionalism, Personal Brand

  • Ian O'Byrne 12:52 pm on September 2, 2019 Permalink | Reply
    Tags: identity, public

    Posting publicly online 

    I’m a proponent of sharing openly, and publicly online. This means that, warts and all, I (for the most part) share what I do… and who I am. I mean… look at what I’m doing here. I’m doing my daily five minute journal here openly online. No one may come and read this… but that’s not the point. 🙂

    As I guide others, there is often the question of WHY you would do this. There is also the question of what you should share… and whether you share everything.

    From my own perspective… I share what I believe fits into this “brand” that I’ve developed. It’s not the right term, and it does rub some people the wrong way… but I share things that deal with literacy, education, and technology. I do try to work some aspects of advocacy, or activism… or productivity into it. But, it’s all in the same realm of content.

    I believe that overall, sharing and opening up over time has helped me. I believe that it helps me build my identity and I’ve had offers and opportunities because of this collection of materials I’ve shared online.

    For the most part, I don’t believe that it has been a negative. There have been one or two times that I’ve shared something and I’ve gotten a response that’s been a bit less than I planned for.

    I do wonder as I build up an audience and this identity…will the expectations change?

  • Ian O'Byrne 1:15 pm on October 16, 2018 Permalink | Reply
    Tags: identity

    Friction-Free Racism 

    Friction-Free Racism by Chris Gilliard (Real Life)

    Surveillance capitalism turns a profit by making people more comfortable with discrimination

    Chris Gilliard in Real Life Magazine. All annotations in context.

    Questions about the inclusivity of engineering and computer science departments have been going on for quite some time. Several current “innovations” coming out of these fields, many rooted in facial recognition, are indicative of how scientific racism has long been embedded in apparently neutral attempts to measure people — a “new” spin on age-old notions of phrenology and biological determinism, updated with digital capabilities.

    There is a need for diverse individuals in engineering, computer science, and STEM fields as these technological devices become ubiquitous in our lives.

    Only the most mundane uses of biometrics and facial recognition are concerned with only identifying a specific person, matching a name to a face or using a face to unlock a phone. Typically these systems are invested in taking the extra steps of assigning a subject to an identity category in terms of race, ethnicity, gender, sexuality, and matching those categories with guesses about emotions, intentions, relationships, and character to shore up forms of discrimination, both judicial and economic.

    Points about the use of technology as a means to identify and differentiate between groups, most specifically in terms of race.

    A key to Browne’s book is her detailed look at the way that black bodies have consistently been surveilled in America: The technologies change, but the process remains the same. Browne identifies contemporary practices like facial recognition as digital epidermalization: “the exercise of power cast by the disembodied gaze of certain surveillance technologies (for example, identity card and e-passport verification machines) that can be employed to do the work of alienating the subject by producing a ‘truth’ about the body and one’s identity (or identities) despite the subject’s claims.”

    More about coding difference and using this as a means to prescribe the same power structures and ideologies.

    Many current digital platforms proceed according to the same process of writing difference onto bodies through a process of data extraction and then using “code” to define who is what. Such acts of biometric determinism fit with what has been called surveillance capitalism, defined by Shoshana Zuboff as “the monetization of free behavioral data acquired through surveillance and sold to entities with interest in your future behavior.”


    In other words, race is deployed as an externally assigned category for purposes of commercial exploitation and social control, not part of self-generated identity for reasons of personal expression. The ability to define one’s self and tell one’s own stories is central to being human and how one relates to others; platforms’ ascribing identity through data undermines both.

    Having just finished White Fragility, this is at the top of my mind right now. The consideration of the systems involved in racism, and codification of these differences…while distancing people from the system so they don’t feel like they’re a part of it.

    At the same time racism and othering are rendered at the level of code, so certain users can feel innocent and not complicit in it.

    Adding algorithms to the model intensifies the problem as it doubles and triples down on user signals.

    Once products and, more important, people are coded as having certain preferences and tendencies, the feedback loops of algorithmic systems will work to reinforce these often flawed and discriminatory assumptions. The presupposed problem of difference will become even more entrenched, the chasms between people will widen.

    This is making me think about a recent piece in which our social media feeds were examined to consider the ways in which they reify the powerful by using algorithms to modify the feed.
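    The feedback-loop dynamic described above can be sketched with a toy simulation. This is purely illustrative and not from Gilliard's article: the function name, numbers, and update rule are my own assumptions. The point it demonstrates is that a system which starts with a weak assumption about a user, shows only items matching that assumption, and treats any engagement as confirmation will drive its confidence toward certainty regardless of whether the initial label was ever right.

```python
# Toy simulation of an algorithmic feedback loop (illustrative only).
# A recommender starts with a weak, possibly wrong prior about a user's
# "category" and only surfaces items matching its current guess. Because
# the user can only click what is shown, every interaction reads as
# evidence for the system's initial assumption.

def run_feedback_loop(prior: float, rounds: int, boost: float = 0.1) -> float:
    """Return the model's confidence in its initial label after `rounds`
    of showing only label-matching items and treating clicks as confirmation."""
    confidence = prior
    for _ in range(rounds):
        # The system never shows counter-evidence, so confidence can
        # only grow; cap it at certainty.
        confidence = min(1.0, confidence + boost * confidence)
    return confidence

start = 0.55  # a weak initial assumption about the user
after = run_feedback_loop(start, rounds=20)
print(f"{start:.2f} -> {after:.2f}")  # the weak prior hardens to 1.00
```

    Running it, a 0.55 prior hardens to full certainty within a couple dozen rounds: the loop never gathers evidence that could lower the confidence, which is exactly the entrenchment the passage describes.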

    What would it look like to be constantly coded as different in a hyper-surveilled society — one where there was large-scale deployment of surveillant technologies with persistent “digital epidermalization” writing identity on to every body within the scope of its gaze?


    Proponents of persistent surveillance articulate some form of this question often and conclude that a more surveillant society is a safer one. My answer is quite different. We have seen on many occasions that more and better surveillance doesn’t equal more equitable or just outcomes, and often results in the discrimination being blamed on the algorithm. Further, these technological solutions can render the bias invisible.

    A powerful takeaway from Gilliard that will resonate with me for some time.

    The end game of a surveillance society, from the perspective of those being watched, is to be subjected to whims of black-boxed code extended to the navigation of spaces, which are systematically stripped of important social and cultural clues. The personalized surveillance tech, meanwhile, will not make people less racist; it will make them more comfortable and protected in their racism.
