Tagged: identity

  • Ian O'Byrne 12:27 pm on February 13, 2020 Permalink | Reply
     Tags: identity

    Mark Zuckerberg’s privilege to be forgotten 

    https://www-theverge-com.cdn.ampproject.org/v/s/www.theverge.com/platform/amp/2020/2/12/21135314/mark-zuckerberg-deleted-journals?usqp=mq331AQCKAE%3D&amp_js_v=0.1#referrer=https%3A%2F%2Fwww.google.com&amp_tf=From%20%251%24s&ampshare=https%3A%2F%2Fwww.theverge.com%2F2020%2F2%2F12%2F21135314%2Fmark-zuckerberg-deleted-journals

    Between his destroyed journal records and Facebook messages, we know now that Zuckerberg was able to keep significant parts of his life and his company’s history private. And between scandals like Cambridge Analytica and Clearview AI, in which huge troves of data were scraped from users without their knowledge or consent, the monster that Zuckerberg created may have foreclosed the possibility of privacy for billions of people. That feels more than just unequal; it seems historically cruel.

     
  • Ian O'Byrne 4:33 pm on January 24, 2020 Permalink | Reply
     Tags: identity

    Talking to Children About Digital Safety 

    https://www.d2l.org/wp-content/uploads/2016/12/TALKING_TO_KIDS_ABOUT_DIGITAL_SAFETY_10.5.15.pdf?gclid=Cj0KCQiApaXxBRDNARIsAGFdaB_GWfPlJFHJDhqXsuP_U2m97ZYUfP4pUSOetjhHrJJjk-A7hobLypMaAuYpEALw_wcB

    The Internet provides an opportunity for children to learn, explore their
    world, and socialize with friends. By understanding the potential dangers
    your children face, you can educate them and help them have safer digital
    experiences.

     
  • Ian O'Byrne 4:32 pm on January 24, 2020 Permalink | Reply
     Tags: identity

    Digital Sharing 

    https://curriculum.code.org/csf-19/coursef/19/

    Write a Character Sketch
    Ask students to write a character sketch about one or both of the characters in the video. Be as creative as you can. There are no wrong answers. Give these characters a life of their own, whatever you want it to be.

    Prompt with questions:

    Who is your character?
    What is his/her name?
    Who are his/her friends?
    How long have they known each other?
    Who are the people in his/her family? What are they like?
    What’s his/her backstory?
    Where does he/she live?
    Where did he/she u…

     
  • Ian O'Byrne 4:31 pm on January 23, 2020 Permalink | Reply
     Tags: identity

    Young Children (0-8) and Digital Technology 

    http://www.lse.ac.uk/media@lse/research/ToddlersAndTablets/RelevantPublications/Young-Children-(0-8)-and-Digital-Technology.pdf

     Families who live in houses with gardens and/or in child-friendly residential areas (i.e. no busy streets, neighbours with children, squares, playgrounds, etc.) talk more about playing outside and with their neighbours’ children. A playground in the vicinity also encourages outdoor play. In the case of the Gamma family, because neighbours knew each other quite well and the neighbourhood was very safe, children could walk in and out of their houses quite freely to visit their friends in the neigh…

     
  • Ian O'Byrne 4:31 pm on January 23, 2020 Permalink | Reply
     Tags: identity

    How to Model and Explain Digital Security to K-12 Students 

    https://www.thetechedvocate.org/model-explain-digital-security-k-12-students/

    Passwords, Privacy, Personal Information, Photographs, Property, Permission, Protection, Professionalism, Personal Brand

     
  • Ian O'Byrne 12:52 pm on September 2, 2019 Permalink | Reply
     Tags: identity, public

    Posting publicly online 

     I’m a proponent of sharing openly and publicly online. This means that, warts and all, I (for the most part) share what I do…and who I am. I mean…look at what I’m doing here. I’m doing my daily five minute journal here openly online. No one may come and read this…but that’s not the point. 🙂

     As I guide others, there is often the question of WHY you would do this. There is also the question of what you should share…and whether you share everything.

    From my own perspective…I share what I believe fits into this “brand” that I’ve developed. It’s not the right term, and it does rub some people the wrong way…but I share things that deal with literacy, education, and technology. I do try and work some aspects of advocacy, or activism…or productivity into it. But, it’s all in the same realm of content.

    I believe that overall, sharing and opening up over time has helped me. I believe that it helps me build my identity and I’ve had offers and opportunities because of this collection of materials I’ve shared online.

    For the most part, I don’t believe that it has been a negative. There have been one or two times that I’ve shared something and I’ve gotten a response that’s been a bit less than I planned for.

    I do wonder as I build up an audience and this identity…will the expectations change?

     
  • Ian O'Byrne 1:15 pm on October 16, 2018 Permalink | Reply
     Tags: identity

    Friction-Free Racism 

     Friction-Free Racism — Real Life by Chris Gilliard (Real Life)

    Surveillance capitalism turns a profit by making people more comfortable with discrimination

    Chris Gilliard in Real Life Magazine. All annotations in context.

    Questions about the inclusivity of engineering and computer science departments have been going on for quite some time. Several current “innovations” coming out of these fields, many rooted in facial recognition, are indicative of how scientific racism has long been embedded in apparently neutral attempts to measure people — a “new” spin on age-old notions of phrenology and biological determinism, updated with digital capabilities.

    A need for diverse individuals in engineering, computer science, and STEM fields as these technological devices become ubiquitous in our lives.

    Only the most mundane uses of biometrics and facial recognition are concerned with only identifying a specific person, matching a name to a face or using a face to unlock a phone. Typically these systems are invested in taking the extra steps of assigning a subject to an identity category in terms of race, ethnicity, gender, sexuality, and matching those categories with guesses about emotions, intentions, relationships, and character to shore up forms of discrimination, both judicial and economic.

    Points about the use of technology as a means to identify and differentiate between groups, most specifically in terms of race.

    A key to Browne’s book is her detailed look at the way that black bodies have consistently been surveilled in America: The technologies change, but the process remains the same. Browne identifies contemporary practices like facial recognition as digital epidermalization: “the exercise of power cast by the disembodied gaze of certain surveillance technologies (for example, identity card and e-passport verification machines) that can be employed to do the work of alienating the subject by producing a ‘truth’ about the body and one’s identity (or identities) despite the subject’s claims.”

    More about coding difference and using this as a means to prescribe the same power structures and ideologies.

     Many current digital platforms proceed according to the same process of writing difference onto bodies through a process of data extraction and then using “code” to define who is what. Such acts of biometric determinism fit with what has been called surveillance capitalism, defined by Shoshana Zuboff as “the monetization of free behavioral data acquired through surveillance and sold to entities with interest in your future behavior.”

     

     In other words, race is deployed as an externally assigned category for purposes of commercial exploitation and social control, not part of self-generated identity for reasons of personal expression. The ability to define one’s self and tell one’s own stories is central to being human and how one relates to others; platforms’ ascribing identity through data undermines both.

    Having just finished White Fragility, this is at the top of my mind right now. The consideration of the systems involved in racism, and codification of these differences…while distancing people from the system so they don’t feel like they’re a part of it.

    At the same time racism and othering are rendered at the level of code, so certain users can feel innocent and not complicit in it.

    Adding algorithms to the model intensifies the problem as it doubles and triples down on user signals.

    Once products and, more important, people are coded as having certain preferences and tendencies, the feedback loops of algorithmic systems will work to reinforce these often flawed and discriminatory assumptions. The presupposed problem of difference will become even more entrenched, the chasms between people will widen.

    This is making me think about a recent piece in which our social media feeds were examined to consider the ways in which they reify the powerful by using algorithms to modify the feed.

    What would it look like to be constantly coded as different in a hyper-surveilled society — one where there was large-scale deployment of surveillant technologies with persistent “digital epidermalization” writing identity on to every body within the scope of its gaze?

     

     Proponents of persistent surveillance articulate some form of this question often and conclude that a more surveillant society is a safer one. My answer is quite different. We have seen on many occasions that more and better surveillance doesn’t equal more equitable or just outcomes, and often results in the discrimination being blamed on the algorithm. Further, these technological solutions can render the bias invisible.

    A powerful takeaway from Gilliard that will resonate with me for some time.

    The end game of a surveillance society, from the perspective of those being watched, is to be subjected to whims of black-boxed code extended to the navigation of spaces, which are systematically stripped of important social and cultural clues. The personalized surveillance tech, meanwhile, will not make people less racist; it will make them more comfortable and protected in their racism.

     
  • Ian O'Byrne 11:52 am on September 19, 2018 Permalink | Reply
     Tags: bullying, ident, identity

    On Death Threats and the Life I Lead… 

    On Death Threats and the Life I Lead… by Pernille Ripp (Pernille Ripp)

     It seems to be the price you pay to be public in a way, to being online. Nestled in between all of the learning, the connections, and the book recommendations is your daily slice of hatred.

    Pernille Ripp on recent death threats she has received on her blog.

     Please note…as Ripp does in her post…that it contains offensive language. This is hateful, threatening discourse from a commenter on her site.

    Pernille asks questions about what her response or concern should be in the matter.

     

     
  • Ian O'Byrne 5:28 pm on August 28, 2018 Permalink | Reply
     Tags: identity

    Representation and the internet with Franchesca Ramsey 

    Representation and the internet with Franchesca Ramsey
    In season three, episode five of the IRL podcast, Veronica Belmont and Franchesca Ramsey meet the people working to make the web — and the world — friendlier places for you, me and everyone else.

    The animated video shared above is a clip from this episode.

     

     