Digital Ethics: Preoccupied with whether or not they could, they didn’t stop to think if they should.

The title refers to a sentiment made famous by Ian Malcolm in Jurassic Park about the dangers of unleashing scientific marvels upon a world not ready for them. The line dramatizes a point about sought-after scientific wonders and the unintended consequences of their proliferation. This week's readings dealt with the ethics of our digital world and the problems society continues to reconcile.

The key concepts that I will be discussing are:

1. The alienation felt across different socio-economic and age strata

2. The unseen bias within the “algorithms” and Surveillance Capitalism

3. Building digitally ethical infrastructure

John B. Horrigan's Digital Readiness Gaps gets to the heart of a significant problem within the digital landscape: navigating a society with an ever-increasing digital dependency. When you examine the Pew Research Center data, the divide by age and affluence is striking.

Anecdotally, I relate to the lived consequences of the study. My mother, who lost her warehouse job of 50 years (before the pandemic), was a woman over 50 who was as analog as they came. When she started looking for jobs, she became aware that the world had changed since she last found one, which she had done simply through the personal reference of someone who worked there. She would call me on her smartphone (one that I set up for her) and ask me to register her on job sites. She simply had no ability to understand the digital process, nor was she inclined to learn. Eventually she obtained a job with a different company in the same warehouse where she had worked. Every single administrative procedure was automated through a digital service, and again I acted as a liaison between the company and my mother. This relationship now dominates our interactions. Trying to teach her how to use the apps and websites that handle the administrative side of her new job proved too difficult. Ultimately, I found myself navigating all of the digital hurdles while she happily continued her analog existence. In the end, my mother simply could not overcome the fear of pressing a button.

The surprising aspect of this example is the slowness with which it occurs. Currently, I find myself in the "cautious clickers" demographic: I look at technology not fearfully, but exhaustedly. As a member of one of the last generations to exist before the digital explosion, I find myself fluent in the fundamentals of now-antiquated digital technology. As I age, I grow hesitant and wary of new technology, and slowly a creeping dread of inadequacy and foreignness takes hold. The newest app or piece of tech becomes more inaccessible not just for lack of ability, but for lack of time, need, or patience to learn it. While there is a sense of inevitability to this, one wonders whether one can remain engaged with the ever-quickening pace of digital advancement.

Question Cluster 1: Do you think there is, in general, an age at which alienation from technology can no longer be overcome? Will this happen to the first generation born into a wholly digital age? Is access to digital literacy the answer to preventing the readiness gap from persisting in the future? Or is eventual alienation from technology an inevitable phase in the decline of human adaptability?

Another issue the readings engaged with was the murky transparency and ownership of ubiquitous digital technologies. In Google Search: Hyper-visibility as a Means of Rendering Black Women and Girls Invisible, Safiya Umoja Noble examines Google, and more closely the Google search engine, to discuss the hidden biases and normalizing effects of search engines. Google's search algorithm is the cornerstone of its digital empire. Because it is a highly guarded secret, it is difficult to understand how the algorithm arrives at its results, though those results do give insight into what others are searching for. Noble, to a highly effective degree, uses the example of the search term "Black girls" to show how the Google search engine demonstrates that "hegemonic discourses about the hypersexualized Black woman, which exist offline in traditional media, are instantiated online as evidenced by [the] discussion of search results in Google." The image displaying the highly sexualized results is alarming. The keyword search, run on September 18th, 2011, returned 140,000,000 results. For curiosity's sake, I searched the same term on April 23, 2021, and was pleasantly surprised at the change in theme on the results page. Another change was the number of results returned: within approximately ten years, an increase of roughly 2,664%, or nearly twenty-eight-fold, from the previous 140 million to a staggering 3.87 billion. Does this mean the algorithm has become less misogynistic? One can only hope, because, as the article describes, Google functions "as the dominant 'symbol system' of society due to its prominence as the most popular search engine to date, and through its market dominance." The problem with trying to answer the question about Google's algorithm is the lack of transparency associated with it. Its intrinsic economic value makes it difficult to gain access to its code and the infrastructure that surrounds it.
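For the curious, the arithmetic behind that growth figure can be checked in a few lines. This is only my own back-of-the-envelope calculation using the two result counts quoted above, not anything from the reading itself:

```python
# Sanity check on the growth in Google results for the search term "Black girls".
results_2011 = 140_000_000      # count reported in Noble's article, Sept. 18, 2011
results_2021 = 3_870_000_000    # count I observed on April 23, 2021

ratio = results_2021 / results_2011   # how many times larger the 2021 count is
growth_pct = (ratio - 1) * 100        # percentage growth over the 2011 baseline

print(f"{ratio:.1f}x as many results, i.e. {growth_pct:.0f}% growth")
```

Note the distinction the calculation makes explicit: the 2021 count is about 27.6 times the 2011 count, which corresponds to roughly 2,664% growth, since growth measures the increase over the baseline rather than the multiple itself.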

Question Cluster 2: Given the enormous impact that tech companies such as Google have on society, should their secrecy supplant our need for accountability? In other words, given that these search engines monetize their constant observation of the user, should their tactics be completely transparent? Is it possible to shift the large mechanisms already in motion within these capitalist structures? Would complete transparency be enough, or have these digital tools become too large to move away from?

The final issue centers on alternative digital spaces and infrastructure. In the SPARC Roadmap for Action, one of the most direct points made in the 32-page document is that "any movement towards true community control of infrastructure will require institutions to be willing to invest in digital infrastructure with the same commitment as they currently invest in physical infrastructure." Implicit in this statement is the understanding that the current infrastructure is tightly controlled and mirrors the inequities and biases of the society that creates these digital tools. Particularly harmful is the belief that algorithms are inherently objective and thus safe to use; moreover, the feedback loops that adjust an algorithm's behavior are often too fast to control. The Community Action portion of the document lays out practical steps for creating infrastructure alternatives to commercial solutions. Whether building from scratch or acquiring existing assets, the need for resources is vital. I wonder if this inherent need for capital makes it more challenging to produce alternatives. As so many times before, we are forced to depend on the charity of individuals, many of whom made their fortunes within the current infrastructure. It seems contradictory to expect someone who compiled their resources through the existing infrastructure to fund an alternative to it.

The SPARC document lays out the steps needed to generate an alternative infrastructure that embodies the egalitarian ideals the internet and the digital world were meant to usher in. We are living at an inflection point in history: not since the Industrial Revolution has society been asked to adopt a new way of life. Let us end by returning to the sentiment of the post's title. Are we going to marvel at our technological wonders without seriously considering whether we should continue down this road, one that improves material existence at the cost of magnifying and multiplying the very ills technology is meant to solve?

Question Cluster #3: I have suggested that individual philanthropy is a contradictory approach to funding alternatives. Does this mean we can rely only on the government and the public sector to create a community-controlled digital infrastructure? And is it possible to disregard the financial aspect of infrastructure indefinitely, given that it may conflict with communitarian ideals of the digital space?