The possibility that Cambridge Analytica engineered beliefs and votes through its use of Facebook data and sophisticated psychological profiling and personalization algorithms may be scary. But bear in mind that it is merely illustrative and symptomatic of our diseased techno-social environment.
Lately, some call for abstinence or bans, likening digital tech to tobacco, and others call for regulation and design interventions. Both analogies are partial and fail to reflect the systemic problems we collectively face. A better analogy is climate change. Digital networked technologies are re-engineering our planet, our social systems and our very selves.
Engineered dependence upon tech is a genuine problem. Digital technology companies want as much of your attention as possible, and they've developed sophisticated techniques for hijacking your mind. Many apps on a smartphone ration out carefully timed rewards with all the lights and whistles of a slot machine.
Meticulously engineered human-computer interfaces manage user experiences and trigger the same circuits and chemicals in your brain that typify addiction. As in the tobacco and fast-food industries, profit motives steer companies to set deep psychological hooks in their consumers and force society to confront its ongoing struggle to manage addiction.
Some, such as Anya Kamenetz, question whether addiction is an appropriate label and note the need for more scientific studies. I agree, but it's beside the point. There are many different economic, psychological and cultural factors that increase the extent to which consumers are dependent on digital tech, even if they're not clinically addicted.
In stark contrast with tobacco, fast food and gambling, society has strong incentives to encourage tech adoption and use. Smartphones, digital assistants, fitness trackers, gaming apps and social media are, after all, not pure vices. They significantly expand consumers' capabilities to communicate, produce and share knowledge, socialize and participate in a wide variety of socially productive activities. There are substantial upsides for individual users and society.
Along these lines, Kamenetz persuasively argues that phones are like automobiles; banning the tech outright or nudging folks to quit isn't viable. Better, she and others suggest, are design interventions (smartphone equivalents of seatbelts, airbags and anti-lock brakes) and public awareness campaigns that pressure tech companies to adopt less nefarious business models.
But given the breadth and depth of the problem, these good ideas will fall well short.
Tinkering at the margins with design solutions will not reshape the industry or the installed base of consumers who have already outsourced so much of their thinking and become incredibly dependent upon always-on devices. Just ask all of the Facebook users complaining (on Facebook and other social media) about Facebook's privacy policies and failure to police third-party use of their data.
Cambridge Analytica-inspired media attention notwithstanding, don't hold your breath for a mass exodus from Facebook or substantive changes in the platform's basic business model. Giving users more control, as the New York Times Editorial Board suggests Congress ought to do, or nudging them to adjust their privacy settings won't change very much either.
To diagnose and deal with a disease this large, we need to think at a different scale, just as understanding and responding to the 2008 financial crisis required recognition of the systemic failures and not just blaming designers of sliced-and-diced, high-risk, mortgage-backed securities.
After all, what we're really talking about in tech policy debates is the world we're building for ourselves, our children and future generations. To examine the interconnected, global, environmental and intergenerational considerations and at the same time relate those considerations to our everyday lives, the best metaphor is climate change.
Here's one way to understand climate change: We want energy. Energy is an essential input into so many of our modern activities. We can build different supply systems for energy, but the one we've relied on over the past century is heavily dependent upon burning fossil fuels.
It need not be. There are alternative sources of energy. But fossil fuels have been relatively cheap, convenient and politically supported for past and current generations. The massive external costs from burning fossil fuels are not felt by past or current generations; the costs are largely pushed onto future generations.
While some blame may be cast upon fossil fuel companies or others for actions that might be deemed equivalent to engineering addiction (or cover-ups, delaying, etc.), we all bear some of the responsibility for climate change, especially those of us in the United States and other developed countries who've consumed so much.
But keep in mind, our heavy dependence on fossil fuel consumption has been economically rational. We each make countless individual, incremental decisions that are cost-benefit justified on their own and advantaged by cheap and convenient fossil fuel consumption. It is a massive, global tragedy of the commons.
Dealing with climate change is politically and economically difficult because it requires significant structural changes, adjustments in how we live our lives and cultural and various other systemic adaptations.
As philosophy Prof. Evan Selinger and I argue in our new book, Re-Engineering Humanity, the digital networked environment suffers from a similar tragedy. We want, among other things, to connect, communicate, interact, transact and otherwise engage with each other nearly instantly and often without regard for geographic location. Digital networked technology, like energy, is an essential input into so many of our modern activities.
Every day, we each make various decisions about technology that seem, on their own terms, rational and unproblematic. We adopt technology and mindlessly bind ourselves to the terms and conditions offered. We follow scripts written and paths set by platform designers. We carry, wear and attach devices to ourselves and our children, maintaining a connection and increasing our dependence. We outsource thinking because, heck, there's always an app for that.
Each decision may be cost-benefit justified, yet the net effect on who we are and the lives we're capable of leading may be unjustifiable.
Nothing less than our humanity is at stake. We risk being engineered to behave like predictable and programmable people.
It's too easy to blame companies that treat us as programmable objects through hyper-personalized technologies attuned to our personal histories, present behaviors and feelings and predicted futures. They bear some responsibility, but so do all of us.
Frischmann is the Charles Widger Endowed University Professor in Law, Business and Economics at Villanova University and co-author of the upcoming book, Re-Engineering Humanity.