The Top 10 Technologies That Concern Millennials
Everyone wants to know what Millennials like when it comes to selling ideas, products, or political candidates. After all, they now constitute the largest group of citizens and voters.
But if you’re a corporation in the business of disrupting the status quo, say, or a pundit looking to stir up trouble, or a writer of dystopian movies and novels, what you should want to know is what kinds of things worry Millennials.
Surveys from the likes of Pew are useful in asking direct questions to see whether, for example, people are anxious about automation taking jobs, or willing to trust robots to drive cars or offer healthcare, etc. (Millennials, by the way, seem a little less anxious about all that than do Boomers.)
For a window on what worries Millennials about emerging technologies -- and, derivatively, the companies that create them -- check out the just-released annual update of the Top 10 Ethical Dilemmas from the University of Notre Dame’s Reilly Center for Science, Technology, and Values.
The Reilly Top 10 has always been about emerging tech. What’s different this year is that the process to create the list was, in large measure, driven by Notre Dame students; i.e., by the trailing edge of the Millennial boom. The list creators are eager to point out this is not “about scary technologies” per se, but an examination through the lens of ethical issues “to think about.” But, by definition then, this year’s Top 10 list offers a kind of Rorschach of Millennial anxieties.
So, herein a highly compressed summary of the Top 10 technologies “to think about.” Keep in mind that nothing on the list is notional; everything is already in use or in ‘beta’ test.
1. Ransomware
Most people, and likely all Millennials, keep personal, financial and professional information in computers. Ransomware -- malicious software ‘injected’ by a hacker -- can lock you out of, and even destroy, your files unless you pay a ransom, either with your credit card or in bitcoin. Surveys show that “59% of people paid the ransom out of their own pockets.” The Reilly team notes that you “don’t have to be a computer genius to launch an attack. The Dark Web currently has about 45,000 ads for ransomware for sale” in forms that are easy to use.
2. The Textalyzer
Police use breathalyzers to enforce laws against DWI. Now comes the Textalyzer to combat a rise in accidents from, call it, DWT -- driving while texting. After pulling you over (or out of a wreck), police can plug the Textalyzer into your phone to check whether you were Snapchatting, Tweeting, Facebooking, or Candy Crushing while driving. Hello Fourth Amendment?
3. Google Clip
The Reilly team seemed ready to vote this the “creepiest Christmas gift” of 2018. The “cute little hands-free camera” can be clipped onto clothes, or anything, to observe and record your world and experiences, continuously. It doesn’t run 24/7 (that would be uselessly difficult to store and navigate later). Instead the Clip uses artificial intelligence and facial recognition to “capture beautiful, spontaneous images” of your life as it “senses” when you are with people you know, “pets you love,” or in a beautiful or picturesque moment. Nuff said.
4. Emotion-Sensing Facial Recognition
New software can recognize your emotions in real time and deduce just how annoyed, happy, interested or angry you are when “you shop, and eat, and play video games.” It can run off the video feed from existing webcams, say in stores or public places, or from your phone’s camera. One imagines many positive uses (training, therapy, education), as well as some not so positive. For those who find it creepy when companies track what you buy or where you are, emotion sensing adds tracking of how you feel about what you’re seeing or doing.
5. Genome Apps
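To make the mechanism concrete, here is a minimal sketch of the aggregation step such systems rely on: a classifier emits per-frame emotion probabilities from the video feed, and a short window of frames is averaged into one dominant-emotion label. The emotion labels, the averaging window, and the function names are illustrative assumptions, not any vendor’s actual pipeline.

```python
# Toy sketch: reduce per-frame emotion probabilities (as a classifier
# might emit them from a webcam feed) to one dominant-emotion label.
# Labels and method are assumptions for illustration only.
from collections import defaultdict

def dominant_emotion(frame_probs):
    """frame_probs: list of {emotion: probability} dicts, one per frame.
    Returns the emotion with the highest average probability."""
    totals = defaultdict(float)
    for probs in frame_probs:
        for emotion, p in probs.items():
            totals[emotion] += p
    n = len(frame_probs)
    return max(totals, key=lambda e: totals[e] / n)

# A shopper who looks mostly happy across three frames:
frames = [{"happy": 0.7, "angry": 0.3},
          {"happy": 0.4, "angry": 0.6},
          {"happy": 0.8, "angry": 0.2}]
label = dominant_emotion(frames)  # -> "happy"
```

The point of the sketch is how little is needed once the raw feed exists: a retailer’s dashboard would see only the label, tied to your location and purchase history.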
As the Reilly team noted, “simply getting a copy of your genome is so 2008. Fast-forward a decade and now you can buy individual apps that will read a sequence and tell you everything you’ve always wanted to know” from what wines you like, to what diets or exercise work for you, or even what your children will look like. All that rather intimate data, and predictive guessing, resides in the Cloud.
6. Sentencing Software
Algorithms are used to predict what movies, groceries or clothing you might like to purchase. All of us have experience with the accuracy and efficacy of such algorithms. Now comes an already-in-use algorithm that predicts recidivism to help guide courts in post-trial sentencing. At least one state Supreme Court has already decided that a convicted citizen does not have the right to see inside that black box.
7. Memorial Chatbots
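A minimal sketch shows why such a tool is a “black box” in practice: the court and the defendant see only the final risk label, never the weights that produced it. The feature names, weights, and threshold below are invented for illustration; they do not reflect COMPAS or any real sentencing tool.

```python
# Toy sketch of an opaque risk score: a hidden linear model that
# exposes only a "high"/"low" label. All weights are invented.
_WEIGHTS = {"prior_offenses": 0.30,
            "age_at_first_arrest": -0.02,
            "employment_gap_months": 0.01}
_THRESHOLD = 1.0  # hidden cutoff between "low" and "high"

def risk_label(features):
    """Return only the label -- the score and weights stay hidden."""
    score = sum(_WEIGHTS[k] * features.get(k, 0) for k in _WEIGHTS)
    return "high" if score >= _THRESHOLD else "low"

# Two hypothetical defendants:
a = risk_label({"prior_offenses": 4, "age_at_first_arrest": 18})  # -> "low"
b = risk_label({"prior_offenses": 5, "age_at_first_arrest": 25,
                "employment_gap_months": 60})                     # -> "high"
```

Even in this ten-line toy, a defendant told only “high” has no way to know which feature, or which arbitrary weight, tipped the score -- which is precisely the due-process worry.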
Everyone -- at least every Millennial -- has a digital history contained in a sea of social media, chat and email activity. In theory, when a loved one dies, that history could be used to create a “sort of memorial chatbot designed to feel a lot like talking,” online, to the lost loved one. Apparently it’s no longer just theory: you can try one such app, called Replika, based on one particular person (so far). Meanwhile, a South Korean programmer created a similar app that animates a 3D avatar of your dead loved one.
8. “Citizen” App
This smartphone app helps citizens “stay safe and aware in areas wracked by crime” by linking your real-time location with continual monitoring of local police scanners, automatically alerting you to ongoing crimes nearby. The app also allows live-streaming video for “complete transparency of your neighborhood.” It's worth noting the app was first released under the name “Vigilante.” Again, nuff said.
9. Social Credit System
A report from China’s State Council proposes mandatory participation for every citizen (by 2020) in an online tracking and rating system to create a “social credit” score. By tracking all manner of activities, the goal is to create a real-time “score” for each individual in terms of “honesty in government affairs, commercial integrity, societal integrity, and judicial credibility.” Scores would be used, at a minimum it seems, for determining insurance or loan rates, civil service suitability, and the like.
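The mechanics of such a score are simple to sketch: per-category ratings weighted into a single number. The four category names come from the State Council report’s own language as quoted above; the equal weights and the 0–1000 scale are assumptions made purely for illustration.

```python
# Toy sketch of a "social credit" aggregate. Category names follow the
# State Council report's language; weights and scale are invented.
CATEGORIES = {"government_affairs": 0.25,
              "commercial_integrity": 0.25,
              "societal_integrity": 0.25,
              "judicial_credibility": 0.25}

def social_credit(scores):
    """scores: {category: rating in [0, 1]} -> integer score 0-1000.
    Missing categories count as zero -- opting out is penalized."""
    total = sum(CATEGORIES[c] * scores.get(c, 0.0) for c in CATEGORIES)
    return round(total * 1000)

# A hypothetical citizen with one weak category:
s = social_credit({"government_affairs": 0.8,
                   "commercial_integrity": 0.6,
                   "societal_integrity": 1.0,
                   "judicial_credibility": 0.6})  # -> 750
```

Note the design choice buried in `scores.get(c, 0.0)`: any category the system cannot observe drags the score down, which is one reason mandatory participation matters to such a scheme.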
10. Robot Priest
Students at Notre Dame of course could not resist adding this to the list. While a Protestant church in Germany created a robo-priest (to dispense blessings) as a stunt for this year’s 500th anniversary of the Reformation, Japan’s SoftBank Robotics appears to be producing a real line of robot Buddhist monks to “deliver funeral rites to the exploding elderly population in Japan.” Yikes.
The unambiguous thread in this list: the increasing power (and creepiness) of algorithms. And, as you read the Reilly team’s commentary about the ethical challenges, the other thread that glaringly jumps out is privacy, along with the potential for corporate or government misuse.
Worries about privacy and control define the new digital era. From the punditry to the policymakers, we see a rising chorus calling for action, of some kind, from someone. But what? And from whom? The government? The corporations?
The citizenry does not appear confident in either government or business to tackle anything, much less such thorny issues.
Privacy hacks, worries about fake news, questions about predatory practices and even collusion, and the fallout from social media side effects, from bullying to creepy ads, have generated a proliferation of recent stories about Silicon Valley’s “cultural ignorance.” A few recent headlines encapsulate today’s zeitgeist: “Silicon Valley Is Not Your Friend” (the New York Times); “Why the Public’s Love Affair With Silicon Valley Might Be Over” (Fast Company); and “Should America’s Tech Giants Be Broken Up?” (Business Week).
Meanwhile, public confidence in government is even lower. Consider the general tracking poll that finds barely 20 percent of citizens now say they “trust the government in Washington … just about all or most of the time,” compared to the 70 percent saying they did in 1960. Another long-running tracking poll asks voters if they think government is “pretty much run by a few big interests” versus “for the benefit of the people.” In 1960 that poll’s ratio was 50/50; today just 25 percent believe the latter.
How this plays out remains to be seen. Perhaps the emerging Millennial corporate and political leadership will find creative solutions. Either way, we do know one thing: Such technologies are coming, and soon. To paraphrase a memorable phrase from a recent presidential election debate: “George Orwell’s 1984 is calling, and wants its dystopia back.”