If you’re interested in perusing previous posts covering new words in futuring, you can see them all here. And stay tuned, because to celebrate my upcoming 1 year blog-iversary, I’m going to put them all together for easier access. Here we go…
Civic Hacking – this is a term that refers to the creative, dynamic and emergent practice of using data in unexpected ways to solve civic problems and/or challenges. Those involved come from a variety of formal and informal locations – but all are united in their desire to use data for good and to harness the dual powers of data technology and democratic community activism to make the world a better place. There is a great definition with some history here. There is actually a national “civic day of hacking” (it’s coming up September 21 this year).
Here’s a recent(ish) academic article about the phenomenon and practice.
This is totally fascinating. Cyborg anthropology is the study of how technology is impacting and changing human behavior. This brief TED talk by Amber Case is really interesting and asks a simple question: “Are we already cyborgs?”
Here’s a great site that has collected and defined Cyborg Anthropology and does a good job of organizing topics by various areas of interest. Here’s an additional article and book on the subject (I just ordered it – very intrigued). I’m guessing this is an area of practice that is going to continue growing.
This has really popped up quite a lot in recent media. A deep fake is a moving-image/film-like document that appears real but is, in fact, manufactured with great technical precision to fool the viewer. Because of our extraordinary talent base in movies and the technical aspects of creating special effects, many people are somewhat familiar with the idea that we can make anything look (somehow) like anything else. But concern has grown recently because of the use of these technologies outside of entertainment spaces – and, of particular worry, their emerging potential to be used in politically unstable situations to complicate and/or weaponize communications. This is part of a broader set of concerns about “disinformation campaigns and warfare” (see below). Here is a brief popular journalistic overview. Here are some more articles specific to political/national security concerns about the technology and its use. Lastly, here’s a TED talk to break it all down.
Information Warfare/Disinformation Campaigns
Back in December of 2018, I shared a term called “computational propaganda” (scroll down) in this ongoing vocabulary list project – related to the idea of a particular way of weaponizing false information internationally, with significant geopolitical implications.
As early as 1996, people watching the playing field were very much aware of the potential for “cybersecurity” and information warfare to become increasing challenges in the world ahead. This is kind of an interesting historical document that summarizes these ideas of that day. Here’s a more recent document that provides a historical overview of the U.S. military’s efforts to develop and guide security in this area. Simply defined, information warfare is when two or more parties use (mis)information as a weapon to divide each other and take political advantage in a conflict.
A related but distinct topic is that of “disinformation,” which is similar to but slightly different than propaganda. Disinformation is the catch-all term that describes how variations of information (sometimes variations of accuracy) are systematically deployed in a conflictual situation with the intention of confusing or misguiding people. Here’s a nice overview (and toolkit for fighting disinformation) developed in the UK.
Given our commitment to democratic political engagement – and given rising concern and activity to understand these concepts and to join the many around the world who are actively resisting/fighting disinformation (efforts often led by journalists) – this is an important issue for social workers to have foundational working knowledge about.
(Special note: I wish to underscore that I’m far from an expert on this topic, or the previous one on deepfakes, but I seek to provide some beginning definitions as I learn about them in this blog. Inclusion of information in these entries is not intended to imply endorsement of the content – rather, to simply amplify a variety of ways of looking at and understanding these issues so we can continue to learn and debate together.)
This is my new favorite thing. I recently ran across this model and haven’t been able to stop thinking about it – I found it most inspiring! What if everyday places where people normally spend time became explicit “community resilience hubs” to assure readiness for significant challenges particular to that region, and/or as the communities themselves determined? In fact, my guess is that resilience hubs are already everywhere, just sometimes unrecognized. In truth, so many of the answers to community challenges are best (and most likely) found close to home – a fundamental social work value. What if every social worker were a “community resilience hub booster”? This link provides a wonderful guidebook inviting communities to consider and experiment with this framework. The authors were inspired by hurricane recovery in Puerto Rico – but they want to boost this signal and expand it into all kinds of places where resilience is needed. Bottom line: I want to be in a future where this is happening more and more and more.
This article defines techlash as: (noun) The growing public animosity towards large Silicon Valley platform technology companies and their Chinese equivalents. It was a word of the year late last year…but I’ve only recently learned about it. It certainly fits a (continuing) international trend – and very real fears about the speed of change, the need for change, the motives for this acceleration and the well-being of all involved in the process. Certainly there have been growing calls for a more rapid expansion of explicit ethics in the tech world. I devoted a recent post to a “round up” of ethics articles and resources that provide a good foundation for social work (and beyond) to ground our thinking and help us navigate this complex matter. There is no way for us, as professionals, to simply “opt out” of this important conversation as if it doesn’t affect us. The truth is, tech influences and impacts everyone at this point in history. We have an ethical obligation both to be ethical innovators who advance the common good in our sphere of influence, and to interrogate/critique when harm is being done.