technoscience

All posts tagged technoscience

Reimagining “AI’s” Environmental and Sociotechnical Materialities

There’s a new open-access book of collected essays called Reimagining AI for Environmental Justice and Creativity, and I happen to have an essay in it. The collection is made of contributions from participants in the October 2024 “Reimagining AI for Environmental Justice and Creativity” panels and workshops put on by Jess Reia, MC Forelle, and Yingchong Wang, and I’ve included my essay here, for you. That said, I highly recommend checking out the rest of the book, because all the contributions are fantastic.

This work was co-sponsored by: The Karsh Institute Digital Technology for Democracy Lab, The Environmental Institute, and The School of Data Science, all at UVA. The videos for both days of the “Reimagining AI for Environmental Justice and Creativity” talks are now available, and you can find them at the Karsh Institute website, and also below, before the text of my essay.

All in all, I think these are some really great conversations on "AI" and environmental justice. They cover "AI"'s extremely material, practical aspects, the deeply philosophical aspects, and the necessary and fundamental connections between the two, and these are crucial discussions to be having, especially right now.

Hope you dig it.


Earlier this month I was honoured to have the opportunity to sit and talk with Douglas Rushkoff on his TEAM HUMAN podcast. If you know me at all, you know this isn't by any means the only team for which I play, or even the only way I think about the construction of our "teams," and that comes up in our conversation. We talk a great deal about algorithms, bias, machine consciousness, culture, values, language, and magick, and the ways in which the nature of our categories deeply affects how we treat each other, human and nonhuman alike. It was an absolutely fantastic time.

From the page:

In this episode, Williams and Rushkoff look at the embedded biases of technology and the values programmed into our mediated lives. How has a conception of technology as "objective" blurred our vision to the biases normalized within these systems? What ethical interrogation might we apply to such technology? And finally, how might alternative modes of thinking, such as magick, the occult, and the spiritual help us to bracket off these systems for pause and critical reflection? This conversation serves as a call to vigilance against runaway systems and the prejudices they amplify.

As I put it in the conversation: "Our best interests are at best incidental to [capitalist systems] because they will keep us alive long enough for us to buy more things from them." Following from that is the fact that we build algorithmic systems out of those capitalistic principles, and when you iterate out from there—considering all attendant inequalities of these systems on the merely human scale—we're in deep trouble, fast.

Check out the rest of this conversation to get a fuller understanding of how it all ties in with language and the occult. It’s a pretty great ride, and I hope you enjoy it.

Until Next Time.

I have a review of Ashley Shew’s Animal Constructions and Technological Knowledge, over at the Social Epistemology Research and Reply Collective: “Deleting the Human Clause.”

From the essay:

Animal Constructions and Technological Knowledge is Ashley Shew’s debut monograph and in it she argues that we need to reassess and possibly even drastically change the way in which we think about and classify the categories of technology, tool use, and construction behavior. Drawing from the fields of anthropology, animal studies, and philosophy of technology and engineering, Shew demonstrates that there are several assumptions made by researchers in all of these fields—assumptions about intelligence, intentionality, creativity and the capacity for novel behavior…

Shew says that we have consciously and unconsciously appended a "human clause" to all of our definitions of technology, tool use, and intelligence, and this clause's presumption—that it doesn't really "count" if humans aren't the ones doing it—is precisely what has to change.

I am a huge fan of this book and of Shew’s work, in general. Click through to find out a little more about why.

Until Next Time.