In this post, Peter Medeiros talks about how exploring his opinions of A.I. influenced and inspired “A Future Full of Glaciers,” published in our January/February issue.
By Peter Medeiros
When I set out to write “A Future Full of Glaciers,” it was partly a challenge to myself:
I wanted to write a story about A.I. that does not feature A.I.
This was for a couple of reasons:
First, there are other writers who have dealt with the big, philosophical implications of A.I. much better than I could. Ray Nayler’s The Mountain in the Sea does this, impressively, in what is almost a subplot. I wouldn’t top that in a short story.
Second, I find the idea that contemporary A.I. will result in anything like true sentience without an organic component to be a lot of hype and, frankly, dishonest marketing. So the machine-like intelligences in my story have a biological component, though this is fairly incidental to the tale.
Finally, from a craft perspective, I thought it was a bad idea to come at the subject head-on. The literary agent Donald Maass writes about this in his excellent The Emotional Craft of Fiction; naming something is very unlikely to elicit the feeling of that thing in your reader. Reporting that a character is terrified, no matter how well you describe their knees knocking or their balls shriveling up (Hello, early King!) or whatever, won’t make your reader scared, not by itself.
In speculative fiction, we have more latitude to come at something sideways. This likely goes without saying; the great writers of dystopian fiction are not merely providing warnings but reporting back to us what is already alive in our own world. Philip K. Dick said that every piece of A Scanner Darkly was something he himself observed, more or less, the science fiction scramble suits and Substance D notwithstanding.
It seemed to me that to say anything about A.I.—and I should let you know I have negative interest in the technology, but lots of interest in how it is used, which is to say I have an interest in people—I had to say it, like Dickinson advises, “slant.”
And it seemed like I had to say something, whether I wanted to or not.
When ChatGPT went live in 2022, suddenly people wanted to know my opinion of it, probably because I teach writing.
Because one of the subjects I teach is Research Writing, I tried my best to point people to sources and perspectives I admire. Ted Chiang’s 2024 article in The New Yorker is one. Another: the excellent book Feeding the Machine: The Hidden Human Labor Powering A.I. See also Joseph Fasano’s now-viral poem “For a Student Who Used AI to Write a Paper.” Perhaps also listen to Jimmy Eat World’s “Love Never,” which summarizes most of my thoughts.
I gave these recommendations because I didn’t want to catastrophize or pontificate. Nor did I want to get into a debate around nomenclature, which I would have done if I revealed my real thinking: “A.I.” simply isn’t artificial intelligence.
It’s only “A.I.” in the way that “hoverboards” are hoverboards and not one-wheel electric scooters—because technocrats strategically disrespect language as a way to sell products. Not even products; the idea of a product that might someday be.
Given that the main currency of crypto shills is bloviation and dream-stuff, it makes sense they would create a machine that might rob people of their ability to dream for themselves.
Confucius says: “The beginning of wisdom is to call things by their proper name.”
The opposite is also true: to give in to someone else’s improper naming is the beginning of bullshit, bamboozle, and grift.
One part of the grift is that scientists are being drawn away from any search for real, genuine “AI” by the Large Language Model bubble. There are really exciting things happening for those who actually study intelligence that could lead to great, useful advances in our understanding of ourselves and the natural world. Consider the 2024 “brain mapping” of the fruit fly. Real research into neuroscience that might lead to AGI! How cool! I wonder how many young scientists looking for a career researching such things are caught up—like fruit flies in a web?—by the growing number of LLM start-ups doomed to shutter when the bubble bursts.
Meanwhile, the current programs we call “AI” have led to teen suicides and an abundance of “revenge porn” (an odd euphemism for visual art being used to perform, basically, sexual assault). See also: breakdown in public trust with the rise of misinformation, new kinds of scams, the plagiarism intrinsic to most LLMs, etc.
Not to mention climate change. Researchers with Nature Sustainability report that the A.I. industry is looking to add 24–44 million metric tons of carbon dioxide to the atmosphere by 2030.
In November, the World Court ruled that the U.S. is morally and legally required to do what it can to curb climate change. Meanwhile, the Clean Air Act has been effectively gutted. Exxon is currently arguing against California’s Climate Accountability Package by saying—and this is real—that the First Amendment should protect them from any requirement to disclose their omnicidal greenhouse gas emissions.
The only way to prevent rolling climate catastrophe is to reduce emissions, yet purveyors of “A.I.” argue they should be allowed to increase emissions, and that they should be free from any and all oversight. This summer, the co-founder of Palantir Technologies, Peter Thiel, “hesitated” before he could commit to the idea that humanity should survive into the future at all. Some were shocked by this, but given recent revelations about how the technoaristocrats spend their time and money, it hardly seems surprising to me. I can’t say if the people peddling these so-called A.I. technologies are sociopathic criminals; I can only say they sure act and sound like they are.
You can probably tell: I got used to giving book and article recommendations because I didn’t want to open my mouth and let out my real, friendship-destroying, mood-killing opinions.
Opinions which could be summarized as, say, Butlerian Jihad.
But it was also a cop out.
So, to force myself to articulate my real thoughts, I had to write a story. (It’s how I think; thank you to the wonderful Malka Older for “narrative disorder.”)
I had to ask, “If I hate this stuff so much, what’s the opposite of A.I.?”
In speculative fiction, we have more latitude to come at something sideways.
There is a lake in Vermont where my wife and I go every year. It is part of her childhood, and now it is part of the life we have built together.
I find it easy to lose myself in work. One more email. One more paper to grade.
But the lake—its age, its size, the people on its shores—helps me slow down. It allows me to revel in gorgeous inefficiency. We each read a whole book by the lakeshore. I write in a real, physical journal. I don’t jog; I hike. I fucking stroll.
I have time to think of all the foolish things I did when I was young.
And I think of the many foolish things my students are forced to do by an educational system trying its best to slot them into neat little packages when they are anything but neat, and are wonderful for it. As I enter middle age, I find myself in possession of the kind of uncertain wisdom that can only be screamed and can therefore never be heard, not by any self-respecting young person who carries the prerequisite disdain for old-timers’ advice.
But I want to scream it all the same: In ten years, nobody will care about or remember your grades! Nobody will remember if you managed it all error-free! But you might just carry with you, in your bones, the pride of accomplishment. And when you spend time with something you love, give yourself over to it with obsession and struggle, only then will this broken, dying world feel like it loves you back.
Listen, I want to say. You need to swim.
A father I know told me, in one of those conversations, that he used an A.I. program to generate a D&D adventure for his son and his friends. He is a good man whom I admire, a buddy; we have a shared politics. He is much tougher and smarter than me. He is not Peter Thiel.
I played it cool, but as someone who loves role-playing games and the shared imagination spaces they create, I could have wept.
I could have pointed out how many free adventures are available online, written by passionate designers and given away out of a spirit of, well, adventure. Not to mention thousands of classic magazine-adventures converted to PDF and uploaded by brave archivists.
But what I wanted to say and could not was, “Someday, you will be dead. I will be dead. The time we save will not, ultimately, save us in return.”
There are no glaciers in the future.
Glacier National Park will lose its last glaciers by the end of this decade or the start of the next.
More pressingly, the authors of the Climate Risk Index estimate that “from 1995 to 2024, more than 832,000 lives were lost[…] driven by more than 9,700 extreme weather events,” which can be linked to the sort of climate apocalypse I must assume A.I.-proponents do not just tolerate but revel in. There seems little distance between a belief that people should spend less time living—by, for example, writing poetry or game adventures for their children—and a belief that there should simply be fewer people alive.
And many A.I. programs are designed specifically to maximize the number of human dead. Human Rights Watch reports that the Israel Defense Force uses a program called “Where’s Daddy?” to track strategic targets via their phones, to optimize drone and missile attacks for greater lethality. According to reports, this typically means identifying when a “target” has just arrived home.
Home, one presumes, to their family.
Jeff Bezos famously said he wants his workers to wake up scared.
The runaway psychopathy of the billionaire techno-feudalist class peddling A.I. and exacerbating climate change goes just one step further; they want us to wake up not at all.
But in my story, I wanted to write about the lake.
At the lake, I get sand in my socks.
My wife’s old family friends bring their children. The children are very loud and very funny. One of them finds the biggest caterpillar any of us have ever seen. Another breaks rocks to reveal geodes inside, then invites us to The Rock Museum.
I am a terrible swimmer. I cannot swim across the lake. But I can swim to the raft and lie on my back and feel the water evaporate from my skin, against all the advice of my dermatologist. I can look at the woman I love, who saved my one singular human life in ways I can never fully articulate or repay, and watch how her face is lit by the dazzling light on the water, as though the lake’s surface is not a reflection of the sun, the only reason we humans are alive at all, but the sunlight’s true source. As though there is something down there that cannot be found with a submarine.
If you go looking in the lake, actually, you can’t find it at all.
We can only find it in each other, and in time.
To me, this is the opposite of A.I.
Works Cited
Adil, Lina, et al. “Climate Risk Index 2026: Who Suffers Most from Extreme Weather Events?” Germanwatch, Nov. 2025, https://www.germanwatch.org/sites/default/files/2025-11/CRI%2026%20full%20report.pdf.
Chiang, Ted. “Why A.I. Isn’t Going to Make Art.” The New Yorker, 31 Aug. 2024, www.newyorker.com/culture/the-weekend-essay/why-ai-isnt-going-to-make-art.
Domonoske, Camila. “The EPA Proposes Gutting Its Greenhouse Gas Rules. Here’s What It Means for Cars and Pollution.” NPR, 29 July 2025, www.npr.org/2025/07/29/nx-s1-5463771/epa-greenhouse-gas-regulations-cars-pollution.
Douthat, Ross. “Opinion | Peter Thiel and the Antichrist.” The New York Times, 26 June 2025, www.nytimes.com/2025/06/26/opinion/peter-thiel-antichrist-ross-douthat.html.
Fasano, Joseph. “For a Student Who Used AI to Write a Paper.” Poets.org, https://poets.org/poem/student-who-used-ai-write-paper.
Fuller-Wright, Liz. “Mapping an Entire (Fly) Brain: A Step toward Understanding Diseases of the Human Brain.” Princeton University, 2024, www.princeton.edu/news/2024/10/02/mapping-entire-fly-brain-step-toward-understanding-diseases-human-brain.
Graham, Mark, et al. Feeding the Machine. Bloomsbury Publishing USA, 6 Aug. 2024.
Human Rights Watch. “Questions and Answers: Israeli Military’s Use of Digital Tools in Gaza.” Human Rights Watch, 10 Sept. 2024, www.hrw.org/news/2024/09/10/questions-and-answers-israeli-militarys-use-digital-tools-gaza.
Jimmy Eat World. “Love Never.” Jimmyeatworld.com, 2018.
NASA Earth Observatory. “World of Change: Ice Loss in Glacier National Park.” NASA Science, 14 June 2016, https://science.nasa.gov/earth/earth-observatory/world-of-change/glacier-national-park/.
Older, Malka. “Narrative Disorder.” Firesidefiction.com, 2017, https://firesidefiction.com/narrative-disorder.
Umoh, Ruth. “Why Jeff Bezos Wants Amazon Employees to ‘Wake up Every Morning Terrified.’” CNBC, 28 Aug. 2018, https://www.cnbc.com/2018/08/28/why-jeff-bezos-wants-amazon-employees-to-wake-up-terrified.html.
“World Court Rules U.S. Legally Obligated to Curb Climate Change.” Earthjustice, 28 Nov. 2025, https://earthjustice.org/brief/2025/world-court-rules-u-s-legally-obligated-to-curb-climate-change.
Xiao, Tianqi, et al. “Environmental Impact and Net-Zero Pathways for Sustainable Artificial Intelligence Servers in the USA.” Nature Sustainability, 10 Nov. 2025, https://doi.org/10.1038/s41893-025-01681-y.
Zraick, Karen. “Exxon Sues California over New Climate Disclosure Laws.” The New York Times, 25 Oct. 2025, www.nytimes.com/2025/10/25/climate/exxon-california-lawsuit-free-speech.html.