Long Story; Short Pier.

Generation AI.

There’s a thing that sweeps through writerly social media from time to time these days, where someone or other points to the latest outrage due to generative AI, and goes on to swear they’ve never used generative AI in their works, and by God they never will, and invites any and all other authors who are and feel likewise to likewise affirm in mentions and quote-tweets and, much as yr. correspondent, Luddite that I am, would happily join in—dogpiling notions and general actions is so much more satisfying than dogpiling individuals—well: I can’t. Because I have used generative AI in the creation of a small but not unimportant part of my work.

Sort of.

Twenty-three years ago, then: 10:47 UTC, on January 18th, a Friday: someone using the email address acosnasu@vygtafot.ac.uk made a post to the alt.sex.stories.d newsgroup. The subject line was, “Re: they are filling inside cold, over elder, near heavy ointments,” and here’s how it began:

One more plates will be glad wide jars. Other thin lean tickets will love hourly behind yogis. Georgette, have a rude poultice. You won’t change it. Well, Ronette never judges until Johnny attempts the poor goldsmith daily. As sneakily as John orders, you can look the unit much more angrily. There, cans talk beneath sweet markets, unless they’re bitter.

It continued in that vein for another 830-some-odd words—the output of a Markov chain: a randomly generated text where each word set down determines (mostly) the next word in the sequence. —Index a text, any text, a collection of short stories, a volume of plays, a sheaf of handwritten recipes, a year’s worth of newsletters, an archive of someone’s tweets or skeets or whatever we’re calling them these days, make an index, and, for each appearance of every word, note the word that appears next. Tot up those appearances, and then, when you want to generate a text, use the counts to weight your otherwise random choices. Bolt on some simple heuristics, to classify the words as to parts of speech, set up clause-shapes and sentence-shells, where to put the commas and the question-marks, then wind it up and turn it loose:

The sick walnut rarely pulls James, it opens Zachary instead. We converse the sticky egg. If the bad jackets can fill undoubtably, the younger film may cook more evenings. She might irrigate freely, unless Kathy behaves pumpkins to Marian’s game.
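The counting-and-weighting scheme described above, index the text, tally each word’s followers, use the tallies to weight the next random pick, is simple enough to fit in a few lines. Here’s a minimal sketch in Python (function names my own, and without the part-of-speech heuristics and sentence-shells the original generator bolted on):

```python
import random
from collections import defaultdict

def build_chain(text):
    """Index the source: for each word, count which words follow it."""
    words = text.split()
    chain = defaultdict(lambda: defaultdict(int))
    for prev, nxt in zip(words, words[1:]):
        chain[prev][nxt] += 1
    return chain

def generate(chain, start, length=30, seed=None):
    """Walk the chain, using the follow-counts to weight each random choice."""
    rng = random.Random(seed)
    word = start
    out = [word]
    for _ in range(length - 1):
        followers = chain.get(word)
        if not followers:
            # Dead end (the source's last word, say): jump to a random indexed word.
            word = rng.choice(list(chain))
        else:
            candidates = list(followers)
            weights = [followers[w] for w in candidates]
            word = rng.choices(candidates, weights=weights)[0]
        out.append(word)
    return " ".join(out)
```

Feed it any corpus and the flavor of the source seeps through in just the way described, with none of the machinery of meaning behind it.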

Now, comparing a Markov chain to ChatGPT is rather like comparing a paper plane to a 747 but, I mean, here’s the thing: they both do fly. Given the Markov’s simplicity, it’s much easier to see how any meaning glimpsed in the output is entirely pareidolic; there’s no there there but what we bring to the table—even so, it’s spooky, how the flavor of the source text nonetheless seeps through, to color that illusion of intent, simple as these chains are. (ELIZA was originally just 420 lines of MAD-SLIP code, and she’s pulled Turing wool for decades.) —Anyway: somewhere around about the summer of 2006, casting about for something to conjure a spooky, surreal, quasi-divinatory mood, I hit upon the copy I’d squirreled away of that post from four years previous and, after a bit of noodling, wrote this:

The offices are dim. The cubicle walls are chin-high, a dingy, nappy brown. Jo doesn’t look at the plaques by each opening. Warm light glows from the cubicle to the right. “No,” someone’s saying. “Shadow-time’s orthogonal to pseudo-time. Plates? They’re gonna be glad wide jars again. Yeah. The car under the stale light is a familiar answer, but don’t run to the stranger’s benison – there is nothing in the end but now, and now – ”

Now, I haven’t been able to identify what originating text might’ve been so enamored of “glad” and “jars” and “benison” and “ointments,” but it’s hardly as if it’s the sum total of everything ever shelved in the Library of Congress; it’s not at all as if anyone blew through more power than France to calculate those initial weights; generating that original post twenty-three years ago didn’t light up an array of beefy chips originally designed to serve up real-time 3D graphics in videogames, burning them hot enough to boil away a 16 oz. bottle of water in the time it takes to spit out a couple-few hundred words: but. But. If someone asks whether I’ve ever used generative AI in the creation of my work, I can’t in good conscience say no. —Heck, I even went and did it again, in a callback to that particular scene, though I don’t seem to have kept a copy of the originating post for that one. It’s everywhere out there, this prehistoric gray goo, this AI slop avant la lettre, if you know where to look; weirdly charming, in a creepily hauntological sense. All those meaning-shapes, evacuated of meaning.

But, well. See. That’s not all.

Ethical Lawyering & Generative AI.

Last year, for the day job, I took part in a panel discussion on “Ethical Lawyering and Generative AI.” We needed a slide deck to step through, as we explained to our jurisprudential audience the laity’s basics of this stuff that was only just then beginning to fabricate citations to cases that never existed, and a slide deck needs art, so I, well, I turned to whatever AI chatbot turducken Edge was cooking at the time (Copilot, which sits on ChatGPT, with an assist from DALL-E, I think, for the pictures)—I was curious, for one thing: I hadn’t messed around with anything like this for a half-dozen iterations, at least—back when you’d upload a picture and give it a prompt, “in the style of Van Gogh,” say, and a couple-five hours later get back an artfully distorted version of your original that, if you squinted generously in the right light, might be mistaken for something Van Gogh-adjacent. If I were to opine on this stuff, and advise, I really ought to have tinkered with it, first, and hey, I’d be doing something useful with the output of that tinkering. And but also, I wanted that slickly vapid, inanely bizarre æsthetic: smartly suited cyber-lawyers stood up by our bullet points, arguing in courtrooms of polished chancery wood and empty bright blue glass, before anonymously black-robed crash-test dummy-looking robots—we made a point of the fact that the art was AI-generated, pointing out inaccuracies and infelicities, the way it kept reaching for the averagest common denominators, the biases (whenever I asked for images of AI-enhanced lawyers, I got male figures; for AI-enhanced paralegals, female. When I asked for images of AI-enhanced public defenders? Three women and a man). It all served as something of an artful teaching moment. But: and most importantly: no artist was put out of a job, here. 
There was no budget for this deck but my own time, and if it wasn’t going to be AI-generated art, it was going to be whatever I could cobble together from royalty-free clip-art and my own typesetting skills.

I don’t say this as some attempt at expiation, or to provide my bona fides; I’m mostly providing context—an excuse, perhaps—for what I did next: I asked Copilot to generate some cover images for the epic.

—Not that I would ever actually begin to think about contemplating the possibility of maybe ever actually using something like that as an actual cover, dear God, no. I shoot my own covers, there’s a whole æsthetic worked out, making them is very much part of the process, I’d never look to outsource that. But generating the art for the deck had tweaked my curiosity: I get the basic idea of how it is that LLMs brute-force their generation of sloppy gobs of AI text, but I can’t for the life of me figure out how that model does what it does with images, with picture-stuff—the math just doesn’t math, I can’t get a handle on the quanta, it’s a complete mystery—and who isn’t tantalized by a mystery?

(I mean, set aside just for a moment the many and various ethical concerns, the extractive repurposing of art on a vastly unprecedented scale, without consent, the brutal exploitation of hidden human labor in reviewing and organizing and classifying the original sources, and reviewing and moderating and tweaking the output, the vast stores of capital poured into its development, warping it into a tool that consolidates money and power in hands that already have too much of both, the shocking leaps in energy consumption, the concomitant environmental degradation, the incredible inflation of our abilities to impersonate and to deceive—set all of that and more aside, I mean, it’s pretty cool, right? To just, like, get an image or four of whatever you want? Without bothering anybody?)

So I asked Copilot to generate some cover images for the epic:

Queen Dick in impractical armor.
I guess he’s the Ring of God?
“Ethnically American”?
Taking the prompt to show me a book cover rather literally.
Those aren’t all swords, are they?
Where did this come from?

Thus, the sick walnut, as it opens Zachary instead. Not terribly flattering, is it. I asked Copilot, I asked ChatGPT, I asked DALL-E to show me its take on my work, and this is the best it can bother to do?

That’s the premise of the promise these things make, after all, or rather the promise made on their behalf, by the hucksters and the barkers, the grifters and con artists sniffing around the aforementioned vast stores of capital: that there is a there, there; that what sits there knows everything it’s been shown, and understands whatever you tell it; that it can answer any questions you have, find anything you ask for, show you whatever you tell it you want to see, render up for you the very idea you have in your head—but every clause of every statement there is untrue.

This idea, that I have in my head, is actually a constellation of ideas worked out in some detail and at great length in a form that, by virtue of having been publicly available on the web (to say nothing of having been published as eminently seizable ebooks in numerous vulnerable outlets) has always already been part of any corpus sliced and diced and analyzed to make up the unfathomably multi-dimensional arrays of tokens and associations underlying every major LLM: thus, the sum total of any and all of its ability to “know” what I point to when I point. Those pictures, then—the cover images I asked it to, ah, generate: image-shapes and trope-strokes filed away in whatever pseudo-corner of those unthinkably multi-dimensional arrays that’s closest, notionally speaking, to the pseudo-spaces made up by and enclosing the tokens generated from my books, and arranged in what’s been algorithmically determined to be the most satisfying response to my request: thus, a bathetic golden hour steeped through skyscraping towers (some rather terribly gothic, a hallmark of the Portland skyline); an assortment of the sectional furniture of swords and sword-shapes and roses and birds; a centered, backlit protagonist-figure, all so very queenly (save the one king—Lymond? Really?)—the two more modern, or at least less cod-medieval, reach for a trick that was de rigueur for a while on the covers of UF books, and numerous videogames, where the protagonist is stood with their back to us, the better to inculcate, it was thought, a sense of identification, of immersion, and also in some cases yes at least to show off a tramp stamp. There’s something queasily akin to that murmurous reunion of archetypes noted by Eco, but these clichés aren’t dancing; they’re not talking, not to each other, certainly not to us; they’re not even waiting. Just—there, in their thereless there.

So. All a bit embarrassing, really—but, to embarrass, it must have power; that power, as is usually the case these days, is found in the bullshit of its premise. I asked for cover images, but that wasn’t what I wanted; what I wanted was, I wanted to see, just for a moment, what it might all look like to someone else, outside of my head—but without the vulnerability that comes from having to ask that someone else what it is they think. That promise—our AI can do that for you—that’s intoxicating. And if it had worked?

But it didn’t, is the thing. Instead of that validating glimpse, what I got was this, this content, this output of the meanest median mode, this spinner rack of romantasy and paranormal romance julienned into a mirepoix, tuned a bit to cheat the overall timbre toward something like Pantone’s color of whatever year—oh, but that metaphor’s appallingly mixed, even for me, and anyway, they don’t really do spinner racks anymore.

747s, paper planes, the thing is that ChatGPT, LLMs, generative AI, it’s all more of a flying elephant, really, to extend the simile, and most folks when they think about it at all seem to be of the opinion that it doesn’t matter so much if it can’t loop-the-loop, or barrel roll, look at it! It’s flying! Isn’t that wild?

Thing is, it can’t so much land, either.

It’s a neat parlor trick, generative AI; really fucking expensive, but kinda sorta pretty neat? And I’d never say you can’t use it to make art, good art—I’ve seen it done, with image generation; I’ve done it myself, in my own small way, with the free-range output of Markov chains. But there’s a, not to put too fine a point on it, a human element there, noticing, selecting, altering, iterating, curating, contextualizing—the there, that needs to be there, knowing, and seeing, and showing what’s been seen. And to compare these isolated examples, these occasional possibilities, with the broadband industrial-scale generation of AI gray-goo slop currently ongoing, is to compare finding and cleaning and polishing and setting on one’s desk a pretty rock from a stream, with mountain-top removal to strip-mine the Smokies for fool’s gold.

So, there you have it: why I’m not likely to ever ask ChatGPT as such for anything ever again; why I might still mess around with stuff like Markov chains. But entirely too much faffing to fit into a tweet. Are we still calling them tweets even if they aren’t on Twitter anymore? We should still call them tweets. One of the many tells of Elon Musk’s stupidity is walking away from a brand that strong, I mean, Jesus. Like renaming Kleenex®.

While units globally tease clouds, the tags often learn towards the pretty disks. We talk them, then we totally seek Jeff and Norm’s strong tape. Why will we nibble after Beryl climbs the inner camp’s poultice? We comb the dull pear. I was smelling to attack you some of my clever farmers. She may finally open sick and plays our tired, abysmal carpenters within a mountain.

—Filed 110 days ago to Indulgences.
