The OUTPUT Anthology is Out!

OUTPUT: An Anthology of Computer-Generated Text 1953–2023 in a hand model’s hands (or an AI facsimile thereof?)

I’m delighted that after more than four years of work by Lillian-Yvonne Bertram and myself — we’re co-editors of this book — the MIT Press and Counterpath have jointly published

Output: An Anthology of Computer-Generated Text, 1953–2023

This anthology spans seven decades of computer-generated text, beginning before the term “artificial intelligence” was even coined. While not restricted to poetry, fiction, and other creative projects, it reveals the rich work that has been done by artists, poets, and other sorts of writers who have taken computing and code into their own hands. The anthology includes examples of powerful and principled rhetorical generation along with story generation systems based on cognitive research. There are examples of “real news” generation that has already been informing us — along with hoaxes and humor.

Page spread from OUTPUT with Everest Pipkin’s i’ve never picked a protected flower

Page spread from OUTPUT with Talan Memmott’s Self Portrait(s) [as Other(s)]

Page spread from OUTPUT with thricedotted’s The Seeker

It’s all contextualized by brief introductions to each excerpt, longer introductions to each fine-grained genre of text generation, and an overall introduction that Lillian-Yvonne and I wrote. There are 200 selections in the 500-page book, which we hope will be a valuable sourcebook for academics and students — but also a way for general readers to learn about innovations in computing and writing.

You can buy Output now from several sources. I suggest your favorite independent bookseller! If you’re in the Boston area, stop by the MIT Press Bookstore, which had 21 copies on hand as of this writing and, as of actually publishing this post, has 14!

Book Launches

November 11 (Monday): Both editors will speak at the University of Virginia, Bryan Hall, Faculty Lounge, Floor 2. Free & open to the public. 5pm.

November 20 (Wednesday): Online book launch for Output, hosted by the University of Maryland. Both editors in conversation with Matt Kirschenbaum. Free, register on Zoom. 12noon Eastern Time.

November 21 (Thursday): Book launch at WordHack with me, David Gissen, Sasha Stiles, Andrew Yoon, and open mic presenters. $15 (purchasing a ticket online is recommended; WordHack has sold out the past several months). Wonderville, 1186 Broadway, Brooklyn, 7pm.

December 9 (Monday): Book launch at Book Club Bar with the editors, Charles Bernstein, Robin Hill, Stephanie Strickland, and Leonard Richardson. 197 E 3rd St (at Ave B), New York City’s East Village. Free, RSVP required. 8pm.

December 13 (Friday): European book launch with the editors, Scott Rettberg, and others TBA. University of Bergen’s Center for Digital Narrative, Langesgaten 1-2, 3:30pm. Free & open to the public.

Gram’s Fairy Tales: Manual & Grimoire

Taper, an online literary magazine published twice yearly, is now in its 12th issue. An independent editorial collective (Kyle Booten, Angela Chang, Kavi Duvvoori, Leonardo Flores, Helen Shewolfe Tseng, and Andy Wallace, for this issue) makes all the decisions about selections, themes for forthcoming issues, and so on, and also handles all communication with authors. They do all the work! Editors are allowed to submit works, in which case they recuse themselves from the collective’s discussion. I’m proud to be publisher of this magazine — although it’s really the editorial collective that makes it happen.

My “Gram’s Fairy Tales” was selected for this issue, with “Tools” as its theme!

A story grammar in a textarea (below) and a generated story about a heroic lily, a villainous droplet, and a chanterelle who helps the hero.

For those who haven’t visited Taper, when you do, you’ll find that the poems are computational and are no more than 2KB (2048 bytes) in size. (Issue #1 hosted 1KB poems and issue #3 had ones that were 3KB, but Goldilocks found the ideal constraint in 2KB.) Issue #12 features a lot of general-purpose programs that are true tools, textual or otherwise, as well as dynamic text generators that speak to the theme. In fact, the issue is our largest yet, with 32 selections.

To read what the author/programmers say about their work, in their statements, you generally have to “View Page Source” (or choose the similar option in your browser) and look at a comment at the top of each page. Each page is licensed as free (libre) software, so you can make use of it in any way you like. However, given the theme of this issue, I wrote “Gram’s Fairy Tales” to not only give access to the underlying HTML5, but also to let people write grammars that can be placed in a text area and used to generate stories. Given this, I’m going to pull a bit from my “author statement,” found in complete form in the comments, and post it here. I’ll also provide three novel story grammars written by others: Jhave Johnston (author of ReRites), Kyle Booten (author of Salon des Fantômes), and Kavi Duvvoori (author of Common Is That They).


Two Inspiring Systems, and Instructions

As I developed “Gram’s Fairy Tales,” I was thinking about two remarkable story generation systems, very different ones, both quite simple.

One is the mid-1980s Story Machine, which I’ve known about for a while in its Commodore 64 incarnation. It is a strangely animistic system in which any entity can perform any transitive or intransitive action. While there are multimedia elements and some notion of state, it essentially develops narratives one independent sentence at a time.

The other is a system by Joseph Grimes, operational in Mexico City in 1963. Only one text from this story generator is documented. It seems to have produced stories based on a story grammar, with formalist structures. So, a hero would be challenged in some way and overcome the challenge. The one-page magazine article about the system suggested that it could be called “Grimes’s Fairy Tales.”

“Gram’s Fairy Tales” uses a simple grammar to generate stories. There are special “hero” and “villain” tokens, which always refer to the same two entities, selected from the “entity” rule. You don’t have a pool of good guys and bad guys; any entities are equally likely to be heroes and villains.

Beyond those special tokens, the other nonterminal tokens (the lowercase ones) expand according to rules, with each represented as one line in the text field.

The “|” indicates alternatives; these are split apart first.

The “+” indicates a conjunction of elements; all of those will appear.

Tokens keep getting expanded until eventually there is a run of text, which in my example grammars is in all caps. These runs are “terminals,” and they end up being presented as is, without any further transformation.

Your grammar needs to begin with a “story” rule, and it should have “adj” and “entity” rules so a hero and villain can be determined. Anything else is up to you. Indeed, you don’t actually have to refer to a hero or villain.

My terminals are in all caps (making for an all-caps story) simply because it’s tricky to get the capitalization of sentences right. Handling that would make things harder for those who want to edit the grammar and create their own stories, and the grammar is already a bit tricky.

Spaces have to be included properly at the beginning and end of terminals, for instance, except for the terminals that come at the very end of stories.

Also, if you have entities or adjectives that begin with a vowel sound, you’ll end up with grammatical mistakes. The solution in this simple tool (or toy): use adjectives and entities that all begin with consonant sounds!

To get an idea of how to write your own grammar, you may want to start with an extremely simple one, such as:

story>hero+ FOUGHT +villain+.|hero+ TOOK A NAP.
adj>CLEVER|STRONG
entity>FOX|CHICKEN
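
If it helps to see the expansion process spelled out in a more conventional programming language, here is a minimal Python sketch of the same idea. It is not the JavaScript of “Gram’s Fairy Tales” itself, and the way it builds the hero and villain from the “adj” and “entity” rules, along with its handling of spacing, is my own guess at the behavior.

# Minimal sketch of a "Gram's"-style grammar expander in Python (not the
# actual 2KB JavaScript). Rules are one per line, "name>expansion"; "|"
# separates alternatives and "+" joins required parts. How the real tool
# assembles the hero and villain, and how it spaces text, are assumptions.
import random

EXAMPLE = """story>hero+ FOUGHT +villain+.|hero+ TOOK A NAP.
adj>CLEVER|STRONG
entity>FOX|CHICKEN"""

def parse(grammar_text):
    rules = {}
    for line in grammar_text.splitlines():
        if ">" in line:  # lines without ">" (titles, notes) are ignored
            name, body = line.split(">", 1)
            rules[name.strip()] = body.split("|")
    return rules

def expand(token, rules, fixed):
    if token in fixed:   # "hero" and "villain" always mean the same entities
        return fixed[token]
    if token in rules:   # nonterminal: pick one alternative, expand its parts
        parts = random.choice(rules[token]).split("+")
        return "".join(expand(part, rules, fixed) for part in parts)
    return token         # terminal: presented as is

def generate(grammar_text):
    rules = parse(grammar_text)
    fixed = {}
    if "adj" in rules and "entity" in rules:
        ents = random.sample(rules["entity"], 2)  # assumes two or more entities
        for role, ent in zip(("hero", "villain"), ents):
            fixed[role] = random.choice(rules["adj"]).strip() + " " + ent.strip()
    return expand("story", rules, fixed)

print(generate(EXAMPLE))

Run a few times on the tiny grammar above, this prints stories like “STRONG CHICKEN FOUGHT CLEVER FOX.” Since lines without a “>” are ignored, the longer grammars below can be pasted in as well, although the output won’t necessarily match what the real tool produces.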

Jhave’s Grammar: A Harmonious Moment

Here’s a striking project by Jhave Johnston, who is my colleague at the University of Bergen Center for Digital Narrative and winner of the Robert Coover Award. You’ll notice that Jhave wrote his grammar to enable proper capitalization, despite the difficulty of that. Although the right side of the text is hidden, you can select it all and paste it into “Gram’s Fairy Tales” to see how it works.

story>intro+conclusion
intro>In the heart of +emotion+, +entities+ +verb+ in +concept+|Abiding as  +emotion+, +entities+ +verb+ in +concept+|Before +emotion+, +entities+ +verb+ in +concept+|Imperturbable in +emotion+, +entities+ +verb+ in +concept+|Serendipitous as +emotion+, +entities+ +verb+ in +concept+|Surrounded by  +emotion+, +entities+ +verb+ in +concept+|Amidst +emotion+, +entities+ +verb+ in +concept+|Beyond +emotion+, +entities+ +verb+ in +concept+
emotion>serenity|peace|tranquility|joyousness|tenderness|compassion|love|radiance|integrity
entities>entities_nature+ and +entities_tech
entities_nature>brooks|forest|cascade|swamp|torrent|ravines|refugees|stars|orchards|plateaus|tendrils|alleyways|groves|mountains|rivers|fields|meadows|ridges|oceans|orchards|moons|blossoms|nebulae|subduction trenches
entities_tech>LEDs|torrents|media|villains|dollar-stores|ditches|pop-ups|server-farms|antennae|antagonists|satellites|philosophies|trinkets|video-walls|entrails|exiles|synapses|heroes|plastics
verb>merged|thrived|replenished|enjoyed|trembled|reveried|prospered
concept>precon+conce
precon>harmony, |wisdom's light, |exuberance, |cohesion, |unity, |solidarity, |changelessness, |insight, 
conce>seeking mutual actualization|observing mutual transcendence|flourishing with supple realization|witnessing collective discovery|imploding within perfect revelation|imploding within effortless actualization|observing ripe achievement|listening to shared success
conclusion>.

A custom grammar generates “Surrounded by peace, blossoms and antennae merged in solidarity, listening to shared success.”

Kyle Booten’s Grammar: How to Compose an Ode

Kyle is a longtime member of the Taper editorial collective — since issue #5 in fall 2020! — and has long been working to have computer text generation provoke his own (and others’) writing. His grammar produces instructions for writing odes of different sorts, and is the most elaborate by far, clocking in at 70 lines, with several that are essentially comments or blank. Again, you can select all of these and paste them into “Gram’s” without trouble, although some of the text on the right side is obscured.

Pindar Horace Hordar 
(An Ode-Gym)
Kyle Booten


story>HOW TO WRITE AN ODE: +ode_type
ode_type>pindaric|pindaric|pindaric|pindaric|horatian|horatian|horatian|horatian|blended



PINDARIC

pindaric> (TURN) +strophe+ (COUNTERTURN) +antistrophe+ (STAND) +epode
strophe>IN A STANZA OF +p_lines+ +short_or_long+ +meter1+ LINES, ADOPTING A +style1+ TENOR, DESCRIBE +hero+, WITH PARTICULAR ATTENTION TO THE +p_feature+ OF +hero+.
meter1>DACTYLLIC|DACTYLLIC|DACTYLLIC|QUASI-AEOLIC|LOGAOEDIC
meter2>meter_adjust+ +meter1|TROCHAIC|IAMBIC
meter_adjust>NOT|ANYTHING BUT|ONLY IMPERFECTLY
p_lines>6|8|8|10|12
short_or_long>BRIEF|BRIEF|BRIEF|LENGTHY|LENGTHY|LENGTHY|ABSURDLY LONG
p_feature>LAUREL|ORDER|VIRTUE|LAW|DECREE|STRENGTH|CRIME|FATE
style1>SOLEMN|MYSTICAL|GNOMIC|JAGGED|SUBLIME|MORALISTIC|GOVERNMENTAL|MUSCULAR|LAUDATORY|REVERENT|JUBILANT
antistrophe>USING THE SAME STANZAIC STRUCTURE BUT NOW +style_or_mentioning+, CONTRAPOSE +hero+ WITH ITS INVERSE AND ENEMY, +villain+.
style_or_mentioning>style2|MENTIONING +p_trait
style2>ZANY|PIXELATED|EUPHORIC|REVERENT|BLOODTHIRSTY|STERN|RECURSIVE|VITUPERATIVE|BRUTALIST|PARANOID|ANGUISHED
p_trait>THE STATE|A GOD|A WAR|A SPORT|A MORAL|A VIRTUE|A CONQUEST|A VICTORY|A NATURAL DISASTER
epode>NOW, WITH +subtract+ FEWER LINES, AND THESE +ep_style+, IMAGINE HOW +hero+ SHALL +do_to_villain+ +villain+.
ep_style>meter2|style1|style2|short_or_long|short_or_long+ AND +meter2|style1+ AND +style2|style2+ AND +meter2
subtract>2|2|2|3|3|3|4|4|5
do_to_villain>OVERCOME|OVERCOME|OVERCOME|REFORM|BANISH|DEMOTE|CENSURE|STEAL FROM|WED|WOUND|BECOME


HORATIAN

horatian>IN +stanzas+ STANZAS, EACH OF 4 LINES, DESCRIBE +hero+, +gradient+.|IN +stanzas+ STANZAS, EACH OF 4 LINES, DESCRIBE +hero+, +gradient+. +extra+.
gradient>PROGRESSING GENTLY FROM+ +concept_statement|PROGRESSING GENTLY FROM+ +concept_statement+ AND SIMULTANEOUSLY FROM +concept_statement2
concept_statement>concept+ TO +concept2
concept>THE +cognizing+ OF THE +abstraction+ OF +object|THE +cognizing2+ OF THE +abstraction2+ OF +object
concept2>THE +cognizing2+ OF THE +abstraction+ OF +object|THE +cognizing+ OF THE +abstraction2+ OF +object2
concept_statement2>GREEN TO BLUE|BLUE TO GREEN|WEALTH TO POVERTY|MORNING TO NIGHT|RAIN TO SLEEP|YOUTH TO OLD AGE|FALL TO SPRING|WINTER TO SUMMER
abstraction>PAIN|SADNESS|LOSS|HUMOR|FRAILTY|YEARNING
abstraction2>LOSS|GROWTH|MERCY|HISTORY|TIMELESSNESS|MORTALITY
cognizing>AWARENESS|FORGIVENESS|HATRED|NON-AWARENESS|APPRECIATION|MISUNDERSTANDING|SYMPATHY
cognizing2>ACCEPTANCE|PERCEPTION|REGRETTING|UNDERSTANDING
object>hero|hero|hero|hero|LIFE IN GENERAL
object2>hero|hero|hero|hero|villain|LIFE IN GENERAL
stanzas>3|4|4|6|6|8|10
extra>REMARK IN THE +count_line+ LINE UPON +extra_object|prosody
extra_object>THE +h_feature+ OF +hero|WINE|WINE|WINE|A MORSEL|A MORSEL|A FRIEND
h_feature>FLAVOR|SIZE|TEXTURE|FAMILY|POSITION|TEMPERATURE|ECHO|INTERIOR|FORM|TEMPERAMENT|HONOR|HUE
count_line>1ST|rel_num|rel_num+ TO LAST
prosody>MIMIC +poet+'S METER
poet>SAPPHO|ALCAEUS
rel_num>2ND|3RD|4TH


BLENDED

blended>bl1|bl2|bl3|bl4|bl5|bl6
bl1>[STANZA 1: +pindaric_bit+] [STANZA 2: +pindaric_bit+] [STANZA 3: +horatian_bit+]
bl2>[STANZA 1: +pindaric_bit+] [STANZA 2: +horatian_bit+] [STANZA 3: +pindaric_bit+]
bl3>[STANZA 1: +horatian_bit+] [STANZA 2: +pindaric_bit+] [STANZA 3: +pindaric_bit+]
bl4>[STANZA 1: +pindaric_bit+] [STANZA 2: +horatian_bit+] [STANZA 3: +horatian_bit+]
bl5>[STANZA 1: +horatian_bit+] [STANZA 2: +horatian_bit+] [STANZA 3: +pindaric_bit+]
bl6>[STANZA 1: +horatian_bit+] [STANZA 2: +pindaric_bit+] [STANZA 3: +horatian_bit+]
pindaric_bit>meter1|meter2|short_or_long|style1|style1|style_or_mentioning|strophe|ep_style
horatian_bit>gradient|gradient|extra|extra_object|cognizing|cognizing2|concept_statement|concept|concept2|concept_statement2|prosody


adj>VINTAGE|COMMON|CIVIC|BROKEN|HONEYED|PAINTED|SACRED|HUNGRY|VERNAL|FLEETING
entity>VASE|FRIEND|DOVE|DEER|FRUIT|LAKE|HOUSE|GAME|TUNE|STORM|GLEN|KNOT|VEIL|WRIT

Kavi Duvvoori’s Most Grammatical Grammar

Kavi joined the editorial collective of Taper for this issue (#12) and has really put their shoulder to the wheel! They (like Jhave & Kyle) have been producing serious computer-generated texts and adjacent projects, of course with particular twists. Their grammar uses story elements (hero, villain) and does tell a story — but one about grammars.

story>theory+.|theory+. +coda+.
theory>A +academic+ +critiqued+ +hero+, +noting+ +villain+ IS +really+ A +metaphor+|+villain+ WAS +endorsed+ BY A +academic+ WITH +hero+ USED AS A +vessel+ TO CATCH THE +metaphor|+villain+ IS ITSELF +really+ +surface+, ESPECIALLY +hero+|+hero+ BY A +academic+ IS +villain+ THAT DESCRIBES, FINALLY, THE +metaphor+
coda>+villain+ +ofthis+ IS THE +metaphor+ PUT IN A +vessel+ BY +hero+reversal+|+villain+ +ofthis+ IS +hero+reversal+
ofthis>OF THIS STORY|IN OUR TALE|FOR OUR PURPOSES HERE
reversal> (OR DO I HAVE IT BACKWARDS?)| (WAS IT REALLY THAT +coda+?)|
response>+critiqued+ THE +problem+ IN|+endorsed+
critiqued>FOUND A GAP IN|DISPUTED|ELABORATED TO APORIA
endorsed>REINFORCED|ELABORATED ON|SHOWN TO NOT POSSESS THE ALLEGED +problem+
academic>linguist|+specialty+ +linguist+|SOCIOLOGIST OF THE +academic+
specialty>COMPUTATIONAL|CRITICAL|CREATIVE|HISTORICAL
linguist>POET|LINGUIST|PHILOLOGIST|LITERARY THEORIST
problem>LACK OF RIGOUR|IDEALISM|CONTRADICTIONS|PROBLEMS
noting>NOTING THAT|SHOWING HOW|FOR
really>REALLY|IN FACT|IN TRUTH
metaphor>image+ ON +surface|churn+ +metaphor
image>SHADOW|REFLECTION|PICTURE
vessel>NET|SIEVE|DISPLAY CASE
churn>SHIFTING|FLOWING|CHURNING
surface>A BUILDING'S WALL|A PAGE|SKIN|+churn+ +flow
flow>CURRENTS|WATER|LEAVES|WIND|MUD
adj>CHOMSKYAN|PANINIAN|MOSTLY ADELE GOLDBERG-INSPIRED|PRIMARILY IRENE HEIM-DERIVED|MONTAGUE|LAKOFFIAN|CHARLES SANDERS PIERCE-ROOTED
entity>GRAMMAR|MODEL|LOGIC|FORMALISM

“A FORMALISM IS ITSELF IN TRUTH A PAGE, ESPECIALLY A LAKOFFIAN MODEL.” Plus a grammar.

WordHack Book Table

This Thursday, May 21, 2020, at 7pm Eastern Time is another great WordHack!

A regular event at Babycastles here in New York City, this WordHack will be fully assumed into cyberspace, hosted as usual by Todd Anderson but this time with two featured readings (and open mic/open mouse) viewable on Twitch. Yes, this is the link to the Thursday May 21, 2020 WordHack!

There are pages for this event up on Facebook and withfriends.

I’m especially enthusiastic about this one because the two featured readers will be sharing their new, compelling, and extraordinary books of computer-generated poetry. This page is a virtual “book table” linking to where you can buy these books (published by two nonprofit presses) from their nonprofit distributor.

Lillian-Yvonne Bertram will present Travesty Generator, just published by Noemi Press. The publisher’s page for Travesty Generator has more information about how, as Cathy Park Hong describes, “Bertram uses open-source coding to generate haunting inquiring elegies to Trayvon Martin, and Eric Garner, and Emmett Till” and how the book represents “taking the baton from Harryette Mullen and the Oulipians and dashing with it to late 21st century black futurity.”

Jörg Piringer will present Data Poetry, just published in his own English translation/recreation by Counterpath. The publisher’s page for Data Poetry offers more on how, as Allison Parrish describes it, Jörg’s book is a “wunderkammer of computational poetics” that “not only showcases his thrilling technical virtuosity, but also demonstrates a canny sensitivity to the material of language: how it looks, sounds, behaves and makes us feel.”

Don’t Venmo me! Buy Travesty Generator from Small Press Distribution ($18) and buy Data Poetry from Small Press Distribution ($25).

SPD is well equipped to send books to individuals, in addition to supplying them to bookstores. Purchasing a book helps SPD, the only nonprofit book distributor in the US. It also gives a larger share to the nonprofit publishers (Noemi Press and Counterpath) than if you were to get these books from, for instance, a megacorporation.

Because IRL independent bookstores are closed during the pandemic, SPD, although still operating, is suffering. You can also support SPD directly by donating.

I also suggest buying other books directly from SPD. Here are several that are likely to interest WordHack participants, blatantly including several of my own. The * indicates an author who has been a featured presenter at WordHack/Babycastles; the books next to those asterisks happen to all be computer-generated, too:

Thanks to those who want to dig into these books as avid readers, and thanks to everyone able to support nonprofit arts organizations such as Babycastles, Small Press Distribution, Noemi Press, and Counterpath.

Concise Computational Literature is Now Online in Taper

I’m pleased to announce the release of the first issue of Taper, along with the call for works for issue #2.

Taper is a DIY literary magazine that hosts very short computational literary works — in the first issue, sonic, visual, animated, and generated poetry that is no more than 1KB, excluding comments and the standard header that all pages share. In the second issue, this constraint will be relaxed to 2KB.
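
For a rough sense of how such a constraint might be checked, here is a sketch; it is not the editorial collective’s actual procedure, and it does not account for the shared header, only for stripping comments before counting bytes.

# Rough sketch of checking a submission's size: strip HTML comments, then
# count the remaining bytes. Taper's actual rules also exclude the standard
# shared header, which this sketch does not handle.
import re, sys
LIMIT = 1024  # 1KB for issue #1; 2048 bytes from issue #2 onward
def counted_bytes(html):
    without_comments = re.sub(r"<!--.*?-->", "", html, flags=re.DOTALL)
    return len(without_comments.encode("utf-8"))
if __name__ == "__main__":
    size = counted_bytes(open(sys.argv[1], encoding="utf-8").read())
    print(size, "bytes:", "within the limit" if size <= LIMIT else "over the limit")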

The first issue has nine poems by six authors, which were selected by an editorial collective of four. Here is how this work looked when showcased today at our exhibit in the Trope Tank:

“Weights and Measures” and “for the pool players at the Golden Shovel,” Lillian-Yvonne Bertram
“193” and “ArcMaze,” Sebastian Bartlett
“Alpha Riddims,” Pierre Tchetgen and “Rise,” Angela Chang
“US” and “Field,” Nick Montfort
“God,” Milton Läufer

This issue is tiny in size and contains only a small number of projects, but we think they are of very high quality and interestingly diverse. This first issue of Taper also lays the groundwork for fairly easy production of future issues.

The next issue will have two new editorial collective members, but not me, as I focus on my role as publisher of this magazine through my very small press, Bad Quarto.

Using Electricity readings, with video of one

I’m writing now from the middle of a four-city book tour which I’m on with Rafael Pérez y Pérez and Allison Parrish – we are the first three author/programmers to develop books (The Truelist, Mexica, and Articulations) in this Counterpath series, Using Electricity.

I’m taking the time now to post a link to video of a short reading that Allison and I did at the MLA Convention, from exactly a month ago. If you can’t join us at an upcoming reading (MIT Press Bookstore, 2018-02-06 6pm or Babycastles in NYC, 2018-02-07 7pm) and have 10 minutes, the video provides an introduction to two of the three projects.

Rafael wasn’t able to join us then; we are very glad he’s here from Mexico City with us this week, and has read with us in Philadelphia and Providence so far!

My @party Talk on Computer-Generated Books

I just gave a talk at the local demoparty, @party. While I haven’t written out notes and it wasn’t recorded, here are the slides. The talk was “Book Productions: The Latest in Computer-Generated Literary Art,” and included some discussion of how computer-generated literary books related to demoscene productions.

Trope Tank Writer in Residence

The Trope Tank is accepting applications for a writer in residence during academic year 2016-2017.

The Trope Tank, 3 August 2016

Our mission is developing new poetic practices and new understandings of digital media by focusing on the material, formal, and historical aspects of computation and language. More can be discovered about the Trope Tank here:

http://nickm.com/trope_tank/

The main projects of the Trope Tank for 2016-2017 are Renderings and Heftings, as I’ve described for a forthcoming article in _Convolutions 4_:

> The **Renderings** project is an effort to locate computational
> literature in languages other than English — poetry and other
> text generators, combinatorial poems, interactive fiction, and
> interactive visual poetry, for example — and translate this work
> to English. Along the way, it is necessary to port some of this
> work to the Web, or emulate it, or re-implement it, both in
> the source language and in English. This provides the original
> language community better access to a functioning version
> of the original work, some of which originates in computer
> magazines from several decades ago, some of which is from
> even earlier. The translations give the English-language
> community some perspective on the global creative work that has
> been undertaken with language and computation, helping
> to remedy the typical view of this area, which is almost always
> strongly English-centered.

> **Heftings,** on the other hand, is not about translation into
> English; the project is able to include translation between any
> pair of languages (along with the translation of work that is
> originally multilingual). Nor does it focus on digital and computational
> work. Instead, Heftings is about “impossible translation” of all
> sorts — for instance, of minimal, highly constrained,
> densely allusive, and concrete/visual poems. The idea is that
> even if the translation of such works is impossible, attempts at
> translation, made while working collaboratively and in conversation
> with others, can lead to insights. The Heftings project
> seeks to encourage translation attempts, many such attempts
> per source text, and to facilitate discussion of these. There is no
> concept that one of these attempts will be determined to be the
> best and will be settled upon as the right answer to the question
> of translation.

The Trope Tank’s work goes beyond these main projects. It includes developing creative projects, individually and collaboratively; teaching about computing, videogaming, and the material history of the text in formal and informal ways; and research into related areas. Those in the Trope Tank have also curated and produced exhibits and brought some of the lab’s resources to the public at other venues. The lab hosts monthly meetings of the People’s Republic of Interactive Fiction and occasional workshops.

There are no fees or costs associated with the residency; there is also no stipend or other financial support provided as part of the appointment. A writer in residence has 24-hour access to and use of the Trope Tank, including space to work, power and network connection, and use of materials and equipment. As a member of the MIT community, a writer in residence can access the campus and check out books from the MIT Libraries. We encourage our writer in residence to attend research and creative discussions and join us in project work and other collaborations, but this is not expressed with a particular requirement to be in the Trope Tank some amount of time per week.

To apply, email me, Nick Montfort, at moc.mkcin@mkcin with short answers (in no case to exceed 250 words each) to the following questions:

– What work have you done that relates to computation, language and literature, and the mission of the lab? Include URLs when appropriate; there is no need to include the URLs when counting words.

– How would you make use of your time in the Trope Tank? You do not have to offer a detailed outline of a particular project, but explain in some way how it would be useful to you to have access to the materials, equipment, and people here.

– What is your relationship, if any, to literary translation, and do you see yourself contributing to Renderings, Heftings, or both? If so, how?

– What connections could you potentially make between communities of practice and other groups you know, either in the Boston area or beyond, and the existing Trope Tank community within MIT?

Include a CV/resume in PDF format as an attachment.

Applications will be considered beginning on August 15; applicants are encouraged to apply by noon on that day.

We value diverse backgrounds, experiences, and thinking, and encourage applications by members of groups that are underrepresented at MIT.

NaNoGenMo 2014: A Look Back & Back

There were so many excellent novel generators, and generated novels, last month for NaNoGenMo (National Novel Generation Month).

I thought a lot of them related to and carried on the work of wonderful existing literary projects — usually in the form of existing books. And this is in no way a backhanded compliment. My own NaNoGenMo entry was the most rooted in an existing novel; I simply computationally re-implemented Samuel Beckett’s novel Watt (or at least the parts of it that were most illegible and computational) in my novel generator Megawatt (its PDF output is also available). For good measure, Megawatt is completely deterministic; although someone might choose to modify it and generate different things, as it stands it generates exactly one novel. So, for me to say that I was reminded of a great book when I saw a particular generator is pure praise.

Early in the month, Liza Daly’s Seraphs set a high standard and must have discouraged many offhand generators! Liza’s generator seeks images and randomizes text to produce a lengthy book that is like the Voynich Manuscript, and certainly also like the Codex Seraphinianus.

Allison Parrish’s I Waded in Clear Water is a novel based on dream interpretations. Of course, it reminds me of 10,000 Dreams Interpreted (and I am pleased, thanks to my students from long ago, to have the leading site on the Web for that famous book) but it also reminds me of footnote-heavy novels such as Infinite Jest. Let me note that a Twine game has already been written based on this work: Fowl are Foul, by Jacqueline Lott.

I found Zarkonnen’s Moebius Tentacle; Or the Space-Octopus oddly compelling. It was created by simple substitution of strings from Moby-Dick (one novel it clearly reminded me of), freeing the story to be about the pursuit of an octopus by space amazons. It wasn’t as polished as I would have liked (just a text file for output), and didn’t render text flawlessly, but still, the result was amazing. Consider how the near-final text presents the (transformed) Tashtego in his final tumult:

A sky-hawk that
tauntingly had followed the main-truck downwards from its unnatural home
among the stars, pecking at the flag, and incommoding Lazerbot-9 there;
this spacebat now chanced to intercept its broad fluttering wing between the
hammer and the plasteel; and simultaneously feeling that etherial thrill,
the submerged robot beneath, in her death-gasp, kept her hammer frozen
there; and so the spacebat of heaven, with archangelic shrieks, and her
imperial beak thrust upwards, and her whole captive form folded in the
flag of Vixena, went away with her spaceship, which, like Satan, would not sink
to transwarp till she had dragged a living part of heaven along with her, and
helmeted herself with it.

Sean Barrett wrote two beautiful generators (at least) – the first of which was How Hannah Solved The Twelve-Disk Tower of Hanoi. Deliberate, progressing, intelligent, and keeping the reader on the edge of her seat – this one is great. But, that generator (drafted by November 9) wasn’t enough, and Barrett also contributed (only a day late) The Basketball Game, an opera generator that provides a score (with lyrics) and MIDI files. It’s as if “I got Philip Glass!” indicates that one is rebounding.

Eric Stayton’s I Sing Of takes the beginning of the Aeneid as grist, moving through alternate invocations using WordNet. I like the way different epics are invoked by the slight changes, and was reminded of Calvino’s If on a winter’s night a traveler.

Sam Coppini’s D’ksuban Dictionary, although also just a text file, is a simple but effective generator of a fictional language’s dictionary. Less like the Devil’s Dictionary, more like the (apparently unpublished) lexicon of Earth: Final Conflict. I’m sure literary works in D’ksuban will be forthcoming soon.

Ben Kybartas’s Something, Somewhere is wonderfully spare and evocative – more Madsen than Hemingway.

Finally, Thricedotted’s The Seeker is an extraordinary concrete novel in the tradition of Raymond Federman’s Double or Nothing. The text, based on wikiHow, is good and serves well to define a protagonist who always wishes to do right, but the typographical framework is really excellent.

These are just a few comments before NaNoGenMo goes as stale as a late-December pumpkin. I hope you enjoy this work and other work that was done last month, and that you keep an eye peeled for further novel generators – next November and throughout the year.

A “Trope Report” on Stickers

Not literally on stickers, no. This technical report from the Trope Tank is “Stickers as a Literature-Distribution Platform,” and is by Piotr Marecki. It’s just been released as TROPE-14-02 and is very likely to be the last report of 2014. Here’s the abstract:

Contemporary experimental writing often directs its attention to its writing space, its
medium, the material on which it is presented. Very often this medium is meaningful
and becomes part of the work – the printed text transferred to another media context
(for instance, into a traditional book) would become incomprehensible. Literature
distributed on stickers is a form of writing that is divided into small fragments of texts
(a type of constrained writing), physically scattered in different locations. One of the
newest challenges in literature is books with augmented reality, AR, which examine
the relation between the physical (the medium) and the virtual interaction. Sticker
literature is a rather simple analog form of augmented reality literature. The stickers
have QR codes or web addresses printed on them, so the viewer who reads/sees a
random sticker in the public space can further explore the text online. The viewer can
read other parts of the text on photographs (the photograph being another medium) of
other stickers placed in different locations. The author will discuss the use of stickers
throughout literary history, beginning with 20th century French Situationists, through
different textual strategies applied by visual artists and ending with literary forms such
as the sticker novel Implementation (2004) by Nick Montfort and Scott Rettberg or
Stoberskiade (2013). The author shall try to explain why writers decide to use this form,
how the text is distributed and received and how the city space is used in such projects.

Megawatt

The fruits of my National Novel Generation Month (NaNoGenMo) labors are now online; the Megawatt generator is available as a single 350-line Python file, while the novel it deterministically generates can be obtained as a PDF, megawatt.pdf or in epub format, megawatt.epub. From the program’s docstring and from the preface to the book:

Megawatt is the title of both a computer program, the source code
to which you may be reading, and the output of this program, which in
many ways is like a standard novel and which you may instead be reading.
This note appears at the beginning of both.

The program Megawatt is based on passages from Samuel Beckett’s novel Watt, first published in 1953 but written much earlier, when Beckett was aiding the French Resistance during World War II.

The novel Megawatt leaves aside all of the more intelligible language of Beckett’s novel and is based, instead, on that which is most systematic and inscrutable. It does not just recreate these passages, although with minor changes the Megawatt code can be used to do so. In the new novel, rather, they are intensified by generating, using the same methods that Beckett used, significantly more text than is found in the already excessive Watt.

(Please note: The following information is handy if you want to, for instance, modify the program and generate a PDF or epub yourself. You don’t need to do this to read the novel. You can download it in PDF: megawatt.pdf or in epub format: megawatt.epub.)

To produce the novel in markdown format, run megawatt.py (a Python 2
program) with TextBlob (a text processing library) installed.

% python megawatt.py > megawatt.text

To produce PDF and epub documents, use pandoc:

% pandoc -V geometry:paperwidth=5.5in \
-V geometry:paperheight=8.25in \
-V geometry:margin=.7in -o megawatt.pdf \
megawatt.text
% echo '% Megawatt' > info.txt
% echo '% Nick Montfort' >> info.txt
% pandoc -o megawatt.epub info.txt megawatt.text

Megawatt was written/generated for the second NaNoGenMo (National
Novel Generation Month) in November 2014, and is free software.
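
To give a sense of the sort of systematic prose at issue, here is a toy Python illustration, emphatically not the Megawatt code itself: it simply runs through every ordering of a few items, in the spirit of Watt’s exhaustive passages about the furniture of Watt’s room.

# Toy illustration, not the actual Megawatt code: exhaustively permute a few
# items, in the spirit of Watt's systematic, permutative passages.
from itertools import permutations
items = ["the tallboy", "the dressing table", "the night stool", "the washstand"]
places = ["by the fire", "by the bed", "by the door", "by the window"]
sentences = []
for arrangement in permutations(places):
    clauses = [item + " was " + place for item, place in zip(items, arrangement)]
    sentences.append("On one day " + ", and ".join(clauses) + ".")
print(" ".join(sentences[:3]))  # 24 such sentences in all; print a few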

World Clock Punches in on The Verge

Some kind comments about World Clock and NaNoGenMo in the article “The Strange World of Computer-Generated Novels” by Josh Dzieza.

Nick Montfort’s World Clock was the breakout hit of last year. A poet and professor of digital media at MIT, Montfort used 165 lines of Python code to arrange a new sequence of characters, locations, and actions for each minute in a day. He gave readings, and the book was later printed by the Harvard Book Store’s press. Still, Kazemi says reading an entire generated novel is more a feat of endurance than a testament to the quality of the story, which tends to be choppy, flat, or incoherent by the standards of human writing.

“Even Nick expects you to maybe read a chapter of it or flip to a random page,” Kazemi says.

There were many great generated novels last year, and there are already many great ones this year. Among this abundance, I don’t think World Clock is a very good poster boy for NaNoGenMo. Still, my experience with the book does make a strong case for having your generated novel translated into (or originally written in) Polish.
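
For readers curious what a minute-by-minute generator looks like in miniature, here is a rough Python sketch. It is not the actual World Clock code, which is 165 lines and far more careful with language; the places, characters, and actions below are placeholder lists of my own.

# Rough sketch of a minute-by-minute generator: one short passage for each
# of the 1440 minutes in a day. Not the actual World Clock code; the lists
# here are placeholders of my own invention.
import random
places = ["Samarkand", "Lima", "Reykjavik", "Dakar"]
characters = ["a man", "a woman", "a child", "an old sailor"]
actions = ["wakes and listens to the rain", "closes a worn notebook",
           "looks out over the rooftops", "reads a single sentence again"]
random.seed(0)  # seeded, so this sketch prints the same text every run
for minute in range(1440):
    hour, mins = divmod(minute, 60)
    print(f"It is now exactly {hour:02d}:{mins:02d} in {random.choice(places)}. "
          f"In some quiet room {random.choice(characters)} {random.choice(actions)}.")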

NaNoGenMo 3000!!!!

Er, sorry. I exaggerated a bit. It’s actually just
NaNoGenMo 2014. But that’s still really cool.

“Spend the month of November writing code that generates a novel of 50k+ words.” As is traditional, the event occurs on GitHub.

Zegar światowy, the Polish World Clock

World Clock in Polish, displayed

World Clock (book, code) has now been published in Polish. The translation is by Piotr Marecki, who translated the underlying novel-generating program and generated a new novel in Polish. ha!art is the publisher, and the book appears in the Liberatura series, which also includes some very distinguished titles: The Polish translations of Finnegans Wake and of Perec’s Life A User’s Manual, for instance.

The Polish World Clock on the shelf

Updated 10:01 — Time Still Ticks

Lance Olsen and Tim Guthrie have updated their classic and palindromically-titled electronic literature work, 10:01.

This piece was included in the first Electronic Literature Collection and the first edition can still be seen there. Since it offers links out to the Web, and some of these became stale since the piece was first published in 2005, the prolific and edgy experimental writer Olsen and developer Guthrie have revised the piece for the Web for 2014, also reworking a few other elements. One is still able to select among movie patrons to read their perspectives. The piece is a companion to the print-novel version of 10:01, published by Chiasmus in 2005.

I cannot explain how apropos it is that I blog about this after returning from an AMC theater.

The audience is listening! Check it out.

Senderos que se … Intimate, Infinite

Intimate, Infinite by Robert Yang

Robert Yang’s latest is a first-person-shooter version of Jorge Luis Borges’s “The Garden of Forking Paths.” With a lovingly off-kilter translation (befitting its “original”) and with visuals and (quite minimal) interaction that suits the experience, this is an extraordinary set of linked mini-games, well worth the short amount of time it takes to get through them, and worth offering at least a bit for this pay-what-you-will game.

Check out Intimate, Infinite.

Intimate, Infinite by Robert Yang

New Report on Nanowatt & World Clock

The latest technical report (or “Trope Report”) to issue from the Trope Tank is TROPE-14-01, “New Novel Machines: Nanowatt and World Clock” by Nick Montfort:

>My Winchester’s Nightmare: A Novel Machine (1999) was developed to bring the interactor’s input and the system’s output together into a texture like that of novelistic prose. Almost fifteen years later, after an electronic literature practice mainly related to poetry, I have developed two new “novel machines.” Rather than being works of interactive fiction, one (Nanowatt, 2013) is a collaborative demoscene production (specifically, a single-loading VIC-20 demo) and the other (World Clock, 2013) is a novel generator with accompanying printed book. These two productions offer an opportunity to discuss how my own and other highly computational electronic literature relates to the novel. Nanowatt and World Clock are non-interactive but use computation to manipulate language at low levels. I discuss these aspects and other recent electronic literature that engages the novel, considering to what extent novel-like computational literature in general is becoming less interactive and more fine-grained in its involvement with language.

This was the topic of my talk at the recent ELO conference. Share and enjoy!