Lexcavator is an experimental-ish retro arcade/word game that I’ve been working on since last March, and it’s finally ready for prime time. I’m really excited for people to play! Download it here. (Pay what you want, even if you want to pay $0.)

Some notes about the game:

  • It’s programmed entirely in Python, using processing.py.
  • I use Markov chains to keep related sequences of letters adjacent on the game board. The goal is a faster-paced word game where it’s easy to find meaty words. (There’s a rough sketch of the idea just after this list.)
  • The global leaderboard is completely anonymous (I didn’t want to deal with user authentication), but it does a few things that not many other games do. First, you’re given a percentile rank for each score, which gives a clearer sense of how you’re doing relative to other players than a global high score or a raw ranking alone. Second, after each game, you get a list of the words you found that no other player had ever found before (example from @robdubbin).
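
To give a rough idea of how that might work (an illustrative sketch, not Lexcavator’s actual code; the word list, board height, and fallback letters here are made up), you can train a letter-bigram Markov model on a dictionary and then fill each board column by sampling letters that commonly follow one another:

    import random
    from collections import defaultdict

    # Illustrative sketch only -- not Lexcavator's actual code.
    # Train a letter-bigram Markov model on a word list, then build a
    # board column by sampling letters that commonly follow each other.

    def train_bigrams(words):
        """Count how often each letter follows each other letter."""
        counts = defaultdict(lambda: defaultdict(int))
        for word in words:
            for a, b in zip(word, word[1:]):
                counts[a][b] += 1
        return counts

    def next_letter(counts, prev):
        """Sample a letter weighted by how often it follows `prev`."""
        followers = counts.get(prev)
        if not followers:
            return random.choice("etaoinshrdlu")  # fall back to common letters
        letters = list(followers)
        weights = [followers[c] for c in letters]
        return random.choices(letters, weights=weights)[0]

    def fill_column(counts, height=8):
        """Build one column of the board as a chain of likely bigrams."""
        column = [random.choice("etaoinshrdlu")]
        while len(column) < height:
            column.append(next_letter(counts, column[-1]))
        return column

    if __name__ == "__main__":
        words = ["excavate", "lexicon", "nunnery", "nuptials", "nursemaid"]
        model = train_bigrams(words)
        print("".join(fill_column(model)))

The point of the weighting is that a column filled this way is much more likely to contain pronounceable runs of letters than one filled uniformly at random.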

For more announcements about Lexcavator, follow @lexcavator on Twitter.

The home-grown chiptune soundtrack is available from Bandcamp. I’ve embedded the title screen track below.

POETRONIX poster

POETRONIX

An evening of poetry, performance, and experimental text design from NYU/ITP’s Reading and Writing Electronic Text

Friday, May 4th, 2012
7pm
721 Broadway, New York, NY
Ground floor (Common room)
FREE

Over the course of Spring semester, eighteen NYU students have engaged in intense electro-textual experiments: composing, mangling, generating and remixing electronic text using the Python programming language. For one night only, these students will gather to present and perform their experiments to the general public.

What to expect: innovative poetic forms, bizarre textual interfaces, generative satire, advanced natural language processing techniques, and more!

Reading and Writing Electronic Text is a course offered at NYU’s Interactive Telecommunications Program (http://itp.nyu.edu/itp/). The course is an introduction to both the Python programming language and contemporary techniques in electronic literature. See the syllabus and examples of student work here: http://rwet.decontextualize.com/

Poster design by Inessah Selditz. (Download the full-size version here.)

cover for "introduction to tornado"

The book that I co-wrote with Mike Dory and Brendan Berg for O’Reilly has been released! It’s a gentle introduction to Tornado, a web application framework for Python. Unwittingly included in the book are several of my (previously unpublished) remixes of the last stanza of Frost’s “The Road Not Taken”—along with, of course, the code you need to make your own.  Order a copy from Amazon or directly from O’Reilly.
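
If you haven’t run into Tornado before, a minimal application looks roughly like this (the standard hello-world pattern; the port number is arbitrary, and the book builds this up into real applications):

    import tornado.ioloop
    import tornado.web

    class MainHandler(tornado.web.RequestHandler):
        def get(self):
            # Respond to GET / with a plain-text greeting.
            self.write("Hello, world")

    application = tornado.web.Application([
        (r"/", MainHandler),
    ])

    if __name__ == "__main__":
        application.listen(8888)  # serve on http://localhost:8888/
        tornado.ioloop.IOLoop.instance().start()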


I just put up a web page for Lexcavator, along with a new demo video, which you can watch below!

cover image for "codings"

A piece of mine, the Autonomous Parapoetic Device, will be in “Codings,” a show curated by Nick Montfort. It opens tomorrow at Pace Digital Gallery.

lexcavator screenshot

Lexcavator is going to be a part of Kill Screen’s GDC party on March 8th! I’m honored and excited to have a game on display alongside games from indie heavy-hitters like Zach Gage and Ramiro Corbetta. You can help fund the event and/or buy tickets here. I’m not going to be able to attend, but it looks like it’s going to be a lot of fun.

(Lexcavator is an action/puzzle/word game I’ve been working on for a while. I’m planning to get a web site for the game up this weekend; until then, you can read this nice little GSW write-up from last year.)

Here’s a round-up of the pieces performed at Hello Word! on May 6th, 2011. Many of the photos below were taken by Aaron Uhrmacher (see the full set here). The performance was the final project for students in Reading and Writing Electronic Text, a course I teach at ITP.

Aaron Uhrmacher and Chris Allick, Showetry. Aaron and Chris presented a number of exceptional pieces generated by juxtaposing lines recovered from phrase searches in YouTube subtitles. The documentation page has an audio recording of Aaron’s performance.


Alex Dodge, 新しい言葉 (New Words). Alex used a statistical model of a Japanese lexicon to generate new words that sound Japanese. He then assigned definitions to these words and created example sentences. Alex performed his words in a pedagogical frame, showing each on a separate PowerPoint slide. (My favorite word/definition is shown in the photo above.)


Alvin Chang, Hello World News. Alvin scraped data from Twitter and Wikipedia, tagged the words by part of speech, then squeezed them into a context-free grammar designed to mimic the structure of news stories. The pieces turned out great, and the titles that Alvin made up to go with the pieces are hilarious.
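
For a flavor of the technique (a toy illustration, not Alvin’s actual code; the grammar and vocabulary here are invented), a context-free grammar is just a set of rules mapping symbols to sequences of other symbols, expanded recursively until only words remain:

    import random

    # Toy context-free grammar for fake headlines. In the real piece the
    # terminal words would come from part-of-speech-tagged text scraped
    # from Twitter and Wikipedia, not from hand-picked lists like these.
    grammar = {
        "HEADLINE": [["NOUNPHRASE", "VERB", "NOUNPHRASE"]],
        "NOUNPHRASE": [["ADJ", "NOUN"], ["NOUN"]],
        "ADJ": [["embattled"], ["local"], ["anonymous"]],
        "NOUN": [["senator"], ["startup"], ["crowd"]],
        "VERB": [["denounces"], ["celebrates"], ["ignores"]],
    }

    def expand(symbol):
        """Recursively expand a symbol until only terminal words remain."""
        if symbol not in grammar:
            return [symbol]
        production = random.choice(grammar[symbol])
        words = []
        for part in production:
            words.extend(expand(part))
        return words

    if __name__ == "__main__":
        print(" ".join(expand("HEADLINE")))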


@everyword in context

Last week Adrian Chen conducted an e-mail interview with me about @everyword. Here’s the resulting article on Gawker. The @everyword account gained about a thousand new followers as a result of the article—not bad for an account that just tweets word after word every half hour!

It’s been interesting to read people’s reactions to @everyword (and yes, I have the Twitter search for @everyword in my RSS feed reader, because I am hopelessly narcissistic). For the most part, the reactions are positive! It’s satisfying when someone is amused by a word that they didn’t know existed (or that they hadn’t considered to be a “word”) or when someone finds unexpected synergy between a word that just got posted and something that is happening in their lives.

Some of the reactions are more critical. Here’s one reaction in particular that I wanted to respond to, from Twitter user @fran_b__:

@everyword They aren’t words unless they have meaning, which implies context. Stripped of context, they are simply (python) string arguments. (source)

This response baffled me, because in my mind @everyword is all about context. For example, here’s the way that I typically read @everyword:

This is a screenshot of my Twitter client on a typical morning. You can see the tweets from @everyword interleaved in the feed. I don’t generally read the tweets in my feed like I would paragraphs or sentences in an essay or a piece of fiction (e.g., I skip tweets, I don’t necessarily expect cohesion from one tweet to the next), but I do tend to read them in sequence. It’s undeniable that the tweets exist in the same physical context here. Because of this, some interesting possibilities for creative reading crop up. It’s easy for one tweet to “color” how nearby tweets are read, for example. I’m not saying that @notch is prone to nutations, or that @factoryfactory and @daphaknee are nutcases, but that’s certainly a reading made possible by the tweets’ close proximity.

There’s also the context provided merely by being in sequence with other words in the @everyword feed. Here’s an example:

(Screenshot: several consecutive @everyword tweets, including nunnery, nuns, nuptials, and nursemaid.)

I find this endlessly fascinating. When you see these words juxtaposed like this, you can’t help but try to find some connection between them. In some cases, the connection is grammatical (nunnery is of course morphologically related to the word nuns). But nuns, nuptials, and nursemaid together read almost like a little narrative: “Nuns can’t have nuptials, and they certainly can’t be nursemaids.” It seems ironic that the words would be juxtaposed like this, and that perception only emerges from seeing these words in this kind of unusual context.

It’s also a cultural practice of ours to consider individual words in the abstract: we pick out our favorite words, we decide which words are commonly misused, we decry our politicians for making up words or using words with a disagreeable frequency, etc. In some sense, a word carries with it a cultural context, no matter where it occurs. One of the intentions of @everyword was to play with this idea: every word has cultural baggage. What would happen if we systematically exposed ourselves to that baggage?

Even if I concede that the words in @everyword are “simply (python) string arguments,” isn’t that also a context? A computer program is a kind of writing, after all. It means something for a programmer to choose to put one string in a program, instead of some other string, or to feed some set of data to a program instead of some other set. Sure, the Python program that runs @everyword would also work with any other arbitrary data set—@everybaseballplayer, anyone?—but the fact that I chose words, and words in this particular order, is part of the context of the piece.
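
To make that concrete, here is roughly the shape such a program takes (an illustrative sketch, not the actual @everyword source; the file names and the stub post() function are made up): keep a list of items and a position, post the next item each time the script runs, and the same code will happily work through any other line-per-item data set.

    # Illustrative sketch of an @everyword-style bot -- not the actual
    # source. A real bot would replace post() with a Twitter API call
    # and run this script on a schedule (say, every half hour).

    def next_item(wordlist_path="words.txt", state_path="position.txt"):
        """Return the next unposted line from the list, advancing the cursor."""
        try:
            with open(state_path) as f:
                position = int(f.read().strip())
        except (OSError, ValueError):
            position = 0  # no state file yet: start at the beginning
        with open(wordlist_path) as f:
            items = [line.strip() for line in f if line.strip()]
        if position >= len(items):
            return None  # the list is finished
        with open(state_path, "w") as f:
            f.write(str(position + 1))
        return items[position]

    def post(text):
        """Stand-in for an actual Twitter API call."""
        print("posting:", text)

    if __name__ == "__main__":
        item = next_item()
        if item is not None:
            post(item)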

In the end, I think @fran_b__’s implication is that there are certain kinds of contexts that a word can occur in that “count” as meaningful (such as being in a sentence intentionally composed by an individual) and others that don’t. I suppose that for certain fields of study, this is a valid point of view: if you’re analyzing a novel, for example, you might not want to include in your analysis the novel sitting next to it on the shelf. As a writer and poet, however, I find that limitation pretty dull. There’s never been an era in history with such diverse practices for reading and writing text. Why not have as much fun with that as possible?


Hey all! I’ve started a new blog over at tumblr, called Big Writing. It’s a notebook for all of the interesting text-, language-, and poetics-related stuff I come across on the Internet. Here’s the mission statement. Go check it out, if you think that’s your thing. (I do still plan to post about my personal and professional projects here.)

Hello Word!
An evening of poetry, performance, and experimental text design from NYU/ITP’s Reading and Writing Electronic Text

Friday, May 6th, 2011
7pm
721 Broadway, New York, NY
Ground floor (Common room)
FREE

Over the course of Spring semester, sixteen NYU students have engaged in intense electro-textual experiments: composing, mangling, generating and remixing electronic text using the Python programming language. For one night only, these students will gather to present and perform their experiments to the general public.

Some examples of projects that may make an appearance at the event: movie dialogue remixed in real time; dynamic newspaper blackout poetry; an endless exquisite corpse from Twitter search results; infinite generative creation myths; and much more.

Reading and Writing Electronic Text is a course offered at NYU’s Interactive Telecommunications Program (http://itp.nyu.edu/itp/). The course is an introduction to both the Python programming language and contemporary techniques in electronic literature. See the syllabus and examples of student work here: http://rwet.decontextualize.com/

Poster design by Sofy Yuditskaya and Martin Bravo. Download a full-size version here.

