Tag Archives: computers

Hush! Caution! Echoland!

It’s a cliché to speak of Ulysses as an endpoint, the culmination of the nineteenth-century novel, finally bled dry of plot and incident and, to a certain extent, character (we only feel that we know Stephen and Bloom so well because we spend so much time with them) and erupting into a monstrous growth of period detail and stylistic parody. It’s not the last station on the line, but at least, unlike with Finnegans Wake, one can still pretend it’s something like a readable work of fiction.

But Ulysses was my gateway into mainstream literature: before that, excepting what I was forced to read for educational purposes, I’d only read science fiction and fantasy. Literature was too boring, just a bunch of normal people doing grown up stuff. Ulysses was different: the first handful of chapters were pretty slow and contained a great deal of matter relating to Thomas Aquinas which I let slide by in peaceful incomprehension. But once the newspaper headlines started in the seventh chapter it started to get fun, if not easier to understand.

So for me, it’s always felt like a starting point, not a conclusion. It’s not exactly a friendly introduction to the Western canon, but there are a lot of writers I first heard of, or was exposed to parodies of, under its influence. And its attitude of “hey, keep up with this if you can”, the sense you get of being complicit with someone taking everything they knew about every book they’d ever read for a dance, is exhilarating.

This post started out as another very short science fiction story, which is what I usually post here on Bloomsday, but it felt like it was getting into territory I’ve covered too often: a sort of dystopian scenario where after the Singularity, or some parody thereof, the AIs really do reconstruct Dublin from Ulysses, and put a bunch of human consciousnesses in it, and it’s terrible, like being trapped in a Bloomsday costume party for all eternity. I gave it up because, for one thing, I was unconsciously plagiarising part of a short story by Ian Watson from the 80s called, I think, “The Bloomsday Revolutions”. (I thought of Ian Watson for the first time in years the other day. He’s a good writer, look him up if you get the chance.)

The other reason I stopped was that I’m weary of science fiction being about computers and AI. I think that the real event underlying the Singularity is the collapse of the sf imagination into the computational. Too many of my own attempts to write longer pieces of fiction have gotten stuck or faltered for the same reasons.

I was remembering Jorn Barger, the guy who coined the word ‘blog’ and had a kind of internet celebrity which then dissolved into anti-Semitism and silence. Barger was an autodidact Joyce fan and had a site called “IQ Infinity”, the central thesis of which was that in Ulysses and Finnegans Wake, Joyce had solved AI, that through sheer brainpower he’d comprehended how the human mind worked. I don’t remember, if I ever really understood, exactly in what sense Barger thought that the works themselves constituted artificial intelligence: could one create the personality of “Leopold Bloom” from the text if it were somehow transformed into software? As another Joyce fanboy I can understand Barger’s reverent awe — without sharing it to that extent. And looked at dispassionately, it’s a ridiculous and self-infatuated idea: if only everyone else loved my favourite author as much as I do, they’d understand how consciousness works, too.

I was also thinking of Ted Nelson’s Xanadu project, the origin of the term ‘hypertext’: I’d known about it for decades, and based on what I’d read of his writing at various times, I thought of him as a crank, embittered by the success of the web. Xanadu proposes a much more complex way of linking and embedding documents within one another, with links that go both ways, and an elaborate system of building a top-level document from a variety of sources. Coming across this summary by a recent participant made me sense, in an obscure way, the allure of this vision of a global network of interpenetrating words. But in another way, it feels nightmarish.

In my mind, Xanadu’s “transclusion” is a codified and rigid version of the sort of association of ideas which the reading mind does in a flexible way all by itself. All writing depends on this, but it’s essential to a text like Ulysses, and even more so Finnegans Wake. Rather than narrating, “Stephen thought about Aquinas’ doctrines of sense perception as he walked along Sandymount”, Joyce interpolates “the ineluctable modality of the visible” and so on, all those weird terms I didn’t understand the first time, leaving it to the reader to either follow the echoes, if they are aware of the reference, or, if not, to fold the unusual texts into their own memory, to be echoed later or in other texts.
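The basic mechanics of transclusion can be put in a few lines of toy code (my own illustration, with made-up document names; the real Xanadu design, with its tumblers and permanent addresses, is far more elaborate): a document is a list of spans pointing into source texts rather than copies, and a backlink index makes the links go both ways.

```python
# Toy sketch of Xanadu-style transclusion. A composed document holds
# spans pointing into source texts rather than copies, and a backlink
# index records, for each source, who transcludes it.

sources = {
    "ulysses": "Ineluctable modality of the visible",
    "essay": "Stephen thought about Aquinas as he walked along Sandymount",
}

# (source name, start offset, end offset) -- spans, not copies:
# edit the source and every transclusion sees the change.
composed = [
    ("essay", 0, 29),     # "Stephen thought about Aquinas"
    ("ulysses", 0, 35),   # the phrase Joyce interpolates instead
]

def render(doc):
    """Assemble the top-level document from its transcluded spans."""
    return " / ".join(sources[name][start:end] for name, start, end in doc)

# Links go both ways: each source knows which documents quote it.
backlinks = {}
for name, start, end in composed:
    backlinks.setdefault(name, []).append("composed")

print(render(composed))
```

The point of the sketch is the queasy part: nothing is ever merely quoted, every borrowing is registered, and the glossary watches back.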

I love this process: to some extent, what I’ve just described is what being literate means to me. But I enjoy doing it with my own mind, or letting my own mind do it for me, and the thought of it being made explicit, with coloured markers joining the texts in different columns, makes me queasy, as do the very few working xanalogical demos.

I should add that sometimes just reading Joyce gives me the same feeling of vertigo. There’s a central image, or nightmare, behind these different incarnations, a cousin of Borges’ total library, the idea of mind as a sort of infinite glossary. It makes sense that my imagination, in trying to come up with a response to Ulysses as Bloomsday comes around each year, would return to the machines with which I work, and the fantasy that one day they’ll be able to read our favourite books so well that they’ll bring them to life.

I often get the same feeling reading blogs from the rationalist and AI risk communities. I suspect that these are not so much a school of philosophy as a literary genre, in which people with a very particular form of intelligence — discursive, articulate, fond of numerical arguments, insistent that any discipline can either be reduced to economics or physics, or is empty or misleading — imagine, with the same kind of self-infatuation, that magnified forms of this form of intelligence will either save or wreck the world. Earlier this year, I got so compulsive about reading this kind of thing that I had to use a site-blocking extension to stop myself.

I console myself with the idea that Joyce, had he lived in our era, would have been very bad (one imagines with glee his towering contempt and exasperation) at using computers.


Speculative Execution

Speculative execution is not exactly how thought works; it’s how you work without thinking about it. When philosophers talk about determinism versus free will, they treat the brain as if it were a black box with memory and sensory perceptions going in and actions coming out, with a clear sequence of causality from the first to the last. For the determinists, this is enough. For those who believe in free will, there’s an extra something special added at some point — the Soul, some kind of quantum magic going on in the synapses, whatever sort of swerve away from clockwork perfection seems convincing this decade — but it’s just another station on a linear progression.

Cognitive psychology and neuroscience undermine all of this because the brain doesn’t work like a black box. Without your noticing, it’s continually second-guessing and anticipating in all sorts of different ways. Your visual field is not the beautiful and transparent 360-degree spherical sensorium, God’s own VR headset, that you think it is: it’s a little dot of fine-detailed vision in constant motion with the gaps filled in by how your visual centres have come to assume that the world works. You anticipate what other people are about to say; your own words come tumbling out of your mouth without any conscious composition. The mind isn’t some Cartesian homunculus behind your eyes, marshalling inputs and emitting appropriate commands like some idealised 18th century lord. It’s a democratic and noisy playroom of independently-acting modules, all fighting for what little bandwidth your senses and memory afford them, and only too keen to proceed as far as they can on what guesses they can make.

And just as in CPUs, the goal of all this mess, this willingness to go out on a limb, is efficiency. Err on the side of caution if you think there’s a predator or, more realistically, the hostile or mocking attention of your peers; get distracted by anything which seems promising, an attractive person or an appetising aroma, because who knows that it might not be your last chance.
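The CPU side of the analogy fits in a toy sketch (my own illustration, nothing like real silicon, which does this in hardware across dozens of in-flight instructions): guess which way the branch will go, do the work early, and throw it away and redo it when the guess was wrong. A branch that behaves predictably costs almost nothing; one that keeps wrong-footing the predictor is pure wasted work.

```python
# Toy model of speculative execution: guess the branch outcome, do the
# work ahead of time, and discard and redo it on a mispredict.

class SpeculativeCPU:
    def __init__(self):
        self.predict_taken = True   # 1-bit predictor: remember the last outcome
        self.mispredicts = 0

    def run_branch(self, condition, taken_work, not_taken_work):
        guess = self.predict_taken
        result = taken_work() if guess else not_taken_work()
        if condition != guess:      # guessed wrong: discard and redo
            self.mispredicts += 1
            result = taken_work() if condition else not_taken_work()
        self.predict_taken = condition   # train the predictor
        return result

cpu = SpeculativeCPU()

# A perfectly predictable branch costs nothing...
for _ in range(100):
    cpu.run_branch(True, lambda: "fast path", lambda: "slow path")
predictable = cpu.mispredicts

# ...while an alternating branch defeats this predictor almost every time.
for i in range(100):
    cpu.run_branch(i % 2 == 0, lambda: "fast path", lambda: "slow path")
print(predictable, cpu.mispredicts - predictable)
```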

That’s the evolutionary story, and while we like to locate the life-and-death struggles behind the bundle of hacks we call consciousness in the savage prehistoric past, think of how much more we need to rely on speculative processing in the buzzing and blooming and overcrowded Umwelt we’ve built around ourselves. Sure, we might have evolved on the savannah, but all of these words and walls and works and wills and won’ts are what we’ve built to suit us, and they give our phantom selves such a lot of opportunity to run down the paths of might-have-been.

You’re about to change lanes and you map out the trajectory towards the exit ramp but: there’s someone coming up the inside. Backtrack. You’re indulging in a daydream fantasy about an attractive co-worker and then have to be polite and efficient with him for an hour-long team meeting. Backtrack. You’re following the plot of a movie and then what is he doing? Didn’t she get shot? Backtrack.

And this is just on a small scale. You marry young, anticipating decades of mutual happiness, only to have to unpick it all in a messy divorce in your early thirties. You choose a degree based on a school friend you hero-worshipped but get sidetracked out of it and have to explain it away for the next decade. A swarm of ghost lives, decisions and commitments and purchases and options which, if we’re lucky, we get to retrospectively make sense of, justify, tell ourselves it was destiny or fate, that it was what we were aiming for all along, what we really needed. But perhaps the truth, and it need not be an unkind one, is that a human life needs a sort of virtual scaffolding of possibilities, that the might-have-beens which we’ve unconsciously or consciously rejected are what hold us together.

Certain mental illnesses and mood disorders can be seen as a perversion of this tendency. Depression is the paralysis brought on by too keen an awareness of the sheer volume — number is too narrow a word — of possibilities exploding from every moment; anxiety is a failure of the shielding which lets our minds evaluate them without bothering us with the nagging sense that we are dancing over an abyss. In the manic phase of bipolar disorder there is a dimming of the red light and bell that clangs to signal that it’s time to backtrack, and impulses are followed through to their destructive last.

It doesn’t take very much paranoia to imagine that our brain’s talent for speculative execution could be an exploitable vulnerability. Maybe back in the days of the savannah — any predator will have a keen instinct for the false steps and feints of its prey — but now? The misdirection of the magician, the fortune teller’s cold read, the confidence of the con artist, sure in their ability to anticipate just how far down the garden path their marks will lead themselves. The manipulative and abusive, those who gaslight and interrogate, the grandstanding attorney and the demagogue: do they not take their victim’s or audience’s might-have-beens and magnify them into terrors or seductions? Facebook keeps a record of not only the posts you write, but those you cancel. The algorithms that watch us will have a better map of our shadow self than we will, seeing all the links we follow and then hurriedly click shut, the people we stalk, the products we dare not purchase.

Except that we know from a hundred ads which clumsily ape our ten most recent Google queries that the algorithms are not yet that subtle. The idea that our brains could be hacked by means as delicate as those which can be used to steal the ghosts of data from the might-have-beens of CPU caches is science fiction. And what is fiction, if not a way to coax an audience into the speculative execution of a series of thoughts, a shared illusion, a thing which could never be?
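That trick of stealing ghosts from the cache can be caricatured in a few lines (a deliberately unrealistic toy of my own; real Spectre-class attacks recover secrets by timing hardware cache accesses in nanoseconds, not by reading a Python set): the speculative work is discarded, but its footprint in the cache survives, and the footprint can be probed.

```python
# Toy caricature of a speculative-execution side channel: the victim's
# rolled-back computation still leaves a mark in a shared "cache", and
# the attacker recovers the secret by probing which entry is warm.

def victim(secret: int, cache: set) -> None:
    # Imagine this access happens speculatively and is architecturally
    # rolled back -- but the cache footprint survives the rollback.
    cache.add(secret)

def attacker(cache: set, candidates: range) -> int:
    # Probe every candidate value; the cached one is the "fast" access.
    for guess in candidates:
        if guess in cache:
            return guess
    return -1

shared_cache = set()
victim(42, shared_cache)
recovered = attacker(shared_cache, range(256))
print(recovered)
```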

Waiting for Broadband

I’m home from work this morning, waiting for a Telstra technician to come and activate the phone line in my new flat so that my ISP can switch my broadband over. It’s about six weeks since I got approved to move here, and almost four weeks since I moved in.

This has got me thinking a fair bit about Australian broadband, and about how people complain about it, and why I think they’ve been complaining about the wrong thing, and how it’s probably too late, but anyway:

The problem with Australian internet is not that it’s slow, it’s that it’s not a utility.

Australian complaints about the speed of the internet fall into two categories:

  • Robot surgeons on the Moon, and
  • I wanna pirate that rapey dragon show faster

Neither of these is a good argument for the NBN, because only a few places are going to need some kind of massive network connectivity to do gee-whiz immersive futuristic stuff that isn’t actually happening yet, and because complaining that you can’t watch Game of Thrones intersects with another big tedious argument, the one about copyright and how unfair it is that Australia doesn’t get everything at the same time as America. And if you wanted a show that would make broadband seem like a compelling mainstream issue, GoT is exactly the opposite of that show.

What we should have got out of the NBN, and what I would love right now, is for network access to become like power, water and gas: so we don’t have to layer it on top of an analogue voice network and deal with two levels of call centres. My flat is hard to identify in databases because it has a couple of different addresses, which is the main reason for the delay. I had this issue with the power, too, but the way I resolved that was to walk out to the backyard and write down the number on the meter.

I should have been able to do something that simple so that my kids could do a bunch of boring but necessary stuff, like read their email and access their school’s website, immediately after we moved in.

Anyone who is still talking about ‘futureproofing’ in this context needs to be put in a box and left out for the council cleanup. Our network infrastructure isn’t even presentproof.




I’m hopelessly addicted to Kai Krause’s Frax app. (More of mine here.)


As a reward for making it through the last three months, I got the family an iPad. When they were launched, I said “wow if I want one, and I never buy Apple stuff, then they are going to sell like hotcakes”: a dire prophecy which is now fulfilled. Like many computers, the iPad can be used to make stuff up, and thanks to its (STET damn you) clever autocorrection, it can also be iced to make stuff up.

The iPad is pointless and fun, like male nipples or alcohol.

The iPad keeps one young by providing a new thing which is hard to justify to one’s parents.

The iPad has an olfactory sniffer which detects youth molecules and sends them over the Internet to Steve Jobs’ doctor, who uses them to make serum. This is why children should not be allowed to play with an iPad for more than half an hour per day.

Contrary to popular belief, iPad can multitask. It can spend your money, correct your spelling and make you look like a wanker, all at the same time.

“iPad” is not a made-up word, but the Latin acronym IUSTUS PATERNUS AMORE DECIDUOSAM meaning “Dadda still loves you but he’s busy now.”