Wind in a Jar



The “Files” Aren’t In The “Computer”

Self-styled futurists are always bandying phantasms about for the amusement of the public, and with the recent spike in all things AI and MedTech, of course you get a fresh crop of reductionist immortality articles. Somebody enjoyed Altered Carbon on Netflix, and now they want us to believe you can upload your ability to have déjà vu directly to AWS. Consciousness is a funny thing: we keep testing, and we keep finding that the files aren’t necessarily in the computer.

Talk about “uploading a mind” for some sort of immortality misses the mark, either through sloppy use of terminology or through lack of a proper model of what a mind actually is. Usually it’s both, and accepting either allows all kinds of absurd conflation to occur, like a Three-Card Monte shark shuffling the payout card to keep it always face down while deflecting attention to the decoys.

Angela Thornton of the University of Nottingham, the esteemed scholar behind the piece, teases with a very IFL Science-style headline: “Uploading our minds to a computer may someday be possible” [https://earthsky.org/human-world/uploading-our-minds-to-a-computer-may-someday-be-possible/]. The article is, of course, rife with weak, speculative terms and phrases like “could,” “potentially,” “contentious,” and “no one knows for certain” – all extremely genre-typical. It leans on sci-fi television and film while managing to ignore a host of top-line considerations that many of the best science fiction authors wrestled with as the core issues of their oft-prescient storytelling.

One can attempt to configure a neural net that replicates a particular brain’s structure, then train that configuration to fire in sequences replicating the organic brain’s neuron firing sequences under various conditions and stimuli. The computations and algorithms produce the tightest simulation they can, and the outputs can be parsed into media an organic brain can understand.
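For concreteness, here is a minimal sketch of that kind of mimicry. Everything below is a hypothetical toy, not a real connectome or any published method: a few logistic units are trained to reproduce “recorded” firing patterns under given stimuli, which is the whole trick in miniature.

```python
import numpy as np

# Hypothetical "recorded" data: for each stimulus, which of 4 neurons fire.
rng = np.random.default_rng(0)
stimuli = rng.normal(size=(200, 3))            # 200 observations, 3 stimulus features
true_w = rng.normal(size=(3, 4))               # hidden "organic" wiring we try to imitate
firing = (stimuli @ true_w > 0).astype(float)  # recorded firing pattern per stimulus

# Fit a digital stand-in: logistic units trained by gradient descent
# to reproduce the recorded firing sequences.
w = np.zeros((3, 4))
for _ in range(2000):
    pred = 1 / (1 + np.exp(-(stimuli @ w)))                # sigmoid activation
    w -= 0.1 * stimuli.T @ (pred - firing) / len(stimuli)  # gradient step

# How often does the simulation match the recording?
accuracy = ((1 / (1 + np.exp(-(stimuli @ w))) > 0.5) == firing).mean()
print(f"simulation matches recorded firing {accuracy:.0%} of the time")
```

The point of the toy is that even a very high match rate is still only a curve fit to outputs – the stand-in never contains the original wiring, only an approximation of its behavior.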

At the core of this simulation is a matrix of self-correcting digital abacuses flipping black and white beads into different orders. What has been achieved is not the uploading of a mind, it is the finger pointing at the Moon and not the Moon itself. It is like trying to catch the wind in a jar, and upon entering a still cave, opening the jar and expecting the cave to get windy.

You’re still underfitting the original subject you’re trying to model.
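“Underfitting” can be made concrete with a toy, assuming nothing about brains: a model that is structurally too simple for its subject keeps a large, irreducible error no matter how long it trains. Here, a straight line chases a curve it can never express.

```python
import numpy as np

# The "subject": a genuinely nonlinear phenomenon (hypothetical toy data).
x = np.linspace(-1, 1, 50)
y = x ** 3

# Best-fit straight line (degree-1 least squares) -- structurally too simple.
slope, intercept = np.polyfit(x, y, 1)
linear_err = np.mean((slope * x + intercept - y) ** 2)

# A cubic fit has enough structure to actually capture the subject.
cubic = np.polyval(np.polyfit(x, y, 3), x)
cubic_err = np.mean((cubic - y) ** 2)

print(f"underfit error: {linear_err:.4f}, adequate-model error: {cubic_err:.2e}")
```

No amount of extra training data or compute fixes the line; the model family itself lacks the structure of the thing being modeled, which is the complaint about bead-flipping simulations of minds.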

Thought Libel & Bioethics

Thornton’s article includes some other fanciful musings about extrapolating from brain-output estimation techniques to “write” routines for implanting memories and information into living beings. While the term “neurorights” is invoked, this really still falls under bioethics, because they’ll start with animal testing first.

The baked-in assumption that a memory write routine could imprint on the mind tacitly dismisses the metaphorical file header of a memory. Persons who have experienced trauma can repress, or block out, memories – a process frequently attended by other psychological problems in what is known as “personality fragmentation.” This is a condition in which, in effect, the conscious mind blocks the content of memories because they are corrupted. The usual side effect of that process is suffering, and the DSM has been updated quite a few times to keep up with the myriad forms that suffering takes. Implanting memories into a sentient lifeform risks traumatizing it.

Why Are We Complaining About This?

Because, to put it very lightly, it’s a little trite.

Also, Transhumanists and the Transhumanist-adjacent don’t seem to take the risk seriously. They’re so enamored with the idea that we might be able to do it all that they forget it’s vaguely abominable and will immediately be seized upon by rich megalomaniacs.

To quote the late Bill Hicks, “Science is all about coulda, not shoulda.”

