In 1967 Tony Pritchett created The Flexipede using:
I’m creating this eBook portrait of Tony using:
But this project always felt a bit out of whack because I haven’t yet managed to source an Atlas supercomputer and then code it using punchcards and Fortran IV.
But then I discovered Ai…
I realised that instead of Atlas and 16mm, I could use Ai and 16mm. Ai also felt right, since my research has shown that Tony and Atlas created the first computer-generated ‘illusion of life’.
The clip above is something I posted online in celebration of The Flexipede’s 55th birthday. It’s some scratches on a Flexipede print, a few frames before the creature walks on screen and opens its spiral-shaped eye for the first time, reacting to something onscreen and thus appearing sentient.
I put the scratches into Stable Diffusion Ai via Google Colab, which runs PyTorch – spent a while on some coding forums looking like an eejit – and then added the prompt ‘illusion of life’. Essentially what you see here is Flexipede footage remixed by Ai. Personally, I like that it looks a little like musical notation. I love that this collaboration references Tony’s crediting of Atlas in The Flexipede titles.
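For the curious, that Stable Diffusion step can be sketched roughly like this, assuming the Hugging Face diffusers library running on a Colab GPU. The model name, file paths and strength value here are my illustrative placeholders, not a record of the exact settings used:

```python
# Rough sketch: remix a scanned film frame with Stable Diffusion
# img2img via Hugging Face diffusers. Model name, paths and the
# strength setting are placeholders.
PROMPT = "illusion of life"

def remix_frame(in_path: str, out_path: str, prompt: str = PROMPT) -> None:
    # Imports kept inside the function: torch/diffusers are heavy,
    # GPU-bound dependencies that may not be installed locally.
    import torch
    from PIL import Image
    from diffusers import StableDiffusionImg2ImgPipeline

    pipe = StableDiffusionImg2ImgPipeline.from_pretrained(
        "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
    ).to("cuda")
    init = Image.open(in_path).convert("RGB").resize((512, 512))
    # strength controls how far the diffusion wanders from the scan:
    # low values keep the scratches visible, high values hallucinate freely.
    result = pipe(prompt=prompt, image=init, strength=0.6).images[0]
    result.save(out_path)
```

In img2img mode the scratched frame is used as the starting point for denoising, which is why the output still carries the rhythm of the original scratches.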
Of course, Tony knew EXACTLY what he wanted to ‘make the computer do’. Using a text-to-image diffusion model is a whole different kettle of rabbits (I was trying to type ‘fish’ but the computer opted for ‘rabbits’). Just how much I use Ai will be something I discover during making, I think.
The clip below is a bit more involved.
Tony described the back and forth between coding, digital and analogue output as an ‘iterative process’.
Inspired by this, his use of both cutting edge digital and 16mm analogue tech – AND his repurposing of equipment, I decided to do the following…
I took digitised 16mm footage (1967) of:
a) The Flexipede walking across the screen.
b) Titles from a scientific demo reel belonging to Culham nuclear fusion research laboratory. (The reel was showing the capabilities of their microfilm recorder, which Tony used to allow the Flexipede to walk out of the digital Atlas computer onto analogue 16mm film. You can find out more about how this was done, here.)
c) The digitised optical soundtrack.
I put the picture into After Effects and transformed it using an algorithmic ‘expression’. I put the sound into Ableton.
I used my DIY microfilm recorder to transfer the footage back onto 16mm. I then hand-processed the black and white footage. Huge thanks to Metal and John Salim. Images below. The Flexipede is already looking very different!
Personal note: The imagery reminded me of the stained glass window in Tony’s box room/archive storage room door.
White Bus projectionist John Salim kindly projected the footage onto a wall at The Old Waterworks and I filmed it using a DSLR. I included quite a large chunk of the wall, some beanbags and a projector in shot.
I put the footage back into Stable Diffusion and added the prompt, ‘Fortran’. I also took samples from The Flexipede and played around with them using Ableton.
(I’ve since obtained a scan of the original Flexipede optical track and will be having some fun with that at a later date!)
I have no idea what Tony would think of all this, but as he was fascinated by ‘the new’ I have an inkling that he might at least have enjoyed having a play with Ai.
You can see the resulting ‘microshort’ below:
And another thing…
Here’s the footage slowed down to 20% of its original speed.
Note the apparent ‘voice’ at 1 min 10 secs:
“Are we human? Are we human?”
Well that’s lovely – since my research has found that THE FLEXIPEDE is the world’s first piece of computer generated character animation aka ‘illusion of life’ AND Tony was a big fan of all things paranormal 🙂
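As an aside, a 20% slowdown like the one above can be reproduced outside an editing suite with ffmpeg. Here’s a minimal sketch (assuming ffmpeg is installed; the filenames are placeholders) that works out the video and audio filter values in Python. Playing at 20% speed means stretching each frame to 1/0.2 = 5x its original duration, and ffmpeg’s `atempo` audio filter only accepts factors between 0.5 and 100, so slow audio has to be chained in stages:

```python
# Sketch: build ffmpeg filter strings for slowed-down playback.
# Filenames below are placeholders, not the project's actual files.

def slowdown_filters(speed: float) -> tuple[str, str]:
    """Return (video_filter, audio_filter) for a playback speed factor.

    speed=0.2 plays at 20% of original speed: presentation timestamps
    are stretched by 1/0.2 = 5x.
    """
    stretch = 1.0 / speed
    video = f"setpts={stretch}*PTS"
    # atempo only accepts 0.5-100.0 per stage, so chain 0.5 stages
    # until the remaining factor fits in range.
    stages, remaining = [], speed
    while remaining < 0.5:
        stages.append("atempo=0.5")
        remaining /= 0.5
    stages.append(f"atempo={remaining}")
    audio = ",".join(stages)
    return video, audio

vf, af = slowdown_filters(0.2)
print(f'ffmpeg -i flexipede.mov -vf "{vf}" -af "{af}" flexipede_20pct.mov')
```

The chained audio stages multiply out to the same factor as the video stretch (0.5 × 0.5 × 0.8 = 0.2), so picture and sound stay in sync.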