Canonical Software

Lists are all the rage – and not just the traditional rundowns at New Year. In 2014 Paul Ford suggested that it might be possible to propose a software canon consisting of great works of deeply influential software that changed the nature of what followed. He suggested five: the office suite Microsoft Office, the image editor Photoshop, the videogame Pac-Man, the operating system Unix, and the text editor Emacs.

Earlier, in 2013, Matthew Kirschenbaum had come up with ten: like Ford, he listed Photoshop, but there was no room for Microsoft Office (WordStar and VisiCalc instead) – and no Pac-Man (Maze War, Adventure, and Minecraft were the games on his list). There was no operating system, but he did include Mosaic, the first graphical web browser.

Engaging in this kind of debate is generally best done over a few drinks, but in their absence, what would be the software canon for archaeologists?

Real-time Digital Archaeology?

Vint Cerf, co-designer of the TCP/IP protocols that make the Internet work and Vice-President and Chief Internet Evangelist at Google, warned last month (for example, here, here and here) of an information black hole into which digitised material is lost as we lose access to the programs needed to view it. Somewhat ironically, Google's own recent priorities seem to have been to withdraw from information projects which preserved the past – killing off archives, slowing down digitisation activities, removing the Timeline, and increasingly prioritising newness over older, more established sources in search results (Baio 2015).

Responses to the reporting of Cerf's warnings were mixed. Some seemed relatively complacent: after all, aren't we already preserving data and information in libraries and archives, and won't using open file formats mean that bit rot is not a problem? Many such responses overlooked part of Cerf's argument – that we need to preserve old software and hardware so that we retain the ability to read files in their original formats: what he characterised as 'digital vellum'.
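Bit rot is, at least, detectable. The standard archival guard is fixity checking: record a checksum for each file at ingest and recompute it on every audit, so that silent corruption shows up as a mismatch. A minimal sketch in Python (the function name and chunk size here are illustrative, not from any particular preservation system):

```python
import hashlib
from pathlib import Path

def fixity(path: Path, algorithm: str = "sha256") -> str:
    """Compute a checksum ('fixity value') for a preserved file.

    Reads in chunks so that arbitrarily large files can be
    checked without loading them into memory.
    """
    digest = hashlib.new(algorithm)
    with path.open("rb") as fh:
        for chunk in iter(lambda: fh.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()

# On ingest, store the checksum alongside the file; on each audit,
# recompute and compare -- any mismatch signals silent corruption.
```

Note, though, that this only tells us the bits are intact; it says nothing about whether any software survives that can still interpret them, which is precisely Cerf's point.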

Mastering Mystery

A couple of articles have appeared in recent days which in different ways suggest that digital technology is too easy. Samuel Arbesman writes about how important it is to be able to see under the hood of the tools we use; Brian Millar suggests that simplicity of design takes away the satisfaction and confidence gained through mastery of the technology. In different ways, both link understanding with our ability to see past glossy interfaces designed to keep us at arm's length from the guts of the process. Arbesman's reminiscences about typing games programs from magazines into a VIC-20 and gaining an appreciation of coding and debugging certainly make me nostalgic. In my case it was working with my younger brother's Commodore 64 which led directly to working as a research student on machines ranging from ICL and VAX mainframes, through North Star Horizons and Sirius PCs running CP/M, to the original IBM PCs and MS-DOS, and to using them (and more recent models!) to write archaeological programs in FORTRAN, C and C++.
