Parsimonious Preservation – (another) different approach to digital information management

October 14th, 2013

We have been featuring various theories about digital information management on this blog in order to highlight some of the debates involved in this complex and evolving field.

To offer a different perspective to those we have focused on so far, take a moment to consider the principles of Parsimonious Preservation that have been developed by the National Archives, and in particular advocated by Tim Gollins, who is Head of Preservation at the institution.

racks of servers storing digital information

In some senses the National Archives seem to be bucking the trend of panic, hysteria and (sometimes) confusion that can be found in other literature relating to digital information management. The advice given in the report, ‘Putting Parsimonious Preservation into Practice’, advocates a hands-off approach, in contrast to the hands-on approach that many other institutions, including the British Library, recommend.

The principle that digital information requires continual interference and management during its life cycle is rejected wholesale by parsimonious preservation, which argues instead that minimal intervention is preferable because it entails ‘minimal alteration, which brings the benefits of maximum integrity and authenticity’ of the digital data object.

As detailed in our previous posts, cycles of encoding and decoding pose a very real threat to digital data, because they can change the structure of the files and, in the long run, risk compromising the quality of the data object.

Minimal intervention seems like a good idea in practice – if you leave something alone in a safe place, rather than continually moving it from pillar to post, it is less likely to suffer from everyday wear and tear. With digital data, however, obsolescence is the main factor that prevents a hands-off approach. This too is downplayed by the National Archives report, which suggests that obsolescence, although undeniably a threat to digital information, is not as big a worry as it is often presented to be.

Gollins draws on over ten years of experience at the National Archives, as well as research conducted by David Rosenthal, to offer a different approach to obsolescence, one that takes note of the ‘common formats’ that have been used worldwide (such as PDF, .xls and .doc). The report therefore concludes ‘that without any action from even a national institution the data in these formats will be accessible for another 10 years at least.’

10 years may seem like a short period of time, but this is the timescale cited as practical and realistic for the management of digital data. Gollins writes:

‘While the overall aim may be (or in our case must be) for “permanent preservation” […] the best we can do in our (or any) generation is to take a stewardship role. This role focuses on ensuring the survival of material for the next generation – in the digital context the next generation of systems. We should also remember that in the digital context the next generation may only be 5 to 10 years away!’

It is worth mentioning here that the Parsimonious Preservation report only includes references to file extensions relating to image files, rather than sound or moving images, so it would be a mistake to assume that the principle of minimal intervention can be applied equally to these kinds of digital data objects. Furthermore, .doc files used in Microsoft Office are not always consistent over time – have you ever tried to open a Word file from 1998 in an Office package from 2008? You might have a few problems… This is not to say that Gollins doesn’t know his stuff; he clearly must do to be Head of Preservation at the National Archives! It is just that this ‘hands-off, don’t worry about it’ approach seems odd in relation to the other literature about digital information management available from reputable sources like the British Library and the Digital Preservation Coalition. Perhaps there is a middle ground to be struck between active intervention and leaving things alone, but it isn’t suggested here!

For Gollins, ‘the failure to capture digital material is the biggest single risk to its preservation,’ far greater than obsolescence. He goes on to state that ‘this is so much a matter of common sense that it can be overlooked; we can only preserve and process what is captured!’ Another issue here is the quality of the capture – it is far easier to preserve good quality files if they are captured at appropriate bit rates and resolution. In other words, there is no point making low resolution copies because they are less likely to survive the rapid successions of digital generations. As Gollins writes in a different article exploring the same theme, ‘some will argue that there is little point in preservation without access; I would argue that there is little point in access without preservation.’

Diagram explaining how emulation works to make obsolete computers available on new machines

This has been bit of a whirlwind tour through a very interesting and thought provoking report that explains how a large memory institution has put into practice a very different kind of digital preservation strategy. As Gollins concludes:

‘In all of the above discussion readers familiar with digital preservation literature will perhaps be surprised not to see any mention or discussion of “Migration” vs. “Emulation” or indeed of “Significant Properties”. This is perhaps one of the greatest benefits we have derived from adopting our parsimonious approach – no such capability is needed! We do not expect that any data we have or will receive in the foreseeable future (5 to 10 years) will require either action during the life of the system we are building.’

Whether or not such an approach is naïve, neglectful or very wise, only time will tell.

Bristol Archive Records – ¼ inch studio master tapes, ½ inch 8 track multi-track tapes, audio cassettes, DAT recordings and Betamax digital audio recordings

October 7th, 2013

Bristol Archive Records is more than a record label. It releases music and books and, through its website, documents the history of Bristol’s punk and reggae scenes from 1977 onwards. You can get lost for hours trawling through the scans of rare zines and photographs, profiles of record labels and bands, discographies and gig lists. It’s a huge amount of work that keeps on expanding as more tapes are found, lurking in basements or in that unforeseen place at the back of the wardrobe.

Revelation Rockers album cover

Great Bear has the privilege of being the go-to digitisation service for Bristol Archive Records, and many of the albums that grace the record store shelves of Bristol and beyond found their second digital life in the Great Bear Studio.

Black Roots anthology cover

The tapes that Mike Darby has given us to digitise include ¼ inch studio master tapes, ½ inch 8 track multi-track tapes, audio cassettes, DAT recordings and Betamax digital audio recordings. The recordings were mostly made at home or in small commercial studios, and often they were not stored in the best conditions. Some are demos, or other material which has never been released before. Many were recorded on Ampex tape and therefore needed to be baked before they could be played back, and we have also had to deal with other physical problems with the tape, such as mould, but they have all, thankfully, been fixable.

After transfer we supply high quality WAV files as individual tracks or ‘stems’ to label manager Mike Darby, which are then re-mastered before release on CD, vinyl or download.

Bristol Archive Records have done an amazing job of ensuring the cultural history of Bristol’s music scenes is not forgotten. As Mike explains in an interview on Stamp the Wax:

‘I’m trying to give a bit of respect to any individual that played in any band that we can find any music from. However famous or successful they were is irrelevant. For me it’s about acknowledging their existence. It’s not saying they were brilliant, some of it was not very good at all, but it’s about them having their two seconds of “I was in that scene”.’

Electric Guitars album cover

While Darby admits in the interview that Bristol Archive Records is not exactly a money spinner, the cultural value of these recordings is immeasurable. We are delighted to be part of the wider project and hope these rare tapes continue to be found so that contemporary audiences can enjoy the musical legacies of Bristol.

1/2 inch EIAJ skipfield reel to reel videos transferred for Stephen Bell

October 7th, 2013

We recently digitised a collection of 1/2 inch EIAJ skipfield reel to reel videos for Dr Stephen Bell, Lecturer in Computer Animation at Bournemouth University.

CLEWS SB 01 from Stephen Bell on Vimeo.

Stephen wrote about the piece:

‘The participatory art installation that I called “Clews” took place in “The White Room”, a bookable studio space at the Slade School of Art, over three days in 1979. People entering the space found that the room had been divided in half by a wooden wall that they could not see beyond, but they could enter the part nearest the entrance. In that half of the room there was a video monitor on a table with a camera above it pointing in the direction of anyone viewing the screen. There was also some seating so that they could comfortably view the monitor. Pinned to the wall next to the monitor was a notice including cryptic instructions that referred to part of a maze that could be seen on the screen. Participants could instruct the person with the video camera to change the view by giving simple verbal instructions, such as “up”, “down”, “left”, “right”, “stop”, etc. until they found a symbol that indicated an “exit”.’

‘My plan was to edit the video recordings of the event into a separate, dual screen piece but it was too technically challenging for me at the time. I kept the tapes though, with the intention of completing the piece when time and resources became available. This eventually happened in 2012 when, researching ways to get the tapes digitized, I discovered Greatbear in Bristol. They have done a great job of digitizing the material and this is the first version of the piece I envisaged all those years ago.’

Nice to have a satisfied customer!

7″ 8 track reel to reel tapes recorded on a Fostex A8

September 30th, 2013

We were recently sent a collection of 7″ 8-track reel-to-reel tapes, all recorded using Dolby C noise reduction on a Fostex A8 machine. They haven’t been stored in optimum conditions and, as many were recorded on Ampex tape, they need to be baked prior to transfer in order to treat probable binder hydrolysis.

Ampex 7" Tapes

The A-8 was part of the home recording revolution that took the 80s by storm. It was particularly popular because it was the first machine to offer eight tracks on just one 1/4″ tape. The machine, like its ‘first mate’ the 350 Mixer, was not meant for professionals but for enthusiastic amateurs who were happy to work things out themselves. ‘Sure you won’t know everything right off. But you won’t have to. Just hook up to the 350 (our instructions are easy and explicit) and go to work. You can learn the key to incredible flexibility as you go. While you are working on your music. Not before,’ were the encouraging words in the 350 mixer manual.

Fostex A-8

Products like the Fostex A-8 enabled bands and artists who would never have got a commercial record deal to record their music. All sorts of weird and wonderful sounds were recorded on multi-track tape recorders, and they often received airplay on John Peel‘s radio shows.

When we transfer reel-to-reel multi-track tapes we save each stem individually, so you can remix the recordings digitally if you want to. If you spent far too much time in the early 80s playing with your home studio and have a load of old tapes lying in your cupboard, we can help give them a new lease of life. With Ampex tapes in particular, it is critical to transfer them now because they will deteriorate quickly if action is not taken soon.

Paper backed Soundmirror ‘magnetic ribbon’ – early domestic magnetic tape recorders

September 30th, 2013

The oldest tape we have received at the Great Bear is a spool of paper backed magnetic tape, c.1948-1950. It’s pretty rare to be sent paper backed tape, and we have been on a bit of an adventure trying to find out more about its history. On our trail we found a tale of war, economics, industry and invention as we chased the story of the ‘magnetic ribbon’.

Paper Backed Magnetic Tape

The first thing to recount is how the development of magnetic tape in the 1930s and 1940s is enmeshed with events in the Second World War. The Germans were pioneers of magnetic tape, and in 1935 AEG demonstrated the Magnetophon, the first ever tape recorder. The Germans continued to develop magnetic tape, but as the 1930s wore on and war was declared, the fruits of technological invention were not widely shared – establishing sophisticated telecommunication systems was essential for the ‘war effort’ on both sides.

Towards the end of the war, when the Allies liberated the towns and cities of Europe, they liberated its magnetic tape recording equipment too. Don Rushin writes in ‘The Magic of Magnetic Tape’:

‘By late 1944, the World War II Allies were aware of the magnetic recorder developed by German engineers, a recorder that used an iron-powder-coated paper tape, which achieved much better sound quality than was possible with phonograph discs. A young Signal Corps technician, Jack Mullin, became part of a scavenging team assigned to follow the retreating German army and to pick up items of electronic interest. He found parts of recorders used in the field, two working tape recorders and a library of tapes in the studios of Radio Frankfurt in Bad Nauheim.’

In the United States during WW2, significant resources were devoted to developing magnetic tape. ‘With money no object and the necessity of adequate recording devices for the military, developments moved at a brisker pace’, writes Mark Mooney.

Soundmirror advert

This is where our paper tape comes into the equation, courtesy of Polish-born inventor Semi J. Begun. Begun began working for the Brush Development Company in 1938, one of the companies contracted to develop magnetic tape for the US Navy during the war. In his position at Brush, Begun invented the ‘Sound Mirror’. Developed in 1939-1940 but released on the market in 1946, it was the first magnetic tape recorder to be sold commercially in the US after WW2.

As the post-war rush to capitalise on an emerging consumer market gathered pace, companies such as 3M developed their own magnetic tapes. Paper backed magnetic tape was superseded by plastic tape toward the end of the 1940s, making a short but significant appearance in the history of recording media.

This, however, is a story of magnetic tape in the US, and our tape was recorded in England, so the mystery of the paper tape has not been solved. Around the rim of the rusted spool it states that it is ‘Licensed by the Brush Development Co U.S.A’, ‘Made in England’, ‘Patents Pending’ and ‘Thermionic Products Ltd’.

Thermionic Products were the British company who acquired the licence to build the Soundmirror in 1948. Barry M Jones, who has compiled a wider history of the British tape recorder, home studio and studio recording industries, writes: ‘[The Soundmirror] was the first British-built domestic tape-recorder, whereas the first British built-and-designed tape recorder was the Wright & Weaire, which appeared a few weeks later. Production began in autumn 1948 but the quality of the paper tape meant it shed oxide too readily and clogged the heads!’

Production of the Soundmirror continued until late 1954, so it is possible to date the tape as being recorded some time between 1948 and the mid 1950s. The spool and tape are surprisingly heavy, yet the tape is incredibly fragile, marking its passage through time with signs of corrosion and wear. It is a beautiful object, as many of the tapes we receive are, entwined with the social histories of media, invention, economy and everyday life.

A word about metadata and digital collections

September 23rd, 2013

Metadata is data about data. Maybe that sounds pretty boring, but archivists love it, and it is really important for digitisation work.

As mentioned in the previous post on the British Library’s digital preservation strategies, as well as in many other features on this blog, it is fairly easy to change a digital file without knowing, because you can’t see the changes. Sometimes changing a file is reversible (as in non-destructive editing) but sometimes it is not (destructive editing). What is important to realise is that changing a digital file irrevocably, or applying lossy instead of lossless compression, will affect the integrity and authenticity of the data.

What is perhaps worse in the professional archive sector than changing the structure of the data is not making a record of the change in the metadata.

Metadata is a way to record all the journeys a data object has gone through in its lifetime. It can be used to highlight preservation concerns if, for example, a file has undergone several cycles of encoding and decoding that potentially make it vulnerable to degradation.

Example of Metadata

Metadata can in fact be split into three kinds, as Ian Ireland writes in this article:

‘technical data (info on resolution, image size, file format, version, size), structural metadata (describes how digital objects are put together, such as a structure of files in different folders) and descriptive (info on title, subject, description and covering dates), with each type providing important information about the digital object.’
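To make this concrete, here is a hypothetical metadata record for a digitised tape transfer, sketched in Python, with each of the three kinds represented. Every field name and value is invented for illustration; real schemes such as PREMIS or Dublin Core are far more formal.

```python
# A hypothetical metadata record for a digitised tape transfer;
# all field names and values are invented for illustration.
record = {
    "technical": {          # how the file is made
        "file_format": "WAV",
        "bit_depth": 24,
        "sample_rate_hz": 48000,
        "file_size_bytes": 1234567890,
    },
    "structural": {         # how the files fit together
        "collection": "oral_history",
        "files": ["interview_side_a.wav", "interview_side_b.wav"],
    },
    "descriptive": {        # what the recording actually is
        "title": "Interview with a Bristol musician",
        "subject": "local music history",
        "covering_dates": "1979-1981",
    },
}
```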

As the previous blog entry detailed, digital preservation is a dynamic, constantly changing sector. Furthermore, digital data requires far greater intervention to manage than physical objects and even analogue media. In such a context data objects undergo rapid changes as they adapt to the technical systems they are opened by and moved between. This, one would speculate, produces a large stream of metadata.

What is most revealing about the metadata surrounding digital objects is that it creates a trail of information not only about the objects themselves. It also documents our changing relationship to, and knowledge about, digital preservation. Metadata can help tell the story of how a digital object is transformed as different technical systems are adopted and then left behind. The marks of those changes are carried in the data object’s file structure, and in the metadata that further elaborates those changes.

As with those who preserve physical heritage collections, a practice of minimal intervention is the ideal for maintaining the integrity and authenticity of digital collections. But mistakes are made, and attempts to ‘clean up’ or otherwise clarify digital data do happen, so when they do, it is important to record those changes, because they help guide how we look after archives in the long term.

Digitisation strategies – back up, bit rot, decay and long term preservation

September 23rd, 2013

In a blog post a few weeks ago we reflected on several practical and ethical questions emerging from our digitisation work. To explore these issues further we decided to take an in-depth look at the British Library’s Digital Preservation Strategy 2013-2016 that was launched in March 2013. The British Library is an interesting case study because they were an ‘early adopter’ of digital technology (2002), and are also committed to ensuring their digital archives are accessible in the long term.

Making sure the UK’s digital archives are available for subsequent generations seems like an obvious aim for an institution like the British Library. That’s what they should be doing, right? Yet it is clear from reading the strategy report that digital preservation is an unsettled and complex field, one that is certainly ‘not straightforward. It requires action and intervention throughout the lifecycle, far earlier and more frequently than does our physical collection (3).’

The British Library’s collection is huge and therefore requires coherent systems capable of managing its vast quantities of information.

‘In all, we estimate we already have over 280 terabytes of collection content – or over 11,500,000 items – stored in our long term digital library system, with more awaiting ingest. The onset of non-print legal deposit legislation will significantly increase our annual digital acquisitions: 4.8 million websites, 120,000 e-journal articles and 12,000 e-books will be collected in the first year alone (FY 13/14). We expect that the total size of our collection will increase massively in future years to around 5 petabytes [that’s 5,000 terabytes] by 2020.’

All that data needs to be backed up as well. In some cases valuable digital collections are backed up seven times, in different locations/servers (amounting to 35 petabytes, or 35,000 terabytes). So imagine it is 2020, and you walk into a large room crammed full of rack upon rack of hard drives bursting with digital information. The data files – which include everything from a BWAV audio file of a speech by Natalie Bennett, leader of the Green Party, after her election victory in 2015, to 3-D data files of cuneiform scripts from Mesopotamia – are constantly monitored by algorithms designed to maintain the integrity of data objects. The algorithms measure bit rot and data decay and produce further volumes of metadata as each wave of file validation is initiated. The back up systems consume large amounts of energy and are costly, but in beholding them you stand in the same room as the memory of the world, automatically checked, corrected and repaired in monthly cycles.

Such a scenario is gestured toward in the British Library’s long term preservation strategy, but it is clear that it remains a work in progress, largely because the field of digital preservation is always changing. While the British Library has well-established procedures in place to manage their physical collections, they have not yet achieved this with their digital ones. Not surprisingly ‘technological obsolescence is often regarded as the greatest technical threat to preserving digital material: as technology changes, it becomes increasingly difficult to reliably access content created on and intended to be accessed on older computing platforms.’ An article from The Economist in 2012 reflected on this problem too: ‘The stakes are high. Mistakes 30 years ago mean that much of the early digital age is already a closed book (or no book at all) to historians.’

Destroyed Hard Drive

There are also shorter term digital preservation challenges, which encompass ‘everything from media integrity and bit rot to digital rights management and metadata.’ Bit rot is one of those terms capable of inducing widespread panic. It refers to how storage media, in particular optical media like CDs and DVDs, decay over time often because they have not been stored correctly. When bit rot occurs, a small electric charge of a ‘bit’ in memory disperses, possibly altering program code or stored data, making the media difficult to read and at worst, unreadable. Higher level software systems used by large institutional archives mitigate the risk of such underlying failures by implementing integrity checking and self-repairing algorithms (as imagined in the 2020 digital archive fantasy above). These technological processes help maintain ‘integrity and fixity checking, content stabilisation, format validation and file characterisation.’
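For the curious, the fixity checking at the heart of such systems can be sketched in a few lines of Python: a checksum is recorded for every file at ingest, and later runs recompute and compare it. The manifest file here is hypothetical, and real repository software is of course far more elaborate.

```python
import hashlib
import json
from pathlib import Path

def sha256_of(path):
    # Compute the SHA-256 checksum of a file, reading it in 1 MB chunks.
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

def check_fixity(manifest_path):
    # The manifest maps filenames to the checksums recorded at ingest;
    # any mismatch means the bits have changed and the file needs repair.
    manifest = json.loads(Path(manifest_path).read_text())
    for filename, expected in manifest.items():
        status = "OK" if sha256_of(filename) == expected else "CHANGED"
        print(f"{filename}: {status}")

# check_fixity("manifest.json")  # hypothetical manifest written at ingest
```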

Archival Gold Disc

300 years, are you sure?

Preservation differences between analogue and digital media

The British Library isolates three main areas where digital technologies differ from their analogue counterparts. Firstly there is the issue of ‘proactive lifecycle management’. This refers to how preservation interventions for digital data need to happen earlier, and be reviewed more frequently, than for analogue data. Secondly there is the issue of file ‘integrity and validation’. This refers to how it is far easier to make changes to a digital file without noticing, while with a physical object it is usually clear if it has decayed or a bit has fallen off. This means there are greater risks to the authenticity and integrity of digital objects, and any changes need to be carefully managed and recorded properly in metadata.

Finally, and perhaps most worrying, is the ‘fragility of storage media‘. Here the British Library explain:

‘The media upon which digital materials are stored is often unstable and its reliability diminishes over time. This can be exacerbated by unsuitable storage conditions and handling. The resulting bit rot can prevent files from rendering correctly if at all; this can happen with no notice and within just a few years, sometimes less, of the media being produced’.

A holistic approach to digital preservation involves taking and assessing significant risks, as well as adapting to vast technological change. ‘The strategies we implement must be regularly re-assessed: technologies and technical infrastructures will continue to evolve, so preservation solutions may themselves become obsolete if not regularly re-validated in each new technological environment.’

Establishing best practice for digital preservation remains a bit of an experiment, and different strategies such as migration, emulation and normalisation are being tested to find out which model best counters the real threats of inaccessibility and obsolescence we may face 5-10 years from now. What is encouraging about the British Library’s strategic vision is that they are committed to ensuring digital archives remain accessible for years to come, despite the very clear challenges they face.

Remembering Ray Dolby, pioneer of analogue noise reduction

September 16th, 2013

We have already written about noise reduction this week, but did so without acknowledging the life of Ray Dolby, who died on 12 September 2013. Dolby was one of the inventors of video tape recording while working at Ampex, and later the founder of Dolby Laboratories, whose noise reduction systems bear his name.

An obituary in The Guardian described how:

‘His noise-reduction system worked by applying a pre-emphasis to the audio recording, usually boosting the quieter passages. The reverse process was used on playback. Removing the boost – lowering the level – also removed most of the tape hiss that accompanied all analogue recordings. Of course, people did not care how it worked: they could hear the difference.’

Dolby managed to solve a clear problem blighting analogue tape recording: the high frequency noise, or tape hiss, inherent in recording on magnetic tape.
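The principle can be sketched numerically. The toy compander below is nothing like a real Dolby circuit, which splits the signal into frequency bands and tracks levels dynamically, but it shows the complementary boost-then-inverse idea in Python:

```python
import numpy as np

def encode(x):
    # Toy pre-emphasis: square-root companding boosts low-level samples
    # (for |x| < 1, sqrt(|x|) > |x|) so they sit well above the hiss floor.
    return np.sign(x) * np.sqrt(np.abs(x))

def decode(y):
    # The exact inverse, applied on playback: original levels are restored
    # and low-level hiss added on tape is pushed back down.
    return np.sign(y) * y ** 2

clean = np.array([0.01, 0.5, -0.9])   # normalised audio samples
hiss = 0.005                          # noise picked up during storage
recovered = decode(encode(clean) + hiss)
# The quiet 0.01 sample comes back as ~0.011 rather than the 0.015 you
# would get without companding; on loud passages the music masks the hiss.
```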

Dolby 365 / 363 Dual Channel / Stereo A-Type Noise Reduction

Like many professional recording studios from the 1960s onwards, the Great Bear Studio uses the Dolby A noise-reduction system to play back Dolby A encoded tapes. Dolby A splits the input signal into four individual frequency bands, providing 10 dB of broadband noise reduction overall.

Dolby SR (Spectral Recording) modules and board from a BVH 3100 1 inch C format video

We also have a Dolby SR (Spectral Recording) system, introduced in 1986 to improve upon earlier analogue systems and in some cases surpass rapidly innovating digital sound technologies. Dolby SR maximises the recorded signal at all times using a complex series of filters that change according to the input signal, and can provide up to 25 dB of noise reduction.

Audio Noise Reduction and Finn’s World War Two Stories

September 16th, 2013

We get a range of tape and video recordings to digitise at the Great Bear. Our attention is captured daily by things which are often unusual, interesting and historically significant in their own way.

Last week we received a recording of Pilot Officer Edwin Aldridge ‘Finn’ Haddock talking about his experiences in the Second World War. Finn, who has since passed away, had made the tape in preparation for a talk he was giving at a local school, using the recording to rehearse his memories.

Despite the dramatic nature of the story, in which he is shot down over Northern France, sheltered by the French resistance and captured by the Germans, it is told in a remarkably matter-of-fact, detached manner. This is probably because the recording was made with no specific audience in mind, but was used to prompt his talk.

Finn’s story gives us a small insight into the bravery and resilience of people in such exceptional circumstances. The recording tells us what happened in vivid terms, from everyday facts, such as what he ate while in hiding and in captivity, to mass executions conducted by the Gestapo.

The now digitised tape recording, which was sent to us by his niece, will be shared among family members and a copy deposited with the local history club in Wheatley Hill, where Finn was born.

Finn was also interviewed by the Imperial War Museum about his experiences, which can be accessed if you click on this link.

On a technical note, when we were sent the tape we were asked if we could reduce the noise and otherwise ‘clean up’ the recording. While the question of how far it is reasonable to change the original recording remains an important consideration for those involved in digital archiving work, as was discussed last week on the Great Bear tape blog, there are some things which can be done if there is excessive hiss or other forms of noise on a recording.

screen grab of a spectrogram of an audio file in iZotope RX

The first step is to remove transient noise, which manifests as the clicks and pops that can affect the audibility of the recording. Family home recordings made with cheap tape recorders and microphones often picked up knocks and bangs, and there were some on Finn’s tape that were most probably the result of him moving around as he recorded his story.

The second step is to deploy broadband noise reduction, which removes noise across the audio spectrum. To do this we use high pass and low pass filters, which effectively smooth off unwanted noise at either end of the frequency range. The limited frequency range of the male voice means that it is acceptable to employ filters at 50 Hz (high pass) and 8000 Hz (low pass) without affecting the integrity of the recording.
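For readers who like to experiment, here is a minimal sketch of that filtering in Python, using the scipy and soundfile libraries; the filenames are made up, and dedicated restoration tools like iZotope RX do a great deal more.

```python
import soundfile as sf
from scipy.signal import butter, sosfiltfilt

audio, rate = sf.read("raw_transfer.wav")  # hypothetical filename

# Band-pass between 50 Hz and 8 kHz: rolls off rumble below the voice
# and hiss above it, leaving the speech band untouched.
sos = butter(4, [50, 8000], btype="bandpass", fs=rate, output="sos")
filtered = sosfiltfilt(sos, audio, axis=0)  # zero-phase, no time smearing

sf.write("filtered_transfer.wav", filtered, rate, subtype="PCM_24")
```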

It is important to remember that noise reduction is always a bit of a compromise, because you don’t want to clean something up to the extent that it sounds completely artificial. This is why it is important to keep the ‘raw’ transfer as well as an uncompressed edited version: we do not know what noise reduction techniques may be available five, ten or twenty years from now. Although we have a lot of experience in achieving high quality digital transfers at the Great Bear, any editing we do to a transfer is only one person’s interpretation of what sounds clear or appropriate. We therefore always err on the side of caution and supply customers with the uncompressed raw transfer, an uncompressed edited version and compressed access copies of their digitised files.

Finn’s story noise reduced

The ‘raw’ transfer

A further problem in noise reduction work is that it is possible to push noise reduction technology too much so that you end up creating ‘artefacts’ in the recording. Artefacts are fundamental alterations of the sound quality in ways that are inappropriate for digitisation work.

Another thing to consider is destructive versus non-destructive editing. Destructive editing is when a recording has been processed in software and changed irrevocably. Non-destructive editing, not surprisingly, is reversible: Samplitude, the software we use at the Great Bear, saves all the alterations made to the file, so if certain editing steps need to be undone they can be.
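The design idea behind non-destructive editing can be sketched very simply: the raw file is never touched, and the edits live in a separate list that is applied when the audio is rendered. This is only a conceptual sketch in Python, not how Samplitude actually stores its project data.

```python
# Conceptual sketch: the raw transfer is never modified; edits are stored
# as a list of operations applied only when the audio is rendered.
edits = []

def add_edit(operation, **params):
    edits.append({"op": operation, **params})

def undo_last():
    # Reversible precisely because the raw file was never touched.
    return edits.pop() if edits else None

add_edit("declick", sensitivity=0.8)
add_edit("bandpass", low_hz=50, high_hz=8000)
undo_last()  # the band-pass step is undone; the declick remains
```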

Again, while in essence the principles of digital transfer are simple, the intricacies of the work are what make it challenging and time consuming.

 

Measuring signals – challenges for the digitisation of sound and video

September 9th, 2013

In a 2012 report entitled ‘Preserving Sound and Moving Pictures’ for the Digital Preservation Coalition’s Technology Watch Report series, Richard Wright outlines the unique challenges involved in digitising audio and audiovisual material. ‘Preserving the quality of the digitized signal’ across a range of migration processes that can negotiate ‘cycles of lossy encoding, decoding and reformatting is one major digital preservation challenge for audiovisual files’ (1).

Wright highlights a key issue: understanding how data changes as it is played back, or moved from location to location, is important for thinking about digitisation as a long term project. When data is encoded, decoded or reformatted it alters shape, potentially compromising quality. This is a technical way of describing how elements of a data object are added to, taken away or otherwise transformed when it is played back across a range of systems and software different from those that created it.

Time base corrector

To think about this in terms which will be familiar to people today, imagine converting an uncompressed WAV into an MP3 file. You then burn your MP3s onto a CD as WAV files so they will play back on your friend’s CD player. The WAV file you started off with is not the same as the WAV file you end up with – it’s been squished and squashed and, in terms of data storage, is far smaller. While the smaller file size may be a bonus, the loss of quality isn’t. But this is what happens when files are encoded, decoded and reformatted.
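You can hear this for yourself by scripting the round trip in Python with the pydub library (which needs ffmpeg installed); the filenames are made up.

```python
from pydub import AudioSegment  # assumes pydub and ffmpeg are installed

original = AudioSegment.from_wav("master.wav")  # hypothetical filename
original.export("intermediate.mp3", format="mp3", bitrate="192k")

# Converting back to WAV restores the container, not the content:
# whatever the MP3 encoder discarded is gone for good.
AudioSegment.from_mp3("intermediate.mp3").export("roundtrip.wav", format="wav")
```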

Subjecting data to multiple layers of encoding and decoding does not only apply to digital data. Take Betacam video, for instance, a component analogue video format introduced by SONY in 1982. If your video was played back using the composite output, the circuitry within the Betacam video machine would have needed to encode it. The difference may have looked subtle, and you may not even have noticed any change, but the structure of the signal would be altered in a ‘lossy’ way and cannot be recovered to its original form. The encoding of a component signal, which is split into two or more channels, to a composite signal, which essentially squashes the channels together, is comparable to the lossy compression applied to digital formats such as MP3 audio, MPEG-2 video, etc.

U-matic time base corrector

A central part of the work we do at Great Bear is to understand the changes that may have occurred to the signal over time, and to try to minimise further losses in the digitisation process. We use a range of specialist equipment so we can carefully measure the quality of the analogue signal, including external time base correctors and waveform monitors. We also make educated decisions about which machine to use for playback, in line with what we expect the original recording was made on.

If we take for granted that any recording, whether analogue or digital, will have been altered in its lifetime in some way, either through changes to the signal or file structure or because of poor storage, an important question arises from an archival point of view. What do we do with the quality of the data customers send us to digitise? If the signal of a video tape is fuzzy, should we try to stabilise the image? If there is hiss or other noise on a tape, should we reduce it? Should we apply the same conservation values to audio and film as we do to historic buildings, such as ruins, or great works of art? Should we practise minimal intervention, using appropriate materials and methods that aim to be reversible, while ensuring that full documentation of all work undertaken is made, creating a trail of endless metadata as we go along?

Do we need to preserve the ways magnetic tape, optical media and digital files degrade and deteriorate over time, or are the rules different for media objects that store information which is not necessarily exclusive to them (the same recording can be played back from a vinyl record, a cassette tape, a CD, an 8 track cartridge or an MP3 file, for example)? Or should we ensure that we can hear and see clearly, and risk altering the original recording so we can watch a digitised VHS on a flat screen HD television, in line with our current expectations of media quality?

Time base correctors

Richard Wright suggests it is the data, rather than the operating facility, which is the important thing about the digital preservation of audio and audiovisual media.

‘These patterns (for film) and signals (for video and audio) are more like data than like artefacts. The preservation requirement is not to keep the original recording media, but to keep the data, the information, recovered from that media’ (3).

Yet it is not always easy to understand which parts of the data should be discarded, and which parts should be kept. Audiovisual and audio data are a product of both form and content, and it is worth taking care over the practices we use to preserve our collections, in case we overlook the significance of this point and lose something valuable – culturally, historically and technologically.

Magnetic Reel to Reel Tape and New Transfer Machines – Pictures from the Great Bear Studio

September 2nd, 2013

The Great Bear studio always has a wealth of interesting material in it that has somehow survived the test of time.

EMI and Scotch Magnetic Recording tape

From racks stacked full of obsolete audio and video tape machines, to the infinite varieties of reel-to-reel tape that were produced by companies such as Scotch, E.M.I. and Irish Recording Tape.

As objects in themselves they are fascinating, instilled with the dual qualities of fragility and resilience, the boxes worn at the edges and sometimes marked with stamps, identificatory stickers or scrawled, handwritten notes.

A selection of ‘audio letters’ sent to us by a customer

 

The latest addition to the Great Bear Studio – the Fostex Model 80 8 Track Recorder

Curating Digital Information or What Do You Do With Your Archive?

September 2nd, 2013

Today is the first day of iPres 2013, the 10th international conference on the preservation of digital objects held in Lisbon, Portugal. To mark the occasion we want to reflect on an issue that is increasingly important for the long term management of digital data: curation.

Anyone who has lived through the digital transition in the 21st century surely cannot ignore the information revolution they have been part of. In the past ten years, vast archives of analogue media have been migrated to digital formats, and every day we create new digital information that is archived and distributed through networks. ARCOMEM, who are running a workshop at iPres on ‘Archiving Community Memories’, describe how

‘in addition to the “common” challenges of digital preservation, such as media decay, technological obsolescence, authenticity and integrity issues, web preservation has to deal with the sheer size and ever-increasing growth and change rate of Web data. Hence, selection of content sources becomes a crucial and challenging task for archival organizations.’

As well as the necessary and sometimes difficult choices archival organisations have to make in the process of collecting an archive, there is then the issue of what to do with your data once it has been created. This is where the issue of digital curation comes in.


Screenshot of the SONY website from 1996

Traditionally, the role of the curator is to ‘take care’ and interpret collections in an art gallery or a museum. In contemporary society, however, there is an increasing need for people to curate collections that are exclusively digital, and can only be accessed through the web. Part of any long term digitisation strategy, particularly if an archive is to be used for education or research purposes, should therefore factor in plans and time for curation.

Curation transforms a digital collection from being the equivalent of a library, which may be searchable, organised and catalogued, into something more akin to an exhibition. Curation helps to select aspects of an archive in order to tell deliberate stories, or simply to help the user navigate content in a particular way. Curating material is particularly important if an archive deals with a specialist subject that few people know about, because visitors often need help to manoeuvre through large amounts of complex information. Being overwhelmed by content on the internet is a common experience, but if digital content is curated carefully it is more likely that people visiting your site will be able to cope with what they find there, and delve deeper into your digitised archival treasures.

Like all things digital, there are no steadfast or established guidelines for how to ensure your collection is curated well. The rapid speed at which technology changes, from preferred archival formats and software to interface design, means that digital curation can never be a static procedure. New web authoring tools such as Zeega, Klynt and 3WDOC will soon become integrated into web design in a similar fashion to the Web 2.0 tools we use now, creating further possibilities for the visual, immersive and interactive presentation of digital archive material.


Screenshot of the Fostex website from Dec 1998

Curation is an important aspect of digital preservation in general because it can facilitate long term use and engagement with your collection. What may be lost when archive sites become pruned and more self-consciously arranged is the spontaneous and sometimes chaotic experience of exploring information on the web.

Ultimately though, digital curation will enable more people to navigate archival collections in ways that can foster meaningful, transformative and informative encounters with digitised material.

The Magnetist – Audio Cassettes in Contemporary Culture

August 16th, 2013

As lovers of magnetic tape and obsolete media, we keep our eyes open for people who remain attached to the formats most have forgotten.

A recent film posted on Vimeo features the creative life of part time chef, noise musician and tape DJ Micke, also known as ‘The Magnetist’.

The film follows the Stockholm-based artist through his life as a ‘tapeologist.’ From demagnetising tape in order to create soundscapes, to running a club night comprised of tapes scavenged from wherever he can find them, Micke demonstrates how the audio cassette remains a source of inspiration within counter culture.

The Magnetist from Filibuster on Vimeo.

The wider resurgence of cassettes is evident from the forthcoming Cassette Store Day, an event that will be marked in record stores in the UK, USA, Europe and South America.

New tape labels are popping up all the time. Tapes are now often preferred to CD-Rs for short run albums in do it yourself punk culture, as releases blur the line between art object and collector item.

So what’s behind the sub-cultural obsession with the audio cassette tape? Perhaps it is no more complex than novelty value and nostalgia. It may however be evidence of the persistence of analogue technologies in an era where digital technologies appear to have colonised our relationship to sound and vision.

Is there a yearning to resist the ways digital media shapes how we listen to music, both at the level of sound quality, and the promiscuous skipping through mp3 files?

You simply can’t do that with tape. You have to rewind, fast forward or listen the whole way through. It’s a mechanical process, often shrouded in hiss.

What is certain is that, fashion or no fashion, the wheels on the Great Bear tape machines will keep turning.

C-120 Audio Cassette Transfer – the importance of high quality formats

August 16th, 2013

In archiving, the simple truth is formats matter. If you want the best quality recording, that not only sounds good but has a strong chance of surviving over time, it needs to be recorded on an appropriate format.

Most of us, however, do not have specialised knowledge of recording technologies and use what is immediately available. Often we record things within limited budgets, and need to make the most of our resources. We are keen to document what’s happening in front of us, rather than create something that will necessarily be accessible many years from now.

At the Great Bear we often receive people’s personal archives on a variety of magnetic tape. Not all of these tapes, although certainly made to ensure memories were recorded, were made on the best quality formats.

Recently we migrated a recording of a wedding service from 1970 made on C-120 audio cassette.

C-120 Audio Cassette

Image taken using a smart phone @ 72 dpi resolution

C60 and C90 tapes are probably familiar to most readers of this blog, but the C-120 was never widely adopted by markets or manufacturers because of its lesser recording quality. The C-120 records for an hour each side, and uses thinner tape than its C90 and C60 counterparts. This means the tape is more fragile and less likely to produce optimum recordings. Thinner tape is also more likely to suffer from ‘print-through’ echo.

As the Nakamichi 680 tape manual, which is pretty much consulted as the bible on all matters tape in the Great Bear studio, insists:

‘Choosing a high quality recording tape is extremely important. A sophisticated cassette deck, like the 680, cannot be expected to deliver superior performance with inferior tapes. The numerous brands and types of blank cassettes on the market vary not only in the consistency of the tape coating, but in the degree of mechanical precision as well. The performance of an otherwise excellent tape is often marred by a poor housing, which can result in skewing and other unsteady tape travel conditions.’

The manual goes on to stress ‘Nakamichi does not recommend the use of C-120 or ferrichrome cassettes under any circumstances.’ Strong words indeed!

It is usually possible to play back most of the tapes we receive, but a far greater risk is taken when recordings are made on fragile or low quality formats. The question that has to be thought through when making recordings is: what are you making them for? If they are meant to be a long term record of events, careful consideration of the quality of the recording format needs to be made to ensure they have the greatest chance of survival.

Such wisdom seems easy to grasp in retrospect, but what about contemporary personal archives that are increasingly ‘born digital’?

A digital equivalent of the C-120 tape would be the MP3 format. While MP3 files are easier to store, duplicate and move across digital locations, they offer substantially less quality than larger, uncompressed audio files, such as WAVs or AIFFs. The current recommended archival standard for digital audio is 24 bit/48 kHz, so if you are making new recordings, or migrating analogue tapes to digital formats, it is a good idea to ensure they are captured at this resolution.
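If you want to check whether your own files meet that standard, the soundfile Python library can read a file’s properties without loading the audio itself; the filename below is made up.

```python
import soundfile as sf  # assumes the soundfile library is installed

def meets_archival_standard(path):
    # True if the file is at least 48 kHz and stored as 24 bit PCM.
    info = sf.info(path)
    return info.samplerate >= 48000 and info.subtype == "PCM_24"

print(meets_archival_standard("wedding_1970_transfer.wav"))  # hypothetical
```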

In a recent article called ‘3 Ways to Change the World for Personal Archiving’ on the Library of Congress’ Digital Preservation blog, Bill LeFurgy wrote:

‘in the midst of an amazing revolution in computer technology, there is a near total lack of systems designed with digital preservation in mind. Instead, we have technology seemingly designed to work against digital preservation. The biggest single issue is that we are encouraged to scatter content so broadly among so many different and changing services that it practically guarantees loss. We need programs to automatically capture, organize and keep our content securely under our control.’

The issue of format quality also comes to the fore with the type of everyday records we make of our digital lives. The images and video footage we take on smart phones, for example, are often low resolution, and most people enjoy the flexibility of compressed audio files. In ten years’ time, will the records of our digital lives look pixelated and poor quality, despite the ubiquity of the high tech capture devices used to record and share them? Of course, these are all speculations, and as time goes on new technologies may emerge that focus on digital restoration, as well as preservation.

Ultimately, across analogue and digital technologies the archival principles are the same: use the best quality formats and it is far more likely you will make recordings that people many years from now can access.

Great Bear Studio Visit – Archive for Mathematical Sciences and Philosophy

July 16th, 2013

This week in the Great Bear Studio we are being visited by Michael Wright, Director of The Archive for Mathematical Sciences and Philosophy.

The Archive for Mathematical Sciences and Philosophy holds an extensive collection of audio and video recordings, made since the early 1970s, on subjects in mathematics, physics and philosophy, particularly the philosophy and foundations of mathematics and the exact sciences.

The website explains further the rationale for collecting the recordings:

Such recordings allow historians of science and mathematics to form a better appreciation of the background to the emergence of new ideas; and also of the complex pattern formed by “roads not taken” – ideas which for whatever reason were laid aside, or apparently subsumed in other developments. Those ideas may later re-emerge in ways yielding a new perspective on those developments. Such a rich archive of primary oral source material naturally aids historical study of the Sciences and the conceptual and philosophical questions to which they give rise.

The project started in 1973, when Michael, then a doctoral student, recorded lectures, seminars and courses relating to maths and philosophy. The early recordings were made in Oxford, London and Cambridge on an enthusiastic, if amateur, basis. In the 1980s and 1990s the recording process became more systematic, and more video recordings were taken. The archive is still collecting material, and Michael often travels to conferences and lectures to record contemporary debates in the field, as he is doing this week when he travels to Warsaw for the Samuel Eilenberg Centenary conference (there are recordings of Eilenberg’s lectures and an interview in the archive).

Michael Wright in the studio.

What started as a hobby for Michael has now become a full time commitment. The archive contains a staggering 37,000 recordings, those he made and ones solicited from other individuals. They include recordings of figures such as Imre Lakatos, Ilya Prigogine, contemporary philosopher Alain Badiou and many more.

The majority of recordings from 1973-2003 were recorded on audio cassette format, although some were done on reel-to-reel recorders. Many of these recordings remain on analogue tape, and the biggest challenge for the archive is now to find the funds to migrate several thousand hours of recordings to digital format.

The archive also tracks down and publishes existing material that may be held in other archives, or stored in people’s personal collections. For Michael, the biggest revelation in constructing the archive was discovering how much material people have sitting in the backs of their cupboards, either because they have forgotten it exists, or because they simply do not know what to do with it.

The archive became a charitable trust in 2008 and names among its trustees the English mathematical physicist and philosopher Sir Roger Penrose, and Martin Rees, former Master of Trinity College, Emeritus Professor of Cosmology at the University of Cambridge and former President of the Royal Society.

It’s an exciting, and transitional, time for the archive as it plans its next steps. In the coming years there are plans to develop the website by uploading ‘born digital’ information, to secure funds for wholesale digitisation of tape and paper resources and to continue collecting recordings. This ambitious project is well on its way to becoming a vital and unique contribution to the subject, and will interest many other people who are simply curious about these rich and complex topics.

 

