While this is by no means a final figure (and does not include the holdings of record companies and DATheads), it does suggest there is a significant amount of audio recorded on this obsolete format which, under certain conditions, is subject to catastrophic signal loss.
The condition we are referring to is that old foe of magnetic tape: mould.
In contrast to existing research on threats to DAT, which emphasises ‘known playback problems that are typically related to mechanical alignment’, the biggest challenge we consistently face with DATs is connected to mould.
It is certainly acknowledged that ‘environmental conditions, especially heat, dust, and humidity, may also affect cassettes.’
Nevertheless, the specific ways mould growth compromises the very possibility of successfully playing back a DAT tape have not yet been fully explored. This in turn shapes the kinds of preservation advice offered about the format.
What follows is an attempt to outline the problem of mould growth on DATs which, even in minimal form, can all but guarantee the loss of several seconds of a recording.
Tape width issues
The first problem with DATs is that they are 4mm wide, and very thin in comparison to other forms of magnetic tape.
The size of the tape is compounded by the helical method used in the format, which records the signal as a diagonal stripe across the tape. Because tracks are written onto the tape at an angle, if the tape splits it is not a neat split that can be easily spliced together.
The only way to deal with splits is to wind the tape back onto the tape transport or use leader tape to stick the tape back together at the breaking point.
Either way, you are guaranteed to lose a section of the tape because the helical scan has imprinted the recorded signal at a sharp, diagonal angle. If a DAT tape splits, in other words, it cuts through the diagonal signal, and because it is digital rather than analogue audio, this results in irreversible signal loss.
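To put ‘several seconds of recording’ into perspective, here is a back-of-the-envelope calculation. It assumes the standard-play DAT linear tape speed of 8.15 mm/s; the damaged length used is purely illustrative, as real splits vary from tape to tape:

```python
# Rough estimate of the audio lost when a DAT tape splits and must be
# rejoined. Assumes the standard-play DAT linear tape speed of 8.15 mm/s;
# the damaged length below is purely illustrative.
DAT_TAPE_SPEED_MM_PER_S = 8.15  # standard play

def audio_lost_seconds(damaged_length_mm: float) -> float:
    """Seconds of audio carried by a given length of damaged tape."""
    return damaged_length_mm / DAT_TAPE_SPEED_MM_PER_S

# Losing even 50 mm of tape around a split costs roughly six seconds of
# audio -- and because the signal is digital, that loss is irreversible.
print(round(audio_lost_seconds(50), 1))  # ≈ 6.1 seconds
```

Because the helical tracks are written diagonally across those few centimetres, there is no way to splice around the damage without cutting through recorded data.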
And why does the tape split? Because of the mould!
If you play back a DAT displaying signs of dormant mould growth, it is pretty much guaranteed to split in a horrible way. The tape therefore needs to be disassembled and wound by hand, which means you can spend a lot of time restoring DATs to a playable condition.
Rewinding by hand is, however, not foolproof, and this really highlights the challenges of working with mouldy DAT tape.
Often mould on DATs is visible on the edge of the tape pack because the tape has been so tightly wound it doesn’t spread to the full tape surface.
In most cases with magnetic tape, mould on the edge is good news because it means it has not spread and infected the whole of the tape. Not so with DAT.
Even tiny spots of mould on the edge of the tape are enough to stick it to the next layer as it is rewound.
When greater tension is applied in an attempt to release the stuck layers, the tape rips.
A plausible explanation for DAT tape ripping is that, because the tape is so narrow and thin, the mould bond between layers is structurally stronger than the tape itself.
When tape is thicker, as with 1/4″ open-reel tape, it is easier to brush off the dormant mould, which is why we don’t see the ripping problem with all kinds of tape.
Our experience confirms that brushing off dormant mould is not always possible with DATs which, despite best efforts, can literally peel apart because of sticky mould.
What, then, is to be done to ensure that the 3353 (and counting) DAT tapes in existence remain in a playable condition?
One tangible form of action is to check that your DATs are stored at the appropriate temperature (40–54°F [4.5–12°C]) so that no mould growth develops on the tape pack.
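For anyone monitoring a storage space, the recommended range quoted above is easy to check programmatically. A small sketch (the readings below are hypothetical examples, not measurements):

```python
# Illustrative check of a storage temperature against the recommended
# DAT storage range of 40-54 F (4.5-12 C) quoted above. The readings
# used here are hypothetical examples.
def f_to_c(temp_f: float) -> float:
    """Convert Fahrenheit to Celsius."""
    return (temp_f - 32) * 5 / 9

def within_storage_range(temp_f: float) -> bool:
    """True if a reading falls inside the recommended 40-54 F band."""
    return 40.0 <= temp_f <= 54.0

print(within_storage_range(48.0))  # True: safely inside the range
print(within_storage_range(68.0))  # False: typical room temperature is too warm
```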
The other thing to do is simple: get your DAT recordings reformatted as soon as possible.
While we want to highlight the often overlooked issue of mould growth on DATs, the problems with machine obsolescence, a lack of tape head hours and mechanical alignment problems remain very real threats to successful transfer of this format.
Our aim at the Great Bear is to continue our research in the area of DAT mould growth and publish it as we learn more.
As ever, we’d love to hear about your experiences of transferring mouldy DATs, so please leave a comment below if you have a story to share.
In the last Great Bear article we quoted sage advice from the International Association of Audiovisual Archivists: ‘Optimal preservation measures are always a compromise between many, often conflicting parameters.’ 
While this statement is true in general for many different multi-format collections, the issue of compromise and conflicting parameters becomes especially apparent with the preservation of digitized and born-digital video. The reasons for this are complex, and we shall outline why below.
Lack of standards (or are there too many formats?)
Carl Fleischhauer writes, reflecting on the Federal Agencies Digitization Guidelines Initiative (FADGI) research exploring Digital File Formats for Videotape Reformatting (2014), ‘practices and technology for video reformatting are still emergent, and there are many schools of thought. Beyond the variation in practice, an archive’s choice may also depend on the types of video they wish to reformat.’ 
We have written in depth on this blog about the labour intensity of digital information management in relation to reformatting and migration processes (which are of course Great Bear’s bread and butter). We have also discussed how the lack of settled standards tends to make preservation decisions radically provisional.
In contrast, we have written about default standards that have emerged over time through common use and wide adoption, highlighting how parsimonious, non-interventionist approaches may be more practical in the long term.
The problem for those charged with preserving video (as opposed to digital audio or images) is that ‘video, however, is not only relatively more complex but also offers more opportunities for mixing and matching. The various uncompressed-video bitstream encodings, for example, may be wrapped in AVI, QuickTime, Matroska, and MXF.’ 
What then, is this ‘mixing and matching’ all about?
It refers to all the possible combinations of bitstream encodings (‘codecs’) and ‘wrappers’ that are available as target formats for digital video files. Want to mix your JPEG2000 – Lossless with your MXF, or FFV1 with your AVI? Well, go ahead!
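The ‘mixing and matching’ can be sketched as a simple lookup. The table below is purely illustrative, built only from the pairings mentioned in this article (FFV1 with AVI, JPEG2000 with MXF, uncompressed video in AVI/QuickTime/Matroska/MXF); real compatibility matrices are far larger and more nuanced:

```python
# A purely illustrative wrapper/codec lookup, built from the pairings
# mentioned in the text. This is not an authoritative compatibility
# table -- real-world wrapper/codec support is far more complicated.
WRAPPER_CODECS = {
    "AVI": {"Uncompressed", "FFV1"},
    "QuickTime": {"Uncompressed", "FFV1"},
    "Matroska": {"Uncompressed", "FFV1"},
    "MXF": {"Uncompressed", "JPEG 2000 Lossless"},
}

def can_wrap(wrapper: str, codec: str) -> bool:
    """True if this illustrative table allows the codec inside the wrapper."""
    return codec in WRAPPER_CODECS.get(wrapper, set())

print(can_wrap("MXF", "JPEG 2000 Lossless"))  # True: the pairing above
print(can_wrap("AVI", "FFV1"))                # True
print(can_wrap("AVI", "JPEG 2000 Lossless"))  # False in this toy table
```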
What, then, is the difference between a codec and a wrapper?
As the FADGI report states: ‘Wrappers are distinct from encodings and typically play a different role in a preservation context.’ 
The wrapper or ‘file envelope’ stores key information about the technical life or structural properties of the digital object. Such information is essential for long term preservation because it helps to identify, contextualize and outline the significant properties of the digital object.
Information stored in wrappers can include:
Content (number of video streams, length of frames),
Context (title of object, who created it, description of contents, re-formatting history),
Video rendering (width, height, bit depth, colour model within a given colour space, pixel aspect ratio, frame rate, compression type, compression ratio and codec),
Audio rendering (bit depth, sample rate, bit rate, compression codec, type of uncompressed sampling).
Codecs, on the other hand, define the parameters of the captured video signal. They are a ‘set of rules which defines how the data is encoded and packaged,’ encompassing width, height, bit depth, colour model within a given colour space, pixel aspect ratio and frame rate, as well as the bit depth, sample rate and bit rate of the audio.
Although the wrapper is distinct from the encoded file, the encoded file cannot be read without its wrapper. The digital video file, then, comprises a wrapper and at least one codec, often two, to account for audio and images, as this illustration from AV Preserve makes clear.
Diagram taken from AV Preserve’s A Primer on Codecs for Moving Image and Sound Archives
Pick and mix complexity
Why then, are there so many possible combinations of wrappers and codecs for video files, and why has a settled standard not been agreed upon?
Fleischhauer at The Signal does an excellent job outlining the different preferences within practitioner communities, in particular relating to the adoption of ‘open’ and commercial/proprietary formats.
Compellingly, he articulates a geopolitical divergence between these two camps, with those based in the US allegedly opting for commercial formats, and those in Europe opting for ‘open.’ This observation is all the more surprising because of the advice in FADGI’s Creating and Archiving Born Digital Video: ‘choose formats that are open and non-proprietary. Non-proprietary formats are less likely to change dramatically without user input, be pulled from the marketplace or have patent or licensing restrictions.’ 
One answer to the question of why there are so many different formats lies in differing approaches to information management in an information-driven economy. The combination of competition and innovation results in a proliferation of open-source formats and their proprietary doubles (or triplets, quadruples, etc.) that are constantly evolving in response to market ‘demand’.
Impact of the Broadcast Industry
An important driver of change in this area is the broadcast industry.
Format selections in this sector have a profound impact on the creation of digital video files that will later become digital archive objects.
In the world of video, Kummer et al explain in an article in the IASA journal, ‘a codec’s suitability for use in production often dictates the chosen archive format, especially for public broadcasting companies who, by their very nature, focus on the level of productivity of the archive.’  Broadcast production companies create content that needs to be retrieved, often in targeted segments, with ease and accuracy. They approach the creation of digital video objects differently from an archivist, who would be concerned with maintaining file integrity rather than ensuring the source material’s productivity.
Furthermore, production contexts in the broadcast world have a very short life span: ‘a sustainable archiving decision will have to be made again in ten years’ time, since the life cycle of a production system tends to be between 3 and 5 years, and the production formats prevalent at that time may well be different to those in use now.’ 
Take, for example, H.264/AVC, ‘by far the most ubiquitous video coding standard to date. It will remain so probably until 2015 when volume production and infrastructure changes enable a major shift to H.265/HEVC […] H.264/AVC has played a key role in enabling internet video, mobile services, OTT services, IPTV and HDTV. H.264/AVC is a mandatory format for Blu-ray players and is used by most internet streaming sites including Vimeo, YouTube and iTunes. It is also used in Adobe Flash Player and Microsoft Silverlight and it has also been adopted for HDTV cable, satellite, and terrestrial broadcasting,’ writes David Bull in his book Communicating Pictures.
HEVC, which is ‘poised to make a major impact on the video industry […] offers the potential for up to 50% compression efficiency improvement over AVC.’ Furthermore, HEVC has a ‘specific focus on bit rate reduction for increased video resolutions and on support for parallel processing as well as loss resilience and ease of integration with appropriate transport mechanisms.’ 
Codecs developed for use in the broadcast industry deploy increasingly sophisticated compression techniques that reduce bit rate but retain image quality. As AV Preserve explain in their codec primer paper, ‘we can think of compression as a second encoding process, taking coded information and transferring or constraining it to a different, generally more efficient code.’ 
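AV Preserve’s description of compression as a ‘second encoding process’ can be illustrated with the simplest lossless scheme there is: run-length encoding. Real video codecs such as AVC and HEVC are vastly more sophisticated, but the principle of re-expressing already-coded data in a more efficient code is the same:

```python
# Toy run-length encoder: re-expresses a string in a more compact code.
# Real video compression is far more sophisticated, but the principle --
# a second, more efficient encoding of already-coded data -- is the same.
from itertools import groupby

def rle_encode(data: str) -> list[tuple[str, int]]:
    """Collapse each run of identical characters into (char, run_length)."""
    return [(char, len(list(run))) for char, run in groupby(data)]

def rle_decode(encoded: list[tuple[str, int]]) -> str:
    """Expand (char, run_length) pairs back into the original string."""
    return "".join(char * count for char, count in encoded)

frame_row = "AAAAAABBBCCCCCCCCAA"  # imagine a row of identical pixels
encoded = rle_encode(frame_row)
print(encoded)  # [('A', 6), ('B', 3), ('C', 8), ('A', 2)]
assert rle_decode(encoded) == frame_row  # lossless: decoding restores the input
```

Note that this toy scheme is lossless, like FFV1 or JPEG2000 Lossless; broadcast codecs typically add lossy steps that discard perceptually less important data to push bit rates down further.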
The explosion of mobile video data in the current media moment is one of the main reasons why sophisticated compression codecs are being developed. This should not pose any particular problems for the audiovisual archivist per se—if a file is ‘born’ with high degrees of compression, the authenticity of the file should not, ideally, be compromised in subsequent migrations.
Nevertheless, the influence of the broadcast industry tells us a lot about the types of files that will be entering the archive in the next 10-20 years. On a perceptual level, we might note an endearing irony: the rise of super HD and ultra HD goes hand in hand with increased compression applied to the captured signal. While compression cannot, necessarily, be understood as a simple ‘taking away’ of data, its increased use in ubiquitous media environments underlines how the perception of high definition is engineered in very specific ways, and this engineering does not automatically correlate with capturing more, or better quality, data.
Like the error correction we have discussed elsewhere on the blog, it is often the anticipation of malfunction that is factored into the design of digital media objects. These, in turn, create the impression of smooth, continuous playback—despite the chaos operating under the surface. The greater the clarity of the visual image, the more the signal has been squeezed and manipulated so that it can be transmitted with speed and accuracy.
Staying with the broadcast world, we will finish this article by focussing on the MXF wrapper that was ‘specifically designed to aid interoperability and interchange between different vendor systems, especially within the media and entertainment production communities. [MXF] allows different variations of files to be created for specific production environments and can act as a wrapper for metadata & other types of associated data including complex timecode, closed captions and multiple audio tracks.’ 
The Presto Centre’s latest TechWatch report (December 2014) asserts ‘it is very rare to meet a workflow provider that isn’t committed to using MXF,’ making it ‘the exchange format of choice.’ 
We can see such adoption in action with the Digital Production Partnership’s AS-11 standard, which came into operation October 2014 to streamline digital file-based workflows in the UK broadcast industry.
While the FADGI report highlights the instability of archival practices for video, the Presto Centre argue that practices are ‘currently in a state of evolution rather than revolution, and that changes are arriving step-by-step rather than with new technologies.’
They also highlight the key role of the broadcast industry as future archival ‘content producers,’ and the necessity of developing technical processes that can be complementary for both sectors: ‘we need to look towards a world where archiving is more closely coupled to the content production process, rather than being a post-process, and this is something that is not yet being considered.’ 
The world of archiving and reformatting digital video is undoubtedly complex. As the quote used at the beginning of the article states, any decision can only ever be a compromise that takes into account organizational capacities and available resources.
What is positive is the amount of research openly available that can empower people with the basics, or help them delve into the technical depths of codecs and wrappers if so desired. We hope this article has pointed you toward many of the interesting resources available and introduced some key issues.
As ever, if you have a video digitization project you need to discuss, contact us—we are happy to help!
 IASA Technical Committee (2014) Handling and Storage of Audio and Video Carriers, 6.
 Carl Fleischhauer, ‘Comparing Formats for Video Digitization,’ http://blogs.loc.gov/digitalpreservation/2014/12/comparing-formats-for-video-digitization/.
 Federal Agencies Digital Guidelines Initiative (FADGI), Digital File Formats for Videotape Reformatting Part 5. Narrative and Summary Tables. http://www.digitizationguidelines.gov/guidelines/FADGI_VideoReFormatCompare_pt5_20141202.pdf, 4.
 FADGI, Digital File Formats for Videotape, 4.
 AV Preserve (2010) A Primer on Codecs for Moving Image and Sound Archives & 10 Recommendations for Codec Selection and Management, www.avpreserve.com/wp-content/…/04/AVPS_Codec_Primer.pdf, 1.
 FADGI (2014) Creating and Archiving Born Digital Video Part III. High Level Recommended Practices, http://www.digitizationguidelines.gov/guidelines/FADGI_BDV_p3_20141202.pdf, 24.
 Jean-Christophe Kummer, Peter Kuhnle and Sebastian Gabler (2015) ‘Broadcast Archives: Between Productivity and Preservation’, IASA Journal, vol 44, 35.
 Kummer et al, ‘Broadcast Archives: Between Productivity and Preservation,’ 38.
 David Bull (2014) Communicating Pictures, Academic Press, 435-437.
 AV Preserve, A Primer on Codecs for Moving Image and Sound Archives, 2.
 For more reflections on compression, check out this fascinating talk from software theorist Alexander Galloway. The more practically bent can download and play with VISTRA, a video compression demonstrator developed at the University of Bristol ‘which provides an interactive overview of some of the key principles of image and video compression.’
 FADGI, Digital File Formats for Videotape, 11.
 Presto Centre, AV Digitisation and Digital Preservation TechWatch Report #3, https://www.prestocentre.org/, 9.
 Presto Centre, AV Digitisation and Digital Preservation TechWatch Report #3, 10-11.
Deciding when to digitise your magnetic tape collections can be daunting.
The Presto Centre, an advocacy organisation working to help ‘keep audiovisual content alive,’ have a graphic on their website which asks: ‘how digital are our members?’
They chart the different stages of ‘uncertainty,’ ‘awakening’, ‘enlightenment’, ‘wisdom’ and ‘certainty’ that organisations move through as they appraise their collections and decide when to re-format to digital files.
Similarly, the folks at AV Preserve offer their opinion on the ‘Cost of Inaction’ (COI), arguing that ‘incorporating the COI model and analyses into the decision making process around digitization of legacy physical audiovisual media helps organizations understand the implications and make well-informed decisions.’
They have even developed a COI calculator tool that organisations can use to analyse their collections. Their message is clear: ‘the cost of digitization may be great, but the cost of inaction may be greater.’
Digitising small-medium audiovisual collections
For small to medium-sized archives, digitising collections may provoke worries about a lack of specialist support or technical infrastructure. It may be felt that resources could be better used elsewhere in the organisation. Yet as we, and many other people working with audiovisual archives, often stress, the decision to transfer material stored on magnetic tape has to be made sooner or later. With smaller archives, where funding is limited, the question of ‘later’ is not really a practical option.
Furthermore, the financial cost of re-formatting audiovisual archives is likely to increase significantly in the next five to ten years; machine obsolescence will become an aggravated problem and it is likely to take longer to restore tapes prior to transfer if the condition of carriers has dramatically deteriorated. The question has to be asked: can you afford not to take action now?
If this describes your situation, you might want to hear about other small to medium sized archives facing similar problems. We asked one of our customers who recently sent in a comparatively small collection of magnetic tapes to share their experience of deciding to take the digital plunge.
We are extremely grateful to Annaig from the Medical Mission Sisters for answering the questions below. We hope it will be useful for other archives with similar issues.
1. First off, please tell us a little bit about the Medical Mission Sisters Archive, what kind of materials are in the collection?
The Medical Mission Sisters General Archives include the central archives of the congregation. They gather all the documents relating to the foundation and history of the congregation and also documents relating to the life of the foundress, Anna Dengel. The documents are mainly paper but there is a good collection of photographs, slides, films and audio documents. Some born digital documents are starting to enter the archives but they are still few.
2. As an archive with a modest collection of magnetic tapes, why did you decide to get the materials digitised now? Was it a question of resources, preservation concerns, access requests (or a mixture of all these things!)?
The main reason was accessibility. The documents on video tapes or audio tapes were the only usable ones, because we still had machines to read them, but all the older ones, or those with specific formats, were lost to the archives as there was no way to read them and know what was really on the tapes. Plus the Medical Mission Sisters is a congregation where Sisters are spread out on 5 continents; most of the time readers don’t come to the archives but send me queries by email, which I have to answer with scanned documents or digital files. Plus it was obvious that some of the tapes were degrading, and that we’d better have the digitisation done sooner rather than later if we wanted to still be able to read what was on them. Space and preservation was another issue. With a small collection but varied in formats, I had no resources to properly preserve every tape, and some of the older formats had huge boxes and were consuming a lot of space on the shelves. Now we have a reasonably sized collection of CDs and DVDs, which is easy to store in good conditions and is accessible everywhere, as we can read them on computer here and I can send them to readers via email.
3. Digital preservation is a notoriously complex, and rapidly evolving, field. As a small archive, how do you plan to manage your digital assets in the long term? What kinds of support, services and systems are you drawing on to design a system that is robust and resilient?
At the moment the digital collection is so small that it cannot justify any support service or system, so I have to build up my own home-made system. I am using the archives management software (CALM) to enter data relating to the conservation of the CDs or DVDs, dates of creation and dates to check them, and I plan to have regular checks on them, with migrations or copies made when necessary.
4. Aside from the preservation issue, what are your plans to use the digitised material that Great Bear recently transferred?
It all depends on the content of the tapes. But I’ve already spotted a few documents of interest, and I haven’t been through everything yet. My main concern now is to make the documents known and used for their content. I was already able to deliver a file to one of the Sisters who was working on a person related to the foundation of the congregation; the most important document on her was an audio file that I had just received from Great Bear, so I was able to send it to her. The document would have been unusable a few weeks before. I’ve come across small treasures, like a film, probably made by the foundress herself, which nobody was aware of. The Sisters are celebrating this year the 90th anniversary of their foundation. I plan to use as many audio or video documents as I can to support the events the archives are going to be involved in.
What is illuminating about Annaig’s answers is that her archive has no high-tech plan in place to manage the collection – her solutions for managing the material very much draw on non-digital information management practices.
The main issues driving the decision to migrate the materials are fairly common to all archives: limited storage space and accessibility for the user-community.
What lesson can be learnt from this? Largely, that if you are trained as an archivist, you are likely to already have the skills you need to manage your digital collection.
So don’t let the more bewildering aspects of digital preservation put you off. But do take note of the changing conditions for playing back and accessing material stored on magnetic tape. There will come a time when it will be too costly to preserve recordings on a wide variety of formats – many of which we can help you with today.
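For archives building the kind of home-made system Annaig describes, one concrete habit worth adopting is fixity checking: record a checksum for each file when it enters the collection, then recompute and compare on a schedule. A minimal sketch (the file names and manifest layout are hypothetical, but the approach is standard digital-preservation practice):

```python
# Minimal fixity-checking sketch for a small digital collection.
# Paths and manifest layout are hypothetical; the approach -- store a
# checksum at ingest, recompute it at each scheduled check -- is
# standard digital-preservation practice.
import hashlib
from pathlib import Path

def checksum(path: Path) -> str:
    """SHA-256 of a file, read in chunks so large files fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def verify(manifest: dict[str, str]) -> list[str]:
    """Return the files whose current checksum no longer matches the manifest."""
    return [name for name, digest in manifest.items()
            if checksum(Path(name)) != digest]

# At ingest: manifest[name] = checksum(Path(name)) for each file.
# At each scheduled check: verify(manifest) lists any files that have
# silently changed or corrupted since they were ingested.
```

Run on a schedule, a check like this catches silent corruption long before a reader asks for the file.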
At the beginning of 2015, the British Library launched the landmark Save Our Sounds project.
The press release explained:
‘The nation’s sound collections are under threat, both from physical degradation and as the means of playing them disappear from production. Archival consensus internationally is that we have approximately 15 years in which to save our sound collections by digitising them before they become unreadable and are effectively lost.’
Yes, you have read that correctly, dear reader: by 2030 it is likely that we simply will not be able to play many, if not all, of the tape formats we currently support at Great Bear. A combination of machine obsolescence, tape deterioration and, crucially, the widespread loss of the skills necessary to repair, service and maintain playback machines is responsible for this astounding situation. Together they will make it ‘costly, difficult and, in many cases, impossible’ to preserve our recorded audio heritage beyond the proposed cut-off date.
Yet whichever way you look at it, there is a need to take action to migrate any collections currently stored on obsolete media, particularly if you are part of a small organisation with limited resources. The reality is that it will become more expensive to transfer material as we move closer to 2030. The British Library project relates particularly to audio heritage, but the same principles apply to audiovisual collections too.
Yes, that rumbling you can hear is the sound of archivists the world over engaged in a flurry of selection and appraisal activities…
One of the most interesting things about discussions of obsolete media is that the question of operability is often framed as a matter of life or death.
Formats are graded according to their ‘endangered statuses’ in more or less explicit terms, as demonstrated on this Video Preservation website which offers the following ‘obsolescence ratings’:
‘Extinct: Only one or two playback machines may exist at specialist laboratories. The tape itself is more than 20 years old.
Critically endangered: There is a small population of ageing playback machinery, with no or little engineering or manufacturing support. Anecdotal evidence indicates that there are fewer working machine-hours than total population of tapes. Tapes may range in age from 40 years to 10 years.
Endangered: The machine population may be robust, but the manufacture of the machinery has stopped. Manufacturing support for the machines and the tapes becomes unavailable. The tapes are often less expensive, and more vulnerable to deterioration.
Threatened: The playback machines are available; however, either the tape format itself is unstable or has less integrity than other available formats, or it is known that a more popular or updated format will be replacing this one in a short period of time.
Vulnerable: This is a current but highly proprietary format.
Lower risk: This format will be in use over the next five years (1998-2002).’
The ratings on the video preservation website were made over ten years ago. A more comprehensive and regularly updated resource to consult is the Preservation Self-Assessment Program (PSAP), ‘a free online tool that helps collection managers prioritize efforts to improve conditions of collections. Through guided evaluation of materials, storage/exhibit environments, and institutional policies, the PSAP produces reports on the factors that impact the health of cultural heritage materials, and defines the points from which to begin care.’ As well as audiovisual media, the resource covers photo and image material, paper and book preservation. It also has advice about disaster planning, metadata, access and a comprehensive bibliography.
The good news is that fantastic resources do exist to help archivists make informed decisions about reformatting collections.
A Digital Compact Cassette
The bad news, of course, is that the problem faced by audiovisual archivists is a time-limited one, exacerbated no doubt by the fact that digital preservation practices on the ‘output end’ are far from stable. Finding machines to play back your Digital Compact Cassette collection, in other words, will only be a small part of the preservation puzzle. A life of file migrations in yet-to-be-designed wrappers and content-management systems awaits all kinds of reformatted audiovisual media in their life-to-come as digital archival objects.
Depending on the ‘content value’ of any collection stored on obsolete media, vexed decisions will need to be made about what to keep and what to throw away at this clinical moment in the history of recorded sound.
Sounding the fifteen-year warning
At such a juncture, when the fifteen-year warning has been sounded, perhaps we can pause for a second to reflect on the potential extinction of large swathes of audiovisual memory.
If we accept that any kind of recording both contains memory (of a particular historical event, or performance) and helps us to remember as an aide-mémoire, what are the consequences when memory storage devices which are, according to UNESCO, ‘the primary records of the 20th and 21st centuries’, can no longer be played back?
These questions are of course profound, and emerge in response to what are consequential historical circumstances. They are questions that we will continue to ponder on the blog as we reflect on our own work transferring obsolete media, and maintaining the machines that play them back. There are no easy answers!
Perhaps we will come to understand the 00s as a point of audiovisual transition, when mechanical operators still functioned, tape was still in fairly good shape, and it was an easy, almost throwaway decision to make a digital copy rather than an immense preservation conundrum. Where once there was a glut of archival data—and the potential to produce it—there is now the threat of abrupt and irreversible dropout.
We have recently digitised a U-matic video tape of eclectic Norwegian video art from the 1980s. The tape documents a performance by Kjartan Slettemark, an influential Norwegian/Swedish artist who died in 2008. The tape is the ‘final mix’ of a video performance entitled Chromakey Identity Blue, in which Slettemark live-mixed several video sources onto one tape.
The theoretical and practical impossibility of documenting live performance has been hotly debated in recent times by performance theorists, and there is some truth to those claims when we consider the encounter with Slettemark’s work in the Great Bear studio. The recording is only one aspect of the overall performance which, arguably, was never meant as a stand alone piece. This was certainly reflected in our Daily Mail-esque reaction to the video when we played it back. ‘Eh? Is this art?! I don’t get it!’ was the resounding response.
Having access to the wider context of the performance is sometimes necessary if the intentions of the artist are to be appreciated. Thankfully, Slettemark’s website includes part-documentation of Chromakey Identity Blue, and we can see how the different video signals were played back on various screens, arranged on the stage in front of (what looks like) a live TV audience.
Seen alongside this documentation, the performance immediately evokes the wider context of 70s/80s video art, which used the medium to explore the relationship between the body, space, screen and, in Slettemark’s case, the audience. A key part of Chromakey Identity Blue is the interruption of the audience’s presence in the performance, realised when their images are screened across the face of the artist, whose chroma key mask enables him to perform a ‘special effect’ that layers two images or video streams together.
What unfolds through Slettemark’s performance is at times humorous, suggestive and moving, largely because of the ways the faces of different people interact, perform or simply ignore their involvement in the spectacle. As Marina Abramović’s use of presence testifies, there can be something surprisingly raw and even confrontational about incorporating the face into relational art. As an ethical space, meeting with the ‘face’ of another became a key concept for the twentieth-century philosopher Emmanuel Levinas. The face locates, Bettina Bergo argues, ‘“being” as an indeterminate field’ in which ‘the Other as a face that addresses me […] The encounter with a face is inevitably personal.’
If an art work like Slettemark’s is moving then, it is because it stages moments where ‘faces’ reflect and interface across each other. Faces meet and become technically composed. Through the performance of personal-facial address in the artwork, it is possible to glimpse for a brief moment the social vulnerability and fragility such meetings engender. Brief because the seriousness is defused in Chromakey Identity Blue by a kitsch use of a disco ball that the artist moves across the screen to symbolically change the performed image, conjuring the magical feel of new technologies and how they facilitate different ways of seeing, being and acting in the world.
Videokunstarkivet (The Norwegian Video Art Archive)
The tape of Slettemark was sent to us by Videokunstarkivet, an exciting archival project mapping all the works of video art that have been made in Norway since the mid-1960s. Funded by the Norwegian Arts Council, the project has built the digital archival infrastructure from the bottom up, and those working on it have learnt a good many things along the way. Per Platou, who is managing the project, was generous enough to share some of the insights for readers of our blog, and a selection of images from the archive’s interface.
There are several things to be considered when creating a digital archive ‘from scratch’. Often at the beginning of a large project it is possible to look around for examples of best practice within your field. This isn’t always the case for digital archives, particularly those working almost exclusively with video files, whose communities of practice are unsettled and established ways of working few and far between. The fact that even in 2014, when digital technologies have been widely adopted throughout society, there is still not any firm agreement on standard access and archival file formats for video files indicates the peculiar challenges of this work.
Because of this, projects such as Videokunstarkivet face multiple challenges, with significant amounts of improvisation required in the construction of the project infrastructure. An important consideration is the degree of access users will have to the archive material. As Per explained, publicly re-publishing the archive material from the site in an always open access form is not a concern of the Videokunstarkivet, largely due to the significant administrative issues involved in gaining licensing and copyright permissions. ‘I didn’t even think there was a difference between collecting and communicating the work, yet after a while I saw there is no point in showing everything; it has to be filtered and communicated in a certain way.’
Instead, interested users will be given a research key or password which enables them to access the data and edit metadata where appropriate. If users want to re-publish or show the art in some form, contact details for the artist/copyright holder are included as part of the entry. Although the Videokunstarkivet deals largely with video art, entries on individual artists include information about other archival collections where their material may be stored, in order to facilitate further research. Contemporary Norwegian video artists are also encouraged to deposit material in the database, ensuring that ongoing collecting practices are built into the long-term project infrastructure.
Another big consideration in constructing an archive is what to collect. Per told us that video art in Norway really took off in the early 80s. Artists who incorporated video into their work weren’t necessarily specialists in the medium; ‘there just happened to be a video camera nearby so they decided to use it.’ Video was therefore often used alongside films, graphics, performance and text, making the starting point for the archive, according to Per, ‘a bit of a mess really.’ Nonetheless, Videokunstarkivet ‘approaches every artist like it was Edvard Munch,’ because it is very hard to know exactly what will be culturally valuable in 10, 20 or even 100 years from now. While it may not be appropriate to ‘save everything!’ for larger archival projects, for a self-contained and focused archival project such as the Videokunstarkivet, an inclusive approach may well be perfectly possible.
Building software infrastructures
Another important aspect of the project is the technical considerations – the actual building of the back and front ends of the software infrastructure that will be used to manage newly migrated digital assets.
It was very important that the Videokunstarkivet archive was constructed using Open Source software. This was necessary to ensure resilience in a rapidly changing technological context, and so the project could benefit from any improvements in the code as they are tested out by user communities.
The project uses an adapted version of the Digital Asset Management system Resource Space that was developed with LIMA, an organisation based in Holland that preserves, distributes and researches media art. Per explained that ‘since Resource Space was originally meant for photos and other “light” media files, we found it not so well suited for our actual tasks.’ Video files are of course far ‘heavier’ than image or even uncompressed audio files. This meant that there were some ‘pretty severe’ technical glitches in the process of establishing a database system that could effectively manage and play back large, uncompressed master and access copies. Through establishing the Videokunstarkivet archive they were ‘pushing the limits of what is technically possible in practice’, largely because internet servers are not built to handle large files, particularly not if those files are being transcoded back and forth across the file management system. In this respect, the project is very much ‘testing new ground’, creating an infrastructure capable of effectively managing, and enabling people to remotely access, large amounts of high-quality video data.
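To put the ‘heaviness’ of video in rough numbers, here is a back-of-the-envelope comparison. The resolution, frame rate and bit depths below are illustrative assumptions, not Videokunstarkivet’s actual settings:

```python
def video_rate_mbps(width, height, fps, bits_per_pixel):
    """Data rate of an uncompressed video stream, in megabits per second."""
    return width * height * fps * bits_per_pixel / 1e6

def audio_rate_mbps(sample_rate, bit_depth, channels):
    """Data rate of uncompressed PCM audio, in megabits per second."""
    return sample_rate * bit_depth * channels / 1e6

# PAL standard definition at 8-bit 4:2:2 averages 16 bits per pixel
video = video_rate_mbps(720, 576, 25, 16)  # ≈ 166 Mbit/s
audio = audio_rate_mbps(44_100, 16, 2)     # ≈ 1.4 Mbit/s
print(f"video ≈ {video:.0f} Mbit/s, roughly {video / audio:.0f}x heavier than audio")
```

At that rate an hour of uncompressed standard-definition video is on the order of 75 GB, which is why transcoding such files ‘back and forth across the file management system’ strains ordinary internet servers.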
Access files will be available to stream as open source WebM encodes (hi and lo) and x264 encodes (hi and lo), ensuring that streaming conditions can be adapted to individual server capabilities. The system is also set up to manage large-scale file transcoding should file format preferences change substantially. These changes can occur without compromising the integrity of the uncompressed master file.
The interface is built with Bootstrap which has been adapted to create ‘a very advanced access-layer system’ that enables Videokunstarkivet to define user groups and access requirements. Per outlined these user groups and access levels as follows:
‘- Admin: Access to everything (i.e. Videokunstarkivet team members)
– Research: Researchers/curators can see video works, and almost all the metadata (incl previews of the videos). They cannot download master files. They can edit metadata fields, however all their edits will be visible for other users (Wikipedia style). If a curator wants to SHOW a particular work, they’ll have to contact the artist or owner/gallery directly. If the artist agrees, they (or we) can generate a download link (or transcode a particular format) with a few clicks.
– Artist: Artists can up/download uncompressed master files freely, edit metadata and additional info (contact, cv, websites etc etc). They will be able to use the system to store digital master versions freely, and transcode files or previews to share with who they want. The ONLY catch is that they can never delete a master file – this is of course coming out of national archive needs.’
Per approached us to help migrate the Kjartan Slettemark tape because of the thorough approach and conscientious methodology we apply to digitisation work. As a media archaeology enthusiast, Per stressed that it was desirable for both aesthetic and archival reasons that the materiality of U-matic video was visible in the transferred file. He didn’t want the tape, in other words, to be ‘cleaned up’ in any way. To migrate the tape to digital file we used our standardised transfer chain for U-matic tape. This includes using an appropriate time-base corrector contemporary to the U-matic era, and conversion of the dub signal using a dedicated external dub to Y/C converter circuit.
We are very happy to be working with projects such as the Videokunstarkivet. It has been a great opportunity to learn about the nuts and bolts design of cutting-edge digital video archives, as well as discover the work of Kjartan Slettemark, whose work is not well-known in the UK. Massive thanks must go to Per for his generous sharing of time and knowledge in the process of writing this article. We wish the Videokunstarkivet every success and hope it will raise the profile of Norwegian video art across the world.
We are currently migrating a collection of tapes made by Irene Brown who, in the late 1960s, was a school teacher living in Inverness. Irene was a member of the Inverness folk club and had a strong interest in singing, playing guitar and collecting the musical heritage of folk and Gaelic culture.
The tapes, which were sent by her niece, Mrs. Linda Baublys, are documents of her Auntie’s passion, and include recordings Irene made of folk music sung in a mixture of Gaelic and English at the Gellions pub, Inverness, in the late 1960s.
The tapes also include recordings of her family singing together. Linda remembered fondly childhood visits to her ‘Granny’s house that was always filled with music,’ and how her Auntie used to ‘roar and sing.’
Perhaps most illustriously, the tapes include a prize-winning performance at the annual An Comunn Gaidhealach/ The National Mòd (now Royal National Mòd). The festival, which has taken place annually at different sites across Scotland since it was founded in 1892, is modelled on the Welsh Eisteddfod and acts ‘as a vehicle for the preservation and development of the Gaelic language. It actively encourages the teaching, learning and use of the Gaelic language and the study and cultivation of Gaelic literature, history, music and art.’ Mòd festivals also help to keep Gaelic culture alive among diasporic Scottish communities, as demonstrated by the US Mòd that has taken place annually since 2008.
If you want to find out more about Gaelic music visit the Year of the Song website run by BBC Alba, where you can access a selection of songs from the BBC’s Gaelic archive. If you prefer doing research in archives and libraries take a visit to the School of Scottish Studies Archives. Based at the University of Edinburgh, the collection comprises a significant sound archive containing thousands of recordings of songs, instrumental music, tales, verse, customs, beliefs, place-names, biographical information and local history, encompassing a range of dialects and accents in Gaelic, Scots and English.
As well as learning some of the songs recorded on the tape to play herself, Linda plans to eventually deposit the digitised transfers with the School of Scottish Studies Archives. She will also pass the recordings on to a local school that has a strong engagement with traditional Gaelic music.
Digitising and country lanes
Linda told us it was a ‘long slog’ to get the tapes. After Irene died at the age of 42 it was too upsetting for her mother, and Linda’s Granny, to listen to them. The tapes were then passed onto Linda’s mother who also never played the tapes, so when she passed away Linda, who had been asking for the tapes for nearly 20 years, took responsibility to get them digitised.
The tapes were in fairly good condition and minimal problems arose in the transfer process. One of the tapes was, however, suffering from ‘country-laning’. This is when the shape of the tape has become bendy (like a country lane), most probably because it has been stored in fluctuating temperatures which cause the tape to shrink and expand. It is more common in acetate-backed tape, although Linda’s tapes were polyester-backed. Playing a tape suffering from country-laning often results in problems with the azimuth, because the angle between the tape head and the tape is misaligned. A signal can still be discerned, because analogue recordings rarely drop out entirely (unlike digital tape), but the recording may waver or otherwise be less audible. When the tape has been deformed in this way it is very difficult to totally reverse the process. Consequently there has to be some compromise in the quality of the transfer.
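The azimuth effect can be quantified with the standard azimuth-loss formula, 20·log10(sin x / x) where x = π·w·tan(α)/λ, for track width w, misalignment angle α and recorded wavelength λ. The sketch below uses illustrative values (a ~1 mm track and a 10 kHz tone at 7.5 ips), not measurements of Linda’s tapes:

```python
import math

def azimuth_loss_db(track_width_mm, misalignment_deg, wavelength_mm):
    """Standard azimuth-loss formula: 20*log10(sin(x)/x),
    where x = pi * track width * tan(misalignment) / recorded wavelength."""
    x = math.pi * track_width_mm * math.tan(math.radians(misalignment_deg)) / wavelength_mm
    if x == 0:
        return 0.0  # perfect alignment: no loss
    return 20 * math.log10(abs(math.sin(x) / x))

# 10 kHz recorded at 7.5 ips (190.5 mm/s) has a wavelength of ~0.019 mm
loss = azimuth_loss_db(1.0, 1.0, 0.01905)
print(f"{loss:.1f} dB")  # a 1 degree error already costs around 20 dB at 10 kHz
```

Because the loss grows with frequency, a country-laned tape wavers most audibly in the treble, exactly as described above.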
We hope you will enjoy this excerpt from the tapes, which Linda has kindly given us permission to include in this article.
If you are new to the world of digital preservation, you may be feeling overwhelmed by the multitude of technical terms and professional practices to contend with, and the fact that standards never seem to stay in place for very long.
Fortunately, there are many resources related to digital preservation available on the internet. Unfortunately, the large number of websites, hyperlinks and sub-sections can exacerbate those confounded feelings.
In order to help the novice, nerd or perplexed archivist wanting to learn more, we thought it would be useful to compile a selection of (by no means exhaustive) resources to guide your hand. Ultimately if content is to be useful it does need to be curated and organised.
Bear in mind that individual websites within the field tend to be incredibly detailed, so it is worth having a really good explore to find the information you need! And, as is the norm with the internet, one click leads to another so before you know it you stumble upon another interesting site. Please feel free to add anything you find to the comment box below so the list can grow!
AV Preserve is a US-based consultancy that works in partnership with organisations to help them implement digital information preservation and dissemination plans. They have an amazing ‘papers and presentations’ section of their website, which includes research about diverse areas such as assessing cloud storage, digital preservation software, metadata, making an institutional case for digital preservation, managing personal archives, primers on moving image codecs, disaster recovery and many more. It is a treasure trove, and there is a regularly updated blog to boot!
The Digital Preservation Coalition’s website is full of excellent resources including a digital preservation jargon buster, case studies, a preservation handbook and a ‘what’s new’ section. The Technology Watch Reports are particularly useful. Of relevance to the work Great Bear does is ‘Preserving Moving Pictures and Sound’, but there are many others, including Intellectual Property and Copyright, Preserving Metadata and Digital Forensics.
Preservation Guide Wiki – Set up by Richard Wright of the BBC as early as 2006, the wiki provides advice on getting started in audiovisual digital preservation and on developing a strategy at institutional and project-based levels.
The PrestoCentre’s website is an amazing resource to explore if you want to learn more about digital preservation. The organisation aims to ‘enhance the audiovisual sector’s ability to provide long-term access to cultural heritage’. They have a very well stocked library composed of tools, case studies and resources, as well as a regularly updated blog.
The A/V Artifact Atlas is a community-generated resource for people working in digital preservation and aims to identify problems that occur when migrating tape-based media. The Atlas is made in a wiki-format and welcomes contributions from people with expertise in this area – ‘the goal is to collectively build a comprehensive resource that identifies and documents AV artifacts.’ The Atlas was created by people connected to the Bay Area Video Coalition, a media organisation that aims to inspire ‘social change by empowering media makers to develop and share diverse stories through art, education and technology.’
Richard Hess is a US-based audio restoration expert. Although his website looks fairly clunky, he is very knowledgeable and well-respected in the field, and you can find all kinds of esoteric tape wisdom on there.
The National Film and Sound Archive of Australia have produced an in-depth online Preservation Guide. It includes a film preservation handbook, an audiovisual glossary, advice on caring for your collection and disaster management.
The British Library’s Playback and Recording Equipment directory is well worth looking through. Organised chronologically (from 1877 – 1990s), by type and by model, it includes photos, detailed descriptions and you can even view the full metadata for the item. So if you ever wanted to look at a Columbia Gramophone from 1901 or a SONY O-matic tape recorder from 1964, here is your chance!
In 2005 UNESCO declared 27 October to be World Audiovisual Heritage Day. The web pages are an insight into the way audiovisual heritage is perceived by large, international policy bodies.
Be sure to take advantage of the 35 open access digital heritage articles published by Routledge. The articles are from the International Journal of Heritage Studies, Archives and Records, Journal of the Institute of Conservation, Archives and Manuscripts and others.
The Digital Curation Centre works to support Higher Education Institutions to interpret and manage research data. Again, this website is incredibly detailed, presenting case studies, ‘how-to’ guides, advice on digital curation standards, policy, curation lifecycle and much more.
Europeana is a multi-lingual online collection of millions of digitized items from European museums, libraries, archives and multi-media collections.
Digital Preservation Tools and Software
For open source digital preservation software check out The Open Planets Foundation (OPF), which addresses core digital preservation challenges by engaging with its members and the community to develop practical and sustainable tools and services to ensure long-term access to digital content. The website also includes the very interesting Atlas of Digital Damages.
The BBC’s R & D Archive is an invaluable resource of white papers, research and policy relating to broadcast technology from the 1930s onwards. As the website states, ‘whether it’s noise-cancelling microphones in the 1930s, the first transatlantic television transmission in the 1950s, Ceefax in the 1970s, digital radio in the 1990s and HD TV in the 2000s, or the challenge to “broadcasting” brought about by the internet and interactive media, BBC Research & Development has led the way with innovative technology and collaborative ways of working.’
As mentioned above, please feel free to add your website or project to the comment box below. We will continue to update this list!
The history of amateur recording is peppered with examples of people who stretched technologies to their creative limit. Whether this comes in the form of hours spent trying things out and learning through doing, endlessly bouncing tracks in order to turn an 8-track recording into a 24-track epic or making high quality audio masters on video tape, people have found ways to adapt and experiment using the tools available to them.
One of the lesser known histories of amateur home recordings is making high quality stereo mixdowns and master recordings from multi-track audio tape onto consumer-level Hi-Fi VCRs.
We are currently migrating a stereo master VHS Hi-Fi recording of London-based indie band Hollow Hand. Hollow Hand later adopted the name Slanted and were active in London between 1992 and 1995. The tapes were sent in by Mark Venn, the bass player with Slanted and engineer for these early recordings, which were made in 1992 in the basement of a Clapham squat. Along with the Hi-Fi VHS masters, we have also been sent eight reels of AMPEX ¼″ tape of Slanted that are being transferred for archival purposes. Mark intends to remix the eight-track recordings digitally but as yet has no plans for a re-release.
When Mark sent us the tapes to be digitised he thought they had been encoded with a SONY PCM processor, a hybrid digital/analogue recording method we have covered in a previous blog post. The tapes had, however, been recorded directly from the FOSTEX eight-track recorder to the stereo Hi-Fi function on a VHS video tape machine. For Mark at the time this was the best way to get a high quality studio master, because other analogue and digital tape options, such as Studer open reel-to-reel and DAT machines, were financially off-limits to him. It is worth mentioning that Hi-Fi audio technologies were introduced in the VHS model by JVC around 1984, so using this method to record stereo masters would have been fairly rare, even among people who did a lot of home recording. It was certainly a bit of a novelty in the Great Bear Studio – they are the first tapes we have ever received that were recorded in this way – and, take it from us, we see a lot of tape.
Using the Hi-Fi function on VHS tape machines was probably as good as it got in terms of audio fidelity for those working in an exclusively analogue context. It produced a master recording comparable in quality to a CD, particularly if the machine had manual audio recording level control. This is because, as we wrote about in relation to PCM/Betamax, video tape could accommodate greater bandwidth than audio tape (particularly audio cassette), therefore leading to better quality recordings.
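As a rough illustration of the ‘comparable to CD’ claim: a digital system’s quantisation-limited dynamic range is about 6 dB per bit, and published figures for VHS Hi-Fi’s analogue AFM audio are typically around 90 dB. These are generic textbook figures, not measurements of Mark’s tapes:

```python
import math

def digital_dynamic_range_db(bits):
    """Theoretical quantisation-limited dynamic range: ~6.02 dB per bit."""
    return 20 * math.log10(2 ** bits)

cd = digital_dynamic_range_db(16)
print(f"CD (16-bit PCM): {cd:.1f} dB")  # ≈ 96.3 dB
# VHS Hi-Fi AFM audio is commonly quoted at around 90 dB; compact cassette
# without noise reduction manages roughly 50-60 dB.
```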
One of our replacement upper head drums
VHS Hi-Fi audio is achieved using audio frequency modulation (AFM) and relies on a form of magnetic recording called ‘depth multiplexing’. This is when
‘the modulated audio carrier pair was placed in the hitherto-unused frequency range between the luminance and the colour carrier (below 1.6 MHz), and recorded first. Subsequently, the video head erases and re-records the video signal (combined luminance and colour signal) over the same tape surface, but the video signal’s higher centre frequency results in a shallower magnetization of the tape, allowing both the video and residual AFM audio signal to coexist on tape.’
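The frequency modulation named in the quote above can be sketched in a few lines. The carrier and deviation values here are only notional stand-ins, not the exact VHS Hi-Fi specification:

```python
import math

def fm_modulate(audio, sample_rate, carrier_hz, deviation_hz):
    """Frequency-modulate a signal onto a carrier: the carrier's
    instantaneous frequency is pushed up and down by the audio, and the
    phase is the running sum (integral) of that frequency."""
    out, phase = [], 0.0
    for x in audio:  # audio samples in -1..1
        inst_freq = carrier_hz + deviation_hz * x
        phase += 2 * math.pi * inst_freq / sample_rate
        out.append(math.sin(phase))
    return out

# A 1 kHz test tone modulated onto a notional ~1.4 MHz audio carrier;
# the simulation sample rate must sit well above the carrier frequency.
sr = 10_000_000
tone = [math.sin(2 * math.pi * 1000 * n / sr) for n in range(1000)]
rf = fm_modulate(tone, sr, 1_400_000, 150_000)
```

On tape, it is this modulated carrier pair that is written deep into the magnetic layer before the shallower video recording is layered over it.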
Challenges for migrating Hi-Fi VHS Audio
Although the recordings of Hollow Hand are in good working condition, analogue masters recorded to VHS Hi-Fi audio do face particular challenges in the migration process.
Playing back the tapes in principle is easy if both tape and machine are in optimum condition, but if either are damaged the original recordings can be hard to reproduce.
A particular problem for Hi-Fi audio emerges when the tape heads wear: it becomes harder to track the Hi-Fi audio recording because the radio frequency (RF) signal can’t be read consistently off the tape. Hi-Fi recordings are harder to track because of depth multiplexing, namely the position of the recorded audio relative to the video signal. Even though there is no video signal as such in the playback of Hi-Fi audio, the video signal is still there, layered on top of the audio signal, essentially making it harder to access. Of course when tape heads/drums wear down they can always be replaced, but acquiring spare parts will become increasingly difficult in years to come, making Hi-Fi audio recordings on VHS particularly threatened.
In order to migrate tape-based media to digital files in the most effective way possible, it is important to use appropriate machines for the transfer. The Panasonic AG-7650 we used to transfer the Hollow Hand tapes afforded us great flexibility because it is possible to select which audio tracks are played back at any given time, meaning we could isolate the Hi-Fi audio track. The Panasonic AG-7650 also has tracking meters, which make it easy to assess and adjust the tracking of the tape and tape head where necessary.
As ever, the world of digitisation continues to generate anomalies, surprises and good stories. Who knows how many other video/audio hybrid tapes are out there! If you do possess an archive collection of such tapes we advise you to take action to ensure they are migrated because of the unique problems they pose as a storage medium.
We are pleased to announce that we are now able to support the transfer of 2″ Quadruplex Video Tape (PAL, SECAM & NTSC) to digital formats.
2” Quad was a popular broadcast analogue video tape format whose halcyon period ran from the late 1950s to the 1970s. The first quad video tape recorder, made by AMPEX in 1956, cost a modest $45,000 (that’s $386,993.38 in today’s money).
2” Quad revolutionized TV broadcasting which previously had been reliant on film-based formats, known in the industry as ‘kinescope‘ recordings. Kinescope film required significant amounts of skilled labour as well as time to develop, and within the USA, which has six different time zones, it was difficult to transport the film in a timely fashion to ensure broadcasts were aired on schedule.
To counter these problems, broadcasters sought to develop magnetic recording methods, that had proved so successful for audio, for use in the television industry.
The first experiments directly adapted the longitudinal recording method used to record analogue audio. This however was not successful, because video recordings require far more bandwidth than audio. Recording a video signal with stationary tape heads (as they are in the longitudinal method) meant that the tape had to be run at a very high speed in order to accommodate sufficient bandwidth to reproduce a good quality video image. A lot of tape was used!
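The arithmetic behind ‘a lot of tape’ is simple: a head cannot resolve wavelengths much shorter than about twice its gap, so the tape speed needed scales with bandwidth. The gap and bandwidth figures below are assumed for illustration, not taken from any specific early machine:

```python
def required_tape_speed(bandwidth_hz, min_wavelength_m):
    """Tape speed (m/s) needed to fit the highest frequency onto tape:
    speed = frequency * shortest recordable wavelength."""
    return bandwidth_hz * min_wavelength_m

# Assume a ~1 micron head gap, i.e. a shortest wavelength of ~2 microns
video_speed = required_tape_speed(5e6, 2e-6)   # ~5 MHz video signal
audio_speed = required_tape_speed(20e3, 2e-6)  # ~20 kHz audio signal
print(f"video: {video_speed:.0f} m/s, audio: {audio_speed:.2f} m/s")
# 10 m/s is roughly 400 inches per second of tape flying past a fixed head
```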
Ampex, who at the time owned the trademark marketing name for ‘videotape’, then developed a method where the tape heads moved quickly across the tape, rather than the other way round. On the 2” quad machine, four magnetic record/reproduce heads are mounted on a headwheel spinning transversely (width-wise) across the tape, striking the tape at a 90° angle. The recording method was not without problems because, as the Toshiba Science Museum writes, it ‘combined the signal segments from these four heads into a single video image’, which meant that ‘some colour distortion arose from the characteristics of the individual heads, and joints were visible between signal segments.’
The limitations of Quadruplex recording influenced the development of the helical scan method, which was invented in Japan by Dr. Kenichi Sawazaki of the Mazda Research Laboratory, Toshiba, in 1954. Helical scanning records each segment of the signal as a diagonal stripe across the tape. ‘By forming a single diagonal, long track on two-inch-wide tape, it was possible to record a video signal on one tape using one head, with no joints’, resulting in a smoother signal. Helical scanning was later widely adopted as a recording method in broadcast and domestic markets due to its simplicity, flexibility, reliability and economical use of tape.
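The advantage of spinning the heads can also be put in numbers: the head tip sweeps π × drum diameter per revolution, giving an enormous head-to-tape ‘writing speed’ while the tape itself crawls past. The diameter and rotation rate below are approximate quad-era figures, not exact Ampex specifications:

```python
import math

def writing_speed(drum_diameter_m, revs_per_second):
    """Head-to-tape speed (m/s) for a rotating-head recorder."""
    return math.pi * drum_diameter_m * revs_per_second

quad_head = writing_speed(0.052, 250)  # ~5.2 cm headwheel at ~250 rev/s
linear_tape = 0.38                     # quad's linear tape speed, ~15 ips
print(f"head-to-tape ≈ {quad_head:.0f} m/s vs {linear_tape} m/s of tape motion")
```

The same trick, with a tilted drum and a slow-moving tape, is what later made helical scan so economical.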
This brief history charting the development of 2″ Quad recording technologies reveals that efficiency and cost-effectiveness, alongside media quality, were key factors driving the innovation of video tape recording in the 1950s.
We understand that when organisations decide to digitise magnetic tape collections the whole process can take significant amounts of time. From initial condition appraisals, to selecting which items to digitise, many questions, as well as technical and cultural factors, have to be taken into account before a digital transfer can take place.
This is further complicated by that fact that money is not readily available for larger digitisation projects and specific funding has to be sought. Often an evidence base has to be collected to present to potential funders about the value and importance of a collection, and this involves working with organisations who have specific expertise in transferring tape-based collections to digital formats to gain vital advice and support.
We are very happy to work with organisations and institutions during this crucial period of collection assessment and bid development. We understand that even during the pre-application stage informed decisions need to be made about the condition of tapes, along with realistic expectations of what treatments may be required during a particular digitisation project. We are very willing to offer the support and advice that will hopefully contribute to the development of a successful bid.
For example, we were recently contacted by Ken Turner, who was involved in Action Space, an experimental community theatre group established in 1968. Ken has a collection of nearly 40 EIAJ SONY video tapes that were made in the 1980s. Because of the nature of the tapes, which almost always require treatment before they can be played back, transferring the whole collection will be fairly expensive, so funding will be necessary to make the project happen. We have offered to do a free assessment of the tapes and provide a ten-minute sample of the transfer that can be used as part of an evidence base for a funding bid.
Potential Problems with EIAJ ½ Video Tapes
The EIAJ video tape recorder was developed in the late 1960s and is a fairly important format in the history of recordable media. As the first standardized video tape machine, it could play back tapes made by different companies, making video use far cheaper and more widespread, particularly within a domestic context. The EIAJ standard had a similar democratising impact on non-professional video recording due to its portability, low cost and versatility.

As mentioned above, the EIAJ tapes almost always require treatment before they can be played back, particularly the SONY V30-H and V60-H tapes. Problems with the tape are indicated by squealing and shedding upon playback. This is an example of what the AV Artifact Atlas describes as stiction, ‘when media suffering from hydrolysis or contamination is restricted from moving through the tape path correctly.’ When stiction occurs the tape needs to be removed from the transport and treated immediately, through baking and cleaning, before the transfer can be completed.
EIAJ tapes that have a polyethylene terephthalate ‘back coating’ or ‘substrate’ may also be affected by temperature or humidity changes in their storage environment. These may have caused the tape pack to expand or contract, resulting in permanent distortion of the tape backing. Such problems are exacerbated by the helical scan method of recording common to video tape, which records parallel tracks that run diagonally across the tape from one edge to the other. If the angle that the recorded tracks make with the edge of the tape does not correspond to the scan angle of the head (which always remains fixed), mistracking and information loss can occur. Correcting tracking errors is fairly easy, as most machines have in-built tracking controls. Some of the earliest SONY CV ½ inch video tape machines didn’t have this function, however, so this presents serious problems for the migration of these tapes if their back coating has suffered deformation.
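To see why a small back-coating deformation matters, consider the geometry: if the recorded track’s angle drifts even fractionally from the head’s fixed scan angle, the displacement accumulates along the whole length of the track. The figures below are purely illustrative, not EIAJ specifications:

```python
import math

def mistracking_um(track_length_mm, angle_error_deg):
    """Lateral drift (microns) between a recorded track and the head's
    fixed scan path, accumulated over the length of one track."""
    return track_length_mm * math.tan(math.radians(angle_error_deg)) * 1000

offset = mistracking_um(100, 0.05)  # a 0.05 degree error over a 100 mm track
print(f"offset ≈ {offset:.0f} microns")  # ≈ 87 microns
# If the track pitch is on the order of 100 microns, the head has strayed
# most of the way onto the neighbouring track by the end of the scan.
```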
The possibility of collaboration
We are excited about the possibility of working with the Action Space collection, mainly because we would love the opportunity to learn more about their work. Like many other theatre groups who were established in the late 1960s, Action Space wanted to challenge the elitism of art and make it accessible to everyone in the community. In their 1972 annual report, which is archived on the Unfinished Histories: Recording the History of Alternative Theatre website, they describe the purposes of the company as follows:
‘Its workings are necessarily experimental, devious, ambiguous, and always changing in order to find a new situation. In the short term the objectives are to continually question and demonstrate through the actions of all kinds new relationships between artists and public, teachers and taught, drop-outs and society, performers and audiences, and to question current attitudes of the possibility of creativity for everyone. For the longer term the aim is to place the artists in a non-elite set up, to keep “normal” under revision, to break barriers in communication and to recognise that education is a continuing process.’
Although Action Space disbanded in 1981, the project was relaunched in the same year as Action Space Mobile, who are still operating today. Central to Action Space Mobile’s philosophy is that they are an arts company ‘that has always worked with people, believing that contact and participation in the arts can change lives positively.’ There is also the London-based ActionSpace, who work with artists with learning disabilities.
We hope that offering community heritage projects the possibility of collaboration will help them to benefit from our knowledge and experience. In turn we will have interesting things to watch and listen to, which is part of what makes working in the digitisation world fun and enjoyable.
We are now used to living in a born-digital environment, but the transition from analogue to digital technologies did not happen overnight. In the late 1970s, early digital audio recordings were made possible by a hybrid analogue/digital system. It combined the humble transport and recording mechanisms of the video tape machine with a not-so-humble PCM (pulse-code modulation) digital processor. Together they created the first two-channel stereo digital recording system.
The first professional digital audio processor, made by SONY, was the PCM-1600. It was introduced in 1978 and used a U-Matic tape machine as its transport. Later models, the PCM-1610/1630, acted as the first standard for mastering audio CDs in the 1980s. SONY employee Toshitada Doi, whose impressive CV includes the development of the PCM adaptor, the Compact Disc and the CIRC error correction system, visited recording studios around the world in an effort to facilitate the professional adoption of PCM digital technologies. He was not, however, welcomed with open arms, as the SONY corp. website explains:
‘Studio engineers were opposed to digital technology. They criticized digital technology on the grounds that it was more expensive than analogue technology and that it did not sound as soft or musical. Some people in the recording industry actually formed a group called MAD (Musicians Against Digital), and they declared their position to the Audio Engineering Society (AES).’
Several consumer/semi-professional models were marketed by SONY in the 70s and 80s, starting with the PCM-1 (1977). In a retro review of the PCM-F10 (1981), Dr Frederick J. Bashour explains that
‘older model VCRs often worked better than newer ones since the digital signal, as seen by the VCR, was a monochrome pattern of bars and dots; the presence of modern colour tweaking and image compensation circuits often reduced the recording system’s reliability and, if possible, were turned off.’
Why did the evolution of an emerging digital technology stand on the shoulders of what had, by 1981, become a relatively mature analogue technology? It all comes down to the issue of bandwidth. A high quality PCM audio recording required 1–1.5 MHz of bandwidth, far greater than that of a conventional analogue audio signal (15–20 kHz). While this bandwidth was beyond the scope of analogue audio recording technology of the time, video tape recorders did have the capacity to record signals with higher bandwidths.
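The arithmetic behind this bandwidth gap is easy to check. The sketch below (Python, illustrative figures only) computes the raw data rate of a two-channel 16-bit PCM stream and compares it with the top of a conventional analogue audio signal:

```python
# Raw PCM data rate vs conventional analogue audio bandwidth.
# Figures are illustrative; real recorders add error-correction overhead.

def pcm_bit_rate(sample_rate_hz, bits_per_sample, channels):
    """Raw PCM data rate in bits per second (no ECC or framing overhead)."""
    return sample_rate_hz * bits_per_sample * channels

stereo_pcm = pcm_bit_rate(44_100, 16, 2)
print(f"Raw stereo PCM: {stereo_pcm / 1e6:.3f} Mbit/s")  # 1.411 Mbit/s

# A conventional analogue audio signal tops out around 20 kHz:
print(f"vs 20 kHz analogue bandwidth: roughly {stereo_pcm / 20_000:.0f}x greater")
```

The resulting rate, around 1.4 Mbit/s, sits comfortably in the 1–1.5 MHz region quoted above, which only a video recorder of the era could handle.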
If you have ever wondered where the 16-bit/44.1 kHz sampling standard for the CD came from, it was because in the early 1980s, when the CD standard was agreed, there was no other practical way of storing digital sound than a PCM converter and video recorder combination. As the Wikipedia entry for the PCM adaptor explains, ‘the sampling frequencies of 44.1 and 44.056 kHz were thus the result of a need for compatibility with the 25-frame (CCIR 625/50 countries) and 30-frame black and white (EIA 525/60 countries) video formats used for audio storage at the time.’ The sampling rate was adopted as the standard for CDs and, unlike many other things in our rapidly changing technological world, it hasn’t changed since.
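The derivation can be sketched numerically. A PCM adaptor stored three audio samples per usable video line; the active line counts below (245 per field for 525/60 monochrome video, 294 per field for 625/50) are the commonly cited figures, so treat this as an illustration rather than a full account of the standard:

```python
# Why 44.1 kHz: three audio samples fit on each usable video line,
# and both major video standards yield the same product.

def samples_per_second(active_lines_per_field, fields_per_second, samples_per_line=3):
    """Audio samples stored per second on a given video format."""
    return active_lines_per_field * fields_per_second * samples_per_line

print(samples_per_second(245, 60))  # 525/60 monochrome video -> 44100
print(samples_per_second(294, 50))  # 625/50 video            -> 44100
```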
The fusion of digital and analogue technologies did not last long: the introduction of DAT tapes in 1987 rendered the PCM converter/video tape system largely obsolete. DAT recorders did essentially the same job as the PCM/video combination but came in one, significantly smaller, machine. DAT machines had the added advantage of accepting multiple sampling rates (the standard 44.1 kHz, as well as 48 kHz and 32 kHz, all at 16 bits per sample, plus a special LP recording mode using 12 bits per sample at 32 kHz for extended recording time).
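Some illustrative arithmetic makes the trade-offs between these modes clear; the figures below are the raw stereo data rate of each DAT mode listed above, ignoring subcode and error-correction overhead:

```python
# Raw two-channel PCM data rate for each DAT recording mode.
modes = {
    "48 kHz / 16-bit (standard)": (48_000, 16),
    "44.1 kHz / 16-bit (CD-compatible)": (44_100, 16),
    "32 kHz / 16-bit": (32_000, 16),
    "32 kHz / 12-bit (LP mode)": (32_000, 12),
}

for name, (rate_hz, bits) in modes.items():
    kbit_s = rate_hz * bits * 2 / 1000  # two channels
    print(f"{name}: {kbit_s:.0f} kbit/s")
```

LP mode's 768 kbit/s is half the standard mode's 1,536 kbit/s, which is what doubles the recording time on the same tape.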
Problems with migrating early digital tape recordings
There will always be the risk with any kind of magnetic tape recording that there won’t be enough working tape machines to play back the material recorded on them in the future. As spare parts become harder to source, tapes with worn out transport mechanisms will simply become inoperable. We are not quite at this stage yet, and at Great Bear we have plenty of working U-Matic, Betamax and VHS machines so don’t worry too much! Machine obsolescence is however a real threat facing tape based archives.
Such a problem comes into sharp relief when we consider the case of digital audio recordings made on analogue video tape machines. Audio use ‘works’ the tape transport in a far more vigorous fashion than average domestic video use. The tape may be rewound and fast-forwarded more often, and in a professional environment may be in constant use, leading to greater wear and tear.
Those who chose to adopt digital early and made recordings on tape will have marvelled at the lovely clean recordings and the wonders of error correction technology. As a legacy format, however, tape-based digital recordings are arguably more at risk than their analogue counterparts. They are doubly compromised: by the fragility of tape, and by the particular problems that befall digital technologies when things go wrong.
‘Edge damage’ is very common in video tape and can happen when the tape transport becomes worn. This can alter the alignment of the transport mechanism, leading it to move up and down and crush the tape. As you can see in this photograph, the edge of this tape has become damaged.
Because it is a digital recording, this has led to substantial problems with the transfer, namely that large sections of the recording simply ‘drop out.’ In instances such as these, where the tape itself has been damaged, analogue recordings on tape are infinitely more recoverable than digital ones. Dr W.C. John Van Bogart explains that
‘even in instances of severe tape degradation, where sound or video quality is severely compromised by tape squealing or a high rate of dropouts, some portion of the original recording will still be perceptible. A digitally recorded tape will show little, if any, deterioration in quality up to the time of catastrophic failure when large sections of recorded information will be completely missing. None of the original material will be detectable in these missing sections.’
This risk of catastrophic, as opposed to gradual, loss of information on tape-based digital media is what makes these recordings particularly fragile and at risk. What is particularly worrying about digital tape recordings is that they may not show any external signs of damage until it is too late. We therefore encourage individuals, recording studios and memory institutions to assess the condition of their digital tape collections and take prompt action if the recorded information is valuable.
The story of PCM digital processors and analogue tapes gives us a fascinating window into a time when we were not quite analogue, but not quite digital either, demonstrating how technologies co-evolve using the capacities of what is available in order to create something new.
‘A non-magnetic, 100 year, green solution for data storage.’
This is the stuff of digital information managers’ dreams. No more worrying about active data management, file obsolescence or that escalating energy bill.
Imagine how simple life would be if there was a way to store digital information that could last, without intervention, for nearly 100 years. Those precious digital archives could be stored in a warehouse that was not climate controlled, because the storage medium was resilient enough to withstand irregular temperatures.
Imagine after 100 years an archivist enters that very same warehouse to retrieve information requested by a researcher. The archivist pulls a box off the shelf and places it on the table. In their bag they have a powerful magnifying glass which they use to read the information. Having ascertained they have the correct item, they walk out the warehouse, taking the box with them. Later that day, instructions provided as part of the product licensing over 100 years ago are used to construct a reader that will retrieve the data. The information is recovered and, having assessed the condition of the storage medium which seems in pretty good nick, the digital optical technology storage is taken back to the warehouse where it sits for another 10 years, until it is subject to its life-cycle review.
Does this all sound too good to be true? For anyone exposed to the constantly changing world of digital preservation, the answer would almost definitely be yes. We have already covered on this blog numerous issues that the contemporary digital information manager may face. The lack of standardisation in technical practices and the bewildering array of theories about how to manage digital data mean there is currently no ‘one size fits all’ solution to tame the archive of born-digital and digitised content, which is estimated to swell to 3,000 exabytes (an exabyte is a thousand petabytes) by 2020*. We have also covered the growing concerns about the ecological impact of digital technologies, such as e-waste and energy over-consumption. With this in mind, the news that a technology already exists that can bypass many of these problems will seem like manna from heaven. What can this technology be, and why have you never heard about it?
The technology in question is called DOTS, which stands for Digital Optical Technology System. The technology is owned and being developed by Group 47, who ‘formed in 2008 in order to secure the patents, designs, and manufacturing processes for DOTS, a proven 100-year archival technology developed by the Eastman Kodak Company.’ DOTS is refreshingly different from every other data storage solution on the market because it ‘eliminates media and energy waste from forced migration, costly power requirements, and rigid environmental control demands’. What’s more, DOTS are ‘designed to be “plug & play compatible” with the existing Linear Tape Open (LTO) tape-based archiving systems & workflow’.
In comparison with other digital information management systems, which can employ complex software, DOTS does not rely on sophisticated technology to image data. John Lafferty writes that at ‘the heart of DOTS technology is an extremely stable storage medium – metal alloy sputtered onto mylar tape – that undergoes a change in reflectivity when hit by a laser. The change is irreversible and doesn’t alter over time, making it a very simple yet reliable technology.’
DOTS can survive the benign neglect all data experiences over time, but it can also withstand pretty extreme neglect. During research and development, for example, DOTS was subjected to a series of accelerated environmental ageing tests which concluded that ‘there was no discernible damage to the media after the equivalent of 95.7 years.’ But the testing did not stop there. Since acquiring patents for the technology Group 47,
‘has subjected samples of DOTS media to over 72 hours of immersion each in water, benzine, isopropyl alcohol, and Clorox (™) Toilet Bowl Cleaner. In each case, there was no detectable damage to the DOTS media. However, when subjected to the citric acid of Sprite carbonated beverage, the metal had visibly deteriorated within six hours.’
Robust indeed! DOTS is also non-magnetic, chemically inert, immune to electromagnetic fields and can be stored in normal office environments or in extremes ranging from −9°C to 65°C. It ticks all the boxes really.
DOTS vs the (digital preservation) world
The only discernible benefit of the ‘open all hours’, random access digital information culture over a storage solution such as DOTS is accessibility. While it certainly is amazing how quick and easy it is to retrieve valuable data at the click of a button, it perhaps should not be the priority when we are planning how best to take care of the information we create, and are custodians of. The key words here are valuable data. Emerging norms in digital preservation, which emphasise the need to always be responsive to technological change, take gambles with the very digital information they seek to preserve, because there is always a risk that migration will compromise the integrity of data.
The constant management of digital data is also costly, disruptive and time-consuming. In the realm of cultural heritage, where organisations are inevitably under resourced, making sure your digital archives are working and accessible can sap energy and morale. These issues of course affect commercial organisations too. The truth is the world is facing an information epidemic, and surely we would all rest easier if we knew our archives were safe and secure. Indeed, it seems counter-intuitive that amid the endless flashy devices and research expertise in the world today, we are yet to establish sustainable archival solutions for digital data.
Of course, using a technology like DOTS need not mean we abandon the culture of access enabled by file-based digital technologies. It may however mean that the digital collections available on instant recall are more carefully curated. Ultimately we have to ask if privileging the instant access of information is preferable to long-term considerations that will safeguard cultural heritage and our planetary resources.
If such a consideration errs on the side of moderation and care, technology’s role in shaping that hazy zone of expectancy known as ‘the future’ needs to shift from the ‘bigger, faster, quicker, newer’ model to a more cautious appreciation of the long-term. Such an outlook is built in to the DOTS technology, demonstrating that to be ‘future proof’ a technology need not only withstand environmental challenges, such as flooding or extreme temperature change, but must also be ‘innovation proof’ by being immune to the development of new technologies. As John Lafferty writes, the license bought with the product ‘would also mandate full backward compatibility to Generation Zero, achievable since readers capable of reading greater data densities should have no trouble reading lower density information.’ DOTS also do not use proprietary codecs; as Chris Castaneda reports, ‘the company’s plan is to license the DOTS technology to manufacturers, who would develop and sell it as a non-proprietary system.’ Nor do they require specialist machines to be read. With breathtaking simplicity, ‘data can be recovered with a light and a lens.’
It would be wrong to assume that Group 47’s development of DOTS is not driven by commercial interests – it clearly is. DOTS does however seem to solve many of the real problems that currently afflict the responsible, long-term management of digital information. It will be interesting to see if the technology is adopted, and by whom. Watch this space!
* According to a 2011 Enterprise Strategy Group Archive TCO Study
2014 will no doubt present a year of new challenges for those involved in digital preservation. A key issue remains the sustainability of digitisation practices within a world yet to establish firm standards and guidelines. Creating lasting procedures capable of working across varied and international institutions would bring some much needed stability to a profession often characterized by permanent change and innovation.
In 1969 the EIAJ-1 video tape format was developed by the Electronic Industries Association of Japan. It was the first standardized format for industrial/non-broadcast video tape recording. Once implemented it enabled video tapes to be played on machines made by different manufacturers, and it helped to make video use cheaper and more widespread, particularly within a domestic context.
The introduction of standards in the digitisation world would of course have very little impact on the widespread use of digital technologies, which are, in the west, largely ubiquitous. It would however make the business of digital preservation economically more efficient, simply because organisations would not be constantly adapting to change. Think of the costs involved in keeping up with rapid waves of technological transformation: updating equipment, migrating data, and ensuring file integrity and operability are maintained are just a few of the costly and time-consuming tasks this entails.
Although increasingly sophisticated digital forensic technology can help to manage some of these processes, highly trained (real life!) people will still be needed to oversee any large-scale preservation project. Within such a context resource allocation will always have to account for these processes of adaptation. It has to be asked then: could this money, time and energy be practically harnessed in other, more efficient ways? The costs of non-standardisation become ever more pressing when we consider the amount of digital data preserved by large institutions such as the British Library, whose digital collection is estimated to amass up to 5 petabytes (5,000 terabytes) by 2020. This is not a simple case of updating your iPhone to the next model, but an extremely complex and risky venture where the stakes are high. Do we really want to jeopardise rich forms of cultural heritage in the name of technological progress?
The US-based National Digital Stewardship Alliance (NDSA) National Agenda for Digital Stewardship 2014 echoes such a sentiment. They argue that ‘the need for integration, interoperability, portability, and related standards and protocols stands out as a theme across all of these areas of infrastructure development’ (3). The executive summary also stresses the negative impact rapid technological change can create, and the need to ‘coordinate to develop comprehensive coverage on critical standards bodies, and promote systematic community monitoring of technology changes relevant to digital preservation.’ (2)
File Format Action Plans
One step on the way to more secure standards is the establishment of File Format Action Plans, a practice which is being increasingly recommended by US institutions. The idea behind developing a file format action plan is to create a directory of file types that are in regular use by people in their day to day lives and by institutions. Getting it all down on paper can help us track what may be described as the implicit user-standards of digital culture. This is the basic idea behind Parsimonious Preservation, discussed on the blog last year: that through observing trends in file use we may come to the conclusion that the best preservation policy is to leave data well alone since in practice files don’t seem to change that much, rather than risk the integrity of information via constant intervention.
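As a practical first step, a format survey can be automated. The sketch below (Python; the collection path is hypothetical) tallies the file extensions actually present in a collection, which is the raw material for a File Format Action Plan:

```python
# A minimal sketch of the first step of a File Format Action Plan:
# survey a collection and tally which file formats are actually in use.
from collections import Counter
from pathlib import Path

def survey_formats(root):
    """Count files per extension beneath `root` (suffixes lower-cased)."""
    return Counter(
        p.suffix.lower() or "(none)"
        for p in Path(root).rglob("*")
        if p.is_file()
    )

# Hypothetical usage:
# for ext, n in survey_formats("/path/to/collection").most_common():
#     print(f"{ext}: {n}")
```

Running a survey like this periodically gives the usage trends that a Parsimonious Preservation approach relies on when deciding whether intervention is needed at all.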
What are the other main challenges facing ‘digital stewards’ in 2014? In a world of exponential information growth, making decisions about what we keep and what we don’t becomes ever more pressing. When whole collections cannot be preserved, digital curators are increasingly called upon to select material deemed representative and relevant. How is it possible to know now what material needs to be preserved for posterity? What values inform our decision making?
To take an example from our work at Great Bear: we often receive tapes from artists who have achieved little or no commercial success in their life times, but whose work is often of great quality and can tell us volumes about a particular community or musical style. How does such work stand up against commercially successful recordings? Which one is more valuable? The music that millions of people bought and enjoyed or the music that no one has ever heard?
Ultimately these questions will come to occupy a central concern for digital stewards of audio data, particularly with the explosion of born-digital music cultures which have enabled communities of informal and often non-commercial music makers to proliferate. How is it possible to know in advance what material will be valuable for people 20, 50 or 100 years from now? These are very difficult, if not impossible questions for large institutions to grapple with, and take responsibility for. Which is why, as members of a digital information management society, it is necessary to empower ourselves with relevant information so we can make considered decisions about our own personal archives.
A final point to stress is that among the ‘areas of concern’ for digital preservation cited by the NDSA, moving image and recorded sound figure highly, alongside other born-digital content such as electronic records, web and social media. Magnetic tape collections remain high risk and it is highly recommended that you migrate this content to a digital format as soon as possible. While digitisation certainly creates many problems as detailed above, magnetic tape is also threatened by physical deterioration and its own obsolescence challenges, in particular finding working machines to play back tape on. The simple truth is, if you want to access material in your tape collections it needs now to be stored in a resilient digital format. We can help, and offer other advice relating to digital information management, so don’t hesitate to get in touch.
What a year it has been in the life of Great Bear Analogue and Digital Media. As always the material customers have sent us to digitise has been fascinating and diverse, both in terms of the recordings themselves and the technical challenges presented in the transfer process. At the end of a busy year we want to take this opportunity to thank our customers for sending us their valuable tape collections, which over the course of 2013 has amounted to a whopping 900 hours of digitised material.
We feel very honoured to play a part in preserving personal and institutional archives that are often incredibly rare, unique and, more often than not, very entertaining. It is a fairly regular occurrence in the Great Bear Studio to have radio jingles from the 60s, oral histories of war veterans, recordings of family get-togethers and video documentation of avant-garde 1970s art experiments simultaneously migrating in a vibrant melee of digitisation.
We have also received a large amount of rare or ‘lost’ audio recordings through which we have encountered unique moments in popular music history. These include live recordings from the Couriers Folk Club in Leicester, demo tapes from artists who achieved niche success like 80s John Peel favourites BOB, and large archives of prolific but unknown songwriters such as the late Jack Hollingshead, who was briefly signed to the Beatles’ Apple label in the 1960s. We always have a steady stream of tapes from Bristol Archive Records, who continue to acquire rare recordings from bands active in the UK’s reggae and post-punk scenes. We have also migrated VHS footage of local band Meet Your Feet from the early 1990s.
On our blog we have delved into the wonderful world of digital preservation and information management, discussing issues such as ‘parsimonious preservation‘ which is advocated by the National Archives, as well as processes such as migration, normalisation and emulation. Our research suggests that there is still no ‘one-size-fits-all’ strategy in place for digital information management, and we will continue to monitor the debates and emerging practices in this field in the coming year. Migrating analogue and digital tapes to digital files remains strongly recommended for access and preservation reasons, with some experts bookmarking 15 April 2023 as the date when obsolescence for many formats will come into full effect.
While the world is facing a growing electronic waste crisis, Great Bear is doing its bit to buck the trend by recycling old domestic and professional tape machines. In 2013 we acquired over 20 ‘new’ old analogue and digital video machines. These range from early 70s domestic video cassette machines such as the N1502 up to the most recently obsolete formats such as Digital Betacam. We are always looking for old machines, both working and not working, so do get in touch if your spring clean involves ridding yourself of obsolete tape machines!
Our collection of test equipment is also growing as we acquire more wave form monitors, rare time-based correctors and vectorscopes. In audio preservation we’ve invested heavily in early digital audio machines such as multi-track DTRS and ADAT machines which are rapidly becoming obsolete.
We are very much looking forward to new challenges in 2014 as we help more people migrate their tape-based collections to digital formats. We are particularly keen to develop our work with larger archives and memory institutions, and can offer consultation on technical issues that arise from planning and delivering a large-scale digitisation project, so please do get in touch if you want to benefit from our knowledge and experience.
Once again a big thank you from us at Greatbear, and we hope to hear from you in the new year.
The NAB Cartridge (named after the National Association of Broadcasters) was a mainstay of radio broadcasting from the late 1950s to the 1990s. It was eventually replaced by the MiniDisc and computerised broadcast automation systems.
NAB Cartridges were used primarily for jingles, station identifications, commercials and music. Each cartridge typically contained several recordings of the same short jingle. Mechanically, the tape is wound in an endless loop, so little manual operation such as rewinding or fast-forwarding was required, and short recordings could be accessed efficiently and accurately during live broadcasts.
Because they were used in broadcast, NAB Cartridges often used the best quality tape available at the time, which was usually AMPEX. As readers of the blog will know, this is bad news if you want to listen to the tape a few years down the line. We baked the tapes so they could be played back again, and then transferred them using a SONIFEX HS Cartridge player.
You can listen to one of the incredibly cheesy jingles below!