IASA – Resources and Research

March 27th, 2015

There is an astonishing number of online resources relating to the preservation and re-formatting of magnetic tape collections.

Whether you need help identifying and assessing your collection, getting to grips with the latest video codec saga or trying to uncover esoteric technical information about particular formats, the internet comes up trumps 95% of the time.

Marvel at the people who put together the U-Matic web resource, for example, which has been online since 1999: a comprehensive outline of the different models in the U-Matic ‘family.’ The site also hosts ‘chat pages’ relating to Betamax, Betacam, U-Matic and V2000, which are still very much active, with archives dating back to 1999. For video tape nerds willing to trawl the depths of these forums, nuggets of machine maintenance wisdom await.

International Association of Sound and Audiovisual Archives

Sometimes you need to turn to rigorous, peer-reviewed research in order to learn from AV archive specialists.

Fortunately such material exists, and a good amount of it is collected and published by the International Association of Sound and Audiovisual Archives (IASA).

Three IASA journals laid out on the floor

‘Established in 1969 in Amsterdam to function as a medium for international co-operation between archives that preserve recorded sound and audiovisual documents’, IASA holds expertise relating to the many different and specialist issues attached to the care of AV archives.

Comprising several committees dealing with issues such as standards and best practices, national archive policies, broadcast archives, technical issues, research archives, and training and education, IASA reflects the diverse communities of practice involved in this professional field.

As well as hosting a yearly international conference (check out this post on The Signal for a review of the 2014 meeting), IASA publishes a bi-annual journal and many in-depth specialist reports.

Their Guidelines on the Production and Preservation of Digital Audio Objects (2nd edition, 2009), written by the IASA Technical Committee, is available as a web resource, and provides advice on key issues such as small-scale approaches to digital storage systems, metadata and signal extraction from original carriers, to name a few.

Most of the key IASA publications are accessible to members only, and therefore remain behind a paywall. It is definitely worth taking the plunge though, because there are comparatively few specialist resources relating to AV archives written with an interdisciplinary—and international—audience in mind.

Examples of issues covered in member-only publications include Selection in Sound Archives, Decay of Polymers, Deterioration of Polymers and Ethical Principles for Sound and Audiovisual Archives.

The latest publication from the IASA Technical Committee, Handling and Storage of Audio and Video Carriers (2014) or TC05, provides detailed outlines of types of recording carriers, physical and chemical stability, environmental factors and ‘passive preservation,’ storage facilities and disaster planning.

The report comes with this important caveat:

 ‘TC 05 is not a catalogue of mere Dos and Don’ts. Optimal preservation measures are always a compromise between many, often conflicting parameters, superimposed by the individual situation of a collection in terms of climatic conditions, the available premises, personnel, and the financial situation. No meaningful advice can be given for all possible situations. TC 05 explains the principal problems and provides a basis for the archivist to take a responsible decision in accordance with a specific situation […] A general “Code of Practice” […] would hardly fit the diversity of structures, contents, tasks, environmental and financial circumstances of collections’ (6).

Member benefits

Being an IASA member gives Great Bear access to research and practitioner communities that enable us to understand, and respond to, the different needs of our customers.

Typically we work with a range of people: individuals whose collections have complex preservation needs, large institutions, small-to-medium sized archives and those working in the broadcast industry.

Our main concern is reformatting the tapes you send us, and delivering high quality digital files that are appropriate for your plans to manage and re-use the data in the future.

If you have a collection that needs to be reformatted to digital files, do contact us to discuss how we can help.

1″ type A Video Tape – The Old Grey Whistle Test

March 5th, 2015

Sometimes genuine rarities turn up at the Great Bear studio. Our recent acquisition of four reels of ‘missing, believed wiped’ test recordings of cult BBC TV show The Old Grey Whistle Test is one such example.

It is not only the content of these recordings that is interesting, but their form too, because they were made on 1” type A videotape.

The Ampex Corporation introduced 1” Society of Motion Picture and Television Engineers (SMPTE) type A videotape in 1965.

The 1″ type A was ‘one of the first standardized reel-to-reel magnetic tape formats in the 1 inch (25 mm) width.’ In the US it had greatest success as an institutional and industrial format. It was not widely adopted in the broadcast world because it did not meet Federal Communications Commission (FCC) specifications for broadcast videotape formats—it was capable of 350 lines, while the NTSC standard was 525, PAL and SECAM were 625 (for more information on television standards visit this page, also note the upcoming conference ‘Standards, Disruptions and Values in Digital Culture and Communication‘ taking place November 2015).

According to the VT Old Boys website, created by ex-BBC engineers in order to document the history of videotape used at the organisation, 2″ Quadruplex tape remained very much the norm for production until the end of the 1970s.

Yet the very existence of the Old Grey Whistle Test tapes suggests type A videotape was being used in some capacity in the broadcast world. Perhaps ADAPT, a project researching British television production technology from 1960 to the present, could help us solve this mystery?

From type A, to type B….

As these things go, type A was followed by type B, a model developed by the German company Bosch. Introduced in 1976, type B was widely adopted in continental Europe, but not in the UK and USA, which gravitated toward the type C model, introduced by Sony/Ampex, also in 1976. Type C then became the professional broadcast standard and was still being used well into the 1990s. It was able to record high quality composite video, and therefore had an advantage over component video formats such as Betacam and MII that were ‘notoriously fussy and trouble-prone.‘ Type C also had fancy functions like still, shuttle, variable-speed playback and slow motion.

From a preservation assessment point of view, ‘one-inch open reel is especially susceptible to risks associated with age, hardware, and equipment obsolescence. It is also prone to risks common to other types of magnetic media, such as mould, binder deterioration, physical damage, and signal drop-outs.’

1" Type A Machine

The Preservation Self-Assessment Program advises that ‘this format is especially vulnerable, and, based on content assessment, it should be a priority for reformatting.’

Ampex made over 30 SMPTE type A models, the majority of which are listed here. Yet working machines are now few and far between.

In years to come it will be common for people to say ‘it takes four 1” type A tape recorders to make a working one’, but remember where you heard the truism first.

Harvesting several of these hulking, table-top machines for spares and working parts is exactly how we are finding a way to transfer these rare tapes—further evidence that we need to take the threat of equipment obsolescence very seriously.

Digitising small audiovisual collections: making decisions and taking action

February 24th, 2015

Deciding when to digitise your magnetic tape collections can be daunting.

The Presto Centre, an advocacy organisation working to help ‘keep audiovisual content alive,’ have a graphic on their website which asks: ‘how digital are our members?’

They chart the different stages of ‘uncertainty,’ ‘awakening’, ‘enlightenment’, ‘wisdom’ and ‘certainty’ that organisations move through as they appraise their collections and decide when to re-format to digital files.

Similarly, the folks at AV Preserve offer their opinion on the ‘Cost of Inaction‘ (COI), arguing that ‘incorporating the COI model and analyses into the decision making process around digitization of legacy physical audiovisual media helps organizations understand the implications and make well-informed decisions.’

They have even developed a COI calculator tool that organisations can use to analyse their collections. Their message is clear: ‘the cost of digitization may be great, but the cost of inaction may be greater.’
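
The reasoning behind a cost-of-inaction comparison can be sketched with some simple arithmetic. To be clear, this is a toy illustration and not AV Preserve's actual COI model; all the parameter names and rates below are invented for the example: transfer costs tend to rise as equipment grows scarcer, while the share of tapes that survive in playable condition falls.

```python
# Toy cost-of-inaction sketch (NOT AV Preserve's model). Deferring
# digitisation means paying inflated transfer prices later, on fewer
# surviving tapes -- the rest of the collection's value is simply lost.

def cost_of_inaction(tapes, cost_per_tape_now, annual_cost_inflation,
                     annual_loss_rate, years_deferred):
    """Estimate the cost of digitising after a delay, plus the tapes
    that degrade beyond recovery in the meantime."""
    future_cost_per_tape = cost_per_tape_now * (1 + annual_cost_inflation) ** years_deferred
    surviving = tapes * (1 - annual_loss_rate) ** years_deferred
    return {
        "transfer_cost": round(surviving * future_cost_per_tape, 2),
        "tapes_lost": round(tapes - surviving, 1),
    }

act_now = 500 * 30.0                          # 500 tapes at £30 each today
wait = cost_of_inaction(500, 30.0, 0.10, 0.03, 10)
print(act_now, wait)  # deferring costs far more, and loses tapes outright
```

Even with modest invented rates (10% annual cost inflation, 3% annual tape loss), the deferred bill comfortably exceeds the act-now figure, which is the calculator's core argument.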

Digitising small-medium audiovisual collections

For small to medium sized archives, digitising collections may provoke worries about a lack of specialist support or technical infrastructure. It may be felt that resources could be better used elsewhere in the organisation. Yet as we, and many others working with audiovisual archives, often stress, the decision to transfer material stored on magnetic tape has to be made sooner or later. With smaller archives, where funding is limited, ‘later’ is not really a practical option.

Furthermore, the financial cost of re-formatting audiovisual archives is likely to increase significantly in the next five to ten years; machine obsolescence will become an aggravated problem, and it is likely to take longer to restore tapes prior to transfer if the condition of carriers has dramatically deteriorated. The question has to be asked: can you afford not to take action now?

If this describes your situation, you might want to hear about other small to medium sized archives facing similar problems. We asked one of our customers who recently sent in a comparatively small collection of magnetic tapes to share their experience of deciding to take the digital plunge.

We are extremely grateful to Annaig from the Medical Mission Sisters for answering the questions below. We hope that it will be useful for other archives with similar issues.

1. First off, please tell us a little bit about the Medical Mission Sisters Archive, what kind of materials are in the collection?

The Medical Mission Sisters General Archives include the central archives of the congregation. They gather all the documents relating to the foundation and history of the congregation and also documents relating to the life of the foundress, Anna Dengel. The documents are mainly paper but there is a good collection of photographs, slides, films and audio documents. Some born digital documents are starting to enter the archives but they are still few.

2. As an archive with a modest collection of magnetic tapes, why did you decide to get the materials digitised now? Was it a question of resources, preservation concerns, access requests (or a mixture of all these things!)?

The main reason was accessibility. The documents on video tapes or audio tapes were the only usable ones, because we still had machines to read them, but all the older ones, or those with specific formats, were lost to the archives as there was no way to read them and know what was really on the tapes. Plus the Medical Mission Sisters is a congregation where Sisters are spread out over five continents, and most of the time readers don’t come to the archives but send me queries by email, to which I have to respond with scanned documents or digital files. Plus it was obvious that some of the tapes were degrading, so we’d better digitise sooner rather than later if we wanted to still be able to read what was on them. Space and preservation were other issues. With a small collection but varied in formats, I had no resources to properly preserve every tape, and some of the older formats had huge boxes and were consuming a lot of space on the shelves. Now we have a reasonably sized collection of CDs and DVDs, which is easy to store in good conditions and is accessible everywhere, as we can read them on computer here and I can send them to readers via email.

3. Digital preservation is a notoriously complex, and rapidly evolving, field. As a small archive, how do you plan to manage your digital assets in the long term? What kinds of support, services and systems are you drawing on to design a system which is robust and resilient?

At the moment the digital collection is so small that it cannot justify any support service or system. So I have to build up my own home-made system. I am using the archives management software (CALM) to enter data relating to the conservation of the CDs or DVDs, dates of creation, dates to check them, and I plan to have regular checks on them, with migrations or copies made when it proves necessary.
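
The regular checks Annaig describes can be done with nothing more than checksums: record a digest for each file once, then re-run the comparison periodically to catch silent corruption before the only copy is lost. A minimal sketch, assuming a flat folder of files (the folder layout and manifest handling here are illustrative, not a description of her CALM setup):

```python
# Minimal fixity checking: build a checksum manifest once, then verify it
# on a schedule. Any mismatch flags a file that needs re-copying from a
# known-good source before the damage spreads to every copy.
import hashlib
import pathlib

def checksum(path, algorithm="sha256"):
    """Hash a file in chunks so large video files don't exhaust memory."""
    h = hashlib.new(algorithm)
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def build_manifest(folder):
    """Record a digest for every file in the folder."""
    return {p.name: checksum(p) for p in pathlib.Path(folder).iterdir() if p.is_file()}

def verify(folder, manifest):
    """Return the files whose checksum no longer matches the manifest."""
    return [name for name, digest in manifest.items()
            if checksum(pathlib.Path(folder) / name) != digest]
```

Storing the manifest alongside the collection (and a copy elsewhere) turns the ‘regular checks’ into a single `verify` call.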

4. Aside from the preservation issue, what are your plans to use the digitised material that Great Bear recently transferred?

It all depends on the content of the tapes. But I’ve already spotted a few documents of interest, and I haven’t been through everything yet. My main concern now is to make the documents known and used for their content. I was already able to deliver a file to one of the Sisters who was working on a person related to the foundation of the congregation; the most important document on her was an audio file that I had just received from Great Bear, and I was able to send it to her. The document would have been unusable a few weeks before. I’ve come across small treasures, like a film, probably made by the foundress herself, which nobody was aware of. The Sisters are celebrating this year the 90th anniversary of their foundation. I plan to use as many audio or video documents as I can to support the events the archives are going to be involved in.

***

What is illuminating about Annaig’s answers is that her archive has no high tech plan in place to manage the collection – her solutions for managing the material very much draw on non-digital information management practices.

The main issues driving the decision to migrate the materials are fairly common to all archives: limited storage space and accessibility for the user-community.

What lesson can be learnt from this? Largely, that if you are trained as an archivist, you are likely to already have the skills you need to manage your digital collection.

So don’t let the more bewildering aspects of digital preservation put you off. But do take note of the changing conditions for playing back and accessing material stored on magnetic tape. There will come a time when it will be too costly to preserve recordings on a wide variety of formats – many of which we can help you with today.

If you want to discuss how Great Bear can help you re-format your audiovisual collections, get in touch and we can explore the options.

If you are a small-medium size archive and want to share your experiences of deciding to digitise, please do so in the comment box below.

Save our Sounds – 2030 and the threat of audiovisual extinction

February 9th, 2015

At the beginning of 2015, the British Library launched the landmark Save Our Sounds project.

The press release explained:

‘The nation’s sound collections are under threat, both from physical degradation and as the means of playing them disappear from production. Archival consensus internationally is that we have approximately 15 years in which to save our sound collections by digitising them before they become unreadable and are effectively lost.’

Yes, you have read that correctly dear reader: by 2030 it is likely that we simply will not be able to play many, if not all, of the tapes we currently support at Great Bear. A combination of machine obsolescence, tape deterioration and, crucially, the widespread loss of skills necessary to repair, service and maintain playback machines is responsible for this astounding situation. Together they will make it ‘costly, difficult and, in many cases, impossible’ to preserve our recorded audio heritage beyond the proposed cut-off date.

While such news might (understandably) usher in a culture of utter panic, and, let’s face it, you’d have to have a strong disposition if you were charged with managing the Save Our Sounds project, the British Library are responding with stoic pragmatism. They are currently undertaking a national audit to map the conditions of sound archives which your organisation can contribute to.

Yet whatever way you look at it, there is a need to take action to migrate any collections currently stored on obsolete media, particularly if you are part of a small organisation with limited resources. The reality is that it will become more expensive to transfer material as we move closer to 2030. The British Library project relates particularly to audio heritage, but the same principles apply to audiovisual collections too.

Yes, that rumbling you can hear is the sound of archivists the world over engaged in a flurry of selection and appraisal activities….

Extinction

One of the most interesting things about discussions of obsolete media is that the question of operability is often framed as a matter of life or death.

Formats are graded according to their ‘endangered statuses’ in more or less explicit terms, as demonstrated on this Video Preservation website, which offers the following ‘obsolescence ratings’:

‘Extinct: Only one or two playback machines may exist at specialist laboratories. The tape itself is more than 20 years old.

Critically endangered: There is a small population of ageing playback machinery, with no or little engineering or manufacturing support. Anecdotal evidence indicates that there are fewer working machine-hours than total population of tapes. Tapes may range in age from 40 years to 10 years.

Endangered: The machine population may be robust, but the manufacture of the machinery has stopped. Manufacturing support for the machines and the tapes becomes unavailable. The tapes are often less expensive, and more vulnerable to deterioration.

Threatened: The playback machines are available; however, either the tape format itself is unstable or has less integrity than other available formats, or it is known that a more popular or updated format will be replacing this one in a short period of time.

Vulnerable: This is a current but highly proprietary format.

Lower risk: This format will be in use over the next five years (1998-2002).’

The ratings on the video preservation website were made over ten years ago. A more comprehensive and regularly updated resource to consult is the Preservation Self-Assessment Program (PSAP), ‘a free online tool that helps collection managers prioritize efforts to improve conditions of collections. Through guided evaluation of materials, storage/exhibit environments, and institutional policies, the PSAP produces reports on the factors that impact the health of cultural heritage materials, and defines the points from which to begin care.’ As well as audiovisual media, the resource covers photo and image material, paper and book preservation. It also has advice about disaster planning, metadata, access and a comprehensive bibliography.

The good news is that fantastic resources do exist to help archivists make informed decisions about reformatting collections.


A Digital Compact Cassette

The bad news, of course, is that the problem faced by audiovisual archivists is a time-limited one, exacerbated no doubt by the fact that digital preservation practices on the ‘output end’ are far from stable. Finding machines to play back your Digital Compact Cassette collection, in other words, will only be a small part of the preservation puzzle. A life of file migrations in yet-to-be-designed wrappers and content-management systems awaits all kinds of reformatted audiovisual media in their life-to-come as digital archival objects.

Depending on the ‘content value’ of any collection stored on obsolete media, vexed decisions will need to be made about what to keep and what to throw away at this clinical moment in the history of recorded sound.

Sounding the fifteen-year warning

At such a juncture, when the fifteen-year warning has been sounded, perhaps we can pause for a second to reflect on the potential extinction of large swathes of audiovisual memory.

If we accept that any kind of recording both contains memory (of a particular historical event, or performance) and helps us to remember as an aide-mémoire, what are the consequences when memory storage devices which are, according to UNESCO, ‘the primary records of the 20th and 21st centuries’, can no longer be played back?

These questions are of course profound, and emerge in response to what are consequential historical circumstances. They are questions that we will continue to ponder on the blog as we reflect on our own work transferring obsolete media, and maintaining the machines that play them back. There are no easy answers!

As the 2030 deadline looms, our audiovisual context is a sobering retort to critics who framed the widespread availability of digitisation technologies in the first decade of the 21st century as indicative of cultural malaise—evidence of a culture infatuated with its ‘past’, rather than concerned with inventing the ‘future’.

Perhaps we will come to understand the 00s as a point of audiovisual transition, when mechanical operators still functioned and tape was still in fairly good shape. When it was an easy, almost throwaway decision to make a digital copy, rather than an immense preservation conundrum. So where once there was a glut of archival data—and the potential to produce it—there is now the threat of abrupt and irreversible dropout.

Play those tapes back while you can!

1/2″ EIAJ video tape – aesthetic glitches

January 16th, 2015

In an article on the BBC website, film director Julien Temple reflected on the recordings: ‘we affectionately called the format “Glorious Bogroll Vision” but really it was murksville. Today monochrome footage would be perfectly graded with high-contrast effects. But the 1970s format has a dropout-ridden, glitchy feel which I enjoy now.’

Note the visible drop out in the image


The glitches of 1/2″ video were perfect for Temple’s film, which aimed to capture the apocalyptic feeling of Britain on the eve of 1977. Indeed, Temple reveals that ‘we cut in a couple of extra glitches we liked them so much.’

Does the cutting in of additional imperfection signal a kind-of fetishisation of analogue video, a form of wanton nostalgia that enables only a self-referential wallowing in a time when things were gloriously a lot worse than they are now?

Perhaps the corrupted image interrupts the enhanced definition and clarity of contemporary digital video.

Indeed, Temple’s film demonstrates how visual perception is always produced by the transmission devices that play back sounds and moving images, whether that be 1/2″ video tape or a super HD television.

It is a reminder, in other words, that there are always other ways of seeing, and underlines how punk, as a mode of aesthetic address in this case, maintains its capacity to intervene in the business-as-usual ordering of reality.

What to do with your 1/2″ video tapes?

hitachi_reel_to_reel_eiaj_vtr1

While Temple’s film was made to look worse than it could have been, EIAJ 1/2″ video tapes are most definitely a vulnerable format and action therefore needs to be taken if they are to be preserved effectively.

In a week where the British Library launched their Save Our Sounds campaign, which stated that ‘archival consensus internationally is that we have approximately 15 years in which to save our sound collections by digitising them before they become unreadable and are effectively lost,’ the same timeframes should be applied to magnetic tape-based video collections.

So if your 1/2″ tapes are rotting in your shed as Temple’s Clash footage was, you know that you need to get in there, fish them out, and send them to us pronto!

DVCAM transfers, error correction coding & misaligned machines

December 17th, 2014

This article is inspired by a collection of DVCAM tapes sent in by London-based cultural heritage organisation Sweet Patootee. Below we will explore several issues that arise from the transfer of DVCAM tapes, one of the many Digital Video formats that emerged in the mid-1990s. A second article will follow soon which focuses on the content of the Sweet Patootee archive, a fascinating collection of video-taped oral histories of First World War veterans from the Caribbean.

The main issue we want to explore below is the role error correction coding performs both in the composition of the digital video signal and during the preservation playback. We want to highlight this issue because it is often assumed that DVCAM, which first appeared on the market in 1996, is a fairly robust format.

The work we have done to transfer these tapes to digital files indicates that error correction coding is working in overdrive to ensure we can see and hear these recordings. The implication is that DVCAM collections, and wider DV-based archives, should really be a preservation priority for institutions, organisations and individuals.

Before we examine this in detail, let’s learn a bit about the technical aspects of error correction coding.

Error error error

Error correction coding is a staple part of audio and audiovisual digital media. It is of great importance in today’s digital world, where higher volumes of transmitted signals require greater degrees of compression, and therefore sophisticated error correction schemes, as this article argues.

Error correction works through a process of prediction and calculation known as interpolation or concealment. It takes an estimation of the original recorded signal in order to re-construct parts of the data that have been corrupted. Corruption can occur due either to wear and tear, or insufficiencies in the original recorded signal.

Yet as Hugh Robjohns explains in the article ‘All About Digital Audio’ from 1998:

 ‘With any error protection system, if too many erroneous bits occur in the same sample, there is a risk of the error detection system failing, and in practice, most media failures (such as dropouts on tape or dirt on a CD), will result in a large chunk of data being lost, not just the odd data bit here and there. So a technique called interleaving is used to scatter data around the medium in such a way that if a large section is lost or damaged, when the data is reordered many smaller, manageable data losses are formed, which the detection and correction systems can hopefully deal with.’
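
The interleaving Robjohns describes can be sketched directly. This toy example (not the actual interleave pattern of any particular medium) writes data to the medium in a shuffled order, so a burst of consecutive errors comes back as scattered single errors after de-interleaving:

```python
# A sketch of interleaving: data is reordered before storage so that a
# burst of adjacent errors on the medium becomes several isolated errors
# after de-interleaving -- each small enough for the corrector to handle.

def interleave(data, depth):
    """Split data into `depth` strides and concatenate them."""
    rows = [data[i::depth] for i in range(depth)]
    return [x for row in rows for x in row]

def deinterleave(data, depth):
    """Invert interleave() (assumes len(data) is a multiple of depth)."""
    width = len(data) // depth
    rows = [data[i * width:(i + 1) * width] for i in range(depth)]
    return [rows[i % depth][i // depth] for i in range(len(data))]

original = list(range(12))
stored = interleave(original, 3)      # order as written to the medium
stored[0:3] = ["X", "X", "X"]         # a burst of three adjacent errors
recovered = deinterleave(stored, 3)
print(recovered)                      # the errors land at positions 0, 3, 6
```

After de-interleaving, the three-symbol burst has become three isolated errors, each separated by intact data the corrector can work from.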

There are many different types of error correction, and ‘like CD-ROMs, DV uses Reed-Solomon (RS) error detection and correction coding. RS can correct localized errors, but seldom can reconstruct data damaged by a dropout of significant size (burst error),’ explains this wonderfully detailed article about DV video formats, preserved on the Internet Archive.
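
Reed-Solomon itself involves finite-field arithmetic well beyond a blog sketch, but its simplest relative, XOR parity, shows the core idea of all such codes: store a little redundant data now, and a missing block can be rebuilt later. The block sizes and data here are invented for illustration:

```python
# Not Reed-Solomon, but the same family of idea in its simplest form:
# one XOR parity block lets us rebuild any single lost data block.
# RS generalises this to recover multiple errors at known or unknown
# positions, at the cost of more redundancy and heavier maths.
from functools import reduce

def xor_blocks(blocks):
    return reduce(lambda a, b: bytes(x ^ y for x, y in zip(a, b)), blocks)

def add_parity(blocks):
    """Append one parity block: the XOR of all data blocks."""
    return blocks + [xor_blocks(blocks)]

def recover(blocks, missing_index):
    """Rebuild the block at missing_index from the surviving blocks."""
    survivors = [b for i, b in enumerate(blocks) if i != missing_index]
    return xor_blocks(survivors)

data = [b"ABCD", b"EFGH", b"IJKL"]
stored = add_parity(data)
assert recover(stored, 1) == b"EFGH"   # block 1 lost, rebuilt from the rest
```

This also makes the capacity argument concrete: one parity block buys exactly one recoverable loss, so a second error in the same group is unrecoverable, just as a heavily erroneous DV recording leaves little correction headroom in reserve.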

The difference correction makes

Error correction is one of the key things that differentiates digital technologies from their analogue counterparts. As the IASA‘s Guidelines on the Production and Preservation of Digital Audio Objects (2009) explains:

‘Unlike copying analogue sound recordings, which results in inevitable loss of quality due to generational loss, different copying processes for digital recordings can have results ranging from degraded copies due to re-sampling or standards conversion, to identical “clones” which can be considered even better (due to error correction) than the original.’ (65)

To think that digital copies can, at times, exceed the quality of the original digital recording is both an astonishing and paradoxical proposition. After all we are talking about a recording that improves at the perceptual level, despite being compositionally damaged. It is important to remember that error correction coding cannot work miracles, and there are limits to what it can do.

Dietrich Schüller and Albrecht Häfner argue in the International Association of Sound and Audiovisual Archives’s (IASA) Handling and Storage of Audio and Video Carriers (2014) that ‘a perfect, almost error free recording leaves more correction capacity to compensate for handling and ageing effects and, therefore, enhances the life expectancy.’ If a recording is made however ‘with a high error rate, then there is little capacity left to compensate for further errors’ (28-29).

The bizarre thing about error-correction coding then is the appearance of clarity it can create. And if there are no other recordings to compare with the transferred file, it is really hard to know what the recorded signal is supposed to look and sound like were its errors not being corrected.

DVCAM PRO

When we watch the successfully migrated, error corrected file post-transfer, it matters little whether the original was damaged. If a clear signal is transmitted with high levels of error correction, the errors will not be transferred, only the clear image and sound.

Contrast this with a damaged analogue tape, where the damage would be clearly discernible on playback. The plus point of analogue tapes is that they degrade gracefully: it is possible to play back an analogue tape recording with real physical deterioration and still get surprisingly good results.

Digital challenges

The big challenge when working with any digital recording on magnetic tape is knowing whether a tape is in poor condition prior to playback. Often a tape will look fine and, because of error correction, will sound fine too, until it stops working entirely.

How then did we know that the Sweet Patootee tapes were experiencing difficulties?

Professional DV machines such as our DVC PRO have a warning function that flashes when the error-correction coding is working at heightened levels. With our first attempt to play back the tapes we noticed that regular sections on most of the tapes could not be fixed by error correction.

The ingest software we use is designed to automatically retry sections of the tape with higher levels of data corruption until a signal can be retrieved. Imagine a process where a tape automatically goes through a playing-rewinding loop until the signal can be read. We were able to play back the tapes eventually, but the high level of error correction was concerning.
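
The playing-rewinding loop described above can be sketched as a simple retry routine. This is an illustration of the logic rather than the actual ingest software; `read_section` is a hypothetical stand-in for whatever interface the capture hardware exposes:

```python
# A sketch of retry-based ingest: re-read a tape section until the error
# corrector recovers a signal, up to a retry limit. `read_section` is a
# hypothetical callback standing in for the real capture interface.
import time

def ingest_section(read_section, section, max_retries=5, pause=1.0):
    """Try to read one tape section, retrying on unrecoverable errors.
    Returns the recovered data and the number of passes it took."""
    for attempt in range(1, max_retries + 1):
        data = read_section(section)
        if data is not None:          # None signals an uncorrectable read
            return data, attempt
        time.sleep(pause)             # let the transport rewind and re-seat
    raise RuntimeError(f"section {section} unreadable after {max_retries} passes")
```

Logging the attempt count per section gives exactly the kind of warning signal we saw on these tapes: a collection that needs many passes per section is one whose error correction is already near its limit.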


In the DV format, around 25% of the recorded signal is composed of subcode data, error detection and error correction.

DVCAM & Mis-alignment

It is not just the over-active error correction on DVCAMs that should send the alarm bells ringing.

Alan Griffiths from Bristol Broadcast Engineering, a trained Sony engineer with over 40 years’ experience working in the television industry, told us that early DVCAM machines pose particular preservation challenges. The main problem is that the ‘mechanisms are completely different’ in earlier DVCAM machines, which means there is ‘no guarantee’ their recordings will play back effectively on later models.

Recordings made on early DVCAM machines exhibit back tension problems and tracking issues. This increases the likelihood of DV dropout on playback, because a loss of information was recorded onto the original tape. The IASA confirms that ‘misalignment of recording equipment leads to recording imperfections, which can take manifold form. While many of them are not or hardly correctable, some of them can objectively be detected and compensated for.’

One possible solution to this problem, as with DAT tapes, is to ‘misalign’ the replay digital video tape recorder to match the misaligned recordings. However ‘adjustment of magnetic digital replay equipment to match misaligned recordings requires high levels of engineering expertise and equipment’ (2009; 72), and must therefore not be ‘tried at home,’ so to speak.

Our experience with the Sweet Patootee tapes indicates that DVCAM tapes are a more fragile format than is commonly thought, particularly if your DVCAM collection was recorded on early machines. If you have a large collection of DVCAM tapes we strongly recommend that you begin to assess the contents and make plans to transfer them to digital files. As always, do get in touch if you need any advice to develop your plans for migration and preservation.

 

Reel-to-reel transfer of Anthony Rye, Selborne’s nature poet

November 25th, 2014

We have recently transferred a number of recordings of the poet, Anthony Rye, reading his work. The tapes were sent by his grandson Gabriel, who was kind enough to tell us a bit more about Anthony’s life and work.

‘Anthony Francis Rye is intimately associated with the Hampshire village of Selborne, a village made famous by Gilbert White and his book, Natural History of Selborne.

The Rye family has been here since the end of the 19th century and Anthony came to live here in the 1940s with his wife, in the house I now live in.

Among his books of poems are The Inn of the Birds (1947), Poems from Selborne (1961) and To A Modern Hero (1957). He was an illustrator and trained as an engraver and illustrated The Inn of the Birds himself, of which he said the poems “…were written to make people more alive to the spirit of bird-life and to the nature of birds generally. It was hoped to communicate something of the intense pleasure in birds felt by the author, and at the same time, by emphasizing their strange remote quality without destroying the sense of their being our fellow creatures…”

Jacket cover depicting a hand drawn rural scene with people walking

His poem ‘The Shadow on the Lyth’ from Poems from Selborne, invokes a dark moment in Selborne’s history when it was proposed by the council to put a much needed sewage works at the bottom of Church Meadow, thus ruining one of the most beautiful settings in Hampshire – one beloved of natural historian Gilbert White. Anthony Rye fought this and after a long struggle managed to have the works re-sited out of sight.’

Gilbert White’s life and work were a significant influence on Rye, and in 1970 he published the book Gilbert White and his Selborne.

Although the BBC has previously broadcast Rye’s poems, Gabriel tells us that these particular recordings have not been. Until now the recordings have been stored in Arthur’s house; migrating them to digital files is an exciting opportunity for family members, and hopefully wider audiences too, to access Rye’s work.

 

Listen to Anthony Rye reading his poems, with thanks to Gabriel for granting permission

Recording technologies in history


Arthur Jolland, a nature photographer and friend of the poet, made the recordings on a SONY 800B, a portable reel-to-reel tape machine described by SONY as ‘compact, convenient and capable, a natural for both business and pleasure.’

The machine, which used a ‘ServoControl Motor; the same type of motor used in missile guidance control systems where critical timing accuracy is a must,’ is historically notorious for its use by US President Richard Nixon, who racked up 3,700-4,000 hours of recordings that would later implicate him during the Watergate Scandal.

Sahr Conway-Lanz explains that ‘room noise may constitute roughly one quarter of the total hours of recorded sound’ because tape machines recorded at the super-slow speed of 15/16 of an inch per second ‘in order to maximize the recording time on each tape’ (547-549).
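Some quick arithmetic shows why such a slow speed was attractive: tape length divided by speed gives running time. The 1,800-foot reel below is our own assumption for illustration; the text does not specify what reels Nixon’s system used.

```python
def recording_hours(reel_feet, speed_ips):
    """Running time of a reel at a given tape speed (inches per second)."""
    inches = reel_feet * 12            # tape length in inches
    return inches / speed_ips / 3600   # seconds of recording -> hours

print(recording_hours(1800, 15/16))    # ~6.4 hours per tape at 15/16 in/s
print(recording_hours(1800, 7.5))      # ~0.8 hours at a full-fidelity speed
```

Eight times the recording time per reel, at the direct cost of fidelity, as the next paragraph explains.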

Decreasing the speed of a tape recording reduces high-frequency response and the signal-to-noise ratio, resulting in more hiss and dropouts. If you listen to the recordings made by Nixon, it is pretty hard to discern what is being said without reference to the transcripts.

The transfer process

There were no big issues with the condition of the Anthony Rye tapes other than a small amount of loose binder shedding. This was easily solved by dry cleaning with pellon fabric prior to digitization.

Although in some cases playing back tapes on exactly the same machine they were recorded on is desirable (particularly so with DAT transfers), we migrated the recordings using our SONY APR 5003.

Sony APR 5003v headblock closeup, with tape laced up

Using a technically superior model, one of the few large-format professional reel-to-reel machines SONY manufactured, mitigates the extent to which errors are added to the recording as part of the transfer process. Furthermore, the greater flexibility and control offered by the 5003 make it easier to accurately replay tapes recorded on machines with lower specifications.

Another slight adjustment was attaching longer leader tape to the start and end of the tape. This is because the Sony APR 5003 has a much longer tape path than the 800B, and if this isn’t done material can be lost from the beginning and end of the recording.

***

The journeys we have been on above – from the natural history of a Hampshire village seen through the eyes of a largely unknown poet to the Watergate scandal – are another example of the diverse technical, cultural and historical worlds that are opened up by the ‘mysterious little reddish-brown ribbon‘ and its playback mechanisms.

World Day for Audiovisual Heritage – digitisation and digital preservation policy and research

October 27th, 2014

Today, October 27, has been declared World Day for Audiovisual Heritage by UNESCO. We also blogged about it last year.

Since 2005, UNESCO have used the landmark to highlight the importance of audiovisual archives to ‘our common heritage’, which contain ‘the primary records of the 20th and 21st centuries.’ Increasingly, however, the day is used to highlight how audio and moving image archives are particularly threatened by ‘neglect, natural decay to technological obsolescence, as well as deliberate destruction’.

Indeed, the theme for 2014 is ‘Archives at Risk: Much More to Do.’ The Swiss National Sound Archives have made this rather dramatic short film to promote awareness of the imminent threat to audiovisual formats, which is echoed by UNESCO’s insistence that ‘all of the world’s audiovisual heritage is endangered.’

As it is World Audiovisual Heritage Day, we thought it would be a good idea to take a look at some of the recent research and policy that has been collected and published relating to digitisation and digital preservation.

While the UNESCO anniversary is useful for raising awareness of the fragility of audiovisual mediums, what is the situation for organisations and institutions grappling with these challenges in practice?

Recent published research – NDSA

The first to consider are preliminary results from a survey published by the US-based NDSA Standards and Practices Working Group; full details can be accessed here.

The survey asked a range of organisations, institutions and collections to rank issues that are critical for the preservation of video collections. Respondents ‘identified the top three stumbling blocks in preserving video as:

  • Getting funding and other resources to start preserving video (18%)
  • Supporting appropriate digital storage to accommodate large and complex video files (14%)
  • Locating trustworthy technical guidance on video file formats including standards and best practices (11%)’

Interestingly in relation to the work we do at Great Bear, which often reveals the fragilities of digital recordings made on magnetic tape, ‘respondents report that analog/physical media is the most challenging type of video (73%) followed by born digital (42%) and digital on physical media (34%).’

It may well be that there is simply more video on analogue/physical media than on other mediums, which could account for the higher response, and that archives are yet to grapple with the archival problem of digital video stored on physical carriers such as DVD and, in particular, consumer-grade DVD-Rs. Full details will be published on The Signal, the Library of Congress’ digital preservation blog, in due course.

Recent research – Digital Preservation Coalition (DPC)

Another piece of preliminary research published recently was the user consultation for the 2nd edition of the Digital Preservation Coalition’s Digital Preservation Handbook. The first edition of the Handbook was published in 2000 but was regularly updated throughout the 00s. The consultation precedes what will be a fairly substantial overhaul of the resource.

Many respondents to the consultation welcomed that a new edition would be published, stating that much content is now ‘somewhat outdated’ given the rapid change that characterises digital preservation as a technological and professional field.

Survey respondents ranked storage and preservation (1), standards and best practices (2) and metadata and documentation (3) as the biggest challenges involved in digital preservation, converging with the NDSA findings. It must be stressed, however, that there wasn’t a massive difference across the remaining categories, which included issues such as compression and encryption, access and creating digital materials.

Some of the responses ranged from the pragmatic…

‘digital preservation training etc tend to focus on technical solutions, tools and standards. The wider issues need to be stressed – the business case, the risks, significant properties’ (16)

‘increasingly archives are being approached by community archive groups looking for ways in which to create a digital archive. Some guidance on how archive services can respond effectively and the issues and challenges that must be considered in doing so would be very welcome’ (16)

…to the dramatic…

‘The Cloud is a lethal method of storing anything other than in Lo Res for Access, and the legality of Government access to items stored on The Cloud should make Curators very scared of it. Most digital curators have very little comprehension of the effect of solar flares on digital collections if they were hit by one. In the same way that presently part of the new method of “warfare” is economic hacking and attacks on financial institutions, the risks of cyber attacks on a country’s cultural heritage should be something of massive concern, as little could demoralise a population more rapidly. Large archives seem aware of this, but not many smaller ones that lack the skill to protect themselves’ (17)

…Others stressed legal issues related to rights management…

‘recording the rights to use digital content and ownership of digital content throughout its history/ life is critical. Because of the efforts to share bits of data and the ease of doing so (linked data, Europeana, commercial deals, the poaching of lines of code to be used in various tools/ services/ products etc.) this is increasingly important.’ (17)

It will be fascinating to see how the consultation responses are further contextualised and placed next to examples of best practice, case studies and innovative technological approaches within the fully revised 2nd edition of the Handbook.

European Parliament Policy on Film Heritage

Our final example relates to the European Parliament and Council Recommendation on Film Heritage. The Recommendation was first decreed in 2005. It invited Member States to offer progress reports every two years about the protection of and access to European film heritage. The 4th implementation report was published on 2 October 2014 and can be read in full here.

The language of the recommendation very much echoes the rationale laid out by UNESCO for establishing World Audiovisual Heritage Day, discussed above:

‘Cinematography is an art form contained on a fragile medium, which therefore requires positive action from the public authorities to ensure its preservation. Cinematographic works are an essential component of our cultural heritage and therefore merit full protection.’

Although the recommendation relates to preservation of cinematic works specifically, the implementation report offers wide ranging insight into the uneven ways ‘the digital revolution’ has affected different countries, at the level of film production/ consumption, archiving and preservation.

The report gravely states that ‘European film heritage risks missing the digital train,‘ a phrase that invites a bit more explanation. One way to understand it is that individual countries, but also Europe as a geo-political space, are currently failing to capitalise on what digital technologies can offer culturally, but also economically.

The report reveals that the theoretical promise of interoperable digital technologies (smooth trading, transmission and distribution across economic, technical and cultural borders) was hindered in practice by costly and complex copyright laws that make the cross-border availability of film heritage, re-use (or ‘mash-up’) and online access difficult to implement. This means that EU member states are not able to monetise their assets or share their cultural worth, a problem underlined by the fact that ‘85% of Europe’s film heritage is estimated to be out-of-commerce, and therefore, invisible for the European citizen’ (37).

In an age of biting austerity, the report makes very clear that there simply aren’t enough funds to implement robust digitization and digital preservation plans: ‘Financial and human resources devoted to film heritage have generally remained at the same level or have been reduced. The economic situation has indeed pushed Member States to change their priorities’ (38).

There is also the issue of preserving analogue expertise: ‘many private analogue laboratories have closed down following the definitive switch of the industry to digital. This raises the question on how to maintain technology and know-how related to analogue film’ (13).

The report gestures toward what is likely to be a splitting archival-headache-to-come for custodians of born digital films: ‘resources devoted to film heritage […] continue to represent a very small fraction of resources allocated to funding of new film productions by all Member States’ (38). Or, to put it in numerical terms, for every €97 invested by the public sector in the creation of new films, only €3 go to the preservation and digitisation of these films. Some countries, namely Greece and Ireland, are yet to make plans to collect contemporary digital cinema (see opposite infographic).

Keeping up to date

It is extremely useful to have access to the research featured in this article. Consulting these different resources helps us to understand the nuts and bolts of technical practices, but also how different parts of the world are unevenly responding to digitisation. If the clock is ticking to preserve audiovisual heritage in the abrupt manner presented in the Swiss National Sound Archives film, the EU research in particular indicates that it may well already be too late to preserve a significant proportion of the audiovisual archives we can currently listen to and watch.

As we have explored elsewhere on this blog, wanting to preserve everything is in many ways unrealistic; making clinical selection decisions is a necessary part of the archival process. The situation facing analogue audiovisual heritage is, however, unprecedented in archival history: the threat of catastrophic drop-out in ten to fifteen years’ time looms large and ominous.

All that is left to say is: enjoy the Day for World Audiovisual Heritage! Treasure whatever endangered media species flash past your eyes and ears. Be sure to consider any practical steps you can take to ensure the films and audio recordings that are important to you remain operable for many years to come.

Transferring Digital Audio Tapes (DATs) to digital audio files

October 9th, 2014

This post focuses on the problems that can arise with the transfer of Digital Audio Tapes (DATs).

An immature recording method (digital) on a mature recording format (magnetic tape): the digital audio recording revolution was never going to get it right first time (although DATs were not, of course, the first digital recordings made on tape).

Indeed, at a meeting of audio archivists held in 1995, there was a consensus even then that DAT was not, and would never be, a reliable archival medium. One participant stated: ‘we have tapes from 1949 that sound wonderful,’ and ‘we have tapes from 1989 that are shot to hell.’ And that was nearly twenty years ago! What chances do the tapes have now?

A little DAT history

Before we explore that, let’s have a little DAT history.

SONY introduced the Digital Audio Tape (DAT) in 1987. At roughly half the size of an analogue cassette tape, DAT can record at a higher, equal or lower sampling rate than a CD (48, 44.1 or 32 kHz respectively) at 16-bit quantization.
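Those figures imply the following raw PCM payload rates (sample rate × 16 bits × 2 channels). This is a rough sketch of the arithmetic only; it ignores the subcode and error-correction overhead that DAT adds on tape.

```python
def pcm_bitrate(sample_rate_hz, bits=16, channels=2):
    """Raw stereo PCM payload in bits per second."""
    return sample_rate_hz * bits * channels

for sr in (48_000, 44_100, 32_000):
    print(sr, pcm_bitrate(sr))   # 1,536,000 / 1,411,200 / 1,024,000 bit/s
```

At 48 kHz, DAT’s highest mode carries slightly more audio data per second than a CD’s 44.1 kHz stream.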

Although popular in Japan, DATs were never widely adopted by the consumer market because they were more expensive than their analogue counterparts. They were, however, embraced in professional recording contexts, in particular for recording live sound.

It was recording industry paranoia, particularly in the US, that really sealed the fate of the format. Because of its threatening promise of perfect replication, DAT was subject to an unsuccessful lobbying campaign by the Recording Industry Association of America (RIAA), which saw DAT as the ultimate attack on copyright law and pressed to introduce the Digital Audio Recorder Copycode Act of 1987.

This law recommended that each DAT machine have a ‘copycode’ chip installed that could detect whether prerecorded copyrighted music was being replicated. The method employed a notch filter that would subtly distort the quality of the copied recording, thus sabotaging any acts of piracy tacitly enabled by the DAT medium. The law was not passed, however, and compromises were made, although the US Audio Home Recording Act of 1992 imposed taxes on DAT machines and blank media.

How did they do ‘dat?

Like video tape recorders, DAT machines use a rotating head and helical scan method to record data. The helical scan can, however, pose real problems for preservation transfers of DAT tapes because it makes it difficult to splice the tape back together if it becomes sticky and snaps during the tape wind. With analogue audio tape, which records information longitudinally, it is far easier to splice the tape together and continue the transfer without risking irrevocable information loss.

Another problem posed by the helical scan method is that such tapes are more vulnerable to tape pack and backing deformation, as the CLIR guide explains:

‘Tracks are recorded diagonally on a helical scan tape at small scan angles. When the dimensions of the backing change disproportionately, the track angle will change for a helical scan recording. The scan angle for the record/playback head is fixed. If the angle that the recorded tracks make to the edge of the tape do not correspond with the scan angle of the head, mistracking and information loss can occur.’

When error correction can’t correct anymore

Most people will be familiar with the sound of digital audio dropouts even if they don’t know the science behind them. You will know them most probably as those horrible clicking noises produced when the error correction technology on CDs stops working. The clicks indicate that the ‘threshold of intelligibility’ for digital data has been breached and, as theorist Jonathan Sterne reminds us, ‘once their decay becomes palpable, the file is rendered entirely unreadable.’

Our SONY PCM 7030 professional DAT machine, pictured opposite, has a ‘playback condition’ light that flashes if an error is present. On sections of the tape where quality is really bad the ‘mute’ light can flash to indicate that the error correction technology can’t fix the problem. In such cases drop outs are very audible. Most DAT machines did not have such a facility, however, and you only knew there was a problem when you heard the glitchy-clickety-crackle during playback when, of course, it was too late to do anything about it.

The bad news for people with large, yet to be migrated DAT archives is that the format is ‘particularly susceptible to dropout. Digital audio dropout is caused by a non-uniform magnetic surface, or a malfunctioning tape deck. However, because the magnetically recorded information is in binary code, it results in a momentary loss of data and can produce a loud transient click or worse, muted audio, if the error correction scheme in the playback equipment cannot correct the error,’ the wonderfully informative A/V Artifact Atlas explains.

Given the high density nature of digital recordings on narrow magnetic tape, even the smallest speck of dust can cause digital audio dropouts. Such errors can be very difficult to eliminate. Cleaning playback heads and re-transferring is an option, but if the dropout was recorded at the source or the surface of tape is damaged, then the only way to treat irregularities is through applying audio restoration technologies, which may present a problem if you are concerned with maintaining the authenticity of the original recording.
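The threshold behaviour described above can be illustrated with a toy code. DAT actually uses interleaved Reed-Solomon coding, which is far more sophisticated; the 3x-repetition scheme below is only a minimal sketch of the same principle: errors within the code’s capacity are corrected invisibly, while errors beyond it come through as corrupted data, the digital equivalent of a dropout.

```python
from collections import Counter

def encode(bits):
    """Repeat each bit three times, a crude form of error-correction redundancy."""
    return [b for b in bits for _ in range(3)]

def decode(coded):
    """Majority vote over each group of three copies."""
    out = []
    for i in range(0, len(coded), 3):
        out.append(Counter(coded[i:i + 3]).most_common(1)[0][0])
    return out

data = [1, 0, 1, 1]
coded = encode(data)
coded[0] ^= 1                    # one flipped bit: corrected invisibly
assert decode(coded) == data
coded[1] ^= 1                    # error now hits 2 of 3 copies: the vote fails
assert decode(coded) != data     # -> audible dropout territory
```

Real systems trade much less overhead for much greater correcting power, but the cliff edge, working perfectly and then not at all, is the same.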

Listen to this example of what a faulty DAT sounds like

Play back problems and mouldy DATs

Mould growth on the surface of DAT tape

A big problem with DAT transfers is actually being able to play back the tapes, or what is known in the business as ‘DAT compatibility.’ In an ideal world, to get the most perfect transfer you would play back a tape on the same machine that it was originally recorded on. The chances of doing this are of course pretty slim. While you can play your average audio cassette tape on pretty much any tape machine, the same cannot be said for DAT tapes. Often recordings were made on misaligned machines. The only solution for playback is, Richard Hess suggests, to mis-adjust a working machine to match the alignment of the recording on the tape.

As with any archival collection, if it is not stored in appropriate conditions then mould growth can develop. As mentioned above, DAT tapes are roughly half the size of the common audiocassette and the tape itself is thin and narrow. This makes them difficult to clean because they are mechanically fragile. Adapting a machine specifically for the purpose of cleaning, as we have done with our Studer machine, would be the ideal solution. There is, however, not a great deal of research and information about restoring mouldy DATs available online, even though we are seeing more and more DAT tapes exhibiting this problem.

As with much of the work we do, the recommendation is to migrate your collections to digital files as soon as possible, but often it is a matter of priorities and budgets. From a technical point of view, DATs are a particularly vulnerable format. Machine obsolescence means that, compared to their analogue counterparts, professional DAT machines will be increasingly hard to service in the long term. As detailed above, glitchy dropouts are almost inevitable given the sensitivity and all-or-nothing quality of digital data recorded on magnetic tape.

It seems fair to say that despite being meant to supersede analogue formats, DATs are far more likely to drop out of recorded sound history in a clinical and abrupt manner.

They therefore should be a high priority when decisions are made about which formats in your collection should be migrated to digital files immediately, over and above those that can wait just a little bit longer.

Phyllis Tate’s Nocturn for Four Voices: 3″ reel-to-reel (¼ inch tape) transfer

September 19th, 2014

We have recently transferred a previously unpublished recording of British 20th century composer Phyllis Tate’s Nocturn for Four Voices, on a 3″ reel of ¼ inch tape. The tape is a 2-track stereo recording made at 7.5 inches per second (in/s) at the Purcell Room in London’s Southbank Centre in 1975, and was broadcast on 16 September 1976.

When migrating magnetic tape recordings to digital files there are several factors that can be considered to assess the quality of recording even before we play back the tape. One of these is the speed at which the tape was originally recorded.

Diagram of track widths on magnetic tape, and the relative thicknesses of 1-, 2- and 4-track recordings

Generally speaking, the faster the speed, the better the reproduction quality when making the digital transfer. This is because higher tape speeds spread the recorded signal longitudinally over more tape area, thereby reducing the effects of dropouts and tape noise. The number of tracks recorded on the tape also has an impact on how good it sounds today. Simply put, the more tape area devoted to the recorded signal, whether through faster speed or wider tracks, the better the transfer will sound.
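The speed/quality relationship can be made concrete: one cycle of a tone recorded at tape speed v occupies a physical wavelength of v ÷ f on the tape. This back-of-the-envelope sketch is our own illustration of that arithmetic, showing how much tape a 10 kHz tone occupies at common reel-to-reel speeds.

```python
def recorded_wavelength_um(speed_ips, freq_hz):
    """Physical length one cycle of a tone occupies on tape, in micrometres."""
    return speed_ips * 25_400 / freq_hz   # 1 inch = 25,400 um

for speed in (15, 7.5, 3.75, 15/16):
    print(speed, round(recorded_wavelength_um(speed, 10_000), 2))
# a 10 kHz tone spans ~38 um per cycle at 15 in/s but only ~2.4 um at 15/16 in/s
```

The shorter the recorded wavelength, the more easily a speck of debris or a dropout of a given physical size can obliterate it, which is why slower speeds sound noisier.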

The tape of Nocturn for Four Voices was, however, suffering from binder hydrolysis and therefore needed to be baked prior to playback. EMI tape doesn’t normally suffer from this, but as the tape was EMI professional stock it may well have used Ampex-made tape and/or have been back coated, thus making the binder more susceptible to such problems.

Remembering Phyllis Tate

Nocturn for Four Voices is an example of how Tate ‘composed for unusual combinations of instruments and voice.’ The composition includes ‘Bass Clarinet, Celeste, String Quartet and Double Bass’, music scholar Jane Ballantyne explains.

The tape was brought into us by Tate’s daughter, Celia Frank, who is currently putting the finishing touches to a web archive that, she hopes, will help contemporary audiences (re)discover her mother’s work.

Like many women musicians and artists, Phyllis Tate, who trained at the Royal Academy of Music, remains fairly obscure to the popular cultural ear.

This is not to say, of course, that her work did not receive critical acclaim from her contemporaries or posthumously. Indeed, it is fair to say that she had a very successful composing career. Both the BBC and the Royal Academy of Music, among others, commissioned compositions from Tate, and her work is available to hire or buy from esteemed music publishers Oxford University Press (OUP).

Edmund Whitehouse, who wrote a short biography of the composer, described her as ‘one of the outstanding British composers of her generation, she was truly her own person whose independent creative qualities produced a wide range of music which defy categorisation.’

Her music often comprised contrasting emotional registers, lyrical sections and unexpected changes of direction. As a writer of operettas and choral music, with a penchant for setting poetry to music, her work is described by the OUP as the product of ‘an unusual imagination and an original approach to conventional musical forms or subjects, but never to the extent of being described as “avant-garde”.’

Tate’s music was very much a hit with iconic suffrage composer Ethel Smyth who, upon hearing Tate’s compositions, reputedly declared: ‘at last, I have heard a real woman composer.’ Such praise was downplayed by Tate, who tended to point to Smyth’s increased loss of hearing in later life as the cause of her enjoyment: ‘My Cello Concerto was performed soon afterwards at Bournemouth with Dame Ethel sitting in the front row banging her umbrella to what she thought was the rhythm of the music.’

Open reel tape and box

While the dismissal of Smyth’s appreciation is tender and good humoured, the fact that Tate destroyed significant proportions of her work does suggest that at times she could have doubted her own abilities as a composer. Towards the end of her life she revealed: ‘I must admit to having a sneaking hope that some of my creations may prove to be better than they appear. One can only surmise and it’s not for the composer to judge. All I can vouch is this: writing music can be hell; torture in the extreme; but there’s one thing worse; and that is not writing it.’ As a woman composing in an overwhelmingly male environment, such hesitancies are perhaps an understandable expression of what literary scholars Gilbert and Gubar called ‘the anxiety of authorship.’

Tate’s work is a varied and untapped resource for those interested in twentieth century composition and the wider history of women composers. We wish Celia the best of luck in getting the website up and running, and hope that many more people will be introduced to her mother’s work as a consequence.

Thanks to Jane Ballantyne and Celia Frank for their help in writing this article.

Obsolete technologies and contemporary sound art

August 26th, 2014

At the recent Supernormal festival held at Braziers Park, Oxfordshire, a number of artists were using analogue technologies to explore concepts that dovetail nicely with the work we do at Great Bear collecting, servicing and repairing obsolete tape machines.

Hacker Farm, for example, keep ‘obsolete tech and discarded, post-consumerist debris’ alive using ‘salvaged and the hand-soldered’ DIY electronics. Their performance was a kind-of technological haunting, the sound made when older machines are turned on and re-purposed in different eras. Eerie, decayed, pointless and mournful, the conceptual impetus behind Hacker Farm raises many questions that emerge from the rather simple desire to keep old technologies working. Such actions soon become strange and aesthetically challenging in the contemporary technological context, which actively reproduces obsolescence in the endless search for the new, fostering continuous wastefulness at the centre of industrial production.

Music by the Metre

Another performance at the festival which engaged with analogue technologies was Graham Dunning’s Music by the Metre. The piece pays homage to Situationist Pinot-Gallizio‘s method of ‘Industrial Painting’ (1957-1959), in which the Italian artist created a 145 metre hand and spray painted canvas that was subsequently cut up and sold by the metre. The action, which attempted to destroy the perception of the art-object as sacrosanct and transform it into something which could be mass-quantified and sold, aimed to challenge ‘the mental disease of banalisation’ inherent to what Guy Debord termed ‘the society of the spectacle.’

In Dunning’s contemporary piece, he used spools of open reel tape to record a series of automated machines comprising looping record players, synth drone, live environmental sound and tape loops. The tape was then cut by the artist into metre-long segments, placed in see-through plastic bags and ‘sold’ on the temporary market stall used to record and present the work.

Dunning’s work exists in interesting tension with the ideas of Pinot-Gallizio, largely because of the different technological and aesthetic contexts the artists are responding to.

Pinot-Gallizio’s industrial painting aimed to challenge the role of art within a consumer society by accelerating its commodity status (mass-produced, uniform, quantified, art as redundant, art as part of the wall paper). Within Dunning’s piece, such a process of acceleration is not so readily available, particularly given the deep obsolescence of consumer-grade open reel tape in 2014, and, furthermore, its looming archival obsolescence (often cited at ’10-20 years‘ by archivists).

Within the contemporary context, open reel analogue tapes have become ornate and aestheticised in themselves because they have lost their function as an everyday, recordable mass blank media. When media lose their operating context they are transformed into objects of fascination and desire, as Claire Bishop pithily states in her Artforum essay, ‘The Digital Divide’: ‘Today, no exhibition is complete without some form of bulky, obsolete technology—the gently clucking carousel of the slide-projector, or the whirring of an 8mm or 16mm film reel […] the sumptuous texture of indexical media is unquestionably seductive, but its desirability also arises from the impression that it is scarce, rare and precious.’

In reality, the impression of open reel analogue tape’s rarity is well justified, as manufacturers and distributors of magnetic tape are increasingly hard to find. Might there be something more complex and contradictory going on in Dunning’s homage to Pinot-Gallizio? Could we understand it as a neat inversion of the mass-metred-object? Doubly cut adrift from its historical (1950s-1970s) and technological operating context (the open reel tape recorder), the bag of tape is decelerated, existing as nothing other than art object. Stuffed messily in a plastic bag and displayed ready to be sold (if only by donation), the tape is both ugly and useless given its original and intended use. It is here that Dunning’s and Pinot-Gallizio’s works converge, situated at different historical and temporal poles from which a critique of the consumer society can be mounted: accelerated plenitude and decelerated exhaustion.


Analogue attachments

As a company that works with obsolete magnetic tape-based media, Great Bear has a vested interest in ensuring tapes and playback machines remain operational. Although our studio, with its stacks of long-forgotten machines, may look like a curious art installation to some, the tapes we migrate to digital files are not quite art objects…yet. Like Hacker Farm, we help to keep old media alive through careful processes of maintenance and repair.

From looking at how contemporary sound artists engage with analogue technologies, it is clear that the medium remains very much part of the message, as Marshall McLuhan would say, and that meaning is amplified, contorted or transformed depending on the historical context and the media norms present within it.

Reports from the ‘bleeding edge’ – The Presto Centre’s AV Digitisation TechWatch Report #2

July 28th, 2014

The Presto Centre‘s AV Digitisation and Digital Preservation TechWatch Report, published July 2014, introduces readers to what they describe as the ‘bleeding edge’ of AV Digitisation and Archive technology.

Written in an engaging style, the report is well worth a read. If you don’t have time, however, here are some choice selections from the report which relate to the work we do at Great Bear, and some of the wider topics that have been discussed on the blog.

The first issue to raise, as ever, is continuing technological change. The good news is

‘there are no unexpected changes in file sizes or formats on the horizon, but it is fair to say that the inexorable increase in file size will continue unabated […] Higher image resolutions, bits per pixel and higher frame rates are becoming a fact of life, driving the need for file storage capacity, transfer bandwidth and processing speeds, but the necessary technology developments continue to track some form of Moore’s law, and there is no reason to believe that the technical needs will exceed technical capability, although inevitably there will be continuing technology updates needed by archives in order for them to manage new material.’

Having pointed out the inevitability of file expansion, however, other parts of the report clearly express the very real everyday challenges that ever-increasing file sizes pose to the transmission of digital information between different locations:

‘transport of content was raised by one experienced archive workflow provider. They maintained that, especially with very high bit-rate content (such as 4k) it still takes too long to transfer files into storage over the network, and in reality there are some high-capacity content owners and producers shipping stacks of disks around the country in Transit vans, on the grounds that, in the right circumstances this can still be the highest bandwidth transfer mechanism, even though the Digital Production Partnership (DPP) are pressing for digital-only file transfer.’

While those hordes of Transit vans zipping up and down the motorway between different media providers are probably the exception rather than the rule, we should note that a similar point was raised by Per Platou when he talked about the construction of the Videokunstarkivet – the Norwegian video art archive. Due to the size of video files in particular, Per found that publishing them online really pushed server capabilities to the absolute maximum. This illustrates that there remains a discrepancy between the rate at which broadcast technologies develop and the economic, technological and ecological resources available to send and receive them.
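To see why a van can still out-perform the network, a back-of-envelope sketch helps. The 50 TB payload, 1 Gbit/s line and four-hour drive below are our own illustrative assumptions, not figures from the TechWatch report.

```python
# Back-of-envelope comparison: network transfer vs driving disks across town.
# All figures are illustrative assumptions, not from the TechWatch report.

def transfer_hours(size_terabytes, bandwidth_mbit_per_s):
    """Hours to move `size_terabytes` over a link of `bandwidth_mbit_per_s`."""
    size_bits = size_terabytes * 8 * 10**12   # decimal terabytes to bits
    return size_bits / (bandwidth_mbit_per_s * 10**6) / 3600

# A few days' 4K rushes, assumed at 50 TB, over a 1 Gbit/s line:
network = transfer_hours(50, 1000)            # roughly 111 hours

# The same 50 TB on disks in a van, assumed to be a 4-hour drive:
van = 4

print(f"network: {network:.0f} h, van: {van} h")
```

At these (assumed) numbers the van wins by more than an order of magnitude, which is exactly the point the workflow provider was making.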

Another interesting point about the move from physical to file-based media is the increased need for Quality-Control (QC) software tools that will be employed to ‘ensure that our digital assets are free from artefacts or errors introduced by encoders or failures of the playback equipment.’ Indeed, given that glitches born from slow or interrupted transfers may well be inevitable because of limited server capabilities, software developed by Bristol-based company Vidcheck will be very useful because it ‘allows for real-time repair of Luma, Chroma, Gamma and audio loudness issues that may be present in files. This is a great feature given that many of the traditional products on the market will detect problems but will not automatically repair them.’
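As a toy illustration of what one such QC check involves, the sketch below flags 8-bit luma samples that fall outside the nominal 16-235 broadcast range. This is our own minimal example, not how Vidcheck works, and it only detects problems rather than repairing them.

```python
# Minimal sketch of one QC check: flag 8-bit luma samples outside the
# nominal broadcast range (16-235). Real QC tools inspect whole frames
# and many more parameters; this only detects, it does not repair.

BLACK, WHITE = 16, 235

def out_of_range_luma(samples):
    """Return indices of luma samples outside the 16-235 broadcast range."""
    return [i for i, y in enumerate(samples) if y < BLACK or y > WHITE]

frame = [16, 128, 240, 235, 5]     # toy 'frame' of luma values
print(out_of_range_luma(frame))    # → [2, 4]
```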

Other main points worth mentioning from the report are the increasing move to open-source, software-only solutions for managing digital collections and the rather optimistic tone directed toward ‘archives with specific needs who want to find a bespoke provider who can help design, supply and support a viable workflow option – so long as they avoid the large, proprietary ‘out-of-the-box’ solutions.’

If you are interested in reading further TechWatch reports you can download #1 here, and watch out for #3 that will be written after the International Broadcasting Convention (IBC) which is taking place in September, 2014.

 

Digital preservation, aesthetics and approaches

July 23rd, 2014

Digital Preservation 2014, the annual meeting of the National Digital Information Infrastructure and Preservation Program and the National Digital Stewardship Alliance is currently taking place in Washington, DC in the US.

The Library of Congress’s digital preservation blog The Signal is a regular reading stop for us, largely because it contains articles and interviews that impressively meld theory and practice, even if it does not exclusively cover issues relating to magnetic tape.

What is particularly interesting, and indeed is a feature of the keynotes for the Digital Preservation 2014 conference, is how academic theory, especially relating to aesthetics and art, has become an integral part of the conversation about how best to meet the challenge of digital preservation in the US. Keynote addresses from academics like Matthew Kirschenbaum (author of Mechanisms) and Shannon Mattern sit alongside presentations from large memory institutions and those seeking ways to devise community approaches to digital stewardship.

The relationship between digital preservation and aesthetics is also a key concern of Richard Rinehart and Jon Ippolito’s new book Re-Collection: Art, New Media and Social Memory, which has just been published by MIT Press.

This book, if at times deploying rather melodramatic language about the ‘extinction!’ and ‘death!’ of digital culture, gently introduces the reader to the wider field of digital preservation and its many challenges. Re-Collection deals mainly with born-digital archives, but many of the ideas are pertinent for thinking about how to manage digitised collections as well.

In particular, the authors’ recommendation that the digital archival object remain variable is striking: ‘the variable media approach encourages creators to define a work in medium-independent terms so that it can be translated into a new medium once its original format is obsolete’ (11). Emphasising the variability of the digital media object as a preservation strategy challenges the established wisdom of museums and other memory institutions, Rinehart and Ippolito argue. The default position of preserving the art work in its ‘original’ form effectively freezes a once dynamic entity in time and space, potentially rendering the object inoperable because it denies works of art the potential to change when re-performed or re-interpreted. Their message is clear: be variable, adapt or die!

As migrators of tape-based collections, media variability is integral to what we do. Here we tacitly accept the inauthenticity of the digitised archival object, an artefact which has been allowed to change in order to ensure accessibility and cultural survival.

US/European differences?

While aesthetic and theoretical thinking is influencing how digital information management is practiced in the US, it seems as if the European approach is almost exclusively framed in economic and computational terms.

Consider, for example, the recent EU press release about the vision to develop Europe’s ‘knowledge economy‘. The plans to map and implement data standards, create cross-border coordination and an open data incubator are, it would seem, far more likely to ensure interoperable and standardised data sharing systems than any of the directives to preserve cultural heritage in the past fifteen years, a time period characterised by markedly unstable approaches, disruptive innovations and a conspicuous lack of standards (see also the E-Ark project).

It may be tempting these days to see the world as one gigantic, increasingly automated archival market, underpinned by the legal imperative to collect all kinds of personal data (see the DRIP legislation recently rushed through the UK parliament). Yet it is also important to remember the varied professional, social and cultural contexts in which data is produced and managed.

One session at DigiPres, for example, will explore the different archival needs of the cultural heritage sector:

‘Digital cultural heritage is dependent on some of the same systems, standards and tools used by the entire digital preservation community. Practitioners in the humanities, arts, and information and social sciences, however, are increasingly beginning to question common assumptions, wondering how the development of cultural heritage-specific standards and best practices would differ from those used in conjunction with other disciplines […] Most would agree that preserving the bits alone is not enough, and that a concerted, continual effort is necessary to steward these materials over the long term.’

Of course, approaches to digital preservation and data management in the US are also largely overdetermined by economic directives, and European policies do still speak to the needs of cultural heritage institutions and other public organisations.

What is interesting, however, is the minimal transnational cross-pollination at events such as DigiPres, despite the globally networked condition we all share. This suggests subtle divergences in how digital information is managed now, and how it will be managed in coming years, across these (very large) geopolitical regions. Aesthetics or no aesthetics, the market remains imperative. Despite the turn toward open archives and re-usable data, competition is at the heart of the system and is likely to win out above all else.

D1, D2 & D3 – histories of digital video tape

July 14th, 2014

D1 tape

The images in this article are of the first digital video tape formats: the D1, D2 and D3. The tendency to continually downsize audiovisual technology is clearly apparent: the gargantuan shell of the D1 gradually shrinks to the D3, which is close to the size of a domestic VHS tape.

Behind every tape (and every tape format) lie interesting stories, and the technological wizardry and international diplomacy that helped shape the roots of our digital audio visual world are worth looking into.

In 1976, when the green shoots of digital audio technology were emerging at industry level, the question of whether Video Tape Recorders (VTRs) could be digitised began to be explored in earnest by R&D departments at SONY, Ampex and Bosch G.m.b.H. There was considerable scepticism among researchers about whether digital video tape technology could be developed at all because of the wide bandwidth required to transmit a digital image.

In 1977, however, as reported on the SONY website, Yoshitaka Hashimoto and his team began to intensely research digital VTRs and ‘in just a year and a half, a digital image was played back on a VTR.’

Several years of product development followed, shaped, in part, by competing regional preferences. As Jim Slater argues in Modern Television Systems (1991): ‘much of the initial work towards digital standardisation was concerned with trying to find ways of coping with the three very different colour subcarrier frequencies used in NTSC, SECAM and PAL systems, and a lot of time and effort was spent on this’ (114).

Establishing a standard sampling frequency had real financial consequences; it could not be randomly plucked out of the air: the higher the sampling frequency, the greater the overall bit rate; the greater the overall bit rate, the more storage space needed in digital equipment. In 1982, after several years of negotiations, a 13.5 MHz sampling frequency was agreed upon by European and North American parties; ‘Japanese, the Russians, and various other broadcasting organisations supported the proposals, and the various parameters were adopted as a world standard, Recommendation 601 [a.k.a. the 4:2:2 DTV] standard of the CCIR [Consultative Committee for International Radio, now International Telecommunication Union]’ (Slater, 116).

The 4:2:2 DTV standard was an international standard that would form the basis of the (almost) exclusively digital media environment we live in today. It was ‘developed in a remarkably short time, considering its pioneering scope, as the worldwide television community recognized the urgent need for a solid basis for the development of an all-digital television production system’, write Stanley Baron and David Wood.

Once agreed upon, product development could proceed. The first digital video tape format, the D1, was introduced to the market in 1986. It was an uncompressed component video format which used enormous bandwidth for its time: 173 Mbit/sec, with a maximum recording time of 94 minutes.

D-2 and D-3 Tapes
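The arithmetic behind these figures is easy to check. The sketch below assumes 8 bits per sample throughout, as per the original Recommendation 601, and takes the D1 bit rate quoted above at face value.

```python
# Quick arithmetic behind the Rec. 601 and D1 figures quoted above.
# 4:2:2 sampling: luma at 13.5 MHz, each chroma channel at half that,
# all assumed at 8 bits per sample.

LUMA_HZ = 13.5e6
CHROMA_HZ = LUMA_HZ / 2
BITS = 8

total_rate = (LUMA_HZ + 2 * CHROMA_HZ) * BITS   # bits per second
print(total_rate / 1e6)                          # → 216.0 Mbit/s

# D1 recorded roughly 173 Mbit/s. Over a maximum 94-minute tape that is:
d1_bits = 173e6 * 94 * 60
print(d1_bits / 8 / 1e9)                         # roughly 122 decimal gigabytes
```

A single 94-minute D1 tape therefore holds on the order of 120 GB of picture data, which puts the ‘enormous bandwidth for its time’ remark in perspective.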

As Slater writes

‘unfortunately these machines are very complex, difficult to manufacture, and therefore very expensive […] they also suffer from the disadvantage that being component machines, requiring luminance and colour-difference signals at input and output, they are difficult to install in a standard studio which has been built to deal with composite PAL signals. Indeed, to make full use of the D1 format the whole studio distribution system must be replaced, at considerable expense’ (125).

Being forced to effectively re-wire whole studios, and the considerable risk involved in doing this because of continual technological change, strikes a chord with the challenges UK broadcast companies face as they finally become ‘tapeless’ in October 2014 as part of the Digital Production Partnership’s AS-11 policy.

Sequels and product development

As the story so often goes, the D1 would soon be followed by the D2. Those that did make the transition to D1 were probably kicking themselves, and you can only speculate about the number of back injuries sustained getting the machines into the studio (from experience we can tell you they are huge and very heavy!)

It was fairly inevitable that a sequel would be developed because, even though the D1 provided uncompromising image quality, it was most certainly an unwieldy format, apparent from its gigantic size and component wiring. In response a composite digital video format – the D2 – was developed by Ampex and introduced in 1988.

In this 1988 promotional video, you can see the D2 in action. Amazingly for our eyes and ears today, the D2 is presented as the ideal archival format: amazing for its physical size (hardly inconspicuous on the storage shelf!) but also because it used composite video signal technology. Composite signals combine on one wire all the component parts which make up a video signal: chrominance (colour, or Red, Green, Blue – RGB) and luminance (the brightness or black and white information, including grayscale).

While the composite video signal used lower bandwidth and was more compatible with existing analogue systems used in the broadcast industry of the time, its value as an archival format is questionable. A comparable process for the storage we use today would be to add compression to a file in order to save file space and create access copies. While this is useful in the short term it does risk compromising file authenticity and quality in the long term. The Ampex video is fun to watch however, and you get a real sense of how big the tapes were and the practical impact this would have had on the amount of time it took to produce TV programmes.

Enter the D3

Following the D2 is the D3, the final video tape format covered in this article (although there were of course also the D5 and D9).

The D3 was introduced by Panasonic in 1991 in order to compete with Ampex’s D2. It has the same sampling rate as the D2 with the main difference being the smaller shell size.

The D3’s biggest claim to fame was that it was the archival digital video tape of choice for the BBC, who migrated their analogue video tape collections to the format in the early 1990s. One can only speculate that the decision to take the archival plunge with the D3 was a calculated risk: it appeared to be a stable-ish technology (it wasn’t a first generation technology and the difference between D2 and D3 is negligible).

The extent of the D3 archive is documented in a white paper published in 2008, D3 Preservation File Format, written by Philip de Nier and Phil Tudor: ‘the BBC Archive has around 315,000 D3 tapes in the archive, which hold around 362,000 programme items. The D3 tape format has become obsolete and in 2007 the D3 Preservation Project was started with the goal to transfer the material from the D3 tapes onto file-based storage.’

Tom Heritage, reporting on the development of the D3 preservation project in 2013/2014, reveals that ‘so far, around 100,000 D3 and 125,000 DigiBeta videotapes have been ingested representing about 15 Petabytes of content (single copy).’

It has then taken six years to migrate less than a third of the BBC’s D3 archive. Given that D3 machines are now obsolete, it is more than questionable whether there are enough D3 head hours left in existence to read all the information back clearly and to an archive standard. The archival headache is compounded by the fact that ‘with a large proportion of the content held on LTO3 data tape [first introduced 2004, now on LTO-6], action will soon be required to migrate this to a new storage technology before these tapes become difficult to read.’ With the much publicised collapse of the BBC’s Digital Media Initiative (DMI) in 2013, you’d have to have a very strong disposition to work in the BBC’s audio visual archive department.
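Taking the quoted figures at face value, some rough arithmetic gives a sense of the scale involved; decimal units are assumed throughout.

```python
# Rough arithmetic on the BBC figures quoted above (illustrative only).

d3_total = 315_000           # D3 tapes in the archive (de Nier & Tudor, 2008)
d3_done = 100_000            # D3 tapes ingested so far (Heritage, 2013/14)
digibeta_done = 125_000      # DigiBeta tapes ingested alongside
petabytes = 15               # single-copy storage so far

fraction_done = d3_done / d3_total                         # just under a third
avg_gb_per_tape = petabytes * 1e6 / (d3_done + digibeta_done)

print(f"{fraction_done:.0%} of D3 migrated, ~{avg_gb_per_tape:.0f} GB per tape")
```

At roughly 67 GB of content per tape, the remaining 215,000 D3 tapes alone imply well over another ten petabytes still to ingest.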

The roots of the audio visual digital world

The development of digital video tape, and the international standards which accompanied its evolution, is an interesting place to start understanding our current media environment. They are also a great place to begin examining the problems of digital archiving, particularly when file migration has become embedded within organisational data management policy, and data collections are growing exponentially.

While the D1 may look like an alien techno-species from a distant land compared with the modest, immaterial file lists neatly stored on the hard drives we are accustomed to, they are related through the 4:2:2 sampling rate which revolutionised high-end digital video production and continues to shape our mediated perceptions.

Videokunstarkivet – Norway’s Digital Video Art Archive

July 7th, 2014

We have recently digitised a U-matic video tape of eclectic Norwegian video art from the 1980s. The tape documents a performance by Kjartan Slettemark, an influential Norwegian/Swedish artist who died in 2008. The tape is the ‘final mix’ of a video performance entitled Chromakey Identity Blue, in which Slettemark live mixed several video sources onto one tape.

The theoretical and practical impossibility of documenting live performance has been hotly debated by performance theorists in recent times, and there is some truth to those claims when we consider the encounter with Slettemark’s work in the Great Bear studio. The recording is only one aspect of an overall performance which, arguably, was never meant as a stand-alone piece. This was certainly reflected in our Daily Mail-esque reaction when we played the video back: ‘Eh? Is this art?! I don’t get it!’ was the resounding response.

Having access to the wider context of the performance is sometimes necessary if the intentions of the artist are to be appreciated. Thankfully, Slettemark’s website includes part-documentation of Chromakey Identity Blue, and we can see how the different video signals were played back on various screens, arranged on the stage in front of (what looks like) a live TV audience.

Upon seeing this documentation, the performance immediately evokes the wider context of 70s/80s video art, which used the medium to explore the relationship between the body, space, screen and, in Slettemark’s case, the audience. A key part of Chromakey Identity Blue is the interruption of the audience’s presence in the performance, realised when their images are screened across the face of the artist, whose chroma key mask enables him to perform a ‘special effect’ which layers two images or video streams together.

What unfolds through Slettemark’s performance is at times humorous, suggestive and moving, largely because of the ways the faces of different people interact, perform or simply ignore their involvement in the spectacle. As Marina Abramovic‘s use of presence testifies, there can be something surprisingly raw and even confrontational about incorporating the face into relational art. Meeting with the ‘face’ of another, understood as an ethical space, became a key concept for the twentieth-century philosopher Emmanuel Levinas. The face, Bettina Bergo argues, locates ‘“being” as an indeterminate field’ in which we encounter ‘the Other as a face that addresses me […] The encounter with a face is inevitably personal.’

If an art work like Slettemark’s is moving, then, it is because it stages moments where ‘faces’ reflect and interface across each other. Faces meet and become technically composed. Through the performance of personal-facial address in the artwork, it is possible to glimpse for a brief moment the social vulnerability and fragility such meetings engender. Brief, because the seriousness of Chromakey Identity Blue is diffused by a kitsch use of a disco ball that the artist moves across the screen to symbolically change the performed image, conjuring the magical feel of new technologies and how they facilitate different ways of seeing, being and acting in the world.

Videokunstarkivet (The Norwegian Video Art Archive)

VKA DAM Interface

The tape of Slettemark was sent to us by Videokunstarkivet, an exciting archival project mapping all the works of video art that have been made in Norway since the mid-1960s. Funded by the Norwegian Arts Council, the project has built its digital archival infrastructure from the bottom up, and those working on it have learnt a good many things along the way. Per Platou, who is managing the project, was generous enough to share some of the insights for readers of our blog, along with a selection of images from the archive’s interface.

There are several things to be considered when creating a digital archive ‘from scratch’. Often at the beginning of a large project it is possible to look around for examples of best practice within your field. This isn’t always the case for digital archives, particularly those working almost exclusively with video files, whose communities of practice are unsettled and established ways of working few and far between. The fact that even in 2014, when digital technologies have been widely adopted throughout society, there is still not any firm agreement on standard access and archival file formats for video files indicates the peculiar challenges of this work.

Because of this, projects such as Videokunstarkivet face multiple challenges, with significant amounts of improvisation required in the construction of the project infrastructure. An important consideration is the degree of access users will have to the archive material. As Per explained, publicly re-publishing the archive material from the site in an always open access form is not a concern of the Videokunstarkivet, largely due to the significant administrative issues involved in gaining licensing and copyright permissions. ‘I didn’t even think there was a difference between collecting and communicating the work, yet after a while I saw there is no point in showing everything; it has to be filtered and communicated in a certain way.’

VKA DAM Interface

Instead, interested users will be given a research key or password which enables them to access the data and edit metadata where appropriate. If users want to re-publish or show the art in some form, contact details for the artist/copyright holder are included as part of the entry. Although the Videokunstarkivet deals largely with video art, entries on individual artists include information about other archival collections where their material may be stored, in order to facilitate further research. Contemporary Norwegian video artists are also encouraged to deposit material in the database, ensuring that ongoing collecting practices are built in to the long-term project infrastructure.

VKA DAM Interface

Another big consideration in constructing an archive is what to collect. Per told me that video art in Norway really took off in the early 80s. Artists who incorporated video into their work weren’t necessarily specialists in the medium, ‘there just happened to be a video camera nearby so they decided to use it.’ Video was therefore often used alongside films, graphics, performance and text, making the starting point for the archive, according to Per, ‘a bit of a mess really.’ Nonetheless, Videokunstarkivet ‘approaches every artist like it was Edvard Munch,’ because it is very hard to know now exactly what will be culturally valuable in 10, 20 or even 100 years from now. While it may not be appropriate to ‘save everything!’ for larger archival projects, for a self-contained and focused archival project such as the Videokunstarkivet, an inclusive approach may well be perfectly possible.

Building software infrastructures

Another important aspect of the project is technical: the actual building of the back and front end of the software infrastructure that will be used to manage newly migrated digital assets.

It was very important that the Videokunstarkivet archive was constructed using open source software. This was necessary to ensure resilience in a rapidly changing technological context, and so the project could benefit from any improvements in the code as they are tested by user communities.

The project uses an adapted version of the Digital Asset Management system ResourceSpace that was developed with LIMA, an organisation based in Holland that preserves, distributes and researches media art. Per explained that ‘since Resource Space was originally meant for photos and other “light” media files, we found it not so well suited for our actual tasks.’ Video files are of course far ‘heavier’ than image or even uncompressed audio files. This meant there were some ‘pretty severe’ technical glitches in the process of establishing a database system that could effectively manage and play back large, uncompressed master and access copies. Through establishing the Videokunstarkivet archive they were ‘pushing the limits of what is technically possible in practice’, largely because internet servers are not built to handle large files, particularly not if those files are being transcoded back and forth across the file management system. In this respect, the project is very much ‘testing new ground’, creating an infrastructure capable of effectively managing, and enabling people to remotely access, large amounts of high-quality video data.

VKA DAM Interface

Access files will be available to stream as open-source encoded WebM (hi and lo) and x264 (hi and lo) files, ensuring that streaming conditions can be adapted to individual server capabilities. The system is also set up to manage large-scale file transcoding should there be a substantial change in file format preferences. These changes can occur without compromising the integrity of the uncompressed master file.
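As an illustration of what generating such an access-copy ladder could look like, the sketch below builds (but does not run) ffmpeg commands for hi and lo WebM and H.264 renditions; the bitrates and filenames are our own hypothetical choices, not Videokunstarkivet’s actual settings.

```python
# Hypothetical sketch of generating hi/lo access copies with ffmpeg.
# Bitrates and filenames are assumptions, not the archive's real settings.

MASTER = "slettemark_master.mov"   # hypothetical uncompressed master file

RENDITIONS = [
    ("webm_hi.webm", ["-c:v", "libvpx",  "-b:v", "4M"]),
    ("webm_lo.webm", ["-c:v", "libvpx",  "-b:v", "1M"]),
    ("h264_hi.mp4",  ["-c:v", "libx264", "-b:v", "4M"]),
    ("h264_lo.mp4",  ["-c:v", "libx264", "-b:v", "1M"]),
]

def ffmpeg_commands(master):
    """Build one ffmpeg command per access rendition (not executed here)."""
    return [["ffmpeg", "-i", master] + opts + [out] for out, opts in RENDITIONS]

for cmd in ffmpeg_commands(MASTER):
    print(" ".join(cmd))
```

Because the master is never touched, re-running a ladder like this against new format preferences is cheap, which is exactly the property described above.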

The interface is built with Bootstrap which has been adapted to create ‘a very advanced access-layer system’ that enables Videokunstarkivet to define user groups and access requirements. Per outlined these user groups and access levels as follows:

‘- Admin: Access to everything (i.e.Videokunstarkivet team members)

– Research: Researchers/curators can see video works, and almost all the metadata (incl previews of the videos). They cannot download master files. They can edit metadata fields, however all their edits will be visible for other users (Wikipedia style). If a curator wants to SHOW a particular work, they’ll have to contact the artist or owner/gallery directly. If the artist agrees, they (or we) can generate a download link (or transcode a particular format) with a few clicks.

– Artist: Artists can up/download uncompressed master files freely, edit metadata and additional info (contact, cv, websites etc etc). They will be able to use the system to store digital master versions freely, and transcode files or previews to share with who they want. The ONLY catch is that they can never delete a master file – this is of course coming out of national archive needs.’

Følstad overview
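These access levels could be modelled as a simple permission matrix. The role and action names below paraphrase Per’s description; they are not taken from the actual Bootstrap/ResourceSpace implementation.

```python
# Sketch of the access levels described above as a permission matrix.
# Role and action names paraphrase the description; they are hypothetical.

PERMISSIONS = {
    "admin":    {"view", "edit_metadata", "download_master", "delete_master"},
    "research": {"view", "edit_metadata"},
    "artist":   {"view", "edit_metadata", "upload_master", "download_master"},
}

def allowed(role, action):
    """Check whether a role may perform an action."""
    return action in PERMISSIONS.get(role, set())

print(allowed("research", "download_master"))   # → False
print(allowed("artist", "delete_master"))       # → False: masters are never deleted
```

Note that no role except admin carries `delete_master`, mirroring the ‘ONLY catch’ above: the national-archive requirement that master files can never be deleted by depositors.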

Per approached us to help migrate the Kjartan Slettemark tape because of the thorough approach and conscientious methodology we apply to digitisation work. As a media archaeology enthusiast, Per stressed that it was desirable, for both aesthetic and archival reasons, that the materiality of U-matic video remain visible in the transferred file. He didn’t want the tape, in other words, to be ‘cleaned up’ in any way. To migrate the tape to digital file we used our standardised transfer chain for U-matic tape. This includes using an appropriate time-base corrector contemporary to the U-matic era, and conversion of the dub signal using a dedicated external dub to Y/C converter circuit.

We are very happy to be working with projects such as the Videokunstarkivet. It has been a great opportunity to learn about the nuts and bolts design of cutting-edge digital video archives, as well as discover the work of Kjartan Slettemark, whose work is not well-known in the UK. Massive thanks must go to Per for his generous sharing of time and knowledge in the process of writing this article. We wish the Videokunstarkivet every success and hope it will raise the profile of Norwegian video art across the world.


designed and developed by
greatbear analogue and digital media ltd, 0117 985 0500
Unit 26, The Coach House, 2 Upper York Street, Bristol, BS2 8QN, UK

