Transferring Digital Audio Tapes (DATs) to digital files

October 9th, 2014

This post focuses on the problems that can arise with the transfer of Digital Audio Tapes (DATs).

With an immature recording method (digital) layered onto a mature carrier (magnetic tape), the digital audio recording revolution was never going to get it right first time (although DATs were not, of course, the first digital recordings made on tape).

Indeed, at a meeting of audio archivists held in 1995, there was a consensus even then that DAT was not, and would never be, a reliable archival medium. One participant stated: ‘we have tapes from 1949 that sound wonderful,’ and ‘we have tapes from 1989 that are shot to hell.’ And that was nearly twenty years ago! What chances do the tapes have now?

A little DAT history

Before we explore that, let’s have a little DAT history.

SONY introduced Digital Audio Tapes (DATs) in 1987. At roughly half the size of an analogue cassette tape, DAT can record at sampling rates higher than, equal to or lower than a CD's (48, 44.1 or 32 kHz respectively), at 16-bit quantization.
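As a rough illustration of what those figures mean in data terms, here is a back-of-the-envelope sketch of the raw PCM bit rate at each of DAT's sampling modes (two channels, 16 bits per sample; it ignores error correction and subcode overhead, so the real on-tape data rate is higher):

```python
# Rough PCM data rates for DAT's three sampling modes (stereo, 16-bit).
# Back-of-the-envelope only: ignores error-correction and subcode overhead.
CHANNELS = 2
BITS_PER_SAMPLE = 16

for rate_khz in (48, 44.1, 32):
    bits_per_second = rate_khz * 1000 * CHANNELS * BITS_PER_SAMPLE
    print(f"{rate_khz} kHz: {bits_per_second / 1e6:.3f} Mbit/s "
          f"(~{bits_per_second * 3600 / 8 / 1e9:.2f} GB per hour)")
```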

Although popular in Japan, DATs were never widely adopted by the consumer market because they were more expensive than their analogue counterparts. They were, however, embraced in professional recording contexts, and in particular for recording live sound.

It was recording industry paranoia, particularly in the US, that really sealed the fate of the format. With its threatening promise of perfect replication, DAT became the target of an unsuccessful lobbying campaign by the Recording Industry Association of America (RIAA), which saw DATs as the ultimate attack on copyright law and pressed to introduce the Digital Audio Recorder Copycode Act of 1987.

This proposed law required that each DAT machine have a ‘copycode’ chip installed that could detect whether prerecorded copyrighted music was being replicated. The method employed a notch filter that would subtly distort the quality of the copied recording, thus sabotaging any acts of piracy tacitly enabled by the DAT medium. The law was not passed, however, and compromises were made, although the US Audio Home Recording Act of 1992 imposed taxes on DAT machines and blank media.

How did they do ‘dat?

Like video tape recorders, DAT tapes use a rotating head and helical scan method to record data. The helical scan can, however, pose real problems for the preservation transfer of DAT tapes, because it makes it difficult to splice the tape back together if it becomes sticky and snaps during the tape wind. With analogue audiotape, which records information longitudinally, it is far easier to splice the tape together and continue the transfer without risking irrevocable information loss.

Another problem posed by the helical scan method is that such tapes are more vulnerable to tape pack and backing deformation, as the CLIR guide explains:

‘Tracks are recorded diagonally on a helical scan tape at small scan angles. When the dimensions of the backing change disproportionately, the track angle will change for a helical scan recording. The scan angle for the record/playback head is fixed. If the angle that the recorded tracks make to the edge of the tape do not correspond with the scan angle of the head, mistracking and information loss can occur.’

When error correction can’t correct anymore

Most people will be familiar with the sound of digital audio dropouts even if they don’t know the science behind them. You will know them most probably as those horrible clicking noises produced when the error correction technology on CDs stops working. The clicks indicate that the ‘threshold of intelligibility’ for digital data has been breached and, as theorist Jonathan Sterne reminds us, ‘once their decay becomes palpable, the file is rendered entirely unreadable.’

Our SONY PCM 7030 professional DAT machine, pictured opposite, has a ‘playback condition’ light that flashes if an error is present. On sections of the tape where quality is really bad the ‘mute’ light can flash to indicate that the error correction technology can’t fix the problem. In such cases drop outs are very audible. Most DAT machines did not have such a facility, however, and you only knew there was a problem when you heard the glitchy-clickety-crackle during playback, when, of course, it was too late to do anything about it.

The bad news for people with large, yet to be migrated DAT archives is that the format is ‘particularly susceptible to dropout. Digital audio dropout is caused by a non-uniform magnetic surface, or a malfunctioning tape deck. However, because the magnetically recorded information is in binary code, it results in a momentary loss of data and can produce a loud transient click or worse, muted audio, if the error correction scheme in the playback equipment cannot correct the error,’ the wonderfully informative A/V Artifact Atlas explains.

Given the high density nature of digital recordings on narrow magnetic tape, even the smallest speck of dust can cause digital audio dropouts. Such errors can be very difficult to eliminate. Cleaning playback heads and re-transferring is an option, but if the dropout was recorded at the source or the surface of the tape is damaged, then the only way to treat irregularities is by applying audio restoration technologies, which may present a problem if you are concerned with maintaining the authenticity of the original recording.

Listen to this example of what a faulty DAT sounds like

Play back problems and mouldy DATs


Mould growth on the surface of DAT tape

A big problem with DAT transfers is actually being able to play back the tapes, or what is known in the business as ‘DAT compatibility.’ In an ideal world, to get the most perfect transfer you would play back a tape on the same machine that it was originally recorded on. The chances of doing this are of course pretty slim. While you can play your average audio cassette tape on pretty much any tape machine, the same cannot be said for DAT tapes. Often recordings were made on misaligned machines. The only solution for playback is, Richard Hess suggests, to mis-adjust a working machine to match the alignment of the recording on the tape.

As with any archival collection, if it is not stored in appropriate conditions then mould growth can develop. As mentioned above, DAT tapes are roughly half the size of the common audiocassette and the tape is thin and narrow. This makes them difficult to clean because they are mechanically fragile. Adapting a machine specifically for the purposes of cleaning, as we have done with our Studer machine, is the ideal solution. There is, however, not a great deal of research and information about restoring mouldy DATs available online, even though we are seeing more and more DAT tapes exhibiting this problem.

As with much of the work we do, the recommendation is to migrate your collections to digital files as soon as possible, but often it is a matter of priorities and budgets. From a technical point of view, DATs are a particularly vulnerable format. Machine obsolescence means that, compared with their analogue counterparts, professional DAT machines will be increasingly hard to service in the long term. As detailed above, glitchy dropouts are almost inevitable given the sensitivity and all-or-nothing quality of digital data recorded on magnetic tape.

It seems fair to say that despite being meant to supersede analogue formats, DATs are far more likely to drop out of recorded sound history in a clinical and abrupt manner.

They therefore should be a high priority when decisions are made about which formats in your collection should be migrated to digital files immediately, over and above those that can wait just a little bit longer.

Phyllis Tate’s Nocturn for Four Voices 3″ 1/4 inch reel to reel tape transfer

September 19th, 2014

We have recently transferred a previously unpublished 3-inch reel of ¼ inch tape containing a recording of British 20th century composer Phyllis Tate’s Nocturn for Four Voices. The tape is a 2-track stereo recording made at 7.5 inches per second (in/s) at the Purcell Room in London’s Southbank Centre in 1975, and was broadcast on 16 September 1976.

When migrating magnetic tape recordings to digital files there are several factors that can be considered to assess the quality of the recording even before we play back the tape. One of these is the speed at which the tape was originally recorded.

BASF tape track width diagram

Generally speaking, the faster the speed, the better the reproduction quality of the digital transfer. This is because higher tape speeds spread the recorded signal longitudinally over more tape area, thereby reducing the effects of dropouts and tape noise. The number of tracks recorded on the tape also has an impact on how good it sounds today. Simply put, the more tape area used to store the signal, whether through faster recording speed or wider tracks, the better the transfer will sound.
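A rough way to see why speed matters: the physical length of tape occupied by one cycle of a given frequency is simply the tape speed divided by that frequency, so doubling the speed doubles the recorded wavelength and makes the signal less vulnerable to small surface defects. A minimal sketch (the frequency and speeds below are illustrative, not measurements from this tape):

```python
# Recorded wavelength on tape = tape speed / signal frequency.
# Longer wavelengths are less affected by small dropouts and surface defects.
SPEEDS_IPS = (3.75, 7.5, 15)   # common consumer/professional speeds, inches per second
FREQ_HZ = 10_000               # example audio frequency

for speed in SPEEDS_IPS:
    wavelength_inches = speed / FREQ_HZ
    print(f"{speed} in/s: a {FREQ_HZ} Hz tone occupies "
          f"{wavelength_inches * 25_400:.1f} micrometres of tape per cycle")
```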

The tape of Nocturn for Four Voices was, however, suffering from binder hydrolysis and therefore needed to be baked prior to playback. EMI tape doesn’t normally suffer from this, but as the tape was EMI professional it may well have used Ampex stock and/or have been back coated, thus making the binder more susceptible to such problems.

Remembering Phyllis Tate

Nocturn for Four Voices is an example of how Tate ‘composed for unusual combinations of instruments and voice.’ The composition includes ‘Bass Clarinet, Celeste, String Quartet and Double Bass’, music scholar Jane Ballantyne explains.

The tape was brought into us by Tate’s daughter, Celia Frank, who is currently putting the finishing touches to a web archive that, she hopes, will help contemporary audiences (re)discover her mother’s work.

Like many women musicians and artists, Phyllis Tate, who trained at the Royal Academy of Music, remains fairly obscure to the popular cultural ear.

This is not to say, of course, that her work did not receive critical acclaim from her contemporaries or posthumously. Indeed, it is fair to say that she had a very successful composing career. Both the BBC and the Royal Academy of Music, among others, commissioned compositions from Tate, and her work is available to hire or buy from esteemed music publishers Oxford University Press (OUP).

Edmund Whitehouse, who wrote a short biography of the composer, described her as ‘one of the outstanding British composers of her generation, she was truly her own person whose independent creative qualities produced a wide range of music which defy categorisation.’

Her music often comprised contrasting emotional registers, lyrical sections and unexpected changes of direction. As a writer of operettas and choral music, with a penchant for setting poetry to music, her work is described by the OUP as the product of ‘an unusual imagination and an original approach to conventional musical forms or subjects, but never to the extent of being described as “avant-garde”.’

Tate’s music was very much a hit with the iconic composer and suffragette Ethel Smyth who, upon hearing Tate’s compositions, reputedly declared: ‘at last, I have heard a real woman composer.’ Such praise was downplayed by Tate, who tended to point to Smyth’s increasing loss of hearing in later life as the cause of her enjoyment: ‘My Cello Concerto was performed soon afterwards at Bournemouth with Dame Ethel sitting in the front row banging her umbrella to what she thought was the rhythm of the music.’

While the dismissal of Smyth’s appreciation is tender and good humoured, the fact that Tate destroyed significant proportions of her work does suggest that at times she could have doubted her own abilities as a composer. Towards the end of her life she revealed: ‘I must admit to having a sneaking hope that some of my creations may prove to be better than they appear. One can only surmise and it’s not for the composer to judge. All I can vouch is this: writing music can be hell; torture in the extreme; but there’s one thing worse; and that is not writing it.’ As a woman composing in an overwhelmingly male environment, such hesitancies are perhaps an understandable expression of what literary scholars Gilbert and Gubar called ‘the anxiety of authorship.’

Tate’s work is a varied and untapped resource for those interested in twentieth century composition and the wider history of women composers. We wish Celia the best of luck in getting the website up and running, and hope that many more people will be introduced to her mother’s work as a consequence.

Thanks to Jane Ballantyne and Celia Frank for their help in writing this article.

Obsolete technologies and contemporary sound art

August 26th, 2014

At the recent Supernormal festival held at Braziers Park, Oxfordshire, a number of artists were using analogue technologies to explore concepts that dovetail nicely with the work we do at Great Bear collecting, servicing and repairing obsolete tape machines.

Hacker Farm, for example, keep ‘obsolete tech and discarded, post-consumerist debris’ alive using ‘salvaged and the hand-soldered’ DIY electronics. Their performance was a kind-of technological haunting, the sound made when older machines are turned on and re-purposed in different eras. Eerie, decayed, pointless and mournful, the conceptual impetus behind Hacker Farm raises many questions that emerge from the rather simple desire to keep old technologies working. Such actions soon become strange and aesthetically challenging in the contemporary technological context, which actively reproduces obsolescence in the endless search for the new, fostering continuous wastefulness at the centre of industrial production.

Music by the Metre

Another performance at the festival which engaged with analogue technologies was Graham Dunning’s Music by the Metre. The piece pays homage to Situationist Pinot-Gallizio‘s method of ‘Industrial Painting’ (1957-1959), in which the Italian artist created a 145 metre hand- and spray-painted canvas that was subsequently cut up and sold by the metre. The action, which attempted to destroy the perception of the sacrosanct art-object and transform it into something which could be mass-quantified and sold, aimed to challenge ‘the mental disease of banalisation’ inherent to what Guy Debord termed ‘the society of the spectacle.’


In Dunning’s contemporary piece he uses spools of open reel tape to record a series of automated machines comprising looping record players, synth drone, live environmental sound and tape loops. The tape is then cut by the artist into metre-long segments, placed in see-through plastic bags and ‘sold’ on the temporary market stall used to record and present the work.

Dunning’s work exists in interesting tension with the ideas of Pinot-Gallizio, largely because of the different technological and aesthetic contexts the artists are responding to.

Pinot-Gallizio’s industrial painting aimed to challenge the role of art within a consumer society by accelerating its commodity status (mass-produced, uniform, quantified, art as redundant, art as part of the wall paper). Within Dunning’s piece, such a process of acceleration is not so readily available, particularly given the deep obsolescence of consumer-grade open reel tape in 2014, and, furthermore, its looming archival obsolescence (often cited at ’10-20 years‘ by archivists).

Within the contemporary context, open reel analogue tapes have become ornate and aestheticised in themselves because they have lost their function as an everyday, recordable mass blank media. When media lose their operating context they are transformed into objects of fascination and desire, as Claire Bishop pithily states in her Artforum essay ‘The Digital Divide’: ‘Today, no exhibition is complete without some form of bulky, obsolete technology—the gently clucking carousel of the slide-projector, or the whirring of an 8mm or 16mm film reel [...] the sumptuous texture of indexical media is unquestionably seductive, but its desirability also arises from the impression that it is scarce, rare and precious.’

In reality, the impression of open reel analogue tape’s rarity is well justified, as manufacturers and distributors of magnetic tape are increasingly hard to find. Might there be something more complex and contradictory going on in Dunning’s homage to Pinot-Gallizio? Could we understand it as a neat inversion of the mass-metred object? Doubly cut adrift from its historical (1950s-1970s) and technological operating context (the open reel tape recorder), the bag of tape is decelerated, existing as nothing other than art object. Stuffed messily in a plastic bag and displayed ready to be sold (if only by donation), the tape is both ugly and useless given its original and intended use. It is here that Dunning’s and Pinot-Gallizio’s works converge, situated at different historical and temporal poles from which a critique of the consumer society can be mounted: accelerated plenitude and decelerated exhaustion.


Analogue attachments

As a company that works with obsolete magnetic tape-based media, Great Bear has a vested interest in ensuring tapes and playback machines remain operational. Although our studio, with its stacks of long-forgotten machines, may look like a curious art installation to some, the tapes we migrate to digital files are not quite art objects…yet. Like Hacker Farm, we help to keep old media alive through careful processes of maintenance and repair.

From looking at how contemporary sound artists are engaging with analogue technologies, it is clear that the medium remains very much part of the message, as Marshall McLuhan would say, and that meaning becomes amplified, contorted or transformed depending on historical context, and media norms present within it.

Reports from the ‘bleeding edge’ – The Presto Centre’s AV Digitisation TechWatch Report #2

July 28th, 2014

The Presto Centre‘s AV Digitisation and Digital Preservation TechWatch Report, published July 2014, introduces readers to what they describe as the ‘bleeding edge’ of AV Digitisation and Archive technology.

Written in an engaging style, the report is well worth a read. If you don’t have time, however, here are some choice selections from the report which relate to the work we do at Great Bear, and some of the wider topics that have been discussed on the blog.

The first issue to raise, as ever, is continuing technological change. The good news is

‘there are no unexpected changes in file sizes or formats on the horizon, but it is fair to say that the inexorable increase in file size will continue unabated […] Higher image resolutions, bits per pixel and higher frame rates are becoming a fact of life, driving the need for file storage capacity, transfer bandwidth and processing speeds, but the necessary technology developments continue to track some form of Moore’s law, and there is no reason to believe that the technical needs will exceed technical capability, although inevitably there will be continuing technology updates needed by archives in order for them to manage new material.’

Having pointed out the inevitability of file expansion, however, other parts of the report clearly express the very real everyday challenges that ever increasing file sizes pose to the transmission of digital information across different locations:

‘transport of content was raised by one experienced archive workflow provider. They maintained that, especially with very high bit-rate content (such as 4k) it still takes too long to transfer files into storage over the network, and in reality there are some high-capacity content owners and producers shipping stacks of disks around the country in Transit vans, on the grounds that, in the right circumstances this can still be the highest bandwidth transfer mechanism, even though the Digital Production Partnership (DPP) are pressing for digital-only file transfer.’

While those hordes of Transit vans zipping up and down the motorway between different media providers are probably the exception rather than the rule, we should note that a similar point was raised by Per Platou when he talked about the construction of the Videokunstarkivet – the Norwegian video art archive. Due to the size of video files in particular, Per found that publishing them online really pushed server capabilities to the absolute maximum. This illustrates that there remains a discrepancy between the rate at which broadcast technologies develop and the economic, technological and ecological resources available to send and receive them.
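To see why shipping disks can still beat the network, it helps to compare effective bandwidths. The sketch below uses made-up but plausible figures for drive capacity, journey time and link speed; none of them come from the report.

```python
# Effective bandwidth of "sneakernet" vs a network link.
# All numbers below are illustrative assumptions, not figures from the report.
DRIVES = 50                      # disks in the van
DRIVE_CAPACITY_TB = 4            # terabytes per disk
JOURNEY_HOURS = 4                # a cross-country drive, say
NETWORK_GBPS = 1                 # dedicated 1 Gbit/s link

payload_bits = DRIVES * DRIVE_CAPACITY_TB * 1e12 * 8
van_gbps = payload_bits / (JOURNEY_HOURS * 3600) / 1e9
network_days = payload_bits / (NETWORK_GBPS * 1e9) / 3600 / 24

print(f"Van: ~{van_gbps:.0f} Gbit/s effective")
print(f"Network: {NETWORK_GBPS} Gbit/s, so the same payload takes ~{network_days:.1f} days")
```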

Another interesting point about the move from physical to file-based media is the increased need for Quality-Control (QC) software tools that will be employed to ‘ensure that our digital assets are free from artefacts or errors introduced by encoders or failures of the playback equipment.’ Indeed, given that glitches born from slow or interrupted transfers may well be inevitable because of limited server capabilities, software developed by Bristol-based company Vidcheck will be very useful because it ‘allows for real-time repair of Luma, Chroma, Gamma and audio loudness issues that may be present in files. This is a great feature given that many of the traditional products on the market will detect problems but will not automatically repair them.’

Other points worth mentioning from the report are the increasing move to open-source, software-only solutions for managing digital collections, and the rather optimistic tone directed toward ‘archives with specific needs who want to find a bespoke provider who can help design, supply and support a viable workflow option – so long as they avoid the large, proprietary ‘out-of-the-box’ solutions.’

If you are interested in reading further TechWatch reports you can download #1 here, and watch out for #3 that will be written after the International Broadcasting Convention (IBC) which is taking place in September, 2014.

 

Digital preservations, aesthetics and approaches

July 23rd, 2014

Digital Preservation 2014, the annual meeting of the National Digital Information Infrastructure and Preservation Program and the National Digital Stewardship Alliance is currently taking place in Washington, DC in the US.

The Library of Congress’s digital preservation blog The Signal is a regular reading stop for us, largely because it contains articles and interviews that impressively meld theory and practice, even if it does not exclusively cover issues relating to magnetic tape.

What is particularly interesting, and indeed is a feature of the keynotes for the Digital Preservation 2014 conference, is how academic theory, especially relating to aesthetics and art, has become an integral part of the conversation about how best to meet the challenge of digital preservation in the US. Keynote addresses from academics like Matthew Kirschenbaum (author of Mechanisms) and Shannon Mattern sit alongside presentations from large memory institutions and those seeking ways to devise community approaches to digital stewardship.

The relationship between digital preservation and aesthetics is also a key concern of Richard Rhinehart and Jon Ippolito’s new book Re-Collection: Art, New Media and Social Memory, which has just been published by MIT Press.

This book, if at times deploying rather melodramatic language about the ‘extinction!’ and ‘death!’ of digital culture, gently introduces the reader to the wider field of digital preservation and its many challenges. Re-Collection deals mainly with born-digital archives, but many of the ideas are pertinent for thinking about how to manage digitised collections as well.

The authors’ recommendation that the digital archival object remain variable is particularly striking: ‘the variable media approach encourages creators to define a work in medium-independent terms so that it can be translated into a new medium once its original format is obsolete’ (11). Emphasising the variability of the digital media object as a preservation strategy challenges the established wisdom of museums and other memory institutions, Rhinehart and Ippolito argue. The default position of preserving the art work in its ‘original’ form effectively freezes a once dynamic entity in time and space, potentially rendering the object inoperable because it denies works of art the potential to change when re-performed or re-interpreted. Their message is clear: be variable, adapt or die!

As migrators of tape-based collections, media variability is integral to what we do. Here we tacitly accept the inauthenticity of the digitised archival object, an artefact which has been allowed to change in order to ensure accessibility and cultural survival.

US/European differences?

While aesthetic and theoretical thinking is influencing how digital information management is practised in the US, it seems as if the European approach is almost exclusively framed in economic and computational terms.

Consider, for example, the recent EU press release about the vision to develop Europe’s ‘knowledge economy‘. The plans to map and implement data standards, create cross-border coordination and an open data incubator are, it would seem, far more likely to ensure interoperable and standardised data sharing systems than any of the directives to preserve cultural heritage in the past fifteen years, a time period characterised by markedly unstable approaches, disruptive innovations and a conspicuous lack of standards (see also the E-Ark project).

It may be tempting these days to see the world as one gigantic, increasingly automated archival market, underpinned by the legal imperative to collect all kinds of personal data (see the DRIP legislation recently rushed through the UK parliament). Yet it is also important to remember the varied professional, social and cultural contexts in which data is produced and managed.

One session at DigiPres, for example, will explore the different archival needs of the cultural heritage sector:

‘Digital cultural heritage is dependent on some of the same systems, standards and tools used by the entire digital preservation community. Practitioners in the humanities, arts, and information and social sciences, however, are increasingly beginning to question common assumptions, wondering how the development of cultural heritage-specific standards and best practices would differ from those used in conjunction with other disciplines [...] Most would agree that preserving the bits alone is not enough, and that a concerted, continual effort is necessary to steward these materials over the long term.’

Of course approaches to digital preservation and data management in the US are largely overdetermined by economic directives, and European policies do still speak to the needs of cultural heritage institutions and other public organisations.

What is interesting, however, is the minimal transnational cross pollination at events such as DigiPres, despite the globally networked condition we all share. This suggests there are subtle divergences between approaches to digital information management now, and how it will be managed in coming years across these (very large) geopolitical locations. Aesthetics or no aesthetics, the market remains imperative. Despite the turn toward open archives and re-usable data, competition is at the heart of the system and is likely to win out above all else.

D1, D2 & D3 – histories of digital video tape

July 14th, 2014


The images in this article are of the first digital video tape formats, the D1, D2 and D3. The tendency to continually downsize audiovisual technology is clearly apparent: the gargantuan shell of the D1 gradually shrinks to the D3, which resembles the size of a domestic VHS tape.

Behind every tape (and every tape format) lie interesting stories, and the technological wizardry and international diplomacy that helped shape the roots of our digital audio visual world are worth looking into.

In 1976, when the green shoots of digital audio technology were emerging at industry level, the question of whether Video Tape Recorders (VTRs) could be digitised began to be explored in earnest by R & D departments based at SONY, Ampex and Bosch G.m.b.H. There was considerable scepticism among researchers about whether digital video tape technology could be developed at all because of the wide bandwidth required to transmit a digital image.

In 1977, however, as reported on the SONY website, Yoshitaka Hashimoto and his team began to intensely research digital VTRs and ‘in just a year and a half, a digital image was played back on a VTR.’

Several years of product development followed, shaped, in part, by competing regional preferences. As Jim Slater argues in Modern Television Systems (1991): ‘much of the initial work towards digital standardisation was concerned with trying to find ways of coping with the three very different colour subcarrier frequencies used in NTSC, SECAM and PAL systems, and a lot of time and effort was spent on this’ (114).

Establishing a standard sampling frequency did of course have real financial consequences; it could not be randomly plucked out of the air: the higher the sampling frequency, the greater the overall bit rate, and the greater the overall bit rate, the greater the need for storage space in digital equipment. In 1982, after several years of negotiations, a 13.5 MHz sampling frequency was agreed. European, North American, ‘Japanese, the Russians, and various other broadcasting organisations supported the proposals, and the various parameters were adopted as a world standard, Recommendation 601 [a.k.a. 4:2:2 DTV] standard of the CCIR [Consultative Committee for International Radio, now International Telecommunication Union]‘ (Slater, 116).

The 4:2:2 DTV recommendation was an international standard that would form the basis of the (almost) exclusively digital media environment we live in today. It was ‘developed in a remarkably short time, considering its pioneering scope, as the worldwide television community recognized the urgent need for a solid basis for the development of an all-digital television production system’, write Stanley Baron and David Wood.

Once agreed upon, product development could proceed. The first digital video tape format, the D1, was introduced to the market in 1986. It recorded uncompressed component video and used enormous bandwidth for its time: a bit rate of 173 Mbit/sec, with a maximum recording time of 94 minutes.
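For a sense of where numbers like this come from, the sketch below works out the raw data rate implied by the Rec. 601 4:2:2 sampling parameters, assuming 8 bits per sample. The D1 figure quoted above differs because a tape format records the active picture area plus audio and error-correction data rather than the full raw sample stream.

```python
# Raw data rate implied by the Rec. 601 (4:2:2) sampling parameters, 8 bits per sample.
# Illustrative arithmetic only; actual tape formats record active picture plus
# audio, error correction and ancillary data, so their quoted rates differ.
LUMA_HZ = 13_500_000      # Y sampled at 13.5 MHz
CHROMA_HZ = 6_750_000     # Cb and Cr each sampled at 6.75 MHz (the "2:2" in 4:2:2)
BITS = 8

total_samples_per_sec = LUMA_HZ + 2 * CHROMA_HZ
rate_mbit = total_samples_per_sec * BITS / 1e6
print(f"{rate_mbit:.0f} Mbit/s")   # 216 Mbit/s raw
```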

As Slater writes

‘unfortunately these machines are very complex, difficult to manufacture, and therefore very expensive […] they also suffer from the disadvantage that being component machines, requiring luminance and colour-difference signals at input and output, they are difficult to install in a standard studio which has been built to deal with composite PAL signals. Indeed, to make full use of the D1 format the whole studio distribution system must be replaced, at considerable expense’ (125).

Being forced to effectively re-wire whole studios, and the considerable risk involved in doing this because of continual technological change, strikes a chord with the challenges UK broadcast companies face as they finally become ‘tapeless’ in October 2014 as part of the Digital Production Partnership’s AS-11 policy.

Sequels and product development

As the story so often goes, D1 would soon be followed by D2. Those that did make the transition to D1 were probably kicking themselves, and you can only speculate about the number of back injuries sustained getting the machines into the studio (from experience we can tell you they are huge and very heavy!)

It was fairly inevitable a sequel would be developed because, even though the D1 provided uncompromising image quality, it was most certainly an unwieldy format, apparent from its gigantic size and component wiring. In response a composite digital video format – the D2 – was developed by Ampex and introduced in 1988.

In this 1988 promotional video, you can see the D2 in action. Amazingly for our eyes and ears today, the D2 is presented as the ideal archival format. Amazing for its physical size (hardly inconspicuous on the storage shelf!) but also because it used composite video signal technology. Composite signals combine on one wire all the component parts which make up a video signal: chrominance (colour, or Red, Green, Blue – RGB) and luminance (the brightness or black and white information, including grayscale).

While the composite video signal used lower bandwidth and was more compatible with existing analogue systems used in the broadcast industry of the time, its value as an archival format is questionable. A comparable process for the storage we use today would be to add compression to a file in order to save file space and create access copies. While this is useful in the short term it does risk compromising file authenticity and quality in the long term. The Ampex video is fun to watch however, and you get a real sense of how big the tapes were and the practical impact this would have had on the amount of time it took to produce TV programmes.

Enter the D3

Following the D2 came the D3, which is the final video tape format covered in this article (although there were of course also the D5 and D9).

The D3 was introduced by Panasonic in 1991 in order to compete with Ampex’s D2. It has the same sampling rate as the D2 with the main difference being the smaller shell size.

The D3’s biggest claim to fame was that it was the archival digital video tape of choice for the BBC, who migrated their analogue video tape collections to the format in the early 1990s. One can only speculate that the decision to take the archival plunge with the D3 was a calculated risk: it appeared to be a stable-ish technology (it wasn’t a first generation technology and the difference between D2 and D3 is negligible).

The extent of the D3 archive is documented in a white paper published in 2008, D3 Preservation File Format, written by Philip de Nier and Phil Tudor: ‘the BBC Archive has around 315,000 D3 tapes in the archive, which hold around 362,000 programme items. The D3 tape format has become obsolete and in 2007 the D3 Preservation Project was started with the goal to transfer the material from the D3 tapes onto file-based storage.’

Tom Heritage, reporting on the development of the D3 preservation project in 2013/2014, reveals that ‘so far, around 100,000 D3 and 125,000 DigiBeta videotapes have been ingested representing about 15 Petabytes of content (single copy).’

It has, then, taken six years to migrate less than a third of the BBC’s D3 archive. Given that D3 machines are now obsolete, it is more than questionable whether there are enough D3 head hours left in existence to read all the information back clearly and to an archive standard. The archival headache is compounded by the fact that ‘with a large proportion of the content held on LTO3 data tape [first introduced 2004, now on LTO-6], action will soon be required to migrate this to a new storage technology before these tapes become difficult to read.’ With the much publicised collapse of the BBC’s Digital Media Initiative (DMI) in 2013, you’d have to have a very strong disposition to work in the BBC’s audio visual archive department.

The roots of the audio visual digital world

The development of digital video tape, and the international standards which accompanied its evolution, is an interesting place to start in understanding our current media environment. It is also a great place to begin examining the problems of digital archiving, particularly when file migration has become embedded within organisational data management policy and data collections are growing exponentially.

While the D1 may look like an alien-techno species from a distant land compared with the modest, immaterial file lists neatly stored on hard drives that we are accustomed to, the two are related through the 4:2:2 sample rate which revolutionised high-end digital video production and continues to shape our mediated perceptions.

Videokunstarkivet – Norway’s Digital Video Art Archive

July 7th, 2014

We have recently digitised a U-matic video tape of eclectic Norwegian video art from the 1980s. The tape documents a performance by Kjartan Slettemark, an influential Norwegian/ Swedish artist who died in 2008. The tape is the ‘final mix’ of a video performance entitled Chromakey Identity Blue in which Slettemark live mixed several video sources onto one tape.

The theoretical and practical impossibility of documenting live performance has been hotly debated in recent times by performance theorists, and there is some truth to those claims when we consider the encounter with Slettemark’s work in the Great Bear studio. The recording is only one aspect of the overall performance which, arguably, was never meant as a stand alone piece. This was certainly reflected in our Daily Mail-esque reaction to the video when we played it back. ‘Eh? Is this art?! I don’t get it!’ was the resounding response.

Having access to the wider context of the performance is sometimes necessary if the intentions of the artist are to be appreciated. Thankfully, Slettemark’s website includes part-documentation of Chromakey Identity Blue, and we can see how the different video signals were played back on various screens, arranged on the stage in front of (what looks like) a live TV audience.

Upon seeing this documentation, the performance immediately evokes the wider context of 70s/80s video art, which used the medium to explore the relationship between the body, space, screen and, in Slettemark’s case, the audience. A key part of Chromakey Identity Blue is the interruption of the audience’s presence in the performance, realised when their images are screened across the face of the artist, whose wearing of a chroma key mask enables him to perform a ‘special effect’ which layers two images or video streams together.

What unfolds through Slettemark’s performance is at times humorous, suggestive and moving, largely because of the ways the faces of different people interact, perform or simply ignore their involvement in the spectacle. As Marina Abramovic‘s use of presence testifies, there can be something surprisingly raw and even confrontational about incorporating the face into relational art. As an ethical space, meeting with the ‘face’ of another became a key concept for twentieth century philosopher Emmanuel Levinas. The face locates, Bettina Bergo argues, ‘“being” as an indeterminate field’ in which ‘the Other as a face that addresses me […] The encounter with a face is inevitably personal.’

If an art work like Slettemark’s is moving then, it is because it stages moments where ‘faces’ reflect and interface across each other. Faces meet and become technically composed. Through the performance of personal-facial address in the artwork, it is possible to glimpse for a brief moment the social vulnerability and fragility such meetings engender. Brief, because in Chromakey Identity Blue the seriousness is diffused by a kitsch use of a disco ball that the artist moves across the screen to symbolically change the performed image, conjuring the magical feel of new technologies and how they facilitate different ways of seeing, being and acting in the world.

Videokunstarkivet (The Norwegian Video Art Archive)


The tape of Slettemark was sent to us by Videokunstarkivet, an exciting archival project mapping all the works of video art that have been made in Norway since the mid-1960s. Funded by the Norwegian Arts Council, the project has built its digital archival infrastructure from the bottom up, and those working on it have learnt a good many things along the way. Per Platou, who is managing the project, was generous enough to share some of those insights for readers of our blog, along with a selection of images from the archive’s interface.

There are several things to be considered when creating a digital archive ‘from scratch’. Often at the beginning of a large project it is possible to look around for examples of best practice within your field. This isn’t always the case for digital archives, particularly those working almost exclusively with video files, whose communities of practice are unsettled and whose established ways of working are few and far between. The fact that even in 2014, when digital technologies have been widely adopted throughout society, there is still no firm agreement on standard access and archival file formats for video indicates the peculiar challenges of this work.

Because of this, projects such as Videokunstarkivet face multiple challenges, with significant amounts of improvisation required in the construction of the project infrastructure. An important consideration is the degree of access users will have to the archive material. As Per explained, publicly re-publishing the archive material from the site in an always open access form is not a concern of the  Videokunstarkivet, largely due to the significant administrative issues involved in gaining licensing and copyright permissions. ‘I didn’t even think there was a difference between collecting and communicating the work yet after awhile I saw there is no point in showing everything, it has to be filtered and communicated in a certain way.’


Instead, interested users will be given a research key or password which enables them to access the data and edit metadata where appropriate. If users want to re-publish or show the art in some form, contact details for the artist/copyright holder are included as part of the entry. Although the Videokunstarkivet deals largely with video art, entries on individual artists include information about other archival collections where their material may be stored, in order to facilitate further research. Contemporary Norwegian video artists are also encouraged to deposit material in the database, ensuring that ongoing collecting practices are built into the long-term project infrastructure.

Another big consideration in constructing an archive is what to collect. Per told me that video art in Norway really took off in the early 80s. Artists who incorporated video into their work weren’t necessarily specialists in the medium: ‘there just happened to be a video camera nearby so they decided to use it.’ Video was therefore often used alongside films, graphics, performance and text, making the starting point for the archive, according to Per, ‘a bit of a mess really.’ Nonetheless, Videokunstarkivet ‘approaches every artist like it was Edvard Munch,’ because it is very hard to know exactly what will be culturally valuable 10, 20 or even 100 years from now. While it may not be appropriate to ‘save everything!’ for larger archival projects, for a self-contained and focused archival project such as the Videokunstarkivet an inclusive approach may well be perfectly possible.

Building software infrastructures

Another important aspect of the project is technical considerations – the actual building of the back/ front end of the software infrastructure that will be used to manage newly migrated digital assets.

It was very important that the Videokunstarkivet archive was constructed using Open Source software. This was necessary to ensure resilience in a rapidly changing technological context, and so the project could benefit from any improvements in the code as they are tested out by user communities.

The project uses an adapted version of the Digital Asset Management system Resource Space that was developed with LIMA, an organisation based in Holland that preserves, distributes and researches media art. Per explained that ‘since Resource Space was originally meant for photos and other “light” media files, we found it not so well suited for our actual tasks.’ Video files are of course far ‘heavier’ than image or even uncompressed audio files. This meant that there were some ‘pretty severe’ technical glitches in the process of establishing a database system that could effectively manage and play back large, uncompressed master and access copies. Through establishing the Videokunstarkivet archive they were ‘pushing the limits of what is technically possible in practice’, largely because internet servers are not built to handle large files, particularly not if those files are being transcoded back and forth across the file management system. In this respect, the project is very much ‘testing new ground’, creating an infrastructure capable of effectively managing, and enabling people to remotely access, large amounts of high-quality video data.

Access files will be available to stream using the open source encodings WebM (hi and lo) and x264 (hi and lo), ensuring that streaming conditions can be adapted to individual server capabilities. The system is also set up to manage large-scale file transcoding should there be a substantial change in file format preferences. These changes can occur without compromising the integrity of the uncompressed master file.
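To make the hi/lo access-copy idea concrete, here is a minimal sketch of how such derivatives could be generated from an uncompressed master with ffmpeg. The tool choice, bitrates and filenames are our own assumptions for illustration, not details of the Videokunstarkivet’s actual pipeline.

```python
# Minimal sketch: derive hi/lo WebM and H.264 access copies from an uncompressed
# master using ffmpeg. Bitrates, filenames and the use of ffmpeg itself are
# illustrative assumptions, not the archive's actual workflow.
import subprocess

MASTER = "master_uncompressed.mov"   # hypothetical master file

ACCESS_COPIES = [
    ("access_hi.webm", ["-c:v", "libvpx-vp9", "-b:v", "8M", "-c:a", "libopus"]),
    ("access_lo.webm", ["-c:v", "libvpx-vp9", "-b:v", "1M", "-c:a", "libopus"]),
    ("access_hi.mp4",  ["-c:v", "libx264",    "-crf", "18", "-c:a", "aac"]),
    ("access_lo.mp4",  ["-c:v", "libx264",    "-crf", "28", "-c:a", "aac"]),
]

for outfile, codec_args in ACCESS_COPIES:
    # The master is never modified; each access copy is derived from it afresh.
    subprocess.run(["ffmpeg", "-y", "-i", MASTER, *codec_args, outfile], check=True)
```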

The interface is built with Bootstrap which has been adapted to create ‘a very advanced access-layer system’ that enables Videokunstarkivet to define user groups and access requirements. Per outlined these user groups and access levels as follows:

‘- Admin: Access to everything (i.e.Videokunstarkivet team members)

– Research: Researchers/curators can see video works, and almost all the metadata (incl previews of the videos). They cannot download master files. They can edit metadata fields, however all their edits will be visible for other users (Wikipedia style). If a curator wants to SHOW a particular work, they’ll have to contact the artist or owner/gallery directly. If the artist agrees, they (or we) can generate a download link (or transcode a particular format) with a few clicks.

– Artist: Artists can up/download uncompressed master files freely, edit metadata and additional info (contact, cv, websites etc etc). They will be able to use the system to store digital master versions freely, and transcode files or previews to share with who they want. The ONLY catch is that they can never delete a master file – this is of course coming out of national archive needs.’

Per approached us to help migrate the Kjartan Slettemark tape because of the thorough approach and conscientious methodology we apply to digitisation work. As a media archaeology enthusiast, Per stressed that it was desirable, for both aesthetic and archival reasons, that the materiality of U-matic video was visible in the transferred file. He didn’t want the tape, in other words, to be ‘cleaned up’ in any way. To migrate the tape to digital file we used our standardised transfer chain for U-matic tape. This includes using an appropriate time-based corrector contemporary to the U-matic era, and conversion of the dub signal using a dedicated external dub – y/c converter circuit.

We are very happy to be working with projects such as the Videokunstarkivet. It has been a great opportunity to learn about the nuts and bolts design of cutting-edge digital video archives, as well as discover the work of Kjartan Slettemark, whose work is not well-known in the UK. Massive thanks must go to Per for his generous sharing of time and knowledge in the process of writing this article. We wish the Videokunstarkivet every success and hope it will raise the profile of Norwegian video art across the world.

New additions in the Great Bear Studio – BBC-adapted Studer Open reel tape machine

June 24th, 2014

We recently acquired a new Studer open reel tape machine to add to our extensive collection of playback equipment.

This Studer is, however, different from the rest, because it originally belonged to BBC Bristol. It therefore bears the hall marks of a machine specifically adapted for broadcast use.

The tell tale signs can be found in customised features, such as control faders and switches. These enabled sound levels to be controlled remotely or manually.

The presence of peak programme meters (P.P.M.), buttons that made it easy to see recording speeds (7.5/15 inches per second), as well as switches between cues and channels, was also specific to broadcast use.


Studer tape machines were favoured in professional contexts because of their ‘sturdy tape transport mechanism with integrated logic control, electronically controlled tape tension even during fast wind and braking phases, electronic sensing of tape motion and direction, electronic tape timing, electronic speed control, plug-in amplifier modules with separately plug-gable equalization and level pre-sets plus electronic equalization changeover.’

Because of Studer’s emphasis on engineering quality, machines could be adapted according to the specific needs of a recording or broadcast project.  

In our digitisation work at Great Bear, we have also adapted a Studer machine to clean damaged or shedding tapes prior to transfer. The flexibility of the machine enables us to remove fixed guides so vulnerable tape can move safely through the transport. This preservation-based adaptation is testimony to the considered design of Studer open reel tape machines, even though it diverges from their intended use.

If you want to learn a bit more about the Equipment department at the BBC who would have been responsible for adapting machines, follow this link.


ADAPT, who are researching the history of television production, also have an excellent links section on their website, including one to the BBC’s Research and Development (R&D) archive which houses many key digitised publications relating to the adoption and use of magnetic tape in the broadcast industry.

The difference ten years makes: changes in magnetic tape recording and storage media

June 24th, 2014

Generational change for digital technologies is rapid and disruptive. ‘In the digital context the next generation may only be five to ten years away!’ Tom Gollins from the National Archives reminds us, and this seems like a fairly conservative estimate.

It can feel like the rate of change is continually accelerating, with new products appearing all the time. It is claimed, for example, that the phenomenon of ‘wearable tech chic’ is now upon us, with the announcement this week that Google Glass is available to buy for £1,000.

The impact of digital technologies has been felt throughout society, and this issue will be explored in a large immersive exhibition of art, design, film, music and videogames held at the Barbican July-Sept 2014. It is boldly and emphatically titled: Digital Revolution.

To bring such technological transformations back into focus with our work at Great Bear, consider this 2004 brochure that recently re-surfaced in our Studio. As an example of the rapid rate of technological change, you need look no further.

A mere ten years ago, you could choose between several brands of audio mini disc, ADAT, DAT, DTRS, Betacam SP, Digital Betacam, super VHS, VHS-C, 8mm and mini DV.

Storage media such as Zip disks, Jaz cartridges, Exabyte tapes and hard drives that could store between 36 and 500 GB of data were also available to purchase.

RMGI are currently the only manufacturer of professional open reel audio tape. In the 2004 catalogue, different brands of open reel analogue tape are listed at a third of 2014 retail prices, taking into account rates of inflation.

While some of the products included in the catalogue, namely CDs, DVDs and open reel tape, have maintained a degree of market resiliency due to practicality, utility or novelty, many have been swept aside in the march of technological progress that is both endemic and epidemic in the 21st century.

 

 

Digitisation: methodologies, processing and archival practices

June 16th, 2014

We work with a range of customers at Great Bear, digitising anything from personal collections to the content of institutional archives. Because of this, what customers need from a digitisation service can be very different.

A key issue relates to the question of how much we process the digital file, both as part of the transfer and in post-production. In other words, to what extent do we make alterations to the form of the recording when it becomes a digitised artefact? While this may seem like an innocuous problem, the question of whether or not to apply processing, and thereby radically transform the original recording, is a fraught and, for some people, ethical consideration.

There are times when applying processing technologies is desirable and appropriate. With the transfer of video tape, for example, we always use time-based correctors or frame synchronisers to reduce or eliminate errors during playback. Some better quality video tape machines, such as the U-matic BVU-950P, already have time-based correctors built in, which makes external processing unnecessary. As the AV Artifact Atlas explains, however, time base errors are very common with video tape:

‘When a different VTR is used to playback the same signal, there can be slight mechanical and electronic differences that prevent the tape from being read in the same way it was written. Perhaps the motors driving the tape in a playback VTR move slightly slower than they did in the camera that recorded the tape, or maybe the head of the playback VTR rotates a fraction quicker than the video head in the machine that recorded the tape. These tiny changes in timing can dramatically affect stability in a video image.’

We also utilise built-in processes that are part of the machine’s circuitry, such as drop-out compensation and noise reduction. We use these, however, not in order to make the tape ‘look better’, but rather as a standard calibration set-up, which is necessary for the successful playback of the tape in a manner appropriate to its original operating environment.

After all, video tape machines were designed to be interchangeable. It is likely such stabilising processing would have been regularly used to play back tapes in machines that were different to those they were recorded on. Time-based correction and frame synchronisation are therefore integral to the machine/ playback circuitry, and using such processing tools is central to how we successfully migrate tape-based collections to digital files.

Digital processing tools

Our visual environment has changed dramatically since the days when domestic video tape was first introduced, let alone since the heyday of VHS. The only certainty is that it will continue to change. Once it was acceptable for images to be a bit grainy and low resolution; now only the crisp clarity of a 4K Ultra HD image will do. There is perhaps the assumption that ‘clearer is better’, that being able to watch moving images in minute detail is a marker of progress. Yet should this principle be applied to the kinds of digitisation work we do at Great Bear? There are processors that can transform the questionable analogue image into a bright, high definition, colour enriched digital copy. The Teranex processor, for example, ‘includes extremely high quality de-interlacing, up conversion, down conversion, SD and HD cross/standards conversion, automatic cadence detection and removal even with edited content, noise reduction, adjustable scaling and aspect ratio conversion.’ ‘Upgrading’ analogue images in this way does come with certain ethical risks.

Talking about ethics in conjunction with video or audio tape might seem a bit melodramatic, but it is at the point of intervention/ non-intervention where the needs of our customers diverge the most. This is not to say that people who do want to process their tapes are unethical – far from it! We understand that for some customers it may be preferable for such processing to occur, or to apply other editing techniques such as noise reduction or amplification, so that audio can be heard with greater clarity.

Instead we want to emphasise that our priority is getting the best out of the tape and our playback machines, rather than relying on the latest processing technology, which is itself at risk of obsolescence. After all, a heavily processed file will always require further processing at some unknown point in the future so that it can remain visually relevant to whatever format is commercially dominant at the time. Such transformations of the digital file, which are necessarily destructive and permanent, contribute to the further circulation of what Hito Steyerl calls 'poor images': 'a rag or a rip; an AVI or a JPEG…The poor image has been uploaded, downloaded, shared, reformatted, and reedited. It transforms quality into accessibility, exhibition value into cult value, films into clips, contemplation into distraction.'

Maintaining the integrity and, as far as possible, the authenticity of the original recordings is a core part of our methodology. In this way our approach corresponds with Jisc's mantra of 'reproduction not optimisation', where they write:

‘Improving, altering or modifying media for optimisation may seem logical when presenting works to a public or maintaining perceived consistency. It should be remembered that following an often natural inclination to enhance what we perceive to be a poor level of quality is a subjective process prescribed by personal preference, technological trends and cultural influences. In many cases the intentions of a creator are likely to be unknown and this can cause difficulties in interpreting levels of quality. In these instances common sense alongside trepidation should prevail. On the one end of the spectrum unintelligible recordings may be of little use to anyone, whereas at the opposite end recordings from previous eras were not produced with modern standards of clarity in mind.’

It is important to bear in mind, however, that even if a file is subject to destructive editing there may come a time when the metadata created about the artefact can help to illuminate its context and provenance, and therefore help it maintain its authenticity. The debates regarding digital authenticity and archiving will of course shift as time passes and practices evolve.

In the meantime, we will continue to do what we are most skilled at: restoring, repairing and migrating magnetic tape to digital files in a manner that maintains both the integrity of the original operating environment and the recorded signal.

Future tape archaeology: speculations on the emulation of analogue environments

June 2nd, 2014

At the recent Keeping Tracks symposium held at the British Library, AV scoping analyst Adam Tovell stated that

‘there is consensus internationally that we as archivists have a 10-20 year window of opportunity in which to migrate the content of our physical sound collections to stable digital files. After the end of this 10-20 year window, general consensus is that the risks faced by physical media mean that migration will either become impossible or partial or just too expensive.’

This point of view certainly corresponds to our experience at Great Bear. As collectors of a range of domestic and professional video and audio tape playback machines, we are aware of the particular problems posed by machine obsolescence. Replacement parts can be hard to come by, and the engineering expertise needed to fix machines is becoming esoteric wisdom. Tape degradation is of course a problem too. These combined factors influence the shortened horizon of magnetic tape-based media.

All may not be lost, however, if we take heart from a recent article which reported the development of an exciting technology that will enable memory institutions to recover recordings made over 125 years ago on mouldy wax cylinders or acid-leaching lacquer discs.

IRENE (Image, Reconstruct, Erase Noise, Etc.), developed by physicist Carl Haber at the Lawrence Berkeley National Laboratory, is a software programme that ‘photographs the grooves in fragile or decayed recordings, stitches the “sounds” together with software into an unblemished image file, and reconstructs the “untouchable” recording by converting the images into an audio file.’

The programme was developed by Haber after he heard a radio show discuss the Library of Congress’ audio collections that were so fragile they risked destruction if played back. Haber speculated that the insights gained from a project he was working on could be used to recover these audio recordings. ‘“We were measuring silicon, why couldn’t we measure the surface of a record? The grooves at every point and amplitude on a cylinder or disc could be mapped with our digital imaging suite, then converted to sound.”’

For those involved in the development of IRENE, there was a strong emphasis on the benefits of patience and placing trust in the inevitable restorative power of technology. ‘It’s ironic that as we put more time between us and the history we are exploring, technology allows us to learn more than if we had acted earlier.’

Can such a hands-off approach be applied to magnetic tape-based media? Is the 10-20 year window of opportunity described by Tovell above unnecessarily short? After all, it is still possible to play back wax cylinder recordings from the early 20th century, which seem to survive well over long periods of time, and magnetic tape is far more durable than is commonly perceived.

In a fascinating audio recording made for the Pitt Rivers Museum in Oxford, Nigel Bewley from the British Library describes how he migrated wax cylinder recordings that were made by Evans-Pritchard in 1928-1930 and Diamond Jenness in 1911-1912. Although Bewley describes his frustration with the preparation process, he reveals that once he had established the size of stylus and rotational speed of the cylinder player, the transfer was relatively straightforward.

You will note that in contrast with the recovery work made possible by IRENE, the cylinder transfer was made using an appropriate playback mechanism, examples of which can be accessed on this amazing section of the British Library's website (here you can also browse through images and information about disc cutters, magnetic recorders, radios, record players, CD players and accessories such as needle tins and headphones – a bit of a treasure trove for those inclined toward media archaeology).

Perhaps the development of the IRENE technology will mean that it will no longer be necessary to use such ‘authentic’ playback mechanisms to recover information stored on obsolete media. This brings us neatly to the question of emulation.

Emulation


If we assume that all the machines that playback magnetic tape become irrevocably obsolete in 10-20 years, what other potential extraction methods may be available? Is it possible that emulation techniques, commonly used in the preservation of born-digital environments, can be applied to recover the recorded information stored on magnetic tape?

In a recent interview Dirk Von Suchodoletz explains that:

‘Emulation is a concept in digital preservation to keep things, especially hardware architectures, as they were. As the hardware itself might not be preservable as a physical entity it could be very well preserved in its software reproduction. [...] For memory institutions old digital artifacts become more easy to handle. They can be viewed, rendered and interacted-with in their original environments and do not need to be adapted to our modern ones, saving the risk of modifying some of the artifact’s significant properties in an unwanted way. Instead of trying to mass-migrate every object in the institution’s holdings objects are to be handled on access request only, significantly shifting the preservation efforts.’

For the sake of speculation, let us imagine we are future archaeologists and consider some of the issues that may arise when seeking to emulate the operating environments of analogue-based tape media.

To begin with, without a working transport mechanism which facilitates the transmission of information, the emulation of analogue environments will need to establish a circuitry that can process the Radio Frequency (RF) signals recorded on magnetic tape. As Jonathan Sterne reflects, ‘if [...] we say we have to preserve all aspects of the platform in order to get at the historicity of the media practice, that means archival practice will have to have a whole new engineering dimension to it.’

Yet with the emulation of analogue environments, engineering may have to be a practical consideration rather than an archival one. For example, some kind of transport mechanism would presumably have to be emulated through which the tape could be passed. It would be tricky to lay the tape out flat and take samples of information from its surface, as IRENE's software does with grooved media, because of the sheer length of tape when it is unwound. Without an emulated transport mechanism, recovery would be time consuming and therefore costly, a point that Tovell intimates at the beginning of the article. Furthermore, added time and costs would necessitate even more complex selection and appraisal decisions on behalf of archivists managing inoperative magnetic tape-based collections. Questions about value will become fraught and most probably politically loaded. Even with an emulated transport mechanism, issues such as tape vulnerability and head clogs, which of course impact on current migration practices, would come into play.
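To make the idea of a 'software circuit' slightly more concrete, here is a purely speculative Python sketch of one small piece of such an emulation: recovering a baseband signal from a frequency-modulated carrier of the kind video recorders lay down on tape. It assumes digitised RF samples already exist (precisely what a missing transport mechanism would deny us), and the sample rate and test signal are invented for illustration.

import numpy as np
from scipy.signal import hilbert

SAMPLE_RATE = 40_000_000    # assumed digitisation rate of the RF capture, in Hz

def fm_demodulate(rf):
    """Recover the instantaneous frequency of an FM carrier captured off tape."""
    analytic = hilbert(rf)                      # complex analytic signal
    phase = np.unwrap(np.angle(analytic))       # continuous phase
    return np.diff(phase) * SAMPLE_RATE / (2 * np.pi)   # frequency carries the recorded signal

# Toy test: a 4 MHz carrier frequency-modulated by a slow ramp.
t = np.arange(0, 0.001, 1 / SAMPLE_RATE)
baseband = np.linspace(-1.0, 1.0, t.size)
phase = 2 * np.pi * (4e6 * t + 500e3 * np.cumsum(baseband) / SAMPLE_RATE)
recovered = fm_demodulate(np.cos(phase))

print(f"mean recovered frequency: {recovered.mean() / 1e6:.2f} MHz")   # close to the 4 MHz carrier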

Audio and video differences

On a technical level emulation may be vastly more achievable for audio, where the signal is recorded using a longitudinal method and plays back via a relatively simple process. Audio tape is also far less proprietary than video tape. On the SONY APR-5003V machine we use in the Great Bear Studio, for example, it is possible to play back tapes of different sizes, speeds, brands and track formations via adjustments of the playback heads. Such versatility would of course need to be replicated in any emulation environment.

The technical circuitry for playing back video tape, however, poses significantly more problems. Alongside the helical scan method, which records images diagonally across the video tape in order to prevent the appearance of visible joints between the signal segments, there are several heads used to read the components of the video signal: the image (video), audio and control (synch) tracks.

Unlike audio, video tape circuitry is more proprietary and therefore far less interoperable. You can't play a VHS tape on a U-matic machine, for example. Numerous mechanical infrastructures would therefore need to be devised to correspond with the relevant operating environments – one size fits all would (presumably) not be possible.

A generic emulated analogue video tape circuit may be created, but this would only capture part of the recorded signal (which, as we have explored elsewhere on the blog, may be all we can hope for in the transmission process). If such systems are to be developed it is surely imperative that action is taken now while hardware is operative and living knowledge can be drawn upon in order to construct emulated environments in the most accurate form possible.

While hope may rest in technology’s infinite capacity to take care of itself in the end, excavating information stored on magnetic tape presents far more significant challenges when compared with recordings on grooved media. There is far more to tape’s analogue (and digital) circuit than a needle oscillating against a grooved inscription on wax, lacquer or vinyl.

The latter part of this article has of course been purely speculative. It would be fascinating to learn about projects attempting to emulate the analogue environment in software – please let us know if you are involved in anything in the comments below.

Capitalising on the archival market: SONY’s 185 TB tape cartridge

May 20th, 2014

In Trevor Owens' excellent blog post 'What Do you Mean by Archive? Genres of Usage for Digital Preservers', he outlines the different ways 'archive' is used to describe data sets and information management practices in contemporary society. While the article shows it is important to distinguish between tape archives, archives as records management, personal papers and computational archives, Owens does not include an archival 'genre' that will become increasingly significant in the years to come: the archival market.

The announcement in late April 2014 that SONY has developed a tape cartridge capable of storing 185 TB of data was greeted with much excitement throughout the tech world. The invention, developed with IBM, is 'able to achieve the high storage capacity by utilising a "nano-grained magnetic layer" consisting of tiny nano-particles' and boasts the world's highest areal recording density of 148 Gb/in².
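As a rough back-of-the-envelope check in Python, an areal density of 148 Gb/in² over a few hundred metres of half-inch tape does land in the region of the headline figure. The tape dimensions below are assumptions for illustration, not published specifications.

# Back-of-the-envelope capacity check. Tape width and length are assumptions.
areal_density_gbit_per_sq_inch = 148
tape_width_inches = 0.5                    # half-inch tape, assumed
tape_length_inches = 500 * 39.37           # roughly 500 metres of tape, assumed

area_sq_inches = tape_width_inches * tape_length_inches
capacity_tb = areal_density_gbit_per_sq_inch * area_sq_inches / 8 / 1000   # Gbit -> GB -> TB

print(round(capacity_tb), "TB")            # about 182 TB, close to the announced 185 TB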

The news generated such surprise because it signalled the curious durability of magnetic tape in a world thought to have 'gone tapeless'. For companies that need to store large amounts of data, however, tape storage, usually in the form of Linear Tape Open (LTO) cartridges, has remained an economically sound solution despite the availability of file-based alternatives. Imagine the amount of energy required to keep the zettabytes of data that exist in the world today constantly powered up and spinning. Whatever the benefits of random access, that would be a gargantuan electricity bill.

Indeed, tape cartridges are being used more and more to store large amounts of data. According to the Tape Storage Council industry group, tape capacity shipments grew by 13 percent in 2012 and were projected to grow by 26 percent in 2013. SONY’s announcement is therefore symptomatic of the growing archival market which has created demand for cost effective data storage solutions.

It is not just magnetic tape that is part of this expanding market. Sony, Panasonic and Fuji are developing optical 'Archival Discs' capable of storing 300 GB (available in summer 2015), with plans to develop 500 GB and 1 TB discs.

Why is there such a demand for data storage?

Couldn’t we just throw it all away?

The Tape Storage Council explain:

‘This demand is being driven by unrelenting data growth (that shows no sign of slowing down), tape’s favourable economics, and the prevalent data storage mindset of “save everything, forever,” emanating from regulatory, compliance or governance requirements, and the desire for data to be repurposed and monetized in the future.’

The radical possibilities of data-based profit-making abound in the 'buzz' that surrounds big data, an ambitious form of data analytics that has been embraced by academic research councils, security forces and multi-national companies alike.

Presented by proponents as the way to gain insights into consumer behaviour, big data apparently enables companies to unlock the potential of ‘data-driven decision making.’ For example, an article in Computer Weekly describes how Ebay is using big data analytics so they can better understand the ‘customer journey’ through their website.

Ebay’s initial forays into analysing big data were in fact relatively small: in 2002 the company kept around 1% of customer data and discarded the rest. In 2007 the company changed their policy, and worked with an established company to develop a custom data warehouse which can now run ad-hoc queries in just 32 seconds.

It is not just Ebay who are storing massive amounts of customer data. According to the BBC, ‘Facebook has begun installation of 10,000 Blu-ray discs in a prototype storage cabinet as back-ups for users’ photos and videos’. While for many years the internet was assumed to be a virtual, almost disembodied space, the desire from companies to monetise information assets mean that the incidental archives created through years of internet searches, have all this time been stored, backed up and analysed.

Amid all the excitement and promotion of big data, the lack of critical voices raising concern about social control, surveillance and ethics is surprising. Are people happy that the data we create is stored, analysed and re-sold, often without our knowledge or permission? What about civil liberties and democracy? What power do we have to resist this subjugation to the irrepressible will of the data-driven market?

These questions are pressing, and need to be widely discussed throughout society. Current predictions are that the archive market will keep growing and growing.

'A recent report from the market intelligence firm IDC estimates that in 2009 stored information totalled 0.8 zettabytes, the equivalent of 800 billion gigabytes. IDC predicts that by 2020, 35 zettabytes of information will be stored globally. Much of that will be customer information. As the store of data grows, the analytics available to draw inferences from it will only become more sophisticated.'

The development of SONY's 185 TB tape indicates that the company is well placed to capitalise on these emerging markets.

The kinds of data stored on the tapes when they become available for professional markets (these tapes are not aimed at consumers) will really depend on the legal regulations placed on the companies doing the data collecting. As the case of eBay discussed earlier makes clear, companies will collect all the information they are allowed to. But should they be allowed to? As citizens of the internet society, how can we ensure we have a 'right to be forgotten'? How are the shackles of data-driven control societies broken?

Going 'tape-less': AS-11 Digital Production Partnership standards

May 7th, 2014

Is this the end of tape as we know it? Maybe not quite yet, but October 1, 2014, will be a watershed moment in professional media production in the UK: it is the date that file format delivery will finally ‘go tape-less.’

Establishing end-to-end digital production will cut out what is now seen as the cumbersome use of video tape in file delivery. Using tape essentially adds a layer of media activity to a process that is predominantly file based anyway. As Mark Harrison, Chair of the Digital Production Partnership (DPP), reflects:


Example of a workflow for the DPP AS-11 standard

‘Producers are already shooting their programmes on tapeless cameras, and shaping them in tapeless post production environments. But then a strange thing happens. At the moment a programme is finished it is transferred from computer file to videotape for delivery to the broadcaster. When the broadcaster receives the tape they pass it to their playout provider, who transfers the tape back into a file for distribution to the audience.’

Founded in 2010, the DPP are a ‘not-for-profit partnership funded and led by the BBC, ITV and Channel 4 with representation from Sky, Channel 5, S4/C, UKTV and BT Sport.’ The purpose of the coalition is to help ‘speed the transition to fully digital production and distribution in UK television’ by establishing technical and metadata standards across the industry.

The transition to a standardised, tape-less environment has further been rationalised as a way to minimise confusion among media producers and help economise costs for the industry. As reported on Avid Blogs, production companies, which often have to respond to rapidly evolving technological environments, are frantically preparing for deadline day. 'It's the biggest challenge since the switch to HD', said Andy Briers from Crow TV. Moreover, this challenge is as much financial as it is technical: 'leading post houses predict that the costs of implementing AS-11 delivery will probably be more than the cost of HDCAM SR tape, the current standard delivery format', writes David Wood on televisual.com.

Outlining the standard

Audio post production should now be mixed to the EBU R128 loudness standard. As stated in the DPP’s producer’s guide, this new audio standard ‘attempts to model the way our brains perceive sound: our perception is influenced by frequency and duration of sound’ (9).
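For anyone curious what measuring to R128 looks like in practice, the sketch below uses the open-source Python libraries soundfile and pyloudnorm to report integrated loudness in LUFS (R128 delivery typically targets -23 LUFS). The filename is a placeholder, and this is an illustration rather than a compliance tool.

import soundfile as sf
import pyloudnorm as pyln

# Placeholder path: a finished programme mix exported as a WAV file.
audio, rate = sf.read("programme_mix.wav")

meter = pyln.Meter(rate)                        # BS.1770 meter, the measurement behind R128
loudness = meter.integrated_loudness(audio)     # integrated loudness in LUFS

print(f"Integrated loudness: {loudness:.1f} LUFS (R128 delivery target is -23 LUFS)")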

In addition, the following specifications must be observed to ensure the delivery format is 'technically legal' (a rough automated check is sketched after the list).

  • HD 1920×1080 in an aspect ratio of 16:9 (1080i/25)
  • AVC-I in MXF (Material Exchange Format) OP1a files to AS11 specification
  • DPP required metadata
  • Photo Sensitive Epilepsy (flashing) testing to OFCOM standard/ the Harding Test
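
As promised above, here is a rough sketch of how a couple of these parameters could be checked automatically, using the widely available ffprobe tool (part of FFmpeg) from Python. The filename is a placeholder and the fields queried cover only a fraction of the full DPP specification.

import json
import subprocess

# Placeholder filename; requires ffprobe (part of FFmpeg) on the PATH.
result = subprocess.run(
    ["ffprobe", "-v", "error", "-select_streams", "v:0",
     "-show_entries", "stream=width,height,display_aspect_ratio",
     "-of", "json", "delivery_master.mxf"],
    capture_output=True, text=True, check=True)

stream = json.loads(result.stdout)["streams"][0]
raster_ok = stream["width"] == 1920 and stream["height"] == 1080
print("Raster:", stream["width"], "x", stream["height"],
      "| aspect:", stream.get("display_aspect_ratio", "unknown"),
      "|", "matches 1920x1080" if raster_ok else "does NOT match the DPP requirement")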

The shift to file-based delivery will require new kinds of vigilance and attention to detail in order to manage the specific problems that will potentially arise. The DPP producer’s guide states: ‘unlike the tape world (where there may be only one copy of the tape) a file can be copied, resulting in more than one essence of that file residing on a number of servers within a playout facility, so it is even more crucial in file-based workflows that any redelivered file changes version or number’.

Another big development within the standard is the important role performed by metadata, both structural (inherent to the file) and descriptive (added during the course of making the programme). While broadcasters may be used to manually writing metadata as descriptive information on tape boxes, it must now be added to the digital file itself. Furthermore, 'the descriptive and technical metadata will be wrapped with the video and audio into a new and final AS-11 DPP MXF file,' and if 'any changes to the file are [made it is] likely to invalidate the metadata and cause the file to be rejected. If any metadata needs to be altered this will involve re-wrapping the file.'

Interoperability: the promise of digital technologies

The sector-wide agreement and implementation of digital file-delivery standards are significant because they represent a commitment to manufacturing full interoperability, an inherent potential of digital technologies. As French philosopher of technology Bernard Stiegler explains:

‘The digital is above all a process of generalised formalisation. This process, which resides in the protocols that enable interoperability, makes a range of diverse and varied techniques. This is a process of unification through binary code of norms and procedures that today allow the formalisation of almost everything: traveling in my car with a GPS system, I am connected through a digitised triangulation process that formalises my relationship with the maps through which I navigate and that transform my relationship with territory. My relationships with space, mobility and my vehicle are totally transformed. My inter-individual, social, familial, scholarly, national, commercial and scientific relationships are all literally unsettled by the technologies of social engineering. It is at once money and many other things – in particular all scientific practices and the diverse forms of public life.’


This systemic homogenisation described by Stiegler is called into question if we consider whether the promise of interoperability – understood here as different technical systems operating efficiently together – has ever been fully realised by the current generation of digital technologies. If it had been, initiatives like the DPP's would never have to be pursued in the first place – all kinds of technical operations would run in a smooth, synchronous manner. Amid the generalised formalisation there are many micro-glitches and incompatibilities that slow operations down at best, and grind them to a halt at worst.

With this in mind we should note that standards established by the DPP are not fully interoperable internationally. While the DPP’s technical and metadata standards were developed in close alliance with the US-based Advanced Media Workflow Association’s (AMWA) recently released AS-11 specification, there are also key differences.

As reported in 2012 by Broadcast Now Kevin Burrows, DPP Technical Standards Lead, said: ‘[The DPP standards] have a shim that can constrain some parameters for different uses; we don’t support Dolby E in the UK, although the [AMWA] standard allows it. Another difference is the format – 720 is not something we’d want as we’re standardising on 1080i. US timecode is different, and audio tracks are referenced as an EBU standard.’ Like NTSC and PAL video/ DVD then, the technical standards in the UK differ from those used in the US. We arguably need, therefore, to think about the interoperability of particular technical localities rather than make claims about the generalised formalisation of all technical systems.  Dis-synchrony and technical differences remain despite standardisation.

The AmberFin Academy blog have also explored what they describe as the ‘interoperability dilemma’. They suggest that the DPP’s careful planning mean their standards are likely to function in an efficient manner: ‘By tightly constraining the wrapper, video codecs, audio codecs and metadata schema, the DPP Technical Standards Group has created a format that has a much smaller test matrix and therefore a better chance of success. Everything in the DPP File Delivery Specification references a well defined, open standard and therefore, in theory, conformance to those standards and specification should equate to complete interoperability between vendors, systems and facilities.’ They do however offer these words of caution about user interpretation: ‘despite the best efforts of the people who actually write the standards and specifications, there are areas that are, and will always be, open to some interpretation by those implementing the standards, and it is unlikely that any two implementations will be exactly the same. This may lead to interoperability issues.’

It is clear that there is no one simple answer to the dilemma of interoperability and its implementation. Establishing a legal commitment, and a firm deadline date for the transition, is however a strong message that there is no turning back. Establishing the standard may also lead to a certain amount of technological stability, comparable to the development of the EIAJ video tape standards in 1969, the first standardised format for industrial/non-broadcast video tape recording. Amid these changes in professional broadcast standards, the increasingly loud call for standardisation among digital preservationists should also be acknowledged.

For analogue and digital tapes however, it may well signal the beginning of an accelerated end. The professional broadcast transition to ‘full-digital’ is a clear indication of tape’s obsolescence and vulnerability as an operable media format.

Significant properties – technical challenges for digital preservation

April 28th, 2014

A consistent focus of our blog is the technical and theoretical issues that emerge in the world of digital preservation. For example, we have explored the challenges archivists face when they have to appraise collections in order to select what materials are kept, and what are thrown away. Such complex questions take on specific dimensions within the world of digital preservation.

If you work in digital preservation then the term ‘significant properties’ will no doubt be familiar to you. The concept has been viewed as a hindrance due to being shrouded by foggy terminology, as well as a distinct impossibility because of the diversity of digital objects in the world which, like their analogue counterparts, cannot be universally generalised or reduced to a series of measurable characteristics.


In a technical sense, establishing a set of core characteristics for file formats has been important for initiatives like Archivematica, ‘a free and open-source digital preservation system that is designed to maintain standards-based, long-term access to collections of digital objects.’ Archivematica implement ‘default format policies based on an analysis of the significant characteristics of file formats.’ These systems manage digital information using an ‘agile software development methodology’ which ‘is focused on rapid, iterative release cycles, each of which improves upon the system’s architecture, requirements, tools, documentation, and development resources.’
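By way of illustration (and not Archivematica's own tooling), the Python sketch below records a handful of measurable characteristics of a digitised audio file, the kind of technical facts a format policy might treat as significant. It assumes the soundfile library and a placeholder WAV filename.

import hashlib
import json
import soundfile as sf

path = "digitised_transfer.wav"      # placeholder filename

info = sf.info(path)                 # technical characteristics read from the file header
with open(path, "rb") as f:
    checksum = hashlib.sha256(f.read()).hexdigest()

# A handful of measurable characteristics one might record as 'significant'.
properties = {
    "container": info.format,        # e.g. WAV
    "encoding": info.subtype,        # e.g. PCM_24
    "channels": info.channels,
    "sample_rate": info.samplerate,
    "duration_seconds": info.duration,
    "sha256": checksum,
}
print(json.dumps(properties, indent=2))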

Such an agile philosophy may elicit groans of frustration from information managers who may well want to leave their digital collections alone and practise a culture of non-intervention. Yet this adaptive style of project management, which is designed to respond rapidly to change, is often contrasted with predictive development that focuses on risk assessment and the planning of long-term projects. The argument against predictive methodologies is that, as a management model, they can be unwieldy and unresponsive to change. This can have damaging financial consequences, particularly when investing in expensive, risky and large-scale digital preservation projects, as the BBC's failed DMI initiative demonstrates.

Indeed, agile software development methodology may well be an important key to the sustainability of digital preservation systems, which need to find practical ways of manoeuvring around technological innovations and the culture of perpetual upgrade. Agility in this context is synonymous with resilience, and the practical application of significant properties as a means to align file format interoperability offers a welcome anchor in a technological environment structured by persistent change.

Significant properties vs the authentic digital object

What significant properties imply, as archival concept and practice, is that desiring authenticity for the digitised and born-digital objects we create is likely to end in frustration. Simply put, preserving all the information that makes up a digital object is a hugely complex affair, and is a procedure that will require numerous and context-specific technical infrastructures.

As Trevor Owens explains: 'you can't just "preserve it" because the essence of what matters about "it" is something that is contextually dependent on the way of being and seeing in the world that you have decided to privilege.' Owens uses the example of the GeoCities web archiving project to demonstrate that if you don't have the correct, let's say 'authentic', tools to interpret a digital object (in this case, a website that is only discernible on certain browsers), you simply cannot see the information accurately. Part of the signal is always missing, even if something 'significant' remains (the text or parts of the graphics).

It may be desirable 'to preserve all aspects of the platform in order to get at the historicity of the media practice', Jonathan Sterne, author of MP3: The Meaning of a Format, suggests, but in a world that constantly displaces old technological knowledge with new, settling for the preservation of significant properties may be a pragmatic rather than ideal solution.

Analogue to digital issues

To bring these issues back to the tape we work with at Great Bear, there are of course times when it is important to use the appropriate hardware to play the tapes back, and there is a certain amount of historically specific technical knowledge required to make the machines work in the first place. We often wonder what will happen to the specialised knowledge learnt by media engineers in the 70s, 80s and 90s, who operated tape machines that are now obsolete. There is the risk that when those people die, the knowledge will die with them. Of course it is possible to get hold of operating manuals, but this is by no means a guarantee that the mechanical techniques will be understood within a historical context that is increasingly tape-less and software-based. By keeping our wide selection of audio and video tape machines purring, we are sustaining a machinic-industrial folk knowledge which ultimately helps to keep our customers' magnetic tape-based media memories alive.

Of course a certain degree of historical accuracy is required in the transfers because, very obviously, you can’t play a V2000 tape on a VHS machine, no matter how hard you try!

Yet the need to play back tapes on exactly the same machine becomes less important in instances where the original tape was recorded on a domestic reel-to-reel recorder, such as the Grundig TK series, which may not have been of the greatest quality in the first place. To get the best digital transfer it is desirable to play back tapes on a machine with higher specifications that can read the magnetic information on the tape as fully as possible. This is because you don’t want to add any more errors to the tape in the transfer process by playing it back on a lower quality machine, which would then of course become part of the digitised signal.

It is actually very difficult to remove things like wow and flutter after a tape has been digitised, so it is far better to ensure machines are calibrated appropriately before the tape is migrated, even if the tape was not originally recorded on a machine with professional specifications. What is ultimately at stake in transferring analogue tape to digital formats is the quality of the signal. Absolute authenticity is incidental here, particularly if things sound bad.
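A small Python sketch helps show why: wow and flutter can be measured from a steady test tone by tracking how its pitch wanders, but an ordinary recording contains no such reference to lock onto, which is what makes correction after digitisation so difficult. The test signal and figures below are invented for illustration.

import numpy as np

RATE = 48_000
TONE = 3_150        # Hz: the reference tone conventionally used for wow and flutter tests

# Invent ten seconds of a 3150 Hz tone with a slow speed variation (wow) baked in.
t = np.arange(0, 10, 1 / RATE)
speed = 1 + 0.003 * np.sin(2 * np.pi * 0.5 * t)          # +/- 0.3% speed wobble
signal = np.sin(2 * np.pi * TONE * np.cumsum(speed) / RATE)

# Track the tone's frequency window by window via the peak of the spectrum.
window = 16_384
freqs = []
for start in range(0, signal.size - window, window):
    chunk = signal[start:start + window] * np.hanning(window)
    spectrum = np.abs(np.fft.rfft(chunk))
    freqs.append(np.argmax(spectrum) * RATE / window)

deviation = np.max(np.abs(np.array(freqs) - TONE)) / TONE
print(f"peak pitch deviation: {100 * deviation:.2f}% of nominal")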

The moral of this story, if there can be one, is that with any act of transmission, the recorded signal is liable to change. These can be slight alterations or huge drop-outs and everything in-between. The agile software developers know that given the technological conditions in which current knowledge is produced and preserved, transformation is inevitable and must be responded to. Perhaps it is realistic to assume this is the norm in society today, and creating digital preservation systems that are adaptive is key to the survival of information, as well as accepting that preserving the ‘full picture’ cannot always be guaranteed.

Irene Brown’s reel to reel recordings of folk and Gaelic culture

April 22nd, 2014

We are currently migrating a collection of tapes made by Irene Brown who, in the late 1960s, was a school teacher living in Inverness. Irene was a member of the Inverness folk club and had a strong interest in singing, playing guitar and collecting the musical heritage of folk and Gaelic culture.

The tapes, which were sent by her niece, Mrs Linda Baublys, are documents of her Auntie's passion, and include recordings Irene made of folk music sung in a mixture of Gaelic and English at the Gellions pub, Inverness, in the late 1960s.

The tapes also include recordings of her family singing together. Linda remembered fondly childhood visits to her ‘Granny’s house that was always filled with music,’ and how her Auntie used to ‘roar and sing.’

Perhaps most illustriously, the tapes include a prize-winning performance at the annual An Comunn Gaidhealach/The National Mòd (now the Royal National Mòd). The festival, which has taken place annually at different sites across Scotland since it was founded in 1892, is modelled on the Welsh Eisteddfod and acts 'as a vehicle for the preservation and development of the Gaelic language. It actively encourages the teaching, learning and use of the Gaelic language and the study and cultivation of Gaelic literature, history, music and art.' Mòd festivals also help to keep Gaelic culture alive among diasporic Scottish communities, as demonstrated by the US Mòd that has taken place annually since 2008.

If you want to find out more about Gaelic music, visit the Year of the Song website run by BBC Alba, where you can access a selection of songs from the BBC's Gaelic archive. If you prefer doing research in archives and libraries, take a visit to the School of Scottish Studies Archives. Based at the University of Edinburgh, the collection comprises a significant sound archive containing thousands of recordings of songs, instrumental music, tales, verse, customs, beliefs, place-names, biographical information and local history, encompassing a range of dialects and accents in Gaelic, Scots and English.

As well as learning some of the songs recorded on the tape to play herself, Linda plans to eventually deposit the digitised transfers with the School of Scottish Studies Archives. She will also pass the recordings on to a local school that has a strong engagement with traditional Gaelic music.

Digitising and country lanes

Linda told us it was a 'long slog' to get the tapes. After Irene died at the age of 42, it was too upsetting for her mother (Linda's Granny) to listen to them. The tapes were then passed on to Linda's mother, who also never played them, so when she passed away Linda, who had been asking for the tapes for nearly 20 years, took responsibility for getting them digitised.


The tapes were in fairly good condition and minimal problems arose in the transfer process. One of the tapes was, however, suffering from 'country-laning'. This is when the shape of the tape has become bendy (like a country lane), most probably because it had been stored in fluctuating temperatures, which cause the tape to shrink and expand. It is more common in acetate-backed tape, although Linda's tapes were polyester-backed. Playing a tape suffering from country-laning often results in problems with the azimuth, because the angle between the tape head and the tape is misaligned. A signal can still be discerned, because analogue recordings rarely drop out entirely (unlike digital tape), but the recording may waver or otherwise be less audible. When the tape has been deformed in this way it is very difficult to totally reverse the process. Consequently there has to be some compromise in the quality of the transfer.
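For the technically curious, azimuth error on a stereo transfer shows up as a small, roughly constant time offset between the two channels, because a tilted head gap meets one edge of the tape before the other. The Python sketch below estimates that offset by cross-correlation; the filename is a placeholder, and a country-laned tape will drift rather than hold one neat value.

import numpy as np
import soundfile as sf
from scipy.signal import correlate

audio, rate = sf.read("irene_brown_reel.wav")    # placeholder stereo transfer
left, right = audio[:, 0], audio[:, 1]

# Cross-correlate a one-second excerpt: the lag of the correlation peak is the
# inter-channel time offset, which for a fixed head reflects the azimuth error.
excerpt = slice(10 * rate, 11 * rate)            # one second, ten seconds into the recording
corr = correlate(left[excerpt], right[excerpt], mode="full", method="fft")
lag_samples = int(np.argmax(corr)) - (rate - 1)  # zero lag sits at index rate - 1

# Positive or negative depending on which channel lags behind the other.
print(f"inter-channel offset: {lag_samples} samples ({1e6 * lag_samples / rate:.1f} microseconds)")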

We hope you will enjoy this excerpt from the tapes, which Linda has kindly given us permission to include in this article.

