DVCAM transfers, error correction coding & misaligned machines

December 17th, 2014

This article is inspired by a collection of DVCAM tapes sent in by London-based cultural heritage organisation Sweet Patootee. Below we explore several issues that arise from the transfer of DVCAM tapes, one of the many digital video formats that emerged in the mid-1990s. A second article will follow soon focusing on the content of the Sweet Patootee archive, a fascinating collection of video-taped oral histories of First World War veterans from the Caribbean.

The main issue we want to explore below is the role error correction coding performs, both in the composition of the digital video signal and during preservation playback. We want to highlight this issue because it is often assumed that DVCAM, which first appeared on the market in 1996, is a fairly robust format.

The work we have done to transfer tapes to digital files indicates that error correction coding is working in overdrive to ensure we can see and hear these recordings. The implication is that DVCAM collections, and wider DV-based archives, should really be a preservation priority for institutions, organisations and individuals.

Before we examine this in detail, let’s learn a bit about the technical aspects of error correction coding.

Error error error

Error correction coding is a staple part of audio and audio-visual digital media. It is of great importance in the digital world of today, where higher volumes of transmitted signals require greater degrees of compression, and therefore sophisticated error correction schemes, as this article argues.

Error correction works through a process of prediction and calculation known as interpolation or concealment. It estimates the original recorded signal in order to reconstruct parts of the data that have been corrupted. Corruption can occur either through wear and tear, or through insufficiencies in the original recorded signal.
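
To make the idea concrete, here is a minimal sketch of concealment by linear interpolation, assuming the player has already flagged which samples are corrupt; the sample values and the simple list representation are ours, not a description of any particular DV or CD decoder.

    # A minimal sketch of concealment by linear interpolation. Assumes
    # interior dropouts, i.e. good samples exist on both sides.
    def conceal(samples, bad):
        good = [i for i in range(len(samples)) if i not in bad]
        out = list(samples)
        for i in sorted(bad):
            left = max(g for g in good if g < i)    # nearest good sample before
            right = min(g for g in good if g > i)   # nearest good sample after
            t = (i - left) / (right - left)
            out[i] = samples[left] + t * (samples[right] - samples[left])
        return out

    # Samples 3 and 4 were lost to a dropout; the concealed values
    # (0.6, 0.8) are plausible estimates, not the original signal.
    print(conceal([0.0, 0.2, 0.4, None, None, 1.0, 1.2], {3, 4}))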

Yet as Hugh Robjohns explains in the article ‘All About Digital Audio’ from 1998:

 ‘With any error protection system, if too many erroneous bits occur in the same sample, there is a risk of the error detection system failing, and in practice, most media failures (such as dropouts on tape or dirt on a CD), will result in a large chunk of data being lost, not just the odd data bit here and there. So a technique called interleaving is used to scatter data around the medium in such a way that if a large section is lost or damaged, when the data is reordered many smaller, manageable data losses are formed, which the detection and correction systems can hopefully deal with.’

There are many different types of error correction, and ‘like CD-ROMs, DV uses Reed-Solomon (RS) error detection and correction coding. RS can correct localized errors, but seldom can reconstruct data damaged by a dropout of significant size (burst error),’ explains this wonderfully detailed article about DV video formats preserved on the Internet Archive.
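
The interleaving technique Robjohns describes is easy to demonstrate. The Python sketch below uses an invented 4x8 block size and treats symbols as list entries; it shows the principle rather than DV’s actual track layout.

    # A toy demonstration of interleaving. Data is written into a block
    # row by row but recorded column by column; a burst error on the
    # "tape" then lands as small scattered losses once de-interleaved.
    ROWS, COLS = 4, 8
    data = list(range(ROWS * COLS))

    # Interleave: read the block out column by column.
    tape = [data[r * COLS + c] for c in range(COLS) for r in range(ROWS)]

    # A burst wipes out six consecutive symbols on the tape.
    for i in range(10, 16):
        tape[i] = None

    # De-interleave back into row order.
    recovered = [None] * (ROWS * COLS)
    pos = 0
    for c in range(COLS):
        for r in range(ROWS):
            recovered[r * COLS + c] = tape[pos]
            pos += 1

    # No row loses more than two symbols, a loss a Reed-Solomon code
    # can plausibly correct, where a six-symbol run might defeat it.
    for r in range(ROWS):
        print(recovered[r * COLS:(r + 1) * COLS])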

The difference correction makes

Digital technology’s error correction is one of the key things that differentiates it from its analogue counterparts. As the IASA‘s Guidelines on the Production and Preservation of Digital Audio Objects (2009) explains:

‘Unlike copying analogue sound recordings, which results in inevitable loss of quality due to generational loss, different copying processes for digital recordings can have results ranging from degraded copies due to re-sampling or standards conversion, to identical “clones” which can be considered even better (due to error correction) than the original.’ (65)

To think that digital copies can, at times, exceed the quality of the original digital recording is both an astonishing and paradoxical proposition. After all, we are talking about a recording that improves at the perceptual level despite being compositionally damaged. It is important to remember that error correction coding cannot work miracles, and there are limits to what it can do.

Dietrich Schüller and Albrecht Häfner argue in the International Association of Sound and Audiovisual Archives’ (IASA) Handling and Storage of Audio and Video Carriers (2014) that ‘a perfect, almost error free recording leaves more correction capacity to compensate for handling and ageing effects and, therefore, enhances the life expectancy.’ If, however, a recording is made ‘with a high error rate, then there is little capacity left to compensate for further errors’ (28-29).

The bizarre thing about error correction coding, then, is the appearance of clarity it can create. And if there are no other recordings to compare with the transferred file, it is really hard to know what the recorded signal is supposed to look and sound like were its errors not being corrected.


When we watch the successfully migrated, error corrected file post-transfer, it matters little whether the original was damaged. If a clear signal is transmitted with high levels of error correction, the errors will not be transferred, only the clear image and sound.

Contrast this with a damaged analogue tape, where the damage is clearly discernible on playback. The plus point of analogue tape is that it degrades gracefully: it is possible to play back an analogue tape recording with real physical deterioration and still get surprisingly good results.

Digital challenges

The big challenge working with any digital recordings on magnetic tape is to know when a tape is in poor condition prior to playback. Often tape will look fine and, because of error correction, will sound fine too until it stops working entirely.

How then did we know that the Sweet Patootee tapes were experiencing difficulties?

Professional DV machines such as our DVC PRO have a warning function that flashes when the error-correction coding is working at heightened levels. With our first attempt to play back the tapes we noticed that regular sections on most of the tapes could not be fixed by error correction.

The ingest software we use is designed to automatically retry sections of the tape with higher levels of data corruption until a signal can be retrieved. Imagine a process where a tape automatically goes through a playing-rewinding loop until the signal can be read. We were able to play back the tapes eventually, but the high level of error correction was concerning.
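
A rough sketch of that retry behaviour might look like the following; the deck object and its rewind_to and read_section methods are invented names standing in for whatever the real ingest software and capture hardware expose.

    # A rough sketch of the retry loop described above, with invented
    # names for the capture API.
    MAX_RETRIES = 5
    ERROR_THRESHOLD = 0.01   # arbitrary acceptable error rate

    def ingest_section(deck, start, end):
        best_frames, best_rate = None, float("inf")
        for attempt in range(MAX_RETRIES):
            deck.rewind_to(start)
            frames, error_rate = deck.read_section(start, end)
            if error_rate < best_rate:
                best_frames, best_rate = frames, error_rate
            if error_rate < ERROR_THRESHOLD:
                break   # good enough, stop re-reading
        # After repeated failures, keep the best pass and flag the
        # section for manual review rather than discarding it.
        return best_frames, best_rate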

Diagram: composition of the DVCAM recorded signal

As this diagram makes clear, around 25% of the recorded signal in DVCAM is composed of subcode data, error detection and error correction.

DVCAM & misalignment

It is not just the over-active error correction on DVCAMs that should set alarm bells ringing.

Alan Griffiths from Bristol Broadcast Engineering, a trained SONY engineer with over 40 years’ experience in the television industry, told us that early DVCAM machines pose particular preservation challenges. The main problem is that the ‘mechanisms are completely different’ on earlier DVCAM machines, which means that there is ‘no guarantee’ they will play back effectively on later models.

Recordings made on early DVCAM machines exhibit back tension problems and tracking issues. This increases the likelihood of DV dropout on playback because a loss of information was recorded onto the original tape. The IASA confirms that ‘misalignment of recording equipment leads to recording imperfections, which can take manifold form. While many of them are not or hardly correctable, some of them can objectively be detected and compensated for.’

One possible solution to this problem, as with DAT tapes, is to ‘misalign’ the replay digital video tape recorder to match the misaligned recordings. However ‘adjustment of magnetic digital replay equipment to match misaligned recordings requires high levels of engineering expertise and equipment’ (2009; 72), and must therefore not be ‘tried at home,’ so to speak.

Our experience with the Sweet Patootee tapes indicates that DVCAM tapes are a more fragile format than is commonly thought, particularly if your DVCAM collection was recorded on early machines. If you have a large collection of DVCAM tapes we strongly recommend that you begin to assess the contents and make plans to transfer them to digital files. As always, do get in touch if you need any advice to develop your plans for migration and preservation.

 

Reel-to-reel transfer of Anthony Rye, Selborne’s nature poet

November 25th, 2014

We have recently transferred a number of recordings of the poet, Anthony Rye, reading his work. The tapes were sent by his grandson Gabriel, who was kind enough to tell us a bit more about Anthony’s life and work.

‘Anthony Francis Rye is intimately associated with the Hampshire village of Selborne, a village made famous by Gilbert White and his book, Natural History of Selborne.

The Rye family has been here since the end of the 19th century and Anthony came to live here in the 1940s with his wife, in the house I now live in.

Among his books of poems are The Inn of the Birds (1947), Poems from Selborne (1961) and To A Modern Hero (1957). He was an illustrator and trained as an engraver and illustrated The Inn of the Birds himself, of which he said the poems “…were written to make people more alive to the spirit of bird-life and to the nature of birds generally. It was hoped to communicate something of the intense pleasure in birds felt by the author, and at the same time, by emphasizing their strange remote quality without destroying the sense of their being our fellow creatures…”

His poem ‘The Shadow on the Lyth’ from Poems from Selborne, invokes a dark moment in Selborne’s history when it was proposed by the council to put a much needed sewage works at the bottom of Church Meadow, thus ruining one of the most beautiful settings in Hampshire – one beloved of natural historian Gilbert White. Anthony Rye fought this and after a long struggle managed to have the works re-sited out of sight.’

Gilbert White’s life and work was a significant influence on Rye’s work and in 1970 he published the book Gilbert White and his Selborne.

Although the BBC has previously broadcast Rye’s poems, Gabriel tells us that these particular recordings have never been aired. Until now the recordings have been stored in Arthur’s house; migrating them to digital files is an exciting opportunity for family members, and hopefully wider audiences, to access Rye’s work.

 

Listen to Anthony Rye reading his poems, with thanks to Gabriel for granting permission

Recording technologies in history

Image: SONY brochure

Arthur Jolland, a nature photographer and friend of the poet, made the recordings on a SONY 800B, a portable reel-to-reel tape machine described by SONY as ‘compact, convenient and capable, a natural for both business and pleasure.’

The machine, which used a ‘ServoControl Motor; the same type of motor used in missile guidance control systems where critical timing accuracy is a must,’ is historically notorious for its use by US President Richard Nixon, who racked up 3,700-4,000 hours of recordings that would later implicate him during the Watergate Scandal.

Sahr Conway-Lanz explains that ‘room noise may constitute roughly one quarter of the total hours of recorded sound’ because tape machines recorded at the super slow speed of 15/16 of an inch per second ‘in order to maximize the recording time on each tape’ (547-549).

Decreasing the speed of a tape recording reduces the quality of its frequency response, resulting in more hiss and dropouts. If you listen to the recordings made by Nixon, it is pretty hard to discern what is being said without reference to the transcripts.
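
The trade-off is easy to put into numbers. In this sketch the 1,800 ft reel length is our assumption (a common 7-inch reel size); halving the speed doubles the recording time, which is exactly why the Nixon machines ran so slowly.

    # Back-of-envelope recording times for one pass of a 1,800 ft reel
    # (an assumed reel size) at standard tape speeds.
    reel_inches = 1800 * 12

    for speed in (15 / 16, 1.875, 3.75, 7.5):   # inches per second
        hours = reel_inches / speed / 3600
        print(f"{speed:6.3f} in/s: {hours:4.1f} hours per pass")
    # At 15/16 in/s a single pass lasts roughly 6.4 hours, at the cost
    # of the muffled, noisy sound described above.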

The transfer process

There were no big issues with the condition of the Anthony Rye tapes other than a small amount of loose binder shedding. This was easily solved by dry cleaning with pellon fabric prior to digitization.

Although in some cases playing back tapes on exactly the same machine they were recorded on is desirable (particularly so with DAT transfers), we migrated the recordings using our SONY APR 5003.

Using a technically superior model, one of the few large-format professional reel-to-reel machines SONY manufactured, mitigates the extent to which errors are added to the recording as part of the transfer process. Furthermore, the greater flexibility and control offered by the 5003 makes it easier to accurately replay tapes recorded on machines with lower specifications.

Another slight adjustment was attaching longer leader tape to the front and end of the tape. This is because the Sony APR 5003 has a much longer tape path than the 800B, and if this isn’t done material can be lost from the beginning and end of the recording.

***

The journeys we have been on above – from the natural history of a Hampshire village seen through the eyes of a largely unknown poet to the Watergate scandal – are another example of the diverse technical, cultural and historical worlds that are opened up by the ‘mysterious little reddish-brown ribbon’ and its playback mechanisms.

World Day for Audiovisual Heritage – digitisation and digital preservation policy and research

October 27th, 2014

Today, October 27, has been declared World Day for Audiovisual Heritage by UNESCO. We also blogged about it last year.

Since 2005, UNESCO has used the landmark date to highlight the importance of audiovisual archives to ‘our common heritage’, which contain ‘the primary records of the 20th and 21st centuries.’ Increasingly, however, the day is used to highlight how audio and moving image archives are particularly threatened by ‘neglect, natural decay to technological obsolescence, as well as deliberate destruction’.

Indeed, the theme for 2014 is ‘Archives at Risk: Much More to Do.’ The Swiss National Sound Archives have made this rather dramatic short film to promote awareness of the imminent threat to audiovisual formats, which is echoed by UNESCO’s insistence that ‘all of the world’s audiovisual heritage is endangered.’

As it is World Audiovisual Heritage Day, we thought it would be a good idea to take a look at some of the recent research and policy that has been collected and published relating to digitisation and digital preservation.

While the UNESCO anniversary is useful for raising awareness of the fragility of audiovisual mediums, what is the situation for organisations and institutions grappling with these challenges in practice?

Recently published research – NDSA

The first to consider are the preliminary results from a survey published by the US-based NDSA Standards and Practices Working Group; full details can be accessed here.

The survey asked a range of organisations, institutions and collections to rank issues that are critical for the preservation of video collections. Respondents ‘identified the top three stumbling blocks in preserving video as:

  • Getting funding and other resources to start preserving video (18%)
  • Supporting appropriate digital storage to accommodate large and complex video files (14%)
  • Locating trustworthy technical guidance on video file formats including standards and best practices (11%)’

Interestingly, in relation to the work we do at Great Bear, which often reveals the fragilities of digital recordings made on magnetic tape, ‘respondents report that analog/physical media is the most challenging type of video (73%) followed by born digital (42%) and digital on physical media (34%).’

It may well be that there is simply more video on analogue/physical media than on other mediums, which would account for the higher response, and that archives are yet to grapple with the archival problem of digital video stored on physical mediums such as DVD and, in particular, consumer-grade DVD-Rs. Full details will be published on The Signal, the Library of Congress’ digital preservation blog, in due course.

Recent research – Digital Preservation Coalition (DPC)

Another piece of preliminary research published recently was the user consultation for the 2nd edition of the Digital Preservation Coalition’s Digital Preservation Handbook. The first edition of the Handbook was published in 2000 but was regularly updated throughout the 00s. The consultation precedes what will be a fairly substantial overhaul of the resource.

Many respondents to the consultation welcomed that a new edition would be published, stating that much content is now ‘somewhat outdated’ given the rapid change that characterises digital preservation as a technological and professional field.

Survey respondents ranked storage and preservation (1), standards and best practices (2) and metadata and documentation (3) as the biggest challenges involved in digital preservation, and therefore converge with the NDSA findings. It must be stressed, however, that there wasn’t a massive difference across all the categories that included issues such as compression and encryption, access and creating digital materials.

Some of the responses ranged from the pragmatic…

‘digital preservation training etc tend to focus on technical solutions, tools and standards. The wider issues need to be stressed – the business case, the risks, significant properties’ (16)

‘increasingly archives are being approached by community archive groups looking for ways in which to create a digital archive. Some guidance on how archive services can respond effectively and the issues and challenges that must be considered in doing so would be very welcome’ (16)

…to the dramatic…

‘The Cloud is a lethal method of storing anything other than in Lo Res for Access, and the legality of Government access to items stored on The Cloud should make Curators very scared of it. Most digital curators have very little comprehension of the effect of solar flares on digital collections if they were hit by one. In the same way that presently part of the new method of “warfare” is economic hacking and attacks on financial institutions, the risks of cyber attacks on a country’s cultural heritage should be something of massive concern, as little could demoralise a population more rapidly. Large archives seem aware of this, but not many smaller ones that lack the skill to protect themselves’ (17)

…Others stressed legal issues related to rights management…

‘recording the rights to use digital content and ownership of digital content throughout its history/ life is critical. Because of the efforts to share bits of data and the ease of doing so (linked data, Europeana, commercial deals, the poaching of lines of code to be used in various tools/ services/ products etc.) this is increasingly important.’ (17)

It will be fascinating to see how the consultation responses are further contextualised and placed next to examples of best practice, case studies and innovative technological approaches within the fully revised 2nd edition of the Handbook.

European Parliament Policy on Film Heritage

Our final example relates to the European Parliament and Council Recommendation on Film Heritage. The Recommendation was first decreed in 2005. It invited Member States to offer progress reports every two years about the protection of and access to European film heritage. The 4th implementation report was published on 2 October 2014 and can be read in full here.

The language of the recommendation very much echoes the rationale laid out by UNESCO for establishing World Audiovisual Heritage Day, discussed above:

‘Cinematography is an art form contained on a fragile medium, which therefore requires positive action from the public authorities to ensure its preservation. Cinematographic works are an essential component of our cultural heritage and therefore merit full protection.’

Although the recommendation relates to preservation of cinematic works specifically, the implementation report offers wide ranging insight into the uneven ways ‘the digital revolution’ has affected different countries, at the level of film production/ consumption, archiving and preservation.

The report gravely states that ‘European film heritage risks missing the digital train,’ a phrase that warrants a bit more explanation. One way to understand it is that it describes how individual countries, but also Europe as a geo-political space, are currently failing to capitalise on what digital technologies can offer culturally, but also economically.

The report reveals that the theoretical promise of interoperable digital technologies (smooth trading, transmission and distribution across economic, technical and cultural borders) was hindered in practice by costly and complex copyright laws that make the cross-border availability of film heritage, re-use (or ‘mash-up’) and online access difficult to implement. This means that EU member states are not able to monetise their assets or share their cultural worth. The point is emphasised by the fact that ‘85% of Europe’s film heritage is estimated to be out-of-commerce, and therefore, invisible for the European citizen’ (37).

In an age of biting austerity, the report makes very clear that there simply aren’t enough funds to implement robust digitization and digital preservation plans: ‘Financial and human resources devoted to film heritage have generally remained at the same level or have been reduced. The economic situation has indeed pushed Member States to change their priorities’ (38).

There is also the issue of preserving analogue expertise: ‘many private analogue laboratories have closed down following the definitive switch of the industry to digital. This raises the question on how to maintain technology and know-how related to analogue film’ (13).

Infographic: public funding for new film production versus film heritage

The report gestures toward what is likely to be a splitting archival-headache-to-come for custodians of born digital films: ‘resources devoted to film heritage […] continue to represent a very small fraction of resources allocated to funding of new film productions by all Member States’ (38). Or, to put it in numerical terms, for every €97 invested by the public sector in the creation of new films, only €3 go to the preservation and digitisation of these films. Some countries, namely Greece and Ireland, are yet to make plans to collect contemporary digital cinema (see the infographic above).

Keeping up to date

It is extremely useful to have access to the research featured in this article. Consulting these different resources helps us to understand the nuts and bolts of technical practices, but also how different parts of the world are unevenly responding to digitisation. If the clock is ticking to preserve audiovisual heritage in the abrupt manner presented in the Swiss National Sound Archives film, the EU research in particular indicates that it may well already be too late to preserve a significant proportion of the audiovisual archives that we can currently listen to and watch.

As we have explored elsewhere on this blog, wanting to preserve everything is in many ways unrealistic; making clinical selection decisions is a necessary part of the archival process. The situation facing analogue audiovisual heritage is however both novel and unprecedented in archival history: the threat of catastrophic dropout in ten to fifteen years’ time looms large and ominous.

All that is left to say is: enjoy World Day for Audiovisual Heritage! Treasure whatever endangered media species flash past your eyes and ears. Be sure to consider any practical steps you can take to ensure the films and audio recordings that are important to you remain operable for many years to come.

Transferring Digital Audio Tapes (DATs) to digital audio files

October 9th, 2014

This post focuses on the problems that can arise with the transfer of Digital Audio Tapes (DATs).

An immature recording method (digital) on a mature recording format (magnetic tape), the digital audio recording revolution was never going to get it right first time (although DATs were not, of course, the first digital recordings made on tape).

Indeed, at a meeting of audio archivists held in 1995, there was a consensus even then that DAT was not, and would never be, a reliable archival medium. One participant stated: ‘we have tapes from 1949 that sound wonderful,’ and ‘we have tapes from 1989 that are shot to hell.’ And that was nearly twenty years ago! What chance do the tapes have now?

A little DAT history

Before we explore that, let’s have a little DAT history.

SONY introduced Digital Audio Tapes (DATs) in 1987. At roughly half the size of an analogue cassette tape, DAT can record at a higher, equal or lower sampling rate than a CD (48, 44.1 or 32 kHz respectively) at 16-bit quantization.
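
Those figures imply a raw audio data rate that is easy to check; the calculation below leaves aside DAT’s error correction and subcode overhead, so it is a sanity check rather than a specification.

    # Raw audio data rate implied by DAT's standard 48 kHz mode.
    sample_rate = 48_000   # Hz
    bit_depth = 16
    channels = 2

    bits_per_second = sample_rate * bit_depth * channels
    print(f"{bits_per_second / 1e6:.3f} Mbit/s")   # 1.536 Mbit/s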

Although popular in Japan, DATs were never widely adopted by the consumer market because they were more expensive than their analogue counterparts. They were, however, embraced in professional recording contexts, and in particular for recording live sound.

It was recording industry paranoia, particularly in the US, that really sealed the fate of the format. With its threatening promise of perfect replication, DAT was subject to an unsuccessful lobbying campaign by the Recording Industry Association of America (RIAA), which saw DATs as the ultimate attack on copyright law and pressed to introduce the Digital Audio Recorder Copycode Act of 1987.

This law recommended that each DAT machine have a ‘copycode’ chip installed that could detect whether prerecorded copyrighted music was being replicated. The method employed a notch filter that would subtly distort the quality of the copied recording, thus sabotaging any acts of piracy tacitly enabled by the DAT medium. The law was not passed, and compromises were made, although the US Audio Home Recording Act of 1992 imposed taxes on DAT machines and blank media.

How did they do ‘dat?

Like video tape recorders, DAT machines use a rotating head and helical scan method to record data. The helical scan can, however, pose real problems for the preservation transfer of DAT tapes because it makes it difficult to splice the tape together if it becomes sticky and snaps during the tape wind. With analogue audiotape, which records information longitudinally, it is far more feasible to splice the tape together and continue the transfer without risking irrevocable information loss.

Another problem posed by the helical scan method is that such tapes are more vulnerable to tape pack and backing deformation, as the CLIR guide explains:

‘Tracks are recorded diagonally on a helical scan tape at small scan angles. When the dimensions of the backing change disproportionately, the track angle will change for a helical scan recording. The scan angle for the record/playback head is fixed. If the angle that the recorded tracks make to the edge of the tape do not correspond with the scan angle of the head, mistracking and information loss can occur.’
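
The geometry in that quote can be illustrated with a toy calculation. The track dimensions below are invented round numbers, not DAT’s real track geometry; the point is only how a tiny dimensional change in the backing shifts the recorded track angle away from the head’s fixed scan angle.

    # Toy illustration of track-angle change caused by backing shrinkage.
    import math

    track_length = 23.5   # mm along the tape (invented)
    track_height = 2.6    # mm across the tape (invented)

    recorded = math.degrees(math.atan2(track_height, track_length))
    # Suppose the backing has shrunk 0.3% longitudinally since recording.
    shrunk = math.degrees(math.atan2(track_height, track_length * 0.997))

    print(f"track angle at recording: {recorded:.4f} degrees")
    print(f"track angle after shrinkage: {shrunk:.4f} degrees")
    # The head's scan angle is fixed, so even this fractional mismatch,
    # accumulated along a track only microns wide, causes mistracking.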

When error correction can’t correct anymore

Image: ‘playback condition’ and ‘mute’ indicators on a SONY DAT machine

Most people will be familiar with the sound of digital audio dropouts even if they don’t know the science behind them. You will know them most probably as those horrible clicking noises produced when the error correction technology on CDs stops working. The clicks indicate that the ‘threshold of intelligibility’ for digital data has been breached and, as theorist Jonathan Sterne reminds us, ‘once their decay becomes palpable, the file is rendered entirely unreadable.’

Our SONY PCM 7030 professional DAT machine, pictured above, has a ‘playback condition’ light that flashes if an error is present. On sections of the tape where quality is really bad, the ‘mute’ light can flash to indicate that the error correction technology can’t fix the problem. In such cases drop outs are very audible. Most DAT machines did not have such a facility, however, and you only knew there was a problem when you heard the glitchy-clickety-crackle during playback when, of course, it was too late to do anything about it.

The bad news for people with large, yet to be migrated DAT archives is that the format is ‘particularly susceptible to dropout. Digital audio dropout is caused by a non-uniform magnetic surface, or a malfunctioning tape deck. However, because the magnetically recorded information is in binary code, it results in a momentary loss of data and can produce a loud transient click or worse, muted audio, if the error correction scheme in the playback equipment cannot correct the error,’ the wonderfully informative A/V Artifact Atlas explains.

Given the high density nature of digital recordings on narrow magnetic tape, even the smallest speck of dust can cause digital audio dropouts. Such errors can be very difficult to eliminate. Cleaning playback heads and re-transferring is an option, but if the dropout was recorded at the source or the surface of tape is damaged, then the only way to treat irregularities is through applying audio restoration technologies, which may present a problem if you are concerned with maintaining the authenticity of the original recording.

Listen to this example of what a faulty DAT sounds like

Playback problems and mouldy DATs

mould growth on dat surface 01 Transferring Digital Audio Tapes (DATs) to digital audio files

Mould growth on the surface of DAT tape

A big problem with DAT transfers is actually being able to play back the tapes, or what is known in the business as ‘DAT compatibility.’ In an ideal world, to get the most perfect transfer you would play back a tape on the same machine that it was originally recorded on. The chances of doing this are of course pretty slim. While you can play your average audio cassette tape on pretty much any tape machine, the same cannot be said for DAT tapes. Often recordings were made on misaligned machines. The only solution for playback is, Richard Hess suggests, to mis-adjust a working machine to match the alignment of the recording on the tape.

As with any archival collection, if it is not stored in appropriate conditions then mould growth can develop. As mentioned above, DAT tapes are roughly half the size of the common audiocassette and the tape is thin and narrow. This makes them difficult to clean because they are mechanically fragile. Adapting a machine specifically for the purposes of cleaning, as we have done with our Studer machine, would be the most ideal solution. There is, however, not a massive amount of research and information about restoring mouldy DATs available online even though we are seeing more and more DAT tapes exhibiting this problem.

As with much of the work we do, the recommendation is to migrate your collections to digital files as soon as possible. But often it is a matter of priorities and budgets. From a technical point of view, DATs are a particularly vulnerable format. Machine obsolescence means that compared to their analogue counterparts, professional DAT machines will be increasingly hard to service in the long term. As detailed above, glitchy dropouts are almost inevitable given the sensitivity and all or nothing quality of digital data recorded on magnetic tape.

It seems fair to say that despite being meant to supersede analogue formats, DATs are far more likely to drop out of recorded sound history in a clinical and abrupt manner.

They therefore should be a high priority when decisions are made about which formats in your collection should be migrated to digital files immediately, over and above those that can wait just a little bit longer.

Phyllis Tate’s Nocturn for Four Voices 3″ 1/4 inch reel to reel tape transfer

September 19th, 2014

We have recently transferred a previously unpublished 3” ¼ inch tape recording of British 20th century composer Phyllis Tate’s Nocturn for Four Voices. The tape is a 2-track stereo recording made at 7.5 inches per second (in/s) at the Purcell Room in London’s Southbank Centre in 1975, and was broadcast on 16 September 1976.

When migrating magnetic tape recordings to digital files there are several factors that can be considered to assess the quality of recording even before we play back the tape. One of these is the speed at which the tape was originally recorded.

Diagram: BASF track width comparison

Generally speaking, the faster the speed the better the reproduction quality when making the digital transfer. This is because higher tape speeds spread the recorded signal longitudinally over more tape area, therefore reducing the effects of dropouts and tape noise. The number of tracks recorded on the tape also has an impact on how good it sounds today. Simply put, the more information stored on the tape due to recording speed or track width, the better the transfer will sound.
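
The speed side of the trade-off can be put in simple numbers: the recorded wavelength of a tone is just tape speed divided by frequency, so the faster the tape, the more tape each cycle occupies and the less damage a flaw of a given physical size can do. The speeds and tones below are arbitrary examples.

    # Physical length of tape occupied by one cycle of a recorded tone.
    for speed in (3.75, 7.5, 15.0):        # inches per second
        for freq in (1_000, 10_000):       # Hz
            wavelength = speed / freq      # inches of tape per cycle
            print(f"{speed:5} in/s, {freq:6} Hz: {wavelength:.5f} in/cycle")
    # Doubling the speed doubles the wavelength on tape, so a dropout
    # of a given size wipes out proportionally less of the signal.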

The tape of Nocturn for Four Voices was, however, suffering from binder hydrolysis and therefore needed to be baked prior to playback. EMI tape doesn’t normally suffer from this problem, but as the tape was EMI professional it may well have used Ampex stock and/or have been back coated, thus making the binder more susceptible to such problems.

Remembering Phyllis Tate

Nocturn for Four Voices is an example of how Tate ‘composed for unusual combinations of instruments and voice.’ The composition includes ‘Bass Clarinet, Celeste, String Quartet and Double Bass’, music scholar Jane Ballantyne explains.

The tape was brought into us by Tate’s daughter, Celia Frank, who is currently putting the finishing touches to a web archive that, she hopes, will help contemporary audiences (re)discover her mother’s work.

Like many women musicians and artists, Phyllis Tate, who trained at the Royal Academy of Music, remains fairly obscure to the popular cultural ear.

This is not to say, of course, that her work did not receive critical acclaim from her contemporaries or posthumously. Indeed, it is fair to say that she had a very successful composing career. Both the BBC and the Royal Academy of Music, among others, commissioned compositions from Tate, and her work is available to hire or buy from esteemed music publishers Oxford University Press (OUP).

Edmund Whitehouse, who wrote a short biography of the composer, described her as ‘one of the outstanding British composers of her generation, she was truly her own person whose independent creative qualities produced a wide range of music which defy categorisation.’

Her music often comprised contrasting emotional registers, lyrical sections and unexpected changes of direction. As a writer of operettas and choral music, with a penchant for setting poetry to music, her work is described by the OUP as the product of ‘an unusual imagination and an original approach to conventional musical forms or subjects, but never to the extent of being described as “avant-garde”.’

Tate’s music was very much a hit with iconic suffrage composer Ethel Smyth who, upon hearing Tate’s compositions, reputedly declared: ‘at last, I have heard a real woman composer.’ Such praise was downplayed by Tate, who tended to point to Smyth’s increased loss of hearing in later life as the cause of her enjoyment: ‘My Cello Concerto was performed soon afterwards at Bournemouth with Dame Ethel sitting in the front row banging her umbrella to what she thought was the rhythm of the music.’

While the dismissal of Smyth’s appreciation is tender and good humoured, the fact that Tate destroyed significant proportions of her work does suggest that at times she could have doubted her own abilities as a composer. Towards the end of her life she revealed: ‘I must admit to having a sneaking hope that some of my creations may prove to be better than they appear. One can only surmise and it’s not for the composer to judge. All I can vouch is this: writing music can be hell; torture in the extreme; but there’s one thing worse; and that is not writing it.’ As a woman composing in an overwhelmingly male environment, such hesitancies are perhaps an understandable expression of what literary scholars Gilbert and Gubar called ‘the anxiety of authorship.’

Tate’s work is a varied and untapped resource for those interested in twentieth century composition and the wider history of women composers. We wish Celia the best of luck in getting the website up and running, and hope that many more people will be introduced to her mother’s work as a consequence.

Thanks to Jane Ballantyne and Celia Frank for their help in writing this article.

Obsolete technologies and contemporary sound art

August 26th, 2014

At the recent Supernormal festival held at Braziers Park, Oxfordshire, a number of artists were using analogue technologies to explore concepts that dovetail nicely with the work we do at Great Bear collecting, servicing and repairing obsolete tape machines.

Hacker Farm, for example, keep ‘obsolete tech and discarded, post-consumerist debris’ alive using salvaged and hand-soldered DIY electronics. Their performance was a kind-of technological haunting, the sound made when older machines are turned on and re-purposed in different eras. Eerie, decayed, pointless and mournful, the conceptual impetus behind Hacker Farm raises many questions that emerge from the rather simple desire to keep old technologies working. Such actions soon become strange and aesthetically challenging in the contemporary technological context, which actively reproduces obsolescence in the endless search for the new, fostering continuous wastefulness at the centre of industrial production.

Music by the Metre

Another performance at the festival which engaged with analogue technologies was Graham Dunning’s Music by the Metre. The piece pays homage to Situationist Pinot-Gallizio‘s method of ‘Industrial Painting’ (1957-1959), in which the Italian artist created a 145 metre hand and spray painted canvas that was subsequently cut up and sold by the metre. The action, which attempted to destroy the perception of the art-object as sacrosanct and transform it into something which could be mass-quantified and sold, aimed to challenge ‘the mental disease of banalisation’ inherent to what Guy Debord termed ‘the society of the spectacle.’


In Dunning’s contemporary piece, he used spools of open reel tape to record the output of a series of automated machines comprising looping record players, synth drone, live environmental sound and tape loops. The tape was then cut by the artist into metre-long segments, placed in see-through plastic bags and ‘sold’ on the temporary market stall used to record and present the work.

Dunning’s work exists in interesting tension with the ideas of Pinot-Gallizio, largely because of the different technological and aesthetic contexts the artists are responding to.

Pinot-Gallizio’s industrial painting aimed to challenge the role of art within a consumer society by accelerating its commodity status (mass-produced, uniform, quantified, art as redundant, art as part of the wall paper). Within Dunning’s piece, such a process of acceleration is not so readily available, particularly given the deep obsolescence of consumer-grade open reel tape in 2014, and, furthermore, its looming archival obsolescence (often cited at ’10-20 years‘ by archivists).

Within the contemporary context, open reel analogue tapes have become ornate and aestheticised in themselves because they have lost their function as an everyday, recordable mass blank media. When media lose their operating context they are transformed into objects of fascination and desire, as Claire Bishop pithily states in her Art Forum essay, ‘The Digital Divide': ‘Today, no exhibition is complete without some form of bulky, obsolete technology—the gently clucking carousel of the slide-projector, or the whirring of an 8mm or 16mm film reel […] the sumptuous texture of indexical media is unquestionably seductive, but its desirability also arises from the impression that it is scarce, rare and precious.’

In reality, the impression of open reel analogue tape’s rarity is well justified, as manufacturers and distributors of magnetic tape are increasingly hard to find. Might there be something more complex and contradictory going on in Dunning’s homage to Pinot-Gallizio? Could we understand it as a neat inversion of the mass-metred-object? Doubly cut adrift from its historical (1950s-1970s) and technological operating context (the open reel tape recorder), the bag of tape is decelerated, existing as nothing other than art object. Stuffed messily in a plastic bag and displayed ready to be sold (if only by donation), the tape is both ugly and useless given its original and intended use. It is here that Dunning’s and Pinot-Gallizio’s work converge, situated at different historical and temporal poles from which a critique of the consumer society can be mounted: accelerated plenitude and decelerated exhaustion.


Analogue attachments

As a company that works with obsolete magnetic tape-based media, Great Bear has a vested interest in ensuring tapes and playback machines remain operational. Although our studio, with its stacks of long-forgotten machines, may look like a curious art installation to some, the tapes we migrate to digital files are not quite art objects…yet. Like Hacker Farm, we help to keep old media alive through careful processes of maintenance and repair.

From looking at how contemporary sound artists are engaging with analogue technologies, it is clear that the medium remains very much part of the message, as Marshall McLuhan would say, and that meaning becomes amplified, contorted or transformed depending on historical context, and media norms present within it.

Reports from the ‘bleeding edge’ – The Presto Centre’s AV Digitisation TechWatch Report #2

July 28th, 2014

The Presto Centre‘s AV Digitisation and Digital Preservation TechWatch Report, published July 2014, introduces readers to what they describe as the ‘bleeding edge’ of AV Digitisation and Archive technology.

Written in an engaging style, the report is well worth a read. If you don’t have time, however, here are some choice selections from the report which relate to the work we do at Great Bear, and some of the wider topics that have been discussed on the blog.

The first issue to raise, as ever, is continuing technological change. The good news is

‘there are no unexpected changes in file sizes or formats on the horizon, but it is fair to say that the inexorable increase in file size will continue unabated […] Higher image resolutions, bits per pixel and higher frame rates are becoming a fact of life, driving the need for file storage capacity, transfer bandwidth and processing speeds, but the necessary technology developments continue to track some form of Moore’s law, and there is no reason to believe that the technical needs will exceed technical capability, although inevitably there will be continuing technology updates needed by archives in order for them to manage new material.’

Having pointed out the inevitability of file expansion, however, other parts of the report clearly express the very real everyday challenges that ever increasing file sizes pose to the transmission of digital information across different locations:

‘transport of content was raised by one experienced archive workflow provider. They maintained that, especially with very high bit-rate content (such as 4k) it still takes too long to transfer files into storage over the network, and in reality there are some high-capacity content owners and producers shipping stacks of disks around the country in Transit vans, on the grounds that, in the right circumstances this can still be the highest bandwidth transfer mechanism, even though the Digital Production Partnership (DPP) are pressing for digital-only file transfer.’

While those hordes of Transit vans zipping up and down the motorway between different media providers are probably the exception rather than the rule, we should note that a similar point was raised by Per Platou when he talked about the construction of the Videokunstarkivet – the Norwegian video art archive. Due to the size of video files in particular, Per found that publishing them online really pushed server capabilities to the absolute maximum. This illustrates that there remains a discrepancy between the rate at which broadcast technologies develop and the economic, technological and ecological resources available to send and receive them.
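
The Transit van point is easy to verify with back-of-envelope arithmetic: the effective bandwidth of a vehicle is simply its payload divided by its journey time. The payload and journey figures below are invented but conservative.

    # "Sneakernet" bandwidth of a van full of disks, invented figures.
    van_payload_tb = 100    # a stack of disks (assumption)
    journey_hours = 4       # one city to another (assumption)

    van_bits = van_payload_tb * 8e12
    gbps = van_bits / (journey_hours * 3600) / 1e9
    print(f"van: {gbps:.0f} Gbit/s sustained")   # ~56 Gbit/s

    # Against a dedicated 10 Gbit/s line the van wins comfortably,
    # which is why very high bit-rate content still travels by road.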

Another interesting point about the move from physical to file-based media is the increased need for Quality-Control (QC) software tools that will be employed to ‘ensure that our digital assets are free from artefacts or errors introduced by encoders or failures of the playback equipment.’ Indeed, given that glitches born from slow or interrupted transfers may well be inevitable because of limited server capabilities, software developed by Bristol-based company Vidcheck will be very useful because it ‘allows for real-time repair of Luma, Chroma, Gamma and audio loudness issues that may be present in files. This is a great feature given that many of the traditional products on the market will detect problems but will not automatically repair them.’

Other main points worth mentioning from the report are the increasing move to open-source, software-only solutions for managing digital collections, and the rather optimistic tone directed toward ‘archives with specific needs who want to find a bespoke provider who can help design, supply and support a viable workflow option – so long as they avoid the large, proprietary ‘out-of-the-box’ solutions.’

If you are interested in reading further TechWatch reports you can download #1 here, and watch out for #3 that will be written after the International Broadcasting Convention (IBC) which is taking place in September, 2014.

 

Digital preservation, aesthetics and approaches

July 23rd, 2014

Digital Preservation 2014, the annual meeting of the National Digital Information Infrastructure and Preservation Program and the National Digital Stewardship Alliance is currently taking place in Washington, DC in the US.

The Library of Congress’s digital preservation blog The Signal is a regular reading stop for us, largely because it contains articles and interviews that impressively meld theory and practice, even if it does not exclusively cover issues relating to magnetic tape.

What is particularly interesting, and indeed is a feature of the keynotes for the Digital Preservation 2014 conference, is how the relationship between academic theory—especially relating to aesthetics and art—is an integral part of the conversation of how best to meet the challenge of digital preservation in the US. Keynote addresses from academics like Matthew Kirschenbaum (author of Mechanisms) and Shannon Mattern, sit alongside presentations from large memory institutions and those seeking ways to devise community approaches to digital stewardship.

The relationship between digital preservation and aesthetics is also a key concern of Richard Rhinehart and Jon Ippolito’s new book Re-Collection: Art, New Media and Social Memory, which has just been published by MIT Press.

This book, if at times deploying rather melodramatic language about the ‘extinction!’ and ‘death!’ of digital culture, gently introduces the reader to the wider field of digital preservation and its many challenges. Re-Collection deals mainly with born-digital archives, but many of the ideas are pertinent for thinking about how to manage digitised collections as well.

In particular, the authors’ recommendation that the digital archival object remain variable is striking: ‘the variable media approach encourages creators to define a work in medium-independent terms so that it can be translated into a new medium once its original format is obsolete’ (11). Emphasising the variability of the digital media object as a preservation strategy challenges the established wisdom of museums and other memory institutions, Rhinehart and Ippolito argue. The default position of preserving the art work in its ‘original’ form effectively freezes a once dynamic entity in time and space, potentially rendering the object inoperable because it denies works of art the potential to change when re-performed or re-interpreted. Their message is clear: be variable, adapt or die!

As migrators of tape-based collections, media variability is integral to what we do. Here we tacitly accept the inauthenticity of the digitised archival object, an artefact which has been allowed to change in order to ensure accessibility and cultural survival.

US/European differences?

While aesthetic and theoretical thinking is influencing how digital information management is practiced in the US, the European approach seems to be framed almost exclusively in economic and computational terms.

Consider, for example, the recent EU press release about the vision to develop Europe’s ‘knowledge economy‘. The plans to map and implement data standards, create cross-border coordination and an open data incubator are, it would seem, far more likely to ensure interoperable and standardised data sharing systems than any of the directives to preserve cultural heritage in the past fifteen years, a time period characterised by markedly unstable approaches, disruptive innovations and a conspicuous lack of standards (see also the E-Ark project).

It may be tempting these days to see the world as one gigantic, increasingly automated archival market, underpinned by the legal imperative to collect all kinds of personal data (see the ‘DRIP’ laws that were recently rushed through the UK parliament). Yet it is also important to remember the varied professional, social and cultural contexts in which data is produced and managed.

One session at DigiPres, for example, will explore the different archival needs of the cultural heritage sector:

‘Digital cultural heritage is dependent on some of the same systems, standards and tools used by the entire digital preservation community. Practitioners in the humanities, arts, and information and social sciences, however, are increasingly beginning to question common assumptions, wondering how the development of cultural heritage-specific standards and best practices would differ from those used in conjunction with other disciplines […] Most would agree that preserving the bits alone is not enough, and that a concerted, continual effort is necessary to steward these materials over the long term.’

Of course approaches to digital preservation and data management in the US are largely overdetermined by economic directives, and European policies do still speak to the needs of cultural heritage institutions and other public organisations.

What is interesting, however, is the minimal transnational cross pollination at events such as DigiPres, despite the globally networked condition we all share. This suggests there are subtle divergences between approaches to digital information management now, and how it will be managed in coming years across these (very large) geopolitical locations. Aesthetics or no aesthetics, the market remains imperative. Despite the turn toward open archives and re-usable data, competition is at the heart of the system and is likely to win out above all else.

D1, D2 & D3 – histories of digital video tape

July 14th, 2014

Image: size comparison of a D1 cassette and a MiniDV tape

The images in this article are of the first digital video tape formats, the D1, D2 and D3. The tendency to continually downsize audiovisual technology is clearly apparent: the gargantuan shell of the D1 gradually shrinks to the D3, which resembles the size of a domestic VHS tape.

Behind every tape (and every tape format) lie interesting stories, and the technological wizardry and international diplomacy that helped shape the roots of our digital audio visual world are worth looking into.

In 1976, when the green shoots of digital audio technology were emerging at industry level, the question of whether Video Tape Recorders (VTRs) could be digitised began to be explored in earnest by R & D departments at SONY, Ampex and Bosch G.m.b.H. There was considerable scepticism among researchers about whether digital video tape technology could be developed at all, because of the wide bandwidth required to transmit a digital image.

In 1977 however, as reported on the SONY website, Yoshitaka Hashimoto and team began to intensely research digital VTRs and ‘in just a year and a half, a digital image was played back on a VTR.’

Several years of product development followed, shaped, in part, by competing regional preferences. As Jim Slater argues in Modern Television Systems (1991): ‘much of the initial work towards digital standardisation was concerned with trying to find ways of coping with the three very different colour subcarrier frequencies used in NTSC, SECAM and PAL systems, and a lot of time and effort was spent on this’ (114).

Establishing a standard sampling frequency did of course have real financial consequences; it could not be randomly plucked out of the air: the higher the sampling frequency, the greater the overall bit rate; the greater the overall bit rate, the more need for storage space in digital equipment. In 1982, after several years of negotiations, a 13.5 MHz sampling frequency was agreed. European, North American, ‘Japanese, the Russians, and various other broadcasting organisations supported the proposals, and the various parameters were adopted as a world standard, Recommendation 601 [a.k.a. 4:2:2 DTV] standard of the CCIR [Consultative Committee for International Radio, now International Telecommunication Union]’ (Slater, 116).

The 4:2:2 DTV standard was an international standard that would form the basis of the (almost) exclusively digital media environment we live in today. It was ‘developed in a remarkably short time, considering its pioneering scope, as the worldwide television community recognized the urgent need for a solid basis for the development of an all-digital television production system’, write Stanley Baron and David Wood.

Once agreed upon, product development could proceed. The first digital video tape format, the D1, was introduced to the market in 1986. It recorded uncompressed component video and used enormous bandwidth for its time: 173 Mbit/sec, with a maximum recording time of 94 minutes.
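
It is worth pausing on where such figures come from. Rec. 601 samples luma at 13.5 MHz and each of the two colour-difference signals at 6.75 MHz, at 8 bits per sample; the arithmetic below gives the gross 216 Mbit/s rate, and recording only the active picture area brings the stored rate down towards the 173 Mbit/sec quoted for D1 (the exact figure depends on what overhead is counted).

    # Gross data rate of Rec. 601 4:2:2 sampling at 8 bits per sample.
    luma = 13.5e6 * 8          # bits per second
    chroma = 2 * 6.75e6 * 8    # two colour-difference signals

    print(f"gross 4:2:2 rate: {(luma + chroma) / 1e6:.0f} Mbit/s")  # 216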

As Slater writes

‘unfortunately these machines are very complex, difficult to manufacture, and therefore very expensive […] they also suffer from the disadvantage that being component machines, requiring luminance and colour-difference signals at input and output, they are difficult to install in a standard studio which has been built to deal with composite PAL signals. Indeed, to make full use of the D1 format the whole studio distribution system must be replaced, at considerable expense’ (125).

Being forced to effectively re-wire whole studios, and the considerable risk involved in doing this because of continual technological change, strikes a chord with the challenges UK broadcast companies face as they finally become ‘tapeless’ in October 2014 as part of the Digital Production Partnership’s AS-11 policy.

Sequels and product development

As the story so often goes, D1 would soon be followed by D2. Those that did make the transition to D1 were probably kicking themselves, and you can only speculate about the number of back injuries sustained getting the machines into the studio (from experience we can tell you they are huge and very heavy!)

It was fairly inevitable that a sequel would be developed, because even though the D1 provided uncompromising image quality, it was most certainly an unwieldy format, apparent from its gigantic size and component wiring. In response a composite digital video format – the D2 – was developed by Ampex and introduced in 1988.

In this 1988 promotional video, you can see the D2 in action. Amazingly for our eyes and ears today, the D2 is presented as the ideal archival format. Amazing for its physical size (hardly inconspicuous on the storage shelf!) but also because it used composite video signal technology. Composite signals combine on one wire all the component parts which make up a video signal: chrominance (colour, or Red, Green, Blue – RGB) and luminance (the brightness or black and white information, including grayscale).
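To make the component/composite distinction concrete: in a component system the luminance signal is a standard weighted sum of the red, green and blue channels, carried separately from the colour-difference signals; a composite system then folds the chroma onto the same wire as that luma. A minimal sketch of the luma calculation using the Rec. 601 weights (the weights are standard; the function itself is just ours for illustration):

```python
def luma_601(r, g, b):
    """Rec. 601 luminance from gamma-corrected RGB values in the range 0.0-1.0."""
    return 0.299 * r + 0.587 * g + 0.114 * b

print(luma_601(1.0, 1.0, 1.0))  # 1.0   -- white carries full brightness
print(luma_601(0.0, 0.0, 1.0))  # 0.114 -- blue contributes least to luma
```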

While the composite video signal used lower bandwidth and was more compatible with existing analogue systems used in the broadcast industry of the time, its value as an archival format is questionable. A comparable process for the storage we use today would be to add compression to a file in order to save file space and create access copies. While this is useful in the short term it does risk compromising file authenticity and quality in the long term. The Ampex video is fun to watch however, and you get a real sense of how big the tapes were and the practical impact this would have had on the amount of time it took to produce TV programmes.

Enter the D3

Following the D2 came the D3, which is the final video tape covered in this article (although there were of course also the D5 and D9).

The D3 was introduced by Panasonic in 1991 in order to compete with Ampex’s D2. It has the same sampling rate as the D2 with the main difference being the smaller shell size.

The D3’s biggest claim to fame was that it was the archival digital video tape of choice for the BBC, who migrated their analogue video tape collections to the format in the early 1990s. One can only speculate that the decision to take the archival plunge with the D3 was a calculated risk: it appeared to be a stable-ish technology (it wasn’t a first generation technology and the difference between D2 and D3 is negligible).

The extent of the D3 archive is documented in a white paper published in 2008, D3 Preservation File Format, written by Philip de Nier and Phil Tudor: ‘the BBC Archive has around 315,000 D3 tapes in the archive, which hold around 362,000 programme items. The D3 tape format has become obsolete and in 2007 the D3 Preservation Project was started with the goal to transfer the material from the D3 tapes onto file-based storage.’

Tom Heritage, reporting on the development of the D3 preservation project in 2013/2014, reveals that ‘so far, around 100,000 D3 and 125,000 DigiBeta videotapes have been ingested representing about 15 Petabytes of content (single copy).’

It has, then, taken six years to migrate less than a third of the BBC’s D3 archive. Given that D3 machines are now obsolete, it is more than questionable whether there are enough D3 head hours left in existence to read all the information back clearly and to an archive standard. The archival headache is compounded by the fact that ‘with a large proportion of the content held on LTO3 data tape [first introduced 2004, now on LTO-6], action will soon be required to migrate this to a new storage technology before these tapes become difficult to read.’ With the much publicised collapse of the BBC’s Digital Media Initiative (DMI) in 2013, you’d have to have a very strong disposition to work in the BBC’s audio visual archive department.
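Some rough arithmetic of our own, using the published native (uncompressed) capacities of the LTO generations involved, shows the scale of that looming migration:

```python
# Published native (uncompressed) capacities per cartridge, in GB.
LTO_NATIVE_GB = {"LTO-3": 400, "LTO-4": 800, "LTO-5": 1500, "LTO-6": 2500}

archive_gb = 15e6  # ~15 PB ingested so far (single copy)
for generation, capacity in LTO_NATIVE_GB.items():
    print(f"{generation}: ~{archive_gb / capacity:,.0f} cartridges")
# LTO-3: ~37,500 cartridges ... LTO-6: ~6,000 cartridges
```

Even before considering drive obsolescence, moving from LTO-3 to LTO-6 would shrink the physical archive by a factor of six or so.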

The roots of the audio visual digital world

The development of digital video tape, and the international standards which accompanied its evolution, is an interesting place to start understanding our current media environment. These histories are also a great place to begin examining the problems of digital archiving, particularly when file migration has become embedded within organisational data management policy, and data collections are growing exponentially.

While the D1 may look like an alien-techno species from a distant land compared with the modest, immaterial file lists neatly stored on hard drives that we are accustomed to, the two are related through the 4:2:2 sampling standard which revolutionised high-end digital video production and continues to shape our mediated perceptions.

Videokunstarkivet – Norway’s Digital Video Art Archive

July 7th, 2014

We have recently digitised a U-matic video tape of eclectic Norwegian video art from the 1980s. The tape documents a performance by Kjartan Slettemark, an influential Norwegian/ Swedish artist who died in 2008. The tape is the ‘final mix’ of a video performance entitled Chromakey Identity Blue in which Slettemark live mixed several video sources onto one tape.

The theoretical and practical impossibility of documenting live performance has been hotly debated in recent times by performance theorists, and there is some truth to such claims when we consider the encounter with Slettemark’s work in the Great Bear studio. The recording is only one aspect of the overall performance which, arguably, was never meant to be a stand-alone piece. This was certainly reflected in our Daily Mail-esque reaction to the video when we played it back. ‘Eh? Is this art?! I don’t get it!’ was the resounding response.

Having access to the wider context of the performance is sometimes necessary if the intentions of the artist are to be appreciated. Thankfully, Slettemark’s website includes part-documentation of Chromakey Identity Blue, and we can see how the different video signals were played back on various screens, arranged on the stage in front of (what looks like) a live TV audience.

Upon seeing this documentation, the performance immediately evokes the wider context of 70s/ 80s video art, which used the medium to explore the relationship between the body, space, screen and, in Slettemark’s case, the audience. A key part of Chromakey Identity Blue is the interruption of the audience’s presence in the performance, realised when their images are screened across the face of the artist, whose wearing of a chroma key mask enables him to perform a ‘special effect’ which layers two images or video streams together.

What unfolds through Slettemark’s performance is at times humorous, suggestive and moving, largely because of the ways the faces of different people interact, perform or simply ignore their involvement in the spectacle. As Marina Abramovic‘s use of presence testifies, there can be something surprisingly raw and even confrontational about incorporating the face into relational art. As an ethical space, meeting with the ‘face’ of another became a key concept for twentieth century philosopher Emmanuel Levinas. The face locates, Bettina Bergo argues, ‘“being” as an indeterminate field’ in which ‘the Other as a face that addresses me […] The encounter with a face is inevitably personal.’

If an art work like Slettemark’s is moving then, it is because it stages moments where ‘faces’ reflect and interface across each other. Faces meet and become technically composed. Through the performance of personal-facial address in the artwork, it is possible to glimpse for a brief moment the social vulnerability and fragility such meetings engender. Brief because the seriousness is diffused in Chromakey Identity Blue by a kitsch use of a disco ball that the artist moves across the screen to symbolically change the performed image, conjuring the magical feel of new technologies and how they facilitate different ways of seeing, being and acting in the world.

Videokunstarkivet (The Norwegian Video Art Archive)


The tape of Slettemark was sent to us by Videokunstarkivet, an exciting archival project mapping all the works of video art that have been made in Norway since the mid-1960s. Funded by the Norwegian Arts Council, the project has built its digital archival infrastructure from the bottom up, and those working on it have learnt a good many things along the way. Per Platou, who is managing the project, was generous enough to share some of the insights for readers of our blog, and a selection of images from the archive’s interface.

There are several things to be considered when creating a digital archive ‘from scratch’. Often at the beginning of a large project it is possible to look around for examples of best practice within your field. This isn’t always the case for digital archives, particularly those working almost exclusively with video files, whose communities of practice are unsettled and whose established ways of working are few and far between. The fact that even in 2014, when digital technologies have been widely adopted throughout society, there is still no firm agreement on standard access and archival file formats for video indicates the peculiar challenges of this work.

Because of this, projects such as Videokunstarkivet face multiple challenges, with significant amounts of improvisation required in the construction of the project infrastructure. An important consideration is the degree of access users will have to the archive material. As Per explained, publicly re-publishing the archive material from the site in an always open access form is not a concern of the Videokunstarkivet, largely due to the significant administrative issues involved in gaining licensing and copyright permissions. ‘I didn’t even think there was a difference between collecting and communicating the work, yet after a while I saw there is no point in showing everything; it has to be filtered and communicated in a certain way.’


Instead, interested users will be given a research key or password which enables them to access the data and edit metadata where appropriate. If users want to re-publish or show the art in some form, contact details for the artist/ copyright holder are included as part of the entry. Although the Videokunstarkivet deals largely with video art, entries on individual artists include information about other archival collections where their material may be stored, in order to facilitate further research. Contemporary Norwegian video artists are also encouraged to deposit material in the database, ensuring that ongoing collecting practices are built in to the long-term project infrastructure.

Another big consideration in constructing an archive is what to collect. Per told me that video art in Norway really took off in the early 80s. Artists who incorporated video into their work weren’t necessarily specialists in the medium: ‘there just happened to be a video camera nearby so they decided to use it.’ Video was therefore often used alongside films, graphics, performance and text, making the starting point for the archive, according to Per, ‘a bit of a mess really.’ Nonetheless, Videokunstarkivet ‘approaches every artist like it was Edvard Munch,’ because it is very hard to know exactly what will be culturally valuable 10, 20 or even 100 years from now. While it may not be appropriate to ‘save everything!’ for larger archival projects, for a self-contained and focused archival project such as the Videokunstarkivet, an inclusive approach may well be perfectly possible.

Building software infrastructures

Another important aspect of the project is the technical considerations – the actual building of the back and front end of the software infrastructure that will be used to manage newly migrated digital assets.

It was very important that the Videokunstarkivet archive was constructed using Open Source software. This was necessary to ensure resilience in a rapidly changing technological context, and so the project could benefit from any improvements in the code as they are tested out by user communities.

The project uses an adapted version of the Digital Asset Management system ResourceSpace that was developed with LIMA, an organisation based in Holland that preserves, distributes and researches media art. Per explained that ‘since Resource Space was originally meant for photos and other “light” media files, we found it not so well suited for our actual tasks.’ Video files are of course far ‘heavier’ than image or even uncompressed audio files. This meant that there were some ‘pretty severe’ technical glitches in the process of establishing a database system that could effectively manage and play back large, uncompressed master and access copies. Through establishing the Videokunstarkivet archive they were ‘pushing the limits of what is technically possible in practice’, largely because internet servers are not built to handle large files, particularly not if those files are being transcoded back and forth across the file management system. In this respect, the project is very much ‘testing new ground’, creating an infrastructure capable of effectively managing, and enabling people to remotely access, large amounts of high-quality video data.

Access files will be available to stream using the open source encodings WebM (hi and lo) and x264 (hi and lo), ensuring that streaming conditions can be adapted to individual server capabilities. The system is also set up to manage large-scale file transcoding should there be a substantial change in file format preferences. These changes can occur without compromising the integrity of the uncompressed master file.
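For readers curious what generating such hi/lo access copies looks like in practice, here is a minimal sketch using FFmpeg from Python. The filenames and bitrates are our illustrative assumptions, not Videokunstarkivet’s actual pipeline settings:

```python
import subprocess

# Illustrative bitrate ladder -- an assumption, not the archive's real settings.
RENDITIONS = {"hi": "4M", "lo": "1M"}

def make_access_copies(master):
    """Transcode an uncompressed master into WebM and x264-encoded access copies."""
    for label, bitrate in RENDITIONS.items():
        # WebM: VP8 video with Vorbis audio.
        subprocess.run(["ffmpeg", "-i", master, "-c:v", "libvpx", "-b:v", bitrate,
                        "-c:a", "libvorbis", f"access_{label}.webm"], check=True)
        # H.264 via the x264 encoder, with AAC audio, in an MP4 container.
        subprocess.run(["ffmpeg", "-i", master, "-c:v", "libx264", "-b:v", bitrate,
                        "-c:a", "aac", f"access_{label}.mp4"], check=True)

make_access_copies("master.mov")  # hypothetical master filename
```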

The interface is built with Bootstrap, which has been adapted to create ‘a very advanced access-layer system’ that enables Videokunstarkivet to define user groups and access requirements. Per outlined these user groups and access levels as follows (a sketch of how such a permission model might look in code appears after the list):

‘- Admin: Access to everything (i.e.Videokunstarkivet team members)

– Research: Researchers/curators can see video works, and almost all the metadata (incl previews of the videos). They cannot download master files. They can edit metadata fields, however all their edits will be visible for other users (Wikipedia style). If a curator wants to SHOW a particular work, they’ll have to contact the artist or owner/gallery directly. If the artist agrees, they (or we) can generate a download link (or transcode a particular format) with a few clicks.

– Artist: Artists can up/download uncompressed master files freely, edit metadata and additional info (contact, cv, websites etc etc). They will be able to use the system to store digital master versions freely, and transcode files or previews to share with who they want. The ONLY catch is that they can never delete a master file – this is of course coming out of national archive needs.’
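As promised above, a toy sketch of how these three roles might be modelled as a permission set. The role and permission names are ours, inferred from Per’s description, not Videokunstarkivet’s actual code:

```python
from enum import Flag, auto

class Perm(Flag):
    VIEW_PREVIEW    = auto()  # watch video previews, read metadata
    EDIT_METADATA   = auto()  # Wikipedia-style, edits visible to other users
    UPLOAD_MASTER   = auto()
    DOWNLOAD_MASTER = auto()
    DELETE_MASTER   = auto()  # granted to no one: masters can never be deleted

ROLES = {
    "admin":    Perm.VIEW_PREVIEW | Perm.EDIT_METADATA | Perm.UPLOAD_MASTER | Perm.DOWNLOAD_MASTER,
    "research": Perm.VIEW_PREVIEW | Perm.EDIT_METADATA,
    "artist":   Perm.VIEW_PREVIEW | Perm.EDIT_METADATA | Perm.UPLOAD_MASTER | Perm.DOWNLOAD_MASTER,
}

def allowed(role, perm):
    return bool(ROLES[role] & perm)

assert not allowed("research", Perm.DOWNLOAD_MASTER)  # curators must ask the artist
assert not allowed("artist", Perm.DELETE_MASTER)      # the national-archive 'catch'
```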

Per approached us to help migrate the Kjartan Slettemark tape because of the thorough approach and conscientious methodology we apply to digitisation work. As a media archaeology enthusiast, Per stressed that it was desirable for both aesthetic and archival reasons that the materiality of U-matic video was visible in the transferred file. He didn’t want the tape, in other words, to be ‘cleaned up’ in any way. To migrate the tape to digital file we used our standardised transfer chain for U-matic tape. This includes using an appropriate time-based corrector contemporary to the U-matic era, and conversion of the dub signal using a dedicated external dub to Y/C converter circuit.

We are very happy to be working with projects such as the Videokunstarkivet. It has been a great opportunity to learn about the nuts and bolts design of cutting-edge digital video archives, as well as discover the work of Kjartan Slettemark, which is not well known in the UK. Massive thanks must go to Per for his generous sharing of time and knowledge in the process of writing this article. We wish the Videokunstarkivet every success and hope it will raise the profile of Norwegian video art across the world.

New additions in the Great Bear Studio – BBC-adapted Studer Open reel tape machine

June 24th, 2014

We recently acquired a new Studer open reel tape machine to add to our extensive collection of playback equipment.

This Studer is, however, different from the rest, because it originally belonged to BBC Bristol. It therefore bears the hallmarks of a machine specifically adapted for broadcast use.

The telltale signs can be found in customised features, such as control faders and switches. These enabled sound levels to be controlled remotely or manually.

The presence of peak programme meters (P.P.M.), buttons that made it easy to see recording speeds (7.5/ 15 inches per second), and switches between cues and channels were also specific to broadcast use.


Studer tape machines were favoured in professional contexts because of their ‘sturdy tape transport mechanism with integrated logic control, electronically controlled tape tension even during fast wind and braking phases, electronic sensing of tape motion and direction, electronic tape timing, electronic speed control, plug-in amplifier modules with separately pluggable equalization and level pre-sets plus electronic equalization changeover.’

Because of Studer’s emphasis on engineering quality, machines could be adapted according to the specific needs of a recording or broadcast project.  

In our digitisation work at Great Bear, we have also adapted a Studer machine to clean damaged or shedding tapes prior to transfer. The flexibility of the machine enables us to remove fixed guides so vulnerable tape can move safely through the transport. This preservation-based adaptation is testimony to the considered design of Studer open reel tape machines, even though it diverges from their intended use.

If you want to learn a bit more about the Equipment department at the BBC who would have been responsible for adapting machines, follow this link.


ADAPT, who are researching the history of television production, also have an excellent links section on their website, including one to the BBC’s Research and Development (R&D) archive which houses many key digitised publications relating to the adoption and use of magnetic tape in the broadcast industry.

The difference ten years makes: changes in magnetic tape recording and storage media

June 24th, 2014

Generational change for digital technologies is rapid and disruptive. ‘In the digital context the next generation may only be five to ten years away!’ Tom Gollins from the National Archives reminds us, and this seems like a fairly conservative estimate.

It can feel like the rate of change is continually accelerating, with new products appearing all the time. It is claimed, for example, that the phenomenon of ‘wearable tech chic’ is now upon us, with the announcement this week that Google Glass is available to buy for £1,000.

The impact of digital technologies has been felt throughout society, and this issue will be explored in a large immersive exhibition of art, design, film, music and videogames held at the Barbican July-Sept 2014. It is boldly and emphatically titled: Digital Revolution.

To bring such technological transformations back into focus with our work at Great Bear, consider this 2004 brochure that recently re-surfaced in our Studio. As an example of the rapid rate of technological change, you need look no further.

A mere ten years ago, you could choose between several brands of audio mini disc, ADAT, DAT, DTRS, Betacam SP, Digital Betacam, super VHS, VHS-C, 8mm and mini DV.

Storage media such as Zip disks, Jaz cartridges, Exabyte tapes and hard drives that could store between 36 and 500 GB of data were also available to purchase.

RMGI are currently the only manufacturer of professional open reel audio tape. In the 2004 catalogue, different brands of open reel analogue tape are listed at a third of 2014 retail prices, taking into account rates of inflation.

While some of the products included in the catalogue, namely CDs, DVDs and open reel tape, have maintained a degree of market resiliency due to practicality, utility or novelty, many have been swept aside in the march of technological progress that is both endemic and epidemic in the 21st century.


Digitisation: methodologies, processing and archival practices

June 16th, 2014

We work with a range of customers at Great Bear, digitising anything from personal collections to the content of institutional archives. Because of this, what customers need from a digitisation service can be very different.

A key issue relates to the question of how much we process the digital file, both as part of the transfer and in post-production. In other words, to what extent do we make alterations to the form of the recording when it becomes a digitised artifact? While this may seem like an innocuous problem, the question of whether or not to apply processing, and therefore radically transform the original recording, is a fraught and, for some people, ethical consideration.

There are times when applying processing technologies is desirable and appropriate. With the transfer of video tape, for example, we always use time-based correctors or frame synchronisers to reduce or eliminate errors during play back. Some better quality video tape machines, such as the U-matic BVU-950P, already have time-based correctors built in which makes external processing unnecessary. As the AV Artifact Atlas explains however, time-based correction errors are very common with video tape:

‘When a different VTR is used to playback the same signal, there can be slight mechanical and electronic differences that prevent the tape from being read in the same way it was written. Perhaps the motors driving the tape in a playback VTR move slightly slower than they did in the camera that recorded the tape, or maybe the head of the playback VTR rotates a fraction quicker than the video head in the machine that recorded the tape. These tiny changes in timing can dramatically affect stability in a video image.’

We also utilise built-in processes that are part of a machine’s circuitry, such as drop-out compensation and noise reduction. We use these, however, not in order to make the tape ‘look better.’ We do it rather as a standard calibration set-up, which is necessary for the successful playback of the tape in a manner appropriate to its original operating environment.

After all, video tape machines were designed to be interchangeable. It is likely such stabilising processing would have been regularly used to play back tapes in machines that were different to those they were recorded on. Time-based correction and frame synchronisation are therefore integral to the machine/ playback circuitry, and using such processing tools is central to how we successfully migrate tape-based collections to digital files.

Digital processing tools

Our visual environment has changed dramatically since the days when domestic video tape was first introduced, let alone since the heyday of VHS. The only certainty is that it will continue to change. Once it was acceptable for images to be a bit grainy and low resolution; now only the crisp clarity of a 4K Ultra HD image will do. There is perhaps the assumption that ‘clearer is better’, that being able to watch moving images in minute detail is a marker of progress. Yet should this principle be applied to the kinds of digitisation work we do at Great Bear? There are processors that can transform the questionable analogue image into a bright, high definition, colour-enriched digital copy. The Teranex processor, for example, ‘includes extremely high quality de-interlacing, up conversion, down conversion, SD and HD cross/standards conversion, automatic cadence detection and removal even with edited content, noise reduction, adjustable scaling and aspect ratio conversion.’ ‘Upgrading’ analogue images in this way does come with certain ethical risks.

Talking about ethics in conjunction with video or audio tape might seem a bit melodramatic, but it is at the point of intervention/ non-intervention where the needs of our customers diverge the most. This is not to say that people who do want to process their tapes are unethical – far from it! We understand that for some customers it may be preferable for such processing to occur, or to apply other editing techniques such as noise reduction or amplification, so that audio can be heard with greater clarity.

Instead we want to emphasise that our priority is getting the best out of the tape and our playback machines, rather than relying on the latest processing technology, which is itself at risk of obsolescence. After all, a heavily processed file will always require further processing at an unknown point in the future so that it can remain visually relevant to whatever format is commercially dominant at the time. Such transformations of the digital file, which are necessarily destructive and permanent, contribute to the further circulation of what Hito Steyerl calls ‘poor images‘, ‘a rag or a rip; an AVI or a JPEG…The poor image has been uploaded, downloaded, shared, reformatted, and reedited. It transforms quality into accessibility, exhibition value into cult value, films into clips, contemplation into distraction.’

Maintaining the integrity, and as far as possible the authenticity, of the original recordings is a core part of our methodology. In this way our approach corresponds with Jisc’s mantra of ‘reproduction not optimisation’, where they write:

‘Improving, altering or modifying media for optimisation may seem logical when presenting works to a public or maintaining perceived consistency. It should be remembered that following an often natural inclination to enhance what we perceive to be a poor level of quality is a subjective process prescribed by personal preference, technological trends and cultural influences. In many cases the intentions of a creator are likely to be unknown and this can cause difficulties in interpreting levels of quality. In these instances common sense alongside trepidation should prevail. On the one end of the spectrum unintelligible recordings may be of little use to anyone, whereas at the opposite end recordings from previous eras were not produced with modern standards of clarity in mind.’

It is important to bear in mind, however, that even if a file is subject to destructive editing there may come a time when the metadata created about the artefact can help to illuminate its context and provenance, and therefore help it maintain its authenticity. The debates regarding digital authenticity and archiving will of course shift as time passes and practices evolve.

In the meantime, we will continue to do what we are most skilled at: restoring, repairing and migrating magnetic tape to digital files in a manner that maintains both the integrity of the original operating environment and the recorded signal.

Future tape archaeology: speculations on the emulation of analogue environments

June 2nd, 2014

At the recent Keeping Tracks symposium held at the British Library, AV scoping analyst Adam Tovell stated that

‘there is consensus internationally that we as archivists have a 10-20 year window of opportunity in which to migrate the content of our physical sound collections to stable digital files. After the end of this 10-20 year window, general consensus is that the risks faced by physical media mean that migration will either become impossible or partial or just too expensive.’

This point of view certainly corresponds to our experience at Great Bear. As collectors of a range of domestic and professional video and audio tape playback machines, we are aware of the particular problems posed by machine obsolescence. Replacement parts can be hard to come by, and the engineering expertise needed to fix machines is becoming esoteric wisdom. Tape degradation is of course a problem too. Combined, these factors shorten the preservation horizon of magnetic tape-based media.

All may not be lost, however, if we take heart from a recent article which reported the development of an exciting technology that will enable memory institutions to recover recordings made over 125 years ago on mouldy wax cylinders or acid-leaching lacquer discs.

IRENE (Image, Reconstruct, Erase Noise, Etc.), developed by physicist Carl Haber at the Lawrence Berkeley National Laboratory, is a software programme that ‘photographs the grooves in fragile or decayed recordings, stitches the “sounds” together with software into an unblemished image file, and reconstructs the “untouchable” recording by converting the images into an audio file.’

The programme was developed by Haber after he heard a radio show discuss the Library of Congress’ audio collections that were so fragile they risked destruction if played back. Haber speculated that the insights gained from a project he was working on could be used to recover these audio recordings. ‘“We were measuring silicon, why couldn’t we measure the surface of a record? The grooves at every point and amplitude on a cylinder or disc could be mapped with our digital imaging suite, then converted to sound.”’
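The principle is simple even if the optics are not: measure the groove’s displacement at each point along its path, and that displacement series is the audio waveform. A toy sketch of the final conversion step, assuming a displacement trace has already been extracted from the images (the filename and sample rate are our assumptions, not IRENE’s actual formats):

```python
import numpy as np
from scipy.io import wavfile

# Hypothetical input: groove displacement sampled along the groove path,
# as might be extracted from high-resolution images of the disc surface.
displacement = np.load("groove_displacement.npy")

# Centre and normalise the trace, then quantise to 16-bit PCM.
trace = displacement - displacement.mean()
trace /= np.abs(trace).max()
audio = (trace * 32767).astype(np.int16)

wavfile.write("reconstructed.wav", 44100, audio)  # output sample rate assumed
```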

For those involved in the development of IRENE, there was a strong emphasis on the benefits of patience and placing trust in the inevitable restorative power of technology. ‘It’s ironic that as we put more time between us and the history we are exploring, technology allows us to learn more than if we had acted earlier.’

Can such a hands-off approach be applied to magnetic tape based media? Is the 10-20 year window of opportunity described by Tovell above unnecessarily short? After all, it is still possible to playback wax cylinder recordings from the early 20th century which seem to survive well over long periods of time, and magnetic tape is far more durable than is commonly perceived.

In a fascinating audio recording made for the Pitt Rivers Museum in Oxford, Nigel Bewley from the British Library describes how he migrated wax cylinder recordings that were made by Evans-Pritchard in 1928-1930 and Diamond Jenness in 1911-1912. Although Bewley describes his frustration with the preparation process, he reveals that once he had established the size of stylus and rotational speed of the cylinder player, the transfer was relatively straightforward.

You will note that, in contrast with the recovery work made possible by IRENE, the cylinder transfer was made using an appropriate playback mechanism, examples of which can be accessed on this amazing section of the British Library’s website (here you can also browse through images and information about disc cutters, magnetic recorders, radios, record players, CD players and accessories such as needle tins and headphones – a bit of a treasure trove for those inclined toward media archaeology).

Perhaps the development of the IRENE technology will mean that it will no longer be necessary to use such ‘authentic’ playback mechanisms to recover information stored on obsolete media. This brings us neatly to the question of emulation.

Emulation


If we assume that all the machines that playback magnetic tape become irrevocably obsolete in 10-20 years, what other potential extraction methods may be available? Is it possible that emulation techniques, commonly used in the preservation of born-digital environments, can be applied to recover the recorded information stored on magnetic tape?

In a recent interview Dirk Von Suchodoletz explains that:

‘Emulation is a concept in digital preservation to keep things, especially hardware architectures, as they were. As the hardware itself might not be preservable as a physical entity it could be very well preserved in its software reproduction. […] For memory institutions old digital artifacts become more easy to handle. They can be viewed, rendered and interacted-with in their original environments and do not need to be adapted to our modern ones, saving the risk of modifying some of the artifact’s significant properties in an unwanted way. Instead of trying to mass-migrate every object in the institution’s holdings objects are to be handled on access request only, significantly shifting the preservation efforts.’

For the sake of speculation, let us imagine we are future archaeologists and consider some of the issues that may arise when seeking to emulate the operating environments of analogue-based tape media.

To begin with, without a working transport mechanism which facilitates the transmission of information, the emulation of analogue environments will need to establish circuitry that can process the Radio Frequency (RF) signals recorded on magnetic tape. As Jonathan Sterne reflects, ‘if […] we say we have to preserve all aspects of the platform in order to get at the historicity of the media practice, that means archival practice will have to have a whole new engineering dimension to it.’

Yet with the emulation of analogue environments, engineering may have to be a practical consideration rather than an archival one. For example, some kind of transport mechanism would presumably have to be emulated through which the tape could be passed. It would be tricky to lay the tape out flat and take samples of information from its surface, as IRENE’s software does with grooved media, because of the sheer length of tape when unwound. Without an emulated transport mechanism, recovery would be time consuming and therefore costly, a point that Tovell intimates at the beginning of the article. Furthermore, added time and costs would necessitate even more complex selection and appraisal decisions on behalf of archivists managing inoperative magnetic tape-based collections. Questions about value will become fraught and most probably politically loaded. Even with an emulated transport mechanism, issues such as tape vulnerability and head clogs, which of course impact on current migration practices, would come into play.

Audio and video differences

On a technical level, emulation may be vastly more achievable for audio, where the signal is recorded using a longitudinal method and plays back via a relatively simple process. Audio tape is also far less proprietary than video tape. On the SONY APR-5003V machine we use in the Great Bear Studio, for example, it is possible to play back tapes of different sizes, speeds, brands and track formations via adjustments of the playback heads. Such versatility would of course need to be replicated in any emulation environment.

The technical circuitry for playing back video tape, however, poses significantly more problems. Alongside the helical scan method, which records images diagonally across the video tape in order to prevent the appearance of visible joints between the signal segments, there are several heads used to read the components of the video signal: the image (video), audio and control (synch) tracks.

Unlike audio, video tape circuitry is more proprietary and therefore far less interoperable. You can’t play a VHS tape on a U-matic machine, for example. Numerous mechanical infrastructures would therefore need to be devised which correspond with the relevant operating environments – one size fits all would (presumably) not be possible.

A generic emulated analogue video tape circuit may be created, but this would only capture part of the recorded signal (which, as we have explored elsewhere on the blog, may be all we can hope for in the transmission process). If such systems are to be developed it is surely imperative that action is taken now while hardware is operative and living knowledge can be drawn upon in order to construct emulated environments in the most accurate form possible.

While hope may rest in technology’s infinite capacity to take care of itself in the end, excavating information stored on magnetic tape presents far more significant challenges when compared with recordings on grooved media. There is far more to tape’s analogue (and digital) circuit than a needle oscillating against a grooved inscription on wax, lacquer or vinyl.

The latter part of this article has of course been purely speculative. It would be fascinating to learn about projects attempting to emulate the analogue environment in software – please let us know if you are involved in anything in the comments below.

Capitalising on the archival market: SONY’s 185 TB tape cartridge

May 20th, 2014

In Trevor Owen’s excellent blog post ‘What Do you Mean by Archive? Genres of Usage for Digital Preservers’, he outlines the different ways ‘archive’ is used to describe data sets and information management practices in contemporary society. While the article shows it is important to distinguish between tape archives, archives as records management, personal papers and computational archives, Owens does not include an archival ‘genre’ that will become increasingly significant in the years to come: the archival market.

The announcement in late April 2014 that SONY has developed a tape cartridge capable of storing 185 TB of data was greeted with much excitement throughout the tech world. The invention, developed with IBM, is ‘able to achieve the high storage capacity by utilising a “nano-grained magnetic layer” consisting of tiny nano-particles’ and boasts the world’s highest areal recording density of 148 Gb/in².
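Our own back-of-envelope check (none of this arithmetic is from the announcement itself) suggests those two headline figures hang together plausibly:

```python
# Does 185 TB at 148 Gb per square inch imply a sensible length of tape?
capacity_bits = 185e12 * 8   # 185 TB expressed in bits
areal_density = 148e9        # bits per square inch
tape_width_in = 0.5          # assuming standard half-inch data tape

area_sq_in = capacity_bits / areal_density        # ~10,000 square inches
length_m = area_sq_in / tape_width_in * 0.0254
print(f"implied tape length: ~{length_m:.0f} m")  # ~508 m, similar to existing data cartridges
```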

The news generated such surprise because it signalled the curious durability of magnetic tape in a world thought to have ‘gone tapeless‘. For companies who need to store large amounts of data, however, tape storage, usually in the form of Linear Tape Open cartridges, has remained an economically sound solution despite the availability of file-based alternatives. Imagine the amount of energy required to power up the zettabytes of data that exist in the world today. Whatever the benefits of random access, that would be a gargantuan electricity bill.

Indeed, tape cartridges are being used more and more to store large amounts of data. According to the Tape Storage Council industry group, tape capacity shipments grew by 13 percent in 2012 and were projected to grow by 26 percent in 2013. SONY’s announcement is therefore symptomatic of the growing archival market which has created demand for cost effective data storage solutions.

It is not just magnetic tape that is part of this expanding market. Sony, Panasonic and Fuji are developing optical ‘Archival Discs’ capable of storing 300 GB (available in summer 2015), with plans to develop 500 GB and 1 TB discs.

Why is there such a demand for data storage?

Couldn’t we just throw it all away?

The Tape Storage Council explain:

‘This demand is being driven by unrelenting data growth (that shows no sign of slowing down), tape’s favourable economics, and the prevalent data storage mindset of “save everything, forever,” emanating from regulatory, compliance or governance requirements, and the desire for data to be repurposed and monetized in the future.’

The radical possibilities of data-based profit-making abound in the ‘buzz’ that surrounds big data, an ambitious form of data analytics that has been embraced by academic research councils, security forces and multi-national companies alike.

Presented by proponents as the way to gain insights into consumer behaviour, big data apparently enables companies to unlock the potential of ‘data-driven decision making.’ For example, an article in Computer Weekly describes how Ebay is using big data analytics so they can better understand the ‘customer journey’ through their website.

Ebay’s initial forays into analysing big data were in fact relatively small: in 2002 the company kept around 1% of customer data and discarded the rest. In 2007 the company changed their policy, and worked with an established company to develop a custom data warehouse which can now run ad-hoc queries in just 32 seconds.

It is not just Ebay who are storing massive amounts of customer data. According to the BBC, ‘Facebook has begun installation of 10,000 Blu-ray discs in a prototype storage cabinet as back-ups for users’ photos and videos’. While for many years the internet was assumed to be a virtual, almost disembodied space, the desire from companies to monetise information assets means that the incidental archives created through years of internet searches have all this time been stored, backed up and analysed.

Amid all the excitement and promotion of big data, the lack of critical voices raising concern about social control, surveillance and ethics is surprising. Are people happy that the data we create is stored, analysed and re-sold, often without our knowledge or permission? What about civil liberties and democracy? What power do we have to resist this subjugation to the irrepressible will of the data-driven market?

These questions are pressing, and need to be widely discussed throughout society. Current predictions are that the archive market will keep growing and growing.

‘A recent report from the market intelligence firm IDC estimates that in 2009 stored information totalled 0.8 zettabytes, the equivalent of 800 billion gigabytes. IDC predicts that by 2020, 35 zettabytes of information will be stored globally. Much of that will be customer information. As the store of data grows, the analytics available to draw inferences from it will only become more sophisticated.’

The development of SONY’s 185 TB tape indicates the company is well placed to capitalise on these emerging markets.

The kinds of data stored on the tapes when they become available for professional markets (these tapes are not aimed at consumers) will really depend on the legal regulations placed on companies doing the data collecting. As the case of eBay discussed earlier makes clear, companies will collect all the information if they are allowed to. But should they be? As citizens in the internet society, how can we ensure we have a ‘right to be forgotten’? How are the shackles of data-driven control societies broken?

