Posts Tagged ‘obsolete’

Re-animating archives: Action Space’s V30H / V60H EIAJ 1/2″ video tapes

Wednesday, May 20th, 2015

One of the most interesting aspects of digitising magnetic tapes is what happens to them after they leave the Great Bear studio.

Often transfers are done for private or personal interest, such as listening to recordings of loved ones, or for straightforward archival reasons.

Yet in some cases material is re-used in a new creative project, thereby translating recordings within a different technical and historical context.

Walter Benjamin described such acts as the ‘afterlife’ of translation: ‘a translation issues from the original not so much for its life as from its afterlife […] translation marks their stage of continued life.’ [1]

A child stands on top of an inflatable structure, black and white image.

Stills from the Action Space tapes

So it was with a collection of ½ inch EIAJ Sony V30H and V60H video tapes, documenting the antics of Action Space, which recently landed in the Great Bear studio.

Part of the vanguard movement of radical arts organisations that emerged in the late 1960s, Action Space described themselves as ‘necessarily experimental, devious, ambiguous, and always changing in order to find a new situation. In the short term the objectives are to continually question and demonstrate through the actions of all kinds new relationships between artists and public, teachers and taught, drop-outs and society, performers and audiences, and to question current attitudes of the possibility of creativity for everyone.’ [2]

Such creative shape-shifting, which took impulsive artistic action into a range of public spaces, can so often be the enemy of documentation.

Yet Ken Turner, who founded Action Space alongside Mary Turner and Alan Nisbet, told me that ‘Super Eight film and transparency slides were our main documentation tools, so we were aware of recording events and their importance.’

Introduced in 1969, EIAJ 1/2″ was the first format to make video tape recording accessible to people outside the professional broadcast industry.

Action Space were part of this wave of audiovisual adoption (minor, of course, by today’s standards!).

After ‘accidentally’ inheriting a portapak recorder from the Marquis of Bath, Ken explained, Action Space ‘took the portapak in our stride into events and dramas of the community festivals and neighbourhood gatherings, and adventure playgrounds. We did not have an editing deck; as far as I can remember played back footage through a TV, but even then it had white noise, if that’s the term, probably it was dirty recording heads. We were not to know.’

Preservation issues

Yes, those dirty recording heads make things more difficult when it comes to re-formatting the material.

While some of the recordings replay almost perfectly, some have odd tracking problems and emit noise, which is evidence of a faulty recorder and/or a dirty tape path or heads. Because such imperfections were embedded at the time of recording, there is little that can be done to ‘clean up’ the signal.

Other problems with the Action Space collection arise from the chemical composition of the tapes. The recordings are mainly on Sony-branded V30H and high density V60H tape, which always suffers from binder hydrolysis. The tapes therefore needed ‘baking’ treatment prior to transfer, usually (we have found) in a more controlled and longer process than for Ampex-branded tapes.

And that old foe of magnetic tape strikes again: mould. Due to being stored in an inappropriate environment over a prolonged period, many of the tapes have mould growth that has damaged the binder.

Despite these imperfections, or even because of them, Ken appreciates the unique value of these recordings: ‘the footage I have now of the community use reminds me of the rawness of the events, the people and the atmosphere of noise and constant movement. I am extremely glad to have these tapes transposed into digital footage as they vividly remind me of earlier times. I think this is essential to understanding the history and past experiences that might otherwise escape the memories of events.’

People sliding down an inflatable structure, joyful expressions on their faces.

Historical translation

While the footage of Action Space is in itself a valuable historical document, the recordings will be subject to a further act of translation, courtesy of Ken’s filmmaker son, Huw Wahl.

Fresh from the success of his film about anarchist art critic and poet Herbert Read, Huw is using the digitised tapes as inspiration for a new work.

This new film will reflect on the legacies of Action Space, examining how the group’s interventions can speak to our current historical context.

Huw told me he wants to re-animate Action Space’s ethos of free play, education and art in order ‘to question what actions could shape a democratic and creative society. In terms of the rhetoric of creativity we hear now from the arts council and artistic institutions, it’s important to look at where that developed from. Once we see how radical those beginnings really were, maybe we will see more clearly where we are heading if we continue to look at creativity as a commodity, rather than a potent force for a different kind of society.’

Inflatable action

Part of such re-animation will entail re-visiting Action Space’s work with large inflatable structures, or what Ken prefers to call ‘air or pneumatic structures.’

Huw intends to make a new inflatable structure that will act as the container for a range of artistic, academic, musical and nostalgic responses to Action Space’s history. The finished film will then be screened inside the inflatable, creating what promises to be an unruly and unpredictable spectacle.

Ken spoke fondly about the video footage which recorded ‘the urgency of “performance” of the people who are responding to the inflatables. Today inflatable making and use is more controlled, in the 60s control was only minimally observed, to prevent injuries. But in all our activities over 10 years of air structure events, we had only one fractured limb.’

Young people sliding down the side of an inflatable structure – Action Space archive

Great Bear cameo!

Another great thing about the film is that the Great Bear Studio will have an important cameo role.

Huw came to visit us to shoot footage of the transfers. He explains his reasons:

‘I’d like viewers to see the set up for the capturing of the footage used in the film. Personally it’s very different seeing the reel played on a deck rather than scanning through a quicktime file. You pay a different kind of attention to it. I don’t want to be too nostalgic about a format I have never shot with, yet there seems to be an amateur quality inherent to the portapak which I assume is because the reels could be re-recorded over. Seeing material shot by children is something the super 8mm footage just doesn’t have, it would have been too expensive. Whereas watching children grabbing a portapack camera and running about with it is pretty exciting. Seeing the reels and machines for playing it all brings me closer to the experience of using the actual portapak cameras. Hopefully this will inform the filming and editing process of this film.’

We wish Huw the very best for his work on this project and look forward to seeing the results!

***Big thanks to Ken Turner and Huw Wahl for answering questions for this article.***

Notes

[1] Walter Benjamin, ‘The Task of the Translator,’ Selected Writings: 1913-1926, Volume 1, Harvard University Press, 2006, 253-264, 254.

[2] Action Space Annual Report, 1972, accessed at http://www.unfinishedhistories.com/history/companies/action-space/action-space-annual-report-extract/.

Videokunstarkivet – Norway’s Digital Video Art Archive

Monday, July 7th, 2014

We have recently digitised a U-matic video tape of eclectic Norwegian video art from the 1980s. The tape documents a performance by Kjartan Slettemark, an influential Norwegian/Swedish artist who died in 2008. The tape is the ‘final mix’ of a video performance entitled Chromakey Identity Blue in which Slettemark live mixed several video sources onto one tape.

The theoretical and practical impossibility of documenting live performance has been hotly debated in recent times by performance theorists, and there is some truth to those claims when we consider the encounter with Slettemark’s work in the Great Bear studio. The recording is only one aspect of the overall performance which, arguably, was never meant as a stand-alone piece. This was certainly reflected in our Daily Mail-esque reaction to the video when we played it back. ‘Eh? Is this art?! I don’t get it!’ was the resounding response.

Having access to the wider context of the performance is sometimes necessary if the intentions of the artist are to be appreciated. Thankfully, Slettemark’s website includes part-documentation of Chromakey Identity Blue, and we can see how the different video signals were played back on various screens, arranged on the stage in front of (what looks like) a live TV audience.

Upon seeing this documentation, the performance immediately evokes the wider context of 70s/80s video art, which used the medium to explore the relationship between the body, space, the screen and, in Slettemark’s case, the audience. A key part of Chromakey Identity Blue is the interruption of the audience’s presence in the performance, realised when their images are screened across the face of the artist, whose wearing of a chroma key mask enables him to perform a ‘special effect’ which layers two images or video streams together.
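In Slettemark’s day this layering was done live with dedicated video mixing hardware. As a rough digital illustration of the principle only (not Slettemark’s actual set-up), a keyer simply swaps in the second image wherever the first is close to the key colour. A minimal sketch using NumPy, with an arbitrarily chosen threshold:

```python
import numpy as np

def chroma_key(foreground: np.ndarray, background: np.ndarray,
               key=(0, 0, 255), threshold=100) -> np.ndarray:
    """Composite two images: wherever the foreground is close to the
    key colour (here pure blue), show the background instead."""
    # Distance of each foreground pixel from the key colour
    distance = np.linalg.norm(foreground.astype(float) - np.array(key), axis=-1)
    mask = distance < threshold  # True where the key colour dominates
    result = foreground.copy()
    result[mask] = background[mask]
    return result
```

Real keyers are considerably more subtle (soft mask edges, spill suppression), but the core idea is the same: a colour match decides, pixel by pixel, which of two video streams is shown.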

What unfolds through Slettemark’s performance is at times humorous, suggestive and moving, largely because of the ways the faces of different people interact, perform or simply ignore their involvement in the spectacle. As Marina Abramović’s use of presence testifies, there can be something surprisingly raw and even confrontational about incorporating the face into relational art. As an ethical space, meeting with the ‘face’ of another became a key concept for twentieth century philosopher Emmanuel Levinas. The face locates, Bettina Bergo argues, ‘“being” as an indeterminate field’ in which ‘the Other as a face that addresses me […] The encounter with a face is inevitably personal.’

If an art work like Slettemark’s is moving then, it is because it stages moments where ‘faces’ reflect and interface across each other. Faces meet and become technically composed. Through the performance of personal-facial address in the artwork, it is possible to glimpse for a brief moment the social vulnerability and fragility such meetings engender. Brief because the seriousness is diffused in Chromakey Identity Blue by the kitsch use of a disco ball that the artist moves across the screen to symbolically change the performed image, conjuring the magical feel of new technologies and how they facilitate different ways of seeing, being and acting in the world.

Videokunstarkivet (The Norwegian Video Art Archive)

VKA DAM Interface

The tape of Slettemark was sent to us by Videokunstarkivet, an exciting archival project mapping all the works of video art that have been made in Norway since the mid-1960s. Funded by the Norwegian Arts Council, the project has built the digital archival infrastructure from the bottom up, and those working on it have learnt a good many things along the way. Per Platou, who is managing the project, was generous enough to share some of these insights with readers of our blog, along with a selection of images from the archive’s interface.

There are several things to be considered when creating a digital archive ‘from scratch’. Often at the beginning of a large project it is possible to look around for examples of best practice within your field. This isn’t always the case for digital archives, particularly those working almost exclusively with video files, whose communities of practice are unsettled and whose established ways of working are few and far between. The fact that even in 2014, when digital technologies have been widely adopted throughout society, there is still not any firm agreement on standard access and archival file formats for video files indicates the peculiar challenges of this work.

Because of this, projects such as Videokunstarkivet face multiple challenges, with significant amounts of improvisation required in the construction of the project infrastructure. An important consideration is the degree of access users will have to the archive material. As Per explained, publicly re-publishing the archive material from the site in an always open access form is not a concern of the Videokunstarkivet, largely due to the significant administrative issues involved in gaining licensing and copyright permissions. ‘I didn’t even think there was a difference between collecting and communicating the work, yet after a while I saw there is no point in showing everything, it has to be filtered and communicated in a certain way.’

VKA DAM Interface

Instead, interested users will be given a research key or password which enables them to access the data and edit metadata where appropriate. If users want to re-publish or show the art in some form, contact details for the artist/copyright holder are included as part of the entry. Although the Videokunstarkivet deals largely with video art, entries on individual artists include information about other archival collections where their material may be stored, in order to facilitate further research. Contemporary Norwegian video artists are also encouraged to deposit material in the database, ensuring that ongoing collecting practices are built in to the long-term project infrastructure.

VKA DAM Interface

Another big consideration in constructing an archive is what to collect. Per told me that video art in Norway really took off in the early 80s. Artists who incorporated video into their work weren’t necessarily specialists in the medium, ‘there just happened to be a video camera nearby so they decided to use it.’ Video was therefore often used alongside films, graphics, performance and text, making the starting point for the archive, according to Per, ‘a bit of a mess really.’ Nonetheless, Videokunstarkivet ‘approaches every artist like it was Edvard Munch,’ because it is very hard to know now exactly what will be culturally valuable in 10, 20 or even 100 years from now. While it may not be appropriate to ‘save everything!’ for larger archival projects, for a self-contained and focused archival project such as the Videokunstarkivet, an inclusive approach may well be perfectly possible.

Building software infrastructures

Another important aspect of the project is the technical considerations: the actual building of the back and front end of the software infrastructure that will be used to manage newly migrated digital assets.

It was very important that the Videokunstarkivet archive was constructed using Open Source software. This was necessary to ensure resilience in a rapidly changing technological context, and so the project could benefit from any improvements in the code as they are tested by user communities.

The project uses an adapted version of the Digital Asset Management system Resource Space that was developed with LIMA, an organisation based in Holland that preserves, distributes and researches media art. Per explained that ‘since Resource Space was originally meant for photos and other “light” media files, we found it not so well suited for our actual tasks.’ Video files are of course far ‘heavier’ than image or even uncompressed audio files. This meant that there were some ‘pretty severe’ technical glitches in the process of establishing a database system that could effectively manage and play back large, uncompressed master and access copies. Through establishing the Videokunstarkivet archive they were ‘pushing the limits of what is technically possible in practice’, largely because internet servers are not built to handle large files, particularly not if those files are being transcoded back and forth across the file management system. In this respect, the project is very much ‘testing new ground’, creating an infrastructure capable of effectively managing, and enabling people to remotely access, large amounts of high-quality video data.

VKA DAM Interface

Access files will be available to stream using open source-encoded WebM files (high and low bitrate) and x264 (high and low bitrate), ensuring that streaming conditions can be adapted to individual server capabilities. The system is also set up to manage large-scale file transcoding should there be a substantial change in file format preferences. These changes can occur without compromising the integrity of the uncompressed master file.
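As a rough illustration of what fanning a master out into hi/lo access renditions involves (the bitrates, filenames and codec settings below are invented for the example, not Videokunstarkivet’s actual configuration), each uncompressed master typically drives a batch of encoder jobs, for instance via a tool like FFmpeg:

```python
def access_copy_commands(master: str):
    """Build ffmpeg command lines for four access renditions of one
    uncompressed master: WebM (VP8) and H.264, each at a high and a
    low bitrate. All settings here are illustrative only."""
    renditions = [
        ("webm", "libvpx",  "hi", "4M"),
        ("webm", "libvpx",  "lo", "1M"),
        ("mp4",  "libx264", "hi", "4M"),
        ("mp4",  "libx264", "lo", "1M"),
    ]
    commands = []
    for ext, codec, label, bitrate in renditions:
        out = f"{master.rsplit('.', 1)[0]}_{label}.{ext}"
        commands.append(["ffmpeg", "-i", master,
                         "-c:v", codec, "-b:v", bitrate, out])
    return commands
```

Keeping the renditions derivable from the master like this is what makes wholesale re-transcoding feasible later: if format preferences change, the access copies are simply regenerated and the untouched master remains the file of record.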

The interface is built with Bootstrap which has been adapted to create ‘a very advanced access-layer system’ that enables Videokunstarkivet to define user groups and access requirements. Per outlined these user groups and access levels as follows:

‘- Admin: Access to everything (i.e. Videokunstarkivet team members)

– Research: Researchers/curators can see video works, and almost all the metadata (incl previews of the videos). They cannot download master files. They can edit metadata fields, however all their edits will be visible for other users (Wikipedia style). If a curator wants to SHOW a particular work, they’ll have to contact the artist or owner/gallery directly. If the artist agrees, they (or we) can generate a download link (or transcode a particular format) with a few clicks.

– Artist: Artists can up/download uncompressed master files freely, edit metadata and additional info (contact, cv, websites etc etc). They will be able to use the system to store digital master versions freely, and transcode files or previews to share with who they want. The ONLY catch is that they can never delete a master file – this is of course coming out of national archive needs.’

Følstad overview
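The three-tier model Per describes could be sketched as a simple role-to-permission mapping; the permission names below are illustrative shorthand, not the archive’s actual API:

```python
# Permission sets mirroring the three user groups Per describes.
ROLE_PERMISSIONS = {
    "admin":    {"view", "edit_metadata", "download_master",
                 "upload_master", "delete_master", "generate_link"},
    "research": {"view", "edit_metadata"},           # edits visible to all
    "artist":   {"view", "edit_metadata", "download_master",
                 "upload_master", "generate_link"},  # but never delete
}

def can(role: str, action: str) -> bool:
    """True if the given user group is allowed the given action."""
    return action in ROLE_PERMISSIONS.get(role, set())
```

The notable design choice is the asymmetry in the artist role: full freedom to deposit, transcode and share, but no way to delete a master, which is exactly the guarantee a national archive needs from self-depositing contributors.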

Per approached us to help migrate the Kjartan Slettemark tape because of the thorough approach and conscientious methodology we apply to digitisation work. As a media archaeology enthusiast, Per stressed that it was desirable for both aesthetic and archival reasons that the materiality of U-matic video was visible in the transferred file. He didn’t want the tape, in other words, to be ‘cleaned up’ in any way. To migrate the tape to digital file we used our standardised transfer chain for U-matic tape. This includes using an appropriate time-base corrector contemporary to the U-matic era, and conversion of the dub signal using a dedicated external dub-to-Y/C converter circuit.

We are very happy to be working with projects such as the Videokunstarkivet. It has been a great opportunity to learn about the nuts and bolts design of cutting-edge digital video archives, as well as discover the work of Kjartan Slettemark, whose work is not well-known in the UK. Massive thanks must go to Per for his generous sharing of time and knowledge in the process of writing this article. We wish the Videokunstarkivet every success and hope it will raise the profile of Norwegian video art across the world.

Big Data, Long Term Digital Information Management Strategies & the Future of (Cartridge) Tape

Monday, November 18th, 2013

What is the most effective way to store and manage digital data in the long term? This is a question we have given considerable attention to on this blog. We have covered issues such as analogue obsolescence, digital sustainability and digital preservation policies. It seems that as a question it remains unanswered and up for serious debate.

We were inspired to write about this issue once again after reading an article that was published in the New Scientist a year ago called ‘Cassette tapes are the future of big data storage.’ The title is a little misleading, because the tape it refers to is not the domestic audio tape that has recently acquired much counter-cultural kudos, but rather archival tape cartridges that can store up to 100 TB of data. How much?! I hear you cry! And why tape, given the ubiquity of digital technology these days? Aren’t we all supposed to be ‘going tapeless’?

The reason for such an invention, the New Scientist reveals, is the ‘Square Kilometre Array (SKA), the world’s largest radio telescope, whose thousands of antennas will be strewn across the southern hemisphere. Once it’s up and running in 2024, the SKA is expected to pump out 1 petabyte (1 million gigabytes) of compressed data per day.’

SKA_dishes

Image of the SKA dishes

Researchers at Fuji and IBM have already designed a tape that can store up to 35TB, and it is hoped that a 100TB tape will be developed to cope with the astronomical ‘annual archive growth [that] would swamp an experiment that is expected to last decades’. The 100TB cartridges will be made ‘by shrinking the width of the recording tracks and using more accurate systems for positioning the read-write heads used to access them.’

If successful, this would certainly be an advanced achievement in material science and electronics. Smaller tape width means less room for error on the read-write function – this will have to be incredibly precise on a tape that will be storing a pretty extreme amount of information. Presumably smaller tape width will also mean there will be no space for guard bands either. Guard bands are unrecorded areas between the stripes of recorded information that are designed to prevent information interference, or what is known as ‘cross-talk’. They were used on larger video tape formats such as U-matic and VHS, but were dispensed with on smaller formats such as Hi-8, which had a higher density of magnetic information in a small space and used video heads with tilted gaps instead of guard bands.

The existence of SKA still doesn’t explain the pressing question: why develop new archival tape storage solutions and not hard drive storage?

Hard drives were embraced quickly because they take up less physical storage space than tape. Gone are the dusty rooms bursting with reel upon reel of bulky tape; hello stacks of infinite quick-fire data, whirring and purring all day and night. Yet when we consider the amount of energy hard drive storage requires to remain operable, the costs – both economic and ecological – dramatically increase.

The report compiled by the Clipper Group, published in 2010, overwhelmingly argues for the benefits of tape over disk for the long-term archiving of data. They state that ‘disk is more than fifteen times more expensive than tape, based upon vendor-supplied list pricing, and uses 238 times more energy (costing more than all the costs for tape) for an archiving application of large binary files with a 45% annual growth rate, all over a 12-year period.’
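The scale of the Clipper Group’s scenario is easy to underestimate, because 45% annual growth compounds. A quick sketch of the arithmetic (the figures are illustrative, drawn only from the growth rate and period the report quotes):

```python
def archive_size(initial_tb: float, annual_growth: float, years: int) -> float:
    """Size of an archive after compound annual growth."""
    return initial_tb * (1 + annual_growth) ** years

# The Clipper Group scenario: 45% annual growth over a 12-year period.
growth_factor = archive_size(1.0, 0.45, 12)  # multiple of the starting size
```

An archive growing at 45% a year ends up roughly 86 times its starting size after 12 years, which is why per-terabyte differences in media cost and energy draw dominate the long-term economics.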

This is probably quite staggering to read, given the amount of investment in establishing institutional architecture for tape-less digital preservation. Such an analysis of energy consumption does assume, however, that hard drives are turned on all the time, when surely many organisations transfer archives to hard drives and only check them once every 6-12 months.

Yet due to the pressures of technological obsolescence and the need to remain vigilant about file operability, coupled with the functional purpose of digital archives to be quickly accessible in comparison with tape that can only be played back linearly, such energy consumption does seem fairly inescapable for large institutions in an increasingly voracious, 24/7 information culture. Of course the issue of obsolescence will undoubtedly affect super-storage-data tape cartridges as well. Technology does not stop innovating – it is not in the interests of the market to do so.

Perhaps more significantly, the archive world has not yet developed standards that address the needs of digital information managers. Henry Newman’s presentation at the Designing Storage Architectures 2013 conference explored the difficulty of digital data management, precisely due to the lack of established standards:

  • ‘There are some proprietary solutions available for archives that address end to end integrity;
  • There are some open standards, but none that address end to end integrity;
  • So, there are no open solutions that meet the needs of [the] archival community.’

He goes on to write that standards are ‘technically challenging’ and require ‘years of domain knowledge and detailed understanding of the technology’ to implement. Worryingly perhaps, he writes that ‘standards groups do not seem to be coordinating well from the lowest layers to the highest layers.’ By this we can conclude that the lack of streamlined conversation around the issue of digital standards means that effectively users and producers are not working in synchrony. This is making the issue of digital information management a challenging one, and will continue to be this way unless needs and interests are seen as mutual.

Other presentations at the recent annual meeting for Designing Storage Architectures for Digital Collections, which took place on September 23-24, 2013 at the Library of Congress, Washington, DC, also suggest there are limits to innovation in the realm of hard drive storage. Gary Decad, IBM, delivered a presentation on ‘The Impact of Areal Density and Millions of Square Inches of Produced Memory on Petabyte Shipments for TAPE, NAND Flash, and HDD Storage Class‘.

For the lay (wo)man this basically translates as the capacity to develop computer memory stored on hard drives. We are used to living in a consumer society where new improved gadgets appear all the time. Devices are getting smaller and we seem to be able to buy more storage space for lower prices. For example, it now costs under £100 to buy a 3TB hard drive, and it is becoming increasingly difficult to purchase hard drives with less than 500GB of storage space. Only a year ago, a 1TB hard drive was top of the range and would probably have cost you about £100.

A 100TB storage unit in 2010, compared with a smaller hard drive symbolising 2020.

Does my data look big in this?

Yet the presentation from Gary Decad suggests we are reaching a plateau with this kind of storage technology – infinite memory growth and reduced costs will soon no longer be feasible. The presentation states that ‘with decreasing rates of areal density increases for storage components and with component manufacturers’ reluctance to invest in new capacity, historical decreases in the cost of storage ($/GB) will not be sustained.’

Where does that leave us now? The resilience of tape as an archival solution, the energy implications of digital hard drive storage, the lack of established archival standards and a foreseeable end to cheap and easy big digital data storage, are all indications of the complex and confusing terrain of information management in the 21st century. Perhaps the Clipper report offers the most grounded appraisal: ‘the best solution is really a blend of disk and tape, but – for most uses – we believe that the vast majority of archived data should reside on tape.’ Yet it seems until the day standards are established in line with the needs of digital information managers, this area will continue to generate troubling, if intriguing, conundrums.

Digitisation strategies – back up, bit rot, decay and long term preservation

Monday, September 23rd, 2013

In a blog post a few weeks ago we reflected on several practical and ethical questions emerging from our digitisation work. To explore these issues further we decided to take an in-depth look at the British Library’s Digital Preservation Strategy 2013-2016 that was launched in March 2013. The British Library is an interesting case study because they were an ‘early adopter’ of digital technology (2002), and are also committed to ensuring their digital archives are accessible in the long term.

Making sure the UK’s digital archives are available for subsequent generations seems like an obvious aim for an institution like the British Library. That’s what they should be doing, right? Yet it is clear from reading the strategy report that digital preservation is an unsettled and complex field, one that is certainly ‘not straightforward. It requires action and intervention throughout the lifecycle, far earlier and more frequently than does our physical collection (3).’

The British Library’s collection is huge and therefore requires coherent systems capable of managing its vast quantities of information.

‘In all, we estimate we already have over 280 terabytes of collection content – or over 11,500,000 items – stored in our long term digital library system, with more awaiting ingest. The onset of non-print legal deposit legislation will significantly increase our annual digital acquisitions: 4.8 million websites, 120,000 e-journal articles and 12,000 e-books will be collected in the first year alone (FY 13/14). We expect that the total size of our collection will increase massively in future years to around 5 petabytes [that’s 5000 terabytes] by 2020.’

All that data needs to be backed up as well. In some cases valuable digital collections are backed up in different locations/servers seven times (amounting to 35 petabytes/35,000 terabytes). So imagine it is 2020, and you walk into a large room crammed full of rack upon rack of hard drives bursting with digital information. The data files – which include everything from a BWAV audio file of a speech by Natalie Bennett, leader of the Green Party, after her election victory in 2015, to 3-D data files of cuneiform scripts from Mesopotamia – are constantly being monitored by algorithms designed to maintain the integrity of data objects. The algorithms measure bit rot and data decay and produce further volumes of metadata as each wave of file validation is initiated. The back-up systems consume large amounts of energy and are costly, but in beholding them you stand in the same room as the memory of the world, automatically checked, corrected and repaired in monthly cycles.

Such a scenario is gestured toward in the British Library’s long term preservation strategy, but it is clear that it remains a work in progress, largely because the field of digital preservation is always changing. While the British Library has well-established procedures in place to manage their physical collections, they have not yet achieved this with their digital ones. Not surprisingly ‘technological obsolescence is often regarded as the greatest technical threat to preserving digital material: as technology changes, it becomes increasingly difficult to reliably access content created on and intended to be accessed on older computing platforms.’ An article from The Economist in 2012 reflected on this problem too: ‘The stakes are high. Mistakes 30 years ago mean that much of the early digital age is already a closed book (or no book at all) to historians.’

Destroyed Hard Drive

There are also shorter term digital preservation challenges, which encompass ‘everything from media integrity and bit rot to digital rights management and metadata.’ Bit rot is one of those terms capable of inducing widespread panic. It refers to how storage media, in particular optical media like CDs and DVDs, decay over time often because they have not been stored correctly. When bit rot occurs, a small electric charge of a ‘bit’ in memory disperses, possibly altering program code or stored data, making the media difficult to read and at worst, unreadable. Higher level software systems used by large institutional archives mitigate the risk of such underlying failures by implementing integrity checking and self-repairing algorithms (as imagined in the 2020 digital archive fantasy above). These technological processes help maintain ‘integrity and fixity checking, content stabilisation, format validation and file characterisation.’
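Integrity checking of this kind usually boils down to recording a cryptographic checksum for each file at ingest and periodically re-computing it: any mismatch flags a file for repair from a replica. A minimal sketch of the idea (illustrative only, not any institution’s actual system):

```python
import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Stream a file through SHA-256 in 1 MB chunks (archive files are large)."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

def verify_manifest(manifest: dict) -> list:
    """Return the paths whose current checksum no longer matches the one
    recorded at ingest -- candidates for repair from a back-up copy."""
    return [path for path, recorded in manifest.items()
            if sha256_of(Path(path)) != recorded]
```

A self-repairing archive runs this cycle on a schedule, replaces any flagged file from a known-good replica, and records the whole event as preservation metadata, which is where the ‘further volumes of metadata’ mentioned above come from.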

Archival Gold Disc

300 years, are you sure?

Preservation differences between analogue and digital media

The British Library isolates three main areas where digital technologies differ from their analogue counterparts. Firstly there is the issue of ‘proactive lifecycle management’. This refers to how preservation interventions for digital data need to happen earlier, and be reviewed more frequently, than for analogue data. Secondly there is the issue of file ‘integrity and validation’. This refers to how it is far easier to change a digital file without noticing, whereas with a physical object it is usually clear if it has decayed or a bit has fallen off. This means there are greater risks to the authenticity and integrity of digital objects, and any changes need to be carefully managed and properly recorded in metadata.
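Part of validation is simply characterising a file: reading its own header and recording what it claims to be, so later changes stand out. A hypothetical sketch using Python’s standard wave module (the returned field names are made up for the example, not an archival metadata standard):

```python
import wave

def characterise_wav(path):
    """Read basic technical metadata from a WAV file's own header."""
    with wave.open(path, "rb") as w:
        frames = w.getnframes()
        rate = w.getframerate()
        return {
            "channels": w.getnchannels(),
            "sample_rate_hz": rate,
            "bit_depth": w.getsampwidth() * 8,
            "duration_s": frames / rate,
        }
```

Stored alongside a checksum, a record like this lets an archive confirm years later that a file still is what its metadata says it is.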

Finally, and perhaps most worrying, is the ‘fragility of storage media’. Here the British Library explains:

‘The media upon which digital materials are stored is often unstable and its reliability diminishes over time. This can be exacerbated by unsuitable storage conditions and handling. The resulting bit rot can prevent files from rendering correctly if at all; this can happen with no notice and within just a few years, sometimes less, of the media being produced’.

A holistic approach to digital preservation involves taking and assessing significant risks, as well as adapting to vast technological change. ‘The strategies we implement must be regularly re-assessed: technologies and technical infrastructures will continue to evolve, so preservation solutions may themselves become obsolete if not regularly re-validated in each new technological environment.’

Establishing best practice for digital preservation remains a bit of an experiment, and different strategies such as migration, emulation and normalisation are tested to find out which model best helps counter the real threats of inaccessibility and obsolescence we may face 5-10 years from now. What is encouraging about the British Library’s strategic vision is that they are committed to ensuring digital archives are accessible for years to come despite the very clear challenges they face.

From digital files back to analogue tape

Monday, June 10th, 2013

The bread and butter work of Great Bear Analogue and Digital Media is to migrate analogue and digital magnetic tape to digital files, but recently we were asked by a customer to transfer a digital file to ¼ inch analogue tape.

The customer was concerned about the longevity of electronic digital formats, and wanted to transfer his most valued recordings to a tangible format he knew and trusted. Transferring from digital to analogue was certainly more expensive: the blank tape media alone cost over £50.

In a world where digital technology seems pervasive, remaining so attached to analogue media may appear surprising. Yet the resilience of tape as a recorded medium is far greater than is widely understood.

Take this collection of old tapes that are in the back yard of the Great Bear office. Fear not customers, this is not what happens to your tapes when you send them to us! They are a collection of test tapes that live outside all year round without shelter from the elements. We use them to test ways of treating degraded tapes because we don’t want to take unnecessary risks with our customers’ material.

Audio cassette tapes left outside for years

Despite being subject to pretty harsh conditions, the majority of material on these tapes is recoverable to some degree.

Would digital data stored on a hard drive survive if it had to endure similar conditions? It is far less likely.

Due to its electronic composition, digital data is fragile in comparison with analogue magnetic tape. This is also the ironic conclusion of Side by Side (2012), the documentary film narrated by Keanu Reeves which explores the impact of digital technology on the film industry.

Requests for digital to analogue transfers are fairly rare at Great Bear, but we are happy to do them should the need arise!

And don’t forget to back up your digital files in at least three different locations to ensure they are safe.

Real time transfers – digitising tape media

Monday, June 3rd, 2013

In theory the work we do at Great Bear is very simple: we migrate information from analogue or digital magnetic tape to electronic digital files.

Once transferred, digital files can be easily edited, tagged, accessed, shared or added to a database. Due to the ubiquitous nature of digital media today, if you want to use your data, it needs to be in a digital form.

In practice, however, there are many more issues that arise when migrating tape-based media. These can stem from the obsolescence of machines (spare parts being a particular issue), physical problems with the tape and, significantly, the actual person-time involved in doing the transfer.

Threading EIAJ video tape, close-up

While large institutions like the Library of Congress in the USA can invest in technology that enables mass digitisation, like that developed by Samma Systems, most transfers require operators to do the work. The simple truth is that for fragile and obsolete tape media, there is no other option. In the film ‘Living Archive – Preservation Challenge‘ David Crosthwait from American digitisation company DC Video describes the importance of careful, real time transfers:

‘When a tape is played back, that tape starts from the very beginning and may run for 60-65 minutes straight. One person sits in front of that machine and watches that tape from beginning to end, s/he does nothing else but watch that tape. We feel this procedure is the only way to guarantee the highest quality possible.’

Threading EIAJ half-inch video tape

At Great Bear we echo this sentiment. We give each transfer individual attention so that the information is migrated accurately and effectively. Sometimes this means doing things slowly to ensure that tape is spooled correctly and the tension within the tape pack is even. If transfers are rushed there is always the danger that tape will get crumpled or damaged. As the saying goes, ‘more haste, less speed’.


Archiving for the digital long term: information management and migration

Monday, June 3rd, 2013

As an archival process digitisation offers the promise of a dream: improved accessibility, preservation and storage.

However the digital age is not without its archival headaches. News of the BBC’s plans to abandon their Digital Media Initiative (DMI), which aimed to make the BBC media archive ‘tapeless’, clearly demonstrates this. As reported in The Guardian:

‘DMI has cost £98.4m, and was meant to bring £95.4m of benefits to the organisation by making all the corporation’s raw and edited video footage available to staff for re-editing and output. In 2007, when the project was conceived, making a single TV programme could require 70 individual video-handling processes; DMI was meant to halve that.’

The project’s failure has been explained by its size and ambition. Another telling reason was cited: the software and hardware used to deliver the project were developed for exclusive use by the BBC. In a statement BBC Director-General Tony Hall referred to the fast development of digital technology, stating that ‘off-the-shelf [editing] tools were now available that could do the same job “that simply didn’t exist five years ago”.’

G-Tech Pro hard drive RAID array

The fate of the DMI initiative should act as a sobering lesson for institutions, organisations and individuals who have not thought about digitisation as a long, rather than short term, archival solution.

As technology continues to ‘innovate’ at a startling rate, it is hard to predict how long the current archival standards for audio and audio-visual material will last.

Being an early adopter of technology can be an attractive proposition: you are up to date with the latest ideas, flying the flag for the cutting edge. Yet new technology becomes old fast, and this potentially creates problems for accessing and managing information. The fragility of digital data comes to the fore, and the risk of investing all our archival dreams in exclusive technological formats as the BBC did, becomes far greater.

Mac OS X copy dialogue box

In order for our data to survive we need to appreciate that we are living in what media theorist Jussi Parikka calls an ‘information management society.’ Digitisation has made it patently clear that information is dynamic rather than stored safely in static objects. Migrating tape based archives to digital files is one stage in a series of transitions material can potentially make in its lifetime.

Given the evolution of media and technology in the 20th and 21st centuries, it feels safe to speculate that new technologies will emerge to supplant uncompressed WAV and AIFF files, just as AAC has now become preferred to MP3 as a compressed audio format because it achieves better sound quality at similar bit rates.

Because of this, at Great Bear we always migrate analogue and digital magnetic tape at the recommended archival standard, and provide customers with both high quality archival copies and access copies. Furthermore, we strongly recommend that customers back up archive-quality files in at least three separate locations, because it is highly likely the data will need to be migrated again in the future.
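As a rough illustration of that back-up advice, here is a sketch that copies an archive file to several locations and verifies each copy against the source by checksum. The paths and function names are invented for the example, and the ‘locations’ here are simply directories; in practice they should be physically separate drives or sites:

```python
import hashlib
import shutil
from pathlib import Path

def checksum(path):
    """SHA-256 of a file, read in 1 MB chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

def replicate(source, destinations):
    """Copy an archive file to each destination and verify every copy.

    Raises IOError if any copy's checksum differs from the source's.
    Returns the list of verified copies.
    """
    expected = checksum(source)
    copies = []
    for dest_dir in destinations:
        dest = Path(dest_dir) / Path(source).name
        dest.parent.mkdir(parents=True, exist_ok=True)
        shutil.copy2(source, dest)  # preserves timestamps as well
        if checksum(dest) != expected:
            raise IOError(f"copy to {dest} failed verification")
        copies.append(dest)
    return copies
```

Verifying each copy at the moment it is made, rather than trusting the copy operation, is the same fixity principle institutional archives apply at scale.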

broken DAT / degraded mouldy tape

Saturday, August 18th, 2012

Early tape-based digital formats such as DAT, Tascam DTRS and ADAT are often problematic now, partly due to tape issues and partly due to machine reliability and the availability of spares. In 10 or 20 years’ time these machines will be much less serviceable than the analogue tape machines of the previous generation, and as a result more obsolete and a higher priority to migrate to a file-based digital format.

We’ve also started to see a particularly nasty problem with some DATs, usually the 120 minute length. The first symptom is a broken DAT tape, usually on wind. The tape pack seems to become slightly sticky, with intermittent tension between the layers of tape, and with the thinner tape used in 120 minute lengths this can sometimes break the tape on wind.

TDK 120 DAT sticky tape layers

You can see in the above image how the tape sticks slightly to the pack and then releases when wound by hand. With the greater tension of a machine wind, and with the tape also wound around the head drum, this becomes risky.

With large transfer jobs, checking each DAT by disassembly is not feasible, but the permanent damage and/or loss of a section of audio caused by a break is not acceptable either!

