Posts Tagged ‘Standards’

Grundig C 100 and the early history of the Compact Cassette

Monday, March 7th, 2016

The recent arrival of a Grundig C 100 cassette in the Great Bear studio has been an occasion to explore the early history of the compact cassette.


The compact cassette has gained counter-cultural kudos in recent times, and more about that later, but once upon a time the format was the new kid on the block.

The audio cassette was revolutionary for several reasons, an important one being its compact size. The compact cassette, introduced by Dutch company Philips in 1963, could be held in the palm of one hand, while its closest neighbour in media history, the RCA Sound Tape cartridge (1958-1964), needed two.

The compact cassette also offered a more user-friendly experience for the consumer.

Whereas reel-to-reel tape had to be threaded manually through the tape transport, all the user of a compact cassette tape machine had to do was insert a tape in a machine and press play.

Format Wars

One of the less-emphasised histories of the compact cassette is the alternative cassette standards that were vying for market domination alongside Philips in the early 1960s.

One alternative was the DC International system developed by the German company Grundig, which at that time was a leading manufacturer of tape recorders, radios and hi-fi systems.

In 1965 Grundig introduced its first cassette recorder, the C 100, which used the Double Cassette (DC) International system. The DC International used two reels within the cassette shell, similar to the Compact-System promoted by Philips. There were, however, important differences between the two standards.

The DC International standard used a larger cassette shell (120 × 77 × 12 mm) with ¼-inch tape and recorded at 2 in/s. The Compact-System was smaller all round: 0.15-inch (3.81 mm) tape, recorded at 1⅞ in/s.

Fervent global competition shaped audio cassette production in the mid-1960s.

Grundig’s DC International was effectively (and rapidly) ousted from the market by Philips’ ‘open’ licensing strategy.

Eric D. Daniel and C. Denis Mee explain that

‘From the beginning Philips pursued a strategy of licensing its design as widely as possible. According to Frederik Philips, president of the firm at the time, this policy was the brainchild of Mr. Hartong, a member of the board of management. Hartong believed that Philips should allow other manufacturers access to the design, turning the compact cassette into a world product….Despite initial plans to charge a fee, Phillips eventually decided to offer the license for free to any firm willing to produce the design. Several firms adopted the compact cassette almost immediately, including many Japanese manufacturers.’ [1]

The outcome of this licensing strategy was a widespread, international adoption of Philips’ compact cassette standard.

In Billboard on 16 September 1967 it was reported: ‘Philips has scored a critical victory on the German market for its “Compact-System”, which now seems certain to have uncontested leadership. Teldec has switched from the DC-International system to the Philips system, and Grundig, the major manufacturer of the DC-International system, announced that it will also start manufacturing cassette players for the Philips system.’

Cassettes today

The portable, user-friendly compact cassette has proved to be a resilient format. Despite falling foul of the digital march of progress in the early 1990s, the past couple of years have been defined by claims that cassettes are back and (almost) cool again.

Although the Recording Industry Association of America has denied reports it is tracking cassette sales again, it is clear that ‘a small, but engaged niche audience… is steadily growing’ for tape-based releases.

Whether that audience is gorging on tapes from do-it-yourself tape labels or sampling the delights of Justin Bieber’s latest album, cassettes are a hit for low-budget music-makers and status-bearers alike.

Compact Cassette Preservation

Amid this cassette fervour, Great Bear remains occupied with the old wave of cassettes.

Cassettes from the 1960s and early 1970s carry specific preservation concerns.

Loss of lubricant is a common problem. You will know your tape is suffering lubricant loss if you hear a horrible squealing sound during playback. This is known as ‘stick slip’: the friction between the magnetic tape and the tape heads makes the tape alternately stick and slip as it moves through the tape transport.

This squealing poses big problems because it can intrude into the signal path and become part of the digital transfer. Tapes displaying such problems therefore require careful re-lubrication to ensure the recording can be transferred in its optimum – and squeal-free – state.

Early compact cassettes also have problems that characterise much ‘new media.’

As Eric D. Daniel et al elaborate: ‘during the compact cassette’s first few years, sound quality was mediocre, marred by background noise, wow and flutter, and a limited frequency range. While ideal for voice recording applications like dictation, the compact cassette was marginal for musical recording.’ [2]

The resurgence in compact cassette culture may lull people into a false sense that recordings stored on cassettes are not high risk and do not need to be transferred in the immediate future.

It is worth remembering, however, that although playback machines will continue to be produced in years to come, not all tape machines are of equal, archival quality.

The last professional grade audio cassette machines were produced in the late 1990s and even the best of this batch lag far behind the tape machine to end all tape machines – the Nakamichi Dragon with its Automatic Azimuth Correction technology – that was discontinued in 1993.

To ensure the best quality transfers it is advisable to play back tapes on professional-grade machines. These allow greater control over problems that can arise with azimuth, wow and flutter, which often need to be checked and, if necessary, adjusted prior to playback – a process that is not possible on cheaper, domestic machines.

As ever, if you have any specific concerns or enquiries regarding your audio cassette collections, please contact us to discuss them.

Notes

[1] Eric D. Daniel et al, eds. (2009) Magnetic Recording: The First 100 Years. Piscataway: IEEE Press Marketing, 103-104.

[2] Eric D. Daniel et al, eds, Magnetic Recording, 104.

D1, D2 & D3 – histories of digital video tape

Monday, July 14th, 2014

D1 tape

The images in this article are of the first digital video tape formats, the D1, D2 and D3. The tendency to continually downsize audiovisual technology is clearly apparent: the gargantuan shell of the D1 gradually shrinks to the D3, which resembles the size of a domestic VHS tape.

Behind every tape (and every tape format) lie interesting stories, and the technological wizardry and international diplomacy that helped shape the roots of our digital audio visual world are worth looking into.

In 1976, when the green shoots of digital audio technology were emerging at industry level, the question of whether Video Tape Recorders (VTRs) could be digitised began to be explored in earnest by R&D departments at Sony, Ampex and Bosch GmbH. There was considerable scepticism among researchers about whether digital video tape technology could be developed at all, because of the wide frequency range required to transmit a digital image.

In 1977, however, as reported on the Sony website, Yoshitaka Hashimoto and his team began to research digital VTRs intensively and ‘in just a year and a half, a digital image was played back on a VTR.’

Several years of product development followed, shaped, in part, by competing regional preferences. As Jim Slater argues in Modern Television Systems (1991): ‘much of the initial work towards digital standardisation was concerned with trying to find ways of coping with the three very different colour subcarrier frequencies used in NTSC, SECAM and PAL systems, and a lot of time and effort was spent on this’ (114).

Establishing a standard sampling frequency had, of course, real financial consequences; it could not be plucked randomly out of the air: the higher the sampling frequency, the greater the overall bit rate, and the greater the overall bit rate, the more storage space needed in digital equipment. In 1982, after several years of negotiations, a 13.5 MHz sampling frequency was agreed. European, North American, ‘Japanese, the Russians, and various other broadcasting organisations supported the proposals, and the various parameters were adopted as a world standard, Recommendation 601 [a.k.a. 4:2:2 DTV] standard of the CCIR [Consultative Committee for International Radio, now International Telecommunication Union]’ (Slater, 116).

The 4:2:2 DTV recommendation was an international standard that would form the basis of the (almost) exclusively digital media environment we live in today. It was ‘developed in a remarkably short time, considering its pioneering scope, as the worldwide television community recognized the urgent need for a solid basis for the development of an all-digital television production system’, write Stanley Baron and David Wood.
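The trade-off Slater describes – higher sampling frequency means a higher bit rate, which means more storage – can be sketched with a rough back-of-envelope calculation (this assumes the 8-bit samples of early Rec. 601 equipment; the D1 format’s lower quoted figure of 173 Mbit/s reflects recording only the active picture area):

```python
# Rough Rec. 601 (4:2:2) bit-rate estimate: luminance sampled at
# 13.5 MHz, with two colour-difference channels each at half that rate.
LUMA_HZ = 13_500_000
CHROMA_HZ = 6_750_000      # 6.75 MHz per colour-difference channel
BITS_PER_SAMPLE = 8        # early Rec. 601 equipment used 8-bit samples

total_bits_per_sec = (LUMA_HZ + 2 * CHROMA_HZ) * BITS_PER_SAMPLE
print(f"{total_bits_per_sec / 1e6:.0f} Mbit/s")  # 216 Mbit/s gross
```

Doubling the sampling frequency would double this figure, which is why the choice of 13.5 MHz was negotiated so carefully.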

Once agreed upon, product development could proceed. The first digital video tape format, the D1, was introduced to the market in 1986. It recorded uncompressed component video and used enormous bandwidth for its time: a bit rate of 173 Mbit/s, with a maximum recording time of 94 minutes.

D-2 and D-3 Tapes

As Slater writes

‘unfortunately these machines are very complex, difficult to manufacture, and therefore very expensive […] they also suffer from the disadvantage that being component machines, requiring luminance and colour-difference signals at input and output, they are difficult to install in a standard studio which has been built to deal with composite PAL signals. Indeed, to make full use of the D1 format the whole studio distribution system must be replaced, at considerable expense’ (125).

The need to effectively re-wire whole studios – and the considerable risk involved in doing so amid continual technological change – strikes a chord with the challenges UK broadcast companies face as they finally become ‘tapeless’ in October 2014 as part of the Digital Production Partnership’s AS-11 policy.

Sequels and product development

As the story so often goes, D1 would soon be followed by D2. Those that did make the transition to D1 were probably kicking themselves, and you can only speculate about the number of back injuries sustained getting the machines into the studio (from experience we can tell you they are huge and very heavy!)

It was fairly inevitable that a sequel would be developed because, even though the D1 provided uncompromising image quality, it was most certainly an unwieldy format, as is apparent from its gigantic size and component wiring. In response a composite digital video format – the D2 – was developed by Ampex and introduced in 1988.

In this 1988 promotional video you can see the D-2 in action. Amazingly to our eyes and ears today, the D2 is presented as the ideal archival format – amazing for its physical size (hardly inconspicuous on the storage shelf!), but also because it used composite video signal technology. Composite signals combine on one wire all the component parts which make up a video signal: chrominance (colour: red, green and blue – RGB) and luminance (the brightness, or black-and-white information, including grayscale).
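The luminance/chrominance split at the heart of this distinction can be illustrated with the Rec. 601 luma weighting, which derives the ‘black and white’ portion of the signal from the red, green and blue components (a minimal sketch; real encoders also handle gamma and signal levels):

```python
def bt601_luma(r: float, g: float, b: float) -> float:
    """Luminance (Y) from RGB using the Rec. 601 weighting -
    the 'black and white' part of the video signal. Inputs 0.0-1.0."""
    return 0.299 * r + 0.587 * g + 0.114 * b

# Green dominates perceived brightness; blue contributes least,
# which is why the weights are so uneven.
print(bt601_luma(1.0, 1.0, 1.0))  # pure white: ~1.0 (full luminance)
print(bt601_luma(0.0, 0.0, 1.0))  # pure blue: ~0.114
```

The colour-difference (chroma) channels carry what remains once this luminance has been subtracted out.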

While the composite video signal used lower bandwidth and was more compatible with existing analogue systems used in the broadcast industry of the time, its value as an archival format is questionable. A comparable process for the storage we use today would be to add compression to a file in order to save file space and create access copies. While this is useful in the short term it does risk compromising file authenticity and quality in the long term. The Ampex video is fun to watch however, and you get a real sense of how big the tapes were and the practical impact this would have had on the amount of time it took to produce TV programmes.

Enter the D3

Following the D2 is the D3, the final video tape format covered in this article (although there were, of course, also the D5 and D9).

The D3 was introduced by Panasonic in 1991 in order to compete with Ampex’s D2. It has the same sampling rate as the D2 with the main difference being the smaller shell size.

The D3’s biggest claim to fame was that it was the archival digital video tape of choice for the BBC, who migrated their analogue video tape collections to the format in the early 1990s. One can only speculate that the decision to take the archival plunge with the D3 was a calculated risk: it appeared to be a stable-ish technology (it wasn’t a first generation technology and the difference between D2 and D3 is negligible).

The extent of the D3 archive is documented in a white paper published in 2008, D3 Preservation File Format, written by Philip de Nier and Phil Tudor: ‘the BBC Archive has around 315,000 D3 tapes in the archive, which hold around 362,000 programme items. The D3 tape format has become obsolete and in 2007 the D3 Preservation Project was started with the goal to transfer the material from the D3 tapes onto file-based storage.’

Tom Heritage, reporting on the development of the D3 preservation project in 2013/2014, reveals that ‘so far, around 100,000 D3 and 125,000 DigiBeta videotapes have been ingested representing about 15 Petabytes of content (single copy).’

It has taken six years, then, to migrate less than a third of the BBC’s D3 archive. Given that D3 machines are now obsolete, it is more than questionable whether there are enough D3 head hours left in existence to read all the information back clearly and to an archive standard. The archival headache is compounded by the fact that ‘with a large proportion of the content held on LTO3 data tape [first introduced 2004, now on LTO-6], action will soon be required to migrate this to a new storage technology before these tapes become difficult to read.’ With the much publicised collapse of the BBC’s Digital Media Initiative (DMI) in 2013, you’d have to have a very strong disposition to work in the BBC’s audio visual archive department.
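The scale of the problem can be sanity-checked with a little arithmetic on the figures quoted above (a rough sketch only; it assumes the ingest rate stays constant, which real projects rarely manage):

```python
# Figures from de Nier & Tudor's 2008 white paper and Heritage's
# 2013/14 progress report, as quoted above.
total_d3 = 315_000       # D3 tapes in the BBC archive
ingested_d3 = 100_000    # D3 tapes ingested so far
years_elapsed = 6        # project began in 2007

fraction_done = ingested_d3 / total_d3
rate_per_year = ingested_d3 / years_elapsed
years_remaining = (total_d3 - ingested_d3) / rate_per_year

print(f"{fraction_done:.0%} complete")                      # 32% complete
print(f"~{years_remaining:.0f} more years at this rate")    # ~13 more years
```

Those remaining years are exactly the window in which working D3 head hours are running out.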

The roots of the audio visual digital world

The development of digital video tape, and the international standards which accompanied its evolution, is an interesting place to start understanding our current media environment. They are also a great place to begin examining the problems of digital archiving, particularly when file migration has become embedded within organisational data management policy, and data collections are growing exponentially.

While the D1 may look like an alien techno-species from a distant land compared with the modest, immaterial file lists neatly stored on the hard drives we are accustomed to, the two are related through the 4:2:2 sampling standard, which revolutionised high-end digital video production and continues to shape our mediated perceptions.

Going ‘tape-less’: AS-11 Digital Production Partnership standards

Wednesday, May 7th, 2014

Is this the end of tape as we know it? Maybe not quite yet, but October 1, 2014, will be a watershed moment in professional media production in the UK: it is the date that file format delivery will finally ‘go tape-less.’

Establishing end-to-end digital production will cut out what is now seen as the cumbersome use of video tape in file delivery. Using tape essentially adds a layer of media activity to a process that is predominantly file based anyway. As Mark Harrison, Chair of the Digital Production Partnership (DPP), reflects:

Example of a workflow for the DPP AS-11 standard


‘Producers are already shooting their programmes on tapeless cameras, and shaping them in tapeless post production environments. But then a strange thing happens. At the moment a programme is finished it is transferred from computer file to videotape for delivery to the broadcaster. When the broadcaster receives the tape they pass it to their playout provider, who transfers the tape back into a file for distribution to the audience.’

Founded in 2010, the DPP are a ‘not-for-profit partnership funded and led by the BBC, ITV and Channel 4 with representation from Sky, Channel 5, S4/C, UKTV and BT Sport.’ The purpose of the coalition is to help ‘speed the transition to fully digital production and distribution in UK television’ by establishing technical and metadata standards across the industry.

The transition to a standardised, tape-less environment has further been rationalised as a way to minimise confusion among media producers and help economise costs for the industry. As reported on Avid Blogs, production companies, who often have to respond to rapidly evolving technological environments, are frantically preparing for deadline day. ‘It’s the biggest challenge since the switch to HD’, said Andy Briers, from Crow TV. Moreover, this challenge is as much financial as it is technical: ‘leading post houses predict that the costs of implementing AS-11 delivery will probably be more than the cost of HDCAM SR tape, the current standard delivery format’, writes David Wood on televisual.com.

Outlining the standard

Audio post production should now be mixed to the EBU R128 loudness standard. As stated in the DPP’s producer’s guide, this new audio standard ‘attempts to model the way our brains perceive sound: our perception is influenced by frequency and duration of sound’ (9).

In addition, the following specifications must be observed to ensure the delivery format is ‘technically legal.’

  • HD 1920×1080 in an aspect ratio of 16:9 (1080i/25)
  • AVC-I in MXF (Material Exchange Format) OP1a files to AS11 specification
  • DPP required metadata
  • Photo Sensitive Epilepsy (flashing) testing to OFCOM standard/ the Harding Test
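As a purely illustrative sketch, a playout facility might automate checks like these before accepting a delivered file. The property names below are hypothetical – they are not drawn from any real MXF tool or DPP software – and PSE/Harding testing would remain a separate step:

```python
# Hypothetical pre-delivery check against the AS-11 DPP points above.
DPP_REQUIREMENTS = {
    "resolution": "1920x1080",
    "aspect_ratio": "16:9",
    "scan": "1080i/25",
    "video_codec": "AVC-Intra",
    "wrapper": "MXF OP1a",
}

def check_delivery(file_properties: dict) -> list:
    """Return a list of failures; an empty list means the file is
    'technically legal' on these points."""
    return [
        f"{key}: expected {expected!r}, got {file_properties.get(key)!r}"
        for key, expected in DPP_REQUIREMENTS.items()
        if file_properties.get(key) != expected
    ]

problems = check_delivery({"resolution": "1280x720", "wrapper": "MXF OP1a"})
print(problems[0])  # resolution: expected '1920x1080', got '1280x720'
```

Constraining the spec this tightly is precisely what makes such automated conformance checking feasible.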

The shift to file-based delivery will require new kinds of vigilance and attention to detail in order to manage the specific problems that will potentially arise. The DPP producer’s guide states: ‘unlike the tape world (where there may be only one copy of the tape) a file can be copied, resulting in more than one essence of that file residing on a number of servers within a playout facility, so it is even more crucial in file-based workflows that any redelivered file changes version or number’.

Another big development within the standard is the important role performed by metadata, both structural (inherent to the file) and descriptive (added during the course of making the programme). While broadcasters may be used to writing descriptive metadata by hand on tape boxes, it must now be embedded in the digital file itself. Furthermore, ‘the descriptive and technical metadata will be wrapped with the video and audio into a new and final AS-11 DPP MXF file,’ and if ‘any changes to the file are [made it is] likely to invalidate the metadata and cause the file to be rejected. If any metadata needs to be altered this will involve re-wrapping the file.’

Interoperability: the promise of digital technologies

The sector-wide agreement and implementation of digital file-delivery standards are significant because they represent a commitment to manufacturing full interoperability, an inherent potential of digital technologies. As French philosopher of technology Bernard Stiegler explains:

‘The digital is above all a process of generalised formalisation. This process, which resides in the protocols that enable interoperability, makes [compatible] a range of diverse and varied techniques. This is a process of unification through binary code of norms and procedures that today allow the formalisation of almost everything: traveling in my car with a GPS system, I am connected through a digitised triangulation process that formalises my relationship with the maps through which I navigate and that transform my relationship with territory. My relationships with space, mobility and my vehicle are totally transformed. My inter-individual, social, familial, scholarly, national, commercial and scientific relationships are all literally unsettled by the technologies of social engineering. It is at once money and many other things – in particular all scientific practices and the diverse forms of public life.’

Jigsaw with the pieces representing various technical elements fitting together

This systemic homogenisation described by Stiegler is called into question if we consider whether the promise of interoperability – understood here as different technical systems operating efficiently together – has ever been fully realised by the current generation of digital technologies. If it had been, initiatives like the DPP’s would never have to be pursued in the first place – all kinds of technical operations would run in a smooth, synchronous manner. Amid the generalised formalisation there are many micro-glitches and incompatibilities that slow operations down at best, and grind them to a halt at worst.

With this in mind we should note that standards established by the DPP are not fully interoperable internationally. While the DPP’s technical and metadata standards were developed in close alliance with the US-based Advanced Media Workflow Association’s (AMWA) recently released AS-11 specification, there are also key differences.

As reported in 2012 by Broadcast Now, Kevin Burrows, DPP Technical Standards Lead, said: ‘[The DPP standards] have a shim that can constrain some parameters for different uses; we don’t support Dolby E in the UK, although the [AMWA] standard allows it. Another difference is the format – 720 is not something we’d want as we’re standardising on 1080i. US timecode is different, and audio tracks are referenced as an EBU standard.’ Like NTSC and PAL video/DVD, then, the technical standards in the UK differ from those used in the US. We arguably need, therefore, to think about the interoperability of particular technical localities rather than make claims about the generalised formalisation of all technical systems. Dis-synchrony and technical differences remain despite standardisation.

The AmberFin Academy blog have also explored what they describe as the ‘interoperability dilemma’. They suggest that the DPP’s careful planning means their standards are likely to function in an efficient manner: ‘By tightly constraining the wrapper, video codecs, audio codecs and metadata schema, the DPP Technical Standards Group has created a format that has a much smaller test matrix and therefore a better chance of success. Everything in the DPP File Delivery Specification references a well defined, open standard and therefore, in theory, conformance to those standards and specification should equate to complete interoperability between vendors, systems and facilities.’ They do however offer these words of caution about user interpretation: ‘despite the best efforts of the people who actually write the standards and specifications, there are areas that are, and will always be, open to some interpretation by those implementing the standards, and it is unlikely that any two implementations will be exactly the same. This may lead to interoperability issues.’

It is clear that there is no one simple answer to the dilemma of interoperability and its implementation. Establishing a legal commitment, and a firm deadline date for the transition, is however a strong message that there is no turning back. Establishing the standard may also lead to a certain amount of technological stability, comparable to the development of the EIAJ video tape standards in 1969, the first standardised format for industrial/non-broadcast video tape recording. Amid these changes in professional broadcast standards, the increasingly loud call for standardisation among digital preservationists should also be acknowledged.

For analogue and digital tapes however, it may well signal the beginning of an accelerated end. The professional broadcast transition to ‘full-digital’ is a clear indication of tape’s obsolescence and vulnerability as an operable media format.

Software Across Borders? The European Archival Records and Knowledge Preservation (E-Ark) Project

Monday, February 24th, 2014

The latest big news from the digital preservation world is that the European Archival Records and Knowledge Preservation project (E-Ark), a three-year multinational research project, has received a £6M award from the European Commission ‘to create a revolutionary method of archiving data, addressing the problems caused by the lack of coherence and interoperability between the many different systems in use across Europe,’ report the Digital Preservation Coalition, who are partners in the project.

What is particularly interesting about the consortium E-Ark has brought together is that commercial partners will be part of a conversation that aims to establish long-term solutions for digital preservation across Europe. More often than not, commercial interests have driven the technological innovations used within digital preservation. This has made digital data difficult to manage for institutions both large and small, as the BBC’s Digital Media Initiative demonstrates, because the tools and protocols are always in flux. A lack of policy-level standards and established best practices has meant that the norm within digital information management has very much been permanent change.

Such a situation poses great risks for both digitised and born digital collections because information may have to be regularly migrated in order to remain accessible and ‘open’. As stated on the E-Ark website, ‘the practices developed within the project will reduce the risk of information loss due to unsuitable approaches to keeping and archiving of records. The project will be public facing, providing a fully operational archival service, and access to information for its users.’

Vectorscope

The E-Ark project will hopefully contribute to the creation of compatible systems that can respond to the different needs of groups working with digital information. Which is, of course, just about everybody right now: as the world economy becomes increasingly defined by information and ‘big data’, efficient and interoperable access to commercial and non-commercial archives will be an essential part of a vibrant and well functioning economic system. The need to establish data systems that can communicate and co-operate across software borders, as well as geographical ones, will become an economic necessity in years to come.

The task facing E-Ark is huge, but one crucial to implement if digital data is to survive and thrive in this brave new datalogical world of ours. As E-Ark explain: ‘Harmonisation of currently fragmented archival approaches is required to provide the economies of scale necessary for general adoption of end-to-end solutions. There is a critical need for an overarching methodology addressing business and operational issues, and technical solutions for ingest, preservation and re-use.’

Maybe 2014 will be the year when digital preservation standards start to become a reality. As we have already discussed on this blog, the US-based National Agenda for Digital Stewardship 2014 outlined the negative impact of continuous technological change and the need to create dialogue among technology makers and standards agencies. It looks like things are changing and much needed conversations are soon to take place, and we will of course reflect on developments on the Great Bear blog.

 

Digital Preservation – Establishing Standards and Challenges for 2014

Monday, January 13th, 2014

2014 will no doubt present a year of new challenges for those involved in digital preservation. A key issue remains the sustainability of digitisation practices within a world yet to establish firm standards and guidelines. Creating lasting procedures capable of working across varied and international institutions would bring some much needed stability to a profession often characterized by permanent change and innovation.

In 1969 the EIAJ-1 video tape standard was developed by the Electronic Industries Association of Japan. It was the first standardised format for industrial/non-broadcast video tape recording. Once implemented, it enabled video tapes to be played on machines made by different manufacturers, and it helped make video use cheaper and more widespread, particularly within a domestic context.

Close up of tape machine on the 'play', 'stop', 'rewind' button

The introduction of standards in the digitisation world would of course have very little impact on the widespread use of digital technologies, which are, in the west, largely ubiquitous. It would however make the business of digital preservation economically more efficient, simply because organisations would not be constantly adapting to change. For example, think of the costs involved in keeping up with rapid waves of technological transformation: updating equipment, migrating data and ensuring file integrity and operability are maintained are a few costly and time-consuming examples of what this would entail.

Although increasingly sophisticated digital forensic technology can help to manage some of these processes, highly trained (real life!) people will still be needed to oversee any large-scale preservation project. Within such a context, resource allocation will always have to account for these processes of adaptation. It has to be asked, then: could this money, time and energy be harnessed in other, more efficient ways? The costs of non-standardisation become ever more pressing when we consider the amount of digital data preserved by large institutions such as the British Library, whose digital collection is estimated to amass up to 5 petabytes (5,000 terabytes) by 2020. This is not a simple case of updating your iPhone to the next model, but an extremely complex and risky venture where the stakes are high. Do we really want to jeopardise rich forms of cultural heritage in the name of technological progress?

The US-based National Digital Stewardship Alliance (NDSA) National Agenda for Digital Stewardship 2014 echoes such a sentiment. They argue that ‘the need for integration, interoperability, portability, and related standards and protocols stands out as a theme across all of these areas of infrastructure development’ (3). The executive summary also stresses the negative impact rapid technological change can create, and the need to ‘coordinate to develop comprehensive coverage on critical standards bodies, and promote systematic community monitoring of technology changes relevant to digital preservation.’ (2)

File Format Action Plans

One step on the way to more secure standards is the establishment of File Format Action Plans, a practice which is being increasingly recommended by US institutions. The idea behind developing a file format action plan is to create a directory of file types that are in regular use by people in their day to day lives and by institutions. Getting it all down on paper can help us track what may be described as the implicit user-standards of digital culture. This is the basic idea behind Parsimonious Preservation, discussed on the blog last year: that through observing trends in file use we may come to the conclusion that the best preservation policy is to leave data well alone since in practice files don’t seem to change that much, rather than risk the integrity of information via constant intervention.
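As a minimal sketch of that first step – finding out which file types are actually in regular use – a short script can survey a directory tree (the path in the comment is hypothetical):

```python
# Survey which file formats a collection actually contains: the raw
# material for a File Format Action Plan.
from collections import Counter
from pathlib import Path

def survey_formats(root: str) -> Counter:
    """Count file extensions (case-insensitively) under a directory tree."""
    counts = Counter()
    for path in Path(root).rglob("*"):
        if path.is_file():
            counts[path.suffix.lower() or "(no extension)"] += 1
    return counts

# e.g. survey_formats("/archive").most_common(10) might show that
# .wav and .pdf dominate, suggesting where preservation attention
# and format-risk analysis should focus first.
```

A listing like this is also exactly the kind of evidence Parsimonious Preservation draws on: if the survey looks the same year after year, the safest action may be no action.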

As Lee Nilsson, who is currently working as a National Digital Stewardship Resident at the US Library of Congress writes, ‘specific file format action plans are not very common’, and when created are often subject to constant revision. Nevertheless he argues that devising action plans can ‘be more than just an “analysis of risk.” It could contain actionable information about software and formats which could be a major resource for the busy data manager.’

Other Preservation Challenges

What are the other main challenges facing 'digital stewards' in 2014? In a world of exponential information growth, making decisions about what we keep and what we don't becomes ever more pressing. When whole collections cannot be preserved, digital curators are increasingly called upon to select material deemed representative and relevant. How is it possible to know now what material needs to be preserved for posterity? What values inform our decision making?

To take an example from our work at Great Bear: we often receive tapes from artists who achieved little or no commercial success in their lifetimes, but whose work is often of great quality and can tell us volumes about a particular community or musical style. How does such work stand up against commercially successful recordings? Which is more valuable: the music that millions of people bought and enjoyed, or the music that no one has ever heard?

Ultimately these questions will become a central concern for digital stewards of audio data, particularly with the explosion of born-digital music cultures, which have enabled communities of informal and often non-commercial music makers to proliferate. How is it possible to know in advance what material will be valuable to people 20, 50 or 100 years from now? These are very difficult, if not impossible, questions for large institutions to grapple with and take responsibility for, which is why, as members of a digital information management society, it is necessary to empower ourselves with relevant information so we can make considered decisions about our own personal archives.

A final point to stress is that among the 'areas of concern' for digital preservation cited by the NDSA, moving image and recorded sound figure prominently, alongside other born-digital content such as electronic records, web and social media. Magnetic tape collections remain at high risk, and it is strongly recommended that you migrate this content to a digital format as soon as possible. While digitisation certainly creates many problems, as detailed above, magnetic tape is also threatened by physical deterioration and its own obsolescence challenges, in particular finding working machines on which to play back tape. The simple truth is that if you want to access material in your tape collections, it now needs to be stored in a resilient digital format. We can help, and can offer other advice relating to digital information management, so don't hesitate to get in touch.

Big Data, Long Term Digital Information Management Strategies & the Future of (Cartridge) Tape

Monday, November 18th, 2013

What is the most effective way to store and manage digital data in the long term? This is a question we have given considerable attention to on this blog, covering issues such as analogue obsolescence, digital sustainability and digital preservation policies. It remains an open question, and one that is up for serious debate.

We were inspired to write about this issue once again after reading an article published in the New Scientist a year ago called 'Cassette tapes are the future of big data storage.' The title is a little misleading, because the tape it refers to is not the domestic audio tape that has recently acquired so much counter-cultural kudos, but rather archival tape cartridges that can store up to 100 TB of data. How much?! I hear you cry. And why tape, given the ubiquity of digital technology these days? Aren't we all supposed to be 'going tapeless'?

The reason for such an invention, the New Scientist reveals, is the ‘Square Kilometre Array (SKA), the world’s largest radio telescope, whose thousands of antennas will be strewn across the southern hemisphere. Once it’s up and running in 2024, the SKA is expected to pump out 1 petabyte (1 million gigabytes) of compressed data per day.’

Image of the SKA dishes

Researchers at Fuji and IBM have already designed a tape that can store up to 35TB, and it is hoped that a 100TB tape will be developed to cope with the astronomical ‘annual archive growth [that] would swamp an experiment that is expected to last decades’. The 100TB cartridges will be made ‘by shrinking the width of the recording tracks and using more accurate systems for positioning the read-write heads used to access them.’
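To put these figures into perspective, here is a quick back-of-the-envelope calculation using the article's own numbers (1 petabyte per day, and the hoped-for 100 TB cartridge; the decimal unit conversion is our assumption):

```python
# Rough arithmetic based on the figures quoted above (decimal units assumed):
TB_PER_PB = 1000                  # 1 petabyte = 1,000 terabytes

daily_output_tb = 1 * TB_PER_PB   # SKA: ~1 PB of compressed data per day
cartridge_capacity_tb = 100       # the hoped-for 100 TB archival cartridge

cartridges_per_day = daily_output_tb / cartridge_capacity_tb
cartridges_per_year = cartridges_per_day * 365

print(cartridges_per_day)    # 10.0 cartridges filled every day
print(cartridges_per_year)   # 3650.0 per year, before any redundant copies
```

Even at 100 TB per cartridge, the telescope would fill thousands of cartridges a year, which is why squeezing ever more data onto each one matters so much.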

If successful, this would certainly be an impressive achievement in materials science and electronics. A smaller track width means less room for error in the read-write function, which will have to be incredibly precise on a tape storing such an extreme amount of information. Presumably a smaller track width also means there will be no space for guard bands. Guard bands are unrecorded areas between the stripes of recorded information, designed to prevent interference between adjacent tracks, or what is known as 'cross-talk'. They were used on larger video tape formats such as U-matic and VHS, but were dispensed with on smaller formats such as Hi8, which packed a higher density of magnetic information into a small space and used video heads with tilted gaps instead of guard bands.

The existence of SKA still doesn’t explain the pressing question: why develop new archival tape storage solutions and not hard drive storage?

Hard drives were embraced quickly because they take up less physical storage space than tape. Gone are the dusty rooms bursting with reel upon reel of bulky tape; hello stacks of infinite quick-fire data, whirring and purring all day and night. Yet when we consider the amount of energy hard drive storage requires to remain operable, the costs – both economic and ecological – dramatically increase.

The report compiled by the Clipper Group, published in 2010, argues overwhelmingly for the benefits of tape over disk for the long-term archiving of data. It states that 'disk is more than fifteen times more expensive than tape, based upon vendor-supplied list pricing, and uses 238 times more energy (costing more than the all costs for tape) for an archiving application of large binary files with a 45% annual growth rate, all over a 12-year period.'

This is probably quite staggering to read, given the amount of investment in establishing institutional architecture for tapeless digital preservation. Such an analysis of energy consumption does assume, however, that hard drives are powered on all the time, when in practice many organisations transfer archives to hard drives and only check them once every 6-12 months.
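The 45% annual growth rate in the Clipper scenario is also easy to underestimate. A minimal sketch (the 100 TB starting size is our invention, purely for illustration) shows how quickly such an archive compounds over the report's 12-year period:

```python
# Compound growth of a hypothetical archive at the Clipper scenario's
# 45% annual growth rate over 12 years (the starting size is illustrative).
start_tb = 100.0
annual_growth = 1.45

size_tb = start_tb
for year in range(12):
    size_tb *= annual_growth

# Roughly an 86-fold increase over the twelve years
print(round(size_tb))
```

An archive that grows 86-fold in twelve years makes the energy cost of keeping every disk spinning a very different proposition from shelving the equivalent tape.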

Yet given the pressures of technological obsolescence and the need to remain vigilant about file operability, coupled with the expectation that digital archives be quickly accessible (in contrast with tape, which can only be played back linearly), such energy consumption does seem fairly inescapable for large institutions in an increasingly voracious, 24/7 information culture. Of course, the issue of obsolescence will undoubtedly affect super-storage data tape cartridges as well. Technology does not stop innovating; it is not in the interests of the market to do so.

Perhaps more significantly, the archive world has not yet developed standards that address the needs of digital information managers. Henry Newman’s presentation at the Designing Storage Architectures 2013 conference explored the difficulty of digital data management, precisely due to the lack of established standards:

  • ‘There are some proprietary solutions available for archives that address end to end integrity;
  • There are some open standards, but none that address end to end integrity;
  • So, there are no open solutions that meet the needs of [the] archival community.’

He goes on to write that standards are 'technically challenging' and require 'years of domain knowledge and detailed understanding of the technology' to implement. Worryingly, perhaps, he writes that 'standards groups do not seem to be coordinating well from the lowest layers to the highest layers.' From this we can conclude that the lack of streamlined conversation around digital standards means that users and producers are effectively not working in synchrony. This makes digital information management a challenging issue, and it will continue to be so until their needs and interests are treated as mutual.

Other presentations at the recent annual meeting on Designing Storage Architectures for Digital Collections, which took place on September 23-24, 2013 at the Library of Congress, Washington, DC, also suggest there are limits to innovation in the realm of hard drive storage. Gary Decad of IBM delivered a presentation on 'The Impact of Areal Density and Millions of Square Inches of Produced Memory on Petabyte Shipments for TAPE, NAND Flash, and HDD Storage Class'.

For the lay (wo)man this basically translates as the capacity to develop computer memory stored on hard drives. We are used to living in a consumer society where new, improved gadgets appear all the time. Devices are getting smaller, and we seem to be able to buy more storage space at lower prices. For example, it now costs under £100 to buy a 3TB hard drive, and it is becoming increasingly difficult to purchase hard drives with less than 500GB of storage space. Only a year ago, a 1TB hard drive was top of the range and would probably have cost you about £100.

A 100TB storage unit in 2010, compared with a smaller hard drive symbolising 2020.

Does my data look big in this?

Yet the presentation from Gary Decad suggests we are reaching a plateau with this kind of storage technology – infinite memory growth and reduced costs will soon no longer be feasible. The presentation states that ‘with decreasing rates of areal density increases for storage components and with component manufactures reluctance to invest in new capacity, historical decreases in the cost of storage ($/GB) will not be sustained.’

Where does that leave us now? The resilience of tape as an archival solution, the energy implications of digital hard drive storage, the lack of established archival standards and a foreseeable end to cheap and easy big digital data storage, are all indications of the complex and confusing terrain of information management in the 21st century. Perhaps the Clipper report offers the most grounded appraisal: ‘the best solution is really a blend of disk and tape, but – for most uses – we believe that the vast majority of archived data should reside on tape.’ Yet it seems until the day standards are established in line with the needs of digital information managers, this area will continue to generate troubling, if intriguing, conundrums.


Trustpilot

designed and developed by
greatbear analogue and digital media ltd, 0117 985 0500
Unit 26, The Coach House, 2 Upper York Street, Bristol, BS2 8QN, UK


greatbear analogue and digital media is proudly powered by WordPress
hosted using Debian and Apache