Posts Tagged ‘Innovation’

The difference ten years makes: changes in magnetic tape recording and storage media

Tuesday, June 24th, 2014

Generational change for digital technologies is rapid and disruptive. 'In the digital context the next generation may only be five to ten years away!' Tom Gollins from the National Archives reminds us, and this seems like a fairly conservative estimate.

It can feel like the rate of change is continually accelerating, with new products appearing all the time. It is claimed, for example, that the phenomenon of 'wearable tech chic' is now upon us, with the announcement this week that Google Glass is available to buy for £1,000.

The impact of digital technologies has been felt throughout society, and this issue will be explored in a large immersive exhibition of art, design, film, music and videogames held at the Barbican July-Sept 2014. It is boldly and emphatically titled: Digital Revolution.

To bring such technological transformations back into focus with our work at Great Bear, consider this 2004 brochure that recently re-surfaced in our Studio. As an example of the rapid rate of technological change, you need look no further.

A mere ten years ago, you could choose between several brands of audio mini disc, ADAT, DAT, DTRS, Betacam SP, Digital Betacam, super VHS, VHS-C, 8mm and mini DV.

Storage media such as Zip disks, Jaz cartridges, Exabyte tapes and hard drives that could store between 36GB and 500GB of data were also available to purchase.

RMGI are currently the only manufacturer of professional open reel audio tape. In the 2004 catalogue, different brands of open reel analogue tape are listed at a third of 2014 retail prices, taking into account rates of inflation.

While some of the products included in the catalogue, namely CDs, DVDs and open reel tape, have maintained a degree of market resiliency due to practicality, utility or novelty, many have been swept aside in the march of technological progress that is both endemic and epidemic in the 21st century.



Significant properties – technical challenges for digital preservation

Monday, April 28th, 2014

A consistent focus of our blog is the technical and theoretical issues that emerge in the world of digital preservation. For example, we have explored the challenges archivists face when they have to appraise collections in order to select what materials are kept, and what are thrown away. Such complex questions take on specific dimensions within the world of digital preservation.

If you work in digital preservation then the term 'significant properties' will no doubt be familiar to you. The concept has been viewed as a hindrance, shrouded as it is in foggy terminology, and even as a distinct impossibility, given the diversity of digital objects in the world which, like their analogue counterparts, cannot be universally generalised or reduced to a series of measurable characteristics.

Cleaning an open reel-to-reel tape

In a technical sense, establishing a set of core characteristics for file formats has been important for initiatives like Archivematica, 'a free and open-source digital preservation system that is designed to maintain standards-based, long-term access to collections of digital objects.' Archivematica implements 'default format policies based on an analysis of the significant characteristics of file formats.' These systems manage digital information using an 'agile software development methodology' which 'is focused on rapid, iterative release cycles, each of which improves upon the system's architecture, requirements, tools, documentation, and development resources.'

Such a philosophy may elicit groans of frustration from information managers who may well want to leave their digital collections alone, and practise a culture of non-intervention. Yet this adaptive style of project management, which is designed to respond rapidly to change, is often contrasted with predictive development that focuses on risk assessment and the planning of long-term projects. The argument against predictive methodologies is that, as a management model, they can be unwieldy and unresponsive to change. This can have damaging financial consequences, particularly when investing in expensive, risky and large-scale digital preservation projects, as the BBC's failed DMI initiative demonstrates.

Indeed, agile software development methodology may well be an important key to the sustainability of digital preservation systems, which need to find practical ways of manoeuvring around technological innovation and the culture of perpetual upgrade. Agility in this context is synonymous with resilience, and the practical application of significant properties as a means to align file format interoperability offers a welcome anchor for a technological environment structured by persistent change.

Significant properties vs the authentic digital object

What significant properties imply, as archival concept and practice, is that desiring authenticity for the digitised and born-digital objects we create is likely to end in frustration. Simply put, preserving all the information that makes up a digital object is a hugely complex affair, and is a procedure that will require numerous and context-specific technical infrastructures.

As Trevor Owens explains: 'you can't just "preserve it" because the essence of what matters about "it" is something that is contextually dependent on the way of being and seeing in the world that you have decided to privilege.' Owens uses the example of the GeoCities web archiving project to demonstrate that if you don't have the correct, let's say 'authentic', tools to interpret a digital object (in this case, a website that is only discernible on certain browsers), you simply cannot see the information accurately. Part of the signal is always missing, even if something 'significant' remains (the text or parts of the graphics).

It may be desirable 'to preserve all aspects of the platform in order to get at the historicity of the media practice', Jonathan Sterne, author of MP3: The Meaning of a Format, suggests, but in a world that constantly displaces old technological knowledge with new, settling for the preservation of significant properties may be a pragmatic rather than ideal solution.

Analogue to digital issues

To bring these issues back to the tape we work with at Great Bear, there are of course times when it is important to use the appropriate hardware to play the tapes back, and there is a certain amount of historically specific technical knowledge required to make the machines work in the first place. We often wonder what will happen to the specialised knowledge learnt by media engineers in the 70s, 80s and 90s, who operated tape machines that are now obsolete. There is the risk that when those people die, the knowledge will die with them. Of course it is possible to get hold of operating manuals, but this is by no means a guarantee that the mechanical techniques will be understood within a historical context that is increasingly tape-less and software-based. By keeping our wide selection of audio and video tape machines purring, we are sustaining a machinic-industrial folk knowledge which ultimately helps to keep our customers' magnetic tape-based media memories alive.

Of course a certain degree of historical accuracy is required in the transfers because, very obviously, you can’t play a V2000 tape on a VHS machine, no matter how hard you try!

Yet the need to play back tapes on exactly the same machine becomes less important in instances where the original tape was recorded on a domestic reel-to-reel recorder, such as the Grundig TK series, which may not have been of the greatest quality in the first place. To get the best digital transfer it is desirable to play back tapes on a machine with higher specifications that can read the magnetic information on the tape as fully as possible. This is because you don’t want to add any more errors to the tape in the transfer process by playing it back on a lower quality machine, which would then of course become part of the digitised signal.

It is actually very difficult to remove things like wow and flutter after a tape has been digitised, so it is far better to ensure machines are calibrated appropriately before the tape is migrated, even if the tape was not originally recorded on a machine with professional specifications. What is ultimately at stake in transferring analogue tape to digital formats is the quality of the signal. Absolute authenticity is incidental here, particularly if things sound bad.

The moral of this story, if there can be one, is that with any act of transmission, the recorded signal is liable to change. These can be slight alterations or huge drop-outs and everything in-between. The agile software developers know that given the technological conditions in which current knowledge is produced and preserved, transformation is inevitable and must be responded to. Perhaps it is realistic to assume this is the norm in society today, and creating digital preservation systems that are adaptive is key to the survival of information, as well as accepting that preserving the ‘full picture’ cannot always be guaranteed.

2″ Quad Video Tape Transfers – new service offered

Monday, March 3rd, 2014

We are pleased to announce that we are now able to support the transfer of 2″ Quadruplex Video Tape (PAL, SECAM & NTSC) to digital formats.

Quadruplex Scanning Diagram

2” Quad was a popular broadcast analogue video tape format whose halcyon period ran from the late 1950s to the 1970s. The first quad video tape recorder made by AMPEX in 1956 cost a modest $45,000 (that’s $386,993.38 in today’s money).

2” Quad revolutionized TV broadcasting which previously had been reliant on film-based formats, known in the industry as ‘kinescope‘ recordings. Kinescope film required significant amounts of skilled labour as well as time to develop, and within the USA, which has six different time zones, it was difficult to transport the film in a timely fashion to ensure broadcasts were aired on schedule.

To counter these problems, broadcasters sought to adapt the magnetic recording methods that had proved so successful for audio for use in the television industry.

The first experiments directly adapted the longitudinal recording method used to record analogue audio. This, however, was not successful because video recordings require far more bandwidth than audio. Recording a video signal with stationary tape heads (as they are in the longitudinal method) meant that the tape had to run at a very high speed in order to accommodate sufficient bandwidth to reproduce a good quality video image. A lot of tape was used!
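The scale of the problem can be sketched with some rough arithmetic. The speeds and bandwidths below are illustrative assumptions rather than historical specifications, but they show why stationary-head video recording was a dead end:

```python
# Back-of-envelope: why longitudinal video recording consumed so much tape.
# Assumption (illustrative only): with a fixed head gap, recordable
# bandwidth scales roughly linearly with tape-to-head speed.

audio_speed_ips = 15            # typical professional audio speed, inches/sec
audio_bandwidth_hz = 20_000     # audio bandwidth at that speed
video_bandwidth_hz = 5_000_000  # rough bandwidth of an analogue video signal

scale = video_bandwidth_hz / audio_bandwidth_hz  # 250x the bandwidth
video_speed_ips = audio_speed_ips * scale        # tape speed needed

tape_feet_per_minute = video_speed_ips * 60 / 12
print(f"required tape speed: {video_speed_ips:,.0f} inches/sec")
print(f"tape consumed: {tape_feet_per_minute:,.0f} feet per minute")
```

Under these assumptions a single minute of video would swallow thousands of feet of tape, which is exactly why moving the heads across the tape, rather than the tape past the heads, was the breakthrough.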

Ampex, who at the time owned the trademark marketing name for 'videotape', then developed a method where the tape heads moved quickly across the tape, rather than the other way round. On the 2" quad machine, four magnetic record/reproduce heads are mounted on a headwheel spinning transversely (width-wise) across the tape, striking the tape at a 90° angle. The recording method was not without problems because, as the Toshiba Science Museum notes, it 'combined the signal segments from these four heads into a single video image', which meant that 'some colour distortion arose from the characteristics of the individual heads, and joints were visible between signal segments.'

Quad scanning

The limitations of Quadruplex recording influenced the development of the helical scan method, that was invented in Japan by Dr. Kenichi Sawazaki of the Mazda Research Laboratory, Toshiba, in 1954. Helical scanning records each segment of the signal as a diagonal stripe across the tape. ‘By forming a single diagonal, long track on two-inch-wide tape, it was possible to record a video signal on one tape using one head, with no joints’, resulting in a smoother signal. Helical scanning was later widely adopted as a recording method in broadcast and domestic markets due to its simplicity, flexibility, reliability and economical use of tape.

This brief history charting the development of 2″ Quad recording technologies reveals that efficiency and cost-effectiveness, alongside media quality, were key factors driving the innovation of video tape recording in the 1950s.


Digital Optical Technology System – ‘A non-magnetic, 100 year, green solution for data storage.’

Monday, January 27th, 2014

‘A non-magnetic, 100 year, green solution for data storage.’

This is the stuff of digital information managers’ dreams. No more worrying about active data management, file obsolescence or that escalating energy bill.

Imagine how simple life would be if there was a way to store digital information that could last, without intervention, for nearly 100 years. Those precious digital archives could be stored in a warehouse that was not climate controlled, because the storage medium was resilient enough to withstand irregular temperatures.

Imagine after 100 years an archivist enters that very same warehouse to retrieve information requested by a researcher. The archivist pulls a box off the shelf and places it on the table. In their bag they have a powerful magnifying glass which they use to read the information. Having ascertained they have the correct item, they walk out the warehouse, taking the box with them. Later that day, instructions provided as part of the product licensing over 100 years ago are used to construct a reader that will retrieve the data. The information is recovered and, having assessed the condition of the storage medium which seems in pretty good nick, the digital optical technology storage is taken back to the warehouse where it sits for another 10 years, until it is subject to its life-cycle review.

Does this all sound too good to be true? For anyone exposed to the constantly changing world of digital preservation, the answer would almost definitely be yes. We have already covered on this blog numerous issues that the contemporary digital information manager may face. The lack of standardisation in technical practices and the bewildering array of theories about how to manage digital data mean there is currently no 'one size fits all' solution to tame the archive of born-digital and digitised content, which is estimated to swell to 3,000 exabytes (an exabyte is a thousand petabytes) by 2020*. We have also covered the growing concerns about the ecological impact of digital technologies, such as e-waste and energy over-consumption. With this in mind, the news that a current technology exists that can bypass many of these problems will seem like manna from heaven. What can this technology be, and why have you never heard about it?

The technology in question is called DOTS, which stands for Digital Optical Technology System. The technology is owned and being developed by Group 47, who 'formed in 2008 in order to secure the patents, designs, and manufacturing processes for DOTS, a proven 100-year archival technology developed by the Eastman Kodak Company.' DOTS is refreshingly different from every other data storage solution on the market because it 'eliminates media and energy waste from forced migration, costly power requirements, and rigid environmental control demands'. What's more, DOTS is 'designed to be "plug & play compatible" with the existing Linear Tape Open (LTO) tape-based archiving systems & workflow'.

In comparison with other digital information management systems that can employ complex software, the data imaged by DOTS does not use sophisticated technology. John Lafferty writes that at ‘the heart of DOTS technology is an extremely stable storage medium – metal alloy sputtered onto mylar tape – that undergoes a change in reflectivity when hit by a laser. The change is irreversible and doesn’t alter over time, making it a very simple yet reliable technology.’

DOTS can survive the benign neglect all data experiences over time, and can also withstand pretty extreme neglect. During research and development, for example, DOTS was subjected to a series of accelerated environmental ageing tests which concluded 'there was no discernible damage to the media after the equivalent of 95.7 years.' But the testing did not stop there. Since acquiring patents for the technology, Group 47,

‘has subjected samples of DOTS media to over 72 hours of immersion each in water, benzine, isopropyl alcohol, and Clorox (™) Toilet Bowl Cleaner. In each case, there was no detectable damage to the DOTS media. However, when subjected to the citric acid of Sprite carbonated beverage, the metal had visibly deteriorated within six hours.’

Robust indeed! DOTS is also non-magnetic, chemically inert, immune from electromagnetic fields and can be stored in normal office environments or extremes ranging from -9°C to 65°C. It ticks all the boxes really.

DOTS vs the (digital preservation) world

The only discernible benefit of the 'open all hours', random access digital information culture over a storage solution such as DOTS is accessibility. While it certainly is amazing how quick and easy it is to retrieve valuable data at the click of a button, this perhaps should not be the priority when we are planning how best to take care of the information we create, and are custodians of. The key words here are valuable data. Emerging norms in digital preservation, which emphasise the need to always be responsive to technological change, take gambles with the very digital information they seek to preserve, because there is always a risk that migration will compromise the integrity of data.

The constant management of digital data is also costly, disruptive and time-consuming. In the realm of cultural heritage, where organisations are inevitably under resourced, making sure your digital archives are working and accessible can sap energy and morale. These issues of course affect commercial organisations too. The truth is the world is facing an information epidemic, and surely we would all rest easier if we knew our archives were safe and secure. Indeed, it seems counter-intuitive that amid the endless flashy devices and research expertise in the world today, we are yet to establish sustainable archival solutions for digital data.


Of course, using a technology like DOTS need not mean we abandon the culture of access enabled by file-based digital technologies. It may however mean that the digital collections available on instant recall are more carefully curated. Ultimately we have to ask if privileging the instant access of information is preferable to long-term considerations that will safeguard cultural heritage and our planetary resources.

If such a consideration errs on the side of moderation and care, technology's role in shaping that hazy zone of expectancy known as 'the future' needs to shift from the 'bigger, faster, quicker, newer' model to a more cautious appreciation of the long term. Such an outlook is built into the DOTS technology, demonstrating that to be 'future proof' a technology need not only withstand environmental challenges, such as flooding or extreme temperature change, but must also be 'innovation proof' by being immune to the development of new technologies. As John Lafferty writes, the license bought with the product 'would also mandate full backward compatibility to Generation Zero, achievable since readers capable of reading greater data densities should have no trouble reading lower density information.' DOTS also does not use proprietary codecs; as Chris Castaneda reports, 'the company's plan is to license the DOTS technology to manufacturers, who would develop and sell it as a non-proprietary system.' Nor does it require specialist machines to be read. With breathtaking simplicity, 'data can be recovered with a light and a lens.'

It would be wrong to assume that Group 47's development of DOTS is not driven by commercial interests – it clearly is. DOTS does however seem to solve many of the real problems that currently afflict the responsible and long-term management of digital information. It will be interesting to see if the technology is adopted, and by whom. Watch this space!

* According to a 2011 Enterprise Strategy Group Archive TCO Study

Digital Preservation – Establishing Standards and Challenges for 2014

Monday, January 13th, 2014

2014 will no doubt present a year of new challenges for those involved in digital preservation. A key issue remains the sustainability of digitisation practices within a world yet to establish firm standards and guidelines. Creating lasting procedures capable of working across varied and international institutions would bring some much needed stability to a profession often characterized by permanent change and innovation.

In 1969 the EIAJ-1 video tape format was developed by the Electronic Industries Association of Japan. It was the first standardized format for industrial/non-broadcast video tape recording. Once implemented it enabled video tapes to be played on machines made by different manufacturers, and it helped to make video use cheaper and more widespread, particularly within a domestic context.

Close up of tape machine on the 'play', 'stop', 'rewind' button

The introduction of standards in the digitisation world would of course have very little impact on the widespread use of digital technologies which are, in the west, largely ubiquitous. It would however make the business of digital preservation economically more efficient, simply because organisations would not be constantly adapting to change. For example, think of the costs involved in keeping up with rapid waves of technological transformation: updating equipment, migrating data and ensuring file integrity and operability are maintained are just a few costly and time-consuming examples of what this would entail.

Although increasingly sophisticated digital forensic technology can help to manage some of these processes, highly trained (real life!) people will still be needed to oversee any large-scale preservation project. Within such a context resource allocation will always have to account for these processes of adaptation. It has to be asked then: could this money, time and energy be harnessed in other, more efficient ways? The costs of non-standardisation become ever more pressing when we consider the amount of digital data preserved by large institutions such as the British Library, whose digital collection is estimated to amass up to 5 petabytes (5,000 terabytes) by 2020. This is not a simple case of updating your iPhone to the next model, but an extremely complex and risky venture where the stakes are high. Do we really want to jeopardise rich forms of cultural heritage in the name of technological progress?

The US-based National Digital Stewardship Alliance (NDSA) National Agenda for Digital Stewardship 2014 echoes such a sentiment. They argue that ‘the need for integration, interoperability, portability, and related standards and protocols stands out as a theme across all of these areas of infrastructure development’ (3). The executive summary also stresses the negative impact rapid technological change can create, and the need to ‘coordinate to develop comprehensive coverage on critical standards bodies, and promote systematic community monitoring of technology changes relevant to digital preservation.’ (2)

File Format Action Plans

One step on the way to more secure standards is the establishment of File Format Action Plans, a practice which is being increasingly recommended by US institutions. The idea behind developing a file format action plan is to create a directory of file types that are in regular use by people in their day to day lives and by institutions. Getting it all down on paper can help us track what may be described as the implicit user-standards of digital culture. This is the basic idea behind Parsimonious Preservation, discussed on the blog last year: that through observing trends in file use we may come to the conclusion that the best preservation policy is to leave data well alone since in practice files don’t seem to change that much, rather than risk the integrity of information via constant intervention.

As Lee Nilsson, who is currently working as a National Digital Stewardship Resident at the US Library of Congress writes, ‘specific file format action plans are not very common’, and when created are often subject to constant revision. Nevertheless he argues that devising action plans can ‘be more than just an “analysis of risk.” It could contain actionable information about software and formats which could be a major resource for the busy data manager.’
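To make this concrete, here is a hypothetical sketch of what entries in such a plan might record. The field names and the risk judgements below are purely illustrative, not drawn from any real institution's plan:

```python
# A hypothetical sketch of one way to structure a file format action plan.
# Field names and risk assessments are illustrative only.

action_plan = {
    "WAV (LPCM)": {
        "extensions": [".wav"],
        "risk": "low",   # widely supported, openly documented
        "action": "retain as-is; verify fixity annually",
    },
    "RealAudio": {
        "extensions": [".ra", ".rm"],
        "risk": "high",  # playback software increasingly hard to obtain
        "action": "migrate to WAV while players are still available",
    },
}

# The 'actionable information' a busy data manager needs: query the
# plan for formats that require attention now.
at_risk = [name for name, entry in action_plan.items()
           if entry["risk"] == "high"]
print(at_risk)  # ['RealAudio']
```

Even a directory this simple turns a vague sense of risk into something that can be queried, reviewed and revised, which is precisely the step up from an 'analysis of risk' described above.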

Other Preservation Challenges

Analogue to Digital Converter close up

What are the other main challenges facing 'digital stewards' in 2014? In a world of exponential information growth, making decisions about what we keep and what we don't becomes ever more pressing. When whole collections cannot be preserved, digital curators are increasingly called upon to select material deemed representative and relevant. How is it possible to know now what material needs to be preserved for posterity? What values inform our decision making?

To take an example from our work at Great Bear: we often receive tapes from artists who have achieved little or no commercial success in their life times, but whose work is often of great quality and can tell us volumes about a particular community or musical style. How does such work stand up against commercially successful recordings? Which one is more valuable? The music that millions of people bought and enjoyed or the music that no one has ever heard?

Ultimately these questions will come to occupy a central concern for digital stewards of audio data, particularly with the explosion of born-digital music cultures which have enabled communities of informal and often non-commercial music makers to proliferate. How is it possible to know in advance what material will be valuable for people 20, 50 or 100 years from now? These are very difficult, if not impossible questions for large institutions to grapple with, and take responsibility for. Which is why, as members of a digital information management society, it is necessary to empower ourselves with relevant information so we can make considered decisions about our own personal archives.

A final point to stress is that among the 'areas of concern' for digital preservation cited by the NDSA, moving image and recorded sound figure highly, alongside other born-digital content such as electronic records, web and social media. Magnetic tape collections remain high risk and it is highly recommended that you migrate this content to a digital format as soon as possible. While digitisation certainly creates many problems, as detailed above, magnetic tape is also threatened by physical deterioration and its own obsolescence challenges, in particular finding working machines to play tape back on. The simple truth is, if you want to access material in your tape collections, it now needs to be stored in a resilient digital format. We can help, and offer other advice relating to digital information management, so don't hesitate to get in touch.

Big Data, Long Term Digital Information Management Strategies & the Future of (Cartridge) Tape

Monday, November 18th, 2013

What is the most effective way to store and manage digital data in the long term? This is a question we have given considerable attention to on this blog. We have covered issues such as analogue obsolescence, digital sustainability and digital preservation policies. It seems that as a question it remains unanswered and up for serious debate.

We were inspired to write about this issue once again after reading an article that was published in the New Scientist a year ago called ‘Cassette tapes are the future of big data storage.’ The title is a little misleading, because the tape it refers to is not the domestic audio tape that has recently acquired much counter cultural kudos, but rather archival tape cartridges that can store up to 100 TB of data. How much?! I hear you cry! And why tape given the ubiquity of digital technology these days? Aren’t we all supposed to be ‘going tapeless’?

The reason for such an invention, the New Scientist reveals, is the ‘Square Kilometre Array (SKA), the world’s largest radio telescope, whose thousands of antennas will be strewn across the southern hemisphere. Once it’s up and running in 2024, the SKA is expected to pump out 1 petabyte (1 million gigabytes) of compressed data per day.’


Image of the SKA dishes

Researchers at Fuji and IBM have already designed a tape that can store up to 35TB, and it is hoped that a 100TB tape will be developed to cope with the astronomical ‘annual archive growth [that] would swamp an experiment that is expected to last decades’. The 100TB cartridges will be made ‘by shrinking the width of the recording tracks and using more accurate systems for positioning the read-write heads used to access them.’
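The arithmetic behind that jump is simple but revealing. The figures below use only the two capacities quoted above, with the simplifying assumption that all of the density gain comes from narrower tracks:

```python
# Rough arithmetic on the capacity jump from a 35 TB to a 100 TB cartridge.
# Simplifying assumption: the entire density gain comes from shrinking
# track width, with linear bit density unchanged. Illustrative only.

current_tb = 35
target_tb = 100

density_factor = target_tb / current_tb      # ~2.86x areal density needed
track_width_fraction = 1 / density_factor    # tracks shrink to ~35% width

print(f"density increase needed: {density_factor:.2f}x")
print(f"track width relative to today: {track_width_fraction:.0%}")
```

In practice linear bit density and head-positioning accuracy would also improve, so real tracks would not need to shrink quite this far, but the sketch shows why the read-write heads must become so much more precise.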

If successful, this would certainly be an advanced achievement in material science and electronics. Smaller tape width means less room for error on the read-write function – this will have to be incredibly precise on a tape that will be storing a pretty extreme amount of information. Presumably smaller tape width will also mean there will be no space for guard bands either. Guard bands are unrecorded areas between the stripes of recorded information that are designed to prevent information interference, or what is known as 'cross-talk'. They were used on larger video tape formats such as U-Matic and VHS, but were dispensed with on smaller formats such as Hi-8, which had a higher density of magnetic information in a small space, and used video heads with tilted gaps instead of guard bands.

The existence of SKA still doesn’t explain the pressing question: why develop new archival tape storage solutions and not hard drive storage?

Hard drives were embraced quickly because they take up less physical storage space than tape. Gone are the dusty rooms bursting with reel upon reel of bulky tape; hello stacks of infinite quick-fire data, whirring and purring all day and night. Yet when we consider the amount of energy hard drive storage requires to remain operable, the costs – both economic and ecological – dramatically increase.

The report compiled by the Clipper Group published in 2010 overwhelmingly argues for the benefits of tape over disk for the long-term archiving of data. They state that 'disk is more than fifteen times more expensive than tape, based upon vendor-supplied list pricing, and uses 238 times more energy (costing more than all the costs for tape) for an archiving application of large binary files with a 45% annual growth rate, all over a 12-year period.'

This is probably quite staggering to read, given the amount of investment in establishing institutional architecture for tape-less digital preservation. Such an analysis of energy consumption does assume, however, that hard drives are turned on all the time, when surely many organisations transfer archives to hard drives and only check them once every 6-12 months.
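Whatever one makes of the energy assumptions, the report's growth scenario is worth pausing on. A quick calculation, using a purely illustrative starting size, shows how a 45% annual growth rate compounds over the 12-year period the report models:

```python
# The Clipper Group scenario assumes 45% annual archive growth over
# 12 years. The starting size is a hypothetical placeholder; the point
# is the compounding, not the absolute figure.

archive_tb = 100.0   # illustrative starting archive size
growth_rate = 0.45
years = 12

for _ in range(years):
    archive_tb *= 1 + growth_rate

print(f"archive after {years} years: {archive_tb:,.0f} TB")
```

At that rate an archive multiplies roughly 86-fold, which is why small per-terabyte differences in cost and energy come to dominate long-term budgets.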

Yet due to the pressures of technological obsolescence and the need to remain vigilant about file operability, coupled with the functional purpose of digital archives to be quickly accessible in comparison with tape that can only be played back linearly, such energy consumption does seem fairly inescapable for large institutions in an increasingly voracious, 24/7 information culture. Of course the issue of obsolescence will undoubtedly affect super-storage-data tape cartridges as well. Technology does not stop innovating – it is not in the interests of the market to do so.

Perhaps more significantly, the archive world has not yet developed standards that address the needs of digital information managers. Henry Newman’s presentation at the Designing Storage Architectures 2013 conference explored the difficulty of digital data management, precisely due to the lack of established standards:

  • ‘There are some proprietary solutions available for archives that address end to end integrity;
  • There are some open standards, but none that address end to end integrity;
  • So, there are no open solutions that meet the needs of [the] archival community.’

He goes on to write that standards are ‘technically challenging’ and require ‘years of domain knowledge and detailed understanding of the technology’ to implement. Worryingly perhaps, he writes that ‘standards groups do not seem to be coordinating well from the lowest layers to the highest layers.’ In other words, the conversation around digital standards is fragmented: users and producers are not working in synchrony. This makes digital information management a challenging field, and it will remain so until their needs and interests are treated as mutual.

Other presentations at the annual Designing Storage Architectures for Digital Collections meeting, which took place on September 23-24, 2013 at the Library of Congress, Washington, DC, also suggest there are limits to innovation in the realm of hard drive storage. Gary Decad of IBM delivered a presentation on ‘The Impact of Areal Density and Millions of Square Inches of Produced Memory on Petabyte Shipments for TAPE, NAND Flash, and HDD Storage Class‘.

For the lay (wo)man this basically translates as the capacity to develop computer memory stored on hard drives. We are used to living in a consumer society where new, improved gadgets appear all the time. Devices are getting smaller and we seem to be able to buy more storage space at cheaper prices. For example, it now costs under £100 to buy a 3TB hard drive, and it is becoming increasingly difficult to purchase hard drives with less than 500GB of storage space. Only a year ago, a 1TB hard drive was top of the range and would probably have cost you about £100.

A 100TB storage unit in 2010, compared with a smaller hard drive symbolising 2020.

Does my data look big in this?

Yet the presentation from Gary Decad suggests we are reaching a plateau with this kind of storage technology – infinite memory growth and reduced costs will soon no longer be feasible. The presentation states that ‘with decreasing rates of areal density increases for storage components and with component manufacturers’ reluctance to invest in new capacity, historical decreases in the cost of storage ($/GB) will not be sustained.’
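
To see why that matters for archive budgets, consider a quick back-of-the-envelope comparison (our illustration only: the $0.05/GB starting price and 30% annual decline are assumed figures, not numbers from the presentation) of a world where $/GB keeps falling versus one where it plateaus:

```python
# Illustration of the plateau claim: if $/GB stops falling at its historical
# rate, long-range storage budgets diverge sharply. The starting price and
# decline rate below are assumptions, not figures from Decad's presentation.

def cost_per_gb(start, annual_decline, years):
    """Price per GB after compounding annual price decline."""
    return start * (1 - annual_decline) ** years

start = 0.05  # assumed: $0.05/GB today
for years in (5, 10):
    falling = cost_per_gb(start, 0.30, years)  # historical-style decline
    flat = start                               # plateau scenario
    print(f"after {years} yrs: falling ${falling:.4f}/GB vs flat ${flat:.2f}/GB")
```

If the historical decline holds, a growing archive's storage bill stays roughly flat; if prices plateau while data keeps growing, the bill grows in step with the data – which is the scenario Decad warns of.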

Where does that leave us now? The resilience of tape as an archival solution, the energy implications of hard drive storage, the lack of established archival standards and a foreseeable end to cheap and easy big-data storage are all indications of the complex and confusing terrain of information management in the 21st century. Perhaps the Clipper report offers the most grounded appraisal: ‘the best solution is really a blend of disk and tape, but – for most uses – we believe that the vast majority of archived data should reside on tape.’ Yet until standards are established in line with the needs of digital information managers, this area will continue to generate troubling, if intriguing, conundrums.

Repairing obsolete media – remembering how to fix things

Monday, June 17th, 2013

A recent news report on the BBC website about recycling and repairing ‘old’ technology resonates strongly with the work of Great Bear.

The story focused on the work of the Restart Project, a charity that encourages positive behavioural change by empowering people to use their electronics for longer. Their website states:

the time has come to move beyond the culture of incessant electronics upgrades and defeatism in the face of technical problems. We are preparing the ground for a future economy of maintenance and repair by reskilling, supporting repair entrepreneurs, and helping people of all walks of life to be more resilient.

We are all familiar with the pressure to adopt new technologies and throw away the old, but what are the consequences of living in such a disposable culture? The BBC report describes how ‘in developed nations people have lost the will to fix broken gadgets. A combination of convenience and cultural pressure leads people to buy new rather than repair.’

These tendencies have been theorised by French philosopher of technology Bernard Stiegler as the loss of knowledge of how to live (savoir-vivre). Here people lose not only basic skills (such as how to repair a broken electronic device), but are also increasingly reliant on the market apparatus to provide for them (for example, the latest new product when the ‘old’ one no longer works).

A lot of the work of Great Bear revolves around repairing consumer electronics from bygone eras. Our desks are awash with soldering irons, hot air rework stations, circuit boards, capacitors, automatic wire strippers and a whole host of other tools.


We have bookshelves full of operating manuals. These help us navigate the machinery in the absence of a skilled engineer trained to fix an MII, U-Matic or D3 tape machine.

As providers of a digitisation service we know that maintaining obsolete machines appropriate to the transfer is the only way we can access tape-based media. But the knowledge and skills of how to do so are rapidly disappearing – unless of course they are actively remembered through practice.

The Restart Project offers a community-orientated counterpoint to the erosion of skills and knowledge tacitly promoted by the current consumer culture. Promoting values of maintenance and repair opens up the possibility for sustainable, rather than throwaway, uses of technology.

Even if the Restart Project doesn’t catch on as widely as it deserves to, Great Bear will continue to collect, maintain and repair old equipment until the very last tape head on earth is worn down.

Archiving for the digital long term: information management and migration

Monday, June 3rd, 2013

As an archival process, digitisation offers the promise of a dream: improved accessibility, preservation and storage.

However the digital age is not without its archival headaches. News of the BBC’s plans to abandon their Digital Media Initiative (DMI), which aimed to make the BBC media archive ‘tapeless’, clearly demonstrates this. As reported in The Guardian:

‘DMI has cost £98.4m, and was meant to bring £95.4m of benefits to the organisation by making all the corporation’s raw and edited video footage available to staff for re-editing and output. In 2007, when the project was conceived, making a single TV programme could require 70 individual video-handling processes; DMI was meant to halve that.’

The project’s failure has been explained by its size and ambition. Another telling reason was cited: the software and hardware used to deliver the project were developed for exclusive use by the BBC. In a statement, BBC Director-General Tony Hall referred to the fast development of digital technology, stating that ‘off-the-shelf [editing] tools were now available that could do the same job “that simply didn’t exist five years ago”.’

G-Tech Pro hard drive RAID array.

The fate of the DMI initiative should act as a sobering lesson for institutions, organisations and individuals who have not thought about digitisation as a long, rather than short term, archival solution.

As technology continues to ‘innovate’ at a startling rate, it is hard to predict how long current archival standards for audio and audio-visual material will last.

Being an early adopter of technology can be an attractive proposition: you are up to date with the latest ideas, flying the flag for the cutting edge. Yet new technology becomes old fast, and this potentially creates problems for accessing and managing information. The fragility of digital data comes to the fore, and the risk of investing all our archival dreams in exclusive technological formats as the BBC did, becomes far greater.


In order for our data to survive we need to appreciate that we are living in what media theorist Jussi Parikka calls an ‘information management society.’ Digitisation has made it patently clear that information is dynamic rather than stored safely in static objects. Migrating tape based archives to digital files is one stage in a series of transitions material can potentially make in its lifetime.

Given the evolution of media and technology in the 20th and 21st centuries, it feels safe to speculate that new technologies will emerge to supplant uncompressed WAV and AIFF files, just as AAC has now become preferred to MP3 as a compressed audio format because it achieves better sound quality at similar bit rates.

Because of this, at Great Bear we always migrate analogue and digital magnetic tape at the recommended archival standard, and provide customers with high quality and access copies. Furthermore, we strongly recommend that customers back up archive-quality files in at least three separate locations, because it is highly likely the data will need to be migrated again in the future.
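
Keeping three copies is only half the job: you also need to know the copies are still intact. The sketch below (our own illustration, not Great Bear's actual workflow) shows the basic idea of a fixity check – record a SHA-256 checksum for each migrated master file, then periodically re-verify every backup copy against it:

```python
# A minimal fixity-check sketch (an illustration, not a production workflow):
# hash the migrated master file once, then compare each backup copy's digest
# against it during periodic inspections.
import hashlib

def sha256_of(path):
    """Return the SHA-256 hex digest of a file, read in 1 MB chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_copies(master, copies):
    """Map each backup copy's path to True/False: does it match the master?"""
    expected = sha256_of(master)
    return {copy: sha256_of(copy) == expected for copy in copies}
```

Running such a check during the 6-12 month inspections mentioned earlier turns a passive backup into an actively verified one: a False result flags a copy that should be re-made from a known-good sibling before the rot spreads further.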

The Grain of Video Tape

Tuesday, January 8th, 2013


From U-Matic to VHS, Betacam to Blu-ray, Standard Definition to High Definition, the formats we use to watch visual media are constantly evolving.

Yet have you ever paused to consider what is at stake in the changing way audio-visual media is presented to us? Is viewing High Definition film and television always a better experience than previous formats? What is lost when the old form is supplanted by the new?

At Great Bear we have the pleasure of seeing the different textures, tones and aesthetics of tape-based Standard Definition video on a daily basis. The fuzzy grain of these videos contrasts starkly with the crisp, heightened colours of High Definition digital media we are increasingly used to seeing now on television, smartphones and tablets.

This is not, however, a romantic retreat to all things analogue in the face of an unstoppable digital revolution – although scratch the surface of contemporary culture and you will find many people making exactly that retreat.

At Great Bear we always have one foot in the past, and one foot in the future. We act as a conduit between old and new media, ensuring that data stored on older media can continue to have a life in today’s digital intensive environments.



designed and developed by
greatbear analogue and digital media ltd, 0117 985 0500
Unit 26, The Coach House, 2 Upper York Street, Bristol, BS2 8QN, UK

greatbear analogue and digital media is proudly powered by WordPress
hosted using Debian and Apache