The way women subverted the language of patriarchy is evident on the tape box. We digitised the ‘MISTRESS’ copy, not the master copy.
Seeing the mistress copy today is a reminder of the way gendered language influences how we can think about cultural forms. The master copy, of course, in conventional understanding, is the finished article, the final cut. The master of the house – the person in charge – is gendered male. Yet is this still the case?
‘If we find that audio-technical discourse renders signal processing in terms of masculinist languages of mastery and domination of nature, can we help but wonder after its broader social implications? Does it not also suggest a gendered set of relations to these technologies? Is it any wonder we still find the design, implementation, marketing, and use of audio-signal processing technologies to be male-dominated fields? [To change things] it will require fundamentally rethinking how we model, describe, interact, and sound with signal processing technologies’.
For feminist women who felt systematically excluded from certain kinds of cultural and economic activity, the gendering of language was an extension of violence they experienced because they were women.
Making the tape a MISTRESS may help rectify the problem, as does crossing out the very idea of a master copy.
The magnetic viewer makes the mysterious tracks recorded onto the tape visible
We have recently acquired a magnetic viewer to aid our digitisation work. By pressing the viewer against the tape we are able to read the magnetic information recorded on it. The viewer helps us to visually identify the position of the recorded tracks on the tape, enabling accurate playback during digitisation. It can also help us to identify potential problems with the tape – a partially erased track, for example, will show up clearly on the viewer.
We receive tapes that are in varying states of repair and disrepair. Sometimes the person who made the recording kept the tapes in impeccable, temperature controlled conditions. Inscribed on the boxes are dates and lists of who performed, and what instrument they played. The tapes often feature detailed notes about the number of tracks recorded, whether they are in stereo or mono and if they used noise reduction technology. Digitisation, in such cases, does not usually pose great challenges.
At the other extreme are tapes recorded by people who never wrote anything down about how they made their recording. This means the people doing the digitising can be left to do a lot of guesswork (particularly if the person who made the recording has since died, and can’t tell you anything about it). A lack of informative metadata about the tape does not necessarily create migration difficulties: recordings can be very straightforward – for example, a ½ track stereo recording of a single voice.
It is essential that the appropriate head is used to read the magnetic information recorded onto the tape.
Problems can, however, arise when recordings have been made in an idiosyncratic (and inconsistent) manner. For example (and in exceptional circumstances) we receive single magnetic tapes that have a mixture of track formats on them, including four-track multitrack, ½- and ¼-track mono, and ½- and ¼-track stereo.
In such cases it can be hard to discern the precise nature of the recordings by ear alone. Often such recordings don’t sound ‘quite right’, even if it is not exactly clear what the problem is.
Rather than relying on speculation, the magnetic reader confirms exactly where the tracks are recorded on the tape, helping us to replay it using the appropriate playback heads and digitise it accurately.
If you are new to the world of digital preservation, you may be feeling overwhelmed by the multitude of technical terms and professional practices to contend with, and the fact that standards never seem to stay in place for very long.
Fortunately, there are many resources related to digital preservation available on the internet. Unfortunately, the sheer number of websites, hyperlinks and sub-sections can exacerbate those feelings of confusion.
In order to help the novice, nerd or perplexed archivist wanting to learn more, we thought it would be useful to compile a (by no means exhaustive) selection of resources to guide your hand. Ultimately, if content is to be useful it does need to be curated and organised.
Bear in mind that individual websites within the field tend to be incredibly detailed, so it is worth having a really good explore to find the information you need! And, as is the norm with the internet, one click leads to another so before you know it you stumble upon another interesting site. Please feel free to add anything you find to the comment box below so the list can grow!
AV Preserve are a US-based consultancy who work in partnership with organisations to help them implement digital information preservation and dissemination plans. They have an amazing ‘papers and presentations’ section on their website, which includes research on diverse areas such as assessing cloud storage, digital preservation software, metadata, making an institutional case for digital preservation, managing personal archives, primers on moving image codecs, disaster recovery and many more. It is a treasure trove, and there is a regularly updated blog to boot!
The A/V Artifact Atlas is a community-generated resource for people working in digital preservation and aims to identify problems that occur when migrating tape-based media. The Atlas is made in a wiki-format and welcomes contributions from people with expertise in this area – ‘the goal is to collectively build a comprehensive resource that identifies and documents AV artifacts.’ The Atlas was created by people connected to the Bay Area Video Coalition, a media organisation that aims to inspire ‘social change by empowering media makers to develop and share diverse stories through art, education and technology.’
Richard Hess is a US-based audio restoration expert. Although his website looks fairly clunky, he is very knowledgeable and well-respected in the field, and you can find all kinds of esoteric tape wisdom on there.
The Digital Preservation Coalition’s website is full of excellent resources, including a digital preservation jargon buster, case studies, a preservation handbook and a ‘what’s new’ section. The Technology Watch Reports are particularly useful. Of most relevance to the work Great Bear does is ‘Preserving Moving Pictures and Sound’, but there are many others, including Intellectual Property and Copyright, Preserving Metadata and Digital Forensics.
The Digital Curation Centre works to support Higher Education Institutions to interpret and manage research data. Again, this website is incredibly detailed, presenting case studies, ‘how-to’ guides, advice on digital curation standards, policy, curation lifecycle and much more.
In 2005 UNESCO declared 27 October to be World Audiovisual Heritage Day. The web pages are an insight into the way audiovisual heritage is perceived by large, international policy bodies.
Be sure to take advantage of the 35 open access digital heritage articles published by Routledge. The articles are from the International Journal of Heritage Studies, Archives and Records, Journal of the Institute of Conservation, Archives and Manuscripts and others.
For open source digital preservation software check out the Open Planets Foundation (OPF), which addresses core digital preservation challenges by engaging with its members and the wider community to develop practical and sustainable tools and services that ensure long-term access to digital content. The website also includes the very interesting Atlas of Digital Damages.
The summer of 2008 saw a spate of articles in the media focusing on a new threat to magnetic tapes.
The reason: the warm, wet weather was reported as a watershed moment in magnetic tape degradation, with climate change responsible for the march of mould consuming archival memories, from personal to institutional collections.
The connection between climate change and tape mould is not one made frequently by commentators, even in the digital preservation world, so what are the links? It is certainly true that increased heat and moisture are prime conditions for the germination of the mould spores that populate the air we breathe. These spores, the British Library tell us
‘can stay dormant for long periods of time, but when the conditions are right they will germinate. The necessary conditions for germination are generally:
• temperatures of 10-35ºC with optima of 20ºC and above
• relative humidities greater than 70%’
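The germination thresholds quoted above can be expressed as a simple check. This is a minimal sketch based solely on the British Library figures; the function name and interface are our own:

```python
def mould_germination_risk(temp_c, rh_percent):
    """Return True if conditions favour mould spore germination,
    per the British Library figures quoted above: temperatures of
    10-35 C (optima of 20 C and above) and relative humidity
    greater than 70%."""
    in_germination_range = 10 <= temp_c <= 35
    humid_enough = rh_percent > 70
    return in_germination_range and humid_enough

# A warm, damp loft in a wet British summer:
print(mould_germination_risk(24, 80))   # True  -- prime conditions
# A cool, dry storage room:
print(mould_germination_risk(16, 40))   # False
```

A check like this is only a rough screen, of course: as the quote notes, spores stay dormant outside these ranges and germinate as soon as conditions swing back into them.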
The biggest threat to the integrity of magnetic tape is fluctuation in environmental temperature. This means that tape collections stored in uncontrolled settings, such as a loft, cupboard, shed or basement, are probably most at risk.
While climate change has not always been taken as seriously as it should be by governments and media commentators, the release today of the UN’s report, which stated in no uncertain terms that climate change is ‘severe, pervasive and irreversible’, should be a wake-up call to all the disbelievers.
To explore the links between climate change and tape degradation further we asked Peter Specs from US-based disaster recovery specialists the Specs Brothers if he had noticed any increase in the number of mouldy tapes they had received for restoration. In his very generous reply he told us:
‘The volume of mouldy tapes treated seems about the same as before from areas that have not experienced disasters but has significantly increased from disaster areas. The reason for the increase in mould infected tapes from disaster areas seems to be three-fold. First, many areas have recently been experiencing severe weather that is not usual for the area and are not prepared to deal with the consequences. Second, a number of recent disasters have affected large areas and this delays remedial action. Third, after a number of disasters, monies for recovery seem to have been significantly delayed. We do a large amount of disaster recovery work and, when we get the tapes in for processing fairly quickly, are generally able to restore tapes from floods before mould can develop. In recent times, however, we are getting more and more mouldy tapes in because individuals delayed having them treated before mould could develop. Some were unaware that lower levels of their buildings had suffered water damage. In other areas the damage was so severe that the necessities of life totally eclipsed any consideration of trying to recover “non-essential” items such as tape recordings. Finally, in many instances, money for recovery was unavailable and individuals/companies were unwilling to commit to recovery costs without knowing if or when the government or insurance money would arrive.’
Nigel Bewley, soon to be retired senior sound engineer at the British Library, also told us there had been no significant increase in the number of mouldy tapes they had received for treatment. Yet reading between the lines here, and thinking about what Pete Specs told us, in an age of austerity and increased natural disasters, restoring tape collections may slip down the priority list of what needs to be saved for many people and institutions.
Mould: Prevention Trumps the Cure
Climate change aside, what can be done to prevent your tape collections from becoming mouldy? Keeping the tapes stored in a temperature-controlled environment is very important – ‘15 ± 3 °C and 40% maximum relative humidity (RH) are safe practical storage conditions,’ recommend the National Technology Alliance. It is also crucial that storage environments retain a stable temperature, because significant changes in the storage climate heat or cool the tape pack, causing the tension within it to increase or decrease, which is not good for the tape.
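Those storage figures lend themselves to a quick audit of logged readings. In the sketch below, only the 15 ± 3 °C and 40% RH limits come from the National Technology Alliance recommendation; the function name and the 5 °C swing threshold are our own illustrative choices:

```python
def storage_problems(readings):
    """Check (temperature C, relative humidity %) readings against the
    National Technology Alliance figures quoted above: 15 +/- 3 C and
    40% maximum RH. Also flags large temperature swings, since an
    unstable climate stresses the tape pack; the 5 C swing threshold
    is an illustrative assumption, not a published figure."""
    problems = []
    for temp_c, rh in readings:
        if not 12 <= temp_c <= 18:
            problems.append(f"{temp_c} C is outside 15 +/- 3 C")
        if rh > 40:
            problems.append(f"{rh}% RH exceeds the 40% maximum")
    temps = [t for t, _ in readings]
    if temps and max(temps) - min(temps) > 5:
        problems.append("temperature swing of more than 5 C")
    return problems

print(storage_problems([(16, 38), (17, 40)]))  # [] -- safe, stable storage
print(storage_problems([(16, 38), (25, 75)]))  # three problems flagged
```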
If you are wondering how mould attacks magnetic tape, it is attracted to the binder or adhesive that attaches the layers of the tape together. If you can see the mould on the tape edges it usually means the mould has infected the whole tape.
Optical media can also be affected by mould. Miriam B. Kahn writes in Disaster Response and Planning for Libraries
‘Optical discs are susceptible to water, mould and mildew. If the polycarbonate surface is damaged or not sealed appropriately, moisture can become trapped and begin to corrode the metal encoding surface. If moisture or mould is invasive enough, it will make the disc unreadable’ (85).
Prevention, it seems, is better than having to find the cure. So turn on the lights, keep the air flowing and make the RH level stable.
The history of amateur recording is peppered with examples of people who stretched technologies to their creative limit. Whether this comes in the form of hours spent trying things out and learning through doing, endlessly bouncing tracks in order to turn an 8-track recording into a 24-track epic or making high quality audio masters on video tape, people have found ways to adapt and experiment using the tools available to them.
One of the lesser known histories of amateur home recordings is making high quality stereo mixdowns and master recordings from multi-track audio tape onto consumer-level Hi-Fi VCRs.
We are currently migrating a stereo master VHS Hi-Fi recording of London-based indie band Hollow Hand. Hollow Hand later adopted the name Slanted and were active in London between 1992-1995. The tapes were sent in by Mark Venn, the bass player with Slanted and engineer for these early recordings that were recorded in 1992 in the basement of a Clapham squat. Along with the Hi-Fi VHS masters, we have also been sent eight reels of AMPEX ¼ tapes of Slanted that are being transferred for archival purposes. Mark intends to remix the eight track recordings digitally but as of yet has no plans for a re-release.
When Mark sent us the tapes to be digitised he thought they had been encoded with a SONY PCM, a mixed digital/analogue recording method we have covered in a previous blog post. The tapes had, however, been recorded directly from the FOSTEX eight track recorder to the stereo Hi-Fi function on a VHS video tape machine. For Mark at the time this was the best way to get a high quality studio master, because other analogue and digital tape options, such as Studer open reel to reel and DAT machines, were financially off-limits to him. It is worth mentioning that Hi-Fi audio technologies were introduced in the VHS model by JVC around 1984, so using this method to record stereo masters would have been fairly rare, even among people who did a lot of home recording. It was certainly a bit of a novelty in the Great Bear Studio – they are the first tapes we have ever received that were recorded in this way – and we see a lot of tape.
Using the Hi-Fi function on VHS tape machines was probably as good as it got in terms of audio fidelity for those working in an exclusively analogue context. It produced a master recording comparable in quality to a CD, particularly if the machine had manual audio recording level control. This is because, as we wrote about in relation to PCM/Betamax, video tape could accommodate greater bandwidth than audio tape (particularly audio cassette), therefore leading to better quality recordings.
One of our replacement upper head drums
VHS Hi-Fi audio is achieved using audio frequency modulation (AFM) and relies on a form of magnetic recording called ‘depth multiplexing’. This is when
‘the modulated audio carrier pair was placed in the hitherto-unused frequency range between the luminance and the colour carrier (below 1.6 MHz), and recorded first. Subsequently, the video head erases and re-records the video signal (combined luminance and colour signal) over the same tape surface, but the video signal’s higher centre frequency results in a shallower magnetization of the tape, allowing both the video and residual AFM audio signal to coexist on tape.’
Challenges for migrating Hi-Fi VHS Audio
Although the recordings of Hollow Hand are in good working condition, analogue masters to VHS Hi-Fi audio do face particular challenges in the migration process.
Playing back the tapes in principle is easy if both tape and machine are in optimum condition, but if either are damaged the original recordings can be hard to reproduce.
A particular problem for Hi-Fi audio emerges when the tape heads wear and it becomes harder to track the Hi-Fi audio recording, because the radio frequency (RF) signal can’t be read consistently off the tape. Hi-Fi recordings are harder to track because of depth multiplexing, namely the position of the recorded audio relative to the video signal. Even though no video signal as such is reproduced during the playback of Hi-Fi audio, the video signal is still there, layered on top of the audio signal, essentially making it harder to access. Of course, when tape heads/drums wear down they can always be replaced, but acquiring spare parts will become increasingly difficult in years to come, making Hi-Fi audio recordings on VHS particularly threatened.
In order to migrate tape-based media to digital files in the most effective way possible, it is important to use appropriate machines for the transfer. The Panasonic AG-7650 we used to transfer the Hollow Hand tapes afforded us great flexibility because it is possible to select which audio tracks are played back at any given time, which meant we could isolate the Hi-Fi audio track. The Panasonic AG-7650 also has tracking meters, which make it easy to assess and adjust the tracking of the tape and tape head where necessary.
As ever, the world of digitisation continues to generate anomalies, surprises and good stories. Who knows how many other video/ audio hybrid tapes are out there! If you do possess an archive collection of such tapes we advise you to take action to ensure they are migrated because of the unique problems they pose as a storage medium.
Whole subcultures have emerged in this memory boom, as digital technologies enable people with similar obsessions to come together via a shared passion for saving obscurities presumed to be lost forever. One such organisation is Kaleidoscope, whose aim is to keep the memory of ‘vintage’ British television alive. Their activities capture an urgent desire bubbling underneath the surface of culture to save everything, even if the quality of that everything is questionable.
Of course, as the saying goes, one person’s rubbish is another person’s treasure. As with most cultural heritage practices, the question of value is at the centre of people’s motivations, even if that value is expressed through a love for Pan’s People, Upstairs, Downstairs, Dick Emery and the Black and White Minstrel Show.
We were recently contacted by a customer hunting for lost TV episodes. His request: to lay hands on any old tapes that may unwittingly be laden with lost jewels of TV history. His enquiry is not so strange, since a 70s Top of the Pops programme – a series a large proportion of which was deleted from the official BBC archive – trailed at the end of a ½ EIAJ video tape we recently migrated. And how many other video tapes stored in attics, sheds or barns potentially contain similar material? Or, as stated on the Kaleidoscope website:
‘Who’d have ever imagined that a modest, sometimes mould-infested collection of VHS tapes in a cramped back bedroom in Pill would lead to the current Kaleidoscope archive, which hosts the collections of many industry bodies as well as such legendary figures as Bob Monkhouse or Frankie Howerd?’
Selection and appraisal in the archive
Living in an age of seemingly infinite information, it is easy to forget that any archival project involves keeping some things and throwing away others. Careful consideration of the value of an item needs to be made, both in relation to contemporary culture and the projected needs of subsequent generations.
These decisions are not easy and carry great responsibility. After all, how is it possible to know what society will want to remember in 10, 20 or even 30 years from now, let alone 200? The need to remember is not static either, and may change radically over time. What is kept now also strongly shapes future societies because our identities, lives and knowledge are woven from the memory resources we have access to. Who then would be an archivist?
When faced with such a conundrum the impulse to save everything is fairly seductive, but this is simply not possible. Perhaps things were easier in the analogue era when physical storage constraints conditioned the arrangement of the archive. Things had to be thrown away because the clutter was overwhelming. With the digital archive, always storing more seems possible because data appears to take up less space. Yet as we have written about before on the blog, just because you can’t touch or even see digital information, doesn’t mean it is not there. Energy consumption is costly in a different way, and still needs to be accounted for when appraising how resource intensive digital archives are.
‘For the BBC, national programmes that have entered the main archive and been fully catalogued have not, in general, been deleted. The deletions within the retention policy mainly apply to “contribution material” i.e. components (rushes) of a final programme, or untransmitted material. Hence, “long-term” for “national programmes that have entered the main archive and been fully catalogued” means in perpetuity. We have already kept some material for more than 75 years, including multiple format migrations.’
Value – whose responsibility?
For all those episodes, missing believed wiped, the treasure hunters who track them down tread a fine line between a personal obsession and offering an invaluable service to society. You decide.
What is inspiring about amateur preservationists is that they take the question of archival value into their own hands. In the 21st century, appraising and selecting the value of cultural artifacts is therefore no longer the exclusive domain of the archivist, even if expertise about how to manage, describe and preserve collections certainly is.
Does the popularity of such activities change the constitution of archives? Are they now more egalitarian spaces that different kinds of people contribute to? It certainly suggests that now, more than ever, archives always need to be thought of in plural terms, as do the different elaborations of value they represent.
We are pleased to announce that we are now able to support the transfer of 2″ Quadruplex Video Tape (PAL, SECAM & NTSC) to digital formats.
2” Quad was a popular broadcast analogue video tape format whose halcyon period ran from the late 1950s to the 1970s. The first quad video tape recorder made by AMPEX in 1956 cost a modest $45,000 (that’s $386,993.38 in today’s money).
2” Quad revolutionized TV broadcasting which previously had been reliant on film-based formats, known in the industry as ‘kinescope‘ recordings. Kinescope film required significant amounts of skilled labour as well as time to develop, and within the USA, which has six different time zones, it was difficult to transport the film in a timely fashion to ensure broadcasts were aired on schedule.
To counter these problems, broadcasters sought to adapt the magnetic recording methods that had proved so successful for audio for use in the television industry.
The first experiments directly adapted the longitudinal recording method used to record analogue audio. This however was not successful because video recordings require more bandwidth than audio. Recording a video signal with stationary tape heads (as they are in the longitudinal method) meant that the tape had to be run at a very high speed in order to accommodate sufficient bandwidth to reproduce a good quality video image. A lot of tape was used!
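The bandwidth problem can be put into rough numbers: with a stationary head, the highest recordable frequency equals head-to-tape speed divided by the shortest wavelength the head can resolve. The figures below (a ~5 MHz video bandwidth and a 5 µm minimum recordable wavelength) are illustrative assumptions, not the specifications of any actual machine:

```python
def required_tape_speed(bandwidth_hz, min_wavelength_m):
    """Longitudinal recording with stationary heads: the highest
    recordable frequency is tape speed / shortest recordable
    wavelength, so required speed = bandwidth * wavelength."""
    return bandwidth_hz * min_wavelength_m

video_speed = required_tape_speed(5e6, 5e-6)   # ~5 MHz video signal
audio_speed = required_tape_speed(20e3, 5e-6)  # ~20 kHz audio signal
print(video_speed)  # about 25 m/s -- an enormous amount of tape
print(audio_speed)  # about 0.1 m/s -- the leisurely range of audio decks
```

At 25 metres per second, an hour of video would consume 90 km of tape, which is exactly why the spinning-head designs described below were developed: they achieve a high head-to-tape speed while the tape itself crawls past.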
Ampex, who at the time owned the trademark marketing name for ‘videotape’, then developed a method where the tape heads moved quickly across the tape, rather than the other way round. On the 2” quad machine, four magnetic record/reproduce heads are mounted on a headwheel spinning transversely (width-wise) across the tape, striking the tape at a 90° angle. The recording method was not without problems because, the Toshiba Science Museum write, it ‘combined the signal segments from these four heads into a single video image’ which meant that ‘some colour distortion arose from the characteristics of the individual heads, and joints were visible between signal segments.’
The limitations of Quadruplex recording influenced the development of the helical scan method, which was invented in Japan by Dr. Kenichi Sawazaki of the Mazda Research Laboratory, Toshiba, in 1954. Helical scanning records each segment of the signal as a diagonal stripe across the tape. ‘By forming a single diagonal, long track on two-inch-wide tape, it was possible to record a video signal on one tape using one head, with no joints’, resulting in a smoother signal. Helical scanning was later widely adopted as a recording method in broadcast and domestic markets due to its simplicity, flexibility, reliability and economical use of tape.
This brief history charting the development of 2″ Quad recording technologies reveals that efficiency and cost-effectiveness, alongside media quality, were key factors driving the innovation of video tape recording in the 1950s.
What is particularly interesting about the consortium E-Ark has brought together is that commercial partners will be part of a conversation that aims to establish long term solutions for digital preservation across Europe. More often than not, commercial interests have driven the technological innovations used within digital preservation. This has made digital data difficult to manage for institutions both large and small, as the BBC’s Digital Media Initiative demonstrates, because the tools and protocols are always in flux. A lack of policy-level standards and established best practices has meant that the norm within digital information management has very much been permanent change.
Such a situation poses great risks for both digitised and born digital collections because information may have to be regularly migrated in order to remain accessible and ‘open’. As stated on the E-Ark website, ‘the practices developed within the project will reduce the risk of information loss due to unsuitable approaches to keeping and archiving of records. The project will be public facing, providing a fully operational archival service, and access to information for its users.’
The E-Ark project will hopefully contribute to the creation of compatible systems that can respond to the different needs of groups working with digital information. Which is, of course, just about everybody right now: as the world economy becomes increasingly defined by information and ‘big data’, efficient and interoperable access to commercial and non-commercial archives will be an essential part of a vibrant and well functioning economic system. The need to establish data systems that can communicate and co-operate across software borders, as well as geographical ones, will become an economic necessity in years to come.
The task facing E-Ark is huge, but one crucial to implement if digital data is to survive and thrive in this brave new datalogical world of ours. As E-Ark explain: ‘Harmonisation of currently fragmented archival approaches is required to provide the economies of scale necessary for general adoption of end-to-end solutions. There is a critical need for an overarching methodology addressing business and operational issues, and technical solutions for ingest, preservation and re-use.’
Maybe 2014 will be the year when digital preservation standards start to become a reality. As we have already discussed on this blog, the US-based National Agenda for Digital Stewardship 2014 outlined the negative impact of continuous technological change and the need to create dialogue among technology makers and standards agencies. It looks like things are changing and much needed conversations are soon to take place, and we will of course reflect on developments on the Great Bear blog.
Philip Jap came from a time when mime, dance, slapped bass lines, mascara and techno-dystopic anthems were staple parts of a successful popular music career. Cut from the same new wave goth cloth as Gary Numan, Human League and John Foxx, sporting mesmeric dance moves like a male Kate Bush, Jap lit up the early 1980s with performances on the David Essex Showcase, an audience participation talent show similar to today’s Britain’s Got Talent or Pop Idol. Jap went on to sign for Carlin Music Publishing and A&M Records, release an eponymous solo album and have a top 40 hit with ‘Save Us,’ a dramatic plea for liberation from an increasingly intrusive ‘mechanical world.’
Jap retains a modest yet loyal fanbase (The Philip Jap Army), and his recordings will soon be made available through his Twitter page. Although he did not have runaway commercial chart success, Jap went on to have a successful career as a composer and arranger for TV series and commercials and is the co-founder of AUDIOfield, a music production company.
The Great Bear studio has been graced with Jap’s music this week because we have been migrating a collection of low-band U-Matic videos that feature a number of TV appearances and promotional videos, including the 30 minute ‘special’ that was recorded for the BBC. In similar fashion to our recent transfer of Manchester Oi! band State Victims, the tapes were found in an old suitcase in a barn!
Although the tapes were mostly in good condition, some were recorded on early SONY-branded tape stock and were suffering from low radio frequency (RF) levels. RF levels are the recorded levels that can be read off the tape itself. To get a good, clear picture it is essential that the RF levels are strong. According to the AV Artifact Atlas, RF deterioration can occur because of a ‘poorly made recording on broken or mis-calibrated machine/record heads, or the use of poor quality video tape stock.’ Low RF levels may also occur if ‘the source media itself has been exposed to a strong magnetic field (unshielded speakers, motors, high-voltage transformers, etc.)’.
When a tape is suffering from low RF levels there is not a lot you can do to reverse the process. This is because the signal recorded on the tape has essentially faded over time, due to a bad initial recording, unsuitable storage conditions leading to de-magnetisation or sticky shed, or poor quality tape (such as AMPEX or SONY U-matic tapes, although not exclusively). It is possible, however, to modify the tracking, a calibration adjustment which ensures the spinning playback head is properly aligned with the helical scan signal written onto the video tape. Tracking adjusts the position of the tape relative to the tape heads, which, although spinning during playback, occupy a fixed position in the machine. It is not the answer to all low-RF ills, however, because the signal on the tape itself has become weaker, even if the calibration adjustment helps the machine read the signal more effectively.
Thankfully the tapes play back well, which is pretty amazing given that the tapes are over 30 years old and were never meant to be archive copies in the first place. We have also had a pretty enjoyable time watching and listening to Philip Jap’s amazing music. It is definitely time for a revival!
We understand that when organisations decide to digitise magnetic tape collections the whole process can take significant amounts of time. From initial condition appraisals, to selecting which items to digitise, many questions, as well as technical and cultural factors, have to be taken into account before a digital transfer can take place.
This is further complicated by the fact that money is not readily available for larger digitisation projects and specific funding has to be sought. Often an evidence base has to be collected to present to potential funders about the value and importance of a collection, and this involves working with organisations who have specific expertise in transferring tape-based collections to digital formats in order to gain vital advice and support.
We are very happy to work with organisations and institutions during this crucial period of collection assessment and bid development. We understand that even at the pre-application stage informed decisions need to be made about the condition of the tapes, and realistic expectations formed of the treatments that may be required during a particular digitisation project. We are very willing to offer the support and advice that will hopefully contribute to the development of a successful bid.
For example, we were recently contacted by Ken Turner, who was involved in Action Space, an experimental community theatre group established in 1968. Ken has a collection of nearly 40 EIAJ SONY video tapes that were made in the 1980s. Because of the nature of the tapes, which almost always require treatment before they can be played back, transferring the whole collection will be fairly expensive, so funding will be necessary to make the project happen. We have offered to do a free assessment of the tapes and provide a ten minute sample transfer that can be used as part of an evidence base for a funding bid.
Potential Problems with EIAJ ½ Video Tapes
The EIAJ video tape recorder was developed in the late 1960s and is an important format in the history of recordable media. As the first standardised video tape machine, it could play back tapes made by different companies, and it therefore made video use far cheaper and more widespread, particularly within a domestic context. The EIAJ standard had a similar democratising impact on non-professional video recording due to its portability, low cost and versatility.

As mentioned above, EIAJ tapes almost always require treatment before they can be played back, particularly the SONY V30-H and V60-H tapes. Problems with the tape are indicated by squealing and shedding upon playback. This is an example of what the AV Artifact Atlas describes as stiction, ‘when media suffering from hydrolysis or contamination is restricted from moving through the tape path correctly.’ When stiction occurs the tape needs to be removed from the transport and treated immediately, through baking and cleaning, before the transfer can be completed.
EIAJ tapes that have a polyethylene terephthalate ‘back coating’ or ‘substrate’ may also be affected by temperature or humidity changes in their storage environment. These may have caused the tape pack to expand or contract, resulting in permanent distortion of the tape backing. Such problems are exacerbated by the helical scan method of recording common to video tape, which records parallel tracks that run diagonally across the tape from one edge to the other. If the angle the recorded tracks make with the edge of the tape does not correspond with the scan angle of the head (which remains fixed), mistracking and information loss can occur. Correcting tracking errors is usually straightforward because most machines have built-in tracking controls. Some of the earliest SONY CV ½ inch video tape machines didn’t have this function, however, which presents serious problems for the migration of these tapes if their back coating has suffered deformation.
The possibility of collaboration
We are excited about the possibility of working with the Action Space collection, mainly because we would love the opportunity to learn more about their work. Like many other theatre groups established in the late 1960s, Action Space wanted to challenge the elitism of art and make it accessible to everyone in the community. In their 1972 annual report, which is archived on the Unfinished Histories: Recording the History of Alternative Theatre website, they describe the purposes of the company as follows:
‘Its workings are necessarily experimental, devious, ambiguous, and always changing in order to find a new situation. In the short term the objectives are to continually question and demonstrate through the actions of all kinds new relationships between artists and public, teachers and taught, drop-outs and society, performers and audiences, and to question current attitudes of the possibility of creativity for everyone. For the longer term the aim is to place the artists in a non-elite set up, to keep “normal” under revision, to break barriers in communication and to recognise that education is a continuing process.’
Although Action Space disbanded in 1981, the project was relaunched in the same year as Action Space Mobile, who are still operating today. Central to Action Space Mobile’s philosophy is that they are an arts company ‘that has always worked with people, believing that contact and participation in the arts can change lives positively.’ There is also the London-based ActionSpace, who work with artists with learning disabilities.
We hope that offering community heritage projects the possibility of collaboration will help them to benefit from our knowledge and experience. In turn we will have interesting things to watch and listen to, which is part of what makes working in the digitisation world fun and enjoyable.
Often the tapes we receive to digitise are ‘forgotten’ recordings. Buried under a pile of stuff in a dark, cold room, their owners think they are lost forever. Then, one day, a reel of mysterious tape emerges from the shadows, generating feelings of excitement and anticipation. What is stored on the tape? Is the material in a playable condition? What will happen to the tape once it is in a digital format?
All of these things happened recently when Paul Travis sent us a ¼ inch AMPEX tape of the band he played in with his brother, the Salford Oi! punk outfit State Victims. The impetus for forming State Victims emerged when the two brothers ‘split from Salford bands, Terrorist Guitars and the Bouncing Czechs respectively, and were looking for a new musical vessel to express and reassert their DIY music ethic, but in a more vital and relevant way, searching for a new form of “working-class protest”.’
The tape had been in the wilderness for the past 30 years, residing quietly in a shed in rural Cambridgeshire. It was in fairly good condition, displaying no signs of damage such as mould on the tape or spool. Like many of the AMPEX tapes we receive, however, it did need baking treatment because it was suffering from binder hydrolysis (a.k.a. Sticky Shed Syndrome). The baking, conducted at 49°C for 8 hours in our customised oven, was successful and the transfer was completed without any problems. We created a high resolution stereo 24 bit/96 kHz WAV file, the recommended standard for archival audio, as well as an MP3 access copy that can be easily shared online.
Image of the tape post-transfer. When it arrived the tape was not wound on neatly and there was no leader tape on it.
Finding old tapes and sending them to be digitised can be a process of discovery. Originally Paul thought the tape was of a 1983 session recorded at the Out of the Blue Studios in Ancoats, Manchester, but it became apparent that the tape was of an earlier recording. Soon after we digitised the first recording we received a message from Paul saying another State Victims tape had ‘popped up in an attic’, so it is amazing what you find when you start digging around!
Like many other bands connected to the Manchester area, the digital artefacts of State Victims are stored on the Manchester District Music Archive (MDMA), a user-led online archive established in 2003 in order to celebrate Greater Manchester music and its history. The MDMA is part of a wider trend of do-it-yourself archival activity that exploded in the 21st century due to the availability of cheap digital technologies. In what is arguably a unique archival moment, digital technologies have enabled marginal, subcultural and non- or anti-commercial music to circulate widely alongside the more conventional, commercial artefacts of popular music. This is reflected in the MDMA, where the artefacts of famous Manchester bands such as The Smiths, The Fall, Oasis and Joy Division sit alongside the significantly less famous archives of the Manchester Musicians Collective, The Paranoids, Something Shady and many others.
Within the community-curated space of the MDMA all of the artefacts acquire a similar value, derived from their ability to illuminate the social history of the area told through its music. Much lip service has been paid to the potential of Web 2.0 technologies and social media to enable new forms of collaboration and ‘user-participation’, but involving people in the construction of web-based content is not always an automatic process. If you build it, people do not always come. As a user-led resource, however, the MDMA seems pretty effective. It is inviting to use, well organised and a wide range of people are clearly contributing, which is reflected in the vibrancy of its content. It is exciting that such an online depository exists, providing a new home for the errant tape, freshly digitised, that is part of Manchester’s music history.
In a technological world that is rapidly changing how can digital information remain accessible?
One answer to this question lies in the use of open source technologies. As a digital preservation strategy it makes little sense to rely on proprietary codecs owned by companies such as Apple or Microsoft to save data in the long term. Proprietary software essentially operates as a closed system and risks compromising access to data in years to come.
It is vital, therefore, that the digitisation work we do at Great Bear is done within the wider context of digital preservation. This means making informed decisions about the hardware and software we use to migrate your tape-based media into digital formats. We use a mixture of proprietary and open source software, simply because it makes our life a bit easier. Customers also ask us to deliver their files in proprietary formats. For example, Apple ProRes is a really popular codec that doesn’t take up a lot of data space, so our customers often request this, and of course we are happy to provide it.
Using open systems definitely has benefits. The flexibility of Linux, for example, enables us to customise our digitisation system according to what we need to do. As with the rest of our work, we are keen to find ways to keep using old technologies if they work well, rather than simply throwing things away when shiny new devices come on the market. There is a misconception that to ingest vast amounts of audio data you need the latest hardware. All you need in fact is a big hard drive, flexible yet reliable software, and an operating system that doesn’t crash so it can be left to ingest for 8 hours or more. Simple! One example of the open source software we use is the sound processing programme SoX. It saves us a lot of time because we are able to write scripts that batch process audio data according to project specifications.
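To give a flavour of what such a script looks like, here is a minimal sketch in Python that builds a SoX command line for each WAV file in a folder. The 24 bit/96 kHz target and the file names are illustrative assumptions rather than a real project specification; only standard SoX options are used (`-b` to set the output bit depth and the `rate` effect for resampling).

```python
import subprocess
from pathlib import Path

def sox_command(infile: Path, outdir: Path) -> list[str]:
    """Build a sox invocation that writes one file out at 24 bit / 96 kHz.

    "-b 24" sets the output bit depth; the "rate 96k" effect resamples.
    The 24/96 target is an example specification, not a fixed rule.
    """
    return ["sox", str(infile), "-b", "24", str(outdir / infile.name), "rate", "96k"]

def batch_process(indir: Path, outdir: Path, run: bool = True) -> list[list[str]]:
    """Apply the same treatment to every WAV file in a directory."""
    commands = [sox_command(f, outdir) for f in sorted(indir.glob("*.wav"))]
    if run:  # pass run=False to preview the commands without sox installed
        for cmd in commands:
            subprocess.run(cmd, check=True)
    return commands
```

In practice a script like this is driven by whatever the project specification asks for; the point is that one tested command template can then be applied to hundreds of files unattended.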
Openness in the digital preservation world
Within the wider digital preservation world open source technologies are also widely used. From digital preservation tools developed by projects such as SCAPE and the Open Planets Foundation, there are plenty of software resources available for individuals and organisations who need to manage their digital assets. It would be naïve, however, to assume that the practice of openness here, and in other realms of the information economy, is born from the same techno-utopian impulse that propelled the open software movement from the 1970s onwards. The SCAPE website makes it clear that the development of open source information preservation tools is ‘the best approach given the substantial public investment made at the European and national levels, and because it is the most effective way to encourage commercial growth.’
What would make projects like SCAPE and Open Planets even better is if they thought about ways to engage non-specialist users who may be curious about digital preservation tools but have little experience of navigating complex software. The tools may well be open, but the knowledge of how to use them is not.
‘The problem is most archivists, curators and conservators involved in media reformatting are ill-equipped to detect artifacts, or further still to understand their cause and ensure a high quality job. They typically don’t have deep training or practical experience working with legacy media. After all, why should we? This knowledge is by and large the expertise of video and audio engineers and is increasingly rare as the analogue generation ages, retires and passes on. Over the years, engineers sometimes have used different words or imprecise language to describe the same thing, making the technical terminology even more intimidating or inaccessible to the uninitiated. We need a way capture and codify this information into something broadly useful. Preserving archival audiovisual media is a major challenge facing libraries, archives and museums today and it will challenge us for some time. We need all the legs up we can get.’
The promise of openness can be a fraught terrain. In some respects we are caught between a hyper-networked reality, where ideas, information and tools are shared openly at a lightning pace. There is the expectation that we can have whatever we want, when we want it, which is usually now. On the other side of openness are questions of ownership and regulation – who controls information, and to what ends?
Perhaps the emphasis placed on the value of information within this context will ultimately benefit digital archives, because there will be significant investment, as there already has been, in the development of open resources that will help to take care of digital information in the long term.
We are now used to living in a born-digital environment, but the transition from analogue to digital technologies did not happen overnight. In the late 1970s, early digital audio recordings were made possible by a hybrid analogue/digital system. It was composed of the humble transport and recording mechanisms of the video tape machine, and a not so humble PCM (pulse-code modulation) digital processor. Together they created the first two-channel stereo digital recording system.
The first digital audio processor for professional use, made by SONY, was the PCM-1600. It was introduced in 1978 and used a U-matic tape machine as its transport. Later models, the PCM-1610/1630, became the first standard for mastering audio CDs in the 1980s. SONY employee Toshitada Doi, whose impressive CV includes the development of the PCM adaptor, the Compact Disc and the CIRC error correction system, visited recording studios around the world in an effort to facilitate the professional adoption of PCM digital technologies. He was not, however, welcomed with open arms, as the SONY corporate website explains:
‘Studio engineers were opposed to digital technology. They criticized digital technology on the grounds that it was more expensive than analogue technology and that it did not sound as soft or musical. Some people in the recording industry actually formed a group called MAD (Musicians Against Digital), and they declared their position to the Audio Engineering Society (AES).’
Several consumer/ semi-professional models were marketed by SONY in the 70s and 80s, starting with the PCM-1 (1977). In a retro-review of the PCM-F10 (1981), Dr Frederick J. Bashour explains that
‘older model VCRs often worked better than newer ones since the digital signal, as seen by the VCR, was a monochrome pattern of bars and dots; the presence of modern colour tweaking and image compensation circuits often reduced the recording system’s reliability and, if possible, were turned off.’
Why did the evolution of an emerging digital technology stand on the shoulders of what had, by 1981, become a relatively mature analogue technology? It all comes down to the issue of bandwidth. A high quality PCM audio recording required 1–1.5 MHz of bandwidth, far greater than a conventional analogue audio signal (15–20 kHz). While this bandwidth was beyond the scope of analogue audio recording technology of the time, video tape recorders did have the capacity to record signals with higher bandwidths.
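The arithmetic behind this bandwidth gap is easy to check: two channels of 16-bit audio sampled at 44.1 kHz generate a raw bit rate of around 1.4 megabits per second, before any error correction or formatting overhead is added.

```python
# Raw data rate of two-channel, 16-bit PCM audio at 44.1 kHz --
# the kind of signal a PCM adaptor had to squeeze onto video tape.
sample_rate = 44_100   # samples per second, per channel
bit_depth = 16         # bits per sample
channels = 2

bit_rate = sample_rate * bit_depth * channels
print(bit_rate)        # 1411200 bits per second, i.e. ~1.4 Mbit/s
```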
If you have ever wondered where the 16 bit/44.1 kHz sampling standard for the CD came from, it was because in the early 1980s, when the CD standard was agreed, there was no other practical way of storing digital sound than by a PCM converter and video recorder combination. As the Wikipedia entry for the PCM adaptor explains, ‘the sampling frequencies of 44.1 and 44.056 kHz were thus the result of a need for compatibility with the 25-frame (CCIR 625/50 countries) and 30-frame black and white (EIA 525/60 countries) video formats used for audio storage at the time.’ The sampling rate was adopted as the standard for CDs and, unlike many other things in our rapidly changing technological world, it hasn’t changed since.
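The compatibility with both video standards can be verified with a little arithmetic. In the PCM adaptor scheme, three 16-bit samples were stored per usable video line; multiplying by the usable lines per field and the field rate of each standard gives the same figure either way:

```python
samples_per_line = 3  # 16-bit audio samples stored per usable video line

pal  = 294 * 50 * samples_per_line   # 625/50 system: 294 usable lines x 50 fields/s
ntsc = 245 * 60 * samples_per_line   # 525/60 system: 245 usable lines x 60 fields/s

print(pal, ntsc)   # both give 44100 samples per second
```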
The fusion of digital and analogue technologies did not last long, and the introduction of DAT tapes in 1987 rendered the PCM digital converters/ video tape system largely obsolete. DAT recorders basically did the same job as PCM/ video but came in one, significantly smaller, machine. DAT machines had the added advantage of being able to accept multiple sampling rates (the standard 44.1 kHz, as well as 48kHz, and 32kHz, all at 16 bits per sample, and a special LP recording mode using 12 bits per sample at 32 kHz for extended recording time).
Problems with migrating early digital tape recordings
There will always be the risk with any kind of magnetic tape recordings that there won’t be enough working tape machines to playback the material recorded on them in the future. As spare parts become harder to source, tapes with worn out transport mechanisms will simply become inoperable. We are not quite at this stage yet, and at Great Bear we have plenty of working U-Matic, Betamax and VHS machines so don’t worry too much! Machine obsolescence is however a real threat facing tape based archives.
Such a problem comes into sharp relief when we consider the case of digital audio recordings made on analogue video tape machines. Audio recordings ‘work’ the tape transport far more vigorously than your average domestic video tape user: a tape may be rewound and fast-forwarded more often, and in a professional environment may be in constant use, leading to greater wear and tear.
Those who chose to adopt digital early and made recordings on tape will have marvelled at the lovely clean recordings and the wonders of error correction technology. As a legacy format however, tape-based digital recordings are arguably more at risk than their analogue counterparts. They are doubly compromised by fragility of tape, and the particular problems that befall digital technologies when things go wrong.
‘Edge damage’ is very common in video tape and can happen when the tape transport becomes worn. This can alter the alignment of the transport mechanism, leading it to move up and down and crush the tape. As you can see in this photograph, the edge of this tape has become damaged.
Because it is a digital recording, this has led to substantial problems with the transfer, namely that large sections of the recording simply ‘drop out.’ In instances such as these, where the tape itself has been damaged, analogue recordings on tape are far more recoverable than digital ones. Dr W.C. John Van Bogart explains that
‘even in instances of severe tape degradation, where sound or video quality is severely compromised by tape squealing or a high rate of dropouts, some portion of the original recording will still be perceptible. A digitally recorded tape will show little, if any, deterioration in quality up to the time of catastrophic failure when large sections of recorded information will be completely missing. None of the original material will be detectable in these missing sections.’
This risk of catastrophic, rather than gradual, loss of information is what makes tape-based digital media particularly fragile and at risk. What is particularly worrying about digital tape recordings is that they may not show any external signs of damage until it is too late. We therefore encourage individuals, recording studios and memory institutions to assess the condition of their digital tape collections and take prompt action if the recorded information is valuable.
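The ‘cliff edge’ failure mode Van Bogart describes can be sketched as a toy model. The numbers here are purely illustrative assumptions: the point is that an analogue signal degrades in proportion to tape damage, while error correction keeps a digital signal perfect right up to a threshold, after which whole blocks are lost entirely.

```python
def analogue_quality(damage: float) -> float:
    """Analogue playback: quality falls gradually as tape damage grows."""
    return max(0.0, 1.0 - damage)

def digital_quality(damage: float, correctable: float = 0.3) -> float:
    """Digital playback with error correction: perfect while the error
    rate stays within what the correction scheme can fix, then nothing."""
    return 1.0 if damage <= correctable else 0.0

for damage in (0.1, 0.3, 0.5):
    print(damage, analogue_quality(damage), digital_quality(damage))
# At 0.5 damage the analogue transfer is degraded but still audible;
# the digital transfer has fallen off the cliff and recovers nothing.
```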
The story of PCM digital processors and analogue tapes gives us a fascinating window into a time when we were not quite analogue, but not quite digital either, demonstrating how technologies co-evolve using the capacities of what is available in order to create something new.
‘A non-magnetic, 100 year, green solution for data storage.’
This is the stuff of digital information managers’ dreams. No more worrying about active data management, file obsolescence or that escalating energy bill.
Imagine how simple life would be if there was a way to store digital information that could last, without intervention, for nearly 100 years. Those precious digital archives could be stored in a warehouse that was not climate controlled, because the storage medium was resilient enough to withstand irregular temperatures.
Imagine after 100 years an archivist enters that very same warehouse to retrieve information requested by a researcher. The archivist pulls a box off the shelf and places it on the table. In their bag they have a powerful magnifying glass which they use to read the information. Having ascertained they have the correct item, they walk out the warehouse, taking the box with them. Later that day, instructions provided as part of the product licensing over 100 years ago are used to construct a reader that will retrieve the data. The information is recovered and, having assessed the condition of the storage medium which seems in pretty good nick, the digital optical technology storage is taken back to the warehouse where it sits for another 10 years, until it is subject to its life-cycle review.
Does this all sound too good to be true? For anyone exposed to the constantly changing world of digital preservation, the answer would almost definitely be yes. We have already covered on this blog numerous issues that the contemporary digital information manager may face. The lack of standardisation in technical practices and the bewildering array of theories about how to manage digital data mean there is currently no ‘one size fits all’ solution to tame the archive of born-digital and digitised content, which is estimated to swell to 3,000 Exabytes (thousands of petabytes) by 2020*. We have also covered the growing concerns about the ecological impact of digital technologies, such as e-waste and energy over-consumption. With this in mind, the news that a current technology exists that can by-pass many of these problems will seem like manna from heaven. What can this technology be and why have you never heard about it?
The technology in question is called DOTS, which stands for Digital Optical Technology System. The technology is owned and being developed by Group 47, who ‘formed in 2008 in order to secure the patents, designs, and manufacturing processes for DOTS, a proven 100-year archival technology developed by the Eastman Kodak Company.’ DOTS is refreshingly different from every other data storage solution on the market because it ‘eliminates media and energy waste from forced migration, costly power requirements, and rigid environmental control demands’. What’s more, DOTS are ‘designed to be “plug & play compatible” with the existing Linear Tape Open (LTO) tape-based archiving systems & workﬂow’.
In comparison with other digital information management systems that can employ complex software, the data imaged by DOTS does not use sophisticated technology. John Lafferty writes that at ‘the heart of DOTS technology is an extremely stable storage medium – metal alloy sputtered onto mylar tape – that undergoes a change in reflectivity when hit by a laser. The change is irreversible and doesn’t alter over time, making it a very simple yet reliable technology.’
DOTS can survive the benign neglect all data experiences over time, and can also withstand pretty extreme neglect. During research and development, for example, DOTS was subjected to a series of accelerated environmental ageing tests which concluded that ‘there was no discernible damage to the media after the equivalent of 95.7 years.’ But the testing did not stop there. Since acquiring patents for the technology, Group 47
‘has subjected samples of DOTS media to over 72 hours of immersion each in water, benzine, isopropyl alcohol, and Clorox (™) Toilet Bowl Cleaner. In each case, there was no detectable damage to the DOTS media. However, when subjected to the citric acid of Sprite carbonated beverage, the metal had visibly deteriorated within six hours.’
Robust indeed! DOTS is also non-magnetic, chemically inert, immune to electromagnetic fields, and can be stored in normal office environments or in extremes ranging from −9°C to 65°C. It ticks all the boxes really.
DOTS vs the (digital preservation) world
The only discernible benefit of the ‘open all hours’, random access digital information culture over a storage solution such as DOTS is accessibility. While it certainly is amazing how quickly and easily valuable data can be retrieved at the click of a button, this should perhaps not be the priority when we are planning how best to take care of the information we create, and are custodians of. The key words here are valuable data. Emerging norms in digital preservation, which emphasise the need to always be responsive to technological change, take gambles with the very digital information they seek to preserve, because there is always a risk that migration will compromise the integrity of data.
The constant management of digital data is also costly, disruptive and time-consuming. In the realm of cultural heritage, where organisations are inevitably under resourced, making sure your digital archives are working and accessible can sap energy and morale. These issues of course affect commercial organisations too. The truth is the world is facing an information epidemic, and surely we would all rest easier if we knew our archives were safe and secure. Indeed, it seems counter-intuitive that amid the endless flashy devices and research expertise in the world today, we are yet to establish sustainable archival solutions for digital data.
Of course, using a technology like DOTS need not mean we abandon the culture of access enabled by file-based digital technologies. It may however mean that the digital collections available on instant recall are more carefully curated. Ultimately we have to ask if privileging the instant access of information is preferable to long-term considerations that will safeguard cultural heritage and our planetary resources.
If such a consideration errs on the side of moderation and care, technology’s role in shaping that hazy zone of expectancy known as ‘the future’ needs to shift from the ‘bigger, faster, quicker, newer’ model to a more cautious appreciation of the long term. Such an outlook is built in to the DOTS technology, demonstrating that to be ‘future proof’ a technology need not only withstand environmental challenges, such as flooding or extreme temperature change, but must also be ‘innovation proof’ by being immune to the development of new technologies. As John Lafferty writes, the licence bought with the product ‘would also mandate full backward compatibility to Generation Zero, achievable since readers capable of reading greater data densities should have no trouble reading lower density information.’ DOTS also does not use proprietary codecs; as Chris Castaneda reports, ‘the company’s plan is to license the DOTS technology to manufacturers, who would develop and sell it as a non-proprietary system.’ Nor does it require specialist machines to be read. With breathtaking simplicity, ‘data can be recovered with a light and a lens.’
It would be wrong to assume that Group 47’s development of DOTS is not driven by commercial interests – it clearly is. DOTS does, however, seem to solve many of the real problems that currently afflict the responsible, long-term management of digital information. It will be interesting to see if the technology is adopted, and by whom. Watch this space!
* According to a 2011 Enterprise Strategy Group Archive TCO Study
Across the world, the years 2014–2018 will be remembered for their commitment to remembrance. The events being remembered are, of course, those related to the First World War.
What is most intriguing about the centenary of the First World War is that it is already an occasion for growing reflection on how such an event has been remembered, and the way this shapes contemporary perceptions of history.
The UK government has committed over £50 million for commemoration events such as school trips to battlefields, new exhibitions and public ceremonies. If you think that seems a little too much, take a visit to the website of No Glory in War, the campaign group questioning the purposes of commemorating a war that caused so much devastation.
The concerns raised by No Glory about political appropriation are understandable, particularly if we take into account a recent Daily Mail article written by current Education Secretary Michael Gove. In it Gove stresses that it is
‘important that we commemorate, and learn from, that conflict in the right way in the next four years. […] The war was, of course, an unspeakable tragedy, which robbed this nation of our bravest and best. Our understanding of the war has been overlaid by misunderstandings, and misrepresentations which reflect an, at best, ambiguous attitude to this country and, at worst, an unhappy compulsion on the part of some to denigrate virtues such as patriotism, honour and courage.
The conflict has, for many, been seen through the fictional prism of dramas such as Oh! What a Lovely War, The Monocled Mutineer and Blackadder, as a misbegotten shambles – a series of catastrophic mistakes perpetrated by an out-of-touch elite. Even to this day there are Left-wing academics all too happy to feed those myths.’
Gove clearly understands the political consequences of public remembrance. In his view, popular cultural understandings of the First World War have distorted our knowledge and proper values ‘as a nation’. There is, however, a ‘right way to remember’, and this must convey particular images and ideas of the conflict, and of Britain’s role within it.
Digitisation and re-interpretation
While the remembrance of the First World War will undoubtedly become, if it has not already, a political struggle over social values, digital archives will play a key role in ensuring that the debates that take place are complex and well-rounded. Significant archive collections will be digitised and disseminated to wide audiences because of the centenary, leading to re-interpretation and debate.
If you want a less UK-centric take on remembrance you can visit the Europeana 1914-1918 Website or Centenary News, a not-for-profit organisation that has been set up to provide independent, impartial and international coverage of the Centenary of the First World War.
Large amounts of digitised material about the First World War are paper documents, given that portable recording technologies were not in wide scale use during the years of the conflict.
The first hand oral testimonies of First World War soldiers have usually been recorded several years after the event. What can such oral records tell us that other forms of archival evidence can’t?
Since the practice became popular in the 1960s and 1970s, oral history has often been treated with suspicion by professional historians, who have questioned its status as ‘hard evidence’. The Oral History Society website, however, describes the unique value of oral histories: ‘Everyone forgets things as time goes by and we all remember things in different ways. All memories are a mixture of facts and opinions, and both are important. The way in which people make sense of their lives is valuable historical evidence in itself.’
We were recently sent some oral recordings of Frank Brash, a soldier who served in the First World War. The tapes, which were recorded in 1975 by Frank’s son Robert, were sent in by Frank’s great-grandson Andrew, who explained how they were made ‘as part of family history, so we could pass them down the generations.’ He goes on to say that ‘Frank died in 1980 at the age of 93, my father died in 2007. Most of the tapes are his recollections of the First World War. He served as a machine gunner in the battles of Messines and Paschendale amongst others. He survived despite a life expectancy for machine gunners of 6 days. He won the Military Medal but we never found out why.’
The recordings themselves contained a lot of tape hiss, because they were made at a low sound level and were second-generation copies of the tapes (that is, copies of copies).
Our job was to digitise the tapes and reduce the noise so the voices could be heard more clearly. The digitisation itself was straightforward because, even though they were copies, the tapes were in good condition. The hiss, however, was often as loud as the voice and required a lot of work post-migration. Fortunately, because the recording was of a male voice, whose energy sits mostly in the lower frequencies, it was possible to reduce the higher-frequency noise significantly without affecting the audibility of Frank speaking.
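The principle at work here can be sketched with a simple one-pole low-pass filter: a crude stand-in for the dedicated noise-reduction tools an audio engineer would actually use, but it shows how energy above a chosen cutoff (the hiss) can be attenuated while the lower speech band passes through largely untouched. The function name and the 1 kHz cutoff below are illustrative assumptions, not the settings used on the Brash tapes.

```python
import math

def low_pass(samples, sample_rate, cutoff_hz):
    """One-pole (RC) low-pass filter.

    Attenuates energy above cutoff_hz while leaving lower
    frequencies, where most of a male voice's energy sits,
    largely intact.
    """
    rc = 1.0 / (2 * math.pi * cutoff_hz)
    dt = 1.0 / sample_rate
    alpha = dt / (rc + dt)
    out = []
    prev = 0.0
    for x in samples:
        prev += alpha * (x - prev)  # exponential smoothing step
        out.append(prev)
    return out

# Demo: a 200 Hz "voice" tone mixed with quieter 8 kHz "hiss".
rate = 44100
n = 4410  # a tenth of a second of audio
mixed = [math.sin(2 * math.pi * 200 * i / rate)
         + 0.5 * math.sin(2 * math.pi * 8000 * i / rate)
         for i in range(n)]
cleaned = low_pass(mixed, rate, 1000.0)  # hiss heavily attenuated
```

In practice the trade-off is exactly the one described above: the lower the cutoff, the more hiss is removed, but push it too far down and the voice itself starts to dull.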
Remembering the interruption
Amid the rush of archive fever surrounding the First World War, it is important to remember how, as a series of events, it arguably changed the conditions of how we remember. It interrupted what Walter Benjamin called ‘communicable experience.’ In his essay ‘The Storyteller: Reflections on the Works of Nikolai Leskov’, Benjamin talks of men who ‘had returned from the battlefield grown silent’, unable to share what had happened to them. The image of the shell-shocked soldier, embodied by fictional characters such as Septimus Smith in Virginia Woolf’s Mrs. Dalloway, was emblematic of men whose experience had been radically interrupted. Benjamin went on to write:
‘Never has experience been contradicted more thoroughly than the strategic experience by tactical warfare, economic experience by inflation, bodily experience by mechanical warfare, moral experience by those in power. A generation that had gone to school on a horse drawn street-car now stood under the empty sky in a countryside in which nothing remained unchanged but the clouds, and beneath these clouds, in a field of force of torrents and explosions, was the tiny, fragile human body.’
Of course, it cannot be assumed that prior to the Great War all was fine, dandy and uncomplicated in the world. This would be a romantic and false portrayal. But the mechanical force of the Great War, and the way it delayed efforts to speak and remember in the immediate aftermath, also needs to be integrated into contemporary processes of remembrance. How will it be possible to do justice to the memory of the people who took part otherwise?