8mm / Hi8 video tape digitising of The Upright Electric Guitar

July 1st, 2017

We recently helped in the digitising and creation of an online video for the following project by Nelson Johnson:

 

Is it a piano? Is it an electric guitar? Neither, it’s a hybrid! Keys, “action”, dampers from an upright piano, wood planks, electric guitar strings, and long pickup coils.

Watch and listen to a YouTube video of this instrument: https://youtu.be/pXIzCWyw8d4

Inception, designing and building

I first had the idea for the upright electric guitar in late 1986. At that time I had been scraping together a living for around 2 years, by hauling a 450-pound upright piano around to the shopping precincts in England, playing it as a street entertainer – and in my spare time I dreamt of having a keyboard instrument that would allow working with the sound of a “solid body” electric guitar. I especially liked the guitar sound of Angus Young from AC/DC, that of a Gibson SG. It had a lot of warmth in the tone, and whenever I heard any of their music, I kept thinking of all the things I might be able to do with that sound if it was available on a keyboard, such as developing new playing techniques. I had visions of taking rock music in new directions, touring, recording, and all the usual sorts of things an aspiring musician has on their mind.

Digital sampling was the latest development in keyboard technology back then, but I had found that samples of electric guitar did not sound authentic enough, even just in terms of their pure tone quality. Eventually all this led to one of those “eureka” moments in which it became clear that one way to get what I was after, would be to take a more “physical” approach by using a set of piano keys and the “action” and “dampering” mechanism that normally comes with them, and then, using planks of wood to mount on, swop out piano strings for those from an electric guitar, add guitar pickups, wiring and switches, and so on – and finally, to send the result of all this into a Marshall stack.

I spent much of the next 12 years working on some form of this idea, except for a brief interlude for a couple of years in the early 1990s, during which I collaborated with a firm based in Devon, Musicom Ltd, whose use of additive synthesis technology had led them to come up with the best artificially produced sounds of pipe organs that were available anywhere in the world. Musicom had also made some simple attempts to create other instrument sounds including acoustic piano, and the first time I heard one of these, in 1990, I was very impressed – it clearly had a great deal of the natural “warmth” of a real piano, warmth that was missing from any digital samples I had ever heard. After that first introduction to their technology and to the work that Musicom were doing, I put aside my idea for the physical version of the upright electric guitar for a time, and became involved with helping them with the initial analysis of electric guitar sounds.

Unfortunately, due to economic pressures, there came a point in 1992 when Musicom had to discontinue their research into other instrument sounds and focus fully on their existing lines of development and their market for the pipe organ sounds. It was at that stage that I resumed work on the upright electric guitar as a physical hybrid of an electric guitar and an upright piano.

I came to describe the overall phases of this project as “approaches”, and in this sense, all work done before I joined forces with Musicom was part of “Approach 1”, the research at Musicom was “Approach 2”, and the resumption of my original idea after that was “Approach 3”.

During the early work on Approach 1, my first design attempts at this new instrument included a tremolo or “whammy bar” to allow some form of note / chord bending. I made detailed 3-view drawings of the initial design, on large A2 sheets. These were quite complicated and looked like they might prove to be very expensive to make, and sure enough, when I showed them to a light engineering firm, they reckoned it would cost around £5,000.00 for them to produce to those specifications. Aside from the cost, even on paper this design looked a bit impractical – it seemed like it might never stay in tune, for one thing.

Despite the apparent design drawbacks, I was able to buy in some parts during Approach 1, and have other work done, which would eventually be usable for Approach 3. These included getting the wood to be used for the planks, designing and having the engineering done on variations of “fret” pieces for all the notes the new instrument would need above the top “open E” string on an electric guitar, and buying a Marshall valve amp with a separate 4×12 speaker cabinet.

While collaborating with Musicom on the electronic additive synthesis method of Approach 2, I kept hold of most of the work and items from Approach 1, but by then I had already lost some of the original design drawings from that period. This is a shame, as some of them were done in multiple colours, and they were practically works of art in their own right. As it turned out, the lost drawings included features that I would eventually leave out of the design that resulted from a fresh evaluation taken to begin Approach 3, and so this loss did not stop the project moving forward.

The work on Approach 3 began in 1992, and it first involved sourcing the keys and action/dampering of an upright piano. I wanted to buy something new and “off the shelf”, and eventually I found a company based in London, Herrberger Brooks, who sold me one of their “Rippen R02/80” piano actions and key sets, still boxed up as it would be if sent to any company that manufactures upright pianos.

These piano keys and action came with a large A1 blueprint drawing that included their various measurements, and this turned out to be invaluable for the design work that had to be done next. The basic idea was to make everything to do with the planks of wood, its strings, pickups, tuning mechanism, frets, “nut”, machine heads and so on, fit together with, and “onto”, the existing dimensions of the piano keys and action – and to then use a frame to suspend the planks vertically, to add a strong but relatively thin “key bed” under the keys, legs under the key bed to go down to ground level and onto a “base”, and so on.

To begin work on designing how the planks would hold the strings, how those would be tuned, where the pickup coils would go and so on, I first reduced this big blueprint down, then added further measurements of my own to the original ones. For the simplest design, the distance between each of the piano action’s felt “hammers” and the next adjacent hammer was best kept intact, and this determined how far apart the strings would have to be, how wide the planks needed to be, and how many strings would fit onto each plank. It looked like 3 planks would be required.

While working on new drawings of the planks, I also investigated what gauge of electric guitar string should be used for each note, how far down it would be possible to go for lower notes, and things related to this. With a large number of strings likely to be included, I decided it would be a good idea to aim for a similar tension in each one, so that the stresses on the planks and other parts of the instrument would, at least in theory, be relatively uniform. Some enquiries at the University of Bristol led me to a Dr F. Gibbs, who had already retired from the Department of Physics but was still interested in the behaviour and physics of musical instruments. He assisted with the equations for calculating the tension of a string, based on its length, diameter, and the pitch of the note produced on it. Plugging all the key factors into this equation resulted in a range of electric guitar string gauges that made sense for the upright electric guitar, and for the 6 open string notes found on a normal electric guitar, the gauges resulting from my calculations were similar to the ones your average electric guitarist might choose.
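The equation Dr Gibbs helped with is, in essence, Mersenne’s law for a vibrating string: the fundamental frequency is f = (1/2L)·√(T/μ), so the tension is T = μ(2Lf)². As a rough sketch of that calculation – assuming a plain steel string (wound strings need an effective density rather than that of solid steel) and a Gibson-style 24.75-inch scale length, neither of which is specified in the text:

```python
import math

STEEL_DENSITY = 7850.0  # kg/m^3, plain steel; illustrative assumption

def string_tension(freq_hz, length_m, gauge_in, density=STEEL_DENSITY):
    """Tension in newtons from Mersenne's law: f = (1/2L) * sqrt(T/mu)."""
    d = gauge_in * 0.0254                    # gauge in inches -> diameter in metres
    mu = density * math.pi * (d / 2) ** 2    # linear mass density, kg/m
    return mu * (2 * length_m * freq_hz) ** 2

# Plain high E string (E4 = 329.63 Hz), 0.010" gauge, 24.75" scale:
scale = 24.75 * 0.0254
print(round(string_tension(329.63, scale, 0.010), 1))  # roughly 68 N (~15 lb)
```

Holding the tension roughly constant while the string length and pitch change is then a matter of solving the same formula for the diameter, which is how a sensible gauge falls out for each note.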

Other practicalities also determined how many more notes it would theoretically be possible to include below the bottom “open E” string on an electric guitar, for the new instrument. For the lowest note to be made available, by going all the way down to a 0.060 gauge wound string – the largest available at that time as an electric guitar string – it was possible to add several more notes below the usual open bottom E string. I considered using bass strings for notes below this, but decided not to include them and instead, to let this extra range be the lower limit on strings and notes to be used. Rather than a bass guitar tone, I wanted a consistent sort of electric guitar tone, even for these extra lower notes.

For the upper notes, everything above the open top E on a normal guitar would have a single fret at the relevant distance away from the “bridge” area for that string, and all those notes would use the same string gauge as each other.

The result of all the above was that the instrument would accommodate a total of 81 notes / strings, with an octave of extra notes below the usual guitar’s open bottom E string, and just under 2 octaves of extra notes above the last available fret from the top E string of a Gibson SG, that last fretted note on an SG being the “D” just under 2 octaves above the open top E note itself. For the technically minded reader, this range of notes went from “E0” to “C7”.
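That note count can be double-checked with a few lines of arithmetic. The total of 81 holds regardless of which octave-numbering convention the labels use; the frequencies printed below assume scientific pitch notation (C4 = middle C), purely for illustration:

```python
A4 = 440.0  # concert pitch reference

NOTE_INDEX = {'C': 0, 'C#': 1, 'D': 2, 'D#': 3, 'E': 4, 'F': 5,
              'F#': 6, 'G': 7, 'G#': 8, 'A': 9, 'A#': 10, 'B': 11}

def midi_number(name, octave):
    """MIDI note number, scientific pitch notation (C4 = 60)."""
    return 12 * (octave + 1) + NOTE_INDEX[name]

def freq(name, octave):
    """Equal-tempered frequency in Hz (A4 = 440)."""
    return A4 * 2 ** ((midi_number(name, octave) - 69) / 12)

low, high = midi_number('E', 0), midi_number('C', 7)   # E0 .. C7 inclusive
print(high - low + 1)  # 81 notes / strings
print(round(freq('E', 0), 1), round(freq('C', 7), 1))
```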

Having worked all this out, I made scale drawings of the 3 planks, with their strings, frets, pickup coils, and a simple fine-tuning mechanism included. It was then possible to manipulate a copy of the piano action blueprint drawing – with measurements removed, reduced in size, and reversed as needed – so it could be superimposed onto the planks’ scale drawings, to the correct relational size and so on. I did this without the aid of any computer software, partly because in those days CAD apps were relatively expensive, and partly because it was difficult to find any that looked like I could learn to use them quickly. Since I had already drawn everything to scale in the traditional way – using draftsman’s tools and a drawing board – it made sense to work with those drawings, so instead of CAD, I used photocopies done at a local printing shop, reduced / reversed etc, as needed.

Key drawing of 3 planks, strings, frets, fine tuning mechanism and pickup coils, combined with upright piano action

 

It was only really at this point, once the image of the piano action’s schematic was married up to the scale drawings of the 3 planks, that I began to fully understand where this work was heading, in terms of design. But from then on, it was relatively easy to come up with the rest of the concepts and to draw something for them, so that work could proceed on the frame to hold up the planks, the key bed, legs, and a base at ground level.

Around this time, I came across an old retired light engineer, Reg Huddy, who had a host of engineer’s machines – drill presses, a lathe, milling machine, and so on – set up in his home. He liked to make small steam engines and things of that nature, and when I first went to see him, we hit it off immediately. In the end he helped me make a lot of the metal parts that were needed for the instrument, and to machine in various holes and the pickup coil routing sections on the wood planks. He was very interested in the project, and as I was not very well off, he insisted on charging minimal fees for his work. Reg also had a better idea for the fine tuning mechanism than the one I had come up with, and we went with his version, as soon as he showed it to me.

If I am honest, I don’t think I would ever have finished the work on this project without all the help that Reg contributed. I would buy in raw materials if he didn’t already have them, and we turned out various parts as needed, based either on 3-view drawings I had previously come up with, or for other parts we realised would be required as the project progressed, from drawings I worked up as we went along. Reg sometimes taught me to use his engineering machinery, and although I was a bit hesitant at times, after a while I was working on these machines to a very basic standard.

I took the wood already bought for the instrument during the work on Approach 1, to Jonny Kinkead of Kinkade Guitars, and he did the cutting, gluing up and shaping to the required sizes and thicknesses for the 3 planks. The aim was to go with roughly the length of a Gibson SG neck and body, to make the planks the same thickness as an SG body, and to include an angled bit as usual at the end where an SG or any other guitar is tuned up, the “machine head” end. Jonny is an excellent craftsman and was able to do this work to a very high standard, based on measurements I provided him with.

As well as getting everything made up for putting onto the planks, the piano action itself needed various modifications. The highest notes had string lengths that were so short that the existing dampers had to be extended so they were in the correct place, as otherwise they would not have been positioned over those strings at all. Extra fine adjustments were needed for each damper, so that instead of having to physically bend the metal rod holding a given damper in place – an inexact science at the best of times – it was possible to turn a “grub screw” to accomplish the same thing, but with a much greater degree of precision. And finally, especially important for the action, the usual felt piano “hammers” were to be replaced by smaller versions made of stiff wire shaped into a triangle. For these, I tried a few design mock-ups to find the best material for the wire itself, and to get an idea of what shape to use. Eventually, once this was worked out, I made up a “jig” around which it was possible to wrap the stiff wire so as to produce a uniformly shaped “striking triangle” for each note. This was then used to make 81 of these new hammers, as similar to each other as possible. Although using the jig in this way was a really fiddly job, the results were better than I had expected, and they were good enough.

 

Close-up of a few hammers, dampers and strings

While this was all underway, I got in touch with an electric guitar pickup maker, Kent Armstrong of Rainbow Pickups. When the project first started, I had almost no knowledge of solid body electric guitar physics at all, and I certainly had no idea how pickup coils worked. Kent patiently explained this to me, and once he understood what I was doing, we worked out as practical a design for long humbucker coils as possible. A given coil was to go all the way across one of the 3 planks, “picking up” from around 27 strings in total – but for the rightmost plank, the upper strings were so short that there was not enough room to do this and still have both a “bridge” and a “neck” pickup, so the top octave of notes had to have these two sets of coils stacked one on top of the other, using deeper routed areas in the wood than elsewhere.

For the signal to send to the amplifier, we aimed for the same overall pickup coil resistance (Ω) as on a normal electric guitar. By using larger gauge wire and less windings than normal, and by wiring up the long coils from each of the 3 planks in the right way, we got fairly close to this, for both an “overall bridge” and an “overall neck” pickup. Using a 3-way switch that was also similar to what’s found on a normal electric guitar, it was then possible to have either of these 2 “overall” pickups – bridge or neck – on by itself, or both at once. Having these two coil sets positioned a similar distance away from the “bridge end” of the strings as on a normal guitar, resulted in just the sort of sound difference between the bridge and neck pickups, as we intended. Because, as explained above, we had to stack bridge and neck coils on top of each other for the topmost octave of notes, those very high notes – much higher than on most electric guitars – did not sound all that different with the overall “pickup switch” position set to “bridge”, “neck”, or both at once. That was OK though, as those notes were not expected to get much use.
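The coil-wiring arithmetic here is just series and parallel resistance. As an illustrative sketch only – the text does not give the actual coil resistances, so the 24 kΩ per-plank figure and the 8 kΩ humbucker-like target below are assumptions – three higher-resistance long coils wired in parallel can land near a normal overall pickup reading:

```python
def series(*r):
    """Total resistance of resistances wired in series."""
    return sum(r)

def parallel(*r):
    """Total resistance of resistances wired in parallel."""
    return 1.0 / sum(1.0 / x for x in r)

# Hypothetical values: aim for ~8 kohm overall, as on a typical humbucker.
plank_coil = 24.0  # kohm per long plank coil (assumed, not from the text)
print(round(parallel(plank_coil, plank_coil, plank_coil), 1))  # 8.0 kohm
```

Using heavier wire and fewer windings, as described above, is what keeps each long coil’s resistance in a workable range before this combination step.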

Some electric guitar pickups allow the player to adjust the volume of each string using a screw or “grub screw” etc. For the upright electric guitar I added 2 grub screws for every string, for each of the bridge and neck coils, which meant there were over 300 of these to adjust. Once the coils were ready, covered in copper sheeting to screen out any unwanted interference, and mounted up onto the planks, I made some early adjustments to a few of these grub screws and tested the volumes of those notes. This enabled working up a graph to calculate how much to adjust the height of each of the 300+ grub screws, for all 81 strings. This seemed to work quite well in the end, and there was a uniform change in volume from one end of the available notes to the other, comparable to a typical electric guitar.
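One way such a graph can be put to work – this is a hypothetical sketch of the idea, not the author’s actual method, and the measured heights below are invented calibration values – is piecewise-linear interpolation between a handful of test strings:

```python
def interpolate(x, points):
    """Piecewise-linear interpolation through sorted (x, y) calibration points."""
    pts = sorted(points)
    for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
        if x0 <= x <= x1:
            return y0 + (y1 - y0) * (x - x0) / (x1 - x0)
    raise ValueError("x outside calibrated range")

# Hypothetical calibration: grub screw heights (mm) measured on a few strings.
measured = [(1, 2.0), (27, 1.4), (54, 1.0), (81, 0.8)]
print(round(interpolate(40, measured), 3))  # height to set for string 40
```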

Unlike a normal electric guitar, fine tuning on this instrument was done at the “ball end” / “bridge end” of each string, not the “machine heads end” / “nut end”. The mechanism for this involved having a very strong, short piece of round rod put through the string’s “ball”, positioning one end of this rod into a fixed groove, and turning a screw using an allen key near the other end of the rod, to change the tension in the string. It did take a while to get this thing into tune, but I have always had a good ear, and over the years I had taught myself how to tune a normal piano, which is much more difficult than doing this fine tuning of the upright electric guitar instrument.

Fine tuning mechanisms for each string (in the upper right part of the photo)

Hammers, dampers, strings, pickup coils and their grub screws, and fine tuning mechanisms

 

A frame made of aluminium was designed to support the 3 planks vertically. They were quite heavy on their own, and much more so with all the extra metal hardware added on, so the frame had to be really strong. Triangle shapes gave it extra rigidity. To offset the string tensions, truss rods were added on the back of the 3 planks, 4 per plank at equal intervals. When hung vertically, the 3 planks each had an “upper” end where the fine tuning mechanisms were found and near where the pickup coils were embedded and the strings were struck, and a “lower” end where the usual “nut” and “machine heads” would be found. I used short aluminium bars clamping each of 2 adjacent strings together in place of a nut, and zither pins in place of machine heads. The “upper” and “lower” ends of the planks were each fastened onto their own hefty piece of angle iron, which was then nestled into the triangular aluminium support frame. The result of this design was that the planks would not budge by even a tiny amount, once everything was put together. This was over-engineering on a grand scale, making it very heavy – but to my thinking at that time, this could not be helped.

The piano keys themselves also had to have good support underneath. As well as preventing sagging in the middle keys and any other potential key slippage, the “key bed” had to be as thin as possible, as I have long legs and have always struggled with having enough room for them under the keys of any normal piano. These 2 requirements – both thin and strong – led me to have some pieces of aluminium bar heat treated for extra strength. Lengths of this reinforced aluminium bar were then added “left to right”, just under the keys themselves, having already mounted the keys’ standard wooden supports – included in what came with the piano action – onto a thin sheet of aluminium that formed the basis of the key bed for the instrument. There was enough height between the keys and the bottom of these wooden supports, to allow a reasonable thickness of aluminium to be used for these left-to-right bars. For strength in the other direction of the key bed – “front to back” – 4 steel bars were added, positioned so that, as I sat at the piano keyboard, they were underneath but still out of the way. Legs made of square steel tubing were then added to the correct height to take this key bed down to a “base” platform, onto which everything was mounted. Although this key bed ended up being quite heavy in its own right, with the legs added it was as solid as a rock, so the over-engineering did at least work in that respect.

If you have ever looked inside an upright piano, you might have noticed that the “action” mechanism usually has 2 or 3 large round nuts you can unscrew, after which it is possible to lift the whole mechanism up and out of the piano and away from the keys themselves. On this instrument, I used the same general approach to do the final “marrying up” – of piano keys and action, to the 3 planks of wood suspended vertically. The existing action layout already had “forks” that are used for this, so everything on the 3 planks was designed to allow room for hefty sized bolts fastened down tightly in just the right spots, in relation to where the forks would go when the action was presented up to the planks. The bottom of a normal upright piano action fits into “cups” on the key bed, and I also used these in my design. Once the planks and the key bed were fastened down to the aluminium frame and to the base during assembly, then in much the same way as on an upright piano, the action was simply “dropped down” into the cups, then bolted through the forks and onto, in this case, the 3 planks.

It’s usually possible to do fine adjustments to the height of these cups on an upright piano, and it’s worth noting that even a tiny change to this will make any piano action behave differently. This is why it was so important to have both very precise tolerances in the design of the upright electric guitar’s overall structure, together with as much strength and rigidity as possible for the frame and other parts.

With a normal upright piano action, when you press a given key on the piano keyboard, it moves the damper for that single note away from the strings, and the damper returns when you let go of that key. In addition to this, a typical upright piano action includes a mechanism for using a “sustain pedal” with the right foot, so that when you press the pedal, the dampers are pushed away from all the strings at the same time, and when you release the pedal, the dampers are returned back onto all the strings. The upright piano action bought for this instrument did include all this, and I especially wanted to take advantage of the various dampering and sustain possibilities. Early study, drawing and calculations of forces, fulcrums and so on, eventually enabled use of a standard piano sustain foot pedal – bought off the shelf from that same firm, Herrberger Brooks – together with a hefty spring, some square hollow aluminium tube for the horizontal part of the “foot to dampers transfer” function, and a wooden dowel for the vertical part of the transfer. Adjustment had to be made to the position of the fulcrum, as the first attempt led to the foot pedal needing too much force, which made it hard to operate without my leg quickly getting tired. This was eventually fixed, and then it worked perfectly.

At ground level I designed a simple “base” of aluminium sheeting, with “positioners” fastened down in just the right places so that the legs of the key bed, the triangular frame holding up the 3 planks, and the legs of the piano stool to sit on, always ended up in the correct places in relation to each other. This base was also where the right foot sustain pedal and its accompanying mechanism were mounted up. To make it more transportable, the base was done in 3 sections that could fairly easily be fastened together and disassembled.

After building – further tests and possible modifications

When all this design was finished, all the parts were made and adjusted as needed, and the instrument could finally be assembled and tried out. The first time I put it together, added the wiring leads, plugged it into the Marshall stack, and then tuned it all up, it was a real thrill to finally be able to sit and play it. But even with plenty of distortion on the amp, it didn’t really sound right – it was immediately obvious that there was too much high frequency in the tone. It had wonderful amounts of sustain, but the price being paid for this was that the sound was some distance away from what I was really after. In short, the instrument worked, but instead of sounding like a Gibson SG – or any other electric guitar for that matter – it sounded a bit sh***y.

When I had first started working on this project, my “ear” for what kind of guitar sound I wanted, was in what I would describe as an “early stage of development”. Mock-up tests done during Approach 1, before 1990, had sounded kind of right at that time. But once I was able to sit and play the finished instrument, and to hear it as it was being played, with hindsight I realised that my “acceptable” evaluation of the original mock-up was more because, at that point, I had not yet learned to identify the specific tone qualities I was after. It was only later as the work neared completion, that my “ear” for the sound I wanted became more fully developed, as I began to better understand how a solid body electric guitar behaves, what contributes to the tone qualities you hear from a specific instrument, and so on.

I began asking some of the other people who had been involved in the project for their views on why it didn’t sound right. Two things quickly emerged from this – it was too heavy, and the strings were being struck rather than plucked.

Kent Armstrong, who made the pickups for the upright electric guitar, told me a story about how he once did a simple experiment which, in relation to my instrument, demonstrated what happens if you take the “it’s too heavy” issue to the extreme. He told me about how he had once “made an electric guitar out of a brick wall”, by fastening an electric guitar string to the wall at both ends of the string, adding a pickup coil underneath, tuning the string up, sending the result into an amp, and then plucking the string. He said that this seemed to have “infinite sustain” – the sound just went on and on. His explanation for this was that because the brick wall had so much mass, it could not absorb any of the vibration from the string, and so all of its harmonics just stayed in the string itself.

Although this was a funny and quite ludicrous example, I like this kind of thing, and the lesson was not lost on me at the time. We discussed the principles further, and Kent told me that in his opinion, a solid body electric guitar needs somewhere around 10 to 13 pounds of wood mass, in order for it to properly absorb the strings’ high harmonics in the way that gives you that recognisable tone quality we would then call “an electric guitar sound”. In essence, he was saying that the high frequencies have to “come out”, and then it’s the “warmer” lower harmonics which remain in the strings, that makes an electric guitar sound the way it does. This perfectly fit with my own experience of the tones I liked so much, in a guitar sound I would describe as “desirable”. Also, it did seem to explain why my instrument, which had a lot more “body mass” than 10 to 13 pounds – with its much larger wood planks, a great deal of extra hardware mounted onto them, and so on – did not sound like that.

As for striking rather than plucking the strings, I felt that more trials and study would be needed on this. I had opted to use hammers to strike the strings, partly as this is much simpler to design for – the modifications needed to the upright piano action bought off the shelf were much less complicated than those that would have been required for plucking. But there was now a concern that the physics of plucking and striking might be very different from each other, and if so there might be no way of getting around this, except to pluck the strings after all.

I decided that in order to work out what sorts of changes would best be made to the design of this instrument to make it sound better, among other things to do as a next step, I needed first-hand experience of the differences in tone quality between various sizes of guitar body. In short, I decided to make it my business to learn as much as I could about the physics of the solid body electric guitar, and if necessary, to learn more than perhaps anyone else out there might already know. I also prepared for the possibility that a mechanism to pluck the strings might be needed.

At that time, in the mid 1990s, there had been some excellent research carried out on the behaviour of acoustic guitars, most notably by a Dr Stephen Richardson at the University of Cardiff. I got in touch with him, and he kindly sent me details on some of this work. But he admitted that the physics of the acoustic guitar – where a resonating chamber of air inside the instrument plays a key part in the kinds of sounds and tones that the instrument can make – is fundamentally different to that of a solid body electric guitar.

I trawled about some more, but no one seemed to have really studied solid body guitar physics – or if they had, nothing had been published on it. Kent Armstrong’s father Dan appeared on the scene at one point, as I was looking into all this. Dan Armstrong was the inventor of the Perspex bass guitar in the 1960s. When he, Kent and I all sat down together to have a chat about my project, it seemed to me that Dan might in fact know more than anyone else in the world, about what is going on when the strings vibrate on a solid body guitar. It was very useful to hear what he had to say on this.

I came away from all these searches for more knowledge, with further determination to improve the sound of the upright electric guitar. I kept an eye out for a cheap Gibson SG, and as luck would have it, one appeared online for just £400.00 – for any guitar enthusiasts out there, you will know that even in the 1990s, that was dirt cheap. I suspected there might be something wrong with it, but decided to take a risk and buy it anyway. It turned out to have a relatively correct SG sound, and was cheap because it had been made in the mid 1970s, at a time when Gibson were using inferior quality wood for the bodies of this model. While it clearly did not sound as good as, say, a vintage SG, it was indeed a Gibson original rather than an SG copy, and it did have a “workable” SG sound that I could compare against.

I also had a friend with a great old Gibson SG Firebrand, one that sounded wonderful. He offered to let me borrow it for making comparative sound recordings and doing other tests. I was grateful for this, and I did eventually take him up on the offer.

One thing that I was keen to do at this stage, was to look at various ways to measure – and quantify – the differences in tone quality between either of these two Gibson SGs and the upright electric guitar. I was advised to go to the Department of Mechanical Engineering at the University of Bristol, who were very helpful. Over the Easter break of 1997, they arranged for me to bring in my friend’s SG Firebrand and one of my 3 planks – with its strings all attached and working – so that one of their professors, Brian Day, could conduct “frequency sweep” tests on them. Brian had been suffering from early onset of Parkinson’s disease and so had curtailed his normal university activities, but once he heard about this project, he was very keen to get involved. Frequency sweep tests are done by exposing the “subject” instrument to an artificially created sound whose frequency is gradually increased, while measuring the effect this has on the instrument’s behaviour. Brian and his colleagues carried out the tests while a friend and I assisted. Although the results did not quite have the sorts of quantifiable measurements I was looking for, they did begin to point me in the right direction.

After this testing, someone else recommended I get in touch with a Peter Dobbins, who at that time worked at British Aerospace in Bristol and had access to spectral analysis equipment at their labs, which he had sometimes used to study the physics of the hurdy gurdy, his own personal favourite musical instrument. Peter was also very helpful, and eventually he ran spectral analysis of cassette recordings made of plucking, with a plectrum, the SG Firebrand, the completed but “toppy-sounding” upright electric guitar, and a new mock-up I had just made at that point, one that was the same length as the 3 planks, but only around 4 inches wide. This new mock-up was an attempt to see whether using around 12 or 13 much narrower planks in place of the 3 wider ones, might give a sound that was closer to what I was after.

 

Mock-up of possible alternative to 3 planks – would 12 or 13 of these sound better instead? Shown on its own (with a long test coil), and mounted up to the keys and action setup so that plucking tests could make use of the dampers to stop strings moving between recordings of single notes

As it turned out, when the same note was plucked on each, the new mock-up did not sound much different to the completed upright electric guitar itself. It was looking like there was indeed a “range” of solid guitar body mass that gave the right kind of tone: although the exact reasons for the behaviour of “too much” and “too little” mass might differ, any amount of wood on either side of that range just couldn’t absorb enough of the high harmonics out of the strings. Despite the disappointing result of the new mock-up sounding fairly similar to the completed instrument, I went ahead and gave Peter the cassette recordings of it, of the completed instrument, and of my friend’s SG Firebrand, and he stayed late one evening at work and ran the spectral analysis tests on all of them.

Peter’s spectral results were just the kind of thing I had been after. He produced 3D graphs that clearly showed the various harmonics being excited when a given string was plucked, how loud each one was, and how long they went on for. This was a pictorial, quantitative representation of the difference in tone quality between my friend’s borrowed SG Firebrand and both the completed instrument and the new mock-up. The graphs gave proper “shape” and “measure” to these differences. By this time, my “ear” for the sort of tone quality I was looking for was so highly developed that I could distinguish between these recordings immediately on hearing any of them. And what I could hear was reflected precisely in these 3D graphs.
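For anyone curious about the mechanics of such an analysis: the 3D graphs were, in effect, spectrograms. As a rough illustration only (not Peter’s actual method or equipment), the sketch below synthesises a decaying “plucked string” signal in Python and computes amplitude against frequency and time, the same three axes as the graphs described:

```python
import numpy as np

# Synthesise a crude "plucked string": a fundamental plus harmonics,
# each decaying at its own rate (on a warm-sounding guitar the higher
# harmonics die away quickly; on a "toppy" one they persist).
rate = 8000                       # samples per second
t = np.arange(0, 2.0, 1 / rate)   # two seconds of signal
f0 = 82.4                         # open bottom E string, in Hz
signal = sum((1 / n) * np.exp(-n * t) * np.sin(2 * np.pi * n * f0 * t)
             for n in range(1, 8))

# A spectrogram slices the recording into short windows and Fourier-
# transforms each one, giving amplitude as a function of frequency AND
# time - the two horizontal axes of the 3D graphs described above.
win = 1024
windows = signal[: len(signal) // win * win].reshape(-1, win)
spectra = np.abs(np.fft.rfft(windows, axis=1))   # one spectrum per time slice
freqs = np.fft.rfftfreq(win, d=1 / rate)

# Each harmonic appears as a ridge near n * f0; reading along the time
# axis shows how quickly that harmonic decays.
peak = freqs[np.argmax(spectra[0])]
print(f"strongest component in the first slice: ~{peak:.0f} Hz")
```

On a real recording, a “toppy” instrument shows many tall, long-persisting ridges at the higher harmonics, exactly the pattern the graphs revealed for the upright electric guitar.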

 

Spectral analysis graphs in 3D, of Gibson SG Firebrand “open bottom E” note plucked, and the same note plucked on the upright electric guitar. Frequency in Hz is on the x axis and time on the y axis, with time starting at the “back” and moving to the “front” on the y axis. Harmonics are left-to-right on each graph – leftmost is the “fundamental”, then 1st harmonic etc. Note how many more higher harmonics are found on the right graph of the upright electric guitar, and how they persist for a long time. I pencilled in frequencies for these various harmonics on the graph on the right, while studying it to understand what was taking place on the string.

While this was all underway, I also mocked up a few alternative types of hammers and carried out further sound tests to see what difference in tone would result from using different materials, while still striking the string. Even though I was more or less decided on moving to a plucking mechanism, for completeness and full understanding I wanted to see if any significant changes might show up from using different sorts of hammers. For these experiments I tried some very lightweight versions in plastic, the usual felt upright piano hammers, and a couple of much heavier ones in wood. Not only was there almost no difference in the tone quality that each of these widely varied hammers produced, it also made next to no difference where along the string you actually struck it.

Other hammer designs tried – there was little variation in the sound each of these produced

These experiments, and some further discussions with a guitar maker who had helped out on the project, brought more clarification to my understanding of hammers vs plucking. Plucking a string seems to make its lower harmonics get moving right away, and they then start out with more volume compared to that of the higher harmonics. The plucking motion will always do this, partly because there is so much energy being transferred by the plectrum or the player’s finger – and this naturally tends to drive the lower harmonics more effectively. When you hit a string with any sort of hammer though, the effect is more like creating a sharp “shock wave” on the string, but one with much less energy. This sets off the higher harmonics more, and the lower ones just don’t get going properly.

In a nutshell, all of this testing and research confirmed the limitations of hammers, and the fact that there are indeed fundamental differences between striking and plucking an electric guitar string. Hammers were definitely “out”.

To summarise the sound characteristic of the upright electric guitar: its heavy structure meant that its wood planks could not absorb enough high frequencies out of the strings, so it naturally produced a tone with too many high harmonics and not enough low ones. Hitting its strings with a hammer instead of plucking them reinforced this tonal behaviour still further, in the same direction.

The end?

By this point in the project, as 1998 arrived and we got into the spring and summer of that year, I had run into some financial difficulties, partly because this inventing business is expensive. Despite having built a working version of the upright electric guitar, the instrument was very heavy and took some time to assemble and take apart, making it impractical for taking on tour, for example; but the unacceptable sound quality alone meant that it was not usable. Mocked-up attempts to modify the design so that there would be many narrow planks had not improved the potential of the sound to any appreciable degree, either.

I realised that I was probably reaching the end of what I could achieve on this project out of my own pocket. To fully confirm some of the test results, and my understanding of what it is that makes a solid body electric guitar sound the way it does, I decided to perform a fairly brutal final test. To this end, I first made recordings of plucking the 6 open strings on the cheap SG I had bought online for £400.00. Then I had the “wings” of this poor instrument neatly sawn off, leaving a body the same 4-inch width as the new mock-up. This remaining width was enough that the neck was unaffected by the surgery, which reduced the overall mass and shape of the wood left on the guitar down to something quite similar to the new mock-up.

I did not really want to carry out this horrible act, but I knew that it would fully confirm all the indications regarding the principles, behaviours and sounds I had observed in the 3 planks of the completed upright electric guitar, in the new mock-up, and in other, “proper” SG guitars that, to my ear, sounded right. If, by doing nothing else except taking these lumps of wood mass away from the sides of the cheap SG, its sound went from “fairly good” to “unacceptably toppy”, it could only be due to that change in wood mass.

After carrying out this crime against guitars by chopping the “wings” off, I repeated the recordings of plucking the 6 open strings. Comparison with the “before” recordings confirmed my suspicions: exactly as I had feared and expected, the “after” sound had many more high frequencies in it. In effect I had “killed” the warmth of the instrument just by taking off those wings.

In September 1998, with no more money to spend on this invention, and now clear that the completed instrument was a kind of “design dead end”, I made the difficult decision to pull the plug on the project. I took everything apart, recycled as many of the metal parts as I could (Reg Huddy was happy to have many of these), gave the wood planks to Jonny Kinkead for him to use to make a “proper” electric guitar with as he saw fit, and then went through reams of handwritten notes, sketches and drawings from 12 years of work, keeping some key notes and drawings which I still have today, but having a big bonfire one evening at my neighbour’s place, with all the rest.

Some “video 8” film of the instrument remained, and I recently decided to finally go through all of that, and all the notes and drawings kept, and make up a YouTube video from it. This is what Greatbear Analogue & Digital Media has assisted with. I am very pleased with the results, and am grateful to them. Here is a link to that video: https://youtu.be/pXIzCWyw8d4

As for the future of the upright electric guitar, in the 20 years since ceasing work on the project, I have had a couple of ideas for how it could be redesigned to sound better and, for some of those ideas, to also be more practical.

One of these new designs involves using similar narrow 4-inch planks as on the final mock-up described above, but adding the missing wood mass back on as “wings” sticking out the back – where they would not be in the way of string plucking etc – positioned at a 90-degree angle to the usual plane of the body. This would probably be big and heavy, but it would be likely to sound a lot closer to what I have always been after.

Another design avenue might be to use 3 or 4 normal SGs and add robotic plucking and fretting mechanisms, driven by electronic sensors hooked up to another typical upright piano action and set of keys, with some programmed software to make the fast decisions needed to work out which string and fret to use on which SG guitar for each note played on the keyboard, and so on. While this would not give the same level of intimacy between the player and the instrument itself as even the original upright electric guitar had, the tone of the instrument would definitely sound more or less right, allowing for loss of “player feeling” from how humans usually pluck the strings, hold down the frets, and so on. This approach would most likely be really expensive, as quite a lot of robotics would probably be needed.

An even more distant possibility in relation to the original upright electric guitar, might be to explore additive synthesis further, the technology that the firm Musicom Ltd – with whom I collaborated during Approach 2 in the early 1990s – continue to use even today, for their pipe organ sounds. I have a few ideas on how to go about such additive synthesis exploration, but will leave them out of this text here.

As for my own involvement, I would like nothing better than to work on this project again, in some form. But these days, there are the usual bills to pay, so unless there is a wealthy patron or perhaps a sponsoring firm out there who can afford to both pay me enough salary to keep my current financial commitments, and to also bankroll the research and development that would need to be undertaken to get this invention moving again, the current situation is that it’s very unlikely I can do it myself.

Although that seems a bit of a shame, I am at least completely satisfied that, in my younger days, I had a proper go at this. It was an unforgettable experience, to say the least!

Museum of Magnetic Sound Recording – interview with Martin Theophilus

February 1st, 2017

We recently spoke to Martin Theophilus, Executive Director of the Museum of Magnetic Sound Recording based in Austin, Texas.

While the Great Bear studio is a kind of museum – it is full of old machines that we maintain and preserve – we wanted to know more about this ‘proper’ Magnetic Sound Recording Museum.

How did the collection get started, what kind of equipment does it collect and what do they think the future holds for magnetic tape?

Many thanks to Martin for taking time to respond to our questions. If you want to support the Museum of Magnetic Sound Recording’s aim to establish a permanent storage facility you can make a donation here.

Enjoy!

GB: When and how did the Museum of Magnetic Sound Recording get started?

M: The Museum was created in an effort to preserve our vintage recording collection, which was initiated in 1998 with the web site Reel2ReelTexas.com. I began recording audio professionally in 1964, and our production switched to video in the early 1990s. In 1998, the collection began with a gift of an Edison cylinder player from my wife Chris. I missed having the tape recorders around, so we began acquiring the recorders I’d worked with, and then several historically significant recorders were secured. One is the first professional magnetic tape recorder built in the US: the 1948 Ampex 200A #33 reel to reel tape recorder that belonged to Capitol Records. We also have Willie’s first T-26 Dynavox tape recorder.

We have many “very first” recording devices from Ampex, Berlant, Brush, Magnecord, Pioneer, Sony, Studer and Teac/Tascam. While there are not many large multi-track recorders, the intent was to display the recording devices that assisted musicians in creating their music. There are now around 225 tape recorders and 100+ vintage classic microphones. In 2012 we decided the collection was of significance and needed to be preserved and made available to the public in a permanent secure facility. We founded the non-profit and acquired a dedicated Board, with all original members staying the course with us.

GB: How are you funded and how can people view the collection?

M: Presently the Museum is funded by private donations. At this time we are functioning with volunteers and the collection is available to view on line. By appointment we provide private tours in our Studio/Museum.

GB: What is your favourite piece of (working) equipment and why?

M: That’s difficult, however it is the Studer A807. It is in excellent condition and is one of the top Studer machines produced. Incidentally, Studer had a wonderful museum saving their history; it disappeared after Harman took Studer over.


A tour of the Studer tape recorder and mixer ‘museum’ and a company history, recorded in Switzerland before the museum relocated to the Soundcraft Studer HQ in the UK.

GB: What is your favourite piece of (non-working) equipment and why?

M: There has to be two. 1) One would be the Ampex 200A #33 mentioned above. It just needs motor capacitors and will be operating soon. The 200A was overbuilt and weighed 240 lbs. While it originally belonged to Capitol Records, it eventually ended up with the San Francisco engineer/producer Leo De Gar Kulka.  2) The second is the Sony TC-772 half track 15 ips portable location recorder. It too needs motor capacitors. It was able to complete long high quality remote recordings and provided audio limiters, vari-speed and XLR connections. Beautiful design.

GB: What are the challenges of preserving magnetic sound recording? Is there a tension between keeping the machines working, and preserving their appearance as museum exhibits? Do you also seek to preserve the context surrounding the machines, i.e., marketing materials and so forth?

M: We strive to acquire the most complete and working examples of the items in the collection. Several were traded up repeatedly; another favourite, the Technics RS-1700, was traded up six times before we acquired a showroom-quality recorder. The same was true for its dust cover, and now both are “as new.” The working units need to be exercised regularly, oiled, heads cleaned and aligned, and kept as clean as possible. I can go around the collection one day and everything is working well; the next day there may be a tour and some will always be finicky. The Swiffer duster is a valuable tool to keep the items clean. They are all in air conditioned rooms, but it is Texas and there will be dust.

The things we believe set our collection apart from others are: 1) most units work, are connected to sound systems and can be demonstrated, and 2) for each unit we have acquired and display not only manuals, but also ads, brochures, reviews and posters. All of these are scanned and loaded to the web site.

Currently, we have over 1,000 images that are waiting to be processed and added to the site. Additionally, the Museum has most of the radio catalogs (Allied, Burstein Applebee, Lafayette, Olsen, Radio Shack, and more) and magazines (AES Journals, Engineer Producer, Db, Modern Recording, Tape Recorder, etc.) that advertised tape recorders from the 1930’s until they quit publishing. The recorder and microphone sections have also been scanned and added to the website.

GB: What kind of people come to the museum tours? What response do they have to the material?

M: Most of the tours we provide are: folks who have been active in the recording industry; professional musicians; other collectors; radio and TV related folks; persons who have viewed the web site and are visiting in the Austin area; students; teachers; and people who are making a donation of a piece of equipment.

The responses have been overwhelming. As are visits to the web site.  We maintain an ongoing web site survey asking if folks support the creation of our permanent public facility.

GB: Do you ever work with audio visual archivists to offer advice about preservation?

M: In the Spring of 2015, University of Texas at Austin’s School of Architecture’s Third Year Interior Design Class completed 11 interior designs for our Museum. One of the students won a $30,000 scholarship with her museum design. In that process, the UT School of Architecture provided significant information regarding preservation practices. The Bob Bullock Texas State History Museum’s Deputy Director, Margaret Koch, has been a supporter and mentor for our museum and provided many recommendations for preservation as we move forward. Just in the past couple of weeks, Peter Hammer, curator of the Ampex Museum prior to its donation to Stanford University, has agreed to provide our museum with preservation practices. Peter also envisions our re-creating the original Ampex Museum within our Museum of Magnetic Sound Recording. While we maintain the collection in a climate controlled studio, we will be more able to adhere to preservation practices when we have a permanent public facility.

GB: What do you see as the future of magnetic sound recording?

M: Magnetic sound recording will hopefully always be preserved and new discoveries integrated into the current knowledge. Magnetic cassettes have recently gained new attention (vinyl too). Maybe reel tape recorders will make a comeback. On our home page we show a new Revox A77 reel tape recorder being built by Akai. Otari still custom produces their classic MX-5050 reel tape recorder.

More importantly, professional recording studios around the globe are finding that many musicians love analogue recordings, so they are retaining, or acquiring analogue recorders. The evolutionary period of magnetic recording beginning in Germany in 1934 to the dawn of digital around 1982, spans an almost fifty year period. While the recording quality of vinyl had evolved and many still consider it of top reproduction quality, the advent of magnetic tape with the ability to edit and reproduce multiple copies was an incredible breakthrough.

GB: Your website is full of amazing information. What is the relationship between the online site and the physical museum?

M: Interesting question, because our intent has always been to provide as much web information as possible (far beyond the physical collection). In our recent conversations with Peter Hammer, the Ampex Museum curator, it is his belief that our preservation work: saving and scanning manuals, ads, catalogs, letters and all the supporting documentation, will actually be more significant than the actual machines themselves. 

Peter states: “When I say to people, ‘Digits last longer than molecules’, that tends to make them think twice about the extreme impermanence of physical collections, especially after I tell them horror stories like the Ampex Museum, the Anna Amalia Library fire in Weimar in 2004, the Cologne City Museum collapse in 2009, and now a new one for me, the sad demise of the Studer collection. Physical collections simply cannot withstand the vagaries of governmental agencies, corporations, private owners, the weather, or seismic stability!”

However, I am still passionate about creating a safe permanent public facility for the collection. There is much to be said for folks being able to actually view and operate a vintage recorder and view the process of making a recording.

GB: Anything else you want to say?

M: We have come to realize that to implement our vision, we will require a major donor who would enable the museum in the long term. We also found that preserving recording technology cannot compete with the museums that are preserving the musicians and their music. The Bob Bullock Texas History Museum considered displaying some of our magnetic recording items when they expanded their Texas music section. However they determined that folks were more likely to visit displays about Texas music. For that reason they went with the history of the Austin City Limits and items from music collections from the Rock ’n Roll Hall of Fame and the Grammy Museum.
——-
In closing, I thank you for this opportunity you’ve given me to reflect on what our goals are. We have responded to many promising opportunities, received significant verbal support, but have yet to bring the permanent facility to fruition. Due to last year’s heavy production schedule and some folks who did not follow through, I was discouraged. So last October I told our Board that maybe the museum had run its course. However, they would have none of that and encouraged us to push forward. Shortly after that we received a nice donation and I met Peter Hammer who has become an excellent resource who will be providing valuable Ampex documents and preservation consultation. So I feel very positive about our mission and will be happy to keep you posted as we progress.

Gregory Sams’s VegeBurger – Food Revolution

January 23rd, 2017

‘Watch out: the vegetarians are on the attack’ warned an article published in the April 1984 edition of the Meat Trades Journal.

The threat? A new product that would revolutionise the UK’s eating habits forever.

Gregory Sams’s VegeBurger invented a vernacular that is now so ubiquitous, you probably thought it had always been here. While vegetarianism can be traced back as far as the 7th century BCE, ‘veggie’, as in the food products and the people that consume them, dates back to the early 1980s.

VegeBurger was the first vegetarian food product to become available on a mass, affordable scale. It was sold in supermarkets rather than niche wholefood shops, and helped popularise the notion that a vegetarian diet was possible.

As the story of the VegeBurger goes, it meant ‘a whole lot of latent vegetarians came out of the closet.’

Whole Food Histories

Before inventing the VegeBurger, Sams opened Seed in 1967, London’s first macrobiotic whole food restaurant. Seed was regularly frequented by all the countercultural luminaries of the era, including John and Yoko.

Working with his brother Craig Sams he started Harmony Foods, a whole food distribution business (later Whole Earth), and published the pioneering Seed – the Journal of Organic Living. 

In 1982 Gregory went out on a limb to launch the VegeBurger. Colleagues in the whole food business (and the bank manager) expressed concern about how successful a single-product business could be. VegeBurger defied the doubters, however, and sales rocketed to 250,000 burgers per week as the 80s wore on.

The burgers may have sold well, but they also helped change hearts and minds. In 1983 his company Realeat commissioned Gallup to conduct a survey of public attitudes to meat consumption.

The survey results coincided with the release of the frozen VegeBurger, prompting substantial debate in the media about vegetarianism. ‘It was news, with more people moving away from red meat consumption than anybody had realized. VegeBurger was on television, radio and newspapers to such a degree that, when I wasn’t being interviewed or responding to a press query, all my time was spent keeping retailers stocked with the new hit’.

Food for Thought

Great Bear have just transferred the 1982 VegeBurger TV commercial that was recorded on the 1″ type C video format.

The advert, Gregory explains, ‘was produced for me by my dear friend Bonnie Molnar who used to work with a major advertising agency and got it all done for £5000, which was very cheap, even in 1982. We were banned from using the word “cowburger” in the original and had to take out the phrase “think about it”, which contravened the Advertising Standards Authority’s stricture that adverts could not be thought provoking! I had also done the original narration, very well, but not being in the union, that was disallowed. What a world, eh?’

Gregory’s story shows that it is possible to combine canny entrepreneurship and social activism. Want to know more about it? You can read the full VegeBurger story on Gregory’s website.

Thanks to Gregory for permission to reproduce the advert and for talking to us about his life.

Pre-Figurative Digital Preservation

January 16th, 2017

How do you start preserving digital objects if your institution or organisation has little or no capacity to do so?

Digital preservation can at first be bit-part and modular. You can build your capacity one step at a time. Once you’ve taken a few steps you can then put them together, making a ‘system’.

It’s always good to start from first principles, so make sure your artefacts are adequately described, with consistent file-naming and detailed contextual information.

You might want to introduce tools such as Fixity into your workflow, which can help you keep track of file integrity. For audio visual content get familiar with MediaInfo, MediaConch, QC Tools or Exactly.
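Tools like Fixity automate what is, at heart, a checksum manifest: record a digest for every file once, then recompute and compare on a schedule. A minimal sketch of the idea in Python (the file name here is invented for the demonstration):

```python
import hashlib
from pathlib import Path

def sha256_of(path):
    """Stream a file through SHA-256 so large video files fit in memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

def check_manifest(manifest):
    """Compare current checksums against a stored {path: digest} manifest."""
    return {p: ("ok" if sha256_of(p) == d else "CHANGED")
            for p, d in manifest.items()}

# Build a tiny demonstration file and a manifest entry for it.
demo = Path("demo_tape_transfer.bin")
demo.write_bytes(b"digitised audio bytes")
manifest = {demo: sha256_of(demo)}
print(check_manifest(manifest))   # everything reports "ok" until bits change
```

Run on a schedule, a report of anything “CHANGED” is exactly the early warning of silent corruption that fixity checking is meant to provide.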

Think of this approach as pre-figurative digital preservation. It’s the kind of digital preservation you can do even if you don’t (yet) have a large-scale digital repository. Pre-figurative digital preservation is when you organise and regularly assess the condition of your collections as if they were managed in a large repository.

So when that day comes and you get the digital content management system you deserve, those precious zeros and ones can be ingested with relative ease, ready to be managed through automated processes. Pre-figurative digital preservation is an upgrade on the attitude that preserving files to make them accessible, often using lossy compression, is ‘good enough’ (we all know that’s not good enough!!)

Pre-figurative digital preservation can help you build an information system that fits your needs and capacities. It is a way to do something rather than avoid the digital preservation ‘problem’ because it seems too big and technically complex.

Learning New Skills

The challenge of managing digitised and born-digital material means archivists will inevitably have to learn new skills. This can feel daunting and time-consuming, as an archivist we have recently worked with told us:

‘I would love to acquire new skills but realistically there’s going to be a limit to how much I can learn of the technical stuff. This is partly because I have very small brain but also partly because we have to stretch our resources very thin to cover all the things we have to do as well as digital preservation.’

Last year the Society of American Archivists launched the Try5 for Ongoing Growth initiative. It offers a framework for archivists who want to develop their technological knowledge. The idea is you learn 5 new technical skills, share your experience (using #Try5SAA) and then help someone else on the basis of what you’ve learnt.

Bertram Lyons from AV Preserve outlined 5 things the under-confident but competence-hungry (audiovisual) archivist could learn to boost their skill set.

These include getting familiar with your computer’s Command Line Interface (CLI), creating and running Checksums, Digital File Packaging, Embedding and Extracting Metadata and understanding Digital Video. Lyons provides links to tutorials and resources that are well worth exploring.

Expanding, bit by bit

If your digital collections are expanding bit by bit and you are yet to tackle the digital elephant in the room, it may well be time to try pre-figurative digital preservation.

We’d love to hear more from archivists whose digital preservation system has evolved in a modular fashion. Let us know in the comments what approaches and tools you have found useful.

 

Developments in Digital Video Preservation – CELLAR

December 13th, 2016

We are living in interesting times for digital video preservation (we are living in interesting times for other reasons too, of course).

For many years digital video preservation has been a confusing area of audiovisual archiving. To date there is no settled standard that organisations, institutions and individuals can universally adopt. As Peter Bubestinger-Steindl argues, ‘no matter whom you ask [about which format to use] you will get different answers. The answers might be correct, but they might not be the right solution for your use-cases.’

While it remains the case that there is still no one-size-fits-all solution for digital video preservation, recent progress made by the Codec Encoding for LossLess Archiving and Realtime transmission (CELLAR) working group should be on the radar of archivists in the field.

The aim of CELLAR is to standardise three lossless open-source audiovisual formats – Matroska, FFV1 and FLAC – for use in archival environments and transmission.

To date the evolution of video formats has largely been driven by broadcast, production and consumer markets. The development of video formats for long term archival use has been a secondary consideration.

The work on the Matroska container, FFV1 video codec and FLAC audio codec is therefore hugely significant because they have, essentially, been developed by audiovisual archivists for audiovisual archivists.

Other key points to note are that Matroska, FFV1 and FLAC are:

1. Open Source. This increases their resilience as a preservation format because the code’s development is widely documented.

And, importantly, they employ

2. Lossless compression. Simply put, lossless compression makes digital video files easier to store and transmit: file size is decreased without damaging integrity.

Managing large file sizes has been a major practical glitch that has held back digital video preservation in the past. The development of effective lossless compression for digital video is therefore a huge advance.
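The guarantee lossless compression makes can be demonstrated with any general-purpose lossless algorithm. The sketch below uses Python’s zlib purely as a stand-in for the far more specialised FFV1 and FLAC: the data gets smaller, and decoding returns the exact original bits.

```python
import zlib

# Highly redundant data - digitised video is similar from frame to
# frame, which is the kind of redundancy lossless codecs exploit.
frames = (b"\x10\x20\x30\x40" * 4096) * 8   # 131072 bytes of "video"

compressed = zlib.compress(frames, level=9)
restored = zlib.decompress(compressed)

# The preservation guarantee: decoding returns the exact original bits,
# unlike lossy compression, which throws information away permanently.
assert restored == frames
print(f"{len(frames)} bytes -> {len(compressed)} bytes, bit-identical on decode")
```

This is precisely the difference from lossy compression: storage shrinks, but a checksum of the decoded stream still matches the original.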

Archival focus

The archival focus is evident in the capacities of the Matroska container, as outlined by Dave Rice and Ashley Blewer in a paper presented at the iPRES conference in 2016.

Here they explain that ‘the Matroska wrapper is organized into top-level sectional elements for the storage of attachments, chapter information, metadata and tags, indexes, track descriptions, and encoding audiovisual data.’

Each of these elements has a checksum associated with it, which means that each part of the file can be checked at a granular level. If there is an error in the track description, for example, this can be specifically dealt with. Matroska enables digital video preservation to become targeted and focused, a very useful thing given the complexity of video files.
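The value of per-element checksums can be sketched with a toy “container” in Python. This illustrates the principle only; it is not real EBML/Matroska parsing, and the section names are invented:

```python
import hashlib

# A toy "container": named sections, each carrying its own checksum,
# mimicking how Matroska can attach a CRC element to each top-level section.
def seal(sections):
    """Store each section's bytes alongside a digest of those bytes."""
    return {name: (data, hashlib.sha256(data).hexdigest())
            for name, data in sections.items()}

def locate_damage(container):
    """Return only the sections whose bytes no longer match their checksum."""
    return [name for name, (data, digest) in container.items()
            if hashlib.sha256(data).hexdigest() != digest]

container = seal({
    "track_description": b"FFV1 video, FLAC audio",
    "metadata_tags":     b"digitised 2016 from 8mm tape",
    "av_data":           b"\x00\x01\x02\x03" * 256,
})

# Corrupt one byte in one section: only that section is reported,
# so a repair effort knows exactly where to look.
data, digest = container["metadata_tags"]
container["metadata_tags"] = (data[:-1] + b"\xff", digest)
print(locate_damage(container))   # ['metadata_tags']
```

A whole-file checksum could only say “something changed somewhere”; per-section digests are what let an error in, say, the track description be dealt with specifically.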


It is also possible to embed technical and descriptive metadata within the Matroska container, rather than alongside it in a sidecar document.

This will no doubt make Matroska attractive to archivists who dream of a container-format that can store additional technical and contextual information.

Yet as Peter Bubestinger, Hermann Lewetz and Marion Jaks argue, ‘keeping everything in one video-file increases the required complexity of the container, the video-codec – or both. It might look “simpler” to have just one file, but the choice of tools available to handle the embedded data is, by design, greatly reduced. In practice this means it can be harder (or even impossible) to view or edit the embedded data. Especially, if the programs used to create the file were rare or proprietary.’

While it would seem that embedding metadata in the container file is currently not wholly practical, developing tools and systems that can handle such information must surely be a priority as we think about the long term preservation of video files.

FFV1 and FLAC are also designed with archival use in mind. FFV1, Rice and Blewer explain, uses lossless compression and contains ‘self-description, fixity, and error resilience mechanisms.’ ‘FLAC is a lossless audio codec that features embedded checksums per audio frame and can store embedded metadata in the source WAVE file.’

Milestones for Digital Video Preservation

By the end of 2016 the CELLAR working group will have submitted standards-track and informational specifications to the Internet Engineering Steering Group (IESG) for Matroska, FFV1, FLAC and EBML, the binary XML format the Matroska container is based on.

Outside of CELLAR’s activities there are further encouraging signs of adoption among the audio visual preservation community.

The Presto Centre’s AV Digitisation and Digital Preservation TechWatch Report #04 has highlighted the growing influence of open source, even within commercial audio visual archiving products.

Austrian-based media archive management company NOA, for example, ‘chose to provide FFV1 as a native option for encoding within its FrameLector products, as they see it has many benefits as a lossless, open source file format that is easy to use, has low computational overheads and is growing in adoption.’

We’ll be keeping an eye on how the standardisation of Matroska, FFV1 and FLAC unfolds in 2017. We will also share our experiences with the format, including whether there is increased demand and uptake among our customer base.

 

Revealing Histories: North Staffordshire

December 7th, 2016

Great Bear are delighted to be working with the Potteries Heritage Society to digitise a unique collection of tape recordings made in the 1970s and 80s by radio producer, jazz musician and canals enthusiast Arthur Wood, who died in 2005.

The project, funded by a £51,300 grant from the Heritage Lottery Fund (HLF), will digitise and make available hundreds of archive recordings that tell the people’s history of the North Staffordshire area. There will be a series of events based on the recordings, culminating in an exhibition in 2018.

The recordings were originally made for broadcast on BBC Radio Stoke, where Arthur Wood was education producer in the 1970s and 80s. They feature local history, oral history, schools broadcasts, programmes on industrial heritage, canals, railways, dialect, and many other topics of local interest.

There are spontaneous memoirs and voxpop interviews as well as full-blown scripted programmes such as the ‘Ranter Preachers of Biddulph Moor’ and ‘The “D”-Day of 3 Men of the Potteries’ and ‘Millicent: Lady of Compassion’, a programme about 19th century social reformer Millicent, Duchess of Sutherland.

Arthur Wood: Educational Visionary

In an obituary published in The Guardian, David Harding described Wood as ‘a visionary. He believed radio belonged to the audience, and that people could use it to find their own voice and record their history. He taught recording and editing to many of his contributors – miners, canal, steel and rail workers, potters, children, artists, historians and storytellers alike.’

The tapes Great Bear will be digitising reflect what Wood managed to retain from his career at the BBC.

Before BBC Radio Stoke moved premises in 2002, Wood picked up as many tapes as he could and stored them away. His plan was to transfer them to a more future proof format (which at the time was mini disc!) but he was sadly unable to do this before he passed away.

‘About 2 years ago’ Arthur’s daughter Jane explains, ‘I thought I’d go and have a look at what we actually had. I was surprised there were quite so many tapes (about 700 in all), and that they weren’t mainly schools programmes, as I had expected.

I listened to a few of them on our old Revox open reel tape machine, and soon realised that a lot of the material should be in the city (and possibly national) archives, where people could hear it, not in a private loft. The rest of the family agreed, so I set about researching how to find funding for it.’

50th anniversary of BBC Local Radio

The Revealing Voices project coincides with an important cultural milestone: the 50th anniversary of BBC local radio. Between 1967 and 1968 the BBC was granted a licence to set up a number of local radio stations in Durham, Sheffield, Brighton, Leicester, Merseyside, Nottingham, Leeds and Stoke-on-Trent.

Education was central to how the social role of local radio was imagined at the time:

‘Education has been a major preoccupation of BBC Local Radio from the outset. Indeed, in one sense, the entire social purpose of local radio, as conceived by the BBC, may be described as educational. As it is a central concern of every civilised community, so too must any agency serving the aims of such a community treat it as an area of human activity demanding special regard and support. It has been so with us. Every one of our stations has an educationist on its production staff and allocates air-time for local educational purposes’ (Education and BBC Local Radio: A Combined Operation by Hal Bethell, 1972, 3).

Within his role as education producer Wood had a remit to produce education programmes in the broadest sense – for local schools, and also for the general local audience. Arthur ‘was essentially a teacher and an enthusiast, and he sought to share local knowledge and stimulate reflective interest in the local culture mainly by creating engaging programmes with carefully chosen contributors,’ Jane reflected.

Revealing Voices and Connecting Histories

Listening to old recordings of speech, like gazing at an old photograph, can be very arresting. Sound recordings often contain an ‘element which rises from the scene, shoots out of it like an arrow, and pierces me’, akin to what Roland Barthes might have called a sonic punctum.

The potency of recorded speech, especially in analogue form, arises from its indexicality—or what we might call ‘presence’. This ‘presence’ is accentuated by sound’s relational qualities, the fact that the person speaking was undeniably there in time, but when played back is heard but also felt here.

When Jane dropped off the tapes in the Great Bear studio she talked of the immediate impact of listening again to her father’s tape collection. The first tape she played back was a recording of a woman born in 1879, recalling, among other things, attending a bonfire to celebrate Queen Victoria’s jubilee.

Hearing the voice gave her a distinct sense of being connected to a woman’s life across three different centuries. This profound and unique experience was made possible by the recordings her father captured in the 1970s, unwinding slowly on magnetic tape.

The Revealing Voices project hopes that other people, across north Staffordshire and beyond, will have similar experiences of recognition and connection when they listen to the transferred tapes. It would be a fitting tribute to the life-work of Arthur Wood, who, Jane reflects, would be ‘glad that a solution has been found to preserve the tapes so that future generations can enjoy them.’

***

If you live in the North Staffordshire area and want to volunteer on the Revealing Voices project please contact Andy Perkin, Project Officer, on andy at revealing-voices dot org dot uk.

Many thanks to Jane Wood for her feedback and support during research for this article.

Happy World Day for Audio Visual Heritage 2016!

October 27th, 2016

Happy World Day for Audio Visual Heritage!

World Day for Audiovisual Heritage, which is sponsored by UNESCO and takes place every year on 27 October, is an occasion to celebrate how audio, video and film contribute to the ‘memory of the world.’

The theme for 2016 – ‘It’s your story, don’t lose it!’ – conveys the urgency of audio visual preservation and the important role sound, film and video heritage performs in the construction of cultural identities and heritage.

Great Bear make an important contribution to the preservation of audiovisual heritage.

On one level we offer practical support to institutions and individuals by transferring recordings from old formats to new.

The wider context of Great Bear’s work, however, is preservation: in our Bristol-based studio we maintain old technologies and keep ‘obsolete’ knowledge and skills alive. Our commitment to preservation happens every time we transfer a recording from one format to another.

We work hard to make sure the ‘memory’ of old techniques remains active, and are always happy to share what we learn with the wider audiovisual archiving community.

Skills and Technology

Ray Edmondson points out in Audio Visual Archiving: Philosophy and Principles (2016) that preserving technology and skills is integral to audiovisual archiving:

‘The story of the audiovisual media is told partly through its technology, and it is incumbent on archives to preserve enough of it – or to preserve sufficient documentation about it – to ensure that the story can be told to new generations. Allied to this is the practical need, which will vary from archive to archive, to maintain old technology and the associated skills in a workable state. The experience of (for example) listening to an acoustic phonograph or gramophone, or watching the projection of a film print instead of a digital surrogate, is a valid aspect of public access.’

Edmondson articulates the shifting perceptions within the field of audiovisual archiving, especially in relation to the question of ‘artefact value.’

‘Carriers once thought of and managed as replaceable and disposable consumables’, he writes, ‘are now perceived as artefacts requiring very different understanding and handling.’

Viewing or listening to media in their original form, he suggests, will come to be seen as a ‘specialist archival experience,’ impossible to access without working machines.

Through the maintenance of obsolete equipment the Great Bear studio offers a bridge to such diverse audio visual heritage experiences.

These intangible cultural heritages, released through the playback of what media theorist Wolfgang Ernst has called ‘Sonic Time Machines’, are part of our every day working lives.

We rarely ponder their gravity because we remain focused on day to day work: transferring, repairing, collecting and preserving the rich patina of audio visual heritage sent in by our customers.

Enjoy World Day for Audio Visual Heritage 2016!

Spoking – Treating and Assessing Magnetic Tape

October 17th, 2016

Assessment and treatment is an important part of Great Bear’s audiovisual preservation work.

Even before a tape is played back we need to ensure it is in optimum condition.

Sometimes it is possible to make a diagnosis through visual assessment alone.

A tape we received recently, for example, clearly displayed signs of ‘spoking.’

Spoking is a term used in the AV preservation world to describe the deformation of the tape pack due to improper winding, storage or a badly set up machine.

The National Archives describe it as a ‘condition of magnetic tape and motion picture film where excessive pressure caused by shrinkage or too much winding tension eventually causes deformation.’

In our experience ‘spoking’ predominantly occurs with domestic open reel tapes. We have rarely seen problems of this nature with recordings made in professional settings.

Compared with professional grade tape, domestic open reel tape was often thinner, making it cheaper to produce and buy.

‘Spoking’ in domestic tape recordings can also be explained by the significant differences in how tape was used in professional and domestic environments.

Domestic tape use was more likely to have an ‘amateur’ flavour. This does not mean that your average consumer did not know what they were doing. Nor were they careless with the media they bought and made. It cannot be denied, however, that your average domestic tape machine would never match the wind-quality of its professional counterparts.

In contrast, the only concern of recording professionals was to make a quality recording using the best tape and equipment. Furthermore, recording practices would be done in a conscientious and standardised manner, according to best industry practice.

Combined, these factors result in a greater number of domestic tapes with winding errors such as cinching, pack-slip and windowing.

Treating Spoking

The majority of ‘spoking’ cases we have seen are in acetate-backed tape which tends to become inflexible – a bit like an extended tape measure – as it ages.

The good news is that it is relatively easy to treat tapes suffering from ‘spoking’ through careful – and slow – re-winding.

Slowly winding the tape at a controlled tension, colloquially known as ‘library wind’, helps relieve stress present in the pack. The end result is often a flatter, more evenly wound tape pack, suitable for making a preservation transfer.

The Containers – late 70s new wave lives again

September 26th, 2016

It might be a familiar story to some people. At one point, say the late 1970s, you were in your early 20s and the main songwriter in a post-punk/ new wave band. You tried really hard to get it off the ground: moved to London, met the right people, played several memorable gigs. You worked with talented artists, some went on to become successful pop stars.

You were also pretty organised. You managed to record your music in a professional recording studio. But the band faltered due to commercial reasons, personality differences etc, etc.

The dream of a pop music career faded but, undeterred, you started a new solo project. You built your sound on cutting edge technology – the reliable pulses of the drum machine.

Modest success followed, including an album release on one of the early 1980s’ many DIY record labels. You secured high profile support slots for big acts, such as the Thompson Twins, and wowed spectators with an idiosyncratic musical style.

Yet it was not possible to make music your profession, and you drifted away from the industry.

The only evidence you ever existed, in a musical sense, was that a friend—Robyn Hitchcock of the Soft Boys—covered your songs from time to time.

Re-discovery

30 years later you start scratching around the internet and realise that the album you made in 1980 is now highly collectable. It’s selling for silly prices on eBay. It seems that all this time you’ve had a cult following on college radio in the US.

This kick-starts a self-archiving project, powered by the publishing reach of YouTube. You start to upload your back catalogue without a shred of wistfulness over what might have been. What the hell, at least people can hear the music now.

Soon you get an email from Manufactured Recordings, an independent record label in Brooklyn who specialise in re-issues. They love you! And they want to hear, and release, absolutely everything you have done.

The immediate priority is a fresh pressing of your cult DIY album: The Beach Bullies’ We Rule the Universe, warmly re-appraised in 2015 as an ‘excellent slice of obscurist he-said/she-said bedsit pop.’

Then, in 2017, the entire back catalogue of The Containers, your band that never quite made it, will be released. The compilation carries the title Self-Contained.

The material on this album, like so many re-issues available today, was expertly transferred in the Great Bear studio!

Finally the world will be able to hear The Containers’ ‘lost album’, recorded in 1979 at Spaceward studio, Cambridge.

Spaceward had a reputation for making ‘no-nonsense, quality recordings that successfully captured the essence of the late seventies style of music.’ Artists such as The Raincoats, Scritti Politti, Gary Numan, The Mekons and many others laid down tracks there.

At the helm was Mike Kemp, a supportive and inventive engineer who, James remembered, checked the final mix through a transistor radio whose battery had half expired.

What can we expect to hear when The Containers’ music is finally released into the world? The band, James explained, combined ‘literate songwriting with the energy of the period.’ ‘We weren’t afraid of using more than three chords. We wanted to write great songs, with witty, biting lyrics.’

Re-issuing music culture

The status of ‘old’ recordings has changed a lot in recent times. James believes his work is no longer old as in ‘not new’ and therefore ‘forgettable,’ but old as in ‘cult, hidden or classic’.

The contemporary ‘re-issue market’ is built upon the desirability of ‘some mislaid masterwork, tugged from obscurity, relieved of dust, and repackaged for rediscovery.’

While ‘re-issue’ culture can be traced back to the mid-twentieth century, widespread digitisation has clearly fuelled the eruption of pop music’s archival imaginary in the 21st century. Different categories of recorded sound – including more messy or unfinished works – can be decoded as ‘valuable’ or ‘interesting’.

James’ new label, Manufactured Recordings, for example, wanted to publish demos, rough bedroom recordings and other works in progress as well as The Containers’ studio recordings.

Such recordings, James believes, have novelty value because they provide unique insight into the ‘mindset of the artist’ when they were writing a piece of music. They may also capture the acoustic textures of everyday sound environments, a factor which sets them apart from the flat polished surfaces of (less authentic) studio recordings.

Uncontained

Containers

The Containers (l-r) James A Smith – gtr. vocals, Adrian ‘Hots’ Foster – bass gtr, Alan Bearham – drums, Josephine Buchan – vocals

The timely recognition of the Containers and the Beach Bullies should warm the hearts of anyone who has felt that their music careers happened within a bell jar.

It is clear, from speaking with James, that he feels immense pleasure and excitement in being rediscovered after many years.

What’s more, the future appears bright for his musical endeavours: to celebrate the release of the album next year The Containers will go on tour again, featuring the original drummer and bassist.

The moment has come for this ‘music out of time’, that was only played live on a few occasions in the early 1980s, to live again.

***

Many thanks to James A Smith for sharing his memories with us.

 

Digital Video in Mixed-Content Archives

September 12th, 2016

On a recent trip to one of Britain’s most significant community archives, I was lucky enough to watch a rare piece of digitised video footage from the late 1970s.

As the footage played it raised many questions in my mind: who shot it originally? What format was it originally created on? How was it edited? How was it distributed? What was the ‘life’ of the artefact after it ceased to actively circulate within communities of interest/ use? How, and by whom, was it digitised?

As someone familiar with the grain of video images, I could make an educated guess about the format. I also made other assumptions about the video. I imagined there was a limited amount of tape available to capture the live events, for example, because a number of still images were used to sustain the rolling audio footage. This was unlikely to be an aesthetic decision given that the aim of the video was to document a historic event. I could be wrong about this, of course.

When I asked the archivist the questions flitting through my mind she had no answers. She knew who the donor of the digital copy was, but nothing about the file’s significant properties. Nor was this important information included in the artefact’s record.

This struck me as a hugely significant problem with the status of digitised material – and especially perhaps video – in mixed-content archives where the specificities of AV content are not accounted for.

Due to the haphazard and hand-to-mouth way mixed-content archives have acquired digital items, it seems more than likely this situation is the rule rather than the exception: acquired bit by bit (no pun intended), maintaining access is often privileged over preserving the content and context of the digitised video artefact.

As a researcher I was able to access the video footage, and this of course is better than nothing.

Yet I was viewing the item in an ahistoric black hole. It was profoundly decontextualised; an artefact reduced to its barest essence.

Standard instabilities

This is not in any way a criticism of the archive in question. In fact, this situation is wholly understandable given that digital video is an example of ‘media formats that exist in crisis.’

Video digitisation remains a complex and unstable area of digital preservation. It is, as we have written elsewhere on this blog, the final frontier of audiovisual archiving. This seems particularly true within the UK context where there is currently no systematic plan to digitise video collections, unlike film and audio.

The challenge with digital video preservation remains the bewildering number of potential codec/ wrapper combinations that can be used to preserve video content.

There are signs, however, that file-format stabilities are emerging. The No Time to Wait: Standardizing FFV1 & Matroska for Preservation symposium (Berlin, July 2016) brought together software developers and archivists who want to make the shared dream of an open source lossless video standard, fit for archival purpose, a reality.

It seems like the very best minds are working together to solve this problem, so Great Bear are quietly optimistic that a workable, open source standard for video digital preservation is in reach in the not too distant future.

Metadata

Yet as my experience in the archive makes clear, the challenge of video digitisation is not about file format alone.

There is a pressing need to think very carefully about the kind of metadata and other contextual material that need to be preserved within and alongside the digitised file.

Due to limited funding and dwindling technical capacity, there is likely to be only one opportunity to transfer material currently recorded on magnetic tape. This means that in 2016 there really can be no dress rehearsal for your video digitisation plans.

As Joshua Ranger strongly emphasises:

‘Digitization is preservation…For audiovisual materials. And this bears repeating over and over because the anti-digitization voice is much stronger and generally doesn’t include any nuance in regards to media type because the assumption is towards paper. When we speak about digitization for audio and video, we now are not speaking about simple online access. We are speaking about the continued viability, about the persistence and the existence of the media content.’

What information will future generations need to understand the digitised archive materials we produce?

An important point to reckon with here is that not all media are the same. The affordances of particular technologies, within specific historical contexts, have enabled new forms of community and communicative practice to emerge. Media are also disruptive (if not deterministic) – they influence how we see the world and what we can do.

On this blog, for example, Peter Sachs Collopy discussed how porta-pak technology enabled video artists and activists in the late 1960s/ early 1970s to document and re-play events quickly.

Such use of video is also evident in the 1975 documentary Les prostituées de Lyon parlent (The prostitutes of Lyon speak).

Les prostituées documents a wave of church occupations by feminist activists in France.

The film demonstrates how women used emergent videotape technology to transmit footage recorded within the church onto TV screens positioned outside. Here videotape technology, and in particular its capacity to broadcast uni-directional messages, was used to protect and project the integrity of the group’s political statements. Video, in this sense, was an important tool that enabled the women – many of whom were prostitutes and therefore without a voice in French society – to ‘speak’.

Peter’s interview and Les prostituées de Lyon parlent are specific examples of how AV formats are concretely embedded within a social-historical and technical context. The signal captured – when reduced to bit stream alone – is simply not an adequate archival source. Without sufficient context too much historical substance is shed.

In this respect I disagree with Ranger’s claim that ‘all that really may be needed moving ahead [for videotape digitisation] is a note in the record for the new digital preservation master that documents the source.’ To really preserve the material, the metadata record needs to be rich enough for a future researcher to understand how a format was used, and what it enabled users to do.

‘Rich enough’ will always be down to subjective judgement, but such judgements can be usefully informed by understanding what makes AV archive material unique, especially within the context of mixed-content archives.

Moving Forward

So, to think about this practically. How could the archive item I discuss at the beginning of the article be contextualised in a way that was useful to me, as a researcher?

At the most basic level the description would need to include:

  • The format it was recorded on, including details of tape stock and machine used to record material
  • When it was digitised
  • Who digitised it (an individual, an institution)

In an ideal world the metadata would include:

  • Images of the original artefact – particularly important if the digital version is now the only remaining copy
  • Storage history (of original and copy)
  • Accompanying information (e.g., production sheets, distribution history – anything that can illuminate the ‘life’ of artefact, how it was used)

This information could be embedded in the container file or be stored in associated metadata records.
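As a rough illustration, the basic and ideal fields listed above could be captured in a sidecar record in a plain, open format such as JSON. The field names and values below are purely hypothetical, not a formal standard:

```python
import json

# A hypothetical sidecar metadata record for a digitised video file.
# Every field name and value here is illustrative only.
record = {
    "source_format": "U-matic (low band)",
    "tape_stock": "Sony KCA-30",
    "transfer_machine": "Sony VO-5800PS",
    "digitised_on": "2016-09-12",
    "digitised_by": "Great Bear Analogue & Digital Media",
    "images_of_original": ["cassette_front.jpg", "cassette_spine.jpg"],
    "storage_history": "Domestic loft, 1982-2014; archive store since 2014",
    "accompanying_information": ["production_sheet.pdf"],
}

# Write the record alongside the video file it describes.
with open("interview_1979.mkv.json", "w") as f:
    json.dump(record, f, indent=2)
```

A record like this travels easily alongside the file and can be read without specialist tools; the same fields could equally be written into Matroska tags if embedding is preferred.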

These suggestions may seem obvious, but it is surprising the extent to which they are overlooked, especially when the most pressing concern during digitisation is access alone.

In every other area of archival life, preserving the context of an item is deemed important. The difference with AV material is that the context of use is often complex, and in the case of video, is always changing.

As stressed earlier: in 2016 and beyond you will probably only get one chance to transfer collections stored on magnetic tape, so it is important to integrate rich descriptions as part of the transfer.

Capturing the content alone is not sufficient to preserve the integrity of the video artefact. Creating a richer metadata record will take more planning and time, but it will definitely be worth it, especially if we try to imagine how future researchers might want to view and understand the material.

Monstrous Regiment – Audio Cassette Digitisation

August 1st, 2016

Monstrous Regiment were one of many trailblazing feminist theatre companies active in the 1970s-1990s. They were established as a collective very much built around performers, both (professional) actors such as Mary McCusker and (professional) musicians such as Helen Glavin.

Between 1975-1993 Monstrous Regiment produced a significant number of plays and cabarets. These included Scum: Death, Destruction and Dirty Laundry, Vinegar Tom, Floorshow, Kiss and Kill, Dialogue Between a Prostitute and One of Her Clients, Origin of the Species, My Sister in This House, Medea and many others.

Monstrous Regiment’s plays were not always received positively by feminists. A performance of Time Gentlemen Please (1978), for example, was controversially shut down in Leeds when some audience members stormed the stage. The play was, according to some commentators, seen to promote a ‘glossy, middle-class view of sexual liberation.’ [1]

As with any historical event there are many different accounts of what happened that evening. Mary McCusker and Gillian Hanna have discussed their perspective, as performers, in an interview conducted with Unfinished Histories: Recording the History of Alternative Theatre.

A detailed biography of the company can be also found on the Unfinished Histories website, which has loads more information about Women’s, Black, Gay and Lesbian Theatre companies active at the same time as Monstrous Regiment. Check it out!

An Archival Legacy

Monstrous Regiment still exist on paper, but ceased producing in 1993 after the Arts Council withdrew the company’s revenue funding.

To ensure a legacy for Monstrous Regiment’s work the company archive was deposited in the Women’s Library (then Fawcett Library).

Due to a large cataloguing backlog at the Women’s Library, however, the Monstrous Regiment collection was never made publicly available.

Co-founder Mary McCusker explains her frustration with this situation:

‘We were always keen to create a body of work that would be accessible to future practitioners that the work would not be hidden from history, but alas unknown to us it was not catalogued so available to no one. Script were meant to be performed, some of the unpublished plays have not been available for such a long time. I/we do want the ideas the energy of those times the talent and wonderful creativity to be there after we are gone. That goes for the plays’ readings we did as well as the performances.’

‘I admire writers immensely and even if some plays didn’t get the critical response hoped for I believe all the work deserves a space, somewhere to be discovered anew. I would also hope the idea a group of actors started this and kept going, took control over their work conditions and wanted their beliefs to inform what was written and how they worked with other creative beings would still resonate in the future.’

Monstrous Moves

To address the access problem the Monstrous Regiment archive was recently moved to a new home, the theatre collection at the V & A, where it will soon be catalogued.

The decision to relocate is part of a new effort to organise and publicly interpret the Monstrous Regiment archive.

Plans are in place to construct a new archival website that will tell the Monstrous Regiment Story. It will include photographs, fliers, scripts, ephemera and – yes – audiovisual material.

Russell Keat, a semi-retired academic and partner of Mary McCusker, has begun the process of looking through the collection at the V & A, selecting items for digitisation and contacting people who performed with Monstrous Regiment to ask for new material.

Russell has also been exploring McCusker’s personal audio cassette collection for traces of Monstrous Regiment. The fruits of this labour were sent to Great Bear for digitisation.

The recordings we transferred include performances of Gentlemen Prefer Blondes and Floorshow, a radio broadcast of Mourning Pictures, a spoken voice audio guide of the play The Colony Comes a Cropper for Visually Challenged Audiences, a tape made by a composer for Mary to rehearse with, songs from Vinegar Tom and Kiss and Kill recorded in a rehearsal studio and a sound tape for Love Story of a Century, comprising piano and rain effects.

The (live) Monstrous Regiment Archive

Making audiovisual documentation was an exceptional rather than everyday activity in the late 1970s and early 1980s. ‘We had a few things filmed; not whole plays but maybe snippets. Music taped. Radio interviews and magazine interviews were one way of spreading the word,’ Mary told us.

As a documentary form, the audiovisual recording exists in tension with the theatrical ideal of live performance: ‘It’s very difficult for a film to capture the experience of live theatre because of course you rehearse and produce the play to be experienced live. BUT naturally if that performance has gone and all you have is a script then any filmed documentation gives the reader/viewer all the visual clues about what a character is feeling when they speak but also the bigger picture about how they feel about what other characters are saying,’ Mary reflected.

Live and later recorded music performed a key role in Monstrous Regiment’s work. Unlike other theatre groups such as the Sadista Sisters, Spare Tyre and Gay Sweatshop, Monstrous Regiment never released an album of the music they performed. The tapes Great Bear have transferred will therefore help future researchers understand the musical dimension of the company’s work in a more nuanced way.

Mary explains that ‘from the very start we wanted live music to be part of the shows we produced and encouraged writers to write not only for the company of actors but also to put music as an integral part of the play; to have it as a theatrical force in a central position, not a scene change background filler.

This was true in all our early work and of course in the two cabarets. I think the songs in Vinegar Tom by Caryl Churchill still provoke much discussion. I know I loved singing them. Later as our musicians moved on and also money got tighter we had musicians like Lindsay Cooper and Joanna MacGregor write and perform scores for plays that were recorded and became used rather as you would in cinema.’

***

We are hugely grateful to Mary and Russell for taking time to respond to our questions for this article.

We wish them the best of luck for their archive project, and will post links to the new website when it hits the servers.

Notes

[1] Aleks Sierz (2014) In-Yer-Face Theatre: British Drama Today, London: Faber and Faber.

VHS – more obsolescence threats

July 28th, 2016

We couldn’t let the news that ‘Japan’s Funai Electric has announced it will end production of home videocassette recorders in July’ go by unnoticed.

Earlier this month we wrote an article that re-appraised the question of VHS obsolescence.

Variability within the VHS format, such as recording speeds and the different playback capacities of domestic and professional machines, fundamentally challenges claims that VHS is immune from the obsolescence threats which affect other, less ubiquitous formats.

The points we raised in this article and in others on the Great Bear tape blog are only heightened by news that domestic VHS manufacture is to be abandoned this month.

It is always worth being a bit wary of media rhetoric: this is not the first time VHS’s ‘death’ has been declared.

In 2008, for example, JVC announced they would no longer manufacture standalone VHS machines.

Yet Funai Electric’s announcement seems decidedly more grave, given that ‘declining sales, plus a difficulty in obtaining the necessary parts’ are the key reasons cited for their decision.

To be plain: if the manufacturer itself is struggling to find the necessary parts, this doesn’t bode well for the rest of us.

The ‘death’ of a format is never immediate. In reality it is a stage by stage process, marked by significant milestones.

The announcement last week is certainly one milestone we should take notice of, especially when several other issues compromise the possibility of effective VHS preservation in the immediate and long-term future.

What needs to be done?

As ever, careful assessment of your tape collection is recommended. We are always on hand to talk through any questions you have.

Deacon Blue Live – Betamax PCM recordings

July 25th, 2016

Great Bear exist to make obsolete tape recordings accessible in the digital age.

We often work with artists and record labels who use our services to digitise back catalogues and previously unreleased material.

We regularly work with Bristol Archive Records, for example, who keep the memory of Bristol’s post punk and reggae history alive, one release at a time.

Other ‘archival’ releases recently transferred include cult Yugoslav New Wave band Doktor Spira i Ljudska Bića’s Dijagnoza (available late 2016), John Peel favourites Bob and legendary acid-folk act The Courtyard Music Group.

Great Bear can deliver your files as high resolution stereo recordings or, if available, individual ‘stems’ ready for the new remix.

Deacon Blue Live – PCM Betamax transfer

We recently transferred several live concerts by Scottish pop sensations Deacon Blue.

Recorded in 1988, the concerts capture Deacon Blue in their prime.

The energetic performances feature many of their well-known hits, such as ‘Real Gone Kid’ and ‘Fergus Sings the Blues.’

Transferred at 24-bit/44 kHz, the Pulse-Code Modulation (PCM) digital recordings on Betamax tape capture the technical proficiency of the band with exceptional clarity.

Introduced in the late 1970s, PCM digital audio harnessed the larger bandwidth of videotape technology to record digital audio signals.

‘A PCM adaptor has the analogue audio (stereo) signal as its input, and translates it into a series of binary digits, which, in turn, is coded and modulated into a monochrome (black and white) video signal, appearing as a vibrating checkerboard pattern, modulated with the audio, which can then be recorded as a video signal.’
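The core idea of PCM described above — sampling a continuous analogue signal and quantising each sample to a binary value — can be sketched in a few lines of Python. This is a simplified illustration of the sampling and quantisation stage only, not the adaptor’s actual checkerboard video-field encoding; the sample rate, bit depth and 440 Hz test tone are illustrative assumptions:

```python
import math

SAMPLE_RATE = 44100   # samples per second (illustrative; adaptors used similar rates)
BIT_DEPTH = 16        # bits per quantised sample

def pcm_encode(signal, duration=0.001):
    """Sample a continuous-time signal and quantise to signed 16-bit integers."""
    max_amp = 2 ** (BIT_DEPTH - 1) - 1  # 32767, the largest positive sample value
    n = int(SAMPLE_RATE * duration)
    # Evaluate the signal at each sample instant, then scale and round
    return [round(signal(t / SAMPLE_RATE) * max_amp) for t in range(n)]

# A 440 Hz sine wave stands in for the analogue stereo input
samples = pcm_encode(lambda t: math.sin(2 * math.pi * 440 * t))
print(len(samples), max(samples))
```

In a real PCM adaptor, this stream of binary samples would then be modulated into a monochrome video signal and recorded by the Betamax transport as if it were a picture.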

PCM digital audio was widely used until the introduction of Digital Audio Tape (DAT) in 1987. Despite its portability and ability to record at different sampling rates, DAT was not immediately or widely adopted. That the Deacon Blue recordings were made on PCM/Betamax in 1988 is evidence of this. It also indicates a telling preference for digital over analogue formats in the late 1980s.

Deacon Blue Live at the Dominion Theatre, London, 26th October 1988 will be available to download as part of Deacon Blue’s new album Believers, released 30th September 2016.

According to singer and main songwriter Ricky Ross, the new Deacon Blue album aims to conjure a sense of hope: ‘it’s our statement to the fact that belief in the possibilities of hope and a better tomorrow is the side we choose to come down on.’

Deacon Blue are touring the UK in Nov/ Dec, visiting Bristol’s Colston Hall on 18 November.

 

VHS – Re-appraising Obsolescence

July 4th, 2016

VHS was a hugely successful video format from the late 1970s to the late 1990s, adopted widely in domestic and professional contexts.

Due to its familiarity and apparent ubiquity you might imagine it is easy to preserve VHS.

Well, think again.

VHS is generally considered to be a low preservation risk because playback equipment is still (just about) available.

There is, however, a huge degree of variation within VHS. This is even before we consider improvements to the format, such as S-VHS (1987), which increased luminance bandwidth and picture quality.

Complicating the preservation picture

The biggest variation within VHS is of recording speed.

Recording speed affects the quality of the recording. It also dictates which machines you can use to play back VHS tapes.

Domestic VHS could record at three different speeds: Standard Play, which yielded the best quality recordings; Long Play, which doubled recording time but compromised the quality of the recording; and Extended or Super Long Play, which trebled recording time but significantly reduced recording quality. Extended/Super Long Play was only available on the NTSC standard.
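The trade-off between speed and duration can be put in concrete terms. The sketch below assumes a nominal three-hour Standard Play cassette (e.g. an E-180) and uses the doubling/trebling relationships described above:

```python
# Playing-time multipliers for the three VHS recording speeds.
# Quality degrades as the multiplier rises; EP/SLP was NTSC-only.
SPEED_MULTIPLIER = {
    "SP": 1,  # Standard Play: best quality
    "LP": 2,  # Long Play: double duration, reduced quality
    "EP": 3,  # Extended/Super Long Play: triple duration, lowest quality
}

def playing_time(sp_minutes, speed):
    """Total recording time in minutes for a tape rated sp_minutes at Standard Play."""
    return sp_minutes * SPEED_MULTIPLIER[speed]

for speed in SPEED_MULTIPLIER:
    print(speed, playing_time(180, speed), "minutes")
```

The same multiplier also indicates how much more slowly the tape moves past the heads, which is why a nine-hour EP recording packs three times as much programme into the same tape area as Standard Play.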

It is generally recognised that you should always use the best quality machines at your disposal to preserve magnetic media.

VHS machines built for domestic use, and the more robust, industrial models vary significantly in quality.

Richard Bennette in The Videomaker wrote (1995): ‘In more expensive VCRs, especially industrial models, the transports use thicker and heavier mounting plates, posts and gears. This helps maintain the ever-critical tape signal distances over many more hours of usage. An inexpensive transport can warp or bend, causing time base errors in the video signals’.

Yet better quality VHS machines, such as the SONY SVO-500P and Panasonic AG-650 that we use in the Great Bear Studio, cannot play back Long or Extended Play recordings. They only recorded—and therefore can only play back—Standard Play signals.

This means that recordings made at slower speeds can only be transferred using cheaper, domestic VHS machines.

Domestic VHS tape: significant problems to come

This poses two significant problems within a preservation context.

Firstly, there is concern about the availability of high-functioning domestic VHS machines in the immediate and long-term.

Domestic VHS machines were designed to be mass produced and affordable to the everyday consumer. Parts were made from cheaper materials. They simply were not built to last.

JVC stopped manufacturing standalone VHS machines in 2008.

Used VHS machines are still available. Yet given the comparative fragility of domestic machines, the ubiquity of the VHS format—especially in its domestic variation—is largely an illusion.

The second problem is the quality of the original Long or Extended Play recording.

One reason for VHS’s victory over Betamax in the ‘videotape format wars’ was that VHS could record for three hours, compared with Betamax’s one.

As with all media recorded on magnetic tape, slower recording speeds produce poorer quality video and audio.

An Extended Play recording made on a domestic VHS is already in a compromised position, even before you put it in the tape machine and press ‘play.’

Which leads us to a further and significant problem: the ‘press play’ moment.

Interchangeability—the ability to play back a tape on a machine different to the one it was recorded on—is a massive problem with video tape machines in general.

The tape transport is a sensitive mechanism and can be easily knocked out of sync. If the initial recording was made with a mis-aligned machine it is not certain to play back on another, differently aligned machine. Slow recording complicates alignment further, as there is more room for error in the recording process.

The preservation of Long and Extended Play VHS recordings is therefore fraught with challenges that are not always immediately apparent.

(Re)appraising VHS

Aesthetically, VHS continues to be celebrated in art circles for its rendering of the ‘poor image’. The decaying, unstable appearance of the VHS signal is a direct result of extended recording times that threaten its practical ability to endure.

Variation of recording time is the key point of distinction within the VHS format. It dramatically affects the quality of the original recording and dictates the equipment a tape can be played back on. With this in mind, we need to distinguish between standard, long and extended play VHS recordings when appraising collections, rather than assuming ‘VHS’ covers everything.

One big stumbling block is that you cannot tell the recording speed by looking at the tape itself. There may be metadata that can indicate this, or help you make an educated guess, but this is not always available.

We therefore recommend not assuming that VHS—and other formats that straddle the domestic/professional divide, such as DVCAM and 8mm video—is ‘safe’ from impending obsolescence. Despite the apparent availability and familiarity of VHS, the reality is far more complex and nuanced.

***

As ever, Great Bear are more than happy to discuss specific issues affecting your collection.

Get in touch with us to explore how we can work together.

SONY’s U-matic video cassette

June 27th, 2016

Introduced by SONY in 1971, U-matic was, according to Jeff Martin, ‘the first truly successful videocassette format’.

Philips’ N-1500 video format dominated the domestic video tape market in the 1970s. By 1974, however, U-Matic was widely adopted in industrial and institutional settings. The format also performed a key role in the development of Electronic News Gathering, due to its portability, cost effectiveness and rapid integration into programme workflows. Compared with 16mm film, U-matic had many strengths.

The design of the U-Matic case mimicked a hardback book. Mechanical properties were modelled on the audio cassette’s twin spool system.

Like the Philips compact audio cassette developed in the early 1960s, U-Matic was a self-contained video playback system. This required minimal technical skill and knowledge to operate.

There was no need to manually lace the video tape through the transport, or even rewind before ejection, as with SONY’s open reel video tape formats, EIAJ 1/2″ and 1″ Type C. Stopping and starting the tape was immediate, and switching between tapes was quick and easy. U-Matic ushered in a new era of efficiency and precision in video tape technology.

Emphasising technical quality and user-friendliness was key to marketing U-Matic video tape.

As SONY’s product brochure states, ‘it is no use developing a TV system based on highly sophisticated knowledge if it requires equally sophisticated knowledge to be used.’

The ‘ease of operation’ is demonstrated in publicity brochures by a series of images that guide the prospective user through the tape machine’s interface. The human operator, insulated from the complex mechanical principles making the machine tick, only needs to know a few things: how to feed content and how to direct pre-programmed functions such as play, record, fast forward, rewind and stop.

New Applications

Marketing material for audio visual technology often helps the potential buyer imagine possible applications. This is especially true when a technology is new.

For SONY’s U-Matic video tape it was the ‘very flexibility of the system’ that was emphasised. The brochure recounts a story of an oil tanker crew stationed in the middle of the Atlantic.

After they watch a football match, the oil workers sit back and enjoy a new health and safety video. ‘More inclined to take the information from a television set,’ U-matic is presented as a novel way to combine leisure and work.

Ultimately ‘the obligation for the application of the SONY U-matic videocassette system lies with the user…the equipment literally speaks for itself.’

International Video Networks

Before the internet arrived, SONY believed video tape was the media to connect global businesses.


‘Ford, ICI, Hambro Life, IBM, JCB…what do these companies have in common, apart from their obvious success? Each of these companies, together with many more, have accepted and installed a new degree of communications technology, the U-matic videocassette system. They need international communication capability. Training, information, product briefs, engineering techniques, sales plans…all can be communicated clearly, effectively by means of television’.

SONY heralded videotape’s capacity to reach ‘any part of the world…a world already revolutionised by television.’ Video tape distributed messages in ‘words and pictures’. It enabled simultaneous transmission and connected people in locations as ‘wide as the world’s postal networks.’ With appropriate equipment interoperability between different regional video standards – PAL, NTSC and SECAM – was possible.

Video was imagined as a powerful virtual presence serving international business communities. It was a practical money-saving device and an effective way to foster inter-cultural communication: ‘Why bring 50 salesmen from the field into Head Office, losing valuable working time when their briefing could be sent through the post?’

Preserving U-Matic Video Tape

According to the Preservation Self-Assessment Program, U-Matic video tape ‘should be considered at high preservation risk’ due to media and hardware obsolescence.

A lot of material was recorded on the U-matic format, especially in media and news-gathering contexts. In the long term there is likely to be more tape than working machines.

Despite these important concerns, at Great Bear we find U-Matic a comparatively resilient format. Part of the reason for this is the ¾” tape width and the presence of guard bands that are part of the U-matic video signal.

Guard bands were used on U-matic to prevent interference or ‘cross-talk’ between the recorded tracks.

In early video tape design guard bands were seen as a waste of tape. Slant azimuth recording, a technique which enabled adjacent tracks to be recorded next to each other without guard bands, was integrated into later formats such as Betamax and VHS. As video tape evolved it became a whole lot thinner.

In a preservation context thinner tape can pose problems. If the tape surface is damaged and there is limited tape area, it is harder to read the signal during playback. In the case of digital tape, damage on a smaller surface can result in catastrophic signal loss. Analogue formats often fare better, regardless of age.

Paradoxically, it would seem that the presence of guard bands insulates the recorded signal from total degradation: because there is more tape, there is a greater margin of error when transferring the recorded signal.


Through Hole Technology

Like other formats, such as SONY EIAJ, certain brands of U-Matic tape can pose problems. Early SONY, Ampex and Kodak branded tapes need dehydration treatment (‘baking’) to prevent shedding during playback. If your U-Matic tape smells of wax crayons, this is a strong indication there are issues. The wax crayon smell seems only to affect SONY branded tape.

Concerns about hardware obsolescence should of course be taken seriously. Early ‘top loading’ U-Matic machines are fairly unusable now.

Mechanical and electronic reliability for ‘front loading’ U-Matic machines such as the BVU-950 remains high. The durability of U-Matic machines becomes even more impressive when contrasted with newer machines such as the DVC Pro, Digicam and Digibeta. These tend to suffer relatively frequent capacitor failure.

Later digital video tape formats also use surface-mounted custom integrated circuits, which are harder to repair at component level. Through-hole technology, used in the circuitry of U-Matic machines, makes it easier to refurbish parts that are no longer working.

Transferring your U-Matic Collections

U-matic made video cassette a core part of many industries. Flexible and functional, its popularity endured into the 1980s.

Great Bear has a significant suite of working NTSC/ PAL/ SECAM U-matic machines and spare parts.

Get in touch by email or phone to discuss transferring your collection.

