This chapter provides a brief look at the technology used at the BBC Radiophonic Workshop from its inception in 1958 to its closure in 1996. For more about the early period or artistic considerations, see The BBC Radiophonic Workshop: The First 25 Years by Desmond Briscoe and Roy Curtis-Bramwell, © BBC, London, United Kingdom, 1983, ISBN 0 563 20150 9.
The Radiophonic Workshop began by providing musique concrète material for radio, initially in the field of drama. Using a wide range of equipment, often obtained from other departments, it soon acquired an enviable reputation for the sounds and music that it created for radio and television.
In its early work, under the direction of Desmond Briscoe, the only available materials were real sounds that were recorded and manipulated using tape machines and other devices. The process was similar to modern sampling, using reverse playback, speed or pitch change, equalisation and reverberation, accompanied by endless skilful editing. Many sources of sound were familiar to drama studios, such as pebbles in boxes, mutilated musical instruments or even an old copper water tank!
The voltage-controlled synthesiser of the late sixties brought significant change. Sound and music could now be created immediately, although often only one note could be played at a time! The arrival of multitrack tape recorders in the following decade enabled composers to build up complex layers of material that could be modified as work progressed. During this period the Workshop enjoyed a brief spell of fame, mainly because of the Doctor Who science fiction series, although it soon retreated into relative obscurity. In the mid-seventies, the department began to expand and took up new initiatives, working on alternative material that wasn’t always strictly ‘radiophonic’.
By the eighties, computer control of digital musical instruments via the Musical Instrument Digital Interface (MIDI), using Macintosh computers, was a reality. This was complemented in the Workshop’s final decade by recording systems based on more advanced computers. The all-digital studio had arrived. However, by the nineties this technology was available to all and the Workshop was forced to close.
Rooms 13 and 14 at the Maida Vale studios were the birthplace of the Workshop. The original tape recorders included small Ferrograph models and the monstrous Motosacoche machines. The latter were particularly difficult to use since they took fifteen seconds for the tape to get up to speed. In later years, Room 13 was devoted to working with film, although some older tape machines remained. These included an early Ampex recorder and an EMI BTR/2, the latter having an extra motorised spool on one side for editing. In addition, there was a Prevost 35 mm film viewing machine and a film editing machine. Much of the early work for television was recorded onto ‘sepmag’ film (35 mm wide magnetic tape with sprockets) so that it could be synchronised with a matching picture film.
For many years Room 12 provided Doctor Who sound effects. It had a specially-built mixing console fitted with ‘continuous’ carbon faders, an innovation for the time, and miniature valve amplifiers. Three Philips tape machines, with inter-linked remote control, provided a comprehensive tape manipulation facility, whilst a Leevers-Rich tape recorder had both ‘continuous’ and ‘chromatic’ tape speed controllers. In addition, an EMI TR/90 tape machine was equipped with the Tempophon, whose spinning head allowed the pitch of a recording to be changed without changing the tempo.
Room 11 contained an early transistorised mixing desk. This incorporated ‘Glowpot’ faders, designed by Dave Young, the Workshop’s highly inventive engineer. Traditional BBC ‘stud faders’ (which were switched attenuators) suffered from ‘stud noise’, an effect particularly noticeable on tonal sounds. The Glowpot fader overcame this problem by using a modified stud fader to control the intensity of light-bulbs that, in turn, illuminated a pair of light dependent resistors (LDRs) within an attenuator network. The thermal inertia of the light-bulbs effectively eliminated any stud noise.
The sixties saw the first voltage-controlled synthesisers, the biggest of which appeared in Room 10. The Delaware was manufactured by Electronic Music Studios (London) Ltd, more usually known as EMS. This machine was a modified Synthi 100, incorporating a two-level keyboard and numerous elements connected by two ‘virtual earth’ patching matrixes. It also included a real-time sequencer that took analogue control and gate signals from the keyboard, digitised them and stored the data in RAM. This machine’s greatest problem, common to most voltage-controlled synthesisers, was that of VCO frequency drift, usually caused by changes in temperature as the equipment warmed up. A later attempt to replace the Delaware with a new machine, consisting of Ken Gale’s Wavemaker modules, came to an end as new technology, much of it from Japan, began to arrive in the late seventies.
Few other synthesisers were used at this time, apart from the VCS3, also produced by EMS. This highly adaptable and portable machine first appeared in 1968 and was particularly useful for sound effects. It also incorporated a versatile ‘virtual earth’ patching matrix for interconnecting parts of the synthesiser. The later ARP Odyssey, which had front-panel switches for patching, was also popular.
The Workshop’s main source of reverberation or ‘echo’ came from two EMT 140 echo plates. Each consisted of a large box containing a steel plate suspended on springs. Two transducers, one for transmitting sounds to the device and the other for receiving reverberant sounds, were attached to the plate. A remote-controlled servo system adjusted the mounting springs, modifying the reverberation time (or ‘room size’) of the plate: typically, this was set to between two and three seconds.
Alternative reverberation was provided by an echo room, a small oblong room with a sloping ceiling. A loudspeaker was positioned at the ‘short’ end whilst a microphone, located at the opposite end, picked up the reverberant sound. In later years a pair of microphones was fitted to give stereo sound. Unlike the plate, the room’s reverberation time was fixed at four seconds.
The least successful form of reverberation was provided by the humble echo spring. This incorporated a coiled spring, usually over 200 mm in length, with audio transducers at each end. The results weren’t too impressive, although interesting sounds could be produced by hitting such a device!
By the mid-seventies the Workshop was in the doldrums, little having changed since the late sixties. The department had developed piecemeal, acquiring rooms along the corridor of the Maida Vale Studios as it went. Apart from Rooms 12, 13 and 14, most studios were converted offices.
By 1974, Rooms 13 and 14 contained a Glensound DK/1 stereo mixing console. Unusually, this mixer had pan-pots (for positioning the stereo image) on each channel fader, allowing the user to ‘pan’ and ‘fade’ a sound simultaneously. This studio also had the Workshop’s first multitrack tape machine, a Studer A80 8-track. A push-button matrix allowed the user to send any sources to any of the multitrack’s inputs. Later, as the Workshop expanded, this installation moved into a new area, Room 36.
The Workshop’s first ‘off the shelf’ mixing desk arrived in 1979. The Neve 8066 was a conventional twenty-channel music console, coupled to a Studer A80 16-track recorder. It was installed in Studio E, part of the original Room 13, in time for Rockcoco, a rock musical produced by Paddy Kingsland.
This installation was soon followed by others containing Soundcraft Series 2 mixers that provided eight group outputs, allowing any source to be directed to an 8-track recorder’s inputs. These consoles were installed in Studio C (originally Dave Young’s office) and in Studio F (the original Room 10). But these installations were only stop-gap measures, an attempt to catch up on lost time.
Several highly versatile electronic effects devices appeared at this time. These included Roland flangers and phasers that used quasi-digital bucket-brigade devices to introduce delays into audio signals.
During this decade the consumer music industry expanded, giving access to a tremendous range of ‘off the shelf’ products. With a little imagination, sometimes even stretching equipment beyond what the manufacturer envisaged, these devices offered the Workshop unforeseen new opportunities.
By this time the department was showing signs of serious financial deprivation. Under the leadership of Brian Hodgson it at last received the funding it deserved, with one of the six studios being upgraded each year. Once again, Soundcraft consoles seemed the obvious choice.
By 1982, Studio B (the original Room 36) and Studio D (Maida Vale’s wartime control room) had Soundcraft Series 1624 mixers, designed for 16-track operation, whilst Studio A incorporated a Series 800 console, providing for 8-track work. Next came Studios C, E and H (the latter converted from a small film theatre) with Soundcraft 2400 consoles, this time with 24-track capacity.
These new studios had 8-track, 16-track or, later, 24-track tape recorders. Unfortunately the 8-track machines were totally inadequate for stereo work, since they only really provided four stereo tracks. Sadly, the introduction of 16-track and 24-track machines also caused problems when material had to be interchanged between studios. The greatest difficulty was caused by the 8-track format, which used one inch wide tape, unlike the other systems that used two inch tape.
The new mixers fully vindicated the use of low-cost equipment. By using unbalanced audio circuits, instead of ‘broadcast’ transformer-balanced circuits, very high sound quality was maintained. Consequently, transformers were removed from virtually all other equipment.
In 1981, computer technology arrived in the form of the Fairlight Computer Musical Instrument (CMI). This was an adapted minicomputer that incorporated a graphical display and light pen. The CMI outlasted many later ‘top end’ machines, some of which disappeared without trace.
During the early eighties, the music industry launched the Musical Instrument Digital Interface (MIDI), a system for connecting musical devices and computers. MIDI sockets soon appeared on the back of many keyboards and synthesisers. The initial reaction was: ‘what could it be used for?’
The answer was sequencing. This process took a performance from a musical keyboard and recorded it as digital data in a MIDI sequencer. The data could be edited as necessary and subsequently used to play any MIDI instrument. The beauty of the system was that no sounds were recorded on tape: the information was simply held as a computer file that could be updated at any time. Even the choice as to which sound or ‘voice’ was triggered by keyboard actions could be left to the very last moment.
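As a rough modern illustration of the principle (a sketch in Python, not software that the Workshop used), a sequenced performance is simply timed data that can be edited freely, with the choice of voice deferred until playback:

```python
# A captured performance is just a list of timed note events, not audio.
# All names and values here are illustrative only.

performance = [
    # (start_beat, duration_beats, MIDI note number, velocity)
    (0.0, 1.0, 60, 90),   # middle C
    (1.0, 1.0, 64, 80),   # E
    (2.0, 2.0, 67, 100),  # G
]

def transpose(events, semitones):
    """Edit the data, not the sound: shift every note up or down."""
    return [(t, d, note + semitones, vel) for (t, d, note, vel) in events]

def render(events, channel, program):
    """Only at playback time is a 'voice' (program number) chosen."""
    messages = [("program_change", channel, program)]
    for start, dur, note, vel in sorted(events):
        messages.append(("note_on", channel, note, vel, start))
        messages.append(("note_off", channel, note, 0, start + dur))
    return messages

edited = transpose(performance, +12)          # move the line up an octave
print(render(edited, channel=1, program=5))   # trigger voice 5 on channel 1
```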
One early sequencer, the Yamaha QX1, provided eight MIDI outputs but required the composer to stare for hours at a small liquid crystal display (LCD). A more important arrival was the Apple Macintosh and associated MIDI interface, first used at the Workshop during 1986. The earliest machine was a Mac Plus, later superseded by the Mac II, IIx and finally the Quadra 900. The Mac’s WIMP (Window Icon Mouse Pointer) environment was ideal for a non-technical musician. In addition, plenty of software for housekeeping and programme documentation was available.
The new range of digital MIDI synthesisers also used completely new techniques, such as frequency modulation (FM), featured in Yamaha’s DX7 . Its dramatic and highly musical sounds accelerated the demise of older analogue machines. Many more all-digital synthesisers were soon to follow.
The Workshop regularly produced material for television, involving the business of synchronising sound to picture. Working with film was fairly easy: the passing picture frames were counted (at a rate of 24 or 25 frames per second) and the time for each ‘cue’ was calculated. Later, the completed sound track could be transferred onto sepmag tape and then played in ‘synch’ with the original picture film. If the two weren’t in step, the sepmag tape and picture film could be ‘slipped’ against each other or edited with a razor blade. However, it was usual to edit the sepmag tape rather than the picture film!
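The arithmetic involved is easily sketched; assuming a constant frame rate of 24 or 25 frames per second, a counted frame converts directly into an elapsed cue time and back again:

```python
def frame_to_time(frame, fps=25):
    """Convert a counted picture frame into elapsed minutes, seconds and frames.
    Assumes a constant frame rate (24 fps for film, 25 fps for UK television)."""
    seconds, frames = divmod(frame, fps)
    minutes, seconds = divmod(seconds, 60)
    return minutes, seconds, frames

def time_to_frame(minutes, seconds, frames=0, fps=25):
    """The reverse calculation: where on the film a timed cue should fall."""
    return (minutes * 60 + seconds) * fps + frames

# A cue at 2'14" into 25 fps picture material lands on frame 3350.
print(time_to_frame(2, 14))   # -> 3350
print(frame_to_time(3350))    # -> (2, 14, 0)
```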
The arrival of video recording made things more difficult, since video tape didn’t have sprockets! The Workshop was originally provided with Shibaden half-inch helical-scan video machines that allowed the composer to see the original picture material. In some instances the ‘time of day’ or ‘time of recording’ would be ‘burnt in’ to one corner of the picture, showing the elapsed hours, minutes, seconds and frames. The composer would then make calculations to fit the new sounds to the pictures provided. Finally the new material would be checked against a stopwatch. The completed tape would normally be ‘played in’ during the process of ‘dubbing’ the final transmission tape. For this to work, the ‘play’ button on the playback machine had to be pressed at exactly the right time!
With a new generation of video machines, timecode could also be provided as useful data. In a Video Home System (VHS) machine, one of the stereo ‘hi-fi’ sound tracks carried SMPTE longitudinal timecode, whilst on the later U-matic machines it was conveyed via one of the ‘linear’ sound tracks.
Later, Vertical Interval Timecode (VITC) was introduced, consisting of timecode carried within the video signal itself. Unfortunately, this system wasn’t reliable on European VHS machines and the Workshop therefore had to standardise on the semi-professional Sony U-matic format.
When a video tape arrived, the composer copied the timecode (and sometimes the original sound track) from the VCR to an appropriate track (or tracks) on a multitrack tape recorder. The signal played back from the ‘timecode track’ could then be used to drive a timecode reader that displayed the elapsed time. This helped the composer to create new sounds that were ‘on cue’. Unfortunately, although a cue could be anticipated, there was no guarantee that the composer would actually hit it!
Ray Riley’s Timecode Memory Unit (TMU) was the first device that put Radiophonic studios under the control of timecode. It worked by accepting Binary Coded Decimal (BCD) signals from a standard SMPTE timecode reader. These were then used to generate a trigger, gate, switch-closure or bleep signal at a specified timecode. Hence any sequencer or tape machine could be ‘fired off’ when required. It also generated a regular metronome click at a rate locked to the speed of the timecode.
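A hedged sketch of the idea follows, assuming (for illustration only) that the reader delivered each timecode field as packed BCD, with two decimal digits per byte; the cue values shown are invented:

```python
def bcd_to_int(byte):
    """Unpack one packed-BCD byte (two decimal digits), e.g. 0x47 -> 47."""
    return (byte >> 4) * 10 + (byte & 0x0F)

def decode_timecode(hh, mm, ss, ff):
    """Convert four packed-BCD bytes from the reader into HH:MM:SS:FF."""
    return tuple(bcd_to_int(b) for b in (hh, mm, ss, ff))

# Timecodes at which something should be 'fired off' (hypothetical cues).
cues = {(0, 1, 30, 0): "start sequencer", (0, 2, 15, 12): "trigger bleep"}

def on_timecode_frame(hh, mm, ss, ff):
    """Called once per incoming frame; fires a trigger when a cue is reached."""
    now = decode_timecode(hh, mm, ss, ff)
    if now in cues:
        print("TRIGGER at %02d:%02d:%02d:%02d -> %s" % (*now, cues[now]))

on_timecode_frame(0x00, 0x01, 0x30, 0x00)   # matches the first cue
```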
The TMU was eventually replaced by Syncwriter, developed by Jonathan Gibbs (software) and Ray White (hardware). This also used the signals from a standard timecode reader, but was connected to the 1 MHz ‘bus’ of a BBC microcomputer and its video display. Syncwriter therefore gave a visual presentation of time progressing towards the fixed cue points. It also generated a range of different clock outputs, all locked to timecode, that could drive pre-MIDI sequencers, as well as duplicating the features of the original TMU. MIDI inputs and outputs were also provided, enabling it to generate MIDI timecode (MTC) or MIDI ‘clocks’ that could be merged with the MIDI output of a keyboard. Syncwriter was later updated by Ray Riley to accept SMPTE longitudinal timecode directly.
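The clock arithmetic behind such devices is straightforward. As an illustrative sketch (not Syncwriter’s actual code), a clock rate derived from the musical tempo can be expressed per timecode frame, so that the clock speeds up or slows down with the timecode itself:

```python
def clocks_per_frame(tempo_bpm, fps=25, ppqn=24):
    """How many sequencer clock pulses fall within one timecode frame.
    MIDI defines 24 clocks per quarter note (ppqn); pre-MIDI sequencers
    used other rates, hence the parameter."""
    clocks_per_second = tempo_bpm / 60.0 * ppqn
    return clocks_per_second / fps

# At 120 BPM against 25 fps timecode: 48 clocks per second, 1.92 per frame.
print(clocks_per_frame(120))   # -> 1.92
# If the tape (and hence the timecode) runs slightly fast, the clock follows,
# because pulses are issued per frame received rather than per real second.
```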
The final piece of the timecode jigsaw came much later. Although the composer could now create material synchronised to picture, the timecode had to be ‘stamped’ onto the final tape to prevent timing drift due to speed variation in the playback machine. The only way to do this (apart from using one audio track for SMPTE longitudinal timecode) was to use a quarter-inch tape machine with a ‘centre’ track designated for timecode. Two Studer A810 machines finally appeared in 1991, but by this time the speed stability and timecode options of Digital Audio Tape (DAT) had almost made them obsolete.
By 1987, the explosive growth in technology had made the conventional studio with its large mixing console almost unworkable. The composers found themselves surrounded by a veritable sea of keyboards and ‘effects racks’ bulging with equipment. Clearly a new approach was necessary and this would involve MIDI. One composer, Peter Howell, recognised that the central focus of an ideal studio ought to be a Macintosh computer that controlled all aspects of studio operation. By now, the task of mixing had almost become subservient to the creation and sequencing of sound. Three products appeared that offered a solution to the problem. These were the Yamaha DMP7 mixer, the Akai DP3200 audio matrix and an Apple Macintosh application known as HyperCard.
To all appearances, the DMP7 was just another compact eight-channel audio mixer, but internally the audio path was entirely digital, employing 32-bit processing for all functions, including special effects such as delay and reverberation. It was even fitted with motorised faders so that settings could be recalled in an instant. Furthermore, every switch and control setting could be adjusted via MIDI messages, albeit in a non-standard fashion. A look-up table within the DMP7 allowed the user to convert any incoming MIDI message into an appropriate instruction for the mixer.
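The DMP7’s actual message formats are not reproduced here, but the look-up principle can be sketched with invented values, translating an arbitrary incoming MIDI controller message into a notional mixer instruction:

```python
# Hypothetical mapping: incoming (MIDI channel, controller number) pairs
# are looked up and turned into mixer instructions. The real DMP7 table
# and its internal message format are not shown here.
lookup = {
    (1, 7):  ("channel_fader", 3),   # CC7 on ch.1 -> fader of mixer input 3
    (1, 10): ("pan", 3),             # CC10 on ch.1 -> pan of mixer input 3
    (2, 91): ("effect_send", 1),     # CC91 on ch.2 -> internal effect send 1
}

def translate(channel, controller, value):
    """Convert an incoming MIDI control change into a mixer instruction."""
    target = lookup.get((channel, controller))
    if target is None:
        return None                  # message not mapped: ignore it
    parameter, index = target
    return {"parameter": parameter, "input": index, "value": value}

print(translate(1, 7, 100))   # -> fader of input 3 set to 100
```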
The Akai DP3200 matrix had 32 audio inputs and outputs, and was controlled by a serial interface. On investigation this was found to be MIDI, although it used non-standard codes. These later caused problems with ‘intelligent’ MIDI interfaces, since they broke the rules of MIDI. Dummy data bytes were then added to persuade the interface to accept the data but these didn’t upset the matrix.
HyperCard seemed at first a rather curious Mac program, perhaps more suited to domestic chores such as accounting. But on closer inspection it was found to have a unique ability to create screen displays or ‘cards’ with pre-programmed buttons and boxes. These cards could then be bolted together and used with other proprietary software. HyperCard also included its own scripting language and could handle MIDI data, offering a real opportunity to reach the goal of a computer-controlled studio.
Putting these elements together resulted in a trial studio assembled in March 1988. This required 200 audio cables and considerable effort from Peter Howell, Mark Wilson (Development Coordinator), Ray Riley (Engineer) and Ray White (Senior Engineer). This exciting new studio system was soon in operation, the mixers and matrixes proving their worth. The software, based on HyperCard, was refined as the studio was used and later became known as Cue Card.
By January 1989, Elizabeth Parker was hard at work within the finalised version of the circular console. The new studio, Studio F, was on the site of the old Film Unit projection room and studio. Its console was self-contained, making it totally independent of the building infrastructure. It was provided with removable panels and cable covers, allowing wiring to be modified in minutes.
Five ‘input’ mixers were used, each positioned beneath a mini-rack containing the appropriate sound generators, and connected to two ‘output’ mixers. Three DP3200 matrixes completed the audio system. The ‘input’ mixers were assigned, in order, to synthesisers, samplers, the eight outputs of a TX816 synthesiser, drum machines and the outputs of the multitrack tape recorder.
The equipment was controlled from a Mac II with MIDI interface, a MIDI matrix and MIDI ‘distribution’ box. To record the fader movements of any mixer, a MIDI circuit was also connected from the mixers to the Mac. Data from all the mixers was combined using a chain of MIDI ‘mergers’. With everything under MIDI control, there was no need for a musical keyboard on each instrument; indeed, many synthesisers came in convenient rack-mounting boxes. Hence a single ‘master’ keyboard, a Yamaha KX88, was chosen. This included a ‘pitch bend’ wheel and sockets for foot pedals.
The installation included a Roland S-550, a multiple-output sampler that kept sounds on floppy disk. Later additions included an EMU Proteus that gave excellent ‘playback only’ samples, and the similar Procussion for percussive sounds. Both were multi-timbral machines that could employ all 16 MIDI channels. Other synths in the final line-up included Roland’s D550, and from Yamaha, the TX802, TG55 and TX816. The latter was really eight DX7 synths combined in one rack-mounted box.
The wide range of effects devices included Yamaha’s DEQ7 equaliser and SPX1000 effects unit, Roland’s SRV2000 reverb, Gatex noise gates, and Drawmer’s excellent compressor-expander. The first three were from a new generation of machines that used digital signal processing (DSP).
Following the success of Studio F, a second installation was completed in September 1989, this time for Peter Howell. In this studio, the sound quality was enhanced by replacing the audio links between the ‘input’ and ‘output’ mixers with digital circuits. To do this, one ‘output’ mixer was replaced by a DMP7D, which was similar to a DMP7 but had digital audio connections. Yamaha provided a Custom Interface Unit (CIU) to convert the stereo data from four mixers into eight mono signals.
This studio also saw the appearance of the innovative Akai DD1000, a four-track audio recorder that recorded digital audio onto erasable magneto-optical disks. Its digital audio outputs were fed, via an interface, into another DMP7D that, in turn, was fed into the ‘output’ mixers. In this way, the DD1000 was used as the master timing clock for digital signals throughout the studio.
Slowly but inexorably, the Macintosh computer itself was becoming a device for sampling and recording sounds. The Mac IIfx featured six slots that accommodated NuBus cards, each of which could provide special features. For example, Digidesign’s SoundTools card, with an interface box and software, provided stereo recording, editing and sound manipulation. The Workshop also used the low-cost AudioMedia cards: although less powerful, these didn’t need an interface box.
Useful though these systems were, the real breakthrough came with a new version of Opcode’s Vision sequencer program, known as StudioVision. This allowed previously-recorded soundfiles to be triggered from within a MIDI sequence. Surprisingly, soundfiles created using Digidesign’s system could be interchanged with those created using StudioVision and an AudioMedia card.
Next came Digidesign’s ProTools, a four-track system, again consisting of a single NuBus card and interface, but capable of providing more tracks whenever extra cards were added. By the end of 1992, this new hardware was being used to incorporate four-track material into MIDI sequences.
In October 1991, the Workshop opened an entirely new studio whose job was to remove sound rather than create it. The Sonic Solutions NoNoise system could strip interference or other unwanted sounds from any recording and allowed the restored material to be transferred onto CD-R.
A Mac IIfx computer (later replaced by a Quadra 900) was eventually equipped with three powerful Digital Signal Processing (DSP) NuBus cards. The first two cards provided a four-channel on-screen mixing desk, complete with faders, pan-pots and extensive equalisation. These cards also had a Small Computer System Interface (SCSI) port that was connected to 2.8 GB of hard disk storage and a Sony CD Encoder. From this encoder, audio data passed via optical fibres to five Sony CD Writers.
The third card was entirely for de-noising, providing two stereo digital audio inputs and outputs via optical fibre circuits. These were connected to an interface box that accepted inputs and outputs over standard AES/EBU or SDIF2 digital interfaces. The AES/EBU interface was now the ‘industry standard’ for digital audio devices whilst the SDIF2 (Sony) interface was used on older devices.
Source material would be loaded into the system, usually from Digital Audio Tape (DAT), creating a soundfile on hard disk. The effect of various settings would then be checked using the on-screen mixer. Once the best settings had been chosen, the system would process the material in the background. This was often achieved in real-time, using three separate ‘passes’ to remove crackles, noise and hiss.
The new soundfile could then be edited on the Mac’s screen as a foreground task. Sound waveforms appeared graphically, allowing the user to zoom in to see more detail or to move out for a general view. Sections of sound could be removed and replaced with ‘black’ silence, or the surrounding material could be joined up across the gap. Material could also be repeated or swapped between the stereo tracks.
The final soundfile could be ‘dumped’ onto tape, DAT or CD-R. To create a CD, the operator had to create a Table of Contents (TOC). This could be done manually, using flags that were shown on the graphical display, or automatically by setting a silence threshold and a duration.
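The automatic method can be sketched as follows, assuming the soundfile is available as a list of sample amplitudes; the threshold and duration values are purely illustrative:

```python
def find_track_starts(samples, sample_rate, threshold, min_silence_s):
    """Return sample positions where audio resumes after a qualifying silence.
    'threshold' is the amplitude below which a sample counts as silent and
    'min_silence_s' is the minimum silence duration, as the operator would set."""
    min_gap = int(min_silence_s * sample_rate)
    starts, silent_run = [0], 0          # the first track starts at the beginning
    for i, s in enumerate(samples):
        if abs(s) < threshold:
            silent_run += 1
        else:
            if silent_run >= min_gap:
                starts.append(i)         # sound has resumed: flag a new track
            silent_run = 0
    return starts

# Toy example: two 'tracks' separated by a long run of near-silence.
audio = [0.5] * 100 + [0.0] * 400 + [0.4] * 100
print(find_track_starts(audio, sample_rate=100, threshold=0.05, min_silence_s=2))
# -> [0, 500]
```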
The studio was built into an elongated version of the earlier circular console and was equipped with four Akai DP3200 matrixes. Three DMP7D mixers were used, each with an AD8X 20-bit A-to-D converter and SPX1000 effects processor. Other equipment included a Roland SN550 Digital Noise Eliminator , a Precision Power Phase Chaser and Audioscope spectrum analyser. Sony PCM2500 and PCM7030 DAT machines were also installed. The latter worked with SMPTE timecode and could be controlled directly from the Sonic system for automated loading and dumping of sounds.
A new matrix control application called Max was developed by Anthony Morson. This replaced one vital role of Cue Card in the move to a new generation of computers and MIDI interfaces.
New digital samplers and other devices allowed the Workshop to finally return to its tradition of making sounds from those of the real world, so reopening a vital repertoire for musical composition. During a period of electrifying developments, the Workshop had remained at the ‘sharp end’ of the industry. But in the final analysis, whatever technology had to offer, an artistically creative production could only come from the imagination, skill and endless patience of the composer.
© Ray White 2001.