Rebecca White tells us her experience as a subtitler with Red Bee Media

I have to confess – a career in subtitling was not a burning ambition of mine. In fact, it had never occurred to me. My encounter with the world of subtitles was a happy accident. I’d studied French and Spanish at university, and wanted to work with language in some way. After graduating, I spent a year in Montreal and was offered a temporary admin role in a school for deaf and disabled children. It was amazing to see the classes being delivered in ASL. I really enjoyed learning more about and interacting with the deaf community. My interest was piqued. So on my return to London, when I saw a vacancy for a subtitling assistant, I didn’t think twice. I worked as an assistant in the subtitling and signing departments before becoming a fully-fledged subtitler.


History of Subtitling

Subtitling departments were established in the BBC in the early ‘80s, but the first BBC programme to carry subtitles aired back in 1979 – a documentary about deaf children called Quietly In Switzerland. This also marked the first use of CEEFAX subtitles for the deaf in the world. In 1986, Blue Peter became the first programme with live subtitles. As a result of the 1990 Broadcasting Act, subtitling output increased on the BBC, and in response to the growing demand from deaf and hard-of-hearing audiences to see subtitles on network news programmes, the BBC set up a specialised live subtitling unit. Fast forward a couple of decades, and Red Bee Media now provides subtitles for 100% of programmes on both Channel 4 and the BBC. Sports channels like Sky Sports and ESPN have more live content, which means more real-time subtitling. Such a huge increase in subtitled output required the development of new technology. This is where voice recognition software comes in.

How does it work?

That’s right – we use our voices. Contrary to popular belief, live subtitling does not involve typing really fast. Some subtitles are produced by stenographers, who transcribe speech by writing in shorthand on a stenograph machine. But the vast majority of us now use speech recognition software to repeat – ‘respeak’ – what’s being said on the TV in a computer-friendly, robotic voice to produce on-screen text. We also add in punctuation [comma], move the subtitles’ position [comma] and change colours when there’s a new speaker [full stop]. This accounts for the slight delay in the subtitles reaching your telly, and also for some of the mistakes you may notice.
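To make the idea concrete, here is a minimal sketch (in Python, purely illustrative – this is not Red Bee’s actual software, and the bracketed command names are assumptions for the example) of how a respeaker’s spoken punctuation commands might be turned into formatted subtitle text:

```python
# Illustrative sketch: converting "respoken" tokens, including spoken
# punctuation commands like [comma] and [full stop], into subtitle text.
# The command vocabulary here is invented for this example.

COMMANDS = {
    "[comma]": ",",
    "[full stop]": ".",
    "[question mark]": "?",
}

def format_respeech(tokens):
    """Join respoken tokens, attaching each punctuation command to the
    preceding word and capitalising the word after a sentence break."""
    out = []
    capitalise_next = True
    for token in tokens:
        if token in COMMANDS:
            if out:
                out[-1] += COMMANDS[token]
            if COMMANDS[token] in ".?":
                capitalise_next = True
        else:
            out.append(token.capitalize() if capitalise_next else token)
            capitalise_next = False
    return " ".join(out)

print(format_respeech(
    ["anything", "can", "happen", "[full stop]",
     "this", "is", "live", "tv", "[full stop]"]
))
# -> "Anything can happen. This is live tv."
```

Real respeaking software does far more than this (speaker colours, positioning, timing), but the basic shape – commands interleaved with dictated words – is the same.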

I promise you that we subtitlers do know the difference between having ‘patience’ and ‘patients’. While the technology we use is impressive, it’s not quite as sophisticated as human language (yet!). It particularly hates homophones – words of different meaning that sound the same. Unfortunately, the English language is full of them. Wherever possible, I use the software to create vocal commands which can get around these. If I didn’t, you could end up reading – ‘The finish athlete eight at a tie restaurant in soul’ instead of ‘The Finnish athlete ate at a Thai restaurant in Seoul’. Similar problems arise with tricky names and obscure vocabulary. If there’s breaking news about the president of Turkmenistan and I’ve not entered it into my dictionary, the computer’s going to go with its best guess. And with a name like Gurbanguly Berdimuhamedow, the results probably won’t be pretty! We prepare thoroughly for all output but it’s impossible to predict exactly what vocabulary we’re going to need. This is live TV – anything can happen.
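The custom-dictionary trick described above can be sketched very simply. In this hypothetical Python example (the dictionary entries are invented for illustration, not taken from any real system), a spoken “command” phrase maps directly to the exact text that should appear on screen, so tricky names never reach viewers as the engine’s best guess:

```python
# Illustrative sketch of a respeaker's custom dictionary: each spoken
# command phrase maps to the exact on-screen text. Entries are invented
# examples for this sketch only.

CUSTOM_DICTIONARY = {
    "turkmen president": "Gurbanguly Berdimuhamedow",
    "finnish athlete": "Finnish athlete",
    "city seoul": "Seoul",
}

def apply_dictionary(transcript, dictionary):
    """Replace any recognised command phrase in the raw transcript with
    its stored replacement text."""
    for phrase, replacement in dictionary.items():
        transcript = transcript.replace(phrase, replacement)
    return transcript

print(apply_dictionary(
    "breaking news about turkmen president today",
    CUSTOM_DICTIONARY,
))
# -> "breaking news about Gurbanguly Berdimuhamedow today"
```

The real work, of course, is anticipating which entries you will need before the broadcast starts.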


You’re probably wondering why there’s a need for the middleman. Why can’t the software pick up the audio directly from the studio? Again, the technology is not advanced enough for that. Once my own ‘voice model’ is loaded, the computer only understands me. Think of a debate show like Question Time, where there can be several people talking (or shouting) at the same time, often at high speed and in different regional accents. In this case, I need to edit carefully on the spot, so that the most essential points are subtitled. This requires an on-the-spot judgment call – from a human being, and not a computer.

Fortunately, there’s no chance of any dictionary malfunctions or real-time edits when subtitling pre-recorded programmes. We receive these clips in advance and have time to research vocabulary and ensure the punctuation is just so. Many of my colleagues have language degrees, which may seem strange considering we generally subtitle from English into English. I think it indulges the linguist’s love for words – and our pedantry about grammar. Personally, I’ve always felt that monolingual subtitling involves a certain level of translation. Whether it be accurately rendering regional and foreign accents, puns that are essential to a punch-line, or adding labels to depict sounds, our job is to provide the same information and experience that a hearing viewer enjoys. And, as you can see, that can be more complicated than it looks.

As for what the future holds for us subtitlers – who knows? Just a few years ago, I’d be met with a blank stare when I tried to describe ‘respeaking’. Yet now, thanks to smartphones and apps like Siri and S-Voice, more and more people are aware of voice recognition. Will viewers’ demands change as technology improves, and their awareness of it grows? How will changes to people’s TV viewing habits affect subtitling on TV? And what about expanding the use of remote captioning, delivering subtitles straight to your laptop from the other side of the world? Ah, a subtitler’s work is never done.

Rebecca White is a subtitler at Red Bee Media Ltd and is based in London. Red Bee provides ‘access services’ – subtitles, signing and audio description – for a variety of TV channels in the UK, including the BBC, Channel 4, UKTV and Sky.

17 thoughts on "Rebecca White tells us her experience as a subtitler with Red Bee Media"

  1. Dawn says:

Really enjoyed reading this article/interview. Hopefully more people will become aware of the challenges faced by subtitlers, and the technology behind live subtitling will improve too 🙂

    1. Rebecca White says:

      Thanks for your comment, Dawn. I’m really glad you enjoyed the article.

  2. Sylvia Webb says:

    Thank you, Rebecca, your article is really interesting. As someone who did the “shorthand/typing” route at school and became a “secretary”, and spent much of her life (before hearing loss) using this knowledge, I have often pondered on whether there can ever be nearly instantaneous subtitling. Having also transcribed direct from a “dictaphone”, I am very aware of how many times nobody in the office could understand a sentence someone had dictated onto the machine, and in the end we all just made a guess, based on the subject matter with which we were familiar. This again makes me wonder just how subtitling “live” can ever really be expected to achieve what most people seem to expect of it. Continue with the good work – I am so aware of what you must be struggling with – it is really, really appreciated.

  3. Rebecca White says:

    Thanks for your comments, Dawn and Sylvia. I’m really glad you both enjoyed the article (and appreciate the challenges that live subtitling presents!).

  4. Susan white says:

Really enjoyed your article, Rebecca, thank you. I now have a much deeper understanding of what subtitling involves. Not so easy, eh! Very enlightening and very well expressed. Well done.

  5. Joni says:

    Wow, what a fascinating and well-written article! I loved learning the intricacies behind your everyday work, Becka. (Also, I love the idea of the `voice model` that only recognises you and your accent.) Live subtitling must be so fraught with challenges! Red Bee is lucky to have you, with your linguistic talents. Well done on such incredible and important work.

  6. Tin says:

Thank you for throwing light on how subtitles appear so quickly on ‘live’ programmes. I have often wondered. Will still have a chuckle when the wrong word appears occasionally, but at least now I know it is the machine and not the subtitler making the mistake. Important work which many of us take for granted. Many thanks Becka for such an interesting article.

  7. Jo says:

    Great article, Rebecca. I would love to get in contact with you.

  8. KC says:

    Wouldn’t it be easier and more effective to type subtitles though?

  9. Mairi Millar says:

Thanks for this great article, Rebecca. I have just applied to Red Bee Media for a Subtitler position and found this in my research of the job. I had assumed subtitles were typed. I am good with language and accents so am going to give it a go! Mairi

  10. Brian Quass says:

    Nice article, Rebecca. I myself have been a subtitler for the past quarter century here in the states, although here we usually refer to subtitling for the deaf as ‘closed-captioning’, which has different technical specs and style rules than does subtitling for a hearing audience (though today we have a hybrid, too, that serves both deaf and hearing viewers at the same time).

    Although technology, as you point out, continues to help improve such services, I wanted to sound a note of caution, too (particularly in regard to the captioning of the prerecorded programming with which I work) because through the years, I have seen technology often dictate captioning style rather than more germane issues such as readability and aesthetics. In addition, broadcasters and producers are often more concerned about caption cost than they are about caption quality, and when they do listen to their viewers on the subject of captioning, they are often listening to “fully hearing” viewers who don’t understand some of the important differences between old-fashioned subtitling and captioning (subtitling for the hard of hearing). Timing captions to shot changes, for instance, has been shown in studies to improve readability for deaf viewers (not to mention improving the aesthetic viewing experience for everybody, the captions thus appearing less ‘ragged’) but since such timing TAKES time and cannot be speedily accomplished by today’s technology, it is often the first thing to go when a captioning company wants to save money. The captioning field is particularly vulnerable to such monetarily based decision-making, of course, because it is one of the few jobs in the entertainment industry that has no union or similar organized representation (unlike gaffers, boom operators, key grips, set dressers, you name it).

  11. RSP says:

    It still beggars belief the number of errors apparent in the subtitling of pre-recorded or scripted material. Doesn’t anyone review them before publishing? I’ve just finished watching Jet! When Britain Ruled The Skies where the errors leaped into double-figures. Certainly spoils the enjoyment of the programme.

  12. Chas Donaldson says:

    “Fortunately, there’s no chance of any dictionary malfunctions and real-time edits when subtitling pre-recorded programmes.”
    Just had to share this with some Deaf and hard-of-hearing friends – how we laughed.
    Quick example – subtitling of “Top Gear” constantly puts “break” for “brake” and “grill” for “grille”.

  13. Annette Llewellyn says:

    Rebecca – Thanks for the enlightening article. Keep up the good work and be prepared for more work as the youth of today lose their hearing because they haven’t protected it from decibel overloads at concerts and listening to music way too loud through headsets and earphones.

  14. David Thompson says:

The best subtitles are done by stenographers. This voice recognition software is useless. People of my age deserve better. Can we please have more subtitles by stenographers?

  15. Anne says:

    Thanks for such a great description! I am applying for the same position right now!

  16. Kay C says:

    Many thanks for your interesting article Rebecca, but especially for the work you and your colleagues do. I now understand the very slight lag.
    Hearing loss is so isolating. I needed hearing aids some 5 years ago but still needed the sound up really loudly. I still could not distinguish key words or accents clearly so missed crucial points. I discovered Red Bee Media on my iPad. I am truly delighted as I now gain full meaning from the text and thoroughly enjoy the programs shown. Also I can turn my hearing aids up and reduce the volume on the iPad so I don’t disturb others.
    I find the service brilliant! As an English “pendant” I delight in the accuracy you achieve and am amused by the very few slight errors. Thank you and your colleagues so much for your impressive skills and for giving me so much enjoyment.
