This article is current as of February 4th, 2022.
A few months ago, Zoom announced that auto-generated captions (also known as live transcription) were now available for all Zoom meeting accounts. The development has been a long-awaited feature for the deaf and hard-of-hearing community.
As popular and ubiquitous as Zoom has become, it can be overwhelming to understand its many meeting features and options – especially with regard to closed captioning. Here at Aberdeen Broadcast Services, we offer live captioning services with a team of highly trained, experienced captioners using the largest known dictionaries in the industry. CART (Communication Access Realtime Translation) captioning is still considered the gold standard of captioning (see related post: FranklinCovey Recognizes the Human Advantage in Captioning). Our team at Aberdeen strives to go above and beyond expectations with exceptional captioning and customer service.
Whether you choose to enable Zoom’s artificial intelligence (AI) transcription feature or integrate a 3rd-party service, like Aberdeen Broadcast Services, the following steps will help ensure you’re properly setting up your event for success.
To get started, you'll need to enable closed captioning in your Zoom account settings.
Scroll down to the “Closed captioning” options.
In the top right, enable closed captions by toggling the button from grey to blue to “Allow host to type closed captions or assign a participant/3rd-party service to add closed captions.”
Below is a detailed description of the three additional closed captioning options in the settings...
This feature enables a 3rd-party closed captioning service, such as Aberdeen Broadcast Services, to caption your Zoom meeting or webinar using captioning software. The captions from a 3rd-party service are integrated into the Zoom meeting via a caption URL or API token that sends its captions to Zoom. For a 3rd-party service such as Aberdeen to provide captions within Zoom, this feature must be enabled.
As mentioned earlier in this post, auto-generated captions or AI captions became available to all Zoom users in October 2021. Zoom refers to auto-generated captions as its live transcription feature, which is powered by automatic speech recognition (ASR) and artificial intelligence (AI) technology. While not as accurate, ASR is an acceptable way to provide live captions for your Zoom event if you are not able to secure a live captioner. If you will be having a live captioner through a 3rd-party service in your meeting, do NOT check “Allow live transcription service to transcribe meeting automatically.”
Unless you expect to use Zoom’s AI live transcription for most of your meetings, it is best to uncheck or disable live transcription as Zoom’s AI auto-generated captions will override 3rd-party captions in a meeting if live transcription is enabled.
This setting gives the audience an additional option to view what is being transcribed during your Zoom meeting or webinar. In addition to viewing captions as subtitles at the bottom of the screen, users will be able to view the full transcript on the right side of the meeting.
The meeting organizer or host controls who can save a full transcript of the closed captions during a meeting. Enabling the Save Captions feature grants that access to the entire participant list.
Transcript options from 3rd-party services may vary. At Aberdeen Broadcast Services, we provide full transcripts in a variety of formats to fit your live event or post-production needs. For more information, please see our list of captioning exports or contact us.
Once the webinar or meeting is live, the individual assigned as the meeting host can acquire the caption URL or API token.
As the host, copy the API token by clicking on the Closed Caption or Live Transcript button on the bottom of the screen and selecting Copy the API token, which will save the token to your clipboard.
By copying the API token, you will not need to assign yourself or a participant to type. Send the API token to your closed captioning provider to integrate captions from a 3rd-party service into your Zoom meeting. We ask that clients provide the API token at least 20 minutes before an event (and no earlier than 24 hours) to avoid any captioning issues.
Once the API token has been activated within your captioning service, the captioner will be able to test captions from their captioning software.
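For the technically curious, the API token Zoom hands the host is simply a URL that acts as an HTTP endpoint: the captioning software POSTs each caption line as plain text to that URL, with an incrementing `seq` parameter so Zoom can keep the lines in order. Below is a minimal Python sketch of building such a request; the token URL shown is made up for illustration.

```python
from urllib.parse import urlencode

def build_caption_request(api_token_url: str, seq: int, lang: str = "en-US") -> str:
    """Append the sequence number and language to a Zoom caption URL.

    `api_token_url` is the caption URL / API token copied from the Zoom
    meeting; `seq` must increase by one with every caption line sent.
    """
    sep = "&" if "?" in api_token_url else "?"
    return f"{api_token_url}{sep}{urlencode({'seq': seq, 'lang': lang})}"

# Hypothetical token URL for illustration only:
token = "https://wmcc.zoom.us/closedcaption?id=123&ns=abc&expire=86400"
url = build_caption_request(token, seq=1)
# The caption text itself goes in the body of an HTTP POST to `url`,
# e.g. via urllib.request or the requests library.
```

Each subsequent caption reuses the same token with `seq=2`, `seq=3`, and so on – which is why the token should go to only one captioner at a time.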
A notification should pop up at the top of the Zoom meeting saying “Live Transcription (Closed Captioning) has been enabled,” and the Live Transcript or Closed Caption button will appear at the bottom of the screen for the audience. Viewers can now choose Show Subtitle to view the captions.
Viewers will be able to adjust the size of captions by clicking on Subtitle Settings...
Yes! Captioning multiple breakout rooms running at the same time is possible by using the caption API token to integrate with a 3rd-party service, such as Aberdeen Broadcast Services. Zoom's AI live transcription is currently not supported across multiple breakout rooms, which is why it is important to consult with live captioning experts to make it happen. Contact us to learn more about how it works.
Enjoy this post? Email sales@abercap.com for more information or feedback. We look forward to hearing your thoughts!
This weekend I had the privilege of attending a wedding where the bride is hearing and the groom is deaf. It was fascinating to watch them communicate with each other; she even serenaded him by “signing” a love song. I was asked a lot about my job as a caption editor, including why captions aren't always accurate. The couple had been watching a sitcom together, and some of the jokes lost their humor because the captions were severely edited down rather than verbatim. I discussed the importance of verbatim captioning and how at Aberdeen we make it our priority to provide accurate, verbatim captioning that does not detract from the original meaning. It was a moment to take pride in my professional career, but also sad to see that many programs still have poor captioning that barely meets the requirements. Hopefully this will change as awareness spreads of the importance of quality captioning.
Side note: I was very embarrassed by my own personal lack of sign language and was frustrated that I was not able to communicate as well as I would have liked. I think it is important that everyone knows at least a few fundamental and basic signs. Here is a website that provides a list of some common signs that you should know: http://www.start-american-sign-language.com/basic-words-in-sign-language.html.
My grandfather is a soft-spoken former Marine and my father has carried on the tradition of both soft-spoken tendencies as well as military service. The one thing that isn’t soft, however, is the volume at which they watch television. Both will ratchet up the sound until it’s blaring through the entire house. This may seem like an annoying habit or one that’s inconsiderate, at best, but for them, it’s the only way to hear. Both suffer from hearing loss – hereditary as well as linked to their military service.
My father is 80% deaf and wears hearing aids in both ears to compensate. His audiologist has told him that by the time he reaches 50 he will be nearly 100% deaf. My grandfather also wears hearing aids but often forgets to put them in, rendering them useless. The solution the family has found for my grandfather, who loves baseball and true crime movies, is closed captioning. He can now sit in the living room with the family bustling around him and follow a Law & Order storyline or a Tigers baseball game without missing a beat.
It’s an amazing thing to now work for a company that provides these services for people around the world. Having my family be affected by hearing loss makes me so much more appreciative of people like those on our team who strive to deliver 100% accurate captioning.
Written by Amber Kellogg
Many people today read text written in all capital letters as text that is meant to be SHOUTED AT YOU! Over the years, researchers have set out to determine which style of text is the most legible. The general consensus today is that lowercase words are easier to read. One reason is simply familiarity: the majority of print we read is set in mixed case. Another is that the human brain reads words, not letters. Words typed in mixed case have more distinctive shapes and varied contours, which lets us recognize the whole word rather than decoding each letter. Words set in all capitals are read letter by letter, and hence reading rate is approximately 15% slower.
So the question remains: Why are the majority of closed captions written in all capitals? The practice began as a workaround for an early decoding problem. The original decoders had trouble with letters that have descenders (g, j, p, q, y); these letters got squished to fit in the caption line and became illegible. The easy solution at the time was to convert all text to capitals, and that became the trend for many years to follow.
But is there still a need to caption in all caps? The answer is no. Today's decoders have no problem with the once-tricky descenders. So, in an effort to move into the 21st century and enhance readability, many closed captioning companies are choosing to switch to mixed-case captions. What are your thoughts on this?
The statistics are astounding: 42 million American adults are illiterate and 50 million are unable to read higher than a 4th-grade reading level. Studies show that frequent reading improves reading proficiency, although sadly, according to the “Washington Post,” only one in four Americans actually read a book in the past year. Instead of reading books, the average child in America watches 20 hours of TV each week. How do we remedy this ongoing problem? What if watching television could actually improve literacy?
According to studies conducted in Finland, it can! Children in Finland watch as much TV as Americans and even attend school less. Yet, they rank higher in educational achievement and literacy. Jim Trelease states in “The Read-Aloud Handbook” that Finland’s high test scores are due to their use of closed captions. Children in Finland want to watch American sitcoms but are only able to understand them by watching the Finnish words at the bottom of the screen. Therefore, they must read to watch!
Victoria Winterhalter Brame conducted her own study after reading Trelease's book. In her article “TV that Teaches: Using Closed Captions,” she writes about introducing her daughter to closed captioning at a young age; once her daughter started kindergarten, she began to recognize her vocabulary words on the screen while watching television.
Perhaps there is a simple solution to this nationwide problem of illiteracy and reading deficiency: turn on the closed captions!
I find that most producers and television stations don't really love the idea of captions; in fact, they often find them a nuisance. But there are reasons everyone should love captions – it is just a matter of being aware of the 94 million Americans who use, and most likely love, closed captions. Read the top five reasons why you should love closed captions too.
1. The Deaf and Hard-of-Hearing Love Captions
I guess this is a given, but most people do not know how many people are actually affected when a television program has no closed captions. The number of deaf and hard-of-hearing Americans is astounding – approximately 28 million people! Equal rights for all include providing access for all, which is something we can all smile about.
2. Multi-lingual America Loves Captions
Being from the LA area, I hear how many different languages are spoken in the US. All I have to do is go to a local market to hear them – Japanese, Spanish, Korean, Arabic, Farsi, Vietnamese... you name it! The number of non-native English speakers in America is growing, and many of them are actively learning English; in fact, over 30 million Americans are learning English as a second or other language. What easier way to learn than to watch television in English with English closed captions? You are not only listening to the language but reading it. Your neighbor is learning English with the help of captions... and that should make any American happy.
3. Grandma Loves Captions
As we all know, not all grandmas (or grandpas, for that matter) need closed captions, but many – even those without significant hearing loss – enjoy watching television with them. My 85-year-old grandmother lives alone and is pretty much housebound because she is on oxygen. Her main form of entertainment is television with closed captions, even though she has very little hearing loss. The day her cable box stopped working, she almost had a panic attack. Even if it is just to watch the latest Ellen DeGeneres show, closed captions make my grandma happy... and probably yours too.
4. Six-year-olds Love Captions
Anyone who has young children, has had young children, or knows young children understands that it can be a struggle teaching them to read. Ten million American school-age children are learning to read in school, but ultimately parents need to take time to read with their children and have their children practice reading to them. Most children watch some amount of television daily – studies show an average of 1,680 minutes a week. Children who watch television with captions on improve their reading skills far faster than children who do not. That is pain-free teaching for parents... and that would make any parent happy.
5. Adults Who Cannot Read Love Captions
It sounds strange that someone who cannot read would love captions, but captions are a tool to help illiterate adults learn to read. It may be an astonishing fact, but about one in 20 adults in the U.S. is not literate in English, and 27 million American adults are working to improve their literacy skills. Captions are an easy aid in the adult learning process. You may not have realized how many adults cannot read, but knowing your captions may help makes turning them on worthwhile.
It is hard to appreciate the usefulness of captions unless you know someone whom captioning directly affects. Maybe captions could help someone close to you who is not yet using them. Pass on the word and turn on your captions – they may help more people than you think! Do you love captions yet?
Sony Electronics and Aberdeen Captioning along with software developer CPC have joined forces to develop the first file-based closed-captioning system that maximizes the benefits of Sony’s XDCAM HD422 tapeless technology. The new workflow uses Sony’s PDW-HD1500 optical deck to make the process more efficient, faster and more flexible.
“Because the XDCAM system is file-based, we’re able to do our work in a much more refined and streamlined way,” said Matt Cook, President of Aberdeen Captioning. “Now, once someone is done with their XDCAM edit, we take their file, caption directly onto that file, and then place it back onto the disc. We’ve eliminated the need to go through a closed-captioning encoder—which can cost up to $10,000—therefore eradicating the requirement to do real-time play-out.”
According to Cook, clients—which include a range of broadcast networks, groups and independent producers—benefit from faster turnaround times and a more cost- and time-efficient process than previous methods.
“The primary benefit for clients is that they can keep their file in its original form, and send it to us on a hard drive, via FTP site, or on a disc,” he said. “Once we put the captioning data back in the video file, we can then return it to the client in the format of their choice.”
The PDW-HD1500 deck is designed for file-based recording in studio operations. A Gigabit Ethernet data drive allows it to write any file format from any codec onto the optical disc media, and it also makes handling either SD or HD content much easier.
“This deck is perfect for applications like closed captioning, where turnaround time is often critical and multi-format flexibility is a key,” said Wayne Zuchowski, group marketing manager for XDCAM system at Sony Electronics.
Cook added, “We can handle any format without a problem. That type of capability and functionality is very important to us because as a captioning company we’re required to deliver a finished product in any format a client requires.”
When Aberdeen receives content from a client, the company first converts it to a smaller “working file” – for example, a Windows Media or QuickTime file – which is used for transcribing, captioning, and timing.
“Once the captioning work is done, we marry the original MXF XDCAM file and our captioned data file through our MacCaption software,” Cook said. “With the press of a button, both files are merged, and we can drag and drop it back onto the disc and send out, or FTP it to a client and they can drag and drop onto a disc.”
The Sony and Aberdeen joint captioning system will be on display at NAB in Sony’s exhibit, C11001, Central Hall, Las Vegas Convention Center.
Article Written by Tom Di Nome, from Sony Electronics
Copyright notice:
© Sony Electronics & Aberdeen Captioning, Inc. 2009.
This article can be freely reproduced under the following conditions:
a) that no economic benefit be gained from the reproduction
b) that all citations and reproductions carry a reference to this original publication, online at http://www.abercap.com/blog
The FCC (Federal Communications Commission) has implemented numerous closed-captioning mandates to make television more accessible to deaf and hard-of-hearing viewers, but they have failed to bring forth quality requirements in the face of many consumer and agency complaints.
To review the FCC closed captioning mandates, visit the FCC website.
After reviewing the requirements, you can see that they mainly address how much programming is required to be closed captioned; they say nothing about the quality or accessibility of the captioning itself. In 2005, deaf and hard-of-hearing advocacy organizations (TDI, NAD, the Hearing Loss Association of America, the Association of Late-Deafened Adults, and the Deaf and Hard of Hearing Consumer Advocacy Network) filed a petition with the FCC addressing the quality of closed captioning. Almost three and a half years later, the FCC has still not addressed the issue in its entirety.
To see the entire petition, complaints and responses go to: Hearingloss.org
You will note that many of the complaints are in regard to lack of CC or quality CC for emergency television. The FCC has addressed that situation, but it is not 100% mandated for all stations and networks.
Later this year, the FCC "should" be passing rules to make filing complaints easier and more effective. For more information, read: Filing Captioning Complaints
Even though the possible new complaint procedure may push producers, networks, and TV stations to caption their programs better, the FCC has not come up with specific rules about the quality of closed captioning. One very easy mandate that would make a significant difference was among the points raised in the 2005 petition:
- All pre-recorded programming should be captioned offline as opposed to in real-time.
The truth of the matter is that many pre-recorded programs are captioned in a live fashion, therefore significantly decreasing quality.
The main reason shows are captioned this way comes down to cost: many captioning companies offer live-style captioning for pre-recorded programming simply to stay afloat in such a competitive market. All the FCC would have to do is mandate that live-style captioning cannot be used for pre-recorded programming, and the overall quality of captioning for pre-recorded programming would significantly increase.
Now that almost all programs are being closed-captioned and it has become the norm for TV producers and stations, I predict that more laws will get passed in the near future addressing the quality of closed captions. This would completely change the way *some* closed captioning companies operate, allowing the "good" companies to shine and stand out from the crowd.
The whole point of closed captioning is accessibility, and if the captions are not completely readable, there is no access.
One of the most common questions I get from prospective clients – and even friends – is: What is the difference between roll-up captioning, pop-on captioning, and subtitling? People often assume that captioning is the same thing as subtitling, which it isn't. To take the question further, I will also explain when each is ideally used.*
Captioning VS. Subtitling
Captioning was created so deaf and hard-of-hearing viewers could read along with TV shows. A technology was needed that was accessible to deaf viewers but optional for hearing viewers. So today, closed captioning is decoded by a decoder chip in the television and must be activated to view. Captions are white letters on a black background, in a font similar to Courier New.
Subtitling, on the other hand, was originally created so viewers of programming in a language other than their own could read along in their own language. Unlike captions, subtitles cannot be turned on or off through a TV decoder chip; they are burned onto the video. If you are watching subtitles on a DVD or Blu-ray Disc, they can be turned on or off through the menu. Subtitles can be in different fonts or colors and usually do not have a black or transparent background.
Roll-up Captioning
What is it?
Roll-up captions scroll up the screen line by line, usually two to three lines at a time. This is the most basic form of captioning, as it usually does not include extensive sound-effect description or speaker identification.
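For readers who think in code, the scrolling behavior can be pictured as a fixed-size window: each new line of dialogue pushes the oldest line off the top. This toy Python sketch is purely illustrative, not a real caption encoder:

```python
from collections import deque

def roll_up(lines, window=3):
    """Yield what a roll-up caption display shows as each new line arrives.

    A roll-up display holds a fixed number of lines (usually two or
    three); every new line scrolls the oldest one off the top.
    """
    display = deque(maxlen=window)
    for line in lines:
        display.append(line)
        yield list(display)

frames = list(roll_up(["FIRST LINE.", "SECOND LINE.", "THIRD LINE.", "FOURTH LINE."]))
# By the fourth line, "FIRST LINE." has scrolled off the top of the window.
```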
When is it used?
Roll-up captioning is mainly used for ALL live programming and for post-production broadcast programming that only has one speaker (not very common).
For an example of roll-up captioning, view the video on this page: roll-up video
Pop-On Captioning
What is it?
Pop-on captions pop on and off the screen one caption at a time. They typically look like a square block, and each caption usually consists of two to three lines. Pop-on captions should include sound-effect descriptions as well as caption placement to identify speakers.
When is it used?
Pop-on captions should be used for pre-recorded broadcast programming with multiple speakers.
For an example of pop-on captioning view the video on this page: pop-on video
Subtitling
What is it?
Subtitles pop on and off the screen just like pop-on captions, but they typically do not have a black background and can be any font and color.
When is it used?
Subtitles should always be used for DVD and Blu-ray Discs as they can be turned on and off through the menu. They should also be used for broadcasts in countries where the programming is of a language other than the country's primary language.
For an example of subtitles view the video on this page: subtitling video
*Please note that this article aims to be a general explanation for readers with no prior knowledge of the topic. It does not go into depth on the technical differences between captioning and subtitling. I specifically discuss captions for broadcast, not other purposes like online video, et cetera. When I speak about captioning, I am referring to Line 21 (analog) captioning, not captioning for HD.
Since Blu-ray has largely become accepted as the new HD disc format standard, there have been many inquiries about closed captioning and subtitling for Blu-ray Discs (BD).
To set the record straight, Blu-ray does not support traditional closed captioning. BD does not support Line 21, the traditional format for analog closed captions, because it adheres to the High-Definition Multimedia Interface (HDMI) specifications, which were designed to replace older digital and analog standards. The practical substitute is subtitles, which on Blu-ray can be easily turned on and off through the disc's menu, just like on standard DVDs.
If you're looking to transfer a DVD or any other standard-definition video to Blu-ray Disc, you might be concerned about the need to recreate subtitles. Fortunately, your closed captioning company can convert your existing caption files into Blu-ray-compatible subtitles for your authoring system. This might require some reformatting depending on the original captioning method used.
Blu-ray subtitles offer several advanced features compared to standard SD subtitles. Unlike SD subtitles, which are limited to a single font type, size, and color, Blu-ray allows for much greater flexibility. With Blu-ray, it’s possible to create multiple layers of subtitles, incorporating up to six different colors, fonts, and sizes. This means you can vary the appearance of subtitles for on-screen signs or dialogue, enhancing speaker identification and enriching the viewer’s experience. It’s even possible to make sound effects stand out from dialogue, turning basic subtitles into a visually engaging component of your media.
The file type for Blu-ray subtitles is an XML-based textual format accompanied by an image (typically PNG, which supports the transparency subtitles require) of each subtitle. This is similar to the system used in DVD authoring, where the XML file serves as a directory, dictating the placement and timing of each subtitle image on the screen.
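As a rough illustration of how such a manifest pairs timing with an image file, the Python sketch below builds one subtitle event. The element and attribute names are illustrative of the general BDN-style shape, not any authoring tool's exact schema:

```python
import xml.etree.ElementTree as ET

# Illustrative only: names approximate a BDN-style subtitle manifest,
# not any vendor's exact schema.
event = ET.Element("Event", InTC="00:00:05:00", OutTC="00:00:08:00")
graphic = ET.SubElement(event, "Graphic",
                        X="300", Y="900", Width="720", Height="60")
graphic.text = "subtitle_0001.png"  # the pre-rendered subtitle image

xml_text = ET.tostring(event, encoding="unicode")
```

The authoring system reads entries like this one to know which image to display, where on screen, and between which timecodes.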