It is safe to say that most people who speak more than one language understand that there are certain idiosyncrasies inherent in each language. Sometimes it is difficult to accurately translate certain sayings and the end result can be comical.
In an Athens hotel there is a sign that reads: “Visitors are expected to complain at the office between the hours of 9 and 11 a.m. daily.” A sign for a Copenhagen airline reads: “We take your bags and send them in all directions.”
The most common errors in translation occur when attempting to translate idioms and sayings word for word. For example, “A quien madruga Dios le ayuda” would equate to the English expression “the early bird gets the worm,” but if translated word for word it would be “he who gets up early is helped by God.”
There also can be ambiguity with words. For example, an address can be a location or a speech. Many English words are homonyms, which can also lead to errors in translation. The word polish can be a verb meaning to make smooth or glossy, but it can also refer to the inhabitants of Poland or the Polish language.
Translators and subtitlers need to be aware of popular idioms and the ambiguity of certain words in order to avoid inaccurate translations. Here are a few humorous subtitling blunders (examples taken from Subtitling Worldwide).
In one subtitle, “they sicked the dog on me” became “they caused the dog to be nauseated by me.”
In another, a man sitting in a car “cracked the window,” meaning he opened it just a little bit. According to the subtitler, the man “smashed the window.”
“Wish him many happy returns for me” was translated as “wish him many more reincarnations.”
A soldier was shot dead, and another soldier closed his eyes and said, “Rest easy.” The Dutch subtitle read: “Take a nice little break.”
With a growing awareness of the importance of closed captions and subtitles, there is a need for universal consistency and standards of excellence. Here are a few key guidelines, proposed by Mary Carroll and Jan Ivarsson in their 1992 book, Subtitling, that Aberdeen implements to achieve quality subtitles.
Video Copy and Glossary: Subtitlers should always work from a video copy of the production. Providing a glossary of unusual words, names, and specialized terms ensures accuracy and consistency across the subtitles.
Compression of Dialogue: When dialogue must be compressed for subtitling, it's essential that the meaning remains clear and coherent.
Translating On-Screen Text: All critical on-screen text, like signs or notices, should be translated. It's also beneficial to include what might be considered "superfluous" information, such as off-screen voices and names, to assist hearing-impaired viewers.
Subtitling Songs: Songs should be subtitled when they are relevant to the content or contribute to the understanding of the narrative.
Timing of Subtitles: Subtitles should align closely with the rhythm of the dialogue, editing cuts, and sound bridges in the film. They should appear and disappear in sync with the audio to preserve the natural flow of conversation.
Emphasizing Key Elements in Subtitles: Subtitles should effectively convey elements of surprise or suspense without undermining them. This involves careful placement and timing relative to the visual and auditory cues in the content.
Reading Rhythm: The duration of subtitles should accommodate the average viewer's reading speed—generally not appearing for less than one second or more than seven seconds, except in the case of songs.
Synchronization: There should be a close correlation between what is spoken in the film and what is subtitled, with efforts made to synchronize the source and target languages as closely as possible.
Legibility of Subtitles: Subtitles must be easy to read, with clear lettering and a suitable font. Techniques like adding a drop shadow or background box can enhance readability.
Consistency in Positioning: The placement of subtitles should be consistent throughout the production, aiding in viewer comprehension and minimizing distraction.
Character Limit: The number of characters per line should be compatible with the subtitling system and should be legible on any screen size.
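To illustrate how the timing and length guidelines above might be checked automatically, here is a minimal sketch of a subtitle quality check. The one-to-seven-second window comes from the guidelines; the 42-character line limit is an assumption for illustration, since the actual limit depends on the subtitling system and target screen.

```python
# Sketch of a subtitle QC check based on the guidelines above.
# MIN/MAX durations follow the "Reading Rhythm" guideline; the
# 42-character line limit is an assumed value, not a universal rule.

MIN_DURATION = 1.0   # seconds: subtitles should not appear for less than one second
MAX_DURATION = 7.0   # seconds: nor for more than seven (songs excepted)
MAX_LINE_CHARS = 42  # assumed per-line character limit

def check_subtitle(start, end, text):
    """Return a list of guideline violations for one subtitle cue."""
    problems = []
    duration = end - start
    if duration < MIN_DURATION:
        problems.append(f"too short: {duration:.2f}s on screen")
    if duration > MAX_DURATION:
        problems.append(f"too long: {duration:.2f}s on screen")
    for line in text.splitlines():
        if len(line) > MAX_LINE_CHARS:
            problems.append(f"line exceeds {MAX_LINE_CHARS} chars: {line!r}")
    return problems

print(check_subtitle(10.0, 10.5, "Hold your horses!"))
print(check_subtitle(10.0, 13.0, "Rest easy."))
```

A real subtitling workflow would also check synchronization against shot changes, but even a simple pass like this catches the most common timing mistakes.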
We’ve been blogging about this legislation for months, and the battle has been won! President Obama signed the 21st Century Communications & Video Accessibility Act into law on Friday, October 8, 2010, during National Disability Employment Awareness Month. This new legislation requires smart phones, television programs, emergency broadcast information, the Internet, menus on DVD players, program guides on cable TV, and other communication devices to be accessible to people with disabilities, thus creating more opportunities in the workplace and classroom. It also provides funding for deaf and blind people to purchase communication equipment and services. With society’s reliance on the Internet and other modern technologies, this legislation simply gives people with disabilities access to technologies that are used in everyday life.
Many people fought long and hard for this bill to become a law. Representative Edward Markey authored the bill in 2009. He had this to say: “We’ve moved from Braille to broadband, from tracing words in palms to navigating a Palm Pilot. Americans with disabilities need access to the latest 21st century communications and video tools to compete in the job market and engage in daily activities that increasingly rely on the latest technologies.”
Have you ever noticed how fast you speak sometimes? I know we all notice the speed at which we drive down the freeway, even if we pretend not to, but have you ever stopped to think about the speed at which you speak? And more so, could a dragon help you realize this?
I don’t know about you, but I had never thought about any other way of typing than to actually use my fingers and strike the keys on a computer keyboard. I know I’m not the fastest typist out there, but I’ve always felt proud of my 70 wpm typing speed. Somehow I managed to write all my college papers without ever missing a deadline. Granted, some of those papers were written the night before they were due and sometimes even printed out 30 minutes before class started, but they were always on time. Then, of course, I was left walking like a zombie around campus for the rest of the day; nothing a good cup of coffee couldn’t fix, right?
About 2 months ago I applied for a transcriber/caption editor position at Aberdeen Captioning, and while going through the interview process I was asked to transcribe an 8-minute long video. As soon as I opened the video and saw the length of it, I thought, “Piece of cake,” so I started typing away. After half an hour of changing back and forth between Microsoft Word and Media player, rewinding the video several times, and being nowhere close to being done, I found myself talking to the computer and saying, “Hold your horses!” Needless to say I needed a break, but I kept thinking, if only I could type faster, or even better, if the people in the video could speak slower, this task wouldn’t be as frustrating.
Within my first week on the job I was introduced to my new dragon friend: Dragon NaturallySpeaking. Doing justice to its epic name, this speech recognition software has made my transcribing experience a lot more interesting from the moment I started using it. Remember how in the movie Eragon the dragon Saphira could read Eragon’s thoughts? That’s more or less how this program works. Obviously Dragon doesn’t type what you’re thinking, but after a short training session, the program learns to understand your voice and you’re good to go! So instead of typing what people say in a video, you speak the words and Dragon types them for you. Pretty neat, huh?
Despite being excited about using a new program, I still had my doubts. Could this voice recognition software really be faster than my 70 wpm typing? I mean, that’s a decent typing speed, right? Besides, when I type I can fix my mistakes immediately, whereas with Dragon it’s easier to keep dictating and then go back and fix my mistakes later. I was skeptical about this dictation program’s effectiveness compared to my own. If there’s one thing I remember from my biology classes in college, it’s that I have to test my theories to obtain an answer, so I decided to put an end to my doubts and find out if all my years of typing would help me compete with this dragon.
Two minutes of typing. I don’t remember ever surrendering so fast, not even when I ran 3 miles in 100°F weather in cross-country! I had to face it: I can speak significantly faster than I can type. Dragon NaturallySpeaking is a great tool that tremendously improves your typing speed, especially if you’re typing a long paper; after all, Dragon doesn’t get tired of typing, and I hardly think anyone would ever get tired of talking. I mainly use the program to dictate what other people say, but I can’t help but wonder how much it would’ve helped me in college when I had to write those 10-page papers. This program types words that I don’t even know how to spell, automatically capitalizes words after periods, and, best of all, I can keep training it to understand me better every time I notice a mistake; that’s a keeper, if you ask me.
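The math backs up the surrender. Assuming an average conversational speaking rate of roughly 150 words per minute (a commonly cited figure, not a measurement from this post) against a 70 wpm typing speed, a quick back-of-the-envelope comparison shows why dictation wins:

```python
# Back-of-the-envelope comparison of typing vs. dictating a paper.
# The 150 wpm speaking rate is a commonly cited average; the paper
# length (10 pages at ~300 words/page) is a hypothetical example.

typing_wpm = 70
speaking_wpm = 150
words = 10 * 300  # a hypothetical 10-page paper

typing_minutes = words / typing_wpm
dictating_minutes = words / speaking_wpm

print(f"Typing: {typing_minutes:.0f} min, dictating: {dictating_minutes:.0f} min")
```

Under these assumptions, dictation finishes the same paper in less than half the time, before even counting typing fatigue.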
On the 20th Anniversary of the Americans with Disabilities Act, 348 members of the U.S. House of Representatives voted for HR 3101, the 21st Century Communications and Video Accessibility Act. This act will require closed captioning on all Internet video information and will help millions of people have access to these videos. This act will also provide up to $10 million annually for specialized communications equipment for low-income individuals who are deaf/blind. It also requires that Internet telephones be hearing aid compatible.
California Representative Henry A. Waxman states: “Today, as we mark the 20th anniversary of the landmark Americans with Disabilities Act, the House is giving Americans with disabilities access to smart phones, other communications technology, and video programming. This bill ensures that millions of Americans with disabilities can participate in our 21st century digital society.”
Senator John Kerry said, “Technology and the Internet have broken down barriers, and no one should be or has to be excluded from modern communications and the new economy because of a disability. It’s been 20 years since the Americans With Disabilities Act knocked down barriers to employment and government services — and now it’s time to do the same thing [with regard to] blocking people with disabilities from getting online.”
So what happens next? The bill gets sent to the Senate for vote and if it passes, it will be signed into law!
YouTube and Google now index video content based on text in closed captions and subtitles, enhancing the discoverability and SEO ranking of videos that include them. This capability means that closed captions help your video appear in search results for keywords contained within the captions, potentially driving more views and engagements.
Mark Robertson authored an article on his website titled "In-Depth Look at YouTube Closed Captions, SEO, and YouTube Indexing," in which he explains that closed captions not only boost accessibility but also expand your video's global reach through multi-language subtitles. This broadens the appeal of your content to a global audience, including those with hearing impairments and non-native speakers of the video's language.
Beyond viewer accessibility, the strategic use of closed captions contributes significantly to SEO. Captions allow Google’s algorithms to better understand and index the video’s content, which can enhance the video’s visibility in search results. This approach to SEO is not just about improving accessibility but also about maximizing content discoverability and engagement on digital platforms. Properly implemented, closed captions can increase viewer retention, reduce bounce rates, and expand the global reach of videos by making them accessible to non-native speakers and those in sound-sensitive environments.
Furthermore, closed captions can significantly influence user engagement metrics, such as watch time and interaction rates, which are crucial for boosting SEO rankings on YouTube. By providing a text version of the video's audio, you cater to a wider range of user needs and preferences, which improves the overall user experience and could lead to higher rankings in search results and recommendations.
To implement closed captions effectively, it’s important to use recognized subtitle file formats like SRT or VTT, which are readable and indexable by search engines. Also, consider updating your captions regularly to include new keywords and to enhance SEO impact. Engaging professional transcription services can ensure the accuracy, readability, and SEO optimization of your subtitles.
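For reference, SRT is a plain-text format: each cue is a sequence number, a timing line, and the caption text. A minimal sketch of writing one cue to a file (the cue text and timings here are invented for illustration):

```python
# Minimal sketch of writing a single SRT cue to disk.
# The cue text and timestamps are invented for illustration;
# real captions would of course match the video's audio.

srt_cue = """1
00:00:01,000 --> 00:00:04,000
Welcome to our channel!
"""

with open("captions.srt", "w", encoding="utf-8") as f:
    f.write(srt_cue)

# The second line of the file is the cue's timing line.
print(open("captions.srt", encoding="utf-8").read().splitlines()[1])
```

Note that SRT uses a comma as the millisecond separator, while WebVTT (the `.vtt` format) uses a period and adds a `WEBVTT` header line; both are plain text, which is exactly what makes them readable by search engines.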
In summary, adding closed captions to your YouTube videos is a straightforward yet powerful tool to enhance video accessibility, viewer engagement, and SEO performance. For more details on captioning your web or YouTube video, please visit our page on YouTube-Ready Captions.
Aberdeen Captioning has been qualified as a “YouTube Ready” vendor by DCMP. As a DCMP “Approved Captioning Service Vendor,” Aberdeen is committed to providing quality captioning in multimedia formats, now including your YouTube videos. This allows your YouTube video to be captioned according to DCMP guidelines and with a customer satisfaction guarantee. Aberdeen offers different choices for your YouTube video and will work closely to establish a package that is right for you.
Need multi-language translation for your YouTube video? No problem! Aberdeen will provide a professional and experienced translator to ensure that your message is understood globally. All you need is a YouTube account.
To learn more about DCMP's "YouTube Ready" qualification visit: http://youtubeready.dcmp.org/
Also, watch Aberdeen’s YouTube video for more information on our captioning and subtitling services for your YouTube video at: https://aberdeen.io/abercap/youtube-ready-captioning/
The statistics are astounding: 42 million American adults are illiterate and 50 million are unable to read higher than a 4th-grade reading level. Studies show that frequent reading improves reading proficiency, although sadly, according to the “Washington Post,” only one in four Americans actually read a book in the past year. Instead of reading books, the average child in America watches 20 hours of TV each week. How do we remedy this ongoing problem? What if watching television could actually improve literacy?
According to studies conducted in Finland, it can! Children in Finland watch as much TV as Americans and even attend school less. Yet, they rank higher in educational achievement and literacy. Jim Trelease states in “The Read-Aloud Handbook” that Finland’s high test scores are due to their use of closed captions. Children in Finland want to watch American sitcoms but are only able to understand them by watching the Finnish words at the bottom of the screen. Therefore, they must read to watch!
Victoria Winterhalter Brame conducted her own study after reading Trelease’s book. In her article “TV that Teaches: Using Closed Captions,” she writes about introducing her daughter to closed captioning at a young age; once her daughter started kindergarten, she began to recognize her vocabulary words on the screen while watching television.
Perhaps there is a simple solution to this nationwide problem of illiteracy and reading deficiency: turn on the closed captions!
From the time of the Great Commission until now, Christians have been evangelizing the world through various traditional and modern methods: missionary work, preaching, tracts, music, films, television, crusades, books, street-corner preaching, door knocking, church planting, and now, through the Internet.
While traditional evangelism definitely has its place, there is no better way than the Internet to reach millions of people across the world with the least amount of effort. Many ministries are using Internet-based evangelism by setting up virtual church campuses where members in remote areas without access to a physical church can attend. These virtual churches have opened up the opportunity for millions of people who otherwise may not have been able to hear the Gospel or attend a good local Christian church.
ADDRESSING THE LANGUAGE GAP
When a cyber-church hopes to open its “virtual doors” to an international community, it should first consider how it will communicate with a non-English-speaking audience.
Providing multi-language subtitles is the most efficient and cost-effective way to localize your webcasts in multiple languages. Subtitles can be combined with just about any player: Flash, QuickTime, Windows Media Player, iTunes, iPod, iPhone, YouTube, RealPlayer, etc. If you want your message to be understood by a multi-lingual audience, there is no way around localizing your programming for various languages. This article summarizes the main ways to tackle subtitling.
COST AND QUALITY CHOICES
The Automated Translator
The cheapest option to subtitle your English video is automatic translation. Google offers this as a free service for YouTube videos. The main problem, however, is inaccuracy.
Here is an actual example of one such English to Spanish translation:
The original English subtitle: “The history of the Flood is precise. The history from Abraham on is precise. Everything else is precise. There's precision in the Law and the history books. There's precision in the Psalms and the books of literature that we call poetry. And there is precision in the prophets.”
The automated translation: “La historia de la inundación es preciso. La historia de Abraham en es preciso. Todo lo demás es preciso. No hay precisión en la Ley y en los libros de historia. No hay precisión en los Salmos y los libros de la literatura que llamamos poesía. Y no hay precisión en los profetas.”
But there are two serious problems with this translation. First, it doesn’t use the proper term for the Flood. It is like calling the Flood “the inundation” in English. Second, it says that the Bible ISN’T precise in many instances, which is exactly the opposite of what the speaker intended.
The main point is that automated translation will often distort, add to, or subtract from the Word of God, while a good human translator relies on Spanish translations of the Bible that have been diligently compared with the original manuscripts, and therefore does not need to retranslate Scripture. Finally, good Christian translators rely on the Lord to give them the proper words, something a computer could never do.
The Volunteer Translator
If you have volunteer translators in your church, this can be an excellent way to go. The translation will be free, but you will most likely have to team up with a subtitling company to create the needed subtitle file. Nevertheless, your cost will be significantly lower.
One of the main advantages to using volunteers is that the translator will most likely be familiar with the speaker’s style and message as well as have a heart for what they are translating.
Nevertheless, there are two points to be aware of when dealing with volunteers. First, just because they “know” another language doesn’t mean they will be able to properly translate into that language in a Christian context. Be sure they are native speakers of the target language and that they have attended a Christian church or listened to Christian teaching in their native language. Also, as with all volunteers, you must be sure they can meet your production deadline week after week. Be sure to have a back-up plan.
The Non-Christian/Amateur Translator
If you search for the cheapest subtitling package, you may end up with a “non-Christian” or “amateur” translation. With a non-Christian or inexperienced translator you may find a cheaper rate, but you will run into problems similar to those of automatic translation. The terminology used is often of a secular nature or, worse yet, that of another religion when referring to Christian matters. For example, in Japan, where less than one percent of the population identifies as Christian, it is very difficult to find a Japanese speaker able to properly translate the word “atonement,” as this concept does not exist in Japan’s main religions, Buddhism and Shinto.
Another example that had me chuckling for hours was in an interpreted church service from English to Spanish when the interpreter referred to the Holy Ghost as the “Fantasma Sagrado,” instead of the correct Spanish term, “Espíritu Santo.” For a Spanish-speaker this is like calling the Holy Ghost something similar to the Sacred Phantom. It doesn’t quite work.
The Experienced Christian Translator
Of course, this is the best option, but not always the most affordable. The experienced Christian translator is a Christian with a heart for the message and also with the training and tools to localize your message properly to the target audience. When searching for a full-package subtitling service, be sure to ask the company about the translators they use. Do they have experience translating Christian material? How many years of experience do they have? What other Christian material have they translated? Ask to see their resumes. In addition to a good Christian translator, if you are willing to pay top dollar, also be sure there is an additional proof of the translation before the subtitles go live, so that any errors are caught. If you have people available in your ministry to do a proof of the final translations, this option can bring your cost down significantly.
THE BOTTOM LINE
All in all, when translating your message for multi-language subtitles, be sure to allow feedback on the translation from the viewers. This can be a simple box below the video where the viewer can input their feedback. You never know, you may even get viewers across the world willing to translate your message for free. Are you getting the number of viewers you desired in each country? If not, it may be that the subtitles are so poor that viewers give up.
Although there are many service options out there at many different costs, the important thing is that you know exactly what you are getting and you evaluate what will work best for your ministry. From years of experience, the old saying still rings true: You get what you pay for.
True EIA-708 captions can be achieved without the use of an external HD captioning encoder, which can cost well over $8,000 (US), by using a new captioning workflow.
What You Need
You need Final Cut Pro 7, an AJA Kona card (3, LHi or LSe), and the most recent Kona drivers. Of course, you will also need a special caption file created by a closed captioning service like Aberdeen Captioning (ONLY created using CPC/Mac Caption software).
What You Get
If you have the above-mentioned hardware and software, you can create the HD master caption tape in various HD tape formats, create HD captions for optical disc delivery, and create captions for web video, all in-house, without having to ship anything back and forth with a caption company.
How it Works
The AJA Kona cards ($1,490 US) function as an HD encoder and place the captions in the correct location in the HD video data. Here's how:
The closed captioning file you receive is imported into Final Cut Pro 7 and doesn't require any rendering. Final Cut Pro 7 has been designed to accept this caption file, and in conjunction with the AJA card, it places the caption data in the correct location in the video being laid to tape.
AJA, Final Cut Pro, and CPC have all worked together to create this new technology.
Note: This workflow can also work for SD video if needed.