Ask most people outside the industry how television stations receive programming from producers and they will assume everything happens digitally, as files. It is 2012, after all; surely that is how things are done. Not exactly. Moving completely file-based has proved challenging for most TV stations, and receiving tape remains commonplace. This article explains, step by step, how Aberdeen Broadcast Services has made going 100% file-based not only possible but easier than either producers or TV stations would expect. You will learn what is currently happening at stations, the solution Aberdeen built, and the steps in the process, with insider detail on the software and infrastructure used to make complete digital file delivery to TV stations a success.

Discovering a Niche

Necessity is the mother of invention. The phrase holds true for Aberdeen Broadcast Services, a company based in Southern California, USA. Originally incorporated in early 2001 as a closed captioning company, Aberdeen built its business around providing closed captioning services to producers and broadcasters who had to comply with new FCC regulations mandating closed captions on (almost) all US television broadcasts. During the first ten years in business, the program transfer medium of choice was tape, mostly analog Beta SP. Year after year, thousands of tapes went in and out of the Rancho Santa Margarita, California offices via FedEx. As the business grew, owners Matthew Cook and Becky Isaacs knew there had to be a more efficient, cost-effective way to move these programs around the nation and the world. With broadband Internet saturation reaching record levels every year, the Internet looked increasingly like the avenue for such a task. However, initial experiences with popular file transfer services and other FTP-based options proved unreliable, and in the television business, unreliable was not going to be acceptable.

Working Backward 

The initial hurdle was moving large, high-quality broadcast master programs to Aberdeen’s offices for captioning and then getting those masters to the destination television stations for broadcast.

These large master files could be upwards of 30GB for a half-hour High Definition (HD) program. At the time, the most widely used method of transfer over the public Internet was FTP (File Transfer Protocol), a standard network protocol for moving files from one host to another over a TCP-based network. After robust testing, TCP proved far from ideal for sustaining a large data stream over long distances. The biggest drawback of TCP-based file transfers is latency. The time it takes for a packet to travel from sender to receiver (A to B) and for the acknowledgement to travel back (B to A) is called the Round-trip Time (RTT), or Round-trip Delay Time. TCP requires this acknowledgement before the sender will release more data, so the higher the latency, the slower the transfer. The farther apart the two endpoints are, the longer each round trip takes, which can slow even the fastest high-speed connection to a crawl. And the longer a transfer has to stay connected, the more vulnerable it is to timeouts and service disconnects.
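To see why round-trip time matters so much, consider the classic ceiling on single-stream TCP throughput: at most one receive window of data can be in flight per round trip. The sketch below is back-of-the-envelope math, not a model of any particular transfer product, and the window size and RTT values are illustrative assumptions.

```python
def tcp_throughput_ceiling_mbps(window_bytes: float, rtt_ms: float) -> float:
    """Rough upper bound on single-stream TCP throughput:
    at most one window of data can be in flight per round trip."""
    return (window_bytes * 8) / (rtt_ms / 1000) / 1_000_000

# A common 64 KB receive window over increasingly long paths:
for rtt in (10, 50, 100, 250):  # milliseconds
    print(f"RTT {rtt:>3} ms -> ~{tcp_throughput_ceiling_mbps(64 * 1024, rtt):6.1f} Mbps")
# Even on a gigabit link, a 250 ms round trip caps a single
# unscaled-window TCP stream at roughly 2 Mbps.
```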

After looking into the broadcast transfer solutions available at the time, Aberdeen chose software built on the User Datagram Protocol (UDP) because of its advantages over the more commonplace FTP. UDP is an alternate method of transfer that does not wait for per-packet confirmation; it delivers as many packets as fast as the connection will allow. At the end of the transfer, an MD5 checksum scans the file on the receiving end to confirm it is complete, and any packets lost along the way are resent to finish the file. This approach allowed transfers to maintain much greater data rates even between well-connected locations, but offered the largest improvement for long-distance transfers and slow connections. The software also offered other practical features that assist with the logistics of file transfers.
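The verify-and-resend idea is easy to illustrate. The snippet below sketches only the checksum step, where the sender publishes an MD5 of the source file and the receiver recomputes it over the reassembled file; the actual UDP transport and retransmission logic in products like FileCatalyst are proprietary.

```python
import hashlib

def file_md5(path: str, chunk_size: int = 1 << 20) -> str:
    """Compute an MD5 digest without loading the whole file into memory."""
    digest = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def transfer_is_complete(received_path: str, senders_md5: str) -> bool:
    """True when the reassembled file matches the sender's checksum;
    otherwise the missing pieces have to be requested again."""
    return file_md5(received_path) == senders_md5.lower()
```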

Computer & Network Infrastructure

The decision to house the networking and computer systems off-site was made to add redundancy and protection from power outages and other service disruptions. An SSAE 16, SOC 1 Type II data center in Dallas, Texas was chosen for its central location in the United States and its 24/7 support.

Building out the appropriate transcoding farm was the next challenge. The initial architecture was built on a Dell blade chassis with a Gigabit backplane connecting a 60TB Dell MD3000 and MD1000 SAN in a RAID 5 configuration. As the digital file delivery business started to grow, it became immediately apparent that the networking limitations of a chassis system connected to the SAN via a Windows (NTFS) share would not support the high resource demands of a multi-blade transcoding farm. Bottlenecks originally limited the six-blade system to a shared 1Gbps iSCSI connection, which was not enough throughput considering the mezzanine format for the system was uncompressed AVI with data rates up to 600Mbps per file. The multiple transcoding blades were quickly slowed by these large files traversing the network.

The decision was made to find a network consultant seasoned in video transcoding to redesign the system for greater throughput and scalability. Jeff Brue at Open Drives Inc. came highly recommended by a number of West Coast post houses and major motion picture companies. The Open Drives redesign of the transcoding farm included a new custom 180TB Solaris-based shared storage system with full Active Directory support and Windows-compatible ACLs. New SuperMicro 16-core 2.4 GHz servers with 32GB of RAM and GPU video-processing cards were implemented to maximize the transcoding horsepower of each blade, and the network switches were upgraded to support a central 10 Gigabit Ethernet infrastructure. By efficiently utilizing memory caching and 40 Gigabits of bonded 10Gb/s connectivity, a balanced mix of processing power and storage speed has been achieved. Aberdeen operators currently remote into the ten servers at the Dallas data center using Remote Desktop and web GUIs, managing, receiving, transcoding, and delivering hundreds of full-length TV programs and movies a week from off-site locations around the globe.

Transcoding to Match

Television stations' method of broadcasting can easily be compared to an Apple iTunes playlist. The machine and interface that handles this playout is appropriately called the Play Server. Each track in the "playlist" is a promo, commercial, or program, and as each clip ends the next file is played and passed through the airchain to the end viewer. Although these Play Servers can handle a multitude of different codecs, wrappers, and metadata tracks, stations usually have a "house codec" that all files are encoded and ingested to for consistency and uniformity. To minimize the number of times a client's program is encoded, Aberdeen transcodes the asset only once to the house codec for each station deliverable, allowing it to enter the station's workflow seamlessly. Important encoding variables for an individual station's house codec include codec, wrapper, bit rate, GOP structure, aspect ratio, resolution, color sampling, field dominance, frame rate, audio format, sample rate, bit depth, and caption data.
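In practice, a station's house codec amounts to a fixed set of target parameters that every transcode must hit. The profile below is purely hypothetical; the station name and every value are illustrative assumptions rather than any real station's spec, but it shows the kind of record that has to exist for each deliverable.

```python
from dataclasses import dataclass

@dataclass
class HouseCodecProfile:
    """One station's target encoding parameters; values here are made up."""
    station: str
    codec: str
    wrapper: str
    video_bitrate_mbps: int
    gop_structure: str
    aspect_ratio: str
    resolution: str
    chroma_sampling: str
    field_order: str
    frame_rate: str
    audio_format: str
    audio_sample_rate_hz: int
    audio_bit_depth: int
    caption_standard: str

EXAMPLE_STATION = HouseCodecProfile(
    station="WXYZ (hypothetical)",
    codec="MPEG-2",
    wrapper="MXF OP1a",
    video_bitrate_mbps=50,
    gop_structure="Long GOP (15)",
    aspect_ratio="16:9",
    resolution="1920x1080i",
    chroma_sampling="4:2:2",
    field_order="Top field first",
    frame_rate="29.97",
    audio_format="PCM",
    audio_sample_rate_hz=48000,
    audio_bit_depth=24,
    caption_standard="CEA-708 carried in a SMPTE 436M track",
)
```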

Brent Chance, the Broadcast Engineer at NBC Encompass had this to say about file delivery systems: “It is vital for digital delivery services to have the resources and means to send content in a format that is ideal for the station’s workflow, their systems demand it.”

Because Aberdeen is a captioning company, the success of its digital file transcoding hinged on getting caption data correctly into the house codecs of its clients' stations. This proved to be no easy task, as the caption information can live in various positions inside the different digital files. Among the specifications that govern where caption data is written for different codec/wrapper combinations are SMPTE 436M, ATSC, DTV, SCTE 20, SMPTE RDD-11, SMPTE 328, SMPTE 314, SMPTE 374, SMPTE 360, CEA-608, and CEA-708. Believe it or not, getting correct caption insertion into the deliverable file is still an ongoing effort that involves working with the software developers who encode the files and the hardware developers who make the station Play Servers.

Automation was found to be the most advantageous way to move files through the various stages of the transcoding process. Using the latest professional software encoding engines, Aberdeen's engineers use the built-in watch-folder functionality of these programs to automatically push assets through station-specific workflows for encoding and quality control. Developing a folder structure that made logical sense to an operator, and that contained the correct layers necessary to drive the watch-folder automation, was vital.
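Conceptually, a watch folder is just a directory that is polled for new arrivals and drained into a workflow. The professional encoders provide this behavior natively; the simplified polling loop below is only a sketch of the idea, and the folder paths and the settle-time check are assumptions.

```python
import os
import shutil
import time

WATCH_DIR = "/watchfolders/WXYZ/incoming"   # hypothetical station watch folder
WORK_DIR = "/watchfolders/WXYZ/processing"  # where the encoder picks files up

def file_is_settled(path: str, wait_seconds: int = 10) -> bool:
    """Treat a file as fully uploaded once its size stops changing."""
    size_before = os.path.getsize(path)
    time.sleep(wait_seconds)
    return os.path.getsize(path) == size_before

def poll_watch_folder() -> None:
    """Move every settled file into the processing folder for this station."""
    for name in os.listdir(WATCH_DIR):
        src = os.path.join(WATCH_DIR, name)
        if os.path.isfile(src) and file_is_settled(src):
            shutil.move(src, os.path.join(WORK_DIR, name))

while True:
    poll_watch_folder()
    time.sleep(60)  # check for new arrivals once a minute
```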

In addition to expensive broadcast encoding software, inexpensive and freeware programs have found a place in the daily transcoding workflows. At Aberdeen, programs such as HandBrake, MediaInfo, VLC, QuickTime, Windows Media Player, and JES Deinterlacer are used daily to play, analyze, and transcode different file formats. Most of these free programs rely on the open source, cross-platform FFmpeg project as the engine to record, play, encode, and stream content.
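A quick taste of what these free tools provide is FFmpeg's companion inspector, ffprobe, which reports a file's codec, wrapper, resolution, and frame rate as JSON. The sketch below assumes ffprobe is installed and on the PATH, and the file name is only an example.

```python
import json
import subprocess

def inspect_media(path: str) -> dict:
    """Return container and stream details for a media file using ffprobe."""
    result = subprocess.run(
        ["ffprobe", "-v", "error", "-print_format", "json",
         "-show_format", "-show_streams", path],
        capture_output=True, text=True, check=True,
    )
    return json.loads(result.stdout)

info = inspect_media("example_master.mov")  # hypothetical file name
for stream in info["streams"]:
    if stream["codec_type"] == "video":
        print(stream["codec_name"], stream["width"], stream["height"],
              stream.get("r_frame_rate"))
```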

Client’s Master Files

Planning for the diverse cross-section of Internet connection speeds at client locations was critical to setting proper expectations and standards for client file uploads. Most high-speed Internet, cable modem, and DSL connections are designed to allow much faster download speeds than upload speeds, and producers must use those slower upload speeds to submit files. This, compounded with the large size of broadcast-quality programs, meant a number of variables needed to be considered. Aberdeen customized a popular data transfer chart to illustrate approximate transfer times relative to file size. Different compression formats were then recommended based on the connection speed and file size calculation from the chart, in an attempt to minimize transfer times without compromising the quality of the file content.
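The arithmetic behind such a chart is straightforward: file size in bits divided by upload speed, padded for protocol overhead. The sketch below reproduces that math; the 15% overhead figure is a rough assumption, not a value taken from Aberdeen's chart.

```python
def estimated_upload_hours(file_size_gb: float, upload_mbps: float,
                           overhead: float = 0.15) -> float:
    """Approximate upload time: size in bits over usable upload bandwidth."""
    bits = file_size_gb * 8 * 1_000_000_000
    usable_bps = upload_mbps * 1_000_000 * (1 - overhead)
    return bits / usable_bps / 3600

# A 30GB half-hour HD master over typical asymmetric connections:
for up in (1.5, 5, 10, 50):  # Mbps of upload speed
    print(f"{up:>4} Mbps up -> ~{estimated_upload_hours(30, up):5.1f} hours")
# At 1.5 Mbps the upload takes more than two days, which is why a
# well-chosen compressed delivery format matters so much.
```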

After researching codec white papers and conducting critical visual analysis of all of the commonly used compression codecs, Aberdeen approved six popular, accessible delivery formats to be recommended based on upload speed, NLE system, and source content. Among the codecs recommended were Apple ProRes (LT), Sony XDCAM, DVCPRO (HD, 50, 25), DV, H.264, and MPEG-2.

Receiving programs digitally forced Aberdeen to require clients to refine not only their file specifications but their content specifications as well. Television broadcast standards put forth by the ATSC, DVB, ISDB, and DTMB require terrestrial, cable, satellite, and handheld content to adhere to strict signal specifications; the most notable cover chroma, luminance, gamma, audio loudness, and peak level. Since the station deliverable files Aberdeen creates go directly to the station's Play Server, rather than through the traditional tape ingest process where signals are conformed by a processing amplifier, a method was needed to ensure each program was delivered with legal values for broadcast. Among the many viable software options, VidChecker was chosen for its ability not only to check for values outside legal specifications but, in most cases, to correct them as well. It is this combination of client education and software check-and-correction that allows Aberdeen to process and deliver hundreds of files a week with 99.999% station acceptance.
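Audio loudness is a good example of a legality check a producer can run before uploading. The snippet below uses FFmpeg's loudnorm filter purely as a free stand-in for illustration, not the VidChecker workflow Aberdeen actually relies on; it measures integrated loudness against the ATSC A/85 target of -24 LKFS, and the input file name is hypothetical.

```python
import subprocess

def measure_loudness(path: str) -> str:
    """Run FFmpeg's loudnorm filter with a null output so it only measures.
    The JSON summary (input_i, input_tp, input_lra) is printed to stderr."""
    result = subprocess.run(
        ["ffmpeg", "-hide_banner", "-i", path,
         "-af", "loudnorm=I=-24:TP=-2:LRA=7:print_format=json",
         "-f", "null", "-"],
        capture_output=True, text=True,
    )
    return result.stderr  # the measured values appear at the end of stderr

print(measure_loudness("example_program.mxf"))  # hypothetical file name
```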

TV Station Accepting Files

TV stations are well aware of programmers' need and desire to deliver files digitally. However, this can be much easier for the producer than for the station. Tapes have been the medium of choice for decades, and station engineers and technicians know how to handle them. To many engineers, files are new and complicated, especially if a station accepts a number of different formats. There are hundreds of ways files can be encoded incorrectly, and TV stations need to work out new systems and train technicians and engineers in new ways of processing incoming programming. Many stations are trying to tackle this process on their own and are realizing the complexities involved and the need for additional personnel.

Aberdeen helps TV stations simplify the file delivery process. By creating a specific file that matches each TV station's Play Server, Aberdeen limits the number of tasks the operator or technician has to perform, because the file arrives in the Play Server's native format. The ability to create files in station-specific formats has been well received by all stations because it reduces the time it takes to process Aberdeen files for air, ultimately freeing up manpower and streamlining internal processes. An additional benefit is that the UDP delivery system is completely automated. When a file is placed in the appropriate station watch folder, a notification e-mail is automatically sent to the appropriate station employees stating that a new file is ready and will be queued for download at the top of the next hour. Files are usually configured to download to a machine in Master Control or to a shared network attached storage (NAS) system for easy transfer into the station's automation system or Play Server. At the station's request, the leader and tail elements of a file (bars, tone, slate, and black) can be trimmed off the program to make the file prep process even more seamless.

Files are delivered to stations with a specific naming convention that allows broadcasters to tell exactly what the program, episode, and airdate are without having to embed or attach additional metadata. The primary naming convention currently being used is ProgramIdentifier_EpisodeNumber_Airdate. By providing an optimal file for the station’s workflow, the process of receiving outside content is made more efficient.
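Because the convention is fixed, the three fields can be recovered from the file name alone. The parsing sketch below is illustrative rather than Aberdeen's actual code; splitting from the right keeps any underscores inside the program identifier intact, and since airdate digit formats vary by client, the field is left as plain text.

```python
from typing import NamedTuple

class DeliveryName(NamedTuple):
    program_identifier: str
    episode_number: str
    airdate: str  # digit format varies by client, so keep it as text

def parse_delivery_name(filename: str) -> DeliveryName:
    """Split ProgramIdentifier_EpisodeNumber_Airdate, rightmost fields first."""
    stem = filename.rsplit(".", 1)[0]          # drop any extension
    program, episode, airdate = stem.rsplit("_", 2)
    return DeliveryName(program, episode, airdate)

print(parse_delivery_name("JPM_KDTN_W308_092012"))
# DeliveryName(program_identifier='JPM_KDTN', episode_number='W308', airdate='092012')
```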

Some stations that started with one or two programs have been pushing all their outside producers to use file-based delivery services. KDTN’s engineer Stephen Darsey had this to say: “I dare you to find the difference between a tape ingest and delivered file, you won’t and I don’t care how long you've stared at a scope and a monitor, you won’t see it!” David Tait, Broadcast Engineer from Zoomer Media says, “(The) technology used (UDP) beats FTP and is more robust and reliable. We also like the fact that the files are automatically delivered on a schedule.”

Processes and Procedures

Automate everything! This has become the internal operational mantra of the AberFast business unit. By taking as many manual processes as possible out of the workflows, files run through the system and are delivered and billed quickly, efficiently, and accurately. The manual processes that do remain are treated as opportunities for quality control and gate-keeping. Standardizing file naming conventions for all clients has allowed Aberdeen to employ filters that automatically sort files into the appropriate workflows based on the prefix characters of the file name. Requiring clients to submit correctly formatted file names eliminates manual tasks when files arrive and allows for faster, more accurate processing. For example, a file filter dialog lets operators enter inclusive or exclusive file name filters: a file named "JPM_KDTN_W308_092012" will go through a workflow built to include all file names matching the prefix JPM_KDTN*. This simple change in approach proved to be a very powerful tool for routing different program versions through specific workflows automatically.
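The same prefix matching can be expressed in a few lines. The sketch below uses Python's fnmatch wildcards as a stand-in for the encoder's filter dialog; the routing table entries are invented for illustration.

```python
from fnmatch import fnmatch

# Hypothetical routing table: file name pattern -> workflow name
WORKFLOW_FILTERS = {
    "JPM_KDTN*": "KDTN house-codec workflow",
    "JPM_WXYZ*": "WXYZ house-codec workflow",
}

def route_file(filename: str) -> str:
    """Return the first workflow whose pattern matches the file name."""
    for pattern, workflow in WORKFLOW_FILTERS.items():
        if fnmatch(filename, pattern):
            return workflow
    return "manual review"  # unmatched names get a human gatekeeper

print(route_file("JPM_KDTN_W308_092012"))  # -> KDTN house-codec workflow
```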

Proxies are used at multiple points in the automation chain. For example, the first task of the transcoding farm is to make a proxy in the same aspect ratio and frame rate as the asset file. This proxy is what Aberdeen's transcribers and caption editors use to create the caption files that are later embedded into the final deliverables. These small proxies, usually in WMV or MOV formats, are typically 1/10th to 1/50th of the original asset size, making them easy and quick to move within the network. Proxy files are also used in the QC process: from the final station-specific file, a proxy with open (burned-in) captions is created. This file is then manually reviewed to ensure that caption data, video and audio sync, and start/end times are all accurate, much like checking a tape. These proxy files are made available to TV stations and producers to provide confidence and proof of captioning for the final deliverables.
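Generating that kind of review proxy is something any FFmpeg build with libass support (for the subtitles filter) can do. The command below is a generic illustration rather than Aberdeen's production recipe; the file names, scaling, and quality settings are assumptions.

```python
import subprocess

def make_caption_proxy(master: str, captions_srt: str, proxy: str) -> None:
    """Create a small H.264 proxy with captions burned into the picture."""
    subprocess.run(
        ["ffmpeg", "-hide_banner", "-i", master,
         "-vf", f"scale=640:-2,subtitles={captions_srt}",  # shrink, then burn in
         "-c:v", "libx264", "-crf", "28",
         "-c:a", "aac", "-b:a", "96k",
         proxy],
        check=True,
    )

make_caption_proxy("final_KDTN_file.mxf", "captions.srt", "qc_proxy.mp4")
```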

Another layer to Aberdeen’s asset management workflow was standardizing the Windows folder structures of the shared storage (SAN). This was an important practice that aided in maintaining organization throughout the shared storage array. This consistency allows operators to easily understand where files need to be placed to start the automation and where finished files arrive after going through the job cycle.

In Summary

In this article you have read exactly how 100% digital file delivery has been achieved, though not without two years of late nights, countless meetings with software vendors, plenty of trial and error, and plain old hard work to arrive at a working solution. The process still requires extensive testing with each station, but once that initial testing is done, it is smooth sailing. Sounds like a dream come true! For the producers who use it, it is. For TV stations, it is more than a dream: it changes everything. Get moving in the right direction and go completely file-based.

About the Author

Matt Donovan is the Director of Digital Delivery for Aberdeen. Matt is a broadcast engineer and a graduate of Pepperdine University. Over his young career he has worked in many live and studio environments for the likes of ESPN, FOX, CBS, and Current TV. Matt is passionate about creating beautiful images, and he uses his broadcast technology experience to build and maintain the asset management systems that enhance the quality of Aberdeen's clients' programming.

Does it seem like working in the television broadcast industry is a bit like learning a second language? It certainly can feel that way for many content producers and video engineers dealing specifically with closed captioning and file compression. Exposure to such foreign words and concepts as metadata usually occurs on the floors of industry trade shows, like NAB, and in many a master control room across the nation and globe. Hmmm... "metadata"... that sounds an awful lot like Greek, doesn't it? Actually, "meta" is a Greek prefix meaning "after" or "adjacent," among other things. It makes sense, then, why the word was adopted by the TV industry to describe the area of a digital file containing data about the audio and video that is not the actual audio or video information itself. Rather, metadata is simply complementary information pertaining to the audio and video tracks of the file. This additional data comes "after" the A/V tracks, or sits "adjacent" to them, in a sense.

For instance, since captioning can be "closed" (not "open," or always viewable), captioning is technically not part of the audio or video but is additional metadata that complements the rest of the file. With traditional analog video, this extra information would be encoded onto Line 21 of the VBI (vertical blanking interval) as part of the EIA-608 captioning standard. Hard to imagine, but the digital caption encoding protocol (EIA-708) is even more complex. It is not enough, then, to simply put captions into the somewhat equivalent Line 9 or VANC (vertical ancillary) portion of a digital file, because the location of captions within the VANC area depends on what file format the producer is required to send to each individual television station or network.

To complicate matters even further, not all TV stations use the same file formats and on-air play servers to air video programming, unlike the analog days, when BetaCam and DigiBeta were the commonly accepted tape formats. Part of the trouble with creating files for different station play servers is that each one uses a different proprietary file format. Each of these formats, such as MXF, LXF, and GXF, among many others, looks for captioning data in unique, format-specific locations within the metadata area of the file. So, unless you are transcoding files with this in mind, closed captioning stands a good chance of getting lost in translation.

More on file ingest and quality issues in future entries.  In the meantime, please see this link for more on 608 vs. 708 captioning standards.

 

This blog article was written by Steve Holmes, Sales Engineer for Aberdeen Broadcast Services.

TV Stations and Change – The Reality

My job as Technical Support Engineer in the AberFast division of Aberdeen Broadcast Services is to secure station set-ups for digital media delivery: more specifically, to deliver half-hour programs complete with the FCC-required closed captions. One would think that a procedure that reduces the work of TV station personnel and technical resources would be a welcome technological advancement. For the most part, it is, but often not without some prodding and cajoling.

Yet in the past year since starting at AberFast, there have been many instances where I have had to plead my case with station engineers about the benefits of AberFast digital file delivery. The reasoning is quite varied: fear of virus intrusions, Internet bandwidth congestion, the conviction that tape works just fine, prohibited access into station networks, all the way to "We just don't accept digital media files, period!" At AberFast we work diligently to partner with stations to make their jobs easier, not harder, and yet we face opposition at every turn.

Post-Production – The Beast of Change

In order to understand the perspective of my TV engineering counterparts, I did a bit of soul searching from my years as a post-production editor who migrated from linear tape-based editing to the nonlinear digital world.

In video editing, the goal is to work efficiently by minimizing repetitive keystrokes and utilizing automation (macros) as much as possible. Once you get those button clicks in sync, you have rhythm, speed and an incredibly fast creative workflow. Once efficiency is achieved, you’re in a groove…that is until the technology changes or systems and software are no longer supported. Now you have to learn anew, back at square one.

The Fear of Change – How I Felt

I've realized it's not so much the fear of technology but the fear of change. It's the fear of struggling with new workflows and procedures in a hectic work environment, especially when the status quo works well. I remember being asked to deliver 30-second TV ads via FTP to our master control facility across the state. I had no idea what FTP was. I had no idea about video codecs. I didn't know a "dot m-o-v" from a "dot a-v-i" or a "dot w-m-v." I knew I could lay 10 spots onto a Beta tape in 5 minutes, blindfolded, and hand it to a receptionist to mail. That procedure worked fine for many years. Why did I have to change my routine to accommodate this FTP thingy? Compressing video files was time-consuming (CPUs weren't very fast nine years ago). Furthermore, it tied up my Avid so I couldn't edit while I was exporting, and FTPing drained my computer's resources, which made editing problematic. I had no engineering support to teach me about video codecs and compression schemes, and after a department downsizing, my workload was overwhelming, to say the least.

Embracing Change Works – My Story

So, as the old saying goes, "When in Rome…". I started to learn about codecs via Google searches. I would run side-by-side codec comparison tests, weighing file size against quality: the smallest file that still provided the best video quality would be the file I could export and FTP the fastest. I learned to make the most of this new workflow. While my Avid was tied up exporting, I checked e-mails, wrote scripts, and contacted clients. I learned about watch folders and created them to automate file conversion procedures. Because of this change, I ended up broadening my knowledge of video files, speeding up my ancillary duties outside of editing, and keeping up with my workload more easily than before. I was back in the groove, and a few years later I was in a new job at Aberdeen.

Cloud-based Technology or Else!

There’s another saying that goes, “The only constant in life is change” (and death and taxes, but that’s for another blog). Cloud-based file delivery is here to stay and at AberFast we are at the forefront of this rapidly changing technology. We want to partner with clients and stations—not to create more change—but to make change easier. We want to decrease distribution costs while increasing video programming quality. We want to connect program producers with TV stations so that content moves effortlessly. Just … embrace … the change. Otherwise, you’ll end up watching the clouds roll on by.

This blog was written by Vincent D’Amore, Technical Support Engineer for Aberdeen Broadcast Services. 

Unlimi-Tech Software is the creator of FileCatalyst, the world's leading file transfer solution. Aberdeen selected FileCatalyst to meet all of its "tapeless" distribution needs; simply having an FTP site wasn't enough to handle all of Aberdeen's various tapeless transactions. FileCatalyst allows Aberdeen and its clients to easily handle 50GB files with speed and reliability. Users can log into the web-based portal and upload, download, or send videos, caption files, transcripts, and more. HotFolders are used by clients who want automated file transfers, and notifications are sent upon completion of each upload. "FileCatalyst provided exactly the products and tools we needed to build AberFast, our digital video file delivery service," says Aberdeen President Matthew Cook.

To learn more about FileCatalyst, visit filecatalyst.com.

The era of big, bulky tapes getting damaged, glitching, or lost in the mail is over! Aberdeen Captioning is offering a new one-stop file distribution service. Clients simply upload their HD source file straight from their editing system, and Aberdeen adds the closed captions or subtitles. Aberdeen then takes the newly captioned video and can deliver it digitally to any station you need, all over the world, in any format the station requires. Worried about your file uploading correctly? Aberdeen uses a delivery system called 'HotFolder' that does continual bit-for-bit checking as the file uploads. If the connection happens to drop, the transfer picks up right where it left off as soon as it reconnects. You will receive an e-mail once the file has completely uploaded. The process is simple and effective. Check out Aberdeen's new video, which describes this process in detail:

http://www.abercap.com/digital_delivery.html
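For the technically curious, the pick-up-where-it-left-off behavior boils down to asking the receiving side how many bytes it already has and continuing from that offset. The few lines below illustrate only that idea; FileCatalyst's real HotFolder transport, bit-for-bit verification, and notification mechanics are its own, and the file name and offset here are invented.

```python
def resume_upload_chunks(path: str, bytes_server_has: int, chunk_size: int = 1 << 20):
    """Yield the remaining pieces of a file, starting where a broken
    transfer stopped. Purely illustrative of resume-from-offset behavior."""
    with open(path, "rb") as f:
        f.seek(bytes_server_has)            # skip everything already confirmed
        while chunk := f.read(chunk_size):
            yield chunk

# Pretend the receiving side confirmed 12 GB before the connection dropped:
for chunk in resume_upload_chunks("HD_source_file.mov", 12 * 1024**3):
    pass  # hand each remaining chunk back to the transport layer
```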