DIGITAL VIDEO INVITATION

copyright 1997, 1999

by Eddie Ciletti

Remember how much fun it was to get digital audio on your computer workstation? Ready or not, video capabilities are lurking right around the corner. At the very least, hard-disk random access makes digital workstations faster at locating and synchronizing than any tape machine.

I was lucky enough to attend Adobe's Digital Video Invitational this past summer. If you are considering a video workstation, Table One (below) lists the key players. The most popular Mac-based systems are by Avid and Media 100. Both use proprietary firmware and hardware. All of the hardware, much of it for Wintel machines, is supported by Adobe's Premiere, a non-linear video and audio editing software package. Adobe After Effects makes everything look even prettier.
 

VENDOR                        PRODUCT
Canopus                       DV REX FireWire capture card
TrueVision                    video capture and export cards
NetShow                       Internet video
Netpower                      video capture and export cards
miro                          video capture and export cards
Media 100                     Mac-based non-linear editor
Kinetix                       3D and animation software
Intergraph                    turnkey systems
Future Media Concepts         training
FAST                          video capture and export cards
Digital Production Systems    video capture and export cards
Blossom                       turnkey systems
Apple                         personal computers
Adobe                         Premiere and After Effects video software
Avid                          Mac- and NT-based editors

Table One: A whole bunch of video-related vendors

Assuming that your audio expertise is the main customer "draw," picture quality can take a back seat so long as the ability to accurately lock to picture is not compromised. If you're new to the video biz, be forewarned that the term "Broadcast Quality" is seriously overused. Suffice it to say that the highest-quality analog video format is Beta SP and, in the digital domain, it's 10-bit CCIR-601. You will pay for image quality in both hardware and software.

Video destined for release on the Net is the exception. Here, quality and size are compromised just to make it work at 28.8 kbps!

(And we thought CD-ROM video was bad!)

LET THEM EAT TAPE?

Mention "video" and the dollar signs pop up faster than you can say "real time," especially when compared to outfitting an audio facility. Video workstations are no exception, but the hidden expense ó unless you are physically within an equipped facility ó is the need to own a real video machine.

Standard fare is the archaic but ubiquitous ¾-inch U-Matic format. Used decks are more affordable but can be huge and clunky. Linear Time Code (LTC) can be placed on one of the two audio tracks or, on more expensive machines, on a dedicated "address track." Vertical Interval Time Code (VITC) can be inserted into the video (at the time of transfer) and converted with a suitable reader.
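
For readers who like to see the arithmetic, here is a minimal sketch in Python (not tied to any particular deck or reader) of how time code translates into an absolute frame count, assuming non-drop-frame code at a nominal 30 frames per second:

    # Minimal sketch: convert SMPTE time code (HH:MM:SS:FF) to an absolute
    # frame count and back, assuming non-drop-frame code at a nominal 30 fps.
    # A workstation chasing LTC or VITC is doing essentially this math.

    def timecode_to_frames(tc: str, fps: int = 30) -> int:
        hours, minutes, seconds, frames = (int(x) for x in tc.split(":"))
        return ((hours * 60 + minutes) * 60 + seconds) * fps + frames

    def frames_to_timecode(total: int, fps: int = 30) -> str:
        frames = total % fps
        seconds = (total // fps) % 60
        minutes = (total // (fps * 60)) % 60
        hours = total // (fps * 3600)
        return f"{hours:02d}:{minutes:02d}:{seconds:02d}:{frames:02d}"

    print(timecode_to_frames("01:00:00:00"))   # 108000 frames at 30 fps
    print(frames_to_timecode(108000 + 15))     # 01:00:00:15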

FINANCIAL EXTREMES

A standard VHS deck would certainly be convenient. I have an older JVC consumer machine that allows time code (or regular audio) to be inserted on the linear tracks, leaving the HIFI audio intact. This allows me to sync to picture while tape is moving. However, the time code is not referenced to the master and the workstation is clueless when tape is stopped. (On professional machines, VITC time code is output even during "freeze" frames.)

Note: It's a good idea to have the ability to burn in a time code "window" over the video. (MOTU's Video Timepiece can do this.) Once recorded, frame-accurate time code will always be displayed, even when a consumer VCR is freeze-framed. It is also helpful when using a basic .avi player, such as the plug-in provided with the Soundscape audio workstation.

At the other end of the spectrum, professional Beta SP machines start at about $5,000 for the bare bones and run to over $50,000 for the lap of luxury. Beta SP decks (and better ¾-inch decks) conform to Sony 9-pin machine control.

SINK YER EYES

To synchronize a workstation to a video deck, check out Mark of The Unicorn's MIDI Timepiece AV (www.motu.com) or JL Cooper's PPS-100 (www.jlcooper.com). If you don't yet have sync capabilities, consult the video capture manufacturer for a compatible card. One option is the Sony 9-pin protocol, which is really a bi-directional RS-422 port that can spit out machine control while receiving time code data from tape.

TRAFFIC JAM

Getting video into your computer starts with a capture card. The popular video sources are detailed in Table Two. A full-resolution NTSC component capture is 720 x 480 pixels (lesser cards yield 640 x 480 pixels) and requires about a megabyte per frame. Multiply that by 30 FPS and a traffic jam (dropped frames) quickly develops as the system attempts to cram 30 MB/sec onto a typical hard drive capable of only 5 MB/sec or 10 MB/sec.
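
To see where the "about a megabyte per frame" and 30 MB/sec figures come from, here is a quick back-of-the-envelope calculation in Python. The 3-bytes-per-pixel assumption is mine; the exact figure depends on the capture card's pixel format.

    # Rough data rate for full-resolution NTSC capture: 720 x 480 pixels,
    # about 3 bytes per pixel (assumption), 30 frames per second.
    WIDTH, HEIGHT = 720, 480
    BYTES_PER_PIXEL = 3
    FPS = 30

    frame_bytes = WIDTH * HEIGHT * BYTES_PER_PIXEL
    rate_mb_per_sec = frame_bytes * FPS / 1_000_000

    print(f"{frame_bytes / 1_000_000:.2f} MB per frame")                 # ~1.04 MB
    print(f"{rate_mb_per_sec:.1f} MB/sec sustained")                     # ~31 MB/sec
    print(f"Shortfall on a 5 MB/sec drive: {rate_mb_per_sec / 5:.0f}x")  # ~6x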
 
FORMAT NAME   CABLING REQUIREMENTS                                FORMAT DETAILS
NTSC          Composite video via a single coax cable             Combined luminance and chroma
S-Video       Y/C via mini-DIN                                    Separate luminance (Y) and chroma (C)
Beta SP       Component analog via three standard coax cables     Luminance (Y) and two color difference
                                                                  signals: R-Y and B-Y
CCIR-601      Component (4:2:2) digital via a single coax         Uncompressed 8-bit or 10-bit
              (serial) or a DB-25 connector (parallel)
DV            Component (4:1:1) digital via IEEE-1394 "FireWire"  5:1 compression, 3.6 MB/sec

Table Two: The five video formats and their cabling requirements. Luminance is the monochrome or Black & White signal which contains all of the image detail. Separating the color information (Chroma) into one or more parts increases resolution and decreases artifacts.

SLIGHT DETOUR

Here in cyberland, I have the luxury of including additional information, borrowed from .  Here it is...

Composite (video)

Luminance and chrominance are combined using one of the coding standards - PAL, NTSC and SECAM - to make composite video. The process, which is an analogue form of video compression, restricts the bandwidths (image detail) of components. Chrominance is added to the luminance using a visually acceptable technique but it becomes difficult, if not impossible, to accurately reverse the process (decode) into pure luminance and chrominance. This can cause problems, especially in post production.
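
As a rough illustration of why decoding is so difficult, here is a simplified sketch of the summing process. It is textbook-style, not broadcast-exact (real NTSC adds sync, setup, and specific I/Q modulation axes), but it shows why luminance detail near the subcarrier frequency and chroma end up sharing the same band:

    import math

    F_SC = 3.579545e6   # NTSC color subcarrier frequency, Hz

    def composite_sample(y: float, u: float, v: float, t: float) -> float:
        """Simplified composite encoding: luminance plus the two chroma
        components quadrature-modulated onto the color subcarrier. Once
        summed, the pieces cannot be perfectly separated again."""
        chroma = u * math.sin(2 * math.pi * F_SC * t) + v * math.cos(2 * math.pi * F_SC * t)
        return y + chroma

    # One sample 100 ns into a line: mid-gray with a mild color cast
    print(composite_sample(y=0.5, u=0.1, v=-0.05, t=100e-9))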

Luminance

A component, the black and white or brightness element, of an image. It is written as Y, so the Y in YUV, YIQ, (Y, R-Y, B-Y) and Y, Cr, Cb is the luminance information of the signal. In a colour TV system the luminance signal is usually derived from the RGB signals, originating from a camera or telecine, by a matrix or summation of approximately: Y = 0.3R + 0.6G + 0.1B
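
Applied to a pixel, that matrix is just a weighted sum; a quick sketch using the approximate coefficients quoted above:

    # Luminance from RGB using the approximate weights Y = 0.3R + 0.6G + 0.1B.
    def luminance(r: float, g: float, b: float) -> float:
        return 0.3 * r + 0.6 * g + 0.1 * b

    print(luminance(1.0, 1.0, 1.0))   # white -> 1.0
    print(luminance(1.0, 0.0, 0.0))   # pure red contributes only 0.3 to brightness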

Chrominance

The colour part of a signal, relating to the hue and saturation but not to the brightness or luminance of the signal. Thus black, grey and white have no chrominance, but any coloured signal has both chrominance and luminance. Cr and Cb, (R-Y) and (B-Y) represent the chrominance information of a signal.

Component Video

The normal interpretation of a component video signal is one in which the luminance and chrominance remain as separate components, eg analogue components in MII and Betacam VTRs (Beta is YRB with Setup aka 7.5 IRE), digital components Y, Cr, Cb in CCIR 601. RGB is also a component signal. Component video signals retain maximum luminance and chrominance bandwidth.

Y, (R-Y), (B-Y)

These are the analogue luminance, Y, and colour difference signals (R-Y) and (B-Y) of component video. Y is pure luminance information whilst the two colour difference signals together provide the colour information. The latter are the difference between a colour and luminance: red - luminance and blue - luminance. The signals are derived from the original RGB source (eg a camera or telecine).

The Y, (R-Y), (B-Y) signals are fundamental to much of television. For example in CCIR 601 it is these signals that are digitised to make 4:2:2 component digital video and in the PAL and NTSC TV systems they are used to generate the final composite coded signal.
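
The following sketch derives Y, (R-Y) and (B-Y) from RGB and then reconstructs RGB, using the same approximate luminance weights quoted earlier. It shows why only two color difference signals are needed: once red and blue are known, green falls out of the luminance equation.

    # RGB -> Y, (R-Y), (B-Y) and back, using Y = 0.3R + 0.6G + 0.1B.
    def rgb_to_components(r, g, b):
        y = 0.3 * r + 0.6 * g + 0.1 * b
        return y, r - y, b - y                  # Y, (R-Y), (B-Y)

    def components_to_rgb(y, r_y, b_y):
        r = y + r_y
        b = y + b_y
        g = (y - 0.3 * r - 0.1 * b) / 0.6       # green recovered from the luminance equation
        return r, g, b

    y, r_y, b_y = rgb_to_components(0.8, 0.4, 0.2)
    print(components_to_rgb(y, r_y, b_y))       # ~(0.8, 0.4, 0.2), within rounding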

Y, Cr, Cb

The digital luminance and colour difference signals in CCIR 601 coding. The Y luminance signal is sampled at 13.5 MHz and the two colour difference signals are sampled at 6.75 MHz co-sited with one of the luminance samples. Cr is the digitised version of the analogue component (R-Y), likewise Cb is the digitised version of (B-Y).
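
Those sampling frequencies are also where the CCIR-601 data rates come from. A quick calculation of the gross (blanking included) rate:

    # CCIR-601 4:2:2 gross data rate: luminance sampled at 13.5 MHz,
    # each color difference signal at 6.75 MHz.
    Y_RATE = 13.5e6
    C_RATE = 6.75e6

    for bits in (8, 10):
        total_bits = (Y_RATE + 2 * C_RATE) * bits
        print(f"{bits}-bit 4:2:2: {total_bits / 1e6:.0f} Mbit/s "
              f"({total_bits / 8 / 1e6:.2f} MB/sec)")
    # 8-bit  -> 216 Mbit/s (27.00 MB/sec)
    # 10-bit -> 270 Mbit/s (33.75 MB/sec)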

And now back to the main thrust of this article.

HARDER and FASTER

The two primary options for increased throughput are faster drives and data compression. Ultra-Wide SCSI drives are capable of up to 40 MB/sec, and 2:1 Motion JPEG compression can reduce the traffic to 15 MB/sec with an undetectable loss of quality. The alternatives are standard drive arrays or serious data reduction. A Redundant Array of Inexpensive Drives (RAID) parallels two or more hard drives to increase the data transfer rate.
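
A small sketch of the budgeting involved, with the drive and compression figures from the paragraph above; the helper is plain arithmetic and ignores real-world striping and file-system overhead:

    # Does the (optionally compressed) video stream fit on the storage?
    def fits(stream_mb_s: float, drive_mb_s: float, compression: float = 1.0,
             stripes: int = 1) -> bool:
        """Idealized check: compressed stream rate vs. aggregate drive rate."""
        return stream_mb_s / compression <= drive_mb_s * stripes

    print(fits(30, 5))                    # False: raw NTSC vs. a 5 MB/sec drive
    print(fits(30, 40, compression=2))    # True: 2:1 Motion JPEG on Ultra-Wide SCSI
    print(fits(30, 10, stripes=4))        # True: four 10 MB/sec drives striped (RAID)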

If size and quality are not critical, simply throw away the easy stuff. NTSC sends even and odd lines in separate fields (two fields make a frame). Discarding one field cuts the vertical resolution in half (each scan line is one pixel tall), typically yielding 320 x 240 once the width is scaled to match. Even the frame rate can be cut in half, from 30 FPS to 15 FPS. Now the data rate is far more manageable before compression is even applied. Standard QuickTime or .avi "movies" are typical of what radical compression looks like. (MPEG and Motion JPEG are two examples of compression schemes.)
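
Using the same 3-bytes-per-pixel assumption as the earlier calculation, here is what throwing away the easy stuff buys before any compression is applied:

    # Raw data rate at a given picture size and frame rate (3 bytes/pixel assumed).
    def raw_rate_mb_s(width, height, fps, bytes_per_pixel=3):
        return width * height * bytes_per_pixel * fps / 1_000_000

    print(raw_rate_mb_s(720, 480, 30))   # ~31 MB/sec, full resolution and frame rate
    print(raw_rate_mb_s(320, 240, 15))   # ~3.5 MB/sec before MPEG or Motion JPEG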

FIRE WIRE

The most recent development is affordable digital video (DV) for both consumer and professional applications. In addition to standard analog I/O, DV gear also has a serial interface called IEEE-1394, or "FireWire." Using a DV camcorder to capture the signal also provides 5:1 data compression, making light work for the computer hardware (no A-to-D converter or additional compression is required).
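
As a rough sanity check on the DV figures in Table Two: a 4:1:1 stream averages about 1.5 bytes per pixel (full-rate luminance plus two quarter-rate color difference channels), and 5:1 compression of that lands in the same ballpark as the published number. The roughly 3.6 MB/sec DV stream also carries audio and subcode data, which is not counted here.

    # Ballpark DV video rate: 720 x 480, 4:1:1 sampling (~1.5 bytes/pixel), 30 fps,
    # then 5:1 compression as performed in the camcorder.
    WIDTH, HEIGHT, FPS = 720, 480, 30
    BYTES_PER_PIXEL_411 = 1.5

    uncompressed = WIDTH * HEIGHT * BYTES_PER_PIXEL_411 * FPS / 1_000_000
    print(f"Uncompressed 4:1:1: {uncompressed:.1f} MB/sec")            # ~15.6 MB/sec
    print(f"After 5:1 DV compression: {uncompressed / 5:.1f} MB/sec")  # ~3.1 MB/sec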

For more information on FireWire and DV technology, check out the following web sites:
 
Table Three: Where's the Fire?
Texas Instruments Q & A
1394 Trade Association
1394 Home Page
Skipstone (purchased by Adaptec)
Adaptec FireWire Interfaces
