The pyrotechnic mortar must fire exactly 127 milliseconds after the musical hit: not 150 milliseconds, which would look late, and definitely not 100 milliseconds, which would fire while the performer still occupies the danger zone. Human reaction time averages around 250 milliseconds, making manual cueing physically impossible for effects requiring split-second precision. The solution is timecode: the invisible metronome that synchronizes every technical element to a common reference, enabling precision that human operators alone cannot achieve.
The SMPTE Legacy That Governs Entertainment
The Society of Motion Picture and Television Engineers developed timecode standards in the 1960s to solve film and audio synchronization problems. SMPTE timecode divides time into hours, minutes, seconds, and frames—typically 24, 25, 29.97, or 30 frames per second depending on the video standard. This format became universal across entertainment, from film post-production to live concert programming, creating a common language that lets equipment from different manufacturers lock to a shared temporal reference.
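To make the frame arithmetic concrete, here is a minimal sketch, assuming non-drop-frame timecode only, that converts an “HH:MM:SS:FF” value to an absolute frame count and back. (29.97fps drop-frame timecode needs additional handling, since it periodically skips frame numbers to stay aligned with real time.)

```python
# Minimal non-drop-frame SMPTE timecode arithmetic, for illustration only.

def timecode_to_frames(tc: str, fps: int) -> int:
    """Convert a non-drop-frame timecode string 'HH:MM:SS:FF' to an absolute frame count."""
    hh, mm, ss, ff = (int(part) for part in tc.split(":"))
    return ((hh * 60 + mm) * 60 + ss) * fps + ff

def frames_to_timecode(frames: int, fps: int) -> str:
    """Convert an absolute frame count back to 'HH:MM:SS:FF'."""
    ff = frames % fps
    total_seconds = frames // fps
    ss = total_seconds % 60
    mm = (total_seconds // 60) % 60
    hh = total_seconds // 3600
    return f"{hh:02d}:{mm:02d}:{ss:02d}:{ff:02d}"

print(timecode_to_frames("01:00:00:00", 25))   # 90000
print(frames_to_timecode(90000, 25))           # 01:00:00:00
```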
Linear Timecode (LTC) travels through audio cables as an audible signal: the characteristic “buzz” that leaks into recordings when timecode routing fails. MIDI Timecode (MTC) transmits the same information through MIDI connections. Both formats have been largely superseded in complex productions by network-based protocols like Art-Net timecode and SMPTE ST 2059, which offer higher precision and more flexible distribution.
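As an illustration of how MTC carries that information, the sketch below decodes an MTC “full frame” SysEx message, whose layout (F0 7F device 01 01 hr mn sc fr F7, with the frame-rate code packed into the top two data bits of the hours byte) comes from the MIDI specification. Treat it as a sketch rather than a complete MTC implementation: running timecode is normally streamed as quarter-frame messages, which this example does not handle.

```python
# Decode an MTC full-frame SysEx message into a readable timecode string.

RATE_CODES = {0: "24 fps", 1: "25 fps", 2: "29.97 fps (drop-frame)", 3: "30 fps"}

def decode_mtc_full_frame(msg: bytes) -> str:
    """Return 'HH:MM:SS:FF @ rate' from an MTC full-frame SysEx message."""
    if len(msg) != 10 or msg[0] != 0xF0 or msg[3:5] != bytes([0x01, 0x01]) or msg[-1] != 0xF7:
        raise ValueError("not an MTC full-frame message")
    hr_byte, mn, sc, fr = msg[5:9]
    rate = RATE_CODES[(hr_byte >> 5) & 0x03]   # top two data bits of the hours byte
    hours = hr_byte & 0x1F
    return f"{hours:02d}:{mn:02d}:{sc:02d}:{fr:02d} @ {rate}"

# Example: 01:00:00:00 at 25 fps (rate code 1, so the hours byte is 0x21)
print(decode_mtc_full_frame(bytes([0xF0, 0x7F, 0x7F, 0x01, 0x01, 0x21, 0x00, 0x00, 0x00, 0xF7])))
```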
Building the Timecode Distribution System
A timecode master, typically a media server, playback system, or dedicated generator, produces the reference that all other systems follow. Dataton Watchout servers commonly serve as timecode masters for corporate productions, while touring concerts might use Pro Tools systems or dedicated hardware like the Rosendahl Nanosync. The choice of master determines the entire system’s reliability: if the master fails or generates inconsistent timecode, every synchronized system experiences problems.
Distribution amplifiers like those from Horita or ESE split timecode signals without degradation, allowing dozens of receiving systems to lock to a single master. The physical routing of timecode requires the same attention that audio receives—dedicated cables, clear labeling, and documented signal flow. Productions that treat timecode as an afterthought discover its importance only when synchronization fails during shows.
Lighting Console Timecode Integration
Modern lighting consoles including grandMA3, Hog 4, and ETC Eos implement timecode event triggering that fires cues at specific timecode positions. A timecode-triggered cue list for a musical number might contain hundreds of events, each programmed to fire within a frame or two of its intended moment. This programming approach requires visualization tools that show cue positions relative to audio waveforms; the MA Lighting solution provides such displays natively.
The workflow distinction between “timecode chase” and “timecode trigger” matters for programming strategy. Chase mode continuously follows timecode position, jumping to appropriate cues if playback starts mid-song. Trigger mode fires cues only when timecode reaches their programmed values, potentially missing cues if playback starts after their trigger points. Complex shows often use hybrid approaches, with chase sections for programmed sequences and trigger events for specific moments that must fire even during unexpected playback variations.
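The behavioral difference is easy to see in code. The example below is a simplified illustration, not any console’s actual firing logic: cue times are absolute frame counts, chase resolves to the latest cue at or before the current position, and trigger fires only cues whose times fall inside the span just played.

```python
# Simplified "chase" vs "trigger" behavior for a timecode-driven cue list.

cue_list = [
    (0,    "Blackout"),
    (250,  "Verse wash"),
    (800,  "Chorus hit"),
    (1500, "Pyro look"),
]

def chase(current_frame, cues):
    """Chase: land on the latest cue at or before the current position,
    even if playback starts mid-song."""
    fired = None
    for frame, name in cues:
        if frame <= current_frame:
            fired = name
    return fired

def trigger(previous_frame, current_frame, cues):
    """Trigger: fire only cues whose times fall inside the interval just played;
    cues earlier than the start point are skipped entirely."""
    return [name for frame, name in cues if previous_frame < frame <= current_frame]

# Playback starts mid-song at frame 900:
print(chase(900, cue_list))          # Chorus hit -> console jumps to the correct look
print(trigger(899, 900, cue_list))   # []         -> earlier cues were never fired
```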
Video System Synchronization
Media servers like disguise and Resolume can both generate and receive timecode, enabling complex relationships where video playback drives other systems or responds to external timing references. IMAG (image magnification) workflows increasingly use timecode to coordinate camera switches with pre-programmed stage looks: the cut to a specific camera automatically triggers associated lighting changes. This level of coordination requires careful programming but produces seamless integration that manual operation cannot match.
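A sketch of that camera-to-lighting coupling might look like the following. The camera numbers, cue labels, and send_lighting_cue() placeholder are hypothetical, standing in for whatever trigger mechanism the production actually uses (OSC, MIDI Show Control, or console-specific remote commands).

```python
# Hypothetical coupling of IMAG camera cuts to lighting looks: a lookup from
# the camera being taken to the lighting cue that should fire with it.

CAMERA_TO_CUE = {
    1: "cue 101",  # wide shot       -> full stage wash
    2: "cue 102",  # singer close-up -> tight key light
    3: "cue 103",  # drum cam        -> upstage backlight look
}

def send_lighting_cue(cue: str) -> None:
    # Placeholder for the real trigger path (OSC, MSC, etc.).
    print(f"(placeholder) firing lighting {cue}")

def on_camera_cut(camera: int) -> None:
    """Called from the vision mixer's tally or automation output when a camera is taken."""
    cue = CAMERA_TO_CUE.get(camera)
    if cue is not None:
        send_lighting_cue(cue)

on_camera_cut(2)   # prints: (placeholder) firing lighting cue 102
```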
Genlock, a capability built into professional video equipment, ensures that multiple video sources maintain frame-accurate alignment. Without genlock, combining video signals produces visible “tearing” as frames from different sources fail to align. Genlock distribution, like timecode distribution, requires dedicated infrastructure that production budgets must explicitly include.
Audio System Time Alignment
Dante networking has made sample-accurate audio synchronization commonplace, but integrating audio systems with lighting and video timecode requires additional configuration. The Dante Controller software shows sample clock relationships but doesn’t display SMPTE timecode alignment. Productions where audio timing affects visual synchronization need measurement tools like the Avid SYNC HD that verify alignment across different time reference systems.
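The underlying check is plain arithmetic: convert the audio playhead’s sample position and the timecode position to seconds and compare them. The sketch below assumes 48kHz audio and 25fps timecode purely for illustration and is not tied to any particular measurement tool.

```python
# Relating audio sample positions to SMPTE timecode frames, to check whether a
# sample-accurate audio domain and the timecode reference agree.

SAMPLE_RATE = 48_000   # samples per second (assumed)
FPS = 25               # timecode frames per second (assumed)

def samples_to_frames(samples: int) -> float:
    """Express an audio sample position as a (fractional) timecode frame count."""
    return samples / SAMPLE_RATE * FPS

def offset_ms(audio_samples: int, timecode_frames: int) -> float:
    """Alignment error in milliseconds; positive means audio is ahead of timecode."""
    return (audio_samples / SAMPLE_RATE - timecode_frames / FPS) * 1000.0

print(samples_to_frames(4_804_800))   # ~2502.5 frames
print(offset_ms(4_804_800, 2_500))    # ~100 ms -> audio runs about 0.1 s ahead
```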
Backing tracks and click tracks used in modern concert productions typically originate from Pro Tools or Ableton Live sessions that generate timecode simultaneously with audio. The musical director’s click and the timecode feeding technical systems must maintain a perfect relationship; drift between them produces timing errors that musicians and technicians experience in different ways but find equally problematic.
Troubleshooting Synchronization Failures
When timecode systems fail, symptoms appear across multiple departments simultaneously: lighting cues fire late, video content drifts from audio, and automated elements miss their marks. Diagnosis requires understanding the timecode distribution chain to identify where problems originate. A failing timecode master affects everything; a damaged distribution cable affects only downstream systems. Systematic troubleshooting from master through distribution to receiving devices locates problems faster than random cable replacement.
Frame rate mismatches create subtle problems that evade quick diagnosis. A lighting console expecting 30fps timecode that receives a 29.97fps signal will gradually drift from its correct position: imperceptibly at first, but measurably after several minutes. Productions that use equipment from multiple sources must explicitly verify that all systems agree on frame rate configuration. The assumption that equipment arrives correctly configured has caused synchronization failures that took entire technical rehearsals to identify.
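One way to see the scale of the problem: 29.97fps is really 30000/1001 frames per second, about 0.1% slower than 30fps, so a system that free-runs or converts at the wrong nominal rate accumulates roughly 60 milliseconds of error per minute of run time. A back-of-the-envelope sketch:

```python
# Quantifying the 30 fps vs 29.97 fps mismatch.

NOMINAL_FPS = 30.0
ACTUAL_FPS = 30000 / 1001   # "29.97" fps, exactly

def drift_ms(elapsed_seconds: float) -> float:
    """Timing error in ms when 29.97 fps timecode is interpreted as 30 fps."""
    frames_received = elapsed_seconds * ACTUAL_FPS
    assumed_seconds = frames_received / NOMINAL_FPS
    return (elapsed_seconds - assumed_seconds) * 1000.0

for minutes in (1, 5, 30, 60):
    print(f"{minutes:>2} min -> {drift_ms(minutes * 60):6.0f} ms of drift")
# ~60 ms per minute: roughly 0.3 s after five minutes and 3.6 s over an hour.
```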