Automation in Worship by Kevin Poole

We talk a lot about excellence in worship and in church services these days.  Sometimes we misunderstand exactly why excellence is important: it's because God is perfect and holy, He deserves our very best, and that's why we strive to do things well.  It's an act of worship!

Doing things well doesn't just start on Sunday morning; it starts long in advance with strategic planning and attentiveness to detail.  Kevin Poole is a friend of mine who is the tech director at Mobbery Baptist Church in Longview, TX.  I had the opportunity to see his workflow and ideas firsthand as they were planning their Christmas production this year.  He goes into detail on how they've planned automation in worship and for the special events they do.  The only way to reach this level of excellence is through careful planning, attention to detail, and the sensitivity and maturity to understand how the Holy Spirit will lead the congregation and to anticipate it.  Kevin gets it.  This article may be advanced for some, but I do believe there is something every church can take away from it.  Make sure you follow Kevin Poole on Twitter at @kevinrpoole …he's full of great insight and ideas!  Check out his article:

 

Automation in Worship

 

Before I start this, let me give credit where credit is due. Three men taught me and led me along this workflow; I would know none of this without the help of Jon Daggett, Mike Gerringer, and Daniel Albert of Thomas Road Baptist Church and Liberty University.

 

Excellence is critical; make it as easy as possible

In worship settings, we strive for excellence. In a volunteer-driven technical ministry, many team members have careers that are not technical. They serve faithfully, learning the craft as they go. My goal was to create systems for worship and production that would allow anyone to operate the gear. Preproduction planning by those who have chosen the technical disciplines as their ministry helps lower the learning curve of technically advanced gear.

Sync is a big deal in production. Time is constant and can never be paused or slowed down. Therefore, we as a team all have to be in sync. Musically, this is accomplished through a director with tempo. Technologically, this is accomplished with timecode. These are the two foundations of this post. This does not eliminate the need for execution by a technical ministry, but it helps control the number of moving parts and failure points.

Keeping the musicality

Let's start musically. Our worship team includes a 10-member praise team, a 30-piece orchestra, a 6-piece rhythm section, and a 100-voice choir. When I started down this path, we were not using a click. The first thing we needed to do was create a way for the team to hear a click source and start learning to play within a tempo-specific constraint. We bought a metronome beat station and fed it into our console. The rhythm section was already on in-ear monitors; I had to get the director on ears so he could hear it too. Currently, only our rhythm section and director can hear the click. This has worked for now, but we are seeking to give orchestra members this option as well.

Next, we had to get the source click to coincide with a technical timecode of some flavor. Several applications allow this. We chose Digital Performer (DP), mostly because I had seen it done by the mighty three mentioned above. DP allowed me to create tempo maps from demo tracks provided by my Worship Pastor. DP could then output timecode along with a click, so that digital devices would track in time with musical moments signaled by the click.
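The core idea of a tempo map is simple: it ties beat positions to clock time, so the click and the timecode always agree. Here is a minimal sketch of that math, assuming the map is just a list of (start beat, BPM) segments; the function name and data shape are illustrative, not Digital Performer's internals.

```python
# Sketch: convert a beat position to elapsed clock time using a tempo map.
# A tempo map here is a sorted list of (start_beat, bpm) segments,
# with the first segment starting at beat 0.

def beat_to_seconds(beat, tempo_map):
    """Return the elapsed seconds at a given beat position."""
    seconds = 0.0
    for i, (start, bpm) in enumerate(tempo_map):
        # This segment ends where the next one starts (or at our target beat).
        end = tempo_map[i + 1][0] if i + 1 < len(tempo_map) else beat
        span = min(beat, end) - start
        if span <= 0:
            break
        seconds += span * 60.0 / bpm  # one beat lasts 60/bpm seconds
        if beat <= end:
            break
    return seconds

# At a steady 120 BPM, beat 8 lands at 4.0 seconds.
print(beat_to_seconds(8, [(0, 120)]))            # 4.0
# With a tempo change: 4 beats at 120 BPM (2.0 s), then 4 at 60 BPM (4.0 s).
print(beat_to_seconds(8, [(0, 120), (4, 60)]))   # 6.0
```

This is why a demo-track tempo map matters: once every beat has a fixed clock time, lighting and video cues keyed to timecode will land on the same musical moments the band hears in the click.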

Now I had to figure out how to get multiple types of devices to talk to each other from the same timecode source. The main devices we wanted to sync were video playback, lighting control, and audio cues. I found that these three types of devices all like to talk sync, but each spoke a different kind of sync, or had a different sync input type. Some strategic purchases had to be made to ensure they would all speak the same timecode.

 

The workflow

For our workflow, we use two types of timecode simultaneously. First, DP is very good at handling MIDI. DP allows multiple tracks of MIDI, much like audio tracks, and they can be assigned to different outputs of an interface (see diagram below). Once I could get MIDI out, I could then get MIDI in to other devices, either natively or through an interface. I won't mention brands of gear, because most gear has this ability, and if not, may have a workaround.

I set DP to output MIDI Time Code (MTC) and connected my MIDI interface to my lighting desk. I also ran MIDI through to my audio console. Some audio consoles accept MTC, but for me, all I needed was MIDI note values to recall presets on my audio desk. So now, as DP tracks along in time, the MTC flows to a timeline in my lighting console, which is programmed with cues, and my audio desk listens for MIDI notes telling it to recall specific presets. My lighting is now automated along with my mix/mic cues.
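For anyone curious what those preset-recall notes actually look like on the wire, a MIDI Note On is just three bytes: a status byte carrying the channel, a note number, and a velocity. This sketch builds one by hand (which note maps to which preset is configured on the console itself, so the mapping below is only an example):

```python
# Sketch: the raw 3-byte MIDI Note On message a sequencer sends when
# a note event fires. The console's note-to-preset mapping is set up
# on the desk; this just shows the message format.

def note_on(channel, note, velocity=127):
    """Build a raw MIDI Note On message.

    channel is 1-16 as labeled on most gear; note and velocity are 0-127.
    """
    if not 1 <= channel <= 16:
        raise ValueError("MIDI channel must be 1-16")
    status = 0x90 | (channel - 1)  # 0x9n = Note On, n = channel - 1
    return bytes([status, note & 0x7F, velocity & 0x7F])

# e.g. note 60 (middle C) on channel 1 could be mapped to "Song 1 preset"
msg = note_on(1, 60)
print(msg.hex())  # "903c7f"
```

Because each note number is a distinct trigger, one MIDI track in the sequence can carry a whole service's worth of preset recalls, each placed exactly on the beat where the mix should change.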

Now let's talk about video. This is a bit more complicated, yet easier. Video playback runs at a frame rate, a specified number of frames per second. In our case, we sync video playback at 30fps. DP allows you to set this rate within the app, so if 29.97, 24, or 60fps is your workflow, there are ways to do that as well. Finding a video playback device that listens to MTC is difficult; the video world seems to have standardized on SMPTE as its timecode of choice for this kind of workflow. I'm an audio guy first and foremost, so if I botch anything here, please give me grace.
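It can help to see how SMPTE timecode labels time: every frame gets an HH:MM:SS:FF address, where FF counts frames within the second at the chosen rate. This sketch renders non-drop-frame timecode at an integer rate (drop-frame 29.97 adds extra counting rules not shown here):

```python
# Sketch: render elapsed seconds as HH:MM:SS:FF non-drop SMPTE timecode.
# Assumes an integer frame rate; drop-frame 29.97 needs extra rules.

def seconds_to_smpte(seconds, fps=30):
    """Return the timecode address of the frame nearest to `seconds`."""
    total_frames = int(round(seconds * fps))
    frames = total_frames % fps            # frame count within the second
    total_seconds = total_frames // fps
    h, rem = divmod(total_seconds, 3600)
    m, s = divmod(rem, 60)
    return f"{h:02d}:{m:02d}:{s:02d}:{frames:02d}"

print(seconds_to_smpte(3661.5))        # "01:01:01:15"
print(seconds_to_smpte(0.1, fps=30))   # "00:00:00:03"
```

At 30fps a frame is about 33 ms wide, which is why chasing timecode frame-by-frame keeps video locked far more tightly than a one-time "go" trigger could.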

DP is also capable of outputting SMPTE timecode as an audio source. As an audio guy, I found this very comforting. It was simple to then route that audio through an XLR cable to a video playback box that has the capability of inputting and chasing SMPTE.

In DP, there is a feature called "Chunks." Our workflow uses this feature as discrete multitrack sessions, each labeled as one song. I can then set each chunk to its own specified timecode hour. I set our video playback box to play the appropriate video at the timecode hour related to each song in DP. Now, when I press play in DP, my audio desk fires the appropriate preset for the song, lighting starts its thing, and video playback starts. All three disciplines, one play button.
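The one-hour-per-song convention boils down to a simple lookup: the playback box is told which clip belongs to which timecode hour, so the hour field of the incoming timecode selects the cue. A tiny sketch of that idea, with made-up song and file names:

```python
# Sketch: the "one timecode hour per song" convention as a lookup table.
# Song titles and file names are invented for illustration.

cues_by_hour = {
    1: "song1_background.mov",
    2: "song2_background.mov",
    3: "christmas_opener.mov",
}

def cue_for_timecode(timecode):
    """Pick the video cue for an HH:MM:SS:FF timecode string."""
    hour = int(timecode.split(":")[0])
    return cues_by_hour.get(hour)  # None if no cue lives in that hour

print(cue_for_timecode("02:00:05:00"))  # "song2_background.mov"
```

Because each chunk in the sequencer starts at its own hour, no two songs can ever collide in timecode, and reordering the set list is just a matter of which chunk gets played next.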

In this workflow, the only position still required to make the look happen was a TD switching video. While there are means to automate points of a switcher, we have not taken it that far yet. GPIs are limited in what they can do, and can be complicated to wire. This may be a next stage for us. I'm not saying we would automate every camera dissolve, but perhaps we could automate a cut when video playback starts so the first frame of the video is not missed. We haven't yet seen a need for this.

There are other ways to fire video playback, such as via a MIDI note. Some would say this is an appropriate method in this kind of workflow, but it has one fatal flaw in my mind: simply starting a video does not ensure frame-accurate alignment or constant playback speed. If the video playback jitters, or the audio slows, the lip sync or the alignment of video transitions will not follow. It's perceived as a failure, or as less than excellent. Purchasing a video playback device capable of SMPTE chase costs more money, agreed. But our goal is not to create distractions, it is to eliminate them. Weighing the extra money against the security of not having distractions, the choice seemed appropriate, and we purchased the more expensive device.
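Some back-of-envelope arithmetic shows why the fire-and-forget approach is risky: a free-running player with even a tiny speed error accumulates drift for the whole song, while a SMPTE-chasing device keeps re-locking to the incoming timecode. The numbers below are illustrative, not measurements of any particular device.

```python
# Sketch: drift accumulated by a free-running video player whose
# playback clock is slightly off, versus a chase device that re-locks.

def drift_ms(song_seconds, speed_error_pct):
    """Milliseconds of drift after `song_seconds` at a given speed error."""
    return song_seconds * (speed_error_pct / 100.0) * 1000.0

# A 0.1% clock error over a 5-minute song drifts 300 ms -- well past
# the few tens of milliseconds where lip-sync error becomes noticeable.
print(drift_ms(300, 0.1))  # 300.0
```

A chase device never lets that error accumulate, because every incoming timecode frame is a fresh chance to correct position: that is the security the extra money buys.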

There is one more fun feature of this workflow. As an audio engineer, one task of my role I prefer not to do is cueing and firing playback tracks. There is always that awkward moment if the track fires late, if you miss a cue, or if the device freezes. While DP is remarkably stable in starting and stopping playback, it also allows for remote control of the transport via an iPad app.

I have placed an iPad running DP Control on a stand near our director. I have set DP to cue the following chunk at the end of each song and wait for the play command. My director then presses one button in the course of the service/production: play. The director does not need to know the ins and outs of DP, just when he wants the next element to start.

This workflow will not happen overnight. There are many moving parts, and many pieces of gear that must work together for the end product to come across as seamless. Once the foundation is set, however, the workflow opens all kinds of possibilities. Remember, MIDI can control many devices, and DP can have many autonomous MIDI tracks. I think I once saw a MIDI-controlled coffee pot…maybe that was DMX, but I could still automate it through lighting!


I hope you can see the power of preproduction and automation. Remember, poorly planned productions will lead to poorly executed productions.

 

Check out some other guest blog posts in this series:

Allen & Heath GLD vs. Midas Pro 1 Hands on Review by Peter Wituszynski

Behind the Scenes Technology: A Children’s Pastor’s Perspective
