Dealing with Logic’s Beat Mapping

Do I really have to do it? You know – load an audio file in Logic, enable Beat Mapping, then drag a Measure/Beat ruler position to the corresponding peak in the audio waveform. If you are working on very simple material, the automatic detection may do most of the work on its own. If you are working on real music, it doesn't.

After many long, tedious sessions of Ctrl-Shift-dragging, hoping the beat would stick to the desired position, and usually ending up with a bumpy Tempo track, I decided a better solution had to be found. My preferred workaround is to beat the tempo into a new MIDI track, then ask Logic to derive the beat mapping from that track. If things are a bit off, I can simply nudge the MIDI events, proportionally respace them with the Time Handles, or add or delete events if a beat needs to be added or removed.
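
For intuition, here is a minimal sketch (plain Python, not anything Logic exposes) of the information that tapped MIDI track actually carries: each pair of consecutive beat timestamps implies a local tempo, which is essentially what Beat Mapping reconstructs from the MIDI events.

```python
# Illustration only: derive per-beat tempo values from tapped beat timestamps,
# the same information Beat Mapping reads from a tapped MIDI region.

def tempo_curve(beat_times_sec):
    """Return a list of BPM values, one per interval between consecutive taps.

    beat_times_sec: timestamps (in seconds) of the tapped beats, in order.
    """
    bpms = []
    for prev, curr in zip(beat_times_sec, beat_times_sec[1:]):
        interval = curr - prev          # seconds between two consecutive beats
        bpms.append(60.0 / interval)    # one beat every `interval` seconds -> BPM
    return bpms

# Example: beats tapped slightly unevenly around 100 BPM.
taps = [0.0, 0.61, 1.20, 1.82, 2.40]
print([round(bpm, 1) for bpm in tempo_curve(taps)])  # [98.4, 101.7, 96.8, 103.4]
```

Nudging a single MIDI event only changes the two intervals around it, which is why the resulting Tempo track stays much smoother than repeated Ctrl-Shift-dragging.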

This works a lot better for me. At the very least, it is a tempo I beat myself, following the real music, and not an informed hint from a machine.

A missing standard for keyswitching

The lack of a universal, advanced standard for keyswitching drives me crazy. I'm one of those who prefer not to insert keyswitches in the score, nor to use separate tracks for playing techniques. I want a meta-code to drive my technique changes.

What I did, when building my articulation sets for Logic, was first to create my own personal articulations/techniques map, starting from the Spitfire Audio UACC map repeated twice (UACC 1-128, Logic 1-256). This means that all my maps have the same articulation types at the same IDs, and selection messages always start from those fixed positions.
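
As a rough sketch of the idea (the names and ID numbers below are placeholders, not the actual UACC assignments), the whole point is that every map shares one fixed ID table, so a selection message with a given ID always means the same technique:

```python
# Illustrative only: one fixed personal articulation table shared by all maps.
# IDs 1-128 follow a UACC-style layout; 129-256 extend it to Logic's full
# articulation ID range. The entries below are invented placeholders.

PERSONAL_MAP = {
    1: "long",
    2: "legato",
    20: "staccato",
    40: "pizzicato",
    # ... up to 128, then 129-256 for the second, extended bank
}

def technique_for(articulation_id: int) -> str:
    """Look up the technique meant by a selection message's articulation ID."""
    return PERSONAL_MAP.get(articulation_id, "undefined")

print(technique_for(20))   # -> "staccato", in every library's map
```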

Unfortunately, not all libraries are consistent in how they map their articulations/techniques, so I'm still using too many articulation sets and expression maps. With the VSL VI libraries I built my own presets, all organized in the same way, but this is not possible with every library.
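
When a library lays out its techniques differently, the only cure is a per-library translation layer, which is what each articulation set effectively is. A hedged sketch (the keyswitch notes and CC values are invented) of what that translation does: turn the same fixed personal ID into whatever output event a particular library expects.

```python
# Hypothetical per-library translation: one fixed personal ID, different
# output events per library. The keyswitch note and CC values are invented.

LIBRARY_OUTPUTS = {
    "library_a": {20: ("note", 24)},        # staccato = keyswitch note number 24
    "library_b": {20: ("cc", 32, 55)},      # staccato = CC32 with value 55
}

def output_event(library: str, articulation_id: int):
    """Return the output event a given library needs for a personal articulation ID."""
    return LIBRARY_OUTPUTS[library].get(articulation_id)

print(output_event("library_a", 20))  # ('note', 24)
print(output_event("library_b", 20))  # ('cc', 32, 55)
```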