DuskDave timing accuracy of MIDI commands
Ooh, that's an interesting question!
That's made me stop and think about MIDI resolution: the tick resolution in my DAW, the timing resolution in ST3, and whether I need to think about the relationship between the two.
In my DAW, where I create all of my backing tracks (including managing all the device MIDI automation), I set all MIDI files to 960 ticks per quarter note for the highest resolution.
In ST3, we have MIDI command timing resolution down to 1ms.
If I have a song at 120bpm and my DAW at 960 ticks per quarter note resolution, then one tick is about 0.52ms, so two MIDI commands 1 tick apart are roughly half a millisecond apart (my DAW rounds and shows 1ms).
If they are 10 ticks apart - 5ms.
100 ticks apart - 52ms
Note: the ms values are rounded in my DAW (Cakewalk Sonar), which is why they aren't perfectly linear.
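The tick-to-ms arithmetic above is easy to sanity-check. A minimal sketch (the function name and defaults are mine, matching the 120bpm / 960 PPQN example):

```python
# Convert a distance in MIDI ticks to milliseconds, given the song tempo
# and the DAW's resolution in ticks per quarter note (PPQN).
def ticks_to_ms(ticks: int, bpm: float = 120.0, ppqn: int = 960) -> float:
    ms_per_quarter = 60_000.0 / bpm        # 500 ms per quarter note at 120 bpm
    return ticks * ms_per_quarter / ppqn   # ms per tick, times number of ticks

print(ticks_to_ms(1))    # ~0.52 ms (the DAW rounds this to 1 ms)
print(ticks_to_ms(10))   # ~5.2 ms
print(ticks_to_ms(100))  # ~52 ms
```

Running it reproduces the rounded values in the table: 0.52ms, 5.2ms and 52ms, which explains why the DAW's rounded display isn't perfectly linear.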
Does ST4 need a finer resolution (e.g. 1/10th of a ms)? Personally, I don't see a need.
Is that even technically possible? 🤷
MIDI was never a 'fast' protocol; it was never designed to be. Some of my devices really struggle to accept MIDI commands with less than 10ms between each command (i.e. some just get dropped/not processed). So I've always tested how tight I can make the timing of consecutive MIDI commands to a device (and then eased off a bit), so I know each device's approximate fastest MIDI command acceptance rate. I then factor that in on a per-device basis when I have to send a bunch of MIDI automation messages in a very tight time window.
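For context, classic 5-pin MIDI runs at 31,250 baud, so a 3-byte message already takes close to 1ms on the wire, and many devices need far longer to process it. The per-device "ease off a bit" approach above could be sketched like this (the gap table, device names and `send_bytes` callback are my own hypothetical placeholders, not anything from ST3):

```python
import time

# Hypothetical per-device minimum gaps, measured by testing how tight
# each device tolerates consecutive commands, then padded with headroom.
DEVICE_MIN_GAP_MS = {
    "reverb_pedal": 12.0,   # struggled below ~10 ms, so give it 12
    "synth_a": 3.0,
}

_last_sent: dict[str, float] = {}  # device -> monotonic time of last send

def send_throttled(device: str, message: bytes, send_bytes) -> None:
    """Send a MIDI message, waiting if the device's minimum gap hasn't elapsed."""
    gap_s = DEVICE_MIN_GAP_MS.get(device, 10.0) / 1000.0  # default to a safe 10 ms
    wait = _last_sent.get(device, 0.0) + gap_s - time.monotonic()
    if wait > 0:
        time.sleep(wait)    # hold off until the device can accept another message
    send_bytes(message)     # whatever MIDI output call you actually use
    _last_sent[device] = time.monotonic()
```

In a tight automation burst you'd route every message through `send_throttled`, so fast devices get their messages quickly while the slow pedal's commands are automatically spread out.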