When I set the cutoff to 1000Hz and the frequency where my sound is no longer audible is 260Hz, a bipolar macro kills the sound at 7%, and a unipolar macro kills it at -4%. However, 740Hz (the difference between 260Hz and 1kHz) is neither 7% nor 4% of 1000Hz, but 74%.
When I set Decay to 1s, a macro of -6% reduces it to 0.5s, which is in fact 50% of 1s.
When I set an LFO to 1Hz, a macro value of 2% scales it to 2Hz.
This is something that has bothered me too since the day I got my Play+.
I guess the official answer to this would be that the scaling is linear and covers the whole range:
for cutoff it is 0Hz…20kHz
for decay it is 0s…10s
for LFO rate it is 0Hz…100Hz.
So 1% of range for cutoff is 200Hz, 1% for decay is 0.1s, and 1% for LFO rate is 1Hz.
The parameter values you set in your patch are not taken into account whatsoever: 10% for cutoff is always 2kHz in both directions, whether you set the cutoff to 500Hz or 10kHz.
The same logic applies to modulation as well. This makes my patches way harder to tweak and manage.
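To make the behavior concrete, here is a minimal sketch of how the scaling seems to work, assuming a macro percentage is applied to the parameter's full range and ignores the patch value entirely (my own illustration, not anything confirmed by Polyend — the names and ranges are taken from the posts above):

```python
# Assumed full ranges, taken from the discussion above (not a documented spec).
FULL_RANGE = {
    "cutoff_hz": (0.0, 20_000.0),
    "decay_s": (0.0, 10.0),
    "lfo_rate_hz": (0.0, 100.0),
}

def apply_macro(param: str, patch_value: float, macro_pct: float) -> float:
    """Offset patch_value by macro_pct of the parameter's FULL range,
    clamping to that range. The patch value only sets the starting point."""
    lo, hi = FULL_RANGE[param]
    offset = (macro_pct / 100.0) * (hi - lo)
    return min(hi, max(lo, patch_value + offset))

# 10% on cutoff is always a 2kHz offset, regardless of the patch value:
print(apply_macro("cutoff_hz", 500.0, 10.0))     # 2500.0
print(apply_macro("cutoff_hz", 10_000.0, 10.0))  # 12000.0
# A modest negative macro on a low cutoff slams into 0Hz:
print(apply_macro("cutoff_hz", 500.0, -10.0))    # 0.0
```

This also shows why low settings are so fragile: a 500Hz cutoff has only 2.5% of headroom downwards before the sound is gone.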
I think at least cutoff modulation/macros should be logarithmic, based on pitch rather than frequency. If I set the cutoff to 500Hz and modulate it with an LFO, it should go up to 2kHz and down by the same ratio, rather than down to 0Hz. And if I increase the cutoff to 2kHz, the range should shift with it, e.g. to 1kHz…4kHz.
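The pitch-based behavior proposed above could be sketched like this, assuming the modulation value is interpreted as a signed number of octaves relative to the patched cutoff (the function name and the octaves-per-unit choice are my own assumptions):

```python
# Sketch of logarithmic (pitch-based) cutoff modulation: the mod value
# moves the cutoff by octaves, so the sweep range scales with the cutoff.

def apply_macro_log(cutoff_hz: float, octaves: float) -> float:
    """Shift cutoff by a signed number of octaves (a ratio of 2**octaves)."""
    return cutoff_hz * (2.0 ** octaves)

# +/-2 octaves around a 500Hz cutoff: up to 2kHz, down to 125Hz, never 0Hz.
print(apply_macro_log(500.0, +2))   # 2000.0
print(apply_macro_log(500.0, -2))   # 125.0
# Raising the cutoff to 2kHz shifts the whole +/-1 octave range with it:
print(apply_macro_log(2000.0, +1))  # 4000.0
print(apply_macro_log(2000.0, -1))  # 1000.0
```

The key property is that the modulation depth is a ratio, so it sounds the same (in musical terms) wherever the cutoff sits.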
But then the bug is that the unit is wrong: the unit for percentage points is pp, p.p., or %pt., not %.
I guess the root cause is that all parameters (maybe with the exception of Osc. Volume) have a resolution of 100 steps that is mapped to the actual range of the parameter.
True.
Good point. Maybe that linear response is also why I find something extremely odd in the response to volume changes, which makes both velocity and the mixer much harder to use than on other devices.