Hey guys,
I'm working on something that involves a lot of delay lines and it's taking up a lot of CPU. I noticed that the normal delay in core uses 4 samples for interpolation, and I'm wondering if anyone has a more efficient version? I only need linear interpolation, which as far as I understand is just crossfading between the two neighbouring samples by the fractional part of the delay time.
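Here's roughly the kind of thing I have in mind (just a sketch, not tested, and the names are made up by me, not from core):

```cpp
// Rough sketch of a delay line with linear interpolation.
// LinearDelay is a placeholder name, not an existing class.
#include <vector>
#include <cmath>
#include <cstddef>

struct LinearDelay
{
    std::vector<float> buffer;   // circular buffer of past samples
    std::size_t writeIndex = 0;

    explicit LinearDelay (std::size_t maxSamples)
        : buffer (maxSamples, 0.0f) {}

    // Push one input sample and read back a fractionally delayed sample.
    float process (float input, float delaySamples)
    {
        buffer[writeIndex] = input;

        const std::size_t size = buffer.size();

        // Split the delay into integer and fractional parts,
        // keeping it inside the buffer.
        const float clamped = std::fmax (0.0f, std::fmin (delaySamples, float (size - 2)));
        const std::size_t whole = std::size_t (clamped);
        const float frac = clamped - float (whole);

        // The two samples we crossfade between: "whole" and "whole + 1" samples ago.
        const std::size_t i0 = (writeIndex + size - whole) % size;
        const std::size_t i1 = (i0 + size - 1) % size;

        // Linear interpolation = crossfade by the fractional part.
        const float out = buffer[i0] + frac * (buffer[i1] - buffer[i0]);

        writeIndex = (writeIndex + 1) % size;
        return out;
    }
};
```

Does that look right, or is there a cheaper/cleaner way people usually do this?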
Thanks!