I remember an excellent post by an NI staff member explaining why multicore is problematic for realtime music apps. It was probably on the old forum. Does anyone remember it? Or are there other resources on the net where it's explained?
I know DAWs can put different tracks on different cores, but as far as I know a single big Reaktor ensemble can't be calculated on more than one core.
Is it the realtime aspect of the whole thing, where synchronizing calculations across different cores would use too many resources?
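To illustrate what I mean, here's a toy sketch (not real Reaktor code, just hypothetical modules I made up): a signal chain where each module consumes the previous module's output is inherently serial, and a feedback filter is even serial sample-by-sample. Splitting such a chain across cores would need a core-to-core handoff every buffer, which for small realtime buffers could cost more than it saves.

```python
def osc(n):
    # toy "oscillator": produce a ramp buffer of n samples
    return [i / n for i in range(n)]

def filt(buf, coeff=0.5):
    # toy one-pole lowpass: each output sample depends on the PREVIOUS
    # output sample (feedback), so work inside the module is sequential
    out, prev = [], 0.0
    for x in buf:
        prev = coeff * x + (1 - coeff) * prev
        out.append(prev)
    return out

def gain(buf, g=0.8):
    return [g * x for x in buf]

# gain can't start before filt finishes, and filt can't start before osc
# finishes. Putting the stages on different cores would mean synchronizing
# every audio buffer (e.g. every 64 samples, about 1.5 ms at 44.1 kHz).
buffer = gain(filt(osc(64)))
print(len(buffer))
```

So my guess is the chain has to run in order on one core anyway, and only independent chains (like separate DAW tracks) parallelize cheaply. Is that roughly the argument?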
Thanks