What's the general status of Reaktor?

Jon Watte
Jon Watte Member Posts: 41 Tri
edited March 24 in Reaktor

I notice that none of the "OS version X compatibility" and "M1 silicon compatibility" pages even mention Reaktor, even though they mention a bunch of other Komplete synths.

No mention of Reaktor: https://support.native-instruments.com/hc/en-us/articles/360014683497-Apple-Silicon-M1-Compatibility-News

Why is this? Is Reaktor splitting off from Native Instruments? Is it a legacy that will never be supported in later OS-es and hardware? Is it somehow its own special snowflake that has a totally different set of support articles I haven't been able to find?

Where would I go to find out whether-and-when on Reaktor support for M1 silicon?


Comments

  • Mutis
    Mutis Member Posts: 85 Tri

Don’t hold your breath with a company this x86-centric (as stated on the old board, and discussed on the new one when some of us pointed out the lack of awareness around the Apple Silicon transition).

The good news is that the tools for translating code are in place, so sooner or later things should fall into place too. The question is what things, and what place…

  • colB
    colB Member Posts: 155 Saw

The good news is that the tools for translating code are in place, so sooner or later things should fall into place too

That's my point though - tools for translating code would allow them to easily make their existing Reaktor code run on Apple Silicon, but it would still generate x86 binaries. Which would be useless.

They need some completely new code for the core compiler to generate ARM binaries. There is no point in translating their old code, so translation tools are useless.

    I'm sure they are well on the way to getting native M1 compatibility, but it is unrealistic to expect it to happen as quickly as for other apps where translation tools can be applied to the whole process.

  • Mutis
    Mutis Member Posts: 85 Tri
    edited March 8

Translation tools to get x86 code onto M1, including exporting binaries, I should have said. ;)

    These tools include Qt.

  • colB
    colB Member Posts: 155 Saw
    edited March 9

Remember, every time you initialise a Reaktor ensemble that includes core (that's most of them), Reaktor runs a multi-part compilation process turning all those boxes and wires into binary machine code.

A tool that can turn x86 binaries into ARM binaries is only useful here if it is free to package into your product without licensing restrictions, and can convert any large binary optimally in milliseconds, every time, without fail and without human tweaking or testing... do they have that?

    It would have to be open source as well otherwise Apple would then control the future of your flagship product...

  • Trevor Meier
    Trevor Meier Member Posts: 37 Tri

It would be good to hear some official word from someone at NI about the future of Reaktor. They’ve made public posts with roadmaps for Kontakt and Maschine, but on the Reaktor front it’s crickets.

  • Mutis
    Mutis Member Posts: 85 Tri

I agree with both of you, and I have my own concerns too, so time will tell whether the open source path is taken or Apple offers a solution for what @colB has described. Anyway, don’t expect an answer from Native (or at least not a clear, non-corporate one), probably because the moderators/power users aren’t allowed to disclose that info. It’s best to look into Qt and the under-the-hood frameworks and figure it out for yourselves.

Maybe it’s also time to check alternatives like libpd or JUCE. Reaktor looks too old-fashioned, but I’m not so versed in that, so I’m probably wrong.


    Time will tell.

  • Jon Watte
    Jon Watte Member Posts: 41 Tri

    I understand that Reaktor generates code -- it's the right way to make fine-grained configurability work at high performance!

    I also understand why Rosetta isn't a good answer for Reaktor -- it needs a much lower-level implementation.

Technically, one could have hoped that they might be using an abstraction layer like LLVM for their code generation back-end, which could simplify the transition. But given that Reaktor has a very long history ("it is very experienced"), they probably started it before LLVM was a real thing, and thus have their own back-end; and ARM SIMD works just differently enough from x64 SIMD to make an engineer sad.

That's actually not the main question I'm asking. The question I'm asking is why the documentation on the M1 transition and OS version transitions doesn't even mention Reaktor. There are a bunch of tools listed as "not on M1 yet" on those pages, and a few (mainly Kontakt) that are. But Reaktor is in neither list. Which seems like a... very large omission?

    I had good luck with @Jeremy_NI promising a documentation update in another thread, so here's hoping lightning can strike twice if I call his name :-D

  • Matt_NI
    Matt_NI BerlinAdministrator Posts: 685 admin
    edited March 9

@Jon Watte Reaktor 6 isn't listed in the KB article because it's not officially supported. If you check the document, we're only listing what is compatible, either via Rosetta 2 or natively. It's not legacy or splitting from the brand, but I imagine we can't work on everything at once, especially considering there are other compatibility topics besides Silicon.

    Unfortunately, we don't really have a timeline available for Reaktor 6 at the moment. We can see products like Maschine, Komplete Kontrol, Massive X or even Guitar Rig 6 Pro are in the making which means Reaktor is not a "coming very soon" sort of thing.

  • colB
    colB Member Posts: 155 Saw
    edited March 9

    Technically, one could have hoped that they might be using an abstraction layer like LLVM for their code generation back-end, which could simplify the transition

An extra set of abstraction layers for a VM would slow down the compilation process. Code gets compiled at every initialisation: when you load a preset, it recompiles; when you make an edit; when you reset the audio engine..

    I would imagine slowing that down significantly is not an option?

    I expect that there are parts of the process that will translate easily - building dependency graphs etc., more abstract parts of the compilation process. The main problem would be the code generation itself with optimisations, on a completely different architecture.

Also expect that the dev team is small - just big enough to handle maintenance and to ship small-to-medium updates every few years. Expecting the same team to quickly rewrite the most complex and challenging parts of the code base seems unrealistic. Particularly when targeting a technology they have no experience with.

    Just seems to me that we should all just be patient. And those who can't handle this sort of difficult transition should maybe look towards Apple for answers and for slinging mud.

I'm sure Apple could have approached the transition in a way that made things easier for developers like NI. But they didn't, so go hassle them ;)

  • Jon Watte
    Jon Watte Member Posts: 41 Tri

An extra set of abstraction layers for a VM would slow down the compilation process.

At the risk of getting a bit too technical for the audience: there's nothing extra there. You need a compiler back-end that can JIT-compile, and LLVM is one such back-end. It doesn't add any "extra" layering compared to anything you write yourself (assuming you configure and use it appropriately). There may be other reasons not to use that particular one (including the particular trade-offs chosen in that library), but "an extra layer" doesn't seem like a reason to me, because no matter what, you need a library to poke the data dependency and control graph at, and get code out of.

Anyway: right now, a reasonable human reading the available communication from Native Instruments will believe that Reaktor isn't even on the roadmap, and is abandonware.

  • colB
    colB Member Posts: 155 Saw

Anyway: right now, a reasonable human reading the available communication from Native Instruments will believe that Reaktor isn't even on the roadmap, and is abandonware.

I hope not! It has been very quiet for many months, though.

    I expect it's just that the devs are very busy with native M1 support, and now with more of an NI presence on the forums from Matt, Jeremy et al., devs don't need to check here at all... ? 🤞

  • colB
    colB Member Posts: 155 Saw

    but "an extra layer" doesn't seem like a reason to me, because no matter what, you need a library to poke the data dependency and control graph at, and get code out of.

    A virtual machine involves an extra layer, because you have a new 'language' of virtual bytecode instructions.

    So with a virtual machine, you

    • do all the dependency graphs and whatnot
    • then you compile that into bytecode instructions... that's your target machine code (virtual of course)
    • then you use a (hopefully near optimal pre-existing, developed by someone else) stage to convert the bytecode into machine code for whatever platform you are using.

    without a VM you

    • do all the dependency graphs and whatnot
    • then compile that directly into machine code instructions

There is no way around that. The whole point of a VM is that it's an extra layer of abstraction to improve portability. The cost is an extra layer of processing. In many cases it's a price worth paying, because in many contexts a little extra time to deal with JIT lag, and slightly slower binaries, is not an issue. And you might be targeting many platforms, so the idea of only having to maintain one target, the VM, is extremely attractive.

Unfortunately, audio DSP is one area where that would be problematic, for various reasons (particularly control of timing, and missed opportunities for exploiting platform-specific optimisations; JIT and GC would be problems, not benefits). On top of that, Reaktor also requires an extremely fast compilation process, so adding another translation layer would be... suboptimal.

Then there's the fact that Reaktor has never targeted more than two CPU architectures, and only one since 2006(ish), when Apple went Intel. In that context, it wouldn't make any sense to use a VM.

  • Jon Watte
    Jon Watte Member Posts: 41 Tri
    edited March 9

LLVM is not mainly a virtual machine, although LLVM IR data can be interpreted if you really want to. LLVM is mainly a retargetable compiler code generation back-end/library, used by clang, rustc, and a number of other compilers, FWIW. Yes, LLVM has an IR, but so does any other code generation back-end, whether you wrote it yourself or not. LLVM is not a JIT; it's a code-generator back-end.

Native code built by Xcode for ARM M1 Macs uses clang, which uses LLVM for codegen. Your DAW hosts and native plugins are most likely all compiled through LLVM, especially if they're M1-native. (There may also be a smattering of assembly involved, but native intrinsics generally do the job well enough.) It's pretty good infrastructure, as far as such things go. I think the fact that LLVM was not a production-quality library when Reaktor was first built has much more to do with it!

  • colB
    colB Member Posts: 155 Saw

    That's pretty cool. I haven't read up on anything like that for many years... just assumed LLVM was relating to some sort of VM. Pretty stupid name really, calling it LLVM, then insisting that it's not an acronym 🙄

    ...Turns out it was originally Low Level Virtual Machine... it can be a JIT but doesn't have to be...

...doing some more searching, I can't find any examples of LLVM being used with dataflow-paradigm languages; it seems to be more in line with control-flow stuff, C++-alikes, etc.

    Are there any examples? Can the LLVM optimisation system 'understand' Dataflow code the way it 'understands' control flow based paradigms?
