What's the general status of Reaktor?
Comments
Proven technology, lots of software, lots of hardware, and lots of experience designing such CPUs. Good compilers. Not many CPU architectures have survived that long, if any.
Yes, it may be difficult to replace x86. But mainly, there is no real reason to do so.
Apple makes four different CPUs (for PCs) and that's it. Take it or leave it. The x86 platform has a hundred or so differently scaled CPUs. Everyone will find something that suits them, more or less.
And Apple does not sell its CPUs, so they cannot spread beyond Apple's own machines.
Intel and AMD are able to switch to ARM if needed; they already produce ARM chips. But there would have to be demand. Why would users want to switch to ARM? AMD notebook CPUs, if produced on the same process as the M1, would consume about the same power while being faster. So the only advantages of Apple Silicon are its specialized coprocessors and faster access to memory. x86 may extend its instruction set, or coprocessors may come as external add-ons. And DDR5, bigger caches and possibly more memory channels will erase most of Apple Silicon's advantages.
And by the way, x86 processors are internally a sort of RISC: each x86 instruction is internally translated to an almost RISC-like micro-op code. There is a sort of hardware Rosetta.
0 -
Rust is very much a data flow language
Most of my searches for "dataflow programming in Rust" lead back to two related projects from the same team, "timely dataflow" and "differential_dataflow". From a quick look they seem to be dataflow frameworks written in Rust, rather than the Rust language itself being a dataflow language?
Although they might be implementations of a virtual 'dataflow architecture' model, which it would seem is not the same as the dataflow programming paradigm (discovered this distinction just now... on Wikipedia, so it must be true!).
Various info on the Rust language mentions many different paradigms. Here's an example from an intro tutorial: "Rust is a multi-paradigm programming language, which supports imperative procedural, concurrent actor, object-oriented and pure functional styles. It also supports generic programming and metaprogramming, in both static and dynamic styles."
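For what it's worth, the closest everyday Rust gets to a dataflow *feel* is probably iterator chains, where each stage consumes the previous stage's output like boxes and wires. A sketch (my own toy illustration, nothing official from the Rust docs):

```rust
// Toy "pipeline": source -> gain stage -> noise-gate stage -> sink.
// This is ordinary imperative Rust under the hood; only the shape is
// dataflow-like.
fn gated_gain(samples: &[f32]) -> f32 {
    samples
        .iter()
        .map(|s| s * 2.0)            // "gain" stage
        .filter(|s| s.abs() > 0.5)   // "noise gate" stage
        .sum()                       // sink: collapse the stream
}

fn main() {
    let out = gated_gain(&[0.1, -0.5, 0.9, -0.2]);
    println!("{}", out);
}
```

Still not a dataflow *language* in the formal sense, of course; the pipeline shape is just a style.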
'The Book' linked from the main rust language site doesn't contain the word 'dataflow'.
None of the resources I can find on Dataflow programming languages mention Rust (Wikipedia, Stack overflow, various papers)
Definitely need to read more about Rust though, so thanks for that - every day's a school day :)
0 -
Let NI finish making Massive X work on x86 first... :)
/uj
0 -
My suggestion is that *the rust compiler* is a dataflow application.
The reason for this is that Rust tracks object ownership at compile time and must account for single ownership at all times, so the graph it builds internally to do this is a dataflow graph of ownership, which in turn determines where the Rust compiler emits "destroy this object" code (as well as where it gives error messages to programmers who write improperly constructed programs).
Maybe this is too "conceptually dataflow" rather than, like, "these boxes and arrows show how data flows," or a data-processing language or library like Pandas or Flink or whatever, compared to what you're looking for?
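To make the ownership-graph idea concrete, here's a minimal sketch (my own toy example; the atomic counter exists only to make the compiler-inserted destruction observable):

```rust
use std::sync::atomic::{AtomicUsize, Ordering};

// Counts how many times the compiler-inserted destructor actually ran.
static DROPS: AtomicUsize = AtomicUsize::new(0);

struct Voice;

impl Drop for Voice {
    fn drop(&mut self) {
        DROPS.fetch_add(1, Ordering::SeqCst);
    }
}

// Taking `Voice` by value moves ownership in; when this function
// returns, ownership ends, and the compiler emits the Drop call here.
fn consume(_v: Voice) {}

fn drops_so_far() -> usize {
    DROPS.load(Ordering::SeqCst)
}

fn main() {
    let a = Voice;
    consume(a); // `a` is moved; using it after this line would not compile
    println!("drops so far: {}", drops_so_far()); // 1
}
```

The compiler decided where that destructor runs by following the flow of ownership, which is exactly the graph described above.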
Anyway, I have very high confidence that LLVM would generate well-optimized code (including SIMD) for audio DSP, for a variety of architectures, the most important of which are x64, ARMv8, and CUDA/PTX.
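For instance, a loop shaped like this (my own sketch; vectorization assumes release-mode optimizations, and the exact SIMD codegen depends on the target):

```rust
// A tight, branch-free loop like this is exactly what LLVM's
// auto-vectorizer turns into SIMD code (SSE/AVX on x64, NEON on ARMv8).
fn apply_gain(buf: &mut [f32], gain: f32) {
    for s in buf.iter_mut() {
        *s *= gain; // simple body, no branches: vectorizes well
    }
}

fn main() {
    let mut buf = vec![0.25_f32; 64];
    apply_gain(&mut buf, 2.0);
    println!("{}", buf[0]); // 0.5
}
```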
Which brings me to ANOTHER crazy Reaktor-adjacent idea I want to see realized: wouldn't it be cool to multiply the throughput of Reaktor a hundredfold by running it on a high-end GPU? Latencies would be higher (probably no better than a 400 Hz control rate at best, and double-buffered 64-sample buffers or bigger), but the throughput could be absolutely astounding. One of those little NVIDIA Jetson modules could probably be used to build a hardware-based synth that did the same thing, too.
Uh, anyway.
0 -
There are two real reasons to replace x64 as a CPU.
The first is actually technical: the x64 architecture makes a number of architectural dictates that make really low-power and really high-scalability implementations very hard. Not only the complex instruction decoder with multiple layers of prefixes, which can be worked around with MOAR TRANSISTORS at a small cost, but the strongly ordered (TSO) memory model gets in the way of many-core implementation performance, and requires significantly EVEN MOAR TRANSISTORS to implement. The ARM memory model is less strict, which allegedly allows for easier multi-core implementations, and the allegedly simpler instruction set allows for fewer transistors spent on decode, which matters for lower power. (Then again, RISC-V has an even simpler instruction set, if that's where you want to go. And companies like SiFive are going there!)
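A minimal sketch of what that memory-model difference looks like at the source level (Rust here for concreteness; the cost gap between the two orderings is a property of the hardware, not something visible in the source itself):

```rust
use std::sync::atomic::{AtomicU64, Ordering};

static COUNTER: AtomicU64 = AtomicU64::new(0);

// A plain event counter needs no ordering guarantees. On ARM, Relaxed
// really is cheaper, because the hardware is allowed to reorder more.
fn bump_relaxed() {
    COUNTER.fetch_add(1, Ordering::Relaxed);
}

// Publishing data between threads needs stronger ordering. SeqCst is a
// full barrier everywhere; x64's strong model gives you much of this
// whether you asked for it or not, which is the transistor cost above.
fn bump_seqcst() {
    COUNTER.fetch_add(1, Ordering::SeqCst);
}

fn main() {
    bump_relaxed();
    bump_seqcst();
    println!("{}", COUNTER.load(Ordering::SeqCst)); // 2
}
```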
However, the most important one is not technical but business-based: there's an effective duopoly in x64 CPUs, and it is essentially impossible for a third party to break into this market. The amount of licensing lock-up that exists between Intel and AMD makes it impossible for a third party to innovate in the "x64 compatible" market. Meanwhile, ARM is a somewhat-more-open standard, and both NVIDIA and Apple have built high-performance ARM cores on their own (although NVIDIA seems to have abandoned theirs lately). And RISC-V is based entirely around the idea that the ISA is open for competition. I believe that a market with lower barriers to entry and higher real competition will fill demand better, faster, and cheaper over time. It's the part of the "free market" that actually works in our favor as consumers and humans :-)
2 -
ARM is also licensed; there are more producers, but x86 also used to have many more of them. And even now there are not only AMD and Intel in x86: there is a producer in China and possibly other countries (Russia, India)...
RISC has its advantages, and x86 has others. ARM is more energy efficient, but mainly because x86 never tried very hard; there was no need.
And concerning the silicon needed to achieve similar CPU power (x86 vs. ARM): Apple Silicon does not convince me that ARM needs less silicon. Quite the opposite: an Apple Silicon core uses roughly double the silicon of an AMD Zen 3 core...
And the M1 Ultra is a real monster...
ARM has its place mainly in areas where power consumption matters, many threads are needed, not much single-core power is needed, and software dependence is not an issue: small appliances, data servers.
ARM's advantage for Apple is that it unifies the codebase across phones, wearables, tablets and computers, and Apple also gains margin by producing the CPUs for its own computers. Whether that is sustainable over a longer timeframe, the future will show. I strongly doubt it, but I may be wrong...
0 -
I literally came to the community today because of this. To run Reaktor ensembles in Studio One I have to run Studio One in Rosetta, which works, but it's not why I switched up to an ARM/M1 Mac. Honestly, I wish NI would stop dropping expansions biweekly (because from a consumer standpoint that's what it seems like) and focus on their "core" products ;)
0 -
Honestly, I wish NI would stop dropping expansions biweekly (because from a consumer standpoint that's what it seems like) and focus on their "core" products
That would make sense if it were the same folk developing sample expansions and also maintaining/developing the core compiler systems for Reaktor.
Is that likely?
The problem here is that the size of dev team required for maintenance and small updates every few years is different from the size of dev team needed for what is basically a total low-level rewrite. And in such a specialised area, finding folk to build the team up short-term would be somewhat challenging. What are you gonna do? Spend $$$ to train up a bunch of new, very expensive specialist developers, then just sack 'em all when the job is done? Or just take longer to get the job done, knowing that then the existing team understands the new system well... and that in the meanwhile, Apple users can use Rosetta anyway?
It's always risky as a consumer being an early adopter of any new tech. This is no exception ;)
0 -
finding folk to build the team up short-term would be somewhat challenging
It's not that hard, if you're willing to out-bid Apple, Google, and Facebook for people with experience and skills.
Most European companies don't have that ability, and most professional audio companies aren't that rich.
0 -
So it is challenging, isn't it?
And Europe is not the US. You have language barriers and generally much lower labor mobility. And money is only interesting up to a certain level: friends, parents, kids, family are more important than more money. One may offer more money, but what about grandma, the cat, the garden with carrots, the apple tree...?
What coIB had in mind was that in the long run it is probably better to get fewer people, but long-term. They would stay, or at least most of them, and carry on with Reaktor development...
0 -
What coIB had in mind was that in the long run it is probably better to get fewer people, but long-term. They would stay, or at least most of them, and carry on with Reaktor development...
Partly that, but also that there are very few applications like Reaktor where a visual dataflow language is directly compiled into machine language. So even cream-of-the-crop, Google-level devs would still need to be trained up by the existing team; it's unlikely they would have directly applicable experience. During this time development might slow rather than speed up... The very small existing team would probably have to be part of the recruitment process too... Better just to suck it up and get on with it.
Would be cool to get some fresh top brains on it though - maybe they could find a solution to the problem of slow compilation in certain situations, a long-standing issue that has been attacked but never solved. And it could be an opportunity to add some of the functionality that Core has been missing since 2005 (doubt it though)
Whatever, I'm really hoping we will all see some benefits from the devs taking a fresh deep dive through the core compiler code... you never know :)
0 -
even cream-of-the-crop, Google-level devs would still need to be trained up by the existing team
A bunch of them are happy hobbyist electronic musicians, letting Google pay the bills and making tunes on the weekends. I know some of them myself! One of them works on his own VST3-host tracker/DAW in his spare time, as an exercise in C++ code perfection (like, optimizing EVERYTHING). You can also find them in the media/driver team of any major hardware platform, especially those who just can't stand doing enterprise software for the paycheck and have to at least get close to the tunes.
Even Unreal Engine is building its own real-time synth engine to support physically modeled sound effects in games!
This kind of work can easily be done remotely, so the cat and the kids' schools aren't particular problems. It's entirely a question of being prepared to pay what the market demands for the skills you need.
0 -
It can and cannot be done remotely. A friend of mine worked at a company that had most of its programmers in India, its better programmers in Europe, and headquarters in the US. A nightmare.
Remote work is possible, but only half the people or fewer prefer it, and even fewer want to be purely remote. The majority want to go to the office at least once a week.
And you miss one thing: Reaktor is a DSP compiler to machine code, and a compiler targeting ARM/M1 needs to be written. So it needs someone who understands DSP, who knows how to build ARM compilers, and who has the ears to hear whether it works as expected or not...
I am not saying it is impossible to find the right folks for the job, but it is pretty hard. It is not just a question of money.
If one earns more than one needs, which is true for most good IT professionals, there must be motivation other than money. And do not forget that the perceived value of money is logarithmic: when you offer to double someone's income, they may perceive it as only 5 or 10% more, not as double. The work must be interesting for the given person, and so must the working conditions.
------
Don't get me wrong: I am a software developer myself and speak from my own experience, or the experience of developers I personally know.
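To make the logarithmic-perception point concrete, a tiny sketch (the exact law "perceived value = ln(income)" is my own simplifying assumption, purely for illustration):

```rust
// Weber-Fechner-style toy model: perceived value grows with the log
// of income, so each doubling adds a constant ln(2) of "perceived
// units" no matter how high the starting salary already is.
fn perceived(income: f64) -> f64 {
    income.ln()
}

// Relative perceived gain, in percent, when moving from `old` to `new`.
fn perceived_gain_percent(old: f64, new: f64) -> f64 {
    (perceived(new) - perceived(old)) / perceived(old) * 100.0
}

fn main() {
    // Doubling a 50k salary to 100k "feels" like roughly +6%, not +100%.
    println!("{:.1}%", perceived_gain_percent(50_000.0, 100_000.0));
}
```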
0 -
I can think of three people right now off the top of my head I would trust to do such a port, if I were paying with my own money and needed the product done well. One currently works at Facebook, one currently works at Microsoft, one currently works at Roblox. Each of those companies pay Silicon Valley salaries and stock grants. If I think a little harder, I can almost certainly think of more.
The entire problem looks to me like "Native Instruments does not want to pay market dollars for the necessary skills." Which is totally understandable. Pro audio just isn't that large or profitable a market, and I imagine the new owners at NI want higher profits and lower costs, rather than a bigger future investment.
1 -
It is not a port. It is writing a new compiler for a new CPU, from a language they probably do not know, as it is presumably something internal to NI. Most probably they would have to move to Berlin, and I suppose NI would expect them to stay there for several years, the longer the better.
I also know a few people who might be able to do it. But I doubt they would be willing to, even for Silicon Valley salaries...
And maybe NI is not able or willing to pay that money, because in the EU the employer also has to pay a big part of social security, health care and pension funds...
So we both agree that it is not an easy task. ;-)
0