KK software improvements. We do NOT need new S-series MK3 KK Keyboards...

Comments

  • LostInFoundation Member Posts: 4,311 Expert
    edited August 2023

    Detachable browser would definitely be useful.

    I know it sounds weird, but one of the best features of Studio One is its detachable elements.

    It might seem like a small thing, but it enhances the workflow tremendously, especially if you use multiple screens (preferably touchscreens, since Studio One is very touch friendly).

  • Kymeia NKS User Library Mod Posts: 3,863 mod
    edited August 2023

    I don't think you have any idea what is involved in creating controller maps if you think it is that simple. There is no chance of this being possible in the near future for Komplete Kontrol, which hosts many different synths and kinds of plugin (some of which don't even make a sound themselves, such as audio FX) - at least not until AI is much, much smarter than it currently is.

    I know one plugin that is using this sort of approach in the most cutting-edge way, the upcoming SonicCharge Synplant 2, and it took the developer several years to teach the AI how to use just that one fairly basic VA/FM synth.

    Honestly, those of us who tag a lot of stuff know how hard it is even for a human to decide: is this a pad or a soundscape? Is this a lead or a bass? AI would do a very poor job of even those sorts of tags, let alone attributes that require you to look at the patch itself in the synth (not the sound, the patch), such as whether it is monophonic, duophonic or polyphonic, whether it is processed, whether it is sample-based, physically modelled or additive, whether it is synthetic or acoustic, and so on.
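
    To make the point concrete, here is a minimal sketch of what an audio-only auto-tagger actually has to go on when all it can see is a rendered preview of a patch. This is a hedged illustration only: it assumes the Python libraries librosa and NumPy, and the file path, thresholds and pad/bass/lead rule are invented, not taken from any real tagging engine.

    ```python
    # Hedged sketch: coarse features an audio-only tagger could extract from a
    # rendered patch preview. Thresholds and rules are illustrative only.
    import librosa
    import numpy as np

    def describe_render(path: str) -> dict:
        """Extract a few coarse descriptors from one rendered note or phrase."""
        y, sr = librosa.load(path, sr=22050, mono=True)
        centroid = float(librosa.feature.spectral_centroid(y=y, sr=sr).mean())  # brightness
        flatness = float(librosa.feature.spectral_flatness(y=y).mean())         # noisiness
        rms = librosa.feature.rms(y=y)[0]
        attack_s = float(np.argmax(rms >= 0.9 * rms.max()) * 512 / sr)          # crude attack time
        return {"centroid_hz": centroid, "flatness": flatness, "attack_s": attack_s}

    def guess_tag(f: dict) -> str:
        """Toy rule of thumb: slow attack -> pad-ish, dark spectrum -> bass-ish."""
        if f["attack_s"] > 0.5:
            return "pad / soundscape"
        if f["centroid_hz"] < 400.0:
            return "bass"
        return "lead"

    print(guess_tag(describe_render("previews/some_patch.wav")))  # placeholder path

    # Note what is NOT visible here: polyphony (mono/duo/poly), whether the patch
    # is sample-based, physically modelled or additive, how it is processed, etc.
    # Those live in the patch data, not in a single rendered sound.
    ```

    Even the "easy" tags blur together once real patches are involved, which is exactly the difficulty described above.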

  • nightjar Member Posts: 1,305 Guru
    edited August 2023

    I think you are grossly underestimating what will be possible quite soon with the right talent and resources being focused on it... if NI doesn't do it, someone else will.

    In the words of Thomas Dolby.. SCIENCE!

    To get to this automatic NKS 2.0 nirvana, machine learning can look at more than a sonic analysis of the generated audio: it can also analyze screenshots of the instrument/FX GUI, and the code itself, to create a meta file describing the instrument/FX parameters and how best to map them.

    And all of this would be continually improved and expanded the more it is used.
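
    For what it's worth, the deterministic first step of that idea (reading a plugin's parameter list without any AI at all) is already scriptable today. A minimal sketch, assuming Spotify's open-source pedalboard library and a placeholder plugin path; this is not an NI tool or any actual NKS mechanism:

    ```python
    # Sketch: enumerate a plugin's host-visible parameters with pedalboard.
    from pedalboard import load_plugin

    plugin = load_plugin("/path/to/SomeSynth.vst3")  # placeholder path

    for name, param in plugin.parameters.items():
        # raw_value is the normalized 0..1 value the host sees for each parameter
        print(name, getattr(param, "raw_value", None))

    # The hard part an "NKS 2.0" mapper would still face: deciding which of these
    # parameters matter musically, how to group them into pages, and what ranges
    # feel right - the judgement that currently comes from a human sound designer.
    ```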

  • nightjar Member Posts: 1,305 Guru

    NI should not design MK3 keyboard controllers until they have first developed NKS 2.0.

    And NKS 2.0 should be developed first for a desktop screen GUI, pointing device, and QWERTY keyboard.

    Once this is achieved, MK3 keyboard controllers can be designed for performance input + remote control of desktop apps.

  • Kymeia NKS User Library Mod Posts: 3,863 mod
    edited August 2023

    It's obvious that any new hardware will also come with an updated NKS spec. But I very much doubt it will be using machine learning to tag sounds and create controller maps anytime soon. Maybe NKS 10? Even then, I'm not sure I would want to hand over so much control to an AI. It doesn't seem like a positive step forward to me at all.

  • LostInFoundation Member Posts: 4,311 Expert

    Nice discussion, guys. I’m not too much of a fan of AI; I’m more oriented to the idea of NI going back to “musicians doing/knowing what musicians need”. But I must admit some of the arguments nightjar brings to the table could be interesting.

    Let’s hope all this AI movement turns into “humans using technology to improve themselves” and not “humans using technology to avoid working”.

  • nightjar Member Posts: 1,305 Guru
    edited August 2023

    I don't think that ML/AI analysis of the spectral/time domain of generated audio would even be "tagging" in any current sense of the word. The user might use text input as one way to start a search, but they might also offer a reference sound, or "verbalize"/sing/fingertap/?? to seek out a playable patch.

    Case in point: IRL I just heard a distant train. Record that on my phone, offer that as part of my search input along with the text "distant train", and now find me a playable patch that evokes that feeling...

    Text input not tied to the "burned in" tagging of the past is the future.
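
    A rough sketch of what that kind of query-by-example could look like under the hood, assuming pre-rendered previews of every playable patch and using librosa plus scikit-learn; the libraries, paths and MFCC embedding are assumptions for illustration, not NI's actual search:

    ```python
    # Sketch: "offer a reference sound" search over pre-rendered patch previews.
    # Embeddings are just MFCC statistics; file names are placeholders.
    import numpy as np
    import librosa
    from sklearn.neighbors import NearestNeighbors

    def embed(path: str) -> np.ndarray:
        """Small fixed-length embedding: mean and std of 20 MFCCs."""
        y, sr = librosa.load(path, sr=22050, mono=True, duration=10.0)
        mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=20)
        return np.concatenate([mfcc.mean(axis=1), mfcc.std(axis=1)])

    # Index previews of the playable patches (placeholder file names).
    preview_paths = ["previews/patch_000.wav", "previews/patch_001.wav"]
    index = NearestNeighbors(metric="cosine")
    index.fit(np.stack([embed(p) for p in preview_paths]))

    # Query with the phone recording of the distant train.
    query = embed("recordings/distant_train.wav").reshape(1, -1)
    dist, ids = index.kneighbors(query, n_neighbors=min(5, len(preview_paths)))
    for d, i in zip(dist[0], ids[0]):
        print(f"{preview_paths[i]}  (cosine distance {d:.3f})")
    ```

    Text like "distant train" would then be a second signal for re-ranking the same candidates, rather than a tag burned into the library up front.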

  • Kymeia NKS User Library Mod Posts: 3,863 mod
    edited August 2023

    I think that as an adjunct to humans also using their own brains, judgement and experience to identify and classify sounds, that's fine - it could be useful. I just don't believe we are anywhere near the point where it could replace human input, and I think it is likely to generate a lot of mistakes. I have an AI-powered sample librarian that can do something similar already (Sononym); it's useful to a point, but it also gets a lot wrong.

  • JesterMgee Member Posts: 2,639 Expert

    Some people seem to need machines to do the work for them, glad I was born with a brain capable of doing things for myself…

  • LostInFoundation Member Posts: 4,311 Expert

    One thing is for sure: atm AI is generating interesting discussions. And this is stimulating human brains. One point for AI

  • nightjar Member Posts: 1,305 Guru

    My brain prefers to spend more time creating music and less time digging through resources to find what I want.

    Tools that help minimize the time I spend looking for what I want/need/imagine are getting better all the time. Resource management is an excellent application of machine learning.

  • LostInFoundation Member Posts: 4,311 Expert
    edited August 2023

    The only counterargument I can raise to this: digging through resources is another way in which your brain learns new things, and it will therefore further expand what you want/need/imagine.

    Sometimes doing tasks we are not interested in makes us discover things we didn’t know we were interested in, or brings us closer to what we already knew interested us.

    Think about music (since we are on a music forum): listening only to what we like surely will not help us discover other things we may like, or why we like what we like so much, or where it comes from 😏

  • nightjar Member Posts: 1,305 Guru
    edited August 2023

    My brain knows how it prefers to use its time. And I do enjoy puzzles of all types.

    My brain's quest to manage music creation resources extends back to around 1991, when I coded a librarian module for the Roland/BOSS SE-50 that Roland later added to the product's supplemental software.

    So I know that there is fun to be had in the design of these tools... BUT... my brain knows when it wants to get into a creative music flow and not encounter distracting friction from resource management.

  • LostInFoundation Member Posts: 4,311 Expert

    Are you sure your brain is not an undercover AI making you think you decided on your own what you decided?😏

    😂😂😂

  • nightjar Member Posts: 1,305 Guru

    Hahahaha

    I guess it might be hard to tell...
