NKS Quality Control

Comments

  • Kubrak
    Kubrak Member Posts: 2,790 Expert

    Well, one could use both: a tagging system run by humans and computer-provided tagging based on analysis. A similar system works in Traktor, where one may use one's own tags, company-provided tags, and tags/features based on track analysis.

    Generally there is room for all three worlds: company tags, user tags and analysis tags. It could be cool if one marks a preset/sample he likes and gets a list of similar ones. Whatever "similar" means... There are tens to hundreds of thousands of presets/samples in the catalog. Rather overwhelming... Some sort of smart searching would be great.

    But one cannot even get a stupidly-simple-to-implement randomization of the order of presets in a list... So, not much hope for more advanced methods...
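The "mark one preset, get a list of similar ones" idea above is typically implemented as nearest-neighbour search over per-preset feature vectors. A minimal sketch in Python follows; the preset names and hand-made feature values are hypothetical stand-ins for real analysis output:

```python
import math

# Hypothetical per-preset feature vectors (e.g. brightness, attack
# softness, noisiness) -- in practice produced by audio analysis.
PRESETS = {
    "Bright Grand": [0.9, 0.8, 0.1],
    "Dark Pad":     [0.2, 0.1, 0.3],
    "Mellow Piano": [0.5, 0.7, 0.1],
    "Noise Sweep":  [0.4, 0.2, 0.9],
}

def cosine(a, b):
    """Cosine similarity between two feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def similar_to(name, k=3):
    """Rank the other presets by similarity to the marked one."""
    ref = PRESETS[name]
    ranked = sorted(
        (p for p in PRESETS if p != name),
        key=lambda p: cosine(ref, PRESETS[p]),
        reverse=True,
    )
    return ranked[:k]

print(similar_to("Bright Grand", k=2))
```

With a catalog of hundreds of thousands of presets, the vectors would come from actual audio analysis and the linear scan would be replaced by an approximate nearest-neighbour index; the ranking logic stays the same.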

  • nightjar
    nightjar Member Posts: 1,287 Guru

    The AI/ML analysis would not do any word-based tagging... and any sort of stored analysis would be easily cleared and re-done whenever desired.

  • JesterMgee
    JesterMgee Member Posts: 2,603 Expert
    edited August 2023

    Waste of time, next.

    Stop derailing every damn thread with your dreams and hopes for the utopian future. We get it, you want iPads, AI and everything done automatically for you at the touch of a virtual button. Point noted, not gonna happen, move on man...

  • Kymeia
    Kymeia NKS User Library Mod Posts: 3,763 mod
    edited August 2023

    Sorry, but this doesn't make any sense to me. If a preset has metadata, that metadata isn't going to change in any meaningful way unless the preset itself changes, in which case it would be a new preset. So of course preset metadata should be fixed; otherwise it will be of no use in 'describing' that preset. I can see you are trying to speculate about possible futures, which is fine, but this is far from being a viable plan and it is certainly unlikely to be in the next version of KK (at least not any time soon).

  • nightjar
    nightjar Member Posts: 1,287 Guru

    My comments are 100% on target with the thread.

    As you stated the problem as the thread originator:

    So it's a given that when users make NKS presets there will be issues with the quality/consistency of things like tagging, often making a bit of a mess and it's sometimes difficult to pick up on this especially given the broken way the tagging in KK works these days (not removing old unused/inaccurate tags even after being fixed)

    I proposed the solution that I am 100% certain is forthcoming. It is inevitable.

  • nightjar
    nightjar Member Posts: 1,287 Guru

    Any metadata associated with a preset would be as temporary as the period between a user's analysis runs with a particular body of algorithms. This metadata would not be directly user-interpretable; instead it would be a data scheme encoding various degrees of satisfying search criteria.

    The metadata associated with the instrument's architecture and available parameters would be more permanent with a baseline supplied by the developer.

  • JesterMgee
    JesterMgee Member Posts: 2,603 Expert

    I wouldn't even try to make a point, he has a very narrow view of the future without the ability to see the actual requirements to get there.

    There is not enough info in a 2 second sample preview to make his dream a reality, so it would require every preset to be loaded one by one (which would take huge amounts of time and likely crash constantly, based on my experience) until enough sonic data could generate an accurate match. This would also have to take into consideration how sounds can be modulated: whether they are splits or layers of sounds, whether they change based on velocity, whether they have vibrato on the AT channels... There are so many variables that a human doing the tagging can know quickly (also by looking at the developer's class of category in the plugin), but an "AI" would have to work through every combination to actually make anything useful, and it would still throw things in completely wrong. Just consider the way Omnisphere has sounds that may sound like a piano, but they are more soundscapes.

    It can work for simple sounds like drum hits, but for complex sounds of a massive variety it would just make a dog's breakfast.

    I am sure there will be some flaky counter-argument on how it could work based on a YouTube video he has seen; regardless, as you and others have stated, it will never happen.

  • JesterMgee
    JesterMgee Member Posts: 2,603 Expert
    edited August 2023

    I was proposing to allow users to edit tags, to allow easier management, and of course for the NI department to check official tagging before releasing libraries, not for AI to make a mess of things. If you want to sit here and offer your fantasies, I can of course shoot them down with reality. Nothing you propose will work how you believe it will, because you lack understanding of a few key points. What experience and examples can you offer that work for this kind of thing? Remember, we are not dealing with audio files here like music files; we are dealing with synth instruments that need to be loaded and analysed. How much memory would all this need? How much extra CPU overhead? How much time is needed? There are so many actual real-world metrics you leave out. Anyone can dream things up, but to make things a reality is a different matter.

    NI will not be developing and implementing it any time soon (ever), no one else on the forums here has been asking for it, and music production just does not need that kind of tech. Almost all of the rest of us are grounded in reality and have much simpler, more realistic requests for features. People still use hardware from the 1960s to make music, well, those with talent who can just make things work regardless of the limitations anyway. Just a few fixes to the system is all that is needed, not a whole reimagining of it, but I know this won't change your mind. You're just going to be forever disappointed, because the majority of people out there are not looking for this kind of solution in this area.

  • nightjar
    nightjar Member Posts: 1,287 Guru

    From quote:

    There is not enough info in a 2 second sample preview to make his dream a reality....

    If an NKS preview is long enough for human assessment, it's long enough for AI/ML to make the same assessments and more. No need to load.

    And the analysis would simultaneously "know" the capabilities of the instrument generating this preset to make determinations beyond realtime human assessment.
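The preview-only analysis described above can be illustrated with a toy example: extracting a couple of crude audio features from a short clip without loading the instrument. This is only a sketch under that assumption; a production system would use learned embeddings rather than two hand-picked features:

```python
import numpy as np

def preview_fingerprint(samples, sr):
    """Crude 'analysis' features from a short preview clip:
    spectral centroid (a brightness proxy, in Hz) and RMS level.
    A real system would compute a learned embedding instead."""
    mags = np.abs(np.fft.rfft(samples))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sr)
    centroid = float(np.sum(freqs * mags) / np.sum(mags))
    rms = float(np.sqrt(np.mean(samples ** 2)))
    return {"centroid_hz": centroid, "rms": rms}

# Synthetic 2-second "preview": a 440 Hz sine standing in for an
# NKS sound preview file.
sr = 22050
t = np.arange(2 * sr) / sr
clip = 0.5 * np.sin(2 * np.pi * 440.0 * t)
print(preview_fingerprint(clip, sr))
```

On the sine "preview" the centroid lands near 440 Hz; numbers like these, stored per preset, are the sort of non-word-based data a search scheme could rank against.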

  • JesterMgee
    JesterMgee Member Posts: 2,603 Expert
    edited August 2023

    If an NKS preview is long enough for human assessment, it's long enough for AI/ML to make the same assessments and more. No need to load.

    Sometimes it is not enough to classify a sound correctly and requires loading the preset (from experience). Also, as humans we read the info from the plugin and have an idea, based on the actual type of plugin, what the likely sound will be. A Pad sound can also sound like a soundscape, but when looking in the plugin and seeing it is an Atmos, we know it is a Soundscape. How would an AI read all the different types of browsers in every instrument? How would it know how, or even be technically able, to do that while analysing a sound? How would it be done, and how long would it take? Can the user use their computer in the meantime? Where is that info stored, and would it always be the same for every user who does the same thing? Where is the AI trained, and controlled by whom? Would NI need servers to do all this? Would you need an internet connection, and would it need to be on all the time, or would everything be stored offline somehow, where, and how would it be updated?... My god, the questions you leave unanswered...

    The AI/ML analysis would not do any word-based tagging... and any sort of stored analysis would be easily cleared and re-done whenever desired.

    I am actually curious now how it would work, what your idea is. To even bother going any further: how would one find a sound such as a Grand Piano with a Bright character, as I always try to find this kind of sound? You say there would be no tags, so how would I find a piano? Would I have to type it in? How can I do that from my hardware?

    Also, how exactly does this work:

    and any sort of stored analysis would be easily cleared and re-done whenever desired

    What is being analysed (sound samples or the instrument preset itself)? How is it stored and where, how is it cleared and redone, how long does that take, can it be done from the hardware, and how exactly do you achieve all these things? I have no clue how you would manage any of this, which is why I feel it would never be possible.

    And the analysis would simultaneously "know" the capabilities of the instrument generating this preset to make determinations beyond realtime human assessment.

    Explain how exactly would this work?

  • nightjar
    nightjar Member Posts: 1,287 Guru
    edited August 2023

    From the quote:

    I was proposing to allow users to edit tags and allow easier management and of course for the NI department to check official tagging before releasing libraries, not for AI to make a mess of things. 

    Your expectation for the "NI department" to check official tagging = labor costs.

    You really think that the constant "fire-hose" of new presets from all sources will not cause NI to seek out means of lowering labor costs while simultaneously improving results and turnaround time?

    You are the dreamer, not me.

    That's like expecting humans to be doing facial recognition in photo libraries.... which is getting more amazing day by day.

  • JesterMgee
    JesterMgee Member Posts: 2,603 Expert

    Your expectation for the "NI department" to check official tagging = labor costs.

    Duh, that is WHAT THEY DO ALREADY. They check every official NKS library submitted by devs to make sure it fits their requirements. It takes a few seconds to check tagging.

    NI do not check user-generated ones; we do this ourselves, and people like me and Kymeia who know this stuff both agree your idea is fanciful and has FAR too many unexplained details on how it would even be possible. It's just a pipe dream that you always bring up, in every possible thread you can, without ever explaining more than "it would work like this, and you could imagine it like that..."

    I just want you to detail, based on an existing example of the same concept or tech, how it would work, how it would be cheaper and better than it is already, and how people would use it.

  • JesterMgee
    JesterMgee Member Posts: 2,603 Expert

    That's like expecting humans to be doing facial recognition in photo libraries.... which is getting more amazing day by day.

    Absolutely nothing like that.

    We are talking simple correction of mistakes, not trying to analyse a face against a database of millions which is what autonomous processes are good for.

    I have used "auto tagging" software in many cases and it always gets things wrong. Sonically it detects something correct, but the class is wrong. Sure, it can get better, but honestly I would take a curated library from someone who knows what they are doing over an AI approximation of anything.

    AI lacks the ability to make a choice based on feeling. Just because all the signs point to a sound being a string does not mean it is a string, it may be better classed as a Pad when it comes from a plugin like PadShop.

    We just do not need AI to solve every little problem, just like not every solution needs an iPad crammed into it.

  • nightjar
    nightjar Member Posts: 1,287 Guru
    edited August 2023

    From the quote:

    AI lacks the ability to make a choice based on feeling. Just because all the signs point to a sound being a string does not mean it is a string, it may be better classed as a Pad when it comes from a plugin like PadShop

    You are looking backward, not forward.

    NI has the intellectual assets of iZotope RX10 and a huge incentive to capitalize on any advantage they can gain through innovation that would be hard for others to match.
