KK3 UX Feedback

Comments

  • spicemix
    spicemix Member Posts: 68 Member
    edited January 8

    Setting up my expression pedal was a great demonstration of the value of the GOMS game. This was so confusing and needlessly difficult.

    1. Why should I need to use the standalone app to set up my MIDI? Count all the steps needed to do that and you lose the game. Your coopetitor Orchestral Tools has not only their installer app but their product store right inside their plugin (which, OK, has lots of problems and, given the visible delay in development, may have lost its lead engineer, but still shows where things are headed in UX).
    2. Far worse, when you click the MIDI port icon, and then click Pedals, and then arrow over to Pedal B, you get a gray Pedal B. Why? You actually have to click that gray Pedal B to activate Pedal B. Why? (And doesn't gray mean it can't be clicked?) Until you click that gray Pedal B, the controls for configuring Pedal A are still there and active, although "Pedal A" is not shown. Why? And if you change those controls, you're actually still configuring Pedal A. But it says Pedal B, albeit gray.

    Not only have you lost the GOMS game, you have created a total, reeking UI mess here. "Oh but that's a rarely used feature" Please. No utterly unprofessional dodges. You have created more work for the user and the programmer alike with zero benefit whatsoever. I struggled with this for about 10 minutes to figure out what was going wrong and I doubt I'm alone. Whereas it would have SAVED you more than ten minutes to simply have the arrow button that moves to Pedal B just go ahead and select Pedal B without the additional click. You may have had costly hardware returns over this embarrassing software design flaw, as people understandably concluded the pedal ports were broken.

    Make a spreadsheet of every function in the product and run GOMS on it. Who can find opportunities to minimize user gestures without forcing them to remember things or read the manual? Reward those finds. Don't get pantsed out in public like this.

  • spicemix
    spicemix Member Posts: 68 Member
    edited January 8

    While we're complaining, why not add a particularly annoying one for Native Access. I move a few libs around on different drives, now I need to relocate them. Wow does this simple and common task lose the GOMS game.

    For all the filters available, the most valuable one, "Libs needing repair" is completely missing. Instead I have to carefully scroll through my entire collection looking for a little orange circle with a ! in it. Then I have to click that, click relocate, navigate the old-fashioned file dialog to the proper folder, do a couple more needless clicks and click away a useless confirmation dialog if somehow I got that right. All of this for every single library needing repair.

    To win the GOMS game? File systems nowadays have strong indexing features, and finding a Kontakt library isn't exactly a needle in a haystack. I know Germans have an enormous anxiety about looking STASI-ish, that's why they destroyed the entire Web with cookie warnings, and they might feel very nervous searching people's hard drives for relocated Kontakt libraries. But I think we will forgive you for just doing this for us. Or at least allowing us to press a single button that searches all our drives for all our missing libraries and sews things up accordingly. I mean, what's the value of an installer app if I have to do all this work for it to just index a folder properly?
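
    And the scan itself is trivial to write. Here's a minimal sketch (Python, purely illustrative, not NA's actual code) that walks a set of drive roots and maps every folder containing a .nicnt marker file, which Kontakt libraries ship with, back to its location; pointing NA's database at the result is the part only NI can do:

```python
import os

def find_kontakt_libraries(roots):
    """Walk the given drive roots and collect folders that contain a
    .nicnt marker file (Kontakt libraries ship with one).

    Returns {library_name: folder_path}. A sketch only: a real scanner
    would skip system folders and handle permission errors.
    """
    found = {}
    for root in roots:
        for dirpath, dirnames, filenames in os.walk(root):
            for name in filenames:
                if name.lower().endswith(".nicnt"):
                    # Marker filename (minus extension) doubles as the
                    # library name in this sketch.
                    found[os.path.splitext(name)[0]] = dirpath
    return found
```

    One button, one walk per drive, every broken library repaired at once.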

    Oh and a constant source of irritation with NA: when you copy a serial number off a website and paste it into Add Serial, it doesn't trim the paste for you. Often you get whitespace that wrecks the paste of the serial number, and you need to run it through another text field that will trim it before you paste into NA. Just trim whitespace when pasting for us!
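
    The fix is a one-liner in any language. A sketch of what "trim on paste" means (a hypothetical clean_serial helper, not NA code); it strips every whitespace character, including invisible ones like non-breaking spaces that web pages love to smuggle into a copied serial:

```python
import re

def clean_serial(pasted: str) -> str:
    """Remove all whitespace from a pasted serial number.

    \s on a Python str matches Unicode whitespace too, so non-breaking
    spaces copied from a web page are stripped as well. Serials contain
    no whitespace, so removing it all is safe.
    """
    return re.sub(r"\s+", "", pasted)
```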

  • nightjar
    nightjar Member Posts: 1,307 Guru

    My biggest issue with the KK3 UX is the whole concept & presentation of categorical tagging. It seems archaic to me.

    The laundry list of Sound Type and Character is comical.

    There needs to be a spectral/time-domain analysis of the sonic characteristics of NKS audio previews that creates an objective "fingerprint" for each preset: gradient levels of different sonic characteristics that machine learning could digest to offer us found groups of presets in so many new and powerful ways.
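
    To make the idea concrete, here is a toy sketch (Python/NumPy, with features I invented for illustration; nothing to do with NKS internals) of turning a preview waveform into a few continuous spectral measurements that a model could soft-correlate with words like "bright" or "sub", instead of a human hard-assigning tags:

```python
import numpy as np

def sonic_fingerprint(samples, sr=44100):
    """Crude spectral 'fingerprint' of an audio preview.

    Returns three continuous features (illustrative, not a standard):
    centroid (brightness), 85% rolloff frequency, and spectral flatness
    (noisiness). A real system would add many more and feed them to a
    clustering or similarity model.
    """
    windowed = samples * np.hanning(len(samples))
    spectrum = np.abs(np.fft.rfft(windowed))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sr)
    power = spectrum ** 2
    total = power.sum() + 1e-12
    # Centroid: power-weighted mean frequency -- higher means "brighter".
    centroid = (freqs * power).sum() / total
    # Rolloff: frequency below which 85% of the energy sits.
    rolloff = freqs[np.searchsorted(np.cumsum(power), 0.85 * total)]
    # Flatness: geometric/arithmetic mean ratio -- near 1 for noise.
    flatness = np.exp(np.mean(np.log(spectrum + 1e-12))) / (spectrum.mean() + 1e-12)
    return np.array([centroid, rolloff, flatness])
```

    Nearest neighbors in that feature space, not a checkbox labeled "Airy", is what a 2024 browser should be built on.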

    Tagging is labor intensive, subjective, and rather stupid to be doing in 2024.

  • spicemix
    spicemix Member Posts: 68 Member

    Right, you have XLN XO and Waves COSMOS and the like as browsable starfields of sounds. Not even so new at this point.

    It's possible that the current state of KK3 is just an MVP to get the thing out the door and the platform is designed to develop out into something more sophisticated that isn't ready yet. The hazard being AI-generated audio outpaces this development enough to render it obsolete commercially by the time it ships. But us ole-timers may still enjoy handcrafting music for a while.

  • BIF
    BIF Member Posts: 700 Pro

    It's possible that the current state of KK3 is just an MVP to get the thing out the door...

    The MVP (Minimum Viable Product, for those who might not know the lingo) probably was KK 3.0.0, in combination with whatever firmware came with it.

    And you're probably right; the MVP was to get the thing out the door. I imagine having shipping crates and containers full of product just sitting around (regardless of where it was or whose loading dock it was blocking) probably was beginning to cause blockages in the supply chain. So yeah, it had to go out the door in order for the pipes to become available for the "steady state" of everyday business.

    I suspect we're well past the MVP stage now, and well into the iteration stage.

    What always makes me nervous about NI is that iteration only happens until it doesn't. And then the product gets killed by leadership. But we're probably good at least for 2024. I hope the dev teams are busy iterating as much as they can, now that the holidays have passed.

  • spicemix
    spicemix Member Posts: 68 Member

    Yes I should say to the senior management that they are 98% there and that last 2% is something that annoys me and a few power users but isn't worth throwing the baby out with the bathwater.

    There are some sensitive devs, but the better of them will be happy someone literate in the craft is paying this close attention to their work and urging better. Think of an athlete...would they prefer to be ignored, or booed and criticized for missing the goal? A star athlete would be annoyed they aren't booing harder when they miss. It's the management that wants to silence the critics, because it's often the management who are responsible for misplaced priorities.

  • nightjar
    nightjar Member Posts: 1,307 Guru

    How is this not a ridiculous way to be prompting searches in 2024?


  • spicemix
    spicemix Member Posts: 68 Member

    I'm not so against tag clouds as they do recognition over recall, and offer a single click interaction. I'm much more annoyed with the alphabetical-only results list, and the exclusion of user presets and favorites by default. At least the tag cloud is already filled out for us, probably not perfectly, but there's no way I'm adding to it myself. I will click the favorite button though, because they force me to.

    I think "User presets" is too broad a category because it also means "3rd parties who refuse our vigorish." I want a category that is my own presets. I suppose I have to favorite them...but...here we have another bug.

    • When I edit and overwrite a favorited preset, it's no longer a favorite. No bueno. Fix dat.

    Anyway, a UI that dances about architecture is an interesting question. The real value of AI is sonic inpainting: OK tell me what it is I want to use in this track I'm working on right now. A ranked list of those options, ranked for context, would be what we actually want.

  • nightjar
    nightjar Member Posts: 1,307 Guru

    If I want recognition and single click.. let me make my own list of favorite prompt words that work in conjunction with a sound's "sonic fingerprint".

    No hard-assignment of descriptive words.. only metadata of the engine & parameters

  • BIF
    BIF Member Posts: 700 Pro

    That tag cloud is really a thunderstorm cloud.

    I hardly ever use those myself. For example, to me these are virtually useless, either because they have as many different meanings as there are people with ears, or because they have no meaning at all, simply because we have ignored what words mean for so long.

    Airy, Bright, Clean, Dark, Deep, Digital, Dirty, Evolving, Granular, Lick, Processed, Sub, and Synthetic.

    "Clean" could describe an unprocessed guitar or other sound, but come on...all of this stuff is processed in order to be put into one of these libraries from NI or one of its partners. There might be "appearing to be clean sounding" sounds, but probably not any that are truly clean.

    I don't think I've ever heard a "dark" piano sound and said, "oooh, that's dark!" One time, I had an audio engineer tell me that one of my piano patches sounded "processed". And it was the opening patch on the Korg Triton, so OF COURSE it's processed with all kinds of reverb, EQ, and just about anything else you'd put on your premier synth's default piano sound!

    What is "evolving"? Is that a sound that gets more complex if you hold down a key, or is it a sound that goes from one channel to another each time you press a key? Or does it add stuff the higher you go on the keyboard? Or lower on the keyboard? And if it's that, why isn't it EvolvingUp?

    I would rather just keep my categories simple. The name of the base instrument or a name of an orchestra section, or a voicing such as "Lead" or "Pad". Once we get beyond those, we start to lose meaning for some users, and we ultimately end up outsmarting ourselves.

    I know what a Hurdy Gurdy is. If it's described as "Hurdy Gurdy to Hell and Back", then it had better have some distortion on it! But somebody else might expect it to have some wah on it too.

    Okay, back to my quest to learn the difference between "dark" and "bright".

  • spicemix
    spicemix Member Posts: 68 Member

    Wow I could talk piano tone for centuries. Was my specialty for years. Anyway the first knob on Noire that goes Soft to Hard will be dark to bright. The original instrument itself can have "the juice" hardener on the hammers for bright, or needling to soften that up, or the soft pedal shifting the hammers to a less beaten down spot for very dark. Then there's the mics and placement...I would have over the Hamburg a pair of Josephsons aimed at the hammers, perhaps ribbons in Blumlein at the bow, and Schoeps in ORTF and distant. The hammers would be bright and crisp, the ribbon dark and wide, the ORTF very natural as if you were there. I would generally aim for natural balance but in context of a mix that can go out the window.

    Many people can't mix, the sound library has to do the mixing for them, that's OK, in that case these descriptors can help them. Having simple knobs to adjust that is great. Having whole plugin chains with a single page of knobs controlling the most useful parameters across the chain would be even better, but sadly not there yet. Finish the job.

  • BIF
    BIF Member Posts: 700 Pro

    I could go for what you suggest, but only if all vendors come to agreement on what makes something "dark". And it shouldn't just be limited to a single instrument.

    For example, we've all agreed on pitch. Not only is it inarguable that C#4 is HIGHER in pitch than C4; we have agreed that it is exactly a half-step higher, or 100 cents. Likewise, it's inarguable that G4 is a 5th above C4, and that holds true for all pitched instruments.
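
    That agreement is literally arithmetic. A small illustrative calculation (equal temperament and A4 = 440 assumed, nothing vendor-specific):

```python
import math

# Equal temperament: every semitone is the same frequency ratio,
# 2**(1/12), and by definition spans 100 cents.
SEMITONE = 2 ** (1 / 12)

def cents(f1, f2):
    """Size of the interval from f1 up to f2, in cents: 1200*log2(f2/f1)."""
    return 1200 * math.log2(f2 / f1)

c4 = 261.63                  # C4 in Hz with A4 = 440 tuning
cs4 = c4 * SEMITONE          # C#4: one semitone (100 cents) up
g4 = c4 * SEMITONE ** 7      # G4: a perfect fifth (7 semitones, 700 cents) up
```

    Nothing like that formula exists for "dark", which is the whole problem.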

    But if something is "dark" or has an effect of "being dark" on piano, then that same characteristic should be called "dark" if it can be applied to a violin or a trombone. Otherwise, it's just confusing.

    In photography, for another example, everybody knows what "depth of field" and "bokeh" mean. Each of these is a thing. So when you go to a drawing application or a 3D rendering application and you want to use a stopped-down depth of field or apply a "bokeh effect", you already know what to expect. Bokeh = bokeh. A high F-Stop will give you a wide depth of field. Always and forever.

    And that's what I want in the audio world. I don't feel we're there yet.

  • nightjar
    nightjar Member Posts: 1,307 Guru
    edited January 10

    NKS2 could have been something far different than what it is...

    Rather than continue the archaic path of categorization by clumsy "hard-assigned" descriptive words.. NI could have introduced a method to create their standard for "sonic fingerprints" that could then be soft-correlated to descriptive words via machine learning.

    Much could be conceptually "borrowed" from how images are analyzed and managed.

    The tech from iZotope is right on target for this to happen.

    Could be that the hardware for S-Mk3 was too far along for a change in strategy when the iZotope merger happened.

  • spicemix
    spicemix Member Posts: 68 Member
    edited January 10

    Like we were saying, AI will disintermediate this process. We will have sound about sound, not words about sound, not images about sound. The AI will be like a magician with a giant bag of sonic tricks and keep pulling stuff out and let us zoom in on what we wanted aurally, in context. If you look at say, Photoshop Generative Fill, we can get the same kind of thing for music, you will choose among options that all fit the context perfectly, mixed perfectly, the right vibe and everything.

    There are a few things holding that up, and this discussion is a nice example. Humans are faster at visual categorization than sonic, so it's hard to get good labeling of sound. Also, musicians are loath to share their multitrack data so we can only train models on full arrangements rather than parts. But it will get done soon enough.

  • nightjar
    nightjar Member Posts: 1,307 Guru
    edited January 10

    AI is getting better at pulling parts out of arrangements at a surprising rate...

    And there are multitracks made for "sound-like-the-hits" purposes that I'm sure owners would be willing to sell for training rights... and the data doesn't need to be "hits".. just well arranged
