What role do "frameworks" typically play in GUI development? What are the implications for the UX?
(created by Gott -> he simply has more knowledge)
Frameworks provide structure and reusable components in GUI development, which:
Accelerates implementation
Encourages consistency across interfaces
Can limit creativity if blindly followed
🧠 UX Implication: Frameworks support faster development, but may lead to interfaces shaped by implementation convenience rather than user-centered interaction unless UX is considered from the start.
Describe the UX design perspective on the use of (development) frameworks.
From a UX perspective:
Frameworks should be adapted to fit user needs, not the other way around.
Designers must closely link gesture and interface development (Slide 6).
The framework should support intuitive feedback, minimize false positives, and enable natural interaction flow.
Why is user differentiation not always a good idea?
(Not explicitly defined in one slide, but implied)
Over-personalizing or creating different interactions for different users may lead to inconsistencies, higher learning curves, and unpredictability.
It breaks the goal of self-descriptiveness and simplicity (Slides 9–10, 37–38).
Why are false positives particularly problematic in gesture-based interactions?
The system does something even though the user did nothing.
Users learn through feedback -> a false positive gives false feedback
-> the user may learn a wrong gesture.
The user doesn't know how to undo an action when the system reacted to a wrong gesture.
Gott:
They cause actions to trigger without intent, breaking the sense of control.
Especially critical in NUIs where gestures are ambiguous or similar.
Result in frustration and confusion (Slide 5).
Slide 4 emphasizes avoiding primitive or easily confused gestures.
Which is more fatal in NUIs: false positives or false negatives?
False positives (the user doesn't know what they did and learns the wrong gesture)
False negatives -> the user simply performs the gesture again (not a big problem)
Slides focus more on false positives as more problematic:
They trigger unwanted actions.
Users may not know how to undo them.
Slide 5 suggests closely analyzing both but emphasizes avoiding similar gestures and improving feedback to reduce false positives.
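The false-positive/false-negative tradeoff can be made concrete as a confidence threshold on a recognizer's output: lowering the threshold triggers more unintended actions, raising it makes users repeat gestures. A minimal sketch (all names and sample values are hypothetical, not from the slides):

```python
# Sketch: how a confidence threshold trades false positives against
# false negatives in a gesture recognizer. All names are hypothetical.

def classify(confidence: float, threshold: float) -> bool:
    """Trigger the gesture action only above the threshold."""
    return confidence >= threshold

def evaluate(samples, threshold):
    """samples: list of (confidence, was_intended) pairs."""
    false_pos = sum(1 for c, intended in samples
                    if classify(c, threshold) and not intended)
    false_neg = sum(1 for c, intended in samples
                    if not classify(c, threshold) and intended)
    return false_pos, false_neg

# Simulated recognizer outputs: (confidence, user actually meant the gesture)
samples = [(0.95, True), (0.80, True), (0.55, True),
           (0.60, False), (0.40, False), (0.20, False)]

# A permissive threshold triggers unintended actions (false positives);
# a strict one makes the user repeat gestures (false negatives).
print(evaluate(samples, threshold=0.5))  # -> (1, 0)
print(evaluate(samples, threshold=0.9))  # -> (0, 2)
```

Since false positives are the more damaging error in NUIs, a recognizer would typically be tuned toward the stricter side, combined with good feedback so a repeated gesture is cheap for the user.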
What needs to be considered in the scope of gesture-based interfaces?
Do not limit your ideation with the question "Will the user guess?"
Use this question to optimize your feedback
Do not create only primitive gestures on the basis of this question
Take a closer look at false negatives and false positives
Try to understand the gesture recognition software
Inform the user through affordances
Avoid similar gestures
From Slides 5–6 and 34–36:
Minimize false positives/negatives
Design gesture recognition closely with UI
Focus on manipulation-based gestures
Ensure cause-effect consistency
Use affordances and self-descriptive design
Carefully visualize gesture phases and system states
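The last point, visualizing gesture phases and system states, can be sketched as a small state machine where each phase drives its own feedback. The phase names, events, and feedback strings below are illustrative assumptions, not from the slides:

```python
# Minimal sketch: gesture phases as a state machine, so each phase
# can drive its own feedback. Phase names and events are assumptions.

FEEDBACK = {
    "idle":       "no gesture active",
    "registered": "highlight target (gesture recognized as started)",
    "continuing": "live preview of the manipulation (cause-effect link)",
    "terminated": "confirm result, offer undo",
}

TRANSITIONS = {
    ("idle", "touch_down"):    "registered",
    ("registered", "move"):    "continuing",
    ("continuing", "move"):    "continuing",
    ("continuing", "release"): "terminated",
    ("registered", "release"): "idle",       # too short: treat as no-op
}

def step(state: str, event: str) -> str:
    """Advance the gesture state and emit phase-appropriate feedback."""
    state = TRANSITIONS.get((state, event), state)  # unknown events keep state
    print(FEEDBACK[state])
    return state

state = "idle"
for event in ["touch_down", "move", "move", "release"]:
    state = step(state, event)
# Ends in "terminated": feedback has confirmed the result and offered undo.
```

Keeping feedback tied to explicit states makes the cause-effect link visible at every phase, which is exactly what the checklist above asks for.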
What is the role of feedback in NUIs?
People learn from feedback -> the right feedback has to be given.
What are the challenges related to the self-descriptiveness of gesture languages?
Learning curve of shortcuts (gulf of competence)
It takes time and effort to learn a better gesture for something -> motivation to change is low
The transition (e.g., learning a shortcut) should be supported
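One way such a transition could be supported is by hinting at the efficient gesture after the user has repeatedly taken the slow path. This is a hypothetical sketch (class name, hint text, and threshold are my assumptions, not from the slides):

```python
# Sketch: supporting the transition to a more efficient gesture by
# hinting after repeated use of the slow path. Threshold is arbitrary.

class TransitionHinter:
    def __init__(self, hint_after: int = 3):
        self.hint_after = hint_after
        self.slow_path_count = 0

    def used_slow_path(self):
        """Call whenever the user performs the action the slow way.
        Returns a hint string exactly once, at the threshold."""
        self.slow_path_count += 1
        if self.slow_path_count == self.hint_after:
            return "Tip: a two-finger swipe does this directly."
        return None

hinter = TransitionHinter()
for _ in range(3):
    hint = hinter.used_slow_path()
print(hint)  # hint appears on the third slow-path use
```

Showing the hint in the moment of use, rather than in a manual, lowers the effort barrier that keeps motivation to change low.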
What can we learn from the relationship between dominant and non-dominant hand for the design of gesture languages?
Gott (because I have no brain):
Not directly discussed in the slides. However, the principle of context-sensitive linking (Slide 20) and movement-based states (Slides 22–24) imply:
Different hands may perform different functions.
Designers can consider assigning manipulation to the dominant hand, and menu/control gestures to the non-dominant hand, inspired by the concept of gesture layering.