User Interface eXtensible Markup Language (UsiXML) is a formal Domain-Specific Language (DSL) used in Human-Computer Interaction (HCI) and Software Engineering (SE) to describe the user interface of any interactive application independently of any implementation technology. A user interface may involve variations depending on the context of use (in which the user carries out her interactive task), the device or computing platform (on which the user is working), the language (used by the user), the organization (to which the user belongs), the user profile, and the interaction modalities (e.g., graphical, vocal, tactile, haptic).
Showing User Interface Adaptivity by Animated Transitions (Jean Vanderdonckt)
Animated transitions can provide a smooth transition between an initial user interface and an adapted final interface by using intermediate transitional interfaces. Related work shows that animation can be used to direct user attention, visualize processes over time, simplify complex content, and show reactions to interactions. When designing animated transitions, they should be short, support the direction of movement or information, and may be supported by sound. Users should be able to control the animation speed and sequence. An evaluation of transition scenarios found that participants generally reacted positively to the use of animation for interface adaptations.
The language reading direction is probably one of the most decisive factors influencing the successful internationalization of graphical user interfaces, beyond their mere translation. Western languages are read from left to right and top to bottom, Arabic and Hebrew are read from right to left and top to bottom, and some East Asian languages are traditionally read from top to bottom. To address this challenge, we introduce flippable user interfaces, which enable the end user to change the reading direction of a graphical user interface by flipping it into the desired reading direction through direct manipulation. This operation automatically and dynamically changes the user interface layout based on a generalized concept of reading direction and translates it according to the end user's preferences.
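The core layout operation behind such flipping can be illustrated by mirroring widget geometry around the container's horizontal extent. The Python sketch below is a minimal illustration of that idea under stated assumptions; the `Widget` type and the function name are invented for illustration and are not part of the flippable-UI implementation described above.

```python
from dataclasses import dataclass

# Hypothetical widget geometry; names are illustrative, not the paper's API.
@dataclass
class Widget:
    name: str
    x: int      # left edge, in pixels
    width: int

def flip_horizontal(widgets, container_width):
    """Mirror a left-to-right layout into a right-to-left one by
    reflecting each widget's horizontal position within the container."""
    return [
        Widget(w.name, container_width - w.x - w.width, w.width)
        for w in widgets
    ]

row = [Widget("label", 10, 80), Widget("field", 100, 200)]
flipped = flip_horizontal(row, 320)
# The label's left edge moves to its mirrored position: 320 - 10 - 80 = 230
```

A full implementation would also mirror text alignment and icon order, but the positional reflection above captures the generalized reading-direction change.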
3D User Interfaces for Information Systems Based on UsiXML (Jean Vanderdonckt)
For many years, 3D interactive systems have demonstrated benefits in adequately reproducing reality, improving it, and even augmenting it by providing the user with unprecedented actions. 3D User Interfaces (3DUIs) are becoming the primary subject of interest of a growing community of researchers and developers adopting different approaches to specifying and creating them. Providing development methods and software support for 3DUIs is a complex problem. In this paper, we argue that developing 3DUIs for Information Systems is an activity that would benefit from the application of a model-driven development methodology composed of a set of models defined according to an ontology, a language that expresses these models, and a structured method manipulating these models.
Distributed User Interfaces: How to Distribute User Interface Elements across... (Jean Vanderdonckt)
Distributed User Interfaces (DUIs) have become a vivid area of research and development in Human-Computer Interaction (HCI), where dramatic changes are occurring in the way we interact with interactive systems. DUIs attempt to surpass user interfaces that are manipulated only by a single end user, on the same computing platform, and in the same environment, with little or no variation along these axes. In contrast to such existing user interfaces, DUIs enable end users to distribute any user interface element, from the largest to the smallest, across one or many of these dimensions at design time and/or run time: across different users, across different computing platforms, and across different physical environments. In this way, end users can engage in distributed tasks that are regulated by distribution rules, many of which are already used in the real world. This paper provides a conceptual framework that invites us to rethink traditional user interfaces in a distributed way based on the locus of distribution control: in the hands of the end user, under control of the system, or in a mixed-initiative way. Any user interface submitted to distribution may also be subject to adaptation with respect to the user, the platform, and the environment.
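The locus-of-distribution-control idea can be sketched as a simple assignment of UI elements to platforms. The Python below is an illustrative reduction, not the paper's framework: the `Control` enum, the round-robin rule, and all names are assumptions made for the example.

```python
from enum import Enum

# Illustrative sketch of the locus of distribution control;
# all names below are assumptions, not the paper's API.
class Control(Enum):
    USER = "user"      # end user decides where elements go
    SYSTEM = "system"  # system decides automatically
    MIXED = "mixed"    # system proposes, user may amend

def distribute(elements, platforms, control):
    """Assign UI elements to platforms under a toy distribution rule:
    under user control, everything stays on the first (personal) platform;
    under system or mixed control, the system proposes a round-robin spread."""
    if control is Control.USER:
        return {e: platforms[0] for e in elements}
    # SYSTEM and MIXED both start from the same system proposal.
    return {e: platforms[i % len(platforms)] for i, e in enumerate(elements)}

assignment = distribute(["toolbar", "canvas", "chat"], ["phone", "tv"], Control.SYSTEM)
# Round-robin: toolbar -> phone, canvas -> tv, chat -> phone
```

A real DUI framework would attach richer distribution rules (per element type, per user role, per environment) rather than a single global policy.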
This paper suggests a method for developing graphical user interfaces based on generative patterns. A generative pattern contains portions of previously designed user interfaces that are expressed through models that are either partially or totally instantiated. These portions can be identified and re-applied to a new design case by generating code from the specifications contained in the models. The method involves the typical models found in the user interface development life cycle, such as task, domain, abstract user interface, concrete user interface, final user interface, and context models, and the mappings between them. Virtually any model can be the source of a pattern and can be described, searched, matched, retrieved, and assembled with others to create a new graphical user interface. For this purpose, a software tool has been developed that manages generative patterns by combining an existing user interface description language (UsiXML, the User Interface eXtensible Markup Language) with concepts addressing the problems raised by pattern description and matching in a pattern-based language (PLML, the Pattern Language Markup Language, introduced to represent user interface patterns uniformly). Once instantiated from the generative patterns, the models give rise to model-driven engineering based on model-to-model transformation and model-to-code compilation.
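The instantiation step, filling the open slots of a partially instantiated model fragment for a new design case, can be sketched as template substitution. The slot syntax (`${name}`) and the dictionary-based model fragment below are hypothetical simplifications, not what a UsiXML/PLML tool actually stores.

```python
import re

# Hypothetical partially instantiated pattern: some values are fixed,
# others are open ${slots} to be bound for a new design case.
PATTERN = {
    "widget": "inputField",
    "label": "${label}",
    "maxLength": "${maxLength}",
}

def instantiate(pattern, bindings):
    """Replace every ${slot} in the pattern's values with its binding;
    fully instantiated values pass through unchanged."""
    def fill(value):
        return re.sub(r"\$\{(\w+)\}", lambda m: str(bindings[m.group(1)]), value)
    return {k: fill(v) for k, v in pattern.items()}

concrete = instantiate(PATTERN, {"label": "Customer name", "maxLength": 40})
# -> a fully instantiated concrete-UI fragment, ready for model-to-code compilation
```

In the method described above, this substitution would happen at the model level (task, domain, abstract/concrete UI), with the generated models then fed to model-to-model and model-to-code transformations.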
This document provides a brief walkthrough of the basics of sketching and drawing. It covers gathering supplies and choosing a 2D image or a real subject as your focus. For 2D images, it recommends starting simply to build skills and considering scale if resizing the image. For real subjects, it advises starting in the middle and working outwards, lightly sketching shapes before adding detail, shading, and tone.
Towards Canonical Task Types for User Interface Design (Jean Vanderdonckt)
Task models are the cornerstone of user-centred design methodologies for user interface design. Therefore, they deserve attention in order to produce them effectively and efficiently, while guaranteeing the reproducibility of a task model: different persons should in principle obtain the same task model, or a similar one, for the same problem. In order to provide user interface designers with some guidance for task modelling, a list of canonical task types is proposed that offers a unified definition of frequently used task types in a consistent way. Each task type consists of a task action coupled with a task object, each written according to design guidelines. This list provides the following benefits: tasks are modelled more consistently, their definitions are more communicable and shared, and task models can be used efficiently for model-driven engineering of user interfaces.
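The action-object pairing of a canonical task type can be sketched as a small data type with a controlled action vocabulary. The vocabulary and class below are illustrative assumptions, not the proposed canonical list.

```python
from dataclasses import dataclass

# Illustrative controlled vocabulary of task actions; the actual
# canonical list proposed in the paper is richer than this.
ACTIONS = {"create", "select", "modify", "delete"}

@dataclass(frozen=True)
class TaskType:
    action: str  # verb drawn from the controlled vocabulary
    obj: str     # domain object the action applies to

    def __post_init__(self):
        if self.action not in ACTIONS:
            raise ValueError(f"unknown task action: {self.action}")

task = TaskType("select", "payment method")  # reads as "select payment method"
```

Constraining the action to a closed vocabulary is what makes task models comparable across modellers and usable as input to model-driven engineering.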
Computer graphics are images created using computers and include 2D images made with software as well as 3D graphics. They are used for entertainment, charts, graphs, design, and manufacturing. Computer graphics have advanced from early 2D pixel art and vector graphics to modern 3D graphics used in video games, movies, and other applications. The field continues to evolve with more powerful and accessible graphics hardware and software.
Fonda: The User Interface as a Success Factor (Fonda Wien)
The importance of interaction design in software and internet projects for insurance companies.
This talk was given by Alexander Reiberger, managing director of Fonda, at the 5th trade congress "IT für Versicherungsunternehmen" of the Versicherungsforen Leipzig.
The annual trade congress "IT für Versicherungsunternehmen" of the Versicherungsforen Leipzig serves as a discussion platform for current developments and trends. In addition to a specialized exhibitor fair, it offers a lecture program with high-profile keynotes and topic-specific expert and discussion forums. (www.assekuranz-messekongress.de/)
Fonda (www.fonda.at) is a full-service agency for digital media based in Vienna. At Fonda, experts for all key digital competence fields work under one roof. Fonda follows the principle "Besser einfach" ("better simple"). Our conviction: the best ideas only land well when they are clear and easy for the user to understand. We strive to bring clarity and ease of comprehension to every project, however complex the content and technical requirements may be.
Talk by Wolfram Nagel (digiparden GmbH) on "Multiscreen Experience Design" at the Usability Professionals conference 2012 in Konstanz.
The device landscape is becoming ever more dynamic and fragmented. Many users will soon use several different devices, sometimes simultaneously. Information must therefore be available on as many (relevant) screens and output channels as possible. This in turn means that every project must be conceived and designed for multiple screens and output channels from the outset, in order to offer users a "fluid multiscreen experience" that is as seamless as possible. The talk presents principles, patterns, and recommendations to consider when designing multiscreen projects. Its two focal points are content and information management for different screens, and the communication and use of information on mobile devices.
A look into the crystal ball, aiming to predict exciting and relevant online trends for 2004. The list includes:
- Multimodal Interaction
- WAI
- PDF/Acrobat 6
- Blogging and RSS
- Digital Rights Management
- ENUM/E.164
- Anti-Spam
- Google Web API
- Grid Computing
- SOAP 2.0/XMLP
The Mediencampus of Hochschule Darmstadt is to be presented as a distinct brand via a website, and its range of services is to be presented optimally to the different target groups.
The goal is to create a campus website that meets user requirements for the coming years and is resource-efficient in editorial maintenance and technical development.
Agile (Software) Processes - Quo Vadis? [in German] (Martin Gaedke)
[EN] An introductory note about Agile Software Engineering and Agile Management - in three parts: An introduction to Scrum, Agile principles and approaches, and current trends regarding applying Agile in Management, Enterprise Agility, Lean, Kanban, Scalable Agile Framework, Business Model Generation, Holacracy, purpose-driven work environment, Design Thinking, Results-only and Impact-oriented approaches.
Presented at "Innoprofile-Transfer" project-workshop on System Reliability in electric mobility and energy management (a project funded by the Federal Ministry of Education and Research (BMBF)).
On June 23-24, 2014, we.CONECT invited experts and decision-makers from leading companies in the industry to Smart Variant.con 2014 to present and discuss strategies, processes, solution approaches, and concrete projects for realizing a procedurally sound, efficient, and logical variant management.
we.CONECT
Conception and introduction of a responsive (re-)design for a DAX 30 brand: approach, methods, lessons learned.
With the rapid change in mobile usage habits, responsive design has become almost a given in web implementation. But large brand universes with distributed systems pose particular challenges for designers:
# How do you specify module and layout behavior for all site types?
# How do you verify a device-independent brand experience in usability testing?
# How do you standardize the result in a responsive style guide?
# How do you migrate all existing content?
Application Modernization: The Path from Legacy to the Cloud (Aarno Aukia)
Imagine you want to climb a 6000-meter peak. Good preparation, good equipment, and professional knowledge are indispensable.
You bring along a mountain guide who supports you on difficult passages and passes on the right know-how. The heavy equipment can be split across a team that knows exactly how to pack it most sensibly. They also show you which unnecessary ballast you can drop.
In the end you stand at the summit: a great result and the satisfaction of the achievement await you.
That is exactly what the journey of your legacy application feels like.
In this webinar, the three partner companies Object Engineering, Puzzle, and VSHN show you how to keep your applications fit, with insight into how experts analyze applications, refresh them, and ensure their operation.
To the end of our possibilities with Adaptive User Interfaces (Jean Vanderdonckt)
Slides of the keynote presented at the 1st International Workshop on Human-in-the-Loop Applied Machine Learning (HITLAML '23)
September 04 - 06, 2023 - Belval, Luxembourg.
This presentation summarizes the evolution of techniques used to adapt user interfaces to the context of use, which comprises the user, the platform, and the environment.
Engineering the Transition of Interactive Collaborative Software from Cloud C... (Jean Vanderdonckt)
Paper presented at EICS '22: https://dl.acm.org/doi/10.1145/3532210
The "Software as a Service" (SaaS) model of cloud computing popularized online multiuser collaborative software. Two famous examples of this class of software are Office 365 from Microsoft and Google Workspace. Cloud technology removes the need to install and update the software on end users' computers and provides the necessary underlying infrastructure for online collaboration. However, to provide a good end-user experience, cloud services require an infrastructure able to scale up to the task and allow low-latency interactions with a variety of users worldwide. This is a limiting factor for actors that do not possess such infrastructure. Unlike cloud computing, which ignores the computational and interactional capabilities of end users' devices, the edge computing paradigm promises to exploit them as much as possible. To investigate the potential of edge computing over cloud computing, this paper presents a method for engineering interactive collaborative software supported by edge devices as a replacement for cloud computing resources. Our method is able to handle user interface aspects such as connection, execution, migration, and disconnection differently depending on the available technology. We exemplify our approach by developing a distributed Pictionary game deployed in two scenarios: a nonshared scenario where each participant interacts only with their own device, and a shared scenario where participants also share a common device, including a TV. After a theoretical comparative study of edge vs. cloud computing, an experiment compares the two implementations to determine their effect on the end user's perceived experience and on perceived vs. real latency.
More similar content
Similar to Faure vanderdonckt co-summit2013-final
UsyBus: A Communication Framework among Reusable Agents integrating Eye-Track... (Jean Vanderdonckt)
Presentation of ACM EICS '22 paper: https://dl.acm.org/doi/10.1145/3532207
Eye movement analysis is a popular method to evaluate whether a user interface meets the users' requirements and abilities. However, with current tools, setting up a usability evaluation with an eye-tracker is resource-consuming, since the areas of interest must be defined manually and exhaustively, and redefined each time the user interface changes. This process is also error-prone, since eye movement data must be finely synchronised with user interface changes. These issues become more serious when the user interface layout changes dynamically in response to user actions. In addition, current tools do not allow easy integration into interactive applications, and opportunistic code must be written to link these tools to user interfaces. To address these shortcomings and to leverage the capabilities of eye-tracking, we present UsyBus, a communication framework for autonomous, tight coupling among reusable agents. These agents are responsible for collecting data from eye-trackers, analyzing eye movements, and managing communication with other modules of an interactive application. UsyBus allows multiple heterogeneous eye-trackers as input and provides multiple configurable outputs depending on the data to be exploited. Modules exchange data based on the UsyBus communication framework, thus creating a customizable multi-agent architecture. UsyBus application domains range from usability evaluation to the design of gaze interaction applications. Two case studies, composed of reusable modules from our portfolio, exemplify the implementation of the UsyBus framework.
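The agent-to-agent communication described above resembles a topic-based publish/subscribe bus: tracker agents publish gaze data, analysis agents subscribe to it. The minimal Python sketch below illustrates that general pattern; the `Bus` API and topic names are invented for illustration and are not the UsyBus API.

```python
from collections import defaultdict

# Minimal topic-based publish/subscribe bus; an illustrative sketch of
# the agent-communication idea, not the actual UsyBus framework.
class Bus:
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        """Register a handler to be called for every message on a topic."""
        self.subscribers[topic].append(handler)

    def publish(self, topic, message):
        """Deliver a message to every handler subscribed to the topic."""
        for handler in self.subscribers[topic]:
            handler(message)

bus = Bus()
fixations = []

# An analysis agent consumes fixation events published by a tracker agent.
bus.subscribe("gaze/fixation", fixations.append)
bus.publish("gaze/fixation", {"x": 412, "y": 88, "duration_ms": 240})
```

Decoupling producers from consumers this way is what lets heterogeneous eye-trackers and analysis modules be swapped in and out without rewriting the application.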
µV: An Articulation, Rotation, Scaling, and Translation Invariant (ARST) Mult... - Jean Vanderdonckt
Paper presented at ACM EICS '22
Finger-based gesture input has become a major interaction modality for surface computing. Due to the low precision of the finger and the variation in gesture production, multistroke gestures are still challenging to recognize in various setups. In this paper, we present µV, a multistroke gesture recognizer that addresses the properties of articulation, rotation, scaling, and translation invariance by combining $P+'s cloud-matching for articulation invariance with !FTL's local shape distance for RST-invariance. We evaluate µV against five competitive recognizers on MMG, an existing gesture set, and on two new versions for smartphones and tablets, MMG+ and RMMG+, a randomly rotated version on both platforms. µV is significantly more accurate than its predecessors when rotation invariance is required and not significantly inferior when it is not. µV is also significantly faster than others with many samples and not significantly slower with few samples.
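The µV implementation is not reproduced in this summary; purely as a sketch of the cloud-matching idea it borrows from the $-family of recognizers, a greedy point-cloud distance can be written as below. Real recognizers first resample, scale, and translate the clouds and try several start indices; those steps are omitted here:

```python
import math

def greedy_cloud_distance(points_a, points_b):
    """Greedy point-cloud matching in the spirit of the $-family
    (illustrative sketch; normalization steps deliberately omitted)."""
    unmatched = list(points_b)
    total = 0.0
    for ax, ay in points_a:
        # Match each point of cloud A to its nearest unmatched point of cloud B.
        best = min(unmatched, key=lambda p: math.hypot(ax - p[0], ay - p[1]))
        unmatched.remove(best)
        total += math.hypot(ax - best[0], ay - best[1])
    return total

square = [(0, 0), (0, 1), (1, 1), (1, 0)]
shifted = [(0.1, 0.0), (0.0, 1.1), (1.0, 1.0), (1.1, 0.1)]
line = [(0, 0), (1, 0), (2, 0), (3, 0)]
# A noisy square is much closer to the square template than a straight line is.
print(greedy_cloud_distance(square, shifted) < greedy_cloud_distance(square, line))  # True
```

Because clouds discard stroke order and direction, this style of matching gives the articulation invariance mentioned in the abstract; rotation invariance requires a shape-based distance such as !FTL's, which is not shown here.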
RepliGES and GEStory: Visual Tools for Systematizing and Consolidating Knowle... - Jean Vanderdonckt
The body of knowledge accumulated by gesture elicitation studies (GES), although useful, large, and extensive, is also heterogeneous, scattered in the scientific literature across different venues and fields of research, and difficult to generalize to other contexts of use represented by different gesture types, sensing devices, applications, and user categories. To address such aspects, we introduce RepliGES, a conceptual space that supports (1) replications of gesture elicitation studies to confirm, extend, and complete previous findings, (2) reuse of previously elicited gesture sets to enable new discoveries, and (3) extension and generalization of previous findings with new methods of analysis and for new user populations towards consolidated knowledge of user-defined gestures. Based on RepliGES, we introduce GEStory, an interactive design space and visual tool, to structure, visualize and identify user-defined gestures from 216 published gesture elicitation studies.
Gesture-based information systems: from DesignOps to DevOps - Jean Vanderdonckt
Keynote address for the 29th International Conference on Information Systems Development ISD'2021 (Valencia, Spain, September 8-10, 2021). See https://isd2021.webs.upv.es/program.php#keynotes
This talk promotes the Seven I's:
Implementation continuity
Inclusion of end-users
Interaction first
Integration among stakeholders
Iteration short
Incremental progress
Innovation openness
Intra-platform plasticity regularly assumes that the display of a computing platform remains fixed and rigid during interactions with the platform in contrast to reconfigurable displays, which can change form depending on the context of use. In this paper, we present a model-based approach for designing and deploying graphical user interfaces that support intra-platform plasticity for reconfigurable displays. We instantiate the model for E3Screen, a new device that expands a conventional laptop with two slidable, rotatable, and foldable lateral displays, enabling slidable user interfaces. Based on a UML class diagram as a domain model and a SCRUD list as a task model, we define an abstract user interface as interaction units with a corresponding master-detail design pattern. We then map the abstract user interface to a concrete user interface by applying rules for the reconfiguration, concrete interaction, unit allocation, and widget selection and implement it in JavaScript. In a first experiment, we determine display configurations most preferred by users, which we organize in the form of a state-transition diagram. In a second experiment, we address reconfiguration rules and widget selection rules. A third experiment provides insights into the impact of the lateral displays on a visual search task.
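The paper's actual reconfiguration and widget selection rules are not listed in this summary; purely as an illustrative sketch of rule-based widget selection from an abstract interaction unit to a concrete widget, with hypothetical datatypes and thresholds, the mapping idea might look like:

```python
def select_widget(unit):
    """Map an abstract interaction unit to a concrete widget.
    Hypothetical rules in the spirit of model-based widget selection."""
    datatype = unit["datatype"]
    if datatype == "boolean":
        return "checkbox"
    if datatype == "enum":
        # Few options fit radio buttons; many options need a drop-down.
        return "radio_group" if len(unit["options"]) <= 4 else "dropdown"
    if datatype == "string":
        return "text_area" if unit.get("multiline") else "text_input"
    return "label"  # fallback: read-only display

print(select_widget({"datatype": "enum", "options": ["S", "M", "L"]}))   # radio_group
print(select_widget({"datatype": "enum", "options": list("ABCDEFGH")}))  # dropdown
```

In a full model-based pipeline such rules would also take the display configuration into account (e.g., a folded lateral display constraining widget width), which this sketch leaves out.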
Evaluating Gestural Interaction: Models, Methods, and Measures - Jean Vanderdonckt
The document discusses various methods for evaluating gestural user interfaces (UIs), including comparing a UI to a reference model, collecting evaluation data on usability criteria, and using standardized scales and metrics. Common dimensions for evaluation are goals, utility, usability, and factors like system acceptance, ease of use, and cost. Methods mentioned include observations, questionnaires, heuristic evaluations, and measuring task performance and preferences using standardized scales. Guidelines are provided for designing and assessing the usability of different gestures.
Conducting a Gesture Elicitation Study: How to Get the Best Gestures From Peo... - Jean Vanderdonckt
Lecture 3: Conducting a Gesture Elicitation Study: How to Get the Best Gestures From People?
Francqui Chair in Computer Science 2020 VUB, Jean Vanderdonckt, 27 April 2021
This document provides an overview of gestural interaction and various gesture recognition techniques. It begins with definitions of gestures and how they can vary based on factors like the body part used, number of dimensions, whether they are contact-based or not. It then discusses benefits of gestures and examples of gesture recognizers like xStroke and techniques like Rubine, SiGeR, LVS, hidden Markov models, and the $-family of recognizers. The document provides details on properties like stroke, direction, and rotation invariance as well as training and recognition phases for different recognizers.
The document summarizes Jean Vanderdonckt's upcoming lecture on gestural interaction. It will cover the psychological, hardware, software, usage, social and user experience dimensions of gestural interaction. On the psychological dimension, it discusses definitions of gestures and theories of gesture types. On the hardware dimension, it outlines paradigms of contact-based and contact-less gesture interaction. On the software dimension, it provides an overview of gesture recognition algorithms such as Rubine, Siger, LVS and nearest neighbor classification.
User-centred Development of a Clinical Decision-support System for Breast Can... - Jean Vanderdonckt
See the paper at https://www.scitepress.org/Link.aspx?doi=10.5220/0010258900600071
We conducted a user-centered design of a clinical decision-support system for breast cancer screening, diagnosis, and reporting based on stroke gestures. We combined knowledge elicitation interviews, scenario-focused questionnaires, and paper mock-ups to understand user needs. Multi-fidelity (low and high) prototypes were designed and compared first in vitro in a usability laboratory, then in vivo in the real world. The resulting user interface provides radiologists with a platform that integrates domain-oriented tools for the visualization of mammograms, the manual, and the semi-automatic annotation of breast cancer findings based on stroke gestures. The contribution of this work lies in that, to the best of our knowledge, stroke gestures have not yet been applied to the annotation of mammograms. On the one hand, although there is a substantial amount of research done in stroke-based interaction, none focuses especially on the domain of breast cancer annotation. On the other hand, typical gestures in breast cancer annotation tools are those with a keyboard and a mouse
Simplifying the Development of Cross-Platform Web User Interfaces by Collabo... - Jean Vanderdonckt
Ensuring responsive design of web applications requires their user interfaces to be able to adapt according to different contexts of use, which subsume the end users, the devices and platforms used to carry out the interactive tasks, and also the environment in which they occur. To address the challenges posed by responsive design, aiming to simplify their development by factoring out the common parts from the specific ones, this paper presents Quill, a web-based development environment that enables various stakeholders of a web application to collaboratively adopt a model-based design of the user interface for cross-platform deployment. The paper establishes a series of requirements for collaborative model-based design of cross-platform web user interfaces motivated by the literature, observational and situational design. It then elaborates on potential solutions that satisfy these requirements and explains the solution selected for Quill. A user survey has been conducted to determine how stakeholders appreciate model-based design user interface and how they estimate the importance of the requirements that lead to Quill
Detachable user interfaces consist of graphical user interfaces whose parts or whole can be detached at run-time from their host, migrated onto another computing platform while carrying out the task, possibly adapted to the new platform, and attached to the target platform in a peer-to-peer fashion. Detaching is the property of splitting a part of a UI for transferring it onto another platform. Attaching is the reciprocal property: a part of an existing interface can be attached to the interface currently being used so as to recompose another one on demand, according to the user's needs and task requirements. Assembling interface parts by detaching and attaching allows dynamically composing, decomposing, and recomposing new interfaces on demand. To support this interaction paradigm, a development infrastructure has been developed based on a series of primitives such as display, undisplay, copy, expose, return, transfer, delegate, and switch. We exemplify it with QTkDraw, a painting application with attaching and detaching based on the development infrastructure.
The Impact of Comfortable Viewing Positions on Smart TV Gestures - Jean Vanderdonckt
Whereas gesture elicitation studies for TV interaction assume that participants adopt an upright, frontal viewing position, we asked 21 participants to hold a natural, comfortable viewing position, the posture they adopt when watching TV at home. By involving a broad selection of users regarding age and profession, our study targets a higher ecological validity than existing studies. Agreement rates were lower than in existing studies using an upright, frontal viewing position. Participants experienced problems due to (1) having to use their non-dominant hand instead of their dominant hand, (2) their head being in an orientation that made some physical movements more difficult to perform, and (3) being hindered in their movement by the sofa they lay on. Since each person may hold a different position inducing different gestures due to the aforementioned problems, the effect of a comfortable viewing position is analyzed by comparison to gestures for a frontal position.
Head and Shoulders Gestures: Exploring User-Defined Gestures with Upper Body - Jean Vanderdonckt
This paper presents empirical results about user-defined gestures for head and shoulders by analyzing 308 gestures elicited from 22 participants for 14 referents materializing 14 different types of tasks in an IoT context of use. We report an overall medium consensus with medium variance (mean: .263, min: .138, max: .390 on the unit scale) between participants' gesture proposals, while their thinking times were less similar (min: 2.45 s, max: 22.50 s), which suggests that head and shoulders gestures are not all equally easy to imagine and produce. We point to the challenges of deciding which head and shoulders gestures will become the consensus set based on four criteria: the agreement rate, their individual frequency, their associative frequency, and their unicity.
Paper accessible at https://dial.uclouvain.be/pr/boreal/en/object/boreal%3A213794
G-Menu: A Keyword-by-Gesture based Dynamic Menu Interface for Smartphones - Jean Vanderdonckt
Instead of relying on graphical or vocal modalities for searching an item by keyword (called the K-Menu), this paper presents the G-Menu, which exploits gesture interaction and gesture recognition: when a user sketches a keyword by gesturing the first letters of its label, a menu with items related to the recognized letters is constructed dynamically and presented to the user for selection and auto-completion. The selection can be completed either gesturally by an appropriate gesture (the G-Menu) or by touch only (the T-Menu). This paper compares the three types of menu, i.e., by keyword, by gesture, and by touch, in a user study with twenty participants on their item selection time (measuring task efficiency), their error rate (measuring task effectiveness), and their subjective satisfaction (measuring user satisfaction).
Paper accessible at https://dial.uclouvain.be/pr/boreal/en/object/boreal%3A213790
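The G-Menu code is not reproduced here; the dynamic construction of menu items from the letters recognized so far can be sketched as simple incremental prefix filtering (item labels are invented for the example):

```python
def filter_menu(items, recognized_prefix):
    """Keep the items whose label starts with the letters recognized so far
    (illustrative sketch of the dynamic-menu idea, not the G-Menu code)."""
    prefix = recognized_prefix.lower()
    return [item for item in items if item.lower().startswith(prefix)]

items = ["Print", "Preview", "Paste", "Save", "Save As"]
# After the user gestures 'p' and then 'r', only two candidates remain
# and can be offered for selection or auto-completion.
print(filter_menu(items, "p"))   # ['Print', 'Preview', 'Paste']
print(filter_menu(items, "pr"))  # ['Print', 'Preview']
```

Each newly recognized letter narrows the candidate list, so the menu shrinks as the user keeps gesturing, which is what makes the keyword-by-gesture menu dynamic.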
Unistroke and multistroke gesture recognizers have always striven to reach some robustness with respect to all variations encountered when people issue gestures by hand on touch surfaces or with sensing devices. For this purpose, successful stroke recognizers rely on a gesture recognition algorithm that satisfies a series of invariance properties, such as stroke-order, stroke-number, stroke-direction, position, scale, and rotation invariance. Before initiating any recognition activity, these algorithms ensure these properties by performing several pre-processing operations. These operations induce an additional computational cost to the recognition process, as well as a potential error bias. To cope with this problem, we introduce an algorithm that ensures all these properties analytically instead of statistically, based on vector algebra. Instead of points, the recognition algorithm works on vectors between points. We demonstrate that this approach not only eliminates the need for these pre-processing operations but also defines an entire structure-preserving transformation.
Paper available at https://dial.uclouvain.be/pr/boreal/en/object/boreal%3A217006
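The paper's vector algebra is not reproduced in this summary, but its key property, that quantities defined between successive stroke vectors are unchanged by rotation and translation and therefore need no pre-processing, can be illustrated numerically with turning angles (a simplified stand-in for the paper's actual distance):

```python
import math

def turning_angles(points):
    """Signed angles between successive point-to-point vectors of a stroke,
    normalized to (-pi, pi]. They depend only on the stroke's shape,
    not on its position or rotation (illustrative sketch)."""
    vectors = [(x2 - x1, y2 - y1) for (x1, y1), (x2, y2) in zip(points, points[1:])]
    angles = []
    for (ax, ay), (bx, by) in zip(vectors, vectors[1:]):
        d = math.atan2(by, bx) - math.atan2(ay, ax)
        angles.append((d + math.pi) % (2 * math.pi) - math.pi)  # wrap into (-pi, pi]
    return angles

def rotate(points, theta):
    c, s = math.cos(theta), math.sin(theta)
    return [(c * x - s * y, s * x + c * y) for x, y in points]

stroke = [(0, 0), (2, 0), (2, 2), (0, 2)]
a = turning_angles(stroke)
b = turning_angles(rotate(stroke, math.radians(37)))
# The representation is identical for the rotated stroke: no rotation
# normalization step is needed before comparing gestures.
print(all(math.isclose(x, y, abs_tol=1e-9) for x, y in zip(a, b)))  # True
```

Scale invariance would similarly come for free from ratios of vector lengths; the point is that invariance is guaranteed analytically by the representation rather than statistically by pre-processing.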
Body-based gestures, such as those acquired by the Kinect sensor, today benefit from efficient tools for their recognition and development, but less so for automated reasoning. To facilitate this activity, an ontology for structuring body-based gestures, based on the user, the body and body parts, gestures, and the environment, is designed and encoded in the Web Ontology Language (OWL) according to modelling triples (subject, predicate, object). As a proof of concept and to feed this ontology, a gesture elicitation study collected 24 participants × 19 referents for IoT tasks = 456 elicited body-based gestures, which were classified and expressed according to the ontology.
See paper at https://dl.acm.org/citation.cfm?id=3328238
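The OWL encoding itself is not shown in this summary; the (subject, predicate, object) modelling it rests on can be illustrated with plain triples and a simple pattern query. All identifiers below are hypothetical, not taken from the published ontology:

```python
# Hypothetical triples in the (subject, predicate, object) style of the ontology.
triples = [
    ("gesture:nod",   "performedWith",  "bodyPart:head"),
    ("gesture:shrug", "performedWith",  "bodyPart:shoulders"),
    ("gesture:nod",   "mapsToReferent", "iot:turnLightOn"),
    ("bodyPart:head", "partOf",         "body:upperBody"),
]

def query(subject=None, predicate=None, obj=None):
    """Return the triples matching a pattern; None acts as a wildcard."""
    return [(s, p, o) for (s, p, o) in triples
            if subject in (None, s) and predicate in (None, p) and obj in (None, o)]

# Which gestures are performed with the head?
print(query(predicate="performedWith", obj="bodyPart:head"))
```

In OWL the same reasoning would be done with classes, properties, and a reasoner; the sketch only shows why a triple store makes gesture sets queryable rather than just recognizable.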
4. The UsiXML Project
Goals
Co-summit 2013, Scandic - Stockholm
Project Presentation
UsiXML defines, validates, and standardises an open user interface
description language (UIDL), increasing productivity and reusability, and
improving usability and accessibility of industrial interactive applications
using the μ7 concept.
Goal 1: The UsiXML "µ7" concept elicitation and promotion
• Multi-device, multi-platform, multi-user, multi-linguality/culturality, multi-organisation, multi-context, multi-modality
Goal 2: Development of the UsiXML language and the model-driven method
• Standard User Interface Description Language
• New models to capture µ7 aspects
• UI development methodology
Goal 3: Set up development tools and demonstration of the validity on applications
• Tools development
• Usability support
• Validation through demonstrators
5. The UsiXML Project
Market Positioning
The seven μ dimensions (μ7):
• μ Modality: new contexts and constraints imposed to use various modalities; natural user interfaces (voice, gesture…); user experience
• μ Platform: cross-platform consumer & user behaviour
• μ User: users evolving over time and new user profiles appearing constantly; pattern recognition
• μ Context: contextual analysis; anytime, anywhere; Big Data, in-memory computing; digital asset management; analytics
• μ Device: any device (input/output); mobile devices; M2M; Internet of Things
• μ Linguality: applications submitted to internationalisation with new languages, markets, cultures
• μ Organization: applications that need to be extended to multiple organizations; cloud collaborative processes; integrated ecosystems
6. The UsiXML Project
Project members
7. The UsiXML Project
Cameleon Reference Framework (CRF)
[CRF diagram: Task & Domain (T&D) → Abstract User Interface (AUI): an Abstract Container holding Abs. Int. Units (facet=control) → Concrete User Interface (CUI): Window with textInput and button → Final User Interface (FUI)]
Calvary, G., Coutaz, J., Thevenin, D., Limbourg, Q., Bouillon, L., Vanderdonckt, J., A Unifying Reference Framework for
Multi-Target User Interfaces, Interacting with Computers, Vol. 15, No. 3, June 2003, pp. 289-308
8. The UsiXML Project
Cameleon Reference Framework (CRF)
Adopted in W3C by the Model-Based UI (MBUI) Incubator Group (4 May 2010)
http://www.w3.org/2005/Incubator/model-based-ui/XGR-mbui-20100504/
9. The UsiXML Project
How is UsiXML?
• Is open
• Is multi-model
• Is multi-level of abstraction: first in history (2003) to support all
CRF levels
• Supports model-based or model-driven UI
• Is multi-usage
• Is multi-path:
– Forward engineering
– Reverse engineering
– Lateral engineering
– Cross-cutting
10. The UsiXML Project
Forward Development Method
[Diagram: Task & Domain (T&D) models edited with the UsiTask, UsiDomain, and UsiContext editors; the Abstract User Interface (AUI) produced by the UsiAbstract generator/editor]
Developed by Ricardo Tesoriero (UCL & UCLM)
Ricardo Tesoriero, Jean Vanderdonckt, Extending UsiXML to Support User-Aware Interfaces, HCSE'2010, pp. 95-110
12. The UsiXML Project
End User Club
13. The UsiXML Project
Observers
14. The UsiXML Project
Supporters
15. The UsiXML Project
Promoters
17. 7 Standardisation actions
The situation before
First attempts to introduce an XML User Interface Description Language
[Timeline 2000-2013: Organisation #1 → UIDL #1 ?, Organisation #2 → UIDL #2 ?, …, Organisation #n → UIDL #n ?]
UsiXML Del 1.1, V2 – State of the art in User Interface Description Languages, ITEA2, 55 p. (nominated excellent ITEA SotA)
Accessible at: http://www.itea2.org/project/workpackagedocument/download?document=468&file=08026_UsiXML_WP1_D1_1_v2_State_of_the_Art_of_UIDL.doc
18. 7 Standardisation actions
The UsiXML Strategic plan
[Timeline 2000-2013: Organisation #1, Organisation #2, …, Organisation #n converge on UsiXML]
19. 7 Standardisation actions
The UsiXML Strategic plan
UsiXML labelled ITEA2 (3008086), Sept. 15, 2008; the ITEA2 UsiXML project started Jan. 2009
[Method stack: software tools support a step-wise method, which involves models described in a UI Description Language]
20. 7 Standardisation actions
OASIS UIML
Technical Committee on User Interface Modelling Language
User Interface Markup Language (UIML) V1.0, Jan. 15, 2000
UIML V4.0 Committee Draft, Jan. 23, 2008
UIML Reference chapter, Sept. 2009
Input: CUI & AUI (but not task!), SketchiXML, validators
Process: by progressive incorporation (monthly telco) and validation
https://www.oasis-open.org/committees/tc_home.php?wg_abbrev=uiml
Helms, J., Schaefer, R., Luyten, K., Vermeulen, J., Abrams, M., Coyette, A., Vanderdonckt, J., Human-Centered Engineering with the
User Interface Markup Language, in Seffah, A., Vanderdonckt, J., Desmarais, M. (eds.), “Human-Centered Software Engineering”,
Chapter 7, HCI Series, Springer, London, 2009, pp. 141-173
21. 7 Standardisation actions
FP7 NEXOF-RA
Reference architecture for NESSI European Platform
Initiating calls for inputs, 2008
Advanced User-Service Interactions (Del. 1.1), June 15, 2010
Input: AUI, CUI, context of use (user+platform+environment)
Process: by workshops, report, and proof-of-concept
http://ec.europa.eu/information_society/apps/projects/logos/6/216446/080/deliverables/001_D11cAdvancedUserServiceInteractionscontribution.pdf
Limbourg, Q., Vanderdonckt, J., Multi-Path Transformational Development of User Interfaces with Graph Transformations, in Seffah,
A., Vanderdonckt, J., Desmarais, M. (eds.), “Human-Centered Software Engineering”, Chapter 6, HCI Series, Springer, London, 2009,
pp. 109-140
22. 7 Standardisation actions
NESSI
Networked European Software and Services Initiative
Submission to NESSI, Sept. 2010
Input: AUI, CUI, context of use (user+platform+environment)
Process: by workshops, report, and proof-of-concept
23. 7 Standardisation actions
COST N 294 Mause
Towards the MAturation of Information Technology USability Evaluation
Workshop on User Interface Web Quality Models, Sept. 12-14, 2005
User Experience Manifesto, Sept. 3, 2007
COST294-MAUSE Closing Conference Proceedings, August 2009
Input: CUI, context of use, usability, quality
Process: by F2F meetings, workshops, and deliverables
http://www.cost294.org/
Abrahão, S., Iborra, E., Vanderdonckt, J., Usability Evaluation of User Interfaces Generated with a Model-Driven Architecture Tool, in
Law, E., Hvannberg, E., and Cockton, G. (eds.), “Maturing Usability: Quality in Software, Interaction and Value”, Chapter 1, HCI
Series, Vol. 10, Springer, London, 2008, pp. 3-32.
24. 7 Standardisation actions
ISO/IEC JTC 1/SC 7
ISO/IEC 24744:2007 - Software Engineering -- Metamodel for Development Methodologies
Initial standard version, 2007
Last stage, August 9, 2009
Input: task, usability, ergonomics of notation, method
Process: by progressive incorporation
(telco and F2F meetings)
http://www.iso.org/iso/catalogue_detail.htm?csnumber=38854
Sousa, K., Vanderdonckt, J., Henderson-Sellers, B., Gonzalez-Perez, C., Evaluating a graphical notation for modelling software
development methodologies, Journal of Visual Languages and Computation, Vol. 23, No. 4, 2012, pp. 195-212.
25. 7 Standardisation actions
W3C Charter Group
Model-based User Interface Design (MBUI)
First workshop, Incubator Group, August 13, 2008
Second workshop, Incubator Group (organized by us), June 11-12, 2009
XG Final Report, May 4, 2010
Opening of the Charter Group, Feb. 2011
Public Draft published, Nov. 8, 2013
Closing of the Charter Group, Nov. 30, 2013
Input: CRF, task, AUI (editor), CUI, case studies, software
Process: by submission and consensus (weekly telco, F2F meetings,
technical plenaries)
http://www.w3.org/2011/01/mbui-wg-charter, http://www.w3.org/wiki/Model-Based_User_Interfaces
Tran, V., Tesoriero, R., Vanderdonckt, J., Systematic Generation of Abstract User Interfaces, Proc. of 4th ACM Int. Symposium on
Engineering Interactive Computing Systems EICS’2012 (Copenhagen, June 25-28, 2012), ACM Press, New York, 2012, pp. 101-110.
26. 7 Standardisation actions
OMG IFML
Object Management Group – Interaction Flow Modeling Language
Initial submission, 2010
Adoption as OMG Standard, March 2013
Input: CUI
Process: by submission and voting (F2F meetings)
http://www.ifml.org
Marco Brambilla, Jordi Cabot and Manuel Wimmer, Model-Driven Software Engineering in Practice (Synthesis Lectures on Software
Engineering), Sept. 26, 2012.
30. What's next?
• Touch phones
Night version, 2 days
Day version, 2 days
31. What's next?
• Tablets
Night version, 4 days
Day version, 4 days
32. What's next?
• Desktop version
35. What's next?
• ITEA2 UsiXML project is finished, but UsiXML continues
– As a language: towards UsiXML 2.2 stable version
– As a consortium: you can join
– As a series of products & services: through companies
– As a consulting agency: through spin-offs (e.g., MiLab, Mexico)
• W3C Ubiquitous Application Design Community Group continues
– More meta-models to be discussed: concrete UI, user model,
modalities, etc.
– Need for more:
• company involvement and adoption
• software support
• use cases
Also see:
http://www.w3.org/2013/Talks/quill/
http://www.w3.org/2013/Talks/Serenoa/
Join now the Ubiquitous Application Design Community Group at:
http://www.w3.org/community/uad/
36. More information
• The ITEA2 UsiXML profile:
http://www.itea2.org/project/index/view/?project=1127
• The UsiXML project web site:
www.usixml.eu
• The UsiXML language web site:
www.usixml.org
3689 visitors in 2012
• UsiXML FaceBook page:
https://www.facebook.com/UsiXML
• UsiXML SlideShare:
http://www.slideshare.net/search/slideshow?q=usixml
• UsiXML PlayList:
http://www.youtube.com/playlist?list=PLn_SfKW8yXZAVuESKWEKUqwRQa16ORCi6
• UsiXML Twitter:
https://twitter.com/usixml
37. If you have any user interface development, please consider UsiXML
Thank you very much for your attention!
http://fr.slideshare.net/jeanvdd/faure-vanderdonckt-cosummit2013-final
38. Slides for reference purpose
39. Acknowledgements for Support
With the support of the DGO6, Département des Programmes de Recherche
FP7 Nexof-RA: http://cordis.europa.eu/fp7/ict/ssai/docs/fp7call1achievements/nexof-ra.pdf
FP7 Human: http://www.human.aero/
FP7 Selfman: http://www.ist-selfman.org
FP7 Serenoa: http://www.serenoa-fp7.eu/
40. Some UsiXML software
• UsiDashBoard: support for method engineering
Developed by Javier Cano, Javier Munoz (Prodevelop)
Cano, F.J., Vanderdonckt, J., Towards Methodological Guidance for User Interface Development Life Cycle, Proc. of 2nd Int.
Workshop on User Interface Extensible Markup Language UsiXML’2011 (Lisbon, 6 September 2011), Thales Research and
Technology France, Paris, 2011, pp. 35-45.
41. Some UsiXML software
• UsiComp: Composition of
user interfaces (by UJF)
Developed by Alfonso García Frey (UJF, LIG)
Alfonso García Frey, Eric Ceret, Sophie Dupuy-Chessa, Gaëlle Calvary, Yoann
Gabillon, UsiComp: an extensible model-driven composer, Proc of ACM EICS 2012,
pp. 263-268
42. Some UsiXML software
• UsiWSC: Usable User Interface for Interactive Web Service
Composition
http://webapps.fundp.ac.be/usiwsc/
Developed by Mohamed Boukhebouze & Waldemar Pires Ferreira Neto (UNamur)
Mohamed Boukhebouze, Waldemar Pires Ferreira Neto, Lim Erbin, Philippe Thiran, UsiWSC: Framework for Supporting an Interactive Web Service Composition, in Proceedings of the 12th International Conference on Web Engineering ICWE'2012, Springer, Berlin, 2012.
43. Some UsiXML software
• UsiGesture: incorporating gestures in GUIs
Developed by François Beuvens (UCL)
Beuvens, F., Vanderdonckt, J., Designing Graphical User Interfaces Integrating Gestures in the UsiGesture environment, Proc. of 30th ACM International Conference on Design of Communication SIGDOC'2012 (Seattle, October 5-8, 2012), ACM Press, New York, 2012, pp. 313-322.
44. Some UsiXML software
• UsiDistrib: Distributed User Interfaces across devices
Developed by Jérémie Melchior (UCL)
Melchior, J., Grolaux, D., Vanderdonckt, J., Van Roy, P., A Toolkit for Peer-to-Peer Distributed User Interfaces: Concepts, Implementation, and Applications, Proc. of 1st ACM SIGCHI Symposium on Engineering Interactive Computing Systems EICS'2009 (Pittsburgh, July 15-17, 2009), ACM Press, New York, 2009, pp. 69-78.
45. Some UsiXML software
• UsiKiosk: distributed user interfaces across devices (by See & Touch)
Developed by Eric Delvaux (See & Touch)
46. Some UsiXML software
• UsiExplain: self-explanatory user interfaces by model-driven engineering (by UJF)
Developed by Alfonso García Frey (UJF/LIG)
http://iihm.imag.fr/publs/2013/PhD_Alfonso-Garcia-Frey.pdf
Alfonso García Frey, Gaëlle Calvary, Sophie Dupuy-Chessa, Nadine Mandran, Model-Based Self-explanatory UIs for Free, but Are They Valuable?, Proc. of IFIP INTERACT 2013 (3), pp. 144-161.
47. Some UsiXML software
• SECRET: reverse engineering of GUIs (by UCLM)
Montero, F., López-Jaquero, V., González, P., User-Centered Reverse Engineering, Computing Systems Department, University of Castilla-La Mancha, Albacete, Spain, 2013. Available at: https://www.dsi.uclm.es/trep.php?codtrep=DIAB-13-04-1
48. Some UsiXML software
• UsiResourcer: reverse engineering of GUIs from their resource files
Sanchez, O., Vanderdonckt, J., Molina, J., Re-Engineering Graphical User Interfaces from their Resource Files with UsiResourcer, Proc. of 7th Int. Conf. on Research Challenges in Information Science RCIS'2013 (Paris, 29-31 May 2013), IEEE Computer Society, Los Angeles, 2013.
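The reverse-engineering idea behind UsiResourcer — recovering an abstract widget model from a platform resource file — can be sketched in a few lines. This is a hedged illustration, not the tool's actual implementation: the toy grammar covers only three Windows-style dialog statements (PUSHBUTTON, EDITTEXT, LTEXT), and the widget dictionary format is invented for the example.

```python
import re

# Toy grammar for one Windows-style dialog resource control line, e.g.:
#   PUSHBUTTON "OK", IDOK, 10, 20, 50, 14
CONTROL_RE = re.compile(
    r'^\s*(PUSHBUTTON|EDITTEXT|LTEXT)\s*'          # control keyword
    r'(?:"([^"]*)"\s*,\s*)?'                       # optional caption
    r'(\w+)\s*,\s*'                                # identifier
    r'(\d+)\s*,\s*(\d+)\s*,\s*(\d+)\s*,\s*(\d+)'   # x, y, width, height
)

def parse_dialog(rc_text):
    """Extract a list of abstract widgets from dialog resource text."""
    widgets = []
    for line in rc_text.splitlines():
        m = CONTROL_RE.match(line)
        if m:
            kind, caption, ident, x, y, w, h = m.groups()
            widgets.append({
                "type": kind, "caption": caption, "id": ident,
                "bounds": tuple(map(int, (x, y, w, h))),
            })
    return widgets
```

From such an abstract widget list, a concrete UI description in another technology could then be regenerated.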
50. Some UsiXML software
• UsiView: animated transition between UsiXML and code
[Figure: three views of a UI in UsiView — (a) conceptual view, (b) internal view, (c) external view — with animated transitions between the conceptual and external views and between the internal and external views]
UsiXML editor developed by Benoît Hambucken (Defimedia), animated transitions by Ch.-E. Dessart (UCL)
Dessart, Ch.-E., Genaro Motti, V., Vanderdonckt, J., Animated Transitions between User Interface Views, Proc. of Int. Working Conf. on Advanced Visual Interfaces AVI'2012 (Capri, May 21-25, 2012), ACM Press, New York, 2012, pp. 341-348.
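The animated-transition idea — moving smoothly from an initial view to an adapted one through intermediate transitional states — can be illustrated with a minimal sketch. The representation below is invented for the example, not UsiView's internal model: each view is assumed to be a mapping from widget identifiers to rectangles (x, y, width, height), and transitional frames are produced by linear interpolation.

```python
def interpolate(a, b, t):
    """Linearly interpolate between two numeric values, t in [0, 1]."""
    return a + (b - a) * t

def transition_frames(initial, final, steps):
    """Generate transitional layouts between two UI states.

    initial/final: dicts mapping widget id -> (x, y, width, height).
    Only widgets present in both states are interpolated here; widgets
    that appear or disappear would need a fade-in/fade-out in addition.
    """
    shared = initial.keys() & final.keys()
    frames = []
    for i in range(1, steps + 1):
        t = i / steps
        frames.append({
            wid: tuple(interpolate(a, b, t)
                       for a, b in zip(initial[wid], final[wid]))
            for wid in shared
        })
    return frames
```

Rendering each frame in sequence yields the short, directed movement that the related work recommends for keeping the user oriented during an adaptation.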
51. Some UsiXML software
• UsiCentral & DefBox: Web authoring environment (by www.defimedia.be)
UsiCentral is developed by Benoît Hambucken, Luc Ponsard, and others (Defimedia)
53. Some UsiXML software
• D2Flex: a tool for designing flexible process models (by UJF/LIG)
Eric Ceret, Sophie Dupuy-Chessa, Gaëlle Calvary, M2FLEX: A Process Metamodel for Flexibility at Runtime, Proc. of IEEE RCIS'2013, pp. 1-12.
54. Some UsiXML software
• UsiPatterns: a library of multi-device user interface patterns (by www.namahn.be)
55. Some UsiXML software
• Transformation Templates (by UPV)
[Figure: screenshots of the CPM (Concrete Presentation Model) tree editor — a concrete presentation tree of windows, menu bars, and boxes next to an individual-properties inspector (identification, semantics, default content/help/icon/tooltip, font, color, alignment) — and of the mapping editor, which links source elements (grouping, input arguments, actions, navigation) to their corresponding concrete presentation widgets (hBox, Label, textInput, ...)]
56. Some UsiXML software
• UsiTask: task model editor as an Eclipse plug-in (by UCL/UCLM)
UsiTask is developed by Ricardo Tesoriero (UCL & UCLM)
Ricardo Tesoriero, Jean Vanderdonckt, Extending UsiXML to Support User-Aware Interfaces, Proc. of HCSE'2010, pp. 95-110
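As an illustration of the kind of structure a task model editor manipulates, here is a hedged sketch of a task tree with temporal operators in the spirit of UsiXML task models; the Task class and operator strings are invented for the example, not UsiTask's actual metamodel.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Task:
    """A node in a hierarchical task model.

    operator links this task to its next sibling, e.g. ">>" for
    enabling (sequence) or "|||" for interleaving.
    """
    name: str
    operator: Optional[str] = None
    children: List["Task"] = field(default_factory=list)

def flatten(task, depth=0):
    """List (depth, name) pairs in depth-first order, for display or export."""
    rows = [(depth, task.name)]
    for child in task.children:
        rows.extend(flatten(child, depth + 1))
    return rows
```

For example, a "Borrow book" task could decompose into "Identify user" >> "Select book" >> "Confirm loan"; an editor plug-in would let the designer build and rearrange such trees graphically.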
57. Some UsiXML software
• UsiDomain: domain model editor as an Eclipse plug-in (by UCL/UCLM)
UsiDomain is developed by Ricardo Tesoriero (UCL & UCLM)
Ricardo Tesoriero, Jean Vanderdonckt, Extending UsiXML to Support User-Aware Interfaces, Proc. of HCSE'2010, pp. 95-110
58. Some UsiXML software
• UsiContext: context of use model editor as an Eclipse plug-in (by UCL/UCLM)
UsiContext is developed by Ricardo Tesoriero (UCL & UCLM)
Ricardo Tesoriero, Jean Vanderdonckt, Extending UsiXML to Support User-Aware Interfaces, Proc. of HCSE'2010, pp. 95-110
59. Some UsiXML software
• ReTaskXML: reverse engineering of UIs (by UCLM)
ReTaskXML is developed by Francisco Montero (UCLM)