var bibbase_data = {"data":"\"Loading..\"\n\n
\n\n \n\n \n\n \n \n\n \n\n \n \n\n \n\n \n
\n generated by\n \n \"bibbase.org\"\n\n \n
\n \n\n
\n\n \n\n\n

\n\n
\n\n\n
\n \n \n
\n
\n  \n 2019\n \n \n (1)\n \n \n
\n
\n \n \n
\n \n\n \n \n \n \n \n \n A Multimodal Dialogue Framework for Cloud-Based Companion Systems.\n \n \n \n \n\n\n \n Matthias Kraus; Marvin Schiller; Gregor Behnke; Pascal Bercher; Susanne Biundo; Birte Glimm; and Wolfgang Minker.\n\n\n \n\n\n\n In Rafael Banchs; Luis Fernando D'Haro; and Haizhou Li., editor(s), 9th International Workshop on Spoken Dialogue Systems, of Lecture Notes in Electrical Engineering, pages 405–410. Springer, 2019.\n This book chapter is a slightly newer version of the paper by Kraus et al. that was accepted at the 10th International Workshop On Spoken Dialog Systems Technology (IWSDS 2018).\n\n\n\n
\n
@InCollection { Kraus2019CloudCompanion,\n   author    = {Matthias Kraus and Marvin Schiller and Gregor Behnke and Pascal Bercher and Susanne Biundo and Birte Glimm and Wolfgang Minker},\n   title     = {A Multimodal Dialogue Framework for Cloud-Based Companion Systems},\n   booktitle = {9th International Workshop on Spoken Dialogue Systems},\n   year      = {2019},\n   pages     = {405--410},\n   editor    = {Rafael Banchs and Luis Fernando D'Haro and Haizhou Li},\n   publisher = {Springer},\n   note      = {This book chapter is a slightly newer version of the paper by Kraus et al. that was accepted at the 10th International Workshop On Spoken Dialog Systems Technology (IWSDS 2018).},\n   series    = {Lecture Notes in Electrical Engineering},\n   abstract  = {Companion systems are cooperative, cognitive systems aiming at assisting a user in everyday situations. Therefore, these systems require a high level of availability. One option to meet this requirement is to use a web-deployable architecture. In this demo paper, we present a multimodal cloud-based dialogue framework for the development of a distributed, web-based companion system. The proposed framework is intended to provide an efficient, easily extensible, and scalable approach for these kinds of systems and will be demonstrated in a do-it-yourself assistance scenario.},\n   url_Paper                = {https://bercher.net/publications/2019/Kraus2019CloudCompanion.pdf}\n}\n
\n
\n\n\n
\n Companion systems are cooperative, cognitive systems aiming at assisting a user in everyday situations. Therefore, these systems require a high level of availability. One option to meet this requirement is to use a web-deployable architecture. In this demo paper, we present a multimodal cloud-based dialogue framework for the development of a distributed, web-based companion system. The proposed framework is intended to provide an efficient, easily extensible, and scalable approach for these kinds of systems and will be demonstrated in a do-it-yourself assistance scenario.\n
\n\n\n
\n\n\n\n\n\n
\n
\n\n
\n
\n  \n 2017\n \n \n (4)\n \n \n
\n
\n \n \n
\n \n\n \n \n \n \n \n \n User-Centered Planning.\n \n \n \n \n\n\n \n Pascal Bercher; Daniel Höller; Gregor Behnke; and Susanne Biundo.\n\n\n \n\n\n\n In Susanne Biundo; and Andreas Wendemuth., editor(s), Companion Technology – A Paradigm Shift in Human-Technology Interaction, of Cognitive Technologies, 5, pages 79–100. Springer, 2017.\n \n\n\n\n
\n
@InCollection{Bercher2017UserCenteredPlanning,\n  author                   = {Pascal Bercher and Daniel H{\\"o}ller and Gregor Behnke and Susanne Biundo},\n  title                    = {User-Centered Planning},\n  chapter                  = {5},\n  pages                    = {79--100},\n  doi                      = {10.1007/978-3-319-43665-4_5},\n  abstract                 = {User-centered planning capabilities are core elements of Companion-Technology. They are used to implement the functional behavior of technical systems in a way that makes those systems Companion-able – able to serve users individually, to respect their actual requirements and needs, and to flexibly adapt to changes of the user's situation and environment. This book chapter presents various techniques we have developed and integrated to realize user-centered planning. They are based on a hybrid planning approach that combines key principles also humans rely on when making plans: stepwise refining complex tasks into executable courses of action and considering causal relationships between actions. Since the generated plans impose only a partial order on actions, they allow for a highly flexible execution order as well. Planning for Companion-Systems may serve different purposes, depending on the application for which the system is created. Sometimes, plans are just like control programs and executed automatically in order to elicit the desired system behavior; but sometimes they are made for humans. In the latter case, plans have to be adequately presented and the definite execution order of actions has to coincide with the user's requirements and expectations. Furthermore, the system should be able to smoothly cope with execution errors. 
To this end, the plan generation capabilities are complemented by mechanisms for plan presentation, execution monitoring, and plan repair.},\n  booktitle                = {Companion Technology -- A Paradigm Shift in Human-Technology Interaction},\n  editor                   = {Susanne Biundo and Andreas Wendemuth},\n  publisher                = {Springer},\n  year                     = {2017},\n  series                   = {Cognitive Technologies},\n  url_Paper                = {https://bercher.net/publications/2017/Bercher2017UserCenteredPlanning.pdf},\n  url_Poster               = {https://bercher.net/publications/2015/Bercher2015UserCenteredPlanningPoster.pdf}\n}\n\n
\n
\n\n\n
\n User-centered planning capabilities are core elements of Companion-Technology. They are used to implement the functional behavior of technical systems in a way that makes those systems Companion-able – able to serve users individually, to respect their actual requirements and needs, and to flexibly adapt to changes of the user's situation and environment. This book chapter presents various techniques we have developed and integrated to realize user-centered planning. They are based on a hybrid planning approach that combines key principles also humans rely on when making plans: stepwise refining complex tasks into executable courses of action and considering causal relationships between actions. Since the generated plans impose only a partial order on actions, they allow for a highly flexible execution order as well. Planning for Companion-Systems may serve different purposes, depending on the application for which the system is created. Sometimes, plans are just like control programs and executed automatically in order to elicit the desired system behavior; but sometimes they are made for humans. In the latter case, plans have to be adequately presented and the definite execution order of actions has to coincide with the user's requirements and expectations. Furthermore, the system should be able to smoothly cope with execution errors. To this end, the plan generation capabilities are complemented by mechanisms for plan presentation, execution monitoring, and plan repair.\n
\n\n\n
\n\n\n
\n \n\n \n \n \n \n \n \n Advanced User Assistance for Setting Up a Home Theater.\n \n \n \n \n\n\n \n Pascal Bercher; Felix Richter; Thilo Hörnle; Thomas Geier; Daniel Höller; Gregor Behnke; Florian Nielsen; Frank Honold; Felix Schüssel; Stephan Reuter; Wolfgang Minker; Michael Weber; Klaus Dietmayer; and Susanne Biundo.\n\n\n \n\n\n\n In Susanne Biundo; and Andreas Wendemuth., editor(s), Companion Technology – A Paradigm Shift in Human-Technology Interaction, of Cognitive Technologies, 24, pages 485–491. Springer, 2017.\n \n\n\n\n
\n
@InCollection{Bercher2017HomeTheater,\n  Author                   = {Pascal Bercher and Felix Richter and Thilo H\\"ornle and Thomas Geier and Daniel H\\"oller and Gregor Behnke and Florian Nielsen and Frank Honold and Felix Sch\\"ussel and Stephan Reuter and Wolfgang Minker and Michael Weber and Klaus Dietmayer and Susanne Biundo},\n  title                    = {Advanced User Assistance for Setting Up a Home Theater},\n  pages                    = {485--491},\n  chapter                  = {24},\n  doi                      = {10.1007/978-3-319-43665-4_24},\n  keywords                 = {2017,userCentered,bookChapter},\n  booktitle                = {Companion Technology -- A Paradigm Shift in Human-Technology Interaction},\n  editor                   = {Susanne Biundo and Andreas Wendemuth},\n  publisher                = {Springer},\n  year                     = {2017},\n  series                   = {Cognitive Technologies},\n  abstract                 = {In many situations of daily life, such as in educational, work-related, or social contexts, one can observe an increasing demand for intelligent assistance systems. In this chapter, we show how such assistance can be provided in a wide range of application scenarios—based on the integration of user-centered planning with advanced dialog and interaction management capabilities. Our approach is demonstrated by a system that assists a user in the task of setting up a complex home theater. The theater consists of several hi-fi devices that need to be connected with each other using the available cables and adapters. In particular for technically inexperienced users, the task is quite challenging due to the high number of different ports of the devices and because the used cables might not be known to the user. Support is provided by presenting a detailed sequence of instructions that solves the task.},\n  url_Paper                = {https://bercher.net/publications/2017/Bercher2017HomeTheater.pdf},\n}\n\n
\n
\n\n\n
\n In many situations of daily life, such as in educational, work-related, or social contexts, one can observe an increasing demand for intelligent assistance systems. In this chapter, we show how such assistance can be provided in a wide range of application scenarios—based on the integration of user-centered planning with advanced dialog and interaction management capabilities. Our approach is demonstrated by a system that assists a user in the task of setting up a complex home theater. The theater consists of several hi-fi devices that need to be connected with each other using the available cables and adapters. In particular for technically inexperienced users, the task is quite challenging due to the high number of different ports of the devices and because the used cables might not be known to the user. Support is provided by presenting a detailed sequence of instructions that solves the task.\n
\n\n\n
\n\n\n
\n \n\n \n \n \n \n \n \n To Plan for the User Is to Plan With the User – Integrating User Interaction Into the Planning Process.\n \n \n \n \n\n\n \n Gregor Behnke; Florian Nielsen; Marvin Schiller; Denis Ponomaryov; Pascal Bercher; Birte Glimm; Wolfgang Minker; and Susanne Biundo.\n\n\n \n\n\n\n In Susanne Biundo; and Andreas Wendemuth., editor(s), Companion Technology – A Paradigm Shift in Human-Technology Interaction, of Cognitive Technologies, 7, pages 123–144. Springer, 2017.\n \n\n\n\n
\n
@InCollection{Behnke2017UserIntegration,\n  Author                   = {Gregor Behnke and Florian Nielsen and Marvin Schiller and Denis Ponomaryov and Pascal Bercher and Birte Glimm and Wolfgang Minker and Susanne Biundo},\n  title                    = {To Plan for the User Is to Plan With the User -- Integrating User Interaction Into the Planning Process},\n  chapter                  = {7},\n  pages                    = {123--144},\n  doi                      = {10.1007/978-3-319-43665-4_7},\n  abstract                 = {Settings where systems and users work together to solve problems collaboratively are among the most challenging applications of Companion-Technology. So far we have seen how planning technology can be exploited to realize Companion-Systems that adapt flexibly to changes in the user's situation and environment and provide detailed help for users to realize their goals. However, such systems lack the capability to generate their plans in cooperation with the user. In this chapter we go one step further and describe how to involve the user directly into the planning process. This enables users to integrate their wishes and preferences into plans and helps the system to produce individual plans, which in turn let the Companion-System gain acceptance and trust from the user. Such a Companion-System must be able to manage diverse interactions with a human user. A so-called mixed-initiative planning system integrates several Companion-Technologies which are described in this chapter. For example, a—not yet final—plan, including its flaws and solutions, must be presented to the user to provide a basis for her or his decision. We describe how a dialog manager can be constructed such that it can handle all communication with a user. Naturally, the dialog manager and the planner must use coherent models. We show how an ontology can be exploited to achieve such models. 
Finally, we show how the causal information included in plans can be used to answer the questions a user might have about a plan. The given capabilities of a system to integrate user decisions and to explain its own decisions to the user in an appropriate way are essential for systems that interact with human users.},\n  booktitle                = {Companion Technology -- A Paradigm Shift in Human-Technology Interaction},\n  editor                   = {Susanne Biundo and Andreas Wendemuth},\n  publisher                = {Springer},\n  year                     = {2017},\n  series                   = {Cognitive Technologies},\n  url_Paper                = {https://bercher.net/publications/2017/Behnke2017UserIntegration.pdf}\n}\n\n
\n
\n\n\n
\n Settings where systems and users work together to solve problems collaboratively are among the most challenging applications of Companion-Technology. So far we have seen how planning technology can be exploited to realize Companion-Systems that adapt flexibly to changes in the user's situation and environment and provide detailed help for users to realize their goals. However, such systems lack the capability to generate their plans in cooperation with the user. In this chapter we go one step further and describe how to involve the user directly into the planning process. This enables users to integrate their wishes and preferences into plans and helps the system to produce individual plans, which in turn let the Companion-System gain acceptance and trust from the user. Such a Companion-System must be able to manage diverse interactions with a human user. A so-called mixed-initiative planning system integrates several Companion-Technologies which are described in this chapter. For example, a—not yet final—plan, including its flaws and solutions, must be presented to the user to provide a basis for her or his decision. We describe how a dialog manager can be constructed such that it can handle all communication with a user. Naturally, the dialog manager and the planner must use coherent models. We show how an ontology can be exploited to achieve such models. Finally, we show how the causal information included in plans can be used to answer the questions a user might have about a plan. The given capabilities of a system to integrate user decisions and to explain its own decisions to the user in an appropriate way are essential for systems that interact with human users.\n
\n\n\n
\n\n\n
\n \n\n \n \n \n \n \n \n User Involvement in Collaborative Decision-Making Dialog Systems.\n \n \n \n \n\n\n \n Florian Nothdurft; Pascal Bercher; Gregor Behnke; and Wolfgang Minker.\n\n\n \n\n\n\n In Kristiina Jokinen; and Graham Wilcock., editor(s), Dialogues with Social Robots: Enablements, Analyses, and Evaluation, pages 129–141. Springer, 2017.\n This book chapter was accepted at the 7th International Workshop On Spoken Dialogue Systems (IWSDS 2016).\n\n\n\n
\n
@InCollection{Nothdurft2016UserInvolvement,\n  booktitle                = {Dialogues with Social Robots: Enablements, Analyses, and Evaluation},\n  Author                   = {Florian Nothdurft and Pascal Bercher and Gregor Behnke and Wolfgang Minker},\n  title                    = {User Involvement in Collaborative Decision-Making Dialog Systems},\n  Editor                   = {Kristiina Jokinen and Graham Wilcock},\n  Publisher                = {Springer},\n  Year                     = {2017},\n  pages                    = {129--141},\n  note                     = {This book chapter was accepted at the 7th International Workshop On Spoken Dialogue Systems (IWSDS 2016).},\n  doi                      = {10.1007/978-981-10-2585-3_10},\n  abstract                 = {Mixed-initiative assistants are systems that support humans in their decision-making and problem-solving capabilities in a collaborative manner. Such systems have to integrate various artificial intelligence capabilities, such as knowledge representation, problem solving and planning, learning, discourse and dialog, and human-computer interaction. These systems aim at solving a given problem autonomously for the user, yet involve the user into the planning process for a collaborative decision-making, to respect e.g. user preferences. However, how the user is involved into the planning can be framed in various ways, using different involvement strategies, varying e.g. in their degree of user freedom. Hence, here we present results of a study examining the effects of different user involvement strategies on the user experience in a mixed-initiative system.},\n  url_Paper                = {https://bercher.net/publications/2016/Nothdurft2016UserInvolvement.pdf}\n}\n\n\n
\n
\n\n\n
\n Mixed-initiative assistants are systems that support humans in their decision-making and problem-solving capabilities in a collaborative manner. Such systems have to integrate various artificial intelligence capabilities, such as knowledge representation, problem solving and planning, learning, discourse and dialog, and human-computer interaction. These systems aim at solving a given problem autonomously for the user, yet involve the user into the planning process for a collaborative decision-making, to respect e.g. user preferences. However, how the user is involved into the planning can be framed in various ways, using different involvement strategies, varying e.g. in their degree of user freedom. Hence, here we present results of a study examining the effects of different user involvement strategies on the user experience in a mixed-initiative system.\n
\n\n\n
\n"}; document.write(bibbase_data.data);