FaceChat: An Emotion-Aware Face-to-face Dialogue Framework. Alnuhait, D., Wu, Q., & Zhou, Y. arXiv.org, March 2023.
Abstract: While current dialogue systems like ChatGPT have made significant advancements in text-based interactions, they often overlook the potential of other modalities in enhancing the overall user experience. We present FaceChat, a web-based dialogue framework that enables emotionally-sensitive and face-to-face conversations. By seamlessly integrating cutting-edge technologies in natural language processing, computer vision, and speech processing, FaceChat delivers a highly immersive and engaging user experience. The FaceChat framework has a wide range of potential applications, including counseling, emotional support, and personalized customer service. The system is designed to be simple and flexible as a platform for future researchers to advance the field of multimodal dialogue systems. The code is publicly available at https://github.com/qywu/FaceChat.
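The abstract outlines a three-part loop: a vision component reads the user's facial emotion, a language component conditions its reply on that emotion, and a speech component voices the response. The sketch below is a minimal, hypothetical illustration of that kind of pipeline only; every function here is a placeholder stub and none of these names come from the FaceChat repository (see the link above for the actual implementation).

# Hypothetical sketch of an emotion-aware dialogue turn:
# webcam frame -> emotion label -> emotion-conditioned reply -> synthesized speech.
# All components are stubs standing in for real CV, NLP, and TTS modules.
from dataclasses import dataclass

@dataclass
class Turn:
    user_text: str
    user_emotion: str
    system_text: str

def detect_emotion(frame_bytes: bytes) -> str:
    """Placeholder for a facial-emotion classifier run on a webcam frame."""
    return "neutral"  # a real model would return e.g. "happy", "sad", "angry"

def generate_reply(history: list, user_text: str, emotion: str) -> str:
    """Placeholder for an emotion-conditioned language-model call.
    A real system would inject the detected emotion into the prompt/context."""
    return f"(responding to a {emotion} user) I hear you: {user_text}"

def synthesize_speech(text: str) -> bytes:
    """Placeholder for a text-to-speech engine driving an animated avatar."""
    return text.encode("utf-8")  # stub "audio"

def dialogue_turn(history: list, user_text: str, frame_bytes: bytes) -> bytes:
    emotion = detect_emotion(frame_bytes)                 # computer vision
    reply = generate_reply(history, user_text, emotion)   # natural language processing
    history.append(Turn(user_text, emotion, reply))
    return synthesize_speech(reply)                       # speech processing

if __name__ == "__main__":
    history = []
    dialogue_turn(history, "I had a rough day.", b"")
    print(history[-1].system_text)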
@article{alnuhait_facechat_2023,
title = {{FaceChat}: {An} {Emotion}-{Aware} {Face}-to-face {Dialogue} {Framework}},
url = {https://www.proquest.com/working-papers/facechat-emotion-aware-face-dialogue-framework/docview/2786648899/se-2},
abstract = {While current dialogue systems like ChatGPT have made significant advancements in text-based interactions, they often overlook the potential of other modalities in enhancing the overall user experience. We present FaceChat, a web-based dialogue framework that enables emotionally-sensitive and face-to-face conversations. By seamlessly integrating cutting-edge technologies in natural language processing, computer vision, and speech processing, FaceChat delivers a highly immersive and engaging user experience. FaceChat framework has a wide range of potential applications, including counseling, emotional support, and personalized customer service. The system is designed to be simple and flexible as a platform for future researchers to advance the field of multimodal dialogue systems. The code is publicly available at https://github.com/qywu/FaceChat.},
language = {English},
journal = {arXiv.org},
author = {Alnuhait, Deema and Wu, Qingyang and Zhou, Yu},
month = mar,
year = {2023},
	note = {Place: Ithaca; Publisher: Cornell University Library, arXiv.org},
keywords = {Artificial Intelligence, Business And Economics--Banking And Finance, Computation and Language, Natural language processing, User experience, Computer vision, Customer services, Speech processing},
	annote = {Copyright - © 2023. This work is published under http://arxiv.org/licenses/nonexclusive-distrib/1.0/ (the “License”). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License. Last updated: 2023-03-15.},
}
{"_id":"9MfmanBD9yqafRxu6","bibbaseid":"alnuhait-wu-zhou-facechatanemotionawarefacetofacedialogueframework-2023","author_short":["Alnuhait, D.","Wu, Q.","Zhou, Y."],"bibdata":{"bibtype":"article","type":"article","title":"FaceChat: An Emotion-Aware Face-to-face Dialogue Framework","url":"https://www.proquest.com/working-papers/facechat-emotion-aware-face-dialogue-framework/docview/2786648899/se-2","abstract":"While current dialogue systems like ChatGPT have made significant advancements in text-based interactions, they often overlook the potential of other modalities in enhancing the overall user experience. We present FaceChat, a web-based dialogue framework that enables emotionally-sensitive and face-to-face conversations. By seamlessly integrating cutting-edge technologies in natural language processing, computer vision, and speech processing, FaceChat delivers a highly immersive and engaging user experience. FaceChat framework has a wide range of potential applications, including counseling, emotional support, and personalized customer service. The system is designed to be simple and flexible as a platform for future researchers to advance the field of multimodal dialogue systems. The code is publicly available at https://github.com/qywu/FaceChat.","language":"English","journal":"arXiv.org","author":[{"propositions":[],"lastnames":["Alnuhait"],"firstnames":["Deema"],"suffixes":[]},{"propositions":[],"lastnames":["Wu"],"firstnames":["Qingyang"],"suffixes":[]},{"propositions":[],"lastnames":["Zhou"],"firstnames":["Yu"],"suffixes":[]}],"month":"March","year":"2023","note":"Place: Ithaca Publisher: Cornell University Library, arXiv.org","keywords":"Artificial Intelligence, Business And Economics–Banking And Finance, Computation and Language, Natural language processing, User experience, Computer vision, Customer services, Speech processing","annote":"Última actualización - 2023-03-15","bibtex":"@article{alnuhait_facechat_2023,\n\ttitle = {{FaceChat}: {An} {Emotion}-{Aware} {Face}-to-face {Dialogue} {Framework}},\n\turl = {https://www.proquest.com/working-papers/facechat-emotion-aware-face-dialogue-framework/docview/2786648899/se-2},\n\tabstract = {While current dialogue systems like ChatGPT have made significant advancements in text-based interactions, they often overlook the potential of other modalities in enhancing the overall user experience. We present FaceChat, a web-based dialogue framework that enables emotionally-sensitive and face-to-face conversations. By seamlessly integrating cutting-edge technologies in natural language processing, computer vision, and speech processing, FaceChat delivers a highly immersive and engaging user experience. FaceChat framework has a wide range of potential applications, including counseling, emotional support, and personalized customer service. The system is designed to be simple and flexible as a platform for future researchers to advance the field of multimodal dialogue systems. The code is publicly available at https://github.com/qywu/FaceChat.},\n\tlanguage = {English},\n\tjournal = {arXiv.org},\n\tauthor = {Alnuhait, Deema and Wu, Qingyang and Zhou, Yu},\n\tmonth = mar,\n\tyear = {2023},\n\tnote = {Place: Ithaca\nPublisher: Cornell University Library, arXiv.org},\n\tkeywords = {Artificial Intelligence, Business And Economics--Banking And Finance, Computation and Language, Natural language processing, User experience, Computer vision, Customer services, Speech processing},\n\tannote = {Copyright - © 2023. 
This work is published under http://arxiv.org/licenses/nonexclusive-distrib/1.0/ (the “License”). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License.},\n\tannote = {Última actualización - 2023-03-15},\n}\n\n","author_short":["Alnuhait, D.","Wu, Q.","Zhou, Y."],"key":"alnuhait_facechat_2023","id":"alnuhait_facechat_2023","bibbaseid":"alnuhait-wu-zhou-facechatanemotionawarefacetofacedialogueframework-2023","role":"author","urls":{"Paper":"https://www.proquest.com/working-papers/facechat-emotion-aware-face-dialogue-framework/docview/2786648899/se-2"},"keyword":["Artificial Intelligence","Business And Economics–Banking And Finance","Computation and Language","Natural language processing","User experience","Computer vision","Customer services","Speech processing"],"metadata":{"authorlinks":{}}},"bibtype":"article","biburl":"https://bibbase.org/network/files/22WYpzbBvi3hDHX7Y","dataSources":["cYu6uhMkeFHgRrEty","hLMh7bwHyFsPNWAEL","LKW3iRvnztCpLNTW7","TLD9JxqHfSQQ4r268","X9BvByJrC3kGJexn8","iovNvcnNYDGJcuMq2","NjZJ5ZmWhTtMZBfje"],"keywords":["artificial intelligence","business and economics–banking and finance","computation and language","natural language processing","user experience","computer vision","customer services","speech processing"],"search_terms":["facechat","emotion","aware","face","face","dialogue","framework","alnuhait","wu","zhou"],"title":"FaceChat: An Emotion-Aware Face-to-face Dialogue Framework","year":2023}