Cognitive Stimulation Dialogue Generation Based on Logical Reasoning and Multi-task Integration

JIANG Yuru, LI Mengyuan, TAO Yuyang, OU Keming, SHE Zepeng, SHI Shuicai

Journal of Shanxi University (Natural Science Edition), 2025, Vol. 48, Issue 3: 516-526. DOI: 10.13451/j.sxu.ns.2024158
Information Science


Abstract

In the context of global aging, the health problems of the elderly are becoming increasingly prominent, and cognitive stimulation dialogue is an important means of maintaining their cognitive health. Previous researchers constructed the Chinese Cognitive Stimulation Dialogue Dataset (CSConv), which incorporates emotional support and initiated research on Chinese cognitive stimulation dialogue. However, that work neither fully modeled the logical reasoning relationships within cognitive stimulation dialogues nor effectively exploited the guiding role of strategy labels during response generation. This paper treats cognitive stimulation dialogue generation as a multi-task logical reasoning process: the logical relationships among the emotion classification task, the decision-making task, and the dialogue response generation task are modeled as a reasoning chain that guides a large language model during generation. For the decision-making task, a decision model with a hierarchical encoder structure is proposed. Decision experiments show that this model improves the accuracy of deciding cognitive stimulation therapy principles and emotional support strategies by 3.96% and 2.1%, respectively. For the multi-task reasoning process, a multi-task integration method is proposed that combines the models corresponding to the classification, decision-making, and generation tasks. Compared with previous methods, the integration method improves the Bilingual Evaluation Understudy score based on 4-grams (BLEU-4) by 7.95%, indicating stronger dialogue response generation and demonstrating the effectiveness and advancement of the approach.
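The abstract describes a three-stage reasoning chain: classify the user's emotion, decide a cognitive stimulation therapy (CST) principle and an emotional support (ES) strategy, then feed those labels into the prompt that guides the large language model. The sketch below illustrates only the control flow of such a chain; all three components and their label vocabularies are hypothetical stand-ins, not the paper's actual models.

```python
# Hypothetical sketch of the multi-task reasoning chain described in the
# abstract: emotion classification -> strategy decision -> guided generation.
# Every function here is a toy stand-in for the corresponding model.

def classify_emotion(utterance: str) -> str:
    """Stand-in for the emotion classification model."""
    sad_cues = ("lonely", "sad", "难过", "孤独")
    return "sadness" if any(cue in utterance for cue in sad_cues) else "neutral"

def decide_strategy(utterance: str, emotion: str) -> dict:
    """Stand-in for the hierarchical-encoder decision model, which selects
    a CST principle and an emotional support strategy label."""
    es_strategy = "comforting" if emotion == "sadness" else "questioning"
    return {"cst_principle": "reminiscence", "es_strategy": es_strategy}

def build_prompt(utterance: str, emotion: str, labels: dict) -> str:
    """Assemble the reasoning chain into a prompt: the predicted emotion
    and decided labels become explicit instructions for the LLM."""
    return (
        f"User utterance: {utterance}\n"
        f"Detected emotion: {emotion}\n"
        f"CST principle: {labels['cst_principle']}\n"
        f"ES strategy: {labels['es_strategy']}\n"
        f"Respond following the principle and strategy above."
    )

utterance = "I feel lonely these days."
emotion = classify_emotion(utterance)
labels = decide_strategy(utterance, emotion)
prompt = build_prompt(utterance, emotion, labels)
```

The design point is that the upstream tasks do not merely run in parallel: their outputs are serialized into the generation prompt, so the strategy labels directly constrain the response.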

Key words

cognitive stimulation / emotional support / decision-making task / multi-task integration method
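The headline result above is a BLEU-4 gain. For reference, BLEU-4 (Papineni et al., reference 22) is the geometric mean of modified 1- to 4-gram precisions multiplied by a brevity penalty. The following is a minimal sentence-level illustration, not the corpus-level, possibly smoothed variant used in actual evaluations:

```python
import math
from collections import Counter

def ngrams(tokens, n):
    """All contiguous n-grams of a token list."""
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

def bleu4(candidate, reference):
    """Sentence-level BLEU-4: geometric mean of modified 1- to 4-gram
    precisions times a brevity penalty. Toy illustration only; real
    evaluations are corpus-level and often add smoothing."""
    precisions = []
    for n in range(1, 5):
        cand_counts = Counter(ngrams(candidate, n))
        ref_counts = Counter(ngrams(reference, n))
        # Clip each candidate n-gram count by its count in the reference.
        overlap = sum(min(c, ref_counts[g]) for g, c in cand_counts.items())
        total = max(sum(cand_counts.values()), 1)
        precisions.append(overlap / total)
    if min(precisions) == 0:
        return 0.0  # geometric mean collapses if any precision is zero
    log_avg = sum(math.log(p) for p in precisions) / 4
    # Brevity penalty: punish candidates shorter than the reference.
    if len(candidate) > len(reference):
        bp = 1.0
    else:
        bp = math.exp(1 - len(reference) / len(candidate))
    return bp * math.exp(log_avg)

reference = "the cat sat on the mat".split()
perfect = bleu4("the cat sat on the mat".split(), reference)
partial = bleu4("the cat sat on mat".split(), reference)
```

A perfect match scores 1.0; a shorter, partially matching candidate scores strictly between 0 and 1.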

CLC number

TP391

Cite this article

JIANG Yuru, LI Mengyuan, TAO Yuyang, et al. Cognitive Stimulation Dialogue Generation Based on Logical Reasoning and Multi-task Integration[J]. Journal of Shanxi University (Natural Science Edition), 2025, 48(3): 516-526. https://doi.org/10.13451/j.sxu.ns.2024158

References

1
DE OLIVEIRA T C, SOARES F C, DE MACEDO L D, et al. Beneficial Effects of Multisensory and Cognitive Stimulation on Age-related Cognitive Decline in Long-term-care Institutions[J]. Clin Interv Aging, 2014, 9: 309-320. DOI: 10.2147/cia.s54383 .
2
PARK J M, KIM M W, SHIM H Y. Effects of a Multicomponent Cognitive Stimulation Program on Cognitive Function Improvement Among Elderly Women[J]. Asian Nurs Res (Korean Soc Nurs Sci), 2019, 13(5): 306-312. DOI: 10.1016/j.anr.2019.11.001 .
3
TOKUNAGA S, SEABORN K, TAMURA K, et al. Cognitive Training for Older Adults with a Dialogue-based, Robot-facilitated Storytelling System[C]//International Conference on Interactive Digital Storytelling. Little Cottonwood Canyon, UT, USA: Springer, Cham, 2019: 405-409. DOI: 10.1007/978-3-030-33894-7_43 .
4
TOKUNAGA S, TAMURA K, OTAKE-MATSUURA M. A Dialogue-based System with Photo and Storytelling for Older Adults: Toward Daily Cognitive Training[J]. Front Robot AI, 2021, 8: 644964. DOI: 10.3389/frobt.2021.644964 .
5
SPECTOR A, THORGRIMSEN L, WOODS B, et al. Efficacy of an Evidence-based Cognitive Stimulation Therapy Programme for People with Dementia: Randomised Controlled Trial[J]. Br J Psychiatry, 2003, 183: 248-254. DOI: 10.1192/bjp.183.3.248 .
6
SPECTOR A, THORGRIMSEN L, WOODS B, et al. Cognitive Stimulation Therapy (CST): Evidence-based Group Activities Designed for People with Cognitive Impairment (Facilitator's Manual)[M]. Translated by 黄凯茵. Hong Kong: Hong Kong University Press, 2017.
7
NAVARRO J, DOCTOR F, ZAMUDIO V, et al. Fuzzy Adaptive Cognitive Stimulation Therapy Generation for Alzheimer's Sufferers: Towards a Pervasive Dementia Care Monitoring Platform[J]. Future Gener Comput Syst, 2018, 88: 479-490. DOI: 10.1016/j.future.2018.06.018 .
8
JIANG J Y, WANG S, LI Q T, et al. A Cognitive Stimulation Dialogue System with Multi-source Knowledge Fusion for Elders with Cognitive Impairment[C]//Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers). Stroudsburg, PA, USA: Association for Computational Linguistics, 2023: 10628-10640. DOI: 10.18653/v1/2023.acl-long.593 .
9
LIU S, ZHENG C, DEMASI O, et al. Towards Emotional Support Dialog Systems[C]//Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers). Stroudsburg, PA, USA: Association for Computational Linguistics, 2021: 3469-3483. DOI: 10.18653/v1/2021.acl-long.269.
10
RYU H, KIM S, KIM D, et al. Simple and Steady Interactions Win the Healthy Mentality[J]. Proc ACM Hum-Comput Interact, 2020, 4(CSCW2): 1-25. DOI: 10.1145/3415223 .
11
VALTOLINA S, HU L. Charlie: a Chatbot to Improve the Elderly Quality of Life and to Make Them More Active to Fight Their Sense of Loneliness[C]//CHItaly 2021: 14th Biannual Conference of the Italian SIGCHI Chapter. New York: ACM, 2021. DOI: 10.1145/3464385.3464726.
12
LEE M H, ACKERMANS S, VAN AS N, et al. Caring for Vincent: a Chatbot for Self-compassion[C]//Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems. New York: ACM, 2019. DOI: 10.1145/3290605.3300932.
13
SHARMA A, LIN I W, MINER A S, et al. Towards Facilitating Empathic Conversations in Online Mental Health Support: A Reinforcement Learning Approach[C]//Proceedings of the Web Conference 2021. New York: ACM, 2021: 194-205. DOI: 10.1145/3442381.3450097 .
14
SHARMA A, MINER A, ATKINS D, et al. A Computational Approach to Understanding Empathy Expressed in Text-based Mental Health Support[C]//Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP). Stroudsburg, PA, USA: Association for Computational Linguistics, 2020: 5263-5276. DOI: 10.18653/v1/2020.emnlp-main.425 .
15
RASHKIN H, SMITH E M, LI M, et al. Towards Empathetic Open-domain Conversation Models: A New Benchmark and Dataset[C]//Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics. Stroudsburg, PA, USA: Association for Computational Linguistics, 2019: 5370-5381. DOI: 10.18653/v1/p19-1534 .
16
KIM J, SHIN E, HAN K, et al. Efficacy of Smart Speaker-based Metamemory Training in Older Adults: Case-control Cohort Study[J]. J Med Internet Res, 2021, 23(2): e20177. DOI: 10.2196/20177 .
17
DEVLIN J, CHANG M W, LEE K, et al. BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding[C]//Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers). Stroudsburg, PA, USA: Association for Computational Linguistics, 2019: 4171-4186. DOI: 10.18653/v1/N19-1423.
18
SHI X J, CHEN Z R, WANG H, et al. Convolutional LSTM Network: A Machine Learning Approach for Precipitation Nowcasting[C]//NIPS'15: Proceedings of the 29th International Conference on Neural Information Processing Systems. Montreal, Canada: ML Research Press, 2015: 802-810.
19
VASWANI A, SHAZEER N, PARMAR N, et al. Attention Is All You Need[C]//Advances in Neural Information Processing Systems 30. Red Hook, NY, USA: Curran Associates, 2017: 6000-6010. DOI: 10.5555/3295222.3295349.
20
REIMERS N, GUREVYCH I. Sentence-BERT: Sentence Embeddings Using Siamese BERT-networks[C]//Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP). Stroudsburg, PA, USA: Association for Computational Linguistics, 2019: 3982-3992. DOI: 10.18653/v1/d19-1410 .
21
DU Z X, QIAN Y J, LIU X, et al. GLM: General Language Model Pretraining with Autoregressive Blank Infilling[C]//Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers). Stroudsburg, PA, USA: Association for Computational Linguistics, 2022: 320-335. DOI: 10.18653/v1/2022.acl-long.26 .
22
PAPINENI K, ROUKOS S, WARD T, et al. BLEU: a Method for Automatic Evaluation of Machine Translation[C]//Proceedings of the 40th Annual Meeting of the Association for Computational Linguistics (ACL '02). Morristown, NJ, USA: Association for Computational Linguistics, 2002: 311-318. DOI: 10.3115/1073083.1073135.
24
ZHANG T Y, KISHORE V, WU F, et al. BERTScore: Evaluating Text Generation with BERT[EB/OL]. (2020-02-24)[2024-03-28].
25
WEI J, WANG X Z, SCHUURMANS D, et al. Chain-of-Thought Prompting Elicits Reasoning in Large Language Models[C]//NIPS'22: Proceedings of the 36th International Conference on Neural Information Processing Systems. New Orleans, LA, USA: Curran Associates Inc., 2022: 24824-24837. DOI: 10.5555/3600270.3602070.
26
LIU P F, YUAN W Z, FU J L, et al. Pre-train, Prompt, and Predict: A Systematic Survey of Prompting Methods in Natural Language Processing[J]. ACM Comput Surv, 2023, 55(9): 1-35. DOI: 10.1145/3560815 .
27
WANG Y, KE P, ZHENG Y, et al. A Large-scale Chinese Short-text Conversation Dataset[C]//Natural Language Processing and Chinese Computing: 9th CCF International Conference, NLPCC 2020, Zhengzhou, China, Proceedings, Part I. Cham: Springer, 2020: 91-103. DOI: 10.1007/978-3-030-60450-9_8.
28
XU L, ZHANG X W, DONG Q Q. CLUECorpus2020: a Large-scale Chinese Corpus for Pre-training Language Model[EB/OL]. (2020-03-05)[2024-03-28].
29
RADFORD A, WU J, CHILD R, et al. Language Models are Unsupervised Multitask Learners[J]. OpenAI blog, 2019, 1(8): 9.

Footnotes

1 https://huggingface.co/uer/sbert-base-chinese-nli

Funding

Beijing Natural Science Foundation (4242019)
