Teaching Large Language Models Number-Focused Headline Generation With Key Element Rationales
Conference paper · Open access


Zhen Qian, Xiuzhen Zhang, Xiaofei Xu and Feng Xia
Findings of the Association for Computational Linguistics: NAACL 2025 (Albuquerque, New Mexico, 29 April–4 May 2025)
Published: 2 May 2025

Abstract

Keywords: Large Language Models, Natural Language Processing
Number-focused headline generation is a summarization task requiring both high textual quality and precise numerical accuracy, which poses a unique challenge for Large Language Models (LLMs). Existing studies in the literature focus only on either textual quality or numerical reasoning and thus are inadequate to address this challenge. In this paper, we propose a novel chain-of-thought framework that uses rationales comprising the key elements of Topic, Entities, and Numerical reasoning (TEN) in news articles to enhance the capability of LLMs to generate topic-aligned, high-quality texts with precise numerical accuracy. Specifically, a teacher LLM is employed to generate TEN rationales as supervision data, which are then used to fine-tune a student LLM. Our approach teaches the student LLM to automatically generate rationales, with enhanced capability for numerical reasoning and topic-aligned numerical headline generation. Experiments show that our approach achieves superior performance in both textual quality and numerical accuracy.
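The teacher-to-student setup described in the abstract could be sketched as follows. This is an illustrative Python sketch, not the paper's actual code: it shows one plausible way to serialize a teacher-generated TEN rationale (Topic, Entities, Numerical reasoning) and pair it with the gold headline as a fine-tuning target for the student LLM. All function names, prompt wording, and the example article are hypothetical.

```python
# Hypothetical sketch of building TEN-rationale supervision data for
# fine-tuning a student LLM. The teacher LLM is assumed to have already
# produced the topic, entities, and numerical-reasoning fields.

def format_ten_rationale(topic: str, entities: list[str], numerical_reasoning: str) -> str:
    """Serialize the three key elements into a single rationale string."""
    return (
        f"Topic: {topic}\n"
        f"Entities: {', '.join(entities)}\n"
        f"Numerical reasoning: {numerical_reasoning}"
    )

def build_training_example(article: str, rationale: str, headline: str) -> dict:
    """Pair an article with a rationale-then-headline completion, so the
    student learns to emit the rationale before the final headline."""
    prompt = (
        "Article:\n" + article
        + "\n\nFirst write the key-element rationale, then the headline."
    )
    completion = rationale + "\nHeadline: " + headline
    return {"prompt": prompt, "completion": completion}

# Toy example with a hypothetical teacher-generated rationale.
rationale = format_ten_rationale(
    topic="quarterly earnings",
    entities=["Acme Corp"],
    numerical_reasoning="Revenue rose from $2.0B to $2.5B, a 25% increase.",
)
example = build_training_example(
    article="Acme Corp reported revenue of $2.5B, up from $2.0B last year.",
    rationale=rationale,
    headline="Acme Corp Revenue Jumps 25% to $2.5B",
)
```

In this sketch the rationale precedes the headline in the training completion, so that at inference time the student produces its numerical reasoning before committing to a number in the headline, mirroring the chain-of-thought framing described above.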
