This paper introduces a new natural language generation (NLG) architecture that is sensitive to surface stylistic requirements. It brings together a well-founded linguistic theory that has been used in many successful NLG systems, Systemic Functional Linguistics (SFL), and an existing AI search mechanism, the Assumption-based Truth Maintenance System (ATMS), which caches important search information and avoids duplicated work. We describe a technique for converting systemic grammar networks into dependency networks that an ATMS can reason with; the generator then uses the translated networks to generate natural language texts. We also describe how surface constraints can be incorporated within the new architecture, and we evaluate the efficiency of the resulting system.
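The ATMS avoids duplicated search work by labelling each node with the minimal sets of assumptions (environments) under which it holds. The following toy sketch of that label-combination step is our own illustration under standard ATMS semantics, not code from the paper:

```python
# Illustrative ATMS-style label combination (hypothetical sketch, not the
# paper's implementation). Each label is a set of environments; each
# environment is a frozenset of assumptions under which a node holds.
from itertools import product


def combine_labels(antecedent_labels):
    """Combine the labels of a node's antecedents.

    Takes one label (a collection of environments) per antecedent, unions
    one environment from each, and keeps only the subset-minimal results,
    so each derived fact is cached with its weakest supporting assumptions.
    """
    envs = set()
    for combo in product(*antecedent_labels):
        envs.add(frozenset().union(*combo))
    # Keep only minimal environments: drop any that strictly contain another.
    return {e for e in envs if not any(other < e for other in envs)}
```

For example, if one antecedent holds under assumption `a` and another holds under either `b` or `a`, the combined node holds minimally under `{a}` alone, since the environment `{a, b}` is subsumed by `{a}` and is pruned.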
Number of pages: 10
Publication status: Published - 2004