Tensors and Neural Networks for Creative Language Generation
Today's language processing systems are exclusively task-based: typically, a machine learning model is trained on a large amount of data, optimizing its parameters to produce the best possible output for a particular task. As such, the bulk of natural language processing algorithms mimic human language use and seldom produce creative output. This project investigates to what extent different kinds of creativity can be embedded in a computational model of language.

Linguistic theories of metaphor analyze metaphorical expressions as a conceptual mapping from one semantic domain to another, which makes metaphor a prototypical example of combinational creativity. The project will explore the generation of metaphor by manipulating the domain mappings induced by a tensor-based factorization model.

Secondly, to model creativity within larger textual units, from sentence level up to document level, neural network architectures will be exploited. Specifically, the project will develop constraint-based neural network architectures for creative language generation. Unconstrained neural networks seek to reproduce the data they were trained on; by constraining the network's language generation, it is encouraged to find novel ways to express the same semantic content, a process that lends itself to exploratory and transformational creativity.
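The tensor-based domain-mapping idea can be illustrated with a minimal sketch. This is not the project's actual model: the subject–verb–object counts below are invented, and a plain truncated SVD on a tensor unfolding stands in for whatever factorization the project uses. The sketch shows only the core intuition: nouns that share verb contexts end up close in the latent space, which is the seed of a conceptual mapping such as ARGUMENT-IS-WAR.

```python
import numpy as np

# Toy noun x verb x object count tensor (counts invented for illustration;
# a real model would estimate them from a corpus).
nouns = ["argument", "battle", "journey"]
verbs = ["attack", "win", "travel"]
objs = ["position", "enemy", "road"]

T = np.zeros((3, 3, 3))
T[0, 0, 0] = 5  # argument attack position
T[0, 1, 0] = 3  # argument win position
T[1, 0, 0] = 4  # battle attack position
T[1, 0, 1] = 6  # battle attack enemy
T[1, 1, 1] = 4  # battle win enemy
T[2, 2, 2] = 7  # journey travel road

# Mode-1 unfolding: each noun becomes a vector over (verb, object) pairs.
X = T.reshape(len(nouns), -1)

# Truncated SVD yields low-rank latent noun embeddings.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
E = U[:, :2] * s[:2]

def cos(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# "argument" and "battle" share verb contexts, so battle is the nearest
# latent neighbour of argument, while journey is unrelated.
sims = [cos(E[0], E[i]) for i in range(1, 3)]
source, target = "argument", nouns[1 + int(np.argmax(sims))]
print(source, "->", target)  # argument -> battle
```

Once a source noun is mapped into a target domain, the target domain's typical verbs and objects become candidates for metaphorical expressions about the source noun.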
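The constrained-generation idea can likewise be sketched in miniature. The hand-set bigram table below is a stand-in for a trained neural language model, and the constraint is a simple word ban; the point is only the mechanism: when the highest-probability wording is blocked, the decoder must find an alternative way to express the same content.

```python
# Toy bigram "language model" with hand-set probabilities (a stand-in for
# a trained neural network; all values are invented for illustration).
bigram = {
    "<s>": {"the": 1.0},
    "the": {"argument": 0.6, "debate": 0.4},
    "argument": {"was": 1.0},
    "debate": {"was": 1.0},
    "was": {"strong": 0.5, "compelling": 0.3, "fierce": 0.2},
    "strong": {"</s>": 1.0},
    "compelling": {"</s>": 1.0},
    "fierce": {"</s>": 1.0},
}

def generate(banned=frozenset()):
    """Greedy decoding; banned words get zero probability, forcing the
    model onto a different, still-fluent path through the vocabulary."""
    word, out = "<s>", []
    while word != "</s>":
        candidates = {w: p for w, p in bigram[word].items() if w not in banned}
        word = max(candidates, key=candidates.get)
        if word != "</s>":
            out.append(word)
    return " ".join(out)

print(generate())                        # the argument was strong
print(generate({"argument", "strong"}))  # the debate was compelling
```

In the project's setting the constraints would act on a neural decoder rather than a lookup table, but the effect is the same: blocking the default phrasing pushes the generator toward novel realizations of the same semantic content.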
Date: 10 Dec 2020 → Today
Keywords: tensors, neural networks, language generation, creativity, metaphor
Disciplines: Natural language processing, Computational linguistics