Story Generation by Lara J. Martin
The problem of improvisational story generation involves one or more agents collaborating in order to create a story without any advance notice of topic. We present a pipeline for an artificial agent that is capable of improvisational storytelling while collaborating with a human agent. Starting with story corpora, we "eventify" sentences, which creates a simplified and abstracted representation. The rest of the pipeline (the agent's response) is broken into three parts: generating successive events (event-to-event), translating events back into natural language (event-to-sentence), and plugging the specifics of the story back into the generated sentences (slot filling). We discuss techniques for each of these sub-problems.
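As a concrete illustration of the "eventify" step, the sketch below reduces a sentence to an abstract tuple. The 4-tuple fields, the placeholder value, and the use of spaCy are assumptions made for illustration, not the exact representation used in the paper.

```python
# Minimal "eventification" sketch: reduce a sentence to an abstract
# (subject, verb, object, modifier) tuple. The fields and the use of
# spaCy here are illustrative assumptions, not the paper's exact scheme.
import spacy

nlp = spacy.load("en_core_web_sm")  # assumes the small English model is installed

def eventify(sentence: str) -> tuple:
    doc = nlp(sentence)
    subj = verb = obj = mod = "EmptyParameter"  # placeholder for missing slots
    for token in doc:
        if token.dep_ in ("nsubj", "nsubjpass"):
            subj = token.lemma_
        elif token.pos_ == "VERB" and token.dep_ == "ROOT":
            verb = token.lemma_
        elif token.dep_ in ("dobj", "obj"):
            obj = token.lemma_
        elif token.dep_ == "pobj":
            mod = token.lemma_
    return (subj, verb, obj, mod)

print(eventify("The knight rode his horse to the castle."))
# e.g. ('knight', 'ride', 'horse', 'castle')
```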
Proceedings of the AAAI Conference on Artificial Intelligence
Neural network-based approaches to automated story plot generation attempt to learn how to generate novel plots from a corpus of natural language plot summaries. Prior work has shown that a semantic abstraction of sentences called events improves neural plot generation and allows one to decompose the problem into (1) the generation of a sequence of events (event-to-event) and (2) the transformation of these events into natural language sentences (event-to-sentence). However, typical neural language generation approaches to event-to-sentence can ignore the event details and produce grammatically correct but semantically unrelated sentences. We present an ensemble-based model that generates natural language guided by events. We provide results, including a human subjects study, for a full end-to-end automated story generation system, showing that our method generates more coherent and plausible stories than baseline approaches.
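The abstract does not spell out the ensemble's members, so the sketch below only shows the general shape such an ensemble might take: each member proposes a sentence with a confidence score, and the first proposal that clears its threshold is kept. The member models, thresholds, and fallback template are placeholders.

```python
# Sketch of a confidence-thresholded ensemble cascade for event-to-sentence.
# Member models and thresholds are placeholders, not the paper's ensemble.
from typing import Callable, List, Tuple

Event = Tuple[str, str, str, str]
# Each member is (model, confidence_threshold); a model maps an event
# to a (sentence, confidence) pair.
Member = Tuple[Callable[[Event], Tuple[str, float]], float]

def ensemble_event_to_sentence(event: Event, members: List[Member]) -> str:
    for model, threshold in members:
        sentence, confidence = model(event)
        if confidence >= threshold:
            return sentence
    # Last-resort fallback: a bare template that at least keeps the event's content.
    subj, verb, obj, _ = event
    return f"{subj} {verb} {obj}."
```

Ordering members from most precise to most general lets a specialized model keep the event details when it is confident, while the template fallback guarantees some output is produced.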
Proceedings of the Twenty-Eighth International Joint Conference on Artificial Intelligence
Language-modeling-based approaches to story plot generation attempt to construct a plot by sampling from a language model (LM) to predict the next character, word, or sentence to add to the story. LM techniques lack the ability to receive guidance from the user to achieve a specific goal, resulting in stories that lack a clear sense of progression and coherence. We present a reward-shaping technique that analyzes a story corpus and produces intermediate rewards that are backpropagated into a pre-trained LM in order to guide the model toward a given goal. Automated evaluations show our technique can create a model that generates story plots which consistently achieve a specified goal. Human-subject studies show that the generated stories have more plausible event ordering than baseline plot generation techniques.
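As an illustration of what corpus-derived intermediate rewards could look like, here is a minimal sketch that rewards verbs in proportion to how often, and how closely, they precede the goal verb in training stories. The scoring heuristic is a simplification assumed for illustration, not the paper's exact formulation.

```python
# Simplified sketch of corpus-derived reward shaping: verbs that frequently
# appear close before the goal verb in training stories earn a larger
# intermediate reward. The scoring below is an illustrative heuristic only.
from collections import defaultdict

def shaped_rewards(stories, goal_verb):
    """stories: list of verb sequences (one list of verbs per story)."""
    total_distance = defaultdict(float)
    count = defaultdict(int)
    for verbs in stories:
        if goal_verb not in verbs:
            continue
        goal_idx = verbs.index(goal_verb)
        for i, v in enumerate(verbs[:goal_idx]):
            total_distance[v] += goal_idx - i
            count[v] += 1
    # More frequent verbs with a smaller average distance to the goal get more reward.
    return {v: count[v] / (total_distance[v] / count[v]) for v in count}

rewards = shaped_rewards(
    stories=[["meet", "fight", "flee", "marry"], ["meet", "court", "marry"]],
    goal_verb="marry",
)
```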
Proceedings of the Second Workshop on Storytelling
ArXiv, 2018
Open story generation is the problem of automatically creating a story for any domain without retraining. Neural language models can be trained on large corpora across many domains and then used to generate stories. However, stories generated via language models tend to lack direction and coherence. We introduce a policy gradient reinforcement learning approach to open story generation that learns to achieve a given narrative goal state. In this work, the goal is for a story to end with a specific type of event, given in advance. However, a reward based on achieving the given goal is too sparse for effective learning. We use reward shaping to provide the reinforcement learner with a partial reward at every step. We show that our technique can train a model that generates a story that reaches the goal 94% of the time and reduces model perplexity. A human subject evaluation shows that stories generated by our technique are perceived to have significantly higher plausible event ordering…
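The sketch below shows where such shaped partial rewards would enter a generic REINFORCE-style policy-gradient update; the discounting and loss are standard textbook choices, not necessarily the paper's exact setup.

```python
# Generic REINFORCE-style update using a shaped partial reward at every step
# instead of a single sparse end-of-story reward. The policy, optimizer, and
# reward values are placeholders; this is an illustrative sketch only.
import torch

def reinforce_step(optimizer, log_probs, step_rewards, gamma=0.99):
    """log_probs: list of 0-dim tensors, log-probabilities of the sampled events;
    step_rewards: shaped partial reward received after each event."""
    returns, running = [], 0.0
    for r in reversed(step_rewards):          # discounted return at every step
        running = r + gamma * running
        returns.insert(0, running)
    returns = torch.tensor(returns)
    loss = -(torch.stack(log_probs) * returns).sum()
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```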
Automated story generation is the problem of automatically selecting a sequence of events, actions, or words that can be told as a story. We seek to develop a system that can generate stories by learning everything it needs to know from textual story corpora. To date, recurrent neural networks that learn language models at character, word, or sentence levels have had little success generating coherent stories. We explore the question of event representations that provide a mid-level of abstraction between words and sentences in order to retain the semantic information of the original data while minimizing event sparsity. We present a technique for preprocessing textual story data into event sequences. We then present a technique for automated story generation whereby we decompose the problem into the generation of successive events (event2event) and the generation of natural language sentences from events (event2sentence). We give empirical results comparing different event representations…
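Putting the two stages together, the generation loop might look roughly like the following, with `eventify`, `event2event`, and `event2sentence` left as placeholder callables standing in for the learned models.

```python
# Sketch of the event2event / event2sentence decomposition as a loop.
# All three callables are placeholders standing in for learned models.
def generate_story(opening_sentence, eventify, event2event, event2sentence, n_events=5):
    events = [eventify(opening_sentence)]            # abstract the starting sentence
    sentences = [opening_sentence]
    for _ in range(n_events):
        next_event = event2event(events)             # predict the next abstract event
        sentences.append(event2sentence(next_event)) # realize it as natural language
        events.append(next_event)
    return " ".join(sentences)
```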
Lecture Notes in Computer Science, 2016
ArXiv, 2021
The advent of large pre-trained generative language models has provided a common framework for AI story generation via sampling the model to create sequences that continue the story. However, sampling alone is insufficient for story generation. In particular, it is hard to direct a language model to create stories that reach a specific goal event. We present two automated techniques grounded in deep reinforcement learning and reward shaping to control the plot of computer-generated stories. The first utilizes proximal policy optimization to fine-tune an existing transformer-based language model to generate text continuations that are also goal-seeking. The second extracts a knowledge graph from the unfolding story, which is used by a policy network with graph attention to select a candidate continuation generated by a language model. We report on automated metrics pertaining to how often stories achieve a given goal event, as well as human participant rankings of coherence and overall story…
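A rough sketch of the second technique's selection loop is shown below; the knowledge-graph extraction and the graph-attention policy scorer are left as stubs, since the abstract does not specify them.

```python
# Sketch of knowledge-graph-guided candidate selection: a language model
# proposes several continuations and a graph-attention policy picks one.
# language_model, extract_graph, and policy_scorer are placeholder stubs.
def kg_guided_step(story_so_far, language_model, extract_graph, policy_scorer, n_candidates=10):
    graph = extract_graph(story_so_far)                     # KG of the unfolding story
    candidates = [language_model(story_so_far) for _ in range(n_candidates)]
    scores = [policy_scorer(graph, c) for c in candidates]  # goal-directed policy scores
    best = candidates[scores.index(max(scores))]
    return story_so_far + " " + best
```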
Dungeons and Dragons by Lara J. Martin
Game playing has been an important testbed for artificial intelligence. Board games, first-person shooters, and real-time strategy games have well-defined win conditions and rely on strong feedback from a simulated environment. Text adventures require natural language understanding to progress through the game but still have an underlying simulated environment. In this paper, we propose tabletop roleplaying games as a challenge due to an infinite action space, multiple (collaborative) players and models of the world, and no explicit reward signal. We present an approach for reinforcement learning agents that can play tabletop roleplaying games.
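To make the challenge concrete, a hypothetical environment interface for a tabletop roleplaying agent might look like the sketch below: the action is free-form text and the environment returns no scalar reward. This interface is illustrative only and is not the approach described in the paper.

```python
# Illustrative sketch of why tabletop roleplaying resists the standard RL
# framing: actions are free-form text and the game provides no scalar reward.
# game_master and players are placeholder collaborator objects.
class TabletopEnvironment:
    def __init__(self, game_master, players):
        self.game_master = game_master      # narrates the world state as text
        self.players = players              # other (human or agent) collaborators

    def step(self, action_text: str):
        """action_text: any natural-language action, e.g. 'I sneak past the guards.'"""
        narration = self.game_master.respond(action_text)
        reward = None   # no explicit reward signal; it must be inferred or shaped
        done = False
        return narration, reward, done
```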
Artificial Intelligence and Interactive Digital Entertainment Conference, 2018
Speech Translation/Language Learning by Lara J. Martin
2015 IEEE Workshop on Automatic Speech Recognition and Understanding (ASRU), 2015