Paper List for (Unconditional) Neural Text Generation

desire2020/NTG-Papers


Paper Collection of Neural Text Generation (NTG)

Neural Text Generation refers to a family of methods that use neural networks as function approximators to model the underlying distribution of (natural) language. The most prominent applications of the conditional version of this task are Neural Machine Translation (NMT), neural image captioning, and dialogue systems (chatbots). However, NTG research usually refers to work on the unconditional problem: learning the latent distribution of the target language itself, rather than a mapping from a source form to a target form.
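To make the unconditional setting concrete: these models factorize the probability of a sentence autoregressively, p(x) = ∏ₜ p(xₜ | x₍₍ₜ₎₎), and sample text token by token from the learned conditionals. The sketch below (not from any paper in this list) illustrates that factorization with a count-based bigram table standing in for the neural approximator, so the example stays dependency-free; the corpus and all names are made up for illustration.

```python
import random
from collections import defaultdict

def train_bigram(corpus):
    """Estimate p(next_token | token) by counting bigrams in a toy corpus.
    An NTG model would fit these conditionals with a neural network instead."""
    counts = defaultdict(lambda: defaultdict(int))
    for sentence in corpus:
        tokens = ["<s>"] + sentence.split() + ["</s>"]
        for prev, nxt in zip(tokens, tokens[1:]):
            counts[prev][nxt] += 1
    # Normalize counts into conditional probabilities.
    return {
        prev: {w: c / sum(nxt.values()) for w, c in nxt.items()}
        for prev, nxt in counts.items()
    }

def sample(model, rng, max_len=20):
    """Draw one sentence from the learned distribution, token by token."""
    out, token = [], "<s>"
    for _ in range(max_len):
        words = list(model[token])
        probs = [model[token][w] for w in words]
        token = rng.choices(words, weights=probs)[0]
        if token == "</s>":
            break
        out.append(token)
    return " ".join(out)

corpus = ["the cat sat", "the dog sat", "the cat ran"]
model = train_bigram(corpus)
print(sample(model, random.Random(0)))
```

Replacing the count table with a recurrent or transformer network that predicts p(xₜ | x₍₍ₜ₎₎) yields the neural models surveyed in the papers below.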

This repository collects research papers on Neural Text Generation (NTG), organized into a taxonomy by publication time, method paradigm, and paper type.

Taxonomy of Papers

Surveys and Theoretical Analyses

Metrics, Toolboxes and Datasets

Online-Available Courses

Research Papers
