
Generate a summary from a real article with machine learning.


Noah-zero/generate_a_text_sumarizer

 
 


Automatic_text_summarizer_generator_with_DeepLearning

This is the code for "How to Make a Text Summarizer - Intro to Deep Learning".

Coding Challenge - Due Date - Thursday, March 23rd at 12 PM PST

The challenge for this video is to make a text summarizer for a set of articles with Keras. You can use any textual dataset to do this. By doing this you'll learn more about encoder-decoder architecture and the role of attention in deep learning. Good luck!

Overview

This is the code for the Deep Learning video that explains the fundamentals of this project. We're using an encoder-decoder architecture to generate a headline from a news article. A dataset of 1M articles and blog posts is used for training.

Dependencies

  • TensorFlow or Theano
  • Keras
  • python-Levenshtein (pip install python-levenshtein)

Use pip to install any missing dependencies.

Basic Usage

Data

The video example is built from the text at the start of the article, which I call the description (or desc), and the text of the original headline (or head). The texts should already be tokenized, with the tokens separated by spaces. This is a good example dataset: you can use the 'content' field as the 'desc' and the 'title' field as the 'head'.

Once you have the data ready, save it in a Python pickle file as a tuple: (heads, descs, keywords), where heads is a list of all the headline strings and descs is a list of all the article strings, in the same order and of the same length as heads. The keywords information is ignored, so you can store None in its place.
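A minimal sketch of that pickling step. The file name and the toy tokenized strings are made up for illustration; your real heads and descs come from your own dataset.

```python
import pickle

# Toy tokenized examples (hypothetical); tokens are separated by spaces.
heads = ["arsenal consider new manager", "stocks rise on strong earnings"]
descs = ["arsene wenger 's future at arsenal remains uncertain after ...",
         "shares climbed on friday after strong quarterly earnings ..."]

# Save as the (heads, descs, keywords) tuple; keywords are ignored, so store None.
with open("tokenized.pkl", "wb") as f:
    pickle.dump((heads, descs, None), f)

# Reload to verify the round trip.
with open("tokenized.pkl", "rb") as f:
    heads2, descs2, keywords = pickle.load(f)
```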

Here is a link on how to get similar datasets

Build a vocabulary of words

The vocabulary-embedding notebook describes how a dictionary is built for the tokens and how an initial embedding matrix is built from GloVe.
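A rough sketch of the idea, not the notebook's exact code: index the most frequent tokens (reserving slots for padding and out-of-vocabulary), then seed an embedding matrix with GloVe vectors where a word is covered and random values elsewhere. The function names, the `<pad>`/`<unk>` markers, and the tiny stand-in GloVe dict are my own assumptions.

```python
import numpy as np
from collections import Counter

def build_vocab(texts, max_size=40000):
    """Map each token to an integer index; 0 is padding, 1 is out-of-vocabulary."""
    counts = Counter(tok for text in texts for tok in text.split())
    words = ["<pad>", "<unk>"] + [w for w, _ in counts.most_common(max_size)]
    return {w: i for i, w in enumerate(words)}

def init_embeddings(word2idx, glove, dim=100, seed=0):
    """Random init, overwritten with GloVe vectors for words GloVe covers."""
    rng = np.random.default_rng(seed)
    emb = rng.normal(scale=0.1, size=(len(word2idx), dim))
    for w, i in word2idx.items():
        if w in glove:
            emb[i] = glove[w]
    return emb

word2idx = build_vocab(["the cat sat", "the dog sat"])
glove = {"the": np.ones(100)}  # stand-in for vectors parsed from a GloVe text file
emb = init_embeddings(word2idx, glove)
```

The resulting matrix is what you would hand to the Keras Embedding layer as its initial weights.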

Train a model

The train notebook describes how a model is trained on the data using Keras.
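To make the architecture concrete, here is a minimal teacher-forced encoder-decoder in Keras. This is a sketch of the general idea only, not the notebook's actual model (which also uses attention); the vocabulary size, sequence lengths, and hidden sizes are arbitrary placeholders.

```python
from tensorflow.keras.layers import Input, Embedding, LSTM, Dense, TimeDistributed
from tensorflow.keras.models import Model

VOCAB, EMB_DIM, HIDDEN = 5000, 100, 128   # placeholder hyperparameters
DESC_LEN, HEAD_LEN = 50, 15               # placeholder sequence lengths

desc_in = Input(shape=(DESC_LEN,), name="desc")   # tokenized article text
head_in = Input(shape=(HEAD_LEN,), name="head")   # tokenized headline (teacher forcing)
embed = Embedding(VOCAB, EMB_DIM, mask_zero=True) # initialize from the GloVe matrix in practice

# Encoder reads the description; its final LSTM state seeds the decoder.
_, state_h, state_c = LSTM(HIDDEN, return_state=True)(embed(desc_in))
dec = LSTM(HIDDEN, return_sequences=True)(embed(head_in),
                                          initial_state=[state_h, state_c])
probs = TimeDistributed(Dense(VOCAB, activation="softmax"))(dec)

model = Model([desc_in, head_in], probs)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
```

Training would then call `model.fit` with the integer-encoded descriptions and headlines, with the target headline shifted one step ahead of the decoder input.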

Use model to generate new headlines

The predict notebook generates headlines with the trained model and shows the attention weights used to pick words from the description. The text generation includes a feature that was not described in the original paper: it allows words that are outside the training vocabulary to be copied from the description into the generated headline.
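The copy feature can be sketched as a post-processing step on the decoder output: whenever the model emits an out-of-vocabulary marker, substitute the description token that received the most attention at that step. The function name, the `<unk>` marker, and the example sentence are assumptions for illustration, not the notebook's actual code.

```python
import numpy as np

def copy_oov(pred_tokens, attention, desc_tokens, oov="<unk>"):
    """Replace each out-of-vocabulary prediction with the description
    token that received the most attention at that decoding step.
    attention has shape (len(pred_tokens), len(desc_tokens))."""
    out = []
    for step, tok in enumerate(pred_tokens):
        if tok == oov:
            out.append(desc_tokens[int(np.argmax(attention[step]))])
        else:
            out.append(tok)
    return out

desc = "patrick vieira could succeed wenger at arsenal".split()
pred = ["<unk>", "tipped", "to", "replace", "wenger"]
attn = np.zeros((len(pred), len(desc)))
attn[0, 1] = 1.0  # at step 0 the model attends to "vieira"
result = copy_oov(pred, attn, desc)  # "vieira" is copied into the headline
```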

Examples of headlines generated

Good (cherry-picked) examples of generated headlines are shown as images in the original repository.

Examples of attention weights

Attention-weight visualizations are shown as images in the original repository.

Credits

The credit for this code goes to udibr; I've merely created a wrapper to make it easier to get started.

Test with: http://www.dailymail.co.uk/sport/football/article-4338048/Could-Patrick-Vieira-succeed-Arsenal-boss-Arsene-Wenger.html
