A bidirectional encoder-decoder LSTM neural network trained for text summarization on the CNN/DailyMail dataset. (MIT808 project)
Updated Oct 22, 2018 - Python
In this repository, I have developed a CycleGAN architecture with embedded self-attention layers that can solve three different complex tasks. The same neural network architecture is used for all three tasks. Although, truth be told, my model has not exceeded any state-of-the-art performances for the given ta…
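The self-attention layer mentioned above can be sketched minimally. This is not the repository's code: it is an illustrative NumPy implementation of scaled dot-product self-attention, with random projection matrices (`Wq`, `Wk`, `Wv`) standing in for learned weights.

```python
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention over a sequence of feature vectors."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv            # project input into queries/keys/values
    scores = Q @ K.T / np.sqrt(K.shape[1])      # pairwise similarities, scaled by sqrt(d)
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)  # row-wise softmax
    return weights @ V, weights                 # attended features and attention map

rng = np.random.default_rng(0)
T, d = 6, 8                                     # 6 positions, 8 features each
X = rng.normal(size=(T, d))
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
out, attn = self_attention(X, Wq, Wk, Wv)
```

Each output row is a convex combination of the value vectors, so every position can draw on features from every other position, which is what lets a GAN generator model long-range structure in an image.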
NLP - Semantic Role Labeling using GCN, BERT and a biaffine attention layer. Developed in PyTorch.
A deep learning network: a ResNet with an attention layer that can be used on a custom dataset.
A PyTorch sentiment classification example using the NSMC dataset.
Using a deep learning model that combines an LSTM with a custom attention layer, we train on reviews and their existing summaries to generate brand-new summaries.
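The decoder-side attention step that such a summarizer relies on can be sketched as follows. This is a hedged NumPy illustration, not the repository's actual code: random vectors stand in for the LSTM encoder outputs and the decoder state.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def attend(encoder_states, decoder_state):
    """Dot-product attention: weight each encoder timestep by its
    relevance to the current decoder state, then take a weighted sum."""
    scores = encoder_states @ decoder_state   # one score per source timestep
    weights = softmax(scores)                 # normalized attention distribution
    context = weights @ encoder_states        # context vector fed to the decoder
    return context, weights

rng = np.random.default_rng(1)
T, d = 5, 4
H = rng.normal(size=(T, d))   # stand-in for LSTM encoder outputs over 5 timesteps
s = rng.normal(size=(d,))     # stand-in for the current decoder hidden state
context, weights = attend(H, s)
```

At each decoding step the context vector lets the model focus on the source tokens most relevant to the word being generated, rather than compressing the whole review into one fixed vector.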