Yashhuc/Text-Summarizer
Project Overview

The purpose of this abstractive text summarizer is to automatically generate concise, coherent summaries of longer texts while preserving the main ideas and key information. Abstractive summarization is a natural language processing (NLP) task in which the model composes new sentences to express the source content, rather than simply extracting sentences verbatim.

Model Used:

• BART-Large is a variant of BART (Bidirectional and Auto-Regressive Transformers), a sequence-to-sequence language generation model. BART-Large is pretrained on a large-scale corpus and has more parameters than the base BART model.

• With its increased capacity, BART-Large performs strongly on natural language processing tasks such as text summarization, translation, and text generation. The larger model can capture more intricate language patterns, which improves both language understanding and generation. A minimal usage sketch follows below.
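
The summarizer can be driven with the Hugging Face transformers library. Below is a minimal sketch assuming the publicly available facebook/bart-large-cnn checkpoint; the repository overview does not specify the exact checkpoint or generation parameters used in this project, so treat the values here as illustrative.

```python
# Minimal sketch: abstractive summarization with a BART-Large checkpoint.
# Assumes facebook/bart-large-cnn; the project's actual checkpoint and
# generation settings may differ.
from transformers import pipeline

summarizer = pipeline("summarization", model="facebook/bart-large-cnn")

article = (
    "BART is a denoising autoencoder for pretraining sequence-to-sequence "
    "models. It is trained by corrupting text with a noising function and "
    "learning to reconstruct the original text. Fine-tuned on summarization "
    "data, it generates fluent abstractive summaries of long documents."
)

# max_length / min_length bound the summary length in tokens (illustrative values).
result = summarizer(article, max_length=60, min_length=20, do_sample=False)
print(result[0]["summary_text"])
```

Setting do_sample=False makes the pipeline use beam search style decoding, which tends to give more stable, repeatable summaries than sampling.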

A brief look at the project:

(Screenshots of the application interface, captured 2024-07-14.)
