Metadata-Version: 2.1
Name: bert-summarizer
Version: 0.1.1
Summary: Text Summarization Library based on transformers
Home-page: https://github.com/k-tahiro/bert-summarizer
Author: k-tahiro
Author-email: tahiro.k.ad@gmail.com
Maintainer: k-tahiro
Maintainer-email: tahiro.k.ad@gmail.com
License: MIT
Platform: UNKNOWN
Classifier: Development Status :: 2 - Pre-Alpha
Classifier: License :: OSI Approved :: MIT License
Classifier: Programming Language :: Python :: 3.6
Classifier: Programming Language :: Python :: 3.7
Classifier: Programming Language :: Python :: 3.8
Description-Content-Type: text/markdown
Requires-Dist: ginza (>=4.0.5)
Requires-Dist: transformers[ja,torch] (>=4.0.0)

**BERT-based text summarizers**

## Table of Contents

- [Table of Contents](#table-of-contents)
- [About the Project](#about-the-project)
- [Getting started](#getting-started)
- [Usage](#usage)
- [Roadmap](#roadmap)
- [Contributing](#contributing)
- [License](#license)
- [Contact](#contact)
- [Acknowledgements](#acknowledgements)

## About the Project

This repository provides text summarization models built on top of BERT and the Hugging Face `transformers` library.

## Getting started

TBW
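
Until this section is written, the package can presumably be installed from PyPI under the name given in the package metadata (assuming a release has been published there):

```shell
pip install bert-summarizer
```

Python 3.6–3.8 is targeted, per the classifiers above; the `ginza` and `transformers[ja,torch]` dependencies are pulled in automatically.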

## Usage

TBW

## Roadmap

- BertSumExt: the extractive summarization model from [Text Summarization with Pretrained Encoders](https://arxiv.org/abs/1908.08345)
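
To illustrate the idea behind extractive models like BertSumExt: each sentence is encoded into a vector, scored, and the top-scoring sentences are kept as the summary. The sketch below is purely conceptual and is not this library's API; it swaps BERT embeddings for toy bag-of-words vectors and scores each sentence by cosine similarity to the document centroid, just to keep the example self-contained and runnable.

```python
from collections import Counter
import math

def embed(sentence):
    # Toy stand-in for a BERT sentence embedding: a bag-of-words count vector.
    return Counter(sentence.lower().split())

def cosine(a, b):
    # Cosine similarity between two sparse count vectors.
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def extract_summary(sentences, k=1):
    # Score each sentence against the document centroid, then keep the
    # top-k sentences in their original document order.
    vecs = [embed(s) for s in sentences]
    centroid = Counter()
    for v in vecs:
        centroid.update(v)
    ranked = sorted(range(len(sentences)),
                    key=lambda i: cosine(vecs[i], centroid),
                    reverse=True)
    keep = sorted(ranked[:k])
    return [sentences[i] for i in keep]

doc = [
    "BERT encodes each sentence into a vector.",
    "A classifier scores each sentence vector.",
    "The top-scoring sentences form the summary.",
]
print(extract_summary(doc, k=1))
```

BertSumExt replaces the toy `embed` with BERT (inserting `[CLS]` tokens per sentence) and the centroid heuristic with a learned classifier over the sentence vectors, but the extract-top-k structure is the same.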

## Contributing

TBW

## License

Distributed under the MIT License. See `LICENSE` for more information.

## Contact

Keisuke Hirota - [@rad0717](https://twitter.com/rad0717) - tahiro.k.ad[at]gmail.com

Project Link: [https://github.com/k-tahiro/bert-summarizer](https://github.com/k-tahiro/bert-summarizer)

## Acknowledgements

- EMNLP 2019 paper [Text Summarization with Pretrained Encoders](https://arxiv.org/abs/1908.08345)
  - [nlpyang/PreSumm](https://github.com/nlpyang/PreSumm)


