SHAN

Siamese Hierarchical Attention Networks for Extractive Summarization

If you use this work, please cite the following references:

@article{shannlke18,
  author  = {González, José-Ángel and Segarra, Encarna and García-Granada, Fernando and Sanchis, Emilio and Hurtado, Lluís-F.},
  title   = {Siamese Hierarchical Attention Networks for Extractive Summarization},
  journal = {Journal of Intelligent and Fuzzy Systems},
  volume  = {36},
  number  = {5},
  pages   = {4599--4607},
  year    = {2019},
  doi     = {10.3233/JIFS-179011},
  url     = {http://dblp.uni-trier.de/db/journals/jifs/jifs36.html#GonzalezEGSH19}
}
@article{app9183836,
  author         = {González, J.-A. and Hurtado, L.-F. and Segarra, E. and García-Granada, F. and Sanchis, E.},
  title          = {Summarization of Spanish Talk Shows with Siamese Hierarchical Attention Networks},
  journal        = {Applied Sciences},
  volume         = {9},
  number         = {18},
  article-number = {3836},
  year           = {2019},
  issn           = {2076-3417},
  doi            = {10.3390/app9183836},
  url            = {https://www.mdpi.com/2076-3417/9/18/3836},
  abstract       = {In this paper, we present an approach to Spanish talk shows summarization. Our approach is based on the use of Siamese Neural Networks on the transcription of the show audios. Specifically, we propose to use Hierarchical Attention Networks to select the most relevant sentences for each speaker about a given topic in the show, in order to summarize his opinion about the topic. We train these networks in a siamese way to determine whether a summary is appropriate or not. Previous evaluation of this approach on summarization task of English newspapers achieved performances similar to other state-of-the-art systems. In the absence of enough transcribed or recognized speech data to train our system for talk show summarization in Spanish, we acquire a large corpus of document-summary pairs from Spanish newspapers and we use it to train our system. We choose this newspapers domain due to its high similarity with the topics addressed in talk shows. A preliminary evaluation of our summarization system on Spanish TV programs shows the adequacy of the proposal.}
}
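
The papers above describe a hierarchical attention encoder (word-level attention to build sentence vectors, sentence-level attention to build a document vector) trained in a siamese setup to score whether a candidate summary matches its source document. The following is a minimal sketch of that idea, assuming a Keras implementation; the layer sizes, the additive-attention formulation, and all constants and names are illustrative assumptions, not the code in this repository.

# Minimal sketch of a siamese hierarchical attention model (illustrative only).
import tensorflow as tf
from tensorflow.keras import layers, Model

MAX_SENTS, MAX_WORDS, VOCAB, EMB, HID = 30, 50, 20000, 100, 64  # assumed sizes

class Attention(layers.Layer):
    """Additive attention pooling over the time axis."""
    def build(self, input_shape):
        d = int(input_shape[-1])
        self.W = self.add_weight(name="W", shape=(d, d), initializer="glorot_uniform")
        self.b = self.add_weight(name="b", shape=(d,), initializer="zeros")
        self.u = self.add_weight(name="u", shape=(d, 1), initializer="glorot_uniform")

    def call(self, h):
        # h: (batch, time, d) -> attention-weighted sum over time
        score = tf.tanh(tf.tensordot(h, self.W, axes=1) + self.b)
        alpha = tf.nn.softmax(tf.tensordot(score, self.u, axes=1), axis=1)
        return tf.reduce_sum(alpha * h, axis=1)

# Word-level encoder: one sentence of word ids -> sentence vector
words_in = layers.Input(shape=(MAX_WORDS,), dtype="int32")
x = layers.Embedding(VOCAB, EMB)(words_in)
x = layers.Bidirectional(layers.GRU(HID, return_sequences=True))(x)
sent_encoder = Model(words_in, Attention()(x), name="word_encoder")

# Sentence-level encoder: document (sentences x words) -> document vector
doc_in = layers.Input(shape=(MAX_SENTS, MAX_WORDS), dtype="int32")
s = layers.TimeDistributed(sent_encoder)(doc_in)
s = layers.Bidirectional(layers.GRU(HID, return_sequences=True))(s)
doc_encoder = Model(doc_in, Attention()(s), name="doc_encoder")

# Siamese pairing: the same hierarchical encoder embeds both the document and
# a candidate summary; a small head scores whether the summary is appropriate.
a_in = layers.Input(shape=(MAX_SENTS, MAX_WORDS), dtype="int32", name="document")
b_in = layers.Input(shape=(MAX_SENTS, MAX_WORDS), dtype="int32", name="summary")
da, db = doc_encoder(a_in), doc_encoder(b_in)
pair = layers.Concatenate()([da, db, layers.Subtract()([da, db])])
out = layers.Dense(1, activation="sigmoid")(layers.Dense(HID, activation="relu")(pair))

shan = Model([a_in, b_in], out, name="shan_sketch")
shan.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
shan.summary()

Sharing doc_encoder across both inputs is what makes the pairing siamese: document and candidate summary are mapped into the same space by the same weights, and the head only has to judge their compatibility.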
