Enhancing the HAtten model for local citation recommendation using BiLSTM and attention pooling

Authors

  • Tran Dang Khoa Faculty of Information Technology, HUTECH University, 475A Dien Bien Phu Street, Thanh My Tay Ward, Ho Chi Minh City, Viet Nam https://orcid.org/0009-0005-5981-1339
  • Thi N. Dinh Graduate University of Science and Technology, Vietnam Academy of Science and Technology, 18 Hoang Quoc Viet Street, Nghia Do Ward, Ha Noi, Viet Nam
  • Phu Pham Faculty of Information Technology, HUTECH University, 475A Dien Bien Phu Street, Thanh My Tay Ward, Ho Chi Minh City, Viet Nam
  • Bay Vo Faculty of Information Technology, HUTECH University, 475A Dien Bien Phu Street, Thanh My Tay Ward, Ho Chi Minh City, Viet Nam https://orcid.org/0000-0002-9246-4587
  • Nguyen Nhu Son Institute of Information Technology, Vietnam Academy of Science and Technology, 18 Hoang Quoc Viet Street, Nghia Do Ward, Ha Noi, Viet Nam https://orcid.org/0000-0002-3901-2514

DOI:

https://doi.org/10.15625/1813-9663/23095

Keywords:

Local citation recommendation, BiLSTM, deep learning, natural language processing, attention pooling.

Abstract

Over the past decade, citation recommendation has gained increasing attention due to the exponential growth of scientific publications. Among various approaches, local citation recommendation (LCR), a content-based method leveraging textual context, has proven effective but faces scalability challenges when applied to large databases. To balance computational efficiency and accuracy, recent systems adopt a two-stage pipeline: a lightweight prefetching phase followed by a refined reranking stage. Building upon this direction, this paper proposes Enhanced-HAtten, an improved version of the HAtten-SciBERT model [1]. The proposed model retains the original two-stage architecture but augments the prefetching phase with a Bidirectional Long Short-Term Memory (BiLSTM) layer and attention pooling, enabling richer sequential and semantic representations. Experiments on two benchmark datasets, ACL-200 and FullTextPeerRead, demonstrate that Enhanced-HAtten consistently outperforms the original HAtten-SciBERT pipeline, yielding over 10% improvement in both Mean Reciprocal Rank (MRR) and Recall@K, confirming its effectiveness for large-scale scholarly recommendation tasks.
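The attention-pooling step the abstract refers to can be illustrated as follows: each BiLSTM hidden state is scored, the scores are softmax-normalized, and the hidden states are summed with those weights to produce a single context vector. This is a minimal NumPy sketch of that generic mechanism, not the paper's actual implementation; the learned score vector `w`, the dimensions, and the function name are illustrative assumptions.

```python
import numpy as np

def attention_pooling(H, w):
    """Pool a sequence of hidden states H (seq_len, dim) into one vector.

    w is an illustrative learned scoring vector (dim,). Each timestep is
    scored, scores are softmax-normalized, and the states are combined
    as a weighted sum.
    """
    scores = H @ w                                  # (seq_len,) one score per timestep
    scores = scores - scores.max()                  # subtract max for numerical stability
    weights = np.exp(scores) / np.exp(scores).sum() # softmax attention weights
    return weights @ H                              # (dim,) weighted sum of states

rng = np.random.default_rng(0)
H = rng.standard_normal((10, 8))  # stand-in for BiLSTM outputs: seq_len=10, 2*hidden=8
w = rng.standard_normal(8)        # stand-in for the learned attention parameters
pooled = attention_pooling(H, w)
print(pooled.shape)  # (8,)
```

Unlike mean or max pooling, this lets the model weight citation-context tokens unequally, which is the motivation for using it in the prefetching encoder.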

Published

28-02-2026

How to Cite

[1] T. D. Khoa, T. N. Dinh, P. Pham, B. Vo, and N. N. Son, “Enhancing the HAtten model for local citation recommendation using BiLSTM and attention pooling”, J. Comput. Sci. Cybern., Feb. 2026.

Issue

Section

Articles
