Ying Zhang's Homepage

About Ying Zhang (张莹)


I am a postdoctoral researcher in the Natural Language Understanding Team at the RIKEN Center for Advanced Intelligence Project (AIP), under the supervision of Prof. Kentaro Inui at Tohoku University.

Thank you for your interest in my profile! I work in natural language processing. To track my latest publications, you can also visit my ORCID, Semantic Scholar, or Google Scholar.

Recent Publications

  1. Dongyuan Li, Ying Zhang*, Zhen Wang, Shiyin Tan, Satoshi Kosugi, and Manabu Okumura. “Active Learning for Abstractive Text Summarization via LLM-Determined Curriculum and Certainty Gain Maximization,” In Findings of the 2024 Conference on Empirical Methods in Natural Language Processing (EMNLP 2024).

  2. Yusong Wang, Ying Zhang*, Dongyuan Li, Jialun Shen, Yicheng Xu, Mingkun Xu, Kotaro Funakoshi, and Manabu Okumura. “FINE-LMT: Fine-grained Feature Learning for Multi-Modal Machine Translation,” In the Pacific Rim International Conference on Artificial Intelligence (PRICAI 2024).

  3. Shiyin Tan, Dongyuan Li, Renhe Jiang, Ying Zhang*, and Manabu Okumura. “Community-Invariant Graph Contrastive Learning,” In the Forty-first International Conference on Machine Learning (ICML 2024). [paper]

  4. Dongyuan Li, Ying Zhang*, Yusong Wang, Kotaro Funakoshi, and Manabu Okumura. “Active Learning with Task Adaptation Pre-training for Speech Emotion Recognition,” Journal of Natural Language Processing (JNLP 2024). [paper]

  5. Ying Zhang*, Hidetaka Kamigaito, and Manabu Okumura. “Bidirectional Transformer Reranker for Grammatical Error Correction,” Journal of Natural Language Processing (JNLP 2024), Volume 31, Issue 1. [paper]

  6. Ying Zhang*, Hidetaka Kamigaito, and Manabu Okumura. “Bidirectional Transformer Reranker for Grammatical Error Correction,” In Findings of the Association for Computational Linguistics: ACL 2023, pp. 3801-3825, long paper. [paper, bib, presentation, poster, code]

  7. Ying Zhang*, Hidetaka Kamigaito, Tatsuya Aoki, Hiroya Takamura, and Manabu Okumura. “Generic Mechanism for Reducing Repetitions in Encoder-Decoder Models,” Journal of Natural Language Processing (JNLP 2023), Volume 30, Issue 2, pp. 401-431. [paper, bib, code]

  8. Ying Zhang*, Hidetaka Kamigaito, and Manabu Okumura. “A Language Model-based Generative Classifier for Sentence-level Discourse Parsing,” In Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing (EMNLP 2021), pp. 2432-2446, long paper. [paper, bib, presentation, poster, code]

  9. Ying Zhang*, Hidetaka Kamigaito, Tatsuya Aoki, Hiroya Takamura, and Manabu Okumura. “Generic Mechanism for Reducing Repetitions in Encoder-Decoder Models,” In Proceedings of the International Conference Recent Advances in Natural Language Processing (RANLP 2021), pp. 1606-1615. [paper, bib, presentation, code]

Research Interests


My research focuses on natural language processing, which enables computers to understand text as humans do. Specifically, I am passionate about developing general functions and mechanisms that can be easily applied to pretrained large language models across various natural language generation tasks to improve their outputs.

For instance, I am particularly interested in addressing the issue of text degeneration, where decoding by maximizing likelihood often yields bland and repetitive outputs (Holtzman et al., 2020). Addressing it involves training deep neural networks to assign higher probabilities to natural and grammatical texts than to such degenerate ones. This topic is central to my Ph.D. dissertation: Addressing Text Degeneration of Discriminative Models with Re-ranking Methods.
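As a toy illustration of the general re-ranking idea (a minimal sketch, not the exact method from my papers), the snippet below reorders beam-search candidates with an external scorer instead of trusting the generator's likelihood alone. The candidate list and the repetition-penalizing scorer are hypothetical stand-ins for a pretrained seq2seq model's outputs and a learned reranker such as a bidirectional Transformer.

    from typing import Callable, List, Tuple

    def rerank(candidates: List[Tuple[str, float]],
               scorer: Callable[[str], float]) -> List[str]:
        """Reorder generated candidates by an external scorer instead of
        relying on the generator's own log-likelihood."""
        return [text for text, _ in
                sorted(candidates, key=lambda c: scorer(c[0]), reverse=True)]

    # Hypothetical beam-search output: (candidate text, generator log-likelihood).
    # Note that the degenerate, repetitive candidate received the highest likelihood.
    candidates = [
        ("the cat sat on the mat .", -4.2),
        ("the the the the the the .", -3.1),
        ("a cat is sitting on a mat .", -4.8),
    ]

    # Hypothetical scorer: penalize token repetition, standing in for a learned
    # discriminative reranker (e.g., a bidirectional Transformer reranker).
    def repetition_score(text: str) -> float:
        tokens = text.split()
        return len(set(tokens)) / len(tokens)  # higher = fewer repeated tokens

    ranked = rerank(candidates, repetition_score)
    print(ranked[-1])  # the repetitive candidate drops to last place

In practice, the scorer would be a trained model that estimates the fluency and grammaticality of each candidate, rather than a simple repetition heuristic.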

Academic/Career


Thank you for your interest in my work!

If you have any questions about our papers or code, please feel free to contact me via