Liqin Ye

School of CSE, Georgia Institute of Technology


CODA Building E1651

756 W Peachtree St NW

Atlanta, GA 30332

I am a Machine Learning Ph.D. student at Georgia Tech (ML@GT), advised by Dr. Chao Zhang and Dr. Sudheer Chava. Previously, I received my B.S. degree in Computer Science from the University of California, Irvine, where I had the fortune of working with Dr. Stephan Mandt on adapting Diffusion Models for climate system simulations.

My primary research interests span three core areas: (1) LLM reasoning and agents, (2) multi‑objective alignment of LLMs to meet diverse user demands, and (3) enhancing the efficiency of learning from LLM‑synthesized data.

I am actively engaged in cutting-edge research on LLMs and welcome collaborators with expertise in this domain. Feel free to contact me at liqiny at gatech dot edu if you have questions about my research or would like to collaborate.

News

Sep 18, 2025 A co-authored paper was accepted to the NeurIPS 2025 Datasets & Benchmarks Track. We develop a framework and dataset for decoding global central bank communications.
Jul 07, 2025 A co-authored paper was accepted to COLM 2025. We expose LLMs' financial knowledge amnesia.
May 16, 2025 A first-author paper was accepted to the KDD 2025 Research Track. We propose an iterative refinement method to denoise LLM-generated noisy labels.

Publications

  1. Precise Attribute Intensity Control in Large Language Models via Targeted Representation Editing
    Rongzhi Zhang*, Liqin Ye*, Yuzhao Heng, Xiang Chen, Tong Yu, Lingkai Kong, Sudheer Chava, and Chao Zhang
    Preprint. Under Review, 2025
  2. Calibrating Pre-trained Language Classifiers on LLM-generated Noisy Labels via Iterative Refinement
    Liqin Ye, Agam Shah, Chao Zhang, and Sudheer Chava
    In ACM SIGKDD Conference on Knowledge Discovery and Data Mining (KDD), 2025
  3. Beyond the Reported Cutoff: Where Large Language Models Fall Short on Financial Knowledge
    Agam Shah, Liqin Ye, Sebastian Jaskowski, Wei Xu, and Sudheer Chava
    In 2nd Conference on Language Modeling (COLM), 2025
  4. Words That Unite The World: A Unified Framework for Deciphering Central Bank Communications Globally
    Agam Shah, Siddhant Sukhani, Huzaifa Pardawala, Saketh Budideti, Riya Bhadani, Rudra Gopal, Siddhartha Somani, Michael Galarnyk, Soungmin Lee, Arnav Hiray, and 17 more authors
    In Neural Information Processing Systems Datasets & Benchmarks Track (NeurIPS D&B Track), 2025

Education

Georgia Institute of Technology

School of Computational Science and Engineering
Ph.D. in Machine Learning (2024 - Present)
M.S. in Computer Science (2023 - 2025)

University of California, Irvine

Donald Bren School of Information and Computer Sciences
B.S. in Computer Science (2019 - 2023)
ICS Honor Student

Teaching

MGT 8803 AI for Finance

Graduate Teaching Assistant, Fall 2024 and Fall 2025, Georgia Tech

ICS 45C Programming in C++

Lab Tutor, Fall 2021, UC Irvine

Services

Reviewer: KDD 2025, ICLR 2025, NeurIPS 2024, ACL 2024, EMNLP 2024, Computational Economics

AI and Future of Finance Conference: Setup, Poster Session