Jiaqi Cao

LLM Researcher @ LUMIA Group · Shanghai Jiao Tong University


About

👋 Hi, I'm Jiaqi Cao. (Call me Max!)

I am a second-year Master's student at Shanghai Jiao Tong University (SJTU), where I also completed my B.E. degree. I am fortunate to be supervised by Prof. Zhouhan Lin. I have interned at Microsoft and Shanghai AI Lab.

🔍 Research Interests

My core research areas include:

  • Next-Gen LLM Architectures: Exploring next-generation architectures, with a particular focus on latent memory mechanisms.

    [Memory Decoder & MLP Memory]

  • Continual Learning: Exploring context-to-weight mechanisms to enable continual learning in LLMs.

🤝 Let's Connect

I am always open to academic discussions or potential collaborations. Please feel free to reach out!

News

🎓 Actively seeking job opportunities — Graduating in March 2027. Looking for internship and full-time positions on any LLM foundation-model team. Reach out if interested!
Jan 17, 2026 One paper (MLP Memory) accepted to ICLR 2026 🥳
Dec 01, 2025 I will be presenting Memory Decoder in San Diego. Come have a chat with me 🙌
Sep 17, 2025 One paper (Memory Decoder) accepted to NeurIPS 2025 🥳

Selected Publications

(* indicates equal contribution)

  1. Memory Decoder: A Pretrained, Plug-and-Play Memory for Large Language Models
    Jiaqi Cao*, Jiarui Wang*, Rubin Wei, and 4 more authors
    NeurIPS 2025
  2. MLP Memory: A Retriever-Pretrained Memory for Large Language Models
    Rubin Wei*, Jiaqi Cao*, Jiarui Wang, and 4 more authors
    ICLR 2026