Jingjing Li

Applied Scientist

About Me

Jingjing Li is an Applied Scientist at TikTok E-Commerce, specializing in Large Language Models (LLMs) and Natural Language Processing (NLP). She focuses on developing LLMs to enhance intelligent customer service. Dr. Li holds a Ph.D. in Computer Science & Engineering from The Chinese University of Hong Kong. Her research interests include unsupervised text generation, controllable text generation, and conversational AI.

Interests
  • Large Language Models (LLMs)
  • Natural Language Processing (NLP)
  • Text Generation
  • Conversational AI

Experience

  1. Applied Scientist

    TikTok E-Commerce
    • Fine-tuning and deploying Large Language Models (LLMs) for intelligent customer service.
  2. Applied Scientist Intern

    Tencent Lightspeed Studios
    • Developed an LLM-driven autonomous agent system for a game demo, winning 1st place in the company-wide innovation competition.
  3. Applied Scientist Intern

    Amazon Alexa AI
    • Analyzed semantic fidelity in neural data-to-text models. Proposed a novel decoding strategy that improved semantic accuracy and entity correctness by 8% over state-of-the-art models (BART, T5).
  4. Research Intern

    Microsoft Research Asia
    • Investigated unsupervised text generation in data-scarce settings. Developed an iterative in-place text span editing approach, achieving state-of-the-art performance on unsupervised text simplification (AAAI 2022).

Education

  1. PhD in Computer Science & Engineering

    The Chinese University of Hong Kong
  2. B.Eng. in Computer Science

    Xidian University
    Outstanding Graduate
Publications
(2024). How Well Can LLMs Echo Us? Evaluating AI Chatbots' Role-Play Ability with ECHO. arXiv.
(2024). A Survey of Text Watermarking in the Era of Large Language Models. ACM Computing Surveys.
(2020). Unsupervised Text Generation by Learning from Search. In NeurIPS 2020.
(2019). Improving Question Generation With to the Point Context. In EMNLP-IJCNLP 2019.