Zora (Zhiruo) Wang
王芷若
PhD Student
Carnegie Mellon University
Language Technologies Institute
About Me
I am currently a PhD student at the Language Technologies Institute at Carnegie Mellon University, working with Daniel Fried and Graham Neubig. My primary research interest is using programmatic approaches to solve real-world tasks. In particular, I work on:
Memory- and Skill-Adaptive Agents
- Digital agents that learn navigation workflows in memory [AWM] and specialized skills as actuators [ASI] [SkillWeaver]; building verifiable and efficient toolboxes for math, data analysis, and visual reasoning tasks [TroVE]
- These agents often facilitate human verification [TroVE] and improve task success [CowPilot]
- On versatile tasks such as presentation making [AutoPresent], software engineering [ODEX], and popular professions [TAC]
Augmented Language Models
- Tools [Survey]: designed by human experts [AutoPresent] or LMs [TroVE]; their behavior in response to under-specified queries and execution failures [Fail-TaLMs]
- Programs: Challenging benchmarks for open domain [ODEX], multilingual [MCoNaLa], and data science [HiTab] problems; Performant code LMs [StarCoder] [ECCO] with [Code RAG], assistive [APIs], and libraries in [DocPrompting]
- Knowledge: retrieve texts [FilCo] [ReAtt] [RAGGED] and structured data [WikiTable] [TUTA]
News
Older News
Oct 2024: Joined the panel of Our CS (Workshop for Undergraduates in CS) and shared some of my thoughts about research
Sep 2024: Gave an invited talk at Camel-AI 🐪 about Agent Workflow Memory; check out the video, paper, and tweet
Jul 2024: Gave a tutorial at SIGIR about Large Language Models for Tabular Data; check out the slides & recordings if you're interested!
May 2024: Organized our CMU Agent Workshop 🤖 with plenty of events -- insightful tutorials, talks, and posters! I also gave two (short) tutorials about tool-augmented LMs and codegen testbeds.
Mar 2024: Gave a guest lecture on "Language Agents and Tool Use" in the Advanced NLP course (11-711); check out the recordings!
Mar 2024: Gave a talk at the [FLAME (Foundation and LAnguage Model) 🔥] seminar about [our recent survey] and [TroVE]
Feb 2024: Gave a talk about Language Models with Tools at the LLM as Agent Seminar, covering [TroVE] and works in progress 🤫
Feb 2024: Gave a lecture about Evaluation (metrics and benchmarks) for the Neural Code Generation (11-891) course 💻
Jan 2024: TAing for the new course [11-891 Neural Code Generation]; reach out if you want to discuss project ideas 💪
Nov 2023: Gave a guest lecture about Evaluation and Benchmarks for Code Generation in the Advanced NLP course (11-711) 👩‍🏫 more details [here]
Aug 2023: Gave a talk about 🛠️ tool using, learning, and making with LLMs at the Code Generation Reading Group; check out the [video]
Apr 2023: Gave a talk about [ODEX] at the Machine Learning Methods in Software Engineering seminar (video), hosted by the JetBrains Research team 👩‍💻
Upcoming Events
- Apr 2025: I will attend NAACL (with my cat 🐱) and present three works: [CodeRAG-bench], [CowPilot], and [Fail-TaLMs]. Excited to meet people in Albuquerque!
- May 2025: I will be in the Bay Area visiting Stanford. Happy to meet new people & catch up with old friends during the summer!
My Recent Favorite Publications
Agent Workflow Memory
What Are Tools Anyway? A Survey from the Language Model Perspective
TroVE: Inducing Verifiable and Efficient Toolboxes for Solving Programmatic Tasks
Get Connected
If you want to get connected, discuss potential project ideas, ask about CMU applications, or chat about any other relevant topics, my office hours are every Friday 4-5 pm EST. Email me to secure a time to chat!
I mentor a small number of students every semester; please fill out this application form if you're interested in working with me! However, since I receive many emails, I may not be able to reply to all of them.
If you are from an underrepresented group, or do not have much research experience, you are especially encouraged to reach out!
If You're Interested in My Name
My name in Chinese is 王芷若, which reads as Wang, Zhiruo. It is usually hard for non-native speakers to pronounce, so you can also call me Zora (as ZR is similar to Zhi Ruo).
I love my name, especially in its Chinese characters, since it carries a more beautiful meaning than its romanized form. 芷 stands for 白芷 (Angelica dahurica) and 若 stands for 杜若 (Pollia japonica), which are two kinds of Chinese herbal medicine. Also, 芷若 is the name of a beautiful vanilla plant 🌿