Official repository of paper "IRG-MotionLLM: Interleaving Motion Generation, Assessment and Refinement for Text-to-Motion Generation"


IRG-MotionLLM: Interleaving Motion Generation, Assessment and Refinement for Text-to-Motion Generation

Paper: arXiv:2512.10730

👀Overview

Recent advances in motion-aware large language models have shown remarkable promise for unifying motion understanding and generation tasks. However, these models typically treat understanding and generation separately, limiting the mutual benefits that could arise from interactive feedback between tasks. In this work, we reveal that motion assessment and refinement tasks act as crucial bridges to enable bidirectional knowledge flow between understanding and generation. Leveraging this insight, we propose Interleaved Reasoning for Motion Generation (IRMoGen), a novel paradigm that tightly couples motion generation with assessment and refinement through iterative text-motion dialogue. To realize this, we introduce IRG-MotionLLM, the first model that seamlessly interleaves motion generation, assessment, and refinement to improve generation performance.
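The interleaved generate-assess-refine loop described above can be sketched in a few lines. This is a minimal illustrative skeleton, not the paper's actual implementation: the function names, the feedback format, and the quality threshold are all hypothetical placeholders standing in for the model's real motion decoding, assessment, and refinement steps.

```python
# Hypothetical sketch of the IRMoGen interleaved reasoning loop.
# All functions below are illustrative stubs, not the paper's API.

def generate_motion(text):
    # Stub: a real model would decode a motion sequence from text.
    return {"text": text, "quality": 0.4}

def assess_motion(motion):
    # Stub assessor: returns a quality score and textual feedback.
    return motion["quality"], "increase smoothness"

def refine_motion(motion, feedback):
    # Stub refiner: improves the motion conditioned on the feedback.
    return {**motion, "quality": motion["quality"] + 0.3}

def irmogen_loop(text, threshold=0.9, max_rounds=3):
    """Interleave generation, assessment, and refinement until the
    assessed quality passes a threshold or rounds run out."""
    motion = generate_motion(text)
    for _ in range(max_rounds):
        score, feedback = assess_motion(motion)
        if score >= threshold:
            break
        motion = refine_motion(motion, feedback)
    return motion

result = irmogen_loop("a person waves with the right hand")
print(round(result["quality"], 1))  # 1.0
```

The key design point mirrored here is that assessment produces textual feedback consumed by the refiner, so generation and understanding exchange information through an iterative text-motion dialogue rather than running as separate one-shot tasks.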

🏠Model Zoo

Coming soon.

⛰️Get Ready

Coming soon.

🔥Training

Coming soon.

📏Evaluation

Coming soon.

✒️Citation

If you find our work helpful for your research, please consider citing it:

@article{li2025irg-motionllm,
  title={IRG-MotionLLM: Interleaving Motion Generation, Assessment and Refinement for Text-to-Motion Generation},
  author={Li, Yuan-Ming and Yang, Qize and Lei, Nan and Fu, Shenghao and Zeng, Ling-An and Hu, Jian-Fang and Wei, Xihan and Zheng, Wei-Shi},
  journal={arXiv preprint arXiv:2512.10730},
  year={2025}
}

📜License

  • Our models and code are released under the Apache License 2.0; our data is released under the MIT License.

Acknowledgement

We sincerely acknowledge and appreciate the exceptional open-source contributions that form the foundation of our work: Motion-Agent, MotionGPT, AToM, MARDM, Text-to-Motion, VLM-R1.
