Poster
HyperTree Planning: Enhancing LLM Reasoning via Hierarchical Thinking
Runquan Gui · Zhihai Wang · Jie Wang · Chi Ma · Huiling Zhen · Mingxuan Yuan · Jianye Hao · Defu Lian · Enhong Chen · Feng Wu
East Exhibition Hall A-B #E-2300
Many real-world tasks, such as planning a multi-day trip, organizing a complex schedule, or making long-term decisions, require advanced reasoning and step-by-step planning. Current large language models (LLMs) often struggle with these tasks because their reasoning lacks explicit structure and long-term coherence.

We present HyperTree Planning (HTP), a new method that enables LLMs to solve complex tasks by automatically decomposing them into subgoals within a tree-structured framework. The tree structure allows the model to plan and execute sub-tasks recursively, while a self-reflection mechanism guides it to revise and improve its reasoning throughout the process. The entire pipeline is fully automated, requiring no human intervention.

HTP significantly improves LLMs' ability to complete complex tasks accurately and efficiently, outperforming strong baselines on travel planning, instructional generation, and embodied AI tasks. Our results demonstrate that LLM agents can autonomously perform multi-step reasoning and handle diverse real-world needs with high reliability and precision.
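The recursive decompose-and-reflect loop described above can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: the `decompose` and `reflect` functions below are hypothetical stubs standing in for LLM calls, and the toy subgoal table stands in for model outputs.

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    """A node in the planning tree: a goal and its subgoal children."""
    goal: str
    children: list = field(default_factory=list)

# Toy decomposition table standing in for an LLM's subgoal proposals.
SUBGOALS = {
    "plan a 3-day trip": ["book transport", "plan daily itinerary"],
    "plan daily itinerary": ["day 1", "day 2", "day 3"],
}

def decompose(goal: str) -> list:
    # Stub for an LLM call that splits a goal into subgoals;
    # goals absent from the table are treated as leaf actions.
    return SUBGOALS.get(goal, [])

def reflect(goal: str) -> str:
    # Stub for an LLM self-reflection step that may revise a goal;
    # this sketch keeps the goal unchanged.
    return goal

def plan(goal: str, depth: int = 0, max_depth: int = 3) -> Node:
    """Recursively expand a goal into a tree of subgoals."""
    node = Node(reflect(goal))
    if depth < max_depth:
        for sub in decompose(node.goal):
            node.children.append(plan(sub, depth + 1, max_depth))
    return node

tree = plan("plan a 3-day trip")
```

With real LLM calls in place of the stubs, the same recursion would let the model expand each subgoal on demand and revise branches via reflection before executing them.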