Poster in Workshop: Exploration in AI Today (EXAIT)
See it to Place it: Evolving Macro Placements with Vision Language Models
Ikechukwu Uchendu · Vincent Zhuang · Wenjie Jiang · Kuang-Huei Lee · Ebrahim M. Songhori · Swati Goel · Karly Hou · Vijay Janapa Reddi
Keywords: [ reinforcement learning ] [ evolutionary search ] [ chip floorplanning ] [ vision language models ] [ macro placement ] [ in-context learning ] [ spatial reasoning ]
We propose using Vision-Language Models (VLMs) for macro placement in chip floorplanning, a complex optimization task that has recently seen promising advances from machine learning methods. We hypothesize that the strong spatial reasoning and understanding capabilities of VLMs can effectively complement existing learning-based approaches. In this work, we introduce VeoPlace (Visual Evolutionary Optimization Placement), a novel framework that uses a VLM to guide the actions of a base policy by constraining them to subregions of the chip canvas. The VLM proposals are iteratively optimized through an evolutionary search strategy with respect to the resulting placement quality. On open-source benchmarks, VeoPlace achieves state-of-the-art results on four out of seven benchmark circuits, matching or exceeding the performance of prior learning-only approaches. Our approach opens new possibilities for electronic design automation tools that leverage foundation models to solve complex physical design problems.
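The abstract describes an outer loop in which a VLM proposes canvas subregions that constrain a base placement policy, and an evolutionary search refines those proposals against placement quality. The Python sketch below illustrates one plausible reading of that loop, assuming hypothetical helpers `query_vlm_for_subregions`, `run_base_policy`, `placement_cost`, and `mutate_proposal`; none of these names, signatures, or hyperparameters come from the paper.

```python
# Hypothetical sketch of an evolutionary loop over VLM subregion proposals.
# All helper names and parameters are illustrative placeholders, not the
# authors' actual VeoPlace implementation.

import random
from typing import List, Tuple

Subregion = Tuple[float, float, float, float]  # (x_min, y_min, x_max, y_max) on a unit canvas


def query_vlm_for_subregions(canvas_image, num_macros: int) -> List[Subregion]:
    """Ask a vision-language model to suggest one subregion per macro (stub)."""
    raise NotImplementedError


def run_base_policy(subregions: List[Subregion]):
    """Place each macro with the base policy, restricted to its subregion (stub)."""
    raise NotImplementedError


def placement_cost(placement) -> float:
    """Proxy placement quality, e.g. wirelength plus congestion penalties (stub)."""
    raise NotImplementedError


def mutate_proposal(subregions: List[Subregion], scale: float = 0.05) -> List[Subregion]:
    """Jitter subregion boundaries to explore nearby constraint sets."""
    jittered = []
    for x0, y0, x1, y1 in subregions:
        dx, dy = random.uniform(-scale, scale), random.uniform(-scale, scale)
        jittered.append((max(0.0, x0 + dx), max(0.0, y0 + dy),
                         min(1.0, x1 + dx), min(1.0, y1 + dy)))
    return jittered


def evolve_placements(canvas_image, num_macros: int,
                      population_size: int = 8, generations: int = 10):
    # Seed the population with independent VLM proposals.
    population = [query_vlm_for_subregions(canvas_image, num_macros)
                  for _ in range(population_size)]
    best, best_cost = None, float("inf")
    for _ in range(generations):
        # Score each proposal by the quality of the placement it induces.
        scored = []
        for proposal in population:
            placement = run_base_policy(proposal)
            cost = placement_cost(placement)
            scored.append((cost, proposal))
            if cost < best_cost:
                best, best_cost = placement, cost
        # Keep the top half and refill the population by mutating survivors.
        scored.sort(key=lambda item: item[0])
        survivors = [p for _, p in scored[: population_size // 2]]
        population = survivors + [mutate_proposal(random.choice(survivors))
                                  for _ in range(population_size - len(survivors))]
    return best, best_cost
```

In this reading, the VLM supplies the initial search population and the evolutionary step only perturbs constraint regions, leaving the actual macro coordinates to the base policy; the real method may interleave VLM queries and selection differently.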