Poster
Tilted Sharpness-Aware Minimization
Tian Li · Tianyi Zhou · Jeff Bilmes
West Exhibition Hall B2-B3 #W-503
Sharpness-Aware Minimization (SAM) is a technique that improves the performance of deep learning models by finding "flat" areas on the loss landscape---regions where small changes to the model parameters don't dramatically increase the loss. However, SAM can be difficult to optimize because it focuses only on the single worst-case perturbation in a small neighborhood of the parameters, which makes the objective hard to handle, especially when the model's loss landscape is complex.

We introduce a new approach called Tilted SAM (TSAM), inspired by a method called "exponential tilting." Rather than focusing solely on the absolute worst case, TSAM assigns greater weight to perturbations with higher losses, which smooths the objective and makes it easier to find flatter minima, potentially improving model performance. We develop new algorithms to efficiently solve TSAM and demonstrate that it achieves better results than standard SAM and its variants on various image and text tasks.
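To make the reweighting idea concrete, below is a minimal sketch of a tilted update in PyTorch: it samples a few random perturbations around the current parameters, evaluates the loss and gradient at each, and averages the gradients with softmax(tilt x loss) weights, so higher-loss perturbations count more without dominating entirely. The function name, the sampling scheme (random directions scaled to a rho-sphere), and the hyperparameter values are illustrative assumptions, not the authors' exact algorithm.

```python
import torch

def tsam_step(model, loss_fn, inputs, targets, optimizer,
              rho=0.05, tilt=1.0, num_samples=4):
    """One sketch step: weight sampled perturbations by softmax(tilt * loss).
    Illustrative only; not the paper's exact algorithm."""
    params = [p for p in model.parameters() if p.requires_grad]
    losses, grads = [], []

    for _ in range(num_samples):
        # Draw a random direction and scale it to radius rho (an assumption here).
        eps = [torch.randn_like(p) for p in params]
        norm = torch.sqrt(sum((e ** 2).sum() for e in eps))
        eps = [rho * e / (norm + 1e-12) for e in eps]

        # Perturb the parameters, record the loss and gradient, then restore them.
        with torch.no_grad():
            for p, e in zip(params, eps):
                p.add_(e)
        loss = loss_fn(model(inputs), targets)
        grads.append(torch.autograd.grad(loss, params))
        losses.append(loss.detach())
        with torch.no_grad():
            for p, e in zip(params, eps):
                p.sub_(e)

    # Exponential tilting: higher-loss perturbations get larger weight;
    # a very large tilt approaches the SAM-style worst case, tilt near 0 a plain average.
    weights = torch.softmax(tilt * torch.stack(losses), dim=0)

    optimizer.zero_grad()
    for i, p in enumerate(params):
        p.grad = sum(w * g[i] for w, g in zip(weights, grads))
    optimizer.step()
```

In a standard training loop, a call like `tsam_step(model, loss_fn, x, y, optimizer)` would take the place of the usual forward/backward/step; the `tilt` parameter interpolates between averaging over the neighborhood and concentrating on its worst sampled point.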