Language in a Bottle: Language Model Guided Concept Bottlenecks for Interpretable Image Classification

Concept Bottleneck Models (CBMs) are inherently interpretable models that factor model decisions into human-readable concepts. They allow people to easily understand why a model is failing, a critical feature for high-stakes applications. CBMs require manually specified concepts and often underperform their black-box counterparts, preventing their broad adoption. We address …
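To make the bottleneck idea concrete, here is a minimal sketch of the generic CBM architecture the abstract describes: an image encoder predicts scores for human-readable concepts, and the class decision is computed from those concept scores alone. This is an illustration under assumed names (`ConceptBottleneckModel`, `concept_head`, `label_head`), not the paper's specific method.

```python
import torch
import torch.nn as nn

class ConceptBottleneckModel(nn.Module):
    """Generic CBM sketch: backbone -> concept scores -> class logits."""

    def __init__(self, backbone: nn.Module, feat_dim: int,
                 num_concepts: int, num_classes: int):
        super().__init__()
        self.backbone = backbone                        # any image encoder
        self.concept_head = nn.Linear(feat_dim, num_concepts)
        self.label_head = nn.Linear(num_concepts, num_classes)

    def forward(self, x: torch.Tensor):
        feats = self.backbone(x)
        concepts = torch.sigmoid(self.concept_head(feats))  # per-concept scores in [0, 1]
        class_logits = self.label_head(concepts)            # decision uses concepts only
        return concepts, class_logits
```

Because the final prediction depends only on the concept layer, inspecting `concepts` shows which human-readable attributes drove (or derailed) a given decision.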