Concept Bottleneck Model
Explanation Evaluation Study

Help us understand which AI explanations are more meaningful to humans.

What is this study about?

Concept Bottleneck Models (CBMs) classify images by first identifying a set of human-interpretable concepts (e.g., "has wings", "sandy ground") and then using these concepts—weighted by their importance—to predict a class label.
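For the curious, the weighting step can be sketched in a few lines of Python. This is a minimal illustration only — the concept names, weights, and the single-class scoring below are hypothetical examples, not the actual models used in this study.

```python
# Illustrative sketch of how a CBM scores one class.
# Concept names and weights here are made-up examples.

# Step 1: the model predicts a score in [0, 1] for each concept.
concept_scores = {"has wings": 0.95, "sandy ground": 0.10, "hooked beak": 0.80}

# Step 2: each class has a learned importance weight per concept.
class_weights = {"eagle": {"has wings": 1.2, "sandy ground": -0.3, "hooked beak": 1.5}}

# Step 3: the class score is the importance-weighted sum of concept scores;
# the class with the highest score becomes the prediction.
def class_score(cls):
    return sum(class_weights[cls][c] * s for c, s in concept_scores.items())

print(round(class_score("eagle"), 2))  # 2.31
```

The explanation you will judge is exactly this intermediate layer: which concepts were detected, and how strongly each one counted toward the predicted label.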

In this study, you will see the same image classified by two different CBM systems. Your task is to judge which model's explanation better justifies its prediction.

What to look for

  - Are the concepts in the explanation actually present in the image?
  - Do the importance weights assigned to the concepts make sense for the predicted class?
  - Taken together, does the explanation genuinely justify the model's prediction?

Instructions

  1. View the image and the two explanations (CBM A and CBM B).
  2. Select which explanation you find more informative and faithful.
  3. Optionally, add a comment to explain your choice.
  4. Click Submit to proceed to the next pair.

You will evaluate 20 image pairs. This should take approximately 10 minutes.

Start Evaluation →