Feature importance scores come from Ludwig's Integrated Gradients explainer. For each example, it interpolates along a straight-line path from a neutral baseline sample to the example, accumulating the model's gradients along that path; each feature's attribution is the path-averaged gradient scaled by the feature's distance from the baseline. Higher |importance| values indicate stronger influence on the prediction. Plots share a common x-axis to make magnitudes comparable across labels, and the table columns can be sorted for quick scans.
| label | feature | importance | abs importance |
|---|---|---|---|
| temperature | temperature_feature | 0.490821 | 0.490821 |
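The attribution method above can be illustrated with a minimal sketch. This is not Ludwig's implementation (which differentiates the trained model); it approximates the Integrated Gradients path integral with a midpoint Riemann sum for a toy function `f` whose gradient is supplied analytically. The names `integrated_gradients`, `f`, and `grad_f` are illustrative, not part of Ludwig's API.

```python
def integrated_gradients(grad_f, x, baseline, steps=100):
    """Midpoint Riemann-sum approximation of the IG path integral:
    attribution_i = (x_i - b_i) * mean over a in (0,1) of d f / d x_i at b + a*(x - b).
    """
    n = len(x)
    avg_grad = [0.0] * n
    for k in range(steps):
        a = (k + 0.5) / steps  # midpoint of the k-th subinterval
        point = [b + a * (xi - b) for xi, b in zip(x, baseline)]
        g = grad_f(point)
        for i in range(n):
            avg_grad[i] += g[i] / steps
    return [(xi - b) * gi for xi, b, gi in zip(x, baseline, avg_grad)]

# Toy model f(x) = x0^2 + 3*x1 with its analytic gradient (an assumption for the demo).
f = lambda x: x[0] ** 2 + 3 * x[1]
grad_f = lambda x: [2 * x[0], 3.0]

attrs = integrated_gradients(grad_f, x=[2.0, 1.0], baseline=[0.0, 0.0])
print(attrs)
# Completeness check: attributions sum to f(x) - f(baseline).
print(sum(attrs), f([2.0, 1.0]) - f([0.0, 0.0]))
```

The completeness property shown in the last line is what makes the scores comparable across features: the attributions decompose the model's output change relative to the baseline.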