
The Akaike information criterion (AIC) is a metric that is used to compare the fit of different regression models.

It is calculated as:

AIC = 2K – 2ln(L)

where:

- **K:** The number of model parameters.
- **ln(L):** The log-likelihood of the model. This tells us how likely the model is, given the data.
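As a minimal sketch in plain Python (the parameter count and log-likelihood here are hypothetical values, not from a real dataset), the formula translates directly to:

```python
def aic(k: int, log_likelihood: float) -> float:
    """Akaike information criterion: AIC = 2K - 2*ln(L)."""
    return 2 * k - 2 * log_likelihood

# Hypothetical model with 3 parameters and a log-likelihood of -120.5
print(aic(3, -120.5))  # 247.0
```

In practice, statistical software reports the log-likelihood (and usually the AIC itself) for each fitted model, so you rarely compute this by hand.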

Once you’ve fit several regression models, you can compare the AIC value of each model. The model with the lowest AIC offers the best fit.

One question students often have about AIC is: **What is considered a good AIC value?**

The simple answer: **There is no value for AIC that can be considered “good” or “bad” because we simply use AIC as a way to compare regression models. The model with the lowest AIC offers the best fit. The absolute value of the AIC value is not important.**

For example, if Model 1 has an AIC value of 730.5 and Model 2 has an AIC value of 456.3, then Model 2 offers a better fit. The absolute values of the AIC are not important.

A useful reference on this topic comes from *Serious Stats: A Guide to Advanced Statistics for the Behavioral Sciences* on page 402:

> As with likelihood, the absolute value of AIC is largely meaningless (being determined by the arbitrary constant). As this constant depends on the data, AIC can be used only to compare models fitted on identical samples.
>
> The best model from the set of plausible models being considered is therefore the one with the smallest AIC value (the least information loss relative to the true model).

As noted in the textbook, the absolute value of the AIC is not important. We simply use AIC values to compare the fit of models and the model with the lowest AIC value is best.

**How to Determine if a Model Fits a Dataset Well**

The AIC value is a useful way to determine which regression model fits a dataset the best among a list of potential models, but it doesn’t actually quantify *how well* the model fits the data.

For example, a particular regression model might have the lowest AIC value among a list of potential models, but it may still be a poor fitting model.

To determine if a model fits a dataset well, we can use the following two metrics:

- Mallows’ Cp: A metric that quantifies the amount of bias in regression models.
- Adjusted R-squared: The proportion of the variance in the response variable that can be explained by the predictor variables in the model, adjusted for the number of predictor variables in the model.
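For the adjusted R-squared, the adjustment penalizes the ordinary R-squared for each additional predictor. A minimal sketch of the standard formula, using hypothetical values for R-squared, sample size, and predictor count:

```python
def adjusted_r_squared(r_squared: float, n: int, p: int) -> float:
    """Adjusted R-squared for n observations and p predictor variables:
    1 - (1 - R^2) * (n - 1) / (n - p - 1)."""
    return 1 - (1 - r_squared) * (n - 1) / (n - p - 1)

# Hypothetical: R-squared of 0.85 from a model with 50 observations and 4 predictors
print(round(adjusted_r_squared(0.85, 50, 4), 4))  # 0.8367
```

Because the adjusted value is always at or below the ordinary R-squared, it gives a more honest picture of fit when models differ in their number of predictors.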

One potential strategy for choosing the “best” regression model among several potential models is as follows:

- First, identify the model with the lowest AIC value.
- Then, fit this regression model to the data and calculate the Mallows’ Cp and adjusted R-squared of the model to quantify how well it actually fits the data.

This approach allows you to identify the best fitting model *and* quantify how well the model actually fits the data.

**Additional Resources**

How to Interpret Negative AIC Values

How to Calculate AIC in R

How to Calculate AIC in Python