Standardized coefficient

In statistics, standardized (regression) coefficients, also called beta coefficients or beta weights, are the estimates resulting from a regression analysis where the underlying data have been standardized so that the variances of the dependent and independent variables are equal to 1. A standardized coefficient therefore expresses, in standard deviations, how much the dependent variable is expected to change per one standard deviation increase in the corresponding independent variable.

Coefficients are usually standardized to answer the question of which independent variable has the greater effect on the dependent variable in a multiple regression analysis where the variables are measured in different units of measurement (for example, income measured in dollars and family size measured in number of individuals).
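As an illustration, the following minimal sketch fits a regression on standardized variables with NumPy; the variable names (income, family_size) and all numbers are invented for this example, not taken from the source.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
income = rng.normal(50_000, 15_000, n)       # hypothetical predictor, in dollars
family_size = rng.normal(3, 1.2, n)          # hypothetical predictor, in persons
y = 0.00004 * income + 0.5 * family_size + rng.normal(0, 1, n)

def zscore(v):
    """Standardize to mean 0 and standard deviation 1."""
    return (v - v.mean()) / v.std(ddof=1)

# A regression on standardized variables yields standardized (beta) coefficients.
X = np.column_stack([np.ones(n), zscore(income), zscore(family_size)])
beta, *_ = np.linalg.lstsq(X, zscore(y), rcond=None)
print(dict(zip(["intercept", "income", "family_size"], beta.round(3))))
# The two slope estimates are now on a common, unitless scale and can be compared.
```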

For simple linear regression (a single predictor), the standardized regression coefficient equals the correlation between the independent and dependent variables; the same holds in multiple regression when the predictors are mutually orthogonal.
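This equality is easy to verify numerically; a quick sketch with synthetic data (all values assumed for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(size=200)
y = 2.0 * x + rng.normal(size=200)

zx = (x - x.mean()) / x.std(ddof=1)
zy = (y - y.mean()) / y.std(ddof=1)
slope = np.polyfit(zx, zy, 1)[0]   # standardized regression slope
r = np.corrcoef(x, y)[0, 1]        # Pearson correlation
print(np.isclose(slope, r))        # True (up to floating-point error)
```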

Values for standardized and unstandardized coefficients can also be re-scaled to one another subsequent to either type of analysis.

Suppose that $\beta$ is the coefficient resulting from a linear regression predicting $y$ by $x$. The standardized coefficient then simply results as

$\beta^* = \beta \cdot \frac{s_x}{s_y}$,

where $s_x$ and $s_y$ are the (estimated) standard deviations of $x$ and $y$, respectively.[2][3]
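The identity can be checked numerically; the sketch below (again with invented data) re-scales an unstandardized slope after the fact and compares it with the slope fitted directly on standardized variables:

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.normal(10, 3, size=300)
y = 1.5 * x + rng.normal(0, 4, size=300)

b = np.polyfit(x, y, 1)[0]               # unstandardized slope
s_x, s_y = x.std(ddof=1), y.std(ddof=1)
beta_star = b * s_x / s_y                # re-scaled subsequent to the analysis

zx = (x - x.mean()) / s_x
zy = (y - y.mean()) / s_y
beta_direct = np.polyfit(zx, zy, 1)[0]   # fitted on standardized data
print(np.isclose(beta_star, beta_direct))  # True
```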

Advocates of standardized coefficients note that the coefficients are independent of the involved variables' units of measurement (i.e., standardized coefficients are unitless), which makes comparisons easy.[3]

Critics voice concerns that such a standardization can be very misleading.[2][4]

Due to the re-scaling based on sample standard deviations, any effect apparent in the standardized coefficient may be due to confounding with the particularities (especially the variability) of the involved data sample(s).

Also, the interpretation or meaning of a "one standard deviation change" in the regressor may vary markedly between non-normal distributions (e.g., skewed, asymmetric, or multimodal ones).

Some statistical software packages like PSPP, SPSS and SYSTAT label the standardized regression coefficients as "Beta" while the unstandardized coefficients are labeled "B".