## Understanding R-squared and Adjusted R-squared in Linear Regression

Linear regression is one of the most popular statistical techniques used in data analysis. It is a simple yet powerful method for modeling the relationship between a response variable and one or more predictor variables.

R-squared and adjusted R-squared are two important metrics that are commonly used to evaluate the performance of a linear regression model.

### Understanding R-squared

R-squared, also known as the coefficient of determination, is a measure of how well the linear regression model fits the data. It measures the proportion of variation in the response variable that can be explained by the predictor variables.

In other words, it is a measure of the goodness of fit of the model. For ordinary least squares with an intercept, R-squared on the training data ranges from 0 to 1.

A value of 0 indicates that the model explains none of the variability in the response variable, whereas a value of 1 indicates that the model explains all of the variability in the response variable. A higher value of R-squared indicates a better fit between the model and the data.
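To make the definition concrete, R-squared can be computed directly from its definition, R² = 1 − SS_residual / SS_total. The sketch below uses synthetic data (the data, seed, and noise level are illustrative assumptions, not taken from any real dataset):

```python
import numpy as np

# Synthetic data with a known linear trend plus noise (illustrative only)
rng = np.random.default_rng(0)
x = np.linspace(0, 10, 50)
y = 2.0 * x + 1.0 + rng.normal(0, 2, size=50)

# Fit a least-squares line and compute R-squared from its definition:
# R^2 = 1 - SS_residual / SS_total
slope, intercept = np.polyfit(x, y, 1)
y_hat = slope * x + intercept
ss_res = np.sum((y - y_hat) ** 2)     # variation left unexplained by the line
ss_tot = np.sum((y - y.mean()) ** 2)  # total variation in y
r_squared = 1 - ss_res / ss_tot
print(f'R-squared: {r_squared:.3f}')
```

Because the noise is small relative to the trend, the fitted line explains most of the variation and R-squared comes out close to 1.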

However, R-squared has a limitation: it never decreases (and typically increases) as predictor variables are added, even if those variables have little or no relationship with the response variable. This can encourage overfitting of the model.
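This inflation is easy to demonstrate: adding predictors that are pure noise still nudges the training R-squared upward. The sketch below uses synthetic data (the seed, sample size, and coefficients are illustrative assumptions):

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(42)
n = 30
X = rng.normal(size=(n, 1))            # one meaningful predictor
y = 3 * X[:, 0] + rng.normal(size=n)   # response depends only on that predictor
noise = rng.normal(size=(n, 5))        # five pure-noise predictors

r2_small = LinearRegression().fit(X, y).score(X, y)
X_big = np.hstack([X, noise])          # append the useless predictors
r2_big = LinearRegression().fit(X_big, y).score(X_big, y)

# Training R-squared never decreases when predictors are added
print(f'R-squared with 1 predictor:  {r2_small:.3f}')
print(f'R-squared with 6 predictors: {r2_big:.3f}')
```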

To address this limitation, the adjusted R-squared metric is used.

### Understanding Adjusted R-squared

The adjusted R-squared is a modified version of R-squared that takes into account the number of predictor variables in the model. It penalizes the addition of unnecessary predictor variables and thus provides a more realistic estimate of the goodness of fit of the model.

The formula for adjusted R-squared is:

`Adjusted R-squared = 1 - ((1 - R-squared) * (n - 1) / (n - k - 1))`

where n is the sample size and k is the number of predictor variables in the model. Unlike R-squared, adjusted R-squared can decrease when a weak predictor is added, and it can even be negative; as before, a higher value indicates a better fit between the model and the data.

A useful model should have a high adjusted R-squared value, indicating that the predictor variables in the model are important and meaningful in explaining the variability in the response variable.
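The formula above is simple enough to wrap in a small helper function. This is a sketch; the function name and the example values passed in below are illustrative, not tied to any real dataset:

```python
def adjusted_r_squared(r_squared, n, k):
    """Adjusted R^2 = 1 - (1 - R^2) * (n - 1) / (n - k - 1)."""
    return 1 - (1 - r_squared) * (n - 1) / (n - k - 1)

# With R-squared = 0.819, n = 32 observations, and k = 4 predictors:
print(round(adjusted_r_squared(0.819, 32, 4), 3))  # about 0.792
```

Note how the penalty grows as k approaches n, so adjusted R-squared falls well below R-squared in small samples with many predictors.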

### Example 1: Calculating Adjusted R-squared using sklearn

Scikit-learn (imported as `sklearn`) is a popular Python library for machine learning and data analysis.

To calculate the adjusted R-squared in a multiple linear regression model using sklearn, we can follow these steps:

- Import the necessary libraries and load the data:

```python
import pandas as pd
from sklearn.linear_model import LinearRegression

# Load the data
df = pd.read_csv('mpg.csv')
X = df[['wt', 'drat', 'qsec', 'hp']]  # predictor variables
y = df['mpg']                         # response variable
```

- Fit the linear regression model:

`model = LinearRegression().fit(X, y)`

- Calculate R-squared and adjusted R-squared:

```python
n = len(y)          # sample size
k = len(X.columns)  # number of predictor variables
R_squared = model.score(X, y)  # score() returns R-squared for regressors
adjusted_R_squared = 1 - (((1 - R_squared) * (n - 1)) / (n - k - 1))
print(f'R-squared: {R_squared:.3f}')
print(f'Adjusted R-squared: {adjusted_R_squared:.3f}')
```

#### Output:

```
R-squared: 0.819
Adjusted R-squared: 0.800
```

### Example 2: Calculating Adjusted R-squared using statsmodels

Statsmodels is another powerful Python library for statistical analysis. To calculate the adjusted R-squared in a multiple linear regression model using statsmodels, we can follow these steps:

- Import the necessary libraries and load the data:

```python
import pandas as pd
import statsmodels.api as sm

# Load the data
df = pd.read_csv('mpg.csv')
X = df[['wt', 'drat', 'qsec', 'hp']]  # predictor variables
y = df['mpg']                         # response variable
```

- Fit the linear regression model:

```python
X = sm.add_constant(X)  # statsmodels does not add an intercept by default
model = sm.OLS(y, X).fit()
```

- Calculate R-squared and adjusted R-squared:

```python
n = len(y)
k = len(X.columns) - 1  # subtract 1 for the constant added by add_constant
R_squared = model.rsquared
adjusted_R_squared = 1 - (((1 - R_squared) * (n - 1)) / (n - k - 1))
print(f'R-squared: {R_squared:.3f}')
print(f'Adjusted R-squared: {adjusted_R_squared:.3f}')
```

#### Output:

```
R-squared: 0.819
Adjusted R-squared: 0.800
```

## Additional Resources

Linear regression is a vast topic, and there are many resources available online to learn more about it. Some great resources include online courses, tutorials, and books.

### Online resources for learning linear regression in Python

- Scikit-learn Documentation – The official documentation for Scikit-learn, which includes detailed tutorials on linear regression and other machine learning techniques.
- Statsmodels Documentation – The official documentation for Statsmodels, which includes detailed tutorials on linear regression and other statistical techniques.
- Coursera – Coursera offers many courses on linear regression and other topics in data analysis.
- Udemy – Udemy offers many courses on linear regression and other topics in data analysis.
- DataCamp – DataCamp offers many courses on linear regression and other topics in data analysis.

## Conclusion

R-squared and adjusted R-squared are essential metrics for evaluating the performance of a linear regression model. R-squared measures the goodness of fit of the model, while adjusted R-squared accounts for the number of predictor variables and therefore gives a more realistic estimate, particularly when comparing models with different numbers of predictors.

Python libraries such as scikit-learn and statsmodels make both metrics straightforward to compute, and the resources listed above offer plenty of opportunities to learn more about linear regression and how to apply it in data analysis.

Understanding and utilizing these metrics is crucial for developing a useful and meaningful linear regression model.