---
title: "Modeling the Strength of Concrete with Artificial Neural Networks"
output: html_notebook
---

Within the construction industry, concrete is used extensively in almost all construction projects, so the strength of concrete must meet certain specifications depending on the type of project. However, due to its complex chemical mixture, concretes with slightly different ingredients can behave in very different ways, making it crucial to be able to predict the strength of a given concrete formulation.</br>

In this section, the <em><u>goal is to create and train a neural network model that will help us predict the strength of concrete</u></em> based on properties such as water (kg/m^3), cement (kg/m^3), coarse aggregate (kg/m^3), age (days), and so on...</br>

The dataset was donated by <em>Prof. I-Cheng Yeh, Department of Information Management, Chung-Hua University</em>. It is available for download at https://archive.ics.uci.edu/ml/datasets/Concrete+Compressive+Strength. The dataset contains 1030 examples, each with 8 input features and one output: the compressive strength of the concrete.</br>

With that being said, let's get to it!</br></br>

<h4><u> Step 1: Load the dataset</u></h4>

```{r}
concrete <- read.csv('concrete.csv')
```
</br>

<h4><u> Step 2: Explore and prepare the dataset</u></h4>
```{r}
concrete
str(concrete)
```

=> As we can see, this dataset contains 1030 examples, and all of the input variables as well as the output are numeric.

Since we want to use a neural network, it's a good idea to normalize the dataset so that variables with significantly greater ranges do not dominate the others. If a variable has a larger range than the others, it will have more "say" (i.e., impact) during the training process, so the model will rely more heavily on that variable instead of considering all variables equally.
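For instance, the raw variables sit on very different scales (cement runs into the hundreds of kg/m^3 while superplastic stays in the single digits), which we can confirm with a quick check:
```{r}
# min and max of each raw variable, before normalization
sapply(concrete, range)
```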
```{r}
# let's define a function to normalize each variable to the [0, 1] range
normalize <- function(x){
  return ((x - min(x)) / (max(x) - min(x)))
}

concrete_norm <- as.data.frame(lapply(concrete, normalize))
summary(concrete_norm)
```
==> The dataset has been normalized and every variable's range is now between 0 and 1. The normalized dataset is stored in the <em>concrete_norm</em> variable, which we will use to train the neural network. With their values normalized, all variables will have equal "say" (or impact) during training.</br>
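As a quick sanity check of the function on a toy vector (not part of the dataset), the minimum maps to 0, the maximum to 1, and values in between scale linearly:
```{r}
normalize(c(10, 20, 30))  # 0.0 0.5 1.0
```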

With the normalized dataset in hand, we will go ahead and split it into training and testing subsets:
```{r}
# training set 75 percent, testing set 25%
concrete_train <- concrete_norm[1:773, ]
concrete_test <- concrete_norm[774:1030, ]
```
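The slice above works because the rows of this dataset are already in random order. If they were not, a randomized split along these lines could be used instead (shown with eval = FALSE so it does not overwrite the split above):
```{r, eval=FALSE}
set.seed(123)  # for reproducibility
train_idx <- sample(nrow(concrete_norm), size = floor(0.75 * nrow(concrete_norm)))
concrete_train <- concrete_norm[train_idx, ]
concrete_test  <- concrete_norm[-train_idx, ]
```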
</br>
<h4><u> Step 3: Train the neural network</u></h4>

We will use the neural network model implemented in the <em>neuralnet</em> package. We need to install and load the library.
```{r}
# install.packages('neuralnet')
library(neuralnet)

# we will train a neural network with 1 neuron in the hidden layer
concrete_model <- neuralnet(strength ~ cement + slag + ash + water + superplastic + coarseagg + fineagg + age, data = concrete_train, hidden = 1)
# then we plot the topology of the resulted neural network
plot(concrete_model)
```
==> By plotting the neural network's topology, we can see that the model has 8 input variables plus a bias unit, as expected. The weight for each input variable is shown next to its node. For example, the weight for the water variable is -1.64942; this makes sense because, intuitively, putting too much water into the concrete mixture "thins" it out, which leads to weaker bonding and thus weaker concrete.
The plot also shows the Sum of Squared Errors (SSE) = 5.078, which is the cumulative squared error between the predicted and true values of strength; it also includes the number of steps = 3725 that the neural network took to converge on the training dataset.</br></br>
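The figures reported in the plot can also be read directly from the fitted object; a neuralnet model stores the error, the step count, and all trained weights in its result.matrix field:
```{r}
# first rows: error (SSE), reached.threshold, steps; remaining rows: the weights
concrete_model$result.matrix
```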

<h4><u> Step 4: Evaluate the neural network's performance</u></h4>
Now that we have our neural network trained, we will put it to the test by predicting on the testing dataset and seeing how it performs.
```{r}
# the compute() function returns the neurons for each layer and the predicted values;
# these two fields are stored in $neurons and $net.result respectively
model_results <- compute(concrete_model, concrete_test[1:8])  # syntax: compute(neural_network_model, test_dataset)
predicted_strength <- model_results$net.result
# evaluate the correlation between predicted strength and true strength of the testing dataset
cor(predicted_strength, concrete_test$strength)
```
==> If two variables have a correlation close to 1, they have a strong linear relationship. Our <strong>predicted_strength</strong> and <strong>concrete_test$strength</strong> have a correlation of about 0.81; this indicates that even our simple neural network model, with only one hidden unit, can predict concrete strength fairly well!</br></br>
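Correlation captures linear association but not the size of the errors; as a complementary (and optional) check, the mean absolute error on the normalized scale can be computed directly:
```{r}
# average absolute difference between predicted and true normalized strength
mean(abs(predicted_strength - concrete_test$strength))
```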

<h4><u> Step 5: Improve the neural network's performance</u></h4>
When it comes to public safety, we want to predict structural integrity as accurately as we can. Even though our simple neural network can predict concrete strength fairly well, there is room for improvement.

Let's try increasing the number of hidden units to 5:
```{r}
concrete_model2 <- neuralnet(strength ~ cement + slag + ash + water + superplastic + coarseagg + fineagg + age, data = concrete_train, hidden = 5)
plot(concrete_model2)
```
==> Right off the bat, we can see that the Sum of Squared Errors (SSE) has decreased significantly, from 5.078 to 1.705: roughly a threefold reduction. In addition, the number of training steps has increased from 3725 to 41680, since the network is more complex due to the extra hidden units!
Applying the same evaluation steps to the new neural network, we get:
```{r}
model_results2 <- compute(concrete_model2, concrete_test[1:8])
predicted_strength2 <- model_results2$net.result
cor(predicted_strength2, concrete_test$strength)
```
==> With the new, slightly more complex neural network, we see a significant improvement in the correlation between the predicted and true strength values. With a correlation of about 0.924, the predicted strengths track the true strengths much more closely than with the previous model. That is a great improvement for adding just a few hidden neurons!</br></br>
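One caveat: because the model was trained on normalized data, its predictions are also on the 0 to 1 scale. To report them in the original strength units, we can invert the min-max transform using the original (unnormalized) strength column; the helper below is a sketch added for illustration:
```{r}
# invert the min-max normalization applied earlier
unnormalize <- function(x, original) {
  return (x * (max(original) - min(original)) + min(original))
}
head(unnormalize(predicted_strength2, concrete$strength))
```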

<h4>+ Reference:</h4>
<em>Machine Learning with R</em> by <em>Brett Lantz</em>
