Tuning hyperparameters is a key concept in machine learning. Hyperparameter tuning can be automated using techniques such as grid search, random search, and gradient-based optimization. This guide demonstrates how to tune hyperparameters manually for a selected dataset and algorithm by performing a few tests with WSO2 Machine Learner. You can apply the same approach to different datasets and algorithms.
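
For context, the following is a minimal sketch of what automated tuning can look like outside of WSO2 ML, using scikit-learn's GridSearchCV. The synthetic dataset, the parameter grid, and the values are illustrative assumptions only.

```python
# A minimal sketch of automated hyperparameter tuning with scikit-learn's
# GridSearchCV (not part of WSO2 ML; shown only to illustrate the idea).
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV

# Hypothetical binary classification data standing in for a real dataset.
X, y = make_classification(n_samples=500, n_features=8, random_state=42)

# Candidate values for two hyperparameters; the search tries every combination
# and scores each one with cross-validated AUC.
param_grid = {"C": [0.01, 0.1, 1.0, 10.0], "max_iter": [100, 1000]}

search = GridSearchCV(LogisticRegression(), param_grid, scoring="roc_auc", cv=5)
search.fit(X, y)
print(search.best_params_, search.best_score_)
```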

### Goals of this guide

This guide uses the well-known Pima Indians Diabetes dataset and the Logistic Regression with mini batch gradient descent algorithm to perform the analysis. The hyperparameters of this algorithm are as follows.

Hyperparameter | Description
---|---
Iterations | Number of times the optimizer runs before completing the optimization process.
Learning Rate | Step size of the optimization algorithm.
Regularization Type | Type of regularization. WSO2 Machine Learner supports L2 and L1 regularization.
Regularization Parameter | Controls the model complexity and thereby helps to prevent overfitting.
SGD Data Fraction | Fraction of the training dataset used in a single iteration of the optimization algorithm.
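
As a rough illustration of how these hyperparameters typically appear in code, the following sketch configures a gradient-descent logistic regression with scikit-learn's SGDClassifier. This is not WSO2 ML's implementation; the parameter names and values are assumptions chosen only to mirror the table above.

```python
# Illustrative mapping of the hyperparameters above onto scikit-learn's
# SGDClassifier; WSO2 ML's own parameter names differ.
from sklearn.linear_model import SGDClassifier

model = SGDClassifier(
    loss="log_loss",           # logistic regression loss ("log" in older scikit-learn)
    max_iter=10000,            # Iterations
    learning_rate="constant",
    eta0=0.1,                  # Learning Rate (step size)
    penalty="l2",              # Regularization Type (L1 or L2)
    alpha=0.0001,              # Regularization Parameter
)
# Note: scikit-learn's SGD updates on one sample at a time, so there is no
# direct equivalent of the SGD Data Fraction hyperparameter here.
```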

This guide demonstrates how to find the optimal Learning Rate and Number of Iterations while keeping the other hyperparameters in the above list at constant values. Specifically, it covers the following goals:

- Finding the optimal Learning Rate and Number of Iterations that improve the Area Under Curve (AUC). For more information on the Area Under Curve of the ROC curve, see Model Evaluation Measures.
- Finding the relationship between the Learning Rate and the AUC.
- Finding the relationship between the Number of Iterations and the AUC.
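
For reference, the AUC value that WSO2 ML reports in the Model Summary can also be computed directly from predicted scores. The following is a small sketch using scikit-learn's roc_auc_score with made-up labels and scores.

```python
# A small sketch of how an AUC value is computed from predicted scores.
# WSO2 ML reports this value for you, so you normally do not compute it yourself.
from sklearn.metrics import roc_auc_score

y_true = [0, 0, 1, 1, 0, 1]                 # actual class labels
y_score = [0.1, 0.4, 0.35, 0.8, 0.2, 0.7]   # predicted probabilities of class 1

print(roc_auc_score(y_true, y_score))        # closer to 1.0 means better ranking
```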

### Approach to tuning hyperparameters

The approach for achieving the above goals is described below.

- Upload your dataset (e.g. the Pima Indians Diabetes dataset) to WSO2 ML. For instructions on uploading the dataset, see Exploring Data.
- Create a project, and then generate a model by creating an analysis. For instructions, see Generating Models.
- Keep the Learning Rate at a fixed value (0.1), and vary the Number of Iterations in the Step 4 Parameters section of the model generation wizard in the WSO2 ML UI as shown below.
- Record the AUC value you obtain against each Number of Iterations as shown in the example below. You can get the AUC values from the Model Summary in the WSO2 ML UI.
- Plot a graph using the results you obtained as shown in the example below. (A scripted sketch of this first sweep is given after this list.)
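
The following is a hedged sketch of the same sweep performed outside the WSO2 ML UI: fix the learning rate at 0.1, vary the number of iterations, record the AUC for each run, and plot the result. SGDClassifier, the synthetic dataset, and the listed iteration counts are stand-ins for WSO2 ML, the Pima Indians Diabetes dataset, and the values you would actually try.

```python
# Sweep the number of iterations at a fixed learning rate and plot AUC.
import matplotlib.pyplot as plt
from sklearn.datasets import make_classification
from sklearn.linear_model import SGDClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for the Pima Indians Diabetes dataset (768 rows, 8 features).
X, y = make_classification(n_samples=768, n_features=8, random_state=1)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=1)

iterations = [100, 500, 1000, 5000, 10000]
aucs = []
for n in iterations:
    model = SGDClassifier(loss="log_loss", learning_rate="constant",
                          eta0=0.1, max_iter=n, tol=None, random_state=1)
    model.fit(X_train, y_train)
    aucs.append(roc_auc_score(y_test, model.decision_function(X_test)))

plt.plot(iterations, aucs, marker="o")
plt.xlabel("Number of Iterations")
plt.ylabel("AUC")
plt.show()
```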

According to the above graph, the AUC increases with the Number of Iterations. Hence, you can pick 10000 as a fair Number of Iterations to use when searching for the optimal Learning Rate. (You may pick any number greater than 5000, which is where the AUC climbs above 0.5. However, increasing the Number of Iterations excessively would lead to an overfitted model.)

- Now, find the optimal Learning Rate based on the fair Number of Iterations you picked above. To do this, create analyses that keep the Number of Iterations at a fixed value (10000) and vary the Learning Rate in the Step 4 Parameters section of the model generation wizard in the WSO2 ML UI, as you did when varying the Number of Iterations above.

- Record the AUC value you obtain against each Learning Rate as shown in the example below.

- Plot a graph using the results you obtained as shown in the example below. (A scripted sketch of this second sweep follows this list.)
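
A similar sketch for this second sweep is shown below: the Number of Iterations is fixed at the fair value picked above (10000) and the Learning Rate is varied on a logarithmic grid. Again, the dataset, model, and grid values are illustrative assumptions, not WSO2 ML's implementation.

```python
# Sweep the learning rate at a fixed number of iterations and plot AUC.
import numpy as np
import matplotlib.pyplot as plt
from sklearn.datasets import make_classification
from sklearn.linear_model import SGDClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=768, n_features=8, random_state=1)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=1)

learning_rates = np.logspace(-4, 0, num=9)   # 0.0001 ... 1.0
aucs = []
for lr in learning_rates:
    model = SGDClassifier(loss="log_loss", learning_rate="constant",
                          eta0=lr, max_iter=10000, tol=None, random_state=1)
    model.fit(X_train, y_train)
    aucs.append(roc_auc_score(y_test, model.decision_function(X_test)))

print("Best learning rate on this grid:", learning_rates[int(np.argmax(aucs))])
plt.semilogx(learning_rates, aucs, marker="o")
plt.xlabel("Learning Rate")
plt.ylabel("AUC")
plt.show()
```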

According to the above graph, the AUC has a global maximum at a Learning Rate of 0.01 (more precisely, between 0.005 and 0.01). Hence, the AUC is maximized as the Learning Rate approaches 0.01. Therefore, 0.01 is the optimal Learning Rate for this particular dataset and algorithm.

- Now, keep the Learning Rate at 0.01, and vary the Number of Iterations again in the Step 4 Parameters section of the model generation wizard in the WSO2 ML UI, as you did in the first sweep above.
- Record the AUC value you obtain against each Number of Iterations as shown in the example below.
- Plot a graph using the results you obtained as shown in the example below.

The above graph shows that the AUC increases only slightly when you increase the Number of Iterations.

### Obtaining optimal values for the hyperparameters

Even if you increase the Number of Iterations further, the AUC will probably not improve considerably. Therefore, decide on the optimal Number of Iterations based on the above observations, while also considering how much computing power you have and what level of AUC you expect.

Alternatively, use another binary classification algorithm (e.g. Support Vector Machine), or carry out feature engineering on the dataset to reduce the noise in the training data and increase the AUC.
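
The following is a brief sketch of those two alternatives outside of WSO2 ML: training a linear Support Vector Machine, with a simple feature-engineering step (standard scaling) in front of it. The dataset and parameter values are stand-ins.

```python
# Try an SVM with a basic feature-engineering step (standard scaling) and
# compare its AUC with the logistic regression results above.
from sklearn.datasets import make_classification
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import LinearSVC

X, y = make_classification(n_samples=768, n_features=8, random_state=1)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=1)

# Scaling features often reduces noise sensitivity for margin-based models.
svm = make_pipeline(StandardScaler(), LinearSVC(C=1.0))
svm.fit(X_train, y_train)
print("SVM AUC:", roc_auc_score(y_test, svm.decision_function(X_test)))
```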