
SciPy library

  1. Implement the Rastrigin function $f: \mathbb{R}^d \to \mathbb{R}$ for $d = 10$. link

    \[f(\mathbf{x})=10 d+\sum_{i=1}^{d}\left[x_{i}^{2}-10 \cos \left(2 \pi x_{i}\right)\right]\]
    • Consider global optimization methods from here.
    • Plot 4 graphs for different $d$ from {10, 100, 1000, 10000}. On each graph, plot $f$ versus $N_{fev}$ for 5 methods from scipy: basinhopping, brute, differential_evolution, shgo, dual_annealing, where $N_{fev}$ is the number of function evaluations. This information is usually available from specific_optimizer.nfev. If you need bounds for the optimizer, use $x_i \in [-5, 5]$. A starter sketch follows below.
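
    Below is a minimal starter sketch for this task, assuming a recent scipy (>= 1.2, where all five optimizers are available) and $d = 10$. The Counted wrapper is a hypothetical helper, not part of scipy: brute does not report nfev, so the wrapper counts evaluations itself and records best-so-far values, which is exactly what the $f$ versus $N_{fev}$ plots need. The Ns=3, niter=50, and sampling_method="sobol" settings are illustrative choices to keep the run cheap.

     import numpy as np
     import scipy.optimize as opt
    
     d = 10
     bounds = [(-5.0, 5.0)] * d
    
     def rastrigin(x):
         # f(x) = 10 d + sum_i [x_i^2 - 10 cos(2 pi x_i)]; global minimum f(0) = 0
         x = np.asarray(x)
         return 10 * d + np.sum(x ** 2 - 10 * np.cos(2 * np.pi * x))
    
     class Counted:
         # Hypothetical helper: counts evaluations and records (N_fev, best f so far)
         # pairs, from which the f-versus-N_fev curve can be plotted.
         def __init__(self, f):
             self.f, self.nfev, self.history = f, 0, []
         def __call__(self, x):
             self.nfev += 1
             v = self.f(x)
             if not self.history or v < self.history[-1][1]:
                 self.history.append((self.nfev, v))
             return v
    
     x0 = np.random.default_rng(0).uniform(-5, 5, size=d)
     runs = [
         ("basinhopping", lambda f: opt.basinhopping(f, x0, niter=50).fun),
         ("brute", lambda f: opt.brute(f, bounds, Ns=3, finish=None, full_output=True)[1]),
         ("differential_evolution", lambda f: opt.differential_evolution(f, bounds, seed=0).fun),
         ("shgo", lambda f: opt.shgo(f, bounds, n=128, sampling_method="sobol").fun),
         ("dual_annealing", lambda f: opt.dual_annealing(f, bounds, seed=0).fun),
     ]
     for name, run in runs:
         counted = Counted(rastrigin)
         best = run(counted)  # counted.history now holds the curve for this method
         print(f"{name:>24}: f = {best:10.4f}, N_fev = {counted.nfev}")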
  2. Machine learning models often have hyperparameters. To choose optimal values one can use GridSearch or RandomSearch, but these algorithms are computationally inefficient and do not use any information about the structure of the optimized function. To overcome this problem one can use Bayesian optimization: the model is optimized by sequentially choosing evaluation points based on prior information about the function.

    In this task you will use the optuna package for hyperparameter optimization of RandomForestClassifier. Your task is to find the best Random Forest model on the iris dataset, varying at least 3 hyperparameters. Examples can be found here or here

     !pip install optuna
    
     import sklearn.datasets
     import sklearn.ensemble
     import sklearn.model_selection
    
     import optuna
    
     # Load the iris dataset: 150 samples, 4 features, 3 classes.
     iris = sklearn.datasets.load_iris()
     x, y = iris.data, iris.target
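
    A possible continuation, as a minimal sketch: the three hyperparameters and their ranges below are illustrative choices, not the required ones, and mean 3-fold cross-validated accuracy is used as the objective to maximize.

     def objective(trial):
         # Illustrative search space over three Random Forest hyperparameters.
         n_estimators = trial.suggest_int("n_estimators", 10, 200)
         max_depth = trial.suggest_int("max_depth", 2, 32)
         min_samples_split = trial.suggest_int("min_samples_split", 2, 10)
         clf = sklearn.ensemble.RandomForestClassifier(
             n_estimators=n_estimators,
             max_depth=max_depth,
             min_samples_split=min_samples_split,
             random_state=0,
         )
         # Mean 3-fold cross-validation accuracy as the value to maximize.
         return sklearn.model_selection.cross_val_score(clf, x, y, cv=3).mean()
    
     study = optuna.create_study(direction="maximize")
     study.optimize(objective, n_trials=100)
     print(study.best_params, study.best_value)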