From 83dfdcd2edabf3909ff12676ee65c46ba269fc0b Mon Sep 17 00:00:00 2001
From: Lopes
Date: Tue, 6 May 2025 16:09:46 +0200
Subject: [PATCH 01/10] Upload image notebook and pytest functions

---
 25_libraries_image_classification.ipynb       | 2182 +++++++++++++++++
 .../test_25_libraries_image_classification.py |  158 ++
 2 files changed, 2340 insertions(+)
 create mode 100644 25_libraries_image_classification.ipynb
 create mode 100644 tutorial/tests/test_25_libraries_image_classification.py

diff --git a/25_libraries_image_classification.ipynb b/25_libraries_image_classification.ipynb
new file mode 100644
index 00000000..59a981be
--- /dev/null
+++ b/25_libraries_image_classification.ipynb
@@ -0,0 +1,2182 @@
+{
+ "cells": [
+  {
+   "cell_type": "markdown",
+   "id": "10140aff",
+   "metadata": {},
+   "source": [
+    "# Image Classification Notebook"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "0f1a238d",
+   "metadata": {},
+   "source": [
+    "## Table of Contents\n",
+    " - [Image Classification Notebook](#Image-Classification-Notebook)\n",
+    " - [References](#References)\n",
+    " - [Libraries](#Libraries)\n",
+    " - [Introduction](#Introduction)\n",
+    " - [Classes](#Classes)\n",
+    " - [Functions](#Functions)\n",
+    " - [Dataset](#Dataset)\n",
+    " - [Load data](#Load-data)\n",
+    " - [Explore image processing](#Explore-image-processing)\n",
+    " - [Geometric transformation](#Geometric-transformation)\n",
+    " - [Scaling](#Scaling)\n",
+    " - [Cropping](#Cropping)\n",
+    " - [Horizontal flip](#Horizontal-Flip)\n",
+    " - [Vertical flip](#Vertical-Flip)\n",
+    " - [Rotation](#Rotation)\n",
+    " - [Image filtering](#Image-filtering)\n",
+    " - [Average filter](#Average-filter)\n",
+    " - [Median filter](#Median-filter)\n",
+    " - [Gaussian filter](#Gaussian-filter)\n",
+    " - [Photometric transformation](#Photometric-transformation)\n",
+    " - [Adjust brightness](#Adjust-brightness)\n",
+    " - [Adjust contrast](#Adjust-contrast)\n",
+    " - [Adjust saturation](#Adjust-saturation)\n",
+    " - [CNN model 
development](#CNN-model-development)\n", + " - [Data preprocessing](#Data-preprocessing)\n", + " - [Data Augmentation](#Data-augmentation)\n", + " - [Train, validation, and test sets](#Train-validation-and-test-sets)\n", + " - [PyTorch datasets](#Pytorch-datasets)\n", + " - [PyTorch dataloaders](#Pytorch-dataloaders)\n", + " - [Model training](#Model-training)\n", + " - [Training hyperparameters](#Training-hyperparameters)\n", + " - [Initialise model architecture](#Initialise-model-architecture)\n", + " - [Optimiser](#Optimiser)\n", + " - [Train model](#Train-model)\n", + " - [Loss function](#Loss-function)\n", + " - [Learning curves](#Learning-curves)\n", + " - [Model testing](#Model-testing)\n", + " - [Explore results](#Explore-results)\n", + " - [Compute average accuracy](#Compute-average-accuracy)\n", + " - [Compute confusion matrix](#Compute-confusion-matrix)\n", + " - [Explain image prediction](#Explain-image-prediction)\n", + " - [Load data batch](#Load-data-batch)\n", + " - [Compute GradCAM heatmap](#Compute-GradCAM-heatmap)\n", + " - [Visualise GradCAM heatmap with the image](#Visualise-GradCAM-heatmap-with-the-image)\n" + ] + }, + { + "cell_type": "markdown", + "id": "eea0d77c", + "metadata": {}, + "source": [ + "## Libraries" + ] + }, + { + "cell_type": "markdown", + "id": "a1581546", + "metadata": {}, + "source": [ + "- NumPy\n", + "- Matplotlib\n", + "- Scikit-learn\n", + "- OpenCV-Python\n", + "- PyTorch\n", + "- Albumentations\n", + "- PyTorch GradCAM" + ] + }, + { + "cell_type": "code", + "execution_count": 47, + "id": "f1f07e51", + "metadata": {}, + "outputs": [], + "source": [ + "# Important modules\n", + "import pytest\n", + "import os\n", + "import pickle\n", + "import numpy as np\n", + "import pandas as pd\n", + "from matplotlib import pyplot as plt\n", + "\n", + "from sklearn.model_selection import train_test_split\n", + "from sklearn.metrics import confusion_matrix\n", + "\n", + "import cv2\n", + "\n", + "import torch\n", + "import torch.nn 
as nn\n",
+    "import torch.optim as optim\n",
+    "from torch.utils.data import Dataset, DataLoader\n",
+    "import albumentations as A\n",
+    "\n",
+    "from pytorch_grad_cam import GradCAM\n",
+    "from pytorch_grad_cam.utils.model_targets import ClassifierOutputTarget\n",
+    "from pytorch_grad_cam.utils.image import show_cam_on_image"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "d11781b5",
+   "metadata": {},
+   "source": [
+    "## References\n",
+    "\n",
+    "Here are some additional references to guide your self-study:\n",
+    "- Official documentation for [OpenCV](https://docs.opencv.org/4.x/d6/d00/tutorial_py_root.html)\n",
+    "- Official documentation for [Pillow (PIL)](https://pillow.readthedocs.io/en/stable/)\n",
+    "- Official documentation for [PyTorch](https://pytorch.org/)\n",
+    "- Official documentation for [Albumentations](https://albumentations.ai/)\n",
+    "- Official documentation for [PyTorch GradCAM](https://jacobgil.github.io/pytorch-gradcam-book/introduction.html)\n",
+    "- [A Microsoft tutorial on image classification with PyTorch](https://learn.microsoft.com/en-us/windows/ai/windows-ml/tutorials/pytorch-train-model)"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "d55143be",
+   "metadata": {},
+   "source": [
+    "## Introduction\n",
+    "\n",
+    "Image classification is a foundational task in computer vision and machine learning. This notebook aims to provide practical experience in image processing and in building and evaluating image classification models. It begins by demonstrating how to load and preprocess image data using Matplotlib and OpenCV-Python. Then, it shows how to build a basic image classification pipeline based on Convolutional Neural Networks (CNNs) using PyTorch, Albumentations, and Scikit-learn. 
Next, it covers how to evaluate model performance using Scikit-learn and NumPy, and finally, it introduces model explainability using Grad-CAM.\n", + "\n", + "The goal of this notebook is not to teach the underlying algorithms and procedures used in this field, but rather to give the user an idea of what can be done with these Python libraries." + ] + }, + { + "cell_type": "markdown", + "id": "9a037a61", + "metadata": {}, + "source": [ + "## Classes\n", + "\n", + "The following three classes are essential for improving modularity and readability.\n", + "\n", + "- **ImageDataset** is used to load images along with their labels and to perform image augmentation.\n", + "- **ImageClassifier** is responsible for building the image classification model, which in this case is based on Convolutional Neural Networks (CNNs).\n", + "- **Trainer** handles the training and evaluation processes using batches of data.\n", + "\n", + "By organizing the code in this way, we simplify debugging and future extensions.\n" + ] + }, + { + "cell_type": "code", + "execution_count": 2, + "id": "0644e0df", + "metadata": {}, + "outputs": [], + "source": [ + "class ImageDataset(Dataset):\n", + " def __init__(self, images, labels, transform=None):\n", + " self.images = images\n", + " self.labels = labels\n", + " self.transform = transform\n", + " \n", + " def __len__(self):\n", + " return len(self.images)\n", + " \n", + " def __getitem__(self, idx):\n", + " image = self.images[idx]\n", + " label = self.labels[idx]\n", + " \n", + " # Ensure the image is in the shape (H, W, C) for Albumentations\n", + " image = np.transpose(image, (1, 2, 0))\n", + " \n", + " # Apply Albumentations transforms\n", + " if self.transform:\n", + " augmented = self.transform(image=image)\n", + " image = augmented['image']\n", + "\n", + " return image, label\n", + " \n", + "class ImageClassifier(nn.Module):\n", + " def __init__(self, in_channels=1, out_classes = 1):\n", + " super(ImageClassifier, self).__init__()\n", + " 
\n", + " # Best model weights\n", + " self.best_model_weights = None\n", + " \n", + " # Convolutional block layers\n", + " self.feature_maps = 64\n", + " \n", + " # First convolutional block\n", + " self.conv1 = nn.Conv2d(in_channels, self.feature_maps, kernel_size = 3)\n", + " self.pool1 = nn.MaxPool2d(kernel_size = 2)\n", + " self.bn1 = nn.BatchNorm2d(self.feature_maps)\n", + " \n", + " # Second convolutional block\n", + " self.conv2 = nn.Conv2d(self.feature_maps, self.feature_maps * 2, kernel_size = 3)\n", + " self.pool2 = nn.MaxPool2d(kernel_size = 2)\n", + " self.bn2 = nn.BatchNorm2d(self.feature_maps * 2)\n", + " \n", + " # Third convolutional block\n", + " self.conv3 = nn.Conv2d(self.feature_maps * 2, self.feature_maps * 4, kernel_size = 3)\n", + " self.pool3 = nn.MaxPool2d(kernel_size = 2)\n", + " self.bn3 = nn.BatchNorm2d(self.feature_maps * 4)\n", + " \n", + " # Rectified linear unit (activation function)\n", + " self.relu = nn.ReLU()\n", + " \n", + " # Flatten layer that produces the features vector\n", + " self.flatten = nn.Flatten(start_dim=1)\n", + " \n", + " # Dropout layer is responsible for introducing regularisation into training process\n", + " self.dropout = nn.Dropout(p = 0.3)\n", + " \n", + " # Number of output classes\n", + " self.out_classes = out_classes\n", + " \n", + " # Classification layer \n", + " self.fc = nn.Linear(1024, self.out_classes)\n", + " \n", + " def forward(self, x):\n", + " # Convolutional block 1\n", + " x = self.conv1(x)\n", + " x = self.pool1(x)\n", + " x = self.relu(x)\n", + " x = self.bn1(x)\n", + " \n", + " # Convolutional block 2\n", + " x = self.conv2(x)\n", + " x = self.pool2(x)\n", + " x = self.relu(x)\n", + " x = self.bn2(x)\n", + " \n", + " # Convolutional block 3\n", + " x = self.conv3(x)\n", + " x = self.pool3(x)\n", + " x = self.relu(x)\n", + " x = self.bn3(x)\n", + " \n", + " # Classifier\n", + " x = self.flatten(x)\n", + " x = self.dropout(x)\n", + " x = self.fc(x)\n", + " return x\n", + "\n", + "class 
Trainer():\n", + " def __init__(self, model):\n", + " self.model = model\n", + " self.train_losses = []\n", + " self.val_losses = []\n", + " \n", + " def fit(self, epochs, train_dataloader, val_dataloader, optimizer, criterion, device, early_stopping_limit = 0):\n", + " early_stopping_count = 0\n", + " best_val_loss = 9999\n", + " best_epoch = 0\n", + " \n", + " # Training log file\n", + " log_filename = \"training_log.txt\"\n", + " with open(log_filename, \"w\") as log_file:\n", + " log_file.write(\"Epoch,Train Loss,Val Loss,Best Val Loss,Best Epoch\\n\")\n", + "\n", + " \n", + " for epoch in range(epochs):\n", + " # Train mode\n", + " self.model.train()\n", + " \n", + " train_loss = 0.0\n", + " train_samples_count = 0.0\n", + " for i, data in enumerate(train_dataloader):\n", + " inputs, labels = data\n", + " inputs = inputs.to(device)\n", + " labels = labels.long().to(device)\n", + " \n", + " # zero the parameter gradients\n", + " optimizer.zero_grad()\n", + "\n", + " # Forward step\n", + " outputs = self.model(inputs)\n", + " \n", + " # Backward step\n", + " loss = criterion(outputs, labels)\n", + " loss.backward()\n", + " \n", + " # Weight optimisation\n", + " optimizer.step()\n", + "\n", + " train_loss += loss.item()\n", + " train_samples_count += 1\n", + " \n", + " # Validation mode\n", + " self.model.eval()\n", + " \n", + " val_loss = 0.0\n", + " val_samples_count = 0.0\n", + " for i, data in enumerate(val_dataloader):\n", + " inputs, labels = data\n", + " inputs = inputs.to(device)\n", + " labels = labels.long().to(device)\n", + " \n", + " outputs = self.model(inputs)\n", + " loss = criterion(outputs, labels)\n", + " \n", + " val_loss += loss.item()\n", + " val_samples_count += 1\n", + " \n", + " train_loss /= train_samples_count\n", + " val_loss /= val_samples_count\n", + " \n", + " self.train_losses.append(train_loss)\n", + " self.val_losses.append(val_loss)\n", + " \n", + " early_stopping_count += 1\n", + " \n", + " if val_loss < best_val_loss:\n", + " 
best_epoch = epoch\n", + " best_val_loss = val_loss\n", + " early_stopping_count = 0\n", + " self.model.best_model_weights = self.model.state_dict()\n", + " \n", + " \n", + " print(f'Epoch: {epoch}, Loss: {train_loss}, Val Loss: {val_loss}. The best val loss is {best_val_loss} in epoch {best_epoch}.')\n", + " \n", + " with open(log_filename, \"a\") as log_file:\n", + " log_file.write(f\"{epoch},{train_loss},{val_loss},{best_val_loss},{best_epoch}\\n\")\n", + " \n", + " if early_stopping_count == early_stopping_limit and early_stopping_limit > 0:\n", + " break\n", + " \n", + " def predict(self, test_dataloader, device):\n", + " # Load best weights\n", + " if self.model.best_model_weights:\n", + " self.model.load_state_dict(self.model.best_model_weights)\n", + "\n", + " # Test mode\n", + " self.model.eval()\n", + " \n", + " original_images = []\n", + " true_labels = []\n", + " predicted_labels = []\n", + "\n", + " for data in test_dataloader:\n", + " images, labels = data\n", + " images = images.to(device)\n", + " outputs = self.model(images)\n", + " _, predicted = torch.max(outputs, 1)\n", + " \n", + " images = images.cpu().detach().numpy()\n", + " labels = labels.numpy()\n", + " predicted = predicted.cpu().detach().numpy()\n", + " \n", + " original_images.append(images)\n", + " true_labels.append(labels)\n", + " predicted_labels.append(predicted)\n", + " \n", + " original_images = np.concatenate(original_images)\n", + " true_labels = np.concatenate(true_labels)\n", + " predicted_labels = np.concatenate(predicted_labels)\n", + " \n", + " return original_images, true_labels, predicted_labels\n" + ] + }, + { + "cell_type": "markdown", + "id": "4f522994", + "metadata": {}, + "source": [ + "## Functions\n", + "\n", + "The following three functions are going to be used throughout the notebook. They comprise the loading of binary files using Pickle (**load_pickle_file**), single image plotting (**plot_image**), and multiple image plotting (**plot_multiple_images**)." 
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 3,
+   "id": "44ba288a",
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "def load_pickle_file(filepath):\n",
+    "    # Load a binary file serialised with Pickle\n",
+    "    with open(filepath, \"rb\") as f:\n",
+    "        return pickle.load(f)\n",
+    "\n",
+    "def plot_image(img, figsize=(2, 3)):\n",
+    "    # Plot a single image without axes\n",
+    "    plt.figure(figsize=figsize)\n",
+    "    plt.imshow(img)\n",
+    "    plt.axis(\"off\")\n",
+    "\n",
+    "def plot_multiple_images(*images_titles, figsize=(2, 3)):\n",
+    "    # Plot (image, title) pairs side by side\n",
+    "    num_images = len(images_titles)\n",
+    "    fig, axs = plt.subplots(1, num_images, figsize=figsize)\n",
+    "    # plt.subplots returns a single Axes (not an array) when num_images is 1\n",
+    "    axs = np.atleast_1d(axs)\n",
+    "    for i in range(num_images):\n",
+    "        axs[i].imshow(images_titles[i][0])\n",
+    "        axs[i].set_title(images_titles[i][1])\n",
+    "        axs[i].axis(\"off\")"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "650cf38c",
+   "metadata": {},
+   "source": [
+    "## Dataset\n",
+    "\n",
+    "In this section, we load the CIFAR-10 dataset, which consists of 60,000 32x32 color images across 10 different classes, with 6,000 images per class. The dataset is divided into 50,000 training images and 10,000 test images. It has already been processed and is ready to use after loading the binary files *train_set.pkl* and *test_set.pkl*."
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "c610a4ed",
+   "metadata": {},
+   "source": [
+    "### Load data\n",
+    "\n",
+    "The training and test sets are loaded with the Pickle library."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 4,
+   "id": "44f45fed",
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "# File paths for the data sets\n",
+    "dataset_folder = os.path.join(\"CIFAR10\")\n",
+    "train_set_file = os.path.join(dataset_folder, \"train_set.pkl\")\n",
+    "test_set_file = os.path.join(dataset_folder, \"test_set.pkl\")\n",
+    "\n",
+    "# Load sets\n",
+    "train_set = load_pickle_file(train_set_file)\n",
+    "test_set = load_pickle_file(test_set_file)\n",
+    "\n",
+    "# CIFAR-10 class names\n",
+    "CIFAR_10_CLASSES = [\n",
+    "    \"Airplane\", \"Automobile\", \"Bird\", \"Cat\", \"Deer\",\n",
+    "    \"Dog\", \"Frog\", \"Horse\", \"Ship\", \"Truck\"\n",
+    "]"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "ff15666e",
+   "metadata": {},
+   "source": [
+    "## Explore image processing\n",
+    "\n",
+    "Image processing is fundamental to computer vision, forming the basis for interpreting and analyzing visual information. By applying techniques such as resizing, filtering, color adjustments, and data augmentation, image processing enhances input quality, minimizes noise, and corrects distortions. These methods can also simulate real-world variability, helping models generalize better. In this notebook, we explore three categories of image transformations: geometric transformations, image filtering, and photometric transformations."
+ ] + }, + { + "cell_type": "markdown", + "id": "f4ad942e", + "metadata": {}, + "source": [ + "### Example image" + ] + }, + { + "cell_type": "code", + "execution_count": 5, + "id": "2b6795a0", + "metadata": {}, + "outputs": [ + { + "data": { + "image/png": "iVBORw0KGgoAAAANSUhEUgAAAK8AAACvCAYAAACLko51AAAAOnRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjEwLjEsIGh0dHBzOi8vbWF0cGxvdGxpYi5vcmcvc2/+5QAAAAlwSFlzAAAPYQAAD2EBqD+naQAAD+pJREFUeJztncmPXFcVxt9cVa/nbreHtttJ7EBCGLJAIEWAhAISfwULliC2bCPgv2HFLkiRgEQQAjGxkmBwbMft9jx0u+ea34DKYnPv9zl90+4o3Orvt3tH901Vp67uV+fcc8K6rutACA+JvuwHEOKgyHmFt8h5hbfIeYW3yHmFt8h5hbfIeYW3yHmFt8h5hbckrgPfeONXYNt+cB9svXYPb9KYMA0R/mbOv3gebOfOoy0gAcG7d26D7T8XLhjHqysrMKYkP90oxY+k0crBNjs1DbbpmZnPPB4xNz8HtpmZebDlkzhuagqv15rEZ2vmpq3Zsj7/IAjirAW2KgiJDaldprwSv6eqwqtFMV7sO69+zeEGmnmFx8h5hbfIecX4r3nnFpfAtrhwAmxnzzyH584fM44HYQpjwiQDG0t46/W6YHvp5PNgO//yt4zjlatXYcz25gbYtjbQduvmDbDdvoW2xFoytjJ8z3LQAVuaxGBrNnHNmzSaOG4K17OtqUnjeHZhEcbMzuP3OTOL95ycwbX9FLG1JqeM47iBa/E4QXdLYnx3VzTzCm+R8wpvkfMKb5HzivEXbF99Cf84vnblGtjWt3fBllt/rjdaKGR6vT2wZRmKuGqAgq3dRxG0ePyUcfzaaRR1d2+tgq2zvQW21773fbDdf3gXnzdtGMezlogZceljM3gy4p0/vgm28hEGVaIIgwh1iLa4ke37OcYVnpeScUnDfKcR+QQGOGYs8T41fwbGzM1hMGZhYQFs3/7Gy4ELmnmFt8h5hbfIeYW3yHnFEYiwTaH4OPfiV8B25/ZNsG1sPDSOp0l2VKOJIiCLMcI2keHvrdsbgK0uTUFSFDAkmJnBiNKgj4KwKPH6yyTjrdWcNY4nc/N4xLHlF8DWIZHEt37/O7DFBY7LYhS/aWU+b9XF54/KIdh6RBBWRBCukVyz+lNLvMckwhZhNK1BBOHPfvnzwAXNvMJb5LzCW+S8YvzXvJf/9RHYpheOg62V4O9h8/Ej47hL1mDHT57Gm0YlmIYkjX9A1oJhZdoi63hESnZNzM1hxtS77/4ZbFMtXKu98vXvGsd9su4b4CsF04snwTZMUANsbm6CLU9w/Zlb6+AGyeYKE3x+VrSOfGxBjcvgoK6t5xjsOmUJ7nYOXipPM6/wFjmv8BY5r/AWOa8Yf8G2sbUGtksf/gNsaYEC4uQL5tagARmTT+J2ljw3M8NG1OT3Ri4XdLqmYCD/jwfDQR9sn3z0Adguvv0W2CYm8HlPLZrPe2KZBF6ISPzmK6+CLfnpL8B2lwSAtrfWwba7Y25l2tvBTLl2uw22bhcDNMMhBjNqIu3C0PxeMiI4sxQDKrm1Tf/zoJlXeIucV3iLnFd4i5xXjL9gY3W3bnRw6876AzODbES3Mhf9U8cwMheS7KVWE+sULJD6EUmCQqDfNbcGtVq4xeXa1ctge++vfwFbVGJYbGsdhdI9q2ZaYwq3uGS5WVNhxCzJbvvBD1/H5yBZX90eCq9OxxSr7d1tGPPwDoq/1RtYi+Lap586idUzZ5aN4wVS06PVQhE3P49bg1zRzCu8Rc4rv
EXOK7xFzivGX7AFJIVuluzDf7iCtRCalnjauXMLz3uIQu+DixfB9gqJRuUTmMY46JtFronWCT6++D7Ytkk0qihQsFUlhvVChxTA4QAjVns1ii4WeGqkKHha5N1n5kxB3CQF/7IIbTvbWP/i9ddxu9OJEyjGJq1i20kzdyou3SSi3BXNvMJb5LzCW+S8wlvkvGL8BVuP5B1mTbfq18XQ3LNWk0rgD+6Z+9xGXL+BXX7ee+/vYItI7YIkNp9jcR5rKARD0rmI/Jx3d3A/1oJVfXxEZhW4C0nXo7Ii4o9sbEvTzKlyeUWEY69nvtfVKxhJfPftP4FtdRWL+y0t4d7C9c3HYKstuZo0MQqXkJTIgqRc/ugnPw5c0MwrvEXOK7xFziu8Rc4rxl+wzZI0xofXLju1JupZEbYgw9umdh+oUfSogeP2On2nRX9ltcbaIXu9SpJOODOLwm5AKm/0+vgce3t7nykan4zp4XnTpBVsNayc0k3bbRSTV6xUz39ewL2GKytX8FrW84+4cfO6U7GWyqpEEsWkGjvxjYJUQPzNb38duKCZV3iLnFd4i5xXeIucV4y/YFtexlZQVy/8DWyPt3G/VHfTFClnnj8LYyKyhy0iEarQpULhEwFhCoGCRLEmWpiOt7OLAmi3jSKrRZ7NTuFcfYSfxRTZrzaRYzQqI/2Zr179BGybpBjM6qpZpXxzCyNiZY2fR81KQpLPuyR7+uyvoCatsliKKPuOXdHMK7xFziu8Rc4rvEXOK8ZfsOUxiptTRMQNSbn7om9GwPoDXLhv7WB64pDUj0+JyApJWmBpRbIKsmerjvFZkwZJr+yjQOmT9gKXrplC6fEHH8KYvEVSKUkaaU3evWtHKoORMCXiyVJPMUkZDQJSNjOq3UQWiRwGsfW8tdu1qCJ0RDOv8BY5r/AWOa84AtuAdnG9dXrJLK42YnIWazl0H5oVtzc28c/7NssWYz1XWYtR0oq0Ks1zB6Sa9+bODtgyUuMgZAXu+tiOa8+qFdEfsnfCNWpM5hDWLoptK4pI1Mauj8BiD1Ho1kKqJHqCs//12JqXBZ1c0cwrvEXOK7xFziu8Rc4rxl+w9XvY5ohtc5mbxqypwj6XrO07pI1SRuo7dK2aBCMqsg0osf40Z8IgIn/K93ooTCOrTdPTLjgYoIhzES000MAemBSqK/e941PuSb4EVnmdVax3gQYkWOAiODiaeYW3yHmFt8h5hbfIecX4C7ZOZxNsN63tJiNaTdyvPzs9ZRz3icCKsCB5sLgw7ySKuh0UWQPrHgNSkTwhgjCO8fc8HBZOkbLSFlRUoBDxxIJYLAJGxFNNo1bmuJrcgEUND5PaVZzRTDM3NPMKb5HzCm+R8wpvkfOK8Rds7194B2x3b2Gv2jTBBXh7z1RjSRNbMk1O4vaYM6dOgW17A5XdJqkj0LK2I21u4XmsZEBBUgC7XSzIFwfZoYkPGsRiRkfBdphRLBZho2LsgO9+0PNGaOYV3iLnFd4i5xXeIucV4y/Yrl+5BLaNdaw2fu7cc2BrWLUWegOMWA0GmOqYkr5SIUkCjImo2O2YKZZ1hNG0BhGOBak0XhNBOKjwHXCvmFsUi1YzIO/kavsyOKjwUqE9cSSR8wpvkfMKb5HzivEXbOt37oKtKtk+K7xkKzfbQz1auwNjJkkBut09TMNMs3DfXrsjulbmZCvHdlHb23j9usDUybyFlct3uqSHcFHvWxCEiTiWJsmDboe4n4wQEVF7mNG0wxacmnmFt8h5hbfIeYW3yHnF+Au2nS6KojwlraBI6mFiRdhyUt2ctLMN+qRP7yRp+9QjBVFqqxr7sMa9b3VBbESLlMTIUidtmRWSYiXPkgL4LOe6XCsm0a6KjGOtrA6KXdHy86CZV3iLnFd4i5xXeIucV4y/YOuSoh1xgGmBG+v3wLZ44qRxfHrpOIxpNnBP2MZjTLlcX3vsVNY/j0xbRqJHx5fM5xrxYB1bDmzu7B1QsIWHGnn6ogVbScQT7
/8c7iviXFMdFWETRxI5r/AWOa8Y/zVv0cW1YMV8vyRrpNpcGycJrq1OnsL15/FjJ8D2h+tvgm3p1BLYWlZHqk4PAxLtIf7ZXpC+T+w9WVV1lyXps2RWsT/0a4dCe2yjUe14fde1qz2OnXeYGWpP7nHgM4X4kpHzCm+R8wpvkfOK8RdsZ4/lYFuYR9vsHIqs1NqC0ytRPK2tPwLbc6fPg2359FmwLR4ztxmNKKzAxb1/X4Yx61tYo2FQufX8DWnv3i+22BwXdiEZ5zAm2D/I8vR7IrZAi+PYrZf0M6CZV3iLnFd4i5xXeIucV4y/YDu/fAxs+RTWWkgnUDzdvGdmhz3e3YExnTYRcWc3wHbyNFZLX1t7ALaV1dvG8d0HazAmCEmdAmYjUbcvusAdE3GsN3DNRKIVKXOtvE57INeRa2nAzzx8Ks/wMWrmFd4i5xXeIucV3iLnFeMv2CZmsF5C1EBx1iEpkZXVzzcJcctPq4FCabeNaZjtIfYZXlnFllobGzv7pjryyJPr9pv9U/7c0hWfAong1eTUhIi4yhJUrPdwRaNppO9yiVGxsiapk9blIuJa9nP97+mCg6KZV3iLnFd4i5xXeIucV4y/YJs5hnvMbt3HlMKb9zGSVVoiZdBFEdCzS5kHQbDVxuJ+IanI1yd70Wx9liSJU2V3to+L1oIL9y8Q57pni2m4xBK5T56DCKWafIVhavZdrkmNCdb+qyIF9IqSvQMRe1YkLgzJc7HPLDx40T7NvMJb5LzCW+S8wlvkvGL8BVufbD+6cw/3nd0hqYcDWz1V+JspSD/ifAKjekmBi/5ySESFdc8oJRGxyrGwBw4LQlqIZP+5oKLplez65K6OVcpjq6gg24OXsQhh7BZdpKLWEoXVAKvaRywyFyvCJo4gcl7hLXJe4S1yXnEEKqO3MRVxOMSK5BFJqyuHdvSscoooxUQsJGR9n5H0vqphRpkGBYvksPREJpTImeRUe4+Za5cmtjeNFQWJA3yHiDxcVJqRyZhcv0UijkmCaakh2dNXkO8dK8XjGPa9x0QkuqKZV3iLnFd4i5xXeIucV4y/YOvtYfpj0cWevyFLv7NERUn2RTFhUA/7Tnu2mO6qG2Z/46LGaw1I1cLasQoGa/tk9+l1rVjPUgzZHjM20+SkRUKemudO5w0ck2P/54hUdmSppLxk//6FTpgoT7ODz5+aeYW3yHmFt8h5xfiveasCt+TMT1v9op7SpsrOSKsrrNuQxnitLCE20oa1rHDctrWebZLtQ0UTF2YDUhq9IFlrLABhr4Pplh+ylo1JZlWWYEBiZgLXqSfmZ3Bcy3zXZoafWZS4tWWNYxbMSPc9NyStvli19Jisg13RzCu8Rc4rvEXOK7xFzivGX7CFJEtocR6F1+ICLuaryhQfUYB/mseR26Pwugpom+6YmWxpgxQKJAGPfg+FEtnRQoMULoX2IiI4M7JFqZXh5z3Jgg2tfF8RFJOgQkSyudh3EEWpW5FBO7uNTotsK5a2AYkjiJxXeIucV3iLnFeMv2BjKVIJidIwW5qakaE0RuHBUsNqxzoFA1LzwRYfU9MobKoai/uFAQqqUV4cjItKh37Ebr18I2YjT0HrO4T7VzjnkbN033oPTxNsrIieLX5D2naLvBUr9+6IZl7hLXJe4S1yXuEtcl5xBCJsJErDUtyyDBf4zaZpS4hYYKmCLHLGBBtr1ZSnLeM4JRGlglwrjMg2psix1oL1GbF3ci4V4VwrIkabPZDU0gioOGPXchxnvTutx0D6GLOCha5o5hXeIucV3iLnFd4i5xXeEta8sa4Q//do5hXeIucV3iLnFd4i5xXeIucV3iLnFd4i5xXeIucV3iLnFYGv/Bev9xSmm2mi6gAAAABJRU5ErkJggg==", + "text/plain": [ + "
" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "# Select image\n", + "image = train_set[0][9]\n", + "\n", + "# Convert image from (C, H, W) to (H, W, C)\n", + "image = np.transpose(image, (1,2,0))\n", + "\n", + "# Plot image\n", + "plot_image(image)" + ] + }, + { + "cell_type": "markdown", + "id": "0fcff42b", + "metadata": {}, + "source": [ + "### Geometric transformation\n", + "\n", + "Geometric transformations alter the spatial structure of the image while preserving its semantic content. They help the model become invariant to different orientations and scales:\n", + "- **Scaling**: Resizes the image to a specific size, often required to match input dimensions for image classifiers. It uses interpolation to obtain the new pixel-values.\n", + "- **Cropping**: Extracts a subregion of the image; useful for focusing on important parts or adding variability.\n", + "- **Horizontal and vertical flip**: Flips the image along the x-axis or y-axis; helps the model learn symmetry.\n", + "- **Rotation**: Rotates the image by a small angle to simulate different orientations of the objects." + ] + }, + { + "cell_type": "markdown", + "id": "25325640", + "metadata": {}, + "source": [ + "#### Scaling" + ] + }, + { + "cell_type": "code", + "execution_count": 6, + "id": "9f272277", + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "
" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + }, + { + "data": { + "text/html": [ + "
🔄 IPytest extension (re)loaded.
" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "%reload_ext tutorial.tests.testsuite" + ] + }, + { + "cell_type": "code", + "execution_count": 7, + "id": "02f2480c", + "metadata": {}, + "outputs": [ + { + "data": { + "application/vnd.jupyter.widget-view+json": { + "model_id": "c2fd61632d3e45ea9b2d33d713e16199", + "version_major": 2, + "version_minor": 0 + }, + "text/plain": [ + "VBox(children=(Output(outputs=({'output_type': 'display_data', 'data': {'text/plain': 'HTML(value=\\'
" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "# Scale image by half\n", + "scaled_image = solution_scale_image(image, 0.5)\n", + "\n", + "if scaled_image is not None:\n", + " # Use this function to plot images side by side\n", + " plot_multiple_images((image, \"Original\"), (scaled_image, \"Scaled\"), figsize = (4, 5))" + ] + }, + { + "cell_type": "markdown", + "id": "24bdd539", + "metadata": {}, + "source": [ + "#### Cropping" + ] + }, + { + "cell_type": "code", + "execution_count": 9, + "id": "cc7e4291", + "metadata": {}, + "outputs": [ + { + "data": { + "application/vnd.jupyter.widget-view+json": { + "model_id": "60feb9e32dda4b03abcf6d6eb1f4f0ae", + "version_major": 2, + "version_minor": 0 + }, + "text/plain": [ + "VBox(children=(Output(outputs=({'output_type': 'display_data', 'data': {'text/plain': 'HTML(value=\\'
" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "# Crop image to get a 15-by-15 image starting on (x,y): (2,2)\n", + "cropped_image = solution_crop_image(image, 2, 2, 15, 15)\n", + "\n", + "if cropped_image is not None:\n", + " # Use this function to plot images side by side\n", + " plot_multiple_images((image, \"Original\"), (cropped_image, \"Cropped\"), figsize = (4, 5))" + ] + }, + { + "cell_type": "markdown", + "id": "4a447b09", + "metadata": {}, + "source": [ + "#### Horizontal Flip" + ] + }, + { + "cell_type": "code", + "execution_count": 11, + "id": "6dc1507a", + "metadata": {}, + "outputs": [ + { + "data": { + "application/vnd.jupyter.widget-view+json": { + "model_id": "20e0f6f8ddb443b8ad30199289244442", + "version_major": 2, + "version_minor": 0 + }, + "text/plain": [ + "VBox(children=(Output(outputs=({'output_type': 'display_data', 'data': {'text/plain': 'HTML(value=\\'
" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "# Flip image horizontally\n", + "flip_image_horizontal = solution_horizontal_flip_image(image)\n", + "\n", + "if flip_image_horizontal is not None:\n", + " # Use this function to plot images side by side\n", + " plot_multiple_images((image, \"Original\"), (flip_image_horizontal, \"Horizontal Flip\"), figsize = (4, 5))" + ] + }, + { + "cell_type": "markdown", + "id": "0b473067", + "metadata": {}, + "source": [ + "#### Vertical Flip" + ] + }, + { + "cell_type": "code", + "execution_count": 13, + "id": "1d9325d9", + "metadata": {}, + "outputs": [ + { + "data": { + "application/vnd.jupyter.widget-view+json": { + "model_id": "9d99e8b0985d40d484a4b43a7177b431", + "version_major": 2, + "version_minor": 0 + }, + "text/plain": [ + "VBox(children=(Output(outputs=({'output_type': 'display_data', 'data': {'text/plain': 'HTML(value=\\'
" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "# Flip image vertically\n", + "flip_image_vertical = solution_vertical_flip_image(image)\n", + "\n", + "if flip_image_vertical is not None:\n", + " # Use this function to plot images side by side\n", + " plot_multiple_images((image, \"Original\"), (flip_image_vertical, \"Vertical Flip\"), figsize = (4, 5))" + ] + }, + { + "cell_type": "markdown", + "id": "be8a0c63", + "metadata": {}, + "source": [ + "#### Rotation" + ] + }, + { + "cell_type": "code", + "execution_count": 15, + "id": "c1469ab1", + "metadata": {}, + "outputs": [ + { + "data": { + "application/vnd.jupyter.widget-view+json": { + "model_id": "f0439770c28b41ca8b82262623da1391", + "version_major": 2, + "version_minor": 0 + }, + "text/plain": [ + "VBox(children=(Output(outputs=({'output_type': 'display_data', 'data': {'text/plain': 'HTML(value=\\'
" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "# Rotate image by 20 degrees\n", + "rotated_image = solution_rotate_image(image, 20)\n", + "\n", + "if rotated_image is not None:\n", + " # Use this function to plot images side by side\n", + " plot_multiple_images((image, \"Original\"), (rotated_image, \"Rotated\"), figsize = (4, 5))" + ] + }, + { + "cell_type": "markdown", + "id": "c38eb125", + "metadata": {}, + "source": [ + "### Image filtering\n", + "\n", + "Filtering helps reduce noise and enhance specific image features. These are often used as a form of preprocessing before feeding images into a model:\n", + "- **Average filter**: Applies a smoothing effect by replacing each pixel with the average of its neighborhood.\n", + "- **Median filter**: Reduces salt-and-pepper noise by replacing each pixel with the median of neighboring pixels.\n", + "- **Gaussian filter**: Applies a Gaussian blur to smooth the image, often used to reduce high-frequency noise." + ] + }, + { + "cell_type": "markdown", + "id": "d850c8c1", + "metadata": {}, + "source": [ + "#### Average filter " + ] + }, + { + "cell_type": "code", + "execution_count": 17, + "id": "cd187048", + "metadata": {}, + "outputs": [ + { + "data": { + "application/vnd.jupyter.widget-view+json": { + "model_id": "20a47b710a72400ea2634ba607c59b8e", + "version_major": 2, + "version_minor": 0 + }, + "text/plain": [ + "VBox(children=(Output(outputs=({'output_type': 'display_data', 'data': {'text/plain': 'HTML(value=\\'
" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "# Filter image using average filter\n", + "average_filter_image = solution_average_filter(image, (3, 3))\n", + "\n", + "if average_filter_image is not None:\n", + " # Use this function to plot images side by side\n", + " plot_multiple_images((image, \"Original\"), (average_filter_image, \"Average filter\"), figsize = (4, 5))" + ] + }, + { + "cell_type": "markdown", + "id": "8abdf202", + "metadata": {}, + "source": [ + "#### Median filter" + ] + }, + { + "cell_type": "code", + "execution_count": 19, + "id": "c20b2bfb", + "metadata": {}, + "outputs": [ + { + "data": { + "application/vnd.jupyter.widget-view+json": { + "model_id": "a2e262f6bce44cafa6b074a008d8185b", + "version_major": 2, + "version_minor": 0 + }, + "text/plain": [ + "VBox(children=(Output(outputs=({'output_type': 'display_data', 'data': {'text/plain': 'HTML(value=\\'
" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "# Filter image using median filter\n", + "median_filter_image = solution_median_filter(image, 3)\n", + "\n", + "if median_filter_image is not None:\n", + " # Use this function to plot images side by side\n", + " plot_multiple_images((image, \"Original\"), (median_filter_image, \"Median filter\"), figsize = (4, 5))" + ] + }, + { + "cell_type": "markdown", + "id": "cc4f9ed1", + "metadata": {}, + "source": [ + "#### Gaussian filter" + ] + }, + { + "cell_type": "code", + "execution_count": 21, + "id": "4f123cf6", + "metadata": {}, + "outputs": [ + { + "data": { + "application/vnd.jupyter.widget-view+json": { + "model_id": "0fdd6fe9103a400d876ac423afea8f85", + "version_major": 2, + "version_minor": 0 + }, + "text/plain": [ + "VBox(children=(Output(outputs=({'output_type': 'display_data', 'data': {'text/plain': 'HTML(value=\\'
" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "# Filter image using Gaussian filter\n", + "gaussian_filter_image = solution_gaussian_filter(image, (7, 7), 0)\n", + "\n", + "if gaussian_filter_image is not None:\n", + " # Use this function to plot images side by side\n", + " plot_multiple_images((image, \"Original\"), (gaussian_filter_image, \"Gaussian filter\"), figsize = (4, 5))" + ] + }, + { + "cell_type": "markdown", + "id": "e736d5c9", + "metadata": {}, + "source": [ + "### Photometric transformation\n", + "\n", + "Photometric transformations modify the color properties of an image to simulate different lighting conditions and improve model robustness to brightness and contrast changes:\n", + "- **Brightness**: Randomly increases or decreases the brightness of the image.\n", + "- **Contrast**: Alters the difference between light and dark regions in the image.\n", + "- **Saturation**: Modifies the intensity of the colors in the image." + ] + }, + { + "cell_type": "markdown", + "id": "1d8fa3ce", + "metadata": {}, + "source": [ + "#### Adjust brightness" + ] + }, + { + "cell_type": "code", + "execution_count": 23, + "id": "04be16c2", + "metadata": {}, + "outputs": [ + { + "data": { + "application/vnd.jupyter.widget-view+json": { + "model_id": "ebd850c398c4419b9f53ca0b7b98e810", + "version_major": 2, + "version_minor": 0 + }, + "text/plain": [ + "VBox(children=(Output(outputs=({'output_type': 'display_data', 'data': {'text/plain': 'HTML(value=\\'
" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "# Brighter image (positive brightness value)\n", + "brighter_image = solution_adjust_brightness(image, 100)\n", + "\n", + "# Darker image (negative brightness value)\n", + "darker_image = solution_adjust_brightness(image, -100)\n", + "\n", + "if brighter_image is not None and darker_image is not None:\n", + " # Use this function to plot images side by side\n", + " plot_multiple_images((image, \"Original\"), (brighter_image, \"Brighter image\"), (darker_image, \"Darker image\"), figsize = (7, 8))" + ] + }, + { + "cell_type": "markdown", + "id": "bb558cf5", + "metadata": {}, + "source": [ + "#### Adjust contrast" + ] + }, + { + "cell_type": "code", + "execution_count": 25, + "id": "00b40d8d", + "metadata": {}, + "outputs": [ + { + "data": { + "application/vnd.jupyter.widget-view+json": { + "model_id": "046897f270d041869151a9e100385004", + "version_major": 2, + "version_minor": 0 + }, + "text/plain": [ + "VBox(children=(Output(outputs=({'output_type': 'display_data', 'data': {'text/plain': 'HTML(value=\\'
" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "# Increase contrast (Value > 1.0)\n", + "high_contrast_image = solution_adjust_contrast(image, 2.0)\n", + "\n", + "# Reduce contrast (Value < 1.0)\n", + "low_contrast_image = solution_adjust_contrast(image, 0.5)\n", + "\n", + "if high_contrast_image is not None and low_contrast_image is not None:\n", + " # Use this function to plot images side by side\n", + " plot_multiple_images((image, \"Original\"), (high_contrast_image, \"High contrast image\"), (low_contrast_image, \"Low contrast image\"), figsize = (7, 8))" + ] + }, + { + "cell_type": "markdown", + "id": "7d83b5da", + "metadata": {}, + "source": [ + "#### Adjust saturation" + ] + }, + { + "cell_type": "code", + "execution_count": 27, + "id": "6a42dd8e", + "metadata": {}, + "outputs": [ + { + "data": { + "application/vnd.jupyter.widget-view+json": { + "model_id": "cc5750f6d3774b86b413108e6f898cbc", + "version_major": 2, + "version_minor": 0 + }, + "text/plain": [ + "VBox(children=(Output(outputs=({'output_type': 'display_data', 'data': {'text/plain': 'HTML(value=\\'
" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "# Decrease saturation\n", + "low_saturation_image = solution_adjust_saturation(image, 0.2)\n", + "\n", + "# Increase saturation\n", + "high_saturation_image = solution_adjust_saturation(image, 2.5)\n", + "\n", + "if low_saturation_image is not None and high_saturation_image is not None:\n", + " # Use this function to plot images side by side\n", + " plot_multiple_images((image, \"Original\"), (low_saturation_image, \"Low saturation image\"), (high_saturation_image, \"High saturation image\"), figsize = (7, 8))" + ] + }, + { + "cell_type": "markdown", + "id": "8d90478e", + "metadata": {}, + "source": [ + "## Image classifier development using CNNs\n", + "\n", + "Image classification is the task of assigning a label or category to an input image from a predefined set of classes. It is a fundamental problem in computer vision with widespread applications, including facial recognition, medical imaging, quality control, and autonomous driving. This section outlines the key steps involved in developing an image classification model using PyTorch. It begins with data preprocessing, which includes splitting the dataset into training, validation, and test sets. Afterwards, it defines data augmentation strategies using the Albumentations library, loads the data as PyTorch datasets, and initialises PyTorch dataloaders to efficiently feed data during training. The next step is model training, where a CNN-based model is initialized and optimised using the training and validation data. After training, the model is evaluated on the test set to assess its performance. The evaluation includes metrics such as accuracy and the confusion matrix, which help interpret the model's predictive behavior. 
Finally, the PyTorch Grad-CAM library is used to visualise the regions of input images that contribute most to the model’s decisions, providing insights into model explainability using representative examples." + ] + }, + { + "cell_type": "markdown", + "id": "d78dee96", + "metadata": {}, + "source": [ + "### Data preprocessing" + ] + }, + { + "cell_type": "markdown", + "id": "8ec8f785", + "metadata": {}, + "source": [ + "#### Train, validation, and test sets\n", + "\n", + "```train_test_split``` from Scikit-learn can be used to split the original training set into training and validation sets. The test set is already defined by the dataset's authors." + ] + }, + { + "cell_type": "code", + "execution_count": 29, + "id": "8aba72de", + "metadata": {}, + "outputs": [], + "source": [ + "# Train and validation sets\n", + "X_train, y_train = train_set[0], train_set[1]\n", + "X_train, X_val, y_train, y_val = train_test_split(X_train, y_train, test_size = 0.3, random_state = 42)\n", + "\n", + "# Test set\n", + "X_test, y_test = test_set[0], test_set[1]" + ] + }, + { + "cell_type": "markdown", + "id": "e84c416a", + "metadata": {}, + "source": [ + "#### Data augmentation\n", + "\n", + "Data augmentation is a crucial technique in image classification that helps improve the performance and robustness of machine learning models. It involves generating new training samples by applying random transformations — such as rotation, flipping, cropping, scaling, or color jittering — to the original images. Albumentations is one of the most widely used libraries for performing data augmentation in image classification tasks. 
It includes augmentation techniques that replicate operations commonly used in image processing, such as:\n", + "- ```A.Affine``` for scaling;\n", + "- ```A.Rotate``` for rotation;\n", + "- ```A.HorizontalFlip``` for horizontal flipping;\n", + "- ```A.VerticalFlip``` for vertical flipping;\n", + "- ```A.ColorJitter``` for color jittering.\n", + "\n", + "Albumentations can also be used for image normalization (```A.Normalize```), resizing (```A.Resize```), and converting images to PyTorch tensors with the (Channel, Height, Width) format using ```A.ToTensorV2```, which is required for model training." + ] + }, + { + "cell_type": "code", + "execution_count": 30, + "id": "3c0cc98c", + "metadata": {}, + "outputs": [], + "source": [ + "TARGET_SIZE = 32\n", + "\n", + "# Transformations performed on train set\n", + "train_transform = A.Compose([\n", + " A.Affine(scale = (0.2, 1.5), p = 0.1),\n", + " A.Rotate(limit = 45, p = 0.1),\n", + " A.HorizontalFlip(p = 0.1),\n", + " A.VerticalFlip(p = 0.1),\n", + " A.ColorJitter(brightness = (0.5, 1.5), contrast = (0.5, 1.5), saturation = (0.5, 1.5), hue = (0,0), p = 0.1), # Applied only to 'image'\n", + " A.Normalize(mean=(0.4914, 0.4822, 0.4465), std=(0.2470, 0.2435, 0.2616)), # Applied only to 'image'\n", + " A.Resize(height = TARGET_SIZE, width = TARGET_SIZE),\n", + " A.ToTensorV2()\n", + "])\n", + "\n", + "# Transformations performed on validation and test sets\n", + "val_transform = A.Compose([\n", + " A.Normalize(mean=(0.4914, 0.4822, 0.4465), std=(0.2470, 0.2435, 0.2616)), # Applied only to 'image'\n", + " A.Resize(height = TARGET_SIZE, width = TARGET_SIZE),\n", + " A.ToTensorV2()\n", + "])" + ] + }, + { + "cell_type": "markdown", + "id": "46993f36", + "metadata": {}, + "source": [ + "#### PyTorch datasets\n", + "\n", + "The ```ImageDataset``` class is based on the PyTorch ```Dataset``` class and is used for loading the images and their corresponding labels, applying transformations (such as data augmentation), and returning them 
in a format suitable for model training, validation, and testing." + ] + }, + { + "cell_type": "code", + "execution_count": 31, + "id": "ef2d8891", + "metadata": {}, + "outputs": [], + "source": [ + "# Dataset classes necessary for the data loaders\n", + "train_dataset = ImageDataset(X_train, y_train, transform = train_transform)\n", + "val_dataset = ImageDataset(X_val, y_val, transform = val_transform)\n", + "test_dataset = ImageDataset(X_test, y_test, transform = val_transform)" + ] + }, + { + "cell_type": "markdown", + "id": "fad563a4", + "metadata": {}, + "source": [ + "#### PyTorch dataloaders\n", + "\n", + "```DataLoader``` is essential for training efficiency and performance. It abstracts the complexity of batching, shuffling, and parallel data access, allowing you to focus on building and training your models." + ] + }, + { + "cell_type": "code", + "execution_count": 32, + "id": "dec6c703", + "metadata": {}, + "outputs": [], + "source": [ + "# Data loaders needed for the model training (only the training data needs shuffling)\n", + "train_dataloader = DataLoader(train_dataset, batch_size = 128, shuffle=True)\n", + "val_dataloader = DataLoader(val_dataset, batch_size = 128, shuffle=False)\n", + "test_dataloader = DataLoader(test_dataset, batch_size = 128, shuffle=False)" + ] + }, + { + "cell_type": "markdown", + "id": "0a4bcc59", + "metadata": {}, + "source": [ + "### Model training\n", + "\n", + "Model training comprises a series of steps. First, we check which device is available for training: if a CUDA-capable GPU is available, it should be used, as it greatly speeds up training; otherwise, we fall back to the CPU. Then, the model and training hyperparameters are defined, such as the number of output classes, the number of training epochs, the number of consecutive non-improving epochs allowed before early stopping halts training, and the learning rate. Other hyperparameters can be defined, depending on what the user wants to control during training. In this notebook, we define the number of epochs, which is the number of times the model sees the full training set. Early stopping is a regularisation technique that helps avoid overfitting: the model is evaluated on a validation set after every epoch, and if the validation loss does not decrease for a predefined number of consecutive epochs, optimisation stops and the checkpoint with the lowest validation loss is restored (see [Early Stopping](https://paperswithcode.com/method/early-stopping)). The learning rate defines how quickly the model weights change during training. If it is too high, the weights change very quickly and may repeatedly overshoot minima; if it is too low, the weights may get stuck in a local minimum. Although it is not done in this notebook, this hyperparameter should be tuned to find a suitable value (see [What is learning rate in machine learning?](https://www.ibm.com/think/topics/learning-rate)). After defining the hyperparameters, we define the loss function used to evaluate the model and send it to the training device. Next, the model is initialised using the ```ImageClassifier``` class and sent to the device, followed by the optimiser. Finally, we train the model, unless optimised weights are already available, and explore the learning curves."
+ ] + }, + { + "cell_type": "markdown", + "id": "12b90a68", + "metadata": {}, + "source": [ + "#### Check which device is used for training" + ] + }, + { + "cell_type": "code", + "execution_count": 33, + "id": "76810a36", + "metadata": {}, + "outputs": [], + "source": [ + "DEVICE = torch.device(\"cuda:0\" if torch.cuda.is_available() else \"cpu\")" + ] + }, + { + "cell_type": "markdown", + "id": "3df5da88", + "metadata": {}, + "source": [ + "#### Define training hyperparameters" + ] + }, + { + "cell_type": "code", + "execution_count": 34, + "id": "a3f6513c", + "metadata": {}, + "outputs": [], + "source": [ + "# Get number of output classes\n", + "NUMBER_CLASSES = len(CIFAR_10_CLASSES)\n", + "\n", + "# Number of training epochs\n", + "EPOCHS = 500\n", + "\n", + "# Number of consecutive not improving epochs needed for stopping the training\n", + "EARLY_STOPPING_LIMIT = EPOCHS // 10\n", + "\n", + "# Learning rate\n", + "LR = 0.001" + ] + }, + { + "cell_type": "markdown", + "id": "7f0d3c3d", + "metadata": {}, + "source": [ + "#### Loss function\n", + "\n", + "The cross entropy loss function is defined by:\n", + "\n", + "$$\n", + "\\mathcal{L} = -\\sum_{i=1}^{C} y_i \\log(\\hat{y}_i)\n", + "$$\n", + "\n", + "Where:\n", + "\n", + "$\\mathcal{L}$: Cross-entropy loss\n", + "\n", + "$C$: Total number of classes\n", + "\n", + "$y_i$: Ground truth indicator for class $i$, where $y_i = 1$ if class $i$ is the correct class, otherwise $y_i = 0$\n", + "\n", + "$\\hat{y}_i$: Predicted probability for class $i$, typically from the softmax output, where $0 \\leq \\hat{y}_i \\leq 1$ and $\\sum_{i=1}^{C} \\hat{y}_i = 1$" + ] + }, + { + "cell_type": "code", + "execution_count": 35, + "id": "596d2d55", + "metadata": {}, + "outputs": [], + "source": [ + "# Cross Entropy Loss\n", + "criterion = nn.CrossEntropyLoss()\n", + "criterion = criterion.to(DEVICE)" + ] + }, + { + "cell_type": "markdown", + "id": "bbd59740", + "metadata": {}, + "source": [ + "#### Initialise model architecture" + ] 
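The ```ImageClassifier``` class is defined earlier in the notebook; its exact architecture is not shown in this excerpt. Purely for illustration — with hypothetical layer sizes, not the notebook's actual network — a minimal CNN with the same constructor signature for 32x32 RGB inputs could look like:

```python
import torch
import torch.nn as nn

class SimpleImageClassifier(nn.Module):
    """Minimal CNN for 32x32 RGB inputs: two conv blocks plus a linear head."""
    def __init__(self, in_channels=3, out_classes=10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(in_channels, 16, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),   # 32x32 -> 16x16
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),   # 16x16 -> 8x8
        )
        self.classifier = nn.Linear(32 * 8 * 8, out_classes)

    def forward(self, x):
        x = self.features(x)
        return self.classifier(torch.flatten(x, 1))

# A batch of four 32x32 RGB images yields one logit vector per image
logits = SimpleImageClassifier()(torch.randn(4, 3, 32, 32))
print(logits.shape)  # torch.Size([4, 10])
```

Note that the model outputs raw logits rather than probabilities, which is what ```nn.CrossEntropyLoss``` above expects (it applies the softmax internally).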
+ }, + { + "cell_type": "code", + "execution_count": 36, + "id": "37aaa6e0", + "metadata": {}, + "outputs": [], + "source": [ + "# Initialise image classifier\n", + "model = ImageClassifier(in_channels = 3, out_classes = NUMBER_CLASSES)\n", + "model = model.to(DEVICE)" + ] + }, + { + "cell_type": "markdown", + "id": "6e6283d1", + "metadata": {}, + "source": [ + "#### Optimiser\n", + "\n", + "In this notebook, we use the Adam optimiser, which is one of the most widely used optimisers for deep neural networks (see [Gentle Introduction to the Adam Optimisation Algorithm for Deep Learning](https://machinelearningmastery.com/adam-optimization-algorithm-for-deep-learning/)).\n", + "\n", + "The parameter update at each step is given by:\n", + "\n", + "$$\n", + "\\begin{aligned}\n", + "m_t &= \\beta_1 m_{t-1} + (1 - \\beta_1) g_t \\\\\n", + "v_t &= \\beta_2 v_{t-1} + (1 - \\beta_2) g_t^2 \\\\\n", + "\\hat{m}_t &= \\frac{m_t}{1 - \\beta_1^t} \\\\\n", + "\\hat{v}_t &= \\frac{v_t}{1 - \\beta_2^t} \\\\\n", + "\\theta_t &= \\theta_{t-1} - \\alpha \\frac{\\hat{m}_t}{\\sqrt{\\hat{v}_t} + \\epsilon}\n", + "\\end{aligned}\n", + "$$\n", + "\n", + "Where:\n", + "\n", + "$\\theta_t$: Parameters at time step $t$\n", + "\n", + "$g_t$: Gradient of the loss with respect to parameters at step $t$\n", + "\n", + "$m_t$: Exponentially decaying average of past gradients (1st moment)\n", + "\n", + "$v_t$: Exponentially decaying average of past squared gradients (2nd moment)\n", + "\n", + "$\\hat{m}_t$, $\\hat{v}_t$: Bias-corrected estimates of $m_t$ and $v_t$\n", + "\n", + "$\\alpha$: Learning rate\n", + "\n", + "$\\beta_1$: Decay rate for the first moment estimate (typically 0.9)\n", + "\n", + "$\\beta_2$: Decay rate for the second moment estimate (typically 0.999)\n", + "\n", + "$\\epsilon$: Small constant to prevent division by zero (e.g., 1e-8)" + ] + }, + { + "cell_type": "code", + "execution_count": 37, + "id": "ad47d6b7", + "metadata": {}, + "outputs": [], + "source": [ + 
"# Adam Optimiser\n", + "optimizer = optim.Adam(model.parameters(), lr = LR)" + ] + }, + { + "cell_type": "markdown", + "id": "c046cb9c", + "metadata": {}, + "source": [ + "#### Train model\n", + "\n", + "Here, we train the model. First, we initialise the class ```Trainer``` which we are going to use for training and evaluating the model using the PyTorch ```Dataset```s defined before. In case, some model weights are already available, we can skip the training and using them." + ] + }, + { + "cell_type": "code", + "execution_count": 38, + "id": "ee081ef8", + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Training model weights...\n", + "Epoch: 0, Loss: 1.520462845363756, Val Loss: 1.1197907672090046. The best val loss is 1.1197907672090046 in epoch 0.\n", + "Epoch: 1, Loss: 1.2096626140775473, Val Loss: 0.9215332592948008. The best val loss is 0.9215332592948008 in epoch 1.\n", + "Epoch: 2, Loss: 1.0686169616932417, Val Loss: 0.8447397245188891. The best val loss is 0.8447397245188891 in epoch 2.\n", + "Epoch: 3, Loss: 0.9770413578426751, Val Loss: 0.7714212577221757. The best val loss is 0.7714212577221757 in epoch 3.\n", + "Epoch: 4, Loss: 0.9027988477780001, Val Loss: 0.7413592813378673. The best val loss is 0.7413592813378673 in epoch 4.\n", + "Epoch: 5, Loss: 0.853520769925013, Val Loss: 0.7288939409841926. The best val loss is 0.7288939409841926 in epoch 5.\n", + "Epoch: 6, Loss: 0.8115674629263634, Val Loss: 0.7225121598122484. The best val loss is 0.7225121598122484 in epoch 6.\n", + "Epoch: 7, Loss: 0.7751083819970598, Val Loss: 0.6755026561223855. The best val loss is 0.6755026561223855 in epoch 7.\n", + "Epoch: 8, Loss: 0.7429245090397605, Val Loss: 0.680803889945402. The best val loss is 0.6755026561223855 in epoch 7.\n", + "Epoch: 9, Loss: 0.7036612864828458, Val Loss: 0.6538166893740832. 
The best val loss is 0.6538166893740832 in epoch 9.\n", + "Epoch: 10, Loss: 0.6798463224494544, Val Loss: 0.6429420563123994. The best val loss is 0.6429420563123994 in epoch 10.\n", + "Epoch: 11, Loss: 0.6614546892199203, Val Loss: 0.6293074839701087. The best val loss is 0.6293074839701087 in epoch 11.\n", + "Epoch: 12, Loss: 0.6360901489745091, Val Loss: 0.6260964428469286. The best val loss is 0.6260964428469286 in epoch 12.\n", + "Epoch: 13, Loss: 0.6167764896458953, Val Loss: 0.6097777591923536. The best val loss is 0.6097777591923536 in epoch 13.\n", + "Epoch: 14, Loss: 0.6087375462707811, Val Loss: 0.619213687666392. The best val loss is 0.6097777591923536 in epoch 13.\n", + "Epoch: 15, Loss: 0.5897037236794939, Val Loss: 0.6073367350687415. The best val loss is 0.6073367350687415 in epoch 15.\n", + "Epoch: 16, Loss: 0.5765072896967839, Val Loss: 0.6440611239206993. The best val loss is 0.6073367350687415 in epoch 15.\n", + "Epoch: 17, Loss: 0.563063828723274, Val Loss: 0.6116426238569163. The best val loss is 0.6073367350687415 in epoch 15.\n", + "Epoch: 18, Loss: 0.5490653581210296, Val Loss: 0.6325035504365372. The best val loss is 0.6073367350687415 in epoch 15.\n", + "Epoch: 19, Loss: 0.5475345523688044, Val Loss: 0.6214407298524501. The best val loss is 0.6073367350687415 in epoch 15.\n", + "Epoch: 20, Loss: 0.5203386396169662, Val Loss: 0.644289857755273. The best val loss is 0.6073367350687415 in epoch 15.\n", + "Epoch: 21, Loss: 0.5306917810744612, Val Loss: 0.620820519530167. The best val loss is 0.6073367350687415 in epoch 15.\n", + "Epoch: 22, Loss: 0.5145865923514331, Val Loss: 0.5965031617778843. The best val loss is 0.5965031617778843 in epoch 22.\n", + "Epoch: 23, Loss: 0.5126297283564171, Val Loss: 0.6246055566658408. The best val loss is 0.5965031617778843 in epoch 22.\n", + "Epoch: 24, Loss: 0.5013029677589445, Val Loss: 0.6039554145881685. 
The best val loss is 0.5965031617778843 in epoch 22.\n", + "Epoch: 25, Loss: 0.4936089391690971, Val Loss: 0.6187645522719722. The best val loss is 0.5965031617778843 in epoch 22.\n", + "Epoch: 26, Loss: 0.49248143774967124, Val Loss: 0.6178014038477914. The best val loss is 0.5965031617778843 in epoch 22.\n", + "Epoch: 27, Loss: 0.4751875848665725, Val Loss: 0.6290953432099294. The best val loss is 0.5965031617778843 in epoch 22.\n", + "Epoch: 28, Loss: 0.47110289161222696, Val Loss: 0.6119560217958385. The best val loss is 0.5965031617778843 in epoch 22.\n", + "Epoch: 29, Loss: 0.47182121448708275, Val Loss: 0.6084022969007492. The best val loss is 0.5965031617778843 in epoch 22.\n", + "Epoch: 30, Loss: 0.46568958635312796, Val Loss: 0.6035911913140345. The best val loss is 0.5965031617778843 in epoch 22.\n", + "Epoch: 31, Loss: 0.46517014171737825, Val Loss: 0.6001845184019057. The best val loss is 0.5965031617778843 in epoch 22.\n", + "Epoch: 32, Loss: 0.45642245550007715, Val Loss: 0.6151119955515457. The best val loss is 0.5965031617778843 in epoch 22.\n", + "Epoch: 33, Loss: 0.44822045453708537, Val Loss: 0.6187824904918671. The best val loss is 0.5965031617778843 in epoch 22.\n", + "Epoch: 34, Loss: 0.4419831280616948, Val Loss: 0.592142356654345. The best val loss is 0.592142356654345 in epoch 34.\n", + "Epoch: 35, Loss: 0.44394656189166715, Val Loss: 0.5911562273562965. The best val loss is 0.5911562273562965 in epoch 35.\n", + "Epoch: 36, Loss: 0.44761017992766233, Val Loss: 0.6003621490830082. The best val loss is 0.5911562273562965 in epoch 35.\n", + "Epoch: 37, Loss: 0.44031569524838104, Val Loss: 0.6164587758860346. The best val loss is 0.5911562273562965 in epoch 35.\n", + "Epoch: 38, Loss: 0.44285188463047476, Val Loss: 0.5972995141805229. The best val loss is 0.5911562273562965 in epoch 35.\n", + "Epoch: 39, Loss: 0.43360351151140936, Val Loss: 0.6004025703769619. 
The best val loss is 0.5911562273562965 in epoch 35.\n", + "Epoch: 40, Loss: 0.41746894465963336, Val Loss: 0.5901112660007962. The best val loss is 0.5901112660007962 in epoch 40.\n", + "Epoch: 41, Loss: 0.42153748733936436, Val Loss: 0.599281668915587. The best val loss is 0.5901112660007962 in epoch 40.\n", + "Epoch: 42, Loss: 0.4245654776366088, Val Loss: 0.6247793034476748. The best val loss is 0.5901112660007962 in epoch 40.\n", + "Epoch: 43, Loss: 0.4211408660685929, Val Loss: 0.6417396566120245. The best val loss is 0.5901112660007962 in epoch 40.\n", + "Epoch: 44, Loss: 0.41268631853979, Val Loss: 0.6215485965801497. The best val loss is 0.5901112660007962 in epoch 40.\n", + "Epoch: 45, Loss: 0.4059799904153295, Val Loss: 0.6040090186110998. The best val loss is 0.5901112660007962 in epoch 40.\n", + "Epoch: 46, Loss: 0.4065084534622457, Val Loss: 0.6168524281958402. The best val loss is 0.5901112660007962 in epoch 40.\n", + "Epoch: 47, Loss: 0.40375757086886105, Val Loss: 0.6170856871847379. The best val loss is 0.5901112660007962 in epoch 40.\n", + "Epoch: 48, Loss: 0.4098697109487805, Val Loss: 0.5972580624333883. The best val loss is 0.5901112660007962 in epoch 40.\n", + "Epoch: 49, Loss: 0.4095643772913592, Val Loss: 0.6005579427642337. The best val loss is 0.5901112660007962 in epoch 40.\n", + "Epoch: 50, Loss: 0.4083644581319642, Val Loss: 0.6048085048037061. The best val loss is 0.5901112660007962 in epoch 40.\n", + "Epoch: 51, Loss: 0.403082884249896, Val Loss: 0.6078571540824438. The best val loss is 0.5901112660007962 in epoch 40.\n", + "Epoch: 52, Loss: 0.3945612990073044, Val Loss: 0.5963720141325967. The best val loss is 0.5901112660007962 in epoch 40.\n", + "Epoch: 53, Loss: 0.38647738012084126, Val Loss: 0.5921772230984801. The best val loss is 0.5901112660007962 in epoch 40.\n", + "Epoch: 54, Loss: 0.39561996859138027, Val Loss: 0.5898459821434344. 
The best val loss is 0.5898459821434344 in epoch 54.\n", + "Epoch: 55, Loss: 0.3939645201185324, Val Loss: 0.5995454742746839. The best val loss is 0.5898459821434344 in epoch 54.\n", + "Epoch: 56, Loss: 0.38574477915998795, Val Loss: 0.5923815730769756. The best val loss is 0.5898459821434344 in epoch 54.\n", + "Epoch: 57, Loss: 0.3892661157968271, Val Loss: 0.6093509543245121. The best val loss is 0.5898459821434344 in epoch 54.\n", + "Epoch: 58, Loss: 0.3850240001495737, Val Loss: 0.5916378082612813. The best val loss is 0.5898459821434344 in epoch 54.\n", + "Epoch: 59, Loss: 0.378469897556479, Val Loss: 0.5968938746442229. The best val loss is 0.5898459821434344 in epoch 54.\n", + "Epoch: 60, Loss: 0.3794175671943783, Val Loss: 0.5946346276392371. The best val loss is 0.5898459821434344 in epoch 54.\n", + "Epoch: 61, Loss: 0.383537115134897, Val Loss: 0.6071822138155921. The best val loss is 0.5898459821434344 in epoch 54.\n", + "Epoch: 62, Loss: 0.3816265944037994, Val Loss: 0.6084112055220846. The best val loss is 0.5898459821434344 in epoch 54.\n", + "Epoch: 63, Loss: 0.38150809210364833, Val Loss: 0.5979497970665916. The best val loss is 0.5898459821434344 in epoch 54.\n", + "Epoch: 64, Loss: 0.37301460979846274, Val Loss: 0.6065514082625761. The best val loss is 0.5898459821434344 in epoch 54.\n", + "Epoch: 65, Loss: 0.36878351867198944, Val Loss: 0.5835936852430893. The best val loss is 0.5835936852430893 in epoch 65.\n", + "Epoch: 66, Loss: 0.3737880415181174, Val Loss: 0.5992799398757643. The best val loss is 0.5835936852430893 in epoch 65.\n", + "Epoch: 67, Loss: 0.3712460848428037, Val Loss: 0.5899697612907927. The best val loss is 0.5835936852430893 in epoch 65.\n", + "Epoch: 68, Loss: 0.37315521574150906, Val Loss: 0.5957560945870513. The best val loss is 0.5835936852430893 in epoch 65.\n", + "Epoch: 69, Loss: 0.362878164725147, Val Loss: 0.5935364683805886. 
The best val loss is 0.5835936852430893 in epoch 65.\n", + "Epoch: 70, Loss: 0.3722979333931512, Val Loss: 0.6141205157263804. The best val loss is 0.5835936852430893 in epoch 65.\n", + "Epoch: 71, Loss: 0.36279642804913276, Val Loss: 0.5836885872028642. The best val loss is 0.5835936852430893 in epoch 65.\n", + "Epoch: 72, Loss: 0.35015128895531605, Val Loss: 0.6027285529900406. The best val loss is 0.5835936852430893 in epoch 65.\n", + "Epoch: 73, Loss: 0.3609611526893003, Val Loss: 0.5830024175219617. The best val loss is 0.5830024175219617 in epoch 73.\n", + "Epoch: 74, Loss: 0.35839762582178536, Val Loss: 0.5948921146534257. The best val loss is 0.5830024175219617 in epoch 73.\n", + "Epoch: 75, Loss: 0.3572802374506519, Val Loss: 0.5998453563552791. The best val loss is 0.5830024175219617 in epoch 73.\n", + "Epoch: 76, Loss: 0.3575136511430253, Val Loss: 0.5765268831687459. The best val loss is 0.5765268831687459 in epoch 76.\n", + "Epoch: 77, Loss: 0.3491816062779322, Val Loss: 0.6038693424503682. The best val loss is 0.5765268831687459 in epoch 76.\n", + "Epoch: 78, Loss: 0.3558763565264479, Val Loss: 0.5848387520192033. The best val loss is 0.5765268831687459 in epoch 76.\n", + "Epoch: 79, Loss: 0.3427315055236329, Val Loss: 0.5766063006752629. The best val loss is 0.5765268831687459 in epoch 76.\n", + "Epoch: 80, Loss: 0.3547864577522243, Val Loss: 0.5917557159722862. The best val loss is 0.5765268831687459 in epoch 76.\n", + "Epoch: 81, Loss: 0.34569333091269444, Val Loss: 0.588234972903284. The best val loss is 0.5765268831687459 in epoch 76.\n", + "Epoch: 82, Loss: 0.34583661112472086, Val Loss: 0.5996746274374299. The best val loss is 0.5765268831687459 in epoch 76.\n", + "Epoch: 83, Loss: 0.3544433234925688, Val Loss: 0.5710372688659167. The best val loss is 0.5710372688659167 in epoch 83.\n", + "Epoch: 84, Loss: 0.34202558868122795, Val Loss: 0.58054094142833. 
The best val loss is 0.5710372688659167 in epoch 83.\n", + "Epoch: 85, Loss: 0.3480827844273435, Val Loss: 0.6032005470182936. The best val loss is 0.5710372688659167 in epoch 83.\n", + "Epoch: 86, Loss: 0.34498164269828446, Val Loss: 0.5817765525336993. The best val loss is 0.5710372688659167 in epoch 83.\n", + "Epoch: 87, Loss: 0.34134020775991636, Val Loss: 0.5917026044453605. The best val loss is 0.5710372688659167 in epoch 83.\n", + "Epoch: 88, Loss: 0.3539756306440291, Val Loss: 0.5930931666644953. The best val loss is 0.5710372688659167 in epoch 83.\n", + "Epoch: 89, Loss: 0.3384290711179267, Val Loss: 0.598258935799033. The best val loss is 0.5710372688659167 in epoch 83.\n", + "Epoch: 90, Loss: 0.32630713126302635, Val Loss: 0.5862780529058585. The best val loss is 0.5710372688659167 in epoch 83.\n", + "Epoch: 91, Loss: 0.33687664508602044, Val Loss: 0.5769727050752963. The best val loss is 0.5710372688659167 in epoch 83.\n", + "Epoch: 92, Loss: 0.3473127528372472, Val Loss: 0.5762458418385458. The best val loss is 0.5710372688659167 in epoch 83.\n", + "Epoch: 93, Loss: 0.33563532669396295, Val Loss: 0.5968525631953094. The best val loss is 0.5710372688659167 in epoch 83.\n", + "Epoch: 94, Loss: 0.3314749488647837, Val Loss: 0.595398830660319. The best val loss is 0.5710372688659167 in epoch 83.\n", + "Epoch: 95, Loss: 0.3329750731804945, Val Loss: 0.590960522576914. The best val loss is 0.5710372688659167 in epoch 83.\n", + "Epoch: 96, Loss: 0.3353428637241795, Val Loss: 0.5752199105287003. The best val loss is 0.5710372688659167 in epoch 83.\n", + "Epoch: 97, Loss: 0.34178659967044844, Val Loss: 0.5732296922449338. The best val loss is 0.5710372688659167 in epoch 83.\n", + "Epoch: 98, Loss: 0.32314487758779176, Val Loss: 0.5811086026793819. The best val loss is 0.5710372688659167 in epoch 83.\n", + "Epoch: 99, Loss: 0.32910647651139835, Val Loss: 0.5770133462245182. 
The best val loss is 0.5710372688659167 in epoch 83.\n", + "Epoch: 100, Loss: 0.3238052527291061, Val Loss: 0.5814551895452758. The best val loss is 0.5710372688659167 in epoch 83.\n", + "...\n", + "Epoch: 123, Loss: 0.30696533712809976, Val Loss: 0.5706274071234768. The best val loss is 0.5706274071234768 in epoch 123.\n", + "Epoch: 125, Loss: 0.3112110183948148, Val Loss: 0.5667502364869845. The best val loss is 0.5667502364869845 in epoch 125.\n", + "...\n", + "Epoch: 133, Loss: 0.30767959122457644, Val Loss: 0.5657109239343869. The best val loss is 0.5657109239343869 in epoch 133.\n", + "Epoch: 135, Loss: 0.3097518128852775, Val Loss: 0.5628624608961202. The best val loss is 0.5628624608961202 in epoch 135.\n", + "...\n", + "Epoch: 146, Loss: 0.3016812792659676, Val Loss: 0.5617153816303965. The best val loss is 0.5617153816303965 in epoch 146.\n", + "...\n", + "Epoch: 163, Loss: 0.2881297772056865, Val Loss: 0.559442984224376. The best val loss is 0.559442984224376 in epoch 163.\n", + "...\n", + "Epoch: 209, Loss: 0.2734516680131864, Val Loss: 0.5506142446045148. The best val loss is 0.5506142446045148 in epoch 209.\n", + "...\n", + "Epoch: 259, Loss: 0.2604094168891872, Val Loss: 0.5730891451239586. 
The best val loss is 0.5506142446045148 in epoch 209.\n" + ] + } + ], + "source": [ + "# Train the image classifier\n", + "trainer = Trainer(model)\n", + "model_path = \"cnn_weights.pt\"\n", + "\n", + "if os.path.exists(model_path):\n", + " print(\"Loading model weights...\")\n", + " model.load_state_dict(torch.load(model_path, weights_only=True))\n", + " model.eval()\n", + "else:\n", + " print(\"Training model weights...\")\n", + " # Fit model\n", + " trainer.fit(EPOCHS, train_dataloader, val_dataloader, optimizer, criterion, DEVICE, EARLY_STOPPING_LIMIT)\n", + " # Save the best model weights to the path defined above\n", + " torch.save(trainer.model.best_model_weights, model_path)" + ] + }, + { + "cell_type": "markdown", + "id": "1accead2", + "metadata": {}, + "source": [ + "#### Learning curves\n", + "\n", + "After training the model, we can analyse the learning curves to assess the training process. These curves, which typically display the loss over epochs for both the training and validation sets, are crucial for diagnosing and improving model performance. They can help identify issues like overfitting or underfitting. Overfitting occurs when the model performs well on the training data but poorly on the validation data, usually indicated by a widening gap between the two curves. Underfitting, on the other hand, is suggested when both the training and validation curves show poor performance and fail to improve. By monitoring these curves, we can adjust hyperparameters or modify the model architecture to address such issues." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 39, + "id": "5ca13d69", + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "Text(0.5, 1.0, 'Learning curve of image classification model')" + ] + }, + "execution_count": 39, + "metadata": {}, + "output_type": "execute_result" + }, + { + "data": { + "image/png": "iVBORw0KGgoAAAANSUhEUgAAAjcAAAHHCAYAAABDUnkqAAAAOnRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjEwLjEsIGh0dHBzOi8vbWF0cGxvdGxpYi5vcmcvc2/+5QAAAAlwSFlzAAAPYQAAD2EBqD+naQAAfbVJREFUeJzt3Qd4U9UbBvC3G1po2XvvPWTJkiEyVBSFv7gBB6LgwokDxAVuHCiCilsQFRRBEBBEBGTJkL132bOlO//nPbc3TbookDQd7+950mbn5uYm97vf+c45fg6HwwERERGRPMLf1wsgIiIi4kkKbkRERCRPUXAjIiIieYqCGxEREclTFNyIiIhInqLgRkRERPIUBTciIiKSpyi4ERERkTxFwY2IiIjkKQpuJNeoUqUK+vfv7+vFyFfOnj2Le+65B2XKlIGfnx8eeeSRDO+rz8fSsWNHc/KVF154wXxWrhISEvDkk0+iYsWK8Pf3R69evcz1vB/vn924nXB7yY8u5Xviq88rN1Jwk898/vnn5guyYsUKXy+K5AKvvvqq2Wbuv/9+fPXVV7jjjjt8vUhyET777DO88cYb6NOnD7744gs8+uijXn/NAwcOmB3x6tWrvf5aIqkFprlGJIfavHmzOeqU7PPHH3/g8ssvx4gRI857X30+OcNzzz2Hp59+Os3nWL58ebzzzjtu1587dw6BgYFeC25GjhxpMhVNmjRxu23ChAlISkryyuuKkH6JxCeYJo+Li7ugx4SEhCAoKAh5UVRUFHKiw4cPo0iRIsjvn09uwmClQIECWfoceT9vBTeZ4XbC7UXEWxTcSLr279+Pu+66C6VLlzY/QvXr1zepbVcMToYPH45mzZohIiICYWFhaN++PebPn+92v127dpmmsDfffBNjxoxB9erVzXNu2LDBWR+wbds20w7NH2A+14ABAxAdHZ1pW7XdxPb3339j6NChKFmypFmGG264AUeOHHF7LI8S+VrlypVDaGgoOnXqZF4/q+3ffPy7776Lhg0bmh0CX6t79+7O5j37PXKZztdObr9nvv6tt96KokWLol27dmb98Prdu3eneY5hw4YhODgYJ06ccF73zz//mGXg+uJ76tChg1kXWcGd3d13320+X76fxo0bm+YK24IFC8yy7Ny5EzNmzDDneeL7zEhGn8+iRYvw0EMPmXXGz/e+++4z287Jkydx5513mvfPE2tCHA6H23NynbRp0wbFixdHwYIFzbb2ww8/pHltZiD4GiVKlEDhwoVx3XXXmW04vRqFrGzbmfn666/RsmVLs8653FdccQV+//33DO+f1e8JTZo0ydyP7yE8PNxsb9zubPHx8SYbUrNmTfO5cb1w25kzZ066NTf2dsnXWr9+vfNz5OdLGa0fbhv8rnD9VK1a1TRL2gcjx48fx+OPP26WrVChQmY5e/TogTVr1jifg8/fokULc57fZft17e9HejU3DPAfe+wxUxfE161du7b5/FNvE3yeIUOGYNq0aWjQoIHzM5w1a9Z5Pzt7u/7+++/NemQ2i+uazXWnTp1CbGysqSsrVaqUeW9cdl6X+sDspZdecv6O8X0888wzae7H5X755ZdRoUIF528OP4P08LvA17Xfe40aNfDaa68pu3UJ1CwlaRw6dMg0Rdg/Itwp/fbbb+
YH7/Tp086iUp7/5JNPcMstt+Dee+/FmTNn8Omnn6Jbt25YtmxZmlT0xIkTERMTg4EDB5ovcLFixZy33XTTTeZHdNSoUVi1apV5Xv7A8At+Pg8++KDZybDphD/mDKC43JMnT3YLDl5//XX07NnTLB9/iPmfy5MVfO/8YeaPOAts+QP3119/YenSpWjevDkuxv/+9z+zk2JdC38Ir732WrOD5w/vE0884XZfXte1a1fzPu1mBi4Ld4R832wO4vrt3LmzWS7ufDPCQIAFrwwouZ643qdMmWJ2OPyRffjhh1G3bl1TY8PaDP44c6dD3BYuFD8fFiRzZ8L1NX78eBPkLF68GJUqVTLvf+bMmaYmhDsrBjw27tgZqNx2221m58qdP9fbr7/+imuuucZ5Py471xFrgrjt/vnnn263X+i2nRG+BwYDDLhefPFFE3AyyOTnwc8nPVn9njBA4X2uvPJK53a/ceNGE7DyMyG+Nr8j3Ab5GfO5GWDzO3PVVVeleW2+P36Or7zyiikO52OJn29GTUl8Xm4H/J7WqVPHBDsMKHmwwfe7Y8cOE1jwc+C2w3X68ccfm+CaATuDIj4/1w+DOj4PgznieksPt39+zgzC+FlwncyePdt8D/j6qZvTGDD/9NNPeOCBB0xw8t5776F3797Ys2ePCfjOh+uBwTKb7/g9eP/99002id8jHkBwPXNb5Xee75Hvw8Z1zwMBBkT8XvDz5/Pxs5o6darzfnwMg5urr77anPgZcRtJnbHmeuW64/tk4M/vBL8b/M06ePCg+T2Ti+CQfGXixIk8DHIsX748w/vcfffdjrJlyzqOHj3qdv3NN9/siIiIcERHR5vLCQkJjtjYWLf7nDhxwlG6dGnHXXfd5bxu586d5jXDw8Mdhw8fdrv/iBEjzG2u96cbbrjBUbx4cbfrKleu7OjXr1+a99KlSxdHUlKS8/pHH33UERAQ4Dh58qS5HBkZ6QgMDHT06tXL7fleeOEF83jX50zPH3/8Ye730EMPpbnNfl37PXKZUuP1fJ+p3/Mtt9yS5r6tW7d2NGvWzO26ZcuWmft/+eWXztesWbOmo1u3bm7vm59L1apVHVdddVWm72fMmDHm+b7++mvndXFxcea1CxUq5Dh9+rTbOr/mmmsyfb7zfT6pl5Ov4+fn5xg0aJDzOm5LFSpUcHTo0MHtOe1tzXU5GzRo4OjcubPzupUrV5rXeeSRR9zu279//zTrPqvbdnq2bt3q8Pf3N9tmYmKi222u74/vwfV9ZPV78vDDD5vvCO+fkcaNG5/387C3L1dcnvr166e5b+r1c+edd5r3mN7vg/0eY2Ji0rx/bv8hISGOF1980XkdnyOj7wS3E24vtmnTppn7vvzyy27369Onj9lWtm3b5rbMwcHBbtetWbPGXP/+++9nsmYcjvnz55v7cRvitmTjd5Gv06NHD7f7c1t1Xc7Vq1ebx99zzz1u93v88cfN9fytIP7OcRn5WbluG88880ya35yXXnrJERYW5tiyZYvbcz799NPmd2zPnj0Zfl6SMTVLiRt+f3788UeT4eD5o0ePOk880mTqlkcgFBAQYI7kiOlTpquZ0WAmw76PKx5ZZXTkP2jQILfLPNI7duyYOTI9Hx4ZunZ95WMTExOdzTvz5s0zy8WjvNQZhazg+uDzp1dUm7rL7YVI/Z6pb9++WLlyJbZv3+68jhkoZrquv/56c5m9T7Zu3WqatLiO7M+HaX0e9S9cuDDTdDazJMykMEtg41Erm3V4dM+shyfxSNx1PbVq1cpsW7zexm2J2w2zAq54dG3jETW3P36+rtuX3Rxxvs/3Qrbt9DBbwfXKI/LUhdOZbQdZ/Z4wm8XP0LWJKTXeh00b/Pw9jcvG98j1k1420n6P3Bbt98/vGbdBNuGwGSmz9ZcZbpNcT9wGXTEzws+K2TVXXbp0Mc1CtkaNGpnmsdTbT0aYHXStD7O3STZXuuL1e/fuNZ+XvZzEZvDUy0lswqW5c+eaDA
23QddtI73MILOm3KaZlXXdJvkeuX75fZYLp2YpccNaFaak2XTAU0b1GjamZ9966y1s2rTJ1APYmMpNLb3rbEzFurKbX7hD449WZjJ7LNlBDtuxXbFZzL5vZhhoMNXu2ozmCemtD6b6+cPJgIbt+PzB5Y8fm6Ds9WDv2Pr165fhc3NHndF74/pgc1jqHbTdVJFezc+lSP35sO6EWF+Q+nrXmiJi8xNT+wzoXGsaXHcYXF6+l9TrM/XnfaHbdnrbAV+nXr16uFBZ+Z4wOGPTGj9r1oKwCYPNtayrsrGph0FurVq1TBMeb2NTHHful4rrhwcTfN6s1J99+OGHpiaLO2BbVpqE0sPPkN8xNjFlZZtMvU0Rt/fU248ntkm+X36f+N7sbS31tsWDBQae9nLa//k9c8WDu9TfS36f165dm+GBX2bbpGRMwY24sY/4b7/99gx3nvYPKQsrWevAAcHYNs4aGR59sf3ZNfOQ3lF4anxcelIXE3r6sZ6S0ZG76w9/VtYHf+B5FMedHIMbtvuzjsC19sj+jFijkrquycYj6Zwio88nvetdPzPWDrEOgwW73JGWLVvWHG2ztujbb7/16rbtSVn9nvB6BnGsNWGmgie+V2YZ7GJvrgs+5ueffzZFzKzlYT3KuHHjTC1IdmCN1PPPP2+yHCysZdDPHT6zEtlVAHup3/kL2SbTe95LydimxnXGeinW26WHgaxcOAU34oZHDzx64k6ZadHMsMiwWrVqprDP9cuelTFRslPlypXNfxYOuh4pM52elSM9pr+5w2FzQkbZG/tojJkBVxeTBWHTFI/iOW4MMzjsacGmAtflIWZyzvcZZbQ+eKTIH1XX7A2zCvbtOQGbkNgjiOvetdswd/iuuLx8L8wiuB4p8/O+2G07PVzvfB0WzWYUVF7q94TNV/yseeJrcTtgsS6DCTtbwG2QvXh4YjMiAx4WwF5qcMP1w23qv//+O+/7Yc8fFkW74rbP3moXEwDwM2RTDoutXbM3OW2btLc1Zltci7JZVM33by+n/Z/342fvmh1L/ZvD7Yqf48Vsk5Ix1dxImiMX1sZwx5Lej5xrF2v7KMf1qIY9B5YsWYKchHUoHMvjo48+crv+gw8+yNLjuT74HtlTJjX7vXOnwB/21O3jzDhcKL4e1+13331nmqTYi4rdh23sIcUfRHaT5Y9iaqm7wafGnhuRkZFuvclYU8AeI8z4sOdGTsB1wB2ka/aLveFYF+KK9TLprWu+n4vdttPDzAuDQTYNpc5QZJYxyOr3hMG2K76WnUmym+RS34efF4Oe1N2QL4Y9LcP06dPTHcHcXn6+n9Tvl9spe/u4srfZ1AF/RtskP+fU30lmpbgNsKkuJ+ByUuoeTG+//bb5b/fQY6DCLCO3Qdd1lV7PJzY9cltgEJ8a151d7yMXRpmbfIrjeqQ3LgS7nI4ePdp0yWQxHbuussaAWQsWC/LoiueJO10ejXJcGX6peeTM9Djvn95O11c4ngnfF2se2MzBOgV2BWfanwHJ+Y4weZTKugZ2N+WRGB/PnRubTXgbuxQTj5y57vifBZkMdLZs2XLBy8vmCT4vfzB5JMtMTuqdEJsj+IPP8T14BM8aDe5c+Lkx0OIOKrMCbGYD2FTC4mWO08GjcXY55o9v6roHX+E2xXXA9c3iadYejB071uzMmXlyDfYYtHDZufO3u4Lb6971883qtp0evu6zzz5rmmLYdHjjjTeajNLy5ctNc6LdzTq1rH5PuN3w9dmdn93vmfXjzpFZIjtLwMewGz/fMzM4DEL42dnboCeanNjcxQCX2wlfl92RGbyw+zXrSvh+GOBxu2PX7nXr1uGbb75xy1AQA3Den++V2xSDHa739GrNmKniNs/1ywCW4y5xOdj8xuYu1+JhX+JysUmTNVsMPLie2J2fzYYMDPke7CwYxwLiNsH1xaDo33//df7muGJT5S+//GLux+8kP1sWlnO98rPl+kj9GMmCTHpSSR5kd8/N6LR371
5zv0OHDjkGDx7sqFixoiMoKMhRpkwZx5VXXukYP36887nYxfHVV181XSXZDbRp06aOX3/9NU03T7ub9BtvvJFht9UjR46ku5x87Pm6Gqfutmp39+R/G7vXPv/88+Z9FCxY0HQl3rhxo+lu7tolOSN8PJe/Tp06potnyZIlTbdRdkO2sRsxuxqzS3HhwoUdN910k+kSmlFX8NTv2dWECRPMffg8586dS/c+//77r+PGG28074Hrn+uHrzlv3rzzvh9+vgMGDHCUKFHCvJ+GDRum22XXE13BU38+Gb1/PpZdYl19+umnpts73x/XPZ8zva7OUVFRZnstVqyY6c7Obv+bN2829xs9enSa936+bTszn332mdnWuUxFixY13aznzJmTYVfwrH5PfvjhB0fXrl0dpUqVMp9JpUqVHPfdd5/j4MGDzvuwq3TLli0dRYoUMdsx18krr7zi1q35UrqC0+7du02XcG7jXN5q1aqZ9WV3Z2dX8Mcee8x0qecytG3b1rFkyZI075t+/vlnR7169cxQDK7dwlO/dzpz5owZxqFcuXLmc+Hnzu+ca1dqe5m5POfb/tJj/zZMmTLF7foL2Vbj4+MdI0eONMMucDm5HQ0bNsysF1fsLs/72eupY8eOjv/++y/d5eR753PUqFHDfPb8XrZp08bx5ptvun226gqedX78k5UgSCSv4ZEXa2XYG4dHjJK3sDi3adOmpqCXgwCKSP6hmhvJFzgqb2p2+zfT/JI3P1824bHgVkTyF9XcSL7A4lkOpc62bxZhsn6ABbscS6Rt27a+Xjy5RJxag/VDrHlg8bjdlZp1I6nHLhGRvE/NUpIvsGCU40iwqYIDlbHImEWobJLKSWPCyMXhqL7szcZu2izS5SBtLAJnc6MvZr0WEd9ScCMiIiJ5impuREREJE9RcCMiIiJ5Sr5rjObgawcOHDCDSnlyfhARERHxHlbRcGBTDpqZeuJf5PfghoGNek+IiIjkTnv37jWjeGcm3wU39tDyXDkcpl5ERERyPvZ0ZXIiK1PE5Lvgxm6KYmCj4EZERCR3yUpJiQqKRUREJE9RcCMiIiJ5ioIbERERyVPyXc2NiIjkbYmJiYiPj/f1YshFCA4OPm8376xQcCMiInlmHJTIyEicPHnS14siF4mBTdWqVU2QcykU3IiISJ5gBzalSpVCaGioBmrNpYPsHjx40Ex+eymfn4IbERHJE01RdmBTvHhxXy+OXKSSJUuaACchIQFBQUEX+zQqKBYRkdzPrrFhxkZyL7s5isHqpVBwIyIieYaaonI3T31+Cm5EREQkT1FwIyIikodUqVIFY8aM8flz+JKCGxERER81wWR2euGFFy7qeZcvX46BAwciP1NvKQ+JTUjE0bNxYGthuSIFfb04IiKSw7HLs23y5MkYPnw4Nm/e7LyuUKFCbmP4sMg2MDAwSz2O8jtlbjxk3b5TaDv6D9w6YamvF0VERHKBMmXKOE8REREmW2Nf3rRpEwoXLozffvsNzZo1Q0hICBYtWoTt27fj+uuvR+nSpU3w06JFC8ydOzfTJiU/Pz988sknuOGGG0xvspo1a+KXX365oGXds2ePeV2+Znh4OG666SYcOnTIefuaNWvQqVMns8y8ncu8YsUKc9vu3bvRs2dPFC1aFGFhYahfvz5mzpwJb1LmxkOCAqw4MT7R4etFERHJ95jpOBd/ad2JL1bBoACP9fp5+umn8eabb6JatWomONi7dy+uvvpqvPLKKybg+fLLL03gwIwPB77LyMiRI/H666/jjTfewPvvv4/bbrvNBB3FihXL0uB6dmDz559/mjFoBg8ejL59+2LBggXmPny+pk2b4qOPPkJAQABWr17tHKeG942Li8PChQtNcLNhwwa3rJQ3KLjxkMAAa0OOT0zy9aKIiOR7DGzqDZ/tk9fe8GI3hAZ7Zvf64osv4qqrrnJeZjDSuHFj5+WXXnoJU6dONZmYIUOGZPg8/fv3xy233GLOv/rqq3jvvf
ewbNkydO/e/bzLMG/ePKxbtw47d+5ExYoVzXUMqpiBYX0Ps0fM7DzxxBOoU6eOuZ3ZIRtv6927Nxo2bGguM1DzNjVLeUhwcuYmIUmZGxER8YzmzZu7XT579iwef/xx1K1bF0WKFDEZkI0bN5oAIjONGjVynmf2hE1Hhw8fztIy8PkZ1NiBDdWrV8+8Pm+joUOH4p577kGXLl0wevRo03xme+ihh/Dyyy+jbdu2GDFiBNauXQtvU+bGQwLtZqkEZW5ERHyNTUPMoPjqtT2FgYgrBjZz5swxTVU1atRAwYIF0adPH9Psk5mgVFMZsNmMzU2ewp5dt956K2bMmGHqhBjETJo0ydT5MOjp1q2bue3333/HqFGj8NZbb+HBBx+Etyi48ZBA/+RmKQ9uLCIicnG48/ZU01BO8vfff5smJgYNdiZn165dXn3NunXrmlofnuzsDetmOJcXMzi2WrVqmdOjjz5qmsAmTpzoXE4+btCgQeY0bNgwTJgwwavBjZqlPCQ4UAXFIiLiXaxl+emnn0zBLnsoMVviyQxMetjUxHoZFg2vWrXK1Orceeed6NChg2k2O3funKn3YXExi5QZgLEWh0ERPfLII5g9e7ap2eHj58+f77zNWxTceDhzk5jkMFX6IiIinvb222+bXlNt2rQxvaTY3HPZZZd5PQv2888/m9e94oorTLDDomCOzUPsHXXs2DET8DBzw27iPXr0MD20iOPzsMcUAxoWMPM+H374oXeX2ZHP9sSnT5824wmcOnXKFFR57Hlj4tHohd/N+S0v93BmckRExPtiYmJMZqBq1aooUKCArxdHvPA5Xsj+W3tgDwnyT1mV6g4uIiLiOwpuPDzODSWo7kZERMRnFNx4uOaG4pS5ERER8RkFNx4suApKzt4kqDu4iIiIzyi48cb8UglqlhIREfEVBTcepIH8REREfE/BjRcyNyooFhER8R0FN95ollJBsYiIiM8ouPFCd3AFNyIiIr6j4MaDgp2ZGzVLiYhI9ujYsaOZvymzGbubNGmC/ETBjRcyNwnK3IiIyHlwbijOtZSev/76ywwxsnbt2mxfrrxAwY0HBSZPwRCfpMyNiIhk7u6778acOXOwb9++NLdNnDjRzLjdqFEjnyxbbqfgxoOCkifLjE9Q5kZERDJ37bXXomTJkvj888/drj979iymTJligh/Otn3LLbegfPnyCA0NRcOGDfHdd99d0usmJSXhxRdfRIUKFRASEmKarGbNmuW8PS4uDkOGDEHZsmXN5JWVK1fGqFGjzG2ca5vNXJUqVTKPLVeuHB566CHkNIG+XoC8JCh5nBuNUCwi4mMOBxAf7ZvXDgrlsPXnvVtgYCDuvPNOE9w8++yzphmKGNgkJiaaoIaBTrNmzfDUU0+ZmbBnzJiBO+64A9WrV0fLli0vavHeffddvPXWW/j444/RtGlTfPbZZ7juuuuwfv161KxZE++99x5++eUXfP/99yaI2bt3rznRjz/+iHfeeQeTJk1C/fr1ERkZiTVr1iCnUXDjha7gcSooFhHxLQY2r5bzzWs/cwAIDsvSXe+66y688cYb+PPPP01hsN0k1bt3b0RERJjT448/7rz/gw8+iNmzZ5vA42KDmzfffNMESzfffLO5/Nprr2H+/PkYM2YMxo4diz179pggp127dibgYubGxtvKlCmDLl26ICgoyAQ/F7sc3qRmKQ9SQbGIiFyIOnXqoE2bNiZ7Qtu2bTPFxGySImZwXnrpJdMcVaxYMRQqVMgENwwyLsbp06dx4MABtG3b1u16Xt64caM5379/f6xevRq1a9c2TU6///67837/+9//cO7cOVSrVg333nsvpk6dioSEBOQ0ytx4kEYoFhHJIdg0xAyKr177AjCQYUaGWRNmbdjk1KFDB3MbszpsRmJWhQFOWFiY6fbNuhhvueyyy7Bz50789ttvmDt3Lm666SaTqfnhhx9QsWJFbN682VzPYugHHnjAmXliJienUObGg+xZwe
OUuRER8S3Wr7BpyBenLNTbuGLw4O/vj2+//RZffvmlaaqy62/+/vtvXH/99bj99tvRuHFjkzHZsmXLRa+W8PBwUwTM53XFy/Xq1XO7X9++fTFhwgRMnjzZ1NocP37c3FawYEHTjZ21OQsWLMCSJUuwbt065CTK3HhQoDNzo+BGRESyhk1NDCSGDRtmmo3YLGRj7QszJosXL0bRokXx9ttv49ChQ26ByIV64oknMGLECJMhYk8pZovYDPXNN9+Y2/ka7CnFYmMGXSxwZp1NkSJFTPEzm8patWplem99/fXXJthxrcvJCRTceJBGKBYRkYvBpqlPP/0UV199tcms2J577jns2LED3bp1M8HEwIED0atXL5w6deqiX+uhhx4yj3/sscdw+PBhEyixdxQDKSpcuDBef/11bN26FQEBAWjRogVmzpxpAh0GOKNHj8bQoUNNkMOmsunTp6N48eLISfwc7LTuIwsXLjRtdStXrsTBgwdNYRI/tKxgCo1tkg0aNDARZ1YxKmb1OT9Ypt086YkpazBl5T482b02HuhYw6PPLSIiGYuJiTF1IlWrVjVjs0je+xwvZP/t05qbqKgo04bIIqoLcfLkSTM2wJVXXomcOIifCopFRETyabNUjx49zOlCDRo0CLfeeqtJl02bNg05bRA/zQouIiLiO7mutxQLn9j+yGKorIiNjTWpLNeTtwuKVXMjIiLiO7kquGFx09NPP22qszlsdVZwPgx7lEee2Eff2+PcKHMjIiLiO7kmuGFVNpuiRo4ciVq1amX5cexax+Ij+2TPj+HNcW7UFVxExDd82EdGctDnl2u6gp85cwYrVqzAv//+a2YrtWc25YpgFofDQ3fu3DnN4zhrKU/ZwZm5SdKXS0QkO9mj40ZHR5txVyR3skdeZk1tvghu2O0r9QiIH374If744w8zwBG7jeWUuaXiE5S5ERHJTtwZcgwWjttCHBPGHuVXcgcmLI4cOWI+u6yWnuTI4IZTuXOSMBv7tnPMGk4OxplG2aS0f/9+Mxw1Bw/imDauSpUqZfrBp77eV4L8k7uCK3MjIpLtOIou2QGO5D7c13P/f6mBqU+DGzYzderUyXmZIx5Sv379zBDPHNjvYmc+9QXNLSUi4jvcIXLaAB74xsfH+3px5CIEBwebACdXj1DsC94cofjrpbvx3LT/0K1+aXx8R3OPPreIiEh+djq3jFCcV+eW0gjFIiIivqPgxgsFxWqWEhER8R0FN14YoViZGxEREd9RcONBwXZXcGVuREREfEbBjQcFJld4axA/ERER31Fw40FBgXazlDI3IiIivqLgxoOC/NUsJSIi4msKbjxIBcUiIiK+p+DGgzRCsYiIiO8puPHCrODK3IiIiPiOghtvBDdJytyIiIj4ioIbb4xQnKDgRkRExFcU3HhQUPI4Nwka50ZERMRnFNx4UFCguoKLiIj4moIbb4xQnOiAw6HsjYiIiC8ouPGg4OSCYlLTlIiIiG8ouPFCQTGpO7iIiIhvKLjxUnATr+7gIiIiPqHgxgu9pShe3cFFRER8QsGNB/n7+yEgefJM1dyIiIj4hoIbb80vpcyNiIiITyi48TAN5CciIuJbCm68VFScoIH8REREfELBjZcmz4xTcCMiIuITCm68NTO4xrkRERHxCQU3Xioo1vxSIiIivqHgxsMCA1LmlxIREZHsp+DGwwKd49wocyMiIuILCm48LDjQztwouBEREfEFBTdeytyoWUpERMQ3FNx4qbeUMjciIiK+oeDGw9QVXERExLcU3HhphGJlbkRERHxDwY3XmqWUuREREfEFBTdeGsRPXcFFRER8Q8GNt+aWSlBwIyIi4gsKbjws0D+5oDhJzVIiIiK+oODGW81SKigWERHxCQU33mqWUkGxiIiITyi48VJXcGVuREREfEPBjYcFa4RiERERn1Jw47VB/NQsJSIiku+Cm4ULF6Jnz54oV64c/Pz8MG3atEzv/9NPP+Gqq65CyZIlER4ejt
atW2P27NnImb2llLkRERHJd8FNVFQUGjdujLFjx2Y5GGJwM3PmTKxcuRKdOnUywdG///6LnCI40FqlsfEKbkRERHwhED7Uo0cPc8qqMWPGuF1+9dVX8fPPP2P69Olo2rQpcoKw4ADzPzo+0deLIiIiki/5NLi5VElJSThz5gyKFSuW4X1iY2PNyXb69GmvLlNoiLVKo2MTvPo6IiIikgcLit98802cPXsWN910U4b3GTVqFCIiIpynihUrenWZwoKt4CYqTpkbERERX8i1wc23336LkSNH4vvvv0epUqUyvN+wYcNw6tQp52nv3r1eXa7QEKtZ6pyCGxEREZ/Ilc1SkyZNwj333IMpU6agS5cumd43JCTEnLJLaJAV3ETFqVlKRETEF3Jd5ua7777DgAEDzP9rrrkGOU2Ys+ZGmRsREZF8l7lhvcy2bducl3fu3InVq1ebAuFKlSqZJqX9+/fjyy+/dDZF9evXD++++y5atWqFyMhIc33BggVNPU1OEJrcW0qZGxERkXyYuVmxYoXpwm134x46dKg5P3z4cHP54MGD2LNnj/P+48ePR0JCAgYPHoyyZcs6Tw8//DByXOYmLhEOh0YpFhERyVeZm44dO2YaAHz++edulxcsWICczs7cJCY5EJuQhALJNTgiIiKSPXJdzU1OF5rcFdzO3oiIiEj2UnDjYQH+figQZK3WKA3kJyIiku0U3HhBWHL2RpkbERGR7KfgxosD+anHlIiISPZTcOMFYcmZG41SLCIikv0U3HhzrBvV3IiIiGQ7BTdeHutGREREspeCGy8oqPmlREREfEbBjRdofikRERHfUXDjBZpfSkRExHcU3HiBam5ERER8R8GNF6i3lIiIiO8ouPGCMI1QLCIi4jMKbjxl/yrgvabA59emjFCszI2IiEi2S5nCWi6NwwEc3wEkJihzIyIi4kPK3HhKYIj1PyHGWXMTrd5SIiIi2U7BjacEFrD+J8Sot5SIiIgPKbjxlKCU4Ebj3IiIiPiOghtPZ24S4xAW5GfOaoRiERGR7KfgxtPBDce58bcyNsrciIiIZD8FN14IbsICrKAmJj4JiUkOHy6UiIhI/qPgxlMCAgF/q5C4oF+882r1mBIREcleCm68kL0JQRwC/JPrbtRjSkREJFspuPFCcOOXEKv5pURERHxEwY1Xxro5p1GKRUREfETBjVfGuonV/FIiIiI+ouDGG5mbeJfMTbwyNyIiItlJwY1XmqVUcyMiIuIrCm68NL9UeMEgc/bUuZRu4SIiIuJ9Cm68NL9UkeTg5mS0ghsREZHspODGS5mbomHB5qwyNyIiItlLwY1XCopjEJGcuTkRFefbZRIREclnFNx4KXNTJDS5WUqZGxERkWyl4MZLNTdFQ61mqZPRytyIiIhkJwU33srcqKBYRETEJxTceKvmRs1SIiIiPqHgxlu9pVyapRwOh2+XS0REJB9RcOOtcW6SMzfxiQ5NnikiIpKNFNx4KXNTMCgAwQHW6lXTlIiISPZRcOOlmhs/Pz9n9kZj3YiIiGQfBTdeytyQHdxolGIREZHso+DGSzU3VCS5qPiExroRERHJH8HNwoUL0bNnT5QrV84040ybNu28j1mwYAEuu+wyhISEoEaNGvj888+RYzM3GutGREQkfwU3UVFRaNy4McaOHZul++/cuRPXXHMNOnXqhNWrV+ORRx7BPffcg9mzZyNHCAyx/serWUpERMRXAn32ygB69OhhTlk1btw4VK1aFW+99Za5XLduXSxatAjvvPMOunXrBp8LLOiWubHHulFBsYiISPbJVTU3S5YsQZcuXdyuY1DD6zMSGxuL06dPu528nrlJDm40SrGIiEj2y1XBTWRkJEqXLu12HS8zYDl37ly6jxk1ahQiIiKcp4oVK3pvAYPcMzdFCmryTBERkeyWq4KbizFs2DCcOnXKedq7d2+21dwUtTM3KigWERHJHzU3F6pMmTI4dOiQ23W8HB4ejoIFk7MmqbBXFU
/ZWnOTGAs4HGqWEhER8YFclblp3bo15s2b53bdnDlzzPU5gp25seeXUrOUiIhI/gpuzp49a7p082R39eb5PXv2OJuU7rzzTuf9Bw0ahB07duDJJ5/Epk2b8OGHH+L777/Ho48+ihzBrrmxZwYPS2mW0szgIiIi+SC4WbFiBZo2bWpONHToUHN++PDh5vLBgwedgQ6xG/iMGTNMtobj47BL+CeffJIzuoGTfyDgl7xK41MyNwlJDpyNTfDtsomIiOQTPq256dixY6YZjfRGH+Zj/v33X+RIfn5W3U18lMncFAjyN6eY+CSciIpH4QJWJkdERES8J1fV3OQKLmPdcEqJ4mHW5SNnY327XCIiIvmEghsvj3VTorAV3BxVcCMiIpItFNx4eaybkoWsuptjZ9VjSkREJDsouPHy/FJ2s5QyNyIiItlDwY2X55cqUdjO3Ci4ERERyQ4Kbrxdc1PIztyoWUpERCQ7KLjxcs1N8eTgRr2lREREsoeCGy/X3JRwFhQruBEREckOCm68XXOjZikREZFspeAmm2puTp2LR1xCki+XTEREJF9QcOPlmpsiBYMQ4O9nzh+PUvZGREQkRwY3e/fuxb59+5yXly1bhkceeQTjx4/35LLliZobf38/FAuz6m401o2IiEgODW5uvfVWzJ8/35yPjIzEVVddZQKcZ599Fi+++CLytVQ1N65NU+oxJSIikkODm//++w8tW7Y057///ns0aNAAixcvxjfffJPuTN75suYm/pzzqpQeU2qWEhERyZHBTXx8PEJCrGzE3Llzcd1115nzderUwcGDB5GvFSxq/T933HlVSo8pZW5ERERyZHBTv359jBs3Dn/99RfmzJmD7t27m+sPHDiA4sWLI18LK2H9jzqaJnNz9IyCGxERkRwZ3Lz22mv4+OOP0bFjR9xyyy1o3Lixuf6XX35xNlflW2Elrf9nDzuvskcpPqbeUiIiIl4XeDEPYlBz9OhRnD59GkWLJjfDABg4cCBCQ0ORr4WVSidzo2YpERGRHJ25OXfuHGJjY52Bze7duzFmzBhs3rwZpUol79zze7NU7CkgIdatWeqImqVERERyZnBz/fXX48svvzTnT548iVatWuGtt95Cr1698NFHHwH5vaDYPzkhFnXE/CtfxOpBtf/EOTgcDl8unYiISJ53UcHNqlWr0L59e3P+hx9+QOnSpU32hgHPe++9h3zNzy+l7iY5uKlQ1GqqOxObgJPR8b5cOhERkTzvooKb6OhoFC5c2Jz//fffceONN8Lf3x+XX365CXLyvVQ9pgoGB6BUYavuZs/xaF8umYiISJ53UcFNjRo1MG3aNDMNw+zZs9G1a1dz/eHDhxEeHu7pZcy9RcUuPaYqFbOyNwpuREREcmBwM3z4cDz++OOoUqWK6frdunVrZxanadOmnl7G3CdVsxRVKq7gRkREJMd2Be/Tpw/atWtnRiO2x7ihK6+8EjfccIMnly93KpROcJOcudmr4EZERCTnBTdUpkwZc7JnB69QoYIG8Mssc6NmKRERkZzbLJWUlGRm/46IiEDlypXNqUiRInjppZfMbfmeghsREZHclbl59tln8emnn2L06NFo27atuW7RokV44YUXEBMTg1deeQX5mrOgOG1wc+DkOcQnJiEo4KLiShEREfFGcPPFF1/gk08+cc4GTo0aNUL58uXxwAMPKLhxdgVPCW5KFg5BSKA/YhOSzGB+VUqE+W75RERE8rCLSh8cP34cderUSXM9r+Nt+Z5rs1RyM52fn5+apkRERHJqcMMeUh988EGa63kdMzj5nh3cOBKBmJPOqxXciIiI5NBmqddffx3XXHMN5s6d6xzjZsmSJWZQv5kzZ3p6GXOfwGCgQAQQc8rK3oQWM1drrBsREZEcmrnp0KEDtmzZYsa04cSZPHEKhvXr1+Orr77y/FLmkVGKqybX2ew4EuWrpRIREcnzLnqcm3LlyqUpHF6zZo3pRTV+/HhPLFvub5o6ttWtqL
h6yULm/44jZ324YCIiInmb+iNn0+SZrsHN7uPRiEvQeEAiIiLeoODGWwolN0tFpTRLlQ4PQVhwABKTHNhzXE1TIiIi3qDgJhtHKWZ38OqlrOzNtsMKbkRERHxec8Oi4cywsFhSBTcuoxTbTVNr953CdtXdiIiI+D644VxS57v9zjvvvNRlyrOZG6pe0uoxpeBGREQkBwQ3EydO9NJi5J/gpkZys9T2wwpuREREvEE1N14vKE7bLEXbj0TB4XD4YslERETyNAU33u4KHncWiEsZkZijFAf4++FsbAIOn4n13fKJiIjkUT4PbsaOHYsqVaqgQIECaNWqFZYtW5bp/ceMGYPatWujYMGCqFixIh599FHExMQgxwkJBwJC0mRvQgIDnHNMbVPTlIiISN4KbiZPnoyhQ4dixIgRWLVqlZmQs1u3bjh8OGVsGFfffvstnn76aXP/jRs3mtGQ+RzPPPMMchw/P5e6m5SB/FzrbrYcOuOLJRMREcnTfBrcvP3227j33nsxYMAA1KtXD+PGjUNoaCg+++yzdO+/ePFitG3bFrfeeqvJ9nTt2hW33HLLebM9vh+l2L3upm6Zwub/poMKbkRERPJMcBMXF4eVK1eiS5cuKQvj728uc4bx9LRp08Y8xg5mduzYYWYhv/rqqzN8ndjYWJw+fdrt5MtRiqlu2XDzf1NkNi6LiIhIPnHRE2deqqNHjyIxMRGlS5d2u56XN23alO5jmLHh49q1a2d6GiUkJGDQoEGZNkuNGjUKI0eORE7qDl4nObjZfOiMmYqBBcYiIiKSRwqKL8SCBQvw6quv4sMPPzQ1Oj/99BNmzJiBl156KcPHDBs2DKdOnXKe9u7dm30LnEHNDQuKCwYFICY+CbuOaRoGERGRPJG5KVGiBAICAnDo0CG363m5TJky6T7m+eefxx133IF77rnHXG7YsCGioqIwcOBAPPvss6ZZK7WQkBBz8u0UDO7NUszU1CpTGGv2njR1N/bYNyIiIpKLMzfBwcFo1qwZ5s2b57wuKSnJXG7dunW6j4mOjk4TwDBAohw5IF4GzVJuRcWquxEREckbmRtiN/B+/fqhefPmaNmypRnDhpkY9p4izlNVvnx5UzdDPXv2ND2smjZtasbE2bZtm8nm8Ho7yMlRCmUc3NRJDm42qseUiIhI3glu+vbtiyNHjmD48OGIjIxEkyZNMGvWLGeR8Z49e9wyNc899xz8/PzM//3796NkyZImsHnllVeQI2WSubGLipW5ERER8Sw/R45sz/EedgXn7OUsLg4PtwIMrzlzCHirFuDnDzx/FPBPyS6dio5H4xd/N+fXjOiKiIJB3l0WERGRfLL/zlW9pXKd0OJWYONIAs5Eut0UERqEisUKmvPr95/y0QKKiIjkPQpuvCkgEChZ1zp/YFWamxuVL2L+r1VwIyIi4jEKbrytQnPr/77laW5qWCHC/F+3T8GNiIiIpyi48bYKLaz/+1amualReSu4Wbv/ZHYvlYiISJ6l4Ca7ghs2SyUmuN3UIDlzs/f4ORyPivPF0omIiOQ5Cm68rUQtICQciI8GDm9wuym8QBCqlQgz59ep7kZERMQjFNx4G8fpKd8sC3U3apoSERHxBAU32Vp3syLNTQ2T627WqKhYRETEIxTcZIeKLa3/2+YAce6zgDdIDm40UrGIiIhnKLjJDtU6AkWrWNMw/DPO7aZapQs7i4qj49wLjkVEROTCKbjJDgFBQKdnrfN/vwucO+G8qVhYMEoUCjbntx0+66slFBERyTMU3GSXBr2BUvWAmFPA9IeBxHjnTTVLWdmbLYcU3IiIiFwqBTfZhZNm9ngN8A8CNvwMTOkPJCWam2qVLmT+bz10xscLKSIikvspuMlOVa8A+n4NBAQDm34Fts0zV9dIrrvZouBGRETkkim4yW61uwN1e1rnj2wy/2qVsjI3apYSERG5dApufKFYNev/8R1uPab2nzyHqFj1mBIREbkUCm5yQHBT1PSYCjHnt6
rHlIiIyCVRcOMLRata/4/vdF6lomIRERHPUHDjy8zNqb1AQqxb09T6AxqpWERE5FIouPGFQqWAIM4G7gBO7jFXtapazPz/a+sRHy+ciIhI7qbgxhf8/NLU3bSpUQL+fsD2I1GmsFhEREQujoIbXylWxS24iSgYhCYVi5jzf21R9kZERORiKbjxlVSZG7qiVknzf6GapkRERC6aghtfBzdHtwL//Wh6TrWvaQU3i7YeRWKSw7fLJyIikkspuPF1cLNjPvDDXcBPA9G4QgTCCwTidEwCVu9NmTlcREREsk7Bja/HurHtX4HA+LPoVKeUuTh9zUHfLJeIiEgup+DGV8LLAxVaAiVqA4XKAI4kYO8/6NW0vLl5+poDiE9M8vVSioiI5DoKbnzF3x+4+3dg8D9AzS7Wdbv/RvsaJVA8LBjHouKwaNtRXy+liIhIrqPgxtfj3fBUua11efdiBAb449pGZc3Fn//d79vlExERyYUU3OQEldtY//evAuKinU1Tv284hNiERN8um4iISC6j4CYnKFLZqsFJigf2LTeD+ZUsHILouET8u+ekr5dOREQkV1FwkxO4Nk1tnA4/Pz+0qV7cXFysuhsREZELouAmp2hyq/V/xWfAofUpwc32Y75dLhERkVxGwU1OUb0TUPc6wJEIzHgMbapZwc3qvScRFZuQ9v5x0cDMJ4G9y9LeFnUU+OFuYNci5FhJ6uYuIiLeoeAmJ+k+CggKA/YsQcVDc1GhaEEkJDmwbNfxtPdd8x2w7GPg9+fS3vbPx8B/PwBzhiNHOvAv8Go5YO5IXy+JiIjkQQpucpKICkDrwdb5hW+ibXL2Zkl6TVOH/rP+H1wDJKbK7Gz9PaX3VXQ6gZGvLRkLJJwDln8KJMT5emlERCSPUXCT01x+v5W9iVyLXuEbzFV/bDqc9n6H1lv/E2KAI5uALb8Df70FnD4AHFydfCcHsPNP5CjnTgAbfrHOx57KecsnIiK5noKbnCa0GNB8gDnbctv7qB+wH9sOn8XmyDMp93E4gENW4GPs/Qf48R5g3ovAFOuxTtvmnf81488BM58AVn8Hr1s7BUiMTbm84Wfvv6aIiOQrCm5yojYPAsGFEXBkPWYEPYF7A37Fr2sPpNx+cg8Qd8a9mYdZENq71Ppfvpn1f/t8KxiidT+kX4D893vAsvHAr48CMcnP4y3/fmn9r32N9X/TjLTNaiIiIpdAwU1OVLgMcM9cq/cUgIcDf8K8NdvhYBDALIvdJGU7vj3tc1w5HAgIAU7vA45utWpzfrwb+OZ/7nUuJ/cCi96xzrMO5r8fvfe+jm4DItcB/kFAz3eBgsWAc8fNnFoiIiKeouAmpypVB7jpSyQVq4FCfjG47OTviPrseuDNWsD6n6z7VGzl/piOwwA/f6BwWaBKe6Bya+v67fNSuoXHnAR2LQQWvw+8WgH4uL0V1AQWtG7/9+uMl4lBFYOj95sBp/alvf1MJLArk0DFrq+pdDlQqCRQJzl7s26K9X/9NGDeS1YvKgZCkrMxSLazgiIiOYiCm5zMzw/+yfU3zwd+jUL7FwGxp1OCgdpXAwUirPMh4UC7ocC9fwADZgL+AUD1K63btv9hupc7sbZmwWtW0xYLfP0CgJu/AfwDgf0r3et5bJtmAuPaWT2xjm2zmrhccSfHwOfzq4EdGRQJ71xo/a96hfW/8S0pQc22ucCUfsBfbwKL3gZ+fQQ50rmTwE/3Za2WKS87fRB4px4wKXnwSRGRHMTnwc3YsWNRpUoVFChQAK1atcKyZenUhLg4efIkBg8ejLJlyyIkJAS1atXCzJkzkWc1uRVJASEI8YtPe1uZBkDZJtb5ml2BwGCgXFOgWDXruuqdrf/M2ux2CW44Bg4DmxK1gDumAvf9CdS4EqjV3bp9Sn+r95Vr4PLHy4AjycoKUeqmpD1LTQ8vY82k9Afts7NHdnBTqTVQtIq1LFPusq6r0DLl+c+m00sswaUYOTNcnq1zM7499ixwYDUQ61
K7lBXLPwHWTgLmvoB8bd33QNQRYPNvF74Oz2fHAmDs5enXh4mI5PTgZvLkyRg6dChGjBiBVatWoXHjxujWrRsOH05np8ZBeePicNVVV2HXrl344YcfsHnzZkyYMAHly1uzaOdJocXgX/8Gc/a3xBb4IuGqlNtKNwCa32UFCK0fSPvY0vWBQqWB+Ggg+igQEAyEJGd66PIHrACoTEPrcocngQJFgKObgW//B0wdZAUBbE46vB4ICgVu+DgleEhymbGc00bYNv0KxMe4L8uRjdYy8DnKXWZd5+8PNE4+8mdBNJfvf59bxdAMpDYmdxkn7ui+7Qu8XAqY/2rm64xj+3x5PfBNH+DIFvfbuMw/3AWMqgCM72Ddj7VMa7+3MjJxUZk/98bpKU10HCU6v3Jm7hxWPVd6uC7/GZ+1sZaObQeO77DOLxhtbS//jPPc8opIvuLT4Obtt9/GvffeiwEDBqBevXoYN24cQkND8dlnLjtKF7z++PHjmDZtGtq2bWsyPh06dDBBUZ7WYzRw3Qf4o+6LeDehN04GlwHKN7cCl/q9gIfXpPSOSj0hp529IQYVtbpZ51nM2/hm9/uXbQw8vBpoPcSq3eEoyB+1SRkFucltQJV2VoDE5jE7U8PpHjZMs84HF7JuYzMTd272NAt2kxSzNcww2VyXodkAIKI8UK9XSnMVm4F+HgJ8ehWwZZZ1PcfzSR20uFo72Rr/hztevofUO2VTNM1aET+rGe7nB4Cp91kZmdTNba5YfG2PIcRpMrhTXzMZ+O0pIDGdzBrFnAY+aAFMvgMXhNkQBnPsCZfTsEDd/uyJ6zA9i8YAvz1hBcnne68TOgHj2ltBrN2EuvMv1fQwU5nf14FIbgpumIVZuXIlunTpkrIw/v7m8pIlLk0oLn755Re0bt3aNEuVLl0aDRo0wKuvvorERJcMQiqxsbE4ffq02ynXKVgUuOwOXH1ZDRxHOLonjkHCgN+t4OV8XIMbFvK2ug8ILQ5c+TwQVDD91+r2CtB/BhBeHji52+rhZA8wyFoePg/ZxcOc7iExzmoSa9bfum7GUODV8sBP91o/zuyS7tokZSta2XpMyTpA+8es6+pdn9I09V4T4N+vrMtNbweqdgCSEtJOO7F3OfBJF6tJbFXy/e1Ah9kaZgXY02xBctan83PAte+k3IeZIto2xwrWvu4NrPzc/TWYkXLFgI21Qcww2Bkd7oxmDQPeqgPsW2k1sRzdYmWhMgrIuEMf2yrlOYgDHTKYY4E1l9sVM0bMiBx0CTBcnT1iBVXe4uxR55d5cGMHo1tnW+siI2wy5RAEcWeB75LrsCjqsLXu8iuul1fKWnVoIpI7gpujR4+aoIRBiitejoyMTPcxO3bsMM1RfBzrbJ5//nm89dZbePnllzN8nVGjRiEiIsJ5qlixInKrdjVLoEhoECKjkvDPrhNZe1C1TinnGZRUaA48ucNqzspM5TbA4GVA5+eBwuWAlgOB4tWt26q0TQk+uNNiJsUen6dBb+v82UNWdoT1PTMes3ZwVNOlWc3GbuGD/wEKl04JeBgoMeBgwXOJ2sCAWcD1Y4Fr3rYKn/l8c0ZYO37uyNnUtG+5lYFhExq7wbPY+vR+4LNuwHtNgderAyd2AWElrSa5y/oBFZMDNWbBaPsCq2s8M0+/DgV2L7YGOPyoLbDkQ+s+EcnbELMqbPKj9VOBqGPWay39EDhz0BrTx7WQ2+7l5opNYr88aI0yzbGGbPb6Yk82Bkg2BkifdrUyIl/dkDaI4bxdb9UGRleyltnTgyQyeLNrqpjJs6f5SI31Uq7ZHTuoTI/rKNVsuiQOF+Ca8Usve+RaF5YbsIn3v5+yPuUIs47MEP43FXkaA1sG5BfzOP4+iOTEguILkZSUhFKlSmH8+PFo1qwZ+vbti2effdY0Z2Vk2LBhOHXqlPO0d+9e5FZBAf7o0cAq6J2+xmVQv8ywyzV34mzGSp01OZ+QQsAVjwOPbQSufiPl+srtrP8MAFibwx
/g+jdaJwYlzMBcdmdKFmfFp9b/VvdbdUBZ0e1Va6C/GycADyxJ6dZeogbQ/nHr/N9jgPebW720Tu1J2SFSvetSAi0GPRSfXE9zxRNAcJhV89PnM6Dtw8Ddc4DQElZx89KPrPvxfX1+rRV0cC4vvgZ1esb6bw+cSFvnAL89aQUXrB0y62eee+E1Mx52E8P8UcCk24Dfn7UCG7OcK61gh01cdqaLNs+0Aoq36gJjWwCH1qUEAovfc19v7AnH5WZgyWX+/k7gp4GeGyiRo2Cf2GmtK2a/mL05tRfY84+1jPbr2MvPQJC98bitZFQgbAc3dj0YP0dmGDMLbthLi9seewLauN6YRcpJzThsurR7H05/CPhhQPrF6CzMZrOc6xhWdlDL+qOcVt/FzgHc1i8VAxS+bw4vwaEkLiTI/qgd8GFrzxe0S57gs+CmRIkSCAgIwKFDPMJPwctlypRJ9zHsIcXeUXycrW7duibTw2au9LBHVXh4uNspN+vZ2ApuZqw7iLOxWdxhXfcecO88a4fuCazNYREzm6Kij1k7MDbxsJmMJw4geN37QPfXgOI1U4qfu1xADyNmjm75Fmh0k9UU5qrTMKDvN1bPLQ5SaI/KfMdPQKO+VtbGzsywdiisFNDvV+D2H4Fe44AW96Y8F2t8rnrRyhaxxxgxOGC2iu+L51kEfeUIK1jie+PginxeG4uwzQCIyfU6t35vLQN3+naxLXfYbGJhE9/hjcCfo61mLteiWQZfDFw4nQbrluxmH2ZfWHd05oD1uszG9UgONhd/YHXLNsvtSGk6Y5aLQSADCza78XSpGGjYNUDXfwCEl7WaE2lidytrxvdlj61EXGdNkovGF4xK+5zMdtnNnn0+tWq2eP/kASzNTtSu2+KOnzuyU/tTmqvYJGq/98m3AxM6p62z8hU2hbKplPVEzOwxa2P3uLM/Mxuzhcx08fM0j91pNQkTM5iuWTBPYLPnj/cCZ9x/f7P8vr64Dvji+qz3XswIx7Ti++SBwoV8buzQwIMNZki5bi8Vg0sOj+HaSUJytUBfvXBwcLDJvsybNw+9evVyZmZ4eciQIek+hkXE3377rbkf63Noy5YtJujh8+UHraoWR7USYdhxNApfLtmFBzrWyP6FCAgE7l9s7ZRO7LayKgWLpL1fUAEzEKFppmk/1LrsKXWvteqJmNXgFA4MhpiZ4ok79oDkLM79S6wRn9NbvtRqXJUSBLS4C6jVA/jnI6DFPVZGylXJulbzV6l6Vjd8ZpHMcl0HVO9kNd3ZWYUila2AkHU3rOMxmRUA4RWsgIXPzXGKdsy3shtsSiP2kmPGw54Sg8tjAoAwa2fOTBADO2aMuJ4PrLIey4lXG/Sx1jfvO28ksHwC0DS5GSk17qCY3mcAxrnN0qvlYlaE03MwI8SgsXYP6/oKzazMgl2zxKk82Fxlv3cGjHz/3HGZ8ZaWWs2j3G6YEbM/J7Mer7KaTJn5Yl0V3wdHsI5cYx3Vf3czULdnSsE5bZltNTVyG7BrfLhe7IDKU7i+s1Lj5uqvt60DAHt4BRvnVuNYTnY2lEXz+1ZY57f8Zq1r16ZIYpaE1x3ZDPT6MP16uQt5L6wL44HBsa1Wfd2FHPgwMOI2zICETZJ2VvVCcVuws7r2AKJtH3Ffzwzy2DGBw16wCZrfD37XWRvn+jhmii8Ws40MzPk9K1XXyvpmxNQP/gFUbAmEFL7415S83SzFbuDsyv3FF19g48aNuP/++xEVFWV6T9Gdd95pmpVsvJ29pR5++GET1MyYMcMUFLPAOL8I8PfDkM5WQPPJXzsRldXsjafxx5A7qcZ9gSKVMr5f6XrWUb499o5HlyEUaNgH+N9EoKVLNsbeYdojPWclsLF3xMzS8MQdOH9QGSilDmyoWseUgQgb3GidZy2QnZ2qkVIoj8ptgWb9rPP8Mf/3G+v8jeOtnXn/mdZ9iDU6m5N30hzB2X4e1g/1HJOyE+IO4OrXrddk0LT6W2BjctaGQYIdSP
J9MIvEnaNd1Mvsy2tVgDnDrWVhjQ5Pb1SzMg1sTmPAw4wJsyRsEmFQxsEbWYze9aWU91azW8p/jorNHffHHawxcBicsKaJWTG7PofNWnbB9pIPUqb+sJtMA0Os98bPsGaXlEwHgyZ7MEm3OiIH8H0/q/7KNcPE+hYGDambc6Y9YNUs8fas4rr45Epr7B3uYLleWMxtZ5TMYjisnSyDD2LwZmciggu7N7fSiolWgMF54tgsZwe8DDLZlGkHN8wK2gEbM1+s20pd6H4+7OXHrvY2Zg4Z2BC3C2ZwXDMWvC97DbI2iLVTXE5mxdjEyZHDXYdo4LKePgD88YrVK272s1ZwxnXP8xxLKj3MGP1wd0oQz+8cty8GL3bTJpeDA4eOa2u9Povuv7wOmPm4+yCazHSa4v0/s17P5Io9IO0DiPNlOPk9+/pGYNbTyDEY/F1Ik14+4bPMDbFm5siRIxg+fLhpWmrSpAlmzZrlLDLes2ePM0NDLAaePXs2Hn30UTRq1MiMb8NA56mnnkJ+cl3jcnh33lbsPhaNb/7ZjYFXJBf6yqVh1uKuWVbTT6FSmd+XdTc1OgPVOlu1Oxz/h3UodtE1g5LZybU5PLLl5csHA0vHAknx1thCPAK1j1IrJU+lwR03syCcDoOZKQaFPLru9KyVgXLFbBCvZ2aGWRU7qGN2wxZW3Aq+uKP9+x2gTs+U5fr7XZcn43I4gP0rrB9vrgMGRWxuY5BiN8Nxig97VGziUe7QTdaycafJnZE9ZlGXESnd/lm7xWXgzvCd+lZXfXtIAb4us1+pcX1xfbCWh5kcYhBg71yb3A6s/jqlez5rvlhzw2YOBpEL37KWi6N2s36MO8DVyYEln7dqeyso4lE4g0AGy8Qiba5z7jQ4qCQzj3aPMPZoY3MjgzTWlLEYnhgM/DzYylI9uMoK2rjM7N3HJtLJt1lNdK0HWwEsn4fPy50ph0cwH4G/9dlzBHK71oiF/AtfT6kbszNC9vIys8H1yuYZFt6zRyMDrT2LrQE+2XzHJiTiMA/crjnKOHEQTwZhm2cAvz8PdH/VCjpYFM/glNk0BqLstWZj/RDXjc10KlhhZZxsfBzrvfj+2NzDjgnM9toYILJmigFW8RrAtWOsbZKfDYvkWaTPzKyZIDg5CHXtSbjqS2s9cX3x82H28pPkXqEMsG/5LuOsCtcNp57hd529L4kZU9dMIMdl4u3psZtbuTzXvuv+vnggwHqqywelDM3BYIvF9FzOOlfD4/j8n3azPud+01N+R3yNzcc+zmz5ORw5qfrO+9gVnL2mWFycm+tvvl+xF0/+sBYlCgXjryc7o2BwqroU8S1+rVgoyaPRIcuBIhWtI9Kvb7B2XL0/tbJONu5IRyXX+NDVb7pnozLCI24WVNs/uswSDN0AFHDZtrlTt3/8bQyAeFTNI76OTwFtH7Xqp9hcwmCCc5C5Bj32zpDNka6ZsdS4c+AOlztyFrOnvo21Q9xZMnBiIMnmFc5Tll4vOvrkKmBfciEygyy7KJwe32YtL4+6GdiwOZBZhmXJdTi2pndY2UNmKDiysp0pYlbEDpTYW+725JqYL3pazWH2e+ZOy37dCi2sz9TupfO/L6yxphhA2IXRLFhnAMLPklk5NlFyh8mmR+4MmfHh58UdumtXd9aCsfnQxuVjL0Jm1WwMehlwdn0FaDME+Ky7e488DorJQIc7bJP1caRkJa56CWj7EDDxGmD3Imsb406cPQ2p1SDg8Ia0RdwMVjgGFXfS9vtmvRtfJ7BAct2Nw2o2ZZDDgIU1RfY6u/7DlCZRrksGeswYcfkYePKAgBkbjmWVGsfjumEcsPILKwvLjJndJMXPgnVl3/V131b5mTHwN4OcDnAPxvne+PkyMGLWlENf2OvDxh6ZLZKzSqm9XT8l6zXgN+sAxfbLQ8CqL6xaPX5uzLIyEGNvSG67j6yzDjZS4/bAJmVmiFPXF57Pn28A85N7CxetCgxaZAXyrr9D/I1wDcK8jR
k3Fnoz88zfOZcERXbuvxXc5FLxiUno9OYC7DtxDs9fWw93t6vq60WS1LhD4xEMm2VsTNlzh5Zer7HxHa0ffWZsuKPNao0HfxxZl8IdR4maQEmXnaFt2QSreJk7ZmZJbmaTicM6MuYPvCv+JDDo4RE0mxy542HXdDbBsSbhUtcJi4C5w89K7z1mWNgcQvyh5Mz2xCzFkHR6X7G5gpknct3xdnjKCjiYNUsdtHFnxGwMn5PZEH4GXCdcD3aQxx08M1P2Y1iozeCFO04Wt3OHad9mY28/FsVnhMElC6D5PMx0PbYZeLexlc1i8MCdOps/32loFc8yK8TatekPW82DfF0WcrNpkhkefr527ZMrvhcGJSz+vms28Ho16zUfWg0Uq2oN5cBMlGsAxeJ8FupyHTJbx/ViRvFODrgZXHH8HTvYYZE7RxdnVs7OtrCGi+ub21DDm6wglVkeblcMWphhscfMsgdtZLMmM3oc6oBZor5fpQw8SpH/WdlBrmsuV8enrWlW+B75fWFzpx2YEp+r+6iU4IrNW3YWiOuPAfHoytZycngMjrTO4IjbWtlG7uuRQTjfn63doynN0ByRnZMa2z0o2QOTHRXs73Tq+7vi58mmRmb4uKyu2AzI5lsGURxAlU2MzObywMEOIrjO7MCf2aie71vrh1lBTpvDLBxr8Jgt5rbFoMMeld4VM2o8sGHTKter3ZzPJlIeNLBTRe3kKXpsLJLndsf3a0+EzCZKZhuZrWYnDg9ScJMPghv6btkeDPtpHUoWDsFfT3ZCgSBlb3I1dp/mDoQ/gPaYP57GoIXj/FzoEaKv8KiTR75McXcfbR3ds4nGtUnIFYMZ7ry5g2Wmgt3lXZvfKrayggG7iz4zWGxe+Kh18thMyTtEBk7cObO+gztkBimsUbILpa95y6q/YHMVn49NUDzy5jhE3MnwsTx6t5spM8LmIHbnZ0DLed74/Ax6uKO1dy6sZWHTU9eXrewKd2hsGmLmg8EXC6xv+sIaBmDa/VbTEwOHYzus5iFmIT5oaWV8uPNjbRB34MwouvayY5aBgSwLndlLMTXejztiFgKzCJnn2aRFN39r7dxYZ8OdMfV8L3nnmmo6nTKNrKCFPS4zwtdiEJReofOsZ6zRxDl8Q+r1y4wRm90Y4Jgd9SYrS8MejAzu3m2UEgC2vM8K9L/pDURUAu6ZY60nO0BhQGwP+2DXPZksV3JgzICXAQszYww22c2ftUNcbga/DG441IONAUj/6VbAyKCQ743jNdkBI5+XvVrtJi2uA2bm7N6gzucJtbYFbjcspucEyRxj7KvkQntm0E7ttT779HDZOF0Ps172NsaMGAND+7Oys50M6JiB5vrkOnpwpdXUzGXjNDgMYojfgT4TgTrXWgE6g/HU2WkPUHCTT4KbuAQre7P/5DmMvK4++rXJ5MdCJC9g0SizDBxqgMXq6WHPKR71c8fDo3kejbOImrU43MmxAJZjztB9C60mDB79cydHfO70et9wOhDOXF+oDPDIWisLxWCLOxLiMAOcl431Nuz1c9XI878fZvJ4hM0sVkaF+ewuzxoaZoKY4mdR9SSXkZw5zAHrh4hNgtzRujZLEudNY0Bgu+JJoLPLjvdicJ2yaY2ZL04Bw4CZO0NO2cKhFAb9ZfVg4/pgYMFmJJ4YFHiwqSLTjOb0B61gjs21JWtZwSjPczwr9nhkBpEF6/bOnJkSNr/Zo3Cz6Y7bB98bp1phk2fD/yWPWeWSJbMzJ/zcGXDY09EQsyZmGIhMuvOzWZTBNQO/e+dbzUh2ExqzeszksQaJwYjrwIUMENn0ySEtGGyzaczOToZEWJk+BvSsZ2LAz+Zvu+aKWcd2Q62sCwu1+Xr83NiDk4EWA1Z+dnazsGuTHQc0nZ3c2YfNfwykGOC0echqKmYT7ONbLq1XXzoU3OST4Ia+WrILz/+8HuWLFMSCJzqagf5EJJ0MEJsIuXNgTynWfTA1zyNgG6ftYN0Rf+zTaxLkTyV3Et
zZ2Wl9NlVN7GHtgFhIzB9zNkNwHjdv7cC5HBxcknUibEpjhuh8TZgcZJFNWFxOvj8OtMmeaZeChdfcwbHJqVqHlOsZ9JlJel1qP3yFBbcsUnatqWHwymwgcTnZXZ/NJ649HDl5a+pxmeygiAN/smmVvbTYhMfH27VyDyy1al+mDUoZf4e1OcSsmhlw0M/aDlmszcxbpTbWEA8fXm5lgZhNZNaDmZRdf1m1WMymMRBmForZxJUTrYDxjmnWeFM2jgvFDGPFy60i/vQKozmy95znUwYPtQvzGZywqXL+K+5jDjFQYbDPjByDH/aWZDaRQRS/P2xOY/bJOS0Le2neaa1nD1Nwk4+Cm5j4RLQd/QeORcXhnb6NcUPTCr5eJJH8xZ71PKMeNt7A5gxO2MqiYtf54zLDQRCZ1Und6y6vYy0Jx+Y5G2m9fw5LYEaDXpfSw4o9jVwDRO4WfxliZX3smiXbo+utXmXMhLGZkEECm6uYler/a0rWiD0j2SvQDKaZQbDMTBebERlosuCeNUEMluygi82b7OUWUcH9cQyqWS91sZmRpESrBs0uRraDnF5jrffGyX7ZPMesD4cvYHbm/ctSxuAiNkH1/dp6bwwiOQ6V3bGB0+Vc7PhHmVBwk4+CGxo7fxvemL0Z1UuG4bXejdC4YhFlcEREMsK6HQYfxB5bdp2LK+4a7W7hzNRwbBtmSwanqoEhjuvDGprMehJmBWuuOC0L63aYSWHzIXszeoMjeVRv1lsxIzR4uTW9jT0KNQM6ZiDtwIyDtjKjxXo1Ns0OmOke0DMjysEQGXBx2pwLHfQyCxTc5LPg5tS5eJO9sadjaFa5KH6836WLooiIpGCROOulmMVhrU1Ws2XM4oSV8O6ysWmKGR12qb/UpsPzYQDD4RkY3LE+J4dTcJPPghtaufs4Jizcid83RCLJASwZ1hllIzxbzCUikmcw28J6kuwobpZs33/rU80jmlUuhnF3NEP9ctaAVct2uoz1ICIi7ljUq8Amz9Inm8e0rGq1gSq4ERGR/ErBTR4ObjYePI2Xf91ganJERETyC59OnCme16KKFdxsPXwWAyYuR+TpGHP5uWszGPBMREQkj1HmJo8pFhaMWqWtwbPswOaHVfvMeDgiIiL5gYKbPJy9oUIhgTgZHY+Z6w76dJlERESyi4KbPOjqhtZw3P9rVgGDOlQz579YshvRcdY4OCIiInmZxrnJow6fiUGJsBAcjYpFm1F/ICHJgeBAf9x3RTU81rW2rxdPRETkgmicG0GpwgXg7+9n/o/u3QgVihY0s4i//8c2LNl+zNeLJyIi4jUKbvKBPs0q4K8nO+G2VpXM5ZHT1yMhMcnXiyUiIuIVCm7yCT8/PzzetTYiCgZhU+QZfLlkt68XSURExCsU3OQjRcOC8VjXWub8qzM3YvG2o75eJBEREY9TcJPP3HF5ZVzXuJwpML7vq5V47Ps1+Hn1fl8vloiIiMcouMmHzVOv92mE5pWL4kxsAn5ctQ8PT1qN9QdO+XrRREREPELBTT5UICgA39zbCh/f0QwtqhQ1101Zsc/XiyUiIuIRCm7yqZDAAHSrXwYPdKphLk9bvR+xCZqiQUREcj8FN/ncFTVLonR4iJmiYd7Gw75eHBERkUum4CafC/D3Q+/LKpjzHy7Yhv0nz4GDVuezgatFRCQP0fQLgj3HotFtzEKci080UzQE+fuZupxXbmiI7g3K+HrxREREoOkX5IJUKh6Kn4e0NcXFnKIhKi4Rx6Li8MA3KzF2/jbM33QYMfGqxxERkdxBmRtxSkpymNGLQ4L8MW7BdkxZmdKDqkapQpj6QBsULhDk02UUEZH86bQyN3IxONFmvXLhqF6yEF7r3QjPXVMXV9QqiSKhQdh2+Cye+nGtanFERCTHU3AjGQY697Svhi/vaonP+rdAUIAfZq6LxKeLdvp60URERDKl4EbO67JKRfH8tfXM+VG/bcKyncd9vUgiIiIZUnAjWZ6TqleTckhMcm
Dwt6uw70S087bVe0/itVmbsPd4ynUiIiK+ooJiybLouATcMHYxNh86g3IRBfB4t9pm4L8Z6w6a2ysWK4gfB7XBgVMxCC8QiGolC/l6kUVEJB/uvxXcyAWJPBWD2z5Ziu1HopzX+fkBEQWDzCjHHCeH3ckLBgVg+oPtTC8rERGRS6XeUuI1ZSIKYPJ9rdGySjFUKR6K/m2qYPqQdvh5cFuUKBRsAhvigIAPT/rXeVlERCS7KHMjHsOam//2n0LtMoXR+6PFOBEdj8GdquOJbnV8vWgiIpLLKXMjPlGxWCh6NCxram04dQN9sXg3omITfL1oIiKSjyi4Ea/oXr8MqpYIw9nYBPyy5oC5jlM48Py8jYfMaMgiIiLeEOiVZ5V8j4MA3tqyEl6ZuRFfL7WyN+P+3I6jZ+PM7XXLhmPUjQ3RpGIRXy+qiIjkMcrciNf0blbB9J5af+A0Xp6x0QQ2ZSMKoHBIIDYePI0BE5e5jZcjIiLiCQpuxGuKhQWjZ6Ny5jznp2KmZuGTncypYfkIq+D4m1WITdCM4yIikseCm7Fjx6JKlSooUKAAWrVqhWXLlmXpcZMmTYKfnx969erl9WWUi/Pi9fXx1v8aY97QDrilZSUEBfijaFgwPrztMhPwrNl3Cg999y/iE927jDOz8+L0DXj51w34aMF2zN902NTviIiI5Piu4JMnT8add96JcePGmcBmzJgxmDJlCjZv3oxSpUpl+Lhdu3ahXbt2qFatGooVK4Zp06Zl6fXUFTznWLT1KO76fDniEpPQokpRFA8LQeUSoahXNhzP/LQOUXHuGZ3S4SEYd3szNK1U1GfLLCIivpGrRihmQNOiRQt88MEH5nJSUhIqVqyIBx98EE8//XS6j0lMTMQVV1yBu+66C3/99RdOnjyp4CaX+mPTIdz31UrEJ6bdDDlQYNPKRXDgZAyW7zyOyNMxCA7wx7s3NzFdzkVEJP84fQH7b5/2loqLi8PKlSsxbNgw53X+/v7o0qULlixZkuHjXnzxRZPVufvuu01wk5nY2Fhzcl05knN0rlMaP97fBn9tPYrQ4AD8ueUIFmw+gmsalsXbfRsjJDDA3I9NUkMnr8bvGw7h0e9Xo1LxUNQvF+HrxRcRkRzIp8HN0aNHTRamdOnSbtfz8qZNm9J9zKJFi/Dpp59i9erVWXqNUaNGYeTIkR5ZXvGORhWKmBMNaFvVTNDJualYT2UrFBKIj25vhgGfL8fCLUcw6OuVZtoH3ufJH9aAw+a0rlYcfVtURFiIRjgQEcnPckRBcVadOXMGd9xxByZMmIASJUpk6THMCjGFZZ/27t3r9eWUSxMaHOgW2NgC/P3w3s1NzOzje4+fw0OTVuPZqeswe/0hzNlwCC/+ugH9Jy4z81mxBxaDJBERyX98eojLACUgIACHDh1yu56Xy5Qpk+b+27dvN4XEPXv2dF7HGh0KDAw0RcjVq1d3e0xISIg5Sd5QJDQYH9/eHDd+9LfJ4NhBz8ArquHrJbuxfNcJDPh8GTYcOI2ERAee6lHHDCbIQQVFRCR/8GnmJjg4GM2aNcO8efPcghVebt26dZr716lTB+vWrTNNUvbpuuuuQ6dOncx5FiJL3levXDhG39jIefnRLjXxVPc6eKdvE3P5723HzBg6Z2IT8Ny0/3DvlyvM1A/ErM5XS3fjnTlbsGLXcRw+HaMu5iIieYzPixOGDh2Kfv36oXnz5mjZsqXpCh4VFYUBAwaY29lNvHz58qZ2huPgNGjQwO3xRYpYtRqpr5e8rVfT8khIciDy1Dnc37GGua5LvdJ46fr6mLJyH25rVQlRsYl4ffYmzNt0GAO/WonOtUvim3/2YOvhs+b+787bav4HBfjh2avron/bqj59TyIikkeCm759++LIkSMYPnw4IiMj0aRJE8yaNctZZLxnzx7Tg0oktT7NKqS57o7WVczJxjms2EzFJiy7Gat4WDAur1YcS3Ycw8noONMNnfU6tcuEo3X14uY+HCEhvb
ofERHJ+Xw+zk120zg3+c+ynccxdv42hAT6o07ZcNzVtoqp3SFu/o9NWYOfVu1HiULB+PiOZth1NBovTF+PikVDTTZo/f5TiE1Iwpibm5jn+GD+NlxVtzSaVynm67cmIpJvnM5Ng/hlNwU3ktq5uETc+NFiM+VDZq5rbM2T9cuaA2bqiN8fuQKlwgtk01KKiORvpxXcZEzBjaTnVHQ8XpqxAT+s3Ae2Rj3UuSaKhgaZ3lcVi4Vi/MLtZiwdVx1rl8TE/i3UfCUikg1yzQjFIjlFRGgQ3vxfY9xxeWXTtbxBeWv0Y7vIOCExCZ8s2mnO33hZefy69qAZSfnHVfvR+7Ly+HTRTlO7c0PT8igToWyOiIgvKbgRcdG4otX7LrWhXWth/YHTKFQgEK/1boSapQrjtVmbTJfyxKQkvDxjo7nfG7M3mdsqFw9FksOBckUK4tlr6po5sfj4GqUKoUCQNaWEiIh4h5qlRC4Cx83p8MZ8HDodC44PyCYrBjS7j0Wnue8tLSshwB/4euke3Ni0PN66qbGZLHTP8Wh8fU8rlChkDTL5+d87TTf2D269DFVLhKV5Hn5VGUyxuPnpHnXUHCYi+cppNUuJeBezL0M618Tz0/4zgU31kmH47eErcCwq1hQm7z9xDqdjEvDm75vx3bI9zsdNXb3fTPrJCUDpiSlr8Fn/Fmbi0JG/bgAPNV6ZsRGf9GuOyFMxKFwg0DlX1qJtR/HeH9vM+Q61SqJNjaxNQSIikt8ouBG5SH2bV8Rni3Zi17EovHR9AwQH+qNsREFzcs22vPn7FnO+Wokw7DgahTFzrcEDaf7mI2aAQY6WbOdQ5248hJHT1+OLxbsQFhyI21tXxr3tq+HN2Zudj/tiyS634IY9vm6ZsBSB/n745t5WztnURUTyIzVLiVyCI2diTbamTpn0t6WkJAcmLd+L8kULmnF0rnlvkbmeTVF3t6tq6nZs9cuFo1bpwpj67/40z1MgyB8x8UlmnB02S7Ep7K+nOqN8ESuQGv3bJoz7c7s5/9w1dXFP+2ppnoMTiX77zx6T9alZurDH1oGISHZQs5RINilZOMScMsIJO29tVcl5+dpGZU1Pqye61cJNzSuagOdgcvPTjU0rICouATPXHTQBzP0dq6NpxSJ4Z+5W5xg897Svin/3nMTi7cfwzdLdeLJ7HWyKPI1P/trhfA1OK3HjZRVQLMwaqJCiYhMw4PPlZkDDn1cfwPQH23ltnYiI+JoyNyLZXIi8/chZ1C9ndTVPz7p9p3DqXDza1bSanRKTHPh59X5sO3wWQzrXwMItRzHo65UmIFrweEfc8+UKE/B0rVcae0+cM4FQl7ql8UafRnAkN3NN/HuX2yCFc4d2MD23spKZCgsJQGiwjoNExLc0iF8mFNxIbsemrmveX2SCFQYoDHoKhQSagIX1P7dOWGqKnO2mLFt4gUBUKBqKDQdP46Era+LBzjVMRodTUTCA4txbtcoUNk1d/+0/ZZq5mEUqXigE797cBG2qq4BZRHxHwU0mFNxIXvDnliPo99ky5+UXetZzDji4cvcJDPtpLbYcsmY/r1OmMHo2Lofel1XAPzuP4eFJq00Aw/qdvSei8czVdfHPjuOYtT7SFCRzstF1+0+5vR5rfF7u1dCtiW39gVMoWSjkgqegWL33pMlM8bFcNjbdiYicj4KbTCi4kbyAX9vbP/0Hf287hkYVIjD1gbZmZGVbXEISthw6gwpFCzonCSVmapq/PBfn4hPTPKc9Xo99ngFR/zZVTBEyx99h4DP5vsvRqEIRjJq5CZ/9vdNkjDio4TWNymZpuVnzc9PHS5yXh15Vy2SRspKt+nnNfrSoUsxkn0Qk/zmtgmKRvI0D+L3epzEmLNyBu9pWdQtsiN3S7SkkXHHMnG71S2Pa6gMma9Ktfhm898dW0+V8/J3NzP81+06iY61SZjwealKxCGISkjB9zQHc/cUKFAgMQOTpGHPb2dgEDP52Ff7eXslkgBjssJmM4/
80rBCBp7rXcVu2r5fuNv9Z7Hw8Kg4fLdiOm1tWRKnC7tmfE1FxZnJSe6DC71fsxdM/rUO1kmGY/cgVCOKoiCIiGVDmRiSfOXo2Fr+vP4RrG5dFeIEgU+DMoKR0Js1LzPhc98EibD8S5azfYcZm7f5TJkChchEFcGXd0qb4mQMY0jUNy+Kdvk1MsHUyOg4tX51nskq/DGmL4T+vN01UnM/rpV4NzP1jExLxzE//4cdV+9C4QgTu61AdPRqUQY93/8KmyDPmPsOvrYe72llNcCKSf5xWs1TGFNyIXByOmMyeV9VLFjJNYfbIyYu3H8UTU9Zi/8lzzvsyK8SgiZOJclydcbc3w+Tle/DC9A2mpmfmQ+3wz87juHm8NfDgrEeuQKnwENw1cTlW7D7h9rqcjNR17B8GVq/e2NBkiL75Zw8C/PxMsxi71tcuU9g50elv/0Vi6Y5jZpoLBll9W1R0ZoL4s5fV6SsYjH2+eKdpsrvvimqa9kLERxTcZELBjYjnMbMzb9NhLNt5zDRtPXpVLVNfM/CrFabHVq3ShUwz1NGzcW7Fz3d/vtw8rkH5cJQuXMCcZxf313s3MkEOZ1u33dS8AtbtP+3WpT215pWLmuBr+a7jZjRoVwyAmG06dDoGt034x0yCOqhDdVzfpJyzmYtZrUFfrUSLqsVMk9re49Gm2W3tPqvAeuR19dGvTRXvrEQRyZSCm0wouBHJPpxWgoMHnklupiocEoi/nurkLHJmoNFtzEKcjI43l9l8NeW+1mZ2dv40PTP1P+fcXDMeaodAf398uGAb9p04Z2ZaZ+8tzvP106p9Zr4udmm3FQ0NQp9mFcxzfvznDiQkOUzG6WxMglvgw6JrDph4c4tKeHbqOjOitP16j32/xjSH8TmYweH/aQ+0Rb1y4SaAWr//FNrVLJmlMYNs783bah777s1N3QZaFJHMKbjJhIIbkey151g0Fmw5jOJhIWhSqYhzygjbrP8izaCE9Nb/GqN3swrO2+ITk8xEoiwufqRLrUxf58DJc6aL/NEzsSb706d5RVNLZHePv+eL5TiRHESxPoiB0eeLd5lsErWqWgzLXOb4KlU4BIfPxJrX/vXBdhjx83qTWWKA07B8hHlO2+XViuGtm5qkeW+pcTRp1g/xNdjcxnokEckaBTeZUHAjkvP8tu6g+d+jYda6lF8M1gD1n7jMZIm+u/dy05uME45+889uvD57s8nMULPKRd0Clxevr487W1cxzWr3fbUCy3dZt7ETWNNKRbFm70mTFWIQxFGi+TzHouJMsfYT3WqjSokw53MxwJq78bDz8hd3tTQ1SbP+O2gyRo93rZ1uLzfacOA0zsTEo1W14l5bRyI5mYKbTCi4Ecm/WGjMMX4KFwhyu37VnhMY+OVKJDkcZt6tl6ZvMIMaslZo5kPtEZhck8Ofy1V7TmL+psO4umFZ0zzFzNSQ71LqclyxGe6xrrVQr1yEGRjxpV83mK7xV9UtbZ6fzVI3t6iIjxfuME1qnKfs58FtTQ1QweAAk3nisr0zZwv+2nrUPCfv/8J19U1znDdwyo09x6PQrHIxrzy/yMVScJMJBTcikh5mXGISEk3G5eCpc6Y2pn+bqs4eWJlhF/afVu032R3WAhUNC8akZXvS9Pyyg5Pnr62H/41bYqbCsBUMCjCBlz3zO5+nbtnCWJMcNDEoYvDFX2w2q/VvWwV9W1RCREErUONP+eZDZ1AmvIDbwI3p4USrs9dH4ukedUwQc+wsm9+Czfu4+t2/sOtYNN6+qbGZgPVCcBkYhLHnW50y+n0Vz1JwkwkFNyKSHVgv9MlfO7Fo2xHsOhpt6nHa1iiB+zpUM1kXBhIf/LHNZG16NiqHh6+siRs/+tvUALG3uf3LzPN9LqtgRnJmt/ah36/GodOx5raw4AB0a1AGnCGV9UIstGYh9cjrrXGDGLRwpOndx6Lx4vT1JlBrWKGIGWSR2A2f160/cBr1y4WjZqlCZoBHO+s069ErzltHZO
P7eXbqf/hh5T5T87T46c5pMmQil0LBTSYU3IhITmsqs5u9mDHaeTTKTHGx70S0qe9h93aODeQ6szwHSmQ3eXv+MJtrUGQLDQ4wWSnWBbmyJ11NT6VioSaQ4ojQV9UrjYREhwmUQgIDULxQsJkGo2XVYmaso1PR8fj6n91mFGkGUTbXwRbZdMe5yPia1UoWchu1mt3tOdcYa43Ye47vq1qJMBMIMrDKytxjzJhxkteu9UunGe2aQReHJWAR+PkyWpKzKbjJhIIbEckL+NO9aNtRs+NmbU7l4mFoXb246Sr/zdI9JjBgEGE3fXWvXwZbDp/BjiNR6FS7JD7p1wIz1h00gUXTikVMN3g2gd1+eSXc064aer6/CGdirS786WEB9a0tK5lMDXuV2dd1qVvaXFe5eCjmP9YRczYewkPf/Wua2ogB08u9GphZ5lmMfcuEpYiOSzTF1xys0TXgYhMdn5ODQfL5GFQdPh2DRAcwuFN10/TF5e/78RLTZb9EoRC81ruhGW3bnlfthV/Wm15xQQF+JkPGASDTq1fi+py5LhJnY+NRtUQhLNxyxPRue7xbbTWx5RAKbjKh4EZE8jp7BGb+/2PTYZO1YU8uBhjsCcYggV3aXTG7w0CoUfkIky3hiNTzNx82GRfWA7HYmUHG7mNRZsJW1xGpmWnhWEEssmb26PJX55kpONrWKI4l24+Z0Z2ZDWKxsj1pKzNSzFKxZ5mr0uEhJjBjgXbqbJMrBivXNylvApD/9qcd2JHL+9P9bcw4SgyebKx34oSwzPSwWY692WqVLmxqrN6dtzXN8zC4+vruViazxCwbM1qcvNVef9FxCXh8yhrULROOB7MwCSwzXV8s2YUCQf64t71nR7zOaOTtFbuOm+J1Zs1yMwU3mVBwIyJyabiT50zxXyzehStqlTSzu7tmQ16duRHjF+5wK6JmtiYqLhFv/b4ZXy3d7Ww+Y61Pu5olzECLLI7+YVBr1Cxd2LwGAygOAMkM1Lp9p7B2/0mUjSiIf/eccOtSz/qgz+9qYTJGnDctKi7BjIzN7A3rkFhLdPvllTHil/UoG1EA1zUuZ2qdXB9vZ6maVmKT4DmzXGzuYpDF169XNtwsD6+rWiIML13fwCw3J699ZeZG89jRNzbEzS0rmVnsf113EDFxiWYgSbtpjYNNMpNkz7323DV1zYjX/+0/ZeqTuLwX0wuOAeSTP64xo3NP7N/CLcDhc/f8YJHJ7nHS2XJZrKFyxfeTleZBb1NwkwkFNyIi3sWmInZfLx4WbJrKOHaQ6w6XtUW/rYvErmNReLBzTZNlYXahfNGCJng5H+625mw4ZAqhGXhc3bAMapRK6dX297ajuO2Tf5yXOTErp+9o/9p8ZxMacdoPFntzdnsa1qOOmazVxnGFHvhmlbMbfuq6Jk7HwYljI0/HmMvs4cYgau2+k86ecp3rlMKrNzTEPzuP4ZHJq5293Q6cijHLzsEiD56yHs9szpBONcwEtBw6gMFK9ZJhGDN3q8lQsSmPy+wHPxw5G2sGrmQm6efVB5yjc08eeLnbWEjMKjHoo3Y1SuDLu1pmGKgws8Zl71S7FE6ei8fAL1dgc+QZxCYmYUCbKniyex1nvRQDTGbjihcKQXZRcJMJBTciInkfd8yckoMZi6XPXGn+f/zndoz6bZO5vW/zinitTyOTIVqz76Qpms5ogERmbLgzLxYabIq7X5+9Cd8ts6bpIAYonDKEAZdrITeb1ezBIW13XF7ZjFP02PerU3qmFQg0Pd7Sq3Fi8xubA8+H9UacG61Xk3JmElkOCvm/5hVw9xcrzDLYz/Ps1XVxT/uqGPfnDjN4JJsFW1UtjnuvqIoBE5ebQOuRLjVNhurLJbvdXoO1WmNvuwzzNh7Gg9/9awKvGQ+1T5Nt4lxzLHz3dOCj4CYTCm5ERPI+9jZ7dPJqXNekvAko7EzMte8vMsHE9/e1RmiwNT3HheJu8+FJq/HLmgPOjA+blzjWEbNRzO5wVOsTUX
F4Zuo6k4Wxp9zgFCPMnHB07LHzt5lmohsvK2+Kp/l8bNIzvcfKRZjMFGuU2KzG5i42P7FHGcc7YuDADBAfz2wOpze5fuzfJnsUwOdPrm0izqn2v2YV8PzP682yXVGzpJmqJCMMhJgIYjZo3O2XmVqtp35ca5r62tcsYZaLwQ892LkGHuta21lPxOLtiYt3olu9MiZ49CQFN5lQcCMikn9xl8e93qXWkDAzce+XK0xTzpRBrTMd04dFxwdOxphMx/kKiLl8DCo4PACDMTabcSyi1AXg6T2OgRsDD2IAZ09Yy4CKARTrfb5wycZwEEcWg4+cvsFkp1jIzXoi9sKjLnVLmV51tHL3cdPUxwCH2OTIrA+Lsl+8voEJpt6du8VZT8TRvZnV4WjbnqLgJhMKbkREJC+avHwPnvpxnQksmJn6euluMyjkM1fXNcERC4Of/HGtyRC90LO+mTyWmGGavvaAGdOI9Txd3/nTBDEzH27vNuP9vI2HMPCrlSaQ+umBtvhw/jbT9OeqdunCGNK5huk55zqekScouMmEghsREcmLuDtfsPkILqtUFBGhQZlmnTLrlcUmPQY3roGN6wSuCUlJZqBJ1tZM+GuHGW7gbEwC7mlfDX1bVPR4UGNTcJMJBTciIiJ5e//tucYwERERkRxAwY2IiIjkKQpuREREJE9RcCMiIiJ5ioIbERERyVMU3IiIiEieouBGRERE8hQFNyIiIpKn5IjgZuzYsahSpQoKFCiAVq1aYdmyZRned8KECWjfvj2KFi1qTl26dMn0/iIiIpK/+Dy4mTx5MoYOHYoRI0Zg1apVaNy4Mbp164bDhw+ne/8FCxbglltuwfz587FkyRJUrFgRXbt2xf79+7N92UVERCTn8fn0C8zUtGjRAh988IG5nJSUZAKWBx98EE8//fR5H5+YmGgyOHz8nXfeed77a/oFERGR3CfXTL8QFxeHlStXmqYl5wL5+5vLzMpkRXR0NOLj41GsWLF0b4+NjTUrxPUkIiIieZdPg5ujR4+azEvp0qXdruflyMjILD3HU089hXLlyrkFSK5GjRplIj37xKyQiIiI5F0+r7m5FKNHj8akSZMwdepUU4ycnmHDhpkUln3au3dvti+niIiIZJ9A+FCJEiUQEBCAQ4cOuV3Py2XKlMn0sW+++aYJbubOnYtGjRpleL+QkBBzstklRmqeEhERyT3s/XaWSoUdPtayZUvHkCFDnJcTExMd5cuXd4waNSrDx7z22muO8PBwx5IlSy749fbu3cu1opNOOumkk046IfeduB8/H59mbojdwPv164fmzZujZcuWGDNmDKKiojBgwABzO3tAlS9f3tTO0GuvvYbhw4fj22+/NWPj2LU5hQoVMqfzYX0Om6YKFy4MPz8/j0eVrOnh86snlmdp3XqP1q33aN16j9Zt/lu3DocDZ86cMfvx8/F5cNO3b18cOXLEBCwMVJo0aYJZs2Y5i4z37NljelDZPvroI9PLqk+fPm7Pw3FyXnjhhfO+Hp+rQoUK8CZuDDlpg8hLtG69R+vWe7RuvUfrNn+t24iIiCzdz+fBDQ0ZMsScMhq0z9WuXbuyaalEREQkN8rVvaVEREREUlNw40HslcXmMdfeWeIZWrfeo3XrPVq33qN16z0heWDd+nz6BRERERFPUuZGRERE8hQFNyIiIpKnKLgRERGRPEXBjYiIiOQpCm48ZOzYsWbEZE7g2apVKyxbtszXi5TrcBBGjhrteqpTp47z9piYGAwePBjFixc3o1H37t07zbxkYlm4cCF69uxpRvLkepw2bZrb7exHwIEzy5Yti4IFC6JLly7YunWr232OHz+O2267zQziVaRIEdx99904e/Ys8rvzrdv+/fun2Y67d+/udh+t2/RxJPoWLVqYEeRLlSqFXr16YfPmzW73ycrvAAd/veaaaxAaGmqe54knnkBCQgLys1FZWLcdO3ZMs+0OGjQoV65bBTceMHnyZDONBLvOrVq1Co0bN0
a3bt1w+PBhXy9arlO/fn0cPHjQeVq0aJHztkcffRTTp0/HlClT8Oeff+LAgQO48cYbfbq8ORWnMOF2yKA7Pa+//jree+89jBs3Dv/88w/CwsLMNssdh4073/Xr12POnDn49ddfzU594MCByO/Ot26JwYzrdvzdd9+53a51mz5+rxm4LF261Kyb+Ph4dO3a1azzrP4OJCYmmp0vR7JfvHgxvvjiC3z++ecmmM/P/szCuqV7773Xbdvlb0WuXLcXPPOkpDv55+DBg90m/yxXrlymk39KWiNGjHA0btw43dtOnjzpCAoKckyZMsV53caNG80kahczgWp+wnU0depU5+WkpCRHmTJlHG+88Ybb+g0JCXF899135vKGDRvM45YvX+68z2+//ebw8/Nz7N+/P5vfQe5Zt9SvXz/H9ddfn+FjtG6z7vDhw2Zd/fnnn1n+HZg5c6bD39/fERkZ6bzPRx99ZCZbjo2N9cG7yB3rljp06OB4+OGHHRnJTetWmZtLxAh25cqVJq3vOn8VLy9ZssSny5YbsWmE6f5q1aqZo1umQInrmEcaruuZTVaVKlXSer5AO3fuNPO4ua5LztfC5lR7XfI/m0s4oa2N9+e2zUyPZI7TxjBlX7t2bdx///04duyY8zat26w7deqU+V+sWLEs/w7wf8OGDZ3zExKzkpwMktkySX/d2r755huUKFECDRo0wLBhwxAdHe28LTet2xwxt1RudvToUZOqc/2wiZc3bdrks+XKjbhzZYqTOwSmQ0eOHIn27dvjv//+Mzvj4OBgs1NIvZ7tmeEla+z1ld42a9/G/9w5uwoMDDQ/hFrfmWOTFJtJqlatiu3bt+OZZ55Bjx49zI4hICBA6zaLkpKS8Mgjj6Bt27ZmR0tZ+R3g//S2bfs2Qbrrlm699VZUrlzZHGCuXbsWTz31lKnL+emnn3LdulVwIzkGdwC2Ro0amWCHX7Tvv//eFL2K5AY333yz8zyPcrktV69e3WRzrrzySp8uW27C+hAe2LjW3Yl31+1Al7ovbrvscMBtlkE6t+HcRM1Sl4jpOx6Npa7W5+UyZcr4bLnyAh6d1apVC9u2bTPrkk2AJ0+edLuP1vOFs9dXZtss/6cuiGePCPby0fq+MGxi5e8Et2PSuj2/IUOGmELr+fPno0KFCs7rs/I7wP/pbdv2bfndkAzWbXp4gEmu225uWbcKbi4RU6TNmjXDvHnz3FJ+vNy6dWufLltux66xPGLg0QPXcVBQkNt6ZrqUNTlazxeGzSX8IXJdl2wzZ72HvS75nzsQ1jjY/vjjD7Nt2z94kjX79u0zNTfcjknrNmOs0ebOd+rUqWadcFt1lZXfAf5ft26dWwDJ3kHsdl+vXj3kV47zrNv0rF692vx33XZzzbr1dUVzXjBp0iTT0+Tzzz83PSEGDhzoKFKkiFtFuZzfY4895liwYIFj586djr///tvRpUsXR4kSJUxVPw0aNMhRqVIlxx9//OFYsWKFo3Xr1uYkaZ05c8bx77//mhO/5m+//bY5v3v3bnP76NGjzTb6888/O9auXWt691StWtVx7tw553N0797d0bRpU8c///zjWLRokaNmzZqOW265xZHfZbZuedvjjz9ueu5wO547d67jsssuM+suJibG+Rxat+m7//77HREREeZ34ODBg85TdHS08z7n+x1ISEhwNGjQwNG1a1fH6tWrHbNmzXKULFnSMWzYMEd+dv951u22bdscL774olmn3Hb521CtWjXHFVdckSvXrYIbD3n//ffNFy44ONh0DV+6dKmvFynX6du3r6Ns2bJmHZYvX95c5hfOxh3vAw884ChatKgjNDTUccMNN5gvp6Q1f/58s+NNfWI3Zbs7+PPPP+8oXbq0CcyvvPJKx+bNm92e49ixY2aHW6hQIdPVc8CAAWbnnd9ltm65o+APP3/w2WW5cuXKjnvvvTfNgY7WbfrSW688TZw48YJ+B3bt2uXo0aOHo2DBguYAiQdO8fHxjv
wM51m3e/bsMYFMsWLFzG9CjRo1HE888YTj1KlTuXLd+vGPr7NHIiIiIp6imhsRERHJUxTciIiISJ6i4EZERETyFAU3IiIikqcouBEREZE8RcGNiIiI5CkKbkRERCRPUXAjIvmSn58fpk2b5uvFEBEvUHAjItmuf//+JrhIferevbuvF01E8oBAXy+AiORPDGQmTpzodl1ISIjPlkdE8g5lbkTEJxjIcHZy11PRokXNbczifPTRR+jRowcKFiyIatWq4YcffnB7PGcn7ty5s7m9ePHiGDhwoJlJ3tVnn32G+vXrm9fizMacFdnV0aNHccMNNyA0NBQ1a9bEL7/84rztxIkTuO2221CyZEnzGrw9dTAmIjmTghsRyZGef/559O7dG2vWrDFBxs0334yNGzea26KiotCtWzcTDC1fvhxTpkzB3Llz3YIXBkeDBw82QQ8DIQYuNWrUcHuNkSNH4qabbsLatWtx9dVXm9c5fvy48/U3bNiA3377zbwun69EiRLZvBZE5KL4euZOEcl/OIN2QECAIywszO30yiuvmNv50zRo0CC3x7Rq1cpx//33m/Pjx483s0KfPXvWefuMGTMc/v7+zhm4y5Ur53j22WczXAa+xnPPPee8zOfidb/99pu53LNnTzNbt4jkPqq5ERGf6NSpk8mGuCpWrJjzfOvWrd1u4+XVq1eb88ykNG7cGGFhYc7b27Zti6SkJGzevNk0ax04cABXXnllpsvQqFEj53k+V3h4OA4fPmwu33///SZztGrVKnTt2hW9evVCmzZtLvFdi0h2UHAjIj7BYCJ1M5GnsEYmK4KCgtwuMyhigESs99m9ezdmzpyJOXPmmECJzVxvvvmmV5ZZRDxHNTcikiMtXbo0zeW6deua8/zPWhzW3tj+/vtv+Pv7o3bt2ihcuDCqVKmCefPmXdIysJi4X79++PrrrzFmzBiMHz/+kp5PRLKHMjci4hOxsbGIjIx0uy4wMNBZtMsi4ebNm6Ndu3b45ptvsGzZMnz66afmNhb+jhgxwgQeL7zwAo4cOYIHH3wQd9xxB0qXLm3uw+sHDRqEUqVKmSzMmTNnTADE+2XF8OHD0axZM9Pbisv666+/OoMrEcnZFNyIiE/MmjXLdM92xazLpk2bnD2ZJk2ahAceeMDc77vvvkO9evXMbey6PXv2bDz88MNo0aKFucz6mLffftv5XAx8YmJi8M477+Dxxx83QVOfPn2yvHzBwcEYNmwYdu3aZZq52rdvb5ZHRHI+P1YV+3ohRERS175MnTrVFPGKiFwo1dyIiIhInqLgRkRERPIU1dyISI6j1nIRuRTK3IiIiEieouBGRERE8hQFNyIiIpKnKLgRERGRPEXBjYiIiOQpCm5EREQkT1FwIyIiInmKghsRERHJUxTciIiICPKS/wORyAeyBiTjvgAAAABJRU5ErkJggg==", + "text/plain": [ + "
" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "training_log = pd.read_csv(\"training_log.txt\")\n", + "\n", + "plt.figure()\n", + "plt.plot(training_log[\"Train Loss\"])\n", + "plt.plot(training_log[\"Val Loss\"])\n", + "plt.legend([\"Train loss\", \"Val loss\"])\n", + "plt.xlabel(\"Epochs\")\n", + "plt.ylabel(\"Loss\")\n", + "plt.title(\"Learning curve of image classification model\")" + ] + }, + { + "cell_type": "markdown", + "id": "528a4711", + "metadata": {}, + "source": [ + "### Model testing\n", + "\n", + "After having the model already optimised, we can evaluate the model using the ```Trainer``` class by calling the ```predict``` function." + ] + }, + { + "cell_type": "code", + "execution_count": 40, + "id": "93ea20aa", + "metadata": {}, + "outputs": [], + "source": [ + "original_images, true_labels, predicted_labels = trainer.predict(test_dataloader, DEVICE)" + ] + }, + { + "cell_type": "markdown", + "id": "c8081975", + "metadata": {}, + "source": [ + "### Explore results\n", + "\n", + "To evaluate the results, we display the model's accuracy along with the confusion matrix. The confusion matrix is a powerful evaluation tool that helps us understand the model’s performance across multiple classes. It maps the relationship between true and predicted labels, showing the number of instances for each possible prediction-outcome pair." 
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "id": "94597545",
+ "metadata": {},
+ "source": [
+ "#### Compute average accuracy"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 41,
+ "id": "4df4d88f",
+ "metadata": {},
+ "outputs": [
+ {
+ "name": "stdout",
+ "output_type": "stream",
+ "text": [
+ "Accuracy: 0.8303\n"
+ ]
+ }
+ ],
+ "source": [
+ "# Compute average accuracy\n",
+ "num_test_samples = len(original_images)\n",
+ "correct = (true_labels == predicted_labels).sum()\n",
+ "print(\"Accuracy:\", correct/num_test_samples)"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "id": "dce3ac13",
+ "metadata": {},
+ "source": [
+ "#### Compute confusion matrix"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 42,
+ "id": "9be2ac3d",
+ "metadata": {},
+ "outputs": [
+ {
+ "data": {
8zX9vH5QYeMirRpHGT8fzZc8SN5458BfJhy65NKitjq+7cuYvmDX/G0yfPEMfdDbnz5caW3RtVICoGDu+vjmkNazdRLQnFShbF8LFDYWSnTpzG0cPH1M9Z01lfiJy6cEK1GNrstm70P+ttvWuDeVvLePBDBgwzz1++RCX1/4SpY1G3YR3YKrkBmShbwn+ENc2k6RNQv6F/n6lf2rbG27fv0E2dv56pRNPqDStt+vwl5XllS5QzP+/Wpbv6v16Duuj5Zw+sW7NePc+bI5/V723Yuh6FCutvCMIfIaUtMnykjN6iZw4mSZVRiIyD/uDBA8yaNcscWI8fPx6TJk3C9u3bkSRJEiRNmhQnTpxQHT5lFBcpiZH5Ysf+nHls1aqVGtll5syZqpNoz5491e83a9YMo0f7170Gh2R9JHt/6/ENQ9eRBSaig7FqQ4Mjcb/Ax7c1usu/f76Jjr3wM9n2jZD+KwfYTutLSDKZ/Dst2hMnR/s7hgtHB9tsjfkR9vi99vHxQXw3D1WO/LX4jBn0ELRx40bVMVRrNk2TJg2WLl2q7gQqwywGx7Bhw1RTVMWKFdV7dOrUSW1EIiIiIrIPzKAbFDPo9oUZdPvBDLp9YQbdfjCDbh98gplB5yguREREREQ6wgCdiIiIiEhHGKATEREREekIA3QiIiIiIh1hgE5EREREpCMM0ImIiIiIdIQBOhERERGRjjBAJyIiIiLSEQboREREREQ6wgCdiIiIiEhHGKATEREREekIA3QiIiIiIh1hgE5EREREpCMM0ImIiIiIdIQBOhERERGRjjBAJyIiIiLSEQboREREREQ6wgCdiIiIiEhHGKATEREREekIA3QiIiIiIh1hgE5EREREpCMM0ImIiIiIdIQBOhERERGRjjiF9wJQ6HJycFIPe2KCCfbmUs81sEeFZjaCvdnVZBbs0dN3j2CPXCO7h/ciEIUaP/jB3vgFc52ZQSciIiIi0hEG6EREREREOsIAnYiIiIhIRxigExERERHpCAN0IiIiIiIdYYBORERERKQjDNCJiIiIiHSEAToRERERkY4wQCciIiIi0hEG6EREREREOsIAnYiIiIhIRxigExERERHpCAN0IiIiIiIdYYBORERERKQjDNCJiIiIiHSEAToRERERkY4wQCciIiIi0hEG6EREREREOsIAnYiIiIhIRxigExERERHpCAN0IiIiIiIdYYBORERERKQjDNCJiIiIiHSEAToRERERkY4wQCciIiIi0hEG6PSf7N2zFzWq1ETyRCkRPWJMrFm9xvzahw8f8Hv3P5AzS264O8dT8zRv/DPu3b0HI6+3MJlM6Ne7P5J5pYBbTHeUL10Rly9dhpGMGjYasSO7oVunHlbTDx88goqlK8PDxQtecRKjbPEKePPmDWzFh2dvcGPmMZzptAGn26zFhb478PrG80Dnvb3gFE61/BuPtl2xmn5t4iGc674Fp39di7NdN+HmrOP48PwtjLattX29RsWf1OtrV6+DLTm87wia/9QSeVIVRLJYabB57dYg5+3ZvpeaZ+aEOeZpt2/cxm+/9EShjMWRNm5mFMlUEqMGjMX79+9hpG1dvmQlNc3y0eGXTrBlw4eMROG8RZHA1RNJE6ZA7ep1cfHCJat5Htx/gBaNf0Zyr1SIF9sDBXIVwuoVq2Hkc9eAvgORNUM2dc5O6O6lzl1HDh2Brdu7Zx9qVvkJKRKlQoyIsbBm9Vqr12VaYI/RI8YgvDFAD2X3799HmzZtkCxZMkSOHBleXl6oWLEitm3bFqzfnz17NmLHjg29efXqNTJmyohRY0d88drr169x8sQpdOv5G/Yd3oOFSxbg0sVLqFm1Fmzd19ZbjBw+CpPGT8bYCaOxc98ORI8eDZXLV8Xbt7YbpFk6fvQ4Zk2bg/QZ038RnNeoWBPFShTFtn1bsH3fVvzcqjkcHW3jEPPx1XtcGrYXDhEckezXPEjdqyg8aqRHhGgRv5jX+8Q9vLr2DE7OUb54LU
aqOEjcIjvS9CmGJP/LiXePXuH61COG2taaiWMnw8HBAbbo9as3SJshDfqM+POr821aswUnj5xCvARxraZfuXgNfn5+GDC6DzYdWovfB3fHXzMXY3ifUTDatm7UtCEu3DhnfvQZ1Au2bN+efWjRqjm279mCv9evxIePH1GlfFW8evXKPM/PTVvi0sXLWLxiIQ4e349KVSqiYd0mOHXiFIx67kqRMgVGjBmBwycOYsvOzUicOBEqlauCR48ewZa9fvUKGTJlwMgg1vvKrUtWj0nTJqrjWuWqlRDenMJ7AYzs+vXryJ8/vwqwhw0bhowZM6rs8qZNm/DLL7/g33//ha0qXaaUegTG2dkZazf+bTVt5JjhKJSvCG7dvAWvRF4w4npLRnHC2Ino2qMLKlSqoKZNmzUVSRMmV1ftNWvVgC17+fIlWjRqibGTRmHY4JFWr/Xo0hM///IzOnRpb56WMnVK2IqHmy8jkmtUJGqU1TwtcpzogWbZ7yz+B8na5sHV8Ye+eN29RHLzz5HcoiFu6ZS4PvkwTL5+Kvg3wrYWp0/9gwljJmDH/m1InTgdbE2RUoXU42vu332APl36Y/bK6WhW839WrxUuWVA9NImSeuHqpWtYMGMhegz4DbbkW9s6arSoiBc/Hoxi5drlVs8nT5+IZAlT4MTxkyhQML+adujAYYwaNwI5cmZXz+WYPn7sRJw4cQqZs2aG0c5doladn6yeDx4+CHNmzcWZf86iaLEisFWlypRSj6AE3LfXrVmHQkUKIWmypAhvtnPGsEGtW7dWV2KHDx9G9erVkSpVKqRPnx4dO3bEwYMH1TwjR45UgXv06NFVdl1+Rw6YYufOnWjSpAm8vb3V+8ijd+/esEXePj5q+Z1jO8Oorl+7rppGixYranWxkjNXDhw6eBi2rnO7rihVtiSKFLc+WD96+AhHDx+Du3sclCpcBim90qBciYo4sM9/H7cFPqfuI2qi2CrbfbbLRlwYsBNP9tywmsfkZ8LN2SfgXjIFonjEClZW/tnh24iWzNWmgvOvbWuthaxFw58xbPRQQwVuliQ73unnrmjRthlSpQ3eheYL7xdwdrG949vXtrVYumgZknmkRN6s+dHn975q+xuJj7eP+t/VxcU8LXfeXFi+bCWePn2m9oVli5fj3dt3KFioAOyBlGrNnD5Lnb8yZsoAe/HgwUNsXL8JjZo0gB4wgx5Knj59io0bN2LAgAEq+A5IK1uREoCxY8ciadKkuHr1qgrQu3btiokTJyJfvnwYPXo0/vzzT1y4cEHNHyNGjED/3rt379RD4+Pjf9DRAynv+KP7n6hZqyZixfp2YGOrJDgXceNZN4fL84cP/F+zVcuXrMDpE6exff/WQC9MxOD+Q9FvcB9kzJwRi+YvRuUyVXHg+F4kT/k5q6xX7x+/xpPd11UGPG6ZVHhz4xnuLPkHDk4OcM2byJxlh6MD4hT7embl7opzeLLzGvze+yJaUhck/SU3jLKtRY/OvyNX3lwoX6kcjGryqGmIECECGrcK3on6+pUbmDN1Pnr07wojbeuataqrFs/4HvFx9p+z6N2zjyr9mL9kLoxAgu/fOndHnnx5kC7D55agOX/NQuN6TZE4flI4OTkhWrRo+GvpfCRPkQxGtmHdBjSq10RdhMVPEB9rNqxGnDhxYC/+mvcXYsaMgUo6KG8RDNBDyeXLl1XJQ5o0ab46X/v2n0sCkiRJgv79+6Nly5YqQI8UKZK6gpXMc/z48b/6PoMGDUKfPn2gN1LS06BOQ/VZjJlgm/WZ9u72rTuq49jK9csRJcqXddd+fib1f5PmjVC/UT31c+YsmbBrx27Mn7MAvfp/vc5XF0wmRE0cGwmqpFVPoyVyxtu7L/Bk9w0VoEtn0cfbryJVj8LfrLuOWyo53PInwvsnr3F/3UXcnH1cBem2UK/9rW29fs0G7N65B7sP74BR/XPiDGZPmoc1e5YHa5tJKUyTai1QrkoZ1G5sXSZgy9taNG7eyP
xz+gzpVIuJXHhfu3INSZOHfwnAj+rYtjPOnz2HzTs2Wk3v33sAvJ97Y83G1XBzc8Xav9ehUd3G2LR9Q5B9MoxASjsOHN2HJ4+fYNaM2WhQt5HqSxU3rjvswdzZ8/BTnZ+C/D6ENdtqd7UhEpAGx9atW1G8eHEkTJgQMWPGRIMGDfDkyZPvbkbs3r27KoXRHrdu3YJegvObN26pA52Rs+dCa+5/+OCh1XR5Hjee7ZYCnDx+UpWxFM5dFG7R4qrHvt37MGXCVPWzdvBOnTa11e+lTpNKBQG2QDp8RkkQ02pa5Pgx8P6p/yg0ry4/wccX73Cuxxacar1GPT48fYO7y86qaVbvFSMyIseLgZjp4iJx8+x4ceYhXl97BiNs6x3bduLa1WtIHDeZ+XXRsHZjNeKHERzZfwxPHj1BgXTFkNIlvXrcuXkXA3sOQcEMxazmfXDvAeqWb4hsubNi4Ni+sCXf2ta+vr5f/E6OXP412VevXIOt69SuiypnWLd5DRJ6JjRPl3WbMnEaJk4djyLFCqsWwe5/dEPW7FkxdfJ0GJm09idPkRy58uRSnSWl9WDOrM+jFxnZvr37cenCJTRu+vmiNLwxgx5KUqZMqbIvX+sIKp1IK1SogFatWqlSGFdXV+zduxfNmjVTNWDSrBZcMkKMPPRCC84vX76CDVvWwc3NDUaXJGkSFaTv3LFTZZC1UqMjh4+i+f+aw1YVLlYI+4/vtZr2S4tfVSfQ9p3bIUmyJEjgEV81fVu6fOkKSpYuDlsQPbkr3j3w7/uheffgFSK5RVU/u+T2Qow01lmkq2MPwiWPp7kEJlCfLtRNH/xghG0t2cQmLRpbvZ4vWwEMHNYfZcqXgRFUrV0J+YvmtZrWuGpzVKldGTXrV7XKnEtwniFLegydNNBmRiwK7raWEp+A/jl1Rv0fL0E8m06edW7fVXXcX79lrTpuW3rzKTnmEGB7yuchJTH2RNb3/TvbHTr0e8ydORdZs2VVF2R6wQA9lEiwXbp0aUyYMAFt27b9og79+fPnOHbsmPoCjBgxwnxwX7JkidV8UuYSWCYjvElH1iuXr5qfX792A6dOnoarq4uqXatXq74aanHZqqXw9fXD/U/12fK6rJOt+tp6S63mL21bY+jAYUiRIjkSJ0mCfr37IYFHAlSs7D+qiy2Slp106f1LPzTRokdX+7g2vU2HNhjcb7DqUCSPv+YvUtmIuQtnwRa4F0+GS0P34sGGi4id3QOvrz/H07034FnPf8QGpxiR1MOSQwQHRIwVGVHi+/cLkaEX31x/jugpXNXwjDLE4v2//0Uk92iIluxzBzRb39aBdQz19PJEkqSJYStevXyFG1dvmp/fun4b506fV508E3p5wMXNens5RXSCe9w4SJYymTk4r1OuIRIm8lCjtjx9/NQ8r3s8d0NsayljWbp4OUqVKQEXV1dVg96jy+/IVzAfMthwmYeUtSxdtBSLlvvXG2t9h2I5x0LUqFGRKk0qVWve7pf2GDCkv/o81v69Ftu37sDSVYthxHOXq5srhg4ahvIVyqnzt5S4TJk0FXfv3EXV6p8vSm11va9arPeNa9dx+uRpuHw6Z2uJtJXLV2Hg0AHQEwbooUiCcxlmMVeuXOjbty8yZcqEjx8/YsuWLZg0aRIWLVqkMs3jxo1TY6Pv27cPkydPtnoPqUuXHUzGTc+cObPKqn9PZj20HD92AmVLfO4k1q1Ld/V/vQZ10fPPHli3Zr16njdHPqvf27B1PQoV/jw8ma352npPnTkFHTt3wOtXr/Frq7aqhjFv/rxYtXaFbmraQkvrti3x7t1bNdzis6fPkSFTelXbait1qtGSuCBpy5y4t+o8Hqy7iEhxosGjZga45PYM9ns4RooA75P3cH/tv/B754uIzlEQM7074pVNBceIX2YjKXzrzOuW/9yUPaDHYPV/9bpVMGyy/89fs3fHPty4ekM98qUpbPXaVR/bHT7XUsRIkbBz+y5MGjdZHdOkDKRS1Y
ro3L0jbNn0KTPU/2VLWCdNJk2fgPoN6yFixIhYtnopevXsjZ+q1lYXc8mSJ8WUGZNQumzQw/XZ8rlr7MQxuHjhIhbM+0sF5xKwZ8+RDVt2bPriIs4W17tcifLm59269DCv95SZ/vGWjNIjLSs1a+trKGQHU3CLpek/uXfvnipfWbt2rfrZ3d0d2bNnR4cOHVCkSBGMGjVKjZEuGfVChQqhXr16aNiwIZ49e2Ye6UVKYJYuXapq03v16hWsoRblilA6mN57csfwtd8EfPCzj2bIgArPagJ7s6uJbbRKhLSn72z7hin/lWtk28jIh6QIDvZ5Qetoh+ttgv2FoD4+PvBw81T9Bb8WnzFANygG6PaFAbr9YIBuXxig2w8G6PbBJ5gBum31aiEiIiIiMjgG6EREREREOsIAnYiIiIhIRxigExERERHpCAN0IiIiIiIdYYBORERERKQjDNCJiIiIiHSEAToRERERkY4wQCciIiIi0hEG6EREREREOsIAnYiIiIhIRxigExERERHpCAN0IiIiIiIdYYBORERERKQjDNCJiIiIiHSEAToRERERkY4wQCciIiIi0hEG6EREREREOsIAnYiIiIhIRxigExERERHpCAN0IiIiIiIdYYBORERERKQjDNCJiIiIiHSEAToRERERkY44hfcCEIU0Rwf7u+50sMN1FnubzoO9idksB+zRyxlHYY9MMMHe+NnhOgsnOz2O25sIDhGCNR/3BiIiIiIiHWGATkRERESkIwzQiYiIiIh0hAE6EREREZGOMEAnIiIiItIRBuhERERERDrCAJ2IiIiISEcYoBMRERER6QgDdCIiIiIiHWGATkRERESkIwzQiYiIiIh0hAE6EREREZGOMEAnIiIiItIRBuhERERERDrCAJ2IiIiISEcYoBMRERER6QgDdCIiIiIiHWGATkRERESkIwzQiYiIiIh0hAE6EREREZGOMEAnIiIiItIRBuhERERERDrCAJ2IiIiIyJYD9Dlz5mDdunXm5127dkXs2LGRL18+3LhxI6SXj4iIiIjIrnx3gD5w4EBEjRpV/XzgwAFMmDABQ4cORZw4cdChQ4fQWEbSob179qJGlZpIniglokeMiTWr11i9bjKZ0K93fyTzSgG3mO4oX7oiLl+6DKMZNngY8ucpCPfY8ZAoQWLUrFYLFy9chJFMnzID+bLlh6dbIvUoUbAUtmzcYn69Xev2yJwmK+LFSoBkHilQp1pdXPzX9j+D4UNGoHDeIkjgmhBJEyZH7ep1cfHCJfPrT58+Ref2XZA1fXa4x4qHtMnTo0uHrvD29obN8DMBJ58AK64Bf10GVl4HTj+RL3Dg8x98AMy7BJx/Zj3d5z2w4y6w5Aqw6Aqw8RZw/zVsaVsXylsE8V0TIkkg21rMnD4LZUqURwI3T8SI5Iznz5/Dlg0fMhKF8xZFAldPJE2YItB1FocOHkb5UhURL7YHPNy8ULpYWbx58wZGPZ5ZnsOqV6wB50guWLv6c1LSKKZOnoacWXMhrkt89Sicvyg2bdgEo9u7ey+qV66BpF7JEdUpOv4OELvYdIB+69YtpEiRQv28atUqVK9eHT///DMGDRqEPXv2hMYykg69evUaGTNlxKixIwJ9feTwUZg0fjLGThiNnft2IHr0aKhcvirevn0LI9mzey9atvoZu/btwNqNa/DxwwdUKFsJr169glEkTOiB3gN6YdfBHdh5YDsKFSmIOtXr4fzZ8+r1LNmyYOK08Th8+hBWrFuuTmxVy1eDr68vbNm+PfvQolULbN+zFX+vX4UPHz+gSvmq5m17/+593Lt7DwOG9MehEwcwefpEbNm0Fb/8/CtsxtlnwMXnQK64QKXEQDY3/2n/BnKRcfMl8PgtEDXCl69tv+sf1Jf0BMp5AS6R/ae9+QhbsHfPPvz8aVuv+bSt5Xhl+T1+8/oNSpYqjs6/dYQR+O/fzbF9zxb8vX4lPnz8aLV/a8F5tQo1UKxEMezctw0792/H/1
r9DEdHR8MezzQTx06Cg4MDjCphwoToN6Av9h/ei32H9qBI0cIqwXTu7DkY2atXr1TsMnrcKOidg0nOpt8hbty42LRpE7JmzaoeHTt2RIMGDXDlyhVkzpwZL1++hNE1btxYlfoIJycnuLq6IlOmTKhTp456TQ8HLx8fHzg7O+PekzuIFStWqP4tyaAvWvYXKlauqJ7LLiWZ9bYd2qB9x3ZqmmQVJQs5ZcZk1KxVI1SXx9Eh/D7/R48eIVGCJNiyfRMKFCoQZn/3vd97hKXE8ZKi3+C+aNikwRevnTl9BvlzFMSJ88eRLHnSUF0OxzDsRvPo0WMkS5gcG7atR4GC+QOdZ+WylWje+Gc8eH5PHRtCQ8xmOULuzbbfAaI4AfnifZ626y4QwREoEP/ztNcfgQ23gOIe/oF32thAWhf/1976AkuvAqU8gXj+rav44OefSS+REEgQLUQW9eWMowjLbS3Hq42BbOvdu/agXMkKuP3whirvDG0mfNcp+gf37xTYsG2deZ2LFiiBYsWL4I8+vyMs+YXROgd1PDt98h/UqlpbBfCpEqXBgqXzUaFy+VBfjkiOkRCePNw9MXDIADRu2gj2IKpTdCxevgiVPsUuYRmfxXNNoOKir8Vn3312K1myJJo3b64eFy9eRLly5dT0s2fPIkmSJLAXZcqUwb1793D9+nVs2LABRYsWRbt27VChQgV8/Bh6WaMPHz5A765fu44H9x+gaLGi5mlysZAzVw6VkTEyH28f9b+L66fgxWAkK75s8XK8fvUauXLnDDQ7sWDuX0icNDE8vRLCSHw+la64ugS9bb19fBAzVsxQC85DnHtU/1IUKVERT98BD98CHhZBteRw9t4H0sUGYkf+8j0iOwKxIgJXffwDcymbuegNRIkAuAYyvw1ta5evbGujHru0/fvRw0c4evgo3OO6o3ihUkjmmRJlipfD/n0HYOTj2evXr9G8YQsMHzMM8eJbXLgamHwOSxYvVcfv3Hlyhffi0H8N0KXmPG/evCpTuHz5cri5uanpx44dUxlkexE5cmTEjx9fNRNly5YNPXr0wOrVq1WwPnv2bDWP1CjKhYy7u7u6SipWrBhOnTpl9T7yO/L7UaJEQbJkydCnTx+rAF+a2CZNmoRKlSohevToGDBgAPROgnMRN15cq+ny/OED/9eMyM/PD106dkXefHmRPkN6GMnZf87Cw8UT7jHioeOvHbFg6TykSZfG/Pq0ydPV6/LYsnErVq1fiUiRwjcbFNLb9rfO3ZEnXx6ky5Au0HkeP36CoQOHoUmzxrAZGVyAJDGB1TeA+ZeAdTeBNLGBZBZZnTPPpFnKf3pgpAxAMuUS3EvWXGrZpUZdsu2RAymHsZFtnTdfHqQPYlsbTWD797Vr19X/A/sNRuNmDbFyzTJkyZoZFUtXxuVLV2DU41n3zj2QK28ulK/kn3w0sjP/nEEc57hwjuaCtq3bYfGyhUibLm14LxZ98t1pHmnSGz9+/BfTJbC0dxKAS5nPihUrVGBes2ZN1aFWgnbJIE+ZMgXFixdXLQ9SFiM1+w0bNsTYsWNRsGBBVSYk9fyiV69e5vft3bs3Bg8ejNGjRweZmXv37p16WDahUNhq36YDzp49h227tsJoUqZOiT1Hdqv9avXy1WjZrDXWb11rPqn9VKcmihUvivv372PcyPFoXLcJNu/aqC48jaBj206qRnXzjo2Bvi6fS83KNZEmbWr0+LM7bMb1l8C1F/7lLLEjAc/eAUceAdGcgOSxgCdvgX+fA+UT+QfigZEM++FH/hnz0p5ABAfgsg+w4x5Q1sv/vWxIh7adcO7seWwJYlsbUce2nXH+7Dmr/VuCdtG0eRM0aFRf/Zw5a2bs3L4L82bPR58Bn89RRjmeXb1yFbt37sGew7tgD1KlToVDxw7A29sHK5evRIum/8Pm7RsZpOtEsI6cp0+fDvYbSi22PUuTJo36vPbu3YvDhw/j4cOHKtsuhg8frjrWLlu2TAXiclHTrV
s3NGrkX+8lGfR+/fqpoSstA/S6deuiSZMmX/270klXLxdJWrPgwwcPkSDB5zpWeZ4xszH3j/ZtO2L9ug3YumMzPD2NVdohJBuePEUy9XPWbFlw/NgJ1Ql4zMTRappcgMojecrkyJk7JxLHTYq1q9aiRu3Q7W8QFjq164yN6zepeuSEgWzbFy9eoGqF6ogRMwb+WroAESNGhM04/tg/i540pv9z6dz58iNw5ql/gP7wjX+NuYzyopHy4GOPgfPPgWpJgftvgDuvgJ+SAZE+ZczdogD3rvuXvWRwha3o+GlbbwpiWxtRp3ZdPu3f66zWOf6n47hcdFpKnSY1bt+6DSMezyShdu3KNSRyty7XbVCrIfIVyIt1W9fCSPw/h+Tq52zZs+LY0WOYMG4ixk8aF96LRsEN0LNkyaJKLYLqT6q9Jv/b+sgNP0r7HKSURTrMaiVAGhmeSjLlQubZt2+fVdmKfH4y0onUwUWL5l8HmiPHtzuFde/eXXXY1UhmwMvLC+EhSdIkKkjfuWMnMmfJZF6eI4ePovn/msNo27tDu074e9Xf2Lxto1p3eyDZtffv3gf5mcjj3fuw7bga0mQdZBjFNavXYv2WdYFuW9mvq5Svpi7CF69YZHstBh/9s6RWJFGuHeql1CV+gE6e2+74T5cA3vI9AmbYLd/HBrZ1p0/bekMQ29po/Pfvrp/277VfrHPiJImRwCMBLl20HnpRhsstWbokjHg8k9avgB3f82bLj0HDB6JM+TIwOvkcLFviyQYC9GvXLLIn9FXnz59H0qRJVXCeIEEC7Ny584t5tJ7/Mo9kvatVq/bFPJYneqk9/xYJELRMfViQZb9y+ar5+fVrN3Dq5Gm4urrAK5EXfmnbWtXjpkiRHImTJEG/3v3Uwb5i5QowWlnL4oVLsHTFYpVBlRIPIdlk7X4Btq53zz4oWaYEPL288PLFCyxdtAx7d+1VQypeu3odK5auQLGSxRAnjhvu3rmLUUNHI0rUKChVpqTNl7XIui5a/hdixoxh7lsRyzmW2rYSnFcuV1UNvzd99lS88HmhHiKOexxEiGAD9dee0f1rzKNH9C9xkTpyyYyn+BR8Sw15wDpyqUeXoRadI33uaBrJEdh/H8joBjg5AJe8gZcfgITfPnbppazla9tayDR5SBmEOHvmHGLGiAHPRJ6qZNEWy1qWLloa5DpLoqldxzYY2HewGpYuY+aM+GveX2qs9HmL5sKIxzNJLAXWMdTTyxNJkiaGkfzR40+ULlNKna+lFVDOYzJC0Zr1q2FkL1XscsVqUItTJ0/BxdUViRKFT1LzhwL0xImNtWOGlu3bt+Off/5RN2zy9PRUwZrUjAc1uo10Dr1w4YJ5XHlbIk2CZUt87kTTrYt/3W29BnUxdeYUdOzcQfWM/7VVW3g/90be/Hmxau0K28swBuNmD6JUcevsytQZk9Gg0ZdDENoiGX6tZdNWuH/vgTp5p8+YXp3MipUoqsYBP7DvACaNm4znz54jbjx35CuQD1t2bVKjP9gyuaGJKFvCeni1SdMnon7Dejh14pQa5UJkTpvVap4zF0+rDKTuyfjncqOiww/9S1miOgEpYwGZrFv+vkpqz4snBE48Abbc9s+aS/BexMNmRnEJaltP/rSt1TxTZ2JQ/8Hm1+SGPQHnsSWf19k6aTJp+gTz+kii5e3bd+jWpQeePX2GDJkyYPWGlaE+fGp4Hc/siQz00axJC9y/dx/OzrGQIWMGFZwXL1kcRnb86HGULuH/3RW/de6m/pd9ftrMqbDpcdDFvHnzMHnyZJVZl7uJSgAvHRglc1y5cmUYnYx1/uDBA8yaNUuVpMjPGzduVHXgRYoUUXXmMhZ6oUKF1JWp3Gk1VapUuHv3LtatW4eqVauqshUZT16GZfz9999Ro0YN9TtS9nLmzBn0799f/S3JYqxcuRJVqlTR7TjoehOe46CHl7AeB10vwnIcdL0I0XHQbUhYjoOuJ2E1DrqehPU46HoR3uOgk4
2Pgy5D/kmts4x/LsMIajXnUrYhQbq9kIBcSlgkOy5jou/YsUONxiLDJkrTtgTW69evV0G6dPCUAL127dq4ceMG4sXzb0IrXbo01q5di82bNyNnzpzIkycPRo0axRYLIiIiIjv23Rn0dOnSYeDAgSqjGzNmTJXxldFHJOsr2ePHjx+H3tJSsDGDbl+YQbcfzKDbF2bQ7Qcz6PbBJ7Qy6FLWkjWrdb2lkA6KchcqIiIiIiL67747QJc685MnTwZa8pE2LQe3JyIiIiL6Ed99izepP//ll1/UWN1SHSM341m4cKHqIDl9+vQfWhgiIiIiInv33QG63MJexkiVkUfkZjpyl0sPDw+MGTNGdYIkIiIiIqIwDNBFvXr11EMCdBn0PW7cuD+wCERERERE9EMBunj48KG6yY6QIQXd3W37piRERERERDbZSVRuvNOgQQNV1lK4cGH1kJ/r16+vhowhIiIiIqIwDNClBv3QoUPqjphyoyJ5yM12jh49iv/9738/sChERERERPTdJS4SjMst6gsUKGCeJnfEnDZtmrqjJhERERERhWEG3c3NTd2hMiCZ5uLi8gOLQkRERERE3x2gy/CKMhb6/fv3zdPk5y5duuCPP/4I6eUjIiIiIrIrwSpxyZo1qxqpRXPp0iUkSpRIPcTNmzcROXJkPHr0iHXoREREREShHaBXqVLlR/4GERERERGFZIDeq1ev4L4fERERERGFZQ06ERERERHpaJhFX19fjBo1CkuWLFG15+/fv7d6/enTpyG5fEREREREduW7M+h9+vTByJEjUatWLXXnUBnRpVq1anB0dETv3r1DZymJiIiIiOzEdwfoCxYsUDcl6tSpE5ycnFCnTh1Mnz4df/75Jw4ePBg6S0lEREREZCe+O0CXMc8zZsyofo4RI4bKoosKFSpg3bp1Ib+ERERERER25LsDdE9PT9y7d0/9nDx5cmzevFn9fOTIETUWOhERERERhWGAXrVqVWzbtk393KZNG3X30JQpU6Jhw4Zo2rTpDywKERERERF99ygugwcPNv8sHUUTJ06M/fv3qyC9YsWKIb18RERERER25YfHQc+TJ48aySV37twYOHBgyCwVEREREZGdcjCZTKaQeKNTp04hW7Zsapx0Cn8+Pj5wdnbG/Sd3EStWLNgTBwcH2Jv3ftb3I7AX733fwt5EjhAF9si9ezHYoyeDd8He+JrsM46I5BgJ9sbP5Ad7jM8SuCVUg6x8LT7jnUSJiIiIiHSEAToRERERkY4wQCciIiIissVRXKQj6Nc8evQoJJaHiIiIiMiuBTtAP3HixDfnKVSo0I8uDxERERGRXQt2gL5jx47QXRIiIiIiImINOhERERGRnjBAJyIiIiLSEQboREREREQ6wgCdiIiIiEhHGKATEREREdl6gL5nzx7Ur18fefPmxZ07d9S0efPmYe/evSG9fEREREREduW7A/Tly5ejdOnSiBo1qhob/d27d2q6t7c3Bg4cGBrLSERERERkN747QO/fvz8mT56MadOmIWLEiObp+fPnx/Hjx0N6+YiIiIiI7Mp3B+gXLlwI9I6hzs7OeP78eUgtFxERERGRXfruAD1+/Pi4fPnyF9Ol/jxZsmQhtVxERERERHbpuwP0Fi1aoF27djh06BAcHBxw9+5dLFiwAJ07d0arVq1CZymJiIiIiOyE0/f+Qrdu3eDn54fixYvj9evXqtwlcuTIKkBv06ZN6CwlEREREZGd+O4AXbLmPXv2RJcuXVSpy8uXL5EuXTrEiBEjdJaQiIiIiMiOfHeArokUKZIKzImIiIiIKBwD9KJFi6oselC2b9/+o8tERERERGS3vjtAz5Ili9XzDx8+4OTJkzhz5gwaNWoUkstGRERERGR3vjtAHzVqVKDTe/furerRiYiIiIgoDIdZDEr9+vUxc+bMkHo7IiIiIiK7FGIB+oEDBxAlSpSQejsiIiIiIrv03QF6tWrVrB5Vq1ZFnjx50KRJE/zvf/8Lna
Uk3UuTIh2iRYzxxaN9mw4wsqmTpyFn1lyI6xJfPQrnL4pNGzbBSKZPmYF82fLD0y2RepQoWApbNm6xmufwwcOoUKoSEsROqOYpW6wc3rx5A1uWLXVOuEdN8MWja/vu6vUH9x+iddNfkS5JJiR2S4ZieUtizcq1sPVtnTdbfiR0S6QexQuWwmaLbT1r+myUK1FBvRYrkgueP/eGrXk37SzejTjxxePD1lvqddOrD/iw/jreTfoH78acwvt5/8L34nPz75u83+HDphv+7zPmJN5NP4uP++7B5OsHW7N3zz7UrPITUiRKhRgRY2HN6i/333/PX8BPVWvBw80TcZ3jo1Cewrh10/+zskX2ejyzx3NXQL6+vujbqx/SpcwAt5juyJA6EwYPGAKTyQRD1KA7OztbPXd0dETq1KnRt29flCpVCvaqcePGmDNnjvrZyckJrq6uyJQpE+rUqaNek8/JyPYc2AVfixPUubPnUKFMRVSrURVGljBhQvQb0BcpUqZQX/L5cxegZrVaOHh0P9KlN8YwpAkTeqD3gF5IniK5Wse/5i1Ener1sOfwLqRNn1adzKpXqIEOXTtg2Kghav//5/QZm9/nN+/dYLVP/3vuX9QoXwuVq1VUz39t3gbez30wf+kcuMZxxfLFK9C8/v+wZd9GZMqSEUbY1gs/beu9n7b169dvUKJUcfXo/Xtf2KJI9VIBFudj0+M3+LDsCiKkjq2ef9hwA3jni4hVksEhqhN8/32Gj2uvwaFeajjGiwbT03fq9yOW9IJD7Mjwe/wWH7fcBD74walIQtiS169eIUOmDGjQuAHq1qz3xetXr1xFqSKl0LBJQ/T8swdixoqJ8+f+RWQbbi231+OZPZ67Aho5bCSmT5mOqTOnIG26tDh+7ARaNm+FWLFioXWbVtAbB9N3XDrI1ce+ffuQMWNGuLi4hO6S2RgJwh88eIBZs2apz0l+3rhxIwYNGoSCBQvi77//Vl/0sOLj46Mupu4/uat2vrDWpWNXbFi/Ef+cP/XVYTlDQ1j/vYA83D0xcMgANG4adqMavfd7j7CUOF5S9BvcFw2bNEDxAiVRtHgR/N6nJ8Lae9+3Yfa3enb+A5s3bMXhM/vVPpY4TnIMGzsYP9WtaZ4nVcJ0+KN/TzRo8mWwE1IiRwjb4ChRvKTo/2lba/bs2ovyJSvi5sPriB3bOmkTWty7FwuV9/244zZ8r3ojUtN0aru+G3sKTiW8ECGdq3medxNOw6mgByJkihP4exx5AN9TjxG5efoQX74ng3chLEgGfeGyv1CxcgXztEb1GiOiU0RMnzMNYcnX5GuXx7NIjpFgb+cuP1PYtTxVr1wDcePGxaRpE83T6v5UD1GiRMXMudPDND5L4JYQ3t7eX43PvuuSMEKECCpL/vz55+Y++ixy5MiIHz++ujLNli0bevTogdWrV2PDhg2YPXu2mufmzZuoXLmyuvOqbJiffvpJBfOW+vfvr3aimDFjonnz5ujWrdsXw1vq2fv377Hor0Vo2LhBuAfLYUkuzJYsXopXr14hd55cMOo6Llu8HK9fvUau3Dnx6OEjHD18FO5x3VGyUCmk8EyFcsXL48C+AzAS2aeXLVqOuo1qm/fpXHlyYNWyv/Hs6TP4+flh5ZJVePf2LfIXygcjbmsjkrIU33NPESGDm3m7OnhEh9+FZzC9+agyi5JBx0cTHL1iBv1G73zhECUCjET26U3rNyNFqhSoXK4KkngkQ5F8RQMtg7FV9no8s8dzl8iTNzd27tiFSxcvqeenT/2D/fsOoFSZktCj726zyZAhA65evRo6S2NAxYoVQ+bMmbFixQp1wJPg/OnTp9i1axe2bNmiPstatWqZ51+wYAEGDBiAIUOG4NixY0iUKBEmTZr0zb/z7t07dVVm+Qgva1avUbWp9RvWhz04888ZxHGOC+doLmjbuh0WL1uoms+M5Ow/Z+Hh4gn3GPHQ8deOWLB0HtKkS4Pr166r1wf1G4xGzRph+ZplyJ
w1MyqVroIrl67AKNb/vVGVs9Sp//m7On3+VHUfCMmaJ3ROjE5tumL24plIljwpbH1bJ3DxRJwY8dDBYlsbkd9lbxVcR0jvZp4WsUISmHxNeD/xH7wffVKVr0SsnBQOLpEDfQ/Ts3fwPfEoyOy6rZJgVYZOHjl0FEqWKoG/169CxSoVVSnMnt17Ycvs/XhmT+cuS526dkKNn6oja4bscI7qgnw58+OXtq1Ru+7n47qefHfNhWR3O3fujH79+iF79uyIHj261evhUU6hd2nSpMHp06exbds2/PPPP7h27Rq8vLzUa3PnzkX69Olx5MgR5MyZE+PGjUOzZs1Up1vx559/YvPmzd8cY15Kafr06QM9mDNrLkqVKQUPjwSwB6lSp8KhYwfg7e2DlctXokXT/2Hz9o2GOtClTJ0Se47sVhd+q5evRstmrbF+61p10SmaNG+M+o38yzoyZ82EXdt3Yd7s+arW0wgWzPkLxUsXQ3yP+OZpg/oMhc9zHyxfvwSubq7YsGajqkFfs3UV0mVIa9Pbem+Abb1h61pDBum+/zyBY9JYcIgR0TxNOnyqGvQaKYCoEVQQ/2HtdUSslRKO7lGtft/04j3er7gMx1QuhgvQte92+Url8Gv7X9XPmbJkwqEDhzBj6gwULFQAtsrej2f2dO6ytHzpCixeuASz5s1U63j61Gn81uk3JEiQAPUbhl5ZYqhn0KUTqDR/lCtXDqdOnUKlSpXg6empatHlETt2bNalB0GaSaX59Pz58yow14JzkS5dOvXZyWviwoULyJXLuokp4PPAdO/eXdUzaY9bt8Knl/3NGzexfduOMK1hC2+RIkVSHY6yZc+KfgP7ImOmDJgw7nONm3HWMRmyZsuiTlLSsWzS+MmIF98/YE2TNrXV/KnSpMbtW7dhBLdu3MLu7XtQv3Fd87RrV69jxuSZGDNlFAoVLYgMmdKjS89OyJItM2ZOmQUjbeuMn7a10Zh83sN08wUcM37Onpuev4PfycdwKp0IjoljwjFuNDjlSwCHeFHhe/KR9e+//IAPSy/D0SM6nEp9PqYbhVscN9VvKk1a6wuz1PLdvmnb3217Pp7Z27nLUs9uv6NTl46oWasGMmRMj7r16+DXdr9ixNARsOkMumRnW7ZsiR07doTuEhmQBN9JkyYN9fp3eYS3uXPmqfq9suXKwF5JFkZKjoy+ju/fvUfiJImQwCMBLl28bPX65UuXUbJ0CRjBwnmLESduHJQs+3l93rz2H3LN0dG6j4VjBEdzFs5Y+3PYdkIOC75nngDRnOCY7HMnV9OHT9suYN8ZeW458suL9yo4d4gbFU6lExuyr40Eb9lzZMOlC/71uppLly7DK7GxLkjs6Xhmz+euN69ffzEaj56P2cEO0LXBXgoXLhyay2M427dvV2UtHTp0UC0OktmWh5ZFP3funOp0K5l0IUNWSrlLw4YNze8hz22B7OTz5sxH/Qb1wnTEmvD0R48/UbpMKXgl8sKLFy9U89nuXXuwZv1qGEXvnn1QskwJeHp54eWLF1i6aBn27tqLFeuWq8Ckbcc2GNR3kMpCZcycUQ3NJyf1uYv8hx21ZbJPL5y7CLXq/WS1T6dMnQJJkydFp1+7os+gXnBxc8GGvzdi17bdWLBiHoy0rWXElpXrlqvXH9x/oMZ/l+H3xLkzZxEjRkx4JvKEq6vttKCqzp9nnqiRWhwsLrIcXKOooROl7typcEL/YRYvP4fpxgtEqJrsc3C+5DIQK6KaB9KZVPv96J9LZWyBlE5evfy5T9mNa9dx+uRpuLi6qGNau07t0KhuY+QvmB+FihTElk1bsWHtBmzYuh62yp6PZ/Z27gqobPmyGDp4GLwSeaoSl1MnT2H86PFqmFE9+q4oyohZgpAkV57379//YpjFChUqqIBbrtxkiMp69eph9OjR+PjxI1q3bq0uenLkyKHeo02bNmjRooV6ni9fPixevFjVrydL5n9y0DMpbZEbWMjoLfbi0aNHaNakBe
7fuw9n51jIkDGDOsAVL1kcRvHo0WO0bNoK9+89QCznWEifMb06mRUrUVS93rptK7x9+xY9uvTAs6fPVbnHqg0rbL6zpNi1fTdu37qDeo1qW02PGDEiFq6aj36/D0D9Gg3x6uUrFbCPnz4GJcsUt+lt/T+LbS3NwCsttvWMqbMwuP8Q8/xlipVX/0+aPgH1Gn4uAdI7Cbjx4gMcM3wubxEOERzgVC0ZfPfcxYdVV2X8Uji4RIJT2cSI8CnT7nfjhSqFwfN3eD/1rNXvR+6UFbZExoEuV8J/G4puXXqo/+s1qIspMyejUpWKGDNhtCoB6NKhK1KmSokFS+YjX4G8sFX2fDyzt3NXQCPGDEffXv3Rvk1H1QlaWkuatmiK7r93gx4Fexx0CS5lXO1vBekyQok9CnijIqnHl9Fb6tati0aNGpmbVWSYRQnCpcOoTCtTpozqGBovXjzze0kH3LFjx6qDhAzDKEMyHj58GAcOHLCZcdDDkz1eSIb1OOh6EZbjoOtFWI+DrhehNQ663oXVOOh6EtbjoOtFeI+DHh7Cchx0vQjuOOjfFaBL1jfgnUQDkmCUQlbJkiXV+Orz5gW/6ZwBun1hgG4/GKDbFwbo9oMBun3wCWaA/l0lLrVr11Y30KHQ8/r1a0yePBmlS5dWN4ZauHAhtm7dqsZMJyIiIiLjC3aAbo9ZyfAgn/P69evVzYqkxEU6jS5fvhwlShi/BzkRERER/YdRXCh0RY0aVWXMiYiIiMg+BTtA1+s4kURERERERhLsO4kSEREREVHoY4BORERERKQjDNCJiIiIiHSEAToRERERkY4wQCciIiIi0hEG6EREREREOsIAnYiIiIhIRxigExERERHpCAN0IiIiIiIdYYBORERERKQjDNCJiIiIiHSEAToRERERkY4wQCciIiIi0hEG6EREREREOsIAnYiIiIhIRxigExERERHpCAN0IiIiIiIdYYBORERERKQjDNCJiIiIiHSEAToRERERkY44hfcCENGPc4QD7FHUCNHCexEojDwdvBv2KHqFtLA3b9ZdCO9FoDDiYIfnLodgrjMz6EREREREOsIAnYiIiIhIRxigExERERHpCAN0IiIiIiIdYYBORERERKQjDNCJiIiIiHSEAToRERERkY4wQCciIiIi0hEG6EREREREOsIAnYiIiIhIRxigExERERHpCAN0IiIiIiIdYYBORERERKQjDNCJiIiIiHSEAToRERERkY4wQCciIiIi0hEG6EREREREOsIAnYiIiIhIRxigExERERHpCAN0IiIiIiIdYYBORERERKQjDNCJiIiIiHSEAToRERERkY4wQCciIiIi0hEG6BQi+vcdgGgRY1g9smTICnsxeeIUpE6eFrGju6Jg3sI4cvgojGL4kJEonLcoErh6ImnCFKhdvS4uXrj0xXyHDh5G+VIVES+2BzzcvFC6WFm8efMGtmrvnn2oWaUWUiROjRiRnLFm9Vqr1x88eIj/NWulXnd3jo8qFarh8qUrsHX2uN579+xFjSo1kTxRSkSPGBNrVq+xen1A34HImiEb3J3jIaG7F8qXrogjh47A5nz0A84/B3bdA7bcBg49BLzff379wRvg6CNg+11g023Ax+I1ja8JOPfMf56td4ATT4B3vrBle3fvRfXKNZDUKzmiOkXH3wG2vxFNnTwNObPmQlyX+OpROH9RbNqwCfZk+NARKlbp0rEr9IgBOoWYdOnT4uqtK+bH1p1bYA+WLlmG3zp3Q88/uuPAkX3IlDkjKpWrjIcPH8II9u3ZhxatmmP7ni34e/1KfPj4EVXKV8WrV6+sgvNqFWqgWIli2LlvG3bu347/tfoZjo62e4h5/eo1MmTKgJFjhn/xmslkQp0adXHt2nUsXv4X9h3eA69EXqhYtrLV52KL7HG9X716jYyZMmLU2BGBvp4iZQqMGDMCh08cxJadm5E4cSJUKlcFjx49gk05+wx48hbI6Arkiw+4RfYPyN9+CrB9/Y
DYkYFUzkG/x4XnwKO3QGZXIKe7f3B+8glsmey7sv1HjxsFe5EwYUL0G9AX+w/vxb5De1CkaGHUrFYL586egz04euQYZkybiYwZM0CvHExyxKX/rHHjxpgzZ84X0y9duoQUKVIgvPj4+MDZ2Rn3n9xFrFixwiSDLpm2Q8cOILw5ODiE6d+TjHn2nNkxeuxI9dzPzw8pkqRCq19aostvncNkGT76fUBYefToMZIlTIEN29ahQMH8alrRAiVQrHgR/NHnd4QlB4TNtpZM8sKlC1CxcgX1/NLFy8iaIbsK2OTCVNvuybxSone/P9G4aSMYgZ7W28EhbC72JIO+aNlfqFi54lePrwncEmLtpjUoWqxI6C5PBf/P+YdJ5nvbHSCrG+Ae9fP0Aw+AOFGAlBZB+ZuPwO77QN64QKxIn6d/8AN23AUyuQLxo/lPe/kB2PcAyO3uH9yHgDfrLiC8SAZ98fJFqPSV7W9UHu6eGDhkQJgev8IjBH358iXy5SqgLsiGDByCTJkzYdjIoWH29+X4Ed/NA97e3l+Nz2w3vaUjZcqUwb1796weSZMmtZrn/ftAmgoN5srlK0iWKAXSpcqAJg2a4tbNWzA62a4njp9AseJFzdMkayzPDx88DCPy8fZR/7u6uKj/Hz18hKOHj8I9rjuKFyqFZJ4pUaZ4OezfF/4Xa6Hl3bt36v8oUSJbbffIkSPjwL6DMCp7Xe+A3/mZ02epBEjGTPrNvn1BAiGJhRwDXNTK82f+2/WbpORF3sMtyudpMSICUSIAz41/jjMqX19fLFm8VLUk5M6TC0bXoU1HlClb2uq8rUcM0EOAnJzix49v9ShevDh+/fVXtG/fHnHixEHp0qXVvLt27UKuXLnU7yRIkADdunXDx48fze/14sUL1KtXD9GjR1evjxo1CkX+3959wFVV/mEAf5AlspeCAu4tKs7UElHclqaimXtVWrkXmjNNc8/cK8ty5Z5lDlBzo7lyoojgZCg4UO7/875XLlzE8S+Fc895vn1OcAaX88rl3ue85/e+1KghH0fJKlaqiHkL52D9pnWYNnMqwsOvIjCgjmyPmt25c1e+uOXMmdNou1iPjr4JtRG9pQP7BeO9qu+hRKkScpsodxC++3YcOnRuh7UbV6OsXxl8WLexydcmv0zRYkVkacfwb0YiJiZGhrbJE6Yg8nokoqOjoVZabbewdfNW5HTygIudG2ZOm4WNW9fL13aTYZENcLICLsXrS1pEYL+RoA/Wj5Pf7DHEcSLfW6aLDlbZTL4OXYtO/X0Kbo454ZjDGT2698SK1b+geIm3dMdGoVatWIWw42EYNWYklI4B/R0SpS9WVlbYt28f5syZg8jISDRo0AAVK1bEiRMnMHv2bCxcuBCjR482fE2fPn3k8Rs2bMDvv/+OkJAQHDt27I16tsRtk7RLZqpbrw6aNm8qe5Rq1wnE2o1rEBcbhzWrfsvU86B3q0+Pfjh7+gyW/LTQKLQLnbp0RNv2bVDGrwzGTRyLwkUKYdmSn6BGlpaWWL5ymbwA8c6VTw6W3LsnBHXq1TbpuvvX0Wq7heo1quPAkX34c+8f8jWu7aftceuWidWgi9pzQQ4SjQSuPgA8c+hDN2lOkaJFZFnq3v170PXzLuja6XOcPXMWanU94rocELrox0XInj3NXSCFssjqE1CDTZs2wc7OzrBev359+bFw4cIYPz61rmnIkCHw9vbGzJkzZZ10sWLFcOPGDQwcOBDDhg2Tt5dEqF++fLnsgRcWL16M3Llzv/Ycxo4di5EjlXNF6OTkJAdWXb50GWrm5uYKc3PzFwaEinUPj1xQk749+2Pblu3YtnMz8njlMWxPaWex4kWNji9arKh8QVQrv3J+OHAkVNYRPnmSBHd3N9SoVhN+5dU9e5FW2y3uahYsVFAuld6rhNLFy2Lp4qWZNs7krchhAVTKqZ/NRdSkW5sDJ+6Kwus3+3rrbPoSF1GLnrYX/Umy/rHIpIgORPF8FsqV98PRI0
cxa8YPmDl7BtTo2LHj8qK6aiX92ClB3AEXM1eJmdhiE+7J93OlUHeXRyYJCAhAWFiYYZk+fbrcXr58eaPjzp49iypVqhgNYqxWrZocsHD9+nVcvnwZSUlJsgQmhahzLFrUOPhkJDg4WL5hpiwREVlb/y3adOXyFdWF1Ixe4ERg2fXnbqMeZbEu3sTVQAziEeFcDALetH0D8uXPZ7Q/b7688MztiQvnjadevHjhoiyHUDvxOypCquhVPnb0OBp92ABaoNV2p/09f/LYROuuRbmLCNQiaN95BOR8w95EMWBUvH2JmWBSJCTpS2ZE+QyZ/HM6ZZyJGgXUrIHDxw/iryP7DUu58uXwSauW8nMlhXOBPehvqWcloxlbxPbMImraxZJVggcMRoNG9eHj44OoG1FyVhdz82wI+iQIatej99fo2vEzlC/vhwoVK2Dm9Flyqrp2HdpCLWUtq35dhV/XLIe9vR1uPq+td3B0gI2Njbzg7Nnna3w3apycqsy3jC+WL1su50pf9uuPMFXiIvPyxdQ7QFfDr+Jk2Ek4uzjLC4/fVq+Fm7sbvL29cPrUGQzoOwiNPmqIWrX1d79MlRbbLdp8KU2bw69cxYmwk3BxcYaLqwvGj52Aho0awMPTA3fv3MXc2fNwI/IGPm72MUyKCOOiB9zWAkh8CpyPE9PWAHlsU3vCHz1NrSdPeD4+SoR5sYhecy9b4J84/eci6J+L1YfztzSDS9b9/FPHy4RfCceJsBNwdnGBj0o7GYYOHiZLU8XvtBgrtuKXlbJcbeOW9VAre3t7lCxV0mibrW0O+TuefrsSMKBnouLFi2PNmjWyRzKlF13Um4snjZeXF5ydnWWN5+HDh2XQFURv+Pnz51G9enUomaivb9+mI+7dvSffvKtWq4Ldobvg7u4OtQtq0Rx3bt/BqBGjZXgVUzat37wOuXKp4+7Bgrn6evP6gfqp9lLMXjALbdq1lp9/2aM7Hj16jEH9ByPmXoycR3v91rUoUNB4NiNTInqFG9RObbNom9C67aeYu3C2HAQcPGAIbt28JYNbq9afYNAQZf7Bi/+HFtst2lw/MPUOwKD+wYY2T/9hGs7/cx4/L1suw7l4My9foRx+37XdMNWkyRClLSKUix5vEbBz2einV0yZ2eX2Q+BUTOrxJ+/pPxa0Bwo9n4axqBOAWP3c53JGF2ughH5GJ1N17Mgx1A3Ul6YK4u9aCOL1bf6ieVAjMYd/545dER0VDUdHB5TyLSXDuSlfaKsN50F/C/Ogx8bGYt26dUbbxcwrZcuWxdSpU41CbJEiRdCxY0c5w8s///yDLl264Msvv8SIESPkMV27dsXOnTvl4FExE8jw4cOxY8cOdO7cWc7ootR50JUks+dBV4LMnAddSTJrHnTKepk1D7rSvLV50E1IVs6DTplLixE0nvOgK/Mvd23ZsgWHDh1CmTJl8MUXX8jg/c03qX/cZfLkybJOvVGjRggMDJQ16qLn3RRGHBMRERHRf8cedIUTM7uIYD9p0iQZ5t8Ue9C1hT3opHbsQdcO9qBrhxYjaPwb9qCzBl1hjh8/jnPnzsmZXMQPb9SoUXJ748aNs/rUiIiIiCgTMKAr0MSJE2V9upjCT0zVKP5YkUn9xToiIiIi+tcY0BXGz88PR48ezerTICIiIqIsos2iPiIiIiIihWJAJyIiIiJSEAZ0IiIiIiIFYUAnIiIiIlIQBnQiIiIiIgVhQCciIiIiUhAGdCIiIiIiBWFAJyIiIiJSEAZ0IiIiIiIFYUAnIiIiIlIQBnQiIiIiIgVhQCciIiIiUhAGdCIiIiIiBWFAJyIiIiJSEAZ0IiIiIiIFYUAnIiIiIlIQBnQiIiIiIgVhQCciIiIiUhAGdCIiIiIiBWFAJyIiIiJSEAZ0IiIiIiIFscjqE6B3S/f8Py0xg1lWn0Km02KbJTPttftpchI0SVsvYwaJm85Ba2w6loUWPVwcBq3RWj75f9rMHnQiIiIiIgVhQCciIiIiUh
AGdCIiIiIiBWFAJyIiIiJSEAZ0IiIiIiIFYUAnIiIiIlIQBnQiIiIiIgVhQCciIiIiUhAGdCIiIiIiBWFAJyIiIiJSEAZ0IiIiIiIFYUAnIiIiIlIQBnQiIiIiIgVhQCciIiIiUhAGdCIiIiIiBWFAJyIiIiJSEAZ0IiIiIiIFYUAnIiIiIlIQBnQiIiIiIgVhQCciIiIiUhAGdCIiIiIiBWFAJyIiIiJSEAZ0IiIiIiIFYUAnIiIiIlIQBnT6V0JDQtG8SRAK+hSGraU9Nq7faNiXlJSEb4KHomLZynB3zCWP6dLhM0TdiILahO4NRbPGzZHfuyBsLGyxIc2/g1qEhuxDUJOWKJS3KOysHLFx/Saj/Tdv3sLnnbvJ/e6OHmjSqCkuXrgEdbS7BQr5FIGdpcML7X7w4AH69OiLIvmKwc0+J8qXrogFcxdCTaZMmAona1cM6jvYsK1X9z4oW6w8PBzzoGCeImjVrDXOnzsPNbd5yYKlaFj7I3i75ZX7YmPjoEbFCpVADku7F5ZeX/eGSUrWAWF3gd+uAMsvAmvDgZN3AZ0u4+P/ugksuwCcjTHeHv8E2HUDWHkJ+PUSsC0CiE6EKZs3Zz4q+lVCTmcPufhXC8D2rduhZs+ePcOo4d+iROFScLV3R6mipTFuzPfQvez5kMUY0OlfSUhIhG9pX0yZPumFfYmJiQg7fgKDhgzEvkMh+GXlz7hw/gKCPm4JtUlISJD/DlNnTIFaJSYkolTpUpg8beIL+8QLW6vmn+LKlXCsWLNc/ry9fbzxYf3G8t/GlCUmJOjbncFzXBjUbzD+2PEHFiydj6N/H8aXX3dH3579sHnjFqjBsSPHsHj+UpT0LWm0vWy5Mpg1fwYOnjiANZtWyazTtFFz+ean1jYnJj5EYJ2a6DPQRIPqGwo5sAeXIy4Zlk3b9B0OTZt/DJN0OgY4HwtUygl8lBco56rfdi6DC6xrD4A7jwAb8xf3/XlDH+prewENvAFna/22h09hqvLkyYNvx4zC/kOh2HcwBDUC/BHUtCXOnD4DtZo8YTIWzF0g38uO/X0E3343ClMmTsXsmXOgRBZZfQJK1aFDB8TGxmLdunVG23fv3o2AgADExMTAyckJWlW3Xh25ZMTR0RGbtm0w2iZ+IapXrYGIaxEywKlF3fp15aJmderVlktGRE/5oYOHcej4XyhRsrjcNm3mFBTwLoxVK1ajQ6f2MFV16tWRy8sc/OsgPm37Kar7fyDXO3XtiEXzF+PI4SNo+GEDmDJxd6Br+y8wffYUTBg32Whfhy6pP9O8+XzwzcjBeL9CdVwLv4b8BfNDjW3u3uML+TFkTyjUzN3d3Wh90vhJKFCwAD6orn+Om5zbDwEvO8DLVr9uZwmE3wfuPjI+LvEpcPg2UCu3Pnin9egZcD8JqJJLH8yFcm7A+Tgg9glgY5oxKv1r1MjRIzB/7gL5el6iZAmo0V8HDqLhhw1Rr0E9uZ43X16sWrEKRw4fhRKxBz2TPXnyBFoUFx8PMzMzODo5ZvWp0Fv0+PFj+TF7duvUF5Vs2WBtbY0D+/6CmlV+rzK2bNyCG5E35J2EPbv34uKFi6hVuxZMXb+eA1Cnfm3UqFXjlceJuyQ/L10u3+jyeOeBFtqspfeqX5f/inYd2srXbpPkbqMvRRElKsK9x8CtR0DuHKnHiJ7x0GighBPglPo6ZmCdDXCwBC7HA0nJ+rIZEc6zmwMuGRxvgsTdr5UrVsnf58rvVYJavVelMnbv2iPv6AsnT/yN/fsOvLQDKqsxoP9Ha9asQcmSJWUgyZcvHyZNMr4dLrZ9++23aNeuHRwcHPDZZ5/JF76vvvoKnp6eyJ49O/LmzYuxY8cavkb03Hfp0kX2ZoivqVmzJk6cOAFT9ejRIwwNHoaglkGyPaQeRYsVkXdEhn8zUt5VEs/tyROmIPJ6JKKjo6Fmk6ZNQLHixWQNunMOV3zcsCkmT5+I9z+oBlO2ZuVvOHn8JIaPHvrSYxbMWY
g8Lj5y+WP7H1i3ZQ2srKyg5jZrjRhXJGrt27RrA5NVyhnIZw+svwr8dAHYfA0o5gQUSPM+dCoGyGam354RcXESmEcf7kX9uahlFzXqorfdOoNyGBNy6u9TcHPMCccczujRvSdWrP4FxUvo74SqUd8BfdG8RTP4lSoPRxtnVK1YDV/26I5PPlVm+a1p3ptRiKNHj6JFixYYMWIEWrZsif3796N79+5wdXWVJTIpJk6ciGHDhmH48OFyffr06diwYQNWrlwJHx8fREREyCVFUFAQbGxssHXrVlkuMnfuXNSqVQvnz5+Hi4vLS3syU3ozhfj4eCiBGDDatlU72cM4bZZ667S1ytLSEstXLkP3z76Gd658MDc3R0CtGrJHQqkDb96WObPm4vChw1i5dgV8fLzloNI+PfrBM7cnAmoFwBRdj4iUgyPXblkjOw9eJqhVkPw5R0ffxIwps9ChdWds373llV9j6m3WmqWLf5QlXrlze8JkhT8ArtwH3vcAnKyAmMf6UpYcFkBBB32py7lYoKGPPohnRLyOHbqt7zGv6wWYmwEX44FdUUB9b/1jmagiRYvg4NEDiIuLx9o1a9G10+fY8ec21Yb0Nat+w4pfVmLxskWyjSdPnMTAvgNlZ2mbdq2hNKb7zMoEmzZtgp2dndG2tAOhJk+eLIPz0KH6XpciRYrgzJkzmDBhglFAFz3gffv2Naxfu3YNhQsXxvvvvy9vHYoe9BShoaE4dOgQbt26JXvlUwK+qIVfvXq17IHPiOiBHzlyJJQkJZxfuxqBLb9vYu+5SvmV88OBI6GIi4vDkydJcHd3Q41qNeFX3g9q9fDhQ4z4ZiR+Wf2zoZ5RDCj9+8TfmDZ5uskG9LBjYbh96zb8KwcYvebtD9mP+bMX4Nb9KHkR5ujoIJeChQuiYuUKyJerIDat34zmLZtBrW3WkmtXr+HPnbvwy6rlMGnH7uh70fPb69dFDfmDp8Cpe/qAfuuhvsZczPKSQvQrHL0DnI0FmuYHoh8CkQlAiwKA1fPngWt2ICpcX/ZSKuNOM1Mg7noVLFRQfl6uvB+OHjmKWTN+wMzZM6BGQwZ9g779+yCoZXO5Xsq3pBwXJ8ZaMKCbGDEYdPbs2UbbDh48iDZt9Lf8zp49i8aNGxvtr1atGqZOnSpf4FNe1CtUqGB0jAjvtWvXRtGiRVGvXj00atQIderoB6OJUhYxWEn0wqcPBJcuvXzquuDgYPTp08eoB93b2zvLw/nFi5ew9ffNL7SH1Efc7UkZOHrs6HEMHTEEaiWe32IR9fZpZTM3R3JyMkyVf83q2H/MeCDkl12/QuGihdGrX88Mg6q4UyKWtHfw1N5mtftx6TK453RH/ecXnybraQa/i6KjPOXmnih18UhTjy7sjNRvFwE+7WOk72FP+zgqIV67TPX3+E08TEzM4DU7m2JfsxnQX8HW1haFChUy2nb9+vV/9ThplStXDleuXJElLH/88YcskwkMDJQ95CKci9stYraY9F41a4zobU/pcc8M4jwvXbxsWA+/chUnwk7CxcUZHp4eaN2yjZxqcfW6VXj2LFneChfEflOuVU1P/++QeuEUfiUcJ8JOwNnFRZY9qKWNl9P8rK+GX8XJsJNwdnGW9ee/rV4LN3c3eHt74fSpMxjQdxAafdTQ5AdLvtDuK+FG7X6/+vsYMmgostvY6Etc9u7DLz/9grETvoOpsre3N8zGkyKHra0srRPbwy+Hy593zcAAuLq5yQGyUyZMQ3ab7IodaPVf2yzcjL4p5/u/cknf03rm1BnY2dvJ57x4PqiJCCvLlv6ENm1bw8LCxCOCmL1F1JjbWupLXEQduegZL/Q8fIsa8vR15KIeXUy16GiVOtDUKhuwPxrwdQUszIALccCDJCCP8Xu7KRk6eJiciU28lt2/f1+WfuzdE4KNW9ZDreo3rI/x4ybA28dLlriI9+qZU2eibYe2UCIT/+3LWsWLF8e+ffuMtol1Ueryul4XUe4h6tbF0r
...(base64 PNG payload of the confusion-matrix figure omitted)...",
+       "text/plain": [
" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "# Compute confusion matrix\n", + "cm = confusion_matrix(true_labels, predicted_labels)\n", + "\n", + "# Plot confusion matrix\n", + "fig, ax = plt.subplots(figsize=(10, 8))\n", + "cax = ax.matshow(cm, cmap='Greens')\n", + "\n", + "# Add labels, title, and ticks\n", + "ax.set_xticks(np.arange(NUMBER_CLASSES))\n", + "ax.set_yticks(np.arange(NUMBER_CLASSES))\n", + "ax.set_xticklabels(CIFAR_10_CLASSES)\n", + "ax.set_yticklabels(CIFAR_10_CLASSES)\n", + "plt.xlabel('Predicted Labels')\n", + "plt.ylabel('True Labels')\n", + "plt.title('Confusion Matrix for test set of CIFAR10')\n", + "\n", + "# Annotate each cell with the numeric value\n", + "for (i, j), val in np.ndenumerate(cm):\n", + " ax.text(j, i, f'{val}', ha='center', va='center', color='black')\n", + "\n", + "# Rotate class names on x-axis\n", + "plt.xticks(rotation=45)\n", + "plt.show()" + ] + }, + { + "cell_type": "markdown", + "id": "b449b164", + "metadata": {}, + "source": [ + "### Explain image classifier predictions\n", + "\n", + "Deep neural networks are often described as \"black boxes\" because their decision-making processes are difficult to understand and interpret. To address this, researchers have developed various methods to make these models more explainable. One such method is Grad-CAM (Gradient-weighted Class Activation Mapping). Grad-CAM computes the gradients of a target class with respect to the final convolutional layers and generates a heatmap that highlights the regions of the input image most influential in the model’s prediction for that class." + ] + }, + { + "cell_type": "markdown", + "id": "bac9c664", + "metadata": {}, + "source": [ + "#### Prepare image for GradCAM\n", + "\n", + "We begin by preparing the image for Grad-CAM visualisation. 
The image must be converted to the (Height, Width, Channels) format and normalised to values between 0 and 1, as the visualiser from the PyTorch-GradCAM library expects this format. Afterwards, a batch dimension should be added to make it compatible with the model's input requirements. We also retrieve the predicted and true labels, as both are needed for Grad-CAM computation and visualisation."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 50,
+ "id": "972d39aa",
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "# Select a single image from the batch\n",
+ "idx = 1\n",
+ "img = original_images[idx]\n",
+ "img_np = np.transpose(img, (1, 2, 0)) # shape: (H, W, C)\n",
+ "img_np = (img_np - img_np.min()) / (img_np.max() - img_np.min()) # Normalise values to the range [0, 1]\n",
+ "img = np.expand_dims(img, axis=0)\n",
+ "img = torch.from_numpy(img)\n",
+ "img = img.to(DEVICE) # shape: [1, C, H, W]\n",
+ "pred_label = predicted_labels[idx]\n",
+ "true_label = true_labels[idx]"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "id": "c2dd4920",
+ "metadata": {},
+ "source": [
+ "#### Compute GradCAM heatmap\n",
+ "\n",
+ "Using the predicted class, we compute the Grad-CAM values based on the activations and gradients from the last convolutional layer of the image classifier. This layer is typically chosen because it retains spatial information that helps localise the regions of the input image most relevant to the model's decision."
+ ] + }, + { + "cell_type": "code", + "execution_count": 53, + "id": "223532a5", + "metadata": {}, + "outputs": [], + "source": [ + "# Make sure input requires grad\n", + "img.requires_grad = True\n", + "\n", + "# Define the layer(s) to inspect\n", + "target_layers = [model.conv3]\n", + "\n", + "# Define the target class you want to explain\n", + "targets = [ClassifierOutputTarget(pred_label)]\n", + "\n", + "# Make sure that the model is on eval and not on train mode\n", + "model.eval()\n", + "\n", + "# Create CAM object\n", + "with GradCAM(model=model, target_layers=target_layers) as cam:\n", + " grad_cam_matrix = cam(input_tensor=img, targets=targets)\n", + " grad_cam_matrix = grad_cam_matrix[0, :]" + ] + }, + { + "cell_type": "markdown", + "id": "7ce807d1", + "metadata": {}, + "source": [ + "#### Visualise GradCAM heatmap with the image\n", + "\n", + "After obtaining the Grad-CAM heatmap, we overlay it on the input image to visualise the regions that contributed most to the model’s prediction. This helps identify which pixels the model focused on when predicting the class." 
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 54,
+ "id": "b73bf41a",
+ "metadata": {},
+ "outputs": [
+ {
+ "data": {
+ "image/png": "...(base64 PNG payload of the Grad-CAM overlay figure omitted)...",
+ "text/plain": [
" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "# Combine CAM with image\n", + "visualization = show_cam_on_image(img_np, grad_cam_matrix, use_rgb=True)\n", + "\n", + "# Plot image with GradCAM output\n", + "true_class = CIFAR_10_CLASSES[true_label]\n", + "pred_class = CIFAR_10_CLASSES[pred_label]\n", + "plot_multiple_images((img_np, f\"Original - {true_class}\"), (visualization, f\"Grad-CAM - {pred_class}\"), figsize = (5,6))" + ] + } + ], + "metadata": { + "kernelspec": { + "display_name": "Python 3 (ipykernel)", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.10.17" + } + }, + "nbformat": 4, + "nbformat_minor": 5 +} diff --git a/tutorial/tests/test_25_libraries_image_classification.py b/tutorial/tests/test_25_libraries_image_classification.py new file mode 100644 index 00000000..31834d26 --- /dev/null +++ b/tutorial/tests/test_25_libraries_image_classification.py @@ -0,0 +1,158 @@ +import pytest +import cv2 +import numpy as np + +def reference_scale_image(image, scale_factor): + # Get the current dimensions + height, width = image.shape[:2] + + # Calculate the new dimensions + new_width = int(width * scale_factor) + new_height = int(height * scale_factor) + new_size = (new_width, new_height) + + # Resize the image + return cv2.resize(image, new_size) + +@pytest.mark.parametrize("scale_factor", [0.5, 1.0, 2.0]) +def test_scale_image(scale_factor, function_to_test): + image = np.ones((32, 32, 3), dtype=np.uint8) * 255 + image_test = function_to_test(image, scale_factor) + image_reference = reference_scale_image(image, scale_factor) + assert image_test.shape == image_reference.shape + +def reference_crop_image(image, x: int, y: int, width: int, height: int): + x1, x2, y1, y2 = x, 
x+width, y, y+height
+    return image[y1:y2, x1:x2]
+
+@pytest.mark.parametrize("x, y, width, height", [
+    (2, 2, 2, 2),
+    (5, 5, 4, 4),
+    (10, 10, 6, 6)
+])
+def test_crop_image(x, y, width, height, function_to_test):
+    image = np.ones((32, 32, 3), dtype=np.uint8) * 255
+    image_test = function_to_test(image, x, y, width, height)
+    image_reference = reference_crop_image(image, x, y, width, height)
+    assert image_test.shape == image_reference.shape
+
+def reference_horizontal_flip_image(image):
+    return cv2.flip(image, 1)
+
+def test_horizontal_flip_image(function_to_test):
+    image = np.ones((32, 32, 3), dtype=np.uint8) * 255
+    image_test = function_to_test(image)
+    image_reference = reference_horizontal_flip_image(image)
+    assert np.allclose(image_test, image_reference)
+
+def reference_vertical_flip_image(image):
+    return cv2.flip(image, 0)
+
+def test_vertical_flip_image(function_to_test):
+    image = np.ones((32, 32, 3), dtype=np.uint8) * 255
+    image_test = function_to_test(image)
+    image_reference = reference_vertical_flip_image(image)
+    assert np.allclose(image_test, image_reference)
+
+def reference_rotate_image(image, angle: float):
+    (h, w) = image.shape[:2]
+    center = (w // 2, h // 2)
+    M = cv2.getRotationMatrix2D(center, angle, scale=1.0)
+
+    # Compute new bounding dimensions
+    cos = np.abs(M[0, 0])
+    sin = np.abs(M[0, 1])
+    new_w = int((h * sin) + (w * cos))
+    new_h = int((h * cos) + (w * sin))
+
+    # Adjust rotation matrix for translation
+    M[0, 2] += (new_w / 2) - center[0]
+    M[1, 2] += (new_h / 2) - center[1]
+
+    # Perform rotation with expanded canvas
+    return cv2.warpAffine(image, M, (new_w, new_h))
+
+@pytest.mark.parametrize("angle", [5, 10, 20, 30])
+def test_rotate_image(angle, function_to_test):
+    image = np.ones((32, 32, 3), dtype=np.uint8) * 255
+    image_test = function_to_test(image, angle)
+    image_reference = reference_rotate_image(image, angle)
+    assert np.allclose(image_test, image_reference)
+
+def reference_average_filter(image, 
kernel_size):
+    return cv2.blur(image, kernel_size)
+
+@pytest.mark.parametrize("kernel_size", [(3,3), (5,5)])
+def test_average_filter(kernel_size, function_to_test):
+    image = np.ones((32, 32, 3), dtype=np.uint8) * 255
+    image_test = function_to_test(image, kernel_size)
+    image_reference = reference_average_filter(image, kernel_size)
+    assert np.allclose(image_test, image_reference)
+
+def reference_median_filter(image, ksize):
+    return cv2.medianBlur(image, ksize)
+
+@pytest.mark.parametrize("ksize", [3, 5])
+def test_median_filter(ksize, function_to_test):
+    image = np.ones((32, 32, 3), dtype=np.uint8) * 255
+    image_test = function_to_test(image, ksize)
+    image_reference = reference_median_filter(image, ksize)
+    assert np.allclose(image_test, image_reference)
+
+def reference_gaussian_filter(image, kernel_size, sigma):
+    return cv2.GaussianBlur(image, kernel_size, sigma)
+
+@pytest.mark.parametrize("kernel_size, sigma", [
+    ((3,3), 0),
+    ((5,5), 0),
+    ((3,3), 1),
+    ((5,5), 1)
+])
+def test_gaussian_filter(kernel_size, sigma, function_to_test):
+    image = np.ones((32, 32, 3), dtype=np.uint8) * 255
+    image_test = function_to_test(image, kernel_size, sigma)
+    image_reference = reference_gaussian_filter(image, kernel_size, sigma)
+    assert np.allclose(image_test, image_reference)
+
+def reference_adjust_brightness(image, brightness_value):
+    return cv2.convertScaleAbs(image, beta=brightness_value)
+
+@pytest.mark.parametrize("brightness_value", [-30, -20, -10, 0, 10, 20, 30])
+def test_adjust_brightness(brightness_value, function_to_test):
+    image = np.ones((32, 32, 3), dtype=np.uint8) * 255
+    image_test = function_to_test(image, brightness_value)
+    image_reference = reference_adjust_brightness(image, brightness_value)
+    assert np.allclose(image_test, image_reference)
+
+def reference_adjust_contrast(image, contrast_value):
+    return cv2.convertScaleAbs(image, alpha=contrast_value)
+
+@pytest.mark.parametrize("contrast_value", [0.5, 1.0, 1.5, 2.0])
+def 
test_adjust_contrast(contrast_value, function_to_test):
+    image = np.ones((32, 32, 3), dtype=np.uint8) * 255
+    image_test = function_to_test(image, contrast_value)
+    image_reference = reference_adjust_contrast(image, contrast_value)
+    assert np.allclose(image_test, image_reference)
+
+def reference_adjust_saturation(image, saturation_factor):
+    # Convert the image from RGB to HSV
+    image_hsv = cv2.cvtColor(image, cv2.COLOR_RGB2HSV)
+
+    # Split the HSV image into Hue, Saturation, and Value channels
+    hue, saturation, value = cv2.split(image_hsv)
+
+    # Adjust the saturation channel (ensure it stays within the valid range)
+    saturation = np.clip(saturation * saturation_factor, 0, 255)
+
+    # Merge the channels back
+    image_hsv_adjusted = cv2.merge([hue, saturation.astype(np.uint8), value])
+
+    # Convert the adjusted image back to RGB
+    return cv2.cvtColor(image_hsv_adjusted, cv2.COLOR_HSV2RGB)
+
+@pytest.mark.parametrize("saturation_factor", [0.5, 1.0, 1.5, 2.0])
+def test_adjust_saturation(saturation_factor, function_to_test):
+    image = np.ones((32, 32, 3), dtype=np.uint8) * 255
+    image_test = function_to_test(image, saturation_factor)
+    image_reference = reference_adjust_saturation(image, saturation_factor)
+    assert np.allclose(image_test, image_reference)
\ No newline at end of file

From 287970c0e9f017be50750f0e197fb8e36df19e8c Mon Sep 17 00:00:00 2001
From: "pre-commit-ci[bot]" <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Date: Tue, 6 May 2025 14:15:55 +0000
Subject: [PATCH 02/10] [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci
---
 25_libraries_image_classification.ipynb       | 1033 ++++------------
 .../test_25_libraries_image_classification.py |   57 +-
 2 files changed, 233 insertions(+), 857 deletions(-)

diff --git a/25_libraries_image_classification.ipynb b/25_libraries_image_classification.ipynb
index 59a981be..3a8c6081 100644
--- a/25_libraries_image_classification.ipynb
+++ 
b/25_libraries_image_classification.ipynb @@ -2,7 +2,7 @@ "cells": [ { "cell_type": "markdown", - "id": "10140aff", + "id": "0", "metadata": {}, "source": [ "# Image Classification Notebook" @@ -10,7 +10,7 @@ }, { "cell_type": "markdown", - "id": "0f1a238d", + "id": "1", "metadata": {}, "source": [ "## Table of Contents\n", @@ -62,7 +62,7 @@ }, { "cell_type": "markdown", - "id": "eea0d77c", + "id": "2", "metadata": {}, "source": [ "## Libraries" @@ -70,7 +70,7 @@ }, { "cell_type": "markdown", - "id": "a1581546", + "id": "3", "metadata": {}, "source": [ "- NumPy\n", @@ -84,8 +84,8 @@ }, { "cell_type": "code", - "execution_count": 47, - "id": "f1f07e51", + "execution_count": null, + "id": "4", "metadata": {}, "outputs": [], "source": [ @@ -115,7 +115,7 @@ }, { "cell_type": "markdown", - "id": "d11781b5", + "id": "5", "metadata": {}, "source": [ "## References\n", @@ -131,7 +131,7 @@ }, { "cell_type": "markdown", - "id": "d55143be", + "id": "6", "metadata": {}, "source": [ "## Introduction\n", @@ -143,7 +143,7 @@ }, { "cell_type": "markdown", - "id": "9a037a61", + "id": "7", "metadata": {}, "source": [ "## Classes\n", @@ -159,8 +159,8 @@ }, { "cell_type": "code", - "execution_count": 2, - "id": "0644e0df", + "execution_count": null, + "id": "8", "metadata": {}, "outputs": [], "source": [ @@ -370,7 +370,7 @@ }, { "cell_type": "markdown", - "id": "4f522994", + "id": "9", "metadata": {}, "source": [ "## Functions\n", @@ -380,8 +380,8 @@ }, { "cell_type": "code", - "execution_count": 3, - "id": "44ba288a", + "execution_count": null, + "id": "10", "metadata": {}, "outputs": [], "source": [ @@ -405,7 +405,7 @@ }, { "cell_type": "markdown", - "id": "650cf38c", + "id": "11", "metadata": {}, "source": [ "## Dataset\n", @@ -415,7 +415,7 @@ }, { "cell_type": "markdown", - "id": "c610a4ed", + "id": "12", "metadata": {}, "source": [ "### Load data\n", @@ -425,8 +425,8 @@ }, { "cell_type": "code", - "execution_count": 4, - "id": "44f45fed", + "execution_count": null, + "id": "13", 
"metadata": {}, "outputs": [], "source": [ @@ -448,7 +448,7 @@ }, { "cell_type": "markdown", - "id": "ff15666e", + "id": "14", "metadata": {}, "source": [ "## Explore image processing\n", @@ -458,7 +458,7 @@ }, { "cell_type": "markdown", - "id": "f4ad942e", + "id": "15", "metadata": {}, "source": [ "### Example image" @@ -466,21 +466,10 @@ }, { "cell_type": "code", - "execution_count": 5, - "id": "2b6795a0", - "metadata": {}, - "outputs": [ - { - "data": { - "image/png": "iVBORw0KGgoAAAANSUhEUgAAAK8AAACvCAYAAACLko51AAAAOnRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjEwLjEsIGh0dHBzOi8vbWF0cGxvdGxpYi5vcmcvc2/+5QAAAAlwSFlzAAAPYQAAD2EBqD+naQAAD+pJREFUeJztncmPXFcVxt9cVa/nbreHtttJ7EBCGLJAIEWAhAISfwULliC2bCPgv2HFLkiRgEQQAjGxkmBwbMft9jx0u+ea34DKYnPv9zl90+4o3Orvt3tH901Vp67uV+fcc8K6rutACA+JvuwHEOKgyHmFt8h5hbfIeYW3yHmFt8h5hbfIeYW3yHmFt8h5hbckrgPfeONXYNt+cB9svXYPb9KYMA0R/mbOv3gebOfOoy0gAcG7d26D7T8XLhjHqysrMKYkP90oxY+k0crBNjs1DbbpmZnPPB4xNz8HtpmZebDlkzhuagqv15rEZ2vmpq3Zsj7/IAjirAW2KgiJDaldprwSv6eqwqtFMV7sO69+zeEGmnmFx8h5hbfIecX4r3nnFpfAtrhwAmxnzzyH584fM44HYQpjwiQDG0t46/W6YHvp5PNgO//yt4zjlatXYcz25gbYtjbQduvmDbDdvoW2xFoytjJ8z3LQAVuaxGBrNnHNmzSaOG4K17OtqUnjeHZhEcbMzuP3OTOL95ycwbX9FLG1JqeM47iBa/E4QXdLYnx3VzTzCm+R8wpvkfMKb5HzivEXbF99Cf84vnblGtjWt3fBllt/rjdaKGR6vT2wZRmKuGqAgq3dRxG0ePyUcfzaaRR1d2+tgq2zvQW21773fbDdf3gXnzdtGMezlogZceljM3gy4p0/vgm28hEGVaIIgwh1iLa4ke37OcYVnpeScUnDfKcR+QQGOGYs8T41fwbGzM1hMGZhYQFs3/7Gy4ELmnmFt8h5hbfIeYW3yHnFEYiwTaH4OPfiV8B25/ZNsG1sPDSOp0l2VKOJIiCLMcI2keHvrdsbgK0uTUFSFDAkmJnBiNKgj4KwKPH6yyTjrdWcNY4nc/N4xLHlF8DWIZHEt37/O7DFBY7LYhS/aWU+b9XF54/KIdh6RBBWRBCukVyz+lNLvMckwhZhNK1BBOHPfvnzwAXNvMJb5LzCW+S8YvzXvJf/9RHYpheOg62V4O9h8/Ej47hL1mDHT57Gm0YlmIYkjX9A1oJhZdoi63hESnZNzM1hxtS77/4ZbFMtXKu98vXvGsd9su4b4CsF04snwTZMUANsbm6CLU9w/Zlb6+AGyeYKE3x+VrSOfGxBjcvgoK6t5xjsOmUJ7nYOXipPM6/wFjmv8BY5r/AWOa8Yf8G2sbUGtksf/gNsaYEC4uQL5tagARmTT+J2ljw3M8NG1OT3Ri4XdLqmYCD/jwfDQR9sn3z0Adguvv0W2CYm8HlPLZrPe2KZBF6ISPzmK6+CLfnpL8B2lwSAtrfWwba7Y25l2tvBTLl2uw22bhcDNMMhBjNqIu3C0PxeMiI4sxQDKrm1Tf/
zoJlXeIucV3iLnFd4i5xXjL9gY3W3bnRw6876AzODbES3Mhf9U8cwMheS7KVWE+sULJD6EUmCQqDfNbcGtVq4xeXa1ctge++vfwFbVGJYbGsdhdI9q2ZaYwq3uGS5WVNhxCzJbvvBD1/H5yBZX90eCq9OxxSr7d1tGPPwDoq/1RtYi+Lap586idUzZ5aN4wVS06PVQhE3P49bg1zRzCu8Rc4rvEXOK7xFzivGX7AFJIVuluzDf7iCtRCalnjauXMLz3uIQu+DixfB9gqJRuUTmMY46JtFronWCT6++D7Ytkk0qihQsFUlhvVChxTA4QAjVns1ii4WeGqkKHha5N1n5kxB3CQF/7IIbTvbWP/i9ddxu9OJEyjGJq1i20kzdyou3SSi3BXNvMJb5LzCW+S8wlvkvGL8BVuP5B1mTbfq18XQ3LNWk0rgD+6Z+9xGXL+BXX7ee+/vYItI7YIkNp9jcR5rKARD0rmI/Jx3d3A/1oJVfXxEZhW4C0nXo7Ii4o9sbEvTzKlyeUWEY69nvtfVKxhJfPftP4FtdRWL+y0t4d7C9c3HYKstuZo0MQqXkJTIgqRc/ugnPw5c0MwrvEXOK7xFziu8Rc4rxl+wzZI0xofXLju1JupZEbYgw9umdh+oUfSogeP2On2nRX9ltcbaIXu9SpJOODOLwm5AKm/0+vgce3t7nykan4zp4XnTpBVsNayc0k3bbRSTV6xUz39ewL2GKytX8FrW84+4cfO6U7GWyqpEEsWkGjvxjYJUQPzNb38duKCZV3iLnFd4i5xXeIucV4y/YFtexlZQVy/8DWyPt3G/VHfTFClnnj8LYyKyhy0iEarQpULhEwFhCoGCRLEmWpiOt7OLAmi3jSKrRZ7NTuFcfYSfxRTZrzaRYzQqI/2Zr179BGybpBjM6qpZpXxzCyNiZY2fR81KQpLPuyR7+uyvoCatsliKKPuOXdHMK7xFziu8Rc4rvEXOK8ZfsOUxiptTRMQNSbn7om9GwPoDXLhv7WB64pDUj0+JyApJWmBpRbIKsmerjvFZkwZJr+yjQOmT9gKXrplC6fEHH8KYvEVSKUkaaU3evWtHKoORMCXiyVJPMUkZDQJSNjOq3UQWiRwGsfW8tdu1qCJ0RDOv8BY5r/AWOa84AtuAdnG9dXrJLK42YnIWazl0H5oVtzc28c/7NssWYz1XWYtR0oq0Ks1zB6Sa9+bODtgyUuMgZAXu+tiOa8+qFdEfsnfCNWpM5hDWLoptK4pI1Mauj8BiD1Ho1kKqJHqCs//12JqXBZ1c0cwrvEXOK7xFziu8Rc4rxl+w9XvY5ohtc5mbxqypwj6XrO07pI1SRuo7dK2aBCMqsg0osf40Z8IgIn/K93ooTCOrTdPTLjgYoIhzES000MAemBSqK/e941PuSb4EVnmdVax3gQYkWOAiODiaeYW3yHmFt8h5hbfIecX4C7ZOZxNsN63tJiNaTdyvPzs9ZRz3icCKsCB5sLgw7ySKuh0UWQPrHgNSkTwhgjCO8fc8HBZOkbLSFlRUoBDxxIJYLAJGxFNNo1bmuJrcgEUND5PaVZzRTDM3NPMKb5HzCm+R8wpvkfOK8Rds7194B2x3b2Gv2jTBBXh7z1RjSRNbMk1O4vaYM6dOgW17A5XdJqkj0LK2I21u4XmsZEBBUgC7XSzIFwfZoYkPGsRiRkfBdphRLBZho2LsgO9+0PNGaOYV3iLnFd4i5xXeIucV4y/Yrl+5BLaNdaw2fu7cc2BrWLUWegOMWA0GmOqYkr5SIUkCjImo2O2YKZZ1hNG0BhGOBak0XhNBOKjwHXCvmFsUi1YzIO/kavsyOKjwUqE9cSSR8wpvkfMKb5HzivEXbOt37oKtKtk+K7xkKzfbQz1auwNjJkkBut09TMNMs3DfXrsjulbmZCvHdlHb23j9usDUybyFlct3uqSHcFHvWxCEiTiWJsmDboe4n4wQEVF7mNG0wxacmnmFt8h5hbfIeYW3yHnF+Au2nS6Kojw
lraBI6mFiRdhyUt2ctLMN+qRP7yRp+9QjBVFqqxr7sMa9b3VBbESLlMTIUidtmRWSYiXPkgL4LOe6XCsm0a6KjGOtrA6KXdHy86CZV3iLnFd4i5xXeIucV4y/YOuSoh1xgGmBG+v3wLZ44qRxfHrpOIxpNnBP2MZjTLlcX3vsVNY/j0xbRqJHx5fM5xrxYB1bDmzu7B1QsIWHGnn6ogVbScQT7/8c7iviXFMdFWETRxI5r/AWOa8Y/zVv0cW1YMV8vyRrpNpcGycJrq1OnsL15/FjJ8D2h+tvgm3p1BLYWlZHqk4PAxLtIf7ZXpC+T+w9WVV1lyXps2RWsT/0a4dCe2yjUe14fde1qz2OnXeYGWpP7nHgM4X4kpHzCm+R8wpvkfOK8RdsZ4/lYFuYR9vsHIqs1NqC0ytRPK2tPwLbc6fPg2359FmwLR4ztxmNKKzAxb1/X4Yx61tYo2FQufX8DWnv3i+22BwXdiEZ5zAm2D/I8vR7IrZAi+PYrZf0M6CZV3iLnFd4i5xXeIucV4y/YDu/fAxs+RTWWkgnUDzdvGdmhz3e3YExnTYRcWc3wHbyNFZLX1t7ALaV1dvG8d0HazAmCEmdAmYjUbcvusAdE3GsN3DNRKIVKXOtvE57INeRa2nAzzx8Ks/wMWrmFd4i5xXeIucV3iLnFeMv2CZmsF5C1EBx1iEpkZXVzzcJcctPq4FCabeNaZjtIfYZXlnFllobGzv7pjryyJPr9pv9U/7c0hWfAong1eTUhIi4yhJUrPdwRaNppO9yiVGxsiapk9blIuJa9nP97+mCg6KZV3iLnFd4i5xXeIucV4y/YJs5hnvMbt3HlMKb9zGSVVoiZdBFEdCzS5kHQbDVxuJ+IanI1yd70Wx9liSJU2V3to+L1oIL9y8Q57pni2m4xBK5T56DCKWafIVhavZdrkmNCdb+qyIF9IqSvQMRe1YkLgzJc7HPLDx40T7NvMJb5LzCW+S8wlvkvGL8BVufbD+6cw/3nd0hqYcDWz1V+JspSD/ifAKjekmBi/5ySESFdc8oJRGxyrGwBw4LQlqIZP+5oKLplez65K6OVcpjq6gg24OXsQhh7BZdpKLWEoXVAKvaRywyFyvCJo4gcl7hLXJe4S1yXnEEKqO3MRVxOMSK5BFJqyuHdvSscoooxUQsJGR9n5H0vqphRpkGBYvksPREJpTImeRUe4+Za5cmtjeNFQWJA3yHiDxcVJqRyZhcv0UijkmCaakh2dNXkO8dK8XjGPa9x0QkuqKZV3iLnFd4i5xXeIucV4y/YOvtYfpj0cWevyFLv7NERUn2RTFhUA/7Tnu2mO6qG2Z/46LGaw1I1cLasQoGa/tk9+l1rVjPUgzZHjM20+SkRUKemudO5w0ck2P/54hUdmSppLxk//6FTpgoT7ODz5+aeYW3yHmFt8h5xfiveasCt+TMT1v9op7SpsrOSKsrrNuQxnitLCE20oa1rHDctrWebZLtQ0UTF2YDUhq9IFlrLABhr4Pplh+ylo1JZlWWYEBiZgLXqSfmZ3Bcy3zXZoafWZS4tWWNYxbMSPc9NyStvli19Jisg13RzCu8Rc4rvEXOK7xFzivGX7CFJEtocR6F1+ICLuaryhQfUYB/mseR26Pwugpom+6YmWxpgxQKJAGPfg+FEtnRQoMULoX2IiI4M7JFqZXh5z3Jgg2tfF8RFJOgQkSyudh3EEWpW5FBO7uNTotsK5a2AYkjiJxXeIucV3iLnFeMv2BjKVIJidIwW5qakaE0RuHBUsNqxzoFA1LzwRYfU9MobKoai/uFAQqqUV4cjItKh37Ebr18I2YjT0HrO4T7VzjnkbN033oPTxNsrIieLX5D2naLvBUr9+6IZl7hLXJe4S1yXuEtcl5xBCJsJErDUtyyDBf4zaZpS4hYYKmCLHLGBBtr1ZSnLeM4JRGlglwrjMg2psix1oL1GbF3ci4V4VwrIkabPZDU0gioOGPXchxnvTutx0D6GLOCha5o5hX
eIucV3iLnFd4i5xXeEta8sa4Q//do5hXeIucV3iLnFd4i5xXeIucV3iLnFd4i5xXeIucV3iLnFYGv/Bev9xSmm2mi6gAAAABJRU5ErkJggg==", - "text/plain": [ - "
" - ] - }, - "metadata": {}, - "output_type": "display_data" - } - ], + "execution_count": null, + "id": "16", + "metadata": {}, + "outputs": [], "source": [ "# Select image\n", "image = train_set[0][9]\n", @@ -494,7 +483,7 @@ }, { "cell_type": "markdown", - "id": "0fcff42b", + "id": "17", "metadata": {}, "source": [ "### Geometric transformation\n", @@ -508,7 +497,7 @@ }, { "cell_type": "markdown", - "id": "25325640", + "id": "18", "metadata": {}, "source": [ "#### Scaling" @@ -516,60 +505,20 @@ }, { "cell_type": "code", - "execution_count": 6, - "id": "9f272277", - "metadata": {}, - "outputs": [ - { - "data": { - "text/html": [ - "
🚫 OpenAI API key is not provided.
API key is missing.
" - ], - "text/plain": [ - "" - ] - }, - "metadata": {}, - "output_type": "display_data" - }, - { - "data": { - "text/html": [ - "
🔄 IPytest extension (re)loaded.
" - ], - "text/plain": [ - "" - ] - }, - "metadata": {}, - "output_type": "display_data" - } - ], + "execution_count": null, + "id": "19", + "metadata": {}, + "outputs": [], "source": [ "%reload_ext tutorial.tests.testsuite" ] }, { "cell_type": "code", - "execution_count": 7, - "id": "02f2480c", - "metadata": {}, - "outputs": [ - { - "data": { - "application/vnd.jupyter.widget-view+json": { - "model_id": "c2fd61632d3e45ea9b2d33d713e16199", - "version_major": 2, - "version_minor": 0 - }, - "text/plain": [ - "VBox(children=(Output(outputs=({'output_type': 'display_data', 'data': {'text/plain': 'HTML(value=\\'
" - ] - }, - "metadata": {}, - "output_type": "display_data" - } - ], + "execution_count": null, + "id": "21", + "metadata": {}, + "outputs": [], "source": [ "# Scale image by half\n", "scaled_image = solution_scale_image(image, 0.5)\n", @@ -614,7 +552,7 @@ }, { "cell_type": "markdown", - "id": "24bdd539", + "id": "22", "metadata": {}, "source": [ "#### Cropping" @@ -622,25 +560,10 @@ }, { "cell_type": "code", - "execution_count": 9, - "id": "cc7e4291", - "metadata": {}, - "outputs": [ - { - "data": { - "application/vnd.jupyter.widget-view+json": { - "model_id": "60feb9e32dda4b03abcf6d6eb1f4f0ae", - "version_major": 2, - "version_minor": 0 - }, - "text/plain": [ - "VBox(children=(Output(outputs=({'output_type': 'display_data', 'data': {'text/plain': 'HTML(value=\\'
" - ] - }, - "metadata": {}, - "output_type": "display_data" - } - ], + "execution_count": null, + "id": "24", + "metadata": {}, + "outputs": [], "source": [ "# Crop image to get a 15-by-15 image starting on (x,y): (2,2)\n", "cropped_image = solution_crop_image(image, 2, 2, 15, 15)\n", @@ -676,7 +588,7 @@ }, { "cell_type": "markdown", - "id": "4a447b09", + "id": "25", "metadata": {}, "source": [ "#### Horizontal Flip" @@ -684,25 +596,10 @@ }, { "cell_type": "code", - "execution_count": 11, - "id": "6dc1507a", - "metadata": {}, - "outputs": [ - { - "data": { - "application/vnd.jupyter.widget-view+json": { - "model_id": "20e0f6f8ddb443b8ad30199289244442", - "version_major": 2, - "version_minor": 0 - }, - "text/plain": [ - "VBox(children=(Output(outputs=({'output_type': 'display_data', 'data': {'text/plain': 'HTML(value=\\'
" - ] - }, - "metadata": {}, - "output_type": "display_data" - } - ], + "execution_count": null, + "id": "27", + "metadata": {}, + "outputs": [], "source": [ "# Flip image horizontally\n", "flip_image_horizontal = solution_horizontal_flip_image(image)\n", @@ -737,7 +623,7 @@ }, { "cell_type": "markdown", - "id": "0b473067", + "id": "28", "metadata": {}, "source": [ "#### Vertical Flip" @@ -745,25 +631,10 @@ }, { "cell_type": "code", - "execution_count": 13, - "id": "1d9325d9", - "metadata": {}, - "outputs": [ - { - "data": { - "application/vnd.jupyter.widget-view+json": { - "model_id": "9d99e8b0985d40d484a4b43a7177b431", - "version_major": 2, - "version_minor": 0 - }, - "text/plain": [ - "VBox(children=(Output(outputs=({'output_type': 'display_data', 'data': {'text/plain': 'HTML(value=\\'
" - ] - }, - "metadata": {}, - "output_type": "display_data" - } - ], + "execution_count": null, + "id": "30", + "metadata": {}, + "outputs": [], "source": [ "# Flip image vertically\n", "flip_image_vertical = solution_vertical_flip_image(image)\n", @@ -798,7 +658,7 @@ }, { "cell_type": "markdown", - "id": "be8a0c63", + "id": "31", "metadata": {}, "source": [ "#### Rotation" @@ -806,25 +666,10 @@ }, { "cell_type": "code", - "execution_count": 15, - "id": "c1469ab1", - "metadata": {}, - "outputs": [ - { - "data": { - "application/vnd.jupyter.widget-view+json": { - "model_id": "f0439770c28b41ca8b82262623da1391", - "version_major": 2, - "version_minor": 0 - }, - "text/plain": [ - "VBox(children=(Output(outputs=({'output_type': 'display_data', 'data': {'text/plain': 'HTML(value=\\'
" - ] - }, - "metadata": {}, - "output_type": "display_data" - } - ], + "execution_count": null, + "id": "33", + "metadata": {}, + "outputs": [], "source": [ "# Rotate image by 20 degrees\n", "rotated_image = solution_rotate_image(image, 20)\n", @@ -874,7 +708,7 @@ }, { "cell_type": "markdown", - "id": "c38eb125", + "id": "34", "metadata": {}, "source": [ "### Image filtering\n", @@ -887,7 +721,7 @@ }, { "cell_type": "markdown", - "id": "d850c8c1", + "id": "35", "metadata": {}, "source": [ "#### Average filter " @@ -895,25 +729,10 @@ }, { "cell_type": "code", - "execution_count": 17, - "id": "cd187048", - "metadata": {}, - "outputs": [ - { - "data": { - "application/vnd.jupyter.widget-view+json": { - "model_id": "20a47b710a72400ea2634ba607c59b8e", - "version_major": 2, - "version_minor": 0 - }, - "text/plain": [ - "VBox(children=(Output(outputs=({'output_type': 'display_data', 'data': {'text/plain': 'HTML(value=\\'
" - ] - }, - "metadata": {}, - "output_type": "display_data" - } - ], + "execution_count": null, + "id": "37", + "metadata": {}, + "outputs": [], "source": [ "# Filter image using average filter\n", "average_filter_image = solution_average_filter(image, (3, 3))\n", @@ -948,7 +756,7 @@ }, { "cell_type": "markdown", - "id": "8abdf202", + "id": "38", "metadata": {}, "source": [ "#### Median filter" @@ -956,25 +764,10 @@ }, { "cell_type": "code", - "execution_count": 19, - "id": "c20b2bfb", - "metadata": {}, - "outputs": [ - { - "data": { - "application/vnd.jupyter.widget-view+json": { - "model_id": "a2e262f6bce44cafa6b074a008d8185b", - "version_major": 2, - "version_minor": 0 - }, - "text/plain": [ - "VBox(children=(Output(outputs=({'output_type': 'display_data', 'data': {'text/plain': 'HTML(value=\\'
" - ] - }, - "metadata": {}, - "output_type": "display_data" - } - ], + "execution_count": null, + "id": "40", + "metadata": {}, + "outputs": [], "source": [ "# Filter image using median filter\n", "median_filter_image = solution_median_filter(image, 3)\n", @@ -1009,7 +791,7 @@ }, { "cell_type": "markdown", - "id": "cc4f9ed1", + "id": "41", "metadata": {}, "source": [ "#### Gaussian filter" @@ -1017,25 +799,10 @@ }, { "cell_type": "code", - "execution_count": 21, - "id": "4f123cf6", - "metadata": {}, - "outputs": [ - { - "data": { - "application/vnd.jupyter.widget-view+json": { - "model_id": "0fdd6fe9103a400d876ac423afea8f85", - "version_major": 2, - "version_minor": 0 - }, - "text/plain": [ - "VBox(children=(Output(outputs=({'output_type': 'display_data', 'data': {'text/plain': 'HTML(value=\\'
" - ] - }, - "metadata": {}, - "output_type": "display_data" - } - ], + "execution_count": null, + "id": "43", + "metadata": {}, + "outputs": [], "source": [ "# Filter image using Gaussian filter\n", "gaussian_filter_image = solution_gaussian_filter(image, (7, 7), 0)\n", @@ -1070,7 +826,7 @@ }, { "cell_type": "markdown", - "id": "e736d5c9", + "id": "44", "metadata": {}, "source": [ "### Photometric transformation\n", @@ -1083,7 +839,7 @@ }, { "cell_type": "markdown", - "id": "1d8fa3ce", + "id": "45", "metadata": {}, "source": [ "#### Adjust brightness" @@ -1091,25 +847,10 @@ }, { "cell_type": "code", - "execution_count": 23, - "id": "04be16c2", - "metadata": {}, - "outputs": [ - { - "data": { - "application/vnd.jupyter.widget-view+json": { - "model_id": "ebd850c398c4419b9f53ca0b7b98e810", - "version_major": 2, - "version_minor": 0 - }, - "text/plain": [ - "VBox(children=(Output(outputs=({'output_type': 'display_data', 'data': {'text/plain': 'HTML(value=\\'
" - ] - }, - "metadata": {}, - "output_type": "display_data" - } - ], + "execution_count": null, + "id": "47", + "metadata": {}, + "outputs": [], "source": [ "# Brighter image (positive brightness value)\n", "brighter_image = solution_adjust_brightness(image, 100)\n", @@ -1147,7 +877,7 @@ }, { "cell_type": "markdown", - "id": "bb558cf5", + "id": "48", "metadata": {}, "source": [ "#### Adjust contrast" @@ -1155,25 +885,10 @@ }, { "cell_type": "code", - "execution_count": 25, - "id": "00b40d8d", - "metadata": {}, - "outputs": [ - { - "data": { - "application/vnd.jupyter.widget-view+json": { - "model_id": "046897f270d041869151a9e100385004", - "version_major": 2, - "version_minor": 0 - }, - "text/plain": [ - "VBox(children=(Output(outputs=({'output_type': 'display_data', 'data': {'text/plain': 'HTML(value=\\'
" - ] - }, - "metadata": {}, - "output_type": "display_data" - } - ], + "execution_count": null, + "id": "50", + "metadata": {}, + "outputs": [], "source": [ "# Increase contrast (Value > 1.0)\n", "high_contrast_image = solution_adjust_contrast(image, 2.0)\n", @@ -1211,7 +915,7 @@ }, { "cell_type": "markdown", - "id": "7d83b5da", + "id": "51", "metadata": {}, "source": [ "#### Adjust saturation" @@ -1219,25 +923,10 @@ }, { "cell_type": "code", - "execution_count": 27, - "id": "6a42dd8e", - "metadata": {}, - "outputs": [ - { - "data": { - "application/vnd.jupyter.widget-view+json": { - "model_id": "cc5750f6d3774b86b413108e6f898cbc", - "version_major": 2, - "version_minor": 0 - }, - "text/plain": [ - "VBox(children=(Output(outputs=({'output_type': 'display_data', 'data': {'text/plain': 'HTML(value=\\'
" - ] - }, - "metadata": {}, - "output_type": "display_data" - } - ], + "execution_count": null, + "id": "53", + "metadata": {}, + "outputs": [], "source": [ "# Decrease saturation\n", "low_saturation_image = solution_adjust_saturation(image, 0.2)\n", @@ -1288,7 +966,7 @@ }, { "cell_type": "markdown", - "id": "8d90478e", + "id": "54", "metadata": {}, "source": [ "## Image classifier development using CNNs\n", @@ -1298,7 +976,7 @@ }, { "cell_type": "markdown", - "id": "d78dee96", + "id": "55", "metadata": {}, "source": [ "### Dataset preprocessing" @@ -1306,7 +984,7 @@ }, { "cell_type": "markdown", - "id": "8ec8f785", + "id": "56", "metadata": {}, "source": [ "#### Train, validation, and test sets\n", @@ -1316,8 +994,8 @@ }, { "cell_type": "code", - "execution_count": 29, - "id": "8aba72de", + "execution_count": null, + "id": "57", "metadata": {}, "outputs": [], "source": [ @@ -1331,7 +1009,7 @@ }, { "cell_type": "markdown", - "id": "e84c416a", + "id": "58", "metadata": {}, "source": [ "#### Data Augmentation\n", @@ -1348,8 +1026,8 @@ }, { "cell_type": "code", - "execution_count": 30, - "id": "3c0cc98c", + "execution_count": null, + "id": "59", "metadata": {}, "outputs": [], "source": [ @@ -1377,7 +1055,7 @@ }, { "cell_type": "markdown", - "id": "46993f36", + "id": "60", "metadata": {}, "source": [ "#### PyTorch Datasets\n", @@ -1387,8 +1065,8 @@ }, { "cell_type": "code", - "execution_count": 31, - "id": "ef2d8891", + "execution_count": null, + "id": "61", "metadata": {}, "outputs": [], "source": [ @@ -1400,7 +1078,7 @@ }, { "cell_type": "markdown", - "id": "fad563a4", + "id": "62", "metadata": {}, "source": [ "#### PyTorch Dataloaders\n", @@ -1410,8 +1088,8 @@ }, { "cell_type": "code", - "execution_count": 32, - "id": "dec6c703", + "execution_count": null, + "id": "63", "metadata": {}, "outputs": [], "source": [ @@ -1423,7 +1101,7 @@ }, { "cell_type": "markdown", - "id": "0a4bcc59", + "id": "64", "metadata": {}, "source": [ "### Model training\n", @@ -1433,7 
+1111,7 @@ }, { "cell_type": "markdown", - "id": "12b90a68", + "id": "65", "metadata": {}, "source": [ "#### Check which device is used for training" @@ -1441,8 +1119,8 @@ }, { "cell_type": "code", - "execution_count": 33, - "id": "76810a36", + "execution_count": null, + "id": "66", "metadata": {}, "outputs": [], "source": [ @@ -1451,7 +1129,7 @@ }, { "cell_type": "markdown", - "id": "3df5da88", + "id": "67", "metadata": {}, "source": [ "#### Define training hyperparameters" @@ -1459,8 +1137,8 @@ }, { "cell_type": "code", - "execution_count": 34, - "id": "a3f6513c", + "execution_count": null, + "id": "68", "metadata": {}, "outputs": [], "source": [ @@ -1479,7 +1157,7 @@ }, { "cell_type": "markdown", - "id": "7f0d3c3d", + "id": "69", "metadata": {}, "source": [ "#### Loss function\n", @@ -1503,8 +1181,8 @@ }, { "cell_type": "code", - "execution_count": 35, - "id": "596d2d55", + "execution_count": null, + "id": "70", "metadata": {}, "outputs": [], "source": [ @@ -1515,7 +1193,7 @@ }, { "cell_type": "markdown", - "id": "bbd59740", + "id": "71", "metadata": {}, "source": [ "#### Initialise model architecture" @@ -1523,8 +1201,8 @@ }, { "cell_type": "code", - "execution_count": 36, - "id": "37aaa6e0", + "execution_count": null, + "id": "72", "metadata": {}, "outputs": [], "source": [ @@ -1535,7 +1213,7 @@ }, { "cell_type": "markdown", - "id": "6e6283d1", + "id": "73", "metadata": {}, "source": [ "#### Optimiser function\n", @@ -1577,8 +1255,8 @@ }, { "cell_type": "code", - "execution_count": 37, - "id": "ad47d6b7", + "execution_count": null, + "id": "74", "metadata": {}, "outputs": [], "source": [ @@ -1588,7 +1266,7 @@ }, { "cell_type": "markdown", - "id": "c046cb9c", + "id": "75", "metadata": {}, "source": [ "#### Train model\n", @@ -1598,278 +1276,10 @@ }, { "cell_type": "code", - "execution_count": 38, - "id": "ee081ef8", - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "Training model weights...\n", - "Epoch: 0, Loss: 
1.520462845363756, Val Loss: 1.1197907672090046. The best val loss is 1.1197907672090046 in epoch 0.\n", - "Epoch: 1, Loss: 1.2096626140775473, Val Loss: 0.9215332592948008. The best val loss is 0.9215332592948008 in epoch 1.\n", - "Epoch: 2, Loss: 1.0686169616932417, Val Loss: 0.8447397245188891. The best val loss is 0.8447397245188891 in epoch 2.\n", - "[... epochs 3-258 elided: the training loss falls steadily from about 0.98 to 0.26, while the best validation loss of 0.5506142446045148 is reached in epoch 209 and is not improved afterwards ...]\n", - "Epoch: 259, Loss: 0.2604094168891872, Val Loss: 0.5730891451239586. The best val loss is 0.5506142446045148 in epoch 209.\n" - ] - } - ], + "execution_count": null, + "id": "76", + "metadata": {}, + "outputs": [], "source": [ "# Train the image classifier\n", "trainer = Trainer(model)\n", @@ -1889,7 +1299,7 @@ }, { "cell_type": "markdown", - "id": "1accead2", + "id": "77", "metadata": {}, "source": [ "#### Learning curves\n", @@ -1899,31 +1309,10 @@ }, { "cell_type": "code", - "execution_count": 39, - "id": "5ca13d69", - "metadata": {}, - "outputs": [ - { - "data": { - "text/plain": [ - "Text(0.5, 1.0, 'Learning curve of image classification model')" - ] - }, - "execution_count": 39, - "metadata": {}, - "output_type": "execute_result" - }, - { - "data": { - "image/png": 
"<base64-encoded PNG data elided: learning-curve plot of training and validation loss, titled 'Learning curve of image classification model'>
LO2BbVpqE0sPPkN8xNjFlZZtMvU0Rt/fU248ntkm+X36f+N7sbS31tsWDBQae9nLa//k9c8WDu9TfS36f165dm+GBX2bbpGRMwY24sY/4b7/99gx3nvYPKQsrWevAAcHYNs4aGR59sf3ZNfOQ3lF4anxcelIXE3r6sZ6S0ZG76w9/VtYHf+B5FMedHIMbtvuzjsC19sj+jFijkrquycYj6Zwio88nvetdPzPWDrEOgwW73JGWLVvWHG2ztujbb7/16rbtSVn9nvB6BnGsNWGmgie+V2YZ7GJvrgs+5ueffzZFzKzlYT3KuHHjTC1IdmCN1PPPP2+yHCysZdDPHT6zEtlVAHup3/kL2SbTe95LydimxnXGeinW26WHgaxcOAU34oZHDzx64k6ZadHMsMiwWrVqprDP9cuelTFRslPlypXNfxYOuh4pM52elSM9pr+5w2FzQkbZG/tojJkBVxeTBWHTFI/iOW4MMzjsacGmAtflIWZyzvcZZbQ+eKTIH1XX7A2zCvbtOQGbkNgjiOvetdswd/iuuLx8L8wiuB4p8/O+2G07PVzvfB0WzWYUVF7q94TNV/yseeJrcTtgsS6DCTtbwG2QvXh4YjMiAx4WwF5qcMP1w23qv//+O+/7Yc8fFkW74rbP3moXEwDwM2RTDoutXbM3OW2btLc1Zltci7JZVM33by+n/Z/342fvmh1L/ZvD7Yqf48Vsk5Ix1dxImiMX1sZwx5Lej5xrF2v7KMf1qIY9B5YsWYKchHUoHMvjo48+crv+gw8+yNLjuT74HtlTJjX7vXOnwB/21O3jzDhcKL4e1+13331nmqTYi4rdh23sIcUfRHaT5Y9iaqm7wafGnhuRkZFuvclYU8AeI8z4sOdGTsB1wB2ka/aLveFYF+KK9TLprWu+n4vdttPDzAuDQTYNpc5QZJYxyOr3hMG2K76WnUmym+RS34efF4Oe1N2QL4Y9LcP06dPTHcHcXn6+n9Tvl9spe/u4srfZ1AF/RtskP+fU30lmpbgNsKkuJ+ByUuoeTG+//bb5b/fQY6DCLCO3Qdd1lV7PJzY9cltgEJ8a151d7yMXRpmbfIrjeqQ3LgS7nI4ePdp0yWQxHbuussaAWQsWC/LoiueJO10ejXJcGX6peeTM9Djvn95O11c4ngnfF2se2MzBOgV2BWfanwHJ+Y4weZTKugZ2N+WRGB/PnRubTXgbuxQTj5y57vifBZkMdLZs2XLBy8vmCT4vfzB5JMtMTuqdEJsj+IPP8T14BM8aDe5c+Lkx0OIOKrMCbGYD2FTC4mWO08GjcXY55o9v6roHX+E2xXXA9c3iadYejB071uzMmXlyDfYYtHDZufO3u4Lb6971883qtp0evu6zzz5rmmLYdHjjjTeajNLy5ctNc6LdzTq1rH5PuN3w9dmdn93vmfXjzpFZIjtLwMewGz/fMzM4DEL42dnboCeanNjcxQCX2wlfl92RGbyw+zXrSvh+GOBxu2PX7nXr1uGbb75xy1AQA3Den++V2xSDHa739GrNmKniNs/1ywCW4y5xOdj8xuYu1+JhX+JysUmTNVsMPLie2J2fzYYMDPke7CwYxwLiNsH1xaDo33//df7muGJT5S+//GLux+8kP1sWlnO98rPl+kj9GMmCTHpSSR5kd8/N6LR3715zv0OHDjkGDx7sqFixoiMoKMhRpkwZx5VXXukYP36887nYxfHVV181XSXZDbRp06aOX3/9NU03T7ub9BtvvJFht9UjR46ku5x87Pm6Gqfutmp39+R/G7vXPv/88+Z9FCxY0HQl3rhxo+lu7tolOSN8PJe/Tp06potnyZIlTbdRdkO2sRsxuxqzS3HhwoUdN910k+kSmlFX8NTv2dWECRPMffg8586dS/c+//77r+PGG28074Hrn+uHrzlv3rzzvh9+vgMGDHCUKFHCvJ+GDRum22XXE13BU38+Gb1/PpZdYl19+umnpts73x/XPZ8zva7OUVFRZnstVqyY6c7
Obv+bN2829xs9enSa936+bTszn332mdnWuUxFixY13aznzJmTYVfwrH5PfvjhB0fXrl0dpUqVMp9JpUqVHPfdd5/j4MGDzvuwq3TLli0dRYoUMdsx18krr7zi1q35UrqC0+7du02XcG7jXN5q1aqZ9WV3Z2dX8Mcee8x0qecytG3b1rFkyZI075t+/vlnR7169cxQDK7dwlO/dzpz5owZxqFcuXLmc+Hnzu+ca1dqe5m5POfb/tJj/zZMmTLF7foL2Vbj4+MdI0eONMMucDm5HQ0bNsysF1fsLs/72eupY8eOjv/++y/d5eR753PUqFHDfPb8XrZp08bx5ptvun226gqedX78k5UgSCSv4ZEXa2XYG4dHjJK3sDi3adOmpqCXgwCKSP6hmhvJFzgqb2p2+zfT/JI3P1824bHgVkTyF9XcSL7A4lkOpc62bxZhsn6ABbscS6Rt27a+Xjy5RJxag/VDrHlg8bjdlZp1I6nHLhGRvE/NUpIvsGCU40iwqYIDlbHImEWobJLKSWPCyMXhqL7szcZu2izS5SBtLAJnc6MvZr0WEd9ScCMiIiJ5impuREREJE9RcCMiIiJ5Sr5rjObgawcOHDCDSnlyfhARERHxHlbRcGBTDpqZeuJf5PfghoGNek+IiIjkTnv37jWjeGcm3wU39tDyXDkcpl5ERERyPvZ0ZXIiK1PE5Lvgxm6KYmCj4EZERCR3yUpJiQqKRUREJE9RcCMiIiJ5ioIbERERyVPyXc2NiIjkbYmJiYiPj/f1YshFCA4OPm8376xQcCMiInlmHJTIyEicPHnS14siF4mBTdWqVU2QcykU3IiISJ5gBzalSpVCaGioBmrNpYPsHjx40Ex+eymfn4IbERHJE01RdmBTvHhxXy+OXKSSJUuaACchIQFBQUEX+zQqKBYRkdzPrrFhxkZyL7s5isHqpVBwIyIieYaaonI3T31+Cm5EREQkT1FwIyIikodUqVIFY8aM8flz+JKCGxERER81wWR2euGFFy7qeZcvX46BAwciP1NvKQ+JTUjE0bNxYGthuSIFfb04IiKSw7HLs23y5MkYPnw4Nm/e7LyuUKFCbmP4sMg2MDAwSz2O8jtlbjxk3b5TaDv6D9w6YamvF0VERHKBMmXKOE8REREmW2Nf3rRpEwoXLozffvsNzZo1Q0hICBYtWoTt27fj+uuvR+nSpU3w06JFC8ydOzfTJiU/Pz988sknuOGGG0xvspo1a+KXX365oGXds2ePeV2+Znh4OG666SYcOnTIefuaNWvQqVMns8y8ncu8YsUKc9vu3bvRs2dPFC1aFGFhYahfvz5mzpwJb1LmxkOCAqw4MT7R4etFERHJ95jpOBd/ad2JL1bBoACP9fp5+umn8eabb6JatWomONi7dy+uvvpqvPLKKybg+fLLL03gwIwPB77LyMiRI/H666/jjTfewPvvv4/bbrvNBB3FihXL0uB6dmDz559/mjFoBg8ejL59+2LBggXmPny+pk2b4qOPPkJAQABWr17tHKeG942Li8PChQtNcLNhwwa3rJQ3KLjxkMAAa0OOT0zy9aKIiOR7DGzqDZ/tk9fe8GI3hAZ7Zvf64osv4qqrrnJeZjDSuHFj5+WXXnoJU6dONZmYIUOGZPg8/fv3xy233GLOv/rqq3jvvfewbNkydO/e/bzLMG/ePKxbtw47d+5ExYoVzXUMqpiBYX0Ps0fM7DzxxBOoU6eOuZ3ZIRtv6927Nxo2bGguM1DzNjVLeUhwcuYmIUmZGxER8YzmzZu7XT579iwef/xx1K1bF0WKFDEZkI0bN5oAIjONGjVynmf2hE1Hhw8fztIy8PkZ1NiBDdWrV8+8Pm+joUOH4p577kGXLl0wevRo03xme+ihh/Dyyy+jbdu2GDFiBNauXQtvU+bGQwLtZqkEZW5ERHyNTUPMoPjqtT2FgYgrBjZz5swxTVU1atRAwYIF0adPH9Psk5mgVFMZsNmMzU2ewp5dt956K2bMmGH
qhBjETJo0ydT5MOjp1q2bue3333/HqFGj8NZbb+HBBx+Etyi48ZBA/+RmKQ9uLCIicnG48/ZU01BO8vfff5smJgYNdiZn165dXn3NunXrmlofnuzsDetmOJcXMzi2WrVqmdOjjz5qmsAmTpzoXE4+btCgQeY0bNgwTJgwwavBjZqlPCQ4UAXFIiLiXaxl+emnn0zBLnsoMVviyQxMetjUxHoZFg2vWrXK1Orceeed6NChg2k2O3funKn3YXExi5QZgLEWh0ERPfLII5g9e7ap2eHj58+f77zNWxTceDhzk5jkMFX6IiIinvb222+bXlNt2rQxvaTY3HPZZZd5PQv2888/m9e94oorTLDDomCOzUPsHXXs2DET8DBzw27iPXr0MD20iOPzsMcUAxoWMPM+H374oXeX2ZHP9sSnT5824wmcOnXKFFR57Hlj4tHohd/N+S0v93BmckRExPtiYmJMZqBq1aooUKCArxdHvPA5Xsj+W3tgDwnyT1mV6g4uIiLiOwpuPDzODSWo7kZERMRnFNx4uOaG4pS5ERER8RkFNx4suApKzt4kqDu4iIiIzyi48cb8UglqlhIREfEVBTcepIH8REREfE/BjRcyNyooFhER8R0FN95ollJBsYiIiM8ouPFCd3AFNyIiIr6j4MaDgp2ZGzVLiYhI9ujYsaOZvymzGbubNGmC/ETBjRcyNwnK3IiIyHlwbijOtZSev/76ywwxsnbt2mxfrrxAwY0HBSZPwRCfpMyNiIhk7u6778acOXOwb9++NLdNnDjRzLjdqFEjnyxbbqfgxoOCkifLjE9Q5kZERDJ37bXXomTJkvj888/drj979iymTJligh/Otn3LLbegfPnyCA0NRcOGDfHdd99d0usmJSXhxRdfRIUKFRASEmKarGbNmuW8PS4uDkOGDEHZsmXN5JWVK1fGqFGjzG2ca5vNXJUqVTKPLVeuHB566CHkNIG+XoC8JCh5nBuNUCwi4mMOBxAf7ZvXDgrlsPXnvVtgYCDuvPNOE9w8++yzphmKGNgkJiaaoIaBTrNmzfDUU0+ZmbBnzJiBO+64A9WrV0fLli0vavHeffddvPXWW/j444/RtGlTfPbZZ7juuuuwfv161KxZE++99x5++eUXfP/99yaI2bt3rznRjz/+iHfeeQeTJk1C/fr1ERkZiTVr1iCnUXDjha7gcSooFhHxLQY2r5bzzWs/cwAIDsvSXe+66y688cYb+PPPP01hsN0k1bt3b0RERJjT448/7rz/gw8+iNmzZ5vA42KDmzfffNMESzfffLO5/Nprr2H+/PkYM2YMxo4diz179pggp127dibgYubGxtvKlCmDLl26ICgoyAQ/F7sc3qRmKQ9SQbGIiFyIOnXqoE2bNiZ7Qtu2bTPFxGySImZwXnrpJdMcVaxYMRQqVMgENwwyLsbp06dx4MABtG3b1u16Xt64caM5379/f6xevRq1a9c2TU6///67837/+9//cO7cOVSrVg333nsvpk6dioSEBOQ0ytx4kEYoFhHJIdg0xAyKr177AjCQYUaGWRNmbdjk1KFDB3MbszpsRmJWhQFOWFiY6fbNuhhvueyyy7Bz50789ttvmDt3Lm666SaTqfnhhx9QsWJFbN682VzPYugHHnjAmXliJienUObGg+xZweOUuRER8S3Wr7BpyBenLNTbuGLw4O/vj2+//RZffvmlaaqy62/+/vtvXH/99bj99tvRuHFjkzHZsmXLRa+W8PBwUwTM53XFy/Xq1XO7X9++fTFhwgRMnjzZ1NocP37c3FawYEHTjZ21OQsWLMCSJUuwbt065CTK3HhQoDNzo+BGRESyhk1NDCSGDRtmmo3YLGRj7QszJosXL0bRokXx9ttv49ChQ26ByIV64oknMGLECJMhYk8pZovYDPXNN9+Y2/ka7CnFYmMGXSxwZp1NkSJFTPEzm8patWplem99/fXXJthxrcvJCRTceJBGKBYRkYvBpqlPP/0UV199tcm
s2J577jns2LED3bp1M8HEwIED0atXL5w6deqiX+uhhx4yj3/sscdw+PBhEyixdxQDKSpcuDBef/11bN26FQEBAWjRogVmzpxpAh0GOKNHj8bQoUNNkMOmsunTp6N48eLISfwc7LTuIwsXLjRtdStXrsTBgwdNYRI/tKxgCo1tkg0aNDARZ1YxKmb1OT9Ypt086YkpazBl5T482b02HuhYw6PPLSIiGYuJiTF1IlWrVjVjs0je+xwvZP/t05qbqKgo04bIIqoLcfLkSTM2wJVXXomcOIifCopFRETyabNUjx49zOlCDRo0CLfeeqtJl02bNg05bRA/zQouIiLiO7mutxQLn9j+yGKorIiNjTWpLNeTtwuKVXMjIiLiO7kquGFx09NPP22qszlsdVZwPgx7lEee2Eff2+PcKHMjIiLiO7kmuGFVNpuiRo4ciVq1amX5cexax+Ij+2TPj+HNcW7UFVxExDd82EdGctDnl2u6gp85cwYrVqzAv//+a2YrtWc25YpgFofDQ3fu3DnN4zhrKU/ZwZm5SdKXS0QkO9mj40ZHR5txVyR3skdeZk1tvghu2O0r9QiIH374If744w8zwBG7jeWUuaXiE5S5ERHJTtwZcgwWjttCHBPGHuVXcgcmLI4cOWI+u6yWnuTI4IZTuXOSMBv7tnPMGk4OxplG2aS0f/9+Mxw1Bw/imDauSpUqZfrBp77eV4L8k7uCK3MjIpLtOIou2QGO5D7c13P/f6mBqU+DGzYzderUyXmZIx5Sv379zBDPHNjvYmc+9QXNLSUi4jvcIXLaAB74xsfH+3px5CIEBwebACdXj1DsC94cofjrpbvx3LT/0K1+aXx8R3OPPreIiEh+djq3jFCcV+eW0gjFIiIivqPgxgsFxWqWEhER8R0FN14YoViZGxEREd9RcONBwXZXcGVuREREfEbBjQcFJld4axA/ERER31Fw40FBgXazlDI3IiIivqLgxoOC/NUsJSIi4msKbjxIBcUiIiK+p+DGgzRCsYiIiO8puPHCrODK3IiIiPiOghtvBDdJytyIiIj4ioIbb4xQnKDgRkRExFcU3HhQUPI4Nwka50ZERMRnFNx4UFCguoKLiIj4moIbb4xQnOiAw6HsjYiIiC8ouPGg4OSCYlLTlIiIiG8ouPFCQTGpO7iIiIhvKLjxUnATr+7gIiIiPqHgxgu9pShe3cFFRER8QsGNB/n7+yEgefJM1dyIiIj4hoIbb80vpcyNiIiITyi48TAN5CciIuJbCm68VFScoIH8REREfELBjZcmz4xTcCMiIuITCm68NTO4xrkRERHxCQU3Xioo1vxSIiIivqHgxsMCA1LmlxIREZHsp+DGwwKd49wocyMiIuILCm48LDjQztwouBEREfEFBTdeytyoWUpERMQ3FNx4qbeUMjciIiK+oeDGw9QVXERExLcU3HhphGJlbkRERHxDwY3XmqWUuREREfEFBTdeGsRPXcFFRER8Q8GNt+aWSlBwIyIi4gsKbjws0D+5oDhJzVIiIiK+oODGW81SKigWERHxCQU33mqWUkGxiIiITyi48VJXcGVuREREfEPBjYcFa4RiERERn1Jw47VB/NQsJSIiku+Cm4ULF6Jnz54oV64c/Pz8MG3atEzv/9NPP+Gqq65CyZIlER4ejtatW2P27NnImb2llLkRERHJd8FNVFQUGjdujLFjx2Y5GGJwM3PmTKxcuRKdOnUywdG///6LnCI40FqlsfEKbkRERHwhED7Uo0cPc8qqMWPGuF1+9dVX8fPPP2P69Olo2rQpcoKw4ADzPzo+0deLIiIiki/5NLi5VElJSThz5gyKFSuW4X1iY2PNyXb69GmvLlNoiLVKo2MTvPo6IiIikgcLit98802cPXsWN910U4b3GTVqFCIiIpynihUrenWZwoKt4CYqTpkbERERX8i1wc23336LkSNH4vvvv0epUqUyvN+wYcNw6tQp52nv3r1eXa7QEKtZ6pyCGxEREZ/
Ilc1SkyZNwj333IMpU6agS5cumd43JCTEnLJLaJAV3ETFqVlKRETEF3Jd5ua7777DgAEDzP9rrrkGOU2Ys+ZGmRsREZF8l7lhvcy2bducl3fu3InVq1ebAuFKlSqZJqX9+/fjyy+/dDZF9evXD++++y5atWqFyMhIc33BggVNPU1OEJrcW0qZGxERkXyYuVmxYoXpwm134x46dKg5P3z4cHP54MGD2LNnj/P+48ePR0JCAgYPHoyyZcs6Tw8//DByXOYmLhEOh0YpFhERyVeZm44dO2YaAHz++edulxcsWICczs7cJCY5EJuQhALJNTgiIiKSPXJdzU1OF5rcFdzO3oiIiEj2UnDjYQH+figQZK3WKA3kJyIiku0U3HhBWHL2RpkbERGR7KfgxosD+anHlIiISPZTcOMFYcmZG41SLCIikv0U3HhzrBvV3IiIiGQ7BTdeHutGREREspeCGy8oqPmlREREfEbBjRdofikRERHfUXDjBZpfSkRExHcU3HiBam5ERER8R8GNF6i3lIiIiO8ouPGCMI1QLCIi4jMKbjxl/yrgvabA59emjFCszI2IiEi2S5nCWi6NwwEc3wEkJihzIyIi4kPK3HhKYIj1PyHGWXMTrd5SIiIi2U7BjacEFrD+J8Sot5SIiIgPKbjxlKCU4Ebj3IiIiPiOghtPZ24S4xAW5GfOaoRiERGR7KfgxtPBDce58bcyNsrciIiIZD8FN14IbsICrKAmJj4JiUkOHy6UiIhI/qPgxlMCAgF/q5C4oF+882r1mBIREcleCm68kL0JQRwC/JPrbtRjSkREJFspuPFCcOOXEKv5pURERHxEwY1Xxro5p1GKRUREfETBjVfGuonV/FIiIiI+ouDGG5mbeJfMTbwyNyIiItlJwY1XmqVUcyMiIuIrCm68NL9UeMEgc/bUuZRu4SIiIuJ9Cm68NL9UkeTg5mS0ghsREZHspODGS5mbomHB5qwyNyIiItlLwY1XCopjEJGcuTkRFefbZRIREclnFNx4KXNTJDS5WUqZGxERkWyl4MZLNTdFQ61mqZPRytyIiIhkJwU33srcqKBYRETEJxTceKvmRs1SIiIiPqHgxlu9pVyapRwOh2+XS0REJB9RcOOtcW6SMzfxiQ5NnikiIpKNFNx4KXNTMCgAwQHW6lXTlIiISPZRcOOlmhs/Pz9n9kZj3YiIiGQfBTdeytyQHdxolGIREZHso+DGSzU3VCS5qPiExroRERHJH8HNwoUL0bNnT5QrV84040ybNu28j1mwYAEuu+wyhISEoEaNGvj888+RYzM3GutGREQkfwU3UVFRaNy4McaOHZul++/cuRPXXHMNOnXqhNWrV+ORRx7BPffcg9mzZyNHCAyx/serWUpERMRXAn32ygB69OhhTlk1btw4VK1aFW+99Za5XLduXSxatAjvvPMOunXrBp8LLOiWubHHulFBsYiISPbJVTU3S5YsQZcuXdyuY1DD6zMSGxuL06dPu528nrlJDm40SrGIiEj2y1XBTWRkJEqXLu12HS8zYDl37ly6jxk1ahQiIiKcp4oVK3pvAYPcMzdFCmryTBERkeyWq4KbizFs2DCcOnXKedq7d2+21dwUtTM3KigWERHJHzU3F6pMmTI4dOiQ23W8HB4ejoIFk7MmqbBXFU/ZWnOTGAs4HGqWEhER8YFclblp3bo15s2b53bdnDlzzPU5gp25seeXUrOUiIhI/gpuzp49a7p082R39eb5PXv2OJuU7rzzTuf9Bw0ahB07duDJJ5/Epk2b8OGHH+L777/Ho48+ihzBrrmxZwYPS2mW0szgIiIi+SC4WbFiBZo2bWpONHToUHN++PDh5vLBgwedgQ6xG/iMGTNMtobj47BL+CeffJIzuoGTfyDgl7xK41MyNwlJDpyNTfDtsomIiOQTPq256dixY6YZjfRGH+Zj/v33X+RIfn5W3U18lMncFAjyN6eY+CSciIpH4QJWJkdERES8J1fV3OQKLmP
dcEqJ4mHW5SNnY327XCIiIvmEghsvj3VTorAV3BxVcCMiIpItFNx4eaybkoWsuptjZ9VjSkREJDsouPHy/FJ2s5QyNyIiItlDwY2X55cqUdjO3Ci4ERERyQ4Kbrxdc1PIztyoWUpERCQ7KLjxcs1N8eTgRr2lREREsoeCGy/X3JRwFhQruBEREckOCm68XXOjZikREZFspeAmm2puTp2LR1xCki+XTEREJF9QcOPlmpsiBYMQ4O9nzh+PUvZGREQkRwY3e/fuxb59+5yXly1bhkceeQTjx4/35LLliZobf38/FAuz6m401o2IiEgODW5uvfVWzJ8/35yPjIzEVVddZQKcZ599Fi+++CLytVQ1N65NU+oxJSIikkODm//++w8tW7Y057///ns0aNAAixcvxjfffJPuTN75suYm/pzzqpQeU2qWEhERyZHBTXx8PEJCrGzE3Llzcd1115nzderUwcGDB5GvFSxq/T933HlVSo8pZW5ERERyZHBTv359jBs3Dn/99RfmzJmD7t27m+sPHDiA4sWLI18LK2H9jzqaJnNz9IyCGxERkRwZ3Lz22mv4+OOP0bFjR9xyyy1o3Lixuf6XX35xNlflW2Elrf9nDzuvskcpPqbeUiIiIl4XeDEPYlBz9OhRnD59GkWLJjfDABg4cCBCQ0ORr4WVSidzo2YpERGRHJ25OXfuHGJjY52Bze7duzFmzBhs3rwZpUol79zze7NU7CkgIdatWeqImqVERERyZnBz/fXX48svvzTnT548iVatWuGtt95Cr1698NFHHwH5vaDYPzkhFnXE/CtfxOpBtf/EOTgcDl8unYiISJ53UcHNqlWr0L59e3P+hx9+QOnSpU32hgHPe++9h3zNzy+l7iY5uKlQ1GqqOxObgJPR8b5cOhERkTzvooKb6OhoFC5c2Jz//fffceONN8Lf3x+XX365CXLyvVQ9pgoGB6BUYavuZs/xaF8umYiISJ53UcFNjRo1MG3aNDMNw+zZs9G1a1dz/eHDhxEeHu7pZcy9RcUuPaYqFbOyNwpuREREcmBwM3z4cDz++OOoUqWK6frdunVrZxanadOmnl7G3CdVsxRVKq7gRkREJMd2Be/Tpw/atWtnRiO2x7ihK6+8EjfccIMnly93KpROcJOcudmr4EZERCTnBTdUpkwZc7JnB69QoYIG8Mssc6NmKRERkZzbLJWUlGRm/46IiEDlypXNqUiRInjppZfMbfmeghsREZHclbl59tln8emnn2L06NFo27atuW7RokV44YUXEBMTg1deeQX5mrOgOG1wc+DkOcQnJiEo4KLiShEREfFGcPPFF1/gk08+cc4GTo0aNUL58uXxwAMPKLhxdgVPCW5KFg5BSKA/YhOSzGB+VUqE+W75RERE8rCLSh8cP34cderUSXM9r+Nt+Z5rs1RyM52fn5+apkRERHJqcMMeUh988EGa63kdMzj5nh3cOBKBmJPOqxXciIiI5NBmqddffx3XXHMN5s6d6xzjZsmSJWZQv5kzZ3p6GXOfwGCgQAQQc8rK3oQWM1drrBsREZEcmrnp0KEDtmzZYsa04cSZPHEKhvXr1+Orr77y/FLmkVGKqybX2ew4EuWrpRIREcnzLnqcm3LlyqUpHF6zZo3pRTV+/HhPLFvub5o6ttWtqLh6yULm/44jZ324YCIiInmb+iNn0+SZrsHN7uPRiEvQeEAiIiLeoODGWwolN0tFpTRLlQ4PQVhwABKTHNhzXE1TIiIi3qDgJhtHKWZ38OqlrOzNtsMKbkRERHxec8Oi4cywsFhSBTcuoxTbTVNr953CdtXdiIiI+D644VxS57v9zjvvvNRlyrOZG6pe0uoxpeBGREQkBwQ3EydO9NJi5J/gpkZys9T2wwpuREREvEE1N14vKE7bLEXbj0TB4XD4YslERETyNAU33u4KHncWiEsZkZijFAf4++FsbAIOn4n13fKJiIjkUT4PbsaOHYsqVaqgQIECaNWqFZYtW5b
p/ceMGYPatWujYMGCqFixIh599FHExMQgxwkJBwJC0mRvQgIDnHNMbVPTlIiISN4KbiZPnoyhQ4dixIgRWLVqlZmQs1u3bjh8OGVsGFfffvstnn76aXP/jRs3mtGQ+RzPPPMMchw/P5e6m5SB/FzrbrYcOuOLJRMREcnTfBrcvP3227j33nsxYMAA1KtXD+PGjUNoaCg+++yzdO+/ePFitG3bFrfeeqvJ9nTt2hW33HLLebM9vh+l2L3upm6Zwub/poMKbkRERPJMcBMXF4eVK1eiS5cuKQvj728uc4bx9LRp08Y8xg5mduzYYWYhv/rqqzN8ndjYWJw+fdrt5MtRiqlu2XDzf1NkNi6LiIhIPnHRE2deqqNHjyIxMRGlS5d2u56XN23alO5jmLHh49q1a2d6GiUkJGDQoEGZNkuNGjUKI0eORE7qDl4nObjZfOiMmYqBBcYiIiKSRwqKL8SCBQvw6quv4sMPPzQ1Oj/99BNmzJiBl156KcPHDBs2DKdOnXKe9u7dm30LnEHNDQuKCwYFICY+CbuOaRoGERGRPJG5KVGiBAICAnDo0CG363m5TJky6T7m+eefxx133IF77rnHXG7YsCGioqIwcOBAPPvss6ZZK7WQkBBz8u0UDO7NUszU1CpTGGv2njR1N/bYNyIiIpKLMzfBwcFo1qwZ5s2b57wuKSnJXG7dunW6j4mOjk4TwDBAohw5IF4GzVJuRcWquxEREckbmRtiN/B+/fqhefPmaNmypRnDhpkY9p4izlNVvnx5UzdDPXv2ND2smjZtasbE2bZtm8nm8Ho7yMlRCmUc3NRJDm42qseUiIhI3glu+vbtiyNHjmD48OGIjIxEkyZNMGvWLGeR8Z49e9wyNc899xz8/PzM//3796NkyZImsHnllVeQI2WSubGLipW5ERER8Sw/R45sz/EedgXn7OUsLg4PtwIMrzlzCHirFuDnDzx/FPBPyS6dio5H4xd/N+fXjOiKiIJB3l0WERGRfLL/zlW9pXKd0OJWYONIAs5Eut0UERqEisUKmvPr95/y0QKKiIjkPQpuvCkgEChZ1zp/YFWamxuVL2L+r1VwIyIi4jEKbrytQnPr/77laW5qWCHC/F+3T8GNiIiIpyi48bYKLaz/+1amualReSu4Wbv/ZHYvlYiISJ6l4Ca7ghs2SyUmuN3UIDlzs/f4ORyPivPF0omIiOQ5Cm68rUQtICQciI8GDm9wuym8QBCqlQgz59ep7kZERMQjFNx4G8fpKd8sC3U3apoSERHxBAU32Vp3syLNTQ2T627WqKhYRETEIxTcZIeKLa3/2+YAce6zgDdIDm40UrGIiIhnKLjJDtU6AkWrWNMw/DPO7aZapQs7i4qj49wLjkVEROTCKbjJDgFBQKdnrfN/vwucO+G8qVhYMEoUCjbntx0+66slFBERyTMU3GSXBr2BUvWAmFPA9IeBxHjnTTVLWdmbLYcU3IiIiFwqBTfZhZNm9ngN8A8CNvwMTOkPJCWam2qVLmT+bz10xscLKSIikvspuMlOVa8A+n4NBAQDm34Fts0zV9dIrrvZouBGRETkkim4yW61uwN1e1rnj2wy/2qVsjI3apYSERG5dApufKFYNev/8R1uPab2nzyHqFj1mBIREbkUCm5yQHBT1PSYCjHnt6rHlIiIyCVRcOMLRata/4/vdF6lomIRERHPUHDjy8zNqb1AQqxb09T6AxqpWERE5FIouPGFQqWAIM4G7gBO7jFXtapazPz/a+sRHy+ciIhI7qbgxhf8/NLU3bSpUQL+fsD2I1GmsFhEREQujoIbXylWxS24iSgYhCYVi5jzf21R9kZERORiKbjxlVSZG7qiVknzf6GapkRERC6aghtfBzdHtwL//Wh6TrWvaQU3i7YeRWKSw7fLJyIikkspuPF1cLNjPvDDXcBPA9G4QgTCCwTidEwCVu9NmTlcREREsk7Bja/HurHtX4HA+LPoVKeUuTh9zUHfLJeIiEgup+D
GV8LLAxVaAiVqA4XKAI4kYO8/6NW0vLl5+poDiE9M8vVSioiI5DoKbnzF3x+4+3dg8D9AzS7Wdbv/RvsaJVA8LBjHouKwaNtRXy+liIhIrqPgxtfj3fBUua11efdiBAb449pGZc3Fn//d79vlExERyYUU3OQEldtY//evAuKinU1Tv284hNiERN8um4iISC6j4CYnKFLZqsFJigf2LTeD+ZUsHILouET8u+ekr5dOREQkV1FwkxO4Nk1tnA4/Pz+0qV7cXFysuhsREZELouAmp2hyq/V/xWfAofUpwc32Y75dLhERkVxGwU1OUb0TUPc6wJEIzHgMbapZwc3qvScRFZuQ9v5x0cDMJ4G9y9LeFnUU+OFuYNci5FhJ6uYuIiLeoeAmJ+k+CggKA/YsQcVDc1GhaEEkJDmwbNfxtPdd8x2w7GPg9+fS3vbPx8B/PwBzhiNHOvAv8Go5YO5IXy+JiIjkQQpucpKICkDrwdb5hW+ibXL2Zkl6TVOH/rP+H1wDJKbK7Gz9PaX3VXQ6gZGvLRkLJJwDln8KJMT5emlERCSPUXCT01x+v5W9iVyLXuEbzFV/bDqc9n6H1lv/E2KAI5uALb8Df70FnD4AHFydfCcHsPNP5CjnTgAbfrHOx57KecsnIiK5noKbnCa0GNB8gDnbctv7qB+wH9sOn8XmyDMp93E4gENW4GPs/Qf48R5g3ovAFOuxTtvmnf81488BM58AVn8Hr1s7BUiMTbm84Wfvv6aIiOQrCm5yojYPAsGFEXBkPWYEPYF7A37Fr2sPpNx+cg8Qd8a9mYdZENq71Ppfvpn1f/t8KxiidT+kX4D893vAsvHAr48CMcnP4y3/fmn9r32N9X/TjLTNaiIiIpdAwU1OVLgMcM9cq/cUgIcDf8K8NdvhYBDALIvdJGU7vj3tc1w5HAgIAU7vA45utWpzfrwb+OZ/7nUuJ/cCi96xzrMO5r8fvfe+jm4DItcB/kFAz3eBgsWAc8fNnFoiIiKeouAmpypVB7jpSyQVq4FCfjG47OTviPrseuDNWsD6n6z7VGzl/piOwwA/f6BwWaBKe6Bya+v67fNSuoXHnAR2LQQWvw+8WgH4uL0V1AQWtG7/9+uMl4lBFYOj95sBp/alvf1MJLArk0DFrq+pdDlQqCRQJzl7s26K9X/9NGDeS1YvKgZCkrMxSLazgiIiOYiCm5zMzw/+yfU3zwd+jUL7FwGxp1OCgdpXAwUirPMh4UC7ocC9fwADZgL+AUD1K63btv9hupc7sbZmwWtW0xYLfP0CgJu/AfwDgf0r3et5bJtmAuPaWT2xjm2zmrhccSfHwOfzq4EdGRQJ71xo/a96hfW/8S0pQc22ucCUfsBfbwKL3gZ+fQQ50rmTwE/3Za2WKS87fRB4px4wKXnwSRGRHMTnwc3YsWNRpUoVFChQAK1atcKyZenUhLg4efIkBg8ejLJlyyIkJAS1atXCzJkzkWc1uRVJASEI8YtPe1uZBkDZJtb5ml2BwGCgXFOgWDXruuqdrf/M2ux2CW44Bg4DmxK1gDumAvf9CdS4EqjV3bp9Sn+r95Vr4PLHy4AjycoKUeqmpD1LTQ8vY82k9Afts7NHdnBTqTVQtIq1LFPusq6r0DLl+c+m00sswaUYOTNcnq1zM7499ixwYDUQ61K7lBXLPwHWTgLmvoB8bd33QNQRYPNvF74Oz2fHAmDs5enXh4mI5PTgZvLkyRg6dChGjBiBVatWoXHjxujWrRsOH05np8ZBeePicNVVV2HXrl344YcfsHnzZkyYMAHly1uzaOdJocXgX/8Gc/a3xBb4IuGqlNtKNwCa32UFCK0fSPvY0vWBQqWB+Ggg+igQEAyEJGd66PIHrACoTEPrcocngQJFgKObgW//B0wdZAUBbE46vB4ICgVu+DgleEhymbGc00bYNv0KxMe4L8uRjdYy8DnKXWZd5+8PNE4+8mdBNJfvf59bxdAMpDYmdxkn7ui+7Qu8XAqY/2rm64x
j+3x5PfBNH+DIFvfbuMw/3AWMqgCM72Ddj7VMa7+3MjJxUZk/98bpKU10HCU6v3Jm7hxWPVd6uC7/GZ+1sZaObQeO77DOLxhtbS//jPPc8opIvuLT4Obtt9/GvffeiwEDBqBevXoYN24cQkND8dlnLjtKF7z++PHjmDZtGtq2bWsyPh06dDBBUZ7WYzRw3Qf4o+6LeDehN04GlwHKN7cCl/q9gIfXpPSOSj0hp529IQYVtbpZ51nM2/hm9/uXbQw8vBpoPcSq3eEoyB+1SRkFucltQJV2VoDE5jE7U8PpHjZMs84HF7JuYzMTd272NAt2kxSzNcww2VyXodkAIKI8UK9XSnMVm4F+HgJ8ehWwZZZ1PcfzSR20uFo72Rr/hztevofUO2VTNM1aET+rGe7nB4Cp91kZmdTNba5YfG2PIcRpMrhTXzMZ+O0pIDGdzBrFnAY+aAFMvgMXhNkQBnPsCZfTsEDd/uyJ6zA9i8YAvz1hBcnne68TOgHj2ltBrN2EuvMv1fQwU5nf14FIbgpumIVZuXIlunTpkrIw/v7m8pIlLk0oLn755Re0bt3aNEuVLl0aDRo0wKuvvorERJcMQiqxsbE4ffq02ynXKVgUuOwOXH1ZDRxHOLonjkHCgN+t4OV8XIMbFvK2ug8ILQ5c+TwQVDD91+r2CtB/BhBeHji52+rhZA8wyFoePg/ZxcOc7iExzmoSa9bfum7GUODV8sBP91o/zuyS7tokZSta2XpMyTpA+8es6+pdn9I09V4T4N+vrMtNbweqdgCSEtJOO7F3OfBJF6tJbFXy/e1Ah9kaZgXY02xBctan83PAte+k3IeZIto2xwrWvu4NrPzc/TWYkXLFgI21Qcww2Bkd7oxmDQPeqgPsW2k1sRzdYmWhMgrIuEMf2yrlOYgDHTKYY4E1l9sVM0bMiBx0CTBcnT1iBVXe4uxR55d5cGMHo1tnW+siI2wy5RAEcWeB75LrsCjqsLXu8iuul1fKWnVoIpI7gpujR4+aoIRBiitejoyMTPcxO3bsMM1RfBzrbJ5//nm89dZbePnllzN8nVGjRiEiIsJ5qlixInKrdjVLoEhoECKjkvDPrhNZe1C1TinnGZRUaA48ucNqzspM5TbA4GVA5+eBwuWAlgOB4tWt26q0TQk+uNNiJsUen6dBb+v82UNWdoT1PTMes3ZwVNOlWc3GbuGD/wEKl04JeBgoMeBgwXOJ2sCAWcD1Y4Fr3rYKn/l8c0ZYO37uyNnUtG+5lYFhExq7wbPY+vR+4LNuwHtNgderAyd2AWElrSa5y/oBFZMDNWbBaPsCq2s8M0+/DgV2L7YGOPyoLbDkQ+s+EcnbELMqbPKj9VOBqGPWay39EDhz0BrTx7WQ2+7l5opNYr88aI0yzbGGbPb6Yk82Bkg2BkifdrUyIl/dkDaI4bxdb9UGRleyltnTgyQyeLNrqpjJs6f5SI31Uq7ZHTuoTI/rKNVsuiQOF+Ca8Usve+RaF5YbsIn3v5+yPuUIs47MEP43FXkaA1sG5BfzOP4+iOTEguILkZSUhFKlSmH8+PFo1qwZ+vbti2effdY0Z2Vk2LBhOHXqlPO0d+9e5FZBAf7o0cAq6J2+xmVQv8ywyzV34mzGSp01OZ+QQsAVjwOPbQSufiPl+srtrP8MAFibwx/g+jdaJwYlzMBcdmdKFmfFp9b/VvdbdUBZ0e1Va6C/GycADyxJ6dZeogbQ/nHr/N9jgPebW720Tu1J2SFSvetSAi0GPRSfXE9zxRNAcJhV89PnM6Dtw8Ddc4DQElZx89KPrPvxfX1+rRV0cC4vvgZ1esb6bw+cSFvnAL89aQUXrB0y62eee+E1Mx52E8P8UcCk24Dfn7UCG7OcK61gh01cdqaLNs+0Aoq36gJjWwCH1qUEAovfc19v7AnH5WZgyWX+/k7gp4GeGyiRo2Cf2GmtK2a/mL05tRfY84+1jPbr2MvPQJC98bitZFQgbAc3dj0YP0dmGDMLbthLi9s
eewLauN6YRcpJzThsurR7H05/CPhhQPrF6CzMZrOc6xhWdlDL+qOcVt/FzgHc1i8VAxS+bw4vwaEkLiTI/qgd8GFrzxe0S57gs+CmRIkSCAgIwKFDPMJPwctlypRJ9zHsIcXeUXycrW7duibTw2au9LBHVXh4uNspN+vZ2ApuZqw7iLOxWdxhXfcecO88a4fuCazNYREzm6Kij1k7MDbxsJmMJw4geN37QPfXgOI1U4qfu1xADyNmjm75Fmh0k9UU5qrTMKDvN1bPLQ5SaI/KfMdPQKO+VtbGzsywdiisFNDvV+D2H4Fe44AW96Y8F2t8rnrRyhaxxxgxOGC2iu+L51kEfeUIK1jie+PginxeG4uwzQCIyfU6t35vLQN3+naxLXfYbGJhE9/hjcCfo61mLteiWQZfDFw4nQbrluxmH2ZfWHd05oD1uszG9UgONhd/YHXLNsvtSGk6Y5aLQSADCza78XSpGGjYNUDXfwCEl7WaE2lidytrxvdlj61EXGdNkovGF4xK+5zMdtnNnn0+tWq2eP/kASzNTtSu2+KOnzuyU/tTmqvYJGq/98m3AxM6p62z8hU2hbKplPVEzOwxa2P3uLM/Mxuzhcx08fM0j91pNQkTM5iuWTBPYLPnj/cCZ9x/f7P8vr64Dvji+qz3XswIx7Ti++SBwoV8buzQwIMNZki5bi8Vg0sOj+HaSUJytUBfvXBwcLDJvsybNw+9evVyZmZ4eciQIek+hkXE3377rbkf63Noy5YtJujh8+UHraoWR7USYdhxNApfLtmFBzrWyP6FCAgE7l9s7ZRO7LayKgWLpL1fUAEzEKFppmk/1LrsKXWvteqJmNXgFA4MhpiZ4ok79oDkLM79S6wRn9NbvtRqXJUSBLS4C6jVA/jnI6DFPVZGylXJulbzV6l6Vjd8ZpHMcl0HVO9kNd3ZWYUila2AkHU3rOMxmRUA4RWsgIXPzXGKdsy3shtsSiP2kmPGw54Sg8tjAoAwa2fOTBADO2aMuJ4PrLIey4lXG/Sx1jfvO28ksHwC0DS5GSk17qCY3mcAxrnN0qvlYlaE03MwI8SgsXYP6/oKzazMgl2zxKk82Fxlv3cGjHz/3HGZ8ZaWWs2j3G6YEbM/J7Mer7KaTJn5Yl0V3wdHsI5cYx3Vf3czULdnSsE5bZltNTVyG7BrfLhe7IDKU7i+s1Lj5uqvt60DAHt4BRvnVuNYTnY2lEXz+1ZY57f8Zq1r16ZIYpaE1x3ZDPT6MP16uQt5L6wL44HBsa1Wfd2FHPgwMOI2zICETZJ2VvVCcVuws7r2AKJtH3Ffzwzy2DGBw16wCZrfD37XWRvn+jhmii8Ws40MzPk9K1XXyvpmxNQP/gFUbAmEFL7415S83SzFbuDsyv3FF19g48aNuP/++xEVFWV6T9Gdd95pmpVsvJ29pR5++GET1MyYMcMUFLPAOL8I8PfDkM5WQPPJXzsRldXsjafxx5A7qcZ9gSKVMr5f6XrWUb499o5HlyEUaNgH+N9EoKVLNsbeYdojPWclsLF3xMzS8MQdOH9QGSilDmyoWseUgQgb3GidZy2QnZ2qkVIoj8ptgWb9rPP8Mf/3G+v8jeOtnXn/mdZ9iDU6m5N30hzB2X4e1g/1HJOyE+IO4OrXrddk0LT6W2BjctaGQYIdSPJ9MIvEnaNd1Mvsy2tVgDnDrWVhjQ5Pb1SzMg1sTmPAw4wJsyRsEmFQxsEbWYze9aWU91azW8p/jorNHffHHawxcBicsKaJWTG7PofNWnbB9pIPUqb+sJtMA0Os98bPsGaXlEwHgyZ7MEm3OiIH8H0/q/7KNcPE+hYGDambc6Y9YNUs8fas4rr45Epr7B3uYLleWMxtZ5TMYjisnSyDD2LwZmciggu7N7fSiolWgMF54tgsZwe8DDLZlGkHN8wK2gEbM1+s20pd6H4+7OXHrvY2Zg4Z2BC3C2ZwXDMWvC97DbI2iLVTXE5mxdjEyZHDXYdo4LKePgD88YrVK27
2s1ZwxnXP8xxLKj3MGP1wd0oQz+8cty8GL3bTJpeDA4eOa2u9Povuv7wOmPm4+yCazHSa4v0/s17P5Io9IO0DiPNlOPk9+/pGYNbTyDEY/F1Ik14+4bPMDbFm5siRIxg+fLhpWmrSpAlmzZrlLDLes2ePM0NDLAaePXs2Hn30UTRq1MiMb8NA56mnnkJ+cl3jcnh33lbsPhaNb/7ZjYFXJBf6yqVh1uKuWVbTT6FSmd+XdTc1OgPVOlu1Oxz/h3UodtE1g5LZybU5PLLl5csHA0vHAknx1thCPAK1j1IrJU+lwR03syCcDoOZKQaFPLru9KyVgXLFbBCvZ2aGWRU7qGN2wxZW3Aq+uKP9+x2gTs+U5fr7XZcn43I4gP0rrB9vrgMGRWxuY5BiN8Nxig97VGziUe7QTdaycafJnZE9ZlGXESnd/lm7xWXgzvCd+lZXfXtIAb4us1+pcX1xfbCWh5kcYhBg71yb3A6s/jqlez5rvlhzw2YOBpEL37KWi6N2s36MO8DVyYEln7dqeyso4lE4g0AGy8Qiba5z7jQ4qCQzj3aPMPZoY3MjgzTWlLEYnhgM/DzYylI9uMoK2rjM7N3HJtLJt1lNdK0HWwEsn4fPy50ph0cwH4G/9dlzBHK71oiF/AtfT6kbszNC9vIys8H1yuYZFt6zRyMDrT2LrQE+2XzHJiTiMA/crjnKOHEQTwZhm2cAvz8PdH/VCjpYFM/glNk0BqLstWZj/RDXjc10KlhhZZxsfBzrvfj+2NzDjgnM9toYILJmigFW8RrAtWOsbZKfDYvkWaTPzKyZIDg5CHXtSbjqS2s9cX3x82H28pPkXqEMsG/5LuOsCtcNp57hd529L4kZU9dMIMdl4u3psZtbuTzXvuv+vnggwHqqywelDM3BYIvF9FzOOlfD4/j8n3azPud+01N+R3yNzcc+zmz5ORw5qfrO+9gVnL2mWFycm+tvvl+xF0/+sBYlCgXjryc7o2BwqroU8S1+rVgoyaPRIcuBIhWtI9Kvb7B2XL0/tbJONu5IRyXX+NDVb7pnozLCI24WVNs/uswSDN0AFHDZtrlTt3/8bQyAeFTNI76OTwFtH7Xqp9hcwmCCc5C5Bj32zpDNka6ZsdS4c+AOlztyFrOnvo21Q9xZMnBiIMnmFc5Tll4vOvrkKmBfciEygyy7KJwe32YtL4+6GdiwOZBZhmXJdTi2pndY2UNmKDiysp0pYlbEDpTYW+725JqYL3pazWH2e+ZOy37dCi2sz9TupfO/L6yxphhA2IXRLFhnAMLPklk5NlFyh8mmR+4MmfHh58UdumtXd9aCsfnQxuVjL0Jm1WwMehlwdn0FaDME+Ky7e488DorJQIc7bJP1caRkJa56CWj7EDDxGmD3Imsb406cPQ2p1SDg8Ia0RdwMVjgGFXfS9vtmvRtfJ7BAct2Nw2o2ZZDDgIU1RfY6u/7DlCZRrksGeswYcfkYePKAgBkbjmWVGsfjumEcsPILKwvLjJndJMXPgnVl3/V131b5mTHwN4OcDnAPxvne+PkyMGLWlENf2OvDxh6ZLZKzSqm9XT8l6zXgN+sAxfbLQ8CqL6xaPX5uzLIyEGNvSG67j6yzDjZS4/bAJmVmiFPXF57Pn28A85N7CxetCgxaZAXyrr9D/I1wDcK8jRk3Fnoz88zfOZcERXbuvxXc5FLxiUno9OYC7DtxDs9fWw93t6vq60WS1LhD4xEMm2VsTNlzh5Zer7HxHa0ffWZsuKPNao0HfxxZl8IdR4maQEmXnaFt2QSreJk7ZmZJbmaTicM6MuYPvCv+JDDo4RE0mxy542HXdDbBsSbhUtcJi4C5w89K7z1mWNgcQvyh5Mz2xCzFkHR6X7G5gpknct3xdnjKCjiYNUsdtHFnxGwMn5PZEH4GXCdcD3aQxx08M1P2Y1iozeCFO04Wt3OHad9mY28/FsVnhMElC6D5PMx0PbYZeLexlc1i8MCdOps/32loFc8yK8TatekPW82
DfF0WcrNpkhkefr527ZMrvhcGJSz+vms28Ho16zUfWg0Uq2oN5cBMlGsAxeJ8FupyHTJbx/ViRvFODrgZXHH8HTvYYZE7RxdnVs7OtrCGi+ub21DDm6wglVkeblcMWphhscfMsgdtZLMmM3oc6oBZor5fpQw8SpH/WdlBrmsuV8enrWlW+B75fWFzpx2YEp+r+6iU4IrNW3YWiOuPAfHoytZycngMjrTO4IjbWtlG7uuRQTjfn63doynN0ByRnZMa2z0o2QOTHRXs73Tq+7vi58mmRmb4uKyu2AzI5lsGURxAlU2MzObywMEOIrjO7MCf2aie71vrh1lBTpvDLBxr8Jgt5rbFoMMeld4VM2o8sGHTKter3ZzPJlIeNLBTRe3kKXpsLJLndsf3a0+EzCZKZhuZrWYnDg9ScJMPghv6btkeDPtpHUoWDsFfT3ZCgSBlb3I1dp/mDoQ/gPaYP57GoIXj/FzoEaKv8KiTR75McXcfbR3ds4nGtUnIFYMZ7ry5g2Wmgt3lXZvfKrayggG7iz4zWGxe+Kh18thMyTtEBk7cObO+gztkBimsUbILpa95y6q/YHMVn49NUDzy5jhE3MnwsTx6t5spM8LmIHbnZ0DLed74/Ax6uKO1dy6sZWHTU9eXrewKd2hsGmLmg8EXC6xv+sIaBmDa/VbTEwOHYzus5iFmIT5oaWV8uPNjbRB34MwouvayY5aBgSwLndlLMTXejztiFgKzCJnn2aRFN39r7dxYZ8OdMfV8L3nnmmo6nTKNrKCFPS4zwtdiEJReofOsZ6zRxDl8Q+r1y4wRm90Y4Jgd9SYrS8MejAzu3m2UEgC2vM8K9L/pDURUAu6ZY60nO0BhQGwP+2DXPZksV3JgzICXAQszYww22c2ftUNcbga/DG441IONAUj/6VbAyKCQ743jNdkBI5+XvVrtJi2uA2bm7N6gzucJtbYFbjcspucEyRxj7KvkQntm0E7ttT779HDZOF0Ps172NsaMGAND+7Oys50M6JiB5vrkOnpwpdXUzGXjNDgMYojfgT4TgTrXWgE6g/HU2WkPUHCTT4KbuAQre7P/5DmMvK4++rXJ5MdCJC9g0SizDBxqgMXq6WHPKR71c8fDo3kejbOImrU43MmxAJZjztB9C60mDB79cydHfO70et9wOhDOXF+oDPDIWisLxWCLOxLiMAOcl431Nuz1c9XI878fZvJ4hM0sVkaF+ewuzxoaZoKY4mdR9SSXkZw5zAHrh4hNgtzRujZLEudNY0Bgu+JJoLPLjvdicJ2yaY2ZL04Bw4CZO0NO2cKhFAb9ZfVg4/pgYMFmJJ4YFHiwqSLTjOb0B61gjs21JWtZwSjPczwr9nhkBpEF6/bOnJkSNr/Zo3Cz6Y7bB98bp1phk2fD/yWPWeWSJbMzJ/zcGXDY09EQsyZmGIhMuvOzWZTBNQO/e+dbzUh2ExqzeszksQaJwYjrwIUMENn0ySEtGGyzaczOToZEWJk+BvSsZ2LAz+Zvu+aKWcd2Q62sCwu1+Xr83NiDk4EWA1Z+dnazsGuTHQc0nZ3c2YfNfwykGOC0echqKmYT7ONbLq1XXzoU3OST4Ia+WrILz/+8HuWLFMSCJzqagf5EJJ0MEJsIuXNgTynWfTA1zyNgG6ftYN0Rf+zTaxLkTyV3EtzZ2Wl9NlVN7GHtgFhIzB9zNkNwHjdv7cC5HBxcknUibEpjhuh8TZgcZJFNWFxOvj8OtMmeaZeChdfcwbHJqVqHlOsZ9JlJel1qP3yFBbcsUnatqWHwymwgcTnZXZ/NJ649HDl5a+pxmeygiAN/smmVvbTYhMfH27VyDyy1al+mDUoZf4e1OcSsmhlw0M/aDlmszcxbpTbWEA8fXm5lgZhNZNaDmZRdf1m1WMymMRBmForZxJUTrYDxjmnWeFM2jgvFDGPFy60i/vQKozmy95znUwYPtQvzGZywqXL+K+5jDjFQYbDPjByDH/aWZDaRQRS/P2xOY/bJOS0Le2n
eaa1nD1Nwk4+Cm5j4RLQd/QeORcXhnb6NcUPTCr5eJJH8xZ71PKMeNt7A5gxO2MqiYtf54zLDQRCZ1Und6y6vYy0Jx+Y5G2m9fw5LYEaDXpfSw4o9jVwDRO4WfxliZX3smiXbo+utXmXMhLGZkEECm6uYler/a0rWiD0j2SvQDKaZQbDMTBebERlosuCeNUEMluygi82b7OUWUcH9cQyqWS91sZmRpESrBs0uRraDnF5jrffGyX7ZPMesD4cvYHbm/ctSxuAiNkH1/dp6bwwiOQ6V3bGB0+Vc7PhHmVBwk4+CGxo7fxvemL0Z1UuG4bXejdC4YhFlcEREMsK6HQYfxB5bdp2LK+4a7W7hzNRwbBtmSwanqoEhjuvDGprMehJmBWuuOC0L63aYSWHzIXszeoMjeVRv1lsxIzR4uTW9jT0KNQM6ZiDtwIyDtjKjxXo1Ns0OmOke0DMjysEQGXBx2pwLHfQyCxTc5LPg5tS5eJO9sadjaFa5KH6836WLooiIpGCROOulmMVhrU1Ws2XM4oSV8O6ysWmKGR12qb/UpsPzYQDD4RkY3LE+J4dTcJPPghtaufs4Jizcid83RCLJASwZ1hllIzxbzCUikmcw28J6kuwobpZs33/rU80jmlUuhnF3NEP9ctaAVct2uoz1ICIi7ljUq8Amz9Inm8e0rGq1gSq4ERGR/ErBTR4ObjYePI2Xf91ganJERETyC59OnCme16KKFdxsPXwWAyYuR+TpGHP5uWszGPBMREQkj1HmJo8pFhaMWqWtwbPswOaHVfvMeDgiIiL5gYKbPJy9oUIhgTgZHY+Z6w76dJlERESyi4KbPOjqhtZw3P9rVgGDOlQz579YshvRcdY4OCIiInmZxrnJow6fiUGJsBAcjYpFm1F/ICHJgeBAf9x3RTU81rW2rxdPRETkgmicG0GpwgXg7+9n/o/u3QgVihY0s4i//8c2LNl+zNeLJyIi4jUKbvKBPs0q4K8nO+G2VpXM5ZHT1yMhMcnXiyUiIuIVCm7yCT8/PzzetTYiCgZhU+QZfLlkt68XSURExCsU3OQjRcOC8VjXWub8qzM3YvG2o75eJBEREY9TcJPP3HF5ZVzXuJwpML7vq5V47Ps1+Hn1fl8vloiIiMcouMmHzVOv92mE5pWL4kxsAn5ctQ8PT1qN9QdO+XrRREREPELBTT5UICgA39zbCh/f0QwtqhQ1101Zsc/XiyUiIuIRCm7yqZDAAHSrXwYPdKphLk9bvR+xCZqiQUREcj8FN/ncFTVLonR4iJmiYd7Gw75eHBERkUum4CafC/D3Q+/LKpjzHy7Yhv0nz4GDVuezgatFRCQP0fQLgj3HotFtzEKci080UzQE+fuZupxXbmiI7g3K+HrxREREoOkX5IJUKh6Kn4e0NcXFnKIhKi4Rx6Li8MA3KzF2/jbM33QYMfGqxxERkdxBmRtxSkpymNGLQ4L8MW7BdkxZmdKDqkapQpj6QBsULhDk02UUEZH86bQyN3IxONFmvXLhqF6yEF7r3QjPXVMXV9QqiSKhQdh2+Cye+nGtanFERCTHU3AjGQY697Svhi/vaonP+rdAUIAfZq6LxKeLdvp60URERDKl4EbO67JKRfH8tfXM+VG/bcKyncd9vUgiIiIZUnAjWZ6TqleTckhMcmDwt6uw70S087bVe0/itVmbsPd4ynUiIiK+ooJiybLouATcMHYxNh86g3IRBfB4t9pm4L8Z6w6a2ysWK4gfB7XBgVMxCC8QiGolC/l6kUVEJB/uvxXcyAWJPBWD2z5Ziu1HopzX+fkBEQWDzCjHHCeH3ckLBgVg+oPtTC8rERGRS6XeUuI1ZSIKYPJ9rdGySjFUKR6K/m2qYPqQdvh5cFuUKBRsAhvigIAPT/rXeVlERCS7KHMjHsOam//2n0LtMoXR+6PFOBEdj8GdquOJbnV8vWgiIpLLKXMjPlGxWCh6NCxram04dQN9sXg3omITfL1oIiKSjyi4Ea/oXr8
MqpYIw9nYBPyy5oC5jlM48Py8jYfMaMgiIiLeEOiVZ5V8j4MA3tqyEl6ZuRFfL7WyN+P+3I6jZ+PM7XXLhmPUjQ3RpGIRXy+qiIjkMcrciNf0blbB9J5af+A0Xp6x0QQ2ZSMKoHBIIDYePI0BE5e5jZcjIiLiCQpuxGuKhQWjZ6Ny5jznp2KmZuGTncypYfkIq+D4m1WITdCM4yIikseCm7Fjx6JKlSooUKAAWrVqhWXLlmXpcZMmTYKfnx969erl9WWUi/Pi9fXx1v8aY97QDrilZSUEBfijaFgwPrztMhPwrNl3Cg999y/iE927jDOz8+L0DXj51w34aMF2zN902NTviIiI5Piu4JMnT8add96JcePGmcBmzJgxmDJlCjZv3oxSpUpl+Lhdu3ahXbt2qFatGooVK4Zp06Zl6fXUFTznWLT1KO76fDniEpPQokpRFA8LQeUSoahXNhzP/LQOUXHuGZ3S4SEYd3szNK1U1GfLLCIivpGrRihmQNOiRQt88MEH5nJSUhIqVqyIBx98EE8//XS6j0lMTMQVV1yBu+66C3/99RdOnjyp4CaX+mPTIdz31UrEJ6bdDDlQYNPKRXDgZAyW7zyOyNMxCA7wx7s3NzFdzkVEJP84fQH7b5/2loqLi8PKlSsxbNgw53X+/v7o0qULlixZkuHjXnzxRZPVufvuu01wk5nY2Fhzcl05knN0rlMaP97fBn9tPYrQ4AD8ueUIFmw+gmsalsXbfRsjJDDA3I9NUkMnr8bvGw7h0e9Xo1LxUNQvF+HrxRcRkRzIp8HN0aNHTRamdOnSbtfz8qZNm9J9zKJFi/Dpp59i9erVWXqNUaNGYeTIkR5ZXvGORhWKmBMNaFvVTNDJualYT2UrFBKIj25vhgGfL8fCLUcw6OuVZtoH3ufJH9aAw+a0rlYcfVtURFiIRjgQEcnPckRBcVadOXMGd9xxByZMmIASJUpk6THMCjGFZZ/27t3r9eWUSxMaHOgW2NgC/P3w3s1NzOzje4+fw0OTVuPZqeswe/0hzNlwCC/+ugH9Jy4z81mxBxaDJBERyX98eojLACUgIACHDh1yu56Xy5Qpk+b+27dvN4XEPXv2dF7HGh0KDAw0RcjVq1d3e0xISIg5Sd5QJDQYH9/eHDd+9LfJ4NhBz8ArquHrJbuxfNcJDPh8GTYcOI2ERAee6lHHDCbIQQVFRCR/8GnmJjg4GM2aNcO8efPcghVebt26dZr716lTB+vWrTNNUvbpuuuuQ6dOncx5FiJL3levXDhG39jIefnRLjXxVPc6eKdvE3P5723HzBg6Z2IT8Ny0/3DvlyvM1A/ErM5XS3fjnTlbsGLXcRw+HaMu5iIieYzPixOGDh2Kfv36oXnz5mjZsqXpCh4VFYUBAwaY29lNvHz58qZ2huPgNGjQwO3xRYpYtRqpr5e8rVfT8khIciDy1Dnc37GGua5LvdJ46fr6mLJyH25rVQlRsYl4ffYmzNt0GAO/WonOtUvim3/2YOvhs+b+787bav4HBfjh2avron/bqj59TyIikkeCm759++LIkSMYPnw4IiMj0aRJE8yaNctZZLxnzx7Tg0oktT7NKqS57o7WVczJxjms2EzFJiy7Gat4WDAur1YcS3Ycw8noONMNnfU6tcuEo3X14uY+HCEhvbofERHJ+Xw+zk120zg3+c+ynccxdv42hAT6o07ZcNzVtoqp3SFu/o9NWYOfVu1HiULB+PiOZth1NBovTF+PikVDTTZo/f5TiE1Iwpibm5jn+GD+NlxVtzSaVynm67cmIpJvnM5Ng/hlNwU3ktq5uETc+NFiM+VDZq5rbM2T9cuaA2bqiN8fuQKlwgtk01KKiORvpxXcZEzBjaTnVHQ8XpqxAT+s3Ae2Rj3UuSaKhgaZ3lcVi4Vi/MLtZiwdVx1rl8TE/i3UfCUikg1yzQjFIjlFRGgQ3vxfY9xxeWXTtbxBeWv0Y7vIOCExCZ8s2mnO33hZefy69qAZSfnHVfv
84rztxIkTuO2221CyZEnzGrw9dTAmIjmTghsRyZGef/559O7dG2vWrDFBxs0334yNGzea26KiotCtWzcTDC1fvhxTpkzB3Llz3YIXBkeDBw82QQ8DIQYuNWrUcHuNkSNH4qabbsLatWtx9dVXm9c5fvy48/U3bNiA3377zbwun69EiRLZvBZE5KL4euZOEcl/OIN2QECAIywszO30yiuvmNv50zRo0CC3x7Rq1cpx//33m/Pjx483s0KfPXvWefuMGTMc/v7+zhm4y5Ur53j22WczXAa+xnPPPee8zOfidb/99pu53LNnTzNbt4jkPqq5ERGf6NSpk8mGuCpWrJjzfOvWrd1u4+XVq1eb88ykNG7cGGFhYc7b27Zti6SkJGzevNk0ax04cABXXnllpsvQqFEj53k+V3h4OA4fPmwu33///SZztGrVKnTt2hW9evVCmzZtLvFdi0h2UHAjIj7BYCJ1M5GnsEYmK4KCgtwuMyhigESs99m9ezdmzpyJOXPmmECJzVxvvvmmV5ZZRDxHNTcikiMtXbo0zeW6deua8/zPWhzW3tj+/vtv+Pv7o3bt2ihcuDCqVKmCefPmXdIysJi4X79++PrrrzFmzBiMHz/+kp5PRLKHMjci4hOxsbGIjIx0uy4wMNBZtMsi4ebNm6Ndu3b45ptvsGzZMnz66afmNhb+jhgxwgQeL7zwAo4cOYIHH3wQd9xxB0qXLm3uw+sHDRqEUqVKmSzMmTNnTADE+2XF8OHD0axZM9Pbisv666+/OoMrEcnZFNyIiE/MmjXLdM92xazLpk2bnD2ZJk2ahAceeMDc77vvvkO9evXMbey6PXv2bDz88MNo0aKFucz6mLffftv5XAx8YmJi8M477+Dxxx83QVOfPn2yvHzBwcEYNmwYdu3aZZq52rdvb5ZHRHI+P1YV+3ohRERS175MnTrVFPGKiFwo1dyIiIhInqLgRkRERPIU1dyISI6j1nIRuRTK3IiIiEieouBGRERE8hQFNyIiIpKnKLgRERGRPEXBjYiIiOQpCm5EREQkT1FwIyIiInmKghsRERHJUxTciIiICPKS/wORyAeyBiTjvgAAAABJRU5ErkJggg==", - "text/plain": [ - "
" - ] - }, - "metadata": {}, - "output_type": "display_data" - } - ], + "execution_count": null, + "id": "78", + "metadata": {}, + "outputs": [], "source": [ "training_log = pd.read_csv(\"training_log.txt\")\n", "\n", @@ -1938,7 +1327,7 @@ }, { "cell_type": "markdown", - "id": "528a4711", + "id": "79", "metadata": {}, "source": [ "### Model testing\n", @@ -1948,8 +1337,8 @@ }, { "cell_type": "code", - "execution_count": 40, - "id": "93ea20aa", + "execution_count": null, + "id": "80", "metadata": {}, "outputs": [], "source": [ @@ -1958,7 +1347,7 @@ }, { "cell_type": "markdown", - "id": "c8081975", + "id": "81", "metadata": {}, "source": [ "### Explore results\n", @@ -1968,7 +1357,7 @@ }, { "cell_type": "markdown", - "id": "94597545", + "id": "82", "metadata": {}, "source": [ "#### Compute average accuracy" @@ -1976,18 +1365,10 @@ }, { "cell_type": "code", - "execution_count": 41, - "id": "4df4d88f", - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "Accuracy: 0.8303\n" - ] - } - ], + "execution_count": null, + "id": "83", + "metadata": {}, + "outputs": [], "source": [ "# Compute average accuracy\n", "num_test_samples = len(original_images)\n", @@ -1997,7 +1378,7 @@ }, { "cell_type": "markdown", - "id": "dce3ac13", + "id": "84", "metadata": {}, "source": [ "#### Compute confusion matrix" @@ -2005,21 +1386,10 @@ }, { "cell_type": "code", - "execution_count": 42, - "id": "9be2ac3d", - "metadata": {}, - "outputs": [ - { - "data": { - "image/png": 
"iVBORw0KGgoAAAANSUhEUg [… base64 PNG payload of the deleted confusion-matrix figure output truncated …]
jYPd94M1HILoTkM4FyGFRZ3v9BbDj7ufnWz41i+WIA+SMC0Rw8A/EDz4ENtz0HzddAvZiHp+Hb7QBx4+dQNkS5czPu3Xprv6v16Aups6cAns1c9psDOk/1Py8XPEK6v8J08ahXsO6sEVf29ZjJ47BxQsXsWDeXyo4lyAme45s2LJjk2FGfrBnd+7cQaP6TfD0yVPEcY+DfPnzYufeHapllGybPR7Ds+XIijlLZqH/HwMwfOBIJEqSCP2H9UPNOp+HkCxfuRyGjxuK0cPGqg6kKVIlx6xFM5Anf27YqulT/Euby5awvh/O5OkTUb9hPf95ps7EoP6Dza+VLlb2i3nCi4Ppa8Ob2Bip/5Ym5nXr1n3xmtSl586dG2PGjFEjpEggL3WFMkKK3A004N2jZPQVmSZ16BMmTFCjpEhduwTgWimJdDrNkiULRo8erZ5Lhr5Hjx6qXlwy4VIXXqlSJZw4cULNJ6PBFC1a1Py3xcmTJ5E1a1Zcu3YNSZL4N79s2rRJ1aHL78nVu2T9mzdvrmrRg0tGcVEXKc1SA5GM3zRp6dVY62Zae/DBzzhlFd8joqNxRgAKLgcYp/MaUWBM31U/aRxvfIPuZ2ZUURztr0Oyj48PPOJ4qf6CMoS3XQToIUkL0GW4RFvEAN2+MEC3HwzQyegYoNsPBuix7GcUFyIiIiIiW8YAnYiIiIhIRwzVSTQkSW06EREREVFYYwadiIiIiEhHGKATEREREekIA3QiIiIiIh1hgE5EREREpCMM0ImIiIiIdIQBOhERERGRjjBAJyIiIiLSEQboREREREQ6wgCdiIiIiEhHGKATEREREekIA3QiIiIiIh1hgE5EREREpCMM0ImIiIiIdIQBOhERERGRjjBAJyIiIiLSEQboREREREQ6wgCdiIiIiEhHGKATEREREekIA3QiIiIiIh1hgE5EREREpCMM0ImIiIiIdMQpvBeAQte9odsRK1Ys2JPoZdPA3rzZeDG8F4HCiMlkgj1ycHAI70WgMOIA+9zWUSJEhb3x9fsIe+Nr8g3WfMygExERERHpCAN0IiIiIiIdYYBORERERKQjDNCJiIiIiHSEAToRERERkY4wQCciIiIi0hEG6EREREREOsIAnYiIiIhIRxigExERERHpCAN0IiIiIiIdYYBORERERKQjDNCJiIiIiHSEAToRERERkY4wQCciIiIi0hEG6EREREREOsIAnYiIiIhIRxigExERERHpCAN0IiIiIiIdYYBORERERKQjDNCJiIiIiHSEAToRERERkY4wQCciIiIi0hEG6EREREREOsIAnYiIiIhIRxigU4h58eIFunT8DWmSp4NbTHcUK1gcx44cg0376AdceA7svQ9svwMceQR4v/d/zc8EXPIGDjwAtt8Fdt8DzjwF3vlav8fJJ8CeT78f1Dw2Zu/uvaheuQaSeiVHVKfo+Hv1Ghjd1MnTkDNrLsR1ia8ehfMXxaYNm2BPhg8dgWgRY6BLx64wsmGDhyF/noJwjx0PiRIkRs1qtXDxwkUYnb2utz0cz/bu2YeaVX5CikSpECNiLKxZvfaLef49fwE/Va0FDzdPxHWOj0J5CuPWzVuwVdOnzES+7AXgGSeRepQoVApbNm4xv371yjXUq9kAyRKmVK83qtsEDx88hF4wQA9hvXv3RpYsWb46T+PGjVGlShXz8yJFiqB9+/awdb/871fs2LYd02dPxeETB1G8ZHFUKFMJd+/chc06/xx4+g5I7wLkiQe4RgaOPwbe+voH6C8+AMliArndgcxuwOuP/gG5JZfIQCZXIG88///f+AKnn8KWvXr1ChkzZcTocaNgLxImTIh+A/pi/+G92HdoD4oULawCmHNnz8EeHD1yDDOmzUTGjBlgdHt270XLVj9j174dWLtxDT5++IAKZSup/d7I7HW97eF49vrVK2TIlAEjx44I9PWrV66iVJFSSJU6FTZsXYeDx/fjt56/IXKUKLBVCRN6oHf/Xth1YAd27t+
OQkUKoU6N+jh/7rza5lXLVwccHLBm02ps2rkRH95/QK1qdeHn5wc9cDCZTKbwXogDBw6gQIECKFOmDNatW/fdAfGqVatw8uRJ6EFwlsfb2xvysceOHdscoEtQP3r06BBbDh8fHzg7O+PekzuIFSsWQtubN28QzyUBlqxYhDLlypin589VEKXKlESvvn8irEQvmyZk3sjXBOy86x94x7E4SB16CLhFAVIE8rlKdl2y7AXiAVGcAn/fR2+AU0+BYh6Ao0OILOqbjeGX5ZKM0+Lli1CpckXYGw93TwwcMgCNmzYKs78ZHofsly9fIl+uAiqAGTJwCDJlzoRhI4eG6TI4OITMd+W/ePToERIlSIIt2zehQKECsBf2uN7heTzzNYVNy6pk0Bcu+wsVK1cwT2tUrzEiOkXE9DnTEJZ8/T6G6d9LHD8Z+g3qg4SeCVGj0k+48eCqOUby9vZB4nhJsXLdchQtXiTUlkHiMy/3JCoW/Fp8posM+owZM9CmTRvs3r0bd+/acLY1mCRw1oJzo/j48SN8fX2/uNqOGjUKDuw7AJskgZApkG+JBNXP3wVdEiOcgvhqffAD7r8BnCOFWHBOYU/29SWLl6osTO48uWB0Hdp0RJmypVGseFHYIx9vH/W/i6sL7Im9rre9kYzxpvWbkSJVClQuVwVJPJKhSL6igZbB2PIxe9mS5Xj96jVy5cmJ9+/eq4v+yJEjm+eJEiUyHB0dcXD/QehBuAfokplZvHgxWrVqhfLly2P27Nnm1+TngIGsZKe1TIq83qdPH5w6dUpNk4f2+zdv3kTlypURI0YMdYXy008/4cGDB1+UosycOROJEiVS87Vu3VptxKFDhyJ+/PiIGzcuBgwYYPX3v/W+milTpsDLywvRokVT88iVUlAlLgG9e/cOnTt3Vk3q0aNHR+7cubFz507oWcyYMVWgMmTAENy7e099jgsXLMKhg4dx//592CQJsiWQvvrCv2ZcAvZ7r/2z5O/9As+4X/YB4kf9MkCXWnWpU991D3j7EcjsGmarQSHnzD9nEMc5LpyjuaBt63ZYvGwh0qZLCyNbungpTp44ib4D+sBegxepuc+bLy/SZ0gPe2Gv622PHj18pGKxkUNHoWSpEvh7/SpUrFIRdWvWU2VPtuzsmXPwcPWCe8z46PhrJyxYMg9p0qZBztw5ED16NPTq0RuvX79WyZbff/tTxS73730Z09llgL5kyRKkSZMGqVOnRv369VXAHNwm3Fq1aqFTp05Inz497t27px4yTQ4sEkQ/ffoUu3btwpYtW3D16lX1mqUrV65gw4YN2LhxIxYuXKgy+XKRcPv2bfV7Q4YMwe+//45Dhw6p+YP7vpcvX1brtWbNGvXeJ06cUMF/cP3666+q7GfRokU4ffo0atasqcp/Ll269NWgXppNLB9hbfrsaWrbpUicCi7R3TBp/GTUrFVTXZHaLKk9F6qT513g1kv/ADwgqUf/51NdeZpAWkcSx/CvU8/qJo31wNln/gE/2RSpzzx07AB279+FFv9rjhZN/6fqGY3q9q3bKkibOXcmothwLeqPaN+mA86ePYe5f82BPbHX9bZHWs11+Url8Gv7X5EpSyZ06toRZcuXwYypM2DLUqZKgT2Hd2Hb3i1o+nNTtGzeGv+e/xdx3ONg9l+zsGHdJhXAayUnmbNm1k3MEkSRbNiRoFgCcyFBqHxAEvxKXfa3RI0aVWWynZycVMZbI4HzP//8g2vXrqkstpg7d64K5I8cOYKcOXOad0q5IJDsb7p06VC0aFFcuHAB69evVxtILhokSN+xY4fKYm/bti1Y7/v27Vs1XTLgYty4cSrwHzFihNVyBkYy9LNmzVL/e3h4qGmSTZdAX6YPHDgw0N8bNGiQak0IT8mSJ8Om7RvVlaiPzwskSBAfDes2QpKkSWCzojkBOdylUA74aAIiR/APxKNG+DI4l8x4tjiBl7dEiuD/iB7R/yGjwkgmPvbn5jXSv0iRIiF5iuTq52z
Zs+LY0WOYMG4ixk8aByM6fvwEHj58hHy58punSYZJRoSYPHEKnr96iggRLL4LBtO+bUesX7cBW3dshqen//HcHtjretsrtzhuKo6SzLKl1GlS226JqtUxO5n6OWu2LDh+9AQmjZuCMRNHoXjJYjj173E8efwEEZycEDu2M1ImSoMkSRNDD8L1MkGC4cOHD6NOnTrquewgko2WoP1HnD9/XgXQWhAtJACXchl5TZMkSRIVnGvixYun5rO8epJpDx8+/K73lZIZLTgXefPmVRcDsr7fIhcAcgJMlSqVuvjQHnLRIhn/oHTv3l1d3GiPW7fCb2gkKcuR4PzZs2fYunkbKlQsD5sXwdE/OJca8idvAfeo1sH560/BuQTh3/Qpc66PjuL0A+R7La1XRlW0WBEcOXEIB4/uNz+yZc+G2nVqqZ+NGpxLS6AEqX+v+hsbt6y37STDd7DX9bZ3EsRmz5ENly5Yt9JfunQZXok/xztG4Gfyw/v3n4ZKtrhAkeB8147dqtynXIWygL1n0CUQl86FWqZYO0BI0f748eNVoByw3OXDhw8h9vcjRoxo9Vxq2AObFpZD7kgdmJz0jh079sXJTwL1oMhnZtnZITxs2bxVba9UqVLiypWr6Pnb70iVOiUaNG4AmyXBuOyC0Z38g/BLPv5ZdY9o/sG5DJcoQy1mcfOfTxvfPKKjfydQyZL7SKY8kn9mXYZYvOLjn4GXaTZK9tMrlz9fMF6/dh2nTp6Ci6srEiUy1gFd80ePP1G6TCl4JfJSY/4vXrgEu3ftwZr1q2FUksAIWH8sdZuubq6GrkuW8g7ZvktXLEaMmDHM/Wikg7+03BqVva63PRzPZB2vXr5qfn7j2nWcPnladQCWY1q7Tu3QqG5j5C+YH4WKFMSWTVuxYe0GbNi6Hraq9+99UbJ0CXh6ear1X7poGfbu2osVa5ep1+fPWYDUaVLBLU4cHDl0BL916o5f2rZCytQpYdcBugTmUgYiZR+lSpWyek06UEpNeOLEidWJUEomJCsrAg5fKFd+knG2lDZtWpVBloeW7T537hyeP3+uMt7/VXDfV8pTZDQa7cLj4MGD5pKZb8maNataH8naFyxYELbW47/X771x5/Yd9aWvUrUyevX784uLHpsio7JIx08Z91yC7rhR/YdXlOD7zUfg8dvPQy9akmy6jJkewQF4+Aa46uPfiVQy7DJEY1JXmx7F5fjR4yhd4nOW4bfO3dT/9RvWw7SZU2HUIeeaNWmB+/fuw9k5FjJkzKCCcxnvn4x3UypRqngZ6+kzJqNBIxtOOHyDva63PRzPjh87gXIlPrdmd+vSQ/1fr0FdTJk5GZWqVMSYCaMxYugIdOnQFSlTpcSCJfORr0Be2PIxu2WzVqrTZyznWCqpIMF5sRL+o1FdungZff7oh2dPnyFR4kTo/FtH/NIu+P0FDRugr127VpVANGvWTF2dW6pevbrKrm/atEmNgtKjRw+0bdtWdda0HOVFK1ORmnAJ3D09PVXGp0SJEsiYMSPq1aunxhaXiwHppFm4cGHkyJHjPy9zcN9XOlM1atQIw4cPV501ZdllJJdv1Z8LKW2R92/YsKG6eJGAXXYyqX/PlCmTqmXXq+o1q6mHocSL5v8ITFQnoMQ36jNjRASyu8No5IYPbz4a++YlAU2eNim8F0EXNm3bCKOzt33b3tfbHo5nhQoXxMsPXx88omGTBuphFBOmfL1vUJ8BvdRDr8KtBl0CcAl4AwbnWoB+9OhRNZrK/PnzVadNCYwlqy7DIwacVzqXSgdPd3d3NY+UpaxevRouLi4oVKiQ+jvJkiVTwzn+iOC+b4oUKVCtWjWUK1dOtQ5IYD1x4sRg/x3pDCoBuoxQI1l3aVGQTqhS205ERERExqaLO4lSyAvrO4nqSYjdSdSGhOedRCls2eshOzzvJEpkpDuJ6klY30lUD2zqTqJEREREROSPAToRERERkY4wQCciIiIi0hEG6EREREREOsIAnYiIiIh
IRxigExERERHpCAN0IiIiIiIdYYBORERERKQjDNCJiIiIiHSEAToRERERkY4wQCciIiIi0hEG6EREREREOsIAnYiIiIhIRxigExERERHpCAN0IiIiIiIdYYBORERERKQjDNCJiIiIiHSEAToRERERkY4wQCciIiIi0hEG6EREREREOsIAnYiIiIhIRxigExERERHpCAN0IiIiIiIdcQrvBaDQ9dH0UT3syesNF2BvEvUtCXt0+fd1sDd+Jl/YIwc4wB6ZTH6wN06OEWGPHB0iwN5EdIwEexMxmOvMDDoRERERkY4wQCciIiIi0hEG6EREREREOsIAnYiIiIhIRxigExERERHpCAN0IiIiIiIdYYBORERERKQjDNCJiIiIiHSEAToRERERkY4wQCciIiIi0hEG6EREREREOsIAnYiIiIhIRxigExERERHpCAN0IiIiIiIdYYBORERERKQjDNCJiIiIiHSEAToRERERkY4wQCciIiIi0hEG6EREREREOsIAnYiIiIhIRxigExERERHpCAN0IiIiIiIdYYBORERERKQjDNCJiIiIiHSEAToRERERkY4wQKf/ZPqUGciXLT883RKpR4mCpbBl4xareQ4fPIwKpSohQeyEap6yxcrhzZs3MKrhQ0cgWsQY6NKxK2zZkzHH8Kjv/i8eL9ZfVa+/WHsFT8Ydw6OBB/F4+GF4LzqPj49fW73H+6vP8WzmP3g8+CAejziCl1uvw+RngpH28Xat2yNzmqyIFysBknmkQJ1qdXHx34uwZYP7DYFL5DhWj1wZ85hff/v2LTq37YpkCVLC0zUxGtZqjIcPHsJIRg0bjdiR3dCtUw/1/NnTZ+jS/jfkyJAL8Z0TIkOKTOjaoRu8vX1gywb3GwqXKO5Wj1yZ8ppfnz19LiqUrIxE7knVa97PvWEEw4eMROG8RZHA1RNJE6ZA7ep1cfHCpS/mO3TwMMqXqoh4sT3g4eaF0sXK2vT5a++evahRpSaSJ0qJ6BFjYs3qNebXPnz4gN+7/4GcWXLD3Tmemqd5459x7+49GE2aFOnUeTrgo32bDtAbBuhh5Pr163BwcMDJkye/+3eLFCmC9u3bQ08SJvRA7wG9sOvgDuw8sB2FihREner1cP7seXNwXr1CDRQrURTb923Fjv3b0KJVCzg6GnOXO3rkGGZMm4mMGTPA1rk0zwS3jjnMD+f66dT0yOnc1P9OCaIjZqUUcG2dBc71/F/znn/OHIB/vP8K3gvPI1Ly2HD5OTNiVU+F9xef4dXWG7Al39rHs2TLgonTxuPw6UNYsW45TCYTqpavBl9fX9iyNOnS4N8bZ82PDTvWmV/r0fl3bFy/CbP/moG1W1fj/r37aFCrMYzi+NHjmDVtDtJnTG+edu/efbWe/Qb3xYHjezFh2nhs27wdbf7XFrZObevrZ8yPDdvXml978+Y1ipcqhg5d9XXu+VH79uxDi1bNsX3PFvy9fiU+fPyIKuWr4tWrV1bBeTV1/iqGnfu2Yef+7fhfq59t+vz16tVrZMyUEaPGjvjitdevX+PkiVPo1vM37Du8BwuXLMCli5dQs2otGM2eA7tw9dYV82PtRv8LlWo1qkJvnMJ7AYyicePGmDNnjvm5q6srcubMiaFDhyJTpkzw8vLCvXv3ECdOHBhB2QplrZ7/2e8PzJg6E0cOH0Xa9GnRvXNP/O+X/6Fj189XpSlTp4QRvXz5Ek0bNcOEyeMxZOAQ2DrH6BGtnr/fdweOLlEQMXEs9Txq9vjm1yLEBqIXTYRnU07B7/k7RHCNgrdnH8MpXjREL+zlP49rVEQvnhg+yy8iWmEvOEaOACPs402afw5MEydJhN/79ET+HAVx4/pNJEueFLbKyckJ8eLH+2K6ZIznz16AaXOnoFDRQmra+KnjkDtzXhw5dBQ5c+eArX+PWzRqibGTRmHY4JHm6enSp8W8xZ+P7UmTJ8UffXvi58Yt8fHjR/V52SonpwiBbmvRqk1L9f/eXftgJCv
XLrd6Pnn6RCRLmAInjp9EgYL51bRunXug5S8/o5PF+SuVjZ+/SpcppR6BcXZ2xtqNf1tNGzlmOArlK4JbN2/BK5H/sdwI3N3drZ6PGDoCyZInQ8FCBaE3tns5qENlypRRQbg8tm3bpg7cFSpUUK9FiBAB8ePHD/JgLtk3OdjbIskYLlu8HK9fvUau3Dnx6OEjHD18FO5x3VGyUCmk8EyFcsXL48C+AzCiDm06okzZ0ihWvCiMxuTrh7enHyFKlriqBeiL19/74u3Jh3CMHRmOzpH8J/r6ARGsDy0OER2Bj374eO8ljLCPByTZtwVz/0LipInh6ZUQtuzq5atImyQ9sqTOjhaN/odbN2+r6aeOn1RN4UWKFTbPmypNSngm8sSRg0dg6zq364pSZUuiSPEi35zXx9sHMWPFtOngXFy9fA1pk2ZAljQ51MWJtq3tiWxL4eriov63PH8VL1QKyTxTokzxcthv0PNXULx9fNQx3zm2M4zq/fv3WPTXIjRs3CDQ81t4Y4AegiJHjqyCcHlkyZIF3bp1w61bt/Do0aMvSlx27typnm/YsAHZs2dXv7t37151om/YsCFixIiBBAkSYMSIL5uj9OLsP2fh4eIJ9xjx0PHXjliwdJ5qMr1+7bp6fVC/wWjUrBGWr1mGzFkzo1LpKrhy6QqMZOnipTh54iT6DugDI3r371OY3n5UAbqlN0fu4dGgg3g8+BDeX36O2PXTw+FTUB4puQs+3n6Bt2ceqbIXX593eL3b/8Tv9/I9bElQ+7hm2uTp6nV5bNm4FavWr0SkSJ8uVGxQ9pzZMWH6OCxdswQjxg1TrQHlilfAixcv8ODBQ7VuAU/YceO6q9ds2fIlK3D6xGn06v/HN+d98vgJhg4ajsbNGsKWZc+VDROmjcXSvxdjxLihuHFDtnVFvHhhmxfR/4Wfnx9+69wdefLlQboM/uV61z6dvwb2G6y28co1y5Ala2ZULF0Zlw12/gqK9DX5o/ufqFmrJmLF8m85NaI1q9fg+XNv1G9YH3pk25f/Om8unT9/PlKkSAE3Nzer+jZLEsQPHz4cyZIlg4uLC7p06YJdu3Zh9erViBs3Lnr06IHjx4+rgP9r3r17px4aH5/Q78AkJSt7juxWf2v18tVo2aw11m9dqw56QkoA6jeqp37OnDUTdm3fhXmz56u6XiO4feu26hC6ZsMaRIkSBUb09sRDRErhgggxrYPOyBndETFZbBVwvzlwFz7LLyB2k4xwcHJUtefRSyTBy3VX8WLlJcDJEdELeuLDTR9Ah1mK/7KPa0H6T3VqqpaT+/fvY9zI8Whctwk279pos/tDyTIlzD9nyJgeOXJlR8aUWbBq2WpEiWqb6/Qtt2/dUR1CV65f/s3tJvvBT1VqI02a1Oj2x2+wZSVLB9jWObMjY6qsWLVsFRo00WfAEtI6tu2M82fPYfOOjeZp2vmrafMmaNDI/3OQBNPOT+evPgY5fwVFWska1GmoWvXHTBgFI5szay5KlSkFD48E0CMG6CFo7dq1KvMtJCCXDLhM+1rHkr59+6JkyZLmoH7GjBkqsC9evLiaJnXtnp6e3/zbgwYNQp8+YZvFlWxa8hTJ1M9Zs2XB8WMnMGn8ZHTo4l+3lyZtaqv5U6VJrYJaozh+/AQePnyEfLn86xa1Uoi9e/Zh8sQpeP7qqSptslW+z9/iw7XniPXT54yxxjGKk3rALSoiesbE46GH8e7fJ4iSwb++L1peD0TNkwB+Lz/AMUoE+D5/h1fbbyJC7MiwJUHt42MmjjbXbsojecrkyJk7JxLHTYq1q9aiRu0aMALJlqdImRxXr1xD0eKFVZOwjOZhmUWX70C8eNYtLLbk5PGTqqyhcO6iVt/j/Xv2Y9qk6Xj44p76HksrQo2KP6lj/PylcxExonVfDSNta3vQqV0X1eF547Z1SOj5uSwt/qea/IDnr9QGO399LTi/eeMW1m9Za+js+c0bN7F92w4sXPoX9IolLiGoaNGiqoR
FHocPH0bp0qVRtmxZ3LgR9OgVOXJ87lh15coVdQLMnTu3VWfT1KmtDxSB6d69O7y9vc0PKa0Ja5J5eP/uveowl8AjAS5dvGz1+uVLlw3V2aRosSI4cuIQDh7db35ky54NtevUUj/bcnAuVG159IiIlNK/NjNIpk+Pj9bDKEoJl2TeHSJGwLszj+EYKxKcEvhfwNoqbR8PjGSc5PHuvW2V8XyNJA2uXb2ugpbM2bKooHTXjt3m1y9duITbN28jZ54v6/JtReFihbD/+F7sObLL/MiaPQtq1qmhfpbvsWTOq5WvgYiRImHhigU220ISrG2dIPBOo0Yh31EJztesXou1m/5GkqRJrF5PnCTxp/PXJUOfv4IKzi9fvqI+F2n5N7K5c+apfgZly5WBXjGDHoKiR4+uSlo006dPV9m1adOmoXnz5kH+TkiQGnZ5hJXePfuo5nBPLy+8fPECSxctw95de9VwcxKYte3YBoP6DkKGTBmQMXNGLJy3UJ3M5y76PBqCrYsZMybSZ/g8HJuIHj0aXN1cv5huiyext6ceInKmuHBw/FyW4vvsLd6dfazKWyR495P68n13VCfQSCljm+d7vf+OKnWRkhbJrMs8sWqksnovvfvaPi6BzIqlK1CsZDHEieOGu3fuYtTQ0aoMpFQZ/xYxW/THb3+iTPnSKhCR4QUH9x2iAtTqtarB2TkW6jeuh55d/4CLS2zVSbJrh+4qOLflEVzkeywjtViKFj26So7IdC04f/36DabOmowXPi/UQ8Rxj2OzF+J/dOuFMuVKfd7W/Yb6b+ufqqnXH9x/oMa4v3rF//4HZ8+cQ8yYMeDp5QkX129ctOu8rGXpoqVYtPwvtT6yniKWcyxEjRpVnb/adWyDgX0Hq2EJ5fz117y/1Fjp8xbNhS1fgF257L8txfVrN3Dq5Gm4urogfoL4qFervhpqcdmqpfD19cP9T5+LvG7L/WqCSrTMmzMf9RvU03VHb/0umQHIF13KW4J7c4PkyZOrDNWhQ4eQKFEiNe3Zs2e4ePEiChf+PHKCHjx69Bgtm7bC/XsP1IFNxg2WwEXGPRet27ZSHU16dOmBZ0+fI0Om9Fi1YYVNDz9nTz5c9Yaf93tEyRqgdMHJUdWSvz50D6Y3H+EYIyIiJoql6s8do38+iL+//Ayv99yGydekhlyMVSsNIn8rE68zX9vH5QYeMirRpHGT8fzZc8SN5458BfJhy65NKitjq+7cuYvmDX/G0yfPEMfdDbnz5caW3RtVICoGDu+vjmkNazdRLQnFShbF8LFDYWSnTpzG0cPH1M9Z01lfiJy6cEK1GNrstm70P+ttvWuDeVvLePBDBgwzz1++RCX1/4SpY1G3YR3YKrkBmShbwn+ENc2k6RNQv6F/n6lf2rbG27fv0E2dv56pRNPqDStt+vwl5XllS5QzP+/Wpbv6v16Duuj5Zw+sW7NePc+bI5/V723Yuh6FCutvCMIfIaUtMnykjN6iZw4mSZVRiIyD/uDBA8yaNcscWI8fPx6TJk3C9u3bkSRJEiRNmhQnTpxQHT5lFBcpiZH5Ysf+nHls1aqVGtll5syZqpNoz5491e83a9YMo0f7170Gh2R9JHt/6/ENQ9eRBSaig7FqQ4Mjcb/Ax7c1usu/f76Jjr3wM9n2jZD+KwfYTutLSDKZ/Dst2hMnR/s7hgtHB9tsjfkR9vi99vHxQXw3D1WO/LX4jBn0ELRx40bVMVRrNk2TJg2WLl2q7gQqwywGx7Bhw1RTVMWKFdV7dOrUSW1EIiIiIrIPzKAbFDPo9oUZdPvBDLp9YQbdfjCDbh98gplB5yguREREREQ6wgCdiIiIiEhHGKATEREREekIA3QiIiIiIh1hgE5EREREpCMM0ImIiIiIdIQBOhERERGRjjBAJyIiIiLSEQboREREREQ6wgCdiIiIiEhHGKATEREREekIA3QiIiIiIh1hgE5EREREpCMM0ImIiIiIdIQBOhERERGRjjBAJyIiIiL
SEQboREREREQ6wgCdiIiIiEhHGKATEREREekIA3QiIiIiIh1hgE5EREREpCMM0ImIiIiIdIQBOhERERGRjjiF9wJQ6HJycFIPe2KCCfbmUs81sEeFZjaCvdnVZBbs0dN3j2CPXCO7h/ciEIUaP/jB3vgFc52ZQSciIiIi0hEG6EREREREOsIAnYiIiIhIRxigExERERHpCAN0IiIiIiIdYYBORERERKQjDNCJiIiIiHSEAToRERERkY4wQCciIiIi0hEG6EREREREOsIAnYiIiIhIRxigExERERHpCAN0IiIiIiIdYYBORERERKQjDNCJiIiIiHSEAToRERERkY4wQCciIiIi0hEG6EREREREOsIAnYiIiIhIRxigExERERHpCAN0IiIiIiIdYYBORERERKQjDNCJiIiIiHSEAToRERERkY4wQCciIiIi0hEG6PSf7N2zFzWq1ETyRCkRPWJMrFm9xvzahw8f8Hv3P5AzS264O8dT8zRv/DPu3b0HI6+3MJlM6Ne7P5J5pYBbTHeUL10Rly9dhpGMGjYasSO7oVunHlbTDx88goqlK8PDxQtecRKjbPEKePPmDWzFh2dvcGPmMZzptAGn26zFhb478PrG80Dnvb3gFE61/BuPtl2xmn5t4iGc674Fp39di7NdN+HmrOP48PwtjLattX29RsWf1OtrV6+DLTm87wia/9QSeVIVRLJYabB57dYg5+3ZvpeaZ+aEOeZpt2/cxm+/9EShjMWRNm5mFMlUEqMGjMX79+9hpG1dvmQlNc3y0eGXTrBlw4eMROG8RZHA1RNJE6ZA7ep1cfHCJat5Htx/gBaNf0Zyr1SIF9sDBXIVwuoVq2Hkc9eAvgORNUM2dc5O6O6lzl1HDh2Brdu7Zx9qVvkJKRKlQoyIsbBm9Vqr12VaYI/RI8YgvDFAD2X3799HmzZtkCxZMkSOHBleXl6oWLEitm3bFqzfnz17NmLHjg29efXqNTJmyohRY0d88drr169x8sQpdOv5G/Yd3oOFSxbg0sVLqFm1Fmzd19ZbjBw+CpPGT8bYCaOxc98ORI8eDZXLV8Xbt7YbpFk6fvQ4Zk2bg/QZ038RnNeoWBPFShTFtn1bsH3fVvzcqjkcHW3jEPPx1XtcGrYXDhEckezXPEjdqyg8aqRHhGgRv5jX+8Q9vLr2DE7OUb54LUaqOEjcIjvS9CmGJP/LiXePXuH61COG2taaiWMnw8HBAbbo9as3SJshDfqM+POr821aswUnj5xCvARxraZfuXgNfn5+GDC6DzYdWovfB3fHXzMXY3ifUTDatm7UtCEu3DhnfvQZ1Au2bN+efWjRqjm279mCv9evxIePH1GlfFW8evXKPM/PTVvi0sXLWLxiIQ4e349KVSqiYd0mOHXiFIx67kqRMgVGjBmBwycOYsvOzUicOBEqlauCR48ewZa9fvUKGTJlwMgg1vvKrUtWj0nTJqrjWuWqlRDenMJ7AYzs+vXryJ8/vwqwhw0bhowZM6rs8qZNm/DLL7/g33//ha0qXaaUegTG2dkZazf+bTVt5JjhKJSvCG7dvAWvRF4w4npLRnHC2Ino2qMLKlSqoKZNmzUVSRMmV1ftNWvVgC17+fIlWjRqibGTRmHY4JFWr/Xo0hM///IzOnRpb56WMnVK2IqHmy8jkmtUJGqU1TwtcpzogWbZ7yz+B8na5sHV8Ye+eN29RHLzz5HcoiFu6ZS4PvkwTL5+Kvg3wrYWp0/9gwljJmDH/m1InTgdbE2RUoXU42vu332APl36Y/bK6WhW839WrxUuWVA9NImSeuHqpWtYMGMhegz4DbbkW9s6arSoiBc/Hoxi5drlVs8nT5+IZAlT4MTxkyhQML+adujAYYwaNwI5cmZXz+WYPn7sRJw4cQqZs2aG0c5doladn6yeDx4+CHNmzcWZf86iaLEisFWlypRSj6AE3LfXrVmHQkUKIWmypAhvtnPGsEGtW7dWV2KHDx9G9erVkSpVKqRPnx4
dO3bEwYMH1TwjR45UgXv06NFVdl1+Rw6YYufOnWjSpAm8vb3V+8ijd+/esEXePj5q+Z1jO8Oorl+7rppGixYranWxkjNXDhw6eBi2rnO7rihVtiSKFLc+WD96+AhHDx+Du3sclCpcBim90qBciYo4sM9/H7cFPqfuI2qi2CrbfbbLRlwYsBNP9tywmsfkZ8LN2SfgXjIFonjEClZW/tnh24iWzNWmgvOvbWuthaxFw58xbPRQQwVuliQ73unnrmjRthlSpQ3eheYL7xdwdrG949vXtrVYumgZknmkRN6s+dHn975q+xuJj7eP+t/VxcU8LXfeXFi+bCWePn2m9oVli5fj3dt3KFioAOyBlGrNnD5Lnb8yZsoAe/HgwUNsXL8JjZo0gB4wgx5Knj59io0bN2LAgAEq+A5IK1uREoCxY8ciadKkuHr1qgrQu3btiokTJyJfvnwYPXo0/vzzT1y4cEHNHyNGjED/3rt379RD4+Pjf9DRAynv+KP7n6hZqyZixfp2YGOrJDgXceNZN4fL84cP/F+zVcuXrMDpE6exff/WQC9MxOD+Q9FvcB9kzJwRi+YvRuUyVXHg+F4kT/k5q6xX7x+/xpPd11UGPG6ZVHhz4xnuLPkHDk4OcM2byJxlh6MD4hT7embl7opzeLLzGvze+yJaUhck/SU3jLKtRY/OvyNX3lwoX6kcjGryqGmIECECGrcK3on6+pUbmDN1Pnr07wojbeuataqrFs/4HvFx9p+z6N2zjyr9mL9kLoxAgu/fOndHnnx5kC7D55agOX/NQuN6TZE4flI4OTkhWrRo+GvpfCRPkQxGtmHdBjSq10RdhMVPEB9rNqxGnDhxYC/+mvcXYsaMgUo6KG8RDNBDyeXLl1XJQ5o0ab46X/v2n0sCkiRJgv79+6Nly5YqQI8UKZK6gpXMc/z48b/6PoMGDUKfPn2gN1LS06BOQ/VZjJlgm/WZ9u72rTuq49jK9csRJcqXddd+fib1f5PmjVC/UT31c+YsmbBrx27Mn7MAvfp/vc5XF0wmRE0cGwmqpFVPoyVyxtu7L/Bk9w0VoEtn0cfbryJVj8LfrLuOWyo53PInwvsnr3F/3UXcnH1cBem2UK/9rW29fs0G7N65B7sP74BR/XPiDGZPmoc1e5YHa5tJKUyTai1QrkoZ1G5sXSZgy9taNG7eyPxz+gzpVIuJXHhfu3INSZOHfwnAj+rYtjPOnz2HzTs2Wk3v33sAvJ97Y83G1XBzc8Xav9ehUd3G2LR9Q5B9MoxASjsOHN2HJ4+fYNaM2WhQt5HqSxU3rjvswdzZ8/BTnZ+C/D6ENdtqd7UhEpAGx9atW1G8eHEkTJgQMWPGRIMGDfDkyZPvbkbs3r27KoXRHrdu3YJegvObN26pA52Rs+dCa+5/+OCh1XR5Hjee7ZYCnDx+UpWxFM5dFG7R4qrHvt37MGXCVPWzdvBOnTa11e+lTpNKBQG2QDp8RkkQ02pa5Pgx8P6p/yg0ry4/wccX73Cuxxacar1GPT48fYO7y86qaVbvFSMyIseLgZjp4iJx8+x4ceYhXl97BiNs6x3bduLa1WtIHDeZ+XXRsHZjNeKHERzZfwxPHj1BgXTFkNIlvXrcuXkXA3sOQcEMxazmfXDvAeqWb4hsubNi4Ni+sCXf2ta+vr5f/E6OXP412VevXIOt69SuiypnWLd5DRJ6JjRPl3WbMnEaJk4djyLFCqsWwe5/dEPW7FkxdfJ0GJm09idPkRy58uRSnSWl9WDOrM+jFxnZvr37cenCJTRu+vmiNLwxgx5KUqZMqbIvX+sIKp1IK1SogFatWqlSGFdXV+zduxfNmjVTNWDSrBZcMkKMPPRCC84vX76CDVvWwc3NDUaXJGkSFaTv3LFTZZC1UqMjh4+i+f+aw1YVLlYI+4/vtZr2S4tfVSfQ9p3bIUmyJEjgEV81fVu6fOkKSpYuDlsQPbkr3j3w7/uheffgFSK5RVU/u+T2Qow01lmkq2MPwiW
Pp7kEJlCfLtRNH/xghG0t2cQmLRpbvZ4vWwEMHNYfZcqXgRFUrV0J+YvmtZrWuGpzVKldGTXrV7XKnEtwniFLegydNNBmRiwK7raWEp+A/jl1Rv0fL0E8m06edW7fVXXcX79lrTpuW3rzKTnmEGB7yuchJTH2RNb3/TvbHTr0e8ydORdZs2VVF2R6wQA9lEiwXbp0aUyYMAFt27b9og79+fPnOHbsmPoCjBgxwnxwX7JkidV8UuYSWCYjvElH1iuXr5qfX792A6dOnoarq4uqXatXq74aanHZqqXw9fXD/U/12fK6rJOt+tp6S63mL21bY+jAYUiRIjkSJ0mCfr37IYFHAlSs7D+qiy2Slp106f1LPzTRokdX+7g2vU2HNhjcb7DqUCSPv+YvUtmIuQtnwRa4F0+GS0P34sGGi4id3QOvrz/H07034FnPf8QGpxiR1MOSQwQHRIwVGVHi+/cLkaEX31x/jugpXNXwjDLE4v2//0Uk92iIluxzBzRb39aBdQz19PJEkqSJYStevXyFG1dvmp/fun4b506fV508E3p5wMXNens5RXSCe9w4SJYymTk4r1OuIRIm8lCjtjx9/NQ8r3s8d0NsayljWbp4OUqVKQEXV1dVg96jy+/IVzAfMthwmYeUtSxdtBSLlvvXG2t9h2I5x0LUqFGRKk0qVWve7pf2GDCkv/o81v69Ftu37sDSVYthxHOXq5srhg4ahvIVyqnzt5S4TJk0FXfv3EXV6p8vSm11va9arPeNa9dx+uRpuHw6Z2uJtJXLV2Hg0AHQEwbooUiCcxlmMVeuXOjbty8yZcqEjx8/YsuWLZg0aRIWLVqkMs3jxo1TY6Pv27cPkydPtnoPqUuXHUzGTc+cObPKqn9PZj20HD92AmVLfO4k1q1Ld/V/vQZ10fPPHli3Zr16njdHPqvf27B1PQoV/jw8ma352npPnTkFHTt3wOtXr/Frq7aqhjFv/rxYtXaFbmraQkvrti3x7t1bNdzis6fPkSFTelXbait1qtGSuCBpy5y4t+o8Hqy7iEhxosGjZga45PYM9ns4RooA75P3cH/tv/B754uIzlEQM7074pVNBceIX2YjKXzrzOuW/9yUPaDHYPV/9bpVMGyy/89fs3fHPty4ekM98qUpbPXaVR/bHT7XUsRIkbBz+y5MGjdZHdOkDKRS1Yro3L0jbNn0KTPU/2VLWCdNJk2fgPoN6yFixIhYtnopevXsjZ+q1lYXc8mSJ8WUGZNQumzQw/XZ8rlr7MQxuHjhIhbM+0sF5xKwZ8+RDVt2bPriIs4W17tcifLm59269DCv95SZ/vGWjNIjLSs1a+trKGQHU3CLpek/uXfvnipfWbt2rfrZ3d0d2bNnR4cOHVCkSBGMGjVKjZEuGfVChQqhXr16aNiwIZ49e2Ye6UVKYJYuXapq03v16hWsoRblilA6mN57csfwtd8EfPCzj2bIgArPagJ7s6uJbbRKhLSn72z7hin/lWtk28jIh6QIDvZ5Qetoh+ttgv2FoD4+PvBw81T9Bb8WnzFANygG6PaFAbr9YIBuXxig2w8G6PbBJ5gBum31aiEiIiIiMjgG6EREREREOsIAnYiIiIhIRxigExERERHpCAN0IiIiIiIdYYBORERERKQjDNCJiIiIiHSEAToRERERkY4wQCciIiIi0hEG6EREREREOsIAnYiIiIhIRxigExERERHpCAN0IiIiIiIdYYBORERERKQjDNCJiIiIiHSEAToRERERkY4wQCciIiIi0hEG6EREREREOsIAnYiIiIhIRxigExERERHpCAN0IiIiIiIdYYBORERERKQjDNCJiIiIiHSEAToRERERkY44hfcCEIU0Rwf7u+50sMN1FnubzoO9idksB+zRyxlHYY9MMMHe+NnhOgsnOz2O25sIDhGCNR/3BiIiIiIiHWGATkRERESkIwzQiYiIiIh0hAE6EREREZGOMEAnIiIiItIRBuhERERERDrCAJ2IiIi
ISEcYoBMRERER6QgDdCIiIiIiHWGATkRERESkIwzQiYiIiIh0hAE6EREREZGOMEAnIiIiItIRBuhERERERDrCAJ2IiIiISEcYoBMRERER6QgDdCIiIiIiHWGATkRERESkIwzQiYiIiIh0hAE6EREREZGOMEAnIiIiItIRBuhERERERDrCAJ2IiIiIyJYD9Dlz5mDdunXm5127dkXs2LGRL18+3LhxI6SXj4iIiIjIrnx3gD5w4EBEjRpV/XzgwAFMmDABQ4cORZw4cdChQ4fQWEbSob179qJGlZpIniglokeMiTWr11i9bjKZ0K93fyTzSgG3mO4oX7oiLl+6DKMZNngY8ucpCPfY8ZAoQWLUrFYLFy9chJFMnzID+bLlh6dbIvUoUbAUtmzcYn69Xev2yJwmK+LFSoBkHilQp1pdXPzX9j+D4UNGoHDeIkjgmhBJEyZH7ep1cfHCJfPrT58+Ref2XZA1fXa4x4qHtMnTo0uHrvD29obN8DMBJ58AK64Bf10GVl4HTj+RL3Dg8x98AMy7BJx/Zj3d5z2w4y6w5Aqw6Aqw8RZw/zVsaVsXylsE8V0TIkkg21rMnD4LZUqURwI3T8SI5Iznz5/Dlg0fMhKF8xZFAldPJE2YItB1FocOHkb5UhURL7YHPNy8ULpYWbx58wZGPZ5ZnsOqV6wB50guWLv6c1LSKKZOnoacWXMhrkt89Sicvyg2bdgEo9u7ey+qV66BpF7JEdUpOv4OELvYdIB+69YtpEiRQv28atUqVK9eHT///DMGDRqEPXv2hMYykg69evUaGTNlxKixIwJ9feTwUZg0fjLGThiNnft2IHr0aKhcvirevn0LI9mzey9atvoZu/btwNqNa/DxwwdUKFsJr169glEkTOiB3gN6YdfBHdh5YDsKFSmIOtXr4fzZ8+r1LNmyYOK08Th8+hBWrFuuTmxVy1eDr68vbNm+PfvQolULbN+zFX+vX4UPHz+gSvmq5m17/+593Lt7DwOG9MehEwcwefpEbNm0Fb/8/CtsxtlnwMXnQK64QKXEQDY3/2n/BnKRcfMl8PgtEDXCl69tv+sf1Jf0BMp5AS6R/ae9+QhbsHfPPvz8aVuv+bSt5Xhl+T1+8/oNSpYqjs6/dYQR+O/fzbF9zxb8vX4lPnz8aLV/a8F5tQo1UKxEMezctw0792/H/1r9DEdHR8MezzQTx06Cg4MDjCphwoToN6Av9h/ei32H9qBI0cIqwXTu7DkY2atXr1TsMnrcKOidg0nOpt8hbty42LRpE7JmzaoeHTt2RIMGDXDlyhVkzpwZL1++hNE1btxYlfoIJycnuLq6IlOmTKhTp456TQ8HLx8fHzg7O+PekzuIFStWqP4tyaAvWvYXKlauqJ7LLiWZ9bYd2qB9x3ZqmmQVJQs5ZcZk1KxVI1SXx9Eh/D7/R48eIVGCJNiyfRMKFCoQZn/3vd97hKXE8ZKi3+C+aNikwRevnTl9BvlzFMSJ88eRLHnSUF0OxzDsRvPo0WMkS5gcG7atR4GC+QOdZ+WylWje+Gc8eH5PHRtCQ8xmOULuzbbfAaI4AfnifZ626y4QwREoEP/ztNcfgQ23gOIe/oF32thAWhf/1976AkuvAqU8gXj+rav44OefSS+REEgQLUQW9eWMowjLbS3Hq42BbOvdu/agXMkKuP3whirvDG0mfNcp+gf37xTYsG2deZ2LFiiBYsWL4I8+vyMs+YXROgd1PDt98h/UqlpbBfCpEqXBgqXzUaFy+VBfjkiOkRCePNw9MXDIADRu2gj2IKpTdCxevgiVPsUuYRmfxXNNoOKir8Vn3312K1myJJo3b64eFy9eRLly5dT0s2fPIkmSJLAXZcqUwb1793D9+nVs2LABRYsWRbt27VChQgV8/Bh6WaMPHz5A765fu44H9x+gaLGi5mlysZAzVw6VkTEyH28f9b+L66fgxWAkK75s8XK8fvUauXLnDDQ7sWDuX0icNDE8vRLCSHw+la64ugS
9bb19fBAzVsxQC85DnHtU/1IUKVERT98BD98CHhZBteRw9t4H0sUGYkf+8j0iOwKxIgJXffwDcymbuegNRIkAuAYyvw1ta5evbGujHru0/fvRw0c4evgo3OO6o3ihUkjmmRJlipfD/n0HYOTj2evXr9G8YQsMHzMM8eJbXLgamHwOSxYvVcfv3Hlyhffi0H8N0KXmPG/evCpTuHz5cri5uanpx44dUxlkexE5cmTEjx9fNRNly5YNPXr0wOrVq1WwPnv2bDWP1CjKhYy7u7u6SipWrBhOnTpl9T7yO/L7UaJEQbJkydCnTx+rAF+a2CZNmoRKlSohevToGDBgAPROgnMRN15cq+ny/OED/9eMyM/PD106dkXefHmRPkN6GMnZf87Cw8UT7jHioeOvHbFg6TykSZfG/Pq0ydPV6/LYsnErVq1fiUiRwjcbFNLb9rfO3ZEnXx6ky5Au0HkeP36CoQOHoUmzxrAZGVyAJDGB1TeA+ZeAdTeBNLGBZBZZnTPPpFnKf3pgpAxAMuUS3EvWXGrZpUZdsu2RAymHsZFtnTdfHqQPYlsbTWD797Vr19X/A/sNRuNmDbFyzTJkyZoZFUtXxuVLV2DU41n3zj2QK28ulK/kn3w0sjP/nEEc57hwjuaCtq3bYfGyhUibLm14LxZ98t1pHmnSGz9+/BfTJbC0dxKAS5nPihUrVGBes2ZN1aFWgnbJIE+ZMgXFixdXLQ9SFiM1+w0bNsTYsWNRsGBBVSYk9fyiV69e5vft3bs3Bg8ejNGjRweZmXv37p16WDahUNhq36YDzp49h227tsJoUqZOiT1Hdqv9avXy1WjZrDXWb11rPqn9VKcmihUvivv372PcyPFoXLcJNu/aqC48jaBj206qRnXzjo2Bvi6fS83KNZEmbWr0+LM7bMb1l8C1F/7lLLEjAc/eAUceAdGcgOSxgCdvgX+fA+UT+QfigZEM++FH/hnz0p5ABAfgsg+w4x5Q1sv/vWxIh7adcO7seWwJYlsbUce2nXH+7Dmr/VuCdtG0eRM0aFRf/Zw5a2bs3L4L82bPR58Bn89RRjmeXb1yFbt37sGew7tgD1KlToVDxw7A29sHK5evRIum/8Pm7RsZpOtEsI6cp0+fDvYbSi22PUuTJo36vPbu3YvDhw/j4cOHKtsuhg8frjrWLlu2TAXiclHTrVs3NGrkX+8lGfR+/fqpoSstA/S6deuiSZMmX/270klXLxdJWrPgwwcPkSDB5zpWeZ4xszH3j/ZtO2L9ug3YumMzPD2NVdohJBuePEUy9XPWbFlw/NgJ1Ql4zMTRappcgMojecrkyJk7JxLHTYq1q9aiRu3Q7W8QFjq164yN6zepeuSEgWzbFy9eoGqF6ogRMwb+WroAESNGhM04/tg/i540pv9z6dz58iNw5ql/gP7wjX+NuYzyopHy4GOPgfPPgWpJgftvgDuvgJ+SAZE+ZczdogD3rvuXvWRwha3o+GlbbwpiWxtRp3ZdPu3f66zWOf6n47hcdFpKnSY1bt+6DSMezyShdu3KNSRyty7XbVCrIfIVyIt1W9fCSPw/h+Tq52zZs+LY0WOYMG4ixk8aF96LRsEN0LNkyaJKLYLqT6q9Jv/b+sgNP0r7HKSURTrMaiVAGhmeSjLlQubZt2+fVdmKfH4y0onUwUWL5l8HmiPHtzuFde/eXXXY1UhmwMvLC+EhSdIkKkjfuWMnMmfJZF6eI4ePovn/msNo27tDu074e9Xf2Lxto1p3eyDZtffv3gf5mcjj3fuw7bga0mQdZBjFNavXYv2WdYFuW9mvq5Svpi7CF69YZHstBh/9s6RWJFGuHeql1CV+gE6e2+74T5cA3vI9AmbYLd/HBrZ1p0/bekMQ29po/Pfvrp/277VfrHPiJImRwCMBLl20HnpRhsstWbokjHg8k9avgB3f82bLj0HDB6JM+TIwOvkcLFviyQYC9GvXLLIn9FXnz59H0qRJVXCeIEEC7Ny584t5tJ7/Mo9
kvatVq/bFPJYneqk9/xYJELRMfViQZb9y+ar5+fVrN3Dq5Gm4urrAK5EXfmnbWtXjpkiRHImTJEG/3v3Uwb5i5QowWlnL4oVLsHTFYpVBlRIPIdlk7X4Btq53zz4oWaYEPL288PLFCyxdtAx7d+1VQypeu3odK5auQLGSxRAnjhvu3rmLUUNHI0rUKChVpqTNl7XIui5a/hdixoxh7lsRyzmW2rYSnFcuV1UNvzd99lS88HmhHiKOexxEiGAD9dee0f1rzKNH9C9xkTpyyYyn+BR8Sw15wDpyqUeXoRadI33uaBrJEdh/H8joBjg5AJe8gZcfgITfPnbppazla9tayDR5SBmEOHvmHGLGiAHPRJ6qZNEWy1qWLloa5DpLoqldxzYY2HewGpYuY+aM+GveX2qs9HmL5sKIxzNJLAXWMdTTyxNJkiaGkfzR40+ULlNKna+lFVDOYzJC0Zr1q2FkL1XscsVqUItTJ0/BxdUViRKFT1LzhwL0xImNtWOGlu3bt+Off/5RN2zy9PRUwZrUjAc1uo10Dr1w4YJ5XHlbIk2CZUt87kTTrYt/3W29BnUxdeYUdOzcQfWM/7VVW3g/90be/Hmxau0K28swBuNmD6JUcevsytQZk9Gg0ZdDENoiGX6tZdNWuH/vgTp5p8+YXp3MipUoqsYBP7DvACaNm4znz54jbjx35CuQD1t2bVKjP9gyuaGJKFvCeni1SdMnon7Dejh14pQa5UJkTpvVap4zF0+rDKTuyfjncqOiww/9S1miOgEpYwGZrFv+vkpqz4snBE48Abbc9s+aS/BexMNmRnEJaltP/rSt1TxTZ2JQ/8Hm1+SGPQHnsSWf19k6aTJp+gTz+kii5e3bd+jWpQeePX2GDJkyYPWGlaE+fGp4Hc/siQz00axJC9y/dx/OzrGQIWMGFZwXL1kcRnb86HGULuH/3RW/de6m/pd9ftrMqbDpcdDFvHnzMHnyZJVZl7uJSgAvHRglc1y5cmUYnYx1/uDBA8yaNUuVpMjPGzduVHXgRYoUUXXmMhZ6oUKF1JWp3Gk1VapUuHv3LtatW4eqVauqshUZT16GZfz9999Ro0YN9TtS9nLmzBn0799f/S3JYqxcuRJVqlTR7TjoehOe46CHl7AeB10vwnIcdL0I0XHQbUhYjoOuJ2E1DrqehPU46HoR3uOgk42Pgy5D/kmts4x/LsMIajXnUrYhQbq9kIBcSlgkOy5jou/YsUONxiLDJkrTtgTW69evV0G6dPCUAL127dq4ceMG4sXzb0IrXbo01q5di82bNyNnzpzIkycPRo0axRYLIiIiIjv23Rn0dOnSYeDAgSqjGzNmTJXxldFHJOsr2ePHjx+H3tJSsDGDbl+YQbcfzKDbF2bQ7Qcz6PbBJ7Qy6FLWkjWrdb2lkA6KchcqIiIiIiL67747QJc685MnTwZa8pE2LQe3JyIiIiL6Ed99izepP//ll1/UWN1SHSM341m4cKHqIDl9+vQfWhgiIiIiInv33QG63MJexkiVkUfkZjpyl0sPDw+MGTNGdYIkIiIiIqIwDNBFvXr11EMCdBn0PW7cuD+wCERERERE9EMBunj48KG6yY6QIQXd3W37piRERERERDbZSVRuvNOgQQNV1lK4cGH1kJ/r16+vhowhIiIiIqIwDNClBv3QoUPqjphyoyJ5yM12jh49iv/9738/sChERERERPTdJS4SjMst6gsUKGCeJnfEnDZtmrqjJhERERERhWEG3c3NTd2hMiCZ5uLi8gOLQkRERERE3x2gy/CKMhb6/fv3zdPk5y5duuCPP/4I6eUjIiIiIrIrwSpxyZo1qxqpRXPp0iUkSpRIPcTNmzcROXJkPHr0iHXoREREREShHaBXqVLlR/4GERERERGFZIDeq1ev4L4fERERERGFZQ06ERERERHpaJhFX19fjBo1CkuWLFG15+/fv7d6/enTpyG5fEREREREduW7M+h9+vTByJEjUatWLXXnUBnRpVq
1anB0dETv3r1DZymJiIiIiOzEdwfoCxYsUDcl6tSpE5ycnFCnTh1Mnz4df/75Jw4ePBg6S0lEREREZCe+O0CXMc8zZsyofo4RI4bKoosKFSpg3bp1Ib+ERERERER25LsDdE9PT9y7d0/9nDx5cmzevFn9fOTIETUWOhERERERhWGAXrVqVWzbtk393KZNG3X30JQpU6Jhw4Zo2rTpDywKERERERF99ygugwcPNv8sHUUTJ06M/fv3qyC9YsWKIb18RERERER25YfHQc+TJ48aySV37twYOHBgyCwVEREREZGdcjCZTKaQeKNTp04hW7Zsapx0Cn8+Pj5wdnbG/Sd3EStWLNgTBwcH2Jv3ftb3I7AX733fwt5EjhAF9si9ezHYoyeDd8He+JrsM46I5BgJ9sbP5Ad7jM8SuCVUg6x8LT7jnUSJiIiIiHSEAToRERERkY4wQCciIiIissVRXKQj6Nc8evQoJJaHiIiIiMiuBTtAP3HixDfnKVSo0I8uDxERERGRXQt2gL5jx47QXRIiIiIiImINOhERERGRnjBAJyIiIiLSEQboREREREQ6wgCdiIiIiEhHGKATEREREdl6gL5nzx7Ur18fefPmxZ07d9S0efPmYe/evSG9fEREREREduW7A/Tly5ejdOnSiBo1qhob/d27d2q6t7c3Bg4cGBrLSERERERkN747QO/fvz8mT56MadOmIWLEiObp+fPnx/Hjx0N6+YiIiIiI7Mp3B+gXLlwI9I6hzs7OeP78eUgtFxERERGRXfruAD1+/Pi4fPnyF9Ol/jxZsmQhtVxERERERHbpuwP0Fi1aoF27djh06BAcHBxw9+5dLFiwAJ07d0arVq1CZymJiIiIiOyE0/f+Qrdu3eDn54fixYvj9evXqtwlcuTIKkBv06ZN6CwlEREREZGd+O4AXbLmPXv2RJcuXVSpy8uXL5EuXTrEiBEjdJaQiIiIiMiOfHeArokUKZIKzImIiIiIKBwD9KJFi6oselC2b9/+o8tERERERGS3vjtAz5Ili9XzDx8+4OTJkzhz5gwaNWoUkstGRERERGR3vjtAHzVqVKDTe/furerRiYiIiIgoDIdZDEr9+vUxc+bMkHo7IiIiIiK7FGIB+oEDBxAlSpSQejsiIiIiIrv03QF6tWrVrB5Vq1ZFnjx50KRJE/zvf/8LnaUk3UuTIh2iRYzxxaN9mw4wsqmTpyFn1lyI6xJfPQrnL4pNGzbBSKZPmYF82fLD0y2RepQoWApbNm6xmufwwcOoUKoSEsROqOYpW6wc3rx5A1uWLXVOuEdN8MWja/vu6vUH9x+iddNfkS5JJiR2S4ZieUtizcq1sPVtnTdbfiR0S6QexQuWwmaLbT1r+myUK1FBvRYrkgueP/eGrXk37SzejTjxxePD1lvqddOrD/iw/jreTfoH78acwvt5/8L34nPz75u83+HDphv+7zPmJN5NP4uP++7B5OsHW7N3zz7UrPITUiRKhRgRY2HN6i/333/PX8BPVWvBw80TcZ3jo1Cewrh10/+zskX2ejyzx3NXQL6+vujbqx/SpcwAt5juyJA6EwYPGAKTyQRD1KA7OztbPXd0dETq1KnRt29flCpVCvaqcePGmDNnjvrZyckJrq6uyJQpE+rUqaNek8/JyPYc2AVfixPUubPnUKFMRVSrURVGljBhQvQb0BcpUqZQX/L5cxegZrVaOHh0P9KlN8YwpAkTeqD3gF5IniK5Wse/5i1Ener1sOfwLqRNn1adzKpXqIEOXTtg2Kghav//5/QZm9/nN+/dYLVP/3vuX9QoXwuVq1VUz39t3gbez30wf+kcuMZxxfLFK9C8/v+wZd9GZMqSEUbY1gs/beu9n7b169dvUKJUcfXo/Xtf2KJI9VIBFudj0+M3+LDsCiKkjq2ef9hwA3jni4hVksEhqhN8/32Gj2uvwaFeajjGiwbT03fq9yOW9IJD7Mjwe/wWH7fcBD74walIQtiS169eIUOmDGj
QuAHq1qz3xetXr1xFqSKl0LBJQ/T8swdixoqJ8+f+RWQbbi231+OZPZ67Aho5bCSmT5mOqTOnIG26tDh+7ARaNm+FWLFioXWbVtAbB9N3XDrI1ce+ffuQMWNGuLi4hO6S2RgJwh88eIBZs2apz0l+3rhxIwYNGoSCBQvi77//Vl/0sOLj46Mupu4/uat2vrDWpWNXbFi/Ef+cP/XVYTlDQ1j/vYA83D0xcMgANG4adqMavfd7j7CUOF5S9BvcFw2bNEDxAiVRtHgR/N6nJ8Lae9+3Yfa3enb+A5s3bMXhM/vVPpY4TnIMGzsYP9WtaZ4nVcJ0+KN/TzRo8mWwE1IiRwjb4ChRvKTo/2lba/bs2ovyJSvi5sPriB3bOmkTWty7FwuV9/244zZ8r3ojUtN0aru+G3sKTiW8ECGdq3medxNOw6mgByJkihP4exx5AN9TjxG5efoQX74ng3chLEgGfeGyv1CxcgXztEb1GiOiU0RMnzMNYcnX5GuXx7NIjpFgb+cuP1PYtTxVr1wDcePGxaRpE83T6v5UD1GiRMXMudPDND5L4JYQ3t7eX43PvuuSMEKECCpL/vz55+Y++ixy5MiIHz++ujLNli0bevTogdWrV2PDhg2YPXu2mufmzZuoXLmyuvOqbJiffvpJBfOW+vfvr3aimDFjonnz5ujWrdsXw1vq2fv377Hor0Vo2LhBuAfLYUkuzJYsXopXr14hd55cMOo6Llu8HK9fvUau3Dnx6OEjHD18FO5x3VGyUCmk8EyFcsXL48C+AzAS2aeXLVqOuo1qm/fpXHlyYNWyv/Hs6TP4+flh5ZJVePf2LfIXygcjbmsjkrIU33NPESGDm3m7OnhEh9+FZzC9+agyi5JBx0cTHL1iBv1G73zhECUCjET26U3rNyNFqhSoXK4KkngkQ5F8RQMtg7FV9no8s8dzl8iTNzd27tiFSxcvqeenT/2D/fsOoFSZktCj726zyZAhA65evRo6S2NAxYoVQ+bMmbFixQp1wJPg/OnTp9i1axe2bNmiPstatWqZ51+wYAEGDBiAIUOG4NixY0iUKBEmTZr0zb/z7t07dVVm+Qgva1avUbWp9RvWhz04888ZxHGOC+doLmjbuh0WL1uoms+M5Ow/Z+Hh4gn3GPHQ8deOWLB0HtKkS4Pr166r1wf1G4xGzRph+ZplyJw1MyqVroIrl67AKNb/vVGVs9Sp//m7On3+VHUfCMmaJ3ROjE5tumL24plIljwpbH1bJ3DxRJwY8dDBYlsbkd9lbxVcR0jvZp4WsUISmHxNeD/xH7wffVKVr0SsnBQOLpEDfQ/Ts3fwPfEoyOy6rZJgVYZOHjl0FEqWKoG/169CxSoVVSnMnt17Ycvs/XhmT+cuS526dkKNn6oja4bscI7qgnw58+OXtq1Ru+7n47qefHfNhWR3O3fujH79+iF79uyIHj261evhUU6hd2nSpMHp06exbds2/PPPP7h27Rq8vLzUa3PnzkX69Olx5MgR5MyZE+PGjUOzZs1Up1vx559/YvPmzd8cY15Kafr06QM9mDNrLkqVKQUPjwSwB6lSp8KhYwfg7e2DlctXokXT/2Hz9o2GOtClTJ0Se47sVhd+q5evRstmrbF+61p10SmaNG+M+o38yzoyZ82EXdt3Yd7s+arW0wgWzPkLxUsXQ3yP+OZpg/oMhc9zHyxfvwSubq7YsGajqkFfs3UV0mVIa9Pbem+Abb1h61pDBum+/zyBY9JYcIgR0TxNOnyqGvQaKYCoEVQQ/2HtdUSslRKO7lGtft/04j3er7gMx1QuhgvQte92+Url8Gv7X9XPmbJkwqEDhzBj6gwULFQAtsrej2f2dO6ytHzpCixeuASz5s1U63j61Gn81uk3JEiQAPUbhl5ZYqhn0KUTqDR/lCtXDqdOnUKlSpXg6empatHlETt2bNalB0GaSaX59Pz58yow14JzkS5dOvXZyWviwoULyJXLuokp4PPAdO/eXdUzaY9bt8K
…[base64 PNG payload truncated]…AAAAASUVORK5CYII=",
-      "text/plain": [
-       "<Figure size … with … Axes>"
-      ]
-     },
-     "metadata": {},
-     "output_type": "display_data"
-    }
-   ],
+   "execution_count": null,
+   "id": "85",
+   "metadata": {},
+   "outputs": [],
    "source": [
     "# Compute confusion matrix\n",
     "cm = confusion_matrix(true_labels, predicted_labels)\n",
@@ -2048,7 +1418,7 @@
    },
   {
    "cell_type": "markdown",
-   "id": "b449b164",
+   "id": "86",
    "metadata": {},
    "source": [
     "### Explain image classifier predictions\n",
@@ -2058,7 +1428,7 @@
    },
   {
    "cell_type": "markdown",
-   "id": "bac9c664",
+   "id": "87",
    "metadata": {},
    "source": [
     "#### Prepare image for GradCAM\n",
@@ -2068,8 +1438,8 @@
    },
   {
    "cell_type": "code",
-   "execution_count": 50,
-   "id": "972d39aa",
+   "execution_count": null,
+   "id": "88",
    "metadata": {},
    "outputs": [],
    "source": [
@@ -2087,7 +1457,7 @@
    },
   {
    "cell_type": "markdown",
-   "id": "c2dd4920",
+   "id": "89",
    "metadata": {},
    "source": [
     "#### Compute GradCAM heatmap\n",
@@ -2097,8 +1467,8 @@
    },
   {
    "cell_type": "code",
-   "execution_count": 53,
-   "id": "223532a5",
+   "execution_count": null,
+   "id": "90",
    "metadata": {},
    "outputs": [],
    "source": [
@@ -2122,7 +1492,7 @@
    },
   {
    "cell_type": "markdown",
-   "id": "7ce807d1",
+   "id": "91",
    "metadata": {},
    "source": [
     "#### Visualise GradCAM heatmap with the image\n",
@@ -2132,21 +1502,10 @@
    },
   {
    "cell_type": "code",
-   "execution_count": 54,
-   "id": "b73bf41a",
-   "metadata": {},
-   "outputs": [
-    {
-     "data": {
-      "image/png":
"iVBORw0KGgoAAAANSUhEUgAAAZcAAADaCAYAAABq1w8LAAAAOnRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjEwLjEsIGh0dHBzOi8vbWF0cGxvdGxpYi5vcmcvc2/+5QAAAAlwSFlzAAAPYQAAD2EBqD+naQAAMsVJREFUeJztnQeUJUX1/zu+MPMm7szOzAY2w+6ySJKggMhRMaAiKIoZ8RgRc86iiArmHDCn/8EMZjGAyFFBkqCEBXYXNk+el193/c9tfPubnfut3bfQywz6/ZwzZ6FevX7V3VV1u+p++17XGGMcQgghJEW8NA9GCCGECDQuhBBCUofGhRBCSOrQuBBCCEkdGhdCCCGpQ+NCCCEkdWhcCCGEpA6NCyGEkNShcSGEEJI6/3PG5b3vfa/juu79+u7Xv/715Lt33323s7+QY8tvyG/NBo9+9KOddevWzfl2krnNWWed5SxdunS2mzHnufs/4+iiiy7ar3PXbPCQMS4333yz87znPc9ZuHChk81mnQULFjjPfe5zk3Kyd3bs2OG85jWvcVavXu3k83ln/vz5ztFHH+285S1vcaampma7eSQl7rrrLudVr3qVc+CBBzptbW3J39q1a51zzjnHufHGG525ysTEhPO+973POfTQQ51CoZD0UXnIkf65efNm+J1nPvOZyWQrdRB//OMfk8/l79vf/jasc9xxxyWft/JAta/8+c9/dp74xCcmc1Yul3MOOOAA5ylPeYrz3e9+1/lfwH0oxBb70Y9+5Dz72c92ent7nRe/+MXOsmXLEot/8cUXO8PDw873v/9957TTTmvpWI1GI/mTm72vRFHk1Ov1xLjtrycIOS85v6997WvJ018ajIyMOIcffngygM8+++zEwMh1k8nmsssuS/5tPmXKymXnzp3OP//5zz0eU7pNtVp1wjB0fN9PpZ3kgSH38lnPepYTBEHy4CUTted5zr///e9kDG3YsCExPkuWLNnvbZG+K5N7K6v8O++803nsYx/rbNy40TnjjDOc448/3slkMkm//N73vpeM+9tuu22370hfHhgYcAYHB5NxKec2c0zK75900knJWJd/f/GLX8CxJp+vWLFir31+X7jkkkuSe3HYYYc5Z555ptPT05Nc+yuuuCIZM3/4wx92a8OFF17ovPGNb9xvc9esYOY4d9xxh2lrazOrV68227dv3+2zHTt2JOXt7e1m/fr1ezzO1NSUeShw1113ibE3X/va11I75kc+8pHkmFdddZX6bHx83JTL5V3/f+KJJ5qDDz44td8mD944kXGwZs0as3nzZvV5vV43n/zkJ83GjRsflHHywhe+0CxZsmSv9aRdhx56aDLGr7zyStg/3/72t6vyr371qyYMQ/P73/8+6dt//OMfVZ0//OEPyWenn366CYIgmS+mc/7555uBgQFz/PHHp97n165dmxyzWq2qz7Zt26bG+4UXXmj+25jz22Ji0UulkvOlL33J6e/v3+2zvr4+54tf/KJTLBadj3zkI2pv8pZbbnGe85znJE8N8jQ0/bPplMtl59WvfnVyvI6ODuepT32qc++99yb1pP6efC7yxP/kJz85WQLLNpM8VSxfvtz55je/qVYP8mRyyCGHJMv+zs7OZMl8ww03OPub9evXJ6uLY489Vn0m7UBPQnLt5GlPtlVkWT/9+tp8LvK0KucmT6KPf/zjnfb29mT78rzzzktWOmT/IfdHxoGseIeGhtTnspqRPr548WJ1v6R/POlJT0r6vqx4hCuvvDJZRchWjqzU5Xuve93rkrEyk5/85CfJtpL0I/n3xz/+ccvt/uEPf5iMgXe84x27xujM/nn++eer8u985zvO4x73uKSPrlmzJvl/G6eeempyDrKamI5sT8nW2v5Yea9fv9456qijkhXYTGRLGiFznKygpK3y3b///e+7fY7mLvl/2QaV8z/ooIOSe3DkkUcmK6TZZs4bl0svvTSZwE844QT4+aMe9ajk85/
//OfqMxkcYpg++MEPOi95yUusvyGD7NOf/nQywD784Q8n+72nnHJKy2284447nGc84xlJZ//oRz+aGDM55nR/kEy4MgjFEH3sYx9z3vSmNzk33XSTc+KJJ1r3lNNCtkFk6+Bb3/pWS/VHR0edJzzhCcm2ipyPbKPJvvYvf/nLvX5Xfke+K1sWMuFJR3/Pe96T/JH9uyW2cuVK55hjjtmn78k2izwIyIQnTuWnP/3pSblMxDJ2XvGKVyRjQ+rIvy94wQt2+/5vfvOb5DsyyV1wwQXO0572NOdFL3qRc80117T0+z/72c+Sf5///Oe33GYZL7KtJFvlgvz7gx/8wKnVarC+PCCJgZEttiZi0GR8ysPn/hpzl19+uXPPPfe0VF8MnTxIv+xlL3M+8IEPJA9vp59+erINvzf+9Kc/Oa997WsTn7Q8yMmWt4zBNLf57hdmDjM2NpYsGU899dQ91nvqU5+a1JuYmEj+/z3veU/y/89+9rNV3eZnTa699trk/1/72tfuVu+ss85KyqV+E9mqkjJZyjaRpb+UXXHFFbvKZPsum82aN7zhDbvKKpWKiaJot9+Q40i98847b79ui23dutX09/cnx5VtxJe//OXmu9/9bnJ9ZyLbYlLvm9/85q4yWdoPDg6apz/96Xtsp2yFSNm55567qyyOY3PKKaeYTCajtiVIOsjWkVz3pz3taeqz0dHR5Lo3/0qlkrpfb33rW9X3ptdrcsEFFxjXdc2GDRt2lR122GFmaGhot770m9/8JjluK9tihx9+uOnq6jL7wkUXXWTy+fyu8X7bbbclv/fjH/8Ybotdcskl5rLLLkva3twWfNOb3mSWL1++37aCL7744uS3pd+fdNJJ5l3veley7YfmAKk3b948MzIysqv8pz/9aVJ+6aWXWucuQf5f/q655ppdZXJ/crmcOe2008xsMqdXLpOTk8m/slzfE83Pxck3nZe//OV7/Y1f/epXyb+vfOUrdys/99xzW26nqHGmr6xk+06WqLJaaSJLXXGuNp/u5elCtiSk3j/+8Q9nfyKrCHlSk+shq5IvfOELyRObPK2+//3vV1tW0i55CmoiS3vZ8pt+PntClukzl+3yVPm73/0uxbMiTZr9Xu7bTESgIf2x+ffZz35W1ZHVyUxk9d5EtttE5PHIRz4y6SvXXXddUr5lyxbn+uuvd174whc6XV1du+rLCl7GRKtt39v4nolsAcnOQvN7q1atSlbIe9oaO/nkkxNhgIh/5Bzk3+bKZ39w9tlnJ3OLXH/ZMpdxJnOEtPUvf/mLqi/Of9nxaNKcT1oZc494xCOS828iW5myUvv1r3+dzDWzxZw2Ls3O0zQy+2qERIWxN0RlIpP+zLqyxdAqcjNnIh1FJvImcRw7H//4x5POJYZG/Dsy2EURMz4+7uwL0mG2bt26259tS6CJ7MN//vOfTyaEW2+91fnUpz6V/P673/3uRHU3nUWLFqm93ZnnY0OupficpiOyWGF/vh/0v0yz3yNJufgkf/vb31qluOKLkfs9E1FuydauTMhitKSvyBau0OyvMnYE6dMzkYemmVL46f212VbxqextfE/nX//6V2LcREIs29HNP5nEZWtw5gNmE1FoyTa5bD+JP2LTpk37tCUm7Z3efjmfvfH4xz8+meDHxsaS3xQ5uFwz2Rrfvn37HueQpqFpZcyh6y9jTrY1W2nn/6RxkachmRT3ps+Xz8XpLB3V9vS1P7E5BKevCMTv8/rXvz7xEclAl04ng/7ggw9ODM++IANDrsv0P/Q0hBCjIR1PVmbS4cUYzHzia+V8yNwbJ2iPXXwwIvOVyRgxfUU9/eFFVh/ixxRfm/gKpa82xRv72l8FcVBP76/NlwbFnyfGSvp0KzSNpIgLZFJt/olvsFKpJAIBG2JMZKUljnHxJ7a6uhKkvdPbL+fTKm1tbclK5DOf+Yzzzne+MzEYM/2X/41jLnDmOGLlv/zlLydLS6QmEVWLPBGLI+z+Ot5ksIg
GffoTgDwNpYk4HEXZMnOVIE81sorZF0TbL4N9OjJY9hVZYcgTkqxm0kKupSzlm6sVofmOAt/Y3n/INtFXvvIV529/+1uyhflAEKGJ3LNvfOMbuznwZ/a55vsyt99+uzqGrI6nIw8w05VmzdWtvFQojnYxGm9729v22C6ZaGXlIeNo5ja2IFtP8jsiKEDI/CErBHn/RYQ7+4Jch+nzz/19cH34wx+e/JvmmEPXX+6fGLWZCtsHkzm9chFEVSU3UoyH+ClmynvFjyAXUerdH2TpKnzuc5/brVyUMWkiTyYzn0JEkSOS531F5IbyNDr9b/p+7Uz++te/JvvmM5GJSK7pzC2MB4o8oTWRc5b/l22JxzzmMan+Dvk/3vzmNyfjQPb6t23b9oCegJtP0dO/I//9yU9+crd68gQvLwmKEZq+tStGSKTs05GV0/T+2jQuorIUeb7Ija+++mrVFtkyE5mycNVVVyUPkmI85Hsz/8RvISoym/pSVu2yHSzKxX1RpwnS3untt60Em1x++eWwvPkiZ5pjTq7bdL+trAJ/+tOfJn6m2XzBec6vXGQ1IZ1X9PfSCWe+oS+ORnnyEX34/UEcYSKl/MQnPpFMtPIuiEj7mk/bab2JLyswkQnKwBDHqDwdylPWTP/E/kAkyPJbEsVAzlcc9LJ3/dWvfjUxVG9/+9tT+y05njgyxckrWzKy/JftFfmN2XyK+m9Hxok81YuTWiau5hv6YhRkVS6fyfYX8q/MRLaqZDzJe1ny8CPbzbLdhPb/RX4sqyZ5qhfDJg988mAm272thBWShw6JHiATtmwZy3snMnFLuUiFpd3y4CTGR/qwTJa21wTk/TQxROKsly1ohDi65W9/c+qppybzlKzM5FrKw50IWuTVCtlSk/K0kHeL5CFZ3mOSbc7mg7KE05lVzEOEG2+8MZEWi+xR3swVaaz8/0033aTqNiV7SPqK5HzFYtGcc845pre31xQKhUTSeeuttyb1PvShD+1ViixS25mIvFH+pkuRRZos7RcZ5XHHHWeuvvpqVW9/SJHl2on08ogjjkjOUd5WlnacccYZ5h//+IdqN5Jlznzj2iZFbkZLOPnkk5O3ruUNaLnmMyWYZP+9qf+KV7zCrFy5MpGjSl9rys+vv/763eo27xfilltuMY997GOT8dDX12de8pKXmBtuuAH2zR/+8IdJZACR1cub6T/60Y9afkN/umT63e9+tznkkEOSfiNtX7dunXnb295mtmzZYmq1WiLXPeGEE/Z4nGXLliXy5plS5D2xP6TI3/ve98yZZ55pVqxYkdwDOR+5Nu94xzt2Saj39ob+zFchbFJkmbu+/e1vm1WrViX3QM5fzn22eUjEFpsNxPEn8bhkL7j51jLZM6IuEt8SA2ES8uAgOyuiQpu+FT1XmPM+lwcDFNJCtslkG0GW6oQQQv7LfC4PBhKm5Nprr01UKKL7Fz+B/L30pS/dLRYTIYSQ1qBxcZzEwS4KF5EyypaOyBVFC99UqRBCCNk36HMhhBCSOvS5EEIISR0aF0IIIalD40IIIWT2HPq9y3FcqM6eNlW2yBZDKtbRT8d24qidbWE7+H4V1s2iCAcRdiUZP1RlxTI+bgME6PN8/MZ+tVZRZYWOLKxbLGnpc62Bj1sogOsggNNrRA1YtRGDiMkxvvUoC0Ook+ndV7eqExm15XHlcl03eHQMtzdwdRva8vg5KI51G3J5fG43XqVjMM0G+Z7TYHk2r/tmZ3c3PojRfbZSKsGqoQfuibFce3SZY8tY8vTAq9dxiPcYuHbBbbb240wG31OUTCuyxNVEWSFt2IJzxgacnyVziXH0mLZFYzENfdwwxOfcAHNbuYLb64EII2GA5xoDzi0I8blt22gPELrrt/dagxBCCNlHaFwIIYSkDo0LIYSQ1KFxIYQQMnsO/f4+nZ9byLRpD1WtitOWRnXtcNy5A6f4LeS0g6qzoB2egtvQdauVOm6Dq8vLVezcrNR03Y4OnCQoamg
nW6mIUw/XasAp6OFza9Tjlh2O9To+DxPrZ4hKBdfNZrXTs2FJodyW0W12Lc8ryIkYNfBxkf84G+CuWi7r6xDPXtrwlmhvw45lPwTXKMJiEwOc3qWiFpUImUBf0GwG3yfkDG8AZ3PSBhf0QTAWk2OAXO7ZLO7zBnQAW9+OkHDHohSIbcIEIDaIbKoAo+9Rw3LOfqDnxtiS0z70dZtdIAgQPE+XG0unR6XBjMyjTepAdPNAXrHnyoUQQkjq0LgQQghJHRoXQgghqUPjQgghZPYc+hnLW53t4K3iUhE79KtV7ZzMZvAb6GGgHeexxXE2VdWOzIbFIVcGzulyDTsLxaU2k270trM4yOu6DQFw6AnZjI5qUKxg5/bEBH7rOp/Ptvx2tKnrZ4hGGXvqaiWdRTKTw3Xn9+i3x0PLm9Tb7tmqyiYnbW/o6+vmW5ybpZIWXVSAk38uAXy3CWEI3ni3iCki4GQPfItQwAtacpoLtajesiO8DpzptnGHxpIL7nPye0CsIIn7ED64mHWLAKEKIkrYxqntDX0T6fOIgZhHiOr63vlAXCG05wotnZswNaHHaLVmeUMfrB9sK4o6EA81LOfWCly5EEIISR0aF0IIIalD40IIISR1aFwIIYSkDo0LIYSQ2VOLVYo4DEWY1fapZgktgkKkoPwhwuiIVkS053DdQiGnylwPqxzGpnS4mUaEj5sF+R9yOaxumwAKjjDQ7RLybVrpNT61HdYNAhwiI45ROBWLYgTUdS1qGNfR5RYBGIwNEYY4hw1SBtnCytQaIEdLxhYKQ987y6nNGRoWVZ8HQ+TYwv+AsWRR1JXLtZbVnzBvCq7qRDU9J8QgPIrgg0QmQYDVbdWqbq8HFG+2nCfVmh6L9x0Dq9MMaLNNIeei87PGSDEtKwVhXUvyFxfkaIkt+asikO/IImKFx2X4F0IIIXMKGhdCCCGpQ+NCCCEkdWhcCCGEzJ5DvziJHfoeyEFRrmKHZRBq53TdErYAOdRg7oYE4CyOLaICkNsEOfSEGmhbuYhDSBSn9PXp7u7CbQAhJGzOxnw+13IoHZsj2wXXLQJOc6GjU3eJ+X06XI1QaNNta2svtByCJJ/LtCwqKHTiPDqVsr7PljQxc4aaJX+QC/J01C3hVFB/seWxMUZf59hYvLoOCLtkEd3ERp+HMXhKiSIk6MDimHpN96EcCI+StAGMZ9et7JM4Bs0rxuCL6QIPt4lxh8tk9b1rB/mvhCyYG0MgKEraAKarMMDjGWV0yVjy6KC8NA8kNxJXLoQQQlKHxoUQQkjq0LgQQghJHRoXQgghqUPjQgghZBaThYVYteSD0CmZjCW5UE1LD6I6Vjlks7ppXoiVM6WqTqhVLGIFRy6jz6NmEVrUyvqD8iQ+7lRZq7dy7VhlFQG1ULWE1XhLlyyE5cWpsirbNL4T1q0BRU3DEt5iXrdW5XSDhHBCDtyj9vb2lp9jcrhLOV09Haqsbiyqwioot4Q2mSv4fr7l0CK+bzlvoOKJLWGMgkD3Q9fDfbMe6ftXq+GQPoGv+0rk4LoRGLv1AlZD1UKdbDBY0AfrGiCRbBQnYN3ugW78ew09HicmcZK+qA5Cr0zhCaQto+9z3vI8j0KyoNA2//lEfz/ACRqzIGRWDNSYgkHKxAew/ODKhRBCSOrQuBBCCEkdGhdCCCGpQ+NCCCFk9hz6gcW55IFkD5kQhziYmpxqOadHT7d2hrkudkRNThZ1u3zsLIwj7Qm1+Iqdjjbg9LQkOMjl9e8V2rHjdsfosCqrTmCH/gGDC2D5xg2bVZlbs7StHdwP1+JYrOjQGd44blseiCN8r9GyQzfosTzbZPV5lIZxSA+U86Jex22YK3ieLeyN03L+jxrIeRKDECtCmOsBP2ZxbleB878XN8J0gRZ34D6Y7QWiEKwpcAIQmiZ7AJ5/ilPa8R4V8f3vOrgTlo9PAgHAKD6PEISackbx9fFAaCJX63ASAhAKx4twyBuzfVDX9SzqmECfR71Ubz1
[... base64-encoded PNG data of the notebook's embedded "image/png" cell output omitted ...]", - "text/plain": [ - "
" - ] - }, - "metadata": {}, - "output_type": "display_data" - } - ], + "execution_count": null, + "id": "92", + "metadata": {}, + "outputs": [], "source": [ "# Combine CAM with image\n", "visualization = show_cam_on_image(img_np, grad_cam_matrix, use_rgb=True)\n", diff --git a/tutorial/tests/test_25_libraries_image_classification.py b/tutorial/tests/test_25_libraries_image_classification.py index 31834d26..c93bd187 100644 --- a/tutorial/tests/test_25_libraries_image_classification.py +++ b/tutorial/tests/test_25_libraries_image_classification.py @@ -1,6 +1,7 @@ -import pytest import cv2 import numpy as np +import pytest + def reference_scale_image(image, scale_factor): # Get the current dimensions @@ -14,6 +15,7 @@ def reference_scale_image(image, scale_factor): # Resize the image return cv2.resize(image, new_size) + @pytest.mark.parametrize("scale_factor", [0.5, 1.0, 2.0]) def test_scale_image(scale_factor, function_to_test): image = np.ones((32, 32, 3), dtype=np.uint8) * 255 @@ -21,44 +23,49 @@ def test_scale_image(scale_factor, function_to_test): image_reference = reference_scale_image(image, scale_factor) assert image_test.shape == image_reference.shape + def reference_crop_image(image, x: int, y: int, width: int, height: int): - x1, x2, y1, y2 = x, x+width, y, y+height - return image[y:y + height, x:x + width] - -@pytest.mark.parametrize("x, y, width, height", [ - (2, 2, 2, 2), - (5, 5, 4, 4), - (10, 10, 6, 6) -]) + x1, x2, y1, y2 = x, x + width, y, y + height + return image[y : y + height, x : x + width] + + +@pytest.mark.parametrize( + "x, y, width, height", [(2, 2, 2, 2), (5, 5, 4, 4), (10, 10, 6, 6)] +) def test_crop_image(x, y, width, height, function_to_test): image = np.ones((32, 32, 3), dtype=np.uint8) * 255 - image_test = function_to_test(image, x, y, width, height) + image_test = function_to_test(image, x, y, width, height) image_reference = reference_crop_image(image, x, y, width, height) assert image_test.shape == image_reference.shape + def 
reference_horizontal_flip_image(image): return cv2.flip(image, 1) + def test_horizontal_flip_image(function_to_test): image = np.ones((32, 32, 3), dtype=np.uint8) * 255 image_test = function_to_test(image) image_reference = reference_horizontal_flip_image(image) assert np.allclose(image_test, image_reference) + def reference_vertical_flip_image(image): return cv2.flip(image, 0) + def test_vertical_flip_image(function_to_test): image = np.ones((32, 32, 3), dtype=np.uint8) * 255 image_test = function_to_test(image) image_reference = reference_vertical_flip_image(image) assert np.allclose(image_test, image_reference) + def reference_rotate_image(image, angle: float): (h, w) = image.shape[:2] center = (w // 2, h // 2) M = cv2.getRotationMatrix2D(center, angle, scale=1.0) - + # Compute new bounding dimensions cos = np.abs(M[0, 0]) sin = np.abs(M[0, 1]) @@ -72,6 +79,7 @@ def reference_rotate_image(image, angle: float): # Perform rotation with expanded canvas return cv2.warpAffine(image, M, (new_w, new_h)) + @pytest.mark.parametrize("angle", [5, 10, 20, 30]) def test_rotate_image(angle, function_to_test): image = np.ones((32, 32, 3), dtype=np.uint8) * 255 @@ -79,19 +87,23 @@ def test_rotate_image(angle, function_to_test): image_reference = reference_rotate_image(image, angle) assert np.allclose(image_test, image_reference) + def reference_average_filter(image, kernel_size): return cv2.blur(image, kernel_size) -@pytest.mark.parametrize("kernel_size", [(3,3), (5,5)]) + +@pytest.mark.parametrize("kernel_size", [(3, 3), (5, 5)]) def test_average_filter(kernel_size, function_to_test): image = np.ones((32, 32, 3), dtype=np.uint8) * 255 image_test = function_to_test(image, kernel_size) image_reference = reference_average_filter(image, kernel_size) assert np.allclose(image_test, image_reference) + def reference_median_filter(image, ksize): return cv2.medianBlur(image, ksize) + @pytest.mark.parametrize("ksize", [3, 5]) def test_median_filter(ksize, function_to_test): image = 
np.ones((32, 32, 3), dtype=np.uint8) * 255 @@ -99,24 +111,25 @@ def test_median_filter(ksize, function_to_test): image_reference = reference_median_filter(image, ksize) assert np.allclose(image_test, image_reference) + def reference_gaussian_filter(image, kernel_size, sigma): return cv2.GaussianBlur(image, kernel_size, sigma) -@pytest.mark.parametrize("kernel_size, sigma", [ - ((3,3),0), - ((5,5),0), - ((3,3),1), - ((5,5),1) -]) + +@pytest.mark.parametrize( + "kernel_size, sigma", [((3, 3), 0), ((5, 5), 0), ((3, 3), 1), ((5, 5), 1)] +) def test_gaussian_filter(kernel_size, sigma, function_to_test): image = np.ones((32, 32, 3), dtype=np.uint8) * 255 image_test = function_to_test(image, kernel_size, sigma) image_reference = reference_gaussian_filter(image, kernel_size, sigma) assert np.allclose(image_test, image_reference) + def reference_adjust_brightness(image, brightness_value): return cv2.convertScaleAbs(image, beta=brightness_value) + @pytest.mark.parametrize("brightness_value", [-30, -20, -10, 0, 10, 20, 30]) def test_adjust_brightness(brightness_value, function_to_test): image = np.ones((32, 32, 3), dtype=np.uint8) * 255 @@ -124,8 +137,10 @@ def test_adjust_brightness(brightness_value, function_to_test): image_reference = reference_adjust_brightness(image, brightness_value) assert np.allclose(image_test, image_reference) + def reference_adjust_contrast(image, contrast_value): - return cv2.convertScaleAbs(image, alpha = contrast_value) + return cv2.convertScaleAbs(image, alpha=contrast_value) + @pytest.mark.parametrize("contrast_value", [0.5, 1.0, 1.5, 2.0]) def test_adjust_contrast(contrast_value, function_to_test): @@ -134,6 +149,7 @@ def test_adjust_contrast(contrast_value, function_to_test): image_reference = reference_adjust_contrast(image, contrast_value) assert np.allclose(image_test, image_reference) + def reference_adjust_saturation(image, saturation_factor): # Convert the image from BGR to HSV image_hsv = cv2.cvtColor(image, cv2.COLOR_RGB2HSV) @@ 
-150,9 +166,10 @@ def reference_adjust_saturation(image, saturation_factor): # Convert the adjusted image back to BGR return cv2.cvtColor(image_hsv_adjusted, cv2.COLOR_HSV2RGB) + @pytest.mark.parametrize("saturation_factor", [0.5, 1.0, 1.5, 2.0]) def test_adjust_saturation(saturation_factor, function_to_test): image = np.ones((32, 32, 3), dtype=np.uint8) * 255 image_test = function_to_test(image, saturation_factor) image_reference = reference_adjust_saturation(image, saturation_factor) - assert np.allclose(image_test, image_reference) \ No newline at end of file + assert np.allclose(image_test, image_reference) From f2e98ae08bc0679de0404757614e8056b60d1e2e Mon Sep 17 00:00:00 2001 From: Lopes Date: Tue, 6 May 2025 16:26:16 +0200 Subject: [PATCH 03/10] Applying pre-commit on the files --- 25_libraries_image_classification.ipynb | 82 ++++++++----------- .../test_25_libraries_image_classification.py | 14 ++-- 2 files changed, 40 insertions(+), 56 deletions(-) diff --git a/25_libraries_image_classification.ipynb b/25_libraries_image_classification.ipynb index 3a8c6081..7c33a47d 100644 --- a/25_libraries_image_classification.ipynb +++ b/25_libraries_image_classification.ipynb @@ -523,16 +523,9 @@ "%%ipytest\n", "\n", "def solution_scale_image(img, scale_factor: float):\n", - " # Get the current dimensions\n", - " height, width = img.shape[:2]\n", - "\n", - " # Calculate the new dimensions\n", - " new_width = int(width * scale_factor)\n", - " new_height = int(height * scale_factor)\n", - " new_size = (new_width, new_height)\n", - "\n", - " # Resize the image\n", - " return cv2.resize(img, new_size)" + " # Start your code here\n", + " return\n", + " # End your code here" ] }, { @@ -567,8 +560,9 @@ "source": [ "%%ipytest\n", "def solution_crop_image(img, x: int, y: int, width: int, height: int):\n", - " x1, x2, y1, y2 = x, x+width, y, y+height\n", - " return img[y:y + height, x:x + width]" + " # Start your code here\n", + " return\n", + " # End your code here" ] }, { @@ 
-603,7 +597,9 @@ "source": [ "%%ipytest\n", "def solution_horizontal_flip_image(img):\n", - " return cv2.flip(img, 1)" + " # Start your code here\n", + " return\n", + " # End your code here" ] }, { @@ -638,7 +634,9 @@ "source": [ "%%ipytest\n", "def solution_vertical_flip_image(img):\n", - " return cv2.flip(img, 0)" + " # Start your code here\n", + " return\n", + " # End your code here" ] }, { @@ -673,22 +671,9 @@ "source": [ "%%ipytest\n", "def solution_rotate_image(img, angle: float):\n", - " (h, w) = img.shape[:2]\n", - " center = (w // 2, h // 2)\n", - " M = cv2.getRotationMatrix2D(center, angle, scale=1.0)\n", - " \n", - " # Compute new bounding dimensions\n", - " cos = np.abs(M[0, 0])\n", - " sin = np.abs(M[0, 1])\n", - " new_w = int((h * sin) + (w * cos))\n", - " new_h = int((h * cos) + (w * sin))\n", - "\n", - " # Adjust rotation matrix for translation\n", - " M[0, 2] += (new_w / 2) - center[0]\n", - " M[1, 2] += (new_h / 2) - center[1]\n", - "\n", - " # Perform rotation with expanded canvas\n", - " return cv2.warpAffine(img, M, (new_w, new_h))" + " # Start your code here\n", + " return\n", + " # End your code here" ] }, { @@ -736,7 +721,9 @@ "source": [ "%%ipytest\n", "def solution_average_filter(img, kernel_size = (5, 5)):\n", - " return cv2.blur(img, kernel_size)" + " # Start your code here\n", + " return\n", + " # End your code here" ] }, { @@ -771,7 +758,9 @@ "source": [ "%%ipytest\n", "def solution_median_filter(img, ksize):\n", - " return cv2.medianBlur(img, ksize)" + " # Start your code here\n", + " return\n", + " # End your code here" ] }, { @@ -806,7 +795,9 @@ "source": [ "%%ipytest\n", "def solution_gaussian_filter(img, kernel_size = (5, 5), sigma = 0):\n", - " return cv2.GaussianBlur(img, kernel_size, sigma)" + " # Start your code here\n", + " return\n", + " # End your code here" ] }, { @@ -854,7 +845,9 @@ "source": [ "%%ipytest\n", "def solution_adjust_brightness(img, brightness_value):\n", - " return cv2.convertScaleAbs(img, 
beta=brightness_value)" + " # Start your code here\n", + " return\n", + " # End your code here" ] }, { @@ -892,7 +885,9 @@ "source": [ "%%ipytest\n", "def solution_adjust_contrast(img, contrast_value):\n", - " return cv2.convertScaleAbs(img, alpha = contrast_value)" + " # Start your code here\n", + " return\n", + " # End your code here" ] }, { @@ -930,20 +925,9 @@ "source": [ "%%ipytest\n", "def solution_adjust_saturation(img, saturation_factor):\n", - " # Convert the image from BGR to HSV\n", - " image_hsv = cv2.cvtColor(img, cv2.COLOR_RGB2HSV)\n", - "\n", - " # Split the HSV image into Hue, Saturation, and Value channels\n", - " hue, saturation, value = cv2.split(image_hsv)\n", - "\n", - " # Adjust the saturation channel (Ensure it stays within valid range)\n", - " saturation = np.clip(saturation * saturation_factor, 0, 255)\n", - "\n", - " # Merge the channels back\n", - " image_hsv_adjusted = cv2.merge([hue, saturation.astype(np.uint8), value])\n", - "\n", - " # Convert the adjusted image back to BGR\n", - " return cv2.cvtColor(image_hsv_adjusted, cv2.COLOR_HSV2RGB)" + " # Start your code here\n", + " return\n", + " # End your code here" ] }, { diff --git a/tutorial/tests/test_25_libraries_image_classification.py b/tutorial/tests/test_25_libraries_image_classification.py index c93bd187..d5e1eccc 100644 --- a/tutorial/tests/test_25_libraries_image_classification.py +++ b/tutorial/tests/test_25_libraries_image_classification.py @@ -26,7 +26,7 @@ def test_scale_image(scale_factor, function_to_test): def reference_crop_image(image, x: int, y: int, width: int, height: int): x1, x2, y1, y2 = x, x + width, y, y + height - return image[y : y + height, x : x + width] + return image[y1:y2, x1:x2] @pytest.mark.parametrize( @@ -64,20 +64,20 @@ def test_vertical_flip_image(function_to_test): def reference_rotate_image(image, angle: float): (h, w) = image.shape[:2] center = (w // 2, h // 2) - M = cv2.getRotationMatrix2D(center, angle, scale=1.0) + mat = 
cv2.getRotationMatrix2D(center, angle, scale=1.0) # Compute new bounding dimensions - cos = np.abs(M[0, 0]) - sin = np.abs(M[0, 1]) + cos = np.abs(mat[0, 0]) + sin = np.abs(mat[0, 1]) new_w = int((h * sin) + (w * cos)) new_h = int((h * cos) + (w * sin)) # Adjust rotation matrix for translation - M[0, 2] += (new_w / 2) - center[0] - M[1, 2] += (new_h / 2) - center[1] + mat[0, 2] += (new_w / 2) - center[0] + mat[1, 2] += (new_h / 2) - center[1] # Perform rotation with expanded canvas - return cv2.warpAffine(image, M, (new_w, new_h)) + return cv2.warpAffine(image, mat, (new_w, new_h)) From 875eeb2045e12068548a9da3a301ee6d19500743 Mon Sep 17 00:00:00 2001 From: Fabio Lopes Date: Wed, 7 May 2025 10:57:34 +0200 Subject: [PATCH 04/10] Move the code from code cells into markdown cells --- 25_libraries_image_classification.ipynb | 789 ++++++++++++++---------- 1 file changed, 455 insertions(+), 334 deletions(-) diff --git a/25_libraries_image_classification.ipynb b/25_libraries_image_classification.ipynb index 7c33a47d..67fc4469 100644 --- a/25_libraries_image_classification.ipynb +++ b/25_libraries_image_classification.ipynb @@ -154,223 +154,303 @@ "- **ImageClassifier** is responsible for building the image classification model, which in this case is based on Convolutional Neural Networks (CNNs).\n", "- **Trainer** handles the training and evaluation processes using batches of data.\n", - "By organizing the code in this way, we simplify debugging and future extensions.\n" + "By organizing the code in this way, we simplify debugging and future extensions." + ] + }, + { + "cell_type": "markdown", + "id": "8", + "metadata": {}, + "source": [ + "The classes are currently not complete. 
Use the following code to prepare them:\n", + "\n", + "```ImageDataset```: \n", + "\n", + "In ```__init__```, initialise these three attributes:\n", + "```python\n", + "self.images = images\n", + "self.labels = labels\n", + "self.transform = transform\n", + "```\n", + "\n", + "Complete the function ```__len__```:\n", + "```python\n", + "return len(self.images)\n", + "```\n", + "\n", + "Complete the function ```__getitem__```:\n", + "```python\n", + "image = self.images[idx]\n", + "label = self.labels[idx]\n", + "\n", + "# Ensure the image is in the shape (H, W, C) for Albumentations\n", + "image = np.transpose(image, (1, 2, 0))\n", + "\n", + "# Apply Albumentations transforms\n", + "if self.transform:\n", + " augmented = self.transform(image=image)\n", + " image = augmented['image']\n", + "\n", + "return image, label\n", + "```" ] }, { "cell_type": "code", "execution_count": null, - "id": "8", + "id": "9", "metadata": {}, "outputs": [], "source": [ "class ImageDataset(Dataset):\n", " def __init__(self, images, labels, transform=None):\n", - " self.images = images\n", - " self.labels = labels\n", - " self.transform = transform\n", + " pass\n", " \n", " def __len__(self):\n", - " return len(self.images)\n", + " return\n", " \n", " def __getitem__(self, idx):\n", - " image = self.images[idx]\n", - " label = self.labels[idx]\n", - " \n", - " # Ensure the image is in the shape (H, W, C) for Albumentations\n", - " image = np.transpose(image, (1, 2, 0))\n", - " \n", - " # Apply Albumentations transforms\n", - " if self.transform:\n", - " augmented = self.transform(image=image)\n", - " image = augmented['image']\n", + " return" ] }, { "cell_type": "markdown", "id": "10", "metadata": {}, "source": [ "```ImageClassifier```: \n", "\n", "In ```__init__```, initialise the attributes needed for building the neural network:\n", "```python\n", "# Best model weights\n", "self.best_model_weights = None\n", "\n", "# 
Convolutional block layers\n", + "self.feature_maps = 64\n", + "\n", + "# First convolutional block\n", + "self.conv1 = nn.Conv2d(in_channels, self.feature_maps, kernel_size = 3)\n", + "self.pool1 = nn.MaxPool2d(kernel_size = 2)\n", + "self.bn1 = nn.BatchNorm2d(self.feature_maps)\n", + "\n", + "# Second convolutional block\n", + "self.conv2 = nn.Conv2d(self.feature_maps, self.feature_maps * 2, kernel_size = 3)\n", + "self.pool2 = nn.MaxPool2d(kernel_size = 2)\n", + "self.bn2 = nn.BatchNorm2d(self.feature_maps * 2)\n", + "\n", + "# Third convolutional block\n", + "self.conv3 = nn.Conv2d(self.feature_maps * 2, self.feature_maps * 4, kernel_size = 3)\n", + "self.pool3 = nn.MaxPool2d(kernel_size = 2)\n", + "self.bn3 = nn.BatchNorm2d(self.feature_maps * 4)\n", + "\n", + "# Rectified linear unit (activation function)\n", + "self.relu = nn.ReLU()\n", + "\n", + "# Flatten layer that produces the features vector\n", + "self.flatten = nn.Flatten(start_dim=1)\n", + "\n", + "# Dropout layer is responsible for introducing regularisation into training process\n", + "self.dropout = nn.Dropout(p = 0.3)\n", + "\n", + "# Number of output classes\n", + "self.out_classes = out_classes\n", + "\n", + "# Classification layer \n", + "self.fc = nn.Linear(1024, self.out_classes)\n", + "```\n", + "\n", + "Then, complete the function named ```forward```, which essentially defines the neural network architecture:\n", + "```python\n", + "# Convolutional block 1\n", + "x = self.conv1(x)\n", + "x = self.pool1(x)\n", + "x = self.relu(x)\n", + "x = self.bn1(x)\n", + "\n", + "# Convolutional block 2\n", + "x = self.conv2(x)\n", + "x = self.pool2(x)\n", + "x = self.relu(x)\n", + "x = self.bn2(x)\n", + "\n", + "# Convolutional block 3\n", + "x = self.conv3(x)\n", + "x = self.pool3(x)\n", + "x = self.relu(x)\n", + "x = self.bn3(x)\n", + "\n", + "# Classifier\n", + "x = self.flatten(x)\n", + "x = self.dropout(x)\n", + "x = self.fc(x)\n", + "return x\n", + "```" + ] + }, + { + "cell_type": "code", + 
"execution_count": null, + "id": "11", + "metadata": {}, + "outputs": [], + "source": [ "class ImageClassifier(nn.Module):\n", - " def __init__(self, in_channels=1, out_classes = 1):\n", + " def __init__(self, in_channels = 1, out_classes = 1):\n", " super(ImageClassifier, self).__init__()\n", + " \n", + " def forward(self, x):\n", + " return" + ] + }, + { + "cell_type": "markdown", + "id": "12", + "metadata": {}, + "source": [ + "```Trainer```: \n", + "\n", + "In ```__init__```, initialise these three atributes:\n", + "```python\n", + "self.model = model\n", + "self.train_losses = []\n", + "self.val_losses = []\n", + "```\n", + "\n", + "Complete the ```fit``` function, which is used for optimising the model:\n", + "```python\n", + "early_stopping_count = 0\n", + "best_val_loss = 9999\n", + "best_epoch = 0\n", + "\n", + "# Training log file\n", + "log_filename = \"training_log.txt\"\n", + "with open(log_filename, \"w\") as log_file:\n", + " log_file.write(\"Epoch,Train Loss,Val Loss,Best Val Loss,Best Epoch\\n\")\n", + "\n", + "\n", + "for epoch in range(epochs):\n", + " # Train mode\n", + " self.model.train()\n", + " \n", + " train_loss = 0.0\n", + " train_samples_count = 0.0\n", + " for i, data in enumerate(train_dataloader):\n", + " inputs, labels = data\n", + " inputs = inputs.to(device)\n", + " labels = labels.long().to(device)\n", " \n", - " # Best model weights\n", - " self.best_model_weights = None\n", - " \n", - " # Convolutional block layers\n", - " self.feature_maps = 64\n", - " \n", - " # First convolutional block\n", - " self.conv1 = nn.Conv2d(in_channels, self.feature_maps, kernel_size = 3)\n", - " self.pool1 = nn.MaxPool2d(kernel_size = 2)\n", - " self.bn1 = nn.BatchNorm2d(self.feature_maps)\n", - " \n", - " # Second convolutional block\n", - " self.conv2 = nn.Conv2d(self.feature_maps, self.feature_maps * 2, kernel_size = 3)\n", - " self.pool2 = nn.MaxPool2d(kernel_size = 2)\n", - " self.bn2 = nn.BatchNorm2d(self.feature_maps * 2)\n", - " \n", - " # 
Third convolutional block\n", - " self.conv3 = nn.Conv2d(self.feature_maps * 2, self.feature_maps * 4, kernel_size = 3)\n", - " self.pool3 = nn.MaxPool2d(kernel_size = 2)\n", - " self.bn3 = nn.BatchNorm2d(self.feature_maps * 4)\n", - " \n", - " # Rectified linear unit (activation function)\n", - " self.relu = nn.ReLU()\n", - " \n", - " # Flatten layer that produces the features vector\n", - " self.flatten = nn.Flatten(start_dim=1)\n", + " # zero the parameter gradients\n", + " optimizer.zero_grad()\n", + "\n", + " # Forward step\n", + " outputs = self.model(inputs)\n", " \n", - " # Dropout layer is responsible for introducing regularisation into training process\n", - " self.dropout = nn.Dropout(p = 0.3)\n", + " # Backward step\n", + " loss = criterion(outputs, labels)\n", + " loss.backward()\n", " \n", - " # Number of output classes\n", - " self.out_classes = out_classes\n", + " # Weight optimisation\n", + " optimizer.step()\n", + "\n", + " train_loss += loss.item()\n", + " train_samples_count += 1\n", " \n", - " # Classification layer \n", - " self.fc = nn.Linear(1024, self.out_classes)\n", + " # Validation mode\n", + " self.model.eval()\n", " \n", - " def forward(self, x):\n", - " # Convolutional block 1\n", - " x = self.conv1(x)\n", - " x = self.pool1(x)\n", - " x = self.relu(x)\n", - " x = self.bn1(x)\n", + " val_loss = 0.0\n", + " val_samples_count = 0.0\n", + " for i, data in enumerate(val_dataloader):\n", + " inputs, labels = data\n", + " inputs = inputs.to(device)\n", + " labels = labels.long().to(device)\n", " \n", - " # Convolutional block 2\n", - " x = self.conv2(x)\n", - " x = self.pool2(x)\n", - " x = self.relu(x)\n", - " x = self.bn2(x)\n", + " outputs = self.model(inputs)\n", + " loss = criterion(outputs, labels)\n", " \n", - " # Convolutional block 3\n", - " x = self.conv3(x)\n", - " x = self.pool3(x)\n", - " x = self.relu(x)\n", - " x = self.bn3(x)\n", + " val_loss += loss.item()\n", + " val_samples_count += 1\n", + " \n", + " train_loss /= 
train_samples_count\n", + " val_loss /= val_samples_count\n", + " \n", + " self.train_losses.append(train_loss)\n", + " self.val_losses.append(val_loss)\n", + " \n", + " early_stopping_count += 1\n", + " \n", + " if val_loss < best_val_loss:\n", + " best_epoch = epoch\n", + " best_val_loss = val_loss\n", + " early_stopping_count = 0\n", + " self.model.best_model_weights = self.model.state_dict()\n", + " \n", " \n", - " # Classifier\n", - " x = self.flatten(x)\n", - " x = self.dropout(x)\n", - " x = self.fc(x)\n", - " return x\n", + " print(f'Epoch: {epoch}, Loss: {train_loss}, Val Loss: {val_loss}. The best val loss is {best_val_loss} in epoch {best_epoch}.')\n", + " \n", + " with open(log_filename, \"a\") as log_file:\n", + " log_file.write(f\"{epoch},{train_loss},{val_loss},{best_val_loss},{best_epoch}\\n\")\n", + " \n", + " if early_stopping_count == early_stopping_limit and early_stopping_limit > 0:\n", + " break\n", + "```\n", + "\n", + "Then, complete the function ```predict```, which is used for predicting labels considering batches of data as input:\n", + "```python\n", + "# Load best weights\n", + "if self.model.best_model_weights:\n", + " self.model.load_state_dict(self.model.best_model_weights)\n", + "\n", + "# Test mode\n", + "self.model.eval()\n", + "\n", + "original_images = []\n", + "true_labels = []\n", + "predicted_labels = []\n", + "\n", + "for data in test_dataloader:\n", + " images, labels = data\n", + " images = images.to(device)\n", + " outputs = self.model(images)\n", + " _, predicted = torch.max(outputs, 1)\n", + " \n", + " images = images.cpu().detach().numpy()\n", + " labels = labels.numpy()\n", + " predicted = predicted.cpu().detach().numpy()\n", + " \n", + " original_images.append(images)\n", + " true_labels.append(labels)\n", + " predicted_labels.append(predicted)\n", "\n", + "original_images = np.concatenate(original_images)\n", + "true_labels = np.concatenate(true_labels)\n", + "predicted_labels = np.concatenate(predicted_labels)\n", 
+ "\n", + "return original_images, true_labels, predicted_labels\n", + "```" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "13", + "metadata": {}, + "outputs": [], + "source": [ "class Trainer():\n", " def __init__(self, model):\n", - " self.model = model\n", - " self.train_losses = []\n", - " self.val_losses = []\n", + " pass\n", " \n", " def fit(self, epochs, train_dataloader, val_dataloader, optimizer, criterion, device, early_stopping_limit = 0):\n", - " early_stopping_count = 0\n", - " best_val_loss = 9999\n", - " best_epoch = 0\n", - " \n", - " # Training log file\n", - " log_filename = \"training_log.txt\"\n", - " with open(log_filename, \"w\") as log_file:\n", - " log_file.write(\"Epoch,Train Loss,Val Loss,Best Val Loss,Best Epoch\\n\")\n", - "\n", - " \n", - " for epoch in range(epochs):\n", - " # Train mode\n", - " self.model.train()\n", - " \n", - " train_loss = 0.0\n", - " train_samples_count = 0.0\n", - " for i, data in enumerate(train_dataloader):\n", - " inputs, labels = data\n", - " inputs = inputs.to(device)\n", - " labels = labels.long().to(device)\n", - " \n", - " # zero the parameter gradients\n", - " optimizer.zero_grad()\n", - "\n", - " # Forward step\n", - " outputs = self.model(inputs)\n", - " \n", - " # Backward step\n", - " loss = criterion(outputs, labels)\n", - " loss.backward()\n", - " \n", - " # Weight optimisation\n", - " optimizer.step()\n", - "\n", - " train_loss += loss.item()\n", - " train_samples_count += 1\n", - " \n", - " # Validation mode\n", - " self.model.eval()\n", - " \n", - " val_loss = 0.0\n", - " val_samples_count = 0.0\n", - " for i, data in enumerate(val_dataloader):\n", - " inputs, labels = data\n", - " inputs = inputs.to(device)\n", - " labels = labels.long().to(device)\n", - " \n", - " outputs = self.model(inputs)\n", - " loss = criterion(outputs, labels)\n", - " \n", - " val_loss += loss.item()\n", - " val_samples_count += 1\n", - " \n", - " train_loss /= train_samples_count\n", - " 
val_loss /= val_samples_count\n", - " \n", - " self.train_losses.append(train_loss)\n", - " self.val_losses.append(val_loss)\n", - " \n", - " early_stopping_count += 1\n", - " \n", - " if val_loss < best_val_loss:\n", - " best_epoch = epoch\n", - " best_val_loss = val_loss\n", - " early_stopping_count = 0\n", - " self.model.best_model_weights = self.model.state_dict()\n", - " \n", - " \n", - " print(f'Epoch: {epoch}, Loss: {train_loss}, Val Loss: {val_loss}. The best val loss is {best_val_loss} in epoch {best_epoch}.')\n", - " \n", - " with open(log_filename, \"a\") as log_file:\n", - " log_file.write(f\"{epoch},{train_loss},{val_loss},{best_val_loss},{best_epoch}\\n\")\n", - " \n", - " if early_stopping_count == early_stopping_limit and early_stopping_limit > 0:\n", - " break\n", + " return\n", " \n", " def predict(self, test_dataloader, device):\n", - " # Load best weights\n", - " if self.model.best_model_weights:\n", - " self.model.load_state_dict(self.model.best_model_weights)\n", - "\n", - " # Test mode\n", - " self.model.eval()\n", - " \n", - " original_images = []\n", - " true_labels = []\n", - " predicted_labels = []\n", - "\n", - " for data in test_dataloader:\n", - " images, labels = data\n", - " images = images.to(device)\n", - " outputs = self.model(images)\n", - " _, predicted = torch.max(outputs, 1)\n", - " \n", - " images = images.cpu().detach().numpy()\n", - " labels = labels.numpy()\n", - " predicted = predicted.cpu().detach().numpy()\n", - " \n", - " original_images.append(images)\n", - " true_labels.append(labels)\n", - " predicted_labels.append(predicted)\n", - " \n", - " original_images = np.concatenate(original_images)\n", - " true_labels = np.concatenate(true_labels)\n", - " predicted_labels = np.concatenate(predicted_labels)\n", - " \n", - " return original_images, true_labels, predicted_labels\n" + " return\n" ] }, { "cell_type": "markdown", - "id": "9", + "id": "14", "metadata": {}, "source": [ "## Functions\n", @@ -381,7 +461,7 @@ { 
"cell_type": "code", "execution_count": null, - "id": "10", + "id": "15", "metadata": {}, "outputs": [], "source": [ @@ -405,7 +485,7 @@ }, { "cell_type": "markdown", - "id": "11", + "id": "16", "metadata": {}, "source": [ "## Dataset\n", @@ -415,7 +495,7 @@ }, { "cell_type": "markdown", - "id": "12", + "id": "17", "metadata": {}, "source": [ "### Load data\n", @@ -426,7 +506,7 @@ { "cell_type": "code", "execution_count": null, - "id": "13", + "id": "18", "metadata": {}, "outputs": [], "source": [ @@ -448,17 +528,17 @@ }, { "cell_type": "markdown", - "id": "14", + "id": "19", "metadata": {}, "source": [ "## Explore image processing\n", "\n", - "Image processing is fundamental to computer vision, forming the basis for interpreting and analyzing visual information. By applying techniques such as resizing, filtering, color adjustments, and data augmentation, image processing enhances input quality, minimizes noise, and corrects distortions. These methods can also simulate real-world variability, helping models generalize better. In this notebook, we explore three categories of image transformations: geometric transformations, image filtering, and photometric transformations." + "Image processing is fundamental to computer vision, forming the basis for interpreting and analyzing visual information. By applying techniques such as resizing, filtering, color adjustments, and data augmentation, image processing enhances input quality, minimizes noise, and corrects distortions. These methods can also simulate real-world variability, helping models generalize better. In this notebook, we explore three categories of image transformations: geometric transformations, image filtering, and photometric transformations. The following cells contain a series of exercises designed to help you explore the OpenCV-Python library. 
If you are unfamiliar with a particular method, refer to the [Image Processing in OpenCV](https://docs.opencv.org/4.x/d2/d96/tutorial_py_table_of_contents_imgproc.html) documentation." ] }, { "cell_type": "markdown", - "id": "15", + "id": "20", "metadata": {}, "source": [ "### Example image" @@ -467,7 +547,7 @@ { "cell_type": "code", "execution_count": null, - "id": "16", + "id": "21", "metadata": {}, "outputs": [], "source": [ @@ -483,7 +563,7 @@ }, { "cell_type": "markdown", - "id": "17", + "id": "22", "metadata": {}, "source": [ "### Geometric transformation\n", @@ -497,7 +577,7 @@ }, { "cell_type": "markdown", - "id": "18", + "id": "23", "metadata": {}, "source": [ "#### Scaling" @@ -506,7 +586,7 @@ { "cell_type": "code", "execution_count": null, - "id": "19", + "id": "24", "metadata": {}, "outputs": [], "source": [ @@ -516,7 +596,7 @@ { "cell_type": "code", "execution_count": null, - "id": "20", + "id": "25", "metadata": {}, "outputs": [], "source": [ @@ -531,7 +611,7 @@ { "cell_type": "code", "execution_count": null, - "id": "21", + "id": "26", "metadata": {}, "outputs": [], "source": [ @@ -545,7 +625,7 @@ }, { "cell_type": "markdown", - "id": "22", + "id": "27", "metadata": {}, "source": [ "#### Cropping" @@ -554,7 +634,7 @@ { "cell_type": "code", "execution_count": null, - "id": "23", + "id": "28", "metadata": {}, "outputs": [], "source": [ @@ -568,7 +648,7 @@ { "cell_type": "code", "execution_count": null, - "id": "24", + "id": "29", "metadata": {}, "outputs": [], "source": [ @@ -582,7 +662,7 @@ }, { "cell_type": "markdown", - "id": "25", + "id": "30", "metadata": {}, "source": [ "#### Horizontal Flip" @@ -591,7 +671,7 @@ { "cell_type": "code", "execution_count": null, - "id": "26", + "id": "31", "metadata": {}, "outputs": [], "source": [ @@ -605,7 +685,7 @@ { "cell_type": "code", "execution_count": null, - "id": "27", + "id": "32", "metadata": {}, "outputs": [], "source": [ @@ -619,7 +699,7 @@ }, { "cell_type": "markdown", - "id": "28", + "id": "33", 
"metadata": {}, "source": [ "#### Vertical Flip" @@ -628,7 +708,7 @@ { "cell_type": "code", "execution_count": null, - "id": "29", + "id": "34", "metadata": {}, "outputs": [], "source": [ @@ -642,7 +722,7 @@ { "cell_type": "code", "execution_count": null, - "id": "30", + "id": "35", "metadata": {}, "outputs": [], "source": [ @@ -656,7 +736,7 @@ }, { "cell_type": "markdown", - "id": "31", + "id": "36", "metadata": {}, "source": [ "#### Rotation" @@ -665,7 +745,7 @@ { "cell_type": "code", "execution_count": null, - "id": "32", + "id": "37", "metadata": {}, "outputs": [], "source": [ @@ -679,7 +759,7 @@ { "cell_type": "code", "execution_count": null, - "id": "33", + "id": "38", "metadata": {}, "outputs": [], "source": [ @@ -693,7 +773,7 @@ }, { "cell_type": "markdown", - "id": "34", + "id": "39", "metadata": {}, "source": [ "### Image filtering\n", @@ -706,7 +786,7 @@ }, { "cell_type": "markdown", - "id": "35", + "id": "40", "metadata": {}, "source": [ "#### Average filter " @@ -715,7 +795,7 @@ { "cell_type": "code", "execution_count": null, - "id": "36", + "id": "41", "metadata": {}, "outputs": [], "source": [ @@ -729,7 +809,7 @@ { "cell_type": "code", "execution_count": null, - "id": "37", + "id": "42", "metadata": {}, "outputs": [], "source": [ @@ -743,7 +823,7 @@ }, { "cell_type": "markdown", - "id": "38", + "id": "43", "metadata": {}, "source": [ "#### Median filter" @@ -752,7 +832,7 @@ { "cell_type": "code", "execution_count": null, - "id": "39", + "id": "44", "metadata": {}, "outputs": [], "source": [ @@ -766,7 +846,7 @@ { "cell_type": "code", "execution_count": null, - "id": "40", + "id": "45", "metadata": {}, "outputs": [], "source": [ @@ -780,7 +860,7 @@ }, { "cell_type": "markdown", - "id": "41", + "id": "46", "metadata": {}, "source": [ "#### Gaussian filter" @@ -789,7 +869,7 @@ { "cell_type": "code", "execution_count": null, - "id": "42", + "id": "47", "metadata": {}, "outputs": [], "source": [ @@ -803,7 +883,7 @@ { "cell_type": "code", "execution_count": 
null, - "id": "43", + "id": "48", "metadata": {}, "outputs": [], "source": [ @@ -817,7 +897,7 @@ }, { "cell_type": "markdown", - "id": "44", + "id": "49", "metadata": {}, "source": [ "### Photometric transformation\n", @@ -830,7 +910,7 @@ }, { "cell_type": "markdown", - "id": "45", + "id": "50", "metadata": {}, "source": [ "#### Adjust brightness" @@ -839,7 +919,7 @@ { "cell_type": "code", "execution_count": null, - "id": "46", + "id": "51", "metadata": {}, "outputs": [], "source": [ @@ -853,7 +933,7 @@ { "cell_type": "code", "execution_count": null, - "id": "47", + "id": "52", "metadata": {}, "outputs": [], "source": [ @@ -870,7 +950,7 @@ }, { "cell_type": "markdown", - "id": "48", + "id": "53", "metadata": {}, "source": [ "#### Adjust contrast" @@ -879,7 +959,7 @@ { "cell_type": "code", "execution_count": null, - "id": "49", + "id": "54", "metadata": {}, "outputs": [], "source": [ @@ -893,7 +973,7 @@ { "cell_type": "code", "execution_count": null, - "id": "50", + "id": "55", "metadata": {}, "outputs": [], "source": [ @@ -910,7 +990,7 @@ }, { "cell_type": "markdown", - "id": "51", + "id": "56", "metadata": {}, "source": [ "#### Adjust saturation" @@ -919,7 +999,7 @@ { "cell_type": "code", "execution_count": null, - "id": "52", + "id": "57", "metadata": {}, "outputs": [], "source": [ @@ -933,7 +1013,7 @@ { "cell_type": "code", "execution_count": null, - "id": "53", + "id": "58", "metadata": {}, "outputs": [], "source": [ @@ -950,7 +1030,7 @@ }, { "cell_type": "markdown", - "id": "54", + "id": "59", "metadata": {}, "source": [ "## Image classifier development using CNNs\n", @@ -960,7 +1040,7 @@ }, { "cell_type": "markdown", - "id": "55", + "id": "60", "metadata": {}, "source": [ "### Dataset preprocessing" @@ -968,7 +1048,7 @@ }, { "cell_type": "markdown", - "id": "56", + "id": "61", "metadata": {}, "source": [ "#### Train, validation, and test sets\n", @@ -979,7 +1059,7 @@ { "cell_type": "code", "execution_count": null, - "id": "57", + "id": "62", "metadata": {}, 
"outputs": [], "source": [ @@ -993,7 +1073,7 @@ }, { "cell_type": "markdown", - "id": "58", + "id": "63", "metadata": {}, "source": [ "#### Data Augmentation\n", @@ -1005,13 +1085,22 @@ "- ```A.VerticalFlip``` for vertical flipping;\n", "- ```A.ColorJitter``` for color jittering.\n", "\n", - "Albumentations can also be used for image normalization (```A.Normalize```), resizing (```A.Resize```), and converting images to PyTorch tensors with the (Channel, Height, Width) format using ```A.ToTensorV2```, which is required for model training." + "Albumentations can also be used for image normalization (```A.Normalize```), resizing (```A.Resize```), and converting images to PyTorch tensors with the (Channel, Height, Width) format using ```A.ToTensorV2```, which is required for model training.\n", + "\n", + "Apply the following transformations only to the training set, as the validation set should remain as close as possible to the test set. Therefore, no augmentation transformations should be applied to it.\n", + "```python\n", + "A.Affine(scale = (0.2, 1.5), p = 0.1),\n", + "A.Rotate(limit = 45, p = 0.1),\n", + "A.HorizontalFlip(p = 0.1),\n", + "A.VerticalFlip(p = 0.1),\n", + "A.ColorJitter(brightness = (0.5, 1.5), contrast = (0.5, 1.5), saturation = (0.5, 1.5), hue = (0,0), p = 0.1)\n", + "```" ] }, { "cell_type": "code", "execution_count": null, - "id": "59", + "id": "64", "metadata": {}, "outputs": [], "source": [ @@ -1019,19 +1108,14 @@ "\n", "# Transformations performed on train set\n", "train_transform = A.Compose([\n", - " A.Affine(scale = (0.2, 1.5), p = 0.1),\n", - " A.Rotate(limit = 45, p = 0.1),\n", - " A.HorizontalFlip(p = 0.1),\n", - " A.VerticalFlip(p = 0.1),\n", - " A.ColorJitter(brightness = (0.5, 1.5), contrast = (0.5, 1.5), saturation = (0.5, 1.5), hue = (0,0), p = 0.1), # Applied only to 'image'\n", - " A.Normalize(mean=(0.4914, 0.4822, 0.4465), std=(0.2470, 0.2435, 0.2616)), # Applied only to 'image'\n", + " A.Normalize(mean=(0.4914, 0.4822, 0.4465), std=(0.2470, 
0.2435, 0.2616)),\n", " A.Resize(height = TARGET_SIZE, width = TARGET_SIZE),\n", " A.ToTensorV2()\n", "])\n", "\n", "# Transformations performed on validation and test sets\n", "val_transform = A.Compose([\n", - " A.Normalize(mean=(0.4914, 0.4822, 0.4465), std=(0.2470, 0.2435, 0.2616)), # Applied only to 'image'\n", + " A.Normalize(mean=(0.4914, 0.4822, 0.4465), std=(0.2470, 0.2435, 0.2616)),\n", " A.Resize(height = TARGET_SIZE, width = TARGET_SIZE),\n", " A.ToTensorV2()\n", "])" @@ -1039,7 +1123,7 @@ }, { "cell_type": "markdown", - "id": "60", + "id": "65", "metadata": {}, "source": [ "#### PyTorch Datasets\n", @@ -1050,7 +1134,7 @@ { "cell_type": "code", "execution_count": null, - "id": "61", + "id": "66", "metadata": {}, "outputs": [], "source": [ @@ -1062,30 +1146,31 @@ }, { "cell_type": "markdown", - "id": "62", + "id": "67", "metadata": {}, "source": [ "#### PyTorch Dataloaders\n", "\n", - "```DataLoader``` is essential for training efficiency and performance. It abstracts the complexity of batching, shuffling, and parallel data access, allowing you to focus on building and training your models." + "```DataLoader``` is essential for training efficiency and performance. It abstracts the complexity of batching, shuffling, and parallel data access, allowing you to focus on building and training your models. ```batch_size``` specifies the number of samples processed in parallel during each training iteration. It is typically treated as a hyperparameter, as its optimal value depends on hardware constraints (e.g., GPU memory) and its interaction with training dynamics. Notably, it is often linearly related to the learning rate. Larger batch sizes generally require proportionally larger learning rates to maintain stable and efficient convergence. ```shuffle``` controls whether the dataset is randomly permuted at the start of each epoch. 
Enabling ```shuffle = True``` is typically beneficial, as it helps prevent the model from learning misleading patterns due to class-wise ordering in the dataset, which could hinder generalization and convergence." ] }, { "cell_type": "code", "execution_count": null, - "id": "63", + "id": "68", "metadata": {}, "outputs": [], "source": [ "# Data loaders needed for the model training\n", - "train_dataloader = DataLoader(train_dataset, batch_size = 128, shuffle=True)\n", - "val_dataloader = DataLoader(val_dataset, batch_size = 128, shuffle=True)\n", - "test_dataloader = DataLoader(test_dataset, batch_size = 128, shuffle=True)" + "BATCH_SIZE = 64\n", + "train_dataloader = DataLoader(train_dataset, batch_size = BATCH_SIZE, shuffle = True)\n", + "val_dataloader = DataLoader(val_dataset, batch_size = BATCH_SIZE, shuffle = True)\n", + "test_dataloader = DataLoader(test_dataset, batch_size = BATCH_SIZE, shuffle = True)" ] }, { "cell_type": "markdown", - "id": "64", + "id": "69", "metadata": {}, "source": [ "### Model training\n", @@ -1095,7 +1180,30 @@ }, { "cell_type": "markdown", - "id": "65", + "id": "70", + "metadata": {}, + "source": [ + "### Model Training Overview\n", + "\n", + "Model training involves a sequence of key steps. The first step is to check which computational devices are available. If a GPU with CUDA cores is accessible, it should be used, as it significantly accelerates training (```DEVICE = torch.device(\"cuda:0\" if torch.cuda.is_available() else \"cpu\")```). Otherwise, the model will be trained on the CPU. Next, we define the model and training hyperparameters. 
These typically include:\n", + "\n", + "- The number of output classes (```NUMBER_CLASSES = len(CIFAR_10_CLASSES)```)\n", + "- The number of training epochs (i.e., how many times the model sees the full training set) (```EPOCHS = 500```)\n", + "- The patience for early stopping (i.e., how many consecutive epochs without improvement are allowed before stopping training) (```EARLY_STOPPING_LIMIT = EPOCHS // 10```)\n", + "- The learning rate (```LR = 0.001```)\n", + "\n", + "Additional hyperparameters may also be configured depending on the training strategy or specific use case.\n", + "\n", + "In this notebook, we focus on setting the number of training epochs. We also discuss **early stopping**, a regularization technique used to prevent overfitting. During training, the model's performance is evaluated on a validation set at the end of each epoch. If the validation loss does not improve after a predefined number of epochs, training is stopped, and the model reverts to the best-performing checkpoint (see [Early Stopping](https://paperswithcode.com/method/early-stopping)). The **learning rate** controls how quickly the model updates its weights during training. If it's too high, the model may overshoot optimal loss values, leading to instability. If it's too low, the model may converge very slowly or get stuck in a local minimum. Although learning rate tuning is not performed in this notebook, it is an essential hyperparameter that should be carefully selected (see [What is learning rate in machine learning?](https://www.ibm.com/think/topics/learning-rate)).\n", + "\n", + "After setting the hyperparameters, we define the **loss function** used to evaluate model performance (```criterion = nn.CrossEntropyLoss()```). Both the model and loss function should be moved to the selected training device (```criterion = criterion.to(DEVICE)```). 
The model is then instantiated using the `ImageClassifier` class (```model = ImageClassifier(in_channels = 3, out_classes = NUMBER_CLASSES)```) and transferred to the training device (```model = model.to(DEVICE)```). The **optimizer** is also defined and configured on the same device (```optimizer = optim.Adam(model.parameters(), lr = LR)```).\n", + "\n", + "Finally, if no pre-trained weights are available, the training process begins, and we monitor the learning curves to assess the model’s performance over time.\n" + ] + }, + { + "cell_type": "markdown", + "id": "71", "metadata": {}, "source": [ "#### Check which device is used for training" @@ -1104,16 +1212,16 @@ { "cell_type": "code", "execution_count": null, - "id": "66", + "id": "72", "metadata": {}, "outputs": [], "source": [ - "DEVICE = torch.device(\"cuda:0\" if torch.cuda.is_available() else \"cpu\")" + "# Check which device is available for training the model" ] }, { "cell_type": "markdown", - "id": "67", + "id": "73", "metadata": {}, "source": [ "#### Define training hyperparameters" @@ -1122,26 +1230,23 @@ { "cell_type": "code", "execution_count": null, - "id": "68", + "id": "74", "metadata": {}, "outputs": [], "source": [ "# Get number of output classes\n", "NUMBER_CLASSES = len(CIFAR_10_CLASSES)\n", "\n", - "# Number of training epochs\n", - "EPOCHS = 500\n", + "# Set the number of training epochs\n", "\n", - "# Number of consecutive not improving epochs needed for stopping the training\n", - "EARLY_STOPPING_LIMIT = EPOCHS // 10\n", + "# Set the number of consecutive not improving epochs needed for stopping the training\n", "\n", - "# Learning rate\n", - "LR = 0.001" + "# Set the learning rate" ] }, { "cell_type": "markdown", - "id": "69", + "id": "75", "metadata": {}, "source": [ "#### Loss function\n", @@ -1166,18 +1271,16 @@ { "cell_type": "code", "execution_count": null, - "id": "70", + "id": "76", "metadata": {}, "outputs": [], "source": [ - "# Cross Entropy Loss\n", - "criterion = 
nn.CrossEntropyLoss()\n", - "criterion = criterion.to(DEVICE)" + "# Initialise the Cross Entropy Loss and send it to the training device" ] }, { "cell_type": "markdown", - "id": "71", + "id": "77", "metadata": {}, "source": [ "#### Initialise model architecture" @@ -1186,23 +1289,21 @@ { "cell_type": "code", "execution_count": null, - "id": "72", + "id": "78", "metadata": {}, "outputs": [], "source": [ - "# Initialise image classifier\n", - "model = ImageClassifier(in_channels = 3, out_classes = NUMBER_CLASSES)\n", - "model = model.to(DEVICE)" + "# Initialise image classifier and send it to the training device" ] }, { "cell_type": "markdown", - "id": "73", + "id": "79", "metadata": {}, "source": [ "#### Optimiser function\n", "\n", - "In this notebook, we are using Adam optimiser which is one of the most used optimisers in deep neural network optimisation (see [Gentle Introduction to the Adam Optimisation Algorithm for Deep Learning](https://machinelearningmastery.com/adam-optimization-algorithm-for-deep-learning/)).\n", + "In this notebook, we are using Adam optimiser (```optimizer = optim.Adam(model.parameters(), lr = LR)```) which is one of the most used optimisers in deep neural network optimisation (see [Gentle Introduction to the Adam Optimisation Algorithm for Deep Learning](https://machinelearningmastery.com/adam-optimization-algorithm-for-deep-learning/)).\n", "\n", "The parameter update at each step is given by:\n", "\n", @@ -1240,65 +1341,76 @@ { "cell_type": "code", "execution_count": null, - "id": "74", + "id": "80", "metadata": {}, "outputs": [], "source": [ - "# Adam Optimiser\n", - "optimizer = optim.Adam(model.parameters(), lr = LR)" + "# Initialise the Adam optimiser" ] }, { "cell_type": "markdown", - "id": "75", + "id": "81", "metadata": {}, "source": [ "#### Train model\n", "\n", - "Here, we train the model. 
First, we initialise the class ```Trainer``` which we are going to use for training and evaluating the model using the PyTorch ```Dataset```s defined before. In case, some model weights are already available, we can skip the training and using them." + "Here, we train the model. First, we initialise the class ```Trainer``` which we are going to use for training and evaluating the model using the PyTorch ```Dataset```s defined before (```trainer = Trainer(model)```). In case some model weights are already available, we can skip the training and use them." ] }, { "cell_type": "code", "execution_count": null, - "id": "76", + "id": "82", "metadata": {}, "outputs": [], "source": [ - "# Train the image classifier\n", - "trainer = Trainer(model)\n", + "# Initialise the Trainer instance, which is going to be used to train the image classifier\n" + ] + }, + { + "cell_type": "markdown", + "id": "83", + "metadata": {}, + "source": [ + "Check whether a trained model already exists. If so, load the weights using ```model_weights = torch.load(model_path, weights_only=True)```. Then, load the weights into the model using ```model.load_state_dict(model_weights)```. Finally, set the model to evaluation mode (```model.eval()```). This step is essential because certain layers, such as batch normalization and dropout, behave differently during training and evaluation. Setting the model to evaluation mode ensures they operate correctly during validation or testing. If no pre-trained model is available, train a new model using the training and validation sets along with the predefined hyperparameters (```trainer.fit(EPOCHS, train_dataloader, val_dataloader, optimizer, criterion, DEVICE, EARLY_STOPPING_LIMIT)```). After training, save the best model weights (```torch.save(trainer.model.best_model_weights, model_path)```)." 
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "84", + "metadata": {}, + "outputs": [], + "source": [ + "# Model filename\n", "model_path = \"cnn_weights.pt\"\n", "\n", "if os.path.exists(model_path):\n", - " print(\"Loading model weights...\")\n", - " model.load_state_dict(torch.load(model_path, weights_only=True))\n", - " model.eval()\n", + " pass\n", "else:\n", - " print(\"Training model weights...\")\n", - " # Fit model\n", - " trainer.fit(EPOCHS, train_dataloader, val_dataloader, optimizer, criterion, DEVICE, EARLY_STOPPING_LIMIT)\n", - " # Save model weights\n", - " torch.save(trainer.model.best_model_weights, \"cnn_weights.pt\")" + " pass" ] }, { "cell_type": "markdown", - "id": "77", + "id": "85", "metadata": {}, "source": [ "#### Learning curves\n", "\n", - "After training the model, we can analyse the learning curves to assess the training process. These curves, which typically display the loss over epochs for both the training and validation sets, are crucial for improving model performance. They can help identify issues like overfitting or underfitting. Overfitting occurs when the model performs well on the training data but poorly on the validation data, usually indicated by a widening gap between the two curves. Underfitting, on the other hand, is suggested when both the training and validation curves show poor performance and fail to improve. By monitoring these curves, we can adjust hyperparameters or modify the model architecture to address such issues." + "After training the model, we can analyse the learning curves to assess the training process. These curves, which typically display the loss over epochs for both the training and validation sets, are crucial for improving model performance. They can help identify issues like overfitting or underfitting. Overfitting occurs when the model performs well on the training data but poorly on the validation data, usually indicated by a widening gap between the two curves. 
Underfitting, on the other hand, is suggested when both the training and validation curves show poor performance and fail to improve. By monitoring these curves, we can adjust hyperparameters or modify the model architecture to address such issues. First, load the log file using ```pandas``` (```training_log = pd.read_csv(\"training_log.txt\")```). Then, use the ```matplotlib``` library to plot the learning curves." ] }, { "cell_type": "code", "execution_count": null, - "id": "78", + "id": "86", "metadata": {}, "outputs": [], "source": [ - "training_log = pd.read_csv(\"training_log.txt\")\n", + "# Load the training log file\n", + "training_log = None\n", "\n", "plt.figure()\n", "plt.plot(training_log[\"Train Loss\"])\n", @@ -1311,27 +1423,38 @@ }, { "cell_type": "markdown", - "id": "79", + "id": "87", "metadata": {}, "source": [ "### Model testing\n", "\n", - "After having the model already optimised, we can evaluate the model using the ```Trainer``` class by calling the ```predict``` function." 
+ "Once the model has been trained and optimized, we can evaluate its performance using the ```Trainer``` class by calling the ```predict``` method with the test dataloader and the device:\n", + "```python\n", + "original_images, true_labels, predicted_labels = trainer.predict(test_dataloader, DEVICE)\n", + "```\n", + "\n", + "This method returns three NumPy arrays:\n", + "\n", + "- ```original_images```: the input images from the test set\n", + "\n", + "- ```true_labels```: the corresponding ground truth labels\n", + "\n", + "- ```predicted_labels```: the model's predicted classes" ] }, { "cell_type": "code", "execution_count": null, - "id": "80", + "id": "88", "metadata": {}, "outputs": [], "source": [ - "original_images, true_labels, predicted_labels = trainer.predict(test_dataloader, DEVICE)" + "# Write here the line of code to predict the labels for the test set" ] }, { "cell_type": "markdown", - "id": "81", + "id": "89", "metadata": {}, "source": [ "### Explore results\n", @@ -1341,42 +1464,48 @@ }, { "cell_type": "markdown", - "id": "82", + "id": "90", "metadata": {}, "source": [ - "#### Compute average accuracy" + "#### Compute average accuracy\n", + "\n", + "To compute the accuracy, get the number of test samples (```num_test_samples = len(original_images)```), check how many samples were correctly classified (```correct = (true_labels == predicted_labels).sum()```), and get the ratio (```accuracy = correct/num_test_samples```)." 
] }, { "cell_type": "code", "execution_count": null, - "id": "83", + "id": "91", "metadata": {}, "outputs": [], "source": [ "# Compute average accuracy\n", - "num_test_samples = len(original_images)\n", - "correct = (true_labels == predicted_labels).sum()\n", - "print(\"Accuracy:\", correct/num_test_samples)" + "accuracy = 0.0\n", + "print(\"Accuracy:\", accuracy)" ] }, { "cell_type": "markdown", - "id": "84", + "id": "92", "metadata": {}, "source": [ - "#### Compute confusion matrix" + "#### Compute confusion matrix\n", + "\n", + "To compute the confusion matrix, use ```confusion_matrix``` from scikit-learn library:\n", + "```python\n", + "cm = confusion_matrix(true_labels, predicted_labels)\n", + "```" ] }, { "cell_type": "code", "execution_count": null, - "id": "85", + "id": "93", "metadata": {}, "outputs": [], "source": [ "# Compute confusion matrix\n", - "cm = confusion_matrix(true_labels, predicted_labels)\n", + "cm = None\n", "\n", "# Plot confusion matrix\n", "fig, ax = plt.subplots(figsize=(10, 8))\n", @@ -1402,7 +1531,7 @@ }, { "cell_type": "markdown", - "id": "86", + "id": "94", "metadata": {}, "source": [ "### Explain image classifier predictions\n", @@ -1412,92 +1541,84 @@ }, { "cell_type": "markdown", - "id": "87", + "id": "95", "metadata": {}, "source": [ "#### Prepare image for GradCAM\n", "\n", - "We begin by preparing the image for Grad-CAM visualisation. The image must be converted to the (Height, Width, Channels) format and normalized to values between 0 and 1, as the visualizer from the PyTorch-GradCAM library expects this format. Afterwards, a batch dimension should be added to make it compatible with the model's input requirements. We also retrieve the predicted and true labels, as both are needed for Grad-CAM computation and visualisation." 
+ "To prepare the image for Grad-CAM visualization, first convert it to (Height, Width, Channels) format using ```img_np = np.transpose(img, (1, 2, 0)) # shape: (H, W, C)```, and normalize its values to the [0, 1] range with ```img_np = (img_np - img_np.min()) / (img_np.max() - img_np.min())```. This processed image is used only for visualization, as expected by the PyTorch-GradCAM library. Next, modify the original image for model inference by adding a batch dimension: ```img = np.expand_dims(img, axis=0)```, then convert it to a PyTorch tensor: ```img = torch.from_numpy(img)```, and move it to the appropriate computation device using ```img = img.to(DEVICE)```. Finally, retrieve the predicted and true labels, as both are required for computing and visualizing the Grad-CAM output.\n" ] }, { "cell_type": "code", "execution_count": null, - "id": "88", + "id": "96", "metadata": {}, "outputs": [], "source": [ "# Get a batch of images\n", "idx = 1\n", "img = original_images[idx]\n", - "img_np = np.transpose(img, (1,2,0)) # shape: (H, W, C)\n", - "img_np = (img_np - img_np.min()) / (img_np.max() - img_np.min()) # Should be normalised between 0 and 1\n", - "img = np.expand_dims(img, axis = 0)\n", - "img = torch.from_numpy(img)\n", - "img = img.to(DEVICE) # shape: [1, C, H, W]\n", "pred_label = predicted_labels[idx]\n", "true_label = true_labels[idx]" ] }, { "cell_type": "markdown", - "id": "89", + "id": "97", "metadata": {}, "source": [ "#### Compute GradCAM heatmap\n", "\n", - "Using the predicted class, we compute the Grad-CAM values based on the activations and gradients from the last convolutional layer of the image classifier. This layer is typically chosen because it retains spatial information that helps localise the regions of the input image most relevant to the model's decision." + "First, ensure that the `requires_grad` attribute of the input image tensor is set to `True` by using `img.requires_grad = True`. 
This enables gradient computation with respect to the image, which is necessary for generating class activation maps. Next, specify the layer to inspect using ```target_layers = [model.conv3]```. Typically, the last convolutional layer of the image classifier is chosen because it preserves spatial information, which is crucial for identifying the regions of the input image that most strongly influence the model's prediction. Then, define the target class to be explained with ```targets = [ClassifierOutputTarget(pred_label)]```, where ```pred_label``` is the class index corresponding to the model’s predicted output (or any other class of interest). Finally, compute the Grad-CAM heatmap using the activations and gradients from the selected layer:\n", + "```python\n", + "# Create CAM object\n", + "with GradCAM(model=model, target_layers=target_layers) as cam:\n", + " grad_cam_matrix = cam(input_tensor=img, targets=targets)\n", + " grad_cam_matrix = grad_cam_matrix[0, :]\n", + "```" ] }, { "cell_type": "code", "execution_count": null, - "id": "90", + "id": "98", "metadata": {}, "outputs": [], "source": [ "# Make sure input requires grad\n", - "img.requires_grad = True\n", "\n", "# Define the layer(s) to inspect\n", - "target_layers = [model.conv3]\n", "\n", "# Define the target class you want to explain\n", - "targets = [ClassifierOutputTarget(pred_label)]\n", "\n", - "# Make sure that the model is on eval and not on train mode\n", - "model.eval()\n", - "\n", - "# Create CAM object\n", - "with GradCAM(model=model, target_layers=target_layers) as cam:\n", - " grad_cam_matrix = cam(input_tensor=img, targets=targets)\n", - " grad_cam_matrix = grad_cam_matrix[0, :]" + "# Compute CAM object" ] }, { "cell_type": "markdown", - "id": "91", + "id": "99", "metadata": {}, "source": [ "#### Visualise GradCAM heatmap with the image\n", "\n", - "After obtaining the Grad-CAM heatmap, we overlay it on the input image to visualise the regions that contributed most to the model’s 
prediction. This helps identify which pixels the model focused on when predicting the class."
+ "After obtaining the Grad-CAM heatmap, we overlay it on the input image to visualise the regions that contributed most to the model’s prediction (```visualisation = show_cam_on_image(img_np, grad_cam_matrix, use_rgb=True)```). This helps identify which pixels the model focused on when predicting the class."
 ]
 },
 {
 "cell_type": "code",
 "execution_count": null,
- "id": "92",
+ "id": "100",
 "metadata": {},
 "outputs": [],
 "source": [
 "# Combine CAM with image\n",
- "visualization = show_cam_on_image(img_np, grad_cam_matrix, use_rgb=True)\n",
+ "visualisation = None\n",
 "\n",
 "# Plot image with GradCAM output\n",
 "true_class = CIFAR_10_CLASSES[true_label]\n",
 "pred_class = CIFAR_10_CLASSES[pred_label]\n",
- "plot_multiple_images((img_np, f\"Original - {true_class}\"), (visualization, f\"Grad-CAM - {pred_class}\"), figsize = (5,6))"
+ "plot_multiple_images((img_np, f\"Original - {true_class}\"), (visualisation, f\"Grad-CAM - {pred_class}\"), figsize = (5,6))"
 ]
 }
 ],

From bfac81fca94ec2a39bd44114126f956dd1ca7c97 Mon Sep 17 00:00:00 2001
From: Fabio Lopes
Date: Wed, 7 May 2025 13:55:40 +0200
Subject: [PATCH 05/10] Upload download link into notebook

---
 25_libraries_image_classification.ipynb | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/25_libraries_image_classification.ipynb b/25_libraries_image_classification.ipynb
index 67fc4469..b6656671 100644
--- a/25_libraries_image_classification.ipynb
+++ b/25_libraries_image_classification.ipynb
@@ -500,7 +500,7 @@
 "source": [
 "### Load data\n",
 "\n",
- "Training and test sets are loaded using Pickle library."
+ "Training and test sets are loaded using the Pickle library. If you do not have the dataset already, open this [link](https://www.dropbox.com/scl/fo/p7gfb0kpgkbrrjup340pi/AAkX2u1g-W7290-Aq7gHHvo?rlkey=vdxaj6npfy09ywh17nl8f9v6e&st=8hfq9z20&dl=0) and download it. Place it inside the data folder."
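The load-data cell patched above boils down to the standard `pickle` module. Below is a minimal, self-contained sketch of that pattern; the dictionary keys and values are made up for illustration, and the real `train_set.pkl` / `test_set.pkl` files from the download link may be structured differently:

```python
import os
import pickle

# Hypothetical layout: a dict holding images and labels
# (an assumption -- the real CIFAR10 pickles may differ).
dataset_folder = os.path.join("data", "CIFAR10")
os.makedirs(dataset_folder, exist_ok=True)
train_set_file = os.path.join(dataset_folder, "train_set.pkl")

# Write a tiny stand-in file so the loading step below runs as-is
with open(train_set_file, "wb") as f:
    pickle.dump({"images": [[0.1, 0.2], [0.3, 0.4]], "labels": [0, 1]}, f)

# This mirrors what the notebook cell does with the downloaded files
with open(train_set_file, "rb") as f:
    train_set = pickle.load(f)

print(len(train_set["images"]))  # prints 2
```

Loading `test_set.pkl` is identical apart from the file name.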
] }, { @@ -511,7 +511,7 @@ "outputs": [], "source": [ "# Sets filepaths\n", - "dataset_folder = os.path.join(\"CIFAR10\")\n", + "dataset_folder = os.path.join(\"data/CIFAR10\")\n", "train_set_file = os.path.join(dataset_folder, \"train_set.pkl\")\n", "test_set_file = os.path.join(dataset_folder, \"test_set.pkl\")\n", "\n", From 9d18ad10293f6e6caae424ead8168a1c75010071 Mon Sep 17 00:00:00 2001 From: Fabio Lopes Date: Fri, 9 May 2025 13:57:52 +0200 Subject: [PATCH 06/10] Add more details to the classes and re structure the texts. --- 25_libraries_image_classification.ipynb | 465 ++++++++++++++++-------- 1 file changed, 319 insertions(+), 146 deletions(-) diff --git a/25_libraries_image_classification.ipynb b/25_libraries_image_classification.ipynb index b6656671..a7b8197b 100644 --- a/25_libraries_image_classification.ipynb +++ b/25_libraries_image_classification.ipynb @@ -136,7 +136,9 @@ "source": [ "## Introduction\n", "\n", - "Image Classification is a foundational task in the field of computer vision and machine learning. This notebook aims to provide practical experience in image processing and in building and evaluating image classification models. It begins by demonstrating how to load and preprocess image data using Matplotlib and OpenCV-Python. Then, it shows how to build a basic image classification pipeline based on Convolutional Neural Networks (CNNs) using PyTorch, Albumentations, and Scikit-learn. Next, it covers how to evaluate model performance using Scikit-learn and NumPy, and finally, it introduces model explainability using Grad-CAM.\n", + "Image Classification is a foundational task in the field of computer vision and machine learning. This notebook aims to provide practical experience in image processing and in building and evaluating image classification models. \n", + "\n", + "It begins by demonstrating how to load and preprocess image data using Matplotlib and OpenCV-Python. 
Then, it shows how to build a basic image classification pipeline based on Convolutional Neural Networks (CNNs) using PyTorch, Albumentations, and Scikit-learn. Next, it covers how to evaluate model performance using Scikit-learn and NumPy, and finally, it introduces model explainability using Grad-CAM.\n",
 "\n",
 "The goal of this notebook is not to teach the underlying algorithms and procedures used in this field, but rather to give the user an idea of what can be done with these Python libraries."
 ]
@@ -166,27 +168,27 @@
 "\n",
 "```ImageDataset```: \n",
 "\n",
- "In ```__init__```, initialise these three atributes:\n",
+ "In ```__init__```, initialise the following attributes:\n",
 "```python\n",
- "self.images = images\n",
- "self.labels = labels\n",
- "self.transform = transform\n",
+ "self.images = images # Input images\n",
+ "self.labels = labels # Output classes\n",
+ "self.transform = transform # Transformations applied to the data when samples are retrieved\n",
 "```\n",
 "\n",
- "Complete function ```__len__```:\n",
+ "Complete function ```__len__``` - this method is needed to let the generator know how many samples there are in the data:\n",
 "```python\n",
 "return len(self.images)\n",
 "```\n",
 "\n",
- "Complete function ```__getitem__```:\n",
+ "Complete function ```__getitem__``` - this method is needed to let the generator know what to do with samples when they are retrieved:\n",
 "```python\n",
 "image = self.images[idx]\n",
 "label = self.labels[idx]\n",
 "\n",
- "# Ensure the image is in the shape (H, W, C) for Albumentations\n",
+ "# Ensure the image is in the shape (H, W, C) for the Albumentations library (used for image augmentation)\n",
 "image = np.transpose(image, (1, 2, 0))\n",
 "\n",
- "# Apply Albumentations transforms\n",
+ "# Apply transformations on the images\n",
 "if self.transform:\n",
 " augmented = self.transform(image=image)\n",
 " image = augmented['image']\n",
@@ -220,46 +222,91 @@
 "source": [
 "```ImageClassifier```: \n",
 "\n",
- "In ```__init__```, 
initialise the attributes needed for building the neural network:\n", - "```python\n", - "# Best model weights\n", - "self.best_model_weights = None\n", + "`__init__` function:\n", "\n", - "# Convolutional block layers\n", + "The first thing to do is to build the `__init__` function, which contains the variables needed for building the neural network.\n", + "\n", + "Let's start by defining the number of feature maps in the first convolutional layer (the value is empirical):\n", + "```python\n", "self.feature_maps = 64\n", + "```\n", "\n", - "# First convolutional block\n", + "To help a computer understand and classify images, we build a model made up of layers, kind of like stacking Lego blocks. Each block does a specific task — detecting patterns, reducing size, or making decisions. Here's what each component does:\n", + "```python\n", "self.conv1 = nn.Conv2d(in_channels, self.feature_maps, kernel_size = 3)\n", + "```\n", + "This layer scans the image for small patterns (like edges or colors).\n", + "\n", + "`in_channels` is the number of input image channels (e.g. 3 for RGB images).\n", + "\n", + "`self.feature_maps` is how many different patterns we want the model to learn at this layer.\n", + "\n", + "`kernel_size = 3` means the scanning window is 3x3 pixels. The value is empirical.\n", + "\n", + "```python\n", "self.pool1 = nn.MaxPool2d(kernel_size = 2)\n", + "```\n", + "This layer shrinks the size of the image while keeping the most important info (max values). 
It helps the model focus and reduces computation.\n",
 "\n",
 "```python\n",
 "self.bn1 = nn.BatchNorm2d(self.feature_maps)\n",
 "```\n",
 "This layer normalizes the outputs, making training faster and more stable.\n",
 "\n",
- "# Second convolutional block\n",
+ "The combination of the aforementioned layers is usually called a convolutional block.\n",
+ "\n",
+ "After defining the first convolutional block, let's define the second one:\n",
+ "```python\n",
 "self.conv2 = nn.Conv2d(self.feature_maps, self.feature_maps * 2, kernel_size = 3)\n",
 "self.pool2 = nn.MaxPool2d(kernel_size = 2)\n",
 "self.bn2 = nn.BatchNorm2d(self.feature_maps * 2)\n",
+ "```\n",
+ "The second block is very similar to the first block, but now it looks for more complex patterns by increasing the number of feature maps (i.e. learning more features).\n",
 "\n",
- "# Third convolutional block\n",
+ "After defining the second convolutional block, let's define the third and last one:\n",
+ "```python\n",
 "self.conv3 = nn.Conv2d(self.feature_maps * 2, self.feature_maps * 4, kernel_size = 3)\n",
 "self.pool3 = nn.MaxPool2d(kernel_size = 2)\n",
 "self.bn3 = nn.BatchNorm2d(self.feature_maps * 4)\n",
+ "```\n",
+ "This block explores even deeper patterns, such as shapes or textures. As we go deeper, the network becomes better at understanding the image.\n",
 "\n",
- "# Rectified linear unit (activation function)\n",
+ "Then, we define the activation layer that is going to be used in-between these blocks:\n",
+ "```python\n",
 "self.relu = nn.ReLU()\n",
+ "```\n",
+ "After each layer, we add a \"yes/no\" switch to keep only useful patterns. 
ReLU (Rectified Linear Unit) sets negative values to zero — it adds non-linearity to help the network learn more complex things.\n",
 "\n",
- "# Flatten layer that produces the features vector\n",
+ "Next, we define the layer that transforms the data from 2D images into a 1D vector (like stretching out a grid of pixels into a line):\n",
+ "```python\n",
 "self.flatten = nn.Flatten(start_dim=1)\n",
+ "```\n",
 "\n",
- "# Dropout layer is responsible for introducing regularisation into training process\n",
+ "Now, we define the dropout layer:\n",
+ "```python\n",
 "self.dropout = nn.Dropout(p = 0.3)\n",
+ "```\n",
+ "This layer randomly turns off a pre-defined percentage of neurons (`p = 0.3`) during training to prevent overfitting — so the model does not memorize the training data too closely.\n",
 "\n",
- "# Number of output classes\n",
+ "Finally, we define the classifier:\n",
+ "```python\n",
 "self.out_classes = out_classes\n",
- "\n",
- "# Classification layer \n",
 "self.fc = nn.Linear(1024, self.out_classes)\n",
 "```\n",
+ "This final layer is like the decision-maker. It takes all the features the model has learned and decides which class (e.g. cat, dog, airplane) the input image belongs to.\n",
+ "\n",
+ "1024 is the number of features coming into the layer (depends on the hyperparameters used in the previous layers), and `out_classes` is how many classes we want to predict."
 ]
 },
 {
 "cell_type": "markdown",
 "id": "11",
 "metadata": {},
 "source": [
 "`forward` function:\n",
 "\n",
 "After defining the function `__init__`, we need to define the function `forward`. It is responsible for combining all the layers defined in the `__init__` to build the neural network model. 
Basically, it describes how an input image flows through the network, one layer at a time, to become a prediction.\n", "\n", - "Then, complete the function named ```forward```, which essentially defines the neural network architecture:\n", "```python\n", "# Convolutional block 1\n", "x = self.conv1(x)\n", @@ -290,7 +337,7 @@ { "cell_type": "code", "execution_count": null, - "id": "11", + "id": "12", "metadata": {}, "outputs": [], "source": [ @@ -304,60 +351,100 @@ }, { "cell_type": "markdown", - "id": "12", + "id": "13", "metadata": {}, "source": [ "```Trainer```: \n", "\n", - "In ```__init__```, initialise these three atributes:\n", + "`__init__` function:\n", + "\n", + "This function is used to initialise variables used in the other functions of the class.\n", + "\n", + "Start by initialising the following attributes:\n", "```python\n", "self.model = model\n", "self.train_losses = []\n", "self.val_losses = []\n", + "self.best_model_weights = None\n", "```\n", + "`self.model` – This is the neural network we're training.\n", + "\n", + "`self.train_losses` and `self.val_losses` – These lists keep track of how well the model is doing on the training and validation sets over time (used to plot learning curves).\n", + "\n", + "`self.best_model_weights` – This will store a copy of the model when it performed best on the validation set (used for early stopping)." + ] + }, + { + "cell_type": "markdown", + "id": "14", + "metadata": {}, + "source": [ + "`fit` function:\n", + "\n", + "This function goes through the data multiple times (epochs) to optimize the model’s performance. 
It also applies early stopping, which stops training if performance stops improving.\n",
+ "\n",
+ "Let's start by initialising the following variables:\n",
 "\n",
- "Complete the ```fit``` function, which is used for optimising the model:\n",
 "```python\n",
 "early_stopping_count = 0\n",
 "best_val_loss = 9999\n",
 "best_epoch = 0\n",
+ "```\n",
+ "\n",
+ "`early_stopping_count`: It is used to track the number of epochs without improving validation loss (used in early stopping).\n",
+ "\n",
+ "`best_val_loss`: It is used to track the best validation loss ever seen. Here we initialise it with a very large placeholder number, since any real validation loss will be smaller than it.\n",
 "\n",
+ "`best_epoch`: It is used to track the epoch that got the best validation loss.\n",
+ "\n",
+ "Then, we initialise the file that is going to store the training statistics:\n",
+ "\n",
+ "```python\n",
 "# Training log file\n",
 "log_filename = \"training_log.txt\"\n",
 "with open(log_filename, \"w\") as log_file:\n",
 " log_file.write(\"Epoch,Train Loss,Val Loss,Best Val Loss,Best Epoch\\n\")\n",
+ "```\n",
 "\n",
+ "Now comes the training phase. It includes a main for-loop that runs until the end of the pre-defined number of epochs, and two inner loops: one for optimising the model's weights and another for evaluating the model after each epoch.\n",
 "\n",
+ "```python\n",
 "for epoch in range(epochs):\n",
- " # Train mode\n",
+ " # Set the model to training mode. 
This is important because some layers behave differently during training than they do during evaluation.\n", " self.model.train()\n", " \n", + " # Loop over the training set\n", " train_loss = 0.0\n", " train_samples_count = 0.0\n", " for i, data in enumerate(train_dataloader):\n", + " # Get the data and send it to the training device\n", " inputs, labels = data\n", " inputs = inputs.to(device)\n", " labels = labels.long().to(device)\n", " \n", - " # zero the parameter gradients\n", + " # Clear old gradients\n", " optimizer.zero_grad()\n", "\n", - " # Forward step\n", + " # Perform the forward step to get the predictions for the inputs\n", " outputs = self.model(inputs)\n", - " \n", - " # Backward step\n", + "\n", + " # Compute the loss of the predictions\n", " loss = criterion(outputs, labels)\n", + "\n", + " # Perform the backward step which is responsible for computing the gradients\n", " loss.backward()\n", " \n", - " # Weight optimisation\n", + " # Update the model weights using the new gradients\n", " optimizer.step()\n", "\n", + " # Save losses and number of samples in the batch\n", " train_loss += loss.item()\n", " train_samples_count += 1\n", " \n", - " # Validation mode\n", + " # Set the model to evaluation mode\n", " self.model.eval()\n", " \n", + " # Loop over the validation set. Here we just want to evaluate the model. 
Therefore, there is no weight optimisation.\n", " val_loss = 0.0\n", " val_samples_count = 0.0\n", " for i, data in enumerate(val_dataloader):\n", @@ -371,57 +458,89 @@ " val_loss += loss.item()\n", " val_samples_count += 1\n", " \n", + " # Divide the total train and validation losses by the number of samples, respectively.\n", " train_loss /= train_samples_count\n", " val_loss /= val_samples_count\n", " \n", + " # Average training and validation losses for the epoch are stored.\n", " self.train_losses.append(train_loss)\n", " self.val_losses.append(val_loss)\n", " \n", + " # Increase early stopping count\n", " early_stopping_count += 1\n", " \n", + " # In case the new validation loss is better than the best seen, \n", + " # save the current epoch index, new validation loss, current model \n", + " # weights and reset early stopping counter.\n", " if val_loss < best_val_loss:\n", " best_epoch = epoch\n", " best_val_loss = val_loss\n", " early_stopping_count = 0\n", " self.model.best_model_weights = self.model.state_dict()\n", " \n", - " \n", " print(f'Epoch: {epoch}, Loss: {train_loss}, Val Loss: {val_loss}. 
The best val loss is {best_val_loss} in epoch {best_epoch}.')\n",
 " \n",
+ " # Append the current epoch statistics to the training log file\n",
 " with open(log_filename, \"a\") as log_file:\n",
 " log_file.write(f\"{epoch},{train_loss},{val_loss},{best_val_loss},{best_epoch}\\n\")\n",
 " \n",
+ " # In case the number of epochs without improving the validation loss \n",
+ " # gets above the pre-defined threshold, stop the training early to avoid overfitting.\n",
 " if early_stopping_count == early_stopping_limit and early_stopping_limit > 0:\n",
 " break\n",
- "```\n",
+ "```"
 ]
 },
 {
 "cell_type": "markdown",
 "id": "15",
 "metadata": {},
 "source": [
 "`predict` function:\n",
 "\n",
- "Then, complete the function ```predict```, which is used for predicting labels considering batches of data as input:\n",
+ "Once training is done, this method is used to predict labels for new data.\n",
+ "\n",
+ "As early stopping is used during the training, it might be the case that the final model weights were not the best ones. Therefore, load the best-performing ones.\n",
 "```python\n",
 "# Load best weights\n",
- "if self.model.best_model_weights:\n",
- " self.model.load_state_dict(self.model.best_model_weights)\n",
+ "if self.best_model_weights:\n",
+ " self.model.load_state_dict(self.best_model_weights)\n",
+ "```\n",
 "\n",
+ "Set the model to evaluation mode:\n",
+ "```python\n",
 "# Test mode\n",
 "self.model.eval()\n",
+ "```\n",
 "\n",
+ "Loop through the test set to get the model predictions. 
Not only the predictions, but also the original images and the true labels are stored for future use.\n",
+ "```python\n",
 "original_images = []\n",
 "true_labels = []\n",
 "predicted_labels = []\n",
 "\n",
 "for data in test_dataloader:\n",
+ " # Load data and send it to device\n",
 " images, labels = data\n",
 " images = images.to(device)\n",
+ "\n",
+ " # Get model predictions\n",
 " outputs = self.model(images)\n",
+ "\n",
+ " # As the model outputs a vector of scores (one per class), take \n",
+ " # the index of the maximum score which corresponds to the predicted class.\n",
 " _, predicted = torch.max(outputs, 1)\n",
 " \n",
- " images = images.cpu().detach().numpy()\n",
+ " # .cpu() ensures that the data is on CPU and .numpy() converts it to a NumPy array\n",
+ " images = images.cpu().numpy()\n",
 " labels = labels.numpy()\n",
- " predicted = predicted.cpu().detach().numpy()\n",
+ " predicted = predicted.cpu().numpy()\n",
 " \n",
 " original_images.append(images)\n",
 " true_labels.append(labels)\n",
 " predicted_labels.append(predicted)\n",
 "\n",
+ "# Convert the list of NumPy arrays into a single NumPy array\n",
 "original_images = np.concatenate(original_images)\n",
 "true_labels = np.concatenate(true_labels)\n",
 "predicted_labels = np.concatenate(predicted_labels)\n",
@@ -433,7 +552,7 @@
 {
 "cell_type": "code",
 "execution_count": null,
- "id": "13",
+ "id": "16",
 "metadata": {},
 "outputs": [],
 "source": [
@@ -450,7 +569,7 @@
 },
 {
 "cell_type": "markdown",
- "id": "14",
+ "id": "17",
 "metadata": {},
 "source": [
 "## Functions\n",
@@ -461,7 +580,7 @@
 {
 "cell_type": "code",
 "execution_count": null,
- "id": "15",
+ "id": "18",
 "metadata": {},
 "outputs": [],
 "source": [
@@ -485,7 +604,7 @@
 },
 {
 "cell_type": "markdown",
- "id": "16",
+ "id": "19",
 "metadata": {},
 "source": [
 "## Dataset\n",
@@ -495,7 +614,7 @@
 },
 {
 "cell_type": "markdown",
- "id": "17",
+ "id": "20",
 "metadata": {},
 "source": [
 "### Load data\n",
@@ -506,7 +625,7 @@
 {
 "cell_type": "code",
 "execution_count": null,
- 
"id": "18", + "id": "21", "metadata": {}, "outputs": [], "source": [ @@ -528,17 +647,19 @@ }, { "cell_type": "markdown", - "id": "19", + "id": "22", "metadata": {}, "source": [ "## Explore image processing\n", "\n", - "Image processing is fundamental to computer vision, forming the basis for interpreting and analyzing visual information. By applying techniques such as resizing, filtering, color adjustments, and data augmentation, image processing enhances input quality, minimizes noise, and corrects distortions. These methods can also simulate real-world variability, helping models generalize better. In this notebook, we explore three categories of image transformations: geometric transformations, image filtering, and photometric transformations. The following cells contain a series of exercicies designed to help you explore the OpenCV-Python library. If you are unfamiliar with a particular method, refer to the [Image Processing in OpenCV](https://docs.opencv.org/4.x/d2/d96/tutorial_py_table_of_contents_imgproc.html) documentation." + "Image processing is fundamental to computer vision, forming the basis for interpreting and analyzing visual information. By applying techniques such as resizing, filtering, color adjustments, and data augmentation, image processing enhances input quality, minimizes noise, and corrects distortions. These methods can also simulate real-world variability, helping models generalize better. \n", + "\n", + "In this notebook, we explore three categories of image transformations: **geometric transformations**, **image filtering**, and **photometric transformations**. The following cells contain a series of exercicies designed to help you explore the OpenCV-Python library. If you are unfamiliar with a particular method, refer to the [Image Processing in OpenCV](https://docs.opencv.org/4.x/d2/d96/tutorial_py_table_of_contents_imgproc.html) documentation." 
] }, { "cell_type": "markdown", - "id": "20", + "id": "23", "metadata": {}, "source": [ "### Example image" @@ -547,7 +668,7 @@ { "cell_type": "code", "execution_count": null, - "id": "21", + "id": "24", "metadata": {}, "outputs": [], "source": [ @@ -563,7 +684,7 @@ }, { "cell_type": "markdown", - "id": "22", + "id": "25", "metadata": {}, "source": [ "### Geometric transformation\n", @@ -577,7 +698,7 @@ }, { "cell_type": "markdown", - "id": "23", + "id": "26", "metadata": {}, "source": [ "#### Scaling" @@ -586,7 +707,7 @@ { "cell_type": "code", "execution_count": null, - "id": "24", + "id": "27", "metadata": {}, "outputs": [], "source": [ @@ -596,7 +717,7 @@ { "cell_type": "code", "execution_count": null, - "id": "25", + "id": "28", "metadata": {}, "outputs": [], "source": [ @@ -611,7 +732,7 @@ { "cell_type": "code", "execution_count": null, - "id": "26", + "id": "29", "metadata": {}, "outputs": [], "source": [ @@ -625,7 +746,7 @@ }, { "cell_type": "markdown", - "id": "27", + "id": "30", "metadata": {}, "source": [ "#### Cropping" @@ -634,7 +755,7 @@ { "cell_type": "code", "execution_count": null, - "id": "28", + "id": "31", "metadata": {}, "outputs": [], "source": [ @@ -648,7 +769,7 @@ { "cell_type": "code", "execution_count": null, - "id": "29", + "id": "32", "metadata": {}, "outputs": [], "source": [ @@ -662,7 +783,7 @@ }, { "cell_type": "markdown", - "id": "30", + "id": "33", "metadata": {}, "source": [ "#### Horizontal Flip" @@ -671,7 +792,7 @@ { "cell_type": "code", "execution_count": null, - "id": "31", + "id": "34", "metadata": {}, "outputs": [], "source": [ @@ -685,7 +806,7 @@ { "cell_type": "code", "execution_count": null, - "id": "32", + "id": "35", "metadata": {}, "outputs": [], "source": [ @@ -699,7 +820,7 @@ }, { "cell_type": "markdown", - "id": "33", + "id": "36", "metadata": {}, "source": [ "#### Vertical Flip" @@ -708,7 +829,7 @@ { "cell_type": "code", "execution_count": null, - "id": "34", + "id": "37", "metadata": {}, "outputs": [], "source": [ 
@@ -722,7 +843,7 @@ { "cell_type": "code", "execution_count": null, - "id": "35", + "id": "38", "metadata": {}, "outputs": [], "source": [ @@ -736,7 +857,7 @@ }, { "cell_type": "markdown", - "id": "36", + "id": "39", "metadata": {}, "source": [ "#### Rotation" @@ -745,7 +866,7 @@ { "cell_type": "code", "execution_count": null, - "id": "37", + "id": "40", "metadata": {}, "outputs": [], "source": [ @@ -759,7 +880,7 @@ { "cell_type": "code", "execution_count": null, - "id": "38", + "id": "41", "metadata": {}, "outputs": [], "source": [ @@ -773,7 +894,7 @@ }, { "cell_type": "markdown", - "id": "39", + "id": "42", "metadata": {}, "source": [ "### Image filtering\n", @@ -786,7 +907,7 @@ }, { "cell_type": "markdown", - "id": "40", + "id": "43", "metadata": {}, "source": [ "#### Average filter " @@ -795,7 +916,7 @@ { "cell_type": "code", "execution_count": null, - "id": "41", + "id": "44", "metadata": {}, "outputs": [], "source": [ @@ -809,7 +930,7 @@ { "cell_type": "code", "execution_count": null, - "id": "42", + "id": "45", "metadata": {}, "outputs": [], "source": [ @@ -823,7 +944,7 @@ }, { "cell_type": "markdown", - "id": "43", + "id": "46", "metadata": {}, "source": [ "#### Median filter" @@ -832,7 +953,7 @@ { "cell_type": "code", "execution_count": null, - "id": "44", + "id": "47", "metadata": {}, "outputs": [], "source": [ @@ -846,7 +967,7 @@ { "cell_type": "code", "execution_count": null, - "id": "45", + "id": "48", "metadata": {}, "outputs": [], "source": [ @@ -860,7 +981,7 @@ }, { "cell_type": "markdown", - "id": "46", + "id": "49", "metadata": {}, "source": [ "#### Gaussian filter" @@ -869,7 +990,7 @@ { "cell_type": "code", "execution_count": null, - "id": "47", + "id": "50", "metadata": {}, "outputs": [], "source": [ @@ -883,7 +1004,7 @@ { "cell_type": "code", "execution_count": null, - "id": "48", + "id": "51", "metadata": {}, "outputs": [], "source": [ @@ -897,7 +1018,7 @@ }, { "cell_type": "markdown", - "id": "49", + "id": "52", "metadata": {}, "source": [ 
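The average and median filters exercised in this section can be mimicked on a tiny array with plain NumPy — a rough sketch that only processes interior pixels, not a substitute for the OpenCV calls (`cv2.blur`, `cv2.medianBlur`) the exercises expect:

```python
import numpy as np

# Toy 5x5 grayscale "image" with one salt-noise pixel in the middle
img = np.ones((5, 5))
img[2, 2] = 100.0

def filter3x3(image, reducer):
    """Apply a 3x3 neighbourhood reducer (mean, median, ...) to interior pixels."""
    out = image.copy()
    for i in range(1, image.shape[0] - 1):
        for j in range(1, image.shape[1] - 1):
            out[i, j] = reducer(image[i - 1:i + 2, j - 1:j + 2])
    return out

averaged = filter3x3(img, np.mean)    # average filter smears the spike into neighbours
medianed = filter3x3(img, np.median)  # median filter removes the spike entirely

print(medianed[2, 2])  # the outlier is replaced by the neighbourhood median: 1.0
```

This illustrates why the median filter is the usual choice for salt-and-pepper noise, while the average (and Gaussian) filters are better suited to general smoothing.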
"### Photometric transformation\n", @@ -910,7 +1031,7 @@ }, { "cell_type": "markdown", - "id": "50", + "id": "53", "metadata": {}, "source": [ "#### Adjust brightness" @@ -919,7 +1040,7 @@ { "cell_type": "code", "execution_count": null, - "id": "51", + "id": "54", "metadata": {}, "outputs": [], "source": [ @@ -933,7 +1054,7 @@ { "cell_type": "code", "execution_count": null, - "id": "52", + "id": "55", "metadata": {}, "outputs": [], "source": [ @@ -950,7 +1071,7 @@ }, { "cell_type": "markdown", - "id": "53", + "id": "56", "metadata": {}, "source": [ "#### Adjust contrast" @@ -959,7 +1080,7 @@ { "cell_type": "code", "execution_count": null, - "id": "54", + "id": "57", "metadata": {}, "outputs": [], "source": [ @@ -973,7 +1094,7 @@ { "cell_type": "code", "execution_count": null, - "id": "55", + "id": "58", "metadata": {}, "outputs": [], "source": [ @@ -990,7 +1111,7 @@ }, { "cell_type": "markdown", - "id": "56", + "id": "59", "metadata": {}, "source": [ "#### Adjust saturation" @@ -999,7 +1120,7 @@ { "cell_type": "code", "execution_count": null, - "id": "57", + "id": "60", "metadata": {}, "outputs": [], "source": [ @@ -1013,7 +1134,7 @@ { "cell_type": "code", "execution_count": null, - "id": "58", + "id": "61", "metadata": {}, "outputs": [], "source": [ @@ -1030,17 +1151,29 @@ }, { "cell_type": "markdown", - "id": "59", + "id": "62", "metadata": {}, "source": [ "## Image classifier development using CNNs\n", "\n", - "Image classification is the task of assigning a label or category to an input image from a predefined set of classes. It is a fundamental problem in computer vision with widespread applications, including facial recognition, medical imaging, quality control, and autonomous driving. This section outlines the key steps involved in developing an image classification model using PyTorch. It begins with data preprocessing, which includes splitting the dataset into training, validation, and test sets. 
Afterwards, it defines data augmentation strategies using the Albumentations library, loads the data as PyTorch datasets, and initialises PyTorch dataloaders to efficiently feed data during training. The next step is model training, where a CNN-based model is initialized and optimised using the training and validation data. After training, the model is evaluated on the test set to assess its performance. The evaluation includes metrics such as accuracy and the confusion matrix, which help interpret the model's predictive behavior. Finally, the PyTorch Grad-CAM library is used to visualize the regions of input images that contribute most to the model’s decisions, providing insights into model explainability using representative examples." + "Image classification is the task of assigning a label or category to an input image from a predefined set of classes. It is a fundamental problem in computer vision with widespread applications, including facial recognition, medical imaging, quality control, and autonomous driving. \n", + "\n", + "This section outlines the key steps involved in developing an image classification model using PyTorch:\n", + "\n", + "- It begins with data preprocessing, which includes splitting the dataset into training, validation, and test sets. \n", + "\n", + "- Afterwards, it defines data augmentation strategies using the Albumentations library, loads the data as PyTorch datasets, and initialises PyTorch dataloaders to efficiently feed data during training. \n", + "\n", + "- The next step is model training, where a CNN-based model is initialized and optimised using the training and validation data. \n", + "\n", + "- After training, the model is evaluated on the test set to assess its performance. The evaluation includes metrics such as accuracy and the confusion matrix, which help interpret the model's predictive behavior. 
\n", + "\n", + "- Finally, the PyTorch Grad-CAM library is used to visualize the regions of input images that contribute most to the model’s decisions, providing insights into model explainability using representative examples." ] }, { "cell_type": "markdown", - "id": "60", + "id": "63", "metadata": {}, "source": [ "### Dataset preprocessing" @@ -1048,7 +1181,7 @@ }, { "cell_type": "markdown", - "id": "61", + "id": "64", "metadata": {}, "source": [ "#### Train, validation, and test sets\n", @@ -1059,7 +1192,7 @@ { "cell_type": "code", "execution_count": null, - "id": "62", + "id": "65", "metadata": {}, "outputs": [], "source": [ @@ -1073,16 +1206,23 @@ }, { "cell_type": "markdown", - "id": "63", + "id": "66", "metadata": {}, "source": [ "#### Data Augmentation\n", "\n", - "Data augmentation is a crucial technique in image classification that helps improve the performance and robustness of machine learning models. It involves generating new training samples by applying random transformations — such as rotation, flipping, cropping, scaling, or color jittering — to the original images. Albumentations is one of the most widely used libraries for performing data augmentation in image classification tasks. It includes augmentation techniques that replicate operations commonly used in image processing, such as:\n", + "Data augmentation is a crucial technique in image classification that helps improve the performance and robustness of machine learning models. It involves generating new training samples by applying random transformations — such as rotation, flipping, cropping, scaling, or color jittering — to the original images. \n", + "\n", + "Albumentations is one of the most widely used libraries for performing data augmentation in image classification tasks. 
It includes augmentation techniques that replicate operations commonly used in image processing, such as:\n", + "\n", "- ```A.Affine``` for scaling;\n", + "\n", "- ```A.Rotate``` for rotation;\n", + "\n", "- ```A.HorizontalFlip``` for horizontal flipping;\n", + "\n", "- ```A.VerticalFlip``` for vertical flipping;\n", + "\n", "- ```A.ColorJitter``` for color jittering.\n", "\n", "Albumentations can also be used for image normalization (```A.Normalize```), resizing (```A.Resize```), and converting images to PyTorch tensors with the (Channel, Height, Width) format using ```A.ToTensorV2```, which is required for model training.\n", @@ -1100,7 +1240,7 @@ { "cell_type": "code", "execution_count": null, - "id": "64", + "id": "67", "metadata": {}, "outputs": [], "source": [ @@ -1123,7 +1263,7 @@ }, { "cell_type": "markdown", - "id": "65", + "id": "68", "metadata": {}, "source": [ "#### PyTorch Datasets\n", @@ -1134,7 +1274,7 @@ { "cell_type": "code", "execution_count": null, - "id": "66", + "id": "69", "metadata": {}, "outputs": [], "source": [ @@ -1146,7 +1286,7 @@ }, { "cell_type": "markdown", - "id": "67", + "id": "70", "metadata": {}, "source": [ "#### PyTorch Dataloaders\n", @@ -1157,7 +1297,7 @@ { "cell_type": "code", "execution_count": null, - "id": "68", + "id": "71", "metadata": {}, "outputs": [], "source": [ @@ -1170,27 +1310,42 @@ }, { "cell_type": "markdown", - "id": "69", + "id": "72", "metadata": {}, "source": [ "### Model training\n", "\n", - "Model training comprises a series of steps. First, we must check which devices are available for training the model. In case a GPU with Cuda cores is available is should be used as it really improves the speed. Otherwise, lets use CPU. Then, model and training hyperparameters should be defined, such as numer of output classes, number of training epochs, number of consecutive not improving epochs needed for stopping the training in case we use early stopping regularisation, and learning rate. 
Other hyperparameters can be defined, it depends on what the user wants to do during the training. In this notebook we are going to define the number of epochs, which are the number of times the model is going to see the training set. Early stopping is a way of trying to avoid overfitting where the model evaluates the model every new epoch using a validation set. In case the loss obtained for the validation set does not decrease for a long period of time (pre-defined epochs), the model optimisation stops and retrieves the checkpoint where the validation loss got the last decrease (see [Early Stopping](https://paperswithcode.com/method/early-stopping)). Learning rate defined how quick the models weights should change during training. If it is too high the weights are going to change really quick and might miss minima because they are always jumping from one side to another side. If it is too small the model weights might get stuck a local minimum. So although this is not done in this notebook, this parameter should be studied in order to choose the best (see [What is learning rate in machine learning?](https://www.ibm.com/think/topics/learning-rate)). After defining the hyperparameters, we should define the loss function that is going to be used to evaluate the model and it should be sent to the hardware used for training. Afterwards, the model is defined using ```ImageClassifier``` class and is sent to the device used for training. Next, we should define the optimiser function and also send it to the device used for training. Afterwards, we train the model in case some optimised weights are not available and we explore the learning curves." + "Model training comprises a series of steps:\n", + "\n", + "- First, we must check which devices are available for training the model. If a GPU with CUDA cores is available, it should be used, as it greatly improves training speed. Otherwise, let's use the CPU. 
\n", + "\n", + "- Then, model and training hyperparameters should be defined, such as numer of output classes, number of training epochs, number of consecutive not improving epochs needed for stopping the training in case we use early stopping regularisation, and learning rate. Other hyperparameters can be defined, it depends on what the user wants to do during the training. In this notebook we are going to define the number of epochs, which are the number of times the model is going to see the training set. Early stopping is a way of trying to avoid overfitting where the model evaluates the model every new epoch using a validation set. In case the loss obtained for the validation set does not decrease for a long period of time (pre-defined epochs), the model optimisation stops and retrieves the checkpoint where the validation loss got the last decrease (see [Early Stopping](https://paperswithcode.com/method/early-stopping)). Learning rate defined how quick the models weights should change during training. If it is too high the weights are going to change really quick and might miss minima because they are always jumping from one side to another side. If it is too small the model weights might get stuck a local minimum. So although this is not done in this notebook, this parameter should be studied in order to choose the best (see [What is learning rate in machine learning?](https://www.ibm.com/think/topics/learning-rate)). \n", + "\n", + "- After defining the hyperparameters, we should define the loss function that is going to be used to evaluate the model and it should be sent to the hardware used for training. \n", + "\n", + "- Afterwards, the model is defined using ```ImageClassifier``` class and is sent to the device used for training. \n", + "\n", + "- Next, we should define the optimiser function and also send it to the device used for training. 
\n", + "\n", + "- Afterwards, we train the model in case some optimised weights are not available and we explore the learning curves." ] }, { "cell_type": "markdown", - "id": "70", + "id": "73", "metadata": {}, "source": [ "### Model Training Overview\n", "\n", "Model training involves a sequence of key steps. The first step is to check which computational devices are available. If a GPU with CUDA cores is accessible, it should be used, as it significantly accelerates training (```DEVICE = torch.device(\"cuda:0\" if torch.cuda.is_available() else \"cpu\")```). Otherwise, the model will be trained on the CPU. Next, we define the model and training hyperparameters. These typically include:\n", "\n", - "- The number of output classes (```NUMBER_CLASSES = len(CIFAR_10_CLASSES)```)\n", - "- The number of training epochs (i.e., how many times the model sees the full training set) (```EPOCHS = 500```)\n", - "- The patience for early stopping (i.e., how many consecutive epochs without improvement are allowed before stopping training) (```EARLY_STOPPING_LIMIT = EPOCHS // 10```)\n", - "- The learning rate (```LR = 0.001```)\n", + "- The number of output classes (```NUMBER_CLASSES = len(CIFAR_10_CLASSES)```);\n", + "\n", + "- The number of training epochs (i.e., how many times the model sees the full training set) (```EPOCHS = 500```);\n", + "\n", + "- The patience for early stopping (i.e., how many consecutive epochs without improvement are allowed before stopping training) (```EARLY_STOPPING_LIMIT = EPOCHS // 10```);\n", + "\n", + "- The learning rate (```LR = 0.001```).\n", "\n", "Additional hyperparameters may also be configured depending on the training strategy or specific use case.\n", "\n", @@ -1203,7 +1358,7 @@ }, { "cell_type": "markdown", - "id": "71", + "id": "74", "metadata": {}, "source": [ "#### Check which device is used for training" @@ -1212,7 +1367,7 @@ { "cell_type": "code", "execution_count": null, - "id": "72", + "id": "75", "metadata": {}, "outputs": 
[], "source": [ @@ -1221,7 +1376,7 @@ }, { "cell_type": "markdown", - "id": "73", + "id": "76", "metadata": {}, "source": [ "#### Define training hyperparameters" @@ -1230,7 +1385,7 @@ { "cell_type": "code", "execution_count": null, - "id": "74", + "id": "77", "metadata": {}, "outputs": [], "source": [ @@ -1246,7 +1401,7 @@ }, { "cell_type": "markdown", - "id": "75", + "id": "78", "metadata": {}, "source": [ "#### Loss function\n", @@ -1271,7 +1426,7 @@ { "cell_type": "code", "execution_count": null, - "id": "76", + "id": "79", "metadata": {}, "outputs": [], "source": [ @@ -1280,7 +1435,7 @@ }, { "cell_type": "markdown", - "id": "77", + "id": "80", "metadata": {}, "source": [ "#### Initialise model architecture" @@ -1289,7 +1444,7 @@ { "cell_type": "code", "execution_count": null, - "id": "78", + "id": "81", "metadata": {}, "outputs": [], "source": [ @@ -1298,7 +1453,7 @@ }, { "cell_type": "markdown", - "id": "79", + "id": "82", "metadata": {}, "source": [ "#### Optimiser function\n", @@ -1341,7 +1496,7 @@ { "cell_type": "code", "execution_count": null, - "id": "80", + "id": "83", "metadata": {}, "outputs": [], "source": [ @@ -1350,7 +1505,7 @@ }, { "cell_type": "markdown", - "id": "81", + "id": "84", "metadata": {}, "source": [ "#### Train model\n", @@ -1361,25 +1516,27 @@ { "cell_type": "code", "execution_count": null, - "id": "82", + "id": "85", "metadata": {}, "outputs": [], "source": [ - "# Initialise the Train instance, which is going to be used to train the image classifier\n" + "# Initialise the Train instance, which is going to be used to train the image classifier" ] }, { "cell_type": "markdown", - "id": "83", + "id": "86", "metadata": {}, "source": [ - "Check whether a trained model already exists. If so, load the weights using ```model_weights = torch.load(model_path, weights_only=True)```. Then, load the weights into the model using (```model.load_state_dict(model_weights)```). Finally, set the model to evaluation model (```model.eval()```). 
This step is essential because certain layers, such as batch normalization and dropout, behave differently during training and evaluation. Setting the model to evaluation mode ensures they operate correctly during validation or testing. If no pre-trained model is available, train a new model using the training and validation sets along with the predefined hyperparameters (```trainer.fit(EPOCHS, train_dataloader, val_dataloader, optimizer, criterion, DEVICE, EARLY_STOPPING_LIMIT)```). After training, save the best model weights (```torch.save(trainer.model.best_model_weights, model_path)```)." + "After initialising the trainer instance, check whether a trained model already exists. If so, load the weights using ```model_weights = torch.load(model_path, weights_only=True)```. Then, load the weights into the model using ```model.load_state_dict(model_weights)```. Finally, set the model to evaluation mode (```model.eval()```). This step is essential because certain layers, such as batch normalization and dropout, behave differently during training and evaluation. Setting the model to evaluation mode ensures they operate correctly during validation or testing. \n", + "\n", + "If no pre-trained model is available, train a new model using the training and validation sets along with the predefined hyperparameters (```trainer.fit(EPOCHS, train_dataloader, val_dataloader, optimizer, criterion, DEVICE, EARLY_STOPPING_LIMIT)```). After training, save the best model weights (```torch.save(trainer.model.best_model_weights, model_path)```)." ] }, { "cell_type": "code", "execution_count": null, - "id": "84", + "id": "87", "metadata": {}, "outputs": [], "source": [ @@ -1394,18 +1551,20 @@ }, { "cell_type": "markdown", - "id": "85", + "id": "88", "metadata": {}, "source": [ "#### Learning curves\n", "\n", - "After training the model, we can analyse the learning curves to assess the training process. 
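The load-or-train checkpoint logic just described can be sketched with a small `nn.Linear` standing in for the notebook's `ImageClassifier` (the checkpoint path is hypothetical):

```python
import os
import torch
import torch.nn as nn

model_path = "image_classifier.pt"  # hypothetical checkpoint path
model = nn.Linear(4, 2)             # stand-in for the notebook's ImageClassifier

# Simulate a finished training run that persisted its best weights
torch.save(model.state_dict(), model_path)

# Later (or in a fresh session): reload the weights instead of retraining
restored = nn.Linear(4, 2)
if os.path.exists(model_path):
    model_weights = torch.load(model_path, weights_only=True)
    restored.load_state_dict(model_weights)
    restored.eval()  # switches layers such as dropout/batch norm to inference behaviour
```

In the notebook itself, the `else` branch of this check would call `trainer.fit(...)` and then save `trainer.model.best_model_weights`.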
These curves, which typically display the loss over epochs for both the training and validation sets, are crucial for improving model performance. They can help identify issues like overfitting or underfitting. Overfitting occurs when the model performs well on the training data but poorly on the validation data, usually indicated by a widening gap between the two curves. Underfitting, on the other hand, is suggested when both the training and validation curves show poor performance and fail to improve. By monitoring these curves, we can adjust hyperparameters or modify the model architecture to address such issues. First, load the log file using ```pandas``` (```training_log = pd.read_csv(\"training_log.txt\")```). Then, use the ```matplotlib``` library to plot the learning curves." + "After training the model, we can analyse the learning curves to assess the training process. These curves, which typically display the loss over epochs for both the training and validation sets, are crucial for improving model performance. They can help identify issues like overfitting or underfitting. Overfitting occurs when the model performs well on the training data but poorly on the validation data, usually indicated by a widening gap between the two curves. Underfitting, on the other hand, is suggested when both the training and validation curves show poor performance and fail to improve. By monitoring these curves, we can adjust hyperparameters or modify the model architecture to address such issues. \n", + "\n", + "First, load the log file using ```pandas``` (```training_log = pd.read_csv(\"training_log.txt\")```). Then, use the ```matplotlib``` library to plot the learning curves." 
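This plotting step can be sketched as follows; a small synthetic DataFrame stands in for `pd.read_csv("training_log.txt")`, with column names matching the header written during training:

```python
import pandas as pd
from matplotlib import pyplot as plt

# Synthetic stand-in for the training log file; the column names follow the
# "Epoch,Train Loss,Val Loss,..." header used by the Train class
training_log = pd.DataFrame({
    "Epoch": [0, 1, 2, 3],
    "Train Loss": [2.1, 1.6, 1.2, 1.0],
    "Val Loss": [2.0, 1.7, 1.5, 1.4],
})

# Plot training and validation loss over epochs
fig, ax = plt.subplots(figsize=(8, 5))
ax.plot(training_log["Epoch"], training_log["Train Loss"], label="Training loss")
ax.plot(training_log["Epoch"], training_log["Val Loss"], label="Validation loss")
ax.set_xlabel("Epoch")
ax.set_ylabel("Loss")
ax.set_title("Learning curves")
ax.legend()
```

A widening gap between the two curves in such a plot would point to overfitting.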
] }, { "cell_type": "code", "execution_count": null, - "id": "86", + "id": "89", "metadata": {}, "outputs": [], "source": [ @@ -1423,7 +1582,7 @@ }, { "cell_type": "markdown", - "id": "87", + "id": "90", "metadata": {}, "source": [ "### Model testing\n", @@ -1445,7 +1604,7 @@ { "cell_type": "code", "execution_count": null, - "id": "88", + "id": "91", "metadata": {}, "outputs": [], "source": [ @@ -1454,7 +1613,7 @@ }, { "cell_type": "markdown", - "id": "89", + "id": "92", "metadata": {}, "source": [ "### Explore results\n", @@ -1464,7 +1623,7 @@ }, { "cell_type": "markdown", - "id": "90", + "id": "93", "metadata": {}, "source": [ "#### Compute average accuracy\n", @@ -1475,7 +1634,7 @@ { "cell_type": "code", "execution_count": null, - "id": "91", + "id": "94", "metadata": {}, "outputs": [], "source": [ @@ -1486,7 +1645,7 @@ }, { "cell_type": "markdown", - "id": "92", + "id": "95", "metadata": {}, "source": [ "#### Compute confusion matrix\n", @@ -1500,7 +1659,7 @@ { "cell_type": "code", "execution_count": null, - "id": "93", + "id": "96", "metadata": {}, "outputs": [], "source": [ @@ -1531,7 +1690,7 @@ }, { "cell_type": "markdown", - "id": "94", + "id": "97", "metadata": {}, "source": [ "### Explain image classifier predictions\n", @@ -1541,18 +1700,24 @@ }, { "cell_type": "markdown", - "id": "95", + "id": "98", "metadata": {}, "source": [ - "#### Prepare image for GradCAM\n", + "#### Prepare image for Grad-CAM\n", + "\n", + "To prepare the image for Grad-CAM visualization:\n", "\n", - "To prepare the image for Grad-CAM visualization, first convert it to (Height, Width, Channels) format using ```img_np = np.transpose(img, (1, 2, 0)) # shape: (H, W, C)```, and normalize its values to the [0, 1] range with ```img_np = (img_np - img_np.min()) / (img_np.max() - img_np.min())```. This processed image is used only for visualization, as expected by the PyTorch-GradCAM library. 
Next, modify the original image for model inference by adding a batch dimension: ```img = np.expand_dims(img, axis=0)```, then convert it to a PyTorch tensor: ```img = torch.from_numpy(img)```, and move it to the appropriate computation device using ```img = img.to(DEVICE)```. Finally, retrieve the predicted and true labels, as both are required for computing and visualizing the Grad-CAM output.\n" + "- First, convert it to (Height, Width, Channels) format using ```img_np = np.transpose(img, (1, 2, 0)) # shape: (H, W, C)```, and normalize its values to the [0, 1] range with ```img_np = (img_np - img_np.min()) / (img_np.max() - img_np.min())```. This processed image is used only for visualization, as expected by the PyTorch-GradCAM library;\n", + "\n", + "- Next, modify the original image for model inference by adding a batch dimension: ```img = np.expand_dims(img, axis=0)```, then convert it to a PyTorch tensor: ```img = torch.from_numpy(img)```, and move it to the appropriate computation device using ```img = img.to(DEVICE)```;\n", + "\n", + "- Finally, retrieve the predicted and true labels, as both are required for computing and visualizing the Grad-CAM output.\n" ] }, { "cell_type": "code", "execution_count": null, - "id": "96", + "id": "99", "metadata": {}, "outputs": [], "source": [ @@ -1565,12 +1730,20 @@ }, { "cell_type": "markdown", - "id": "97", + "id": "100", "metadata": {}, "source": [ "#### Compute GradCAM heatmap\n", "\n", - "First, ensure that the `requires_grad` attribute of the input image tensor is set to `True` by using `img.requires_grad = True`. This enables gradient computation with respect to the image, which is necessary for generating class activation maps. Next, specify the layer to inspect using ```target_layers = [model.conv3]```. 
Typically, the last convolutional layer of the image classifier is chosen because it preserves spatial information, which is crucial for identifying the regions of the input image that most strongly influence the model's prediction. Then, define the target class to be explained with ```targets = [ClassifierOutputTarget(pred_label)]```, where ```pred_label``` is the class index corresponding to the model’s predicted output (or any other class of interest). Finally, compute the Grad-CAM heatmap using the activations and gradients from the selected layer:\n", + "To compute the Grad-CAM heatmap:\n", + "\n", + "- First, ensure that the `requires_grad` attribute of the input image tensor is set to `True` by using `img.requires_grad = True`. This enables gradient computation with respect to the image, which is necessary for generating class activation maps;\n", + "\n", + "- Next, specify the layer to inspect using ```target_layers = [model.conv3]```. Typically, the last convolutional layer of the image classifier is chosen because it preserves spatial information, which is crucial for identifying the regions of the input image that most strongly influence the model's prediction. \n", + "\n", + "- Then, define the target class to be explained with ```targets = [ClassifierOutputTarget(pred_label)]```, where ```pred_label``` is the class index corresponding to the model’s predicted output (or any other class of interest). 
\n", + "\n", + "- Finally, compute the Grad-CAM heatmap using the activations and gradients from the selected layer:\n", "```python\n", "# Create CAM object\n", "with GradCAM(model=model, target_layers=target_layers) as cam:\n", @@ -1582,7 +1755,7 @@ { "cell_type": "code", "execution_count": null, - "id": "98", + "id": "101", "metadata": {}, "outputs": [], "source": [ @@ -1597,10 +1770,10 @@ }, { "cell_type": "markdown", - "id": "99", + "id": "102", "metadata": {}, "source": [ - "#### Visualise GradCAM heatmap with the image\n", + "#### Visualise Grad-CAM heatmap with the image\n", "\n", "After obtaining the Grad-CAM heatmap, we overlay it on the input image to visualise the regions that contributed most to the model’s prediction (```visualisation = show_cam_on_image(img_np, grad_cam_matrix, use_rgb=True)```). This helps identify which pixels the model focused on when predicting the class." ] @@ -1608,7 +1781,7 @@ { "cell_type": "code", "execution_count": null, - "id": "100", + "id": "103", "metadata": {}, "outputs": [], "source": [ @@ -1624,7 +1797,7 @@ ], "metadata": { "kernelspec": { - "display_name": "Python 3 (ipykernel)", + "display_name": "python-tutorial", "language": "python", "name": "python3" }, From 4a9ec7c3dde267f3f845dda0931817bcc7976326 Mon Sep 17 00:00:00 2001 From: Fabio Lopes Date: Fri, 9 May 2025 14:09:51 +0200 Subject: [PATCH 07/10] Correct changes --- ...ion.ipynb => 25_image_classification.ipynb | 44 +++++++++---------- ...ion.py => test_25_image_classification.py} | 0 2 files changed, 22 insertions(+), 22 deletions(-) rename 25_libraries_image_classification.ipynb => 25_image_classification.ipynb (99%) rename tutorial/tests/{test_25_libraries_image_classification.py => test_25_image_classification.py} (100%) diff --git a/25_libraries_image_classification.ipynb b/25_image_classification.ipynb similarity index 99% rename from 25_libraries_image_classification.ipynb rename to 25_image_classification.ipynb index a7b8197b..b4a46b8a 100644 --- 
a/25_libraries_image_classification.ipynb +++ b/25_image_classification.ipynb @@ -65,7 +65,15 @@ "id": "2", "metadata": {}, "source": [ - "## Libraries" + "## References\n", + "\n", + "Here are some additional references to guide you while self-learning:\n", + "- Official documentation for [openCV](https://docs.opencv.org/4.x/d6/d00/tutorial_py_root.html)\n", + "- Official documentation for [PIL library](https://pillow.readthedocs.io/en/stable/)\n", + "- Official documentation for [PyTorch](https://pytorch.org/)\n", + "- Official documentation for [Albumentations](https://albumentations.ai/)\n", + "- Official documentation for [PyTorch GradCAM](https://jacobgil.github.io/pytorch-gradcam-book/introduction.html)\n", + "- [A tutorial from Microsoft to compute image classification using PyTorch](https://learn.microsoft.com/en-us/windows/ai/windows-ml/tutorials/pytorch-train-model)" ] }, { @@ -73,19 +81,27 @@ "id": "3", "metadata": {}, "source": [ - "- NumPy\n", - "- Matplotlib\n", - "- Scikit-learn\n", + "## Libraries" + ] + }, + { + "cell_type": "markdown", + "id": "4", + "metadata": {}, + "source": [ + "- [Matplotlib](./20_library_matplotlib.ipynb)\n", + "- [NumPy](./21_library_numpy.ipynb)\n", + "- [scikit-learn](./22_library_sklearn.ipynb)\n", "- OpenCV-Python\n", "- PyTorch\n", "- Albumentations\n", - "- PyTorch GradCAM" + "- PyTorch Grad-CAM" ] }, { "cell_type": "code", "execution_count": null, - "id": "4", + "id": "5", "metadata": {}, "outputs": [], "source": [ @@ -113,22 +129,6 @@ "from pytorch_grad_cam.utils.image import show_cam_on_image" ] }, - { - "cell_type": "markdown", - "id": "5", - "metadata": {}, - "source": [ - "## References\n", - "\n", - "Here are some additional references to guide you while self-learning:\n", - "- Official documentation for [openCV](https://docs.opencv.org/4.x/d6/d00/tutorial_py_root.html)\n", - "- Official documentation for [PIL library](https://pillow.readthedocs.io/en/stable/)\n", - "- Official documentation for 
[PyTorch](https://pytorch.org/)\n", - "- Official documentation for [Albumentations](https://albumentations.ai/)\n", - "- Official documentation for [PyTorch GradCAM](https://jacobgil.github.io/pytorch-gradcam-book/introduction.html)\n", - "- [A tutorial from Microsoft to compute image classification using PyTorch](https://learn.microsoft.com/en-us/windows/ai/windows-ml/tutorials/pytorch-train-model)" - ] - }, { "cell_type": "markdown", "id": "6", diff --git a/tutorial/tests/test_25_libraries_image_classification.py b/tutorial/tests/test_25_image_classification.py similarity index 100% rename from tutorial/tests/test_25_libraries_image_classification.py rename to tutorial/tests/test_25_image_classification.py From 764a28e80f345950ee427981d0e1e8211969b818 Mon Sep 17 00:00:00 2001 From: Aliaksandr Yakutovich Date: Fri, 9 May 2025 15:38:25 +0200 Subject: [PATCH 08/10] Update toc, do one sentence per line. --- 25_image_classification.ipynb | 291 +++++++++++++++++++++------------- 1 file changed, 183 insertions(+), 108 deletions(-) diff --git a/25_image_classification.ipynb b/25_image_classification.ipynb index b4a46b8a..21bc327a 100644 --- a/25_image_classification.ipynb +++ b/25_image_classification.ipynb @@ -13,7 +13,7 @@ "id": "1", "metadata": {}, "source": [ - "## Table of Contents\n", + "# Table of Contents\n", " - [Image Classification Notebook](#Image-Classification-Notebook)\n", " - [References](#References)\n", " - [Libraries](#Libraries)\n", @@ -23,41 +23,44 @@ " - [Dataset](#Dataset)\n", " - [Load data](#Load-data)\n", " - [Explore image processing](#Explore-image-processing)\n", + " - [Example image](#Example-image)\n", " - [Geometric transformation](#Geometric-transformation)\n", " - [Scaling](#Scaling)\n", " - [Cropping](#Cropping)\n", - " - [Vertical flip](#Vertical-flip)\n", - " - [Horizontal flip](#Horizontal-flip)\n", + " - [Horizontal Flip](#Horizontal-Flip)\n", + " - [Vertical Flip](#Vertical-Flip)\n", " - [Rotation](#Rotation)\n", " - [Image 
filtering](#Image-filtering)\n", - " - [Average filter](#Average-filter)\n", + " - [Average filter ](#Average-filter)\n", " - [Median filter](#Median-filter)\n", " - [Gaussian filter](#Gaussian-filter)\n", " - [Photometric transformation](#Photometric-transformation)\n", " - [Adjust brightness](#Adjust-brightness)\n", " - [Adjust contrast](#Adjust-contrast)\n", " - [Adjust saturation](#Adjust-saturation)\n", - " - [CNN model development](#CNN-model-development)\n", - " - [Data preprocessing](#Data-preprocessing)\n", - " - [Data Augmentation](#Data-augmentation)\n", - " - [Train, validation, and test sets](#Train-validation-and-test-sets)\n", - " - [PyTorch datasets](#Pytorch-datasets)\n", - " - [PyTorch dataloaders](#Pytorch-dataloaders)\n", + " - [Image classifier development using CNNs](#Image-classifier-development-using-CNNs)\n", + " - [Dataset preprocessing](#Dataset-preprocessing)\n", + " - [Train, validation, and test sets](#Train,-validation,-and-test-sets)\n", + " - [Data Augmentation](#Data-Augmentation)\n", + " - [PyTorch Datasets](#PyTorch-Datasets)\n", + " - [PyTorch Dataloaders](#PyTorch-Dataloaders)\n", " - [Model training](#Model-training)\n", - " - [Training hyperparameters](#Training-hyperparameters)\n", + " - [Model Training Overview](#Model-Training-Overview)\n", + " - [Check which device is used for training](#Check-which-device-is-used-for-training)\n", + " - [Define training hyperparameters](#Define-training-hyperparameters)\n", + " - [Loss function](#Loss-function)\n", " - [Initialise model architecture](#Initialise-model-architecture)\n", - " - [Optimiser](#Optimiser)\n", + " - [Optimiser function](#Optimiser-function)\n", " - [Train model](#Train-model)\n", - " - [Loss function](#Loss-function)\n", - " - [Learning curves](#Learning-curves)\n", + " - [Learning curves](#Learning-curves)\n", " - [Model testing](#Model-testing)\n", " - [Explore results](#Explore-results)\n", " - [Compute average accuracy](#Compute-average-accuracy)\n", " - 
[Compute confusion matrix](#Compute-confusion-matrix)\n", - " - [Explain image prediction](#Explain-image-prediction)\n", - " - [Load data batch](#Load-data-batch)\n", + " - [Explain image classifier predictions](#Explain-image-classifier-predictions)\n", + " - [Prepare image for Grad-CAM](#Prepare-image-for-Grad-CAM)\n", " - [Compute GradCAM heatmap](#Compute-GradCAM-heatmap)\n", - " - [Visualise GradCAM heatmap with the image](#Visualise-GradCAM-heatmap-with-the-image)\n" + " - [Visualise Grad-CAM heatmap with the image](#Visualise-Grad-CAM-heatmap-with-the-image)" ] }, { @@ -136,9 +139,12 @@ "source": [ "## Introduction\n", "\n", - "Image Classification is a foundational task in the field of computer vision and machine learning. This notebook aims to provide practical experience in image processing and in building and evaluating image classification models. \n", + "Image Classification is a foundational task in the field of computer vision and machine learning.\n", + "This notebook aims to provide practical experience in image processing and in building and evaluating image classification models. \n", "\n", - "It begins by demonstrating how to load and preprocess image data using Matplotlib and OpenCV-Python. Then, it shows how to build a basic image classification pipeline based on Convolutional Neural Networks (CNNs) using PyTorch, Albumentations, and Scikit-learn. 
Next, it covers how to evaluate model performance using Scikit-learn and NumPy, and finally, it introduces model explainability using Grad-CAM.\n", + "It begins by demonstrating how to load and preprocess image data using Matplotlib and OpenCV-Python.\n", + "Then, it shows how to build a basic image classification pipeline based on Convolutional Neural Networks (CNNs) using PyTorch, Albumentations, and Scikit-learn.\n", + "Next, it covers how to evaluate model performance using Scikit-learn and NumPy, and finally, it introduces model explainability using Grad-CAM.\n", "\n", "The goal of this notebook is not to teach the underlying algorithms and procedures used in this field, but rather to give the user an idea of what can be done with these Python libraries." ] @@ -225,75 +231,84 @@ "`__init__` function:\n", "\n", "The first thing to do is to build the `__init__` function, which contains the variables needed for building the neural network.\n", - "\n", "Let's start by defining the number of feature maps in the first convolutional layer (the value is empirical):\n", + "\n", "```python\n", "self.feature_maps = 64\n", "```\n", "\n", - "To help a computer understand and classify images, we build a model made up of layers, kind of like stacking Lego blocks. Each block does a specific task — detecting patterns, reducing size, or making decisions. Here's what each component does:\n", + "To help a computer understand and classify images, we build a model made up of layers, kind of like stacking Lego blocks.\n", + "Each block does a specific task — detecting patterns, reducing size, or making decisions. Here's what each component does:\n", + "\n", "```python\n", "self.conv1 = nn.Conv2d(in_channels, self.feature_maps, kernel_size = 3)\n", "```\n", - "This layer scans the image for small patterns (like edges or colors).\n", "\n", + "This layer scans the image for small patterns (like edges or colors).\n", "`in_channels` is the number of input image channels (e.g. 
3 for RGB images).\n", - "\n", "`self.feature_maps` is how many different patterns we want the model to learn at this layer.\n", - "\n", "`kernel_size = 3` means the scanning window is 3x3 pixels. The value is empirical.\n", "\n", "```python\n", "self.pool1 = nn.MaxPool2d(kernel_size = 2)\n", "```\n", - "This layer shrinks the size of the image while keeping the most important info (max values). It helps the model focus and reduces computation.\n", + "This layer shrinks the size of the image while keeping the most important info (max values).\n", + "It helps the model focus and reduces computation.\n", "\n", "```python\n", "self.bn1 = nn.BatchNorm2d(self.feature_maps)\n", "```\n", - "This layer normalizes the outputs, making training faster and more stable.\n", "\n", + "This layer normalizes the outputs, making training faster and more stable.\n", "The combination of the foreamentioned layers is also usually called as convolutional block.\n", - "\n", "After defining the first convolutional block, lets define the second one:\n", + "\n", "```python\n", "self.conv2 = nn.Conv2d(self.feature_maps, self.feature_maps * 2, kernel_size = 3)\n", "self.pool2 = nn.MaxPool2d(kernel_size = 2)\n", "self.bn2 = nn.BatchNorm2d(self.feature_maps * 2)\n", "```\n", "The second block is very similar to the first block, but now it looks for more complex patterns by increasing the number of feature maps (i.e. learning more features).\n", + "After defining the second convolutional block.\n", + "Lets define the third and last one:\n", "\n", - "After defining the second convolutional block. lets define the third and last one:\n", "```python\n", "self.conv3 = nn.Conv2d(self.feature_maps * 2, self.feature_maps * 4, kernel_size = 3)\n", "self.pool3 = nn.MaxPool2d(kernel_size = 2)\n", "self.bn3 = nn.BatchNorm2d(self.feature_maps * 4)\n", "```\n", - "This block explored even deeper patterns, such as shapes or textures. 
As we go deeper, the network becomes better at understanding the image.\n", "\n", + "This block explores even deeper patterns, such as shapes or textures.\n", + "As we go deeper, the network becomes better at understanding the image.\n", "Then, we define the activation layer that is going to be used in-between these blocks:\n", + "\n", "```python\n", "self.relu = nn.ReLU()\n", "```\n", - "After each layer, we add a \"yes/no\" switch to keep only useful patterns. ReLU (Rectified Linear Unit) sets negative values to zero — it adds non-linearity to help the network learn more complex things.\n", "\n", + "After each layer, we add a \"yes/no\" switch to keep only useful patterns.\n", + "ReLU (Rectified Linear Unit) sets negative values to zero — it adds non-linearity to help the network learn more complex things.\n", "Next, we define the layer that transforms the data from 2D images into a 1D vector (like stretching out a grid of pixels into a line):\n", + "\n", "```python\n", "self.flatten = nn.Flatten(start_dim=1)\n", "```\n", "\n", "Now, we define the dropout layer:\n", + "\n", "```python\n", "self.dropout = nn.Dropout(p = 0.3)\n", "```\n", "This layer randomly turns off a pre-defined percentage of neurons (`p = 0.3`) during training to prevent overfitting — so the model does not memorize the training data too closely.\n", - "\n", "Finally, we define the classifier:\n", + "\n", "```python\n", "self.out_classes = out_classes\n", "self.fc = nn.Linear(1024, self.out_classes)\n", "```\n", - "This final layer is like the decision-maker. It takes all the features the model has learned and decides which class (e.g. cat, dog, airplane) the input image belongs to.\n", + "\n", + "This final layer is like the decision-maker.\n", + "It takes all the features the model has learned and decides which class (e.g. 
cat, dog, airplane) the input image belongs to.\n", "\n", "1024 is the number of features coming into the layer (depends on the hyperparameters used in the previous layers), and `out_classes` is how many classes we want to predict." ] @@ -305,7 +320,9 @@ "source": [ "`forward` function:\n", "\n", - "After defining the function `__init__`, we need to define the function `forward`. This one is responsible to combine all the layers defined in the `__init__` to build the neural network model. Basically, it describes how an input image flows through the network, one layer at a time, to become a prediction.\n", + "After defining the function `__init__`, we need to define the function `forward`.\n", + "It is responsible for combining all the layers defined in the `__init__` function to build the neural network model.\n", + "Basically, it describes how an input image flows through the network, one layer at a time, to become a prediction.\n", "\n", "```python\n", "# Convolutional block 1\n", @@ -359,19 +376,19 @@ "`__init__` function:\n", "\n", "This function is used to initialise variables used in the other functions of the class.\n", - "\n", "Start by initialising the following attributes:\n", + "\n", "```python\n", "self.model = model\n", "self.train_losses = []\n", "self.val_losses = []\n", "self.best_model_weights = None\n", "```\n", - "`self.model` – This is the neural network we're training.\n", "\n", - "`self.train_losses` and `self.val_losses` – These lists keep track of how well the model is doing on the training and validation sets over time (used to plot learning curves).\n", + "`self.model` – this is the neural network we're training.\n", + "`self.train_losses` and `self.val_losses` – these lists keep track of how well the model is doing on the training and validation sets over time (used to plot learning curves).\n", "\n", - "`self.best_model_weights` – This will store a copy of the model when it performed best on the validation set (used for early stopping)." 
+ "`self.best_model_weights` – this will store a copy of the model when it performed best on the validation set (used for early stopping)." ] }, { @@ -381,8 +398,8 @@ "source": [ "`fit` function:\n", "\n", - "This function goes through the data multiple times (epochs) to optimize the model’s performance. It also applies early stopping, which stops training if performance stops improving.\n", - "\n", + "This function goes through the data multiple times (epochs) to optimize the model’s performance.\n", + "It also applies early stopping, which stops training if performance stops improving.\n", "Let's start by initialising the following variables:\n", "\n", "```python\n", @@ -391,11 +408,9 @@ "best_epoch = 0\n", "```\n", "\n", - "`early_stopping_count`: It is used to track the number of epochs without improving validation loss (used in early stopping).\n", - "\n", - "`best_val_loss`: It is used to track the best validation loss ever seen. Here we use a very large meaning-less number because validation loss for classification is always smaller than that.\n", - "\n", - "`best_epoch`: It is used to track the epoch that got the best validation loss.\n", + "`early_stopping_count` is used to track the number of epochs without improving validation loss (used in early stopping).\n", + "`best_val_loss` is used to track the best validation loss seen so far. Here we initialise it to a very large, meaningless number because the validation loss for classification is always smaller than that.\n", + "`best_epoch` is used to track the epoch that got the best validation loss.\n", "\n", "Then, we initialise the file that is going to store the training statistics:\n", "\n", @@ -406,7 +421,8 @@ " log_file.write(\"Epoch,Train Loss,Val Loss,Best Val Loss,Best Epoch\\n\")\n", "```\n", "\n", - "Now comes the training phase. 
It includes a main for-loop that runs until the end of the pre-defined number of epochs, and two inner loops: one for optimising the model's weights and another for evaluating the model after each epoch.\n", + "Now comes the training phase.\n", + "It includes a main for-loop that runs until the end of the pre-defined number of epochs, and two inner loops: one for optimising the model's weights and another for evaluating the model after each epoch.\n", "\n", "```python\n", "for epoch in range(epochs):\n", @@ -499,8 +515,8 @@ "`predict` function:\n", "\n", "Once training is done, this method is used to predict labels for new data.\n", - "\n", "As early stopping is used during the training, it might be the case that the last model's weights were not the best ones. Therefore, load the best-performing ones.\n", + "```python\n", "# Load best weights\n", "if self.best_model_weights:\n", @@ -513,7 +529,9 @@ "self.model.eval()\n", "```\n", "\n", - "Loop through the test set to get the model predictions. Not only the predictions, but also the original images and the true labels are stored for future use.\n", + "Loop through the test set to get the model predictions.\n", + "Not only the predictions, but also the original images and the true labels are stored for future use.\n", + "\n", "```python\n", "original_images = []\n", "true_labels = []\n", @@ -574,7 +592,8 @@ "source": [ "## Functions\n", "\n", - "The following three functions are going to be used throughout the notebook. They comprise the loading of binary files using Pickle (**load_pickle_file**), single image plotting (**plot_image**), and multiple image plotting (**plot_multiple_images**)." + "The following three functions are going to be used throughout the notebook.\n", + "They comprise the loading of binary files using Pickle (**load_pickle_file**), single image plotting (**plot_image**), and multiple image plotting (**plot_multiple_images**)."
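The early-stopping bookkeeping that `fit` performs with `early_stopping_count`, `best_val_loss`, and `best_epoch` can be sketched in plain Python on a made-up sequence of validation losses; no actual training is involved, and the patience value is an assumption.

```python
# Made-up validation losses, one per epoch
val_losses = [0.9, 0.7, 0.6, 0.65, 0.62, 0.61, 0.63]
early_stopping_limit = 3  # assumed patience

early_stopping_count = 0
best_val_loss = float("inf")
best_epoch = 0

for epoch, val_loss in enumerate(val_losses):
    if val_loss < best_val_loss:
        best_val_loss = val_loss
        best_epoch = epoch
        early_stopping_count = 0  # reset the patience counter on improvement
    else:
        early_stopping_count += 1
        if early_stopping_count >= early_stopping_limit:
            break  # no improvement for `early_stopping_limit` epochs: stop

print(best_epoch, best_val_loss)  # 2 0.6
```

Training stops at the third non-improving epoch, and the checkpoint from the best epoch is the one that `predict` later restores.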
] }, { @@ -609,7 +628,8 @@ "source": [ "## Dataset\n", "\n", - "In this section, we load the CIFAR-10 dataset, which consists of 60,000 32x32 color images across 10 different classes, with 6,000 images per class. The dataset is divided into 50,000 training images and 10,000 test images. It was already processed and it is ready to use after loading the binary files *train_set.pkl* and *test_set.pkl*." + "In this section, we load the CIFAR-10 dataset, which consists of 60,000 32x32 color images across 10 different classes, with 6,000 images per class.\n", + "The dataset is divided into 50,000 training images and 10,000 test images. It has already been processed and is ready to use after loading the binary files *train_set.pkl* and *test_set.pkl*." ] }, { @@ -619,7 +639,8 @@ "source": [ "### Load data\n", "\n", - "Training and test sets are loaded using Pickle library. If you do not have the dataset already, open this [link](https://www.dropbox.com/scl/fo/p7gfb0kpgkbrrjup340pi/AAkX2u1g-W7290-Aq7gHHvo?rlkey=vdxaj6npfy09ywh17nl8f9v6e&st=8hfq9z20&dl=0) and download it. Place it inside the data folder." + "Training and test sets are loaded using the Pickle library. If you do not have the dataset already, open this [link](https://www.dropbox.com/scl/fo/p7gfb0kpgkbrrjup340pi/AAkX2u1g-W7290-Aq7gHHvo?rlkey=vdxaj6npfy09ywh17nl8f9v6e&st=8hfq9z20&dl=0) and download it.\n", + "Place it inside the data folder." ] }, { @@ -652,9 +673,13 @@ "source": [ "## Explore image processing\n", "\n", - "Image processing is fundamental to computer vision, forming the basis for interpreting and analyzing visual information. By applying techniques such as resizing, filtering, color adjustments, and data augmentation, image processing enhances input quality, minimizes noise, and corrects distortions. These methods can also simulate real-world variability, helping models generalize better. 
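A minimal sketch of the pickle-based loading used by **load_pickle_file**, demonstrated on a small stand-in object rather than the real *train_set.pkl*; the file name and contents here are hypothetical.

```python
import os
import pickle
import tempfile

# Mirrors the notebook's load_pickle_file helper: pickle requires binary mode.
def load_pickle_file(path):
    with open(path, "rb") as f:
        return pickle.load(f)

# Round-trip demo with a stand-in object instead of the CIFAR-10 arrays.
demo = {"images": [[0, 1], [2, 3]], "labels": [0, 1]}
path = os.path.join(tempfile.gettempdir(), "demo_set.pkl")
with open(path, "wb") as f:
    pickle.dump(demo, f)

loaded = load_pickle_file(path)
print(loaded["labels"])  # [0, 1]
```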
\n", + "Image processing is fundamental to computer vision, forming the basis for interpreting and analyzing visual information.\n", + "By applying techniques such as resizing, filtering, color adjustments, and data augmentation, image processing enhances input quality, minimizes noise, and corrects distortions.\n", + "These methods can also simulate real-world variability, helping models generalize better. \n", "\n", - "In this notebook, we explore three categories of image transformations: **geometric transformations**, **image filtering**, and **photometric transformations**. The following cells contain a series of exercicies designed to help you explore the OpenCV-Python library. If you are unfamiliar with a particular method, refer to the [Image Processing in OpenCV](https://docs.opencv.org/4.x/d2/d96/tutorial_py_table_of_contents_imgproc.html) documentation." + "In this notebook, we explore three categories of image transformations: **geometric transformations**, **image filtering**, and **photometric transformations**.\n", + "The following cells contain a series of exercises designed to help you explore the OpenCV-Python library.\n", + "If you are unfamiliar with a particular method, refer to the [Image Processing in OpenCV](https://docs.opencv.org/4.x/d2/d96/tutorial_py_table_of_contents_imgproc.html) documentation." ] }, { @@ -689,8 +714,11 @@ "source": [ "### Geometric transformation\n", "\n", - "Geometric transformations alter the spatial structure of the image while preserving its semantic content. They help the model become invariant to different orientations and scales:\n", - "- **Scaling**: Resizes the image to a specific size, often required to match input dimensions for image classifiers. 
It uses interpolation to obtain the new pixel-values.\n", + "Geometric transformations alter the spatial structure of the image while preserving its semantic content.\n", + "They help the model become invariant to different orientations and scales:\n", + "\n", + "- **Scaling**: Resizes the image to a specific size, often required to match input dimensions for image classifiers.\n", + " It uses interpolation to obtain the new pixel-values.\n", "- **Cropping**: Extracts a subregion of the image; useful for focusing on important parts or adding variability.\n", "- **Horizontal and vertical flip**: Flips the image along the x-axis or y-axis; helps the model learn symmetry.\n", "- **Rotation**: Rotates the image by a small angle to simulate different orientations of the objects." @@ -760,6 +788,7 @@ "outputs": [], "source": [ "%%ipytest\n", + "\n", "def solution_crop_image(img, x: int, y: int, width: int, height: int):\n", " # Start your code here\n", " return\n", @@ -797,6 +826,7 @@ "outputs": [], "source": [ "%%ipytest\n", + "\n", "def solution_horizontal_flip_image(img):\n", " # Start your code here\n", " return\n", @@ -834,6 +864,7 @@ "outputs": [], "source": [ "%%ipytest\n", + "\n", "def solution_vertical_flip_image(img):\n", " # Start your code here\n", " return\n", @@ -871,6 +902,7 @@ "outputs": [], "source": [ "%%ipytest\n", + "\n", "def solution_rotate_image(img, angle: float):\n", " # Start your code here\n", " return\n", @@ -899,7 +931,9 @@ "source": [ "### Image filtering\n", "\n", - "Filtering helps reduce noise and enhance specific image features. 
These are often used as a form of preprocessing before feeding images into a model:\n", + "Filtering helps reduce noise and enhance specific image features.\n", + "These are often used as a form of preprocessing before feeding images into a model:\n", + "\n", "- **Average filter**: Applies a smoothing effect by replacing each pixel with the average of its neighborhood.\n", "- **Median filter**: Reduces salt-and-pepper noise by replacing each pixel with the median of neighboring pixels.\n", "- **Gaussian filter**: Applies a Gaussian blur to smooth the image, often used to reduce high-frequency noise." @@ -1024,6 +1058,7 @@ "### Photometric transformation\n", "\n", "Photometric transformations modify the color properties of an image to simulate different lighting conditions and improve model robustness to brightness and contrast changes:\n", + "\n", "- **Brightness**: Randomly increases or decreases the brightness of the image.\n", "- **Contrast**: Alters the difference between light and dark regions in the image.\n", "- **Saturation**: Modifies the intensity of the colors in the image." @@ -1085,6 +1120,7 @@ "outputs": [], "source": [ "%%ipytest\n", + "\n", "def solution_adjust_contrast(img, contrast_value):\n", " # Start your code here\n", " return\n", @@ -1125,6 +1161,7 @@ "outputs": [], "source": [ "%%ipytest\n", + "\n", "def solution_adjust_saturation(img, saturation_factor):\n", " # Start your code here\n", " return\n", @@ -1156,18 +1193,15 @@ "source": [ "## Image classifier development using CNNs\n", "\n", - "Image classification is the task of assigning a label or category to an input image from a predefined set of classes. It is a fundamental problem in computer vision with widespread applications, including facial recognition, medical imaging, quality control, and autonomous driving. 
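The average filter from the filtering section above can be sketched by hand in NumPy; this mirrors what `cv2.blur(img, (3, 3))` computes at interior pixels, while the border handling that OpenCV adds via padding is deliberately skipped.

```python
import numpy as np

img = np.arange(16, dtype=float).reshape(4, 4)   # toy 4x4 grayscale "image"

# 3x3 averaging over the interior pixels only: each output pixel is the
# mean of its 3x3 neighbourhood in the input.
out = np.zeros((2, 2))
for i in range(1, 3):
    for j in range(1, 3):
        out[i - 1, j - 1] = img[i - 1:i + 2, j - 1:j + 2].mean()

print(out)  # interior means: [[5, 6], [9, 10]]
```

The median and Gaussian filters follow the same sliding-window pattern, replacing the mean with a median or a Gaussian-weighted sum.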
\n", - "\n", + "Image classification is the task of assigning a label or category to an input image from a predefined set of classes.\n", + "It is a fundamental problem in computer vision with widespread applications, including facial recognition, medical imaging, quality control, and autonomous driving. \n", "This section outlines the key steps involved in developing an image classification model using PyTorch:\n", "\n", "- It begins with data preprocessing, which includes splitting the dataset into training, validation, and test sets. \n", - "\n", "- Afterwards, it defines data augmentation strategies using the Albumentations library, loads the data as PyTorch datasets, and initialises PyTorch dataloaders to efficiently feed data during training. \n", - "\n", "- The next step is model training, where a CNN-based model is initialized and optimised using the training and validation data. \n", - "\n", - "- After training, the model is evaluated on the test set to assess its performance. The evaluation includes metrics such as accuracy and the confusion matrix, which help interpret the model's predictive behavior. \n", - "\n", + "- After training, the model is evaluated on the test set to assess its performance.\n", + " The evaluation includes metrics such as accuracy and the confusion matrix, which help interpret the model's predictive behavior. \n", "- Finally, the PyTorch Grad-CAM library is used to visualize the regions of input images that contribute most to the model’s decisions, providing insights into model explainability using representative examples." ] }, @@ -1186,7 +1220,8 @@ "source": [ "#### Train, validation, and test sets\n", "\n", - "```train_test_split``` from Scikit-learn can be used to split the original training set into training and validation sets. The test set is already defined by the dataset' authors." 
+ "```train_test_split``` from Scikit-learn can be used to split the original training set into training and validation sets.\n", + "The test set is already defined by the dataset's authors." ] }, { @@ -1211,9 +1246,11 @@ "source": [ "#### Data Augmentation\n", "\n", - "Data augmentation is a crucial technique in image classification that helps improve the performance and robustness of machine learning models. It involves generating new training samples by applying random transformations — such as rotation, flipping, cropping, scaling, or color jittering — to the original images. \n", + "Data augmentation is a crucial technique in image classification that helps improve the performance and robustness of machine learning models.\n", + "It involves generating new training samples by applying random transformations — such as rotation, flipping, cropping, scaling, or color jittering — to the original images. \n", "\n", - "Albumentations is one of the most widely used libraries for performing data augmentation in image classification tasks. It includes augmentation techniques that replicate operations commonly used in image processing, such as:\n", + "Albumentations is one of the most widely used libraries for performing data augmentation in image classification tasks.\n", + "It includes augmentation techniques that replicate operations commonly used in image processing, such as:\n", "\n", "- ```A.Affine``` for scaling;\n", "\n", @@ -1226,8 +1263,8 @@ "- ```A.ColorJitter``` for color jittering.\n", "\n", "Albumentations can also be used for image normalization (```A.Normalize```), resizing (```A.Resize```), and converting images to PyTorch tensors with the (Channel, Height, Width) format using ```A.ToTensorV2```, which is required for model training.\n", - "\n", "Apply the following transformations only to the training set, as the validation set should remain as close as possible to the test set. 
Therefore, no transformations should be applied to it.\n", + "\n", "```python\n", "A.Affine(scale = (0.2, 1.5), p = 0.1),\n", "A.Rotate(limit = 45, p = 0.1),\n", @@ -1291,7 +1328,13 @@ "source": [ "#### PyTorch Dataloaders\n", "\n", - "```DataLoader``` is essential for training efficiency and performance. It abstracts the complexity of batching, shuffling, and parallel data access, allowing you to focus on building and training your models. ```batch_size``` specifies the number of samples processed in parallel during each training iteration. It is typically treated as a hyperparameter, as its optimal value depends on hardware constraints (e.g., GPU memory) and its interaction with training dynamics. Notably, it is often linearly related with the learning rate. Larger batch sizes generally require proportionally larger learning rates to maintain stable and efficient convergence. ```shuffle``` controls whether the dataset is randomly permuted at the start of each epoch. Enabling ```shuffle = True``` is typically beneficial, as it helps prevent the model from learning misleading patterns due to class-wise ordering in the dataset, which could hinder generalization and convergence." + "```DataLoader``` is essential for training efficiency and performance.\n", + "It abstracts the complexity of batching, shuffling, and parallel data access, allowing you to focus on building and training your models.\n", + "```batch_size``` specifies the number of samples processed in parallel during each training iteration.\n", + "It is typically treated as a hyperparameter, as its optimal value depends on hardware constraints (e.g., GPU memory) and its interaction with training dynamics.\n", + "Notably, it is often linearly related with the learning rate. 
Larger batch sizes generally require proportionally larger learning rates to maintain stable and efficient convergence.\n", + "```shuffle``` controls whether the dataset is randomly permuted at the start of each epoch.\n", + "Enabling ```shuffle = True``` is typically beneficial, as it helps prevent the model from learning misleading patterns due to class-wise ordering in the dataset, which could hinder generalization and convergence." ] }, { @@ -1317,17 +1360,22 @@ "\n", "Model training comprises a series of steps:\n", "\n", - "- First, we must check which devices are available for training the model. In case a GPU with Cuda cores is available is should be used as it really improves the speed. Otherwise, lets use CPU. \n", - "\n", - "- Then, model and training hyperparameters should be defined, such as numer of output classes, number of training epochs, number of consecutive not improving epochs needed for stopping the training in case we use early stopping regularisation, and learning rate. Other hyperparameters can be defined, it depends on what the user wants to do during the training. In this notebook we are going to define the number of epochs, which are the number of times the model is going to see the training set. Early stopping is a way of trying to avoid overfitting where the model evaluates the model every new epoch using a validation set. In case the loss obtained for the validation set does not decrease for a long period of time (pre-defined epochs), the model optimisation stops and retrieves the checkpoint where the validation loss got the last decrease (see [Early Stopping](https://paperswithcode.com/method/early-stopping)). Learning rate defined how quick the models weights should change during training. If it is too high the weights are going to change really quick and might miss minima because they are always jumping from one side to another side. If it is too small the model weights might get stuck a local minimum. 
So although this is not done in this notebook, this parameter should be studied in order to choose the best (see [What is learning rate in machine learning?](https://www.ibm.com/think/topics/learning-rate)). \n", - "\n", - "- After defining the hyperparameters, we should define the loss function that is going to be used to evaluate the model and it should be sent to the hardware used for training. \n", - "\n", - "- Afterwards, the model is defined using ```ImageClassifier``` class and is sent to the device used for training. \n", - "\n", - "- Next, we should define the optimiser function and also send it to the device used for training. \n", - "\n", - "- Afterwards, we train the model in case some optimised weights are not available and we explore the learning curves." + "1. First, we must check which devices are available for training the model.\n", + " If a GPU with CUDA cores is available, it should be used, as it significantly improves training speed.\n", + " Otherwise, we use the CPU. \n", + "1. 
Then, model and training hyperparameters should be defined, such as the number of output classes, the number of training epochs, the number of consecutive non-improving epochs allowed before stopping training when early stopping regularisation is used, and the learning rate.\n", + " Other hyperparameters can also be defined, depending on what the user wants to do during training.\n", + " In this notebook we define the number of epochs, which is the number of times the model sees the training set.\n", + " Early stopping is a way of avoiding overfitting in which the model is evaluated on a validation set after every epoch.\n", + " If the validation loss does not decrease for a pre-defined number of epochs, optimisation stops and the checkpoint with the best validation loss is restored (see [Early Stopping](https://paperswithcode.com/method/early-stopping)).\n", + " The learning rate defines how quickly the model's weights change during training.\n", + " If it is too high, the weights change very quickly and might miss minima because they keep jumping from one side to the other.\n", + " If it is too small, the model weights might get stuck in a local minimum.\n", + " Although this is not done in this notebook, this parameter should be tuned to find the best value (see [What is learning rate in machine learning?](https://www.ibm.com/think/topics/learning-rate)). \n", + "1. After defining the hyperparameters, we should define the loss function used to evaluate the model and send it to the hardware used for training. \n", + "1. Afterwards, the model is defined using the ```ImageClassifier``` class and is sent to the device used for training.\n", + "1. Next, we should define the optimiser function and also send it to the device used for training.\n", + "1. 
Finally, we train the model if no optimised weights are already available, and we explore the learning curves." ] }, { @@ -1337,23 +1385,33 @@ "source": [ "### Model Training Overview\n", "\n", - "Model training involves a sequence of key steps. The first step is to check which computational devices are available. If a GPU with CUDA cores is accessible, it should be used, as it significantly accelerates training (```DEVICE = torch.device(\"cuda:0\" if torch.cuda.is_available() else \"cpu\")```). Otherwise, the model will be trained on the CPU. Next, we define the model and training hyperparameters. These typically include:\n", + "Model training involves a sequence of key steps.\n", + "The first step is to check which computational devices are available.\n", + "If a GPU with CUDA cores is accessible, it should be used, as it significantly accelerates training (```DEVICE = torch.device(\"cuda:0\" if torch.cuda.is_available() else \"cpu\")```).\n", + "Otherwise, the model will be trained on the CPU. Next, we define the model and training hyperparameters.\n", + "These typically include:\n", "\n", "- The number of output classes (```NUMBER_CLASSES = len(CIFAR_10_CLASSES)```);\n", - "\n", "- The number of training epochs (i.e., how many times the model sees the full training set) (```EPOCHS = 500```);\n", - "\n", "- The patience for early stopping (i.e., how many consecutive epochs without improvement are allowed before stopping training) (```EARLY_STOPPING_LIMIT = EPOCHS // 10```);\n", - "\n", "- The learning rate (```LR = 0.001```).\n", "\n", "Additional hyperparameters may also be configured depending on the training strategy or specific use case.\n", "\n", - "In this notebook, we focus on setting the number of training epochs. We also discuss **early stopping**, a regularization technique used to prevent overfitting. During training, the model's performance is evaluated on a validation set at the end of each epoch. 
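The learning-rate trade-off described above can be illustrated with gradient descent on a one-dimensional function; the function and the two rates below are made-up values chosen to show convergence versus divergence.

```python
# Gradient descent on f(w) = w**2, whose gradient is 2*w.
def descend(lr, steps=20, w=1.0):
    for _ in range(steps):
        w -= lr * 2 * w  # standard update: w <- w - lr * grad
    return w

print(abs(descend(0.1)))   # small rate: |w| shrinks towards the minimum at 0
print(abs(descend(1.1)))   # too-large rate: the update overshoots and |w| grows
```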
If the validation loss does not improve after a predefined number of epochs, training is stopped, and the model reverts to the best-performing checkpoint (see [Early Stopping](https://paperswithcode.com/method/early-stopping)). The **learning rate** controls how quickly the model updates its weights during training. If it's too high, the model may overshoot optimal loss values, leading to instability. If it's too low, the model may converge very slowly or get stuck in a local minimum. Although learning rate tuning is not performed in this notebook, it is an essential hyperparameter that should be carefully selected (see [What is learning rate in machine learning?](https://www.ibm.com/think/topics/learning-rate)).\n", + "In this notebook, we focus on setting the number of training epochs. We also discuss **early stopping**, a regularization technique used to prevent overfitting.\n", + "During training, the model's performance is evaluated on a validation set at the end of each epoch.\n", + "If the validation loss does not improve after a predefined number of epochs, training is stopped, and the model reverts to the best-performing checkpoint (see [Early Stopping](https://paperswithcode.com/method/early-stopping)).\n", + "The **learning rate** controls how quickly the model updates its weights during training.\n", + "If it's too high, the model may overshoot optimal loss values, leading to instability.\n", + "If it's too low, the model may converge very slowly or get stuck in a local minimum.\n", + "Although learning rate tuning is not performed in this notebook, it is an essential hyperparameter that should be carefully selected (see [What is learning rate in machine learning?](https://www.ibm.com/think/topics/learning-rate)).\n", "\n", - "After setting the hyperparameters, we define the **loss function** used to evaluate model performance (```criterion = nn.CrossEntropyLoss()```). 
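The cross-entropy loss that `nn.CrossEntropyLoss` computes from raw logits can be sketched in NumPy: a log-softmax followed by the negative log-probability of the true class. The logits below are made-up scores for three classes.

```python
import numpy as np

def cross_entropy(logits, target):
    shifted = shifted = logits - logits.max()            # numerical stability
    log_probs = shifted - np.log(np.exp(shifted).sum())  # log-softmax
    return -log_probs[target]

logits = np.array([2.0, 1.0, 0.1])
print(cross_entropy(logits, 0))  # small loss: class 0 has the top score
print(cross_entropy(logits, 2))  # larger loss: class 2 has the lowest score
```

In PyTorch the same computation runs over a whole batch of logits at once, averaging the per-sample losses.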
Both the model and loss function should be moved to the selected training device (```criterion = criterion.to(DEVICE)```). The model is then instantiated using the `ImageClassifier` class (```model = ImageClassifier(in_channels = 3, out_classes = NUMBER_CLASSES)```) and transferred to the training device (```model = model.to(DEVICE)```). The **optimizer** is also defined and configured on the same device (```optimizer = optim.Adam(model.parameters(), lr = LR)```).\n", + "After setting the hyperparameters, we define the **loss function** used to evaluate model performance (```criterion = nn.CrossEntropyLoss()```).\n", + "Both the model and loss function should be moved to the selected training device (```criterion = criterion.to(DEVICE)```).\n", + "The model is then instantiated using the `ImageClassifier` class (```model = ImageClassifier(in_channels = 3, out_classes = NUMBER_CLASSES)```) and transferred to the training device (```model = model.to(DEVICE)```).\n", + "The **optimizer** is also defined and configured on the same device (```optimizer = optim.Adam(model.parameters(), lr = LR)```).\n", "\n", - "Finally, if no pre-trained weights are available, the training process begins, and we monitor the learning curves to assess the model’s performance over time.\n" + "Finally, if no pre-trained weights are available, the training process begins, and we monitor the learning curves to assess the model’s performance over time." ] }, { @@ -1510,7 +1568,9 @@ "source": [ "#### Train model\n", "\n", - "Here, we train the model. First, we initialise the class ```Trainer``` which we are going to use for training and evaluating the model using the PyTorch ```Dataset```s defined before (```trainer = Trainer(model)```). In case, some model weights are already available, we can skip the training and using them." 
+ "Here, we train the model.\n", + "First, we initialise the class ```Trainer```, which we are going to use for training and evaluating the model using the PyTorch ```Dataset```s defined before (```trainer = Trainer(model)```).\n", + "If some model weights are already available, we can skip the training and use them." ] }, { @@ -1528,9 +1588,15 @@ "id": "86", "metadata": {}, "source": [ - "After initialising the trainer instance, check whether a trained model already exists. If so, load the weights using ```model_weights = torch.load(model_path, weights_only=True)```. Then, load the weights into the model using (```model.load_state_dict(model_weights)```). Finally, set the model to evaluation model (```model.eval()```). This step is essential because certain layers, such as batch normalization and dropout, behave differently during training and evaluation. Setting the model to evaluation mode ensures they operate correctly during validation or testing. \n", + "After initialising the trainer instance, check whether a trained model already exists.\n", + "If so, load the weights using ```model_weights = torch.load(model_path, weights_only=True)```.\n", + "Then, load the weights into the model using ```model.load_state_dict(model_weights)```.\n", + "Finally, set the model to evaluation mode (```model.eval()```).\n", + "This step is essential because certain layers, such as batch normalization and dropout, behave differently during training and evaluation.\n", + "Setting the model to evaluation mode ensures they operate correctly during validation or testing. \n", "\n", 
+ "If no pre-trained model is available, train a new model using the training and validation sets along with the predefined hyperparameters (```trainer.fit(EPOCHS, train_dataloader, val_dataloader, optimizer, criterion, DEVICE, EARLY_STOPPING_LIMIT)```).\n", + "After training, save the best model weights (```torch.save(trainer.model.best_model_weights, model_path)```)." ] }, { @@ -1556,9 +1622,14 @@ "source": [ "#### Learning curves\n", "\n", - "After training the model, we can analyse the learning curves to assess the training process. These curves, which typically display the loss over epochs for both the training and validation sets, are crucial for improving model performance. They can help identify issues like overfitting or underfitting. Overfitting occurs when the model performs well on the training data but poorly on the validation data, usually indicated by a widening gap between the two curves. Underfitting, on the other hand, is suggested when both the training and validation curves show poor performance and fail to improve. By monitoring these curves, we can adjust hyperparameters or modify the model architecture to address such issues. \n", + "After training the model, we can analyse the learning curves to assess the training process.\n", + "These curves, which typically display the loss over epochs for both the training and validation sets, are crucial for improving model performance.\n", + "They can help identify issues like overfitting or underfitting.\n", + "Overfitting occurs when the model performs well on the training data but poorly on the validation data, usually indicated by a widening gap between the two curves.\n", + "Underfitting, on the other hand, is suggested when both the training and validation curves show poor performance and fail to improve. By monitoring these curves, we can adjust hyperparameters or modify the model architecture to address such issues. 
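Reading the training log produced during `fit` can be sketched with the standard-library `csv` module; the log content below is made up, using the same header the notebook writes to *training_log.txt*.

```python
import csv
import io

# Made-up three-epoch log with the header written by fit()
log_text = """Epoch,Train Loss,Val Loss,Best Val Loss,Best Epoch
0,1.9,1.7,1.7,0
1,1.5,1.4,1.4,1
2,1.2,1.5,1.4,1
"""

rows = list(csv.DictReader(io.StringIO(log_text)))
val_losses = [float(row["Val Loss"]) for row in rows]
best_epoch = min(range(len(val_losses)), key=val_losses.__getitem__)

print(best_epoch)  # 1: the epoch with the lowest validation loss
```

Plotting `Train Loss` and `Val Loss` against `Epoch` from such a table gives exactly the learning curves discussed above.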
\n", "\n", - "First, load the log file using ```pandas``` (```training_log = pd.read_csv(\"training_log.txt\")```). Then, use the ```matplotlib``` library to plot the learning curves." + "First, load the log file using ```pandas``` (```training_log = pd.read_csv(\"training_log.txt\")```).\n", + "Then, use the ```matplotlib``` library to plot the learning curves." ] }, { @@ -1588,6 +1659,7 @@ "### Model testing\n", "\n", "Once the model has been trained and optimized, we can evaluate its performance using the ```Trainer``` class by calling the ```predict``` method with the test dataloader and the device:\n", + "\n", "```python\n", "original_images, true_labels, predicted_labels = trainer.predict(test_dataloader, DEVICE)\n", "```\n", @@ -1595,9 +1667,7 @@ "This method returns three NumPy arrays:\n", "\n", "- ```original_images```: the input images from the test set\n", - "\n", "- ```true_labels```: the corresponding ground truth labels\n", - "\n", "- ```predicted_labels```: the model's predicted classes" ] }, @@ -1618,7 +1688,9 @@ "source": [ "### Explore results\n", "\n", - "To evaluate the results, we display the model's accuracy along with the confusion matrix. The confusion matrix is a powerful evaluation tool that helps us understand the model’s performance across multiple classes. It maps the relationship between true and predicted labels, showing the number of instances for each possible prediction-outcome pair." + "To evaluate the results, we display the model's accuracy along with the confusion matrix.\n", + "The confusion matrix is a powerful evaluation tool that helps us understand the model’s performance across multiple classes.\n", + "It maps the relationship between true and predicted labels, showing the number of instances for each possible prediction-outcome pair." 
] }, { @@ -1651,6 +1723,7 @@ "#### Compute confusion matrix\n", "\n", "To compute the confusion matrix, use ```confusion_matrix``` from scikit-learn library:\n", + "\n", "```python\n", "cm = confusion_matrix(true_labels, predicted_labels)\n", "```" @@ -1695,7 +1768,10 @@ "source": [ "### Explain image classifier predictions\n", "\n", - "Deep neural networks are often described as \"black boxes\" because their decision-making processes are difficult to understand and interpret. To address this, researchers have developed various methods to make these models more explainable. One such method is Grad-CAM (Gradient-weighted Class Activation Mapping). Grad-CAM computes the gradients of a target class with respect to the final convolutional layers and generates a heatmap that highlights the regions of the input image most influential in the model’s prediction for that class." + "Deep neural networks are often described as \"black boxes\" because their decision-making processes are difficult to understand and interpret.\n", + "To address this, researchers have developed various methods to make these models more explainable.\n", + "One such method is Grad-CAM (Gradient-weighted Class Activation Mapping).\n", + "Grad-CAM computes the gradients of a target class with respect to the final convolutional layers and generates a heatmap that highlights the regions of the input image most influential in the model’s prediction for that class." ] }, { @@ -1707,11 +1783,10 @@ "\n", "To prepare the image for Grad-CAM visualization:\n", "\n", - "- First, convert it to (Height, Width, Channels) format using ```img_np = np.transpose(img, (1, 2, 0)) # shape: (H, W, C)```, and normalize its values to the [0, 1] range with ```img_np = (img_np - img_np.min()) / (img_np.max() - img_np.min())```. 
This processed image is used only for visualization, as expected by the PyTorch-GradCAM library;\n", - "\n", - "- Next, modify the original image for model inference by adding a batch dimension: ```img = np.expand_dims(img, axis=0)```, then convert it to a PyTorch tensor: ```img = torch.from_numpy(img)```, and move it to the appropriate computation device using ```img = img.to(DEVICE)```;\n", - "\n", - "- Finally, retrieve the predicted and true labels, as both are required for computing and visualizing the Grad-CAM output.\n" + "- First, convert it to (Height, Width, Channels) format using ```img_np = np.transpose(img, (1, 2, 0)) # shape: (H, W, C)```, and normalize its values to the [0, 1] range with ```img_np = (img_np - img_np.min()) / (img_np.max() - img_np.min())```.\n", + " This processed image is used only for visualization, as expected by the PyTorch-GradCAM library.\n", + "- Next, modify the original image for model inference by adding a batch dimension: ```img = np.expand_dims(img, axis=0)```, then convert it to a PyTorch tensor: ```img = torch.from_numpy(img)```, and move it to the appropriate computation device using ```img = img.to(DEVICE)```.\n", + "- Finally, retrieve the predicted and true labels, as both are required for computing and visualizing the Grad-CAM output." ] }, { @@ -1737,12 +1812,11 @@ "\n", "To compute the Grad-CAM heatmap:\n", "\n", - "- First, ensure that the `requires_grad` attribute of the input image tensor is set to `True` by using `img.requires_grad = True`. This enables gradient computation with respect to the image, which is necessary for generating class activation maps;\n", - "\n", - "- Next, specify the layer to inspect using ```target_layers = [model.conv3]```. Typically, the last convolutional layer of the image classifier is chosen because it preserves spatial information, which is crucial for identifying the regions of the input image that most strongly influence the model's prediction. 
\n", - "\n", - "- Then, define the target class to be explained with ```targets = [ClassifierOutputTarget(pred_label)]```, where ```pred_label``` is the class index corresponding to the model’s predicted output (or any other class of interest). \n", - "\n", + "- First, ensure that the `requires_grad` attribute of the input image tensor is set to `True` by using `img.requires_grad = True`.\n", + " This enables gradient computation with respect to the image, which is necessary for generating class activation maps.\n", + "- Next, specify the layer to inspect using ```target_layers = [model.conv3]```.\n", + "- Typically, the last convolutional layer of the image classifier is chosen because it preserves spatial information, which is crucial for identifying the regions of the input image that most strongly influence the model's prediction. \n", + "- Then, define the target class to be explained with ```targets = [ClassifierOutputTarget(pred_label)]```, where ```pred_label``` is the class index corresponding to the model’s predicted output (or any other class of interest).\n", "- Finally, compute the Grad-CAM heatmap using the activations and gradients from the selected layer:\n", "```python\n", "# Create CAM object\n", @@ -1775,7 +1849,8 @@ "source": [ "#### Visualise Grad-CAM heatmap with the image\n", "\n", - "After obtaining the Grad-CAM heatmap, we overlay it on the input image to visualise the regions that contributed most to the model’s prediction (```visualisation = show_cam_on_image(img_np, grad_cam_matrix, use_rgb=True)```). This helps identify which pixels the model focused on when predicting the class." + "After obtaining the Grad-CAM heatmap, we overlay it on the input image to visualise the regions that contributed most to the model’s prediction (```visualisation = show_cam_on_image(img_np, grad_cam_matrix, use_rgb=True)```).\n", + "This helps identify which pixels the model focused on when predicting the class." 
] }, { @@ -1797,7 +1872,7 @@ ], "metadata": { "kernelspec": { - "display_name": "python-tutorial", + "display_name": "Python 3 (ipykernel)", "language": "python", "name": "python3" }, @@ -1811,7 +1886,7 @@ "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", - "version": "3.10.17" + "version": "3.12.10" } }, "nbformat": 4, From d0902b0ce54d66f399977e32328194b5fa7f9856 Mon Sep 17 00:00:00 2001 From: Fabio Lopes Date: Sun, 11 May 2025 12:29:35 +0200 Subject: [PATCH 09/10] Spread the library imports over the notebook instead of having all of them at the beginning --- 25_image_classification.ipynb | 394 +++++++++++++++++++++++----------- 1 file changed, 264 insertions(+), 130 deletions(-) diff --git a/25_image_classification.ipynb b/25_image_classification.ipynb index b4a46b8a..3c93d051 100644 --- a/25_image_classification.ipynb +++ b/25_image_classification.ipynb @@ -98,40 +98,9 @@ "- PyTorch Grad-CAM" ] }, - { - "cell_type": "code", - "execution_count": null, - "id": "5", - "metadata": {}, - "outputs": [], - "source": [ - "# Important modules\n", - "import pytest\n", - "import os\n", - "import pickle\n", - "import numpy as np\n", - "import pandas as pd\n", - "from matplotlib import pyplot as plt\n", - "\n", - "from sklearn.model_selection import train_test_split\n", - "from sklearn.metrics import confusion_matrix\n", - "\n", - "import cv2\n", - "\n", - "import torch\n", - "import torch.nn as nn\n", - "import torch.optim as optim\n", - "from torch.utils.data import Dataset, DataLoader\n", - "import albumentations as A\n", - "\n", - "from pytorch_grad_cam import GradCAM\n", - "from pytorch_grad_cam.utils.model_targets import ClassifierOutputTarget\n", - "from pytorch_grad_cam.utils.image import show_cam_on_image" - ] - }, { "cell_type": "markdown", - "id": "6", + "id": "5", "metadata": {}, "source": [ "## Introduction\n", @@ -145,7 +114,7 @@ }, { "cell_type": "markdown", - "id": "7", + "id": "6", "metadata": {}, "source": [ "## Classes\n", @@ 
-161,7 +130,7 @@ }, { "cell_type": "markdown", - "id": "8", + "id": "7", "metadata": {}, "source": [ "The classes are currently not complete. Use the following code to prepare them:\n", @@ -200,10 +169,13 @@ { "cell_type": "code", "execution_count": null, - "id": "9", + "id": "8", "metadata": {}, "outputs": [], "source": [ + "from torch.utils.data import Dataset\n", + "import numpy as np\n", + "\n", "class ImageDataset(Dataset):\n", " def __init__(self, images, labels, transform=None):\n", " pass\n", @@ -217,7 +189,7 @@ }, { "cell_type": "markdown", - "id": "10", + "id": "9", "metadata": {}, "source": [ "```ImageClassifier```: \n", @@ -300,7 +272,7 @@ }, { "cell_type": "markdown", - "id": "11", + "id": "10", "metadata": {}, "source": [ "`forward` function:\n", @@ -337,10 +309,12 @@ { "cell_type": "code", "execution_count": null, - "id": "12", + "id": "11", "metadata": {}, "outputs": [], "source": [ + "import torch.nn as nn\n", + "\n", "class ImageClassifier(nn.Module):\n", " def __init__(self, in_channels = 1, out_classes = 1):\n", " super(ImageClassifier, self).__init__()\n", @@ -351,7 +325,7 @@ }, { "cell_type": "markdown", - "id": "13", + "id": "12", "metadata": {}, "source": [ "```Trainer```: \n", @@ -376,7 +350,7 @@ }, { "cell_type": "markdown", - "id": "14", + "id": "13", "metadata": {}, "source": [ "`fit` function:\n", @@ -493,7 +467,7 @@ }, { "cell_type": "markdown", - "id": "15", + "id": "14", "metadata": {}, "source": [ "`predict` function:\n", @@ -552,10 +526,12 @@ { "cell_type": "code", "execution_count": null, - "id": "16", + "id": "15", "metadata": {}, "outputs": [], "source": [ + "import numpy as np\n", + "\n", "class Trainer():\n", " def __init__(self, model):\n", " pass\n", @@ -569,7 +545,7 @@ }, { "cell_type": "markdown", - "id": "17", + "id": "16", "metadata": {}, "source": [ "## Functions\n", @@ -580,10 +556,13 @@ { "cell_type": "code", "execution_count": null, - "id": "18", + "id": "17", "metadata": {}, "outputs": [], "source": [ + "from 
matplotlib import pyplot as plt\n", + "import pickle\n", + "\n", "def load_pickle_file(filepath):\n", " with open(filepath, \"rb\") as f:\n", " return pickle.load(f)\n", @@ -604,7 +583,7 @@ }, { "cell_type": "markdown", - "id": "19", + "id": "18", "metadata": {}, "source": [ "## Dataset\n", @@ -614,7 +593,7 @@ }, { "cell_type": "markdown", - "id": "20", + "id": "19", "metadata": {}, "source": [ "### Load data\n", @@ -625,10 +604,12 @@ { "cell_type": "code", "execution_count": null, - "id": "21", + "id": "20", "metadata": {}, "outputs": [], "source": [ + "import os\n", + "\n", "# Sets filepaths\n", "dataset_folder = os.path.join(\"data/CIFAR10\")\n", "train_set_file = os.path.join(dataset_folder, \"train_set.pkl\")\n", @@ -647,19 +628,21 @@ }, { "cell_type": "markdown", - "id": "22", + "id": "21", "metadata": {}, "source": [ "## Explore image processing\n", "\n", "Image processing is fundamental to computer vision, forming the basis for interpreting and analyzing visual information. By applying techniques such as resizing, filtering, color adjustments, and data augmentation, image processing enhances input quality, minimizes noise, and corrects distortions. These methods can also simulate real-world variability, helping models generalize better. \n", "\n", - "In this notebook, we explore three categories of image transformations: **geometric transformations**, **image filtering**, and **photometric transformations**. The following cells contain a series of exercicies designed to help you explore the OpenCV-Python library. If you are unfamiliar with a particular method, refer to the [Image Processing in OpenCV](https://docs.opencv.org/4.x/d2/d96/tutorial_py_table_of_contents_imgproc.html) documentation." + "In this notebook, we explore three categories of image transformations: **geometric transformations**, **image filtering**, and **photometric transformations**. 
The following cells contain a series of exercises designed to help you explore the OpenCV-Python library. \n", + "\n", + "If you are unfamiliar with a particular method, refer to the [Image Processing in OpenCV](https://docs.opencv.org/4.x/d2/d96/tutorial_py_table_of_contents_imgproc.html) documentation. There you can find the description of the functions needed for [Geometric transformations](https://docs.opencv.org/4.x/da/d6e/tutorial_py_geometric_transformations.html) and [image filtering](https://docs.opencv.org/4.x/d4/d13/tutorial_py_filtering.html). Regarding photometric transformations, the OpenCV documentation does not have a dedicated page. To adjust brightness and contrast, you can read [Changing the contrast and brightness of an image!](https://docs.opencv.org/4.x/d3/dc1/tutorial_basic_linear_transform.html). To adjust saturation, first convert the image to the HSV color space using [`cv2.cvtColor`](https://docs.opencv.org/4.x/d8/d01/group__imgproc__color__conversions.html#gaf86c09fe702ed037c03c2bc603ceab14). Then, split the image into Hue, Saturation, and Value channels with [`cv2.split`](https://docs.opencv.org/4.x/df/df2/group__core__hal__interface__split.html). Modify the Saturation channel as needed, merge the channels back together using [`cv2.merge`](https://docs.opencv.org/4.x/df/d2e/group__core__hal__interface__merge.html), and finally convert the image back to the RGB color space." 
] }, { "cell_type": "markdown", - "id": "23", + "id": "22", "metadata": {}, "source": [ "### Example image" @@ -668,10 +651,12 @@ { "cell_type": "code", "execution_count": null, - "id": "24", + "id": "23", "metadata": {}, "outputs": [], "source": [ + "import numpy as np\n", + "\n", "# Select image\n", "image = train_set[0][9]\n", "\n", @@ -684,10 +669,10 @@ }, { "cell_type": "markdown", - "id": "25", + "id": "24", "metadata": {}, "source": [ - "### Geometric transformation\n", + "### Geometric transformations\n", "\n", "Geometric transformations alter the spatial structure of the image while preserving its semantic content. They help the model become invariant to different orientations and scales:\n", "- **Scaling**: Resizes the image to a specific size, often required to match input dimensions for image classifiers. It uses interpolation to obtain the new pixel-values.\n", @@ -698,7 +683,7 @@ }, { "cell_type": "markdown", - "id": "26", + "id": "25", "metadata": {}, "source": [ "#### Scaling" @@ -707,7 +692,7 @@ { "cell_type": "code", "execution_count": null, - "id": "27", + "id": "26", "metadata": {}, "outputs": [], "source": [ @@ -717,12 +702,13 @@ { "cell_type": "code", "execution_count": null, - "id": "28", + "id": "27", "metadata": {}, "outputs": [], "source": [ "%%ipytest\n", "\n", + "import cv2\n", "def solution_scale_image(img, scale_factor: float):\n", " # Start your code here\n", " return\n", @@ -732,7 +718,7 @@ { "cell_type": "code", "execution_count": null, - "id": "29", + "id": "28", "metadata": {}, "outputs": [], "source": [ @@ -746,12 +732,22 @@ }, { "cell_type": "markdown", - "id": "30", + "id": "29", "metadata": {}, "source": [ "#### Cropping" ] }, + { + "cell_type": "code", + "execution_count": null, + "id": "30", + "metadata": {}, + "outputs": [], + "source": [ + "%reload_ext tutorial.tests.testsuite" + ] + }, { "cell_type": "code", "execution_count": null, @@ -760,6 +756,8 @@ "outputs": [], "source": [ "%%ipytest\n", + "\n", + "import cv2\n", 
"def solution_crop_image(img, x: int, y: int, width: int, height: int):\n", " # Start your code here\n", " return\n", @@ -795,8 +793,20 @@ "id": "34", "metadata": {}, "outputs": [], + "source": [ + "%reload_ext tutorial.tests.testsuite" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "35", + "metadata": {}, + "outputs": [], "source": [ "%%ipytest\n", + "\n", + "import cv2\n", "def solution_horizontal_flip_image(img):\n", " # Start your code here\n", " return\n", @@ -806,7 +816,7 @@ { "cell_type": "code", "execution_count": null, - "id": "35", + "id": "36", "metadata": {}, "outputs": [], "source": [ @@ -820,7 +830,7 @@ }, { "cell_type": "markdown", - "id": "36", + "id": "37", "metadata": {}, "source": [ "#### Vertical Flip" @@ -829,11 +839,23 @@ { "cell_type": "code", "execution_count": null, - "id": "37", + "id": "38", + "metadata": {}, + "outputs": [], + "source": [ + "%reload_ext tutorial.tests.testsuite" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "39", "metadata": {}, "outputs": [], "source": [ "%%ipytest\n", + "\n", + "import cv2\n", "def solution_vertical_flip_image(img):\n", " # Start your code here\n", " return\n", @@ -843,7 +865,7 @@ { "cell_type": "code", "execution_count": null, - "id": "38", + "id": "40", "metadata": {}, "outputs": [], "source": [ @@ -857,7 +879,7 @@ }, { "cell_type": "markdown", - "id": "39", + "id": "41", "metadata": {}, "source": [ "#### Rotation" @@ -866,11 +888,23 @@ { "cell_type": "code", "execution_count": null, - "id": "40", + "id": "42", + "metadata": {}, + "outputs": [], + "source": [ + "%reload_ext tutorial.tests.testsuite" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "43", "metadata": {}, "outputs": [], "source": [ "%%ipytest\n", + "\n", + "import cv2\n", "def solution_rotate_image(img, angle: float):\n", " # Start your code here\n", " return\n", @@ -880,7 +914,7 @@ { "cell_type": "code", "execution_count": null, - "id": "41", + "id": "44", 
"metadata": {}, "outputs": [], "source": [ @@ -894,7 +928,7 @@ }, { "cell_type": "markdown", - "id": "42", + "id": "45", "metadata": {}, "source": [ "### Image filtering\n", @@ -907,7 +941,7 @@ }, { "cell_type": "markdown", - "id": "43", + "id": "46", "metadata": {}, "source": [ "#### Average filter " @@ -916,11 +950,23 @@ { "cell_type": "code", "execution_count": null, - "id": "44", + "id": "47", + "metadata": {}, + "outputs": [], + "source": [ + "%reload_ext tutorial.tests.testsuite" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "48", "metadata": {}, "outputs": [], "source": [ "%%ipytest\n", + "\n", + "import cv2\n", "def solution_average_filter(img, kernel_size = (5, 5)):\n", " # Start your code here\n", " return\n", @@ -930,7 +976,7 @@ { "cell_type": "code", "execution_count": null, - "id": "45", + "id": "49", "metadata": {}, "outputs": [], "source": [ @@ -944,7 +990,7 @@ }, { "cell_type": "markdown", - "id": "46", + "id": "50", "metadata": {}, "source": [ "#### Median filter" @@ -953,11 +999,23 @@ { "cell_type": "code", "execution_count": null, - "id": "47", + "id": "51", + "metadata": {}, + "outputs": [], + "source": [ + "%reload_ext tutorial.tests.testsuite" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "52", "metadata": {}, "outputs": [], "source": [ "%%ipytest\n", + "\n", + "import cv2\n", "def solution_median_filter(img, ksize):\n", " # Start your code here\n", " return\n", @@ -967,7 +1025,7 @@ { "cell_type": "code", "execution_count": null, - "id": "48", + "id": "53", "metadata": {}, "outputs": [], "source": [ @@ -981,7 +1039,7 @@ }, { "cell_type": "markdown", - "id": "49", + "id": "54", "metadata": {}, "source": [ "#### Gaussian filter" @@ -990,11 +1048,23 @@ { "cell_type": "code", "execution_count": null, - "id": "50", + "id": "55", + "metadata": {}, + "outputs": [], + "source": [ + "%reload_ext tutorial.tests.testsuite" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "56", 
"metadata": {}, "outputs": [], "source": [ "%%ipytest\n", + "\n", + "import cv2\n", "def solution_gaussian_filter(img, kernel_size = (5, 5), sigma = 0):\n", " # Start your code here\n", " return\n", @@ -1004,7 +1074,7 @@ { "cell_type": "code", "execution_count": null, - "id": "51", + "id": "57", "metadata": {}, "outputs": [], "source": [ @@ -1018,10 +1088,10 @@ }, { "cell_type": "markdown", - "id": "52", + "id": "58", "metadata": {}, "source": [ - "### Photometric transformation\n", + "### Photometric transformations\n", "\n", "Photometric transformations modify the color properties of an image to simulate different lighting conditions and improve model robustness to brightness and contrast changes:\n", "- **Brightness**: Randomly increases or decreases the brightness of the image.\n", @@ -1031,7 +1101,7 @@ }, { "cell_type": "markdown", - "id": "53", + "id": "59", "metadata": {}, "source": [ "#### Adjust brightness" @@ -1040,11 +1110,23 @@ { "cell_type": "code", "execution_count": null, - "id": "54", + "id": "60", + "metadata": {}, + "outputs": [], + "source": [ + "%reload_ext tutorial.tests.testsuite" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "61", "metadata": {}, "outputs": [], "source": [ "%%ipytest\n", + "\n", + "import cv2\n", "def solution_adjust_brightness(img, brightness_value):\n", " # Start your code here\n", " return\n", @@ -1054,7 +1136,7 @@ { "cell_type": "code", "execution_count": null, - "id": "55", + "id": "62", "metadata": {}, "outputs": [], "source": [ @@ -1071,7 +1153,7 @@ }, { "cell_type": "markdown", - "id": "56", + "id": "63", "metadata": {}, "source": [ "#### Adjust contrast" @@ -1080,11 +1162,23 @@ { "cell_type": "code", "execution_count": null, - "id": "57", + "id": "64", + "metadata": {}, + "outputs": [], + "source": [ + "%reload_ext tutorial.tests.testsuite" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "65", "metadata": {}, "outputs": [], "source": [ "%%ipytest\n", + "\n", + "import 
cv2\n", "def solution_adjust_contrast(img, contrast_value):\n", " # Start your code here\n", " return\n", @@ -1094,7 +1188,7 @@ { "cell_type": "code", "execution_count": null, - "id": "58", + "id": "66", "metadata": {}, "outputs": [], "source": [ @@ -1111,7 +1205,7 @@ }, { "cell_type": "markdown", - "id": "59", + "id": "67", "metadata": {}, "source": [ "#### Adjust saturation" @@ -1120,11 +1214,23 @@ { "cell_type": "code", "execution_count": null, - "id": "60", + "id": "68", + "metadata": {}, + "outputs": [], + "source": [ + "%reload_ext tutorial.tests.testsuite" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "69", "metadata": {}, "outputs": [], "source": [ "%%ipytest\n", + "\n", + "import cv2\n", "def solution_adjust_saturation(img, saturation_factor):\n", " # Start your code here\n", " return\n", @@ -1134,7 +1240,7 @@ { "cell_type": "code", "execution_count": null, - "id": "61", + "id": "70", "metadata": {}, "outputs": [], "source": [ @@ -1151,7 +1257,7 @@ }, { "cell_type": "markdown", - "id": "62", + "id": "71", "metadata": {}, "source": [ "## Image classifier development using CNNs\n", @@ -1173,7 +1279,7 @@ }, { "cell_type": "markdown", - "id": "63", + "id": "72", "metadata": {}, "source": [ "### Dataset preprocessing" @@ -1181,7 +1287,7 @@ }, { "cell_type": "markdown", - "id": "64", + "id": "73", "metadata": {}, "source": [ "#### Train, validation, and test sets\n", @@ -1192,10 +1298,12 @@ { "cell_type": "code", "execution_count": null, - "id": "65", + "id": "74", "metadata": {}, "outputs": [], "source": [ + "from sklearn.model_selection import train_test_split\n", + "\n", "# Train and validation sets\n", "X_train, y_train = train_set[0], train_set[1]\n", "X_train, X_val, y_train, y_val = train_test_split(X_train, y_train, test_size = 0.3, random_state = 42)\n", @@ -1206,7 +1314,7 @@ }, { "cell_type": "markdown", - "id": "66", + "id": "75", "metadata": {}, "source": [ "#### Data Augmentation\n", @@ -1240,13 +1348,14 @@ { "cell_type": 
"code", "execution_count": null, - "id": "67", + "id": "76", "metadata": {}, "outputs": [], "source": [ - "TARGET_SIZE = 32\n", + "import albumentations as A\n", "\n", "# Transformations performed on train set\n", + "TARGET_SIZE = 32\n", "train_transform = A.Compose([\n", " A.Normalize(mean=(0.4914, 0.4822, 0.4465), std=(0.2470, 0.2435, 0.2616)),\n", " A.Resize(height = TARGET_SIZE, width = TARGET_SIZE),\n", @@ -1263,7 +1372,7 @@ }, { "cell_type": "markdown", - "id": "68", + "id": "77", "metadata": {}, "source": [ "#### PyTorch Datasets\n", @@ -1274,7 +1383,7 @@ { "cell_type": "code", "execution_count": null, - "id": "69", + "id": "78", "metadata": {}, "outputs": [], "source": [ @@ -1286,7 +1395,7 @@ }, { "cell_type": "markdown", - "id": "70", + "id": "79", "metadata": {}, "source": [ "#### PyTorch Dataloaders\n", @@ -1297,10 +1406,12 @@ { "cell_type": "code", "execution_count": null, - "id": "71", + "id": "80", "metadata": {}, "outputs": [], "source": [ + "from torch.utils.data import DataLoader\n", + "\n", "# Data loaders needed for the model training\n", "BATCH_SIZE = 64\n", "train_dataloader = DataLoader(train_dataset, batch_size = BATCH_SIZE, shuffle = True)\n", @@ -1310,7 +1421,7 @@ }, { "cell_type": "markdown", - "id": "72", + "id": "81", "metadata": {}, "source": [ "### Model training\n", @@ -1332,7 +1443,7 @@ }, { "cell_type": "markdown", - "id": "73", + "id": "82", "metadata": {}, "source": [ "### Model Training Overview\n", @@ -1358,7 +1469,7 @@ }, { "cell_type": "markdown", - "id": "74", + "id": "83", "metadata": {}, "source": [ "#### Check which device is used for training" @@ -1367,16 +1478,18 @@ { "cell_type": "code", "execution_count": null, - "id": "75", + "id": "84", "metadata": {}, "outputs": [], "source": [ + "import torch\n", + "\n", "# Check which device is available for training the model" ] }, { "cell_type": "markdown", - "id": "76", + "id": "85", "metadata": {}, "source": [ "#### Define training hyperparameters" @@ -1385,7 +1498,7 @@ { 
"cell_type": "code", "execution_count": null, - "id": "77", + "id": "86", "metadata": {}, "outputs": [], "source": [ @@ -1401,7 +1514,7 @@ }, { "cell_type": "markdown", - "id": "78", + "id": "87", "metadata": {}, "source": [ "#### Loss function\n", @@ -1426,16 +1539,18 @@ { "cell_type": "code", "execution_count": null, - "id": "79", + "id": "88", "metadata": {}, "outputs": [], "source": [ + "import torch.nn as nn\n", + "\n", "# Initialise the Cross Entropy Loss and send it to the training device" ] }, { "cell_type": "markdown", - "id": "80", + "id": "89", "metadata": {}, "source": [ "#### Initialise model architecture" @@ -1444,7 +1559,7 @@ { "cell_type": "code", "execution_count": null, - "id": "81", + "id": "90", "metadata": {}, "outputs": [], "source": [ @@ -1453,7 +1568,7 @@ }, { "cell_type": "markdown", - "id": "82", + "id": "91", "metadata": {}, "source": [ "#### Optimiser function\n", @@ -1496,16 +1611,17 @@ { "cell_type": "code", "execution_count": null, - "id": "83", + "id": "92", "metadata": {}, "outputs": [], "source": [ - "# Initialise the Adam optimiser" + "# Initialise the Adam optimiser\n", + "import torch.optim as optim" ] }, { "cell_type": "markdown", - "id": "84", + "id": "93", "metadata": {}, "source": [ "#### Train model\n", @@ -1516,7 +1632,7 @@ { "cell_type": "code", "execution_count": null, - "id": "85", + "id": "94", "metadata": {}, "outputs": [], "source": [ @@ -1525,7 +1641,7 @@ }, { "cell_type": "markdown", - "id": "86", + "id": "95", "metadata": {}, "source": [ "After initialising the trainer instance, check whether a trained model already exists. If so, load the weights using ```model_weights = torch.load(model_path, weights_only=True)```. Then, load the weights into the model using (```model.load_state_dict(model_weights)```). Finally, set the model to evaluation model (```model.eval()```). This step is essential because certain layers, such as batch normalization and dropout, behave differently during training and evaluation. 
Setting the model to evaluation mode ensures they operate correctly during validation or testing. \n", @@ -1536,10 +1652,13 @@ { "cell_type": "code", "execution_count": null, - "id": "87", + "id": "96", "metadata": {}, "outputs": [], "source": [ + "import os\n", + "import torch\n", + "\n", "# Model filename\n", "model_path = \"cnn_weights.pt\"\n", "\n", @@ -1551,7 +1670,7 @@ }, { "cell_type": "markdown", - "id": "88", + "id": "97", "metadata": {}, "source": [ "#### Learning curves\n", @@ -1564,10 +1683,13 @@ { "cell_type": "code", "execution_count": null, - "id": "89", + "id": "98", "metadata": {}, "outputs": [], "source": [ + "import pandas as pd\n", + "from matplotlib import pyplot as plt\n", + "\n", "# Load the training log file\n", "training_log = None\n", "\n", @@ -1582,7 +1704,7 @@ }, { "cell_type": "markdown", - "id": "90", + "id": "99", "metadata": {}, "source": [ "### Model testing\n", @@ -1604,7 +1726,7 @@ { "cell_type": "code", "execution_count": null, - "id": "91", + "id": "100", "metadata": {}, "outputs": [], "source": [ @@ -1613,7 +1735,7 @@ }, { "cell_type": "markdown", - "id": "92", + "id": "101", "metadata": {}, "source": [ "### Explore results\n", @@ -1623,7 +1745,7 @@ }, { "cell_type": "markdown", - "id": "93", + "id": "102", "metadata": {}, "source": [ "#### Compute average accuracy\n", @@ -1634,7 +1756,7 @@ { "cell_type": "code", "execution_count": null, - "id": "94", + "id": "103", "metadata": {}, "outputs": [], "source": [ @@ -1645,7 +1767,7 @@ }, { "cell_type": "markdown", - "id": "95", + "id": "104", "metadata": {}, "source": [ "#### Compute confusion matrix\n", @@ -1659,10 +1781,14 @@ { "cell_type": "code", "execution_count": null, - "id": "96", + "id": "105", "metadata": {}, "outputs": [], "source": [ + "from sklearn.metrics import confusion_matrix\n", + "import numpy as np\n", + "from matplotlib import pyplot as plt\n", + "\n", "# Compute confusion matrix\n", "cm = None\n", "\n", @@ -1690,7 +1816,7 @@ }, { "cell_type": "markdown", - 
"id": "97", + "id": "106", "metadata": {}, "source": [ "### Explain image classifier predictions\n", @@ -1700,7 +1826,7 @@ }, { "cell_type": "markdown", - "id": "98", + "id": "107", "metadata": {}, "source": [ "#### Prepare image for Grad-CAM\n", @@ -1717,10 +1843,13 @@ { "cell_type": "code", "execution_count": null, - "id": "99", + "id": "108", "metadata": {}, "outputs": [], "source": [ + "import numpy as np\n", + "import torch\n", + "\n", "# Get a batch of images\n", "idx = 1\n", "img = original_images[idx]\n", @@ -1730,7 +1859,7 @@ }, { "cell_type": "markdown", - "id": "100", + "id": "109", "metadata": {}, "source": [ "#### Compute GradCAM heatmap\n", @@ -1755,10 +1884,13 @@ { "cell_type": "code", "execution_count": null, - "id": "101", + "id": "110", "metadata": {}, "outputs": [], "source": [ + "from pytorch_grad_cam import GradCAM\n", + "from pytorch_grad_cam.utils.model_targets import ClassifierOutputTarget\n", + "\n", "# Make sure input requires grad\n", "\n", "# Define the layer(s) to inspect\n", @@ -1770,7 +1902,7 @@ }, { "cell_type": "markdown", - "id": "102", + "id": "111", "metadata": {}, "source": [ "#### Visualise Grad-CAM heatmap with the image\n", @@ -1781,10 +1913,12 @@ { "cell_type": "code", "execution_count": null, - "id": "103", + "id": "112", "metadata": {}, "outputs": [], "source": [ + "from pytorch_grad_cam.utils.image import show_cam_on_image\n", + "\n", "# Combine CAM with image\n", "visualisation = None\n", "\n", From 490dd4960cadef1678c8a29ff4231a9d78353041 Mon Sep 17 00:00:00 2001 From: Aliaksandr Yakutovich Date: Sun, 11 May 2025 22:16:31 +0200 Subject: [PATCH 10/10] Rename the notebooks, add them to the index --- 00_index.ipynb | 4 ++++ ...fication.ipynb => 31_image_classification.ipynb | 14 +++++++------- ...fication.py => test_31_image_classification.py} | 0 3 files changed, 11 insertions(+), 7 deletions(-) rename 25_image_classification.ipynb => 31_image_classification.ipynb (99%) rename 
tutorial/tests/{test_25_image_classification.py => test_31_image_classification.py} (100%) diff --git a/00_index.ipynb b/00_index.ipynb index 13143226..1705e4ee 100644 --- a/00_index.ipynb +++ b/00_index.ipynb @@ -32,6 +32,10 @@ "- [SciPy](./23_library_scipy.ipynb)\n", "- [Pandas](./24_library_pandas.ipynb)\n", "\n", + "# Hands-On Projects\n", + "\n", + "- [Image Classification](./31_image_classification.ipynb)\n", + "\n", "# Additional Topics\n", "\n", "- [Parallelism and concurrency in Python](./14_threads.ipynb)\n" diff --git a/25_image_classification.ipynb b/31_image_classification.ipynb similarity index 99% rename from 25_image_classification.ipynb rename to 31_image_classification.ipynb index 88da2be6..bcddacec 100644 --- a/25_image_classification.ipynb +++ b/31_image_classification.ipynb @@ -71,12 +71,12 @@ "## References\n", "\n", "Here are some additional references to guide you while self-learning:\n", - "- Official documentation for [openCV](https://docs.opencv.org/4.x/d6/d00/tutorial_py_root.html)\n", - "- Official documentation for [PIL library](https://pillow.readthedocs.io/en/stable/)\n", - "- Official documentation for [PyTorch](https://pytorch.org/)\n", - "- Official documentation for [Albumentations](https://albumentations.ai/)\n", - "- Official documentation for [PyTorch GradCAM](https://jacobgil.github.io/pytorch-gradcam-book/introduction.html)\n", - "- [A tutorial from Microsoft to compute image classification using PyTorch](https://learn.microsoft.com/en-us/windows/ai/windows-ml/tutorials/pytorch-train-model)" + "- Official documentation for [OpenCV](https://docs.opencv.org/4.x/d6/d00/tutorial_py_root.html).\n", + "- Official documentation for the [PIL library](https://pillow.readthedocs.io/en/stable/).\n", + "- Official documentation for [PyTorch](https://pytorch.org/).\n", + "- Official documentation for [Albumentations](https://albumentations.ai/).\n", + "- Official documentation for [PyTorch 
GradCAM](https://jacobgil.github.io/pytorch-gradcam-book/introduction.html).\n", + "- [A tutorial from Microsoft on image classification using PyTorch](https://learn.microsoft.com/en-us/windows/ai/windows-ml/tutorials/pytorch-train-model)." ] }, { @@ -2013,7 +2013,7 @@ "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", - "version": "3.12.10" + "version": "3.10.10" } }, "nbformat": 4, diff --git a/tutorial/tests/test_25_image_classification.py b/tutorial/tests/test_31_image_classification.py similarity index 100% rename from tutorial/tests/test_25_image_classification.py rename to tutorial/tests/test_31_image_classification.py