-
-## Update Logs:
-### October 28, 2019
-* Pretrained model and evaluation code on the WFLW dataset are released.
-
-## Installation
-#### Note: The code was originally developed under Python 2.x and PyTorch 0.4. This released version was revised from the original code and tested on Python 3.5.7 and PyTorch 1.3.0.
-
-Install system requirements:
-```
-sudo apt-get install python3-dev python3-pip python3-tk libglib2.0-0
-```
-
-Install python dependencies:
-```
-pip3 install -r requirements.txt
-```
-
-## Run Evaluation on WFLW dataset
-1. Download and process the WFLW dataset
- * Download the WFLW dataset and annotations from [Here](https://wywu.github.io/projects/LAB/WFLW.html).
- * Unzip the WFLW images and annotations and move the files into the ```./dataset``` directory. Your directory should look like this:
- ```
- AdaptiveWingLoss
- └───dataset
- │
- └───WFLW_annotations
- │ └───list_98pt_rect_attr_train_test
- │ │
- │ └───list_98pt_test
- │
- └───WFLW_images
- └───0--Parade
- │
- └───...
- ```
- * Inside the ```./dataset``` directory, run:
- ```
- python convert_WFLW.py
- ```
- A new directory ```./dataset/WFLW_test``` should be generated, containing 2500 processed test images and their corresponding landmarks.
-
-2. Download the pretrained model from [Google Drive](https://drive.google.com/file/d/1HZaSjLoorQ4QCEx7PRTxOmg0bBPYSqhH/view?usp=sharing) and put it in the ```./ckpt``` directory.
-
-3. Within the ```./Scripts``` directory, run the following command (a standalone Python alternative is sketched after these steps):
- ```
- sh eval_wflw.sh
- ```
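-
-Under the hood, ```eval_wflw.sh``` simply calls ```eval.py``` with the WFLW settings (4 stacked hourglass modules, 98 landmarks, no end ReLU, the ```WFLW_4HG.pth``` checkpoint). If you only want to try the pretrained model on a single pre-cropped 256x256 face, the minimal sketch below should work when run from the AdaptiveWingLoss root; the image path is a placeholder and the snippet is an illustration, not part of the released API:
-
-```
-import torch
-import numpy as np
-from PIL import Image
-from core import models
-from utils.utils import get_preds_fromhm
-
-device = torch.device("cuda:0" if torch.cuda.is_available() else "cpu")
-
-# Same configuration as eval_wflw.sh: 4 stacked HG modules, no end ReLU, RGB input, 98 landmarks
-model = models.FAN(num_modules=4, end_relu=False, gray_scale=False, num_landmarks=98)
-checkpoint = torch.load('ckpt/WFLW_4HG.pth', map_location=device)
-pretrained = checkpoint['state_dict'] if 'state_dict' in checkpoint else checkpoint
-weights = model.state_dict()
-weights.update({k: v for k, v in pretrained.items() if k in weights})
-model.load_state_dict(weights)
-model = model.to(device).eval()
-
-# The network expects a 256x256 RGB face crop, scaled to [0, 1], shaped (1, 3, 256, 256).
-# The path below is only a placeholder.
-image = np.array(Image.open('dataset/WFLW_test/images/example.png').convert('RGB'))
-inputs = torch.from_numpy(image.transpose(2, 0, 1)).float().div(255.0).unsqueeze(0).to(device)
-
-with torch.no_grad():
-    outputs, _ = model(inputs)                     # one heatmap tensor per hourglass stack
-heatmaps = outputs[-1][:, :-1, :, :].cpu()         # last stack, boundary channel dropped
-landmarks, _ = get_preds_fromhm(heatmaps)          # coordinates in 64x64 heatmap space
-landmarks = landmarks.squeeze().numpy() * 4.0      # map back to the 256x256 crop
-```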
-
-
- *GTBbox indicates that the ground-truth landmarks are used as the bounding box to crop faces.
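-
-For reference, the NME reported by the evaluation script is the mean point-to-point error normalized by the inter-ocular distance; for the 98-point WFLW annotation, ```fan_NME``` in ```utils/utils.py``` uses the distance between landmarks 60 and 72 (the outer eye corners), and samples with NME above 0.1 are counted as failures. A minimal per-sample sketch of the same computation (the helper name is illustrative only):
-
-```
-import numpy as np
-
-def wflw_nme(pred_landmarks, gt_landmarks):
-    """NME for one 98-point WFLW sample: mean point error / inter-ocular distance."""
-    inter_ocular = np.linalg.norm(gt_landmarks[60] - gt_landmarks[72])  # outer eye corners
-    point_errors = np.linalg.norm(pred_landmarks - gt_landmarks, axis=1)
-    return point_errors.mean() / inter_ocular
-```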
-
-## Future Plans
-- [x] Release evaluation code and pretrained model on the WFLW dataset.
-
-- [ ] Release training code on the WFLW dataset.
-
-- [ ] Release pretrained models and code on the 300W, AFLW and COFW datasets.
-
-- [ ] Release facial landmark detection API.
-
-
-## Citation
-If you find this useful for your research, please cite the following paper.
-
-```
-@InProceedings{Wang_2019_ICCV,
-author = {Wang, Xinyao and Bo, Liefeng and Fuxin, Li},
-title = {Adaptive Wing Loss for Robust Face Alignment via Heatmap Regression},
-booktitle = {The IEEE International Conference on Computer Vision (ICCV)},
-month = {October},
-year = {2019}
-}
-```
-
-## Acknowledgments
-This repository borrows and partially modifies the hourglass model and data-processing code from [face alignment](https://github.com/1adrianb/face-alignment) and [pose-hg-train](https://github.com/princeton-vl/pose-hg-train).
diff --git a/marlenezw/audio-driven-animations/MakeItTalk/AdaptiveWingLoss/__init__.py b/marlenezw/audio-driven-animations/MakeItTalk/AdaptiveWingLoss/__init__.py
deleted file mode 100644
index e69de29bb2d1d6434b8b29ae775ad8c2e48c5391..0000000000000000000000000000000000000000
diff --git a/marlenezw/audio-driven-animations/MakeItTalk/AdaptiveWingLoss/__pycache__/__init__.cpython-37.pyc b/marlenezw/audio-driven-animations/MakeItTalk/AdaptiveWingLoss/__pycache__/__init__.cpython-37.pyc
deleted file mode 100644
index f37d60bf51ae084b8f418271361dbc3478ea6f43..0000000000000000000000000000000000000000
Binary files a/marlenezw/audio-driven-animations/MakeItTalk/AdaptiveWingLoss/__pycache__/__init__.cpython-37.pyc and /dev/null differ
diff --git a/marlenezw/audio-driven-animations/MakeItTalk/AdaptiveWingLoss/__pycache__/__init__.cpython-39.pyc b/marlenezw/audio-driven-animations/MakeItTalk/AdaptiveWingLoss/__pycache__/__init__.cpython-39.pyc
deleted file mode 100644
index 2e10d278134f89716ae4457bb211573759899493..0000000000000000000000000000000000000000
Binary files a/marlenezw/audio-driven-animations/MakeItTalk/AdaptiveWingLoss/__pycache__/__init__.cpython-39.pyc and /dev/null differ
diff --git a/marlenezw/audio-driven-animations/MakeItTalk/AdaptiveWingLoss/core/__init__.py b/marlenezw/audio-driven-animations/MakeItTalk/AdaptiveWingLoss/core/__init__.py
deleted file mode 100644
index e69de29bb2d1d6434b8b29ae775ad8c2e48c5391..0000000000000000000000000000000000000000
diff --git a/marlenezw/audio-driven-animations/MakeItTalk/AdaptiveWingLoss/core/__pycache__/__init__.cpython-37.pyc b/marlenezw/audio-driven-animations/MakeItTalk/AdaptiveWingLoss/core/__pycache__/__init__.cpython-37.pyc
deleted file mode 100644
index f23cf150a0633b354e088ba8fcf06c2fb178b2de..0000000000000000000000000000000000000000
Binary files a/marlenezw/audio-driven-animations/MakeItTalk/AdaptiveWingLoss/core/__pycache__/__init__.cpython-37.pyc and /dev/null differ
diff --git a/marlenezw/audio-driven-animations/MakeItTalk/AdaptiveWingLoss/core/__pycache__/__init__.cpython-39.pyc b/marlenezw/audio-driven-animations/MakeItTalk/AdaptiveWingLoss/core/__pycache__/__init__.cpython-39.pyc
deleted file mode 100644
index ea0b18dbedc1969bc59d35d96c18ea538317a8f7..0000000000000000000000000000000000000000
Binary files a/marlenezw/audio-driven-animations/MakeItTalk/AdaptiveWingLoss/core/__pycache__/__init__.cpython-39.pyc and /dev/null differ
diff --git a/marlenezw/audio-driven-animations/MakeItTalk/AdaptiveWingLoss/core/__pycache__/coord_conv.cpython-37.pyc b/marlenezw/audio-driven-animations/MakeItTalk/AdaptiveWingLoss/core/__pycache__/coord_conv.cpython-37.pyc
deleted file mode 100644
index 3372b5d0888eec66a73df15071c3aef6209a3009..0000000000000000000000000000000000000000
Binary files a/marlenezw/audio-driven-animations/MakeItTalk/AdaptiveWingLoss/core/__pycache__/coord_conv.cpython-37.pyc and /dev/null differ
diff --git a/marlenezw/audio-driven-animations/MakeItTalk/AdaptiveWingLoss/core/__pycache__/coord_conv.cpython-39.pyc b/marlenezw/audio-driven-animations/MakeItTalk/AdaptiveWingLoss/core/__pycache__/coord_conv.cpython-39.pyc
deleted file mode 100644
index fb3890faa1d24a9972f7eaefc5d1079d21524a42..0000000000000000000000000000000000000000
Binary files a/marlenezw/audio-driven-animations/MakeItTalk/AdaptiveWingLoss/core/__pycache__/coord_conv.cpython-39.pyc and /dev/null differ
diff --git a/marlenezw/audio-driven-animations/MakeItTalk/AdaptiveWingLoss/core/__pycache__/models.cpython-37.pyc b/marlenezw/audio-driven-animations/MakeItTalk/AdaptiveWingLoss/core/__pycache__/models.cpython-37.pyc
deleted file mode 100644
index 607ef853c74f776f48c993765653bd27739114d1..0000000000000000000000000000000000000000
Binary files a/marlenezw/audio-driven-animations/MakeItTalk/AdaptiveWingLoss/core/__pycache__/models.cpython-37.pyc and /dev/null differ
diff --git a/marlenezw/audio-driven-animations/MakeItTalk/AdaptiveWingLoss/core/__pycache__/models.cpython-39.pyc b/marlenezw/audio-driven-animations/MakeItTalk/AdaptiveWingLoss/core/__pycache__/models.cpython-39.pyc
deleted file mode 100644
index ed94bf963863e9ed60bbada29016b233fd37ac47..0000000000000000000000000000000000000000
Binary files a/marlenezw/audio-driven-animations/MakeItTalk/AdaptiveWingLoss/core/__pycache__/models.cpython-39.pyc and /dev/null differ
diff --git a/marlenezw/audio-driven-animations/MakeItTalk/AdaptiveWingLoss/core/coord_conv.py b/marlenezw/audio-driven-animations/MakeItTalk/AdaptiveWingLoss/core/coord_conv.py
deleted file mode 100644
index 3949a4d7e694884b6fe9a5d4550631b5e7c4f247..0000000000000000000000000000000000000000
--- a/marlenezw/audio-driven-animations/MakeItTalk/AdaptiveWingLoss/core/coord_conv.py
+++ /dev/null
@@ -1,157 +0,0 @@
-import torch
-import torch.nn as nn
-
-
-
-device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
-
-class AddCoordsTh(nn.Module):
- def __init__(self, x_dim=64, y_dim=64, with_r=False, with_boundary=False):
- super(AddCoordsTh, self).__init__()
- self.x_dim = x_dim
- self.y_dim = y_dim
- self.with_r = with_r
- self.with_boundary = with_boundary
-
- def forward(self, input_tensor, heatmap=None):
- """
- input_tensor: (batch, c, x_dim, y_dim)
- """
- batch_size_tensor = input_tensor.shape[0]
-
- xx_ones = torch.ones([1, self.y_dim], dtype=torch.int32).to(device)
- xx_ones = xx_ones.unsqueeze(-1)
-
- xx_range = torch.arange(self.x_dim, dtype=torch.int32).unsqueeze(0).to(device)
- xx_range = xx_range.unsqueeze(1)
-
- xx_channel = torch.matmul(xx_ones.float(), xx_range.float())
- xx_channel = xx_channel.unsqueeze(-1)
-
-
- yy_ones = torch.ones([1, self.x_dim], dtype=torch.int32).to(device)
- yy_ones = yy_ones.unsqueeze(1)
-
- yy_range = torch.arange(self.y_dim, dtype=torch.int32).unsqueeze(0).to(device)
- yy_range = yy_range.unsqueeze(-1)
-
- yy_channel = torch.matmul(yy_range.float(), yy_ones.float())
- yy_channel = yy_channel.unsqueeze(-1)
-
- xx_channel = xx_channel.permute(0, 3, 2, 1)
- yy_channel = yy_channel.permute(0, 3, 2, 1)
-
- xx_channel = xx_channel / (self.x_dim - 1)
- yy_channel = yy_channel / (self.y_dim - 1)
-
- xx_channel = xx_channel * 2 - 1
- yy_channel = yy_channel * 2 - 1
-
- xx_channel = xx_channel.repeat(batch_size_tensor, 1, 1, 1)
- yy_channel = yy_channel.repeat(batch_size_tensor, 1, 1, 1)
-
- if self.with_boundary and type(heatmap) != type(None):
- boundary_channel = torch.clamp(heatmap[:, -1:, :, :],
- 0.0, 1.0)
-
- zero_tensor = torch.zeros_like(xx_channel)
- xx_boundary_channel = torch.where(boundary_channel>0.05,
- xx_channel, zero_tensor)
- yy_boundary_channel = torch.where(boundary_channel>0.05,
- yy_channel, zero_tensor)
- if self.with_boundary and type(heatmap) != type(None):
- xx_boundary_channel = xx_boundary_channel.to(device)
- yy_boundary_channel = yy_boundary_channel.to(device)
-
- ret = torch.cat([input_tensor, xx_channel, yy_channel], dim=1)
-
-
- if self.with_r:
- rr = torch.sqrt(torch.pow(xx_channel, 2) + torch.pow(yy_channel, 2))
- rr = rr / torch.max(rr)
- ret = torch.cat([ret, rr], dim=1)
-
- if self.with_boundary and type(heatmap) != type(None):
- ret = torch.cat([ret, xx_boundary_channel,
- yy_boundary_channel], dim=1)
- return ret
-
-
-class CoordConvTh(nn.Module):
- """CoordConv layer as in the paper."""
- def __init__(self, x_dim, y_dim, with_r, with_boundary,
- in_channels, first_one=False, *args, **kwargs):
- super(CoordConvTh, self).__init__()
- self.addcoords = AddCoordsTh(x_dim=x_dim, y_dim=y_dim, with_r=with_r,
- with_boundary=with_boundary)
- in_channels += 2
- if with_r:
- in_channels += 1
- if with_boundary and not first_one:
- in_channels += 2
- self.conv = nn.Conv2d(in_channels=in_channels, *args, **kwargs)
-
- def forward(self, input_tensor, heatmap=None):
- ret = self.addcoords(input_tensor, heatmap)
- last_channel = ret[:, -2:, :, :]
- ret = self.conv(ret)
- return ret, last_channel
-
-
-'''
-An alternative implementation for PyTorch with auto-infering the x-y dimensions.
-'''
-class AddCoords(nn.Module):
-
- def __init__(self, with_r=False):
- super().__init__()
- self.with_r = with_r
-
- def forward(self, input_tensor):
- """
- Args:
- input_tensor: shape(batch, channel, x_dim, y_dim)
- """
- batch_size, _, x_dim, y_dim = input_tensor.size()
-
- xx_channel = torch.arange(x_dim).repeat(1, y_dim, 1)
- yy_channel = torch.arange(y_dim).repeat(1, x_dim, 1).transpose(1, 2)
-
- xx_channel = xx_channel / (x_dim - 1)
- yy_channel = yy_channel / (y_dim - 1)
-
- xx_channel = xx_channel * 2 - 1
- yy_channel = yy_channel * 2 - 1
-
- xx_channel = xx_channel.repeat(batch_size, 1, 1, 1).transpose(2, 3)
- yy_channel = yy_channel.repeat(batch_size, 1, 1, 1).transpose(2, 3)
-
- if input_tensor.is_cuda:
- xx_channel = xx_channel.to(device)
- yy_channel = yy_channel.to(device)
-
- ret = torch.cat([
- input_tensor,
- xx_channel.type_as(input_tensor),
- yy_channel.type_as(input_tensor)], dim=1)
-
- if self.with_r:
- rr = torch.sqrt(torch.pow(xx_channel - 0.5, 2) + torch.pow(yy_channel - 0.5, 2))
- if input_tensor.is_cuda:
- rr = rr.to(device)
- ret = torch.cat([ret, rr], dim=1)
-
- return ret
-
-
-class CoordConv(nn.Module):
-
- def __init__(self, in_channels, out_channels, with_r=False, **kwargs):
- super().__init__()
- self.addcoords = AddCoords(with_r=with_r)
- self.conv = nn.Conv2d(in_channels + 2, out_channels, **kwargs)
-
- def forward(self, x):
- ret = self.addcoords(x)
- ret = self.conv(ret)
- return ret
diff --git a/marlenezw/audio-driven-animations/MakeItTalk/AdaptiveWingLoss/core/dataloader.py b/marlenezw/audio-driven-animations/MakeItTalk/AdaptiveWingLoss/core/dataloader.py
deleted file mode 100644
index d50deda51bca3f3349bb84f676ea2a0447884a67..0000000000000000000000000000000000000000
--- a/marlenezw/audio-driven-animations/MakeItTalk/AdaptiveWingLoss/core/dataloader.py
+++ /dev/null
@@ -1,368 +0,0 @@
-import sys
-import os
-import random
-import glob
-import torch
-from skimage import io
-from skimage import transform as ski_transform
-from skimage.color import rgb2gray
-import scipy.io as sio
-from scipy import interpolate
-import numpy as np
-import matplotlib.pyplot as plt
-from torch.utils.data import Dataset, DataLoader
-from torchvision import transforms, utils
-from torchvision.transforms import Lambda, Compose
-from torchvision.transforms.functional import adjust_brightness, adjust_contrast, adjust_saturation, adjust_hue
-from utils.utils import cv_crop, cv_rotate, draw_gaussian, transform, power_transform, shuffle_lr, fig2data, generate_weight_map
-from PIL import Image
-import cv2
-import copy
-import math
-from imgaug import augmenters as iaa
-
-
-class AddBoundary(object):
- def __init__(self, num_landmarks=68):
- self.num_landmarks = num_landmarks
-
- def __call__(self, sample):
- landmarks_64 = np.floor(sample['landmarks'] / 4.0)
- if self.num_landmarks == 68:
- boundaries = {}
- boundaries['cheek'] = landmarks_64[0:17]
- boundaries['left_eyebrow'] = landmarks_64[17:22]
- boundaries['right_eyebrow'] = landmarks_64[22:27]
- boundaries['uper_left_eyelid'] = landmarks_64[36:40]
- boundaries['lower_left_eyelid'] = np.array([landmarks_64[i] for i in [36, 41, 40, 39]])
- boundaries['upper_right_eyelid'] = landmarks_64[42:46]
- boundaries['lower_right_eyelid'] = np.array([landmarks_64[i] for i in [42, 47, 46, 45]])
- boundaries['noise'] = landmarks_64[27:31]
- boundaries['noise_bot'] = landmarks_64[31:36]
- boundaries['upper_outer_lip'] = landmarks_64[48:55]
- boundaries['upper_inner_lip'] = np.array([landmarks_64[i] for i in [60, 61, 62, 63, 64]])
- boundaries['lower_outer_lip'] = np.array([landmarks_64[i] for i in [48, 59, 58, 57, 56, 55, 54]])
- boundaries['lower_inner_lip'] = np.array([landmarks_64[i] for i in [60, 67, 66, 65, 64]])
- elif self.num_landmarks == 98:
- boundaries = {}
- boundaries['cheek'] = landmarks_64[0:33]
- boundaries['left_eyebrow'] = landmarks_64[33:38]
- boundaries['right_eyebrow'] = landmarks_64[42:47]
- boundaries['uper_left_eyelid'] = landmarks_64[60:65]
- boundaries['lower_left_eyelid'] = np.array([landmarks_64[i] for i in [60, 67, 66, 65, 64]])
- boundaries['upper_right_eyelid'] = landmarks_64[68:73]
- boundaries['lower_right_eyelid'] = np.array([landmarks_64[i] for i in [68, 75, 74, 73, 72]])
- boundaries['noise'] = landmarks_64[51:55]
- boundaries['noise_bot'] = landmarks_64[55:60]
- boundaries['upper_outer_lip'] = landmarks_64[76:83]
- boundaries['upper_inner_lip'] = np.array([landmarks_64[i] for i in [88, 89, 90, 91, 92]])
- boundaries['lower_outer_lip'] = np.array([landmarks_64[i] for i in [76, 87, 86, 85, 84, 83, 82]])
- boundaries['lower_inner_lip'] = np.array([landmarks_64[i] for i in [88, 95, 94, 93, 92]])
- elif self.num_landmarks == 19:
- boundaries = {}
- boundaries['left_eyebrow'] = landmarks_64[0:3]
- boundaries['right_eyebrow'] = landmarks_64[3:5]
- boundaries['left_eye'] = landmarks_64[6:9]
- boundaries['right_eye'] = landmarks_64[9:12]
- boundaries['noise'] = landmarks_64[12:15]
-
- elif self.num_landmarks == 29:
- boundaries = {}
- boundaries['upper_left_eyebrow'] = np.stack([
- landmarks_64[0],
- landmarks_64[4],
- landmarks_64[2]
- ], axis=0)
- boundaries['lower_left_eyebrow'] = np.stack([
- landmarks_64[0],
- landmarks_64[5],
- landmarks_64[2]
- ], axis=0)
- boundaries['upper_right_eyebrow'] = np.stack([
- landmarks_64[1],
- landmarks_64[6],
- landmarks_64[3]
- ], axis=0)
- boundaries['lower_right_eyebrow'] = np.stack([
- landmarks_64[1],
- landmarks_64[7],
- landmarks_64[3]
- ], axis=0)
- boundaries['upper_left_eye'] = np.stack([
- landmarks_64[8],
- landmarks_64[12],
- landmarks_64[10]
- ], axis=0)
- boundaries['lower_left_eye'] = np.stack([
- landmarks_64[8],
- landmarks_64[13],
- landmarks_64[10]
- ], axis=0)
- boundaries['upper_right_eye'] = np.stack([
- landmarks_64[9],
- landmarks_64[14],
- landmarks_64[11]
- ], axis=0)
- boundaries['lower_right_eye'] = np.stack([
- landmarks_64[9],
- landmarks_64[15],
- landmarks_64[11]
- ], axis=0)
- boundaries['noise'] = np.stack([
- landmarks_64[18],
- landmarks_64[21],
- landmarks_64[19]
- ], axis=0)
- boundaries['outer_upper_lip'] = np.stack([
- landmarks_64[22],
- landmarks_64[24],
- landmarks_64[23]
- ], axis=0)
- boundaries['inner_upper_lip'] = np.stack([
- landmarks_64[22],
- landmarks_64[25],
- landmarks_64[23]
- ], axis=0)
- boundaries['outer_lower_lip'] = np.stack([
- landmarks_64[22],
- landmarks_64[26],
- landmarks_64[23]
- ], axis=0)
- boundaries['inner_lower_lip'] = np.stack([
- landmarks_64[22],
- landmarks_64[27],
- landmarks_64[23]
- ], axis=0)
- functions = {}
-
- for key, points in boundaries.items():
- temp = points[0]
- new_points = points[0:1, :]
- for point in points[1:]:
- if point[0] == temp[0] and point[1] == temp[1]:
- continue
- else:
- new_points = np.concatenate((new_points, np.expand_dims(point, 0)), axis=0)
- temp = point
- points = new_points
- if points.shape[0] == 1:
- points = np.concatenate((points, points+0.001), axis=0)
- k = min(4, points.shape[0])
- functions[key] = interpolate.splprep([points[:, 0], points[:, 1]], k=k-1,s=0)
-
- boundary_map = np.zeros((64, 64))
-
- fig = plt.figure(figsize=[64/96.0, 64/96.0], dpi=96)
-
- ax = fig.add_axes([0, 0, 1, 1])
-
- ax.axis('off')
-
- ax.imshow(boundary_map, interpolation='nearest', cmap='gray')
- #ax.scatter(landmarks[:, 0], landmarks[:, 1], s=1, marker=',', c='w')
-
- for key in functions.keys():
- xnew = np.arange(0, 1, 0.01)
- out = interpolate.splev(xnew, functions[key][0], der=0)
- plt.plot(out[0], out[1], ',', linewidth=1, color='w')
-
- img = fig2data(fig)
-
- plt.close()
-
- sigma = 1
- temp = 255-img[:,:,1]
- temp = cv2.distanceTransform(temp, cv2.DIST_L2, cv2.DIST_MASK_PRECISE)
- temp = temp.astype(np.float32)
- temp = np.where(temp < 3*sigma, np.exp(-(temp*temp)/(2*sigma*sigma)), 0 )
-
- fig = plt.figure(figsize=[64/96.0, 64/96.0], dpi=96)
-
- ax = fig.add_axes([0, 0, 1, 1])
-
- ax.axis('off')
- ax.imshow(temp, cmap='gray')
- plt.close()
-
- boundary_map = fig2data(fig)
-
- sample['boundary'] = boundary_map[:, :, 0]
-
- return sample
-
-class AddWeightMap(object):
- def __call__(self, sample):
- heatmap= sample['heatmap']
- boundary = sample['boundary']
- heatmap = np.concatenate((heatmap, np.expand_dims(boundary, axis=0)), 0)
- weight_map = np.zeros_like(heatmap)
- for i in range(heatmap.shape[0]):
- weight_map[i] = generate_weight_map(weight_map[i],
- heatmap[i])
- sample['weight_map'] = weight_map
- return sample
-
-class ToTensor(object):
- """Convert ndarrays in sample to Tensors."""
-
- def __call__(self, sample):
- image, heatmap, landmarks, boundary, weight_map= sample['image'], sample['heatmap'], sample['landmarks'], sample['boundary'], sample['weight_map']
-
- # swap color axis because
- # numpy image: H x W x C
- # torch image: C X H X W
- if len(image.shape) == 2:
- image = np.expand_dims(image, axis=2)
- image_small = np.expand_dims(image_small, axis=2)
- image = image.transpose((2, 0, 1))
- boundary = np.expand_dims(boundary, axis=2)
- boundary = boundary.transpose((2, 0, 1))
- return {'image': torch.from_numpy(image).float().div(255.0),
- 'heatmap': torch.from_numpy(heatmap).float(),
- 'landmarks': torch.from_numpy(landmarks).float(),
- 'boundary': torch.from_numpy(boundary).float().div(255.0),
- 'weight_map': torch.from_numpy(weight_map).float()}
-
-class FaceLandmarksDataset(Dataset):
- """Face Landmarks dataset."""
-
- def __init__(self, img_dir, landmarks_dir, num_landmarks=68, gray_scale=False,
- detect_face=False, enhance=False, center_shift=0,
- transform=None,):
- """
- Args:
- landmark_dir (string): Path to the mat file with landmarks saved.
- img_dir (string): Directory with all the images.
- transform (callable, optional): Optional transform to be applied
- on a sample.
- """
- self.img_dir = img_dir
- self.landmarks_dir = landmarks_dir
- self.num_lanmdkars = num_landmarks
- self.transform = transform
- self.img_names = glob.glob(self.img_dir+'*.jpg') + \
- glob.glob(self.img_dir+'*.png')
- self.gray_scale = gray_scale
- self.detect_face = detect_face
- self.enhance = enhance
- self.center_shift = center_shift
- if self.detect_face:
- self.face_detector = MTCNN(thresh=[0.5, 0.6, 0.7])
- def __len__(self):
- return len(self.img_names)
-
- def __getitem__(self, idx):
- img_name = self.img_names[idx]
- pil_image = Image.open(img_name)
- if pil_image.mode != "RGB":
- # if input is grayscale image, convert it to 3 channel image
- if self.enhance:
- pil_image = power_transform(pil_image, 0.5)
- temp_image = Image.new('RGB', pil_image.size)
- temp_image.paste(pil_image)
- pil_image = temp_image
- image = np.array(pil_image)
- if self.gray_scale:
- image = rgb2gray(image)
- image = np.expand_dims(image, axis=2)
- image = np.concatenate((image, image, image), axis=2)
- image = image * 255.0
- image = image.astype(np.uint8)
- if not self.detect_face:
- center = [450//2, 450//2+0]
- if self.center_shift != 0:
- center[0] += int(np.random.uniform(-self.center_shift,
- self.center_shift))
- center[1] += int(np.random.uniform(-self.center_shift,
- self.center_shift))
- scale = 1.8
- else:
- detected_faces = self.face_detector.detect_image(image)
- if len(detected_faces) > 0:
- box = detected_faces[0]
- left, top, right, bottom, _ = box
- center = [right - (right - left) / 2.0,
- bottom - (bottom - top) / 2.0]
- center[1] = center[1] - (bottom - top) * 0.12
- scale = (right - left + bottom - top) / 195.0
- else:
- center = [450//2, 450//2+0]
- scale = 1.8
- if self.center_shift != 0:
- shift = self.center * self.center_shift / 450
- center[0] += int(np.random.uniform(-shift, shift))
- center[1] += int(np.random.uniform(-shift, shift))
- base_name = os.path.basename(img_name)
- landmarks_base_name = base_name[:-4] + '_pts.mat'
- landmarks_name = os.path.join(self.landmarks_dir, landmarks_base_name)
- if os.path.isfile(landmarks_name):
- mat_data = sio.loadmat(landmarks_name)
- landmarks = mat_data['pts_2d']
- elif os.path.isfile(landmarks_name[:-8] + '.pts.npy'):
- landmarks = np.load(landmarks_name[:-8] + '.pts.npy')
- else:
- landmarks = []
- heatmap = []
-
- if landmarks != []:
- new_image, new_landmarks = cv_crop(image, landmarks, center,
- scale, 256, self.center_shift)
- tries = 0
- while self.center_shift != 0 and tries < 5 and (np.max(new_landmarks) > 240 or np.min(new_landmarks) < 15):
- center = [450//2, 450//2+0]
- scale += 0.05
- center[0] += int(np.random.uniform(-self.center_shift,
- self.center_shift))
- center[1] += int(np.random.uniform(-self.center_shift,
- self.center_shift))
-
- new_image, new_landmarks = cv_crop(image, landmarks,
- center, scale, 256,
- self.center_shift)
- tries += 1
- if np.max(new_landmarks) > 250 or np.min(new_landmarks) < 5:
- center = [450//2, 450//2+0]
- scale = 2.25
- new_image, new_landmarks = cv_crop(image, landmarks,
- center, scale, 256,
- 100)
- assert (np.min(new_landmarks) > 0 and np.max(new_landmarks) < 256), \
- "Landmarks out of boundary!"
- image = new_image
- landmarks = new_landmarks
- heatmap = np.zeros((self.num_lanmdkars, 64, 64))
- for i in range(self.num_lanmdkars):
- if landmarks[i][0] > 0:
- heatmap[i] = draw_gaussian(heatmap[i], landmarks[i]/4.0+1, 1)
- sample = {'image': image, 'heatmap': heatmap, 'landmarks': landmarks}
- if self.transform:
- sample = self.transform(sample)
-
- return sample
-
-def get_dataset(val_img_dir, val_landmarks_dir, batch_size,
- num_landmarks=68, rotation=0, scale=0,
- center_shift=0, random_flip=False,
- brightness=0, contrast=0, saturation=0,
- blur=False, noise=False, jpeg_effect=False,
- random_occlusion=False, gray_scale=False,
- detect_face=False, enhance=False):
- val_transforms = transforms.Compose([AddBoundary(num_landmarks),
- AddWeightMap(),
- ToTensor()])
-
- val_dataset = FaceLandmarksDataset(val_img_dir, val_landmarks_dir,
- num_landmarks=num_landmarks,
- gray_scale=gray_scale,
- detect_face=detect_face,
- enhance=enhance,
- transform=val_transforms)
-
- val_dataloader = torch.utils.data.DataLoader(val_dataset,
- batch_size=batch_size,
- shuffle=False,
- num_workers=6)
- data_loaders = {'val': val_dataloader}
- dataset_sizes = {}
- dataset_sizes['val'] = len(val_dataset)
- return data_loaders, dataset_sizes
diff --git a/marlenezw/audio-driven-animations/MakeItTalk/AdaptiveWingLoss/core/evaler.py b/marlenezw/audio-driven-animations/MakeItTalk/AdaptiveWingLoss/core/evaler.py
deleted file mode 100644
index e5f5946e7eb0a097aba691beb573340124e53e42..0000000000000000000000000000000000000000
--- a/marlenezw/audio-driven-animations/MakeItTalk/AdaptiveWingLoss/core/evaler.py
+++ /dev/null
@@ -1,151 +0,0 @@
-import matplotlib
-matplotlib.use('Agg')
-import math
-import torch
-import copy
-import time
-from torch.autograd import Variable
-import shutil
-from skimage import io
-import numpy as np
-from utils.utils import fan_NME, show_landmarks, get_preds_fromhm
-from PIL import Image, ImageDraw
-import os
-import sys
-import cv2
-import matplotlib.pyplot as plt
-
-
-device = torch.device("cuda:0" if torch.cuda.is_available() else "cpu")
-
-def eval_model(model, dataloaders, dataset_sizes,
- writer, use_gpu=True, epoches=5, dataset='val',
- save_path='./', num_landmarks=68):
- global_nme = 0
- model.eval()
- for epoch in range(epoches):
- running_loss = 0
- step = 0
- total_nme = 0
- total_count = 0
- fail_count = 0
- nmes = []
- # running_corrects = 0
-
- # Iterate over data.
- with torch.no_grad():
- for data in dataloaders[dataset]:
- total_runtime = 0
- run_count = 0
- step_start = time.time()
- step += 1
- # get the inputs
- inputs = data['image'].type(torch.FloatTensor)
- labels_heatmap = data['heatmap'].type(torch.FloatTensor)
- labels_boundary = data['boundary'].type(torch.FloatTensor)
- landmarks = data['landmarks'].type(torch.FloatTensor)
- loss_weight_map = data['weight_map'].type(torch.FloatTensor)
- # wrap them in Variable
- if use_gpu:
- inputs = inputs.to(device)
- labels_heatmap = labels_heatmap.to(device)
- labels_boundary = labels_boundary.to(device)
- loss_weight_map = loss_weight_map.to(device)
- else:
- inputs, labels_heatmap = Variable(inputs), Variable(labels_heatmap)
- labels_boundary = Variable(labels_boundary)
- labels = torch.cat((labels_heatmap, labels_boundary), 1)
- single_start = time.time()
- outputs, boundary_channels = model(inputs)
- single_end = time.time()
- total_runtime += time.time() - single_start
- run_count += 1
- step_end = time.time()
- for i in range(inputs.shape[0]):
- print(inputs.shape)
- img = inputs[i]
- img = img.cpu().numpy()
- img = img.transpose((1, 2, 0)) #*255.0
- # img = img.astype(np.uint8)
- # img = Image.fromarray(img)
- # pred_heatmap = outputs[-1][i].detach().cpu()[:-1, :, :]
- pred_heatmap = outputs[-1][:, :-1, :, :][i].detach().cpu()
- pred_landmarks, _ = get_preds_fromhm(pred_heatmap.unsqueeze(0))
- pred_landmarks = pred_landmarks.squeeze().numpy()
-
- gt_landmarks = data['landmarks'][i].numpy()
- print(pred_landmarks, gt_landmarks)
- import cv2
- while(True):
- imgshow = vis_landmark_on_img(cv2.UMat(img), pred_landmarks*4)
- cv2.imshow('img', imgshow)
-
- if(cv2.waitKey(10) == ord('q')):
- break
-
-
- if num_landmarks == 68:
- left_eye = np.average(gt_landmarks[36:42], axis=0)
- right_eye = np.average(gt_landmarks[42:48], axis=0)
- norm_factor = np.linalg.norm(left_eye - right_eye)
- # norm_factor = np.linalg.norm(gt_landmarks[36]- gt_landmarks[45])
-
- elif num_landmarks == 98:
- norm_factor = np.linalg.norm(gt_landmarks[60]- gt_landmarks[72])
- elif num_landmarks == 19:
- left, top = gt_landmarks[-2, :]
- right, bottom = gt_landmarks[-1, :]
- norm_factor = math.sqrt(abs(right - left)*abs(top-bottom))
- gt_landmarks = gt_landmarks[:-2, :]
- elif num_landmarks == 29:
- # norm_factor = np.linalg.norm(gt_landmarks[8]- gt_landmarks[9])
- norm_factor = np.linalg.norm(gt_landmarks[16]- gt_landmarks[17])
- single_nme = (np.sum(np.linalg.norm(pred_landmarks*4 - gt_landmarks, axis=1)) / pred_landmarks.shape[0]) / norm_factor
-
- nmes.append(single_nme)
- total_count += 1
- if single_nme > 0.1:
- fail_count += 1
- if step % 10 == 0:
- print('Step {} Time: {:.6f} Input Mean: {:.6f} Output Mean: {:.6f}'.format(
- step, step_end - step_start,
- torch.mean(labels),
- torch.mean(outputs[0])))
- # gt_landmarks = landmarks.numpy()
- # pred_heatmap = outputs[-1].to('cpu').numpy()
- gt_landmarks = landmarks
- batch_nme = fan_NME(outputs[-1][:, :-1, :, :].detach().cpu(), gt_landmarks, num_landmarks)
- # batch_nme = 0
- total_nme += batch_nme
- epoch_nme = total_nme / dataset_sizes['val']
- global_nme += epoch_nme
- nme_save_path = os.path.join(save_path, 'nme_log.npy')
- np.save(nme_save_path, np.array(nmes))
- print('NME: {:.6f} Failure Rate: {:.6f} Total Count: {:.6f} Fail Count: {:.6f}'.format(epoch_nme, fail_count/total_count, total_count, fail_count))
- print('Evaluation done! Average NME: {:.6f}'.format(global_nme/epoches))
- print('Everage runtime for a single batch: {:.6f}'.format(total_runtime/run_count))
- return model
-
-
-def vis_landmark_on_img(img, shape, linewidth=2):
- '''
- Visualize landmark on images.
- '''
-
- def draw_curve(idx_list, color=(0, 255, 0), loop=False, lineWidth=linewidth):
- for i in idx_list:
- cv2.line(img, (shape[i, 0], shape[i, 1]), (shape[i + 1, 0], shape[i + 1, 1]), color, lineWidth)
- if (loop):
- cv2.line(img, (shape[idx_list[0], 0], shape[idx_list[0], 1]),
- (shape[idx_list[-1] + 1, 0], shape[idx_list[-1] + 1, 1]), color, lineWidth)
-
- draw_curve(list(range(0, 32))) # jaw
- draw_curve(list(range(33, 41)), color=(0, 0, 255), loop=True) # eye brow
- draw_curve(list(range(42, 50)), color=(0, 0, 255), loop=True)
- draw_curve(list(range(51, 59))) # nose
- draw_curve(list(range(60, 67)), loop=True) # eyes
- draw_curve(list(range(68, 75)), loop=True)
- draw_curve(list(range(76, 87)), loop=True, color=(0, 255, 255)) # mouth
- draw_curve(list(range(88, 95)), loop=True, color=(255, 255, 0))
-
- return img
\ No newline at end of file
diff --git a/marlenezw/audio-driven-animations/MakeItTalk/AdaptiveWingLoss/core/models.py b/marlenezw/audio-driven-animations/MakeItTalk/AdaptiveWingLoss/core/models.py
deleted file mode 100644
index c3d77c1b0eefcaaa20b47c8ce74a9696180803ac..0000000000000000000000000000000000000000
--- a/marlenezw/audio-driven-animations/MakeItTalk/AdaptiveWingLoss/core/models.py
+++ /dev/null
@@ -1,228 +0,0 @@
-import torch
-import torch.nn as nn
-import torch.nn.functional as F
-import math
-from core.coord_conv import CoordConvTh
-
-
-def conv3x3(in_planes, out_planes, strd=1, padding=1,
- bias=False,dilation=1):
- "3x3 convolution with padding"
- return nn.Conv2d(in_planes, out_planes, kernel_size=3,
- stride=strd, padding=padding, bias=bias,
- dilation=dilation)
-
-class BasicBlock(nn.Module):
- expansion = 1
-
- def __init__(self, inplanes, planes, stride=1, downsample=None):
- super(BasicBlock, self).__init__()
- self.conv1 = conv3x3(inplanes, planes, stride)
- # self.bn1 = nn.BatchNorm2d(planes)
- self.relu = nn.ReLU(inplace=True)
- self.conv2 = conv3x3(planes, planes)
- # self.bn2 = nn.BatchNorm2d(planes)
- self.downsample = downsample
- self.stride = stride
-
- def forward(self, x):
- residual = x
-
- out = self.conv1(x)
- # out = self.bn1(out)
- out = self.relu(out)
-
- out = self.conv2(out)
- # out = self.bn2(out)
-
- if self.downsample is not None:
- residual = self.downsample(x)
-
- out += residual
- out = self.relu(out)
-
- return out
-
-class ConvBlock(nn.Module):
- def __init__(self, in_planes, out_planes):
- super(ConvBlock, self).__init__()
- self.bn1 = nn.BatchNorm2d(in_planes)
- self.conv1 = conv3x3(in_planes, int(out_planes / 2))
- self.bn2 = nn.BatchNorm2d(int(out_planes / 2))
- self.conv2 = conv3x3(int(out_planes / 2), int(out_planes / 4),
- padding=1, dilation=1)
- self.bn3 = nn.BatchNorm2d(int(out_planes / 4))
- self.conv3 = conv3x3(int(out_planes / 4), int(out_planes / 4),
- padding=1, dilation=1)
-
- if in_planes != out_planes:
- self.downsample = nn.Sequential(
- nn.BatchNorm2d(in_planes),
- nn.ReLU(True),
- nn.Conv2d(in_planes, out_planes,
- kernel_size=1, stride=1, bias=False),
- )
- else:
- self.downsample = None
-
- def forward(self, x):
- residual = x
-
- out1 = self.bn1(x)
- out1 = F.relu(out1, True)
- out1 = self.conv1(out1)
-
- out2 = self.bn2(out1)
- out2 = F.relu(out2, True)
- out2 = self.conv2(out2)
-
- out3 = self.bn3(out2)
- out3 = F.relu(out3, True)
- out3 = self.conv3(out3)
-
- out3 = torch.cat((out1, out2, out3), 1)
-
- if self.downsample is not None:
- residual = self.downsample(residual)
-
- out3 += residual
-
- return out3
-
-class HourGlass(nn.Module):
- def __init__(self, num_modules, depth, num_features, first_one=False):
- super(HourGlass, self).__init__()
- self.num_modules = num_modules
- self.depth = depth
- self.features = num_features
- self.coordconv = CoordConvTh(x_dim=64, y_dim=64,
- with_r=True, with_boundary=True,
- in_channels=256, first_one=first_one,
- out_channels=256,
- kernel_size=1,
- stride=1, padding=0)
- self._generate_network(self.depth)
-
- def _generate_network(self, level):
- self.add_module('b1_' + str(level), ConvBlock(256, 256))
-
- self.add_module('b2_' + str(level), ConvBlock(256, 256))
-
- if level > 1:
- self._generate_network(level - 1)
- else:
- self.add_module('b2_plus_' + str(level), ConvBlock(256, 256))
-
- self.add_module('b3_' + str(level), ConvBlock(256, 256))
-
- def _forward(self, level, inp):
- # Upper branch
- up1 = inp
- up1 = self._modules['b1_' + str(level)](up1)
-
- # Lower branch
- low1 = F.avg_pool2d(inp, 2, stride=2)
- low1 = self._modules['b2_' + str(level)](low1)
-
- if level > 1:
- low2 = self._forward(level - 1, low1)
- else:
- low2 = low1
- low2 = self._modules['b2_plus_' + str(level)](low2)
-
- low3 = low2
- low3 = self._modules['b3_' + str(level)](low3)
-
- up2 = F.upsample(low3, scale_factor=2, mode='nearest')
-
- return up1 + up2
-
- def forward(self, x, heatmap):
- x, last_channel = self.coordconv(x, heatmap)
- return self._forward(self.depth, x), last_channel
-
-class FAN(nn.Module):
-
- def __init__(self, num_modules=1, end_relu=False, gray_scale=False,
- num_landmarks=68):
- super(FAN, self).__init__()
- self.num_modules = num_modules
- self.gray_scale = gray_scale
- self.end_relu = end_relu
- self.num_landmarks = num_landmarks
-
- # Base part
- if self.gray_scale:
- self.conv1 = CoordConvTh(x_dim=256, y_dim=256,
- with_r=True, with_boundary=False,
- in_channels=3, out_channels=64,
- kernel_size=7,
- stride=2, padding=3)
- else:
- self.conv1 = CoordConvTh(x_dim=256, y_dim=256,
- with_r=True, with_boundary=False,
- in_channels=3, out_channels=64,
- kernel_size=7,
- stride=2, padding=3)
- self.bn1 = nn.BatchNorm2d(64)
- self.conv2 = ConvBlock(64, 128)
- self.conv3 = ConvBlock(128, 128)
- self.conv4 = ConvBlock(128, 256)
-
- # Stacking part
- for hg_module in range(self.num_modules):
- if hg_module == 0:
- first_one = True
- else:
- first_one = False
- self.add_module('m' + str(hg_module), HourGlass(1, 4, 256,
- first_one))
- self.add_module('top_m_' + str(hg_module), ConvBlock(256, 256))
- self.add_module('conv_last' + str(hg_module),
- nn.Conv2d(256, 256, kernel_size=1, stride=1, padding=0))
- self.add_module('bn_end' + str(hg_module), nn.BatchNorm2d(256))
- self.add_module('l' + str(hg_module), nn.Conv2d(256,
- num_landmarks+1, kernel_size=1, stride=1, padding=0))
-
- if hg_module < self.num_modules - 1:
- self.add_module(
- 'bl' + str(hg_module), nn.Conv2d(256, 256, kernel_size=1, stride=1, padding=0))
- self.add_module('al' + str(hg_module), nn.Conv2d(num_landmarks+1,
- 256, kernel_size=1, stride=1, padding=0))
-
- def forward(self, x):
- x, _ = self.conv1(x)
- x = F.relu(self.bn1(x), True)
- # x = F.relu(self.bn1(self.conv1(x)), True)
- x = F.avg_pool2d(self.conv2(x), 2, stride=2)
- x = self.conv3(x)
- x = self.conv4(x)
-
- previous = x
-
- outputs = []
- boundary_channels = []
- tmp_out = None
- for i in range(self.num_modules):
- hg, boundary_channel = self._modules['m' + str(i)](previous,
- tmp_out)
-
- ll = hg
- ll = self._modules['top_m_' + str(i)](ll)
-
- ll = F.relu(self._modules['bn_end' + str(i)]
- (self._modules['conv_last' + str(i)](ll)), True)
-
- # Predict heatmaps
- tmp_out = self._modules['l' + str(i)](ll)
- if self.end_relu:
- tmp_out = F.relu(tmp_out) # HACK: Added relu
- outputs.append(tmp_out)
- boundary_channels.append(boundary_channel)
-
- if i < self.num_modules - 1:
- ll = self._modules['bl' + str(i)](ll)
- tmp_out_ = self._modules['al' + str(i)](tmp_out)
- previous = previous + ll + tmp_out_
-
- return outputs, boundary_channels
diff --git a/marlenezw/audio-driven-animations/MakeItTalk/AdaptiveWingLoss/eval.py b/marlenezw/audio-driven-animations/MakeItTalk/AdaptiveWingLoss/eval.py
deleted file mode 100644
index 1236d05a337e8f56468130e4d1e74f4ee3d820a8..0000000000000000000000000000000000000000
--- a/marlenezw/audio-driven-animations/MakeItTalk/AdaptiveWingLoss/eval.py
+++ /dev/null
@@ -1,77 +0,0 @@
-from __future__ import print_function, division
-import torch
-import argparse
-import numpy as np
-import torch.nn as nn
-import time
-import os
-from core.evaler import eval_model
-from core.dataloader import get_dataset
-from core import models
-from tensorboardX import SummaryWriter
-
-# Parse arguments
-parser = argparse.ArgumentParser()
-# Dataset paths
-parser.add_argument('--val_img_dir', type=str,
- help='Validation image directory')
-parser.add_argument('--val_landmarks_dir', type=str,
- help='Validation landmarks directory')
-parser.add_argument('--num_landmarks', type=int, default=68,
- help='Number of landmarks')
-
-# Checkpoint and pretrained weights
-parser.add_argument('--ckpt_save_path', type=str,
- help='a directory to save checkpoint file')
-parser.add_argument('--pretrained_weights', type=str,
- help='a directory to save pretrained_weights')
-
-# Eval options
-parser.add_argument('--batch_size', type=int, default=25,
- help='learning rate decay after each epoch')
-
-# Network parameters
-parser.add_argument('--hg_blocks', type=int, default=4,
- help='Number of HG blocks to stack')
-parser.add_argument('--gray_scale', type=str, default="False",
- help='Whether to convert RGB image into gray scale during training')
-parser.add_argument('--end_relu', type=str, default="False",
- help='Whether to add relu at the end of each HG module')
-
-args = parser.parse_args()
-
-VAL_IMG_DIR = args.val_img_dir
-VAL_LANDMARKS_DIR = args.val_landmarks_dir
-CKPT_SAVE_PATH = args.ckpt_save_path
-BATCH_SIZE = args.batch_size
-PRETRAINED_WEIGHTS = args.pretrained_weights
-GRAY_SCALE = False if args.gray_scale == 'False' else True
-HG_BLOCKS = args.hg_blocks
-END_RELU = False if args.end_relu == 'False' else True
-NUM_LANDMARKS = args.num_landmarks
-
-device = torch.device("cuda:0" if torch.cuda.is_available() else "cpu")
-
-writer = SummaryWriter(CKPT_SAVE_PATH)
-
-dataloaders, dataset_sizes = get_dataset(VAL_IMG_DIR, VAL_LANDMARKS_DIR,
- BATCH_SIZE, NUM_LANDMARKS)
-use_gpu = torch.cuda.is_available()
-model_ft = models.FAN(HG_BLOCKS, END_RELU, GRAY_SCALE, NUM_LANDMARKS)
-
-if PRETRAINED_WEIGHTS != "None":
- checkpoint = torch.load(PRETRAINED_WEIGHTS)
- if 'state_dict' not in checkpoint:
- model_ft.load_state_dict(checkpoint)
- else:
- pretrained_weights = checkpoint['state_dict']
- model_weights = model_ft.state_dict()
- pretrained_weights = {k: v for k, v in pretrained_weights.items() \
- if k in model_weights}
- model_weights.update(pretrained_weights)
- model_ft.load_state_dict(model_weights)
-
-model_ft = model_ft.to(device)
-
-model_ft = eval_model(model_ft, dataloaders, dataset_sizes, writer, use_gpu, 1, 'val', CKPT_SAVE_PATH, NUM_LANDMARKS)
-
diff --git a/marlenezw/audio-driven-animations/MakeItTalk/AdaptiveWingLoss/images/wflw.png b/marlenezw/audio-driven-animations/MakeItTalk/AdaptiveWingLoss/images/wflw.png
deleted file mode 100644
index 86cb74076379f8b4399ef8f049a2394e797acbae..0000000000000000000000000000000000000000
--- a/marlenezw/audio-driven-animations/MakeItTalk/AdaptiveWingLoss/images/wflw.png
+++ /dev/null
@@ -1,3 +0,0 @@
-version https://git-lfs.github.com/spec/v1
-oid sha256:354babe46beeec86fc8a9f64c57a1dad0ec19ff23f455ac3405321bab473ce23
-size 2948110
diff --git a/marlenezw/audio-driven-animations/MakeItTalk/AdaptiveWingLoss/images/wflw_table.png b/marlenezw/audio-driven-animations/MakeItTalk/AdaptiveWingLoss/images/wflw_table.png
deleted file mode 100644
index 8b850ff0ca5cbf277dcc991edd584ac5f7cc8983..0000000000000000000000000000000000000000
--- a/marlenezw/audio-driven-animations/MakeItTalk/AdaptiveWingLoss/images/wflw_table.png
+++ /dev/null
@@ -1,3 +0,0 @@
-version https://git-lfs.github.com/spec/v1
-oid sha256:87c9ea0af4854681b6fc5e911ac38042ca5099098146501f20b64a6457a9d98b
-size 1085129
diff --git a/marlenezw/audio-driven-animations/MakeItTalk/AdaptiveWingLoss/requirements.txt b/marlenezw/audio-driven-animations/MakeItTalk/AdaptiveWingLoss/requirements.txt
deleted file mode 100644
index fa6fe11e90facd05c5da179b036adca36dc9e485..0000000000000000000000000000000000000000
--- a/marlenezw/audio-driven-animations/MakeItTalk/AdaptiveWingLoss/requirements.txt
+++ /dev/null
@@ -1,12 +0,0 @@
-opencv-python
-scipy>=0.17.0
-scikit-image
-numpy
-matplotlib
-Pillow>=4.3.0
-imgaug
-tensorflow
-git+https://github.com/lanpa/tensorboardX
-joblib
-torch==1.3.0
-torchvision==0.4.1
diff --git a/marlenezw/audio-driven-animations/MakeItTalk/AdaptiveWingLoss/scripts/eval_wflw.sh b/marlenezw/audio-driven-animations/MakeItTalk/AdaptiveWingLoss/scripts/eval_wflw.sh
deleted file mode 100644
index 7a3bc305b5f80229b22470befad6093c388feb67..0000000000000000000000000000000000000000
--- a/marlenezw/audio-driven-animations/MakeItTalk/AdaptiveWingLoss/scripts/eval_wflw.sh
+++ /dev/null
@@ -1,10 +0,0 @@
-CUDA_VISIBLE_DEVICES=1 python ../eval.py \
- --val_img_dir='../dataset/WFLW_test/images/' \
- --val_landmarks_dir='../dataset/WFLW_test/landmarks/' \
- --ckpt_save_path='../experiments/eval_iccv_0620' \
- --hg_blocks=4 \
- --pretrained_weights='../ckpt/WFLW_4HG.pth' \
- --num_landmarks=98 \
- --end_relu='False' \
- --batch_size=20 \
-
diff --git a/marlenezw/audio-driven-animations/MakeItTalk/AdaptiveWingLoss/utils/__init__.py b/marlenezw/audio-driven-animations/MakeItTalk/AdaptiveWingLoss/utils/__init__.py
deleted file mode 100644
index e69de29bb2d1d6434b8b29ae775ad8c2e48c5391..0000000000000000000000000000000000000000
diff --git a/marlenezw/audio-driven-animations/MakeItTalk/AdaptiveWingLoss/utils/__pycache__/__init__.cpython-37.pyc b/marlenezw/audio-driven-animations/MakeItTalk/AdaptiveWingLoss/utils/__pycache__/__init__.cpython-37.pyc
deleted file mode 100644
index b0556c65f7389b35d02973841835f368f426b4be..0000000000000000000000000000000000000000
Binary files a/marlenezw/audio-driven-animations/MakeItTalk/AdaptiveWingLoss/utils/__pycache__/__init__.cpython-37.pyc and /dev/null differ
diff --git a/marlenezw/audio-driven-animations/MakeItTalk/AdaptiveWingLoss/utils/__pycache__/__init__.cpython-39.pyc b/marlenezw/audio-driven-animations/MakeItTalk/AdaptiveWingLoss/utils/__pycache__/__init__.cpython-39.pyc
deleted file mode 100644
index b556ccb6ebc2464dc369be6f284b4f287a291c95..0000000000000000000000000000000000000000
Binary files a/marlenezw/audio-driven-animations/MakeItTalk/AdaptiveWingLoss/utils/__pycache__/__init__.cpython-39.pyc and /dev/null differ
diff --git a/marlenezw/audio-driven-animations/MakeItTalk/AdaptiveWingLoss/utils/__pycache__/utils.cpython-37.pyc b/marlenezw/audio-driven-animations/MakeItTalk/AdaptiveWingLoss/utils/__pycache__/utils.cpython-37.pyc
deleted file mode 100644
index 8ca7f1dafd2653a1a53a6236f6f1a357543c26de..0000000000000000000000000000000000000000
Binary files a/marlenezw/audio-driven-animations/MakeItTalk/AdaptiveWingLoss/utils/__pycache__/utils.cpython-37.pyc and /dev/null differ
diff --git a/marlenezw/audio-driven-animations/MakeItTalk/AdaptiveWingLoss/utils/__pycache__/utils.cpython-39.pyc b/marlenezw/audio-driven-animations/MakeItTalk/AdaptiveWingLoss/utils/__pycache__/utils.cpython-39.pyc
deleted file mode 100644
index 7864c6741a8b07827bedec204f8a480a603d41c0..0000000000000000000000000000000000000000
Binary files a/marlenezw/audio-driven-animations/MakeItTalk/AdaptiveWingLoss/utils/__pycache__/utils.cpython-39.pyc and /dev/null differ
diff --git a/marlenezw/audio-driven-animations/MakeItTalk/AdaptiveWingLoss/utils/utils.py b/marlenezw/audio-driven-animations/MakeItTalk/AdaptiveWingLoss/utils/utils.py
deleted file mode 100644
index 8fbad7b9739fe89330bec1f6e3dd07f27e33d4c0..0000000000000000000000000000000000000000
--- a/marlenezw/audio-driven-animations/MakeItTalk/AdaptiveWingLoss/utils/utils.py
+++ /dev/null
@@ -1,354 +0,0 @@
-from __future__ import print_function, division
-import os
-import sys
-import math
-from functools import reduce  # used by transform() below; a builtin in Python 2 but not in Python 3
-import torch
-import cv2
-from PIL import Image
-from skimage import io
-from skimage import transform as ski_transform
-from scipy import ndimage
-import numpy as np
-import matplotlib
-import matplotlib.pyplot as plt
-from torch.utils.data import Dataset, DataLoader
-from torchvision import transforms, utils
-
-def _gaussian(
- size=3, sigma=0.25, amplitude=1, normalize=False, width=None,
- height=None, sigma_horz=None, sigma_vert=None, mean_horz=0.5,
- mean_vert=0.5):
- # handle some defaults
- if width is None:
- width = size
- if height is None:
- height = size
- if sigma_horz is None:
- sigma_horz = sigma
- if sigma_vert is None:
- sigma_vert = sigma
- center_x = mean_horz * width + 0.5
- center_y = mean_vert * height + 0.5
- gauss = np.empty((height, width), dtype=np.float32)
- # generate kernel
- for i in range(height):
- for j in range(width):
- gauss[i][j] = amplitude * math.exp(-(math.pow((j + 1 - center_x) / (
- sigma_horz * width), 2) / 2.0 + math.pow((i + 1 - center_y) / (sigma_vert * height), 2) / 2.0))
- if normalize:
- gauss = gauss / np.sum(gauss)
- return gauss
-
-def draw_gaussian(image, point, sigma):
- # Check if the gaussian is inside
- ul = [np.floor(np.floor(point[0]) - 3 * sigma),
- np.floor(np.floor(point[1]) - 3 * sigma)]
- br = [np.floor(np.floor(point[0]) + 3 * sigma),
- np.floor(np.floor(point[1]) + 3 * sigma)]
- if (ul[0] > image.shape[1] or ul[1] >
- image.shape[0] or br[0] < 1 or br[1] < 1):
- return image
- size = 6 * sigma + 1
- g = _gaussian(size)
- g_x = [int(max(1, -ul[0])), int(min(br[0], image.shape[1])) -
- int(max(1, ul[0])) + int(max(1, -ul[0]))]
- g_y = [int(max(1, -ul[1])), int(min(br[1], image.shape[0])) -
- int(max(1, ul[1])) + int(max(1, -ul[1]))]
- img_x = [int(max(1, ul[0])), int(min(br[0], image.shape[1]))]
- img_y = [int(max(1, ul[1])), int(min(br[1], image.shape[0]))]
- assert (g_x[0] > 0 and g_y[1] > 0)
- correct = False
- while not correct:
- try:
- image[img_y[0] - 1:img_y[1], img_x[0] - 1:img_x[1]
- ] = image[img_y[0] - 1:img_y[1], img_x[0] - 1:img_x[1]] + g[g_y[0] - 1:g_y[1], g_x[0] - 1:g_x[1]]
- correct = True
- except:
- print('img_x: {}, img_y: {}, g_x:{}, g_y:{}, point:{}, g_shape:{}, ul:{}, br:{}'.format(img_x, img_y, g_x, g_y, point, g.shape, ul, br))
- ul = [np.floor(np.floor(point[0]) - 3 * sigma),
- np.floor(np.floor(point[1]) - 3 * sigma)]
- br = [np.floor(np.floor(point[0]) + 3 * sigma),
- np.floor(np.floor(point[1]) + 3 * sigma)]
- g_x = [int(max(1, -ul[0])), int(min(br[0], image.shape[1])) -
- int(max(1, ul[0])) + int(max(1, -ul[0]))]
- g_y = [int(max(1, -ul[1])), int(min(br[1], image.shape[0])) -
- int(max(1, ul[1])) + int(max(1, -ul[1]))]
- img_x = [int(max(1, ul[0])), int(min(br[0], image.shape[1]))]
- img_y = [int(max(1, ul[1])), int(min(br[1], image.shape[0]))]
- pass
- image[image > 1] = 1
- return image
-
-def transform(point, center, scale, resolution, rotation=0, invert=False):
- _pt = np.ones(3)
- _pt[0] = point[0]
- _pt[1] = point[1]
-
- h = 200.0 * scale
- t = np.eye(3)
- t[0, 0] = resolution / h
- t[1, 1] = resolution / h
- t[0, 2] = resolution * (-center[0] / h + 0.5)
- t[1, 2] = resolution * (-center[1] / h + 0.5)
-
- if rotation != 0:
- rotation = -rotation
- r = np.eye(3)
- ang = rotation * math.pi / 180.0
- s = math.sin(ang)
- c = math.cos(ang)
- r[0][0] = c
- r[0][1] = -s
- r[1][0] = s
- r[1][1] = c
-
- t_ = np.eye(3)
- t_[0][2] = -resolution / 2.0
- t_[1][2] = -resolution / 2.0
- t_inv = torch.eye(3)
- t_inv[0][2] = resolution / 2.0
- t_inv[1][2] = resolution / 2.0
- t = reduce(np.matmul, [t_inv, r, t_, t])
-
- if invert:
- t = np.linalg.inv(t)
- new_point = (np.matmul(t, _pt))[0:2]
-
- return new_point.astype(int)
-
-def cv_crop(image, landmarks, center, scale, resolution=256, center_shift=0):
- new_image = cv2.copyMakeBorder(image, center_shift,
- center_shift,
- center_shift,
- center_shift,
- cv2.BORDER_CONSTANT, value=[0,0,0])
- new_landmarks = landmarks.copy()
- if center_shift != 0:
- center[0] += center_shift
- center[1] += center_shift
- new_landmarks = new_landmarks + center_shift
- length = 200 * scale
- top = int(center[1] - length // 2)
- bottom = int(center[1] + length // 2)
- left = int(center[0] - length // 2)
- right = int(center[0] + length // 2)
- y_pad = abs(min(top, new_image.shape[0] - bottom, 0))
- x_pad = abs(min(left, new_image.shape[1] - right, 0))
- top, bottom, left, right = top + y_pad, bottom + y_pad, left + x_pad, right + x_pad
- new_image = cv2.copyMakeBorder(new_image, y_pad,
- y_pad,
- x_pad,
- x_pad,
- cv2.BORDER_CONSTANT, value=[0,0,0])
- new_image = new_image[top:bottom, left:right]
- new_image = cv2.resize(new_image, dsize=(int(resolution), int(resolution)),
- interpolation=cv2.INTER_LINEAR)
- new_landmarks[:, 0] = (new_landmarks[:, 0] + x_pad - left) * resolution / length
- new_landmarks[:, 1] = (new_landmarks[:, 1] + y_pad - top) * resolution / length
- return new_image, new_landmarks
-
-def cv_rotate(image, landmarks, heatmap, rot, scale, resolution=256):
- img_mat = cv2.getRotationMatrix2D((resolution//2, resolution//2), rot, scale)
- ones = np.ones(shape=(landmarks.shape[0], 1))
- stacked_landmarks = np.hstack([landmarks, ones])
- new_landmarks = img_mat.dot(stacked_landmarks.T).T
- if np.max(new_landmarks) > 255 or np.min(new_landmarks) < 0:
- return image, landmarks, heatmap
- else:
- new_image = cv2.warpAffine(image, img_mat, (resolution, resolution))
- if heatmap is not None:
- new_heatmap = np.zeros((heatmap.shape[0], 64, 64))
- for i in range(heatmap.shape[0]):
- if new_landmarks[i][0] > 0:
- new_heatmap[i] = draw_gaussian(new_heatmap[i],
- new_landmarks[i]/4.0+1, 1)
- return new_image, new_landmarks, new_heatmap
-
-def show_landmarks(image, heatmap, gt_landmarks, gt_heatmap):
- """Show image with pred_landmarks"""
- pred_landmarks = []
- pred_landmarks, _ = get_preds_fromhm(torch.from_numpy(heatmap).unsqueeze(0))
- pred_landmarks = pred_landmarks.squeeze()*4
-
- # pred_landmarks2 = get_preds_fromhm2(heatmap)
- heatmap = np.max(gt_heatmap, axis=0)
- heatmap = heatmap / np.max(heatmap)
- # image = ski_transform.resize(image, (64, 64))*255
- image = image.astype(np.uint8)
- heatmap = np.max(gt_heatmap, axis=0)
- heatmap = ski_transform.resize(heatmap, (image.shape[0], image.shape[1]))
- heatmap *= 255
- heatmap = heatmap.astype(np.uint8)
- heatmap = cv2.applyColorMap(heatmap, cv2.COLORMAP_JET)
- plt.imshow(image)
- plt.scatter(gt_landmarks[:, 0], gt_landmarks[:, 1], s=0.5, marker='.', c='g')
- plt.scatter(pred_landmarks[:, 0], pred_landmarks[:, 1], s=0.5, marker='.', c='r')
- plt.pause(0.001) # pause a bit so that plots are updated
-
-def fan_NME(pred_heatmaps, gt_landmarks, num_landmarks=68):
- '''
- Calculate total NME for a batch of data
-
- Args:
- pred_heatmaps: torch tensor of size [batch, points, height, width]
- gt_landmarks: torch tesnsor of size [batch, points, x, y]
-
- Returns:
- nme: sum of nme for this batch
- '''
- nme = 0
- pred_landmarks, _ = get_preds_fromhm(pred_heatmaps)
- pred_landmarks = pred_landmarks.numpy()
- gt_landmarks = gt_landmarks.numpy()
- for i in range(pred_landmarks.shape[0]):
- pred_landmark = pred_landmarks[i] * 4.0
- gt_landmark = gt_landmarks[i]
-
- if num_landmarks == 68:
- left_eye = np.average(gt_landmark[36:42], axis=0)
- right_eye = np.average(gt_landmark[42:48], axis=0)
- norm_factor = np.linalg.norm(left_eye - right_eye)
- # norm_factor = np.linalg.norm(gt_landmark[36]- gt_landmark[45])
- elif num_landmarks == 98:
- norm_factor = np.linalg.norm(gt_landmark[60]- gt_landmark[72])
- elif num_landmarks == 19:
- left, top = gt_landmark[-2, :]
- right, bottom = gt_landmark[-1, :]
- norm_factor = math.sqrt(abs(right - left)*abs(top-bottom))
- gt_landmark = gt_landmark[:-2, :]
- elif num_landmarks == 29:
- # norm_factor = np.linalg.norm(gt_landmark[8]- gt_landmark[9])
- norm_factor = np.linalg.norm(gt_landmark[16]- gt_landmark[17])
- nme += (np.sum(np.linalg.norm(pred_landmark - gt_landmark, axis=1)) / pred_landmark.shape[0]) / norm_factor
- return nme
-
-def fan_NME_hm(pred_heatmaps, gt_heatmaps, num_landmarks=68):
- '''
- Calculate total NME for a batch of data
-
- Args:
- pred_heatmaps: torch tensor of size [batch, points, height, width]
- gt_landmarks: torch tesnsor of size [batch, points, x, y]
-
- Returns:
- nme: sum of nme for this batch
- '''
- nme = 0
- pred_landmarks, _ = get_index_fromhm(pred_heatmaps)
- pred_landmarks = pred_landmarks.numpy()
- gt_landmarks = gt_landmarks.numpy()
- for i in range(pred_landmarks.shape[0]):
- pred_landmark = pred_landmarks[i] * 4.0
- gt_landmark = gt_landmarks[i]
- if num_landmarks == 68:
- left_eye = np.average(gt_landmark[36:42], axis=0)
- right_eye = np.average(gt_landmark[42:48], axis=0)
- norm_factor = np.linalg.norm(left_eye - right_eye)
- else:
- norm_factor = np.linalg.norm(gt_landmark[60]- gt_landmark[72])
- nme += (np.sum(np.linalg.norm(pred_landmark - gt_landmark, axis=1)) / pred_landmark.shape[0]) / norm_factor
- return nme
-
-def power_transform(img, power):
- img = np.array(img)
- img_new = np.power((img/255.0), power) * 255.0
- img_new = img_new.astype(np.uint8)
- img_new = Image.fromarray(img_new)
- return img_new
-
-def get_preds_fromhm(hm, center=None, scale=None, rot=None):
- max, idx = torch.max(
- hm.view(hm.size(0), hm.size(1), hm.size(2) * hm.size(3)), 2)
- idx += 1
- preds = idx.view(idx.size(0), idx.size(1), 1).repeat(1, 1, 2).float()
- preds[..., 0].apply_(lambda x: (x - 1) % hm.size(3) + 1)
- preds[..., 1].add_(-1).div_(hm.size(2)).floor_().add_(1)
-
- for i in range(preds.size(0)):
- for j in range(preds.size(1)):
- hm_ = hm[i, j, :]
- pX, pY = int(preds[i, j, 0]) - 1, int(preds[i, j, 1]) - 1
- if pX > 0 and pX < 63 and pY > 0 and pY < 63:
- diff = torch.FloatTensor(
- [hm_[pY, pX + 1] - hm_[pY, pX - 1],
- hm_[pY + 1, pX] - hm_[pY - 1, pX]])
- preds[i, j].add_(diff.sign_().mul_(.25))
-
- preds.add_(-0.5)
-
- preds_orig = torch.zeros(preds.size())
- if center is not None and scale is not None:
- for i in range(hm.size(0)):
- for j in range(hm.size(1)):
- preds_orig[i, j] = transform(
- preds[i, j], center, scale, hm.size(2), rot, True)
-
- return preds, preds_orig
-
-def get_index_fromhm(hm):
- max, idx = torch.max(
- hm.view(hm.size(0), hm.size(1), hm.size(2) * hm.size(3)), 2)
- preds = idx.view(idx.size(0), idx.size(1), 1).repeat(1, 1, 2).float()
- preds[..., 0].remainder_(hm.size(3))
- preds[..., 1].div_(hm.size(2)).floor_()
-
- for i in range(preds.size(0)):
- for j in range(preds.size(1)):
- hm_ = hm[i, j, :]
- pX, pY = int(preds[i, j, 0]), int(preds[i, j, 1])
- if pX > 0 and pX < 63 and pY > 0 and pY < 63:
- diff = torch.FloatTensor(
- [hm_[pY, pX + 1] - hm_[pY, pX - 1],
- hm_[pY + 1, pX] - hm_[pY - 1, pX]])
- preds[i, j].add_(diff.sign_().mul_(.25))
-
- return preds
-
-def shuffle_lr(parts, num_landmarks=68, pairs=None):
- if num_landmarks == 68:
- if pairs is None:
- pairs = [[0, 16], [1, 15], [2, 14], [3, 13], [4, 12], [5, 11], [6, 10],
- [7, 9], [17, 26], [18, 25], [19, 24], [20, 23], [21, 22], [36, 45],
- [37, 44], [38, 43], [39, 42], [41, 46], [40, 47], [31, 35], [32, 34],
- [50, 52], [49, 53], [48, 54], [61, 63], [60, 64], [67, 65], [59, 55], [58, 56]]
- elif num_landmarks == 98:
- if pairs is None:
- pairs = [[0, 32], [1,31], [2, 30], [3, 29], [4, 28], [5, 27], [6, 26], [7, 25], [8, 24], [9, 23], [10, 22], [11, 21], [12, 20], [13, 19], [14, 18], [15, 17], [33, 46], [34, 45], [35, 44], [36, 43], [37, 42], [38, 50], [39, 49], [40, 48], [41, 47], [60, 72], [61, 71], [62, 70], [63, 69], [64, 68], [65, 75], [66, 74], [67, 73], [96, 97], [55, 59], [56, 58], [76, 82], [77, 81], [78, 80], [88, 92], [89, 91], [95, 93], [87, 83], [86, 84]]
- elif num_landmarks == 19:
- if pairs is None:
- pairs = [[0, 5], [1, 4], [2, 3], [6, 11], [7, 10], [8, 9], [12, 14], [15, 17]]
- elif num_landmarks == 29:
- if pairs is None:
- pairs = [[0, 1], [4, 6], [5, 7], [2, 3], [8, 9], [12, 14], [16, 17], [13, 15], [10, 11], [18, 19], [22, 23]]
- for matched_p in pairs:
- idx1, idx2 = matched_p[0], matched_p[1]
- tmp = np.copy(parts[idx1])
- np.copyto(parts[idx1], parts[idx2])
- np.copyto(parts[idx2], tmp)
- return parts
-
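-# Assumed usage sketch (not part of the original file): shuffle_lr() only swaps the
-# left/right landmark indices; it is meant to be paired with a horizontal flip of the
-# image, with the x coordinates mirrored separately (coordinate convention assumed):
-#   flipped_img = np.fliplr(img)
-#   flipped_pts = shuffle_lr(np.copy(pts), num_landmarks=98)
-#   flipped_pts[:, 0] = img.shape[1] - 1 - flipped_pts[:, 0]
-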
-
-def generate_weight_map(weight_map, heatmap):
- # Grey-dilate the ground-truth heatmap and set the loss weight to 1 wherever the
- # dilated response exceeds 0.2, emphasising the pixels around each landmark peak.
- k_size = 3
- dilate = ndimage.grey_dilation(heatmap, size=(k_size, k_size))
- weight_map[np.where(dilate > 0.2)] = 1
- return weight_map
-
-def fig2data(fig):
- """
- @brief Convert a Matplotlib figure to a numpy array with RGB channels and return it
- @param fig a matplotlib figure
- @return a numpy 3D array of RGB values with shape (height, width, 3)
- """
- # draw the renderer
- fig.canvas.draw()
-
- # Get the RGB buffer from the figure; tostring_rgb() returns h rows of w pixels,
- # so height comes first in the array shape
- w, h = fig.canvas.get_width_height()
- buf = np.frombuffer(fig.canvas.tostring_rgb(), dtype=np.uint8)
- buf = buf.reshape(h, w, 3).copy()
- return buf
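-
-# Assumed usage sketch (not part of the original file): the network is expected to
-# output 64x64 heatmaps, get_preds_fromhm() returns the refined per-channel argmax in
-# heatmap coordinates, and multiplying by 4 maps the points back to the input crop,
-# mirroring the NME computation above. The model call is a placeholder.
-#   hm = model(images)                    # e.g. shape [batch, 98, 64, 64]
-#   preds, _ = get_preds_fromhm(hm.detach().cpu())
-#   preds_in_crop = preds * 4.0           # heatmap coords -> input-crop coords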
diff --git a/marlenezw/audio-driven-animations/MakeItTalk/__init__.py b/marlenezw/audio-driven-animations/MakeItTalk/__init__.py
deleted file mode 100644
index e69de29bb2d1d6434b8b29ae775ad8c2e48c5391..0000000000000000000000000000000000000000
diff --git a/marlenezw/audio-driven-animations/MakeItTalk/__pycache__/__init__.cpython-37.pyc b/marlenezw/audio-driven-animations/MakeItTalk/__pycache__/__init__.cpython-37.pyc
deleted file mode 100644
index b1a63f75d47c06913a6323fae9f571847259aa10..0000000000000000000000000000000000000000
Binary files a/marlenezw/audio-driven-animations/MakeItTalk/__pycache__/__init__.cpython-37.pyc and /dev/null differ
diff --git a/marlenezw/audio-driven-animations/MakeItTalk/__pycache__/__init__.cpython-39.pyc b/marlenezw/audio-driven-animations/MakeItTalk/__pycache__/__init__.cpython-39.pyc
deleted file mode 100644
index 066584c42a9e74b4ecc546a8c138e90a60323a86..0000000000000000000000000000000000000000
Binary files a/marlenezw/audio-driven-animations/MakeItTalk/__pycache__/__init__.cpython-39.pyc and /dev/null differ
diff --git a/marlenezw/audio-driven-animations/MakeItTalk/face_of_art/CODEOWNERS b/marlenezw/audio-driven-animations/MakeItTalk/face_of_art/CODEOWNERS
deleted file mode 100644
index 3b20970fac357d6301d9c4187e682ea2d174d7a9..0000000000000000000000000000000000000000
--- a/marlenezw/audio-driven-animations/MakeItTalk/face_of_art/CODEOWNERS
+++ /dev/null
@@ -1 +0,0 @@
-* @papulke
diff --git a/marlenezw/audio-driven-animations/MakeItTalk/face_of_art/LICENCE.txt b/marlenezw/audio-driven-animations/MakeItTalk/face_of_art/LICENCE.txt
deleted file mode 100644
index 02fef6bb5e96ddcf3b2d47b043033b525874835b..0000000000000000000000000000000000000000
--- a/marlenezw/audio-driven-animations/MakeItTalk/face_of_art/LICENCE.txt
+++ /dev/null
@@ -1,21 +0,0 @@
-MIT License
-
-Copyright (c) 2019 Jordan Yaniv
-
-Permission is hereby granted, free of charge, to any person obtaining a copy
-of this software and associated documentation files (the "Software"), to deal
-in the Software without restriction, including without limitation the rights
-to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
-copies of the Software, and to permit persons to whom the Software is
-furnished to do so, subject to the following conditions:
-
-The above copyright notice and this permission notice shall be included in all
-copies or substantial portions of the Software.
-
-THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
-EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
-MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.
-IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM,
-DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR
-OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE
-OR OTHER DEALINGS IN THE SOFTWARE.
diff --git a/marlenezw/audio-driven-animations/MakeItTalk/face_of_art/README.md b/marlenezw/audio-driven-animations/MakeItTalk/face_of_art/README.md
deleted file mode 100644
index de7e0cfc4c7a0bdcb60781bf6c59fa6a06eb8fa9..0000000000000000000000000000000000000000
--- a/marlenezw/audio-driven-animations/MakeItTalk/face_of_art/README.md
+++ /dev/null
@@ -1,98 +0,0 @@
-# The Face of Art: Landmark Detection and Geometric Style in Portraits
-
-Code for the landmark detection framework described in [The Face of Art: Landmark Detection and Geometric Style in Portraits](http://www.faculty.idc.ac.il/arik/site/foa/face-of-art.asp) (SIGGRAPH 2019)
-
-
-Top: landmark detection results on artistic portraits in different styles, which allow us to define an artist's geometric style. Bottom: results of portrait style transfer using the geometric styles of various artists, including Amedeo Modigliani, Pablo Picasso, Margaret Keane, Fernand Léger, and Tsuguharu Foujita. The top-right portrait is from 'Woman with Peanuts,' ©1962, Estate of Roy Lichtenstein.
-
-## Getting Started
-
-### Requirements
-
-* Python
-* Anaconda
-
-### Download
-
-#### Model
-Download the model weights from [here](https://www.dropbox.com/sh/hrxcyug1bmbj6cs/AAAxq_zI5eawcLjM8zvUwaXha?dl=0).
-
-#### Datasets
-* The datasets used for training and evaluating our model can be found [here](https://ibug.doc.ic.ac.uk/resources/facial-point-annotations/).
-
-* The Artistic-Faces dataset can be found [here](http://www.faculty.idc.ac.il/arik/site/foa/artistic-faces-dataset.asp).
-
-* Training images with texture augmentation can be found [here](https://www.dropbox.com/sh/av2k1i1082z0nie/AAC5qV1E2UkqpDLVsv7TazMta?dl=0).
- Before applying texture style transfer, the training images were cropped to the ground-truth face bounding box with a 25% margin. To crop the training images, run the script `crop_training_set.py`.
-
-* Our model expects the following directory structure for the landmark detection datasets:
-```
-landmark_detection_datasets
- ├── training
- ├── test
- ├── challenging
- ├── common
- ├── full
- ├── crop_gt_margin_0.25 (cropped images of training set)
- └── crop_gt_margin_0.25_ns (cropped images of training set + texture style transfer)
-```
-### Install
-
-Create a virtual environment and install the following:
-* opencv
-* menpo
-* menpofit
-* tensorflow-gpu
-
-For Python 2:
-```
-conda create -n foa_env python=2.7 anaconda
-source activate foa_env
-conda install -c menpo opencv
-conda install -c menpo menpo
-conda install -c menpo menpofit
-pip install tensorflow-gpu
-
-```
-
-For Python 3:
-```
-conda create -n foa_env python=3.5 anaconda
-source activate foa_env
-conda install -c menpo opencv
-conda install -c menpo menpo
-conda install -c menpo menpofit
-pip3 install tensorflow-gpu
-
-```
-
-Clone repository:
-
-```
-git clone https://github.com/papulke/deep_face_heatmaps
-```
-
-## Instructions
-
-### Training
-
-To train the network, run `train_heatmaps_network.py`.
-
-Example: training a model with texture augmentation (100% of images) and geometric augmentation (~70% of images):
-```
-python train_heatmaps_network.py --output_dir='test_artistic_aug' --augment_geom=True \
---augment_texture=True --p_texture=1. --p_geom=0.7
-```
-
-### Testing
-
-To use the detection framework for predicting landmarks, run the script `predict_landmarks.py`.
-
-## Acknowledgments
-
-* [ect](https://github.com/HongwenZhang/ECT-FaceAlignment)
-* [menpo](https://github.com/menpo/menpo)
-* [menpofit](https://github.com/menpo/menpofit)
-* [mdm](https://github.com/trigeorgis/mdm)
-* [style transfer implementation](https://github.com/woodrush/neural-art-tf)
-* [painter-by-numbers dataset](https://www.kaggle.com/c/painter-by-numbers/data)
diff --git a/marlenezw/audio-driven-animations/MakeItTalk/face_of_art/__init__.py b/marlenezw/audio-driven-animations/MakeItTalk/face_of_art/__init__.py
deleted file mode 100644
index e69de29bb2d1d6434b8b29ae775ad8c2e48c5391..0000000000000000000000000000000000000000
diff --git a/marlenezw/audio-driven-animations/MakeItTalk/face_of_art/__init__.pyc b/marlenezw/audio-driven-animations/MakeItTalk/face_of_art/__init__.pyc
deleted file mode 100644
index 0b9e3bf343ed64266613989ad81fe216c7c9b629..0000000000000000000000000000000000000000
Binary files a/marlenezw/audio-driven-animations/MakeItTalk/face_of_art/__init__.pyc and /dev/null differ
diff --git a/marlenezw/audio-driven-animations/MakeItTalk/face_of_art/__pycache__/__init__.cpython-36.pyc b/marlenezw/audio-driven-animations/MakeItTalk/face_of_art/__pycache__/__init__.cpython-36.pyc
deleted file mode 100644
index 61eb9ad048f027ecb44f3c7b0f57d3b35b3e92b9..0000000000000000000000000000000000000000
Binary files a/marlenezw/audio-driven-animations/MakeItTalk/face_of_art/__pycache__/__init__.cpython-36.pyc and /dev/null differ
diff --git a/marlenezw/audio-driven-animations/MakeItTalk/face_of_art/__pycache__/data_loading_functions.cpython-36.pyc b/marlenezw/audio-driven-animations/MakeItTalk/face_of_art/__pycache__/data_loading_functions.cpython-36.pyc
deleted file mode 100644
index 6cb2deab1fc5ad0e55627e6ff3ddb2ea8c017da8..0000000000000000000000000000000000000000
Binary files a/marlenezw/audio-driven-animations/MakeItTalk/face_of_art/__pycache__/data_loading_functions.cpython-36.pyc and /dev/null differ
diff --git a/marlenezw/audio-driven-animations/MakeItTalk/face_of_art/__pycache__/deep_heatmaps_model_fusion_net.cpython-36.pyc b/marlenezw/audio-driven-animations/MakeItTalk/face_of_art/__pycache__/deep_heatmaps_model_fusion_net.cpython-36.pyc
deleted file mode 100644
index 4fea5794088f8f6a980ab1f57c0e4a8c81f02bf5..0000000000000000000000000000000000000000
Binary files a/marlenezw/audio-driven-animations/MakeItTalk/face_of_art/__pycache__/deep_heatmaps_model_fusion_net.cpython-36.pyc and /dev/null differ
diff --git a/marlenezw/audio-driven-animations/MakeItTalk/face_of_art/__pycache__/deformation_functions.cpython-36.pyc b/marlenezw/audio-driven-animations/MakeItTalk/face_of_art/__pycache__/deformation_functions.cpython-36.pyc
deleted file mode 100644
index 4d0840412329da651d7e9f1838c3a32632fcae54..0000000000000000000000000000000000000000
Binary files a/marlenezw/audio-driven-animations/MakeItTalk/face_of_art/__pycache__/deformation_functions.cpython-36.pyc and /dev/null differ
diff --git a/marlenezw/audio-driven-animations/MakeItTalk/face_of_art/__pycache__/logging_functions.cpython-36.pyc b/marlenezw/audio-driven-animations/MakeItTalk/face_of_art/__pycache__/logging_functions.cpython-36.pyc
deleted file mode 100644
index 5c5276c59f1fa5216af03f225191ae72a0163abe..0000000000000000000000000000000000000000
Binary files a/marlenezw/audio-driven-animations/MakeItTalk/face_of_art/__pycache__/logging_functions.cpython-36.pyc and /dev/null differ
diff --git a/marlenezw/audio-driven-animations/MakeItTalk/face_of_art/__pycache__/menpo_functions.cpython-36.pyc b/marlenezw/audio-driven-animations/MakeItTalk/face_of_art/__pycache__/menpo_functions.cpython-36.pyc
deleted file mode 100644
index b6c055f4c2355a379cd06720b4499ed9086619db..0000000000000000000000000000000000000000
Binary files a/marlenezw/audio-driven-animations/MakeItTalk/face_of_art/__pycache__/menpo_functions.cpython-36.pyc and /dev/null differ
diff --git a/marlenezw/audio-driven-animations/MakeItTalk/face_of_art/__pycache__/ops.cpython-36.pyc b/marlenezw/audio-driven-animations/MakeItTalk/face_of_art/__pycache__/ops.cpython-36.pyc
deleted file mode 100644
index 0320fef778d2ebc211c5e4ea8af76e48b0b12d05..0000000000000000000000000000000000000000
Binary files a/marlenezw/audio-driven-animations/MakeItTalk/face_of_art/__pycache__/ops.cpython-36.pyc and /dev/null differ
diff --git a/marlenezw/audio-driven-animations/MakeItTalk/face_of_art/__pycache__/pdm_clm_functions.cpython-36.pyc b/marlenezw/audio-driven-animations/MakeItTalk/face_of_art/__pycache__/pdm_clm_functions.cpython-36.pyc
deleted file mode 100644
index 15d4f78a92df2b2ff9b8b488bcfb6cbcd8818a58..0000000000000000000000000000000000000000
Binary files a/marlenezw/audio-driven-animations/MakeItTalk/face_of_art/__pycache__/pdm_clm_functions.cpython-36.pyc and /dev/null differ
diff --git a/marlenezw/audio-driven-animations/MakeItTalk/face_of_art/crop_training_set.py b/marlenezw/audio-driven-animations/MakeItTalk/face_of_art/crop_training_set.py
deleted file mode 100644
index 0a6405c4194895d2614a7e05ba79558677bfd8a5..0000000000000000000000000000000000000000
--- a/marlenezw/audio-driven-animations/MakeItTalk/face_of_art/crop_training_set.py
+++ /dev/null
@@ -1,38 +0,0 @@
-from scipy.misc import imsave
-from menpo_functions import *
-from data_loading_functions import *
-
-
-# define paths & parameters for cropping dataset
-img_dir = '~/landmark_detection_datasets/'
-dataset = 'training'
-bb_type = 'gt'
-margin = 0.25
-image_size = 256
-
-# load bounding boxes
-bb_dir = os.path.join(img_dir, 'Bounding_Boxes')
-bb_dictionary = load_bb_dictionary(bb_dir, mode='TRAIN', test_data=dataset)
-
-# directory for saving face crops
-outdir = os.path.join(img_dir, 'crop_'+bb_type+'_margin_'+str(margin))
-if not os.path.exists(outdir):
- os.mkdir(outdir)
-
-# load images
-imgs_to_crop = load_menpo_image_list(
- img_dir=img_dir, train_crop_dir=None, img_dir_ns=None, mode='TRAIN', bb_dictionary=bb_dictionary,
- image_size=image_size, margin=margin, bb_type=bb_type, augment_basic=False)
-
-# save cropped images with matching landmarks
-print ("\ncropping dataset from: "+os.path.join(img_dir, dataset))
-print ("\nsaving cropped dataset to: "+outdir)
-for im in imgs_to_crop:
- if im.pixels.shape[0] == 1:
- im_pixels = gray2rgb(np.squeeze(im.pixels))
- else:
- im_pixels = np.rollaxis(im.pixels, 0, 3)
- imsave(os.path.join(outdir, im.path.name.split('.')[0]+'.png'), im_pixels)
- mio.export_landmark_file(im.landmarks['PTS'], os.path.join(outdir, im.path.name.split('.')[0]+'.pts'))
-
-print ("\ncropping dataset completed!")
diff --git a/marlenezw/audio-driven-animations/MakeItTalk/face_of_art/data_loading_functions.py b/marlenezw/audio-driven-animations/MakeItTalk/face_of_art/data_loading_functions.py
deleted file mode 100644
index 98a50de5e26622dcfc84f579e6d4e5f25d4ab028..0000000000000000000000000000000000000000
--- a/marlenezw/audio-driven-animations/MakeItTalk/face_of_art/data_loading_functions.py
+++ /dev/null
@@ -1,161 +0,0 @@
-import numpy as np
-import os
-from skimage.color import gray2rgb
-
-
-def train_val_shuffle_inds_per_epoch(valid_inds, train_inds, train_iter, batch_size, log_path, save_log=True):
- """shuffle image indices for each training epoch and save to log"""
-
- np.random.seed(0)
- num_train_images = len(train_inds)
- num_epochs = int(np.ceil((1. * train_iter) / (1. * num_train_images / batch_size)))+1
- epoch_inds_shuffle = np.zeros((num_epochs, num_train_images)).astype(int)
- img_inds = np.arange(num_train_images)
- for i in range(num_epochs):
- np.random.shuffle(img_inds)
- epoch_inds_shuffle[i, :] = img_inds
-
- if save_log:
- with open(os.path.join(log_path, "train_val_shuffle_inds.csv"), "wb") as f:
- if valid_inds is not None:
- f.write(b'valid inds\n')
- np.savetxt(f, valid_inds.reshape(1, -1), fmt='%i', delimiter=",")
- f.write(b'train inds\n')
- np.savetxt(f, train_inds.reshape(1, -1), fmt='%i', delimiter=",")
- f.write(b'shuffle inds\n')
- np.savetxt(f, epoch_inds_shuffle, fmt='%i', delimiter=",")
-
- return epoch_inds_shuffle
-
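-# Assumed example (not part of the original file): with 8 training images, batch_size=2
-# and train_iter=8, this pre-computes ceil(8 / 4) + 1 = 3 shuffled permutations of
-# range(8), one row per epoch, so each training step can slice its batch indices
-# deterministically from epoch_inds_shuffle[epoch].
-#   shuffle = train_val_shuffle_inds_per_epoch(None, np.arange(8), train_iter=8,
-#                                              batch_size=2, log_path='.', save_log=False)
-#   shuffle.shape  # -> (3, 8)
-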
-
-def gaussian(x, y, x0, y0, sigma=6):
- return 1./(np.sqrt(2*np.pi)*sigma) * np.exp(-0.5 * ((x-x0)**2 + (y-y0)**2) / sigma**2)
-
-
-def create_gaussian_filter(sigma=6, win_mult=3.5):
- win_size = int(win_mult * sigma)
- x, y = np.mgrid[0:2*win_size+1, 0:2*win_size+1]
- gauss_filt = (8./3)*sigma*gaussian(x, y, win_size, win_size, sigma=sigma) # same as in ECT
- return gauss_filt
-
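-# Assumed example (not part of the original file): create_gaussian_filter() builds the
-# square (2*win_size+1)-sided window that create_approx_heat_maps_alloc_once() below
-# pastes around each landmark location.
-#   filt = create_gaussian_filter(sigma=6, win_mult=3.5)
-#   filt.shape  # -> (43, 43), since win_size = int(3.5 * 6) = 21
-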
-
-def load_images(img_list, batch_inds, image_size=256, c_dim=3, scale=255):
-
- """ load images as a numpy array from menpo image list """
-
- num_inputs = len(batch_inds)
- batch_menpo_images = img_list[batch_inds]
-
- images = np.zeros([num_inputs, image_size, image_size, c_dim]).astype('float32')
-
- for ind, img in enumerate(batch_menpo_images):
- if img.n_channels < 3 and c_dim == 3:
- images[ind, :, :, :] = gray2rgb(img.pixels_with_channels_at_back())
- else:
- images[ind, :, :, :] = img.pixels_with_channels_at_back()
-
- if scale == 255:
- images *= 255
- elif scale == 0:
- images = 2 * images - 1
-
- return images
-
-
-# loading functions with pre-allocation and approx heat-map generation
-
-
-def create_approx_heat_maps_alloc_once(landmarks, maps, gauss_filt=None, win_mult=3.5, num_landmarks=68, image_size=256,
- sigma=6):
- """ create heatmaps from input landmarks"""
- maps.fill(0.)
-
- win_size = int(win_mult * sigma)
- filt_size = 2 * win_size + 1
- landmarks = landmarks.astype(int)
-
- if gauss_filt is None:
- x_small, y_small = np.mgrid[0:2 * win_size + 1, 0:2 * win_size + 1]
- gauss_filt = (8. / 3) * sigma * gaussian(x_small, y_small, win_size, win_size, sigma=sigma) # same as in ECT
-
- for i in range(num_landmarks):
-
- min_row = landmarks[i, 0] - win_size
- max_row = landmarks[i, 0] + win_size + 1
- min_col = landmarks[i, 1] - win_size
- max_col = landmarks[i, 1] + win_size + 1
-
- if min_row < 0:
- min_row_gap = -1 * min_row
- min_row = 0
- else:
- min_row_gap = 0
-
- if min_col < 0:
- min_col_gap = -1 * min_col
- min_col = 0
- else:
- min_col_gap = 0
-
- if max_row > image_size:
- max_row_gap = max_row - image_size
- max_row = image_size
- else:
- max_row_gap = 0
-
- if max_col > image_size:
- max_col_gap = max_col - image_size
- max_col = image_size
- else:
- max_col_gap = 0
-
- maps[min_row:max_row, min_col:max_col, i] =\
- gauss_filt[min_row_gap:filt_size - 1 * max_row_gap, min_col_gap:filt_size - 1 * max_col_gap]
-
-
-def load_images_landmarks_approx_maps_alloc_once(
- img_list, batch_inds, images, maps_small, maps, landmarks, image_size=256, num_landmarks=68,
- scale=255, gauss_filt_large=None, gauss_filt_small=None, win_mult=3.5, sigma=6, save_landmarks=False):
-
- """ load images and gt landmarks from menpo image list, and create matching heatmaps """
-
- batch_menpo_images = img_list[batch_inds]
- c_dim = images.shape[-1]
- grp_name = batch_menpo_images[0].landmarks.group_labels[0]
-
- win_size_large = int(win_mult * sigma)
- win_size_small = int(win_mult * (1.*sigma/4))
-
- if gauss_filt_small is None:
- x_small, y_small = np.mgrid[0:2 * win_size_small + 1, 0:2 * win_size_small + 1]
- gauss_filt_small = (8. / 3) * (1.*sigma/4) * gaussian(
- x_small, y_small, win_size_small, win_size_small, sigma=1.*sigma/4) # same as in ECT
- if gauss_filt_large is None:
- x_large, y_large = np.mgrid[0:2 * win_size_large + 1, 0:2 * win_size_large + 1]
- gauss_filt_large = (8. / 3) * sigma * gaussian(x_large, y_large, win_size_large, win_size_large, sigma=sigma) # same as in ECT
-
- for ind, img in enumerate(batch_menpo_images):
- if img.n_channels < 3 and c_dim == 3:
- images[ind, :, :, :] = gray2rgb(img.pixels_with_channels_at_back())
- else:
- images[ind, :, :, :] = img.pixels_with_channels_at_back()
-
- lms = img.landmarks[grp_name].points
- lms = np.minimum(lms, image_size - 1)
- create_approx_heat_maps_alloc_once(
- landmarks=lms, maps=maps[ind, :, :, :], gauss_filt=gauss_filt_large, win_mult=win_mult,
- num_landmarks=num_landmarks, image_size=image_size, sigma=sigma)
-
- lms_small = img.resize([image_size // 4, image_size // 4]).landmarks[grp_name].points
- lms_small = np.minimum(lms_small, image_size // 4 - 1)
- create_approx_heat_maps_alloc_once(
- landmarks=lms_small, maps=maps_small[ind, :, :, :], gauss_filt=gauss_filt_small, win_mult=win_mult,
- num_landmarks=num_landmarks, image_size=image_size // 4, sigma=1. * sigma / 4)
-
- if save_landmarks:
- landmarks[ind, :, :] = lms
-
- if scale == 255:
- images *= 255
- elif scale == 0:
- images = 2 * images - 1
diff --git a/marlenezw/audio-driven-animations/MakeItTalk/face_of_art/data_loading_functions.pyc b/marlenezw/audio-driven-animations/MakeItTalk/face_of_art/data_loading_functions.pyc
deleted file mode 100644
index 2cade76a9d35dee6dd3e33c5f4fc166462b82e97..0000000000000000000000000000000000000000
Binary files a/marlenezw/audio-driven-animations/MakeItTalk/face_of_art/data_loading_functions.pyc and /dev/null differ
diff --git a/marlenezw/audio-driven-animations/MakeItTalk/face_of_art/deep_heatmaps_model_fusion_net.py b/marlenezw/audio-driven-animations/MakeItTalk/face_of_art/deep_heatmaps_model_fusion_net.py
deleted file mode 100644
index b4815c866b92537d3fa685e8273e5a6215820527..0000000000000000000000000000000000000000
--- a/marlenezw/audio-driven-animations/MakeItTalk/face_of_art/deep_heatmaps_model_fusion_net.py
+++ /dev/null
@@ -1,872 +0,0 @@
-import scipy.io
-import scipy.misc
-from glob import glob
-import os
-import numpy as np
-from thirdparty.face_of_art.ops import *
-import tensorflow as tf
-from tensorflow import contrib
-from thirdparty.face_of_art.menpo_functions import *
-from thirdparty.face_of_art.logging_functions import *
-from thirdparty.face_of_art.data_loading_functions import *
-
-
-class DeepHeatmapsModel(object):
-
- """facial landmark localization Network"""
-
- def __init__(self, mode='TRAIN', train_iter=100000, batch_size=10, learning_rate=1e-3, l_weight_primary=1.,
- l_weight_fusion=1.,l_weight_upsample=3.,adam_optimizer=True,momentum=0.95,step=100000, gamma=0.1,reg=0,
- weight_initializer='xavier', weight_initializer_std=0.01, bias_initializer=0.0, image_size=256,c_dim=3,
- num_landmarks=68, sigma=1.5, scale=1, margin=0.25, bb_type='gt', win_mult=3.33335,
- augment_basic=True,augment_texture=False, p_texture=0., augment_geom=False, p_geom=0.,
- output_dir='output', save_model_path='model',
- save_sample_path='sample', save_log_path='logs', test_model_path='model/deep_heatmaps-50000',
- pre_train_path='model/deep_heatmaps-50000', load_pretrain=False, load_primary_only=False,
- img_path='data', test_data='full', valid_data='full', valid_size=0, log_valid_every=5,
- train_crop_dir='crop_gt_margin_0.25', img_dir_ns='crop_gt_margin_0.25_ns',
- print_every=100, save_every=5000, sample_every=5000, sample_grid=9, sample_to_log=True,
- debug_data_size=20, debug=False, epoch_data_dir='epoch_data', use_epoch_data=False, menpo_verbose=True):
-
- # define some extra parameters
-
- self.log_histograms = False # save weight + gradient histogram to log
- self.save_valid_images = True # sample heat maps of validation images
- self.sample_per_channel = False # sample heatmaps separately for each landmark
-
- # for fine-tuning, choose reset_training_op==True. when resuming training, reset_training_op==False
- self.reset_training_op = False
-
- self.fast_img_gen = True
-
- self.compute_nme = True # compute normalized mean error
-
- self.config = tf.ConfigProto()
- self.config.gpu_options.allow_growth = True
-
- # sampling and logging parameters
- self.print_every = print_every # print losses to screen + log
- self.save_every = save_every # save model
- self.sample_every = sample_every # save images of gen heat maps compared to GT
- self.sample_grid = sample_grid # number of training images in sample
- self.sample_to_log = sample_to_log # sample images to log instead of disk
- self.log_valid_every = log_valid_every # log validation loss (in epochs)
-
- self.debug = debug
- self.debug_data_size = debug_data_size
- self.use_epoch_data = use_epoch_data
- self.epoch_data_dir = epoch_data_dir
-
- self.load_pretrain = load_pretrain
- self.load_primary_only = load_primary_only
- self.pre_train_path = pre_train_path
-
- self.mode = mode
- self.train_iter = train_iter
- self.learning_rate = learning_rate
-
- self.image_size = image_size
- self.c_dim = c_dim
- self.batch_size = batch_size
-
- self.num_landmarks = num_landmarks
-
- self.save_log_path = save_log_path
- self.save_sample_path = save_sample_path
- self.save_model_path = save_model_path
- self.test_model_path = test_model_path
- self.img_path=img_path
-
- self.momentum = momentum
- self.step = step # for lr decay
- self.gamma = gamma # for lr decay
- self.reg = reg # weight decay scale
- self.l_weight_primary = l_weight_primary # primary loss weight
- self.l_weight_fusion = l_weight_fusion # fusion loss weight
- self.l_weight_upsample = l_weight_upsample # upsample loss weight
-
- self.weight_initializer = weight_initializer # random_normal or xavier
- self.weight_initializer_std = weight_initializer_std
- self.bias_initializer = bias_initializer
- self.adam_optimizer = adam_optimizer
-
- self.sigma = sigma # sigma for heatmap generation
- self.scale = scale # scale for image normalization 255 / 1 / 0
- self.win_mult = win_mult # gaussian filter size for cpu/gpu approximation: 2 * sigma * win_mult + 1
-
- self.test_data = test_data # if mode is TEST, this chooses which set to use: full/common/challenging/test/art
- self.train_crop_dir = train_crop_dir
- self.img_dir_ns = os.path.join(img_path,img_dir_ns)
- self.augment_basic = augment_basic # perform basic augmentation (rotation,flip,crop)
- self.augment_texture = augment_texture # perform artistic texture augmentation (NS)
- self.p_texture = p_texture # initial probability of artistic texture augmentation
- self.augment_geom = augment_geom # perform artistic geometric augmentation
- self.p_geom = p_geom # initial probability of artistic geometric augmentation
-
- self.valid_size = valid_size
- self.valid_data = valid_data
-
- # load image, bb and landmark data using menpo
- self.bb_dir = os.path.join(img_path, 'Bounding_Boxes')
- self.bb_dictionary = load_bb_dictionary(self.bb_dir, mode, test_data=self.test_data)
-
- # use pre-augmented data, to save time during training
- if self.use_epoch_data:
- epoch_0 = os.path.join(self.epoch_data_dir, '0')
- self.img_menpo_list = load_menpo_image_list(
- img_path, train_crop_dir=epoch_0, img_dir_ns=None, mode=mode, bb_dictionary=self.bb_dictionary,
- image_size=self.image_size, test_data=self.test_data, augment_basic=False, augment_texture=False,
- augment_geom=False, verbose=menpo_verbose)
- else:
- self.img_menpo_list = load_menpo_image_list(
- img_path, train_crop_dir, self.img_dir_ns, mode, bb_dictionary=self.bb_dictionary,
- image_size=self.image_size, margin=margin, bb_type=bb_type, test_data=self.test_data,
- augment_basic=augment_basic, augment_texture=augment_texture, p_texture=p_texture,
- augment_geom=augment_geom, p_geom=p_geom, verbose=menpo_verbose)
-
- if mode == 'TRAIN':
-
- train_params = locals()
- print_training_params_to_file(train_params) # save init parameters
-
- self.train_inds = np.arange(len(self.img_menpo_list))
-
- if self.debug:
- self.train_inds = self.train_inds[:self.debug_data_size]
- self.img_menpo_list = self.img_menpo_list[self.train_inds]
-
- if valid_size > 0:
-
- self.valid_bb_dictionary = load_bb_dictionary(self.bb_dir, 'TEST', test_data=self.valid_data)
- self.valid_img_menpo_list = load_menpo_image_list(
- img_path, train_crop_dir, self.img_dir_ns, 'TEST', bb_dictionary=self.valid_bb_dictionary,
- image_size=self.image_size, margin=margin, bb_type=bb_type, test_data=self.valid_data,
- verbose=menpo_verbose)
-
- np.random.seed(0)
- self.val_inds = np.arange(len(self.valid_img_menpo_list))
- np.random.shuffle(self.val_inds)
- self.val_inds = self.val_inds[:self.valid_size]
-
- self.valid_img_menpo_list = self.valid_img_menpo_list[self.val_inds]
-
- self.valid_images_loaded =\
- np.zeros([self.valid_size, self.image_size, self.image_size, self.c_dim]).astype('float32')
- self.valid_gt_maps_small_loaded =\
- np.zeros([self.valid_size, self.image_size // 4, self.image_size // 4,
- self.num_landmarks]).astype('float32')
- self.valid_gt_maps_loaded =\
- np.zeros([self.valid_size, self.image_size, self.image_size, self.num_landmarks]
- ).astype('float32')
- self.valid_landmarks_loaded = np.zeros([self.valid_size, num_landmarks, 2]).astype('float32')
- self.valid_landmarks_pred = np.zeros([self.valid_size, self.num_landmarks, 2]).astype('float32')
-
- load_images_landmarks_approx_maps_alloc_once(
- self.valid_img_menpo_list, np.arange(self.valid_size), images=self.valid_images_loaded,
- maps_small=self.valid_gt_maps_small_loaded, maps=self.valid_gt_maps_loaded,
- landmarks=self.valid_landmarks_loaded, image_size=self.image_size,
- num_landmarks=self.num_landmarks, scale=self.scale, win_mult=self.win_mult, sigma=self.sigma,
- save_landmarks=self.compute_nme)
-
- if self.valid_size > self.sample_grid:
- self.valid_gt_maps_loaded = self.valid_gt_maps_loaded[:self.sample_grid]
- self.valid_gt_maps_small_loaded = self.valid_gt_maps_small_loaded[:self.sample_grid]
- else:
- self.val_inds = None
-
- self.epoch_inds_shuffle = train_val_shuffle_inds_per_epoch(
- self.val_inds, self.train_inds, train_iter, batch_size, save_log_path)
-
- def add_placeholders(self):
-
- if self.mode == 'TEST':
- self.images = tf.placeholder(
- tf.float32, [None, self.image_size, self.image_size, self.c_dim], 'images')
-
- self.heatmaps = tf.placeholder(
- tf.float32, [None, self.image_size, self.image_size, self.num_landmarks], 'heatmaps')
-
- self.heatmaps_small = tf.placeholder(
- tf.float32, [None, int(self.image_size/4), int(self.image_size/4), self.num_landmarks], 'heatmaps_small')
- self.lms = tf.placeholder(tf.float32, [None, self.num_landmarks, 2], 'lms')
- self.pred_lms = tf.placeholder(tf.float32, [None, self.num_landmarks, 2], 'pred_lms')
-
- elif self.mode == 'TRAIN':
- self.images = tf.placeholder(
- tf.float32, [None, self.image_size, self.image_size, self.c_dim], 'train_images')
-
- self.heatmaps = tf.placeholder(
- tf.float32, [None, self.image_size, self.image_size, self.num_landmarks], 'train_heatmaps')
-
- self.heatmaps_small = tf.placeholder(
- tf.float32, [None, int(self.image_size/4), int(self.image_size/4), self.num_landmarks], 'train_heatmaps_small')
-
- self.train_lms = tf.placeholder(tf.float32, [None, self.num_landmarks, 2], 'train_lms')
- self.train_pred_lms = tf.placeholder(tf.float32, [None, self.num_landmarks, 2], 'train_pred_lms')
-
- self.valid_lms = tf.placeholder(tf.float32, [None, self.num_landmarks, 2], 'valid_lms')
- self.valid_pred_lms = tf.placeholder(tf.float32, [None, self.num_landmarks, 2], 'valid_pred_lms')
-
- # self.p_texture_log = tf.placeholder(tf.float32, [])
- # self.p_geom_log = tf.placeholder(tf.float32, [])
-
- # self.sparse_hm_small = tf.placeholder(tf.float32, [None, int(self.image_size/4), int(self.image_size/4), 1])
- # self.sparse_hm = tf.placeholder(tf.float32, [None, self.image_size, self.image_size, 1])
-
- if self.sample_to_log:
- row = int(np.sqrt(self.sample_grid))
- self.log_image_map_small = tf.placeholder(
- tf.uint8, [None, row * int(self.image_size/4), 3 * row * int(self.image_size/4), self.c_dim],
- 'sample_img_map_small')
- self.log_image_map = tf.placeholder(
- tf.uint8, [None, row * self.image_size, 3 * row * self.image_size, self.c_dim],
- 'sample_img_map')
- if self.sample_per_channel:
- row = np.ceil(np.sqrt(self.num_landmarks)).astype(np.int64)
- self.log_map_channels_small = tf.placeholder(
- tf.uint8, [None, row * int(self.image_size/4), 2 * row * int(self.image_size/4), self.c_dim],
- 'sample_map_channels_small')
- self.log_map_channels = tf.placeholder(
- tf.uint8, [None, row * self.image_size, 2 * row * self.image_size, self.c_dim],
- 'sample_map_channels')
-
- def heatmaps_network(self, input_images, reuse=None, name='pred_heatmaps'):
-
- with tf.name_scope(name):
-
- if self.weight_initializer == 'xavier':
- weight_initializer = contrib.layers.xavier_initializer()
- else:
- weight_initializer = tf.random_normal_initializer(stddev=self.weight_initializer_std)
-
- bias_init = tf.constant_initializer(self.bias_initializer)
-
- with tf.variable_scope('heatmaps_network'):
- with tf.name_scope('primary_net'):
-
- l1 = conv_relu_pool(input_images, 5, 128, conv_ker_init=weight_initializer, conv_bias_init=bias_init,
- reuse=reuse, var_scope='conv_1')
- l2 = conv_relu_pool(l1, 5, 128, conv_ker_init=weight_initializer, conv_bias_init=bias_init,
- reuse=reuse, var_scope='conv_2')
- l3 = conv_relu(l2, 5, 128, conv_ker_init=weight_initializer, conv_bias_init=bias_init,
- reuse=reuse, var_scope='conv_3')
-
- l4_1 = conv_relu(l3, 3, 128, conv_dilation=1, conv_ker_init=weight_initializer,
- conv_bias_init=bias_init, reuse=reuse, var_scope='conv_4_1')
- l4_2 = conv_relu(l3, 3, 128, conv_dilation=2, conv_ker_init=weight_initializer,
- conv_bias_init=bias_init, reuse=reuse, var_scope='conv_4_2')
- l4_3 = conv_relu(l3, 3, 128, conv_dilation=3, conv_ker_init=weight_initializer,
- conv_bias_init=bias_init, reuse=reuse, var_scope='conv_4_3')
- l4_4 = conv_relu(l3, 3, 128, conv_dilation=4, conv_ker_init=weight_initializer,
- conv_bias_init=bias_init, reuse=reuse, var_scope='conv_4_4')
-
- l4 = tf.concat([l4_1, l4_2, l4_3, l4_4], 3, name='conv_4')
-
- l5_1 = conv_relu(l4, 3, 256, conv_dilation=1, conv_ker_init=weight_initializer,
- conv_bias_init=bias_init, reuse=reuse, var_scope='conv_5_1')
- l5_2 = conv_relu(l4, 3, 256, conv_dilation=2, conv_ker_init=weight_initializer,
- conv_bias_init=bias_init, reuse=reuse, var_scope='conv_5_2')
- l5_3 = conv_relu(l4, 3, 256, conv_dilation=3, conv_ker_init=weight_initializer,
- conv_bias_init=bias_init, reuse=reuse, var_scope='conv_5_3')
- l5_4 = conv_relu(l4, 3, 256, conv_dilation=4, conv_ker_init=weight_initializer,
- conv_bias_init=bias_init, reuse=reuse, var_scope='conv_5_4')
-
- l5 = tf.concat([l5_1, l5_2, l5_3, l5_4], 3, name='conv_5')
-
- l6 = conv_relu(l5, 1, 512, conv_ker_init=weight_initializer,
- conv_bias_init=bias_init, reuse=reuse, var_scope='conv_6')
- l7 = conv_relu(l6, 1, 256, conv_ker_init=weight_initializer,
- conv_bias_init=bias_init, reuse=reuse, var_scope='conv_7')
- primary_out = conv(l7, 1, self.num_landmarks, conv_ker_init=weight_initializer,
- conv_bias_init=bias_init, reuse=reuse, var_scope='conv_8')
-
- with tf.name_scope('fusion_net'):
-
- l_fsn_0 = tf.concat([l3, l7], 3, name='conv_3_7_fsn')
-
- l_fsn_1_1 = conv_relu(l_fsn_0, 3, 64, conv_dilation=1, conv_ker_init=weight_initializer,
- conv_bias_init=bias_init, reuse=reuse, var_scope='conv_fsn_1_1')
- l_fsn_1_2 = conv_relu(l_fsn_0, 3, 64, conv_dilation=2, conv_ker_init=weight_initializer,
- conv_bias_init=bias_init, reuse=reuse, var_scope='conv_fsn_1_2')
- l_fsn_1_3 = conv_relu(l_fsn_0, 3, 64, conv_dilation=3, conv_ker_init=weight_initializer,
- conv_bias_init=bias_init, reuse=reuse, var_scope='conv_fsn_1_3')
-
- l_fsn_1 = tf.concat([l_fsn_1_1, l_fsn_1_2, l_fsn_1_3], 3, name='conv_fsn_1')
-
- l_fsn_2_1 = conv_relu(l_fsn_1, 3, 64, conv_dilation=1, conv_ker_init=weight_initializer,
- conv_bias_init=bias_init, reuse=reuse, var_scope='conv_fsn_2_1')
- l_fsn_2_2 = conv_relu(l_fsn_1, 3, 64, conv_dilation=2, conv_ker_init=weight_initializer,
- conv_bias_init=bias_init, reuse=reuse, var_scope='conv_fsn_2_2')
- l_fsn_2_3 = conv_relu(l_fsn_1, 3, 64, conv_dilation=4, conv_ker_init=weight_initializer,
- conv_bias_init=bias_init, reuse=reuse, var_scope='conv_fsn_2_3')
- l_fsn_2_4 = conv_relu(l_fsn_1, 5, 64, conv_dilation=3, conv_ker_init=weight_initializer,
- conv_bias_init=bias_init, reuse=reuse, var_scope='conv_fsn_2_4')
-
- l_fsn_2 = tf.concat([l_fsn_2_1, l_fsn_2_2, l_fsn_2_3, l_fsn_2_4], 3, name='conv_fsn_2')
-
- l_fsn_3_1 = conv_relu(l_fsn_2, 3, 128, conv_dilation=1, conv_ker_init=weight_initializer,
- conv_bias_init=bias_init, reuse=reuse, var_scope='conv_fsn_3_1')
- l_fsn_3_2 = conv_relu(l_fsn_2, 3, 128, conv_dilation=2, conv_ker_init=weight_initializer,
- conv_bias_init=bias_init, reuse=reuse, var_scope='conv_fsn_3_2')
- l_fsn_3_3 = conv_relu(l_fsn_2, 3, 128, conv_dilation=4, conv_ker_init=weight_initializer,
- conv_bias_init=bias_init, reuse=reuse, var_scope='conv_fsn_3_3')
- l_fsn_3_4 = conv_relu(l_fsn_2, 5, 128, conv_dilation=3, conv_ker_init=weight_initializer,
- conv_bias_init=bias_init, reuse=reuse, var_scope='conv_fsn_3_4')
-
- l_fsn_3 = tf.concat([l_fsn_3_1, l_fsn_3_2, l_fsn_3_3, l_fsn_3_4], 3, name='conv_fsn_3')
-
- l_fsn_4 = conv_relu(l_fsn_3, 1, 256, conv_ker_init=weight_initializer,
- conv_bias_init=bias_init, reuse=reuse, var_scope='conv_fsn_4')
- fusion_out = conv(l_fsn_4, 1, self.num_landmarks, conv_ker_init=weight_initializer,
- conv_bias_init=bias_init, reuse=reuse, var_scope='conv_fsn_5')
-
- with tf.name_scope('upsample_net'):
-
- out = deconv(fusion_out, 8, self.num_landmarks, conv_stride=4,
- conv_ker_init=deconv2d_bilinear_upsampling_initializer(
- [8, 8, self.num_landmarks, self.num_landmarks]), conv_bias_init=bias_init,
- reuse=reuse, var_scope='deconv_1')
-
- self.all_layers = [l1, l2, l3, l4, l5, l6, l7, primary_out, l_fsn_1, l_fsn_2, l_fsn_3, l_fsn_4,
- fusion_out, out]
-
- return primary_out, fusion_out, out
-
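- # Assumed shape summary for heatmaps_network() (not part of the original file): with
- # image_size=256 and num_landmarks=68, the two pooling layers give 64x64 primary and
- # fusion maps and the stride-4 deconv restores full resolution:
- #   primary_out: [batch, 64, 64, 68]    (supervised by heatmaps_small)
- #   fusion_out:  [batch, 64, 64, 68]    (supervised by heatmaps_small)
- #   out:         [batch, 256, 256, 68]  (supervised by heatmaps)
-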
- def build_model(self):
- self.pred_hm_p, self.pred_hm_f, self.pred_hm_u = self.heatmaps_network(self.images,name='heatmaps_prediction')
-
- def create_loss_ops(self):
-
- def nme_norm_eyes(pred_landmarks, real_landmarks, normalize=True, name='NME'):
- """calculate normalized mean error on landmarks - normalize with inter pupil distance"""
-
- with tf.name_scope(name):
- with tf.name_scope('real_pred_landmarks_rmse'):
- # calculate RMS ERROR between GT and predicted lms
- landmarks_rms_err = tf.reduce_mean(
- tf.sqrt(tf.reduce_sum(tf.square(pred_landmarks - real_landmarks), axis=2)), axis=1)
- if normalize:
- # normalize RMS ERROR with inter-pupil distance of GT lms
- with tf.name_scope('inter_pupil_dist'):
- with tf.name_scope('left_eye_center'):
- p1 = tf.reduce_mean(tf.slice(real_landmarks, [0, 42, 0], [-1, 6, 2]), axis=1)
- with tf.name_scope('right_eye_center'):
- p2 = tf.reduce_mean(tf.slice(real_landmarks, [0, 36, 0], [-1, 6, 2]), axis=1)
-
- eye_dist = tf.sqrt(tf.reduce_sum(tf.square(p1 - p2), axis=1))
-
- return landmarks_rms_err / eye_dist
- else:
- return landmarks_rms_err
-
- if self.mode == 'TRAIN':
-
- # calculate L2 loss between ideal and predicted heatmaps
- primary_maps_diff = self.pred_hm_p - self.heatmaps_small
- fusion_maps_diff = self.pred_hm_f - self.heatmaps_small
- upsample_maps_diff = self.pred_hm_u - self.heatmaps
-
- self.l2_primary = tf.reduce_mean(tf.square(primary_maps_diff))
- self.l2_fusion = tf.reduce_mean(tf.square(fusion_maps_diff))
- self.l2_upsample = tf.reduce_mean(tf.square(upsample_maps_diff))
-
- self.total_loss = 1000.*(self.l_weight_primary * self.l2_primary + self.l_weight_fusion * self.l2_fusion +
- self.l_weight_upsample * self.l2_upsample)
-
- # add weight decay
- self.total_loss += self.reg * tf.add_n(
- [tf.nn.l2_loss(v) for v in tf.trainable_variables() if 'bias' not in v.name])
-
- # compute normalized mean error on gt vs. predicted landmarks (for validation)
- if self.compute_nme:
- self.nme_loss = tf.reduce_mean(nme_norm_eyes(self.train_pred_lms, self.train_lms))
-
- if self.valid_size > 0 and self.compute_nme:
- self.valid_nme_loss = tf.reduce_mean(nme_norm_eyes(self.valid_pred_lms, self.valid_lms))
-
- elif self.mode == 'TEST' and self.compute_nme:
- self.nme_per_image = nme_norm_eyes(self.pred_lms, self.lms)
- self.nme_loss = tf.reduce_mean(self.nme_per_image)
-
- def predict_valid_landmarks_in_batches(self, images, session):
-
- num_images=int(images.shape[0])
- num_batches = int(1.*num_images/self.batch_size)
- if num_batches == 0:
- batch_size = num_images
- num_batches = 1
- else:
- batch_size = self.batch_size
-
- for j in range(num_batches):
-
- batch_images = images[j * batch_size:(j + 1) * batch_size,:,:,:]
- batch_maps_pred = session.run(self.pred_hm_u, {self.images: batch_images})
- batch_heat_maps_to_landmarks_alloc_once(
- batch_maps=batch_maps_pred, batch_landmarks=self.valid_landmarks_pred[j * batch_size:(j + 1) * batch_size, :, :],
- batch_size=batch_size,image_size=self.image_size,num_landmarks=self.num_landmarks)
-
- reminder = num_images-num_batches*batch_size
- if reminder > 0:
- batch_images = images[-reminder:, :, :, :]
- batch_maps_pred = session.run(self.pred_hm_u, {self.images: batch_images})
-
- batch_heat_maps_to_landmarks_alloc_once(
- batch_maps=batch_maps_pred,
- batch_landmarks=self.valid_landmarks_pred[-reminder:, :, :],
- batch_size=reminder, image_size=self.image_size, num_landmarks=self.num_landmarks)
-
- def create_summary_ops(self):
- """create summary ops for logging"""
-
- # loss summary
- l2_primary = tf.summary.scalar('l2_primary', self.l2_primary)
- l2_fusion = tf.summary.scalar('l2_fusion', self.l2_fusion)
- l2_upsample = tf.summary.scalar('l2_upsample', self.l2_upsample)
-
- l_total = tf.summary.scalar('l_total', self.total_loss)
- self.batch_summary_op = tf.summary.merge([l2_primary,l2_fusion,l2_upsample,l_total])
-
- if self.compute_nme:
- nme = tf.summary.scalar('nme', self.nme_loss)
- self.batch_summary_op = tf.summary.merge([self.batch_summary_op, nme])
-
- if self.log_histograms:
- var_summary = [tf.summary.histogram(var.name,var) for var in tf.trainable_variables()]
- grads = tf.gradients(self.total_loss, tf.trainable_variables())
- grads = list(zip(grads, tf.trainable_variables()))
- grad_summary = [tf.summary.histogram(var.name+'/grads',grad) for grad,var in grads]
- activ_summary = [tf.summary.histogram(layer.name, layer) for layer in self.all_layers]
- self.batch_summary_op = tf.summary.merge([self.batch_summary_op, var_summary, grad_summary, activ_summary])
-
- if self.valid_size > 0 and self.compute_nme:
- self.valid_summary = tf.summary.scalar('valid_nme', self.valid_nme_loss)
-
- if self.sample_to_log:
- img_map_summary_small = tf.summary.image('compare_map_to_gt_small', self.log_image_map_small)
- img_map_summary = tf.summary.image('compare_map_to_gt', self.log_image_map)
-
- if self.sample_per_channel:
- map_channels_summary = tf.summary.image('compare_map_channels_to_gt', self.log_map_channels)
- map_channels_summary_small = tf.summary.image('compare_map_channels_to_gt_small',
- self.log_map_channels_small)
- self.img_summary = tf.summary.merge(
- [img_map_summary, img_map_summary_small,map_channels_summary,map_channels_summary_small])
- else:
- self.img_summary = tf.summary.merge([img_map_summary, img_map_summary_small])
-
- if self.valid_size >= self.sample_grid:
- img_map_summary_valid_small = tf.summary.image('compare_map_to_gt_small_valid', self.log_image_map_small)
- img_map_summary_valid = tf.summary.image('compare_map_to_gt_valid', self.log_image_map)
-
- if self.sample_per_channel:
- map_channels_summary_valid_small = tf.summary.image('compare_map_channels_to_gt_small_valid',
- self.log_map_channels_small)
- map_channels_summary_valid = tf.summary.image('compare_map_channels_to_gt_valid',
- self.log_map_channels)
- self.img_summary_valid = tf.summary.merge(
- [img_map_summary_valid,img_map_summary_valid_small,map_channels_summary_valid,
- map_channels_summary_valid_small])
- else:
- self.img_summary_valid = tf.summary.merge([img_map_summary_valid, img_map_summary_valid_small])
-
- def train(self):
- # set random seed
- tf.set_random_seed(1234)
- np.random.seed(1234)
- # build a graph
- # add placeholders
- self.add_placeholders()
- # build model
- self.build_model()
- # create loss ops
- self.create_loss_ops()
- # create summary ops
- self.create_summary_ops()
-
- # create optimizer and training op
- global_step = tf.Variable(0, trainable=False)
- lr = tf.train.exponential_decay(self.learning_rate,global_step, self.step, self.gamma, staircase=True)
- if self.adam_optimizer:
- optimizer = tf.train.AdamOptimizer(lr)
- else:
- optimizer = tf.train.MomentumOptimizer(lr, self.momentum)
-
- train_op = optimizer.minimize(self.total_loss,global_step=global_step)
-
- with tf.Session(config=self.config) as sess:
-
- tf.global_variables_initializer().run()
-
- # load pre trained weights if load_pretrain==True
- if self.load_pretrain:
- print('')
- print('*** loading pre-trained weights from: '+self.pre_train_path+' ***')
- if self.load_primary_only:
- print('*** loading primary-net only ***')
- primary_var = [v for v in tf.get_collection(tf.GraphKeys.TRAINABLE_VARIABLES) if
- ('deconv_' not in v.name) and ('_fsn_' not in v.name)]
- loader = tf.train.Saver(var_list=primary_var)
- else:
- loader = tf.train.Saver()
- loader.restore(sess, self.pre_train_path)
- print("*** Model restore finished, current global step: %d" % global_step.eval())
-
- # for fine-tuning, choose reset_training_op==True. when resuming training, reset_training_op==False
- if self.reset_training_op:
- print ("resetting optimizer and global step")
- opt_var_list = [optimizer.get_slot(var, name) for name in optimizer.get_slot_names()
- for var in tf.global_variables() if optimizer.get_slot(var, name) is not None]
- opt_var_list_init = tf.variables_initializer(opt_var_list)
- opt_var_list_init.run()
- sess.run(global_step.initializer)
-
- # create model saver and file writer
- summary_writer = tf.summary.FileWriter(logdir=self.save_log_path, graph=tf.get_default_graph())
- saver = tf.train.Saver()
-
- print('\n*** Start Training ***')
-
- # initialize some variables before training loop
- resume_step = global_step.eval()
- num_train_images = len(self.img_menpo_list)
- batches_in_epoch = int(float(num_train_images) / float(self.batch_size))
- epoch = int(resume_step / batches_in_epoch)
- img_inds = self.epoch_inds_shuffle[epoch, :]
- log_valid = True
- log_valid_images = True
-
- # allocate space for batch images, maps and landmarks
- batch_images = np.zeros([self.batch_size, self.image_size, self.image_size, self.c_dim]).astype(
- 'float32')
- batch_lms = np.zeros([self.batch_size, self.num_landmarks, 2]).astype('float32')
- batch_lms_pred = np.zeros([self.batch_size, self.num_landmarks, 2]).astype('float32')
-
- batch_maps_small = np.zeros((self.batch_size, int(self.image_size/4),
- int(self.image_size/4), self.num_landmarks)).astype('float32')
- batch_maps = np.zeros((self.batch_size, self.image_size, self.image_size,
- self.num_landmarks)).astype('float32')
-
- # create gaussians for heatmap generation
- gaussian_filt_large = create_gaussian_filter(sigma=self.sigma, win_mult=self.win_mult)
- gaussian_filt_small = create_gaussian_filter(sigma=1.*self.sigma/4, win_mult=self.win_mult)
-
- # training loop
- for step in range(resume_step, self.train_iter):
-
- j = step % batches_in_epoch # j==0 if we finished an epoch
-
- # if we finished an epoch and this isn't the first step
- if step > resume_step and j == 0:
- epoch += 1
- img_inds = self.epoch_inds_shuffle[epoch, :] # get next shuffled image inds
- log_valid = True
- log_valid_images = True
- if self.use_epoch_data: # if using pre-augmented data, load epoch directory
- epoch_dir = os.path.join(self.epoch_data_dir, str(epoch))
- self.img_menpo_list = load_menpo_image_list(
- self.img_path, train_crop_dir=epoch_dir, img_dir_ns=None, mode=self.mode,
- bb_dictionary=self.bb_dictionary, image_size=self.image_size, test_data=self.test_data,
- augment_basic=False, augment_texture=False, augment_geom=False)
-
- # get batch indices
- batch_inds = img_inds[j * self.batch_size:(j + 1) * self.batch_size]
-
- # load batch images, gt maps and landmarks
- load_images_landmarks_approx_maps_alloc_once(
- self.img_menpo_list, batch_inds, images=batch_images, maps_small=batch_maps_small,
- maps=batch_maps, landmarks=batch_lms, image_size=self.image_size,
- num_landmarks=self.num_landmarks, scale=self.scale, gauss_filt_large=gaussian_filt_large,
- gauss_filt_small=gaussian_filt_small, win_mult=self.win_mult, sigma=self.sigma,
- save_landmarks=self.compute_nme)
-
- feed_dict_train = {self.images: batch_images, self.heatmaps: batch_maps,
- self.heatmaps_small: batch_maps_small}
-
- # train on batch
- sess.run(train_op, feed_dict_train)
-
- # save to log and print status
- if step == resume_step or (step + 1) % self.print_every == 0:
-
- # train data log
- if self.compute_nme:
- batch_maps_pred = sess.run(self.pred_hm_u, {self.images: batch_images})
-
- batch_heat_maps_to_landmarks_alloc_once(
- batch_maps=batch_maps_pred,batch_landmarks=batch_lms_pred,
- batch_size=self.batch_size, image_size=self.image_size,
- num_landmarks=self.num_landmarks)
-
- train_feed_dict_log = {
- self.images: batch_images, self.heatmaps: batch_maps,
- self.heatmaps_small: batch_maps_small, self.train_lms: batch_lms,
- self.train_pred_lms: batch_lms_pred}
-
- summary, l_p, l_f, l_t, nme = sess.run(
- [self.batch_summary_op, self.l2_primary, self.l2_fusion, self.total_loss,
- self.nme_loss],
- train_feed_dict_log)
-
- print (
- 'epoch: [%d] step: [%d/%d] primary loss: [%.6f] fusion loss: [%.6f]'
- ' total loss: [%.6f] NME: [%.6f]' % (
- epoch, step + 1, self.train_iter, l_p, l_f, l_t, nme))
- else:
- train_feed_dict_log = {self.images: batch_images, self.heatmaps: batch_maps,
- self.heatmaps_small: batch_maps_small}
-
- summary, l_p, l_f, l_t = sess.run(
- [self.batch_summary_op, self.l2_primary, self.l2_fusion, self.total_loss],
- train_feed_dict_log)
- print (
- 'epoch: [%d] step: [%d/%d] primary loss: [%.6f] fusion loss: [%.6f] total loss: [%.6f]'
- % (epoch, step + 1, self.train_iter, l_p, l_f, l_t))
-
- summary_writer.add_summary(summary, step)
-
- # valid data log
- if self.valid_size > 0 and (log_valid and epoch % self.log_valid_every == 0) \
- and self.compute_nme:
- log_valid = False
-
- self.predict_valid_landmarks_in_batches(self.valid_images_loaded, sess)
- valid_feed_dict_log = {
- self.valid_lms: self.valid_landmarks_loaded,
- self.valid_pred_lms: self.valid_landmarks_pred}
-
- v_summary, v_nme = sess.run([self.valid_summary, self.valid_nme_loss],
- valid_feed_dict_log)
- summary_writer.add_summary(v_summary, step)
- print (
- 'epoch: [%d] step: [%d/%d] valid NME: [%.6f]' % (
- epoch, step + 1, self.train_iter, v_nme))
-
- # save model
- if (step + 1) % self.save_every == 0:
- saver.save(sess, os.path.join(self.save_model_path, 'deep_heatmaps'), global_step=step + 1)
- print ('model/deep-heatmaps-%d saved' % (step + 1))
-
- # save images
- if step == resume_step or (step + 1) % self.sample_every == 0:
-
- batch_maps_small_pred = sess.run(self.pred_hm_p, {self.images: batch_images})
- if not self.compute_nme:
- batch_maps_pred = sess.run(self.pred_hm_u, {self.images: batch_images})
- batch_lms_pred = None
-
- merged_img = merge_images_landmarks_maps_gt(
- batch_images.copy(), batch_maps_pred, batch_maps, landmarks=batch_lms_pred,
- image_size=self.image_size, num_landmarks=self.num_landmarks, num_samples=self.sample_grid,
- scale=self.scale, circle_size=2, fast=self.fast_img_gen)
-
- merged_img_small = merge_images_landmarks_maps_gt(
- batch_images.copy(), batch_maps_small_pred, batch_maps_small,
- image_size=self.image_size,
- num_landmarks=self.num_landmarks, num_samples=self.sample_grid, scale=self.scale,
- circle_size=0, fast=self.fast_img_gen)
-
- if self.sample_per_channel:
- map_per_channel = map_comapre_channels(
- batch_images.copy(), batch_maps_pred, batch_maps, image_size=self.image_size,
- num_landmarks=self.num_landmarks, scale=self.scale)
-
- map_per_channel_small = map_comapre_channels(
- batch_images.copy(), batch_maps_small_pred, batch_maps_small, image_size=int(self.image_size/4),
- num_landmarks=self.num_landmarks, scale=self.scale)
-
- if self.sample_to_log: # save heatmap images to log
- if self.sample_per_channel:
- summary_img = sess.run(
- self.img_summary, {self.log_image_map: np.expand_dims(merged_img, 0),
- self.log_map_channels: np.expand_dims(map_per_channel, 0),
- self.log_image_map_small: np.expand_dims(merged_img_small, 0),
- self.log_map_channels_small: np.expand_dims(map_per_channel_small, 0)})
- else:
- summary_img = sess.run(
- self.img_summary, {self.log_image_map: np.expand_dims(merged_img, 0),
- self.log_image_map_small: np.expand_dims(merged_img_small, 0)})
- summary_writer.add_summary(summary_img, step)
-
- if (self.valid_size >= self.sample_grid) and self.save_valid_images and\
- (log_valid_images and epoch % self.log_valid_every == 0):
- log_valid_images = False
-
- batch_maps_small_pred_val,batch_maps_pred_val =\
- sess.run([self.pred_hm_p,self.pred_hm_u],
- {self.images: self.valid_images_loaded[:self.sample_grid]})
-
- merged_img_small = merge_images_landmarks_maps_gt(
- self.valid_images_loaded[:self.sample_grid].copy(), batch_maps_small_pred_val,
- self.valid_gt_maps_small_loaded, image_size=self.image_size,
- num_landmarks=self.num_landmarks, num_samples=self.sample_grid,
- scale=self.scale, circle_size=0, fast=self.fast_img_gen)
-
- merged_img = merge_images_landmarks_maps_gt(
- self.valid_images_loaded[:self.sample_grid].copy(), batch_maps_pred_val,
- self.valid_gt_maps_loaded, image_size=self.image_size,
- num_landmarks=self.num_landmarks, num_samples=self.sample_grid,
- scale=self.scale, circle_size=2, fast=self.fast_img_gen)
-
- if self.sample_per_channel:
- map_per_channel_small = map_comapre_channels(
- self.valid_images_loaded[:self.sample_grid].copy(), batch_maps_small_pred_val,
- self.valid_gt_maps_small_loaded, image_size=int(self.image_size / 4),
- num_landmarks=self.num_landmarks, scale=self.scale)
-
- map_per_channel = map_comapre_channels(
- self.valid_images_loaded[:self.sample_grid].copy(), batch_maps_pred,
- self.valid_gt_maps_loaded, image_size=self.image_size,
- num_landmarks=self.num_landmarks, scale=self.scale)
-
- summary_img = sess.run(
- self.img_summary_valid,
- {self.log_image_map: np.expand_dims(merged_img, 0),
- self.log_map_channels: np.expand_dims(map_per_channel, 0),
- self.log_image_map_small: np.expand_dims(merged_img_small, 0),
- self.log_map_channels_small: np.expand_dims(map_per_channel_small, 0)})
- else:
- summary_img = sess.run(
- self.img_summary_valid,
- {self.log_image_map: np.expand_dims(merged_img, 0),
- self.log_image_map_small: np.expand_dims(merged_img_small, 0)})
-
- summary_writer.add_summary(summary_img, step)
- else: # save heatmap images to directory
- sample_path_imgs = os.path.join(
- self.save_sample_path, 'epoch-%d-train-iter-%d-1.png' % (epoch, step + 1))
- sample_path_imgs_small = os.path.join(
- self.save_sample_path, 'epoch-%d-train-iter-%d-1-s.png' % (epoch, step + 1))
- scipy.misc.imsave(sample_path_imgs, merged_img)
- scipy.misc.imsave(sample_path_imgs_small, merged_img_small)
-
- if self.sample_per_channel:
- sample_path_ch_maps = os.path.join(
- self.save_sample_path, 'epoch-%d-train-iter-%d-3.png' % (epoch, step + 1))
- sample_path_ch_maps_small = os.path.join(
- self.save_sample_path, 'epoch-%d-train-iter-%d-3-s.png' % (epoch, step + 1))
- scipy.misc.imsave(sample_path_ch_maps, map_per_channel)
- scipy.misc.imsave(sample_path_ch_maps_small, map_per_channel_small)
-
- print('*** Finished Training ***')
-
- def get_image_maps(self, test_image, reuse=None, norm=False):
- """ returns heatmaps of input image (menpo image object)"""
-
- self.add_placeholders()
- # build model
- pred_hm_p, pred_hm_f, pred_hm_u = self.heatmaps_network(self.images, reuse=reuse)
-
- with tf.Session(config=self.config) as sess:
- # load trained parameters
- saver = tf.train.Saver()
- saver.restore(sess, self.test_model_path)
- _, model_name = os.path.split(self.test_model_path)
-
- test_image = test_image.pixels_with_channels_at_back().astype('float32')
- if norm:
- if self.scale == 255:
- test_image *= 255
- elif self.scale == 0:
- test_image = 2 * test_image - 1
-
- map_primary, map_fusion, map_upsample = sess.run(
- [pred_hm_p, pred_hm_f, pred_hm_u], {self.images: np.expand_dims(test_image, 0)})
-
- return map_primary, map_fusion, map_upsample
-
- def get_landmark_predictions(self, img_list, pdm_models_dir, clm_model_path, reuse=None, map_to_input_size=False):
-
- """returns dictionary with landmark predictions of each step of the ECpTp algorithm and ECT"""
-
- from thirdparty.face_of_art.pdm_clm_functions import feature_based_pdm_corr, clm_correct
-
- jaw_line_inds = np.arange(0, 17)
- left_brow_inds = np.arange(17, 22)
- right_brow_inds = np.arange(22, 27)
-
- self.add_placeholders()
- # build model
- _, _, pred_hm_u = self.heatmaps_network(self.images, reuse=reuse)
-
- with tf.Session(config=self.config) as sess:
- # load trained parameters
- saver = tf.train.Saver()
- saver.restore(sess, self.test_model_path)
- _, model_name = os.path.split(self.test_model_path)
- e_list = []
- ect_list = []
- ecp_list = []
- ecpt_list = []
- ecptp_jaw_list = []
- ecptp_out_list = []
-
- for test_image in img_list:
-
- if map_to_input_size:
- test_image_transform = test_image[1]
- test_image=test_image[0]
-
- # get landmarks for estimation stage
- if test_image.n_channels < 3:
- test_image_map = sess.run(
- pred_hm_u, {self.images: np.expand_dims(
- gray2rgb(test_image.pixels_with_channels_at_back()).astype('float32'), 0)})
- else:
- test_image_map = sess.run(
- pred_hm_u, {self.images: np.expand_dims(
- test_image.pixels_with_channels_at_back().astype('float32'), 0)})
- init_lms = heat_maps_to_landmarks(np.squeeze(test_image_map))
-
- # get landmarks for part-based correction stage
- p_pdm_lms = feature_based_pdm_corr(lms_init=init_lms, models_dir=pdm_models_dir, train_type='basic')
-
- # get landmarks for part-based tuning stage
- try: # clm may not converge
- pdm_clm_lms = clm_correct(
- clm_model_path=clm_model_path, image=test_image, map=test_image_map, lms_init=p_pdm_lms)
- except:
- pdm_clm_lms = p_pdm_lms.copy()
-
- # get landmarks ECT
- try: # clm may not converge
- ect_lms = clm_correct(
- clm_model_path=clm_model_path, image=test_image, map=test_image_map, lms_init=init_lms)
-                except Exception:
- ect_lms = p_pdm_lms.copy()
-
- # get landmarks for ECpTp_out (tune jaw and eyebrows)
- ecptp_out = p_pdm_lms.copy()
- ecptp_out[left_brow_inds] = pdm_clm_lms[left_brow_inds]
- ecptp_out[right_brow_inds] = pdm_clm_lms[right_brow_inds]
- ecptp_out[jaw_line_inds] = pdm_clm_lms[jaw_line_inds]
-
- # get landmarks for ECpTp_jaw (tune jaw)
- ecptp_jaw = p_pdm_lms.copy()
- ecptp_jaw[jaw_line_inds] = pdm_clm_lms[jaw_line_inds]
-
- if map_to_input_size:
- ecptp_jaw = test_image_transform.apply(ecptp_jaw)
- ecptp_out = test_image_transform.apply(ecptp_out)
- ect_lms = test_image_transform.apply(ect_lms)
- init_lms = test_image_transform.apply(init_lms)
- p_pdm_lms = test_image_transform.apply(p_pdm_lms)
- pdm_clm_lms = test_image_transform.apply(pdm_clm_lms)
-
- ecptp_jaw_list.append(ecptp_jaw) # E + p-correction + p-tuning (ECpTp_jaw)
- ecptp_out_list.append(ecptp_out) # E + p-correction + p-tuning (ECpTp_out)
- ect_list.append(ect_lms) # ECT prediction
- e_list.append(init_lms) # init prediction from heatmap network (E)
- ecp_list.append(p_pdm_lms) # init prediction + part pdm correction (ECp)
- ecpt_list.append(pdm_clm_lms) # init prediction + part pdm correction + global tuning (ECpT)
-
- pred_dict = {
- 'E': e_list,
- 'ECp': ecp_list,
- 'ECpT': ecpt_list,
- 'ECT': ect_list,
- 'ECpTp_jaw': ecptp_jaw_list,
- 'ECpTp_out': ecptp_out_list
- }
-
- return pred_dict
diff --git a/marlenezw/audio-driven-animations/MakeItTalk/face_of_art/deep_heatmaps_model_fusion_net.pyc b/marlenezw/audio-driven-animations/MakeItTalk/face_of_art/deep_heatmaps_model_fusion_net.pyc
deleted file mode 100644
index b7da7873c85e4e845faff3b5a1026b43a1c9eee2..0000000000000000000000000000000000000000
Binary files a/marlenezw/audio-driven-animations/MakeItTalk/face_of_art/deep_heatmaps_model_fusion_net.pyc and /dev/null differ
diff --git a/marlenezw/audio-driven-animations/MakeItTalk/face_of_art/deformation_functions.py b/marlenezw/audio-driven-animations/MakeItTalk/face_of_art/deformation_functions.py
deleted file mode 100644
index 41b9464f4a6241d055e529fb4700a7eb2f29c8f7..0000000000000000000000000000000000000000
--- a/marlenezw/audio-driven-animations/MakeItTalk/face_of_art/deformation_functions.py
+++ /dev/null
@@ -1,386 +0,0 @@
-import numpy as np
-
-
-def deform_part(landmarks, part_inds, scale_y=1., scale_x=1., shift_ver=0., shift_horiz=0.):
- """ deform facial part landmarks - matching ibug annotations of 68 landmarks """
-
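-    # landmark arrays are in (y, x) order: column 0 is the vertical coordinate, column 1 the horizontal one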
- landmarks_part = landmarks[part_inds, :].copy()
- part_mean = np.mean(landmarks_part, 0)
-
- landmarks_norm = landmarks_part - part_mean
- landmarks_deform = landmarks_norm.copy()
- landmarks_deform[:, 1] = scale_x * landmarks_deform[:, 1]
- landmarks_deform[:, 0] = scale_y * landmarks_deform[:, 0]
-
- landmarks_deform = landmarks_deform + part_mean
- landmarks_deform = landmarks_deform + shift_ver * np.array([1, 0]) + shift_horiz * np.array([0, 1])
-
- deform_shape = landmarks.copy()
- deform_shape[part_inds] = landmarks_deform
- return deform_shape
-
-
-def deform_mouth(lms, p_scale=0, p_shift=0, pad=5):
- """ deform mouth landmarks - matching ibug annotations of 68 landmarks """
-
- jaw_line_inds = np.arange(0, 17)
- nose_inds = np.arange(27, 36)
- mouth_inds = np.arange(48, 68)
-
- part_inds = mouth_inds.copy()
-
- # find part spatial limitations
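-    # the limits below keep the deformed mouth below the nose, above the chin and inside the jaw line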
- jaw_pad = 4
- x_max = np.max(lms[part_inds, 1]) + (np.max(lms[jaw_line_inds[jaw_pad:-jaw_pad], 1]) - np.max(
- lms[part_inds, 1])) * 0.5 - pad
- x_min = np.min(lms[jaw_line_inds[jaw_pad:-jaw_pad], 1]) + (np.min(lms[part_inds, 1]) - np.min(
- lms[jaw_line_inds[jaw_pad:-jaw_pad], 1])) * 0.5 + pad
- y_min = np.max(lms[nose_inds, 0]) + (np.min(lms[part_inds, 0]) - np.max(lms[nose_inds, 0])) * 0.5
- max_jaw = np.minimum(np.max(lms[jaw_line_inds, 0]), lms[8, 0])
- y_max = max_jaw - (max_jaw - np.max(lms[part_inds, 0])) * 0.5 - pad
-
- # scale facial feature
- scale = np.random.rand()
- if p_scale > 0.5 and scale > 0.5:
-
- part_mean = np.mean(lms[part_inds, :], 0)
- lms_part_norm = lms[part_inds, :] - part_mean
-
- part_y_bound_min, part_x_bound_min = np.min(lms_part_norm, 0)
- part_y_bound_max, part_x_bound_max = np.max(lms_part_norm, 0)
-
- scale_max_y = np.minimum(
- (y_min - part_mean[0]) / part_y_bound_min,
- (y_max - part_mean[0]) / part_y_bound_max)
- scale_max_y = np.minimum(scale_max_y, 1.2)
-
- scale_max_x = np.minimum(
- (x_min - part_mean[1]) / part_x_bound_min,
- (x_max - part_mean[1]) / part_x_bound_max)
- scale_max_x = np.minimum(scale_max_x, 1.2)
-
- scale_y = np.random.uniform(0.7, scale_max_y)
- scale_x = np.random.uniform(0.7, scale_max_x)
-
- lms_def_scale = deform_part(lms, part_inds, scale_y=scale_y, scale_x=scale_x, shift_ver=0., shift_horiz=0.)
-
- # check for spatial errors
- error = check_deformation_spatial_errors(lms_def_scale, part_inds, pad=pad)
- if error:
- lms_def_scale = lms.copy()
- else:
- lms_def_scale = lms.copy()
-
- # shift facial feature
- if p_shift > 0.5 and (np.random.rand() > 0.5 or not scale):
-
- part_mean = np.mean(lms_def_scale[part_inds, :], 0)
- lms_part_norm = lms_def_scale[part_inds, :] - part_mean
-
- part_y_bound_min, part_x_bound_min = np.min(lms_part_norm, 0)
- part_y_bound_max, part_x_bound_max = np.max(lms_part_norm, 0)
-
- shift_x = np.random.uniform(x_min - (part_mean[1] + part_x_bound_min),
- x_max - (part_mean[1] + part_x_bound_max))
- shift_y = np.random.uniform(y_min - (part_mean[0] + part_y_bound_min),
- y_max - (part_mean[0] + part_y_bound_max))
-
- lms_def = deform_part(lms_def_scale, part_inds, scale_y=1., scale_x=1., shift_ver=shift_y, shift_horiz=shift_x)
- error = check_deformation_spatial_errors(lms_def, part_inds, pad=pad)
- if error:
- lms_def = lms_def_scale.copy()
- else:
- lms_def = lms_def_scale.copy()
-
- return lms_def
-
-
-def deform_nose(lms, p_scale=0, p_shift=0, pad=5):
- """ deform nose landmarks - matching ibug annotations of 68 landmarks """
-
- nose_inds = np.arange(27, 36)
- left_eye_inds = np.arange(36, 42)
- right_eye_inds = np.arange(42, 48)
- mouth_inds = np.arange(48, 68)
-
- part_inds = nose_inds.copy()
-
- # find part spatial limitations
- x_max = np.max(lms[part_inds[:4], 1]) + (np.min(lms[right_eye_inds, 1]) - np.max(lms[part_inds[:4], 1])) * 0.5 - pad
- x_min = np.max(lms[left_eye_inds, 1]) + (np.min(lms[part_inds[:4], 1]) - np.max(lms[left_eye_inds, 1])) * 0.5 + pad
-
- max_brows = np.max(lms[21:23, 0])
- y_min = np.min(lms[part_inds, 0]) + (max_brows - np.min(lms[part_inds, 0])) * 0.5
- min_mouth = np.min(lms[mouth_inds, 0])
- y_max = np.max(lms[part_inds, 0]) + (np.max(lms[part_inds, 0]) - min_mouth) * 0 - pad
-
- # scale facial feature
- scale = np.random.rand()
- if p_scale > 0.5 and scale > 0.5:
-
- part_mean = np.mean(lms[part_inds, :], 0)
- lms_part_norm = lms[part_inds, :] - part_mean
-
- part_y_bound_min = np.min(lms_part_norm[:, 0])
- part_y_bound_max = np.max(lms_part_norm[:, 0])
-
- scale_max_y = np.minimum(
- (y_min - part_mean[0]) / part_y_bound_min,
- (y_max - part_mean[0]) / part_y_bound_max)
- scale_y = np.random.uniform(0.7, scale_max_y)
- scale_x = np.random.uniform(0.7, 1.5)
-
- lms_def_scale = deform_part(lms, part_inds, scale_y=scale_y, scale_x=scale_x, shift_ver=0., shift_horiz=0.)
-
- error1 = check_deformation_spatial_errors(lms_def_scale, part_inds[:4], pad=pad)
- error2 = check_deformation_spatial_errors(lms_def_scale, part_inds[4:], pad=pad)
- error = error1 + error2
- if error:
- lms_def_scale = lms.copy()
- else:
- lms_def_scale = lms.copy()
-
- # shift facial feature
- if p_shift > 0.5 and (np.random.rand() > 0.5 or not scale):
-
- part_mean = np.mean(lms_def_scale[part_inds, :], 0)
- lms_part_norm = lms_def_scale[part_inds, :] - part_mean
-
- part_x_bound_min = np.min(lms_part_norm[:4], 0)
- part_x_bound_max = np.max(lms_part_norm[:4], 0)
- part_y_bound_min = np.min(lms_part_norm[:, 0])
- part_y_bound_max = np.max(lms_part_norm[:, 0])
-
- shift_x = np.random.uniform(x_min - (part_mean[1] + part_x_bound_min),
- x_max - (part_mean[1] + part_x_bound_max))
- shift_y = np.random.uniform(y_min - (part_mean[0] + part_y_bound_min),
- y_max - (part_mean[0] + part_y_bound_max))
-
- lms_def = deform_part(lms_def_scale, part_inds, scale_y=1., scale_x=1., shift_ver=shift_y, shift_horiz=shift_x)
-
- error1 = check_deformation_spatial_errors(lms_def, part_inds[:4], pad=pad)
- error2 = check_deformation_spatial_errors(lms_def, part_inds[4:], pad=pad)
- error = error1 + error2
- if error:
- lms_def = lms_def_scale.copy()
- else:
- lms_def = lms_def_scale.copy()
-
- return lms_def
-
-
-def deform_eyes(lms, p_scale=0, p_shift=0, pad=10):
- """ deform eyes + eyebrows landmarks - matching ibug annotations of 68 landmarks """
-
- nose_inds = np.arange(27, 36)
- left_eye_inds = np.arange(36, 42)
- right_eye_inds = np.arange(42, 48)
- left_brow_inds = np.arange(17, 22)
- right_brow_inds = np.arange(22, 27)
-
- part_inds_right = np.hstack((right_brow_inds, right_eye_inds))
- part_inds_left = np.hstack((left_brow_inds, left_eye_inds))
-
- # find part spatial limitations
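-    # the limits below keep each eye+eyebrow region between the nose (inner side), the face contour
-    # (outer side), the top of the image and the nose base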
-
- # right eye+eyebrow
- x_max_right = np.max(lms[part_inds_right, 1]) + (lms[16, 1] - np.max(lms[part_inds_right, 1])) * 0.5 - pad
- x_min_right = np.max(lms[nose_inds[:4], 1]) + (np.min(lms[part_inds_right, 1]) - np.max(
- lms[nose_inds[:4], 1])) * 0.5 + pad
- y_max_right = np.max(lms[part_inds_right, 0]) + (lms[33, 0] - np.max(lms[part_inds_right, 0])) * 0.25 - pad
- y_min_right = 2 * pad
-
- # left eye+eyebrow
- x_max_left = np.max(lms[part_inds_left, 1]) + (np.min(lms[nose_inds[:4], 1]) - np.max(
- lms[part_inds_left, 1])) * 0.5 - pad
- x_min_left = lms[0, 1] + (np.min(lms[part_inds_left, 1]) - lms[0, 1]) * 0.5 + pad
-
- y_max_left = np.max(lms[part_inds_left, 0]) + (lms[33, 0] - np.max(lms[part_inds_left, 0])) * 0.25 - pad
- y_min_left = 2 * pad
-
- # scale facial feature
- scale = np.random.rand()
- if p_scale > 0.5 and scale > 0.5:
-
- # right eye+eyebrow
- part_mean = np.mean(lms[part_inds_right, :], 0)
- lms_part_norm = lms[part_inds_right, :] - part_mean
-
- part_y_bound_min, part_x_bound_min = np.min(lms_part_norm, 0)
- part_y_bound_max, part_x_bound_max = np.max(lms_part_norm, 0)
-
- scale_max_y = np.minimum(
- (y_min_right - part_mean[0]) / part_y_bound_min,
- (y_max_right - part_mean[0]) / part_y_bound_max)
- scale_max_y_right = np.minimum(scale_max_y, 1.5)
-
- scale_max_x = np.minimum(
- (x_min_right - part_mean[1]) / part_x_bound_min,
- (x_max_right - part_mean[1]) / part_x_bound_max)
- scale_max_x_right = np.minimum(scale_max_x, 1.5)
-
- # left eye+eyebrow
- part_mean = np.mean(lms[part_inds_left, :], 0)
- lms_part_norm = lms[part_inds_left, :] - part_mean
-
- part_y_bound_min, part_x_bound_min = np.min(lms_part_norm, 0)
- part_y_bound_max, part_x_bound_max = np.max(lms_part_norm, 0)
-
- scale_max_y = np.minimum(
- (y_min_left - part_mean[0]) / part_y_bound_min,
- (y_max_left - part_mean[0]) / part_y_bound_max)
- scale_max_y_left = np.minimum(scale_max_y, 1.5)
-
- scale_max_x = np.minimum(
- (x_min_left - part_mean[1]) / part_x_bound_min,
- (x_max_left - part_mean[1]) / part_x_bound_max)
- scale_max_x_left = np.minimum(scale_max_x, 1.5)
-
- scale_max_x = np.minimum(scale_max_x_left, scale_max_x_right)
- scale_max_y = np.minimum(scale_max_y_left, scale_max_y_right)
- scale_y = np.random.uniform(0.8, scale_max_y)
- scale_x = np.random.uniform(0.8, scale_max_x)
-
- lms_def_scale = deform_part(lms, part_inds_right, scale_y=scale_y, scale_x=scale_x, shift_ver=0.,
- shift_horiz=0.)
- lms_def_scale = deform_part(lms_def_scale.copy(), part_inds_left, scale_y=scale_y, scale_x=scale_x,
- shift_ver=0., shift_horiz=0.)
-
- error1 = check_deformation_spatial_errors(lms_def_scale, part_inds_right, pad=pad)
- error2 = check_deformation_spatial_errors(lms_def_scale, part_inds_left, pad=pad)
- error = error1 + error2
- if error:
- lms_def_scale = lms.copy()
- else:
- lms_def_scale = lms.copy()
-
- # shift facial feature
- if p_shift > 0.5 and (np.random.rand() > 0.5 or not scale):
-
- y_min_right = np.maximum(0.8 * np.min(lms_def_scale[part_inds_right, 0]), pad)
- y_min_left = np.maximum(0.8 * np.min(lms_def_scale[part_inds_left, 0]), pad)
-
- # right eye
- part_mean = np.mean(lms_def_scale[part_inds_right, :], 0)
- lms_part_norm = lms_def_scale[part_inds_right, :] - part_mean
-
- part_y_bound_min, part_x_bound_min = np.min(lms_part_norm, 0)
- part_y_bound_max, part_x_bound_max = np.max(lms_part_norm, 0)
-
- shift_x = np.random.uniform(x_min_right - (part_mean[1] + part_x_bound_min),
- x_max_right - (part_mean[1] + part_x_bound_max))
- shift_y = np.random.uniform(y_min_right - (part_mean[0] + part_y_bound_min),
- y_max_right - (part_mean[0] + part_y_bound_max))
-
- lms_def_right = deform_part(lms_def_scale, part_inds_right, scale_y=1., scale_x=1., shift_ver=shift_y,
- shift_horiz=shift_x)
-
- error1 = check_deformation_spatial_errors(lms_def_right, part_inds_right, pad=pad)
- if error1:
- lms_def_right = lms_def_scale.copy()
-
- # left eye
- part_mean = np.mean(lms_def_scale[part_inds_left, :], 0)
- lms_part_norm = lms_def_scale[part_inds_left, :] - part_mean
-
- part_y_bound_min, part_x_bound_min = np.min(lms_part_norm, 0)
- part_y_bound_max, part_x_bound_max = np.max(lms_part_norm, 0)
-
- shift_x = np.random.uniform(x_min_left - (part_mean[1] + part_x_bound_min),
- x_max_left - (part_mean[1] + part_x_bound_max))
- shift_y = np.random.uniform(y_min_left - (part_mean[0] + part_y_bound_min),
- y_max_left - (part_mean[0] + part_y_bound_max))
-
- lms_def = deform_part(lms_def_right.copy(), part_inds_left, scale_y=1., scale_x=1., shift_ver=shift_y,
- shift_horiz=shift_x)
-
- error2 = check_deformation_spatial_errors(lms_def, part_inds_left, pad=pad)
- if error2:
- lms_def = lms_def_right.copy()
- else:
- lms_def = lms_def_scale.copy()
-
- return lms_def
-
-
-def deform_scale_face(lms, p_scale=0, pad=5, image_size=256):
- """ change face landmarks scale & aspect ratio - matching ibug annotations of 68 landmarks """
-
- part_inds = np.arange(68)
-
- # find spatial limitations
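-    # the limits below keep the rescaled face inside the image, with a margin controlled by pad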
- x_max = np.max(lms[part_inds, 1]) + (image_size - np.max(lms[part_inds, 1])) * 0.5 - pad
- x_min = np.min(lms[part_inds, 1]) * 0.5 + pad
-
- y_min = 2 * pad
- y_max = np.max(lms[part_inds, 0]) + (image_size - np.max(lms[part_inds, 0])) * 0.5 - pad
-
- if p_scale > 0.5:
-
- part_mean = np.mean(lms[part_inds, :], 0)
- lms_part_norm = lms[part_inds, :] - part_mean
-
- part_y_bound_min, part_x_bound_min = np.min(lms_part_norm, 0)
- part_y_bound_max, part_x_bound_max = np.max(lms_part_norm, 0)
-
- scale_max_y = np.minimum(
- (y_min - part_mean[0]) / part_y_bound_min,
- (y_max - part_mean[0]) / part_y_bound_max)
- scale_max_y = np.minimum(scale_max_y, 1.2)
-
- scale_max_x = np.minimum(
- (x_min - part_mean[1]) / part_x_bound_min,
- (x_max - part_mean[1]) / part_x_bound_max)
- scale_max_x = np.minimum(scale_max_x, 1.2)
-
- scale_y = np.random.uniform(0.6, scale_max_y)
- scale_x = np.random.uniform(0.6, scale_max_x)
-
- lms_def_scale = deform_part(lms, part_inds, scale_y=scale_y, scale_x=scale_x, shift_ver=0., shift_horiz=0.)
-
- # check for spatial errors
- error2 = np.sum(lms_def_scale >= image_size) + np.sum(lms_def_scale < 0)
- error1 = len(np.unique((lms_def_scale).astype('int'), axis=0)) != len(lms_def_scale)
- error = error1 + error2
- if error:
- lms_def_scale = lms.copy()
- else:
- lms_def_scale = lms.copy()
-
- return lms_def_scale
-
-
-def deform_face_geometric_style(lms, p_scale=0, p_shift=0):
- """ deform facial landmarks - matching ibug annotations of 68 landmarks """
-
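-    # scale the whole face first, then deform the nose, mouth and eyes independently (pad=0 disables the extra safety margins)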
- lms = deform_scale_face(lms.copy(), p_scale=p_scale, pad=0)
- lms = deform_nose(lms.copy(), p_scale=p_scale, p_shift=p_shift, pad=0)
- lms = deform_mouth(lms.copy(), p_scale=p_scale, p_shift=p_shift, pad=0)
- lms = deform_eyes(lms.copy(), p_scale=p_scale, p_shift=p_shift, pad=0)
- return lms
-
-
-def get_bounds(lms):
-    part_y_bound_min, part_x_bound_min = np.min(lms, 0)
-    part_y_bound_max, part_x_bound_max = np.max(lms, 0)
- return np.array([[part_x_bound_min, part_x_bound_max], [part_y_bound_min, part_y_bound_max]])
-
-
-def part_intersection(part_to_check, points_to_compare, pad=0):
- points_to_compare = np.round(points_to_compare.copy())
- check_bounds = np.round(get_bounds(part_to_check))
- check_bounds[:, 0] += pad
- check_bounds[:, 1] -= pad
-    inds_y = np.where(np.logical_and(points_to_compare[:,0] > check_bounds[1,0], points_to_compare[:,0]
[remainder garbled in extraction: the end of deformation_functions.py is fused here with the binary content of a deleted pickled menpofit PDM/CLM model file]
diff --git a/marlenezw/audio-driven-animations/MakeItTalk/face_of_art/pdm_clm_models/clm_models/g_t_mouth b/marlenezw/audio-driven-animations/MakeItTalk/face_of_art/pdm_clm_models/clm_models/g_t_mouth
deleted file mode 100644
index 1afcfee3f51fa5ad08b2866ddc9c1bb229720648..0000000000000000000000000000000000000000
--- a/marlenezw/audio-driven-animations/MakeItTalk/face_of_art/pdm_clm_models/clm_models/g_t_mouth
+++ /dev/null
@@ -1,486 +0,0 @@
-ccopy_reg
-_reconstructor
-p0
-(cmenpofit.clm.base
-CLM
-p1
-c__builtin__
-object
-p2
-Ntp3
-Rp4
-(dp5
-S'opt'
-p6
-(dp7
-S'ablation'
-p8
-(I01
-I01
-tp9
-sS'verbose'
-p10
-I00
-sS'rho2'
-p11
-I20
-sS'sigRate'
-p12
-F0.25
-sS'ratio2'
-p13
-F0.08
-sS'smooth'
-p14
-I01
-sS'dataset'
-p15
-S'demo'
-p16
-sS'ratio1'
-p17
-F0.12
-sS'pdm_rho'
-p18
-I20
-sS'sigOffset'
-p19
-I25
-sS'kernel_covariance'
-p20
-I10
-sS'numIter'
-p21
-I5
-ssS'_shape_model_cls'
-p22
-(lp23
-cmenpofit.modelinstance
-OrthoPDM
-p24
-asS'max_shape_components'
-p25
-(lp26
-NasS'scales'
-p27
-(lp28
-I1
-asS'diagonal'
-p29
-I200
-sS'holistic_features'
-p30
-(lp31
-cmenpo.feature.features
-no_op
-p32
-asS'patch_shape'
-p33
-(lp34
-(I8
-I8
-tp35
-asS'expert_ensemble_cls'
-p36
-(lp37
-cmenpofit.clm.expert.ensemble
-FcnFilterExpertEnsemble
-p38
-asS'expert_ensembles'
-p39
-(lp40
-g0
-(g38
-g2
-Ntp41
-Rp42
-(dp43
-S'sample_offsets'
-p44
-NsS'cosine_mask'
-p45
-I01
-sS'context_shape'
-p46
-(I8
-I8
-tp47
-sg33
-g35
-sS'response_covariance'
-p48
-I3
-sS'patch_normalisation'
-p49
-g32
-sS'_icf'
-p50
-Nsbasg45
-I01
-sS'shape_models'
-p51
-(lp52
-g0
-(g24
-g2
-Ntp53
-Rp54
-(dp55
-S'similarity_model'
-p56
-g0
-(cmenpofit.modelinstance
-_SimilarityModel
-p57
-g2
-Ntp58
-Rp59
-(dp60
-S'_components'
-p61
-cnumpy.core.multiarray
-_reconstruct
-p62
-(cnumpy
-ndarray
-p63
-(I0
-tp64
-S'b'
-p65
-tp66
-Rp67
-(I1
-(I4
-I40
-tp68
-cnumpy
-dtype
-p69
-(S'f8'
-p70
-I0
-I1
-tp71
-Rp72
-(I3
-S'<'
-p73
-NNNI-1
-I-1
-I0
-tp74
-bI00
-S'\xe0\x8c\xb6\x92\xfd\xcc\xa1\xbf\xd5\xc2\xdcB\xe3\xab\xd8\xbf<\xbe3\xda!\xe7\xb7\xbfT2\xa6\xcc"\x03\xcf\xbf\xb2\xb8<\xb1\\\x17\xc0\xbfd\xe5\xf8\x1f\xb7s\xb9\xbfYB\xd5\xbc\xf6\x9e\xb9\xbf\xf4\xc2\xe4\x94\x8a\x00G\xbf;\x10\x05\xcbh,\xc0\xbf\xb6Y\xdb\x1b\x154\xb9?\x8b\x19\xfao\xa7@\xb8\xbf\xba\x16\x99R\x80\x04\xcf?\xc9\xa8B\xd1\x019\xa2\xbf\'W;]\xac\xaf\xd8?\xb9\xfa\x8b\x02\xfb:\xb9?S?\xe8ue\xab\xcf?\x94\xbf\xdb9k"\xc4?\xcd\x8d?\x03\xb1\xac\xbb?\xf8E)\x80\xe3\x99\xc5?\x18E/\x05\xad\xc9%?.\xfe\xeb\xa7\x016\xc4?\xbd\xf4)Nx\x83\xbb\xbfd\xae\xf14\xde\x8f\xb9?\x9bT\xaa\xa9\xe1\x82\xcf\xbf]\xa2\xc7\x0f,\x8e\x9c\xbf2\x022#\xe3\xcf\xd4\xbf\x12\t\xca+\xa0\x85\xa3\xbfzq\xbe\xde-\xc1\xb9\xbf\x93\x075umK\x9c\xbf\x81\xe3\xef\x9c\x01\x876\xbf\xeb\x9c\x84\xac\\\xd5\xa3\xbfw\xe5\xbc\xa2!\xab\xb9?\xd6TE\xcd\x06\xc7\x9d\xbf\x86\t\xd8`\xc5\xd1\xd4?=[\x0f&\xa5?\x9a?\xed\x00.\xfb\x1c-\xba?\xd0\xf7\xfb\x81Qy\xa3?\x9f\xae=)\xadH"\xbf\x81Yf\xb1\x94\xab\x9a?\xc9\xcf\xbf\xddO(\xba\xbf\xd6\xc2\xdcB\xe3\xab\xd8?\xd0\x8c\xb6\x92\xfd\xcc\xa1\xbfS2\xa6\xcc"\x03\xcf?;\xbe3\xda!\xe7\xb7\xbfe\xe5\xf8\x1f\xb7s\xb9?\xb2\xb8<\xb1\\\x17\xc0\xbfw\xc3\xe4\x94\x8a\x00G?YB\xd5\xbc\xf6\x9e\xb9\xbf\xb4Y\xdb\x1b\x154\xb9\xbf;\x10\x05\xcbh,\xc0\xbf\xb8\x16\x99R\x80\x04\xcf\xbf\x8c\x19\xfao\xa7@\xb8\xbf\'W;]\xac\xaf\xd8\xbf\xce\xa8B\xd1\x019\xa2\xbfR?\xe8ue\xab\xcf\xbf\xb8\xfa\x8b\x02\xfb:\xb9?\xcd\x8d?\x03\xb1\xac\xbb\xbf\x93\xbf\xdb9k"\xc4?kH/\x05\xad\xc9%\xbf\xf9E)\x80\xe3\x99\xc5?\xb9\xf4)Nx\x83\xbb?.\xfe\xeb\xa7\x016\xc4?\x99T\xaa\xa9\xe1\x82\xcf?e\xae\xf14\xde\x8f\xb9?1\x022#\xe3\xcf\xd4?K\xa2\xc7\x0f,\x8e\x9c\xbfzq\xbe\xde-\xc1\xb9?\x10\t\xca+\xa0\x85\xa3\xbf\xbe\xe3\xef\x9c\x01\x876?\x92\x075umK\x9c\xbfu\xe5\xbc\xa2!\xab\xb9\xbf\xec\x9c\x84\xac\\\xd5\xa3\xbf\x85\t\xd8`\xc5\xd1\xd4\xbf\xdeTE\xcd\x06\xc7\x9d\xbf\xed\x00.\xfb\x1c-\xba\xbf:[\x0f&\xa5?\x9a?\xf9\xad=)\xadH"?\xd0\xf7\xfb\x81Qy\xa3?\xc8\xcf\xbf\xddO(\xba?\x84Yf\xb1\x94\xab\x9a?\xce\xed\xbf\xc5%\x9f\xcc\xbf\x04\xcd"-\x05\x82\xa6\xbc\xd3\xed\xbf\xc5%\x9f\xcc\xbf\x86\xe2m\xf9\xfes\xa5\xbc\xd9\xed\xbf\xc5%\x9f\xcc\xbf0\xa3\xbc\xd1\xd7\x91\xa5\xbc\xdb\xed\xbf\xc5%\x9f\xcc\xbf s\xf2\x87\xfa\xf4\x9d\xbc\xe1\xed\xbf\xc5%\x9f\xcc\xbf\x85\xc7^\xd1\x82\x9d\x9f\xbc\xe5\xed\xbf\xc5%\x9f\xcc\xbf 
&\x91\x95&D\x8c\xbc\xea\xed\xbf\xc5%\x9f\xcc\xbf/\x11\xf6\xb7+\xd9\x8a<\xe3\xed\xbf\xc5%\x9f\xcc\xbf\xd6\x94\xb3\x98\xec\xb6\xa5<\xdf\xed\xbf\xc5%\x9f\xcc\xbf&\xeb+rW}\xaa<\xd8\xed\xbf\xc5%\x9f\xcc\xbf)?p\xa2\xf0j\xa9<\xd9\xed\xbf\xc5%\x9f\xcc\xbf\xb3f\xa4\xc5\x03\x91\xa4<\xd2\xed\xbf\xc5%\x9f\xcc\xbf\xcf\x82\xe6\xe5\x8f\xe5\x8d<\xcf\xed\xbf\xc5%\x9f\xcc\xbf\xe2\xc0\x19\x82\x85%\x9a\xbc\xd9\xed\xbf\xc5%\x9f\xcc\xbf\x07\x0fVr\xdc\x02\x91\xbc\xdb\xed\xbf\xc5%\x9f\xcc\xbf\xfbc\x04S1x\x80\xbc\xde\xed\xbf\xc5%\x9f\xcc\xbf\xe5\xab8f\x08\x16y\xbc\xe7\xed\xbf\xc5%\x9f\xcc\xbf2\x9e\x94\x0e\xff\xc2\x80<\xe0\xed\xbf\xc5%\x9f\xcc\xbf9Z\xf3\x968e\x8c<\xdb\xed\xbf\xc5%\x9f\xcc\xbf\xe0O\xfb\xe3\xac\xc5\x86<\xd7\xed\xbf\xc5%\x9f\xcc\xbf\x08\xdf\x7fX\xb4"R<\x8d\x97\xa7\x04-\xb6\xa1<\xcc\xed\xbf\xc5%\x9f\xcc\xbf\xe9\xd1=\x7fh{\x99<\xd8\xed\xbf\xc5%\x9f\xcc\xbf\xad\x11\xab\xb9\xab9\xa4<\xda\xed\xbf\xc5%\x9f\xcc\xbf\xc0\x04qu\x16\x00\x9d<\xdc\xed\xbf\xc5%\x9f\xcc\xbfU+I\xbf\xdd\xb1\xa1<\xdf\xed\xbf\xc5%\x9f\xcc\xbf\xac\xef\x80DSf\x94<\xe7\xed\xbf\xc5%\x9f\xcc\xbf\x1c7\xdbv\xbcgw\xbc\xeb\xed\xbf\xc5%\x9f\xcc\xbf\x1e\xa8fu\xc7\xa6\xa3\xbc\xe3\xed\xbf\xc5%\x9f\xcc\xbf\xce\x8c\xc1\x96*\x92\xaa\xbc\xdc\xed\xbf\xc5%\x9f\xcc\xbfx8\x1a\x84\xc1o\xaa\xbc\xdb\xed\xbf\xc5%\x9f\xcc\xbf\x17\xbe\x07F\xe3\xc2\xa5\xbc\xd6\xed\xbf\xc5%\x9f\xcc\xbf\xb3\x815i\xf5\xf5\x97\xbc\xd1\xed\xbf\xc5%\x9f\xcc\xbf\xa1(\xe4\xa29\xe6\x92<\xcf\xed\xbf\xc5%\x9f\xcc\xbf\xde/p66\t\x8c<\xd8\xed\xbf\xc5%\x9f\xcc\xbfG)\x8a4N\x1e{<\xdc\xed\xbf\xc5%\x9f\xcc\xbf\xfc\xfaC\x00R$y<\xe0\xed\xbf\xc5%\x9f\xcc\xbf\xe7\xe1\xf1\x14\x81\x9cl\xbc\xe4\xed\xbf\xc5%\x9f\xcc\xbfmk\xcb\x89\x84\x03\x88\xbc\xe0\xed\xbf\xc5%\x9f\xcc\xbf\xaa4\xd3\x02\x9e\x1b\x8a\xbc\xdc\xed\xbf\xc5%\x9f\xcc\xbf\x14\x9fyZ\xab\xdbu\xbc\xd8\xed\xbf\xc5%\x9f\xcc\xbf'
-p75
-tp76
-bsS'_mean'
-p77
-g62
-(g63
-(I0
-tp78
-g65
-tp79
-Rp80
-(I1
-(I40
-tp81
-g72
-I00
-S'\xea\xb02\x7fy\xf4\x1d\xc0\x0b\x1e\xc1\xf5*\xc2T\xc0\x02\x9b\xc8L\x9e\x1c4\xc0\xd2\xf3 \xb3\xf6\x17J\xc0\x1c-\x80G\t\x14;\xc0I\xe04WMj5\xc0\xfal\xd7\xf5\xb0\x8e5\xc0\x1cP\xaf\x93\x99Z\xc3\xbf\xde\xc1@Wt7;\xc0\xa6\x9a\x19\xf9\xc245@\x10\xa4<\x15\xf1g4\xc0\x83\xf2\x96\xc9\x1c\x19J@&zp\x84>\xaa\x1e\xc0fQ3\\Z\xc5T@\xff\x07\xf0\xcb\x90:5@q\x8b\x13\x95\x89\xa5J@50"^\xe9\xf0@@\x7f2Ul\tI7@\xe0!\x10\xa9\xd4,B@WI\xa2\xe7\tU\xa2?\xd8\x16:~d\x01A@\x8e\x8d\xa9hZ&7\xc0\x12\x82=[\xfd\x815@\tI\xe2\xc8r\x83J\xc09\xa7\x9ei\xc1\x06\x18\xc0%\xfc\xf9\n\xde\x82Q\xc0\n\xa1\xab~\xfcl \xc0b\x01\xd0\xddz\xab5\xc0\xd9*q\xcc\x98\xce\x17\xc0k\xb0k9W\xf4\xb2\xbf\xcd\xb7w\x82\x13\xb0 \xc0{\xff[\xd3\xed\x985@\x05\xd0\x8f\x81\xfd\r\x19\xc0?\x9f{\xccs\x84Q@\x8b\xbf\x99R\xe3\x15\x16@\xb7\xf2a\x98K\x066@\xafJh\x89\xa1b @\x89\\\xe9\xc7\x9c\xc4\x9e\xbf\xed&|j\xb4p\x16@\xb39\xfdkA\x026\xc0'
-p82
-tp83
-bsS'template_instance'
-p84
-g0
-(cmenpo.shape.pointcloud
-PointCloud
-p85
-g2
-Ntp86
-Rp87
-(dp88
-S'points'
-p89
-g62
-(g63
-(I0
-tp90
-g65
-tp91
-Rp92
-(I1
-(I20
-I2
-tp93
-g72
-I00
-S'\xea\xb02\x7fy\xf4\x1d\xc0\x0b\x1e\xc1\xf5*\xc2T\xc0\x02\x9b\xc8L\x9e\x1c4\xc0\xd2\xf3 \xb3\xf6\x17J\xc0\x1c-\x80G\t\x14;\xc0I\xe04WMj5\xc0\xfal\xd7\xf5\xb0\x8e5\xc0\x1cP\xaf\x93\x99Z\xc3\xbf\xde\xc1@Wt7;\xc0\xa6\x9a\x19\xf9\xc245@\x10\xa4<\x15\xf1g4\xc0\x83\xf2\x96\xc9\x1c\x19J@&zp\x84>\xaa\x1e\xc0fQ3\\Z\xc5T@\xff\x07\xf0\xcb\x90:5@q\x8b\x13\x95\x89\xa5J@50"^\xe9\xf0@@\x7f2Ul\tI7@\xe0!\x10\xa9\xd4,B@WI\xa2\xe7\tU\xa2?\xd8\x16:~d\x01A@\x8e\x8d\xa9hZ&7\xc0\x12\x82=[\xfd\x815@\tI\xe2\xc8r\x83J\xc09\xa7\x9ei\xc1\x06\x18\xc0%\xfc\xf9\n\xde\x82Q\xc0\n\xa1\xab~\xfcl \xc0b\x01\xd0\xddz\xab5\xc0\xd9*q\xcc\x98\xce\x17\xc0k\xb0k9W\xf4\xb2\xbf\xcd\xb7w\x82\x13\xb0 \xc0{\xff[\xd3\xed\x985@\x05\xd0\x8f\x81\xfd\r\x19\xc0?\x9f{\xccs\x84Q@\x8b\xbf\x99R\xe3\x15\x16@\xb7\xf2a\x98K\x066@\xafJh\x89\xa1b @\x89\\\xe9\xc7\x9c\xc4\x9e\xbf\xed&|j\xb4p\x16@\xb39\xfdkA\x026\xc0'
-p94
-tp95
-bsS'_landmarks'
-p96
-NsbsbsS'similarity_weights'
-p97
-g62
-(g63
-(I0
-tp98
-g65
-tp99
-Rp100
-(I1
-(I4
-tp101
-g72
-I00
-S'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
-p102
-tp103
-bsS'_weights'
-p104
-g62
-(g63
-(I0
-tp105
-g65
-tp106
-Rp107
-(I1
-(I36
-tp108
-g72
-I00
-S'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
-p109
-tp110
-bsS'_target'
-p111
-g0
-(g85
-g2
-Ntp112
-Rp113
-(dp114
-g89
-g62
-(g63
-(I0
-tp115
-g65
-tp116
-Rp117
-(I1
-(I20
-I2
-tp118
-g72
-I00
[remainder of pickled menpofit/menpo model data omitted: protocol-0 pickle of a PCAModel with alignment transform (serialized NumPy component, eigenvalue and mean-shape arrays; n_samples 3148, 25 active components); raw serialized bytes not reproduced]
diff --git a/marlenezw/audio-driven-animations/MakeItTalk/face_of_art/pdm_clm_models/pdm_models/basic_mouth_5 b/marlenezw/audio-driven-animations/MakeItTalk/face_of_art/pdm_clm_models/pdm_models/basic_mouth_5
deleted file mode 100644
index 4e27600b7e5602605de25f9c78f6c2415c549b21..0000000000000000000000000000000000000000
--- a/marlenezw/audio-driven-animations/MakeItTalk/face_of_art/pdm_clm_models/pdm_models/basic_mouth_5
+++ /dev/null
@@ -1,338 +0,0 @@
[pickled menpofit OrthoPDM model for basic_mouth_5 omitted: protocol-0 pickle of a similarity model, a 20-point mean shape (PointCloud), and a PCAModel with 5 retained eigenvalues; raw serialized bytes not reproduced]
\x1d\x9e+\xf5\x89\xbfo\n\xdfYb\xb5\xa3?\xb8\xd7\xd5vB!g\xbf\x86;\x8c\xc4\xc8\x81\x80\xbf\xfamKI\x86\x0f\xac?\x80\x8cqF\xec\xad\x99\xbf\xc0\x9e\xb8\xa3\xb5\x16\xab\xbf\xf4\xc9\x90\xf4i\xdc\x8d?h\xdc,\x96%[\x7f?\x8aqj\xd0\x19\t\xb0\xbf\xd0tB&\x0fH\x89\xbf\xff&\x9e\x9f7\x87\xa0?,\xc9\x93\xbe0\xa5\x8b?\x9cm-K\xef\x13\xb8?\x89X\x88\xd0v!\x91?\x13\x94h:gg\xb1\xbf\xfa\x84\x8f\x02\xcbL\x81\xbfr6\xef\xe1\x85\x12\x99?\xb3/\xe7\xa6#\x99\xa4?\x99S1\x12\xc7\xbd\xaa?Z\x0bv\xb5f\xed\xb4\xbf\'\xca\xd5\xd9G\x02\xbf\xbf\xd2\xcc\x10\x07(8_?\x11\x05{\x01\xcb\xcb\xbd?l;`E\xb6\'\xa6?\xd5z#\xb1f\x14\xb4\xbfl\xa4K\'?\xb3\x99\xbf\xb5Z\xac`\x03\xd9\xba\xbf\xfc\xc4\xd9k\xb0(\xae?\x9a>tBA\x9a\xb8?B\xe97\xa5\xca\xc7|?\x9c\x04[\xab\xc9y\xa4?\xf1\x1c\x9eD\xbe\x17\xb9?\x1a-\xd3\xc9\xaee\xd7\xbf}\xc2\xaf\xeel\xaf\x8c\xbf9\xf9@\x94\x03\xbf\xbd?=\xeb\xb8K_\xd0\x89?\xf7\xc0\xe5\xd4\x1c\xb9\xc1?\x1fS5rxr\xc4\xbf\xaf\x9b\xcaT{}\xd6?\x97\x16\x9d1\xab\x1f\xa4\xbf\x1e"\xf3\xa1\x96\xfb\xc2\xbf"5\x8a&\xf3\x99\xbc?\x97\x94t\xe5\xc3\x99\x95\xbf\x9dS\xe9\x95\x1f~\xb9\xbf\xe2K+\xdd\xdd%\xc4?\xda\x19\xb9\xf6\xfdF\xb6?\xa6\x82{4O\x04\xc4\xbfz\x92>Fl^\xc7\xbf\x85\x1a\xc4\xc6Uf\xb6\xbf|\xac\x03j\x85\xa1\xc2?\x90 \xb9\xf0[\xd1\xb1?`~X\xe0\xa0\xed\x7f\xbf3\x90\x04\xca\xd43\xb9?-&b\x0e\xa8z\xb5?2\x07dC\xed\x80\xb8\xbfp\xc9\xeeR\x0f\\\xb7\xbf]\x06\xb6\xdas\xfa\xc4?F\xfa\x88\x8c~Sj\xbfc\x1e\x9f\x19/P\xae\xbf\x01\x19\xf0\x9cuD\xc2?\xad\x1c\x06I|g\xbc\xbf\x8d\xf4\xc1a\xe6\xca\x88\xbf\xfa\xa5\xd7lM\x93\xb1\xbf\xe2\xd4G\xef6\x07\x97\xbfJ\xb0\xf6\x00\xf8p\x90\xbfw@-\x93P\xb8\x8c?B\xc4\xe7\x1e\xaf\x81\x9b\xbfpi\xedR\xbe}\xaa?l\xe0DXL\xff\xa3\xbf\xed\xaao\x9f\x9a\xb0\xac?p`\x9f\xe8\x17\xae\xb8?\xb1B*\x91\x01\x93\xc3?\x8c\xf4I\xc9N\xe7\x91\xbfu\xdb\x07b\'\xa9\xb2\xbf\xa1d\xb6@\xcd\x01\xb9\xbf\x80\xd5\r!\x9c\xb0*?\x0cE%_\xb2\x91\xd2\xbf\x0f\x88Y\x9e\xd5\xebx?\xd8\xd2-\xf5\xf3fp\xbf\xd6\xe0&\xcdl\xea\x84?:t\n\xc5\x14\xc6\xd2?fs\x85\x9a\xb6\xc5\xa7\xbf\x8f\x18@\xec \xca\xb8?\xb0\x126/\x12\xb8\x93?\xe0\xee\xbcUt\xdf\xab?N\xc6\xa4R\xfbO\xc0\xbf\xc0\xdc5\xf7\xa4R\xa5\xbf\xdfbW\xf2D\xa3\xbc\xbf\xaa6\xce\xd1\x08\xc4\x85\xbf2q\x0f\xe9\xc8\xb2\xa6\xbf\xe0\x07m\x19\xd4\xbf$?\xa4\xc2
\xac\x1c\xc2]\xec\x90\x98\xbf\xf6{\x81?\x08#f\xbfQ+\xc2\\s\x85\xa2?\x93\xc5&\xf3\xd4\x04\x81?\xa93VWp\xad\x9e\xbf\x12\xf8D\xeb\x89\xe8\x88\xbf"\xdb\xa5+\x95\xa5w?\x0e\xb5\x83|\xfe\xb9\x84?\x13\xf4\x9c\x0e\\\xc5\x84?[\xee\xc43\xd2:c?\'Lo\x8d\x9dRt\xbf0\x93;\xbf\x97\x85\x88\xbf\x19=E\xf5\x8f;\x8e\xbf\xdd=\xe6\ne0|?\\\xe0\xf0Jv+\x82?n\xba\x88f\r\xf6\x86\xbf r/y\x06A\'?+\xfaM\x8a\xd6\x17\x91?\t\xf5<(m\x95|?\xad^\xdeK?\x1d\x92\xbf\xca\x82\x0c\xba\xd1\x06\xa5\xbf\xfb\x90\xee\xc9\xbd\xe9\x90?\xe6\x8bE\xdf\x90\xa2\xb2?\xf1~\xe4\x90B\xce\x96\xbf\xfd\x0b\x02)\xeaO\xb7\xbfe\x08=c\xa5J\x8e?\xaa\xc1t\t\x15\x07\xc0?\x05 S\xbc\x89\x15\x86?\xc2\x95\xff\x16\xbb\xbe\xbc\xbf\xc4\xc9\x7f\xb6\x80\xa8\x86\xbf\xee\xee\x0b\x904\xb6\xa1?\x00x\x04\xa7\xc1}!?H\x97\xd2<\xfe\xc9u\xbfH3\x7f\xf0\xb2\xcc\x8d?\x04\xca\x01\x99\x11\xc4\x80?F\xa2K\xcb\x9a\x97\x8e\xbf\x85\x83\xe7\x10\x99?h\xdd\xd0\x89\xe8,\x96\xbf\x8d\x8c\xcd\xa6\xb9\xa6\xa7\xbf\x94\xbc\xb5\xb9t\xd0\x8a?0\xa1\x92J(T\x96?\x9b\xc6]U\x08\x1c\x95?\xb2\x1bU\x1d\x15\xe2\x94\xbf\x81\xeb\x19\xfc`\xf5\xa5\xbf]L\x91\x02T.\xaf?\x14\x85\xaa\x1e\xddj\xa3?\xc6\x1e\xdbvI\x9a\xb2\xbf\xe0MN`yyU?B\xa7\x05(\x0f9\xa9?\xd8X\x84\xc3t\x9f\x8f\xbf\x06\x8cZ\xe1\xcc\x94\x93\xbf\x10\x07\xfc0\x8e\x87i?\xf8R\xd17^\x06\x9a?\x87U\xbc\x8a\xa0z\x9c?\xe4\xeb\xbeIwcg?\xf8\x8b\xf1-74\xb0\xbf2R\x12\xf7\xe2\xccx\xbf\x90\x88\x7fCy\'\xa3?\x82\xc5%\xc9/\x07\x88\xbfvz*"\xea8\xab\xbf\xc4H\xf5;\xfc;\xa7?\xea\x8eY\xa9%\xa5\xbf?\xc8\x10&%\x18V\xab\xbfw7&\x9c\xf2\x94\xc5\xbf\x9b\x85\xba\r\xa25\xa3\xbf\x1e\xd5X\xb6\x15d\xc2?\x94w\x96(X~\xbc?\xf0:\nW+|\xac\xbf8\x85\x00\x19\xae\xb4\xa7\xbf\x19v\x88d\xf9\xf4\xad?\xd5\xdd\xa7\x118\xad\xbf?n\xc2\t\x00+\xc2n\xbf\xaf\xe1\xa0\x91\xab\xec\xd2\xbf\xb4H\x8d!\x14N\x84?\xd5N\x8d\xc3\x9f\'\xcd?$\xf9\x99\xe9fj\xc6?\x1a\xdf\x16C^\xff\xc6\xbf\xbfArM\n\x86\xc2\xbf\xed\xcc\x95\xdb\xa2\xa4\xd1?\xbaD\xd0\xe7F\x80\xb1\xbfz\t\xfc\xeb\xa6\xda\xc5\xbf\xc6\xdd1\xc1\xca\xbb\xc4\xbf\x14N0,\xe4,\xc1\xbf>\xc1\x13\xbe\xed\xf6\xb3?\xdfo\xba\xae\xc2\xa2\xd1?:3<\xcb=\xd2\xb4\xbf\xf2\xcf\x9c8%s\xd4\xbf\x95vZ\x8b@)\xac\xbf\n\x11\x16P\xd5\xbb\xc6?\xecsL\xbf\x84\x9b\xbf?\xc0\x96a\xc7X\xf9\xc7\xbf:H\xd28\xed0\xb3?\xf8\xadl*\xdc*\xc7?A0D\x17\xd7\xa2\xa9\xbfN\x1c^\xc7Q\x8e\x86?\x86\xb6q\xf0\xe9Zq?P3\xb2<\xc7\x10\x80?H\x88\\G\x92}\x9c\xbf\xca5\xf31N\x04\xaf?\x07-\x02\x0f\n\x84\x95\xbf\x1aS\xf2(\x9a\x0f\x9b\xbf\xce\xce9B(3\xb2?\xd4\xa4\x1d\x8a\x975\xbf\xbf\x08m-\\\x81O\xa1\xbf\xa2\x17P\x8ec\xfe\x92?\x97\x88\'\xe8\x1b1\xac?r\xa2\r\xb7g\x96\x90?D\xcc\xe5v%\x91\x87\xbf\xa2\xca\xfb\x97k\x19\x8e\xbf\x8d\n\x05\x7f\xe3\x1b\xbd?\x12\x15G\xb30\xf0\xaf?\xd6+\xec\x12\xe8(??\x06#C\x8dh`\x87?\x9a\x8c\xf8H\xc5[\xbb\xbf~\xb4\x83z\r\xce\xab\xbf\x90\x1c\xf3\xa7\xdd\x81_\xbf\xdc\xd9G\xb4\x1b&n?\xac\xbf\x8a\x1fH\x82\xb3?Z\xc7\x13\t\xb7\xac\x89\xbf\xd6\xa9HG\xb3\xd5\xa1\xbfPO\xbd\t~ZH\xbf\x12\xde\xb3b\xeb\x11\xab?\xc2.\x10\xa4\xf6\t\x9a\xbf\xa2\x81\xd0\xdd}\x97\x90\xbfL{u\x1c\x8d\x8e\xa5?`\xd7\xac$\x07\x8a\xb0\xbfd{O\xcc\xa3\xda\x90\xbf\x8aD\xd2.?\x1f\xab\xbf\xc49\x01\xe6^\xe3\xa4?\xab-h\xc0e=\xb9\xbf\xdbp\xe3\xb5\xcbE\x7f\xbf\xd7\xa4A\'\x10\x0b\xc4?\xf8c\xbdA\xd1@\x81?\x00t-\x00\x1d\xa0:\xbf\xfbH#\x04\x1b\xc6\x84\xbf\x13\x93=\x11\xd5\xd4K?L\xb6pXo\xe3\xa0?\x0c\xde\xa87\x7f\x18`\xbf_\xbd\x9e+x~\xa4\xbf$4$\x82\xca"V\xbf7\xb0\xbb\xb0\xf5\x1c\xa6?\x8f\xe0\xdcIh|\x82?\x8d\x80\\\x81\x1a\x1b\xa2\xbfe\x89\xc0w\xea\x06\x90\xbfY=\xc5\xa4?\xeb\x82?9\xed\x816\xc5\x16\x8d?\x0exw&\x06\x97l?<*\xce\xf5wM\x89\xbf\xdf9@\x83q7\x86\xbf\xfa\xea\xff\xf6#\x15h\xbf\x87R\xbdkP\x81\x91?\xb5\xd7Tt\xb9\xa1\x89?\x9dl=i\'\x87Z\xbf(\xf5\xbb\x10wV\x
91\xbf\xad\'=l\x82+\x84?w;\xa6\xdc\xd3\xdf\x8f?j\xe7\xf9G\x9a\x86\x97\xbf\xb8F\xb4\xd6\x88\x11\x88\xbf\x9fz\x93\xde;/\x99?\x00\x1f\xf6\xa1\xce{\x87?\xfc\x0f\x0229@\x94\xbf\xe9\xf7_[\x8d\xacl\xbf|)>+\x7f\xd8\x99?\xb2\xc0e\x19"\x02g?u)\xb2y\xf4S\xa2\xbf\x15$H|\xfb\x98a\xbf,\xb0\xa4\xf9V\x03\x93?L\x95\xe8\xcd7S\x92\xbf\xa4\xb4\xc4i\xff\x93\x85?\xca\x8a\x0bG\xc3\xb2\x99?\xc2\x17\xf9#\xe5\xaa\xa7\xbf\\\r\xdd2oV\x7f\xbf\xef\x9dd\xe0A)\xa6?\xa8c9\x12\x12\xa3k?\xdbBX\xc0\xa0#\x82\xbfT[\x1e\x10\t]r?\n.\x0cu\xd8\x0el\xbf\xdd\xbc\xb1dv\x1b\x96?\xae\xa4\xae\xbd\x95\x86\xa0\xbf\xe1\xd9M\xe1d+\xac\xbf\xbc{7\x1e\xf2\xea\xbb?\xed\xbd\xbd\xd9\xb8?\x9e?\xabPU\xceg,\xbb\xbf \x11n\x93\xa7?|\xbf\x02\xa5\x1a\xa4\x9f\x94\xa0? \xfb\xdc\x7f\xd0t{?\xc04\xd3\xcfj\xbe)?\x0c\xb9\x0e\xde[\xf6\xb0\xbfM\xf0\x7f\xa2\xf3\x0c\x84?\x7f\x9d\x82\\\x00I\xba?`vA\x88\x19\xea{\xbf\x04\xa4\x81\x99W\x93\xa3\xbf7MP=\x12.}\xbfz"u\x18\xda\xda\x84\xbf\xc6\xff\xe0v\\\x87z?\x9e\xce:\xdfd\xaa\x9c?\xce-jT\xadN\x9a\xbf\xb3av\xf6\xfcA\xae\xbfnn\xbc\xe1\xdfu\xb1?\xc1\xbfO\xd0\x90\xb0\xee\xa3\xce\xbf\xc5\xab\xe6\x14\x81[\xca?\xa3\x881\xf0\x00\xc6\xd1?6\xfc\xd1\xede\x01\xa7?\xcb\r\x16^\x08Y\xce\xbf\xea\xf8\x99\xc8u\xc8\xd6\xbf/\xe2X\xc3\x07\x88\xb6?\xf0\'\xa3PpB\xcc?\xf8rSe*\xc8\xc2?\n\xca\xbbp\x07\x7f\xc3?N\xf2)\x84\xa7j\xd2\xbf,\x9c\xf0\xa1L\xeb\xd2\xbf\x15u\x88\xca\xf7H\xb7?{\xccU\x8a+|\x97?\xcc/uZ\x898\xc7?\xd9\xa8\xc9\xd4w{\xd1?\xc6\x86\xa8 y\xd7\xbd\xbf\xb6u\xd4\x0f+D\xc0\xbf\xc9M|=4v\xbc\xbf\xdc[f)\xe0\xf7\xcd\xbf\x9f\xb8\x7f\xbd\xbd\xce\xc6?\xc2\x99rl\xa3\xa2\xd2?\xc2\x08Zd7\xee\xb6\xbf\xf9\xec\x12\x08\xa6\xa1\x9e\xbf\xa9\xacJy\x1b\x9f\xb8\xbf\x1b0\x8d2n1\xc8\xbfa\xca\x97\xd2\xccR\xc5?ev\x03\xfciy\xc7?j\xa8\xac\xc5\xc7\x99\xad\xbf\xe0\xd5r5\x90\xcd\xb8?O\x91\x17\x18\xc6\xb4\x94\xbf|Z#\xe5dG\xd1\xbf\x1b\xacz\xc8\xa1\x97\x8f\xbf\xdcpl\xb8\xf7=\xb6?\x88\x03&\x1fZy\xc0?\xbc\x10\xfb\xe0\x9d\x19\xc8?,s\xf2]\x97\x98\xd0\xbf\x03\xd7VL\xaa$\xc5\xbf\xc7\x18\xbc\xecW&\xd1?Y\x00C\xb1\xc8$\xb5\xbf\xe1|^\xf5\x82D\xbb\xbf,\xe1\xfaN!5\xb9?&\x16]\x11a\xde\xbb\xbf\'\xe2ea5\'Q?\x92\x90\x06R\xd6\xe4\xcd?_(x_U\xd6\xa8?q\xf7\xdb\xbc\\\x1c\xb1\xbf%\x02\xa6\xf3\xc09\xa8\xbf\xd3\xda\x93>y\x00\xc7\xbf\x8b\x86kJS\x9e\xb9\xbf\x10R\xf3u\x02\t\xbe?Ek\xcf\x9a\x11f\xc4?L\n\x16)\xb8\xf3\xb4?r\x7f\x9c\x9c\xa2\x86\xa0?V\x01\xe9\xe9`e\xa6\xbf=\xf5\xca\xd2\x00\xbc\xd0\xbf\xb3@(\xa2\xd4L\xa0\xbft\x07v\x9d!!\xd2?\x8c\x96l%\x8cV\xbc\xbf\x98\x82G\xab\xd8\x07\xbf\xbf\xc3Z \xcd\xdc\x85\xd2?\x02>t\xbe\xa7\xf4\xc5\xbf\x84\xed\x84\x9b\x89\x95\xd4\xbf\x8dG\x1cDR\xf4\xdb?QK^\x18\xbe`\xc7?\rjz\x05\x88g\xd5\xbf\xeb\xe6\x1aJ\xcbZh?Z?\x03\xf43\xb3\xac\xbf\x12\xeb$a\xbd\x0c\xa7\xbf\xd2Wu)[\xe3\xcf?k\x1dm\xaa0\x0b\xa9\xbf\xbajY.\x07\xae\xb6\xbf>`M*\x9f\x9a\xb5?\xd5+`\x90\xb6\x0f\xbc\xbf4\x03\x80\x04\xaf\xde\xa1\xbf\xe5\x8b\xdc#\x15v\xb4?\x01\xae\xc1o\x07)\x94\xbf\x9e\xdd\xfa\x05v\x86\xb8?B\xb5\x14\xb4\x10\xd1\x9b?\xca\xb4\xc6\x9dw\xe1\xc4\xbf\x14Bn\xc6\xaeG\x95?Z\xca\xb2^\x0f7~?\xe1w\xcaB\xd3\xbd\xa0\xbfM\xda\x00v`m\xc1?\xa6\xab6\xf1st\xa2?>9\xe9\x1fe\xc0\xb7\xbf\xfc\x00\x0e-4\xa4\xa1\xbfA"\xef\xf5\xf4\xf1\xab\xbf\xf5\x07\x82C\x87\x14\x8a\xbf\x00\xc3\x0f\xdbKZ\xc0?\x18\xc5L\xf4\x1d\xc1\xb9?`\xde\x8c\xc0\xc7g\xb4\xbf\x94\x88S\xe8TS\xc7\xbfY\xc2\xf9\x93\x8dsh\xbf\x0b\xfe\xbbs\x7f\xf9\xc4?O\xdd\x89a!\xde\xac?#X\xa1\xfd} 
\xbc\xbf\x05\xb7\x10\xacID\x83\xbfA\xdd\x11[\x93\xfa\xc6?\x83l\x95V\xc7\xcb\xc1\xbf\x8b\x0b\xbe\xa6\x16\x86\xd3\xbf\xb1\'\x98f\xa2\xa0\xc8?w\xbfg\x182\xcd\xbe?\xb2V\xe3\xacp\x01l\xbfZ\x16\x93:\x01\xb9\xd9?\x93\xd2\xb9/r\x99\xc3\xbf\xbd\x84<\xd5G|\xe3\xbf&\xfa?C\xfd\xcd\xa6?\x85?\xe6\xc8,Y\xd0?\x89 \x0c\x1bNT\xa3?\xb1J-\xa9\xad\xac\xbe\xbfd5\x86\x10\x85\xc4\xa3?\xa4\xab\x0e\x8d\xb0\xb1\xcc?\xe9\xf1\xf7\xa3m\xeef\xbf\xf2P\x94=\xeb\xa8\xb1\xbf\xff\x8e{\xb90\x01\xb4\xbfY\x8c\xf3\xf4\xbc\xce\xc8\xbf\xdf\xbc{W]"\xa0?\xc2\xa4\x14dq\x1e\xc8?\x10x\xad\xe1\xe3\xd2\xb9?:\xc3\xf6}\x82{\xbc?\xe6\xe1\x97V\x81S\xc6\xbfd\x04\x13{\x05\xff\xd3\xbf\x02g\x80\x00\x94\x9f\xb9?,\xd9\xba\xa2\x9a\x97\xd1?\x05^\x0e\xa9\xfc|\xb7?{PTp\xf9\x1b\x93?\x85I\xb88f\x9d\xcd\xbf\xca\x14\xf2\x984\xd9\xd2\xbf1\xe5P\xf6\x06z\xc8?\xf6\xb8l4\xd7\xe7\xcf?>\xa6\xc5\xbaB\x92\xb2\xbf\xac\x8e\x08\xb9p\x93\xb1?m\xa7\xbf\xdb\x98LU?\xefS.p\x1b)\xd8\xbf\x12lZ-\xc7\xcb\x9e\xbf\xdd\xe0\xba\xe2\xf4"\xd7?\xf1r\xadw8x\xb2?\xebxD\xd2\xcc\x93\xcb\xbf/Uv\xbe\xcf4n?\xf9A\x15\x0f\xc8)\xbb?\xd6^\x89\x01\x10\xd0\xbc\xbf\xd5 \x85\xf4\x9d\xe7\x94\xbf\x9d \xf2\xc7\xb2\x19\xb2?;\xa0E`\x08p\x91?\xb7#QQ\xc5.\xb2?a\xd4\x07s4&\xa1\xbf\'o\xf9g6\x1a\xb9\xbf[\xa64\x08|\xfa\xaf?\xc3\xb5\x85\xa7\xc7F\x9c\xbf\xcb\xb2\xa1\xbe\xa1\xfb\xc3\xbf8#\xc6J\x15w\xba?\xd5xrq\x82\xce\xcb?\x92T\x16\x8cL(\xb5\xbfC\xbaj\xc0\xc3\x87\xaf\xbf\x078\x12L\xc1U\x98?\'\xc9\xea\xa0N\xa9\xcb\xbf\x82\x04\xf98B\xcc\x86?>\x05B\xe1!\xf9\xdb?\xa6\xf4\x06\rb\x96\xa5?\x80\xdd\xdcY\x18^\xdb\xbfx\xeb\xbd\x90\x85{\xc2\xbf\xc7\xb3/}\x18]\xc3?\xde\x85\xf1=(\xc8\xc2?\xbe\x9b\x96\x82P^\xc0?\xd6\x05\xe9\xdah{\xa9\xbf\xecj\xc4\xbf\x06\x94\xd3\xbf\x1b"$do\xed\xab?\x85\xff\xf1\xaf\x1ce\xd8?Iv]"\x8c$\xbc\xbf\x17-\x8a\xd5#\xee\xd3\xbf\x9b\x14\x0f\xeav\xa9\xad?\xee\xd6\x1aZ\xe9\xd9\xbf?\xf5^\xb8\x83=J\x9e?(\xe5\xdf\xfb\xa0\xe2y?\xdf\x9a\xad\xbc\xe3\xc6\xb3\xbf\xa5\xf70\xbe\xe0\xe1\x89\xbf7\xdb\x86\x16\x1aL\xa9?\xb2\xe5\x9f\xd2\xf7W\xbf?{\xabp`\x8d\xa4$?\xfb\xa9;\x04\xd3\xc4\xd6\xbf\xf5\x9fDn\xb0F\x9f\xbfZ\xf2\xaa.T\xe3\xd8?A\xd9\x7f\x9fF\xed\xa8?qZ\x1f\x8a-\xd3\xb1\xbfCQ.\xec//\x87?\x80\x1eu{\xca\x1d\xd5\xbf\x1e6\xbc\xf9\x80\x17\xc2\xbfe\xee\xc6\xd8\xb8l\xdb?\x0b\x02\x965\x1b\xc1\xcb?2\xadn\x16\x7fU\xd0\xbf\xa6W\xa6]\x10\x99\xc4\xbf\x19\xf8\xe8\xc3\t\xd0\xb8?\xc0\x01\xfb\x86\x81W\xa6?\xf9\x96\x88znM\xa3\xbf\r|.b \xbd\xa3?WP\x12\n\xb8\xe8\x8e?D\xf3\x14\xb7\xc4\xf4\xad\xbf&\x8a\xf0ZC^\xa9\xbf\xd4\xfa%x\xd3-\xa9?8\xcf(\x8d\xf42\xbb?-9Pg\xd2\xfa\x8a?\xfdr\xbe\x8e\xea\xd0\xbb\xbf\x80\x1f\xaa)2\xab\x96\xbf\xd0A\xb8\x1b\xe5d\xb5?\x9dC\x8c\x96\xde\x83\xb2\xbf\x11\x81\x1eLh\x94\x95\xbf\x92?\x03o\x00h\xce?\xe7:\n\xfc\xff\x03\x99\xbf\xb4\xf1\x98\xb6\xb7\x08\xd5\xbf\xf4>u}\xb8\x1a\x85?\xde\x8c@\xa3\x89\xe2\xc3?\xc3IK(\x0e\xb9\x88?\xcc\xbf\t\xe3\x19\xfe\xc2\xbf\xc8a\xdbT\xfb\xfc{?<\xc5\x08\xea\xec\x10\xd1?C\xacv7#\xc4\xb8\xbf\xbdF\x0en\x89I\xbc\xbf\xa7\xc5\xf1{/J\xc3?Y}\xc2\xacc\xe8\xbb\xbf\x82\xb3}g\xbf[\xbf\xbf*Zj\xfdt\x1f\xcb?\x1a7\xd5 
\xe5l~?\x057\xdf#lK\xc0\xbf\xf1\xe1i\x08\xf2C\xae?F\xb5\xf3\x8b\xcfb\x82\xbfa\xc7M\xbc\x8bN\xab?\x07\xc2H\x1f\xcc"\xbf?!\xa0\xea\xd0:\xfb\xcf\xbf\xef\x88\x05\x91\xcco\xcf\xbf\xcc\xea\xeaX\xec/\xd4?\x0c.\x05X\x97j\xd4?i]`\xccD{\xd0\xbf\xc8\x17p@\xea\xc9\xd1\xbf\x06\x85\x7f\xe8\xe1O\xc3?\x83\xa9\n\xb3\xb9\xf3\xaf?\x16\x1f\xae\x15\x81\xba\xa0\xbf\x97B\xfd\xf3/\xf3\xcc?\xa9~\xf5\xf3\xcd\xf0\x9b?\x19"\xfc\x8e\xc8F\xd7\xbf\x9d\xca\xb5\x86<\xe4\xaf\xbf\xf4\x10\xceQFa\xcf?\xdb6\xa3\x9a\xcf\x01\xb3?\xce\x99\xf7S\xf8\x15\x81\xbf\x04(\xe4d\x17L\xa2\xbf\xfaT*\x0648\xab\xbf\x8f\xbc\xf31l\x04\xa9\xbf\x0clrY\x05\xf6\x90\xbf\x03=\x82\xa4\xe0\x8f\xc1?\xd6#\xbdV\xb7\xc4\xbb?\xca\x19\x1ca\x98\xf8\xc7\xbf\xd1\xcb+\'B\x9e\xc7\xbfz\xb4F\xbbx\x80\xc1?\x84\x10\x84*\xbfQ\xbd?\x9e\x16|\xeb\x0c\x11\xa6\xbf\x13\xcf\xc46\x7f7\xb4?\xa4\x0b)\x15$]\x83?b\xb68L%&\xd0\xbf\x03\xea\x0f\xce\x89d\xa5\xbf\xe8\xc8\x0b\r|\xc3\xd4?\xf9\xe8\xbd^\xae\xec\xc0?i\xc7\x1b\x8a\xf7\xaa\xd7\xbf\xe7\xdf\xcc}\xc0t\xc4\xbf\xb3T\xe2\xc8;{\xd8?\x80N\xa2h\x82\xb9\xbe?yeQ\xca\x81\xe5\xd1\xbf\x82YK\x1e$M\xbd\xbf\xab\xac\x05D-\x16\xb7?e\x0b\x81_\nB\xb9?\xe9\xb5\x9a\x93Pd\xc0?e/*%Q\x8c\xa1\xbf*\xc1\x0c\xd4-%\xcb\xbf\x9a<\xfb\xd4w_\x86\xbf\x9c\xb2\x8f\xa7k\xae\xac?\xe3Z%q\x1az\x91\xbf#\x10\xd0\xca\x1cK\xcc?z\xa8\xa39\xab\x8b\xac?t\xa8c\x0e\xeeb\xd4\xbf\xfd\x05\x9a\xd2A\xa3\xa0\xbf\xe9+I\x92\xc0\x8a\xc0?\x90\xf5\xe4(\x9e?\xc1\xbfY\x0e\xed\x12\xc4\xbc\xc1?\x92\'9\xbd\x04\xaa\xd6?\xc5*\xc3\xda1\x94\xcc\xbf\xce\xbe\xb1n\xf8\xd7\xd9\xbf\xdd\xe9\xa6\xde\x9bMx?Pcx\xd8\x85\xef\xd3?\x1dT\x13~\x82f\xcd?-]R\xee})\xc5\xbfu\xa4\xaa@\xb6\x9b\xd2\xbf\x05\x06\xbe\xcf\x07\xf9\xb0?\xfbO_\x9fS\xd0\xcb?3\xe6\x15~\x86*\xa2\xbfNC:"Z\x10\xc0\xbf\xb7\xbbyS\xf8\x0e\x8a\xbf\\\xe9\xb3\x1fs\xa6\xb8?\xfe\xd9c\xc5\xaa\xa5\xb0?6\xe9h\xc5\x0e\x87\xb1\xbfO&&\x8b\x07\t\xb7\xbf-\x97X\xe2\xf4\tv?\x9ay0\xcc\xaeu\xba?\x893\x18\xd6\xb3*\xb5?\n\xe6\x0b\x83s\x81\xbb\xbf"\xb4E\x14\xfb"\xc6\xbf2\x82\xa5\xacC\xf8\xb6?\xb6Cn\xcb?\xdd\xcc?\x92\xff\x17\xd5\xc0H\xa7\xbff7H\xb3\x05\xba\xcc\xbfl*\xd4l\x83\xf3f?\xaf\x1fh8\xf5\xba\xcc?\xff\xed\x84\xdb\xdb\x92q\xbfi\xab\xae=\x89\xb3\xc9\xbf1\xd9\x17\x90\x9emz?\xa2D?b\xd5\x91\xb4?\xde0\xbbqGf\xa1?\x82\xe3\x88D:\xe7w?~\x07\xbd^6\xdf\xba\xbf\x8e\xab\xcb\xf0\x9a\xa4\x93\xbft\xbc\xd5h\x1fc\xc9?\xe8o(\xcd\xf8\xd8\x9c?\xb7\xb3\x0b]\xd6-\xd5\xbfDOm&\x05$\xa1\xbf6;\x902\xeb6\xda?=+W\x9c\xbbu\xa4?\xba\x8d\xe6x\xceE\xd4\xbf\x8d#\\\xe1=\xa4\xad\xbf\xa6\x04\xe4\x16\xa2\x0b\xc4?7\x1f\x1a>c#\xb0?\xf6\xb8rx-\x04\xb9\xbf,\x04\xdd"&\x8a\xa9\xbf\xbb\xb4\xa7\x842.\xba?\xad+\xee\xa8\x8a\x03\xae?Sm\xb9y\x81\x18\xb5\xbf\xda\x81\x18hZ\x8b\xb6\xbf\x92F \x1f\xcd\xe0\xaa?\xfc\xd8\xa0/@6\xc2?\xf0\x8e\xb7=\xd0\xa3\xa0\xbf`\xec\xf2$\x98\xe2\xcb\xbf\xbb\xb5V\xd1\xb4\xa3\x87?\xbe\xd1h\xed\xa2\x0f\xd3?\xe9u=\xf8\x00q_?\xa4\x85\xa6p\xa0Z\xd6\xbf\xe0\x9f\xdf\xedq4_?"\x10\x84?\xeb\xd5\xd6?\xfaw\xcc\xef\x0b/\x83?\x91
W\x19\xcf{\xc1?~5`8\xb17\xd0\xbfA\xf73\xe0\x80\xdc\xc5\xbfn\xe8\xf5\x138\xec\xc1\xbf\xfbI\xb2\xcao\xbe\xc6\xbfu=\x93\x8f\x91\x81\xc0?\xfd\xc4\n[\x11v\xc7?\xc4\x12\x9b\'C\xa0\xc5?(qq\x9d\x1f\xc6\xcf?\xe9\x08\xd4\xc1\x1d\xd0u\xbfC\x86\xae\xdcp;\xc5\xbf\xd4\xdc\xe6\xde\x99\x88\xc4\xbf+\xc7\xb7\xe8\x9ah\xd1\xbfo\x89\x1b\xf0\x83\xfe\xa1\xbfp\xb3\x1e\xb1\xfaK\xce?\xf5Fd\x83\xce\xe1\xb0?\xe9%\xa2mPT\xc5?\xc1\x99\xf1R")\xbf?\x8f\xd6?-G\xc8\xcd\xbf\x80(c\xe7\xb1m\xae\xbfp\xf2\xae\xd6\xabb\xa5\xbf8\xe3\xf0G\xab:\xc4\xbf3\xc6H\x17:\xfa\xc6?\xdd\xba?=\xf9\x9d\xb1?\xb0\xbfYp\x85\xd2r?\xf6\xbc\xf1\x08\x1b$\xca?\xee\xaaQ\xe1d\x16\xc4\xbf\x06A\xc4E\xd1\xde\xbc\xbf\x00VVI>\xe4>\xbf~\xb2E\xca\xea\xf1\xc1\xbfxs0\xd0\xfa\xb3\xbb?\xad\xec\xddfH\x19\xb6?\x80,9X\xaf\xe8\x9e\xbf\xc0D\xb1\x1e\xe2\xef\x9d?\x08\xb4\xd4\xaf\xca)\x8f?\xdc\xb7\xca\x15\xdf\xed\xb4\xbf\xcc\x15U\xe9BI\x8e\xbf\xf8\xb2>\x9el\xa8\x9a?\xc0\xc9\x07\xebM^W?t\xb7\xf00\xd1\xec\xb2?V\xf8\x85\x02\x00M\xa2\xbf\xa8\xccD\xb2\xcc5\xae\xbf\x9aZ/\xc1\xeb\xa1\xa7?\x97@\xf4FG\xf2\xab\xbf~v\xe6\xc2\xdcx\x99\xbf\xc8=\xb1n\xc3g\xb3?\x10\xa1d\x06l@\x8f?\xb2\x98:Zd\xbf\x97?\x00i\xa9\xb2
\xdb\x8f\x9b\xb0\xbf\xd4uV\xac\x8f=\x92\xbf\x00\x8aCS\xa0\x91{\xbf\xd4\x0f\x05\xf8\x07\x8a\xa6?\x98\x8d3[\xb34~?\x80\t\x00\x8a\x9e\xeex\xbf\xfe3v\xcay\xb1\xb3\xbf`|*\xc9v\x16\xa0?\xfe\x1a\xef\x92\x12r\xc7?\x03\x8f>nL,\x9d?Ndz2c\xe5\xa9\xbf\xe8\xb7\xea\x8b\xeb\xde\xb1\xbfc\xad\xe6\r\xd1\x03\xcd\xbf\xf9B\xc9\xc4\xaf\x85\xa4? \x16\xb1z\xbd\xf5\xca?\x86\x90\xb9\xdd\xcd\xf4\x99\xbf\x8e\xb1\xd5\xfc\x0eV\xc8\xbf*\xd1\x87\n\xa7\x08\x95?rGC\x16\\V\xbd?L^\x1e\x90f\xde\x95\xbf\xe4\xfb8kr\xd2\xca?k\xcc\xde!~\x00\xb8\xbf:\xf6\xc7\xc9y\x8d\xd2\xbfJwV\x8dut\xb6?O\xf6\x82\x15\x8fg\xbc?h\xc6\x83\x95p\x18\x95? O\x9f\x83\xbe!\x80?\xb8O\x81\xca\xbe\x0c\x81?\x80\xac\x8aD\xa9\xbco\xbf\xc6\rFU\x7f\xbf\x91?\x00\xf3e\xc1\xa9h\x7f\xbf(m\xcff\xda\xa6\x87?\xf8\xb84\xfe\xab\xd2\x95?`3\xffw\xbe\x05_?\x8c;\xce\xeeo"\xaa\xbfv\xeb\xa7\xbf\xe5\xcb\xa1\xbf\xfdN\x1a\xa5\xd5\x0e\x90\xbf\xc0\x83\xad2\xb2\xedu?Z2j\xa1`\xa3\xa9?\x98~\xe5\xe6\x1c\xc4\x9b?\x80\xac\xa72\x17Vh\xbf\x10|*\x92r\x8do\xbf\xc0\x1e\x06\x18kxJ?Q\x00\x8e\x9c\xcdT\xa4\xbf\x1a\x07_\xe0\xc6\t\x97?\x9e\xe1n\x83\\\xa4\xb5\xbfT\xe9\x82\xc7\x80\x0b\x85?\xa0\xf1)\xeaL\x15\xab\xbf\xf6\xd1\xf3!\x8d\xfe\xa3\xbf\x8e3i|\x02\xe1\x9b\xbf\xe8e\xfb\xe8v\x12\xa6\xbf\x8b\xfe\xaf\x17\xacB\xaf?X\x7f\xda\xb4\x8b\x1f\x8d\xbf\xb0\xa0\xb0\xf8\xe6(\xb1?\x9e\x15E\x04\xda\xf9\x91?M\xd3\xfe2\x08\x7f\x9e?\x16\xe0s\xa5\x1a\xbb\xae?\x86>\xb8\xe2KM\xa0?H\xe2\x196\xea&\xb7?\xc2\x90\xb4b\xa8yv?-\xc1\x88ER-\x8f\xbf\x9d\xc6\xa7@\xb7\xfe\xa3\xbf\x0e\xdfG\x82D\x1d\xbe\xbf\xd4e\x175\x1d\x0e\xc2\xbf\xac\x05\xaa\xa1\x00\x86\x83\xbf\xab_\xb9\xd1\xa8\xc9\xaa?16EK\x1dl\xa3?K\x9c\xe8\x96\xaa\xaa\xb2?\xd2\xfe\x80\xdd\xaea\x9c\xbf\xc0\xae\xa1"\x07y\x89\xbf6D\x97\x96\x8d\x89\xb1?\xca\xe0\xae\xab\x8e\x82\x99?\xec\xaf\x07\xa8\xc7M\xb0\xbf\xbd\x96"\xb7\xe9\x96\x99\xbfb\xa5\xf9\x1c\x84\xe6\x98\xbf\t\xbf\x14\t\xf4\xb7\x94?\x80Q\xd6\xce\xc1Z\xa8\xbf\x00\xde\x9a\x94\xe1\x9fv\xbf\x12\x98\xad\xae\x18S\xbe?v\\\x08\x15\n\x00\xac?\x8b\x04\x8d\x94\xca<\xaf\xbf\x10C\xda:\xeeQ\x96\xbf\x18jM\xb4\xd8\xbav?8\xa9@7\xd6\xc6\x8a\xbfP\xc4~\xe7\xf0\x18\x87\xbf\x96\x96\xdf\xadK\x85\xa2\xbf\x08\x83(#]\xd6p?\xf1\x8a\xf7\n\x02\xc3\x93?*\x18\xfd\x9e\xbf\xac\x85?\xf8\xcd4\xbf\x81\xc2q\xbf\xed\xee\xff\xd8!\xf3\x98\xbfD\xbe\r\xfc=\xdf\xa7?\xe4\xeb\xdf\xce\x8a\xe0\xa2?n\xd5r\t\x17\xd3\x9a\xbft\x125\xa6\xbc\x83j\xbf\xf7\xb1y\x0c\x16h\xa1\xbf\xb8\x134\tj\xe6\x91\xbf\xc4\t\x07\x8c\x0c\x99\x8a? 
\xce,f\xe03c?\x90\xf0\xa8\x04\x9dAd\xbfD\xabC`\xba_\x8c?\x00\xb9\xba\x80z\x1fa\xbf\xa8\xb4\r\x90\xc3\xc0\x91?\xc8\x95Ma\x81\x91\x88\xbf\xe0#%]r\x84w?\xb0i\x9b\xb5\x96\xcf\x98?\x80\x1b\x16\xc1\x9do}\xbfW\x95(\xd5\x95\x18\x88\xbf}\xafE\x1f\xfa\xe7\xae?\x00r\xc0\x8ck\xeb\x92\xbf\x95\xb3W\xda}\x1b\xc1\xbf\xb6\xad>L\xf6\x14\xb5\xbf\xc8\xe4\x96n.[v\xbf@+\xc9,nQ\xb4?\xf2V%}{t\xc2?-\x1d\xf3\xd7i\xd9\xc1?$\xb2\xd0\xe8\xee\xf0\x8f\xbf98\x15\xcf\xa9\xd4\xc4\xbf*\xed\x97\xd4\xf0\x14\xbf\xbf\x05\xec\x9bx\xd1\x95\xaf\xbf\x81n\x08\xea\x8fl\xa2?|\xa2(\x9dT\xcf\xcb?r\xecc\\eI\xbc?\xa4\xca\xa2b\xb8x\xc6\xbf.\xf8\xa5\x0er\x98\x80\xbf\x8c \xd7\xd6\x7f\xb5\x96?\xa9\x94\xf0\x87(\x19\xc6\xbf\x8d\xb4\x81\xe7?\x90\xc2?\'I\x8b\xd3\xd9Q\xc5?\n\xc4\x81\x1bw\x0e\xd1\xbf\x1b\xd6\x9dM"h\xb0\xbfd\xa0\xad\x0fg\x04\xc1?\xe6b\xb6\xc0\xb6\x89\x9b\xbfj\xb8\x0c\xf0\x81\x95\xbf?nu\xca\x0b\x81\xa9\x92\xbfJ\xfdR&n\xc1\xc9\xbf\xbd%)"L\xa0\xb5?xL\xbc\xad(\x99\x91?\x7f\xd1\xb8\x81\xcc\xd1\xab\xbf\x08:}{\xfc\xbc\xc9?p\xd7d\x17\x87j\x8c?\x9c\xfd%z|\x95\xc1\xbf\xdc\xf4!\xd6\xcc\x99\xa7?\xee\x1b\x9a\xafW\x90\x9b?@\xe1\xda#\\\n\xc7\xbf}b\xdf\xbce\x05\xbc\xbf\xda\xc32\x93\xd6\xf2\xc0?\x96\xe8\x9b=1`\xa7?;\xa0&o!*\xc3?\x18#\x9d\x1b\xf8\x91\x9f?\x02$\x977\xdeW\xca\xbf\xd4\xdb\xb3A\xe6\xe1\x94?lx\x84\xa9\xbaQ\xc6?.\xfd\xec\xb1A\x08\xb3?)@\xa1\xff\x8c\xf1\xbd\xbf\xd2\xbb\x1a\xa6\x86\xa3\xa7\xbf\xae4\xfe\x97\xb0C\xc4\xbf\xa0\xb0\xf3\xb7\xb0\xb5u?\x1e\x8f\x96\x04\x9c\x0c\xcb?\x034k\x03m\xa0\x99\xbf\xc8\x1c\xbd\x9e\xcd\xc7\xaf\xbf\x10e\xdb\xc6#\xdf\x84?\x00\xcc^\xfdK\x06\x7f?hH\x9eyw\xa3|\xbf\xa0\xfd\x99\xd8\xfa\xc0\x84\xbfF95\x87N\xe1\x94\xbf`\x95\x98\x14\xb8\x95\x87?8n\xeds\xeb\xaa\x82\xbf\xc8\xd1\x19\xb0@\xe7\x8a\xbf\xa0\x1a\x9cT1\x03L?^\xd9q\xfe\x84f\x9a?-\xce[\x10\xcc\x88\xa7?\xa0c\xb8\xb75\x82\xb2?@K\xbcu\xef\x8bk?D\x1c,\xe2\x9a$\x96\xbf\r\xfe\xd4\xa4\xf5a\xad\xbfH\xa1\xeehU\x14\x80?\x80i\xfa\xcf.\xcef?2\x0f\xb4\x13\xf2\x8c\xb2\xbfW&J{\xef\xe2\xa7?hNK\xee\x05\xa9\xb7\xbf\xbc\x1a<\x94\xc1\xbf\xc6?\xf7Y\xff-\x0bg\xa4\xbf\xeb\xc6\xa3\xdc\xda\x19\xb3?\xeeD\nY6\xea\xb0?\xee\x87\xedA\xdf\xc9\xa0\xbf\xb7\xf8\xd0q4u\xa0?\x0c0@\xf01\xaf\xc0\xbf\x96.\x1f\x8e\xc2}\xb2?n\x89\x1d\xc5\x82\xf7\xb3\xbf<\xf0\x0e\x85P8\x96?J5\xce\xef\x14t\x8e\xbf\x13\xf0\xba\xb9\xc8%\xa2\xbfN^\xc0n\x1d(\xca\xbfQe\x08{Y(\xb3\xbf\x93\xc3\xf8\xe0b\x9b\xb0?\xd9\xd6\xd0\x0bXb\xb1?l\xb3?\xf7\x0cF\xb8?\x04\x05\xf0\xe1)\x17\xa9?\x046\xf9%\xf7\xcb\xb5?|\xf9\xca\xc1\xfd\x88\x86\xbf\x90D\x9dsk\xc6\\?0(\x18\xbe\xaa7\xa8\xbf\xc9\xc1{k\x0c\\\xac\xbf\xc7\xe3\xbe\x82\xf6@\x83\xbf\xd0\x8b\xbc\x03 \x19\x8f?\xc4\xfd_>!<\x9d\xbf\x149\xd2>\xa4\x11\xa9\xbf\x88\xff\xcd\x995\x8b\x98?\xb0\xa7\xfa\x1b\xde~\x8b\xbf\x16\x8b\xa2\x986\xcb\xa5?\xa9D\xaf\x8d\xde\xc3\xb1\xbf\xc8\x88\xe4?P2\x8c\xbf\xfe\xa3N\x1d\x0f\n\x92?\xd3Q\x0f\xa9\xf3{\xa7\xbfv:\xe3\x94\xd6\xbd\xa0?\xfcf\xad\xc4\xe8\x1e\x80\xbf\x80\xb4E\xb9f\x0c\x89\xbf\xbe\xe0\x01"\xd2\x02\xab?f\xef\x95\x846$\xa4\xbfPC\xce\xe7\xea*\x9c\xbf\xe0\x08\xc5\xf4\x9b\xd4\x96?y\xd5\xe6*t\xba\x9d\xbf-A\xb1\xe0E\xce\xae\xbf 
\xd03\xb4X*v\xbf\xc9\xc72W;\x88\x99?\xa4\xca\xa5\xee\x07\x91\xa2?\x18|EA\xc3~\xa0\xbf\xff\xf2\xfa\xe8;\\\x9e\xbf\xb0p\xf7R\xacg\x8d?P\xc4\xdf\xf4i\xb4~?\xf8\x87\x9ej\xaaQ\xab?\\\x08\xae\xc5h\xb4\x89\xbfwmM\xa1\xb5\xd0\xa6\xbf\xc03\xba\xd8BF]\xbf^<\xae\xb2\xa5\x93\x81\xbf\x06\xd4\xba\xf6\xb7!\xa6?\x824\xee\x15\x08x\x82?\xc0(\x95Ct\xbdl?JC\r@\xe7U\xaf?P\x82,\x8d(\xd9y\xbf\xe6\n\xc4\xbd\x87g~?\x00\xdfQA\xd2\x9bw?}\xa2\xe1M\xe0\xb1\xb3?`%\xb4\xe0\xf1\xc7\x80\xbfU\x12\xa5\x02\xf9f\xa0\xbfW\x90\xbb\x7fbm\xa0??B\xd8\xa3S\x17\xad?\x1e\xf4\xf8\x91&\x93f?Z\x91\x9dA\x00\x02\x97?\xe3\x0f\x07L\xd7W\xab\xbfUk{\xcc\xe8[\xb6\xbf\xf1\x14\x88\x1f\xeb$\xa8?W 9M\xc0\x99\x8e\xbf!\xd4b\xc6C\x18\x93\xbfvM!\xcfO\x0e\xb9?Z\x7f)\xe0\xca-\x90\xbf\x89\x84\xa8\x8c\x8b\xec\xa0\xbfOnY0\xe9\xf7\x82?\x8a\x06\xf2\xcf\x84\x92\xb3\xbf\x8d\x06\xb9\xe6\xc1\x89\xb6?,\xab\xad7`Q\xb9?!\xbe\xe2\xe07\xc4\xc3\xbf!\xc3\x14\xf6D\xf5\xa7\xbf\xc2t\xf2\xddns\xbb?\xabc\xe1V\xc1\xf4\xb8\xbf\x16>\xb0\x0c\xc5-\xbf\xbf\x94\x1a\x95I/f\xce?\xb9\x99\x8d\x83\xcb)\xc6?\xcbt\xc8cq\x0c\xc4\xbf\xc3f\xcbb`\x88\x85?\x13\r_N\xd0(\xbc\xbf\xa9\xc8yJ\xdf\x96\xda\xbf\xb7\xd7s\xd2\xd9X\xc9?\'\x0b5\x95%\xc4\xe1?\x80U\xa2s\xe3\x8di\xbf@H\x1d\x02\xd4 \xce\xbf\x0fm}>dh\xb3\xbf\xf6\x07\xf5j\x04~\x9e\xbf\x8c\xbeL+\xe0\xa4\xad\xbf\xb6\xbc\xce\xf4\r\x8c\xbf?\xe2\xef]\x10\tc\xb4?\xd2\x1f\x16\xa0\x84\xdc\xb5\xbf@\x16\xdf%\xc95x\xbf \x18\xf7\x96\xba\xd2\x81?\x00\xdd\xd49\x0bs\xa4\xbf\x80\xc9K\xf0R\xcca\xbf(Xh\xfc\xc1W\x97?X\xd1\xd9?z\xec\x98\xbf\xefs\x880\xe2\x8c\xb0\xbf\x19\xb42s\xd2\xd4\xb9?\x14\xfe\x1eL\xb4<\xa4?\xe7\x05\xe2\x11\xe2\xdb\xa3\xbf+b\xa4\xbf\x02d\xeb^p\x16\xa4?\xd3\x8a\x06\xc6\r\xee\x9d\xbf\x12\xe5 \xd4\xcf\xd0\xc4?&\x04\xa12Y-\x83\xbf\xd2\x0c\x15\xdf\xc0A\xae\xbf\xdd\xbd\xf8\xf9\xf8\xf6\xa3?n\xb2\x8f\x14\x1e\x18\xb7\xbf \x9d\x13\x06\xb51a?s\x0e\xd4EZ;\xc1?\xe0#CI\xaf\xc8X\xbf\x00k\xf9\\\x1a\x96\xb2\xbf$N\x83m\xc2\x0bs\xbf;a\xd9\xad\x03\xde\xad\xbf\x84\x81#$ 9\x9b\xbf\x13:\xac!\xde&\xb7\xbf\x00>HN\x84\x88\x96?j\xbb\x82\x12\xb4\x1e\xa6?\xb0{\x9c\xc9~\xb1s\xbf\xb8\xe23\xddya\xa2?\x90\x8d\xbb6\xbeP\x8f\xbf\x81\x05\xf9\x12\xe1#\xb9\xbf\x1ePL+J\x04\x94?\xb2G<\x16\r\xe5\xab?\x00g8\r~\xbd\x82?\x82\x96B?\xb2\xb5\xa6?pKg\xdb\xe1\xcd\xba?Z);O=\xc1\x9f?R,\xc8|7\xa0\xd1\xbf-\x97\x86$_\xf6\xb1\xbf<\xea\xb3\xbd\xe2\xa6\xc4\xbf\x10\x01\x12f\xa9\xb3\x9b\xbf\xe8p\xa3\xcfN\xf7\x92?Aws\x0eUh\xa4?\xf66hUj\xa7\xb7?4\x17Cv\x15}\x94?\xe8\xd3\xdc\x95\x87\t\xd5?e\x1e\xc4\x11e)\x9e?\xbc!d\xfcc\x0b\xc6\xbf\xe8Q\x13\x0cg\xde\x82\xbf\xfcN\xfc\x05\xb2O\xa6\xbf\x9d\xb1\x82\xff\x91\xf6\xb2\xbf\xfcQ`\xfbK\x97\x93?$\xbf\x9b\xe6xN\xa5\xbf\xac\x00=!\x1a2\x9a\xbf\xc2\xc0\xa0\x8a\xb2\xe8\xa5?@\x9b(\xd6\xff\xa2\x82\xbfG\xef\xa0u~E\x9c?\x8a8E\\\x9e\x9a\xab?\x84\xaeT\xc9)`\xaa?\xf4\xdc\xcd\xd6\xbee\xb8?<\xab\t++=\xa4\xbf\x1c\xb8\xc0\x1bN\xdc\xbe\xbf}\x9a<\x94u.\xad\xbf@\x80\x08\x92\x87s\x87?\xaa)\xd1\xfd\xee\x06\x80?\x8a\x9a\x04\xa1\xc4\xde\xba?\x06\xab5\rx\x91\x8c\xbf\xf7~\xdf\xfa\x07\x91\xa8\xbf\xe8m\xaa\xbd\x90\xefq?\x1f\xad9m(\x82\x99?\xac\xd8\x95\xbdqU\x8a?\xf5cI\x07\xebF\xa0?H\xf9\x10\r\x8d\xe6\xac?\x18%\xeev\x02M\x9e\xbf\xe0\xdd\xcc8\xc7 
\x88?^\xc5.K\xb3$\xb3\xbf\x80\x0f\xd2\xf6\xcbv\xb0?NM\\\xab~\xc4\xc8?\xa6^\xb6\x84\xf9\xa5\xba\xbf\x17\'\xf2;.\xce\xbf\xbf\xa5<\xfe\x9fiD\xa5\xbfX3B\xef.y\xbf\xbf6\x82\x80\xa7\xbd\x82\xb5?jx\x9a\\\x81%\xd0?\x9e\xe5\x88\xbaa\x99\xa1?\xbd\xd7\xcb-\xda[\xbf\xbf\x11\xf2\xec\x98r|\xae\xbf\x13\xc9\xdf\xf2\x19C\xbd\xbf\xb4;\xd3\xa4\x856\x93?;\\"\xa0\xa94\xd1?I\x9aE[\xf2\x15\xa9?\x90!\xde\xac\xcdR\xc9\xbf\xa7I\x92\xac\xa6O\xb7\xbf\x00\xf6?y\x86\xb1\x8f\xbf\x82^\xfelLZ\xac?[\xe5`\xc0\x0f\xc6\xc5?|d\xbc\x8d\xded\x95\xbf\x91\xc4\xeb\xd5x\xd4\xcd\xbf\x16n\x90\n\xc5\x9a\xa9?\x9f\xea\x1c7\xd9\xf1\xc9?f\xa2F\xd0t\x8a\xb4\xbf\x86L\x06\n\xc1l\xb8\xbf\xc8u\x94\xf5\xd9\xbe\xb3?\xe1(\x18\xde"h\x9a\xbfh\xf7Si|\x91\x87\xbf\xba\x91\xce\xe0\xd4\x88\xac?L\xe5"\xcc\x18P\xa1\xbf\x88\xc76\x98/u\x90\xbfTP\x1c\r\xff\xbb\x92?\xe0Wi;0\xe5\x87\xbf\x86\xac{L\xc6/\xb1?x\x82\xe3\xb3c%\xb2\xbf\xcd\xe2\xaf\x12\x0c\xe4\xb8\xbf\x04\x9f\xe6\x04P\x8f\xc3?\xa0\xa0W\x1eX]{\xbf\xb2=\x07\xbb\xc3X\xbf\xbf\xc6\xbe\xa7\xbb3\xcd\xad?xbJf\x00d\xb3?d\xc2\xf0\x16r\xab\xa1\xbf\xb4\xc3Lv\x1e\x1c\xa0\xbf\x00\xa5\x9c\x96(+~?}Ox\xfc\x96>\xb6\xbf\x00\x9d\x92\xebW\xc3v?\x1c\xeb\xb0;\x8f\xe0\xc1?\xc05b\xbb\xed{\x7f?-\x10\xbd\x15\xf0\xbe\xb8\xbf8\x90\x1a\x19EK\xad\xbfNw\x1e,\xb3}\xa9?\x96\xe8\xdc\x00\xcaM\xa4?\x000\x96@\x1d\xc7M?\x90\x1a\xa1bHOy?\x00\xa6W~\xc7\x02\x15?\xfcO\x83MN\x86\x89?Pb\xa2}\x146t\xbf\xa4\x06\xc2(\x8e\x9f\x91?\x00\xca+\xf3w\xf6f\xbf\xc0\xa1\x9c\x1a\x86\xaf|?\x9d\xc5}\xf0P\xbc\xa4\xbf\xf5\xe4{\x9b\xec\x88\xa0\xbf&\xb0L\xdf5?\xaa\xbfz\x13~!\xd0\x0b\x91\xbf`\xa7#\xf1\xf5\xeew?^\xaa\xad\xcc\xc6\xbc\xaa? \xa9Z\xa9\xb3\xd4i?0\xe4\x95`\xec\xf3\x88\xbf\xe8%\xbe|\xad\xc6\xb5?H\xcc\xdfC\xb0\x1c\xa1\xbf\xc7\xacQ\xe3\x18\x84\xa2?\xef\xa7`\xff\x12\xd5\xaa\xbf\x18\x0c\xc4K;\xd5\x9f?]\x84\xd2=u\xa0\xaf?\xd0\x15qCU\x9b\x9c\xbf`\xe3\xeerw\xc9\x98?\xce\xe0:p\xbd\x0f\x99\xbf\xde r\x9b\x00\x11\xb3\xbfj]\x9eP\xf4\r\x9c\xbf\xf2\xe6\xda%T\x11\x93?v@\xbd\xeb\xcb\xe1\x85\xbf\xa2V\x19\xbcH.\xa0?\xf5\xca\x02\xc8\xfb\xe2\xad?dA\xf1e\x88&\xb8\xbf\xa0Sr\xba~\x07\xb0?\xea\xc8^l=\xd1\xba?\xa4\xbe\x83\xc3\x8b`\xa8\xbf\x92\xab\xed+l\xc3\xb0?4C\xa4)\x18\x0f\xbb\xbf?Qg\xff\xc8\x9b\x9e\xbf\x04N\xb1\xdc\x06\x07\x9a?6cB\x9e\x8c\x16\xa0\xbf\xd8t~\x11\x92\x87\xa1?\xa0\xcd\xf4p]o\x95\xbfg+9\xd8S\x05\xc0\xbfy&Adz\xf6\xc0\xbf\xde \xba\xba\x92\x99\xc6?\xe2\xb3\xf7\xfe^\xd0\xae?X\x994|)\x84\xc2?\xb0\x06\\=75\x89?70\xa4\x16\x19\xd6\xa2?h\xa2}5R|\x92?\x10\x83+8z\xbc\xb0\xbf\xaa\x9ed\xe9\t\xeb\x94\xbf>\x0c<\xf50\x97\xd4\xbf28\xb6\x81\xe5\xba\xab?c\xc4\x8dZ\x9f6\xc5?Q\xf6\x87-\xf4\x1e\xc3\xbf0\x07\x83_\x11k\xaf?0\xaeD\x97\xb4\xf7|\xbf\x84\x81\xf3v\xf5\xd7\xa2\xbf\xcc6\xf2\xc2:\xac\x9e\xbf\xa0\xcc\xcc\x89\x80\xadp?\xcb\x86\xe5\xa4\nI\x92?\xbc\xa0\xde\x89^\xca\xa6?\xa0\x9c\xe4\xb2\x10\'q\xbf\x91F#\x8c\x0f\xd8\xb1\xbfFi\xf6\x066\'\x96\xbf@\xa2j\xd6\x1e\x89U?7\x85\x8f\xb6\xab\x95\xbd?\x0c\x8b\xec\xe5\x96\x89\xb6?\xa0\xad\x84\x10\x0e\xd4\xaa\xbfhf\x83+\x10\xa6}?\xfcl1\xfa\xe3\xe7\x98\xbf<\xac\xd5\xc4\xdd\t\xbe\xbf@\xb3\xb0\x03\x92\xba\xa4\xbf\xee{\xee\xf7\x8f\xdd\x92?d}p\xe1\xe6\x88\xc1?\xe8\xea\xfdy\x19t\xa3\xbf\x99\xa1X\xa5u\xb5\x80?\x08,\x99\x07(\xaf\x85?\x9a\xd9\x19Kea\xa5?\xc40:gs\xd2\x90?\xa8R\xca\xcf|O\x8f?\xf0f\xbf\xc9\x0e^\x9d?\xce4\xa3\x89\x10\xc2\x85?\x89\x82\x03\x93\xbf\'\xb3\xbf9\xcc\x06\x87m\x17\xb7\xbf2\x99\xe7\x7fQr\xb2?\xde\t\xd4\x04\xb0\\\xbc?\xb7\xa5p\xc4\xb9 \x96\xbf%5\x9d\x9e\xd5\xe5\xa1?\xa8\xf9#d\x9e8\xa2\xbf+\xcfWs\xb1D\xc6\xbfo\xfc\x9dQi\xcc\xb6?\xa7{\xcf\xb7"[\xc4?\x1a\x8d\xa8\xd8\x16\xdf\xbc\xbf\xac 
\x90T\x12\xc2\xaa\xbfz\xb5A\xf4\x81\xcb\xc5?)\xc7\x1bWvk\xa7\xbf\xba\xb1\xcc "\xfa\xc8\xbf\x96\xf9\xb4&S\xee\xbd?w8;\x9f\xe2#\xc1?\x8ea\xffL\xc2\xd6\xc2\xbf\x1b\xc3\x98\xae\'\r\xc0\xbf\xc2 l\r!E\xc7?\xe8\x05r#\x91E\xcb?<\x13m))\x8e\xcd\xbf\xca\x0e\xf0\x8d\xf5\x99\xd3\xbf\x85\x14a\xf7\xda:\xca?z\x7f\xc5\x00Xn\xd3?[\x88\xb3\x8d\xff%\xb7\xbf\xacC\x8d\x95\x0fK\xce\xbf\xc8t&)B\xda\x9e\xbf\xf3\xb2F\x88\xd6L\xc7?\x93\x0c6\x12\x94"\xb8?S\x9f\x8e\x1ag\x81\xb4\xbf\xe8vD\x84\x9c\xa3\xa9\xbf\xaeN\xea\x13\xac\xa9\xa1\xbfT\\[r\x1b\x95\xa4?UM(M-=\xad?\xacJI\xd0\xc4\xc6\xb5\xbft\xf9\xb0\xc92\xfe\x8c\xbf\x82:8ni\x82\xa1?\x90\xc2t\x0f\xf1\xd0m\xbf\x19(O\xdbT\x18\xab?\x18\xb9\x9d\x1e\xaf.^?\xc4\xa6\x92\xc6L\xfe\xa2\xbf\r\xed\xa6\xbe\x0c\xb1\xa5\xbf\xb3\xa0z1=X\xa1?\xf8\xcb\x04bI\x15\xb3?\xd0\x13\xa7:\x8f\xa0\xa8\xbf.\xa4\xce\x07\x80I\xb2\xbf\xce;\xb0\xb0Vn\xa1\xbf\x97w-\xf2>\xb4\xa8?\xb0v\xd2h\xb4\xec\xb7?\xd2\xae\xb9\x0b&\x9b\x9c\xbf\x94\x17\x8c\xd5\xb4\xef\xa0\xbf\xd0\xab\xd7\xac\xc9My?#\xd3\x19\xce\xd4\xee\x8e?\xc0\x07\xe0\xbe\xb8\x0ey\xbf@\x19\xbb\xcc\x19\xfeQ?\xf6^(\xc6\xa5\xa0\x85?\\\x15fR\x9a\xf0b\xbf\x9e\xa6\x9b\x08[P\x8d\xbf\xc3\xde\x98\x0fY?\x86\xbfDT\x8d]v\xe5\x91\xbf\xb0\x7f\x96O\x91\xe0\xa1\xbf\x10\xb7\x0f\x8c--\x90?\xc0\n\xc1"o\x97_\xbf\x90T\xfc\xae\x11\xbc\x85\xbf\x87\x90\xa8\x94\x1d\xb2\xae?\xc0\xdff\xd4\x16U\x95?i\xa9\xcbFd\xd8\x8f?\xa0\x94#Zr\xc2o\xbf\xe6\xc9\xe7\xd8\xb0\x9b\xa3\xbf\x82\xaei1\x9dmb\xbf\xc0\x88\xa8J\x8a*\xbd?\x8e\xff\xe7\x8ft\xda\xa0\xbf \xa7y\xdf\xf6(r\xbf\xd1\xf9\xbeJ\xb9\t\xa0?\x00\xb2\xf7\x92\xf5\xa2{\xbf\xbc\xb6\x8a\xb8s\x9d\x94?\x10-O\x1e\xd5\xb3\xab?\xa8h\xda\x84\x9a\xdb\x84?\xbaBK\xd9\xbc\x8a\xb7\xbf\xd8\x99\xfb( g\x9d\xbf2vc\xd8\xd6\x1a\xb1\xbf\xa6S;\x02D\xa3\x8c\xbf=\xaa8\xb4&\xd7\xc1\xbf\xd9D\x96\x95\x8e\x8c\x88?\xba \xf4^\x98(\xb0?5\xaa\xd9\x1f\x1c)t?M\xc5\xdc\xce\xcfM\xb3?F\'\xf2,\x8f\x90\x97?i\xb4@\xd86\x8e\xc1\xbfpc[\xb7\x1er`?v\xa9k\xf0\x81\xfc\xb1?(D\xcf\xca\xac\x95\x89\xbfd\xf2\xec\xca\x943\xaa?\xeb\xa1\xdck\xca\x81\xbc\xbf\x16\xbfX\xb6|)\xa2\xbf\x18\x05T+B\x8b\xc7?\xeeB\xa5hS\x89\x9e?\x0eh-@U\x9a\xb3?\xbf@\x9e\x87\xcan\xa1\xbfP\xfa#s\xbd4\x97?\x88\x05\x90}tSf\xbf\xc9\xa5}1/!\xb7\xbf8\t\xc0\x04\xe3\\\x9c\xbf\xd2X\x93nW\x8f\xc7\xbfX\xf4\xff\'\xe1\x08v?B\xeb\xe0\x83c4\xb1?x\xa5\xc0\xa8\x82u\x9b\xbf\x0b\x8a&7\x80)\xa9?\x08P\x07\x8b\x94x\xa8?\x00\xab\xee"\x9a4M?o\xbf\x9eR\xfe\xa8\x88\xbf\x80\xc6[\xa0\x05VU\xbfE\xce\xac\xf6\xe4\x1c\x8d\xbf\xc0y\xc1\xc1r\x08}\xbf\xab~i\xe3\x05\x10\x97\xbfPD\xed\xd0S\xb9\x99\xbfC\xdc\x0b\x7f/\xedx?\x08\x83\xa6\xed\x97L|\xbf\xbc\x15\xe8Ea\x13\x9c?\xc0\x18\x02\xef\xb5`\xab?\xfc\xc9\xdb=\xe1\xa4\x8f?\x80\xf0\xb4\x9a\xc9R>?E\xb5\x7f\xc3\xb5g\x98?2\n\x1d\x0b\x9dM\xb3\xbfz\xae\x11\xf3Y\xd7\x9c\xbfdT\x12\x89\xb5\xcf\xaa?\xd8\x8ae\xe5\x92\xf4\x88?X!7|\x1c\r\x9e\xbf\x0e&qQi\\q\xbf\xb8\xf3\r\x9b\xf0\xce\x89?\xa8HM\xec8Z\x9b?\x10=HBK4v?\xf4_\x95\x19\xb6\xc2\x81?\xaa\xdc\xbc\x0c\xcb@\xb1?\xf4\xce\x80>\xa8\xef\xae?m\x93\\\xabZx\xc6\xbfX"\x0b\xfcKo\xbf\xbf\xb1\xe9\xb60l\x91\xc6?\xb1l\x0f\xc1\x07\xfb\xa2?\xa6s3&s\xcd\xad\xbf\x08\x97slh\\\xb5?\x02-\xc4\x0et\xec\xa3\xbf\x84c;\xe7\x9d\x98\xbb\xbf\xf0\xf0\xf8\xdb\xe6D\xb6?\xcdO\xe8\xb1]\xf6\xa4?\xff\t[\xd2\xa4\xb7\xb8\xbf@\xcfs\xcao8\xac?;J\xfba,\x10u\xbf\x9b+_\x18\x02\xee\xbb\xbf\xde\nHW\xf4r\xc8?)14}\x0bw\xc3?e\xd0,\xd8~\x85\xcf\xbf\x1d_&\xa7\xa2!\xc6\xbfE6\xc8\x95\xc0\x12\xc7?6=\x8d\xd6\xcd\xe7\xbb?*\xea/\x1a@\x82\x98|\xec!\xd97@6\x16/;a\xae/@\x15\x11\xef:\xb3\x84;@N\xe9}\x9d\xdc\xff8@Dh\x0fyW,B@G#u\x90\xf7*0@\xd4\xc5p\x82\xba\x1bD@\xd0\x93\x05\xee\xd2\xae\x1c@\x9f\xe2\x8eJ\xe4zD@\x17\xcc\r\xc5f\'
\xc7?6\xf5\x8cI*!D@\xb1\xbcX+\xaa(\x1b\xc0\xe7\xb3\x10\x1bt8B@\xcb
\xc0\xbc\xbd\xd6\xe7\xfb\xb3S@\x1e\xab\xca\xbe7lD\xc0\xa1%^\xae\x01\xbcS@'
-p157
-tp158
-bsg96
-NsbsS'_eigenvalues'
-p159
-g62
-(g63
-(I0
-tp160
-g65
-tp161
-Rp162
-(I1
-(I30
-tp163
-g72
-I00
-S"\xcc:\xf2\xd61l\x85@~\x81\x02\x12\x90\xc5w@\xc5\x83\xa6\x19NWQ@\xf5k\xccUi\xc9J@H\xcf\x8f*\xc4\x98@@\xf3\xb3\x85\x9ezz2@Z\x0f3\xe3\xa6\x15-@\x9a\x7fn\x05\xa1g%@\x0e\xd6\xdd\x05'\x7f\x1b@\xb9\x86\x13\x96:q\x14@\x03u*6\xe0\xee\x12@a\xe8%i\xa5\xdc\x11@@\x8df(\xbb\x9d\x07@\xa1[\xfa\xeb\x8d7\x03@\x87~\xd6\xf7\x87\xa1\xfc?\xca\x18q\x88\x141\xf9?tj\x1e\x1c\xae\xb2\xf3?\x1c\x0b\xe6\xa3\xc3\xf3\xf2?y.\xc0\xc7\xe3H\xee?L\x9b/j\x9f\xec\xeb?\x19'\x03\x11\xae\xbb\xea?\x10K\xf4!\xd2\xb6\xe8?8\xf2\xbc\x1e\xe5$\xe6?\x82\x0c\xc0\xcc\xfb\x0e\xe5?\xae\\\xd6\x91uS\xe4?\xbd\xcb\x06?\xbay\xe2?\xd5\r\x10\xe2\xe74\xe1?r\xa3\x97\x1d\x9a\xcf\xdd?\xe8\xb0\xe7\xa0\xa3\x0e\xdc?\xd6M\x06\x9d\xa9_\xd5?"
-p164
-tp165
-bsS'_trimmed_eigenvalues'
-p166
-g62
-(g63
-(I0
-tp167
-g65
-tp168
-Rp169
-(I1
-(I1
-tp170
-g72
-I00
-S'\xb3h$\xf2\x99\x00\xd2?'
-p171
-tp172
-bsg61
-g62
-(g63
-(I0
-tp173
-g65
-tp174
-Rp175
-(I1
-(I30
-I34
-tp176
-g72
-I00
-S'\xe0W\xcf\xdd\x82[\xd4\xbf\xa7\x8f\xdc(\x13E\xc6?n\x9a\x15\xd9$\x80\xcd\xbf\xe2\xe5\xbf\x1e;\xcf\xc5?\xdb\xde\xca\x104\xec\xc1\xbf~\xdc\xee:\x1e\xcf\xc5?\x9e\x1b\x02\xdc\xf8\x94\xaa\xbf\x99\x96\x80\xe7\xa0F\xc5?\xfe6\x92H_\x15\xa1?\x0b\xc3\x0f\xd3\xa4\xaf\xc3?V\xf9Vr%[\xbd?=\xba\xa92\x92:\xc0?\t\xef\xc3~\xfa\xea\xc8?\xc7p-\x13J\x1f\xb5?\xd8\x00\xe6V\xa1\xaf\xd0?\x19\x88\x8f\xa8\x07\x93\xa2?\x7f\xef\xc7\x1aa\x01\xd2?\x08\xa2Ql\x1c\x12z\xbf\xd0k\xe5\x8b\xf1\xd3\xd0?\x94\xd7\x0e\x01\x1ep\xa6\xbfw\x1f\xd5\xe9\xfd\x0e\xc9?\x9d!2\x1c\x07\xb3\xb5\xbf\xdbv\xe4_y\x10\xbd?\t\xbc\x99\x88@\xe9\xbf\xbf\x83\x10\x86\xa2\xea\x9a\x9f?\x04-K \x10!\xc3\xbfPg\x13\xc3h\x87\xab\xbf\x18_G\x8c\x9b\x91\xc4\xbf\xe4\xc7^0\xac\x1e\xc2\xbf\x93E;\xdd\xf1L\xc5\xbf\x9bd,\x10\x19\xa9\xcd\xbf\x17\x06\xc7L\x7f\xc7\xc5\xbf;\x8c\xc1R\x14\x80\xd4\xbf\xc4\xf6\xdeV\xd2F\xc6\xbf\x13,\xe1\xb4\x0c\x85\xd6\xbf\xfc\xbbP!z\xf3\xba\xbf\x02\x85\xd9y\xc2t\xcf\xbfi\x03\x93Ub/\xbf\xbf\x05sZ\x02\xc7\xb6\xc1\xbf|\x08\xe8\xf1\x98\x92\xc3\xbf.\x9a\x10\x88\xc0\xb8\xa2\xbf\x84\xc9\x9a\x85\x9fO\xc6\xbf\xecD\x9c\x8e\xa3M\xa0?+\x1c\xc1\xc8*i\xbd\xbfs\xdf\xcat\x0b\xb6\xad?\x90Kur\xa7\r\x85?LC\x9aU\xed<\xaa?_z\xad\x90T\xc8\xc4?B\x86\x11+\xbda\x90?\xd5\x8bFF\x1e\xf9\xd3?\x1f\x8eeY\xa2~r?r\x9b|\xe6i\xc5\xd7?\xeaM\xef\xe5\x1bb\x86\xbfy\x01\xcbv\xb1Q\xd4?\x83\xbd\x9f\x95\xeeW\xa8\xbfk\x93\x13\x14\xb5\xcb\xc5?}I\xc6\xc5\xf4\x16\xad\xbf3\x10\x1f*\x03X\x8d?\xee\x80\xcc\xd28\xbf\x9f\xbf\xef8\\\x9e\xdf\x14\xbd\xbfy\xc1\xa4\n\xa6\xbb\xa1?3\xe6\xfa\x17:\x9b\xc6\xbf\x94W\xe9\x86\xd1\xf9\xc0?\xfb\xa6B\x08\xd3X\xc4\xbf|=\rx\xa9\xf0\xce?bd\x83\xc7\xfdO\xc0\xbfq\x80\xbf3\xeeI\xd6?h\xae\xc4\x8eA\xc8\xbc\xbf\xf3\n\xa0\x8e\x8f\xaa\xca?\x8e\xfd\x99\xa5j\xe4\xcb?\x95\xddM\xa9\xf4\x81\xa6?(\x99\xad\xc7_\xdf\xc6?\x9d\x82\x1f\x1f\x93#\xba\xbf:\xd5l(.\x05\xbd?\x12\x10\x94\xb9\x1d\x08\xc8\xbfH\xa5\xbb\t1kv\xbf\xd4\x912\xab\x13\xce\xc5\xbf\x98\xb1~|\x9b\x9f\xc6\xbf\x14\n\xe2,\x14h\xb6\xbfT\xdf7\x02\x8c\x07\xd3\xbf1\x8a\xa42\xf3\xaf\x98?\xc4\xbfF\xbfM\xe7\xd3\xbfUR\x88tR\x17\xc0?\xccb\xd2\x11l\xad\xc8\xbf\tf\x84\xda1\xc9\xc4?\x90\xf9\x90\x86/\xf8y?\xfb\xfa\xc1<\'\xa0\xc1?:X\xce\'\x9d\xfc\xca?\xcf\'OWk\xfc\xad?\x13C]\x0c\xe8\xa2\xd4?\xa9PR\xc4\x9b\xa3\xa5\xbf\xdd\xa2\xc0\x97\xd7\x85\xd3?\x01\xb8\x81\xca\xa0?\xbf\xbfo\xeau2\xa7+\xc7?Fz8o,R\xc2\xbf\x87p\x97\x94g\n\x89?\x04\xeb\x82\x96\xf6\xe1\xb6\xbf]\xbd\xf1n)\x9c\xbd\xbf"T?\x9a\xca\xfc\x98?\x96\n\xdd\xb5\xd2\x15\xc9\xbf\xc5LdGyg\xc4?\xd7\xfaD\x89H/\xd0\xbf\xe4\xcd\x87\xaf\\x\xc6\xbf\xc3\x07\xc4\xae\xfc\xec\xd3?\x1fLX\x92\x05\\\xaf\xbf\x9eB\x06Ep\xcb\xc7?r\xdbL\xa5\n~\xb5?\x8ft\xf4\x8a>\xe0\xad?Cu\xc4A\xcb$\xca?\x9a\x9e(\xe9\xa6\xa6\xaf\xbf\x83\xd9\xd3YN*\xd1?|lD-x\x99\xbd\xbf\xe5R\x14\x92:\x06\xd2?4\xee=\xd8\x10(\xc1\xbfy8\xf8#\x1bJ\xcf?\x1cSf\x96j\x8c\xc1\xbfO\xc05\x95q\x1f\xbf?\x8d\xe8i\x07g\x82\xbb\xbfSh\x0b#+H\x83?D!}\xbe\xc2\x01\xb4\xbf\x14:^-\xa5\xaa\xb9\xbf\x9d\x14ZU\\\xf2\xb1\xbf\xeb\xa6\x9a\x9b\xf2\xa8\xcd\xbf\x13?p\xf8\x1b\x19\xb6\xbfF\xf9"\xd3\xe3\xc3\xd2\xbf\xd3\x80L\xa2\\\x96\xb7\xbfY\'\x98o\xbe\xf1\xd2\xbf\xb3v\x86\x83\xce\x7f\xb8\xbf\xac\xb6$\x97\xfd\xa3\xcd\xbfu\x7f\xdc\x84\x14c\xad\xbf\x0c\x7f\x02\x869w\xb8\xbf8\x80\xcb\x81\x01+\xa7?\xef.\x8f\x91\x036\xb0?\xe2Ol\x1b\xb1]\xc4?\xe99=PG\xaa\xca?)\xa5/\x1eQ\xa8\xd1?j\x01~\x9c\xde\xf4\xcf?K!\x0e\xd2\xb1\x91\xb8\xbfoO\xc7\xcc\xe0\xc3\xb7?3\x08z\x18\xa3\xd3\xbe\xbf\x81x\xa5\xd5o\xe0\xab\xbf\xd8\xd8\xe8\x81\xa6\xd1\xbd\xbf\xc1\x11~?a!\xc9\xbfK=3,\xc9\xaf\x9f\xbf\xac\xdar?\x82R\xd2\xbf~\x96p\xc6}4\xbb?\xddB@a[\xdd\xcd\xbf\xbb\xec=}b\xb3\xc8?^q\xbb\xb5\xb5h\x96\xbf\xa7m
\x19\x1d\x85\xa2\xc3?\xc9\x10k\xc7\x08\xfb\xd0?\xb0\x0c\x7f{\xbe\xa8\xb2?\xbe\x1d\x92i\x18\xe0\xd6?\x8e\xb5\xe9%\x8au\x8f?\x0e&\x0ff\x94{\xd1?J\x86h\x94\xa9\x17\xac\xbf\x7f%}I\x0b\xf0\x82\xbf\xa5\xf1\xd2\x10\xd1\xba\xc3\xbf\x97\x13\n\xe3&\x1a\xcc\xbfm\xbf\xc5b\xe5\x9b\xca\xbf\xb7;7\xc7\xbb\xa7\xd1\xbf\xebN\x0b\x9d\x91\x8b\xc1\xbf\x98h0\xbdT\x9b\xc8\xbfc\x94:F]\xf0|?O\x04\xce8\xb5\xa5\xb0\xbf\xa0`m-I\xe0\xbf?\xb2\xda\x10\xe1d\xcc\xb4?\x06\xfa\x963\xab\x9f\xc1?\xc3\xde\xe9\xa9U\x05\xce?"\x88\x9f~\xf0\x8c\xbb?\x07\xff\xd0\xfa\xb1\x03\xcb?\x96d\xb4\xec\x9f\x88\xd2?\xeb;<6\xce\x1b\xc2?z\x1c\xae\x87\xdax\xc5?\x06\xabsn1u\xad?g\x9f\xed\xbd\xc8\xb3\x97\xbf?\xc7\xfd\xe0ndS\xbf\x8a\x07P$\x7f\xd8\xcc\xbf\xb9|Y\xad\x06\x9b\xaa\xbf\xbf\xbf\x850\xf0\xeb\xd3\xbff\xf8h\xf7$\r\xbc\xbf_\xd2eZ\xda\x95\xcf\xbfL\x12ER\xc4\xe4\xc1\xbf\xe4\x84*\xbaA\x03z\xbf\x7f@C\xb8W\'\xb6\xbf\x03\xe0\xcf\xa1O\xb2\xcd?\xed\x05\xc6\x15\xd8f\x97\xbf\xf3-^\x9d\xbe\x8c\xd3?\xbf\x04\xe3}a5\xad?\xa5S\xeef\xf5\xeb\xc7?\xf0\xc8\xfe\x0b\xa9\x83\xc3?\t\t\x95\x88\xf9R\xa8\xbf{\xe6\xa4A$\x0e\xc3?\x01\r\xf4\x1c\xf3\xc5\xcd\xbf\xe8\\A\xfc\xd3\xcd\xb5?\xa1\x19#r\x9a\xff\xd1\xbf\xbb\xd1\xde\xf71\x17~?@&\x1co\x7f\xa3\xc8\xbf\x1e\x8a\x18\xf1%/\xb0\xbf\xd54\xc6\xbb\xec{7\xbfZ\xbc\xa5\xf5\x83=\xc4\xbfm|\xfc:Y\xa2\xc3?\xaa\xef\x87\xa3\x16%\xcd\xbf\xb7\xfb\x9b\xee\xba\xb1\xcd?\x7f\x99i\x08\xa6\x01\x94?\xed\xf1n\xe3<\xd2\xd9?Z9Ha\x03\xe4\xb0\xbf"g>\xc4\xfd\x18\xb6?Y\x00\xc8=\x1e\xa0\xa6\xbf\x98\xbf\xf8x\xed\xfc\xcc\xbfC\xe2\xf4\xf6\x13\xfe\xa2?\x0b\t~\xcc\xf6k\xd5\xbf\xccC\x98\x80\xb6\xaa\xad?GP\x97\xa6\xbf\x00\xc6\xbf\xb5\x8eGZ\x15X\x94?\xe1D\xab\x01O\xa8\xb2?v\xe9\xeb\xdd+\xa7\x9f\xbfv9\xdf\x86\xda\x06\xcd?R\x98@Z\x13@\x8b\xbf\xc9\x10\x8b\x1cn\xd1\xc5?\x0c\xf2_[\x83\x98\x81?\xe8\xa7L\x8bw\r\xa1\xbfV%\xfbU*zw?\x9d\x1b\xca\x93\xedQ\xc9\xbf\xd1K[\xf1B<\x98\xbf\x10\x8c|\xe0\xe0X\xca\xbfs\xd3r\x0f2\xedw\xbfO\xdcJ\xe7\x08\xb7\xa4\xbf\xe0\xdatd\x08S\xa0?\x93\xd1\x0e\x06\x8dF\xc9?\xe5s,\xef\xcc\x9c\x9e?>T\xc7\xe1V\xec\xd6?R\\rh\x8d\\\xa2\xbf\xe9\xe7\x96\x1f;\xb6\xcf?\xbb\xde`x\xe4\x00\xaf\xbf\x9b\xfd\xf6K f\xba\xbf\x8b\xccb\x1d>\xdc\xb1?I\x9e~\xa3"\xc7\xdc\xbf\x9b\xf6\x8dHf\x1d\xcb?\x1e\x94\xb0\x80\xf7\xd0\x7f\xbf\xfeQ\x14\x0e>\xb7y?\xf5\xf0\x01\xd9\r\xfbr?\xe9 \xb0\xd70\xf7\xc7\xbf 
\xd0\xbc\xea\xa6\xaa\x9a?\xc2\x83\x05\x89\xcd\x7f\xd1\xbf\x90\x1e\x14v\x96\x96\xa7?b\x19\xff\xdb\xaf\xe6\xc2\xbfJ\xce\x9fMTM\x98\xbf\x94\x9d\x07\x12\x17\x12\xc0?if\n{)\x8b\xb7\xbf\x85\x1e\xd7\x1e\xe7^\xd7?\x04\xce\xdb\xbe}<\xb1\xbf\xd8O\xd2\xcc\xf9M\xd4?;\xd0\xe3q\x98=\xae?j`\x86b6U\x83?q\xe27"AD\xbd?\xc9\xf7\x0c\xb4$\x84\xd3\xbf\xcf\xda6\xe9\x0bh\xb1?\x7fn\x91\xcb\xeb\x94\xd8\xbf\x03l\x8e\xae\xd1l\xb3\xbf\xf0\x06\xfa2\'\xf4\xc5\xbf\xef\xa1\x04\xab\xf1y\xb8\xbf\xfba\\\xb9\xa5\xbb\xc2?\xd7mI\xf3M%\xa6\xbf\x1b\x1d\x1eG\xb3\xff\xd3?m\x84\x81\x9d#\xd8\xb2?#E\x9b\xeb\xd4\xdb\xcd?\tar/c\x9b\xaf?$\x1b2\xed\x04\xceq\xbfr\xfe\xe0!GTv\xbfYZ-\xce|H\xd0\xbfRmk\x84\xb3\xbf\xa5\xbf<\x8d\xf9\x00\xf8\x03\xd0\xbf\x8c9\xb3(\x9a$\xbe?\x87@z\xdbq\xe4\xa4?\x85\xca\x1c\x98q\xc0\x98?\xda\xb5y9H\x9c\xd0?\xbc\r\xd00E\xf5\xb0\xbf\x87\x83l\xe3\xd9C\xd2?\xa5\xaf\x9cdk\x05\xb2\xbf\xe9*\xde\x1a\xd6\xe8\x93?\x822\x16\xc2"\xbf\xab?\xb9\x86\xf9/6j\xd1\xbf\x91rRm\xb6\\\xb1?\x12nZ\xd2\xb4j\xd4\xbf\x15\xd8\x11\xaf_\x12\xbb\xbf\xa1\x9aw\xf4\xc4[\xaf?f`\xaf\xcd\xd4*\xcd\xbf\n\x18\xf5\xe9\xf1z\xd2?\x8e\xc6\xd8\xa8\xd4fq?Mt{\x1dgk\xbf?\x9e.\x7fP\xe8b\xcb?B\x93\xf1\x9cO\xd5\xcf\xbfnUH\xdfu"\xbf?\x0bKi\x966c\xd1\xbf\xf6U\xc0\xd1\x1bh\xa9\xbf\xe8U\xa0h\xc2\xdd\x85\xbf\x08g\xfc\x86\xd3\t\xb2\xbf#9n\x82q\x1e\xd0?/f\xe1\xb2\x1e#\x9f?\x9f\xac;@1\xab\xd0?>\n\xafM\x1a<\xb1?\xcd\xfd\x86\x8b3\xaf\xa0?\xd6A\xbb\x97\xe4j\x8a\xbf\xb1x\x8c\xc2[\x06\xd0\xbf\\J\xa2?\x9c&\xb9\xbf$\x85\xf3\xa7r\x19\xa7?\xff\xe0\x00\x13\xbb\xcf\x82?\xa7\xf9!\xb1%\xea\xa1\xbfw\xb6:\x19\xc9\xbf\xae\xbfe\xb3<\x89\xa0 \xbd\xbfc\x90,\x94<\x8a\xba\xbf"\x17\x9a3\xbf\xe7\x9f\xbf\x9f\xb3:]P\xcc\xbb?@\x86\xd6\x17q\xee\xb1?]\x15\x04w\x903\xcd?\xbc\xbd\xa2/\x1dO\xc2?\xd9FR\xad\x9ea\xaa?M\xb0I\x95ec\xba?&\xdc\x1d{\x8d\x1d\xce\xbf\x01?\x88\xc5q\xf5\xc8\xbf\x97\xf3\xf7\rBK\xcf\xbfS\xd1o\xcf\x00k\xbc\xbf>\xd5%\x1fE]\xb0?\x97\xcbYf\xff\xae\xa3\xbf\x99a\xa2;\x928\xd5?\xcaX\xbd\x99K}\xcc?\x84\x85\x05\x99\xb4N\xbb?\\g\xe0\xcf\xc1\xf2m?KH\x89\x8b\x8e\xae\xd1\xbf\xe8\xa0Y0\xbc\xe3\xa0\xbf\xb4\x0e\\\xf8\xd7\xaf\xd6\xbfpe\x19\x02\xa8;j? 
E\xd8\xb0\xca\x1d\xc1?J\xda\xf9\x9c6\x8c\xb0\xbfm\x0e\x88\x92\x80a\xda?:^\xae\xd9e\xb7\xb8\xbf\xa2\xf1\xa6\x11\xab\xf5\xbf?\xc9?ft\xd6\\\xc0?cs{\xc9\x98\xf3\xd2\xbf\xb6\xfbze\xfdf\xcd?\xaedS\x92d\x91\xcf?jn\x1cd\x11\xa6\xa5\xbf\xaaw=\x8a\xb5hy\xbf$\xf3|\xc9bq\xcd\xbf]\xd6\x13\xb8\xecX\xd5\xbf6\xdd.;%\xa3\xba\xbf\x7f\x03\x888\xca\x9a\xcf\xbf\x8b?jm\xb0L\xb3?4S\xcb\xfd\xf1\x13\xb5?)\x17e\xe1k8\xc2?.\xc3\xb5\xa3\x08\x83\xd3?D\xbe\xcd\xc1\xbaA\xb4?\x8d\xbfo\x1e\xc2@\xc8?Yg\x86\x9f=.\xc4\xbf\xcce^\x14\'\x17\xc3\xbfL\xd5m\x07\x05\x1c\x8c\xbf\xc02\xd6,\x89\xe6\xd2\xbf\x94\x06\x1c\x13\x1cH\x9b?\xa2j6\xe6\xc2;~?N\xc1\xe2\x98\x14\x1a\xb7?u\xe3D\xe5{\xfa\xd2?\x92\xcd\xba\xc4\x06\x8c\xbe\xbfC\xb4\xc2\xa1\xa5e\xcd?zz\xee\xf3\x83\x05\xab\xbf\x95q\xa0\xa3\xa6\x8b\xc2\xbf\x96&\xbe.-p\xb3?\x9f\x81\xb8R\x06\x81\xcd\xbf\xe7H\xb3\x91\x8aJ\xc4?\xeem\x1fe\xda\xaa\xc0\xbfu Q\xde\x7f\x8at?\xa9\xb8AT\xd2Om?\x06\x11\xe4\xc5\xc5\xdf\xc4\xbf\xca>\xa1Z\xbe\x8f\xc6?\x0c[\x12\xa4\xe4\xbc\x8e\xbf\xe5\xc0\xc4bL\xcb\xd2?"\x04dR|\xe5\xba\xbf\xac\xf1\xfe\xe9\xa0A\xcb\xbfzsml8\x11\xcd?\xba&mm/\'\xcc\xbf\x84LB\x1f\r\x11\xb7?\xa3\xbb\xae\x85.\xcb\xb8\xbf\x02\x84\xb1\x17J\x07\xc9\xbf\x8e\xd7\xfe\xa69"\xc8?Z\xa1\xfb\xd6Nb\xcb\xbf\x94T\xe1(\xc7\xb6\xc8?\xf4\x7f&\xcc\xc8\x06\xc2?\xf9T\xcc)\x9b4\xb2\xbf\xc0P\x96sj\x89\xd4?\xb5\xca;\xbff\xa0\xcc\xbf\xe8\xb1@\x0e\xe7\xaf\xc7\xbft\xcc\xde?\x85D\xb8?\xfd\x19\x9a\x0bO\xf4\xc9\xbf\xc9\xb3\x84cc$\xb9?\xb8.\r\x10\xdfV\xc1\xbf\x8a\x15A\xab;\xc8\x9f?OV\\\x17\xbd\xdc\xd8?\xba.@@\xb9&\xb1\xbf\x92r7\xef\xcf\x07\xb8?\x8dw5\x88\x90\xcd\xc2? \xad{\x02W\xe6\xb3\xbf\xa6^\xffP\xf0\xbb\xc1\xbf{\xae\xe0R\x19\x01\xc8\xbf\xbd\x1c~\xde\xb9<\xc2\xbf\xee)\xafB\x9b\x13\xa8\xbf\xf7+\xa1=+\xef\xbb?M\xec8\xb6N\x97\xba?(\xff\xca\xd39\x97\x91?\xc5\xa9\xb7\xf4\xec\x9e\xd0?\xb4\x81\xf3A}z\xa7\xbf\xe8K\xc1)A\xea\xd5\xbf5\'\xb3l\xf5)\xbd\xbfa\xb2\xd9\xef\x1f6\xb7\xbf\xe6\xb1)I~\x0c\xd3?\x90\xb5\xde>\x0e^\xac?\xddN<\x08\xefY\xaa\xbf\x9b\x17\xb8\x08Q\x9f\xc4?\tN\xc0\x0b\xc7h\xc1\xbfQ\x92\xe0O\x80\x08\xa4?\xbb\x06\xc5K\x05j\xa0?\x17\xa97vH\xae\xba\xbf)\xb0\xeeVc\x87\xbb?\x0eL:\x88m\xe1\xb5?\xf9\xc9T\x02S\x00\xc5\xbf\x9e\x1a\xf2\xd9\xcd\xc7\xcc\xbf{\x10\xf3\xf9\x12\xb6\xb2?j\xb8\x10\x85\x14\x7f\xcc?\x7f\x1d\xb6$.j\xa6?T\x81\xdc\x96S\x14\xb3\xbf\x0c\xe9\xe3\x1ag?|\xbf\x1dd\x90\xfab2\xc7?\xd9\x91\xbe;\xe7W\xc6\xbf\x1d\xc7\xed\xa6\xc0\xe0\xd3\xbf\x0b\x10\xc4\x04`x\xc6?z^}\x1a\xa0\x99\xc0\xbf\xdf\xa5@_jB\xb7\xbf4Qu\xee\x0e\xef\xd1?F@\x93v\xd0\xdc\x90?\x1e\xd1N\xed\x99\x84\xcc?Z\x07\xb8\xb9\xafz\xcc?\xabe\x8a\x0f\xe5\x87\xcd\xbf\xcd\xe8\x00\xe9&\xad\xc7\xbfT\xca?\xf8\xb8\xcf\xc8? L\x8c\x9d_n\xc3\xbfy\xaeV\xceN\xdc\xba\xbf\xec\x8d\xcc\xfa}\xd1\xc0?\x8a\xfa\xba\x8e\xb8y\xd3\xbf\xf2/\x04%\x84\xa5\xb9?Z"\x88[2\x04\x95?h\xa6\x02\xedkU\xbc\xbf\x15\xf3=\x92M\x0c\xd3?\xf8 9R\x8b5\xb4\xbff\x85{\x11\xf6\xd7\xb0?\xda=\xa6y\r\xd9\xc5?2\xc3B\xa8\xf4\xc2\xd1\xbf\x13\x1c^\x94j\x14\xb5?L\xe5h\xc2_u\x93?\x94\x8aj\x16\xf8;\xcf\xbfu\xb1\xec\x17\xbf}\xce?Y\xe5\xdf$\xb3}\x80\xbfo\x15\xf7T\xc8H\xaf\xbf.\x89\xa8*\x14i\xd0?D^\xda\x9d\x81\xa9\xd1\xbf\rW\xf5\x89%\x1f\xb7\xbfz\x1c\xcbGm\x99\xba?\x1c\x12i<\xb3H\xc3\xbf>\x8ac8\xbd\xfd\xd1?i\tcE\xa4\x85\xaf?4\xc0s\x9c\xd10\x96?\x02\x89\xf1\x8f\n\xb8\xba?\x9f},\x07s\xc0\xd1\xbf\\}\x85\x1b\xe1C\xbe\xbf\xd5\xda\x0b\x06F\xa5\xc0\xbf\x96\xdf\x16q\x96\xae\xbe\xbfW\xca\x1eqWC\xc8?\xd6\x19\xce\xcd\x162\xc5?\xf4\xba6\xfb\x1d\x0e\x9c?\\ F\x9f\xb2\\\xb7?P\x02A\xfby\'\xa7?\xb1T\xd9,\xdb\x1a\x94?\xddw\xf0\xf8i\x9f\xc2\xbf\xf6\x16/\x1a.R\xd0\xbfY\x08\xe1\xba\xc6n\x86?\x8d\xd9QSW\xd8\x8f?X\xeb\x92\xb4\xc2\xfc\xb5? 
\x0f\x8f\xfacp\xce?\xeeB\xaa\xb6VX\xa0?n\xcd`\x98\x8c\xe2\xa1?\x97\xa2\x9en?\x07\xab\xbf\xa0\xfe\xc7z\xb7\xb8\xd2\xbf\x10\xf3\xd7\xfd\xa3\x8b\xb4\xbf)\xffL\xe5\xfa\x91\x8a?O\x1e,\xca\x88|\xb9?\xb9D9\x05\xc0r\xcd?\xb8.F\xea\xb6>\xc1\xbfx"\x95\xf7P\xce\xbc?K07e\x8b\xdc\xc9?\x9a;_\xae_\x9f\xd3\xbf\xc2\x88j`\xcb\xad\xa7?\xd5^\xd7\x84\x83\x91\xb3\xbf\x97x%Y\xea\x96\xc2\xbfe\x0b\xe9\x88\xdc}\xc7?r\x1a7iZ\x95\xd0\xbf\xfdze-\xb3B\xd0?\x05\xe9yO\xf1\x15\xce?\xc9t\xe8\x14\x07z\xc7\xbfEY\xa5C\xfa\x16\xcf?\x88Wy\x1d\xea`\xd2\xbf\x8d\\\x02\xfe\xd1\xd2\xca\xbf\x8a3\xcc:\x9b\xba\xca?\x8e5\xdf`\xaf\xa6\xcd?K\x88\x97\xe6X=\xc5?\x1e\x12\x04A\xd6\x00\xd2\xbf\x0e\xa5\xf7\x9a\xa3o\xce\xbf)\x1f\x17\xef*Y\xc3\xbf\x06\xbes\xc8\xd2\xf6\xc6\xbf\xa6\xd3\x07\xea2[\xbc?~\xa1\x9d\x01V\xc8\xcc?E\xa7\xb5L\x0ef\xca?R%\'\xce\x1b1\xd1?\xa7\x1c\x02\x0b\x84\x8c\xa0?\xdc+\xf0\x18~E\xca\xbf\xf5\xb9\xc9\x06\xac\x0c\xce\xbf\xc0\x93(c\x93\x10\xd1\xbfH\xae\xcc\xae\x96@\x92\xbf\xeb\x03\xb9&\xea\xad\xcd?\xe6\xfef\x0f\xe9\xd0\xbc?\xfd\x81\r\xb7\x17\xf0\xc7?\x88\x1a\xd4\x8f8;\xbe?\x95\x85!\xbb\xdb$\xd0\xbf\x14\xce\xca\x9f\xf4D\xc1\xbf\x1b\xda\xd6:\'\x9e\xad\xbf{\xbf@#\x8b\xcb\xb6\xbfI\xb97\xb3-\xc4\xc9?x\xf4W%c\xca\xb1?[\xd8]\x1a?`\x93?\x16\x9d\x0b;\xc6N\xc4?\t.\xad\xa2\xd5]\xcb\xbf\x9b{\xc4\xa8\xf41\xba\xbf\xd3\xc92\x02\x9b\xf1\x8e?vw\x8f\xf3*J\xba\xbf+T\xaa\xc9\xefj\xc4?N\x91R\x15\xbf\x01\xb3?\x1b%!!\xd5\xcf\xae\xbf\x17\\\x16\x96i\x06\xa8\xbf\x1c\xf7b\x06[\xa1\x87?\xeb,\xf9\xa2\xb0\xa5\xb5?\x0fs\x14\xaeJ\rq\xbf\xf32\x96\x03\xafQ\xb4\xbf%\x99\x7f\xcd\xd5o\xb6\xbf+\x02\\E\x8fVk\xbf\xcf\x0e\t\x95\xaa\xd9\xc1?\x1dl\x99p\x95\xb7\xb6?\x93\x08\xd54aPv\xbfc\xd1\xe6KZ\x1f\x83\xbf\xbb\x89\x94dkO\xc4\xbf\x8aSG\xdaT:\xc6\xbfrg\x17\x0b\x07\xe6\xbc?\rX6\xc4\xa0\xbf\xac\xa0O\xfee}\xc7?\xde\xdd\x91\x93\xcfy\xb5?\xfc|\xf9\x9e$\x9e\x85\xbf]\x91\x94\xcdE\x1eP\xbf\x8c\xed0\xdf\xf3;\xc6\xbf\x85*\xcf\xbf\xfe\xd4\xc6\xbff\xff,vX\xd5\xa7?\x0e\xc5\xeb\xe5\xd9\xc6\xc2?<\x0b\x12P]\r\xbe?0\xb6L0\xa8\xb1\xc0?.l\xa4-y\xeb\xa7\xbfg\x86a\xaci4\xc4\xbf?\xafz\x19\xda\x95\xb3?\xde\xa9\xddZ\xbc|\xc0?\xf4\xb6/\xe6B\x94\xa0\xbf\xaf\x8cv\xe0\xd1\x0e\xba?\xe5\x86\xab\x08\xf8,\xb5?\xc6\\\x90\x92\xff4\xb6\xbfR\x92(c\xc4\xe0\xca\xbf\x14Tz\x80\xb7\xab\xb1?\x08X:\xf5\x01\xc2\x9f?b6\x86\xd8\xb9\xce\xad\xbf\x0f\x97\x8fh3\\\xc5?\xf2\xa6\x01\x89\xbb3\xd4\xbf\x8eZ\x1b\xa3\xae 