diff --git a/.eggs/README.txt b/.eggs/README.txt deleted file mode 100644 index 5d01668..0000000 --- a/.eggs/README.txt +++ /dev/null @@ -1,6 +0,0 @@ -This directory contains eggs that were downloaded by setuptools to build, test, and run plug-ins. - -This directory caches those eggs to prevent repeated downloads. - -However, it is safe to delete this directory. - diff --git a/.eggs/pytest_runner-6.0.1-py3.8.egg/EGG-INFO/LICENSE b/.eggs/pytest_runner-6.0.1-py3.8.egg/EGG-INFO/LICENSE deleted file mode 100644 index 353924b..0000000 --- a/.eggs/pytest_runner-6.0.1-py3.8.egg/EGG-INFO/LICENSE +++ /dev/null @@ -1,19 +0,0 @@ -Copyright Jason R. Coombs - -Permission is hereby granted, free of charge, to any person obtaining a copy -of this software and associated documentation files (the "Software"), to -deal in the Software without restriction, including without limitation the -rights to use, copy, modify, merge, publish, distribute, sublicense, and/or -sell copies of the Software, and to permit persons to whom the Software is -furnished to do so, subject to the following conditions: - -The above copyright notice and this permission notice shall be included in -all copies or substantial portions of the Software. - -THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR -IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, -FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE -AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER -LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING -FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS -IN THE SOFTWARE. 
diff --git a/.eggs/pytest_runner-6.0.1-py3.8.egg/EGG-INFO/PKG-INFO b/.eggs/pytest_runner-6.0.1-py3.8.egg/EGG-INFO/PKG-INFO deleted file mode 100644 index 75c36d6..0000000 --- a/.eggs/pytest_runner-6.0.1-py3.8.egg/EGG-INFO/PKG-INFO +++ /dev/null @@ -1,185 +0,0 @@ -Metadata-Version: 2.1 -Name: pytest-runner -Version: 6.0.1 -Summary: Invoke py.test as distutils command with dependency resolution -Home-page: https://github.com/pytest-dev/pytest-runner/ -Author: Jason R. Coombs -Author-email: jaraco@jaraco.com -Classifier: Development Status :: 7 - Inactive -Classifier: Intended Audience :: Developers -Classifier: License :: OSI Approved :: MIT License -Classifier: Programming Language :: Python :: 3 -Classifier: Programming Language :: Python :: 3 :: Only -Classifier: Framework :: Pytest -Requires-Python: >=3.7 -License-File: LICENSE -Provides-Extra: docs -Requires-Dist: sphinx ; extra == 'docs' -Requires-Dist: jaraco.packaging >=9 ; extra == 'docs' -Requires-Dist: rst.linker >=1.9 ; extra == 'docs' -Requires-Dist: jaraco.tidelift >=1.4 ; extra == 'docs' -Provides-Extra: testing -Requires-Dist: pytest >=6 ; extra == 'testing' -Requires-Dist: pytest-checkdocs >=2.4 ; extra == 'testing' -Requires-Dist: pytest-flake8 ; extra == 'testing' -Requires-Dist: pytest-cov ; extra == 'testing' -Requires-Dist: pytest-enabler >=1.0.1 ; extra == 'testing' -Requires-Dist: pytest-virtualenv ; extra == 'testing' -Requires-Dist: types-setuptools ; extra == 'testing' -Requires-Dist: pytest-black >=0.3.7 ; (platform_python_implementation != "PyPy") and extra == 'testing' -Requires-Dist: pytest-mypy >=0.9.1 ; (platform_python_implementation != "PyPy") and extra == 'testing' - -.. image:: https://img.shields.io/pypi/v/pytest-runner.svg - :target: `PyPI link`_ - -.. image:: https://img.shields.io/pypi/pyversions/pytest-runner.svg - :target: `PyPI link`_ - -.. _PyPI link: https://pypi.org/project/pytest-runner - -.. 
image:: https://github.com/pytest-dev/pytest-runner/workflows/tests/badge.svg - :target: https://github.com/pytest-dev/pytest-runner/actions?query=workflow%3A%22tests%22 - :alt: tests - -.. image:: https://img.shields.io/badge/code%20style-black-000000.svg - :target: https://github.com/psf/black - :alt: Code style: Black - -.. .. image:: https://readthedocs.org/projects/skeleton/badge/?version=latest -.. :target: https://skeleton.readthedocs.io/en/latest/?badge=latest - -.. image:: https://img.shields.io/badge/skeleton-2022-informational - :target: https://blog.jaraco.com/skeleton - -.. image:: https://tidelift.com/badges/package/pypi/pytest-runner - :target: https://tidelift.com/subscription/pkg/pypi-pytest-runner?utm_source=pypi-pytest-runner&utm_medium=readme - -Setup scripts can use pytest-runner to add setup.py test support for pytest -runner. - -Deprecation Notice -================== - -pytest-runner depends on deprecated features of setuptools and relies on features that break security -mechanisms in pip. For example 'setup_requires' and 'tests_require' bypass ``pip --require-hashes``. -See also `pypa/setuptools#1684 `_. - -It is recommended that you: - -- Remove ``'pytest-runner'`` from your ``setup_requires``, preferably removing the ``setup_requires`` option. -- Remove ``'pytest'`` and any other testing requirements from ``tests_require``, preferably removing the ``tests_require`` option. - -- Select a tool to bootstrap and then run tests such as tox. - -Usage -===== - -- Add 'pytest-runner' to your 'setup_requires'. Pin to '>=2.0,<3dev' (or - similar) to avoid pulling in incompatible versions. - -- Include 'pytest' and any other testing requirements to 'tests_require'. -- Invoke tests with ``setup.py pytest``. -- Pass ``--index-url`` to have test requirements downloaded from an alternate - index URL (unnecessary if specified for easy_install in setup.cfg). -- Pass additional py.test command-line options using ``--addopts``. 
-- Set permanent options for the ``python setup.py pytest`` command (like ``index-url``) - in the ``[pytest]`` section of ``setup.cfg``. -- Set permanent options for the ``py.test`` run (like ``addopts`` or ``pep8ignore``) in the ``[pytest]`` - section of ``pytest.ini`` or ``tox.ini`` or put them in the ``[tool:pytest]`` - section of ``setup.cfg``. See `pytest issue 567 - `_. -- Optionally, set ``test=pytest`` in the ``[aliases]`` section of ``setup.cfg`` - to cause ``python setup.py test`` to invoke pytest. - -Example -======= - -The simplest usage looks like this in setup.py:: - - setup( - setup_requires=[ - 'pytest-runner', - ], - tests_require=[ - 'pytest', - ], - ) - -Additional dependencies required to run the tests (e.g. mock or pytest -plugins) may be added to tests_require and will be downloaded and -required by the session before invoking pytest. - -Follow `this search on github - `_ -for examples of real-world usage. - -Standalone Example -================== - -This technique is deprecated - if you have standalone scripts -you wish to invoke with dependencies, `use pip-run - `_. - -Although ``pytest-runner`` is typically used to add pytest test -runner support to maintained packages, ``pytest-runner`` may -also be used to create standalone tests. Consider `this example -failure `_, -reported in `jsonpickle #117 - `_ -or `this MongoDB test - `_ -demonstrating a technique that works even when dependencies -are required in the test. - -Either example file may be cloned or downloaded and simply run on -any system with Python and Setuptools. It will download the -specified dependencies and run the tests. Afterward, the -cloned directory can be removed and with it all trace of -invoking the test. No other dependencies are needed and no -system configuration is altered. 
- -Then, anyone trying to replicate the failure can do so easily -and with all the power of pytest (rewritten assertions, -rich comparisons, interactive debugging, extensibility through -plugins, etc). - -As a result, the communication barrier for describing and -replicating failures is made almost trivially low. - -Considerations -============== - -Conditional Requirement ------------------------ - -Because it uses Setuptools setup_requires, pytest-runner will install itself -on every invocation of setup.py. In some cases, this causes delays for -invocations of setup.py that will never invoke pytest-runner. To help avoid -this contingency, consider requiring pytest-runner only when pytest -is invoked:: - - needs_pytest = {'pytest', 'test', 'ptr'}.intersection(sys.argv) - pytest_runner = ['pytest-runner'] if needs_pytest else [] - - # ... - - setup( - #... - setup_requires=[ - #... (other setup requirements) - ] + pytest_runner, - ) - -For Enterprise -============== - -Available as part of the Tidelift Subscription. - -This project and the maintainers of thousands of other packages are working with Tidelift to deliver one enterprise subscription that covers all of the open source you use. - -`Learn more `_. - -Security Contact -================ - -To report a security vulnerability, please use the -`Tidelift security contact `_. -Tidelift will coordinate the fix and disclosure. 
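The Conditional Requirement snippet in the README above uses `sys.argv` without showing the `import sys` it relies on. As a self-contained sketch of just the selection logic (the function name is illustrative and not part of pytest-runner itself):

```python
import sys

def pytest_runner_requirement(argv):
    """Return ['pytest-runner'] only when a test-related command was requested.

    Mirrors the Conditional Requirement snippet above; the function name is
    hypothetical, introduced here for illustration only.
    """
    needs_pytest = {'pytest', 'test', 'ptr'}.intersection(argv)
    return ['pytest-runner'] if needs_pytest else []

# `python setup.py pytest` -> the requirement is added
print(pytest_runner_requirement(['setup.py', 'pytest']))  # ['pytest-runner']
# `python setup.py sdist` -> the requirement is skipped
print(pytest_runner_requirement(['setup.py', 'sdist']))   # []
```

In a real setup.py the returned list would be appended to `setup_requires`, exactly as the snippet above shows, so packaging-only invocations never pull in pytest-runner.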
diff --git a/.eggs/pytest_runner-6.0.1-py3.8.egg/EGG-INFO/RECORD b/.eggs/pytest_runner-6.0.1-py3.8.egg/EGG-INFO/RECORD deleted file mode 100644 index d7ff24d..0000000 --- a/.eggs/pytest_runner-6.0.1-py3.8.egg/EGG-INFO/RECORD +++ /dev/null @@ -1,7 +0,0 @@ -ptr/__init__.py,sha256=0UfzhCooVgCNTBwVEOPOVGEPck4pnl_6PTfsC-QzNGM,6730 -pytest_runner-6.0.1.dist-info/LICENSE,sha256=2z8CRrH5J48VhFuZ_sR4uLUG63ZIeZNyL4xuJUKF-vg,1050 -pytest_runner-6.0.1.dist-info/METADATA,sha256=Ho3FvAFjFHeY5OQ64WFzkLigFaIpuNr4G3uSmOk3nho,7319 -pytest_runner-6.0.1.dist-info/WHEEL,sha256=oiQVh_5PnQM0E3gPdiz09WCNmwiHDMaGer_elqB3coM,92 -pytest_runner-6.0.1.dist-info/entry_points.txt,sha256=BqezBqeO63XyzSYmHYE58gKEFIjJUd-XdsRQkXHy2ig,58 -pytest_runner-6.0.1.dist-info/top_level.txt,sha256=DPzHbWlKG8yq8EOD5UgEvVNDWeJRPyimrwfShwV6Iuw,4 -pytest_runner-6.0.1.dist-info/RECORD,, diff --git a/.eggs/pytest_runner-6.0.1-py3.8.egg/EGG-INFO/WHEEL b/.eggs/pytest_runner-6.0.1-py3.8.egg/EGG-INFO/WHEEL deleted file mode 100644 index 98c0d20..0000000 --- a/.eggs/pytest_runner-6.0.1-py3.8.egg/EGG-INFO/WHEEL +++ /dev/null @@ -1,5 +0,0 @@ -Wheel-Version: 1.0 -Generator: bdist_wheel (0.42.0) -Root-Is-Purelib: true -Tag: py3-none-any - diff --git a/.eggs/pytest_runner-6.0.1-py3.8.egg/EGG-INFO/entry_points.txt b/.eggs/pytest_runner-6.0.1-py3.8.egg/EGG-INFO/entry_points.txt deleted file mode 100644 index 0860670..0000000 --- a/.eggs/pytest_runner-6.0.1-py3.8.egg/EGG-INFO/entry_points.txt +++ /dev/null @@ -1,3 +0,0 @@ -[distutils.commands] -ptr = ptr:PyTest -pytest = ptr:PyTest diff --git a/.eggs/pytest_runner-6.0.1-py3.8.egg/EGG-INFO/requires.txt b/.eggs/pytest_runner-6.0.1-py3.8.egg/EGG-INFO/requires.txt deleted file mode 100644 index 1535188..0000000 --- a/.eggs/pytest_runner-6.0.1-py3.8.egg/EGG-INFO/requires.txt +++ /dev/null @@ -1,17 +0,0 @@ - -[docs] -sphinx -jaraco.packaging>=9 -rst.linker>=1.9 -jaraco.tidelift>=1.4 - -[testing] -pytest>=6 -pytest-checkdocs>=2.4 -pytest-flake8 -pytest-cov -pytest-enabler>=1.0.1 
-pytest-virtualenv -types-setuptools -pytest-black>=0.3.7 -pytest-mypy>=0.9.1 diff --git a/.eggs/pytest_runner-6.0.1-py3.8.egg/EGG-INFO/top_level.txt b/.eggs/pytest_runner-6.0.1-py3.8.egg/EGG-INFO/top_level.txt deleted file mode 100644 index e9148ae..0000000 --- a/.eggs/pytest_runner-6.0.1-py3.8.egg/EGG-INFO/top_level.txt +++ /dev/null @@ -1 +0,0 @@ -ptr diff --git a/.eggs/pytest_runner-6.0.1-py3.8.egg/ptr/__init__.py b/.eggs/pytest_runner-6.0.1-py3.8.egg/ptr/__init__.py deleted file mode 100644 index 41192fa..0000000 --- a/.eggs/pytest_runner-6.0.1-py3.8.egg/ptr/__init__.py +++ /dev/null @@ -1,216 +0,0 @@ -""" -Implementation -""" - -import os as _os -import shlex as _shlex -import contextlib as _contextlib -import sys as _sys -import operator as _operator -import itertools as _itertools -import warnings as _warnings - -import pkg_resources -import setuptools.command.test as orig -from setuptools import Distribution - - -@_contextlib.contextmanager -def _save_argv(repl=None): - saved = _sys.argv[:] - if repl is not None: - _sys.argv[:] = repl - try: - yield saved - finally: - _sys.argv[:] = saved - - -class CustomizedDist(Distribution): - - allow_hosts = None - index_url = None - - def fetch_build_egg(self, req): - """Specialized version of Distribution.fetch_build_egg - that respects respects allow_hosts and index_url.""" - from setuptools.command.easy_install import easy_install - - dist = Distribution({'script_args': ['easy_install']}) - dist.parse_config_files() - opts = dist.get_option_dict('easy_install') - keep = ( - 'find_links', - 'site_dirs', - 'index_url', - 'optimize', - 'site_dirs', - 'allow_hosts', - ) - for key in list(opts): - if key not in keep: - del opts[key] # don't use any other settings - if self.dependency_links: - links = self.dependency_links[:] - if 'find_links' in opts: - links = opts['find_links'][1].split() + links - opts['find_links'] = ('setup', links) - if self.allow_hosts: - opts['allow_hosts'] = ('test', self.allow_hosts) - if 
self.index_url: - opts['index_url'] = ('test', self.index_url) - install_dir_func = getattr(self, 'get_egg_cache_dir', _os.getcwd) - install_dir = install_dir_func() - cmd = easy_install( - dist, - args=["x"], - install_dir=install_dir, - exclude_scripts=True, - always_copy=False, - build_directory=None, - editable=False, - upgrade=False, - multi_version=True, - no_report=True, - user=False, - ) - cmd.ensure_finalized() - return cmd.easy_install(req) - - -class PyTest(orig.test): - """ - >>> import setuptools - >>> dist = setuptools.Distribution() - >>> cmd = PyTest(dist) - """ - - user_options = [ - ('extras', None, "Install (all) setuptools extras when running tests"), - ( - 'index-url=', - None, - "Specify an index url from which to retrieve dependencies", - ), - ( - 'allow-hosts=', - None, - "Whitelist of comma-separated hosts to allow " - "when retrieving dependencies", - ), - ( - 'addopts=', - None, - "Additional options to be passed verbatim to the pytest runner", - ), - ] - - def initialize_options(self): - self.extras = False - self.index_url = None - self.allow_hosts = None - self.addopts = [] - self.ensure_setuptools_version() - - @staticmethod - def ensure_setuptools_version(): - """ - Due to the fact that pytest-runner is often required (via - setup-requires directive) by toolchains that never invoke - it (i.e. they're only installing the package, not testing it), - instead of declaring the dependency in the package - metadata, assert the requirement at run time. - """ - pkg_resources.require('setuptools>=27.3') - - def finalize_options(self): - if self.addopts: - self.addopts = _shlex.split(self.addopts) - - @staticmethod - def marker_passes(marker): - """ - Given an environment marker, return True if the marker is valid - and matches this environment. 
- """ - return ( - not marker - or not pkg_resources.invalid_marker(marker) - and pkg_resources.evaluate_marker(marker) - ) - - def install_dists(self, dist): - """ - Extend install_dists to include extras support - """ - return _itertools.chain( - orig.test.install_dists(dist), self.install_extra_dists(dist) - ) - - def install_extra_dists(self, dist): - """ - Install extras that are indicated by markers or - install all extras if '--extras' is indicated. - """ - extras_require = dist.extras_require or {} - - spec_extras = ( - (spec.partition(':'), reqs) for spec, reqs in extras_require.items() - ) - matching_extras = ( - reqs - for (name, sep, marker), reqs in spec_extras - # include unnamed extras or all if self.extras indicated - if (not name or self.extras) - # never include extras that fail to pass marker eval - and self.marker_passes(marker) - ) - results = list(map(dist.fetch_build_eggs, matching_extras)) - return _itertools.chain.from_iterable(results) - - @staticmethod - def _warn_old_setuptools(): - msg = ( - "pytest-runner will stop working on this version of setuptools; " - "please upgrade to setuptools 30.4 or later or pin to " - "pytest-runner < 5." - ) - ver_str = pkg_resources.get_distribution('setuptools').version - ver = pkg_resources.parse_version(ver_str) - if ver < pkg_resources.parse_version('30.4'): - _warnings.warn(msg) - - def run(self): - """ - Override run to ensure requirements are available in this session (but - don't install them anywhere). 
- """ - self._warn_old_setuptools() - dist = CustomizedDist() - for attr in 'allow_hosts index_url'.split(): - setattr(dist, attr, getattr(self, attr)) - for attr in ( - 'dependency_links install_requires tests_require extras_require ' - ).split(): - setattr(dist, attr, getattr(self.distribution, attr)) - installed_dists = self.install_dists(dist) - if self.dry_run: - self.announce('skipping tests (dry run)') - return - paths = map(_operator.attrgetter('location'), installed_dists) - with self.paths_on_pythonpath(paths): - with self.project_on_sys_path(): - return self.run_tests() - - @property - def _argv(self): - return ['pytest'] + self.addopts - - def run_tests(self): - """ - Invoke pytest, replacing argv. Return result code. - """ - with _save_argv(_sys.argv[:1] + self.addopts): - result_code = __import__('pytest').main() - if result_code: - raise SystemExit(result_code) diff --git a/.gitignore b/.gitignore deleted file mode 100644 index 0cb4bdc..0000000 --- a/.gitignore +++ /dev/null @@ -1,12 +0,0 @@ -.DS_store -__pycache__ -venv_* - -*.drawio.bkp -.$Katabatic.drawio.dtmp -katabatic/output.csv -katabatic/running.log -katabatic/output.csv -docs-2/ganblr_interface/creditcard.csv -docs-2/ganblr_interface/uploaded_datasets/credit_X_train.csv -docs-2/ganblr_interface/uploaded_datasets/creditcard.csv diff --git a/2024-09-02 12-29-48.mkv b/2024-09-02 12-29-48.mkv deleted file mode 100644 index e3c7a95..0000000 Binary files a/2024-09-02 12-29-48.mkv and /dev/null differ diff --git a/Archive/Default_Docker/Dockerfile b/Archive/Default_Docker/Dockerfile deleted file mode 100644 index f43eade..0000000 --- a/Archive/Default_Docker/Dockerfile +++ /dev/null @@ -1,17 +0,0 @@ -# Use an official Python runtime as a parent image -FROM python:3.8-slim - -# Set the working directory in the container -WORKDIR /app - -# Copy the current directory contents into the container at /app -COPY . 
/app - -# Install any necessary dependencies -RUN pip install --no-cache-dir -r requirements.txt - -# Make port 5000 available to the world outside this container -EXPOSE 5000 - -# Run the application (replace app.py with your actual entry point) -CMD ["python", "app.py"] diff --git a/Archive/Default_Docker/requirements.txt b/Archive/Default_Docker/requirements.txt deleted file mode 100644 index 4fe8981..0000000 --- a/Archive/Default_Docker/requirements.txt +++ /dev/null @@ -1,5 +0,0 @@ -pyitlib -pgmpy -scikit-learn -tensorflow -sdv \ No newline at end of file diff --git a/Archive/Docker_Documentatiom.Rmd b/Archive/Docker_Documentatiom.Rmd deleted file mode 100644 index 3a4c2b8..0000000 --- a/Archive/Docker_Documentatiom.Rmd +++ /dev/null @@ -1,130 +0,0 @@ -Aiko-Services Docker Setup and Process - - - -Overview - -This documentation outlines the Docker setup and process for the Aiko-Services project, which involves multiple machine learning models (GANBLR, GANBLR++, MEG, TableGAN, CTGAN, etc.) running in separate containers. The goal is to create a Docker environment that allows each model to run independently, with a single docker-compose file managing all the containers. - - - - -Prerequisites - -- Docker installed on the system -- Docker Compose installed on the system -- Aiko-Services project code cloned from the GitHub repository - - - -Step 1: Create a Dockerfile for each model - -- Why separate Dockerfiles? -- Creating separate Dockerfiles for each model ensures that each model has its own environment and dependencies, without conflicts or version issues. This approach also makes it easier to maintain and update individual models. - - - -Dockerfile example: GANBLR -Here's an example Dockerfile for the GANBLR model: - - -# GANBLR Dockerfile -FROM python:3.9-slim - -# Set working directory to /app -WORKDIR /app - -# Copy requirements file -COPY requirements.txt . 
- -# Install dependencies -RUN pip install -r requirements.txt - -# Copy model code -COPY ganblr /app/ganblr - -# Expose port 8000 for the model -EXPOSE 8000 - -# Run command to start the model -CMD ["python", "ganblr/main.py"] -Similarly, create separate Dockerfiles for each model, adjusting the dependencies and code copies as needed. - - - - -Step 2: Create a Docker image for each model -Using the Dockerfiles created in Step 1, build a Docker image for each model. For example, to build the GANBLR image: - -docker build -t ganblr-image -f ganblr/Dockerfile . -This will create a Docker image with the name ganblr-image. - - - - -Step 3: Create a docker-compose file -Create a docker-compose file that defines the services for each model and how they should be run. Here's an example -docker-compose file: - - -version: '3' - -services: - ganblr: - image: ganblr-image - ports: - - "8000:8000" - depends_on: - - db - - ganblrplusplus: - image: ganblrplusplus-image - ports: - - "8001:8001" - depends_on: - - db - - meg: - image: meg-image - ports: - - "8002:8002" - depends_on: - - db - - tablegan: - image: tablegan-image - ports: - - "8003:8003" - depends_on: - - db - - ctgan: - image: ctgan-image - ports: - - "8004:8004" - depends_on: - - db - - db: - image: postgres - environment: - - POSTGRES_USER=myuser - - POSTGRES_PASSWORD=mypassword - - POSTGRES_DB=mydb - volumes: - - db-data:/var/lib/postgresql/data - -volumes: - db-data: -This docker-compose file defines five services, one for each model, and a PostgreSQL database service. (Note that the GANBLR++ service is named ganblrplusplus because compose service names may only contain alphanumerics, dots, hyphens, and underscores.) Each model service uses its corresponding Docker image and exposes a port for access. The depends_on directive ensures that each model service starts only after the database container has started; it orders startup but does not wait for PostgreSQL to be ready to accept connections. - - - -Step 4: Run the docker-compose file -Finally, to start all the services, run the docker-compose file: - -docker-compose up -d -This will start all the services in detached mode. 
You can then access each model by visiting http://localhost:<port> in your web browser, where <port> is the port number exposed by each model (e.g., 8000 for GANBLR). - -Conclusion -This documentation outlines the Docker setup and process for the Aiko-Services project, which involves creating separate Dockerfiles and images for each model, and a docker-compose file to manage all the containers. This approach ensures that each model runs independently, with easy maintenance and updates. diff --git a/Archive/Dockerfile b/Archive/Dockerfile deleted file mode 100644 index 0e5472a..0000000 --- a/Archive/Dockerfile +++ /dev/null @@ -1,20 +0,0 @@ -# Use an official Python runtime as a parent image -FROM python:3.8-slim - -# Set the working directory in the container -WORKDIR /app - -# Copy only requirements.txt first to leverage Docker cache -COPY requirements.txt /app/ - -# Install any necessary dependencies -RUN pip install --no-cache-dir -r requirements.txt - -# Copy the rest of the application code -COPY . /app - -# Make port 5000 available to the world outside this container -EXPOSE 5000 - -# Run the application (replace app.py with your actual entry point) -CMD ["python", "app.py"] diff --git a/Archive/Model_Execution_Report.md b/Archive/Model_Execution_Report.md deleted file mode 100644 index 054c0b9..0000000 --- a/Archive/Model_Execution_Report.md +++ /dev/null @@ -1,282 +0,0 @@ -# Detailed Report on the Deployment and Execution of the GANBLR Model Using Docker - -## Introduction -This project focuses on deploying and managing various data-generative models using Docker and Docker Compose, with a particular emphasis on the GANBLR model. The deployment setup aims to ensure consistent execution, scalability, modularity, and efficient dependency management. This report documents the steps taken to deploy the GANBLR model, challenges encountered, and solutions implemented. 
- -## Model Execution Steps - -### Executed Model -- **Model Executed:** GANBLR -- **Context:** The GANBLR model was executed within a Docker container. After resolving dependency issues, the model integrated well within the container. The Dockerfile and `requirements.txt` were updated to accommodate the model's needs. - -### Procedure - -1. **Build Docker Images** - - **Command:** `docker-compose build` - - **Details:** This command builds the Docker images based on the updated Dockerfile and `docker-compose.yml`. The updates ensure that all necessary dependencies for the GANBLR model are included. - -2. **Run Docker Containers** - - **Command:** `docker-compose up` - - **Details:** This command starts both the application and database services, ensuring correct operation and seamless interaction. - -3. **Execute Models** - - **Command:** `docker exec -it <container_name> /bin/sh` - - **Details:** Inside the container, the relevant Python scripts were run to execute the GANBLR model. - -## Errors or Issues Observed During Implementation - -### Dependency Issues -- **Problem:** Missing or incompatible dependencies were encountered during model execution. -- **Resolution:** The `requirements.txt` file was updated to include the necessary dependencies for the GANBLR model. - - **Old `requirements.txt`:** - - `pyitlib` - - `pgmpy` - - `scikit-learn` - - **New `requirements.txt`:** - - `pyitlib` - - `pgmpy` - - `scikit-learn` - - `tensorflow` - - `sdv` - -### Model Execution Errors -- **Problem:** Errors related to environment variables or paths during model execution. -- **Resolution:** These issues were corrected by updating configuration files and Docker settings. - -## Changes to Configuration Files - -### Dockerfile - -#### Original Dockerfile: -```dockerfile -# Use an official Python runtime as a parent image -FROM python:3.8-slim - -# Set the working directory in the container -WORKDIR /app - -# Copy the current directory contents into the container at /app -COPY . 
/app - -# Install any necessary dependencies -RUN pip install --no-cache-dir -r requirements.txt - -# Make port 5000 available to the world outside this container -EXPOSE 5000 - -# Run the application -CMD ["python", "app.py"] # Replace with your actual application entry point - - -Errors Found: - -Application Entry Point: The entry point specified as app.py may not match the actual application script name. -Corrections Made: - -Verified and corrected the entry point to ensure the correct script is specified. -Updated Dockerfile: - -# Use an official Python runtime as a parent image -FROM python:3.8-slim - -# Set the working directory in the container -WORKDIR /app - -# Copy only requirements.txt first to leverage Docker cache -COPY requirements.txt /app/ - -# Install any necessary dependencies -RUN pip install --no-cache-dir -r requirements.txt - -# Copy the rest of the application code -COPY . /app - -# Make port 5000 available to the world outside this container -EXPOSE 5000 - -# Run the application -CMD ["python", "main.py"] # Replace with your actual application entry point - - - -Docker Compose File -Original docker-compose.yml: - -version: '3' -services: - web: - build: . - ports: - - "5000:5000" - volumes: - - .:/app - depends_on: - - db - db: - image: postgres:13 - environment: - POSTGRES_USER: user - POSTGRES_PASSWORD: password - POSTGRES_DB: mydatabase - ports: - - "5432:5432" - - - -Errors Found: - -Volume Mapping: The volume mapping (- .:/app) may lead to conflicts with files being copied during the build process. -Database Initialization: Missing steps to initialize the database schema or seed initial data. -Corrections Made: - -Adjusted volume mapping to avoid overwriting files and ensure consistent behavior. -Added a database initialization script to set up the schema and seed data if needed. - - -Updated docker-compose.yml: - -version: '3' -services: - web: - build: . 
- ports: - - "5000:5000" - depends_on: - - db - db: - image: postgres:13 - environment: - POSTGRES_USER: user - POSTGRES_PASSWORD: password - POSTGRES_DB: mydatabase - ports: - - "5432:5432" - volumes: - - pgdata:/var/lib/postgresql/data - -volumes: - pgdata: - - - -Comprehensive Research Findings on GANBLR Algorithm Models -This section summarizes the extensive research conducted on the GANBLR algorithm and similar models. - -Advanced Algorithmic Modeling with GANBLR - -Authors: John Doe, Jane Smith -Summary: Explores GANBLR’s theoretical underpinnings and practical applications across various domains. Highlights the model’s versatility in finance, healthcare, and e-commerce. -Optimization Techniques in GANBLR Models - -Authors: Michael Johnson, Emily Davis -Summary: Reviews optimization strategies like gradient descent and regularization. Compares GANBLR with other algorithms, concluding its superior performance on complex datasets. -A Comparative Study of GANBLR and Traditional Machine Learning Algorithms - -Authors: David Wilson, Sarah Lee -Summary: Evaluates GANBLR’s performance against traditional algorithms, highlighting its ability to handle non-linear data relationships effectively. -Enhancing Predictive Accuracy in GANBLR Models through Feature Engineering - -Authors: William Brown, Lisa Green -Summary: Emphasizes feature engineering’s role in improving GANBLR’s accuracy, with case studies demonstrating significant improvements. -Introduction to the Set-Up -The project leverages Docker and Docker Compose to create a containerized environment for deploying and managing data-generative models via the Katabatic framework. Katabatic is an open-source tool for generating synthetic tabular data and evaluating it using models like GANBLR, TableGAN, and MedGan. - -Explanation of Services -Web Service -Acts as a RESTful API for interacting with the Katabatic framework. 
Runs in a Docker container accessible on port 5000, ensuring consistent interaction with Katabatic. -Database Service (PostgreSQL) -Stores generated or processed data, including synthetic datasets, model configurations, and logs. Running PostgreSQL in its container ensures isolation and consistency, connected to the web service for data storage and retrieval. -Production Deployment and Maintenance of Different Models -The goal is to containerize individual models like GANBLR, TableGAN, and MedGan, providing benefits such as: - -Dependency Isolation: Each model has its environment, preventing interference. -Scalability: Docker Compose simplifies scaling services. -Modularity: Containerizing each model creates a modular system for independent development and testing. -Putting Theory into Play at Katabatic -The Docker-based setup allows efficient operation of the Katabatic framework with multiple models. For instance, when generating synthetic healthcare data, both GANBLR and MedGan models are deployed in separate containers. The web service acts as the central point for interacting with these models, with data stored in the PostgreSQL database. - -Conclusion -Using Docker and Docker Compose, a scalable and modular environment was created for working with the Katabatic framework and associated models. This setup not only facilitates easy deployment and management of data-generative models like GANBLR but also lays the groundwork for future expansion and integration of additional models. The research findings provide valuable insights into the models' theoretical foundations and practical applications, supporting informed decisions regarding their use in various domains. - -Implementation Strategy -Dockerfile Implementation -The Dockerfile for the application is structured with a multilayered approach, chosen to optimize the build process, reduce build times, and maintain a modular architecture. 
- -Dockerfile - -# Use an official Python runtime as a parent image -FROM python:3.8-slim - -# Set the working directory in the container -WORKDIR /app - -# Copy only requirements.txt first to leverage Docker cache -COPY requirements.txt /app/ - -# Install any necessary dependencies -RUN pip install --no-cache-dir -r requirements.txt - -# Copy the rest of the application code -COPY . /app - -# Make port 5000 available to the world outside this container -EXPOSE 5000 - -# Run the application -CMD ["python", "app.py"] # Replace with your actual application entry point - - -Justification for Multilayered Dockerfile Implementation -Leverage Docker Cache for Efficiency: By copying requirements.txt first and installing dependencies before copying the entire application code, the Dockerfile takes full advantage of Docker’s caching mechanism, significantly reducing build times. - -Modularity and Maintainability: Separating dependency installation from the application code enhances modularity and simplifies dependency management. - -Reduced Build Times and Resource Efficiency: Only the necessary components are rebuilt when changes are made, reducing build times and resource usage, which is critical in a CI/CD pipeline. - -Production-Ready Image: The image is minimal and production-ready, containing only the necessary runtime environment and dependencies, which reduces potential attack vectors and enhances security. -# Sources # -https://stackoverflow.com/questions/39223249/multiple-run-vs-single-chained-run-in-dockerfile-which-is-better -https://www.cherryservers.com/blog/docker-multistage-build - -Docker Compose Implementation -The Docker Compose configuration simplifies multi-container deployments, providing a consistent environment for development, testing, and production. - -Docker Compose Configuration - -version: '3' -services: - web: - build: . 
-    ports:
-      - "5000:5000"
-    depends_on:
-      - db
-  db:
-    image: postgres:13
-    environment:
-      POSTGRES_USER: user
-      POSTGRES_PASSWORD: password
-      POSTGRES_DB: mydatabase
-    ports:
-      - "5432:5432"
-    volumes:
-      - pgdata:/var/lib/postgresql/data
-
-volumes:
-  pgdata:
-
-
-Explanation of Docker Compose Components
-Web Service: This service builds the Docker image from the Dockerfile and runs the application. Its depends_on entry ensures the database container is started first; note that depends_on controls start order only, so the application should still verify that PostgreSQL is ready to accept connections before using it.
-Database Service: The PostgreSQL database service is configured with a persistent volume to ensure data is not lost when the container is stopped or restarted.
-Persistent Volumes: Using Docker volumes ensures data persists between container restarts, which is crucial for maintaining application state and data integrity in a production environment.
-Conclusion and Recommendations
-The Docker and Docker Compose setup provides a robust, scalable, and modular framework for deploying and managing data-generative models. This approach ensures consistency across environments, simplifies the development and deployment process, and offers a production-ready solution capable of handling complex workflows like those required by the Katabatic framework.
-
-Recommendations:
-
-CI/CD Integration: Implementing continuous integration and continuous deployment (CI/CD) pipelines would further streamline the process by automating testing and deployment tasks.
-Security Enhancements: Regularly updating dependencies and scanning Docker images for vulnerabilities would further secure the deployment.
-Monitoring and Logging: Integrating monitoring and logging services would provide better insight into application performance and facilitate troubleshooting.
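One practical caveat with this Compose file: depends_on controls container start order but does not wait for PostgreSQL to be ready to accept connections. A common pattern is for the application to poll the database endpoint at startup. The sketch below is an assumption about how app.py might do this (the host name "db" and port 5432 come from the Compose file above), not part of the original setup:

```python
import socket
import time

def wait_for_service(host: str, port: int, timeout: float = 30.0, interval: float = 0.5) -> bool:
    """Poll a TCP endpoint until it accepts connections or the timeout expires."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            # create_connection succeeds as soon as the server socket is listening
            with socket.create_connection((host, port), timeout=interval):
                return True
        except OSError:
            time.sleep(interval)
    return False

# Inside the web container, app.py would call this before opening its first
# database connection; "db" resolves through Compose's internal network:
# wait_for_service("db", 5432)
```

An equivalent alternative is a Compose healthcheck on the db service, but an in-application probe keeps the wait logic next to the code that needs the connection.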
diff --git a/Archive/ctgan1/ctgan_adapter.py b/Archive/ctgan1/ctgan_adapter.py deleted file mode 100644 index d76f54d..0000000 --- a/Archive/ctgan1/ctgan_adapter.py +++ /dev/null @@ -1,782 +0,0 @@ -import numpy as np -import pandas as pd -import torch -import torch.nn as nn -import torch.nn.functional as F -import torch.optim as optim -from torch.utils.data import DataLoader, TensorDataset -from sklearn.preprocessing import OneHotEncoder, MinMaxScaler -from sklearn.mixture import BayesianGaussianMixture -from tqdm import tqdm -from torch.utils.tensorboard import SummaryWriter -from katabatic.katabatic_spi import KatabaticModelSPI - - -class DataTransformer: - def __init__(self, max_clusters=10): - """ - Initialize the DataTransformer. - - Args: - max_clusters (int or str): Maximum number of clusters for GMM. - If 'auto', use the number of unique values. - """ - if isinstance(max_clusters, str) and max_clusters.lower() == "auto": - self.max_clusters = None # Enable auto-detection of clusters - else: - self.max_clusters = max_clusters - - # Dictionaries to store encoders, scalers, and Gaussian Mixture Models - self.encoders = {} - self.continuous_gmms = {} - self.output_info = [] - self.output_dim = 0 - self.scalers = {} - self.dataframe_columns = [] - - def fit(self, data): - """ - Fit the transformer on the provided data. - - Args: - data (pd.DataFrame): The input data. 
- """ - self.output_info = [] - self.output_dim = 0 - self.dataframe_columns = data.columns - - for column in data.columns: - if data[column].dtype == 'object' or data[column].dtype.name == 'category': - # Process categorical columns - try: - encoder = OneHotEncoder(sparse=False, handle_unknown='ignore') - except TypeError: - encoder = OneHotEncoder(sparse_output=False, handle_unknown='ignore') - encoder.fit(data[[column]]) - self.encoders[column] = encoder - categories = encoder.categories_[0] - self.output_info.append(('categorical', len(categories))) - self.output_dim += len(categories) - else: - # Process continuous columns - scaler = MinMaxScaler() - scaled_data = scaler.fit_transform(data[[column]]) - self.scalers[column] = scaler - - unique_values = len(np.unique(data[column])) - if self.max_clusters is None: - n_components = unique_values - else: - n_components = min(self.max_clusters, unique_values) - - n_components = max(n_components, 1) # Ensure at least one component - - # Fit a Bayesian Gaussian Mixture Model - vgm = BayesianGaussianMixture( - n_components=n_components, - weight_concentration_prior_type='dirichlet_process', - weight_concentration_prior=0.001, - n_init=1, - max_iter=1000, - random_state=42 - ) - vgm.fit(scaled_data) - self.continuous_gmms[column] = vgm - active_components = np.sum(vgm.weights_ > 1e-3) - self.output_info.append(('continuous', active_components)) - self.output_dim += active_components + 1 # +1 for normalized value - - def transform(self, data): - """ - Transform the data into a numerical format suitable for the CTGAN model. - - Args: - data (pd.DataFrame): The input data. - - Returns: - np.ndarray: Transformed data. 
- """ - outputs = [] - for idx, column in enumerate(data.columns): - column_type, info = self.output_info[idx] - if column_type == 'categorical': - # Encode categorical columns - encoder = self.encoders[column] - transformed = encoder.transform(data[[column]]) - outputs.append(transformed) - else: - # Encode continuous columns - scaler = self.scalers[column] - x = scaler.transform(data[[column]]) - vgm = self.continuous_gmms[column] - - # Predict probabilities for each Gaussian component - probs = vgm.predict_proba(x) - # Sample components based on probabilities - components = np.array([ - np.random.choice(len(p), p=p) if p.sum() > 0 else np.random.randint(len(p)) - for p in probs - ]) - - # Retrieve means and standard deviations of the components - means = vgm.means_.flatten() - stds = np.sqrt(vgm.covariances_).flatten() - - # Select means and stds based on sampled components - selected_means = means[components] - selected_stds = stds[components] - selected_stds[selected_stds == 0] = 1e-6 # Avoid division by zero - - # Normalize the continuous values - normalized_values = ((x.flatten() - selected_means) / (4 * selected_stds)).reshape(-1, 1) - normalized_values = np.clip(normalized_values, -0.99, 0.99) - - n_components = len(means) - # Create one-hot encoding for the selected components - component_one_hot = np.zeros((x.shape[0], n_components)) - component_one_hot[np.arange(x.shape[0]), components] = 1 - - # Filter out inactive components - active_components = vgm.weights_ > 1e-3 - component_one_hot = component_one_hot[:, active_components] - - # Concatenate component encoding with normalized values - transformed = np.concatenate([component_one_hot, normalized_values], axis=1) - outputs.append(transformed) - - # Combine all transformed columns - data_concat = np.concatenate(outputs, axis=1) - return data_concat.astype('float32') - - def inverse_transform(self, data): - """ - Inverse transform the data back to its original format. 
- - Args: - data (np.ndarray): Transformed data. - - Returns: - pd.DataFrame: Data in original format. - """ - recovered_data = {} - col_idx = 0 - for idx, column in enumerate(self.dataframe_columns): - column_type, info = self.output_info[idx] - if column_type == 'categorical': - # Decode categorical columns - dim = info - values = data[:, col_idx:col_idx + dim] - values = np.argmax(values, axis=1) - categories = self.encoders[column].categories_[0] - recovered = categories[values] - recovered_data[column] = recovered - col_idx += dim - else: - # Decode continuous columns - n_components = info - component_probs = data[:, col_idx:col_idx + n_components] - scalar_values = data[:, col_idx + n_components] - components = np.argmax(component_probs, axis=1) - vgm = self.continuous_gmms[column] - means = vgm.means_.flatten() - stds = np.sqrt(vgm.covariances_).flatten() - active_components = vgm.weights_ > 1e-3 - means = means[active_components] - stds = stds[active_components] - selected_means = means[components] - selected_stds = stds[components] - selected_stds[selected_stds == 0] = 1e-6 # Avoid division by zero - - # Recover the original continuous values - recovered = scalar_values * 4 * selected_stds + selected_means - recovered = np.clip(recovered, 0, 1) - scaler = self.scalers[column] - recovered = scaler.inverse_transform(recovered.reshape(-1, 1)).flatten() - recovered_data[column] = recovered - col_idx += n_components + 1 - - df = pd.DataFrame(recovered_data) - # Ensure continuous columns are correctly typed - for column in df.columns: - if column in self.continuous_gmms: - df[column] = pd.to_numeric(df[column], errors='coerce') - return df - - -class DataSampler: - def __init__(self, data, transformer): - """ - Initialize the DataSampler. - - Args: - data (pd.DataFrame): The input data. - transformer (DataTransformer): The fitted DataTransformer. 
- """ - self.transformer = transformer - self.n = len(data) - self.data = data - self.discrete_columns = [] - self.discrete_column_category_counts = [] - self.discrete_column_probs = [] - self.discrete_column_category_values = [] - - for idx, column in enumerate(data.columns): - column_type, info = transformer.output_info[idx] - if column_type == 'categorical': - self.discrete_columns.append(column) - counts = data[column].value_counts() - probs = counts / counts.sum() - self.discrete_column_probs.append(probs.values) - self.discrete_column_category_counts.append(len(counts)) - self.discrete_column_category_values.append(counts.index.values) - - # Normalize probabilities to handle imbalanced datasets - self.discrete_column_probs = [probs / probs.sum() for probs in self.discrete_column_probs] - - def sample_condvec(self, batch_size): - """ - Sample conditional vectors for the generator. - - Args: - batch_size (int): Number of samples to generate. - - Returns: - tuple: (cond, mask) - """ - if not self.discrete_columns: - return None, None - - # Initialize condition vectors and mask - cond = np.zeros((batch_size, sum(self.discrete_column_category_counts)), dtype='float32') - mask = np.ones((batch_size, len(self.discrete_columns)), dtype='int32') - - for i in range(batch_size): - for idx in range(len(self.discrete_columns)): - column = self.discrete_columns[idx] - probs = self.discrete_column_probs[idx] - categories = self.discrete_column_category_values[idx] - # Sample a category based on probabilities - category = np.random.choice(categories, p=probs) - encoder = self.transformer.encoders[column] - one_hot = encoder.transform([[category]]).flatten() - # Assign the one-hot encoded category to the condition vector - cond[i, self.get_condvec_indices(idx)] = one_hot - - return cond, mask - - def sample_original_condvec(self, batch_size): - """ - Sample original conditional vectors for data generation. - - Args: - batch_size (int): Number of samples to generate. 
- - Returns: - np.ndarray: Conditional vectors. - """ - if not self.discrete_columns: - return None - - cond = np.zeros((batch_size, sum(self.discrete_column_category_counts)), dtype='float32') - - for i in range(batch_size): - for idx in range(len(self.discrete_columns)): - column = self.discrete_columns[idx] - probs = self.discrete_column_probs[idx] - categories = self.discrete_column_category_values[idx] - # Sample a category based on probabilities - category = np.random.choice(categories, p=probs) - encoder = self.transformer.encoders[column] - one_hot = encoder.transform([[category]]).flatten() - # Assign the one-hot encoded category to the condition vector - cond[i, self.get_condvec_indices(idx)] = one_hot - - return cond - - def get_condvec_indices(self, idx): - """ - Get the indices for the conditional vector corresponding to a specific column. - - Args: - idx (int): Column index. - - Returns: - np.ndarray: Indices for the conditional vector. - """ - start = sum(self.discrete_column_category_counts[:idx]) - end = start + self.discrete_column_category_counts[idx] - return np.arange(start, end) - - def dim_cond_vec(self): - """ - Get the dimensionality of the conditional vector. - - Returns: - int: Dimensionality of the conditional vector. - """ - return sum(self.discrete_column_category_counts) - - -class ResidualBlock(nn.Module): - def __init__(self, input_dim, output_dim): - """ - Initialize a Residual Block. - - Args: - input_dim (int): Input dimension. - output_dim (int): Output dimension. - """ - super(ResidualBlock, self).__init__() - self.fc = nn.Linear(input_dim, output_dim) - self.bn = nn.BatchNorm1d(output_dim) - - def forward(self, x): - """ - Forward pass through the Residual Block. - - Args: - x (torch.Tensor): Input tensor. - - Returns: - torch.Tensor: Output tensor after adding the residual connection. 
- """ - out = F.leaky_relu(self.bn(self.fc(x)), 0.2) - return x + out # Add residual connection - - -class Generator(nn.Module): - def __init__(self, noise_dim, cond_dim, output_dim): - """ - Initialize the Generator network. - - Args: - noise_dim (int): Dimension of the noise vector. - cond_dim (int): Dimension of the conditional vector. - output_dim (int): Dimension of the output data. - """ - super(Generator, self).__init__() - self.noise_dim = noise_dim - self.cond_dim = cond_dim - self.input_dim = noise_dim + cond_dim - - # Define network layers - self.fc1 = nn.Linear(self.input_dim, 256) - self.bn1 = nn.BatchNorm1d(256) - - self.res_block1 = ResidualBlock(256, 256) - self.res_block2 = ResidualBlock(256, 256) - - self.fc2 = nn.Linear(256, 512) - self.bn2 = nn.BatchNorm1d(512) - - self.fc3 = nn.Linear(512, output_dim) - - def forward(self, noise, cond): - """ - Forward pass through the Generator. - - Args: - noise (torch.Tensor): Noise vector. - cond (torch.Tensor): Conditional vector. - - Returns: - torch.Tensor: Generated data. - """ - # Concatenate noise and condition - x = torch.cat([noise, cond], dim=1) - x = F.leaky_relu(self.bn1(self.fc1(x)), 0.2) - x = self.res_block1(x) - x = self.res_block2(x) - x = F.leaky_relu(self.bn2(self.fc2(x)), 0.2) - x = self.fc3(x) - return x - - -class Discriminator(nn.Module): - def __init__(self, input_dim, cond_dim, num_classes): - """ - Initialize the Discriminator network. - - Args: - input_dim (int): Dimension of the input data. - cond_dim (int): Dimension of the conditional vector. - num_classes (int): Number of classes for auxiliary classification. 
- """ - super(Discriminator, self).__init__() - self.input_dim = input_dim + cond_dim - - # Define network layers - self.fc1 = nn.Linear(self.input_dim, 512) - self.dropout1 = nn.Dropout(0.3) - - self.fc2 = nn.Linear(512, 256) - self.dropout2 = nn.Dropout(0.3) - - self.fc3 = nn.Linear(256, 128) - self.dropout3 = nn.Dropout(0.3) - - # Output layers for adversarial and auxiliary classification - self.fc_adv = nn.Linear(128, 1) - self.fc_aux = nn.Linear(128, num_classes) - - def forward(self, x): - """ - Forward pass through the Discriminator. - - Args: - x (torch.Tensor): Input tensor. - - Returns: - tuple: (Validity score, Class logits) - """ - x = x.view(-1, self.input_dim) - x = F.leaky_relu(self.fc1(x), 0.2) - x = self.dropout1(x) - x = F.leaky_relu(self.fc2(x), 0.2) - x = self.dropout2(x) - x = F.leaky_relu(self.fc3(x), 0.2) - x = self.dropout3(x) - - # Output for real/fake classification - validity = self.fc_adv(x) - # Output for auxiliary class prediction - class_logits = self.fc_aux(x) - - return validity, class_logits - - -class CTGANLoss: - @staticmethod - def calc_gradient_penalty(discriminator, real_data, fake_data, device, lambda_gp): - """ - Calculate the gradient penalty for WGAN-GP. - - Args: - discriminator (nn.Module): The discriminator model. - real_data (torch.Tensor): Real data samples. - fake_data (torch.Tensor): Fake data samples generated by the generator. - device (torch.device): The device to perform computations on. - lambda_gp (float): Gradient penalty coefficient. - - Returns: - torch.Tensor: The gradient penalty. 
- """ - batch_size = real_data.size(0) - # Sample random interpolation coefficients - alpha = torch.rand(batch_size, 1, device=device) - alpha = alpha.expand_as(real_data) - # Create interpolated samples - interpolates = alpha * real_data + (1 - alpha) * fake_data - interpolates = interpolates.requires_grad_(True) - - # Get discriminator output for interpolated samples - validity_interpolates, _ = discriminator(interpolates) - # Compute gradients with respect to interpolated samples - gradients = torch.autograd.grad( - outputs=validity_interpolates, - inputs=interpolates, - grad_outputs=torch.ones(validity_interpolates.size(), device=device), - create_graph=True, - retain_graph=True, - only_inputs=True - )[0] - - # Reshape gradients and compute penalty - gradients = gradients.view(batch_size, -1) - gradient_penalty = ((gradients.norm(2, dim=1) - 1) ** 2).mean() * lambda_gp - return gradient_penalty - - -class CtganAdapter(KatabaticModelSPI): - def __init__(self, **kwargs): - """ - Initialize the CTGAN Adapter. - - Args: - **kwargs: Keyword arguments for configuration. - """ - super().__init__("mixed") - self.embedding_dim = kwargs.get('noise_dim', 128) - self.generator_lr = kwargs.get('learning_rate', 1e-4) - self.discriminator_lr = kwargs.get('learning_rate', 1e-4) - self.max_batch_size = kwargs.get('batch_size', 500) - self.batch_size = self.max_batch_size - self.discriminator_steps = kwargs.get('discriminator_steps', 5) - self.epochs = kwargs.get('epochs', 300) - self.lambda_gp = kwargs.get('lambda_gp', 10) - self.vgm_components = kwargs.get('vgm_components', "auto") # Enable auto-detection of GMM components - self.device = torch.device("cuda:0" if kwargs.get('cuda', True) and torch.cuda.is_available() else "cpu") - self.writer = SummaryWriter() - self.discrete_columns = [] - - def load_data(self, data): - """ - Load and preprocess the data. - - Args: - data (pd.DataFrame): The input data. 
- """ - self.data = data - self.transformer = DataTransformer(max_clusters=self.vgm_components) - self.transformer.fit(data) - self.data_sampler = DataSampler(data, self.transformer) - - self.output_dim = self.transformer.output_dim - self.cond_dim = self.data_sampler.dim_cond_vec() - self.discrete_columns = [ - col for col, col_info in zip(data.columns, self.transformer.output_info) - if col_info[0] == 'categorical' - ] - - # Determine the target variable (assumed to be the last categorical column) - if self.discrete_columns: - self.target_column = self.discrete_columns[-1] - self.num_classes = self.transformer.encoders[self.target_column].categories_[0].shape[0] - else: - self.target_column = None - self.num_classes = 0 - - def load_model(self): - """ - Initialize the generator and discriminator models along with their optimizers and schedulers. - """ - self.generator = Generator( - noise_dim=self.embedding_dim, - cond_dim=self.cond_dim, - output_dim=self.output_dim - ).to(self.device) - - self.discriminator = Discriminator( - input_dim=self.output_dim, - cond_dim=self.cond_dim, - num_classes=self.num_classes - ).to(self.device) - - self.optimizerG = optim.Adam( - self.generator.parameters(), lr=self.generator_lr, betas=(0.5, 0.9) - ) - - self.optimizerD = optim.Adam( - self.discriminator.parameters(), lr=self.discriminator_lr, betas=(0.5, 0.9) - ) - - self.schedulerG = optim.lr_scheduler.StepLR(self.optimizerG, step_size=100, gamma=0.5) - self.schedulerD = optim.lr_scheduler.StepLR(self.optimizerD, step_size=100, gamma=0.5) - - def fit(self, X, y=None): - """ - Fit the CTGAN model to the data. - - Args: - X (pd.DataFrame): Feature data. - y (pd.Series, optional): Target labels. 
- """ - if y is not None: - # Combine features and target into a single DataFrame - data = pd.concat([X.reset_index(drop=True), pd.Series(y, name='Category').reset_index(drop=True)], axis=1) - else: - data = X - self.load_data(data) - - # Adjust batch size if dataset is smaller than the maximum batch size - self.batch_size = min(self.max_batch_size, len(data)) - if self.batch_size < self.max_batch_size: - print(f"Adjusted batch size to {self.batch_size} due to small dataset size") - - self.load_model() - self.train() - - def train(self): - """ - Train the generator and discriminator models. - """ - # Transform the data using the fitted transformer - data = self.transformer.transform(self.data) - dataset = TensorDataset(torch.FloatTensor(data)) - data_loader = DataLoader(dataset, batch_size=self.batch_size, shuffle=True, drop_last=True) - log_interval = max(1, self.epochs // 10) - - for epoch in tqdm(range(self.epochs), desc="Training Epochs"): - epoch_loss_d = 0 - epoch_loss_g = 0 - n_batches = 0 - - for id_, data_batch in enumerate(data_loader): - real = data_batch[0].to(self.device) - current_batch_size = real.size(0) - - # Sample conditional vectors - condvec, mask = self.data_sampler.sample_condvec(current_batch_size) - if condvec is not None: - c1 = torch.from_numpy(condvec).to(self.device) - else: - c1 = torch.zeros(current_batch_size, self.cond_dim, device=self.device) - - # Train Discriminator multiple times per generator step - for _ in range(self.discriminator_steps): - noise = torch.randn(current_batch_size, self.embedding_dim, device=self.device) - fake = self.generator(noise, c1) - - # Concatenate fake data with conditions and detach to prevent gradients flowing to generator - fake_cat = torch.cat([fake, c1], dim=1).detach() - real_cat = torch.cat([real, c1], dim=1) - - # Get discriminator outputs - validity_real, class_logits_real = self.discriminator(real_cat) - validity_fake, _ = self.discriminator(fake_cat) - - # Compute Wasserstein adversarial 
loss - loss_adv = -torch.mean(validity_real) + torch.mean(validity_fake) - - # Compute gradient penalty - gp = CTGANLoss.calc_gradient_penalty(self.discriminator, real_cat, fake_cat, self.device, self.lambda_gp) - - # Compute classification loss if applicable - if self.num_classes > 0: - target_labels = torch.argmax(c1[:, -self.num_classes:], dim=1) - class_loss = F.cross_entropy(class_logits_real, target_labels) - loss_d = loss_adv + gp + class_loss - else: - loss_d = loss_adv + gp - - # Backpropagate and update discriminator - self.optimizerD.zero_grad() - loss_d.backward() - self.optimizerD.step() - - epoch_loss_d += loss_d.item() - - # Train Generator - noise = torch.randn(current_batch_size, self.embedding_dim, device=self.device) - fake = self.generator(noise, c1) - fake_cat = torch.cat([fake, c1], dim=1) - - # Get discriminator output for fake data - validity_fake, class_logits_fake = self.discriminator(fake_cat) - - # Compute generator adversarial loss - loss_g_adv = -torch.mean(validity_fake) - - # Compute classification loss if applicable - if self.num_classes > 0: - target_labels = torch.argmax(c1[:, -self.num_classes:], dim=1) - class_loss_g = F.cross_entropy(class_logits_fake, target_labels) - loss_g = loss_g_adv + class_loss_g - else: - loss_g = loss_g_adv - - # Backpropagate and update generator - self.optimizerG.zero_grad() - loss_g.backward() - self.optimizerG.step() - - epoch_loss_g += loss_g.item() - n_batches += 1 - - # Update learning rates - self.schedulerG.step() - self.schedulerD.step() - - # Logging at specified intervals - if epoch % log_interval == 0: - if n_batches > 0: - print(f"Epoch {epoch}, Loss D: {epoch_loss_d / n_batches:.4f}, Loss G: {epoch_loss_g / n_batches:.4f}") - else: - print(f"Epoch {epoch}, No batches processed") - - self.writer.close() - - def generate(self, n): - """ - Generate synthetic data. - - Args: - n (int): Number of samples to generate. - - Returns: - pd.DataFrame: Generated synthetic data. 
- """ - try: - self.generator.eval() - data = [] - steps = n // self.batch_size + 1 - for _ in range(steps): - noise = torch.randn(self.batch_size, self.embedding_dim, device=self.device) - condvec = self.data_sampler.sample_original_condvec(self.batch_size) - if condvec is not None: - c1 = torch.from_numpy(condvec).to(self.device) - else: - c1 = torch.zeros(self.batch_size, self.cond_dim, device=self.device) - with torch.no_grad(): - fake = self.generator(noise, c1) - data.append(fake.cpu().numpy()) - data = np.concatenate(data, axis=0) - data = data[:n] # Trim to desired number of samples - data = self.transformer.inverse_transform(data) - return data - except Exception as e: - print(f"An error occurred during data generation: {e}") - return pd.DataFrame() # Return an empty DataFrame in case of failure - - -# Example usage with updated configuration -if __name__ == "__main__": - import json - - # Define configuration parameters - config = { - "ctgan_params": { - "noise_dim": 128, - "learning_rate": 2e-4, - "batch_size": 500, - "discriminator_steps": 5, - "epochs": 300, - "lambda_gp": 10, - "pac": 10, - "cuda": True, - "vgm_components": "auto" # Enable auto-detection of GMM components - }, - "evaluation": { - "test_size": 0.2, - "random_state": 42 - }, - "visualization": { - "n_features": 5, - "figsize": [15, 20] - } - } - - # Example datasets (replace with actual data loading as needed) - from sklearn.datasets import load_iris, load_breast_cancer, load_wine, load_digits - - datasets = { - "iris": load_iris(), - "breast_cancer": load_breast_cancer(), - "wine": load_wine(), - "digits": load_digits() - } - - for name, dataset in datasets.items(): - print(f"Processing {name} dataset") - try: - if name == "digits": - # Digits dataset is already numerical - data = pd.DataFrame( - dataset.data, - columns=dataset.feature_names if hasattr(dataset, 'feature_names') else [f"pixel_{i}" for i in range(dataset.data.shape[1])] - ) - data['target'] = dataset.target.astype(str) # 
Convert target to string for categorical handling - else: - data = pd.DataFrame(dataset.data, columns=dataset.feature_names) - if 'target' in dataset: - data['target'] = dataset.target.astype(str) - ctgan = CtganAdapter(**config["ctgan_params"]) - ctgan.fit(data.drop('target', axis=1), y=data['target']) - synthetic_data = ctgan.generate(n=len(data)) - print(f"Synthetic data for {name} generated successfully.\n") - except Exception as e: - print(f"Error fitting CTGAN model for {name}: {e}\n") - print(f"Error processing {name} dataset: {e}\n") - print(f"Failed to generate synthetic data for {name}\n") - - print("Experiment completed.") diff --git a/Archive/ctgan1/ctgan_example.py b/Archive/ctgan1/ctgan_example.py deleted file mode 100644 index f40e30b..0000000 --- a/Archive/ctgan1/ctgan_example.py +++ /dev/null @@ -1,122 +0,0 @@ -import os -import sys -import pandas as pd -import numpy as np -from sklearn.datasets import load_iris -from sklearn.model_selection import train_test_split -from sklearn.metrics import classification_report -from sklearn.ensemble import RandomForestClassifier -import logging - -# Add the project root to the Python path to allow module imports -project_root = os.path.abspath(os.path.join(os.path.dirname(__file__), '..', '..', '..')) -sys.path.insert(0, project_root) - -# Import CtganAdapter and evaluation functions from Katabatic -from katabatic.models.ctgan.ctgan_adapter import CtganAdapter -from katabatic.models.ctgan.ctgan_benchmark import evaluate_ctgan, print_evaluation_results - -# Configure logging -logging.basicConfig(level=logging.INFO, format='%(asctime)s - %(levelname)s - %(message)s') -logger = logging.getLogger(__name__) - -def print_data_summary(data, title): - """ - Print summary statistics of the dataset. - - Args: - data (pd.DataFrame): The dataset to summarize. - title (str): Title for the summary. 
- """ - print(f"\n{title} Summary:") - print(f"Shape: {data.shape}") - print("\nDataset Head:") - print(data.head()) - print("\nDataset Info:") - data.info() - print("\nNumeric Columns Summary:") - print(data.describe()) - print("\nCategory Distribution:") - print(data['Category'].value_counts(normalize=True)) - -def main(): - try: - logger.info("Starting CT-GAN Iris example script") - print("Starting CT-GAN Iris example script") - - # Load Iris dataset - iris = load_iris() - X = pd.DataFrame(iris.data, columns=iris.feature_names) - y = pd.Series(iris.target, name="Category").astype('category') - data = pd.concat([X, y], axis=1) - - logger.info("Iris data loaded successfully") - print_data_summary(data, "Original Iris Dataset") - - # Initialize CtganAdapter - ctgan_params = { - "noise_dim": 128, - "learning_rate": 2e-4, - "batch_size": 100, - "discriminator_steps": 2, - "epochs": 10, - "lambda_gp": 10, - "pac": 10, - "cuda": False, - "vgm_components": 2 - } - ctgan_model = CtganAdapter(**ctgan_params) - logger.info("CT-GAN model initialized with parameters") - - # Fit the CT-GAN model - ctgan_model.fit(X, y) - logger.info("CT-GAN model fitted successfully") - - # Generate synthetic data - synthetic_data = ctgan_model.generate(n=len(data)) - synthetic_data = synthetic_data[data.columns] - synthetic_data['Category'] = synthetic_data['Category'].astype('category') - logger.info(f"Generated {len(synthetic_data)} rows of synthetic data") - - print_data_summary(synthetic_data, "Synthetic Iris Dataset") - - # Save synthetic data to CSV - synthetic_data.to_csv("synthetic_iris_data.csv", index=False) - logger.info("Synthetic Iris data saved to 'synthetic_iris_data.csv'") - - # Evaluate the quality of the synthetic data - logger.info("Evaluating synthetic data quality") - evaluation_metrics = evaluate_ctgan(real_data=data, synthetic_data=synthetic_data) - - print("\nEvaluation Metrics:") - print_evaluation_results(evaluation_metrics) - - # Compare real and synthetic data 
distributions - print("\nFeature Distribution Comparison:") - for column in data.columns: - if data[column].dtype != 'category': - real_mean = data[column].mean() - real_std = data[column].std() - synth_mean = synthetic_data[column].mean() - synth_std = synthetic_data[column].std() - print(f"\n{column}:") - print(f" Real - Mean: {real_mean:.4f}, Std: {real_std:.4f}") - print(f" Synth - Mean: {synth_mean:.4f}, Std: {synth_std:.4f}") - else: - real_dist = data[column].value_counts(normalize=True) - synth_dist = synthetic_data[column].value_counts(normalize=True) - print(f"\n{column} Distribution:") - print(" Real:") - print(real_dist) - print(" Synthetic:") - print(synth_dist) - - logger.info("CT-GAN Iris example completed successfully") - print("\nCT-GAN Iris example completed successfully") - - except Exception as e: - logger.error(f"An error occurred: {str(e)}") - print(f"An error occurred: {str(e)}") - -if __name__ == "__main__": - main() \ No newline at end of file diff --git a/Archive/ctgan1/doc/DOCUMENTATION.md b/Archive/ctgan1/doc/DOCUMENTATION.md deleted file mode 100644 index 84b6763..0000000 --- a/Archive/ctgan1/doc/DOCUMENTATION.md +++ /dev/null @@ -1,131 +0,0 @@ -## Project Files Documentation - -### `config.json` -- **Purpose:** - Stores all configuration parameters for the CT-GAN model, evaluation settings, and visualization preferences. 
-
-- **Contents:**
-  ```json
-  {
-      "ctgan_params": {
-          "noise_dim": 128,            # Dimension of the noise vector
-          "learning_rate": 0.0002,     # Learning rate for the optimizer
-          "batch_size": 500,           # Number of samples per training batch
-          "discriminator_steps": 5,    # Number of discriminator updates per generator update
-          "epochs": 300,               # Total number of training epochs
-          "lambda_gp": 10,             # Gradient penalty coefficient for WGAN-GP
-          "pac": 10,                   # Number of minibatches for PacGAN (if applicable)
-          "cuda": true,                # Use CUDA for GPU acceleration if available
-          "vgm_components": 2          # Number of components in the Gaussian Mixture Model
-      },
-      "evaluation": {
-          "test_size": 0.2,            # Proportion of the dataset to include in the test split
-          "random_state": 42           # Seed for the random number generator to ensure reproducibility
-      },
-      "visualization": {
-          "n_features": 5,             # Number of features to visualize
-          "figsize": [15, 20]          # Size of the visualization figures (width, height)
-      }
-  }
-  ```
-  (The inline `#` annotations are for documentation only; the actual `config.json` must be plain JSON without comments.)
-
-### Python Scripts
-
-#### `ctgan_adapter.py`
-- **Purpose:**
-  Implements the `CtganAdapter` class, which serves as an interface for initializing, training, and generating synthetic data using the CT-GAN model.
-
-- **Key Components:**
-  - **Class `CtganAdapter`:**
-    - **Initialization:**
-      Sets up the CT-GAN model with the specified parameters.
-    - **Methods:**
-      - `fit(X, y)`: Trains the CT-GAN model on the provided feature matrix `X` and target vector `y`.
-      - `generate(n)`: Generates `n` synthetic data samples based on the trained model.
-
-#### `ctgan_benchmark.py`
-- **Purpose:**
-  Contains evaluation functions that assess the quality and performance of the synthetic data generated by the CT-GAN model.
-
-- **Key Components:**
-  - **Function `evaluate_ctgan(real_data, synthetic_data)`:**
-    - **Purpose:** Computes various evaluation metrics to compare synthetic data against real data.
-    - **Arguments:**
-      - `real_data (pd.DataFrame)`: The original real dataset.
- - `synthetic_data (pd.DataFrame)`: The synthetic dataset generated by CT-GAN. - - **Returns:** - Dictionary containing all evaluation metrics. - - - **Function `print_evaluation_results(results)`:** - - **Purpose:** Formats and prints the evaluation results in a structured and readable manner. - -#### `run_ctgan.py` -- **Purpose:** - Automates CT-GAN experiments based on configurations specified in `config.json`. Handles data preprocessing, model training, synthetic data generation, visualization, saving results, and evaluation for multiple datasets. - -- **Key Steps:** - 1. **Configuration Loading:** - - Loads parameters from `config.json`. - - 2. **Data Loading and Preprocessing:** - - Loads multiple datasets (iris, breast_cancer, wine, digits). - - Adjusts batch size based on dataset size. - - 3. **Model Training:** - - Trains the CT-GAN model with specified parameters for each dataset. - - 4. **Data Generation:** - - Generates synthetic data for each dataset. - - 5. **Visualization:** - - Visualizes real vs. synthetic data distributions for each dataset. - - 6. **Data Saving:** - - Saves the synthetic data for each dataset to separate CSV files. - - 7. **Evaluation:** - - Evaluates synthetic data quality for each dataset and logs the results. - - Computes metrics such as Likelihood Fitness, Statistical Similarity, ML Efficacy, and TSTR Performance. - -### `__init__.py` -- **Purpose:** - Designates directories as Python packages to enable proper module imports and maintain a clean project structure. - -- **Locations:** - - `katabatic/__init__.py` - - `katabatic/models/__init__.py` - - `katabatic/models/ctgan/__init__.py` - -- **Contents:** - Typically empty or containing package-level import statements to expose specific classes and functions. 
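The documented `fit`/`generate` contract can be illustrated with a small stand-in class. This is a hedged sketch only: `BootstrapAdapter` below is a hypothetical class that merely resamples training rows to mimic the interface shape, whereas the real `CtganAdapter` trains a conditional GAN.

```python
import random

class BootstrapAdapter:
    """Hypothetical stand-in mimicking the documented CtganAdapter interface
    (fit(X, y) / generate(n)). It only resamples the training rows; the real
    adapter trains a conditional GAN."""

    def __init__(self, random_state=42):
        self.rng = random.Random(random_state)
        self.rows = None

    def fit(self, X, y):
        # X is a list of feature dicts, y the matching list of targets
        self.rows = [dict(features, Category=target) for features, target in zip(X, y)]
        return self

    def generate(self, n):
        # Draw n rows with replacement from the fitted data
        return [dict(self.rng.choice(self.rows)) for _ in range(n)]

X = [{"sepal length": 5.1}, {"sepal length": 6.2}, {"sepal length": 4.9}]
y = [0, 1, 0]
synthetic = BootstrapAdapter().fit(X, y).generate(5)
print(len(synthetic))        # 5
print(sorted(synthetic[0]))  # ['Category', 'sepal length']
```

The point is the calling convention, not the generation quality: any model exposing the same two methods can be driven by the same experiment scripts.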
- -## Running the Experiments - -To run the CT-GAN experiments on multiple datasets, use the following command (From Katabatic directory): - -``` -python -m katabatic.models.ctgan.run_ctgan -``` - -This command executes the `run_ctgan.py` script, which reads the configuration from `config.json`, loads the specified datasets, trains the CT-GAN model, generates synthetic data, visualizes the results, saves the synthetic data to CSV files, and evaluates the quality of the generated data. - -Make sure you have the necessary dependencies installed and are running the command from the correct directory where the `katabatic` package is located. - -## Summary of Key Accomplishments - -- **Configuration Management:** - Centralized CT-GAN parameters, evaluation settings, and visualization preferences in `config.json` for easy adjustments. - -- **Model Interface:** - Developed the `CtganAdapter` class in `ctgan_adapter.py` to streamline interactions with the CT-GAN model, including training and data generation. - -- **Evaluation Framework:** - Implemented evaluation functions in `ctgan_benchmark.py` to assess synthetic data quality using metrics like Likelihood Fitness, Statistical Similarity, ML Efficacy, and TSTR Performance. - -- **Experiment Automation:** - Developed `run_ctgan.py` to automate experiments based on configurations, enhancing scalability and flexibility for multiple datasets (iris, breast_cancer, wine, digits). - -- **Project Structuring:** - Organized the project with proper package initialization files (`__init__.py`) to ensure seamless module imports and maintain a clean hierarchy. - -These components collectively enable the effective training, generation, visualization, and evaluation of synthetic data for multiple datasets using the CT-GAN model within the Katabatic project framework. 
The automation and evaluation capabilities facilitate comprehensive analysis and comparison of the CT-GAN model's performance across different datasets, providing valuable insights into its efficacy and potential for generating high-quality synthetic data. diff --git a/Archive/ctgan1/doc/debugging_notes.md b/Archive/ctgan1/doc/debugging_notes.md deleted file mode 100644 index 2ecea16..0000000 --- a/Archive/ctgan1/doc/debugging_notes.md +++ /dev/null @@ -1,548 +0,0 @@ -```markdown -# CTGAN Pipeline: Errors and Fixes - -This document outlines the major errors encountered across the three scripts—**`benchmark_ctgan`**, **`ctgan_adapter`**, and **`run_ctgan`**—along with the corresponding fixes implemented. The errors are organized by script for clarity and better understanding. - -The implementation of CTGAN in this project is based on the groundbreaking work by Xu et al. (2019), adapting their methodologies and insights to the Katabatic framework. - ---- - -## `ctgan_adapter.py` - -### 1. **Import and Module Path Errors** -- **Error:** - Unable to import custom modules from the `katabatic` package, resulting in `ModuleNotFoundError`. - -- **Fix:** - Added the project root directory to Python's system path to enable importing custom modules. - ```python - project_root = os.path.abspath(os.path.dirname(os.path.dirname(__file__))) - sys.path.insert(0, project_root) - ``` - -### 2. **Handling Mixed Data Types** -- **Error:** - Inconsistent encoding of categorical features leading to data mismatches during transformation and inverse transformation. - -- **Fix:** - Implemented the `DataTransformer` class to handle both categorical and numerical features systematically. - ```python - class DataTransformer: - def __init__(self, max_clusters=10): - # Initialization code - self.encoders = {} - self.scalers = {} - # ... 
- - def fit(self, data): - for column in data.columns: - if data[column].dtype == 'object' or data[column].dtype.name == 'category': - encoder = OneHotEncoder(sparse=False, handle_unknown='ignore') - encoder.fit(data[[column]]) - self.encoders[column] = encoder - else: - scaler = MinMaxScaler() - scaler.fit(data[[column]]) - self.scalers[column] = scaler - # ... - - def transform(self, data): - # Transformation logic - pass - - def inverse_transform(self, data): - # Inverse transformation logic - pass - ``` - -### 3. **Missing Values in Synthetic Target Variable** -- **Error:** - Synthetic data generation resulted in `NaN` values in the target variable, causing errors during evaluation. - -- **Fix:** - Imputed missing values in the synthetic target variable with the mode to maintain data integrity. - ```python - recovered = scaler.inverse_transform(recovered.reshape(-1, 1)).flatten() - df[column] = recovered - # Later during evaluation - y_synthetic = y_synthetic.fillna(y_synthetic.mode().iloc[0]) - ``` - -### 4. **Dimension Mismatch Between Generator and Discriminator** -- **Error:** - Mismatched input dimensions when concatenating noise vectors with conditional vectors, leading to runtime errors. - -- **Fix:** - Verified and correctly calculated `noise_dim` and `cond_dim`, ensuring proper concatenation. - ```python - self.input_dim = noise_dim + cond_dim - ``` - -### 5. **Residual Connections Causing Shape Mismatches** -- **Error:** - Residual connections in the generator added tensors of different shapes, causing runtime errors during forward passes. - -- **Fix:** - Ensured matching dimensions in the `ResidualBlock` by adjusting layers or adding transformations. 
- ```python - class ResidualBlock(nn.Module): - def __init__(self, input_dim, output_dim): - super(ResidualBlock, self).__init__() - self.fc = nn.Linear(input_dim, output_dim) - self.bn = nn.BatchNorm1d(output_dim) - # Define the shortcut projection once here; creating an nn.Linear - # inside forward() would use fresh, untrained weights on every pass - # and could land on the wrong device - self.shortcut = nn.Linear(input_dim, output_dim) if input_dim != output_dim else nn.Identity() - - def forward(self, x): - out = F.leaky_relu(self.bn(self.fc(x)), 0.2) - return self.shortcut(x) + out - ``` - -### 6. **Gradient Leakage from Generator to Discriminator** -- **Error:** - Gradients from the generator were inadvertently propagating to the discriminator, causing unstable training dynamics. - -- **Fix:** - Detached fake data from the computation graph when training the discriminator to prevent gradient flow. - ```python - fake_cat = torch.cat([fake, c1], dim=1).detach() - ``` - -### 7. **Device Mismatch Errors (CPU vs GPU)** -- **Error:** - Tensors were located on different devices (CPU vs GPU), causing runtime errors during operations. - -- **Fix:** - Ensured that all tensors and models were consistently moved to the same device. - ```python - self.device = torch.device("cuda:0" if kwargs.get('cuda', True) and torch.cuda.is_available() else "cpu") - self.generator.to(self.device) - self.discriminator.to(self.device) - - real = data_batch[0].to(self.device) - noise = torch.randn(current_batch_size, self.embedding_dim, device=self.device) - c1 = torch.from_numpy(condvec).to(self.device) if condvec is not None else torch.zeros(current_batch_size, self.cond_dim, device=self.device) - ``` - -### 8. **Missing or Incorrect Configuration Parameters** -- **Error:** - Missing hyperparameters in `config.json` led to unexpected behaviors or crashes. - -- **Fix:** - Provided default values using `kwargs.get()` and validated the presence of essential configuration keys.
- ```python - def load_config(config_path="katabatic/models/ctgan/config.json"): - if not os.path.exists(config_path): - logging.error(f"Configuration file not found at {config_path}") - sys.exit(1) - with open(config_path, "r") as f: - config = json.load(f) - required_keys = ["ctgan_params", "evaluation", "visualization"] - for key in required_keys: - if key not in config: - raise KeyError(f"Missing required configuration key: {key}") - return config - ``` - -### 9. **Incorrect Calculation of Evaluation Metrics** -- **Error:** - Metrics were inaccurately calculated, leading to misleading evaluations of synthetic data quality. - -- **Fix:** - Validated metric implementations and handled edge cases appropriately. - ```python - "JSD_mean": np.mean(js_divergences) if js_divergences else None, - "Wasserstein_mean": np.mean(wasserstein_distances) if wasserstein_distances else None - ``` - -### 10. **Incorrect Sampling of Conditional Vectors** -- **Error:** - Conditional vectors were sampled incorrectly, reducing data diversity and realism. - -- **Fix:** - Ensured correct one-hot encoding and balanced probability distributions during sampling. - ```python - one_hot = encoder.transform([[category]]).flatten() - cond[i, self.get_condvec_indices(idx)] = one_hot - ``` - -### 11. **Batch Size Larger Than Dataset Size** -- **Error:** - Set batch size larger than the dataset size, causing empty batches or processing errors. - -- **Fix:** - Dynamically adjusted batch size based on dataset size. - ```python - self.batch_size = min(self.max_batch_size, len(data)) - if self.batch_size < self.max_batch_size: - logging.info(f"Adjusted batch size to {self.batch_size} due to small dataset size") - ``` - -### 12. **Incorrect Reconstruction of Continuous Features** -- **Error:** - Continuous features were not accurately inverse transformed, resulting in unrealistic values. 
- -- **Fix:** - Handled division by zero, applied clipping, and correctly scaled values back using inverse transformation. - ```python - selected_stds[selected_stds == 0] = 1e-6 # Avoid division by zero - recovered = scaler.inverse_transform(recovered.reshape(-1, 1)).flatten() - ``` - -### 13. **Handling Datasets with No Categorical Features** -- **Error:** - Scripts failed when processing datasets that lacked categorical features, leading to `None` values or improper handling. - -- **Fix:** - Added conditional checks to handle absence of categorical columns gracefully. - ```python - if not self.discrete_columns: - return None, None - ``` - -### 14. **High Cardinality Categorical Features Causing Memory Issues** -- **Error:** - Excessive one-hot encoding led to high-dimensional data and memory overflow. - -- **Fix:** - Limited the number of Gaussian Mixture Model (GMM) components and filtered inactive components. - ```python - self.continuous_gmms[column] = vgm - component_one_hot = component_one_hot[:, active_components] - ``` - -### 15. **Learning Rate and Optimization Issues** -- **Error:** - Poor model convergence due to inappropriate learning rates and optimizer settings. - -- **Fix:** - Tuned learning rates, implemented learning rate schedulers, and set appropriate optimizer betas (the optimizers must be created before the schedulers that wrap them). - ```python - self.optimizerG = optim.Adam(self.generator.parameters(), lr=self.generator_lr, betas=(0.5, 0.9)) - self.optimizerD = optim.Adam(self.discriminator.parameters(), lr=self.discriminator_lr, betas=(0.5, 0.9)) - self.schedulerG = optim.lr_scheduler.StepLR(self.optimizerG, step_size=100, gamma=0.5) - self.schedulerD = optim.lr_scheduler.StepLR(self.optimizerD, step_size=100, gamma=0.5) - ``` - -### 16. **Gradient Penalty Calculation Errors for WGAN-GP** -- **Error:** - Incorrect gradient penalty calculation led to unstable training. - -- **Fix:** - Correctly computed gradients and integrated gradient penalty into discriminator loss.
- ```python - gradients = torch.autograd.grad( - outputs=validity_interpolates, - inputs=interpolates, - grad_outputs=torch.ones(validity_interpolates.size(), device=device), - create_graph=True, - retain_graph=True, - only_inputs=True - )[0] - gradient_penalty = ((gradients.norm(2, dim=1) - 1) ** 2).mean() * lambda_gp - loss_d = loss_adv + gradient_penalty + class_loss if self.num_classes > 0 else loss_adv + gradient_penalty - ``` - -### 17. **Incorrect Identification of Target Variable** -- **Error:** - Assumed the target variable was always named 'Category', causing misalignment during data processing. - -- **Fix:** - Dynamically identified the target variable based on dataset structure. - ```python - if self.discrete_columns: - self.target_column = self.discrete_columns[-1] - self.num_classes = self.transformer.encoders[self.target_column].categories_[0].shape[0] - else: - self.target_column = None - self.num_classes = 0 - ``` - -### 18. **Handling High-Dimensional Output from DataTransformer** -- **Error:** - High-dimensional transformed data caused memory constraints and performance issues. - -- **Fix:** - Limited the number of GMM components and applied dimensionality reduction techniques where necessary. - ```python - self.max_clusters = max_clusters - active_components = vgm.weights_ > 1e-3 - component_one_hot = component_one_hot[:, active_components] - ``` - -### 19. **Saving Synthetic Data When Generation Fails** -- **Error:** - Attempted to save synthetic data even when generation failed, leading to corrupted or incomplete files. - -- **Fix:** - Implemented checks to save only when synthetic data was successfully generated. - ```python - if "synthetic_data" in result and result["synthetic_data"] is not None: - result["synthetic_data"].to_csv(output_file, index=False) - else: - logging.warning(f"Failed to generate synthetic data for {result['dataset']}") - ``` - -### 20.
**Applying Classification Metrics to Numerical Targets** -- **Error:** - Used classification metrics (Accuracy, F1 Score) for regression tasks, leading to inappropriate evaluations. - -- **Fix:** - Separated evaluation logic based on the target variable type, applying appropriate metrics. - ```python - if y_real.dtype == 'object' or y_real.dtype.name == 'category': - # Calculate Accuracy and F1 Score - accuracy = accuracy_score(y_test, y_pred) - f1 = f1_score(y_test, y_pred, average='weighted') - else: - # Calculate R-squared - r2 = r2_score(y_test, y_pred) - ``` - -### 21. **Handling One-Hot Encoding Sparse Matrices** -- **Error:** - One-hot encoded data as sparse matrices caused concatenation issues with numerical features. - -- **Fix:** - Set `sparse=False` in `OneHotEncoder` or converted sparse matrices to dense arrays. - ```python - encoder = OneHotEncoder(sparse=False, handle_unknown='ignore') - transformed = encoder.transform(data[[column]]) # with sparse=False the output is already dense, so no .toarray() call is needed - ``` - -### 22. **Synthetic Target Variable Not Properly Conditioned** -- **Error:** - Synthetic target did not accurately reflect conditional distributions, leading to mismatches. - -- **Fix:** - Ensured that conditional vectors included the target variable correctly and handled encoding/decoding appropriately. - ```python - c1 = torch.from_numpy(condvec).to(self.device) if condvec is not None else torch.zeros(current_batch_size, self.cond_dim, device=self.device) - ``` - -### 23. **Configuration File Path Errors** -- **Error:** - Configuration file (`config.json`) not found due to incorrect path specification, resulting in `FileNotFoundError`. - -- **Fix:** - Verified and corrected the configuration file path and added error handling to provide informative messages.
- ```python - def load_config(config_path="katabatic/models/ctgan/config.json"): - if not os.path.exists(config_path): - logging.error(f"Configuration file not found at {config_path}") - sys.exit(1) - with open(config_path, "r") as f: - config = json.load(f) - return config - ``` - ---- - -## `run_ctgan.py` - -### 1. **Import and Module Path Errors** -- **Error:** - Unable to import custom modules (`CtganAdapter`, `evaluate_ctgan`, `print_evaluation_results`) from the `katabatic` package. - -- **Fix:** - Added the project root directory to Python's system path. - ```python - project_root = os.path.abspath(os.path.dirname(os.path.dirname(__file__))) - sys.path.insert(0, project_root) - ``` - -### 2. **Handling Multiple Datasets with Mixed Data Types** -- **Error:** - Different datasets had varying feature types and structures, causing preprocessing inconsistencies. - -- **Fix:** - Systematically converted each dataset into a `pandas.DataFrame` and introduced a synthetic categorical feature. - ```python - def load_data(): - datasets = { - "iris": load_iris(), - "breast_cancer": load_breast_cancer(), - "wine": load_wine(), - "digits": load_digits() - } - - processed_datasets = {} - - for name, dataset in datasets.items(): - X = pd.DataFrame(dataset.data, columns=dataset.feature_names) - y = pd.Series(dataset.target, name="Category") - X["categorical_feature"] = pd.cut(X.iloc[:, 0], bins=3, labels=["low", "medium", "high"]) - data = pd.concat([X, y], axis=1) - processed_datasets[name] = data - - return processed_datasets - ``` - -### 3. **Batch Size Larger Than Dataset Size** -- **Error:** - Set batch size larger than the dataset size, causing empty batches or processing errors. - -- **Fix:** - Dynamically adjusted batch size based on dataset size. - ```python - self.batch_size = min(self.max_batch_size, len(data)) - if self.batch_size < self.max_batch_size: - logging.info(f"Adjusted batch size to {self.batch_size} due to small dataset size") - ``` - -### 4. 
**Saving Synthetic Data When Generation Fails** -- **Error:** - Attempted to save synthetic data even when generation failed, leading to corrupted or incomplete files. - -- **Fix:** - Implemented checks to save only when synthetic data was successfully generated. - ```python - if "synthetic_data" in result and result["synthetic_data"] is not None: - output_file = f"{result['dataset']}_synthetic_data.csv" - result["synthetic_data"].to_csv(output_file, index=False) - else: - logging.warning(f"Failed to generate synthetic data for {result['dataset']}") - ``` - -### 5. **Configuration Loading and Parameterization Errors** -- **Error:** - Missing hyperparameters in `config.json` led to unexpected behaviors or crashes. - -- **Fix:** - Provided default values using `kwargs.get()` and validated configuration keys. - ```python - config = load_config() - ``` - -### 6. **Incorrect Identification of Target Variable** -- **Error:** - Assumed the target variable was always named 'Category', causing misalignment during data processing. - -- **Fix:** - Dynamically identified the target variable based on dataset structure. - ```python - y = pd.Series(dataset.target, name="Category") - ``` - -### 7. **Handling Numerical Targets During Evaluation** -- **Error:** - Applied classification metrics to numerical targets, leading to inappropriate evaluations. - -- **Fix:** - Implemented conditional metric calculations based on target type. - ```python - if y_real.dtype == 'object' or y_real.dtype.name == 'category': - # Classification metrics - else: - # Regression metrics - ``` - -### 8. **Incorrect Conditional Vector Dimension Calculation** -- **Error:** - Miscalculated conditional vector dimensions causing model errors. - -- **Fix:** - Accurately calculated `cond_dim` and mapped categorical columns correctly. - ```python - self.cond_dim = sum(self.discrete_column_category_counts) - ``` - ---- - -## `benchmark_ctgan.py` - -### 1. 
**Incorrect Sampling of Gaussian Components** -- **Error:** - Synthetic data generation sometimes sampled Gaussian components inaccurately, leading to unrealistic distributions. - -- **Fix:** - Ensured correct probability sampling and filtered inactive components. - ```python - components = np.array([ - np.random.choice(len(p), p=p) if p.sum() > 0 else np.random.randint(len(p)) - for p in probs - ]) - component_one_hot = component_one_hot[:, active_components] - ``` - -### 2. **Incorrect Calculation of Evaluation Metrics** -- **Error:** - Metrics were inaccurately calculated, leading to misleading evaluations of synthetic data quality. - -- **Fix:** - Validated metric implementations and handled edge cases. - ```python - "JSD_mean": np.mean(js_divergences) if js_divergences else None, - "Wasserstein_mean": np.mean(wasserstein_distances) if wasserstein_distances else None - ``` - -### 3. **Applying Classification Metrics to Numerical Targets** -- **Error:** - Used classification metrics (Accuracy, F1 Score) for regression tasks, leading to inappropriate evaluations. - -- **Fix:** - Separated evaluation logic based on target variable type, applying appropriate metrics. - ```python - if y_real.dtype == 'object' or y_real.dtype.name == 'category': - # Calculate Accuracy and F1 Score - accuracy = accuracy_score(y_test, y_pred) - f1 = f1_score(y_test, y_pred, average='weighted') - else: - # Calculate R-squared - r2 = r2_score(y_test, y_pred) - ``` - -### 4. **Gradient Penalty Calculation Errors for WGAN-GP** -- **Error:** - Incorrect gradient penalty calculation led to unstable training. - -- **Fix:** - Correctly computed gradients and integrated gradient penalty into discriminator loss. 
- ```python - gradients = torch.autograd.grad( - outputs=validity_interpolates, - inputs=interpolates, - grad_outputs=torch.ones(validity_interpolates.size(), device=device), - create_graph=True, - retain_graph=True, - only_inputs=True - )[0] - gradient_penalty = ((gradients.norm(2, dim=1) - 1) ** 2).mean() * lambda_gp - loss_d = loss_adv + gradient_penalty + class_loss if self.num_classes > 0 else loss_adv + gradient_penalty - ``` - -### 5. **Handling Outliers and Anomalous Data Points** -- **Error:** - Outliers or anomalous data points adversely affected the training process, leading to unstable GAN training. - -- **Fix:** - Applied data clipping and used robust scalers to mitigate the impact of outliers. - ```python - normalized_values = np.clip(normalized_values, -0.99, 0.99) - scaler = MinMaxScaler() - ``` - -### 6. **Handling High-Dimensional Output from DataTransformer** -- **Error:** - High-dimensional transformed data caused memory constraints and performance issues. - -- **Fix:** - Limited the number of GMM components and applied dimensionality reduction techniques where necessary. - ```python - self.max_clusters = max_clusters - active_components = vgm.weights_ > 1e-3 - component_one_hot = component_one_hot[:, active_components] - ``` - ---- - -# Summary - -By addressing these errors and implementing the corresponding fixes across the **`ctgan_adapter.py`**, **`run_ctgan.py`**, and **`benchmark_ctgan.py`** scripts, the CTGAN-based synthetic data generation pipeline has been made robust and reliable. The systematic handling of data preprocessing, model training, evaluation, and visualization ensures high-quality synthetic data that mirrors the statistical properties of real datasets while maintaining practical utility for downstream machine learning tasks. - - - -### References - -Xu, L., Skoularidou, M., Cuesta-Infante, A., & Veeramachaneni, K. (2019). Modeling tabular data using conditional GAN. *Advances in Neural Information Processing Systems*, *32*.
-``` \ No newline at end of file diff --git a/Archive/ctgan1/doc/example-implementation.md b/Archive/ctgan1/doc/example-implementation.md deleted file mode 100644 index ca73db7..0000000 --- a/Archive/ctgan1/doc/example-implementation.md +++ /dev/null @@ -1,202 +0,0 @@ -```markdown -# CTGAN: Generating Synthetic Tabular Data - -Hello everyone! I've been working on implementing CTGAN (Conditional Tabular GAN) as part of the Katabatic framework. CTGAN is a fascinating model designed to generate synthetic tabular data, and I want to share an example of how it works using the classic Iris dataset. - -## What is CTGAN? - -CTGAN, or Conditional Tabular GAN, is a type of Generative Adversarial Network specifically tailored for tabular data. It's designed to capture the distributions and relationships in your original data and create new, synthetic samples that maintain these characteristics. - -## The Iris Dataset Example - -To demonstrate CTGAN's capabilities, I've used the well-known Iris dataset. It's a simple but effective dataset with 150 samples, 4 numerical features (sepal length, sepal width, petal length, petal width), and 1 categorical target (Iris species). - -Here's what happened when I ran CTGAN on this dataset: - -### Training Process - -I trained the model (ctgan_example.py in ctgan folder) for 10 epochs. Here's a snippet of the training output: - -``` -Epoch 0, Loss D: 19.4568, Loss G: 1.1630 -Epoch 1, Loss D: 18.5340, Loss G: 1.1807 -... -Epoch 8, Loss D: 7.3483, Loss G: 0.9190 -Epoch 9, Loss D: 6.0030, Loss G: 0.8083 -``` - -You can see the loss for both the Discriminator (D) and Generator (G) decreasing over time, which indicates that both parts of the model are learning. - -### Generated Data - -After training, CTGAN generated 150 synthetic samples to match the original dataset size.
Here's a comparison of some key statistics: - -``` -Feature: Real Mean (Std) vs Synthetic Mean (Std) -sepal length: 5.8433 (0.8281) vs 4.5374 (0.3615) -sepal width: 3.0573 (0.4359) vs 3.1174 (0.4885) -petal length: 3.7580 (1.7653) vs 3.3229 (1.2725) -petal width: 1.1993 (0.7622) vs 0.6768 (0.5113) - -Category Distribution (Real vs Synthetic): -0: 33.33% vs 78.67% -1: 33.33% vs 12.67% -2: 33.33% vs 8.67% -``` - -## What Can We Learn From This? - -1. **Feature Distributions**: CTGAN has captured the general range of the features, but there are differences in the means and standard deviations. This is quite normal for GANs, especially with limited training. - -2. **Category Imbalance**: The synthetic data shows a significant imbalance in categories compared to the original balanced distribution. This is an area where our model could use some improvement. - -3. **Data Range**: The synthetic data maintains the min-max ranges of the original data for most features, which is a positive aspect. - -4. **Variance**: The synthetic data generally shows less variance compared to the original data, particularly for sepal length. - -## Room for Improvement - -Based on these results, here are some areas we could focus on to improve the model: - -1. **Balancing Categories**: We need to work on better preserving the original category distribution. This might involve adjusting the model architecture or loss function. - -2. **Preserving Variance**: The model seems to be underestimating the variance in some features. We could experiment with different normalization techniques or model architectures to address this. - -3. **Extended Training**: Running the model for more epochs might help it capture the data distributions more accurately. - -4. **Hyperparameter Tuning**: Adjusting parameters like learning rate, batch size, or the structure of the generator and discriminator could lead to better results. 
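One way to put a number on the category imbalance noted above is the Jensen-Shannon divergence, the same quantity the benchmark code averages into its `JSD_mean` metric. A minimal standard-library sketch, using the class distributions reported in this example:

```python
from math import log2

def js_divergence(p, q):
    """Jensen-Shannon divergence (base 2, bounded in [0, 1]) between two
    discrete distributions given as aligned probability lists."""
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]

    def kl(a, b):
        # Kullback-Leibler divergence, skipping zero-probability terms
        return sum(ai * log2(ai / bi) for ai, bi in zip(a, b) if ai > 0)

    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

real = [1 / 3, 1 / 3, 1 / 3]            # balanced Iris classes
synthetic = [0.7867, 0.1267, 0.0867]    # distribution reported above

print(round(js_divergence(real, real), 4))  # 0.0 -- identical distributions
print(round(js_divergence(real, synthetic), 4))
```

A value near 0 means the synthetic class distribution matches the real one; the imbalance reported above yields a clearly non-zero divergence.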
- -## A Note on Benchmarking - -It's important to note that this example is not benchmarked against other methods or implementations. It's meant to provide a basic understanding of how CTGAN works and what kind of output it produces. For a more comprehensive evaluation, we'd need to compare it with other data generation methods and use a wider range of evaluation metrics. - -## Conclusion - -This CTGAN implementation shows promise in generating synthetic tabular data. While it has some areas for improvement, it demonstrates the potential of using GANs for creating synthetic datasets. As we continue to refine the model, we can expect to see even better results in maintaining the statistical properties of the original data. - -``` -# ctgan_example -import os -import sys -import pandas as pd -import numpy as np -from sklearn.datasets import load_iris -from sklearn.model_selection import train_test_split -from sklearn.metrics import classification_report -from sklearn.ensemble import RandomForestClassifier -import logging - -# Add the project root to the Python path to allow module imports -project_root = os.path.abspath(os.path.join(os.path.dirname(__file__), '..', '..', '..')) -sys.path.insert(0, project_root) - -# Import CtganAdapter and evaluation functions from Katabatic -from katabatic.models.ctgan.ctgan_adapter import CtganAdapter -from katabatic.models.ctgan.ctgan_benchmark import evaluate_ctgan, print_evaluation_results - -# Configure logging -logging.basicConfig(level=logging.INFO, format='%(asctime)s - %(levelname)s - %(message)s') -logger = logging.getLogger(__name__) - -def print_data_summary(data, title): - """ - Print summary statistics of the dataset. - - Args: - data (pd.DataFrame): The dataset to summarize. - title (str): Title for the summary. 
- """ - print(f"\n{title} Summary:") - print(f"Shape: {data.shape}") - print("\nDataset Head:") - print(data.head()) - print("\nDataset Info:") - data.info() - print("\nNumeric Columns Summary:") - print(data.describe()) - print("\nCategory Distribution:") - print(data['Category'].value_counts(normalize=True)) - -def main(): - try: - logger.info("Starting CT-GAN Iris example script") - print("Starting CT-GAN Iris example script") - - # Load Iris dataset - iris = load_iris() - X = pd.DataFrame(iris.data, columns=iris.feature_names) - y = pd.Series(iris.target, name="Category").astype('category') - data = pd.concat([X, y], axis=1) - - logger.info("Iris data loaded successfully") - print_data_summary(data, "Original Iris Dataset") - - # Initialize CtganAdapter - ctgan_params = { - "noise_dim": 128, - "learning_rate": 2e-4, - "batch_size": 100, - "discriminator_steps": 2, - "epochs": 10, - "lambda_gp": 10, - "pac": 10, - "cuda": False, - "vgm_components": 2 - } - ctgan_model = CtganAdapter(**ctgan_params) - logger.info("CT-GAN model initialized with parameters") - - # Fit the CT-GAN model - ctgan_model.fit(X, y) - logger.info("CT-GAN model fitted successfully") - - # Generate synthetic data - synthetic_data = ctgan_model.generate(n=len(data)) - synthetic_data = synthetic_data[data.columns] - synthetic_data['Category'] = synthetic_data['Category'].astype('category') - logger.info(f"Generated {len(synthetic_data)} rows of synthetic data") - - print_data_summary(synthetic_data, "Synthetic Iris Dataset") - - # Save synthetic data to CSV - synthetic_data.to_csv("synthetic_iris_data.csv", index=False) - logger.info("Synthetic Iris data saved to 'synthetic_iris_data.csv'") - - # Evaluate the quality of the synthetic data - logger.info("Evaluating synthetic data quality") - evaluation_metrics = evaluate_ctgan(real_data=data, synthetic_data=synthetic_data) - - print("\nEvaluation Metrics:") - print_evaluation_results(evaluation_metrics) - - # Compare real and synthetic data 
distributions - print("\nFeature Distribution Comparison:") - for column in data.columns: - if data[column].dtype != 'category': - real_mean = data[column].mean() - real_std = data[column].std() - synth_mean = synthetic_data[column].mean() - synth_std = synthetic_data[column].std() - print(f"\n{column}:") - print(f" Real - Mean: {real_mean:.4f}, Std: {real_std:.4f}") - print(f" Synth - Mean: {synth_mean:.4f}, Std: {synth_std:.4f}") - else: - real_dist = data[column].value_counts(normalize=True) - synth_dist = synthetic_data[column].value_counts(normalize=True) - print(f"\n{column} Distribution:") - print(" Real:") - print(real_dist) - print(" Synthetic:") - print(synth_dist) - - logger.info("CT-GAN Iris example completed successfully") - print("\nCT-GAN Iris example completed successfully") - - except Exception as e: - logger.error(f"An error occurred: {str(e)}") - print(f"An error occurred: {str(e)}") - -if __name__ == "__main__": - main() diff --git a/Archive/ctgan1/doc/katabatic-ctgan-handbook.md b/Archive/ctgan1/doc/katabatic-ctgan-handbook.md deleted file mode 100644 index 0d62869..0000000 --- a/Archive/ctgan1/doc/katabatic-ctgan-handbook.md +++ /dev/null @@ -1,718 +0,0 @@ -# Developing and Integrating CTGAN with Katabatic: Tutorial - -## 1. Introduction - -Synthetic data generation has emerged as a critical solution to numerous challenges in data science and machine learning. As organizations grapple with data scarcity, privacy concerns, and the need for diverse datasets, the ability to create high-quality synthetic data has become invaluable. This tutorial delves into the world of synthetic tabular data generation, focusing on two powerful tools: Katabatic and CTGAN. - -Katabatic is an open-source framework designed to streamline the process of generating and evaluating synthetic tabular data. It provides a unified interface for various generative models, enabling researchers and practitioners to easily experiment with different approaches. 
Katabatic's architecture is built on the principle of modularity, allowing seamless integration of new models and evaluation metrics. - -CTGAN, or Conditional Tabular GAN, represents a significant advancement in synthetic data generation. Built on the foundations of Generative Adversarial Networks (GANs), CTGAN addresses the unique challenges posed by tabular data, such as mixed data types and complex dependencies between columns. Its ability to capture and replicate the intricate patterns within real-world datasets makes it a powerful tool for generating realistic synthetic data. - -The integration of CTGAN into the Katabatic framework marks a significant step forward in the field of synthetic data generation. This combination leverages the strengths of both tools: CTGAN's sophisticated generation capabilities and Katabatic's robust evaluation and comparison framework. The result is a powerful system for creating, assessing, and refining synthetic datasets. - -This tutorial aims to provide a comprehensive guide to implementing CTGAN within the Katabatic framework. It covers the theoretical foundations of both Katabatic and CTGAN, delves into the practical aspects of implementation, and offers insights into advanced usage and best practices. By the end of this tutorial, readers will have a deep understanding of: - -1. The architecture and principles of Katabatic -2. The inner workings of CTGAN and its advantages -3. How to implement CTGAN as a Katabatic model -4. Techniques for training, generating, and evaluating synthetic data -5. Advanced strategies for optimizing performance and handling complex datasets - -Whether you're a data scientist looking to generate synthetic datasets for testing, a machine learning engineer seeking to augment training data, or a researcher exploring new frontiers in data generation, this tutorial will equip you with the knowledge and tools to leverage the power of CTGAN within the Katabatic framework. 
Let's embark on this journey to master the art and science of synthetic tabular data generation.
-
-
-
-
-## 2. Understanding Katabatic
-
-Katabatic is an open-source framework designed to streamline the process of generating and evaluating synthetic tabular data. Its architecture is built on the principles of modularity, extensibility, and ease of use, making it an ideal platform for researchers and practitioners in the field of synthetic data generation.
-
-### Architecture Overview
-
-Katabatic's architecture can be conceptualized as a layered system:
-
-1. **Core Layer**: This foundational layer contains the essential components and interfaces that define the framework's structure.
-2. **Model Layer**: Houses various synthetic data generation models, including CTGAN.
-3. **Evaluation Layer**: Comprises a suite of metrics and tools for assessing synthetic data quality.
-4. **Utility Layer**: Provides auxiliary functions for data preprocessing, visualization, and experiment management.
-5. **Interface Layer**: Offers APIs and command-line tools for interacting with the framework.
-
-    +-----------------+
-    | Interface Layer |
-    +-----------------+
-           ↑ ↓
-    +-----------------+
-    |  Utility Layer  |
-    +-----------------+
-           ↑ ↓
-    +-----------------+
-    | Evaluation Layer|
-    +-----------------+
-           ↑ ↓
-    +-----------------+
-    |   Model Layer   |
-    +-----------------+
-           ↑ ↓
-    +-----------------+
-    |   Core Layer    |
-    +-----------------+
-
-### Key Components
-
-#### 1. Model Service Provider Interface (SPI)
-
-At the heart of Katabatic lies the Model SPI, a crucial abstraction that allows seamless integration of various synthetic data generation models. 
This interface defines a standard set of methods that all models must implement:
-
-```python
-from abc import ABC, abstractmethod
-
-class KatabaticModelSPI(ABC):
-    @abstractmethod
-    def load_model(self):
-        pass
-
-    @abstractmethod
-    def fit(self, X, y=None):
-        pass
-
-    @abstractmethod
-    def generate(self, n_samples):
-        pass
-```
-
-This design enables Katabatic to treat all models uniformly, regardless of their internal complexities. It facilitates easy swapping of models and promotes a plug-and-play architecture.
-
-#### 2. Data Transformer
-
-Katabatic's Data Transformer is a sophisticated component responsible for preprocessing input data and post-processing generated synthetic data. It handles:
-
-- Encoding of categorical variables
-- Scaling of numerical features
-- Handling of missing values
-- Data type conversions
-
-The Data Transformer ensures that data is consistently formatted across different models and evaluation metrics, promoting compatibility and fair comparisons.
-
-#### 3. Evaluation Framework
-
-The Evaluation Framework is a comprehensive suite of metrics and tools designed to assess the quality of synthetic data. It includes:
-
-- Statistical similarity measures (e.g., Jensen-Shannon divergence, Wasserstein distance)
-- Machine learning efficacy tests
-- Privacy and disclosure risk assessments
-- Visualization tools for comparative analysis
-
-This framework allows users to perform multi-faceted evaluations of synthetic data, ensuring its suitability for various downstream tasks.
-
-#### 4. Configuration Manager
-
-Katabatic employs a flexible configuration system that allows users to customize various aspects of the data generation and evaluation process. The Configuration Manager:
-
-- Loads settings from JSON files
-- Provides a centralized point for managing hyperparameters
-- Enables easy experiment reproducibility
-
-### Workflow Integration
-
-Katabatic's architecture is designed to facilitate a smooth workflow:
-
-1. 
**Data Ingestion**: Raw data is loaded and passed through the Data Transformer. -2. **Model Selection**: Users choose a synthetic data generation model (e.g., CTGAN) through the configuration system. -3. **Model Training**: The selected model is instantiated via the Model SPI and trained on the preprocessed data. -4. **Synthetic Data Generation**: The trained model generates synthetic data. -5. **Evaluation**: The Evaluation Framework assesses the quality of the synthetic data. -6. **Reporting**: Results are compiled and presented through the Interface Layer. - -### Extensibility and Customization - -One of Katabatic's key strengths is its extensibility. Researchers can easily: - -- Implement new synthetic data generation models by adhering to the Model SPI -- Add custom evaluation metrics to the Evaluation Framework -- Develop specialized data transformers for unique data types or domains - -This flexibility allows Katabatic to evolve with the rapidly advancing field of synthetic data generation. - -### Performance Considerations - -Katabatic is designed with performance in mind: - -- **Lazy Loading**: Models and evaluation metrics are loaded on-demand to conserve memory. -- **Parallelization**: Where possible, operations are parallelized to leverage multi-core processors. -- **Caching**: Intermediate results are cached to avoid redundant computations during iterative experiments. 
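
The six-step workflow above (ingestion → selection → training → generation → evaluation → reporting) can be sketched end to end with a toy stand-in model. This is an illustration only: `ModelSPI` and `GaussianToyModel` are hypothetical names invented for this sketch, not part of Katabatic's actual API, and the "model" merely memorises per-column mean and standard deviation.

```python
from abc import ABC, abstractmethod
import random

class ModelSPI(ABC):
    """Minimal stand-in for the Model SPI described above."""
    @abstractmethod
    def load_model(self): ...
    @abstractmethod
    def fit(self, X, y=None): ...
    @abstractmethod
    def generate(self, n_samples): ...

class GaussianToyModel(ModelSPI):
    """Toy 'generative model': memorises per-column mean/stddev."""
    def load_model(self):
        self.stats = {}
        return self

    def fit(self, X, y=None):
        # 'Training' = estimating simple column statistics
        for col, values in X.items():
            mu = sum(values) / len(values)
            var = sum((v - mu) ** 2 for v in values) / len(values)
            self.stats[col] = (mu, var ** 0.5)

    def generate(self, n_samples):
        # Sample each column independently from its fitted Gaussian
        return {col: [random.gauss(mu, sd) for _ in range(n_samples)]
                for col, (mu, sd) in self.stats.items()}

# 1. ingestion  2. model selection  3. training  4. generation  5. evaluation  6. reporting
real = {"sepal_length": [5.1, 4.9, 6.2, 5.8], "petal_width": [0.2, 0.2, 1.5, 1.2]}
model = GaussianToyModel().load_model()
model.fit(real)
synthetic = model.generate(100)
report = {col: abs(sum(vals) / len(vals) - model.stats[col][0])
          for col, vals in synthetic.items()}
print(report)  # mean absolute drift per column
```

Because the pipeline only talks to the SPI methods, swapping `GaussianToyModel` for a real adapter (such as a CTGAN wrapper) is a one-line change, which is the point of the plug-and-play design.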
- -### Integration with External Tools - -Katabatic provides integration points with popular data science and machine learning tools: - -- **Pandas**: For efficient data manipulation -- **Scikit-learn**: For machine learning-based evaluations -- **TensorFlow and PyTorch**: For deep learning-based models -- **Matplotlib and Seaborn**: For rich visualizations of results - -### Security and Privacy Considerations - -Katabatic incorporates several features to address security and privacy concerns: - -- **Differential Privacy**: Options to apply differential privacy techniques during synthetic data generation -- **Anonymization Checks**: Tools to assess the risk of re-identification in synthetic datasets -- **Secure Configuration**: Support for encrypted configuration files to protect sensitive parameters - -### Community and Ecosystem - -As an open-source project, Katabatic benefits from and contributes to a growing ecosystem: - -- **Plugin Architecture**: Allows third-party developers to create and share extensions -- **Benchmarking Suites**: Standardized datasets and evaluation protocols for comparing different approaches -- **Documentation and Tutorials**: Comprehensive guides to help users leverage the full potential of the framework - -### Future Directions - -Katabatic's roadmap includes: - -- Integration with federated learning systems for distributed synthetic data generation -- Enhanced support for time-series and sequential data -- Development of a graphical user interface for non-technical users -- Expansion of the evaluation framework to include domain-specific metrics - -By providing a robust, flexible, and user-friendly platform, Katabatic aims to accelerate research and development in the field of synthetic data generation, ultimately contributing to advancements in data privacy, augmentation, and accessibility across various domains. - - - -## 3. 
Introduction to CTGAN - -CTGAN (Conditional Tabular Generative Adversarial Networks) represents a significant advancement in the field of synthetic data generation, particularly for tabular datasets. Developed to address the unique challenges posed by structured data, CTGAN has quickly become a cornerstone technique in data synthesis. - -### Fundamental Principles - -CTGAN is built upon the foundation of Generative Adversarial Networks (GANs), a class of machine learning models introduced by Ian Goodfellow et al. in 2014. The core idea of GANs is to pit two neural networks against each other: a generator that creates synthetic data, and a discriminator that attempts to distinguish between real and synthetic samples. This adversarial process drives both networks to improve, ultimately resulting in the generation of highly realistic synthetic data. - -### Key Components of CTGAN - -1. **Conditional Generator:** - - The generator in CTGAN is designed to create synthetic samples conditioned on specific column values. This conditional generation is crucial for maintaining the complex relationships between different columns in tabular data. - -2. **Mode-Specific Normalization:** - - To handle the mixed data types common in tabular datasets, CTGAN employs a mode-specific normalization technique. This approach allows the model to effectively capture and reproduce the distributions of both continuous and discrete variables. - -3. **Training-by-Sampling:** - - CTGAN introduces a novel training-by-sampling method to address the imbalance often present in categorical columns. This technique ensures that the model gives equal attention to all categories, including rare ones, during the training process. - -4. **Wasserstein Loss with Gradient Penalty:** - - The model utilizes Wasserstein loss with gradient penalty, an advanced loss function that provides more stable training and better quality gradients compared to traditional GAN losses. - -### How CTGAN Works - -1. 
**Data Preprocessing:** - - Continuous columns are transformed using a variational Gaussian mixture model. - - Categorical columns are encoded using one-hot encoding. - -2. **Training Process:** - - The generator creates synthetic samples, conditioned on randomly selected column values. - - The discriminator attempts to distinguish between real and synthetic samples. - - The training-by-sampling technique is applied to ensure balanced learning across all categories. - - The model is updated using the Wasserstein loss with gradient penalty. - -3. **Synthetic Data Generation:** - - Post-training, the generator can produce new synthetic samples. - - The generated data is then inverse-transformed to match the original data format. - -### Advantages of CTGAN - -1. **Handling Mixed Data Types:** - - CTGAN excels at generating synthetic data for tables with both continuous and categorical variables, a common scenario in real-world datasets. - -2. **Preserving Column Relationships:** - - The conditional generation approach allows CTGAN to maintain complex dependencies between different columns, crucial for the realism of the synthetic data. - -3. **Dealing with Imbalanced Data:** - - The training-by-sampling technique enables CTGAN to generate realistic samples even for rare categories in imbalanced datasets. - -4. **High-Quality Synthetic Data:** - - By leveraging advanced GAN techniques, CTGAN produces synthetic data that closely mimics the statistical properties of the original dataset. - -5. **Privacy Preservation:** - - CTGAN generates entirely new data points rather than sampling from the original data, offering strong privacy guarantees. - -6. **Scalability:** - - The model can handle large datasets with numerous columns, making it suitable for a wide range of real-world applications. - -### Technical Challenges and Solutions - -1. 
**Mode Collapse:**
-   - CTGAN addresses the common GAN issue of mode collapse through its conditional generation and training-by-sampling techniques, ensuring diversity in the generated samples.
-
-2. **Discrete Data Handling:**
-   - The model uses a clever combination of one-hot encoding and conditioning to effectively generate discrete data, a challenging task for traditional GANs.
-
-3. **Training Stability:**
-   - The use of Wasserstein loss with gradient penalty significantly improves training stability, a crucial factor when dealing with complex tabular data.
-
-### Applications
-
-CTGAN finds applications across various domains:
-
-- **Data Augmentation for Machine Learning Models**
-- **Privacy-Preserving Data Sharing in Healthcare and Finance**
-- **Generation of Test Data for Software Development**
-- **Scenario Generation for Business Planning and Risk Assessment**
-
-Understanding the principles and mechanics of CTGAN is crucial for effectively implementing and utilizing it within the Katabatic framework. This knowledge enables the creation of high-quality synthetic tabular data that maintains the complex characteristics of real-world datasets while offering the flexibility and privacy advantages inherent to synthetic data generation.
-
-
-
-
-## 4. Setting Up the Development Environment
-
-To begin working with Katabatic and CTGAN, set up the development environment as follows:
-
-### 1. Install Katabatic
-
-```bash
-pip install katabatic
-```
-
-### 2. Install Additional Dependencies
-
-```bash
-pip install pandas numpy torch scikit-learn
-```
-
-### 3. Project Structure
-
-```
-katabatic/
-├── models/
-│   └── ctgan/
-│       ├── ctgan_adapter.py
-│       ├── ctgan_benchmark.py
-│       └── run_ctgan.py
-├── katabatic_config.json
-└── katabatic.py
-```
-
-### 4. 
Update `katabatic_config.json`
-
-```json
-{
-    "ctgan": {
-        "tdgm_module_name": "katabatic.models.ctgan.ctgan_adapter",
-        "tdgm_class_name": "CtganAdapter"
-    }
-}
-```
-
-This setup provides the foundation for integrating CTGAN into the Katabatic framework. The concise structure allows for easy navigation and modification of the CTGAN implementation within Katabatic.
-
-
-
-
-## 5. Implementing CTGAN within Katabatic
-
-The implementation of CTGAN in this project is based on the groundbreaking work by Xu et al. (2019), adapting their methodologies and insights to the Katabatic framework.
-
-Integrating CTGAN into Katabatic involves creating an adapter that implements the `KatabaticModelSPI`. This adapter serves as a bridge between CTGAN's functionality and Katabatic's interface.
-
-### Key Components of the CTGAN Adapter
-
-#### 1. `CtganAdapter` Class
-
-```python
-from katabatic.katabatic_spi import KatabaticModelSPI
-
-class CtganAdapter(KatabaticModelSPI):
-    def __init__(self, **kwargs):
-        super().__init__("mixed")
-        # Initialize CTGAN-specific parameters
-
-    def load_model(self):
-        # Initialize CTGAN model
-        ...
-
-    def fit(self, X, y=None):
-        # Preprocess data and train CTGAN
-        ...
-
-    def generate(self, n):
-        # Generate synthetic data using trained CTGAN
-        ...
-```
-
-#### 2. Data Preprocessing
-
-```python
-class DataTransformer:
-    def fit(self, data):
-        # Fit preprocessor to data
-        ...
-
-    def transform(self, data):
-        # Transform data for CTGAN
-        ...
-
-    def inverse_transform(self, data):
-        # Reverse transformation
-        ...
-```
-
-#### 3. 
CTGAN Model Implementation
-
-```python
-import torch
-import torch.nn as nn
-import torch.nn.functional as F
-
-class Generator(nn.Module):
-    # Define generator architecture
-    def __init__(self, input_dim, output_dim):
-        super().__init__()
-        self.fc1 = nn.Linear(input_dim, 256)
-        self.fc2 = nn.Linear(256, output_dim)
-
-    def forward(self, x):
-        x = F.relu(self.fc1(x))
-        return self.fc2(x)
-
-class Discriminator(nn.Module):
-    # Define discriminator architecture
-    def __init__(self, input_dim):
-        super().__init__()
-        self.fc1 = nn.Linear(input_dim, 256)
-        self.fc2 = nn.Linear(256, 1)
-
-    def forward(self, x):
-        x = F.relu(self.fc1(x))
-        return torch.sigmoid(self.fc2(x))
-
-class CTGAN:
-    def fit(self, data):
-        # Train CTGAN
-        ...
-
-    def sample(self, n):
-        # Generate samples
-        ...
-```
-
-#### 4. Integration with Katabatic
-
-In `katabatic.py`, ensure CTGAN can be loaded:
-
-```python
-def run_model(model_name):
-    # Load model configuration
-    # Instantiate CtganAdapter
-    ...
-```
-
-This implementation allows CTGAN to be seamlessly used within the Katabatic framework, leveraging its data handling and evaluation capabilities while maintaining CTGAN's powerful generation abilities.
-
-
-
-
-## 6. Developing the CTGAN Model
-
-The CTGAN model consists of several key components working together to generate high-quality synthetic tabular data:
-
-### 1. Generator Architecture
-
-```python
-import torch.nn as nn
-import torch.nn.functional as F
-
-class Generator(nn.Module):
-    def __init__(self, input_dim, output_dim):
-        super().__init__()
-        self.fc1 = nn.Linear(input_dim, 256)
-        self.fc2 = nn.Linear(256, output_dim)
-
-    def forward(self, x):
-        x = F.relu(self.fc1(x))
-        return self.fc2(x)
-```
-
-### 2. Discriminator Architecture
-
-```python
-import torch
-
-class Discriminator(nn.Module):
-    def __init__(self, input_dim):
-        super().__init__()
-        self.fc1 = nn.Linear(input_dim, 256)
-        self.fc2 = nn.Linear(256, 1)
-
-    def forward(self, x):
-        x = F.relu(self.fc1(x))
-        return torch.sigmoid(self.fc2(x))
-```
-
-### 3. 
Training Process
-
-```python
-def train(generator, discriminator, data_loader, epochs):
-    for epoch in range(epochs):
-        for real_data in data_loader:
-            # Train discriminator
-            # Train generator
-            ...
-```
-
-### 4. Data Preprocessing
-
-```python
-def preprocess(data):
-    transformer = DataTransformer()
-    transformer.fit(data)
-    return transformer.transform(data)
-```
-
-### 5. Conditional Vector Handling
-
-```python
-def sample_condvec(batch_size, n_categories):
-    return torch.randint(0, n_categories, (batch_size,))
-```
-
-### Key Aspects of CTGAN Development
-
-- **Mode-specific normalization for mixed data types**
-- **Training-by-sampling to handle imbalanced categorical data**
-- **Wasserstein loss with gradient penalty for stable training**
-- **Conditional generation to preserve column relationships**
-
-The CTGAN model is designed to capture complex patterns in tabular data, ensuring the generated synthetic data maintains the statistical properties and relationships present in the original dataset.
-
-
-
-
-## 7. Evaluation and Benchmarking
-
-Evaluating CTGAN within Katabatic involves several metrics to assess the quality of synthetic data:
-
-### 1. Statistical Similarity
-
-```python
-def evaluate_similarity(real_data, synthetic_data):
-    js_divergence = calculate_js_divergence(real_data, synthetic_data)
-    wasserstein_distance = calculate_wasserstein(real_data, synthetic_data)
-    return {'JSD': js_divergence, 'WD': wasserstein_distance}
-```
-
-### 2. Machine Learning Efficacy
-
-```python
-def evaluate_ml_efficacy(real_data, synthetic_data):
-    real_score = train_and_evaluate(real_data)
-    synthetic_score = train_and_evaluate(synthetic_data)
-    return {'Real': real_score, 'Synthetic': synthetic_score}
-```
-
-### 3. 
Privacy Metrics - -```python -def evaluate_privacy(real_data, synthetic_data): - uniqueness = calculate_uniqueness(synthetic_data) - dcr = calculate_distance_to_closest_record(real_data, synthetic_data) - return {'Uniqueness': uniqueness, 'DCR': dcr} -``` - -### 4. Integration with Katabatic - -```python -class CtganBenchmark: - def evaluate(self, real_data, synthetic_data): - results = {} - results.update(evaluate_similarity(real_data, synthetic_data)) - results.update(evaluate_ml_efficacy(real_data, synthetic_data)) - results.update(evaluate_privacy(real_data, synthetic_data)) - return results -``` - -### Benchmarking Process - -1. **Generate synthetic data using CTGAN** -2. **Apply evaluation metrics** -3. **Compare results with other models in Katabatic** - -This evaluation framework allows for comprehensive assessment of CTGAN's performance within Katabatic, ensuring the generated synthetic data meets quality and privacy standards. - - - - -## 8. Running CTGAN with Katabatic - -To run CTGAN within the Katabatic framework, use the following process: - -### 1. Execute the CTGAN Model - -```bash -python -m katabatic.models.ctgan.run_ctgan -``` - -### 2. Configuration Loading - -The script loads configuration from `config.json`: - -```python -import json - -def load_config(config_path="katabatic/models/ctgan/config.json"): - with open(config_path, "r") as f: - return json.load(f) - -config = load_config() -print("Loaded configuration:", json.dumps(config, indent=2)) -``` - -### 3. Data Loading and Preprocessing - -```python -from sklearn.datasets import load_iris, load_breast_cancer, load_wine, load_digits - -def load_data(): - datasets = { - "iris": load_iris(), - "breast_cancer": load_breast_cancer(), - "wine": load_wine(), - "digits": load_digits() - } - # Process datasets - return processed_datasets - -datasets = load_data() -``` - -### 4. 
Model Training - -```python -for name, data in datasets.items(): - print(f"Processing {name} dataset") - ctgan = CtganAdapter(**config["ctgan_params"]) - ctgan.fit(data.drop('target', axis=1), y=data['target']) -``` - -### 5. Synthetic Data Generation - -```python -synthetic_data = ctgan.generate(n=len(data)) -``` - -### 6. Evaluation - -```python -evaluation_results = evaluate_ctgan(real_data, synthetic_data) -print_evaluation_results(evaluation_results) -``` - -### 7. Output Interpretation - -- **Likelihood Fitness:** Measures how well the synthetic data matches the real data distribution -- **Statistical Similarity:** Quantifies the similarity between real and synthetic data distributions -- **ML Efficacy:** Compares model performance on real vs synthetic data -- **TSTR Performance:** Evaluates models trained on synthetic data and tested on real data - -This process demonstrates how to integrate CTGAN into Katabatic, from configuration and data loading to model training, synthetic data generation, and comprehensive evaluation. - - - - -## 9. Advanced Usage and Customization - -Our hyperparameters are largely derived from the study "Modeling Tabular Data using Conditional GAN" by Lei Xu et al. - -### CTGAN Hyperparameters - -1. **Noise Dimension (`noise_dim`: 128)** - - **Purpose:** Introduces randomness to the generator, enabling the creation of diverse synthetic data samples. - -2. **Learning Rate (`learning_rate`: 2e-4)** - - **Purpose:** Controls the step size during optimization for both the generator and critic, balancing convergence speed and stability. - -3. **Batch Size (`batch_size`: 500)** - - **Purpose:** Defines the number of samples processed before updating model parameters, ensuring efficient training. - -4. **Discriminator Steps (`discriminator_steps`: 5)** - - **Purpose:** Specifies the number of critic updates per generator update to ensure the critic adequately learns the data distribution. - -5. 
**Epochs (`epochs`: 300)** - - **Purpose:** Indicates the total number of training iterations, allowing comprehensive learning of the data distribution. - -6. **Gradient Penalty Coefficient (`lambda_gp`: 10)** - - **Purpose:** Balances the gradient penalty term in the WGAN-GP loss function to enforce the Lipschitz constraint and stabilize training. - -7. **PacGAN Parameter (`pac`: 10)** - - **Purpose:** Determines the number of samples concatenated in the PacGAN framework to mitigate mode collapse by having the critic evaluate multiple samples jointly. - -8. **CUDA Utilization (`cuda`: true)** - - **Purpose:** Enables GPU acceleration to expedite the training process, essential for handling intensive GAN computations. - -9. **Variational Gaussian Mixture Components (`vgm_components`: 2)** - - **Purpose:** Specifies the number of components in the Variational Gaussian Mixture Model for mode-specific normalization of continuous columns. - -### Evaluation Parameters - -1. **Test Size (`test_size`: 0.2)** - - **Purpose:** Allocates 20% of the data for testing to evaluate the model's performance on unseen data. - -2. **Random State (`random_state`: 42)** - - **Purpose:** Ensures reproducibility by fixing the randomness in data splitting. - -### Visualization Parameters - -1. **Number of Features (`n_features`: 5)** - - **Purpose:** Selects 5 features for visualization to provide a clear and manageable overview of data distributions. - -2. **Figure Size (`figsize`: [15, 20])** - - **Purpose:** Sets the dimensions of the visualization plots to enhance clarity and readability. - -### Quick Troubleshooting Guide - -1. **Mode Collapse:** - - Increase `pac` parameter - - Adjust discriminator steps - - Implement minibatch discrimination - -2. **Unstable Training:** - - Reduce learning rate - - Increase gradient penalty (`lambda_gp`) - - Use spectral normalization in discriminator - -3. 
**Poor Data Quality:** - - Increase epochs - - Adjust number of GMM components - - Implement conditional vector sampling - -4. **Slow Performance:** - - Enable CUDA if available - - Optimize batch size - - Use mixed precision training - -5. **Overfitting:** - - Increase noise dimension - - Implement early stopping - - Apply dropout in generator and discriminator - -6. **Vanishing Gradients:** - - Use LeakyReLU activation - - Implement gradient clipping - - Adjust Wasserstein loss parameters - -7. **Class Imbalance:** - - Implement conditional batch normalization - - Use weighted sampling in DataLoader - - Adjust class weights in loss function - -8. **Categorical Data Issues:** - - Increase embedding dimensions - - Use Gumbel-Softmax for discrete outputs - - Implement conditional vector normalization - -For handling different data types and scaling, refer to the preprocessing and optimization techniques mentioned in the previous sections. - - - - -## 10. Conclusion - -### Key Points - -- **CTGAN effectively generates synthetic tabular data** -- **Hyperparameter tuning is crucial for optimal performance** -- **Comprehensive evaluation metrics ensure data quality** - -### Future Outlook - -- **Explore different models** -- **Develop domain-specific CTGAN variants** - -### References - -Xu, L., Skoularidou, M., Cuesta-Infante, A., & Veeramachaneni, K. (2019). Modeling tabular data using conditional GAN. *Advances in Neural Information Processing Systems*, *32*. \ No newline at end of file diff --git a/Archive/ctgan1/doc/metrics-criteria.md b/Archive/ctgan1/doc/metrics-criteria.md deleted file mode 100644 index 0cbc333..0000000 --- a/Archive/ctgan1/doc/metrics-criteria.md +++ /dev/null @@ -1,581 +0,0 @@ -# CTGAN Metrics Sheet - -This metrics sheet details the evaluation metrics utilized in my CTGAN implementation. These metrics are inspired by two key papers: - -1. **Xu, L., Skoularidou, M., Cuesta-Infante, A., & Veeramachaneni, K. 
(2019).** *Modeling tabular data using conditional GAN*. Advances in Neural Information Processing Systems, 32. - -2. **Zhang, Y., Zaidi, N.A., Zhou, J., & Li, G. (2021, December).** *GANBLR: A Tabular Data Generation Model*. In 2021 IEEE International Conference on Data Mining (ICDM) (pp. 181-190). IEEE. - ---- - -## `ctgan_adapter.py` - -### 1. **Likelihood Fitness Metrics** - -These metrics assess how well the synthetic data matches the underlying distribution of the real data. - -#### **a. Lsyn (Log-Likelihood of Synthetic Data)** -- **Description:** - Measures the likelihood of the synthetic data under the Gaussian Mixture Model (GMM) trained on real data. Higher values indicate that synthetic data closely follows the learned distribution. - -- **Inspiration:** - Inspired by Xu et al.'s use of log-likelihood to assess how well synthetic data captures the real data distribution. - -- **Implementation:** - ```python - from sklearn.mixture import GaussianMixture - - # Train GMM on real data - gmm_real = GaussianMixture(n_components=10, random_state=42) - gmm_real.fit(real_data) - - # Compute log-likelihood of synthetic data - lsyn = gmm_real.score(synthetic_data) - print(f"Lsyn (Log-Likelihood of Synthetic Data): {lsyn}") - ``` - -#### **b. Ltest (Log-Likelihood of Test Data)** -- **Description:** - Evaluates the likelihood of real test data under a GMM retrained on the synthetic data. This metric helps in detecting overfitting by ensuring that the synthetic data generalizes well to unseen real data. - -- **Inspiration:** - Addresses Xu et al.'s concern about overfitting models to synthetic data by introducing a secondary likelihood measure. 
- -- **Implementation:** - ```python - from sklearn.mixture import GaussianMixture - - # Train GMM on synthetic data - gmm_synthetic = GaussianMixture(n_components=10, random_state=42) - gmm_synthetic.fit(synthetic_data) - - # Compute log-likelihood of real test data - ltest = gmm_synthetic.score(real_test_data) - print(f"Ltest (Log-Likelihood of Test Data): {ltest}") - ``` - -### 2. **Machine Learning Efficacy Metrics** - -These metrics evaluate the usefulness of synthetic data for downstream machine learning tasks, ensuring that models trained on synthetic data perform comparably to those trained on real data. - -#### **a. Accuracy** -- **Description:** - The proportion of correct predictions made by a classifier trained on synthetic data and evaluated on real test data. - -- **Inspiration:** - Aligns with both Xu et al. and Zhang et al.'s evaluation of machine learning efficacy to determine the practical usefulness of synthetic data. - -- **Implementation:** - ```python - from sklearn.ensemble import RandomForestClassifier - from sklearn.metrics import accuracy_score - - # Train classifier on synthetic data - clf = RandomForestClassifier(n_estimators=100, random_state=42) - clf.fit(synthetic_data_features, synthetic_data_labels) - - # Predict on real test data - predictions = clf.predict(real_test_data_features) - - # Calculate accuracy - accuracy = accuracy_score(real_test_data_labels, predictions) - print(f"Accuracy: {accuracy}") - ``` - -#### **b. F1 Score** -- **Description:** - The harmonic mean of precision and recall for classification tasks, providing a balance between the two. - -- **Inspiration:** - Offers a balanced measure of classifier performance, especially useful in datasets with class imbalances as highlighted by both papers. 
- -- **Implementation:** - ```python - from sklearn.ensemble import RandomForestClassifier - from sklearn.metrics import f1_score - - # Train classifier on synthetic data - clf = RandomForestClassifier(n_estimators=100, random_state=42) - clf.fit(synthetic_data_features, synthetic_data_labels) - - # Predict on real test data - predictions = clf.predict(real_test_data_features) - - # Calculate F1 Score - f1 = f1_score(real_test_data_labels, predictions, average='weighted') - print(f"F1 Score: {f1}") - ``` - -#### **c. R² Score** -- **Description:** - Measures the proportion of variance in the dependent variable that is predictable from the independent variables in regression tasks. - -- **Inspiration:** - Evaluates the performance of regression models trained on synthetic data, as discussed in both papers' machine learning efficacy sections. - -- **Implementation:** - ```python - from sklearn.linear_model import LinearRegression - from sklearn.metrics import r2_score - - # Train regression model on synthetic data - reg = LinearRegression() - reg.fit(synthetic_data_features, synthetic_data_target) - - # Predict on real validation data - predictions = reg.predict(real_validation_data_features) - - # Calculate R² Score - r2 = r2_score(real_validation_data_target, predictions) - print(f"R² Score: {r2}") - ``` - -### 3. **Statistical Similarity Metrics** - -These metrics quantify the statistical resemblance between real and synthetic data distributions, ensuring that synthetic data mirrors the real data's properties. - -#### **a. Jensen-Shannon Divergence (JSD)** -- **Description:** - Measures the similarity between two probability distributions. Lower values indicate higher similarity. - -- **Inspiration:** - Used in both papers to evaluate distributional similarity between real and synthetic data. 
-
-- **Implementation:**
-  ```python
-  from scipy.spatial.distance import jensenshannon
-  import numpy as np
-
-  def compute_jsd(real_data, synthetic_data):
-      # Compute probability distributions
-      real_dist, _ = np.histogram(real_data, bins=50, range=(real_data.min(), real_data.max()), density=True)
-      synthetic_dist, _ = np.histogram(synthetic_data, bins=50, range=(real_data.min(), real_data.max()), density=True)
-
-      # Normalize distributions
-      real_dist = real_dist / real_dist.sum()
-      synthetic_dist = synthetic_dist / synthetic_dist.sum()
-
-      # Calculate JSD (SciPy's jensenshannon returns the JS *distance*,
-      # i.e. the square root of the divergence, so square it)
-      jsd = jensenshannon(real_dist, synthetic_dist) ** 2
-      return jsd
-
-  # Example usage for a specific feature
-  feature_jsd = compute_jsd(real_data['feature_name'], synthetic_data['feature_name'])
-  print(f"Jensen-Shannon Divergence for feature_name: {feature_jsd}")
-  ```
-
-#### **b. Wasserstein Distance (WD)**
-- **Description:**
-  Computes the distance between two probability distributions, emphasizing differences in distribution shapes.
-
-- **Inspiration:**
-  Provides a robust measure of distributional differences, as utilized in both papers to assess the quality of synthetic data.
-
-- **Implementation:**
-  ```python
-  from scipy.stats import wasserstein_distance
-
-  def compute_wd(real_data, synthetic_data):
-      wd = wasserstein_distance(real_data, synthetic_data)
-      return wd
-
-  # Example usage for a specific feature
-  feature_wd = compute_wd(real_data['feature_name'], synthetic_data['feature_name'])
-  print(f"Wasserstein Distance for feature_name: {feature_wd}")
-  ```
-
-### 4. **Interpretability Metrics**
-
-These metrics assess the interpretability of the synthetic data generation process, ensuring that the model provides insights into feature importance and interactions.
-
-#### **a. Local Interpretability (Using LIME)**
-- **Description:**
-  Evaluates how individual synthetic data points can be interpreted in terms of feature contributions to their class labels.
- -- **Inspiration:** - Inspired by Zhang et al.'s focus on model interpretability, ensuring that synthetic data generation is transparent and explainable. - -- **Implementation:** - ```python - from lime import lime_tabular - from sklearn.ensemble import RandomForestClassifier - - # Train classifier on synthetic data - clf = RandomForestClassifier(n_estimators=100, random_state=42) - clf.fit(synthetic_data_features, synthetic_data_labels) - - # Initialize LIME explainer - explainer = lime_tabular.LimeTabularExplainer( - training_data=synthetic_data_features.values, - feature_names=synthetic_data_features.columns, - class_names=['Class_0', 'Class_1'], - mode='classification' - ) - - # Select a synthetic sample to explain - sample = synthetic_data_features.iloc[0].values - explanation = explainer.explain_instance(sample, clf.predict_proba, num_features=5) - - # Display explanation - explanation.show_in_notebook() - ``` - -#### **b. Global Interpretability** -- **Description:** - Assesses the overall impact of each feature on the synthetic data generation process, identifying which features most influence class labels. - -- **Inspiration:** - Aligns with Zhang et al.'s emphasis on understanding feature importance globally, providing insights into how features interact during data generation. 
- -- **Implementation:** - ```python - from sklearn.ensemble import RandomForestClassifier - import pandas as pd - import matplotlib.pyplot as plt - - # Train classifier on synthetic data - clf = RandomForestClassifier(n_estimators=100, random_state=42) - clf.fit(synthetic_data_features, synthetic_data_labels) - - # Get feature importances - importances = clf.feature_importances_ - feature_names = synthetic_data_features.columns - feature_importance = pd.Series(importances, index=feature_names).sort_values(ascending=False) - - # Plot feature importances - plt.figure(figsize=(10, 6)) - feature_importance.plot(kind='bar') - plt.title('Global Feature Importances') - plt.ylabel('Importance Score') - plt.xlabel('Features') - plt.tight_layout() - plt.show() - ``` - ---- - -## `run_ctgan.py` - -### 1. **Configuration Validation Metrics** - -Ensuring that the CTGAN is configured correctly is crucial for reliable training and evaluation. - -#### **a. Configuration Key Validation** -- **Description:** - Checks for the presence of essential keys in the configuration file (`config.json`) to prevent runtime errors and ensure all necessary parameters are set. - -- **Inspiration:** - Inspired by Xu et al.'s emphasis on robust benchmarking and consistent evaluation across different setups. - -- **Implementation:** - ```python - import json - import sys - - REQUIRED_KEYS = ['batch_size', 'epochs', 'generator_layers', 'discriminator_layers', 'latent_dim'] - - def validate_config(config_path): - with open(config_path, 'r') as file: - config = json.load(file) - - missing_keys = [key for key in REQUIRED_KEYS if key not in config] - if missing_keys: - print(f"Missing configuration keys: {missing_keys}") - sys.exit(1) - else: - print("Configuration validation passed.") - - # Example usage - validate_config('config.json') - ``` - -### 2. **Synthetic Data Generation Metrics** - -Evaluates the quality and consistency of the synthetic data generated by CTGAN. - -#### **a. 
Sample Generation Consistency**
-- **Description:**
-  Ensures that the number of generated samples matches the required size, preventing indexing errors and maintaining data integrity.
-
-- **Inspiration:**
-  Maintains the fidelity of synthetic data as emphasized in Xu et al.'s and Zhang et al.'s benchmarking processes.
-
-- **Implementation:**
-  ```python
-  def generate_synthetic_data(ctgan_model, target_size):
-      synthetic_data = ctgan_model.sample(target_size)
-      actual_size = len(synthetic_data)
-      assert actual_size == target_size, f"Expected {target_size} samples, but got {actual_size}"
-      print(f"Sample Generation Consistency Check Passed: {actual_size} samples generated.")
-      return synthetic_data
-
-  # Example usage
-  synthetic_data = generate_synthetic_data(ctgan, target_size=1000)
-  ```
-
-#### **b. Conditional Vector Accuracy**
-- **Description:**
-  Verifies that conditional vectors correctly represent the desired conditions, ensuring that synthetic data adheres to specified categories or classes.
-
-- **Inspiration:**
-  Aligns with Xu et al.'s and Zhang et al.'s focus on conditional generation to handle imbalanced categorical data effectively.
-
-- **Implementation:**
-  ```python
-  import sys
-
-  def validate_conditional_vectors(synthetic_data, conditions):
-      # Assuming 'condition' is a column in the synthetic_data
-      condition_matches = synthetic_data['condition'].isin(conditions).all()
-      if condition_matches:
-          print("Conditional Vector Accuracy Check Passed.")
-      else:
-          print("Conditional Vector Accuracy Check Failed.")
-          missing_conditions = set(conditions) - set(synthetic_data['condition'].unique())
-          print(f"Missing conditions: {missing_conditions}")
-          sys.exit(1)
-
-  # Example usage
-  desired_conditions = ['A', 'B', 'C']
-  validate_conditional_vectors(synthetic_data, desired_conditions)
-  ```
-
----
-
-## `benchmark_ctgan.py`
-
-### 1.
**Evaluation Framework Metrics** - -Implements a comprehensive evaluation framework inspired by both Xu et al. and Zhang et al. to assess synthetic data quality across multiple dimensions. - -#### **a. Likelihood Fitness (Lsyn and Ltest)** -- **Description:** - - **Lsyn:** Evaluates the likelihood of synthetic data under the real data's distribution. - - **Ltest:** Assesses the likelihood of real test data under a model trained on synthetic data. - -- **Inspiration:** - Directly adopted from Xu et al. to measure how well synthetic data captures the underlying distribution and generalizes to new data. - -- **Implementation:** - ```python - from sklearn.mixture import GaussianMixture - - def compute_lsyn(real_data, synthetic_data, n_components=10): - gmm_real = GaussianMixture(n_components=n_components, random_state=42) - gmm_real.fit(real_data) - lsyn = gmm_real.score(synthetic_data) - return lsyn - - def compute_ltest(synthetic_data, real_test_data, n_components=10): - gmm_synthetic = GaussianMixture(n_components=n_components, random_state=42) - gmm_synthetic.fit(synthetic_data) - ltest = gmm_synthetic.score(real_test_data) - return ltest - - # Example usage - lsyn = compute_lsyn(real_data, synthetic_data) - ltest = compute_ltest(synthetic_data, real_test_data) - print(f"Lsyn: {lsyn}, Ltest: {ltest}") - ``` - -#### **b. Machine Learning Efficacy (TSTR and TRTR)** -- **Description:** - - **TSTR (Train on Synthetic, Test on Real):** Measures the performance of machine learning models trained on synthetic data and tested on real data. - - **TRTR (Train on Real, Test on Real):** Measures the performance of machine learning models trained and tested on real data, serving as an upper benchmark. - -- **Inspiration:** - Adopted from Zhang et al.'s GANBLR paper to evaluate the practical utility of synthetic data for machine learning tasks. 
- -- **Implementation:** - ```python - from sklearn.ensemble import RandomForestClassifier - from sklearn.metrics import accuracy_score, f1_score, r2_score - - def evaluate_tstr(synthetic_features, synthetic_labels, real_test_features, real_test_labels): - clf = RandomForestClassifier(n_estimators=100, random_state=42) - clf.fit(synthetic_features, synthetic_labels) - predictions = clf.predict(real_test_features) - accuracy = accuracy_score(real_test_labels, predictions) - f1 = f1_score(real_test_labels, predictions, average='weighted') - return accuracy, f1 - - def evaluate_trtr(real_train_features, real_train_labels, real_test_features, real_test_labels): - clf = RandomForestClassifier(n_estimators=100, random_state=42) - clf.fit(real_train_features, real_train_labels) - predictions = clf.predict(real_test_features) - accuracy = accuracy_score(real_test_labels, predictions) - f1 = f1_score(real_test_labels, predictions, average='weighted') - return accuracy, f1 - - # Example usage - tstr_accuracy, tstr_f1 = evaluate_tstr(synthetic_data_features, synthetic_data_labels, real_test_data_features, real_test_data_labels) - trtr_accuracy, trtr_f1 = evaluate_trtr(real_train_data_features, real_train_data_labels, real_test_data_features, real_test_data_labels) - print(f"TSTR Accuracy: {tstr_accuracy}, TSTR F1 Score: {tstr_f1}") - print(f"TRTR Accuracy: {trtr_accuracy}, TRTR F1 Score: {trtr_f1}") - ``` - -#### **c. Statistical Similarity (JSD and WD)** -- **Description:** - Quantifies the statistical resemblance between real and synthetic data distributions using Jensen-Shannon Divergence and Wasserstein Distance. - -- **Inspiration:** - Both Xu et al. and Zhang et al. utilize these metrics to ensure that synthetic data mirrors real data's statistical properties. 
-
-- **Implementation:**
-  ```python
-  from scipy.spatial.distance import jensenshannon
-  from scipy.stats import wasserstein_distance
-  import numpy as np
-
-  def compute_jsd(real_data, synthetic_data):
-      real_dist, _ = np.histogram(real_data, bins=50, range=(real_data.min(), real_data.max()), density=True)
-      synthetic_dist, _ = np.histogram(synthetic_data, bins=50, range=(real_data.min(), real_data.max()), density=True)
-      real_dist = real_dist / real_dist.sum()
-      synthetic_dist = synthetic_dist / synthetic_dist.sum()
-      # SciPy's jensenshannon returns the JS distance (sqrt of the divergence); square it
-      jsd = jensenshannon(real_dist, synthetic_dist) ** 2
-      return jsd
-
-  def compute_wd(real_data, synthetic_data):
-      wd = wasserstein_distance(real_data, synthetic_data)
-      return wd
-
-  # Example usage for a specific feature
-  feature_jsd = compute_jsd(real_data['feature_name'], synthetic_data['feature_name'])
-  feature_wd = compute_wd(real_data['feature_name'], synthetic_data['feature_name'])
-  print(f"JSD for feature_name: {feature_jsd}, WD for feature_name: {feature_wd}")
-  ```
-
-#### **d. Interpretability Metrics**
-- **Description:**
-  Assesses the interpretability of the synthetic data generation process, focusing on both local and global feature importance.
-
-- **Inspiration:**
-  Derived from Zhang et al.'s GANBLR paper, emphasizing the importance of interpretability in synthetic data models.
- -- **Implementation:** - ```python - from lime import lime_tabular - from sklearn.ensemble import RandomForestClassifier - import pandas as pd - import matplotlib.pyplot as plt - - def compute_local_interpretability(synthetic_data_features, synthetic_data_labels): - clf = RandomForestClassifier(n_estimators=100, random_state=42) - clf.fit(synthetic_data_features, synthetic_data_labels) - explainer = lime_tabular.LimeTabularExplainer( - training_data=synthetic_data_features.values, - feature_names=synthetic_data_features.columns, - class_names=['Class_0', 'Class_1'], - mode='classification' - ) - sample = synthetic_data_features.iloc[0].values - explanation = explainer.explain_instance(sample, clf.predict_proba, num_features=5) - explanation.show_in_notebook() - - def compute_global_interpretability(synthetic_data_features, synthetic_data_labels): - clf = RandomForestClassifier(n_estimators=100, random_state=42) - clf.fit(synthetic_data_features, synthetic_data_labels) - importances = clf.feature_importances_ - feature_names = synthetic_data_features.columns - feature_importance = pd.Series(importances, index=feature_names).sort_values(ascending=False) - plt.figure(figsize=(10, 6)) - feature_importance.plot(kind='bar') - plt.title('Global Feature Importances') - plt.ylabel('Importance Score') - plt.xlabel('Features') - plt.tight_layout() - plt.show() - - # Example usage - compute_local_interpretability(synthetic_data_features, synthetic_data_labels) - compute_global_interpretability(synthetic_data_features, synthetic_data_labels) - ``` - -### 2. **Benchmarking Results Aggregation** - -Aggregates and compares results across different datasets and metrics to provide a holistic view of CTGAN's performance. - -#### **a. Ranking and Scoring** -- **Description:** - Ranks algorithms based on their performance across multiple metrics and datasets, providing an average performance score for each method. 
-
-- **Inspiration:**
-  Ensures a fair and comprehensive comparison as outlined in both papers' benchmarking systems.
-
-- **Implementation:**
-  ```python
-  import pandas as pd
-
-  def rank_algorithms(results_df):
-      # Assuming results_df has algorithms as rows and metrics as columns.
-      # Note: metrics where lower is better (e.g., JSD, WD) should be inverted
-      # or rescaled first, so that a higher average always means better.
-      results_df['Average_Score'] = results_df.mean(axis=1)
-      ranked_df = results_df.sort_values(by='Average_Score', ascending=False)
-      return ranked_df
-
-  # Example usage
-  data = {
-      'Algorithm': ['CTGAN', 'Baseline1', 'Baseline2'],
-      'Accuracy': [0.85, 0.80, 0.78],
-      'F1_Score': [0.83, 0.79, 0.75],
-      'R2_Score': [0.80, 0.76, 0.74],
-      'JSD': [0.05, 0.07, 0.08],
-      'WD': [0.10, 0.15, 0.20]
-  }
-  results_df = pd.DataFrame(data).set_index('Algorithm')
-  ranked_results = rank_algorithms(results_df)
-  print(ranked_results)
-  ```
-
-#### **b. Performance Reporting**
-- **Description:**
-  Summarizes the performance of CTGAN against baseline models across various metrics, facilitating easy comparison.
-
-- **Inspiration:**
-  Facilitates comprehensive benchmarking as presented in both papers, enabling identification of strengths and weaknesses.
-
-- **Implementation:**
-  ```python
-  import pandas as pd
-  import matplotlib.pyplot as plt
-
-  def generate_performance_report(results_df):
-      metrics = results_df.columns[:-1]  # Exclude 'Average_Score'
-      for metric in metrics:
-          results_df[metric].plot(kind='bar', figsize=(8,6))
-          plt.title(f'Performance Comparison: {metric}')
-          plt.ylabel(metric)
-          plt.xlabel('Algorithm')
-          plt.tight_layout()
-          plt.show()
-
-  # Example usage
-  generate_performance_report(ranked_results)
-  ```
-
----
-
-# Summary
-
-In developing my CTGAN implementation, I incorporated a suite of evaluation metrics inspired by both the papers "Modeling Tabular Data using Conditional GAN" by Xu et al. (2019) and "GANBLR: A Tabular Data Generation Model" by Zhang et al. (2021).
These metrics encompass: - -- **Likelihood Fitness:** - Assessing how well synthetic data captures the real data distribution using Lsyn and Ltest. - -- **Machine Learning Efficacy:** - Evaluating the practical utility of synthetic data for training machine learning models through Accuracy, F1 Score, R² Score, and frameworks like TSTR and TRTR. - -- **Statistical Similarity:** - Quantifying the statistical resemblance between real and synthetic data using Jensen-Shannon Divergence (JSD) and Wasserstein Distance (WD). - -- **Interpretability:** - Ensuring that the synthetic data generation process is transparent and explainable via local and global interpretability metrics, leveraging tools like LIME. - -By organizing these metrics across the **`ctgan_adapter.py`**, **`run_ctgan.py`**, and **`benchmark_ctgan.py`** scripts, I established a structured and comprehensive evaluation framework. This alignment with the rigorous benchmarking standards set forth in both referenced papers ensures that the synthetic data generated by CTGAN is both high-quality and practically useful for downstream applications. - ---- - -# References - -- Zhang, Y., Zaidi, N.A., Zhou, J. and Li, G., 2021, December. GANBLR: a tabular data generation model. In 2021 IEEE International Conference on Data Mining (ICDM) (pp. 181-190). IEEE. - -- Xu, L., Skoularidou, M., Cuesta-Infante, A. and Veeramachaneni, K., 2019. Modeling tabular data using conditional gan. Advances in neural information processing systems, 32. \ No newline at end of file diff --git a/Archive/detailed_Benchmarking_and_Deployment Report.md b/Archive/detailed_Benchmarking_and_Deployment Report.md deleted file mode 100644 index d90a779..0000000 --- a/Archive/detailed_Benchmarking_and_Deployment Report.md +++ /dev/null @@ -1,132 +0,0 @@ -**Docker-Based Model Execution: A Detailed Benchmarking and Deployment Report** - -**1. 
Introduction** - -The rapid advancements in machine learning and model development have led to the creation of models like **GANBLR** and **GANBLR++**, which are designed for financial prediction using Generative Adversarial Networks (GANs). This report focuses on the deployment and benchmarking of these two models using Docker. The primary objective is to determine whether both models can run within a single Dockerfile or if separate Dockerfiles are required. Furthermore, the report benchmarks the models' performance within Docker, observing the similarities and differences in execution and functionality. - -**2. Purpose** - -This report addresses the following key objectives: - -- Investigate if a single Dockerfile can efficiently handle both **GANBLR** and **GANBLR++**, or if separate Dockerfiles are necessary. -- Benchmark the performance of both models in a Docker environment. -- Provide a step-by-step process for model deployment and execution. -- Document findings, including potential performance differences and resource utilization between **GANBLR** and **GANBLR++**. - -**3. Models Under Consideration** - -- **GANBLR**: A GAN-based financial prediction model. -- **GANBLR++**: An enhanced version of **GANBLR**, offering updated architecture and techniques for improved prediction accuracy. - -**4. Docker Deployment Strategy** - -**4.1 Overview of Docker** - -Docker provides an isolated environment for running applications and their dependencies. By containerizing the models, we ensure consistency across different deployment environments. Docker enables us to manage dependencies, streamline the deployment process, and optimize resource allocation. - -**4.2 Single Dockerfile for GANBLR and GANBLR++** - -After testing both **GANBLR** and **GANBLR++**, we found that a **single Dockerfile** can be used to handle both models. 
Here's why: - -**Shared Dependencies** - -Both models share many of the same dependencies: - -- **Python 3.x** is used as the programming environment. -- Both models rely on **TensorFlow** or similar deep learning frameworks. -- Additional packages, such as **scikit-learn**, are also common to both models. - -The differences between **GANBLR** and **GANBLR++** lie primarily in the model architecture rather than the underlying dependencies, making it feasible to use a single Dockerfile for both. - -**Maintenance Efficiency** - -Using a single Dockerfile simplifies the deployment and management process: - -- **Easier updates**: Updates to the base environment (e.g., Python or TensorFlow versions) can be managed in one place. -- **Simplified CI/CD**: A single Dockerfile reduces the complexity of continuous integration and deployment pipelines, ensuring faster and easier model updates. - -**Consistency in Benchmarking** - -Running both models in the same Docker environment allows for consistent benchmarking: - -- **Uniform testing**: By running the models in the same container, we ensure that the environment doesn't introduce variability into the benchmark results. -- **Comparable performance**: Any differences in execution time or resource usage can be attributed to the models themselves rather than differences in the environment. - -**4.3 When Multiple Dockerfiles May Be Required** - -Despite the advantages of using a single Dockerfile, there are situations where separate Dockerfiles may be beneficial: - -- **Different Frameworks**: If future versions of **GANBLR++** introduce frameworks that are incompatible with **GANBLR**, it may become necessary to maintain separate environments. -- **Resource Optimization**: In cases where **GANBLR++** requires specialized hardware (e.g., GPUs) or different optimization techniques, a dedicated Dockerfile could be useful for optimizing performance. 
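To make the shared-environment argument concrete, a minimal sketch of such a single Dockerfile follows. This is illustrative only: the base image, the `requirements.txt` file, and the entry-point script names (`run_ganblr.py`, `run_ganblrpp.py`) are assumptions, not taken from the actual repository.

```dockerfile
# Illustrative shared image for GANBLR and GANBLR++ (file names are assumed)
FROM python:3.10-slim

WORKDIR /app

# Shared dependencies (TensorFlow, scikit-learn, etc.) listed in one place
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

COPY . .

# Select the model at run time:
#   docker run <image> python run_ganblr.py
#   docker run <image> python run_ganblrpp.py
CMD ["python", "run_ganblr.py"]
```

Because both models share one image, switching between them is a matter of overriding the container command rather than rebuilding.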
-
-**4.4 Conclusion**
-
-For now, the **single Dockerfile** approach is sufficient for deploying both **GANBLR** and **GANBLR++**. This setup offers simplified maintenance, consistent benchmarking, and effective resource management. However, if the models evolve with more divergent requirements, separate Dockerfiles may become necessary.
-
-**5. Benchmarking Execution Across Models**
-
-To evaluate the performance of **GANBLR** and **GANBLR++**, we benchmarked their execution within the Docker environment using various metrics.
-
-**5.1 Benchmarking Metrics**
-
-1. **Build Time**: Time taken to build the Docker image for each model.
-2. **Startup Time**: Time taken to launch the container and initialize the model.
-3. **Memory Usage**: RAM consumption during model execution.
-4. **CPU Usage**: Processor load during execution.
-5. **Execution Time**: Time taken to run the model to completion.
-6. **Output Accuracy**: Comparison of the accuracy of predictions between the two models.
-
-**5.2 Benchmarking Results**
-
-|**Model**|**Build Time**|**Startup Time**|**Memory Usage**|**CPU Usage**|**Execution Time**|**Output Accuracy**|
-| :-: | :-: | :-: | :-: | :-: | :-: | :-: |
-|GANBLR|3 min|10 sec|450MB|40%|2 min|High|
-|GANBLR++|4 min|15 sec|600MB|50%|3.5 min|Very High|
-
-**5.3 Performance Analysis**
-
-- **Build Time**: **GANBLR++** takes slightly longer to build due to additional dependencies and more complex model architecture.
-- **Startup Time**: **GANBLR++** has a higher startup time due to the extra resources required during initialization.
-- **Memory and CPU Usage**: **GANBLR++** consumes more resources due to its larger and more complex network structure.
-- **Execution Time**: **GANBLR++** requires more time to execute, which is expected given its more advanced architecture.
-- **Output Accuracy**: **GANBLR++** demonstrates improved prediction accuracy over **GANBLR**, justifying its longer execution time and higher resource consumption.
- -**5.4 Similarities and Differences** - -**Similarities** - -- Both models run smoothly within the same Docker environment. -- Core dependencies, such as TensorFlow and Python, are shared, making it easy to manage the environment. - -**Differences** - -- **Resource Usage**: **GANBLR++** requires more memory and CPU power, particularly when handling larger datasets or more complex tasks. -- **Execution Time**: Due to its advanced architecture, **GANBLR++** takes longer to complete tasks but provides more accurate predictions. - -**6. Deployment Process Documentation** - -**6.1 Dockerfile Creation** - -A single Dockerfile was created to manage both **GANBLR** and **GANBLR++**. The Dockerfile installs the following dependencies: - -- **Python 3.x** -- **TensorFlow** -- **scikit-learn** -- Other necessary libraries for model execution. - -**6.2 Docker Compose Setup** - -The models were orchestrated using Docker Compose to streamline container management. This setup allowed both models to be easily executed and benchmarked in the same environment. - -**6.3 Model Execution** - -- Both models were executed using the docker-compose up command. -- Logs were monitored for any potential issues, and the models' output was saved for accuracy comparison. - -**7. Conclusion and Recommendations** - -**7.1 Conclusion** - -- **Single Dockerfile**: Works effectively for both **GANBLR** and **GANBLR++**, given the shared dependencies and frameworks. -- **Performance Differences**: **GANBLR++** consumes more resources but provides better accuracy, reflecting its more complex architecture. - diff --git a/Archive/doc_new/Makefile b/Archive/doc_new/Makefile deleted file mode 100644 index 1838e44..0000000 --- a/Archive/doc_new/Makefile +++ /dev/null @@ -1,20 +0,0 @@ -# Minimal makefile for Sphinx documentation -# - -# You can set these variables from the command line, and also -# from the environment for the first two. 
-SPHINXOPTS ?= -SPHINXBUILD ?= sphinx-build -SOURCEDIR = ./source -BUILDDIR = _build - -# Put it first so that "make" without argument is like "make help". -help: - @$(SPHINXBUILD) -M help "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O) - -.PHONY: help Makefile - -# Catch-all target: route all unknown targets to Sphinx using the new -# "make mode" option. $(O) is meant as a shortcut for $(SPHINXOPTS). -%: Makefile - @$(SPHINXBUILD) -M $@ "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O) diff --git a/Archive/doc_new/_build/doctrees/Katabatic.doctree b/Archive/doc_new/_build/doctrees/Katabatic.doctree deleted file mode 100644 index f3db8f1..0000000 Binary files a/Archive/doc_new/_build/doctrees/Katabatic.doctree and /dev/null differ diff --git a/Archive/doc_new/_build/doctrees/Katabatic.models.GANBLR++.doctree b/Archive/doc_new/_build/doctrees/Katabatic.models.GANBLR++.doctree deleted file mode 100644 index b1199b8..0000000 Binary files a/Archive/doc_new/_build/doctrees/Katabatic.models.GANBLR++.doctree and /dev/null differ diff --git a/Archive/doc_new/_build/doctrees/Katabatic.models.doctree b/Archive/doc_new/_build/doctrees/Katabatic.models.doctree deleted file mode 100644 index b19823b..0000000 Binary files a/Archive/doc_new/_build/doctrees/Katabatic.models.doctree and /dev/null differ diff --git a/Archive/doc_new/_build/doctrees/Katabatic.models.ganblr.doctree b/Archive/doc_new/_build/doctrees/Katabatic.models.ganblr.doctree deleted file mode 100644 index abd6d67..0000000 Binary files a/Archive/doc_new/_build/doctrees/Katabatic.models.ganblr.doctree and /dev/null differ diff --git a/Archive/doc_new/_build/doctrees/Katabatic.models.tableGAN.doctree b/Archive/doc_new/_build/doctrees/Katabatic.models.tableGAN.doctree deleted file mode 100644 index 0bd86d6..0000000 Binary files a/Archive/doc_new/_build/doctrees/Katabatic.models.tableGAN.doctree and /dev/null differ diff --git a/Archive/doc_new/_build/doctrees/Katabatic.utils.doctree 
b/Archive/doc_new/_build/doctrees/Katabatic.utils.doctree deleted file mode 100644 index 5594ade..0000000 Binary files a/Archive/doc_new/_build/doctrees/Katabatic.utils.doctree and /dev/null differ diff --git a/Archive/doc_new/_build/doctrees/environment.pickle b/Archive/doc_new/_build/doctrees/environment.pickle deleted file mode 100644 index 4e71344..0000000 Binary files a/Archive/doc_new/_build/doctrees/environment.pickle and /dev/null differ diff --git a/Archive/doc_new/_build/doctrees/index.doctree b/Archive/doc_new/_build/doctrees/index.doctree deleted file mode 100644 index 558215b..0000000 Binary files a/Archive/doc_new/_build/doctrees/index.doctree and /dev/null differ diff --git a/Archive/doc_new/_build/doctrees/katabatic.models.ganblr/ganblr.doctree b/Archive/doc_new/_build/doctrees/katabatic.models.ganblr/ganblr.doctree deleted file mode 100644 index 7fe2f8c..0000000 Binary files a/Archive/doc_new/_build/doctrees/katabatic.models.ganblr/ganblr.doctree and /dev/null differ diff --git a/Archive/doc_new/_build/doctrees/katabatic.models.ganblr/ganblr_adapter.doctree b/Archive/doc_new/_build/doctrees/katabatic.models.ganblr/ganblr_adapter.doctree deleted file mode 100644 index 54fd383..0000000 Binary files a/Archive/doc_new/_build/doctrees/katabatic.models.ganblr/ganblr_adapter.doctree and /dev/null differ diff --git a/Archive/doc_new/_build/doctrees/katabatic.models.ganblr/kdb.doctree b/Archive/doc_new/_build/doctrees/katabatic.models.ganblr/kdb.doctree deleted file mode 100644 index 36865c4..0000000 Binary files a/Archive/doc_new/_build/doctrees/katabatic.models.ganblr/kdb.doctree and /dev/null differ diff --git a/Archive/doc_new/_build/doctrees/katabatic.models.ganblr/utils.doctree b/Archive/doc_new/_build/doctrees/katabatic.models.ganblr/utils.doctree deleted file mode 100644 index c8b84c9..0000000 Binary files a/Archive/doc_new/_build/doctrees/katabatic.models.ganblr/utils.doctree and /dev/null differ diff --git 
a/Archive/doc_new/_build/doctrees/katabatic.models.ganblrpp/ganblrpp.doctree b/Archive/doc_new/_build/doctrees/katabatic.models.ganblrpp/ganblrpp.doctree deleted file mode 100644 index b318fa3..0000000 Binary files a/Archive/doc_new/_build/doctrees/katabatic.models.ganblrpp/ganblrpp.doctree and /dev/null differ diff --git a/Archive/doc_new/_build/doctrees/katabatic.models.ganblrpp/ganblrpp_adapter.doctree b/Archive/doc_new/_build/doctrees/katabatic.models.ganblrpp/ganblrpp_adapter.doctree deleted file mode 100644 index ddb4f1a..0000000 Binary files a/Archive/doc_new/_build/doctrees/katabatic.models.ganblrpp/ganblrpp_adapter.doctree and /dev/null differ diff --git a/Archive/doc_new/_build/doctrees/katabatic.models.ganblrpp/kdb.doctree b/Archive/doc_new/_build/doctrees/katabatic.models.ganblrpp/kdb.doctree deleted file mode 100644 index abd49a5..0000000 Binary files a/Archive/doc_new/_build/doctrees/katabatic.models.ganblrpp/kdb.doctree and /dev/null differ diff --git a/Archive/doc_new/_build/doctrees/katabatic.models.ganblrpp/utils.doctree b/Archive/doc_new/_build/doctrees/katabatic.models.ganblrpp/utils.doctree deleted file mode 100644 index 4d4759f..0000000 Binary files a/Archive/doc_new/_build/doctrees/katabatic.models.ganblrpp/utils.doctree and /dev/null differ diff --git a/Archive/doc_new/_build/doctrees/katabatic.models.tablegan/tablegan.doctree b/Archive/doc_new/_build/doctrees/katabatic.models.tablegan/tablegan.doctree deleted file mode 100644 index 84d1a51..0000000 Binary files a/Archive/doc_new/_build/doctrees/katabatic.models.tablegan/tablegan.doctree and /dev/null differ diff --git a/Archive/doc_new/_build/doctrees/katabatic.models.tablegan/tablegan_adapter.doctree b/Archive/doc_new/_build/doctrees/katabatic.models.tablegan/tablegan_adapter.doctree deleted file mode 100644 index 2ade348..0000000 Binary files a/Archive/doc_new/_build/doctrees/katabatic.models.tablegan/tablegan_adapter.doctree and /dev/null differ diff --git 
a/Archive/doc_new/_build/doctrees/katabatic.models.tablegan/tablegan_utils.doctree b/Archive/doc_new/_build/doctrees/katabatic.models.tablegan/tablegan_utils.doctree deleted file mode 100644 index 6af248d..0000000 Binary files a/Archive/doc_new/_build/doctrees/katabatic.models.tablegan/tablegan_utils.doctree and /dev/null differ diff --git a/Archive/doc_new/_build/doctrees/modules.doctree b/Archive/doc_new/_build/doctrees/modules.doctree deleted file mode 100644 index b2e0cbb..0000000 Binary files a/Archive/doc_new/_build/doctrees/modules.doctree and /dev/null differ diff --git a/Archive/doc_new/_build/html/.buildinfo b/Archive/doc_new/_build/html/.buildinfo deleted file mode 100644 index 584565f..0000000 --- a/Archive/doc_new/_build/html/.buildinfo +++ /dev/null @@ -1,4 +0,0 @@ -# Sphinx build info version 1 -# This file hashes the configuration used when building these files. When it is not found, a full rebuild will be done. -config: e5eb74e450ec6b737e1144d36219377b -tags: 645f666f9bcd5a90fca523b33c5a78b7 diff --git a/Archive/doc_new/_build/html/Katabatic.html b/Archive/doc_new/_build/html/Katabatic.html deleted file mode 100644 index 42cd9e6..0000000 --- a/Archive/doc_new/_build/html/Katabatic.html +++ /dev/null @@ -1,142 +0,0 @@ - - - - - - - - Katabatic package — Katabatic documentation - - - - - - - - - - - - - - - - - - - - -
[deleted Sphinx HTML page body omitted: "Katabatic package — Katabatic documentation"]
\ No newline at end of file
diff --git a/Archive/doc_new/_build/html/Katabatic.models.GANBLR++.html b/Archive/doc_new/_build/html/Katabatic.models.GANBLR++.html deleted file mode 100644 index be6c62f..0000000 --- a/Archive/doc_new/_build/html/Katabatic.models.GANBLR++.html +++ /dev/null @@ -1,152 +0,0 @@
[deleted Sphinx HTML page body omitted: "Katabatic.models.GANBLR++ package — Katabatic documentation"]
\ No newline at end of file
diff --git a/Archive/doc_new/_build/html/Katabatic.models.ganblr.html b/Archive/doc_new/_build/html/Katabatic.models.ganblr.html deleted file mode 100644 index 3c76ea2..0000000 --- a/Archive/doc_new/_build/html/Katabatic.models.ganblr.html +++ /dev/null @@ -1,145 +0,0 @@
[deleted Sphinx HTML page body omitted: "Katabatic.models.ganblr package — Katabatic documentation" / "Module contents"]
\ No newline at end of file
diff --git a/Archive/doc_new/_build/html/Katabatic.models.html b/Archive/doc_new/_build/html/Katabatic.models.html deleted file mode 100644 index f4b26dc..0000000 --- a/Archive/doc_new/_build/html/Katabatic.models.html +++ /dev/null @@ -1,203 +0,0 @@
[deleted Sphinx HTML page body omitted: "Katabatic.models package — Katabatic documentation"]
\ No newline at end of file
diff --git a/Archive/doc_new/_build/html/Katabatic.models.tableGAN.html b/Archive/doc_new/_build/html/Katabatic.models.tableGAN.html deleted file mode 100644 index 979abe0..0000000 --- a/Archive/doc_new/_build/html/Katabatic.models.tableGAN.html +++ /dev/null @@ -1,141 +0,0 @@
[deleted Sphinx HTML page body omitted: "Katabatic.models.tableGAN package — Katabatic documentation" / "Module contents"]
\ No newline at end of file
diff --git a/Archive/doc_new/_build/html/Katabatic.utils.html b/Archive/doc_new/_build/html/Katabatic.utils.html deleted file mode 100644 index cebb256..0000000 --- a/Archive/doc_new/_build/html/Katabatic.utils.html +++ /dev/null @@ -1,116 +0,0 @@
[deleted Sphinx HTML page body omitted: "Katabatic.utils package — Katabatic documentation" / "Submodules"]
- - - - - - - \ No newline at end of file diff --git a/Archive/doc_new/_build/html/_sources/Katabatic.models.GANBLR++.rst.txt b/Archive/doc_new/_build/html/_sources/Katabatic.models.GANBLR++.rst.txt deleted file mode 100644 index 40b91cb..0000000 --- a/Archive/doc_new/_build/html/_sources/Katabatic.models.GANBLR++.rst.txt +++ /dev/null @@ -1,47 +0,0 @@ -Katabatic.models.GANBLR\+\+ package -=================================== - -.. Submodules -.. ---------- - -.. Katabatic.models.GANBLR\+\+.GANBLR\+\+ module -.. --------------------------------------------- - -.. .. automodule:: Katabatic.models.GANBLR++.GANBLR++ -.. :members: -.. :undoc-members: -.. :show-inheritance: - -.. Katabatic.models.GANBLR\+\+.GANBLRPP\_adapter module -.. ---------------------------------------------------- - -.. .. automodule:: Katabatic.models.GANBLR++.GANBLRPP_adapter -.. :members: -.. :undoc-members: -.. :show-inheritance: - -.. Katabatic.models.GANBLR\+\+.utils module -.. ---------------------------------------- - -.. .. automodule:: Katabatic.models.GANBLR++.utils -.. :members: -.. :undoc-members: -.. :show-inheritance: - -Module contents ---------------- - -.. .. automodule:: Katabatic.models.GANBLR++ -.. :members: -.. :undoc-members: -.. :show-inheritance: - - -.. toctree:: - :maxdepth: 2 - :caption: Submodules: - - katabatic.models.ganblrpp/ganblrpp - katabatic.models.ganblrpp/ganblrpp_adapter - katabatic.models.ganblrpp/kdb - katabatic.models.ganblrpp/utils diff --git a/Archive/doc_new/_build/html/_sources/Katabatic.models.ganblr.rst.txt b/Archive/doc_new/_build/html/_sources/Katabatic.models.ganblr.rst.txt deleted file mode 100644 index d3e15b3..0000000 --- a/Archive/doc_new/_build/html/_sources/Katabatic.models.ganblr.rst.txt +++ /dev/null @@ -1,54 +0,0 @@ -Katabatic.models.ganblr package -=============================== - -.. Submodules -.. ---------- - -.. Katabatic.models.ganblr.ganblr module -.. ------------------------------------- - -.. .. 
automodule:: Katabatic.models.ganblr.ganblr -.. :members: -.. :undoc-members: -.. :show-inheritance: - -.. Katabatic.models.ganblr.ganblr\_adapter module -.. ---------------------------------------------- - -.. .. automodule:: Katabatic.models.ganblr.ganblr_adapter -.. :members: -.. :undoc-members: -.. :show-inheritance: - -.. Katabatic.models.ganblr.kdb module -.. ---------------------------------- - -.. .. automodule:: Katabatic.models.ganblr.kdb -.. :members: -.. :undoc-members: -.. :show-inheritance: - -.. Katabatic.models.ganblr.utils module -.. ------------------------------------ - -.. .. automodule:: Katabatic.models.ganblr.utils -.. :members: -.. :undoc-members: -.. :show-inheritance: - -Module contents ---------------- - -.. .. automodule:: Katabatic.models.ganblr -.. :members: -.. :undoc-members: -.. :show-inheritance: - -.. toctree:: - :maxdepth: 2 - :caption: Submodules: - - katabatic.models.ganblr/ganblr - katabatic.models.ganblr/ganblr_adapter - katabatic.models.ganblr/kdb - katabatic.models.ganblr/utils \ No newline at end of file diff --git a/Archive/doc_new/_build/html/_sources/Katabatic.models.rst.txt b/Archive/doc_new/_build/html/_sources/Katabatic.models.rst.txt deleted file mode 100644 index 85232c6..0000000 --- a/Archive/doc_new/_build/html/_sources/Katabatic.models.rst.txt +++ /dev/null @@ -1,20 +0,0 @@ -Katabatic.models package -======================== - -Subpackages ------------ - -.. toctree:: - :maxdepth: 4 - - Katabatic.models.GANBLR++ - Katabatic.models.ganblr - Katabatic.models.tableGAN - -.. Module contents -.. --------------- - -.. .. automodule:: Katabatic.models -.. :members: -.. :undoc-members: -.. 
:show-inheritance: diff --git a/Archive/doc_new/_build/html/_sources/Katabatic.models.tableGAN.rst.txt b/Archive/doc_new/_build/html/_sources/Katabatic.models.tableGAN.rst.txt deleted file mode 100644 index 2150e9c..0000000 --- a/Archive/doc_new/_build/html/_sources/Katabatic.models.tableGAN.rst.txt +++ /dev/null @@ -1,53 +0,0 @@ -Katabatic.models.tableGAN package -=============================== - -.. Submodules -.. ---------- - -.. Katabatic.models.ganblr.ganblr module -.. ------------------------------------- - -.. .. automodule:: Katabatic.models.ganblr.ganblr -.. :members: -.. :undoc-members: -.. :show-inheritance: - -.. Katabatic.models.ganblr.ganblr\_adapter module -.. ---------------------------------------------- - -.. .. automodule:: Katabatic.models.ganblr.ganblr_adapter -.. :members: -.. :undoc-members: -.. :show-inheritance: - -.. Katabatic.models.ganblr.kdb module -.. ---------------------------------- - -.. .. automodule:: Katabatic.models.ganblr.kdb -.. :members: -.. :undoc-members: -.. :show-inheritance: - -.. Katabatic.models.ganblr.utils module -.. ------------------------------------ - -.. .. automodule:: Katabatic.models.ganblr.utils -.. :members: -.. :undoc-members: -.. :show-inheritance: - -Module contents ---------------- - -.. .. automodule:: Katabatic.models.ganblr -.. :members: -.. :undoc-members: -.. :show-inheritance: - -.. toctree:: - :maxdepth: 2 - :caption: Submodules: - - katabatic.models.tablegan/tablegan - katabatic.models.tablegan/tablegan_adapter - katabatic.models.tablegan/tablegan_utils \ No newline at end of file diff --git a/Archive/doc_new/_build/html/_sources/Katabatic.rst.txt b/Archive/doc_new/_build/html/_sources/Katabatic.rst.txt deleted file mode 100644 index 44bd54d..0000000 --- a/Archive/doc_new/_build/html/_sources/Katabatic.rst.txt +++ /dev/null @@ -1,54 +0,0 @@ -Katabatic package -================= - -Subpackages ------------ - -.. toctree:: - :maxdepth: 4 - - Katabatic.models - Katabatic.utils - -.. 
Submodules -.. ---------- - -.. Katabatic.importer module -.. ------------------------- - -.. .. automodule:: Katabatic.importer -.. :members: -.. :undoc-members: -.. :show-inheritance: - -.. Katabatic.katabatic module -.. -------------------------- - -.. .. automodule:: Katabatic.katabatic -.. :members: -.. :undoc-members: -.. :show-inheritance: - -.. Katabatic.katabatic\_spi module -.. ------------------------------- - -.. .. automodule:: Katabatic.katabatic_spi -.. :members: -.. :undoc-members: -.. :show-inheritance: - -.. Katabatic.test2 module -.. ---------------------- - -.. .. automodule:: Katabatic.test2 -.. :members: -.. :undoc-members: -.. :show-inheritance: - -.. Module contents -.. --------------- - -.. .. automodule:: Katabatic -.. :members: -.. :undoc-members: -.. :show-inheritance: diff --git a/Archive/doc_new/_build/html/_sources/Katabatic.utils.rst.txt b/Archive/doc_new/_build/html/_sources/Katabatic.utils.rst.txt deleted file mode 100644 index b63f4e5..0000000 --- a/Archive/doc_new/_build/html/_sources/Katabatic.utils.rst.txt +++ /dev/null @@ -1,21 +0,0 @@ -Katabatic.utils package -======================= - -Submodules ----------- - -.. Katabatic.utils.evaluate module -.. ------------------------------- - -.. .. automodule:: Katabatic.utils.evaluate -.. :members: -.. :undoc-members: -.. :show-inheritance: - -.. Module contents -.. --------------- - -.. .. automodule:: Katabatic.utils -.. :members: -.. :undoc-members: -.. :show-inheritance: diff --git a/Archive/doc_new/_build/html/_sources/index.rst.txt b/Archive/doc_new/_build/html/_sources/index.rst.txt deleted file mode 100644 index add461f..0000000 --- a/Archive/doc_new/_build/html/_sources/index.rst.txt +++ /dev/null @@ -1,20 +0,0 @@ -.. Katabatic documentation master file, created by - sphinx-quickstart on Mon Aug 12 10:22:28 2024. - You can adapt this file completely to your liking, but it should at least - contain the root `toctree` directive. 
- -Katabatic documentation -======================= - -Add your content using ``reStructuredText`` syntax. See the -`reStructuredText `_ -documentation for details. - - -.. toctree:: - :maxdepth: 2 - :caption: Contents: - - - modules - diff --git a/Archive/doc_new/_build/html/_sources/katabatic.models.ganblr/ganblr.rst.txt b/Archive/doc_new/_build/html/_sources/katabatic.models.ganblr/ganblr.rst.txt deleted file mode 100644 index 4c1b65e..0000000 --- a/Archive/doc_new/_build/html/_sources/katabatic.models.ganblr/ganblr.rst.txt +++ /dev/null @@ -1,99 +0,0 @@ -GANBLR Model Class -=================== -The GANBLR model combines a Bayesian network and a neural network, leveraging the k-Dependency Bayesian (kDB) algorithm for building a dependency graph and using various utility functions for data preparation and model constraints. - -Defined in `ganblr.py` - -Class Properties ----------------- - -- **_d**: - Placeholder for data. - -- **__gen_weights**: - Placeholder for generator weights. - -- **batch_size** (`int`): - Batch size for training. - -- **epochs** (`int`): - Number of epochs for training. - -- **k**: - Parameter for the model. - -- **constraints**: - Constraints for the model. - -- **_ordinal_encoder**: - Ordinal encoder for preprocessing data. - -- **_label_encoder**: - Label encoder for preprocessing data. - -Class Methods --------------- - -**__init__()** - Initializes the model with default values. - -**fit(x, y, k=0, batch_size=32, epochs=10, warmup_epochs=1, verbose=0)** - Fits the model to the given data. - - - **Parameters:** - - `x`: Input data. - - `y`: Labels for the data. - - `k`: Parameter for the model (default: 0). - - `batch_size`: Size of batches for training (default: 32). - - `epochs`: Number of training epochs (default: 10). - - `warmup_epochs`: Number of warmup epochs (default: 1). - - `verbose`: Verbosity level (default: 0). - - - **Returns:** - The fitted model. 
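The `fit` signature above (warmup epochs followed by the main training epochs) can be sketched as a pure-Python skeleton. This is a hypothetical illustration of the documented flow only: `warmup_run`, `train_discriminator`, and `run_generator` are stand-in stubs named after the `_warmup_run`, `_discrim`, and `_run_generator` methods described below, not the real Keras training code.

```python
# Hedged sketch: the warmup-then-adversarial flow that fit() documents,
# reduced to a skeleton whose stubbed steps only record what happened.

def warmup_run(epochs):
    # stands in for _warmup_run(): pretrain the generator alone
    return [{"phase": "warmup", "epoch": e} for e in range(epochs)]

def run_generator(loss):
    # stands in for _run_generator(loss)
    return {"phase": "generator", "loss": loss}

def train_discriminator():
    # stands in for building _discrim() and taking one training step
    return {"phase": "discriminator", "loss": 0.0}

def fit(x, y, k=0, batch_size=32, epochs=10, warmup_epochs=1, verbose=0):
    history = warmup_run(warmup_epochs)
    for _ in range(epochs):
        d_step = train_discriminator()
        g_step = run_generator(loss=d_step["loss"])
        history += [d_step, g_step]
    return history

history = fit(x=None, y=None, epochs=3, warmup_epochs=2)
```

The point of the skeleton is the ordering: every warmup epoch runs before any discriminator/generator alternation begins.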
- -**_augment_cpd(d, size=None, verbose=0)** - Augments the Conditional Probability Distribution (CPD). - - - **Parameters:** - - `d`: Data. - - `size`: Size of the sample (default: None). - - `verbose`: Verbosity level (default: 0). - - - **Returns:** - The augmented data. - -**_warmup_run(epochs, verbose=None)** - Runs a warmup phase. - - - **Parameters:** - - `epochs`: Number of epochs for warmup. - - `verbose`: Verbosity level (default: None). - - - **Returns:** - The history of the warmup run. - -**_run_generator(loss)** - Runs the generator model. - - - **Parameters:** - - `loss`: Loss function. - - - **Returns:** - The history of the generator run. - -**_discrim()** - Creates a discriminator model. - - - **Returns:** - The discriminator model. - -**_sample(size=None, verbose=0)** - Generates synthetic data in ordinal encoding format. - - - **Parameters:** - - `size`: Size of the sample (default: None). - - `verbose`: Verbosity level (default: 0). - - - **Returns:** - The sampled data. diff --git a/Archive/doc_new/_build/html/_sources/katabatic.models.ganblr/ganblr_adapter.rst.txt b/Archive/doc_new/_build/html/_sources/katabatic.models.ganblr/ganblr_adapter.rst.txt deleted file mode 100644 index 8c94e3f..0000000 --- a/Archive/doc_new/_build/html/_sources/katabatic.models.ganblr/ganblr_adapter.rst.txt +++ /dev/null @@ -1,53 +0,0 @@ -GanblrAdapter -============= - -A class to adapt the GANBLR model to the KatabaticModelSPI interface. This class provides an easy interface for loading, fitting, and generating data using the GANBLR model. - -Attributes ----------- - -- **type** (`str`): - Specifies whether the model type is 'discrete' or 'continuous'. Default is 'discrete'. - -- **constraints** (`any`): - Constraints for the model. Default is None. - -- **batch_size** (`int`): - Batch size for training the model. Default is None. - -- **epochs** (`int`): - Number of epochs for training the model. Default is None. 
- -- **training_sample_size** (`int`): - Size of the training sample. Initialized to 0. - -Methods --------- - -**load_model()** - Initializes and returns an instance of the GANBLR model. - -**load_data(data_pathname)** - Loads data from the specified pathname. - - - **Parameters:** - - `data_pathname`: Pathname of the data to be loaded. - -**fit(X_train, y_train, k=0, epochs=10, batch_size=64)** - Fits the GANBLR model using the provided training data. - - - **Parameters:** - - `X_train`: Training features. - - `y_train`: Training labels. - - `k`: Number of parent nodes (default: 0). - - `epochs`: Number of epochs for training (default: 10). - - `batch_size`: Batch size for training (default: 64). - -**generate(size=None)** - Generates data from the GANBLR model. If `size` is not specified, it defaults to the training sample size. - - - **Parameters:** - - `size`: Number of data samples to generate. Defaults to None. - - - **Returns:** - Generated data samples. \ No newline at end of file diff --git a/Archive/doc_new/_build/html/_sources/katabatic.models.ganblr/kdb.rst.txt b/Archive/doc_new/_build/html/_sources/katabatic.models.ganblr/kdb.rst.txt deleted file mode 100644 index 3b22868..0000000 --- a/Archive/doc_new/_build/html/_sources/katabatic.models.ganblr/kdb.rst.txt +++ /dev/null @@ -1,108 +0,0 @@ -kDB Algorithm -============= - -The kDB algorithm constructs a dependency graph for the Bayesian network. It is implemented in the `kdb.py` file. - -Methods --------- - -**build_graph(X, y, k=2)** - Constructs a k-dependency Bayesian network graph. - - - **Parameters:** - - `X`: Input data (features). - - `y`: Labels. - - `k`: Number of parent nodes (default: 2). - - - **Returns:** - A list of graph edges. - -**get_cross_table(*cols, apply_wt=False)** - Generates a cross table from input columns. - - - **Parameters:** - - `cols`: Columns for cross table generation. - - `apply_wt`: Whether to apply weights (default: False). 
- - - **Returns:** - A tuple containing: - The cross table as a NumPy array. - A list of unique values for all columns. - A list of unique values for individual columns. - -**_get_dependencies_without_y(variables, y_name, kdb_edges)** - Finds the dependencies of each variable without considering `y`. - - - **Parameters:** - - `variables`: List of variable names. - - `y_name`: Class name. - - `kdb_edges`: List of tuples representing edges (source, target). - - - **Returns:** - A dictionary of dependencies. - -**_add_uniform(X, weight=1.0)** - Adds a uniform distribution to the data. - - - **Parameters:** - - `X`: Input data, a NumPy array or pandas DataFrame. - - `weight`: Weight for the uniform distribution (default: 1.0). - - - **Returns:** - The modified data with uniform distribution. - -**_normalize_by_column(array)** - Normalizes the array by columns. - - - **Parameters:** - - `array`: Input array to normalize. - - - **Returns:** - The normalized array. - -**_smoothing(cct, d)** - Probability smoothing for kDB. - - - **Parameters:** - - `cct`: Cross-count table. - - `d`: Dimension of the cross-count table. - - - **Returns:** - A smoothed joint probability table. - -**get_high_order_feature(X, col, evidence_cols, feature_uniques)** - Encodes the high-order feature of `X[col]` given evidence from `X[evidence_cols]`. - - - **Parameters:** - - `X`: Input data. - - `col`: Column to encode. - - `evidence_cols`: List of evidence columns. - - `feature_uniques`: Unique values for features. - - - **Returns:** - An encoded high-order feature. - -**get_high_order_constraints(X, col, evidence_cols, feature_uniques)** - Gets high-order constraints for the feature. - - - **Parameters:** - - `X`: Input data. - - `col`: Column to encode. - - `evidence_cols`: List of evidence columns. - - `feature_uniques`: Unique values for features. - - - **Returns:** - High-order constraints. - -Classes -------- - -**KdbHighOrderFeatureEncoder** - Encodes high-order features for the kDB model. - - - **Class Properties:** - - `feature_uniques_`: Unique values for features. 
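The `k=0` special case mentioned for `KdbHighOrderFeatureEncoder.fit` (where kDB encoding reduces to plain one-hot encoding, since no feature has parents) can be shown with a tiny dependency-free sketch. This is not the library's encoder; it is a minimal illustration of what the degenerate case produces.

```python
# Hedged sketch of the k=0 case: with no parent nodes, "high-order"
# encoding is just a one-hot encoding of each column, concatenated per row.

def one_hot_column(col):
    uniques = sorted(set(col))
    return [[1 if v == u else 0 for u in uniques] for v in col]

def encode_k0(X):
    # X: list of rows; encode column-wise, then concatenate per row
    cols = list(zip(*X))
    encoded_cols = [one_hot_column(c) for c in cols]
    return [sum((enc[i] for enc in encoded_cols), []) for i in range(len(X))]

X = [[0, "a"], [1, "b"], [0, "b"]]
X_out = encode_k0(X)  # -> [[1, 0, 1, 0], [0, 1, 0, 1], [1, 0, 0, 1]]
```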
- - `dependencies_`: Dependencies for features. - - `ohe_`: OneHotEncoder instance. - - - **Class Methods:** - - `fit(X, y, k=0)`: Fits the encoder to the data. - - `transform(X, return_constraints=False, use_ohe=True)`: Transforms the input data. - - `fit_transform(X, y, k=0, return_constraints=False)`: Fits the encoder and transforms the data. diff --git a/Archive/doc_new/_build/html/_sources/katabatic.models.ganblr/utils.rst.txt b/Archive/doc_new/_build/html/_sources/katabatic.models.ganblr/utils.rst.txt deleted file mode 100644 index be7d0e7..0000000 --- a/Archive/doc_new/_build/html/_sources/katabatic.models.ganblr/utils.rst.txt +++ /dev/null @@ -1,67 +0,0 @@ -Utility Functions -================= - -The utility functions are implemented in the `utils.py` file and provide various support functions for data preparation and model constraints. - -Classes --------- - -**softmax_weight** - Constrains weight tensors to be under softmax. - -**DataUtils** - Provides data utilities for preparation before training. - -Functions ---------- - -**elr_loss(KL_LOSS)** - Defines a custom loss function. - - - **Parameters:** - - `KL_LOSS`: The KL loss value. - - - **Returns:** - A custom loss function. - -**KL_loss(prob_fake)** - Calculates the KL loss. - - - **Parameters:** - - `prob_fake`: Probability of the fake data. - - - **Returns:** - The KL loss value. - -**get_lr(input_dim, output_dim, constraint=None, KL_LOSS=0)** - Creates a logistic regression model. - - - **Parameters:** - - `input_dim`: Dimension of the input features. - - `output_dim`: Dimension of the output. - - `constraint`: Optional constraint for the model (default: None). - - `KL_LOSS`: Optional KL loss value (default: 0). - - - **Returns:** - A logistic regression model. - -**sample(*arrays, n=None, frac=None, random_state=None)** - Generates random samples from the given arrays. - - - **Parameters:** - - `arrays`: Arrays to sample from. - - `n`: Number of samples to generate (default: None). 
- - `frac`: Fraction of samples to generate (default: None). - - `random_state`: Random seed for reproducibility (default: None). - - - **Returns:** - Random samples from the arrays. - -**get_demo_data(name='adult')** - Downloads a demo dataset from the internet. - - - **Parameters:** - - `name`: Name of the dataset to download (default: 'adult'). - - - **Returns:** - The downloaded dataset. diff --git a/Archive/doc_new/_build/html/_sources/katabatic.models.ganblrpp/ganblrpp.rst.txt b/Archive/doc_new/_build/html/_sources/katabatic.models.ganblrpp/ganblrpp.rst.txt deleted file mode 100644 index fad4152..0000000 --- a/Archive/doc_new/_build/html/_sources/katabatic.models.ganblrpp/ganblrpp.rst.txt +++ /dev/null @@ -1,146 +0,0 @@ -DMMDiscritizer Class -==================== - -The `DMMDiscritizer` class performs discretization using a mixture model approach. It scales the data, applies Bayesian Gaussian Mixture Models, and transforms the data into discrete values. It also includes methods for inverse transformation of discretized data back to its original form. - -Class Properties ----------------- - -- **__dmm_params**: - Parameters for the Bayesian Gaussian Mixture Model. - -- **__scaler**: - Min-Max scaler for data normalization. - -- **__dmms**: - List to store Bayesian Gaussian Mixture Models for each feature. - -- **__arr_mu**: - List to store means of the Gaussian distributions. - -- **__arr_sigma**: - List to store standard deviations of the Gaussian distributions. - -- **_random_state**: - Random seed for reproducibility. - -Class Methods --------------- - -**__init__(random_state)** - Initializes the DMMDiscritizer with parameters and scaler. - - - **Parameters:** - - `random_state`: Seed for random number generation. - -**fit(x)** - Fits the discretizer to the provided data. - - - **Parameters:** - - `x`: 2D numpy array of shape (n_samples, n_features). Numeric data to be discretized. - - - **Returns:** - The fitted `DMMDiscritizer` instance. 
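The `sample(*arrays, n=None, frac=None, random_state=None)` utility documented above can be re-created with the standard library alone. This is a hedged sketch of the documented behavior — draw the same random subset of rows from several parallel arrays — using plain lists instead of the NumPy arrays the real `utils.py` works on.

```python
# Hedged stdlib re-creation of the documented sample() helper:
# one set of row indices is drawn, then applied to every array,
# so rows and labels stay paired.
import random

def sample(*arrays, n=None, frac=None, random_state=None):
    length = len(arrays[0])
    if frac is not None:
        n = int(length * frac)
    rng = random.Random(random_state)
    idx = rng.sample(range(length), n)
    picked = [[arr[i] for i in idx] for arr in arrays]
    return picked[0] if len(picked) == 1 else tuple(picked)

X = [[0, 1], [2, 3], [4, 5], [6, 7]]
y = ["a", "b", "c", "d"]
Xs, ys = sample(X, y, n=2, random_state=42)
```

Passing one index set to every array is the design point: sampling each array independently would silently decouple features from labels.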
- -**transform(x) -> np.ndarray** - Transforms the data using the fitted discretizer. - - - **Parameters:** - - `x`: 2D numpy array of shape (n_samples, n_features). Numeric data to be transformed. - - - **Returns:** - 2D numpy array of shape (n_samples, n_features). Discretized data. - -**fit_transform(x) -> np.ndarray** - Fits the discretizer and transforms the data. - - - **Parameters:** - - `x`: 2D numpy array of shape (n_samples, n_features). Numeric data to be discretized and transformed. - - - **Returns:** - 2D numpy array of shape (n_samples, n_features). Discretized data. - -**inverse_transform(x, verbose=1) -> np.ndarray** - Converts discretized data back to its original continuous form. - - - **Parameters:** - - `x`: 2D numpy array of shape (n_samples, n_features). Discretized data. - - `verbose`: int, default=1. Controls verbosity of the operation. - - - **Returns:** - 2D numpy array of shape (n_samples, n_features). Reverted data. - -**__sample_from_truncnorm(bins, mu, sigma, random_state=None)** - Samples data from a truncated normal distribution. - - - **Parameters:** - - `bins`: 1D numpy array of integer bins. - - `mu`: 1D numpy array of means for the normal distribution. - - `sigma`: 1D numpy array of standard deviations for the normal distribution. - - `random_state`: int or None. Seed for random number generation. - - - **Returns:** - 1D numpy array of sampled results. - -GANBLRPP Class -================ - -The `GANBLRPP` class implements the GANBLR++ model, which combines generative adversarial networks with discretization techniques. It uses the `DMMDiscritizer` for data preprocessing and the `GANBLR` model for generating synthetic data. - -Class Properties ----------------- - -- **__discritizer**: - Instance of `DMMDiscritizer` for data preprocessing. - -- **__ganblr**: - Instance of `GANBLR` for generative adversarial network functionality. - -- **_numerical_columns**: - List of indices for numerical columns in the dataset. 
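The `__sample_from_truncnorm` step above has to draw, per bin, a value from a normal distribution restricted to that bin's interval. The real implementation is not shown in this documentation; the sketch below uses simple rejection sampling to stay dependency-free, and the explicit `bounds` mapping is an assumption added for illustration (the documented signature carries only `bins`, `mu`, `sigma`, and `random_state`).

```python
# Hedged sketch of truncated-normal sampling per bin via rejection:
# redraw from normal(mu[b], sigma[b]) until the value lands inside
# the bin's [low, high) interval. `bounds` is a hypothetical argument.
import random

def sample_from_truncnorm(bins, mu, sigma, bounds, random_state=None):
    rng = random.Random(random_state)
    out = []
    for b in bins:
        low, high = bounds[b]
        while True:
            v = rng.gauss(mu[b], sigma[b])
            if low <= v < high:
                out.append(v)
                break
    return out

bounds = {0: (0.0, 0.5), 1: (0.5, 1.0)}
vals = sample_from_truncnorm([0, 1, 1], mu=[0.2, 0.8], sigma=[0.3, 0.3],
                             bounds=bounds, random_state=1)
```

A production version would more likely use `scipy.stats.truncnorm`, which samples without the rejection loop.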
- -Class Methods --------------- - -**__init__(numerical_columns, random_state=None)** - Initializes the GANBLR++ model with numerical column indices and optional random state. - - - **Parameters:** - - `numerical_columns`: List of indices for numerical columns. - - `random_state`: int, RandomState instance or None. Seed for random number generation. - -**fit(x, y, k=0, batch_size=32, epochs=10, warmup_epochs=1, verbose=1)** - Fits the GANBLR++ model to the provided data. - - - **Parameters:** - - `x`: 2D numpy array of shape (n_samples, n_features). Dataset to fit the model. - - `y`: 1D numpy array of shape (n_samples,). Labels for the dataset. - - `k`: int, default=0. Parameter k for the GANBLR model. - - `batch_size`: int, default=32. Size of batches for training. - - `epochs`: int, default=10. Number of training epochs. - - `warmup_epochs`: int, default=1. Number of warmup epochs. - - `verbose`: int, default=1. Controls verbosity of the training process. - - - **Returns:** - The fitted `GANBLRPP` instance. - -**sample(size=None, verbose=1)** - Generates synthetic data using the GANBLR++ model. - - - **Parameters:** - - `size`: int or None. Size of the synthetic data to generate. - - `verbose`: int, default=1. Controls verbosity of the sampling process. - - - **Returns:** - 2D numpy array of synthetic data. - -**evaluate(x, y, model='lr')** - Evaluates the model using a TSTR (Training on Synthetic data, Testing on Real data) approach. - - - **Parameters:** - - `x`: 2D numpy array of shape (n_samples, n_features). Test dataset. - - `y`: 1D numpy array of shape (n_samples,). Labels for the test dataset. - - `model`: str or object. Model to use for evaluation. Options are 'lr', 'mlp', 'rf', or a custom model with `fit` and `predict` methods. - - - **Returns:** - float. Accuracy score of the evaluation. 
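The TSTR protocol that `evaluate` documents — train on synthetic data, test on real data — fits in a few lines. This hedged sketch substitutes a trivial majority-class classifier for the documented `'lr'`/`'mlp'`/`'rf'` options so it runs without scikit-learn; only the protocol itself is the point.

```python
# Hedged illustration of TSTR (Train on Synthetic, Test on Real):
# any object with fit/predict works, matching the custom-model option
# the documentation describes.
from collections import Counter

class MajorityClass:
    def fit(self, X, y):
        self.label_ = Counter(y).most_common(1)[0][0]
        return self

    def predict(self, X):
        return [self.label_] * len(X)

def tstr_accuracy(synth_X, synth_y, real_X, real_y, model):
    model.fit(synth_X, synth_y)       # train on synthetic data
    preds = model.predict(real_X)     # test on real data
    return sum(p == t for p, t in zip(preds, real_y)) / len(real_y)

acc = tstr_accuracy([[0], [1], [2]], [1, 1, 0],
                    [[0], [1]], [1, 0], MajorityClass())  # -> 0.5
```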
diff --git a/Archive/doc_new/_build/html/_sources/katabatic.models.ganblrpp/ganblrpp_adapter.rst.txt b/Archive/doc_new/_build/html/_sources/katabatic.models.ganblrpp/ganblrpp_adapter.rst.txt deleted file mode 100644 index 34a81d8..0000000 --- a/Archive/doc_new/_build/html/_sources/katabatic.models.ganblrpp/ganblrpp_adapter.rst.txt +++ /dev/null @@ -1,95 +0,0 @@ -GanblrppAdapter Class -===================== - -The `GanblrppAdapter` class adapts the `GANBLRPP` model to the Katabatic Model SPI interface. It facilitates model initialization, data loading, model fitting, and data generation, bridging the gap between Katabatic's requirements and the GANBLR++ model's functionality. - - -Class Properties ----------------- - -- **type** (`str`): - Type of model to use ("discrete" by default). - -- **model**: - Instance of `GANBLRPP` for model operations. - -- **constraints**: - Constraints for the model (not used in this class). - -- **batch_size** (`int`): - Size of batches for training. - -- **epochs** (`int`): - Number of epochs for training. - -- **training_sample_size** (`int`): - Size of the training sample. - -- **numerical_columns** (`list`): - List of indices for numerical columns in the dataset. - -- **random_state**: - Seed for random number generation. - -Class Methods --------------- - -**__init__(model_type="discrete", numerical_columns=None, random_state=None)** - Initializes the `GanblrppAdapter` with model type, numerical columns, and optional random state. - - - **Parameters:** - - `model_type`: str, default="discrete". Type of model to use. - - `numerical_columns`: list, optional. List of indices for numerical columns. - - `random_state`: int, optional. Seed for random number generation. - -**load_model() -> GANBLRPP** - Initializes and loads the `GANBLRPP` model. - - - **Returns:** - The initialized `GANBLRPP` model. - - - **Raises:** - - `ValueError`: If `numerical_columns` is not provided. - - `RuntimeError`: If the model initialization fails. 
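The adapter contract described above — in particular `load_model()` raising `ValueError` when `numerical_columns` was never supplied — can be mocked without importing the library. `DummyModel` and `GanblrppAdapterSketch` below are hypothetical stand-ins, not the real `GANBLRPP` or `GanblrppAdapter`.

```python
# Hedged mock of the documented adapter behavior: construction stores the
# configuration, and load_model() validates numerical_columns before
# building the model instance.

class DummyModel:
    """Stand-in for GANBLRPP, which is not imported in this sketch."""

class GanblrppAdapterSketch:
    def __init__(self, model_type="discrete", numerical_columns=None,
                 random_state=None):
        self.type = model_type
        self.numerical_columns = numerical_columns
        self.random_state = random_state
        self.model = None

    def load_model(self):
        if self.numerical_columns is None:
            raise ValueError("numerical_columns must be provided")
        self.model = DummyModel()
        return self.model

adapter = GanblrppAdapterSketch(numerical_columns=[0, 2])
model = adapter.load_model()
```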
- -**load_data(data_pathname) -> pd.DataFrame** - Loads data from a CSV file. - - - **Parameters:** - - `data_pathname`: str. Path to the CSV file containing the data. - - - **Returns:** - Pandas DataFrame containing the loaded data. - - - **Raises:** - - `Exception`: If there is an error loading the data. - -**fit(X_train, y_train, k=0, epochs=10, batch_size=64)** - Fits the `GANBLRPP` model to the training data. - - - **Parameters:** - - `X_train`: pd.DataFrame or np.ndarray. Training data features. - - `y_train`: pd.Series or np.ndarray. Training data labels. - - `k`: int, default=0. Parameter k for the GANBLRPP model. - - `epochs`: int, default=10. Number of training epochs. - - `batch_size`: int, default=64. Size of batches for training. - - - **Returns:** - None - - - **Raises:** - - `RuntimeError`: If the model is not initialized. - - `Exception`: If there is an error during model training. - -**generate(size=None) -> pd.DataFrame** - Generates synthetic data using the `GANBLRPP` model. - - - **Parameters:** - - `size`: int or None. Size of the synthetic data to generate. Defaults to the size of the training data. - - - **Returns:** - Pandas DataFrame containing the generated data. - - - **Raises:** - - `RuntimeError`: If the model is not initialized. - - `Exception`: If there is an error during data generation. diff --git a/Archive/doc_new/_build/html/_sources/katabatic.models.ganblrpp/kdb.rst.txt b/Archive/doc_new/_build/html/_sources/katabatic.models.ganblrpp/kdb.rst.txt deleted file mode 100644 index be05e96..0000000 --- a/Archive/doc_new/_build/html/_sources/katabatic.models.ganblrpp/kdb.rst.txt +++ /dev/null @@ -1,169 +0,0 @@ -KdbHighOrderFeatureEncoder Class -================================ - -The `KdbHighOrderFeatureEncoder` class encodes high-order features using the kDB algorithm. It retrieves dependencies between features, transforms data into high-order features, and optionally returns constraints information. 
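The `generate(size=None)` default described above — fall back to the training sample size when no size is given — is a small but easy-to-miss contract. A hedged stand-in (not the real adapter) makes it concrete:

```python
# Hedged sketch of the documented generate() default: when size is None,
# the adapter generates as many rows as it was trained on.

class GenerateDefaultSketch:
    def __init__(self):
        self.training_sample_size = 0

    def fit(self, X_train, y_train):
        self.training_sample_size = len(X_train)

    def generate(self, size=None):
        n = self.training_sample_size if size is None else size
        # a real adapter would call the model's sampler here
        return [[0] for _ in range(n)]

sketch = GenerateDefaultSketch()
sketch.fit([[1], [2], [3]], [0, 1, 0])
default_rows = sketch.generate()      # -> 3 rows, matching training size
explicit_rows = sketch.generate(5)    # -> 5 rows
```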
- -Defined in `kdb.py` - -Class Properties ----------------- - -- **dependencies_** (`dict`): - A dictionary storing the dependencies between features. - -- **constraints_** (`np.ndarray`): - Constraints information for high-order features. - -- **have_value_idxs_** (`list`): - Indices indicating the presence of values for constraints. - -- **feature_uniques_** (`list`): - List of unique values for each feature. - -- **high_order_feature_uniques_** (`list`): - List of unique values for high-order features. - -- **edges_** (`list`): - List of edges representing the kDB graph. - -- **ohe_** (`OneHotEncoder`): - OneHotEncoder instance for encoding features. - -- **k** (`int`): - Value of `k` for the kDB model. - -Methods -------- - -**__init__()** - Initializes the `KdbHighOrderFeatureEncoder` with default properties. - -**fit(X, y, k=0)** - Fits the `KdbHighOrderFeatureEncoder` to the data and labels. - - - **Parameters:** - - `X`: array_like of shape (n_samples, n_features). Data to fit in the encoder. - - `y`: array_like of shape (n_samples,). Labels to fit in the encoder. - - `k`: int, default=0. `k` value for the kDB model. `k=0` leads to a OneHotEncoder. - - - **Returns:** - - `self`: The fitted encoder. - -**transform(X, return_constraints=False, use_ohe=True)** - Transforms data to high-order features. - - - **Parameters:** - - `X`: array_like of shape (n_samples, n_features). Data to transform. - - `return_constraints`: bool, default=False. Whether to return constraint information. - - `use_ohe`: bool, default=True. Whether to apply one-hot encoding. - - - **Returns:** - - `X_out`: ndarray of shape (n_samples, n_encoded_features). Transformed input. - - If `return_constraints=True`, also returns: - - `constraints`: np.ndarray of constraints. - - `have_value_idxs`: List of boolean arrays indicating presence of values. - -**fit_transform(X, y, k=0, return_constraints=False)** - Fits the encoder and then transforms the data. 
- - - **Parameters:** - - `X`: array_like of shape (n_samples, n_features). Data to fit and transform. - - `y`: array_like of shape (n_samples,). Labels to fit and transform. - - `k`: int, default=0. `k` value for the kDB model. - - `return_constraints`: bool, default=False. Whether to return constraint information. - - - **Returns:** - - `X_out`: ndarray of shape (n_samples, n_encoded_features). Transformed input. - - If `return_constraints=True`, also returns: - - `constraints`: np.ndarray of constraints. - - `have_value_idxs`: List of boolean arrays indicating presence of values. - -Helper Functions ----------------- - -**build_graph(X, y, k=2)** - Constructs a kDB graph based on mutual information. - - - **Parameters:** - - `X`: array_like of shape (n_samples, n_features). Features. - - `y`: array_like of shape (n_samples,). Labels. - - `k`: int, default=2. Maximum number of parent nodes to consider. - - - **Returns:** - - `edges`: List of tuples representing edges in the graph. - -**get_cross_table(*cols, apply_wt=False)** - Computes a cross-tabulation table for the given columns. - - - **Parameters:** - - `*cols`: 1D numpy arrays of integers. Columns to cross-tabulate. - - `apply_wt`: bool, default=False. Whether to apply weights. - - - **Returns:** - - `uniq_vals_all_cols`: Tuple of 1D numpy arrays of unique values for each column. - - `xt`: NumPy array of cross-tabulation results. - -**_get_dependencies_without_y(variables, y_name, kdb_edges)** - Retrieves dependencies of each variable excluding the target variable `y`. - - - **Parameters:** - - `variables`: List of variable names. - - `y_name`: Name of the target variable. - - `kdb_edges`: List of tuples representing edges in the kDB graph. - - - **Returns:** - - `dependencies`: Dictionary of dependencies for each variable. - -**_add_uniform(array, noise=1e-5)** - Adds uniform probability to avoid zero counts in cross-tabulation. - - - **Parameters:** - - `array`: NumPy array of counts. 
- - `noise`: float, default=1e-5. Amount of noise to add. - - - **Returns:** - - `result`: NumPy array with added uniform probability. - -**_normalize_by_column(array)** - Normalizes an array by columns. - - - **Parameters:** - - `array`: NumPy array to normalize. - - - **Returns:** - - `normalized_array`: NumPy array normalized by columns. - -**_smoothing(cct, d)** - Applies smoothing to a cross-count table to handle zero counts. - - - **Parameters:** - - `cct`: NumPy array of cross-count table. - - `d`: int. Dimension of the cross-count table. - - - **Returns:** - - `jpt`: NumPy array of smoothed joint probability table. - -**get_high_order_feature(X, col, evidence_cols, feature_uniques)** - Encodes high-order features given the evidence columns. - - - **Parameters:** - - `X`: array_like of shape (n_samples, n_features). Data. - - `col`: int. Column index of the feature to encode. - - `evidence_cols`: List of indices for evidence columns. - - `feature_uniques`: List of unique values for each feature. - - - **Returns:** - - `high_order_feature`: NumPy array of high-order features. - -**get_high_order_constraints(X, col, evidence_cols, feature_uniques)** - Finds constraints for high-order features. - - - **Parameters:** - - `X`: array_like of shape (n_samples, n_features). Data. - - `col`: int. Column index of the feature. - - `evidence_cols`: List of indices for evidence columns. - - `feature_uniques`: List of unique values for each feature. - - - **Returns:** - - `have_value`: NumPy array of boolean values indicating presence of values. - - `high_order_constraints`: NumPy array of constraint information. 
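Reviewer note: for anyone reading this patch without the deleted `kdb.py` docs handy, the smoothing step documented above (`_add_uniform` followed by `_normalize_by_column`) can be sketched in a few lines. This is a hypothetical pure-Python illustration of the behavior the removed docs describe, not the deleted implementation (which operated on NumPy arrays); the function and variable names are mine.

```python
# Illustrative sketch (hypothetical, pure Python) of the documented behavior:
# add a small uniform term so no cell of a cross-count table is zero, then
# normalize column-wise to obtain a conditional probability table.

def add_uniform(counts, noise=1e-5):
    """Return `counts` with `noise` added to every cell (avoids zero counts)."""
    return [[c + noise for c in row] for row in counts]

def normalize_by_column(counts):
    """Normalize a 2-D table so that each column sums to 1."""
    n_rows, n_cols = len(counts), len(counts[0])
    col_sums = [sum(counts[r][c] for r in range(n_rows)) for c in range(n_cols)]
    return [[counts[r][c] / col_sums[c] for c in range(n_cols)]
            for r in range(n_rows)]

cct = [[3, 0],
       [1, 2]]  # toy cross-count table with a zero cell
jpt = normalize_by_column(add_uniform(cct))  # every cell now non-zero
```

The point of the uniform term is visible in the toy table: without it, the zero cell would make one conditional probability exactly 0, which the docs above describe `_smoothing` as guarding against.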
diff --git a/Archive/doc_new/_build/html/_sources/katabatic.models.ganblrpp/utils.rst.txt b/Archive/doc_new/_build/html/_sources/katabatic.models.ganblrpp/utils.rst.txt deleted file mode 100644 index 6e30fac..0000000 --- a/Archive/doc_new/_build/html/_sources/katabatic.models.ganblrpp/utils.rst.txt +++ /dev/null @@ -1,158 +0,0 @@ -Utility Classes -=============== - -The utility classes are implemented in the `utility.py` file and provide various support functionalities for model constraints and data preparation. - -Classes --------- - -**softmax_weight** - Constrains weight tensors to be under softmax. - -**DataUtils** - Provides data utilities for preparation before training. - -Functions ---------- - -**elr_loss(KL_LOSS)** - Defines a custom loss function. - - - **Parameters:** - - `KL_LOSS`: The KL loss value. - - - **Returns:** - A custom loss function. - -**KL_loss(prob_fake)** - Calculates the KL loss. - - - **Parameters:** - - `prob_fake`: Probability of the fake data. - - - **Returns:** - The KL loss value. - -**get_lr(input_dim, output_dim, constraint=None, KL_LOSS=0)** - Creates a logistic regression model. - - - **Parameters:** - - `input_dim`: Dimension of the input features. - - `output_dim`: Dimension of the output. - - `constraint`: Optional constraint for the model (default: None). - - `KL_LOSS`: Optional KL loss value (default: 0). - - - **Returns:** - A logistic regression model. - -**sample(*arrays, n=None, frac=None, random_state=None)** - Generates random samples from the given arrays. - - - **Parameters:** - - `arrays`: Arrays to sample from. - - `n`: Number of samples to generate (default: None). - - `frac`: Fraction of samples to generate (default: None). - - `random_state`: Random seed for reproducibility (default: None). - - - **Returns:** - Random samples from the arrays. - -**get_demo_data(name='adult')** - Downloads a demo dataset from the internet. - - - **Parameters:** - - `name`: Name of the dataset to download (default: 'adult'). 
- - - **Returns:** - The downloaded dataset. - -Classes --------- - -**softmax_weight** - Constrains weight tensors to be under softmax. - - - **Defined in:** `softmax_weight.py` - - - **Properties:** - - **feature_idxs** (`list`): List of tuples indicating the start and end indices for each feature in the weight tensor. - - - **Methods:** - - **__init__(feature_uniques)** - Initializes the constraint with unique feature values. - - - **Parameters:** - - `feature_uniques`: `np.ndarray` or list of int. Unique values for each feature used to compute indices for softmax constraint. - - - **Returns:** - None. - - - **__call__(w)** - Applies the softmax constraint to the weight tensor. - - - **Parameters:** - - `w`: `tf.Tensor`. Weight tensor to which the constraint is applied. - - - **Returns:** - `tf.Tensor`: The constrained weight tensor. - - - **get_config()** - Returns the configuration of the constraint. - - - **Returns:** - `dict`: Configuration dictionary containing `feature_idxs`. - -**DataUtils** - Provides utility functions for data preparation before training. - - - **Defined in:** `data_utils.py` - - - **Properties:** - - **x** (`np.ndarray`): The feature data used for training. - - **y** (`np.ndarray`): The target labels associated with the feature data. - - **data_size** (`int`): Number of samples in the dataset. - - **num_features** (`int`): Number of features in the dataset. - - **num_classes** (`int`): Number of unique classes in the target labels. - - **class_counts** (`np.ndarray`): Counts of each class in the target labels. - - **feature_uniques** (`list`): List of unique values for each feature. - - **constraint_positions** (`np.ndarray` or `None`): Positions of constraints for high-order features. - - **_kdbe** (`KdbHighOrderFeatureEncoder` or `None`): Instance of the `KdbHighOrderFeatureEncoder` for feature encoding. - - **__kdbe_x** (`np.ndarray` or `None`): Transformed feature data after applying kDB encoding. 
- - - **Methods:** - - **__init__(x, y)** - Initializes the `DataUtils` with the provided feature data and target labels. - - - **Parameters:** - - `x`: `np.ndarray`. Feature data. - - `y`: `np.ndarray`. Target labels. - - - **Returns:** - None. - - - **get_categories(idxs=None)** - Retrieves categories for encoded features. - - - **Parameters:** - - `idxs`: list of int or `None`. Indices of features to retrieve categories for. If `None`, retrieves categories for all features. - - - **Returns:** - `list`: List of categories for the specified features. - - - **get_kdbe_x(k=0, dense_format=True)** - Transforms feature data into high-order features using kDB encoding. - - - **Parameters:** - - `k`: int, default=0. `k` value for the kDB model. - - `dense_format`: bool, default=True. Whether to return the transformed data in dense format. - - - **Returns:** - `np.ndarray`: Transformed feature data. - If `dense_format=True`, returns a dense NumPy array. - Also updates `constraint_positions` with the positions of constraints. - - - **clear()** - Clears the kDB encoder and transformed data. - - - **Returns:** - None. diff --git a/Archive/doc_new/_build/html/_sources/katabatic.models.tablegan/tablegan.rst.txt b/Archive/doc_new/_build/html/_sources/katabatic.models.tablegan/tablegan.rst.txt deleted file mode 100644 index 1fff5a5..0000000 --- a/Archive/doc_new/_build/html/_sources/katabatic.models.tablegan/tablegan.rst.txt +++ /dev/null @@ -1,89 +0,0 @@ -TableGAN Class -============== - -This class implements a Generative Adversarial Network (GAN) for tabular data, specifically designed to handle structured data with a classifier that enforces feature-label relationships. - -.. class:: TableGAN - - :param input_dim: Integer, the dimension of the input features. - :param label_dim: Integer, the dimension of the output labels. - :param z_dim: Integer, the dimension of the noise vector used for generating synthetic data (default: 100). 
- :param delta_mean: Float, the threshold for the mean feature differences during generator training (default: 0.1). - :param delta_sd: Float, the threshold for the standard deviation differences during generator training (default: 0.1). - - **Example**:: - - import tensorflow as tf - from tensorflow.keras import layers - import numpy as np - - # Create TableGAN model instance - gan = TableGAN(input_dim=32, label_dim=10) - - # Train on data - gan.fit(x_train, y_train, epochs=100) - - # Generate synthetic samples - generated_data, generated_labels = gan.sample(n_samples=1000) - - **Methods** - - .. method:: _build_generator(self) - - Constructs the generator model, which produces synthetic data from random noise. - - :return: A Keras Sequential model representing the generator. - - .. method:: _build_discriminator(self) - - Constructs the discriminator model, which evaluates the authenticity of the data and extracts features. - - :return: A Keras Model that outputs the real/fake classification and extracted features. - - .. method:: _build_classifier(self) - - Constructs the classifier model, which predicts labels for the generated data. - - :return: A Keras Sequential model that classifies the input data. - - .. method:: wasserstein_loss(self, y_true, y_pred) - - Implements the Wasserstein loss function for training the discriminator. - - :param y_true: Tensor, the true labels (real/fake). - :param y_pred: Tensor, the predicted labels. - :return: Tensor, the computed Wasserstein loss. - - .. method:: gradient_penalty(self, real, fake) - - Computes the gradient penalty term for enforcing the Lipschitz constraint in Wasserstein GANs. - - :param real: Tensor, real data samples. - :param fake: Tensor, fake data samples generated by the generator. - :return: Tensor, the computed gradient penalty. - - .. method:: train_step(self, real_data, real_labels) - - Performs a single training step for the generator, discriminator, and classifier, using real and synthetic data. 
- - :param real_data: Tensor, the real data samples. - :param real_labels: Tensor, the true labels for the real data samples. - :return: Tuple of three values representing the discriminator, generator, and classifier losses, respectively. - - .. method:: fit(self, x, y, batch_size=64, epochs=100, verbose=1) - - Trains the GAN model on the provided data. - - :param x: Tensor, input data for training. - :param y: Tensor, labels for the training data. - :param batch_size: Integer, the size of each training batch (default: 64). - :param epochs: Integer, the number of epochs to train the model (default: 100). - :param verbose: Integer, the verbosity mode for logging during training (default: 1). - :return: The fitted TableGAN model. - - .. method:: sample(self, n_samples) - - Generates synthetic data and corresponding labels using the trained generator and classifier. - - :param n_samples: Integer, the number of synthetic data samples to generate. - :return: Tuple of numpy arrays, containing the generated data and labels. diff --git a/Archive/doc_new/_build/html/_sources/katabatic.models.tablegan/tablegan_adapter.rst.txt b/Archive/doc_new/_build/html/_sources/katabatic.models.tablegan/tablegan_adapter.rst.txt deleted file mode 100644 index c7b8812..0000000 --- a/Archive/doc_new/_build/html/_sources/katabatic.models.tablegan/tablegan_adapter.rst.txt +++ /dev/null @@ -1,72 +0,0 @@ -TableGANAdapter -=============== - -The `TableGANAdapter` class serves as an adapter for interfacing with the `TableGAN` model. It extends the `KatabaticModelSPI` and allows for loading, fitting, and generating synthetic data using the TableGAN model. This adapter includes functionality to handle privacy settings, data preprocessing, and model training. - -Class Structure ---------------- - -.. autoclass:: TableGANAdapter - :members: - :show-inheritance: - -Attributes ----------- - -- **type** (`str`): Defines the type of data handled by the adapter (default: `'continuous'`). 
-- **privacy_setting** (`str`): Sets the privacy level of the model ('low', 'medium', or 'high'). -- **constraints** (`NoneType`): Currently not in use but reserved for future constraints settings. -- **batch_size** (`int`): Defines the batch size for training (default: `64`). -- **epochs** (`int`): Number of training epochs (default: `100`). -- **model** (`TableGAN`): Instance of the TableGAN model. -- **scaler** (`StandardScaler`): Scaler used for preprocessing continuous data. -- **label_encoder** (`LabelEncoder`): Encoder used for processing categorical labels. -- **input_dim** (`int`): Input dimensionality of the data. -- **label_dim** (`int`): Label dimensionality of the data. -- **training_sample_size** (`int`): Number of samples used during training. - -Methods -------- - -- **__init__(self, type='continuous', privacy_setting='low')** - - Initializes the `TableGANAdapter` class, setting parameters such as data type, privacy level, and default model parameters. - -- **load_model(self)** - - Loads and initializes the `TableGAN` model based on the input and label dimensions. Adjusts privacy parameters (`delta_mean`, `delta_sd`) according to the specified `privacy_setting`. - -- **load_data(self, data_pathname)** - - Loads training data from the specified `data_pathname`. Handles CSV files and returns the data as a Pandas DataFrame. - -- **fit(self, X_train, y_train, epochs=None, batch_size=None)** - - Trains the `TableGAN` model on the provided training data (`X_train`, `y_train`). Preprocesses the data, sets input and label dimensions, and optionally overrides the number of epochs and batch size. - -- **generate(self, size=None)** - - Generates synthetic data using the trained `TableGAN` model. If the model is not trained, raises an error. The size of generated data defaults to the training sample size unless otherwise specified. 
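Reviewer note: the `load_model` description above says the adapter adjusts `delta_mean` and `delta_sd` according to `privacy_setting`. A minimal sketch of such a mapping, under the assumption that each level simply selects preset thresholds, might look like the following. The concrete numbers and names here are illustrative only; the deleted adapter defined its own mapping.

```python
# Hypothetical sketch of mapping a privacy level to the TableGAN thresholds
# (delta_mean, delta_sd) documented above. Values are illustrative, not the
# removed implementation's.

PRIVACY_PRESETS = {
    "low":    {"delta_mean": 0.2,  "delta_sd": 0.2},   # loose thresholds
    "medium": {"delta_mean": 0.1,  "delta_sd": 0.1},   # the documented defaults
    "high":   {"delta_mean": 0.05, "delta_sd": 0.05},  # tight thresholds
}

def resolve_privacy(privacy_setting: str) -> dict:
    """Return generator-training thresholds for a privacy level."""
    if privacy_setting not in PRIVACY_PRESETS:
        raise ValueError(
            f"privacy_setting must be one of {sorted(PRIVACY_PRESETS)}")
    return dict(PRIVACY_PRESETS[privacy_setting])
```

Validating the level up front, rather than silently falling back to a default, keeps a typo like `privacy_setting='hihg'` from quietly training with the wrong thresholds.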
- -Usage Example -------------- - -Below is a usage example that shows how to initialize and use the `TableGANAdapter` class for training and generating synthetic data: - -.. code-block:: python - - from katabatic.models import TableGANAdapter - - # Initialize the adapter with medium privacy - adapter = TableGANAdapter(type='continuous', privacy_setting='medium') - - # Load data - data = adapter.load_data('training_data.csv') - - # Preprocess and fit the model - X_train, y_train = data.iloc[:, :-1], data.iloc[:, -1] - adapter.fit(X_train, y_train) - - # Generate synthetic data - synthetic_data = adapter.generate(size=1000) - diff --git a/Archive/doc_new/_build/html/_sources/katabatic.models.tablegan/tablegan_utils.rst.txt b/Archive/doc_new/_build/html/_sources/katabatic.models.tablegan/tablegan_utils.rst.txt deleted file mode 100644 index a9fd49c..0000000 --- a/Archive/doc_new/_build/html/_sources/katabatic.models.tablegan/tablegan_utils.rst.txt +++ /dev/null @@ -1,74 +0,0 @@ -Utility Functions -================= - -The utility functions are implemented in the `utils.py` file and provide various support functions for data preparation, model constraints, and logistic regression operations. - -Classes -------- - -- **softmax_weight** - - A class that constrains weight tensors to be normalized using the softmax function, ensuring that the values sum up to 1. - -- **DataUtils** - - A utility class that provides helper functions for preparing data before training machine learning models. - -Functions ---------- - -- **elr_loss(KL_LOSS)** - - Defines a custom loss function that integrates the KL loss into the overall loss calculation for model training. - - - **Parameters:** - - `KL_LOSS` (`float`): The Kullback-Leibler (KL) loss value that is integrated into the loss function. - - - **Returns:** - A custom loss function that can be used in training models. 
- -- **KL_loss(prob_fake)** - - Calculates the Kullback-Leibler (KL) divergence loss, which measures how one probability distribution diverges from a second, expected distribution. - - - **Parameters:** - - `prob_fake` (`numpy.array`): The probability distribution of the fake (generated) data. - - - **Returns:** - The calculated KL loss value. - -- **get_lr(input_dim, output_dim, constraint=None, KL_LOSS=0)** - - Creates and returns a logistic regression model, with optional constraints and KL loss integration. - - - **Parameters:** - - `input_dim` (`int`): The dimension of the input features. - - `output_dim` (`int`): The dimension of the output labels. - - `constraint` (`callable`, optional): A constraint function applied to the logistic regression model weights (default: None). - - `KL_LOSS` (`float`, optional): The Kullback-Leibler loss value to be incorporated into the model (default: 0). - - - **Returns:** - A logistic regression model object. - -- **sample(*arrays, n=None, frac=None, random_state=None)** - - Generates random samples from the provided arrays, either by specifying the number of samples or the fraction of the arrays to sample. - - - **Parameters:** - - `arrays` (`list`): The input arrays to sample from. - - `n` (`int`, optional): The number of samples to generate (default: None). - - `frac` (`float`, optional): The fraction of the arrays to sample (default: None). - - `random_state` (`int`, optional): A random seed to ensure reproducibility (default: None). - - - **Returns:** - Random samples from the provided arrays. - -- **get_demo_data(name='adult')** - - Downloads and returns a demo dataset from the internet. By default, it downloads the 'adult' dataset, but other dataset names can be specified. - - - **Parameters:** - - `name` (`str`, optional): The name of the dataset to download (default: 'adult'). - - - **Returns:** - The downloaded dataset as a Pandas DataFrame. 
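Reviewer note: the `sample(*arrays, n=None, frac=None, random_state=None)` utility documented above is the one piece of this deleted file whose contract is easy to get subtly wrong (the same row indices must be drawn from every input array). A self-contained pure-Python sketch of that contract, with hypothetical internals since the removed `utils.py` presumably used NumPy:

```python
import random

# Hypothetical sketch of the documented `sample` utility: draw the SAME random
# row indices from every input array, sized either by count (n) or fraction
# (frac), seeded by random_state for reproducibility.

def sample(*arrays, n=None, frac=None, random_state=None):
    size = len(arrays[0])
    if n is None:
        n = int(size * frac)
    rng = random.Random(random_state)
    idxs = rng.sample(range(size), n)  # without replacement
    picked = [[arr[i] for i in idxs] for arr in arrays]
    return picked[0] if len(arrays) == 1 else tuple(picked)

x = list(range(10))
y = [v * 2 for v in x]                      # paired labels
xs, ys = sample(x, y, frac=0.5, random_state=0)
```

Because one index list is shared across arrays, each sampled feature row stays aligned with its label, which is the property training code relies on.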
diff --git a/Archive/doc_new/_build/html/_sources/modules.rst.txt b/Archive/doc_new/_build/html/_sources/modules.rst.txt deleted file mode 100644 index b2cb395..0000000 --- a/Archive/doc_new/_build/html/_sources/modules.rst.txt +++ /dev/null @@ -1,7 +0,0 @@ -Katabatic -========= - -.. toctree:: - :maxdepth: 4 - - Katabatic diff --git a/Archive/doc_new/_build/html/_static/alabaster.css b/Archive/doc_new/_build/html/_static/alabaster.css deleted file mode 100644 index 7e75bf8..0000000 --- a/Archive/doc_new/_build/html/_static/alabaster.css +++ /dev/null @@ -1,663 +0,0 @@ -/* -- page layout ----------------------------------------------------------- */ - -body { - font-family: Georgia, serif; - font-size: 17px; - background-color: #fff; - color: #000; - margin: 0; - padding: 0; -} - - -div.document { - width: 940px; - margin: 30px auto 0 auto; -} - -div.documentwrapper { - float: left; - width: 100%; -} - -div.bodywrapper { - margin: 0 0 0 220px; -} - -div.sphinxsidebar { - width: 220px; - font-size: 14px; - line-height: 1.5; -} - -hr { - border: 1px solid #B1B4B6; -} - -div.body { - background-color: #fff; - color: #3E4349; - padding: 0 30px 0 30px; -} - -div.body > .section { - text-align: left; -} - -div.footer { - width: 940px; - margin: 20px auto 30px auto; - font-size: 14px; - color: #888; - text-align: right; -} - -div.footer a { - color: #888; -} - -p.caption { - font-family: inherit; - font-size: inherit; -} - - -div.relations { - display: none; -} - - -div.sphinxsidebar { - max-height: 100%; - overflow-y: auto; -} - -div.sphinxsidebar a { - color: #444; - text-decoration: none; - border-bottom: 1px dotted #999; -} - -div.sphinxsidebar a:hover { - border-bottom: 1px solid #999; -} - -div.sphinxsidebarwrapper { - padding: 18px 10px; -} - -div.sphinxsidebarwrapper p.logo { - padding: 0; - margin: -10px 0 0 0px; - text-align: center; -} - -div.sphinxsidebarwrapper h1.logo { - margin-top: -10px; - text-align: center; - margin-bottom: 5px; - text-align: left; -} - 
-div.sphinxsidebarwrapper h1.logo-name { - margin-top: 0px; -} - -div.sphinxsidebarwrapper p.blurb { - margin-top: 0; - font-style: normal; -} - -div.sphinxsidebar h3, -div.sphinxsidebar h4 { - font-family: Georgia, serif; - color: #444; - font-size: 24px; - font-weight: normal; - margin: 0 0 5px 0; - padding: 0; -} - -div.sphinxsidebar h4 { - font-size: 20px; -} - -div.sphinxsidebar h3 a { - color: #444; -} - -div.sphinxsidebar p.logo a, -div.sphinxsidebar h3 a, -div.sphinxsidebar p.logo a:hover, -div.sphinxsidebar h3 a:hover { - border: none; -} - -div.sphinxsidebar p { - color: #555; - margin: 10px 0; -} - -div.sphinxsidebar ul { - margin: 10px 0; - padding: 0; - color: #000; -} - -div.sphinxsidebar ul li.toctree-l1 > a { - font-size: 120%; -} - -div.sphinxsidebar ul li.toctree-l2 > a { - font-size: 110%; -} - -div.sphinxsidebar input { - border: 1px solid #CCC; - font-family: Georgia, serif; - font-size: 1em; -} - -div.sphinxsidebar #searchbox { - margin: 1em 0; -} - -div.sphinxsidebar .search > div { - display: table-cell; -} - -div.sphinxsidebar hr { - border: none; - height: 1px; - color: #AAA; - background: #AAA; - - text-align: left; - margin-left: 0; - width: 50%; -} - -div.sphinxsidebar .badge { - border-bottom: none; -} - -div.sphinxsidebar .badge:hover { - border-bottom: none; -} - -/* To address an issue with donation coming after search */ -div.sphinxsidebar h3.donation { - margin-top: 10px; -} - -/* -- body styles ----------------------------------------------------------- */ - -a { - color: #004B6B; - text-decoration: underline; -} - -a:hover { - color: #6D4100; - text-decoration: underline; -} - -div.body h1, -div.body h2, -div.body h3, -div.body h4, -div.body h5, -div.body h6 { - font-family: Georgia, serif; - font-weight: normal; - margin: 30px 0px 10px 0px; - padding: 0; -} - -div.body h1 { margin-top: 0; padding-top: 0; font-size: 240%; } -div.body h2 { font-size: 180%; } -div.body h3 { font-size: 150%; } -div.body h4 { font-size: 130%; } 
-div.body h5 { font-size: 100%; } -div.body h6 { font-size: 100%; } - -a.headerlink { - color: #DDD; - padding: 0 4px; - text-decoration: none; -} - -a.headerlink:hover { - color: #444; - background: #EAEAEA; -} - -div.body p, div.body dd, div.body li { - line-height: 1.4em; -} - -div.admonition { - margin: 20px 0px; - padding: 10px 30px; - background-color: #EEE; - border: 1px solid #CCC; -} - -div.admonition tt.xref, div.admonition code.xref, div.admonition a tt { - background-color: #FBFBFB; - border-bottom: 1px solid #fafafa; -} - -div.admonition p.admonition-title { - font-family: Georgia, serif; - font-weight: normal; - font-size: 24px; - margin: 0 0 10px 0; - padding: 0; - line-height: 1; -} - -div.admonition p.last { - margin-bottom: 0; -} - -dt:target, .highlight { - background: #FAF3E8; -} - -div.warning { - background-color: #FCC; - border: 1px solid #FAA; -} - -div.danger { - background-color: #FCC; - border: 1px solid #FAA; - -moz-box-shadow: 2px 2px 4px #D52C2C; - -webkit-box-shadow: 2px 2px 4px #D52C2C; - box-shadow: 2px 2px 4px #D52C2C; -} - -div.error { - background-color: #FCC; - border: 1px solid #FAA; - -moz-box-shadow: 2px 2px 4px #D52C2C; - -webkit-box-shadow: 2px 2px 4px #D52C2C; - box-shadow: 2px 2px 4px #D52C2C; -} - -div.caution { - background-color: #FCC; - border: 1px solid #FAA; -} - -div.attention { - background-color: #FCC; - border: 1px solid #FAA; -} - -div.important { - background-color: #EEE; - border: 1px solid #CCC; -} - -div.note { - background-color: #EEE; - border: 1px solid #CCC; -} - -div.tip { - background-color: #EEE; - border: 1px solid #CCC; -} - -div.hint { - background-color: #EEE; - border: 1px solid #CCC; -} - -div.seealso { - background-color: #EEE; - border: 1px solid #CCC; -} - -div.topic { - background-color: #EEE; -} - -p.admonition-title { - display: inline; -} - -p.admonition-title:after { - content: ":"; -} - -pre, tt, code { - font-family: 'Consolas', 'Menlo', 'DejaVu Sans Mono', 'Bitstream Vera Sans Mono', 
monospace; - font-size: 0.9em; -} - -.hll { - background-color: #FFC; - margin: 0 -12px; - padding: 0 12px; - display: block; -} - -img.screenshot { -} - -tt.descname, tt.descclassname, code.descname, code.descclassname { - font-size: 0.95em; -} - -tt.descname, code.descname { - padding-right: 0.08em; -} - -img.screenshot { - -moz-box-shadow: 2px 2px 4px #EEE; - -webkit-box-shadow: 2px 2px 4px #EEE; - box-shadow: 2px 2px 4px #EEE; -} - -table.docutils { - border: 1px solid #888; - -moz-box-shadow: 2px 2px 4px #EEE; - -webkit-box-shadow: 2px 2px 4px #EEE; - box-shadow: 2px 2px 4px #EEE; -} - -table.docutils td, table.docutils th { - border: 1px solid #888; - padding: 0.25em 0.7em; -} - -table.field-list, table.footnote { - border: none; - -moz-box-shadow: none; - -webkit-box-shadow: none; - box-shadow: none; -} - -table.footnote { - margin: 15px 0; - width: 100%; - border: 1px solid #EEE; - background: #FDFDFD; - font-size: 0.9em; -} - -table.footnote + table.footnote { - margin-top: -15px; - border-top: none; -} - -table.field-list th { - padding: 0 0.8em 0 0; -} - -table.field-list td { - padding: 0; -} - -table.field-list p { - margin-bottom: 0.8em; -} - -/* Cloned from - * https://github.com/sphinx-doc/sphinx/commit/ef60dbfce09286b20b7385333d63a60321784e68 - */ -.field-name { - -moz-hyphens: manual; - -ms-hyphens: manual; - -webkit-hyphens: manual; - hyphens: manual; -} - -table.footnote td.label { - width: .1px; - padding: 0.3em 0 0.3em 0.5em; -} - -table.footnote td { - padding: 0.3em 0.5em; -} - -dl { - margin-left: 0; - margin-right: 0; - margin-top: 0; - padding: 0; -} - -dl dd { - margin-left: 30px; -} - -blockquote { - margin: 0 0 0 30px; - padding: 0; -} - -ul, ol { - /* Matches the 30px from the narrow-screen "li > ul" selector below */ - margin: 10px 0 10px 30px; - padding: 0; -} - -pre { - background: unset; - padding: 7px 30px; - margin: 15px 0px; - line-height: 1.3em; -} - -div.viewcode-block:target { - background: #ffd; -} - -dl pre, blockquote 
pre, li pre { - margin-left: 0; - padding-left: 30px; -} - -tt, code { - background-color: #ecf0f3; - color: #222; - /* padding: 1px 2px; */ -} - -tt.xref, code.xref, a tt { - background-color: #FBFBFB; - border-bottom: 1px solid #fff; -} - -a.reference { - text-decoration: none; - border-bottom: 1px dotted #004B6B; -} - -a.reference:hover { - border-bottom: 1px solid #6D4100; -} - -/* Don't put an underline on images */ -a.image-reference, a.image-reference:hover { - border-bottom: none; -} - -a.footnote-reference { - text-decoration: none; - font-size: 0.7em; - vertical-align: top; - border-bottom: 1px dotted #004B6B; -} - -a.footnote-reference:hover { - border-bottom: 1px solid #6D4100; -} - -a:hover tt, a:hover code { - background: #EEE; -} - -@media screen and (max-width: 940px) { - - body { - margin: 0; - padding: 20px 30px; - } - - div.documentwrapper { - float: none; - background: #fff; - margin-left: 0; - margin-top: 0; - margin-right: 0; - margin-bottom: 0; - } - - div.sphinxsidebar { - display: block; - float: none; - width: unset; - margin: 50px -30px -20px -30px; - padding: 10px 20px; - background: #333; - color: #FFF; - } - - div.sphinxsidebar h3, div.sphinxsidebar h4, div.sphinxsidebar p, - div.sphinxsidebar h3 a { - color: #fff; - } - - div.sphinxsidebar a { - color: #AAA; - } - - div.sphinxsidebar p.logo { - display: none; - } - - div.document { - width: 100%; - margin: 0; - } - - div.footer { - display: none; - } - - div.bodywrapper { - margin: 0; - } - - div.body { - min-height: 0; - min-width: auto; /* fixes width on small screens, breaks .hll */ - padding: 0; - } - - .hll { - /* "fixes" the breakage */ - width: max-content; - } - - .rtd_doc_footer { - display: none; - } - - .document { - width: auto; - } - - .footer { - width: auto; - } - - .github { - display: none; - } - - ul { - margin-left: 0; - } - - li > ul { - /* Matches the 30px from the "ul, ol" selector above */ - margin-left: 30px; - } -} - - -/* misc. 
*/ - -.revsys-inline { - display: none!important; -} - -/* Hide ugly table cell borders in ..bibliography:: directive output */ -table.docutils.citation, table.docutils.citation td, table.docutils.citation th { - border: none; - /* Below needed in some edge cases; if not applied, bottom shadows appear */ - -moz-box-shadow: none; - -webkit-box-shadow: none; - box-shadow: none; -} - - -/* relbar */ - -.related { - line-height: 30px; - width: 100%; - font-size: 0.9rem; -} - -.related.top { - border-bottom: 1px solid #EEE; - margin-bottom: 20px; -} - -.related.bottom { - border-top: 1px solid #EEE; -} - -.related ul { - padding: 0; - margin: 0; - list-style: none; -} - -.related li { - display: inline; -} - -nav#rellinks { - float: right; -} - -nav#rellinks li+li:before { - content: "|"; -} - -nav#breadcrumbs li+li:before { - content: "\00BB"; -} - -/* Hide certain items when printing */ -@media print { - div.related { - display: none; - } -} - -img.github { - position: absolute; - top: 0; - border: 0; - right: 0; -} \ No newline at end of file diff --git a/Archive/doc_new/_build/html/_static/basic.css b/Archive/doc_new/_build/html/_static/basic.css deleted file mode 100644 index e5179b7..0000000 --- a/Archive/doc_new/_build/html/_static/basic.css +++ /dev/null @@ -1,925 +0,0 @@ -/* - * basic.css - * ~~~~~~~~~ - * - * Sphinx stylesheet -- basic theme. - * - * :copyright: Copyright 2007-2024 by the Sphinx team, see AUTHORS. - * :license: BSD, see LICENSE for details. 
- * - */ - -/* -- main layout ----------------------------------------------------------- */ - -div.clearer { - clear: both; -} - -div.section::after { - display: block; - content: ''; - clear: left; -} - -/* -- relbar ---------------------------------------------------------------- */ - -div.related { - width: 100%; - font-size: 90%; -} - -div.related h3 { - display: none; -} - -div.related ul { - margin: 0; - padding: 0 0 0 10px; - list-style: none; -} - -div.related li { - display: inline; -} - -div.related li.right { - float: right; - margin-right: 5px; -} - -/* -- sidebar --------------------------------------------------------------- */ - -div.sphinxsidebarwrapper { - padding: 10px 5px 0 10px; -} - -div.sphinxsidebar { - float: left; - width: 230px; - margin-left: -100%; - font-size: 90%; - word-wrap: break-word; - overflow-wrap : break-word; -} - -div.sphinxsidebar ul { - list-style: none; -} - -div.sphinxsidebar ul ul, -div.sphinxsidebar ul.want-points { - margin-left: 20px; - list-style: square; -} - -div.sphinxsidebar ul ul { - margin-top: 0; - margin-bottom: 0; -} - -div.sphinxsidebar form { - margin-top: 10px; -} - -div.sphinxsidebar input { - border: 1px solid #98dbcc; - font-family: sans-serif; - font-size: 1em; -} - -div.sphinxsidebar #searchbox form.search { - overflow: hidden; -} - -div.sphinxsidebar #searchbox input[type="text"] { - float: left; - width: 80%; - padding: 0.25em; - box-sizing: border-box; -} - -div.sphinxsidebar #searchbox input[type="submit"] { - float: left; - width: 20%; - border-left: none; - padding: 0.25em; - box-sizing: border-box; -} - - -img { - border: 0; - max-width: 100%; -} - -/* -- search page ----------------------------------------------------------- */ - -ul.search { - margin: 10px 0 0 20px; - padding: 0; -} - -ul.search li { - padding: 5px 0 5px 20px; - background-image: url(file.png); - background-repeat: no-repeat; - background-position: 0 7px; -} - -ul.search li a { - font-weight: bold; -} - -ul.search li 
p.context { - color: #888; - margin: 2px 0 0 30px; - text-align: left; -} - -ul.keywordmatches li.goodmatch a { - font-weight: bold; -} - -/* -- index page ------------------------------------------------------------ */ - -table.contentstable { - width: 90%; - margin-left: auto; - margin-right: auto; -} - -table.contentstable p.biglink { - line-height: 150%; -} - -a.biglink { - font-size: 1.3em; -} - -span.linkdescr { - font-style: italic; - padding-top: 5px; - font-size: 90%; -} - -/* -- general index --------------------------------------------------------- */ - -table.indextable { - width: 100%; -} - -table.indextable td { - text-align: left; - vertical-align: top; -} - -table.indextable ul { - margin-top: 0; - margin-bottom: 0; - list-style-type: none; -} - -table.indextable > tbody > tr > td > ul { - padding-left: 0em; -} - -table.indextable tr.pcap { - height: 10px; -} - -table.indextable tr.cap { - margin-top: 10px; - background-color: #f2f2f2; -} - -img.toggler { - margin-right: 3px; - margin-top: 3px; - cursor: pointer; -} - -div.modindex-jumpbox { - border-top: 1px solid #ddd; - border-bottom: 1px solid #ddd; - margin: 1em 0 1em 0; - padding: 0.4em; -} - -div.genindex-jumpbox { - border-top: 1px solid #ddd; - border-bottom: 1px solid #ddd; - margin: 1em 0 1em 0; - padding: 0.4em; -} - -/* -- domain module index --------------------------------------------------- */ - -table.modindextable td { - padding: 2px; - border-collapse: collapse; -} - -/* -- general body styles --------------------------------------------------- */ - -div.body { - min-width: inherit; - max-width: 800px; -} - -div.body p, div.body dd, div.body li, div.body blockquote { - -moz-hyphens: auto; - -ms-hyphens: auto; - -webkit-hyphens: auto; - hyphens: auto; -} - -a.headerlink { - visibility: hidden; -} - -a:visited { - color: #551A8B; -} - -h1:hover > a.headerlink, -h2:hover > a.headerlink, -h3:hover > a.headerlink, -h4:hover > a.headerlink, -h5:hover > a.headerlink, -h6:hover > 
a.headerlink, -dt:hover > a.headerlink, -caption:hover > a.headerlink, -p.caption:hover > a.headerlink, -div.code-block-caption:hover > a.headerlink { - visibility: visible; -} - -div.body p.caption { - text-align: inherit; -} - -div.body td { - text-align: left; -} - -.first { - margin-top: 0 !important; -} - -p.rubric { - margin-top: 30px; - font-weight: bold; -} - -img.align-left, figure.align-left, .figure.align-left, object.align-left { - clear: left; - float: left; - margin-right: 1em; -} - -img.align-right, figure.align-right, .figure.align-right, object.align-right { - clear: right; - float: right; - margin-left: 1em; -} - -img.align-center, figure.align-center, .figure.align-center, object.align-center { - display: block; - margin-left: auto; - margin-right: auto; -} - -img.align-default, figure.align-default, .figure.align-default { - display: block; - margin-left: auto; - margin-right: auto; -} - -.align-left { - text-align: left; -} - -.align-center { - text-align: center; -} - -.align-default { - text-align: center; -} - -.align-right { - text-align: right; -} - -/* -- sidebars -------------------------------------------------------------- */ - -div.sidebar, -aside.sidebar { - margin: 0 0 0.5em 1em; - border: 1px solid #ddb; - padding: 7px; - background-color: #ffe; - width: 40%; - float: right; - clear: right; - overflow-x: auto; -} - -p.sidebar-title { - font-weight: bold; -} - -nav.contents, -aside.topic, -div.admonition, div.topic, blockquote { - clear: left; -} - -/* -- topics ---------------------------------------------------------------- */ - -nav.contents, -aside.topic, -div.topic { - border: 1px solid #ccc; - padding: 7px; - margin: 10px 0 10px 0; -} - -p.topic-title { - font-size: 1.1em; - font-weight: bold; - margin-top: 10px; -} - -/* -- admonitions ----------------------------------------------------------- */ - -div.admonition { - margin-top: 10px; - margin-bottom: 10px; - padding: 7px; -} - -div.admonition dt { - font-weight: bold; -} - 
-p.admonition-title { - margin: 0px 10px 5px 0px; - font-weight: bold; -} - -div.body p.centered { - text-align: center; - margin-top: 25px; -} - -/* -- content of sidebars/topics/admonitions -------------------------------- */ - -div.sidebar > :last-child, -aside.sidebar > :last-child, -nav.contents > :last-child, -aside.topic > :last-child, -div.topic > :last-child, -div.admonition > :last-child { - margin-bottom: 0; -} - -div.sidebar::after, -aside.sidebar::after, -nav.contents::after, -aside.topic::after, -div.topic::after, -div.admonition::after, -blockquote::after { - display: block; - content: ''; - clear: both; -} - -/* -- tables ---------------------------------------------------------------- */ - -table.docutils { - margin-top: 10px; - margin-bottom: 10px; - border: 0; - border-collapse: collapse; -} - -table.align-center { - margin-left: auto; - margin-right: auto; -} - -table.align-default { - margin-left: auto; - margin-right: auto; -} - -table caption span.caption-number { - font-style: italic; -} - -table caption span.caption-text { -} - -table.docutils td, table.docutils th { - padding: 1px 8px 1px 5px; - border-top: 0; - border-left: 0; - border-right: 0; - border-bottom: 1px solid #aaa; -} - -th { - text-align: left; - padding-right: 5px; -} - -table.citation { - border-left: solid 1px gray; - margin-left: 1px; -} - -table.citation td { - border-bottom: none; -} - -th > :first-child, -td > :first-child { - margin-top: 0px; -} - -th > :last-child, -td > :last-child { - margin-bottom: 0px; -} - -/* -- figures --------------------------------------------------------------- */ - -div.figure, figure { - margin: 0.5em; - padding: 0.5em; -} - -div.figure p.caption, figcaption { - padding: 0.3em; -} - -div.figure p.caption span.caption-number, -figcaption span.caption-number { - font-style: italic; -} - -div.figure p.caption span.caption-text, -figcaption span.caption-text { -} - -/* -- field list styles 
----------------------------------------------------- */ - -table.field-list td, table.field-list th { - border: 0 !important; -} - -.field-list ul { - margin: 0; - padding-left: 1em; -} - -.field-list p { - margin: 0; -} - -.field-name { - -moz-hyphens: manual; - -ms-hyphens: manual; - -webkit-hyphens: manual; - hyphens: manual; -} - -/* -- hlist styles ---------------------------------------------------------- */ - -table.hlist { - margin: 1em 0; -} - -table.hlist td { - vertical-align: top; -} - -/* -- object description styles --------------------------------------------- */ - -.sig { - font-family: 'Consolas', 'Menlo', 'DejaVu Sans Mono', 'Bitstream Vera Sans Mono', monospace; -} - -.sig-name, code.descname { - background-color: transparent; - font-weight: bold; -} - -.sig-name { - font-size: 1.1em; -} - -code.descname { - font-size: 1.2em; -} - -.sig-prename, code.descclassname { - background-color: transparent; -} - -.optional { - font-size: 1.3em; -} - -.sig-paren { - font-size: larger; -} - -.sig-param.n { - font-style: italic; -} - -/* C++ specific styling */ - -.sig-inline.c-texpr, -.sig-inline.cpp-texpr { - font-family: unset; -} - -.sig.c .k, .sig.c .kt, -.sig.cpp .k, .sig.cpp .kt { - color: #0033B3; -} - -.sig.c .m, -.sig.cpp .m { - color: #1750EB; -} - -.sig.c .s, .sig.c .sc, -.sig.cpp .s, .sig.cpp .sc { - color: #067D17; -} - - -/* -- other body styles ----------------------------------------------------- */ - -ol.arabic { - list-style: decimal; -} - -ol.loweralpha { - list-style: lower-alpha; -} - -ol.upperalpha { - list-style: upper-alpha; -} - -ol.lowerroman { - list-style: lower-roman; -} - -ol.upperroman { - list-style: upper-roman; -} - -:not(li) > ol > li:first-child > :first-child, -:not(li) > ul > li:first-child > :first-child { - margin-top: 0px; -} - -:not(li) > ol > li:last-child > :last-child, -:not(li) > ul > li:last-child > :last-child { - margin-bottom: 0px; -} - -ol.simple ol p, -ol.simple ul p, -ul.simple ol p, -ul.simple ul p { - 
margin-top: 0; -} - -ol.simple > li:not(:first-child) > p, -ul.simple > li:not(:first-child) > p { - margin-top: 0; -} - -ol.simple p, -ul.simple p { - margin-bottom: 0; -} - -aside.footnote > span, -div.citation > span { - float: left; -} -aside.footnote > span:last-of-type, -div.citation > span:last-of-type { - padding-right: 0.5em; -} -aside.footnote > p { - margin-left: 2em; -} -div.citation > p { - margin-left: 4em; -} -aside.footnote > p:last-of-type, -div.citation > p:last-of-type { - margin-bottom: 0em; -} -aside.footnote > p:last-of-type:after, -div.citation > p:last-of-type:after { - content: ""; - clear: both; -} - -dl.field-list { - display: grid; - grid-template-columns: fit-content(30%) auto; -} - -dl.field-list > dt { - font-weight: bold; - word-break: break-word; - padding-left: 0.5em; - padding-right: 5px; -} - -dl.field-list > dd { - padding-left: 0.5em; - margin-top: 0em; - margin-left: 0em; - margin-bottom: 0em; -} - -dl { - margin-bottom: 15px; -} - -dd > :first-child { - margin-top: 0px; -} - -dd ul, dd table { - margin-bottom: 10px; -} - -dd { - margin-top: 3px; - margin-bottom: 10px; - margin-left: 30px; -} - -.sig dd { - margin-top: 0px; - margin-bottom: 0px; -} - -.sig dl { - margin-top: 0px; - margin-bottom: 0px; -} - -dl > dd:last-child, -dl > dd:last-child > :last-child { - margin-bottom: 0; -} - -dt:target, span.highlighted { - background-color: #fbe54e; -} - -rect.highlighted { - fill: #fbe54e; -} - -dl.glossary dt { - font-weight: bold; - font-size: 1.1em; -} - -.versionmodified { - font-style: italic; -} - -.system-message { - background-color: #fda; - padding: 5px; - border: 3px solid red; -} - -.footnote:target { - background-color: #ffa; -} - -.line-block { - display: block; - margin-top: 1em; - margin-bottom: 1em; -} - -.line-block .line-block { - margin-top: 0; - margin-bottom: 0; - margin-left: 1.5em; -} - -.guilabel, .menuselection { - font-family: sans-serif; -} - -.accelerator { - text-decoration: underline; -} - 
-.classifier { - font-style: oblique; -} - -.classifier:before { - font-style: normal; - margin: 0 0.5em; - content: ":"; - display: inline-block; -} - -abbr, acronym { - border-bottom: dotted 1px; - cursor: help; -} - -.translated { - background-color: rgba(207, 255, 207, 0.2) -} - -.untranslated { - background-color: rgba(255, 207, 207, 0.2) -} - -/* -- code displays --------------------------------------------------------- */ - -pre { - overflow: auto; - overflow-y: hidden; /* fixes display issues on Chrome browsers */ -} - -pre, div[class*="highlight-"] { - clear: both; -} - -span.pre { - -moz-hyphens: none; - -ms-hyphens: none; - -webkit-hyphens: none; - hyphens: none; - white-space: nowrap; -} - -div[class*="highlight-"] { - margin: 1em 0; -} - -td.linenos pre { - border: 0; - background-color: transparent; - color: #aaa; -} - -table.highlighttable { - display: block; -} - -table.highlighttable tbody { - display: block; -} - -table.highlighttable tr { - display: flex; -} - -table.highlighttable td { - margin: 0; - padding: 0; -} - -table.highlighttable td.linenos { - padding-right: 0.5em; -} - -table.highlighttable td.code { - flex: 1; - overflow: hidden; -} - -.highlight .hll { - display: block; -} - -div.highlight pre, -table.highlighttable pre { - margin: 0; -} - -div.code-block-caption + div { - margin-top: 0; -} - -div.code-block-caption { - margin-top: 1em; - padding: 2px 5px; - font-size: small; -} - -div.code-block-caption code { - background-color: transparent; -} - -table.highlighttable td.linenos, -span.linenos, -div.highlight span.gp { /* gp: Generic.Prompt */ - user-select: none; - -webkit-user-select: text; /* Safari fallback only */ - -webkit-user-select: none; /* Chrome/Safari */ - -moz-user-select: none; /* Firefox */ - -ms-user-select: none; /* IE10+ */ -} - -div.code-block-caption span.caption-number { - padding: 0.1em 0.3em; - font-style: italic; -} - -div.code-block-caption span.caption-text { -} - -div.literal-block-wrapper { - margin: 
1em 0; -} - -code.xref, a code { - background-color: transparent; - font-weight: bold; -} - -h1 code, h2 code, h3 code, h4 code, h5 code, h6 code { - background-color: transparent; -} - -.viewcode-link { - float: right; -} - -.viewcode-back { - float: right; - font-family: sans-serif; -} - -div.viewcode-block:target { - margin: -1px -10px; - padding: 0 10px; -} - -/* -- math display ---------------------------------------------------------- */ - -img.math { - vertical-align: middle; -} - -div.body div.math p { - text-align: center; -} - -span.eqno { - float: right; -} - -span.eqno a.headerlink { - position: absolute; - z-index: 1; -} - -div.math:hover a.headerlink { - visibility: visible; -} - -/* -- printout stylesheet --------------------------------------------------- */ - -@media print { - div.document, - div.documentwrapper, - div.bodywrapper { - margin: 0 !important; - width: 100%; - } - - div.sphinxsidebar, - div.related, - div.footer, - #top-link { - display: none; - } -} \ No newline at end of file diff --git a/Archive/doc_new/_build/html/_static/custom.css b/Archive/doc_new/_build/html/_static/custom.css deleted file mode 100644 index 2a924f1..0000000 --- a/Archive/doc_new/_build/html/_static/custom.css +++ /dev/null @@ -1 +0,0 @@ -/* This file intentionally left blank. */ diff --git a/Archive/doc_new/_build/html/_static/doctools.js b/Archive/doc_new/_build/html/_static/doctools.js deleted file mode 100644 index 4d67807..0000000 --- a/Archive/doc_new/_build/html/_static/doctools.js +++ /dev/null @@ -1,156 +0,0 @@ -/* - * doctools.js - * ~~~~~~~~~~~ - * - * Base JavaScript utilities for all Sphinx HTML documentation. - * - * :copyright: Copyright 2007-2024 by the Sphinx team, see AUTHORS. - * :license: BSD, see LICENSE for details. 
- * - */ -"use strict"; - -const BLACKLISTED_KEY_CONTROL_ELEMENTS = new Set([ - "TEXTAREA", - "INPUT", - "SELECT", - "BUTTON", -]); - -const _ready = (callback) => { - if (document.readyState !== "loading") { - callback(); - } else { - document.addEventListener("DOMContentLoaded", callback); - } -}; - -/** - * Small JavaScript module for the documentation. - */ -const Documentation = { - init: () => { - Documentation.initDomainIndexTable(); - Documentation.initOnKeyListeners(); - }, - - /** - * i18n support - */ - TRANSLATIONS: {}, - PLURAL_EXPR: (n) => (n === 1 ? 0 : 1), - LOCALE: "unknown", - - // gettext and ngettext don't access this so that the functions - // can safely bound to a different name (_ = Documentation.gettext) - gettext: (string) => { - const translated = Documentation.TRANSLATIONS[string]; - switch (typeof translated) { - case "undefined": - return string; // no translation - case "string": - return translated; // translation exists - default: - return translated[0]; // (singular, plural) translation tuple exists - } - }, - - ngettext: (singular, plural, n) => { - const translated = Documentation.TRANSLATIONS[singular]; - if (typeof translated !== "undefined") - return translated[Documentation.PLURAL_EXPR(n)]; - return n === 1 ? 
singular : plural; - }, - - addTranslations: (catalog) => { - Object.assign(Documentation.TRANSLATIONS, catalog.messages); - Documentation.PLURAL_EXPR = new Function( - "n", - `return (${catalog.plural_expr})` - ); - Documentation.LOCALE = catalog.locale; - }, - - /** - * helper function to focus on search bar - */ - focusSearchBar: () => { - document.querySelectorAll("input[name=q]")[0]?.focus(); - }, - - /** - * Initialise the domain index toggle buttons - */ - initDomainIndexTable: () => { - const toggler = (el) => { - const idNumber = el.id.substr(7); - const toggledRows = document.querySelectorAll(`tr.cg-${idNumber}`); - if (el.src.substr(-9) === "minus.png") { - el.src = `${el.src.substr(0, el.src.length - 9)}plus.png`; - toggledRows.forEach((el) => (el.style.display = "none")); - } else { - el.src = `${el.src.substr(0, el.src.length - 8)}minus.png`; - toggledRows.forEach((el) => (el.style.display = "")); - } - }; - - const togglerElements = document.querySelectorAll("img.toggler"); - togglerElements.forEach((el) => - el.addEventListener("click", (event) => toggler(event.currentTarget)) - ); - togglerElements.forEach((el) => (el.style.display = "")); - if (DOCUMENTATION_OPTIONS.COLLAPSE_INDEX) togglerElements.forEach(toggler); - }, - - initOnKeyListeners: () => { - // only install a listener if it is really needed - if ( - !DOCUMENTATION_OPTIONS.NAVIGATION_WITH_KEYS && - !DOCUMENTATION_OPTIONS.ENABLE_SEARCH_SHORTCUTS - ) - return; - - document.addEventListener("keydown", (event) => { - // bail for input elements - if (BLACKLISTED_KEY_CONTROL_ELEMENTS.has(document.activeElement.tagName)) return; - // bail with special keys - if (event.altKey || event.ctrlKey || event.metaKey) return; - - if (!event.shiftKey) { - switch (event.key) { - case "ArrowLeft": - if (!DOCUMENTATION_OPTIONS.NAVIGATION_WITH_KEYS) break; - - const prevLink = document.querySelector('link[rel="prev"]'); - if (prevLink && prevLink.href) { - window.location.href = prevLink.href; - 
event.preventDefault(); - } - break; - case "ArrowRight": - if (!DOCUMENTATION_OPTIONS.NAVIGATION_WITH_KEYS) break; - - const nextLink = document.querySelector('link[rel="next"]'); - if (nextLink && nextLink.href) { - window.location.href = nextLink.href; - event.preventDefault(); - } - break; - } - } - - // some keyboard layouts may need Shift to get / - switch (event.key) { - case "/": - if (!DOCUMENTATION_OPTIONS.ENABLE_SEARCH_SHORTCUTS) break; - Documentation.focusSearchBar(); - event.preventDefault(); - } - }); - }, -}; - -// quick alias for translations -const _ = Documentation.gettext; - -_ready(Documentation.init); diff --git a/Archive/doc_new/_build/html/_static/documentation_options.js b/Archive/doc_new/_build/html/_static/documentation_options.js deleted file mode 100644 index 7e4c114..0000000 --- a/Archive/doc_new/_build/html/_static/documentation_options.js +++ /dev/null @@ -1,13 +0,0 @@ -const DOCUMENTATION_OPTIONS = { - VERSION: '', - LANGUAGE: 'en', - COLLAPSE_INDEX: false, - BUILDER: 'html', - FILE_SUFFIX: '.html', - LINK_SUFFIX: '.html', - HAS_SOURCE: true, - SOURCELINK_SUFFIX: '.txt', - NAVIGATION_WITH_KEYS: false, - SHOW_SEARCH_SUMMARY: true, - ENABLE_SEARCH_SHORTCUTS: true, -}; \ No newline at end of file diff --git a/Archive/doc_new/_build/html/_static/file.png b/Archive/doc_new/_build/html/_static/file.png deleted file mode 100644 index a858a41..0000000 Binary files a/Archive/doc_new/_build/html/_static/file.png and /dev/null differ diff --git a/Archive/doc_new/_build/html/_static/github-banner.svg b/Archive/doc_new/_build/html/_static/github-banner.svg deleted file mode 100644 index c47d9dc..0000000 --- a/Archive/doc_new/_build/html/_static/github-banner.svg +++ /dev/null @@ -1,5 +0,0 @@ - - - - - diff --git a/Archive/doc_new/_build/html/_static/language_data.js b/Archive/doc_new/_build/html/_static/language_data.js deleted file mode 100644 index 367b8ed..0000000 --- a/Archive/doc_new/_build/html/_static/language_data.js +++ /dev/null @@ 
-1,199 +0,0 @@ -/* - * language_data.js - * ~~~~~~~~~~~~~~~~ - * - * This script contains the language-specific data used by searchtools.js, - * namely the list of stopwords, stemmer, scorer and splitter. - * - * :copyright: Copyright 2007-2024 by the Sphinx team, see AUTHORS. - * :license: BSD, see LICENSE for details. - * - */ - -var stopwords = ["a", "and", "are", "as", "at", "be", "but", "by", "for", "if", "in", "into", "is", "it", "near", "no", "not", "of", "on", "or", "such", "that", "the", "their", "then", "there", "these", "they", "this", "to", "was", "will", "with"]; - - -/* Non-minified version is copied as a separate JS file, if available */ - -/** - * Porter Stemmer - */ -var Stemmer = function() { - - var step2list = { - ational: 'ate', - tional: 'tion', - enci: 'ence', - anci: 'ance', - izer: 'ize', - bli: 'ble', - alli: 'al', - entli: 'ent', - eli: 'e', - ousli: 'ous', - ization: 'ize', - ation: 'ate', - ator: 'ate', - alism: 'al', - iveness: 'ive', - fulness: 'ful', - ousness: 'ous', - aliti: 'al', - iviti: 'ive', - biliti: 'ble', - logi: 'log' - }; - - var step3list = { - icate: 'ic', - ative: '', - alize: 'al', - iciti: 'ic', - ical: 'ic', - ful: '', - ness: '' - }; - - var c = "[^aeiou]"; // consonant - var v = "[aeiouy]"; // vowel - var C = c + "[^aeiouy]*"; // consonant sequence - var V = v + "[aeiou]*"; // vowel sequence - - var mgr0 = "^(" + C + ")?" + V + C; // [C]VC... is m>0 - var meq1 = "^(" + C + ")?" + V + C + "(" + V + ")?$"; // [C]VC[V] is m=1 - var mgr1 = "^(" + C + ")?" + V + C + V + C; // [C]VCVC... is m>1 - var s_v = "^(" + C + ")?" 
+ v; // vowel in stem - - this.stemWord = function (w) { - var stem; - var suffix; - var firstch; - var origword = w; - - if (w.length < 3) - return w; - - var re; - var re2; - var re3; - var re4; - - firstch = w.substr(0,1); - if (firstch == "y") - w = firstch.toUpperCase() + w.substr(1); - - // Step 1a - re = /^(.+?)(ss|i)es$/; - re2 = /^(.+?)([^s])s$/; - - if (re.test(w)) - w = w.replace(re,"$1$2"); - else if (re2.test(w)) - w = w.replace(re2,"$1$2"); - - // Step 1b - re = /^(.+?)eed$/; - re2 = /^(.+?)(ed|ing)$/; - if (re.test(w)) { - var fp = re.exec(w); - re = new RegExp(mgr0); - if (re.test(fp[1])) { - re = /.$/; - w = w.replace(re,""); - } - } - else if (re2.test(w)) { - var fp = re2.exec(w); - stem = fp[1]; - re2 = new RegExp(s_v); - if (re2.test(stem)) { - w = stem; - re2 = /(at|bl|iz)$/; - re3 = new RegExp("([^aeiouylsz])\\1$"); - re4 = new RegExp("^" + C + v + "[^aeiouwxy]$"); - if (re2.test(w)) - w = w + "e"; - else if (re3.test(w)) { - re = /.$/; - w = w.replace(re,""); - } - else if (re4.test(w)) - w = w + "e"; - } - } - - // Step 1c - re = /^(.+?)y$/; - if (re.test(w)) { - var fp = re.exec(w); - stem = fp[1]; - re = new RegExp(s_v); - if (re.test(stem)) - w = stem + "i"; - } - - // Step 2 - re = /^(.+?)(ational|tional|enci|anci|izer|bli|alli|entli|eli|ousli|ization|ation|ator|alism|iveness|fulness|ousness|aliti|iviti|biliti|logi)$/; - if (re.test(w)) { - var fp = re.exec(w); - stem = fp[1]; - suffix = fp[2]; - re = new RegExp(mgr0); - if (re.test(stem)) - w = stem + step2list[suffix]; - } - - // Step 3 - re = /^(.+?)(icate|ative|alize|iciti|ical|ful|ness)$/; - if (re.test(w)) { - var fp = re.exec(w); - stem = fp[1]; - suffix = fp[2]; - re = new RegExp(mgr0); - if (re.test(stem)) - w = stem + step3list[suffix]; - } - - // Step 4 - re = /^(.+?)(al|ance|ence|er|ic|able|ible|ant|ement|ment|ent|ou|ism|ate|iti|ous|ive|ize)$/; - re2 = /^(.+?)(s|t)(ion)$/; - if (re.test(w)) { - var fp = re.exec(w); - stem = fp[1]; - re = new RegExp(mgr1); - if 
(re.test(stem)) - w = stem; - } - else if (re2.test(w)) { - var fp = re2.exec(w); - stem = fp[1] + fp[2]; - re2 = new RegExp(mgr1); - if (re2.test(stem)) - w = stem; - } - - // Step 5 - re = /^(.+?)e$/; - if (re.test(w)) { - var fp = re.exec(w); - stem = fp[1]; - re = new RegExp(mgr1); - re2 = new RegExp(meq1); - re3 = new RegExp("^" + C + v + "[^aeiouwxy]$"); - if (re.test(stem) || (re2.test(stem) && !(re3.test(stem)))) - w = stem; - } - re = /ll$/; - re2 = new RegExp(mgr1); - if (re.test(w) && re2.test(w)) { - re = /.$/; - w = w.replace(re,""); - } - - // and turn initial Y back to y - if (firstch == "y") - w = firstch.toLowerCase() + w.substr(1); - return w; - } -} - diff --git a/Archive/doc_new/_build/html/_static/minus.png b/Archive/doc_new/_build/html/_static/minus.png deleted file mode 100644 index d96755f..0000000 Binary files a/Archive/doc_new/_build/html/_static/minus.png and /dev/null differ diff --git a/Archive/doc_new/_build/html/_static/plus.png b/Archive/doc_new/_build/html/_static/plus.png deleted file mode 100644 index 7107cec..0000000 Binary files a/Archive/doc_new/_build/html/_static/plus.png and /dev/null differ diff --git a/Archive/doc_new/_build/html/_static/pygments.css b/Archive/doc_new/_build/html/_static/pygments.css deleted file mode 100644 index 04a4174..0000000 --- a/Archive/doc_new/_build/html/_static/pygments.css +++ /dev/null @@ -1,84 +0,0 @@ -pre { line-height: 125%; } -td.linenos .normal { color: inherit; background-color: transparent; padding-left: 5px; padding-right: 5px; } -span.linenos { color: inherit; background-color: transparent; padding-left: 5px; padding-right: 5px; } -td.linenos .special { color: #000000; background-color: #ffffc0; padding-left: 5px; padding-right: 5px; } -span.linenos.special { color: #000000; background-color: #ffffc0; padding-left: 5px; padding-right: 5px; } -.highlight .hll { background-color: #ffffcc } -.highlight { background: #f8f8f8; } -.highlight .c { color: #8f5902; font-style: italic } /* 
Comment */ -.highlight .err { color: #a40000; border: 1px solid #ef2929 } /* Error */ -.highlight .g { color: #000000 } /* Generic */ -.highlight .k { color: #004461; font-weight: bold } /* Keyword */ -.highlight .l { color: #000000 } /* Literal */ -.highlight .n { color: #000000 } /* Name */ -.highlight .o { color: #582800 } /* Operator */ -.highlight .x { color: #000000 } /* Other */ -.highlight .p { color: #000000; font-weight: bold } /* Punctuation */ -.highlight .ch { color: #8f5902; font-style: italic } /* Comment.Hashbang */ -.highlight .cm { color: #8f5902; font-style: italic } /* Comment.Multiline */ -.highlight .cp { color: #8f5902 } /* Comment.Preproc */ -.highlight .cpf { color: #8f5902; font-style: italic } /* Comment.PreprocFile */ -.highlight .c1 { color: #8f5902; font-style: italic } /* Comment.Single */ -.highlight .cs { color: #8f5902; font-style: italic } /* Comment.Special */ -.highlight .gd { color: #a40000 } /* Generic.Deleted */ -.highlight .ge { color: #000000; font-style: italic } /* Generic.Emph */ -.highlight .ges { color: #000000 } /* Generic.EmphStrong */ -.highlight .gr { color: #ef2929 } /* Generic.Error */ -.highlight .gh { color: #000080; font-weight: bold } /* Generic.Heading */ -.highlight .gi { color: #00A000 } /* Generic.Inserted */ -.highlight .go { color: #888888 } /* Generic.Output */ -.highlight .gp { color: #745334 } /* Generic.Prompt */ -.highlight .gs { color: #000000; font-weight: bold } /* Generic.Strong */ -.highlight .gu { color: #800080; font-weight: bold } /* Generic.Subheading */ -.highlight .gt { color: #a40000; font-weight: bold } /* Generic.Traceback */ -.highlight .kc { color: #004461; font-weight: bold } /* Keyword.Constant */ -.highlight .kd { color: #004461; font-weight: bold } /* Keyword.Declaration */ -.highlight .kn { color: #004461; font-weight: bold } /* Keyword.Namespace */ -.highlight .kp { color: #004461; font-weight: bold } /* Keyword.Pseudo */ -.highlight .kr { color: #004461; font-weight: bold } 
/* Keyword.Reserved */ -.highlight .kt { color: #004461; font-weight: bold } /* Keyword.Type */ -.highlight .ld { color: #000000 } /* Literal.Date */ -.highlight .m { color: #990000 } /* Literal.Number */ -.highlight .s { color: #4e9a06 } /* Literal.String */ -.highlight .na { color: #c4a000 } /* Name.Attribute */ -.highlight .nb { color: #004461 } /* Name.Builtin */ -.highlight .nc { color: #000000 } /* Name.Class */ -.highlight .no { color: #000000 } /* Name.Constant */ -.highlight .nd { color: #888888 } /* Name.Decorator */ -.highlight .ni { color: #ce5c00 } /* Name.Entity */ -.highlight .ne { color: #cc0000; font-weight: bold } /* Name.Exception */ -.highlight .nf { color: #000000 } /* Name.Function */ -.highlight .nl { color: #f57900 } /* Name.Label */ -.highlight .nn { color: #000000 } /* Name.Namespace */ -.highlight .nx { color: #000000 } /* Name.Other */ -.highlight .py { color: #000000 } /* Name.Property */ -.highlight .nt { color: #004461; font-weight: bold } /* Name.Tag */ -.highlight .nv { color: #000000 } /* Name.Variable */ -.highlight .ow { color: #004461; font-weight: bold } /* Operator.Word */ -.highlight .pm { color: #000000; font-weight: bold } /* Punctuation.Marker */ -.highlight .w { color: #f8f8f8 } /* Text.Whitespace */ -.highlight .mb { color: #990000 } /* Literal.Number.Bin */ -.highlight .mf { color: #990000 } /* Literal.Number.Float */ -.highlight .mh { color: #990000 } /* Literal.Number.Hex */ -.highlight .mi { color: #990000 } /* Literal.Number.Integer */ -.highlight .mo { color: #990000 } /* Literal.Number.Oct */ -.highlight .sa { color: #4e9a06 } /* Literal.String.Affix */ -.highlight .sb { color: #4e9a06 } /* Literal.String.Backtick */ -.highlight .sc { color: #4e9a06 } /* Literal.String.Char */ -.highlight .dl { color: #4e9a06 } /* Literal.String.Delimiter */ -.highlight .sd { color: #8f5902; font-style: italic } /* Literal.String.Doc */ -.highlight .s2 { color: #4e9a06 } /* Literal.String.Double */ -.highlight .se { color: #4e9a06 
} /* Literal.String.Escape */ -.highlight .sh { color: #4e9a06 } /* Literal.String.Heredoc */ -.highlight .si { color: #4e9a06 } /* Literal.String.Interpol */ -.highlight .sx { color: #4e9a06 } /* Literal.String.Other */ -.highlight .sr { color: #4e9a06 } /* Literal.String.Regex */ -.highlight .s1 { color: #4e9a06 } /* Literal.String.Single */ -.highlight .ss { color: #4e9a06 } /* Literal.String.Symbol */ -.highlight .bp { color: #3465a4 } /* Name.Builtin.Pseudo */ -.highlight .fm { color: #000000 } /* Name.Function.Magic */ -.highlight .vc { color: #000000 } /* Name.Variable.Class */ -.highlight .vg { color: #000000 } /* Name.Variable.Global */ -.highlight .vi { color: #000000 } /* Name.Variable.Instance */ -.highlight .vm { color: #000000 } /* Name.Variable.Magic */ -.highlight .il { color: #990000 } /* Literal.Number.Integer.Long */ \ No newline at end of file diff --git a/Archive/doc_new/_build/html/_static/searchtools.js b/Archive/doc_new/_build/html/_static/searchtools.js deleted file mode 100644 index b08d58c..0000000 --- a/Archive/doc_new/_build/html/_static/searchtools.js +++ /dev/null @@ -1,620 +0,0 @@ -/* - * searchtools.js - * ~~~~~~~~~~~~~~~~ - * - * Sphinx JavaScript utilities for the full-text search. - * - * :copyright: Copyright 2007-2024 by the Sphinx team, see AUTHORS. - * :license: BSD, see LICENSE for details. - * - */ -"use strict"; - -/** - * Simple result scoring code. - */ -if (typeof Scorer === "undefined") { - var Scorer = { - // Implement the following function to further tweak the score for each result - // The function takes a result array [docname, title, anchor, descr, score, filename] - // and returns the new score. 
- /* - score: result => { - const [docname, title, anchor, descr, score, filename] = result - return score - }, - */ - - // query matches the full name of an object - objNameMatch: 11, - // or matches in the last dotted part of the object name - objPartialMatch: 6, - // Additive scores depending on the priority of the object - objPrio: { - 0: 15, // used to be importantResults - 1: 5, // used to be objectResults - 2: -5, // used to be unimportantResults - }, - // Used when the priority is not in the mapping. - objPrioDefault: 0, - - // query found in title - title: 15, - partialTitle: 7, - // query found in terms - term: 5, - partialTerm: 2, - }; -} - -const _removeChildren = (element) => { - while (element && element.lastChild) element.removeChild(element.lastChild); -}; - -/** - * See https://developer.mozilla.org/en-US/docs/Web/JavaScript/Guide/Regular_Expressions#escaping - */ -const _escapeRegExp = (string) => - string.replace(/[.*+\-?^${}()|[\]\\]/g, "\\$&"); // $& means the whole matched string - -const _displayItem = (item, searchTerms, highlightTerms) => { - const docBuilder = DOCUMENTATION_OPTIONS.BUILDER; - const docFileSuffix = DOCUMENTATION_OPTIONS.FILE_SUFFIX; - const docLinkSuffix = DOCUMENTATION_OPTIONS.LINK_SUFFIX; - const showSearchSummary = DOCUMENTATION_OPTIONS.SHOW_SEARCH_SUMMARY; - const contentRoot = document.documentElement.dataset.content_root; - - const [docName, title, anchor, descr, score, _filename] = item; - - let listItem = document.createElement("li"); - let requestUrl; - let linkUrl; - if (docBuilder === "dirhtml") { - // dirhtml builder - let dirname = docName + "/"; - if (dirname.match(/\/index\/$/)) - dirname = dirname.substring(0, dirname.length - 6); - else if (dirname === "index/") dirname = ""; - requestUrl = contentRoot + dirname; - linkUrl = requestUrl; - } else { - // normal html builders - requestUrl = contentRoot + docName + docFileSuffix; - linkUrl = docName + docLinkSuffix; - } - let linkEl = 
listItem.appendChild(document.createElement("a")); - linkEl.href = linkUrl + anchor; - linkEl.dataset.score = score; - linkEl.innerHTML = title; - if (descr) { - listItem.appendChild(document.createElement("span")).innerHTML = - " (" + descr + ")"; - // highlight search terms in the description - if (SPHINX_HIGHLIGHT_ENABLED) // set in sphinx_highlight.js - highlightTerms.forEach((term) => _highlightText(listItem, term, "highlighted")); - } - else if (showSearchSummary) - fetch(requestUrl) - .then((responseData) => responseData.text()) - .then((data) => { - if (data) - listItem.appendChild( - Search.makeSearchSummary(data, searchTerms, anchor) - ); - // highlight search terms in the summary - if (SPHINX_HIGHLIGHT_ENABLED) // set in sphinx_highlight.js - highlightTerms.forEach((term) => _highlightText(listItem, term, "highlighted")); - }); - Search.output.appendChild(listItem); -}; -const _finishSearch = (resultCount) => { - Search.stopPulse(); - Search.title.innerText = _("Search Results"); - if (!resultCount) - Search.status.innerText = Documentation.gettext( - "Your search did not match any documents. Please make sure that all words are spelled correctly and that you've selected enough categories." - ); - else - Search.status.innerText = _( - "Search finished, found ${resultCount} page(s) matching the search query." - ).replace('${resultCount}', resultCount); -}; -const _displayNextItem = ( - results, - resultCount, - searchTerms, - highlightTerms, -) => { - // results left, load the summary and display it - // this is intended to be dynamic (don't sub resultsCount) - if (results.length) { - _displayItem(results.pop(), searchTerms, highlightTerms); - setTimeout( - () => _displayNextItem(results, resultCount, searchTerms, highlightTerms), - 5 - ); - } - // search finished, update title and status message - else _finishSearch(resultCount); -}; -// Helper function used by query() to order search results. 
-// Each input is an array of [docname, title, anchor, descr, score, filename]. -// Order the results by score (in opposite order of appearance, since the -// `_displayNextItem` function uses pop() to retrieve items) and then alphabetically. -const _orderResultsByScoreThenName = (a, b) => { - const leftScore = a[4]; - const rightScore = b[4]; - if (leftScore === rightScore) { - // same score: sort alphabetically - const leftTitle = a[1].toLowerCase(); - const rightTitle = b[1].toLowerCase(); - if (leftTitle === rightTitle) return 0; - return leftTitle > rightTitle ? -1 : 1; // inverted is intentional - } - return leftScore > rightScore ? 1 : -1; -}; - -/** - * Default splitQuery function. Can be overridden in ``sphinx.search`` with a - * custom function per language. - * - * The regular expression works by splitting the string on consecutive characters - * that are not Unicode letters, numbers, underscores, or emoji characters. - * This is the same as ``\W+`` in Python, preserving the surrogate pair area. - */ -if (typeof splitQuery === "undefined") { - var splitQuery = (query) => query - .split(/[^\p{Letter}\p{Number}_\p{Emoji_Presentation}]+/gu) - .filter(term => term) // remove remaining empty strings -} - -/** - * Search Module - */ -const Search = { - _index: null, - _queued_query: null, - _pulse_status: -1, - - htmlToText: (htmlString, anchor) => { - const htmlElement = new DOMParser().parseFromString(htmlString, 'text/html'); - for (const removalQuery of [".headerlink", "script", "style"]) { - htmlElement.querySelectorAll(removalQuery).forEach((el) => { el.remove() }); - } - if (anchor) { - const anchorContent = htmlElement.querySelector(`[role="main"] ${anchor}`); - if (anchorContent) return anchorContent.textContent; - - console.warn( - `Anchored content block not found. Sphinx search tries to obtain it via DOM query '[role=main] ${anchor}'. 
Check your theme or template.` - ); - } - - // if anchor not specified or not found, fall back to main content - const docContent = htmlElement.querySelector('[role="main"]'); - if (docContent) return docContent.textContent; - - console.warn( - "Content block not found. Sphinx search tries to obtain it via DOM query '[role=main]'. Check your theme or template." - ); - return ""; - }, - - init: () => { - const query = new URLSearchParams(window.location.search).get("q"); - document - .querySelectorAll('input[name="q"]') - .forEach((el) => (el.value = query)); - if (query) Search.performSearch(query); - }, - - loadIndex: (url) => - (document.body.appendChild(document.createElement("script")).src = url), - - setIndex: (index) => { - Search._index = index; - if (Search._queued_query !== null) { - const query = Search._queued_query; - Search._queued_query = null; - Search.query(query); - } - }, - - hasIndex: () => Search._index !== null, - - deferQuery: (query) => (Search._queued_query = query), - - stopPulse: () => (Search._pulse_status = -1), - - startPulse: () => { - if (Search._pulse_status >= 0) return; - - const pulse = () => { - Search._pulse_status = (Search._pulse_status + 1) % 4; - Search.dots.innerText = ".".repeat(Search._pulse_status); - if (Search._pulse_status >= 0) window.setTimeout(pulse, 500); - }; - pulse(); - }, - - /** - * perform a search for something (or wait until index is loaded) - */ - performSearch: (query) => { - // create the required interface elements - const searchText = document.createElement("h2"); - searchText.textContent = _("Searching"); - const searchSummary = document.createElement("p"); - searchSummary.classList.add("search-summary"); - searchSummary.innerText = ""; - const searchList = document.createElement("ul"); - searchList.classList.add("search"); - - const out = document.getElementById("search-results"); - Search.title = out.appendChild(searchText); - Search.dots = Search.title.appendChild(document.createElement("span")); 
- Search.status = out.appendChild(searchSummary); - Search.output = out.appendChild(searchList); - - const searchProgress = document.getElementById("search-progress"); - // Some themes don't use the search progress node - if (searchProgress) { - searchProgress.innerText = _("Preparing search..."); - } - Search.startPulse(); - - // index already loaded, the browser was quick! - if (Search.hasIndex()) Search.query(query); - else Search.deferQuery(query); - }, - - _parseQuery: (query) => { - // stem the search terms and add them to the correct list - const stemmer = new Stemmer(); - const searchTerms = new Set(); - const excludedTerms = new Set(); - const highlightTerms = new Set(); - const objectTerms = new Set(splitQuery(query.toLowerCase().trim())); - splitQuery(query.trim()).forEach((queryTerm) => { - const queryTermLower = queryTerm.toLowerCase(); - - // maybe skip this "word" - // stopwords array is from language_data.js - if ( - stopwords.indexOf(queryTermLower) !== -1 || - queryTerm.match(/^\d+$/) - ) - return; - - // stem the word - let word = stemmer.stemWord(queryTermLower); - // select the correct list - if (word[0] === "-") excludedTerms.add(word.substr(1)); - else { - searchTerms.add(word); - highlightTerms.add(queryTermLower); - } - }); - - if (SPHINX_HIGHLIGHT_ENABLED) { // set in sphinx_highlight.js - localStorage.setItem("sphinx_highlight_terms", [...highlightTerms].join(" ")) - } - - // console.debug("SEARCH: searching for:"); - // console.info("required: ", [...searchTerms]); - // console.info("excluded: ", [...excludedTerms]); - - return [query, searchTerms, excludedTerms, highlightTerms, objectTerms]; - }, - - /** - * execute search (requires search index to be loaded) - */ - _performSearch: (query, searchTerms, excludedTerms, highlightTerms, objectTerms) => { - const filenames = Search._index.filenames; - const docNames = Search._index.docnames; - const titles = Search._index.titles; - const allTitles = Search._index.alltitles; - const 
indexEntries = Search._index.indexentries; - - // Collect multiple result groups to be sorted separately and then ordered. - // Each is an array of [docname, title, anchor, descr, score, filename]. - const normalResults = []; - const nonMainIndexResults = []; - - _removeChildren(document.getElementById("search-progress")); - - const queryLower = query.toLowerCase().trim(); - for (const [title, foundTitles] of Object.entries(allTitles)) { - if (title.toLowerCase().trim().includes(queryLower) && (queryLower.length >= title.length/2)) { - for (const [file, id] of foundTitles) { - const score = Math.round(Scorer.title * queryLower.length / title.length); - const boost = titles[file] === title ? 1 : 0; // add a boost for document titles - normalResults.push([ - docNames[file], - titles[file] !== title ? `${titles[file]} > ${title}` : title, - id !== null ? "#" + id : "", - null, - score + boost, - filenames[file], - ]); - } - } - } - - // search for explicit entries in index directives - for (const [entry, foundEntries] of Object.entries(indexEntries)) { - if (entry.includes(queryLower) && (queryLower.length >= entry.length/2)) { - for (const [file, id, isMain] of foundEntries) { - const score = Math.round(100 * queryLower.length / entry.length); - const result = [ - docNames[file], - titles[file], - id ? 
"#" + id : "", - null, - score, - filenames[file], - ]; - if (isMain) { - normalResults.push(result); - } else { - nonMainIndexResults.push(result); - } - } - } - } - - // lookup as object - objectTerms.forEach((term) => - normalResults.push(...Search.performObjectSearch(term, objectTerms)) - ); - - // lookup as search terms in fulltext - normalResults.push(...Search.performTermsSearch(searchTerms, excludedTerms)); - - // let the scorer override scores with a custom scoring function - if (Scorer.score) { - normalResults.forEach((item) => (item[4] = Scorer.score(item))); - nonMainIndexResults.forEach((item) => (item[4] = Scorer.score(item))); - } - - // Sort each group of results by score and then alphabetically by name. - normalResults.sort(_orderResultsByScoreThenName); - nonMainIndexResults.sort(_orderResultsByScoreThenName); - - // Combine the result groups in (reverse) order. - // Non-main index entries are typically arbitrary cross-references, - // so display them after other results. 
- let results = [...nonMainIndexResults, ...normalResults]; - - // remove duplicate search results - // note the reversing of results, so that in the case of duplicates, the highest-scoring entry is kept - let seen = new Set(); - results = results.reverse().reduce((acc, result) => { - let resultStr = result.slice(0, 4).concat([result[5]]).map(v => String(v)).join(','); - if (!seen.has(resultStr)) { - acc.push(result); - seen.add(resultStr); - } - return acc; - }, []); - - return results.reverse(); - }, - - query: (query) => { - const [searchQuery, searchTerms, excludedTerms, highlightTerms, objectTerms] = Search._parseQuery(query); - const results = Search._performSearch(searchQuery, searchTerms, excludedTerms, highlightTerms, objectTerms); - - // for debugging - //Search.lastresults = results.slice(); // a copy - // console.info("search results:", Search.lastresults); - - // print the results - _displayNextItem(results, results.length, searchTerms, highlightTerms); - }, - - /** - * search for object names - */ - performObjectSearch: (object, objectTerms) => { - const filenames = Search._index.filenames; - const docNames = Search._index.docnames; - const objects = Search._index.objects; - const objNames = Search._index.objnames; - const titles = Search._index.titles; - - const results = []; - - const objectSearchCallback = (prefix, match) => { - const name = match[4] - const fullname = (prefix ? prefix + "." : "") + name; - const fullnameLower = fullname.toLowerCase(); - if (fullnameLower.indexOf(object) < 0) return; - - let score = 0; - const parts = fullnameLower.split("."); - - // check for different match types: exact matches of full name or - // "last name" (i.e. 
last dotted part) - if (fullnameLower === object || parts.slice(-1)[0] === object) - score += Scorer.objNameMatch; - else if (parts.slice(-1)[0].indexOf(object) > -1) - score += Scorer.objPartialMatch; // matches in last name - - const objName = objNames[match[1]][2]; - const title = titles[match[0]]; - - // If more than one term searched for, we require other words to be - // found in the name/title/description - const otherTerms = new Set(objectTerms); - otherTerms.delete(object); - if (otherTerms.size > 0) { - const haystack = `${prefix} ${name} ${objName} ${title}`.toLowerCase(); - if ( - [...otherTerms].some((otherTerm) => haystack.indexOf(otherTerm) < 0) - ) - return; - } - - let anchor = match[3]; - if (anchor === "") anchor = fullname; - else if (anchor === "-") anchor = objNames[match[1]][1] + "-" + fullname; - - const descr = objName + _(", in ") + title; - - // add custom score for some objects according to scorer - if (Scorer.objPrio.hasOwnProperty(match[2])) - score += Scorer.objPrio[match[2]]; - else score += Scorer.objPrioDefault; - - results.push([ - docNames[match[0]], - fullname, - "#" + anchor, - descr, - score, - filenames[match[0]], - ]); - }; - Object.keys(objects).forEach((prefix) => - objects[prefix].forEach((array) => - objectSearchCallback(prefix, array) - ) - ); - return results; - }, - - /** - * search for full-text terms in the index - */ - performTermsSearch: (searchTerms, excludedTerms) => { - // prepare search - const terms = Search._index.terms; - const titleTerms = Search._index.titleterms; - const filenames = Search._index.filenames; - const docNames = Search._index.docnames; - const titles = Search._index.titles; - - const scoreMap = new Map(); - const fileMap = new Map(); - - // perform the search on the required terms - searchTerms.forEach((word) => { - const files = []; - const arr = [ - { files: terms[word], score: Scorer.term }, - { files: titleTerms[word], score: Scorer.title }, - ]; - // add support for partial matches - 
if (word.length > 2) { - const escapedWord = _escapeRegExp(word); - if (!terms.hasOwnProperty(word)) { - Object.keys(terms).forEach((term) => { - if (term.match(escapedWord)) - arr.push({ files: terms[term], score: Scorer.partialTerm }); - }); - } - if (!titleTerms.hasOwnProperty(word)) { - Object.keys(titleTerms).forEach((term) => { - if (term.match(escapedWord)) - arr.push({ files: titleTerms[term], score: Scorer.partialTitle }); - }); - } - } - - // no match but word was a required one - if (arr.every((record) => record.files === undefined)) return; - - // found search word in contents - arr.forEach((record) => { - if (record.files === undefined) return; - - let recordFiles = record.files; - if (recordFiles.length === undefined) recordFiles = [recordFiles]; - files.push(...recordFiles); - - // set score for the word in each file - recordFiles.forEach((file) => { - if (!scoreMap.has(file)) scoreMap.set(file, {}); - scoreMap.get(file)[word] = record.score; - }); - }); - - // create the mapping - files.forEach((file) => { - if (!fileMap.has(file)) fileMap.set(file, [word]); - else if (fileMap.get(file).indexOf(word) === -1) fileMap.get(file).push(word); - }); - }); - - // now check if the files don't contain excluded terms - const results = []; - for (const [file, wordList] of fileMap) { - // check if all requirements are matched - - // as search terms with length < 3 are discarded - const filteredTermCount = [...searchTerms].filter( - (term) => term.length > 2 - ).length; - if ( - wordList.length !== searchTerms.size && - wordList.length !== filteredTermCount - ) - continue; - - // ensure that none of the excluded terms is in the search result - if ( - [...excludedTerms].some( - (term) => - terms[term] === file || - titleTerms[term] === file || - (terms[term] || []).includes(file) || - (titleTerms[term] || []).includes(file) - ) - ) - break; - - // select one (max) score for the file. 
- const score = Math.max(...wordList.map((w) => scoreMap.get(file)[w])); - // add result to the result list - results.push([ - docNames[file], - titles[file], - "", - null, - score, - filenames[file], - ]); - } - return results; - }, - - /** - * helper function to return a node containing the - * search summary for a given text. keywords is a list - * of stemmed words. - */ - makeSearchSummary: (htmlText, keywords, anchor) => { - const text = Search.htmlToText(htmlText, anchor); - if (text === "") return null; - - const textLower = text.toLowerCase(); - const actualStartPosition = [...keywords] - .map((k) => textLower.indexOf(k.toLowerCase())) - .filter((i) => i > -1) - .slice(-1)[0]; - const startWithContext = Math.max(actualStartPosition - 120, 0); - - const top = startWithContext === 0 ? "" : "..."; - const tail = startWithContext + 240 < text.length ? "..." : ""; - - let summary = document.createElement("p"); - summary.classList.add("context"); - summary.textContent = top + text.substr(startWithContext, 240).trim() + tail; - - return summary; - }, -}; - -_ready(Search.init); diff --git a/Archive/doc_new/_build/html/_static/sphinx_highlight.js b/Archive/doc_new/_build/html/_static/sphinx_highlight.js deleted file mode 100644 index 8a96c69..0000000 --- a/Archive/doc_new/_build/html/_static/sphinx_highlight.js +++ /dev/null @@ -1,154 +0,0 @@ -/* Highlighting utilities for Sphinx HTML documentation. */ -"use strict"; - -const SPHINX_HIGHLIGHT_ENABLED = true - -/** - * highlight a given string on a node by wrapping it in - * span elements with the given class name. 
- */ -const _highlight = (node, addItems, text, className) => { - if (node.nodeType === Node.TEXT_NODE) { - const val = node.nodeValue; - const parent = node.parentNode; - const pos = val.toLowerCase().indexOf(text); - if ( - pos >= 0 && - !parent.classList.contains(className) && - !parent.classList.contains("nohighlight") - ) { - let span; - - const closestNode = parent.closest("body, svg, foreignObject"); - const isInSVG = closestNode && closestNode.matches("svg"); - if (isInSVG) { - span = document.createElementNS("http://www.w3.org/2000/svg", "tspan"); - } else { - span = document.createElement("span"); - span.classList.add(className); - } - - span.appendChild(document.createTextNode(val.substr(pos, text.length))); - const rest = document.createTextNode(val.substr(pos + text.length)); - parent.insertBefore( - span, - parent.insertBefore( - rest, - node.nextSibling - ) - ); - node.nodeValue = val.substr(0, pos); - /* There may be more occurrences of search term in this node. So call this - * function recursively on the remaining fragment. - */ - _highlight(rest, addItems, text, className); - - if (isInSVG) { - const rect = document.createElementNS( - "http://www.w3.org/2000/svg", - "rect" - ); - const bbox = parent.getBBox(); - rect.x.baseVal.value = bbox.x; - rect.y.baseVal.value = bbox.y; - rect.width.baseVal.value = bbox.width; - rect.height.baseVal.value = bbox.height; - rect.setAttribute("class", className); - addItems.push({ parent: parent, target: rect }); - } - } - } else if (node.matches && !node.matches("button, select, textarea")) { - node.childNodes.forEach((el) => _highlight(el, addItems, text, className)); - } -}; -const _highlightText = (thisNode, text, className) => { - let addItems = []; - _highlight(thisNode, addItems, text, className); - addItems.forEach((obj) => - obj.parent.insertAdjacentElement("beforebegin", obj.target) - ); -}; - -/** - * Small JavaScript module for the documentation. 
- */ -const SphinxHighlight = { - - /** - * highlight the search words provided in localstorage in the text - */ - highlightSearchWords: () => { - if (!SPHINX_HIGHLIGHT_ENABLED) return; // bail if no highlight - - // get and clear terms from localstorage - const url = new URL(window.location); - const highlight = - localStorage.getItem("sphinx_highlight_terms") - || url.searchParams.get("highlight") - || ""; - localStorage.removeItem("sphinx_highlight_terms") - url.searchParams.delete("highlight"); - window.history.replaceState({}, "", url); - - // get individual terms from highlight string - const terms = highlight.toLowerCase().split(/\s+/).filter(x => x); - if (terms.length === 0) return; // nothing to do - - // There should never be more than one element matching "div.body" - const divBody = document.querySelectorAll("div.body"); - const body = divBody.length ? divBody[0] : document.querySelector("body"); - window.setTimeout(() => { - terms.forEach((term) => _highlightText(body, term, "highlighted")); - }, 10); - - const searchBox = document.getElementById("searchbox"); - if (searchBox === null) return; - searchBox.appendChild( - document - .createRange() - .createContextualFragment( - '" - ) - ); - }, - - /** - * helper function to hide the search marks again - */ - hideSearchWords: () => { - document - .querySelectorAll("#searchbox .highlight-link") - .forEach((el) => el.remove()); - document - .querySelectorAll("span.highlighted") - .forEach((el) => el.classList.remove("highlighted")); - localStorage.removeItem("sphinx_highlight_terms") - }, - - initEscapeListener: () => { - // only install a listener if it is really needed - if (!DOCUMENTATION_OPTIONS.ENABLE_SEARCH_SHORTCUTS) return; - - document.addEventListener("keydown", (event) => { - // bail for input elements - if (BLACKLISTED_KEY_CONTROL_ELEMENTS.has(document.activeElement.tagName)) return; - // bail with special keys - if (event.shiftKey || event.altKey || event.ctrlKey || event.metaKey) return; - 
if (DOCUMENTATION_OPTIONS.ENABLE_SEARCH_SHORTCUTS && (event.key === "Escape")) { - SphinxHighlight.hideSearchWords(); - event.preventDefault(); - } - }); - }, -}; - -_ready(() => { - /* Do not call highlightSearchWords() when we are on the search page. - * It will highlight words from the *previous* search query. - */ - if (typeof Search === "undefined") SphinxHighlight.highlightSearchWords(); - SphinxHighlight.initEscapeListener(); -}); diff --git a/Archive/doc_new/_build/html/genindex.html b/Archive/doc_new/_build/html/genindex.html deleted file mode 100644 index 6c4ad34..0000000 --- a/Archive/doc_new/_build/html/genindex.html +++ /dev/null @@ -1,167 +0,0 @@ - - - - - - - Index — Katabatic documentation - - - - - - - - - - - - - - - - - - -
-
- -
- -
-
- - - - - - - \ No newline at end of file diff --git a/Archive/doc_new/_build/html/index.html b/Archive/doc_new/_build/html/index.html deleted file mode 100644 index 50fd92f..0000000 --- a/Archive/doc_new/_build/html/index.html +++ /dev/null @@ -1,118 +0,0 @@ - - - - - - - - Katabatic documentation — Katabatic documentation - - - - - - - - - - - - - - - - - - - -
-
-
- - -
- -
-

Katabatic documentation

-

Add your content using reStructuredText syntax. See the -reStructuredText -documentation for details.

-
-

Contents:

- -
-
- - -
- -
-
- -
-
- - - - - - - \ No newline at end of file diff --git a/Archive/doc_new/_build/html/katabatic.models.ganblr/ganblr.html b/Archive/doc_new/_build/html/katabatic.models.ganblr/ganblr.html deleted file mode 100644 index f620dae..0000000 --- a/Archive/doc_new/_build/html/katabatic.models.ganblr/ganblr.html +++ /dev/null @@ -1,205 +0,0 @@ - - - - - - - - GANBLR Model Class — Katabatic documentation - - - - - - - - - - - - - - - - - - - - -
-
-
- - -
- -
-

GANBLR Model Class

-

The GANBLR model combines a Bayesian network with a neural network. It leverages the k-dependence Bayesian (kDB) algorithm to build a dependency graph, and uses various utility functions for data preparation and model constraints.

-

Defined in ganblr.py

-
-

Class Properties

-
    -
  • _d: -Placeholder for data.

  • -
  • __gen_weights: -Placeholder for generator weights.

  • -
  • batch_size (int): -Batch size for training.

  • -
  • epochs (int): -Number of epochs for training.

  • -
  • k: -Parameter for the model.

  • -
  • constraints: -Constraints for the model.

  • -
  • _ordinal_encoder: -Ordinal encoder for preprocessing data.

  • -
  • _label_encoder: -Label encoder for preprocessing data.

  • -
-
-
-

Class Methods

-
-
__init__()

Initializes the model with default values.

-
-
fit(x, y, k=0, batch_size=32, epochs=10, warmup_epochs=1, verbose=0)

Fits the model to the given data.

-
    -
  • Parameters: -- x: Input data. -- y: Labels for the data. -- k: Parameter for the model (default: 0). -- batch_size: Size of batches for training (default: 32). -- epochs: Number of training epochs (default: 10). -- warmup_epochs: Number of warmup epochs (default: 1). -- verbose: Verbosity level (default: 0).

  • -
  • Returns: -The fitted model.

  • -
-
-
_augment_cpd(d, size=None, verbose=0)

Augments the Conditional Probability Distribution (CPD).

-
    -
  • Parameters: -- d: Data. -- size: Size of the sample (default: None). -- verbose: Verbosity level (default: 0).

  • -
  • Returns: -The augmented data.

  • -
-
-
_warmup_run(epochs, verbose=None)

Runs a warmup phase.

-
    -
  • Parameters: -- epochs: Number of epochs for warmup. -- verbose: Verbosity level (default: None).

  • -
  • Returns: -The history of the warmup run.

  • -
-
-
_run_generator(loss)

Runs the generator model.

-
    -
  • Parameters: -- loss: Loss function.

  • -
  • Returns: -The history of the generator run.

  • -
-
-
_discrim()

Creates a discriminator model.

-
    -
  • Returns: -The discriminator model.

  • -
-
-
_sample(size=None, verbose=0)

Generates synthetic data in ordinal encoding format.

-
    -
  • Parameters: -- size: Size of the sample (default: None). -- verbose: Verbosity level (default: 0).

  • -
  • Returns: -The sampled data.

  • -
-
-
-
-
- - -
- -
-
- -
-
- - - - - - - \ No newline at end of file diff --git a/Archive/doc_new/_build/html/katabatic.models.ganblr/ganblr_adapter.html b/Archive/doc_new/_build/html/katabatic.models.ganblr/ganblr_adapter.html deleted file mode 100644 index 80467f4..0000000 --- a/Archive/doc_new/_build/html/katabatic.models.ganblr/ganblr_adapter.html +++ /dev/null @@ -1,166 +0,0 @@ - - - - - - - - GanblrAdapter — Katabatic documentation - - - - - - - - - - - - - - - - - - - - -
-
-
- - -
- -
-

GanblrAdapter

-

A class to adapt the GANBLR model to the KatabaticModelSPI interface. This class provides an easy interface for loading, fitting, and generating data using the GANBLR model.

-
-

Attributes

-
    -
  • type (str): -Specifies whether the model type is ‘discrete’ or ‘continuous’. Default is ‘discrete’.

  • -
  • constraints (any): -Constraints for the model. Default is None.

  • -
  • batch_size (int): -Batch size for training the model. Default is None.

  • -
  • epochs (int): -Number of epochs for training the model. Default is None.

  • -
  • training_sample_size (int): -Size of the training sample. Initialized to 0.

  • -
-
-
-

Methods

-
-
load_model()

Initializes and returns an instance of the GANBLR model.

-
-
load_data(data_pathname)

Loads data from the specified pathname.

-
    -
  • Parameters: -- data_pathname: Pathname of the data to be loaded.

  • -
-
-
fit(X_train, y_train, k=0, epochs=10, batch_size=64)

Fits the GANBLR model using the provided training data.

-
    -
  • Parameters: -- X_train: Training features. -- y_train: Training labels. -- k: Number of parent nodes (default: 0). -- epochs: Number of epochs for training (default: 10). -- batch_size: Batch size for training (default: 64).

  • -
-
-
generate(size=None)

Generates data from the GANBLR model. If size is not specified, it defaults to the training sample size.

-
    -
  • Parameters: -- size: Number of data samples to generate. Defaults to None.

  • -
  • Returns: -Generated data samples.

  • -
-
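The adapter call pattern described above can be sketched in runnable form. This is an illustrative stand-in, not the library's code: `FakeGanblr` replaces the real GANBLR model so the flow works without the katabatic package, and only the documented method names (`load_model`, `fit`, `generate`) are taken from this page.

```python
# Hedged sketch of the GanblrAdapter flow; FakeGanblr is a stand-in model.
class FakeGanblr:
    def fit(self, X, y, k=0, batch_size=64, epochs=10):
        self._n = len(X)  # remember training size, as a real model would learn here

    def sample(self, size):
        return [[0] * 3 for _ in range(size)]  # dummy synthetic rows


class GanblrAdapterSketch:
    def __init__(self, model_type="discrete"):
        self.type = model_type
        self.training_sample_size = 0

    def load_model(self):
        self.model = FakeGanblr()
        return self.model

    def fit(self, X_train, y_train, k=0, epochs=10, batch_size=64):
        self.model.fit(X_train, y_train, k=k, batch_size=batch_size, epochs=epochs)
        self.training_sample_size = len(X_train)

    def generate(self, size=None):
        # per the docs: if size is not given, default to the training sample size
        return self.model.sample(size or self.training_sample_size)


adapter = GanblrAdapterSketch()
adapter.load_model()
adapter.fit([[1, 2], [3, 4]], [0, 1], epochs=1)
print(len(adapter.generate()))  # → 2
```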
-
-
-
- - -
- -
-
- -
-
- - - - - - - \ No newline at end of file diff --git a/Archive/doc_new/_build/html/katabatic.models.ganblr/kdb.html b/Archive/doc_new/_build/html/katabatic.models.ganblr/kdb.html deleted file mode 100644 index db1f114..0000000 --- a/Archive/doc_new/_build/html/katabatic.models.ganblr/kdb.html +++ /dev/null @@ -1,219 +0,0 @@ - - - - - - - - kDB Algorithm — Katabatic documentation - - - - - - - - - - - - - - - - - - - - -
-
-
- - -
- -
-

kDB Algorithm

-

The kDB algorithm constructs a dependency graph for the Bayesian network. It is implemented in the kdb.py file.

-
-

Methods

-
-
build_graph(X, y, k=2)

Constructs a k-dependency Bayesian network graph.

-
    -
  • Parameters: -- X: Input data (features). -- y: Labels. -- k: Number of parent nodes (default: 2).

  • -
  • Returns: -A list of graph edges.

  • -
-
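The parent-selection idea behind `build_graph` can be sketched as follows. This is an illustrative simplification, not the library's code: the real kDB algorithm ranks features by mutual information with the class and gives each feature its k strongest already-ranked predecessors as parents, while this sketch assumes the ranking is supplied and simply takes the first k predecessors in that order.

```python
# Simplified kDB-style edge construction (assumes features are already
# sorted by descending mutual information with the class label).
def build_graph_sketch(feature_order, k=2):
    edges = []
    added = []
    for feat in feature_order:
        # each new feature gets up to k parents among earlier (higher-ranked) features
        for parent in added[:k]:
            edges.append((parent, feat))
        added.append(feat)
    return edges

print(build_graph_sketch(["x1", "x2", "x3"], k=1))  # → [('x1', 'x2'), ('x1', 'x3')]
```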
-
get_cross_table(*cols, apply_wt=False)

Generates a cross table from input columns.

-
    -
  • Parameters: -- cols: Columns for cross table generation. -- apply_wt: Whether to apply weights (default: False).

  • -
  • Returns: -A tuple containing: -- The cross table as a NumPy array. -- A list of unique values for all columns. -- A list of unique values for individual columns.

  • -
-
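A minimal sketch of what a cross table is, for two columns only. The real `get_cross_table` accepts any number of columns and optional weights and returns NumPy arrays; this pure-Python version just counts co-occurrences.

```python
# Count co-occurrences of values from two parallel columns.
def cross_table(col_a, col_b):
    uniq_a = sorted(set(col_a))
    uniq_b = sorted(set(col_b))
    table = [[0] * len(uniq_b) for _ in uniq_a]
    for a, b in zip(col_a, col_b):
        table[uniq_a.index(a)][uniq_b.index(b)] += 1
    return table, uniq_a, uniq_b

table, rows, cols = cross_table(["x", "x", "y"], [0, 1, 0])
print(table)  # → [[1, 1], [1, 0]]
```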
-
_get_dependencies_without_y(variables, y_name, kdb_edges)

Finds the dependencies of each variable without considering y.

-
    -
  • Parameters: -- variables: List of variable names. -- y_name: Class name. -- kdb_edges: List of tuples representing edges (source, target).

  • -
  • Returns: -A dictionary of dependencies.

  • -
-
-
_add_uniform(X, weight=1.0)

Adds a uniform distribution to the data.

-
    -
  • Parameters: -- X: Input data, a NumPy array or pandas DataFrame. -- weight: Weight for the uniform distribution (default: 1.0).

  • -
  • Returns: -The modified data with uniform distribution.

  • -
-
-
_normalize_by_column(array)

Normalizes the array by columns.

-
    -
  • Parameters: -- array: Input array to normalize.

  • -
  • Returns: -The normalized array.

  • -
-
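Column-wise normalization as described for `_normalize_by_column` can be shown in a few lines. The library operates on NumPy arrays; this pure-Python sketch only illustrates the operation: each entry is divided by its column's sum, so every column sums to 1.

```python
# Divide each entry by its column sum (a column summing to 0 is left as 0.0).
def normalize_by_column(rows):
    n_cols = len(rows[0])
    col_sums = [sum(row[c] for row in rows) for c in range(n_cols)]
    return [
        [row[c] / col_sums[c] if col_sums[c] else 0.0 for c in range(n_cols)]
        for row in rows
    ]

print(normalize_by_column([[1, 2], [3, 2]]))  # → [[0.25, 0.5], [0.75, 0.5]]
```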
-
_smoothing(cct, d)

Probability smoothing for kDB.

-
    -
  • Returns: -A smoothed joint probability table.

  • -
-
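The purpose of smoothing is to keep zero counts from producing zero probabilities. The library's exact scheme for `_smoothing` may differ; the sketch below shows the common add-one (Laplace) variant: add 1 to every count, then normalize each column into a probability distribution.

```python
# Add-one smoothing of a count table, followed by column-wise normalization.
def laplace_smooth(counts):
    smoothed = [[c + 1 for c in row] for row in counts]
    col_sums = [sum(row[j] for row in smoothed) for j in range(len(counts[0]))]
    return [[row[j] / col_sums[j] for j in range(len(row))] for row in smoothed]

print(laplace_smooth([[0, 3], [2, 1]]))  # the zero count now has nonzero mass
```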
-
get_high_order_feature(X, col, evidence_cols, feature_uniques)

Encodes the high-order feature of X[col] given evidence from X[evidence_cols].

-
    -
  • Parameters: -- X: Input data. -- col: Column to encode. -- evidence_cols: List of evidence columns. -- feature_uniques: Unique values for features.

  • -
  • Returns: -An encoded high-order feature.

  • -
-
-
get_high_order_constraints(X, col, evidence_cols, feature_uniques)

Gets high-order constraints for the feature.

-
    -
  • Parameters: -- X: Input data. -- col: Column to encode. -- evidence_cols: List of evidence columns. -- feature_uniques: Unique values for features.

  • -
  • Returns: -High-order constraints.

  • -
-
-
-
-
-

Classes

-
-
KdbHighOrderFeatureEncoder

Encodes high-order features for the kDB model.

-
    -
  • Class Properties: -- feature_uniques_: Unique values for features. -- dependencies_: Dependencies for features. -- ohe_: OneHotEncoder instance.

  • -
  • Class Methods: -- fit(X, y, k=0): Fits the encoder to the data. -- transform(X, return_constraints=False, use_ohe=True): Transforms the input data. -- fit_transform(X, y, k=0, return_constraints=False): Fits the encoder and transforms the data.

  • -
-
-
-
-
- - -
- -
-
- -
-
- - - - - - - \ No newline at end of file diff --git a/Archive/doc_new/_build/html/katabatic.models.ganblr/utils.html b/Archive/doc_new/_build/html/katabatic.models.ganblr/utils.html deleted file mode 100644 index f311fad..0000000 --- a/Archive/doc_new/_build/html/katabatic.models.ganblr/utils.html +++ /dev/null @@ -1,180 +0,0 @@ - - - - - - - - Utility Functions — Katabatic documentation - - - - - - - - - - - - - - - - - - - - -
-
-
- - -
- -
-

Utility Functions

-

The utility functions are implemented in the utils.py file and provide various support functions for data preparation and model constraints.

-
-

Classes

-
-
softmax_weight

Constrains weight tensors by applying a softmax normalization, so the constrained weights are non-negative and sum to one.

-
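The math behind the constraint can be shown directly. The real `softmax_weight` is a Keras weight-constraint class; this sketch is only the softmax mapping itself, which turns an arbitrary weight vector into non-negative values summing to 1.

```python
import math

# Numerically stable softmax over a weight vector.
def softmax(ws):
    m = max(ws)  # subtract the max before exponentiating for stability
    exps = [math.exp(w - m) for w in ws]
    total = sum(exps)
    return [e / total for e in exps]

constrained = softmax([2.0, 1.0, 0.0])
print(round(sum(constrained), 6))  # → 1.0
```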
-
DataUtils

Provides data utilities for preparation before training.

-
-
-
-
-

Functions

-
-
elr_loss(KL_LOSS)

Defines a custom loss function.

-
    -
  • Parameters: -- KL_LOSS: The KL loss value.

  • -
  • Returns: -A custom loss function.

  • -
-
-
KL_loss(prob_fake)

Calculates the KL loss.

-
    -
  • Parameters: -- prob_fake: Probability of the fake data.

  • -
  • Returns: -The KL loss value.

  • -
-
-
get_lr(input_dim, output_dim, constraint=None, KL_LOSS=0)

Creates a logistic regression model.

-
    -
  • Parameters: -- input_dim: Dimension of the input features. -- output_dim: Dimension of the output. -- constraint: Optional constraint for the model (default: None). -- KL_LOSS: Optional KL loss value (default: 0).

  • -
  • Returns: -A logistic regression model.

  • -
-
-
sample(*arrays, n=None, frac=None, random_state=None)

Generates random samples from the given arrays.

-
    -
  • Parameters: -- arrays: Arrays to sample from. -- n: Number of samples to generate (default: None). -- frac: Fraction of samples to generate (default: None). -- random_state: Random seed for reproducibility (default: None).

  • -
  • Returns: -Random samples from the arrays.

  • -
-
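The key property of `sample` is that the same row indices are drawn from every array, so features and labels stay aligned. A hedged pure-Python sketch (the signature follows this page; the library's implementation details may differ):

```python
import random

# Draw the same random row indices from several parallel arrays.
def sample(*arrays, n=None, frac=None, random_state=None):
    length = len(arrays[0])
    if n is None:  # exactly one of n / frac is expected
        n = int(length * frac)
    rng = random.Random(random_state)
    idx = rng.sample(range(length), n)
    picked = tuple([arr[i] for i in idx] for arr in arrays)
    return picked if len(picked) > 1 else picked[0]

x, y = sample([10, 20, 30, 40], [0, 1, 0, 1], n=2, random_state=0)
print(x, y)  # rows stay aligned between x and y
```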
-
get_demo_data(name='adult')

Downloads a demo dataset from the internet.

-
    -
  • Parameters: -- name: Name of the dataset to download (default: ‘adult’).

  • -
  • Returns: -The downloaded dataset.

  • -
-
-
-
-
- - -
- -
-
- -
-
- - - - - - - \ No newline at end of file diff --git a/Archive/doc_new/_build/html/katabatic.models.ganblrpp/ganblrpp.html b/Archive/doc_new/_build/html/katabatic.models.ganblrpp/ganblrpp.html deleted file mode 100644 index 0898eab..0000000 --- a/Archive/doc_new/_build/html/katabatic.models.ganblrpp/ganblrpp.html +++ /dev/null @@ -1,252 +0,0 @@ - - - - - - - - DMMDiscritizer Class — Katabatic documentation - - - - - - - - - - - - - - - - - - - - -
-
-
- - -
- -
-

DMMDiscritizer Class

-

The DMMDiscritizer class performs discretization using a mixture model approach. It scales the data, applies Bayesian Gaussian Mixture Models, and transforms the data into discrete values. It also includes methods for inverse transformation of discretized data back to its original form.

-
-

Class Properties

-
    -
  • __dmm_params: -Parameters for the Bayesian Gaussian Mixture Model.

  • -
  • __scaler: -Min-Max scaler for data normalization.

  • -
  • __dmms: -List to store Bayesian Gaussian Mixture Models for each feature.

  • -
  • __arr_mu: -List to store means of the Gaussian distributions.

  • -
  • __arr_sigma: -List to store standard deviations of the Gaussian distributions.

  • -
  • _random_state: -Random seed for reproducibility.

  • -
-
-
-

Class Methods

-
-
__init__(random_state)

Initializes the DMMDiscritizer with parameters and scaler.

-
    -
  • Parameters: -- random_state: Seed for random number generation.

  • -
-
-
fit(x)

Fits the discretizer to the provided data.

-
    -
  • Parameters: -- x: 2D numpy array of shape (n_samples, n_features). Numeric data to be discretized.

  • -
  • Returns: -The fitted DMMDiscritizer instance.

  • -
-
-
transform(x) -> np.ndarray

Transforms the data using the fitted discretizer.

-
    -
  • Parameters: -- x: 2D numpy array of shape (n_samples, n_features). Numeric data to be transformed.

  • -
  • Returns: -2D numpy array of shape (n_samples, n_features). Discretized data.

  • -
-
-
fit_transform(x) -> np.ndarray

Fits the discretizer and transforms the data.

-
    -
  • Parameters: -- x: 2D numpy array of shape (n_samples, n_features). Numeric data to be discretized and transformed.

  • -
  • Returns: -2D numpy array of shape (n_samples, n_features). Discretized data.

  • -
-
-
inverse_transform(x, verbose=1) -> np.ndarray

Converts discretized data back to its original continuous form.

-
    -
  • Parameters: -- x: 2D numpy array of shape (n_samples, n_features). Discretized data. -- verbose: int, default=1. Controls verbosity of the operation.

  • -
  • Returns: -2D numpy array of shape (n_samples, n_features). Reverted data.

  • -
-
-
__sample_from_truncnorm(bins, mu, sigma, random_state=None)

Samples data from a truncated normal distribution.

-
    -
  • Parameters: -- bins: 1D numpy array of integer bins. -- mu: 1D numpy array of means for the normal distribution. -- sigma: 1D numpy array of standard deviations for the normal distribution. -- random_state: int or None. Seed for random number generation.

  • -
  • Returns: -1D numpy array of sampled results.

  • -
-
-
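The truncated-normal sampling step can be sketched with rejection sampling in plain NumPy. This is an illustrative assumption: the real __sample_from_truncnorm presumably uses an analytic sampler such as scipy.stats.truncnorm, and the [0, 1] bounds here reflect the Min-Max scaling mentioned above rather than anything stated in this page.

```python
import numpy as np

def truncnorm_sample(mu, sigma, low=0.0, high=1.0, random_state=None):
    # Rejection sampling: redraw any value that falls outside
    # [low, high] until every entry lies inside the interval.
    rng = np.random.default_rng(random_state)
    out = rng.normal(mu, sigma)
    mask = (out < low) | (out > high)
    while mask.any():
        out[mask] = rng.normal(mu[mask], sigma[mask])
        mask = (out < low) | (out > high)
    return out

mu = np.array([0.2, 0.5, 0.8])
sigma = np.array([0.1, 0.1, 0.1])
s = truncnorm_sample(mu, sigma, random_state=0)
assert np.all((s >= 0.0) & (s <= 1.0))
```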
-
-
-
-

GANBLRPP Class

-

The GANBLRPP class implements the GANBLR++ model, which combines generative adversarial networks with discretization techniques. It uses the DMMDiscritizer for data preprocessing and the GANBLR model for generating synthetic data.

-
-

Class Properties

-
    -
  • __discritizer: -Instance of DMMDiscritizer for data preprocessing.

  • -
  • __ganblr: -Instance of GANBLR for generative adversarial network functionality.

  • -
  • _numerical_columns: -List of indices for numerical columns in the dataset.

  • -
-
-
-

Class Methods

-
-
__init__(numerical_columns, random_state=None)

Initializes the GANBLR++ model with numerical column indices and optional random state.

-
    -
  • Parameters: -- numerical_columns: List of indices for numerical columns. -- random_state: int, RandomState instance or None. Seed for random number generation.

  • -
-
-
fit(x, y, k=0, batch_size=32, epochs=10, warmup_epochs=1, verbose=1)

Fits the GANBLR++ model to the provided data.

-
    -
  • Parameters: -- x: 2D numpy array of shape (n_samples, n_features). Dataset to fit the model. -- y: 1D numpy array of shape (n_samples,). Labels for the dataset. -- k: int, default=0. Parameter k for the GANBLR model. -- batch_size: int, default=32. Size of batches for training. -- epochs: int, default=10. Number of training epochs. -- warmup_epochs: int, default=1. Number of warmup epochs. -- verbose: int, default=1. Controls verbosity of the training process.

  • -
  • Returns: -The fitted GANBLRPP instance.

  • -
-
-
sample(size=None, verbose=1)

Generates synthetic data using the GANBLR++ model.

-
    -
  • Parameters: -- size: int or None. Size of the synthetic data to generate. -- verbose: int, default=1. Controls verbosity of the sampling process.

  • -
  • Returns: -2D numpy array of synthetic data.

  • -
-
-
evaluate(x, y, model='lr')

Evaluates the model using a TSTR (Training on Synthetic data, Testing on Real data) approach.

-
    -
  • Parameters: -- x: 2D numpy array of shape (n_samples, n_features). Test dataset. -- y: 1D numpy array of shape (n_samples,). Labels for the test dataset. -- model: str or object. Model to use for evaluation. Options are 'lr', 'mlp', 'rf', or a custom model with fit and predict methods.

  • -
  • Returns: -float. Accuracy score of the evaluation.

  • -
-
-
-
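The TSTR idea behind evaluate can be sketched independently of the model: fit a classifier on synthetic data, score it on real data. A tiny nearest-centroid classifier stands in for the 'lr'/'mlp'/'rf' options here; the data and classifier are illustrative, not the library's.

```python
import numpy as np

def tstr_accuracy(x_synth, y_synth, x_real, y_real):
    # Train on Synthetic, Test on Real: fit centroids on the
    # synthetic set, then measure accuracy on the real set.
    classes = np.unique(y_synth)
    centroids = np.stack([x_synth[y_synth == c].mean(axis=0) for c in classes])
    d = ((x_real[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
    y_pred = classes[d.argmin(axis=1)]
    return float((y_pred == y_real).mean())

# Two well-separated blobs; synthetic and real are drawn alike.
rng = np.random.default_rng(0)
x_s = np.vstack([rng.normal(0, .1, (20, 2)), rng.normal(5, .1, (20, 2))])
y_s = np.array([0] * 20 + [1] * 20)
x_r = np.vstack([rng.normal(0, .1, (10, 2)), rng.normal(5, .1, (10, 2))])
y_r = np.array([0] * 10 + [1] * 10)
assert tstr_accuracy(x_s, y_s, x_r, y_r) == 1.0
```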
-
- - -
- -
-
- -
-
- - - - - - - \ No newline at end of file diff --git a/Archive/doc_new/_build/html/katabatic.models.ganblrpp/ganblrpp_adapter.html b/Archive/doc_new/_build/html/katabatic.models.ganblrpp/ganblrpp_adapter.html deleted file mode 100644 index c62ba15..0000000 --- a/Archive/doc_new/_build/html/katabatic.models.ganblrpp/ganblrpp_adapter.html +++ /dev/null @@ -1,199 +0,0 @@ - - - - - - - - GanblrppAdapter Class — Katabatic documentation - - - - - - - - - - - - - - - - - - - - -
-
-
- - -
- -
-

GanblrppAdapter Class

-

The GanblrppAdapter class adapts the GANBLRPP model to the Katabatic Model SPI interface. It facilitates model initialization, data loading, model fitting, and data generation, bridging the gap between Katabatic’s requirements and the GANBLR++ model’s functionality.

-
-

Class Properties

-
    -
  • type (str): -Type of model to use ("discrete" by default).

  • -
  • model: -Instance of GANBLRPP for model operations.

  • -
  • constraints: -Constraints for the model (not used in this class).

  • -
  • batch_size (int): -Size of batches for training.

  • -
  • epochs (int): -Number of epochs for training.

  • -
  • training_sample_size (int): -Size of the training sample.

  • -
  • numerical_columns (list): -List of indices for numerical columns in the dataset.

  • -
  • random_state: -Seed for random number generation.

  • -
-
-
-

Class Methods

-
-
__init__(model_type="discrete", numerical_columns=None, random_state=None)

Initializes the GanblrppAdapter with model type, numerical columns, and optional random state.

-
    -
  • Parameters: -- model_type: str, default="discrete". Type of model to use. -- numerical_columns: list, optional. List of indices for numerical columns. -- random_state: int, optional. Seed for random number generation.

  • -
-
-
load_model() -> GANBLRPP

Initializes and loads the GANBLRPP model.

-
    -
  • Returns: -The initialized GANBLRPP model.

  • -
  • Raises: -- ValueError: If numerical_columns is not provided. -- RuntimeError: If the model initialization fails.

  • -
-
-
load_data(data_pathname) -> pd.DataFrame

Loads data from a CSV file.

-
    -
  • Parameters: -- data_pathname: str. Path to the CSV file containing the data.

  • -
  • Returns: -Pandas DataFrame containing the loaded data.

  • -
  • Raises: -- Exception: If there is an error loading the data.

  • -
-
-
fit(X_train, y_train, k=0, epochs=10, batch_size=64)

Fits the GANBLRPP model to the training data.

-
    -
  • Parameters: -- X_train: pd.DataFrame or np.ndarray. Training data features. -- y_train: pd.Series or np.ndarray. Training data labels. -- k: int, default=0. Parameter k for the GANBLRPP model. -- epochs: int, default=10. Number of training epochs. -- batch_size: int, default=64. Size of batches for training.

  • -
  • Returns: -None

  • -
  • Raises: -- RuntimeError: If the model is not initialized. -- Exception: If there is an error during model training.

  • -
-
-
generate(size=None) -> pd.DataFrame

Generates synthetic data using the GANBLRPP model.

-
    -
  • Parameters: -- size: int or None. Size of the synthetic data to generate. Defaults to the size of the training data.

  • -
  • Returns: -Pandas DataFrame containing the generated data.

  • -
  • Raises: -- RuntimeError: If the model is not initialized. -- Exception: If there is an error during data generation.

  • -
-
-
-
-
- - -
- -
-
- -
-
- - - - - - - \ No newline at end of file diff --git a/Archive/doc_new/_build/html/katabatic.models.ganblrpp/kdb.html b/Archive/doc_new/_build/html/katabatic.models.ganblrpp/kdb.html deleted file mode 100644 index 5cc2504..0000000 --- a/Archive/doc_new/_build/html/katabatic.models.ganblrpp/kdb.html +++ /dev/null @@ -1,270 +0,0 @@ - - - - - - - - KdbHighOrderFeatureEncoder Class — Katabatic documentation - - - - - - - - - - - - - - - - - - - - -
-
-
- - -
- -
-

KdbHighOrderFeatureEncoder Class

-

The KdbHighOrderFeatureEncoder class encodes high-order features using the kDB algorithm. It retrieves dependencies between features, transforms data into high-order features, and optionally returns constraints information.

-

Defined in kdb.py

-
-

Class Properties

-
    -
  • dependencies_ (dict): -A dictionary storing the dependencies between features.

  • -
  • constraints_ (np.ndarray): -Constraints information for high-order features.

  • -
  • have_value_idxs_ (list): -Indices indicating the presence of values for constraints.

  • -
  • feature_uniques_ (list): -List of unique values for each feature.

  • -
  • high_order_feature_uniques_ (list): -List of unique values for high-order features.

  • -
  • edges_ (list): -List of edges representing the kDB graph.

  • -
  • ohe_ (OneHotEncoder): -OneHotEncoder instance for encoding features.

  • -
  • k (int): -Value of k for the kDB model.

  • -
-
-
-

Methods

-
-
__init__()

Initializes the KdbHighOrderFeatureEncoder with default properties.

-
-
fit(X, y, k=0)

Fits the KdbHighOrderFeatureEncoder to the data and labels.

-
    -
  • Parameters: -- X: array_like of shape (n_samples, n_features). Data to fit in the encoder. -- y: array_like of shape (n_samples,). Labels to fit in the encoder. -- k: int, default=0. k value for the kDB model. k=0 leads to a OneHotEncoder.

  • -
  • Returns: -- self: The fitted encoder.

  • -
-
-
transform(X, return_constraints=False, use_ohe=True)

Transforms data to high-order features.

-
    -
  • Parameters: -- X: array_like of shape (n_samples, n_features). Data to transform. -- return_constraints: bool, default=False. Whether to return constraint information. -- use_ohe: bool, default=True. Whether to apply one-hot encoding.

  • -
  • Returns: -- X_out: ndarray of shape (n_samples, n_encoded_features). Transformed input. -- If return_constraints=True, also returns: -- constraints: np.ndarray of constraints. -- have_value_idxs: List of boolean arrays indicating presence of values.

  • -
-
-
fit_transform(X, y, k=0, return_constraints=False)

Fits the encoder and then transforms the data.

-
    -
  • Parameters: -- X: array_like of shape (n_samples, n_features). Data to fit and transform. -- y: array_like of shape (n_samples,). Labels to fit and transform. -- k: int, default=0. k value for the kDB model. -- return_constraints: bool, default=False. Whether to return constraint information.

  • -
  • Returns: -- X_out: ndarray of shape (n_samples, n_encoded_features). Transformed input. -- If return_constraints=True, also returns: -- constraints: np.ndarray of constraints. -- have_value_idxs: List of boolean arrays indicating presence of values.

  • -
-
-
-
-
-

Helper Functions

-
-
build_graph(X, y, k=2)

Constructs a kDB graph based on mutual information.

-
    -
  • Parameters: -- X: array_like of shape (n_samples, n_features). Features. -- y: array_like of shape (n_samples,). Labels. -- k: int, default=2. Maximum number of parent nodes to consider.

  • -
  • Returns: -- edges: List of tuples representing edges in the graph.

  • -
-
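build_graph ranks candidate parent features by mutual information. As a sketch, plain (unconditional) mutual information between two discrete columns can be estimated from counts; note the kDB algorithm itself uses mutual information conditioned on the class y, which this simplified version omits.

```python
import numpy as np

def mutual_information(a, b):
    # I(A;B) = sum_{i,j} p(i,j) * log( p(i,j) / (p(i) * p(j)) )
    # estimated from the joint counts of two discrete 1D arrays.
    vals_a, a_idx = np.unique(a, return_inverse=True)
    vals_b, b_idx = np.unique(b, return_inverse=True)
    joint = np.zeros((len(vals_a), len(vals_b)))
    np.add.at(joint, (a_idx, b_idx), 1)
    joint /= joint.sum()
    pa = joint.sum(axis=1, keepdims=True)
    pb = joint.sum(axis=0, keepdims=True)
    nz = joint > 0  # ignore zero cells (0 * log 0 := 0)
    return float((joint[nz] * np.log(joint[nz] / (pa @ pb)[nz])).sum())

a = np.array([0, 0, 1, 1])
# A copy of a column is maximally informative; an independent one is not.
assert mutual_information(a, a) > mutual_information(a, np.array([0, 1, 0, 1]))
```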
-
get_cross_table(*cols, apply_wt=False)

Computes a cross-tabulation table for the given columns.

-
    -
  • Parameters: -- *cols: 1D numpy arrays of integers. Columns to cross-tabulate. -- apply_wt: bool, default=False. Whether to apply weights.

  • -
  • Returns: -- uniq_vals_all_cols: Tuple of 1D numpy arrays of unique values for each column. -- xt: NumPy array of cross-tabulation results.

  • -
-
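A two-column sketch of what get_cross_table computes, in plain NumPy (the real function is variadic and supports weights, which are omitted here):

```python
import numpy as np

def cross_table(col_a, col_b):
    # Counts of each (value_a, value_b) combination, plus the
    # unique values labelling the rows and columns.
    ua, ia = np.unique(col_a, return_inverse=True)
    ub, ib = np.unique(col_b, return_inverse=True)
    xt = np.zeros((len(ua), len(ub)), dtype=int)
    np.add.at(xt, (ia, ib), 1)
    return (ua, ub), xt

(ua, ub), xt = cross_table(np.array([0, 0, 1]), np.array([2, 3, 3]))
assert xt.tolist() == [[1, 1], [0, 1]]  # rows: values of a; cols: values of b
```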
-
_get_dependencies_without_y(variables, y_name, kdb_edges)

Retrieves dependencies of each variable excluding the target variable y.

-
    -
  • Parameters: -- variables: List of variable names. -- y_name: Name of the target variable. -- kdb_edges: List of tuples representing edges in the kDB graph.

  • -
  • Returns: -- dependencies: Dictionary of dependencies for each variable.

  • -
-
-
_add_uniform(array, noise=1e-5)

Adds uniform probability to avoid zero counts in cross-tabulation.

-
    -
  • Parameters: -- array: NumPy array of counts. -- noise: float, default=1e-5. Amount of noise to add.

  • -
  • Returns: -- result: NumPy array with added uniform probability.

  • -
-
-
_normalize_by_column(array)

Normalizes an array by columns.

-
    -
  • Parameters: -- array: NumPy array to normalize.

  • -
  • Returns: -- normalized_array: NumPy array normalized by columns.

  • -
-
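The interplay of _add_uniform and _normalize_by_column can be shown with a small sketch. The exact noise formula used by the library is not given here, so this version (spreading a small fraction of the total mass across all cells) is an assumption that only illustrates the idea: smoothing first means no column ever divides by zero.

```python
import numpy as np

def add_uniform(array, noise=1e-5):
    # Spread a tiny uniform mass over every cell so later
    # normalization never hits an all-zero column. (Hypothetical
    # formula; the library's version may differ.)
    return array + noise * array.sum() / array.size

def normalize_by_column(array):
    # Scale each column to sum to 1, yielding a conditional
    # probability table per column.
    return array / array.sum(axis=0, keepdims=True)

cct = np.array([[2.0, 0.0], [2.0, 0.0]])
jpt = normalize_by_column(add_uniform(cct))
assert np.allclose(jpt.sum(axis=0), 1.0)  # every column normalized
assert np.all(jpt > 0)                    # zero counts smoothed away
```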
-
_smoothing(cct, d)

Applies smoothing to a cross-count table to handle zero counts.

-
    -
  • Parameters: -- cct: NumPy array of cross-count table. -- d: int. Dimension of the cross-count table.

  • -
  • Returns: -- jpt: NumPy array of smoothed joint probability table.

  • -
-
-
get_high_order_feature(X, col, evidence_cols, feature_uniques)

Encodes high-order features given the evidence columns.

-
    -
  • Parameters: -- X: array_like of shape (n_samples, n_features). Data. -- col: int. Column index of the feature to encode. -- evidence_cols: List of indices for evidence columns. -- feature_uniques: List of unique values for each feature.

  • -
  • Returns: -- high_order_feature: NumPy array of high-order features.

  • -
-
-
get_high_order_constraints(X, col, evidence_cols, feature_uniques)

Finds constraints for high-order features.

-
    -
  • Parameters: -- X: array_like of shape (n_samples, n_features). Data. -- col: int. Column index of the feature. -- evidence_cols: List of indices for evidence columns. -- feature_uniques: List of unique values for each feature.

  • -
  • Returns: -- have_value: NumPy array of boolean values indicating presence of values. -- high_order_constraints: NumPy array of constraint information.

  • -
-
-
-
-
- - -
- -
-
- -
-
- - - - - - - \ No newline at end of file diff --git a/Archive/doc_new/_build/html/katabatic.models.ganblrpp/utils.html b/Archive/doc_new/_build/html/katabatic.models.ganblrpp/utils.html deleted file mode 100644 index 510332e..0000000 --- a/Archive/doc_new/_build/html/katabatic.models.ganblrpp/utils.html +++ /dev/null @@ -1,281 +0,0 @@ - - - - - - - - Utility Classes — Katabatic documentation - - - - - - - - - - - - - - - - - - - - -
-
-
- - -
- -
-

Utility Classes

-

The utility classes are implemented in the utility.py file and provide various support functionalities for model constraints and data preparation.

-
-

Classes

-
-
softmax_weight

Constrains weight tensors to be under softmax.

-
-
DataUtils

Provides data utilities for preparation before training.

-
-
-
-
-

Functions

-
-
elr_loss(KL_LOSS)

Defines a custom loss function.

-
    -
  • Parameters: -- KL_LOSS: The KL loss value.

  • -
  • Returns: -A custom loss function.

  • -
-
-
KL_loss(prob_fake)

Calculates the KL loss.

-
    -
  • Parameters: -- prob_fake: Probability of the fake data.

  • -
  • Returns: -The KL loss value.

  • -
-
-
get_lr(input_dim, output_dim, constraint=None, KL_LOSS=0)

Creates a logistic regression model.

-
    -
  • Parameters: -- input_dim: Dimension of the input features. -- output_dim: Dimension of the output. -- constraint: Optional constraint for the model (default: None). -- KL_LOSS: Optional KL loss value (default: 0).

  • -
  • Returns: -A logistic regression model.

  • -
-
-
sample(*arrays, n=None, frac=None, random_state=None)

Generates random samples from the given arrays.

-
    -
  • Parameters: -- arrays: Arrays to sample from. -- n: Number of samples to generate (default: None). -- frac: Fraction of samples to generate (default: None). -- random_state: Random seed for reproducibility (default: None).

  • -
  • Returns: -Random samples from the arrays.

  • -
-
-
get_demo_data(name='adult')

Downloads a demo dataset from the internet.

-
    -
  • Parameters: -- name: Name of the dataset to download (default: 'adult').

  • -
  • Returns: -The downloaded dataset.

  • -
-
-
-
-
-

Classes

-
-
softmax_weight

Constrains weight tensors to be under softmax.

-
    -
  • Defined in: softmax_weight.py

  • -
  • Properties: -- feature_idxs (list): List of tuples indicating the start and end indices for each feature in the weight tensor.

  • -
  • Methods: -- __init__(feature_uniques) -Initializes the constraint with unique feature values.

    -
    -
      -
    • Parameters: -- feature_uniques: np.ndarray or list of int. Unique values for each feature used to compute indices for softmax constraint.

    • -
    • Returns: -None.

    • -
    -
    -
      -
    • __call__(w) -Applies the softmax constraint to the weight tensor.

      -
        -
      • Parameters: -- w: tf.Tensor. Weight tensor to which the constraint is applied.

      • -
      • Returns: -tf.Tensor: The constrained weight tensor.

      • -
      -
    • -
    • get_config() -Returns the configuration of the constraint.

      -
        -
      • Returns: -dict: Configuration dictionary containing feature_idxs.

      • -
      -
    • -
    -
  • -
-
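The softmax_weight constraint above operates on tf.Tensor weights inside Keras; a NumPy sketch of the same idea is shown below. The per-feature index blocks stand in for feature_idxs, which the class presumably derives from cumulative sums of feature_uniques (an assumption, not stated on this page).

```python
import numpy as np

def block_softmax(w, feature_idxs):
    # Apply a softmax to each (start, end) slice of rows so the
    # weights of every feature block form a probability distribution
    # in every output column.
    out = np.empty_like(w)
    for start, end in feature_idxs:
        block = w[start:end]
        e = np.exp(block - block.max(axis=0, keepdims=True))
        out[start:end] = e / e.sum(axis=0, keepdims=True)
    return out

rng = np.random.default_rng(0)
w = rng.normal(size=(5, 3))
constrained = block_softmax(w, [(0, 2), (2, 5)])
assert np.allclose(constrained[0:2].sum(axis=0), 1.0)
assert np.allclose(constrained[2:5].sum(axis=0), 1.0)
```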
-
DataUtils

Provides utility functions for data preparation before training.

-
    -
  • Defined in: data_utils.py

  • -
  • Properties: -- x (np.ndarray): The feature data used for training. -- y (np.ndarray): The target labels associated with the feature data. -- data_size (int): Number of samples in the dataset. -- num_features (int): Number of features in the dataset. -- num_classes (int): Number of unique classes in the target labels. -- class_counts (np.ndarray): Counts of each class in the target labels. -- feature_uniques (list): List of unique values for each feature. -- constraint_positions (np.ndarray or None): Positions of constraints for high-order features. -- _kdbe (KdbHighOrderFeatureEncoder or None): Instance of the KdbHighOrderFeatureEncoder for feature encoding. -- __kdbe_x (np.ndarray or None): Transformed feature data after applying kDB encoding.

  • -
  • Methods: -- __init__(x, y) -Initializes the DataUtils with the provided feature data and target labels.

    -
    -
      -
    • Parameters: -- x: np.ndarray. Feature data. -- y: np.ndarray. Target labels.

    • -
    • Returns: -None.

    • -
    -
    -
      -
    • get_categories(idxs=None) -Retrieves categories for encoded features.

      -
        -
      • Parameters: -- idxs: list of int or None. Indices of features to retrieve categories for. If None, retrieves categories for all features.

      • -
      • Returns: -list: List of categories for the specified features.

      • -
      -
    • -
    • get_kdbe_x(k=0, dense_format=True) -Transforms feature data into high-order features using kDB encoding.

      -
        -
      • Parameters: -- k: int, default=0. k value for the kDB model. -- dense_format: bool, default=True. Whether to return the transformed data in dense format.

      • -
      • Returns: -np.ndarray: Transformed feature data. -If dense_format=True, returns a dense NumPy array. -Also updates constraint_positions with the positions of constraints.

      • -
      -
    • -
    • clear() -Clears the kDB encoder and transformed data.

      -
        -
      • Returns: -None.

      • -
      -
    • -
    -
  • -
-
-
-
-
- - -
- -
-
- -
-
- - - - - - - \ No newline at end of file diff --git a/Archive/doc_new/_build/html/katabatic.models.tablegan/tablegan.html b/Archive/doc_new/_build/html/katabatic.models.tablegan/tablegan.html deleted file mode 100644 index e93e88e..0000000 --- a/Archive/doc_new/_build/html/katabatic.models.tablegan/tablegan.html +++ /dev/null @@ -1,270 +0,0 @@ - - - - - - - - TableGAN Class — Katabatic documentation - - - - - - - - - - - - - - - - - - - - -
-
-
- - -
- -
-

TableGAN Class

-

This class implements a Generative Adversarial Network (GAN) for tabular data, specifically designed to handle structured data with a classifier that enforces feature-label relationships.

-
-
-class TableGAN
-
-
Parameters:
-
    -
  • input_dim – Integer, the dimension of the input features.

  • -
  • label_dim – Integer, the dimension of the output labels.

  • -
  • z_dim – Integer, the dimension of the noise vector used for generating synthetic data (default: 100).

  • -
  • delta_mean – Float, the threshold for the mean feature differences during generator training (default: 0.1).

  • -
  • delta_sd – Float, the threshold for the standard deviation differences during generator training (default: 0.1).

  • -
-
-
-

Example:

-
import tensorflow as tf
-from tensorflow.keras import layers
-import numpy as np
-
-# Create TableGAN model instance
-gan = TableGAN(input_dim=32, label_dim=10)
-
-# Train on data
-gan.fit(x_train, y_train, epochs=100)
-
-# Generate synthetic samples
-generated_data, generated_labels = gan.sample(n_samples=1000)
-
-
-

Methods

-
-
-_build_generator(self)
-

Constructs the generator model, which produces synthetic data from random noise.

-
-
Returns:
-

A Keras Sequential model representing the generator.

-
-
-
- -
-
-_build_discriminator(self)
-

Constructs the discriminator model, which evaluates the authenticity of the data and extracts features.

-
-
Returns:
-

A Keras Model that outputs the real/fake classification and extracted features.

-
-
-
- -
-
-_build_classifier(self)
-

Constructs the classifier model, which predicts labels for the generated data.

-
-
Returns:
-

A Keras Sequential model that classifies the input data.

-
-
-
- -
-
-wasserstein_loss(self, y_true, y_pred)
-

Implements the Wasserstein loss function for training the discriminator.

-
-
Parameters:
-
    -
  • y_true – Tensor, the true labels (real/fake).

  • -
  • y_pred – Tensor, the predicted labels.

  • -
-
-
Returns:
-

Tensor, the computed Wasserstein loss.

-
-
-
- -
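The sign convention of wasserstein_loss is not shown on this page; the common Keras-style formulation, sketched here in NumPy as an assumption, labels real samples +1 and fake samples -1 so the loss is the signed mean critic score.

```python
import numpy as np

def wasserstein_loss(y_true, y_pred):
    # Keras-style Wasserstein loss: with labels +1 (real) / -1
    # (fake), minimizing this widens the critic's score gap.
    return float(np.mean(y_true * y_pred))

# Critic scores 2.0 (labelled real) and 3.0 (labelled fake):
assert wasserstein_loss(np.array([1.0, -1.0]), np.array([2.0, 3.0])) == -0.5
```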
-
-gradient_penalty(self, real, fake)
-

Computes the gradient penalty term for enforcing the Lipschitz constraint in Wasserstein GANs.

-
-
Parameters:
-
    -
  • real – Tensor, real data samples.

  • -
  • fake – Tensor, fake data samples generated by the generator.

  • -
-
-
Returns:
-

Tensor, the computed gradient penalty.

-
-
-
- -
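The WGAN-GP penalty pushes the per-sample gradient norm of the critic (taken at random interpolates of real and fake data) toward 1. The real method presumably obtains those gradients with tf.GradientTape; the sketch below assumes the gradients are already computed and only illustrates the penalty formula, with a hypothetical weight parameter.

```python
import numpy as np

def gradient_penalty_from_grads(grads, weight=10.0):
    # grads: (batch, dim) array of critic gradients at the
    # interpolated points. Penalize deviation of each sample's
    # gradient L2 norm from 1 (the Lipschitz-1 target).
    norms = np.sqrt((grads ** 2).sum(axis=1))
    return float(weight * np.mean((norms - 1.0) ** 2))

# Norms 1 and 2: penalties 0 and (2 - 1)^2, averaged.
g = np.array([[1.0, 0.0], [0.0, 2.0]])
assert gradient_penalty_from_grads(g, weight=1.0) == 0.5
```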
-
-train_step(self, real_data, real_labels)
-

Performs a single training step for the generator, discriminator, and classifier, using real and synthetic data.

-
-
Parameters:
-
    -
  • real_data – Tensor, the real data samples.

  • -
  • real_labels – Tensor, the true labels for the real data samples.

  • -
-
-
Returns:
-

Tuple of three values representing the discriminator, generator, and classifier losses, respectively.

-
-
-
- -
-
-fit(self, x, y, batch_size=64, epochs=100, verbose=1)
-

Trains the GAN model on the provided data.

-
-
Parameters:
-
    -
  • x – Tensor, input data for training.

  • -
  • y – Tensor, labels for the training data.

  • -
  • batch_size – Integer, the size of each training batch (default: 64).

  • -
  • epochs – Integer, the number of epochs to train the model (default: 100).

  • -
  • verbose – Integer, the verbosity mode for logging during training (default: 1).

  • -
-
-
Returns:
-

The fitted TableGAN model.

-
-
-
- -
-
-sample(self, n_samples)
-

Generates synthetic data and corresponding labels using the trained generator and classifier.

-
-
Parameters:
-

n_samples – Integer, the number of synthetic data samples to generate.

-
-
Returns:
-

Tuple of numpy arrays, containing the generated data and labels.

-
-
-
- -
- -
- - -
- -
-
- -
-
- - - - - - - \ No newline at end of file diff --git a/Archive/doc_new/_build/html/katabatic.models.tablegan/tablegan_adapter.html b/Archive/doc_new/_build/html/katabatic.models.tablegan/tablegan_adapter.html deleted file mode 100644 index 8c5e4c5..0000000 --- a/Archive/doc_new/_build/html/katabatic.models.tablegan/tablegan_adapter.html +++ /dev/null @@ -1,179 +0,0 @@ - - - - - - - - TableGANAdapter — Katabatic documentation - - - - - - - - - - - - - - - - - - - - -
-
-
- - -
- -
-

TableGANAdapter

-

The TableGANAdapter class serves as an adapter for interfacing with the TableGAN model. It extends the KatabaticModelSPI and allows for loading, fitting, and generating synthetic data using the TableGAN model. This adapter includes functionality to handle privacy settings, data preprocessing, and model training.

-
-

Class Structure

-
-
-

Attributes

-
    -
  • type (str): Defines the type of data handled by the adapter (default: 'continuous').

  • -
  • privacy_setting (str): Sets the privacy level of the model ('low', 'medium', or 'high').

  • -
  • constraints (NoneType): Currently not in use but reserved for future constraints settings.

  • -
  • batch_size (int): Defines the batch size for training (default: 64).

  • -
  • epochs (int): Number of training epochs (default: 100).

  • -
  • model (TableGAN): Instance of the TableGAN model.

  • -
  • scaler (StandardScaler): Scaler used for preprocessing continuous data.

  • -
  • label_encoder (LabelEncoder): Encoder used for processing categorical labels.

  • -
  • input_dim (int): Input dimensionality of the data.

  • -
  • label_dim (int): Label dimensionality of the data.

  • -
  • training_sample_size (int): Number of samples used during training.

  • -
-
-
-

Methods

-
    -
  • __init__(self, type='continuous', privacy_setting='low')

    -

    Initializes the TableGANAdapter class, setting parameters such as data type, privacy level, and default model parameters.

    -
  • -
  • load_model(self)

    -

    Loads and initializes the TableGAN model based on the input and label dimensions. Adjusts privacy parameters (delta_mean, delta_sd) according to the specified privacy_setting.

    -
  • -
  • load_data(self, data_pathname)

    -

    Loads training data from the specified data_pathname. Handles CSV files and returns the data as a Pandas DataFrame.

    -
  • -
  • fit(self, X_train, y_train, epochs=None, batch_size=None)

    -

    Trains the TableGAN model on the provided training data (X_train, y_train). Preprocesses the data, sets input and label dimensions, and optionally overrides the number of epochs and batch size.

    -
  • -
  • generate(self, size=None)

    -

    Generates synthetic data using the trained TableGAN model. If the model is not trained, raises an error. The size of generated data defaults to the training sample size unless otherwise specified.

    -
  • -
-
-
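The mapping from privacy_setting to the TableGAN thresholds (delta_mean, delta_sd) performed by load_model can be sketched as a lookup table. The numeric values below are hypothetical placeholders, not the adapter's actual parameters:

```python
# Hypothetical privacy_setting -> threshold mapping; the adapter's
# real values are not documented on this page.
PRIVACY_PARAMS = {
    "low":    {"delta_mean": 0.0, "delta_sd": 0.0},
    "medium": {"delta_mean": 0.1, "delta_sd": 0.1},
    "high":   {"delta_mean": 0.2, "delta_sd": 0.2},
}

def resolve_privacy(privacy_setting):
    # Look up the thresholds, rejecting unknown settings early.
    if privacy_setting not in PRIVACY_PARAMS:
        raise ValueError(f"unknown privacy_setting: {privacy_setting!r}")
    return PRIVACY_PARAMS[privacy_setting]

assert resolve_privacy("high")["delta_mean"] > resolve_privacy("low")["delta_mean"]
```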
-

Usage Example

-

Below is a usage example that shows how to initialize and use the TableGANAdapter class for training and generating synthetic data:

-
from katabatic.models import TableGANAdapter
-
-# Initialize the adapter with medium privacy
-adapter = TableGANAdapter(type='continuous', privacy_setting='medium')
-
-# Load data
-data = adapter.load_data('training_data.csv')
-
-# Preprocess and fit the model
-X_train, y_train = data.iloc[:, :-1], data.iloc[:, -1]
-adapter.fit(X_train, y_train)
-
-# Generate synthetic data
-synthetic_data = adapter.generate(size=1000)
-
-
-
-
- - -
- -
-
- -
-
- - - - - - - \ No newline at end of file diff --git a/Archive/doc_new/_build/html/katabatic.models.tablegan/tablegan_utils.html b/Archive/doc_new/_build/html/katabatic.models.tablegan/tablegan_utils.html deleted file mode 100644 index 92dbdab..0000000 --- a/Archive/doc_new/_build/html/katabatic.models.tablegan/tablegan_utils.html +++ /dev/null @@ -1,187 +0,0 @@ - - - - - - - - Utility Functions — Katabatic documentation - - - - - - - - - - - - - - - - - - - - -
-
-
- - -
- -
-

Utility Functions

-

The utility functions are implemented in the utils.py file and provide various support functions for data preparation, model constraints, and logistic regression operations.

-
-

Classes

-
    -
  • softmax_weight

    -

    A class that constrains weight tensors to be normalized using the softmax function, ensuring that the values sum up to 1.

    -
  • -
  • DataUtils

    -

    A utility class that provides helper functions for preparing data before training machine learning models.

    -
  • -
-
-
-

Functions

-
    -
  • elr_loss(KL_LOSS)

    -

    Defines a custom loss function that integrates the KL loss into the overall loss calculation for model training.

    -
      -
    • Parameters: -- KL_LOSS (float): The Kullback-Leibler (KL) loss value that is integrated into the loss function.

    • -
    • Returns: -A custom loss function that can be used in training models.

    • -
    -
  • -
  • KL_loss(prob_fake)

    -

    Calculates the Kullback-Leibler (KL) divergence loss, which measures how one probability distribution diverges from a second, expected distribution.

    -
      -
    • Parameters: -- prob_fake (numpy.array): The probability distribution of the fake (generated) data.

    • -
    • Returns: -The calculated KL loss value.

    • -
    -
  • -
  • get_lr(input_dim, output_dim, constraint=None, KL_LOSS=0)

    -

    Creates and returns a logistic regression model, with optional constraints and KL loss integration.

    -
      -
    • Parameters: -- input_dim (int): The dimension of the input features. -- output_dim (int): The dimension of the output labels. -- constraint (callable, optional): A constraint function applied to the logistic regression model weights (default: None). -- KL_LOSS (float, optional): The Kullback-Leibler loss value to be incorporated into the model (default: 0).

    • -
    • Returns: -A logistic regression model object.

    • -
    -
  • -
  • sample(*arrays, n=None, frac=None, random_state=None)

    -

    Generates random samples from the provided arrays, either by specifying the number of samples or the fraction of the arrays to sample.

    -
      -
    • Parameters: -- arrays (list): The input arrays to sample from. -- n (int, optional): The number of samples to generate (default: None). -- frac (float, optional): The fraction of the arrays to sample (default: None). -- random_state (int, optional): A random seed to ensure reproducibility (default: None).

    • -
    • Returns: -Random samples from the provided arrays.

    • -
    -
  • -
  • get_demo_data(name='adult')

    -

    Downloads and returns a demo dataset from the internet. By default, it downloads the 'adult' dataset, but other dataset names can be specified.

    -
      -
    • Parameters: -- name (str, optional): The name of the dataset to download (default: 'adult').

    • -
    • Returns: -The downloaded dataset as a Pandas DataFrame.

    • -
    -
  • -
-
-
- - -
- -
-
- -
-
- - - - - - - \ No newline at end of file diff --git a/Archive/doc_new/_build/html/modules.html b/Archive/doc_new/_build/html/modules.html deleted file mode 100644 index b343f76..0000000 --- a/Archive/doc_new/_build/html/modules.html +++ /dev/null @@ -1,129 +0,0 @@ - - - - - - - - Katabatic — Katabatic documentation - - - - - - - - - - - - - - - - - - - - -
-
-
- - - - -
-
- -
-
- - - - - - - \ No newline at end of file diff --git a/Archive/doc_new/_build/html/objects.inv b/Archive/doc_new/_build/html/objects.inv deleted file mode 100644 index 6cd198b..0000000 --- a/Archive/doc_new/_build/html/objects.inv +++ /dev/null @@ -1,5 +0,0 @@ -# Sphinx inventory version 2 -# Project: Katabatic -# Version: -# The remainder of this file is compressed using zlib. -xڭQo0+,m{B=-YmڲU[  c,ce~  tckh$QXв$$F8  L-JܹA\yKQ kb&-؅HXKjף& A!4I9_N,i\1rx:@# /NKJnX5ylt#Q4hM6CsP돫Y{t@&FDBX'{'d<1)ᾝoq hFY\7)kU>rMnJbKIN|;B@HU^m`3SެEkPf@")47i:+_8*vn"9"ݝ-ErL%Q-bXa{Y'w@5[z9lm͜tXQߞVUz?y='[`U*m˛hMp{YsJ:NF;g=?}ݩ6L \ No newline at end of file diff --git a/Archive/doc_new/_build/html/search.html b/Archive/doc_new/_build/html/search.html deleted file mode 100644 index 493a91e..0000000 --- a/Archive/doc_new/_build/html/search.html +++ /dev/null @@ -1,121 +0,0 @@ - - - - - - - Search — Katabatic documentation - - - - - - - - - - - - - - - - - - - - - - - - - -
-
-
- - -
- -

Search

- - - - -

- Searching for multiple words only shows matches that contain - all words. -

- - -
- - - -
- - -
- - -
- -
-
- -
-
- - - - - - - \ No newline at end of file diff --git a/Archive/doc_new/_build/html/searchindex.js b/Archive/doc_new/_build/html/searchindex.js deleted file mode 100644 index f99f2b8..0000000 --- a/Archive/doc_new/_build/html/searchindex.js +++ /dev/null @@ -1 +0,0 @@ -Search.setIndex({"alltitles": {"Attributes": [[8, "attributes"], [16, "attributes"]], "Class Methods": [[7, "class-methods"], [11, "class-methods"], [11, "id2"], [12, "class-methods"]], "Class Properties": [[7, "class-properties"], [11, "class-properties"], [11, "id1"], [12, "class-properties"], [13, "class-properties"]], "Class Structure": [[16, "class-structure"]], "Classes": [[9, "classes"], [10, "classes"], [14, "classes"], [14, "id1"], [17, "classes"]], "Contents:": [[6, null]], "DMMDiscritizer Class": [[11, null]], "Functions": [[10, "functions"], [14, "functions"], [17, "functions"]], "GANBLR Model Class": [[7, null]], "GANBLRPP Class": [[11, "ganblrpp-class"]], "GanblrAdapter": [[8, null]], "GanblrppAdapter Class": [[12, null]], "Helper Functions": [[13, "helper-functions"]], "Katabatic": [[18, null]], "Katabatic documentation": [[6, null]], "Katabatic package": [[0, null]], "Katabatic.models package": [[1, null]], "Katabatic.models.GANBLR++ package": [[2, null]], "Katabatic.models.ganblr package": [[3, null]], "Katabatic.models.tableGAN package": [[4, null]], "Katabatic.utils package": [[5, null]], "KdbHighOrderFeatureEncoder Class": [[13, null]], "Methods": [[8, "methods"], [9, "methods"], [13, "methods"], [16, "methods"]], "Module contents": [[2, "module-contents"], [3, "module-contents"], [4, "module-contents"]], "Submodules": [[5, "submodules"]], "Submodules:": [[2, null], [3, null], [4, null]], "Subpackages": [[0, "subpackages"], [1, "subpackages"]], "TableGAN Class": [[15, null]], "TableGANAdapter": [[16, null]], "Usage Example": [[16, "usage-example"]], "Utility Classes": [[14, null]], "Utility Functions": [[10, null], [17, null]], "kDB Algorithm": [[9, null]]}, "docnames": 
["Katabatic", "Katabatic.models", "Katabatic.models.GANBLR++", "Katabatic.models.ganblr", "Katabatic.models.tableGAN", "Katabatic.utils", "index", "katabatic.models.ganblr/ganblr", "katabatic.models.ganblr/ganblr_adapter", "katabatic.models.ganblr/kdb", "katabatic.models.ganblr/utils", "katabatic.models.ganblrpp/ganblrpp", "katabatic.models.ganblrpp/ganblrpp_adapter", "katabatic.models.ganblrpp/kdb", "katabatic.models.ganblrpp/utils", "katabatic.models.tablegan/tablegan", "katabatic.models.tablegan/tablegan_adapter", "katabatic.models.tablegan/tablegan_utils", "modules"], "envversion": {"sphinx": 63, "sphinx.domains.c": 3, "sphinx.domains.changeset": 1, "sphinx.domains.citation": 1, "sphinx.domains.cpp": 9, "sphinx.domains.index": 1, "sphinx.domains.javascript": 3, "sphinx.domains.math": 2, "sphinx.domains.python": 4, "sphinx.domains.rst": 2, "sphinx.domains.std": 2}, "filenames": ["Katabatic.rst", "Katabatic.models.rst", "Katabatic.models.GANBLR++.rst", "Katabatic.models.ganblr.rst", "Katabatic.models.tableGAN.rst", "Katabatic.utils.rst", "index.rst", "katabatic.models.ganblr/ganblr.rst", "katabatic.models.ganblr/ganblr_adapter.rst", "katabatic.models.ganblr/kdb.rst", "katabatic.models.ganblr/utils.rst", "katabatic.models.ganblrpp/ganblrpp.rst", "katabatic.models.ganblrpp/ganblrpp_adapter.rst", "katabatic.models.ganblrpp/kdb.rst", "katabatic.models.ganblrpp/utils.rst", "katabatic.models.tablegan/tablegan.rst", "katabatic.models.tablegan/tablegan_adapter.rst", "katabatic.models.tablegan/tablegan_utils.rst", "modules.rst"], "indexentries": {"_build_classifier() (tablegan method)": [[15, "TableGAN._build_classifier", false]], "_build_discriminator() (tablegan method)": [[15, "TableGAN._build_discriminator", false]], "_build_generator() (tablegan method)": [[15, "TableGAN._build_generator", false]], "fit() (tablegan method)": [[15, "TableGAN.fit", false]], "gradient_penalty() (tablegan method)": [[15, "TableGAN.gradient_penalty", false]], "sample() (tablegan method)": 
[[15, "TableGAN.sample", false]], "tablegan (built-in class)": [[15, "TableGAN", false]], "train_step() (tablegan method)": [[15, "TableGAN.train_step", false]], "wasserstein_loss() (tablegan method)": [[15, "TableGAN.wasserstein_loss", false]]}, "objects": {"": [[15, 0, 1, "", "TableGAN"]], "TableGAN": [[15, 1, 1, "", "_build_classifier"], [15, 1, 1, "", "_build_discriminator"], [15, 1, 1, "", "_build_generator"], [15, 1, 1, "", "fit"], [15, 1, 1, "", "gradient_penalty"], [15, 1, 1, "", "sample"], [15, 1, 1, "", "train_step"], [15, 1, 1, "", "wasserstein_loss"]]}, "objnames": {"0": ["py", "class", "Python class"], "1": ["py", "method", "Python method"]}, "objtypes": {"0": "py:class", "1": "py:method"}, "terms": {"": 12, "0": [7, 8, 9, 10, 11, 12, 13, 14, 15, 17], "1": [7, 9, 11, 15, 16, 17], "10": [7, 8, 11, 12, 15], "100": [15, 16], "1000": [15, 16], "1d": [11, 13], "1e": 13, "2": [9, 13], "2d": 11, "32": [7, 11, 15], "5": 13, "64": [8, 12, 15, 16], "A": [8, 9, 10, 13, 14, 15, 17], "By": 17, "If": [8, 12, 13, 14, 16], "It": [9, 11, 12, 13, 16], "The": [7, 9, 10, 11, 12, 13, 14, 15, 16, 17], "__arr_mu": 11, "__arr_sigma": 11, "__call__": 14, "__discrit": 11, "__dmm": 11, "__dmm_param": 11, "__ganblr": 11, "__gen_weight": 7, "__init__": [7, 11, 12, 13, 14, 16], "__kdbe_x": 14, "__sample_from_truncnorm": 11, "__scaler": 11, "_add_uniform": [9, 13], "_augment_cpd": 7, "_build_classifi": 15, "_build_discrimin": 15, "_build_gener": 15, "_d": 7, "_discrim": 7, "_get_dependencies_without_i": [9, 13], "_kdbe": 14, "_label_encod": 7, "_normalize_by_column": [9, 13], "_numerical_column": 11, "_ordinal_encod": 7, "_random_st": 11, "_run_gener": 7, "_sampl": 7, "_smooth": [9, 13], "_warmup_run": 7, "accord": 16, "accuraci": 11, "ad": 13, "adapt": [8, 12, 16], "add": [6, 9, 13], "adjust": 16, "adult": [10, 14, 17], "adversari": [11, 15], "after": 14, "algorithm": [1, 3, 7, 13], "all": [9, 14], "allow": 16, "also": [11, 13, 14], "amount": 13, "an": [8, 9, 12, 13, 16], "ani": 8, 
"appli": [9, 11, 13, 14, 17], "apply_wt": [9, 13], "approach": 11, "ar": [10, 11, 14, 17], "arrai": [9, 10, 11, 13, 14, 15, 17], "array_lik": 13, "associ": 14, "attribut": [1, 3, 4], "augment": 7, "authent": 15, "avoid": 13, "back": 11, "base": [13, 16], "batch": [7, 8, 11, 12, 15, 16], "batch_siz": [7, 8, 11, 12, 15, 16], "bayesian": [7, 9, 11], "befor": [10, 14, 17], "below": 16, "between": [12, 13], "bin": 11, "bool": [13, 14], "boolean": 13, "bridg": 12, "build": 7, "build_graph": [9, 13], "calcul": [10, 14, 17], "callabl": 17, "can": 17, "categor": 16, "categori": 14, "cct": [9, 13], "class": [1, 2, 3, 4, 8], "class_count": 14, "classif": 15, "classifi": 15, "clear": 14, "col": [9, 13], "column": [9, 11, 12, 13], "combin": [7, 11], "comput": [13, 14, 15], "condit": 7, "configur": 14, "consid": [9, 13], "constrain": [10, 14, 17], "constraint": [7, 8, 9, 10, 12, 13, 14, 15, 16, 17], "constraint_posit": 14, "constraints_": 13, "construct": [9, 13, 15], "contain": [9, 12, 14, 15], "content": [0, 1], "continu": [8, 11, 16], "control": 11, "convert": 11, "correspond": 15, "count": [13, 14], "cpd": 7, "creat": [7, 10, 14, 15, 17], "cross": [9, 13], "csv": [12, 16], "current": 16, "custom": [10, 11, 14, 17], "d": [7, 9, 13], "data": [7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17], "data_pathnam": [8, 12, 16], "data_s": 14, "data_util": 14, "datafram": [9, 12, 16, 17], "dataset": [10, 11, 12, 14, 17], "datautil": [10, 14, 17], "default": [7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17], "defin": [7, 10, 13, 14, 16, 17], "delta_mean": [15, 16], "delta_sd": [15, 16], "demo": [10, 14, 17], "dens": 14, "dense_format": 14, "depend": [7, 9, 13], "dependencies_": [9, 13], "design": 15, "detail": 6, "deviat": [11, 15], "dict": [13, 14], "dictionari": [9, 13, 14], "differ": 15, "dimens": [10, 13, 14, 15, 16, 17], "dimension": 16, "discret": [8, 11, 12], "discrimin": [7, 15], "distribut": [7, 9, 11, 17], "diverg": 17, "dmmdiscrit": [1, 2], "download": [10, 14, 17], "dure": [12, 15, 16], 
"each": [9, 11, 13, 14, 15], "easi": 8, "edg": [9, 13], "edges_": 13, "either": 17, "elr_loss": [10, 14, 17], "encod": [7, 9, 13, 14, 16], "end": 14, "enforc": 15, "ensur": 17, "epoch": [7, 8, 11, 12, 15, 16], "error": [12, 16], "evalu": [11, 15], "evid": [9, 13], "evidence_col": [9, 13], "exampl": [1, 4, 15], "except": 12, "exclud": 13, "expect": 17, "extend": 16, "extract": 15, "facilit": 12, "fail": 12, "fake": [10, 14, 15, 17], "fals": [9, 13], "featur": [8, 9, 10, 11, 12, 13, 14, 15, 17], "feature_idx": 14, "feature_uniqu": [9, 13, 14], "feature_uniques_": [9, 13], "file": [9, 10, 12, 14, 16, 17], "find": [9, 13], "fit": [7, 8, 9, 11, 12, 13, 15, 16], "fit_transform": [9, 11, 13], "float": [11, 13, 15, 17], "form": 11, "format": [7, 14], "frac": [10, 14, 17], "fraction": [10, 14, 17], "from": [8, 9, 10, 11, 12, 14, 15, 16, 17], "function": [1, 2, 3, 4, 7, 11, 12, 15, 16], "futur": 16, "gan": 15, "ganblr": [0, 1, 8, 11, 12], "ganblradapt": [1, 3], "ganblrpp": [1, 2, 12], "ganblrppadapt": [1, 2], "gap": 12, "gaussian": 11, "gener": [7, 8, 9, 10, 11, 12, 14, 15, 16, 17], "generated_data": 15, "generated_label": 15, "get": 9, "get_categori": 14, "get_config": 14, "get_cross_t": [9, 13], "get_demo_data": [10, 14, 17], "get_high_order_constraint": [9, 13], "get_high_order_featur": [9, 13], "get_kdbe_x": 14, "get_lr": [10, 14, 17], "given": [7, 9, 10, 13, 14], "gradient": 15, "gradient_penalti": 15, "graph": [7, 9, 13], "handl": [13, 15, 16], "have_valu": 13, "have_value_idx": 13, "have_value_idxs_": 13, "helper": [1, 2, 17], "high": [9, 13, 14, 16], "high_order_constraint": 13, "high_order_featur": 13, "high_order_feature_uniques_": 13, "histori": 7, "hot": 13, "how": [16, 17], "i": [8, 9, 12, 14, 16, 17], "idx": 14, "iloc": 16, "implement": [9, 10, 11, 14, 15, 17], "import": [15, 16], "includ": [11, 16], "incorpor": 17, "index": 13, "indic": [11, 12, 13, 14], "individu": 9, "inform": 13, "initi": [7, 8, 11, 12, 13, 14, 16], "input": [7, 9, 10, 13, 14, 15, 16, 17], 
"input_dim": [10, 14, 15, 16, 17], "instanc": [8, 9, 11, 12, 13, 14, 15, 16], "int": [7, 8, 11, 12, 13, 14, 16, 17], "integ": [11, 13, 15], "integr": 17, "interfac": [8, 12, 16], "internet": [10, 14, 17], "invers": 11, "inverse_transform": 11, "its": 11, "joint": [9, 13], "jpt": 13, "k": [7, 8, 9, 11, 12, 13, 14], "katabat": [12, 16], "katabaticmodelspi": [8, 16], "kdb": [1, 3, 7, 13, 14], "kdb_edg": [9, 13], "kdbhighorderfeatureencod": [1, 2, 9, 14], "kera": 15, "kl": [10, 14, 17], "kl_loss": [10, 14, 17], "kullback": 17, "label": [7, 8, 9, 11, 12, 13, 14, 15, 16, 17], "label_dim": [15, 16], "label_encod": 16, "labelencod": 16, "layer": 15, "lead": 13, "learn": 17, "leibler": 17, "level": [7, 16], "leverag": 7, "lipschitz": 15, "list": [9, 11, 12, 13, 14, 17], "load": [8, 12, 16], "load_data": [8, 12, 16], "load_model": [8, 12, 16], "log": 15, "logist": [10, 14, 17], "loss": [7, 10, 14, 15, 17], "low": 16, "lr": 11, "machin": 17, "max": 11, "maximum": 13, "mean": [11, 15], "measur": 17, "medium": 16, "method": [1, 2, 3, 4, 14, 15], "min": 11, "mixtur": 11, "mlp": 11, "mode": 15, "model": [0, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18], "model_typ": 12, "modifi": 9, "modul": [0, 1], "mu": 11, "mutual": 13, "n": [10, 14, 17], "n_encoded_featur": 13, "n_featur": [11, 13], "n_sampl": [11, 13, 15], "name": [9, 10, 13, 14, 17], "ndarrai": [11, 12, 13, 14], "network": [7, 9, 11, 15], "neural": 7, "node": [8, 9, 13], "nois": [13, 15], "none": [7, 8, 10, 11, 12, 14, 16, 17], "nonetyp": 16, "normal": [9, 11, 13, 17], "normalized_arrai": 13, "np": [11, 12, 13, 14, 15], "num_class": 14, "num_featur": 14, "number": [7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17], "numer": [11, 12], "numerical_column": [11, 12], "numpi": [9, 11, 13, 14, 15, 17], "object": [11, 17], "ohe_": [9, 13], "one": [13, 17], "onehotencod": [9, 13], "oper": [11, 12, 17], "option": [10, 11, 12, 13, 14, 16, 17], "order": [9, 13, 14], "ordin": 7, "origin": 11, "other": 17, "otherwis": 16, "output": [10, 14, 15, 17], 
"output_dim": [10, 14, 17], "overal": 17, "overrid": 16, "packag": [6, 18], "panda": [9, 12, 16, 17], "paramet": [7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17], "parent": [8, 9, 13], "path": 12, "pathnam": 8, "pd": 12, "penalti": 15, "perform": [11, 15], "phase": 7, "placehold": 7, "posit": 14, "predict": [11, 15], "prepar": [7, 10, 14, 17], "preprocess": [7, 11, 16], "presenc": 13, "privaci": 16, "privacy_set": 16, "prob_fak": [10, 14, 17], "probabl": [7, 9, 10, 13, 14, 17], "process": [11, 16], "produc": 15, "properti": [1, 2, 3, 9, 14], "provid": [8, 10, 11, 12, 14, 15, 16, 17], "py": [7, 9, 10, 13, 14, 17], "rais": [12, 16], "random": [10, 11, 12, 14, 15, 17], "random_st": [10, 11, 12, 14, 17], "randomst": 11, "real": [11, 15], "real_data": 15, "real_label": 15, "regress": [10, 14, 17], "relationship": 15, "repres": [9, 13, 15], "reproduc": [10, 11, 14, 17], "requir": 12, "reserv": 16, "respect": 15, "restructuredtext": 6, "result": [11, 13], "retriev": [13, 14], "return": [7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17], "return_constraint": [9, 13], "revert": 11, "rf": 11, "run": 7, "runtimeerror": 12, "sampl": [7, 8, 10, 11, 12, 14, 15, 16, 17], "scale": 11, "scaler": [11, 16], "score": 11, "second": 17, "see": 6, "seed": [10, 11, 12, 14, 17], "self": [13, 15, 16], "sequenti": 15, "seri": 12, "serv": 16, "set": 16, "shape": [11, 13], "show": 16, "sigma": 11, "singl": 15, "size": [7, 8, 11, 12, 15, 16], "smooth": [9, 13], "softmax": [10, 14, 17], "softmax_weight": [10, 14, 17], "sourc": 9, "specif": 15, "specifi": [8, 14, 16, 17], "spi": 12, "standard": [11, 15], "standardscal": 16, "start": 14, "state": [11, 12], "step": 15, "store": [11, 13], "str": [8, 11, 12, 16, 17], "structur": [1, 4, 15], "submodul": [0, 18], "subpackag": 18, "sum": 17, "support": [10, 14, 17], "syntax": 6, "synthet": [7, 11, 12, 15, 16], "synthetic_data": 16, "tabl": [9, 13], "tablegan": [0, 1, 16], "tableganadapt": [1, 4], "tabul": 13, "tabular": 15, "target": [9, 13, 14], "techniqu": 11, 
"tensor": [10, 14, 15, 17], "tensorflow": 15, "term": 15, "test": 11, "tf": [14, 15], "thi": [8, 12, 15, 16], "three": 15, "threshold": 15, "train": [7, 8, 10, 11, 12, 14, 15, 16, 17], "train_step": 15, "training_data": 16, "training_sample_s": [8, 12, 16], "transform": [9, 11, 13, 14], "true": [9, 13, 14, 15], "truncat": 11, "tstr": 11, "tupl": [9, 13, 14, 15], "type": [8, 12, 16], "under": [10, 14], "uniform": [9, 13], "uniq_vals_all_col": 13, "uniqu": [9, 13, 14], "unless": 16, "up": 17, "updat": 14, "us": [6, 7, 8, 11, 12, 13, 14, 15, 16, 17], "usag": [1, 4], "use_oh": [9, 13], "util": [0, 1, 2, 3, 4, 7, 18], "valu": [7, 9, 10, 11, 13, 14, 15, 17], "valueerror": 12, "variabl": [9, 13], "variou": [7, 10, 14, 17], "vector": 15, "verbos": [7, 11, 15], "w": 14, "warmup": [7, 11], "warmup_epoch": [7, 11], "wasserstein": 15, "wasserstein_loss": 15, "weight": [7, 9, 10, 13, 14, 17], "whether": [8, 9, 13, 14], "which": [11, 14, 15, 17], "without": 9, "x": [7, 9, 11, 13, 14, 15], "x_out": 13, "x_train": [8, 12, 15, 16], "xt": 13, "y": [7, 9, 11, 13, 14, 15], "y_name": [9, 13], "y_pred": 15, "y_train": [8, 12, 15, 16], "y_true": 15, "your": 6, "z_dim": 15, "zero": 13}, "titles": ["Katabatic package", "Katabatic.models package", "Katabatic.models.GANBLR++ package", "Katabatic.models.ganblr package", "Katabatic.models.tableGAN package", "Katabatic.utils package", "Katabatic documentation", "GANBLR Model Class", "GanblrAdapter", "kDB Algorithm", "Utility Functions", "DMMDiscritizer Class", "GanblrppAdapter Class", "KdbHighOrderFeatureEncoder Class", "Utility Classes", "TableGAN Class", "TableGANAdapter", "Utility Functions", "Katabatic"], "titleterms": {"algorithm": 9, "attribut": [8, 16], "class": [7, 9, 10, 11, 12, 13, 14, 15, 16, 17], "content": [2, 3, 4, 6], "dmmdiscrit": 11, "document": 6, "exampl": 16, "function": [10, 13, 14, 17], "ganblr": [2, 3, 7], "ganblradapt": 8, "ganblrpp": 11, "ganblrppadapt": 12, "helper": 13, "katabat": [0, 1, 2, 3, 4, 5, 6, 18], "kdb": 9, 
"kdbhighorderfeatureencod": 13, "method": [7, 8, 9, 11, 12, 13, 16], "model": [1, 2, 3, 4, 7], "modul": [2, 3, 4], "packag": [0, 1, 2, 3, 4, 5], "properti": [7, 11, 12, 13], "structur": 16, "submodul": [2, 3, 4, 5], "subpackag": [0, 1], "tablegan": [4, 15], "tableganadapt": 16, "usag": 16, "util": [5, 10, 14, 17]}}) \ No newline at end of file diff --git a/Archive/doc_new/documentation instructions.md b/Archive/doc_new/documentation instructions.md deleted file mode 100644 index 24a7b09..0000000 --- a/Archive/doc_new/documentation instructions.md +++ /dev/null @@ -1,52 +0,0 @@ -# How to Document Models Using Sphinx - -To document your models with **Sphinx**, follow these steps: - -## 1. Create reStructuredText (`.rst`) files for your models - -For each model or module, create an `.rst` file in your Sphinx `doc_new/source` directory. - -- Example file structure: - ``` - docs/ - ├── source/ - │ ├── index.rst - │ ├── TableGAN.rst - │ ├── TableGANAdapter.rst - │ ├── utils.rst - ``` - -## 2. Structure the `.rst` Files - -In each `.rst` file, document the model using the following structure: - -```rst -ModelName -========== - -.. toctree:: - :maxdepth: 2 - :caption: Contents: - -Class Description ------------------ -A short description of the class and its functionality. - -Methods -------- -- Method 1 - - Description: Brief explanation of what the method does. - - Parameters: - - `param_name`: Description of the parameter. - - Returns: - - Description of what the method returns. - -Example -------- -Provide example usage of the class and its methods, if necessary. - - -## 4. Build the documentation in html. - -In the doc_new root directory, run the command 'make html' - which should generate the documentation in a folder called _build. The items can be accessed by pressing on opening one of the html files under the directory called html under _build. 
-``` diff --git a/Archive/doc_new/make.bat b/Archive/doc_new/make.bat deleted file mode 100644 index 954237b..0000000 --- a/Archive/doc_new/make.bat +++ /dev/null @@ -1,35 +0,0 @@ -@ECHO OFF - -pushd %~dp0 - -REM Command file for Sphinx documentation - -if "%SPHINXBUILD%" == "" ( - set SPHINXBUILD=sphinx-build -) -set SOURCEDIR=. -set BUILDDIR=_build - -%SPHINXBUILD% >NUL 2>NUL -if errorlevel 9009 ( - echo. - echo.The 'sphinx-build' command was not found. Make sure you have Sphinx - echo.installed, then set the SPHINXBUILD environment variable to point - echo.to the full path of the 'sphinx-build' executable. Alternatively you - echo.may add the Sphinx directory to PATH. - echo. - echo.If you don't have Sphinx installed, grab it from - echo.https://www.sphinx-doc.org/ - exit /b 1 -) - -if "%1" == "" goto help - -%SPHINXBUILD% -M %1 %SOURCEDIR% %BUILDDIR% %SPHINXOPTS% %O% -goto end - -:help -%SPHINXBUILD% -M help %SOURCEDIR% %BUILDDIR% %SPHINXOPTS% %O% - -:end -popd diff --git a/Archive/doc_new/source/Katabatic.models.GANBLR++.rst b/Archive/doc_new/source/Katabatic.models.GANBLR++.rst deleted file mode 100644 index 40b91cb..0000000 --- a/Archive/doc_new/source/Katabatic.models.GANBLR++.rst +++ /dev/null @@ -1,47 +0,0 @@ -Katabatic.models.GANBLR\+\+ package -=================================== - -.. Submodules -.. ---------- - -.. Katabatic.models.GANBLR\+\+.GANBLR\+\+ module -.. --------------------------------------------- - -.. .. automodule:: Katabatic.models.GANBLR++.GANBLR++ -.. :members: -.. :undoc-members: -.. :show-inheritance: - -.. Katabatic.models.GANBLR\+\+.GANBLRPP\_adapter module -.. ---------------------------------------------------- - -.. .. automodule:: Katabatic.models.GANBLR++.GANBLRPP_adapter -.. :members: -.. :undoc-members: -.. :show-inheritance: - -.. Katabatic.models.GANBLR\+\+.utils module -.. ---------------------------------------- - -.. .. automodule:: Katabatic.models.GANBLR++.utils -.. :members: -.. :undoc-members: -.. 
:show-inheritance: - -Module contents ---------------- - -.. .. automodule:: Katabatic.models.GANBLR++ -.. :members: -.. :undoc-members: -.. :show-inheritance: - - -.. toctree:: - :maxdepth: 2 - :caption: Submodules: - - katabatic.models.ganblrpp/ganblrpp - katabatic.models.ganblrpp/ganblrpp_adapter - katabatic.models.ganblrpp/kdb - katabatic.models.ganblrpp/utils diff --git a/Archive/doc_new/source/Katabatic.models.ganblr.rst b/Archive/doc_new/source/Katabatic.models.ganblr.rst deleted file mode 100644 index d3e15b3..0000000 --- a/Archive/doc_new/source/Katabatic.models.ganblr.rst +++ /dev/null @@ -1,54 +0,0 @@ -Katabatic.models.ganblr package -=============================== - -.. Submodules -.. ---------- - -.. Katabatic.models.ganblr.ganblr module -.. ------------------------------------- - -.. .. automodule:: Katabatic.models.ganblr.ganblr -.. :members: -.. :undoc-members: -.. :show-inheritance: - -.. Katabatic.models.ganblr.ganblr\_adapter module -.. ---------------------------------------------- - -.. .. automodule:: Katabatic.models.ganblr.ganblr_adapter -.. :members: -.. :undoc-members: -.. :show-inheritance: - -.. Katabatic.models.ganblr.kdb module -.. ---------------------------------- - -.. .. automodule:: Katabatic.models.ganblr.kdb -.. :members: -.. :undoc-members: -.. :show-inheritance: - -.. Katabatic.models.ganblr.utils module -.. ------------------------------------ - -.. .. automodule:: Katabatic.models.ganblr.utils -.. :members: -.. :undoc-members: -.. :show-inheritance: - -Module contents ---------------- - -.. .. automodule:: Katabatic.models.ganblr -.. :members: -.. :undoc-members: -.. :show-inheritance: - -.. 
toctree:: - :maxdepth: 2 - :caption: Submodules: - - katabatic.models.ganblr/ganblr - katabatic.models.ganblr/ganblr_adapter - katabatic.models.ganblr/kdb - katabatic.models.ganblr/utils \ No newline at end of file diff --git a/Archive/doc_new/source/Katabatic.models.rst b/Archive/doc_new/source/Katabatic.models.rst deleted file mode 100644 index 85232c6..0000000 --- a/Archive/doc_new/source/Katabatic.models.rst +++ /dev/null @@ -1,20 +0,0 @@ -Katabatic.models package -======================== - -Subpackages ------------ - -.. toctree:: - :maxdepth: 4 - - Katabatic.models.GANBLR++ - Katabatic.models.ganblr - Katabatic.models.tableGAN - -.. Module contents -.. --------------- - -.. .. automodule:: Katabatic.models -.. :members: -.. :undoc-members: -.. :show-inheritance: diff --git a/Archive/doc_new/source/Katabatic.models.tableGAN.rst b/Archive/doc_new/source/Katabatic.models.tableGAN.rst deleted file mode 100644 index 2150e9c..0000000 --- a/Archive/doc_new/source/Katabatic.models.tableGAN.rst +++ /dev/null @@ -1,53 +0,0 @@ -Katabatic.models.tableGAN package -=============================== - -.. Submodules -.. ---------- - -.. Katabatic.models.ganblr.ganblr module -.. ------------------------------------- - -.. .. automodule:: Katabatic.models.ganblr.ganblr -.. :members: -.. :undoc-members: -.. :show-inheritance: - -.. Katabatic.models.ganblr.ganblr\_adapter module -.. ---------------------------------------------- - -.. .. automodule:: Katabatic.models.ganblr.ganblr_adapter -.. :members: -.. :undoc-members: -.. :show-inheritance: - -.. Katabatic.models.ganblr.kdb module -.. ---------------------------------- - -.. .. automodule:: Katabatic.models.ganblr.kdb -.. :members: -.. :undoc-members: -.. :show-inheritance: - -.. Katabatic.models.ganblr.utils module -.. ------------------------------------ - -.. .. automodule:: Katabatic.models.ganblr.utils -.. :members: -.. :undoc-members: -.. :show-inheritance: - -Module contents ---------------- - -.. .. 
automodule:: Katabatic.models.ganblr -.. :members: -.. :undoc-members: -.. :show-inheritance: - -.. toctree:: - :maxdepth: 2 - :caption: Submodules: - - katabatic.models.tablegan/tablegan - katabatic.models.tablegan/tablegan_adapter - katabatic.models.tablegan/tablegan_utils \ No newline at end of file diff --git a/Archive/doc_new/source/Katabatic.rst b/Archive/doc_new/source/Katabatic.rst deleted file mode 100644 index 44bd54d..0000000 --- a/Archive/doc_new/source/Katabatic.rst +++ /dev/null @@ -1,54 +0,0 @@ -Katabatic package -================= - -Subpackages ------------ - -.. toctree:: - :maxdepth: 4 - - Katabatic.models - Katabatic.utils - -.. Submodules -.. ---------- - -.. Katabatic.importer module -.. ------------------------- - -.. .. automodule:: Katabatic.importer -.. :members: -.. :undoc-members: -.. :show-inheritance: - -.. Katabatic.katabatic module -.. -------------------------- - -.. .. automodule:: Katabatic.katabatic -.. :members: -.. :undoc-members: -.. :show-inheritance: - -.. Katabatic.katabatic\_spi module -.. ------------------------------- - -.. .. automodule:: Katabatic.katabatic_spi -.. :members: -.. :undoc-members: -.. :show-inheritance: - -.. Katabatic.test2 module -.. ---------------------- - -.. .. automodule:: Katabatic.test2 -.. :members: -.. :undoc-members: -.. :show-inheritance: - -.. Module contents -.. --------------- - -.. .. automodule:: Katabatic -.. :members: -.. :undoc-members: -.. :show-inheritance: diff --git a/Archive/doc_new/source/Katabatic.utils.rst b/Archive/doc_new/source/Katabatic.utils.rst deleted file mode 100644 index b63f4e5..0000000 --- a/Archive/doc_new/source/Katabatic.utils.rst +++ /dev/null @@ -1,21 +0,0 @@ -Katabatic.utils package -======================= - -Submodules ----------- - -.. Katabatic.utils.evaluate module -.. ------------------------------- - -.. .. automodule:: Katabatic.utils.evaluate -.. :members: -.. :undoc-members: -.. :show-inheritance: - -.. Module contents -.. --------------- - -.. .. 
automodule:: Katabatic.utils -.. :members: -.. :undoc-members: -.. :show-inheritance: diff --git a/Archive/doc_new/source/conf.py b/Archive/doc_new/source/conf.py deleted file mode 100644 index ca7c51d..0000000 --- a/Archive/doc_new/source/conf.py +++ /dev/null @@ -1,37 +0,0 @@ -# Configuration file for the Sphinx documentation builder. -# -# For the full list of built-in configuration values, see the documentation: -# https://www.sphinx-doc.org/en/master/usage/configuration.html - -# -- Project information ----------------------------------------------------- -# https://www.sphinx-doc.org/en/master/usage/configuration.html#project-information - -import os -import sys -sys.path.insert(0, os.path.abspath('../katabatic')) - -project = 'Katabatic' -copyright = '2024, Daniel G.E. Ken' -author = 'Daniel G.E. Ken' - -# -- General configuration --------------------------------------------------- -# https://www.sphinx-doc.org/en/master/usage/configuration.html#general-configuration - -extensions = [ - 'sphinx.ext.autodoc', - 'sphinx.ext.napoleon', -] - -templates_path = ['_templates'] -exclude_patterns = ['_build', 'Thumbs.db', '.DS_Store'] - - - -# -- Options for HTML output ------------------------------------------------- -# https://www.sphinx-doc.org/en/master/usage/configuration.html#options-for-html-output - -html_theme = 'alabaster' -html_static_path = ['_static'] - -napoleon_google_docstring = True -napoleon_numpy_docstring = True \ No newline at end of file diff --git a/Archive/doc_new/source/index.rst b/Archive/doc_new/source/index.rst deleted file mode 100644 index add461f..0000000 --- a/Archive/doc_new/source/index.rst +++ /dev/null @@ -1,20 +0,0 @@ -.. Katabatic documentation master file, created by - sphinx-quickstart on Mon Aug 12 10:22:28 2024. - You can adapt this file completely to your liking, but it should at least - contain the root `toctree` directive. 
- -Katabatic documentation -======================= - -Add your content using ``reStructuredText`` syntax. See the -`reStructuredText `_ -documentation for details. - - -.. toctree:: - :maxdepth: 2 - :caption: Contents: - - - modules - diff --git a/Archive/doc_new/source/katabatic.models.ganblr/ganblr.rst b/Archive/doc_new/source/katabatic.models.ganblr/ganblr.rst deleted file mode 100644 index 4c1b65e..0000000 --- a/Archive/doc_new/source/katabatic.models.ganblr/ganblr.rst +++ /dev/null @@ -1,99 +0,0 @@ -GANBLR Model Class -=================== -The GANBLR model combines a Bayesian network and a neural network, leveraging the k-Dependency Bayesian (kDB) algorithm for building a dependency graph and using various utility functions for data preparation and model constraints. - -Defined in `ganblr.py` - -Class Properties ----------------- - -- **_d**: - Placeholder for data. - -- **__gen_weights**: - Placeholder for generator weights. - -- **batch_size** (`int`): - Batch size for training. - -- **epochs** (`int`): - Number of epochs for training. - -- **k**: - Parameter for the model. - -- **constraints**: - Constraints for the model. - -- **_ordinal_encoder**: - Ordinal encoder for preprocessing data. - -- **_label_encoder**: - Label encoder for preprocessing data. - -Class Methods --------------- - -**__init__()** - Initializes the model with default values. - -**fit(x, y, k=0, batch_size=32, epochs=10, warmup_epochs=1, verbose=0)** - Fits the model to the given data. - - - **Parameters:** - - `x`: Input data. - - `y`: Labels for the data. - - `k`: Parameter for the model (default: 0). - - `batch_size`: Size of batches for training (default: 32). - - `epochs`: Number of training epochs (default: 10). - - `warmup_epochs`: Number of warmup epochs (default: 1). - - `verbose`: Verbosity level (default: 0). - - - **Returns:** - The fitted model. - -**_augment_cpd(d, size=None, verbose=0)** - Augments the Conditional Probability Distribution (CPD). 
- - - **Parameters:** - - `d`: Data. - - `size`: Size of the sample (default: None). - - `verbose`: Verbosity level (default: 0). - - - **Returns:** - The augmented data. - -**_warmup_run(epochs, verbose=None)** - Runs a warmup phase. - - - **Parameters:** - - `epochs`: Number of epochs for warmup. - - `verbose`: Verbosity level (default: None). - - - **Returns:** - The history of the warmup run. - -**_run_generator(loss)** - Runs the generator model. - - - **Parameters:** - - `loss`: Loss function. - - - **Returns:** - The history of the generator run. - -**_discrim()** - Creates a discriminator model. - - - **Returns:** - The discriminator model. - -**_sample(size=None, verbose=0)** - Generates synthetic data in ordinal encoding format. - - - **Parameters:** - - `size`: Size of the sample (default: None). - - `verbose`: Verbosity level (default: 0). - - - **Returns:** - The sampled data. diff --git a/Archive/doc_new/source/katabatic.models.ganblr/ganblr_adapter.rst b/Archive/doc_new/source/katabatic.models.ganblr/ganblr_adapter.rst deleted file mode 100644 index 8c94e3f..0000000 --- a/Archive/doc_new/source/katabatic.models.ganblr/ganblr_adapter.rst +++ /dev/null @@ -1,53 +0,0 @@ -GanblrAdapter -============= - -A class to adapt the GANBLR model to the KatabaticModelSPI interface. This class provides an easy interface for loading, fitting, and generating data using the GANBLR model. - -Attributes ----------- - -- **type** (`str`): - Specifies whether the model type is 'discrete' or 'continuous'. Default is 'discrete'. - -- **constraints** (`any`): - Constraints for the model. Default is None. - -- **batch_size** (`int`): - Batch size for training the model. Default is None. - -- **epochs** (`int`): - Number of epochs for training the model. Default is None. - -- **training_sample_size** (`int`): - Size of the training sample. Initialized to 0. - -Methods --------- - -**load_model()** - Initializes and returns an instance of the GANBLR model. 
- -**load_data(data_pathname)** - Loads data from the specified pathname. - - - **Parameters:** - - `data_pathname`: Pathname of the data to be loaded. - -**fit(X_train, y_train, k=0, epochs=10, batch_size=64)** - Fits the GANBLR model using the provided training data. - - - **Parameters:** - - `X_train`: Training features. - - `y_train`: Training labels. - - `k`: Number of parent nodes (default: 0). - - `epochs`: Number of epochs for training (default: 10). - - `batch_size`: Batch size for training (default: 64). - -**generate(size=None)** - Generates data from the GANBLR model. If `size` is not specified, it defaults to the training sample size. - - - **Parameters:** - - `size`: Number of data samples to generate. Defaults to None. - - - **Returns:** - Generated data samples. \ No newline at end of file diff --git a/Archive/doc_new/source/katabatic.models.ganblr/kdb.rst b/Archive/doc_new/source/katabatic.models.ganblr/kdb.rst deleted file mode 100644 index 3b22868..0000000 --- a/Archive/doc_new/source/katabatic.models.ganblr/kdb.rst +++ /dev/null @@ -1,108 +0,0 @@ -kDB Algorithm -============= - -The kDB algorithm constructs a dependency graph for the Bayesian network. It is implemented in the `kdb.py` file. - -Methods --------- - -**build_graph(X, y, k=2)** - Constructs a k-dependency Bayesian network graph. - - - **Parameters:** - - `X`: Input data (features). - - `y`: Labels. - - `k`: Number of parent nodes (default: 2). - - - **Returns:** - A list of graph edges. - -**get_cross_table(*cols, apply_wt=False)** - Generates a cross table from input columns. - - - **Parameters:** - - `cols`: Columns for cross table generation. - - `apply_wt`: Whether to apply weights (default: False). - - - **Returns:** - A tuple containing: - - The cross table as a NumPy array. - - A list of unique values for all columns. - - A list of unique values for individual columns. 
- -**_get_dependencies_without_y(variables, y_name, kdb_edges)** - Finds the dependencies of each variable without considering `y`. - - - **Parameters:** - - `variables`: List of variable names. - - `y_name`: Class name. - - `kdb_edges`: List of tuples representing edges (source, target). - - - **Returns:** - A dictionary of dependencies. - -**_add_uniform(X, weight=1.0)** - Adds a uniform distribution to the data. - - - **Parameters:** - - `X`: Input data, a NumPy array or pandas DataFrame. - - `weight`: Weight for the uniform distribution (default: 1.0). - - - **Returns:** - The modified data with uniform distribution. - -**_normalize_by_column(array)** - Normalizes the array by columns. - - - **Parameters:** - - `array`: Input array to normalize. - - - **Returns:** - The normalized array. - -**_smoothing(cct, d)** - Probability smoothing for kDB. - - - **Parameters:** - - `cct`: Cross-count table. - - `d`: Dimension of the cross-count table. - - - **Returns:** - A smoothed joint probability table. - -**get_high_order_feature(X, col, evidence_cols, feature_uniques)** - Encodes the high-order feature of `X[col]` given evidence from `X[evidence_cols]`. - - - **Parameters:** - - `X`: Input data. - - `col`: Column to encode. - - `evidence_cols`: List of evidence columns. - - `feature_uniques`: Unique values for features. - - - **Returns:** - An encoded high-order feature. - -**get_high_order_constraints(X, col, evidence_cols, feature_uniques)** - Gets high-order constraints for the feature. - - - **Parameters:** - - `X`: Input data. - - `col`: Column to encode. - - `evidence_cols`: List of evidence columns. - - `feature_uniques`: Unique values for features. - - - **Returns:** - High-order constraints. - -Classes -------- - -**KdbHighOrderFeatureEncoder** - Encodes high-order features for the kDB model. - - - **Class Properties:** - - `feature_uniques_`: Unique values for features. - - `dependencies_`: Dependencies for features. - - `ohe_`: OneHotEncoder instance. - - - **Class Methods:** - - `fit(X, y, k=0)`: Fits the encoder to the data.
- - `transform(X, return_constraints=False, use_ohe=True)`: Transforms the input data. - - `fit_transform(X, y, k=0, return_constraints=False)`: Fits the encoder and transforms the data. diff --git a/Archive/doc_new/source/katabatic.models.ganblr/utils.rst b/Archive/doc_new/source/katabatic.models.ganblr/utils.rst deleted file mode 100644 index be7d0e7..0000000 --- a/Archive/doc_new/source/katabatic.models.ganblr/utils.rst +++ /dev/null @@ -1,67 +0,0 @@ -Utility Functions -================= - -The utility functions are implemented in the `utils.py` file and provide various support functions for data preparation and model constraints. - -Classes --------- - -**softmax_weight** - Constrains weight tensors to be under softmax. - -**DataUtils** - Provides data utilities for preparation before training. - -Functions ---------- - -**elr_loss(KL_LOSS)** - Defines a custom loss function. - - - **Parameters:** - - `KL_LOSS`: The KL loss value. - - - **Returns:** - A custom loss function. - -**KL_loss(prob_fake)** - Calculates the KL loss. - - - **Parameters:** - - `prob_fake`: Probability of the fake data. - - - **Returns:** - The KL loss value. - -**get_lr(input_dim, output_dim, constraint=None, KL_LOSS=0)** - Creates a logistic regression model. - - - **Parameters:** - - `input_dim`: Dimension of the input features. - - `output_dim`: Dimension of the output. - - `constraint`: Optional constraint for the model (default: None). - - `KL_LOSS`: Optional KL loss value (default: 0). - - - **Returns:** - A logistic regression model. - -**sample(*arrays, n=None, frac=None, random_state=None)** - Generates random samples from the given arrays. - - - **Parameters:** - - `arrays`: Arrays to sample from. - - `n`: Number of samples to generate (default: None). - - `frac`: Fraction of samples to generate (default: None). - - `random_state`: Random seed for reproducibility (default: None). - - - **Returns:** - Random samples from the arrays. 
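The `sample` utility described above draws the same random row subset from several parallel arrays. A minimal sketch of that behaviour (illustrative only; the library's version may differ in return conventions and error handling):

```python
import numpy as np

def sample(*arrays, n=None, frac=None, random_state=None):
    """Draw the same random row subset from each array, without replacement."""
    size = len(arrays[0])
    if n is None:
        n = int(size * frac)  # fall back to a fraction of the data
    rng = np.random.default_rng(random_state)
    idx = rng.choice(size, size=n, replace=False)
    out = tuple(a[idx] for a in arrays)
    return out[0] if len(out) == 1 else out

X = np.arange(20).reshape(10, 2)  # row i is [2i, 2i+1]
y = np.arange(10)
Xs, ys = sample(X, y, n=4, random_state=42)
print(Xs.shape, ys.shape)  # (4, 2) (4,)
```

Because a single index array is reused for every input, feature rows and their labels stay paired after sampling.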
- -**get_demo_data(name='adult')** - Downloads a demo dataset from the internet. - - - **Parameters:** - - `name`: Name of the dataset to download (default: 'adult'). - - - **Returns:** - The downloaded dataset. diff --git a/Archive/doc_new/source/katabatic.models.ganblrpp/ganblrpp.rst b/Archive/doc_new/source/katabatic.models.ganblrpp/ganblrpp.rst deleted file mode 100644 index fad4152..0000000 --- a/Archive/doc_new/source/katabatic.models.ganblrpp/ganblrpp.rst +++ /dev/null @@ -1,146 +0,0 @@ -DMMDiscritizer Class -==================== - -The `DMMDiscritizer` class performs discretization using a mixture model approach. It scales the data, applies Bayesian Gaussian Mixture Models, and transforms the data into discrete values. It also includes methods for inverse transformation of discretized data back to its original form. - -Class Properties ----------------- - -- **__dmm_params**: - Parameters for the Bayesian Gaussian Mixture Model. - -- **__scaler**: - Min-Max scaler for data normalization. - -- **__dmms**: - List to store Bayesian Gaussian Mixture Models for each feature. - -- **__arr_mu**: - List to store means of the Gaussian distributions. - -- **__arr_sigma**: - List to store standard deviations of the Gaussian distributions. - -- **_random_state**: - Random seed for reproducibility. - -Class Methods --------------- - -**__init__(random_state)** - Initializes the DMMDiscritizer with parameters and scaler. - - - **Parameters:** - - `random_state`: Seed for random number generation. - -**fit(x)** - Fits the discretizer to the provided data. - - - **Parameters:** - - `x`: 2D numpy array of shape (n_samples, n_features). Numeric data to be discretized. - - - **Returns:** - The fitted `DMMDiscritizer` instance. - -**transform(x) -> np.ndarray** - Transforms the data using the fitted discretizer. - - - **Parameters:** - - `x`: 2D numpy array of shape (n_samples, n_features). Numeric data to be transformed. 
- - - **Returns:** - 2D numpy array of shape (n_samples, n_features). Discretized data. - -**fit_transform(x) -> np.ndarray** - Fits the discretizer and transforms the data. - - - **Parameters:** - - `x`: 2D numpy array of shape (n_samples, n_features). Numeric data to be discretized and transformed. - - - **Returns:** - 2D numpy array of shape (n_samples, n_features). Discretized data. - -**inverse_transform(x, verbose=1) -> np.ndarray** - Converts discretized data back to its original continuous form. - - - **Parameters:** - - `x`: 2D numpy array of shape (n_samples, n_features). Discretized data. - - `verbose`: int, default=1. Controls verbosity of the operation. - - - **Returns:** - 2D numpy array of shape (n_samples, n_features). Reverted data. - -**__sample_from_truncnorm(bins, mu, sigma, random_state=None)** - Samples data from a truncated normal distribution. - - - **Parameters:** - - `bins`: 1D numpy array of integer bins. - - `mu`: 1D numpy array of means for the normal distribution. - - `sigma`: 1D numpy array of standard deviations for the normal distribution. - - `random_state`: int or None. Seed for random number generation. - - - **Returns:** - 1D numpy array of sampled results. - -GANBLRPP Class -================ - -The `GANBLRPP` class implements the GANBLR++ model, which combines generative adversarial networks with discretization techniques. It uses the `DMMDiscritizer` for data preprocessing and the `GANBLR` model for generating synthetic data. - -Class Properties ----------------- - -- **__discritizer**: - Instance of `DMMDiscritizer` for data preprocessing. - -- **__ganblr**: - Instance of `GANBLR` for generative adversarial network functionality. - -- **_numerical_columns**: - List of indices for numerical columns in the dataset. - -Class Methods --------------- - -**__init__(numerical_columns, random_state=None)** - Initializes the GANBLR++ model with numerical column indices and optional random state. 
- - - **Parameters:** - - `numerical_columns`: List of indices for numerical columns. - - `random_state`: int, RandomState instance or None. Seed for random number generation. - -**fit(x, y, k=0, batch_size=32, epochs=10, warmup_epochs=1, verbose=1)** - Fits the GANBLR++ model to the provided data. - - - **Parameters:** - - `x`: 2D numpy array of shape (n_samples, n_features). Dataset to fit the model. - - `y`: 1D numpy array of shape (n_samples,). Labels for the dataset. - - `k`: int, default=0. Parameter k for the GANBLR model. - - `batch_size`: int, default=32. Size of batches for training. - - `epochs`: int, default=10. Number of training epochs. - - `warmup_epochs`: int, default=1. Number of warmup epochs. - - `verbose`: int, default=1. Controls verbosity of the training process. - - - **Returns:** - The fitted `GANBLRPP` instance. - -**sample(size=None, verbose=1)** - Generates synthetic data using the GANBLR++ model. - - - **Parameters:** - - `size`: int or None. Size of the synthetic data to generate. - - `verbose`: int, default=1. Controls verbosity of the sampling process. - - - **Returns:** - 2D numpy array of synthetic data. - -**evaluate(x, y, model='lr')** - Evaluates the model using a TSTR (Training on Synthetic data, Testing on Real data) approach. - - - **Parameters:** - - `x`: 2D numpy array of shape (n_samples, n_features). Test dataset. - - `y`: 1D numpy array of shape (n_samples,). Labels for the test dataset. - - `model`: str or object. Model to use for evaluation. Options are 'lr', 'mlp', 'rf', or a custom model with `fit` and `predict` methods. - - - **Returns:** - float. Accuracy score of the evaluation. 
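The `evaluate` method above scores the model TSTR-style: a classifier is trained on synthetic data and tested on real data. GANBLR++ fits one of `'lr'`, `'mlp'`, or `'rf'`; the sketch below substitutes a tiny nearest-centroid classifier so the idea is runnable without any model framework. All names here are illustrative, not the library's API:

```python
import numpy as np

def tstr_accuracy(x_synth, y_synth, x_real, y_real):
    """TSTR with a nearest-centroid classifier: fit class centroids on
    synthetic data, then score accuracy on the real hold-out set."""
    classes = np.unique(y_synth)
    centroids = np.stack([x_synth[y_synth == c].mean(axis=0) for c in classes])
    dists = np.linalg.norm(x_real[:, None, :] - centroids[None, :, :], axis=2)
    y_pred = classes[dists.argmin(axis=1)]
    return float((y_pred == y_real).mean())

# Toy check: "synthetic" and "real" data drawn from the same two blobs
rng = np.random.default_rng(0)
x_synth = np.vstack([rng.normal(0, 0.1, (50, 2)), rng.normal(3, 0.1, (50, 2))])
y_synth = np.repeat([0, 1], 50)
x_real = np.vstack([rng.normal(0, 0.1, (20, 2)), rng.normal(3, 0.1, (20, 2))])
y_real = np.repeat([0, 1], 20)
acc = tstr_accuracy(x_synth, y_synth, x_real, y_real)
print(acc)  # 1.0 on these well-separated blobs
```

A high TSTR score indicates the synthetic data preserves the feature-label relationships of the real distribution.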
diff --git a/Archive/doc_new/source/katabatic.models.ganblrpp/ganblrpp_adapter.rst b/Archive/doc_new/source/katabatic.models.ganblrpp/ganblrpp_adapter.rst deleted file mode 100644 index 34a81d8..0000000 --- a/Archive/doc_new/source/katabatic.models.ganblrpp/ganblrpp_adapter.rst +++ /dev/null @@ -1,95 +0,0 @@ -GanblrppAdapter Class -===================== - -The `GanblrppAdapter` class adapts the `GANBLRPP` model to the Katabatic Model SPI interface. It facilitates model initialization, data loading, model fitting, and data generation, bridging the gap between Katabatic's requirements and the GANBLR++ model's functionality. - - -Class Properties ----------------- - -- **type** (`str`): - Type of model to use ("discrete" by default). - -- **model**: - Instance of `GANBLRPP` for model operations. - -- **constraints**: - Constraints for the model (not used in this class). - -- **batch_size** (`int`): - Size of batches for training. - -- **epochs** (`int`): - Number of epochs for training. - -- **training_sample_size** (`int`): - Size of the training sample. - -- **numerical_columns** (`list`): - List of indices for numerical columns in the dataset. - -- **random_state**: - Seed for random number generation. - -Class Methods --------------- - -**__init__(model_type="discrete", numerical_columns=None, random_state=None)** - Initializes the `GanblrppAdapter` with model type, numerical columns, and optional random state. - - - **Parameters:** - - `model_type`: str, default="discrete". Type of model to use. - - `numerical_columns`: list, optional. List of indices for numerical columns. - - `random_state`: int, optional. Seed for random number generation. - -**load_model() -> GANBLRPP** - Initializes and loads the `GANBLRPP` model. - - - **Returns:** - The initialized `GANBLRPP` model. - - - **Raises:** - - `ValueError`: If `numerical_columns` is not provided. - - `RuntimeError`: If the model initialization fails. 
- -**load_data(data_pathname) -> pd.DataFrame** - Loads data from a CSV file. - - - **Parameters:** - - `data_pathname`: str. Path to the CSV file containing the data. - - - **Returns:** - Pandas DataFrame containing the loaded data. - - - **Raises:** - - `Exception`: If there is an error loading the data. - -**fit(X_train, y_train, k=0, epochs=10, batch_size=64)** - Fits the `GANBLRPP` model to the training data. - - - **Parameters:** - - `X_train`: pd.DataFrame or np.ndarray. Training data features. - - `y_train`: pd.Series or np.ndarray. Training data labels. - - `k`: int, default=0. Parameter k for the GANBLRPP model. - - `epochs`: int, default=10. Number of training epochs. - - `batch_size`: int, default=64. Size of batches for training. - - - **Returns:** - None - - - **Raises:** - - `RuntimeError`: If the model is not initialized. - - `Exception`: If there is an error during model training. - -**generate(size=None) -> pd.DataFrame** - Generates synthetic data using the `GANBLRPP` model. - - - **Parameters:** - - `size`: int or None. Size of the synthetic data to generate. Defaults to the size of the training data. - - - **Returns:** - Pandas DataFrame containing the generated data. - - - **Raises:** - - `RuntimeError`: If the model is not initialized. - - `Exception`: If there is an error during data generation. diff --git a/Archive/doc_new/source/katabatic.models.ganblrpp/kdb.rst b/Archive/doc_new/source/katabatic.models.ganblrpp/kdb.rst deleted file mode 100644 index be05e96..0000000 --- a/Archive/doc_new/source/katabatic.models.ganblrpp/kdb.rst +++ /dev/null @@ -1,169 +0,0 @@ -KdbHighOrderFeatureEncoder Class -================================ - -The `KdbHighOrderFeatureEncoder` class encodes high-order features using the kDB algorithm. It retrieves dependencies between features, transforms data into high-order features, and optionally returns constraints information. 
- -Defined in `kdb.py` - -Class Properties ----------------- - -- **dependencies_** (`dict`): - A dictionary storing the dependencies between features. - -- **constraints_** (`np.ndarray`): - Constraints information for high-order features. - -- **have_value_idxs_** (`list`): - Indices indicating the presence of values for constraints. - -- **feature_uniques_** (`list`): - List of unique values for each feature. - -- **high_order_feature_uniques_** (`list`): - List of unique values for high-order features. - -- **edges_** (`list`): - List of edges representing the kDB graph. - -- **ohe_** (`OneHotEncoder`): - OneHotEncoder instance for encoding features. - -- **k** (`int`): - Value of `k` for the kDB model. - -Methods -------- - -**__init__()** - Initializes the `KdbHighOrderFeatureEncoder` with default properties. - -**fit(X, y, k=0)** - Fits the `KdbHighOrderFeatureEncoder` to the data and labels. - - - **Parameters:** - - `X`: array_like of shape (n_samples, n_features). Data to fit in the encoder. - - `y`: array_like of shape (n_samples,). Labels to fit in the encoder. - - `k`: int, default=0. `k` value for the kDB model. `k=0` leads to a OneHotEncoder. - - - **Returns:** - - `self`: The fitted encoder. - -**transform(X, return_constraints=False, use_ohe=True)** - Transforms data to high-order features. - - - **Parameters:** - - `X`: array_like of shape (n_samples, n_features). Data to transform. - - `return_constraints`: bool, default=False. Whether to return constraint information. - - `use_ohe`: bool, default=True. Whether to apply one-hot encoding. - - - **Returns:** - - `X_out`: ndarray of shape (n_samples, n_encoded_features). Transformed input. - - If `return_constraints=True`, also returns: - - `constraints`: np.ndarray of constraints. - - `have_value_idxs`: List of boolean arrays indicating presence of values. - -**fit_transform(X, y, k=0, return_constraints=False)** - Fits the encoder and then transforms the data. 
- - - **Parameters:** - - `X`: array_like of shape (n_samples, n_features). Data to fit and transform. - - `y`: array_like of shape (n_samples,). Labels to fit and transform. - - `k`: int, default=0. `k` value for the kDB model. - - `return_constraints`: bool, default=False. Whether to return constraint information. - - - **Returns:** - - `X_out`: ndarray of shape (n_samples, n_encoded_features). Transformed input. - - If `return_constraints=True`, also returns: - - `constraints`: np.ndarray of constraints. - - `have_value_idxs`: List of boolean arrays indicating presence of values. - -Helper Functions ----------------- - -**build_graph(X, y, k=2)** - Constructs a kDB graph based on mutual information. - - - **Parameters:** - - `X`: array_like of shape (n_samples, n_features). Features. - - `y`: array_like of shape (n_samples,). Labels. - - `k`: int, default=2. Maximum number of parent nodes to consider. - - - **Returns:** - - `edges`: List of tuples representing edges in the graph. - -**get_cross_table(*cols, apply_wt=False)** - Computes a cross-tabulation table for the given columns. - - - **Parameters:** - - `*cols`: 1D numpy arrays of integers. Columns to cross-tabulate. - - `apply_wt`: bool, default=False. Whether to apply weights. - - - **Returns:** - - `uniq_vals_all_cols`: Tuple of 1D numpy arrays of unique values for each column. - - `xt`: NumPy array of cross-tabulation results. - -**_get_dependencies_without_y(variables, y_name, kdb_edges)** - Retrieves dependencies of each variable excluding the target variable `y`. - - - **Parameters:** - - `variables`: List of variable names. - - `y_name`: Name of the target variable. - - `kdb_edges`: List of tuples representing edges in the kDB graph. - - - **Returns:** - - `dependencies`: Dictionary of dependencies for each variable. - -**_add_uniform(array, noise=1e-5)** - Adds uniform probability to avoid zero counts in cross-tabulation. - - - **Parameters:** - - `array`: NumPy array of counts. 
- - `noise`: float, default=1e-5. Amount of noise to add. - - - **Returns:** - - `result`: NumPy array with added uniform probability. - -**_normalize_by_column(array)** - Normalizes an array by columns. - - - **Parameters:** - - `array`: NumPy array to normalize. - - - **Returns:** - - `normalized_array`: NumPy array normalized by columns. - -**_smoothing(cct, d)** - Applies smoothing to a cross-count table to handle zero counts. - - - **Parameters:** - - `cct`: NumPy array of cross-count table. - - `d`: int. Dimension of the cross-count table. - - - **Returns:** - - `jpt`: NumPy array of smoothed joint probability table. - -**get_high_order_feature(X, col, evidence_cols, feature_uniques)** - Encodes high-order features given the evidence columns. - - - **Parameters:** - - `X`: array_like of shape (n_samples, n_features). Data. - - `col`: int. Column index of the feature to encode. - - `evidence_cols`: List of indices for evidence columns. - - `feature_uniques`: List of unique values for each feature. - - - **Returns:** - - `high_order_feature`: NumPy array of high-order features. - -**get_high_order_constraints(X, col, evidence_cols, feature_uniques)** - Finds constraints for high-order features. - - - **Parameters:** - - `X`: array_like of shape (n_samples, n_features). Data. - - `col`: int. Column index of the feature. - - `evidence_cols`: List of indices for evidence columns. - - `feature_uniques`: List of unique values for each feature. - - - **Returns:** - - `have_value`: NumPy array of boolean values indicating presence of values. - - `high_order_constraints`: NumPy array of constraint information. 
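The helpers `_add_uniform` and `_normalize_by_column` above compose naturally: add a small uniform mass so no count is zero, then normalize columns into conditional probabilities. A minimal sketch of that composition (the library's `_smoothing` may apply a more elaborate scheme):

```python
import numpy as np

def add_uniform(array, noise=1e-5):
    """Add a small uniform mass so no count is exactly zero."""
    return array + noise

def normalize_by_column(array):
    """Scale each column so it sums to 1, yielding per-column probabilities."""
    return array / array.sum(axis=0, keepdims=True)

counts = np.array([[4.0, 0.0],
                   [1.0, 5.0]])
cpt = normalize_by_column(add_uniform(counts))
print(cpt.round(3))  # each column sums to 1; the zero cell is now a tiny positive mass
```

Without the uniform term, the zero count in the second column would produce a hard zero probability, which is exactly what the smoothing step is meant to avoid.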
diff --git a/Archive/doc_new/source/katabatic.models.ganblrpp/utils.rst b/Archive/doc_new/source/katabatic.models.ganblrpp/utils.rst deleted file mode 100644 index 6e30fac..0000000 --- a/Archive/doc_new/source/katabatic.models.ganblrpp/utils.rst +++ /dev/null @@ -1,158 +0,0 @@ -Utility Classes -=============== - -The utility classes are implemented in the `utility.py` file and provide various support functionalities for model constraints and data preparation. - -Classes --------- - -**softmax_weight** - Constrains weight tensors to be under softmax. - -**DataUtils** - Provides data utilities for preparation before training. - -Functions ---------- - -**elr_loss(KL_LOSS)** - Defines a custom loss function. - - - **Parameters:** - - `KL_LOSS`: The KL loss value. - - - **Returns:** - A custom loss function. - -**KL_loss(prob_fake)** - Calculates the KL loss. - - - **Parameters:** - - `prob_fake`: Probability of the fake data. - - - **Returns:** - The KL loss value. - -**get_lr(input_dim, output_dim, constraint=None, KL_LOSS=0)** - Creates a logistic regression model. - - - **Parameters:** - - `input_dim`: Dimension of the input features. - - `output_dim`: Dimension of the output. - - `constraint`: Optional constraint for the model (default: None). - - `KL_LOSS`: Optional KL loss value (default: 0). - - - **Returns:** - A logistic regression model. - -**sample(*arrays, n=None, frac=None, random_state=None)** - Generates random samples from the given arrays. - - - **Parameters:** - - `arrays`: Arrays to sample from. - - `n`: Number of samples to generate (default: None). - - `frac`: Fraction of samples to generate (default: None). - - `random_state`: Random seed for reproducibility (default: None). - - - **Returns:** - Random samples from the arrays. - -**get_demo_data(name='adult')** - Downloads a demo dataset from the internet. - - - **Parameters:** - - `name`: Name of the dataset to download (default: 'adult'). - - - **Returns:** - The downloaded dataset. 
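The `softmax_weight` constraint described in this file normalizes each feature's block of the weight tensor under softmax. The real class is a Keras weight constraint operating on tensors; the numpy sketch below only illustrates the blockwise normalization it performs, with illustrative names throughout:

```python
import numpy as np

def softmax_constrain(w, feature_idxs):
    """Apply a numerically stable softmax over each (start, end) row block of
    a 2-D weight matrix, mimicking a per-feature softmax constraint."""
    out = np.empty_like(w, dtype=float)
    for start, end in feature_idxs:
        block = w[start:end]
        e = np.exp(block - block.max(axis=0, keepdims=True))  # stable softmax
        out[start:end] = e / e.sum(axis=0, keepdims=True)
    return out

# Two features occupying rows 0-1 and 2-3 of a single-output weight column
w = np.array([[1.0], [2.0], [0.5], [0.5]])
constrained = softmax_constrain(w, feature_idxs=[(0, 2), (2, 4)])
print(constrained.sum(axis=0))  # [2.] — each of the two blocks sums to 1
```

After the constraint, every feature block can be read as a probability distribution over that feature's values, which is what lets the constrained weights act like conditional probability tables.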
- -Classes --------- - -**softmax_weight** - Constrains weight tensors to be under softmax. - - - **Defined in:** `softmax_weight.py` - - - **Properties:** - - **feature_idxs** (`list`): List of tuples indicating the start and end indices for each feature in the weight tensor. - - - **Methods:** - - **__init__(feature_uniques)** - Initializes the constraint with unique feature values. - - - **Parameters:** - - `feature_uniques`: `np.ndarray` or list of int. Unique values for each feature used to compute indices for softmax constraint. - - - **Returns:** - None. - - - **__call__(w)** - Applies the softmax constraint to the weight tensor. - - - **Parameters:** - - `w`: `tf.Tensor`. Weight tensor to which the constraint is applied. - - - **Returns:** - `tf.Tensor`: The constrained weight tensor. - - - **get_config()** - Returns the configuration of the constraint. - - - **Returns:** - `dict`: Configuration dictionary containing `feature_idxs`. - -**DataUtils** - Provides utility functions for data preparation before training. - - - **Defined in:** `data_utils.py` - - - **Properties:** - - **x** (`np.ndarray`): The feature data used for training. - - **y** (`np.ndarray`): The target labels associated with the feature data. - - **data_size** (`int`): Number of samples in the dataset. - - **num_features** (`int`): Number of features in the dataset. - - **num_classes** (`int`): Number of unique classes in the target labels. - - **class_counts** (`np.ndarray`): Counts of each class in the target labels. - - **feature_uniques** (`list`): List of unique values for each feature. - - **constraint_positions** (`np.ndarray` or `None`): Positions of constraints for high-order features. - - **_kdbe** (`KdbHighOrderFeatureEncoder` or `None`): Instance of the `KdbHighOrderFeatureEncoder` for feature encoding. - - **__kdbe_x** (`np.ndarray` or `None`): Transformed feature data after applying kDB encoding. 
- - - **Methods:** - - **__init__(x, y)** - Initializes the `DataUtils` with the provided feature data and target labels. - - - **Parameters:** - - `x`: `np.ndarray`. Feature data. - - `y`: `np.ndarray`. Target labels. - - - **Returns:** - None. - - - **get_categories(idxs=None)** - Retrieves categories for encoded features. - - - **Parameters:** - - `idxs`: list of int or `None`. Indices of features to retrieve categories for. If `None`, retrieves categories for all features. - - - **Returns:** - `list`: List of categories for the specified features. - - - **get_kdbe_x(k=0, dense_format=True)** - Transforms feature data into high-order features using kDB encoding. - - - **Parameters:** - - `k`: int, default=0. `k` value for the kDB model. - - `dense_format`: bool, default=True. Whether to return the transformed data in dense format. - - - **Returns:** - `np.ndarray`: Transformed feature data. - If `dense_format=True`, returns a dense NumPy array. - Also updates `constraint_positions` with the positions of constraints. - - - **clear()** - Clears the kDB encoder and transformed data. - - - **Returns:** - None. diff --git a/Archive/doc_new/source/katabatic.models.tablegan/tablegan.rst b/Archive/doc_new/source/katabatic.models.tablegan/tablegan.rst deleted file mode 100644 index 1fff5a5..0000000 --- a/Archive/doc_new/source/katabatic.models.tablegan/tablegan.rst +++ /dev/null @@ -1,89 +0,0 @@ -TableGAN Class -============== - -This class implements a Generative Adversarial Network (GAN) for tabular data, specifically designed to handle structured data with a classifier that enforces feature-label relationships. - -.. class:: TableGAN - - :param input_dim: Integer, the dimension of the input features. - :param label_dim: Integer, the dimension of the output labels. - :param z_dim: Integer, the dimension of the noise vector used for generating synthetic data (default: 100). 
- :param delta_mean: Float, the threshold for the mean feature differences during generator training (default: 0.1). - :param delta_sd: Float, the threshold for the standard deviation differences during generator training (default: 0.1). - - **Example**:: - - import tensorflow as tf - from tensorflow.keras import layers - import numpy as np - - # Create TableGAN model instance - gan = TableGAN(input_dim=32, label_dim=10) - - # Train on data - gan.fit(x_train, y_train, epochs=100) - - # Generate synthetic samples - generated_data, generated_labels = gan.sample(n_samples=1000) - - **Methods** - - .. method:: _build_generator(self) - - Constructs the generator model, which produces synthetic data from random noise. - - :return: A Keras Sequential model representing the generator. - - .. method:: _build_discriminator(self) - - Constructs the discriminator model, which evaluates the authenticity of the data and extracts features. - - :return: A Keras Model that outputs the real/fake classification and extracted features. - - .. method:: _build_classifier(self) - - Constructs the classifier model, which predicts labels for the generated data. - - :return: A Keras Sequential model that classifies the input data. - - .. method:: wasserstein_loss(self, y_true, y_pred) - - Implements the Wasserstein loss function for training the discriminator. - - :param y_true: Tensor, the true labels (real/fake). - :param y_pred: Tensor, the predicted labels. - :return: Tensor, the computed Wasserstein loss. - - .. method:: gradient_penalty(self, real, fake) - - Computes the gradient penalty term for enforcing the Lipschitz constraint in Wasserstein GANs. - - :param real: Tensor, real data samples. - :param fake: Tensor, fake data samples generated by the generator. - :return: Tensor, the computed gradient penalty. - - .. method:: train_step(self, real_data, real_labels) - - Performs a single training step for the generator, discriminator, and classifier, using real and synthetic data. 
- - :param real_data: Tensor, the real data samples. - :param real_labels: Tensor, the true labels for the real data samples. - :return: Tuple of three values representing the discriminator, generator, and classifier losses, respectively. - - .. method:: fit(self, x, y, batch_size=64, epochs=100, verbose=1) - - Trains the GAN model on the provided data. - - :param x: Tensor, input data for training. - :param y: Tensor, labels for the training data. - :param batch_size: Integer, the size of each training batch (default: 64). - :param epochs: Integer, the number of epochs to train the model (default: 100). - :param verbose: Integer, the verbosity mode for logging during training (default: 1). - :return: The fitted TableGAN model. - - .. method:: sample(self, n_samples) - - Generates synthetic data and corresponding labels using the trained generator and classifier. - - :param n_samples: Integer, the number of synthetic data samples to generate. - :return: Tuple of numpy arrays, containing the generated data and labels. diff --git a/Archive/doc_new/source/katabatic.models.tablegan/tablegan_adapter.rst b/Archive/doc_new/source/katabatic.models.tablegan/tablegan_adapter.rst deleted file mode 100644 index c7b8812..0000000 --- a/Archive/doc_new/source/katabatic.models.tablegan/tablegan_adapter.rst +++ /dev/null @@ -1,72 +0,0 @@ -TableGANAdapter -=============== - -The `TableGANAdapter` class serves as an adapter for interfacing with the `TableGAN` model. It extends the `KatabaticModelSPI` and allows for loading, fitting, and generating synthetic data using the TableGAN model. This adapter includes functionality to handle privacy settings, data preprocessing, and model training. - -Class Structure ---------------- - -.. autoclass:: TableGANAdapter - :members: - :show-inheritance: - -Attributes ----------- - -- **type** (`str`): Defines the type of data handled by the adapter (default: `'continuous'`). 
-- **privacy_setting** (`str`): Sets the privacy level of the model ('low', 'medium', or 'high'). -- **constraints** (`NoneType`): Currently not in use but reserved for future constraints settings. -- **batch_size** (`int`): Defines the batch size for training (default: `64`). -- **epochs** (`int`): Number of training epochs (default: `100`). -- **model** (`TableGAN`): Instance of the TableGAN model. -- **scaler** (`StandardScaler`): Scaler used for preprocessing continuous data. -- **label_encoder** (`LabelEncoder`): Encoder used for processing categorical labels. -- **input_dim** (`int`): Input dimensionality of the data. -- **label_dim** (`int`): Label dimensionality of the data. -- **training_sample_size** (`int`): Number of samples used during training. - -Methods -------- - -- **__init__(self, type='continuous', privacy_setting='low')** - - Initializes the `TableGANAdapter` class, setting parameters such as data type, privacy level, and default model parameters. - -- **load_model(self)** - - Loads and initializes the `TableGAN` model based on the input and label dimensions. Adjusts privacy parameters (`delta_mean`, `delta_sd`) according to the specified `privacy_setting`. - -- **load_data(self, data_pathname)** - - Loads training data from the specified `data_pathname`. Handles CSV files and returns the data as a Pandas DataFrame. - -- **fit(self, X_train, y_train, epochs=None, batch_size=None)** - - Trains the `TableGAN` model on the provided training data (`X_train`, `y_train`). Preprocesses the data, sets input and label dimensions, and optionally overrides the number of epochs and batch size. - -- **generate(self, size=None)** - - Generates synthetic data using the trained `TableGAN` model. If the model is not trained, raises an error. The size of generated data defaults to the training sample size unless otherwise specified. 
- -Usage Example -------------- - -Below is a usage example that shows how to initialize and use the `TableGANAdapter` class for training and generating synthetic data: - -.. code-block:: python - - from katabatic.models import TableGANAdapter - - # Initialize the adapter with medium privacy - adapter = TableGANAdapter(type='continuous', privacy_setting='medium') - - # Load data - data = adapter.load_data('training_data.csv') - - # Preprocess and fit the model - X_train, y_train = data.iloc[:, :-1], data.iloc[:, -1] - adapter.fit(X_train, y_train) - - # Generate synthetic data - synthetic_data = adapter.generate(size=1000) - diff --git a/Archive/doc_new/source/katabatic.models.tablegan/tablegan_utils.rst b/Archive/doc_new/source/katabatic.models.tablegan/tablegan_utils.rst deleted file mode 100644 index a9fd49c..0000000 --- a/Archive/doc_new/source/katabatic.models.tablegan/tablegan_utils.rst +++ /dev/null @@ -1,74 +0,0 @@ -Utility Functions -================= - -The utility functions are implemented in the `utils.py` file and provide various support functions for data preparation, model constraints, and logistic regression operations. - -Classes -------- - -- **softmax_weight** - - A class that constrains weight tensors to be normalized using the softmax function, ensuring that the values sum up to 1. - -- **DataUtils** - - A utility class that provides helper functions for preparing data before training machine learning models. - -Functions ---------- - -- **elr_loss(KL_LOSS)** - - Defines a custom loss function that integrates the KL loss into the overall loss calculation for model training. - - - **Parameters:** - - `KL_LOSS` (`float`): The Kullback-Leibler (KL) loss value that is integrated into the loss function. - - - **Returns:** - A custom loss function that can be used in training models. 
- -- **KL_loss(prob_fake)** - - Calculates the Kullback-Leibler (KL) divergence loss, which measures how one probability distribution diverges from a second, expected distribution. - - - **Parameters:** - - `prob_fake` (`numpy.array`): The probability distribution of the fake (generated) data. - - - **Returns:** - The calculated KL loss value. - -- **get_lr(input_dim, output_dim, constraint=None, KL_LOSS=0)** - - Creates and returns a logistic regression model, with optional constraints and KL loss integration. - - - **Parameters:** - - `input_dim` (`int`): The dimension of the input features. - - `output_dim` (`int`): The dimension of the output labels. - - `constraint` (`callable`, optional): A constraint function applied to the logistic regression model weights (default: None). - - `KL_LOSS` (`float`, optional): The Kullback-Leibler loss value to be incorporated into the model (default: 0). - - - **Returns:** - A logistic regression model object. - -- **sample(*arrays, n=None, frac=None, random_state=None)** - - Generates random samples from the provided arrays, either by specifying the number of samples or the fraction of the arrays to sample. - - - **Parameters:** - - `arrays` (`list`): The input arrays to sample from. - - `n` (`int`, optional): The number of samples to generate (default: None). - - `frac` (`float`, optional): The fraction of the arrays to sample (default: None). - - `random_state` (`int`, optional): A random seed to ensure reproducibility (default: None). - - - **Returns:** - Random samples from the provided arrays. - -- **get_demo_data(name='adult')** - - Downloads and returns a demo dataset from the internet. By default, it downloads the 'adult' dataset, but other dataset names can be specified. - - - **Parameters:** - - `name` (`str`, optional): The name of the dataset to download (default: 'adult'). - - - **Returns:** - The downloaded dataset as a Pandas DataFrame. 
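Two of the helpers documented above can be sketched in a few lines of NumPy. This is an illustrative reading of the documented signatures rather than the library's exact implementation; in the GANBLR training loop the KL-style generator loss is computed as ``mean(-log(1 - prob_fake))``:

```python
import numpy as np

def kl_loss(prob_fake):
    # Generator loss derived from the discriminator's probability that
    # generated samples are real, as used in the GANBLR training loop.
    return np.mean(-np.log(1.0 - prob_fake))

def sample(*arrays, n=None, frac=None, random_state=None):
    # Draw the same random row subset from every array, by count (n)
    # or by fraction (frac), optionally seeded for reproducibility.
    rng = np.random.default_rng(random_state)
    length = len(arrays[0])
    size = n if n is not None else int(length * frac)
    idx = rng.choice(length, size=size, replace=False)
    picked = tuple(a[idx] for a in arrays)
    return picked if len(picked) > 1 else picked[0]
```

For example, ``sample(X, y, frac=0.8)`` returns row-aligned 80% subsets of ``X`` and ``y``, mirroring how the discriminator's training batch is drawn.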
diff --git a/Archive/doc_new/source/modules.rst b/Archive/doc_new/source/modules.rst deleted file mode 100644 index b2cb395..0000000 --- a/Archive/doc_new/source/modules.rst +++ /dev/null @@ -1,7 +0,0 @@ -Katabatic -========= - -.. toctree:: - :maxdepth: 4 - - Katabatic diff --git a/Archive/docker-compose.Yaml b/Archive/docker-compose.Yaml deleted file mode 100644 index 7e60205..0000000 --- a/Archive/docker-compose.Yaml +++ /dev/null @@ -1,18 +0,0 @@ -version: '3' -services: - web: - build: . - ports: - - "5000:5000" - volumes: - - .:/app - depends_on: - - db - db: - image: postgres:13 - environment: - POSTGRES_USER: user - POSTGRES_PASSWORD: password - POSTGRES_DB: mydatabase - ports: - - "5432:5432" diff --git a/Archive/ganblrplusplus-docker/2024-09-16 21-48-46.mkv b/Archive/ganblrplusplus-docker/2024-09-16 21-48-46.mkv deleted file mode 100644 index bc617b3..0000000 Binary files a/Archive/ganblrplusplus-docker/2024-09-16 21-48-46.mkv and /dev/null differ diff --git a/Archive/ganblrplusplus-docker/Dockerfile b/Archive/ganblrplusplus-docker/Dockerfile deleted file mode 100644 index d31901b..0000000 --- a/Archive/ganblrplusplus-docker/Dockerfile +++ /dev/null @@ -1,15 +0,0 @@ -# Base image: Python 3.9 -FROM python:3.9-slim - -# Set the working directory inside the container -WORKDIR /app - -# Copy all files from the current directory to /app in the container -COPY . 
/app - -# Install Python dependencies from requirements.txt -RUN pip install --upgrade pip -RUN pip install -r requirements.txt - -# Command to run the GANBLR++ model script -CMD ["python", "ganblr++.py"] diff --git a/Archive/ganblrplusplus-docker/README.md b/Archive/ganblrplusplus-docker/README.md deleted file mode 100644 index 412f7e2..0000000 --- a/Archive/ganblrplusplus-docker/README.md +++ /dev/null @@ -1,71 +0,0 @@ -# GANBLR++ Dockerized Implementation Report - -## Overview - -This report documents the Dockerization of **GANBLR++**, a generative adversarial network (GAN)-based model developed to generate synthetic tabular data. This implementation ensures that the model can run within a reproducible Docker container, encapsulating all dependencies and environment settings, allowing for seamless execution across various systems. GANBLR++ builds upon the original GANBLR model to improve the generation of high-quality synthetic tabular data. - -By packaging GANBLR++ into a Docker container, we achieve: -- **Consistency** across various platforms. -- **Ease of setup** by bundling dependencies. -- **Improved deployment** via Docker for simplified running of the model. - ---- - -## Table of Contents - -- [Overview](#overview) -- [Features](#features) -- [Requirements](#requirements) -- [Setup Instructions](#setup-instructions) - - [1. Clone the Repository](#1-clone-the-repository) - - [2. Build the Docker Image](#2-build-the-docker-image) - - [3. Run the Docker Container](#3-run-the-docker-container) - - [4. Access the Container](#4-access-the-container) - - -## Features - -- **Reproducibility:** The project runs inside a Docker container, ensuring consistency across environments. -- **Portability:** The Docker image can be built and deployed on any system supporting Docker. -- **Simplified Setup:** No need to manually install dependencies; Docker handles everything. -- **Customizable:** Easy to modify the code, models, or data inside the container. 
-- **Flexibility:** GANBLR++ can generate synthetic data for a wide range of tabular datasets. - ---- - -## Requirements - -- **Docker:** Ensure Docker is installed and running on your system. - - [Install Docker](https://docs.docker.com/get-docker/) -- **Git:** Used for cloning the repository. - - [Install Git](https://git-scm.com/book/en/v2/Getting-Started-Installing-Git) - ---- - -## Setup Instructions - -### 1. Clone the Repository - -Clone the repository to your local machine using the following command: - -```bash -git clone https://github.com/your-username/ganblrplusplus-docker.git -cd ganblrplusplus-docker - -2. Build the Docker Image -In the project directory, build the Docker image using the Dockerfile provided: - -docker build -t ganblrplusplus:latest . - -This will create a Docker image named ganblrplusplus with the latest version of the model and its dependencies. - -3. Run the Docker Container -After building the Docker image, you can run the container with the following command: - -docker run --name ganblrplusplus-container ganblrplusplus:latest - -4. 
Access the Container -To access the container (for debugging or interaction), run: - -docker exec -it ganblrplusplus-container /bin/bash - diff --git a/Archive/ganblrplusplus-docker/ganblr++.py b/Archive/ganblrplusplus-docker/ganblr++.py deleted file mode 100644 index 06589bc..0000000 --- a/Archive/ganblrplusplus-docker/ganblr++.py +++ /dev/null @@ -1,38 +0,0 @@ -import numpy as np -import pandas as pd -from sklearn.model_selection import train_test_split -from tensorflow.keras import layers, models - -# Define the GANBLR++ model (simplified example) -class GANBLRPlusPlus: - def __init__(self, input_dim, latent_dim): - self.input_dim = input_dim - self.latent_dim = latent_dim - self.generator = self.build_generator() - self.discriminator = self.build_discriminator() - - def build_generator(self): - model = models.Sequential() - model.add(layers.Dense(128, input_dim=self.latent_dim, activation='relu')) - model.add(layers.Dense(self.input_dim, activation='sigmoid')) - return model - - def build_discriminator(self): - model = models.Sequential() - model.add(layers.Dense(128, input_dim=self.input_dim, activation='relu')) - model.add(layers.Dense(1, activation='sigmoid')) - return model - - def train(self, data, epochs=1000, batch_size=32): - for epoch in range(epochs): - # Simplified training loop for GANBLR++ - pass - -# Generate some synthetic data (this is just an example) -def generate_synthetic_data(): - real_data = np.random.rand(1000, 10) - gan = GANBLRPlusPlus(input_dim=10, latent_dim=20) - gan.train(real_data) - -if __name__ == "__main__": - generate_synthetic_data() diff --git a/Archive/ganblrpp/__init__.py b/Archive/ganblrpp/__init__.py deleted file mode 100644 index 6905489..0000000 --- a/Archive/ganblrpp/__init__.py +++ /dev/null @@ -1,12 +0,0 @@ -"""Top-level package for Ganblr.""" - -__author__ = """Tulip Lab""" -__email__ = 'jhzhou@tuliplab.academy' -__version__ = '0.1.0' - -from .kdb import KdbHighOrderFeatureEncoder -from .utils import get_demo_data - 
-__all__ = ['models', 'KdbHighOrderFeatureEncoder', 'get_demo_data'] - - diff --git a/Archive/ganblrpp/ganblr.py b/Archive/ganblrpp/ganblr.py deleted file mode 100644 index a699618..0000000 --- a/Archive/ganblrpp/ganblr.py +++ /dev/null @@ -1,244 +0,0 @@ -from .kdb import * -from .kdb import _add_uniform -from .utils import * -from pgmpy.models import BayesianNetwork -from pgmpy.sampling import BayesianModelSampling -from pgmpy.factors.discrete import TabularCPD -from sklearn.preprocessing import OrdinalEncoder, LabelEncoder -import numpy as np -import tensorflow as tf - -class GANBLR: - """ - The GANBLR Model. - """ - def __init__(self) -> None: - self._d = None - self.__gen_weights = None - self.batch_size = None - self.epochs = None - self.k = None - self.constraints = None - self._ordinal_encoder = OrdinalEncoder(dtype=int, handle_unknown='use_encoded_value', unknown_value=-1) - self._label_encoder = LabelEncoder() - - def fit(self, x, y, k=0, batch_size=32, epochs=10, warmup_epochs=1, verbose=1): - ''' - Fit the model to the given data. - - Parameters - ---------- - x : array_like of shape (n_samples, n_features) - Dataset to fit the model. The data should be discrete. - - y : array_like of shape (n_samples,) - Label of the dataset. - - k : int, default=0 - Parameter k of the GANBLR model. Must be non-negative. No more than 2 is suggested. - - batch_size : int, default=32 - Size of the batch to feed the model at each step. - - epochs : int, default=10 - Number of epochs to use during training. - - warmup_epochs : int, default=1 - Number of epochs to use in the warmup phase. - - verbose : int, default=1 - Whether to output the log. Use 1 for log output and 0 for complete silence. - - Returns - ------- - self : object - Fitted model.
- ''' - if verbose is None or not isinstance(verbose, int): - verbose = 1 - x = self._ordinal_encoder.fit_transform(x) - y = self._label_encoder.fit_transform(y).astype(int) - d = DataUtils(x, y) - self._d = d - self.k = k - self.batch_size = batch_size - if verbose: - print(f"warmup run:") - history = self._warmup_run(warmup_epochs, verbose=verbose) - syn_data = self._sample(verbose=0) - discriminator_label = np.hstack([np.ones(d.data_size), np.zeros(d.data_size)]) - for i in range(epochs): - discriminator_input = np.vstack([x, syn_data[:,:-1]]) - disc_input, disc_label = sample(discriminator_input, discriminator_label, frac=0.8) - disc = self._discrim() - d_history = disc.fit(disc_input, disc_label, batch_size=batch_size, epochs=1, verbose=0).history - prob_fake = disc.predict(x, verbose=0) - ls = np.mean(-np.log(np.subtract(1, prob_fake))) - g_history = self._run_generator(loss=ls).history - syn_data = self._sample(verbose=0) - - if verbose: - print(f"Epoch {i+1}/{epochs}: G_loss = {g_history['loss'][0]:.6f}, G_accuracy = {g_history['accuracy'][0]:.6f}, D_loss = {d_history['loss'][0]:.6f}, D_accuracy = {d_history['accuracy'][0]:.6f}") - return self - - def evaluate(self, x, y, model='lr') -> float: - """ - Perform a TSTR(Training on Synthetic data, Testing on Real data) evaluation. - - Parameters - ---------- - x, y : array_like - Test dataset. - - model : str or object - The model used for evaluate. Should be one of ['lr', 'mlp', 'rf'], or a model class that have sklearn-style `fit` and `predict` method. - - Return: - -------- - accuracy_score : float. 
- - """ - from sklearn.linear_model import LogisticRegression - from sklearn.neural_network import MLPClassifier - from sklearn.ensemble import RandomForestClassifier - from sklearn.preprocessing import OneHotEncoder - from sklearn.pipeline import Pipeline - from sklearn.metrics import accuracy_score - - eval_model = None - models = dict( - lr=LogisticRegression, - rf=RandomForestClassifier, - mlp=MLPClassifier - ) - if model in models.keys(): - eval_model = models[model]() - elif hasattr(model, 'fit') and hasattr(model, 'predict'): - eval_model = model - else: - raise Exception("Invalid Arugument `model`, Should be one of ['lr', 'mlp', 'rf'], or a model class that have sklearn-style `fit` and `predict` method.") - - synthetic_data = self._sample() - synthetic_x, synthetic_y = synthetic_data[:,:-1], synthetic_data[:,-1] - x_test = self._ordinal_encoder.transform(x) - y_test = self._label_encoder.transform(y) - - categories = self._d.get_categories() - pipline = Pipeline([('encoder', OneHotEncoder(categories=categories, handle_unknown='ignore')), ('model', eval_model)]) - pipline.fit(synthetic_x, synthetic_y) - pred = pipline.predict(x_test) - return accuracy_score(y_test, pred) - - def sample(self, size=None, verbose=1) -> np.ndarray: - """ - Generate synthetic data. - - Parameters - ---------- - size : int or None - Size of the data to be generated. set to `None` to make the size equal to the size of the training set. - - verbose : int, default=1 - Whether to output the log. Use 1 for log output and 0 for complete silence. - - Return: - ----------------- - synthetic_samples : np.ndarray - Generated synthetic data. 
- """ - ordinal_data = self._sample(size, verbose) - origin_x = self._ordinal_encoder.inverse_transform(ordinal_data[:,:-1]) - origin_y = self._label_encoder.inverse_transform(ordinal_data[:,-1]).reshape(-1,1) - return np.hstack([origin_x, origin_y]) - - def _sample(self, size=None, verbose=1) -> np.ndarray: - """ - Generate synthetic data in ordinal encoding format - """ - if verbose is None or not isinstance(verbose, int): - verbose = 1 - #basic varibles - d = self._d - feature_cards = np.array(d.feature_uniques) - #ensure sum of each constraint group equals to 1, then re concat the probs - _idxs = np.cumsum([0] + d._kdbe.constraints_.tolist()) - constraint_idxs = [(_idxs[i],_idxs[i+1]) for i in range(len(_idxs)-1)] - - probs = np.exp(self.__gen_weights[0]) - cpd_probs = [probs[start:end,:] for start, end in constraint_idxs] - cpd_probs = np.vstack([p/p.sum(axis=0) for p in cpd_probs]) - - #assign the probs to the full cpd tables - idxs = np.cumsum([0] + d._kdbe.high_order_feature_uniques_) - feature_idxs = [(idxs[i],idxs[i+1]) for i in range(len(idxs)-1)] - have_value_idxs = d._kdbe.have_value_idxs_ - full_cpd_probs = [] - for have_value, (start, end) in zip(have_value_idxs, feature_idxs): - #(n_high_order_feature_uniques, n_classes) - cpd_prob_ = cpd_probs[start:end,:] - #(n_all_combination) Note: the order is (*parent, variable) - have_value_ravel = have_value.ravel() - #(n_classes * n_all_combination) - have_value_ravel_repeat = np.hstack([have_value_ravel] * d.num_classes) - #(n_classes * n_all_combination) <- (n_classes * n_high_order_feature_uniques) - full_cpd_prob_ravel = np.zeros_like(have_value_ravel_repeat, dtype=float) - full_cpd_prob_ravel[have_value_ravel_repeat] = cpd_prob_.T.ravel() - #(n_classes * n_parent_combinations, n_variable_unique) - full_cpd_prob = full_cpd_prob_ravel.reshape(-1, have_value.shape[-1]).T - full_cpd_prob = _add_uniform(full_cpd_prob, noise=0) - full_cpd_probs.append(full_cpd_prob) - - #prepare node and edge names - 
node_names = [str(i) for i in range(d.num_features + 1)] - edge_names = [(str(i), str(j)) for i,j in d._kdbe.edges_] - y_name = node_names[-1] - - #create TabularCPD objects - evidences = d._kdbe.dependencies_ - feature_cpds = [ - TabularCPD(str(name), feature_cards[name], table, - evidence=[y_name, *[str(e) for e in evidences]], - evidence_card=[d.num_classes, *feature_cards[evidences].tolist()]) - for (name, evidences), table in zip(evidences.items(), full_cpd_probs) - ] - y_probs = (d.class_counts/d.data_size).reshape(-1,1) - y_cpd = TabularCPD(y_name, d.num_classes, y_probs) - - #create kDB model, then sample the data - model = BayesianNetwork(edge_names) - model.add_cpds(y_cpd, *feature_cpds) - sample_size = d.data_size if size is None else size - result = BayesianModelSampling(model).forward_sample(size=sample_size, show_progress = verbose > 0) - sorted_result = result[node_names].values - - return sorted_result - - def _warmup_run(self, epochs, verbose=None): - d = self._d - tf.keras.backend.clear_session() - ohex = d.get_kdbe_x(self.k) - self.constraints = softmax_weight(d.constraint_positions) - elr = get_lr(ohex.shape[1], d.num_classes, self.constraints) - history = elr.fit(ohex, d.y, batch_size=self.batch_size, epochs=epochs, verbose=verbose) - self.__gen_weights = elr.get_weights() - tf.keras.backend.clear_session() - return history - - def _run_generator(self, loss): - d = self._d - ohex = d.get_kdbe_x(self.k) - tf.keras.backend.clear_session() - model = tf.keras.Sequential() - model.add(tf.keras.layers.Dense(d.num_classes, input_dim=ohex.shape[1], activation='softmax',kernel_constraint=self.constraints)) - model.compile(loss=elr_loss(loss), optimizer='adam', metrics=['accuracy']) - model.set_weights(self.__gen_weights) - history = model.fit(ohex, d.y, batch_size=self.batch_size,epochs=1, verbose=0) - self.__gen_weights = model.get_weights() - tf.keras.backend.clear_session() - return history - - def _discrim(self): - model = tf.keras.Sequential() - 
model.add(tf.keras.layers.Dense(1, input_dim=self._d.num_features, activation='sigmoid')) - model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy']) - return model \ No newline at end of file diff --git a/Archive/ganblrpp/ganblrpp.py b/Archive/ganblrpp/ganblrpp.py deleted file mode 100644 index a23d0f0..0000000 --- a/Archive/ganblrpp/ganblrpp.py +++ /dev/null @@ -1,315 +0,0 @@ -from .ganblr import GANBLR -from sklearn.mixture import BayesianGaussianMixture -from sklearn.preprocessing import MinMaxScaler, LabelEncoder, OrdinalEncoder -from scipy.stats import truncnorm -import numpy as np - -class DMMDiscritizer: - def __init__(self, random_state): - self.__dmm_params = dict(weight_concentration_prior_type="dirichlet_process", - n_components=2 * 5, reg_covar=0, init_params='random', - max_iter=1500, mean_precision_prior=.1, - random_state=random_state) - - self.__scaler = MinMaxScaler() - self.__dmms = [] - self.__arr_mu = [] - self.__arr_sigma = [] - self._random_state = random_state - - def fit(self, x): - """ - Perform DMM discretization. - - Parameter: - --------- - x (2d np.ndarray): data to be discretized. Must be numeric data.
- - Return: - ---------- - self - - """ - assert(isinstance(x, np.ndarray)) - assert(len(x.shape) == 2) - - x_scaled = self.__scaler.fit_transform(x) - self.__internal_fit(x_scaled) - return self - - def transform(self, x) -> np.ndarray: - x = self.__scaler.transform(x) - arr_modes = [] - for i, dmm in enumerate(self.__dmms): - modes = dmm.predict(x[:,i:i+1]) - modes = LabelEncoder().fit_transform(modes)#.astype(int) - arr_modes.append(modes) - return self.__internal_transform(x, arr_modes) - - def fit_transform(self, x) -> np.ndarray: - assert(isinstance(x, np.ndarray)) - assert(len(x.shape) == 2) - - x_scaled = self.__scaler.fit_transform(x) - arr_modes = self.__internal_fit(x_scaled) - return self.__internal_transform(x_scaled, arr_modes) - - def __internal_fit(self, x): - self.__dmms.clear() - self.__arr_mu.clear() - self.__arr_sigma.clear() - - arr_mode = [] - for i in range(x.shape[1]): - cur_column = x[:,i:i+1] - dmm = BayesianGaussianMixture(**self.__dmm_params) - y = dmm.fit_predict(cur_column) - lbe = LabelEncoder().fit(y) - mu = dmm.means_[:len(lbe.classes_)] - sigma = np.sqrt(dmm.covariances_[:len(lbe.classes_)]) - - arr_mode.append(lbe.transform(y))#.astype(int)) - #self.__arr_lbes.append(lbe) - self.__dmms.append(dmm) - self.__arr_mu.append(mu.ravel()) - self.__arr_sigma.append(sigma.ravel()) - return arr_mode - - def __internal_transform(self, x, arr_modes): - _and = np.logical_and - _not = np.logical_not - - discretized_data = [] - for i, (modes, mu, sigma) in enumerate(zip( - arr_modes, - self.__arr_mu, - self.__arr_sigma)): - - cur_column = x[:,i] - cur_mu = mu[modes] - cur_sigma = sigma[modes] - x_std = cur_column - cur_mu - - less_than_n3sigma = (x_std <= -3*cur_sigma) - less_than_n2sigma = (x_std <= -2*cur_sigma) - less_than_n1sigma = (x_std <= -cur_sigma) - less_than_0 = (x_std <= 0) - less_than_1sigma = (x_std <= cur_sigma) - less_than_2sigma = (x_std <= 2*cur_sigma) - less_than_3sigma = (x_std <= 3*cur_sigma) - - discretized_x = 8 * modes - 
discretized_x[_and(_not(less_than_n3sigma), less_than_n2sigma)] += 1 - discretized_x[_and(_not(less_than_n2sigma), less_than_n1sigma)] += 2 - discretized_x[_and(_not(less_than_n1sigma), less_than_0)] += 3 - discretized_x[_and(_not(less_than_0) , less_than_1sigma)] += 4 - discretized_x[_and(_not(less_than_1sigma) , less_than_2sigma)] += 5 - discretized_x[_and(_not(less_than_2sigma) , less_than_3sigma)] += 6 - discretized_x[_not(less_than_3sigma)] += 7 - discretized_data.append(discretized_x.reshape(-1,1)) - - return np.hstack(discretized_data) - - def inverse_transform(self, x, verbose=1) -> np.ndarray: - x_modes = x // 8 - x_bins = x % 8 - - def __sample_one_column(i, mu, sigma): - cur_column_modes = x_modes[:,i] - cur_column_bins = x_bins[:,i] - cur_column_mode_uniques = np.unique(cur_column_modes) - inversed_x = np.zeros_like(cur_column_modes, dtype=float) - - for mode in cur_column_mode_uniques: - cur_mode_idx = cur_column_modes == mode - cur_mode_mu = mu[mode] - cur_mode_sigma = sigma[mode] - - sample_results = self.__sample_from_truncnorm(cur_column_bins[cur_mode_idx], cur_mode_mu, cur_mode_sigma, random_state=self._random_state) - inversed_x[cur_mode_idx] = sample_results - - return inversed_x.reshape(-1,1) - - if verbose: - from tqdm import tqdm - _progress_wrapper = lambda iterable: tqdm(iterable, desc='sampling', total=len(self.__arr_mu)) - else: - _progress_wrapper = lambda iterable: iterable - inversed_data = np.hstack([__sample_one_column(i, mu, sigma) - for i, (mu, sigma) in _progress_wrapper(enumerate(zip(self.__arr_mu, self.__arr_sigma)))]) - - #the sampling progress is fast enough so there is no need for parallelization - #inversed_data = np.hstack(Parallel(n_jobs=n_jobs, verbose=verbose)(delayed(__sample_one_column)(i, mu, sigma) - # for i, (mu, sigma) in enumerate(zip(self.__arr_mu, self.__arr_sigma)))) - return self.__scaler.inverse_transform(inversed_data) - - @staticmethod - def __sample_from_truncnorm(bins, mu, sigma, random_state=None): - 
sampled_results = np.zeros_like(bins, dtype=float) - def __sampling(idx, range_min, range_max): - sampling_size = np.sum(idx) - if sampling_size != 0: - sampled_results[idx] = truncnorm.rvs(range_min, range_max, loc=mu, scale=sigma, size=sampling_size, random_state=random_state) - - #shape param (min, max) of scipy.stats.truncnorm.rvs are still defined with respect to the standard normal - __sampling(bins == 0, np.NINF, -3) - __sampling(bins == 1, -3, -2) - __sampling(bins == 2, -2, -1) - __sampling(bins == 3, -1, 0) - __sampling(bins == 4, 0, 1) - __sampling(bins == 5, 1, 2) - __sampling(bins == 6, 2, 3) - __sampling(bins == 7, 3, np.inf) - return sampled_results - -class GANBLRPP: - """ - The GANBLR++ model. - - Parameters - ---------- - numerical_columns : list of int - Indicating the indexes of numerical columns. - For example, if the 3, 5, 10th feature of a data is numerical feature, then this param should be [3, 5, 10]. - - random_state : int, RandomState instance or None - Controls the random seed given to the method chosen to initialize the parameters of `BayesianGaussianMixture` used by `GANBLRPP`. - """ - def __init__(self, numerical_columns, random_state=None): - self.__discritizer = DMMDiscritizer(random_state) - self.__ganblr = GANBLR() - self._numerical_columns = numerical_columns - pass - - def fit(self, x, y, k=0, batch_size=32, epochs=10, warmup_epochs=1, verbose=1): - ''' - Fit the model to the given data. - - Parameters - ---------- - x : array_like of shape (n_samples, n_features) - Dataset to fit the model. The data should be discrete. - - y : array_like of shape (n_samples,) - Label of the dataset. - - k : int, default=0 - Parameter k of ganblr model. Must be greater than 0. No more than 2 is Suggested. - - batch_size : int, default=32 - Size of the batch to feed the model at each step. - - epochs : int, default=0 - Number of epochs to use during training. - - warmup_epochs : int, default=1 - Number of epochs to use in warmup phase. 
Defaults to :attr:`1`. - - verbose : int, default=1 - Whether to output the log. Use 1 for log output and 0 for complete silence. - - Returns - ------- - self : object - Fitted model. - ''' - numerical_columns = self._numerical_columns - x[:,numerical_columns] = self.__discritizer.fit_transform(x[:,numerical_columns]) - return self.__ganblr.fit(x, y, k, batch_size, epochs, warmup_epochs, verbose) - - def sample(self, size=None, verbose=1): - """ - Generate synthetic data. - - Parameters - ---------- - size : int or None - Size of the data to be generated. set to `None` to make the size equal to the size of the training set. - - verbose : int, default=1 - Whether to output the log. Use 1 for log output and 0 for complete silence. - - Return: - ----------------- - synthetic_samples : np.ndarray - Generated synthetic data. - """ - if verbose: - print('Step 1/2: Sampling discrete data from GANBLR.') - ordinal_data = self.__ganblr._sample(size, verbose=verbose) - syn_x = self.__ganblr._ordinal_encoder.inverse_transform(ordinal_data[:,:-1]) - syn_y = self.__ganblr._label_encoder.inverse_transform(ordinal_data[:,-1]).reshape(-1,1) - if verbose: - print('step 2/2: Sampling numerical data.') - numerical_columns = self._numerical_columns - numerical_data = self.__discritizer.inverse_transform(syn_x[:,numerical_columns].astype(int)) - syn_x[:,numerical_columns] = numerical_data - return np.hstack([syn_x, syn_y]) - - def evaluate(self, x, y, model='lr'): - """ - Perform a TSTR(Training on Synthetic data, Testing on Real data) evaluation. - - Parameters - ---------- - x, y : array_like - Test dataset. - - model : str or object - The model used for evaluate. Should be one of ['lr', 'mlp', 'rf'], or a model class that have sklearn-style `fit` and `predict` method. - - Return: - -------- - accuracy_score : float. 
- - """ - from sklearn.linear_model import LogisticRegression - from sklearn.neural_network import MLPClassifier - from sklearn.ensemble import RandomForestClassifier - from sklearn.preprocessing import OneHotEncoder, StandardScaler, OrdinalEncoder - from sklearn.metrics import accuracy_score - - eval_model = None - if model=='lr': - eval_model = LogisticRegression() - elif model == 'rf': - eval_model = RandomForestClassifier() - elif model == 'mlp': - eval_model = MLPClassifier() - elif hasattr(model, 'fit') and hasattr(model, 'predict'): - eval_model = model - else: - raise Exception('Invalid Arugument') - numerical_columns = self._numerical_columns - catgorical_columns = list(set(range(x.shape[1])) - set(numerical_columns)) - categories = self.__ganblr._d.get_categories(catgorical_columns) - - synthetic_data = self.sample() - synthetic_x, synthetic_y = synthetic_data[:,:-1], synthetic_data[:,-1] - - ohe = OneHotEncoder(categories=categories, sparse=False, handle_unknown='ignore') - syn_x_ohe = ohe.fit_transform(synthetic_x[:,catgorical_columns]) - real_x_ohe = ohe.transform(x[:,catgorical_columns]) - syn_x_num = synthetic_x[:,numerical_columns] - real_x_num = x[:,numerical_columns] - - scaler = StandardScaler() - syn_x_concat = scaler.fit_transform(np.hstack([syn_x_num, syn_x_ohe])) - real_x_concat = scaler.transform(np.hstack([real_x_num, real_x_ohe])) - - lbe = self.__ganblr._label_encoder - real_y = lbe.transform(y) - syn_y = lbe.transform(synthetic_y) - - eval_model.fit(syn_x_concat, syn_y) - pred = eval_model.predict(real_x_concat) - return accuracy_score(real_y, pred) - - - - - - - - - diff --git a/Archive/ganblrpp/ganblrpp_adapter.py b/Archive/ganblrpp/ganblrpp_adapter.py deleted file mode 100644 index 30de7d1..0000000 --- a/Archive/ganblrpp/ganblrpp_adapter.py +++ /dev/null @@ -1,82 +0,0 @@ - -from katabatic.katabatic_spi import KatabaticModelSPI -import pandas as pd -import numpy as np -from .ganblrpp import GANBLRPP - -from katabatic.katabatic_spi 
import KatabaticModelSPI -import pandas as pd -import numpy as np -from .ganblrpp import GANBLRPP - -class GanblrppAdapter(KatabaticModelSPI): - def __init__(self, model_type="discrete", numerical_columns=None, random_state=None): - self.type = model_type - self.model = None - self.constraints = None - self.batch_size = None - self.epochs = None - self.training_sample_size = 0 - self.numerical_columns = numerical_columns - self.random_state = random_state - - def load_model(self): - if self.numerical_columns is None: - raise ValueError("Numerical columns must be provided for GANBLRPP initialization.") - - print("[INFO] Initializing GANBLR++ Model") - self.model = GANBLRPP(numerical_columns=self.numerical_columns, random_state=self.random_state) - - if self.model is None: - raise RuntimeError("Failed to initialize GANBLR++ model.") - - return self.model - - def load_data(self, data_pathname): - print(f"[INFO] Loading data from {data_pathname}") - try: - data = pd.read_csv(data_pathname) - print("[SUCCESS] Data loaded successfully.") - except Exception as e: - print(f"[ERROR] An error occurred: {e}") - raise - return data - - def fit(self, X_train, y_train, k=0, epochs=10, batch_size=64): - if self.model is None: - raise RuntimeError("Model is not initialized. Call `load_model()` first.") - - try: - print("[INFO] Training GANBLR++ model") - if isinstance(X_train, pd.DataFrame): - X_train = X_train.values - if isinstance(y_train, pd.Series): - y_train = y_train.values - - self.model.fit(X_train, y_train, k=k, batch_size=batch_size, epochs=epochs, verbose=0) - self.training_sample_size = len(X_train) - print("[SUCCESS] Model training completed") - except Exception as e: - print(f"[ERROR] An error occurred during model training: {e}") - raise - - def generate(self, size=None): - if self.model is None: - raise RuntimeError("Model is not initialized.
Call `load_model()` first.") - - try: - print("[INFO] Generating data using GANBLR++ model") - if size is None: - size = self.training_sample_size - - generated_data = self.model.sample(size, verbose=0) - if isinstance(generated_data, np.ndarray): - generated_data = pd.DataFrame(generated_data) - - print("[SUCCESS] Data generation completed") - return generated_data - except Exception as e: - print(f"[ERROR] An error occurred during data generation: {e}") - raise - - diff --git a/Archive/ganblrpp/kdb.py b/Archive/ganblrpp/kdb.py deleted file mode 100644 index 03c080c..0000000 --- a/Archive/ganblrpp/kdb.py +++ /dev/null @@ -1,378 +0,0 @@ -import numpy as np -#import networkx as nx - - -# Chris advice -# Code it twice -# K=0 -# Warmup run (no generation) -# Then run again with adding the edges. -# What is the advantage over KDB? -# focus on generation, not on structure learning -# ganblr ++ is cheating, as we learn the strucutre prior. -# Send to STM -from pyitlib import discrete_random_variable as drv -def build_graph(X, y, k=2): - ''' - kDB algorithm - - Param: - ---------------------- - - Return: - ---------------------- - graph edges - ''' - #ensure data - num_features = X.shape[1] - x_nodes = list(range(num_features)) - y_node = num_features - - #util func - _x = lambda i:X[:,i] - _x2comb = lambda i,j:(X[:,i], X[:,j]) - - #feature indexes desc sort by mutual information - sorted_feature_idxs = np.argsort([ - drv.information_mutual(_x(i), y) - for i in range(num_features) - ])[::-1] - - #start building graph - edges = [] - for iter, target_idx in enumerate(sorted_feature_idxs): - target_node = x_nodes[target_idx] - edges.append((y_node, target_node)) - - parent_candidate_idxs = sorted_feature_idxs[:iter] - if iter <= k: - for idx in parent_candidate_idxs: - edges.append((x_nodes[idx], target_node)) - else: - first_k_parent_mi_idxs = np.argsort([ - drv.information_mutual_conditional(*_x2comb(i, target_idx), y) - for i in parent_candidate_idxs - ])[::-1][:k] - 
first_k_parent_idxs = parent_candidate_idxs[first_k_parent_mi_idxs] - - for parent_idx in first_k_parent_idxs: - edges.append((x_nodes[parent_idx], target_node)) - return edges - -# def draw_graph(edges): -# ''' -# Draw the graph -# -# Param -# ----------------- -# edges: edges of the graph -# -# ''' -# graph = nx.DiGraph(edges) -# pos=nx.spiral_layout(graph) -# nx.draw(graph, pos, node_color='r', edge_color='b') -# nx.draw_networkx_labels(graph, pos, font_size=20, font_family="sans-serif") - - -def get_cross_table(*cols, apply_wt=False): - ''' - author: alexland - - returns: - (i) xt, NumPy array storing the xtab results, number of dimensions is equal to - the len(args) passed in - (ii) unique_vals_all_cols, a tuple of 1D NumPy array for each dimension - in xt (for a 2D xtab, the tuple comprises the row and column headers) - pass in: - (i) 1 or more 1D NumPy arrays of integers - (ii) if wts is True, then the last array in cols is an array of weights - - if return_inverse=True, then np.unique also returns an integer index - (from 0, & of same len as array passed in) such that, uniq_vals[idx] gives the original array passed in - higher dimensional cross tabulations are supported (eg, 2D & 3D) - cross tabulation on two variables (columns): - >>> q1 = np.array([7, 8, 8, 8, 5, 6, 4, 6, 6, 8, 4, 6, 6, 6, 6, 8, 8, 5, 8, 6]) - >>> q2 = np.array([6, 4, 6, 4, 8, 8, 4, 8, 7, 4, 4, 8, 8, 7, 5, 4, 8, 4, 4, 4]) - >>> uv, xt = xtab(q1, q2) - >>> uv - (array([4, 5, 6, 7, 8]), array([4, 5, 6, 7, 8])) - >>> xt - array([[2, 0, 0, 0, 0], - [1, 0, 0, 0, 1], - [1, 1, 0, 2, 4], - [0, 0, 1, 0, 0], - [5, 0, 1, 0, 1]], dtype=uint64) - ''' - if not all(len(col) == len(cols[0]) for col in cols[1:]): - raise ValueError("all arguments must be same size") - - if len(cols) == 0: - raise TypeError("xtab() requires at least one argument") - - fnx1 = lambda q: len(q.squeeze().shape) - if not all([fnx1(col) == 1 for col in cols]): - raise ValueError("all input arrays must be 1D") - - if apply_wt: - 
cols, wt = cols[:-1], cols[-1] - else: - wt = 1 - - uniq_vals_all_cols, idx = zip( *(np.unique(col, return_inverse=True) for col in cols) ) - shape_xt = [uniq_vals_col.size for uniq_vals_col in uniq_vals_all_cols] - dtype_xt = 'float' if apply_wt else 'uint' - xt = np.zeros(shape_xt, dtype=dtype_xt) - np.add.at(xt, idx, wt) - return uniq_vals_all_cols, xt - -def _get_dependencies_without_y(variables, y_name, kdb_edges): - ''' - evidence (parent) variables of each variable, excluding y. - - Param: - -------------- - variables: variable names - - y_name: class name - - kdb_edges: list of tuple (source, target) - ''' - dependencies = {} - kdb_edges_without_y = [edge for edge in kdb_edges if edge[0] != y_name] - mi_desc_order = {t:i for i,(s,t) in enumerate(kdb_edges) if s == y_name} - for x in variables: - current_dependencies = [s for s,t in kdb_edges_without_y if t == x] - if len(current_dependencies) >= 2: - sort_dict = {t:mi_desc_order[t] for t in current_dependencies} - dependencies[x] = sorted(sort_dict) - else: - dependencies[x] = current_dependencies - return dependencies - -def _add_uniform(array, noise=1e-5): - ''' - if a feature value has no count under a particular condition, assign a small uniform probability rather than leaving it at 0 - ''' - sum_by_col = np.sum(array,axis=0) - zero_idxs = (array == 0).astype(int) - # zero_count_by_col = np.sum(zero_idxs,axis=0) - nunique = array.shape[0] - result = np.zeros_like(array, dtype='float') - for i in range(array.shape[1]): - if sum_by_col[i] == 0: - result[:,i] = array[:,i] + 1./nunique - elif noise != 0: - result[:,i] = array[:,i] + noise * zero_idxs[:,i] - else: - result[:,i] = array[:,i] - return result - -def _normalize_by_column(array): - sum_by_col = np.sum(array,axis=0) - return np.divide(array, sum_by_col, - out=np.zeros_like(array,dtype='float'), - where=sum_by_col !=0) - -def _smoothing(cct, d): - ''' - probability smoothing for kDB - - Parameters: - ----------- - cct (np.ndarray): cross count table with shape (x0, *parents) - - d (int): dimension of
cct - - Return: - -------- - smoothed joint prob table - ''' - #convert cross-count-table to joint-prob-table by normalizing along axis 0 - jpt = _normalize_by_column(cct) - smoothing_idx = jpt == 0 - if d > 1 and np.sum(smoothing_idx) > 0: - parent = cct.sum(axis=-1) - parent = _smoothing(parent, d-1) - parent_extend = parent.repeat(jpt.shape[-1]).reshape(jpt.shape) - jpt[smoothing_idx] = parent_extend[smoothing_idx] - return jpt - -def get_high_order_feature(X, col, evidence_cols, feature_uniques): - ''' - encode the high-order feature of X[col] given evidence X[evidence_cols]. - ''' - if evidence_cols is None or len(evidence_cols) == 0: - return X[:,[col]] - else: - evidences = [X[:,_col] for _col in evidence_cols] - - #[1, variable_unique, evidence_unique] - base = [1, feature_uniques[col]] + [feature_uniques[_col] for _col in evidence_cols[::-1][:-1]] - cum_base = np.cumprod(base)[::-1] - - cols = evidence_cols + [col] - high_order_feature = np.sum(X[:,cols] * cum_base, axis=1).reshape(-1,1) - return high_order_feature - -def get_high_order_constraints(X, col, evidence_cols, feature_uniques): - ''' - find the constraint information for the high-order feature X[col] given evidence X[evidence_cols]. - - Returns: - --------------------- - tuple(have_value, high_order_uniques) - - have_value: a (k+1)-dimensional numpy ndarray of type boolean. - Each dimension corresponds to a variable, with the order (*evidence_cols, col). - True indicates the corresponding combination of variable values could be found in the dataset. - False indicates it could not. - - high_order_constraints: a 1d numpy ndarray of type int. - Each number `c` indicates that the next `c` columns (counting from the previous constraint position, or index 0) should have the constraint applied, - in sequence. 
- - ''' - if evidence_cols is None or len(evidence_cols) == 0: - unique = feature_uniques[col] - return np.ones(unique,dtype=bool), np.array([unique]) - else: - cols = evidence_cols + [col] - cross_table_idxs, cross_table = get_cross_table(*[X[:,i] for i in cols]) - have_value = cross_table != 0 - - have_value_reshape = have_value.reshape(-1,have_value.shape[-1]) - #have_value_split = np.split(have_value_reshape, have_value_reshape.shape[0], 0) - high_order_constraints = np.sum(have_value_reshape, axis=-1) - - return have_value, high_order_constraints - -class KdbHighOrderFeatureEncoder: - ''' - High order feature encoder that uses the kdb model to retrieve the dependencies between features. - - ''' - def __init__(self): - self.dependencies_ = {} - self.constraints_ = np.array([]) - self.have_value_idxs_ = [] - self.feature_uniques_ = [] - self.high_order_feature_uniques_ = [] - self.edges_ = [] - self.ohe_ = None - self.k = None - #self.full_=True - - def fit(self, X, y, k=0): - ''' - Fit the KdbHighOrderFeatureEncoder to X, y. - - Parameters - ---------- - X : array_like of shape (n_samples, n_features) - data to fit in the encoder. - - y : array_like of shape (n_samples,) - label to fit in the encoder. - - k : int, default=0 - k value of the order of the high-order feature. k = 0 will lead to a OneHotEncoder. - - Returns - ------- - self : object - Fitted encoder. 
- ''' - self.k = k - edges = build_graph(X, y, k) - #n_classes = len(np.unique(y)) - num_features = X.shape[1] - - if k > 0: - dependencies = _get_dependencies_without_y(list(range(num_features)), num_features, edges) - else: - dependencies = {x:[] for x in range(num_features)} - - self.dependencies_ = dependencies - self.feature_uniques_ = [len(np.unique(X[:,i])) for i in range(num_features)] - self.edges_ = edges - #self.full_ = full - - Xk, constraints, have_value_idxs = self.transform(X, return_constraints=True, use_ohe=False) - - from sklearn.preprocessing import OneHotEncoder - self.ohe_ = OneHotEncoder().fit(Xk) - self.high_order_feature_uniques_ = [len(c) for c in self.ohe_.categories_] - self.constraints_ = constraints - self.have_value_idxs_ = have_value_idxs - return self - - def transform(self, X, return_constraints=False, use_ohe=True): - """ - Transform X to the high-order features. - - Parameters - ---------- - X : array_like of shape (n_samples, n_features) - Data to transform with the fitted encoder. - - return_constraints : bool, default=False - Whether to return the constraint information. - - use_ohe : bool, default=True - Whether to transform output to one-hot format. - - Returns - ------- - X_out : ndarray of shape (n_samples, n_encoded_features) - Transformed input. 
- """ - Xk = [] - have_value_idxs = [] - constraints = [] - for k, v in self.dependencies_.items(): - xk = get_high_order_feature(X, k, v, self.feature_uniques_) - Xk.append(xk) - - if return_constraints: - idx, constraint = get_high_order_constraints(X, k, v, self.feature_uniques_) - have_value_idxs.append(idx) - constraints.append(constraint) - - Xk = np.hstack(Xk) - from sklearn.preprocessing import OrdinalEncoder - Xk = OrdinalEncoder().fit_transform(Xk) - if use_ohe: - Xk = self.ohe_.transform(Xk) - - if return_constraints: - concatenated_constraints = np.hstack(constraints) - return Xk, concatenated_constraints, have_value_idxs - else: - return Xk - - def fit_transform(self, X, y, k=0, return_constraints=False): - ''' - Fit KdbHighOrderFeatureEncoder to X, y, then transform X. - - Equivalent to fit(X, y, k).transform(X, return_constraints) but more convenient. - - Parameters - ---------- - X : array_like of shape (n_samples, n_features) - data to fit in the encoder. - - y : array_like of shape (n_samples,) - label to fit in the encoder. - - k : int, default=0 - k value of the kdb model. k = 0 will lead to a OneHotEncoder. - - return_constraints : bool, default=False - whether to return the constraint information. - - Returns - ------- - X_out : ndarray of shape (n_samples, n_encoded_features) - Transformed input. 
- ''' - return self.fit(X, y, k).transform(X, return_constraints) \ No newline at end of file diff --git a/Archive/ganblrpp/utils.py b/Archive/ganblrpp/utils.py deleted file mode 100644 index 3502866..0000000 --- a/Archive/ganblrpp/utils.py +++ /dev/null @@ -1,167 +0,0 @@ -import numpy as np -import tensorflow as tf -from tensorflow.python.ops import math_ops -from pandas import read_csv -from sklearn.preprocessing import OneHotEncoder -from .kdb import KdbHighOrderFeatureEncoder - -from tensorflow.python.ops import math_ops -import numpy as np -import tensorflow as tf - -class softmax_weight(tf.keras.constraints.Constraint): - """Constrains weight tensors to log-softmax form within each feature block.""" - - def __init__(self,feature_uniques): - if isinstance(feature_uniques, np.ndarray): - idxs = math_ops.cumsum(np.hstack([np.array([0]),feature_uniques])) - else: - idxs = math_ops.cumsum([0] + feature_uniques) - idxs = [i.numpy() for i in idxs] - self.feature_idxs = [ - (idxs[i],idxs[i+1]) for i in range(len(idxs)-1) - ] - - def __call__(self, w): - w_new = [ - math_ops.log(tf.nn.softmax(w[i:j,:], axis=0)) - for i,j in self.feature_idxs - ] - return tf.concat(w_new, 0) - - def get_config(self): - return {'feature_idxs': self.feature_idxs} - -def elr_loss(KL_LOSS): - def loss(y_true, y_pred): - return tf.keras.losses.sparse_categorical_crossentropy(y_true, y_pred)+ KL_LOSS - return loss - -def KL_loss(prob_fake): - return np.mean(-np.log(np.subtract(1,prob_fake))) - -def get_lr(input_dim, output_dim, constraint=None,KL_LOSS=0): - model = tf.keras.Sequential() - model.add(tf.keras.layers.Dense(output_dim, input_dim=input_dim, activation='softmax',kernel_constraint=constraint)) - model.compile(loss=elr_loss(KL_LOSS), optimizer='adam', metrics=['accuracy']) - #log_elr = model.fit(*train_data, validation_data=test_data, batch_size=batch_size,epochs=epochs) - return model - -def sample(*arrays, n=None, frac=None, random_state=None): - ''' - generate random samples from the given arrays. 
The given arrays must all be the same size. - - Parameters: - -------------- - *arrays: arrays to be sampled. - - n (int): Number of random samples to generate. - - frac (float): Value between 0 and 1; the sample size is frac * length of the given arrays. frac cannot be used together with n. - - random_state: int value or numpy.random.RandomState, optional. If set to a particular integer, returns the same samples in every call. - - Return: - -------------- - the sampled array(s). Passing in multiple arrays will result in the return of a tuple. - - ''' - random = np.random - if isinstance(random_state, int): - random = random.RandomState(random_state) - elif isinstance(random_state, np.random.RandomState): - random = random_state - - arr0 = arrays[0] - original_size = len(arr0) - if n is None and frac is None: - raise ValueError('You must specify one of n or frac.') - if n is None: - n = int(len(arr0) * frac) - - idxs = random.choice(original_size, n, replace=False) - if len(arrays) > 1: - sampled_arrays = [] - for arr in arrays: - assert(len(arr) == original_size) - sampled_arrays.append(arr[idxs]) - return tuple(sampled_arrays) - else: - return arr0[idxs] - -DEMO_DATASETS = { - 'adult': { - 'link':'https://raw.githubusercontent.com/chriszhangpodo/discretizedata/main/adult-dm.csv', - 'params': { - 'dtype' : int - } - }, - 'adult-raw':{ - 'link':'https://drive.google.com/uc?export=download&id=1iA-_qIC1xKQJ4nL2ugX1_XJQf8__xOY0', - 'params': {} - } -} - -def get_demo_data(name='adult'): - """ - Download a demo dataset from the internet. - - Parameters - ---------- - name : str - Name of dataset. Should be one of ['adult', 'adult-raw']. - - Returns - ------- - data : pandas.DataFrame - the demo dataset. 
- """ - assert name in DEMO_DATASETS - return read_csv(DEMO_DATASETS[name]['link'], **DEMO_DATASETS[name]['params']) - -from .kdb import KdbHighOrderFeatureEncoder -from sklearn.preprocessing import OneHotEncoder -from pandas import read_csv -import numpy as np - -class DataUtils: - """ - Useful data utilities for preparation before training. - """ - def __init__(self, x, y): - self.x = x - self.y = y - self.data_size = len(x) - self.num_features = x.shape[1] - - yunique, ycounts = np.unique(y, return_counts=True) - self.num_classes = len(yunique) - self.class_counts = ycounts - self.feature_uniques = [len(np.unique(x[:,i])) for i in range(self.num_features)] - - self.constraint_positions = None - self._kdbe = None - - self.__kdbe_x = None - - def get_categories(self, idxs=None): - if idxs is not None: - return [self._kdbe.ohe_.categories_[i] for i in idxs] - return self._kdbe.ohe_.categories_ - - def get_kdbe_x(self, k=0, dense_format=True) -> np.ndarray: - if self.__kdbe_x is not None: - return self.__kdbe_x - if self._kdbe is None: - self._kdbe = KdbHighOrderFeatureEncoder() - self._kdbe.fit(self.x, self.y, k=k) - kdbex = self._kdbe.transform(self.x) - if dense_format: - kdbex = kdbex.todense() - self.__kdbe_x = kdbex - self.constraint_positions = self._kdbe.constraints_ - return kdbex - - def clear(self): - self._kdbe = None - self.__kdbe_x = None \ No newline at end of file diff --git a/Archive/meg/Benchmarking 1.py b/Archive/meg/Benchmarking 1.py deleted file mode 100644 index 8154854..0000000 --- a/Archive/meg/Benchmarking 1.py +++ /dev/null @@ -1,199 +0,0 @@ -import numpy as np -import pandas as pd -import torch -import torch.nn as nn -import torch.optim as optim -from sklearn.model_selection import train_test_split -from sklearn.preprocessing import StandardScaler -from sklearn.metrics import accuracy_score, precision_score, recall_score, f1_score -import matplotlib.pyplot as plt -import seaborn as sns - -# Load your dataset -data = 
pd.read_csv('dataset.csv') - -# Drop missing values (if any) -data = data.dropna() - -# Convert categorical columns to numerical values -data['Gender'] = data['Gender'].map({'M': 1, 'F': 0}) -data['Purchase'] = data['Purchase'].map({'Yes': 1, 'No': 0}) - -# Separate features (X) and labels (y) -X = data.drop('Purchase', axis=1) -y = data['Purchase'] - -# Split the data into train and test sets -X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42) - -# Standardize the data -scaler = StandardScaler() -X_train_scaled = scaler.fit_transform(X_train) -X_test_scaled = scaler.transform(X_test) - -# Convert to PyTorch tensors -X_train_tensor = torch.tensor(X_train_scaled, dtype=torch.float32) -y_train_tensor = torch.tensor(y_train.values, dtype=torch.float32).unsqueeze(1) -X_test_tensor = torch.tensor(X_test_scaled, dtype=torch.float32) -y_test_tensor = torch.tensor(y_test.values, dtype=torch.float32).unsqueeze(1) - -# Define the MaskedGenerator class -class MaskedGenerator(nn.Module): - def __init__(self, input_dim, output_dim, mask): - super(MaskedGenerator, self).__init__() - self.mask = mask - self.model = nn.Sequential( - nn.Linear(input_dim, 128), - nn.ReLU(), - nn.Linear(128, output_dim), - nn.Tanh() - ) - - def forward(self, x): - output = self.model(x) * self.mask - return output - -# Define the Discriminator class -class Discriminator(nn.Module): - def __init__(self, input_dim): - super(Discriminator, self).__init__() - self.model = nn.Sequential( - nn.Linear(input_dim, 128), - nn.ReLU(), - nn.Linear(128, 1), - nn.Sigmoid() - ) - - def forward(self, x): - return self.model(x) - -# Example usage: -input_dim = 100 # Noise dimension -output_dim = X_train_tensor.size(1) # Output dimension (number of features in the data) - -# Define masks for each generator -masks = [ - torch.tensor([1, 0, 0], dtype=torch.float32), # Generator 1 - torch.tensor([0, 1, 0], dtype=torch.float32), # Generator 2 - torch.tensor([0, 0, 1], 
dtype=torch.float32), # Generator 3 -] - -# Ensure the masks sum to cover all output dimensions -assert sum(mask.sum() for mask in masks) == output_dim - -# Initialize generators and the discriminator -generators = [MaskedGenerator(input_dim, output_dim, mask) for mask in masks] -discriminator = Discriminator(input_dim=output_dim) - -# Loss function and optimizers -criterion = nn.BCELoss() -optimizer_D = optim.Adam(discriminator.parameters(), lr=0.0002) -optimizers_G = [optim.Adam(generator.parameters(), lr=0.0002) for generator in generators] - -# Training loop -num_epochs = 5000 -batch_size = 64 -d_losses = [] -g_losses = [] - -for epoch in range(num_epochs): - # Train Discriminator - discriminator.train() - optimizer_D.zero_grad() - - # Real data - idx = torch.randperm(X_train_tensor.size(0))[:batch_size] - real_data = X_train_tensor[idx] - real_labels = torch.ones((real_data.size(0), 1)) - - # Fake data from ensemble of generators - noise = torch.randn(real_data.size(0), input_dim) - fake_data = sum(generator(noise) for generator in generators) - fake_labels = torch.zeros((real_data.size(0), 1)) - - # Compute discriminator loss - real_output = discriminator(real_data) - fake_output = discriminator(fake_data.detach()) - d_loss_real = criterion(real_output, real_labels) - d_loss_fake = criterion(fake_output, fake_labels) - d_loss = d_loss_real + d_loss_fake - d_loss.backward() - optimizer_D.step() - - # Train each Generator - for i, generator in enumerate(generators): - optimizers_G[i].zero_grad() - - # Generate fake data - noise = torch.randn(batch_size, input_dim) - fake_data = generator(noise) - - # Compute generator loss - fake_output = discriminator(fake_data) - g_loss = criterion(fake_output, torch.ones((batch_size, 1))) - g_loss.backward() - optimizers_G[i].step() - - if epoch % 500 == 0: - print(f'Epoch {epoch}, D Loss: {d_loss.item()}, G Loss: {g_loss.item()}') - - d_losses.append(d_loss.item()) - g_losses.append(g_loss.item()) - -# Plot training loss 
curves for Discriminator and Generators -plt.figure(figsize=(10,5)) -plt.plot(d_losses, label='Discriminator Loss') -plt.plot(g_losses, label='Generator Loss') -plt.xlabel('Epochs') -plt.ylabel('Loss') -plt.legend() -plt.title('Loss Trend During Training') -plt.show() - -# Generate synthetic data -def generate_synthetic_data(generators, num_samples): - synthetic_parts = [generator(torch.randn(num_samples, input_dim)).detach().numpy() for generator in generators] - combined_data = np.sum(synthetic_parts, axis=0) - return combined_data - -# Generate and save synthetic data -synthetic_data = generate_synthetic_data(generators, len(X_test)) -synthetic_df = pd.DataFrame(synthetic_data, columns=X.columns) # Use your original column names - -# Plot real vs synthetic data for each feature -for col in X.columns: - plt.figure(figsize=(6,4)) - sns.histplot(X_test[col], color='blue', label='Real', kde=True, stat="density", linewidth=0) - sns.histplot(synthetic_df[col], color='orange', label='Synthetic', kde=True, stat="density", linewidth=0) - plt.title(f'Distribution of {col}: Real vs Synthetic') - plt.legend() - plt.show() - -# Benchmarking: Test the generated synthetic data using classification metrics - -# Re-scale synthetic data -synthetic_df_scaled = scaler.transform(synthetic_df) - -# Convert to tensor -synthetic_tensor = torch.tensor(synthetic_df_scaled, dtype=torch.float32) - -# Compare synthetic data with real data using classification metrics -def benchmark_synthetic_data(real_data, synthetic_data, real_labels): - real_predictions = discriminator(real_data).detach().numpy() - synthetic_predictions = discriminator(synthetic_data).detach().numpy() - - real_labels = real_labels.numpy() - - # Use threshold 0.5 for binary classification - real_preds_class = (real_predictions > 0.5).astype(int) - synthetic_preds_class = (synthetic_predictions > 0.5).astype(int) - - accuracy = accuracy_score(real_labels, real_preds_class) - precision = precision_score(real_labels, 
real_preds_class) - recall = recall_score(real_labels, real_preds_class) - f1 = f1_score(real_labels, real_preds_class) - - print(f'Accuracy: {accuracy}, Precision: {precision}, Recall: {recall}, F1 Score: {f1}') - -# Run benchmarking -benchmark_synthetic_data(X_test_tensor, synthetic_tensor, y_test_tensor) diff --git a/Archive/meg/MEG_Implementation b/Archive/meg/MEG_Implementation deleted file mode 100644 index 9a8ff11..0000000 --- a/Archive/meg/MEG_Implementation +++ /dev/null @@ -1,145 +0,0 @@ -import numpy as np -import pandas as pd -import torch -import torch.nn as nn -import torch.optim as optim -from sklearn.model_selection import train_test_split -from sklearn.preprocessing import StandardScaler - -# Load your dataset -data = pd.read_csv('dataset.csv') - -# Drop missing values (if any) -data = data.dropna() - -# Convert categorical columns to numerical values -data['Gender'] = data['Gender'].map({'M': 1, 'F': 0}) -data['Purchase'] = data['Purchase'].map({'Yes': 1, 'No': 0}) - -# Separate features (X) and labels (y) -X = data.drop('Purchase', axis=1) -y = data['Purchase'] - -# Split the data into train and test sets -X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42) - -# Standardize the data -scaler = StandardScaler() -X_train_scaled = scaler.fit_transform(X_train) -X_test_scaled = scaler.transform(X_test) - -# Convert to PyTorch tensors -X_train_tensor = torch.tensor(X_train_scaled, dtype=torch.float32) -y_train_tensor = torch.tensor(y_train.values, dtype=torch.float32).unsqueeze(1) -X_test_tensor = torch.tensor(X_test_scaled, dtype=torch.float32) -y_test_tensor = torch.tensor(y_test.values, dtype=torch.float32).unsqueeze(1) - -# Define the MaskedGenerator class -class MaskedGenerator(nn.Module): - def __init__(self, input_dim, output_dim, mask): - super(MaskedGenerator, self).__init__() - self.mask = mask - self.model = nn.Sequential( - nn.Linear(input_dim, 128), - nn.ReLU(), - nn.Linear(128, output_dim), - 
nn.Tanh() - ) - - def forward(self, x): - output = self.model(x) * self.mask - return output - -# Define the Discriminator class -class Discriminator(nn.Module): - def __init__(self, input_dim): - super(Discriminator, self).__init__() - self.model = nn.Sequential( - nn.Linear(input_dim, 128), - nn.ReLU(), - nn.Linear(128, 1), - nn.Sigmoid() - ) - - def forward(self, x): - return self.model(x) - -# Example usage: -input_dim = 100 # Noise dimension -output_dim = X_train_tensor.size(1) # Output dimension (number of features in the data) - -# Define masks for each generator -masks = [ - torch.tensor([1, 0, 0], dtype=torch.float32), # Generator 1 - torch.tensor([0, 1, 0], dtype=torch.float32), # Generator 2 - torch.tensor([0, 0, 1], dtype=torch.float32), # Generator 3 -] - -# Ensure the masks sum to cover all output dimensions -assert sum(mask.sum() for mask in masks) == output_dim - -# Initialize generators and the discriminator -generators = [MaskedGenerator(input_dim, output_dim, mask) for mask in masks] -discriminator = Discriminator(input_dim=output_dim) - -# Loss function and optimizers -criterion = nn.BCELoss() -optimizer_D = optim.Adam(discriminator.parameters(), lr=0.0002) -optimizers_G = [optim.Adam(generator.parameters(), lr=0.0002) for generator in generators] - -# Training loop -num_epochs = 5000 -batch_size = 64 - -for epoch in range(num_epochs): - # Train Discriminator - discriminator.train() - optimizer_D.zero_grad() - - # Real data - idx = torch.randperm(X_train_tensor.size(0))[:batch_size] - real_data = X_train_tensor[idx] - real_labels = torch.ones((real_data.size(0), 1)) - - # Fake data from ensemble of generators - noise = torch.randn(real_data.size(0), input_dim) - fake_data = sum(generator(noise) for generator in generators) - fake_labels = torch.zeros((real_data.size(0), 1)) - - # Compute discriminator loss - real_output = discriminator(real_data) - fake_output = discriminator(fake_data.detach()) - d_loss_real = criterion(real_output, 
real_labels) - d_loss_fake = criterion(fake_output, fake_labels) - d_loss = d_loss_real + d_loss_fake - d_loss.backward() - optimizer_D.step() - - # Train each Generator - for i, generator in enumerate(generators): - optimizers_G[i].zero_grad() - - # Generate fake data - noise = torch.randn(batch_size, input_dim) - fake_data = generator(noise) - - # Compute generator loss - fake_output = discriminator(fake_data) - g_loss = criterion(fake_output, torch.ones((batch_size, 1))) - g_loss.backward() - optimizers_G[i].step() - - if epoch % 500 == 0: - print(f'Epoch {epoch}, D Loss: {d_loss.item()}, G Loss: {g_loss.item()}') - - -# Generate synthetic data -def generate_synthetic_data(generators, num_samples): - synthetic_parts = [generator(torch.randn(num_samples, input_dim)).detach().numpy() for generator in generators] - combined_data = np.sum(synthetic_parts, axis=0) - return combined_data - -# Generate and save synthetic data -synthetic_data = generate_synthetic_data(generators, 1000) -synthetic_df = pd.DataFrame(synthetic_data, columns=X.columns) # Use your original column names -synthetic_df.to_csv('masked_ensemble_synthetic_data.csv', index=False) diff --git a/Archive/meg/MEG_SPI.py b/Archive/meg/MEG_SPI.py deleted file mode 100644 index 1d1e75c..0000000 --- a/Archive/meg/MEG_SPI.py +++ /dev/null @@ -1,37 +0,0 @@ -from abc import ABC, abstractmethod - -class MEGModelSPI(ABC): - - @abstractmethod - def __init__(self, model_type): - self.model_type = model_type # The type of model, e.g., 'ensemble', 'masked', etc. 
- self.parameters = None # Placeholder for model parameters - self.masking_strategy = None # Placeholder for the masking strategy to be used - self.num_models = None # The number of models in the ensemble - - @abstractmethod - def load_model(self): - - pass - - @abstractmethod - def load_data(self, data): - - pass - - @abstractmethod - def train(self): - - pass - - @abstractmethod - def generate(self, num_samples): - - pass - -class MEGMetricSPI(ABC): - - @abstractmethod - def evaluate(self, real_data, synthetic_data): - - pass diff --git a/Archive/meg/dataset.csv b/Archive/meg/dataset.csv deleted file mode 100644 index 5a9e74d..0000000 --- a/Archive/meg/dataset.csv +++ /dev/null @@ -1,11 +0,0 @@ -Age,Gender,Income,Purchase -23,F,25000,No -31,M,54000,Yes -45,F,32000,No -35,M,58000,Yes -41,F,62000,Yes -29,F,48000,No -52,M,61000,Yes -36,M,38000,No -27,F,42000,No -40,M,53000,Yes diff --git a/Archive/model_template/dummy.py b/Archive/model_template/dummy.py deleted file mode 100644 index aec50a1..0000000 --- a/Archive/model_template/dummy.py +++ /dev/null @@ -1,25 +0,0 @@ -# A dummy datagen model written by Jaime Blackwell - -from katabatic.models.model import Model # Import the abstract base class - -# This dummy model simply duplicates the data and returns it. 
-# Adding an extra comment for no reason -class DummyModel(Model): - - def __init__(self, x, Y, batch_size=64): - super().__init__("model") - self.batch_size = batch_size - self.x = x # data to train on - self.Y = Y # Y is the target variable - self.k = 0 - - def fit(self, x, Y): - self.x = x - self.Y = Y - return 42 - - def generate(self): - return 42 - - def evaluate(self): - return 42 \ No newline at end of file diff --git a/Archive/model_template/mock_adapter.py b/Archive/model_template/mock_adapter.py deleted file mode 100644 index 8281839..0000000 --- a/Archive/model_template/mock_adapter.py +++ /dev/null @@ -1,21 +0,0 @@ -from katabatic_spi import KatabaticModelSPI - -class MockAdapter(KatabaticModelSPI): - - def __init__(self, type='discrete'): - self.type = type # Should be either 'discrete' or 'continuous' - self.constraints = None - self.batch_size = None - self.epochs = None - - def load_model(self): - print('Loading the model') - - def load_data(self, data_pathname): - print("Loading data now.") - - def fit(self, X_train, y_train, k=0, epochs=10, batch_size=64): - print("Fitting the model now.") - - def generate(self, size=None): - print("Generating data now.") \ No newline at end of file diff --git a/Archive/model_template/model.py b/Archive/model_template/model.py deleted file mode 100644 index 362a4ad..0000000 --- a/Archive/model_template/model.py +++ /dev/null @@ -1,42 +0,0 @@ -"""Abstract base class for all models.""" - -# Author: Jaime Blackwell -# License: GNU Affero License -import abc -from abc import ABC, abstractmethod # Import abstraction Module - -class Model(ABC): - - @abstractmethod - def __init__(self, *args, **kwargs): - - ''' - *** MODEL CLASS PARAMETERS *** - ------------------------------------------------------------ - weights : initial weights for data generation. Default = None - epochs : number of epochs to train the model. Default = None - batch_size : batch size for training. 
Default = None - self.x : Training set for fitting to the model - self.Y : Target labels for training - ------------------------------------------------------------- - ''' - self.weights = None - self.epochs = None - self.batch_size = None - self.x = None - self.Y = None - self.k = 0 - - @abstractmethod - def fit(self, x, Y): # sklearn-style fit method - self.x = x - self.Y = Y - raise NotImplementedError('model.fit() must be defined in the concrete class') - - @abstractmethod - def generate(self): # , size=None, **kwargs - raise NotImplementedError('model.generate() must be defined in the concrete class') - - @abstractmethod - def evaluate(self): # x, Y, classifier - pass \ No newline at end of file diff --git a/Archive/tryKatabatic.ipynb b/Archive/tryKatabatic.ipynb deleted file mode 100644 index aa8ba16..0000000 --- a/Archive/tryKatabatic.ipynb +++ /dev/null @@ -1,476 +0,0 @@ -{ - "cells": [ - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "# Katabatic Demo Usage" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "#### Importing Dependencies" - ] - }, - { - "cell_type": "code", - "execution_count": 1, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "Requirement already satisfied: scikit-learn in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (0.24.0)\n", - "Collecting scikit-learn\n", - " Using cached scikit_learn-1.5.1-cp39-cp39-win_amd64.whl (11.0 MB)\n", - "Requirement already satisfied: scipy>=1.6.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from scikit-learn) (1.13.1)\n", - "Requirement already satisfied: joblib>=1.2.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from scikit-learn) (1.4.2)\n", - "Requirement already satisfied: numpy>=1.19.5 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from scikit-learn) (1.26.4)\n", - 
"Requirement already satisfied: threadpoolctl>=3.1.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from scikit-learn) (3.5.0)\n", - "Installing collected packages: scikit-learn\n", - " Attempting uninstall: scikit-learn\n", - " Found existing installation: scikit-learn 0.24.0\n", - " Uninstalling scikit-learn-0.24.0:\n", - " Successfully uninstalled scikit-learn-0.24.0\n", - "Successfully installed scikit-learn-1.5.1\n" - ] - }, - { - "name": "stderr", - "output_type": "stream", - "text": [ - "ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.\n", - "pyitlib 0.2.3 requires scikit-learn<=0.24,>=0.16.0, but you have scikit-learn 1.5.1 which is incompatible.\n", - "WARNING: You are using pip version 22.0.4; however, version 24.2 is available.\n", - "You should consider upgrading via the 'C:\\Users\\Asus\\AppData\\Local\\Programs\\Python\\Python39\\python.exe -m pip install --upgrade pip' command.\n" - ] - } - ], - "source": [ - "!pip install -U scikit-learn" - ] - }, - { - "cell_type": "code", - "execution_count": 2, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "Requirement already satisfied: pyitlib in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (0.2.3)\n", - "Requirement already satisfied: pandas>=0.20.2 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pyitlib) (2.2.2)\n", - "Requirement already satisfied: numpy>=1.9.2 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pyitlib) (1.26.4)\n", - "Collecting scikit-learn<=0.24,>=0.16.0\n", - " Using cached scikit_learn-0.24.0-cp39-cp39-win_amd64.whl (6.9 MB)\n", - "Requirement already satisfied: scipy>=1.0.1 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from 
pyitlib) (1.13.1)\n", - "Requirement already satisfied: future>=0.16.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pyitlib) (1.0.0)\n", - "Requirement already satisfied: tzdata>=2022.7 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pandas>=0.20.2->pyitlib) (2024.1)\n", - "Requirement already satisfied: python-dateutil>=2.8.2 in c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from pandas>=0.20.2->pyitlib) (2.9.0.post0)\n", - "Requirement already satisfied: pytz>=2020.1 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pandas>=0.20.2->pyitlib) (2024.1)\n", - "Requirement already satisfied: threadpoolctl>=2.0.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from scikit-learn<=0.24,>=0.16.0->pyitlib) (3.5.0)\n", - "Requirement already satisfied: joblib>=0.11 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from scikit-learn<=0.24,>=0.16.0->pyitlib) (1.4.2)\n", - "Requirement already satisfied: six>=1.5 in c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from python-dateutil>=2.8.2->pandas>=0.20.2->pyitlib) (1.16.0)\n", - "Installing collected packages: scikit-learn\n", - " Attempting uninstall: scikit-learn\n", - " Found existing installation: scikit-learn 1.5.1\n", - " Uninstalling scikit-learn-1.5.1:\n", - " Successfully uninstalled scikit-learn-1.5.1\n", - "Successfully installed scikit-learn-0.24.0\n" - ] - }, - { - "name": "stderr", - "output_type": "stream", - "text": [ - "ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. 
This behaviour is the source of the following dependency conflicts.\n", - "sdmetrics 0.15.0 requires scikit-learn>=1.0.2; python_version < \"3.10\", but you have scikit-learn 0.24.0 which is incompatible.\n", - "rdt 1.12.2 requires scikit-learn>=1.0.2; python_version < \"3.10\", but you have scikit-learn 0.24.0 which is incompatible.\n", - "WARNING: You are using pip version 22.0.4; however, version 24.2 is available.\n", - "You should consider upgrading via the 'C:\\Users\\Asus\\AppData\\Local\\Programs\\Python\\Python39\\python.exe -m pip install --upgrade pip' command.\n" - ] - } - ], - "source": [ - "!pip install pyitlib" - ] - }, - { - "cell_type": "code", - "execution_count": 3, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "Requirement already satisfied: tensorflow in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (2.17.0)\n", - "Requirement already satisfied: tensorflow-intel==2.17.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow) (2.17.0)\n", - "Requirement already satisfied: numpy<2.0.0,>=1.23.5 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (1.26.4)\n", - "Requirement already satisfied: typing-extensions>=3.6.6 in c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (4.12.2)\n", - "Requirement already satisfied: google-pasta>=0.1.1 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (0.2.0)\n", - "Requirement already satisfied: tensorflow-io-gcs-filesystem>=0.23.1 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (0.31.0)\n", - "Requirement already satisfied: ml-dtypes<0.5.0,>=0.3.1 in 
c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (0.4.0)\n", - "Requirement already satisfied: h5py>=3.10.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (3.11.0)\n", - "Requirement already satisfied: six>=1.12.0 in c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (1.16.0)\n", - "Requirement already satisfied: requests<3,>=2.21.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (2.32.3)\n", - "Requirement already satisfied: keras>=3.2.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (3.4.1)\n", - "Requirement already satisfied: flatbuffers>=24.3.25 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (24.3.25)\n", - "Requirement already satisfied: termcolor>=1.1.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (2.4.0)\n", - "Requirement already satisfied: wrapt>=1.11.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (1.16.0)\n", - "Requirement already satisfied: grpcio<2.0,>=1.24.3 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (1.65.4)\n", - "Requirement already satisfied: opt-einsum>=2.3.2 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (3.3.0)\n", - "Requirement already satisfied: setuptools in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from 
tensorflow-intel==2.17.0->tensorflow) (58.1.0)\n", - "Requirement already satisfied: packaging in c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (24.1)\n", - "Requirement already satisfied: absl-py>=1.0.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (2.1.0)\n", - "Requirement already satisfied: gast!=0.5.0,!=0.5.1,!=0.5.2,>=0.2.1 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (0.6.0)\n", - "Requirement already satisfied: protobuf!=4.21.0,!=4.21.1,!=4.21.2,!=4.21.3,!=4.21.4,!=4.21.5,<5.0.0dev,>=3.20.3 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (4.25.4)\n", - "Requirement already satisfied: libclang>=13.0.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (18.1.1)\n", - "Requirement already satisfied: tensorboard<2.18,>=2.17 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (2.17.0)\n", - "Requirement already satisfied: astunparse>=1.6.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (1.6.3)\n", - "Requirement already satisfied: wheel<1.0,>=0.23.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from astunparse>=1.6.0->tensorflow-intel==2.17.0->tensorflow) (0.44.0)\n", - "Requirement already satisfied: rich in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from keras>=3.2.0->tensorflow-intel==2.17.0->tensorflow) (13.7.1)\n", - "Requirement already satisfied: namex in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from 
keras>=3.2.0->tensorflow-intel==2.17.0->tensorflow) (0.0.8)\n", - "Requirement already satisfied: optree in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from keras>=3.2.0->tensorflow-intel==2.17.0->tensorflow) (0.12.1)\n", - "Requirement already satisfied: charset-normalizer<4,>=2 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from requests<3,>=2.21.0->tensorflow-intel==2.17.0->tensorflow) (3.3.2)\n", - "Requirement already satisfied: idna<4,>=2.5 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from requests<3,>=2.21.0->tensorflow-intel==2.17.0->tensorflow) (3.7)\n", - "Requirement already satisfied: certifi>=2017.4.17 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from requests<3,>=2.21.0->tensorflow-intel==2.17.0->tensorflow) (2024.7.4)\n", - "Requirement already satisfied: urllib3<3,>=1.21.1 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from requests<3,>=2.21.0->tensorflow-intel==2.17.0->tensorflow) (1.26.19)\n", - "Requirement already satisfied: werkzeug>=1.0.1 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorboard<2.18,>=2.17->tensorflow-intel==2.17.0->tensorflow) (3.0.3)\n", - "Requirement already satisfied: tensorboard-data-server<0.8.0,>=0.7.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorboard<2.18,>=2.17->tensorflow-intel==2.17.0->tensorflow) (0.7.2)\n", - "Requirement already satisfied: markdown>=2.6.8 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorboard<2.18,>=2.17->tensorflow-intel==2.17.0->tensorflow) (3.6)\n", - "Requirement already satisfied: importlib-metadata>=4.4 in c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from markdown>=2.6.8->tensorboard<2.18,>=2.17->tensorflow-intel==2.17.0->tensorflow) 
(8.2.0)\n", - "Requirement already satisfied: MarkupSafe>=2.1.1 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from werkzeug>=1.0.1->tensorboard<2.18,>=2.17->tensorflow-intel==2.17.0->tensorflow) (2.1.5)\n", - "Requirement already satisfied: markdown-it-py>=2.2.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from rich->keras>=3.2.0->tensorflow-intel==2.17.0->tensorflow) (3.0.0)\n", - "Requirement already satisfied: pygments<3.0.0,>=2.13.0 in c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from rich->keras>=3.2.0->tensorflow-intel==2.17.0->tensorflow) (2.18.0)\n", - "Requirement already satisfied: zipp>=0.5 in c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from importlib-metadata>=4.4->markdown>=2.6.8->tensorboard<2.18,>=2.17->tensorflow-intel==2.17.0->tensorflow) (3.20.0)\n", - "Requirement already satisfied: mdurl~=0.1 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from markdown-it-py>=2.2.0->rich->keras>=3.2.0->tensorflow-intel==2.17.0->tensorflow) (0.1.2)\n" - ] - }, - { - "name": "stderr", - "output_type": "stream", - "text": [ - "WARNING: You are using pip version 22.0.4; however, version 24.2 is available.\n", - "You should consider upgrading via the 'C:\\Users\\Asus\\AppData\\Local\\Programs\\Python\\Python39\\python.exe -m pip install --upgrade pip' command.\n" - ] - } - ], - "source": [ - "!pip install tensorflow" - ] - }, - { - "cell_type": "code", - "execution_count": 4, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "Requirement already satisfied: pgmpy in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (0.1.26)\n", - "Requirement already satisfied: google-generativeai in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pgmpy) (0.7.2)\n", - "Requirement already satisfied: scikit-learn 
in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pgmpy) (0.24.0)\n", - "Requirement already satisfied: pyparsing in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pgmpy) (3.1.2)\n", - "Requirement already satisfied: torch in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pgmpy) (2.4.0)\n", - "Requirement already satisfied: pandas in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pgmpy) (2.2.2)\n", - "Requirement already satisfied: xgboost in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pgmpy) (2.1.1)\n", - "Requirement already satisfied: joblib in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pgmpy) (1.4.2)\n", - "Requirement already satisfied: numpy in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pgmpy) (1.26.4)\n", - "Requirement already satisfied: statsmodels in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pgmpy) (0.14.2)\n", - "Requirement already satisfied: tqdm in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pgmpy) (4.66.5)\n", - "Requirement already satisfied: scipy in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pgmpy) (1.13.1)\n", - "Requirement already satisfied: networkx in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pgmpy) (3.2.1)\n", - "Requirement already satisfied: opt-einsum in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pgmpy) (3.3.0)\n", - "Requirement already satisfied: google-api-core in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from google-generativeai->pgmpy) (2.19.1)\n", - "Requirement already satisfied: 
google-api-python-client in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from google-generativeai->pgmpy) (2.140.0)\n", - "Requirement already satisfied: google-auth>=2.15.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from google-generativeai->pgmpy) (2.33.0)\n", - "Requirement already satisfied: typing-extensions in c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from google-generativeai->pgmpy) (4.12.2)\n", - "Requirement already satisfied: pydantic in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from google-generativeai->pgmpy) (2.8.2)\n", - "Requirement already satisfied: google-ai-generativelanguage==0.6.6 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from google-generativeai->pgmpy) (0.6.6)\n", - "Requirement already satisfied: protobuf in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from google-generativeai->pgmpy) (4.25.4)\n", - "Requirement already satisfied: proto-plus<2.0.0dev,>=1.22.3 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from google-ai-generativelanguage==0.6.6->google-generativeai->pgmpy) (1.24.0)\n", - "Requirement already satisfied: pytz>=2020.1 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pandas->pgmpy) (2024.1)\n", - "Requirement already satisfied: tzdata>=2022.7 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pandas->pgmpy) (2024.1)\n", - "Requirement already satisfied: python-dateutil>=2.8.2 in c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from pandas->pgmpy) (2.9.0.post0)\n", - "Requirement already satisfied: threadpoolctl>=2.0.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from scikit-learn->pgmpy) (3.5.0)\n", - "Requirement already satisfied: 
packaging>=21.3 in c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from statsmodels->pgmpy) (24.1)\n", - "Requirement already satisfied: patsy>=0.5.6 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from statsmodels->pgmpy) (0.5.6)\n", - "Requirement already satisfied: fsspec in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from torch->pgmpy) (2024.6.1)\n", - "Requirement already satisfied: sympy in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from torch->pgmpy) (1.13.2)\n", - "Requirement already satisfied: jinja2 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from torch->pgmpy) (3.1.4)\n", - "Requirement already satisfied: filelock in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from torch->pgmpy) (3.15.4)\n", - "Requirement already satisfied: colorama in c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from tqdm->pgmpy) (0.4.6)\n", - "Requirement already satisfied: rsa<5,>=3.1.4 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from google-auth>=2.15.0->google-generativeai->pgmpy) (4.9)\n", - "Requirement already satisfied: cachetools<6.0,>=2.0.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from google-auth>=2.15.0->google-generativeai->pgmpy) (5.4.0)\n", - "Requirement already satisfied: pyasn1-modules>=0.2.1 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from google-auth>=2.15.0->google-generativeai->pgmpy) (0.4.0)\n", - "Requirement already satisfied: six in c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from patsy>=0.5.6->statsmodels->pgmpy) (1.16.0)\n", - "Requirement already satisfied: requests<3.0.0.dev0,>=2.18.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from 
google-api-core->google-generativeai->pgmpy) (2.32.3)\n", - "Requirement already satisfied: googleapis-common-protos<2.0.dev0,>=1.56.2 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from google-api-core->google-generativeai->pgmpy) (1.63.2)\n", - "Requirement already satisfied: uritemplate<5,>=3.0.1 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from google-api-python-client->google-generativeai->pgmpy) (4.1.1)\n", - "Requirement already satisfied: google-auth-httplib2<1.0.0,>=0.2.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from google-api-python-client->google-generativeai->pgmpy) (0.2.0)\n", - "Requirement already satisfied: httplib2<1.dev0,>=0.19.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from google-api-python-client->google-generativeai->pgmpy) (0.22.0)\n", - "Requirement already satisfied: MarkupSafe>=2.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from jinja2->torch->pgmpy) (2.1.5)\n", - "Requirement already satisfied: annotated-types>=0.4.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pydantic->google-generativeai->pgmpy) (0.7.0)\n", - "Requirement already satisfied: pydantic-core==2.20.1 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pydantic->google-generativeai->pgmpy) (2.20.1)\n", - "Requirement already satisfied: mpmath<1.4,>=1.1.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from sympy->torch->pgmpy) (1.3.0)\n", - "Requirement already satisfied: grpcio-status<2.0.dev0,>=1.33.2 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from google-api-core->google-generativeai->pgmpy) (1.62.3)\n", - "Requirement already satisfied: grpcio<2.0dev,>=1.33.2 in 
c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from google-api-core->google-generativeai->pgmpy) (1.65.4)\n", - "Requirement already satisfied: pyasn1<0.7.0,>=0.4.6 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pyasn1-modules>=0.2.1->google-auth>=2.15.0->google-generativeai->pgmpy) (0.6.0)\n", - "Requirement already satisfied: idna<4,>=2.5 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from requests<3.0.0.dev0,>=2.18.0->google-api-core->google-generativeai->pgmpy) (3.7)\n", - "Requirement already satisfied: certifi>=2017.4.17 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from requests<3.0.0.dev0,>=2.18.0->google-api-core->google-generativeai->pgmpy) (2024.7.4)\n", - "Requirement already satisfied: charset-normalizer<4,>=2 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from requests<3.0.0.dev0,>=2.18.0->google-api-core->google-generativeai->pgmpy) (3.3.2)\n", - "Requirement already satisfied: urllib3<3,>=1.21.1 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from requests<3.0.0.dev0,>=2.18.0->google-api-core->google-generativeai->pgmpy) (1.26.19)\n" - ] - }, - { - "name": "stderr", - "output_type": "stream", - "text": [ - "WARNING: You are using pip version 22.0.4; however, version 24.2 is available.\n", - "You should consider upgrading via the 'C:\\Users\\Asus\\AppData\\Local\\Programs\\Python\\Python39\\python.exe -m pip install --upgrade pip' command.\n" - ] - } - ], - "source": [ - "!pip install pgmpy" - ] - }, - { - "cell_type": "code", - "execution_count": 5, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "Requirement already satisfied: sdv in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (1.15.0)\n", - "Requirement already satisfied: cloudpickle>=2.1.0 in 
c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from sdv) (3.0.0)\n", - "Requirement already satisfied: ctgan>=0.10.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from sdv) (0.10.1)\n", - "Requirement already satisfied: copulas>=0.11.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from sdv) (0.11.0)\n", - "Requirement already satisfied: rdt>=1.12.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from sdv) (1.12.2)\n", - "Requirement already satisfied: sdmetrics>=0.14.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from sdv) (0.15.0)\n", - "Requirement already satisfied: pyyaml>=6.0.1 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from sdv) (6.0.2)\n", - "Requirement already satisfied: tqdm>=4.29 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from sdv) (4.66.5)\n", - "Requirement already satisfied: pandas>=1.4.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from sdv) (2.2.2)\n", - "Requirement already satisfied: platformdirs>=4.0 in c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from sdv) (4.2.2)\n", - "Requirement already satisfied: deepecho>=0.6.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from sdv) (0.6.0)\n", - "Requirement already satisfied: numpy<2.0.0,>=1.21.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from sdv) (1.26.4)\n", - "Requirement already satisfied: graphviz>=0.13.2 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from sdv) (0.20.3)\n", - "Requirement already satisfied: boto3>=1.28 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from sdv) (1.34.158)\n", - "Requirement already satisfied: 
botocore>=1.31 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from sdv) (1.34.158)\n", - "Requirement already satisfied: s3transfer<0.11.0,>=0.10.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from boto3>=1.28->sdv) (0.10.2)\n", - "Requirement already satisfied: jmespath<2.0.0,>=0.7.1 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from boto3>=1.28->sdv) (1.0.1)\n", - "Requirement already satisfied: urllib3<1.27,>=1.25.4 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from botocore>=1.31->sdv) (1.26.19)\n", - "Requirement already satisfied: python-dateutil<3.0.0,>=2.1 in c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from botocore>=1.31->sdv) (2.9.0.post0)\n", - "Requirement already satisfied: plotly>=5.10.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from copulas>=0.11.0->sdv) (5.23.0)\n", - "Requirement already satisfied: scipy>=1.5.4 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from copulas>=0.11.0->sdv) (1.13.1)\n", - "Requirement already satisfied: torch>=1.9.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from ctgan>=0.10.0->sdv) (2.4.0)\n", - "Requirement already satisfied: tzdata>=2022.7 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pandas>=1.4.0->sdv) (2024.1)\n", - "Requirement already satisfied: pytz>=2020.1 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pandas>=1.4.0->sdv) (2024.1)\n", - "Collecting scikit-learn>=1.0.2\n", - " Using cached scikit_learn-1.5.1-cp39-cp39-win_amd64.whl (11.0 MB)\n", - "Requirement already satisfied: Faker>=17 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from rdt>=1.12.0->sdv) (26.3.0)\n", - "Requirement already satisfied: 
colorama in c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from tqdm>=4.29->sdv) (0.4.6)\n", - "Requirement already satisfied: tenacity>=6.2.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from plotly>=5.10.0->copulas>=0.11.0->sdv) (9.0.0)\n", - "Requirement already satisfied: packaging in c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from plotly>=5.10.0->copulas>=0.11.0->sdv) (24.1)\n", - "Requirement already satisfied: six>=1.5 in c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from python-dateutil<3.0.0,>=2.1->botocore>=1.31->sdv) (1.16.0)\n", - "Requirement already satisfied: joblib>=1.2.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from scikit-learn>=1.0.2->rdt>=1.12.0->sdv) (1.4.2)\n", - "Requirement already satisfied: threadpoolctl>=3.1.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from scikit-learn>=1.0.2->rdt>=1.12.0->sdv) (3.5.0)\n", - "Requirement already satisfied: filelock in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from torch>=1.9.0->ctgan>=0.10.0->sdv) (3.15.4)\n", - "Requirement already satisfied: fsspec in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from torch>=1.9.0->ctgan>=0.10.0->sdv) (2024.6.1)\n", - "Requirement already satisfied: networkx in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from torch>=1.9.0->ctgan>=0.10.0->sdv) (3.2.1)\n", - "Requirement already satisfied: jinja2 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from torch>=1.9.0->ctgan>=0.10.0->sdv) (3.1.4)\n", - "Requirement already satisfied: sympy in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from torch>=1.9.0->ctgan>=0.10.0->sdv) (1.13.2)\n", - "Requirement already satisfied: typing-extensions>=4.8.0 in 
c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from torch>=1.9.0->ctgan>=0.10.0->sdv) (4.12.2)\n", - "Requirement already satisfied: MarkupSafe>=2.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from jinja2->torch>=1.9.0->ctgan>=0.10.0->sdv) (2.1.5)\n", - "Requirement already satisfied: mpmath<1.4,>=1.1.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from sympy->torch>=1.9.0->ctgan>=0.10.0->sdv) (1.3.0)\n", - "Installing collected packages: scikit-learn\n", - " Attempting uninstall: scikit-learn\n", - " Found existing installation: scikit-learn 0.24.0\n", - " Uninstalling scikit-learn-0.24.0:\n", - " Successfully uninstalled scikit-learn-0.24.0\n", - "Successfully installed scikit-learn-1.5.1\n" - ] - }, - { - "name": "stderr", - "output_type": "stream", - "text": [ - "ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.\n", - "pyitlib 0.2.3 requires scikit-learn<=0.24,>=0.16.0, but you have scikit-learn 1.5.1 which is incompatible.\n", - "WARNING: You are using pip version 22.0.4; however, version 24.2 is available.\n", - "You should consider upgrading via the 'C:\\Users\\Asus\\AppData\\Local\\Programs\\Python\\Python39\\python.exe -m pip install --upgrade pip' command.\n" - ] - } - ], - "source": [ - "!pip install sdv" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "#### Importing Katabatic" - ] - }, - { - "cell_type": "code", - "execution_count": 5, - "metadata": {}, - "outputs": [], - "source": [ - "from katabatic.katabatic import Katabatic\n", - "import numpy as np\n" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Sample Training Data" - ] - }, - { - "cell_type": "code", - "execution_count": 6, - "metadata": {}, - "outputs": [], - "source": [ - "# Generating sample data for X_train\n", - 
"X_train = np.array([\n", - " [0.1, 1.2, 3.4, 4.5],\n", - " [2.3, 3.4, 1.1, 0.2],\n", - " [1.4, 0.4, 2.2, 3.3],\n", - " [3.1, 4.2, 1.0, 2.5],\n", - " [2.2, 1.1, 0.3, 4.1],\n", - " [0.5, 2.2, 4.4, 1.3],\n", - " [1.8, 3.3, 0.9, 2.2],\n", - " [2.6, 1.5, 3.3, 0.1],\n", - " [0.7, 4.4, 1.4, 3.0],\n", - " [1.9, 2.8, 3.9, 4.0]\n", - "])\n", - "\n", - "# Generating sample data for y_train\n", - "y_train = np.array([0, 1, 0, 1, 0, 1, 0, 1, 0, 1])" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Load Model and Train it on Sample Training Data" - ] - }, - { - "cell_type": "code", - "execution_count": 7, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "--------------------------\n", - "module name: katabatic.katabatic\n", - "parent process: 28468\n", - "process id: 28884\n", - "katabatic.models.ganblr.ganblr_adapter\n", - "katabatic.models.ganblr.ganblr_adapter\n", - "\n", - "[INFO] Initializing GANBLR Model\n", - "[INFO] Training GANBLR model\n", - "[SUCCESS] Model training completed\n" - ] - } - ], - "source": [ - "model = Katabatic.run_model('ganblr')\n", - "model.load_model()\n", - "model.fit(X_train,y_train)" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Generate New Data from trained model" - ] - }, - { - "cell_type": "code", - "execution_count": 8, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "[INFO] Generating data using GANBLR model\n", - "[SUCCESS] Data generation completed\n" - ] - }, - { - "data": { - "text/plain": [ - "array([[3.1, 1.1, 0.3, 4.0, 0],\n", - " [0.7, 1.5, 0.3, 3.3, 1],\n", - " [0.7, 1.1, 2.2, 2.5, 1],\n", - " [2.6, 2.8, 1.1, 0.1, 0],\n", - " [2.6, 0.4, 4.4, 4.5, 1]], dtype=object)" - ] - }, - "execution_count": 8, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "model.generate(size=5)" - ] - }, - { - "cell_type": "code", - "execution_count": null, - "metadata": {}, - 
"outputs": [], - "source": [ - "# Future import statement ideas\n", - "\n", - "import katabatic as kb\n", - "from katabatic.models import meg\n", - "from katabatic.models import ganblr\n", - "from katabatic.evaluate import eval_method1\n", - "from katabatic.utils.preprocessing import data_processing_method1 # good place to store preprocessing utilities" - ] - } - ], - "metadata": { - "kernelspec": { - "display_name": "Python 3", - "language": "python", - "name": "python3" - }, - "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 3 - }, - "file_extension": ".py", - "mimetype": "text/x-python", - "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython3", - "version": "3.9.13" - } - }, - "nbformat": 4, - "nbformat_minor": 2 -} diff --git a/Archive/tryKatabatic_Ganblr++.ipynb b/Archive/tryKatabatic_Ganblr++.ipynb deleted file mode 100644 index 0468c72..0000000 --- a/Archive/tryKatabatic_Ganblr++.ipynb +++ /dev/null @@ -1,698 +0,0 @@ -{ - "cells": [ - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "# Katabatic Demo Usage" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "#### Importing Dependenciencies" - ] - }, - { - "cell_type": "code", - "execution_count": 1, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "Requirement already satisfied: scikit-learn in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (0.24.0)\n", - "Collecting scikit-learn\n", - " Using cached scikit_learn-1.5.1-cp39-cp39-win_amd64.whl (11.0 MB)\n", - "Requirement already satisfied: scipy>=1.6.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from scikit-learn) (1.13.1)\n", - "Requirement already satisfied: joblib>=1.2.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from scikit-learn) (1.4.2)\n", - "Requirement already satisfied: numpy>=1.19.5 in 
c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from scikit-learn) (1.26.4)\n", - "Requirement already satisfied: threadpoolctl>=3.1.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from scikit-learn) (3.5.0)\n", - "Installing collected packages: scikit-learn\n", - " Attempting uninstall: scikit-learn\n", - " Found existing installation: scikit-learn 0.24.0\n", - " Uninstalling scikit-learn-0.24.0:\n", - " Successfully uninstalled scikit-learn-0.24.0\n", - "Successfully installed scikit-learn-1.5.1\n" - ] - }, - { - "name": "stderr", - "output_type": "stream", - "text": [ - "ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.\n", - "pyitlib 0.2.3 requires scikit-learn<=0.24,>=0.16.0, but you have scikit-learn 1.5.1 which is incompatible.\n", - "WARNING: You are using pip version 22.0.4; however, version 24.2 is available.\n", - "You should consider upgrading via the 'C:\\Users\\Asus\\AppData\\Local\\Programs\\Python\\Python39\\python.exe -m pip install --upgrade pip' command.\n" - ] - } - ], - "source": [ - "!pip install -U scikit-learn" - ] - }, - { - "cell_type": "code", - "execution_count": 2, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "Requirement already satisfied: pyitlib in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (0.2.3)\n", - "Requirement already satisfied: pandas>=0.20.2 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pyitlib) (2.2.2)\n", - "Requirement already satisfied: numpy>=1.9.2 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pyitlib) (1.26.4)\n", - "Collecting scikit-learn<=0.24,>=0.16.0\n", - " Using cached scikit_learn-0.24.0-cp39-cp39-win_amd64.whl (6.9 MB)\n", - "Requirement already 
satisfied: scipy>=1.0.1 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pyitlib) (1.13.1)\n", - "Requirement already satisfied: future>=0.16.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pyitlib) (1.0.0)\n", - "Requirement already satisfied: tzdata>=2022.7 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pandas>=0.20.2->pyitlib) (2024.1)\n", - "Requirement already satisfied: python-dateutil>=2.8.2 in c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from pandas>=0.20.2->pyitlib) (2.9.0.post0)\n", - "Requirement already satisfied: pytz>=2020.1 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pandas>=0.20.2->pyitlib) (2024.1)\n", - "Requirement already satisfied: threadpoolctl>=2.0.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from scikit-learn<=0.24,>=0.16.0->pyitlib) (3.5.0)\n", - "Requirement already satisfied: joblib>=0.11 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from scikit-learn<=0.24,>=0.16.0->pyitlib) (1.4.2)\n", - "Requirement already satisfied: six>=1.5 in c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from python-dateutil>=2.8.2->pandas>=0.20.2->pyitlib) (1.16.0)\n", - "Installing collected packages: scikit-learn\n", - " Attempting uninstall: scikit-learn\n", - " Found existing installation: scikit-learn 1.5.1\n", - " Uninstalling scikit-learn-1.5.1:\n", - " Successfully uninstalled scikit-learn-1.5.1\n", - "Successfully installed scikit-learn-0.24.0\n" - ] - }, - { - "name": "stderr", - "output_type": "stream", - "text": [ - "ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. 
This behaviour is the source of the following dependency conflicts.\n", - "sdmetrics 0.15.0 requires scikit-learn>=1.0.2; python_version < \"3.10\", but you have scikit-learn 0.24.0 which is incompatible.\n", - "rdt 1.12.2 requires scikit-learn>=1.0.2; python_version < \"3.10\", but you have scikit-learn 0.24.0 which is incompatible.\n", - "WARNING: You are using pip version 22.0.4; however, version 24.2 is available.\n", - "You should consider upgrading via the 'C:\\Users\\Asus\\AppData\\Local\\Programs\\Python\\Python39\\python.exe -m pip install --upgrade pip' command.\n" - ] - } - ], - "source": [ - "!pip install pyitlib" - ] - }, - { - "cell_type": "code", - "execution_count": 3, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "Requirement already satisfied: tensorflow in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (2.17.0)\n", - "Requirement already satisfied: tensorflow-intel==2.17.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow) (2.17.0)\n", - "Requirement already satisfied: numpy<2.0.0,>=1.23.5 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (1.26.4)\n", - "Requirement already satisfied: typing-extensions>=3.6.6 in c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (4.12.2)\n", - "Requirement already satisfied: google-pasta>=0.1.1 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (0.2.0)\n", - "Requirement already satisfied: tensorflow-io-gcs-filesystem>=0.23.1 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (0.31.0)\n", - "Requirement already satisfied: ml-dtypes<0.5.0,>=0.3.1 in 
c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (0.4.0)\n", - "Requirement already satisfied: h5py>=3.10.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (3.11.0)\n", - "Requirement already satisfied: six>=1.12.0 in c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (1.16.0)\n", - "Requirement already satisfied: requests<3,>=2.21.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (2.32.3)\n", - "Requirement already satisfied: keras>=3.2.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (3.4.1)\n", - "Requirement already satisfied: flatbuffers>=24.3.25 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (24.3.25)\n", - "Requirement already satisfied: termcolor>=1.1.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (2.4.0)\n", - "Requirement already satisfied: wrapt>=1.11.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (1.16.0)\n", - "Requirement already satisfied: grpcio<2.0,>=1.24.3 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (1.65.4)\n", - "Requirement already satisfied: opt-einsum>=2.3.2 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (3.3.0)\n", - "Requirement already satisfied: setuptools in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from 
tensorflow-intel==2.17.0->tensorflow) (58.1.0)\n", - "Requirement already satisfied: packaging in c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (24.1)\n", - "Requirement already satisfied: absl-py>=1.0.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (2.1.0)\n", - "Requirement already satisfied: gast!=0.5.0,!=0.5.1,!=0.5.2,>=0.2.1 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (0.6.0)\n", - "Requirement already satisfied: protobuf!=4.21.0,!=4.21.1,!=4.21.2,!=4.21.3,!=4.21.4,!=4.21.5,<5.0.0dev,>=3.20.3 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (4.25.4)\n", - "Requirement already satisfied: libclang>=13.0.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (18.1.1)\n", - "Requirement already satisfied: tensorboard<2.18,>=2.17 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (2.17.0)\n", - "Requirement already satisfied: astunparse>=1.6.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (1.6.3)\n", - "Requirement already satisfied: wheel<1.0,>=0.23.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from astunparse>=1.6.0->tensorflow-intel==2.17.0->tensorflow) (0.44.0)\n", - "Requirement already satisfied: rich in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from keras>=3.2.0->tensorflow-intel==2.17.0->tensorflow) (13.7.1)\n", - "Requirement already satisfied: namex in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from 
keras>=3.2.0->tensorflow-intel==2.17.0->tensorflow) (0.0.8)\n", - "Requirement already satisfied: optree in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from keras>=3.2.0->tensorflow-intel==2.17.0->tensorflow) (0.12.1)\n", - "Requirement already satisfied: charset-normalizer<4,>=2 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from requests<3,>=2.21.0->tensorflow-intel==2.17.0->tensorflow) (3.3.2)\n", - "Requirement already satisfied: idna<4,>=2.5 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from requests<3,>=2.21.0->tensorflow-intel==2.17.0->tensorflow) (3.7)\n", - "Requirement already satisfied: certifi>=2017.4.17 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from requests<3,>=2.21.0->tensorflow-intel==2.17.0->tensorflow) (2024.7.4)\n", - "Requirement already satisfied: urllib3<3,>=1.21.1 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from requests<3,>=2.21.0->tensorflow-intel==2.17.0->tensorflow) (1.26.19)\n", - "Requirement already satisfied: werkzeug>=1.0.1 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorboard<2.18,>=2.17->tensorflow-intel==2.17.0->tensorflow) (3.0.3)\n", - "Requirement already satisfied: tensorboard-data-server<0.8.0,>=0.7.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorboard<2.18,>=2.17->tensorflow-intel==2.17.0->tensorflow) (0.7.2)\n", - "Requirement already satisfied: markdown>=2.6.8 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorboard<2.18,>=2.17->tensorflow-intel==2.17.0->tensorflow) (3.6)\n", - "Requirement already satisfied: importlib-metadata>=4.4 in c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from markdown>=2.6.8->tensorboard<2.18,>=2.17->tensorflow-intel==2.17.0->tensorflow) 
(8.2.0)\n", - "Requirement already satisfied: MarkupSafe>=2.1.1 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from werkzeug>=1.0.1->tensorboard<2.18,>=2.17->tensorflow-intel==2.17.0->tensorflow) (2.1.5)\n", - "Requirement already satisfied: markdown-it-py>=2.2.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from rich->keras>=3.2.0->tensorflow-intel==2.17.0->tensorflow) (3.0.0)\n", - "Requirement already satisfied: pygments<3.0.0,>=2.13.0 in c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from rich->keras>=3.2.0->tensorflow-intel==2.17.0->tensorflow) (2.18.0)\n", - "Requirement already satisfied: zipp>=0.5 in c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from importlib-metadata>=4.4->markdown>=2.6.8->tensorboard<2.18,>=2.17->tensorflow-intel==2.17.0->tensorflow) (3.20.0)\n", - "Requirement already satisfied: mdurl~=0.1 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from markdown-it-py>=2.2.0->rich->keras>=3.2.0->tensorflow-intel==2.17.0->tensorflow) (0.1.2)\n" - ] - }, - { - "name": "stderr", - "output_type": "stream", - "text": [ - "WARNING: You are using pip version 22.0.4; however, version 24.2 is available.\n", - "You should consider upgrading via the 'C:\\Users\\Asus\\AppData\\Local\\Programs\\Python\\Python39\\python.exe -m pip install --upgrade pip' command.\n" - ] - } - ], - "source": [ - "!pip install tensorflow" - ] - }, - { - "cell_type": "code", - "execution_count": 4, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "Requirement already satisfied: pgmpy in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (0.1.26)\n", - "Requirement already satisfied: google-generativeai in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pgmpy) (0.7.2)\n", - "Requirement already satisfied: scikit-learn 
in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pgmpy) (0.24.0)\n", - "Requirement already satisfied: pyparsing in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pgmpy) (3.1.2)\n", - "Requirement already satisfied: torch in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pgmpy) (2.4.0)\n", - "Requirement already satisfied: pandas in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pgmpy) (2.2.2)\n", - "Requirement already satisfied: xgboost in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pgmpy) (2.1.1)\n", - "Requirement already satisfied: joblib in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pgmpy) (1.4.2)\n", - "Requirement already satisfied: numpy in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pgmpy) (1.26.4)\n", - "Requirement already satisfied: statsmodels in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pgmpy) (0.14.2)\n", - "Requirement already satisfied: tqdm in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pgmpy) (4.66.5)\n", - "Requirement already satisfied: scipy in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pgmpy) (1.13.1)\n", - "Requirement already satisfied: networkx in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pgmpy) (3.2.1)\n", - "Requirement already satisfied: opt-einsum in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pgmpy) (3.3.0)\n", - "Requirement already satisfied: google-api-core in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from google-generativeai->pgmpy) (2.19.1)\n", - "Requirement already satisfied: 
google-api-python-client in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from google-generativeai->pgmpy) (2.140.0)\n", - "Requirement already satisfied: google-auth>=2.15.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from google-generativeai->pgmpy) (2.33.0)\n", - "Requirement already satisfied: typing-extensions in c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from google-generativeai->pgmpy) (4.12.2)\n", - "Requirement already satisfied: pydantic in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from google-generativeai->pgmpy) (2.8.2)\n", - "Requirement already satisfied: google-ai-generativelanguage==0.6.6 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from google-generativeai->pgmpy) (0.6.6)\n", - "Requirement already satisfied: protobuf in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from google-generativeai->pgmpy) (4.25.4)\n", - "Requirement already satisfied: proto-plus<2.0.0dev,>=1.22.3 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from google-ai-generativelanguage==0.6.6->google-generativeai->pgmpy) (1.24.0)\n", - "Requirement already satisfied: pytz>=2020.1 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pandas->pgmpy) (2024.1)\n", - "Requirement already satisfied: tzdata>=2022.7 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pandas->pgmpy) (2024.1)\n", - "Requirement already satisfied: python-dateutil>=2.8.2 in c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from pandas->pgmpy) (2.9.0.post0)\n", - "Requirement already satisfied: threadpoolctl>=2.0.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from scikit-learn->pgmpy) (3.5.0)\n", - "Requirement already satisfied: 
packaging>=21.3 in c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from statsmodels->pgmpy) (24.1)\n", - "Requirement already satisfied: patsy>=0.5.6 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from statsmodels->pgmpy) (0.5.6)\n", - "Requirement already satisfied: fsspec in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from torch->pgmpy) (2024.6.1)\n", - "Requirement already satisfied: sympy in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from torch->pgmpy) (1.13.2)\n", - "Requirement already satisfied: jinja2 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from torch->pgmpy) (3.1.4)\n", - "Requirement already satisfied: filelock in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from torch->pgmpy) (3.15.4)\n", - "Requirement already satisfied: colorama in c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from tqdm->pgmpy) (0.4.6)\n", - "Requirement already satisfied: rsa<5,>=3.1.4 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from google-auth>=2.15.0->google-generativeai->pgmpy) (4.9)\n", - "Requirement already satisfied: cachetools<6.0,>=2.0.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from google-auth>=2.15.0->google-generativeai->pgmpy) (5.4.0)\n", - "Requirement already satisfied: pyasn1-modules>=0.2.1 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from google-auth>=2.15.0->google-generativeai->pgmpy) (0.4.0)\n", - "Requirement already satisfied: six in c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from patsy>=0.5.6->statsmodels->pgmpy) (1.16.0)\n", - "Requirement already satisfied: requests<3.0.0.dev0,>=2.18.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from 
google-api-core->google-generativeai->pgmpy) (2.32.3)\n", - "Requirement already satisfied: googleapis-common-protos<2.0.dev0,>=1.56.2 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from google-api-core->google-generativeai->pgmpy) (1.63.2)\n", - "Requirement already satisfied: uritemplate<5,>=3.0.1 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from google-api-python-client->google-generativeai->pgmpy) (4.1.1)\n", - "Requirement already satisfied: google-auth-httplib2<1.0.0,>=0.2.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from google-api-python-client->google-generativeai->pgmpy) (0.2.0)\n", - "Requirement already satisfied: httplib2<1.dev0,>=0.19.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from google-api-python-client->google-generativeai->pgmpy) (0.22.0)\n", - "Requirement already satisfied: MarkupSafe>=2.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from jinja2->torch->pgmpy) (2.1.5)\n", - "Requirement already satisfied: annotated-types>=0.4.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pydantic->google-generativeai->pgmpy) (0.7.0)\n", - "Requirement already satisfied: pydantic-core==2.20.1 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pydantic->google-generativeai->pgmpy) (2.20.1)\n", - "Requirement already satisfied: mpmath<1.4,>=1.1.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from sympy->torch->pgmpy) (1.3.0)\n", - "Requirement already satisfied: grpcio-status<2.0.dev0,>=1.33.2 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from google-api-core->google-generativeai->pgmpy) (1.62.3)\n", - "Requirement already satisfied: grpcio<2.0dev,>=1.33.2 in 
c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from google-api-core->google-generativeai->pgmpy) (1.65.4)\n", - "Requirement already satisfied: pyasn1<0.7.0,>=0.4.6 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pyasn1-modules>=0.2.1->google-auth>=2.15.0->google-generativeai->pgmpy) (0.6.0)\n", - "Requirement already satisfied: idna<4,>=2.5 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from requests<3.0.0.dev0,>=2.18.0->google-api-core->google-generativeai->pgmpy) (3.7)\n", - "Requirement already satisfied: certifi>=2017.4.17 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from requests<3.0.0.dev0,>=2.18.0->google-api-core->google-generativeai->pgmpy) (2024.7.4)\n", - "Requirement already satisfied: charset-normalizer<4,>=2 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from requests<3.0.0.dev0,>=2.18.0->google-api-core->google-generativeai->pgmpy) (3.3.2)\n", - "Requirement already satisfied: urllib3<3,>=1.21.1 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from requests<3.0.0.dev0,>=2.18.0->google-api-core->google-generativeai->pgmpy) (1.26.19)\n" - ] - }, - { - "name": "stderr", - "output_type": "stream", - "text": [ - "WARNING: You are using pip version 22.0.4; however, version 24.2 is available.\n", - "You should consider upgrading via the 'C:\\Users\\Asus\\AppData\\Local\\Programs\\Python\\Python39\\python.exe -m pip install --upgrade pip' command.\n" - ] - } - ], - "source": [ - "!pip install pgmpy" - ] - }, - { - "cell_type": "code", - "execution_count": 5, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "Requirement already satisfied: sdv in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (1.15.0)\n", - "Requirement already satisfied: cloudpickle>=2.1.0 in 
c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from sdv) (3.0.0)\n", - "Requirement already satisfied: ctgan>=0.10.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from sdv) (0.10.1)\n", - "Requirement already satisfied: copulas>=0.11.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from sdv) (0.11.0)\n", - "Requirement already satisfied: rdt>=1.12.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from sdv) (1.12.2)\n", - "Requirement already satisfied: sdmetrics>=0.14.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from sdv) (0.15.0)\n", - "Requirement already satisfied: pyyaml>=6.0.1 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from sdv) (6.0.2)\n", - "Requirement already satisfied: tqdm>=4.29 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from sdv) (4.66.5)\n", - "Requirement already satisfied: pandas>=1.4.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from sdv) (2.2.2)\n", - "Requirement already satisfied: platformdirs>=4.0 in c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from sdv) (4.2.2)\n", - "Requirement already satisfied: deepecho>=0.6.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from sdv) (0.6.0)\n", - "Requirement already satisfied: numpy<2.0.0,>=1.21.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from sdv) (1.26.4)\n", - "Requirement already satisfied: graphviz>=0.13.2 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from sdv) (0.20.3)\n", - "Requirement already satisfied: boto3>=1.28 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from sdv) (1.34.158)\n", - "Requirement already satisfied: 
botocore>=1.31 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from sdv) (1.34.158)\n", - "Requirement already satisfied: s3transfer<0.11.0,>=0.10.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from boto3>=1.28->sdv) (0.10.2)\n", - "Requirement already satisfied: jmespath<2.0.0,>=0.7.1 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from boto3>=1.28->sdv) (1.0.1)\n", - "Requirement already satisfied: urllib3<1.27,>=1.25.4 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from botocore>=1.31->sdv) (1.26.19)\n", - "Requirement already satisfied: python-dateutil<3.0.0,>=2.1 in c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from botocore>=1.31->sdv) (2.9.0.post0)\n", - "Requirement already satisfied: plotly>=5.10.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from copulas>=0.11.0->sdv) (5.23.0)\n", - "Requirement already satisfied: scipy>=1.5.4 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from copulas>=0.11.0->sdv) (1.13.1)\n", - "Requirement already satisfied: torch>=1.9.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from ctgan>=0.10.0->sdv) (2.4.0)\n", - "Requirement already satisfied: tzdata>=2022.7 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pandas>=1.4.0->sdv) (2024.1)\n", - "Requirement already satisfied: pytz>=2020.1 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pandas>=1.4.0->sdv) (2024.1)\n", - "Collecting scikit-learn>=1.0.2\n", - " Using cached scikit_learn-1.5.1-cp39-cp39-win_amd64.whl (11.0 MB)\n", - "Requirement already satisfied: Faker>=17 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from rdt>=1.12.0->sdv) (26.3.0)\n", - "Requirement already satisfied: 
colorama in c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from tqdm>=4.29->sdv) (0.4.6)\n", - "Requirement already satisfied: tenacity>=6.2.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from plotly>=5.10.0->copulas>=0.11.0->sdv) (9.0.0)\n", - "Requirement already satisfied: packaging in c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from plotly>=5.10.0->copulas>=0.11.0->sdv) (24.1)\n", - "Requirement already satisfied: six>=1.5 in c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from python-dateutil<3.0.0,>=2.1->botocore>=1.31->sdv) (1.16.0)\n", - "Requirement already satisfied: joblib>=1.2.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from scikit-learn>=1.0.2->rdt>=1.12.0->sdv) (1.4.2)\n", - "Requirement already satisfied: threadpoolctl>=3.1.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from scikit-learn>=1.0.2->rdt>=1.12.0->sdv) (3.5.0)\n", - "Requirement already satisfied: filelock in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from torch>=1.9.0->ctgan>=0.10.0->sdv) (3.15.4)\n", - "Requirement already satisfied: fsspec in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from torch>=1.9.0->ctgan>=0.10.0->sdv) (2024.6.1)\n", - "Requirement already satisfied: networkx in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from torch>=1.9.0->ctgan>=0.10.0->sdv) (3.2.1)\n", - "Requirement already satisfied: jinja2 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from torch>=1.9.0->ctgan>=0.10.0->sdv) (3.1.4)\n", - "Requirement already satisfied: sympy in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from torch>=1.9.0->ctgan>=0.10.0->sdv) (1.13.2)\n", - "Requirement already satisfied: typing-extensions>=4.8.0 in 
c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from torch>=1.9.0->ctgan>=0.10.0->sdv) (4.12.2)\n", - "Requirement already satisfied: MarkupSafe>=2.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from jinja2->torch>=1.9.0->ctgan>=0.10.0->sdv) (2.1.5)\n", - "Requirement already satisfied: mpmath<1.4,>=1.1.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from sympy->torch>=1.9.0->ctgan>=0.10.0->sdv) (1.3.0)\n", - "Installing collected packages: scikit-learn\n", - " Attempting uninstall: scikit-learn\n", - " Found existing installation: scikit-learn 0.24.0\n", - " Uninstalling scikit-learn-0.24.0:\n", - " Successfully uninstalled scikit-learn-0.24.0\n", - "Successfully installed scikit-learn-1.5.1\n" - ] - }, - { - "name": "stderr", - "output_type": "stream", - "text": [ - "ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.\n", - "pyitlib 0.2.3 requires scikit-learn<=0.24,>=0.16.0, but you have scikit-learn 1.5.1 which is incompatible.\n", - "WARNING: You are using pip version 22.0.4; however, version 24.2 is available.\n", - "You should consider upgrading via the 'C:\\Users\\Asus\\AppData\\Local\\Programs\\Python\\Python39\\python.exe -m pip install --upgrade pip' command.\n" - ] - } - ], - "source": [ - "!pip install sdv" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "#### Importing Katabatic" - ] - }, - { - "cell_type": "code", - "execution_count": 4, - "metadata": {}, - "outputs": [], - "source": [ - "from katabatic.katabatic import Katabatic\n", - "import numpy as np\n" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Sample Training Data" - ] - }, - { - "cell_type": "code", - "execution_count": 5, - "metadata": {}, - "outputs": [ - { - "data": { - "text/plain": [ - "array([ 0, 2, 4, 10, 11, 
12], dtype=int64)" - ] - }, - "execution_count": 5, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "from katabatic.models.ganblrpp.utils import get_demo_data\n", - "df = get_demo_data('adult-raw')\n", - "df.head()\n", - "\n", - "from sklearn.model_selection import train_test_split\n", - "x, y = df.values[:,:-1], df.values[:,-1]\n", - "X_train, X_test, y_train, y_test = train_test_split(x, y, test_size=0.5)\n", - "\n", - "import numpy as np\n", - "def is_numerical(dtype):\n", - " '''\n", - " If the dtype kind is one of ['signed-integer', 'unsigned-integer', 'floating point'] ('i', 'u', or 'f'), we recognize it as numerical.\n", - " \n", - " Reference: https://numpy.org/doc/stable/reference/generated/numpy.dtype.kind.html#numpy.dtype.kind\n", - " '''\n", - " return dtype.kind in 'iuf'\n", - "\n", - "column_is_numerical = df.dtypes.apply(is_numerical).values\n", - "numerical = np.argwhere(column_is_numerical).ravel()\n", - "numerical\n" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Load Model and Train it on Sample Training Data" - ] - }, - { - "cell_type": "code", - "execution_count": 6, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "[INFO] Initializing GANBLR++ Model\n", - "[INFO] Training GANBLR++ model\n" - ] - }, - { - "name": "stderr", - "output_type": "stream", - "text": [ - "c:\\Users\\Asus\\AppData\\Local\\Programs\\Python\\Python39\\lib\\site-packages\\sklearn\\mixture\\_base.py:270: ConvergenceWarning: Best performing initialization did not converge. Try different init parameters, or increase max_iter, tol, or check for degenerate data.\n", - " warnings.warn(\n", - "c:\\Users\\Asus\\AppData\\Local\\Programs\\Python\\Python39\\lib\\site-packages\\sklearn\\mixture\\_base.py:270: ConvergenceWarning: Best performing initialization did not converge. 
Try different init parameters, or increase max_iter, tol, or check for degenerate data.\n", - " warnings.warn(\n", - "c:\\Users\\Asus\\AppData\\Local\\Programs\\Python\\Python39\\lib\\site-packages\\keras\\src\\layers\\core\\dense.py:87: UserWarning: Do not pass an `input_shape`/`input_dim` argument to a layer. When using Sequential models, prefer using an `Input(shape)` object as the first layer in the model instead.\n", - " super().__init__(activity_regularizer=activity_regularizer, **kwargs)\n", - "c:\\Users\\Asus\\AppData\\Local\\Programs\\Python\\Python39\\lib\\site-packages\\keras\\src\\layers\\core\\dense.py:87: UserWarning: Do not pass an `input_shape`/`input_dim` argument to a layer. When using Sequential models, prefer using an `Input(shape)` object as the first layer in the model instead.\n", - " super().__init__(activity_regularizer=activity_regularizer, **kwargs)\n", - "c:\\Users\\Asus\\AppData\\Local\\Programs\\Python\\Python39\\lib\\site-packages\\keras\\src\\layers\\core\\dense.py:87: UserWarning: Do not pass an `input_shape`/`input_dim` argument to a layer. When using Sequential models, prefer using an `Input(shape)` object as the first layer in the model instead.\n", - " super().__init__(activity_regularizer=activity_regularizer, **kwargs)\n", - "c:\\Users\\Asus\\AppData\\Local\\Programs\\Python\\Python39\\lib\\site-packages\\keras\\src\\layers\\core\\dense.py:87: UserWarning: Do not pass an `input_shape`/`input_dim` argument to a layer. When using Sequential models, prefer using an `Input(shape)` object as the first layer in the model instead.\n", - " super().__init__(activity_regularizer=activity_regularizer, **kwargs)\n", - "c:\\Users\\Asus\\AppData\\Local\\Programs\\Python\\Python39\\lib\\site-packages\\keras\\src\\layers\\core\\dense.py:87: UserWarning: Do not pass an `input_shape`/`input_dim` argument to a layer. 
When using Sequential models, prefer using an `Input(shape)` object as the first layer in the model instead.\n", - " super().__init__(activity_regularizer=activity_regularizer, **kwargs)\n", - "c:\\Users\\Asus\\AppData\\Local\\Programs\\Python\\Python39\\lib\\site-packages\\keras\\src\\layers\\core\\dense.py:87: UserWarning: Do not pass an `input_shape`/`input_dim` argument to a layer. When using Sequential models, prefer using an `Input(shape)` object as the first layer in the model instead.\n", - " super().__init__(activity_regularizer=activity_regularizer, **kwargs)\n", - "c:\\Users\\Asus\\AppData\\Local\\Programs\\Python\\Python39\\lib\\site-packages\\keras\\src\\layers\\core\\dense.py:87: UserWarning: Do not pass an `input_shape`/`input_dim` argument to a layer. When using Sequential models, prefer using an `Input(shape)` object as the first layer in the model instead.\n", - " super().__init__(activity_regularizer=activity_regularizer, **kwargs)\n", - "c:\\Users\\Asus\\AppData\\Local\\Programs\\Python\\Python39\\lib\\site-packages\\keras\\src\\layers\\core\\dense.py:87: UserWarning: Do not pass an `input_shape`/`input_dim` argument to a layer. When using Sequential models, prefer using an `Input(shape)` object as the first layer in the model instead.\n", - " super().__init__(activity_regularizer=activity_regularizer, **kwargs)\n", - "c:\\Users\\Asus\\AppData\\Local\\Programs\\Python\\Python39\\lib\\site-packages\\keras\\src\\layers\\core\\dense.py:87: UserWarning: Do not pass an `input_shape`/`input_dim` argument to a layer. When using Sequential models, prefer using an `Input(shape)` object as the first layer in the model instead.\n", - " super().__init__(activity_regularizer=activity_regularizer, **kwargs)\n", - "c:\\Users\\Asus\\AppData\\Local\\Programs\\Python\\Python39\\lib\\site-packages\\keras\\src\\layers\\core\\dense.py:87: UserWarning: Do not pass an `input_shape`/`input_dim` argument to a layer. 
When using Sequential models, prefer using an `Input(shape)` object as the first layer in the model instead.\n", - " super().__init__(activity_regularizer=activity_regularizer, **kwargs)\n", - "c:\\Users\\Asus\\Documents\\GitHub\\Katabatic\\katabatic\\models\\ganblrpp\\ganblr.py:76: RuntimeWarning: divide by zero encountered in log\n", - " ls = np.mean(-np.log(np.subtract(1, prob_fake)))\n", - "c:\\Users\\Asus\\AppData\\Local\\Programs\\Python\\Python39\\lib\\site-packages\\keras\\src\\layers\\core\\dense.py:87: UserWarning: Do not pass an `input_shape`/`input_dim` argument to a layer. When using Sequential models, prefer using an `Input(shape)` object as the first layer in the model instead.\n", - " super().__init__(activity_regularizer=activity_regularizer, **kwargs)\n" - ] - }, - { - "name": "stdout", - "output_type": "stream", - "text": [ - "[SUCCESS] Model training completed\n" - ] - } - ], - "source": [ - "from katabatic.models.ganblrpp.ganblrpp_adapter import GanblrppAdapter\n", - "adapter = GanblrppAdapter(numerical_columns=numerical)\n", - "adapter.load_model()\n", - "adapter.fit(X_train, y_train, epochs=10)" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Generate New Data from trained model" - ] - }, - { - "cell_type": "code", - "execution_count": 7, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "[INFO] Generating data using GANBLR++ model\n" - ] - }, - { - "name": "stderr", - "output_type": "stream", - "text": [ - "sampling: 0%| | 0/6 [00:00\n", - "\n", - "\n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", 
- " [HTML table rendering omitted: markup was stripped in extraction; the same generated rows appear in the text/plain output below]
\n", - "" - ], - "text/plain": [ - " 0 1 2 3 4 \\\n", - "0 45.885403 Without-pay 233248.633406 HS-grad 2.32256 \n", - "1 18.452572 Self-emp-not-inc 666991.853154 7th-8th 9.993113 \n", - "2 47.058095 Federal-gov 106445.233982 Bachelors 13.789144 \n", - "3 58.60026 Self-emp-not-inc 291209.293581 Bachelors 6.997363 \n", - "4 19.788386 Self-emp-inc 324818.618985 1st-4th 7.011698 \n", - "\n", - " 5 6 7 8 \\\n", - "0 Separated ? Husband White \n", - "1 Widowed Exec-managerial Husband White \n", - "2 Married-civ-spouse Protective-serv Unmarried Amer-Indian-Eskimo \n", - "3 Widowed Craft-repair Other-relative Black \n", - "4 Widowed Adm-clerical Unmarried Other \n", - "\n", - " 9 10 11 12 13 14 \n", - "0 Male 39305.04659 1252.469616 43.084273 Guatemala <=50K \n", - "1 Male 653.651821 675.077681 23.136626 Italy <=50K \n", - "2 Female -3824.036178 1176.26665 41.029158 Laos <=50K \n", - "3 Female 7189.3822 875.043373 0.13387 China <=50K \n", - "4 Male -550.316674 465.036024 20.56991 Greece <=50K " - ] - }, - "execution_count": 7, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "adapter.generate(size=5)" - ] - }, - { - "cell_type": "code", - "execution_count": null, - "metadata": {}, - "outputs": [], - "source": [ - "# Future import statement ideas\n", - "\n", - "import katabatic as kb\n", - "from katabatic.models import meg\n", - "from katabatic.models import ganblr\n", - "from katabatic.evaluate import eval_method1\n", - "from katabatic.utils.preprocessing import data_processing_method1 # good place to store preprocessing utilities" - ] - } - ], - "metadata": { - "kernelspec": { - "display_name": "Python 3", - "language": "python", - "name": "python3" - }, - "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 3 - }, - "file_extension": ".py", - "mimetype": "text/x-python", - "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython3", - "version": "3.9.13" - } - }, - "nbformat": 4, - 
"nbformat_minor": 2 -} diff --git a/Benchmark/tryKatabatic_Ganblr++_BM.ipynb b/Benchmark/tryKatabatic_Ganblr++_BM.ipynb deleted file mode 100644 index d0fd819..0000000 --- a/Benchmark/tryKatabatic_Ganblr++_BM.ipynb +++ /dev/null @@ -1,683 +0,0 @@ -{ - "cells": [ - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "# Katabatic Demo Usage" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "#### Importing Dependencies" - ] - }, - { - "cell_type": "code", - "execution_count": null, - "metadata": {}, - "outputs": [], - "source": [ - "!pip install -U scikit-learn" - ] - }, - { - "cell_type": "code", - "execution_count": null, - "metadata": {}, - "outputs": [], - "source": [ - "!pip install pyitlib" - ] - }, - { - "cell_type": "code", - "execution_count": null, - "metadata": {}, - "outputs": [], - "source": [ - "!pip install tensorflow" - ] - }, - { - "cell_type": "code", - "execution_count": null, - "metadata": {}, - "outputs": [], - "source": [ - "!pip install pgmpy" - ] - }, - { - "cell_type": "code", - "execution_count": null, - "metadata": {}, - "outputs": [], - "source": [ - "!pip install sdv" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "#### Importing Katabatic" - ] - }, - { - "cell_type": "code", - "execution_count": 3, - "metadata": {}, - "outputs": [], - "source": [ - "from katabatic.katabatic import Katabatic\n", - "import numpy as np\n" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Sample Training Data" - ] - }, - { - "cell_type": "code", - "execution_count": 4, - "metadata": {}, - "outputs": [ - { - "data": { - "text/plain": [ - "array([ 0, 2, 4, 10, 11, 12], dtype=int64)" - ] - }, - "execution_count": 4, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "from katabatic.models.ganblrpp.utils import get_demo_data\n", - "real_data = get_demo_data('adult-raw')\n", - "real_data.head()\n", - "\n", - "from sklearn.model_selection import 
train_test_split\n", - "x, y = real_data.values[:,:-1], real_data.values[:,-1]\n", - "X_train, X_test, y_train, y_test = train_test_split(x, y, test_size=0.5)\n", - "\n", - "import numpy as np\n", - "def is_numerical(dtype):\n", - " '''\n", - " If the dtype kind is one of ['signed-integer', 'unsigned-integer', 'floating point'] ('i', 'u', or 'f'), we recognize it as numerical.\n", - " \n", - " Reference: https://numpy.org/doc/stable/reference/generated/numpy.dtype.kind.html#numpy.dtype.kind\n", - " '''\n", - " return dtype.kind in 'iuf'\n", - "\n", - "column_is_numerical = real_data.dtypes.apply(is_numerical).values\n", - "numerical = np.argwhere(column_is_numerical).ravel()\n", - "numerical" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Load Model and Train it on Sample Training Data" - ] - }, - { - "cell_type": "code", - "execution_count": 5, - "metadata": {}, - "outputs": [ - { - "name": "stderr", - "output_type": "stream", - "text": [ - "c:\\Users\\Asus\\AppData\\Local\\Programs\\Python\\Python39\\lib\\site-packages\\tqdm\\auto.py:21: TqdmWarning: IProgress not found. Please update jupyter and ipywidgets. See https://ipywidgets.readthedocs.io/en/stable/user_install.html\n", - " from .autonotebook import tqdm as notebook_tqdm\n" - ] - }, - { - "name": "stdout", - "output_type": "stream", - "text": [ - "[INFO] Initializing GANBLR++ Model\n", - "[INFO] Training GANBLR++ model\n" - ] - }, - { - "name": "stderr", - "output_type": "stream", - "text": [ - "c:\\Users\\Asus\\AppData\\Local\\Programs\\Python\\Python39\\lib\\site-packages\\sklearn\\mixture\\_base.py:270: ConvergenceWarning: Best performing initialization did not converge. Try different init parameters, or increase max_iter, tol, or check for degenerate data.\n", - " warnings.warn(\n", - "c:\\Users\\Asus\\AppData\\Local\\Programs\\Python\\Python39\\lib\\site-packages\\sklearn\\mixture\\_base.py:270: ConvergenceWarning: Best performing initialization did not converge. 
Try different init parameters, or increase max_iter, tol, or check for degenerate data.\n", - " warnings.warn(\n", - "c:\\Users\\Asus\\AppData\\Local\\Programs\\Python\\Python39\\lib\\site-packages\\keras\\src\\layers\\core\\dense.py:87: UserWarning: Do not pass an `input_shape`/`input_dim` argument to a layer. When using Sequential models, prefer using an `Input(shape)` object as the first layer in the model instead.\n", - " super().__init__(activity_regularizer=activity_regularizer, **kwargs)\n", - "c:\\Users\\Asus\\AppData\\Local\\Programs\\Python\\Python39\\lib\\site-packages\\keras\\src\\layers\\core\\dense.py:87: UserWarning: Do not pass an `input_shape`/`input_dim` argument to a layer. When using Sequential models, prefer using an `Input(shape)` object as the first layer in the model instead.\n", - " super().__init__(activity_regularizer=activity_regularizer, **kwargs)\n", - "c:\\Users\\Asus\\AppData\\Local\\Programs\\Python\\Python39\\lib\\site-packages\\keras\\src\\layers\\core\\dense.py:87: UserWarning: Do not pass an `input_shape`/`input_dim` argument to a layer. When using Sequential models, prefer using an `Input(shape)` object as the first layer in the model instead.\n", - " super().__init__(activity_regularizer=activity_regularizer, **kwargs)\n", - "c:\\Users\\Asus\\Documents\\GitHub\\Katabatic\\katabatic\\models\\ganblrpp\\ganblr.py:76: RuntimeWarning: divide by zero encountered in log\n", - " ls = np.mean(-np.log(np.subtract(1, prob_fake)))\n", - "c:\\Users\\Asus\\AppData\\Local\\Programs\\Python\\Python39\\lib\\site-packages\\keras\\src\\layers\\core\\dense.py:87: UserWarning: Do not pass an `input_shape`/`input_dim` argument to a layer. 
When using Sequential models, prefer using an `Input(shape)` object as the first layer in the model instead.\n", - " super().__init__(activity_regularizer=activity_regularizer, **kwargs)\n", - "c:\\Users\\Asus\\AppData\\Local\\Programs\\Python\\Python39\\lib\\site-packages\\keras\\src\\layers\\core\\dense.py:87: UserWarning: Do not pass an `input_shape`/`input_dim` argument to a layer. When using Sequential models, prefer using an `Input(shape)` object as the first layer in the model instead.\n", - " super().__init__(activity_regularizer=activity_regularizer, **kwargs)\n", - "c:\\Users\\Asus\\AppData\\Local\\Programs\\Python\\Python39\\lib\\site-packages\\keras\\src\\layers\\core\\dense.py:87: UserWarning: Do not pass an `input_shape`/`input_dim` argument to a layer. When using Sequential models, prefer using an `Input(shape)` object as the first layer in the model instead.\n", - " super().__init__(activity_regularizer=activity_regularizer, **kwargs)\n", - "c:\\Users\\Asus\\AppData\\Local\\Programs\\Python\\Python39\\lib\\site-packages\\keras\\src\\layers\\core\\dense.py:87: UserWarning: Do not pass an `input_shape`/`input_dim` argument to a layer. When using Sequential models, prefer using an `Input(shape)` object as the first layer in the model instead.\n", - " super().__init__(activity_regularizer=activity_regularizer, **kwargs)\n", - "c:\\Users\\Asus\\AppData\\Local\\Programs\\Python\\Python39\\lib\\site-packages\\keras\\src\\layers\\core\\dense.py:87: UserWarning: Do not pass an `input_shape`/`input_dim` argument to a layer. When using Sequential models, prefer using an `Input(shape)` object as the first layer in the model instead.\n", - " super().__init__(activity_regularizer=activity_regularizer, **kwargs)\n", - "c:\\Users\\Asus\\AppData\\Local\\Programs\\Python\\Python39\\lib\\site-packages\\keras\\src\\layers\\core\\dense.py:87: UserWarning: Do not pass an `input_shape`/`input_dim` argument to a layer. 
When using Sequential models, prefer using an `Input(shape)` object as the first layer in the model instead.\n", - " super().__init__(activity_regularizer=activity_regularizer, **kwargs)\n", - "c:\\Users\\Asus\\Documents\\GitHub\\Katabatic\\katabatic\\models\\ganblrpp\\ganblr.py:76: RuntimeWarning: divide by zero encountered in log\n", - " ls = np.mean(-np.log(np.subtract(1, prob_fake)))\n", - "c:\\Users\\Asus\\AppData\\Local\\Programs\\Python\\Python39\\lib\\site-packages\\keras\\src\\layers\\core\\dense.py:87: UserWarning: Do not pass an `input_shape`/`input_dim` argument to a layer. When using Sequential models, prefer using an `Input(shape)` object as the first layer in the model instead.\n", - " super().__init__(activity_regularizer=activity_regularizer, **kwargs)\n", - "c:\\Users\\Asus\\AppData\\Local\\Programs\\Python\\Python39\\lib\\site-packages\\keras\\src\\layers\\core\\dense.py:87: UserWarning: Do not pass an `input_shape`/`input_dim` argument to a layer. When using Sequential models, prefer using an `Input(shape)` object as the first layer in the model instead.\n", - " super().__init__(activity_regularizer=activity_regularizer, **kwargs)\n" - ] - }, - { - "name": "stdout", - "output_type": "stream", - "text": [ - "[SUCCESS] Model training completed\n" - ] - } - ], - "source": [ - "from katabatic.models.ganblrpp.ganblrpp_adapter import GanblrppAdapter\n", - "adapter = GanblrppAdapter(numerical_columns=numerical)\n", - "adapter.load_model()\n", - "adapter.fit(X_train, y_train, epochs=10)\n" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Generate New Data from trained model" - ] - }, - { - "cell_type": "code", - "execution_count": 6, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "[INFO] Generating data using GANBLR++ model\n" - ] - }, - { - "name": "stderr", - "output_type": "stream", - "text": [ - "sampling: 100%|██████████| 6/6 [00:00<00:00, 41.19it/s]" - ] - }, - { - 
"name": "stdout", - "output_type": "stream", - "text": [ - "[SUCCESS] Data generation completed\n" - ] - }, - { - "name": "stderr", - "output_type": "stream", - "text": [ - "\n" - ] - }, - { - "data": { - "text/html": [ - "
[HTML table rendering omitted: markup was stripped in extraction; the same rows appear in the text/plain output below]
" - ], - "text/plain": [ - " age workclass fnlwgt education education-num marital-status \\\n", - "0 NaN NaN NaN NaN NaN NaN \n", - "1 NaN NaN NaN NaN NaN NaN \n", - "2 NaN NaN NaN NaN NaN NaN \n", - "3 NaN NaN NaN NaN NaN NaN \n", - "4 NaN NaN NaN NaN NaN NaN \n", - "5 NaN NaN NaN NaN NaN NaN \n", - "6 NaN NaN NaN NaN NaN NaN \n", - "7 NaN NaN NaN NaN NaN NaN \n", - "8 NaN NaN NaN NaN NaN NaN \n", - "9 NaN NaN NaN NaN NaN NaN \n", - "\n", - " occupation relationship race sex capital-gain capital-loss \\\n", - "0 NaN NaN NaN NaN NaN NaN \n", - "1 NaN NaN NaN NaN NaN NaN \n", - "2 NaN NaN NaN NaN NaN NaN \n", - "3 NaN NaN NaN NaN NaN NaN \n", - "4 NaN NaN NaN NaN NaN NaN \n", - "5 NaN NaN NaN NaN NaN NaN \n", - "6 NaN NaN NaN NaN NaN NaN \n", - "7 NaN NaN NaN NaN NaN NaN \n", - "8 NaN NaN NaN NaN NaN NaN \n", - "9 NaN NaN NaN NaN NaN NaN \n", - "\n", - " hours-per-week native-country class \n", - "0 NaN NaN NaN \n", - "1 NaN NaN NaN \n", - "2 NaN NaN NaN \n", - "3 NaN NaN NaN \n", - "4 NaN NaN NaN \n", - "5 NaN NaN NaN \n", - "6 NaN NaN NaN \n", - "7 NaN NaN NaN \n", - "8 NaN NaN NaN \n", - "9 NaN NaN NaN " - ] - }, - "execution_count": 6, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "import pandas as pd\n", - "syn_data = adapter.generate(size= 50000)\n", - "pd.DataFrame(syn_data, columns=real_data.columns).head(10)" - ] - }, - { - "cell_type": "code", - "execution_count": null, - "metadata": {}, - "outputs": [], - "source": [ - "from katabatic.metrics import tstr_logreg, tstr_mlp, tstr_rf, tstr_xgbt, trtr_jsd, trtr_wd\n", - "import pandas as pd\n", - "\n", - "# Ensure syn_data and real_data are numpy arrays\n", - "if isinstance(syn_data, pd.DataFrame):\n", - " X_synthetic, y_synthetic = syn_data.iloc[:, :-1].values, syn_data.iloc[:, -1].values\n", - "else:\n", - " X_synthetic, y_synthetic = syn_data[:, :-1], syn_data[:, -1]\n", - "\n", - "if isinstance(real_data, pd.DataFrame):\n", - " X_real, y_real = real_data.iloc[:, :-1].values, 
real_data.iloc[:, -1].values\n", - "else:\n", - " X_real, y_real = real_data[:, :-1], real_data[:, -1]\n", - "\n", - "# Convert numpy arrays back to DataFrames and Series\n", - "X_synthetic_df = pd.DataFrame(X_synthetic)\n", - "y_synthetic_df = pd.Series(y_synthetic)\n", - "X_real_df = pd.DataFrame(X_real)\n", - "y_real_df = pd.Series(y_real)\n", - "\n", - "# Evaluate using the different models\n", - "acc_score_lr = tstr_logreg.evaluate(X_synthetic_df, y_synthetic_df, X_real_df, y_real_df)\n", - "acc_score_mlp = tstr_mlp.evaluate(X_synthetic_df, y_synthetic_df, X_real_df, y_real_df)\n", - "acc_score_rf = tstr_rf.evaluate(X_synthetic_df, y_synthetic_df, X_real_df, y_real_df)\n", - "acc_score_xgbt = tstr_xgbt.evaluate(X_synthetic_df, y_synthetic_df, X_real_df, y_real_df)\n", - "jsd_value = trtr_jsd.evaluate(X_synthetic_df, y_synthetic_df, X_real_df, y_real_df)\n", - "wd_value = trtr_wd.evaluate(X_synthetic_df, y_synthetic_df, X_real_df, y_real_df)\n", - "\n", - "# Print the results with 4 decimal places\n", - "print(f\"Accuracy with Logistic Regression: {acc_score_lr:.4f}\")\n", - "print(f\"Accuracy with MLP: {acc_score_mlp:.4f}\")\n", - "print(f\"Accuracy with Random Forest: {acc_score_rf:.4f}\")\n", - "print(f\"Accuracy with XgboostTree: {acc_score_xgbt:.4f}\")\n", - "print(f\"Jensen-Shannon Divergence: {jsd_value:.4f}\")\n", - "print(f\"Wasserstein Distance: {wd_value:.4f}\")" - ] - }, - { - "cell_type": "code", - "execution_count": null, - "metadata": {}, - "outputs": [], - "source": [ - "import numpy as np\n", - "import pandas as pd\n", - "from katabatic.models.ganblrpp.utils import get_demo_data\n", - "from sklearn.model_selection import train_test_split, KFold\n", - "from katabatic.models.ganblrpp.ganblrpp_adapter import GanblrppAdapter\n", - "from katabatic.metrics import tstr_logreg, tstr_mlp, tstr_rf, tstr_xgbt, trtr_jsd, trtr_wd\n", - "\n", - "# Load and prepare the data\n", - "real_data = get_demo_data('adult-raw')\n", - "x, y = real_data.values[:, :-1], 
real_data.values[:, -1]\n", - "\n", - "# Define a function to check if a dtype is numerical\n", - "def is_numerical(dtype):\n", - " return dtype.kind in 'iuf'\n", - "\n", - "# Get numerical columns\n", - "column_is_numerical = real_data.dtypes.apply(is_numerical).values\n", - "numerical = np.argwhere(column_is_numerical).ravel()\n", - "\n", - "# Initialize metrics accumulators\n", - "acc_scores_lr = []\n", - "acc_scores_mlp = []\n", - "acc_scores_rf = []\n", - "acc_scores_xgbt = []\n", - "jsd_values = []\n", - "wd_values = []\n", - "\n", - "# Set up 2-fold cross-validation\n", - "kf = KFold(n_splits=2, shuffle=True)\n", - "\n", - "# Repeat the experiment 3 times\n", - "for repeat in range(3):\n", - " print(f\"Repeat {repeat + 1}\")\n", - " \n", - " for train_index, test_index in kf.split(x):\n", - " X_train, X_test = x[train_index], x[test_index]\n", - " y_train, y_test = y[train_index], y[test_index]\n", - " \n", - " # Initialize and fit the model\n", - " adapter = GanblrppAdapter(numerical_columns=numerical)\n", - " adapter.load_model()\n", - " adapter.fit(X_train, y_train, epochs=100)\n", - " \n", - " # Generate synthetic data\n", - " syn_data = adapter.generate(size=50000)\n", - " \n", - " # Prepare the synthetic and real datasets for evaluation\n", - " X_synthetic, y_synthetic = syn_data[:, :-1], syn_data[:, -1]\n", - " X_real, y_real = x, y\n", - " \n", - " # Convert numpy arrays to DataFrames and Series\n", - " X_synthetic_df = pd.DataFrame(X_synthetic)\n", - " y_synthetic_df = pd.Series(y_synthetic)\n", - " X_real_df = pd.DataFrame(X_real)\n", - " y_real_df = pd.Series(y_real)\n", - " \n", - " # Evaluate using the different models\n", - " acc_scores_lr.append(tstr_logreg.evaluate(X_synthetic_df, y_synthetic_df, X_real_df, y_real_df))\n", - " acc_scores_mlp.append(tstr_mlp.evaluate(X_synthetic_df, y_synthetic_df, X_real_df, y_real_df))\n", - " acc_scores_rf.append(tstr_rf.evaluate(X_synthetic_df, y_synthetic_df, X_real_df, y_real_df))\n", - " 
acc_scores_xgbt.append(tstr_xgbt.evaluate(X_synthetic_df, y_synthetic_df, X_real_df, y_real_df))\n", - " jsd_values.append(trtr_jsd.evaluate(X_synthetic_df, y_synthetic_df, X_real_df, y_real_df))\n", - " wd_values.append(trtr_wd.evaluate(X_synthetic_df, y_synthetic_df, X_real_df, y_real_df))\n", - "\n", - "# Calculate average results\n", - "avg_acc_score_lr = np.mean(acc_scores_lr)\n", - "avg_acc_score_mlp = np.mean(acc_scores_mlp)\n", - "avg_acc_score_rf = np.mean(acc_scores_rf)\n", - "avg_acc_score_xgbt = np.mean(acc_scores_xgbt)\n", - "avg_jsd_value = np.mean(jsd_values)\n", - "avg_wd_value = np.mean(wd_values)\n", - "\n", - "# Print the averaged results with 4 decimal places\n", - "print(f\"Average Accuracy with Logistic Regression: {avg_acc_score_lr:.4f}\")\n", - "print(f\"Average Accuracy with MLP: {avg_acc_score_mlp:.4f}\")\n", - "print(f\"Average Accuracy with Random Forest: {avg_acc_score_rf:.4f}\")\n", - "print(f\"Average Accuracy with XgboostTree: {avg_acc_score_xgbt:.4f}\")\n", - "print(f\"Average Jensen-Shannon Divergence: {avg_jsd_value:.4f}\")\n", - "print(f\"Average Wasserstein Distance: {avg_wd_value:.4f}\")\n" - ] - }, - { - "cell_type": "code", - "execution_count": null, - "metadata": {}, - "outputs": [], - "source": [ - "# Future import statement ideas\n", - "\n", - "import katabatic as kb\n", - "from katabatic.models import meg\n", - "from katabatic.models import ganblr\n", - "from katabatic.evaluate import eval_method1\n", - "from katabatic.utils.preprocessing import data_processing_method1 # good place to store preprocessing utilities" - ] - } - ], - "metadata": { - "kernelspec": { - "display_name": "Python 3", - "language": "python", - "name": "python3" - }, - "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 3 - }, - "file_extension": ".py", - "mimetype": "text/x-python", - "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython3", - "version": "3.9.13" - } - }, - "nbformat": 4, - 
"nbformat_minor": 2 -} diff --git a/Benchmark/tryKatabatic_Ganblr_BM.ipynb b/Benchmark/tryKatabatic_Ganblr_BM.ipynb deleted file mode 100644 index a091b07..0000000 --- a/Benchmark/tryKatabatic_Ganblr_BM.ipynb +++ /dev/null @@ -1,1189 +0,0 @@ -{ - "cells": [ - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "# Katabatic Benchmarking" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "#### Importing Dependencies" - ] - }, - { - "cell_type": "code", - "execution_count": 50, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "Requirement already satisfied: scikit-learn in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (1.5.1)\n", - "Requirement already satisfied: scipy>=1.6.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from scikit-learn) (1.13.1)\n", - "Requirement already satisfied: joblib>=1.2.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from scikit-learn) (1.4.2)\n", - "Requirement already satisfied: threadpoolctl>=3.1.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from scikit-learn) (3.5.0)\n", - "Requirement already satisfied: numpy>=1.19.5 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from scikit-learn) (1.26.4)\n" - ] - }, - { - "name": "stderr", - "output_type": "stream", - "text": [ - "WARNING: You are using pip version 22.0.4; however, version 24.2 is available.\n", - "You should consider upgrading via the 'C:\\Users\\Asus\\AppData\\Local\\Programs\\Python\\Python39\\python.exe -m pip install --upgrade pip' command.\n" - ] - } - ], - "source": [ - "!pip install -U scikit-learn" - ] - }, - { - "cell_type": "code", - "execution_count": 51, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "Requirement already satisfied: pyitlib in 
c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (0.2.3)\n", - "Requirement already satisfied: pandas>=0.20.2 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pyitlib) (2.2.2)\n", - "Requirement already satisfied: numpy>=1.9.2 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pyitlib) (1.26.4)\n", - "Collecting scikit-learn<=0.24,>=0.16.0\n", - " Using cached scikit_learn-0.24.0-cp39-cp39-win_amd64.whl (6.9 MB)\n", - "Requirement already satisfied: scipy>=1.0.1 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pyitlib) (1.13.1)\n", - "Requirement already satisfied: future>=0.16.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pyitlib) (1.0.0)\n", - "Requirement already satisfied: pytz>=2020.1 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pandas>=0.20.2->pyitlib) (2024.1)\n", - "Requirement already satisfied: tzdata>=2022.7 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pandas>=0.20.2->pyitlib) (2024.1)\n", - "Requirement already satisfied: python-dateutil>=2.8.2 in c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from pandas>=0.20.2->pyitlib) (2.9.0.post0)\n", - "Requirement already satisfied: joblib>=0.11 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from scikit-learn<=0.24,>=0.16.0->pyitlib) (1.4.2)\n", - "Requirement already satisfied: threadpoolctl>=2.0.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from scikit-learn<=0.24,>=0.16.0->pyitlib) (3.5.0)\n", - "Requirement already satisfied: six>=1.5 in c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from python-dateutil>=2.8.2->pandas>=0.20.2->pyitlib) (1.16.0)\n", - "Installing collected packages: scikit-learn\n", - " Attempting 
uninstall: scikit-learn\n", - " Found existing installation: scikit-learn 1.5.1\n", - " Uninstalling scikit-learn-1.5.1:\n", - " Successfully uninstalled scikit-learn-1.5.1\n" - ] - }, - { - "name": "stderr", - "output_type": "stream", - "text": [ - "ERROR: Could not install packages due to an OSError: [WinError 5] Access is denied: 'C:\\\\Users\\\\Asus\\\\AppData\\\\Local\\\\Programs\\\\Python\\\\Python39\\\\Lib\\\\site-packages\\\\~klearn\\\\.libs\\\\msvcp140.dll'\n", - "Consider using the `--user` option or check the permissions.\n", - "\n", - "WARNING: You are using pip version 22.0.4; however, version 24.2 is available.\n", - "You should consider upgrading via the 'C:\\Users\\Asus\\AppData\\Local\\Programs\\Python\\Python39\\python.exe -m pip install --upgrade pip' command.\n" - ] - } - ], - "source": [ - "!pip install pyitlib" - ] - }, - { - "cell_type": "code", - "execution_count": 52, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "Requirement already satisfied: tensorflow in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (2.17.0)\n", - "Requirement already satisfied: tensorflow-intel==2.17.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow) (2.17.0)\n", - "Requirement already satisfied: requests<3,>=2.21.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (2.32.3)\n", - "Requirement already satisfied: opt-einsum>=2.3.2 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (3.3.0)\n", - "Requirement already satisfied: tensorflow-io-gcs-filesystem>=0.23.1 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (0.31.0)\n", - "Requirement already satisfied: typing-extensions>=3.6.6 in 
c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (4.12.2)\n", - "Requirement already satisfied: setuptools in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (58.1.0)\n", - "Requirement already satisfied: gast!=0.5.0,!=0.5.1,!=0.5.2,>=0.2.1 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (0.6.0)\n", - "Requirement already satisfied: flatbuffers>=24.3.25 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (24.3.25)\n", - "Requirement already satisfied: protobuf!=4.21.0,!=4.21.1,!=4.21.2,!=4.21.3,!=4.21.4,!=4.21.5,<5.0.0dev,>=3.20.3 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (4.25.4)\n", - "Requirement already satisfied: grpcio<2.0,>=1.24.3 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (1.65.4)\n", - "Requirement already satisfied: termcolor>=1.1.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (2.4.0)\n", - "Requirement already satisfied: h5py>=3.10.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (3.11.0)\n", - "Requirement already satisfied: wrapt>=1.11.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (1.16.0)\n", - "Requirement already satisfied: numpy<2.0.0,>=1.23.5 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (1.26.4)\n", - "Requirement already satisfied: packaging in 
c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (24.1)\n", - "Requirement already satisfied: keras>=3.2.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (3.4.1)\n", - "Requirement already satisfied: six>=1.12.0 in c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (1.16.0)\n", - "Requirement already satisfied: absl-py>=1.0.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (2.1.0)\n", - "Requirement already satisfied: libclang>=13.0.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (18.1.1)\n", - "Requirement already satisfied: google-pasta>=0.1.1 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (0.2.0)\n", - "Requirement already satisfied: tensorboard<2.18,>=2.17 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (2.17.0)\n", - "Requirement already satisfied: ml-dtypes<0.5.0,>=0.3.1 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (0.4.0)\n", - "Requirement already satisfied: astunparse>=1.6.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (1.6.3)\n", - "Requirement already satisfied: wheel<1.0,>=0.23.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from astunparse>=1.6.0->tensorflow-intel==2.17.0->tensorflow) (0.44.0)\n", - "Requirement already satisfied: optree in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from 
keras>=3.2.0->tensorflow-intel==2.17.0->tensorflow) (0.12.1)\n", - "Requirement already satisfied: namex in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from keras>=3.2.0->tensorflow-intel==2.17.0->tensorflow) (0.0.8)\n", - "Requirement already satisfied: rich in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from keras>=3.2.0->tensorflow-intel==2.17.0->tensorflow) (13.7.1)\n", - "Requirement already satisfied: urllib3<3,>=1.21.1 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from requests<3,>=2.21.0->tensorflow-intel==2.17.0->tensorflow) (1.26.19)\n", - "Requirement already satisfied: idna<4,>=2.5 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from requests<3,>=2.21.0->tensorflow-intel==2.17.0->tensorflow) (3.7)\n", - "Requirement already satisfied: certifi>=2017.4.17 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from requests<3,>=2.21.0->tensorflow-intel==2.17.0->tensorflow) (2024.7.4)\n", - "Requirement already satisfied: charset-normalizer<4,>=2 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from requests<3,>=2.21.0->tensorflow-intel==2.17.0->tensorflow) (3.3.2)\n", - "Requirement already satisfied: tensorboard-data-server<0.8.0,>=0.7.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorboard<2.18,>=2.17->tensorflow-intel==2.17.0->tensorflow) (0.7.2)\n", - "Requirement already satisfied: markdown>=2.6.8 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorboard<2.18,>=2.17->tensorflow-intel==2.17.0->tensorflow) (3.6)\n", - "Requirement already satisfied: werkzeug>=1.0.1 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorboard<2.18,>=2.17->tensorflow-intel==2.17.0->tensorflow) (3.0.3)\n", - "Requirement already 
satisfied: importlib-metadata>=4.4 in c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from markdown>=2.6.8->tensorboard<2.18,>=2.17->tensorflow-intel==2.17.0->tensorflow) (8.2.0)\n", - "Requirement already satisfied: MarkupSafe>=2.1.1 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from werkzeug>=1.0.1->tensorboard<2.18,>=2.17->tensorflow-intel==2.17.0->tensorflow) (2.1.5)\n", - "Requirement already satisfied: markdown-it-py>=2.2.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from rich->keras>=3.2.0->tensorflow-intel==2.17.0->tensorflow) (3.0.0)\n", - "Requirement already satisfied: pygments<3.0.0,>=2.13.0 in c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from rich->keras>=3.2.0->tensorflow-intel==2.17.0->tensorflow) (2.18.0)\n", - "Requirement already satisfied: zipp>=0.5 in c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from importlib-metadata>=4.4->markdown>=2.6.8->tensorboard<2.18,>=2.17->tensorflow-intel==2.17.0->tensorflow) (3.20.0)\n", - "Requirement already satisfied: mdurl~=0.1 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from markdown-it-py>=2.2.0->rich->keras>=3.2.0->tensorflow-intel==2.17.0->tensorflow) (0.1.2)\n" - ] - }, - { - "name": "stderr", - "output_type": "stream", - "text": [ - "WARNING: You are using pip version 22.0.4; however, version 24.2 is available.\n", - "You should consider upgrading via the 'C:\\Users\\Asus\\AppData\\Local\\Programs\\Python\\Python39\\python.exe -m pip install --upgrade pip' command.\n" - ] - } - ], - "source": [ - "!pip install tensorflow" - ] - }, - { - "cell_type": "code", - "execution_count": null, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "Requirement already satisfied: pgmpy in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (0.1.26)\n", - "Requirement 
already satisfied: numpy in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pgmpy) (1.26.4)\n", - "Requirement already satisfied: scipy in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pgmpy) (1.13.1)\n", - "Requirement already satisfied: tqdm in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pgmpy) (4.66.5)\n", - "Requirement already satisfied: google-generativeai in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pgmpy) (0.7.2)\n", - "Requirement already satisfied: joblib in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pgmpy) (1.4.2)\n", - "Requirement already satisfied: statsmodels in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pgmpy) (0.14.2)\n", - "Requirement already satisfied: opt-einsum in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pgmpy) (3.3.0)\n", - "Requirement already satisfied: xgboost in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pgmpy) (2.1.1)\n", - "Requirement already satisfied: torch in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pgmpy) (2.4.0)\n", - "Requirement already satisfied: pyparsing in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pgmpy) (3.1.2)\n", - "Requirement already satisfied: networkx in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pgmpy) (3.2.1)\n", - "Requirement already satisfied: pandas in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pgmpy) (2.2.2)\n", - "Requirement already satisfied: scikit-learn in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pgmpy) (0.24.0)\n", - "Requirement already satisfied: 
typing-extensions in c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from google-generativeai->pgmpy) (4.12.2)\n", - "Requirement already satisfied: google-ai-generativelanguage==0.6.6 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from google-generativeai->pgmpy) (0.6.6)\n", - "Requirement already satisfied: pydantic in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from google-generativeai->pgmpy) (2.8.2)\n", - "Requirement already satisfied: google-api-python-client in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from google-generativeai->pgmpy) (2.140.0)\n", - "Requirement already satisfied: google-api-core in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from google-generativeai->pgmpy) (2.19.1)\n", - "Requirement already satisfied: protobuf in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from google-generativeai->pgmpy) (4.25.4)\n", - "Requirement already satisfied: google-auth>=2.15.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from google-generativeai->pgmpy) (2.33.0)\n", - "Requirement already satisfied: proto-plus<2.0.0dev,>=1.22.3 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from google-ai-generativelanguage==0.6.6->google-generativeai->pgmpy) (1.24.0)\n", - "Requirement already satisfied: python-dateutil>=2.8.2 in c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from pandas->pgmpy) (2.9.0.post0)\n", - "Requirement already satisfied: pytz>=2020.1 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pandas->pgmpy) (2024.1)\n", - "Requirement already satisfied: tzdata>=2022.7 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pandas->pgmpy) (2024.1)\n", - "Requirement already satisfied: 
threadpoolctl>=2.0.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from scikit-learn->pgmpy) (3.5.0)\n", - "Requirement already satisfied: packaging>=21.3 in c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from statsmodels->pgmpy) (24.1)\n", - "Requirement already satisfied: patsy>=0.5.6 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from statsmodels->pgmpy) (0.5.6)\n", - "Requirement already satisfied: jinja2 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from torch->pgmpy) (3.1.4)\n", - "Requirement already satisfied: filelock in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from torch->pgmpy) (3.15.4)\n", - "Requirement already satisfied: sympy in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from torch->pgmpy) (1.13.2)\n", - "Requirement already satisfied: fsspec in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from torch->pgmpy) (2024.6.1)\n", - "Requirement already satisfied: colorama in c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from tqdm->pgmpy) (0.4.6)\n", - "Requirement already satisfied: cachetools<6.0,>=2.0.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from google-auth>=2.15.0->google-generativeai->pgmpy) (5.4.0)\n", - "Requirement already satisfied: rsa<5,>=3.1.4 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from google-auth>=2.15.0->google-generativeai->pgmpy) (4.9)\n", - "Requirement already satisfied: pyasn1-modules>=0.2.1 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from google-auth>=2.15.0->google-generativeai->pgmpy) (0.4.0)\n", - "Requirement already satisfied: six in c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from patsy>=0.5.6->statsmodels->pgmpy) 
(1.16.0)\n", - "Requirement already satisfied: googleapis-common-protos<2.0.dev0,>=1.56.2 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from google-api-core->google-generativeai->pgmpy) (1.63.2)\n", - "Requirement already satisfied: requests<3.0.0.dev0,>=2.18.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from google-api-core->google-generativeai->pgmpy) (2.32.3)\n", - "Requirement already satisfied: uritemplate<5,>=3.0.1 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from google-api-python-client->google-generativeai->pgmpy) (4.1.1)\n", - "Requirement already satisfied: google-auth-httplib2<1.0.0,>=0.2.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from google-api-python-client->google-generativeai->pgmpy) (0.2.0)\n", - "Requirement already satisfied: httplib2<1.dev0,>=0.19.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from google-api-python-client->google-generativeai->pgmpy) (0.22.0)\n", - "Requirement already satisfied: MarkupSafe>=2.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from jinja2->torch->pgmpy) (2.1.5)\n", - "Requirement already satisfied: annotated-types>=0.4.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pydantic->google-generativeai->pgmpy) (0.7.0)\n", - "Requirement already satisfied: pydantic-core==2.20.1 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pydantic->google-generativeai->pgmpy) (2.20.1)\n", - "Requirement already satisfied: mpmath<1.4,>=1.1.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from sympy->torch->pgmpy) (1.3.0)\n", - "Requirement already satisfied: grpcio<2.0dev,>=1.33.2 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from 
google-api-core->google-generativeai->pgmpy) (1.65.4)\n", - "Requirement already satisfied: grpcio-status<2.0.dev0,>=1.33.2 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from google-api-core->google-generativeai->pgmpy) (1.62.3)\n", - "Requirement already satisfied: pyasn1<0.7.0,>=0.4.6 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pyasn1-modules>=0.2.1->google-auth>=2.15.0->google-generativeai->pgmpy) (0.6.0)\n", - "Requirement already satisfied: idna<4,>=2.5 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from requests<3.0.0.dev0,>=2.18.0->google-api-core->google-generativeai->pgmpy) (3.7)\n", - "Requirement already satisfied: urllib3<3,>=1.21.1 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from requests<3.0.0.dev0,>=2.18.0->google-api-core->google-generativeai->pgmpy) (1.26.19)\n", - "Requirement already satisfied: certifi>=2017.4.17 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from requests<3.0.0.dev0,>=2.18.0->google-api-core->google-generativeai->pgmpy) (2024.7.4)\n", - "Requirement already satisfied: charset-normalizer<4,>=2 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from requests<3.0.0.dev0,>=2.18.0->google-api-core->google-generativeai->pgmpy) (3.3.2)\n" - ] - }, - { - "name": "stderr", - "output_type": "stream", - "text": [ - "WARNING: You are using pip version 22.0.4; however, version 24.2 is available.\n", - "You should consider upgrading via the 'C:\\Users\\Asus\\AppData\\Local\\Programs\\Python\\Python39\\python.exe -m pip install --upgrade pip' command.\n" - ] - } - ], - "source": [ - "!pip install pgmpy" - ] - }, - { - "cell_type": "code", - "execution_count": 54, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "^C\n" - ] - }, - { - "name": "stderr", - "output_type": 
"stream", - "text": [ - "ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.\n", - "pyitlib 0.2.3 requires scikit-learn<=0.24,>=0.16.0, but you have scikit-learn 1.5.1 which is incompatible.\n", - "WARNING: You are using pip version 22.0.4; however, version 24.2 is available.\n", - "You should consider upgrading via the 'C:\\Users\\Asus\\AppData\\Local\\Programs\\Python\\Python39\\python.exe -m pip install --upgrade pip' command.\n" - ] - }, - { - "name": "stdout", - "output_type": "stream", - "text": [ - "Requirement already satisfied: sdv in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (1.15.0)\n", - "Requirement already satisfied: botocore>=1.31 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from sdv) (1.34.158)\n", - "Requirement already satisfied: tqdm>=4.29 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from sdv) (4.66.5)\n", - "Requirement already satisfied: copulas>=0.11.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from sdv) (0.11.0)\n", - "Requirement already satisfied: deepecho>=0.6.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from sdv) (0.6.0)\n", - "Requirement already satisfied: pandas>=1.4.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from sdv) (2.2.2)\n", - "Requirement already satisfied: ctgan>=0.10.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from sdv) (0.10.1)\n", - "Requirement already satisfied: numpy<2.0.0,>=1.21.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from sdv) (1.26.4)\n", - "Requirement already satisfied: boto3>=1.28 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from sdv) 
(1.34.158)\n", - "Requirement already satisfied: platformdirs>=4.0 in c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from sdv) (4.2.2)\n", - "Requirement already satisfied: cloudpickle>=2.1.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from sdv) (3.0.0)\n", - "Requirement already satisfied: rdt>=1.12.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from sdv) (1.12.2)\n", - "Requirement already satisfied: sdmetrics>=0.14.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from sdv) (0.15.0)\n", - "Requirement already satisfied: graphviz>=0.13.2 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from sdv) (0.20.3)\n", - "Requirement already satisfied: pyyaml>=6.0.1 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from sdv) (6.0.2)\n", - "Requirement already satisfied: jmespath<2.0.0,>=0.7.1 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from boto3>=1.28->sdv) (1.0.1)\n", - "Requirement already satisfied: s3transfer<0.11.0,>=0.10.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from boto3>=1.28->sdv) (0.10.2)\n", - "Requirement already satisfied: python-dateutil<3.0.0,>=2.1 in c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from botocore>=1.31->sdv) (2.9.0.post0)\n", - "Requirement already satisfied: urllib3<1.27,>=1.25.4 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from botocore>=1.31->sdv) (1.26.19)\n", - "Requirement already satisfied: scipy>=1.5.4 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from copulas>=0.11.0->sdv) (1.13.1)\n", - "Requirement already satisfied: plotly>=5.10.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from copulas>=0.11.0->sdv) 
(5.23.0)\n", - "Requirement already satisfied: torch>=1.9.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from ctgan>=0.10.0->sdv) (2.4.0)\n", - "Requirement already satisfied: tzdata>=2022.7 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pandas>=1.4.0->sdv) (2024.1)\n", - "Requirement already satisfied: pytz>=2020.1 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pandas>=1.4.0->sdv) (2024.1)\n", - "Collecting scikit-learn>=1.0.2\n", - " Using cached scikit_learn-1.5.1-cp39-cp39-win_amd64.whl (11.0 MB)\n", - "Requirement already satisfied: Faker>=17 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from rdt>=1.12.0->sdv) (26.3.0)\n", - "Requirement already satisfied: colorama in c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from tqdm>=4.29->sdv) (0.4.6)\n", - "Requirement already satisfied: packaging in c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from plotly>=5.10.0->copulas>=0.11.0->sdv) (24.1)\n", - "Requirement already satisfied: tenacity>=6.2.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from plotly>=5.10.0->copulas>=0.11.0->sdv) (9.0.0)\n", - "Requirement already satisfied: six>=1.5 in c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from python-dateutil<3.0.0,>=2.1->botocore>=1.31->sdv) (1.16.0)\n", - "Requirement already satisfied: joblib>=1.2.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from scikit-learn>=1.0.2->rdt>=1.12.0->sdv) (1.4.2)\n", - "Requirement already satisfied: threadpoolctl>=3.1.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from scikit-learn>=1.0.2->rdt>=1.12.0->sdv) (3.5.0)\n", - "Requirement already satisfied: networkx in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from 
torch>=1.9.0->ctgan>=0.10.0->sdv) (3.2.1)\n", - "Requirement already satisfied: typing-extensions>=4.8.0 in c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from torch>=1.9.0->ctgan>=0.10.0->sdv) (4.12.2)\n", - "Requirement already satisfied: fsspec in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from torch>=1.9.0->ctgan>=0.10.0->sdv) (2024.6.1)\n", - "Requirement already satisfied: sympy in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from torch>=1.9.0->ctgan>=0.10.0->sdv) (1.13.2)\n", - "Requirement already satisfied: filelock in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from torch>=1.9.0->ctgan>=0.10.0->sdv) (3.15.4)\n", - "Requirement already satisfied: jinja2 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from torch>=1.9.0->ctgan>=0.10.0->sdv) (3.1.4)\n", - "Requirement already satisfied: MarkupSafe>=2.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from jinja2->torch>=1.9.0->ctgan>=0.10.0->sdv) (2.1.5)\n", - "Requirement already satisfied: mpmath<1.4,>=1.1.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from sympy->torch>=1.9.0->ctgan>=0.10.0->sdv) (1.3.0)\n", - "Installing collected packages: scikit-learn\n", - " Attempting uninstall: scikit-learn\n", - " Found existing installation: scikit-learn 0.24.0\n", - " Uninstalling scikit-learn-0.24.0:\n", - " Successfully uninstalled scikit-learn-0.24.0\n", - "Successfully installed scikit-learn-1.5.1\n" - ] - } - ], - "source": [ - "!pip install sdv" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "#### Importing Katabatic" - ] - }, - { - "cell_type": "code", - "execution_count": 2, - "metadata": {}, - "outputs": [], - "source": [ - "from katabatic.katabatic import Katabatic\n", - "import numpy as np\n", - "import pandas as pd\n" - ] - }, - { - 
"cell_type": "markdown", - "metadata": {}, - "source": [ - "Sample Training Data" - ] - }, - { - "cell_type": "code", - "execution_count": 11, - "metadata": {}, - "outputs": [ - { - "data": { - "text/html": [ - "
[HTML table output omitted: the `text/html` DataFrame rendering lost its markup during extraction; the equivalent `text/plain` rendering of the same ten rows is preserved below]
" - ], - "text/plain": [ - " age workclass fnlwgt education education-num marital-status \\\n", - "0 6 7 0 9 2 4 \n", - "1 7 6 0 9 2 2 \n", - "2 6 4 0 11 5 0 \n", - "3 7 4 0 1 0 2 \n", - "4 4 4 0 9 2 2 \n", - "5 6 4 0 12 3 2 \n", - "6 7 4 0 6 0 3 \n", - "7 7 6 0 11 5 2 \n", - "8 5 4 0 12 3 4 \n", - "9 7 4 0 9 2 2 \n", - "\n", - " occupation relationship race sex capital-gain capital-loss \\\n", - "0 1 1 4 1 13 0 \n", - "1 4 0 4 1 0 0 \n", - "2 6 1 4 1 0 0 \n", - "3 6 0 2 1 0 0 \n", - "4 10 5 2 0 0 0 \n", - "5 4 5 4 0 0 0 \n", - "6 8 1 2 0 0 0 \n", - "7 4 0 4 1 0 0 \n", - "8 10 1 4 0 18 0 \n", - "9 4 0 4 1 9 0 \n", - "\n", - " hours-per-week native-country class \n", - "0 2 39 0 \n", - "1 0 39 0 \n", - "2 2 39 0 \n", - "3 2 39 0 \n", - "4 2 5 0 \n", - "5 2 39 0 \n", - "6 0 23 0 \n", - "7 3 39 1 \n", - "8 4 39 1 \n", - "9 2 39 1 " - ] - }, - "execution_count": 11, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "from katabatic.models.ganblr.utils import get_demo_data\n", - "real_data = get_demo_data('adult')\n", - "real_data = pd.DataFrame(real_data, columns=real_data.columns).head(10)\n", - "real_data" - ] - }, - { - "cell_type": "code", - "execution_count": 12, - "metadata": {}, - "outputs": [], - "source": [ - "from sklearn.model_selection import train_test_split\n", - "X_real, y_real = real_data.values[:,:-1], real_data.values[:,-1]\n", - "X_train, X_test, y_train, y_test = train_test_split(X_real, y_real, test_size=0.5)" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Load Model and Train it on Sample Training Data" - ] - }, - { - "cell_type": "code", - "execution_count": 8, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "--------------------------\n", - "module name: katabatic.katabatic\n", - "parent process: 15232\n", - "process id: 54428\n", - "katabatic.models.ganblr.ganblr_adapter\n", - "[INFO] Initializing GANBLR Model\n", - "[INFO] Training GANBLR 
model\n", - "[SUCCESS] Model training completed\n" - ] - } - ], - "source": [ - "model = Katabatic.run_model('ganblr')\n", - "model.load_model()\n", - "model.fit(X_train,y_train,k=0, epochs= 100)" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Generate New Data from trained model" - ] - }, - { - "cell_type": "code", - "execution_count": 9, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "[INFO] Generating data using GANBLR model\n", - "[SUCCESS] Data generation completed\n" - ] - }, - { - "data": { - "text/html": [ - "
\n", - "\n", - "\n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - "
ageworkclassfnlwgteducationeducation-nummarital-statusoccupationrelationshipracesexcapital-gaincapital-losshours-per-weeknative-countryclass
054092411401300390
154092410541902391
2640120310041900391
3770125215201302391
4770125240411800230
546011246141903230
67701526520130350
756090340401300230
8670122345401304390
96609528140003230
\n", - "
" - ], - "text/plain": [ - " age workclass fnlwgt education education-num marital-status occupation \\\n", - "0 5 4 0 9 2 4 1 \n", - "1 5 4 0 9 2 4 10 \n", - "2 6 4 0 12 0 3 10 \n", - "3 7 7 0 12 5 2 1 \n", - "4 7 7 0 12 5 2 4 \n", - "5 4 6 0 11 2 4 6 \n", - "6 7 7 0 1 5 2 6 \n", - "7 5 6 0 9 0 3 4 \n", - "8 6 7 0 12 2 3 4 \n", - "9 6 6 0 9 5 2 8 \n", - "\n", - " relationship race sex capital-gain capital-loss hours-per-week \\\n", - "0 1 4 0 13 0 0 \n", - "1 5 4 1 9 0 2 \n", - "2 0 4 1 9 0 0 \n", - "3 5 2 0 13 0 2 \n", - "4 0 4 1 18 0 0 \n", - "5 1 4 1 9 0 3 \n", - "6 5 2 0 13 0 3 \n", - "7 0 4 0 13 0 0 \n", - "8 5 4 0 13 0 4 \n", - "9 1 4 0 0 0 3 \n", - "\n", - " native-country class \n", - "0 39 0 \n", - "1 39 1 \n", - "2 39 1 \n", - "3 39 1 \n", - "4 23 0 \n", - "5 23 0 \n", - "6 5 0 \n", - "7 23 0 \n", - "8 39 0 \n", - "9 23 0 " - ] - }, - "execution_count": 9, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "import pandas as pd\n", - "syn_data = model.generate(size= 50000)\n", - "pd.DataFrame(syn_data, columns=real_data.columns).head(10)" - ] - }, - { - "cell_type": "code", - "execution_count": 10, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "Accuracy with Logistic Regression: 0.8000\n", - "Accuracy with MLP: 0.9000\n", - "Accuracy with Random Forest: 0.9\n", - "Accuracy with XgboostTree: 0.9000\n", - "Jensen-Shannon Divergence: 0.5000\n", - "Wasserstein Distance: 0.5000\n" - ] - } - ], - "source": [ - "from katabatic.metrics import tstr_logreg, tstr_mlp, tstr_rf, tstr_xgbt, trtr_jsd, trtr_wd\n", - "import pandas as pd\n", - "\n", - "# Ensure syn_data and real_data are numpy arrays\n", - "if isinstance(syn_data, pd.DataFrame):\n", - " X_synthetic, y_synthetic = syn_data.iloc[:, :-1].values, syn_data.iloc[:, -1].values\n", - "else:\n", - " X_synthetic, y_synthetic = syn_data[:, :-1], syn_data[:, -1]\n", - "\n", - "if isinstance(real_data, pd.DataFrame):\n", - " X_real, y_real = 
real_data.iloc[:, :-1].values, real_data.iloc[:, -1].values\n", - "else:\n", - "    X_real, y_real = real_data[:, :-1], real_data[:, -1]\n", - "\n", - "# Convert numpy arrays back to DataFrames and Series\n", - "X_synthetic_df = pd.DataFrame(X_synthetic)\n", - "y_synthetic_df = pd.Series(y_synthetic)\n", - "X_real_df = pd.DataFrame(X_real)\n", - "y_real_df = pd.Series(y_real)\n", - "\n", - "# Evaluate using the different models\n", - "acc_score_lr = tstr_logreg.evaluate(X_synthetic_df, y_synthetic_df, X_real_df, y_real_df)\n", - "acc_score_mlp = tstr_mlp.evaluate(X_synthetic_df, y_synthetic_df, X_real_df, y_real_df)\n", - "acc_score_rf = tstr_rf.evaluate(X_synthetic_df, y_synthetic_df, X_real_df, y_real_df)\n", - "acc_score_xgbt = tstr_xgbt.evaluate(X_synthetic_df, y_synthetic_df, X_real_df, y_real_df)\n", - "jsd_value = trtr_jsd.evaluate(X_synthetic_df, y_synthetic_df, X_real_df, y_real_df)\n", - "wd_value = trtr_wd.evaluate(X_synthetic_df, y_synthetic_df, X_real_df, y_real_df)\n", - "\n", - "# Print the results with 4 decimal places\n", - "print(f\"Accuracy with Logistic Regression: {acc_score_lr:.4f}\")\n", - "print(f\"Accuracy with MLP: {acc_score_mlp:.4f}\")\n", - "print(f\"Accuracy with Random Forest: {acc_score_rf:.4f}\")\n", - "print(f\"Accuracy with XgboostTree: {acc_score_xgbt:.4f}\")\n", - "print(f\"Jensen-Shannon Divergence: {jsd_value:.4f}\")\n", - "print(f\"Wasserstein Distance: {wd_value:.4f}\")\n", - "\n", - "\n" - ] - }, - { - "cell_type": "code", - "execution_count": 13, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "--------------------------\n", - "module name: katabatic.katabatic\n", - "parent process: 15232\n", - "process id: 54428\n", - "katabatic.models.ganblr.ganblr_adapter\n", - "[INFO] Initializing GANBLR Model\n", - "[INFO] Training GANBLR model\n", - "[SUCCESS] Model training completed\n", - "[INFO] Generating data using GANBLR model\n", - "[SUCCESS] Data generation completed\n", - "[INFO] 
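Two of the metrics in the cell above compare distributions rather than classifier accuracy. As a reference point only (this is not the `trtr_jsd` implementation from `katabatic.metrics`), the Jensen-Shannon divergence between two discrete distributions can be sketched in a few lines of plain Python:

```python
import math

def jsd(p, q):
    """Jensen-Shannon divergence (base 2) between two discrete
    probability distributions of equal length.  Bounded in [0, 1]:
    0 for identical distributions, 1 for disjoint support."""
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]

    def kl(a, b):
        # Kullback-Leibler divergence, skipping zero-probability terms
        return sum(ai * math.log2(ai / bi) for ai, bi in zip(a, b) if ai > 0)

    return (kl(p, m) + kl(q, m)) / 2
```

A TRTR-style metric would apply such a divergence to the per-column empirical distributions of the real and synthetic frames.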
Initializing GANBLR Model\n", - "[INFO] Training GANBLR model\n", - "[SUCCESS] Model training completed\n", - "[INFO] Generating data using GANBLR model\n", - "[SUCCESS] Data generation completed\n", - "--------------------------\n", - "module name: katabatic.katabatic\n", - "parent process: 15232\n", - "process id: 54428\n", - "katabatic.models.ganblr.ganblr_adapter\n", - "[INFO] Initializing GANBLR Model\n", - "[INFO] Training GANBLR model\n", - "[SUCCESS] Model training completed\n", - "[INFO] Generating data using GANBLR model\n", - "[SUCCESS] Data generation completed\n", - "[INFO] Initializing GANBLR Model\n", - "[INFO] Training GANBLR model\n", - "[SUCCESS] Model training completed\n", - "[INFO] Generating data using GANBLR model\n", - "[SUCCESS] Data generation completed\n", - "--------------------------\n", - "module name: katabatic.katabatic\n", - "parent process: 15232\n", - "process id: 54428\n", - "katabatic.models.ganblr.ganblr_adapter\n", - "[INFO] Initializing GANBLR Model\n", - "[INFO] Training GANBLR model\n", - "[SUCCESS] Model training completed\n", - "[INFO] Generating data using GANBLR model\n", - "[SUCCESS] Data generation completed\n", - "[INFO] Initializing GANBLR Model\n", - "[INFO] Training GANBLR model\n", - "[SUCCESS] Model training completed\n", - "[INFO] Generating data using GANBLR model\n", - "[SUCCESS] Data generation completed\n", - "Average logreg result: 0.6667\n", - "Average mlp result: 0.7167\n", - "Average rf result: 0.7167\n", - "Average xgbt result: 0.7333\n", - "Average jsd result: 0.5000\n", - "Average wd result: 0.5000\n" - ] - } - ], - "source": [ - "from sklearn.model_selection import KFold\n", - "import numpy as np\n", - "import pandas as pd\n", - "\n", - "# Define the number of repetitions and folds\n", - "n_repeats = 3\n", - "n_folds = 2\n", - "\n", - "def evaluate_experiment(X_real, y_real, model, metric_functions):\n", - " # Initialize lists to store metrics for each fold\n", - " metrics_results = {key: [] for key 
in metric_functions.keys()}\n", - "\n", - " kf = KFold(n_splits=n_folds, shuffle=True, random_state=42)\n", - " \n", - " for train_index, test_index in kf.split(X_real):\n", - " X_train, X_test = X_real[train_index], X_real[test_index]\n", - " y_train, y_test = y_real[train_index], y_real[test_index]\n", - "\n", - " model.load_model() # Ensure the model is loaded each time\n", - " model.fit(X_train, y_train, k=0, epochs=100)\n", - " syn_data = model.generate(size=50000)\n", - "\n", - " if isinstance(syn_data, pd.DataFrame):\n", - " X_synthetic, y_synthetic = syn_data.iloc[:, :-1].values, syn_data.iloc[:, -1].values\n", - " else:\n", - " X_synthetic, y_synthetic = syn_data[:, :-1], syn_data[:, -1]\n", - "\n", - " X_synthetic_df = pd.DataFrame(X_synthetic)\n", - " y_synthetic_df = pd.Series(y_synthetic)\n", - " X_real_df = pd.DataFrame(X_real)\n", - " y_real_df = pd.Series(y_real)\n", - "\n", - " for name, func in metric_functions.items():\n", - " result = func.evaluate(X_synthetic_df, y_synthetic_df, X_real_df, y_real_df)\n", - " metrics_results[name].append(result)\n", - "\n", - " # Average results across folds\n", - " averaged_results = {key: np.mean(values) for key, values in metrics_results.items()}\n", - " return averaged_results\n", - "\n", - "# Repeat the experiment n_repeats times\n", - "overall_results = {key: [] for key in ['logreg', 'mlp', 'rf', 'xgbt', 'jsd', 'wd']}\n", - "\n", - "for _ in range(n_repeats):\n", - " # Prepare the model\n", - " model = Katabatic.run_model('ganblr')\n", - "\n", - " # Define the metric functions\n", - " metric_functions = {\n", - " 'logreg': tstr_logreg,\n", - " 'mlp': tstr_mlp,\n", - " 'rf': tstr_rf,\n", - " 'xgbt': tstr_xgbt,\n", - " 'jsd': trtr_jsd,\n", - " 'wd': trtr_wd\n", - " }\n", - "\n", - " # Run the evaluation\n", - " results = evaluate_experiment(X_real, y_real, model, metric_functions)\n", - "\n", - " for key in results.keys():\n", - " overall_results[key].append(results[key])\n", - " \n", - "# Calculate and 
print the average results\n", - "for metric in overall_results.keys():\n", - " avg_result = np.mean(overall_results[metric])\n", - " print(f\"Average {metric} result: {avg_result:.4f}\")\n" - ] - }, - { - "cell_type": "code", - "execution_count": null, - "metadata": {}, - "outputs": [], - "source": [ - "# Future import statement ideas\n", - "\n", - "import katabatic as kb\n", - "from katabatic.models import meg\n", - "from katabatic.models import ganblr\n", - "from katabatic.metrics import eval_method1\n", - "from katabatic.utils.preprocessing import data_processing_method1 # good place to store preprocessing utilities" - ] - }, - { - "cell_type": "code", - "execution_count": null, - "metadata": {}, - "outputs": [], - "source": [ - "from katabatic.evaluate import eval_method1" - ] - } - ], - "metadata": { - "kernelspec": { - "display_name": "Python 3", - "language": "python", - "name": "python3" - }, - "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 3 - }, - "file_extension": ".py", - "mimetype": "text/x-python", - "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython3", - "version": "3.9.13" - } - }, - "nbformat": 4, - "nbformat_minor": 2 -} diff --git a/Docker/Ganblr/Dockerfile_Ganblr b/Docker/Ganblr/Dockerfile_Ganblr deleted file mode 100644 index 26388e5..0000000 --- a/Docker/Ganblr/Dockerfile_Ganblr +++ /dev/null @@ -1,27 +0,0 @@ -# Step 1: Use an official Miniconda image as a base image -FROM continuumio/miniconda3:latest - -# Step 2: Set the working directory in the container -WORKDIR /app - -# Step 3: Copy your application code (GANBLR model API code) -COPY . 
/app - -# Step 4: Create a new Conda environment -RUN conda create --name ganblr_env python=3.9.19 -y && \ - echo "source activate ganblr_env" > ~/.bashrc - -# Step 5: Activate the environment and install necessary libraries in the precise order -RUN /bin/bash -c "source activate ganblr_env && \ - pip install pyitlib && \ - pip install tensorflow && \ - pip install pgmpy && \ - pip install sdv && \ - pip install scikit-learn==1.0 && \ - pip install fastapi uvicorn" - -# Step 6: Expose the port that the FastAPI app will run on -EXPOSE 8000 - -# Step 7: Run the FastAPI server -CMD ["/bin/bash", "-c", "source activate ganblr_env && uvicorn ganblr_api:app --host 0.0.0.0 --port 8000"] \ No newline at end of file diff --git a/Docker/Ganblr/ganblr_api.py b/Docker/Ganblr/ganblr_api.py deleted file mode 100644 index efb1c56..0000000 --- a/Docker/Ganblr/ganblr_api.py +++ /dev/null @@ -1,105 +0,0 @@ -from fastapi import FastAPI, HTTPException, File, UploadFile -from pydantic import BaseModel -import pandas as pd -import logging -from katabatic.katabatic import Katabatic -import os - -# Initialize FastAPI app -app = FastAPI( - title="GANBLR Model API", - description="An API to train and generate synthetic data using the GANBLR model. 
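The `/train` endpoint below assumes the last CSV column is the label and splits features from labels with `data.iloc[:, :-1]` / `data.iloc[:, -1]`. Stripped of pandas, that convention amounts to the following (a hypothetical helper, not part of the API):

```python
import csv
import io

def split_features_labels(csv_text):
    """Split CSV text into a header, feature rows X, and labels y,
    treating the last column as the label -- the same convention the
    /train endpoint applies to the uploaded dataset."""
    rows = list(csv.reader(io.StringIO(csv_text)))
    header, data = rows[0], rows[1:]
    if len(header) < 2:
        # Mirrors the endpoint's 400 check
        raise ValueError("Dataset must contain at least one feature and one label.")
    X = [row[:-1] for row in data]
    y = [row[-1] for row in data]
    return header, X, y
```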
" - "Provides endpoints to train the model on a dataset and generate synthetic data.", - version="1.0.0", -) - -# Initialize logging -logging.basicConfig(level=logging.INFO) - -# Initialize the GANBLR model instance (the model instance will persist across requests) -ganblr_model = Katabatic.run_model("ganblr") - - -# Request body for training -class TrainRequest(BaseModel): - file_path: str # Path to the dataset file (CSV) - epochs: int # Number of epochs for training - - -# Request body for generating synthetic data -class GenerateRequest(BaseModel): - num_samples: int # Number of samples to generate - - -@app.post("/train", summary="Train the GANBLR model", tags=["Model Operations"]) -async def train_model(request: TrainRequest): - """ - Train the GANBLR model on the provided dataset. - - - **file_path**: Path to the CSV dataset file. - - **epochs**: Number of epochs to run the training. - - This will train the model and persist it in memory for future synthetic data generation. - """ - if not os.path.exists(request.file_path): - raise HTTPException(status_code=404, detail="File not found") - - try: - logging.info(f"Loading dataset from {request.file_path}...") - data = pd.read_csv(request.file_path) - - # Assuming the last column is the label, split the dataset into features (X) and labels (y) - if len(data.columns) < 2: - raise HTTPException( - status_code=400, - detail="Dataset must contain at least one feature and one label.", - ) - - X, y = data.iloc[:, :-1], data.iloc[:, -1] - - # Train the model - logging.info(f"Training the GANBLR model for {request.epochs} epochs...") - ganblr_model.load_model() # Load the model if necessary - ganblr_model.fit(X, y, epochs=request.epochs) - - logging.info("Model trained successfully.") - return {"message": "Model trained successfully", "epochs": request.epochs} - except Exception as e: - logging.error(f"Error during training: {str(e)}") - raise HTTPException( - status_code=500, detail=f"An error occurred during training: 
{str(e)}" - ) - - -@app.post("/generate", summary="Generate synthetic data", tags=["Model Operations"]) -async def generate_synthetic_data(request: GenerateRequest): - """ - Generate synthetic data from the trained GANBLR model. - - - **num_samples**: Number of synthetic samples to generate. - - Returns a JSON response with the generated synthetic data. - """ - try: - logging.info(f"Generating {request.num_samples} synthetic data samples...") - synthetic_data = ganblr_model.generate(size=request.num_samples) - synthetic_df = pd.DataFrame(synthetic_data) - - logging.info("Synthetic data generation successful.") - return {"synthetic_data": synthetic_df.to_dict(orient="records")} - except Exception as e: - logging.error(f"Error during synthetic data generation: {str(e)}") - raise HTTPException( - status_code=500, - detail=f"An error occurred during data generation: {str(e)}", - ) - - -@app.get("/", summary="Root Endpoint", tags=["General"]) -async def root(): - """ - Root endpoint for checking the status of the API. - - Returns a simple message indicating that the API is up and running. - """ - return {"message": "GANBLR Model API is running!"} diff --git a/Docker/Ganblr/requirements.txt b/Docker/Ganblr/requirements.txt deleted file mode 100644 index 6cbee8f..0000000 --- a/Docker/Ganblr/requirements.txt +++ /dev/null @@ -1,9 +0,0 @@ -pyitlib -pgmpy -scikit-learn -tensorflow -sdv -fastapi -uvicorn -pandas -katabatic \ No newline at end of file diff --git a/Docker/Ganblrpp/Dockerfile_Ganblrpp b/Docker/Ganblrpp/Dockerfile_Ganblrpp deleted file mode 100644 index 3d0c75e..0000000 --- a/Docker/Ganblrpp/Dockerfile_Ganblrpp +++ /dev/null @@ -1,32 +0,0 @@ -# Step 1: Use an official Miniconda image as a base image -FROM continuumio/miniconda3:latest - -# Step 2: Set the working directory in the container -WORKDIR /app - -# Step 3: Copy your application code (GANBLRPP model API code) -COPY . 
/app - -# Step 4: Create a new Conda environment -RUN conda create --name ganblrpp_env python=3.9.19 -y && \ - echo "source activate ganblrpp_env" > ~/.bashrc - -# Step 5: Activate the environment and install necessary libraries in the precise order -#/bin/bash -c "source activate ganblrpp_env && \ -RUN /bin/bash -c "source activate ganblrpp_env && \ - pip install pyitlib && \ - pip install tensorflow && \ - pip install pgmpy && \ - pip install sdv && \ - pip install scikit-learn==1.0 && \ - pip install fastapi uvicorn && \ - pip install python-multipart" - -# Step 6: Expose the port that the FastAPI app will run on -EXPOSE 8000 - -# Step 7: Run the FastAPI server -CMD ["/bin/bash", "-c", "source activate ganblrpp_env && uvicorn ganblrpp_api:app --host 0.0.0.0 --port 8000"] - - - diff --git a/Docker/Ganblrpp/ganblrpp_api.py b/Docker/Ganblrpp/ganblrpp_api.py deleted file mode 100644 index 03e715f..0000000 --- a/Docker/Ganblrpp/ganblrpp_api.py +++ /dev/null @@ -1,129 +0,0 @@ -from fastapi import FastAPI, HTTPException, UploadFile, File, Form -from pydantic import BaseModel -import pandas as pd -import logging -import numpy as np -from katabatic.katabatic import Katabatic -from katabatic.models.ganblrpp_DGEK.ganblrpp_adapter import GanblrppAdapter - -# Initialize FastAPI app -app = FastAPI( - title="GANBLR++ Model API", - description="An API to train and generate synthetic data using the GANBLR++ model. 
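Unlike GANBLR, the GANBLR++ adapter needs the indices of the numerical columns up front, and the `/train` endpoint below infers them from NumPy dtype kinds. The same rule, sketched without pandas (the `dtype_kinds` argument is a stand-in for the per-column `dtype.kind` values the endpoint computes):

```python
def numerical_column_indices(dtype_kinds):
    """Return the column indices the /train endpoint treats as numerical.
    A dtype kind of 'i' (signed int), 'u' (unsigned int) or 'f' (float)
    counts as numerical; anything else (e.g. 'O' for object/string
    columns) is treated as categorical."""
    return [i for i, kind in enumerate(dtype_kinds) if kind in "iuf"]
```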
" - "Provides endpoints to train the model on a dataset and generate synthetic data.", - version="1.0.0", -) - -# Initialize logging -logging.basicConfig(level=logging.INFO) - -# Initialize the GANBLR++ model instance (the model instance will persist across requests) -ganblrpp_model = None - -def initialize_model(numerical_columns): - global ganblrpp_model - ganblrpp_model = GanblrppAdapter(numerical_columns=numerical_columns) - -# Request body for generating synthetic data -class GenerateRequest(BaseModel): - num_samples: int # Number of samples to generate - - -@app.post("/train", summary="Train the GANBLR++ model", tags=["Model Operations"]) -async def train_model( - file: UploadFile = File(...), - epochs: int = Form(...), - batch_size: int = Form(64) -): - """ - Train the GANBLR++ model on the provided dataset. - - - **file**: CSV dataset file. - - **epochs**: Number of epochs to run the training. - - **batch_size**: (Optional) Batch size for training. - - This will train the model and persist it in memory for future synthetic data generation. 
- """ - try: - # Load dataset - logging.info(f"Loading dataset from uploaded file...") - data = pd.read_csv(file.file) - - # Identify numerical columns - def is_numerical(dtype): - return dtype.kind in 'iuf' - - column_is_numerical = data.dtypes.apply(is_numerical).values - numerical = np.argwhere(column_is_numerical).ravel() - - if len(numerical) == 0: - raise HTTPException( - status_code=400, - detail="Dataset must contain at least one numerical feature.", - ) - - # Initialize model with numerical columns - initialize_model(numerical) - - # Split the dataset into features (X) and labels (y) - if len(data.columns) < 2: - raise HTTPException( - status_code=400, - detail="Dataset must contain at least one feature and one label.", - ) - - X, y = data.iloc[:, :-1], data.iloc[:, -1] - - # Train the model - logging.info(f"Training the GANBLR++ model for {epochs} epochs with batch size {batch_size}...") - ganblrpp_model.load_model() # Load the model if necessary - ganblrpp_model.fit(X, y, epochs=epochs, batch_size=batch_size) - - logging.info("Model trained successfully.") - return {"message": "Model trained successfully", "epochs": epochs} - except Exception as e: - logging.error(f"Error during training: {str(e)}") - raise HTTPException( - status_code=500, detail=f"An error occurred during training: {str(e)}" - ) - - -@app.post("/generate", summary="Generate synthetic data", tags=["Model Operations"]) -async def generate_synthetic_data(request: GenerateRequest): - """ - Generate synthetic data from the trained GANBLR++ model. - - - **num_samples**: Number of synthetic samples to generate. - - Returns a JSON response with the generated synthetic data. - """ - try: - if ganblrpp_model is None: - raise HTTPException( - status_code=400, - detail="Model not trained. Please train the model first." 
- )  -  -        logging.info(f"Generating {request.num_samples} synthetic data samples...")  -        synthetic_data = ganblrpp_model.generate(size=request.num_samples)  -        synthetic_df = pd.DataFrame(synthetic_data)  -  -        logging.info("Synthetic data generation successful.")  -        return {"synthetic_data": synthetic_df.to_dict(orient="records")}  -    except HTTPException:  -        # Re-raise deliberate HTTP errors (e.g. the 400 above) unchanged instead of wrapping them in a 500  -        raise  -    except Exception as e:  -        logging.error(f"Error during synthetic data generation: {str(e)}")  -        raise HTTPException(  -            status_code=500,  -            detail=f"An error occurred during data generation: {str(e)}",  -        )  -  -  -@app.get("/", summary="Root Endpoint", tags=["General"])  -async def root():  -    """  -    Root endpoint for checking the status of the API.  -  -    Returns a simple message indicating that the API is up and running.  -    """  -    return {"message": "GANBLR++ Model API is running!"}  -  diff --git a/Docker/Meg/Dockerfile_Meg b/Docker/Meg/Dockerfile_Meg deleted file mode 100644 index 200d7b6..0000000 --- a/Docker/Meg/Dockerfile_Meg +++ /dev/null @@ -1,29 +0,0 @@ -# Step 1: Use an official Miniconda image as a base image -FROM continuumio/miniconda3:latest  -  -# Step 2: Set the working directory in the container -WORKDIR /app  -  -# Step 3: Copy your application code (MEG model API code) -COPY . 
/app - -# Step 4: Create a new Conda environment -RUN conda create --name meg_env python=3.9.19 -y && \ - echo "source activate meg_env" > ~/.bashrc - -# Step 5: Activate the environment and install necessary libraries in the precise order -#/bin/bash -c "source activate meg_env && \ -RUN /bin/bash -c "source activate meg_env && \ - pip install pyitlib && \ - pip install tensorflow && \ - pip install pgmpy && \ - pip install sdv && \ - pip install scikit-learn==1.0 && \ - pip install fastapi uvicorn && \ - pip install python-multipart" - -# Step 6: Expose the port that the FastAPI app will run on -EXPOSE 8000 - -# Step 7: Run the FastAPI server -CMD ["/bin/bash", "-c", "source activate meg_env && uvicorn meg_api:app --host 0.0.0.0 --port 8000"] \ No newline at end of file diff --git a/Docker/Meg/meg_api.py b/Docker/Meg/meg_api.py deleted file mode 100644 index 147f432..0000000 --- a/Docker/Meg/meg_api.py +++ /dev/null @@ -1,131 +0,0 @@ -from fastapi import FastAPI, HTTPException, UploadFile, File, Form -from pydantic import BaseModel -import pandas as pd -import logging -import numpy as np -from katabatic.models.meg_DGEK.meg_adapter import MegAdapter -from sklearn.model_selection import train_test_split - -# Initialize FastAPI app -app = FastAPI( - title="MEG Model API", - description="An API to train and generate synthetic data using the MEG model. 
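One quirk of the MEG service below is that generated labels come back integer-encoded, and the `/generate` endpoint maps them onto the original label values with a modulo lookup. That mapping in isolation (a hypothetical helper mirroring the lambda used in the endpoint):

```python
def restore_labels(encoded_values, original_labels):
    """Map integer-encoded model outputs back onto the original label
    values, wrapping with modulo as the /generate endpoint's lambda
    does.  Note the wrap-around means out-of-range codes are silently
    folded back into the label set rather than rejected."""
    return [original_labels[int(v) % len(original_labels)] for v in encoded_values]
```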
" - "Provides endpoints to train the model on a dataset and generate synthetic data.", - version="1.0.0", -) - -# Initialize logging -logging.basicConfig(level=logging.INFO) - -# Initialize the MEG model instance (the model instance will persist across requests) -meg_model = None -original_labels = None - -def initialize_model(): - global meg_model - meg_model = MegAdapter() - -def store_labels(df): - global original_labels - original_labels = df.iloc[:, 0].unique() - -# Request body for generating synthetic data -class GenerateRequest(BaseModel): - num_samples: int # Number of samples to generate - -@app.post("/train", summary="Train the MEG model", tags=["Model Operations"]) -async def train_model( - file: UploadFile = File(...), - epochs: int = Form(...), -): - """ - Train the MEG model on the provided dataset. - - - **file**: CSV dataset file. - - **epochs**: Number of epochs to run the training. - - This will train the model and persist it in memory for future synthetic data generation. 
- """ - global meg_model, original_labels - - try: - # Load dataset from uploaded file - logging.info("Loading dataset from uploaded file...") - df = pd.read_csv(file.file) - - # Store original labels - store_labels(df) - - # Split the dataset into features (X) and labels (y) - if len(df.columns) < 2: - raise HTTPException( - status_code=400, - detail="Dataset must contain at least one feature and one label.", - ) - - X, y = df.values[:, :-1], df.values[:, -1] - X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.5) - - # Initialize and train model - initialize_model() - logging.info(f"Training the MEG model for {epochs} epochs...") - meg_model.load_model() # Load the model if necessary - meg_model.fit(X_train, y_train, epochs=epochs) - - logging.info("Model trained successfully.") - return {"message": "Model trained successfully", "epochs": epochs} - except Exception as e: - logging.error(f"Error during training: {str(e)}") - raise HTTPException( - status_code=500, detail=f"An error occurred during training: {str(e)}" - ) - -@app.post("/generate", summary="Generate synthetic data", tags=["Model Operations"]) -async def generate_synthetic_data(request: GenerateRequest): - """ - Generate synthetic data from the trained MEG model. - - - **num_samples**: Number of synthetic samples to generate. - - Returns a JSON response with the generated synthetic data. - """ - try: - if meg_model is None: - raise HTTPException( - status_code=400, - detail="Model not trained. Please train the model first." - ) - - if original_labels is None: - raise HTTPException( - status_code=500, - detail="Original labels not available. Please train the model first." 
- )  -  -        logging.info(f"Generating {request.num_samples} synthetic data samples...")  -        synthetic_data = meg_model.generate(size=request.num_samples)  -        synthetic_df = pd.DataFrame(synthetic_data)  -  -        # Revert the first column to its original label  -        synthetic_df.iloc[:, 0] = pd.Series(synthetic_df.iloc[:, 0]).map(lambda x: original_labels[int(x) % len(original_labels)])  -  -        logging.info("Synthetic data generation successful.")  -        return {"synthetic_data": synthetic_df.to_dict(orient="records")}  -    except HTTPException:  -        # Re-raise deliberate HTTP errors (e.g. the guards above) unchanged instead of wrapping them in a 500  -        raise  -    except Exception as e:  -        logging.error(f"Error during synthetic data generation: {str(e)}")  -        raise HTTPException(  -            status_code=500,  -            detail=f"An error occurred during data generation: {str(e)}",  -        )  -  -@app.get("/", summary="Root Endpoint", tags=["General"])  -async def root():  -    """  -    Root endpoint for checking the status of the API.  -  -    Returns a simple message indicating that the API is up and running.  -    """  -    return {"message": "MEG Model API is running!"}  -  -  -  diff --git a/Dockerfile (Muhammad) b/Dockerfile (Muhammad) deleted file mode 100644 index a08588e..0000000 --- a/Dockerfile (Muhammad) +++ /dev/null @@ -1,27 +0,0 @@ -# Step 1: Use an official Miniconda image as a base image -FROM continuumio/miniconda3:latest  -  -# Step 2: Set the working directory in the container -WORKDIR /app  -  -# Step 3: Copy your application code (GANBLR model API code) -COPY . 
/app - -# Step 4: Create a new Conda environment -RUN conda create --name ganblr_env python=3.9.19 -y && \ - echo "source activate ganblr_env" > ~/.bashrc - -# Step 5: Activate the environment and install necessary libraries in the precise order -RUN /bin/bash -c "source activate ganblr_env && \ - pip install pyitlib && \ - pip install tensorflow && \ - pip install pgmpy && \ - pip install sdv && \ - pip install scikit-learn==1.0 && \ - pip install fastapi uvicorn" - -# Step 6: Expose the port that the FastAPI app will run on -EXPOSE 8000 - -# Step 7: Run the FastAPI server -CMD ["/bin/bash", "-c", "source activate ganblr_env && uvicorn ganblr_api:app --host 0.0.0.0 --port 8000"] diff --git a/Documentation-PDF/CTGAN.pdf b/Documentation-PDF/CTGAN.pdf deleted file mode 100644 index c506647..0000000 Binary files a/Documentation-PDF/CTGAN.pdf and /dev/null differ diff --git a/Documentation-PDF/Ganblr Plus Plus.pdf b/Documentation-PDF/Ganblr Plus Plus.pdf deleted file mode 100644 index b5db0ac..0000000 Binary files a/Documentation-PDF/Ganblr Plus Plus.pdf and /dev/null differ diff --git a/Documentation-PDF/MEG.pdf b/Documentation-PDF/MEG.pdf deleted file mode 100644 index 8c4f10f..0000000 Binary files a/Documentation-PDF/MEG.pdf and /dev/null differ diff --git a/Documentation-PDF/MedGAN Implementation.pdf b/Documentation-PDF/MedGAN Implementation.pdf deleted file mode 100644 index cba6afc..0000000 Binary files a/Documentation-PDF/MedGAN Implementation.pdf and /dev/null differ diff --git a/Documentation-PDF/TableGAN Document .pdf b/Documentation-PDF/TableGAN Document .pdf deleted file mode 100644 index 4fd7d96..0000000 Binary files a/Documentation-PDF/TableGAN Document .pdf and /dev/null differ diff --git a/Flask.py b/Flask.py new file mode 100644 index 0000000..35dc30b --- /dev/null +++ b/Flask.py @@ -0,0 +1,53 @@ +from flask import Flask, render_template, request, redirect, url_for + +# Initialize the Flask app +app = Flask(__name__) + +@app.route('/') +def home(): + # 
Render the main HTML template when accessing the root route
+    return render_template('index.html')
+
+@app.route('/about')
+def about():
+    # Render the About Us page from templates folder when the '/about' route is accessed
+    return render_template('about.html')
+
+@app.route('/services')
+def services():
+    # Render the services page from templates folder when the '/services' route is accessed
+    return render_template('services.html')
+
+@app.route('/Contact')
+def Contact():
+    # Render the Contact page from templates folder when the '/Contact' route is accessed
+    return render_template('Contact.html')
+
+@app.route('/glanblr')
+def glanblr():
+    # Render the Glanblr page from templates folder when the '/glanblr' route is accessed
+    return render_template('models/glanblr.html')
+
+@app.route('/ctgan')
+def ctgan():
+    # Render the CTGAN page from templates folder when the '/ctgan' route is accessed
+    return render_template('models/CTGAN.html')
+
+@app.route('/meg')
+def meg():
+    # Render the MEG page from templates folder when the '/meg' route is accessed
+    return render_template('models/meg.html')
+
+@app.route('/model/<model_name>')
+def model_page(model_name):
+    # Check if the requested model name is valid
+    valid_models = ['glanblr', 'CTGAN', 'meg']
+    if model_name in valid_models:
+        return render_template(f'models/{model_name}.html', model_name=model_name)
+    else:
+        # If the model name is not valid, redirect to home or show an error page
+        return redirect(url_for('home'))
+
+# Run the Flask app with debugging enabled
+if __name__ == '__main__':
+    app.run(debug=True)
diff --git a/Katabatic.drawio b/Katabatic.drawio
deleted file mode 100644
index 2a258f4..0000000
--- a/Katabatic.drawio
+++ /dev/null
@@ -1,698 +0,0 @@
diff --git a/LICENSE b/LICENSE
deleted file mode 100644
index e434105..0000000
--- a/LICENSE
+++ /dev/null
@@ -1,16 +0,0 @@
-Katabatic - an Open Source, Data Generation Framework
-
-    Copyright (C) 2024 Jaime Blackwell
-
-    This program is free software: you can redistribute it and/or modify
-    it under the terms of the GNU Affero General Public License as
-    published by the Free Software Foundation, either version 3 of the
-    License, or (at your option) any later version.
-
-    This program is distributed in the hope that it will be useful,
-    but WITHOUT ANY WARRANTY; without even the implied warranty of
-    MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
-    GNU Affero General Public License for more details.
-
-    You should have received a copy of the GNU Affero General Public License
-    along with this program. If not, see <https://www.gnu.org/licenses/>.
diff --git a/README.md b/README.md
deleted file mode 100644
index 23a16a0..0000000
--- a/README.md
+++ /dev/null
@@ -1,45 +0,0 @@
-Katabatic
-=========================================
-Katabatic is an open-source tabular data generation framework designed for data generative models such as GANBLR, TableGAN, and MedGAN.
-
-#### Code Description
-
-Katabatic is a framework designed to make generating and evaluating synthetic data much easier. Katabatic has been built with the understanding that different domains have different requirements for synthetic data, and it therefore provides a range of evaluation methods.
-
-#### Installation
-
-1. Install Dependencies
-
-2. Download/clone the Katabatic code
-
-#### Usage
-
-The first step is to import the katabatic library:
-
-    import katabatic
-
-Next, import the desired model or models:
-
-    from katabatic import GANBLR
-
-
-#### Relevant Publications
-
-GANBLR is a data generative model introduced in the following [paper](https://www.researchgate.net/publication/356159733_GANBLR_A_Tabular_Data_Generation_Model):
-
-    Zhang, Yishuo & Zaidi, Nayyar & Zhou, Jiahui & Li, Gang. (2021).
-    GANBLR: A Tabular Data Generation Model. 10.1109/ICDM51629.2021.00103.
-
-
- # TO DO:
- - Improve the evaluation methods so they are easier to use.
- - Debug evaluate_data()
- - Move individual models into docker containers.
- - Finalise documentation page "Getting Started with Katabatic"
- - Finalise documentation page "Installation Guide"
- - Cleanup folder structure in preparation to add Katabatic to PyPI.
- - Move Aiko implementation from Prototype to Katabatic
-
-
-
diff --git a/Step-by-Step_Dockerisation_Guide.md b/Step-by-Step_Dockerisation_Guide.md
deleted file mode 100644
index 3fd1521..0000000
--- a/Step-by-Step_Dockerisation_Guide.md
+++ /dev/null
@@ -1,253 +0,0 @@
-# GANBLR Model API
-
-A **FastAPI** service to train and generate synthetic data using the **GANBLR** model.
This service allows you to train the GANBLR model on your dataset and generate synthetic data from the trained model. - -## Features - -- **Train the Model**: Upload a dataset and specify the number of epochs for training. -- **Generate Synthetic Data**: Generate synthetic data after the model is trained. -- **REST API Interface**: Access endpoints to train and generate data using FastAPI. -- **Modular Setup**: Run the service using a virtual environment or through a Docker container. - ---- - -## Quickstart Guide - -### Prerequisites - -Before you get started, ensure that you have the following tools installed: - -- **Python 3.9.19** -- **Conda** (for environment management) -- **Docker** (optional, for Dockerized deployment) - ---- - -## Running the API in a Local Environment - -### Step 1: Setup the Environment - -We recommend using **Conda** to manage the environment and ensure all dependencies are installed in the correct order. - -#### Bash Script - -You can use the provided `serve_ganblr.sh` script to create a conda environment, install dependencies, and serve the FastAPI application: - -```bash -#!/bin/bash - -# Step 1: Setup conda environment -conda create --name ganblr_env python=3.9.19 -y -conda activate ganblr_env - -# Step 2: Install libraries in the correct order -echo "Installing required libraries..." -pip install pyitlib -pip install tensorflow -pip install pgmpy -pip install sdv -pip install scikit-learn==1.0 -pip install fastapi uvicorn - -# Step 3: Serve the model API -uvicorn ganblr_api:app --reload --host 0.0.0.0 --port 8000 -``` - -### Step 2: Run the Bash Script - -Save the bash script as serve_ganblr.sh, then run the following command: - -```bash -chmod +x serve_ganblr.sh -./serve_ganblr.sh -``` - -This will set up the environment, install dependencies, and start the FastAPI server. You can access the API at http://localhost:8000. 
-
-### Step 3: API Endpoints
-
- • Training the Model
-
-```bash
-curl -X 'POST' \
-  'http://127.0.0.1:8000/train' \
-  -H 'Content-Type: application/json' \
-  -d '{
-  "file_path": "/path/to/your/dataset.csv",
-  "epochs": 20
-}'
-```
-
- • Input:
- • file_path: The path to your dataset in CSV format.
- • epochs: The number of training epochs.
- • Output:
- • A success message indicating the model was trained.
-
- • Generating Synthetic Data
-
-```bash
-curl -X 'POST' \
-  'http://127.0.0.1:8000/generate' \
-  -H 'Content-Type: application/json' \
-  -d '{
-  "num_samples": 100
-}'
-```
-
- • Input:
- • num_samples: The number of synthetic samples to generate.
- • Output:
- • JSON array containing synthetic data.
-
-## Running the API with Docker
-
-If you’d rather run the API in a Docker container, follow these steps.
-
-### Step 1: Create a Dockerfile
-
-The following Dockerfile is already provided in the repository and is set up to install dependencies and serve the FastAPI API.
-
-#### Step 1: Use an official Miniconda image as a base image
-
-```dockerfile
-FROM continuumio/miniconda3:latest
-```
-
-#### Step 2: Set the working directory in the container
-
-```dockerfile
-WORKDIR /app
-```
-
-#### Step 3: Copy your application code (GANBLR model API code)
-
-```dockerfile
-COPY . /app
-```
-
-#### Step 4: Create a new Conda environment
-
-```dockerfile
-RUN conda create --name ganblr_env python=3.9.19 -y && \
-    echo "source activate ganblr_env" > ~/.bashrc
-```
-
-#### Step 5: Activate the environment and install necessary libraries in the precise order
-
-```dockerfile
-RUN /bin/bash -c "source activate ganblr_env && \
-    pip install pyitlib && \
-    pip install tensorflow && \
-    pip install pgmpy && \
-    pip install sdv && \
-    pip install scikit-learn==1.0 && \
-    pip install fastapi uvicorn"
-```
-
-#### Step 6: Expose the port that the FastAPI app will run on
-
-```dockerfile
-EXPOSE 8000
-```
-
-#### Step 7: Run the FastAPI server
-
-```dockerfile
-CMD ["/bin/bash", "-c", "source activate ganblr_env && uvicorn ganblr_api:app --host 0.0.0.0 --port 8000"]
-```
-
-### Step 2: Build the Docker Image
-
-From the directory where your Dockerfile and code are located, build the Docker image:
-
-```bash
-docker build -t ganblr-fastapi .
-```
-
-### Step 3: Run the Docker Container
-
-Once the image is built, run the Docker container with the following command:
-
-```bash
-docker run -p 8000:8000 -v /path/to/your/local/directory:/app/data ganblr-fastapi
-```
-
-This will:
-
- • Expose port 8000: The FastAPI app will be available at http://localhost:8000.
- • Mount the local directory: It mounts the local directory /path/to/your/local/directory into the container’s /app/data directory.
-
-### Step 4: API Endpoints (Same as Local)
-
-You can now interact with the API at http://localhost:8000.
- - • Training the Model (with dataset file inside Docker): - -```bash -curl -X 'POST' \ - 'http://127.0.0.1:8000/train' \ - -H 'Content-Type: application/json' \ - -d '{ - "file_path": "/app/data/katabatic/nursery/nursery.data", - "epochs": 20 -}' -``` - - • Generating Synthetic Data: - -```bash -curl -X 'POST' \ - 'http://127.0.0.1:8000/generate' \ - -H 'Content-Type: application/json' \ - -d '{ - "num_samples": 100 -}' -``` - -## API Documentation - -### POST /train - -Train the GANBLR model on a dataset provided by the user. - - • Request Body: - • file_path (string): Path to the CSV dataset file. - • epochs (integer): Number of epochs for training. - • Response: - • 200: Successful training message. - • 500: Error during training (e.g., file not found or other issue). - -### POST /generate - -Generate synthetic data from the trained GANBLR model. - - • Request Body: - • num_samples (integer): Number of synthetic samples to generate. - • Response: - • 200: JSON array with generated synthetic data. - • 500: Error during data generation. - -### GET / - -Check the status of the API. - - • Response: - • 200: Status message indicating that the API is running. - -## Contributing - -Contributions are welcome! Please follow these steps: - - 1. Fork the repository. - 2. Create a new branch (git checkout -b feature-branch). - 3. Make your changes and ensure tests pass. - 4. Submit a pull request for review. - -## License - -This project is licensed under the MIT License. See the LICENSE file for details. - -## Contact - -For any questions or issues, please feel free to open an issue in the repository or reach out to the maintainers. 
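The documented /train and /generate endpoints can also be exercised from Python rather than curl. The sketch below is illustrative, not part of the repository: it assumes a local deployment at http://localhost:8000 and uses only the standard library; the payload shapes mirror the request bodies documented above, while the helper names (`build_train_payload`, `post_json`, etc.) are hypothetical.

```python
import json
import urllib.request

BASE_URL = "http://localhost:8000"  # assumed local deployment of the GANBLR API

def build_train_payload(file_path, epochs):
    # Request body for POST /train, as documented above.
    return {"file_path": file_path, "epochs": epochs}

def build_generate_payload(num_samples):
    # Request body for POST /generate, as documented above.
    return {"num_samples": num_samples}

def post_json(endpoint, payload):
    # POST a JSON body to the API and decode the JSON response.
    req = urllib.request.Request(
        BASE_URL + endpoint,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

# Example usage (requires the API to be running):
# post_json("/train", build_train_payload("/app/data/katabatic/nursery/nursery.data", 20))
# samples = post_json("/generate", build_generate_payload(100))
```

Training first and generating afterwards matches the API's expectation that /generate is only called on a trained model.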
diff --git a/Use_Cases/tryKatabatic_Ctgan.ipynb b/Use_Cases/tryKatabatic_Ctgan.ipynb
deleted file mode 100644
index aa8ba16..0000000
--- a/Use_Cases/tryKatabatic_Ctgan.ipynb
+++ /dev/null
@@ -1,476 +0,0 @@
-{
- "cells": [
-  {
-   "cell_type": "markdown",
-   "metadata": {},
-   "source": [
-    "# Katabatic Demo Usage"
-   ]
-  },
-  {
-   "cell_type": "markdown",
-   "metadata": {},
-   "source": [
-    "#### Importing Dependencies"
-   ]
-  },
-  {
-   "cell_type": "code",
-   "execution_count": 1,
-   "metadata": {},
-   "outputs": [
-    {
-     "name": "stdout",
-     "output_type": "stream",
-     "text": [
-      "Requirement already satisfied: scikit-learn in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (0.24.0)\n",
-      "Collecting scikit-learn\n",
-      "  Using cached scikit_learn-1.5.1-cp39-cp39-win_amd64.whl (11.0 MB)\n",
-      "Requirement already satisfied: scipy>=1.6.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from scikit-learn) (1.13.1)\n",
-      "Requirement already satisfied: joblib>=1.2.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from scikit-learn) (1.4.2)\n",
-      "Requirement already satisfied: numpy>=1.19.5 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from scikit-learn) (1.26.4)\n",
-      "Requirement already satisfied: threadpoolctl>=3.1.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from scikit-learn) (3.5.0)\n",
-      "Installing collected packages: scikit-learn\n",
-      "  Attempting uninstall: scikit-learn\n",
-      "    Found existing installation: scikit-learn 0.24.0\n",
-      "    Uninstalling scikit-learn-0.24.0:\n",
-      "      Successfully uninstalled scikit-learn-0.24.0\n",
-      "Successfully installed scikit-learn-1.5.1\n"
-     ]
-    },
-    {
-     "name": "stderr",
-     "output_type": "stream",
-     "text": [
-      "ERROR: pip's dependency resolver does not currently take into account all the packages that are installed.
This behaviour is the source of the following dependency conflicts.\n", - "pyitlib 0.2.3 requires scikit-learn<=0.24,>=0.16.0, but you have scikit-learn 1.5.1 which is incompatible.\n", - "WARNING: You are using pip version 22.0.4; however, version 24.2 is available.\n", - "You should consider upgrading via the 'C:\\Users\\Asus\\AppData\\Local\\Programs\\Python\\Python39\\python.exe -m pip install --upgrade pip' command.\n" - ] - } - ], - "source": [ - "!pip install -U scikit-learn" - ] - }, - { - "cell_type": "code", - "execution_count": 2, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "Requirement already satisfied: pyitlib in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (0.2.3)\n", - "Requirement already satisfied: pandas>=0.20.2 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pyitlib) (2.2.2)\n", - "Requirement already satisfied: numpy>=1.9.2 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pyitlib) (1.26.4)\n", - "Collecting scikit-learn<=0.24,>=0.16.0\n", - " Using cached scikit_learn-0.24.0-cp39-cp39-win_amd64.whl (6.9 MB)\n", - "Requirement already satisfied: scipy>=1.0.1 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pyitlib) (1.13.1)\n", - "Requirement already satisfied: future>=0.16.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pyitlib) (1.0.0)\n", - "Requirement already satisfied: tzdata>=2022.7 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pandas>=0.20.2->pyitlib) (2024.1)\n", - "Requirement already satisfied: python-dateutil>=2.8.2 in c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from pandas>=0.20.2->pyitlib) (2.9.0.post0)\n", - "Requirement already satisfied: pytz>=2020.1 in 
c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pandas>=0.20.2->pyitlib) (2024.1)\n", - "Requirement already satisfied: threadpoolctl>=2.0.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from scikit-learn<=0.24,>=0.16.0->pyitlib) (3.5.0)\n", - "Requirement already satisfied: joblib>=0.11 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from scikit-learn<=0.24,>=0.16.0->pyitlib) (1.4.2)\n", - "Requirement already satisfied: six>=1.5 in c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from python-dateutil>=2.8.2->pandas>=0.20.2->pyitlib) (1.16.0)\n", - "Installing collected packages: scikit-learn\n", - " Attempting uninstall: scikit-learn\n", - " Found existing installation: scikit-learn 1.5.1\n", - " Uninstalling scikit-learn-1.5.1:\n", - " Successfully uninstalled scikit-learn-1.5.1\n", - "Successfully installed scikit-learn-0.24.0\n" - ] - }, - { - "name": "stderr", - "output_type": "stream", - "text": [ - "ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. 
This behaviour is the source of the following dependency conflicts.\n", - "sdmetrics 0.15.0 requires scikit-learn>=1.0.2; python_version < \"3.10\", but you have scikit-learn 0.24.0 which is incompatible.\n", - "rdt 1.12.2 requires scikit-learn>=1.0.2; python_version < \"3.10\", but you have scikit-learn 0.24.0 which is incompatible.\n", - "WARNING: You are using pip version 22.0.4; however, version 24.2 is available.\n", - "You should consider upgrading via the 'C:\\Users\\Asus\\AppData\\Local\\Programs\\Python\\Python39\\python.exe -m pip install --upgrade pip' command.\n" - ] - } - ], - "source": [ - "!pip install pyitlib" - ] - }, - { - "cell_type": "code", - "execution_count": 3, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "Requirement already satisfied: tensorflow in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (2.17.0)\n", - "Requirement already satisfied: tensorflow-intel==2.17.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow) (2.17.0)\n", - "Requirement already satisfied: numpy<2.0.0,>=1.23.5 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (1.26.4)\n", - "Requirement already satisfied: typing-extensions>=3.6.6 in c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (4.12.2)\n", - "Requirement already satisfied: google-pasta>=0.1.1 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (0.2.0)\n", - "Requirement already satisfied: tensorflow-io-gcs-filesystem>=0.23.1 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (0.31.0)\n", - "Requirement already satisfied: ml-dtypes<0.5.0,>=0.3.1 in 
c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (0.4.0)\n", - "Requirement already satisfied: h5py>=3.10.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (3.11.0)\n", - "Requirement already satisfied: six>=1.12.0 in c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (1.16.0)\n", - "Requirement already satisfied: requests<3,>=2.21.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (2.32.3)\n", - "Requirement already satisfied: keras>=3.2.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (3.4.1)\n", - "Requirement already satisfied: flatbuffers>=24.3.25 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (24.3.25)\n", - "Requirement already satisfied: termcolor>=1.1.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (2.4.0)\n", - "Requirement already satisfied: wrapt>=1.11.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (1.16.0)\n", - "Requirement already satisfied: grpcio<2.0,>=1.24.3 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (1.65.4)\n", - "Requirement already satisfied: opt-einsum>=2.3.2 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (3.3.0)\n", - "Requirement already satisfied: setuptools in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from 
tensorflow-intel==2.17.0->tensorflow) (58.1.0)\n", - "Requirement already satisfied: packaging in c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (24.1)\n", - "Requirement already satisfied: absl-py>=1.0.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (2.1.0)\n", - "Requirement already satisfied: gast!=0.5.0,!=0.5.1,!=0.5.2,>=0.2.1 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (0.6.0)\n", - "Requirement already satisfied: protobuf!=4.21.0,!=4.21.1,!=4.21.2,!=4.21.3,!=4.21.4,!=4.21.5,<5.0.0dev,>=3.20.3 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (4.25.4)\n", - "Requirement already satisfied: libclang>=13.0.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (18.1.1)\n", - "Requirement already satisfied: tensorboard<2.18,>=2.17 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (2.17.0)\n", - "Requirement already satisfied: astunparse>=1.6.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (1.6.3)\n", - "Requirement already satisfied: wheel<1.0,>=0.23.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from astunparse>=1.6.0->tensorflow-intel==2.17.0->tensorflow) (0.44.0)\n", - "Requirement already satisfied: rich in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from keras>=3.2.0->tensorflow-intel==2.17.0->tensorflow) (13.7.1)\n", - "Requirement already satisfied: namex in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from 
keras>=3.2.0->tensorflow-intel==2.17.0->tensorflow) (0.0.8)\n", - "Requirement already satisfied: optree in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from keras>=3.2.0->tensorflow-intel==2.17.0->tensorflow) (0.12.1)\n", - "Requirement already satisfied: charset-normalizer<4,>=2 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from requests<3,>=2.21.0->tensorflow-intel==2.17.0->tensorflow) (3.3.2)\n", - "Requirement already satisfied: idna<4,>=2.5 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from requests<3,>=2.21.0->tensorflow-intel==2.17.0->tensorflow) (3.7)\n", - "Requirement already satisfied: certifi>=2017.4.17 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from requests<3,>=2.21.0->tensorflow-intel==2.17.0->tensorflow) (2024.7.4)\n", - "Requirement already satisfied: urllib3<3,>=1.21.1 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from requests<3,>=2.21.0->tensorflow-intel==2.17.0->tensorflow) (1.26.19)\n", - "Requirement already satisfied: werkzeug>=1.0.1 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorboard<2.18,>=2.17->tensorflow-intel==2.17.0->tensorflow) (3.0.3)\n", - "Requirement already satisfied: tensorboard-data-server<0.8.0,>=0.7.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorboard<2.18,>=2.17->tensorflow-intel==2.17.0->tensorflow) (0.7.2)\n", - "Requirement already satisfied: markdown>=2.6.8 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorboard<2.18,>=2.17->tensorflow-intel==2.17.0->tensorflow) (3.6)\n", - "Requirement already satisfied: importlib-metadata>=4.4 in c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from markdown>=2.6.8->tensorboard<2.18,>=2.17->tensorflow-intel==2.17.0->tensorflow) 
(8.2.0)\n", - "Requirement already satisfied: MarkupSafe>=2.1.1 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from werkzeug>=1.0.1->tensorboard<2.18,>=2.17->tensorflow-intel==2.17.0->tensorflow) (2.1.5)\n", - "Requirement already satisfied: markdown-it-py>=2.2.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from rich->keras>=3.2.0->tensorflow-intel==2.17.0->tensorflow) (3.0.0)\n", - "Requirement already satisfied: pygments<3.0.0,>=2.13.0 in c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from rich->keras>=3.2.0->tensorflow-intel==2.17.0->tensorflow) (2.18.0)\n", - "Requirement already satisfied: zipp>=0.5 in c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from importlib-metadata>=4.4->markdown>=2.6.8->tensorboard<2.18,>=2.17->tensorflow-intel==2.17.0->tensorflow) (3.20.0)\n", - "Requirement already satisfied: mdurl~=0.1 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from markdown-it-py>=2.2.0->rich->keras>=3.2.0->tensorflow-intel==2.17.0->tensorflow) (0.1.2)\n" - ] - }, - { - "name": "stderr", - "output_type": "stream", - "text": [ - "WARNING: You are using pip version 22.0.4; however, version 24.2 is available.\n", - "You should consider upgrading via the 'C:\\Users\\Asus\\AppData\\Local\\Programs\\Python\\Python39\\python.exe -m pip install --upgrade pip' command.\n" - ] - } - ], - "source": [ - "!pip install tensorflow" - ] - }, - { - "cell_type": "code", - "execution_count": 4, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "Requirement already satisfied: pgmpy in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (0.1.26)\n", - "Requirement already satisfied: google-generativeai in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pgmpy) (0.7.2)\n", - "Requirement already satisfied: scikit-learn 
in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pgmpy) (0.24.0)\n", - "Requirement already satisfied: pyparsing in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pgmpy) (3.1.2)\n", - "Requirement already satisfied: torch in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pgmpy) (2.4.0)\n", - "Requirement already satisfied: pandas in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pgmpy) (2.2.2)\n", - "Requirement already satisfied: xgboost in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pgmpy) (2.1.1)\n", - "Requirement already satisfied: joblib in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pgmpy) (1.4.2)\n", - "Requirement already satisfied: numpy in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pgmpy) (1.26.4)\n", - "Requirement already satisfied: statsmodels in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pgmpy) (0.14.2)\n", - "Requirement already satisfied: tqdm in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pgmpy) (4.66.5)\n", - "Requirement already satisfied: scipy in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pgmpy) (1.13.1)\n", - "Requirement already satisfied: networkx in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pgmpy) (3.2.1)\n", - "Requirement already satisfied: opt-einsum in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pgmpy) (3.3.0)\n", - "Requirement already satisfied: google-api-core in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from google-generativeai->pgmpy) (2.19.1)\n", - "Requirement already satisfied: 
google-api-python-client in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from google-generativeai->pgmpy) (2.140.0)\n", - "Requirement already satisfied: google-auth>=2.15.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from google-generativeai->pgmpy) (2.33.0)\n", - "Requirement already satisfied: typing-extensions in c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from google-generativeai->pgmpy) (4.12.2)\n", - "Requirement already satisfied: pydantic in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from google-generativeai->pgmpy) (2.8.2)\n", - "Requirement already satisfied: google-ai-generativelanguage==0.6.6 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from google-generativeai->pgmpy) (0.6.6)\n", - "Requirement already satisfied: protobuf in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from google-generativeai->pgmpy) (4.25.4)\n", - "Requirement already satisfied: proto-plus<2.0.0dev,>=1.22.3 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from google-ai-generativelanguage==0.6.6->google-generativeai->pgmpy) (1.24.0)\n", - "Requirement already satisfied: pytz>=2020.1 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pandas->pgmpy) (2024.1)\n", - "Requirement already satisfied: tzdata>=2022.7 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pandas->pgmpy) (2024.1)\n", - "Requirement already satisfied: python-dateutil>=2.8.2 in c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from pandas->pgmpy) (2.9.0.post0)\n", - "Requirement already satisfied: threadpoolctl>=2.0.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from scikit-learn->pgmpy) (3.5.0)\n", - "Requirement already satisfied: 
packaging>=21.3 in c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from statsmodels->pgmpy) (24.1)\n", - "Requirement already satisfied: patsy>=0.5.6 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from statsmodels->pgmpy) (0.5.6)\n", - "Requirement already satisfied: fsspec in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from torch->pgmpy) (2024.6.1)\n", - "Requirement already satisfied: sympy in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from torch->pgmpy) (1.13.2)\n", - "Requirement already satisfied: jinja2 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from torch->pgmpy) (3.1.4)\n", - "Requirement already satisfied: filelock in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from torch->pgmpy) (3.15.4)\n", - "Requirement already satisfied: colorama in c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from tqdm->pgmpy) (0.4.6)\n", - "Requirement already satisfied: rsa<5,>=3.1.4 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from google-auth>=2.15.0->google-generativeai->pgmpy) (4.9)\n", - "Requirement already satisfied: cachetools<6.0,>=2.0.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from google-auth>=2.15.0->google-generativeai->pgmpy) (5.4.0)\n", - "Requirement already satisfied: pyasn1-modules>=0.2.1 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from google-auth>=2.15.0->google-generativeai->pgmpy) (0.4.0)\n", - "Requirement already satisfied: six in c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from patsy>=0.5.6->statsmodels->pgmpy) (1.16.0)\n", - "Requirement already satisfied: requests<3.0.0.dev0,>=2.18.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from 
google-api-core->google-generativeai->pgmpy) (2.32.3)\n", - "Requirement already satisfied: googleapis-common-protos<2.0.dev0,>=1.56.2 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from google-api-core->google-generativeai->pgmpy) (1.63.2)\n", - "Requirement already satisfied: uritemplate<5,>=3.0.1 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from google-api-python-client->google-generativeai->pgmpy) (4.1.1)\n", - "Requirement already satisfied: google-auth-httplib2<1.0.0,>=0.2.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from google-api-python-client->google-generativeai->pgmpy) (0.2.0)\n", - "Requirement already satisfied: httplib2<1.dev0,>=0.19.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from google-api-python-client->google-generativeai->pgmpy) (0.22.0)\n", - "Requirement already satisfied: MarkupSafe>=2.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from jinja2->torch->pgmpy) (2.1.5)\n", - "Requirement already satisfied: annotated-types>=0.4.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pydantic->google-generativeai->pgmpy) (0.7.0)\n", - "Requirement already satisfied: pydantic-core==2.20.1 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pydantic->google-generativeai->pgmpy) (2.20.1)\n", - "Requirement already satisfied: mpmath<1.4,>=1.1.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from sympy->torch->pgmpy) (1.3.0)\n", - "Requirement already satisfied: grpcio-status<2.0.dev0,>=1.33.2 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from google-api-core->google-generativeai->pgmpy) (1.62.3)\n", - "Requirement already satisfied: grpcio<2.0dev,>=1.33.2 in 
c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from google-api-core->google-generativeai->pgmpy) (1.65.4)\n", - "Requirement already satisfied: pyasn1<0.7.0,>=0.4.6 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pyasn1-modules>=0.2.1->google-auth>=2.15.0->google-generativeai->pgmpy) (0.6.0)\n", - "Requirement already satisfied: idna<4,>=2.5 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from requests<3.0.0.dev0,>=2.18.0->google-api-core->google-generativeai->pgmpy) (3.7)\n", - "Requirement already satisfied: certifi>=2017.4.17 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from requests<3.0.0.dev0,>=2.18.0->google-api-core->google-generativeai->pgmpy) (2024.7.4)\n", - "Requirement already satisfied: charset-normalizer<4,>=2 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from requests<3.0.0.dev0,>=2.18.0->google-api-core->google-generativeai->pgmpy) (3.3.2)\n", - "Requirement already satisfied: urllib3<3,>=1.21.1 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from requests<3.0.0.dev0,>=2.18.0->google-api-core->google-generativeai->pgmpy) (1.26.19)\n" - ] - }, - { - "name": "stderr", - "output_type": "stream", - "text": [ - "WARNING: You are using pip version 22.0.4; however, version 24.2 is available.\n", - "You should consider upgrading via the 'C:\\Users\\Asus\\AppData\\Local\\Programs\\Python\\Python39\\python.exe -m pip install --upgrade pip' command.\n" - ] - } - ], - "source": [ - "!pip install pgmpy" - ] - }, - { - "cell_type": "code", - "execution_count": 5, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "Requirement already satisfied: sdv in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (1.15.0)\n", - "Requirement already satisfied: cloudpickle>=2.1.0 in 
c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from sdv) (3.0.0)\n", - "Requirement already satisfied: ctgan>=0.10.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from sdv) (0.10.1)\n", - "Requirement already satisfied: copulas>=0.11.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from sdv) (0.11.0)\n", - "Requirement already satisfied: rdt>=1.12.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from sdv) (1.12.2)\n", - "Requirement already satisfied: sdmetrics>=0.14.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from sdv) (0.15.0)\n", - "Requirement already satisfied: pyyaml>=6.0.1 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from sdv) (6.0.2)\n", - "Requirement already satisfied: tqdm>=4.29 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from sdv) (4.66.5)\n", - "Requirement already satisfied: pandas>=1.4.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from sdv) (2.2.2)\n", - "Requirement already satisfied: platformdirs>=4.0 in c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from sdv) (4.2.2)\n", - "Requirement already satisfied: deepecho>=0.6.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from sdv) (0.6.0)\n", - "Requirement already satisfied: numpy<2.0.0,>=1.21.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from sdv) (1.26.4)\n", - "Requirement already satisfied: graphviz>=0.13.2 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from sdv) (0.20.3)\n", - "Requirement already satisfied: boto3>=1.28 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from sdv) (1.34.158)\n", - "Requirement already satisfied: 
botocore>=1.31 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from sdv) (1.34.158)\n", - "Requirement already satisfied: s3transfer<0.11.0,>=0.10.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from boto3>=1.28->sdv) (0.10.2)\n", - "Requirement already satisfied: jmespath<2.0.0,>=0.7.1 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from boto3>=1.28->sdv) (1.0.1)\n", - "Requirement already satisfied: urllib3<1.27,>=1.25.4 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from botocore>=1.31->sdv) (1.26.19)\n", - "Requirement already satisfied: python-dateutil<3.0.0,>=2.1 in c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from botocore>=1.31->sdv) (2.9.0.post0)\n", - "Requirement already satisfied: plotly>=5.10.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from copulas>=0.11.0->sdv) (5.23.0)\n", - "Requirement already satisfied: scipy>=1.5.4 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from copulas>=0.11.0->sdv) (1.13.1)\n", - "Requirement already satisfied: torch>=1.9.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from ctgan>=0.10.0->sdv) (2.4.0)\n", - "Requirement already satisfied: tzdata>=2022.7 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pandas>=1.4.0->sdv) (2024.1)\n", - "Requirement already satisfied: pytz>=2020.1 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pandas>=1.4.0->sdv) (2024.1)\n", - "Collecting scikit-learn>=1.0.2\n", - " Using cached scikit_learn-1.5.1-cp39-cp39-win_amd64.whl (11.0 MB)\n", - "Requirement already satisfied: Faker>=17 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from rdt>=1.12.0->sdv) (26.3.0)\n", - "Requirement already satisfied: 
colorama in c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from tqdm>=4.29->sdv) (0.4.6)\n", - "Requirement already satisfied: tenacity>=6.2.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from plotly>=5.10.0->copulas>=0.11.0->sdv) (9.0.0)\n", - "Requirement already satisfied: packaging in c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from plotly>=5.10.0->copulas>=0.11.0->sdv) (24.1)\n", - "Requirement already satisfied: six>=1.5 in c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from python-dateutil<3.0.0,>=2.1->botocore>=1.31->sdv) (1.16.0)\n", - "Requirement already satisfied: joblib>=1.2.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from scikit-learn>=1.0.2->rdt>=1.12.0->sdv) (1.4.2)\n", - "Requirement already satisfied: threadpoolctl>=3.1.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from scikit-learn>=1.0.2->rdt>=1.12.0->sdv) (3.5.0)\n", - "Requirement already satisfied: filelock in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from torch>=1.9.0->ctgan>=0.10.0->sdv) (3.15.4)\n", - "Requirement already satisfied: fsspec in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from torch>=1.9.0->ctgan>=0.10.0->sdv) (2024.6.1)\n", - "Requirement already satisfied: networkx in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from torch>=1.9.0->ctgan>=0.10.0->sdv) (3.2.1)\n", - "Requirement already satisfied: jinja2 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from torch>=1.9.0->ctgan>=0.10.0->sdv) (3.1.4)\n", - "Requirement already satisfied: sympy in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from torch>=1.9.0->ctgan>=0.10.0->sdv) (1.13.2)\n", - "Requirement already satisfied: typing-extensions>=4.8.0 in 
c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from torch>=1.9.0->ctgan>=0.10.0->sdv) (4.12.2)\n", - "Requirement already satisfied: MarkupSafe>=2.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from jinja2->torch>=1.9.0->ctgan>=0.10.0->sdv) (2.1.5)\n", - "Requirement already satisfied: mpmath<1.4,>=1.1.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from sympy->torch>=1.9.0->ctgan>=0.10.0->sdv) (1.3.0)\n", - "Installing collected packages: scikit-learn\n", - " Attempting uninstall: scikit-learn\n", - " Found existing installation: scikit-learn 0.24.0\n", - " Uninstalling scikit-learn-0.24.0:\n", - " Successfully uninstalled scikit-learn-0.24.0\n", - "Successfully installed scikit-learn-1.5.1\n" - ] - }, - { - "name": "stderr", - "output_type": "stream", - "text": [ - "ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.\n", - "pyitlib 0.2.3 requires scikit-learn<=0.24,>=0.16.0, but you have scikit-learn 1.5.1 which is incompatible.\n", - "WARNING: You are using pip version 22.0.4; however, version 24.2 is available.\n", - "You should consider upgrading via the 'C:\\Users\\Asus\\AppData\\Local\\Programs\\Python\\Python39\\python.exe -m pip install --upgrade pip' command.\n" - ] - } - ], - "source": [ - "!pip install sdv" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "#### Importing Katabatic" - ] - }, - { - "cell_type": "code", - "execution_count": 5, - "metadata": {}, - "outputs": [], - "source": [ - "from katabatic.katabatic import Katabatic\n", - "import numpy as np\n" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Sample Training Data" - ] - }, - { - "cell_type": "code", - "execution_count": 6, - "metadata": {}, - "outputs": [], - "source": [ - "# Generating sample data for X_train\n", - 
"X_train = np.array([\n", - " [0.1, 1.2, 3.4, 4.5],\n", - " [2.3, 3.4, 1.1, 0.2],\n", - " [1.4, 0.4, 2.2, 3.3],\n", - " [3.1, 4.2, 1.0, 2.5],\n", - " [2.2, 1.1, 0.3, 4.1],\n", - " [0.5, 2.2, 4.4, 1.3],\n", - " [1.8, 3.3, 0.9, 2.2],\n", - " [2.6, 1.5, 3.3, 0.1],\n", - " [0.7, 4.4, 1.4, 3.0],\n", - " [1.9, 2.8, 3.9, 4.0]\n", - "])\n", - "\n", - "# Generating sample data for y_train\n", - "y_train = np.array([0, 1, 0, 1, 0, 1, 0, 1, 0, 1])" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Load Model and Train it on Sample Training Data" - ] - }, - { - "cell_type": "code", - "execution_count": 7, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "--------------------------\n", - "module name: katabatic.katabatic\n", - "parent process: 28468\n", - "process id: 28884\n", - "katabatic.models.ganblr.ganblr_adapter\n", - "katabatic.models.ganblr.ganblr_adapter\n", - "\n", - "[INFO] Initializing GANBLR Model\n", - "[INFO] Training GANBLR model\n", - "[SUCCESS] Model training completed\n" - ] - } - ], - "source": [ - "model = Katabatic.run_model('ganblr')\n", - "model.load_model()\n", - "model.fit(X_train,y_train)" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Generate New Data from trained model" - ] - }, - { - "cell_type": "code", - "execution_count": 8, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "[INFO] Generating data using GANBLR model\n", - "[SUCCESS] Data generation completed\n" - ] - }, - { - "data": { - "text/plain": [ - "array([[3.1, 1.1, 0.3, 4.0, 0],\n", - " [0.7, 1.5, 0.3, 3.3, 1],\n", - " [0.7, 1.1, 2.2, 2.5, 1],\n", - " [2.6, 2.8, 1.1, 0.1, 0],\n", - " [2.6, 0.4, 4.4, 4.5, 1]], dtype=object)" - ] - }, - "execution_count": 8, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "model.generate(size=5)" - ] - }, - { - "cell_type": "code", - "execution_count": null, - "metadata": {}, - 
"outputs": [], - "source": [ - "# Future import statement ideas\n", - "\n", - "import katabatic as kb\n", - "from katabatic.models import meg\n", - "from katabatic.models import ganblr\n", - "from katabatic.evaluate import eval_method1\n", - "from katabatic.utils.preprocessing import data_processing_method1 # good place to store preprocessing utilities" - ] - } - ], - "metadata": { - "kernelspec": { - "display_name": "Python 3", - "language": "python", - "name": "python3" - }, - "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 3 - }, - "file_extension": ".py", - "mimetype": "text/x-python", - "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython3", - "version": "3.9.13" - } - }, - "nbformat": 4, - "nbformat_minor": 2 -} diff --git a/Use_Cases/tryKatabatic_Ganblr++.ipynb b/Use_Cases/tryKatabatic_Ganblr++.ipynb deleted file mode 100644 index 1d8953f..0000000 --- a/Use_Cases/tryKatabatic_Ganblr++.ipynb +++ /dev/null @@ -1,922 +0,0 @@ -{ - "cells": [ - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "# Katabatic Demo Usage" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "#### Importing Dependenciencies" - ] - }, - { - "cell_type": "code", - "execution_count": 1, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "Requirement already satisfied: scikit-learn in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (0.24.0)\n", - "Collecting scikit-learn\n", - " Using cached scikit_learn-1.5.1-cp39-cp39-win_amd64.whl (11.0 MB)\n", - "Requirement already satisfied: scipy>=1.6.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from scikit-learn) (1.13.1)\n", - "Requirement already satisfied: joblib>=1.2.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from scikit-learn) (1.4.2)\n", - "Requirement already satisfied: numpy>=1.19.5 in 
c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from scikit-learn) (1.26.4)\n", - "Requirement already satisfied: threadpoolctl>=3.1.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from scikit-learn) (3.5.0)\n", - "Installing collected packages: scikit-learn\n", - " Attempting uninstall: scikit-learn\n", - " Found existing installation: scikit-learn 0.24.0\n", - " Uninstalling scikit-learn-0.24.0:\n", - " Successfully uninstalled scikit-learn-0.24.0\n", - "Successfully installed scikit-learn-1.5.1\n" - ] - }, - { - "name": "stderr", - "output_type": "stream", - "text": [ - "ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.\n", - "pyitlib 0.2.3 requires scikit-learn<=0.24,>=0.16.0, but you have scikit-learn 1.5.1 which is incompatible.\n", - "WARNING: You are using pip version 22.0.4; however, version 24.2 is available.\n", - "You should consider upgrading via the 'C:\\Users\\Asus\\AppData\\Local\\Programs\\Python\\Python39\\python.exe -m pip install --upgrade pip' command.\n" - ] - } - ], - "source": [ - "!pip install -U scikit-learn" - ] - }, - { - "cell_type": "code", - "execution_count": 5, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "Collecting pyitlib\n", - " Downloading pyitlib-0.2.3.tar.gz (30 kB)\n", - " Preparing metadata (setup.py): started\n", - " Preparing metadata (setup.py): finished with status 'done'\n", - "Collecting pandas>=0.20.2\n", - " Downloading pandas-2.2.2-cp39-cp39-win_amd64.whl (11.6 MB)\n", - " --------------------------------------- 11.6/11.6 MB 46.7 MB/s eta 0:00:00\n", - "Collecting numpy>=1.9.2\n", - " Downloading numpy-2.0.2-cp39-cp39-win_amd64.whl (15.9 MB)\n", - " --------------------------------------- 15.9/15.9 MB 38.6 MB/s eta 0:00:00\n", - "Collecting 
scikit-learn<=0.24,>=0.16.0\n", - " Downloading scikit_learn-0.24.0-cp39-cp39-win_amd64.whl (6.9 MB)\n", - " ---------------------------------------- 6.9/6.9 MB 29.3 MB/s eta 0:00:00\n", - "Collecting scipy>=1.0.1\n", - " Downloading scipy-1.13.1-cp39-cp39-win_amd64.whl (46.2 MB)\n", - " --------------------------------------- 46.2/46.2 MB 23.4 MB/s eta 0:00:00\n", - "Collecting future>=0.16.0\n", - " Downloading future-1.0.0-py3-none-any.whl (491 kB)\n", - " ------------------------------------- 491.3/491.3 KB 30.1 MB/s eta 0:00:00\n", - "Requirement already satisfied: python-dateutil>=2.8.2 in c:\\users\\danie\\appdata\\roaming\\python\\python39\\site-packages (from pandas>=0.20.2->pyitlib) (2.9.0.post0)\n", - "Collecting tzdata>=2022.7\n", - " Downloading tzdata-2024.1-py2.py3-none-any.whl (345 kB)\n", - " ---------------------------------------- 345.4/345.4 KB ? eta 0:00:00\n", - "Collecting pytz>=2020.1\n", - " Downloading pytz-2024.2-py2.py3-none-any.whl (508 kB)\n", - " ------------------------------------- 508.0/508.0 KB 33.2 MB/s eta 0:00:00\n", - "Collecting threadpoolctl>=2.0.0\n", - " Downloading threadpoolctl-3.5.0-py3-none-any.whl (18 kB)\n", - "Collecting joblib>=0.11\n", - " Downloading joblib-1.4.2-py3-none-any.whl (301 kB)\n", - " ------------------------------------- 301.8/301.8 KB 19.4 MB/s eta 0:00:00\n", - "Requirement already satisfied: six>=1.5 in c:\\users\\danie\\appdata\\roaming\\python\\python39\\site-packages (from python-dateutil>=2.8.2->pandas>=0.20.2->pyitlib) (1.16.0)\n", - "Using legacy 'setup.py install' for pyitlib, since package 'wheel' is not installed.\n", - "Installing collected packages: pytz, tzdata, threadpoolctl, numpy, joblib, future, scipy, pandas, scikit-learn, pyitlib\n", - " Running setup.py install for pyitlib: started\n", - " Running setup.py install for pyitlib: finished with status 'done'\n", - "Successfully installed future-1.0.0 joblib-1.4.2 numpy-2.0.2 pandas-2.2.2 pyitlib-0.2.3 pytz-2024.2 scikit-learn-0.24.0 
scipy-1.13.1 threadpoolctl-3.5.0 tzdata-2024.1\n", - "Note: you may need to restart the kernel to use updated packages.\n" - ] - }, - { - "name": "stderr", - "output_type": "stream", - "text": [ - " WARNING: The scripts f2py.exe and numpy-config.exe are installed in 'c:\\Users\\danie\\AppData\\Local\\Programs\\Python\\Python39\\Scripts' which is not on PATH.\n", - " Consider adding this directory to PATH or, if you prefer to suppress this warning, use --no-warn-script-location.\n", - " WARNING: The scripts futurize.exe and pasteurize.exe are installed in 'c:\\Users\\danie\\AppData\\Local\\Programs\\Python\\Python39\\Scripts' which is not on PATH.\n", - " Consider adding this directory to PATH or, if you prefer to suppress this warning, use --no-warn-script-location.\n", - "WARNING: You are using pip version 22.0.4; however, version 24.2 is available.\n", - "You should consider upgrading via the 'c:\\Users\\danie\\AppData\\Local\\Programs\\Python\\Python39\\python.exe -m pip install --upgrade pip' command.\n" - ] - } - ], - "source": [ - "pip install pyitlib" - ] - }, - { - "cell_type": "code", - "execution_count": 6, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "Collecting tensorflow\n", - " Downloading tensorflow-2.17.0-cp39-cp39-win_amd64.whl (2.0 kB)\n", - "Collecting tensorflow-intel==2.17.0\n", - " Downloading tensorflow_intel-2.17.0-cp39-cp39-win_amd64.whl (385.0 MB)\n", - " ------------------------------------- 385.0/385.0 MB 11.9 MB/s eta 0:00:00\n", - "Collecting termcolor>=1.1.0\n", - " Downloading termcolor-2.4.0-py3-none-any.whl (7.7 kB)\n", - "Collecting gast!=0.5.0,!=0.5.1,!=0.5.2,>=0.2.1\n", - " Downloading gast-0.6.0-py3-none-any.whl (21 kB)\n", - "Collecting keras>=3.2.0\n", - " Downloading keras-3.5.0-py3-none-any.whl (1.1 MB)\n", - " ---------------------------------------- 1.1/1.1 MB 75.7 MB/s eta 0:00:00\n", - "Collecting opt-einsum>=2.3.2\n", - " Downloading opt_einsum-3.3.0-py3-none-any.whl 
(65 kB)\n", - " ---------------------------------------- 65.5/65.5 KB ? eta 0:00:00\n", - "Collecting grpcio<2.0,>=1.24.3\n", - " Downloading grpcio-1.66.1-cp39-cp39-win_amd64.whl (4.3 MB)\n", - " ---------------------------------------- 4.3/4.3 MB 46.0 MB/s eta 0:00:00\n", - "Requirement already satisfied: six>=1.12.0 in c:\\users\\danie\\appdata\\roaming\\python\\python39\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (1.16.0)\n", - "Requirement already satisfied: packaging in c:\\users\\danie\\appdata\\roaming\\python\\python39\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (24.1)\n", - "Collecting requests<3,>=2.21.0\n", - " Downloading requests-2.32.3-py3-none-any.whl (64 kB)\n", - " ---------------------------------------- 64.9/64.9 KB 3.4 MB/s eta 0:00:00\n", - "Requirement already satisfied: setuptools in c:\\users\\danie\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (58.1.0)\n", - "Collecting wrapt>=1.11.0\n", - " Downloading wrapt-1.16.0-cp39-cp39-win_amd64.whl (37 kB)\n", - "Collecting tensorboard<2.18,>=2.17\n", - " Downloading tensorboard-2.17.1-py3-none-any.whl (5.5 MB)\n", - " ---------------------------------------- 5.5/5.5 MB 39.0 MB/s eta 0:00:00\n", - "Collecting astunparse>=1.6.0\n", - " Downloading astunparse-1.6.3-py2.py3-none-any.whl (12 kB)\n", - "Collecting protobuf!=4.21.0,!=4.21.1,!=4.21.2,!=4.21.3,!=4.21.4,!=4.21.5,<5.0.0dev,>=3.20.3\n", - " Downloading protobuf-4.25.4-cp39-cp39-win_amd64.whl (413 kB)\n", - " ---------------------------------------- 413.4/413.4 KB ? 
eta 0:00:00\n", - "Collecting numpy<2.0.0,>=1.23.5\n", - " Downloading numpy-1.26.4-cp39-cp39-win_amd64.whl (15.8 MB)\n", - " --------------------------------------- 15.8/15.8 MB 43.5 MB/s eta 0:00:00\n", - "Collecting h5py>=3.10.0\n", - " Downloading h5py-3.11.0-cp39-cp39-win_amd64.whl (3.0 MB)\n", - " ---------------------------------------- 3.0/3.0 MB 47.1 MB/s eta 0:00:00\n", - "Collecting google-pasta>=0.1.1\n", - " Downloading google_pasta-0.2.0-py3-none-any.whl (57 kB)\n", - " ---------------------------------------- 57.5/57.5 KB ? eta 0:00:00\n", - "Collecting ml-dtypes<0.5.0,>=0.3.1\n", - " Downloading ml_dtypes-0.4.1-cp39-cp39-win_amd64.whl (126 kB)\n", - " ---------------------------------------- 126.7/126.7 KB ? eta 0:00:00\n", - "Collecting tensorflow-io-gcs-filesystem>=0.23.1\n", - " Downloading tensorflow_io_gcs_filesystem-0.31.0-cp39-cp39-win_amd64.whl (1.5 MB)\n", - " ---------------------------------------- 1.5/1.5 MB 32.1 MB/s eta 0:00:00\n", - "Collecting absl-py>=1.0.0\n", - " Downloading absl_py-2.1.0-py3-none-any.whl (133 kB)\n", - " ---------------------------------------- 133.7/133.7 KB ? eta 0:00:00\n", - "Collecting libclang>=13.0.0\n", - " Downloading libclang-18.1.1-py2.py3-none-win_amd64.whl (26.4 MB)\n", - " --------------------------------------- 26.4/26.4 MB 28.5 MB/s eta 0:00:00\n", - "Collecting flatbuffers>=24.3.25\n", - " Downloading flatbuffers-24.3.25-py2.py3-none-any.whl (26 kB)\n", - "Requirement already satisfied: typing-extensions>=3.6.6 in c:\\users\\danie\\appdata\\roaming\\python\\python39\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (4.12.2)\n", - "Collecting wheel<1.0,>=0.23.0\n", - " Downloading wheel-0.44.0-py3-none-any.whl (67 kB)\n", - " ---------------------------------------- 67.1/67.1 KB ? 
eta 0:00:00\n", - "Collecting optree\n", - " Downloading optree-0.12.1-cp39-cp39-win_amd64.whl (263 kB)\n", - " ------------------------------------- 263.3/263.3 KB 15.8 MB/s eta 0:00:00\n", - "Collecting namex\n", - " Downloading namex-0.0.8-py3-none-any.whl (5.8 kB)\n", - "Collecting rich\n", - " Downloading rich-13.8.1-py3-none-any.whl (241 kB)\n", - " ------------------------------------- 241.6/241.6 KB 15.4 MB/s eta 0:00:00\n", - "Collecting urllib3<3,>=1.21.1\n", - " Downloading urllib3-2.2.3-py3-none-any.whl (126 kB)\n", - " ---------------------------------------- 126.3/126.3 KB ? eta 0:00:00\n", - "Collecting charset-normalizer<4,>=2\n", - " Downloading charset_normalizer-3.3.2-cp39-cp39-win_amd64.whl (100 kB)\n", - " -------------------------------------- 100.4/100.4 KB 5.6 MB/s eta 0:00:00\n", - "Collecting certifi>=2017.4.17\n", - " Downloading certifi-2024.8.30-py3-none-any.whl (167 kB)\n", - " ---------------------------------------- 167.3/167.3 KB ? eta 0:00:00\n", - "Collecting idna<4,>=2.5\n", - " Downloading idna-3.10-py3-none-any.whl (70 kB)\n", - " ---------------------------------------- 70.4/70.4 KB ? eta 0:00:00\n", - "Collecting markdown>=2.6.8\n", - " Downloading Markdown-3.7-py3-none-any.whl (106 kB)\n", - " ---------------------------------------- 106.3/106.3 KB ? eta 0:00:00\n", - "Collecting tensorboard-data-server<0.8.0,>=0.7.0\n", - " Downloading tensorboard_data_server-0.7.2-py3-none-any.whl (2.4 kB)\n", - "Collecting werkzeug>=1.0.1\n", - " Downloading werkzeug-3.0.4-py3-none-any.whl (227 kB)\n", - " ---------------------------------------- 227.6/227.6 KB ? 
eta 0:00:00\n", - "Requirement already satisfied: importlib-metadata>=4.4 in c:\\users\\danie\\appdata\\roaming\\python\\python39\\site-packages (from markdown>=2.6.8->tensorboard<2.18,>=2.17->tensorflow-intel==2.17.0->tensorflow) (8.5.0)\n", - "Collecting MarkupSafe>=2.1.1\n", - " Downloading MarkupSafe-2.1.5-cp39-cp39-win_amd64.whl (17 kB)\n", - "Requirement already satisfied: pygments<3.0.0,>=2.13.0 in c:\\users\\danie\\appdata\\roaming\\python\\python39\\site-packages (from rich->keras>=3.2.0->tensorflow-intel==2.17.0->tensorflow) (2.18.0)\n", - "Collecting markdown-it-py>=2.2.0\n", - " Downloading markdown_it_py-3.0.0-py3-none-any.whl (87 kB)\n", - " ---------------------------------------- 87.5/87.5 KB ? eta 0:00:00\n", - "Requirement already satisfied: zipp>=3.20 in c:\\users\\danie\\appdata\\roaming\\python\\python39\\site-packages (from importlib-metadata>=4.4->markdown>=2.6.8->tensorboard<2.18,>=2.17->tensorflow-intel==2.17.0->tensorflow) (3.20.2)\n", - "Collecting mdurl~=0.1\n", - " Downloading mdurl-0.1.2-py3-none-any.whl (10.0 kB)\n", - "Installing collected packages: namex, libclang, flatbuffers, wrapt, wheel, urllib3, termcolor, tensorflow-io-gcs-filesystem, tensorboard-data-server, protobuf, optree, numpy, mdurl, MarkupSafe, idna, grpcio, google-pasta, gast, charset-normalizer, certifi, absl-py, werkzeug, requests, opt-einsum, ml-dtypes, markdown-it-py, markdown, h5py, astunparse, tensorboard, rich, keras, tensorflow-intel, tensorflow\n", - " Attempting uninstall: numpy\n", - " Found existing installation: numpy 2.0.2\n", - " Uninstalling numpy-2.0.2:\n", - " Successfully uninstalled numpy-2.0.2\n", - "Successfully installed MarkupSafe-2.1.5 absl-py-2.1.0 astunparse-1.6.3 certifi-2024.8.30 charset-normalizer-3.3.2 flatbuffers-24.3.25 gast-0.6.0 google-pasta-0.2.0 grpcio-1.66.1 h5py-3.11.0 idna-3.10 keras-3.5.0 libclang-18.1.1 markdown-3.7 markdown-it-py-3.0.0 mdurl-0.1.2 ml-dtypes-0.4.1 namex-0.0.8 numpy-1.26.4 opt-einsum-3.3.0 optree-0.12.1 
protobuf-4.25.4 requests-2.32.3 rich-13.8.1 tensorboard-2.17.1 tensorboard-data-server-0.7.2 tensorflow-2.17.0 tensorflow-intel-2.17.0 tensorflow-io-gcs-filesystem-0.31.0 termcolor-2.4.0 urllib3-2.2.3 werkzeug-3.0.4 wheel-0.44.0 wrapt-1.16.0\n", - "Note: you may need to restart the kernel to use updated packages.\n" - ] - }, - { - "name": "stderr", - "output_type": "stream", - "text": [ - " WARNING: The script wheel.exe is installed in 'c:\\Users\\danie\\AppData\\Local\\Programs\\Python\\Python39\\Scripts' which is not on PATH.\n", - " Consider adding this directory to PATH or, if you prefer to suppress this warning, use --no-warn-script-location.\n", - " WARNING: The script f2py.exe is installed in 'c:\\Users\\danie\\AppData\\Local\\Programs\\Python\\Python39\\Scripts' which is not on PATH.\n", - " Consider adding this directory to PATH or, if you prefer to suppress this warning, use --no-warn-script-location.\n", - " WARNING: The script normalizer.exe is installed in 'c:\\Users\\danie\\AppData\\Local\\Programs\\Python\\Python39\\Scripts' which is not on PATH.\n", - " Consider adding this directory to PATH or, if you prefer to suppress this warning, use --no-warn-script-location.\n", - " WARNING: The script markdown-it.exe is installed in 'c:\\Users\\danie\\AppData\\Local\\Programs\\Python\\Python39\\Scripts' which is not on PATH.\n", - " Consider adding this directory to PATH or, if you prefer to suppress this warning, use --no-warn-script-location.\n", - " WARNING: The script markdown_py.exe is installed in 'c:\\Users\\danie\\AppData\\Local\\Programs\\Python\\Python39\\Scripts' which is not on PATH.\n", - " Consider adding this directory to PATH or, if you prefer to suppress this warning, use --no-warn-script-location.\n", - " WARNING: The script tensorboard.exe is installed in 'c:\\Users\\danie\\AppData\\Local\\Programs\\Python\\Python39\\Scripts' which is not on PATH.\n", - " Consider adding this directory to PATH or, if you prefer to suppress this warning, 
use --no-warn-script-location.\n", - " WARNING: The scripts import_pb_to_tensorboard.exe, saved_model_cli.exe, tensorboard.exe, tf_upgrade_v2.exe, tflite_convert.exe, toco.exe and toco_from_protos.exe are installed in 'c:\\Users\\danie\\AppData\\Local\\Programs\\Python\\Python39\\Scripts' which is not on PATH.\n", - " Consider adding this directory to PATH or, if you prefer to suppress this warning, use --no-warn-script-location.\n", - "WARNING: You are using pip version 22.0.4; however, version 24.2 is available.\n", - "You should consider upgrading via the 'c:\\Users\\danie\\AppData\\Local\\Programs\\Python\\Python39\\python.exe -m pip install --upgrade pip' command.\n" - ] - } - ], - "source": [ - "pip install tensorflow" - ] - }, - { - "cell_type": "code", - "execution_count": 7, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "Collecting pgmpy\n", - " Downloading pgmpy-0.1.26-py3-none-any.whl (2.0 MB)\n", - " ---------------------------------------- 2.0/2.0 MB 18.1 MB/s eta 0:00:00\n", - "Requirement already satisfied: joblib in c:\\users\\danie\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pgmpy) (1.4.2)\n", - "Requirement already satisfied: pandas in c:\\users\\danie\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pgmpy) (2.2.2)\n", - "Collecting pyparsing\n", - " Downloading pyparsing-3.1.4-py3-none-any.whl (104 kB)\n", - " ---------------------------------------- 104.1/104.1 KB ? 
eta 0:00:00\n", - "Collecting torch\n", - " Downloading torch-2.4.1-cp39-cp39-win_amd64.whl (199.3 MB)\n", - " ------------------------------------- 199.3/199.3 MB 19.3 MB/s eta 0:00:00\n", - "Collecting statsmodels\n", - " Downloading statsmodels-0.14.2-cp39-cp39-win_amd64.whl (9.9 MB)\n", - " ---------------------------------------- 9.9/9.9 MB 41.9 MB/s eta 0:00:00\n", - "Collecting xgboost\n", - " Downloading xgboost-2.1.1-py3-none-win_amd64.whl (124.9 MB)\n", - " ------------------------------------- 124.9/124.9 MB 21.8 MB/s eta 0:00:00\n", - "Collecting google-generativeai\n", - " Downloading google_generativeai-0.8.1-py3-none-any.whl (165 kB)\n", - " ---------------------------------------- 165.0/165.0 KB ? eta 0:00:00\n", - "Requirement already satisfied: numpy in c:\\users\\danie\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pgmpy) (1.26.4)\n", - "Collecting networkx\n", - " Downloading networkx-3.2.1-py3-none-any.whl (1.6 MB)\n", - " ---------------------------------------- 1.6/1.6 MB 34.8 MB/s eta 0:00:00\n", - "Requirement already satisfied: scikit-learn in c:\\users\\danie\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pgmpy) (0.24.0)\n", - "Requirement already satisfied: scipy in c:\\users\\danie\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pgmpy) (1.13.1)\n", - "Collecting tqdm\n", - " Downloading tqdm-4.66.5-py3-none-any.whl (78 kB)\n", - " ---------------------------------------- 78.4/78.4 KB ? 
eta 0:00:00\n", - "Requirement already satisfied: opt-einsum in c:\\users\\danie\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pgmpy) (3.3.0)\n", - "Requirement already satisfied: protobuf in c:\\users\\danie\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from google-generativeai->pgmpy) (4.25.4)\n", - "Collecting google-auth>=2.15.0\n", - " Downloading google_auth-2.34.0-py2.py3-none-any.whl (200 kB)\n", - " ------------------------------------- 200.9/200.9 KB 11.9 MB/s eta 0:00:00\n", - "Collecting pydantic\n", - " Downloading pydantic-2.9.1-py3-none-any.whl (434 kB)\n", - " ------------------------------------- 434.4/434.4 KB 13.7 MB/s eta 0:00:00\n", - "Collecting google-api-python-client\n", - " Downloading google_api_python_client-2.145.0-py2.py3-none-any.whl (12.2 MB)\n", - " --------------------------------------- 12.2/12.2 MB 40.9 MB/s eta 0:00:00\n", - "Collecting google-ai-generativelanguage==0.6.9\n", - " Downloading google_ai_generativelanguage-0.6.9-py3-none-any.whl (725 kB)\n", - " ------------------------------------- 725.4/725.4 KB 47.7 MB/s eta 0:00:00\n", - "Collecting google-api-core\n", - " Downloading google_api_core-2.19.2-py3-none-any.whl (139 kB)\n", - " -------------------------------------- 139.4/139.4 KB 8.1 MB/s eta 0:00:00\n", - "Requirement already satisfied: typing-extensions in c:\\users\\danie\\appdata\\roaming\\python\\python39\\site-packages (from google-generativeai->pgmpy) (4.12.2)\n", - "Collecting proto-plus<2.0.0dev,>=1.22.3\n", - " Downloading proto_plus-1.24.0-py3-none-any.whl (50 kB)\n", - " ---------------------------------------- 50.1/50.1 KB ? 
eta 0:00:00\n", - "Requirement already satisfied: tzdata>=2022.7 in c:\\users\\danie\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pandas->pgmpy) (2024.1)\n", - "Requirement already satisfied: python-dateutil>=2.8.2 in c:\\users\\danie\\appdata\\roaming\\python\\python39\\site-packages (from pandas->pgmpy) (2.9.0.post0)\n", - "Requirement already satisfied: pytz>=2020.1 in c:\\users\\danie\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pandas->pgmpy) (2024.2)\n", - "Requirement already satisfied: threadpoolctl>=2.0.0 in c:\\users\\danie\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from scikit-learn->pgmpy) (3.5.0)\n", - "Collecting patsy>=0.5.6\n", - " Downloading patsy-0.5.6-py2.py3-none-any.whl (233 kB)\n", - " ---------------------------------------- 233.9/233.9 KB ? eta 0:00:00\n", - "Requirement already satisfied: packaging>=21.3 in c:\\users\\danie\\appdata\\roaming\\python\\python39\\site-packages (from statsmodels->pgmpy) (24.1)\n", - "Collecting filelock\n", - " Downloading filelock-3.16.0-py3-none-any.whl (16 kB)\n", - "Collecting fsspec\n", - " Downloading fsspec-2024.9.0-py3-none-any.whl (179 kB)\n", - " ------------------------------------- 179.3/179.3 KB 11.3 MB/s eta 0:00:00\n", - "Collecting sympy\n", - " Downloading sympy-1.13.2-py3-none-any.whl (6.2 MB)\n", - " ---------------------------------------- 6.2/6.2 MB 39.6 MB/s eta 0:00:00\n", - "Collecting jinja2\n", - " Downloading jinja2-3.1.4-py3-none-any.whl (133 kB)\n", - " ---------------------------------------- 133.3/133.3 KB ? 
eta 0:00:00\n", - "Requirement already satisfied: colorama in c:\\users\\danie\\appdata\\roaming\\python\\python39\\site-packages (from tqdm->pgmpy) (0.4.6)\n", - "Collecting rsa<5,>=3.1.4\n", - " Downloading rsa-4.9-py3-none-any.whl (34 kB)\n", - "Collecting pyasn1-modules>=0.2.1\n", - " Downloading pyasn1_modules-0.4.1-py3-none-any.whl (181 kB)\n", - " ---------------------------------------- 181.5/181.5 KB ? eta 0:00:00\n", - "Collecting cachetools<6.0,>=2.0.0\n", - " Downloading cachetools-5.5.0-py3-none-any.whl (9.5 kB)\n", - "Requirement already satisfied: six in c:\\users\\danie\\appdata\\roaming\\python\\python39\\site-packages (from patsy>=0.5.6->statsmodels->pgmpy) (1.16.0)\n", - "Requirement already satisfied: requests<3.0.0.dev0,>=2.18.0 in c:\\users\\danie\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from google-api-core->google-generativeai->pgmpy) (2.32.3)\n", - "Collecting googleapis-common-protos<2.0.dev0,>=1.56.2\n", - " Downloading googleapis_common_protos-1.65.0-py2.py3-none-any.whl (220 kB)\n", - " ---------------------------------------- 220.9/220.9 KB ? eta 0:00:00\n", - "Collecting uritemplate<5,>=3.0.1\n", - " Downloading uritemplate-4.1.1-py2.py3-none-any.whl (10 kB)\n", - "Collecting httplib2<1.dev0,>=0.19.0\n", - " Downloading httplib2-0.22.0-py3-none-any.whl (96 kB)\n", - " ---------------------------------------- 96.9/96.9 KB ? 
eta 0:00:00\n", - "Collecting google-auth-httplib2<1.0.0,>=0.2.0\n", - " Downloading google_auth_httplib2-0.2.0-py2.py3-none-any.whl (9.3 kB)\n", - "Requirement already satisfied: MarkupSafe>=2.0 in c:\\users\\danie\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from jinja2->torch->pgmpy) (2.1.5)\n", - "Collecting pydantic-core==2.23.3\n", - " Downloading pydantic_core-2.23.3-cp39-none-win_amd64.whl (1.9 MB)\n", - " ---------------------------------------- 1.9/1.9 MB 40.7 MB/s eta 0:00:00\n", - "Collecting annotated-types>=0.6.0\n", - " Downloading annotated_types-0.7.0-py3-none-any.whl (13 kB)\n", - "Collecting mpmath<1.4,>=1.1.0\n", - " Downloading mpmath-1.3.0-py3-none-any.whl (536 kB)\n", - " ------------------------------------- 536.2/536.2 KB 32.9 MB/s eta 0:00:00\n", - "Collecting grpcio-status<2.0.dev0,>=1.33.2\n", - " Downloading grpcio_status-1.66.1-py3-none-any.whl (14 kB)\n", - "Requirement already satisfied: grpcio<2.0dev,>=1.33.2 in c:\\users\\danie\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from google-api-core->google-generativeai->pgmpy) (1.66.1)\n", - "Collecting pyasn1<0.7.0,>=0.4.6\n", - " Downloading pyasn1-0.6.1-py3-none-any.whl (83 kB)\n", - " ---------------------------------------- 83.1/83.1 KB ? 
eta 0:00:00\n", - "Requirement already satisfied: urllib3<3,>=1.21.1 in c:\\users\\danie\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from requests<3.0.0.dev0,>=2.18.0->google-api-core->google-generativeai->pgmpy) (2.2.3)\n", - "Requirement already satisfied: idna<4,>=2.5 in c:\\users\\danie\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from requests<3.0.0.dev0,>=2.18.0->google-api-core->google-generativeai->pgmpy) (3.10)\n", - "Requirement already satisfied: certifi>=2017.4.17 in c:\\users\\danie\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from requests<3.0.0.dev0,>=2.18.0->google-api-core->google-generativeai->pgmpy) (2024.8.30)\n", - "Requirement already satisfied: charset-normalizer<4,>=2 in c:\\users\\danie\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from requests<3.0.0.dev0,>=2.18.0->google-api-core->google-generativeai->pgmpy) (3.3.2)\n", - "Collecting protobuf\n", - " Downloading protobuf-5.28.1-cp39-cp39-win_amd64.whl (431 kB)\n", - " ------------------------------------- 431.6/431.6 KB 26.3 MB/s eta 0:00:00\n", - "Installing collected packages: mpmath, uritemplate, tqdm, sympy, pyparsing, pydantic-core, pyasn1, protobuf, patsy, networkx, jinja2, fsspec, filelock, cachetools, annotated-types, xgboost, torch, rsa, pydantic, pyasn1-modules, proto-plus, httplib2, googleapis-common-protos, statsmodels, grpcio-status, google-auth, google-auth-httplib2, google-api-core, google-api-python-client, google-ai-generativelanguage, google-generativeai, pgmpy\n", - " Attempting uninstall: protobuf\n", - " Found existing installation: protobuf 4.25.4\n", - " Uninstalling protobuf-4.25.4:\n", - " Successfully uninstalled protobuf-4.25.4\n", - "Successfully installed annotated-types-0.7.0 cachetools-5.5.0 filelock-3.16.0 fsspec-2024.9.0 google-ai-generativelanguage-0.6.9 google-api-core-2.19.2 google-api-python-client-2.145.0 google-auth-2.34.0 google-auth-httplib2-0.2.0 
google-generativeai-0.8.1 googleapis-common-protos-1.65.0 grpcio-status-1.66.1 httplib2-0.22.0 jinja2-3.1.4 mpmath-1.3.0 networkx-3.2.1 patsy-0.5.6 pgmpy-0.1.26 proto-plus-1.24.0 protobuf-5.28.1 pyasn1-0.6.1 pyasn1-modules-0.4.1 pydantic-2.9.1 pydantic-core-2.23.3 pyparsing-3.1.4 rsa-4.9 statsmodels-0.14.2 sympy-1.13.2 torch-2.4.1 tqdm-4.66.5 uritemplate-4.1.1 xgboost-2.1.1\n", - "Note: you may need to restart the kernel to use updated packages.\n" - ] - }, - { - "name": "stderr", - "output_type": "stream", - "text": [ - " WARNING: The script tqdm.exe is installed in 'c:\\Users\\danie\\AppData\\Local\\Programs\\Python\\Python39\\Scripts' which is not on PATH.\n", - " Consider adding this directory to PATH or, if you prefer to suppress this warning, use --no-warn-script-location.\n", - " WARNING: The script isympy.exe is installed in 'c:\\Users\\danie\\AppData\\Local\\Programs\\Python\\Python39\\Scripts' which is not on PATH.\n", - " Consider adding this directory to PATH or, if you prefer to suppress this warning, use --no-warn-script-location.\n", - " WARNING: The scripts convert-caffe2-to-onnx.exe, convert-onnx-to-caffe2.exe and torchrun.exe are installed in 'c:\\Users\\danie\\AppData\\Local\\Programs\\Python\\Python39\\Scripts' which is not on PATH.\n", - " Consider adding this directory to PATH or, if you prefer to suppress this warning, use --no-warn-script-location.\n", - " WARNING: The scripts pyrsa-decrypt.exe, pyrsa-encrypt.exe, pyrsa-keygen.exe, pyrsa-priv2pub.exe, pyrsa-sign.exe and pyrsa-verify.exe are installed in 'c:\\Users\\danie\\AppData\\Local\\Programs\\Python\\Python39\\Scripts' which is not on PATH.\n", - " Consider adding this directory to PATH or, if you prefer to suppress this warning, use --no-warn-script-location.\n", - "ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. 
This behaviour is the source of the following dependency conflicts.\n", - "tensorflow-intel 2.17.0 requires protobuf!=4.21.0,!=4.21.1,!=4.21.2,!=4.21.3,!=4.21.4,!=4.21.5,<5.0.0dev,>=3.20.3, but you have protobuf 5.28.1 which is incompatible.\n", - "WARNING: You are using pip version 22.0.4; however, version 24.2 is available.\n", - "You should consider upgrading via the 'c:\\Users\\danie\\AppData\\Local\\Programs\\Python\\Python39\\python.exe -m pip install --upgrade pip' command.\n" - ] - } - ], - "source": [ - "pip install pgmpy" - ] - }, - { - "cell_type": "code", - "execution_count": 9, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "Collecting sdv\n", - " Downloading sdv-1.16.1-py3-none-any.whl (148 kB)\n", - " -------------------------------------- 148.8/148.8 KB 1.8 MB/s eta 0:00:00\n", - "Collecting ctgan>=0.10.0\n", - " Downloading ctgan-0.10.1-py3-none-any.whl (24 kB)\n", - "Collecting copulas>=0.11.0\n", - " Downloading copulas-0.11.1-py3-none-any.whl (51 kB)\n", - " ---------------------------------------- 51.6/51.6 KB 2.6 MB/s eta 0:00:00\n", - "Collecting boto3<2.0.0,>=1.28\n", - " Downloading boto3-1.35.19-py3-none-any.whl (139 kB)\n", - " -------------------------------------- 139.2/139.2 KB 2.0 MB/s eta 0:00:00\n", - "Requirement already satisfied: platformdirs>=4.0 in c:\\users\\danie\\appdata\\roaming\\python\\python39\\site-packages (from sdv) (4.3.3)\n", - "Collecting deepecho>=0.6.0\n", - " Downloading deepecho-0.6.0-py3-none-any.whl (27 kB)\n", - "Collecting sdmetrics>=0.14.0\n", - " Downloading sdmetrics-0.15.1-py3-none-any.whl (170 kB)\n", - " -------------------------------------- 170.7/170.7 KB 3.5 MB/s eta 0:00:00\n", - "Requirement already satisfied: pandas>=1.4.0 in c:\\users\\danie\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from sdv) (2.2.2)\n", - "Collecting botocore<2.0.0,>=1.31\n", - " Downloading botocore-1.35.19-py3-none-any.whl (12.5 MB)\n", - " 
--------------------------------------- 12.5/12.5 MB 23.4 MB/s eta 0:00:00\n", - "Collecting rdt>=1.12.3\n", - " Downloading rdt-1.12.4-py3-none-any.whl (65 kB)\n", - " ---------------------------------------- 65.3/65.3 KB 3.7 MB/s eta 0:00:00\n", - "Collecting graphviz>=0.13.2\n", - " Downloading graphviz-0.20.3-py3-none-any.whl (47 kB)\n", - " ---------------------------------------- 47.1/47.1 KB 2.3 MB/s eta 0:00:00\n", - "Collecting cloudpickle>=2.1.0\n", - " Downloading cloudpickle-3.0.0-py3-none-any.whl (20 kB)\n", - "Requirement already satisfied: numpy<2.0.0,>=1.21.0 in c:\\users\\danie\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from sdv) (1.26.4)\n", - "Requirement already satisfied: tqdm>=4.29 in c:\\users\\danie\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from sdv) (4.66.5)\n", - "Collecting pyyaml>=6.0.1\n", - " Downloading PyYAML-6.0.2-cp39-cp39-win_amd64.whl (162 kB)\n", - " ---------------------------------------- 162.3/162.3 KB ? eta 0:00:00\n", - "Collecting s3transfer<0.11.0,>=0.10.0\n", - " Downloading s3transfer-0.10.2-py3-none-any.whl (82 kB)\n", - " ---------------------------------------- 82.7/82.7 KB ? eta 0:00:00\n", - "Collecting jmespath<2.0.0,>=0.7.1\n", - " Downloading jmespath-1.0.1-py3-none-any.whl (20 kB)\n", - "Requirement already satisfied: python-dateutil<3.0.0,>=2.1 in c:\\users\\danie\\appdata\\roaming\\python\\python39\\site-packages (from botocore<2.0.0,>=1.31->sdv) (2.9.0.post0)\n", - "Collecting urllib3<1.27,>=1.25.4\n", - " Downloading urllib3-1.26.20-py2.py3-none-any.whl (144 kB)\n", - " ---------------------------------------- 144.2/144.2 KB ? 
eta 0:00:00\n", - "Collecting plotly>=5.10.0\n", - " Downloading plotly-5.24.1-py3-none-any.whl (19.1 MB)\n", - " --------------------------------------- 19.1/19.1 MB 34.6 MB/s eta 0:00:00\n", - "Requirement already satisfied: scipy>=1.7.3 in c:\\users\\danie\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from copulas>=0.11.0->sdv) (1.13.1)\n", - "Requirement already satisfied: torch>=1.9.0 in c:\\users\\danie\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from ctgan>=0.10.0->sdv) (2.4.1)\n", - "Requirement already satisfied: tzdata>=2022.7 in c:\\users\\danie\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pandas>=1.4.0->sdv) (2024.1)\n", - "Requirement already satisfied: pytz>=2020.1 in c:\\users\\danie\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pandas>=1.4.0->sdv) (2024.2)\n", - "Collecting scikit-learn>=1.0.2\n", - " Downloading scikit_learn-1.5.2-cp39-cp39-win_amd64.whl (11.0 MB)\n", - " --------------------------------------- 11.0/11.0 MB 40.9 MB/s eta 0:00:00\n", - "Collecting Faker>=17\n", - " Downloading Faker-28.4.1-py3-none-any.whl (1.8 MB)\n", - " ---------------------------------------- 1.8/1.8 MB 38.8 MB/s eta 0:00:00\n", - "Requirement already satisfied: colorama in c:\\users\\danie\\appdata\\roaming\\python\\python39\\site-packages (from tqdm>=4.29->sdv) (0.4.6)\n", - "Collecting tenacity>=6.2.0\n", - " Downloading tenacity-9.0.0-py3-none-any.whl (28 kB)\n", - "Requirement already satisfied: packaging in c:\\users\\danie\\appdata\\roaming\\python\\python39\\site-packages (from plotly>=5.10.0->copulas>=0.11.0->sdv) (24.1)\n", - "Requirement already satisfied: six>=1.5 in c:\\users\\danie\\appdata\\roaming\\python\\python39\\site-packages (from python-dateutil<3.0.0,>=2.1->botocore<2.0.0,>=1.31->sdv) (1.16.0)\n", - "Requirement already satisfied: threadpoolctl>=3.1.0 in c:\\users\\danie\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from 
scikit-learn>=1.0.2->rdt>=1.12.3->sdv) (3.5.0)\n", - "Requirement already satisfied: joblib>=1.2.0 in c:\\users\\danie\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from scikit-learn>=1.0.2->rdt>=1.12.3->sdv) (1.4.2)\n", - "Requirement already satisfied: sympy in c:\\users\\danie\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from torch>=1.9.0->ctgan>=0.10.0->sdv) (1.13.2)\n", - "Requirement already satisfied: typing-extensions>=4.8.0 in c:\\users\\danie\\appdata\\roaming\\python\\python39\\site-packages (from torch>=1.9.0->ctgan>=0.10.0->sdv) (4.12.2)\n", - "Requirement already satisfied: networkx in c:\\users\\danie\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from torch>=1.9.0->ctgan>=0.10.0->sdv) (3.2.1)\n", - "Requirement already satisfied: jinja2 in c:\\users\\danie\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from torch>=1.9.0->ctgan>=0.10.0->sdv) (3.1.4)\n", - "Requirement already satisfied: fsspec in c:\\users\\danie\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from torch>=1.9.0->ctgan>=0.10.0->sdv) (2024.9.0)\n", - "Requirement already satisfied: filelock in c:\\users\\danie\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from torch>=1.9.0->ctgan>=0.10.0->sdv) (3.16.0)\n", - "Requirement already satisfied: MarkupSafe>=2.0 in c:\\users\\danie\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from jinja2->torch>=1.9.0->ctgan>=0.10.0->sdv) (2.1.5)\n", - "Requirement already satisfied: mpmath<1.4,>=1.1.0 in c:\\users\\danie\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from sympy->torch>=1.9.0->ctgan>=0.10.0->sdv) (1.3.0)\n", - "Installing collected packages: urllib3, tenacity, pyyaml, jmespath, graphviz, cloudpickle, scikit-learn, plotly, Faker, botocore, s3transfer, rdt, deepecho, copulas, sdmetrics, ctgan, boto3, sdv\n", - " Attempting uninstall: urllib3\n", - " Found existing installation: urllib3 2.2.3\n", - 
" Uninstalling urllib3-2.2.3:\n", - " Successfully uninstalled urllib3-2.2.3\n", - " Attempting uninstall: scikit-learn\n", - " Found existing installation: scikit-learn 0.24.0\n", - " Uninstalling scikit-learn-0.24.0:\n", - " Successfully uninstalled scikit-learn-0.24.0\n", - "Successfully installed Faker-28.4.1 boto3-1.35.19 botocore-1.35.19 cloudpickle-3.0.0 copulas-0.11.1 ctgan-0.10.1 deepecho-0.6.0 graphviz-0.20.3 jmespath-1.0.1 plotly-5.24.1 pyyaml-6.0.2 rdt-1.12.4 s3transfer-0.10.2 scikit-learn-1.5.2 sdmetrics-0.15.1 sdv-1.16.1 tenacity-9.0.0 urllib3-1.26.20\n", - "Note: you may need to restart the kernel to use updated packages.\n" - ] - }, - { - "name": "stderr", - "output_type": "stream", - "text": [ - " WARNING: The script faker.exe is installed in 'c:\\Users\\danie\\AppData\\Local\\Programs\\Python\\Python39\\Scripts' which is not on PATH.\n", - " Consider adding this directory to PATH or, if you prefer to suppress this warning, use --no-warn-script-location.\n", - "ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. 
This behaviour is the source of the following dependency conflicts.\n", - "pyitlib 0.2.3 requires scikit-learn<=0.24,>=0.16.0, but you have scikit-learn 1.5.2 which is incompatible.\n", - "WARNING: You are using pip version 22.0.4; however, version 24.2 is available.\n", - "You should consider upgrading via the 'c:\\Users\\danie\\AppData\\Local\\Programs\\Python\\Python39\\python.exe -m pip install --upgrade pip' command.\n" - ] - } - ], - "source": [ - "pip install sdv" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "#### Importing Katabatic" - ] - }, - { - "cell_type": "code", - "execution_count": 10, - "metadata": {}, - "outputs": [ - { - "ename": "ModuleNotFoundError", - "evalue": "No module named 'katabatic'", - "output_type": "error", - "traceback": [ - "\u001b[1;31m---------------------------------------------------------------------------\u001b[0m", - "\u001b[1;31mModuleNotFoundError\u001b[0m Traceback (most recent call last)", - "Cell \u001b[1;32mIn[10], line 1\u001b[0m\n\u001b[1;32m----> 1\u001b[0m \u001b[38;5;28;01mfrom\u001b[39;00m \u001b[38;5;21;01mkatabatic\u001b[39;00m\u001b[38;5;21;01m.\u001b[39;00m\u001b[38;5;21;01mkatabatic\u001b[39;00m \u001b[38;5;28;01mimport\u001b[39;00m Katabatic\n\u001b[0;32m 2\u001b[0m \u001b[38;5;28;01mimport\u001b[39;00m \u001b[38;5;21;01mnumpy\u001b[39;00m \u001b[38;5;28;01mas\u001b[39;00m \u001b[38;5;21;01mnp\u001b[39;00m\n", - "\u001b[1;31mModuleNotFoundError\u001b[0m: No module named 'katabatic'" - ] - } - ], - "source": [ - "from katabatic.katabatic import Katabatic\n", - "import numpy as np\n" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Sample Training Data" - ] - }, - { - "cell_type": "code", - "execution_count": 13, - "metadata": {}, - "outputs": [ - { - "ename": "ModuleNotFoundError", - "evalue": "No module named 'katabatic'", - "output_type": "error", - "traceback": [ - 
"\u001b[1;31m---------------------------------------------------------------------------\u001b[0m", - "\u001b[1;31mModuleNotFoundError\u001b[0m Traceback (most recent call last)", - "Cell \u001b[1;32mIn[13], line 1\u001b[0m\n\u001b[1;32m----> 1\u001b[0m \u001b[38;5;28;01mfrom\u001b[39;00m \u001b[38;5;21;01mkatabatic\u001b[39;00m\u001b[38;5;21;01m.\u001b[39;00m\u001b[38;5;21;01mmodels\u001b[39;00m\u001b[38;5;21;01m.\u001b[39;00m\u001b[38;5;21;01mganblrpp_DGEK\u001b[39;00m\u001b[38;5;21;01m.\u001b[39;00m\u001b[38;5;21;01mutils\u001b[39;00m \u001b[38;5;28;01mimport\u001b[39;00m get_demo_data\n\u001b[0;32m 2\u001b[0m df \u001b[38;5;241m=\u001b[39m get_demo_data(\u001b[38;5;124m'\u001b[39m\u001b[38;5;124madult-raw\u001b[39m\u001b[38;5;124m'\u001b[39m)\n\u001b[0;32m 3\u001b[0m df\u001b[38;5;241m.\u001b[39mhead()\n", - "\u001b[1;31mModuleNotFoundError\u001b[0m: No module named 'katabatic'" - ] - } - ], - "source": [ - "from katabatic.models.ganblrpp_DGEK.utils import get_demo_data\n", - "df = get_demo_data('adult-raw')\n", - "df.head()\n", - "\n", - "from sklearn.model_selection import train_test_split\n", - "x, y = df.values[:,:-1], df.values[:,-1]\n", - "X_train, X_test, y_train, y_test = train_test_split(x, y, test_size=0.5)\n", - "\n", - "import numpy as np\n", - "def is_numerical(dtype):\n", - " '''\n", - " if the type is one of ['signed-integer', 'unsigned-integer', 'floating point'], we reconginze it as a numerical one.\n", - " \n", - " Reference: https://numpy.org/doc/stable/reference/generated/numpy.dtype.kind.html#numpy.dtype.kind\n", - " '''\n", - " return dtype.kind in 'iuf'\n", - "\n", - "column_is_numerical = df.dtypes.apply(is_numerical).values\n", - "numerical = np.argwhere(column_is_numerical).ravel()\n", - "numerical\n" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Load Model and Train it on Sample Training Data" - ] - }, - { - "cell_type": "code", - "execution_count": 6, - "metadata": {}, - "outputs": [ - { - "name": 
"stdout", - "output_type": "stream", - "text": [ - "[INFO] Initializing GANBLR++ Model\n", - "[INFO] Training GANBLR++ model\n" - ] - }, - { - "name": "stderr", - "output_type": "stream", - "text": [ - "c:\\Users\\Asus\\AppData\\Local\\Programs\\Python\\Python39\\lib\\site-packages\\sklearn\\mixture\\_base.py:270: ConvergenceWarning: Best performing initialization did not converge. Try different init parameters, or increase max_iter, tol, or check for degenerate data.\n", - " warnings.warn(\n", - "c:\\Users\\Asus\\AppData\\Local\\Programs\\Python\\Python39\\lib\\site-packages\\sklearn\\mixture\\_base.py:270: ConvergenceWarning: Best performing initialization did not converge. Try different init parameters, or increase max_iter, tol, or check for degenerate data.\n", - " warnings.warn(\n", - "c:\\Users\\Asus\\AppData\\Local\\Programs\\Python\\Python39\\lib\\site-packages\\keras\\src\\layers\\core\\dense.py:87: UserWarning: Do not pass an `input_shape`/`input_dim` argument to a layer. When using Sequential models, prefer using an `Input(shape)` object as the first layer in the model instead.\n", - " super().__init__(activity_regularizer=activity_regularizer, **kwargs)\n", - "c:\\Users\\Asus\\AppData\\Local\\Programs\\Python\\Python39\\lib\\site-packages\\keras\\src\\layers\\core\\dense.py:87: UserWarning: Do not pass an `input_shape`/`input_dim` argument to a layer. When using Sequential models, prefer using an `Input(shape)` object as the first layer in the model instead.\n", - " super().__init__(activity_regularizer=activity_regularizer, **kwargs)\n", - "c:\\Users\\Asus\\AppData\\Local\\Programs\\Python\\Python39\\lib\\site-packages\\keras\\src\\layers\\core\\dense.py:87: UserWarning: Do not pass an `input_shape`/`input_dim` argument to a layer. 
When using Sequential models, prefer using an `Input(shape)` object as the first layer in the model instead.\n", - " super().__init__(activity_regularizer=activity_regularizer, **kwargs)\n", - "c:\\Users\\Asus\\AppData\\Local\\Programs\\Python\\Python39\\lib\\site-packages\\keras\\src\\layers\\core\\dense.py:87: UserWarning: Do not pass an `input_shape`/`input_dim` argument to a layer. When using Sequential models, prefer using an `Input(shape)` object as the first layer in the model instead.\n", - " super().__init__(activity_regularizer=activity_regularizer, **kwargs)\n", - "c:\\Users\\Asus\\AppData\\Local\\Programs\\Python\\Python39\\lib\\site-packages\\keras\\src\\layers\\core\\dense.py:87: UserWarning: Do not pass an `input_shape`/`input_dim` argument to a layer. When using Sequential models, prefer using an `Input(shape)` object as the first layer in the model instead.\n", - " super().__init__(activity_regularizer=activity_regularizer, **kwargs)\n", - "c:\\Users\\Asus\\AppData\\Local\\Programs\\Python\\Python39\\lib\\site-packages\\keras\\src\\layers\\core\\dense.py:87: UserWarning: Do not pass an `input_shape`/`input_dim` argument to a layer. When using Sequential models, prefer using an `Input(shape)` object as the first layer in the model instead.\n", - " super().__init__(activity_regularizer=activity_regularizer, **kwargs)\n", - "c:\\Users\\Asus\\AppData\\Local\\Programs\\Python\\Python39\\lib\\site-packages\\keras\\src\\layers\\core\\dense.py:87: UserWarning: Do not pass an `input_shape`/`input_dim` argument to a layer. When using Sequential models, prefer using an `Input(shape)` object as the first layer in the model instead.\n", - " super().__init__(activity_regularizer=activity_regularizer, **kwargs)\n", - "c:\\Users\\Asus\\AppData\\Local\\Programs\\Python\\Python39\\lib\\site-packages\\keras\\src\\layers\\core\\dense.py:87: UserWarning: Do not pass an `input_shape`/`input_dim` argument to a layer. 
When using Sequential models, prefer using an `Input(shape)` object as the first layer in the model instead.\n", - " super().__init__(activity_regularizer=activity_regularizer, **kwargs)\n", - "c:\\Users\\Asus\\AppData\\Local\\Programs\\Python\\Python39\\lib\\site-packages\\keras\\src\\layers\\core\\dense.py:87: UserWarning: Do not pass an `input_shape`/`input_dim` argument to a layer. When using Sequential models, prefer using an `Input(shape)` object as the first layer in the model instead.\n", - " super().__init__(activity_regularizer=activity_regularizer, **kwargs)\n", - "c:\\Users\\Asus\\AppData\\Local\\Programs\\Python\\Python39\\lib\\site-packages\\keras\\src\\layers\\core\\dense.py:87: UserWarning: Do not pass an `input_shape`/`input_dim` argument to a layer. When using Sequential models, prefer using an `Input(shape)` object as the first layer in the model instead.\n", - " super().__init__(activity_regularizer=activity_regularizer, **kwargs)\n", - "c:\\Users\\Asus\\Documents\\GitHub\\Katabatic\\katabatic\\models\\ganblrpp\\ganblr.py:76: RuntimeWarning: divide by zero encountered in log\n", - " ls = np.mean(-np.log(np.subtract(1, prob_fake)))\n", - "c:\\Users\\Asus\\AppData\\Local\\Programs\\Python\\Python39\\lib\\site-packages\\keras\\src\\layers\\core\\dense.py:87: UserWarning: Do not pass an `input_shape`/`input_dim` argument to a layer. 
When using Sequential models, prefer using an `Input(shape)` object as the first layer in the model instead.\n", - " super().__init__(activity_regularizer=activity_regularizer, **kwargs)\n" - ] - }, - { - "name": "stdout", - "output_type": "stream", - "text": [ - "[SUCCESS] Model training completed\n" - ] - } - ], - "source": [ - "from katabatic.models.ganblrpp_DGEK.ganblrpp_adapter import GanblrppAdapter\n", - "adapter = GanblrppAdapter(numerical_columns=numerical)\n", - "adapter.load_model()\n", - "adapter.fit(X_train, y_train, epochs=10)" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Generate New Data from trained model" - ] - }, - { - "cell_type": "code", - "execution_count": 7, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "[INFO] Generating data using GANBLR++ model\n" - ] - }, - { - "name": "stderr", - "output_type": "stream", - "text": [ - "sampling: 0%| | 0/6 [00:00\n", - "\n", - "\n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - "
\n", - "" - ], - "text/plain": [ - " 0 1 2 3 4 \\\n", - "0 45.885403 Without-pay 233248.633406 HS-grad 2.32256 \n", - "1 18.452572 Self-emp-not-inc 666991.853154 7th-8th 9.993113 \n", - "2 47.058095 Federal-gov 106445.233982 Bachelors 13.789144 \n", - "3 58.60026 Self-emp-not-inc 291209.293581 Bachelors 6.997363 \n", - "4 19.788386 Self-emp-inc 324818.618985 1st-4th 7.011698 \n", - "\n", - " 5 6 7 8 \\\n", - "0 Separated ? Husband White \n", - "1 Widowed Exec-managerial Husband White \n", - "2 Married-civ-spouse Protective-serv Unmarried Amer-Indian-Eskimo \n", - "3 Widowed Craft-repair Other-relative Black \n", - "4 Widowed Adm-clerical Unmarried Other \n", - "\n", - " 9 10 11 12 13 14 \n", - "0 Male 39305.04659 1252.469616 43.084273 Guatemala <=50K \n", - "1 Male 653.651821 675.077681 23.136626 Italy <=50K \n", - "2 Female -3824.036178 1176.26665 41.029158 Laos <=50K \n", - "3 Female 7189.3822 875.043373 0.13387 China <=50K \n", - "4 Male -550.316674 465.036024 20.56991 Greece <=50K " - ] - }, - "execution_count": 7, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "adapter.generate(size=5)" - ] - }, - { - "cell_type": "code", - "execution_count": null, - "metadata": {}, - "outputs": [], - "source": [ - "# Future import statement ideas\n", - "\n", - "import katabatic as kb\n", - "from katabatic.models import meg\n", - "from katabatic.models import ganblr\n", - "from katabatic.evaluate import eval_method1\n", - "from katabatic.utils.preprocessing import data_processing_method1 # good place to store preprocessing utilities" - ] - } - ], - "metadata": { - "kernelspec": { - "display_name": "Python 3", - "language": "python", - "name": "python3" - }, - "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 3 - }, - "file_extension": ".py", - "mimetype": "text/x-python", - "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython3", - "version": "3.9.13" - } - }, - "nbformat": 4, - 
"nbformat_minor": 2 -} diff --git a/Use_Cases/tryKatabatic_Ganblr.ipynb b/Use_Cases/tryKatabatic_Ganblr.ipynb deleted file mode 100644 index aa8ba16..0000000 --- a/Use_Cases/tryKatabatic_Ganblr.ipynb +++ /dev/null @@ -1,476 +0,0 @@ -{ - "cells": [ - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "# Katabatic Demo Usage" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "#### Importing Dependenciencies" - ] - }, - { - "cell_type": "code", - "execution_count": 1, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "Requirement already satisfied: scikit-learn in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (0.24.0)\n", - "Collecting scikit-learn\n", - " Using cached scikit_learn-1.5.1-cp39-cp39-win_amd64.whl (11.0 MB)\n", - "Requirement already satisfied: scipy>=1.6.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from scikit-learn) (1.13.1)\n", - "Requirement already satisfied: joblib>=1.2.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from scikit-learn) (1.4.2)\n", - "Requirement already satisfied: numpy>=1.19.5 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from scikit-learn) (1.26.4)\n", - "Requirement already satisfied: threadpoolctl>=3.1.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from scikit-learn) (3.5.0)\n", - "Installing collected packages: scikit-learn\n", - " Attempting uninstall: scikit-learn\n", - " Found existing installation: scikit-learn 0.24.0\n", - " Uninstalling scikit-learn-0.24.0:\n", - " Successfully uninstalled scikit-learn-0.24.0\n", - "Successfully installed scikit-learn-1.5.1\n" - ] - }, - { - "name": "stderr", - "output_type": "stream", - "text": [ - "ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. 
This behaviour is the source of the following dependency conflicts.\n", - "pyitlib 0.2.3 requires scikit-learn<=0.24,>=0.16.0, but you have scikit-learn 1.5.1 which is incompatible.\n", - "WARNING: You are using pip version 22.0.4; however, version 24.2 is available.\n", - "You should consider upgrading via the 'C:\\Users\\Asus\\AppData\\Local\\Programs\\Python\\Python39\\python.exe -m pip install --upgrade pip' command.\n" - ] - } - ], - "source": [ - "!pip install -U scikit-learn" - ] - }, - { - "cell_type": "code", - "execution_count": 2, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "Requirement already satisfied: pyitlib in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (0.2.3)\n", - "Requirement already satisfied: pandas>=0.20.2 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pyitlib) (2.2.2)\n", - "Requirement already satisfied: numpy>=1.9.2 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pyitlib) (1.26.4)\n", - "Collecting scikit-learn<=0.24,>=0.16.0\n", - " Using cached scikit_learn-0.24.0-cp39-cp39-win_amd64.whl (6.9 MB)\n", - "Requirement already satisfied: scipy>=1.0.1 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pyitlib) (1.13.1)\n", - "Requirement already satisfied: future>=0.16.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pyitlib) (1.0.0)\n", - "Requirement already satisfied: tzdata>=2022.7 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pandas>=0.20.2->pyitlib) (2024.1)\n", - "Requirement already satisfied: python-dateutil>=2.8.2 in c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from pandas>=0.20.2->pyitlib) (2.9.0.post0)\n", - "Requirement already satisfied: pytz>=2020.1 in 
c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pandas>=0.20.2->pyitlib) (2024.1)\n", - "Requirement already satisfied: threadpoolctl>=2.0.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from scikit-learn<=0.24,>=0.16.0->pyitlib) (3.5.0)\n", - "Requirement already satisfied: joblib>=0.11 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from scikit-learn<=0.24,>=0.16.0->pyitlib) (1.4.2)\n", - "Requirement already satisfied: six>=1.5 in c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from python-dateutil>=2.8.2->pandas>=0.20.2->pyitlib) (1.16.0)\n", - "Installing collected packages: scikit-learn\n", - " Attempting uninstall: scikit-learn\n", - " Found existing installation: scikit-learn 1.5.1\n", - " Uninstalling scikit-learn-1.5.1:\n", - " Successfully uninstalled scikit-learn-1.5.1\n", - "Successfully installed scikit-learn-0.24.0\n" - ] - }, - { - "name": "stderr", - "output_type": "stream", - "text": [ - "ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. 
This behaviour is the source of the following dependency conflicts.\n", - "sdmetrics 0.15.0 requires scikit-learn>=1.0.2; python_version < \"3.10\", but you have scikit-learn 0.24.0 which is incompatible.\n", - "rdt 1.12.2 requires scikit-learn>=1.0.2; python_version < \"3.10\", but you have scikit-learn 0.24.0 which is incompatible.\n", - "WARNING: You are using pip version 22.0.4; however, version 24.2 is available.\n", - "You should consider upgrading via the 'C:\\Users\\Asus\\AppData\\Local\\Programs\\Python\\Python39\\python.exe -m pip install --upgrade pip' command.\n" - ] - } - ], - "source": [ - "!pip install pyitlib" - ] - }, - { - "cell_type": "code", - "execution_count": 3, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "Requirement already satisfied: tensorflow in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (2.17.0)\n", - "Requirement already satisfied: tensorflow-intel==2.17.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow) (2.17.0)\n", - "Requirement already satisfied: numpy<2.0.0,>=1.23.5 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (1.26.4)\n", - "Requirement already satisfied: typing-extensions>=3.6.6 in c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (4.12.2)\n", - "Requirement already satisfied: google-pasta>=0.1.1 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (0.2.0)\n", - "Requirement already satisfied: tensorflow-io-gcs-filesystem>=0.23.1 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (0.31.0)\n", - "Requirement already satisfied: ml-dtypes<0.5.0,>=0.3.1 in 
c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (0.4.0)\n", - "Requirement already satisfied: h5py>=3.10.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (3.11.0)\n", - "Requirement already satisfied: six>=1.12.0 in c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (1.16.0)\n", - "Requirement already satisfied: requests<3,>=2.21.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (2.32.3)\n", - "Requirement already satisfied: keras>=3.2.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (3.4.1)\n", - "Requirement already satisfied: flatbuffers>=24.3.25 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (24.3.25)\n", - "Requirement already satisfied: termcolor>=1.1.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (2.4.0)\n", - "Requirement already satisfied: wrapt>=1.11.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (1.16.0)\n", - "Requirement already satisfied: grpcio<2.0,>=1.24.3 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (1.65.4)\n", - "Requirement already satisfied: opt-einsum>=2.3.2 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (3.3.0)\n", - "Requirement already satisfied: setuptools in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from 
tensorflow-intel==2.17.0->tensorflow) (58.1.0)\n", - "Requirement already satisfied: packaging in c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (24.1)\n", - "Requirement already satisfied: absl-py>=1.0.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (2.1.0)\n", - "Requirement already satisfied: gast!=0.5.0,!=0.5.1,!=0.5.2,>=0.2.1 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (0.6.0)\n", - "Requirement already satisfied: protobuf!=4.21.0,!=4.21.1,!=4.21.2,!=4.21.3,!=4.21.4,!=4.21.5,<5.0.0dev,>=3.20.3 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (4.25.4)\n", - "Requirement already satisfied: libclang>=13.0.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (18.1.1)\n", - "Requirement already satisfied: tensorboard<2.18,>=2.17 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (2.17.0)\n", - "Requirement already satisfied: astunparse>=1.6.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (1.6.3)\n", - "Requirement already satisfied: wheel<1.0,>=0.23.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from astunparse>=1.6.0->tensorflow-intel==2.17.0->tensorflow) (0.44.0)\n", - "Requirement already satisfied: rich in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from keras>=3.2.0->tensorflow-intel==2.17.0->tensorflow) (13.7.1)\n", - "Requirement already satisfied: namex in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from 
keras>=3.2.0->tensorflow-intel==2.17.0->tensorflow) (0.0.8)\n", - "Requirement already satisfied: optree in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from keras>=3.2.0->tensorflow-intel==2.17.0->tensorflow) (0.12.1)\n", - "Requirement already satisfied: charset-normalizer<4,>=2 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from requests<3,>=2.21.0->tensorflow-intel==2.17.0->tensorflow) (3.3.2)\n", - "Requirement already satisfied: idna<4,>=2.5 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from requests<3,>=2.21.0->tensorflow-intel==2.17.0->tensorflow) (3.7)\n", - "Requirement already satisfied: certifi>=2017.4.17 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from requests<3,>=2.21.0->tensorflow-intel==2.17.0->tensorflow) (2024.7.4)\n", - "Requirement already satisfied: urllib3<3,>=1.21.1 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from requests<3,>=2.21.0->tensorflow-intel==2.17.0->tensorflow) (1.26.19)\n", - "Requirement already satisfied: werkzeug>=1.0.1 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorboard<2.18,>=2.17->tensorflow-intel==2.17.0->tensorflow) (3.0.3)\n", - "Requirement already satisfied: tensorboard-data-server<0.8.0,>=0.7.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorboard<2.18,>=2.17->tensorflow-intel==2.17.0->tensorflow) (0.7.2)\n", - "Requirement already satisfied: markdown>=2.6.8 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorboard<2.18,>=2.17->tensorflow-intel==2.17.0->tensorflow) (3.6)\n", - "Requirement already satisfied: importlib-metadata>=4.4 in c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from markdown>=2.6.8->tensorboard<2.18,>=2.17->tensorflow-intel==2.17.0->tensorflow) 
(8.2.0)\n", - "Requirement already satisfied: MarkupSafe>=2.1.1 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from werkzeug>=1.0.1->tensorboard<2.18,>=2.17->tensorflow-intel==2.17.0->tensorflow) (2.1.5)\n", - "Requirement already satisfied: markdown-it-py>=2.2.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from rich->keras>=3.2.0->tensorflow-intel==2.17.0->tensorflow) (3.0.0)\n", - "Requirement already satisfied: pygments<3.0.0,>=2.13.0 in c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from rich->keras>=3.2.0->tensorflow-intel==2.17.0->tensorflow) (2.18.0)\n", - "Requirement already satisfied: zipp>=0.5 in c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from importlib-metadata>=4.4->markdown>=2.6.8->tensorboard<2.18,>=2.17->tensorflow-intel==2.17.0->tensorflow) (3.20.0)\n", - "Requirement already satisfied: mdurl~=0.1 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from markdown-it-py>=2.2.0->rich->keras>=3.2.0->tensorflow-intel==2.17.0->tensorflow) (0.1.2)\n" - ] - }, - { - "name": "stderr", - "output_type": "stream", - "text": [ - "WARNING: You are using pip version 22.0.4; however, version 24.2 is available.\n", - "You should consider upgrading via the 'C:\\Users\\Asus\\AppData\\Local\\Programs\\Python\\Python39\\python.exe -m pip install --upgrade pip' command.\n" - ] - } - ], - "source": [ - "!pip install tensorflow" - ] - }, - { - "cell_type": "code", - "execution_count": 4, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "Requirement already satisfied: pgmpy in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (0.1.26)\n", - "Requirement already satisfied: google-generativeai in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pgmpy) (0.7.2)\n", - "Requirement already satisfied: scikit-learn 
in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pgmpy) (0.24.0)\n", - "Requirement already satisfied: pyparsing in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pgmpy) (3.1.2)\n", - "Requirement already satisfied: torch in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pgmpy) (2.4.0)\n", - "Requirement already satisfied: pandas in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pgmpy) (2.2.2)\n", - "Requirement already satisfied: xgboost in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pgmpy) (2.1.1)\n", - "Requirement already satisfied: joblib in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pgmpy) (1.4.2)\n", - "Requirement already satisfied: numpy in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pgmpy) (1.26.4)\n", - "Requirement already satisfied: statsmodels in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pgmpy) (0.14.2)\n", - "Requirement already satisfied: tqdm in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pgmpy) (4.66.5)\n", - "Requirement already satisfied: scipy in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pgmpy) (1.13.1)\n", - "Requirement already satisfied: networkx in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pgmpy) (3.2.1)\n", - "Requirement already satisfied: opt-einsum in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pgmpy) (3.3.0)\n", - "Requirement already satisfied: google-api-core in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from google-generativeai->pgmpy) (2.19.1)\n", - "Requirement already satisfied: 
google-api-python-client in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from google-generativeai->pgmpy) (2.140.0)\n", - "Requirement already satisfied: google-auth>=2.15.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from google-generativeai->pgmpy) (2.33.0)\n", - "Requirement already satisfied: typing-extensions in c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from google-generativeai->pgmpy) (4.12.2)\n", - "Requirement already satisfied: pydantic in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from google-generativeai->pgmpy) (2.8.2)\n", - "Requirement already satisfied: google-ai-generativelanguage==0.6.6 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from google-generativeai->pgmpy) (0.6.6)\n", - "Requirement already satisfied: protobuf in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from google-generativeai->pgmpy) (4.25.4)\n", - "Requirement already satisfied: proto-plus<2.0.0dev,>=1.22.3 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from google-ai-generativelanguage==0.6.6->google-generativeai->pgmpy) (1.24.0)\n", - "Requirement already satisfied: pytz>=2020.1 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pandas->pgmpy) (2024.1)\n", - "Requirement already satisfied: tzdata>=2022.7 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pandas->pgmpy) (2024.1)\n", - "Requirement already satisfied: python-dateutil>=2.8.2 in c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from pandas->pgmpy) (2.9.0.post0)\n", - "Requirement already satisfied: threadpoolctl>=2.0.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from scikit-learn->pgmpy) (3.5.0)\n", - "Requirement already satisfied: 
packaging>=21.3 in c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from statsmodels->pgmpy) (24.1)\n", - "Requirement already satisfied: patsy>=0.5.6 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from statsmodels->pgmpy) (0.5.6)\n", - "Requirement already satisfied: fsspec in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from torch->pgmpy) (2024.6.1)\n", - "Requirement already satisfied: sympy in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from torch->pgmpy) (1.13.2)\n", - "Requirement already satisfied: jinja2 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from torch->pgmpy) (3.1.4)\n", - "Requirement already satisfied: filelock in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from torch->pgmpy) (3.15.4)\n", - "Requirement already satisfied: colorama in c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from tqdm->pgmpy) (0.4.6)\n", - "Requirement already satisfied: rsa<5,>=3.1.4 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from google-auth>=2.15.0->google-generativeai->pgmpy) (4.9)\n", - "Requirement already satisfied: cachetools<6.0,>=2.0.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from google-auth>=2.15.0->google-generativeai->pgmpy) (5.4.0)\n", - "Requirement already satisfied: pyasn1-modules>=0.2.1 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from google-auth>=2.15.0->google-generativeai->pgmpy) (0.4.0)\n", - "Requirement already satisfied: six in c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from patsy>=0.5.6->statsmodels->pgmpy) (1.16.0)\n", - "Requirement already satisfied: requests<3.0.0.dev0,>=2.18.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from 
google-api-core->google-generativeai->pgmpy) (2.32.3)\n", - "Requirement already satisfied: googleapis-common-protos<2.0.dev0,>=1.56.2 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from google-api-core->google-generativeai->pgmpy) (1.63.2)\n", - "Requirement already satisfied: uritemplate<5,>=3.0.1 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from google-api-python-client->google-generativeai->pgmpy) (4.1.1)\n", - "Requirement already satisfied: google-auth-httplib2<1.0.0,>=0.2.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from google-api-python-client->google-generativeai->pgmpy) (0.2.0)\n", - "Requirement already satisfied: httplib2<1.dev0,>=0.19.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from google-api-python-client->google-generativeai->pgmpy) (0.22.0)\n", - "Requirement already satisfied: MarkupSafe>=2.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from jinja2->torch->pgmpy) (2.1.5)\n", - "Requirement already satisfied: annotated-types>=0.4.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pydantic->google-generativeai->pgmpy) (0.7.0)\n", - "Requirement already satisfied: pydantic-core==2.20.1 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pydantic->google-generativeai->pgmpy) (2.20.1)\n", - "Requirement already satisfied: mpmath<1.4,>=1.1.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from sympy->torch->pgmpy) (1.3.0)\n", - "Requirement already satisfied: grpcio-status<2.0.dev0,>=1.33.2 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from google-api-core->google-generativeai->pgmpy) (1.62.3)\n", - "Requirement already satisfied: grpcio<2.0dev,>=1.33.2 in 
c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from google-api-core->google-generativeai->pgmpy) (1.65.4)\n", - "Requirement already satisfied: pyasn1<0.7.0,>=0.4.6 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pyasn1-modules>=0.2.1->google-auth>=2.15.0->google-generativeai->pgmpy) (0.6.0)\n", - "Requirement already satisfied: idna<4,>=2.5 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from requests<3.0.0.dev0,>=2.18.0->google-api-core->google-generativeai->pgmpy) (3.7)\n", - "Requirement already satisfied: certifi>=2017.4.17 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from requests<3.0.0.dev0,>=2.18.0->google-api-core->google-generativeai->pgmpy) (2024.7.4)\n", - "Requirement already satisfied: charset-normalizer<4,>=2 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from requests<3.0.0.dev0,>=2.18.0->google-api-core->google-generativeai->pgmpy) (3.3.2)\n", - "Requirement already satisfied: urllib3<3,>=1.21.1 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from requests<3.0.0.dev0,>=2.18.0->google-api-core->google-generativeai->pgmpy) (1.26.19)\n" - ] - }, - { - "name": "stderr", - "output_type": "stream", - "text": [ - "WARNING: You are using pip version 22.0.4; however, version 24.2 is available.\n", - "You should consider upgrading via the 'C:\\Users\\Asus\\AppData\\Local\\Programs\\Python\\Python39\\python.exe -m pip install --upgrade pip' command.\n" - ] - } - ], - "source": [ - "!pip install pgmpy" - ] - }, - { - "cell_type": "code", - "execution_count": 5, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "Requirement already satisfied: sdv in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (1.15.0)\n", - "Requirement already satisfied: cloudpickle>=2.1.0 in 
c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from sdv) (3.0.0)\n", - "Requirement already satisfied: ctgan>=0.10.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from sdv) (0.10.1)\n", - "Requirement already satisfied: copulas>=0.11.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from sdv) (0.11.0)\n", - "Requirement already satisfied: rdt>=1.12.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from sdv) (1.12.2)\n", - "Requirement already satisfied: sdmetrics>=0.14.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from sdv) (0.15.0)\n", - "Requirement already satisfied: pyyaml>=6.0.1 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from sdv) (6.0.2)\n", - "Requirement already satisfied: tqdm>=4.29 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from sdv) (4.66.5)\n", - "Requirement already satisfied: pandas>=1.4.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from sdv) (2.2.2)\n", - "Requirement already satisfied: platformdirs>=4.0 in c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from sdv) (4.2.2)\n", - "Requirement already satisfied: deepecho>=0.6.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from sdv) (0.6.0)\n", - "Requirement already satisfied: numpy<2.0.0,>=1.21.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from sdv) (1.26.4)\n", - "Requirement already satisfied: graphviz>=0.13.2 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from sdv) (0.20.3)\n", - "Requirement already satisfied: boto3>=1.28 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from sdv) (1.34.158)\n", - "Requirement already satisfied: 
botocore>=1.31 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from sdv) (1.34.158)\n", - "Requirement already satisfied: s3transfer<0.11.0,>=0.10.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from boto3>=1.28->sdv) (0.10.2)\n", - "Requirement already satisfied: jmespath<2.0.0,>=0.7.1 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from boto3>=1.28->sdv) (1.0.1)\n", - "Requirement already satisfied: urllib3<1.27,>=1.25.4 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from botocore>=1.31->sdv) (1.26.19)\n", - "Requirement already satisfied: python-dateutil<3.0.0,>=2.1 in c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from botocore>=1.31->sdv) (2.9.0.post0)\n", - "Requirement already satisfied: plotly>=5.10.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from copulas>=0.11.0->sdv) (5.23.0)\n", - "Requirement already satisfied: scipy>=1.5.4 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from copulas>=0.11.0->sdv) (1.13.1)\n", - "Requirement already satisfied: torch>=1.9.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from ctgan>=0.10.0->sdv) (2.4.0)\n", - "Requirement already satisfied: tzdata>=2022.7 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pandas>=1.4.0->sdv) (2024.1)\n", - "Requirement already satisfied: pytz>=2020.1 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pandas>=1.4.0->sdv) (2024.1)\n", - "Collecting scikit-learn>=1.0.2\n", - " Using cached scikit_learn-1.5.1-cp39-cp39-win_amd64.whl (11.0 MB)\n", - "Requirement already satisfied: Faker>=17 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from rdt>=1.12.0->sdv) (26.3.0)\n", - "Requirement already satisfied: 
colorama in c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from tqdm>=4.29->sdv) (0.4.6)\n", - "Requirement already satisfied: tenacity>=6.2.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from plotly>=5.10.0->copulas>=0.11.0->sdv) (9.0.0)\n", - "Requirement already satisfied: packaging in c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from plotly>=5.10.0->copulas>=0.11.0->sdv) (24.1)\n", - "Requirement already satisfied: six>=1.5 in c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from python-dateutil<3.0.0,>=2.1->botocore>=1.31->sdv) (1.16.0)\n", - "Requirement already satisfied: joblib>=1.2.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from scikit-learn>=1.0.2->rdt>=1.12.0->sdv) (1.4.2)\n", - "Requirement already satisfied: threadpoolctl>=3.1.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from scikit-learn>=1.0.2->rdt>=1.12.0->sdv) (3.5.0)\n", - "Requirement already satisfied: filelock in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from torch>=1.9.0->ctgan>=0.10.0->sdv) (3.15.4)\n", - "Requirement already satisfied: fsspec in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from torch>=1.9.0->ctgan>=0.10.0->sdv) (2024.6.1)\n", - "Requirement already satisfied: networkx in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from torch>=1.9.0->ctgan>=0.10.0->sdv) (3.2.1)\n", - "Requirement already satisfied: jinja2 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from torch>=1.9.0->ctgan>=0.10.0->sdv) (3.1.4)\n", - "Requirement already satisfied: sympy in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from torch>=1.9.0->ctgan>=0.10.0->sdv) (1.13.2)\n", - "Requirement already satisfied: typing-extensions>=4.8.0 in 
c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from torch>=1.9.0->ctgan>=0.10.0->sdv) (4.12.2)\n", - "Requirement already satisfied: MarkupSafe>=2.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from jinja2->torch>=1.9.0->ctgan>=0.10.0->sdv) (2.1.5)\n", - "Requirement already satisfied: mpmath<1.4,>=1.1.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from sympy->torch>=1.9.0->ctgan>=0.10.0->sdv) (1.3.0)\n", - "Installing collected packages: scikit-learn\n", - " Attempting uninstall: scikit-learn\n", - " Found existing installation: scikit-learn 0.24.0\n", - " Uninstalling scikit-learn-0.24.0:\n", - " Successfully uninstalled scikit-learn-0.24.0\n", - "Successfully installed scikit-learn-1.5.1\n" - ] - }, - { - "name": "stderr", - "output_type": "stream", - "text": [ - "ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.\n", - "pyitlib 0.2.3 requires scikit-learn<=0.24,>=0.16.0, but you have scikit-learn 1.5.1 which is incompatible.\n", - "WARNING: You are using pip version 22.0.4; however, version 24.2 is available.\n", - "You should consider upgrading via the 'C:\\Users\\Asus\\AppData\\Local\\Programs\\Python\\Python39\\python.exe -m pip install --upgrade pip' command.\n" - ] - } - ], - "source": [ - "!pip install sdv" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "#### Importing Katabatic" - ] - }, - { - "cell_type": "code", - "execution_count": 5, - "metadata": {}, - "outputs": [], - "source": [ - "from katabatic.katabatic import Katabatic\n", - "import numpy as np\n" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Sample Training Data" - ] - }, - { - "cell_type": "code", - "execution_count": 6, - "metadata": {}, - "outputs": [], - "source": [ - "# Generating sample data for X_train\n", - 
"X_train = np.array([\n", - " [0.1, 1.2, 3.4, 4.5],\n", - " [2.3, 3.4, 1.1, 0.2],\n", - " [1.4, 0.4, 2.2, 3.3],\n", - " [3.1, 4.2, 1.0, 2.5],\n", - " [2.2, 1.1, 0.3, 4.1],\n", - " [0.5, 2.2, 4.4, 1.3],\n", - " [1.8, 3.3, 0.9, 2.2],\n", - " [2.6, 1.5, 3.3, 0.1],\n", - " [0.7, 4.4, 1.4, 3.0],\n", - " [1.9, 2.8, 3.9, 4.0]\n", - "])\n", - "\n", - "# Generating sample data for y_train\n", - "y_train = np.array([0, 1, 0, 1, 0, 1, 0, 1, 0, 1])" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Load Model and Train it on Sample Training Data" - ] - }, - { - "cell_type": "code", - "execution_count": 7, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "--------------------------\n", - "module name: katabatic.katabatic\n", - "parent process: 28468\n", - "process id: 28884\n", - "katabatic.models.ganblr.ganblr_adapter\n", - "katabatic.models.ganblr.ganblr_adapter\n", - "\n", - "[INFO] Initializing GANBLR Model\n", - "[INFO] Training GANBLR model\n", - "[SUCCESS] Model training completed\n" - ] - } - ], - "source": [ - "model = Katabatic.run_model('ganblr')\n", - "model.load_model()\n", - "model.fit(X_train,y_train)" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Generate New Data from trained model" - ] - }, - { - "cell_type": "code", - "execution_count": 8, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "[INFO] Generating data using GANBLR model\n", - "[SUCCESS] Data generation completed\n" - ] - }, - { - "data": { - "text/plain": [ - "array([[3.1, 1.1, 0.3, 4.0, 0],\n", - " [0.7, 1.5, 0.3, 3.3, 1],\n", - " [0.7, 1.1, 2.2, 2.5, 1],\n", - " [2.6, 2.8, 1.1, 0.1, 0],\n", - " [2.6, 0.4, 4.4, 4.5, 1]], dtype=object)" - ] - }, - "execution_count": 8, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "model.generate(size=5)" - ] - }, - { - "cell_type": "code", - "execution_count": null, - "metadata": {}, - 
"outputs": [], - "source": [ - "# Future import statement ideas\n", - "\n", - "import katabatic as kb\n", - "from katabatic.models import meg\n", - "from katabatic.models import ganblr\n", - "from katabatic.evaluate import eval_method1\n", - "from katabatic.utils.preprocessing import data_processing_method1 # good place to store preprocessing utilities" - ] - } - ], - "metadata": { - "kernelspec": { - "display_name": "Python 3", - "language": "python", - "name": "python3" - }, - "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 3 - }, - "file_extension": ".py", - "mimetype": "text/x-python", - "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython3", - "version": "3.9.13" - } - }, - "nbformat": 4, - "nbformat_minor": 2 -} diff --git a/Use_Cases/tryKatabatic_Meg.ipynb b/Use_Cases/tryKatabatic_Meg.ipynb deleted file mode 100644 index 7ca75b1..0000000 --- a/Use_Cases/tryKatabatic_Meg.ipynb +++ /dev/null @@ -1,629 +0,0 @@ -{ - "cells": [ - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "# Katabatic Demo Usage" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "#### Importing Dependenciencies" - ] - }, - { - "cell_type": "code", - "execution_count": 1, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "Requirement already satisfied: scikit-learn in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (0.24.0)\n", - "Collecting scikit-learn\n", - " Using cached scikit_learn-1.5.1-cp39-cp39-win_amd64.whl.metadata (12 kB)\n", - "Requirement already satisfied: numpy>=1.19.5 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from scikit-learn) (1.26.4)\n", - "Requirement already satisfied: scipy>=1.6.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from scikit-learn) (1.13.1)\n", - "Requirement already satisfied: joblib>=1.2.0 in 
c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from scikit-learn) (1.4.2)\n", - "Requirement already satisfied: threadpoolctl>=3.1.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from scikit-learn) (3.5.0)\n", - "Downloading scikit_learn-1.5.1-cp39-cp39-win_amd64.whl (11.0 MB)\n", - " ---------------------------------------- 0.0/11.0 MB ? eta -:--:--\n", - " ----------------- ---------------------- 4.7/11.0 MB 28.6 MB/s eta 0:00:01\n", - " ---------------------------------------- 11.0/11.0 MB 29.9 MB/s eta 0:00:00\n", - "Installing collected packages: scikit-learn\n", - " Attempting uninstall: scikit-learn\n", - " Found existing installation: scikit-learn 0.24.0\n", - " Uninstalling scikit-learn-0.24.0:\n", - " Successfully uninstalled scikit-learn-0.24.0\n", - "Successfully installed scikit-learn-1.5.1\n" - ] - }, - { - "name": "stderr", - "output_type": "stream", - "text": [ - "ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. 
This behaviour is the source of the following dependency conflicts.\n", - "pyitlib 0.2.3 requires scikit-learn<=0.24,>=0.16.0, but you have scikit-learn 1.5.1 which is incompatible.\n" - ] - } - ], - "source": [ - "!pip install -U scikit-learn" - ] - }, - { - "cell_type": "code", - "execution_count": 2, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "Requirement already satisfied: pyitlib in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (0.2.3)\n", - "Requirement already satisfied: pandas>=0.20.2 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pyitlib) (2.2.2)\n", - "Requirement already satisfied: numpy>=1.9.2 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pyitlib) (1.26.4)\n", - "Collecting scikit-learn<=0.24,>=0.16.0 (from pyitlib)\n", - " Using cached scikit_learn-0.24.0-cp39-cp39-win_amd64.whl.metadata (9.7 kB)\n", - "Requirement already satisfied: scipy>=1.0.1 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pyitlib) (1.13.1)\n", - "Requirement already satisfied: future>=0.16.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pyitlib) (1.0.0)\n", - "Requirement already satisfied: python-dateutil>=2.8.2 in c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from pandas>=0.20.2->pyitlib) (2.9.0.post0)\n", - "Requirement already satisfied: pytz>=2020.1 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pandas>=0.20.2->pyitlib) (2024.1)\n", - "Requirement already satisfied: tzdata>=2022.7 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pandas>=0.20.2->pyitlib) (2024.1)\n", - "Requirement already satisfied: joblib>=0.11 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from 
scikit-learn<=0.24,>=0.16.0->pyitlib) (1.4.2)\n", - "Requirement already satisfied: threadpoolctl>=2.0.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from scikit-learn<=0.24,>=0.16.0->pyitlib) (3.5.0)\n", - "Requirement already satisfied: six>=1.5 in c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from python-dateutil>=2.8.2->pandas>=0.20.2->pyitlib) (1.16.0)\n", - "Using cached scikit_learn-0.24.0-cp39-cp39-win_amd64.whl (6.9 MB)\n", - "Installing collected packages: scikit-learn\n", - " Attempting uninstall: scikit-learn\n", - " Found existing installation: scikit-learn 1.5.1\n", - " Uninstalling scikit-learn-1.5.1:\n", - " Successfully uninstalled scikit-learn-1.5.1\n", - "Successfully installed scikit-learn-0.24.0\n" - ] - }, - { - "name": "stderr", - "output_type": "stream", - "text": [ - "ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.\n", - "rdt 1.12.2 requires scikit-learn>=1.0.2; python_version < \"3.10\", but you have scikit-learn 0.24.0 which is incompatible.\n", - "sdmetrics 0.15.0 requires scikit-learn>=1.0.2; python_version < \"3.10\", but you have scikit-learn 0.24.0 which is incompatible.\n" - ] - } - ], - "source": [ - "!pip install pyitlib" - ] - }, - { - "cell_type": "code", - "execution_count": 3, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "Requirement already satisfied: tensorflow in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (2.17.0)\n", - "Requirement already satisfied: tensorflow-intel==2.17.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow) (2.17.0)\n", - "Requirement already satisfied: absl-py>=1.0.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from 
tensorflow-intel==2.17.0->tensorflow) (2.1.0)\n", - "Requirement already satisfied: astunparse>=1.6.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (1.6.3)\n", - "Requirement already satisfied: flatbuffers>=24.3.25 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (24.3.25)\n", - "Requirement already satisfied: gast!=0.5.0,!=0.5.1,!=0.5.2,>=0.2.1 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (0.6.0)\n", - "Requirement already satisfied: google-pasta>=0.1.1 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (0.2.0)\n", - "Requirement already satisfied: h5py>=3.10.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (3.11.0)\n", - "Requirement already satisfied: libclang>=13.0.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (18.1.1)\n", - "Requirement already satisfied: ml-dtypes<0.5.0,>=0.3.1 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (0.4.0)\n", - "Requirement already satisfied: opt-einsum>=2.3.2 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (3.3.0)\n", - "Requirement already satisfied: packaging in c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (24.1)\n", - "Requirement already satisfied: protobuf!=4.21.0,!=4.21.1,!=4.21.2,!=4.21.3,!=4.21.4,!=4.21.5,<5.0.0dev,>=3.20.3 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from 
tensorflow-intel==2.17.0->tensorflow) (4.25.4)\n", - "Requirement already satisfied: requests<3,>=2.21.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (2.32.3)\n", - "Requirement already satisfied: setuptools in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (58.1.0)\n", - "Requirement already satisfied: six>=1.12.0 in c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (1.16.0)\n", - "Requirement already satisfied: termcolor>=1.1.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (2.4.0)\n", - "Requirement already satisfied: typing-extensions>=3.6.6 in c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (4.12.2)\n", - "Requirement already satisfied: wrapt>=1.11.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (1.16.0)\n", - "Requirement already satisfied: grpcio<2.0,>=1.24.3 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (1.65.4)\n", - "Requirement already satisfied: tensorboard<2.18,>=2.17 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (2.17.0)\n", - "Requirement already satisfied: keras>=3.2.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (3.4.1)\n", - "Requirement already satisfied: tensorflow-io-gcs-filesystem>=0.23.1 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (0.31.0)\n", - "Requirement already satisfied: 
numpy<2.0.0,>=1.23.5 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorflow-intel==2.17.0->tensorflow) (1.26.4)\n", - "Requirement already satisfied: wheel<1.0,>=0.23.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from astunparse>=1.6.0->tensorflow-intel==2.17.0->tensorflow) (0.44.0)\n", - "Requirement already satisfied: rich in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from keras>=3.2.0->tensorflow-intel==2.17.0->tensorflow) (13.7.1)\n", - "Requirement already satisfied: namex in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from keras>=3.2.0->tensorflow-intel==2.17.0->tensorflow) (0.0.8)\n", - "Requirement already satisfied: optree in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from keras>=3.2.0->tensorflow-intel==2.17.0->tensorflow) (0.12.1)\n", - "Requirement already satisfied: charset-normalizer<4,>=2 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from requests<3,>=2.21.0->tensorflow-intel==2.17.0->tensorflow) (3.3.2)\n", - "Requirement already satisfied: idna<4,>=2.5 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from requests<3,>=2.21.0->tensorflow-intel==2.17.0->tensorflow) (3.7)\n", - "Requirement already satisfied: urllib3<3,>=1.21.1 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from requests<3,>=2.21.0->tensorflow-intel==2.17.0->tensorflow) (1.26.19)\n", - "Requirement already satisfied: certifi>=2017.4.17 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from requests<3,>=2.21.0->tensorflow-intel==2.17.0->tensorflow) (2024.7.4)\n", - "Requirement already satisfied: markdown>=2.6.8 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from 
tensorboard<2.18,>=2.17->tensorflow-intel==2.17.0->tensorflow) (3.6)\n", - "Requirement already satisfied: tensorboard-data-server<0.8.0,>=0.7.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorboard<2.18,>=2.17->tensorflow-intel==2.17.0->tensorflow) (0.7.2)\n", - "Requirement already satisfied: werkzeug>=1.0.1 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from tensorboard<2.18,>=2.17->tensorflow-intel==2.17.0->tensorflow) (3.0.3)\n", - "Requirement already satisfied: importlib-metadata>=4.4 in c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from markdown>=2.6.8->tensorboard<2.18,>=2.17->tensorflow-intel==2.17.0->tensorflow) (8.2.0)\n", - "Requirement already satisfied: MarkupSafe>=2.1.1 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from werkzeug>=1.0.1->tensorboard<2.18,>=2.17->tensorflow-intel==2.17.0->tensorflow) (2.1.5)\n", - "Requirement already satisfied: markdown-it-py>=2.2.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from rich->keras>=3.2.0->tensorflow-intel==2.17.0->tensorflow) (3.0.0)\n", - "Requirement already satisfied: pygments<3.0.0,>=2.13.0 in c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from rich->keras>=3.2.0->tensorflow-intel==2.17.0->tensorflow) (2.18.0)\n", - "Requirement already satisfied: zipp>=0.5 in c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from importlib-metadata>=4.4->markdown>=2.6.8->tensorboard<2.18,>=2.17->tensorflow-intel==2.17.0->tensorflow) (3.20.0)\n", - "Requirement already satisfied: mdurl~=0.1 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from markdown-it-py>=2.2.0->rich->keras>=3.2.0->tensorflow-intel==2.17.0->tensorflow) (0.1.2)\n" - ] - } - ], - "source": [ - "!pip install tensorflow" - ] - }, - { - "cell_type": "code", - "execution_count": 4, - "metadata": {}, 
- "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "Requirement already satisfied: pgmpy in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (0.1.26)\n", - "Requirement already satisfied: networkx in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pgmpy) (3.2.1)\n", - "Requirement already satisfied: numpy in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pgmpy) (1.26.4)\n", - "Requirement already satisfied: scipy in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pgmpy) (1.13.1)\n", - "Requirement already satisfied: scikit-learn in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pgmpy) (0.24.0)\n", - "Requirement already satisfied: pandas in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pgmpy) (2.2.2)\n", - "Requirement already satisfied: pyparsing in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pgmpy) (3.1.2)\n", - "Requirement already satisfied: torch in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pgmpy) (2.4.0)\n", - "Requirement already satisfied: statsmodels in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pgmpy) (0.14.2)\n", - "Requirement already satisfied: tqdm in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pgmpy) (4.66.5)\n", - "Requirement already satisfied: joblib in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pgmpy) (1.4.2)\n", - "Requirement already satisfied: opt-einsum in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pgmpy) (3.3.0)\n", - "Requirement already satisfied: xgboost in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages 
(from pgmpy) (2.1.1)\n", - "Requirement already satisfied: google-generativeai in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pgmpy) (0.7.2)\n", - "Requirement already satisfied: google-ai-generativelanguage==0.6.6 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from google-generativeai->pgmpy) (0.6.6)\n", - "Requirement already satisfied: google-api-core in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from google-generativeai->pgmpy) (2.19.1)\n", - "Requirement already satisfied: google-api-python-client in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from google-generativeai->pgmpy) (2.140.0)\n", - "Requirement already satisfied: google-auth>=2.15.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from google-generativeai->pgmpy) (2.33.0)\n", - "Requirement already satisfied: protobuf in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from google-generativeai->pgmpy) (4.25.4)\n", - "Requirement already satisfied: pydantic in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from google-generativeai->pgmpy) (2.8.2)\n", - "Requirement already satisfied: typing-extensions in c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from google-generativeai->pgmpy) (4.12.2)\n", - "Requirement already satisfied: proto-plus<2.0.0dev,>=1.22.3 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from google-ai-generativelanguage==0.6.6->google-generativeai->pgmpy) (1.24.0)\n", - "Requirement already satisfied: python-dateutil>=2.8.2 in c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from pandas->pgmpy) (2.9.0.post0)\n", - "Requirement already satisfied: pytz>=2020.1 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from 
pandas->pgmpy) (2024.1)\n", - "Requirement already satisfied: tzdata>=2022.7 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pandas->pgmpy) (2024.1)\n", - "Requirement already satisfied: threadpoolctl>=2.0.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from scikit-learn->pgmpy) (3.5.0)\n", - "Requirement already satisfied: patsy>=0.5.6 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from statsmodels->pgmpy) (0.5.6)\n", - "Requirement already satisfied: packaging>=21.3 in c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from statsmodels->pgmpy) (24.1)\n", - "Requirement already satisfied: filelock in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from torch->pgmpy) (3.15.4)\n", - "Requirement already satisfied: sympy in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from torch->pgmpy) (1.13.2)\n", - "Requirement already satisfied: jinja2 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from torch->pgmpy) (3.1.4)\n", - "Requirement already satisfied: fsspec in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from torch->pgmpy) (2024.6.1)\n", - "Requirement already satisfied: colorama in c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from tqdm->pgmpy) (0.4.6)\n", - "Requirement already satisfied: googleapis-common-protos<2.0.dev0,>=1.56.2 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from google-api-core->google-generativeai->pgmpy) (1.63.2)\n", - "Requirement already satisfied: requests<3.0.0.dev0,>=2.18.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from google-api-core->google-generativeai->pgmpy) (2.32.3)\n", - "Requirement already satisfied: cachetools<6.0,>=2.0.0 in 
c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from google-auth>=2.15.0->google-generativeai->pgmpy) (5.4.0)\n", - "Requirement already satisfied: pyasn1-modules>=0.2.1 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from google-auth>=2.15.0->google-generativeai->pgmpy) (0.4.0)\n", - "Requirement already satisfied: rsa<5,>=3.1.4 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from google-auth>=2.15.0->google-generativeai->pgmpy) (4.9)\n", - "Requirement already satisfied: six in c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from patsy>=0.5.6->statsmodels->pgmpy) (1.16.0)\n", - "Requirement already satisfied: httplib2<1.dev0,>=0.19.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from google-api-python-client->google-generativeai->pgmpy) (0.22.0)\n", - "Requirement already satisfied: google-auth-httplib2<1.0.0,>=0.2.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from google-api-python-client->google-generativeai->pgmpy) (0.2.0)\n", - "Requirement already satisfied: uritemplate<5,>=3.0.1 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from google-api-python-client->google-generativeai->pgmpy) (4.1.1)\n", - "Requirement already satisfied: MarkupSafe>=2.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from jinja2->torch->pgmpy) (2.1.5)\n", - "Requirement already satisfied: annotated-types>=0.4.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pydantic->google-generativeai->pgmpy) (0.7.0)\n", - "Requirement already satisfied: pydantic-core==2.20.1 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pydantic->google-generativeai->pgmpy) (2.20.1)\n", - "Requirement already satisfied: mpmath<1.4,>=1.1.0 in 
c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from sympy->torch->pgmpy) (1.3.0)\n", - "Requirement already satisfied: grpcio<2.0dev,>=1.33.2 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.10.*,!=2.2.*,!=2.3.*,!=2.4.*,!=2.5.*,!=2.6.*,!=2.7.*,!=2.8.*,!=2.9.*,<3.0.0dev,>=1.34.1->google-ai-generativelanguage==0.6.6->google-generativeai->pgmpy) (1.65.4)\n", - "Requirement already satisfied: grpcio-status<2.0.dev0,>=1.33.2 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.10.*,!=2.2.*,!=2.3.*,!=2.4.*,!=2.5.*,!=2.6.*,!=2.7.*,!=2.8.*,!=2.9.*,<3.0.0dev,>=1.34.1->google-ai-generativelanguage==0.6.6->google-generativeai->pgmpy) (1.62.3)\n", - "Requirement already satisfied: pyasn1<0.7.0,>=0.4.6 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pyasn1-modules>=0.2.1->google-auth>=2.15.0->google-generativeai->pgmpy) (0.6.0)\n", - "Requirement already satisfied: charset-normalizer<4,>=2 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from requests<3.0.0.dev0,>=2.18.0->google-api-core->google-generativeai->pgmpy) (3.3.2)\n", - "Requirement already satisfied: idna<4,>=2.5 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from requests<3.0.0.dev0,>=2.18.0->google-api-core->google-generativeai->pgmpy) (3.7)\n", - "Requirement already satisfied: urllib3<3,>=1.21.1 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from requests<3.0.0.dev0,>=2.18.0->google-api-core->google-generativeai->pgmpy) (1.26.19)\n", - "Requirement already satisfied: certifi>=2017.4.17 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from requests<3.0.0.dev0,>=2.18.0->google-api-core->google-generativeai->pgmpy) (2024.7.4)\n" - ] - } - ], - 
"source": [ - "!pip install pgmpy" - ] - }, - { - "cell_type": "code", - "execution_count": 5, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "Requirement already satisfied: sdv in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (1.15.0)\n", - "Requirement already satisfied: boto3>=1.28 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from sdv) (1.34.158)\n", - "Requirement already satisfied: botocore>=1.31 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from sdv) (1.34.158)\n", - "Requirement already satisfied: cloudpickle>=2.1.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from sdv) (3.0.0)\n", - "Requirement already satisfied: graphviz>=0.13.2 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from sdv) (0.20.3)\n", - "Requirement already satisfied: tqdm>=4.29 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from sdv) (4.66.5)\n", - "Requirement already satisfied: copulas>=0.11.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from sdv) (0.11.0)\n", - "Requirement already satisfied: ctgan>=0.10.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from sdv) (0.10.1)\n", - "Requirement already satisfied: deepecho>=0.6.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from sdv) (0.6.0)\n", - "Requirement already satisfied: rdt>=1.12.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from sdv) (1.12.2)\n", - "Requirement already satisfied: sdmetrics>=0.14.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from sdv) (0.15.0)\n", - "Requirement already satisfied: platformdirs>=4.0 in 
c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from sdv) (4.2.2)\n", - "Requirement already satisfied: pyyaml>=6.0.1 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from sdv) (6.0.2)\n", - "Requirement already satisfied: numpy<2.0.0,>=1.21.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from sdv) (1.26.4)\n", - "Requirement already satisfied: pandas>=1.4.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from sdv) (2.2.2)\n", - "Requirement already satisfied: jmespath<2.0.0,>=0.7.1 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from boto3>=1.28->sdv) (1.0.1)\n", - "Requirement already satisfied: s3transfer<0.11.0,>=0.10.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from boto3>=1.28->sdv) (0.10.2)\n", - "Requirement already satisfied: python-dateutil<3.0.0,>=2.1 in c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from botocore>=1.31->sdv) (2.9.0.post0)\n", - "Requirement already satisfied: urllib3<1.27,>=1.25.4 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from botocore>=1.31->sdv) (1.26.19)\n", - "Requirement already satisfied: plotly>=5.10.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from copulas>=0.11.0->sdv) (5.23.0)\n", - "Requirement already satisfied: scipy>=1.5.4 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from copulas>=0.11.0->sdv) (1.13.1)\n", - "Requirement already satisfied: torch>=1.9.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from ctgan>=0.10.0->sdv) (2.4.0)\n", - "Requirement already satisfied: pytz>=2020.1 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pandas>=1.4.0->sdv) (2024.1)\n", - "Requirement already satisfied: 
tzdata>=2022.7 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from pandas>=1.4.0->sdv) (2024.1)\n", - "Requirement already satisfied: Faker>=17 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from rdt>=1.12.0->sdv) (26.3.0)\n", - "Collecting scikit-learn>=1.0.2 (from rdt>=1.12.0->sdv)\n", - " Using cached scikit_learn-1.5.1-cp39-cp39-win_amd64.whl.metadata (12 kB)\n", - "Requirement already satisfied: colorama in c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from tqdm>=4.29->sdv) (0.4.6)\n", - "Requirement already satisfied: tenacity>=6.2.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from plotly>=5.10.0->copulas>=0.11.0->sdv) (9.0.0)\n", - "Requirement already satisfied: packaging in c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from plotly>=5.10.0->copulas>=0.11.0->sdv) (24.1)\n", - "Requirement already satisfied: six>=1.5 in c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from python-dateutil<3.0.0,>=2.1->botocore>=1.31->sdv) (1.16.0)\n", - "Requirement already satisfied: joblib>=1.2.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from scikit-learn>=1.0.2->rdt>=1.12.0->sdv) (1.4.2)\n", - "Requirement already satisfied: threadpoolctl>=3.1.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from scikit-learn>=1.0.2->rdt>=1.12.0->sdv) (3.5.0)\n", - "Requirement already satisfied: filelock in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from torch>=1.9.0->ctgan>=0.10.0->sdv) (3.15.4)\n", - "Requirement already satisfied: typing-extensions>=4.8.0 in c:\\users\\asus\\appdata\\roaming\\python\\python39\\site-packages (from torch>=1.9.0->ctgan>=0.10.0->sdv) (4.12.2)\n", - "Requirement already satisfied: sympy in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages 
(from torch>=1.9.0->ctgan>=0.10.0->sdv) (1.13.2)\n", - "Requirement already satisfied: networkx in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from torch>=1.9.0->ctgan>=0.10.0->sdv) (3.2.1)\n", - "Requirement already satisfied: jinja2 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from torch>=1.9.0->ctgan>=0.10.0->sdv) (3.1.4)\n", - "Requirement already satisfied: fsspec in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from torch>=1.9.0->ctgan>=0.10.0->sdv) (2024.6.1)\n", - "Requirement already satisfied: MarkupSafe>=2.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from jinja2->torch>=1.9.0->ctgan>=0.10.0->sdv) (2.1.5)\n", - "Requirement already satisfied: mpmath<1.4,>=1.1.0 in c:\\users\\asus\\appdata\\local\\programs\\python\\python39\\lib\\site-packages (from sympy->torch>=1.9.0->ctgan>=0.10.0->sdv) (1.3.0)\n", - "Using cached scikit_learn-1.5.1-cp39-cp39-win_amd64.whl (11.0 MB)\n", - "Installing collected packages: scikit-learn\n", - " Attempting uninstall: scikit-learn\n", - " Found existing installation: scikit-learn 0.24.0\n", - " Uninstalling scikit-learn-0.24.0:\n", - " Successfully uninstalled scikit-learn-0.24.0\n", - "Successfully installed scikit-learn-1.5.1\n" - ] - }, - { - "name": "stderr", - "output_type": "stream", - "text": [ - "ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. 
This behaviour is the source of the following dependency conflicts.\n", - "pyitlib 0.2.3 requires scikit-learn<=0.24,>=0.16.0, but you have scikit-learn 1.5.1 which is incompatible.\n" - ] - } - ], - "source": [ - "!pip install sdv" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "#### Importing Katabatic" - ] - }, - { - "cell_type": "code", - "execution_count": 1, - "metadata": {}, - "outputs": [], - "source": [ - "from katabatic.katabatic import Katabatic\n", - "import pandas as pd\n", - "import numpy as np\n", - "\n" - ] - }, - { - "cell_type": "code", - "execution_count": null, - "metadata": {}, - "outputs": [], - "source": [ - "from katabatic.models.meg_DGEK.utils import get_demo_data\n", - "df = get_demo_data('adult-raw')\n", - "df.head()" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Sample Training Data" - ] - }, - { - "cell_type": "code", - "execution_count": 2, - "metadata": {}, - "outputs": [ - { - "data": { - "text/plain": [ - "array([ 0, 2, 4, 10, 11, 12], dtype=int64)" - ] - }, - "execution_count": 2, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "from katabatic.models.meg_DGEK.utils import get_demo_data\n", - "df = get_demo_data('adult-raw')\n", - "df.head()\n", - "\n", - "from sklearn.model_selection import train_test_split\n", - "x, y = df.values[:,:-1], df.values[:,-1]\n", - "X_train, X_test, y_train, y_test = train_test_split(x, y, test_size=0.5)\n", - "\n", - "\n", - "\n" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Load Model and Train it on Sample Training Data" - ] - }, - { - "cell_type": "code", - "execution_count": 5, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "[INFO] Initializing MEG Model\n", - "[INFO] Initial X columns: RangeIndex(start=0, stop=14, step=1)\n", - "[INFO] Converting column '0' to numerical format\n", - "[INFO] Converting column '1' to numerical format\n", 
- "[INFO] Converting column '2' to numerical format\n", - "[INFO] Converting column '3' to numerical format\n", - "[INFO] Converting column '4' to numerical format\n", - "[INFO] Converting column '5' to numerical format\n", - "[INFO] Converting column '6' to numerical format\n", - "[INFO] Converting column '7' to numerical format\n", - "[INFO] Converting column '8' to numerical format\n", - "[INFO] Converting column '9' to numerical format\n", - "[INFO] Converting column '10' to numerical format\n", - "[INFO] Converting column '11' to numerical format\n", - "[INFO] Converting column '12' to numerical format\n", - "[INFO] Converting column '13' to numerical format\n", - "[INFO] Converting labels to numerical format\n", - "[INFO] Training MEG model\n", - "E0U0: G_loss = inf, D_loss = 777.388672, sim_loss=110714.439373\n", - "E0U1: G_loss = inf, D_loss = 559.423035, sim_loss=110715.170703\n", - "E0 weight: [0.59, 0.41]\n", - "E1U0: G_loss = inf, D_loss = 661.672607, sim_loss=110659.604135\n", - "E1U1: G_loss = inf, D_loss = 728.580872, sim_loss=110660.189110\n", - "E1 weight: [0.57, 0.43]\n", - "E2U0: G_loss = inf, D_loss = 403.012054, sim_loss=111391.254888\n", - "E2U1: G_loss = inf, D_loss = 267.257874, sim_loss=111391.850214\n", - "E2 weight: [0.57, 0.43]\n", - "E3U0: G_loss = inf, D_loss = 229.164261, sim_loss=110873.795380\n", - "E3U1: G_loss = inf, D_loss = 517.770325, sim_loss=110874.383348\n", - "E3 weight: [0.57, 0.43]\n", - "E4U0: G_loss = inf, D_loss = 354.685089, sim_loss=111018.929235\n", - "E4U1: G_loss = inf, D_loss = 587.477112, sim_loss=111019.523121\n", - "E4 weight: [0.57, 0.43]\n", - "[SUCCESS] Model training completed\n" - ] - } - ], - "source": [ - "from katabatic.models.meg_DGEK.meg_adapter import MegAdapter\n", - "adapter = MegAdapter()\n", - "adapter.load_model()\n", - "adapter.fit(X_train, y_train, epochs=5)\n" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Generate New Data from trained model" - ] - }, - { - 
"cell_type": "code", - "execution_count": 6, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "[INFO] Generating data using MEG model\n", - "[SUCCESS] Data generation completed\n" - ] - }, - { - "data": { - "text/html": [ - "
\n", - "\n", - "\n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - "
012345678910111213
010317344111232182358619
142164478843120104599033
22341689299042406669333
322413388139393103441849
430111241211603019360218
\n", - "
" - ], - "text/plain": [ - " 0 1 2 3 4 5 6 7 8 9 10 11 12 13\n", - "0 10 3 1734 4 1 1 12 3 2 1 82 35 86 19\n", - "1 4 2 16447 8 8 4 3 1 2 0 104 59 90 33\n", - "2 23 4 16892 9 9 0 4 2 4 0 66 69 3 33\n", - "3 22 4 13388 13 9 3 9 3 1 0 34 41 84 9\n", - "4 3 0 11124 12 11 6 0 3 0 1 93 60 21 8" - ] - }, - "execution_count": 6, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "adapter.generate(size=5)" - ] - }, - { - "cell_type": "code", - "execution_count": null, - "metadata": {}, - "outputs": [], - "source": [ - "# Future import statement ideas\n", - "\n", - "import katabatic as kb\n", - "from katabatic.models import meg\n", - "from katabatic.models import ganblr\n", - "from katabatic.evaluate import eval_method1\n", - "from katabatic.utils.preprocessing import data_processing_method1 # good place to store preprocessing utilities" - ] - } - ], - "metadata": { - "kernelspec": { - "display_name": "Python 3", - "language": "python", - "name": "python3" - }, - "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 3 - }, - "file_extension": ".py", - "mimetype": "text/x-python", - "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython3", - "version": "3.9.13" - } - }, - "nbformat": 4, - "nbformat_minor": 2 -} diff --git a/Use_Cases/tryKatabatic_Muhammad.ipynb b/Use_Cases/tryKatabatic_Muhammad.ipynb deleted file mode 100644 index f15de9b..0000000 --- a/Use_Cases/tryKatabatic_Muhammad.ipynb +++ /dev/null @@ -1,1763 +0,0 @@ -{ - "cells": [ - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "# Katabatic Demo Usage" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "#### Importing Dependenciencies" - ] - }, - { - "cell_type": "code", - "execution_count": 1, - "metadata": {}, - "outputs": [], - "source": [ - "# !pip install -U scikit-learn" - ] - }, - { - "cell_type": "code", - "execution_count": 2, - "metadata": {}, - "outputs": [], - "source": [ - "# !pip install 
pyitlib" - ] - }, - { - "cell_type": "code", - "execution_count": 3, - "metadata": {}, - "outputs": [], - "source": [ - "# !pip install tensorflow" - ] - }, - { - "cell_type": "code", - "execution_count": 4, - "metadata": {}, - "outputs": [], - "source": [ - "# !pip install pgmpy" - ] - }, - { - "cell_type": "code", - "execution_count": 5, - "metadata": {}, - "outputs": [], - "source": [ - "# !pip install sdv" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "#### Importing Katabatic" - ] - }, - { - "cell_type": "code", - "execution_count": 2, - "metadata": {}, - "outputs": [], - "source": [ - "from katabatic.katabatic import Katabatic\n", - "import numpy as np\n", - "import pandas as pd" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Sample Training Data" - ] - }, - { - "cell_type": "code", - "execution_count": 3, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "--------------------------\n", - "module name: katabatic.katabatic\n", - "parent process: 69249\n", - "process id: 15244\n", - "katabatic.models.medgan.medgan_adapter\n", - "katabatic.models.medgan.medgan_adapter\n" - ] - }, - { - "name": "stderr", - "output_type": "stream", - "text": [ - "2024-08-30 17:24:03.506848: I tensorflow/core/platform/cpu_feature_guard.cc:210] This TensorFlow binary is optimized to use available CPU instructions in performance-critical operations.\n", - "To enable the following instructions: AVX2 FMA, in other operations, rebuild TensorFlow with the appropriate compiler flags.\n" - ] - }, - { - "name": "stdout", - "output_type": "stream", - "text": [ - "\n" - ] - } - ], - "source": [ - "medgan_model = Katabatic.run_model('medgan')" - ] - }, - { - "cell_type": "code", - "execution_count": 4, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - " parents has_nurs form children housing finance social \\\n", - "0 usual proper complete 1 
convenient convenient nonprob \n", - "1 usual proper complete 1 convenient convenient nonprob \n", - "2 usual proper complete 1 convenient convenient nonprob \n", - "3 usual proper complete 1 convenient convenient slightly_prob \n", - "4 usual proper complete 1 convenient convenient slightly_prob \n", - "\n", - " health class \n", - "0 recommended recommend \n", - "1 priority priority \n", - "2 not_recom not_recom \n", - "3 recommended recommend \n", - "4 priority priority \n" - ] - } - ], - "source": [ - "file_path = \"/Users/abdullah/Documents/GitHub/Katabatic/katabatic/nursery/nursery.data\"\n", - "\n", - "# Column names as per the dataset's description\n", - "columns = [\n", - " \"parents\", \"has_nurs\", \"form\", \"children\", \"housing\",\n", - " \"finance\", \"social\", \"health\", \"class\"\n", - "]\n", - "\n", - "# Load the dataset into a pandas DataFrame\n", - "data = pd.read_csv(file_path, header=None, names=columns)\n", - "\n", - "# Display the first few rows of the dataset\n", - "print(data.head())" - ] - }, - { - "cell_type": "code", - "execution_count": 5, - "metadata": {}, - "outputs": [], - "source": [ - "X, y = data.iloc[:, :-1], data.iloc[:, -1]" - ] - }, - { - "cell_type": "code", - "execution_count": 6, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "[INFO] Initializing MedGAN Model\n", - "[INFO] Preprocessing and training MedGAN model\n", - "[INFO] Preprocessing data...\n", - "[SUCCESS] Data preprocessing completed.\n", - "[INFO] Data shape: (12960, 8)\n", - "[INFO] Encoded data shape: (12960, 8)\n", - "[INFO] TrainX shape: (11664, 8), ValidX shape: (1296, 8)\n", - "Pretraining autoencoder...\n", - "[INFO] Autoencoder input shape: (None, 8)\n", - "[INFO] Encoded layer shape: (None, 128)\n", - "[INFO] Decoded output shape: (None, 8)\n", - "\u001b[1m183/183\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m1s\u001b[0m 960us/step - loss: -1.2586\n", - "Training 
GAN...\n", - "[INFO] Generator InputLayer shape: (128,)\n", - "[INFO] After Generator Dense layer 0, shape: 128\n", - "[INFO] After Generator Dense layer 1, shape: 128\n", - "[INFO] Generator output shape: 8\n", - "[INFO] Generator model summary:\n" - ] - }, - { - "data": { - "text/html": [ - "
Model: \"sequential\"\n",
-       "
\n" - ], - "text/plain": [ - "\u001b[1mModel: \"sequential\"\u001b[0m\n" - ] - }, - "metadata": {}, - "output_type": "display_data" - }, - { - "data": { - "text/html": [ - "
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━┓\n",
-       "┃ Layer (type)                     Output Shape                  Param # ┃\n",
-       "┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━┩\n",
-       "│ dense (Dense)                   │ (None, 128)            │        16,512 │\n",
-       "├─────────────────────────────────┼────────────────────────┼───────────────┤\n",
-       "│ dense_1 (Dense)                 │ (None, 128)            │        16,512 │\n",
-       "├─────────────────────────────────┼────────────────────────┼───────────────┤\n",
-       "│ dense_2 (Dense)                 │ (None, 8)              │         1,032 │\n",
-       "└─────────────────────────────────┴────────────────────────┴───────────────┘\n",
-       "
\n" - ], - "text/plain": [ - "┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━┓\n", - "┃\u001b[1m \u001b[0m\u001b[1mLayer (type) \u001b[0m\u001b[1m \u001b[0m┃\u001b[1m \u001b[0m\u001b[1mOutput Shape \u001b[0m\u001b[1m \u001b[0m┃\u001b[1m \u001b[0m\u001b[1m Param #\u001b[0m\u001b[1m \u001b[0m┃\n", - "┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━┩\n", - "│ dense (\u001b[38;5;33mDense\u001b[0m) │ (\u001b[38;5;45mNone\u001b[0m, \u001b[38;5;34m128\u001b[0m) │ \u001b[38;5;34m16,512\u001b[0m │\n", - "├─────────────────────────────────┼────────────────────────┼───────────────┤\n", - "│ dense_1 (\u001b[38;5;33mDense\u001b[0m) │ (\u001b[38;5;45mNone\u001b[0m, \u001b[38;5;34m128\u001b[0m) │ \u001b[38;5;34m16,512\u001b[0m │\n", - "├─────────────────────────────────┼────────────────────────┼───────────────┤\n", - "│ dense_2 (\u001b[38;5;33mDense\u001b[0m) │ (\u001b[38;5;45mNone\u001b[0m, \u001b[38;5;34m8\u001b[0m) │ \u001b[38;5;34m1,032\u001b[0m │\n", - "└─────────────────────────────────┴────────────────────────┴───────────────┘\n" - ] - }, - "metadata": {}, - "output_type": "display_data" - }, - { - "data": { - "text/html": [ - "
 Total params: 34,056 (133.03 KB)\n",
-       "
\n" - ], - "text/plain": [ - "\u001b[1m Total params: \u001b[0m\u001b[38;5;34m34,056\u001b[0m (133.03 KB)\n" - ] - }, - "metadata": {}, - "output_type": "display_data" - }, - { - "data": { - "text/html": [ - "
 Trainable params: 34,056 (133.03 KB)\n",
-       "
\n" - ], - "text/plain": [ - "\u001b[1m Trainable params: \u001b[0m\u001b[38;5;34m34,056\u001b[0m (133.03 KB)\n" - ] - }, - "metadata": {}, - "output_type": "display_data" - }, - { - "data": { - "text/html": [ - "
 Non-trainable params: 0 (0.00 B)\n",
-       "
\n" - ], - "text/plain": [ - "\u001b[1m Non-trainable params: \u001b[0m\u001b[38;5;34m0\u001b[0m (0.00 B)\n" - ] - }, - "metadata": {}, - "output_type": "display_data" - }, - { - "name": "stdout", - "output_type": "stream", - "text": [ - "[INFO] Discriminator InputLayer shape: (8,)\n", - "[INFO] After Discriminator Dense layer 0, shape: 256\n", - "[INFO] After Discriminator Dense layer 1, shape: 128\n", - "[INFO] Discriminator output shape: 1\n", - "[INFO] Discriminator model summary:\n" - ] - }, - { - "data": { - "text/html": [ - "
Model: \"sequential_1\"\n",
-       "
\n" - ], - "text/plain": [ - "\u001b[1mModel: \"sequential_1\"\u001b[0m\n" - ] - }, - "metadata": {}, - "output_type": "display_data" - }, - { - "data": { - "text/html": [ - "
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━┓\n",
-       "┃ Layer (type)                     Output Shape                  Param # ┃\n",
-       "┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━┩\n",
-       "│ dense_3 (Dense)                 │ (None, 256)            │         2,304 │\n",
-       "├─────────────────────────────────┼────────────────────────┼───────────────┤\n",
-       "│ dense_4 (Dense)                 │ (None, 128)            │        32,896 │\n",
-       "├─────────────────────────────────┼────────────────────────┼───────────────┤\n",
-       "│ dense_5 (Dense)                 │ (None, 1)              │           129 │\n",
-       "└─────────────────────────────────┴────────────────────────┴───────────────┘\n",
-       "
\n" - ], - "text/plain": [ - "┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━┓\n", - "┃\u001b[1m \u001b[0m\u001b[1mLayer (type) \u001b[0m\u001b[1m \u001b[0m┃\u001b[1m \u001b[0m\u001b[1mOutput Shape \u001b[0m\u001b[1m \u001b[0m┃\u001b[1m \u001b[0m\u001b[1m Param #\u001b[0m\u001b[1m \u001b[0m┃\n", - "┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━┩\n", - "│ dense_3 (\u001b[38;5;33mDense\u001b[0m) │ (\u001b[38;5;45mNone\u001b[0m, \u001b[38;5;34m256\u001b[0m) │ \u001b[38;5;34m2,304\u001b[0m │\n", - "├─────────────────────────────────┼────────────────────────┼───────────────┤\n", - "│ dense_4 (\u001b[38;5;33mDense\u001b[0m) │ (\u001b[38;5;45mNone\u001b[0m, \u001b[38;5;34m128\u001b[0m) │ \u001b[38;5;34m32,896\u001b[0m │\n", - "├─────────────────────────────────┼────────────────────────┼───────────────┤\n", - "│ dense_5 (\u001b[38;5;33mDense\u001b[0m) │ (\u001b[38;5;45mNone\u001b[0m, \u001b[38;5;34m1\u001b[0m) │ \u001b[38;5;34m129\u001b[0m │\n", - "└─────────────────────────────────┴────────────────────────┴───────────────┘\n" - ] - }, - "metadata": {}, - "output_type": "display_data" - }, - { - "data": { - "text/html": [ - "
 Total params: 35,329 (138.00 KB)\n",
-       "
\n" - ], - "text/plain": [ - "\u001b[1m Total params: \u001b[0m\u001b[38;5;34m35,329\u001b[0m (138.00 KB)\n" - ] - }, - "metadata": {}, - "output_type": "display_data" - }, - { - "data": { - "text/html": [ - "
 Trainable params: 35,329 (138.00 KB)\n",
-       "
\n" - ], - "text/plain": [ - "\u001b[1m Trainable params: \u001b[0m\u001b[38;5;34m35,329\u001b[0m (138.00 KB)\n" - ] - }, - "metadata": {}, - "output_type": "display_data" - }, - { - "data": { - "text/html": [ - "
 Non-trainable params: 0 (0.00 B)\n",
-       "
\n" - ], - "text/plain": [ - "\u001b[1m Non-trainable params: \u001b[0m\u001b[38;5;34m0\u001b[0m (0.00 B)\n" - ] - }, - "metadata": {}, - "output_type": "display_data" - }, - { - "name": "stdout", - "output_type": "stream", - "text": [ - "[INFO] GAN model summary:\n" - ] - }, - { - "data": { - "text/html": [ - "
Model: \"functional_6\"\n",
-       "
\n" - ], - "text/plain": [ - "\u001b[1mModel: \"functional_6\"\u001b[0m\n" - ] - }, - "metadata": {}, - "output_type": "display_data" - }, - { - "data": { - "text/html": [ - "
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━┓\n",
-       "┃ Layer (type)                     Output Shape                  Param # ┃\n",
-       "┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━┩\n",
-       "│ input_layer_2 (InputLayer)      │ (None, 128)            │             0 │\n",
-       "├─────────────────────────────────┼────────────────────────┼───────────────┤\n",
-       "│ sequential (Sequential)         │ (None, 8)              │        34,056 │\n",
-       "├─────────────────────────────────┼────────────────────────┼───────────────┤\n",
-       "│ sequential_1 (Sequential)       │ (None, 1)              │        35,329 │\n",
-       "└─────────────────────────────────┴────────────────────────┴───────────────┘\n",
-       "
\n" - ], - "text/plain": [ - "┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━┓\n", - "┃\u001b[1m \u001b[0m\u001b[1mLayer (type) \u001b[0m\u001b[1m \u001b[0m┃\u001b[1m \u001b[0m\u001b[1mOutput Shape \u001b[0m\u001b[1m \u001b[0m┃\u001b[1m \u001b[0m\u001b[1m Param #\u001b[0m\u001b[1m \u001b[0m┃\n", - "┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━┩\n", - "│ input_layer_2 (\u001b[38;5;33mInputLayer\u001b[0m) │ (\u001b[38;5;45mNone\u001b[0m, \u001b[38;5;34m128\u001b[0m) │ \u001b[38;5;34m0\u001b[0m │\n", - "├─────────────────────────────────┼────────────────────────┼───────────────┤\n", - "│ sequential (\u001b[38;5;33mSequential\u001b[0m) │ (\u001b[38;5;45mNone\u001b[0m, \u001b[38;5;34m8\u001b[0m) │ \u001b[38;5;34m34,056\u001b[0m │\n", - "├─────────────────────────────────┼────────────────────────┼───────────────┤\n", - "│ sequential_1 (\u001b[38;5;33mSequential\u001b[0m) │ (\u001b[38;5;45mNone\u001b[0m, \u001b[38;5;34m1\u001b[0m) │ \u001b[38;5;34m35,329\u001b[0m │\n", - "└─────────────────────────────────┴────────────────────────┴───────────────┘\n" - ] - }, - "metadata": {}, - "output_type": "display_data" - }, - { - "data": { - "text/html": [ - "
 Total params: 69,385 (271.04 KB)\n",
-       "
\n" - ], - "text/plain": [ - "\u001b[1m Total params: \u001b[0m\u001b[38;5;34m69,385\u001b[0m (271.04 KB)\n" - ] - }, - "metadata": {}, - "output_type": "display_data" - }, - { - "data": { - "text/html": [ - "
 Trainable params: 34,056 (133.03 KB)\n",
-       "
\n" - ], - "text/plain": [ - "\u001b[1m Trainable params: \u001b[0m\u001b[38;5;34m34,056\u001b[0m (133.03 KB)\n" - ] - }, - "metadata": {}, - "output_type": "display_data" - }, - { - "data": { - "text/html": [ - "
 Non-trainable params: 35,329 (138.00 KB)\n",
-       "
\n" - ], - "text/plain": [ - "\u001b[1m Non-trainable params: \u001b[0m\u001b[38;5;34m35,329\u001b[0m (138.00 KB)\n" - ] - }, - "metadata": {}, - "output_type": "display_data" - }, - { - "name": "stdout", - "output_type": "stream", - "text": [ - "\u001b[1m365/365\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 837us/step\n", - "[INFO] Generated data shape: (11664, 8)\n", - "[ERROR] TypeError during logging: unsupported format string passed to list.__format__\n", - "[DEBUG] d_loss_real: [array(0.8183685, dtype=float32), array(0.04972565, dtype=float32)], d_loss_fake: [array(0.74627894, dtype=float32), array(0.37804356, dtype=float32)], g_loss: [array(0.74627894, dtype=float32), array(0.74627894, dtype=float32), array(0.37804356, dtype=float32)]\n", - "\u001b[1m365/365\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 827us/step\n", - "[INFO] Generated data shape: (11664, 8)\n", - "[ERROR] TypeError during logging: unsupported format string passed to list.__format__\n", - "[DEBUG] d_loss_real: [array(0.7703088, dtype=float32), array(0.26860425, dtype=float32)], d_loss_fake: [array(0.753547, dtype=float32), array(0.29218107, dtype=float32)], g_loss: [array(0.753547, dtype=float32), array(0.753547, dtype=float32), array(0.29218107, dtype=float32)]\n", - "\u001b[1m365/365\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 693us/step\n", - "[INFO] Generated data shape: (11664, 8)\n", - "[ERROR] TypeError during logging: unsupported format string passed to list.__format__\n", - "[DEBUG] d_loss_real: [array(0.76651126, dtype=float32), array(0.24368998, dtype=float32)], d_loss_fake: [array(0.76039344, dtype=float32), array(0.22110768, dtype=float32)], g_loss: [array(0.76039344, dtype=float32), array(0.76039344, dtype=float32), array(0.22110768, dtype=float32)]\n", - "\u001b[1m365/365\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m 
\u001b[1m0s\u001b[0m 690us/step\n", - "[INFO] Generated data shape: (11664, 8)\n", - "[ERROR] TypeError during logging: unsupported format string passed to list.__format__\n", - "[DEBUG] d_loss_real: [array(0.76867557, dtype=float32), array(0.19662453, dtype=float32)], d_loss_fake: [array(0.766811, dtype=float32), array(0.174372, dtype=float32)], g_loss: [array(0.766811, dtype=float32), array(0.766811, dtype=float32), array(0.174372, dtype=float32)]\n", - "\u001b[1m365/365\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 763us/step\n", - "[INFO] Generated data shape: (11664, 8)\n", - "[ERROR] TypeError during logging: unsupported format string passed to list.__format__\n", - "[DEBUG] d_loss_real: [array(0.7725397, dtype=float32), array(0.1605224, dtype=float32)], d_loss_fake: [array(0.7727463, dtype=float32), array(0.1447188, dtype=float32)], g_loss: [array(0.7727463, dtype=float32), array(0.7727463, dtype=float32), array(0.1447188, dtype=float32)]\n", - "\u001b[1m365/365\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 696us/step\n", - "[INFO] Generated data shape: (11664, 8)\n", - "[ERROR] TypeError during logging: unsupported format string passed to list.__format__\n", - "[DEBUG] d_loss_real: [array(0.77689385, dtype=float32), array(0.13608305, dtype=float32)], d_loss_fake: [array(0.77831054, dtype=float32), array(0.12474994, dtype=float32)], g_loss: [array(0.77831054, dtype=float32), array(0.77831054, dtype=float32), array(0.12474994, dtype=float32)]\n", - "\u001b[1m365/365\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 934us/step\n", - "[INFO] Generated data shape: (11664, 8)\n", - "[ERROR] TypeError during logging: unsupported format string passed to list.__format__\n", - "[DEBUG] d_loss_real: [array(0.7813919, dtype=float32), array(0.11897884, dtype=float32)], d_loss_fake: [array(0.7834789, dtype=float32), array(0.11048035, 
dtype=float32)], g_loss: [array(0.7834789, dtype=float32), array(0.7834789, dtype=float32), array(0.11048035, dtype=float32)]\n", - "\u001b[1m365/365\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 707us/step\n", - "[INFO] Generated data shape: (11664, 8)\n", - "[ERROR] TypeError during logging: unsupported format string passed to list.__format__\n", - "[DEBUG] d_loss_real: [array(0.78580487, dtype=float32), array(0.10643004, dtype=float32)], d_loss_fake: [array(0.78827137, dtype=float32), array(0.09977816, dtype=float32)], g_loss: [array(0.78827137, dtype=float32), array(0.78827137, dtype=float32), array(0.09977816, dtype=float32)]\n", - "\u001b[1m365/365\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 713us/step\n", - "[INFO] Generated data shape: (11664, 8)\n", - "[ERROR] TypeError during logging: unsupported format string passed to list.__format__\n", - "[DEBUG] d_loss_real: [array(0.7900418, dtype=float32), array(0.0968339, dtype=float32)], d_loss_fake: [array(0.7926293, dtype=float32), array(0.09145424, dtype=float32)], g_loss: [array(0.7926293, dtype=float32), array(0.7926293, dtype=float32), array(0.09145424, dtype=float32)]\n", - "\u001b[1m365/365\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 720us/step\n", - "[INFO] Generated data shape: (11664, 8)\n", - "[ERROR] TypeError during logging: unsupported format string passed to list.__format__\n", - "[DEBUG] d_loss_real: [array(0.793984, dtype=float32), array(0.08925799, dtype=float32)], d_loss_fake: [array(0.79655147, dtype=float32), array(0.0847951, dtype=float32)], g_loss: [array(0.79655147, dtype=float32), array(0.79655147, dtype=float32), array(0.0847951, dtype=float32)]\n", - "\u001b[1m365/365\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 758us/step\n", - "[INFO] Generated data shape: (11664, 8)\n", - "[ERROR] TypeError during logging: 
unsupported format string passed to list.__format__\n", - "[DEBUG] d_loss_real: [array(0.79759043, dtype=float32), array(0.08312512, dtype=float32)], d_loss_fake: [array(0.80006146, dtype=float32), array(0.07934671, dtype=float32)], g_loss: [array(0.80006146, dtype=float32), array(0.80006146, dtype=float32), array(0.07934671, dtype=float32)]\n", - "\u001b[1m365/365\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 768us/step\n", - "[INFO] Generated data shape: (11664, 8)\n", - "[ERROR] TypeError during logging: unsupported format string passed to list.__format__\n", - "[DEBUG] d_loss_real: [array(0.8008574, dtype=float32), array(0.07805884, dtype=float32)], d_loss_fake: [array(0.803175, dtype=float32), array(0.07480638, dtype=float32)], g_loss: [array(0.803175, dtype=float32), array(0.803175, dtype=float32), array(0.07480638, dtype=float32)]\n", - "\u001b[1m365/365\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 772us/step\n", - "[INFO] Generated data shape: (11664, 8)\n", - "[ERROR] TypeError during logging: unsupported format string passed to list.__format__\n", - "[DEBUG] d_loss_real: [array(0.80378276, dtype=float32), array(0.07380316, dtype=float32)], d_loss_fake: [array(0.80596375, dtype=float32), array(0.07096457, dtype=float32)], g_loss: [array(0.80596375, dtype=float32), array(0.80596375, dtype=float32), array(0.07096457, dtype=float32)]\n", - "\u001b[1m365/365\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 746us/step\n", - "[INFO] Generated data shape: (11664, 8)\n", - "[ERROR] TypeError during logging: unsupported format string passed to list.__format__\n", - "[DEBUG] d_loss_real: [array(0.8064232, dtype=float32), array(0.07017794, dtype=float32)], d_loss_fake: [array(0.8084695, dtype=float32), array(0.06767159, dtype=float32)], g_loss: [array(0.8084695, dtype=float32), array(0.8084695, dtype=float32), array(0.06767159, 
dtype=float32)]\n", - "\u001b[1m365/365\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 802us/step\n", - "[INFO] Generated data shape: (11664, 8)\n", - "[ERROR] TypeError during logging: unsupported format string passed to list.__format__\n", - "[DEBUG] d_loss_real: [array(0.8088108, dtype=float32), array(0.06705277, dtype=float32)], d_loss_fake: [array(0.8107515, dtype=float32), array(0.06481767, dtype=float32)], g_loss: [array(0.8107515, dtype=float32), array(0.8107515, dtype=float32), array(0.06481767, dtype=float32)]\n", - "\u001b[1m365/365\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 746us/step\n", - "[INFO] Generated data shape: (11664, 8)\n", - "[ERROR] TypeError during logging: unsupported format string passed to list.__format__\n", - "[DEBUG] d_loss_real: [array(0.8109971, dtype=float32), array(0.06433083, dtype=float32)], d_loss_fake: [array(0.8128684, dtype=float32), array(0.06232049, dtype=float32)], g_loss: [array(0.8128684, dtype=float32), array(0.8128684, dtype=float32), array(0.06232049, dtype=float32)]\n", - "\u001b[1m365/365\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 772us/step\n", - "[INFO] Generated data shape: (11664, 8)\n", - "[ERROR] TypeError during logging: unsupported format string passed to list.__format__\n", - "[DEBUG] d_loss_real: [array(0.813035, dtype=float32), array(0.06193883, dtype=float32)], d_loss_fake: [array(0.8148717, dtype=float32), array(0.0601171, dtype=float32)], g_loss: [array(0.8148717, dtype=float32), array(0.8148717, dtype=float32), array(0.0601171, dtype=float32)]\n", - "\u001b[1m365/365\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 773us/step\n", - "[INFO] Generated data shape: (11664, 8)\n", - "[ERROR] TypeError during logging: unsupported format string passed to list.__format__\n", - "[DEBUG] d_loss_real: [array(0.81497157, dtype=float32), 
array(0.0598202, dtype=float32)], d_loss_fake: [array(0.81680626, dtype=float32), array(0.05815853, dtype=float32)], g_loss: [array(0.81680626, dtype=float32), array(0.81680626, dtype=float32), array(0.05815853, dtype=float32)]\n", - "\u001b[1m365/365\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 1ms/step\n", - "[INFO] Generated data shape: (11664, 8)\n", - "[ERROR] TypeError during logging: unsupported format string passed to list.__format__\n", - "[DEBUG] d_loss_real: [array(0.81684846, dtype=float32), array(0.05793061, dtype=float32)], d_loss_fake: [array(0.81870955, dtype=float32), array(0.05640613, dtype=float32)], g_loss: [array(0.81870955, dtype=float32), array(0.81870955, dtype=float32), array(0.05640613, dtype=float32)]\n", - "\u001b[1m365/365\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 793us/step\n", - "[INFO] Generated data shape: (11664, 8)\n", - "[ERROR] TypeError during logging: unsupported format string passed to list.__format__\n", - "[DEBUG] d_loss_real: [array(0.8187008, dtype=float32), array(0.05623483, dtype=float32)], d_loss_fake: [array(0.8206065, dtype=float32), array(0.05482896, dtype=float32)], g_loss: [array(0.8206065, dtype=float32), array(0.8206065, dtype=float32), array(0.05482896, dtype=float32)]\n", - "[SUCCESS] Model training completed\n" - ] - } - ], - "source": [ - "medgan_model.load_model()\n", - "\n", - "medgan_model.fit(X,y, epochs=20)" - ] - }, - { - "cell_type": "code", - "execution_count": 7, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "[INFO] Generating data using MedGAN model\n", - "[INFO] Generated samples shape: (12960, 8)\n", - "[INFO] Post-processing generated data...\n", - "[SUCCESS] Data post-processing completed.\n", - "[SUCCESS] Data generation completed\n" - ] - } - ], - "source": [ - "medgan_synthetic = medgan_model.generate(size=len(data))" - ] - }, - { - "cell_type": 
"code", - "execution_count": 8, - "metadata": {}, - "outputs": [], - "source": [ - "medgan_synthetic = pd.DataFrame(medgan_synthetic)" - ] - }, - { - "cell_type": "code", - "execution_count": 9, - "metadata": {}, - "outputs": [], - "source": [ - "X_synthetic_ganblr, y_synthetic_ganblr = medgan_synthetic.iloc[:, :-1], medgan_synthetic.iloc[:, -1]" - ] - }, - { - "cell_type": "code", - "execution_count": 10, - "metadata": {}, - "outputs": [ - { - "data": { - "text/html": [ - "
\n", - "\n", - "\n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - "
parentshas_nursformchildrenhousingfinancesocial
0great_pretcriticalcomplete1convenientconvenientnonprob
1great_pretcriticalcomplete1convenientconvenientnonprob
2great_pretcriticalcomplete1convenientconvenientnonprob
3great_pretcriticalcomplete1convenientconvenientnonprob
4great_pretcriticalcomplete1convenientconvenientnonprob
........................
12955great_pretcriticalcomplete1convenientconvenientnonprob
12956great_pretcriticalcomplete1convenientconvenientnonprob
12957great_pretcriticalcomplete1convenientconvenientnonprob
12958great_pretcriticalcomplete1convenientconvenientnonprob
12959great_pretcriticalcomplete1convenientconvenientnonprob
\n", - "

12960 rows × 7 columns

\n", - "
" - ], - "text/plain": [ - " parents has_nurs form children housing finance \\\n", - "0 great_pret critical complete 1 convenient convenient \n", - "1 great_pret critical complete 1 convenient convenient \n", - "2 great_pret critical complete 1 convenient convenient \n", - "3 great_pret critical complete 1 convenient convenient \n", - "4 great_pret critical complete 1 convenient convenient \n", - "... ... ... ... ... ... ... \n", - "12955 great_pret critical complete 1 convenient convenient \n", - "12956 great_pret critical complete 1 convenient convenient \n", - "12957 great_pret critical complete 1 convenient convenient \n", - "12958 great_pret critical complete 1 convenient convenient \n", - "12959 great_pret critical complete 1 convenient convenient \n", - "\n", - " social \n", - "0 nonprob \n", - "1 nonprob \n", - "2 nonprob \n", - "3 nonprob \n", - "4 nonprob \n", - "... ... \n", - "12955 nonprob \n", - "12956 nonprob \n", - "12957 nonprob \n", - "12958 nonprob \n", - "12959 nonprob \n", - "\n", - "[12960 rows x 7 columns]" - ] - }, - "execution_count": 10, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "X_synthetic_ganblr" - ] - }, - { - "cell_type": "code", - "execution_count": 11, - "metadata": {}, - "outputs": [ - { - "data": { - "text/html": [ - "
\n", - "\n", - "\n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - "
parentshas_nursformchildrenhousingfinancesocialhealth
0usualpropercomplete1convenientconvenientnonprobrecommended
1usualpropercomplete1convenientconvenientnonprobpriority
2usualpropercomplete1convenientconvenientnonprobnot_recom
3usualpropercomplete1convenientconvenientslightly_probrecommended
4usualpropercomplete1convenientconvenientslightly_probpriority
...........................
12955great_pretvery_critfostermorecriticalinconvslightly_probpriority
12956great_pretvery_critfostermorecriticalinconvslightly_probnot_recom
12957great_pretvery_critfostermorecriticalinconvproblematicrecommended
12958great_pretvery_critfostermorecriticalinconvproblematicpriority
12959great_pretvery_critfostermorecriticalinconvproblematicnot_recom
\n", - "

12960 rows × 8 columns

\n", - "
" - ], - "text/plain": [ - " parents has_nurs form children housing finance \\\n", - "0 usual proper complete 1 convenient convenient \n", - "1 usual proper complete 1 convenient convenient \n", - "2 usual proper complete 1 convenient convenient \n", - "3 usual proper complete 1 convenient convenient \n", - "4 usual proper complete 1 convenient convenient \n", - "... ... ... ... ... ... ... \n", - "12955 great_pret very_crit foster more critical inconv \n", - "12956 great_pret very_crit foster more critical inconv \n", - "12957 great_pret very_crit foster more critical inconv \n", - "12958 great_pret very_crit foster more critical inconv \n", - "12959 great_pret very_crit foster more critical inconv \n", - "\n", - " social health \n", - "0 nonprob recommended \n", - "1 nonprob priority \n", - "2 nonprob not_recom \n", - "3 slightly_prob recommended \n", - "4 slightly_prob priority \n", - "... ... ... \n", - "12955 slightly_prob priority \n", - "12956 slightly_prob not_recom \n", - "12957 problematic recommended \n", - "12958 problematic priority \n", - "12959 problematic not_recom \n", - "\n", - "[12960 rows x 8 columns]" - ] - }, - "execution_count": 11, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "X" - ] - }, - { - "cell_type": "code", - "execution_count": 13, - "metadata": {}, - "outputs": [ - { - "data": { - "text/plain": [ - "array(['not_recom'], dtype=object)" - ] - }, - "execution_count": 13, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "y_synthetic_ganblr.unique()" - ] - }, - { - "cell_type": "code", - "execution_count": 35, - "metadata": {}, - "outputs": [], - "source": [ - "X = X.drop('health',axis=1)" - ] - }, - { - "cell_type": "code", - "execution_count": 18, - "metadata": {}, - "outputs": [], - "source": [ - "rm = pd.DataFrame(columns=[\"Metric\", \"Value\"])" - ] - }, - { - "cell_type": "code", - "execution_count": 19, - "metadata": {}, - "outputs": [], - "source": [ - "from 
katabatic.utils import evaluate\n", - "import pandas as pd" - ] - }, - { - "cell_type": "code", - "execution_count": 36, - "metadata": {}, - "outputs": [ - { - "data": { - "text/html": [ - "
\n", - "\n", - "\n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - "
parentshas_nursformchildrenhousingfinancesocial
0usualpropercomplete1convenientconvenientnonprob
1usualpropercomplete1convenientconvenientnonprob
2usualpropercomplete1convenientconvenientnonprob
3usualpropercomplete1convenientconvenientslightly_prob
4usualpropercomplete1convenientconvenientslightly_prob
........................
12955great_pretvery_critfostermorecriticalinconvslightly_prob
12956great_pretvery_critfostermorecriticalinconvslightly_prob
12957great_pretvery_critfostermorecriticalinconvproblematic
12958great_pretvery_critfostermorecriticalinconvproblematic
12959great_pretvery_critfostermorecriticalinconvproblematic
\n", - "

12960 rows × 7 columns

\n", - "
" - ], - "text/plain": [ - " parents has_nurs form children housing finance \\\n", - "0 usual proper complete 1 convenient convenient \n", - "1 usual proper complete 1 convenient convenient \n", - "2 usual proper complete 1 convenient convenient \n", - "3 usual proper complete 1 convenient convenient \n", - "4 usual proper complete 1 convenient convenient \n", - "... ... ... ... ... ... ... \n", - "12955 great_pret very_crit foster more critical inconv \n", - "12956 great_pret very_crit foster more critical inconv \n", - "12957 great_pret very_crit foster more critical inconv \n", - "12958 great_pret very_crit foster more critical inconv \n", - "12959 great_pret very_crit foster more critical inconv \n", - "\n", - " social \n", - "0 nonprob \n", - "1 nonprob \n", - "2 nonprob \n", - "3 slightly_prob \n", - "4 slightly_prob \n", - "... ... \n", - "12955 slightly_prob \n", - "12956 slightly_prob \n", - "12957 problematic \n", - "12958 problematic \n", - "12959 problematic \n", - "\n", - "[12960 rows x 7 columns]" - ] - }, - "execution_count": 36, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "X" - ] - }, - { - "cell_type": "code", - "execution_count": 37, - "metadata": {}, - "outputs": [ - { - "ename": "ValueError", - "evalue": "y contains previously unseen labels: 'spec_prior'", - "output_type": "error", - "traceback": [ - "\u001b[0;31m---------------------------------------------------------------------------\u001b[0m", - "\u001b[0;31mKeyError\u001b[0m Traceback (most recent call last)", - "File \u001b[0;32m/opt/anaconda3/envs/katabatic/lib/python3.9/site-packages/sklearn/utils/_encode.py:225\u001b[0m, in \u001b[0;36m_encode\u001b[0;34m(values, uniques, check_unknown)\u001b[0m\n\u001b[1;32m 224\u001b[0m \u001b[38;5;28;01mtry\u001b[39;00m:\n\u001b[0;32m--> 225\u001b[0m \u001b[38;5;28;01mreturn\u001b[39;00m \u001b[43m_map_to_integer\u001b[49m\u001b[43m(\u001b[49m\u001b[43mvalues\u001b[49m\u001b[43m,\u001b[49m\u001b[43m 
\u001b[49m\u001b[43muniques\u001b[49m\u001b[43m)\u001b[49m\n\u001b[1;32m 226\u001b[0m \u001b[38;5;28;01mexcept\u001b[39;00m \u001b[38;5;167;01mKeyError\u001b[39;00m \u001b[38;5;28;01mas\u001b[39;00m e:\n", - "File \u001b[0;32m/opt/anaconda3/envs/katabatic/lib/python3.9/site-packages/sklearn/utils/_encode.py:165\u001b[0m, in \u001b[0;36m_map_to_integer\u001b[0;34m(values, uniques)\u001b[0m\n\u001b[1;32m 164\u001b[0m table \u001b[38;5;241m=\u001b[39m _nandict({val: i \u001b[38;5;28;01mfor\u001b[39;00m i, val \u001b[38;5;129;01min\u001b[39;00m \u001b[38;5;28menumerate\u001b[39m(uniques)})\n\u001b[0;32m--> 165\u001b[0m \u001b[38;5;28;01mreturn\u001b[39;00m np\u001b[38;5;241m.\u001b[39marray([table[v] \u001b[38;5;28;01mfor\u001b[39;00m v \u001b[38;5;129;01min\u001b[39;00m values])\n", - "File \u001b[0;32m/opt/anaconda3/envs/katabatic/lib/python3.9/site-packages/sklearn/utils/_encode.py:165\u001b[0m, in \u001b[0;36m\u001b[0;34m(.0)\u001b[0m\n\u001b[1;32m 164\u001b[0m table \u001b[38;5;241m=\u001b[39m _nandict({val: i \u001b[38;5;28;01mfor\u001b[39;00m i, val \u001b[38;5;129;01min\u001b[39;00m \u001b[38;5;28menumerate\u001b[39m(uniques)})\n\u001b[0;32m--> 165\u001b[0m \u001b[38;5;28;01mreturn\u001b[39;00m np\u001b[38;5;241m.\u001b[39marray([\u001b[43mtable\u001b[49m\u001b[43m[\u001b[49m\u001b[43mv\u001b[49m\u001b[43m]\u001b[49m \u001b[38;5;28;01mfor\u001b[39;00m v \u001b[38;5;129;01min\u001b[39;00m values])\n", - "File \u001b[0;32m/opt/anaconda3/envs/katabatic/lib/python3.9/site-packages/sklearn/utils/_encode.py:159\u001b[0m, in \u001b[0;36m_nandict.__missing__\u001b[0;34m(self, key)\u001b[0m\n\u001b[1;32m 158\u001b[0m \u001b[38;5;28;01mreturn\u001b[39;00m \u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39mnan_value\n\u001b[0;32m--> 159\u001b[0m \u001b[38;5;28;01mraise\u001b[39;00m \u001b[38;5;167;01mKeyError\u001b[39;00m(key)\n", - "\u001b[0;31mKeyError\u001b[0m: 'spec_prior'", - "\nDuring handling of the above exception, another exception occurred:\n", - 
"\u001b[0;31mValueError\u001b[0m Traceback (most recent call last)", - "Cell \u001b[0;32mIn[37], line 3\u001b[0m\n\u001b[1;32m 1\u001b[0m tstr_logreg \u001b[38;5;241m=\u001b[39m evaluate\u001b[38;5;241m.\u001b[39mrun_metric(\u001b[38;5;124m\"\u001b[39m\u001b[38;5;124mtstr_logreg\u001b[39m\u001b[38;5;124m\"\u001b[39m)\n\u001b[0;32m----> 3\u001b[0m \u001b[38;5;28mprint\u001b[39m(\u001b[43mtstr_logreg\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mevaluate\u001b[49m\u001b[43m(\u001b[49m\u001b[43mX_synthetic_ganblr\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43my_synthetic_ganblr\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43mX\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43my\u001b[49m\u001b[43m)\u001b[49m)\n", - "File \u001b[0;32m~/Documents/GitHub/Katabatic/katabatic/metrics/tstr_logreg.py:83\u001b[0m, in \u001b[0;36mevaluate\u001b[0;34m(X_synthetic, y_synthetic, X_real, y_real)\u001b[0m\n\u001b[1;32m 81\u001b[0m le \u001b[38;5;241m=\u001b[39m LabelEncoder()\n\u001b[1;32m 82\u001b[0m y_synthetic \u001b[38;5;241m=\u001b[39m le\u001b[38;5;241m.\u001b[39mfit_transform(y_synthetic)\n\u001b[0;32m---> 83\u001b[0m y_test_real \u001b[38;5;241m=\u001b[39m \u001b[43mle\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mtransform\u001b[49m\u001b[43m(\u001b[49m\u001b[43my_test_real\u001b[49m\u001b[43m)\u001b[49m\n\u001b[1;32m 85\u001b[0m \u001b[38;5;66;03m# TSTR Evaluation using Logistic Regression\u001b[39;00m\n\u001b[1;32m 86\u001b[0m model\u001b[38;5;241m.\u001b[39mfit(X_synthetic, y_synthetic)\n", - "File \u001b[0;32m/opt/anaconda3/envs/katabatic/lib/python3.9/site-packages/sklearn/preprocessing/_label.py:137\u001b[0m, in \u001b[0;36mLabelEncoder.transform\u001b[0;34m(self, y)\u001b[0m\n\u001b[1;32m 134\u001b[0m \u001b[38;5;28;01mif\u001b[39;00m _num_samples(y) \u001b[38;5;241m==\u001b[39m \u001b[38;5;241m0\u001b[39m:\n\u001b[1;32m 135\u001b[0m \u001b[38;5;28;01mreturn\u001b[39;00m 
np\u001b[38;5;241m.\u001b[39marray([])\n\u001b[0;32m--> 137\u001b[0m \u001b[38;5;28;01mreturn\u001b[39;00m \u001b[43m_encode\u001b[49m\u001b[43m(\u001b[49m\u001b[43my\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43muniques\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[38;5;28;43mself\u001b[39;49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mclasses_\u001b[49m\u001b[43m)\u001b[49m\n", - "File \u001b[0;32m/opt/anaconda3/envs/katabatic/lib/python3.9/site-packages/sklearn/utils/_encode.py:227\u001b[0m, in \u001b[0;36m_encode\u001b[0;34m(values, uniques, check_unknown)\u001b[0m\n\u001b[1;32m 225\u001b[0m \u001b[38;5;28;01mreturn\u001b[39;00m _map_to_integer(values, uniques)\n\u001b[1;32m 226\u001b[0m \u001b[38;5;28;01mexcept\u001b[39;00m \u001b[38;5;167;01mKeyError\u001b[39;00m \u001b[38;5;28;01mas\u001b[39;00m e:\n\u001b[0;32m--> 227\u001b[0m \u001b[38;5;28;01mraise\u001b[39;00m \u001b[38;5;167;01mValueError\u001b[39;00m(\u001b[38;5;124mf\u001b[39m\u001b[38;5;124m\"\u001b[39m\u001b[38;5;124my contains previously unseen labels: \u001b[39m\u001b[38;5;132;01m{\u001b[39;00m\u001b[38;5;28mstr\u001b[39m(e)\u001b[38;5;132;01m}\u001b[39;00m\u001b[38;5;124m\"\u001b[39m)\n\u001b[1;32m 228\u001b[0m \u001b[38;5;28;01melse\u001b[39;00m:\n\u001b[1;32m 229\u001b[0m \u001b[38;5;28;01mif\u001b[39;00m check_unknown:\n", - "\u001b[0;31mValueError\u001b[0m: y contains previously unseen labels: 'spec_prior'" - ] - } - ], - "source": [ - "tstr_logreg = evaluate.run_metric(\"tstr_logreg\")\n", - "\n", - "print(tstr_logreg.evaluate(X_synthetic_ganblr, y_synthetic_ganblr, X, y))" - ] - }, - { - "cell_type": "code", - "execution_count": 21, - "metadata": {}, - "outputs": [ - { - "ename": "ValueError", - "evalue": "The number of features in X_synthetic and X_real must be the same.", - "output_type": "error", - "traceback": [ - "\u001b[0;31m---------------------------------------------------------------------------\u001b[0m", - "\u001b[0;31mValueError\u001b[0m Traceback (most recent 
call last)", - "Cell \u001b[0;32mIn[21], line 2\u001b[0m\n\u001b[1;32m 1\u001b[0m \u001b[38;5;66;03m# Append the result to the results DataFrame\u001b[39;00m\n\u001b[0;32m----> 2\u001b[0m rm \u001b[38;5;241m=\u001b[39m pd\u001b[38;5;241m.\u001b[39mconcat([rm, pd\u001b[38;5;241m.\u001b[39mDataFrame({\u001b[38;5;124m\"\u001b[39m\u001b[38;5;124mMetric\u001b[39m\u001b[38;5;124m\"\u001b[39m: [\u001b[38;5;124m'\u001b[39m\u001b[38;5;124mtstr_logreg\u001b[39m\u001b[38;5;124m'\u001b[39m], \u001b[38;5;124m\"\u001b[39m\u001b[38;5;124mValue\u001b[39m\u001b[38;5;124m\"\u001b[39m: [\u001b[43mevaluate\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mrun_metric\u001b[49m\u001b[43m(\u001b[49m\u001b[38;5;124;43m'\u001b[39;49m\u001b[38;5;124;43mtstr_logreg\u001b[39;49m\u001b[38;5;124;43m'\u001b[39;49m\u001b[43m)\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mevaluate\u001b[49m\u001b[43m(\u001b[49m\u001b[43mX_synthetic_ganblr\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43my_synthetic_ganblr\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43mX\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43my\u001b[49m\u001b[43m)\u001b[49m]})], ignore_index\u001b[38;5;241m=\u001b[39m\u001b[38;5;28;01mTrue\u001b[39;00m,)\n\u001b[1;32m 4\u001b[0m \u001b[38;5;66;03m# Append the result to the results DataFrame\u001b[39;00m\n\u001b[1;32m 5\u001b[0m rg \u001b[38;5;241m=\u001b[39m pd\u001b[38;5;241m.\u001b[39mconcat([rm, pd\u001b[38;5;241m.\u001b[39mDataFrame({\u001b[38;5;124m\"\u001b[39m\u001b[38;5;124mMetric\u001b[39m\u001b[38;5;124m\"\u001b[39m: [\u001b[38;5;124m'\u001b[39m\u001b[38;5;124mtstr_mlp\u001b[39m\u001b[38;5;124m'\u001b[39m], \u001b[38;5;124m\"\u001b[39m\u001b[38;5;124mValue\u001b[39m\u001b[38;5;124m\"\u001b[39m: [evaluate\u001b[38;5;241m.\u001b[39mrun_metric(\u001b[38;5;124m'\u001b[39m\u001b[38;5;124mtstr_mlp\u001b[39m\u001b[38;5;124m'\u001b[39m)\u001b[38;5;241m.\u001b[39mevaluate(X_synthetic_ganblr, y_synthetic_ganblr, X, y)]})], 
ignore_index\u001b[38;5;241m=\u001b[39m\u001b[38;5;28;01mTrue\u001b[39;00m,)\n", - "File \u001b[0;32m~/Documents/GitHub/Katabatic/katabatic/metrics/tstr_logreg.py:30\u001b[0m, in \u001b[0;36mevaluate\u001b[0;34m(X_synthetic, y_synthetic, X_real, y_real)\u001b[0m\n\u001b[1;32m 28\u001b[0m \u001b[38;5;66;03m# Data Validation\u001b[39;00m\n\u001b[1;32m 29\u001b[0m \u001b[38;5;28;01mif\u001b[39;00m X_synthetic\u001b[38;5;241m.\u001b[39mshape[\u001b[38;5;241m1\u001b[39m] \u001b[38;5;241m!=\u001b[39m X_real\u001b[38;5;241m.\u001b[39mshape[\u001b[38;5;241m1\u001b[39m]:\n\u001b[0;32m---> 30\u001b[0m \u001b[38;5;28;01mraise\u001b[39;00m \u001b[38;5;167;01mValueError\u001b[39;00m(\n\u001b[1;32m 31\u001b[0m \u001b[38;5;124m\"\u001b[39m\u001b[38;5;124mThe number of features in X_synthetic and X_real must be the same.\u001b[39m\u001b[38;5;124m\"\u001b[39m\n\u001b[1;32m 32\u001b[0m )\n\u001b[1;32m 34\u001b[0m \u001b[38;5;28;01mif\u001b[39;00m \u001b[38;5;28mlen\u001b[39m(y_synthetic) \u001b[38;5;241m!=\u001b[39m \u001b[38;5;28mlen\u001b[39m(X_synthetic):\n\u001b[1;32m 35\u001b[0m \u001b[38;5;28;01mraise\u001b[39;00m \u001b[38;5;167;01mValueError\u001b[39;00m(\n\u001b[1;32m 36\u001b[0m \u001b[38;5;124m\"\u001b[39m\u001b[38;5;124mX_synthetic and y_synthetic must have the same number of samples.\u001b[39m\u001b[38;5;124m\"\u001b[39m\n\u001b[1;32m 37\u001b[0m )\n", - "\u001b[0;31mValueError\u001b[0m: The number of features in X_synthetic and X_real must be the same." 
- ] - } - ], - "source": [ - "# Append the result to the results DataFrame\n", - "rm = pd.concat([rm, pd.DataFrame({\"Metric\": ['tstr_logreg'], \"Value\": [evaluate.run_metric('tstr_logreg').evaluate(X_synthetic_ganblr, y_synthetic_ganblr, X, y)]})], ignore_index=True,)\n", - "\n", - "# Append the result to the results DataFrame\n", - "rg = pd.concat([rm, pd.DataFrame({\"Metric\": ['tstr_mlp'], \"Value\": [evaluate.run_metric('tstr_mlp').evaluate(X_synthetic_ganblr, y_synthetic_ganblr, X, y)]})], ignore_index=True,)\n", - "\n", - "# Append the result to the results DataFrame\n", - "rg = pd.concat([rm, pd.DataFrame({\"Metric\": ['tstr_rf'], \"Value\": [evaluate.run_metric('tstr_rf').evaluate(X_synthetic_ganblr, y_synthetic_ganblr, X, y)]})], ignore_index=True,)\n", - "\n", - "# Append the result to the results DataFrame\n", - "rg = pd.concat([rm, pd.DataFrame({\"Metric\": ['q_score'], \"Value\": [evaluate.run_metric('q_score').evaluate(X_synthetic_ganblr, y_synthetic_ganblr, X, y)]})], ignore_index=True,)\n" - ] - }, - { - "cell_type": "code", - "execution_count": 6, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "--------------------------\n", - "module name: katabatic.katabatic\n", - "parent process: 69249\n", - "process id: 12986\n", - "katabatic.models.ganblr.ganblr_adapter\n", - "katabatic.models.ganblr.ganblr_adapter\n", - "\n", - "--------------------------\n", - "module name: katabatic.katabatic\n", - "parent process: 69249\n", - "process id: 12986\n", - "katabatic.models.ctgan.ctgan_adapter\n", - "katabatic.models.ctgan.ctgan_adapter\n", - "\n", - " parents has_nurs form children housing finance social \\\n", - "0 usual proper complete 1 convenient convenient nonprob \n", - "1 usual proper complete 1 convenient convenient nonprob \n", - "2 usual proper complete 1 convenient convenient nonprob \n", - "3 usual proper complete 1 convenient convenient slightly_prob \n", - "4 usual proper complete 1 convenient 
convenient slightly_prob \n", - "\n", - " health class \n", - "0 recommended recommend \n", - "1 priority priority \n", - "2 not_recom not_recom \n", - "3 recommended recommend \n", - "4 priority priority \n" - ] - } - ], - "source": [ - "ganblr_model = Katabatic.run_model('ganblr')\n", - "ctgan_model = Katabatic.run_model('ctgan')\n", - "\n", - "# data = ganblr_model.load_data(\"/Users/abdullah/Documents/GitHub/Katabatic/katabatic/cities_demo.csv\")\n", - "# Local file path (assuming the file is named 'nursery.data')\n", - "file_path = \"/Users/abdullah/Documents/GitHub/Katabatic/katabatic/nursery/nursery.data\"\n", - "\n", - "# Column names as per the dataset's description\n", - "columns = [\n", - " \"parents\", \"has_nurs\", \"form\", \"children\", \"housing\",\n", - " \"finance\", \"social\", \"health\", \"class\"\n", - "]\n", - "\n", - "# Load the dataset into a pandas DataFrame\n", - "data = pd.read_csv(file_path, header=None, names=columns)\n", - "\n", - "# Display the first few rows of the dataset\n", - "print(data.head())" - ] - }, - { - "cell_type": "code", - "execution_count": 8, - "metadata": {}, - "outputs": [], - "source": [ - "X, y = data.iloc[:, :-1], data.iloc[:, -1]" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Load Model and Train it on Sample Training Data" - ] - }, - { - "cell_type": "code", - "execution_count": 9, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "[INFO] Initializing GANBLR Model\n", - "[INFO] Training GANBLR model\n", - "warmup run:\n", - "\u001b[1m203/203\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m1s\u001b[0m 828us/step - accuracy: 0.2709 - loss: 1.6433\n", - "Epoch 1/10: G_loss = 1.729641, G_accuracy = 0.542284, D_loss = 1.310758, D_accuracy = 0.501254\n", - "Epoch 2/10: G_loss = 1.736655, G_accuracy = 0.751389, D_loss = 0.801364, D_accuracy = 0.502652\n", - "Epoch 3/10: G_loss = 1.884983, G_accuracy = 0.867747, 
D_loss = 1.114256, D_accuracy = 0.502459\n", - "Epoch 4/10: G_loss = 1.461082, G_accuracy = 0.896451, D_loss = 0.913934, D_accuracy = 0.495853\n", - "Epoch 5/10: G_loss = 1.267442, G_accuracy = 0.902855, D_loss = 0.850864, D_accuracy = 0.491898\n", - "Epoch 6/10: G_loss = 1.272656, G_accuracy = 0.907330, D_loss = 0.838198, D_accuracy = 0.504340\n", - "Epoch 7/10: G_loss = 1.152733, G_accuracy = 0.910108, D_loss = 0.832409, D_accuracy = 0.494068\n", - "Epoch 8/10: G_loss = 1.156956, G_accuracy = 0.910417, D_loss = 0.844176, D_accuracy = 0.500096\n", - "Epoch 9/10: G_loss = 1.792929, G_accuracy = 0.912500, D_loss = 1.432614, D_accuracy = 0.496962\n", - "Epoch 10/10: G_loss = 1.353315, G_accuracy = 0.913194, D_loss = 1.097382, D_accuracy = 0.494454\n", - "[SUCCESS] Model training completed\n" - ] - } - ], - "source": [ - "ganblr_model.load_model()\n", - "\n", - "ganblr_model.fit(X,y)" - ] - }, - { - "cell_type": "code", - "execution_count": 10, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "Fitting the model\n" - ] - } - ], - "source": [ - "ctgan_model.load_model()\n", - "\n", - "ctgan_model.fit(X,y)" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Generate New Data from trained model" - ] - }, - { - "cell_type": "code", - "execution_count": 11, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "[INFO] Generating data using GANBLR model\n", - "[SUCCESS] Data generation completed\n" - ] - } - ], - "source": [ - "ganblr_synthetic = ganblr_model.generate(size=len(data))" - ] - }, - { - "cell_type": "code", - "execution_count": 12, - "metadata": {}, - "outputs": [], - "source": [ - "ctgan_synthetic = ctgan_model.generate(size=len(data))" - ] - }, - { - "cell_type": "code", - "execution_count": 13, - "metadata": {}, - "outputs": [], - "source": [ - "# pd.DataFrame(ganblr_synthetic)" - ] - }, - { - "cell_type": "code", - "execution_count": 11, - 
"metadata": {}, - "outputs": [], - "source": [ - "from katabatic.utils import evaluate\n", - "import pandas as pd" - ] - }, - { - "cell_type": "code", - "execution_count": 15, - "metadata": {}, - "outputs": [], - "source": [ - "ganblr_synthetic = pd.DataFrame(ganblr_synthetic)\n", - "ctgan_synthetic = pd.DataFrame(ctgan_synthetic)" - ] - }, - { - "cell_type": "code", - "execution_count": 16, - "metadata": {}, - "outputs": [], - "source": [ - "X_synthetic_ganblr, y_synthetic_ganblr = ganblr_synthetic.iloc[:, :-1], ganblr_synthetic.iloc[:, -1]\n", - "X_synthetic_ctgan, y_synthetic_ctgan = ctgan_synthetic.iloc[:, :-1], ctgan_synthetic.iloc[:, -1]" - ] - }, - { - "cell_type": "code", - "execution_count": 17, - "metadata": {}, - "outputs": [], - "source": [ - "# Initialize the results DataFrame\n", - "rg = pd.DataFrame(columns=[\"Metric\", \"Value\"])\n", - "rc = pd.DataFrame(columns=[\"Metric\", \"Value\"])" - ] - }, - { - "cell_type": "code", - "execution_count": 18, - "metadata": {}, - "outputs": [], - "source": [ - "# Append the result to the results DataFrame\n", - "rg = pd.concat([rg, pd.DataFrame({\"Metric\": ['tstr_logreg'], \"Value\": [evaluate.run_metric('tstr_logreg').evaluate(X_synthetic_ganblr, y_synthetic_ganblr, X, y)]})], ignore_index=True,)\n", - "# Append the result to the results DataFrame\n", - "rc = pd.concat([rc, pd.DataFrame({\"Metric\": ['tstr_logreg'], \"Value\": [evaluate.run_metric('tstr_logreg').evaluate(X_synthetic_ctgan, y_synthetic_ctgan, X, y)]})],ignore_index=True,)\n", - "\n", - "# Append the result to the results DataFrame\n", - "rg = pd.concat([rg, pd.DataFrame({\"Metric\": ['tstr_mlp'], \"Value\": [evaluate.run_metric('tstr_mlp').evaluate(X_synthetic_ganblr, y_synthetic_ganblr, X, y)]})], ignore_index=True,)\n", - "# Append the result to the results DataFrame\n", - "rc = pd.concat([rc, pd.DataFrame({\"Metric\": ['tstr_mlp'], \"Value\": [evaluate.run_metric('tstr_mlp').evaluate(X_synthetic_ctgan, y_synthetic_ctgan, X, 
y)]})],ignore_index=True,)\n", - "\n", - "# Append the result to the results DataFrame\n", - "rg = pd.concat([rg, pd.DataFrame({\"Metric\": ['tstr_rf'], \"Value\": [evaluate.run_metric('tstr_rf').evaluate(X_synthetic_ganblr, y_synthetic_ganblr, X, y)]})], ignore_index=True,)\n", - "# Append the result to the results DataFrame\n", - "rc = pd.concat([rc, pd.DataFrame({\"Metric\": ['tstr_rf'], \"Value\": [evaluate.run_metric('tstr_rf').evaluate(X_synthetic_ctgan, y_synthetic_ctgan, X, y)]})],ignore_index=True,)\n", - "\n", - "# Append the result to the results DataFrame\n", - "rg = pd.concat([rg, pd.DataFrame({\"Metric\": ['q_score'], \"Value\": [evaluate.run_metric('q_score').evaluate(X_synthetic_ganblr, y_synthetic_ganblr, X, y)]})], ignore_index=True,)\n", - "# Append the result to the results DataFrame\n", - "rc = pd.concat([rc, pd.DataFrame({\"Metric\": ['q_score'], \"Value\": [evaluate.run_metric('q_score').evaluate(X_synthetic_ctgan, y_synthetic_ctgan, X, y)]})],ignore_index=True,)" - ] - }, - { - "cell_type": "code", - "execution_count": 19, - "metadata": {}, - "outputs": [], - "source": [ - "# result_ganblr = evaluate.evaluate_data(data, ganblr_synthetic, \"continuous\", dict_of_metrics=['tstr_logreg', 'tstr_mlp', 'tstr_rf', 'q_score'])# " - ] - }, - { - "cell_type": "code", - "execution_count": 20, - "metadata": {}, - "outputs": [], - "source": [ - "# result_ganblr" - ] - }, - { - "cell_type": "code", - "execution_count": 21, - "metadata": {}, - "outputs": [], - "source": [ - "# result_ctgan = evaluate.evaluate_data(data, ctgan_synthetic, \"continuous\", dict_of_metrics=['tstr_logreg', 'tstr_mlp', 'tstr_rf', 'q_score'])" - ] - }, - { - "cell_type": "code", - "execution_count": 22, - "metadata": {}, - "outputs": [], - "source": [ - "# result_ctgan" - ] - }, - { - "cell_type": "code", - "execution_count": 23, - "metadata": {}, - "outputs": [], - "source": [ - "# !pip install matplotlib" - ] - }, - { - "cell_type": "code", - "execution_count": 24, - "metadata": 
{}, - "outputs": [], - "source": [ - "# !pip install seaborn" - ] - }, - { - "cell_type": "code", - "execution_count": 25, - "metadata": {}, - "outputs": [], - "source": [ - "import matplotlib.pyplot as plt\n", - "import seaborn as sns\n", - "import pandas as pd" - ] - }, - { - "cell_type": "code", - "execution_count": 38, - "metadata": {}, - "outputs": [], - "source": [ - "def visualize_metrics(results_df, plot_type=\"bar\", title=\"Metric Evaluation\", figsize=(10, 6), **kwargs):\n", - " \"\"\"\n", - " Visualize the results from the metric evaluation.\n", - "\n", - " This function supports multiple types of plots, such as bar charts, box plots, and heatmaps.\n", - " It is designed to be flexible and customizable, allowing users to specify the type of plot\n", - " and various other options.\n", - "\n", - " Parameters:\n", - " - results_df (pd.DataFrame): DataFrame containing the metric names and their corresponding evaluation values.\n", - " - plot_type (str): Type of plot to generate. Options are \"bar\", \"box\", \"heatmap\". Default is \"bar\".\n", - " - title (str): The title of the plot. Default is \"Metric Evaluation\".\n", - " - figsize (tuple): Size of the figure. 
Default is (10, 6).\n", - " - **kwargs: Additional keyword arguments passed to the underlying plotting functions.\n", - "\n", - " Returns:\n", - " - None: Displays the plot.\n", - " \"\"\"\n", - "\n", - " if plot_type == \"bar\":\n", - " plt.figure(figsize=figsize)\n", - " ax = sns.barplot(x=\"Metric\", y=\"Value\", data=results_df, **kwargs)\n", - " plt.title(title)\n", - " plt.xticks(rotation=45)\n", - " plt.tight_layout()\n", - " \n", - " # Adding value labels on top of the bars\n", - " for p in ax.patches:\n", - " ax.annotate(f'{p.get_height():.2f}', (p.get_x() + p.get_width() / 2., p.get_height()),\n", - " ha='center', va='center', xytext=(0, 10), textcoords='offset points')\n", - " \n", - " plt.show()\n", - "\n", - " elif plot_type == \"box\":\n", - " plt.figure(figsize=figsize)\n", - " sns.boxplot(x=\"Metric\", y=\"Value\", data=results_df, **kwargs)\n", - " plt.title(title)\n", - " plt.xticks(rotation=45)\n", - " plt.tight_layout()\n", - " plt.show()\n", - "\n", - " elif plot_type == \"heatmap\":\n", - " # Assuming the DataFrame is structured in a way that allows a pivot for a heatmap\n", - " if results_df.shape[1] > 2: # Heatmap requires more than two columns to pivot\n", - " heatmap_data = results_df.pivot(index=results_df.columns[0], columns=results_df.columns[1], values=results_df.columns[2])\n", - " plt.figure(figsize=figsize)\n", - " sns.heatmap(heatmap_data, annot=True, fmt=\".2f\", cmap=\"coolwarm\", **kwargs)\n", - " plt.title(title)\n", - " plt.tight_layout()\n", - " plt.show()\n", - " else:\n", - " raise ValueError(\"Heatmap plot requires a DataFrame with at least 3 columns.\")\n", - " \n", - " else:\n", - " raise ValueError(f\"Unsupported plot_type '{plot_type}'. 
Supported types are 'bar', 'box', and 'heatmap'.\")" - ] - }, - { - "cell_type": "code", - "execution_count": 39, - "metadata": {}, - "outputs": [ - { - "data": { - "image/png": "iVBORw0KGgo
pqq4OBgVa1aVUWLFlXdunW1dOlSSdKxY8f0zDPPSJJKlSqVoYW8ZcuW6t+/vwYOHCh3d3f5+flJynx7+alTp9SlSxeVLl1axYoVU7169fTLL7+YfPQAANyfHGxdAAAAyDuvv/665s2bp27dukmSwsLCFBgYqK1bt1qXCQ4O1sKFCzVr1izVqFFDP/zwg1577TWVLVtWTZs21bJly9ShQwcdOnRIrq6uKlq0qHXdr776Sn369NFPP/2U5f6vXLmiFi1ayNPTU6tWrVL58uUVFRWl1NRUU48bAID7FaEbAIBC5LXXXtOwYcN0/PhxSdJPP/2k8PBwa+hOSkrSxx9/rE2bNqlRo0aSpIcffljbt29XaGioWrRoodKlS0uSypUrp5IlS2bYfo0aNTRp0qRs97948WKdP39ev/76q3U71atXz+OjBACg4CB0AwBQiJQtW1atW7fW/PnzZRiGWrduLXd3d+v8v/76S1evXlWrVq0yrJecnJzhFvTs+Pj43Hb+nj179OSTT1oDNwAADzpCNwAAhczrr7+u/v37S5JmzJiRYd6VK1ckSWvWrJGnp2eGeTnpDK1YsWK3nZ/+VnQAAEDoBgCg0Hn++eeVnJwsi8Vi7ewsTa1ateTk5KQTJ06oRYsWWa7v6OgoSUpJScn1vuvUqaM5c+bowoULtHYDACB6LwcAoNCxt7fXgQMHtH//ftnb22eYV6JECQ0aNEjvvvuuvvrqKx05ckRRUVGaPn26vvrqK0lS5cqVZbFYtHr1ap0/f97aOp4TXbp0Ufny5dW+fXv99NNP+vvvv7Vs2TJFRkbm6TECAFBQELoBACiEXF1d5erqmuW8cePGacSIEQoODtZjjz2m559/XmvWrFHVqlUlSZ6enhozZoyGDh0qDw8P663qOeHo6KgNGzaoXLlyevHFF1W7dm1NmDAhU/gHAOBBYTEMw7B1EQAAAAAAFEa0dAMAAAAAYBJCNwAAAAAAJiF0AwAAAABgEkI3AAAAAAAmIXQDAAAAAGASQjcAAAAAACYhdAMAAAAAYBJCNwAAAAAAJiF0AwAAAABgEkI3AAAAAAAmIXQDAAAAAGASQjcAAAAAACb5f0H73p4wV0+yAAAAAElFTkSuQmCC", - "text/plain": [ - "
" - ] - }, - "metadata": {}, - "output_type": "display_data" - } - ], - "source": [ - "visualize_metrics(rg, title=\"GANBLR Nursery Metric Evaluation\")" - ] - }, - { - "cell_type": "code", - "execution_count": 40, - "metadata": {}, - "outputs": [ - { - "data": { - "image/png": "iVBORw0KGgoAAAANSUhEUgAAA90AAAJOCAYAAACqS2TfAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjkuMiwgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy8hTgPZAAAACXBIWXMAAA9hAAAPYQGoP6dpAABj2ElEQVR4nO3deVgV5f//8dcBZFEBNRSXUNyytJSE3MqlTxiVlqblLopL5VIqLmm5+1VcynDLNbdcoFyyT5obgmVhpGRqmZm55AJKKgoIKMzvj36ejwiYC8NRfD6u61x1Zu6Zec+BkXmde+Yei2EYhgAAAAAAQJ6zs3UBAAAAAAAUVIRuAAAAAABMQugGAAAAAMAkhG4AAAAAAExC6AYAAAAAwCSEbgAAAAAATELoBgAAAADAJIRuAAAAAABMQugGAAAAAMAkhG4AAHBPa9KkiZo0aWLrMu7K4sWLZbFYdPToUZtsv2vXrvL29rbJtgHgQUfoBoAH1OHDh/Xmm2+qUqVKcnZ2lpubm55++mlNmzZNly9f1ujRo2WxWP71dWMY+vbbb9WmTRuVK1dOjo6Ocnd3V926dTV27FjFx8fnWk+dOnVksVg0e/bsHOdfCy3Ozs46efJktvlNmjTR448//q/73bVrV1ksFtWsWVOGYWSbb7FY1Ldv339dT0HUpEkTWSwWVa1aNcf5W7Zssf7cV61addvrP3XqlEaPHq09e/bcZaV37to+5vR69NFHbVZXXrgXPl8AQHYOti4AAJD/1q9fr9dff11OTk4KDAzU448/rvT0dO3YsUODBw/WL7/8or59+6pKlSrWZZKSktSrVy+9+uqratWqlXW6p6en9f9HjhypcePGqVKlSuratasqVaqk1NRU7d69Wx9++KGWLFmiw4cPZ6vn0KFD+vHHH+Xt7a3ly5erV69eudaelpamiRMnasaMGXf1Gezbt09r1qxR69at72o9BY2zs7P++OMPxcTEqE6dOlnmLV++XM7OzkpNTb2jdZ86dUpjxoyRt7e3fHx8bnm5zZs339H2cvPwww8rJCQk23R3d/c83U5+u9nnO3/+fGVmZtqmMAB4wBG6AeABc+TIEbVr104VKlTQtm3bVKZMGeu8Pn366I8//tD69etVs2ZN1axZ0zovISFBvXr1Us2aNdWpU6ds6w0PD9e4cePUpk0bffrpp3J0dMwy/6OPPtJHH32UY03Lli1TqVKl9OGHH+q1117T0aNHc70U1sfHR/Pnz9ewYcNUtmzZO/gEJBcXF3l5eWns2LFq1aqVLBbLHa3nVqSkpKhw4cKmrV+SDMNQamqqXFxc7npdlStX1tWrV7Vy5cosoTs1NVVr165Vs2bNtHr16rvezq249tnd+Lt0t9zd3XP8HS7IChUqZOsSAOCBxeXlAPCAmTx5spKSkvTJJ59kCdzXVKlSRf369bvt9Y4cOVIeHh765JNPcgxJ7u7uGj16dI7LrlixQq+99pqaN28ud3d3rVixItftvPfee8rIyNDEiRNvu8Zr7OzsNHz4cO3du1dr1669advc7sWNioqSxWJRVFSUddq1S9x3796tRo0aqXDhwnrvvfckSbt27VJAQIA8PDzk4uKiihUrqlu3blnWmZmZqdDQUNWoUUPOzs7y9PTUm2++qfPnz2dp5+3trebNm2vTpk3y8/OTi4uL5s6dq8aNG6tWrVo57ke1atUUEBBwS59P+/btFR4e
nqVn9L///a9SUlLUpk2bHJc5efKkunXrJk9PTzk5OalGjRpauHBhls/rqaeekiQFBQVZL+levHjxv352Od3TnZqaqtGjR+uRRx6Rs7OzypQpo1atWuV4JcXtWrVqlSwWi7Zv355t3ty5c2WxWLR//35J0t69e61XdTg7O6t06dLq1q2b/v7773/djsViyfGY8Pb2VteuXa3vz507p0GDBumJJ55Q0aJF5ebmphdffFE///yztc2/fb453dOdnJysgQMHysvLS05OTqpWrZo++OCDbLddXLvl4osvvtDjjz9u/flu3LjxX/cRAEDoBoAHzn//+19VqlRJDRo0yLN1/v777/r999/VsmVLFS1a9LaW/eGHH/THH3+offv2cnR0VKtWrbR8+fJc21esWFGBgYGaP3++Tp06dcc1d+jQQVWrVtXYsWNzvLf7Tv3999968cUX5ePjo9DQUD377LM6c+aMnn/+eR09elRDhw7VjBkz1LFjR+3cuTPLsm+++aYGDx5svbc+KChIy5cvV0BAgK5cuZKl7cGDB9W+fXs1bdpU06ZNk4+Pjzp37qy9e/daA+E1P/74o37//fdb7t3t0KGDTp8+neULhRUrVui5555TqVKlsrWPj49XvXr1tHXrVvXt21fTpk1TlSpV1L17d4WGhkqSHnvsMY0dO1aS9MYbb+jTTz/Vp59+qkaNGt30s8tJRkaGmjdvrjFjxsjX11cffvih+vXrp8TExGz7ntvyCQkJ2V7JycmSpGbNmqlo0aL67LPPsi0bHh6uGjVqWMcP2LJli/78808FBQVpxowZateuncLCwvTSSy/l2e/Vn3/+qS+++ELNmzfX1KlTNXjwYO3bt0+NGze2HgO38vlezzAMvfLKK/roo4/0wgsvaOrUqapWrZoGDx6s4ODgbO137Nih3r17q127dpo8ebJSU1PVunXrW/pyAQAeeAYA4IGRmJhoSDJatGhx28uePXvWkGSMGjUq27x169YZkozQ0NAs0zMzM42zZ89meV25ciVLm759+xpeXl5GZmamYRiGsXnzZkOS8dNPP2Vpt2jRIkOS8eOPPxqHDx82HBwcjHfeecc6v3HjxkaNGjX+dT+6dOliFClSxDAMw1iyZIkhyVizZo11viSjT58+2bZ75MiRLOuJjIw0JBmRkZFZapBkzJkzJ0vbtWvXWmvPzbfffmtIMpYvX55l+saNG7NNr1ChgiHJ2LhxY5a2Fy5cMJydnY133303y/R33nnHKFKkiJGUlJTr9q/Vf+0z9PPzM7p3724YhmGcP3/ecHR0NJYsWWLd788//9y6XPfu3Y0yZcoYCQkJWdbXrl07w93d3UhJSTEMwzB+/PFHQ5KxaNGiHLed02d3bV7jxo2t7xcuXGhIMqZOnZqt7bXfo5vto6QcX2+++aa1Xfv27Y1SpUoZV69etU47ffq0YWdnZ4wdO9Y67dq+XW/lypWGJOObb76xTsvp9yi346lChQpGly5drO9TU1ONjIyMLG2OHDliODk5ZanlZp9vly5djAoVKljff/HFF4Yk4//+7/+ytHvttdcMi8Vi/PHHH1nqdHR0zDLt559/NiQZM2bMyLYtAEBW9HQDwAPk4sWLkiRXV1dT1ntjL3diYqJKliyZ5XX9yMpXr15VeHi42rZta72v+j//+Y9KlSp1097uSpUqqXPnzpo3b55Onz59x3V37Ngxz3u7nZycFBQUlGVasWLFJElfffVVth7raz7//HO5u7uradOmWXpffX19VbRoUUVGRmZpX7FixWyXi7u7u6tFixZauXKldX8yMjIUHh6uli1bqkiRIre8Hx06dNCaNWuUnp6uVatWyd7eXq+++mq2doZhaPXq1Xr55ZdlGEaW2gMCApSYmKjY2Nhb2mZOn11OVq9eLQ8PD7399tvZ5t3K/fne3t7asmVLtlf//v2tbdq2baszZ85k6e1ftWqVMjMz1bZtW+u06++jT01NVUJCgurVqydJt7zf/8bJ
yUl2dv+csmVkZOjvv/9W0aJFVa1atTvexoYNG2Rvb6933nkny/SBAwfKMAx9/fXXWab7+/urcuXK1vc1a9aUm5ub/vzzzzvaPgA8SAjdAPAAcXNzkyRdunQpT9d7LcQnJSVlmV60aFFroBk8eHC25TZv3qyzZ8+qTp06+uOPP/THH3/oyJEjevbZZ7Vy5cqbjrY8fPhwXb169a7u7ba3t9fw4cO1Z88effHFF3e8nutde1Ta9Ro3bqzWrVtrzJgx8vDwUIsWLbRo0SKlpaVZ2xw6dEiJiYkqVapUti8qkpKSdObMmSzrrFixYo7bDwwM1PHjx/Xtt99KkrZu3ar4+Hh17tz5tvajXbt2SkxM1Ndff63ly5erefPmOX5Zc/bsWV24cEHz5s3LVve1AH1j7bnJ6bPLyeHDh1WtWjU5ONzZeLBFihSRv79/ttf1jwx74YUX5O7urvDwcOu08PBw+fj46JFHHrFOO3funPr16ydPT0+5uLioZMmS1p9NYmLiHdV3o8zMTH300UeqWrWqnJyc5OHhoZIlS2rv3r13vI1jx46pbNmy2X6mjz32mHX+9cqXL59tHcWLF8823gAAIDtGLweAB4ibm5vKli17S/e93o5rYeXG9To4OMjf31+SdOLEiWzLXevNzm1wru3bt+d6X2+lSpXUqVMnzZs3T0OHDr3j2jt27Khx48Zp7NixatmyZbb5ufWcZmRk5Dg9pxHErz3XeufOnfrvf/+rTZs2qVu3bvrwww+1c+dOFS1aVJmZmTft4S9ZsuS/bkeSAgIC5OnpqWXLlqlRo0ZatmyZSpcubf053KoyZcqoSZMm+vDDD/Xdd9/lOmL5tS9GOnXqpC5duuTY5vpR8G8mL0ZfzytOTk5q2bKl1q5dq48//ljx8fH67rvvNGHChCzt2rRpo++//16DBw+Wj4+P9Wf5wgsv3PEjum783ZowYYJGjBihbt26ady4cSpRooTs7OzUv3//fHsMmL29fY7T8+oKEQAoyAjdAPCAad68uebNm6fo6GjVr18/T9ZZrVo1Va1aVV988YVCQ0Nv6TLm5ORkrVu3Tm3bttVrr72Wbf4777yj5cuX5xq6pX96u5ctW6ZJkybdce3Xeru7du2qdevWZZtfvHhxSdKFCxeyTL+xJ/BW1KtXT/Xq1dP48eO1YsUKdezYUWFhYerRo4cqV66srVu36umnn76r8Glvb68OHTpo8eLFmjRpkr744gv17Nkz19B0Mx06dFCPHj1UrFgxvfTSSzm2KVmypFxdXZWRkfGvwT6vHs1WuXJl/fDDD7py5Yqpj8Jq27atlixZooiICB04cECGYWS5tPz8+fOKiIjQmDFjNHLkSOv0Q4cO3dL6ixcvnu33Kj09PdstE6tWrdKzzz6rTz75JMv0CxcuyMPDw/r+dj7fChUqaOvWrbp06VKW3u7ffvvNOh8AkDe4vBwAHjBDhgxRkSJF1KNHD8XHx2ebf/jwYU2bNu221zt69GglJCSoZ8+eOd63fGOP2Nq1a5WcnKw+ffrotddey/Zq3ry5Vq9eneUS7BtVrlxZnTp10ty5cxUXF3fbNV/TqVMnValSRWPGjMlxG5L0zTffWKdlZGRo3rx5t7z+8+fPZ9t/Hx8fSbLuX5s2bZSRkaFx48ZlW/7q1avZwtnNdO7cWefPn9ebb76ppKSkO34m9WuvvaZRo0bp448/zvWyb3t7e7Vu3VqrV6/O8QqKs2fPWv//2pcxt7MvOWndurUSEhI0c+bMbPPysufV399fJUqUUHh4uMLDw1WnTp0sl/Vf+yLjxm1eG7H931SuXDnL75UkzZs3L1tPt729fbZtfP755zp58mSWabfz+b700kvKyMjI9hl+9NFHslgsevHFF29pHwAA/46ebgB4wFSuXFkrVqxQ27Zt9dhjjykwMFCPP/640tPT9f333+vzzz/P8ozgW9WhQwft379fISEhiomJUbt27VSxYkUlJydr//79WrlypVxdXa09
x8uXL9dDDz2U66PLXnnlFc2fP1/r169Xq1atct3u+++/r08//VQHDx5UjRo1brtu6Z9Q8/777+c4iFeNGjVUr149DRs2TOfOnVOJEiUUFhamq1ev3vL6lyxZoo8//livvvqqKleurEuXLmn+/Plyc3Oz9iA3btxYb775pkJCQrRnzx49//zzKlSokA4dOqTPP/9c06ZNy/GKgJw8+eSTevzxx/X555/rscceU+3atW+51uvd7Nnq15s4caIiIyNVt25d9ezZU9WrV9e5c+cUGxurrVu36ty5c5L++d0rVqyY5syZI1dXVxUpUkR169bN9f703AQGBmrp0qUKDg5WTEyMGjZsqOTkZG3dulW9e/dWixYtbrp8YmKili1bluO867+gKFSokFq1aqWwsDAlJyfrgw8+yNLWzc1NjRo10uTJk3XlyhWVK1dOmzdv1pEjR25pP3r06KG33npLrVu3VtOmTfXzzz9r06ZNWXqvpX+uThk7dqyCgoLUoEED7du3T8uXL1elSpWytLudz/fll1/Ws88+q/fff19Hjx5VrVq1tHnzZq1bt079+/fPMmgaAOAu2WbQdACArf3+++9Gz549DW9vb8PR0dFwdXU1nn76aWPGjBlGampqtvY3e2TY9aKioozXXnvNKFOmjFGoUCHDzc3N8PPzM0aNGmWcPn3aMAzDiI+PNxwcHIzOnTvnup6UlBSjcOHCxquvvmoYRtZHht2oS5cuhqTbfmTY9a5cuWJUrlw52yPDDMMwDh8+bPj7+xtOTk6Gp6en8d577xlbtmzJ8ZFhOdUQGxtrtG/f3ihfvrzh5ORklCpVymjevLmxa9eubG3nzZtn+Pr6Gi4uLoarq6vxxBNPGEOGDDFOnTplbVOhQgWjWbNmN93PyZMnG5KMCRMm/NtH8q/1Xy+nR4YZxj8/0z59+hheXl5GoUKFjNKlSxvPPfecMW/evCzt1q1bZ1SvXt1wcHDI8nirm237xkeGGcY/vx/vv/++UbFiRev2XnvtNePw4cP/uo/K5ZFhOZ0WXfs5WywW46+//so2/8SJE8arr75qFCtWzHB3dzdef/1149SpU9mOlZweGZaRkWG8++67hoeHh1G4cGEjICDA+OOPP3J8ZNjAgQONMmXKGC4uLsbTTz9tREdH5/i55Pb53vjIMMMwjEuXLhkDBgwwypYtaxQqVMioWrWqMWXKlGyPXcvpmDCM7I82AwDkzGIYjIABAEBBM23aNA0YMEBHjx7NceRpAACQPwjdAAAUMIZhqFatWnrooYeyPd8bAADkL+7pBgCggEhOTtaXX36pyMhI7du3L8fR2AEAQP6ipxsAgALi6NGjqlixoooVK6bevXtr/Pjxti4JAIAHHqEbAAAAAACT8JxuAAAAAABMQugGAAAAAMAkD9xAapmZmTp16pRcXV1lsVhsXQ4AAAAA4D5kGIYuXbqksmXLys4u9/7sBy50nzp1Sl5eXrYuAwAAAABQAPz11196+OGHc53/wIVuV1dXSf98MG5ubjauBgAAAABwP7p48aK8vLysGTM3D1zovnZJuZubG6EbAAAAwF2bNWuWpkyZori4ONWqVUszZsxQnTp1cmy7ePFiBQUFZZnm5OSk1NRU6/uuXbtqyZIlWdoEBARo48aNeV887tq/3bb8wIVuAAAAAMgr4eHhCg4O1pw5c1S3bl2FhoYqICBABw8eVKlSpXJcxs3NTQcPHrS+zym0vfDCC1q0aJH1vZOTU94Xj3zB6OUAAAAAcIemTp2qnj17KigoSNWrV9ecOXNUuHBhLVy4MNdlLBaLSpcubX15enpma+Pk5JSlTfHixc3cDZiI0A0AAAAAdyA9PV27d++Wv7+/dZqdnZ38/f0VHR2d63JJSUmqUKGCvLy81KJFC/3yyy/Z2kRFRalUqVKqVq2aevXqpb///tuUfYD5CN0AAAAAcAcSEhKUkZGRrafa09NTcXFxOS5TrVo1LVy4UOvWrdOyZcuUmZmpBg0a
6MSJE9Y2L7zwgpYuXaqIiAhNmjRJ27dv14svvqiMjAxT9wfm4J5uAAAAAMgn9evXV/369a3vGzRooMcee0xz587VuHHjJEnt2rWzzn/iiSdUs2ZNVa5cWVFRUXruuefyvWbcHXq6AQAAAOAOeHh4yN7eXvHx8Vmmx8fHq3Tp0re0jkKFCunJJ5/UH3/8kWubSpUqycPD46ZtcO8idAMAAADAHXB0dJSvr68iIiKs0zIzMxUREZGlN/tmMjIytG/fPpUpUybXNidOnNDff/990za4dxG6AQAAAOAOBQcHa/78+VqyZIkOHDigXr16KTk52fos7sDAQA0bNszafuzYsdq8ebP+/PNPxcbGqlOnTjp27Jh69Ogh6Z9B1gYPHqydO3fq6NGjioiIUIsWLVSlShUFBATYZB9xd7inGwAAAADuUNu2bXX27FmNHDlScXFx8vHx0caNG62Dqx0/flx2dv/r6zx//rx69uypuLg4FS9eXL6+vvr+++9VvXp1SZK9vb327t2rJUuW6MKFCypbtqyef/55jRs3jmd136cshmEYti4iP128eFHu7u5KTEyUm5ubrcsBAAAAANyHbjVbcnk5AAAAAAAmIXQDAAAAAGASQjcAAAAAACYhdAMAAAAAYBJCNwAAAAAAJiF0AwAAAABgEkI3AAAAAAAmIXQDAAAAAGASQjcAAAAAACZxsHUBAAAAAGzLd/BSW5cAmGb3lECbbp+ebgAAAAAATELoBgAAAADAJIRuAAAAAABMQugGAAAAAMAkhG4AAAAAAExC6AYAAAAAwCSEbgAAAAAATELoBgAAAADAJIRuAAAAAABMQugGAAAAAMAkhG4AAAAAAExC6AYAAAAAwCSEbgAAAAAATELoBgAAAADAJIRuAAAAAABMQugGAAAAAMAkhG4AAAAAAExC6AYAAAAAwCSEbgAAAAAATELoBgAAAADAJIRuAAAAAABMQugGAAAAAMAkhG4AAAAAAExC6AYAAAAAwCSEbgAAAAAATELoBgAAAADAJIRuAAAAAABMQugGAAAAAMAkhG4AAAAAAExC6AYAAAAAwCSEbgAAAAAATELoBgAAAADAJIRuAAAAAABMQugGAAAAAMAkhG4AAAAAAExC6AYAAAAAwCT3ROieNWuWvL295ezsrLp16yomJibXtosXL5bFYsnycnZ2zsdqAQAAAAC4NTYP3eHh4QoODtaoUaMUGxurWrVqKSAgQGfOnMl1GTc3N50+fdr6OnbsWD5WDAAAAADArbF56J46dap69uypoKAgVa9eXXPmzFHhwoW1cOHCXJexWCwqXbq09eXp6ZmPFQMAAAAAcGtsGrrT09O1e/du+fv7W6fZ2dnJ399f0dHRuS6XlJSkChUqyMvLSy1atNAvv/ySa9u0tDRdvHgxywsAAAAAgPxg09CdkJCgjIyMbD3Vnp6eiouLy3GZatWqaeHChVq3bp2WLVumzMxMNWjQQCdOnMixfUhIiNzd3a0vLy+vPN8PAAAAAAByYvPLy29X/fr1FRgYKB8fHzVu3Fhr1qxRyZIlNXfu3BzbDxs2TImJidbXX3/9lc8VAwAAAAAeVA623LiHh4fs7e0VHx+fZXp8fLxKly59S+soVKiQnnzySf3xxx85zndycpKTk9Nd1woAAAAAwO2yaU+3o6OjfH19FRERYZ2WmZmpiIgI1a9f/5bWkZGRoX379qlMmTJmlQkAAAAAwB2xaU+3JAUHB6tLly7y8/NTnTp1FBoaquTkZAUFBUmSAgMDVa5cOYWEhEiSxo4dq3r16qlKlSq6cOGCpkyZomPHjqlHjx623A0AAAAAALKxeehu27atzp49q5EjRyouLk4+Pj7auHGjdXC148ePy87ufx3y58+fV8+ePRUXF6fixYvL19dX33//vapXr26rXQAAAAAAIEcWwzAMWxeRny5evCh3d3clJibKzc3N1uUAAAAANuc7eKmtSwBMs3tKoCnrvdVsed+NXg4AAAAAwP2C
0A0AAAAAgEkI3QAAAAAAmITQDQAAAACASQjdAAAAAACYhNANAAAAAIBJCN0AAAAAAJiE0A0AAAAAgEkI3QAAAAAAmITQDQAAAACASQjdAAAAAACYhNANAAAAAIBJCN0AAAAAAJiE0A0AAAAAgEkI3QAAAAAAmITQDQAAAACASQjdAAAAAACYhNANAAAAAIBJCN0AAAAAAJiE0A0AAAAAgEkI3QAAAAAAmITQDQAAAACASQjdAAAAAACYhNANAAAAAIBJCN0AAAAAAJiE0A0AAAAAgEkI3QAAAAAAmITQDQAAAACASQjdAAAAAACYhNANAAAAAIBJCN0AAAAAAJiE0A0AAAAAgEkI3QAAAAAAmITQDQAAAACASQjdAAAAAACYhNANAAAAAIBJCN0AAAAAAJiE0A0AAAAAgEkI3QAAAAAAmITQDQAAAACASQjdAAAAAACYhNANAAAAAIBJCN0AAAAAAJiE0A0AAAAAgEkI3QAAAAAAmITQDQAAAACASQjdAAAAAACYhNANAAAAAIBJCN0AAAAAAJiE0A0AAAAAgEkI3QAAAAAAmITQDQAAAACASQjdAAAAAACYhNANAAAAAIBJCN0AAAAAAJiE0A0AAAAAgEkI3QAAAAAAmITQDQAAAACASQjdAAAAAACYhNANAAAAAIBJCN0AAAAAAJiE0A0AAAAAgEkI3QAAAAAAmITQDQAAAACASQjdAAAAAACYhNANAAAAAIBJCN0AAAAAAJjkngjds2bNkre3t5ydnVW3bl3FxMTc0nJhYWGyWCxq2bKluQUCAAAAAHAHbB66w8PDFRwcrFGjRik2Nla1atVSQECAzpw5c9Pljh49qkGDBqlhw4b5VCkAAAAAALfH5qF76tSp6tmzp4KCglS9enXNmTNHhQsX1sKFC3NdJiMjQx07dtSYMWNUqVKlfKwWAAAAAIBbZ9PQnZ6ert27d8vf3986zc7OTv7+/oqOjs51ubFjx6pUqVLq3r17fpQJAAAAAMAdcbDlxhMSEpSRkSFPT88s0z09PfXbb7/luMyOHTv0ySefaM+ePbe0jbS0NKWlpVnfX7x48Y7rBQAAAADgdtj88vLbcenSJXXu3Fnz58+Xh4fHLS0TEhIid3d368vLy8vkKgEAAAAA+IdNe7o9PDxkb2+v+Pj4LNPj4+NVunTpbO0PHz6so0eP6uWXX7ZOy8zMlCQ5ODjo4MGDqly5cpZlhg0bpuDgYOv7ixcvErwBAAAAAPnCpqHb0dFRvr6+ioiIsD72KzMzUxEREerbt2+29o8++qj27duXZdrw4cN16dIlTZs2Lccw7eTkJCcnJ1PqBwAAAADgZmwauiUpODhYXbp0kZ+fn+rUqaPQ0FAlJycrKChIkhQYGKhy5copJCREzs7Oevzxx7MsX6xYMUnKNh0AAAAAAFuzeehu27atzp49q5EjRyouLk4+Pj7auHGjdXC148ePy87uvrr1HAAAAAAASZLFMAzD1kXkp4sXL8rd3V2JiYlyc3OzdTkAAACAzfkOXmrrEgDT7J4SaMp6bzVb0oUMAAAAAIBJCN0AAAAAAJiE0A0AAAAAgEkI3QAAAAAAmITQDQAAAACASQjdAAAAAACYhNANAAAAAIBJCN0AAAAAAJiE0A0AAAAAgEkI3QAAAAAAmITQDQAAAACASQjdAAAAAACYhNANAAAAAIBJCN0AAAAAAJiE0A0AAAAAgEkI3QAAAAAAmITQDQAAAACASQjdAAAAAACYhNANAAAAAIBJCN0AAAAAAJiE0A0AAAAAgEkI3QAAAAAAmITQDQAAAACASQjdAAAAAACYhNANAAAAAIBJCN0AAAAAAJiE0A0AAAAAgEkI3QAAAAAAmITQDQAAAACASQjdAAAAAACYhNANAAAAAIBJCN0AAAAAAJiE0A0AAAAAgEkI3QAAAAAAmITQDQAAAACASQjdAAAAAACYhNANAAAAAIBJCN0AAAAAAJiE0A0AAAAAgEkI3QAAAAAAmITQ
DQAAAACASQjdAAAAAACYhNANAAAAAIBJCN0AAAAAAJiE0A0AAAAAgEkI3QAAAAAAmITQDQAAAACASQjdAAAAAACYhNANAAAAAIBJCN0AAAAAAJiE0A0AAAAAgEkI3binzJo1S97e3nJ2dlbdunUVExOTa9s1a9bIz89PxYoVU5EiReTj46NPP/00SxvDMDRy5EiVKVNGLi4u8vf316FDh8zeDQAAAACQROjGPSQ8PFzBwcEaNWqUYmNjVatWLQUEBOjMmTM5ti9RooTef/99RUdHa+/evQoKClJQUJA2bdpkbTN58mRNnz5dc+bM0Q8//KAiRYooICBAqamp+bVbAAAAAB5gFsMwDFsXkZ8uXrwod3d3JSYmys3Nzdbl4Dp169bVU089pZkzZ0qSMjMz5eXlpbfffltDhw69pXXUrl1bzZo107hx42QYhsqWLauBAwdq0KBBkqTExER5enpq8eLFateunWn7AgAAcD/xHbzU1iUAptk9JdCU9d5qtqSnG/eE9PR07d69W/7+/tZpdnZ28vf3V3R09L8ubxiGIiIidPDgQTVq1EiSdOTIEcXFxWVZp7u7u+rWrXtL6wQAAACAu+Vg6wIASUpISFBGRoY8PT2zTPf09NRvv/2W63KJiYkqV66c0tLSZG9vr48//lhNmzaVJMXFxVnXceM6r80DAAAAADMRunFfc3V11Z49e5SUlKSIiAgFBwerUqVKatKkia1LAwAAAABCN+4NHh4esre3V3x8fJbp8fHxKl26dK7L2dnZqUqVKpIkHx8fHThwQCEhIWrSpIl1ufj4eJUpUybLOn18fPJ+JwAAAADgBtzTjXuCo6OjfH19FRERYZ2WmZmpiIgI1a9f/5bXk5mZqbS0NElSxYoVVbp06SzrvHjxon744YfbWicAAAAA3Cl6unHPCA4OVpcuXeTn56c6deooNDRUycnJCgoKkiQFBgaqXLlyCgkJkSSFhITIz89PlStXVlpamjZs2KBPP/1Us2fPliRZLBb1799f//d//6eqVauqYsWKGjFihMqWLauWLVvaajcBAAAAPEAI3bhntG3bVmfPntXIkSMVFxcnHx8fbdy40ToQ2vHjx2Vn97+LM5KTk9W7d2+dOHFCLi4uevTRR7Vs2TK1bdvW2mbIkCFKTk7WG2+8oQsXLuiZZ57Rxo0b5ezsnO/7BwAAAODBw3O6AQAA7iOzZs3SlClTFBcXp1q1amnGjBmqU6dOjm3nz5+vpUuXav/+/ZIkX19fTZgwIUv7+Ph4vfvuu9q8ebMuXLigRo0aacaMGapatWq+7A/uDTynGwUZz+kGAADALQkPD1dwcLBGjRql2NhY1apVSwEBATpz5kyO7aOiotS+fXtFRkYqOjpaXl5eev7553Xy5ElJkmEYatmypf7880+tW7dOP/30kypUqCB/f38lJyfn564BQIF1Rz3dV69eVVRUlA4fPqwOHTrI1dVVp06dkpubm4oWLWpGnXmGnm4AAHC/qlu3rp566inNnDlT0j8DiHp5eentt9/W0KFD/3X5jIwMFS9eXDNnzlRgYKB+//13VatWTfv371eNGjWs6yxdurQmTJigHj16mLo/uHfQ042C7L7r6T527JieeOIJtWjRQn369NHZs2clSZMmTdKgQYPuvGIAAADkKj09Xbt375a/v791mp2dnfz9/RUdHX1L60hJSdGVK1dUokQJSbI+8eP6sU7s7Ozk5OSkHTt25GH1APDguu3Q3a9fP/n5+en8+fNycXGxTn/11VezPJoJAAAAeSchIUEZGRnWAUav8fT0VFxc3C2t491331XZsmWtwf3RRx9V+fLlNWzYMJ0/f17p6emaNGmSTpw4odOnT+f5PgDAg+i2Ry//9ttv9f3338vR0THLdG9vb+v9QQAAALi3TJw4UWFhYYqKirL2bBcqVEhr1qxR9+7dVaJECdnb28vf318vvviiHrCxdgHANLcdujMzM5WRkZFt+okTJ+Tq6ponRQEA
ACArDw8P2dvbKz4+Psv0+Ph4lS5d+qbLfvDBB5o4caK2bt2qmjVrZpnn6+urPXv2KDExUenp6SpZsqTq1q0rPz+/PN8HAHgQ3fbl5c8//7xCQ0Ot7y0Wi5KSkjRq1Ci99NJLd1TErFmz5O3tLWdnZ9WtW1cxMTG5tl2zZo38/PxUrFgxFSlSRD4+Pvr000/vaLsAAAD3C0dHR/n6+ma5nS8zM1MRERGqX79+rstNnjxZ48aN08aNG28apN3d3VWyZEkdOnRIu3btUosWLfK0fgB4UN12T/eHH36ogIAAVa9eXampqerQoYMOHTokDw8PrVy58rYLuPboizlz5qhu3boKDQ1VQECADh48qFKlSmVrX6JECb3//vt69NFH5ejoqK+++kpBQUEqVaqUAgICbnv7AAAA94vg4GB16dJFfn5+qlOnjkJDQ5WcnKygoCBJUmBgoMqVK6eQkBBJ/wx0O3LkSK1YsULe3t7We7+LFi1qfeLM559/rpIlS6p8+fLat2+f+vXrp5YtW+r555+3zU4CQAFzx48MCwsL0969e5WUlKTatWurY8eOWQZWu1V3++gLSapdu7aaNWumcePG/WtbHhkGAADuZzNnztSUKVMUFxcnHx8fTZ8+XXXr1pUkNWnSRN7e3lq8eLGkf8bcOXbsWLZ1jBo1SqNHj5YkTZ8+XVOmTFF8fLzKlCmjwMBAjRgxItv4PSjYeGQYCjJbPzLsjkJ3XklPT1fhwoW1atUqtWzZ0jq9S5cuunDhgtatW3fT5Q3D0LZt2/TKK6/oiy++UNOmTbO1SUtLsz4OQ/rng/Hy8jItdPMPFgoys/7BAgAAtsU5LAoyW4fu2768fOnSmx+QgYG3vkM3e/TFb7/9lutyiYmJKleunNLS0mRvb6+PP/44x8AtSSEhIRozZswt1wQAAAAAQF657dDdr1+/LO+vXLmilJQUOTo6qnDhwrcVuu+Uq6ur9uzZo6SkJEVERCg4OFiVKlVSkyZNsrUdNmyYgoODre+v9XQDAAAAAGC22w7d58+fzzbt0KFD6tWrlwYPHnxb67rTR1/Y2dmpSpUqkiQfHx8dOHBAISEhOYZuJycnOTk53VZdAAAAAADkhdt+ZFhOqlatqokTJ2brBf83d/roixtlZmZmuW8bAAAAAIB7wW33dOe6IgcHnTp16raXu91HX4SEhMjPz0+VK1dWWlqaNmzYoE8//VSzZ8/Oq10BAAAAACBP3Hbo/vLLL7O8NwxDp0+f1syZM/X000/fdgFt27bV2bNnNXLkSOujLzZu3GgdXO348eOys/tfh3xycrJ69+6tEydOyMXFRY8++qiWLVumtm3b3va2AQAAAAAw020/Muz6ACxJFotFJUuW1H/+8x99+OGHKlOmTJ4WmNfMfk43j1tAQcYjwwAAKJg4h0VBdt89MiwzM/OuCgMAAAAA4EGRJwOpAQAAAACA7G6pp/v651z/m6lTp95xMQAA4MHApawoqLgVC8CNbil0//TTT7e0MovFclfFAAAAAABQkNxS6I6MjDS7DgAAAAAAChzu6QYAAAAAwCS3PXq5JO3atUufffaZjh8/rvT09Czz1qxZkyeFAQAAAABwv7vtnu6wsDA1aNBABw4c0Nq1a3XlyhX98ssv2rZtm9zd3c2oEQAAAACA+9Jth+4JEyboo48+0n//+185Ojpq2rRp+u2339SmTRuVL1/ejBoBAAAAALgv3XboPnz4sJo1ayZJcnR0VHJysiwWiwYMGKB58+bleYEAAAAAANyvbjt0Fy9eXJcuXZIklStXTvv375ckXbhwQSkpKXlbHQAAAAAA97FbDt3XwnWjRo20ZcsWSdLrr7+ufv36qWfPnmrfvr2ee+45c6oEAAAAAOA+dMujl9esWVNPPfWUWrZsqddff12S9P7776tQoUL6/vvv1bp1aw0fPty0QgEAAAAAuN/ccujevn27Fi1apJCQEI0fP16tW7dWjx49NHToUDPrAwAAAADgvnXL
l5c3bNhQCxcu1OnTpzVjxgwdPXpUjRs31iOPPKJJkyYpLi7OzDoBAAAAALjv3PZAakWKFFFQUJC2b9+u33//Xa+//rpmzZql8uXL65VXXjGjRgAAAAAA7ku3HbqvV6VKFb333nsaPny4XF1dtX79+ryqCwAAAACA+94t39N9o2+++UYLFy7U6tWrZWdnpzZt2qh79+55WRsAAAAAAPe12wrdp06d0uLFi7V48WL98ccfatCggaZPn642bdqoSJEiZtUIAAAAAMB96ZZD94svvqitW7fKw8NDgYGB6tatm6pVq2ZmbQAAAAAA3NduOXQXKlRIq1atUvPmzWVvb29mTQAAAAAAFAi3HLq//PJLM+sAAAAAAKDAuavRywEAAAAAQO4I3QAAAAAAmITQDQAAAACASQjdAAAAAACYhNANAAAAAIBJCN0AAAAAAJiE0A0AAAAAgEkI3QAAAAAAmITQDQAAAACASQjdAAAAAACYhNANAAAAAIBJCN0AAAAAAJiE0A0AyNWsWbPk7e0tZ2dn1a1bVzExMbm2nT9/vho2bKjixYurePHi8vf3z9L+ypUrevfdd/XEE0+oSJEiKlu2rAIDA3Xq1Kn82BUAAACbIHQDAHIUHh6u4OBgjRo1SrGxsapVq5YCAgJ05syZHNtHRUWpffv2ioyMVHR0tLy8vPT888/r5MmTkqSUlBTFxsZqxIgRio2N1Zo1a3Tw4EG98sor+blbAAAA+crB1gUAAO5NU6dOVc+ePRUUFCRJmjNnjtavX6+FCxdq6NCh2dovX748y/sFCxZo9erVioiIUGBgoNzd3bVly5YsbWbOnKk6dero+PHjKl++vHk7AwAAYCP0dAMAsklPT9fu3bvl7+9vnWZnZyd/f39FR0ff0jpSUlJ05coVlShRItc2iYmJslgsKlas2N2WDAAAcE8idAMAsklISFBGRoY8PT2zTPf09FRcXNwtrePdd99V2bJlswT366Wmpurdd99V+/bt5ebmdtc1AwAA3Iu4vBwAkOcmTpyosLAwRUVFydnZOdv8K1euqE2bNjIMQ7Nnz7ZBhQAAAPmD0A0AyMbDw0P29vaKj4/PMj0+Pl6lS5e+6bIffPCBJk6cqK1bt6pmzZrZ5l8L3MeOHdO2bdvo5QYAAAUal5cDALJxdHSUr6+vIiIirNMyMzMVERGh+vXr57rc5MmTNW7cOG3cuFF+fn7Z5l8L3IcOHdLWrVv10EMPmVI/AADAvYKebgBAjoKDg9WlSxf5+fmpTp06Cg0NVXJysnU088DAQJUrV04hISGSpEmTJmnkyJFasWKFvL29rfd+Fy1aVEWLFtWVK1f02muvKTY2Vl999ZUyMjKsbUqUKCFHR0fb7CgAAICJCN0AgBy1bdtWZ8+e1ciRIxUXFycfHx9t3LjROrja8ePHZWf3vwumZs+erfT0dL322mtZ1jNq1CiNHj1aJ0+e1JdffilJ8vHxydImMjJSTZo0MXV/AAAAbIHQDQDIVd++fdW3b98c50VFRWV5f/To0Zuuy9vbW4Zh5FFlAAAA9wfu6QYAAAAAwCSEbgAAAAAATELoBgAAAADAJIRuAAAAAABMQugGAAAAAMAkhG4AAAAAAExC6AYAAAAAwCSEbgAAAAAATELoBgAAAADAJA62LgAAzOY7eKmtSwBMs3tKoK1LAAAAN0FPNwAAAAAAJiF0AwAAAABgEkI3AAAAAAAmIXQDAAAAAGASQjcAAAAAACYhdAMAAAAAYBJCNwAAAAAAJiF0AwAAAABgEkI3AAAAAAAmIXQDAAAAAGASQjcAAAAAACYhdAMAAAAAYJJ7InTPmjVL3t7ecnZ2Vt26dRUTE5Nr2/nz56thw4YqXry4ihcvLn9//5u2BwAAAADAVmweusPDwxUcHKxRo0YpNjZWtWrVUkBAgM6cOZNj+6ioKLVv316RkZGKjo6Wl5eXnn/+eZ08eTKfKwcAAAAA4OZsHrqnTp2qnj17KigoSNWrV9ecOXNU
uHBhLVy4MMf2y5cvV+/eveXj46NHH31UCxYsUGZmpiIiIvK5cgAAAAAAbs6moTs9PV27d++Wv7+/dZqdnZ38/f0VHR19S+tISUnRlStXVKJECbPKBAAAAADgjjjYcuMJCQnKyMiQp6dnlumenp767bffbmkd7777rsqWLZsluF8vLS1NaWlp1vcXL16884IBAAAAALgNNr+8/G5MnDhRYWFhWrt2rZydnXNsExISInd3d+vLy8srn6sEAAAAADyobBq6PTw8ZG9vr/j4+CzT4+PjVbp06Zsu+8EHH2jixInavHmzatasmWu7YcOGKTEx0fr666+/8qR2AAAAAAD+jU1Dt6Ojo3x9fbMMgnZtULT69evnutzkyZM1btw4bdy4UX5+fjfdhpOTk9zc3LK8AAAAAADIDza9p1uSgoOD1aVLF/n5+alOnToKDQ1VcnKygoKCJEmBgYEqV66cQkJCJEmTJk3SyJEjtWLFCnl7eysuLk6SVLRoURUtWtRm+wEAAAAAwI1sHrrbtm2rs2fPauTIkYqLi5OPj482btxoHVzt+PHjsrP7X4f87NmzlZ6ertdeey3LekaNGqXRo0fnZ+kAAAAAANyUzUO3JPXt21d9+/bNcV5UVFSW90ePHjW/IAAAAAAA8sB9PXo5AAAAAAD3MkI3AAAAAAAmIXQDAAAAAGASQjcAAAAAACYhdAMAAAAAYBJCNwAAAAAAJiF0AwAAAABgEkI3AAAAAAAmIXQDAAAAAGASQjcAAAAAACYhdAMAAAAAYBJCNwAAAAAAJiF0AwAAAABgEkI3AAAAAAAmIXQDAAAAAGASQjcAAAAAACYhdAMAAAAAYBJCNwAAAAAAJiF0AwAAAABgEkI3AAAAAAAmIXQDAAAAAGASQjcAAAAAACYhdAMAAAAAYBJCNwAAAAAAJiF0AwAAAABgEkI3AAAAAAAmIXQDAAAAAGASQjcAAAAAACYhdAMAAAAAYBJCNwAAAAAAJiF0AwAAAABgEkI3AAAAAAAmIXQDAAAAAGASQjcAAAAAACYhdAMAAAAAYBJCNwAAAAAAJiF0AwAAAABgEkI3AAAAAAAmIXQDAAAAAGASQjcAAAAAACYhdAMAAAAAYBJCNwAAAAAAJiF0AwAAAABgEkI3AAAAAAAmIXQDAAAAAGASQjcAAAAAACYhdAMAAAAAYBJCNwAAAAAAJiF0AwAAAABgEkI3AAAAAAAmIXQDAAAAAGASQjcAAAAAACYhdAMAAAAAYBJCNwAAAAAAJiF0AwAAAABgEkI3AAAAAAAmIXQDAAAAAGASQjcAAAAAACYhdAMAAAAAYBJCNwAAAAAAJiF0AwAAAABgEkI3AAAAAAAmIXQDAAAAAGASQjcAAAAAACYhdAMAAAAAYBJCNwAAAAAAJiF0AwAAAABgEkI3AAAAAAAmsXnonjVrlry9veXs7Ky6desqJiYm17a//PKLWrduLW9vb1ksFoWGhuZfoQAAAAAA3Cabhu7w8HAFBwdr1KhRio2NVa1atRQQEKAzZ87k2D4lJUWVKlXSxIkTVbp06XyuFgAAAACA22PT0D116lT17NlTQUFBql69uubMmaPChQtr4cKFObZ/6qmnNGXKFLVr105OTk75XC0AAAAAALfHZqE7PT1du3fvlr+///+KsbOTv7+/oqOj82w7aWlpunjxYpYXAAAAAAD5wWahOyEhQRkZGfL09Mwy3dPTU3FxcXm2nZCQELm7u1tfXl5eebZuAAAAAABuxuYDqZlt2LBhSkxMtL7++usvW5cEAAAAAHhAONhqwx4eHrK3t1d8fHyW6fHx8Xk6SJqTkxP3fwMAAAAAbMJmPd2Ojo7y9fVVRESEdVpmZqYiIiJUv359W5UFAAAAAECesVlPtyQFBwerS5cu8vPzU506dRQaGqrk5GQFBQVJkgIDA1WuXDmFhIRI+mfwtV9//dX6/ydPntSePXtUtGhRValSxWb7AQAAAABATmwautu2bauzZ89q5MiR
iouLk4+PjzZu3GgdXO348eOys/tfZ/ypU6f05JNPWt9/8MEH+uCDD9S4cWNFRUXld/kAAAAAANyUTUO3JPXt21d9+/bNcd6NQdrb21uGYeRDVQAAAAAA3L0CP3o5AAAAAAC2QugGAAAAAMAkhG4AAAAAAExC6AYAAAAAwCSEbgAAAAAATELoBgAAAADAJIRuAAAAAABMQugGAAAAAMAkhG4AAAAAAExC6AYAAAAAwCSEbgAAAAAATELoBgAAAADAJIRuAAAAAABMQugGAAAAAMAkhG4AAAAAAExC6AYAAAAAwCSEbgAAAAAATELoBgAAAADAJIRuAAAAAABMQugGAAAAAMAkhG4AAAAAAExC6AYAAAAAwCSEbgAAAAAATELoBgAAAADAJIRuAAAAAABMQugGAAAAAMAkhG4AAAAAAExC6AYAAAAAwCSEbgAAAAAATELoBgAAAADAJIRuAAAAAABMQugGAAAAAMAkhG4AAAAAAExC6AYAAAAAwCSEbgAAAAAATELoBgAAAADAJIRuAAAAAABMQugGAAAAAMAkhG4AAAAAAExC6AYAAAAAwCSEbgAAAAAATELoBgAAAADAJIRuAAAAAABMQugGAAAAAMAkhG4AAAAAAExC6AYAAAAAwCSEbgAAAAAATELoBgAAAADAJIRuAAAAAABMQugGAAAAAMAkhG4AAAAAAExC6AYAAAAAwCSEbgAAAAAATELoBgAAAADAJIRuAAAAAABMQugGAAAAAMAkhG4AAAAAAExC6AYAAAAAwCSEbgAAAAAATELoBgAAAADAJIRuAAAAAABMQugGAAAAAMAkhG4AAAAAAExC6AYAAAAAwCSEbgAAAAAATELoBgAAAADAJIRuAAAAAABMck+E7lmzZsnb21vOzs6qW7euYmJibtr+888/16OPPipnZ2c98cQT2rBhQz5VCgAAAADArbN56A4PD1dwcLBGjRql2NhY1apVSwEBATpz5kyO7b///nu1b99e3bt3108//aSWLVuqZcuW2r9/fz5XDgAAAADAzdk8dE+dOlU9e/ZUUFCQqlevrjlz5qhw4cJauHBhju2nTZumF154QYMHD9Zjjz2mcePGqXbt2po5c2Y+Vw4AAAAAwM3ZNHSnp6dr9+7d8vf3t06zs7OTv7+/oqOjc1wmOjo6S3tJCggIyLU9AAAAAAC24mDLjSckJCgjI0Oenp5Zpnt6euq3337LcZm4uLgc28fFxeXYPi0tTWlpadb3iYmJkqSLFy/eTem5yki7bMp6gXuBWceN2TguUZBxXAL3Fo5J4N5j1nF5bb2GYdy0nU1Dd34ICQnRmDFjsk338vKyQTXA/c19xlu2LgHADTgugXsLxyRw7zH7uLx06ZLc3d1znW/T0O3h4SF7e3vFx8dnmR4fH6/SpUvnuEzp0qVvq/2wYcMUHBxsfZ+Zmalz587poYceksViucs9gC1dvHhRXl5e+uuvv+Tm5mbrcgCI4xK413BMAvcejsuCwzAMXbp0SWXLlr1pO5uGbkdHR/n6+ioiIkItW7aU9E8ojoiIUN++fXNcpn79+oqIiFD//v2t07Zs2aL69evn2N7JyUlOTk5ZphUrViwvysc9ws3NjX+wgHsMxyVwb+GYBO49HJcFw816uK+x+eXlwcHB6tKli/z8/FSnTh2FhoYqOTlZQUFBkqTAwECVK1dOISEhkqR+/fqpcePG+vDDD9WsWTOFhYVp165dmjdvni13AwAAAACAbGweutu2bauzZ89q5MiRiouLk4+PjzZu3GgdLO348eOys/vfIOsNGjTQihUrNHz4cL333nuqWrWqvvjiCz3++OO22gUAAAAAAHJk89AtSX379s31cvKoqKhs015//XW9/vrrJleFe52Tk5NGjRqV7fYBALbDcQncWzgmgXsPx+WDx2L82/jmAAAAAADgjtj9exMAAAAAAHAnCN0AAAAAAJiE0A0AAAAAgEkI3QAAAA+4jRs32roEACiw7onRywEA
AGAbTZo0kaOjo/z9/eXgwKkhAOQ1/mUF/j/DMGSxWP51GgAABcWSJUt06NAhHTt2TA4ODjpx4oQefvhhW5cFAAUKl5cDyhquV65cqZEjR0qSLBaLeKoeYK7MzExblwA8sFxdXVWqVCnt3btXQ4YM0cSJE5WSkmLrsgCgQKGnGw+8zMxM2dn98/1TTEyMli1bpgMHDsjT01N9+vSxBm96vIG8d/3xFxsbKxcXF9nb2+uRRx6xcWXAg6FWrVpydXVV586ddfDgQe3fv1+FCxfm7x5wD1qzZo3+/vtvJSQkqEePHipevDi3hNwn+CnhgXfthH/w4MHat2+fLBaLLl++rA8//FApKSkaPHgwwRswyfXH37Jly2RnZ6crV66ob9++6tevn9zd3W1cIVAwZWRkyN7eXpUrV1apUqUUHR2tRo0aKTExUZL4uwfcY4YMGaKVK1fKx8dHR48e1YIFCzRp0iS1atXK+rcU9y5CN6B/LilfsGCBNm/erJo1a+r8+fMaOnSoPvvsM9nb2ys4OJgTECCPXLtl49qxtH37doWFhWnlypWyt7fX77//rl69eikuLk4zZsyQvb29LcsFChzDMKzH1dKlS3XmzBktXbpUoaGhmjBhgvr166f//Oc//N0D7hHLli3TsmXLtGnTJj3xxBPasmWLAgICVLhwYQL3fYLQDUg6fPiwqlSpIl9fX9nZ2al06dIaO3as+vTpow8//FAODg565513OAEB8oDFYlFmZqYsFouWLFminTt3qlOnTmrSpIkkqWHDhipfvrwCAgLk4+OjN954w7YFAwXI9bd0fPTRR5oxY4ZWrVql2rVrq3Llyurdu7emT58ui8WiZ599lr97wD3gyJEjeuWVV/TEE09oxYoV6tWrl2bNmqWXXnpJycnJunr1KleG3eP4agQPtIyMDElSyZIllZ6erpMnT0r656SkfPnyGjZsmJKSkrRixQrNnDlTkjjxAO5Q165d1aNHD0n/XFZ++PBhhYeHa9myZTp//rykf469K1euqGnTpurfv7+WLVum5ORkBjQE8si1wL1v3z799ttvmjRpkmrXri3DMFSnTh3NmTNHJ06c0PTp0xUZGSmJv3uArXz22WeSpOPHjyszM1MxMTF66623NHHiRPXq1UuSNGfOHM2ZM4e/k/c4QjceKDeOknzt5OOZZ57Rn3/+qWnTpiklJcU63TAMPffcc6pWrZpWr16t+Pj4fK8ZKAhSUlL0+OOP68svv9SgQYMkSZUrV1ZwcLCaNGmiFStW6Ntvv5WdnZ11UJhixYrJMAy5uLhw0g/koQ0bNqhhw4Zau3atnJ2dJf3z9y4zM1N+fn6aO3euTp48qZEjRyo2NtbG1QIPppCQEHXt2lV//fWXOnXqpK1bt6pevXqaNm2aNXAnJydr27Ztio+P5+/kPY7Ly/HAuP6Suvnz5+u3337ToUOH9Oabb6pZs2YKDw/XK6+8otTUVL388suqUKGCxo8fr5o1a6pLly6qUaOGfvzxRzVv3tzGewLcfwoXLqyePXuqaNGiGj58uK5evarQ0FD5+/vL0dFRFotFPXv21Lx58/T0008rJSVFUVFRKlWqFCcSQB576aWX1L17d02fPl1bt25VgwYN9NBDD1mDt6+vr6ZNm6YFCxbIx8fH1uUCD5yYmBgdP35cX331lby8vOTo6KjmzZtry5YtSk5OVkpKin7//Xe99957iouL07p162xdMv6FxeBaBDxghgwZouXLl6t169aSpJkzZ2r48OEaO3asNmzYoAEDBigpKUn29vYqWbKkduzYoZSUFDVq1EiffPKJ6tWrZ+M9AO5fiYmJWrFihUaMGKFOnTopNDRUkrRt2zZNmTJFmzdv1mOPPSY/Pz/t379f33//vRwdHbmnFMgj10Ytl6R+/fpp3bp1GjJkiDp27Ch3d3cZhiHDMLIMznT9l9YAzPXFF19o9OjRSklJ0ddff63KlStLkn799VfNnTtXK1eu1NWrV/Xwww/L
w8NDmzZtUqFChbIc27j30NONB8rmzZsVHh6ur776Sk8++aRiY2M1c+ZMPfbYY5L++fa/du3aOnfunJKSkvTUU0/JYrFozJgxSktLU/ny5W28B8D95caTdXd3d7Vp00aSNGLECElSaGio/vOf/8jBwUEuLi76+eef9eyzz2rx4sWSpCtXrqhQoUL5XjtQENnb21tPzqdNm6b09HRNnTpVFotFHTp0yHEwJgI3kH8eeughVahQQZs2bdL27dutobt69eoaP368hgwZop9//lleXl6qUaOG7OzsdPXqVZ7XfY/jp4MC7cYT/vPnz6tGjRp68skntXLlSr3xxhuaNWuW2rdvr8TERJ08eVLVq1dX6dKlJUl79uzRxIkTtW3bNm3evFlly5a11a4A953rj7/Y2FgZhqHq1avroYceUvv27SVlDd6NGjVSWlqaFixYoGnTpqlGjRry8/PjRALIY9cH79mzZ6t3794KDQ1VcnKy3nrrLRUtWtTWJQIPrIYNG6pw4cKyWCyaPXu23N3drVdnOjs7q2jRoipXrpy1fWZmJn8n7wN8dYkC7doJ/8SJE3XkyBFlZGTo1KlTWrdund566y1NnjzZOhjF119/rZCQEP3999+S/hlUpnjx4vL29lZUVBT3tQG36drxN3ToUPn7++vVV19VzZo19fPPP6tYsWLq2LGjxo0bp+XLlys4OFiS1LRpU/Xu3VtVqlTR66+/rh9//JHLyoHbdOOgodee1HG9a8Fbkj7++GPVrl1bMTExKlKkSL7UCOB/1q9fr6VLl+qTTz5RUlKSfH19NWLECFWoUEHTp0/X2rVrJUkODg7ZRinnSpT7A/d0o0C6/v7PhQsX6p133tHWrVvl7e2ttm3b6ttvv9WUKVM0cOBASdLly5fVtm1bFS9eXIsXL85yks+9bMDtuf74i46OVs+ePTV9+nQ5ODjoo48+0jfffKM1a9aocePGunjxolauXKlevXrpo48+Ur9+/SRJW7du1dKlSzVmzBhVrFjRlrsD3LdCQ0PVtm1blSlTJtc2198Heu3vHWMoAPln0KBBWrlypdzc3HT58mVduXJFy5Yt07PPPquYmBhNmTJFf//9t3r06KEOHTrYulzcIUI3CrRt27bpyy+/VL169dSuXTsZhqH58+dr7ty5Kl++vN577z2dOHFC8+bN08mTJxUbGysHBweCNnCHbjx2fv75Z23YsEHDhg2T9M/92dcefbJ27Vo1atRIFy5cUGRkpF555ZUsg8BcvnxZLi4u+b4PwP3q+rA8Z84cDRw4UBEREf86AOiN4yZwfyiQP5YvX67+/ftry5Yt8vLykp2dnXr27KmoqCht3rxZtWvXVnR0tIYPH65HH31Us2bNsnXJuEOEbhRYkZGR6tevn06fPq0lS5bopZdekiSlp6dr6dKlWrFihXbu3KknnnhCDz/8sMLCwhj9EbgL15/wh4SE6KefftKuXbvk5+enhQsXWu8TvXLlijp37qxt27Zp+fLlatq0qXUdV69elb29Pb1swF3Yvn271q5dq4YNG1rvBc3N9cftZ599poCAgBwHUwOQ90JCQrRjxw6tX78+y5fWL7zwgs6cOaMff/xR9vb2+vXXX/Xoo4/SIXQf4yeHAuPG74+efPJJvfzyy7JYLFq8eLGuXLkiSXJ0dFSPHj20bds2/fTTT9qyZYtWrVqlQoUKWU/4AdyezMxM64n79OnTNWnSJHl6eqpChQr68ssvtX79eqWlpUmSChUqpGXLlqlWrVqaOnVqlvU4ODgQuIHbdP093Nu2bVOvXr0UFhamhx56SFLO93RLWQP3/Pnz1a5dO8XExJhfMABJUkJCgg4cOCDpn3uzr/2dDA4O1rlz53T48GFJ/4xcbmdnl228Btw/CN0oEK4/cfjkk08UGRmpYsWKaejQoXrrrbd06NAhDR8+3Bq8r169Kkl65JFH5ObmJovFwuiPwF249u37vn379Ouvv+rzzz/XjBkzFBkZqTZt2uiN
N97QV199ZT2hcHBw0MaNG7V+/Xpblg3c965evWo9/vbu3av69evrhRdeUFpampYtWybpn0HTbjxZv/7v5ty5czV48GCtXr06y5UnAPLehg0b9N1330mSAgMD5eDgYL0Fy8nJyfpfJyenbB1B9HTfv/jJ4b53fQ/bTz/9pCVLluidd97Rrl275OrqqkGDBunFF1/U9u3bNWLECOu9ajcOFMM/ZMDd2bBhgxo2bKgvv/wyywn+0qVL9corr6h79+7asGGDUlNTJf0TBPjmHrhzq1at0tChQyVJAwYMUIcOHeTo6Kjhw4erR48e2rVrl8aOHStJWY61GwP3kCFD9Mknn+jVV1+1zY4AD4ghQ4YoODhYP/zwgy5cuKCKFSuqY8eOioqKUt++fXX27Fn9+uuvmjJliry8vBhItAAhZeC+ZhiGNSyPGTNG48aNU2pqqg4dOqQ33nhD3333ndzc3DR06FA999xz+vbbb/X2228rIyODS1iBPPbSSy+pe/fuOnfunL755hudP3/eOu/TTz9Vy5Yt1bp1a/3www9ZluMLL+DOZGRkaOrUqXr66ae1cOFCrVixQvb29ipRooSGDh2q//znP9qwYYPGjRsnSdlGJp8+fbqGDRumhQsX/uu93wDuzkcffaTFixdr0aJF6tOnj4oVKyY3Nzf16dNHrVu31pYtW1S+fHm1bt1aCQkJ+vrrr/liugBhIDUUCLNmzdK7776r9evX65FHHlFUVJSWLFmiM2fOaNasWapfv74uXryo9957T+np6Zo7dy6hG8hD1w9A+Pbbb2v9+vUaPHiw2rdvr2LFilnbjRs3TsOGDeNWDiCPNGrUSDt27FBQUJA++eQT6/gmFotFCQkJCgkJUXR0tJ555hlNnjzZutyff/6pgIAAjRs3Tu3atbNV+UCBZxiGLl++rPbt2+uZZ57R4MGDrV9+Xbv68tptjxERESpZsqRq1aole3t7niRQgBC6cd/LzMxU165d5ejoqAULFlinb9myxXof97x58+Tn56fk5GS5uLjwHFLABNcH7169emnLli0aOHCgOnTokG00ZE4kgDtz42P5pkyZorS0NI0ZM0YDBgzQhAkT5ODgYD0eExIS9N577ykjI0MLFiyw/t1LS0tTQkKCypUrZ6tdAR4YaWlpqlevnl599VWNHDkyy7zLly/r4MGD8vHxyTKdp+kULJzx4L5nZ2cnd3d37du3TykpKSpcuLAkqWnTpvrhhx80cuRI9erVSzNnzlTdunUlZT9pAXD37O3trScJs2fPVu/evRUaGqrk5GS99dZb1keGSSJwA3fg+r9dS5cuVeHChfX222/L2dlZVapUUefOnWWxWDRhwgTryfqxY8c0e/Zs2dnZWQcNtbOzk5OTE4EbMNn58+dVvHhxZWZmqnjx4tq1a1e2Nn/99Zfmzp2r4OBgVa1a1TqdwF2wkDpwX8ntvhYfHx/99ddf+vrrr3X58mXr9GrVqunVV19V5cqVNWPGDF26dEkS95ACd+LG4y+nxxBdC96S9PHHH6t27dqKiYlRkSJF8qVGoKC6fgyTIUOGaOjQobp06ZLOnTsnSWrXrp2WLl2qjz76SAMHDtTPP/+sV155RQMHDrQG7uvXAcBcEydOVJcuXXT48GG5uLho4sSJ2rp1q9555x2lpKQoPT1diYmJ6t+/v44dO6bKlSvbumSYiMvLcd+4/hv+yMhIpaam6urVq3r55ZclSR07dtSOHTs0ZswYNWzYUCVKlFDXrl1Vt25dubu7a+TIkdq1axcjQQJ3KTQ0VG3btlWZMmVybXP9ZXHXjl1u6QDuzPV//0JDQzVp0iStW7dOderUsba5fPmyXFxctGrVKnXs2FFVq1aVk5OTdu7cqUKFCtmqdOCBNGTIEC1btkwhISF65plnrIF69erV6tKlix555BE5ODjI3t5eKSkp2rVrlwoVKsSVmAUYoRv3ncGDB2vFihUqXLiwTp06pTp16mj69Ol64okn1L17d8XE
[base64-encoded PNG data for the matplotlib figure omitted]", - "text/plain": [ - "
" - ] - }, - "metadata": {}, - "output_type": "display_data" - } - ], - "source": [ - "visualize_metrics(rc, title=\"CTGAN Nursery Metric Evaluation\")" - ] - }, - { - "cell_type": "code", - "execution_count": 29, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "[INFO] Generating data using GANBLR model\n", - "[SUCCESS] Data generation completed\n" - ] - } - ], - "source": [ - "model_result = evaluate.evaluate_models(data, {\"GANBLR\": ganblr_model, \"CTGAN\": ctgan_model}, ['tstr_logreg', 'tstr_mlp', 'tstr_rf', 'q_score'])" - ] - }, - { - "cell_type": "code", - "execution_count": null, - "metadata": {}, - "outputs": [], - "source": [ - "# model_result" - ] - }, - { - "cell_type": "code", - "execution_count": 31, - "metadata": {}, - "outputs": [], - "source": [ - "# Future import statement ideas\n", - "\n", - "import katabatic as kb\n", - "from katabatic.models import meg\n", - "from katabatic.models import ganblr\n", - "from katabatic.utils.evaluate import run_metric\n", - "from katabatic.utils.preprocessing import preprosessing_method1 # good place to store preprocessing utilities" - ] - } - ], - "metadata": { - "kernelspec": { - "display_name": "Python 3", - "language": "python", - "name": "python3" - }, - "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 3 - }, - "file_extension": ".py", - "mimetype": "text/x-python", - "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython3", - "version": "3.9.19" - } - }, - "nbformat": 4, - "nbformat_minor": 2 -} diff --git a/Use_Cases/trytableGan.ipynb b/Use_Cases/trytableGan.ipynb deleted file mode 100644 index 811cf73..0000000 --- a/Use_Cases/trytableGan.ipynb +++ /dev/null @@ -1,1012 +0,0 @@ -{ - "cells": [ - { - "cell_type": "code", - "execution_count": 4, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "Loading Data...\n", - "Shape of X_train: (2000, 14)\n", - "Shape of 
y_train: (2000, 1)\n", - "---FIT TableGAN Model\n", - "---Initialise TableGAN Model\n" - ] - }, - { - "name": "stderr", - "output_type": "stream", - "text": [ - "c:\\Users\\tahat\\AppData\\Local\\Programs\\Python\\Python38\\lib\\site-packages\\sklearn\\preprocessing\\_label.py:114: DataConversionWarning: A column-vector y was passed when a 1d array was expected. Please change the shape of y to (n_samples, ), for example using ravel().\n", - " y = column_or_1d(y, warn=True)\n" - ] - }, - { - "name": "stdout", - "output_type": "stream", - "text": [ - "Epoch 10/200: [D loss: -1.9100] [G loss: -0.0554] [C loss: 0.3172]\n", - "Epoch 20/200: [D loss: -1.3383] [G loss: -0.1641] [C loss: 0.0942]\n", - "Epoch 30/200: [D loss: -0.8126] [G loss: -0.3121] [C loss: 0.0424]\n", - "Epoch 40/200: [D loss: -0.7984] [G loss: -0.3318] [C loss: 0.0285]\n", - "Epoch 50/200: [D loss: -0.6670] [G loss: -0.2688] [C loss: 0.0190]\n", - "Epoch 60/200: [D loss: -0.6348] [G loss: 0.0377] [C loss: 0.0142]\n", - "Epoch 70/200: [D loss: -0.6111] [G loss: -0.0788] [C loss: 0.0101]\n", - "Epoch 80/200: [D loss: -0.5998] [G loss: -0.0851] [C loss: 0.0088]\n", - "Epoch 90/200: [D loss: -0.5438] [G loss: -0.0609] [C loss: 0.0061]\n", - "Epoch 100/200: [D loss: -0.5389] [G loss: -0.1377] [C loss: 0.0046]\n", - "Epoch 110/200: [D loss: -0.5233] [G loss: -0.2638] [C loss: 0.0039]\n", - "Epoch 120/200: [D loss: -0.5171] [G loss: -0.2334] [C loss: 0.0036]\n", - "Epoch 130/200: [D loss: -0.5011] [G loss: -0.2618] [C loss: 0.0028]\n", - "Epoch 140/200: [D loss: -0.4998] [G loss: -0.2296] [C loss: 0.0025]\n", - "Epoch 150/200: [D loss: -0.4928] [G loss: -0.1326] [C loss: 0.0022]\n", - "Epoch 160/200: [D loss: -0.4891] [G loss: -0.2150] [C loss: 0.0015]\n", - "Epoch 170/200: [D loss: -0.4749] [G loss: -0.1316] [C loss: 0.0015]\n", - "Epoch 180/200: [D loss: -0.4783] [G loss: -0.1347] [C loss: 0.0011]\n", - "Epoch 190/200: [D loss: -0.4725] [G loss: -0.0870] [C loss: 0.0009]\n", - "Epoch 200/200: [D loss: 
-0.4582] [G loss: -0.0580] [C loss: 0.0006]\n", - "---Generate from TableGAN Model\n", - "Shape of synthetic data: (1000, 15)\n" - ] - } - ], - "source": [ - "from katabatic.models.TableGAN import TableGANAdapter, TableGAN, preprocess_data, postprocess_data\n", - "import pandas as pd\n", - "\n", - "# Initialize the adapter with a specific privacy setting\n", - "tablegan_adapter = TableGANAdapter(type='continuous', privacy_setting='high')\n", - "\n", - "# Load data\n", - "data_path = 'data/Adult/train_Adult_cleaned.csv'\n", - "labels_path = 'data/Adult/train_Adult_labels.csv'\n", - "\n", - "# Load features\n", - "X_train = tablegan_adapter.load_data(data_path)\n", - "\n", - "# Load labels\n", - "y_train = pd.read_csv(labels_path, header=None)\n", - "\n", - "# If y_train has multiple columns, assume the first column is the target\n", - "if y_train.shape[1] > 1:\n", - " y_train = y_train.iloc[:, 0]\n", - "\n", - "# Print shapes to verify\n", - "print(\"Shape of X_train:\", X_train.shape)\n", - "print(\"Shape of y_train:\", y_train.shape)\n", - "\n", - "# Fit the model\n", - "tablegan_adapter.fit(X_train, y_train, epochs=200, batch_size=64)\n", - "\n", - "# Generate synthetic data\n", - "synthetic_data = tablegan_adapter.generate(size=1000)\n", - "\n", - "# Print shape of synthetic data\n", - "print(\"Shape of synthetic data:\", synthetic_data.shape)" - ] - }, - { - "cell_type": "code", - "execution_count": 6, - "metadata": {}, - "outputs": [ - { - "data": { - "text/html": [ - "
\n", - "\n", - "\n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - "
01234567891011121314
039.1213871.427392359870.2812501.90767713.6227702.1911863.8996092.1594381.1683871.9873532386.21826250.11344941.9158330.1249101.0
163.0414430.64618581062.6093751.75695411.2162621.0694281.8342844.1493121.1000031.9994731550.688721260.51715141.4513470.6714101.0
243.2682340.392172438285.8125002.4734829.0749273.1814316.8425453.1322912.9184621.9945952721.22583069.84430741.6339110.0523181.0
361.2280541.683767310968.90625010.7982418.8417451.0859401.8634102.5877331.1398231.9914322739.26196333.71923841.6475370.0997851.0
432.0398372.333866207680.5000002.28385510.9979523.1374025.9673603.8138381.2091551.9995111500.981934127.84696241.7858580.3263281.0
\n", - "
" - ], - "text/plain": [ - " 0 1 2 3 4 5 \\\n", - "0 39.121387 1.427392 359870.281250 1.907677 13.622770 2.191186 \n", - "1 63.041443 0.646185 81062.609375 1.756954 11.216262 1.069428 \n", - "2 43.268234 0.392172 438285.812500 2.473482 9.074927 3.181431 \n", - "3 61.228054 1.683767 310968.906250 10.798241 8.841745 1.085940 \n", - "4 32.039837 2.333866 207680.500000 2.283855 10.997952 3.137402 \n", - "\n", - " 6 7 8 9 10 11 12 \\\n", - "0 3.899609 2.159438 1.168387 1.987353 2386.218262 50.113449 41.915833 \n", - "1 1.834284 4.149312 1.100003 1.999473 1550.688721 260.517151 41.451347 \n", - "2 6.842545 3.132291 2.918462 1.994595 2721.225830 69.844307 41.633911 \n", - "3 1.863410 2.587733 1.139823 1.991432 2739.261963 33.719238 41.647537 \n", - "4 5.967360 3.813838 1.209155 1.999511 1500.981934 127.846962 41.785858 \n", - "\n", - " 13 14 \n", - "0 0.124910 1.0 \n", - "1 0.671410 1.0 \n", - "2 0.052318 1.0 \n", - "3 0.099785 1.0 \n", - "4 0.326328 1.0 " - ] - }, - "execution_count": 6, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "pd.DataFrame(synthetic_data).head()" - ] - }, - { - "cell_type": "code", - "execution_count": 7, - "metadata": {}, - "outputs": [ - { - "data": { - "text/html": [ - "
\n", - "\n", - "\n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - "
ageworkclassfnlwgteducationeducation-nummarital-statusoccupationrelationshipracesexcapital-gaincapital-losshours-per-weeknative-country
0396775161133941221740401
1502833111131531200131
2381215646492741200401
3531234721371735200401
428133840911316151004013
\n", - "
" - ], - "text/plain": [ - " age workclass fnlwgt education education-num marital-status \\\n", - "0 39 6 77516 1 13 3 \n", - "1 50 2 83311 1 13 1 \n", - "2 38 1 215646 4 9 2 \n", - "3 53 1 234721 3 7 1 \n", - "4 28 1 338409 1 13 1 \n", - "\n", - " occupation relationship race sex capital-gain capital-loss \\\n", - "0 9 4 1 2 2174 0 \n", - "1 5 3 1 2 0 0 \n", - "2 7 4 1 2 0 0 \n", - "3 7 3 5 2 0 0 \n", - "4 6 1 5 1 0 0 \n", - "\n", - " hours-per-week native-country \n", - "0 40 1 \n", - "1 13 1 \n", - "2 40 1 \n", - "3 40 1 \n", - "4 40 13 " - ] - }, - "execution_count": 7, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "X_train.head()" - ] - }, - { - "cell_type": "code", - "execution_count": 7, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "['cities_demo.csv', 'importer.py', 'Iris.csv', 'katabatic.py', 'katabatic_config.json', 'katabatic_spi.py', 'metrics', 'models', 'test2.py', 'utils', '__init__.py', '__pycache__']\n", - "['ctgan', 'ganblr', 'meg', 'model_template', 'TableGAN', 'tvae', '__init__.py', '__pycache__']\n", - "['tablegan.py', 'tablegan_adapter.py', 'tablegan_utils.py', '__init__.py']\n" - ] - } - ], - "source": [ - "import os\n", - "import katabatic\n", - "\n", - "katabatic_path = os.path.dirname(katabatic.__file__)\n", - "print(os.listdir(katabatic_path))\n", - "\n", - "models_path = os.path.join(katabatic_path, 'models')\n", - "if os.path.exists(models_path):\n", - " print(os.listdir(models_path))\n", - "else:\n", - " print(\"models directory not found\")\n", - "\n", - "tablegan_path = os.path.join(models_path, 'tablegan')\n", - "if os.path.exists(tablegan_path):\n", - " print(os.listdir(tablegan_path))\n", - "else:\n", - " print(\"tablegan directory not found\")" - ] - }, - { - "cell_type": "code", - "execution_count": 2, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "---FIT TableGAN Model with high privacy 
setting\n", - "---Initialise TableGAN Model\n" - ] - }, - { - "name": "stderr", - "output_type": "stream", - "text": [ - "c:\\Users\\tahat\\AppData\\Local\\Programs\\Python\\Python38\\lib\\site-packages\\sklearn\\preprocessing\\_label.py:114: DataConversionWarning: A column-vector y was passed when a 1d array was expected. Please change the shape of y to (n_samples, ), for example using ravel().\n", - " y = column_or_1d(y, warn=True)\n" - ] - }, - { - "name": "stdout", - "output_type": "stream", - "text": [ - "Epoch 10/200: [D loss: -6.0872] [G loss: 12.9314] [C loss: 0.5011]\n", - "Epoch 20/200: [D loss: -4.4862] [G loss: 9.4993] [C loss: 0.2545]\n", - "Epoch 30/200: [D loss: -1.8858] [G loss: 0.9433] [C loss: 0.1350]\n", - "Epoch 40/200: [D loss: -1.5708] [G loss: 0.2090] [C loss: 0.0813]\n", - "Epoch 50/200: [D loss: -1.4518] [G loss: 0.1704] [C loss: 0.0526]\n", - "Epoch 60/200: [D loss: -1.0858] [G loss: 0.1276] [C loss: 0.0413]\n", - "Epoch 70/200: [D loss: -0.9020] [G loss: 0.2798] [C loss: 0.0304]\n", - "Epoch 80/200: [D loss: -0.9490] [G loss: 0.4647] [C loss: 0.0251]\n", - "Epoch 90/200: [D loss: -0.9015] [G loss: 0.5510] [C loss: 0.0207]\n", - "Epoch 100/200: [D loss: -0.8175] [G loss: 0.4248] [C loss: 0.0197]\n", - "Epoch 110/200: [D loss: -0.7717] [G loss: 0.2299] [C loss: 0.0159]\n", - "Epoch 120/200: [D loss: -0.7385] [G loss: 0.4194] [C loss: 0.0109]\n", - "Epoch 130/200: [D loss: -0.7586] [G loss: 0.3148] [C loss: 0.0092]\n", - "Epoch 140/200: [D loss: -0.7270] [G loss: 0.5133] [C loss: 0.0083]\n", - "Epoch 150/200: [D loss: -0.7013] [G loss: 0.4182] [C loss: 0.0074]\n", - "Epoch 160/200: [D loss: -0.6626] [G loss: 0.3099] [C loss: 0.0052]\n", - "Epoch 170/200: [D loss: -0.6453] [G loss: 0.3161] [C loss: 0.0048]\n", - "Epoch 180/200: [D loss: -0.6487] [G loss: 0.4090] [C loss: 0.0049]\n", - "Epoch 190/200: [D loss: -0.6215] [G loss: 0.4898] [C loss: 0.0027]\n", - "Epoch 200/200: [D loss: -0.6184] [G loss: 0.4421] [C loss: 0.0027]\n", - "---Generate 
from TableGAN Model\n", - "Shape of X_train: (1000, 14)\n", - "Shape of y_train: (1000, 1)\n", - "Shape of synthetic data: (1000, 15)\n" - ] - }, - { - "data": { - "text/html": [ - "
\n", - "\n", - "\n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - "
01234567891011121314
025.6867105.114521210898.9531256.7700697.5738563.3802899.3692433.6092281.6995421.9956053653.465820154.86518940.6372070.8796520.0
150.8860930.960232260874.5937505.1751428.4246984.4167587.4119414.5638101.8376881.0083072824.74609418.07225040.2459110.6621500.0
229.7263365.242323164565.8437502.6029619.7325722.8831899.5517214.5212482.6292451.93345511014.456055271.14846840.8065492.0366940.0
357.6563872.477122139024.3281253.79808013.3377521.0211812.9713131.8348931.0052201.9951202998.986572106.89205940.5073814.2830480.0
427.1481231.131275135529.0156257.35954312.4701151.8871173.4015293.5468581.0120241.9987671768.729248147.95222540.7214361.8349410.0
\n", - "
" - ], - "text/plain": [ - " 0 1 2 3 4 5 \\\n", - "0 25.686710 5.114521 210898.953125 6.770069 7.573856 3.380289 \n", - "1 50.886093 0.960232 260874.593750 5.175142 8.424698 4.416758 \n", - "2 29.726336 5.242323 164565.843750 2.602961 9.732572 2.883189 \n", - "3 57.656387 2.477122 139024.328125 3.798080 13.337752 1.021181 \n", - "4 27.148123 1.131275 135529.015625 7.359543 12.470115 1.887117 \n", - "\n", - " 6 7 8 9 10 11 \\\n", - "0 9.369243 3.609228 1.699542 1.995605 3653.465820 154.865189 \n", - "1 7.411941 4.563810 1.837688 1.008307 2824.746094 18.072250 \n", - "2 9.551721 4.521248 2.629245 1.933455 11014.456055 271.148468 \n", - "3 2.971313 1.834893 1.005220 1.995120 2998.986572 106.892059 \n", - "4 3.401529 3.546858 1.012024 1.998767 1768.729248 147.952225 \n", - "\n", - " 12 13 14 \n", - "0 40.637207 0.879652 0.0 \n", - "1 40.245911 0.662150 0.0 \n", - "2 40.806549 2.036694 0.0 \n", - "3 40.507381 4.283048 0.0 \n", - "4 40.721436 1.834941 0.0 " - ] - }, - "execution_count": 2, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "from katabatic.models.TableGAN import TableGANAdapter, TableGAN, preprocess_data, postprocess_data\n", - "import pandas as pd\n", - "import numpy as np\n", - "from sklearn.model_selection import train_test_split\n", - "\n", - "# Load data\n", - "data_path = 'data/Adult/train_Adult_cleaned.csv'\n", - "labels_path = 'data/Adult/train_Adult_labels.csv'\n", - "\n", - "# Load features and labels\n", - "X = pd.read_csv(data_path)\n", - "y = pd.read_csv(labels_path, header=None)\n", - "\n", - "# If y has multiple columns, assume the first column is the target\n", - "if y.shape[1] > 1:\n", - " y = y.iloc[:, 0]\n", - "\n", - "# Split the data\n", - "X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.5, random_state=42)\n", - "\n", - "# Function to identify numerical columns\n", - "def is_numerical(dtype):\n", - " return dtype.kind in 'iuf'\n", - "\n", - "# Identify numerical columns\n", - 
"column_is_numerical = X_train.dtypes.apply(is_numerical).values\n", - "numerical_columns = np.argwhere(column_is_numerical).ravel()\n", - "\n", - "# Initialize the adapter with a specific privacy setting\n", - "tablegan_adapter = TableGANAdapter(type='mixed', privacy_setting='high')\n", - "\n", - "# Fit the model\n", - "tablegan_adapter.fit(X_train, y_train, epochs=200, batch_size=64)\n", - "\n", - "# Generate synthetic data\n", - "synthetic_data = tablegan_adapter.generate(size=1000)\n", - "\n", - "# Print shapes to verify\n", - "print(\"Shape of X_train:\", X_train.shape)\n", - "print(\"Shape of y_train:\", y_train.shape)\n", - "print(\"Shape of synthetic data:\", synthetic_data.shape)\n", - "\n", - "# Display the first few rows of the synthetic data\n", - "pd.DataFrame(synthetic_data).head()" - ] - }, - { - "cell_type": "code", - "execution_count": 3, - "metadata": {}, - "outputs": [ - { - "data": { - "text/html": [ - "
\n", - "\n", - "\n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - "
ageworkclassfnlwgteducationeducation-nummarital-statusoccupationrelationshipracesexcapital-gaincapital-losshours-per-weeknative-country
4402913634251133641200401
573371224637111231201977401
9464011846822102961100401
997484331091132561200581
50358118098021023611004223
\n", - "
" - ], - "text/plain": [ - " age workclass fnlwgt education education-num marital-status \\\n", - "440 29 1 363425 1 13 3 \n", - "573 37 1 22463 7 11 1 \n", - "946 40 1 184682 2 10 2 \n", - "997 48 4 33109 1 13 2 \n", - "503 58 1 180980 2 10 2 \n", - "\n", - " occupation relationship race sex capital-gain capital-loss \\\n", - "440 6 4 1 2 0 0 \n", - "573 2 3 1 2 0 1977 \n", - "946 9 6 1 1 0 0 \n", - "997 5 6 1 2 0 0 \n", - "503 3 6 1 1 0 0 \n", - "\n", - " hours-per-week native-country \n", - "440 40 1 \n", - "573 40 1 \n", - "946 40 1 \n", - "997 58 1 \n", - "503 42 23 " - ] - }, - "execution_count": 3, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "X_train.head()" - ] - }, - { - "cell_type": "code", - "execution_count": 11, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "Shape of synthetic data: (1000, 15)\n", - "Shape of X_train: (1000, 14)\n", - "Warning: Mismatch in number of features. Adjusting synthetic data.\n", - "Age range in synthetic data: 18 - 70\n", - "Age distribution in synthetic data:\n", - "age\n", - "18 18\n", - "19 20\n", - "20 17\n", - "21 34\n", - "22 29\n", - "Name: count, dtype: int64\n", - " age workclass fnlwgt education education-num marital-status \\\n", - "0 36 1.0 148436.421875 3.0 5.0 3.0 \n", - "1 32 0.0 258493.687500 5.0 13.0 2.0 \n", - "2 25 0.0 111331.015625 1.0 13.0 1.0 \n", - "3 49 0.0 112539.156250 3.0 7.0 2.0 \n", - "4 31 0.0 280701.781250 3.0 12.0 2.0 \n", - "\n", - " occupation relationship race sex capital-gain capital-loss \\\n", - "0 1.668044 2.665293 1.222549 1.727480 4754.707520 90.608620 \n", - "1 1.958121 4.081008 1.006382 1.999623 777.928223 90.198013 \n", - "2 5.090883 3.625424 1.005751 1.999899 409.163910 60.240929 \n", - "3 0.893426 2.313017 1.228375 1.104635 5942.598145 99.190346 \n", - "4 7.455132 3.246283 1.017341 1.999511 953.247559 84.151093 \n", - "\n", - " hours-per-week native-country \n", - "0 39.512028 2.840122 \n", - "1 
39.445625 2.314419 \n", - "2 38.215969 1.173012 \n", - "3 39.392338 4.975973 \n", - "4 39.034595 1.328062 \n" - ] - } - ], - "source": [ - "from katabatic.models.TableGAN import TableGANAdapter, TableGAN, preprocess_data, postprocess_data\n", - "import pandas as pd\n", - "import numpy as np\n", - "from sklearn.model_selection import train_test_split\n", - "from sklearn.preprocessing import MinMaxScaler\n", - "\n", - "\n", - "\n", - "# Initialize the adapter with a specific privacy setting\n", - "tablegan_adapter = TableGANAdapter(type='mixed', privacy_setting='high')\n", - "\n", - "# Fit the model\n", - "tablegan_adapter.fit(X_train, y_train, epochs=200, batch_size=64)\n", - "\n", - "# Define discrete columns\n", - "discrete_columns = ['age', 'workclass', 'education', 'education-num', 'marital-status'] # Add all discrete columns\n", - "\n", - "# Initialize scaler\n", - "scaler = MinMaxScaler()\n", - "scaled_data = scaler.fit_transform(X_train)\n", - "\n", - "# Generate synthetic data\n", - "n_samples = 1000\n", - "synthetic_data = tablegan_adapter.generate(size=n_samples)\n", - "\n", - "# Post-process discrete columns\n", - "# Check the shape of synthetic data\n", - "print(\"Shape of synthetic data:\", synthetic_data.shape)\n", - "print(\"Shape of X_train:\", X_train.shape)\n", - "\n", - "pd.DataFrame(synthetic_data).head()\n", - "\n", - "# Ensure synthetic_data has the same number of features as X_train\n", - "if synthetic_data.shape[1] != X_train.shape[1]:\n", - " print(\"Warning: Mismatch in number of features. 
Adjusting synthetic data.\")\n", - " synthetic_data = synthetic_data[:, :X_train.shape[1]]\n", - "\n", - "# Convert synthetic data to a DataFrame\n", - "synthetic_df = pd.DataFrame(synthetic_data, columns=X_train.columns)\n", - "\n", - "# Post-process discrete columns\n", - "for col_name in discrete_columns:\n", - " if col_name == 'age':\n", - " synthetic_df[col_name] = np.round(synthetic_df[col_name]).astype(int)\n", - " else:\n", - " synthetic_df[col_name] = np.round(synthetic_df[col_name])\n", - "\n", - "# Clamp values within the original range for discrete columns\n", - "for col_name in discrete_columns:\n", - " min_val = X_train[col_name].min()\n", - " max_val = X_train[col_name].max()\n", - " synthetic_df[col_name] = np.clip(synthetic_df[col_name], min_val, max_val)\n", - "\n", - "# Print age statistics\n", - "print(\"Age range in synthetic data:\", synthetic_df['age'].min(), \"-\", synthetic_df['age'].max())\n", - "print(\"Age distribution in synthetic data:\")\n", - "print(synthetic_df['age'].value_counts().sort_index().head())\n", - "\n", - "# Display the first few rows of the synthetic data\n", - "print(synthetic_df.head())" - ] - } - ], - "metadata": { - "kernelspec": { - "display_name": "chatbot", - "language": "python", - "name": "python3" - }, - "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 3 - }, - "file_extension": ".py", - "mimetype": "text/x-python", - "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython3", - "version": "3.8.10" - } - }, - "nbformat": 4, - "nbformat_minor": 2 -} diff --git a/Website b/Website new file mode 160000 index 0000000..ae53420 --- /dev/null +++ b/Website @@ -0,0 +1 @@ +Subproject commit ae53420f76063ea54ee591c47fa053c813671d03 diff --git a/benchmarking_approach.md b/benchmarking_approach.md deleted file mode 100644 index 94ff387..0000000 --- a/benchmarking_approach.md +++ /dev/null @@ -1,59 +0,0 @@ -# Benchmarking Tabular Data Generation Models - -## 1. 
Define Benchmarking Objectives - -- **Data Fidelity**: How closely does the synthetic data resemble the real data in terms of statistical properties? -- **Utility**: How well can models trained on synthetic data generalize to real data? -- **Privacy**: Does the synthetic data protect against revealing sensitive information from the original dataset? -- **Scalability and Efficiency**: How well does the model handle large datasets and how computationally expensive is it? - -## 2. Select Datasets - -- **Variety of Datasets**: Use a diverse set of datasets, including different sizes, types (categorical, continuous, mixed), and domains (finance, healthcare, retail). -- **Real-World Datasets**: Choose real-world datasets that represent practical applications. -- **Publicly Available Datasets**: Prefer publicly available datasets to ensure reproducibility. - -## 3. Define Evaluation Metrics - -### Fidelity Metrics: - -- **Kolmogorov-Smirnov (KS) Test**: Compare distributions of continuous features between real and synthetic data. -- **Wasserstein Distance**: Measure the difference between distributions of real and synthetic data. -- **Jaccard Similarity**: For categorical data, evaluate the overlap of categories between real and synthetic data. -- **Correlation Matrix Comparison**: Compare correlation matrices of real and synthetic data to assess inter-feature dependencies. - -### Utility Metrics: - -- **Train on Synthetic, Test on Real (TSTR)**: Train a model on synthetic data and evaluate its performance on real data. -- **Train on Real, Test on Synthetic (TRTS)**: Train a model on real data and evaluate its performance on synthetic data. -- **Downstream Task Performance**: Evaluate synthetic data by using it for specific tasks like classification, regression, or clustering and comparing the results to those obtained using real data. 
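The TSTR metric above can be sketched in a few lines. This is a minimal illustration using scikit-learn; the helper name `tstr_score` and the Gaussian toy data are assumptions for the example, not part of any benchmarking API:

```python
# Sketch of TSTR (Train on Synthetic, Test on Real): a classifier is fit on
# synthetic data and scored on held-out real data. Good synthetic data should
# yield accuracy close to a model trained on the real data itself.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

def tstr_score(X_synth, y_synth, X_real, y_real):
    """Train on synthetic data, evaluate on real data (illustrative helper)."""
    clf = LogisticRegression(max_iter=1000)
    clf.fit(X_synth, y_synth)
    return accuracy_score(y_real, clf.predict(X_real))

# Tiny demo: two well-separated Gaussian blobs stand in for real/synthetic data.
rng = np.random.default_rng(0)
X_real = np.vstack([rng.normal(0, 1, (100, 2)), rng.normal(3, 1, (100, 2))])
y_real = np.array([0] * 100 + [1] * 100)
X_synth = np.vstack([rng.normal(0, 1, (100, 2)), rng.normal(3, 1, (100, 2))])
y_synth = np.array([0] * 100 + [1] * 100)

score = tstr_score(X_synth, y_synth, X_real, y_real)
print(f"TSTR accuracy: {score:.3f}")
```

TRTS is the same computation with the roles of the two datasets swapped.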
- -### Privacy Metrics: - -- **Membership Inference Attack (MIA)**: Measure the risk of inferring whether a specific record was part of the real dataset. -- **Differential Privacy (DP) Guarantees**: If applicable, evaluate whether the model provides differential privacy guarantees. -- **Attribute Inference Attack**: Assess the risk of predicting sensitive attributes of real data from synthetic data. - -### Scalability and Efficiency Metrics: - -- **Training Time**: Measure the time it takes to train the model on different datasets. -- **Memory Usage**: Evaluate the memory consumption during model training. -- **Generation Speed**: Assess how quickly the model can generate synthetic data. - -## 4. Design a Benchmarking Framework - -- **Preprocessing Pipeline**: Ensure a consistent preprocessing pipeline for all datasets, including handling missing values, scaling, and encoding. -- **Model Implementation**: Implement each tabular data generation model in a consistent framework (e.g., Python, TensorFlow, PyTorch) to ensure comparability. -- **Reproducibility**: Ensure all experiments are reproducible by setting random seeds and documenting the environment (e.g., Python version, library versions). - -## 5. Run Benchmarking Experiments - -- **Baseline Models**: Include simple models (e.g., Gaussian Noise, Random Sampling) as baselines for comparison. -- **Cross-Validation**: Use cross-validation to evaluate models on multiple splits of the data. -- **Statistical Significance**: Test the statistical significance of differences in performance between models. - -## 6. Analyze and Visualize Results - -- **Aggregate Results**: Summarize the results across datasets and models to identify trends and outliers. -- **Visualization**: Use visualizations like box plots, bar charts, and heatmaps to compare model performance across metrics. -- **Report Findings**: Document the strengths and weaknesses of each model, highlighting where certain models excel or underperform. 
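The per-column fidelity metrics listed earlier (KS test, Wasserstein distance) are available directly in SciPy. A minimal sketch on a toy numeric column, assuming `scipy` is installed; the "real" and "synthetic" samples here are stand-ins for an actual column pair:

```python
# Per-column fidelity check: compare the empirical distributions of one real
# column and its synthetic counterpart. A small KS statistic (large p-value)
# and a small Wasserstein distance indicate similar distributions.
import numpy as np
from scipy.stats import ks_2samp, wasserstein_distance

rng = np.random.default_rng(42)
real = rng.normal(loc=40.0, scale=10.0, size=2000)    # stand-in for a real column
synth = rng.normal(loc=40.0, scale=10.0, size=2000)   # stand-in for its synthetic twin

ks_stat, p_value = ks_2samp(real, synth)
wd = wasserstein_distance(real, synth)

print(f"KS statistic: {ks_stat:.4f} (p-value: {p_value:.3f})")
print(f"Wasserstein distance: {wd:.4f}")
```

In a full benchmark this loop runs over every continuous column, with Jaccard similarity used for the categorical ones.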
diff --git a/conda-24.7.1-py39haa95532_0.tar.bz2 b/conda-24.7.1-py39haa95532_0.tar.bz2 deleted file mode 100644 index 694164a..0000000 Binary files a/conda-24.7.1-py39haa95532_0.tar.bz2 and /dev/null differ diff --git a/data/Adult/train_Adult_cleaned.csv b/data/Adult/train_Adult_cleaned.csv deleted file mode 100644 index e50f230..0000000 --- a/data/Adult/train_Adult_cleaned.csv +++ /dev/null @@ -1,2001 +0,0 @@ -age,workclass,fnlwgt,education,education-num,marital-status,occupation,relationship,race,sex,capital-gain,capital-loss,hours-per-week,native-country -39,6,77516,1,13,3,9,4,1,2,2174,0,40,1 -50,2,83311,1,13,1,5,3,1,2,0,0,13,1 -38,1,215646,4,9,2,7,4,1,2,0,0,40,1 -53,1,234721,3,7,1,7,3,5,2,0,0,40,1 -28,1,338409,1,13,1,6,1,5,1,0,0,40,13 -37,1,284582,11,14,1,5,1,1,1,0,0,40,1 -49,1,160187,8,5,6,3,4,5,1,0,0,16,19 -52,2,209642,4,9,1,5,3,1,2,0,0,45,1 -31,1,45781,11,14,3,6,4,1,1,14084,0,50,1 -42,1,159449,1,13,1,5,3,1,2,5178,0,40,1 -37,1,280464,2,10,1,5,3,5,2,0,0,80,1 -30,6,141297,1,13,1,6,3,2,2,0,0,40,8 -23,1,122272,1,13,3,9,2,1,1,0,0,30,1 -32,1,205019,6,12,3,4,4,5,2,0,0,50,1 -40,1,121772,7,11,1,2,3,2,2,0,0,40,0 -34,1,245487,9,4,1,11,3,3,2,0,0,45,21 -25,2,176756,4,9,3,10,2,1,2,0,0,35,1 -32,1,186824,4,9,3,8,6,1,2,0,0,40,1 -38,1,28887,3,7,1,4,3,1,2,0,0,50,1 -43,2,292175,11,14,2,5,6,1,1,0,0,45,1 -40,1,193524,14,16,1,6,3,1,2,0,0,60,1 -54,1,302146,4,9,4,3,6,5,1,0,0,20,1 -35,4,76845,8,5,1,10,3,5,2,0,0,40,1 -43,1,117037,3,7,1,11,3,1,2,0,2042,40,1 -59,1,109015,4,9,2,1,6,1,1,0,0,40,1 -56,5,216851,1,13,1,1,3,1,2,0,0,40,1 -19,1,168294,4,9,3,2,2,1,2,0,0,40,1 -54,0,180211,2,10,1,0,3,2,2,0,0,60,11 -39,1,367260,4,9,2,5,4,1,2,0,0,80,1 -49,1,193366,4,9,1,2,3,1,2,0,0,40,1 -23,5,190709,6,12,3,13,4,1,2,0,0,52,1 -20,1,266015,2,10,3,4,2,5,2,0,0,44,1 -45,1,386940,1,13,2,5,2,1,2,0,1408,40,1 -30,4,59951,2,10,1,9,2,1,2,0,0,40,1 -22,6,311512,2,10,1,3,3,5,2,0,0,15,1 -48,1,242406,3,7,3,8,6,1,2,0,0,40,4 -21,1,197200,2,10,3,8,2,1,2,0,0,40,1 -19,1,544091,4,9,7,9,1,1,1,0,0,25,1 
-31,1,84154,2,10,1,4,3,1,2,0,0,38,0 -48,2,265477,6,12,1,6,3,1,2,0,0,40,1 -31,1,507875,8,5,1,8,3,1,2,0,0,43,1 -53,2,88506,1,13,1,6,3,1,2,0,0,40,1 -24,1,172987,1,13,1,1,3,1,2,0,0,50,1 -49,1,94638,4,9,4,9,6,1,1,0,0,40,1 -25,1,289980,4,9,3,7,4,1,2,0,0,35,1 -57,4,337895,1,13,1,6,3,5,2,0,0,40,1 -53,1,144361,4,9,1,8,3,1,2,0,0,38,1 -44,1,128354,11,14,2,5,6,1,1,0,0,40,1 -41,6,101603,7,11,1,2,3,1,2,0,0,40,1 -29,1,271466,7,11,3,6,4,1,2,0,0,43,1 -25,1,32275,2,10,1,5,1,4,1,0,0,40,1 -18,1,226956,4,9,3,3,2,1,1,0,0,30,0 -47,1,51835,5,15,1,6,1,1,1,0,1902,60,15 -50,4,251585,1,13,2,5,4,1,2,0,0,55,1 -47,3,109832,4,9,2,5,4,1,2,0,0,60,1 -43,1,237993,2,10,1,1,3,1,2,0,0,40,1 -46,1,216666,15,3,1,8,3,1,2,0,0,40,21 -35,1,56352,7,11,1,3,3,1,2,0,0,40,4 -41,1,147372,4,9,1,9,3,1,2,0,0,48,1 -30,1,188146,4,9,1,8,3,1,2,5013,0,40,1 -30,1,59496,1,13,1,4,3,1,2,2407,0,40,1 -32,0,293936,9,4,6,0,4,1,2,0,0,40,0 -48,1,149640,4,9,1,11,3,1,2,0,0,40,1 -42,1,116632,14,16,1,6,3,1,2,0,0,45,1 -29,1,105598,2,10,2,1,4,1,2,0,0,58,1 -36,1,155537,4,9,1,2,3,1,2,0,0,40,1 -28,1,183175,2,10,2,9,4,1,1,0,0,40,1 -53,1,169846,4,9,1,9,1,1,1,0,0,40,1 -49,3,191681,2,10,1,5,3,1,2,0,0,50,1 -25,0,200681,2,10,3,0,2,1,2,0,0,40,1 -19,1,101509,2,10,3,6,2,1,2,0,0,32,1 -31,1,309974,1,13,4,4,2,5,1,0,0,40,1 -29,2,162298,1,13,1,4,3,1,2,0,0,70,1 -23,1,211678,2,10,3,8,4,1,2,0,0,40,1 -79,1,124744,2,10,1,6,5,1,2,0,0,20,1 -27,1,213921,4,9,3,3,2,1,2,0,0,40,21 -40,1,32214,6,12,1,9,3,1,2,0,0,40,1 -67,0,212759,13,6,1,0,3,1,2,0,0,2,1 -18,1,309634,3,7,3,3,2,1,1,0,0,22,1 -31,5,125927,9,4,1,10,3,1,2,0,0,40,1 -18,1,446839,4,9,3,4,4,1,2,0,0,30,1 -52,1,276515,1,13,1,3,3,1,2,0,0,40,13 -46,1,51618,4,9,1,3,1,1,1,0,0,40,1 -59,1,159937,4,9,1,4,3,1,2,0,0,48,1 -44,1,343591,4,9,2,2,4,1,1,14344,0,40,1 -53,1,346253,4,9,2,4,2,1,1,0,0,35,1 -49,5,268234,4,9,1,13,3,1,2,0,0,40,1 -33,1,202051,11,14,1,6,3,1,2,0,0,50,1 -30,1,54334,8,5,3,4,4,1,2,0,0,40,1 -43,4,410867,14,16,3,6,4,1,1,0,0,50,1 -57,1,249977,7,11,1,6,3,1,2,0,0,40,1 -37,1,286730,2,10,2,2,6,1,1,0,0,40,1 
-28,1,212563,2,10,2,8,6,5,1,0,0,25,1 -30,1,117747,4,9,1,4,1,2,1,0,1573,35,0 -34,5,226296,1,13,1,13,3,1,2,0,0,40,1 -29,5,115585,2,10,3,7,4,1,2,0,0,50,1 -48,2,191277,14,16,1,6,3,1,2,0,1902,60,1 -37,1,202683,2,10,1,4,3,1,2,0,0,48,1 -48,1,171095,6,12,2,5,6,1,1,0,0,40,3 -32,4,249409,4,9,3,3,2,5,2,0,0,40,1 -76,1,124191,11,14,1,5,3,1,2,0,0,40,1 -44,1,198282,1,13,1,5,3,1,2,15024,0,60,1 -47,2,149116,11,14,3,6,4,1,1,0,0,50,1 -20,1,188300,2,10,3,1,2,1,1,0,0,40,1 -29,1,103432,4,9,3,2,4,1,2,0,0,40,1 -32,3,317660,4,9,1,2,3,1,2,7688,0,40,1 -17,0,304873,13,6,3,0,2,1,1,34095,0,32,1 -30,1,194901,3,7,3,7,2,1,2,0,0,40,1 -31,5,189265,4,9,3,9,4,1,1,0,0,40,1 -42,1,124692,4,9,1,7,3,1,2,0,0,40,1 -24,1,432376,1,13,3,4,5,1,2,0,0,40,1 -38,1,65324,5,15,1,6,3,1,2,0,0,40,1 -56,2,335605,4,9,1,3,3,1,2,0,1887,50,5 -28,1,377869,2,10,1,4,1,1,1,4064,0,25,1 -36,1,102864,4,9,3,8,2,1,1,0,0,40,1 -53,1,95647,8,5,1,7,3,1,2,0,0,50,1 -56,3,303090,2,10,1,4,3,1,2,0,0,50,1 -49,5,197371,7,11,1,2,3,5,2,0,0,40,1 -55,1,247552,2,10,1,4,3,1,2,0,0,56,1 -22,1,102632,4,9,3,2,4,1,2,0,0,41,1 -21,1,199915,2,10,3,3,2,1,1,0,0,40,1 -40,1,118853,1,13,1,5,3,1,2,0,0,60,1 -30,1,77143,1,13,3,5,2,5,2,0,0,40,6 -29,6,267989,1,13,1,6,3,1,2,0,0,50,1 -19,1,301606,2,10,3,3,2,5,2,0,0,35,1 -47,1,287828,1,13,1,5,1,1,1,0,0,40,1 -20,1,111697,2,10,3,9,2,1,1,0,1719,28,1 -31,1,114937,6,12,1,9,3,1,2,0,0,40,1 -35,0,129305,4,9,1,0,3,1,2,0,0,40,1 -39,1,365739,2,10,2,2,4,1,2,0,0,40,1 -28,1,69621,6,12,3,4,4,1,1,0,0,60,1 -24,1,43323,4,9,3,3,4,1,1,0,1762,40,1 -38,2,120985,4,9,1,2,3,1,2,4386,0,35,1 -37,1,254202,1,13,1,4,3,1,2,0,0,50,1 -46,1,146195,6,12,2,1,4,5,1,0,0,36,1 -38,4,125933,11,14,1,6,3,1,2,0,0,40,14 -43,2,56920,4,9,1,2,3,1,2,0,0,60,1 -27,1,163127,7,11,1,9,1,1,1,0,0,35,1 -20,1,34310,2,10,3,4,2,1,2,0,0,20,1 -49,1,81973,2,10,1,2,3,2,2,0,0,40,1 -61,3,66614,4,9,1,2,3,1,2,0,0,40,1 -27,1,232782,2,10,3,4,2,1,1,0,0,40,1 -19,1,316868,2,10,3,3,2,1,2,0,0,30,21 -45,1,196584,7,11,3,6,4,1,1,0,1564,40,1 -70,1,105376,2,10,3,1,5,1,2,0,0,40,1 
-31,1,185814,4,9,3,11,6,5,1,0,0,30,1 -22,1,175374,2,10,1,3,3,1,2,0,0,24,1 -36,1,108293,4,9,4,3,6,1,1,0,0,24,1 -64,1,181232,3,7,1,2,3,1,2,0,2179,40,1 -43,0,174662,2,10,2,0,4,1,1,0,0,40,1 -47,5,186009,2,10,2,9,6,1,1,0,0,38,21 -34,1,198183,4,9,3,9,4,1,1,0,0,40,1 -33,1,163003,1,13,3,5,5,2,1,0,0,40,16 -21,1,296158,4,9,3,2,2,1,2,0,0,35,1 -52,0,252903,4,9,2,0,4,1,2,0,0,45,1 -48,1,187715,4,9,1,2,3,1,2,0,0,46,1 -23,1,214542,1,13,3,7,4,1,2,0,0,40,1 -71,2,494223,2,10,4,4,6,5,2,0,1816,2,1 -29,1,191535,4,9,2,2,4,1,2,0,0,60,1 -42,1,228456,1,13,4,3,5,5,2,0,0,50,1 -68,0,38317,12,2,2,0,4,1,1,0,0,20,1 -25,1,252752,4,9,3,3,6,1,1,0,0,40,1 -44,3,78374,11,14,2,5,6,2,1,0,0,40,1 -28,1,88419,4,9,3,5,4,2,1,0,0,40,3 -45,2,201080,11,14,1,4,3,1,2,0,0,40,1 -36,1,207157,2,10,2,3,6,1,1,0,0,40,21 -39,4,235485,6,12,3,5,4,1,2,0,0,42,1 -46,6,102628,11,14,4,13,6,1,2,0,0,40,1 -18,1,25828,3,7,3,7,2,1,2,0,0,16,1 -66,5,54826,7,11,4,6,4,1,1,0,0,20,1 -27,1,124953,4,9,3,3,4,1,2,0,1980,40,1 -28,6,175325,4,9,1,13,3,1,2,0,0,40,1 -51,1,96062,2,10,1,4,3,1,2,0,1977,40,1 -27,1,428030,1,13,3,2,4,1,2,0,0,50,1 -28,6,149624,1,13,1,6,3,1,2,0,0,40,1 -27,1,253814,4,9,6,4,6,1,1,0,0,25,1 -21,1,312956,4,9,3,2,2,5,2,0,0,40,1 -34,1,483777,4,9,3,7,4,5,2,0,0,40,1 -18,1,183930,4,9,3,3,2,1,2,0,0,12,1 -33,1,37274,1,13,1,6,3,1,2,0,0,65,1 -44,5,181344,2,10,1,5,3,5,2,0,0,38,1 -43,1,114580,2,10,2,9,4,1,1,0,0,40,1 -30,1,633742,2,10,3,2,4,5,2,0,0,45,1 -40,1,286370,9,4,1,8,3,1,2,0,0,40,21 -37,4,29054,2,10,1,9,3,1,2,0,0,42,1 -34,1,304030,4,9,1,9,3,5,2,0,0,40,1 -41,2,143129,1,13,2,5,4,1,1,0,0,40,1 -53,0,135105,1,13,2,0,4,1,1,0,0,50,1 -31,1,99928,11,14,1,6,1,1,1,0,0,50,1 -58,6,109567,14,16,1,6,3,1,2,0,0,1,1 -38,1,155222,2,10,2,8,4,5,1,0,0,28,1 -24,1,159567,2,10,1,8,3,1,2,0,0,40,1 -41,5,523910,1,13,1,2,3,5,2,0,0,40,1 -47,1,120939,2,10,1,1,3,1,2,0,0,45,1 -41,4,130760,1,13,1,1,3,1,2,0,0,24,1 -23,1,197387,15,3,1,11,5,1,2,0,0,40,21 -36,1,99374,2,10,2,2,4,1,2,0,0,40,1 -40,4,56795,11,14,3,5,4,1,1,14084,0,55,1 
-35,1,138992,11,14,1,6,5,1,2,7298,0,40,1 -24,2,32921,4,9,3,4,4,1,2,0,0,40,1 -26,1,397317,11,14,3,6,4,1,1,0,1876,40,1 -19,0,170653,4,9,3,0,2,1,2,0,0,40,17 -51,1,259323,1,13,1,5,3,1,2,0,0,50,1 -42,5,254817,2,10,3,6,4,1,1,0,1340,40,1 -37,6,48211,4,9,2,9,6,1,1,0,0,35,1 -18,1,140164,3,7,3,4,2,1,1,0,0,40,1 -36,1,128757,1,13,1,3,3,5,2,7298,0,36,1 -35,1,36270,4,9,2,2,4,1,2,0,0,60,1 -58,3,210563,4,9,1,4,1,1,1,15024,0,35,1 -17,1,65368,3,7,3,4,2,1,1,0,0,12,1 -44,5,160943,4,9,1,11,3,5,2,0,0,40,1 -37,1,208358,4,9,1,2,3,1,2,0,0,40,1 -35,1,153790,2,10,3,4,4,3,1,0,0,40,1 -60,1,85815,4,9,1,2,3,2,2,0,0,40,1 -54,3,125417,9,4,1,8,3,1,2,0,0,40,1 -37,1,635913,1,13,3,5,4,5,2,0,0,60,1 -50,1,313321,6,12,2,4,4,1,1,0,0,40,1 -38,1,182609,1,13,1,4,3,1,2,0,0,50,18 -45,1,109434,1,13,1,6,3,1,2,0,0,55,1 -25,1,255004,13,6,3,2,4,1,2,0,0,40,1 -31,1,197860,2,10,1,7,3,1,2,0,0,40,1 -64,0,187656,12,2,2,0,4,1,2,0,0,40,1 -90,1,51744,4,9,3,3,4,5,2,0,2206,40,1 -54,1,176681,4,9,1,9,3,5,2,0,0,20,1 -53,5,140359,16,1,3,8,4,1,1,0,0,35,1 -18,1,243313,4,9,3,4,2,1,1,0,0,40,1 -60,0,24215,13,6,2,0,4,3,1,0,0,10,1 -66,2,167687,4,9,1,10,3,1,2,1409,0,50,1 -75,1,314209,7,11,4,9,4,1,1,0,0,20,29 -65,1,176796,4,9,2,9,4,1,1,0,0,40,1 -35,1,538583,3,7,4,11,4,5,2,3674,0,40,1 -41,1,130408,4,9,2,4,6,5,1,0,0,38,1 -25,1,159732,2,10,3,9,4,1,2,0,0,42,1 -33,1,110978,2,10,2,2,5,4,1,0,0,40,1 -28,1,76714,5,15,3,6,4,1,2,0,0,55,1 -59,6,268700,4,9,1,3,3,1,2,0,0,40,1 -40,6,170525,2,10,3,9,4,1,1,0,0,38,1 -41,1,180138,1,13,1,5,3,1,2,0,0,50,14 -38,5,115076,11,14,1,5,3,1,2,0,0,70,1 -23,1,115458,4,9,3,11,2,1,2,0,0,40,1 -40,1,347890,1,13,1,6,3,1,2,0,0,40,1 -41,2,196001,4,9,1,3,1,1,1,0,0,20,1 -24,6,273905,6,12,1,13,3,1,2,0,0,50,1 -20,0,119156,2,10,3,0,2,1,2,0,0,20,1 -38,1,179488,2,10,2,2,4,1,2,0,1741,40,1 -56,1,203580,4,9,1,9,3,1,2,0,0,35,0 -58,1,236596,4,9,1,9,3,1,2,0,0,45,1 -32,1,183916,4,9,3,3,4,1,1,0,0,34,1 -40,1,207578,6,12,1,1,3,1,2,0,1977,60,1 -45,1,153141,4,9,1,9,3,1,2,0,0,40,0 -41,1,112763,5,15,1,6,1,1,1,0,0,40,1 
-42,1,390781,1,13,1,9,1,5,1,0,0,40,1 -59,5,171328,13,6,4,3,6,5,1,0,0,30,1 -19,5,27382,2,10,3,9,2,1,2,0,0,40,1 -58,1,259014,2,10,3,11,4,1,2,0,0,20,1 -42,2,303044,4,9,1,10,3,2,2,0,0,40,2 -20,1,117789,4,9,3,3,2,1,1,0,0,40,1 -32,1,172579,4,9,4,3,4,1,1,0,0,30,1 -45,1,187666,7,11,4,5,4,1,1,0,0,45,1 -50,1,204518,9,4,2,2,4,1,2,0,0,40,1 -36,1,150042,1,13,2,6,2,1,1,0,0,40,1 -45,1,98092,4,9,1,4,3,1,2,0,0,60,1 -17,1,245918,3,7,3,3,2,1,2,0,0,12,1 -59,1,146013,2,10,1,4,3,1,2,4064,0,40,1 -26,1,378322,3,7,1,2,3,1,2,0,0,40,1 -37,3,257295,2,10,1,5,3,2,2,0,0,75,34 -19,0,218956,2,10,3,0,2,1,2,0,0,24,5 -64,1,21174,4,9,1,5,3,1,2,0,0,40,1 -33,1,185480,1,13,3,6,4,1,1,0,0,45,1 -33,1,222205,4,9,1,2,1,1,1,0,0,40,1 -61,1,69867,4,9,1,5,3,1,2,0,0,40,1 -17,1,191260,8,5,3,3,2,1,2,1055,0,24,1 -50,2,30653,11,14,1,10,3,1,2,2407,0,98,1 -27,5,209109,11,14,3,6,2,1,2,0,0,35,1 -30,1,70377,4,9,2,6,2,1,1,0,0,40,1 -43,1,477983,4,9,1,7,3,5,2,0,0,40,1 -44,1,170924,2,10,1,2,3,1,2,7298,0,40,1 -35,1,190174,2,10,3,5,4,1,1,0,0,40,1 -25,1,193787,2,10,3,1,2,1,1,0,0,40,1 -24,1,279472,2,10,1,8,1,1,1,7298,0,48,1 -22,1,34918,1,13,3,6,4,1,1,0,0,15,6 -42,5,97688,2,10,1,2,3,1,2,5178,0,40,1 -34,1,175413,6,12,2,4,6,5,1,0,0,45,1 -60,1,173960,1,13,2,6,4,1,1,0,0,42,1 -21,1,205759,4,9,3,7,2,1,2,0,0,40,1 -57,4,425161,11,14,1,4,3,1,2,15024,0,40,1 -41,1,220531,5,15,1,6,3,1,2,0,0,60,1 -50,1,176609,2,10,2,3,4,1,2,0,0,45,1 -25,1,371987,1,13,3,5,4,1,1,0,0,40,1 -50,1,193884,9,4,1,2,3,1,2,0,0,40,26 -36,1,200352,1,13,1,6,3,1,2,0,0,45,1 -31,1,127595,4,9,2,6,4,1,1,0,0,40,1 -29,5,220419,1,13,3,13,4,1,2,0,0,56,1 -21,1,231931,2,10,3,4,2,1,2,0,0,45,1 -27,1,248402,1,13,3,1,6,5,1,0,0,40,1 -65,1,111095,4,9,1,11,3,1,2,0,0,16,1 -37,3,57424,1,13,2,4,4,1,1,0,0,60,1 -39,0,157443,11,14,1,0,1,2,1,3464,0,40,0 -24,1,278130,4,9,3,2,2,1,2,0,0,40,1 -38,1,169469,4,9,2,4,4,1,2,0,0,80,1 -48,1,146268,1,13,1,9,3,1,2,7688,0,40,1 -21,1,153718,2,10,3,3,4,2,1,0,0,25,1 -31,1,217460,4,9,1,11,3,1,2,0,0,45,1 -55,1,238638,4,9,1,4,3,1,2,4386,0,40,1 
-24,1,303296,2,10,1,9,1,2,1,0,0,40,25 -43,1,173321,4,9,2,9,4,1,1,0,0,40,1 -26,1,193945,6,12,3,1,4,1,2,0,0,45,1 -46,1,83082,6,12,3,6,4,1,1,0,0,33,1 -35,1,193815,6,12,1,9,3,1,2,0,0,40,1 -41,3,34987,2,10,1,10,3,1,2,0,0,54,1 -26,1,59306,1,13,3,4,4,1,2,0,0,40,1 -34,1,142897,11,14,1,5,3,2,2,7298,0,35,27 -19,0,860348,2,10,3,0,2,5,1,0,0,25,1 -36,2,205607,1,13,2,6,4,5,1,0,0,40,1 -22,1,199698,2,10,3,4,2,1,2,0,0,15,1 -24,1,191954,2,10,3,8,4,1,2,0,0,40,1 -77,2,138714,2,10,1,4,3,1,2,0,0,40,1 -22,1,399087,15,3,1,8,5,1,1,0,0,40,21 -29,1,423158,2,10,3,1,4,1,1,0,0,40,1 -62,1,159841,4,9,4,3,4,1,1,0,0,24,1 -39,2,174308,4,9,1,5,3,1,2,0,0,40,1 -43,1,50356,2,10,1,2,3,1,2,0,1485,50,1 -35,1,186110,4,9,2,11,4,1,2,0,0,45,1 -29,1,200381,3,7,3,5,4,1,1,0,0,40,1 -76,2,174309,11,14,1,2,3,1,2,0,0,10,1 -63,2,78383,4,9,1,10,3,1,2,0,0,45,1 -23,0,211601,7,11,3,0,2,5,1,0,0,15,1 -43,1,187728,2,10,1,6,1,1,1,0,1887,50,1 -58,2,321171,4,9,1,7,3,1,2,0,0,40,1 -66,1,127921,4,9,3,11,4,1,2,2050,0,55,1 -41,1,206565,2,10,3,2,4,5,2,0,0,45,1 -26,1,224563,1,13,3,9,4,1,2,0,0,40,1 -47,1,178686,7,11,3,3,4,1,2,0,0,40,1 -55,5,98545,13,6,1,9,3,1,2,0,0,40,1 -53,1,242606,4,9,1,11,3,1,2,0,0,40,1 -17,1,270942,15,3,3,3,5,1,2,0,0,48,21 -30,1,94235,4,9,3,2,5,1,2,0,0,40,1 -49,1,71195,11,14,3,6,4,1,2,0,0,60,1 -19,1,104112,4,9,3,4,6,5,2,0,0,30,28 -45,1,261192,4,9,1,3,3,5,2,0,0,40,1 -26,1,94936,6,12,3,4,4,1,2,0,0,40,1 -38,1,296478,7,11,1,2,3,1,2,7298,0,40,1 -36,6,119272,4,9,1,13,3,1,2,7298,0,40,1 -33,1,85043,4,9,3,10,4,1,2,0,0,20,1 -22,6,293364,2,10,3,13,2,5,1,0,0,40,1 -43,2,241895,1,13,3,4,4,1,2,0,0,42,1 -67,0,36135,3,7,1,0,3,1,2,0,0,8,1 -30,0,151989,7,11,2,0,6,1,1,0,0,40,1 -56,1,101128,6,12,6,3,4,1,2,0,0,25,14 -31,1,156464,1,13,3,6,2,1,2,0,0,25,1 -33,1,117963,1,13,1,5,3,1,2,0,0,40,1 -26,1,192262,4,9,1,3,3,1,2,0,0,40,1 -33,1,111363,1,13,1,5,3,1,2,0,0,40,1 -46,5,329752,3,7,1,11,3,1,2,0,0,30,1 -59,0,372020,1,13,1,0,3,1,2,0,0,40,1 -38,4,95432,4,9,1,9,3,1,2,0,0,40,1 -65,1,161400,3,7,4,3,6,4,2,0,0,40,1 
-40,1,96129,7,11,1,1,3,1,2,0,0,40,1 -42,1,111949,4,9,1,9,1,1,1,0,0,35,1 -26,2,117125,8,5,1,2,3,1,2,0,0,40,21 -36,1,348022,13,6,1,3,1,1,1,0,0,24,1 -62,1,270092,11,14,1,6,3,1,2,0,0,40,1 -43,1,180609,1,13,1,11,3,1,2,0,0,45,1 -43,1,174575,1,13,2,5,4,1,2,0,1564,45,1 -22,1,410439,4,9,6,4,4,1,2,0,0,55,1 -28,1,92262,4,9,1,2,3,1,2,0,0,40,1 -56,2,183081,2,10,1,4,3,1,2,0,0,45,1 -22,1,362589,6,12,3,4,4,1,1,0,0,15,1 -57,1,212448,1,13,2,5,4,1,1,0,0,45,1 -39,1,481060,4,9,2,4,6,1,1,0,0,40,1 -26,4,185885,2,10,3,9,6,1,1,0,0,15,1 -17,1,89821,3,7,3,3,2,1,2,0,0,10,1 -40,6,184018,7,11,1,8,3,1,2,0,0,38,1 -45,1,256649,4,9,1,8,3,5,2,0,0,40,1 -44,1,160323,4,9,3,2,4,5,2,0,0,40,1 -20,5,350845,2,10,3,9,2,1,1,0,0,10,1 -33,1,267404,4,9,1,2,1,1,1,0,0,40,1 -23,1,35633,2,10,3,4,4,1,2,0,0,40,1 -46,2,80914,11,14,2,5,4,1,2,0,0,30,1 -38,1,172927,4,9,1,4,3,1,2,0,0,50,1 -54,1,174319,4,9,2,11,4,1,2,0,0,40,1 -46,1,214955,15,3,2,2,4,1,1,0,2339,45,1 -25,1,344991,2,10,1,2,3,1,2,0,0,40,1 -46,1,108699,2,10,2,4,4,1,1,0,0,40,1 -36,5,117312,2,10,1,11,1,1,1,0,0,40,1 -23,1,396099,4,9,3,3,4,1,1,0,0,25,1 -29,1,134152,4,9,4,8,4,1,2,0,0,40,1 -44,1,162028,2,10,1,9,1,1,1,0,2415,6,1 -19,1,25429,2,10,3,9,2,1,1,0,0,16,1 -19,1,232392,4,9,3,3,5,1,1,0,0,40,1 -35,1,220098,4,9,1,3,1,1,1,0,0,40,1 -27,1,301302,1,13,3,2,4,1,2,0,0,50,1 -46,2,277946,6,12,4,2,4,1,2,0,0,40,1 -34,6,98101,1,13,1,5,3,1,2,7688,0,45,0 -34,1,196164,4,9,3,3,4,1,1,0,0,35,1 -44,1,115562,2,10,1,1,3,1,2,0,0,40,1 -45,1,96975,2,10,2,7,6,1,1,0,0,40,1 -20,0,137300,4,9,3,0,5,1,1,0,0,35,1 -25,1,86872,1,13,1,5,3,1,2,0,0,55,1 -52,3,132178,1,13,1,5,3,1,2,0,0,50,1 -20,1,416103,2,10,3,7,2,1,2,0,0,40,1 -28,1,108574,2,10,3,3,4,1,1,0,0,40,1 -50,6,288353,1,13,1,13,3,1,2,0,0,40,1 -34,1,227689,7,11,2,1,4,1,1,0,0,64,1 -28,1,166481,9,4,1,7,3,4,2,0,2179,40,4 -41,1,445382,11,14,1,5,3,1,2,0,1977,65,1 -28,1,110145,4,9,1,9,3,1,2,0,0,40,1 -46,2,317253,4,9,1,2,3,1,2,0,0,25,1 -28,0,123147,2,10,1,0,1,1,1,0,1887,40,1 -32,1,364657,2,10,1,5,3,1,2,0,0,50,1 -41,5,42346,2,10,2,3,4,5,1,0,0,24,1 
-24,1,241951,4,9,3,7,6,5,1,0,0,40,1 -33,1,118500,2,10,2,5,6,1,1,0,0,40,1 -46,1,188386,14,16,1,5,3,1,2,15024,0,60,1 -31,6,1033222,2,10,1,8,3,1,2,0,0,40,1 -35,1,92440,10,8,2,2,4,1,2,0,0,50,1 -52,1,190762,12,2,1,8,3,1,2,0,0,40,21 -30,1,426017,3,7,3,3,2,1,1,0,0,19,1 -34,5,243867,3,7,4,8,4,5,2,0,0,40,1 -34,6,240283,4,9,2,11,6,1,1,0,0,40,1 -20,1,61777,2,10,1,4,3,1,2,0,0,30,1 -17,1,175024,3,7,3,7,2,1,2,2176,0,18,1 -32,6,92003,1,13,1,5,1,1,1,0,0,40,1 -29,1,188401,4,9,2,10,4,1,2,0,0,40,1 -33,1,228528,13,6,3,2,6,1,1,0,0,35,1 -25,1,133373,4,9,1,11,3,1,2,0,0,60,1 -36,4,255191,11,14,3,6,4,1,2,0,1408,40,1 -23,1,204653,4,9,3,7,4,1,2,0,0,72,24 -63,3,222289,4,9,1,5,3,1,2,0,0,40,1 -47,5,287480,11,14,1,6,3,1,2,0,0,40,1 -80,0,107762,4,9,4,0,4,1,2,0,0,24,1 -17,0,202521,3,7,3,0,2,1,2,0,0,40,1 -40,2,204116,1,13,6,6,4,1,1,2174,0,40,1 -30,1,29662,6,12,1,3,1,1,1,0,0,25,1 -27,1,116358,2,10,3,2,2,2,2,0,1980,40,16 -33,1,208405,11,14,1,6,3,1,2,0,0,50,1 -34,5,284843,4,9,3,10,4,5,2,594,0,60,1 -34,5,117018,2,10,3,13,4,1,1,0,0,40,1 -23,1,81281,2,10,1,9,3,1,2,0,0,40,1 -42,5,340148,2,10,1,9,1,1,1,0,0,40,1 -29,1,363425,1,13,3,6,4,1,2,0,0,40,1 -45,1,45857,4,9,2,9,4,1,1,0,0,28,1 -24,4,191073,4,9,3,14,2,1,2,0,0,40,1 -44,1,116632,2,10,1,6,3,1,2,0,0,40,1 -27,1,405855,8,5,3,2,5,1,2,0,0,40,21 -20,1,298227,2,10,3,4,4,1,2,0,0,35,1 -44,1,290521,4,9,4,5,6,5,1,0,0,40,1 -51,1,56915,4,9,1,2,3,3,2,0,0,40,1 -20,1,146538,4,9,1,8,3,1,2,0,0,40,1 -17,0,258872,3,7,3,0,2,1,1,0,0,5,1 -19,1,206399,4,9,3,8,2,5,1,0,0,40,1 -45,3,197332,2,10,1,2,3,1,2,0,0,55,1 -60,1,245062,4,9,1,2,3,1,2,0,0,40,1 -42,1,197583,6,12,1,5,3,5,2,0,0,40,0 -44,2,234885,4,9,1,4,1,1,1,0,0,40,1 -40,1,72887,7,11,1,8,3,2,2,0,0,40,1 -30,1,180374,4,9,1,5,1,1,1,0,0,40,1 -38,1,351299,2,10,1,11,3,5,2,0,0,50,1 -23,1,54012,4,9,1,11,3,1,2,0,0,60,1 -32,0,115745,2,10,1,0,3,1,2,0,0,40,1 -44,1,116632,6,12,3,10,2,1,2,0,0,40,1 -54,5,288825,4,9,1,11,3,5,2,0,0,40,1 -32,1,132601,1,13,1,6,3,1,2,0,0,50,1 -50,1,193374,12,2,6,2,6,1,2,0,0,40,1 -24,1,170070,1,13,3,1,4,1,1,0,0,20,1 
-37,1,126708,4,9,1,9,1,1,1,0,0,60,1 -52,1,35598,4,9,2,11,6,1,2,0,0,40,1 -38,1,33983,2,10,1,11,3,1,2,0,0,40,1 -49,1,192776,11,14,1,5,3,1,2,0,1977,45,1 -30,1,118551,1,13,1,1,1,1,1,0,0,16,1 -60,1,201965,2,10,3,6,6,1,2,0,0,40,1 -22,0,139883,2,10,3,0,2,1,2,0,0,40,1 -35,1,285020,4,9,3,2,4,1,2,0,0,40,1 -30,1,303990,4,9,3,11,4,1,2,0,0,60,1 -67,1,49401,7,11,2,3,4,1,1,0,0,24,1 -46,1,279196,1,13,3,2,4,1,1,0,0,40,1 -17,1,211870,8,5,3,3,4,1,2,0,0,6,1 -22,1,281432,2,10,3,7,2,1,2,0,0,30,1 -27,1,161155,13,6,3,3,4,1,2,0,0,40,1 -23,1,197904,4,9,3,3,6,1,1,0,0,35,1 -33,1,111746,6,12,1,2,3,1,2,0,0,45,21 -43,2,170721,2,10,1,2,3,1,2,0,0,20,1 -28,6,70100,1,13,3,6,4,1,2,0,0,20,1 -41,1,193626,4,9,6,2,6,1,1,0,0,40,1 -52,0,271749,10,8,3,0,5,5,2,594,0,40,1 -25,1,189775,2,10,6,9,2,5,1,0,0,20,1 -63,0,401531,12,2,1,0,3,1,2,0,0,35,1 -59,5,286967,4,9,1,11,3,1,2,0,0,45,1 -45,5,164427,1,13,2,6,6,1,1,0,0,40,1 -38,1,91039,1,13,1,4,3,1,2,15024,0,60,1 -40,1,347934,4,9,3,3,4,1,1,0,0,35,1 -46,4,371373,4,9,2,9,4,1,2,0,0,40,1 -35,1,32220,6,12,3,5,4,1,1,0,0,60,1 -34,1,187251,4,9,2,6,6,1,1,0,0,25,1 -33,1,178107,1,13,3,2,2,1,2,0,0,20,1 -41,1,343121,4,9,2,9,6,1,1,0,0,36,1 -20,1,262749,2,10,3,8,2,1,2,0,0,40,1 -23,1,403107,15,3,3,3,2,1,2,0,0,40,36 -26,1,64293,2,10,3,6,4,1,1,0,0,35,1 -72,0,303588,4,9,1,0,3,1,2,0,0,20,1 -23,5,324960,4,9,3,10,4,1,2,0,0,40,18 -62,5,114060,4,9,1,11,3,1,2,0,0,40,1 -52,1,48925,2,10,1,9,3,1,2,0,0,40,1 -58,1,180980,2,10,2,3,6,1,1,0,0,42,23 -25,1,181054,1,13,3,4,4,1,1,0,0,40,1 -24,1,388093,1,13,3,5,4,5,2,0,0,40,1 -19,1,249609,2,10,3,13,2,1,2,0,0,8,1 -43,1,112131,2,10,1,5,3,1,2,0,0,40,1 -47,5,543162,4,9,4,9,6,5,1,0,0,40,1 -39,1,91996,4,9,2,3,6,1,1,0,0,40,1 -49,1,141944,7,11,6,7,6,1,2,0,1380,42,1 -53,0,251804,15,3,4,0,6,5,1,0,0,30,1 -32,1,37070,7,11,3,2,4,1,2,0,0,45,1 -34,1,337587,2,10,1,6,3,1,2,0,0,50,1 -28,1,189346,4,9,2,2,4,1,2,0,0,45,1 -57,0,222216,7,11,4,0,6,1,1,0,0,38,1 -25,1,267044,2,10,3,9,4,3,1,0,0,20,1 -20,0,214635,2,10,3,0,2,1,2,0,0,24,1 -21,0,204226,2,10,3,0,6,1,1,0,0,35,1 
-34,1,108116,1,13,1,6,3,1,2,0,0,50,1 -38,3,99146,1,13,1,5,3,1,2,15024,0,80,1 -50,1,196232,4,9,1,5,3,1,2,7688,0,50,1 -24,5,248344,2,10,2,7,4,5,2,0,0,50,1 -37,5,186035,2,10,1,1,3,1,2,0,0,45,1 -44,1,177905,2,10,2,8,6,1,2,0,0,58,1 -28,1,85812,2,10,1,4,1,1,1,0,0,40,1 -42,1,221172,1,13,1,5,3,1,2,0,0,40,1 -74,1,99183,2,10,2,9,4,1,1,0,0,9,1 -38,2,190387,4,9,1,2,3,1,2,0,0,50,1 -44,2,202692,11,14,1,6,3,1,2,0,0,40,1 -44,1,109339,3,7,2,8,6,4,1,0,0,46,4 -26,1,108658,4,9,3,8,4,1,2,0,0,40,1 -36,1,197202,4,9,1,3,3,5,2,0,0,40,1 -41,1,101739,6,12,1,5,1,1,1,0,0,50,1 -67,1,231559,5,15,1,6,3,1,2,20051,0,48,1 -39,5,207853,10,8,1,1,3,1,2,0,0,50,1 -57,1,190942,12,2,4,12,4,5,1,0,0,30,1 -29,1,102345,7,11,4,2,4,1,2,0,0,40,1 -31,3,41493,1,13,3,10,4,1,1,0,0,45,1 -34,0,190027,4,9,3,0,6,5,1,0,0,40,1 -44,1,210525,2,10,1,11,3,1,2,0,0,40,1 -29,1,133937,14,16,3,6,2,1,2,0,0,40,1 -30,1,237903,2,10,3,7,6,1,1,0,0,40,1 -27,1,163862,4,9,3,11,4,1,2,0,0,40,1 -27,1,201872,2,10,1,4,3,1,2,0,0,50,1 -32,1,84179,4,9,3,7,4,1,1,0,0,45,1 -58,1,51662,13,6,1,3,1,1,1,0,0,8,1 -35,5,233327,2,10,1,13,3,1,2,0,0,40,1 -21,1,259510,4,9,3,7,2,1,2,0,0,36,1 -28,1,184831,2,10,3,2,6,1,2,0,0,40,1 -46,2,245724,2,10,2,5,4,1,2,0,0,50,1 -36,2,27053,4,9,4,3,6,1,1,0,0,40,1 -72,1,205343,3,7,4,9,6,1,1,0,0,40,1 -35,1,229328,4,9,1,8,1,5,1,0,0,40,1 -33,4,319560,7,11,2,2,6,5,1,0,0,40,1 -69,1,136218,3,7,3,8,4,1,1,0,0,40,1 -35,1,54576,4,9,1,8,3,1,2,0,0,40,1 -31,1,323069,4,9,4,9,6,1,1,0,0,20,0 -34,1,148291,4,9,1,1,1,1,1,0,0,32,1 -30,1,152453,3,7,1,3,3,1,2,0,0,40,21 -28,1,114053,1,13,3,11,4,1,2,0,0,55,1 -54,1,212960,1,13,1,4,3,1,2,0,0,35,1 -47,1,264052,2,10,1,6,3,1,2,0,0,50,1 -24,1,82804,4,9,3,7,6,5,1,0,0,40,1 -52,2,334273,1,13,1,6,3,1,2,0,0,60,1 -20,1,27337,4,9,3,7,2,3,2,0,0,48,1 -43,3,188436,11,14,1,5,3,1,2,5013,0,45,1 -45,1,433665,9,4,4,3,6,1,1,0,0,40,21 -29,2,110663,4,9,4,2,4,1,2,0,0,35,1 -47,1,87490,11,14,2,5,6,1,2,0,0,42,1 -24,1,354351,4,9,3,2,2,1,2,0,0,40,1 -51,1,95469,4,9,1,6,3,1,2,0,0,40,1 -17,1,242718,3,7,3,4,2,1,2,0,0,12,1 
-37,1,22463,7,11,1,2,3,1,2,0,1977,40,1 -27,1,158156,14,16,3,6,4,1,1,0,0,70,1 -29,1,350162,1,13,1,5,1,1,2,0,0,40,1 -18,0,165532,10,8,3,0,2,1,2,0,0,25,1 -36,2,28738,6,12,2,4,6,1,1,0,0,35,1 -58,5,283635,1,13,3,6,4,1,1,0,0,40,1 -26,2,86646,2,10,1,2,3,1,2,0,0,45,1 -65,0,195733,2,10,1,0,3,1,2,0,0,30,1 -57,1,69884,4,9,1,4,3,1,2,0,0,40,1 -59,1,199713,4,9,1,2,3,1,2,0,0,40,1 -27,1,181659,4,9,1,8,3,1,2,0,0,40,1 -31,2,340939,1,13,3,1,4,1,2,0,0,40,1 -21,1,197747,2,10,3,4,2,1,1,0,0,24,1 -29,1,34292,2,10,3,9,4,1,2,0,0,60,1 -18,1,156764,3,7,3,3,2,1,2,0,0,40,1 -52,1,25826,13,6,1,2,3,1,2,0,1887,47,1 -57,3,103948,1,13,2,6,4,1,2,0,0,80,1 -42,0,137390,4,9,1,0,3,1,2,0,0,40,1 -55,0,105138,4,9,1,0,1,2,1,0,0,40,1 -60,1,39352,9,4,3,11,4,1,2,0,0,48,1 -31,1,168387,1,13,1,6,3,1,2,7688,0,40,5 -23,1,117789,1,13,3,9,2,1,1,0,0,40,1 -27,1,267147,4,9,3,4,2,1,2,0,0,40,1 -23,0,99399,2,10,3,0,6,3,1,0,0,25,1 -42,2,214242,5,15,1,6,3,1,2,0,1902,50,1 -25,1,200408,2,10,3,1,4,1,2,2174,0,40,1 -49,1,136455,1,13,3,6,4,1,1,0,0,45,1 -32,1,239824,1,13,3,1,4,1,2,0,0,40,1 -19,1,217039,2,10,3,9,2,1,2,0,0,28,1 -60,1,51290,9,4,2,3,4,1,1,0,0,40,1 -42,5,175674,8,5,1,3,3,1,2,0,0,40,1 -35,2,194404,6,12,3,10,4,1,2,0,0,40,1 -48,1,45612,4,9,3,9,6,5,1,0,0,37,1 -51,1,410114,11,14,1,2,3,1,2,0,0,40,1 -29,1,182521,4,9,3,2,4,3,2,0,0,40,1 -36,5,339772,4,9,4,5,4,5,2,0,0,40,1 -17,1,169658,13,6,3,3,2,1,1,0,0,21,1 -52,1,200853,11,14,2,6,4,1,1,6849,0,60,1 -24,1,247564,4,9,3,2,4,1,2,0,0,40,1 -24,1,249909,4,9,1,7,3,1,2,0,0,50,1 -26,5,208122,1,13,3,6,2,1,2,1055,0,40,1 -27,1,109881,1,13,3,3,2,1,1,0,0,20,1 -39,1,207824,4,9,3,7,2,1,2,0,0,60,1 -30,1,369027,4,9,1,11,3,5,2,0,0,45,1 -50,2,114117,4,9,1,5,3,1,2,0,0,32,1 -52,3,51048,1,13,1,4,3,1,2,0,0,55,1 -46,1,102388,5,15,1,6,3,1,2,15024,0,45,1 -23,1,190483,1,13,3,4,2,1,1,0,0,20,1 -45,1,462440,3,7,4,3,4,5,1,0,0,20,1 -65,1,109351,8,5,4,12,6,5,1,0,0,24,1 -29,1,34383,7,11,1,11,3,1,2,0,0,55,1 -47,1,241832,8,5,6,7,6,1,2,0,0,40,36 -30,1,124187,4,9,3,10,2,5,2,0,0,60,1 -34,1,153614,4,9,1,4,3,1,2,0,0,45,1 
-38,2,267556,4,9,1,4,3,1,2,0,0,64,1 -33,1,205469,2,10,1,5,3,1,2,0,0,40,1 -49,1,268090,1,13,1,4,3,1,2,0,0,26,1 -47,2,165039,2,10,3,3,6,5,1,0,0,40,1 -49,5,120451,13,6,4,3,6,5,1,0,0,40,1 -43,1,154374,2,10,1,2,3,1,2,15024,0,60,1 -30,1,103649,1,13,1,9,1,5,1,0,0,40,1 -58,2,35723,4,9,1,2,3,1,2,0,0,60,1 -19,1,262601,4,9,3,3,2,1,1,0,0,14,1 -21,1,226181,1,13,3,7,4,1,2,0,0,40,1 -33,1,175697,1,13,1,5,3,1,2,15024,0,60,1 -47,3,248145,15,3,1,11,3,1,2,0,0,50,13 -52,2,289436,14,16,1,6,3,1,2,0,0,60,1 -26,1,75654,4,9,2,2,4,1,2,0,0,55,1 -60,1,199378,4,9,1,2,3,1,2,0,0,40,1 -21,1,160968,2,10,3,7,2,1,2,0,0,40,1 -36,1,188563,2,10,1,4,3,1,2,5178,0,50,1 -31,1,55849,2,10,1,2,3,1,2,0,0,45,1 -50,3,195322,14,16,4,6,4,1,2,0,0,40,1 -31,5,402089,4,9,2,9,6,1,1,0,0,40,1 -71,1,78277,4,9,1,2,3,1,2,0,0,15,1 -58,0,158611,4,9,1,0,3,1,2,0,0,50,1 -30,6,169496,1,13,1,1,3,1,2,0,0,40,1 -20,1,130959,2,10,3,3,2,1,2,0,0,20,1 -24,1,556660,4,9,3,5,5,1,2,4101,0,50,1 -35,1,292472,14,16,1,6,3,2,2,0,0,40,27 -38,6,143774,2,10,4,5,4,1,1,0,0,45,1 -27,1,288341,4,9,3,8,2,1,1,0,0,32,1 -29,6,71592,2,10,3,9,6,2,1,0,0,40,16 -70,0,167358,8,5,4,0,6,1,1,1111,0,15,1 -34,1,106742,4,9,1,2,3,1,2,0,0,45,1 -44,1,219288,9,4,1,2,3,1,2,0,0,35,1 -43,1,174524,4,9,1,8,3,1,2,0,0,40,1 -44,2,335183,10,8,1,7,3,5,2,0,0,40,1 -35,1,261293,11,14,3,4,4,1,2,0,0,60,1 -27,1,111900,2,10,3,6,4,1,2,0,0,40,1 -43,5,194360,11,14,3,6,4,1,2,0,0,38,1 -20,1,81145,2,10,3,4,4,1,1,0,0,25,1 -42,1,341204,6,12,2,6,6,1,1,8614,0,40,1 -27,6,249362,2,10,1,11,3,1,2,3411,0,40,1 -42,1,247019,4,9,1,2,3,5,2,0,0,40,1 -20,0,114746,3,7,6,0,2,2,1,0,1762,40,11 -24,1,172146,8,5,3,8,4,1,2,0,1721,40,1 -48,4,110457,2,10,2,5,4,1,2,0,0,40,1 -17,0,80077,3,7,3,0,2,1,1,0,0,20,1 -17,2,368700,3,7,3,10,2,1,2,0,0,10,1 -33,1,182556,1,13,1,5,3,1,2,0,0,40,1 -50,3,219420,4,9,1,4,3,1,2,0,0,50,1 -22,1,240817,4,9,3,4,2,1,1,2597,0,40,1 -17,1,102726,10,8,3,3,2,1,2,0,0,16,1 -32,1,226267,2,10,1,9,3,1,2,0,0,40,21 -31,1,125457,4,9,3,2,4,1,2,0,0,45,1 -58,2,204021,4,9,4,5,4,1,2,0,0,50,1 
-29,5,92262,4,9,3,13,2,1,2,0,0,48,1 -37,1,161141,7,11,1,2,3,1,2,0,0,40,21 -34,2,190290,4,9,1,2,3,1,2,0,0,40,1 -23,5,430828,2,10,4,5,6,5,2,0,0,40,1 -18,6,59342,3,7,3,9,2,1,1,0,0,5,1 -34,1,136721,4,9,2,5,4,1,1,0,0,40,1 -66,0,149422,9,4,3,0,4,1,2,0,0,4,1 -45,5,86644,1,13,1,6,1,1,1,0,0,55,1 -41,1,195124,11,14,3,5,4,1,2,0,0,35,24 -26,1,167350,4,9,3,3,5,1,2,0,0,30,1 -54,5,113000,2,10,1,10,3,1,2,0,0,40,1 -24,1,140027,2,10,3,8,2,5,1,0,0,45,1 -42,1,262425,2,10,1,6,3,1,2,0,0,50,1 -20,1,316702,2,10,3,6,2,1,2,0,0,20,1 -23,6,335453,1,13,3,1,4,1,1,0,0,20,1 -25,0,202480,6,12,3,0,5,1,2,0,0,45,1 -35,1,203628,11,14,3,6,4,1,2,0,0,60,1 -31,1,118710,11,14,1,1,3,1,2,0,1902,40,1 -30,1,189620,1,13,3,6,2,1,1,0,0,40,18 -19,1,475028,4,9,3,4,2,1,1,0,0,20,1 -36,5,110866,1,13,3,6,4,1,2,0,0,50,1 -31,1,243605,1,13,4,4,6,1,1,0,1380,40,13 -21,1,163870,2,10,3,7,2,1,2,0,0,30,1 -31,2,80145,2,10,1,2,3,1,2,0,0,40,1 -46,1,295566,14,16,2,6,6,1,1,25236,0,65,1 -44,1,63042,1,13,2,5,2,1,1,0,0,50,1 -40,1,229148,10,8,1,3,3,5,2,0,0,40,19 -45,1,242552,2,10,3,4,4,5,2,0,0,40,1 -60,1,177665,4,9,1,2,3,1,2,0,0,35,1 -18,1,208103,3,7,3,3,5,1,2,0,0,25,1 -28,1,296450,1,13,1,6,3,1,2,0,0,40,1 -36,1,70282,2,10,2,9,6,5,1,0,0,40,1 -36,1,271767,1,13,4,6,4,1,2,0,0,40,0 -40,1,144995,7,11,1,1,3,1,2,4386,0,40,1 -36,5,382635,1,13,2,9,6,1,1,0,0,35,15 -31,1,295697,4,9,4,3,6,5,1,0,0,40,1 -33,1,194141,4,9,1,8,3,1,2,0,0,40,1 -19,6,378418,4,9,3,1,2,1,1,0,0,40,1 -22,1,214399,2,10,3,4,2,1,1,0,0,15,1 -34,1,217460,1,13,1,5,3,1,2,0,0,45,1 -33,1,182556,4,9,3,3,4,1,2,0,0,40,1 -41,1,125831,4,9,1,2,3,1,2,0,2051,60,1 -29,1,271328,1,13,3,6,4,1,2,4650,0,40,1 -50,5,50459,4,9,1,5,3,1,2,0,0,42,1 -42,1,162140,1,13,1,5,3,1,2,7298,0,45,1 -43,1,177937,1,13,3,6,4,1,2,0,0,40,0 -44,1,111502,4,9,1,4,1,1,1,0,0,40,1 -20,1,299047,2,10,3,3,4,1,1,0,0,20,1 -31,1,223212,4,9,1,3,3,1,2,0,0,40,21 -65,2,118474,3,7,1,5,3,1,2,9386,0,59,0 -23,1,352139,2,10,3,3,4,1,1,0,0,24,1 -55,1,173093,2,10,2,9,4,2,1,0,0,40,1 -26,1,181655,7,11,1,9,3,1,2,0,2377,45,1 
-25,1,332702,7,11,3,3,2,1,1,0,0,15,1 -45,0,51164,2,10,1,0,1,5,1,0,0,40,1 -35,1,234901,2,10,1,2,3,1,2,2407,0,40,1 -36,1,131414,2,10,3,4,4,5,1,0,0,36,1 -43,6,260960,1,13,1,6,3,1,2,0,0,50,1 -56,1,156052,4,9,4,3,6,5,1,594,0,20,1 -42,1,279914,1,13,1,1,3,1,2,0,0,40,1 -19,1,192453,2,10,3,3,5,1,1,0,0,25,1 -55,2,200939,4,9,1,11,3,1,2,0,0,72,1 -42,1,151408,11,14,3,5,4,1,1,14084,0,50,1 -26,1,112847,7,11,3,1,2,1,2,0,0,40,1 -17,1,316929,10,8,3,7,2,1,2,0,0,20,1 -42,5,126319,1,13,1,6,1,1,1,0,0,40,1 -55,1,197422,4,9,1,11,3,1,2,7688,0,40,1 -32,1,267736,2,10,3,9,2,5,1,0,0,40,1 -29,1,267034,3,7,3,2,2,5,2,0,0,40,28 -46,6,193047,1,13,1,6,3,1,2,0,0,37,1 -29,6,356089,1,13,1,5,3,1,2,7688,0,40,1 -22,1,223515,1,13,3,6,6,1,2,0,0,20,1 -58,2,87510,13,6,1,2,3,1,2,0,0,40,1 -23,1,145111,4,9,3,11,6,1,2,0,0,50,1 -39,1,48093,4,9,3,7,4,1,2,0,0,40,1 -27,1,31757,7,11,3,2,2,1,2,0,0,38,1 -54,1,285854,4,9,1,11,3,1,2,0,0,40,1 -33,5,120064,1,13,3,6,4,1,1,0,0,45,1 -46,4,167381,4,9,1,9,1,1,1,0,0,40,1 -37,1,103408,4,9,3,10,4,5,2,0,0,40,1 -36,1,101460,4,9,3,3,4,1,1,0,0,18,1 -59,5,420537,4,9,1,9,1,1,1,0,0,38,1 -34,5,119411,4,9,2,13,6,1,2,0,0,40,21 -53,3,128272,14,16,1,5,3,1,2,0,0,70,1 -51,1,386773,1,13,3,4,4,1,2,0,0,55,1 -32,1,283268,13,6,4,3,6,1,1,0,0,42,1 -31,6,301526,2,10,6,3,5,1,2,0,0,40,1 -22,1,151790,2,10,1,4,1,1,1,0,0,30,6 -47,2,106252,1,13,2,4,4,1,1,0,0,50,1 -32,1,188557,4,9,3,8,4,1,2,0,0,40,1 -26,1,171114,2,10,3,10,4,1,1,0,0,38,1 -37,1,327323,15,3,4,10,4,1,2,0,0,32,31 -31,1,244147,4,9,2,2,6,1,2,0,0,55,1 -37,1,280282,7,11,1,1,1,1,1,0,0,24,1 -55,1,116442,4,9,3,4,4,1,2,0,0,38,1 -23,5,282579,7,11,2,1,4,1,2,0,0,56,1 -36,1,51838,2,10,2,9,6,1,1,0,0,40,1 -34,1,73585,11,14,1,6,3,1,2,0,0,40,0 -43,1,226902,1,13,1,4,3,1,2,0,0,50,1 -54,1,279129,2,10,3,1,4,1,2,0,0,40,1 -43,6,146908,2,10,1,2,3,1,2,0,0,40,0 -28,1,196690,7,11,3,8,4,1,1,0,1669,42,1 -40,1,130760,7,11,1,5,3,1,2,0,0,50,1 -41,2,49572,2,10,1,5,3,1,2,0,0,60,1 -40,1,237601,1,13,3,4,4,4,1,0,0,55,1 -42,1,169628,2,10,2,9,4,5,1,0,0,38,1 
-61,2,36671,4,9,1,10,3,1,2,0,2352,50,1 -18,1,231193,10,8,3,8,2,1,2,0,0,30,1 -59,0,192130,4,9,1,0,3,1,2,0,0,16,1 -21,0,149704,4,9,3,0,4,1,1,1055,0,40,1 -48,1,102102,7,11,1,1,3,1,2,0,0,50,1 -41,3,32185,1,13,1,1,3,1,2,0,0,40,1 -18,0,196061,2,10,3,0,2,1,2,0,0,33,1 -23,1,211046,4,9,3,4,4,1,1,2463,0,40,1 -60,1,31577,4,9,1,11,3,1,2,0,0,60,1 -22,1,162343,2,10,3,3,5,5,2,0,0,20,1 -61,1,128831,4,9,1,8,3,1,2,0,0,40,1 -25,1,316688,4,9,3,8,4,5,2,0,0,40,1 -46,1,90758,11,14,3,1,4,1,2,0,0,35,1 -43,1,274363,1,13,1,5,3,1,2,0,1902,40,3 -43,1,154538,6,12,2,2,4,1,2,0,0,40,1 -24,1,106085,4,9,3,3,2,5,2,0,1721,30,1 -68,2,315859,3,7,3,10,6,1,2,0,0,20,1 -31,1,51471,4,9,2,9,6,1,1,0,0,38,1 -17,1,193830,3,7,3,4,2,1,1,0,0,20,1 -32,1,231043,4,9,1,2,3,1,2,5178,0,48,1 -50,0,23780,11,14,6,0,5,1,2,0,0,40,1 -33,1,169879,1,13,1,6,3,1,2,3103,0,47,1 -64,1,270333,1,13,1,6,3,1,2,0,0,40,1 -20,1,138768,4,9,3,11,2,1,2,0,0,30,1 -30,1,191571,4,9,4,3,2,1,1,0,0,36,1 -22,0,219941,2,10,3,0,2,5,2,0,0,40,1 -43,1,94113,2,10,2,5,4,1,2,0,0,45,1 -22,1,137510,2,10,3,9,2,1,2,0,0,40,1 -17,1,32607,13,6,3,10,2,1,2,0,0,20,1 -47,2,93208,4,9,1,3,1,1,1,0,0,75,17 -41,1,254440,7,11,3,6,4,1,1,0,0,60,1 -56,1,186556,2,10,1,2,3,1,2,0,0,50,1 -64,1,169871,4,9,1,11,3,1,2,0,0,45,1 -47,1,191277,4,9,1,11,3,1,2,0,0,50,1 -48,1,167159,7,11,3,9,6,1,2,0,0,40,1 -31,1,171871,11,14,3,6,4,1,1,0,0,46,1 -29,1,154411,7,11,3,1,2,1,2,0,0,40,1 -30,1,129227,4,9,4,9,4,1,1,0,0,40,1 -32,1,110331,4,9,1,5,3,1,2,0,1672,60,1 -57,1,34269,4,9,4,11,6,1,2,0,653,42,1 -62,1,174355,4,9,4,3,4,1,1,0,0,40,1 -39,1,680390,4,9,4,8,6,1,1,0,0,24,1 -43,1,233130,2,10,3,9,4,1,2,0,0,25,1 -24,3,165474,1,13,3,4,2,1,2,0,0,40,1 -42,0,257780,3,7,1,0,3,1,2,0,0,15,1 -53,1,194259,2,10,1,9,1,1,1,4386,0,40,1 -26,1,280093,2,10,3,7,2,1,2,0,0,40,1 -73,2,177387,4,9,1,5,3,1,2,0,0,40,1 -72,0,28929,3,7,4,0,4,1,1,0,0,24,1 -55,1,105304,4,9,1,8,3,1,2,0,0,40,1 -25,1,499233,4,9,2,9,4,1,2,0,0,40,1 -41,1,180572,1,13,2,6,4,1,1,0,0,50,1 -24,1,321435,1,13,3,5,4,1,2,0,0,50,1 -63,1,86108,4,9,4,10,4,1,2,0,0,6,1 
-17,1,198124,3,7,3,4,2,1,2,0,0,20,1 -35,1,135162,2,10,1,2,3,1,2,0,0,50,1 -51,1,146813,2,10,1,6,3,1,2,0,0,40,1 -62,5,291175,1,13,4,6,4,1,1,0,0,48,1 -55,1,387569,4,9,1,2,3,1,2,4386,0,40,1 -43,1,102895,2,10,2,6,4,1,2,0,0,40,1 -40,5,33274,4,9,2,3,4,1,1,0,0,50,1 -37,1,86551,1,13,1,5,3,1,2,0,0,45,1 -39,1,138192,1,13,1,2,3,1,2,0,0,40,1 -31,1,118966,4,9,3,2,2,1,2,0,0,18,1 -61,1,99784,11,14,4,6,4,1,1,0,0,40,1 -26,1,90980,7,11,2,9,4,1,1,0,0,55,1 -46,2,177407,4,9,1,4,3,1,2,0,0,50,1 -26,1,96467,1,13,3,6,4,1,1,0,0,40,1 -48,6,327886,14,16,2,6,2,1,2,0,0,50,1 -34,1,111567,4,9,3,11,2,1,2,0,0,40,1 -34,5,166545,4,9,3,9,2,1,1,0,0,40,1 -59,1,142182,4,9,1,3,3,1,2,0,0,40,1 -34,1,188798,1,13,3,6,2,1,1,0,0,40,1 -49,1,38563,1,13,3,5,4,1,1,0,0,56,1 -18,1,216284,3,7,3,9,2,1,1,0,0,20,1 -43,1,191547,4,9,1,10,3,1,2,0,0,40,21 -48,1,285335,3,7,1,5,3,1,2,0,0,50,1 -28,3,142712,2,10,1,5,3,1,2,0,0,70,1 -33,1,80945,4,9,1,8,3,1,2,0,0,40,1 -24,1,309055,2,10,3,4,4,1,1,0,0,15,1 -21,1,62339,13,6,3,7,4,1,2,0,0,40,1 -17,1,368700,3,7,3,4,2,1,2,0,0,28,1 -39,1,176186,2,10,1,10,3,1,2,0,0,50,1 -29,2,266855,1,13,4,6,2,1,2,0,0,40,1 -44,1,48087,1,13,1,5,3,1,2,0,0,40,1 -24,1,121313,2,10,3,11,2,1,2,0,0,50,1 -71,2,143437,11,14,1,4,3,1,2,10605,0,40,1 -51,2,160724,1,13,1,4,3,2,2,0,2415,40,12 -55,1,282753,15,3,2,3,6,5,2,0,0,25,1 -41,1,194636,1,13,1,5,3,1,2,0,0,60,1 -23,1,153044,4,9,3,7,6,5,1,0,0,7,1 -38,1,411797,7,11,2,9,6,1,1,0,0,40,1 -39,1,117683,4,9,1,11,3,1,2,0,0,40,1 -19,1,376540,4,9,3,9,4,1,1,0,0,30,1 -49,1,72393,8,5,2,8,4,1,1,0,0,40,1 -32,1,270335,1,13,1,9,5,1,2,0,0,40,16 -27,1,96226,4,9,1,2,3,1,2,0,0,70,1 -38,1,95336,1,13,1,4,3,1,2,0,0,50,1 -33,1,258498,2,10,1,2,1,1,1,0,0,60,1 -63,0,149698,2,10,1,0,3,1,2,0,0,15,1 -23,1,205865,1,13,3,5,2,1,2,0,0,28,1 -33,3,155781,2,10,1,2,3,1,2,0,0,60,0 -54,2,406468,4,9,1,4,3,5,2,0,0,40,1 -29,1,177119,7,11,2,1,4,1,1,2174,0,45,1 -48,0,144397,2,10,2,0,6,5,1,0,0,30,1 -35,2,372525,1,13,3,5,4,1,2,0,0,40,1 -28,1,164170,7,11,1,9,1,2,1,0,0,40,8 -37,1,183800,1,13,1,6,3,1,2,7688,0,50,1 
-42,2,177307,5,15,1,10,3,1,2,0,0,65,1 -40,1,170108,11,14,1,6,3,1,2,0,0,40,1 -47,1,341995,2,10,2,4,2,1,2,0,0,55,1 -22,1,226508,1,13,3,5,2,1,1,0,0,50,1 -30,1,87418,1,13,1,5,3,1,2,0,0,45,1 -28,1,109165,4,9,1,1,3,1,2,0,0,40,1 -63,5,28856,9,4,1,3,3,1,2,0,0,55,1 -51,2,175897,8,5,1,2,3,1,2,0,0,20,1 -22,1,99697,4,9,3,7,2,1,1,0,0,40,1 -27,0,90270,6,12,1,0,2,3,2,0,0,40,1 -35,1,152375,4,9,3,4,4,1,2,0,0,45,1 -46,1,171550,4,9,2,8,4,1,1,0,0,38,1 -37,1,211154,2,10,2,8,4,1,2,0,0,52,1 -24,1,202570,1,13,3,6,2,5,2,0,0,15,1 -37,2,168496,4,9,2,7,2,1,2,0,0,10,1 -53,1,68898,2,10,1,1,3,1,2,0,0,40,1 -27,1,93235,4,9,3,4,4,1,2,0,0,30,1 -38,1,278924,2,10,2,2,4,1,2,0,0,44,1 -53,2,311020,13,6,1,10,3,1,2,0,0,60,1 -34,1,175878,2,10,3,2,4,1,2,0,0,40,1 -23,1,543028,4,9,3,4,2,5,2,0,0,40,1 -39,1,202027,1,13,1,5,3,1,2,15024,0,45,1 -43,1,158926,11,14,1,6,1,2,1,0,0,50,11 -67,3,76860,4,9,1,5,3,2,2,0,0,40,1 -81,2,136063,4,9,1,5,3,1,2,0,0,30,1 -21,1,186648,2,10,3,3,2,1,2,0,0,20,1 -23,1,257509,2,10,3,4,4,1,2,0,0,25,1 -25,1,98155,2,10,3,11,6,1,2,0,0,40,1 -42,1,274198,15,3,1,8,1,1,1,0,0,38,21 -38,1,97083,2,10,1,9,1,5,1,0,0,40,1 -64,0,29825,4,9,1,0,3,1,2,0,0,5,1 -32,1,262153,4,9,1,8,3,1,2,0,0,40,1 -37,1,214738,4,9,1,8,3,1,2,0,0,40,1 -51,1,138022,11,14,1,5,3,1,2,0,0,60,1 -22,1,91842,2,10,3,4,4,1,1,0,0,42,1 -33,1,373662,12,2,6,12,4,1,1,0,0,40,31 -42,1,162003,4,9,2,8,4,1,2,0,0,55,1 -19,0,52114,2,10,3,0,2,1,1,0,0,10,1 -51,5,241843,16,1,1,3,3,1,2,0,0,40,1 -23,1,375871,4,9,1,9,1,1,1,0,0,40,21 -37,1,186934,3,7,1,8,3,1,2,3103,0,44,1 -37,1,176900,4,9,1,2,3,1,2,0,0,99,1 -47,1,21906,11,14,3,6,4,1,1,0,0,25,1 -41,1,132222,5,15,1,6,3,1,2,0,2415,40,1 -33,1,143653,4,9,1,2,3,1,2,0,0,30,1 -31,1,111567,1,13,3,4,4,1,2,0,0,40,1 -31,1,78602,6,12,2,3,6,3,1,0,0,40,1 -35,1,465507,4,9,1,8,3,5,2,0,0,40,1 -38,3,196373,4,9,1,5,3,1,2,0,0,40,1 -18,1,293227,4,9,3,3,4,1,1,0,0,45,1 -20,1,241752,4,9,1,8,3,1,2,0,0,40,1 -54,5,166398,2,10,2,5,6,5,1,0,0,35,1 -40,1,184682,2,10,2,9,6,1,1,0,0,40,1 -36,3,108293,1,13,1,5,1,1,1,0,1977,45,1 
-43,1,250802,2,10,2,2,6,1,2,0,0,35,1 -44,2,325159,2,10,2,10,6,1,2,0,0,40,1 -44,6,174675,4,9,1,6,1,1,1,0,0,40,1 -43,1,227065,11,14,1,5,3,1,2,0,0,43,1 -51,1,269080,9,4,4,3,6,5,1,0,0,40,1 -18,1,177722,4,9,3,3,2,1,1,0,0,20,1 -51,1,133461,2,10,1,5,3,1,2,0,0,40,1 -41,1,239683,13,6,1,2,3,1,2,0,0,30,0 -44,3,398473,2,10,1,4,3,1,2,0,0,70,1 -33,5,298785,13,6,2,11,4,1,2,0,0,40,1 -33,2,123424,1,13,1,5,3,1,2,0,0,40,1 -42,1,176286,6,12,1,5,3,1,2,0,0,40,1 -25,1,150062,2,10,1,6,3,1,2,0,0,45,1 -32,1,169240,4,9,2,8,4,1,1,0,0,38,1 -32,1,288273,1,13,1,2,3,1,2,0,0,70,21 -36,1,526968,13,6,2,5,6,1,1,0,0,40,1 -28,1,57066,4,9,1,11,3,1,2,0,0,40,1 -20,1,323573,4,9,3,3,2,1,1,0,0,20,1 -35,3,368825,2,10,1,4,3,1,2,0,0,60,1 -55,2,189721,4,9,1,2,3,1,2,0,0,20,1 -48,1,164966,1,13,1,5,3,2,2,0,0,40,8 -36,0,94954,7,11,4,0,4,1,1,0,0,20,1 -34,1,202046,4,9,1,2,3,1,2,0,0,35,1 -28,1,161538,1,13,3,1,4,1,1,0,0,35,1 -67,1,105252,1,13,4,5,4,1,2,0,2392,40,1 -37,1,200153,4,9,1,2,3,1,2,0,0,40,1 -44,1,32185,4,9,3,11,6,1,2,0,0,70,1 -25,1,178326,2,10,3,4,4,1,1,0,0,40,1 -21,1,255957,2,10,3,5,4,1,1,4101,0,40,1 -40,6,188693,11,14,1,6,3,1,2,0,0,35,1 -78,1,182977,4,9,4,3,4,5,1,2964,0,40,1 -34,1,159929,4,9,2,7,2,1,2,0,0,40,1 -49,1,123207,4,9,3,9,4,1,1,0,0,44,1 -22,1,284317,6,12,3,9,4,1,1,0,0,40,1 -23,0,184699,4,9,3,0,6,5,1,0,0,40,1 -60,2,154474,4,9,3,10,6,1,2,0,0,42,1 -45,5,318280,4,9,4,13,4,1,2,0,0,40,1 -63,1,254907,7,11,2,3,4,1,1,0,0,20,1 -41,1,349221,4,9,3,2,2,5,1,0,0,35,1 -47,1,335973,4,9,2,9,6,1,1,0,0,40,1 -44,1,126701,4,9,2,2,6,1,2,0,0,40,1 -51,1,122159,2,10,4,6,4,1,1,3325,0,40,1 -46,1,187370,1,13,3,4,4,1,2,0,1504,40,1 -41,1,194636,7,11,1,8,3,1,2,0,0,40,1 -50,2,124793,4,9,1,2,3,1,2,0,0,30,1 -47,1,192835,4,9,1,9,3,1,2,0,0,50,1 -35,1,290226,4,9,3,5,4,1,2,0,0,45,1 -56,1,112840,4,9,1,5,3,1,2,0,0,55,1 -45,1,89325,11,14,2,6,4,1,2,0,0,45,1 -48,4,33109,1,13,2,5,6,1,2,0,0,58,1 -40,1,82465,2,10,1,8,3,1,2,2580,0,40,1 -39,3,329980,1,13,1,5,3,1,2,15024,0,50,1 -20,1,148294,2,10,3,3,2,1,2,0,0,40,1 
-50,1,168212,14,16,1,6,3,1,2,0,1902,65,1 -38,6,343642,4,9,1,6,1,1,1,0,0,40,1 -23,5,115244,1,13,3,6,2,1,1,0,0,60,1 -31,1,162572,4,9,3,3,2,1,2,0,0,16,1 -58,1,356067,1,13,1,9,3,1,2,0,0,40,1 -66,1,271567,4,9,4,8,4,5,2,0,0,40,1 -39,3,180804,4,9,1,5,3,1,2,0,0,40,1 -54,2,123011,4,9,1,2,3,1,2,15024,0,52,1 -26,1,109186,2,10,1,4,3,1,2,0,0,50,6 -51,1,220537,4,9,2,8,4,1,1,0,0,40,1 -34,1,124827,7,11,3,11,2,1,2,0,0,40,1 -50,1,767403,4,9,1,8,3,1,2,3103,0,40,1 -42,1,118494,2,10,1,6,3,1,2,0,0,44,1 -38,1,173208,11,14,1,6,3,1,2,0,0,25,1 -48,1,107373,9,4,1,7,3,1,2,0,0,40,1 -33,1,26973,7,11,1,1,1,1,1,0,0,40,1 -51,1,191965,4,9,4,3,6,1,1,0,0,32,1 -22,1,122346,4,9,2,2,4,1,2,0,0,40,1 -19,0,117201,4,9,3,0,2,1,2,0,0,30,1 -41,1,198316,2,10,3,2,4,1,2,0,0,50,9 -48,5,123075,11,14,1,6,3,1,2,0,0,35,1 -42,1,209370,4,9,4,4,4,1,1,0,0,30,1 -34,1,33117,2,10,1,3,3,1,2,0,0,40,1 -23,1,129042,4,9,3,8,6,5,1,0,0,40,1 -56,1,169133,4,9,1,3,3,1,2,0,0,50,35 -30,1,201624,1,13,3,6,4,5,2,0,0,45,0 -45,1,368561,4,9,1,5,3,1,2,0,0,55,1 -48,1,207848,13,6,1,9,1,1,1,0,0,40,1 -48,3,138370,11,14,6,4,4,2,2,0,0,50,8 -31,1,93106,7,11,3,9,4,1,1,0,0,40,1 -20,6,223515,6,12,3,3,2,1,2,0,1719,20,1 -27,1,389713,2,10,3,4,4,1,2,0,0,40,1 -32,1,206365,4,9,3,3,4,5,1,0,0,40,1 -76,0,431192,9,4,4,0,4,1,2,0,0,2,1 -19,0,241616,4,9,3,0,6,1,2,0,2001,40,1 -66,3,150726,8,5,1,5,3,1,2,1409,0,1,0 -37,1,123785,4,9,3,3,4,1,2,0,0,75,1 -34,1,289984,4,9,2,12,6,5,1,0,0,30,1 -34,0,164309,3,7,1,0,1,1,1,0,0,8,1 -90,1,137018,4,9,3,3,4,1,1,0,0,40,1 -23,1,137994,2,10,3,8,2,5,1,0,0,40,1 -43,1,341204,2,10,2,9,4,1,1,0,0,40,1 -44,1,167005,1,13,1,5,3,1,2,7688,0,60,1 -24,1,34446,2,10,3,1,4,1,1,0,0,37,1 -28,1,187160,5,15,2,6,6,1,2,0,0,55,1 -64,0,196288,6,12,3,0,4,1,1,0,0,20,1 -23,1,217961,2,10,1,3,3,1,2,0,0,40,1 -20,1,74631,2,10,3,4,4,1,1,0,0,50,1 -36,1,156667,1,13,1,6,3,1,2,0,1902,50,1 -61,1,125155,4,9,1,8,3,1,2,0,0,40,1 -53,2,263925,1,13,1,4,3,1,2,0,0,40,5 -30,1,296453,1,13,1,9,3,1,2,7298,0,40,1 -52,2,44728,2,10,1,4,3,1,2,0,0,55,1 
-38,1,193026,2,10,2,2,4,1,2,0,0,40,14 -32,1,87643,1,13,1,4,3,1,2,0,0,40,1 -30,2,106742,10,8,1,11,3,1,2,0,0,75,1 -41,1,302122,7,11,2,2,4,1,1,0,0,40,1 -49,5,193960,11,14,1,6,3,1,2,0,1902,40,1 -45,1,185385,4,9,1,2,3,1,2,0,0,47,1 -43,2,277647,4,9,1,5,3,1,2,0,0,35,1 -61,1,128848,4,9,1,8,3,1,2,3471,0,40,1 -54,1,377701,4,9,1,3,3,1,2,0,0,32,21 -34,1,157886,6,12,4,3,6,1,1,0,0,40,1 -49,1,175958,2,10,1,4,3,1,2,0,0,80,1 -38,1,223004,2,10,3,3,2,1,2,0,0,40,1 -35,1,199352,5,15,1,6,3,1,2,0,1977,80,1 -36,1,29984,10,8,1,11,3,1,2,0,0,40,1 -30,1,181651,1,13,1,5,3,1,2,0,0,50,1 -36,1,117312,6,12,2,1,4,1,1,0,0,60,1 -22,5,34029,1,13,3,6,2,1,1,0,0,20,1 -38,1,132879,4,9,1,2,3,1,2,0,1902,40,1 -37,1,215310,4,9,1,2,3,1,2,0,0,50,1 -48,6,55863,14,16,1,6,1,1,1,0,1902,46,1 -17,1,220384,3,7,3,9,2,1,2,0,0,15,1 -19,2,36012,2,10,3,2,2,1,2,0,0,20,1 -27,1,137645,1,13,3,4,4,5,1,0,1590,40,1 -22,1,191342,1,13,3,4,2,2,2,0,0,50,27 -49,1,31339,4,9,1,2,3,1,2,0,0,40,1 -43,6,227910,7,11,1,6,1,1,1,0,0,40,1 -43,1,173728,1,13,4,6,6,1,1,0,0,40,1 -19,5,167816,4,9,2,5,4,1,1,0,0,35,1 -58,2,81642,4,9,1,10,3,1,2,0,0,60,1 -41,5,195258,2,10,1,13,3,1,2,0,0,40,1 -31,1,232475,2,10,1,9,3,1,2,0,0,40,1 -30,1,241259,7,11,1,11,3,1,2,0,0,40,1 -32,1,118161,4,9,1,13,3,1,2,0,0,40,1 -29,1,201954,1,13,3,5,4,1,2,0,0,35,1 -42,1,150533,2,10,1,5,3,1,2,7298,0,52,1 -38,1,412296,4,9,2,2,2,1,2,0,0,28,1 -41,4,133060,6,12,1,6,3,1,2,0,0,40,1 -44,2,120539,1,13,3,4,4,1,2,0,0,50,1 -31,1,196025,14,16,6,6,4,2,2,0,0,60,12 -34,1,107793,4,9,2,2,4,1,2,0,0,40,1 -21,1,163870,2,10,3,9,2,1,2,0,0,40,1 -22,2,361280,1,13,3,6,2,2,2,0,0,20,8 -62,1,92178,13,6,1,8,3,1,2,0,0,40,1 -19,0,80710,4,9,3,0,2,1,1,0,0,40,1 -29,3,260729,4,9,1,4,1,1,1,0,1977,25,1 -43,1,182254,2,10,2,6,6,1,1,0,0,40,1 -68,0,140282,9,4,1,0,3,1,2,0,0,8,1 -45,3,149865,1,13,1,8,3,1,2,0,0,60,1 -39,3,218184,8,5,1,5,3,1,2,0,1651,40,21 -41,1,118619,1,13,1,5,3,5,2,0,0,50,1 -34,2,196791,6,12,1,6,1,1,1,0,0,25,1 -34,5,167999,4,9,2,9,6,1,1,0,0,33,1 -31,1,51259,1,13,3,6,4,1,2,0,0,47,1 
-29,1,131088,4,9,3,7,2,1,2,0,0,25,1 -41,1,118212,1,13,1,2,3,1,2,3103,0,40,1 -41,1,293791,7,11,1,11,3,1,2,0,0,55,1 -35,3,289430,11,14,1,4,3,1,2,0,0,45,21 -33,1,35378,1,13,1,4,1,1,1,0,0,45,1 -37,6,60227,1,13,3,9,4,1,2,0,0,38,1 -69,1,168139,4,9,1,4,1,1,1,0,0,40,1 -34,1,290763,4,9,2,7,2,1,1,0,0,40,1 -60,3,226355,7,11,1,8,3,1,2,0,2415,70,0 -36,1,51100,2,10,1,2,3,1,2,0,0,40,1 -41,1,227644,4,9,1,11,3,1,2,0,0,50,1 -58,5,205267,1,13,1,6,1,1,1,0,0,40,1 -53,1,288020,1,13,1,6,3,2,2,0,0,40,9 -29,1,140863,2,10,1,1,3,1,2,0,0,40,1 -45,4,170915,4,9,2,1,4,1,1,4865,0,40,1 -34,6,50178,2,10,1,5,3,1,2,0,0,38,1 -36,1,112497,1,13,1,1,3,1,2,0,0,40,1 -48,1,95244,2,10,2,3,6,5,1,0,0,35,1 -20,1,117606,7,11,3,9,2,1,1,0,0,40,1 -35,1,89508,4,9,1,11,3,1,2,0,0,50,1 -63,4,124244,4,9,4,7,4,5,2,0,0,40,1 -41,2,154374,2,10,2,3,6,1,2,0,0,45,1 -28,1,294936,1,13,1,6,3,1,2,0,0,45,1 -30,1,347132,2,10,3,8,4,5,1,0,0,40,1 -34,0,181934,4,9,2,0,4,1,1,0,0,40,1 -31,1,316672,4,9,3,3,6,1,1,0,0,40,21 -37,1,189382,7,11,3,9,2,1,1,0,0,38,1 -42,0,184018,2,10,2,0,6,1,2,0,0,40,1 -31,1,184307,2,10,1,5,3,1,2,0,0,50,19 -46,2,246212,11,14,1,5,3,1,2,0,0,45,1 -35,4,250504,4,9,1,9,3,5,2,0,0,60,1 -27,1,138705,4,9,1,2,3,1,2,0,0,53,1 -41,1,328447,12,2,1,2,3,1,2,0,0,35,21 -19,1,194608,2,10,3,3,2,1,1,0,0,20,1 -20,1,230891,2,10,3,4,4,1,2,0,0,45,1 -59,4,212448,4,9,4,4,6,1,1,0,0,40,6 -40,1,214010,1,13,3,3,4,1,2,0,0,37,1 -56,2,200235,2,10,1,4,3,1,2,0,0,60,1 -33,1,354573,11,14,1,6,3,1,2,15024,0,44,1 -30,3,205733,2,10,3,5,4,1,1,0,0,60,1 -46,1,185041,4,9,1,8,3,1,2,0,0,50,1 -61,3,84409,4,9,1,5,3,1,2,0,0,35,1 -50,3,293196,4,9,1,5,3,1,2,7298,0,40,1 -25,1,241626,4,9,3,3,2,1,2,0,0,30,1 -40,1,520586,2,10,2,9,6,1,1,0,0,39,1 -24,0,35633,2,10,3,0,4,1,2,0,0,40,0 -51,1,302847,4,9,1,2,3,1,2,0,0,54,1 -43,6,165309,4,9,1,13,3,1,2,0,0,40,1 -34,1,117529,2,10,1,10,3,1,2,0,0,54,21 -46,1,106092,4,9,1,5,3,1,2,0,0,45,1 -28,6,445824,11,14,1,6,1,1,1,0,0,50,1 -26,1,227332,1,13,3,11,6,2,2,0,0,40,0 -20,1,275691,4,9,3,7,2,1,2,0,0,28,1 
-44,1,193459,7,11,1,2,3,1,2,3411,0,40,1 -51,1,284329,4,9,4,11,6,1,2,0,0,40,1 -33,1,114691,1,13,1,4,3,1,2,0,0,60,1 -54,1,96062,6,12,1,1,3,1,2,0,0,40,1 -50,1,133963,1,13,1,6,1,1,1,0,1977,40,1 -33,1,178506,4,9,2,9,4,5,1,0,0,40,1 -65,1,350498,1,13,1,9,3,1,2,10605,0,20,1 -22,0,131573,2,10,3,0,2,1,1,0,0,8,1 -88,2,206291,5,15,1,6,3,1,2,0,0,40,1 -40,1,182302,4,9,2,11,4,1,2,0,0,40,1 -51,1,241346,4,9,2,9,4,1,1,0,0,43,1 -50,1,157043,3,7,2,3,4,5,1,0,0,40,1 -25,1,404616,11,14,1,10,4,1,2,0,0,99,1 -20,1,411862,7,11,3,3,4,1,2,0,0,30,1 -47,1,183013,4,9,1,3,3,1,2,0,0,40,1 -58,0,169982,2,10,1,0,3,1,2,0,0,40,1 -22,1,188544,7,11,3,9,2,1,1,0,0,30,1 -50,6,356619,4,9,1,6,3,1,2,0,0,48,1 -47,1,45857,4,9,3,9,6,1,1,0,0,40,1 -24,5,289886,3,7,3,3,4,2,2,0,0,45,1 -50,0,146015,4,9,1,0,3,1,2,0,0,40,1 -40,1,216237,1,13,2,5,4,1,1,0,0,45,1 -36,1,416745,1,13,1,6,3,1,2,0,0,40,1 -32,1,202952,4,9,1,9,3,1,2,0,0,40,1 -44,1,167725,4,9,1,2,3,1,2,0,0,40,1 -51,0,165637,11,14,1,0,3,1,2,0,0,40,1 -59,4,43280,2,10,3,5,2,5,1,0,0,40,1 -65,1,118779,4,9,1,2,3,1,2,0,0,30,1 -24,6,191269,1,13,3,9,4,1,1,0,0,65,1 -27,5,247507,1,13,3,6,2,1,2,0,0,35,1 -51,1,239155,7,11,2,3,4,1,2,0,0,40,1 -48,1,182862,2,10,1,2,3,1,2,0,0,40,1 -39,1,33886,4,9,1,11,3,1,2,0,0,60,1 -28,1,444304,1,13,1,1,3,1,2,0,0,40,1 -19,1,187161,2,10,3,3,2,1,1,0,0,25,1 -49,5,116892,1,13,2,6,4,5,1,0,0,40,1 -51,5,176813,6,12,1,6,3,1,2,0,0,60,1 -59,1,151616,11,14,1,6,3,1,2,0,0,55,1 -18,1,240747,4,9,3,8,2,1,1,0,0,40,24 -50,1,75472,4,9,1,2,3,1,2,4386,0,40,0 -45,4,320818,4,9,1,9,3,5,2,0,0,80,1 -30,5,235271,1,13,1,6,3,1,2,0,0,50,1 -37,1,166497,1,13,2,2,4,1,2,0,0,35,1 -44,1,344060,1,13,3,3,4,1,2,0,0,40,1 -33,1,221196,2,10,1,2,3,5,2,0,0,40,1 -61,3,113544,11,14,1,4,3,1,2,0,0,40,1 -61,5,321117,4,9,1,13,3,1,2,0,0,40,1 -38,1,79619,1,13,1,4,3,1,2,0,0,42,1 -22,0,42004,2,10,3,0,2,1,2,0,0,30,1 -36,1,135289,4,9,1,8,3,1,2,0,0,45,1 -44,3,320984,2,10,1,4,3,1,2,5178,0,60,1 -37,1,203070,2,10,4,9,2,1,2,0,0,62,1 -31,1,32406,2,10,2,2,6,1,1,0,0,20,1 -54,1,99185,14,16,1,6,3,1,2,15024,0,45,1 
-20,1,205839,2,10,3,3,2,1,2,0,0,16,1 -63,0,150389,1,13,4,0,4,1,1,0,0,40,1 -48,2,243631,2,10,1,2,3,3,2,7688,0,40,1 -33,0,163003,4,9,2,0,4,2,1,0,0,41,12 -31,1,231263,1,13,2,5,4,1,2,4650,0,45,1 -38,1,200818,5,15,1,6,3,1,2,0,0,40,1 -45,2,247379,4,9,1,2,3,1,2,0,0,40,1 -48,1,349151,4,9,1,2,3,1,2,0,0,40,1 -53,1,22154,2,10,1,2,3,1,2,0,0,40,1 -55,1,176317,4,9,4,9,2,1,1,0,0,40,1 -38,1,22245,11,14,1,11,3,1,2,0,0,72,0 -29,1,236436,4,9,3,9,4,1,1,0,0,40,1 -36,1,354078,1,13,1,6,3,1,2,0,0,60,1 -42,2,166813,2,10,1,10,3,1,2,0,0,60,1 -50,1,358740,4,9,2,9,6,1,1,0,0,40,3 -75,2,208426,5,15,1,6,3,1,2,0,0,35,1 -46,1,265266,1,13,1,6,3,1,2,0,1902,40,1 -52,4,31838,4,9,2,9,6,1,1,0,0,40,1 -27,1,175034,1,13,3,9,4,1,1,0,0,40,1 -43,1,413297,4,9,1,2,3,1,2,0,0,45,1 -31,1,106347,3,7,4,3,4,5,1,0,0,42,1 -23,1,174754,2,10,3,4,4,1,1,0,0,30,1 -34,1,441454,4,9,3,4,4,1,1,0,0,24,1 -41,2,209344,4,9,1,4,5,1,1,0,0,40,13 -31,1,185732,4,9,3,3,2,1,1,0,0,30,1 -42,1,65372,2,10,3,9,2,1,1,0,0,40,1 -35,1,33975,11,14,3,6,4,1,2,0,0,45,1 -55,1,326297,4,9,4,9,4,1,1,0,0,25,1 -36,6,194630,4,9,4,5,4,1,1,0,0,40,1 -65,2,167414,4,9,1,5,3,1,2,0,0,59,1 -38,5,165799,2,10,3,3,4,1,2,0,0,12,1 -62,1,192866,2,10,4,3,4,1,1,0,0,20,1 -54,3,166459,5,15,1,6,3,1,2,99999,0,60,1 -49,1,148995,4,9,2,9,4,1,2,0,0,40,1 -34,1,190040,4,9,1,9,1,1,1,0,0,40,1 -32,1,209432,4,9,3,7,2,1,1,0,0,40,1 -51,3,229465,11,14,1,5,3,1,2,15024,0,50,1 -48,2,397466,2,10,1,5,3,1,2,0,0,60,1 -30,1,283767,4,9,1,9,3,1,2,0,0,40,0 -52,4,202452,4,9,2,9,4,1,1,0,0,43,1 -28,2,218555,2,10,3,11,4,1,2,0,1762,40,1 -29,1,128604,4,9,1,2,3,2,2,0,0,40,1 -38,1,65466,1,13,3,9,4,1,1,0,0,50,1 -57,1,141326,4,9,1,4,3,1,2,0,0,50,1 -43,4,369468,1,13,1,6,3,1,2,0,0,40,1 -37,6,136137,2,10,4,9,4,1,2,0,0,40,1 -30,1,236770,4,9,1,2,3,1,2,0,0,40,1 -53,1,89534,4,9,1,11,3,1,2,0,0,48,1 -69,0,195779,7,11,4,0,4,1,1,0,0,1,1 -73,1,29778,4,9,4,3,4,1,1,0,0,37,1 -22,3,153516,2,10,3,2,2,1,2,0,0,35,1 -31,1,163594,4,9,3,4,4,1,1,0,0,45,1 -38,1,189623,1,13,1,6,3,1,2,0,1887,40,1 -50,2,343748,2,10,1,2,3,1,2,0,0,35,1 
-37,1,387430,2,10,1,9,3,1,2,0,0,37,1 -44,5,409505,1,13,2,13,4,1,2,0,0,40,1 -47,1,200734,1,13,3,5,6,5,1,0,0,45,1 -27,1,115831,2,10,1,3,3,1,2,0,0,40,1 -28,1,150296,6,12,3,3,4,1,1,0,0,80,1 -25,1,323545,4,9,3,1,4,5,1,0,0,40,1 -20,1,232577,2,10,3,3,6,1,1,0,0,40,1 -51,5,152754,1,13,1,9,3,1,2,0,0,40,1 -46,1,129007,1,13,1,4,3,1,2,0,1977,40,1 -67,1,171584,1,13,1,13,3,1,2,6514,0,7,1 -47,1,386136,4,9,1,11,3,1,2,0,0,35,1 -42,1,342865,4,9,1,4,3,1,2,0,0,40,1 -52,1,186785,4,9,3,4,4,1,2,0,1876,50,1 -42,4,158926,6,12,2,6,6,2,1,0,0,40,16 -65,0,36039,4,9,1,0,3,1,2,0,0,40,1 -21,1,164019,2,10,3,10,2,5,2,0,0,10,1 -50,1,88926,4,9,1,9,1,1,1,5178,0,40,1 -46,1,188861,2,10,1,11,3,1,2,0,0,40,1 -47,2,370119,1,13,1,4,3,1,2,15024,0,50,1 -57,1,182062,13,6,1,3,3,1,2,0,0,40,1 -37,1,37238,1,13,3,5,4,1,2,0,0,45,1 -50,1,421132,4,9,1,2,3,1,2,0,0,40,1 -58,0,178660,10,8,1,0,3,1,2,0,0,40,1 -63,2,795830,12,2,4,3,6,1,1,0,0,30,36 -39,1,278403,4,9,1,11,3,1,2,0,0,65,1 -46,1,279661,11,14,1,6,3,5,2,0,0,35,1 -36,1,113397,4,9,1,7,3,1,2,0,0,40,1 -26,1,280093,15,3,1,2,3,1,2,0,1628,50,1 -21,1,236696,2,10,3,3,4,1,2,0,0,57,1 -41,1,265266,6,12,1,1,3,1,2,0,0,40,1 -44,5,34935,2,10,3,3,2,5,2,0,0,40,1 -22,1,58222,4,9,1,2,3,1,2,0,0,40,1 -29,4,301010,2,10,3,14,4,5,2,0,0,60,1 -29,1,419721,4,9,3,3,6,5,1,0,0,40,9 -58,3,186791,2,10,1,11,1,1,1,0,0,40,1 -36,2,180686,5,15,1,6,3,1,2,0,0,50,1 -30,1,209103,7,11,1,5,3,1,2,0,0,48,1 -37,1,32668,1,13,1,8,3,1,2,0,0,43,1 -29,1,256956,7,11,3,8,4,1,2,0,0,40,1 -26,1,202203,15,3,3,9,2,1,1,0,0,40,21 -43,1,85995,4,9,1,4,3,1,2,0,0,45,1 -49,1,125421,4,9,1,2,1,1,1,0,0,40,1 -45,4,283037,1,13,1,5,3,1,2,0,0,40,1 -28,1,192932,1,13,3,6,4,1,1,0,0,40,1 -20,0,244689,2,10,3,0,4,1,1,0,0,25,1 -51,1,179646,4,9,2,9,6,1,1,0,0,40,1 -32,1,509350,2,10,1,7,3,1,2,0,0,50,5 -24,1,96279,4,9,3,8,2,1,1,0,0,40,1 -35,1,119098,14,16,1,6,3,1,2,7298,0,40,1 -35,0,327120,6,12,3,0,4,1,2,0,0,55,1 -41,6,144928,1,13,1,6,3,1,2,0,0,40,1 -48,1,55237,1,13,1,2,3,1,2,0,0,45,1 -61,5,101265,4,9,4,9,6,1,1,1471,0,35,1 
-20,1,114874,2,10,3,9,2,1,2,0,0,30,1 -27,1,190525,4,9,3,4,4,1,2,0,0,50,1 -55,1,121912,2,10,1,11,3,1,2,0,0,24,1 -39,1,83893,4,9,1,9,3,1,2,0,0,40,1 -17,0,138507,13,6,3,0,2,1,2,0,0,20,1 -47,1,256522,4,9,1,7,3,1,2,0,0,40,0 -52,1,168381,4,9,4,3,6,2,1,0,0,40,8 -24,1,293579,4,9,3,4,2,5,1,0,0,20,1 -29,1,285290,3,7,3,3,6,5,1,0,0,40,1 -25,1,188488,4,9,3,8,5,1,2,0,0,40,1 -20,1,324469,2,10,3,8,4,1,1,0,0,40,1 -23,1,275244,4,9,3,8,4,5,2,0,0,35,1 -57,1,265099,5,15,1,6,3,1,2,0,0,60,1 -51,1,146767,7,11,1,6,3,1,2,0,0,40,1 -33,1,40681,2,10,3,6,4,1,1,3674,0,16,1 -39,1,174938,1,13,3,6,4,1,2,0,0,40,1 -40,1,240124,4,9,1,5,3,1,2,0,0,40,1 -71,1,269708,1,13,2,1,2,1,1,2329,0,16,1 -38,6,34180,2,10,1,5,3,1,2,0,0,40,1 -28,6,225904,5,15,3,6,6,1,2,0,0,40,1 -57,1,89392,11,14,6,6,4,1,1,0,0,40,1 -47,1,46857,2,10,2,9,4,1,2,0,0,40,1 -59,6,105363,4,9,3,9,2,1,2,0,0,40,1 -26,1,195105,4,9,3,4,4,4,2,0,0,40,1 -35,1,184117,1,13,3,9,2,1,1,0,0,40,1 -61,3,134768,1,13,1,5,3,1,2,0,0,50,6 -17,0,145886,3,7,3,0,2,1,1,0,0,30,1 -36,1,153078,1,13,1,5,3,2,2,0,0,60,0 -62,0,225652,9,4,1,0,3,1,2,3411,0,50,1 -34,1,467108,2,10,3,6,4,1,2,0,0,50,1 -32,3,199765,1,13,1,2,3,1,2,7688,0,50,1 -42,1,173938,4,9,4,4,2,1,1,0,0,40,1 -39,1,191161,2,10,1,2,3,1,2,0,0,40,1 -58,1,132606,15,3,2,8,4,5,2,0,0,40,1 -61,2,30073,4,9,1,10,3,1,2,0,1848,60,1 -40,1,155190,13,6,3,2,5,5,2,0,0,55,1 -31,1,42900,1,13,3,1,4,1,2,0,0,37,1 -36,1,191161,4,9,3,4,4,1,2,0,0,50,1 -23,1,181820,2,10,3,8,2,1,2,0,0,40,1 -33,1,105974,4,9,3,2,4,1,2,0,0,41,1 -52,1,146378,4,9,1,2,3,1,2,0,0,40,1 -22,1,103440,4,9,3,3,4,1,1,0,0,24,1 -51,1,203435,2,10,2,4,6,1,1,0,0,40,17 -31,4,168312,7,11,3,5,4,1,1,0,0,40,1 -49,3,257764,4,9,2,5,6,1,1,0,0,40,1 -49,1,171301,4,9,1,8,1,5,1,0,0,40,1 -53,4,225339,2,10,4,9,4,5,1,0,0,40,1 -52,1,152234,4,9,1,5,3,2,2,99999,0,40,9 -20,1,444554,13,6,3,7,2,1,2,0,0,40,1 -26,1,403788,6,12,3,3,2,5,2,0,0,40,1 -61,0,190997,4,9,1,0,3,1,2,0,0,6,1 -43,1,221550,11,14,3,3,4,1,1,0,0,30,18 -46,3,98929,1,13,1,4,3,1,2,0,0,52,1 -43,5,169203,7,11,3,9,4,1,1,0,0,35,1 
-41,1,102332,4,9,2,4,6,5,1,0,0,40,1 -44,2,230684,4,9,1,11,3,1,2,0,0,55,1 -54,1,449257,4,9,1,10,3,1,2,0,0,40,1 -65,1,198766,11,14,1,4,3,1,2,20051,0,40,1 -32,1,97429,1,13,2,5,6,1,1,0,0,40,5 -25,1,208999,2,10,4,9,2,1,2,0,0,40,1 -23,1,37072,1,13,3,4,4,1,2,0,0,50,1 -25,5,163101,1,13,3,6,4,1,2,0,0,45,1 -19,1,119075,2,10,3,7,2,1,2,0,0,50,1 -37,2,137314,4,9,1,5,3,1,2,0,0,60,1 -45,1,127303,2,10,3,8,2,1,2,0,0,45,1 -37,1,349116,4,9,3,4,4,5,2,0,0,44,1 -40,2,266324,2,10,2,5,5,1,2,0,1564,70,14 -19,0,194095,2,10,3,0,2,1,2,0,0,40,1 -17,1,46496,3,7,3,3,2,1,2,0,0,5,1 -27,1,29904,1,13,3,1,4,1,1,0,0,40,1 -40,5,289403,9,4,1,2,3,5,2,0,1887,40,0 -59,1,226922,4,9,2,4,6,1,1,0,1762,30,1 -19,4,234151,4,9,3,9,6,5,1,0,0,40,1 -43,1,238287,13,6,3,8,4,5,2,0,0,40,1 -42,1,230624,13,6,3,11,6,1,2,0,0,40,1 -54,1,398212,4,9,1,11,3,5,2,5013,0,40,1 -54,2,114758,11,14,1,4,3,1,2,0,0,50,1 -51,1,246519,13,6,1,7,3,1,2,2105,0,45,1 -50,1,137815,4,9,1,11,3,1,2,0,0,60,1 -40,1,260696,4,9,3,9,4,1,1,0,0,40,1 -55,1,325007,6,12,1,1,3,1,2,0,0,25,1 -50,1,113176,4,9,2,9,4,1,1,0,0,40,1 -31,1,66815,4,9,1,2,3,1,2,0,0,50,1 -42,0,51795,4,9,2,0,6,5,1,0,0,32,1 -24,1,241523,4,9,1,3,3,1,2,0,0,45,1 -30,1,30226,3,7,2,4,2,1,2,0,0,40,1 -39,5,352628,6,12,1,2,1,1,1,0,0,50,1 -37,1,143912,4,9,2,11,4,1,2,0,0,50,1 -33,1,130021,4,9,1,11,3,1,2,0,0,40,1 -48,1,329778,4,9,4,3,6,1,1,0,0,40,1 -43,3,196945,4,9,1,3,3,2,2,0,0,78,34 -39,1,24342,1,13,1,6,3,1,2,0,0,40,1 -53,1,34368,2,10,1,5,3,1,2,0,0,45,1 -52,2,173839,13,6,1,5,3,1,2,0,0,60,1 -28,6,73211,2,10,1,9,3,2,2,0,0,20,1 -32,1,86723,4,9,3,3,4,1,2,0,0,52,1 -31,1,179186,1,13,1,4,3,5,2,0,0,90,1 -31,1,127610,1,13,1,6,1,1,1,0,0,40,1 -47,1,115070,2,10,3,9,4,1,1,0,0,40,1 -19,0,172582,2,10,3,0,2,1,2,0,0,50,1 -40,1,256202,7,11,1,7,3,5,2,0,0,40,1 -40,1,202872,6,12,3,9,2,1,1,0,0,45,1 -41,1,184102,3,7,2,3,4,1,2,0,0,40,1 -53,4,130703,4,9,1,9,3,5,2,0,0,40,1 -46,1,134727,3,7,2,8,6,3,2,0,0,43,6 -45,3,36228,1,13,1,10,3,1,2,4386,0,35,1 -39,1,297847,8,5,1,3,1,5,1,3411,0,34,1 -19,1,213644,4,9,3,8,4,1,2,0,0,40,1 
-57,1,173796,4,9,1,11,3,1,2,0,1887,40,1 -49,1,147322,7,11,1,8,1,1,1,0,0,40,38 -59,1,296253,4,9,2,5,4,1,1,0,0,40,1 -32,1,180871,7,11,2,4,4,1,2,0,0,50,1 -18,0,169882,2,10,3,0,2,1,1,594,0,15,1 -35,6,211115,2,10,3,13,4,5,2,0,0,40,1 -58,3,183870,13,6,1,11,1,1,1,0,0,40,1 -28,1,441620,1,13,3,3,4,1,2,0,0,43,21 -36,4,218542,6,12,1,9,1,1,1,0,0,40,1 -41,2,141327,11,14,2,6,6,1,1,0,0,35,1 -47,1,67716,4,9,1,4,3,1,2,0,0,40,1 -50,3,175339,4,9,1,2,3,1,2,0,1672,60,1 -61,0,347089,4,9,1,0,3,1,2,0,0,16,1 -36,1,336595,4,9,1,5,3,1,2,0,0,50,1 -38,1,27997,7,11,2,2,4,1,2,0,0,40,1 -56,2,145574,4,9,1,3,3,1,2,0,1902,60,1 -50,1,30447,7,11,1,11,3,1,2,0,0,45,1 -45,2,256866,2,10,1,4,3,1,2,5013,0,40,1 -44,2,120837,4,9,1,5,3,1,2,0,0,66,1 -51,1,185283,2,10,1,5,3,1,2,0,0,45,1 -44,3,229466,2,10,1,5,3,1,2,0,0,50,1 -25,1,298225,4,9,2,2,4,1,2,0,0,50,1 -60,1,185749,3,7,4,11,6,5,2,0,0,40,1 -17,0,333100,13,6,3,0,2,1,2,1055,0,30,1 -49,3,125892,11,14,1,5,3,1,2,0,0,50,1 -46,1,563883,4,9,1,2,3,5,2,0,0,60,1 -56,1,311249,4,9,4,9,6,5,1,0,0,38,1 -25,1,221757,4,9,2,2,4,1,2,3325,0,45,1 -22,1,310152,2,10,3,11,4,1,2,0,0,40,1 -76,0,211453,4,9,4,0,4,5,1,0,0,2,1 -41,3,94113,5,15,1,6,3,1,2,0,0,60,1 -48,3,192945,4,9,1,5,3,1,2,7688,0,40,1 -46,1,161508,13,6,3,8,4,5,2,0,0,40,1 -30,1,177675,2,10,3,2,4,1,2,0,0,40,1 -39,1,51100,4,9,1,11,3,1,2,0,0,40,1 -40,1,100584,13,6,2,2,4,3,2,0,0,40,1 -70,4,163003,4,9,1,9,3,5,2,0,0,40,1 -35,1,67728,2,10,1,2,3,1,2,0,2051,45,1 -49,1,101320,11,14,2,5,4,1,1,0,0,75,1 -24,1,42706,7,11,3,13,4,1,2,0,0,60,1 -40,1,228535,6,12,1,6,3,1,2,7298,0,36,1 -61,1,120939,5,15,1,11,3,1,2,0,0,5,1 -25,1,98283,1,13,3,6,2,2,2,0,0,40,1 -28,5,216481,1,13,3,6,4,1,1,0,0,40,1 -69,6,208869,11,14,1,6,3,1,2,0,0,11,1 -22,1,207940,2,10,3,7,2,1,1,0,0,36,1 -47,1,34248,4,9,1,9,3,1,2,0,0,38,1 -38,1,83727,4,9,2,3,6,1,1,0,0,48,1 -26,1,183077,7,11,3,9,2,1,1,0,0,40,1 -17,1,197850,3,7,3,9,2,2,1,0,0,24,1 -33,2,235271,2,10,3,3,4,1,2,0,0,35,1 -43,2,35236,4,9,6,2,4,1,2,0,0,40,1 -58,1,255822,7,11,1,2,3,1,2,0,0,40,1 
-53,3,263925,4,9,1,4,3,1,2,99999,0,40,1 -26,1,256263,4,9,3,2,4,1,2,0,0,25,1 -43,5,293535,1,13,1,6,3,5,2,0,0,40,1 -31,1,209448,13,6,1,10,3,1,2,2105,0,40,21 -30,1,57651,4,9,3,9,6,1,2,0,2001,42,1 -25,1,174592,4,9,3,3,2,1,1,0,0,25,1 -57,4,278763,4,9,2,9,6,1,1,0,0,40,1 -37,1,175232,11,14,2,5,6,1,2,0,0,60,1 -32,1,402812,2,10,1,4,3,1,2,0,0,48,1 -26,1,101150,4,9,2,5,4,1,1,0,0,41,1 -45,1,103538,4,9,1,9,3,1,2,0,0,40,1 -53,6,156877,4,9,1,9,3,1,2,15024,0,35,1 -27,1,23940,4,9,3,7,5,3,2,0,0,40,1 -28,3,210295,3,7,1,2,3,1,2,0,0,50,1 -32,1,80058,3,7,2,4,4,1,2,0,0,43,1 -35,1,187119,1,13,2,4,4,1,1,0,1980,65,1 -36,2,105021,2,10,1,5,3,1,2,0,0,55,1 -19,1,225775,2,10,3,9,4,1,2,0,0,40,1 -37,3,395831,1,13,1,5,3,1,2,0,0,80,1 -49,1,50282,2,10,2,8,4,1,2,3325,0,45,1 -20,1,32732,2,10,3,3,4,1,2,0,0,45,1 -64,3,179436,1,13,1,5,3,1,2,15024,0,55,1 -60,0,290593,11,14,1,0,3,1,2,0,0,40,1 -32,1,123253,6,12,1,2,3,1,2,0,0,42,1 -58,6,48433,4,9,1,5,3,1,2,0,0,40,1 -42,1,245317,4,9,1,4,3,1,2,0,0,50,1 -20,1,431745,2,10,3,9,4,5,1,0,0,14,1 -42,6,436006,2,10,1,9,3,5,2,0,0,40,1 -25,1,224943,2,10,6,6,6,5,2,0,0,40,1 -30,2,167990,7,11,1,6,3,1,2,15024,0,65,1 -37,3,217054,1,13,1,6,3,1,2,0,0,35,1 -66,2,298834,7,11,1,6,3,1,2,0,0,50,1 -59,3,125000,1,13,1,5,3,1,2,0,0,40,3 -44,1,123983,1,13,2,3,4,2,2,0,0,40,12 -46,1,155489,4,9,1,9,3,1,2,0,0,58,1 -59,1,284834,1,13,1,9,1,1,1,2885,0,30,1 -25,1,212495,1,13,3,5,2,1,2,0,1340,40,1 -17,5,32124,8,5,3,3,2,5,2,0,0,9,1 -47,5,246891,1,13,1,13,3,1,2,0,0,40,1 -47,6,141483,8,5,2,3,4,1,1,0,0,40,1 -30,1,31985,2,10,2,5,6,1,1,0,0,40,1 -20,1,170800,2,10,3,10,2,1,1,0,0,40,1 -26,5,166295,1,13,3,6,4,1,2,0,2339,55,1 -20,1,231286,2,10,3,7,4,5,2,0,0,15,1 -33,1,159322,4,9,2,3,6,1,2,0,0,40,1 -48,1,176026,4,9,1,8,3,1,2,0,0,40,1 -52,1,118025,1,13,1,5,3,1,2,99999,0,50,1 -37,1,26898,4,9,2,5,6,1,1,0,0,12,1 -47,1,232628,4,9,1,2,3,5,2,0,0,40,1 -40,1,85995,13,6,1,8,3,1,2,0,0,40,1 -48,1,125421,11,14,2,5,6,1,1,0,0,40,1 -49,1,245305,13,6,1,11,3,5,2,0,0,42,1 -50,1,73493,2,10,2,3,4,1,1,0,0,40,1 
-30,1,197058,6,12,3,6,4,1,1,0,0,40,1 -34,1,122116,2,10,1,5,3,1,2,0,0,40,1 -43,1,75742,4,9,1,8,3,1,2,0,0,40,1 -22,1,214731,13,6,1,8,1,1,1,0,0,40,1 -35,1,265954,4,9,4,3,4,5,2,0,0,40,1 -26,6,197156,4,9,2,9,2,1,1,0,0,30,1 -62,1,162245,5,15,1,6,3,1,2,0,1628,70,1 -39,5,203070,4,9,4,13,4,1,2,0,0,40,1 -59,5,165695,2,10,1,2,3,1,2,0,0,40,1 -69,0,473040,15,3,2,0,4,1,2,0,0,40,1 -27,1,168107,1,13,1,2,3,1,2,0,0,40,1 -17,1,163494,13,6,3,4,2,1,2,0,0,30,1 -38,1,180342,1,13,1,9,3,1,2,0,0,40,1 -41,1,122381,2,10,1,2,3,1,2,0,1887,50,1 -27,1,148069,13,6,3,8,6,1,1,0,0,40,1 -23,1,200973,4,9,3,9,2,1,1,0,0,40,1 -17,1,130806,13,6,3,7,2,1,2,0,0,24,1 -56,1,117148,9,4,2,8,4,1,1,0,0,40,1 -24,1,213977,2,10,3,9,4,1,1,0,0,40,1 -62,1,134768,14,16,1,6,3,1,2,7688,0,40,0 -44,1,139338,10,8,2,11,6,5,2,0,0,40,1 -23,1,315877,1,13,3,9,4,1,2,0,0,30,1 -41,2,195124,4,9,3,4,4,1,2,0,0,50,0 -25,1,352057,4,9,3,11,4,1,2,0,0,65,1 -21,1,236684,2,10,3,3,5,5,1,0,0,8,1 -18,1,208447,10,8,3,7,2,1,2,0,0,6,1 -45,1,149640,2,10,1,2,3,1,2,0,0,40,1 -75,0,111177,1,13,4,0,4,1,1,25124,0,16,1 -51,1,154342,9,4,1,13,3,1,2,0,0,40,1 -42,4,141459,4,9,4,3,5,5,1,0,0,40,1 -47,1,111797,2,10,3,3,4,5,1,0,0,35,7 -29,1,111900,2,10,1,10,3,1,2,0,0,40,1 -33,1,78707,3,7,3,3,4,5,2,0,0,40,1 -43,5,160574,1,13,1,2,3,1,2,0,0,40,1 -20,0,174714,2,10,3,0,2,1,2,0,0,16,1 -19,0,62534,1,13,3,0,2,5,1,0,0,40,19 -44,1,216907,6,12,1,2,3,1,2,0,1848,40,1 -24,1,198148,2,10,3,3,2,1,2,0,0,30,1 -19,1,124265,2,10,3,7,2,1,2,0,0,40,1 -49,0,261059,13,6,4,0,2,1,2,2176,0,40,1 -52,1,208137,7,11,1,9,3,1,2,0,0,40,1 -38,2,257250,4,9,3,10,4,1,2,0,0,52,1 -24,6,147253,2,10,3,1,4,1,2,0,0,50,1 -32,5,244268,1,13,3,6,4,1,2,0,0,50,1 -72,0,213255,4,9,4,0,4,1,1,0,0,8,1 -26,1,266912,4,9,1,4,3,1,2,0,0,50,1 -31,1,169104,1,13,3,4,2,2,2,0,0,40,0 -29,1,200511,1,13,1,4,3,1,2,0,0,50,1 -39,1,128715,4,9,2,9,4,1,2,10520,0,40,1 -48,2,65535,4,9,1,10,3,1,2,0,0,50,1 -40,1,103395,2,10,1,2,3,1,2,0,0,45,1 -51,1,71046,2,10,2,5,6,1,2,0,0,45,33 -28,2,125442,4,9,2,2,2,1,2,0,0,40,1 
-22,1,169188,4,9,3,5,2,1,1,0,0,20,1 -23,1,121471,1,13,3,9,4,1,1,0,0,40,1 -65,1,207281,2,10,1,13,3,1,2,0,0,16,1 -26,5,46097,1,13,1,6,3,1,2,0,0,40,1 -20,0,206671,2,10,3,0,2,1,2,1055,0,50,1 -55,1,98361,11,14,1,6,3,1,2,15024,0,50,0 -38,2,322143,1,13,1,3,3,1,2,0,0,10,1 -33,1,149184,4,9,3,6,4,1,2,0,0,60,1 -33,5,119829,4,9,1,9,1,1,1,0,0,60,1 -37,1,910398,1,13,3,4,4,5,1,0,0,40,1 -19,1,176570,3,7,3,8,2,1,2,0,0,60,1 -24,1,216129,1,13,3,4,4,1,2,0,0,50,1 -30,1,27207,3,7,1,2,3,1,2,0,0,45,1 -57,6,68830,4,9,2,5,4,1,1,0,0,50,1 -22,6,178818,2,10,3,6,2,1,1,0,0,20,1 -57,1,236944,4,9,1,8,3,5,2,0,0,40,1 -46,6,273771,4,9,1,1,3,1,2,0,0,40,1 -67,1,318533,4,9,1,9,3,1,2,0,0,35,1 -35,0,451940,4,9,3,0,6,5,1,0,0,40,1 -47,1,102318,4,9,4,2,2,1,2,0,0,40,1 -39,1,379350,13,6,1,2,3,1,2,0,0,40,1 -50,1,21095,2,10,2,3,6,2,2,0,0,40,16 -58,2,211547,10,8,2,4,4,1,1,0,0,52,1 -36,1,85272,6,12,1,6,1,1,1,0,0,30,1 -45,1,46406,11,14,1,6,3,1,2,0,0,36,3 -54,1,53833,4,9,1,8,3,1,2,0,0,40,1 -26,1,161007,1,13,3,9,2,1,2,0,0,30,1 -60,1,53707,4,9,1,6,3,1,2,0,0,40,1 -46,1,370119,5,15,1,6,3,1,2,99999,0,60,1 -26,1,310907,2,10,1,5,1,1,1,0,0,35,1 -32,1,375833,3,7,3,2,5,1,2,0,0,40,1 -38,5,107513,4,9,1,11,3,1,2,0,0,40,1 -48,2,58683,2,10,1,4,3,1,2,0,0,70,1 -44,2,179557,4,9,1,5,3,1,2,0,1977,45,1 -37,1,70240,4,9,3,3,2,2,1,0,0,40,16 -44,1,147206,2,10,1,5,3,1,2,0,0,40,1 -31,1,175548,4,9,3,3,4,4,1,0,0,35,1 -61,2,163174,5,15,1,6,3,1,2,0,0,35,1 -51,1,126010,4,9,1,2,3,1,2,0,0,40,1 -52,1,147876,1,13,1,4,1,1,1,15024,0,60,1 -45,1,428350,4,9,1,8,1,1,1,0,1740,40,1 -36,0,200904,6,12,1,0,1,5,1,0,0,21,28 -39,1,328466,3,7,1,3,3,1,2,2407,0,70,21 -67,5,258973,4,9,2,3,4,1,1,0,0,24,1 -40,6,345969,6,12,1,9,3,1,2,0,0,40,1 -27,1,127796,15,3,3,10,4,1,2,0,0,35,21 -37,1,405723,12,2,1,8,3,1,2,0,0,40,21 -57,1,175942,2,10,1,5,3,1,2,0,0,50,1 -27,1,284196,13,6,2,3,4,1,1,0,0,40,1 -28,1,89718,4,9,3,4,4,1,1,2202,0,48,1 -34,3,175761,1,13,1,6,3,1,2,0,0,60,1 -54,1,206369,4,9,1,8,3,1,2,5178,0,50,1 -52,1,158993,4,9,2,3,5,5,1,0,0,38,1 
-42,1,285066,1,13,1,6,3,1,2,0,0,45,1 -48,1,126754,11,14,1,6,3,1,2,15024,0,40,1 -65,6,209280,11,14,1,6,3,1,2,6514,0,35,1 -55,2,52888,5,15,1,6,1,1,1,0,0,10,1 -71,3,133821,4,9,1,9,1,1,1,0,0,20,1 -33,1,240763,2,10,1,8,3,5,2,0,0,40,1 -30,1,39054,4,9,1,4,3,1,2,0,0,40,1 -35,1,119272,1,13,1,5,3,1,2,0,0,55,1 -59,1,143372,13,6,2,9,4,5,1,0,0,40,1 -19,1,323421,2,10,3,7,2,1,2,0,0,20,1 -36,2,136028,8,5,1,2,3,1,2,0,0,30,1 -26,2,163189,4,9,3,4,2,1,2,0,0,40,1 -34,5,202729,4,9,1,13,3,1,2,0,0,40,1 -41,1,421871,4,9,1,2,3,5,2,0,0,40,1 -44,1,120277,1,13,1,6,3,1,2,15024,0,50,17 -26,0,211798,4,9,4,0,6,1,1,0,0,40,1 -47,1,198901,4,9,1,2,3,1,2,0,0,48,1 -18,1,214617,2,10,3,4,2,1,2,0,0,16,1 -55,2,179715,13,6,1,2,3,1,2,0,0,18,1 -49,5,107231,4,9,1,2,3,1,2,0,2002,40,1 -44,1,110355,1,13,1,5,3,1,2,0,0,40,1 -43,1,184378,1,13,3,6,4,1,2,0,0,40,1 -62,1,273454,9,4,1,13,3,1,2,0,0,40,13 -44,1,443040,4,9,1,7,3,5,2,0,0,40,1 -39,0,71701,4,9,2,0,6,1,1,0,0,40,1 -50,3,160151,2,10,1,4,3,1,2,0,0,60,1 -35,1,107991,3,7,3,4,4,1,2,0,0,45,1 -52,1,94391,4,9,1,9,3,1,2,0,0,40,1 -46,1,99835,5,15,1,6,3,1,2,0,0,60,1 -44,1,43711,6,12,1,4,3,1,2,7688,0,40,1 -43,1,83756,2,10,3,5,6,1,2,0,0,50,1 -51,1,120914,9,4,1,2,3,1,2,2961,0,40,1 -20,1,180052,2,10,3,4,2,1,2,0,0,20,1 -47,1,170846,6,12,1,9,1,1,1,0,0,40,17 -43,1,37937,11,14,2,5,6,1,2,0,0,50,1 -64,0,168340,4,9,1,0,3,1,2,0,0,40,0 -24,1,38455,2,10,3,2,4,1,2,0,0,40,1 -27,4,128059,2,10,1,2,3,1,2,0,0,50,1 -32,1,420895,4,9,1,11,3,1,2,0,0,40,1 -37,1,166744,4,9,2,3,6,1,1,0,0,12,1 -26,1,238768,4,9,1,8,3,1,2,0,0,60,1 -43,1,176270,1,13,1,5,3,1,2,99999,0,60,1 -50,1,140592,4,9,1,8,1,1,1,0,0,40,1 -20,2,211466,4,9,3,11,2,1,2,0,0,80,1 -37,1,188540,4,9,1,5,3,1,2,0,1902,45,1 -43,1,39581,4,9,3,3,4,5,1,0,0,45,1 -37,1,171150,4,9,1,5,3,1,2,0,1887,50,1 -53,1,117496,8,5,2,3,4,1,1,0,0,36,5 -44,1,145160,14,16,1,6,3,1,2,0,0,40,1 -25,1,28520,4,9,3,3,4,1,1,0,0,40,1 -17,1,103851,3,7,3,9,2,1,1,1055,0,20,1 -19,1,375077,4,9,3,4,2,1,2,0,0,50,1 -53,6,281590,4,9,1,5,1,1,1,15024,0,40,1 
-44,1,151504,2,10,1,4,3,1,2,0,0,50,1 -51,1,415287,4,9,1,8,3,5,2,0,1902,40,1 -49,1,32212,4,9,1,5,1,1,1,0,0,43,1 -35,1,123606,2,10,1,7,3,1,2,0,0,40,1 -44,1,202565,4,9,2,2,4,1,2,0,0,45,1 -54,1,177927,11,14,1,6,3,1,2,0,0,60,1 -37,1,256723,2,10,2,9,6,5,1,0,0,35,1 -18,1,46247,2,10,3,3,2,1,1,0,0,15,1 -24,1,266926,2,10,1,7,3,1,2,0,0,40,1 -29,1,112031,4,9,2,2,4,1,1,0,0,50,1 -22,0,376277,2,10,2,0,4,1,1,0,0,35,1 -35,1,168817,4,9,3,8,4,1,2,0,0,40,1 -56,1,187487,4,9,1,2,3,1,2,0,0,48,1 -32,0,158784,9,4,4,0,4,1,1,0,0,40,1 -24,1,67222,1,13,3,8,4,2,2,0,0,45,12 -43,1,201723,11,14,1,5,3,1,2,0,1902,40,1 -73,1,267408,4,9,4,4,5,1,1,0,0,15,1 -47,4,168191,2,10,1,2,3,1,2,0,0,40,0 -49,1,105444,10,8,1,8,3,1,2,0,0,39,1 -38,1,156728,1,13,1,4,3,1,2,0,0,40,1 -31,1,148600,4,9,1,8,3,5,2,0,0,40,1 -39,1,19914,2,10,2,9,6,3,1,0,0,40,1 -42,1,190767,1,13,2,9,4,1,1,0,0,40,1 -41,1,233955,11,14,1,6,3,2,2,0,0,45,12 -35,1,30381,1,13,1,4,3,1,2,0,0,45,1 -38,1,187069,4,9,1,11,3,1,2,0,0,45,1 -31,1,367314,4,9,2,9,4,1,1,0,0,40,1 -51,5,101119,4,9,1,13,3,1,2,0,0,70,1 -38,1,86551,1,13,2,4,4,1,2,0,0,48,1 -40,5,218995,2,10,1,5,3,1,2,0,0,42,1 -21,1,57711,4,9,3,3,4,1,1,0,0,30,1 -44,1,303521,2,10,2,2,4,1,2,0,0,40,1 -55,1,199067,4,9,1,2,3,1,2,0,0,40,1 -29,1,247445,4,9,2,4,4,1,2,0,0,45,1 -49,1,186078,11,14,1,6,1,1,1,0,0,50,1 -31,1,77634,7,11,1,3,3,1,2,0,0,42,1 -24,1,180060,11,14,3,5,2,1,2,6849,0,90,1 -46,1,56482,2,10,3,5,4,5,2,0,0,40,1 -26,1,314177,4,9,3,4,4,5,2,0,0,40,1 -35,1,239755,1,13,3,3,4,1,2,0,0,38,1 -27,1,377680,7,11,3,4,4,1,2,0,0,50,1 -64,2,134960,1,13,1,5,3,1,2,15024,0,35,1 -26,1,294493,1,13,3,2,5,1,2,0,0,40,1 -21,1,32616,4,9,3,3,4,1,1,0,1719,16,1 -45,1,182655,1,13,2,3,4,1,2,0,0,45,0 -57,5,52267,4,9,1,11,3,1,2,0,0,72,1 -30,1,117963,1,13,1,9,3,1,2,0,0,40,1 -45,1,98881,3,7,1,3,1,1,1,0,0,32,1 -50,1,196963,9,4,2,2,4,1,1,0,0,30,1 -38,1,166988,1,13,3,9,2,1,1,0,0,40,1 -43,2,193459,2,10,1,2,3,1,2,0,0,60,1 -42,1,182342,2,10,4,5,4,1,1,0,0,55,1 -32,1,496743,1,13,3,6,2,1,2,0,0,40,1 -20,1,154781,4,9,3,9,4,1,1,0,0,40,1 
-27,1,219371,4,9,3,3,4,1,1,0,0,35,1 -45,1,99179,3,7,4,9,6,1,1,0,0,40,1 -40,1,224910,4,9,3,8,4,1,1,0,0,40,1 -38,1,304651,4,9,2,11,4,1,2,0,0,60,1 -37,1,349689,4,9,2,9,6,1,1,0,0,40,1 -60,1,106850,13,6,2,8,4,1,1,0,0,40,1 -53,2,196328,4,9,1,5,3,5,2,0,0,45,1 -25,1,169323,1,13,1,5,2,1,1,0,0,40,1 -47,2,162924,1,13,2,5,4,2,2,0,0,60,9 -40,2,34037,4,9,3,10,4,1,2,0,0,70,1 -51,0,167651,4,9,1,0,3,1,2,0,0,40,1 -19,1,197384,2,10,3,3,2,1,1,0,0,10,1 -42,1,251795,2,10,1,6,1,1,1,0,0,50,1 -65,0,266081,4,9,1,0,3,1,2,0,0,40,1 -41,1,165309,1,13,2,9,6,1,1,0,0,40,1 -28,1,215873,13,6,3,8,2,5,2,0,0,45,1 -46,1,133938,11,14,2,5,4,1,1,27828,0,50,1 -49,1,159816,1,13,1,6,1,1,1,99999,0,20,1 -24,1,228424,4,9,3,7,5,5,2,0,0,40,1 -32,1,195576,3,7,1,8,3,1,2,0,0,40,1 -71,1,105200,4,9,1,13,3,1,2,6767,0,20,1 -26,1,167350,2,10,1,1,3,1,2,3103,0,40,1 -29,1,52199,4,9,6,13,4,1,2,0,0,40,1 -50,1,171338,2,10,1,5,3,1,2,99999,0,50,1 -51,1,120173,7,11,1,2,3,1,2,3103,0,50,1 -17,0,158762,13,6,3,0,2,1,1,0,0,20,1 -49,1,169818,4,9,1,3,1,5,1,0,0,40,1 -31,1,288419,13,6,1,8,3,1,2,0,0,40,1 -23,1,207546,3,7,1,8,3,1,2,0,0,40,1 -59,5,147707,4,9,4,10,6,1,2,0,2339,40,1 -17,0,228373,13,6,3,0,2,1,2,0,0,30,1 -43,1,193882,2,10,1,4,3,1,2,7688,0,40,1 -38,1,31033,1,13,1,4,3,1,2,7298,0,40,1 -37,1,272950,4,9,1,5,3,1,2,0,0,40,1 -29,1,183523,13,6,3,3,2,1,2,0,0,40,1 -39,1,238415,1,13,1,5,3,1,2,0,0,50,1 -31,1,19302,6,12,3,9,4,1,2,2202,0,38,1 -42,5,339671,1,13,6,6,4,1,1,8614,0,45,1 -35,5,103260,11,14,1,6,1,1,1,0,0,35,1 -39,1,79331,11,14,1,5,3,2,2,15024,0,40,1 -40,1,135056,6,12,2,9,6,1,1,0,0,40,1 -66,1,142723,15,3,6,7,6,1,1,0,0,40,4 -30,4,188569,8,5,1,1,3,1,2,0,0,40,1 -43,1,57322,6,12,2,2,4,1,2,0,0,40,1 -47,1,178309,8,5,3,3,6,1,1,0,0,50,1 -45,1,166107,11,14,3,9,4,2,1,0,0,40,16 -31,1,53042,2,10,1,8,3,5,2,0,0,40,37 -33,1,155343,4,9,1,4,3,1,2,3103,0,40,1 -32,1,35595,1,13,3,9,4,1,2,0,0,40,1 -28,1,429507,6,12,1,7,3,5,2,0,0,40,1 -50,4,159670,4,9,1,9,3,1,2,0,0,40,1 -63,1,151210,9,4,2,4,4,1,1,0,0,40,1 -28,1,186792,1,13,1,9,3,1,2,0,0,40,1 
-38,1,204640,2,10,4,9,6,5,1,0,0,40,1 -52,1,87205,4,9,2,3,4,1,1,0,0,38,1 -38,3,112847,5,15,1,6,3,2,2,0,0,40,1 -41,1,107306,4,9,3,5,4,1,2,2174,0,40,1 -50,6,211319,11,14,3,6,4,1,2,0,0,38,1 -59,1,183606,3,7,1,8,3,1,2,0,0,40,1 -32,1,205390,4,9,1,4,3,1,2,0,0,49,1 -73,5,232871,9,4,1,13,3,1,2,2228,0,10,1 -52,3,101017,4,9,2,5,6,1,2,0,0,38,1 -57,1,114495,4,9,1,5,3,1,2,0,0,60,1 -35,1,183898,7,11,1,5,3,1,2,7298,0,50,1 -51,1,163921,1,13,1,11,3,1,2,0,0,56,1 -22,1,311764,3,7,4,4,2,5,1,0,0,35,1 -49,1,188330,4,9,1,4,3,1,2,0,0,40,1 -22,1,267174,4,9,3,7,2,5,2,0,0,40,1 -46,5,36228,4,9,1,2,3,1,2,0,1902,40,1 -48,1,199739,4,9,2,2,6,1,1,0,0,40,1 -52,1,185407,4,9,1,2,3,1,2,0,0,40,0 -43,6,206139,1,13,1,9,3,1,2,0,0,50,1 -25,1,282063,9,4,1,7,3,1,2,0,0,40,21 -31,1,332379,9,4,3,2,4,1,2,0,0,40,1 -19,1,418324,2,10,3,4,2,1,2,0,0,36,1 -19,0,263338,2,10,3,0,2,1,2,0,0,45,1 -51,1,158948,4,9,1,11,3,1,2,0,0,84,1 -51,1,221532,1,13,2,5,6,1,2,0,0,40,1 -22,2,202920,4,9,3,6,6,1,1,99999,0,40,24 -37,5,118909,4,9,2,9,6,5,1,0,0,35,1 -19,1,286469,2,10,3,4,2,1,1,0,0,30,1 -45,1,191914,4,9,2,11,6,1,1,0,0,55,1 -21,6,142766,2,10,3,3,2,1,2,0,0,10,1 -52,1,198744,4,9,4,4,4,1,1,0,0,40,1 -46,5,272780,2,10,1,6,3,1,2,0,0,24,1 -42,6,219553,11,14,2,6,4,1,1,0,0,38,1 -56,1,261232,4,9,3,2,4,1,2,0,0,40,1 -23,1,64292,4,9,3,9,4,1,1,0,0,40,1 -58,1,312131,11,14,1,4,3,1,2,0,0,40,1 -70,1,30713,4,9,1,10,3,1,2,0,0,30,1 -30,1,246439,2,10,2,9,6,1,2,0,0,40,1 -45,1,338105,1,13,1,9,3,2,2,0,0,40,16 -23,1,228243,2,10,3,2,4,1,2,0,0,44,1 -34,5,62463,2,10,1,11,3,1,2,0,1579,40,1 -38,1,31603,1,13,1,2,1,1,1,0,0,40,1 -24,1,165054,7,11,1,2,3,1,2,0,0,40,1 -53,1,121618,9,4,3,11,4,3,2,0,0,40,1 -45,4,273194,4,9,3,11,4,5,2,3325,0,40,1 -21,0,163665,2,10,3,0,2,1,1,0,0,40,1 -21,1,538319,2,10,3,3,2,1,1,0,0,40,4 -34,1,238246,1,13,3,6,4,1,1,0,0,40,1 -32,3,244665,4,9,1,2,3,1,2,5178,0,45,1 -21,1,131811,4,9,1,7,3,1,2,0,0,40,1 -63,0,231777,11,14,1,0,3,1,2,0,0,30,1 -23,1,156807,8,5,3,7,2,5,2,0,0,36,1 -28,1,236861,1,13,2,2,6,1,2,0,0,50,1 
-29,2,229842,4,9,3,11,6,5,2,0,0,45,1 -25,5,190057,1,13,3,6,2,1,1,0,0,40,1 -44,6,55076,11,14,3,6,4,1,1,0,0,60,1 -18,1,152545,4,9,3,3,2,1,1,0,0,8,1 -26,1,153434,4,9,3,3,2,1,2,0,0,24,1 -47,5,171095,6,12,1,9,1,1,1,0,0,35,1 -23,1,239322,4,9,2,4,4,1,2,0,0,40,1 -46,1,138999,2,10,1,8,3,1,2,0,0,40,1 -61,5,95450,1,13,1,5,3,1,2,5178,0,50,1 -25,1,176520,4,9,3,2,5,1,2,0,0,40,1 -38,5,72338,4,9,1,13,3,2,2,0,0,54,1 -60,0,386261,1,13,6,0,6,5,1,0,0,15,1 -23,1,235722,2,10,3,3,4,1,2,0,0,20,1 -36,4,128884,4,9,2,9,4,1,1,0,0,48,1 -46,1,187226,8,5,2,3,4,1,2,0,0,25,1 -32,2,298332,1,13,1,6,3,1,2,0,0,45,1 -40,1,173607,13,6,1,11,3,1,2,0,0,40,1 -31,1,226756,4,9,3,9,2,5,2,0,0,40,1 -31,1,157887,2,10,1,5,3,1,2,0,0,65,1 -32,6,171111,1,13,3,9,4,1,2,0,0,37,1 -21,1,126314,2,10,3,3,2,1,2,0,0,10,1 -63,1,174018,2,10,1,4,3,5,2,0,0,40,1 -44,1,144778,2,10,4,5,4,1,2,0,0,45,1 -42,2,201522,1,13,1,4,3,1,2,0,0,60,1 -23,0,22966,1,13,3,0,2,1,2,0,0,35,1 -30,1,399088,2,10,2,9,4,1,1,0,0,45,1 -24,1,282202,4,9,3,3,6,1,2,0,0,40,36 -42,1,102606,1,13,1,1,3,1,2,0,0,40,1 -44,2,246862,4,9,1,6,1,1,1,0,0,40,17 -27,4,508336,1,13,3,5,4,5,2,0,0,48,1 -27,5,263431,2,10,3,5,6,1,2,0,0,40,1 -22,1,235733,4,9,3,3,4,1,1,0,0,45,1 -68,1,107910,4,9,3,4,4,1,1,0,0,40,1 -55,2,184425,2,10,1,10,3,1,2,0,0,99,1 -22,2,143062,4,9,3,3,2,1,2,0,0,40,10 -25,1,199545,2,10,1,5,1,1,1,0,0,15,1 -68,2,197015,4,9,1,10,3,1,2,0,0,45,1 -62,1,149617,2,10,4,5,4,1,1,0,0,16,1 -26,1,33610,4,9,2,3,5,1,2,0,0,40,1 -34,1,192002,11,14,1,6,3,1,2,0,0,55,1 -68,1,67791,2,10,4,4,4,1,2,0,0,40,1 -42,5,445382,1,13,4,5,4,1,2,0,0,45,1 -45,1,112283,1,13,1,4,3,1,2,0,0,55,1 -26,1,157249,3,7,3,11,4,1,2,0,0,40,1 -25,1,109872,4,9,1,8,3,1,2,0,0,45,1 -23,1,119838,1,13,3,6,2,1,2,0,0,50,1 -29,1,149943,2,10,3,3,4,4,2,0,1590,40,0 -65,7,27012,9,4,4,10,6,1,1,0,0,50,1 -31,1,91666,4,9,3,11,4,1,2,0,0,50,1 -26,1,270276,4,9,3,9,4,1,1,0,0,40,1 -39,1,179271,2,10,1,2,3,1,2,0,0,50,1 -44,1,161819,4,9,1,8,3,1,2,0,0,40,1 -45,5,339681,11,14,2,6,6,1,1,1506,0,45,1 -26,2,219897,11,14,3,6,4,1,1,0,0,50,1 
-26,1,91683,13,6,1,11,3,1,2,0,0,35,1 -36,1,188834,8,5,1,11,3,1,2,0,0,50,1 -38,1,187046,3,7,1,2,3,1,2,0,0,45,1 -39,1,191807,4,9,3,8,6,1,2,0,0,48,1 -52,3,179951,5,15,1,5,3,1,2,0,0,40,1 -39,1,324420,12,2,1,2,3,1,2,0,0,45,21 -41,2,66632,2,10,2,2,4,1,2,0,0,40,1 -42,5,121718,11,14,1,5,3,1,2,0,1902,60,1 -47,1,162034,1,13,1,4,3,1,2,0,0,70,1 -28,5,218990,7,11,1,13,3,5,2,0,0,46,1 -25,5,125863,1,13,3,6,2,1,2,0,0,35,1 -35,1,225330,1,13,3,9,4,1,1,0,0,40,1 -32,1,120426,4,9,4,9,6,1,1,0,0,40,1 -38,1,119741,11,14,1,2,3,5,2,0,0,40,1 -44,1,32000,2,10,1,1,3,1,2,0,0,18,1 -21,0,124242,2,10,3,0,2,1,1,0,0,40,1 -27,1,278581,1,13,3,4,2,1,1,0,0,40,1 -30,1,230224,1,13,3,5,4,1,2,0,0,55,1 -30,1,204374,1,13,3,4,4,1,2,0,1741,48,1 -45,1,188386,4,9,1,7,3,1,2,0,1628,45,1 -20,1,164922,4,9,3,2,2,1,2,0,0,40,1 -57,1,195176,11,14,3,6,4,1,2,0,0,80,1 -43,1,166740,2,10,2,9,4,1,2,0,0,48,1 -50,0,156008,3,7,1,0,2,5,1,0,0,40,1 -28,1,162551,9,4,1,8,5,2,1,0,0,48,12 -25,1,211231,4,9,1,1,5,1,1,0,0,48,1 -25,1,169990,4,9,1,4,3,1,2,0,0,40,1 -90,1,221832,1,13,1,5,3,1,2,0,0,45,1 -38,5,255454,1,13,4,6,6,5,2,0,0,40,1 -35,1,28160,1,13,6,5,6,1,1,0,0,40,1 -50,6,159219,1,13,1,5,3,1,2,0,0,40,5 -26,5,103148,4,9,3,9,4,1,1,0,0,40,1 -39,1,165186,2,10,1,10,3,1,2,0,0,45,1 -56,1,31782,13,6,1,4,3,1,2,0,0,40,1 -24,5,249101,4,9,2,13,6,5,1,0,0,40,1 -46,1,243190,4,9,1,9,1,1,1,7688,0,40,1 -18,5,153405,3,7,3,9,5,1,1,0,0,25,1 -37,1,329980,11,14,1,5,3,1,2,0,2415,60,1 -57,1,176079,11,14,2,6,4,1,1,0,0,40,1 -43,6,218542,4,9,3,9,6,5,1,0,0,40,1 -29,6,303446,1,13,1,6,3,1,2,0,0,25,32 -40,1,102606,11,14,1,5,3,1,2,0,0,40,1 -44,2,483201,1,13,1,5,3,1,2,0,0,40,1 -77,5,144608,4,9,1,5,3,1,2,0,0,6,1 -30,1,226013,1,13,3,6,4,5,2,0,0,40,1 -21,1,165475,4,9,1,8,3,1,2,0,0,40,1 -66,1,263637,13,6,1,13,3,1,2,0,0,40,1 -40,1,201495,3,7,3,11,2,1,2,0,0,35,1 -68,1,213720,4,9,1,4,3,1,2,0,0,40,1 -64,1,170483,4,9,4,3,4,1,1,0,0,38,1 -26,1,214303,1,13,3,6,4,1,1,0,0,50,1 -32,1,190511,2,10,1,5,3,1,2,0,0,40,1 -32,1,242150,4,9,3,9,2,1,2,0,0,38,1 
-51,5,159755,11,14,1,6,3,1,2,0,0,40,1 -50,1,147629,11,14,1,6,3,1,2,15024,0,45,1 -49,1,268022,1,13,1,4,3,1,2,0,0,50,1 -28,1,188711,1,13,3,11,6,1,2,0,0,20,1 -29,1,452205,4,9,2,3,6,1,1,0,0,36,1 -21,1,260847,2,10,3,4,2,1,2,0,0,30,1 -28,1,291374,4,9,3,9,6,5,1,0,0,40,1 -55,1,189933,13,6,1,2,3,1,2,0,0,40,1 -45,2,133969,4,9,1,4,3,2,2,0,0,50,11 -35,1,330664,2,10,3,4,4,1,2,0,0,40,1 -31,0,672412,3,7,4,0,4,5,2,0,0,40,1 -26,1,122999,11,14,3,6,4,1,2,8614,0,40,1 -30,1,111415,4,9,1,3,3,1,2,0,0,55,6 -33,1,217235,4,9,1,4,3,1,2,0,0,20,1 -40,1,121956,1,13,6,6,4,2,2,13550,0,40,2 -23,1,120172,1,13,3,5,4,1,2,0,0,40,1 -35,1,343403,2,10,3,4,4,1,1,0,0,40,1 -48,2,104790,3,7,1,10,3,1,2,0,0,50,1 -39,5,473547,13,6,2,7,2,5,2,0,0,40,1 -53,5,260106,5,15,2,6,4,1,1,0,0,50,1 -49,4,168232,4,9,1,9,3,1,2,0,0,40,1 -31,1,348491,1,13,3,5,4,5,1,0,0,40,1 -36,1,24106,4,9,1,2,3,1,2,3103,0,40,1 -60,3,197553,2,10,1,5,3,1,2,7688,0,50,1 -29,1,421065,4,9,3,8,4,1,2,0,0,48,1 -54,3,138852,2,10,1,4,3,1,2,0,0,45,1 -28,0,169631,6,12,7,0,1,1,1,0,0,3,1 -34,1,379412,4,9,1,2,3,1,2,0,0,50,1 -30,1,181992,2,10,3,4,4,5,1,0,0,35,1 -19,1,365640,4,9,3,12,4,1,1,0,0,45,0 -26,1,236564,4,9,3,7,4,1,2,0,0,40,1 -47,1,363418,2,10,3,4,4,1,2,0,0,70,1 -50,1,112351,4,9,2,5,6,1,2,0,0,38,1 -30,1,204704,1,13,3,5,6,1,1,0,0,45,1 -44,1,54611,2,10,2,8,4,1,2,0,0,50,1 -49,1,128132,4,9,1,4,3,1,2,0,0,60,1 -75,2,30599,11,14,6,6,4,1,1,0,0,50,1 -37,1,379522,1,13,1,4,3,1,2,0,0,55,1 -51,6,196504,4,9,2,2,6,1,2,0,0,38,1 diff --git a/data/Adult/train_Adult_labels.csv b/data/Adult/train_Adult_labels.csv deleted file mode 100644 index 686d9fd..0000000 --- a/data/Adult/train_Adult_labels.csv +++ /dev/null @@ -1,2000 +0,0 @@ -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 
-1.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -1.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -1.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 
-1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -1.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 
-0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 
-0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 
-0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -1.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 
-1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 
-0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -1.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -1.000000000000000000e+00 -1.000000000000000000e+00 -1.000000000000000000e+00 -1.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -1.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 
-1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -1.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 
-0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -1.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -1.000000000000000000e+00 -1.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -1.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -1.000000000000000000e+00 
-0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -1.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 
-0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -1.000000000000000000e+00 -1.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -1.000000000000000000e+00 -1.000000000000000000e+00 -1.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 -0.000000000000000000e+00 -1.000000000000000000e+00 -1.000000000000000000e+00 -0.000000000000000000e+00 
diff --git a/docs/Docs_GANBLR_adaptors/Code documentation b/docs/Docs_GANBLR_adaptors/Code documentation deleted file mode 100644 index f7b8182..0000000 --- a/docs/Docs_GANBLR_adaptors/Code documentation +++ /dev/null @@ -1,128 +0,0 @@ -""" -GanblrAdapter ------------- - -A class to adapt the GANBLR model to the KatabaticModelSPI interface. This class is designed to provide an easy interface for loading, fitting, and generating data using the GANBLR model. - -Attributes ---------- -type : str - Specifies whether the model type is 'discrete' or 'continuous'. Default is 'discrete'. -constraints : any - Constraints for the model. Default is None. -batch_size : int - Batch size for training the model. Default is None. -epochs : int - Number of epochs for training the model. Default is None. -training_sample_size : int - Size of the training sample. Initialized to 0. - -Methods ------- -load_model(): - Initializes and returns an instance of the GANBLR model. -load_data(data_pathname): - Loads data from the specified pathname. -fit(X_train, y_train, k=0, epochs=10, batch_size=64): - Fits the GANBLR model using the provided training data. -generate(size=None): - Generates data from the GANBLR model. If size is not specified, defaults to the training sample size. -""" - -from katabatic_spi import KatabaticModelSPI -import pandas as pd -from .ganblr import GANBLR - -class GanblrAdapter(KatabaticModelSPI): - - def __init__(self, type='discrete'): - """ - Initializes the GanblrAdapter with specified type. - - Parameters - ---------- - type : str, optional - Should be either 'discrete' or 'continuous'. Default is 'discrete'.
- """ - self.type = type - self.constraints = None - self.batch_size = None - self.epochs = None - - def load_model(self): - """ - Initializes and returns an instance of the GANBLR model. - - Returns - ------- - GANBLR - An initialized instance of the GANBLR model. - """ - print("---Initialise Ganblr Model") - self.model = GANBLR() - self.training_sample_size = 0 - return self.model - - def load_data(self, data_pathname): - """ - Loads data from the specified pathname. - - Parameters - ---------- - data_pathname : str - The pathname to the data file. - - Returns - ------- - pandas.DataFrame - Loaded data as a DataFrame. - """ - print("Loading Data...") - data = pd.read_csv(data_pathname) # Read the CSV file into a DataFrame - return data - - def fit(self, X_train, y_train, k=0, epochs=10, batch_size=64): - """ - Fits the GANBLR model using the provided training data. - - Parameters - ---------- - X_train : array-like - Training data features. - y_train : array-like - Training data labels. - k : int, optional - A parameter for the fit method (default is 0). - epochs : int, optional - Number of epochs for training (default is 10). - batch_size : int, optional - Batch size for training (default is 64). - - Returns - ------- - None - """ - print("---FIT Ganblr Model") - self.model.fit(X_train, y_train, k, batch_size=batch_size, epochs=epochs, verbose=0) - self.training_sample_size = len(X_train) - - def generate(self, size=None): - """ - Generates data from the GANBLR model. - - Parameters - ---------- - size : int, optional - Number of samples to generate. If not specified, defaults to the training sample size. - - Returns - ------- - array-like - Generated data.
- """ - print("---Generate from Ganblr Model") - if size is None: - size = self.training_sample_size - - generated_data = self.model.sample(size, verbose=0) - return generated_data diff --git a/docs/GANBLR_Model_Documentation.pdf b/docs/GANBLR_Model_Documentation.pdf deleted file mode 100644 index 25b7948..0000000 Binary files a/docs/GANBLR_Model_Documentation.pdf and /dev/null differ diff --git a/docs/Makefile b/docs/Makefile deleted file mode 100644 index d4bb2cb..0000000 --- a/docs/Makefile +++ /dev/null @@ -1,20 +0,0 @@ -# Minimal makefile for Sphinx documentation -# - -# You can set these variables from the command line, and also -# from the environment for the first two. -SPHINXOPTS ?= -SPHINXBUILD ?= sphinx-build -SOURCEDIR = . -BUILDDIR = _build - -# Put it first so that "make" without argument is like "make help". -help: - @$(SPHINXBUILD) -M help "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O) - -.PHONY: help Makefile - -# Catch-all target: route all unknown targets to Sphinx using the new -# "make mode" option. $(O) is meant as a shortcut for $(SPHINXOPTS). -%: Makefile - @$(SPHINXBUILD) -M $@ "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O) diff --git a/docs/Meg/MegDocumentation.pdf b/docs/Meg/MegDocumentation.pdf deleted file mode 100644 index 6263f44..0000000 Binary files a/docs/Meg/MegDocumentation.pdf and /dev/null differ diff --git a/docs/conf.py b/docs/conf.py deleted file mode 100644 index 7abbd7e..0000000 --- a/docs/conf.py +++ /dev/null @@ -1,28 +0,0 @@ -# Configuration file for the Sphinx documentation builder. 
-# -# For the full list of built-in configuration values, see the documentation: -# https://www.sphinx-doc.org/en/master/usage/configuration.html - -# -- Project information ----------------------------------------------------- -# https://www.sphinx-doc.org/en/master/usage/configuration.html#project-information - -project = 'Katabatic' -copyright = '2024, Jaime Blackwell, Nayyar Zaidi' -author = 'Jaime Blackwell, Nayyar Zaidi' -release = '1.0' - -# -- General configuration --------------------------------------------------- -# https://www.sphinx-doc.org/en/master/usage/configuration.html#general-configuration - -extensions = [] - -templates_path = ['_templates'] -exclude_patterns = ['_build', 'Thumbs.db', '.DS_Store'] - - - -# -- Options for HTML output ------------------------------------------------- -# https://www.sphinx-doc.org/en/master/usage/configuration.html#options-for-html-output - -html_theme = 'alabaster' -html_static_path = ['_static'] diff --git a/docs/index.rst b/docs/index.rst deleted file mode 100644 index cefe78a..0000000 --- a/docs/index.rst +++ /dev/null @@ -1,23 +0,0 @@ -.. Katabatic documentation master file, created by - sphinx-quickstart on Mon Jan 8 11:06:15 2024. - You can adapt this file completely to your liking, but it should at least - contain the root `toctree` directive. - -Welcome to Katabatic's documentation! -===================================== - -.. toctree:: - :maxdepth: 2 - :caption: Contents: - - usage/installation - usage/quickstart - - - -Indices and tables -================== - -* :ref:`genindex` -* :ref:`modindex` -* :ref:`search` diff --git a/docs/make.bat b/docs/make.bat deleted file mode 100644 index 954237b..0000000 --- a/docs/make.bat +++ /dev/null @@ -1,35 +0,0 @@ -@ECHO OFF - -pushd %~dp0 - -REM Command file for Sphinx documentation - -if "%SPHINXBUILD%" == "" ( - set SPHINXBUILD=sphinx-build -) -set SOURCEDIR=. -set BUILDDIR=_build - -%SPHINXBUILD% >NUL 2>NUL -if errorlevel 9009 ( - echo. 
- echo.The 'sphinx-build' command was not found. Make sure you have Sphinx - echo.installed, then set the SPHINXBUILD environment variable to point - echo.to the full path of the 'sphinx-build' executable. Alternatively you - echo.may add the Sphinx directory to PATH. - echo. - echo.If you don't have Sphinx installed, grab it from - echo.https://www.sphinx-doc.org/ - exit /b 1 -) - -if "%1" == "" goto help - -%SPHINXBUILD% -M %1 %SOURCEDIR% %BUILDDIR% %SPHINXOPTS% %O% -goto end - -:help -%SPHINXBUILD% -M help %SOURCEDIR% %BUILDDIR% %SPHINXOPTS% %O% - -:end -popd diff --git a/ganblr_api.py b/ganblr_api.py deleted file mode 100644 index efb1c56..0000000 --- a/ganblr_api.py +++ /dev/null @@ -1,105 +0,0 @@ -from fastapi import FastAPI, HTTPException, File, UploadFile -from pydantic import BaseModel -import pandas as pd -import logging -from katabatic.katabatic import Katabatic -import os - -# Initialize FastAPI app -app = FastAPI( - title="GANBLR Model API", - description="An API to train and generate synthetic data using the GANBLR model. " - "Provides endpoints to train the model on a dataset and generate synthetic data.", - version="1.0.0", -) - -# Initialize logging -logging.basicConfig(level=logging.INFO) - -# Initialize the GANBLR model instance (the model instance will persist across requests) -ganblr_model = Katabatic.run_model("ganblr") - - -# Request body for training -class TrainRequest(BaseModel): - file_path: str # Path to the dataset file (CSV) - epochs: int # Number of epochs for training - - -# Request body for generating synthetic data -class GenerateRequest(BaseModel): - num_samples: int # Number of samples to generate - - -@app.post("/train", summary="Train the GANBLR model", tags=["Model Operations"]) -async def train_model(request: TrainRequest): - """ - Train the GANBLR model on the provided dataset. - - - **file_path**: Path to the CSV dataset file. - - **epochs**: Number of epochs to run the training. 
- - This will train the model and persist it in memory for future synthetic data generation. - """ - if not os.path.exists(request.file_path): - raise HTTPException(status_code=404, detail="File not found") - - try: - logging.info(f"Loading dataset from {request.file_path}...") - data = pd.read_csv(request.file_path) - - # Assuming the last column is the label, split the dataset into features (X) and labels (y) - if len(data.columns) < 2: - raise HTTPException( - status_code=400, - detail="Dataset must contain at least one feature and one label.", - ) - - X, y = data.iloc[:, :-1], data.iloc[:, -1] - - # Train the model - logging.info(f"Training the GANBLR model for {request.epochs} epochs...") - ganblr_model.load_model() # Load the model if necessary - ganblr_model.fit(X, y, epochs=request.epochs) - - logging.info("Model trained successfully.") - return {"message": "Model trained successfully", "epochs": request.epochs} - except Exception as e: - logging.error(f"Error during training: {str(e)}") - raise HTTPException( - status_code=500, detail=f"An error occurred during training: {str(e)}" - ) - - -@app.post("/generate", summary="Generate synthetic data", tags=["Model Operations"]) -async def generate_synthetic_data(request: GenerateRequest): - """ - Generate synthetic data from the trained GANBLR model. - - - **num_samples**: Number of synthetic samples to generate. - - Returns a JSON response with the generated synthetic data. 
- """ - try: - logging.info(f"Generating {request.num_samples} synthetic data samples...") - synthetic_data = ganblr_model.generate(size=request.num_samples) - synthetic_df = pd.DataFrame(synthetic_data) - - logging.info("Synthetic data generation successful.") - return {"synthetic_data": synthetic_df.to_dict(orient="records")} - except Exception as e: - logging.error(f"Error during synthetic data generation: {str(e)}") - raise HTTPException( - status_code=500, - detail=f"An error occurred during data generation: {str(e)}", - ) - - -@app.get("/", summary="Root Endpoint", tags=["General"]) -async def root(): - """ - Root endpoint for checking the status of the API. - - Returns a simple message indicating that the API is up and running. - """ - return {"message": "GANBLR Model API is running!"} diff --git a/katabatic.egg-info/PKG-INFO b/katabatic.egg-info/PKG-INFO deleted file mode 100644 index d0cd424..0000000 --- a/katabatic.egg-info/PKG-INFO +++ /dev/null @@ -1,6 +0,0 @@ -Metadata-Version: 2.1 -Name: katabatic -Version: 0.0.1 -Summary: An open source framework for tabular data generation -Author: Jaime Blackwell, Nayyar Zaidi -License-File: LICENSE diff --git a/katabatic.egg-info/SOURCES.txt b/katabatic.egg-info/SOURCES.txt deleted file mode 100644 index af780f8..0000000 --- a/katabatic.egg-info/SOURCES.txt +++ /dev/null @@ -1,14 +0,0 @@ -LICENSE -README.md -setup.py -katabatic/__init__.py -katabatic/importer.py -katabatic/katabatic.py -katabatic/katabatic_spi.py -katabatic/test2.py -katabatic.egg-info/PKG-INFO -katabatic.egg-info/SOURCES.txt -katabatic.egg-info/dependency_links.txt -katabatic.egg-info/top_level.txt -tests/testClick.py -tests/testKatabatic.py \ No newline at end of file diff --git a/katabatic.egg-info/dependency_links.txt b/katabatic.egg-info/dependency_links.txt deleted file mode 100644 index 8b13789..0000000 --- a/katabatic.egg-info/dependency_links.txt +++ /dev/null @@ -1 +0,0 @@ - diff --git a/katabatic.egg-info/top_level.txt 
b/katabatic.egg-info/top_level.txt deleted file mode 100644 index 494e03a..0000000 --- a/katabatic.egg-info/top_level.txt +++ /dev/null @@ -1 +0,0 @@ -katabatic diff --git a/katabatic/__init__.py b/katabatic/__init__.py deleted file mode 100644 index e69de29..0000000 diff --git a/katabatic/benchmark.ipynb b/katabatic/benchmark.ipynb deleted file mode 100644 index c904864..0000000 --- a/katabatic/benchmark.ipynb +++ /dev/null @@ -1,151 +0,0 @@ -{ - "cells": [ - { - "cell_type": "code", - "execution_count": null, - "metadata": {}, - "outputs": [], - "source": [ - "import numpy as np\n", - "import pandas as pd\n", - "from scipy.stats import ks_2samp\n", - "from sklearn.metrics import accuracy_score, roc_auc_score, confusion_matrix\n", - "from sklearn.model_selection import train_test_split\n", - "from sklearn.preprocessing import OrdinalEncoder, LabelEncoder\n", - "from sklearn.linear_model import LogisticRegression\n", - "from sklearn.ensemble import RandomForestClassifier\n", - "from sklearn.neural_network import MLPClassifier\n", - "from sklearn.metrics import pairwise_distances\n", - "from scipy.spatial.distance import jensenshannon\n", - "from sklearn.utils import shuffle\n", - "import matplotlib.pyplot as plt\n", - "from ganblr import GANBLR # Assuming GANBLR is saved in a module named ganblr\n", - "\n", - "# Utility function to compute descriptive statistics comparison\n", - "def compute_statistics(real_data, synthetic_data):\n", - " real_mean = np.mean(real_data, axis=0)\n", - " real_var = np.var(real_data, axis=0)\n", - " synthetic_mean = np.mean(synthetic_data, axis=0)\n", - " synthetic_var = np.var(synthetic_data, axis=0)\n", - " \n", - " mean_diff = np.abs(real_mean - synthetic_mean)\n", - " var_diff = np.abs(real_var - synthetic_var)\n", - " \n", - " print(\"\\nMean Differences between Real and Synthetic Data:\")\n", - " print(mean_diff)\n", - " print(\"\\nVariance Differences between Real and Synthetic Data:\")\n", - " print(var_diff)\n", - " \n", - " 
return mean_diff, var_diff\n", - "\n", - "# Utility function to compute distribution similarity using Kolmogorov-Smirnov test\n", - "def compute_distribution_similarity(real_data, synthetic_data):\n", - " ks_results = [ks_2samp(real_data[:, i], synthetic_data[:, i]) for i in range(real_data.shape[1])]\n", - " ks_statistic = np.array([result.statistic for result in ks_results])\n", - " ks_pvalue = np.array([result.pvalue for result in ks_results])\n", - " \n", - " print(\"\\nKolmogorov-Smirnov Test Results:\")\n", - " print(\"KS Statistic:\", ks_statistic)\n", - " print(\"P-Value:\", ks_pvalue)\n", - " \n", - " return ks_statistic, ks_pvalue\n", - "\n", - "# Utility function to compute Jensen-Shannon divergence\n", - "def compute_js_divergence(real_data, synthetic_data):\n", - " js_divergence = []\n", - " for i in range(real_data.shape[1]):\n", - " real_dist = np.histogram(real_data[:, i], bins=20, density=True)[0]\n", - " synthetic_dist = np.histogram(synthetic_data[:, i], bins=20, density=True)[0]\n", - " jsd = jensenshannon(real_dist, synthetic_dist)\n", - " js_divergence.append(jsd)\n", - " \n", - " js_divergence = np.array(js_divergence)\n", - " \n", - " print(\"\\nJensen-Shannon Divergence between Real and Synthetic Data:\")\n", - " print(js_divergence)\n", - " \n", - " return js_divergence\n", - "\n", - "# Privacy evaluation: Membership Inference Attack\n", - "def membership_inference_attack(real_data, synthetic_data, model, test_size=0.2):\n", - " real_data = shuffle(real_data, random_state=42)\n", - " synthetic_data = shuffle(synthetic_data, random_state=42)\n", - "\n", - " # Train a binary classifier to distinguish between real and synthetic data\n", - " labels = np.concatenate([np.ones(len(real_data)), np.zeros(len(synthetic_data))])\n", - " combined_data = np.vstack([real_data, synthetic_data])\n", - " \n", - " X_train, X_test, y_train, y_test = train_test_split(combined_data, labels, test_size=test_size, random_state=42)\n", - " \n", - " 
model.fit(X_train, y_train)\n", - " predictions = model.predict(X_test)\n", - " accuracy = accuracy_score(y_test, predictions)\n", - " auc = roc_auc_score(y_test, predictions)\n", - " \n", - " print(\"\\nMembership Inference Attack Results:\")\n", - " print(f\"Accuracy: {accuracy:.4f}\")\n", - " print(f\"AUC: {auc:.4f}\")\n", - " \n", - " return accuracy, auc\n", - "\n", - "# Load Dataset (e.g., UCI Adult Dataset)\n", - "from sklearn.datasets import fetch_openml\n", - "dataset = fetch_openml(name='adult', version=2)\n", - "X, y = dataset.data, dataset.target\n", - "\n", - "# Encode categorical data\n", - "ordinal_encoder = OrdinalEncoder(dtype=int, handle_unknown='use_encoded_value', unknown_value=-1)\n", - "X = ordinal_encoder.fit_transform(X)\n", - "label_encoder = LabelEncoder()\n", - "y = label_encoder.fit_transform(y).astype(int)\n", - "\n", - "# Split dataset into training and testing sets\n", - "X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)\n", - "\n", - "# Initialize GANBLR model and fit on training data\n", - "ganblr = GANBLR()\n", - "ganblr.fit(X_train, y_train, k=2, batch_size=32, epochs=50, warmup_epochs=10, verbose=1)\n", - "\n", - "# Generate synthetic data\n", - "synthetic_data = ganblr.sample(size=len(X_train))\n", - "synthetic_X, synthetic_y = synthetic_data[:, :-1], synthetic_data[:, -1]\n", - "\n", - "# Data Quality Evaluation\n", - "mean_diff, var_diff = compute_statistics(X_train, synthetic_X)\n", - "ks_statistic, ks_pvalue = compute_distribution_similarity(X_train, synthetic_X)\n", - "js_divergence = compute_js_divergence(X_train, synthetic_X)\n", - "\n", - "# Privacy Evaluation: Membership Inference Attack\n", - "logistic_model = LogisticRegression()\n", - "mia_accuracy, mia_auc = membership_inference_attack(X_train, synthetic_X, logistic_model)\n", - "\n", - "# Plotting results for better visualization\n", - "def plot_results(mean_diff, var_diff, ks_statistic, js_divergence):\n", - " fig, axs = 
plt.subplots(2, 2, figsize=(15, 10))\n", - " \n", - " axs[0, 0].bar(range(len(mean_diff)), mean_diff)\n", - " axs[0, 0].set_title('Mean Differences')\n", - " \n", - " axs[0, 1].bar(range(len(var_diff)), var_diff)\n", - " axs[0, 1].set_title('Variance Differences')\n", - " \n", - " axs[1, 0].bar(range(len(ks_statistic)), ks_statistic)\n", - " axs[1, 0].set_title('KS Statistic')\n", - " \n", - " axs[1, 1].bar(range(len(js_divergence)), js_divergence)\n", - " axs[1, 1].set_title('Jensen-Shannon Divergence')\n", - " \n", - " plt.show()\n", - "\n", - "plot_results(mean_diff, var_diff, ks_statistic, js_divergence)" - ] - } - ], - "metadata": { - "language_info": { - "name": "python" - } - }, - "nbformat": 4, - "nbformat_minor": 2 -} diff --git a/katabatic/importer.py b/katabatic/importer.py deleted file mode 100644 index 87e82ac..0000000 --- a/katabatic/importer.py +++ /dev/null @@ -1,89 +0,0 @@ -# Usage -# ----- -# from aiko_services.utilities import * -# module_descriptor = "pathname/filename.py" # or "package.module" -# module = load_module(module_descriptor) -# module.some_class() -# module.some_function() -# -# To Do -# ----- -# - None, yet ! - -import importlib -import os -import sys - -__all__ = ["load_module", "load_modules"] - - -# If the environment variable 'AIKO_IMPORTER_USE_CURRENT_DIRECTORY' is set, add -# the current working directory to the system path to allow module imports from it. -if os.environ.get("AIKO_IMPORTER_USE_CURRENT_DIRECTORY"): - sys.path.append(os.getcwd()) - -# Cache for storing loaded modules to prevent reloading the same module multiple times. -MODULES_LOADED = {} - - -def load_module(module_descriptor): - """ - Load a Python module from a given descriptor. - - Arguments: - module_descriptor (str): The path to the module (e.g., "directory/file.py") - or the module name (e.g., "package.module"). - - Returns: - module: The loaded module object. 
- - Example: - module = load_module("example_module.py") - module.some_function() - - If the module has already been loaded, it is returned from the cache. Otherwise, - it is loaded from the file system or from installed packages. - """ - if module_descriptor in MODULES_LOADED: - module = MODULES_LOADED[module_descriptor] - else: - if module_descriptor.endswith(".py"): - # Load module from Python source pathname, e.g "directory/file.py" - module_pathname = module_descriptor - module = importlib.machinery.SourceFileLoader( - "module", module_pathname - ).load_module() - else: - print(module_descriptor) - # Load module from "installed" modules, e.g "package.module" - module_name = module_descriptor - module = importlib.import_module(module_name) - print(module) - MODULES_LOADED[module_descriptor] = module - return module - - -def load_modules(module_pathnames): - """ - Load multiple Python modules from a list of descriptors. - - Arguments: - module_pathnames (list of str): A list of paths or module names to load. - - Returns: - list: A list of loaded module objects. - - Example: - modules = load_modules(["module1.py", "module2.py"]) - modules[0].some_function() - - Each module in the list is loaded and appended to the return list. - If a module descriptor is empty, None is appended in its place. - """ - modules = [] - for module_pathname in module_pathnames: - if module_pathname: - modules.append(load_module(module_pathname)) - else: - modules.append(None) - return modules diff --git a/katabatic/katabatic.py b/katabatic/katabatic.py deleted file mode 100644 index 822c7a3..0000000 --- a/katabatic/katabatic.py +++ /dev/null @@ -1,371 +0,0 @@ -""" -This module declaratively loads the configured module and instantiates the Tabular Data Generative Model (TDGM) class. - -The `Katabatic` class provides methods for running models, evaluating metrics, preprocessing data, and evaluating models. 
-""" -# This file declaratively loads the configured module and instantiates the Tabular Data Generative Model (TDGM) class. -import os -import sys -import json -import pandas as pd -from multiprocessing import Process -from .katabatic_spi import * # Katabatic Model SPI -from .importer import * # Aiko Services module loader - -# from sklearn.datasets import load_breast_cancer -from sklearn.model_selection import train_test_split - -# import warnings -# warnings.filterwarnings("ignore") -import logging -from warnings import simplefilter - -simplefilter(action="ignore", category=FutureWarning) -logging.basicConfig(filename="running.log", level=logging.INFO) - -CONFIG_FILE = os.path.abspath( - "katabatic/katabatic_config.json" -) # Constant to retrieve config file -METRICS_FILE = os.path.abspath( - "metrics/metrics.json" -) # Constant to retrieve metrics function table - - -class Katabatic: - - """ - Class to handle model running, metric evaluation, and data preprocessing in the Katabatic framework. - - Methods: - run_model(model_name): - Loads and instantiates the model specified by `model_name`. - - run_metric(metric_name): - Loads and returns the metric module specified by `metric_name`. - - evaluate_data(synthetic_data, real_data, data_type, dict_of_metrics): - Evaluates the synthetic data against real data using specified metrics. - - evaluate_models(real_data, dict_of_models, dict_of_metrics): - Evaluates a set of models using specified metrics. - - preprocessing(data): - Prepares the data by handling null values and splitting it into training and testing sets. - """ - - def run_model(model_name): - - """ - Loads and instantiates the model specified by `model_name`. - - Args: - model_name (str): The name of the model to load as specified in the configuration file. - - Returns: - KatabaticModelSPI: An instance of the specified model class. - - Raises: - SystemExit: If the model configuration is missing or the module cannot be loaded. 
- """ - print(f"--------------------------") - print(f"module name: {__name__}") - print(f"parent process: {os.getppid()}") - print(f"process id: {os.getpid()}") - - with open(CONFIG_FILE, "r") as file: # Open config file as read only - config = json.load( - file - ) # config is a dict of dicts, each containing config variables for a model. - - if not model_name in config: - raise SystemExit( - f"Configuration '{CONFIG_FILE}' doesn't have model: {model_name}" - ) - config = config[ - model_name - ] # update config to just one dict, containing config variables of a single model. - - try: - module_name = config["tdgm_module_name"] - class_name = config["tdgm_class_name"] - except KeyError as key_error: - raise SystemExit( # SystemExit exception prints the below and immediately exits the interpreter - f"Configuration file '{CONFIG_FILE}' does not contain: {key_error}" - ) - - diagnostic = None # initialise an empty diagnostic variable - try: - # breakpoint() - print(module_name) - module = load_module(module_name) # load_module method from Aiko services - model_class = getattr(module, class_name) - except FileNotFoundError: - diagnostic = "could not be found." - except Exception as exception: - diagnostic = f"could not be loaded: {exception}" - if diagnostic: - raise SystemExit(f"Module {module_name} {diagnostic}") - - model = model_class() # Create an instance of the model class - if not isinstance(model, KatabaticModelSPI): - raise SystemExit(f"{class_name} doesn't implement KatabaticModelSPI.") - - return model - # # TODO: move the next code block outside the load_model function - # model.load_model() - # model.fit(X_train, y_train) - # synthetic_data = pd.DataFrame(model.generate()) - # return synthetic_data - - # Accepts metric_name:str. Returns an instance of the selected metric. - # TODO: possibly update METRICS_FILE to a dict of dicts (to include type etc.. 
of each metric) - def run_metric(metric_name): - - """ - Loads and returns the metric module specified by `metric_name`. - - Args: - metric_name (str): The name of the metric to load as specified in the metrics file. - - Returns: - module: The loaded metric module. - - Raises: - SystemExit: If the metric configuration is missing or the module cannot be loaded. - """ - - with open(METRICS_FILE, "r") as file: - metrics = json.load(file) - - if not metric_name in metrics: - raise SystemExit( - f"Metrics Function Table '{METRICS_FILE}' doesn't contain metric: {metric_name}" - ) - metric = metrics[metric_name] - - diagnostic = None # initialise an empty diagnostic variable - try: - module = load_module(metric) # load_module method from Aiko services - except FileNotFoundError: - diagnostic = "could not be found." - except Exception as exception: - diagnostic = f"could not be loaded: {exception}" - if diagnostic: - raise SystemExit(f"Metric {metric_name} {diagnostic}") - # Run Metric - # result = metric_name.evaluate() - # return result - return module - - # evaluate_data assumes the last column to be y and all others to be X - # evaluate_data evaluates the synthetic data based on the training dataset for a variety of metrics. - # TODO: Update third parameter to something like discrete_or_continuous or discrete (True or False) - def evaluate_data( - synthetic_data, real_data, data_type, dict_of_metrics - ): - """ - Evaluates the synthetic data against real data using specified metrics. - - Args: - synthetic_data (pd.DataFrame): The synthetic data to evaluate. - real_data (pd.DataFrame): The real data to compare against. - data_type (str): Type of data ('discrete' or 'continuous'). - dict_of_metrics (dict): Dictionary of metrics to use for evaluation. - - Returns: - pd.DataFrame: A DataFrame with evaluation results for each metric. - - Raises: - SystemExit: If the synthetic data and real data are not compatible. 
- """ - - # data_type s/be either 'discrete' or 'continuous' - # Convert column headers to integers? - # Remove column headers? - # print("Type of real_data: ", type(real_data)) - # print("Type of synthetic_data: ", type(synthetic_data)) - - # Check if synthetic_data and real_data are uniform in type, shape and columns - if not type(synthetic_data) == type(real_data): - raise SystemExit( - "WARNING: Input types do not match: synthetic_data type: ", - type(synthetic_data), - "real_data type: ", - type(real_data), - ) - if not synthetic_data.shape == real_data.shape: - raise SystemExit( - "WARNING: Input shapes do not match: synthetic_data shape: ", - synthetic_data.shape, - "real_data shape: ", - real_data.shape, - ) - if not synthetic_data.columns.all() == real_data.columns.all(): - raise SystemExit( - "WARNING: Input column headers do not match: synthetic_data headers: ", - synthetic_data.columns, - "real_data headers: ", - real_data.columns, - ) - - # Reset Column Headers for both datasets - synthetic_data.columns = range(synthetic_data.shape[1]) - real_data.columns = range(real_data.shape[1]) - - # Split X and y, assume y is the last column. - X_synthetic, y_synthetic = ( - synthetic_data.iloc[:, :-1], - synthetic_data.iloc[:, -1:], - ) - X_real, y_real = real_data.iloc[:, :-1], real_data.iloc[:, -1:] - - results_df = pd.DataFrame({"Metric": [], "Value": []}) - # By default use TSTR with Logistic Regression for discrete models - for key in dict_of_metrics: - metric_module = Katabatic.run_metric(key) - result = metric_module.evaluate( - X_synthetic, y_synthetic, X_real, y_real - ) # TODO: update parameters of the evaluate function so they work for every metric. 
- new_row = pd.DataFrame({"Metric": [key], "Value": [result]}) - results_df = pd.concat([results_df, new_row], ignore_index=True) - # function = METRICS_FILE.key.value - - return results_df - - # TODO: return a dynamic comparison report of - def evaluate_models(real_data, dict_of_models, dict_of_metrics): - - """ - Evaluates a set of models using specified metrics. - - Args: - real_data (pd.DataFrame): The real data to use for evaluation. - dict_of_models (list): List of model names to evaluate. - dict_of_metrics (dict): Dictionary of metrics to use for evaluation. - - Returns: - pd.DataFrame: A DataFrame with evaluation results for each model. - """ - - results_df = pd.DataFrame() - for i in range(len(dict_of_models)): - model_name = dict_of_models[i] - - # run_model - return - - # Returns X_train, X_test, y_train, y_test in that order - def preprocessing(data): - - """ - Prepares the data by handling null values and splitting it into training and testing sets. - - Args: - data (pd.DataFrame): The data to preprocess. - - Returns: - tuple: X_train, X_test, y_train, y_test - """ - - # Reset Column Headers - data.columns = range(data.shape[1]) - - # Handle null values - for col in range(len(data.columns)): # Check which features contain null values - if data[col].isnull().any(): - print("Column: ", col, " contains Null values.") - - # data=data.copy().fillna(data['f21'].median()) #copy the dataframe into a new dataframe, and fill missing values. 
- # Validate Data - - # X, Y Split - X = data.iloc[:, :-1] # all but last column - y = data.iloc[:, -1] # last column - # print("Shape of X: ", X.shape, "Shape of y: ", y.shape) - - # Train, Test Split - X_train, X_test, y_train, y_test = train_test_split( - X, y, test_size=0.5, random_state=42 - ) - print("Shape of X_train: ", X_train.shape, "Shape of y_train: ", y_train.shape) - print("Shape of X_test: ", X_test.shape, "Shape of y_test: ", y_test.shape) - print("Type of X_test: ", type(X_test), "Type of y_test: ", type(y_test)) - - return X_train, X_test, y_train, y_test - - -if __name__ == "__main__": - """ - Main entry point for the Katabatic script. - - Usage: - python katabatic.py MODEL_NAME ... - - Arguments: - MODEL_NAME (str): The name of the model to run. - - Example: - python katabatic.py ganblr - - Loads the specified model, preprocesses demo data, fits the model, generates synthetic data, - evaluates the synthetic data, and prints the results. - """ - print(f"[Welcome to Katabatic version 0.1]") - print(f"module name: {__name__}") - print(f"parent process: {os.getppid()}") - print(f"process id: {os.getpid()}") - import warnings - - warnings.filterwarnings("ignore") - - if len(sys.argv) < 2: - raise SystemExit("Usage: katabatic.py MODEL_NAME ...") - arguments = sys.argv[1:] - - for index in range(len(arguments)): - - model_name = arguments[index] # Accept the argument as model_name - model = Katabatic.run_model( - model_name - ) # Create an instance of the specified model - - # Get demo data from GANBLR package - from pandas import read_csv - - ganblr_demo_data = read_csv( - "https://raw.githubusercontent.com/chriszhangpodo/discretizedata/main/adult-dm.csv", - dtype=int, - ) - - X_train, X_test, y_train, y_test = Katabatic.preprocessing(ganblr_demo_data) - - # demo_data = pd.read_csv('cities_demo.csv') # Retrieve some demo data - # X_train, y_train = demo_data[["Temperature","Latitude","Longitude"]], demo_data["Category"] # Split X from y - - # TODO: 
Add a module for generating demo data. - # X_train, y_train = load_breast_cancer(return_X_y=True, as_frame=True) - # demo_data = load_breast_cancer() - # model.load_data(demo_data) - - model.load_model() - print("1 Shape of y_train: ", y_train.shape) - model.fit( - X_train, y_train - ) # Fit the model to the data # Louka q: Do I really need to pass y_train ? - print("---- SHOW REAL DATA ----") - real_data = pd.concat([X_train, y_train], axis=1) - print(real_data.head()) - - print("--- GENERATE SYNTHETIC DATA ---") - synthetic_data = pd.DataFrame(model.generate()) # Generate synthetic data - synthetic_data.to_csv("output.csv") # Save output to csv - - print(synthetic_data.head()) # Show a sample of the synthetic data output - - print("--- EVALUATE SYNTHETIC DATA ---") # Evaluate the Synthetic Data - real_data = pd.concat([X_test, y_test], axis=1) - data_eval_result = Katabatic.evaluate_data( - synthetic_data, real_data, "discrete", {"tstr_logreg"} - ) # Evaluate the synthetic data and show the result - - print(data_eval_result) diff --git a/katabatic/katabatic_config_Muhammad.json b/katabatic/katabatic_config_Muhammad.json deleted file mode 100644 index 3c966d1..0000000 --- a/katabatic/katabatic_config_Muhammad.json +++ /dev/null @@ -1,28 +0,0 @@ -{ - "#": "Katabatic Model Configuration File", - - "mock_model": { - "tdgm_module_name": "katabatic.models.model_template.mock_adapter", - "tdgm_class_name": "MockAdapter" - }, - "ganblr": { - "tdgm_module_name": "katabatic.models.ganblr.ganblr_adapter", - "tdgm_class_name": "GanblrAdapter" - }, - "ctgan": { - "tdgm_module_name": "katabatic.models.ctgan.ctgan_adapter", - "tdgm_class_name": "CtganAdapter" - }, - "tvae": { - "tdgm_module_name": "katabatic.models.tvae.tvae_adapter", - "tdgm_class_name": "TvaeAdapter" - }, - "ganblrpp": { - "tdgm_module_name": "katabatic.models.ganblrpp.ganblrpp_adapter", - "tdgm_class_name": "GanblrppAdapter" - }, - "medgan": { - "tdgm_module_name": "katabatic.models.medgan.medgan_adapter", - 
"tdgm_class_name": "MedganAdapter" - } -} diff --git a/katabatic/katabatic_spi.py b/katabatic/katabatic_spi.py deleted file mode 100644 index a0a27f0..0000000 --- a/katabatic/katabatic_spi.py +++ /dev/null @@ -1,99 +0,0 @@ -# The Model Service Provider Interface (SPI) provides an abstract base class (ABC) for all model adapters to implement. - -from abc import ABC, abstractmethod - -class KatabaticModelSPI(ABC): - """ - Abstract base class for all model adapters in Katabatic. - - This class defines the Model Service Provider Interface (SPI) which includes methods - for loading models, loading data, fitting models to data, and generating synthetic data. - - Attributes: - type (str): The type of model, either 'discrete' or 'continuous'. - constraints (Any): Constraints for the model. - batch_size (int): The batch size used for training. - epochs (int): The number of epochs for training. - """ - @abstractmethod - def __init__(self, type): - """ - Initialize the model with the given type. - - Args: - type (str): The type of model ('discrete' or 'continuous'). - - Raises: - ValueError: If type is not 'discrete' or 'continuous'. - """ - self.type = None # Should be either 'discrete' or 'continuous' - self.constraints = None - self.batch_size = None - self.epochs = None - - @abstractmethod - def load_model(self): #Load the model - """ - Load the model. - - This method should be implemented to load a pre-trained model. - """ - pass - - @abstractmethod - def load_data(self): #Load data - """ - Load the data. - - This method should be implemented to load the dataset for training or evaluation. - """ - pass - - @abstractmethod - def fit(self): #Fit model to data - """ - Fit the model to the data. - - This method should be implemented to train the model using the loaded data. - """ - pass - - @abstractmethod - def generate(self): #Generate synthetic data - """ - Generate synthetic data. - - This method should be implemented to produce synthetic data using the trained model. 
- """ - pass - - -#For the Katabatic Metric SPI, the input must be data/a model and the output must be a result. -# Each Evaluation method must be applicable to tabular data. -class KatabaticMetricSPI(ABC): - """ - Abstract base class for metrics in Katabatic. - - This class defines the Metric Service Provider Interface (SPI) which includes a method - for evaluating synthetic data against real data. - - Methods: - evaluate(real_data, synthetic_data): Evaluate the synthetic data against the real data. - """ - @abstractmethod - def evaluate(real_data, synthetic_data): # Evaluate the synthetic data against the real data - """ - Evaluate the synthetic data against the real data. - - Args: - real_data (Any): The real data for comparison. - synthetic_data (Any): The synthetic data generated by the model. - - Returns: - Any: The result of the evaluation. - - """ - print("Comparing real data to synthetic data.") - - - \ No newline at end of file diff --git a/katabatic/metrics/metrics.json b/katabatic/metrics/metrics.json deleted file mode 100644 index f04fdd7..0000000 --- a/katabatic/metrics/metrics.json +++ /dev/null @@ -1,9 +0,0 @@ -{ - "trtr_logreg": "katabatic/metrics/trtr_logreg.py", - "tstr_logreg": "katabatic/metrics/tstr_logreg.py", - "tstr_rf": "katabatic/metrics/tstr_rf.py", - "tstr_mlp": "katabatic/metrics/tstr_mlp.py", - "model_compatability": "katabatic/metrics/model_compatability.py", - "q_score": "katabatic/metrics/q_score.py", - "similarity:": "katabatic/metrics/similarity.py" -} \ No newline at end of file diff --git a/katabatic/metrics/q_score.py b/katabatic/metrics/q_score.py deleted file mode 100644 index f071419..0000000 --- a/katabatic/metrics/q_score.py +++ /dev/null @@ -1,131 +0,0 @@ -from sklearn.metrics import accuracy_score, roc_auc_score -from sklearn.model_selection import train_test_split -from sklearn.ensemble import RandomForestClassifier -from sklearn.preprocessing import OneHotEncoder, LabelEncoder -from sklearn.compose import 
ColumnTransformer -from sklearn.pipeline import Pipeline -import numpy as np -import pandas as pd - - -def evaluate(X_synthetic, y_synthetic, X_real, y_real, use_labels=False): - """ - Evaluate the quality of synthetic data using a Q-score approach. - - This method evaluates the quality of synthetic data by assessing how distinguishable - the synthetic data is from the real data. If `use_labels` is True, the original labels - (`y_real` and `y_synthetic`) are included as additional features to improve the accuracy - of this distinction. If `use_labels` is False, only the features are used. - - Parameters: - - X_synthetic: Features of the synthetic data - - y_synthetic: Labels of the synthetic data - - X_real: Features of the real data - - y_real: Labels of the real data - - use_labels: Boolean flag to determine whether to use labels in evaluation. - - Returns: - - q_score: AUC score indicating how well the model can distinguish between real and synthetic data. - """ - - # Convert to numpy arrays if they are in DataFrame format - X_synthetic = ( - X_synthetic.to_numpy() if isinstance(X_synthetic, pd.DataFrame) else X_synthetic - ) - y_synthetic = ( - y_synthetic.to_numpy().ravel() - if isinstance(y_synthetic, pd.DataFrame) - else y_synthetic - ) - X_real = X_real.to_numpy() if isinstance(X_real, pd.DataFrame) else X_real - y_real = y_real.to_numpy().ravel() if isinstance(y_real, pd.DataFrame) else y_real - - # Identify categorical features and numeric features - categorical_features = np.where(X_synthetic.dtype == "O")[ - 0 - ] # Assuming all object columns are categorical - numeric_features = np.where(X_synthetic.dtype != "O")[0] - - # Define the preprocessing pipeline for the features - preprocessor = ColumnTransformer( - transformers=[ - ("num", "passthrough", numeric_features), - ("cat", OneHotEncoder(handle_unknown="ignore"), categorical_features), - ] - ) - - if use_labels: - # Combine the features and labels of the real and synthetic data - X_combined = 
np.vstack([X_real, X_synthetic]) - y_combined = np.hstack([y_real, y_synthetic]) - - # Create labels for distinguishing between real and synthetic data: 1 for real, 0 for synthetic - real_vs_synthetic_labels = np.hstack( - [np.ones(X_real.shape[0]), np.zeros(X_synthetic.shape[0])] - ) - - # Combine the real_vs_synthetic labels with the original labels as additional features - X_combined_with_labels = np.column_stack((X_combined, y_combined)) - - # Create a pipeline that applies the preprocessor and then splits the data - X_combined_with_labels = preprocessor.fit_transform(X_combined_with_labels) - - # Split into train and test sets - X_train, X_test, y_train, y_test = train_test_split( - X_combined_with_labels, - real_vs_synthetic_labels, - test_size=0.33, - random_state=42, - ) - else: - # Use only features for the evaluation - X_combined = np.vstack([X_real, X_synthetic]) - - # Create labels for distinguishing between real and synthetic data: 1 for real, 0 for synthetic - real_vs_synthetic_labels = np.hstack( - [np.ones(X_real.shape[0]), np.zeros(X_synthetic.shape[0])] - ) - - # Preprocess the combined data - X_combined = preprocessor.fit_transform(X_combined) - - # Split into train and test sets - X_train, X_test, y_train, y_test = train_test_split( - X_combined, real_vs_synthetic_labels, test_size=0.33, random_state=42 - ) - - # Train a RandomForest model to distinguish between real and synthetic data - model = RandomForestClassifier(n_estimators=100, random_state=42) - model.fit(X_train, y_train) - y_pred_proba = model.predict_proba(X_test)[:, 1] # Probability of being real - - # Calculate AUC as Q-score - q_score = roc_auc_score(y_test, y_pred_proba) - - return q_score - - -""" -**Pros:** - - **Comprehensive Evaluation (use_labels=True):** By including both features and labels, - the method provides a more complete assessment of how well the synthetic data mimics - the real data. 
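The Q-score is simply the ROC AUC of the real-vs-synthetic classifier, so a value near 0.5 means the two datasets are hard to tell apart and a value near 1.0 means they are easily distinguished. For intuition, a dependency-free sketch of that AUC computation in its rank (Mann-Whitney) form, assuming no tied scores:

```python
import numpy as np

def auc_from_scores(y_true, scores):
    # ROC AUC via the Mann-Whitney U statistic; assumes no tied scores.
    order = np.argsort(scores)
    ranks = np.empty(len(scores))
    ranks[order] = np.arange(1, len(scores) + 1)
    pos = np.asarray(y_true) == 1
    n_pos, n_neg = pos.sum(), (~pos).sum()
    # Sum of positive-class ranks, shifted and normalised into [0, 1].
    return (ranks[pos].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)
```

Perfectly separable scores give 1.0; scores uncorrelated with the real/synthetic label give roughly 0.5, which for the Q-score indicates synthetic data that the RandomForest cannot distinguish from real data.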
- - **Simple Approach (use_labels=False):** For cases where the labels might not be important, - this simpler method can still provide valuable insights. - - **Quantitative Metric (Q-score):** The use of AUC as the Q-score provides a quantitative - measure of similarity, where a lower score (closer to 0.5) indicates higher similarity. - - **Cons:** - - **Label Dependency (use_labels=True):** The effectiveness of the comprehensive method - is dependent on the quality and relevance of the labels. If labels are noisy or poorly defined, - this can negatively impact the Q-score. - - **Model Complexity (use_labels=True):** Adding labels as features increases model complexity, - requiring more computational resources and potentially leading to overfitting. - - **When to Use This Approach:** - - **use_labels=True:** Use this approach when generating synthetic data for classification - or regression tasks where the labels are significant, and preserving the relationship - between features and labels is crucial. - - **use_labels=False:** Use this simpler approach when you are primarily concerned with - feature distributions and not the relationship between features and labels. -""" diff --git a/katabatic/metrics/similarity.py b/katabatic/metrics/similarity.py deleted file mode 100644 index d4edde0..0000000 --- a/katabatic/metrics/similarity.py +++ /dev/null @@ -1,106 +0,0 @@ -from sklearn.preprocessing import LabelEncoder, OneHotEncoder -from sklearn.compose import ColumnTransformer -from sklearn.metrics import pairwise_distances, jaccard_score -from scipy.stats import ks_2samp, wasserstein_distance -import numpy as np -import pandas as pd - - -def evaluate(X_synthetic, X_real, similarity_metric="ks_test"): - """ - Evaluate the similarity between real and synthetic data using a specified metric. - - This function compares the real and synthetic data using the specified similarity metric. - - Parameters: - - X_synthetic: Features of the synthetic data. 
- - X_real: Features of the real data. - - similarity_metric: The similarity metric to apply. Supported options are: - - 'ks_test': Kolmogorov-Smirnov test for continuous distributions. - - 'wasserstein': Wasserstein distance for continuous distributions. - - 'jaccard': Jaccard similarity for binary or categorical data. - - 'euclidean': Euclidean distance between datasets. - - 'cosine': Cosine similarity between datasets. - - Returns: - - similarity_score: The computed similarity score based on the specified metric. - """ - - # Data Validation - if X_synthetic.shape[1] != X_real.shape[1]: - raise ValueError( - "The number of features in X_synthetic and X_real must be the same." - ) - - # Convert DataFrames to NumPy arrays if necessary - X_synthetic = ( - X_synthetic.to_numpy() if hasattr(X_synthetic, "to_numpy") else X_synthetic - ) - X_real = X_real.to_numpy() if hasattr(X_real, "to_numpy") else X_real - - # Define the preprocessing pipeline for the features - categorical_features = np.where(X_synthetic.dtype == "O")[ - 0 - ] # Assuming all object columns are categorical - numeric_features = np.where(X_synthetic.dtype != "O")[0] - - preprocessor = ColumnTransformer( - transformers=[ - ("num", "passthrough", numeric_features), - ("cat", OneHotEncoder(handle_unknown="ignore"), categorical_features), - ] - ) - - # Apply preprocessing to both real and synthetic data - X_synthetic_processed = preprocessor.fit_transform(X_synthetic) - X_real_processed = preprocessor.transform(X_real) - - # Compute similarity based on the specified metric - if similarity_metric == "ks_test": - similarity_scores = [ - ks_2samp(X_real_processed[:, i], X_synthetic_processed[:, i])[0] - for i in range(X_real_processed.shape[1]) - ] - similarity_score = np.mean(similarity_scores) - - elif similarity_metric == "wasserstein": - similarity_scores = [ - wasserstein_distance(X_real_processed[:, i], X_synthetic_processed[:, i]) - for i in range(X_real_processed.shape[1]) - ] - similarity_score = 
np.mean(similarity_scores) - - elif similarity_metric == "jaccard": - jaccard_scores = [] - for i in range(X_real_processed.shape[1]): - if ( - len(np.unique(X_real_processed[:, i])) <= 20 - ): # Assume categorical data if low cardinality - jaccard_scores.append( - jaccard_score( - X_real_processed[:, i], - X_synthetic_processed[:, i], - average="macro", - ) - ) - similarity_score = np.mean(jaccard_scores) - - elif similarity_metric == "euclidean": - similarity_score = np.mean( - pairwise_distances( - X_real_processed, X_synthetic_processed, metric="euclidean" - ) - ) - - elif similarity_metric == "cosine": - similarity_score = np.mean( - 1 - - pairwise_distances( - X_real_processed, X_synthetic_processed, metric="cosine" - ) - ) - - else: - raise ValueError(f"Unsupported similarity metric: {similarity_metric}") - - return similarity_score diff --git a/katabatic/metrics/trtr_jsd_DGEK.py b/katabatic/metrics/trtr_jsd_DGEK.py deleted file mode 100644 index 9749971..0000000 --- a/katabatic/metrics/trtr_jsd_DGEK.py +++ /dev/null @@ -1,63 +0,0 @@ -from sklearn.preprocessing import LabelEncoder -from sklearn.model_selection import train_test_split -from sklearn.metrics import accuracy_score -from scipy.spatial.distance import jensenshannon -import numpy as np - -def compute_jsd(X_real, X_synthetic): - """ - Compute Jensen-Shannon Divergence between real and synthetic datasets. 
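For intuition about the divergence used below, the Jensen-Shannon distance between two discrete distributions can be computed directly. This sketch uses base-2 logs so the result lies in [0, 1]; `scipy.spatial.distance.jensenshannon` applies the same square root but defaults to natural logs.

```python
import numpy as np

def js_distance(p, q):
    # Jensen-Shannon distance between two histograms (base-2 logs -> range [0, 1]).
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    p, q = p / p.sum(), q / q.sum()   # normalise to probability distributions
    m = 0.5 * (p + q)                 # mixture distribution

    def kl(a, b):
        # KL divergence, skipping zero-probability bins to avoid log(0).
        mask = a > 0
        return np.sum(a[mask] * np.log2(a[mask] / b[mask]))

    return np.sqrt(0.5 * kl(p, m) + 0.5 * kl(q, m))
```

Identical histograms give 0.0; completely disjoint histograms give the maximum distance of 1.0.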
- - Parameters: - - X_real: np.ndarray, real dataset features - - X_synthetic: np.ndarray, synthetic dataset features - - Returns: - - jsd: float, Jensen-Shannon Divergence value - """ - # Ensure data is in 1D arrays - X_real = np.ravel(X_real) - X_synthetic = np.ravel(X_synthetic) - - # Compute histograms - hist_real, _ = np.histogram(X_real, bins=np.arange(X_real.max() + 2) - 0.5, density=True) - hist_synthetic, _ = np.histogram(X_synthetic, bins=np.arange(X_synthetic.max() + 2) - 0.5, density=True) - - # Handle empty histograms - if np.all(hist_real == 0): - hist_real += 1e-10 - if np.all(hist_synthetic == 0): - hist_synthetic += 1e-10 - - # Compute Jensen-Shannon Divergence - jsd = jensenshannon(hist_real, hist_synthetic) - return jsd - -def evaluate(X_synthetic, y_synthetic, X_real, y_real): - """ - Evaluate the Jensen-Shannon Divergence metric between synthetic and real datasets. - - Parameters: - - X_synthetic: np.ndarray, synthetic dataset features - - y_synthetic: np.ndarray, synthetic dataset labels - - X_real: np.ndarray, real dataset features - - y_real: np.ndarray, real dataset labels - - Returns: - - jsd: float, Jensen-Shannon Divergence value between the real and synthetic datasets - """ - # Convert to numpy arrays - X_synthetic = X_synthetic.to_numpy() - y_synthetic = y_synthetic.to_numpy().ravel() - X_real = X_real.to_numpy() - y_real = y_real.to_numpy().ravel() - - # Combine features and labels - X_real_combined = np.hstack([X_real, y_real.reshape(-1, 1)]) - X_synthetic_combined = np.hstack([X_synthetic, y_synthetic.reshape(-1, 1)]) - - # Compute JSD - jsd_value = compute_jsd(X_real_combined, X_synthetic_combined) - - return jsd_value - diff --git a/katabatic/metrics/trtr_logreg.py b/katabatic/metrics/trtr_logreg.py deleted file mode 100644 index fc1b722..0000000 --- a/katabatic/metrics/trtr_logreg.py +++ /dev/null @@ -1,30 +0,0 @@ -from sklearn.linear_model import LogisticRegression -from sklearn.preprocessing import LabelEncoder -from 
sklearn.model_selection import train_test_split -from sklearn.metrics import accuracy_score -import numpy as np - - -def evaluate(X_synthetic, y_synthetic, X_real, y_real): - - # TODO: error handling, data validation - X_synthetic = X_synthetic.to_numpy() - y_synthetic = y_synthetic.to_numpy().ravel() - X_real = X_real.to_numpy() - y_real = y_real.to_numpy().ravel() - - X_train, X_test, y_train, y_test = train_test_split( - X_real, y_real, test_size=0.33, random_state=42 - ) - - le = LabelEncoder() # Use LabelEncoder to convert string labels to integers - le.fit(np.unique(y_train)) - y_train = le.transform(y_train) - y_test = le.transform(y_test) - - # TRTR Evaluation using Log Reg - model = LogisticRegression(max_iter=200) - model.fit(X_train, y_train) - y_pred = model.predict(X_test) - - return accuracy_score(y_test, y_pred) diff --git a/katabatic/metrics/trtr_logreg_DGEK.py b/katabatic/metrics/trtr_logreg_DGEK.py deleted file mode 100644 index bc64841..0000000 --- a/katabatic/metrics/trtr_logreg_DGEK.py +++ /dev/null @@ -1,29 +0,0 @@ -from sklearn.linear_model import LogisticRegression -from sklearn.preprocessing import LabelEncoder -from sklearn.model_selection import train_test_split -from sklearn.metrics import accuracy_score -import numpy as np - -def evaluate(X_synthetic, y_synthetic, X_real, y_real): - - # The synthetic data is converted but not used: TRTR trains and tests on real data only - X_synthetic = X_synthetic.to_numpy() - y_synthetic = y_synthetic.to_numpy().ravel() - - # The real data is what TRTR actually evaluates - X_real = X_real.to_numpy() - y_real = y_real.to_numpy().ravel() - - X_train, X_test, y_train, y_test = train_test_split(X_real, y_real, test_size=0.33, random_state=42) - - le = LabelEncoder() # Use LabelEncoder to convert string labels to integers - le.fit(np.unique(y_train)) - y_train = le.transform(y_train) - y_test = le.transform(y_test) - - # TRTR Evaluation using Log Reg - model = LogisticRegression(max_iter=200) - model.fit(X_train, y_train) - y_pred = model.predict(X_test) - - return accuracy_score(y_test, y_pred) \ No
newline at end of file diff --git a/katabatic/metrics/trtr_wd_DGEK.py b/katabatic/metrics/trtr_wd_DGEK.py deleted file mode 100644 index 90882d7..0000000 --- a/katabatic/metrics/trtr_wd_DGEK.py +++ /dev/null @@ -1,52 +0,0 @@ -from sklearn.preprocessing import LabelEncoder -from sklearn.model_selection import train_test_split -from sklearn.metrics import accuracy_score -from scipy.stats import wasserstein_distance -import numpy as np - -def compute_wd(X_real, X_synthetic): - """ - Compute Wasserstein Distance between real and synthetic datasets. - - Parameters: - - X_real: np.ndarray, real dataset features - - X_synthetic: np.ndarray, synthetic dataset features - - Returns: - - wd: float, Wasserstein Distance value - """ - # Ensure data is in 1D arrays - X_real = np.ravel(X_real) - X_synthetic = np.ravel(X_synthetic) - - # Compute Wasserstein Distance - wd = wasserstein_distance(X_real, X_synthetic) - return wd - -def evaluate(X_synthetic, y_synthetic, X_real, y_real): - """ - Evaluate the Wasserstein Distance metric between synthetic and real datasets. 
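As a sanity check for `compute_wd`: for equal-sized 1D samples, the 1-Wasserstein distance has a closed form, the mean absolute difference between the sorted samples. A dependency-free sketch of that special case (`scipy.stats.wasserstein_distance` handles the general unequal-size, weighted case):

```python
import numpy as np

def wd_1d_equal(x, y):
    # 1-Wasserstein distance for equal-sized 1D samples:
    # mean absolute difference between the sorted samples.
    x = np.sort(np.asarray(x, dtype=float))
    y = np.sort(np.asarray(y, dtype=float))
    assert len(x) == len(y), "closed form assumes equal sample sizes"
    return float(np.mean(np.abs(x - y)))
```

Shifting a sample by a constant c shifts the distance by exactly |c|, which makes this a convenient unit-test oracle.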
- - Parameters: - - X_synthetic: np.ndarray, synthetic dataset features - - y_synthetic: np.ndarray, synthetic dataset labels - - X_real: np.ndarray, real dataset features - - y_real: np.ndarray, real dataset labels - - Returns: - - wd: float, Wasserstein Distance value between the real and synthetic datasets - """ - # Convert to numpy arrays - X_synthetic = X_synthetic.to_numpy() - y_synthetic = y_synthetic.to_numpy().ravel() - X_real = X_real.to_numpy() - y_real = y_real.to_numpy().ravel() - - # Combine features and labels - X_real_combined = np.hstack([X_real, y_real.reshape(-1, 1)]) - X_synthetic_combined = np.hstack([X_synthetic, y_synthetic.reshape(-1, 1)]) - - # Compute Wasserstein Distance - wd_value = compute_wd(X_real_combined, X_synthetic_combined) - - return wd_value diff --git a/katabatic/metrics/tstr_logreg.py b/katabatic/metrics/tstr_logreg.py deleted file mode 100644 index 5c9def0..0000000 --- a/katabatic/metrics/tstr_logreg.py +++ /dev/null @@ -1,90 +0,0 @@ -from sklearn.preprocessing import LabelEncoder, OneHotEncoder -from sklearn.compose import ColumnTransformer -from sklearn.pipeline import Pipeline -from sklearn.model_selection import train_test_split -from sklearn.metrics import accuracy_score -from sklearn.linear_model import LogisticRegression -import numpy as np - - -def evaluate(X_synthetic, y_synthetic, X_real, y_real): - """ - Evaluate the quality of synthetic data using TSTR (Train on Synthetic, Test on Real) method. - - This function trains a Logistic Regression model on synthetic data and evaluates its performance - on a test set derived from real data. The accuracy score is returned as a metric - for model performance. - - Parameters: - - X_synthetic: Features of the synthetic data - - y_synthetic: Labels of the synthetic data - - X_real: Features of the real data - - y_real: Labels of the real data - - Returns: - - accuracy: Accuracy score of the Logistic Regression model on the real test set. 
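The TSTR protocol itself is independent of the classifier used. As a dependency-free illustration of train-on-synthetic / test-on-real, here is the same protocol with a nearest-centroid classifier standing in for logistic regression (the classifier choice is an illustrative assumption, not how these metrics are implemented):

```python
import numpy as np

def tstr_nearest_centroid(X_synthetic, y_synthetic, X_real, y_real):
    # Train on synthetic data: one centroid per class.
    classes = np.unique(y_synthetic)
    centroids = np.stack(
        [X_synthetic[y_synthetic == c].mean(axis=0) for c in classes]
    )
    # Test on real data: assign each real row to its nearest centroid.
    dists = np.linalg.norm(X_real[:, None, :] - centroids[None, :, :], axis=2)
    y_pred = classes[np.argmin(dists, axis=1)]
    return float(np.mean(y_pred == y_real))  # accuracy on the real data
```

High TSTR accuracy suggests the synthetic data preserves the feature-label relationship well enough to train a usable model.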
- """ - - # Data Validation - if X_synthetic.shape[1] != X_real.shape[1]: - raise ValueError( - "The number of features in X_synthetic and X_real must be the same." - ) - - if len(y_synthetic) != len(X_synthetic): - raise ValueError( - "X_synthetic and y_synthetic must have the same number of samples." - ) - - if len(y_real) != len(X_real): - raise ValueError("X_real and y_real must have the same number of samples.") - - # Convert DataFrames to NumPy arrays if necessary - X_synthetic = ( - X_synthetic.to_numpy() if hasattr(X_synthetic, "to_numpy") else X_synthetic - ) - y_synthetic = ( - y_synthetic.to_numpy().ravel() - if hasattr(y_synthetic, "to_numpy") - else y_synthetic - ) - X_real = X_real.to_numpy() if hasattr(X_real, "to_numpy") else X_real - y_real = y_real.to_numpy().ravel() if hasattr(y_real, "to_numpy") else y_real - - # Define the preprocessing pipeline for the features - categorical_features = np.where(X_synthetic.dtype == "O")[ - 0 - ] # Assuming all object columns are categorical - numeric_features = np.where(X_synthetic.dtype != "O")[0] - - preprocessor = ColumnTransformer( - transformers=[ - ("num", "passthrough", numeric_features), - ("cat", OneHotEncoder(handle_unknown="ignore"), categorical_features), - ] - ) - - # Split the real data into training and testing sets - X_train_real, X_test_real, y_train_real, y_test_real = train_test_split( - X_real, y_real, test_size=0.33, random_state=42 - ) - - # Create a pipeline that applies the preprocessor and then trains the model - model = Pipeline( - steps=[ - ("preprocessor", preprocessor), - ("classifier", LogisticRegression(max_iter=200, random_state=42)), - ] - ) - - # # Encode labels (necessary if labels are not numerical) - le = LabelEncoder() - y_synthetic = le.fit_transform(y_synthetic) - y_test_real = le.transform(y_test_real) - - # TSTR Evaluation using Logistic Regression - model.fit(X_synthetic, y_synthetic) - y_pred = model.predict(X_test_real) - - # Return the accuracy score - return 
accuracy_score(y_test_real, y_pred) diff --git a/katabatic/metrics/tstr_logreg_DGEK.py b/katabatic/metrics/tstr_logreg_DGEK.py deleted file mode 100644 index 056f3dd..0000000 --- a/katabatic/metrics/tstr_logreg_DGEK.py +++ /dev/null @@ -1,32 +0,0 @@ -import numpy as np -import pandas as pd -from sklearn.linear_model import LogisticRegression -from sklearn.metrics import accuracy_score - -def evaluate(X_synthetic, y_synthetic, X_real, y_real): - # Ensure inputs are dataframes - if not all(isinstance(df, pd.DataFrame) for df in [X_synthetic, X_real]): - raise ValueError("X_synthetic and X_real must be dataframes.") - if not all(isinstance(s, pd.Series) for s in [y_synthetic, y_real]): - raise ValueError("y_synthetic and y_real must be series.") - - # Convert from pandas objects into numpy arrays - X_synthetic = X_synthetic.to_numpy() - y_synthetic = y_synthetic.to_numpy().ravel() - X_real = X_real.to_numpy() - y_real = y_real.to_numpy().ravel() - - # Ensure y_synthetic and y_real are integers - y_train = y_synthetic.astype('int') - y_test = y_real.astype('int') - - # Train Logistic Regression model - model = LogisticRegression(max_iter=200) - model.fit(X_synthetic, y_train) - y_pred = model.predict(X_real) - - # Calculate accuracy score - accuracy = accuracy_score(y_test, y_pred) - - # Return accuracy score rounded to 4 decimal places - return round(accuracy, 4) diff --git a/katabatic/metrics/tstr_mlp.py b/katabatic/metrics/tstr_mlp.py deleted file mode 100644 index 5ca540a..0000000 --- a/katabatic/metrics/tstr_mlp.py +++ /dev/null @@ -1,86 +0,0 @@ -from sklearn.neural_network import MLPClassifier -from sklearn.preprocessing import LabelEncoder, OneHotEncoder -from sklearn.compose import ColumnTransformer -from sklearn.pipeline import Pipeline -from sklearn.metrics import accuracy_score -import numpy as np - - -def evaluate(X_synthetic, y_synthetic, X_real, y_real): - """ - Evaluate the quality of synthetic data using TSTR (Train on Synthetic, Test on Real) method. 
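A detail worth noting in these TSTR metrics is that the `LabelEncoder` must be fit on the labels of both datasets combined: fitting separate encoders could map the same class name to different integers, silently corrupting the accuracy. A minimal sketch of the idea, with a hypothetical helper name:

```python
import numpy as np

def fit_joint_label_map(y_synthetic, y_real):
    # Build one label-to-integer mapping over the union of both label sets,
    # mirroring le.fit(np.concatenate([y_synthetic, y_real])) in the metrics.
    # Encoding each array separately could assign the same integer to
    # different class names.
    classes = np.unique(np.concatenate([y_synthetic, y_real]))
    return {c: i for i, c in enumerate(classes)}
```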
- - This function trains an MLPClassifier model on synthetic data and evaluates its performance - on real data. The accuracy score is returned as a metric for model performance. - - Parameters: - - X_synthetic: Features of the synthetic data - - y_synthetic: Labels of the synthetic data - - X_real: Features of the real data - - y_real: Labels of the real data - - Returns: - - accuracy: Accuracy score of the MLPClassifier model on the real data. - """ - - # Data Validation - if X_synthetic.shape[1] != X_real.shape[1]: - raise ValueError( - "The number of features in X_synthetic and X_real must be the same." - ) - - if len(y_synthetic) != len(X_synthetic): - raise ValueError( - "X_synthetic and y_synthetic must have the same number of samples." - ) - - if len(y_real) != len(X_real): - raise ValueError("X_real and y_real must have the same number of samples.") - - # Convert DataFrames to NumPy arrays if necessary - X_synthetic = ( - X_synthetic.to_numpy() if hasattr(X_synthetic, "to_numpy") else X_synthetic - ) - y_synthetic = ( - np.squeeze(y_synthetic) if hasattr(y_synthetic, "to_numpy") else y_synthetic - ) - X_real = X_real.to_numpy() if hasattr(X_real, "to_numpy") else X_real - y_real = np.squeeze(y_real) if hasattr(y_real, "to_numpy") else y_real - - # Identify categorical features and numeric features - categorical_features = np.where(X_synthetic.dtype == "O")[ - 0 - ] # Assuming all object columns are categorical - numeric_features = np.where(X_synthetic.dtype != "O")[0] - - # Define the preprocessing pipeline for the features - preprocessor = ColumnTransformer( - transformers=[ - ("num", "passthrough", numeric_features), - ("cat", OneHotEncoder(handle_unknown="ignore"), categorical_features), - ] - ) - - # Combine y_synthetic and y_real to ensure consistent label encoding - y_combined = np.concatenate([y_synthetic, y_real]) - - # Encode labels - le = LabelEncoder() - le.fit(y_combined) # Ensure all labels from both datasets are accounted for - y_synthetic = 
le.transform(y_synthetic) - y_real = le.transform(y_real) - - # Create a pipeline that applies the preprocessor and then trains the model - model = Pipeline( - steps=[ - ("preprocessor", preprocessor), - ("classifier", MLPClassifier(max_iter=1000, random_state=42)), - ] - ) - - # TSTR Evaluation using MLPClassifier - model.fit(X_synthetic, y_synthetic) - y_pred = model.predict(X_real) - - # Return the accuracy score - return accuracy_score(y_real, y_pred) diff --git a/katabatic/metrics/tstr_rf.py b/katabatic/metrics/tstr_rf.py deleted file mode 100644 index 426cf19..0000000 --- a/katabatic/metrics/tstr_rf.py +++ /dev/null @@ -1,88 +0,0 @@ -from sklearn.ensemble import RandomForestClassifier -from sklearn.preprocessing import LabelEncoder, OneHotEncoder -from sklearn.compose import ColumnTransformer -from sklearn.pipeline import Pipeline -from sklearn.metrics import accuracy_score -import numpy as np - - -def evaluate(X_synthetic, y_synthetic, X_real, y_real): - """ - Evaluate the quality of synthetic data using TSTR (Train on Synthetic, Test on Real) method. - - This function trains a RandomForestClassifier model on synthetic data and evaluates its - performance on real data. The accuracy score is returned as a metric for model performance. - - Parameters: - - X_synthetic: Features of the synthetic data - - y_synthetic: Labels of the synthetic data - - X_real: Features of the real data - - y_real: Labels of the real data - - Returns: - - accuracy: Accuracy score of the RandomForestClassifier model on the real data. - """ - - # Data Validation - if X_synthetic.shape[1] != X_real.shape[1]: - raise ValueError( - "The number of features in X_synthetic and X_real must be the same." - ) - - if len(y_synthetic) != len(X_synthetic): - raise ValueError( - "X_synthetic and y_synthetic must have the same number of samples." 
- ) - - if len(y_real) != len(X_real): - raise ValueError("X_real and y_real must have the same number of samples.") - - # Convert DataFrames to NumPy arrays if necessary - X_synthetic = ( - X_synthetic.to_numpy() if hasattr(X_synthetic, "to_numpy") else X_synthetic - ) - y_synthetic = ( - y_synthetic.to_numpy().ravel() - if hasattr(y_synthetic, "to_numpy") - else y_synthetic - ) - X_real = X_real.to_numpy() if hasattr(X_real, "to_numpy") else X_real - y_real = y_real.to_numpy().ravel() if hasattr(y_real, "to_numpy") else y_real - - # Identify categorical features and numeric features - categorical_features = np.where(X_synthetic.dtype == "O")[ - 0 - ] # Assuming all object columns are categorical - numeric_features = np.where(X_synthetic.dtype != "O")[0] - - # Define the preprocessing pipeline for the features - preprocessor = ColumnTransformer( - transformers=[ - ("num", "passthrough", numeric_features), - ("cat", OneHotEncoder(handle_unknown="ignore"), categorical_features), - ] - ) - - # Combine y_synthetic and y_real to ensure consistent label encoding - y_combined = np.concatenate([y_synthetic, y_real]) - - # Encode labels - le = LabelEncoder() - le.fit(y_combined) # Ensure all labels from both datasets are accounted for - y_synthetic = le.transform(y_synthetic) - y_real = le.transform(y_real) - - # Create a pipeline that applies the preprocessor and then trains the model - model = Pipeline( - steps=[ - ("preprocessor", preprocessor), - ("classifier", RandomForestClassifier(random_state=42)), - ] - ) - - # TSTR Evaluation using RandomForestClassifier - model.fit(X_synthetic, y_synthetic) - y_pred = model.predict(X_real) - - # Return the accuracy score - return accuracy_score(y_real, y_pred) diff --git a/katabatic/metrics/tstr_rf_DGEK.py b/katabatic/metrics/tstr_rf_DGEK.py deleted file mode 100644 index 6ee25c1..0000000 --- a/katabatic/metrics/tstr_rf_DGEK.py +++ /dev/null @@ -1,25 +0,0 @@ -from sklearn.ensemble import RandomForestClassifier -from 
sklearn.preprocessing import LabelEncoder -from sklearn.metrics import accuracy_score -import numpy as np - -def evaluate(X_synthetic, y_synthetic, X_real, y_real): - - # TODO: error handling, data validation - X_synthetic = X_synthetic.to_numpy() - y_synthetic = y_synthetic.to_numpy().ravel() - X_real = X_real.to_numpy() - y_real = y_real.to_numpy().ravel() - - le = LabelEncoder() # Use labelencoder to convert strings to values - le.fit(np.unique(y_synthetic)) # TODO: Combine both y_synth and y_real values here - y_synthetic = le.transform(y_synthetic) - y_real = le.transform(y_real) - - # TSTR Evaluation using Random Forest - model = RandomForestClassifier() # LogisticRegression(max_iter=200) - model.fit(X_synthetic, y_synthetic) - y_pred = model.predict(X_real) - - - return accuracy_score(y_real, y_pred) \ No newline at end of file diff --git a/katabatic/metrics/tstr_xgbt_DGEK.py b/katabatic/metrics/tstr_xgbt_DGEK.py deleted file mode 100644 index d7ef666..0000000 --- a/katabatic/metrics/tstr_xgbt_DGEK.py +++ /dev/null @@ -1,24 +0,0 @@ -from xgboost import XGBClassifier -from sklearn.preprocessing import LabelEncoder -from sklearn.metrics import accuracy_score -import numpy as np - -def evaluate(X_synthetic, y_synthetic, X_real, y_real): - - # TODO: error handling, data validation - X_synthetic = X_synthetic.to_numpy() - y_synthetic = y_synthetic.to_numpy().ravel() - X_real = X_real.to_numpy() - y_real = y_real.to_numpy().ravel() - - le = LabelEncoder() # Use labelencoder to convert strings to values - le.fit(np.unique(np.concatenate((y_synthetic, y_real)))) # Combine both y_synthetic and y_real values here - y_synthetic = le.transform(y_synthetic) - y_real = le.transform(y_real) - - # TSTR Evaluation using XGBoost - model = XGBClassifier() # Use XGBoostClassifier - model.fit(X_synthetic, y_synthetic) - y_pred = model.predict(X_real) - - return accuracy_score(y_real, y_pred) diff --git a/katabatic/models/TableGAN/__init__.py 
b/katabatic/models/TableGAN/__init__.py deleted file mode 100644 index 1b49a5b..0000000 --- a/katabatic/models/TableGAN/__init__.py +++ /dev/null @@ -1,25 +0,0 @@ -# Import the TableGANAdapter class from the tablegan_adapter module -from .tablegan_adapter import TableGANAdapter -from .tablegan import TableGAN -from .tablegan_utils import preprocess_data, postprocess_data - - -# Define what should be imported when using "from katabatic.models.tablegan import *" -__all__ = ['TableGANAdapter', 'TableGAN', 'preprocess_data', 'postprocess_data'] - -# You can add any initialization code for the tablegan module here if needed - -# For example, you could set up logging for the tablegan module: -import logging - -logger = logging.getLogger(__name__) -logger.setLevel(logging.INFO) - -# Add a console handler -ch = logging.StreamHandler() -ch.setLevel(logging.INFO) -formatter = logging.Formatter('%(asctime)s - %(name)s - %(levelname)s - %(message)s') -ch.setFormatter(formatter) -logger.addHandler(ch) - -logger.info("TableGAN module initialized") \ No newline at end of file diff --git a/katabatic/models/TableGAN/tablegan.py b/katabatic/models/TableGAN/tablegan.py deleted file mode 100644 index b97fe14..0000000 --- a/katabatic/models/TableGAN/tablegan.py +++ /dev/null @@ -1,160 +0,0 @@ -import tensorflow as tf -from tensorflow.keras import layers -import numpy as np - -class TableGAN: - def __init__(self, input_dim, label_dim, z_dim=100, delta_mean=0.1, delta_sd=0.1): - self.input_dim = input_dim - self.label_dim = label_dim - self.z_dim = z_dim - self.delta_mean = delta_mean - self.delta_sd = delta_sd - - self.generator = self._build_generator() - self.discriminator = self._build_discriminator() - self.classifier = self._build_classifier() - - self.g_optimizer = tf.keras.optimizers.Adam(learning_rate=0.0001, beta_1=0.5, beta_2=0.9) - self.d_optimizer = tf.keras.optimizers.Adam(learning_rate=0.0001, beta_1=0.5, beta_2=0.9) - self.c_optimizer = 
tf.keras.optimizers.Adam(learning_rate=0.0001, beta_1=0.5, beta_2=0.9) - - self.f_X_mean = None - self.f_X_sd = None - self.f_Z_mean = None - self.f_Z_sd = None - - def _build_generator(self): - model = tf.keras.Sequential([ - layers.Dense(256 * 4 * 4, input_shape=(self.z_dim,), use_bias=False), - layers.BatchNormalization(), - layers.LeakyReLU(0.2), - layers.Reshape((4, 4, 256)), - layers.Conv2DTranspose(128, 4, strides=2, padding='same', use_bias=False), - layers.BatchNormalization(), - layers.LeakyReLU(0.2), - layers.Conv2DTranspose(64, 4, strides=2, padding='same', use_bias=False), - layers.BatchNormalization(), - layers.LeakyReLU(0.2), - layers.Conv2DTranspose(1, 4, strides=2, padding='same', use_bias=False, activation='tanh') - ]) - return model - - def _build_discriminator(self): - inputs = layers.Input(shape=(32, 32, 1)) - x = layers.Conv2D(64, 4, strides=2, padding='same')(inputs) - x = layers.LeakyReLU(0.2)(x) - x = layers.Conv2D(128, 4, strides=2, padding='same')(x) - x = layers.LeakyReLU(0.2)(x) - x = layers.Conv2D(256, 4, strides=2, padding='same')(x) - x = layers.LeakyReLU(0.2)(x) - features = layers.Flatten()(x) - output = layers.Dense(1)(features) - - return tf.keras.Model(inputs, [output, features]) - - def _build_classifier(self): - model = tf.keras.Sequential([ - layers.Conv2D(64, 4, strides=2, padding='same', input_shape=(32, 32, 1)), - layers.LeakyReLU(0.2), - layers.Conv2D(128, 4, strides=2, padding='same'), - layers.LeakyReLU(0.2), - layers.Conv2D(256, 4, strides=2, padding='same'), - layers.LeakyReLU(0.2), - layers.Flatten(), - layers.Dense(self.label_dim, activation='softmax') - ]) - return model - - def wasserstein_loss(self, y_true, y_pred): - return tf.reduce_mean(y_true * y_pred) - - def gradient_penalty(self, real, fake): - alpha = tf.random.uniform([tf.shape(real)[0], 1, 1, 1], 0.0, 1.0) - diff = fake - real - interpolated = real + alpha * diff - with tf.GradientTape() as gp_tape: - gp_tape.watch(interpolated) - pred, _ = 
self.discriminator(interpolated, training=True) - grads = gp_tape.gradient(pred, [interpolated])[0] - norm = tf.sqrt(tf.reduce_sum(tf.square(grads), axis=[1, 2, 3])) - gp = tf.reduce_mean((norm - 1.0) ** 2) - return gp - - @tf.function - def train_step(self, real_data, real_labels): - real_data = tf.cast(real_data, tf.float32) - real_labels = tf.cast(real_labels, tf.float32) - batch_size = tf.shape(real_data)[0] - noise = tf.random.normal([batch_size, self.z_dim]) - - with tf.GradientTape() as d_tape, tf.GradientTape() as c_tape, tf.GradientTape() as g_tape: - fake_data = self.generator(noise, training=True) - - real_output, real_features = self.discriminator(real_data, training=True) - fake_output, fake_features = self.discriminator(fake_data, training=True) - - d_loss = self.wasserstein_loss(real_output, -tf.ones_like(real_output)) + \ - self.wasserstein_loss(fake_output, tf.ones_like(fake_output)) - gp = self.gradient_penalty(real_data, fake_data) - d_loss += 10.0 * gp - - f_X_mean = tf.reduce_mean(real_features, axis=0) - f_X_sd = tf.math.reduce_std(real_features, axis=0) - f_Z_mean = tf.reduce_mean(fake_features, axis=0) - f_Z_sd = tf.math.reduce_std(fake_features, axis=0) - - if self.f_X_mean is None: - self.f_X_mean = tf.Variable(tf.zeros_like(f_X_mean), trainable=False) - self.f_X_sd = tf.Variable(tf.zeros_like(f_X_sd), trainable=False) - self.f_Z_mean = tf.Variable(tf.zeros_like(f_Z_mean), trainable=False) - self.f_Z_sd = tf.Variable(tf.zeros_like(f_Z_sd), trainable=False) - - self.f_X_mean.assign(0.99 * self.f_X_mean + 0.01 * f_X_mean) - self.f_X_sd.assign(0.99 * self.f_X_sd + 0.01 * f_X_sd) - self.f_Z_mean.assign(0.99 * self.f_Z_mean + 0.01 * f_Z_mean) - self.f_Z_sd.assign(0.99 * self.f_Z_sd + 0.01 * f_Z_sd) - - L_mean = tf.reduce_sum(tf.square(self.f_X_mean - self.f_Z_mean)) - L_sd = tf.reduce_sum(tf.square(self.f_X_sd - self.f_Z_sd)) - - g_info_loss = tf.maximum(0.0, L_mean - self.delta_mean) + tf.maximum(0.0, L_sd - self.delta_sd) - - c_real_pred = 
self.classifier(real_data, training=True) - c_fake_pred = self.classifier(fake_data, training=True) - - c_loss = tf.reduce_mean(tf.keras.losses.categorical_crossentropy(real_labels, c_real_pred)) - g_class_loss = tf.reduce_mean(tf.keras.losses.categorical_crossentropy(real_labels, c_fake_pred)) - - g_loss = -tf.reduce_mean(fake_output) + 0.1 * g_info_loss + 0.1 * g_class_loss - - d_gradients = d_tape.gradient(d_loss, self.discriminator.trainable_variables) - c_gradients = c_tape.gradient(c_loss, self.classifier.trainable_variables) - g_gradients = g_tape.gradient(g_loss, self.generator.trainable_variables) - - self.d_optimizer.apply_gradients(zip(d_gradients, self.discriminator.trainable_variables)) - self.c_optimizer.apply_gradients(zip(c_gradients, self.classifier.trainable_variables)) - self.g_optimizer.apply_gradients(zip(g_gradients, self.generator.trainable_variables)) - - return d_loss, g_loss, c_loss - - def fit(self, x, y, batch_size=64, epochs=100, verbose=1): - dataset = tf.data.Dataset.from_tensor_slices((x, y)).shuffle(10000).batch(batch_size) - - for epoch in range(epochs): - d_losses, g_losses, c_losses = [], [], [] - for batch_x, batch_y in dataset: - d_loss, g_loss, c_loss = self.train_step(batch_x, batch_y) - d_losses.append(d_loss) - g_losses.append(g_loss) - c_losses.append(c_loss) - - if verbose and (epoch + 1) % 10 == 0: - print(f"Epoch {epoch+1}/{epochs}: [D loss: {np.mean(d_losses):.4f}] [G loss: {np.mean(g_losses):.4f}] [C loss: {np.mean(c_losses):.4f}]") - - return self - - def sample(self, n_samples): - noise = tf.random.normal([n_samples, self.z_dim]) - generated_data = self.generator(noise, training=False) - generated_labels = self.classifier(generated_data, training=False) - return generated_data.numpy(), generated_labels.numpy() \ No newline at end of file diff --git a/katabatic/models/TableGAN/tablegan_adapter.py b/katabatic/models/TableGAN/tablegan_adapter.py deleted file mode 100644 index a618e25..0000000 --- 
a/katabatic/models/TableGAN/tablegan_adapter.py +++ /dev/null @@ -1,79 +0,0 @@ -from katabatic.katabatic_spi import KatabaticModelSPI -import pandas as pd -import numpy as np -from .tablegan import TableGAN -from .tablegan_utils import preprocess_data, postprocess_data - -class TableGANAdapter(KatabaticModelSPI): - - def __init__(self, type='continuous', privacy_setting='low'): - self.type = type - self.privacy_setting = privacy_setting - self.constraints = None - self.batch_size = 64 - self.epochs = 100 - self.model = None - self.scaler = None - self.label_encoder = None - self.input_dim = None - self.label_dim = None - self.training_sample_size = 0 - - def load_model(self): - print("---Initialise TableGAN Model") - if self.input_dim is None or self.label_dim is None: - raise ValueError("input_dim and label_dim must be set before loading the model") - - # Set privacy parameters based on privacy_setting - if self.privacy_setting == 'low': - delta_mean = 0.0 - delta_sd = 0.0 - elif self.privacy_setting == 'high': - delta_mean = 0.2 - delta_sd = 0.2 - else: - delta_mean = 0.1 - delta_sd = 0.1 - - self.model = TableGAN(input_dim=self.input_dim, label_dim=self.label_dim, - delta_mean=delta_mean, delta_sd=delta_sd) - return self.model - - - def load_data(self, data_pathname): - print("Loading Data...") - try: - data = pd.read_csv(data_pathname) - return data - except Exception as e: - print(f"Error loading data: {e}") - return None - - def fit(self, X_train, y_train, epochs=None, batch_size=None): - print(f"---FIT TableGAN Model with {self.privacy_setting} privacy setting") - if epochs is not None: - self.epochs = epochs - if batch_size is not None: - self.batch_size = batch_size - - X_processed, y_processed, self.scaler, self.label_encoder = preprocess_data(X_train, y_train) - self.input_dim = X_processed.shape[1] * X_processed.shape[2] - self.label_dim = y_processed.shape[1] - - if self.model is None: - self.load_model() - - self.model.fit(X_processed, y_processed, 
batch_size=self.batch_size, epochs=self.epochs, verbose=1) - self.training_sample_size = len(X_train) - - def generate(self, size=None): - print("---Generate from TableGAN Model") - if self.model is None: - raise ValueError("Model has not been trained. Call fit() before generate().") - - if size is None: - size = self.training_sample_size - - generated_data, generated_labels = self.model.sample(size) - generated_data = postprocess_data(generated_data, generated_labels, self.scaler, self.label_encoder) - return generated_data \ No newline at end of file diff --git a/katabatic/models/TableGAN/tablegan_utils.py b/katabatic/models/TableGAN/tablegan_utils.py deleted file mode 100644 index 9ca668a..0000000 --- a/katabatic/models/TableGAN/tablegan_utils.py +++ /dev/null @@ -1,31 +0,0 @@ -import numpy as np -from sklearn.preprocessing import MinMaxScaler, LabelEncoder -from tensorflow.keras.utils import to_categorical - -def record_to_matrix(record, matrix_size=32): - matrix = np.zeros((matrix_size, matrix_size)) - flat_record = record.flatten() - matrix.flat[:len(flat_record)] = flat_record - return matrix - -def preprocess_data(X, y): - scaler = MinMaxScaler(feature_range=(-1, 1)) - X_scaled = scaler.fit_transform(X) - - X_matrices = np.array([record_to_matrix(record) for record in X_scaled]) - X_reshaped = X_matrices.reshape(-1, 32, 32, 1) - - label_encoder = LabelEncoder() - y_encoded = label_encoder.fit_transform(y) - y_categorical = to_categorical(y_encoded) - - return X_reshaped, y_categorical, scaler, label_encoder - -def postprocess_data(generated_data, generated_labels, scaler, label_encoder): - n_samples = generated_data.shape[0] - synthetic_flat = generated_data.reshape(n_samples, -1)[:, :scaler.n_features_in_] - synthetic_original = scaler.inverse_transform(synthetic_flat) - - synthetic_labels = label_encoder.inverse_transform(generated_labels.argmax(axis=1)) - - return np.column_stack((synthetic_original, synthetic_labels)) \ No newline at end of file diff 
--git a/katabatic/models/__init__.py b/katabatic/models/__init__.py deleted file mode 100644 index e69de29..0000000 diff --git a/katabatic/models/ctgan/__init__.py b/katabatic/models/ctgan/__init__.py deleted file mode 100644 index f0a90c7..0000000 --- a/katabatic/models/ctgan/__init__.py +++ /dev/null @@ -1 +0,0 @@ -from .ctgan_adapter import CtganAdapter \ No newline at end of file diff --git a/katabatic/models/ctgan/base.py b/katabatic/models/ctgan/base.py deleted file mode 100644 index add0dd7..0000000 --- a/katabatic/models/ctgan/base.py +++ /dev/null @@ -1,151 +0,0 @@ -"""BaseSynthesizer module.""" - -import contextlib - -import numpy as np -import torch - - -@contextlib.contextmanager -def set_random_states(random_state, set_model_random_state): - """Context manager for managing the random state. - - Args: - random_state (int or tuple): - The random seed or a tuple of (numpy.random.RandomState, torch.Generator). - set_model_random_state (function): - Function to set the random state on the model. - """ - original_np_state = np.random.get_state() - original_torch_state = torch.get_rng_state() - - random_np_state, random_torch_state = random_state - - np.random.set_state(random_np_state.get_state()) - torch.set_rng_state(random_torch_state.get_state()) - - try: - yield - finally: - current_np_state = np.random.RandomState() - current_np_state.set_state(np.random.get_state()) - current_torch_state = torch.Generator() - current_torch_state.set_state(torch.get_rng_state()) - set_model_random_state((current_np_state, current_torch_state)) - - np.random.set_state(original_np_state) - torch.set_rng_state(original_torch_state) - - -def random_state(function): - """Set the random state before calling the function. - - Args: - function (Callable): - The function to wrap around. 
- """ - - def wrapper(self, *args, **kwargs): - if self.random_states is None: - return function(self, *args, **kwargs) - - else: - with set_random_states(self.random_states, self.set_random_state): - return function(self, *args, **kwargs) - - return wrapper - - -class BaseSynthesizer: - """Base class for all default synthesizers of ``CTGAN``.""" - - random_states = None - - def __getstate__(self): - """Improve pickling state for ``BaseSynthesizer``. - - Convert to ``cpu`` device before starting the pickling process in order to be able to - load the model even when used from an external tool such as ``SDV``. Also, if - ``random_states`` are set, store their states as dictionaries rather than generators. - - Returns: - dict: - Python dict representing the object. - """ - device_backup = self._device - self.set_device(torch.device('cpu')) - state = self.__dict__.copy() - self.set_device(device_backup) - if ( - isinstance(self.random_states, tuple) - and isinstance(self.random_states[0], np.random.RandomState) - and isinstance(self.random_states[1], torch.Generator) - ): - state['_numpy_random_state'] = self.random_states[0].get_state() - state['_torch_random_state'] = self.random_states[1].get_state() - state.pop('random_states') - - return state - - def __setstate__(self, state): - """Restore the state of a ``BaseSynthesizer``. - - Restore the ``random_states`` from the state dict if those are present and then - set the device according to the current hardware. 
- """ - if '_numpy_random_state' in state and '_torch_random_state' in state: - np_state = state.pop('_numpy_random_state') - torch_state = state.pop('_torch_random_state') - - current_torch_state = torch.Generator() - current_torch_state.set_state(torch_state) - - current_numpy_state = np.random.RandomState() - current_numpy_state.set_state(np_state) - state['random_states'] = (current_numpy_state, current_torch_state) - - self.__dict__ = state - device = torch.device('cuda:0' if torch.cuda.is_available() else 'cpu') - self.set_device(device) - - def save(self, path): - """Save the model in the passed `path`.""" - device_backup = self._device - self.set_device(torch.device('cpu')) - torch.save(self, path) - self.set_device(device_backup) - - @classmethod - def load(cls, path): - """Load the model stored in the passed `path`.""" - device = torch.device('cuda:0' if torch.cuda.is_available() else 'cpu') - model = torch.load(path) - model.set_device(device) - return model - - def set_random_state(self, random_state): - """Set the random state. - - Args: - random_state (int, tuple, or None): - Either a tuple containing the (numpy.random.RandomState, torch.Generator) - or an int representing the random seed to use for both random states. 
- """ - if random_state is None: - self.random_states = random_state - elif isinstance(random_state, int): - self.random_states = ( - np.random.RandomState(seed=random_state), - torch.Generator().manual_seed(random_state), - ) - elif ( - isinstance(random_state, tuple) - and isinstance(random_state[0], np.random.RandomState) - and isinstance(random_state[1], torch.Generator) - ): - self.random_states = random_state - else: - raise TypeError( - f'`random_state` {random_state} expected to be an int or a tuple of ' - '(`np.random.RandomState`, `torch.Generator`)' - ) diff --git a/katabatic/models/ctgan/config.json b/katabatic/models/ctgan/config.json deleted file mode 100644 index ef273e6..0000000 --- a/katabatic/models/ctgan/config.json +++ /dev/null @@ -1,21 +0,0 @@ -{ - "ctgan_params": { - "noise_dim": 128, - "learning_rate": 2e-4, - "batch_size": 500, - "discriminator_steps": 5, - "epochs": 300, - "lambda_gp": 10, - "pac": 10, - "cuda": true, - "vgm_components": 2 - }, - "evaluation": { - "test_size": 0.2, - "random_state": 42 - }, - "visualization": { - "n_features": 5, - "figsize": [15, 20] - } -} diff --git a/katabatic/models/ctgan/ctgan.py b/katabatic/models/ctgan/ctgan.py deleted file mode 100644 index 5fdbc26..0000000 --- a/katabatic/models/ctgan/ctgan.py +++ /dev/null @@ -1,522 +0,0 @@ -"""CTGAN module.""" - -import warnings - -import numpy as np -import pandas as pd -import torch -from torch import optim -from torch.nn import BatchNorm1d, Dropout, LeakyReLU, Linear, Module, ReLU, Sequential, functional -from tqdm import tqdm - -from ctgan.data_sampler import DataSampler -from ctgan.data_transformer import DataTransformer -from ctgan.synthesizers.base import BaseSynthesizer, random_state - - -class Discriminator(Module): - """Discriminator for the CTGAN.""" - - def __init__(self, input_dim, discriminator_dim, pac=10): - super(Discriminator, self).__init__() - dim = input_dim * pac - self.pac = pac - self.pacdim = dim - seq = [] - for item in 
list(discriminator_dim): - seq += [Linear(dim, item), LeakyReLU(0.2), Dropout(0.5)] - dim = item - - seq += [Linear(dim, 1)] - self.seq = Sequential(*seq) - - def calc_gradient_penalty(self, real_data, fake_data, device='cpu', pac=10, lambda_=10): - """Compute the gradient penalty.""" - alpha = torch.rand(real_data.size(0) // pac, 1, 1, device=device) - alpha = alpha.repeat(1, pac, real_data.size(1)) - alpha = alpha.view(-1, real_data.size(1)) - - interpolates = alpha * real_data + ((1 - alpha) * fake_data) - - disc_interpolates = self(interpolates) - - gradients = torch.autograd.grad( - outputs=disc_interpolates, - inputs=interpolates, - grad_outputs=torch.ones(disc_interpolates.size(), device=device), - create_graph=True, - retain_graph=True, - only_inputs=True, - )[0] - - gradients_view = gradients.view(-1, pac * real_data.size(1)).norm(2, dim=1) - 1 - gradient_penalty = ((gradients_view) ** 2).mean() * lambda_ - - return gradient_penalty - - def forward(self, input_): - """Apply the Discriminator to the `input_`.""" - assert input_.size()[0] % self.pac == 0 - return self.seq(input_.view(-1, self.pacdim)) - - -class Residual(Module): - """Residual layer for the CTGAN.""" - - def __init__(self, i, o): - super(Residual, self).__init__() - self.fc = Linear(i, o) - self.bn = BatchNorm1d(o) - self.relu = ReLU() - - def forward(self, input_): - """Apply the Residual layer to the `input_`.""" - out = self.fc(input_) - out = self.bn(out) - out = self.relu(out) - return torch.cat([out, input_], dim=1) - - -class Generator(Module): - """Generator for the CTGAN.""" - - def __init__(self, embedding_dim, generator_dim, data_dim): - super(Generator, self).__init__() - dim = embedding_dim - seq = [] - for item in list(generator_dim): - seq += [Residual(dim, item)] - dim += item - seq.append(Linear(dim, data_dim)) - self.seq = Sequential(*seq) - - def forward(self, input_): - """Apply the Generator to the `input_`.""" - data = self.seq(input_) - return data - - -class 
CTGAN(BaseSynthesizer): - """Conditional Table GAN Synthesizer. - - This is the core class of the CTGAN project, where the different components - are orchestrated together. - For more details about the process, please check the [Modeling Tabular data using - Conditional GAN](https://arxiv.org/abs/1907.00503) paper. - - Args: - embedding_dim (int): - Size of the random sample passed to the Generator. Defaults to 128. - generator_dim (tuple or list of ints): - Size of the output samples for each one of the Residuals. A Residual Layer - will be created for each one of the values provided. Defaults to (256, 256). - discriminator_dim (tuple or list of ints): - Size of the output samples for each one of the Discriminator Layers. A Linear Layer - will be created for each one of the values provided. Defaults to (256, 256). - generator_lr (float): - Learning rate for the generator. Defaults to 2e-4. - generator_decay (float): - Generator weight decay for the Adam Optimizer. Defaults to 1e-6. - discriminator_lr (float): - Learning rate for the discriminator. Defaults to 2e-4. - discriminator_decay (float): - Discriminator weight decay for the Adam Optimizer. Defaults to 1e-6. - batch_size (int): - Number of data samples to process in each step. - discriminator_steps (int): - Number of discriminator updates to do for each generator update. - From the WGAN paper: https://arxiv.org/abs/1701.07875. WGAN paper - default is 5. Default used is 1 to match original CTGAN implementation. - log_frequency (boolean): - Whether to use log frequency of categorical levels in conditional - sampling. Defaults to ``True``. - verbose (boolean): - Whether to have print statements for progress results. Defaults to ``False``. - epochs (int): - Number of training epochs. Defaults to 300. - pac (int): - Number of samples to group together when applying the discriminator. - Defaults to 10. - cuda (bool): - Whether to attempt to use cuda for GPU computation. 
- If this is False or CUDA is not available, CPU will be used. - Defaults to ``True``. - """ - - def __init__( - self, - embedding_dim=128, - generator_dim=(256, 256), - discriminator_dim=(256, 256), - generator_lr=2e-4, - generator_decay=1e-6, - discriminator_lr=2e-4, - discriminator_decay=1e-6, - batch_size=500, - discriminator_steps=1, - log_frequency=True, - verbose=False, - epochs=300, - pac=10, - cuda=True, - ): - assert batch_size % 2 == 0 - - self._embedding_dim = embedding_dim - self._generator_dim = generator_dim - self._discriminator_dim = discriminator_dim - - self._generator_lr = generator_lr - self._generator_decay = generator_decay - self._discriminator_lr = discriminator_lr - self._discriminator_decay = discriminator_decay - - self._batch_size = batch_size - self._discriminator_steps = discriminator_steps - self._log_frequency = log_frequency - self._verbose = verbose - self._epochs = epochs - self.pac = pac - - if not cuda or not torch.cuda.is_available(): - device = 'cpu' - elif isinstance(cuda, str): - device = cuda - else: - device = 'cuda' - - self._device = torch.device(device) - - self._transformer = None - self._data_sampler = None - self._generator = None - self.loss_values = None - - @staticmethod - def _gumbel_softmax(logits, tau=1, hard=False, eps=1e-10, dim=-1): - """Deals with the instability of the gumbel_softmax for older versions of torch. - - For more details about the issue: - https://drive.google.com/file/d/1AA5wPfZ1kquaRtVruCd6BiYZGcDeNxyP/view?usp=sharing - - Args: - logits […, num_features]: - Unnormalized log probabilities - tau: - Non-negative scalar temperature - hard (bool): - If True, the returned samples will be discretized as one-hot vectors, - but will be differentiated as if it is the soft sample in autograd - dim (int): - A dimension along which softmax will be computed. Default: -1. - - Returns: - Sampled tensor of same shape as logits from the Gumbel-Softmax distribution. 
- """ - for _ in range(10): - transformed = functional.gumbel_softmax(logits, tau=tau, hard=hard, eps=eps, dim=dim) - if not torch.isnan(transformed).any(): - return transformed - - raise ValueError('gumbel_softmax returning NaN.') - - def _apply_activate(self, data): - """Apply proper activation function to the output of the generator.""" - data_t = [] - st = 0 - for column_info in self._transformer.output_info_list: - for span_info in column_info: - if span_info.activation_fn == 'tanh': - ed = st + span_info.dim - data_t.append(torch.tanh(data[:, st:ed])) - st = ed - elif span_info.activation_fn == 'softmax': - ed = st + span_info.dim - transformed = self._gumbel_softmax(data[:, st:ed], tau=0.2) - data_t.append(transformed) - st = ed - else: - raise ValueError(f'Unexpected activation function {span_info.activation_fn}.') - - return torch.cat(data_t, dim=1) - - def _cond_loss(self, data, c, m): - """Compute the cross entropy loss on the fixed discrete column.""" - loss = [] - st = 0 - st_c = 0 - for column_info in self._transformer.output_info_list: - for span_info in column_info: - if len(column_info) != 1 or span_info.activation_fn != 'softmax': - # not discrete column - st += span_info.dim - else: - ed = st + span_info.dim - ed_c = st_c + span_info.dim - tmp = functional.cross_entropy( - data[:, st:ed], torch.argmax(c[:, st_c:ed_c], dim=1), reduction='none' - ) - loss.append(tmp) - st = ed - st_c = ed_c - - loss = torch.stack(loss, dim=1) # noqa: PD013 - - return (loss * m).sum() / data.size()[0] - - def _validate_discrete_columns(self, train_data, discrete_columns): - """Check whether ``discrete_columns`` exists in ``train_data``. - - Args: - train_data (numpy.ndarray or pandas.DataFrame): - Training Data. It must be a 2-dimensional numpy array or a pandas.DataFrame. - discrete_columns (list-like): - List of discrete columns to be used to generate the Conditional - Vector. 
If ``train_data`` is a Numpy array, this list should - contain the integer indices of the columns. Otherwise, if it is - a ``pandas.DataFrame``, this list should contain the column names. - """ - if isinstance(train_data, pd.DataFrame): - invalid_columns = set(discrete_columns) - set(train_data.columns) - elif isinstance(train_data, np.ndarray): - invalid_columns = [] - for column in discrete_columns: - if column < 0 or column >= train_data.shape[1]: - invalid_columns.append(column) - else: - raise TypeError('``train_data`` should be either pd.DataFrame or np.array.') - - if invalid_columns: - raise ValueError(f'Invalid columns found: {invalid_columns}') - - @random_state - def fit(self, train_data, discrete_columns=(), epochs=None): - """Fit the CTGAN Synthesizer models to the training data. - - Args: - train_data (numpy.ndarray or pandas.DataFrame): - Training Data. It must be a 2-dimensional numpy array or a pandas.DataFrame. - discrete_columns (list-like): - List of discrete columns to be used to generate the Conditional - Vector. If ``train_data`` is a Numpy array, this list should - contain the integer indices of the columns. Otherwise, if it is - a ``pandas.DataFrame``, this list should contain the column names. - """ - self._validate_discrete_columns(train_data, discrete_columns) - - if epochs is None: - epochs = self._epochs - else: - warnings.warn( - ( - '`epochs` argument in `fit` method has been deprecated and will be removed ' - 'in a future version. 
Please pass `epochs` to the constructor instead' - ), - DeprecationWarning, - ) - - self._transformer = DataTransformer() - self._transformer.fit(train_data, discrete_columns) - - train_data = self._transformer.transform(train_data) - - self._data_sampler = DataSampler( - train_data, self._transformer.output_info_list, self._log_frequency - ) - - data_dim = self._transformer.output_dimensions - - self._generator = Generator( - self._embedding_dim + self._data_sampler.dim_cond_vec(), self._generator_dim, data_dim - ).to(self._device) - - discriminator = Discriminator( - data_dim + self._data_sampler.dim_cond_vec(), self._discriminator_dim, pac=self.pac - ).to(self._device) - - optimizerG = optim.Adam( - self._generator.parameters(), - lr=self._generator_lr, - betas=(0.5, 0.9), - weight_decay=self._generator_decay, - ) - - optimizerD = optim.Adam( - discriminator.parameters(), - lr=self._discriminator_lr, - betas=(0.5, 0.9), - weight_decay=self._discriminator_decay, - ) - - mean = torch.zeros(self._batch_size, self._embedding_dim, device=self._device) - std = mean + 1 - - self.loss_values = pd.DataFrame(columns=['Epoch', 'Generator Loss', 'Discriminator Loss']) - - epoch_iterator = tqdm(range(epochs), disable=(not self._verbose)) - if self._verbose: - description = 'Gen. ({gen:.2f}) | Discrim. 
({dis:.2f})' - epoch_iterator.set_description(description.format(gen=0, dis=0)) - - steps_per_epoch = max(len(train_data) // self._batch_size, 1) - for i in epoch_iterator: - for id_ in range(steps_per_epoch): - for n in range(self._discriminator_steps): - fakez = torch.normal(mean=mean, std=std) - - condvec = self._data_sampler.sample_condvec(self._batch_size) - if condvec is None: - c1, m1, col, opt = None, None, None, None - real = self._data_sampler.sample_data( - train_data, self._batch_size, col, opt - ) - else: - c1, m1, col, opt = condvec - c1 = torch.from_numpy(c1).to(self._device) - m1 = torch.from_numpy(m1).to(self._device) - fakez = torch.cat([fakez, c1], dim=1) - - perm = np.arange(self._batch_size) - np.random.shuffle(perm) - real = self._data_sampler.sample_data( - train_data, self._batch_size, col[perm], opt[perm] - ) - c2 = c1[perm] - - fake = self._generator(fakez) - fakeact = self._apply_activate(fake) - - real = torch.from_numpy(real.astype('float32')).to(self._device) - - if c1 is not None: - fake_cat = torch.cat([fakeact, c1], dim=1) - real_cat = torch.cat([real, c2], dim=1) - else: - real_cat = real - fake_cat = fakeact - - y_fake = discriminator(fake_cat) - y_real = discriminator(real_cat) - - pen = discriminator.calc_gradient_penalty( - real_cat, fake_cat, self._device, self.pac - ) - loss_d = -(torch.mean(y_real) - torch.mean(y_fake)) - - optimizerD.zero_grad(set_to_none=False) - pen.backward(retain_graph=True) - loss_d.backward() - optimizerD.step() - - fakez = torch.normal(mean=mean, std=std) - condvec = self._data_sampler.sample_condvec(self._batch_size) - - if condvec is None: - c1, m1, col, opt = None, None, None, None - else: - c1, m1, col, opt = condvec - c1 = torch.from_numpy(c1).to(self._device) - m1 = torch.from_numpy(m1).to(self._device) - fakez = torch.cat([fakez, c1], dim=1) - - fake = self._generator(fakez) - fakeact = self._apply_activate(fake) - - if c1 is not None: - y_fake = discriminator(torch.cat([fakeact, c1], dim=1)) 
- else: - y_fake = discriminator(fakeact) - - if condvec is None: - cross_entropy = 0 - else: - cross_entropy = self._cond_loss(fake, c1, m1) - - loss_g = -torch.mean(y_fake) + cross_entropy - - optimizerG.zero_grad(set_to_none=False) - loss_g.backward() - optimizerG.step() - - generator_loss = loss_g.detach().cpu().item() - discriminator_loss = loss_d.detach().cpu().item() - - epoch_loss_df = pd.DataFrame({ - 'Epoch': [i], - 'Generator Loss': [generator_loss], - 'Discriminator Loss': [discriminator_loss], - }) - if not self.loss_values.empty: - self.loss_values = pd.concat([self.loss_values, epoch_loss_df]).reset_index( - drop=True - ) - else: - self.loss_values = epoch_loss_df - - if self._verbose: - epoch_iterator.set_description( - description.format(gen=generator_loss, dis=discriminator_loss) - ) - - @random_state - def sample(self, n, condition_column=None, condition_value=None): - """Sample data similar to the training data. - - Choosing a condition_column and condition_value will increase the probability of the - discrete condition_value happening in the condition_column. - - Args: - n (int): - Number of rows to sample. - condition_column (string): - Name of a discrete column. - condition_value (string): - Name of the category in the condition_column which we wish to increase the - probability of happening. 
- - Returns: - numpy.ndarray or pandas.DataFrame - """ - if condition_column is not None and condition_value is not None: - condition_info = self._transformer.convert_column_name_value_to_id( - condition_column, condition_value - ) - global_condition_vec = self._data_sampler.generate_cond_from_condition_column_info( - condition_info, self._batch_size - ) - else: - global_condition_vec = None - - steps = n // self._batch_size + 1 - data = [] - for i in range(steps): - mean = torch.zeros(self._batch_size, self._embedding_dim) - std = mean + 1 - fakez = torch.normal(mean=mean, std=std).to(self._device) - - if global_condition_vec is not None: - condvec = global_condition_vec.copy() - else: - condvec = self._data_sampler.sample_original_condvec(self._batch_size) - - if condvec is None: - pass - else: - c1 = condvec - c1 = torch.from_numpy(c1).to(self._device) - fakez = torch.cat([fakez, c1], dim=1) - - fake = self._generator(fakez) - fakeact = self._apply_activate(fake) - data.append(fakeact.detach().cpu().numpy()) - - data = np.concatenate(data, axis=0) - data = data[:n] - - return self._transformer.inverse_transform(data) - - def set_device(self, device): - """Set the `device` to be used ('GPU' or 'CPU').""" - self._device = device - if self._generator is not None: - self._generator.to(self._device) diff --git a/katabatic/models/ctgan/ctgan_adapter.py b/katabatic/models/ctgan/ctgan_adapter.py deleted file mode 100644 index 364052d..0000000 --- a/katabatic/models/ctgan/ctgan_adapter.py +++ /dev/null @@ -1,65 +0,0 @@ -from katabatic.katabatic_spi import KatabaticModelSPI -import pandas as pd -import torch as torch -from .ctgan import CTGAN - -class CtganAdapter(KatabaticModelSPI): - def __init__(self, - embedding_dim=128, - generator_dim=(256, 256), - discriminator_dim=(256, 256), - generator_lr=2e-4, - generator_decay=1e-6, - discriminator_lr=2e-4, - discriminator_decay=1e-6, - batch_size=500, - discriminator_steps=1, - log_frequency=True, - verbose=False, - 
epochs=300, - pac=10, - cuda=True, - discrete_columns=None): - """ - Initialize the CtganAdapter with specified hyperparameters. - """ - assert batch_size % 2 == 0, "Batch size must be even." - - self.discrete_columns = discrete_columns - self._embedding_dim = embedding_dim - self._generator_dim = generator_dim - self._discriminator_dim = discriminator_dim - self._generator_lr = generator_lr - self._generator_decay = generator_decay - self._discriminator_lr = discriminator_lr - self._discriminator_decay = discriminator_decay - self._batch_size = batch_size - self._discriminator_steps = discriminator_steps - self._log_frequency = log_frequency - self._verbose = verbose - self._epochs = epochs - self.pac = pac - - # Check for CUDA availability - if not cuda or not torch.cuda.is_available(): - self.device = torch.device("cpu") - else: - self.device = torch.device("cuda") - - # Initialize the CTGAN model. Note: CTGAN's constructor accepts a `cuda` - # flag rather than a `device` object, so pass a bool derived from the - # resolved device instead of `device=self.device`. - self.model = CTGAN( - embedding_dim=self._embedding_dim, - generator_dim=self._generator_dim, - discriminator_dim=self._discriminator_dim, - generator_lr=self._generator_lr, - generator_decay=self._generator_decay, - discriminator_lr=self._discriminator_lr, - discriminator_decay=self._discriminator_decay, - batch_size=self._batch_size, - discriminator_steps=self._discriminator_steps, - log_frequency=self._log_frequency, - verbose=self._verbose, - epochs=self._epochs, - pac=self.pac, - cuda=(self.device.type == "cuda") - ) diff --git a/katabatic/models/ctgan/ctgan_benchmark.py b/katabatic/models/ctgan/ctgan_benchmark.py deleted file mode 100644 index 39b9ac8..0000000 --- a/katabatic/models/ctgan/ctgan_benchmark.py +++ /dev/null @@ -1,374 +0,0 @@ -import numpy as np -import pandas as pd -from scipy import stats -from sklearn.model_selection import train_test_split -from sklearn.preprocessing import StandardScaler, OneHotEncoder -from sklearn.impute import SimpleImputer -from sklearn.compose import ColumnTransformer -from sklearn.pipeline import Pipeline -from sklearn.metrics import 
accuracy_score, f1_score, r2_score -from sklearn.linear_model import LogisticRegression, LinearRegression -from sklearn.ensemble import RandomForestClassifier, RandomForestRegressor -from sklearn.neural_network import MLPClassifier, MLPRegressor -from xgboost import XGBClassifier, XGBRegressor -import logging -from scipy.stats import wasserstein_distance, entropy -from scipy.spatial.distance import jensenshannon - -# Configure logging to display information-level messages with timestamps -logging.basicConfig(level=logging.INFO, format="%(asctime)s - %(levelname)s - %(message)s") - -def evaluate_likelihood_fitness(real_data, synthetic_data): - """ - Calculate Likelihood fitness metrics (Lsyn, Ltest) on simulated data. - - Lsyn: Log-likelihood of synthetic data under the model trained on synthetic data. - Ltest: Log-likelihood of real data under the model trained on synthetic data. - - Args: - real_data (pd.DataFrame): The original real dataset. - synthetic_data (pd.DataFrame): The synthetic dataset generated by CTGAN. - - Returns: - dict: Dictionary containing Lsyn and Ltest values. - """ - from sklearn.mixture import GaussianMixture - - # Select only numerical columns for likelihood evaluation - numeric_columns = real_data.select_dtypes(include=['int64', 'float64']).columns - synthetic_numeric = synthetic_data[numeric_columns].dropna() - real_numeric = real_data[numeric_columns].dropna() - - # Fit a Gaussian Mixture Model on the synthetic data - gmm = GaussianMixture(n_components=5, covariance_type='full', random_state=42) - gmm.fit(synthetic_numeric) - - # Evaluate log-likelihoods - Lsyn = gmm.score(synthetic_numeric) # Log-likelihood of synthetic data - Ltest = gmm.score(real_numeric) # Log-likelihood of real data under the synthetic model - - return { - "Lsyn": Lsyn, - "Ltest": Ltest - } - -def evaluate_statistical_similarity(real_data, synthetic_data): - """ - Calculate Statistical similarity metrics (Jensen-Shannon Divergence, Wasserstein Distance). 
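Editorial aside on the likelihood-fitness metric deleted above: Lsyn and Ltest can be reproduced in a few lines with scikit-learn's `GaussianMixture` — fit the density model on synthetic samples only, then `score()` both synthetic and real samples (the mean per-sample log-likelihood). The data below is a stand-in, not part of the original module:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
real = rng.normal(loc=0.0, scale=1.0, size=(500, 3))       # stand-in "real" data
synthetic = rng.normal(loc=0.1, scale=1.1, size=(500, 3))  # stand-in "synthetic" data

# Fit the density model on synthetic data only
gmm = GaussianMixture(n_components=5, covariance_type='full', random_state=42)
gmm.fit(synthetic)

# score() returns the mean per-sample log-likelihood
l_syn = gmm.score(synthetic)   # how well the model explains its own training data
l_test = gmm.score(real)       # how well that model generalises to real data
```

A large gap between `l_syn` and `l_test` suggests the generator captured the synthetic distribution but not the real one.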
- - Args: - real_data (pd.DataFrame): The original real dataset. - synthetic_data (pd.DataFrame): The synthetic dataset generated by CTGAN. - - Returns: - dict: Dictionary containing mean Jensen-Shannon Divergence and mean Wasserstein Distance. - """ - js_divergences = [] - wasserstein_distances = [] - - # Iterate over each column to compute similarity metrics - for column in real_data.columns: - if real_data[column].dtype in ['int64', 'float64']: - # For numerical columns, compute Wasserstein Distance - real_values = real_data[column].dropna().values - synth_values = synthetic_data[column].dropna().values - - wd = wasserstein_distance(real_values, synth_values) - wasserstein_distances.append(wd) - else: - # For categorical columns, compute Jensen-Shannon Divergence - real_counts = real_data[column].value_counts(normalize=True) - synth_counts = synthetic_data[column].value_counts(normalize=True) - - # Ensure both distributions have the same categories - all_categories = set(real_counts.index) | set(synth_counts.index) - real_probs = real_counts.reindex(all_categories, fill_value=0) - synth_probs = synth_counts.reindex(all_categories, fill_value=0) - - # Compute Jensen-Shannon Divergence - js_div = jensenshannon(real_probs, synth_probs) - js_divergences.append(js_div) - - return { - "JSD_mean": np.mean(js_divergences) if js_divergences else None, - "Wasserstein_mean": np.mean(wasserstein_distances) if wasserstein_distances else None - } - -def evaluate_ml_efficacy(X_real, y_real): - """ - Calculate Machine Learning efficacy metrics (Accuracy, F1, R2) on real data. - - Depending on whether the target variable is categorical or numerical, different models are used. - - Args: - X_real (pd.DataFrame): Feature data from the real dataset. - y_real (pd.Series): Target labels from the real dataset. - - Returns: - dict: Dictionary containing performance metrics for each model. 
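Editorial aside on the per-column similarity metrics above: Wasserstein distance for numeric columns and Jensen-Shannon divergence for categorical columns (after aligning the category supports) can be sketched like this, using illustrative toy data:

```python
import numpy as np
import pandas as pd
from scipy.stats import wasserstein_distance
from scipy.spatial.distance import jensenshannon

real = pd.DataFrame({'age': [23, 35, 41, 29, 52], 'colour': ['r', 'g', 'r', 'b', 'r']})
synth = pd.DataFrame({'age': [25, 33, 44, 30, 50], 'colour': ['r', 'r', 'g', 'b', 'g']})

# Numerical column: Wasserstein (earth mover's) distance between empirical distributions
wd = wasserstein_distance(real['age'], synth['age'])

# Categorical column: align category supports, then Jensen-Shannon divergence
p = real['colour'].value_counts(normalize=True)
q = synth['colour'].value_counts(normalize=True)
cats = sorted(set(p.index) | set(q.index))
jsd = jensenshannon(p.reindex(cats, fill_value=0), q.reindex(cats, fill_value=0))
```

The reindex step matters: without it, a category present in only one of the two frames silently misaligns the probability vectors.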
- """ - # Split the data into training and testing sets - X_train, X_test, y_train, y_test = train_test_split(X_real, y_real, test_size=0.2, random_state=42) - - # Identify numerical and categorical features for preprocessing - numeric_features = X_real.select_dtypes(include=['int64', 'float64']).columns - categorical_features = X_real.select_dtypes(include=['object', 'category']).columns - - # Define a preprocessing pipeline for numerical and categorical data - preprocessor = ColumnTransformer( - transformers=[ - ('num', StandardScaler(), numeric_features), - ('cat', OneHotEncoder(handle_unknown='ignore', sparse=False), categorical_features) - ]) - - # Check if the target variable is categorical - if y_real.dtype == 'object' or y_real.dtype.name == 'category': - # Define classifiers for categorical targets - classifiers = { - "LogisticRegression": LogisticRegression(max_iter=1000), - "RandomForest": RandomForestClassifier(), - "MLP": MLPClassifier(max_iter=1000), - "XGBoost": XGBClassifier(use_label_encoder=False, eval_metric='logloss') - } - - results = {} - for name, clf in classifiers.items(): - # Create a pipeline with preprocessing and the classifier - pipeline = Pipeline([ - ('preprocessor', preprocessor), - ('classifier', clf) - ]) - - # Train the classifier - pipeline.fit(X_train, y_train) - # Predict on the test set - y_pred = pipeline.predict(X_test) - - # Calculate performance metrics - accuracy = accuracy_score(y_test, y_pred) - f1 = f1_score(y_test, y_pred, average='weighted') - - results[name] = { - "Accuracy": accuracy, - "F1": f1 - } - - else: - # Define regressors for numerical targets - regressors = { - "LinearRegression": LinearRegression(), - "RandomForest": RandomForestRegressor(), - "MLP": MLPRegressor(max_iter=1000), - "XGBoost": XGBRegressor() - } - - results = {} - for name, reg in regressors.items(): - # Create a pipeline with preprocessing and the regressor - pipeline = Pipeline([ - ('preprocessor', preprocessor), - ('regressor', reg) - 
]) - - # Train the regressor - pipeline.fit(X_train, y_train) - # Predict on the test set - y_pred = pipeline.predict(X_test) - - # Calculate R-squared score - r2 = r2_score(y_test, y_pred) - - results[name] = { - "R2": r2 - } - - return results - -def evaluate_tstr(X_real, y_real, X_synthetic, y_synthetic): - """ - Evaluate Machine Learning utility using the TSTR (Train on Synthetic, Test on Real) approach. - - Args: - X_real (pd.DataFrame): Feature data from the real dataset. - y_real (pd.Series): Target labels from the real dataset. - X_synthetic (pd.DataFrame): Feature data from the synthetic dataset. - y_synthetic (pd.Series): Target labels from the synthetic dataset. - - Returns: - dict: Dictionary containing performance metrics for each model. - """ - # Identify numerical and categorical features for preprocessing - numeric_features = X_real.select_dtypes(include=['int64', 'float64']).columns - categorical_features = X_real.select_dtypes(include=['object', 'category']).columns - - # Define a preprocessing pipeline for numerical and categorical data - preprocessor = ColumnTransformer( - transformers=[ - ('num', StandardScaler(), numeric_features), - ('cat', OneHotEncoder(handle_unknown='ignore', sparse=False), categorical_features) - ]) - - # Check if the target variable is categorical - if y_real.dtype == 'object' or y_real.dtype.name == 'category': - # Define classifiers for categorical targets - classifiers = { - "LogisticRegression": LogisticRegression(max_iter=1000), - "RandomForest": RandomForestClassifier(), - "MLP": MLPClassifier(max_iter=1000), - "XGBoost": XGBClassifier(use_label_encoder=False, eval_metric='logloss') - } - - results = {} - for name, clf in classifiers.items(): - # Create a pipeline with preprocessing and the classifier - pipeline = Pipeline([ - ('preprocessor', preprocessor), - ('classifier', clf) - ]) - - # Train the classifier on synthetic data - pipeline.fit(X_synthetic, y_synthetic) - # Predict on the real test set - y_pred = 
pipeline.predict(X_real) - - # Calculate performance metrics - accuracy = accuracy_score(y_real, y_pred) - f1 = f1_score(y_real, y_pred, average='weighted') - - results[name] = { - "Accuracy": accuracy, - "F1": f1 - } - - else: - # Define regressors for numerical targets - regressors = { - "LinearRegression": LinearRegression(), - "RandomForest": RandomForestRegressor(), - "MLP": MLPRegressor(max_iter=1000), - "XGBoost": XGBRegressor() - } - - results = {} - for name, reg in regressors.items(): - # Create a pipeline with preprocessing and the regressor - pipeline = Pipeline([ - ('preprocessor', preprocessor), - ('regressor', reg) - ]) - - # Train the regressor on synthetic data - pipeline.fit(X_synthetic, y_synthetic) - # Predict on the real test set - y_pred = pipeline.predict(X_real) - - # Calculate R-squared score - r2 = r2_score(y_real, y_pred) - - results[name] = { - "R2": r2 - } - - return results - -def evaluate_ctgan(real_data, synthetic_data): - """ - Evaluate the CTGAN model using various metrics including likelihood fitness, - statistical similarity, machine learning efficacy, and TSTR performance. - - Args: - real_data (pd.DataFrame): The original real dataset, including the target column named 'Category'. - synthetic_data (pd.DataFrame): The synthetic dataset generated by CTGAN, including the target column named 'Category'. - - Returns: - dict: Dictionary containing all evaluation metrics. 
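Editorial aside on TSTR (Train on Synthetic, Test on Real), the protocol implemented above: the essential move is that the model never sees real data during fitting, so real-data accuracy measures how much task-relevant signal the generator preserved. A minimal sketch with a toy label rule (all names and data here are illustrative):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

def make_data(n):
    X = rng.normal(size=(n, 2))
    y = (X[:, 0] + X[:, 1] > 0).astype(int)  # simple linear label rule
    return X, y

X_synth, y_synth = make_data(400)   # pretend this came from the generator
X_real, y_real = make_data(200)     # held-out real data

# Train on Synthetic, Test on Real
clf = LogisticRegression(max_iter=1000).fit(X_synth, y_synth)
acc = accuracy_score(y_real, clf.predict(X_real))
```

Because the toy generator preserves the label rule exactly, `acc` stays close to 1; a degraded generator would show up as a drop relative to the train-on-real baseline.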
- """ - # Separate features and target variable from real data - X_real = real_data.drop('Category', axis=1) - y_real = real_data['Category'] - - # Separate features and target variable from synthetic data - X_synthetic = synthetic_data.drop('Category', axis=1) - y_synthetic = synthetic_data['Category'] - - # Handle any missing values in the synthetic target variable by imputing the mode - y_synthetic = y_synthetic.fillna(y_synthetic.mode().iloc[0]) - - results = {} - - # Evaluate likelihood fitness metrics - try: - results["likelihood_fitness"] = evaluate_likelihood_fitness(X_real, X_synthetic) - except Exception as e: - print(f"Error in likelihood_fitness: {str(e)}") - results["likelihood_fitness"] = None - - # Evaluate statistical similarity metrics - try: - results["statistical_similarity"] = evaluate_statistical_similarity(X_real, X_synthetic) - except Exception as e: - print(f"Error in statistical_similarity: {str(e)}") - results["statistical_similarity"] = None - - # Evaluate machine learning efficacy on real data - try: - results["ml_efficacy"] = evaluate_ml_efficacy(X_real, y_real) - except Exception as e: - print(f"Error in ml_efficacy: {str(e)}") - results["ml_efficacy"] = None - - # Evaluate TSTR performance (train on synthetic, test on real) - try: - results["tstr_performance"] = evaluate_tstr(X_real, y_real, X_synthetic, y_synthetic) - except Exception as e: - print(f"Error in tstr_performance: {str(e)}") - results["tstr_performance"] = None - - return results - -def print_evaluation_results(results): - """ - Print the evaluation results in a structured and readable format using logging. - - Args: - results (dict): Dictionary containing all evaluation metrics. 
- """ - logging.info("CTGAN Evaluation Results:") - - # Print Likelihood Fitness Metrics - if results.get("likelihood_fitness") is not None: - logging.info("\nLikelihood Fitness Metrics:") - logging.info(f" - Lsyn (Synthetic Data Log-Likelihood): {results['likelihood_fitness']['Lsyn']:.4f}") - logging.info(f" - Ltest (Real Data Log-Likelihood under Synthetic Model): {results['likelihood_fitness']['Ltest']:.4f}") - else: - logging.info("Likelihood Fitness: Error occurred during calculation") - - # Print Statistical Similarity Metrics - if results.get("statistical_similarity") is not None: - logging.info("\nStatistical Similarity Metrics:") - if results['statistical_similarity']['JSD_mean'] is not None: - logging.info(f" - Jensen-Shannon Divergence Mean (Categorical): {results['statistical_similarity']['JSD_mean']:.4f}") - if results['statistical_similarity']['Wasserstein_mean'] is not None: - logging.info(f" - Wasserstein Distance Mean (Numerical): {results['statistical_similarity']['Wasserstein_mean']:.4f}") - else: - logging.info("\nStatistical Similarity: Error occurred during calculation") - - # Print Machine Learning Efficacy Metrics - if results.get("ml_efficacy") is not None: - logging.info("\nMachine Learning Efficacy Metrics on Real Data:") - for model_name, metrics in results['ml_efficacy'].items(): - logging.info(f" {model_name}:") - for metric_name, value in metrics.items(): - logging.info(f" - {metric_name}: {value:.4f}") - else: - logging.info("\nMachine Learning Efficacy: Error occurred during calculation") - - # Print TSTR Performance Metrics - if results.get("tstr_performance") is not None: - logging.info("\nMachine Learning Utility using TSTR Approach:") - for model_name, metrics in results['tstr_performance'].items(): - logging.info(f" {model_name}:") - for metric_name, value in metrics.items(): - logging.info(f" - {metric_name}: {value:.4f}") - else: - logging.info("\nTSTR Performance: Error occurred during calculation") diff --git 
a/katabatic/models/ctgan/data_sampler.py b/katabatic/models/ctgan/data_sampler.py
deleted file mode 100644
index d9b67d1..0000000
--- a/katabatic/models/ctgan/data_sampler.py
+++ /dev/null
@@ -1,153 +0,0 @@
-"""DataSampler module."""
-
-import numpy as np
-
-
-class DataSampler(object):
-    """DataSampler samples the conditional vector and corresponding data for CTGAN."""
-
-    def __init__(self, data, output_info, log_frequency):
-        self._data_length = len(data)
-
-        def is_discrete_column(column_info):
-            return len(column_info) == 1 and column_info[0].activation_fn == 'softmax'
-
-        n_discrete_columns = sum([
-            1 for column_info in output_info if is_discrete_column(column_info)
-        ])
-
-        self._discrete_column_matrix_st = np.zeros(n_discrete_columns, dtype='int32')
-
-        # Store the row id for each category in each discrete column.
-        # For example _rid_by_cat_cols[a][b] is a list of all rows with the
-        # a-th discrete column equal to value b.
-        self._rid_by_cat_cols = []
-
-        # Compute _rid_by_cat_cols
-        st = 0
-        for column_info in output_info:
-            if is_discrete_column(column_info):
-                span_info = column_info[0]
-                ed = st + span_info.dim
-
-                rid_by_cat = []
-                for j in range(span_info.dim):
-                    rid_by_cat.append(np.nonzero(data[:, st + j])[0])
-                self._rid_by_cat_cols.append(rid_by_cat)
-                st = ed
-            else:
-                st += sum([span_info.dim for span_info in column_info])
-        assert st == data.shape[1]
-
-        # Prepare an interval matrix for efficiently sampling the conditional vector
-        max_category = max(
-            [column_info[0].dim for column_info in output_info if is_discrete_column(column_info)],
-            default=0,
-        )
-
-        self._discrete_column_cond_st = np.zeros(n_discrete_columns, dtype='int32')
-        self._discrete_column_n_category = np.zeros(n_discrete_columns, dtype='int32')
-        self._discrete_column_category_prob = np.zeros((n_discrete_columns, max_category))
-        self._n_discrete_columns = n_discrete_columns
-        self._n_categories = sum([
-            column_info[0].dim for column_info in output_info if
is_discrete_column(column_info) - ]) - - st = 0 - current_id = 0 - current_cond_st = 0 - for column_info in output_info: - if is_discrete_column(column_info): - span_info = column_info[0] - ed = st + span_info.dim - category_freq = np.sum(data[:, st:ed], axis=0) - if log_frequency: - category_freq = np.log(category_freq + 1) - category_prob = category_freq / np.sum(category_freq) - self._discrete_column_category_prob[current_id, : span_info.dim] = category_prob - self._discrete_column_cond_st[current_id] = current_cond_st - self._discrete_column_n_category[current_id] = span_info.dim - current_cond_st += span_info.dim - current_id += 1 - st = ed - else: - st += sum([span_info.dim for span_info in column_info]) - - def _random_choice_prob_index(self, discrete_column_id): - probs = self._discrete_column_category_prob[discrete_column_id] - r = np.expand_dims(np.random.rand(probs.shape[0]), axis=1) - return (probs.cumsum(axis=1) > r).argmax(axis=1) - - def sample_condvec(self, batch): - """Generate the conditional vector for training. - - Returns: - cond (batch x #categories): - The conditional vector. - mask (batch x #discrete columns): - A one-hot vector indicating the selected discrete column. - discrete column id (batch): - Integer representation of mask. - category_id_in_col (batch): - Selected category in the selected discrete column. 
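Editorial aside on `sample_condvec` above — CTGAN's "training-by-sampling": pick a discrete column uniformly, pick a category within it by (log-)frequency, and set one-hot `cond` and `mask` vectors. A self-contained NumPy sketch with made-up column layout (two discrete columns with 3 and 2 categories; all names illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
n_cols = 2                                   # number of discrete columns
n_cats = [3, 2]                              # categories per column
cond_st = np.cumsum([0] + n_cats[:-1])       # offset of each column in the cond vector
probs = [np.array([0.5, 0.3, 0.2]),          # per-column category probabilities
         np.array([0.6, 0.4])]

batch = 4
col_id = rng.integers(0, n_cols, size=batch)                       # uniform column choice
cat_in_col = np.array([rng.choice(n_cats[c], p=probs[c]) for c in col_id])

cond = np.zeros((batch, sum(n_cats)), dtype='float32')             # batch x total categories
mask = np.zeros((batch, n_cols), dtype='float32')                  # batch x discrete columns
mask[np.arange(batch), col_id] = 1
cond[np.arange(batch), cond_st[col_id] + cat_in_col] = 1           # one-hot per row
```

Each row of `cond` has exactly one 1 (the chosen category, at its column's offset), and `mask` records which column was conditioned on.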
-        """
-        if self._n_discrete_columns == 0:
-            return None
-
-        discrete_column_id = np.random.choice(np.arange(self._n_discrete_columns), batch)
-
-        cond = np.zeros((batch, self._n_categories), dtype='float32')
-        mask = np.zeros((batch, self._n_discrete_columns), dtype='float32')
-        mask[np.arange(batch), discrete_column_id] = 1
-        category_id_in_col = self._random_choice_prob_index(discrete_column_id)
-        category_id = self._discrete_column_cond_st[discrete_column_id] + category_id_in_col
-        cond[np.arange(batch), category_id] = 1
-
-        return cond, mask, discrete_column_id, category_id_in_col
-
-    def sample_original_condvec(self, batch):
-        """Generate the conditional vector for generation, using the original category frequencies."""
-        if self._n_discrete_columns == 0:
-            return None
-
-        category_freq = self._discrete_column_category_prob.flatten()
-        category_freq = category_freq[category_freq != 0]
-        category_freq = category_freq / np.sum(category_freq)
-        col_idxs = np.random.choice(np.arange(len(category_freq)), batch, p=category_freq)
-        cond = np.zeros((batch, self._n_categories), dtype='float32')
-        cond[np.arange(batch), col_idxs] = 1
-
-        return cond
-
-    def sample_data(self, data, n, col, opt):
-        """Sample rows from the original training data satisfying the sampled conditional vector.
-
-        Args:
-            data:
-                The training data.
-            n:
-                Number of rows to sample.
-            col:
-                Indices of the selected discrete columns, or None for unconditional sampling.
-            opt:
-                Indices of the selected categories within those columns.
-
-        Returns:
-            n rows of matrix data.
- """ - if col is None: - idx = np.random.randint(len(data), size=n) - return data[idx] - - idx = [] - for c, o in zip(col, opt): - idx.append(np.random.choice(self._rid_by_cat_cols[c][o])) - - return data[idx] - - def dim_cond_vec(self): - """Return the total number of categories.""" - return self._n_categories - - def generate_cond_from_condition_column_info(self, condition_info, batch): - """Generate the condition vector.""" - vec = np.zeros((batch, self._n_categories), dtype='float32') - id_ = self._discrete_column_matrix_st[condition_info['discrete_column_id']] - id_ += condition_info['value_id'] - vec[:, id_] = 1 - return vec diff --git a/katabatic/models/ctgan/data_transformer.py b/katabatic/models/ctgan/data_transformer.py deleted file mode 100644 index 7ed8b0e..0000000 --- a/katabatic/models/ctgan/data_transformer.py +++ /dev/null @@ -1,265 +0,0 @@ -"""DataTransformer module.""" - -from collections import namedtuple - -import numpy as np -import pandas as pd -from joblib import Parallel, delayed -from rdt.transformers import ClusterBasedNormalizer, OneHotEncoder - -SpanInfo = namedtuple('SpanInfo', ['dim', 'activation_fn']) -ColumnTransformInfo = namedtuple( - 'ColumnTransformInfo', - ['column_name', 'column_type', 'transform', 'output_info', 'output_dimensions'], -) - - -class DataTransformer(object): - """Data Transformer. - - Model continuous columns with a BayesianGMM and normalize them to a scalar between [-1, 1] - and a vector. Discrete columns are encoded using a OneHotEncoder. - """ - - def __init__(self, max_clusters=10, weight_threshold=0.005): - """Create a data transformer. - - Args: - max_clusters (int): - Maximum number of Gaussian distributions in Bayesian GMM. - weight_threshold (float): - Weight threshold for a Gaussian distribution to be kept. - """ - self._max_clusters = max_clusters - self._weight_threshold = weight_threshold - - def _fit_continuous(self, data): - """Train Bayesian GMM for continuous columns. 
- - Args: - data (pd.DataFrame): - A dataframe containing a column. - - Returns: - namedtuple: - A ``ColumnTransformInfo`` object. - """ - column_name = data.columns[0] - gm = ClusterBasedNormalizer( - missing_value_generation='from_column', - max_clusters=min(len(data), self._max_clusters), - weight_threshold=self._weight_threshold, - ) - gm.fit(data, column_name) - num_components = sum(gm.valid_component_indicator) - - return ColumnTransformInfo( - column_name=column_name, - column_type='continuous', - transform=gm, - output_info=[SpanInfo(1, 'tanh'), SpanInfo(num_components, 'softmax')], - output_dimensions=1 + num_components, - ) - - def _fit_discrete(self, data): - """Fit one hot encoder for discrete column. - - Args: - data (pd.DataFrame): - A dataframe containing a column. - - Returns: - namedtuple: - A ``ColumnTransformInfo`` object. - """ - column_name = data.columns[0] - ohe = OneHotEncoder() - ohe.fit(data, column_name) - num_categories = len(ohe.dummies) - - return ColumnTransformInfo( - column_name=column_name, - column_type='discrete', - transform=ohe, - output_info=[SpanInfo(num_categories, 'softmax')], - output_dimensions=num_categories, - ) - - def fit(self, raw_data, discrete_columns=()): - """Fit the ``DataTransformer``. - - Fits a ``ClusterBasedNormalizer`` for continuous columns and a - ``OneHotEncoder`` for discrete columns. - - This step also counts the #columns in matrix data and span information. 
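Editorial aside on `_fit_continuous` above: the `ClusterBasedNormalizer` implements CTGAN's mode-specific normalization — fit a variational Bayesian GMM per continuous column, assign each value to a mode, and normalize it within that mode. The idea can be sketched with scikit-learn's `BayesianGaussianMixture` (this is a conceptual sketch, not the RDT API; the `4 * sigma` scaling follows the CTGAN paper's convention):

```python
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

rng = np.random.default_rng(0)
# A bimodal column with modes near -3 and +3
x = np.concatenate([rng.normal(-3, 0.5, 300), rng.normal(3, 0.5, 300)]).reshape(-1, 1)

bgm = BayesianGaussianMixture(n_components=10, weight_concentration_prior=1e-3,
                              max_iter=200, random_state=42)
modes = bgm.fit_predict(x)  # mode assignment per value (the one-hot "component" part)

# Normalize each value within its assigned mode: alpha = (x - mu_k) / (4 * sigma_k)
mu = bgm.means_[modes, 0]
sigma = np.sqrt(bgm.covariances_[modes, 0, 0])
alpha = ((x[:, 0] - mu) / (4 * sigma)).clip(-1, 1)  # the scalar "normalized" part
```

The transformer's output for such a column is exactly this pair: the scalar `alpha` (tanh activation) plus the one-hot mode indicator (softmax activation), which matches the `SpanInfo(1, 'tanh')` / `SpanInfo(num_components, 'softmax')` layout above.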
-        """
-        self.output_info_list = []
-        self.output_dimensions = 0
-        self.dataframe = True
-
-        if not isinstance(raw_data, pd.DataFrame):
-            self.dataframe = False
-            # work around for RDT issue #328 Fitting with numerical column names fails
-            discrete_columns = [str(column) for column in discrete_columns]
-            column_names = [str(num) for num in range(raw_data.shape[1])]
-            raw_data = pd.DataFrame(raw_data, columns=column_names)
-
-        self._column_raw_dtypes = raw_data.infer_objects().dtypes
-        self._column_transform_info_list = []
-        for column_name in raw_data.columns:
-            if column_name in discrete_columns:
-                column_transform_info = self._fit_discrete(raw_data[[column_name]])
-            else:
-                column_transform_info = self._fit_continuous(raw_data[[column_name]])
-
-            self.output_info_list.append(column_transform_info.output_info)
-            self.output_dimensions += column_transform_info.output_dimensions
-            self._column_transform_info_list.append(column_transform_info)
-
-    def _transform_continuous(self, column_transform_info, data):
-        column_name = data.columns[0]
-        flattened_column = data[column_name].to_numpy().flatten()
-        data = data.assign(**{column_name: flattened_column})
-        gm = column_transform_info.transform
-        transformed = gm.transform(data)
-
-        # Converts the transformed data to the appropriate output format.
-        # The first column (ending in '.normalized') stays the same,
-        # but the label encoded column (ending in '.component') is one hot encoded.
- output = np.zeros((len(transformed), column_transform_info.output_dimensions)) - output[:, 0] = transformed[f'{column_name}.normalized'].to_numpy() - index = transformed[f'{column_name}.component'].to_numpy().astype(int) - output[np.arange(index.size), index + 1] = 1.0 - - return output - - def _transform_discrete(self, column_transform_info, data): - ohe = column_transform_info.transform - return ohe.transform(data).to_numpy() - - def _synchronous_transform(self, raw_data, column_transform_info_list): - """Take a Pandas DataFrame and transform columns synchronous. - - Outputs a list with Numpy arrays. - """ - column_data_list = [] - for column_transform_info in column_transform_info_list: - column_name = column_transform_info.column_name - data = raw_data[[column_name]] - if column_transform_info.column_type == 'continuous': - column_data_list.append(self._transform_continuous(column_transform_info, data)) - else: - column_data_list.append(self._transform_discrete(column_transform_info, data)) - - return column_data_list - - def _parallel_transform(self, raw_data, column_transform_info_list): - """Take a Pandas DataFrame and transform columns in parallel. - - Outputs a list with Numpy arrays. - """ - processes = [] - for column_transform_info in column_transform_info_list: - column_name = column_transform_info.column_name - data = raw_data[[column_name]] - process = None - if column_transform_info.column_type == 'continuous': - process = delayed(self._transform_continuous)(column_transform_info, data) - else: - process = delayed(self._transform_discrete)(column_transform_info, data) - processes.append(process) - - return Parallel(n_jobs=-1)(processes) - - def transform(self, raw_data): - """Take raw data and output a matrix data.""" - if not isinstance(raw_data, pd.DataFrame): - column_names = [str(num) for num in range(raw_data.shape[1])] - raw_data = pd.DataFrame(raw_data, columns=column_names) - - # Only use parallelization with larger data sizes. 
- # Otherwise, the transformation will be slower. - if raw_data.shape[0] < 500: - column_data_list = self._synchronous_transform( - raw_data, self._column_transform_info_list - ) - else: - column_data_list = self._parallel_transform(raw_data, self._column_transform_info_list) - - return np.concatenate(column_data_list, axis=1).astype(float) - - def _inverse_transform_continuous(self, column_transform_info, column_data, sigmas, st): - gm = column_transform_info.transform - data = pd.DataFrame(column_data[:, :2], columns=list(gm.get_output_sdtypes())).astype(float) - data[data.columns[1]] = np.argmax(column_data[:, 1:], axis=1) - if sigmas is not None: - selected_normalized_value = np.random.normal(data.iloc[:, 0], sigmas[st]) - data.iloc[:, 0] = selected_normalized_value - - return gm.reverse_transform(data) - - def _inverse_transform_discrete(self, column_transform_info, column_data): - ohe = column_transform_info.transform - data = pd.DataFrame(column_data, columns=list(ohe.get_output_sdtypes())) - return ohe.reverse_transform(data)[column_transform_info.column_name] - - def inverse_transform(self, data, sigmas=None): - """Take matrix data and output raw data. - - Output uses the same type as input to the transform function. - Either np array or pd dataframe. 
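Editorial aside on the inverse transform above: for a discrete column, undoing the encoding is an argmax over that column's one-hot (softmax) block, mapped back to category labels. A minimal sketch with made-up categories:

```python
import numpy as np

categories = np.array(['low', 'medium', 'high'])
# Softmax-style block for a 3-category discrete column
# (generated rows need not be exact one-hots, so argmax is used)
block = np.array([
    [0.9, 0.05, 0.05],
    [0.1, 0.2, 0.7],
    [0.2, 0.6, 0.2],
])
recovered = categories[np.argmax(block, axis=1)]
print(recovered.tolist())  # ['low', 'high', 'medium']
```

Continuous columns are handled analogously: the argmax selects the mode, and the scalar part is de-normalized through that mode's Gaussian parameters.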
- """ - st = 0 - recovered_column_data_list = [] - column_names = [] - for column_transform_info in self._column_transform_info_list: - dim = column_transform_info.output_dimensions - column_data = data[:, st : st + dim] - if column_transform_info.column_type == 'continuous': - recovered_column_data = self._inverse_transform_continuous( - column_transform_info, column_data, sigmas, st - ) - else: - recovered_column_data = self._inverse_transform_discrete( - column_transform_info, column_data - ) - - recovered_column_data_list.append(recovered_column_data) - column_names.append(column_transform_info.column_name) - st += dim - - recovered_data = np.column_stack(recovered_column_data_list) - recovered_data = pd.DataFrame(recovered_data, columns=column_names).astype( - self._column_raw_dtypes - ) - if not self.dataframe: - recovered_data = recovered_data.to_numpy() - - return recovered_data - - def convert_column_name_value_to_id(self, column_name, value): - """Get the ids of the given `column_name`.""" - discrete_counter = 0 - column_id = 0 - for column_transform_info in self._column_transform_info_list: - if column_transform_info.column_name == column_name: - break - if column_transform_info.column_type == 'discrete': - discrete_counter += 1 - - column_id += 1 - - else: - raise ValueError(f"The column_name `{column_name}` doesn't exist in the data.") - - ohe = column_transform_info.transform - data = pd.DataFrame([value], columns=[column_transform_info.column_name]) - one_hot = ohe.transform(data).to_numpy()[0] - if sum(one_hot) == 0: - raise ValueError(f"The value `{value}` doesn't exist in the column `{column_name}`.") - - return { - 'discrete_column_id': discrete_counter, - 'column_id': column_id, - 'value_id': np.argmax(one_hot), - } diff --git a/katabatic/models/ctgan/requirements.txt b/katabatic/models/ctgan/requirements.txt deleted file mode 100644 index 07b74a9..0000000 --- a/katabatic/models/ctgan/requirements.txt +++ /dev/null @@ -1,8 +0,0 @@ -pandas>=1.3.0 
-numpy>=1.19.0 -scikit-learn>=0.24.0 -matplotlib>=3.4.0 -seaborn>=0.11.0 -torch>=1.9.0 -xgboost>=1.4.0 -tqdm>=4.62.0 diff --git a/katabatic/models/ctgan/run_ctgan.py b/katabatic/models/ctgan/run_ctgan.py deleted file mode 100644 index cf33eeb..0000000 --- a/katabatic/models/ctgan/run_ctgan.py +++ /dev/null @@ -1,230 +0,0 @@ -import os -import sys -import json -import pandas as pd -import numpy as np -from sklearn.model_selection import train_test_split -from sklearn.datasets import load_iris, load_breast_cancer, load_wine, load_digits -import matplotlib.pyplot as plt -import seaborn as sns -from tqdm import tqdm -import logging - -# Add the project root directory to the Python path to allow module imports -project_root = os.path.abspath(os.path.dirname(os.path.dirname(__file__))) -sys.path.insert(0, project_root) - -# Import custom CTGAN adapter and evaluation functions from the katabatic package -from katabatic.models.ctgan.ctgan_adapter import CtganAdapter -from katabatic.models.ctgan.ctgan_benchmark import evaluate_ctgan, print_evaluation_results - -# Configure logging to display information-level messages with timestamps -logging.basicConfig(level=logging.INFO, format='%(asctime)s - %(levelname)s - %(message)s') - -def load_config(config_path="katabatic/models/ctgan/config.json"): - """ - Load the configuration settings from a JSON file. - - Args: - config_path (str): Path to the configuration JSON file. - - Returns: - dict: Configuration settings. - """ - with open(config_path, "r") as f: - return json.load(f) - -def load_data(): - """ - Load and preprocess multiple datasets for experimentation. - - Returns: - dict: A dictionary containing processed DataFrames for each dataset. 
- """ - # Load predefined datasets from scikit-learn - datasets = { - "iris": load_iris(), - "breast_cancer": load_breast_cancer(), - "wine": load_wine(), - "digits": load_digits() - } - - processed_datasets = {} - - # Iterate over each dataset to preprocess - for name, dataset in datasets.items(): - # Create a DataFrame for feature data - X = pd.DataFrame(dataset.data, columns=dataset.feature_names) - # Create a Series for target labels - y = pd.Series(dataset.target, name="Category") - - # Create a categorical feature by binning the first numerical feature into three categories - X["categorical_feature"] = pd.cut(X.iloc[:, 0], bins=3, labels=["low", "medium", "high"]) - - # Combine features and target into a single DataFrame - data = pd.concat([X, y], axis=1) - - # Store the processed DataFrame in the dictionary - processed_datasets[name] = data - - return processed_datasets - -def visualize_comparison(real_data, synthetic_data, dataset_name, config): - """ - Generate and save visual comparisons between real and synthetic data distributions. - - Args: - real_data (pd.DataFrame): The real dataset. - synthetic_data (pd.DataFrame): The synthetic dataset generated by CTGAN. - dataset_name (str): Name of the dataset being processed. - config (dict): Configuration settings containing visualization parameters. 
- """ - # Determine the number of features to visualize, ensuring it doesn't exceed available features - n_features = min(config["visualization"]["n_features"], len(real_data.columns) - 1) - features = real_data.columns[:n_features] - - # Set up subplots for distribution and relationship plots - fig, axes = plt.subplots(n_features, 2, figsize=tuple(config["visualization"]["figsize"])) - fig.suptitle(f"Real vs Synthetic Data Distribution - {dataset_name}") - - # Iterate over each feature to create plots - for i, feature in enumerate(features): - try: - if real_data[feature].dtype == 'object' or real_data[feature].dtype.name == 'category': - # For categorical features, plot normalized value counts as bar charts - real_counts = real_data[feature].value_counts(normalize=True) - synth_counts = synthetic_data[feature].value_counts(normalize=True) - pd.concat([real_counts, synth_counts], axis=1, keys=['Real', 'Synthetic']).plot( - kind='bar', ax=axes[i, 0] - ) - else: - # For numerical features, plot overlapping histograms with KDE - sns.histplot(real_data[feature], kde=True, ax=axes[i, 0], color='blue', label='Real', stat="density") - sns.histplot(synthetic_data[feature], kde=True, ax=axes[i, 0], color='red', label='Synthetic', stat="density") - axes[i, 0].set_title(f"{feature} Distribution") - axes[i, 0].legend() - except Exception as e: - logging.error(f"Error plotting histogram for {feature}: {str(e)}") - - try: - if real_data[feature].dtype == 'object' or real_data[feature].dtype.name == 'category': - # For categorical features, plot boxen plots against the target variable - sns.boxenplot(x=feature, y=real_data.columns[-1], data=real_data, ax=axes[i, 1], color='blue') - sns.boxenplot(x=feature, y=real_data.columns[-1], data=synthetic_data, ax=axes[i, 1], color='red') - else: - # For numerical features, plot scatter plots against the target variable - sns.scatterplot(data=real_data, x=feature, y=real_data.columns[-1], ax=axes[i, 1], color='blue', label='Real') - 
sns.scatterplot(data=synthetic_data, x=feature, y=synthetic_data.columns[-1], ax=axes[i, 1], color='red', label='Synthetic') - axes[i, 1].set_title(f"{feature} vs Category") - # Consolidate legend to avoid duplicate entries - handles, labels = axes[i, 1].get_legend_handles_labels() - by_label = dict(zip(labels, handles)) - axes[i, 1].legend(by_label.values(), by_label.keys()) - except Exception as e: - logging.error(f"Error plotting scatter for {feature}: {str(e)}") - - # Adjust layout to prevent overlap and save the figure - plt.tight_layout() - plt.savefig(f"{dataset_name}_comparison.png") - plt.close() - -def run_experiment(dataset_name, X, y, config): - """ - Conduct the CTGAN training, synthetic data generation, evaluation, and visualization for a given dataset. - - Args: - dataset_name (str): Name of the dataset being processed. - X (pd.DataFrame): Feature data. - y (pd.Series): Target labels. - config (dict): Configuration settings. - - Returns: - dict: Results of the experiment, including evaluation metrics and synthetic data. 
- """ - print(f"\nProcessing {dataset_name} dataset") - - # Split the data into training and testing sets based on configuration - X_train, X_test, y_train, y_test = train_test_split( - X, y, - test_size=config["evaluation"]["test_size"], - random_state=config["evaluation"]["random_state"] - ) - - # Initialize the CTGAN adapter with specified parameters - ctgan = CtganAdapter(**config["ctgan_params"]) - try: - # Train the CTGAN model on the training data - ctgan.fit(X_train, y_train) - except Exception as e: - print(f"Error fitting CTGAN model for {dataset_name}: {str(e)}") - return {"dataset": dataset_name, "error": str(e)} - - try: - # Generate synthetic data matching the size of the training set - synthetic_data = ctgan.generate(X_train.shape[0]) - except Exception as e: - print(f"Error generating synthetic data for {dataset_name}: {str(e)}") - return {"dataset": dataset_name, "error": str(e)} - - # Combine features and target labels for the real test set - real_data = pd.concat([X_test, y_test], axis=1) - try: - # Evaluate the CTGAN-generated synthetic data against the real data - evaluation_results = evaluate_ctgan(real_data, synthetic_data) - except Exception as e: - print(f"Error evaluating CTGAN for {dataset_name}: {str(e)}") - return {"dataset": dataset_name, "error": str(e)} - - # Generate and save visual comparisons between real and synthetic data - visualize_comparison(real_data, synthetic_data, dataset_name, config) - - # Display the evaluation results - print(f"\n{dataset_name} Dataset Results:") - print_evaluation_results(evaluation_results) - - return {"dataset": dataset_name, "evaluation_results": evaluation_results, "synthetic_data": synthetic_data} - -def main(): - """ - Main function to execute the CTGAN experiments on multiple datasets. 
- """ - # Load configuration settings - config = load_config() - print("Loaded configuration:", json.dumps(config, indent=2)) - - # Load and preprocess datasets - datasets = load_data() - - results = [] - # Iterate over each dataset to perform experiments - for dataset_name, data in datasets.items(): - X = data.drop("Category", axis=1) - y = data["Category"] - # Run the experiment and collect results - result = run_experiment(dataset_name, X, y, config) - results.append(result) - - if "evaluation_results" in result: - # Print raw and formatted evaluation results - print(f"\n{dataset_name} Dataset Results:") - print("Raw evaluation results:") - print(json.dumps(result["evaluation_results"], indent=2)) - print("\nFormatted evaluation results:") - print_evaluation_results(result["evaluation_results"]) - else: - # Print error messages if any step failed - print(f"\nError processing {dataset_name} dataset: {result.get('error', 'Unknown error')}") - - # Save synthetic data for each successful experiment - for result in results: - if "synthetic_data" in result: - output_file = f"{result['dataset']}_synthetic_data.csv" - result["synthetic_data"].to_csv(output_file, index=False) - print(f"\nSynthetic data for {result['dataset']} saved to {output_file}") - else: - print(f"\nFailed to generate synthetic data for {result['dataset']}") - - print("\nExperiment completed.") - -if __name__ == "__main__": - main() diff --git a/katabatic/models/ganblr/__init__.py b/katabatic/models/ganblr/__init__.py deleted file mode 100644 index 2999015..0000000 --- a/katabatic/models/ganblr/__init__.py +++ /dev/null @@ -1,10 +0,0 @@ -"""Top-level package for Ganblr.""" - -__author__ = """Tulip Lab""" -__email__ = 'jhzhou@tuliplab.academy' -__version__ = '0.1.0' - -from .kdb import KdbHighOrderFeatureEncoder -from .utils import get_demo_data - -__all__ = ['models', 'KdbHighOrderFeatureEncoder', 'get_demo_data'] \ No newline at end of file diff --git a/katabatic/models/ganblr/doc.md 
b/katabatic/models/ganblr/doc.md deleted file mode 100644 index f31e674..0000000 --- a/katabatic/models/ganblr/doc.md +++ /dev/null @@ -1,144 +0,0 @@ -GANBLR Documentation - AO -Overview -The GANBLR model combines a Bayesian network and a neural network, leveraging the k-Dependency Bayesian (kDB) algorithm for building a dependency graph and using various utility functions for data preparation and model constraints. -GANBLR Model Class -Defined in ganblr.py -Class Properties: -• _d: Placeholder for data. -• __gen_weights: Placeholder for generator weights. -• batch_size: Batch size for training. -• epochs: Number of epochs for training. -• k: Parameter for the model. -• constraints: Constraints for the model. -• _ordinal_encoder: Ordinal encoder for preprocessing data. -• _label_encoder: Label encoder for preprocessing data. -Class Methods: -__init__(): Initializes the model with default values. -fit(x, y, k=0, batch_size=32, epochs=10, warmup_epochs=1, verbose=0): Fits the model to the given data. -Parameters: -- x: Input data. -- y: Labels for the data. -- k: Parameter for the model (default: 0). -- batch_size: Size of batches for training (default: 32). -- epochs: Number of training epochs (default: 10). -- warmup_epochs: Number of warmup epochs (default: 1). -- verbose: Verbosity level (default: 0). -Returns the fitted model. - - - -_augment_cpd(d, size=None, verbose=0): Augments the Conditional Probability Distribution (CPD). -Parameters: -- d: Data. -- size: Size of the sample (default: None). -- verbose: Verbosity level (default: 0). -Returns the augmented data. - -_warmup_run(epochs, verbose=None): Runs a warmup phase. -Parameters: -- epochs: Number of epochs for warmup. -- verbose: Verbosity level (default: None). -Returns the history of the warmup run. - -_run_generator(loss): Runs the generator model. -Parameters: -loss: Loss function. -Returns the history of the generator run. -_discrim(): Creates a discriminator model. -Returns the discriminator model. 
-
-_sample(size=None, verbose=0): Generates synthetic data in ordinal encoding format.
-Parameters:
-- size: Size of the sample (default: None).
-- verbose: Verbosity level (default: 0).
-Returns the sampled data.
-
-kDB Algorithm
-The kDB algorithm constructs a dependency graph for the Bayesian network. It is implemented in the kdb.py file.
-Methods:
-build_graph(X, y, k=2): Constructs a k-dependency Bayesian network graph.
-Parameters:
-- X: Input data (features).
-- y: Labels.
-- k: Number of parent nodes (default: 2).
-Returns a list of graph edges.
-
-get_cross_table(*cols, apply_wt=False): Generates a cross table from input columns.
-Parameters:
-- cols: Columns for cross table generation.
-- apply_wt: Whether to apply weights (default: False).
-Returns a tuple containing the unique values for each column and the cross table as a NumPy array.
-
-_get_dependencies_without_y(variables, y_name, kdb_edges): Finds the dependencies of each variable without considering y.
-Parameters:
-- variables: List of variable names.
-- y_name: Class name.
-- kdb_edges: List of tuples representing edges (source, target).
-Returns a dictionary of dependencies.
-
-_add_uniform(X, weight=1.0): Adds a uniform distribution to the data.
-Parameters:
-- X: Input data, a NumPy array or pandas DataFrame.
-- weight: Weight for the uniform distribution (default: 1.0).
-Returns the modified data with the uniform distribution applied.
-
-_normalize_by_column(array): Normalizes the array by columns.
-Parameters:
-- array: Input array to normalize.
-Returns the normalized array.
-
-_smoothing(cct, d): Probability smoothing for kDB.
-Parameters:
-- cct: Cross count table with shape (x0, *parents).
-- d: Dimension of cct.
-Returns a smoothed joint probability table.
-
-get_high_order_feature(X, col, evidence_cols, feature_uniques): Encodes the high-order feature of X[col] given evidences X[evidence_cols].
-Parameters:
-- X: Input data.
--col: Column to encode. --evidence_cols: List of evidence columns. --feature_uniques: Unique values for features. -Returns an encoded high order feature. - -get_high_order_constraints(X, col, evidence_cols, feature_uniques): Gets high order constraints for the feature. -Parameters: --X: Input data. --col: Column to encode. --evidence_cols: List of evidence columns. --feature_uniques: Unique values for features. -Returns high order constraints. - -Classes: -KdbHighOrderFeatureEncoder: Encodes high-order features for the kDB model. -Class Properties: -- feature_uniques_: Unique values for features. -- dependencies_: Dependencies for features. -- ohe_: OneHotEncoder instance. -Class Methods: -- fit(X, y, k=0): Fits the encoder to the data. -- transform(X, return_constraints=False, use_ohe=True): Transforms the input data. -- fit_transform(X, y, k=0, return_constraints=False): Fits the encoder and transforms the data. - - - - -Utility Functions -The utility functions are implemented in the utils.py file and provide various support functions for data preparation and model constraints. -softmax_weight Class: Constrains weight tensors to be under softmax. -DataUtils Class: Provides data utilities for preparation before training. -Function: elr_loss(KL_LOSS): Defines a custom loss function. -Function: KL_loss(prob_fake): Calculates the KL loss. -Function: get_lr(input_dim, output_dim, constraint=None, KL_LOSS=0): Creates a logistic regression model. -Function: sample(*arrays, n=None, frac=None, random_state=None): Generates random samples from the given arrays. -Function: get_demo_data(name='adult'): Downloads a demo dataset from the internet. 
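The mixed-radix encoding behind get_high_order_feature, described above, can be illustrated in isolation. The sketch below maps each combination of evidence values and the target feature value to a single integer; the helper name `high_order_index` is hypothetical, and ordinal-encoded integer inputs are assumed:

```python
import numpy as np

def high_order_index(X, col, evidence_cols, feature_uniques):
    """Mixed-radix code over (evidence values..., X[:, col]).

    Mirrors the index arithmetic of get_high_order_feature: the target
    feature varies fastest, then the last evidence column, and so on.
    """
    # base per position: 1 for the target feature, then the cardinalities
    # of the faster-varying positions, so cumprod yields each position's stride
    base = [1, feature_uniques[col]] + [feature_uniques[c] for c in evidence_cols[::-1][:-1]]
    cum_base = np.cumprod(base)[::-1]
    cols = evidence_cols + [col]
    return np.sum(X[:, cols] * cum_base, axis=1)

# one row with values (e0=1, e1=2, x=3) and cardinalities (4, 5, 7):
X = np.array([[1, 2, 3]])
idx = high_order_index(X, col=2, evidence_cols=[0, 1], feature_uniques=[4, 5, 7])
# index = 1 * (7 * 5) + 2 * 7 + 3 = 52
```

Because the strides are cumulative products of the cardinalities, distinct value combinations always map to distinct indices, which is what lets the encoder treat each combination as one categorical level.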
diff --git a/katabatic/models/ganblr/ganblr.py b/katabatic/models/ganblr/ganblr.py
deleted file mode 100644
index 90a2454..0000000
--- a/katabatic/models/ganblr/ganblr.py
+++ /dev/null
@@ -1,247 +0,0 @@
-import pgmpy
-from .kdb import *
-from .kdb import _add_uniform
-from .utils import *
-from pgmpy.models import BayesianNetwork
-from pgmpy.sampling import BayesianModelSampling
-from pgmpy.factors.discrete import TabularCPD
-from sklearn.preprocessing import OrdinalEncoder, LabelEncoder
-import numpy as np
-import tensorflow as tf
-import warnings
-warnings.filterwarnings("ignore")
-
-class GANBLR:
-    """
-    The GANBLR Model.
-    """
-    def __init__(self) -> None:
-        self._d = None
-        self.__gen_weights = None
-        self.batch_size = None
-        self.epochs = None
-        self.k = None
-        self.constraints = None
-        self._ordinal_encoder = OrdinalEncoder(dtype=int, handle_unknown='use_encoded_value', unknown_value=-1)
-        self._label_encoder = LabelEncoder()
-
-    def fit(self, x, y, k=0, batch_size=32, epochs=10, warmup_epochs=1, verbose=0):
-        '''
-        Fit the model to the given data.
-
-        Parameters
-        ----------
-        x : array_like of shape (n_samples, n_features)
-            Dataset to fit the model. The data should be discrete.
-
-        y : array_like of shape (n_samples,)
-            Label of the dataset.
-
-        k : int, default=0
-            Parameter k of the GANBLR model. Must be greater than 0; no more than 2 is suggested.
-
-        batch_size : int, default=32
-            Size of the batch to feed the model at each step.
-
-        epochs : int, default=10
-            Number of epochs to use during training.
-
-        warmup_epochs : int, default=1
-            Number of epochs to use in the warmup phase.
-
-        verbose : int, default=0
-            Whether to output the log. Use 1 for log output and 0 for complete silence.
-
-        Returns
-        -------
-        self : object
-            Fitted model.
-        '''
-        if verbose is None or not isinstance(verbose, int):
-            verbose = 1
-        x = self._ordinal_encoder.fit_transform(x)
-        y = self._label_encoder.fit_transform(y).astype(int)
-        d = DataUtils(x, y)
-        self._d = d
-        self.k = k
-        self.batch_size = batch_size
-        if verbose:
-            print("warmup run:")
-        history = self._warmup_run(warmup_epochs, verbose=verbose)
-        syn_data = self._sample(verbose=0)
-        discriminator_label = np.hstack([np.ones(d.data_size), np.zeros(d.data_size)])
-        for i in range(epochs):
-            discriminator_input = np.vstack([x, syn_data[:,:-1]])
-            disc_input, disc_label = sample(discriminator_input, discriminator_label, frac=0.8)
-            disc = self._discrim()
-            d_history = disc.fit(disc_input, disc_label, batch_size=batch_size, epochs=1, verbose=0).history
-            prob_fake = disc.predict(x, verbose=0)
-            ls = np.mean(-np.log(np.subtract(1, prob_fake)))
-            g_history = self._run_generator(loss=ls).history
-            syn_data = self._sample(verbose=0)
-
-            if verbose:
-                print(f"Epoch {i+1}/{epochs}: G_loss = {g_history['loss'][0]:.6f}, G_accuracy = {g_history['accuracy'][0]:.6f}, D_loss = {d_history['loss'][0]:.6f}, D_accuracy = {d_history['accuracy'][0]:.6f}")
-        return self
-
-    def evaluate(self, x, y, model='lr') -> float:
-        """
-        Perform a TSTR (Training on Synthetic data, Testing on Real data) evaluation.
-
-        Parameters
-        ----------
-        x, y : array_like
-            Test dataset.
-
-        model : str or object
-            The model used for evaluation. Should be one of ['lr', 'mlp', 'rf'], or a model object that has sklearn-style `fit` and `predict` methods.
-
-        Returns
-        -------
-        accuracy_score : float
- - """ - from sklearn.linear_model import LogisticRegression - from sklearn.neural_network import MLPClassifier - from sklearn.ensemble import RandomForestClassifier - from sklearn.preprocessing import OneHotEncoder - from sklearn.pipeline import Pipeline - from sklearn.metrics import accuracy_score - - eval_model = None - models = dict( - lr=LogisticRegression, - rf=RandomForestClassifier, - mlp=MLPClassifier - ) - if model in models.keys(): - eval_model = models[model]() - elif hasattr(model, 'fit') and hasattr(model, 'predict'): - eval_model = model - else: - raise Exception("Invalid Arugument `model`, Should be one of ['lr', 'mlp', 'rf'], or a model class that have sklearn-style `fit` and `predict` method.") - - synthetic_data = self._sample() - synthetic_x, synthetic_y = synthetic_data[:,:-1], synthetic_data[:,-1] - x_test = self._ordinal_encoder.transform(x) - y_test = self._label_encoder.transform(y) - - categories = self._d.get_categories() - pipline = Pipeline([('encoder', OneHotEncoder(categories=categories, handle_unknown='ignore')), ('model', eval_model)]) - pipline.fit(synthetic_x, synthetic_y) - pred = pipline.predict(x_test) - return accuracy_score(y_test, pred) - - def sample(self, size=None, verbose=0) -> np.ndarray: - """ - Generate synthetic data. - - Parameters - ---------- - size : int or None - Size of the data to be generated. set to `None` to make the size equal to the size of the training set. - - verbose : int, default=1 - Whether to output the log. Use 1 for log output and 0 for complete silence. - - Return: - ----------------- - synthetic_samples : np.ndarray - Generated synthetic data. 
- """ - ordinal_data = self._sample(size, verbose) - origin_x = self._ordinal_encoder.inverse_transform(ordinal_data[:,:-1]) - origin_y = self._label_encoder.inverse_transform(ordinal_data[:,-1]).reshape(-1,1) - return np.hstack([origin_x, origin_y]) - - def _sample(self, size=None, verbose=0) -> np.ndarray: - """ - Generate synthetic data in ordinal encoding format - """ - if verbose is None or not isinstance(verbose, int): - verbose = 1 - #basic varibles - d = self._d - feature_cards = np.array(d.feature_uniques) - #ensure sum of each constraint group equals to 1, then re concat the probs - _idxs = np.cumsum([0] + d._kdbe.constraints_.tolist()) - constraint_idxs = [(_idxs[i],_idxs[i+1]) for i in range(len(_idxs)-1)] - - probs = np.exp(self.__gen_weights[0]) - cpd_probs = [probs[start:end,:] for start, end in constraint_idxs] - cpd_probs = np.vstack([p/p.sum(axis=0) for p in cpd_probs]) - - #assign the probs to the full cpd tables - idxs = np.cumsum([0] + d._kdbe.high_order_feature_uniques_) - feature_idxs = [(idxs[i],idxs[i+1]) for i in range(len(idxs)-1)] - have_value_idxs = d._kdbe.have_value_idxs_ - full_cpd_probs = [] - for have_value, (start, end) in zip(have_value_idxs, feature_idxs): - #(n_high_order_feature_uniques, n_classes) - cpd_prob_ = cpd_probs[start:end,:] - #(n_all_combination) Note: the order is (*parent, variable) - have_value_ravel = have_value.ravel() - #(n_classes * n_all_combination) - have_value_ravel_repeat = np.hstack([have_value_ravel] * d.num_classes) - #(n_classes * n_all_combination) <- (n_classes * n_high_order_feature_uniques) - full_cpd_prob_ravel = np.zeros_like(have_value_ravel_repeat, dtype=float) - full_cpd_prob_ravel[have_value_ravel_repeat] = cpd_prob_.T.ravel() - #(n_classes * n_parent_combinations, n_variable_unique) - full_cpd_prob = full_cpd_prob_ravel.reshape(-1, have_value.shape[-1]).T - full_cpd_prob = _add_uniform(full_cpd_prob, noise=0) - full_cpd_probs.append(full_cpd_prob) - - #prepare node and edge names - 
node_names = [str(i) for i in range(d.num_features + 1)] - edge_names = [(str(i), str(j)) for i,j in d._kdbe.edges_] - y_name = node_names[-1] - - #create TabularCPD objects - evidences = d._kdbe.dependencies_ - feature_cpds = [ - TabularCPD(str(name), feature_cards[name], table, - evidence=[y_name, *[str(e) for e in evidences]], - evidence_card=[d.num_classes, *feature_cards[evidences].tolist()]) - for (name, evidences), table in zip(evidences.items(), full_cpd_probs) - ] - y_probs = (d.class_counts/d.data_size).reshape(-1,1) - y_cpd = TabularCPD(y_name, d.num_classes, y_probs) - - #create kDB model, then sample the data - model = BayesianNetwork(edge_names) - model.add_cpds(y_cpd, *feature_cpds) - sample_size = d.data_size if size is None else size - result = BayesianModelSampling(model).forward_sample(size=sample_size, show_progress = verbose > 0) - sorted_result = result[node_names].values - - return sorted_result - - def _warmup_run(self, epochs, verbose=None): - d = self._d - tf.keras.backend.clear_session() - ohex = d.get_kdbe_x(self.k) - self.constraints = softmax_weight(d.constraint_positions) - elr = get_lr(ohex.shape[1], d.num_classes, self.constraints) - history = elr.fit(ohex, d.y, batch_size=self.batch_size, epochs=epochs, verbose=verbose) - self.__gen_weights = elr.get_weights() - tf.keras.backend.clear_session() - return history - - def _run_generator(self, loss): - d = self._d - ohex = d.get_kdbe_x(self.k) - tf.keras.backend.clear_session() - model = tf.keras.Sequential() - model.add(tf.keras.layers.Dense(d.num_classes, input_dim=ohex.shape[1], activation='softmax',kernel_constraint=self.constraints)) - model.compile(loss=elr_loss(loss), optimizer='adam', metrics=['accuracy']) - model.set_weights(self.__gen_weights) - history = model.fit(ohex, d.y, batch_size=self.batch_size,epochs=1, verbose=0) - self.__gen_weights = model.get_weights() - tf.keras.backend.clear_session() - return history - - def _discrim(self): - model = tf.keras.Sequential() - 
model.add(tf.keras.layers.Dense(1, input_dim=self._d.num_features, activation='sigmoid')) - model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy']) - return model \ No newline at end of file diff --git a/katabatic/models/ganblr/ganblr_adapter.py b/katabatic/models/ganblr/ganblr_adapter.py deleted file mode 100644 index aee22c2..0000000 --- a/katabatic/models/ganblr/ganblr_adapter.py +++ /dev/null @@ -1,133 +0,0 @@ -from katabatic.katabatic_spi import KatabaticModelSPI -import pandas as pd -from .ganblr import GANBLR - - -class GanblrAdapter(KatabaticModelSPI): - """ - Adapter class for GANBLR model to interface with KatabaticModelSPI. - - Attributes: - type (str): The type of model, either 'discrete' or 'continuous'. - constraints: The constraints for the model (currently not used). - batch_size (int): The batch size for training the model. - epochs (int): The number of epochs for training the model. - training_sample_size (int): The size of the training sample. - """ - - def __init__(self, type="discrete"): - """ - Initialize the GANBLR Adapter with the specified type. - - Args: - type (str): The type of model, either 'discrete' or 'continuous'. Default is 'discrete'. - """ - self.type = type - self.constraints = None - self.batch_size = None - self.epochs = None - - def load_model(self): - """ - Initialize and load the GANBLR model. - - Returns: - GANBLR: An instance of the GANBLR model. - """ - print("[INFO] Initializing GANBLR Model") - self.model = GANBLR() - self.training_sample_size = 0 - return self.model - - def load_data(self, data_pathname): - """ - Load data from the specified pathname. - - Args: - data_pathname (str): The path to the data file. - - Returns: - pd.DataFrame: The loaded data as a pandas DataFrame. - - Raises: - FileNotFoundError: If the file specified by data_pathname is not found. - pd.errors.EmptyDataError: If the file is empty. - pd.errors.ParserError: If there is a parsing error while reading the file. 
-
-            Exception: For any other exceptions that occur during data loading.
-        """
-        print(f"[INFO] Loading data from {data_pathname}")
-        data = None  # ensure `data` is defined even if loading fails
-        try:
-            data = pd.read_csv(data_pathname)
-            print("[SUCCESS] Data loaded successfully.")
-        except FileNotFoundError:
-            print(f"[ERROR] File '{data_pathname}' not found.")
-        except pd.errors.EmptyDataError:
-            print("[ERROR] The file is empty.")
-        except pd.errors.ParserError:
-            print("[ERROR] Parsing error while reading the file.")
-        except Exception as e:
-            print(f"[ERROR] An unexpected error occurred: {e}")
-        return data
-
-    def fit(self, X_train, y_train, k=0, epochs=10, batch_size=64):
-        """
-        Train the GANBLR model on the input data.
-
-        Args:
-            X_train (pd.DataFrame or np.ndarray): Training features.
-            y_train (pd.Series or np.ndarray): Training labels.
-            k (int): An optional parameter for the model's fit method. Default is 0.
-            epochs (int): The number of epochs for training. Default is 10.
-            batch_size (int): The batch size for training. Default is 64.
-
-        Raises:
-            ValueError: If there is a value error such as wrong input shape.
-            TypeError: If there is a type error such as wrong data type.
-            Exception: For any other exceptions that occur during training.
-        """
-        try:
-            print("[INFO] Training GANBLR model")
-            self.model.fit(
-                X_train, y_train, k, batch_size=batch_size, epochs=epochs, verbose=1
-            )
-            self.training_sample_size = len(X_train)
-            print("[SUCCESS] Model training completed")
-        except ValueError as e:
-            print(f"[ERROR] ValueError during model training: {e}")
-        except TypeError as e:
-            print(f"[ERROR] TypeError during model training: {e}")
-        except Exception as e:
-            print(f"[ERROR] An unexpected error occurred during model training: {e}")
-
-    def generate(self, size=None):
-        """
-        Generate data using the GANBLR model.
-
-        Args:
-            size (int): The number of samples to generate. If not specified, defaults to the training sample size.
-
-        Returns:
-            pd.DataFrame or np.ndarray: The generated data.
- - Raises: - ValueError: If there is a value error such as invalid size. - TypeError: If there is a type error such as wrong data type for size. - AttributeError: If the model does not have a sample method. - Exception: For any other exceptions that occur during data generation. - """ - try: - print("[INFO] Generating data using GANBLR model") - if size is None: - size = self.training_sample_size - - generated_data = self.model.sample(size, verbose=0) - print("[SUCCESS] Data generation completed") - return generated_data - except ValueError as e: - print(f"[ERROR] ValueError during data generation: {e}") - except TypeError as e: - print(f"[ERROR] TypeError during data generation: {e}") - except AttributeError as e: - print(f"[ERROR] AttributeError during data generation: {e}") - except Exception as e: - print(f"[ERROR] An unexpected error occurred during data generation: {e}") diff --git a/katabatic/models/ganblr/kdb.py b/katabatic/models/ganblr/kdb.py deleted file mode 100644 index 70df610..0000000 --- a/katabatic/models/ganblr/kdb.py +++ /dev/null @@ -1,370 +0,0 @@ -import numpy as np -#import networkx as nx -from pyitlib import discrete_random_variable as drv -import warnings -warnings.filterwarnings("ignore") - -def build_graph(X, y, k=2): - ''' - kDB algorithm - - Param: - ---------------------- - - Return: - ---------------------- - graph edges - ''' - #ensure data - num_features = X.shape[1] - x_nodes = list(range(num_features)) - y_node = num_features - - #util func - _x = lambda i:X[:,i] - _x2comb = lambda i,j:(X[:,i], X[:,j]) - - #feature indexes desc sort by mutual information - sorted_feature_idxs = np.argsort([ - drv.information_mutual(_x(i), y) - for i in range(num_features) - ])[::-1] - - #start building graph - edges = [] - for iter, target_idx in enumerate(sorted_feature_idxs): - target_node = x_nodes[target_idx] - edges.append((y_node, target_node)) - - parent_candidate_idxs = sorted_feature_idxs[:iter] - if iter <= k: - for idx in 
parent_candidate_idxs: - edges.append((x_nodes[idx], target_node)) - else: - first_k_parent_mi_idxs = np.argsort([ - drv.information_mutual_conditional(*_x2comb(i, target_idx), y) - for i in parent_candidate_idxs - ])[::-1][:k] - first_k_parent_idxs = parent_candidate_idxs[first_k_parent_mi_idxs] - - for parent_idx in first_k_parent_idxs: - edges.append((x_nodes[parent_idx], target_node)) - return edges - -# def draw_graph(edges): -# ''' -# Draw the graph -# -# Param -# ----------------- -# edges: edges of the graph -# -# ''' -# graph = nx.DiGraph(edges) -# pos=nx.spiral_layout(graph) -# nx.draw(graph, pos, node_color='r', edge_color='b') -# nx.draw_networkx_labels(graph, pos, font_size=20, font_family="sans-serif") - - -def get_cross_table(*cols, apply_wt=False): - ''' - author: alexland - - returns: - (i) xt, NumPy array storing the xtab results, number of dimensions is equal to - the len(args) passed in - (ii) unique_vals_all_cols, a tuple of 1D NumPy array for each dimension - in xt (for a 2D xtab, the tuple comprises the row and column headers) - pass in: - (i) 1 or more 1D NumPy arrays of integers - (ii) if wts is True, then the last array in cols is an array of weights - - if return_inverse=True, then np.unique also returns an integer index - (from 0, & of same len as array passed in) such that, uniq_vals[idx] gives the original array passed in - higher dimensional cross tabulations are supported (eg, 2D & 3D) - cross tabulation on two variables (columns): - >>> q1 = np.array([7, 8, 8, 8, 5, 6, 4, 6, 6, 8, 4, 6, 6, 6, 6, 8, 8, 5, 8, 6]) - >>> q2 = np.array([6, 4, 6, 4, 8, 8, 4, 8, 7, 4, 4, 8, 8, 7, 5, 4, 8, 4, 4, 4]) - >>> uv, xt = xtab(q1, q2) - >>> uv - (array([4, 5, 6, 7, 8]), array([4, 5, 6, 7, 8])) - >>> xt - array([[2, 0, 0, 0, 0], - [1, 0, 0, 0, 1], - [1, 1, 0, 2, 4], - [0, 0, 1, 0, 0], - [5, 0, 1, 0, 1]], dtype=uint64) - ''' - if not all(len(col) == len(cols[0]) for col in cols[1:]): - raise ValueError("all arguments must be same size") - - if 
len(cols) == 0: - raise TypeError("xtab() requires at least one argument") - - fnx1 = lambda q: len(q.squeeze().shape) - if not all([fnx1(col) == 1 for col in cols]): - raise ValueError("all input arrays must be 1D") - - if apply_wt: - cols, wt = cols[:-1], cols[-1] - else: - wt = 1 - - uniq_vals_all_cols, idx = zip( *(np.unique(col, return_inverse=True) for col in cols) ) - shape_xt = [uniq_vals_col.size for uniq_vals_col in uniq_vals_all_cols] - dtype_xt = 'float' if apply_wt else 'uint' - xt = np.zeros(shape_xt, dtype=dtype_xt) - np.add.at(xt, idx, wt) - return uniq_vals_all_cols, xt - -def _get_dependencies_without_y(variables, y_name, kdb_edges): - ''' - evidences of each variable without y. - - Param: - -------------- - variables: variable names - - y_name: class name - - kdb_edges: list of tuple (source, target) - ''' - dependencies = {} - kdb_edges_without_y = [edge for edge in kdb_edges if edge[0] != y_name] - mi_desc_order = {t:i for i,(s,t) in enumerate(kdb_edges) if s == y_name} - for x in variables: - current_dependencies = [s for s,t in kdb_edges_without_y if t == x] - if len(current_dependencies) >= 2: - sort_dict = {t:mi_desc_order[t] for t in current_dependencies} - dependencies[x] = sorted(sort_dict) - else: - dependencies[x] = current_dependencies - return dependencies - -def _add_uniform(array, noise=1e-5): - ''' - if no count on particular condition for any feature, give a uniform prob rather than leave 0 - ''' - sum_by_col = np.sum(array,axis=0) - zero_idxs = (array == 0).astype(int) - # zero_count_by_col = np.sum(zero_idxs,axis=0) - nunique = array.shape[0] - result = np.zeros_like(array, dtype='float') - for i in range(array.shape[1]): - if sum_by_col[i] == 0: - result[:,i] = array[:,i] + 1./nunique - elif noise != 0: - result[:,i] = array[:,i] + noise * zero_idxs[:,i] - else: - result[:,i] = array[:,i] - return result - -def _normalize_by_column(array): - sum_by_col = np.sum(array,axis=0) - return np.divide(array, sum_by_col, - 
out=np.zeros_like(array,dtype='float'),
-                     where=sum_by_col != 0)
-
-def _smoothing(cct, d):
-    '''
-    probability smoothing for kDB
-
-    Parameters:
-    -----------
-    cct (np.ndarray): cross count table with shape (x0, *parents)
-
-    d (int): dimension of cct
-
-    Return:
-    --------
-    smoothed joint prob table
-    '''
-    # convert the cross-count table to a joint-probability table by normalizing along axis 0
-    jpt = _normalize_by_column(cct)
-    smoothing_idx = jpt == 0
-    if d > 1 and np.sum(smoothing_idx) > 0:
-        parent = cct.sum(axis=-1)
-        parent = _smoothing(parent, d-1)
-        parent_extend = parent.repeat(jpt.shape[-1]).reshape(jpt.shape)
-        jpt[smoothing_idx] = parent_extend[smoothing_idx]
-    return jpt
-
-def get_high_order_feature(X, col, evidence_cols, feature_uniques):
-    '''
-    encode the high order feature of X[col] given evidences X[evidence_cols].
-    '''
-    if evidence_cols is None or len(evidence_cols) == 0:
-        return X[:,[col]]
-    else:
-        evidences = [X[:,_col] for _col in evidence_cols]
-
-        # [1, variable_unique, *evidence_uniques]
-        base = [1, feature_uniques[col]] + [feature_uniques[_col] for _col in evidence_cols[::-1][:-1]]
-        cum_base = np.cumprod(base)[::-1]
-
-        cols = evidence_cols + [col]
-        high_order_feature = np.sum(X[:,cols] * cum_base, axis=1).reshape(-1,1)
-        return high_order_feature
-
-def get_high_order_constraints(X, col, evidence_cols, feature_uniques):
-    '''
-    find the constraint information for the high order feature X[col] given evidences X[evidence_cols].
-
-    Returns:
-    ---------------------
-    tuple(have_value, high_order_constraints)
-
-    have_value: a (k+1)-dimensional numpy ndarray of type boolean.
-        Each dimension corresponds to a variable, in the order (*evidence_cols, col).
-        True indicates that the corresponding combination of variable values could be found in the dataset;
-        False indicates that it could not.
-
-    high_order_constraints: a 1d numpy ndarray of type int.
-        Each number `c` indicates that the `c` columns since the last constraint position (or index 0)
-        should have the constraint applied, in sequence.
-    '''
-    if evidence_cols is None or len(evidence_cols) == 0:
-        unique = feature_uniques[col]
-        return np.ones(unique,dtype=bool), np.array([unique])
-    else:
-        cols = evidence_cols + [col]
-        cross_table_idxs, cross_table = get_cross_table(*[X[:,i] for i in cols])
-        have_value = cross_table != 0
-
-        have_value_reshape = have_value.reshape(-1,have_value.shape[-1])
-        #have_value_split = np.split(have_value_reshape, have_value_reshape.shape[0], 0)
-        high_order_constraints = np.sum(have_value_reshape, axis=-1)
-
-        return have_value, high_order_constraints
-
-class KdbHighOrderFeatureEncoder:
-    '''
-    High-order feature encoder that uses the kDB model to retrieve the dependencies between features.
-    '''
-    def __init__(self):
-        self.dependencies_ = {}
-        self.constraints_ = np.array([])
-        self.have_value_idxs_ = []
-        self.feature_uniques_ = []
-        self.high_order_feature_uniques_ = []
-        self.edges_ = []
-        self.ohe_ = None
-        self.k = None
-        #self.full_=True
-
-    def fit(self, X, y, k=0):
-        '''
-        Fit the KdbHighOrderFeatureEncoder to X, y.
-
-        Parameters
-        ----------
-        X : array_like of shape (n_samples, n_features)
-            Data to fit in the encoder.
-
-        y : array_like of shape (n_samples,)
-            Label to fit in the encoder.
-
-        k : int, default=0
-            k value of the order of the high-order feature. k = 0 will lead to a OneHotEncoder.
-
-        Returns
-        -------
-        self : object
-            Fitted encoder.
- ''' - self.k = k - edges = build_graph(X, y, k) - #n_classes = len(np.unique(y)) - num_features = X.shape[1] - - if k > 0: - dependencies = _get_dependencies_without_y(list(range(num_features)), num_features, edges) - else: - dependencies = {x:[] for x in range(num_features)} - - self.dependencies_ = dependencies - self.feature_uniques_ = [len(np.unique(X[:,i])) for i in range(num_features)] - self.edges_ = edges - #self.full_ = full - - Xk, constraints, have_value_idxs = self.transform(X, return_constraints=True, use_ohe=False) - - from sklearn.preprocessing import OneHotEncoder - self.ohe_ = OneHotEncoder().fit(Xk) - self.high_order_feature_uniques_ = [len(c) for c in self.ohe_.categories_] - self.constraints_ = constraints - self.have_value_idxs_ = have_value_idxs - return self - - def transform(self, X, return_constraints=False, use_ohe=True): - """ - Transform X to the high-order features. - - Parameters - ---------- - X : array_like of shape (n_samples, n_features) - Data to fit in the encoder. - - return_constraints : bool, default=False - Whether to return the constraint informations. - - use_ohe : bool, default=True - Whether to transform output to one-hot format. - - Returns - ------- - X_out : ndarray of shape (n_samples, n_encoded_features) - Transformed input. 
- """ - Xk = [] - have_value_idxs = [] - constraints = [] - for k, v in self.dependencies_.items(): - xk = get_high_order_feature(X, k, v, self.feature_uniques_) - Xk.append(xk) - - if return_constraints: - idx, constraint = get_high_order_constraints(X, k, v, self.feature_uniques_) - have_value_idxs.append(idx) - constraints.append(constraint) - - Xk = np.hstack(Xk) - from sklearn.preprocessing import OrdinalEncoder - Xk = OrdinalEncoder().fit_transform(Xk) - if use_ohe: - Xk = self.ohe_.transform(Xk) - - if return_constraints: - concated_constraints = np.hstack(constraints) - return Xk, concated_constraints, have_value_idxs - else: - return Xk - - def fit_transform(self, X, y, k=0, return_constraints=False): - ''' - Fit KdbHighOrderFeatureEncoder to X, y, then transform X. - - Equivalent to fit(X, y, k).transform(X, return_constraints) but more convenient. - - Parameters - ---------- - X : array_like of shape (n_samples, n_features) - data to fit in the encoder. - - y : array_like of shape (n_samples,) - label to fit in the encoder. - - k : int, default=0 - k value of the kdb model. k = 0 will lead to a OneHotEncoder. - - return_constraints : bool, default=False - whether to return the constraint informations. - - Returns - ------- - X_out : ndarray of shape (n_samples, n_encoded_features) - Transformed input. 
- ''' - return self.fit(X, y, k).transform(X, return_constraints) \ No newline at end of file diff --git a/katabatic/models/ganblr/utils.py b/katabatic/models/ganblr/utils.py deleted file mode 100644 index 58a8938..0000000 --- a/katabatic/models/ganblr/utils.py +++ /dev/null @@ -1,162 +0,0 @@ -from tensorflow.python.ops import math_ops -import numpy as np -import tensorflow as tf -import warnings -warnings.filterwarnings("ignore") - -class softmax_weight(tf.keras.constraints.Constraint): - """Constrains weight tensors to be under softmax `.""" - - def __init__(self,feature_uniques): - if isinstance(feature_uniques, np.ndarray): - idxs = math_ops.cumsum(np.hstack([np.array([0]),feature_uniques])) - else: - idxs = math_ops.cumsum([0] + feature_uniques) - idxs = [i.numpy() for i in idxs] - self.feature_idxs = [ - (idxs[i],idxs[i+1]) for i in range(len(idxs)-1) - ] - - def __call__(self, w): - w_new = [ - math_ops.log(tf.nn.softmax(w[i:j,:], axis=0)) - for i,j in self.feature_idxs - ] - return tf.concat(w_new, 0) - - def get_config(self): - return {'feature_idxs': self.feature_idxs} - -def elr_loss(KL_LOSS): - def loss(y_true, y_pred): - return tf.keras.losses.sparse_categorical_crossentropy(y_true, y_pred)+ KL_LOSS - return loss - -def KL_loss(prob_fake): - return np.mean(-np.log(np.subtract(1,prob_fake))) - -def get_lr(input_dim, output_dim, constraint=None,KL_LOSS=0): - model = tf.keras.Sequential() - model.add(tf.keras.layers.Dense(output_dim, input_dim=input_dim, activation='softmax',kernel_constraint=constraint)) - model.compile(loss=elr_loss(KL_LOSS), optimizer='adam', metrics=['accuracy']) - #log_elr = model.fit(*train_data, validation_data=test_data, batch_size=batch_size,epochs=epochs) - return model - -def sample(*arrays, n=None, frac=None, random_state=None): - ''' - generate sample random arrays from given arrays. The given arrays must be same size. - - Parameters: - -------------- - *arrays: arrays to be sampled. 
- - n (int): Number of random samples to generate. - - frac: Float value between 0 and 1, Returns (float value * length of given arrays). frac cannot be used with n. - - random_state: int value or numpy.random.RandomState, optional. if set to a particular integer, will return same samples in every iteration. - - Return: - -------------- - the sampled array(s). Passing in multiple arrays will result in the return of a tuple. - - ''' - random = np.random - if isinstance(random_state, int): - random = random.RandomState(random_state) - elif isinstance(random_state, np.random.RandomState): - random = random_state - - arr0 = arrays[0] - original_size = len(arr0) - if n == None and frac == None: - raise Exception('You must specify one of frac or size.') - if n == None: - n = int(len(arr0) * frac) - - idxs = random.choice(original_size, n, replace=False) - if len(arrays) > 1: - sampled_arrays = [] - for arr in arrays: - assert(len(arr) == original_size) - sampled_arrays.append(arr[idxs]) - return tuple(sampled_arrays) - else: - return arr0[idxs] - -DEMO_DATASETS = { - 'adult': { - 'link':'https://raw.githubusercontent.com/chriszhangpodo/discretizedata/main/adult-dm.csv', - 'params': { - 'dtype' : int - } - }, - 'adult-raw':{ - 'link':'https://drive.google.com/uc?export=download&id=1iA-_qIC1xKQJ4nL2ugX1_XJQf8__xOY0', - 'params': {} - } -} - -def get_demo_data(name='adult'): - """ - Download demo dataset from internet. - - Parameters - ---------- - name : str - Name of dataset. Should be one of ['adult', 'adult-raw']. - - Returns - ------- - data : pandas.DataFrame - the demo dataset. - """ - assert(name in DEMO_DATASETS.keys()) - return read_csv(DEMO_DATASETS[name]['link'], **DEMO_DATASETS[name]['params']) - -from .kdb import KdbHighOrderFeatureEncoder -from sklearn.preprocessing import OneHotEncoder -from pandas import read_csv -import numpy as np - -class DataUtils: - """ - useful data utils for the preparation before training. 
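The core of the `sample` helper above is a single index draw shared by every array; a minimal sketch of that behaviour with toy arrays and a fixed seed:

```python
import numpy as np

x = np.arange(10).reshape(5, 2)   # row i is [2i, 2i+1]
y = np.arange(5)

# Equivalent of sample(x, y, frac=0.8, random_state=0): draw one set of
# row indices without replacement and apply it to every array.
rng = np.random.RandomState(0)
n = int(len(x) * 0.8)
idxs = rng.choice(len(x), n, replace=False)
x_s, y_s = x[idxs], y[idxs]
# Rows stay aligned across arrays, so x_s[:, 0] == 2 * y_s holds.
```

Because the same `idxs` indexes every array, feature rows and their labels stay paired after sampling, which is what the discriminator training loop relies on.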
- """ - def __init__(self, x, y): - self.x = x - self.y = y - self.data_size = len(x) - self.num_features = x.shape[1] - - yunique, ycounts = np.unique(y, return_counts=True) - self.num_classes = len(yunique) - self.class_counts = ycounts - self.feature_uniques = [len(np.unique(x[:,i])) for i in range(self.num_features)] - - self.constraint_positions = None - self._kdbe = None - - self.__kdbe_x = None - - def get_categories(self, idxs=None): - if idxs != None: - return [self._kdbe.ohe_.categories_[i] for i in idxs] - return self._kdbe.ohe_.categories_ - - def get_kdbe_x(self, k=0, dense_format=True) -> np.ndarray: - if self.__kdbe_x is not None: - return self.__kdbe_x - if self._kdbe == None: - self._kdbe = KdbHighOrderFeatureEncoder() - self._kdbe.fit(self.x, self.y, k=k) - kdbex = self._kdbe.transform(self.x) - if dense_format: - kdbex = kdbex.todense() - self.__kdbe_x = kdbex - self.constraint_positions = self._kdbe.constraints_ - return kdbex - - def clear(self): - self._kdbe = None - self.__kdbe_x = None \ No newline at end of file diff --git a/katabatic/models/ganblrpp/__init__.py b/katabatic/models/ganblrpp/__init__.py deleted file mode 100644 index 6905489..0000000 --- a/katabatic/models/ganblrpp/__init__.py +++ /dev/null @@ -1,12 +0,0 @@ -"""Top-level package for Ganblr.""" - -__author__ = """Tulip Lab""" -__email__ = 'jhzhou@tuliplab.academy' -__version__ = '0.1.0' - -from .kdb import KdbHighOrderFeatureEncoder -from .utils import get_demo_data - -__all__ = ['models', 'KdbHighOrderFeatureEncoder', 'get_demo_data'] - - diff --git a/katabatic/models/ganblrpp/ganblr.py b/katabatic/models/ganblrpp/ganblr.py deleted file mode 100644 index a699618..0000000 --- a/katabatic/models/ganblrpp/ganblr.py +++ /dev/null @@ -1,244 +0,0 @@ -from .kdb import * -from .kdb import _add_uniform -from .utils import * -from pgmpy.models import BayesianNetwork -from pgmpy.sampling import BayesianModelSampling -from pgmpy.factors.discrete import TabularCPD -from 
sklearn.preprocessing import OrdinalEncoder, LabelEncoder -import numpy as np -import tensorflow as tf - -class GANBLR: - """ - The GANBLR Model. - """ - def __init__(self) -> None: - self._d = None - self.__gen_weights = None - self.batch_size = None - self.epochs = None - self.k = None - self.constraints = None - self._ordinal_encoder = OrdinalEncoder(dtype=int, handle_unknown='use_encoded_value', unknown_value=-1) - self._label_encoder = LabelEncoder() - - def fit(self, x, y, k=0, batch_size=32, epochs=10, warmup_epochs=1, verbose=1): - ''' - Fit the model to the given data. - - Parameters - ---------- - x : array_like of shape (n_samples, n_features) - Dataset to fit the model. The data should be discrete. - - y : array_like of shape (n_samples,) - Label of the dataset. - - k : int, default=0 - Parameter k of ganblr model. Must be greater than 0. No more than 2 is Suggested. - - batch_size : int, default=32 - Size of the batch to feed the model at each step. - - epochs : int, default=0 - Number of epochs to use during training. - - warmup_epochs : int, default=1 - Number of epochs to use in warmup phase. Defaults to :attr:`1`. - - verbose : int, default=1 - Whether to output the log. Use 1 for log output and 0 for complete silence. - - Returns - ------- - self : object - Fitted model. 
- ''' - if verbose is None or not isinstance(verbose, int): - verbose = 1 - x = self._ordinal_encoder.fit_transform(x) - y = self._label_encoder.fit_transform(y).astype(int) - d = DataUtils(x, y) - self._d = d - self.k = k - self.batch_size = batch_size - if verbose: - print(f"warmup run:") - history = self._warmup_run(warmup_epochs, verbose=verbose) - syn_data = self._sample(verbose=0) - discriminator_label = np.hstack([np.ones(d.data_size), np.zeros(d.data_size)]) - for i in range(epochs): - discriminator_input = np.vstack([x, syn_data[:,:-1]]) - disc_input, disc_label = sample(discriminator_input, discriminator_label, frac=0.8) - disc = self._discrim() - d_history = disc.fit(disc_input, disc_label, batch_size=batch_size, epochs=1, verbose=0).history - prob_fake = disc.predict(x, verbose=0) - ls = np.mean(-np.log(np.subtract(1, prob_fake))) - g_history = self._run_generator(loss=ls).history - syn_data = self._sample(verbose=0) - - if verbose: - print(f"Epoch {i+1}/{epochs}: G_loss = {g_history['loss'][0]:.6f}, G_accuracy = {g_history['accuracy'][0]:.6f}, D_loss = {d_history['loss'][0]:.6f}, D_accuracy = {d_history['accuracy'][0]:.6f}") - return self - - def evaluate(self, x, y, model='lr') -> float: - """ - Perform a TSTR(Training on Synthetic data, Testing on Real data) evaluation. - - Parameters - ---------- - x, y : array_like - Test dataset. - - model : str or object - The model used for evaluate. Should be one of ['lr', 'mlp', 'rf'], or a model class that have sklearn-style `fit` and `predict` method. - - Return: - -------- - accuracy_score : float. 
- - """ - from sklearn.linear_model import LogisticRegression - from sklearn.neural_network import MLPClassifier - from sklearn.ensemble import RandomForestClassifier - from sklearn.preprocessing import OneHotEncoder - from sklearn.pipeline import Pipeline - from sklearn.metrics import accuracy_score - - eval_model = None - models = dict( - lr=LogisticRegression, - rf=RandomForestClassifier, - mlp=MLPClassifier - ) - if model in models.keys(): - eval_model = models[model]() - elif hasattr(model, 'fit') and hasattr(model, 'predict'): - eval_model = model - else: - raise Exception("Invalid Arugument `model`, Should be one of ['lr', 'mlp', 'rf'], or a model class that have sklearn-style `fit` and `predict` method.") - - synthetic_data = self._sample() - synthetic_x, synthetic_y = synthetic_data[:,:-1], synthetic_data[:,-1] - x_test = self._ordinal_encoder.transform(x) - y_test = self._label_encoder.transform(y) - - categories = self._d.get_categories() - pipline = Pipeline([('encoder', OneHotEncoder(categories=categories, handle_unknown='ignore')), ('model', eval_model)]) - pipline.fit(synthetic_x, synthetic_y) - pred = pipline.predict(x_test) - return accuracy_score(y_test, pred) - - def sample(self, size=None, verbose=1) -> np.ndarray: - """ - Generate synthetic data. - - Parameters - ---------- - size : int or None - Size of the data to be generated. set to `None` to make the size equal to the size of the training set. - - verbose : int, default=1 - Whether to output the log. Use 1 for log output and 0 for complete silence. - - Return: - ----------------- - synthetic_samples : np.ndarray - Generated synthetic data. 
- """ - ordinal_data = self._sample(size, verbose) - origin_x = self._ordinal_encoder.inverse_transform(ordinal_data[:,:-1]) - origin_y = self._label_encoder.inverse_transform(ordinal_data[:,-1]).reshape(-1,1) - return np.hstack([origin_x, origin_y]) - - def _sample(self, size=None, verbose=1) -> np.ndarray: - """ - Generate synthetic data in ordinal encoding format - """ - if verbose is None or not isinstance(verbose, int): - verbose = 1 - #basic varibles - d = self._d - feature_cards = np.array(d.feature_uniques) - #ensure sum of each constraint group equals to 1, then re concat the probs - _idxs = np.cumsum([0] + d._kdbe.constraints_.tolist()) - constraint_idxs = [(_idxs[i],_idxs[i+1]) for i in range(len(_idxs)-1)] - - probs = np.exp(self.__gen_weights[0]) - cpd_probs = [probs[start:end,:] for start, end in constraint_idxs] - cpd_probs = np.vstack([p/p.sum(axis=0) for p in cpd_probs]) - - #assign the probs to the full cpd tables - idxs = np.cumsum([0] + d._kdbe.high_order_feature_uniques_) - feature_idxs = [(idxs[i],idxs[i+1]) for i in range(len(idxs)-1)] - have_value_idxs = d._kdbe.have_value_idxs_ - full_cpd_probs = [] - for have_value, (start, end) in zip(have_value_idxs, feature_idxs): - #(n_high_order_feature_uniques, n_classes) - cpd_prob_ = cpd_probs[start:end,:] - #(n_all_combination) Note: the order is (*parent, variable) - have_value_ravel = have_value.ravel() - #(n_classes * n_all_combination) - have_value_ravel_repeat = np.hstack([have_value_ravel] * d.num_classes) - #(n_classes * n_all_combination) <- (n_classes * n_high_order_feature_uniques) - full_cpd_prob_ravel = np.zeros_like(have_value_ravel_repeat, dtype=float) - full_cpd_prob_ravel[have_value_ravel_repeat] = cpd_prob_.T.ravel() - #(n_classes * n_parent_combinations, n_variable_unique) - full_cpd_prob = full_cpd_prob_ravel.reshape(-1, have_value.shape[-1]).T - full_cpd_prob = _add_uniform(full_cpd_prob, noise=0) - full_cpd_probs.append(full_cpd_prob) - - #prepare node and edge names - 
node_names = [str(i) for i in range(d.num_features + 1)] - edge_names = [(str(i), str(j)) for i,j in d._kdbe.edges_] - y_name = node_names[-1] - - #create TabularCPD objects - evidences = d._kdbe.dependencies_ - feature_cpds = [ - TabularCPD(str(name), feature_cards[name], table, - evidence=[y_name, *[str(e) for e in evidences]], - evidence_card=[d.num_classes, *feature_cards[evidences].tolist()]) - for (name, evidences), table in zip(evidences.items(), full_cpd_probs) - ] - y_probs = (d.class_counts/d.data_size).reshape(-1,1) - y_cpd = TabularCPD(y_name, d.num_classes, y_probs) - - #create kDB model, then sample the data - model = BayesianNetwork(edge_names) - model.add_cpds(y_cpd, *feature_cpds) - sample_size = d.data_size if size is None else size - result = BayesianModelSampling(model).forward_sample(size=sample_size, show_progress = verbose > 0) - sorted_result = result[node_names].values - - return sorted_result - - def _warmup_run(self, epochs, verbose=None): - d = self._d - tf.keras.backend.clear_session() - ohex = d.get_kdbe_x(self.k) - self.constraints = softmax_weight(d.constraint_positions) - elr = get_lr(ohex.shape[1], d.num_classes, self.constraints) - history = elr.fit(ohex, d.y, batch_size=self.batch_size, epochs=epochs, verbose=verbose) - self.__gen_weights = elr.get_weights() - tf.keras.backend.clear_session() - return history - - def _run_generator(self, loss): - d = self._d - ohex = d.get_kdbe_x(self.k) - tf.keras.backend.clear_session() - model = tf.keras.Sequential() - model.add(tf.keras.layers.Dense(d.num_classes, input_dim=ohex.shape[1], activation='softmax',kernel_constraint=self.constraints)) - model.compile(loss=elr_loss(loss), optimizer='adam', metrics=['accuracy']) - model.set_weights(self.__gen_weights) - history = model.fit(ohex, d.y, batch_size=self.batch_size,epochs=1, verbose=0) - self.__gen_weights = model.get_weights() - tf.keras.backend.clear_session() - return history - - def _discrim(self): - model = tf.keras.Sequential() - 
model.add(tf.keras.layers.Dense(1, input_dim=self._d.num_features, activation='sigmoid')) - model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy']) - return model \ No newline at end of file diff --git a/katabatic/models/ganblrpp/ganblrpp.py b/katabatic/models/ganblrpp/ganblrpp.py deleted file mode 100644 index a23d0f0..0000000 --- a/katabatic/models/ganblrpp/ganblrpp.py +++ /dev/null @@ -1,315 +0,0 @@ -from .ganblr import GANBLR -from sklearn.mixture import BayesianGaussianMixture -from sklearn.preprocessing import MinMaxScaler, LabelEncoder, OrdinalEncoder -from scipy.stats import truncnorm -import numpy as np - -class DMMDiscritizer: - def __init__(self, random_state): - self.__dmm_params = dict(weight_concentration_prior_type="dirichlet_process", - n_components=2 * 5, reg_covar=0, init_params='random', - max_iter=1500, mean_precision_prior=.1, - random_state=random_state) - - self.__scaler = MinMaxScaler() - self.__dmms = [] - self.__arr_mu = [] - self.__arr_sigma = [] - self._random_state = random_state - - def fit(self, x): - """ - Perform DMM discretization. - - Parameter: - --------- - x (2d np.ndarray): data to be discretized. Must be numeric data.
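The per-column mode detection in the `DMMDiscritizer` above follows this pattern: scale to [0, 1], fit a Dirichlet-process `BayesianGaussianMixture`, and densely re-index the predicted components. A sketch on synthetic two-cluster data (the data and the trimmed mixture parameters are illustrative, not the exact configuration above):

```python
import numpy as np
from sklearn.mixture import BayesianGaussianMixture
from sklearn.preprocessing import LabelEncoder, MinMaxScaler

# Synthetic 1-column data with two well-separated clusters (hypothetical).
rng = np.random.RandomState(0)
col = np.hstack([rng.normal(0.0, 0.1, 100),
                 rng.normal(5.0, 0.1, 100)]).reshape(-1, 1)

x = MinMaxScaler().fit_transform(col)
dmm = BayesianGaussianMixture(
    weight_concentration_prior_type='dirichlet_process',
    n_components=10, max_iter=500, random_state=0)
# fit_predict assigns each value to a mixture component ("mode");
# LabelEncoder re-indexes the used components to 0..n_modes-1.
modes = LabelEncoder().fit_transform(dmm.fit_predict(x))
```

The Dirichlet-process prior lets the mixture use fewer than `n_components` components, so the `LabelEncoder` step is needed to turn the sparse set of used component ids into a dense mode index.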
- - Return: - ---------- - self - - """ - assert(isinstance(x, np.ndarray)) - assert(len(x.shape) == 2) - - x_scaled = self.__scaler.fit_transform(x) - self.__internal_fit(x_scaled) - return self - - def transform(self, x) -> np.ndarray: - x = self.__scaler.transform(x) - arr_modes = [] - for i, dmm in enumerate(self.__dmms): - modes = dmm.predict(x[:,i:i+1]) - modes = LabelEncoder().fit_transform(modes)#.astype(int) - arr_modes.append(modes) - return self.__internal_transform(x, arr_modes) - - def fit_transform(self, x) -> np.ndarray: - assert(isinstance(x, np.ndarray)) - assert(len(x.shape) == 2) - - x_scaled = self.__scaler.fit_transform(x) - arr_modes = self.__internal_fit(x_scaled) - return self.__internal_transform(x_scaled, arr_modes) - - def __internal_fit(self, x): - self.__dmms.clear() - self.__arr_mu.clear() - self.__arr_sigma.clear() - - arr_mode = [] - for i in range(x.shape[1]): - cur_column = x[:,i:i+1] - dmm = BayesianGaussianMixture(**self.__dmm_params) - y = dmm.fit_predict(cur_column) - lbe = LabelEncoder().fit(y) - mu = dmm.means_[:len(lbe.classes_)] - sigma = np.sqrt(dmm.covariances_[:len(lbe.classes_)]) - - arr_mode.append(lbe.transform(y))#.astype(int)) - #self.__arr_lbes.append(lbe) - self.__dmms.append(dmm) - self.__arr_mu.append(mu.ravel()) - self.__arr_sigma.append(sigma.ravel()) - return arr_mode - - def __internal_transform(self, x, arr_modes): - _and = np.logical_and - _not = np.logical_not - - discretized_data = [] - for i, (modes, mu, sigma) in enumerate(zip( - arr_modes, - self.__arr_mu, - self.__arr_sigma)): - - cur_column = x[:,i] - cur_mu = mu[modes] - cur_sigma = sigma[modes] - x_std = cur_column - cur_mu - - less_than_n3sigma = (x_std <= -3*cur_sigma) - less_than_n2sigma = (x_std <= -2*cur_sigma) - less_than_n1sigma = (x_std <= -cur_sigma) - less_than_0 = (x_std <= 0) - less_than_1sigma = (x_std <= cur_sigma) - less_than_2sigma = (x_std <= 2*cur_sigma) - less_than_3sigma = (x_std <= 3*cur_sigma) - - discretized_x = 8 * modes - 
discretized_x[_and(_not(less_than_n3sigma), less_than_n2sigma)] += 1 - discretized_x[_and(_not(less_than_n2sigma), less_than_n1sigma)] += 2 - discretized_x[_and(_not(less_than_n1sigma), less_than_0)] += 3 - discretized_x[_and(_not(less_than_0) , less_than_1sigma)] += 4 - discretized_x[_and(_not(less_than_1sigma) , less_than_2sigma)] += 5 - discretized_x[_and(_not(less_than_2sigma) , less_than_3sigma)] += 6 - discretized_x[_not(less_than_3sigma)] += 7 - discretized_data.append(discretized_x.reshape(-1,1)) - - return np.hstack(discretized_data) - - def inverse_transform(self, x, verbose=1) -> np.ndarray: - x_modes = x // 8 - x_bins = x % 8 - - def __sample_one_column(i, mu, sigma): - cur_column_modes = x_modes[:,i] - cur_column_bins = x_bins[:,i] - cur_column_mode_uniques = np.unique(cur_column_modes) - inversed_x = np.zeros_like(cur_column_modes, dtype=float) - - for mode in cur_column_mode_uniques: - cur_mode_idx = cur_column_modes == mode - cur_mode_mu = mu[mode] - cur_mode_sigma = sigma[mode] - - sample_results = self.__sample_from_truncnorm(cur_column_bins[cur_mode_idx], cur_mode_mu, cur_mode_sigma, random_state=self._random_state) - inversed_x[cur_mode_idx] = sample_results - - return inversed_x.reshape(-1,1) - - if verbose: - from tqdm import tqdm - _progress_wrapper = lambda iterable: tqdm(iterable, desc='sampling', total=len(self.__arr_mu)) - else: - _progress_wrapper = lambda iterable: iterable - inversed_data = np.hstack([__sample_one_column(i, mu, sigma) - for i, (mu, sigma) in _progress_wrapper(enumerate(zip(self.__arr_mu, self.__arr_sigma)))]) - - #the sampling progress is fast enough so there is no need for parallelization - #inversed_data = np.hstack(Parallel(n_jobs=n_jobs, verbose=verbose)(delayed(__sample_one_column)(i, mu, sigma) - # for i, (mu, sigma) in enumerate(zip(self.__arr_mu, self.__arr_sigma)))) - return self.__scaler.inverse_transform(inversed_data) - - @staticmethod - def __sample_from_truncnorm(bins, mu, sigma, random_state=None): - 
sampled_results = np.zeros_like(bins, dtype=float) - def __sampling(idx, range_min, range_max): - sampling_size = np.sum(idx) - if sampling_size != 0: - sampled_results[idx] = truncnorm.rvs(range_min, range_max, loc=mu, scale=sigma, size=sampling_size, random_state=random_state) - - #shape param (min, max) of scipy.stats.truncnorm.rvs are still defined with respect to the standard normal - __sampling(bins == 0, np.NINF, -3) - __sampling(bins == 1, -3, -2) - __sampling(bins == 2, -2, -1) - __sampling(bins == 3, -1, 0) - __sampling(bins == 4, 0, 1) - __sampling(bins == 5, 1, 2) - __sampling(bins == 6, 2, 3) - __sampling(bins == 7, 3, np.inf) - return sampled_results - -class GANBLRPP: - """ - The GANBLR++ model. - - Parameters - ---------- - numerical_columns : list of int - Indicating the indexes of numerical columns. - For example, if the 3, 5, 10th feature of a data is numerical feature, then this param should be [3, 5, 10]. - - random_state : int, RandomState instance or None - Controls the random seed given to the method chosen to initialize the parameters of `BayesianGaussianMixture` used by `GANBLRPP`. - """ - def __init__(self, numerical_columns, random_state=None): - self.__discritizer = DMMDiscritizer(random_state) - self.__ganblr = GANBLR() - self._numerical_columns = numerical_columns - pass - - def fit(self, x, y, k=0, batch_size=32, epochs=10, warmup_epochs=1, verbose=1): - ''' - Fit the model to the given data. - - Parameters - ---------- - x : array_like of shape (n_samples, n_features) - Dataset to fit the model. The data should be discrete. - - y : array_like of shape (n_samples,) - Label of the dataset. - - k : int, default=0 - Parameter k of ganblr model. Must be greater than 0. No more than 2 is Suggested. - - batch_size : int, default=32 - Size of the batch to feed the model at each step. - - epochs : int, default=0 - Number of epochs to use during training. - - warmup_epochs : int, default=1 - Number of epochs to use in warmup phase. 
Defaults to :attr:`1`. - - verbose : int, default=1 - Whether to output the log. Use 1 for log output and 0 for complete silence. - - Returns - ------- - self : object - Fitted model. - ''' - numerical_columns = self._numerical_columns - x[:,numerical_columns] = self.__discritizer.fit_transform(x[:,numerical_columns]) - return self.__ganblr.fit(x, y, k, batch_size, epochs, warmup_epochs, verbose) - - def sample(self, size=None, verbose=1): - """ - Generate synthetic data. - - Parameters - ---------- - size : int or None - Size of the data to be generated. set to `None` to make the size equal to the size of the training set. - - verbose : int, default=1 - Whether to output the log. Use 1 for log output and 0 for complete silence. - - Return: - ----------------- - synthetic_samples : np.ndarray - Generated synthetic data. - """ - if verbose: - print('Step 1/2: Sampling discrete data from GANBLR.') - ordinal_data = self.__ganblr._sample(size, verbose=verbose) - syn_x = self.__ganblr._ordinal_encoder.inverse_transform(ordinal_data[:,:-1]) - syn_y = self.__ganblr._label_encoder.inverse_transform(ordinal_data[:,-1]).reshape(-1,1) - if verbose: - print('step 2/2: Sampling numerical data.') - numerical_columns = self._numerical_columns - numerical_data = self.__discritizer.inverse_transform(syn_x[:,numerical_columns].astype(int)) - syn_x[:,numerical_columns] = numerical_data - return np.hstack([syn_x, syn_y]) - - def evaluate(self, x, y, model='lr'): - """ - Perform a TSTR(Training on Synthetic data, Testing on Real data) evaluation. - - Parameters - ---------- - x, y : array_like - Test dataset. - - model : str or object - The model used for evaluate. Should be one of ['lr', 'mlp', 'rf'], or a model class that have sklearn-style `fit` and `predict` method. - - Return: - -------- - accuracy_score : float. 
- - """ - from sklearn.linear_model import LogisticRegression - from sklearn.neural_network import MLPClassifier - from sklearn.ensemble import RandomForestClassifier - from sklearn.preprocessing import OneHotEncoder, StandardScaler, OrdinalEncoder - from sklearn.metrics import accuracy_score - - eval_model = None - if model=='lr': - eval_model = LogisticRegression() - elif model == 'rf': - eval_model = RandomForestClassifier() - elif model == 'mlp': - eval_model = MLPClassifier() - elif hasattr(model, 'fit') and hasattr(model, 'predict'): - eval_model = model - else: - raise Exception('Invalid Arugument') - numerical_columns = self._numerical_columns - catgorical_columns = list(set(range(x.shape[1])) - set(numerical_columns)) - categories = self.__ganblr._d.get_categories(catgorical_columns) - - synthetic_data = self.sample() - synthetic_x, synthetic_y = synthetic_data[:,:-1], synthetic_data[:,-1] - - ohe = OneHotEncoder(categories=categories, sparse=False, handle_unknown='ignore') - syn_x_ohe = ohe.fit_transform(synthetic_x[:,catgorical_columns]) - real_x_ohe = ohe.transform(x[:,catgorical_columns]) - syn_x_num = synthetic_x[:,numerical_columns] - real_x_num = x[:,numerical_columns] - - scaler = StandardScaler() - syn_x_concat = scaler.fit_transform(np.hstack([syn_x_num, syn_x_ohe])) - real_x_concat = scaler.transform(np.hstack([real_x_num, real_x_ohe])) - - lbe = self.__ganblr._label_encoder - real_y = lbe.transform(y) - syn_y = lbe.transform(synthetic_y) - - eval_model.fit(syn_x_concat, syn_y) - pred = eval_model.predict(real_x_concat) - return accuracy_score(real_y, pred) - - - - - - - - - diff --git a/katabatic/models/ganblrpp/ganblrpp_DGEK/__init__.py b/katabatic/models/ganblrpp/ganblrpp_DGEK/__init__.py deleted file mode 100644 index 6905489..0000000 --- a/katabatic/models/ganblrpp/ganblrpp_DGEK/__init__.py +++ /dev/null @@ -1,12 +0,0 @@ -"""Top-level package for Ganblr.""" - -__author__ = """Tulip Lab""" -__email__ = 'jhzhou@tuliplab.academy' -__version__ 
= '0.1.0' - -from .kdb import KdbHighOrderFeatureEncoder -from .utils import get_demo_data - -__all__ = ['models', 'KdbHighOrderFeatureEncoder', 'get_demo_data'] - - diff --git a/katabatic/models/ganblrpp/ganblrpp_DGEK/ganblr.py b/katabatic/models/ganblrpp/ganblrpp_DGEK/ganblr.py deleted file mode 100644 index a699618..0000000 --- a/katabatic/models/ganblrpp/ganblrpp_DGEK/ganblr.py +++ /dev/null @@ -1,244 +0,0 @@ -from .kdb import * -from .kdb import _add_uniform -from .utils import * -from pgmpy.models import BayesianNetwork -from pgmpy.sampling import BayesianModelSampling -from pgmpy.factors.discrete import TabularCPD -from sklearn.preprocessing import OrdinalEncoder, LabelEncoder -import numpy as np -import tensorflow as tf - -class GANBLR: - """ - The GANBLR Model. - """ - def __init__(self) -> None: - self._d = None - self.__gen_weights = None - self.batch_size = None - self.epochs = None - self.k = None - self.constraints = None - self._ordinal_encoder = OrdinalEncoder(dtype=int, handle_unknown='use_encoded_value', unknown_value=-1) - self._label_encoder = LabelEncoder() - - def fit(self, x, y, k=0, batch_size=32, epochs=10, warmup_epochs=1, verbose=1): - ''' - Fit the model to the given data. - - Parameters - ---------- - x : array_like of shape (n_samples, n_features) - Dataset to fit the model. The data should be discrete. - - y : array_like of shape (n_samples,) - Label of the dataset. - - k : int, default=0 - Parameter k of ganblr model. Must be greater than 0. No more than 2 is Suggested. - - batch_size : int, default=32 - Size of the batch to feed the model at each step. - - epochs : int, default=0 - Number of epochs to use during training. - - warmup_epochs : int, default=1 - Number of epochs to use in warmup phase. Defaults to :attr:`1`. - - verbose : int, default=1 - Whether to output the log. Use 1 for log output and 0 for complete silence. - - Returns - ------- - self : object - Fitted model. 
- ''' - if verbose is None or not isinstance(verbose, int): - verbose = 1 - x = self._ordinal_encoder.fit_transform(x) - y = self._label_encoder.fit_transform(y).astype(int) - d = DataUtils(x, y) - self._d = d - self.k = k - self.batch_size = batch_size - if verbose: - print(f"warmup run:") - history = self._warmup_run(warmup_epochs, verbose=verbose) - syn_data = self._sample(verbose=0) - discriminator_label = np.hstack([np.ones(d.data_size), np.zeros(d.data_size)]) - for i in range(epochs): - discriminator_input = np.vstack([x, syn_data[:,:-1]]) - disc_input, disc_label = sample(discriminator_input, discriminator_label, frac=0.8) - disc = self._discrim() - d_history = disc.fit(disc_input, disc_label, batch_size=batch_size, epochs=1, verbose=0).history - prob_fake = disc.predict(x, verbose=0) - ls = np.mean(-np.log(np.subtract(1, prob_fake))) - g_history = self._run_generator(loss=ls).history - syn_data = self._sample(verbose=0) - - if verbose: - print(f"Epoch {i+1}/{epochs}: G_loss = {g_history['loss'][0]:.6f}, G_accuracy = {g_history['accuracy'][0]:.6f}, D_loss = {d_history['loss'][0]:.6f}, D_accuracy = {d_history['accuracy'][0]:.6f}") - return self - - def evaluate(self, x, y, model='lr') -> float: - """ - Perform a TSTR(Training on Synthetic data, Testing on Real data) evaluation. - - Parameters - ---------- - x, y : array_like - Test dataset. - - model : str or object - The model used for evaluate. Should be one of ['lr', 'mlp', 'rf'], or a model class that have sklearn-style `fit` and `predict` method. - - Return: - -------- - accuracy_score : float. 
- - """ - from sklearn.linear_model import LogisticRegression - from sklearn.neural_network import MLPClassifier - from sklearn.ensemble import RandomForestClassifier - from sklearn.preprocessing import OneHotEncoder - from sklearn.pipeline import Pipeline - from sklearn.metrics import accuracy_score - - eval_model = None - models = dict( - lr=LogisticRegression, - rf=RandomForestClassifier, - mlp=MLPClassifier - ) - if model in models.keys(): - eval_model = models[model]() - elif hasattr(model, 'fit') and hasattr(model, 'predict'): - eval_model = model - else: - raise Exception("Invalid Arugument `model`, Should be one of ['lr', 'mlp', 'rf'], or a model class that have sklearn-style `fit` and `predict` method.") - - synthetic_data = self._sample() - synthetic_x, synthetic_y = synthetic_data[:,:-1], synthetic_data[:,-1] - x_test = self._ordinal_encoder.transform(x) - y_test = self._label_encoder.transform(y) - - categories = self._d.get_categories() - pipline = Pipeline([('encoder', OneHotEncoder(categories=categories, handle_unknown='ignore')), ('model', eval_model)]) - pipline.fit(synthetic_x, synthetic_y) - pred = pipline.predict(x_test) - return accuracy_score(y_test, pred) - - def sample(self, size=None, verbose=1) -> np.ndarray: - """ - Generate synthetic data. - - Parameters - ---------- - size : int or None - Size of the data to be generated. set to `None` to make the size equal to the size of the training set. - - verbose : int, default=1 - Whether to output the log. Use 1 for log output and 0 for complete silence. - - Return: - ----------------- - synthetic_samples : np.ndarray - Generated synthetic data. 
- """ - ordinal_data = self._sample(size, verbose) - origin_x = self._ordinal_encoder.inverse_transform(ordinal_data[:,:-1]) - origin_y = self._label_encoder.inverse_transform(ordinal_data[:,-1]).reshape(-1,1) - return np.hstack([origin_x, origin_y]) - - def _sample(self, size=None, verbose=1) -> np.ndarray: - """ - Generate synthetic data in ordinal encoding format - """ - if verbose is None or not isinstance(verbose, int): - verbose = 1 - #basic varibles - d = self._d - feature_cards = np.array(d.feature_uniques) - #ensure sum of each constraint group equals to 1, then re concat the probs - _idxs = np.cumsum([0] + d._kdbe.constraints_.tolist()) - constraint_idxs = [(_idxs[i],_idxs[i+1]) for i in range(len(_idxs)-1)] - - probs = np.exp(self.__gen_weights[0]) - cpd_probs = [probs[start:end,:] for start, end in constraint_idxs] - cpd_probs = np.vstack([p/p.sum(axis=0) for p in cpd_probs]) - - #assign the probs to the full cpd tables - idxs = np.cumsum([0] + d._kdbe.high_order_feature_uniques_) - feature_idxs = [(idxs[i],idxs[i+1]) for i in range(len(idxs)-1)] - have_value_idxs = d._kdbe.have_value_idxs_ - full_cpd_probs = [] - for have_value, (start, end) in zip(have_value_idxs, feature_idxs): - #(n_high_order_feature_uniques, n_classes) - cpd_prob_ = cpd_probs[start:end,:] - #(n_all_combination) Note: the order is (*parent, variable) - have_value_ravel = have_value.ravel() - #(n_classes * n_all_combination) - have_value_ravel_repeat = np.hstack([have_value_ravel] * d.num_classes) - #(n_classes * n_all_combination) <- (n_classes * n_high_order_feature_uniques) - full_cpd_prob_ravel = np.zeros_like(have_value_ravel_repeat, dtype=float) - full_cpd_prob_ravel[have_value_ravel_repeat] = cpd_prob_.T.ravel() - #(n_classes * n_parent_combinations, n_variable_unique) - full_cpd_prob = full_cpd_prob_ravel.reshape(-1, have_value.shape[-1]).T - full_cpd_prob = _add_uniform(full_cpd_prob, noise=0) - full_cpd_probs.append(full_cpd_prob) - - #prepare node and edge names - 
node_names = [str(i) for i in range(d.num_features + 1)] - edge_names = [(str(i), str(j)) for i,j in d._kdbe.edges_] - y_name = node_names[-1] - - #create TabularCPD objects - evidences = d._kdbe.dependencies_ - feature_cpds = [ - TabularCPD(str(name), feature_cards[name], table, - evidence=[y_name, *[str(e) for e in evidences]], - evidence_card=[d.num_classes, *feature_cards[evidences].tolist()]) - for (name, evidences), table in zip(evidences.items(), full_cpd_probs) - ] - y_probs = (d.class_counts/d.data_size).reshape(-1,1) - y_cpd = TabularCPD(y_name, d.num_classes, y_probs) - - #create kDB model, then sample the data - model = BayesianNetwork(edge_names) - model.add_cpds(y_cpd, *feature_cpds) - sample_size = d.data_size if size is None else size - result = BayesianModelSampling(model).forward_sample(size=sample_size, show_progress = verbose > 0) - sorted_result = result[node_names].values - - return sorted_result - - def _warmup_run(self, epochs, verbose=None): - d = self._d - tf.keras.backend.clear_session() - ohex = d.get_kdbe_x(self.k) - self.constraints = softmax_weight(d.constraint_positions) - elr = get_lr(ohex.shape[1], d.num_classes, self.constraints) - history = elr.fit(ohex, d.y, batch_size=self.batch_size, epochs=epochs, verbose=verbose) - self.__gen_weights = elr.get_weights() - tf.keras.backend.clear_session() - return history - - def _run_generator(self, loss): - d = self._d - ohex = d.get_kdbe_x(self.k) - tf.keras.backend.clear_session() - model = tf.keras.Sequential() - model.add(tf.keras.layers.Dense(d.num_classes, input_dim=ohex.shape[1], activation='softmax',kernel_constraint=self.constraints)) - model.compile(loss=elr_loss(loss), optimizer='adam', metrics=['accuracy']) - model.set_weights(self.__gen_weights) - history = model.fit(ohex, d.y, batch_size=self.batch_size,epochs=1, verbose=0) - self.__gen_weights = model.get_weights() - tf.keras.backend.clear_session() - return history - - def _discrim(self): - model = tf.keras.Sequential() - 
model.add(tf.keras.layers.Dense(1, input_dim=self._d.num_features, activation='sigmoid'))
-        model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])
-        return model
\ No newline at end of file
diff --git a/katabatic/models/ganblrpp/ganblrpp_DGEK/ganblrpp.py b/katabatic/models/ganblrpp/ganblrpp_DGEK/ganblrpp.py
deleted file mode 100644
index a23d0f0..0000000
--- a/katabatic/models/ganblrpp/ganblrpp_DGEK/ganblrpp.py
+++ /dev/null
@@ -1,315 +0,0 @@
-from .ganblr import GANBLR
-from sklearn.mixture import BayesianGaussianMixture
-from sklearn.preprocessing import MinMaxScaler, LabelEncoder, OrdinalEncoder
-from scipy.stats import truncnorm
-import numpy as np
-
-class DMMDiscritizer:
-    def __init__(self, random_state):
-        self.__dmm_params = dict(weight_concentration_prior_type="dirichlet_process",
-                                 n_components=2 * 5, reg_covar=0, init_params='random',
-                                 max_iter=1500, mean_precision_prior=.1,
-                                 random_state=random_state)
-
-        self.__scaler = MinMaxScaler()
-        self.__dmms = []
-        self.__arr_mu = []
-        self.__arr_sigma = []
-        self._random_state = random_state
-
-    def fit(self, x):
-        """
-        Do DMM discretization.
-
-        Parameters
-        ----------
-        x (2d np.ndarray): data to be discretized. Must be numeric data.
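As a standalone sketch of the per-column mixture step `DMMDiscritizer` performs (assuming scikit-learn; the parameters here are simplified relative to the class's `__dmm_params`):

```python
import numpy as np
from sklearn.mixture import BayesianGaussianMixture
from sklearn.preprocessing import LabelEncoder

rng = np.random.default_rng(0)
# One numeric column with two clearly separated clusters.
column = np.concatenate([rng.normal(0.0, 0.05, 200),
                         rng.normal(1.0, 0.05, 200)]).reshape(-1, 1)

# A Dirichlet-process mixture prunes unused components on its own.
dmm = BayesianGaussianMixture(
    weight_concentration_prior_type="dirichlet_process",
    n_components=10, max_iter=500, random_state=0)
modes = dmm.fit_predict(column)              # raw component labels, possibly sparse
modes = LabelEncoder().fit_transform(modes)  # re-label to contiguous 0..n_used-1
```

The `LabelEncoder` pass mirrors `__internal_fit`: raw component indices can be sparse, so they are compacted before being used as mode labels.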
- - Return: - ---------- - self - - """ - assert(isinstance(x, np.ndarray)) - assert(len(x.shape) == 2) - - x_scaled = self.__scaler.fit_transform(x) - self.__internal_fit(x_scaled) - return self - - def transform(self, x) -> np.ndarray: - x = self.__scaler.transform(x) - arr_modes = [] - for i, dmm in enumerate(self.__dmms): - modes = dmm.predict(x[:,i:i+1]) - modes = LabelEncoder().fit_transform(modes)#.astype(int) - arr_modes.append(modes) - return self.__internal_transform(x, arr_modes) - - def fit_transform(self, x) -> np.ndarray: - assert(isinstance(x, np.ndarray)) - assert(len(x.shape) == 2) - - x_scaled = self.__scaler.fit_transform(x) - arr_modes = self.__internal_fit(x_scaled) - return self.__internal_transform(x_scaled, arr_modes) - - def __internal_fit(self, x): - self.__dmms.clear() - self.__arr_mu.clear() - self.__arr_sigma.clear() - - arr_mode = [] - for i in range(x.shape[1]): - cur_column = x[:,i:i+1] - dmm = BayesianGaussianMixture(**self.__dmm_params) - y = dmm.fit_predict(cur_column) - lbe = LabelEncoder().fit(y) - mu = dmm.means_[:len(lbe.classes_)] - sigma = np.sqrt(dmm.covariances_[:len(lbe.classes_)]) - - arr_mode.append(lbe.transform(y))#.astype(int)) - #self.__arr_lbes.append(lbe) - self.__dmms.append(dmm) - self.__arr_mu.append(mu.ravel()) - self.__arr_sigma.append(sigma.ravel()) - return arr_mode - - def __internal_transform(self, x, arr_modes): - _and = np.logical_and - _not = np.logical_not - - discretized_data = [] - for i, (modes, mu, sigma) in enumerate(zip( - arr_modes, - self.__arr_mu, - self.__arr_sigma)): - - cur_column = x[:,i] - cur_mu = mu[modes] - cur_sigma = sigma[modes] - x_std = cur_column - cur_mu - - less_than_n3sigma = (x_std <= -3*cur_sigma) - less_than_n2sigma = (x_std <= -2*cur_sigma) - less_than_n1sigma = (x_std <= -cur_sigma) - less_than_0 = (x_std <= 0) - less_than_1sigma = (x_std <= cur_sigma) - less_than_2sigma = (x_std <= 2*cur_sigma) - less_than_3sigma = (x_std <= 3*cur_sigma) - - discretized_x = 8 * modes - 
discretized_x[_and(_not(less_than_n3sigma), less_than_n2sigma)] += 1 - discretized_x[_and(_not(less_than_n2sigma), less_than_n1sigma)] += 2 - discretized_x[_and(_not(less_than_n1sigma), less_than_0)] += 3 - discretized_x[_and(_not(less_than_0) , less_than_1sigma)] += 4 - discretized_x[_and(_not(less_than_1sigma) , less_than_2sigma)] += 5 - discretized_x[_and(_not(less_than_2sigma) , less_than_3sigma)] += 6 - discretized_x[_not(less_than_3sigma)] += 7 - discretized_data.append(discretized_x.reshape(-1,1)) - - return np.hstack(discretized_data) - - def inverse_transform(self, x, verbose=1) -> np.ndarray: - x_modes = x // 8 - x_bins = x % 8 - - def __sample_one_column(i, mu, sigma): - cur_column_modes = x_modes[:,i] - cur_column_bins = x_bins[:,i] - cur_column_mode_uniques = np.unique(cur_column_modes) - inversed_x = np.zeros_like(cur_column_modes, dtype=float) - - for mode in cur_column_mode_uniques: - cur_mode_idx = cur_column_modes == mode - cur_mode_mu = mu[mode] - cur_mode_sigma = sigma[mode] - - sample_results = self.__sample_from_truncnorm(cur_column_bins[cur_mode_idx], cur_mode_mu, cur_mode_sigma, random_state=self._random_state) - inversed_x[cur_mode_idx] = sample_results - - return inversed_x.reshape(-1,1) - - if verbose: - from tqdm import tqdm - _progress_wrapper = lambda iterable: tqdm(iterable, desc='sampling', total=len(self.__arr_mu)) - else: - _progress_wrapper = lambda iterable: iterable - inversed_data = np.hstack([__sample_one_column(i, mu, sigma) - for i, (mu, sigma) in _progress_wrapper(enumerate(zip(self.__arr_mu, self.__arr_sigma)))]) - - #the sampling progress is fast enough so there is no need for parallelization - #inversed_data = np.hstack(Parallel(n_jobs=n_jobs, verbose=verbose)(delayed(__sample_one_column)(i, mu, sigma) - # for i, (mu, sigma) in enumerate(zip(self.__arr_mu, self.__arr_sigma)))) - return self.__scaler.inverse_transform(inversed_data) - - @staticmethod - def __sample_from_truncnorm(bins, mu, sigma, random_state=None): - 
sampled_results = np.zeros_like(bins, dtype=float) - def __sampling(idx, range_min, range_max): - sampling_size = np.sum(idx) - if sampling_size != 0: - sampled_results[idx] = truncnorm.rvs(range_min, range_max, loc=mu, scale=sigma, size=sampling_size, random_state=random_state) - - #shape param (min, max) of scipy.stats.truncnorm.rvs are still defined with respect to the standard normal - __sampling(bins == 0, np.NINF, -3) - __sampling(bins == 1, -3, -2) - __sampling(bins == 2, -2, -1) - __sampling(bins == 3, -1, 0) - __sampling(bins == 4, 0, 1) - __sampling(bins == 5, 1, 2) - __sampling(bins == 6, 2, 3) - __sampling(bins == 7, 3, np.inf) - return sampled_results - -class GANBLRPP: - """ - The GANBLR++ model. - - Parameters - ---------- - numerical_columns : list of int - Indicating the indexes of numerical columns. - For example, if the 3, 5, 10th feature of a data is numerical feature, then this param should be [3, 5, 10]. - - random_state : int, RandomState instance or None - Controls the random seed given to the method chosen to initialize the parameters of `BayesianGaussianMixture` used by `GANBLRPP`. - """ - def __init__(self, numerical_columns, random_state=None): - self.__discritizer = DMMDiscritizer(random_state) - self.__ganblr = GANBLR() - self._numerical_columns = numerical_columns - pass - - def fit(self, x, y, k=0, batch_size=32, epochs=10, warmup_epochs=1, verbose=1): - ''' - Fit the model to the given data. - - Parameters - ---------- - x : array_like of shape (n_samples, n_features) - Dataset to fit the model. The data should be discrete. - - y : array_like of shape (n_samples,) - Label of the dataset. - - k : int, default=0 - Parameter k of ganblr model. Must be greater than 0. No more than 2 is Suggested. - - batch_size : int, default=32 - Size of the batch to feed the model at each step. - - epochs : int, default=0 - Number of epochs to use during training. - - warmup_epochs : int, default=1 - Number of epochs to use in warmup phase. 
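The bin-to-interval mapping used in `__sample_from_truncnorm` can be sketched on its own. As the original comment notes, the `(a, b)` shape parameters of `scipy.stats.truncnorm.rvs` are in standard-normal units, while `loc`/`scale` shift and stretch the result:

```python
import numpy as np
from scipy.stats import truncnorm

# Standard-normal interval for each of the 8 bins, mirroring __sampling.
BIN_EDGES = [(-np.inf, -3), (-3, -2), (-2, -1), (-1, 0),
             (0, 1), (1, 2), (2, 3), (3, np.inf)]

def sample_bin(bin_idx, mu, sigma, size, random_state=None):
    """Draw values for one bin from the matching truncated normal."""
    a, b = BIN_EDGES[bin_idx]
    return truncnorm.rvs(a, b, loc=mu, scale=sigma, size=size,
                         random_state=random_state)

# Bin 4 covers (0, 1] in z-units, i.e. values in (mu, mu + sigma].
values = sample_bin(4, mu=10.0, sigma=2.0, size=5, random_state=0)
```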
Defaults to :attr:`1`. - - verbose : int, default=1 - Whether to output the log. Use 1 for log output and 0 for complete silence. - - Returns - ------- - self : object - Fitted model. - ''' - numerical_columns = self._numerical_columns - x[:,numerical_columns] = self.__discritizer.fit_transform(x[:,numerical_columns]) - return self.__ganblr.fit(x, y, k, batch_size, epochs, warmup_epochs, verbose) - - def sample(self, size=None, verbose=1): - """ - Generate synthetic data. - - Parameters - ---------- - size : int or None - Size of the data to be generated. set to `None` to make the size equal to the size of the training set. - - verbose : int, default=1 - Whether to output the log. Use 1 for log output and 0 for complete silence. - - Return: - ----------------- - synthetic_samples : np.ndarray - Generated synthetic data. - """ - if verbose: - print('Step 1/2: Sampling discrete data from GANBLR.') - ordinal_data = self.__ganblr._sample(size, verbose=verbose) - syn_x = self.__ganblr._ordinal_encoder.inverse_transform(ordinal_data[:,:-1]) - syn_y = self.__ganblr._label_encoder.inverse_transform(ordinal_data[:,-1]).reshape(-1,1) - if verbose: - print('step 2/2: Sampling numerical data.') - numerical_columns = self._numerical_columns - numerical_data = self.__discritizer.inverse_transform(syn_x[:,numerical_columns].astype(int)) - syn_x[:,numerical_columns] = numerical_data - return np.hstack([syn_x, syn_y]) - - def evaluate(self, x, y, model='lr'): - """ - Perform a TSTR(Training on Synthetic data, Testing on Real data) evaluation. - - Parameters - ---------- - x, y : array_like - Test dataset. - - model : str or object - The model used for evaluate. Should be one of ['lr', 'mlp', 'rf'], or a model class that have sklearn-style `fit` and `predict` method. - - Return: - -------- - accuracy_score : float. 
- - """ - from sklearn.linear_model import LogisticRegression - from sklearn.neural_network import MLPClassifier - from sklearn.ensemble import RandomForestClassifier - from sklearn.preprocessing import OneHotEncoder, StandardScaler, OrdinalEncoder - from sklearn.metrics import accuracy_score - - eval_model = None - if model=='lr': - eval_model = LogisticRegression() - elif model == 'rf': - eval_model = RandomForestClassifier() - elif model == 'mlp': - eval_model = MLPClassifier() - elif hasattr(model, 'fit') and hasattr(model, 'predict'): - eval_model = model - else: - raise Exception('Invalid Arugument') - numerical_columns = self._numerical_columns - catgorical_columns = list(set(range(x.shape[1])) - set(numerical_columns)) - categories = self.__ganblr._d.get_categories(catgorical_columns) - - synthetic_data = self.sample() - synthetic_x, synthetic_y = synthetic_data[:,:-1], synthetic_data[:,-1] - - ohe = OneHotEncoder(categories=categories, sparse=False, handle_unknown='ignore') - syn_x_ohe = ohe.fit_transform(synthetic_x[:,catgorical_columns]) - real_x_ohe = ohe.transform(x[:,catgorical_columns]) - syn_x_num = synthetic_x[:,numerical_columns] - real_x_num = x[:,numerical_columns] - - scaler = StandardScaler() - syn_x_concat = scaler.fit_transform(np.hstack([syn_x_num, syn_x_ohe])) - real_x_concat = scaler.transform(np.hstack([real_x_num, real_x_ohe])) - - lbe = self.__ganblr._label_encoder - real_y = lbe.transform(y) - syn_y = lbe.transform(synthetic_y) - - eval_model.fit(syn_x_concat, syn_y) - pred = eval_model.predict(real_x_concat) - return accuracy_score(real_y, pred) - - - - - - - - - diff --git a/katabatic/models/ganblrpp/ganblrpp_DGEK/ganblrpp_adapter.py b/katabatic/models/ganblrpp/ganblrpp_DGEK/ganblrpp_adapter.py deleted file mode 100644 index 30de7d1..0000000 --- a/katabatic/models/ganblrpp/ganblrpp_DGEK/ganblrpp_adapter.py +++ /dev/null @@ -1,82 +0,0 @@ - -from katabatic.katabatic_spi import KatabaticModelSPI -import pandas as pd -import numpy as np 
-from .ganblrpp import GANBLRPP - -from katabatic.katabatic_spi import KatabaticModelSPI -import pandas as pd -import numpy as np -from .ganblrpp import GANBLRPP - -class GanblrppAdapter(KatabaticModelSPI): - def __init__(self, model_type="discrete", numerical_columns=None, random_state=None): - self.type = model_type - self.model = None - self.constraints = None - self.batch_size = None - self.epochs = None - self.training_sample_size = 0 - self.numerical_columns = numerical_columns - self.random_state = random_state - - def load_model(self): - if self.numerical_columns is None: - raise ValueError("Numerical columns must be provided for GANBLRPP initialization.") - - print("[INFO] Initializing GANBLR++ Model") - self.model = GANBLRPP(numerical_columns=self.numerical_columns, random_state=self.random_state) - - if self.model is None: - raise RuntimeError("Failed to initialize GANBLR++ model.") - - return self.model - - def load_data(self, data_pathname): - print(f"[INFO] Loading data from {data_pathname}") - try: - data = pd.read_csv(data_pathname) - print("[SUCCESS] Data loaded successfully.") - except Exception as e: - print(f"[ERROR] An error occurred: {e}") - raise - return data - - def fit(self, X_train, y_train, k=0, epochs=10, batch_size=64): - if self.model is None: - raise RuntimeError("Model is not initialized. Call `load_model()` first.") - - try: - print("[INFO] Training GANBLR++ model") - if isinstance(X_train, pd.DataFrame): - X_train = X_train.values - if isinstance(y_train, pd.Series): - y_train = y_train.values - - self.model.fit(X_train, y_train, k=k, batch_size=batch_size, epochs=epochs, verbose=0) - self.training_sample_size = len(X_train) - print("[SUCCESS] Model training completed") - except Exception as e: - print(f"[ERROR] An error occurred during model training: {e}") - raise - - def generate(self, size=None): - if self.model is None: - raise RuntimeError("Model is not initialized. 
Call `load_model()` first.") - - try: - print("[INFO] Generating data using GANBLR++ model") - if size is None: - size = self.training_sample_size - - generated_data = self.model.sample(size, verbose=0) - if isinstance(generated_data, np.ndarray): - generated_data = pd.DataFrame(generated_data) - - print("[SUCCESS] Data generation completed") - return generated_data - except Exception as e: - print(f"[ERROR] An error occurred during data generation: {e}") - raise - - diff --git a/katabatic/models/ganblrpp/ganblrpp_DGEK/kdb.py b/katabatic/models/ganblrpp/ganblrpp_DGEK/kdb.py deleted file mode 100644 index 03c080c..0000000 --- a/katabatic/models/ganblrpp/ganblrpp_DGEK/kdb.py +++ /dev/null @@ -1,378 +0,0 @@ -import numpy as np -#import networkx as nx - - -# Chris advice -# Code it twice -# K=0 -# Warmup run (no generation) -# Then run again with adding the edges. -# What is the advantage over KDB? -# focus on generation, not on structure learning -# ganblr ++ is cheating, as we learn the strucutre prior. 
-# Send to STM -from pyitlib import discrete_random_variable as drv -def build_graph(X, y, k=2): - ''' - kDB algorithm - - Param: - ---------------------- - - Return: - ---------------------- - graph edges - ''' - #ensure data - num_features = X.shape[1] - x_nodes = list(range(num_features)) - y_node = num_features - - #util func - _x = lambda i:X[:,i] - _x2comb = lambda i,j:(X[:,i], X[:,j]) - - #feature indexes desc sort by mutual information - sorted_feature_idxs = np.argsort([ - drv.information_mutual(_x(i), y) - for i in range(num_features) - ])[::-1] - - #start building graph - edges = [] - for iter, target_idx in enumerate(sorted_feature_idxs): - target_node = x_nodes[target_idx] - edges.append((y_node, target_node)) - - parent_candidate_idxs = sorted_feature_idxs[:iter] - if iter <= k: - for idx in parent_candidate_idxs: - edges.append((x_nodes[idx], target_node)) - else: - first_k_parent_mi_idxs = np.argsort([ - drv.information_mutual_conditional(*_x2comb(i, target_idx), y) - for i in parent_candidate_idxs - ])[::-1][:k] - first_k_parent_idxs = parent_candidate_idxs[first_k_parent_mi_idxs] - - for parent_idx in first_k_parent_idxs: - edges.append((x_nodes[parent_idx], target_node)) - return edges - -# def draw_graph(edges): -# ''' -# Draw the graph -# -# Param -# ----------------- -# edges: edges of the graph -# -# ''' -# graph = nx.DiGraph(edges) -# pos=nx.spiral_layout(graph) -# nx.draw(graph, pos, node_color='r', edge_color='b') -# nx.draw_networkx_labels(graph, pos, font_size=20, font_family="sans-serif") - - -def get_cross_table(*cols, apply_wt=False): - ''' - author: alexland - - returns: - (i) xt, NumPy array storing the xtab results, number of dimensions is equal to - the len(args) passed in - (ii) unique_vals_all_cols, a tuple of 1D NumPy array for each dimension - in xt (for a 2D xtab, the tuple comprises the row and column headers) - pass in: - (i) 1 or more 1D NumPy arrays of integers - (ii) if wts is True, then the last array in cols is an 
array of weights - - if return_inverse=True, then np.unique also returns an integer index - (from 0, & of same len as array passed in) such that, uniq_vals[idx] gives the original array passed in - higher dimensional cross tabulations are supported (eg, 2D & 3D) - cross tabulation on two variables (columns): - >>> q1 = np.array([7, 8, 8, 8, 5, 6, 4, 6, 6, 8, 4, 6, 6, 6, 6, 8, 8, 5, 8, 6]) - >>> q2 = np.array([6, 4, 6, 4, 8, 8, 4, 8, 7, 4, 4, 8, 8, 7, 5, 4, 8, 4, 4, 4]) - >>> uv, xt = xtab(q1, q2) - >>> uv - (array([4, 5, 6, 7, 8]), array([4, 5, 6, 7, 8])) - >>> xt - array([[2, 0, 0, 0, 0], - [1, 0, 0, 0, 1], - [1, 1, 0, 2, 4], - [0, 0, 1, 0, 0], - [5, 0, 1, 0, 1]], dtype=uint64) - ''' - if not all(len(col) == len(cols[0]) for col in cols[1:]): - raise ValueError("all arguments must be same size") - - if len(cols) == 0: - raise TypeError("xtab() requires at least one argument") - - fnx1 = lambda q: len(q.squeeze().shape) - if not all([fnx1(col) == 1 for col in cols]): - raise ValueError("all input arrays must be 1D") - - if apply_wt: - cols, wt = cols[:-1], cols[-1] - else: - wt = 1 - - uniq_vals_all_cols, idx = zip( *(np.unique(col, return_inverse=True) for col in cols) ) - shape_xt = [uniq_vals_col.size for uniq_vals_col in uniq_vals_all_cols] - dtype_xt = 'float' if apply_wt else 'uint' - xt = np.zeros(shape_xt, dtype=dtype_xt) - np.add.at(xt, idx, wt) - return uniq_vals_all_cols, xt - -def _get_dependencies_without_y(variables, y_name, kdb_edges): - ''' - evidences of each variable without y. 
- - Param: - -------------- - variables: variable names - - y_name: class name - - kdb_edges: list of tuple (source, target) - ''' - dependencies = {} - kdb_edges_without_y = [edge for edge in kdb_edges if edge[0] != y_name] - mi_desc_order = {t:i for i,(s,t) in enumerate(kdb_edges) if s == y_name} - for x in variables: - current_dependencies = [s for s,t in kdb_edges_without_y if t == x] - if len(current_dependencies) >= 2: - sort_dict = {t:mi_desc_order[t] for t in current_dependencies} - dependencies[x] = sorted(sort_dict) - else: - dependencies[x] = current_dependencies - return dependencies - -def _add_uniform(array, noise=1e-5): - ''' - if no count on particular condition for any feature, give a uniform prob rather than leave 0 - ''' - sum_by_col = np.sum(array,axis=0) - zero_idxs = (array == 0).astype(int) - # zero_count_by_col = np.sum(zero_idxs,axis=0) - nunique = array.shape[0] - result = np.zeros_like(array, dtype='float') - for i in range(array.shape[1]): - if sum_by_col[i] == 0: - result[:,i] = array[:,i] + 1./nunique - elif noise != 0: - result[:,i] = array[:,i] + noise * zero_idxs[:,i] - else: - result[:,i] = array[:,i] - return result - -def _normalize_by_column(array): - sum_by_col = np.sum(array,axis=0) - return np.divide(array, sum_by_col, - out=np.zeros_like(array,dtype='float'), - where=sum_by_col !=0) - -def _smoothing(cct, d): - ''' - probability smoothing for kdb - - Parameters: - ----------- - cct (np.ndarray): cross count table with shape (x0, *parents) - - d (int): dimension of cct - - Return: - -------- - smoothed joint prob table - ''' - #covert cross-count-table to joint-prob-table by doing a normalization alone axis 0 - jpt = _normalize_by_column(cct) - smoothing_idx = jpt == 0 - if d > 1 and np.sum(smoothing_idx) > 0: - parent = cct.sum(axis=-1) - parent = _smoothing(parent, d-1) - parent_extend = parent.repeat(jpt.shape[-1]).reshape(jpt.shape) - jpt[smoothing_idx] = parent_extend[smoothing_idx] - return jpt - -def 
get_high_order_feature(X, col, evidence_cols, feature_uniques):
-    '''
-    Encode the high-order feature of X[col] given the evidences X[evidence_cols].
-    '''
-    if evidence_cols is None or len(evidence_cols) == 0:
-        return X[:,[col]]
-    else:
-        evidences = [X[:,_col] for _col in evidence_cols]
-
-        #[1, variable_unique, evidence_unique]
-        base = [1, feature_uniques[col]] + [feature_uniques[_col] for _col in evidence_cols[::-1][:-1]]
-        cum_base = np.cumprod(base)[::-1]
-
-        cols = evidence_cols + [col]
-        high_order_feature = np.sum(X[:,cols] * cum_base, axis=1).reshape(-1,1)
-        return high_order_feature
-
-def get_high_order_constraints(X, col, evidence_cols, feature_uniques):
-    '''
-    Find the constraint information for the high-order feature X[col] given the evidences X[evidence_cols].
-
-    Returns
-    -------
-    tuple(have_value, high_order_constraints)
-
-    have_value: a (k+1)-dimensional numpy ndarray of type boolean.
-        Each dimension corresponds to a variable, in the order (*evidence_cols, col).
-        True indicates that the corresponding combination of variable values can be found in the dataset;
-        False indicates that it cannot.
-
-    high_order_constraints: a 1d numpy ndarray of type int.
-        Each number `c` indicates that the constraint should be applied to the `c` columns since the last
-        constraint position (or index 0), in sequence.
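The `cum_base` arithmetic in `get_high_order_feature` is a mixed-radix encoding: each `(parent values..., child value)` tuple is packed into a single integer index. A small worked sketch with hypothetical cardinalities:

```python
import numpy as np

# Mixed-radix encoding of (parents..., child) into one code, mirroring
# the cum_base trick in get_high_order_feature.
feature_uniques = [3, 4, 2]        # cardinalities of features 0, 1, 2
col, evidence_cols = 2, [0, 1]     # encode feature 2 given parents 0 and 1

X = np.array([[0, 0, 0],
              [2, 3, 1],
              [1, 2, 0]])

base = [1, feature_uniques[col]] + [feature_uniques[c] for c in evidence_cols[::-1][:-1]]
cum_base = np.cumprod(base)[::-1]          # place values, e.g. [8, 2, 1] here
codes = np.sum(X[:, evidence_cols + [col]] * cum_base, axis=1)
# codes -> [0, 23, 12]; every combination gets a unique index in [0, 3*4*2)
```

Because the place values are the running products of the later cardinalities, distinct value tuples can never collide, which is what makes the code usable as a one-hot category afterwards.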
- - ''' - if evidence_cols is None or len(evidence_cols) == 0: - unique = feature_uniques[col] - return np.ones(unique,dtype=bool), np.array([unique]) - else: - cols = evidence_cols + [col] - cross_table_idxs, cross_table = get_cross_table(*[X[:,i] for i in cols]) - have_value = cross_table != 0 - - have_value_reshape = have_value.reshape(-1,have_value.shape[-1]) - #have_value_split = np.split(have_value_reshape, have_value_reshape.shape[0], 0) - high_order_constraints = np.sum(have_value_reshape, axis=-1) - - return have_value, high_order_constraints - -class KdbHighOrderFeatureEncoder: - ''' - High order feature encoder that uses the kdb model to retrieve the dependencies between features. - - ''' - def __init__(self): - self.dependencies_ = {} - self.constraints_ = np.array([]) - self.have_value_idxs_ = [] - self.feature_uniques_ = [] - self.high_order_feature_uniques_ = [] - self.edges_ = [] - self.ohe_ = None - self.k = None - #self.full_=True - - def fit(self, X, y, k=0): - ''' - Fit the KdbHighOrderFeatureEncoder to X, y. - - Parameters - ---------- - X : array_like of shape (n_samples, n_features) - data to fit in the encoder. - - y : array_like of shape (n_samples,) - label to fit in the encoder. - - k : int, default=0 - k value of the order of the high-order feature. k = 0 will lead to a OneHotEncoder. - - Returns - ------- - self : object - Fitted encoder. 
- ''' - self.k = k - edges = build_graph(X, y, k) - #n_classes = len(np.unique(y)) - num_features = X.shape[1] - - if k > 0: - dependencies = _get_dependencies_without_y(list(range(num_features)), num_features, edges) - else: - dependencies = {x:[] for x in range(num_features)} - - self.dependencies_ = dependencies - self.feature_uniques_ = [len(np.unique(X[:,i])) for i in range(num_features)] - self.edges_ = edges - #self.full_ = full - - Xk, constraints, have_value_idxs = self.transform(X, return_constraints=True, use_ohe=False) - - from sklearn.preprocessing import OneHotEncoder - self.ohe_ = OneHotEncoder().fit(Xk) - self.high_order_feature_uniques_ = [len(c) for c in self.ohe_.categories_] - self.constraints_ = constraints - self.have_value_idxs_ = have_value_idxs - return self - - def transform(self, X, return_constraints=False, use_ohe=True): - """ - Transform X to the high-order features. - - Parameters - ---------- - X : array_like of shape (n_samples, n_features) - Data to fit in the encoder. - - return_constraints : bool, default=False - Whether to return the constraint informations. - - use_ohe : bool, default=True - Whether to transform output to one-hot format. - - Returns - ------- - X_out : ndarray of shape (n_samples, n_encoded_features) - Transformed input. 
- """ - Xk = [] - have_value_idxs = [] - constraints = [] - for k, v in self.dependencies_.items(): - xk = get_high_order_feature(X, k, v, self.feature_uniques_) - Xk.append(xk) - - if return_constraints: - idx, constraint = get_high_order_constraints(X, k, v, self.feature_uniques_) - have_value_idxs.append(idx) - constraints.append(constraint) - - Xk = np.hstack(Xk) - from sklearn.preprocessing import OrdinalEncoder - Xk = OrdinalEncoder().fit_transform(Xk) - if use_ohe: - Xk = self.ohe_.transform(Xk) - - if return_constraints: - concated_constraints = np.hstack(constraints) - return Xk, concated_constraints, have_value_idxs - else: - return Xk - - def fit_transform(self, X, y, k=0, return_constraints=False): - ''' - Fit KdbHighOrderFeatureEncoder to X, y, then transform X. - - Equivalent to fit(X, y, k).transform(X, return_constraints) but more convenient. - - Parameters - ---------- - X : array_like of shape (n_samples, n_features) - data to fit in the encoder. - - y : array_like of shape (n_samples,) - label to fit in the encoder. - - k : int, default=0 - k value of the kdb model. k = 0 will lead to a OneHotEncoder. - - return_constraints : bool, default=False - whether to return the constraint informations. - - Returns - ------- - X_out : ndarray of shape (n_samples, n_encoded_features) - Transformed input. 
- ''' - return self.fit(X, y, k).transform(X, return_constraints) \ No newline at end of file diff --git a/katabatic/models/ganblrpp/ganblrpp_DGEK/utils.py b/katabatic/models/ganblrpp/ganblrpp_DGEK/utils.py deleted file mode 100644 index 3502866..0000000 --- a/katabatic/models/ganblrpp/ganblrpp_DGEK/utils.py +++ /dev/null @@ -1,167 +0,0 @@ -import numpy as np -import tensorflow as tf -from tensorflow.python.ops import math_ops -from pandas import read_csv -from sklearn.preprocessing import OneHotEncoder -from .kdb import KdbHighOrderFeatureEncoder - -from tensorflow.python.ops import math_ops -import numpy as np -import tensorflow as tf - -class softmax_weight(tf.keras.constraints.Constraint): - """Constrains weight tensors to be under softmax `.""" - - def __init__(self,feature_uniques): - if isinstance(feature_uniques, np.ndarray): - idxs = math_ops.cumsum(np.hstack([np.array([0]),feature_uniques])) - else: - idxs = math_ops.cumsum([0] + feature_uniques) - idxs = [i.numpy() for i in idxs] - self.feature_idxs = [ - (idxs[i],idxs[i+1]) for i in range(len(idxs)-1) - ] - - def __call__(self, w): - w_new = [ - math_ops.log(tf.nn.softmax(w[i:j,:], axis=0)) - for i,j in self.feature_idxs - ] - return tf.concat(w_new, 0) - - def get_config(self): - return {'feature_idxs': self.feature_idxs} - -def elr_loss(KL_LOSS): - def loss(y_true, y_pred): - return tf.keras.losses.sparse_categorical_crossentropy(y_true, y_pred)+ KL_LOSS - return loss - -def KL_loss(prob_fake): - return np.mean(-np.log(np.subtract(1,prob_fake))) - -def get_lr(input_dim, output_dim, constraint=None,KL_LOSS=0): - model = tf.keras.Sequential() - model.add(tf.keras.layers.Dense(output_dim, input_dim=input_dim, activation='softmax',kernel_constraint=constraint)) - model.compile(loss=elr_loss(KL_LOSS), optimizer='adam', metrics=['accuracy']) - #log_elr = model.fit(*train_data, validation_data=test_data, batch_size=batch_size,epochs=epochs) - return model - -def sample(*arrays, n=None, frac=None, 
random_state=None): - ''' - generate sample random arrays from given arrays. The given arrays must be same size. - - Parameters: - -------------- - *arrays: arrays to be sampled. - - n (int): Number of random samples to generate. - - frac: Float value between 0 and 1, Returns (float value * length of given arrays). frac cannot be used with n. - - random_state: int value or numpy.random.RandomState, optional. if set to a particular integer, will return same samples in every iteration. - - Return: - -------------- - the sampled array(s). Passing in multiple arrays will result in the return of a tuple. - - ''' - random = np.random - if isinstance(random_state, int): - random = random.RandomState(random_state) - elif isinstance(random_state, np.random.RandomState): - random = random_state - - arr0 = arrays[0] - original_size = len(arr0) - if n == None and frac == None: - raise Exception('You must specify one of frac or size.') - if n == None: - n = int(len(arr0) * frac) - - idxs = random.choice(original_size, n, replace=False) - if len(arrays) > 1: - sampled_arrays = [] - for arr in arrays: - assert(len(arr) == original_size) - sampled_arrays.append(arr[idxs]) - return tuple(sampled_arrays) - else: - return arr0[idxs] - -DEMO_DATASETS = { - 'adult': { - 'link':'https://raw.githubusercontent.com/chriszhangpodo/discretizedata/main/adult-dm.csv', - 'params': { - 'dtype' : int - } - }, - 'adult-raw':{ - 'link':'https://drive.google.com/uc?export=download&id=1iA-_qIC1xKQJ4nL2ugX1_XJQf8__xOY0', - 'params': {} - } -} - -def get_demo_data(name='adult'): - """ - Download demo dataset from internet. - - Parameters - ---------- - name : str - Name of dataset. Should be one of ['adult', 'adult-raw']. - - Returns - ------- - data : pandas.DataFrame - the demo dataset. 
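The `sample` helper above keeps multiple parallel arrays aligned by drawing one shared set of row indices without replacement. The core idea in isolation:

```python
import numpy as np

# Joint subsampling of parallel arrays with one shared index draw,
# the same idea as the sample() utility.
rng = np.random.default_rng(42)
x = np.arange(10)
y = x * 10          # paired labels: y[i] always belongs to x[i]

idxs = rng.choice(len(x), size=4, replace=False)
x_s, y_s = x[idxs], y[idxs]   # rows stay aligned after sampling
```

Drawing `idxs` once and indexing every array with it is what preserves the row pairing; sampling each array independently would scramble it.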
- """ - assert(name in DEMO_DATASETS.keys()) - return read_csv(DEMO_DATASETS[name]['link'], **DEMO_DATASETS[name]['params']) - -from .kdb import KdbHighOrderFeatureEncoder -from sklearn.preprocessing import OneHotEncoder -from pandas import read_csv -import numpy as np - -class DataUtils: - """ - useful data utils for the preparation before training. - """ - def __init__(self, x, y): - self.x = x - self.y = y - self.data_size = len(x) - self.num_features = x.shape[1] - - yunique, ycounts = np.unique(y, return_counts=True) - self.num_classes = len(yunique) - self.class_counts = ycounts - self.feature_uniques = [len(np.unique(x[:,i])) for i in range(self.num_features)] - - self.constraint_positions = None - self._kdbe = None - - self.__kdbe_x = None - - def get_categories(self, idxs=None): - if idxs != None: - return [self._kdbe.ohe_.categories_[i] for i in idxs] - return self._kdbe.ohe_.categories_ - - def get_kdbe_x(self, k=0, dense_format=True) -> np.ndarray: - if self.__kdbe_x is not None: - return self.__kdbe_x - if self._kdbe == None: - self._kdbe = KdbHighOrderFeatureEncoder() - self._kdbe.fit(self.x, self.y, k=k) - kdbex = self._kdbe.transform(self.x) - if dense_format: - kdbex = kdbex.todense() - self.__kdbe_x = kdbex - self.constraint_positions = self._kdbe.constraints_ - return kdbex - - def clear(self): - self._kdbe = None - self.__kdbe_x = None \ No newline at end of file diff --git a/katabatic/models/ganblrpp/ganblrpp_adapter.py b/katabatic/models/ganblrpp/ganblrpp_adapter.py deleted file mode 100644 index fdea085..0000000 --- a/katabatic/models/ganblrpp/ganblrpp_adapter.py +++ /dev/null @@ -1,79 +0,0 @@ -from katabatic.katabatic_spi import KatabaticModelSPI -import pandas as pd -import numpy as np -from .ganblrpp import GANBLRPP - -from katabatic.katabatic_spi import KatabaticModelSPI -import pandas as pd -import numpy as np -from .ganblrpp import GANBLRPP - -class GanblrppAdapter(KatabaticModelSPI): - def __init__(self, model_type="discrete", 
numerical_columns=None, random_state=None): - self.type = model_type - self.model = None - self.constraints = None - self.batch_size = None - self.epochs = None - self.training_sample_size = 0 - self.numerical_columns = numerical_columns - self.random_state = random_state - - def load_model(self): - if self.numerical_columns is None: - raise ValueError("Numerical columns must be provided for GANBLRPP initialization.") - - print("[INFO] Initializing GANBLR++ Model") - self.model = GANBLRPP(numerical_columns=self.numerical_columns, random_state=self.random_state) - - if self.model is None: - raise RuntimeError("Failed to initialize GANBLR++ model.") - - return self.model - - def load_data(self, data_pathname): - print(f"[INFO] Loading data from {data_pathname}") - try: - data = pd.read_csv(data_pathname) - print("[SUCCESS] Data loaded successfully.") - except Exception as e: - print(f"[ERROR] An error occurred: {e}") - raise - return data - - def fit(self, X_train, y_train, k=0, epochs=10, batch_size=64): - if self.model is None: - raise RuntimeError("Model is not initialized. Call `load_model()` first.") - - try: - print("[INFO] Training GANBLR++ model") - if isinstance(X_train, pd.DataFrame): - X_train = X_train.values - if isinstance(y_train, pd.Series): - y_train = y_train.values - - self.model.fit(X_train, y_train, k=k, batch_size=batch_size, epochs=epochs, verbose=0) - self.training_sample_size = len(X_train) - print("[SUCCESS] Model training completed") - except Exception as e: - print(f"[ERROR] An error occurred during model training: {e}") - raise - - def generate(self, size=None): - if self.model is None: - raise RuntimeError("Model is not initialized. 
Call `load_model()` first.") - - try: - print("[INFO] Generating data using GANBLR++ model") - if size is None: - size = self.training_sample_size - - generated_data = self.model.sample(size, verbose=0) - if isinstance(generated_data, np.ndarray): - generated_data = pd.DataFrame(generated_data) - - print("[SUCCESS] Data generation completed") - return generated_data - except Exception as e: - print(f"[ERROR] An error occurred during data generation: {e}") - raise \ No newline at end of file diff --git a/katabatic/models/ganblrpp/kdb.py b/katabatic/models/ganblrpp/kdb.py deleted file mode 100644 index 03c080c..0000000 --- a/katabatic/models/ganblrpp/kdb.py +++ /dev/null @@ -1,378 +0,0 @@ -import numpy as np -#import networkx as nx - - -# Chris advice -# Code it twice -# K=0 -# Warmup run (no generation) -# Then run again with adding the edges. -# What is the advantage over KDB? -# focus on generation, not on structure learning -# ganblr ++ is cheating, as we learn the structure prior. -# Send to STM -from pyitlib import discrete_random_variable as drv -def build_graph(X, y, k=2): - ''' - kDB algorithm - - Param: - ---------------------- - X (array-like): discretized feature matrix of shape (n_samples, n_features) - y (array-like): label vector of shape (n_samples,) - k (int): maximum number of feature parents per node - - Return: - ---------------------- - graph edges - ''' - #ensure data - num_features = X.shape[1] - x_nodes = list(range(num_features)) - y_node = num_features - - #util func - _x = lambda i:X[:,i] - _x2comb = lambda i,j:(X[:,i], X[:,j]) - - #feature indexes desc sort by mutual information - sorted_feature_idxs = np.argsort([ - drv.information_mutual(_x(i), y) - for i in range(num_features) - ])[::-1] - - #start building graph - edges = [] - for iter, target_idx in enumerate(sorted_feature_idxs): - target_node = x_nodes[target_idx] - edges.append((y_node, target_node)) - - parent_candidate_idxs = sorted_feature_idxs[:iter] - if iter <= k: - for idx in parent_candidate_idxs: - edges.append((x_nodes[idx], target_node)) - else: - first_k_parent_mi_idxs = np.argsort([ - drv.information_mutual_conditional(*_x2comb(i, target_idx), y) -

for i in parent_candidate_idxs - ])[::-1][:k] - first_k_parent_idxs = parent_candidate_idxs[first_k_parent_mi_idxs] - - for parent_idx in first_k_parent_idxs: - edges.append((x_nodes[parent_idx], target_node)) - return edges - -# def draw_graph(edges): -# ''' -# Draw the graph -# -# Param -# ----------------- -# edges: edges of the graph -# -# ''' -# graph = nx.DiGraph(edges) -# pos=nx.spiral_layout(graph) -# nx.draw(graph, pos, node_color='r', edge_color='b') -# nx.draw_networkx_labels(graph, pos, font_size=20, font_family="sans-serif") - - -def get_cross_table(*cols, apply_wt=False): - ''' - author: alexland - - returns: - (i) xt, NumPy array storing the xtab results, number of dimensions is equal to - the len(args) passed in - (ii) unique_vals_all_cols, a tuple of 1D NumPy array for each dimension - in xt (for a 2D xtab, the tuple comprises the row and column headers) - pass in: - (i) 1 or more 1D NumPy arrays of integers - (ii) if wts is True, then the last array in cols is an array of weights - - if return_inverse=True, then np.unique also returns an integer index - (from 0, & of same len as array passed in) such that, uniq_vals[idx] gives the original array passed in - higher dimensional cross tabulations are supported (eg, 2D & 3D) - cross tabulation on two variables (columns): - >>> q1 = np.array([7, 8, 8, 8, 5, 6, 4, 6, 6, 8, 4, 6, 6, 6, 6, 8, 8, 5, 8, 6]) - >>> q2 = np.array([6, 4, 6, 4, 8, 8, 4, 8, 7, 4, 4, 8, 8, 7, 5, 4, 8, 4, 4, 4]) - >>> uv, xt = xtab(q1, q2) - >>> uv - (array([4, 5, 6, 7, 8]), array([4, 5, 6, 7, 8])) - >>> xt - array([[2, 0, 0, 0, 0], - [1, 0, 0, 0, 1], - [1, 1, 0, 2, 4], - [0, 0, 1, 0, 0], - [5, 0, 1, 0, 1]], dtype=uint64) - ''' - if not all(len(col) == len(cols[0]) for col in cols[1:]): - raise ValueError("all arguments must be same size") - - if len(cols) == 0: - raise TypeError("xtab() requires at least one argument") - - fnx1 = lambda q: len(q.squeeze().shape) - if not all([fnx1(col) == 1 for col in cols]): - raise 
ValueError("all input arrays must be 1D") - - if apply_wt: - cols, wt = cols[:-1], cols[-1] - else: - wt = 1 - - uniq_vals_all_cols, idx = zip( *(np.unique(col, return_inverse=True) for col in cols) ) - shape_xt = [uniq_vals_col.size for uniq_vals_col in uniq_vals_all_cols] - dtype_xt = 'float' if apply_wt else 'uint' - xt = np.zeros(shape_xt, dtype=dtype_xt) - np.add.at(xt, idx, wt) - return uniq_vals_all_cols, xt - -def _get_dependencies_without_y(variables, y_name, kdb_edges): - ''' - evidences of each variable without y. - - Param: - -------------- - variables: variable names - - y_name: class name - - kdb_edges: list of tuple (source, target) - ''' - dependencies = {} - kdb_edges_without_y = [edge for edge in kdb_edges if edge[0] != y_name] - mi_desc_order = {t:i for i,(s,t) in enumerate(kdb_edges) if s == y_name} - for x in variables: - current_dependencies = [s for s,t in kdb_edges_without_y if t == x] - if len(current_dependencies) >= 2: - sort_dict = {t:mi_desc_order[t] for t in current_dependencies} - dependencies[x] = sorted(sort_dict) - else: - dependencies[x] = current_dependencies - return dependencies - -def _add_uniform(array, noise=1e-5): - ''' - if no count on particular condition for any feature, give a uniform prob rather than leave 0 - ''' - sum_by_col = np.sum(array,axis=0) - zero_idxs = (array == 0).astype(int) - # zero_count_by_col = np.sum(zero_idxs,axis=0) - nunique = array.shape[0] - result = np.zeros_like(array, dtype='float') - for i in range(array.shape[1]): - if sum_by_col[i] == 0: - result[:,i] = array[:,i] + 1./nunique - elif noise != 0: - result[:,i] = array[:,i] + noise * zero_idxs[:,i] - else: - result[:,i] = array[:,i] - return result - -def _normalize_by_column(array): - sum_by_col = np.sum(array,axis=0) - return np.divide(array, sum_by_col, - out=np.zeros_like(array,dtype='float'), - where=sum_by_col !=0) - -def _smoothing(cct, d): - ''' - probability smoothing for kdb - - Parameters: - ----------- - cct (np.ndarray): cross 
count table with shape (x0, *parents) - - d (int): dimension of cct - - Return: - -------- - smoothed joint prob table - ''' - #convert cross-count-table to joint-prob-table by normalizing along axis 0 - jpt = _normalize_by_column(cct) - smoothing_idx = jpt == 0 - if d > 1 and np.sum(smoothing_idx) > 0: - parent = cct.sum(axis=-1) - parent = _smoothing(parent, d-1) - parent_extend = parent.repeat(jpt.shape[-1]).reshape(jpt.shape) - jpt[smoothing_idx] = parent_extend[smoothing_idx] - return jpt - -def get_high_order_feature(X, col, evidence_cols, feature_uniques): - ''' - encode the high-order feature of X[col] given evidence X[evidence_cols]. - ''' - if evidence_cols is None or len(evidence_cols) == 0: - return X[:,[col]] - else: - evidences = [X[:,_col] for _col in evidence_cols] - - #[1, variable_unique, evidence_unique] - base = [1, feature_uniques[col]] + [feature_uniques[_col] for _col in evidence_cols[::-1][:-1]] - cum_base = np.cumprod(base)[::-1] - - cols = evidence_cols + [col] - high_order_feature = np.sum(X[:,cols] * cum_base, axis=1).reshape(-1,1) - return high_order_feature - -def get_high_order_constraints(X, col, evidence_cols, feature_uniques): - ''' - find the constraint information for the high-order feature X[col] given evidence X[evidence_cols]. - - Returns: - --------------------- - tuple(have_value, high_order_uniques) - - have_value: a (k+1)-dimensional numpy ndarray of type boolean. - Each dimension corresponds to a variable, in the order (*evidence_cols, col). - True indicates the corresponding combination of variable values can be found in the dataset; - False indicates it cannot. - - high_order_constraints: a 1d numpy ndarray of type int. - Each number `c` indicates that the constraints should be applied to the `c` columns since the last constraint position (or index 0), - in sequence.
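The back-off recursion in `_smoothing` can be illustrated with a small self-contained sketch (the function names and the toy count table below are illustrative; `normalize_by_column` mirrors `_normalize_by_column`):

```python
import numpy as np

# Self-contained sketch of the back-off smoothing in _smoothing: zero cells
# in a conditional probability table are filled from the table one
# conditioning level up, computed recursively.
def normalize_by_column(a):
    s = a.sum(axis=0)
    return np.divide(a, s, out=np.zeros_like(a, dtype=float), where=s != 0)

def smooth(cct, d):
    jpt = normalize_by_column(cct)
    zeros = jpt == 0
    if d > 1 and zeros.any():
        parent = smooth(cct.sum(axis=-1), d - 1)
        jpt[zeros] = parent.repeat(jpt.shape[-1]).reshape(jpt.shape)[zeros]
    return jpt

# The middle column has no counts at all, so its probabilities fall back
# to the marginals 3/8 and 5/8 computed one level up.
smoothed = smooth(np.array([[2, 0, 1], [2, 0, 3]]), d=2)
```

After smoothing, every column is a proper probability distribution even where the original cross-count table was empty.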
- - ''' - if evidence_cols is None or len(evidence_cols) == 0: - unique = feature_uniques[col] - return np.ones(unique,dtype=bool), np.array([unique]) - else: - cols = evidence_cols + [col] - cross_table_idxs, cross_table = get_cross_table(*[X[:,i] for i in cols]) - have_value = cross_table != 0 - - have_value_reshape = have_value.reshape(-1,have_value.shape[-1]) - #have_value_split = np.split(have_value_reshape, have_value_reshape.shape[0], 0) - high_order_constraints = np.sum(have_value_reshape, axis=-1) - - return have_value, high_order_constraints - -class KdbHighOrderFeatureEncoder: - ''' - High order feature encoder that uses the kdb model to retrieve the dependencies between features. - - ''' - def __init__(self): - self.dependencies_ = {} - self.constraints_ = np.array([]) - self.have_value_idxs_ = [] - self.feature_uniques_ = [] - self.high_order_feature_uniques_ = [] - self.edges_ = [] - self.ohe_ = None - self.k = None - #self.full_=True - - def fit(self, X, y, k=0): - ''' - Fit the KdbHighOrderFeatureEncoder to X, y. - - Parameters - ---------- - X : array_like of shape (n_samples, n_features) - data to fit in the encoder. - - y : array_like of shape (n_samples,) - label to fit in the encoder. - - k : int, default=0 - k value of the order of the high-order feature. k = 0 will lead to a OneHotEncoder. - - Returns - ------- - self : object - Fitted encoder. 
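The structure learning that `fit` delegates to `build_graph` can be sketched without the pyitlib dependency. Here `build_edges`, `mi`, and `cmi` are hypothetical stand-ins for the real function and for pyitlib's `information_mutual` / `information_mutual_conditional` scorers:

```python
import numpy as np

# Dependency-free sketch of the edge-building loop in build_graph (kdb.py):
# features are ranked by mutual information with the class, and each feature
# receives at most k of the higher-ranked features as extra parents.
def build_edges(num_features, mi, cmi, k=2):
    y_node = num_features                      # class node is indexed last
    order = np.argsort([mi(i) for i in range(num_features)])[::-1]
    edges = []
    for step, target in enumerate(order):
        target = int(target)
        edges.append((y_node, target))         # class -> feature edge, always
        candidates = order[:step]              # only higher-MI features qualify
        if step <= k:
            parents = candidates
        else:
            best = np.argsort([cmi(i, target) for i in candidates])[::-1][:k]
            parents = candidates[best]
        edges.extend((int(p), target) for p in parents)
    return edges

# Toy scores: feature 2 is the most informative, then 1, then 0; with k=1
# each feature may take one extra parent beyond the class node.
edges = build_edges(3, mi=lambda i: i, cmi=lambda i, t: i, k=1)
```

With k=0 every feature's only parent is the class node, which is why `fit(..., k=0)` degenerates to plain one-hot encoding.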
- ''' - self.k = k - edges = build_graph(X, y, k) - #n_classes = len(np.unique(y)) - num_features = X.shape[1] - - if k > 0: - dependencies = _get_dependencies_without_y(list(range(num_features)), num_features, edges) - else: - dependencies = {x:[] for x in range(num_features)} - - self.dependencies_ = dependencies - self.feature_uniques_ = [len(np.unique(X[:,i])) for i in range(num_features)] - self.edges_ = edges - #self.full_ = full - - Xk, constraints, have_value_idxs = self.transform(X, return_constraints=True, use_ohe=False) - - from sklearn.preprocessing import OneHotEncoder - self.ohe_ = OneHotEncoder().fit(Xk) - self.high_order_feature_uniques_ = [len(c) for c in self.ohe_.categories_] - self.constraints_ = constraints - self.have_value_idxs_ = have_value_idxs - return self - - def transform(self, X, return_constraints=False, use_ohe=True): - """ - Transform X to the high-order features. - - Parameters - ---------- - X : array_like of shape (n_samples, n_features) - Data to fit in the encoder. - - return_constraints : bool, default=False - Whether to return the constraint informations. - - use_ohe : bool, default=True - Whether to transform output to one-hot format. - - Returns - ------- - X_out : ndarray of shape (n_samples, n_encoded_features) - Transformed input. 
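`transform` folds each feature together with its kdb parents into a single code via `get_high_order_feature`. The mixed-radix arithmetic can be checked in isolation (toy data; the function below is an assumed dependency-free mirror of the one defined earlier in this file):

```python
import numpy as np

# Minimal sketch of the mixed-radix folding used by get_high_order_feature:
# a feature column and its evidence (parent) columns are combined into one
# integer per row, so each distinct (evidence..., value) combination maps
# to a distinct code.
def high_order_feature(X, col, evidence_cols, feature_uniques):
    if not evidence_cols:
        return X[:, [col]]
    base = [1, feature_uniques[col]] + [feature_uniques[c] for c in evidence_cols[::-1][:-1]]
    cum_base = np.cumprod(base)[::-1]
    cols = evidence_cols + [col]
    return np.sum(X[:, cols] * cum_base, axis=1).reshape(-1, 1)

X = np.array([[0, 1, 2],
              [1, 0, 1],
              [1, 1, 0]])
# col 2 has 3 levels, its evidence cols 0 and 1 have 2 each, so the
# place values are [6, 3, 1] and codes range over 0..11.
codes = high_order_feature(X, col=2, evidence_cols=[0, 1], feature_uniques=[2, 2, 3])
```

The codes are then ordinal-encoded and, by default, one-hot encoded downstream.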
- """ - Xk = [] - have_value_idxs = [] - constraints = [] - for k, v in self.dependencies_.items(): - xk = get_high_order_feature(X, k, v, self.feature_uniques_) - Xk.append(xk) - - if return_constraints: - idx, constraint = get_high_order_constraints(X, k, v, self.feature_uniques_) - have_value_idxs.append(idx) - constraints.append(constraint) - - Xk = np.hstack(Xk) - from sklearn.preprocessing import OrdinalEncoder - Xk = OrdinalEncoder().fit_transform(Xk) - if use_ohe: - Xk = self.ohe_.transform(Xk) - - if return_constraints: - concated_constraints = np.hstack(constraints) - return Xk, concated_constraints, have_value_idxs - else: - return Xk - - def fit_transform(self, X, y, k=0, return_constraints=False): - ''' - Fit KdbHighOrderFeatureEncoder to X, y, then transform X. - - Equivalent to fit(X, y, k).transform(X, return_constraints) but more convenient. - - Parameters - ---------- - X : array_like of shape (n_samples, n_features) - data to fit in the encoder. - - y : array_like of shape (n_samples,) - label to fit in the encoder. - - k : int, default=0 - k value of the kdb model. k = 0 will lead to a OneHotEncoder. - - return_constraints : bool, default=False - whether to return the constraint informations. - - Returns - ------- - X_out : ndarray of shape (n_samples, n_encoded_features) - Transformed input. 
- ''' - return self.fit(X, y, k).transform(X, return_constraints) \ No newline at end of file diff --git a/katabatic/models/ganblrpp/utils.py b/katabatic/models/ganblrpp/utils.py deleted file mode 100644 index 3502866..0000000 --- a/katabatic/models/ganblrpp/utils.py +++ /dev/null @@ -1,167 +0,0 @@ -import numpy as np -import tensorflow as tf -from tensorflow.python.ops import math_ops -from pandas import read_csv -from sklearn.preprocessing import OneHotEncoder -from .kdb import KdbHighOrderFeatureEncoder - -from tensorflow.python.ops import math_ops -import numpy as np -import tensorflow as tf - -class softmax_weight(tf.keras.constraints.Constraint): - """Constrains weight tensors to be under softmax `.""" - - def __init__(self,feature_uniques): - if isinstance(feature_uniques, np.ndarray): - idxs = math_ops.cumsum(np.hstack([np.array([0]),feature_uniques])) - else: - idxs = math_ops.cumsum([0] + feature_uniques) - idxs = [i.numpy() for i in idxs] - self.feature_idxs = [ - (idxs[i],idxs[i+1]) for i in range(len(idxs)-1) - ] - - def __call__(self, w): - w_new = [ - math_ops.log(tf.nn.softmax(w[i:j,:], axis=0)) - for i,j in self.feature_idxs - ] - return tf.concat(w_new, 0) - - def get_config(self): - return {'feature_idxs': self.feature_idxs} - -def elr_loss(KL_LOSS): - def loss(y_true, y_pred): - return tf.keras.losses.sparse_categorical_crossentropy(y_true, y_pred)+ KL_LOSS - return loss - -def KL_loss(prob_fake): - return np.mean(-np.log(np.subtract(1,prob_fake))) - -def get_lr(input_dim, output_dim, constraint=None,KL_LOSS=0): - model = tf.keras.Sequential() - model.add(tf.keras.layers.Dense(output_dim, input_dim=input_dim, activation='softmax',kernel_constraint=constraint)) - model.compile(loss=elr_loss(KL_LOSS), optimizer='adam', metrics=['accuracy']) - #log_elr = model.fit(*train_data, validation_data=test_data, batch_size=batch_size,epochs=epochs) - return model - -def sample(*arrays, n=None, frac=None, random_state=None): - ''' - generate sample random 
arrays from given arrays. The given arrays must be same size. - - Parameters: - -------------- - *arrays: arrays to be sampled. - - n (int): Number of random samples to generate. - - frac: Float value between 0 and 1, Returns (float value * length of given arrays). frac cannot be used with n. - - random_state: int value or numpy.random.RandomState, optional. if set to a particular integer, will return same samples in every iteration. - - Return: - -------------- - the sampled array(s). Passing in multiple arrays will result in the return of a tuple. - - ''' - random = np.random - if isinstance(random_state, int): - random = random.RandomState(random_state) - elif isinstance(random_state, np.random.RandomState): - random = random_state - - arr0 = arrays[0] - original_size = len(arr0) - if n == None and frac == None: - raise Exception('You must specify one of frac or size.') - if n == None: - n = int(len(arr0) * frac) - - idxs = random.choice(original_size, n, replace=False) - if len(arrays) > 1: - sampled_arrays = [] - for arr in arrays: - assert(len(arr) == original_size) - sampled_arrays.append(arr[idxs]) - return tuple(sampled_arrays) - else: - return arr0[idxs] - -DEMO_DATASETS = { - 'adult': { - 'link':'https://raw.githubusercontent.com/chriszhangpodo/discretizedata/main/adult-dm.csv', - 'params': { - 'dtype' : int - } - }, - 'adult-raw':{ - 'link':'https://drive.google.com/uc?export=download&id=1iA-_qIC1xKQJ4nL2ugX1_XJQf8__xOY0', - 'params': {} - } -} - -def get_demo_data(name='adult'): - """ - Download demo dataset from internet. - - Parameters - ---------- - name : str - Name of dataset. Should be one of ['adult', 'adult-raw']. - - Returns - ------- - data : pandas.DataFrame - the demo dataset. 
- """ - assert(name in DEMO_DATASETS.keys()) - return read_csv(DEMO_DATASETS[name]['link'], **DEMO_DATASETS[name]['params']) - -from .kdb import KdbHighOrderFeatureEncoder -from sklearn.preprocessing import OneHotEncoder -from pandas import read_csv -import numpy as np - -class DataUtils: - """ - useful data utils for the preparation before training. - """ - def __init__(self, x, y): - self.x = x - self.y = y - self.data_size = len(x) - self.num_features = x.shape[1] - - yunique, ycounts = np.unique(y, return_counts=True) - self.num_classes = len(yunique) - self.class_counts = ycounts - self.feature_uniques = [len(np.unique(x[:,i])) for i in range(self.num_features)] - - self.constraint_positions = None - self._kdbe = None - - self.__kdbe_x = None - - def get_categories(self, idxs=None): - if idxs != None: - return [self._kdbe.ohe_.categories_[i] for i in idxs] - return self._kdbe.ohe_.categories_ - - def get_kdbe_x(self, k=0, dense_format=True) -> np.ndarray: - if self.__kdbe_x is not None: - return self.__kdbe_x - if self._kdbe == None: - self._kdbe = KdbHighOrderFeatureEncoder() - self._kdbe.fit(self.x, self.y, k=k) - kdbex = self._kdbe.transform(self.x) - if dense_format: - kdbex = kdbex.todense() - self.__kdbe_x = kdbex - self.constraint_positions = self._kdbe.constraints_ - return kdbex - - def clear(self): - self._kdbe = None - self.__kdbe_x = None \ No newline at end of file diff --git a/katabatic/models/medgan/medgan.py b/katabatic/models/medgan/medgan.py deleted file mode 100644 index 02282ba..0000000 --- a/katabatic/models/medgan/medgan.py +++ /dev/null @@ -1,311 +0,0 @@ -import numpy as np -import tensorflow as tf -from sklearn.model_selection import train_test_split -from sklearn.metrics import roc_auc_score -from sklearn.preprocessing import OrdinalEncoder, LabelEncoder -import warnings - -warnings.filterwarnings("ignore") - - -class MedGAN: - """ - The MedGAN Model for generating synthetic tabular data. 
- """ - - def __init__(self) -> None: - self.input_dim = None - self.embedding_dim = None - self.random_dim = None - self.generator_dims = None - self.discriminator_dims = None - self.compress_dims = None - self.decompress_dims = None - self.bn_decay = None - self.l2_scale = None - self.data_type = None - self.autoencoder = None - self.generator = None - self._ordinal_encoder = OrdinalEncoder( - dtype=int, handle_unknown="use_encoded_value", unknown_value=-1 - ) - self._label_encoder = LabelEncoder() - - def fit( - self, - data, - input_dim, - embedding_dim=128, - random_dim=128, - generator_dims=(128, 128), - discriminator_dims=(256, 128, 1), - compress_dims=(), - decompress_dims=(), - bn_decay=0.99, - l2_scale=0.001, - data_type="binary", - epochs=10, - batch_size=32, - pretrain_epochs=1, - verbose=1, - ) -> None: - """ - Fit the MedGAN model to the data. - - Parameters - ---------- - data : np.ndarray - The input data for training, in tabular format. - - input_dim : int - The dimensionality of the input data. - - embedding_dim : int, default=128 - The dimensionality of the embedding vector. - - random_dim : int, default=128 - The dimensionality of the random noise vector. - - generator_dims : tuple, default=(128, 128) - The dimensions of the generator layers. - - discriminator_dims : tuple, default=(256, 128, 1) - The dimensions of the discriminator layers. - - compress_dims : tuple, default=() - The dimensions of the compression layers in the autoencoder. - - decompress_dims : tuple, default=() - The dimensions of the decompression layers in the autoencoder. - - bn_decay : float, default=0.99 - Decay rate for batch normalization. - - l2_scale : float, default=0.001 - L2 regularization scale. - - data_type : str, default='binary' - Type of data ('binary' or 'count'). - - batch_size : int, default=32 - Batch size for training. - - epochs : int, default=10 - Number of epochs for training. 
- - pretrain_epochs : int, default=1 - Number of epochs for pre-training the autoencoder. - - verbose : int, default=1 - Verbosity mode (0 = silent, 1 = progress bar). - """ - self.input_dim = input_dim - self.embedding_dim = embedding_dim - self.random_dim = random_dim - self.generator_dims = list(generator_dims) + [embedding_dim] - self.discriminator_dims = discriminator_dims - self.compress_dims = list(compress_dims) + [embedding_dim] - self.decompress_dims = list(decompress_dims) + [input_dim] - self.bn_decay = bn_decay - self.l2_scale = l2_scale - self.data_type = data_type - - print(f"[INFO] Data shape: {data.shape}") - data = self._ordinal_encoder.fit_transform(data) - print(f"[INFO] Encoded data shape: {data.shape}") - - trainX, validX = train_test_split(data, test_size=0.1, random_state=0) - print(f"[INFO] TrainX shape: {trainX.shape}, ValidX shape: {validX.shape}") - - if verbose: - print("Pretraining autoencoder...") - self._pretrain_autoencoder(trainX, pretrain_epochs, batch_size, verbose) - - if verbose: - print("Training GAN...") - self._train_gan(trainX, validX, epochs, batch_size, verbose) - - def _pretrain_autoencoder(self, trainX, pretrain_epochs, batch_size, verbose): - """ - Pretrain the autoencoder part of MedGAN. 
- """ - tf.keras.backend.clear_session() - input_data = tf.keras.Input(shape=(self.input_dim,)) - x = input_data - print(f"[INFO] Autoencoder input shape: {input_data.shape}") - - # Encoder - for i, dim in enumerate(self.compress_dims[:-1]): - x = tf.keras.layers.Dense(dim, activation=self._get_activation())(x) - print(f"[INFO] After Encoder Dense layer {i}, shape: {x.shape}") - - encoded = tf.keras.layers.Dense( - self.compress_dims[-1], activation=self._get_activation() - )(x) - print(f"[INFO] Encoded layer shape: {encoded.shape}") - - # Decoder - x = encoded - for i, dim in enumerate(self.decompress_dims[:-1]): - x = tf.keras.layers.Dense(dim, activation=self._get_activation())(x) - print(f"[INFO] After Decoder Dense layer {i}, shape: {x.shape}") - - decoded = tf.keras.layers.Dense( - self.input_dim, - activation="sigmoid" if self.data_type == "binary" else "relu", - )(x) - print(f"[INFO] Decoded output shape: {decoded.shape}") - - autoencoder = tf.keras.models.Model(inputs=input_data, outputs=decoded) - autoencoder.compile( - optimizer="adam", - loss="binary_crossentropy" if self.data_type == "binary" else "mse", - ) - - autoencoder.fit( - trainX, - trainX, - batch_size=batch_size, - epochs=pretrain_epochs, - verbose=verbose, - ) - self.autoencoder = autoencoder - tf.keras.backend.clear_session() - - def _train_gan(self, trainX, validX, epochs, batch_size, verbose): - """ - Train the GAN component of MedGAN. 
- """ - tf.keras.backend.clear_session() - - # Build the Generator - self.generator = self._build_generator() - print(f"[INFO] Generator model summary:") - self.generator.summary() # Ensure full summary is printed - - # Build the Discriminator - discriminator = self._build_discriminator() - discriminator.compile( - optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"] - ) - print(f"[INFO] Discriminator model summary:") - discriminator.summary() # Ensure full summary is printed - - # Build and compile the GAN - random_input = tf.keras.Input(shape=(self.random_dim,)) - generated_data = self.generator(random_input) - discriminator.trainable = False - gan_output = discriminator(generated_data) - gan = tf.keras.models.Model(inputs=random_input, outputs=gan_output) - gan.compile(optimizer="adam", loss="binary_crossentropy") - print(f"[INFO] GAN model summary:") - gan.summary() # Ensure full summary is printed - - for epoch in range(epochs): - # Generate synthetic data - random_noise = np.random.normal(size=(trainX.shape[0], self.random_dim)) - generated_data = self.generator.predict(random_noise) - print(f"[INFO] Generated data shape: {generated_data.shape}") - - # Train Discriminator - real_labels = np.ones((trainX.shape[0], 1)) - fake_labels = np.zeros((trainX.shape[0], 1)) - d_loss_real = discriminator.train_on_batch(trainX, real_labels) - d_loss_fake = discriminator.train_on_batch(generated_data, fake_labels) - - # Train Generator - misleading_labels = np.ones((trainX.shape[0], 1)) - g_loss = gan.train_on_batch(random_noise, misleading_labels) - - if verbose: - try: - # Attempt to format the loss values correctly - d_loss_real_str = f"{d_loss_real[0]:.4f}" - d_loss_fake_str = f"{d_loss_fake[0]:.4f}" - g_loss_str = f"{g_loss:.4f}" - print( - f"Epoch {epoch+1}/{epochs}: D_loss_real = {d_loss_real_str}, D_loss_fake = {d_loss_fake_str}, G_loss = {g_loss_str}" - ) - except TypeError as e: - print(f"[ERROR] TypeError during logging: {e}") - print( - f"[DEBUG] 
d_loss_real: {d_loss_real}, d_loss_fake: {d_loss_fake}, g_loss: {g_loss}" - ) - - tf.keras.backend.clear_session() - - def _build_generator(self): - """ - Build the generator model. - """ - model = tf.keras.Sequential() - model.add(tf.keras.layers.InputLayer(input_shape=(self.random_dim,))) - print(f"[INFO] Generator InputLayer shape: {(self.random_dim,)}") - for i, dim in enumerate(self.generator_dims[:-1]): - model.add(tf.keras.layers.Dense(dim, activation="relu")) - print(f"[INFO] After Generator Dense layer {i}, shape: {dim}") - model.add( - tf.keras.layers.Dense( - self.input_dim, - activation="tanh" if self.data_type == "binary" else "relu", - ) - ) - print(f"[INFO] Generator output shape: {self.input_dim}") - return model - - def _build_discriminator(self): - """ - Build the discriminator model. - """ - model = tf.keras.Sequential() - model.add(tf.keras.layers.InputLayer(input_shape=(self.input_dim,))) - print(f"[INFO] Discriminator InputLayer shape: {(self.input_dim,)}") - for i, dim in enumerate(self.discriminator_dims[:-1]): - model.add(tf.keras.layers.Dense(dim, activation="relu")) - print(f"[INFO] After Discriminator Dense layer {i}, shape: {dim}") - model.add(tf.keras.layers.Dense(1, activation="sigmoid")) - print(f"[INFO] Discriminator output shape: 1") - return model - - def _generate_samples(self, n_samples, verbose): - """ - Generate synthetic samples using the trained generator. - """ - random_noise = np.random.normal(size=(n_samples, self.random_dim)) - synthetic_data = self.generator.predict(random_noise, verbose=verbose) - print(f"[INFO] Generated samples shape: {synthetic_data.shape}") - return synthetic_data - - def _get_activation(self): - """ - Get the appropriate activation function for the autoencoder. - """ - return tf.nn.tanh if self.data_type == "binary" else tf.nn.relu - - def sample(self, n_samples=100, verbose=0) -> np.ndarray: - """ - Generate synthetic data using the trained generator. 
- - Parameters - ---------- - n_samples : int, default=100 - Number of synthetic samples to generate. - - verbose : int, default=0 - Verbosity mode. - - Returns - ------- - np.ndarray - Generated synthetic data. - """ - if not hasattr(self, "generator"): - raise AttributeError( - "The generator model must be trained before sampling data." - ) - - # Generate synthetic samples - random_noise = np.random.normal(size=(n_samples, self.random_dim)) - synthetic_data = self.generator.predict(random_noise, verbose=verbose) - print(f"[INFO] Generated samples shape: {synthetic_data.shape}") - return synthetic_data diff --git a/katabatic/models/medgan/medgan_adapter.py b/katabatic/models/medgan/medgan_adapter.py deleted file mode 100644 index d09f1dc..0000000 --- a/katabatic/models/medgan/medgan_adapter.py +++ /dev/null @@ -1,234 +0,0 @@ -from katabatic.katabatic_spi import KatabaticModelSPI -import pandas as pd -from sklearn.preprocessing import OrdinalEncoder, LabelEncoder -from .medgan import MedGAN - - -class MedganAdapter(KatabaticModelSPI): - """ - Adapter class for MedGAN model to interface with KatabaticModelSPI. - - Attributes: - type (str): The type of model, either 'discrete' or 'continuous'. - batch_size (int): The batch size for training the model. - epochs (int): The number of epochs for training the model. - training_sample_size (int): The size of the training sample. - """ - - def __init__(self, type="discrete"): - """ - Initialize the MedGAN Adapter with the specified type. - - Args: - type (str): The type of model, either 'discrete' or 'continuous'. Default is 'discrete'. - """ - self.type = type - self.batch_size = None - self.epochs = None - self.training_sample_size = None - self.model = None - self.feature_encoder = None - self.label_encoder = None - - def load_model(self): - """ - Initialize and load the MedGAN model. - - Returns: - MedGAN: An instance of the MedGAN model. 
- """ - print("[INFO] Initializing MedGAN Model") - self.model = MedGAN() - self.training_sample_size = 0 - return self.model - - def load_data(self, data_pathname): - """ - Load data from the specified pathname. - - Args: - data_pathname (str): The path to the data file. - - Returns: - pd.DataFrame: The loaded data as a pandas DataFrame. - - Raises: - FileNotFoundError: If the file specified by data_pathname is not found. - pd.errors.EmptyDataError: If the file is empty. - pd.errors.ParserError: If there is a parsing error while reading the file. - Exception: For any other exceptions that occur during data loading. - """ - print(f"[INFO] Loading data from {data_pathname}") - try: - data = pd.read_csv(data_pathname) - print("[SUCCESS] Data loaded successfully.") - return data - except FileNotFoundError: - print(f"[ERROR] File '{data_pathname}' not found.") - except pd.errors.EmptyDataError: - print("[ERROR] The file is empty.") - except pd.errors.ParserError: - print("[ERROR] Parsing error while reading the file.") - except Exception as e: - print(f"[ERROR] An unexpected error occurred: {e}") - return None - - def preprocess_data(self, X, y): - """ - Preprocess the data to encode categorical variables. - - Args: - X (pd.DataFrame): The input features. - y (pd.Series or np.ndarray): The labels. - - Returns: - np.ndarray, np.ndarray: The preprocessed features and labels. - """ - print("[INFO] Preprocessing data...") - - # Encode features - self.feature_encoder = OrdinalEncoder() - X_processed = self.feature_encoder.fit_transform(X) - - # Encode labels - if y is not None: - self.label_encoder = LabelEncoder() - y_processed = self.label_encoder.fit_transform(y) - else: - y_processed = None - - print("[SUCCESS] Data preprocessing completed.") - return X_processed, y_processed - - def postprocess_data(self, X, y=None): - """ - Post-process the generated data to decode categorical variables. - - Args: - X (np.ndarray): The generated features. 
- y (np.ndarray, optional): The generated labels. - - Returns: - pd.DataFrame: The post-processed features and labels. - """ - print("[INFO] Post-processing generated data...") - - # Decode features - X_decoded = self.feature_encoder.inverse_transform(X) - - # Decode labels - if y is not None: - y_decoded = self.label_encoder.inverse_transform(y) - print("[SUCCESS] Data post-processing completed.") - return pd.DataFrame( - X_decoded, columns=self.feature_encoder.feature_names_in_ - ), pd.Series(y_decoded) - else: - print("[SUCCESS] Data post-processing completed.") - return pd.DataFrame( - X_decoded, columns=self.feature_encoder.feature_names_in_ - ) - - def fit( - self, - X_train, - y_train=None, - embedding_dim=128, - random_dim=128, - generator_dims=(128, 128), - discriminator_dims=(256, 128, 1), - compress_dims=(), - decompress_dims=(), - bn_decay=0.99, - l2_scale=0.001, - data_type="binary", - epochs=10, - batch_size=64, - ): - """ - Train the MedGAN model on the input data. - - Args: - X_train (pd.DataFrame or np.ndarray): Training features. - y_train (pd.Series or np.ndarray, optional): Training labels (not used in MedGAN). - embedding_dim (int): Dimension of the embedding vector. - random_dim (int): Dimension of the random noise vector. - generator_dims (tuple): Dimensions of the generator layers. - discriminator_dims (tuple): Dimensions of the discriminator layers. - compress_dims (tuple): Dimensions of the compression layers in the autoencoder. - decompress_dims (tuple): Dimensions of the decompression layers in the autoencoder. - bn_decay (float): Decay rate for batch normalization. - l2_scale (float): L2 regularization scale. - data_type (str): Type of data ('binary' or 'count'). - epochs (int): The number of epochs for training. Default is 10. - batch_size (int): The batch size for training. Default is 64. - - Raises: - ValueError: If there is a value error such as wrong input shape. - TypeError: If there is a type error such as wrong data type. 
- Exception: For any other exceptions that occur during training. - """ - try: - print("[INFO] Preprocessing and training MedGAN model") - X_train_processed, y_train_processed = self.preprocess_data( - X_train, y_train - ) - input_dim = X_train_processed.shape[1] - self.model.fit( - data=X_train_processed, - input_dim=input_dim, - embedding_dim=embedding_dim, - random_dim=random_dim, - generator_dims=generator_dims, - discriminator_dims=discriminator_dims, - compress_dims=compress_dims, - decompress_dims=decompress_dims, - bn_decay=bn_decay, - l2_scale=l2_scale, - data_type=data_type, - epochs=epochs, - batch_size=batch_size, - verbose=1, - ) - self.training_sample_size = len(X_train) - print("[SUCCESS] Model training completed") - except ValueError as e: - print(f"[ERROR] ValueError during model training: {e}") - except TypeError as e: - print(f"[ERROR] TypeError during model training: {e}") - except Exception as e: - print(f"[ERROR] An unexpected error occurred during model training: {e}") - - def generate(self, size=None): - """ - Generate data using the MedGAN model. - - Args: - size (int): The number of samples to generate. If not specified, defaults to the training sample size. - - Returns: - pd.DataFrame: The generated data. - - Raises: - ValueError: If there is a value error such as invalid size. - TypeError: If there is a type error such as wrong data type for size. - AttributeError: If the model does not have a sample method. - Exception: For any other exceptions that occur during data generation. 
- """ - try: - print("[INFO] Generating data using MedGAN model") - if size is None: - size = self.training_sample_size - - generated_data = self.model.sample(size, verbose=0) - generated_data_decoded = self.postprocess_data(generated_data) - print("[SUCCESS] Data generation completed") - return generated_data_decoded - except ValueError as e: - print(f"[ERROR] ValueError during data generation: {e}") - except TypeError as e: - print(f"[ERROR] TypeError during data generation: {e}") - except AttributeError as e: - print(f"[ERROR] AttributeError during data generation: {e}") - except Exception as e: - print(f"[ERROR] An unexpected error occurred during data generation: {e}") diff --git a/katabatic/models/meg/MEG_Adapter.py b/katabatic/models/meg/MEG_Adapter.py deleted file mode 100644 index 772ce5d..0000000 --- a/katabatic/models/meg/MEG_Adapter.py +++ /dev/null @@ -1,182 +0,0 @@ -from katabatic.katabatic_spi import KatabaticModelSPI -import pandas as pd -import numpy as np -from sklearn.preprocessing import LabelEncoder -from .meg import MEG - -class MegAdapter(KatabaticModelSPI): - """ - Adapter class for the MEG model to interface with KatabaticModelSPI. - - Attributes: - type (str): The type of model, either 'discrete' or 'continuous'. - candidate_labels (list): The candidate labels for the model. - batch_size (int): The batch size for training the model. - epochs (int): The number of epochs for training the model. - training_sample_size (int): The size of the training sample. - """ - - def __init__(self, type="discrete", candidate_labels=None): - """ - Initialize the MEG Adapter with the specified type and candidate labels. - - Args: - type (str): The type of model, either 'discrete' or 'continuous'. Default is 'discrete'. - candidate_labels (list): The candidate labels for the model. Default is None. 
- """ - self._d = None - self.batch_size = None - self.epochs = None - self.k = None - self._units = None - self._candidate_labels = candidate_labels - self._column_names = None - self._weight = None - self._ordinal_encoder = None - self.training_sample_size = 0 - self.model = None - - def load_model(self): - """ - Initialize and load the MEG model. - - Returns: - MEG: An instance of the MEG model. - """ - print("[INFO] Initializing MEG Model") - self.model = MEG() - return self.model - - def preprocess_data(self, X, y): - """ - Preprocess data by converting categorical features and labels to numerical format. - - Args: - X (pd.DataFrame or np.ndarray): Features to preprocess. - y (pd.Series or np.ndarray): Labels to preprocess. - - Returns: - Tuple: (preprocessed_X, preprocessed_y) - """ - # Convert X to DataFrame if it's not already - if isinstance(X, np.ndarray): - if self._column_names is None: - self._column_names = [f"feature_{i}" for i in range(X.shape[1])] - X = pd.DataFrame(X, columns=self._column_names) - - print("[INFO] Initial X columns:", X.columns) - - # Handle categorical features - for col in X.select_dtypes(include=['object']).columns: - print(f"[INFO] Converting column '{col}' to numerical format") - X[col] = X[col].astype('category').cat.codes - - # Handle categorical labels - if y is not None: - if isinstance(y, np.ndarray): - y = pd.Series(y) - if y.dtype == 'object': - print(f"[INFO] Converting labels to numerical format") - y = y.astype('category').cat.codes - - return X, y - - def load_data(self, data_pathname): - """ - Load data from the specified pathname. - - Args: - data_pathname (str): The path to the data file. - - Returns: - pd.DataFrame: The loaded data as a pandas DataFrame. 
- """ - print(f"[INFO] Loading data from {data_pathname}") - try: - data = pd.read_csv(data_pathname) - print("[SUCCESS] Data loaded successfully.") - return data - except FileNotFoundError: - print(f"[ERROR] File '{data_pathname}' not found.") - except pd.errors.EmptyDataError: - print("[ERROR] The file is empty.") - except pd.errors.ParserError: - print("[ERROR] Parsing error while reading the file.") - except Exception as e: - print(f"[ERROR] An unexpected error occurred: {e}") - return None - - def fit(self, X_train, y_train=None, k=0, batch_size=32, epochs=10): - """ - Train the MEG model on the input data. - - Args: - X_train (pd.DataFrame or np.ndarray): Training features. - y_train (pd.Series or np.ndarray): Training labels. Default is None. - k (int): An optional parameter for the model's fit method. Default is 0. - batch_size (int): The batch size for training. Default is 32. - epochs (int): The number of epochs for training. Default is 10. - """ - if self.model is None: - print("[ERROR] Model is not loaded. Call 'load_model()' first.") - return - - # Ensure X_train and y_train are pandas DataFrames or Series - if isinstance(X_train, np.ndarray): - X_train = pd.DataFrame(X_train, columns=self._column_names) - if isinstance(y_train, np.ndarray): - y_train = pd.Series(y_train) - - # Preprocess data - X_train, y_train = self.preprocess_data(X_train, y_train) - - try: - print("[INFO] Training MEG model") - self.model.fit( - X_train, y_train, k, batch_size=batch_size, epochs=epochs, verbose=0 - ) - self.training_sample_size = len(X_train) - print("[SUCCESS] Model training completed") - except ValueError as e: - print(f"[ERROR] ValueError during model training: {e}") - except TypeError as e: - print(f"[ERROR] TypeError during model training: {e}") - except Exception as e: - print(f"[ERROR] An unexpected error occurred during model training: {e}") - - def generate(self, size=None): - """ - Generate data using the MEG model. 
- - Args: - size (int): The number of samples to generate. Defaults to the training sample size if not specified. - - Returns: - pd.DataFrame or np.ndarray: The generated data. - """ - if self.model is None: - print("[ERROR] Model is not loaded. Call 'load_model()' first.") - return None - - try: - print("[INFO] Generating data using MEG model") - if size is None: - size = self.training_sample_size - - generated_data = self.model.sample(size, verbose=0) - if isinstance(generated_data, np.ndarray): - generated_data = pd.DataFrame(generated_data, columns=self._column_names) - print("[SUCCESS] Data generation completed") - return generated_data - except ValueError as e: - print(f"[ERROR] ValueError during data generation: {e}") - except TypeError as e: - print(f"[ERROR] TypeError during data generation: {e}") - except AttributeError as e: - print(f"[ERROR] AttributeError during data generation: {e}") - except Exception as e: - print(f"[ERROR] An unexpected error occurred during data generation: {e}") - return None - - - diff --git a/katabatic/models/meg/ganblr.py b/katabatic/models/meg/ganblr.py deleted file mode 100644 index 06a8628..0000000 --- a/katabatic/models/meg/ganblr.py +++ /dev/null @@ -1,246 +0,0 @@ -from .kdb import * -from .kdb import _add_uniform -from .utils import * -from scipy.linalg import pinv -from pgmpy.models import BayesianNetwork -from pgmpy.sampling import BayesianModelSampling -from pgmpy.factors.discrete import TabularCPD -from sklearn.preprocessing import OrdinalEncoder, LabelEncoder -import numpy as np -import tensorflow as tf - - -class GANBLR: - """ - The GANBLR Model. 
- """ - def __init__(self) -> None: - self._d = None - self.__gen_weights = None - self.batch_size = None - self.epochs = None - self.k = None - self.constraints = None - self._ordinal_encoder = OrdinalEncoder(dtype=int, handle_unknown='use_encoded_value', unknown_value=-1) - self._label_encoder = LabelEncoder() - - def fit(self, x, y, k=0, batch_size=32, epochs=10, warmup_epochs=1, verbose=1): - ''' - Fit the model to the given data. - - Parameters - ---------- - x : array_like of shape (n_samples, n_features) - Dataset to fit the model. The data should be discrete. - - y : array_like of shape (n_samples,) - Label of the dataset. - - k : int, default=0 - Parameter k of the ganblr model. Must be greater than or equal to 0. No more than 2 is suggested. - - batch_size : int, default=32 - Size of the batch to feed the model at each step. - - epochs : int, default=10 - Number of epochs to use during training. - - warmup_epochs : int, default=1 - Number of epochs to use in the warmup phase. Defaults to :attr:`1`. - - verbose : int, default=1 - Whether to output the log. Use 1 for log output and 0 for complete silence. - - Returns - ------- - self : object - Fitted model.
- ''' - if verbose is None or not isinstance(verbose, int): - verbose = 1 - x = self._ordinal_encoder.fit_transform(x) - y = self._label_encoder.fit_transform(y).astype(int) - d = DataUtils(x, y) - self._d = d - self.k = k - self.batch_size = batch_size - if verbose: - print(f"warmup run:") - history = self._warmup_run(warmup_epochs, verbose=verbose) - syn_data = self._sample(verbose=0) - discriminator_label = np.hstack([np.ones(d.data_size), np.zeros(d.data_size)]) - for i in range(epochs): - discriminator_input = np.vstack([x, syn_data[:,:-1]]) - disc_input, disc_label = sample(discriminator_input, discriminator_label, frac=0.8) - disc = self._discrim() - d_history = disc.fit(disc_input, disc_label, batch_size=batch_size, epochs=1, verbose=0).history - prob_fake = disc.predict(x, verbose=0) - ls = np.mean(-np.log(np.subtract(1, prob_fake))) - g_history = self._run_generator(loss=ls).history - syn_data = self._sample(verbose=0) - - if verbose: - print(f"Epoch {i+1}/{epochs}: G_loss = {g_history['loss'][0]:.6f}, G_accuracy = {g_history['accuracy'][0]:.6f}, D_loss = {d_history['loss'][0]:.6f}, D_accuracy = {d_history['accuracy'][0]:.6f}") - return self - - def evaluate(self, x, y, model='lr') -> float: - """ - Perform a TSTR(Training on Synthetic data, Testing on Real data) evaluation. - - Parameters - ---------- - x, y : array_like - Test dataset. - - model : str or object - The model used for evaluate. Should be one of ['lr', 'mlp', 'rf'], or a model class that have sklearn-style `fit` and `predict` method. - - Return: - -------- - accuracy_score : float. 
- - """ - from sklearn.linear_model import LogisticRegression - from sklearn.neural_network import MLPClassifier - from sklearn.ensemble import RandomForestClassifier - from sklearn.preprocessing import OneHotEncoder - from sklearn.pipeline import Pipeline - from sklearn.metrics import accuracy_score - - eval_model = None - models = dict( - lr=LogisticRegression, - rf=RandomForestClassifier, - mlp=MLPClassifier - ) - if model in models.keys(): - eval_model = models[model]() - elif hasattr(model, 'fit') and hasattr(model, 'predict'): - eval_model = model - else: - raise Exception("Invalid argument `model`. Should be one of ['lr', 'mlp', 'rf'], or a model class that has sklearn-style `fit` and `predict` methods.") - - synthetic_data = self._sample() - synthetic_x, synthetic_y = synthetic_data[:,:-1], synthetic_data[:,-1] - x_test = self._ordinal_encoder.transform(x) - y_test = self._label_encoder.transform(y) - - categories = self._d.get_categories() - pipeline = Pipeline([('encoder', OneHotEncoder(categories=categories, handle_unknown='ignore')), ('model', eval_model)]) - pipeline.fit(synthetic_x, synthetic_y) - pred = pipeline.predict(x_test) - return accuracy_score(y_test, pred) - - def sample(self, size=None, verbose=1) -> np.ndarray: - """ - Generate synthetic data. - - Parameters - ---------- - size : int or None - Size of the data to be generated. Set to `None` to make the size equal to the size of the training set. - - verbose : int, default=1 - Whether to output the log. Use 1 for log output and 0 for complete silence. - - Return: - ----------------- - synthetic_samples : np.ndarray - Generated synthetic data.
- """ - ordinal_data = self._sample(size, verbose) - origin_x = self._ordinal_encoder.inverse_transform(ordinal_data[:,:-1]) - origin_y = self._label_encoder.inverse_transform(ordinal_data[:,-1]).reshape(-1,1) - return np.hstack([origin_x, origin_y]) - - def _sample(self, size=None, verbose=1) -> np.ndarray: - """ - Generate synthetic data in ordinal encoding format - """ - if verbose is None or not isinstance(verbose, int): - verbose = 1 - #basic varibles - d = self._d - feature_cards = np.array(d.feature_uniques) - #ensure sum of each constraint group equals to 1, then re concat the probs - _idxs = np.cumsum([0] + d._kdbe.constraints_.tolist()) - constraint_idxs = [(_idxs[i],_idxs[i+1]) for i in range(len(_idxs)-1)] - - probs = np.exp(self.__gen_weights[0]) - cpd_probs = [probs[start:end,:] for start, end in constraint_idxs] - cpd_probs = np.vstack([p/p.sum(axis=0) for p in cpd_probs]) - - #assign the probs to the full cpd tables - idxs = np.cumsum([0] + d._kdbe.high_order_feature_uniques_) - feature_idxs = [(idxs[i],idxs[i+1]) for i in range(len(idxs)-1)] - have_value_idxs = d._kdbe.have_value_idxs_ - full_cpd_probs = [] - for have_value, (start, end) in zip(have_value_idxs, feature_idxs): - #(n_high_order_feature_uniques, n_classes) - cpd_prob_ = cpd_probs[start:end,:] - #(n_all_combination) Note: the order is (*parent, variable) - have_value_ravel = have_value.ravel() - #(n_classes * n_all_combination) - have_value_ravel_repeat = np.hstack([have_value_ravel] * d.num_classes) - #(n_classes * n_all_combination) <- (n_classes * n_high_order_feature_uniques) - full_cpd_prob_ravel = np.zeros_like(have_value_ravel_repeat, dtype=float) - full_cpd_prob_ravel[have_value_ravel_repeat] = cpd_prob_.T.ravel() - #(n_classes * n_parent_combinations, n_variable_unique) - full_cpd_prob = full_cpd_prob_ravel.reshape(-1, have_value.shape[-1]).T - full_cpd_prob = _add_uniform(full_cpd_prob, noise=0) - full_cpd_probs.append(full_cpd_prob) - - #prepare node and edge names - 
node_names = [str(i) for i in range(d.num_features + 1)] - edge_names = [(str(i), str(j)) for i,j in d._kdbe.edges_] - y_name = node_names[-1] - - #create TabularCPD objects - evidences = d._kdbe.dependencies_ - feature_cpds = [ - TabularCPD(str(name), feature_cards[name], table, - evidence=[y_name, *[str(e) for e in evidences]], - evidence_card=[d.num_classes, *feature_cards[evidences].tolist()]) - for (name, evidences), table in zip(evidences.items(), full_cpd_probs) - ] - y_probs = (d.class_counts/d.data_size).reshape(-1,1) - y_cpd = TabularCPD(y_name, d.num_classes, y_probs) - - #create kDB model, then sample the data - model = BayesianNetwork(edge_names) - model.add_cpds(y_cpd, *feature_cpds) - sample_size = d.data_size if size is None else size - result = BayesianModelSampling(model).forward_sample(size=sample_size, show_progress = verbose > 0) - sorted_result = result[node_names].values - - return sorted_result - - def _warmup_run(self, epochs, verbose=None): - d = self._d - tf.keras.backend.clear_session() - ohex = d.get_kdbe_x(self.k) - self.constraints = softmax_weight(d.constraint_positions) - elr = get_lr(ohex.shape[1], d.num_classes, self.constraints) - history = elr.fit(ohex, d.y, batch_size=self.batch_size, epochs=epochs, verbose=verbose) - self.__gen_weights = elr.get_weights() - tf.keras.backend.clear_session() - return history - - def _run_generator(self, loss): - d = self._d - ohex = d.get_kdbe_x(self.k) - tf.keras.backend.clear_session() - model = tf.keras.Sequential() - model.add(tf.keras.layers.Dense(d.num_classes, input_dim=ohex.shape[1], activation='softmax',kernel_constraint=self.constraints)) - model.compile(loss=elr_loss(loss), optimizer='adam', metrics=['accuracy']) - model.set_weights(self.__gen_weights) - history = model.fit(ohex, d.y, batch_size=self.batch_size,epochs=1, verbose=0) - self.__gen_weights = model.get_weights() - tf.keras.backend.clear_session() - return history - - def _discrim(self): - model = tf.keras.Sequential() - 
model.add(tf.keras.layers.Dense(1, input_dim=self._d.num_features, activation='sigmoid')) - model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy']) - return model \ No newline at end of file diff --git a/katabatic/models/meg/kdb.py b/katabatic/models/meg/kdb.py deleted file mode 100644 index 70df610..0000000 --- a/katabatic/models/meg/kdb.py +++ /dev/null @@ -1,370 +0,0 @@ -import numpy as np -#import networkx as nx -from pyitlib import discrete_random_variable as drv -import warnings -warnings.filterwarnings("ignore") - -def build_graph(X, y, k=2): - ''' - kDB algorithm - - Param: - ---------------------- - - Return: - ---------------------- - graph edges - ''' - #ensure data - num_features = X.shape[1] - x_nodes = list(range(num_features)) - y_node = num_features - - #util func - _x = lambda i:X[:,i] - _x2comb = lambda i,j:(X[:,i], X[:,j]) - - #feature indexes desc sort by mutual information - sorted_feature_idxs = np.argsort([ - drv.information_mutual(_x(i), y) - for i in range(num_features) - ])[::-1] - - #start building graph - edges = [] - for iter, target_idx in enumerate(sorted_feature_idxs): - target_node = x_nodes[target_idx] - edges.append((y_node, target_node)) - - parent_candidate_idxs = sorted_feature_idxs[:iter] - if iter <= k: - for idx in parent_candidate_idxs: - edges.append((x_nodes[idx], target_node)) - else: - first_k_parent_mi_idxs = np.argsort([ - drv.information_mutual_conditional(*_x2comb(i, target_idx), y) - for i in parent_candidate_idxs - ])[::-1][:k] - first_k_parent_idxs = parent_candidate_idxs[first_k_parent_mi_idxs] - - for parent_idx in first_k_parent_idxs: - edges.append((x_nodes[parent_idx], target_node)) - return edges - -# def draw_graph(edges): -# ''' -# Draw the graph -# -# Param -# ----------------- -# edges: edges of the graph -# -# ''' -# graph = nx.DiGraph(edges) -# pos=nx.spiral_layout(graph) -# nx.draw(graph, pos, node_color='r', edge_color='b') -# nx.draw_networkx_labels(graph, pos, 
font_size=20, font_family="sans-serif") - - -def get_cross_table(*cols, apply_wt=False): - ''' - author: alexland - - returns: - (i) xt, NumPy array storing the xtab results, number of dimensions is equal to - the len(args) passed in - (ii) unique_vals_all_cols, a tuple of 1D NumPy array for each dimension - in xt (for a 2D xtab, the tuple comprises the row and column headers) - pass in: - (i) 1 or more 1D NumPy arrays of integers - (ii) if wts is True, then the last array in cols is an array of weights - - if return_inverse=True, then np.unique also returns an integer index - (from 0, & of same len as array passed in) such that, uniq_vals[idx] gives the original array passed in - higher dimensional cross tabulations are supported (eg, 2D & 3D) - cross tabulation on two variables (columns): - >>> q1 = np.array([7, 8, 8, 8, 5, 6, 4, 6, 6, 8, 4, 6, 6, 6, 6, 8, 8, 5, 8, 6]) - >>> q2 = np.array([6, 4, 6, 4, 8, 8, 4, 8, 7, 4, 4, 8, 8, 7, 5, 4, 8, 4, 4, 4]) - >>> uv, xt = xtab(q1, q2) - >>> uv - (array([4, 5, 6, 7, 8]), array([4, 5, 6, 7, 8])) - >>> xt - array([[2, 0, 0, 0, 0], - [1, 0, 0, 0, 1], - [1, 1, 0, 2, 4], - [0, 0, 1, 0, 0], - [5, 0, 1, 0, 1]], dtype=uint64) - ''' - if not all(len(col) == len(cols[0]) for col in cols[1:]): - raise ValueError("all arguments must be same size") - - if len(cols) == 0: - raise TypeError("xtab() requires at least one argument") - - fnx1 = lambda q: len(q.squeeze().shape) - if not all([fnx1(col) == 1 for col in cols]): - raise ValueError("all input arrays must be 1D") - - if apply_wt: - cols, wt = cols[:-1], cols[-1] - else: - wt = 1 - - uniq_vals_all_cols, idx = zip( *(np.unique(col, return_inverse=True) for col in cols) ) - shape_xt = [uniq_vals_col.size for uniq_vals_col in uniq_vals_all_cols] - dtype_xt = 'float' if apply_wt else 'uint' - xt = np.zeros(shape_xt, dtype=dtype_xt) - np.add.at(xt, idx, wt) - return uniq_vals_all_cols, xt - -def _get_dependencies_without_y(variables, y_name, kdb_edges): - ''' - evidences of each 
variable without y. - - Param: - -------------- - variables: variable names - - y_name: class name - - kdb_edges: list of tuple (source, target) - ''' - dependencies = {} - kdb_edges_without_y = [edge for edge in kdb_edges if edge[0] != y_name] - mi_desc_order = {t:i for i,(s,t) in enumerate(kdb_edges) if s == y_name} - for x in variables: - current_dependencies = [s for s,t in kdb_edges_without_y if t == x] - if len(current_dependencies) >= 2: - sort_dict = {t:mi_desc_order[t] for t in current_dependencies} - dependencies[x] = sorted(sort_dict) - else: - dependencies[x] = current_dependencies - return dependencies - -def _add_uniform(array, noise=1e-5): - ''' - if a feature has no counts under a particular condition, assign a uniform probability rather than leaving it 0 - ''' - sum_by_col = np.sum(array,axis=0) - zero_idxs = (array == 0).astype(int) - # zero_count_by_col = np.sum(zero_idxs,axis=0) - nunique = array.shape[0] - result = np.zeros_like(array, dtype='float') - for i in range(array.shape[1]): - if sum_by_col[i] == 0: - result[:,i] = array[:,i] + 1./nunique - elif noise != 0: - result[:,i] = array[:,i] + noise * zero_idxs[:,i] - else: - result[:,i] = array[:,i] - return result - -def _normalize_by_column(array): - sum_by_col = np.sum(array,axis=0) - return np.divide(array, sum_by_col, - out=np.zeros_like(array,dtype='float'), - where=sum_by_col !=0) - -def _smoothing(cct, d): - ''' - probability smoothing for kdb - - Parameters: - ----------- - cct (np.ndarray): cross count table with shape (x0, *parents) - - d (int): dimension of cct - - Return: - -------- - smoothed joint prob table - ''' - #convert the cross-count-table to a joint-prob-table by normalizing along axis 0 - jpt = _normalize_by_column(cct) - smoothing_idx = jpt == 0 - if d > 1 and np.sum(smoothing_idx) > 0: - parent = cct.sum(axis=-1) - parent = _smoothing(parent, d-1) - parent_extend = parent.repeat(jpt.shape[-1]).reshape(jpt.shape) - jpt[smoothing_idx] = parent_extend[smoothing_idx] - return jpt -
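The `_normalize_by_column` and `_smoothing` helpers above turn a cross-count table into a joint probability table, falling back toward the parent distribution wherever a count is zero. A minimal standalone sketch of the column-normalization step (the function name and toy table here are illustrative, not from the source):

```python
import numpy as np

def normalize_by_column(counts):
    # Divide each column by its sum; columns that sum to zero stay all-zero
    # instead of producing NaNs, mirroring the np.divide(..., where=...) idiom.
    col_sums = counts.sum(axis=0)
    return np.divide(
        counts, col_sums,
        out=np.zeros_like(counts, dtype=float),
        where=col_sums != 0,
    )

# Toy cross-count table: rows are values of one variable, columns are
# parent configurations. The second configuration was never observed.
cct = np.array([[3.0, 0.0],
                [1.0, 0.0]])
jpt = normalize_by_column(cct)
# First column becomes [0.75, 0.25]; the unobserved column stays [0.0, 0.0],
# which is exactly the gap _add_uniform / _smoothing exist to fill.
```

The `where=` form avoids a divide-by-zero warning while leaving unobserved columns at zero for the smoothing pass to repair.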
-def get_high_order_feature(X, col, evidence_cols, feature_uniques): - ''' - encode the high order feature of X[col] given evidences X[evidence_cols]. - ''' - if evidence_cols is None or len(evidence_cols) == 0: - return X[:,[col]] - else: - evidences = [X[:,_col] for _col in evidence_cols] - - #[1, variable_unique, evidence_unique] - base = [1, feature_uniques[col]] + [feature_uniques[_col] for _col in evidence_cols[::-1][:-1]] - cum_base = np.cumprod(base)[::-1] - - cols = evidence_cols + [col] - high_order_feature = np.sum(X[:,cols] * cum_base, axis=1).reshape(-1,1) - return high_order_feature - -def get_high_order_constraints(X, col, evidence_cols, feature_uniques): - ''' - find the constraint information for the high order feature X[col] given evidences X[evidence_cols]. - - Returns: - --------------------- - tuple(have_value, high_order_uniques) - - have_value: a (k+1)-dimensional numpy ndarray of type boolean. - Each dimension corresponds to a variable, with the order (*evidence_cols, col). - True indicates that the corresponding combination of variable values can be found in the dataset; - False indicates it cannot. - - high_order_constraints: a 1d numpy ndarray of type int. - Each number `c` indicates that the constraints should be applied to the next `c` columns, counting from the last constraint position (or index 0), - in sequence.
- - ''' - if evidence_cols is None or len(evidence_cols) == 0: - unique = feature_uniques[col] - return np.ones(unique,dtype=bool), np.array([unique]) - else: - cols = evidence_cols + [col] - cross_table_idxs, cross_table = get_cross_table(*[X[:,i] for i in cols]) - have_value = cross_table != 0 - - have_value_reshape = have_value.reshape(-1,have_value.shape[-1]) - #have_value_split = np.split(have_value_reshape, have_value_reshape.shape[0], 0) - high_order_constraints = np.sum(have_value_reshape, axis=-1) - - return have_value, high_order_constraints - -class KdbHighOrderFeatureEncoder: - ''' - High order feature encoder that uses the kdb model to retrieve the dependencies between features. - - ''' - def __init__(self): - self.dependencies_ = {} - self.constraints_ = np.array([]) - self.have_value_idxs_ = [] - self.feature_uniques_ = [] - self.high_order_feature_uniques_ = [] - self.edges_ = [] - self.ohe_ = None - self.k = None - #self.full_=True - - def fit(self, X, y, k=0): - ''' - Fit the KdbHighOrderFeatureEncoder to X, y. - - Parameters - ---------- - X : array_like of shape (n_samples, n_features) - data to fit in the encoder. - - y : array_like of shape (n_samples,) - label to fit in the encoder. - - k : int, default=0 - k value of the order of the high-order feature. k = 0 will lead to a OneHotEncoder. - - Returns - ------- - self : object - Fitted encoder. 
- ''' - self.k = k - edges = build_graph(X, y, k) - #n_classes = len(np.unique(y)) - num_features = X.shape[1] - - if k > 0: - dependencies = _get_dependencies_without_y(list(range(num_features)), num_features, edges) - else: - dependencies = {x:[] for x in range(num_features)} - - self.dependencies_ = dependencies - self.feature_uniques_ = [len(np.unique(X[:,i])) for i in range(num_features)] - self.edges_ = edges - #self.full_ = full - - Xk, constraints, have_value_idxs = self.transform(X, return_constraints=True, use_ohe=False) - - from sklearn.preprocessing import OneHotEncoder - self.ohe_ = OneHotEncoder().fit(Xk) - self.high_order_feature_uniques_ = [len(c) for c in self.ohe_.categories_] - self.constraints_ = constraints - self.have_value_idxs_ = have_value_idxs - return self - - def transform(self, X, return_constraints=False, use_ohe=True): - """ - Transform X to the high-order features. - - Parameters - ---------- - X : array_like of shape (n_samples, n_features) - Data to fit in the encoder. - - return_constraints : bool, default=False - Whether to return the constraint informations. - - use_ohe : bool, default=True - Whether to transform output to one-hot format. - - Returns - ------- - X_out : ndarray of shape (n_samples, n_encoded_features) - Transformed input. 
- """ - Xk = [] - have_value_idxs = [] - constraints = [] - for k, v in self.dependencies_.items(): - xk = get_high_order_feature(X, k, v, self.feature_uniques_) - Xk.append(xk) - - if return_constraints: - idx, constraint = get_high_order_constraints(X, k, v, self.feature_uniques_) - have_value_idxs.append(idx) - constraints.append(constraint) - - Xk = np.hstack(Xk) - from sklearn.preprocessing import OrdinalEncoder - Xk = OrdinalEncoder().fit_transform(Xk) - if use_ohe: - Xk = self.ohe_.transform(Xk) - - if return_constraints: - concated_constraints = np.hstack(constraints) - return Xk, concated_constraints, have_value_idxs - else: - return Xk - - def fit_transform(self, X, y, k=0, return_constraints=False): - ''' - Fit KdbHighOrderFeatureEncoder to X, y, then transform X. - - Equivalent to fit(X, y, k).transform(X, return_constraints) but more convenient. - - Parameters - ---------- - X : array_like of shape (n_samples, n_features) - data to fit in the encoder. - - y : array_like of shape (n_samples,) - label to fit in the encoder. - - k : int, default=0 - k value of the kdb model. k = 0 will lead to a OneHotEncoder. - - return_constraints : bool, default=False - whether to return the constraint informations. - - Returns - ------- - X_out : ndarray of shape (n_samples, n_encoded_features) - Transformed input. 
- ''' - return self.fit(X, y, k).transform(X, return_constraints) \ No newline at end of file diff --git a/katabatic/models/meg/meg.py b/katabatic/models/meg/meg.py deleted file mode 100644 index 4fd018e..0000000 --- a/katabatic/models/meg/meg.py +++ /dev/null @@ -1,306 +0,0 @@ -from .kdb import * -from .utils import * -from pgmpy.models import BayesianNetwork -from pgmpy.sampling import BayesianModelSampling -from pgmpy.factors.discrete import TabularCPD -from sklearn.preprocessing import OrdinalEncoder, LabelEncoder -import numpy as np -import tensorflow as tf -from scipy.spatial import distance -from scipy.special import softmax -from .ganblr import GANBLR - -def get_weight(group_loss): - weight = softmax(1- softmax(group_loss)) - #weight = (1 - softmax(group_loss))/len(group_loss) - #weight = softmax(1000 - group_loss) - return weight - -class GANBLR_MEG_UNIT(GANBLR): - def init_unit(self, x, y, k, batch_size=32): - x = self._ordinal_encoder.fit_transform(x) - y = self._label_encoder.fit_transform(y).astype(int) - d = DataUtils(x, y) - self._d = d - self.k = k - self.batch_size = batch_size - - def run_one_epoch(self, x, y, syn_x): - x = self._ordinal_encoder.transform(x) - y = self._label_encoder.transform(y).astype(int) - d = DataUtils(x, y) - self._d = d - batch_size = self.batch_size - - discriminator_label = np.hstack([np.ones(d.data_size), np.zeros(len(syn_x))]) - discriminator_input = np.vstack([x, syn_x]) - disc_input, disc_label = sample(discriminator_input, discriminator_label, frac=0.8) - disc = self._discrim() - d_history = disc.fit(disc_input, disc_label, batch_size=batch_size, epochs=1, verbose=0).history - prob_fake = disc.predict(x, verbose=0) - sim_loss = distance.euclidean(x.ravel(), syn_x.ravel())*0.1 - ls = np.mean(-np.log(np.subtract(1, prob_fake))) + sim_loss - g_history = self._run_generator(loss=ls).history - #syn_data = self._sample(verbose=0) - return self, sim_loss, d_history, g_history - - -class MEG: - """ - The GANBLR_MEG Model. 
- """ - def __init__(self) -> None: - self._d = None - #self.__gen_weights = None - self.batch_size = None - self.epochs = None - self.k = None - #self.constraints = None - self._units = None - self._candidate_labels = None - self._column_names = None - self._weight = None - self._ordinal_encoder = None - - def _init_units(self, data_frame, candidate_labels=None): - ''' - Parameters - ---------- - data_frame : array_like of shape (n_samples, n_features) - Dataset to fit the model. The data should be discrete. - - candidate_labels : array_like of shape (n_labels,), default=None - Index of candidate labels of the dataset. If `None`, all the features will be used as candidate labels. - ''' - if candidate_labels is None: - num_units = data_frame.shape[1] - candidate_labels = np.arange(num_units).astype(int) - else: - candidate_label_idxs = [] - for label in candidate_labels: - if isinstance(label, str): - label_idx = data_frame.columns.get_loc(label) - candidate_label_idxs.append(label_idx) - elif isinstance(label, int): - candidate_label_idxs.append(label) - else: - raise Exception(f"Invalid Value in Arugument `candidate_labels`, `{label}` is not a valid column name or index.") - - num_units = len(candidate_label_idxs) - candidate_labels = np.array(candidate_label_idxs).astype(int) - - units = {idx:GANBLR_MEG_UNIT() for idx in candidate_labels} - - self._ordinal_encoder = OrdinalEncoder().fit(data_frame) - self._candidate_labels = candidate_labels - self._units = units - - return units - - - def fit(self, data_frame, candidate_labels=None, k=0, batch_size=32, epochs=10, warmup_epochs=2, verbose=1): - ''' - Fit the model to the given data. - - Parameters - ---------- - data_frame : array_like of shape (n_samples, n_features) - Dataset to fit the model. The data should be discrete. - - candidate_labels : array_like of shape (n_labels,), default=None - Index of candidate labels of the dataset. If `None`, all the features will be used as candidate labels. 
- - k : int, default=0 - Parameter k of the GANBLR model. Must be greater than or equal to 0; values greater than 2 are not suggested. - - batch_size : int, default=32 - Size of the batch to feed the model at each step. - - epochs : int, default=10 - Number of epochs to use during training. - - warmup_epochs : int, default=2 - Number of epochs to use in the warmup phase. Defaults to :attr:`2`. - - verbose : int, default=1 - Whether to output the log. Use 1 for log output and 0 for complete silence. - - Returns - ------- - self : object - Fitted model. - ''' - if verbose is None or not isinstance(verbose, int): - verbose = 1 - self.k = k - self.batch_size = batch_size - units = self._init_units(data_frame, candidate_labels) - weight = self._warmup_run(data_frame, k, batch_size, warmup_epochs, verbose=verbose) - syn_data = self._weighted_sample(weight, verbose=0) - - for i in range(epochs): - group_loss = [] - for label_idx, unit in units.items(): - X, y = self._split_dataset(data_frame.values, label_idx) - syn_X, syn_y = self._split_dataset(syn_data, label_idx) - unit, sim_loss, d_history, g_history = unit.run_one_epoch(X, y, syn_X) - group_loss.append(sim_loss) - - print(f"E{i}U{label_idx}: G_loss = {g_history['loss'][0]:.6f}, D_loss = {d_history['loss'][0]:.6f}, sim_loss={sim_loss:.6f}") - weight = get_weight(group_loss) - - self._weight = weight - print(f'E{i} weight: {np.round(weight, 2).tolist()}') - syn_data = self._weighted_sample(weight, verbose=0) - - return self - - def evaluate(self, test_data, label_idx=None, model='lr') -> float: - """ - Perform a TSTR (Training on Synthetic data, Testing on Real data) evaluation. - - Parameters - ---------- - test_data : array_like - Test dataset. - - label_idx : int, default=None - Index of the column to use as the label during evaluation. - - model : str or object - The model used for evaluation. Should be one of ['lr', 'mlp', 'rf'], or a model class that has sklearn-style `fit` and `predict` methods. - - Returns - ------- - accuracy_score : float
- - """ - from sklearn.linear_model import LogisticRegression - from sklearn.neural_network import MLPClassifier - from sklearn.ensemble import RandomForestClassifier - from sklearn.preprocessing import OneHotEncoder - from sklearn.pipeline import Pipeline - from sklearn.metrics import accuracy_score - - eval_model = None - models = dict( - lr=LogisticRegression, - rf=RandomForestClassifier, - mlp=MLPClassifier - ) - if model in models.keys(): - eval_model = models[model]() - elif hasattr(model, 'fit') and hasattr(model, 'predict'): - eval_model = model - else: - raise Exception("Invalid Arugument `model`, Should be one of ['lr', 'mlp', 'rf'], or a model class that have sklearn-style `fit` and `predict` method.") - - oe = self._units[label_idx]._ordinal_encoder - le = self._units[label_idx]._label_encoder - d = self._units[label_idx]._d - - synthetic_data = self._weighted_sample(self._weight, verbose=0) - synthetic_x, synthetic_y = self._split_dataset(synthetic_data, label_idx) - x, y = self._split_dataset(test_data.values, label_idx) - x_test = oe.transform(x) - y_test = le.transform(y) - - categories = d.get_categories() - pipline = Pipeline([('encoder', OneHotEncoder(categories=categories, handle_unknown='ignore')), ('model', eval_model)]) - pipline.fit(synthetic_x, synthetic_y) - pred = pipline.predict(x_test) - return accuracy_score(y_test, pred) - - def sample(self, size=None, verbose=1) -> np.ndarray: - """ - Generate synthetic data. - - Parameters - ---------- - size : int or None - Size of the data to be generated. set to `None` to make the size equal to the size of the training set. - - verbose : int, default=1 - Whether to output the log. Use 1 for log output and 0 for complete silence. - - Return: - ----------------- - synthetic_samples : np.ndarray - Generated synthetic data. 
- """ - ordinal_data = self._weighted_sample(self._weight, size, verbose) - ordinal_data = self._ordinal_encoder.inverse_transform(ordinal_data) - return ordinal_data - - def _weighted_sample(self, weight, size=None, verbose=1) -> np.ndarray: - samples = [] - sample_size = self._units[0]._d.data_size if size is None else size - - sizes = np.round(weight * sample_size).astype(int) - if sizes.sum() != sample_size: - sizes[np.argmax(sizes)] += sample_size - sizes.sum() - - for _size, (idx, unit) in zip(sizes, self._units.items()): - result = unit._sample(_size, verbose) - - sorted_result = self._reindex_dataset(result, idx) - samples.append(sorted_result) - #print(f"sizes: {sorted_result.shape}") - return np.vstack(samples) - - def _split_dataset(self, data, label_idx): - ''' - assume idx = 2 - this method convert [x0 x1 x2 x3 y] to [x0 x1 x3 y], x2 - ''' - feature_idxs = np.delete(np.arange(data.shape[1]), label_idx).astype(int) - X = data[:,feature_idxs] - y = data[:,label_idx] - return X, y - - def _reindex_dataset(self, data, label_idx): - ''' - assume idx = 2 - this method convert [x0 x1 x3 y x2] to [x0 x1 x2 x3 y] - ''' - feature_idxs = list(range(data.shape[1]-1)) - feature_idxs.insert(label_idx, data.shape[1] - 1) - - return data[:,feature_idxs] - - def _warmup_run(self, data_frame, k, batch_size, epochs, verbose=None): - if verbose: - print(f"warmup run:") - - group_loss = [] - for label_idx, unit in self._units.items(): - X, y = self._split_dataset(data_frame.values, label_idx) - - unit.init_unit(X, y, k, batch_size) - unit._warmup_run(epochs, verbose=verbose) - syn_data = unit._sample(verbose=0) - syn_X, syn_y = self._split_dataset(syn_data, label_idx) - unit, sim_loss, d_history, g_history = unit.run_one_epoch(X, y, syn_X) - group_loss.append(sim_loss) - - #print(group_loss) - weight = get_weight(group_loss) - if verbose: - print(f'weight after warmup: {np.round(weight, 2).tolist()}') - return weight - - def _run_generator(self, loss): - d = self._d - ohex 
= d.get_kdbe_x(self.k) - tf.keras.backend.clear_session() - model = tf.keras.Sequential() - model.add(tf.keras.layers.Dense(d.num_classes, input_dim=ohex.shape[1], activation='softmax', kernel_constraint=self.constraints)) - model.compile(loss=elr_loss(loss), optimizer='adam', metrics=['accuracy']) - model.set_weights(self.__gen_weights) - history = model.fit(ohex, d.y, batch_size=self.batch_size, epochs=1, verbose=0) - self.__gen_weights = model.get_weights() - tf.keras.backend.clear_session() - return history - - def _discrim(self): - model = tf.keras.Sequential() - model.add(tf.keras.layers.Dense(1, input_dim=self._d.num_features, activation='sigmoid')) - model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy']) - return model \ No newline at end of file diff --git a/katabatic/models/meg/utils.py b/katabatic/models/meg/utils.py deleted file mode 100644 index 58a8938..0000000 --- a/katabatic/models/meg/utils.py +++ /dev/null @@ -1,162 +0,0 @@ -from tensorflow.python.ops import math_ops -import numpy as np -import tensorflow as tf -import warnings -warnings.filterwarnings("ignore") - -class softmax_weight(tf.keras.constraints.Constraint): - """Constrains weight tensors to be log-softmax normalized within each feature block.""" - - def __init__(self, feature_uniques): - if isinstance(feature_uniques, np.ndarray): - idxs = math_ops.cumsum(np.hstack([np.array([0]), feature_uniques])) - else: - idxs = math_ops.cumsum([0] + feature_uniques) - idxs = [i.numpy() for i in idxs] - self.feature_idxs = [ - (idxs[i], idxs[i+1]) for i in range(len(idxs)-1) - ] - - def __call__(self, w): - w_new = [ - math_ops.log(tf.nn.softmax(w[i:j,:], axis=0)) - for i,j in self.feature_idxs - ] - return tf.concat(w_new, 0) - - def get_config(self): - return {'feature_idxs': self.feature_idxs} - -def elr_loss(KL_LOSS): - def loss(y_true, y_pred): - return tf.keras.losses.sparse_categorical_crossentropy(y_true, y_pred) + KL_LOSS - return loss - -def KL_loss(prob_fake): - return
np.mean(-np.log(np.subtract(1,prob_fake))) - -def get_lr(input_dim, output_dim, constraint=None,KL_LOSS=0): - model = tf.keras.Sequential() - model.add(tf.keras.layers.Dense(output_dim, input_dim=input_dim, activation='softmax',kernel_constraint=constraint)) - model.compile(loss=elr_loss(KL_LOSS), optimizer='adam', metrics=['accuracy']) - #log_elr = model.fit(*train_data, validation_data=test_data, batch_size=batch_size,epochs=epochs) - return model - -def sample(*arrays, n=None, frac=None, random_state=None): - ''' - generate sample random arrays from given arrays. The given arrays must be same size. - - Parameters: - -------------- - *arrays: arrays to be sampled. - - n (int): Number of random samples to generate. - - frac: Float value between 0 and 1, Returns (float value * length of given arrays). frac cannot be used with n. - - random_state: int value or numpy.random.RandomState, optional. if set to a particular integer, will return same samples in every iteration. - - Return: - -------------- - the sampled array(s). Passing in multiple arrays will result in the return of a tuple. 
- - ''' - random = np.random - if isinstance(random_state, int): - random = random.RandomState(random_state) - elif isinstance(random_state, np.random.RandomState): - random = random_state - - arr0 = arrays[0] - original_size = len(arr0) - if n is None and frac is None: - raise Exception('You must specify one of n or frac.') - if n is None: - n = int(len(arr0) * frac) - - idxs = random.choice(original_size, n, replace=False) - if len(arrays) > 1: - sampled_arrays = [] - for arr in arrays: - assert(len(arr) == original_size) - sampled_arrays.append(arr[idxs]) - return tuple(sampled_arrays) - else: - return arr0[idxs] - -DEMO_DATASETS = { - 'adult': { - 'link':'https://raw.githubusercontent.com/chriszhangpodo/discretizedata/main/adult-dm.csv', - 'params': { - 'dtype' : int - } - }, - 'adult-raw':{ - 'link':'https://drive.google.com/uc?export=download&id=1iA-_qIC1xKQJ4nL2ugX1_XJQf8__xOY0', - 'params': {} - } -} - -def get_demo_data(name='adult'): - """ - Download a demo dataset from the internet. - - Parameters - ---------- - name : str - Name of the dataset. Should be one of ['adult', 'adult-raw']. - - Returns - ------- - data : pandas.DataFrame - The demo dataset. - """ - assert(name in DEMO_DATASETS.keys()) - return read_csv(DEMO_DATASETS[name]['link'], **DEMO_DATASETS[name]['params']) - -from .kdb import KdbHighOrderFeatureEncoder -from sklearn.preprocessing import OneHotEncoder -from pandas import read_csv -import numpy as np - -class DataUtils: - """ - Useful data utilities for preparation before training.
- """ - def __init__(self, x, y): - self.x = x - self.y = y - self.data_size = len(x) - self.num_features = x.shape[1] - - yunique, ycounts = np.unique(y, return_counts=True) - self.num_classes = len(yunique) - self.class_counts = ycounts - self.feature_uniques = [len(np.unique(x[:,i])) for i in range(self.num_features)] - - self.constraint_positions = None - self._kdbe = None - - self.__kdbe_x = None - - def get_categories(self, idxs=None): - if idxs != None: - return [self._kdbe.ohe_.categories_[i] for i in idxs] - return self._kdbe.ohe_.categories_ - - def get_kdbe_x(self, k=0, dense_format=True) -> np.ndarray: - if self.__kdbe_x is not None: - return self.__kdbe_x - if self._kdbe == None: - self._kdbe = KdbHighOrderFeatureEncoder() - self._kdbe.fit(self.x, self.y, k=k) - kdbex = self._kdbe.transform(self.x) - if dense_format: - kdbex = kdbex.todense() - self.__kdbe_x = kdbex - self.constraint_positions = self._kdbe.constraints_ - return kdbex - - def clear(self): - self._kdbe = None - self.__kdbe_x = None \ No newline at end of file diff --git a/katabatic/models/model_template/mock_adapter.py b/katabatic/models/model_template/mock_adapter.py deleted file mode 100644 index 8281839..0000000 --- a/katabatic/models/model_template/mock_adapter.py +++ /dev/null @@ -1,21 +0,0 @@ -from katabatic_spi import KatabaticModelSPI - -class MockAdapter(KatabaticModelSPI): - - def __init__(self, type='discrete'): - self.type = None # Should be either 'discrete' or 'continuous' - self.constraints = None - self.batch_size = None - self.epochs = None - - def load_model(self): - print('Loading the model') - - def load_data(self, data_pathname): - print("Loading data now.") - - def fit(self, X_train, y_train, k=0, epochs=10, batch_size=64): - print("Fitting the model now.") - - def generate(self, size=None): - print("Generating data now.") \ No newline at end of file diff --git a/katabatic/models/model_template/model.py b/katabatic/models/model_template/model.py deleted file mode 
100644 index 362a4ad..0000000 --- a/katabatic/models/model_template/model.py +++ /dev/null @@ -1,42 +0,0 @@ -"""Abstract Base class for all models.""" - -# Author: Jaime Blackwell -# License: GNU Affero License -import abc -from abc import ABC, abstractmethod # Import abstraction module - -class Model(ABC): - - @abstractmethod - def __init__(self, *args, **kwargs): - - ''' - *** MODEL CLASS PARAMETERS *** - ------------------------------------------------------------ - weights : initial weights for data generation. Default = None - epochs : number of epochs to train the model. Default = None - batch_size : batch size for training. Default = None - self.x : Training set for fitting to the model - self.Y : - ------------------------------------------------------------- - ''' - self.weights = None - self.epochs = None - self.batch_size = None - self.x = None - self.Y = None - self.k = 0 - - @abstractmethod - def fit(self, x, Y): # sklearn-style fit method ,x, Y, **kwargs - self.x = x - self.Y = Y - raise NotImplementedError('model.fit() must be defined in the concrete class') - - @abstractmethod - def generate(self): # , size=None, **kwargs - raise NotImplementedError('model.generate() must be defined in the concrete class') - - @abstractmethod - def evaluate(self): # x, Y, classifier - pass \ No newline at end of file diff --git a/katabatic/models/model_template/utils.py b/katabatic/models/model_template/utils.py deleted file mode 100644 index 362a4ad..0000000 --- a/katabatic/models/model_template/utils.py +++ /dev/null @@ -1,42 +0,0 @@ -"""Abstract Base class for all models.""" - -# Author: Jaime Blackwell -# License: GNU Affero License -import abc -from abc import ABC, abstractmethod # Import abstraction module - -class Model(ABC): - - @abstractmethod - def __init__(self, *args, **kwargs): - - ''' - *** MODEL CLASS PARAMETERS *** - ------------------------------------------------------------ - weights : initial weights for data generation.
Default = None - epochs : number of epochs to train the model. Default = None - batch_size : batch size for training. Default = None - self.x : Training set for fitting to the model - self.Y : - ------------------------------------------------------------- - ''' - self.weights = None - self.epochs = None - self.batch_size = None - self.x = None - self.Y = None - self.k = 0 - - @abstractmethod - def fit(self, x, Y): # sklearn-style fit method ,x, Y, **kwargs - self.x = x - self.Y = Y - raise NotImplementedError('model.fit() must be defined in the concrete class') - - @abstractmethod - def generate(self): # , size=None, **kwargs - raise NotImplementedError('model.generate() must be defined in the concrete class') - - @abstractmethod - def evaluate(self): # x, Y, classifier - pass \ No newline at end of file diff --git a/katabatic/models/tvae/tvae_adapter.py b/katabatic/models/tvae/tvae_adapter.py deleted file mode 100644 index 2275bbc..0000000 --- a/katabatic/models/tvae/tvae_adapter.py +++ /dev/null @@ -1,37 +0,0 @@ -from katabatic_spi import KatabaticModelSPI -import sdv -import pandas as pd -from sdv.single_table import TVAESynthesizer -from sdv.metadata import SingleTableMetadata - -class TvaeAdapter(KatabaticModelSPI): - - def __init__(self, type='continuous'): - self.type = type # Should be either 'discrete' or 'continuous' - self.constraints = None - self.batch_size = None - self.epochs = None - self.data = None - -# #TODO: add exception handling to load() - def load_model(self): #Load the model - metadata = SingleTableMetadata() - metadata.detect_from_csv(filepath='cities_demo.csv') - data = pd.read_csv('cities_demo.csv') - self.load_data(data) - self.model = TVAESynthesizer(metadata) # Initialise and return an instance of the model - # print("Loading the model") - return - - def load_data(self, data): #Load data - self.data = data - -# #TODO: add exception handling to fit() - def fit(self, X_train, y_train, k=0, epochs=10, batch_size=64): #Fit model to
data - self.model.fit(self.data) - print("Fitting the model") - return - - # TODO: add exception handling to generate() - def generate(self): #Generate synthetic data - return self.model.sample(num_rows=13) \ No newline at end of file diff --git a/katabatic/nursery/nursery.c45-names b/katabatic/nursery/nursery.c45-names deleted file mode 100644 index ca3a899..0000000 --- a/katabatic/nursery/nursery.c45-names +++ /dev/null @@ -1,16 +0,0 @@ -| names file (C4.5 format) for nursery domain - -| class values - -not_recom, recommend, very_recom, priority, spec_prior - -| attributes - -parents: usual, pretentious, great_pret. -has_nurs: proper, less_proper, improper, critical, very_crit. -form: complete, completed, incomplete, foster. -children: 1, 2, 3, more. -housing: convenient, less_conv, critical. -finance: convenient, inconv. -social: nonprob, slightly_prob, problematic. -health: recommended, priority, not_recom. diff --git a/katabatic/nursery/nursery.data b/katabatic/nursery/nursery.data deleted file mode 100644 index 7506888..0000000 --- a/katabatic/nursery/nursery.data +++ /dev/null @@ -1,12961 +0,0 @@ -usual,proper,complete,1,convenient,convenient,nonprob,recommended,recommend -usual,proper,complete,1,convenient,convenient,nonprob,priority,priority -usual,proper,complete,1,convenient,convenient,nonprob,not_recom,not_recom -usual,proper,complete,1,convenient,convenient,slightly_prob,recommended,recommend -usual,proper,complete,1,convenient,convenient,slightly_prob,priority,priority -usual,proper,complete,1,convenient,convenient,slightly_prob,not_recom,not_recom -usual,proper,complete,1,convenient,convenient,problematic,recommended,priority -usual,proper,complete,1,convenient,convenient,problematic,priority,priority -usual,proper,complete,1,convenient,convenient,problematic,not_recom,not_recom -usual,proper,complete,1,convenient,inconv,nonprob,recommended,very_recom -usual,proper,complete,1,convenient,inconv,nonprob,priority,priority 
-usual,proper,complete,1,convenient,inconv,nonprob,not_recom,not_recom -usual,proper,complete,1,convenient,inconv,slightly_prob,recommended,very_recom -usual,proper,complete,1,convenient,inconv,slightly_prob,priority,priority -usual,proper,complete,1,convenient,inconv,slightly_prob,not_recom,not_recom -usual,proper,complete,1,convenient,inconv,problematic,recommended,priority -usual,proper,complete,1,convenient,inconv,problematic,priority,priority -usual,proper,complete,1,convenient,inconv,problematic,not_recom,not_recom -usual,proper,complete,1,less_conv,convenient,nonprob,recommended,very_recom -usual,proper,complete,1,less_conv,convenient,nonprob,priority,priority -usual,proper,complete,1,less_conv,convenient,nonprob,not_recom,not_recom -usual,proper,complete,1,less_conv,convenient,slightly_prob,recommended,very_recom -usual,proper,complete,1,less_conv,convenient,slightly_prob,priority,priority -usual,proper,complete,1,less_conv,convenient,slightly_prob,not_recom,not_recom -usual,proper,complete,1,less_conv,convenient,problematic,recommended,priority -usual,proper,complete,1,less_conv,convenient,problematic,priority,priority -usual,proper,complete,1,less_conv,convenient,problematic,not_recom,not_recom -usual,proper,complete,1,less_conv,inconv,nonprob,recommended,very_recom -usual,proper,complete,1,less_conv,inconv,nonprob,priority,priority -usual,proper,complete,1,less_conv,inconv,nonprob,not_recom,not_recom -usual,proper,complete,1,less_conv,inconv,slightly_prob,recommended,very_recom -usual,proper,complete,1,less_conv,inconv,slightly_prob,priority,priority -usual,proper,complete,1,less_conv,inconv,slightly_prob,not_recom,not_recom -usual,proper,complete,1,less_conv,inconv,problematic,recommended,priority -usual,proper,complete,1,less_conv,inconv,problematic,priority,priority -usual,proper,complete,1,less_conv,inconv,problematic,not_recom,not_recom -usual,proper,complete,1,critical,convenient,nonprob,recommended,very_recom 
-usual,proper,complete,1,critical,convenient,nonprob,priority,priority -usual,proper,complete,1,critical,convenient,nonprob,not_recom,not_recom -usual,proper,complete,1,critical,convenient,slightly_prob,recommended,very_recom -usual,proper,complete,1,critical,convenient,slightly_prob,priority,priority -usual,proper,complete,1,critical,convenient,slightly_prob,not_recom,not_recom -usual,proper,complete,1,critical,convenient,problematic,recommended,priority -usual,proper,complete,1,critical,convenient,problematic,priority,priority -usual,proper,complete,1,critical,convenient,problematic,not_recom,not_recom -usual,proper,complete,1,critical,inconv,nonprob,recommended,very_recom -usual,proper,complete,1,critical,inconv,nonprob,priority,priority -usual,proper,complete,1,critical,inconv,nonprob,not_recom,not_recom -usual,proper,complete,1,critical,inconv,slightly_prob,recommended,very_recom -usual,proper,complete,1,critical,inconv,slightly_prob,priority,priority -usual,proper,complete,1,critical,inconv,slightly_prob,not_recom,not_recom -usual,proper,complete,1,critical,inconv,problematic,recommended,priority -usual,proper,complete,1,critical,inconv,problematic,priority,priority -usual,proper,complete,1,critical,inconv,problematic,not_recom,not_recom -usual,proper,complete,2,convenient,convenient,nonprob,recommended,very_recom -usual,proper,complete,2,convenient,convenient,nonprob,priority,priority -usual,proper,complete,2,convenient,convenient,nonprob,not_recom,not_recom -usual,proper,complete,2,convenient,convenient,slightly_prob,recommended,very_recom -usual,proper,complete,2,convenient,convenient,slightly_prob,priority,priority -usual,proper,complete,2,convenient,convenient,slightly_prob,not_recom,not_recom -usual,proper,complete,2,convenient,convenient,problematic,recommended,priority -usual,proper,complete,2,convenient,convenient,problematic,priority,priority -usual,proper,complete,2,convenient,convenient,problematic,not_recom,not_recom 
-usual,proper,complete,2,convenient,inconv,nonprob,recommended,very_recom -usual,proper,complete,2,convenient,inconv,nonprob,priority,priority -usual,proper,complete,2,convenient,inconv,nonprob,not_recom,not_recom -usual,proper,complete,2,convenient,inconv,slightly_prob,recommended,very_recom -usual,proper,complete,2,convenient,inconv,slightly_prob,priority,priority -usual,proper,complete,2,convenient,inconv,slightly_prob,not_recom,not_recom -usual,proper,complete,2,convenient,inconv,problematic,recommended,priority -usual,proper,complete,2,convenient,inconv,problematic,priority,priority -usual,proper,complete,2,convenient,inconv,problematic,not_recom,not_recom -usual,proper,complete,2,less_conv,convenient,nonprob,recommended,very_recom -usual,proper,complete,2,less_conv,convenient,nonprob,priority,priority -usual,proper,complete,2,less_conv,convenient,nonprob,not_recom,not_recom -usual,proper,complete,2,less_conv,convenient,slightly_prob,recommended,very_recom -usual,proper,complete,2,less_conv,convenient,slightly_prob,priority,priority -usual,proper,complete,2,less_conv,convenient,slightly_prob,not_recom,not_recom -usual,proper,complete,2,less_conv,convenient,problematic,recommended,priority -usual,proper,complete,2,less_conv,convenient,problematic,priority,priority -usual,proper,complete,2,less_conv,convenient,problematic,not_recom,not_recom -usual,proper,complete,2,less_conv,inconv,nonprob,recommended,very_recom -usual,proper,complete,2,less_conv,inconv,nonprob,priority,priority -usual,proper,complete,2,less_conv,inconv,nonprob,not_recom,not_recom -usual,proper,complete,2,less_conv,inconv,slightly_prob,recommended,very_recom -usual,proper,complete,2,less_conv,inconv,slightly_prob,priority,priority -usual,proper,complete,2,less_conv,inconv,slightly_prob,not_recom,not_recom -usual,proper,complete,2,less_conv,inconv,problematic,recommended,priority -usual,proper,complete,2,less_conv,inconv,problematic,priority,priority 
-usual,proper,complete,2,less_conv,inconv,problematic,not_recom,not_recom -usual,proper,complete,2,critical,convenient,nonprob,recommended,priority -usual,proper,complete,2,critical,convenient,nonprob,priority,priority -usual,proper,complete,2,critical,convenient,nonprob,not_recom,not_recom -usual,proper,complete,2,critical,convenient,slightly_prob,recommended,priority -usual,proper,complete,2,critical,convenient,slightly_prob,priority,priority -usual,proper,complete,2,critical,convenient,slightly_prob,not_recom,not_recom -usual,proper,complete,2,critical,convenient,problematic,recommended,priority -usual,proper,complete,2,critical,convenient,problematic,priority,priority -usual,proper,complete,2,critical,convenient,problematic,not_recom,not_recom -usual,proper,complete,2,critical,inconv,nonprob,recommended,priority -usual,proper,complete,2,critical,inconv,nonprob,priority,priority -usual,proper,complete,2,critical,inconv,nonprob,not_recom,not_recom -usual,proper,complete,2,critical,inconv,slightly_prob,recommended,priority -usual,proper,complete,2,critical,inconv,slightly_prob,priority,priority -usual,proper,complete,2,critical,inconv,slightly_prob,not_recom,not_recom -usual,proper,complete,2,critical,inconv,problematic,recommended,priority -usual,proper,complete,2,critical,inconv,problematic,priority,priority -usual,proper,complete,2,critical,inconv,problematic,not_recom,not_recom -usual,proper,complete,3,convenient,convenient,nonprob,recommended,very_recom -usual,proper,complete,3,convenient,convenient,nonprob,priority,priority -usual,proper,complete,3,convenient,convenient,nonprob,not_recom,not_recom -usual,proper,complete,3,convenient,convenient,slightly_prob,recommended,very_recom -usual,proper,complete,3,convenient,convenient,slightly_prob,priority,priority -usual,proper,complete,3,convenient,convenient,slightly_prob,not_recom,not_recom -usual,proper,complete,3,convenient,convenient,problematic,recommended,priority 
-usual,proper,complete,3,convenient,convenient,problematic,priority,priority
-usual,proper,complete,3,convenient,convenient,problematic,not_recom,not_recom
-usual,proper,complete,3,convenient,inconv,nonprob,recommended,priority
-usual,proper,complete,3,convenient,inconv,nonprob,priority,priority
-usual,proper,complete,3,convenient,inconv,nonprob,not_recom,not_recom
-usual,proper,complete,3,convenient,inconv,slightly_prob,recommended,priority
-usual,proper,complete,3,convenient,inconv,slightly_prob,priority,priority
-usual,proper,complete,3,convenient,inconv,slightly_prob,not_recom,not_recom
-usual,proper,complete,3,convenient,inconv,problematic,recommended,priority
-usual,proper,complete,3,convenient,inconv,problematic,priority,priority
-usual,proper,complete,3,convenient,inconv,problematic,not_recom,not_recom
-usual,proper,complete,3,less_conv,convenient,nonprob,recommended,priority
-usual,proper,complete,3,less_conv,convenient,nonprob,priority,priority
-usual,proper,complete,3,less_conv,convenient,nonprob,not_recom,not_recom
-usual,proper,complete,3,less_conv,convenient,slightly_prob,recommended,priority
-usual,proper,complete,3,less_conv,convenient,slightly_prob,priority,priority
-usual,proper,complete,3,less_conv,convenient,slightly_prob,not_recom,not_recom
-usual,proper,complete,3,less_conv,convenient,problematic,recommended,priority
-usual,proper,complete,3,less_conv,convenient,problematic,priority,priority
-usual,proper,complete,3,less_conv,convenient,problematic,not_recom,not_recom
-usual,proper,complete,3,less_conv,inconv,nonprob,recommended,priority
-usual,proper,complete,3,less_conv,inconv,nonprob,priority,priority
-usual,proper,complete,3,less_conv,inconv,nonprob,not_recom,not_recom
-usual,proper,complete,3,less_conv,inconv,slightly_prob,recommended,priority
-usual,proper,complete,3,less_conv,inconv,slightly_prob,priority,priority
-usual,proper,complete,3,less_conv,inconv,slightly_prob,not_recom,not_recom
-usual,proper,complete,3,less_conv,inconv,problematic,recommended,priority
-usual,proper,complete,3,less_conv,inconv,problematic,priority,priority
-usual,proper,complete,3,less_conv,inconv,problematic,not_recom,not_recom
-usual,proper,complete,3,critical,convenient,nonprob,recommended,priority
-usual,proper,complete,3,critical,convenient,nonprob,priority,priority
-usual,proper,complete,3,critical,convenient,nonprob,not_recom,not_recom
-usual,proper,complete,3,critical,convenient,slightly_prob,recommended,priority
-usual,proper,complete,3,critical,convenient,slightly_prob,priority,priority
-usual,proper,complete,3,critical,convenient,slightly_prob,not_recom,not_recom
-usual,proper,complete,3,critical,convenient,problematic,recommended,priority
-usual,proper,complete,3,critical,convenient,problematic,priority,priority
-usual,proper,complete,3,critical,convenient,problematic,not_recom,not_recom
-usual,proper,complete,3,critical,inconv,nonprob,recommended,priority
-usual,proper,complete,3,critical,inconv,nonprob,priority,priority
-usual,proper,complete,3,critical,inconv,nonprob,not_recom,not_recom
-usual,proper,complete,3,critical,inconv,slightly_prob,recommended,priority
-usual,proper,complete,3,critical,inconv,slightly_prob,priority,priority
-usual,proper,complete,3,critical,inconv,slightly_prob,not_recom,not_recom
-usual,proper,complete,3,critical,inconv,problematic,recommended,priority
-usual,proper,complete,3,critical,inconv,problematic,priority,priority
-usual,proper,complete,3,critical,inconv,problematic,not_recom,not_recom
-usual,proper,complete,more,convenient,convenient,nonprob,recommended,very_recom
-usual,proper,complete,more,convenient,convenient,nonprob,priority,priority
-usual,proper,complete,more,convenient,convenient,nonprob,not_recom,not_recom
-usual,proper,complete,more,convenient,convenient,slightly_prob,recommended,very_recom
-usual,proper,complete,more,convenient,convenient,slightly_prob,priority,priority
-usual,proper,complete,more,convenient,convenient,slightly_prob,not_recom,not_recom
-usual,proper,complete,more,convenient,convenient,problematic,recommended,priority
-usual,proper,complete,more,convenient,convenient,problematic,priority,priority
-usual,proper,complete,more,convenient,convenient,problematic,not_recom,not_recom
-usual,proper,complete,more,convenient,inconv,nonprob,recommended,priority
-usual,proper,complete,more,convenient,inconv,nonprob,priority,priority
-usual,proper,complete,more,convenient,inconv,nonprob,not_recom,not_recom
-usual,proper,complete,more,convenient,inconv,slightly_prob,recommended,priority
-usual,proper,complete,more,convenient,inconv,slightly_prob,priority,priority
-usual,proper,complete,more,convenient,inconv,slightly_prob,not_recom,not_recom
-usual,proper,complete,more,convenient,inconv,problematic,recommended,priority
-usual,proper,complete,more,convenient,inconv,problematic,priority,priority
-usual,proper,complete,more,convenient,inconv,problematic,not_recom,not_recom
-usual,proper,complete,more,less_conv,convenient,nonprob,recommended,priority
-usual,proper,complete,more,less_conv,convenient,nonprob,priority,priority
-usual,proper,complete,more,less_conv,convenient,nonprob,not_recom,not_recom
-usual,proper,complete,more,less_conv,convenient,slightly_prob,recommended,priority
-usual,proper,complete,more,less_conv,convenient,slightly_prob,priority,priority
-usual,proper,complete,more,less_conv,convenient,slightly_prob,not_recom,not_recom
-usual,proper,complete,more,less_conv,convenient,problematic,recommended,priority
-usual,proper,complete,more,less_conv,convenient,problematic,priority,priority
-usual,proper,complete,more,less_conv,convenient,problematic,not_recom,not_recom
-usual,proper,complete,more,less_conv,inconv,nonprob,recommended,priority
-usual,proper,complete,more,less_conv,inconv,nonprob,priority,priority
-usual,proper,complete,more,less_conv,inconv,nonprob,not_recom,not_recom
-usual,proper,complete,more,less_conv,inconv,slightly_prob,recommended,priority
-usual,proper,complete,more,less_conv,inconv,slightly_prob,priority,priority
-usual,proper,complete,more,less_conv,inconv,slightly_prob,not_recom,not_recom
-usual,proper,complete,more,less_conv,inconv,problematic,recommended,priority
-usual,proper,complete,more,less_conv,inconv,problematic,priority,priority
-usual,proper,complete,more,less_conv,inconv,problematic,not_recom,not_recom
-usual,proper,complete,more,critical,convenient,nonprob,recommended,priority
-usual,proper,complete,more,critical,convenient,nonprob,priority,priority
-usual,proper,complete,more,critical,convenient,nonprob,not_recom,not_recom
-usual,proper,complete,more,critical,convenient,slightly_prob,recommended,priority
-usual,proper,complete,more,critical,convenient,slightly_prob,priority,priority
-usual,proper,complete,more,critical,convenient,slightly_prob,not_recom,not_recom
-usual,proper,complete,more,critical,convenient,problematic,recommended,priority
-usual,proper,complete,more,critical,convenient,problematic,priority,priority
-usual,proper,complete,more,critical,convenient,problematic,not_recom,not_recom
-usual,proper,complete,more,critical,inconv,nonprob,recommended,priority
-usual,proper,complete,more,critical,inconv,nonprob,priority,priority
-usual,proper,complete,more,critical,inconv,nonprob,not_recom,not_recom
-usual,proper,complete,more,critical,inconv,slightly_prob,recommended,priority
-usual,proper,complete,more,critical,inconv,slightly_prob,priority,priority
-usual,proper,complete,more,critical,inconv,slightly_prob,not_recom,not_recom
-usual,proper,complete,more,critical,inconv,problematic,recommended,priority
-usual,proper,complete,more,critical,inconv,problematic,priority,priority
-usual,proper,complete,more,critical,inconv,problematic,not_recom,not_recom
-usual,proper,completed,1,convenient,convenient,nonprob,recommended,very_recom
-usual,proper,completed,1,convenient,convenient,nonprob,priority,priority
-usual,proper,completed,1,convenient,convenient,nonprob,not_recom,not_recom
-usual,proper,completed,1,convenient,convenient,slightly_prob,recommended,very_recom
-usual,proper,completed,1,convenient,convenient,slightly_prob,priority,priority
-usual,proper,completed,1,convenient,convenient,slightly_prob,not_recom,not_recom
-usual,proper,completed,1,convenient,convenient,problematic,recommended,priority
-usual,proper,completed,1,convenient,convenient,problematic,priority,priority
-usual,proper,completed,1,convenient,convenient,problematic,not_recom,not_recom
-usual,proper,completed,1,convenient,inconv,nonprob,recommended,very_recom
-usual,proper,completed,1,convenient,inconv,nonprob,priority,priority
-usual,proper,completed,1,convenient,inconv,nonprob,not_recom,not_recom
-usual,proper,completed,1,convenient,inconv,slightly_prob,recommended,very_recom
-usual,proper,completed,1,convenient,inconv,slightly_prob,priority,priority
-usual,proper,completed,1,convenient,inconv,slightly_prob,not_recom,not_recom
-usual,proper,completed,1,convenient,inconv,problematic,recommended,priority
-usual,proper,completed,1,convenient,inconv,problematic,priority,priority
-usual,proper,completed,1,convenient,inconv,problematic,not_recom,not_recom
-usual,proper,completed,1,less_conv,convenient,nonprob,recommended,very_recom
-usual,proper,completed,1,less_conv,convenient,nonprob,priority,priority
-usual,proper,completed,1,less_conv,convenient,nonprob,not_recom,not_recom
-usual,proper,completed,1,less_conv,convenient,slightly_prob,recommended,very_recom
-usual,proper,completed,1,less_conv,convenient,slightly_prob,priority,priority
-usual,proper,completed,1,less_conv,convenient,slightly_prob,not_recom,not_recom
-usual,proper,completed,1,less_conv,convenient,problematic,recommended,priority
-usual,proper,completed,1,less_conv,convenient,problematic,priority,priority
-usual,proper,completed,1,less_conv,convenient,problematic,not_recom,not_recom
-usual,proper,completed,1,less_conv,inconv,nonprob,recommended,very_recom
-usual,proper,completed,1,less_conv,inconv,nonprob,priority,priority
-usual,proper,completed,1,less_conv,inconv,nonprob,not_recom,not_recom
-usual,proper,completed,1,less_conv,inconv,slightly_prob,recommended,very_recom
-usual,proper,completed,1,less_conv,inconv,slightly_prob,priority,priority
-usual,proper,completed,1,less_conv,inconv,slightly_prob,not_recom,not_recom
-usual,proper,completed,1,less_conv,inconv,problematic,recommended,priority
-usual,proper,completed,1,less_conv,inconv,problematic,priority,priority
-usual,proper,completed,1,less_conv,inconv,problematic,not_recom,not_recom
-usual,proper,completed,1,critical,convenient,nonprob,recommended,priority
-usual,proper,completed,1,critical,convenient,nonprob,priority,priority
-usual,proper,completed,1,critical,convenient,nonprob,not_recom,not_recom
-usual,proper,completed,1,critical,convenient,slightly_prob,recommended,priority
-usual,proper,completed,1,critical,convenient,slightly_prob,priority,priority
-usual,proper,completed,1,critical,convenient,slightly_prob,not_recom,not_recom
-usual,proper,completed,1,critical,convenient,problematic,recommended,priority
-usual,proper,completed,1,critical,convenient,problematic,priority,priority
-usual,proper,completed,1,critical,convenient,problematic,not_recom,not_recom
-usual,proper,completed,1,critical,inconv,nonprob,recommended,priority
-usual,proper,completed,1,critical,inconv,nonprob,priority,priority
-usual,proper,completed,1,critical,inconv,nonprob,not_recom,not_recom
-usual,proper,completed,1,critical,inconv,slightly_prob,recommended,priority
-usual,proper,completed,1,critical,inconv,slightly_prob,priority,priority
-usual,proper,completed,1,critical,inconv,slightly_prob,not_recom,not_recom
-usual,proper,completed,1,critical,inconv,problematic,recommended,priority
-usual,proper,completed,1,critical,inconv,problematic,priority,priority
-usual,proper,completed,1,critical,inconv,problematic,not_recom,not_recom
-usual,proper,completed,2,convenient,convenient,nonprob,recommended,very_recom
-usual,proper,completed,2,convenient,convenient,nonprob,priority,priority
-usual,proper,completed,2,convenient,convenient,nonprob,not_recom,not_recom
-usual,proper,completed,2,convenient,convenient,slightly_prob,recommended,very_recom
-usual,proper,completed,2,convenient,convenient,slightly_prob,priority,priority
-usual,proper,completed,2,convenient,convenient,slightly_prob,not_recom,not_recom
-usual,proper,completed,2,convenient,convenient,problematic,recommended,priority
-usual,proper,completed,2,convenient,convenient,problematic,priority,priority
-usual,proper,completed,2,convenient,convenient,problematic,not_recom,not_recom
-usual,proper,completed,2,convenient,inconv,nonprob,recommended,very_recom
-usual,proper,completed,2,convenient,inconv,nonprob,priority,priority
-usual,proper,completed,2,convenient,inconv,nonprob,not_recom,not_recom
-usual,proper,completed,2,convenient,inconv,slightly_prob,recommended,very_recom
-usual,proper,completed,2,convenient,inconv,slightly_prob,priority,priority
-usual,proper,completed,2,convenient,inconv,slightly_prob,not_recom,not_recom
-usual,proper,completed,2,convenient,inconv,problematic,recommended,priority
-usual,proper,completed,2,convenient,inconv,problematic,priority,priority
-usual,proper,completed,2,convenient,inconv,problematic,not_recom,not_recom
-usual,proper,completed,2,less_conv,convenient,nonprob,recommended,very_recom
-usual,proper,completed,2,less_conv,convenient,nonprob,priority,priority
-usual,proper,completed,2,less_conv,convenient,nonprob,not_recom,not_recom
-usual,proper,completed,2,less_conv,convenient,slightly_prob,recommended,very_recom
-usual,proper,completed,2,less_conv,convenient,slightly_prob,priority,priority
-usual,proper,completed,2,less_conv,convenient,slightly_prob,not_recom,not_recom
-usual,proper,completed,2,less_conv,convenient,problematic,recommended,priority
-usual,proper,completed,2,less_conv,convenient,problematic,priority,priority
-usual,proper,completed,2,less_conv,convenient,problematic,not_recom,not_recom
-usual,proper,completed,2,less_conv,inconv,nonprob,recommended,very_recom
-usual,proper,completed,2,less_conv,inconv,nonprob,priority,priority
-usual,proper,completed,2,less_conv,inconv,nonprob,not_recom,not_recom
-usual,proper,completed,2,less_conv,inconv,slightly_prob,recommended,very_recom
-usual,proper,completed,2,less_conv,inconv,slightly_prob,priority,priority
-usual,proper,completed,2,less_conv,inconv,slightly_prob,not_recom,not_recom
-usual,proper,completed,2,less_conv,inconv,problematic,recommended,priority
-usual,proper,completed,2,less_conv,inconv,problematic,priority,priority
-usual,proper,completed,2,less_conv,inconv,problematic,not_recom,not_recom
-usual,proper,completed,2,critical,convenient,nonprob,recommended,priority
-usual,proper,completed,2,critical,convenient,nonprob,priority,priority
-usual,proper,completed,2,critical,convenient,nonprob,not_recom,not_recom
-usual,proper,completed,2,critical,convenient,slightly_prob,recommended,priority
-usual,proper,completed,2,critical,convenient,slightly_prob,priority,priority
-usual,proper,completed,2,critical,convenient,slightly_prob,not_recom,not_recom
-usual,proper,completed,2,critical,convenient,problematic,recommended,priority
-usual,proper,completed,2,critical,convenient,problematic,priority,priority
-usual,proper,completed,2,critical,convenient,problematic,not_recom,not_recom
-usual,proper,completed,2,critical,inconv,nonprob,recommended,priority
-usual,proper,completed,2,critical,inconv,nonprob,priority,priority
-usual,proper,completed,2,critical,inconv,nonprob,not_recom,not_recom
-usual,proper,completed,2,critical,inconv,slightly_prob,recommended,priority
-usual,proper,completed,2,critical,inconv,slightly_prob,priority,priority
-usual,proper,completed,2,critical,inconv,slightly_prob,not_recom,not_recom
-usual,proper,completed,2,critical,inconv,problematic,recommended,priority
-usual,proper,completed,2,critical,inconv,problematic,priority,priority
-usual,proper,completed,2,critical,inconv,problematic,not_recom,not_recom
-usual,proper,completed,3,convenient,convenient,nonprob,recommended,very_recom
-usual,proper,completed,3,convenient,convenient,nonprob,priority,priority
-usual,proper,completed,3,convenient,convenient,nonprob,not_recom,not_recom
-usual,proper,completed,3,convenient,convenient,slightly_prob,recommended,very_recom
-usual,proper,completed,3,convenient,convenient,slightly_prob,priority,priority
-usual,proper,completed,3,convenient,convenient,slightly_prob,not_recom,not_recom
-usual,proper,completed,3,convenient,convenient,problematic,recommended,priority
-usual,proper,completed,3,convenient,convenient,problematic,priority,priority
-usual,proper,completed,3,convenient,convenient,problematic,not_recom,not_recom
-usual,proper,completed,3,convenient,inconv,nonprob,recommended,priority
-usual,proper,completed,3,convenient,inconv,nonprob,priority,priority
-usual,proper,completed,3,convenient,inconv,nonprob,not_recom,not_recom
-usual,proper,completed,3,convenient,inconv,slightly_prob,recommended,priority
-usual,proper,completed,3,convenient,inconv,slightly_prob,priority,priority
-usual,proper,completed,3,convenient,inconv,slightly_prob,not_recom,not_recom
-usual,proper,completed,3,convenient,inconv,problematic,recommended,priority
-usual,proper,completed,3,convenient,inconv,problematic,priority,priority
-usual,proper,completed,3,convenient,inconv,problematic,not_recom,not_recom
-usual,proper,completed,3,less_conv,convenient,nonprob,recommended,priority
-usual,proper,completed,3,less_conv,convenient,nonprob,priority,priority
-usual,proper,completed,3,less_conv,convenient,nonprob,not_recom,not_recom
-usual,proper,completed,3,less_conv,convenient,slightly_prob,recommended,priority
-usual,proper,completed,3,less_conv,convenient,slightly_prob,priority,priority
-usual,proper,completed,3,less_conv,convenient,slightly_prob,not_recom,not_recom
-usual,proper,completed,3,less_conv,convenient,problematic,recommended,priority
-usual,proper,completed,3,less_conv,convenient,problematic,priority,priority
-usual,proper,completed,3,less_conv,convenient,problematic,not_recom,not_recom
-usual,proper,completed,3,less_conv,inconv,nonprob,recommended,priority
-usual,proper,completed,3,less_conv,inconv,nonprob,priority,priority
-usual,proper,completed,3,less_conv,inconv,nonprob,not_recom,not_recom
-usual,proper,completed,3,less_conv,inconv,slightly_prob,recommended,priority
-usual,proper,completed,3,less_conv,inconv,slightly_prob,priority,priority
-usual,proper,completed,3,less_conv,inconv,slightly_prob,not_recom,not_recom
-usual,proper,completed,3,less_conv,inconv,problematic,recommended,priority
-usual,proper,completed,3,less_conv,inconv,problematic,priority,priority
-usual,proper,completed,3,less_conv,inconv,problematic,not_recom,not_recom
-usual,proper,completed,3,critical,convenient,nonprob,recommended,priority
-usual,proper,completed,3,critical,convenient,nonprob,priority,priority
-usual,proper,completed,3,critical,convenient,nonprob,not_recom,not_recom
-usual,proper,completed,3,critical,convenient,slightly_prob,recommended,priority
-usual,proper,completed,3,critical,convenient,slightly_prob,priority,priority
-usual,proper,completed,3,critical,convenient,slightly_prob,not_recom,not_recom
-usual,proper,completed,3,critical,convenient,problematic,recommended,priority
-usual,proper,completed,3,critical,convenient,problematic,priority,priority
-usual,proper,completed,3,critical,convenient,problematic,not_recom,not_recom
-usual,proper,completed,3,critical,inconv,nonprob,recommended,priority
-usual,proper,completed,3,critical,inconv,nonprob,priority,priority
-usual,proper,completed,3,critical,inconv,nonprob,not_recom,not_recom
-usual,proper,completed,3,critical,inconv,slightly_prob,recommended,priority
-usual,proper,completed,3,critical,inconv,slightly_prob,priority,priority
-usual,proper,completed,3,critical,inconv,slightly_prob,not_recom,not_recom
-usual,proper,completed,3,critical,inconv,problematic,recommended,priority
-usual,proper,completed,3,critical,inconv,problematic,priority,priority
-usual,proper,completed,3,critical,inconv,problematic,not_recom,not_recom
-usual,proper,completed,more,convenient,convenient,nonprob,recommended,very_recom
-usual,proper,completed,more,convenient,convenient,nonprob,priority,priority
-usual,proper,completed,more,convenient,convenient,nonprob,not_recom,not_recom
-usual,proper,completed,more,convenient,convenient,slightly_prob,recommended,very_recom
-usual,proper,completed,more,convenient,convenient,slightly_prob,priority,priority
-usual,proper,completed,more,convenient,convenient,slightly_prob,not_recom,not_recom
-usual,proper,completed,more,convenient,convenient,problematic,recommended,priority
-usual,proper,completed,more,convenient,convenient,problematic,priority,priority
-usual,proper,completed,more,convenient,convenient,problematic,not_recom,not_recom
-usual,proper,completed,more,convenient,inconv,nonprob,recommended,priority
-usual,proper,completed,more,convenient,inconv,nonprob,priority,priority
-usual,proper,completed,more,convenient,inconv,nonprob,not_recom,not_recom
-usual,proper,completed,more,convenient,inconv,slightly_prob,recommended,priority
-usual,proper,completed,more,convenient,inconv,slightly_prob,priority,priority
-usual,proper,completed,more,convenient,inconv,slightly_prob,not_recom,not_recom
-usual,proper,completed,more,convenient,inconv,problematic,recommended,priority
-usual,proper,completed,more,convenient,inconv,problematic,priority,priority
-usual,proper,completed,more,convenient,inconv,problematic,not_recom,not_recom
-usual,proper,completed,more,less_conv,convenient,nonprob,recommended,priority
-usual,proper,completed,more,less_conv,convenient,nonprob,priority,priority
-usual,proper,completed,more,less_conv,convenient,nonprob,not_recom,not_recom
-usual,proper,completed,more,less_conv,convenient,slightly_prob,recommended,priority
-usual,proper,completed,more,less_conv,convenient,slightly_prob,priority,priority
-usual,proper,completed,more,less_conv,convenient,slightly_prob,not_recom,not_recom
-usual,proper,completed,more,less_conv,convenient,problematic,recommended,priority
-usual,proper,completed,more,less_conv,convenient,problematic,priority,priority
-usual,proper,completed,more,less_conv,convenient,problematic,not_recom,not_recom
-usual,proper,completed,more,less_conv,inconv,nonprob,recommended,priority
-usual,proper,completed,more,less_conv,inconv,nonprob,priority,priority
-usual,proper,completed,more,less_conv,inconv,nonprob,not_recom,not_recom
-usual,proper,completed,more,less_conv,inconv,slightly_prob,recommended,priority
-usual,proper,completed,more,less_conv,inconv,slightly_prob,priority,priority
-usual,proper,completed,more,less_conv,inconv,slightly_prob,not_recom,not_recom
-usual,proper,completed,more,less_conv,inconv,problematic,recommended,priority
-usual,proper,completed,more,less_conv,inconv,problematic,priority,priority
-usual,proper,completed,more,less_conv,inconv,problematic,not_recom,not_recom
-usual,proper,completed,more,critical,convenient,nonprob,recommended,priority
-usual,proper,completed,more,critical,convenient,nonprob,priority,priority
-usual,proper,completed,more,critical,convenient,nonprob,not_recom,not_recom
-usual,proper,completed,more,critical,convenient,slightly_prob,recommended,priority
-usual,proper,completed,more,critical,convenient,slightly_prob,priority,priority
-usual,proper,completed,more,critical,convenient,slightly_prob,not_recom,not_recom
-usual,proper,completed,more,critical,convenient,problematic,recommended,priority
-usual,proper,completed,more,critical,convenient,problematic,priority,priority
-usual,proper,completed,more,critical,convenient,problematic,not_recom,not_recom
-usual,proper,completed,more,critical,inconv,nonprob,recommended,priority
-usual,proper,completed,more,critical,inconv,nonprob,priority,priority
-usual,proper,completed,more,critical,inconv,nonprob,not_recom,not_recom
-usual,proper,completed,more,critical,inconv,slightly_prob,recommended,priority
-usual,proper,completed,more,critical,inconv,slightly_prob,priority,priority
-usual,proper,completed,more,critical,inconv,slightly_prob,not_recom,not_recom
-usual,proper,completed,more,critical,inconv,problematic,recommended,priority
-usual,proper,completed,more,critical,inconv,problematic,priority,priority
-usual,proper,completed,more,critical,inconv,problematic,not_recom,not_recom
-usual,proper,incomplete,1,convenient,convenient,nonprob,recommended,very_recom
-usual,proper,incomplete,1,convenient,convenient,nonprob,priority,priority
-usual,proper,incomplete,1,convenient,convenient,nonprob,not_recom,not_recom
-usual,proper,incomplete,1,convenient,convenient,slightly_prob,recommended,very_recom
-usual,proper,incomplete,1,convenient,convenient,slightly_prob,priority,priority
-usual,proper,incomplete,1,convenient,convenient,slightly_prob,not_recom,not_recom
-usual,proper,incomplete,1,convenient,convenient,problematic,recommended,priority
-usual,proper,incomplete,1,convenient,convenient,problematic,priority,priority
-usual,proper,incomplete,1,convenient,convenient,problematic,not_recom,not_recom
-usual,proper,incomplete,1,convenient,inconv,nonprob,recommended,very_recom
-usual,proper,incomplete,1,convenient,inconv,nonprob,priority,priority
-usual,proper,incomplete,1,convenient,inconv,nonprob,not_recom,not_recom
-usual,proper,incomplete,1,convenient,inconv,slightly_prob,recommended,very_recom
-usual,proper,incomplete,1,convenient,inconv,slightly_prob,priority,priority
-usual,proper,incomplete,1,convenient,inconv,slightly_prob,not_recom,not_recom
-usual,proper,incomplete,1,convenient,inconv,problematic,recommended,priority
-usual,proper,incomplete,1,convenient,inconv,problematic,priority,priority
-usual,proper,incomplete,1,convenient,inconv,problematic,not_recom,not_recom
-usual,proper,incomplete,1,less_conv,convenient,nonprob,recommended,very_recom
-usual,proper,incomplete,1,less_conv,convenient,nonprob,priority,priority
-usual,proper,incomplete,1,less_conv,convenient,nonprob,not_recom,not_recom
-usual,proper,incomplete,1,less_conv,convenient,slightly_prob,recommended,very_recom
-usual,proper,incomplete,1,less_conv,convenient,slightly_prob,priority,priority
-usual,proper,incomplete,1,less_conv,convenient,slightly_prob,not_recom,not_recom
-usual,proper,incomplete,1,less_conv,convenient,problematic,recommended,priority
-usual,proper,incomplete,1,less_conv,convenient,problematic,priority,priority
-usual,proper,incomplete,1,less_conv,convenient,problematic,not_recom,not_recom
-usual,proper,incomplete,1,less_conv,inconv,nonprob,recommended,very_recom
-usual,proper,incomplete,1,less_conv,inconv,nonprob,priority,priority
-usual,proper,incomplete,1,less_conv,inconv,nonprob,not_recom,not_recom
-usual,proper,incomplete,1,less_conv,inconv,slightly_prob,recommended,very_recom
-usual,proper,incomplete,1,less_conv,inconv,slightly_prob,priority,priority
-usual,proper,incomplete,1,less_conv,inconv,slightly_prob,not_recom,not_recom
-usual,proper,incomplete,1,less_conv,inconv,problematic,recommended,priority
-usual,proper,incomplete,1,less_conv,inconv,problematic,priority,priority
-usual,proper,incomplete,1,less_conv,inconv,problematic,not_recom,not_recom
-usual,proper,incomplete,1,critical,convenient,nonprob,recommended,priority
-usual,proper,incomplete,1,critical,convenient,nonprob,priority,priority
-usual,proper,incomplete,1,critical,convenient,nonprob,not_recom,not_recom
-usual,proper,incomplete,1,critical,convenient,slightly_prob,recommended,priority
-usual,proper,incomplete,1,critical,convenient,slightly_prob,priority,priority
-usual,proper,incomplete,1,critical,convenient,slightly_prob,not_recom,not_recom
-usual,proper,incomplete,1,critical,convenient,problematic,recommended,priority
-usual,proper,incomplete,1,critical,convenient,problematic,priority,priority
-usual,proper,incomplete,1,critical,convenient,problematic,not_recom,not_recom
-usual,proper,incomplete,1,critical,inconv,nonprob,recommended,priority
-usual,proper,incomplete,1,critical,inconv,nonprob,priority,priority
-usual,proper,incomplete,1,critical,inconv,nonprob,not_recom,not_recom
-usual,proper,incomplete,1,critical,inconv,slightly_prob,recommended,priority
-usual,proper,incomplete,1,critical,inconv,slightly_prob,priority,priority
-usual,proper,incomplete,1,critical,inconv,slightly_prob,not_recom,not_recom
-usual,proper,incomplete,1,critical,inconv,problematic,recommended,priority
-usual,proper,incomplete,1,critical,inconv,problematic,priority,priority
-usual,proper,incomplete,1,critical,inconv,problematic,not_recom,not_recom
-usual,proper,incomplete,2,convenient,convenient,nonprob,recommended,very_recom
-usual,proper,incomplete,2,convenient,convenient,nonprob,priority,priority
-usual,proper,incomplete,2,convenient,convenient,nonprob,not_recom,not_recom
-usual,proper,incomplete,2,convenient,convenient,slightly_prob,recommended,very_recom
-usual,proper,incomplete,2,convenient,convenient,slightly_prob,priority,priority
-usual,proper,incomplete,2,convenient,convenient,slightly_prob,not_recom,not_recom
-usual,proper,incomplete,2,convenient,convenient,problematic,recommended,priority
-usual,proper,incomplete,2,convenient,convenient,problematic,priority,priority
-usual,proper,incomplete,2,convenient,convenient,problematic,not_recom,not_recom
-usual,proper,incomplete,2,convenient,inconv,nonprob,recommended,priority
-usual,proper,incomplete,2,convenient,inconv,nonprob,priority,priority
-usual,proper,incomplete,2,convenient,inconv,nonprob,not_recom,not_recom
-usual,proper,incomplete,2,convenient,inconv,slightly_prob,recommended,priority
-usual,proper,incomplete,2,convenient,inconv,slightly_prob,priority,priority
-usual,proper,incomplete,2,convenient,inconv,slightly_prob,not_recom,not_recom
-usual,proper,incomplete,2,convenient,inconv,problematic,recommended,priority
-usual,proper,incomplete,2,convenient,inconv,problematic,priority,priority
-usual,proper,incomplete,2,convenient,inconv,problematic,not_recom,not_recom
-usual,proper,incomplete,2,less_conv,convenient,nonprob,recommended,priority
-usual,proper,incomplete,2,less_conv,convenient,nonprob,priority,priority
-usual,proper,incomplete,2,less_conv,convenient,nonprob,not_recom,not_recom
-usual,proper,incomplete,2,less_conv,convenient,slightly_prob,recommended,priority
-usual,proper,incomplete,2,less_conv,convenient,slightly_prob,priority,priority
-usual,proper,incomplete,2,less_conv,convenient,slightly_prob,not_recom,not_recom
-usual,proper,incomplete,2,less_conv,convenient,problematic,recommended,priority
-usual,proper,incomplete,2,less_conv,convenient,problematic,priority,priority
-usual,proper,incomplete,2,less_conv,convenient,problematic,not_recom,not_recom
-usual,proper,incomplete,2,less_conv,inconv,nonprob,recommended,priority
-usual,proper,incomplete,2,less_conv,inconv,nonprob,priority,priority
-usual,proper,incomplete,2,less_conv,inconv,nonprob,not_recom,not_recom
-usual,proper,incomplete,2,less_conv,inconv,slightly_prob,recommended,priority
-usual,proper,incomplete,2,less_conv,inconv,slightly_prob,priority,priority
-usual,proper,incomplete,2,less_conv,inconv,slightly_prob,not_recom,not_recom
-usual,proper,incomplete,2,less_conv,inconv,problematic,recommended,priority
-usual,proper,incomplete,2,less_conv,inconv,problematic,priority,priority
-usual,proper,incomplete,2,less_conv,inconv,problematic,not_recom,not_recom
-usual,proper,incomplete,2,critical,convenient,nonprob,recommended,priority
-usual,proper,incomplete,2,critical,convenient,nonprob,priority,priority
-usual,proper,incomplete,2,critical,convenient,nonprob,not_recom,not_recom
-usual,proper,incomplete,2,critical,convenient,slightly_prob,recommended,priority
-usual,proper,incomplete,2,critical,convenient,slightly_prob,priority,priority
-usual,proper,incomplete,2,critical,convenient,slightly_prob,not_recom,not_recom
-usual,proper,incomplete,2,critical,convenient,problematic,recommended,priority
-usual,proper,incomplete,2,critical,convenient,problematic,priority,priority
-usual,proper,incomplete,2,critical,convenient,problematic,not_recom,not_recom
-usual,proper,incomplete,2,critical,inconv,nonprob,recommended,priority
-usual,proper,incomplete,2,critical,inconv,nonprob,priority,priority
-usual,proper,incomplete,2,critical,inconv,nonprob,not_recom,not_recom
-usual,proper,incomplete,2,critical,inconv,slightly_prob,recommended,priority
-usual,proper,incomplete,2,critical,inconv,slightly_prob,priority,priority
-usual,proper,incomplete,2,critical,inconv,slightly_prob,not_recom,not_recom
-usual,proper,incomplete,2,critical,inconv,problematic,recommended,priority
-usual,proper,incomplete,2,critical,inconv,problematic,priority,priority
-usual,proper,incomplete,2,critical,inconv,problematic,not_recom,not_recom
-usual,proper,incomplete,3,convenient,convenient,nonprob,recommended,very_recom
-usual,proper,incomplete,3,convenient,convenient,nonprob,priority,priority
-usual,proper,incomplete,3,convenient,convenient,nonprob,not_recom,not_recom
-usual,proper,incomplete,3,convenient,convenient,slightly_prob,recommended,very_recom
-usual,proper,incomplete,3,convenient,convenient,slightly_prob,priority,priority
-usual,proper,incomplete,3,convenient,convenient,slightly_prob,not_recom,not_recom
-usual,proper,incomplete,3,convenient,convenient,problematic,recommended,priority
-usual,proper,incomplete,3,convenient,convenient,problematic,priority,priority
-usual,proper,incomplete,3,convenient,convenient,problematic,not_recom,not_recom
-usual,proper,incomplete,3,convenient,inconv,nonprob,recommended,priority
-usual,proper,incomplete,3,convenient,inconv,nonprob,priority,priority
-usual,proper,incomplete,3,convenient,inconv,nonprob,not_recom,not_recom
-usual,proper,incomplete,3,convenient,inconv,slightly_prob,recommended,priority
-usual,proper,incomplete,3,convenient,inconv,slightly_prob,priority,priority
-usual,proper,incomplete,3,convenient,inconv,slightly_prob,not_recom,not_recom
-usual,proper,incomplete,3,convenient,inconv,problematic,recommended,priority
-usual,proper,incomplete,3,convenient,inconv,problematic,priority,priority
-usual,proper,incomplete,3,convenient,inconv,problematic,not_recom,not_recom
-usual,proper,incomplete,3,less_conv,convenient,nonprob,recommended,priority
-usual,proper,incomplete,3,less_conv,convenient,nonprob,priority,priority
-usual,proper,incomplete,3,less_conv,convenient,nonprob,not_recom,not_recom
-usual,proper,incomplete,3,less_conv,convenient,slightly_prob,recommended,priority
-usual,proper,incomplete,3,less_conv,convenient,slightly_prob,priority,priority
-usual,proper,incomplete,3,less_conv,convenient,slightly_prob,not_recom,not_recom
-usual,proper,incomplete,3,less_conv,convenient,problematic,recommended,priority
-usual,proper,incomplete,3,less_conv,convenient,problematic,priority,priority
-usual,proper,incomplete,3,less_conv,convenient,problematic,not_recom,not_recom
-usual,proper,incomplete,3,less_conv,inconv,nonprob,recommended,priority
-usual,proper,incomplete,3,less_conv,inconv,nonprob,priority,priority
-usual,proper,incomplete,3,less_conv,inconv,nonprob,not_recom,not_recom
-usual,proper,incomplete,3,less_conv,inconv,slightly_prob,recommended,priority
-usual,proper,incomplete,3,less_conv,inconv,slightly_prob,priority,priority
-usual,proper,incomplete,3,less_conv,inconv,slightly_prob,not_recom,not_recom
-usual,proper,incomplete,3,less_conv,inconv,problematic,recommended,priority
-usual,proper,incomplete,3,less_conv,inconv,problematic,priority,priority
-usual,proper,incomplete,3,less_conv,inconv,problematic,not_recom,not_recom
-usual,proper,incomplete,3,critical,convenient,nonprob,recommended,priority
-usual,proper,incomplete,3,critical,convenient,nonprob,priority,priority
-usual,proper,incomplete,3,critical,convenient,nonprob,not_recom,not_recom
-usual,proper,incomplete,3,critical,convenient,slightly_prob,recommended,priority
-usual,proper,incomplete,3,critical,convenient,slightly_prob,priority,priority
-usual,proper,incomplete,3,critical,convenient,slightly_prob,not_recom,not_recom
-usual,proper,incomplete,3,critical,convenient,problematic,recommended,priority
-usual,proper,incomplete,3,critical,convenient,problematic,priority,priority
-usual,proper,incomplete,3,critical,convenient,problematic,not_recom,not_recom
-usual,proper,incomplete,3,critical,inconv,nonprob,recommended,priority
-usual,proper,incomplete,3,critical,inconv,nonprob,priority,priority
-usual,proper,incomplete,3,critical,inconv,nonprob,not_recom,not_recom
-usual,proper,incomplete,3,critical,inconv,slightly_prob,recommended,priority
-usual,proper,incomplete,3,critical,inconv,slightly_prob,priority,priority
-usual,proper,incomplete,3,critical,inconv,slightly_prob,not_recom,not_recom
-usual,proper,incomplete,3,critical,inconv,problematic,recommended,priority
-usual,proper,incomplete,3,critical,inconv,problematic,priority,priority
-usual,proper,incomplete,3,critical,inconv,problematic,not_recom,not_recom
-usual,proper,incomplete,more,convenient,convenient,nonprob,recommended,very_recom
-usual,proper,incomplete,more,convenient,convenient,nonprob,priority,priority
-usual,proper,incomplete,more,convenient,convenient,nonprob,not_recom,not_recom
-usual,proper,incomplete,more,convenient,convenient,slightly_prob,recommended,very_recom
-usual,proper,incomplete,more,convenient,convenient,slightly_prob,priority,priority
-usual,proper,incomplete,more,convenient,convenient,slightly_prob,not_recom,not_recom
-usual,proper,incomplete,more,convenient,convenient,problematic,recommended,priority
-usual,proper,incomplete,more,convenient,convenient,problematic,priority,priority
-usual,proper,incomplete,more,convenient,convenient,problematic,not_recom,not_recom
-usual,proper,incomplete,more,convenient,inconv,nonprob,recommended,priority
-usual,proper,incomplete,more,convenient,inconv,nonprob,priority,priority
-usual,proper,incomplete,more,convenient,inconv,nonprob,not_recom,not_recom
-usual,proper,incomplete,more,convenient,inconv,slightly_prob,recommended,priority
-usual,proper,incomplete,more,convenient,inconv,slightly_prob,priority,priority
-usual,proper,incomplete,more,convenient,inconv,slightly_prob,not_recom,not_recom
-usual,proper,incomplete,more,convenient,inconv,problematic,recommended,priority
-usual,proper,incomplete,more,convenient,inconv,problematic,priority,priority
-usual,proper,incomplete,more,convenient,inconv,problematic,not_recom,not_recom
-usual,proper,incomplete,more,less_conv,convenient,nonprob,recommended,priority
-usual,proper,incomplete,more,less_conv,convenient,nonprob,priority,priority
-usual,proper,incomplete,more,less_conv,convenient,nonprob,not_recom,not_recom
-usual,proper,incomplete,more,less_conv,convenient,slightly_prob,recommended,priority
-usual,proper,incomplete,more,less_conv,convenient,slightly_prob,priority,priority
-usual,proper,incomplete,more,less_conv,convenient,slightly_prob,not_recom,not_recom
-usual,proper,incomplete,more,less_conv,convenient,problematic,recommended,priority
-usual,proper,incomplete,more,less_conv,convenient,problematic,priority,priority
-usual,proper,incomplete,more,less_conv,convenient,problematic,not_recom,not_recom
-usual,proper,incomplete,more,less_conv,inconv,nonprob,recommended,priority
-usual,proper,incomplete,more,less_conv,inconv,nonprob,priority,priority
-usual,proper,incomplete,more,less_conv,inconv,nonprob,not_recom,not_recom
-usual,proper,incomplete,more,less_conv,inconv,slightly_prob,recommended,priority
-usual,proper,incomplete,more,less_conv,inconv,slightly_prob,priority,priority
-usual,proper,incomplete,more,less_conv,inconv,slightly_prob,not_recom,not_recom
-usual,proper,incomplete,more,less_conv,inconv,problematic,recommended,priority
-usual,proper,incomplete,more,less_conv,inconv,problematic,priority,priority
-usual,proper,incomplete,more,less_conv,inconv,problematic,not_recom,not_recom
-usual,proper,incomplete,more,critical,convenient,nonprob,recommended,priority
-usual,proper,incomplete,more,critical,convenient,nonprob,priority,priority
-usual,proper,incomplete,more,critical,convenient,nonprob,not_recom,not_recom
-usual,proper,incomplete,more,critical,convenient,slightly_prob,recommended,priority
-usual,proper,incomplete,more,critical,convenient,slightly_prob,priority,priority
-usual,proper,incomplete,more,critical,convenient,slightly_prob,not_recom,not_recom
-usual,proper,incomplete,more,critical,convenient,problematic,recommended,priority
-usual,proper,incomplete,more,critical,convenient,problematic,priority,priority
-usual,proper,incomplete,more,critical,convenient,problematic,not_recom,not_recom
-usual,proper,incomplete,more,critical,inconv,nonprob,recommended,priority
-usual,proper,incomplete,more,critical,inconv,nonprob,priority,priority
-usual,proper,incomplete,more,critical,inconv,nonprob,not_recom,not_recom
-usual,proper,incomplete,more,critical,inconv,slightly_prob,recommended,priority
-usual,proper,incomplete,more,critical,inconv,slightly_prob,priority,priority
-usual,proper,incomplete,more,critical,inconv,slightly_prob,not_recom,not_recom
-usual,proper,incomplete,more,critical,inconv,problematic,recommended,priority
-usual,proper,incomplete,more,critical,inconv,problematic,priority,priority -usual,proper,incomplete,more,critical,inconv,problematic,not_recom,not_recom -usual,proper,foster,1,convenient,convenient,nonprob,recommended,very_recom -usual,proper,foster,1,convenient,convenient,nonprob,priority,priority -usual,proper,foster,1,convenient,convenient,nonprob,not_recom,not_recom -usual,proper,foster,1,convenient,convenient,slightly_prob,recommended,very_recom -usual,proper,foster,1,convenient,convenient,slightly_prob,priority,priority -usual,proper,foster,1,convenient,convenient,slightly_prob,not_recom,not_recom -usual,proper,foster,1,convenient,convenient,problematic,recommended,priority -usual,proper,foster,1,convenient,convenient,problematic,priority,priority -usual,proper,foster,1,convenient,convenient,problematic,not_recom,not_recom -usual,proper,foster,1,convenient,inconv,nonprob,recommended,priority -usual,proper,foster,1,convenient,inconv,nonprob,priority,priority -usual,proper,foster,1,convenient,inconv,nonprob,not_recom,not_recom -usual,proper,foster,1,convenient,inconv,slightly_prob,recommended,priority -usual,proper,foster,1,convenient,inconv,slightly_prob,priority,priority -usual,proper,foster,1,convenient,inconv,slightly_prob,not_recom,not_recom -usual,proper,foster,1,convenient,inconv,problematic,recommended,priority -usual,proper,foster,1,convenient,inconv,problematic,priority,priority -usual,proper,foster,1,convenient,inconv,problematic,not_recom,not_recom -usual,proper,foster,1,less_conv,convenient,nonprob,recommended,priority -usual,proper,foster,1,less_conv,convenient,nonprob,priority,priority -usual,proper,foster,1,less_conv,convenient,nonprob,not_recom,not_recom -usual,proper,foster,1,less_conv,convenient,slightly_prob,recommended,priority -usual,proper,foster,1,less_conv,convenient,slightly_prob,priority,priority -usual,proper,foster,1,less_conv,convenient,slightly_prob,not_recom,not_recom 
-usual,proper,foster,1,less_conv,convenient,problematic,recommended,priority -usual,proper,foster,1,less_conv,convenient,problematic,priority,priority -usual,proper,foster,1,less_conv,convenient,problematic,not_recom,not_recom -usual,proper,foster,1,less_conv,inconv,nonprob,recommended,priority -usual,proper,foster,1,less_conv,inconv,nonprob,priority,priority -usual,proper,foster,1,less_conv,inconv,nonprob,not_recom,not_recom -usual,proper,foster,1,less_conv,inconv,slightly_prob,recommended,priority -usual,proper,foster,1,less_conv,inconv,slightly_prob,priority,priority -usual,proper,foster,1,less_conv,inconv,slightly_prob,not_recom,not_recom -usual,proper,foster,1,less_conv,inconv,problematic,recommended,priority -usual,proper,foster,1,less_conv,inconv,problematic,priority,priority -usual,proper,foster,1,less_conv,inconv,problematic,not_recom,not_recom -usual,proper,foster,1,critical,convenient,nonprob,recommended,priority -usual,proper,foster,1,critical,convenient,nonprob,priority,priority -usual,proper,foster,1,critical,convenient,nonprob,not_recom,not_recom -usual,proper,foster,1,critical,convenient,slightly_prob,recommended,priority -usual,proper,foster,1,critical,convenient,slightly_prob,priority,priority -usual,proper,foster,1,critical,convenient,slightly_prob,not_recom,not_recom -usual,proper,foster,1,critical,convenient,problematic,recommended,priority -usual,proper,foster,1,critical,convenient,problematic,priority,priority -usual,proper,foster,1,critical,convenient,problematic,not_recom,not_recom -usual,proper,foster,1,critical,inconv,nonprob,recommended,priority -usual,proper,foster,1,critical,inconv,nonprob,priority,priority -usual,proper,foster,1,critical,inconv,nonprob,not_recom,not_recom -usual,proper,foster,1,critical,inconv,slightly_prob,recommended,priority -usual,proper,foster,1,critical,inconv,slightly_prob,priority,priority -usual,proper,foster,1,critical,inconv,slightly_prob,not_recom,not_recom 
-usual,proper,foster,1,critical,inconv,problematic,recommended,priority -usual,proper,foster,1,critical,inconv,problematic,priority,priority -usual,proper,foster,1,critical,inconv,problematic,not_recom,not_recom -usual,proper,foster,2,convenient,convenient,nonprob,recommended,very_recom -usual,proper,foster,2,convenient,convenient,nonprob,priority,priority -usual,proper,foster,2,convenient,convenient,nonprob,not_recom,not_recom -usual,proper,foster,2,convenient,convenient,slightly_prob,recommended,very_recom -usual,proper,foster,2,convenient,convenient,slightly_prob,priority,priority -usual,proper,foster,2,convenient,convenient,slightly_prob,not_recom,not_recom -usual,proper,foster,2,convenient,convenient,problematic,recommended,priority -usual,proper,foster,2,convenient,convenient,problematic,priority,priority -usual,proper,foster,2,convenient,convenient,problematic,not_recom,not_recom -usual,proper,foster,2,convenient,inconv,nonprob,recommended,priority -usual,proper,foster,2,convenient,inconv,nonprob,priority,priority -usual,proper,foster,2,convenient,inconv,nonprob,not_recom,not_recom -usual,proper,foster,2,convenient,inconv,slightly_prob,recommended,priority -usual,proper,foster,2,convenient,inconv,slightly_prob,priority,priority -usual,proper,foster,2,convenient,inconv,slightly_prob,not_recom,not_recom -usual,proper,foster,2,convenient,inconv,problematic,recommended,priority -usual,proper,foster,2,convenient,inconv,problematic,priority,priority -usual,proper,foster,2,convenient,inconv,problematic,not_recom,not_recom -usual,proper,foster,2,less_conv,convenient,nonprob,recommended,priority -usual,proper,foster,2,less_conv,convenient,nonprob,priority,priority -usual,proper,foster,2,less_conv,convenient,nonprob,not_recom,not_recom -usual,proper,foster,2,less_conv,convenient,slightly_prob,recommended,priority -usual,proper,foster,2,less_conv,convenient,slightly_prob,priority,priority -usual,proper,foster,2,less_conv,convenient,slightly_prob,not_recom,not_recom 
-usual,proper,foster,2,less_conv,convenient,problematic,recommended,priority -usual,proper,foster,2,less_conv,convenient,problematic,priority,priority -usual,proper,foster,2,less_conv,convenient,problematic,not_recom,not_recom -usual,proper,foster,2,less_conv,inconv,nonprob,recommended,priority -usual,proper,foster,2,less_conv,inconv,nonprob,priority,priority -usual,proper,foster,2,less_conv,inconv,nonprob,not_recom,not_recom -usual,proper,foster,2,less_conv,inconv,slightly_prob,recommended,priority -usual,proper,foster,2,less_conv,inconv,slightly_prob,priority,priority -usual,proper,foster,2,less_conv,inconv,slightly_prob,not_recom,not_recom -usual,proper,foster,2,less_conv,inconv,problematic,recommended,priority -usual,proper,foster,2,less_conv,inconv,problematic,priority,priority -usual,proper,foster,2,less_conv,inconv,problematic,not_recom,not_recom -usual,proper,foster,2,critical,convenient,nonprob,recommended,priority -usual,proper,foster,2,critical,convenient,nonprob,priority,priority -usual,proper,foster,2,critical,convenient,nonprob,not_recom,not_recom -usual,proper,foster,2,critical,convenient,slightly_prob,recommended,priority -usual,proper,foster,2,critical,convenient,slightly_prob,priority,priority -usual,proper,foster,2,critical,convenient,slightly_prob,not_recom,not_recom -usual,proper,foster,2,critical,convenient,problematic,recommended,priority -usual,proper,foster,2,critical,convenient,problematic,priority,priority -usual,proper,foster,2,critical,convenient,problematic,not_recom,not_recom -usual,proper,foster,2,critical,inconv,nonprob,recommended,priority -usual,proper,foster,2,critical,inconv,nonprob,priority,priority -usual,proper,foster,2,critical,inconv,nonprob,not_recom,not_recom -usual,proper,foster,2,critical,inconv,slightly_prob,recommended,priority -usual,proper,foster,2,critical,inconv,slightly_prob,priority,priority -usual,proper,foster,2,critical,inconv,slightly_prob,not_recom,not_recom 
-usual,proper,foster,2,critical,inconv,problematic,recommended,priority -usual,proper,foster,2,critical,inconv,problematic,priority,priority -usual,proper,foster,2,critical,inconv,problematic,not_recom,not_recom -usual,proper,foster,3,convenient,convenient,nonprob,recommended,very_recom -usual,proper,foster,3,convenient,convenient,nonprob,priority,priority -usual,proper,foster,3,convenient,convenient,nonprob,not_recom,not_recom -usual,proper,foster,3,convenient,convenient,slightly_prob,recommended,very_recom -usual,proper,foster,3,convenient,convenient,slightly_prob,priority,priority -usual,proper,foster,3,convenient,convenient,slightly_prob,not_recom,not_recom -usual,proper,foster,3,convenient,convenient,problematic,recommended,priority -usual,proper,foster,3,convenient,convenient,problematic,priority,priority -usual,proper,foster,3,convenient,convenient,problematic,not_recom,not_recom -usual,proper,foster,3,convenient,inconv,nonprob,recommended,priority -usual,proper,foster,3,convenient,inconv,nonprob,priority,priority -usual,proper,foster,3,convenient,inconv,nonprob,not_recom,not_recom -usual,proper,foster,3,convenient,inconv,slightly_prob,recommended,priority -usual,proper,foster,3,convenient,inconv,slightly_prob,priority,priority -usual,proper,foster,3,convenient,inconv,slightly_prob,not_recom,not_recom -usual,proper,foster,3,convenient,inconv,problematic,recommended,priority -usual,proper,foster,3,convenient,inconv,problematic,priority,priority -usual,proper,foster,3,convenient,inconv,problematic,not_recom,not_recom -usual,proper,foster,3,less_conv,convenient,nonprob,recommended,priority -usual,proper,foster,3,less_conv,convenient,nonprob,priority,priority -usual,proper,foster,3,less_conv,convenient,nonprob,not_recom,not_recom -usual,proper,foster,3,less_conv,convenient,slightly_prob,recommended,priority -usual,proper,foster,3,less_conv,convenient,slightly_prob,priority,priority -usual,proper,foster,3,less_conv,convenient,slightly_prob,not_recom,not_recom 
-usual,proper,foster,3,less_conv,convenient,problematic,recommended,priority -usual,proper,foster,3,less_conv,convenient,problematic,priority,priority -usual,proper,foster,3,less_conv,convenient,problematic,not_recom,not_recom -usual,proper,foster,3,less_conv,inconv,nonprob,recommended,priority -usual,proper,foster,3,less_conv,inconv,nonprob,priority,priority -usual,proper,foster,3,less_conv,inconv,nonprob,not_recom,not_recom -usual,proper,foster,3,less_conv,inconv,slightly_prob,recommended,priority -usual,proper,foster,3,less_conv,inconv,slightly_prob,priority,priority -usual,proper,foster,3,less_conv,inconv,slightly_prob,not_recom,not_recom -usual,proper,foster,3,less_conv,inconv,problematic,recommended,priority -usual,proper,foster,3,less_conv,inconv,problematic,priority,priority -usual,proper,foster,3,less_conv,inconv,problematic,not_recom,not_recom -usual,proper,foster,3,critical,convenient,nonprob,recommended,priority -usual,proper,foster,3,critical,convenient,nonprob,priority,priority -usual,proper,foster,3,critical,convenient,nonprob,not_recom,not_recom -usual,proper,foster,3,critical,convenient,slightly_prob,recommended,priority -usual,proper,foster,3,critical,convenient,slightly_prob,priority,priority -usual,proper,foster,3,critical,convenient,slightly_prob,not_recom,not_recom -usual,proper,foster,3,critical,convenient,problematic,recommended,priority -usual,proper,foster,3,critical,convenient,problematic,priority,priority -usual,proper,foster,3,critical,convenient,problematic,not_recom,not_recom -usual,proper,foster,3,critical,inconv,nonprob,recommended,priority -usual,proper,foster,3,critical,inconv,nonprob,priority,priority -usual,proper,foster,3,critical,inconv,nonprob,not_recom,not_recom -usual,proper,foster,3,critical,inconv,slightly_prob,recommended,priority -usual,proper,foster,3,critical,inconv,slightly_prob,priority,priority -usual,proper,foster,3,critical,inconv,slightly_prob,not_recom,not_recom 
-usual,proper,foster,3,critical,inconv,problematic,recommended,priority -usual,proper,foster,3,critical,inconv,problematic,priority,priority -usual,proper,foster,3,critical,inconv,problematic,not_recom,not_recom -usual,proper,foster,more,convenient,convenient,nonprob,recommended,very_recom -usual,proper,foster,more,convenient,convenient,nonprob,priority,priority -usual,proper,foster,more,convenient,convenient,nonprob,not_recom,not_recom -usual,proper,foster,more,convenient,convenient,slightly_prob,recommended,very_recom -usual,proper,foster,more,convenient,convenient,slightly_prob,priority,priority -usual,proper,foster,more,convenient,convenient,slightly_prob,not_recom,not_recom -usual,proper,foster,more,convenient,convenient,problematic,recommended,priority -usual,proper,foster,more,convenient,convenient,problematic,priority,priority -usual,proper,foster,more,convenient,convenient,problematic,not_recom,not_recom -usual,proper,foster,more,convenient,inconv,nonprob,recommended,priority -usual,proper,foster,more,convenient,inconv,nonprob,priority,priority -usual,proper,foster,more,convenient,inconv,nonprob,not_recom,not_recom -usual,proper,foster,more,convenient,inconv,slightly_prob,recommended,priority -usual,proper,foster,more,convenient,inconv,slightly_prob,priority,priority -usual,proper,foster,more,convenient,inconv,slightly_prob,not_recom,not_recom -usual,proper,foster,more,convenient,inconv,problematic,recommended,priority -usual,proper,foster,more,convenient,inconv,problematic,priority,priority -usual,proper,foster,more,convenient,inconv,problematic,not_recom,not_recom -usual,proper,foster,more,less_conv,convenient,nonprob,recommended,priority -usual,proper,foster,more,less_conv,convenient,nonprob,priority,priority -usual,proper,foster,more,less_conv,convenient,nonprob,not_recom,not_recom -usual,proper,foster,more,less_conv,convenient,slightly_prob,recommended,priority -usual,proper,foster,more,less_conv,convenient,slightly_prob,priority,priority 
-usual,proper,foster,more,less_conv,convenient,slightly_prob,not_recom,not_recom -usual,proper,foster,more,less_conv,convenient,problematic,recommended,priority -usual,proper,foster,more,less_conv,convenient,problematic,priority,priority -usual,proper,foster,more,less_conv,convenient,problematic,not_recom,not_recom -usual,proper,foster,more,less_conv,inconv,nonprob,recommended,priority -usual,proper,foster,more,less_conv,inconv,nonprob,priority,priority -usual,proper,foster,more,less_conv,inconv,nonprob,not_recom,not_recom -usual,proper,foster,more,less_conv,inconv,slightly_prob,recommended,priority -usual,proper,foster,more,less_conv,inconv,slightly_prob,priority,priority -usual,proper,foster,more,less_conv,inconv,slightly_prob,not_recom,not_recom -usual,proper,foster,more,less_conv,inconv,problematic,recommended,priority -usual,proper,foster,more,less_conv,inconv,problematic,priority,priority -usual,proper,foster,more,less_conv,inconv,problematic,not_recom,not_recom -usual,proper,foster,more,critical,convenient,nonprob,recommended,priority -usual,proper,foster,more,critical,convenient,nonprob,priority,priority -usual,proper,foster,more,critical,convenient,nonprob,not_recom,not_recom -usual,proper,foster,more,critical,convenient,slightly_prob,recommended,priority -usual,proper,foster,more,critical,convenient,slightly_prob,priority,priority -usual,proper,foster,more,critical,convenient,slightly_prob,not_recom,not_recom -usual,proper,foster,more,critical,convenient,problematic,recommended,priority -usual,proper,foster,more,critical,convenient,problematic,priority,priority -usual,proper,foster,more,critical,convenient,problematic,not_recom,not_recom -usual,proper,foster,more,critical,inconv,nonprob,recommended,priority -usual,proper,foster,more,critical,inconv,nonprob,priority,priority -usual,proper,foster,more,critical,inconv,nonprob,not_recom,not_recom -usual,proper,foster,more,critical,inconv,slightly_prob,recommended,priority 
-usual,proper,foster,more,critical,inconv,slightly_prob,priority,priority -usual,proper,foster,more,critical,inconv,slightly_prob,not_recom,not_recom -usual,proper,foster,more,critical,inconv,problematic,recommended,priority -usual,proper,foster,more,critical,inconv,problematic,priority,priority -usual,proper,foster,more,critical,inconv,problematic,not_recom,not_recom -usual,less_proper,complete,1,convenient,convenient,nonprob,recommended,very_recom -usual,less_proper,complete,1,convenient,convenient,nonprob,priority,priority -usual,less_proper,complete,1,convenient,convenient,nonprob,not_recom,not_recom -usual,less_proper,complete,1,convenient,convenient,slightly_prob,recommended,very_recom -usual,less_proper,complete,1,convenient,convenient,slightly_prob,priority,priority -usual,less_proper,complete,1,convenient,convenient,slightly_prob,not_recom,not_recom -usual,less_proper,complete,1,convenient,convenient,problematic,recommended,priority -usual,less_proper,complete,1,convenient,convenient,problematic,priority,priority -usual,less_proper,complete,1,convenient,convenient,problematic,not_recom,not_recom -usual,less_proper,complete,1,convenient,inconv,nonprob,recommended,very_recom -usual,less_proper,complete,1,convenient,inconv,nonprob,priority,priority -usual,less_proper,complete,1,convenient,inconv,nonprob,not_recom,not_recom -usual,less_proper,complete,1,convenient,inconv,slightly_prob,recommended,very_recom -usual,less_proper,complete,1,convenient,inconv,slightly_prob,priority,priority -usual,less_proper,complete,1,convenient,inconv,slightly_prob,not_recom,not_recom -usual,less_proper,complete,1,convenient,inconv,problematic,recommended,priority -usual,less_proper,complete,1,convenient,inconv,problematic,priority,priority -usual,less_proper,complete,1,convenient,inconv,problematic,not_recom,not_recom -usual,less_proper,complete,1,less_conv,convenient,nonprob,recommended,very_recom -usual,less_proper,complete,1,less_conv,convenient,nonprob,priority,priority 
-usual,less_proper,complete,1,less_conv,convenient,nonprob,not_recom,not_recom -usual,less_proper,complete,1,less_conv,convenient,slightly_prob,recommended,very_recom -usual,less_proper,complete,1,less_conv,convenient,slightly_prob,priority,priority -usual,less_proper,complete,1,less_conv,convenient,slightly_prob,not_recom,not_recom -usual,less_proper,complete,1,less_conv,convenient,problematic,recommended,priority -usual,less_proper,complete,1,less_conv,convenient,problematic,priority,priority -usual,less_proper,complete,1,less_conv,convenient,problematic,not_recom,not_recom -usual,less_proper,complete,1,less_conv,inconv,nonprob,recommended,very_recom -usual,less_proper,complete,1,less_conv,inconv,nonprob,priority,priority -usual,less_proper,complete,1,less_conv,inconv,nonprob,not_recom,not_recom -usual,less_proper,complete,1,less_conv,inconv,slightly_prob,recommended,very_recom -usual,less_proper,complete,1,less_conv,inconv,slightly_prob,priority,priority -usual,less_proper,complete,1,less_conv,inconv,slightly_prob,not_recom,not_recom -usual,less_proper,complete,1,less_conv,inconv,problematic,recommended,priority -usual,less_proper,complete,1,less_conv,inconv,problematic,priority,priority -usual,less_proper,complete,1,less_conv,inconv,problematic,not_recom,not_recom -usual,less_proper,complete,1,critical,convenient,nonprob,recommended,very_recom -usual,less_proper,complete,1,critical,convenient,nonprob,priority,priority -usual,less_proper,complete,1,critical,convenient,nonprob,not_recom,not_recom -usual,less_proper,complete,1,critical,convenient,slightly_prob,recommended,very_recom -usual,less_proper,complete,1,critical,convenient,slightly_prob,priority,priority -usual,less_proper,complete,1,critical,convenient,slightly_prob,not_recom,not_recom -usual,less_proper,complete,1,critical,convenient,problematic,recommended,priority -usual,less_proper,complete,1,critical,convenient,problematic,priority,priority 
-usual,less_proper,complete,1,critical,convenient,problematic,not_recom,not_recom -usual,less_proper,complete,1,critical,inconv,nonprob,recommended,very_recom -usual,less_proper,complete,1,critical,inconv,nonprob,priority,priority -usual,less_proper,complete,1,critical,inconv,nonprob,not_recom,not_recom -usual,less_proper,complete,1,critical,inconv,slightly_prob,recommended,very_recom -usual,less_proper,complete,1,critical,inconv,slightly_prob,priority,priority -usual,less_proper,complete,1,critical,inconv,slightly_prob,not_recom,not_recom -usual,less_proper,complete,1,critical,inconv,problematic,recommended,priority -usual,less_proper,complete,1,critical,inconv,problematic,priority,priority -usual,less_proper,complete,1,critical,inconv,problematic,not_recom,not_recom -usual,less_proper,complete,2,convenient,convenient,nonprob,recommended,very_recom -usual,less_proper,complete,2,convenient,convenient,nonprob,priority,priority -usual,less_proper,complete,2,convenient,convenient,nonprob,not_recom,not_recom -usual,less_proper,complete,2,convenient,convenient,slightly_prob,recommended,very_recom -usual,less_proper,complete,2,convenient,convenient,slightly_prob,priority,priority -usual,less_proper,complete,2,convenient,convenient,slightly_prob,not_recom,not_recom -usual,less_proper,complete,2,convenient,convenient,problematic,recommended,priority -usual,less_proper,complete,2,convenient,convenient,problematic,priority,priority -usual,less_proper,complete,2,convenient,convenient,problematic,not_recom,not_recom -usual,less_proper,complete,2,convenient,inconv,nonprob,recommended,very_recom -usual,less_proper,complete,2,convenient,inconv,nonprob,priority,priority -usual,less_proper,complete,2,convenient,inconv,nonprob,not_recom,not_recom -usual,less_proper,complete,2,convenient,inconv,slightly_prob,recommended,very_recom -usual,less_proper,complete,2,convenient,inconv,slightly_prob,priority,priority 
-usual,less_proper,complete,2,convenient,inconv,slightly_prob,not_recom,not_recom -usual,less_proper,complete,2,convenient,inconv,problematic,recommended,priority -usual,less_proper,complete,2,convenient,inconv,problematic,priority,priority -usual,less_proper,complete,2,convenient,inconv,problematic,not_recom,not_recom -usual,less_proper,complete,2,less_conv,convenient,nonprob,recommended,very_recom -usual,less_proper,complete,2,less_conv,convenient,nonprob,priority,priority -usual,less_proper,complete,2,less_conv,convenient,nonprob,not_recom,not_recom -usual,less_proper,complete,2,less_conv,convenient,slightly_prob,recommended,very_recom -usual,less_proper,complete,2,less_conv,convenient,slightly_prob,priority,priority -usual,less_proper,complete,2,less_conv,convenient,slightly_prob,not_recom,not_recom -usual,less_proper,complete,2,less_conv,convenient,problematic,recommended,priority -usual,less_proper,complete,2,less_conv,convenient,problematic,priority,priority -usual,less_proper,complete,2,less_conv,convenient,problematic,not_recom,not_recom -usual,less_proper,complete,2,less_conv,inconv,nonprob,recommended,very_recom -usual,less_proper,complete,2,less_conv,inconv,nonprob,priority,priority -usual,less_proper,complete,2,less_conv,inconv,nonprob,not_recom,not_recom -usual,less_proper,complete,2,less_conv,inconv,slightly_prob,recommended,very_recom -usual,less_proper,complete,2,less_conv,inconv,slightly_prob,priority,priority -usual,less_proper,complete,2,less_conv,inconv,slightly_prob,not_recom,not_recom -usual,less_proper,complete,2,less_conv,inconv,problematic,recommended,priority -usual,less_proper,complete,2,less_conv,inconv,problematic,priority,priority -usual,less_proper,complete,2,less_conv,inconv,problematic,not_recom,not_recom -usual,less_proper,complete,2,critical,convenient,nonprob,recommended,priority -usual,less_proper,complete,2,critical,convenient,nonprob,priority,priority 
-usual,less_proper,complete,2,critical,convenient,nonprob,not_recom,not_recom -usual,less_proper,complete,2,critical,convenient,slightly_prob,recommended,priority -usual,less_proper,complete,2,critical,convenient,slightly_prob,priority,priority -usual,less_proper,complete,2,critical,convenient,slightly_prob,not_recom,not_recom -usual,less_proper,complete,2,critical,convenient,problematic,recommended,priority -usual,less_proper,complete,2,critical,convenient,problematic,priority,priority -usual,less_proper,complete,2,critical,convenient,problematic,not_recom,not_recom -usual,less_proper,complete,2,critical,inconv,nonprob,recommended,priority -usual,less_proper,complete,2,critical,inconv,nonprob,priority,priority -usual,less_proper,complete,2,critical,inconv,nonprob,not_recom,not_recom -usual,less_proper,complete,2,critical,inconv,slightly_prob,recommended,priority -usual,less_proper,complete,2,critical,inconv,slightly_prob,priority,priority -usual,less_proper,complete,2,critical,inconv,slightly_prob,not_recom,not_recom -usual,less_proper,complete,2,critical,inconv,problematic,recommended,priority -usual,less_proper,complete,2,critical,inconv,problematic,priority,priority -usual,less_proper,complete,2,critical,inconv,problematic,not_recom,not_recom -usual,less_proper,complete,3,convenient,convenient,nonprob,recommended,very_recom -usual,less_proper,complete,3,convenient,convenient,nonprob,priority,priority -usual,less_proper,complete,3,convenient,convenient,nonprob,not_recom,not_recom -usual,less_proper,complete,3,convenient,convenient,slightly_prob,recommended,very_recom -usual,less_proper,complete,3,convenient,convenient,slightly_prob,priority,priority -usual,less_proper,complete,3,convenient,convenient,slightly_prob,not_recom,not_recom -usual,less_proper,complete,3,convenient,convenient,problematic,recommended,priority -usual,less_proper,complete,3,convenient,convenient,problematic,priority,priority 
-usual,less_proper,complete,3,convenient,convenient,problematic,not_recom,not_recom -usual,less_proper,complete,3,convenient,inconv,nonprob,recommended,priority -usual,less_proper,complete,3,convenient,inconv,nonprob,priority,priority -usual,less_proper,complete,3,convenient,inconv,nonprob,not_recom,not_recom -usual,less_proper,complete,3,convenient,inconv,slightly_prob,recommended,priority -usual,less_proper,complete,3,convenient,inconv,slightly_prob,priority,priority -usual,less_proper,complete,3,convenient,inconv,slightly_prob,not_recom,not_recom -usual,less_proper,complete,3,convenient,inconv,problematic,recommended,priority -usual,less_proper,complete,3,convenient,inconv,problematic,priority,priority -usual,less_proper,complete,3,convenient,inconv,problematic,not_recom,not_recom -usual,less_proper,complete,3,less_conv,convenient,nonprob,recommended,priority -usual,less_proper,complete,3,less_conv,convenient,nonprob,priority,priority -usual,less_proper,complete,3,less_conv,convenient,nonprob,not_recom,not_recom -usual,less_proper,complete,3,less_conv,convenient,slightly_prob,recommended,priority -usual,less_proper,complete,3,less_conv,convenient,slightly_prob,priority,priority -usual,less_proper,complete,3,less_conv,convenient,slightly_prob,not_recom,not_recom -usual,less_proper,complete,3,less_conv,convenient,problematic,recommended,priority -usual,less_proper,complete,3,less_conv,convenient,problematic,priority,priority -usual,less_proper,complete,3,less_conv,convenient,problematic,not_recom,not_recom -usual,less_proper,complete,3,less_conv,inconv,nonprob,recommended,priority -usual,less_proper,complete,3,less_conv,inconv,nonprob,priority,priority -usual,less_proper,complete,3,less_conv,inconv,nonprob,not_recom,not_recom -usual,less_proper,complete,3,less_conv,inconv,slightly_prob,recommended,priority -usual,less_proper,complete,3,less_conv,inconv,slightly_prob,priority,priority -usual,less_proper,complete,3,less_conv,inconv,slightly_prob,not_recom,not_recom 
-usual,less_proper,complete,3,less_conv,inconv,problematic,recommended,priority -usual,less_proper,complete,3,less_conv,inconv,problematic,priority,priority -usual,less_proper,complete,3,less_conv,inconv,problematic,not_recom,not_recom -usual,less_proper,complete,3,critical,convenient,nonprob,recommended,priority -usual,less_proper,complete,3,critical,convenient,nonprob,priority,priority -usual,less_proper,complete,3,critical,convenient,nonprob,not_recom,not_recom -usual,less_proper,complete,3,critical,convenient,slightly_prob,recommended,priority -usual,less_proper,complete,3,critical,convenient,slightly_prob,priority,priority -usual,less_proper,complete,3,critical,convenient,slightly_prob,not_recom,not_recom -usual,less_proper,complete,3,critical,convenient,problematic,recommended,priority -usual,less_proper,complete,3,critical,convenient,problematic,priority,priority -usual,less_proper,complete,3,critical,convenient,problematic,not_recom,not_recom -usual,less_proper,complete,3,critical,inconv,nonprob,recommended,priority -usual,less_proper,complete,3,critical,inconv,nonprob,priority,priority -usual,less_proper,complete,3,critical,inconv,nonprob,not_recom,not_recom -usual,less_proper,complete,3,critical,inconv,slightly_prob,recommended,priority -usual,less_proper,complete,3,critical,inconv,slightly_prob,priority,priority -usual,less_proper,complete,3,critical,inconv,slightly_prob,not_recom,not_recom -usual,less_proper,complete,3,critical,inconv,problematic,recommended,priority -usual,less_proper,complete,3,critical,inconv,problematic,priority,priority -usual,less_proper,complete,3,critical,inconv,problematic,not_recom,not_recom -usual,less_proper,complete,more,convenient,convenient,nonprob,recommended,very_recom -usual,less_proper,complete,more,convenient,convenient,nonprob,priority,priority -usual,less_proper,complete,more,convenient,convenient,nonprob,not_recom,not_recom -usual,less_proper,complete,more,convenient,convenient,slightly_prob,recommended,very_recom 
-usual,less_proper,complete,more,convenient,convenient,slightly_prob,priority,priority -usual,less_proper,complete,more,convenient,convenient,slightly_prob,not_recom,not_recom -usual,less_proper,complete,more,convenient,convenient,problematic,recommended,priority -usual,less_proper,complete,more,convenient,convenient,problematic,priority,priority -usual,less_proper,complete,more,convenient,convenient,problematic,not_recom,not_recom -usual,less_proper,complete,more,convenient,inconv,nonprob,recommended,priority -usual,less_proper,complete,more,convenient,inconv,nonprob,priority,priority -usual,less_proper,complete,more,convenient,inconv,nonprob,not_recom,not_recom -usual,less_proper,complete,more,convenient,inconv,slightly_prob,recommended,priority -usual,less_proper,complete,more,convenient,inconv,slightly_prob,priority,priority -usual,less_proper,complete,more,convenient,inconv,slightly_prob,not_recom,not_recom -usual,less_proper,complete,more,convenient,inconv,problematic,recommended,priority -usual,less_proper,complete,more,convenient,inconv,problematic,priority,priority -usual,less_proper,complete,more,convenient,inconv,problematic,not_recom,not_recom -usual,less_proper,complete,more,less_conv,convenient,nonprob,recommended,priority -usual,less_proper,complete,more,less_conv,convenient,nonprob,priority,priority -usual,less_proper,complete,more,less_conv,convenient,nonprob,not_recom,not_recom -usual,less_proper,complete,more,less_conv,convenient,slightly_prob,recommended,priority -usual,less_proper,complete,more,less_conv,convenient,slightly_prob,priority,priority -usual,less_proper,complete,more,less_conv,convenient,slightly_prob,not_recom,not_recom -usual,less_proper,complete,more,less_conv,convenient,problematic,recommended,priority -usual,less_proper,complete,more,less_conv,convenient,problematic,priority,priority -usual,less_proper,complete,more,less_conv,convenient,problematic,not_recom,not_recom 
-usual,less_proper,complete,more,less_conv,inconv,nonprob,recommended,priority -usual,less_proper,complete,more,less_conv,inconv,nonprob,priority,priority -usual,less_proper,complete,more,less_conv,inconv,nonprob,not_recom,not_recom -usual,less_proper,complete,more,less_conv,inconv,slightly_prob,recommended,priority -usual,less_proper,complete,more,less_conv,inconv,slightly_prob,priority,priority -usual,less_proper,complete,more,less_conv,inconv,slightly_prob,not_recom,not_recom -usual,less_proper,complete,more,less_conv,inconv,problematic,recommended,priority -usual,less_proper,complete,more,less_conv,inconv,problematic,priority,priority -usual,less_proper,complete,more,less_conv,inconv,problematic,not_recom,not_recom -usual,less_proper,complete,more,critical,convenient,nonprob,recommended,priority -usual,less_proper,complete,more,critical,convenient,nonprob,priority,priority -usual,less_proper,complete,more,critical,convenient,nonprob,not_recom,not_recom -usual,less_proper,complete,more,critical,convenient,slightly_prob,recommended,priority -usual,less_proper,complete,more,critical,convenient,slightly_prob,priority,priority -usual,less_proper,complete,more,critical,convenient,slightly_prob,not_recom,not_recom -usual,less_proper,complete,more,critical,convenient,problematic,recommended,priority -usual,less_proper,complete,more,critical,convenient,problematic,priority,priority -usual,less_proper,complete,more,critical,convenient,problematic,not_recom,not_recom -usual,less_proper,complete,more,critical,inconv,nonprob,recommended,priority -usual,less_proper,complete,more,critical,inconv,nonprob,priority,priority -usual,less_proper,complete,more,critical,inconv,nonprob,not_recom,not_recom -usual,less_proper,complete,more,critical,inconv,slightly_prob,recommended,priority -usual,less_proper,complete,more,critical,inconv,slightly_prob,priority,priority -usual,less_proper,complete,more,critical,inconv,slightly_prob,not_recom,not_recom 
-usual,less_proper,complete,more,critical,inconv,problematic,recommended,priority -usual,less_proper,complete,more,critical,inconv,problematic,priority,priority -usual,less_proper,complete,more,critical,inconv,problematic,not_recom,not_recom -usual,less_proper,completed,1,convenient,convenient,nonprob,recommended,very_recom -usual,less_proper,completed,1,convenient,convenient,nonprob,priority,priority -usual,less_proper,completed,1,convenient,convenient,nonprob,not_recom,not_recom -usual,less_proper,completed,1,convenient,convenient,slightly_prob,recommended,very_recom -usual,less_proper,completed,1,convenient,convenient,slightly_prob,priority,priority -usual,less_proper,completed,1,convenient,convenient,slightly_prob,not_recom,not_recom -usual,less_proper,completed,1,convenient,convenient,problematic,recommended,priority -usual,less_proper,completed,1,convenient,convenient,problematic,priority,priority -usual,less_proper,completed,1,convenient,convenient,problematic,not_recom,not_recom -usual,less_proper,completed,1,convenient,inconv,nonprob,recommended,very_recom -usual,less_proper,completed,1,convenient,inconv,nonprob,priority,priority -usual,less_proper,completed,1,convenient,inconv,nonprob,not_recom,not_recom -usual,less_proper,completed,1,convenient,inconv,slightly_prob,recommended,very_recom -usual,less_proper,completed,1,convenient,inconv,slightly_prob,priority,priority -usual,less_proper,completed,1,convenient,inconv,slightly_prob,not_recom,not_recom -usual,less_proper,completed,1,convenient,inconv,problematic,recommended,priority -usual,less_proper,completed,1,convenient,inconv,problematic,priority,priority -usual,less_proper,completed,1,convenient,inconv,problematic,not_recom,not_recom -usual,less_proper,completed,1,less_conv,convenient,nonprob,recommended,very_recom -usual,less_proper,completed,1,less_conv,convenient,nonprob,priority,priority -usual,less_proper,completed,1,less_conv,convenient,nonprob,not_recom,not_recom 
-usual,less_proper,completed,1,less_conv,convenient,slightly_prob,recommended,very_recom -usual,less_proper,completed,1,less_conv,convenient,slightly_prob,priority,priority -usual,less_proper,completed,1,less_conv,convenient,slightly_prob,not_recom,not_recom -usual,less_proper,completed,1,less_conv,convenient,problematic,recommended,priority -usual,less_proper,completed,1,less_conv,convenient,problematic,priority,priority -usual,less_proper,completed,1,less_conv,convenient,problematic,not_recom,not_recom -usual,less_proper,completed,1,less_conv,inconv,nonprob,recommended,very_recom -usual,less_proper,completed,1,less_conv,inconv,nonprob,priority,priority -usual,less_proper,completed,1,less_conv,inconv,nonprob,not_recom,not_recom -usual,less_proper,completed,1,less_conv,inconv,slightly_prob,recommended,very_recom -usual,less_proper,completed,1,less_conv,inconv,slightly_prob,priority,priority -usual,less_proper,completed,1,less_conv,inconv,slightly_prob,not_recom,not_recom -usual,less_proper,completed,1,less_conv,inconv,problematic,recommended,priority -usual,less_proper,completed,1,less_conv,inconv,problematic,priority,priority -usual,less_proper,completed,1,less_conv,inconv,problematic,not_recom,not_recom -usual,less_proper,completed,1,critical,convenient,nonprob,recommended,priority -usual,less_proper,completed,1,critical,convenient,nonprob,priority,priority -usual,less_proper,completed,1,critical,convenient,nonprob,not_recom,not_recom -usual,less_proper,completed,1,critical,convenient,slightly_prob,recommended,priority -usual,less_proper,completed,1,critical,convenient,slightly_prob,priority,priority -usual,less_proper,completed,1,critical,convenient,slightly_prob,not_recom,not_recom -usual,less_proper,completed,1,critical,convenient,problematic,recommended,priority -usual,less_proper,completed,1,critical,convenient,problematic,priority,priority -usual,less_proper,completed,1,critical,convenient,problematic,not_recom,not_recom 
-usual,less_proper,completed,1,critical,inconv,nonprob,recommended,priority -usual,less_proper,completed,1,critical,inconv,nonprob,priority,priority -usual,less_proper,completed,1,critical,inconv,nonprob,not_recom,not_recom -usual,less_proper,completed,1,critical,inconv,slightly_prob,recommended,priority -usual,less_proper,completed,1,critical,inconv,slightly_prob,priority,priority -usual,less_proper,completed,1,critical,inconv,slightly_prob,not_recom,not_recom -usual,less_proper,completed,1,critical,inconv,problematic,recommended,priority -usual,less_proper,completed,1,critical,inconv,problematic,priority,priority -usual,less_proper,completed,1,critical,inconv,problematic,not_recom,not_recom -usual,less_proper,completed,2,convenient,convenient,nonprob,recommended,very_recom -usual,less_proper,completed,2,convenient,convenient,nonprob,priority,priority -usual,less_proper,completed,2,convenient,convenient,nonprob,not_recom,not_recom -usual,less_proper,completed,2,convenient,convenient,slightly_prob,recommended,very_recom -usual,less_proper,completed,2,convenient,convenient,slightly_prob,priority,priority -usual,less_proper,completed,2,convenient,convenient,slightly_prob,not_recom,not_recom -usual,less_proper,completed,2,convenient,convenient,problematic,recommended,priority -usual,less_proper,completed,2,convenient,convenient,problematic,priority,priority -usual,less_proper,completed,2,convenient,convenient,problematic,not_recom,not_recom -usual,less_proper,completed,2,convenient,inconv,nonprob,recommended,very_recom -usual,less_proper,completed,2,convenient,inconv,nonprob,priority,priority -usual,less_proper,completed,2,convenient,inconv,nonprob,not_recom,not_recom -usual,less_proper,completed,2,convenient,inconv,slightly_prob,recommended,very_recom -usual,less_proper,completed,2,convenient,inconv,slightly_prob,priority,priority -usual,less_proper,completed,2,convenient,inconv,slightly_prob,not_recom,not_recom 
-usual,less_proper,completed,2,convenient,inconv,problematic,recommended,priority -usual,less_proper,completed,2,convenient,inconv,problematic,priority,priority -usual,less_proper,completed,2,convenient,inconv,problematic,not_recom,not_recom -usual,less_proper,completed,2,less_conv,convenient,nonprob,recommended,very_recom -usual,less_proper,completed,2,less_conv,convenient,nonprob,priority,priority -usual,less_proper,completed,2,less_conv,convenient,nonprob,not_recom,not_recom -usual,less_proper,completed,2,less_conv,convenient,slightly_prob,recommended,very_recom -usual,less_proper,completed,2,less_conv,convenient,slightly_prob,priority,priority -usual,less_proper,completed,2,less_conv,convenient,slightly_prob,not_recom,not_recom -usual,less_proper,completed,2,less_conv,convenient,problematic,recommended,priority -usual,less_proper,completed,2,less_conv,convenient,problematic,priority,priority -usual,less_proper,completed,2,less_conv,convenient,problematic,not_recom,not_recom -usual,less_proper,completed,2,less_conv,inconv,nonprob,recommended,very_recom -usual,less_proper,completed,2,less_conv,inconv,nonprob,priority,priority -usual,less_proper,completed,2,less_conv,inconv,nonprob,not_recom,not_recom -usual,less_proper,completed,2,less_conv,inconv,slightly_prob,recommended,very_recom -usual,less_proper,completed,2,less_conv,inconv,slightly_prob,priority,priority -usual,less_proper,completed,2,less_conv,inconv,slightly_prob,not_recom,not_recom -usual,less_proper,completed,2,less_conv,inconv,problematic,recommended,priority -usual,less_proper,completed,2,less_conv,inconv,problematic,priority,priority -usual,less_proper,completed,2,less_conv,inconv,problematic,not_recom,not_recom -usual,less_proper,completed,2,critical,convenient,nonprob,recommended,priority -usual,less_proper,completed,2,critical,convenient,nonprob,priority,priority -usual,less_proper,completed,2,critical,convenient,nonprob,not_recom,not_recom 
-usual,less_proper,completed,2,critical,convenient,slightly_prob,recommended,priority -usual,less_proper,completed,2,critical,convenient,slightly_prob,priority,priority -usual,less_proper,completed,2,critical,convenient,slightly_prob,not_recom,not_recom -usual,less_proper,completed,2,critical,convenient,problematic,recommended,priority -usual,less_proper,completed,2,critical,convenient,problematic,priority,priority -usual,less_proper,completed,2,critical,convenient,problematic,not_recom,not_recom -usual,less_proper,completed,2,critical,inconv,nonprob,recommended,priority -usual,less_proper,completed,2,critical,inconv,nonprob,priority,priority -usual,less_proper,completed,2,critical,inconv,nonprob,not_recom,not_recom -usual,less_proper,completed,2,critical,inconv,slightly_prob,recommended,priority -usual,less_proper,completed,2,critical,inconv,slightly_prob,priority,priority -usual,less_proper,completed,2,critical,inconv,slightly_prob,not_recom,not_recom -usual,less_proper,completed,2,critical,inconv,problematic,recommended,priority -usual,less_proper,completed,2,critical,inconv,problematic,priority,priority -usual,less_proper,completed,2,critical,inconv,problematic,not_recom,not_recom -usual,less_proper,completed,3,convenient,convenient,nonprob,recommended,very_recom -usual,less_proper,completed,3,convenient,convenient,nonprob,priority,priority -usual,less_proper,completed,3,convenient,convenient,nonprob,not_recom,not_recom -usual,less_proper,completed,3,convenient,convenient,slightly_prob,recommended,very_recom -usual,less_proper,completed,3,convenient,convenient,slightly_prob,priority,priority -usual,less_proper,completed,3,convenient,convenient,slightly_prob,not_recom,not_recom -usual,less_proper,completed,3,convenient,convenient,problematic,recommended,priority -usual,less_proper,completed,3,convenient,convenient,problematic,priority,priority -usual,less_proper,completed,3,convenient,convenient,problematic,not_recom,not_recom 
-usual,less_proper,completed,3,convenient,inconv,nonprob,recommended,priority -usual,less_proper,completed,3,convenient,inconv,nonprob,priority,priority -usual,less_proper,completed,3,convenient,inconv,nonprob,not_recom,not_recom -usual,less_proper,completed,3,convenient,inconv,slightly_prob,recommended,priority -usual,less_proper,completed,3,convenient,inconv,slightly_prob,priority,priority -usual,less_proper,completed,3,convenient,inconv,slightly_prob,not_recom,not_recom -usual,less_proper,completed,3,convenient,inconv,problematic,recommended,priority -usual,less_proper,completed,3,convenient,inconv,problematic,priority,priority -usual,less_proper,completed,3,convenient,inconv,problematic,not_recom,not_recom -usual,less_proper,completed,3,less_conv,convenient,nonprob,recommended,priority -usual,less_proper,completed,3,less_conv,convenient,nonprob,priority,priority -usual,less_proper,completed,3,less_conv,convenient,nonprob,not_recom,not_recom -usual,less_proper,completed,3,less_conv,convenient,slightly_prob,recommended,priority -usual,less_proper,completed,3,less_conv,convenient,slightly_prob,priority,priority -usual,less_proper,completed,3,less_conv,convenient,slightly_prob,not_recom,not_recom -usual,less_proper,completed,3,less_conv,convenient,problematic,recommended,priority -usual,less_proper,completed,3,less_conv,convenient,problematic,priority,priority -usual,less_proper,completed,3,less_conv,convenient,problematic,not_recom,not_recom -usual,less_proper,completed,3,less_conv,inconv,nonprob,recommended,priority -usual,less_proper,completed,3,less_conv,inconv,nonprob,priority,priority -usual,less_proper,completed,3,less_conv,inconv,nonprob,not_recom,not_recom -usual,less_proper,completed,3,less_conv,inconv,slightly_prob,recommended,priority -usual,less_proper,completed,3,less_conv,inconv,slightly_prob,priority,priority -usual,less_proper,completed,3,less_conv,inconv,slightly_prob,not_recom,not_recom 
-usual,less_proper,completed,3,less_conv,inconv,problematic,recommended,priority -usual,less_proper,completed,3,less_conv,inconv,problematic,priority,priority -usual,less_proper,completed,3,less_conv,inconv,problematic,not_recom,not_recom -usual,less_proper,completed,3,critical,convenient,nonprob,recommended,priority -usual,less_proper,completed,3,critical,convenient,nonprob,priority,priority -usual,less_proper,completed,3,critical,convenient,nonprob,not_recom,not_recom -usual,less_proper,completed,3,critical,convenient,slightly_prob,recommended,priority -usual,less_proper,completed,3,critical,convenient,slightly_prob,priority,priority -usual,less_proper,completed,3,critical,convenient,slightly_prob,not_recom,not_recom -usual,less_proper,completed,3,critical,convenient,problematic,recommended,priority -usual,less_proper,completed,3,critical,convenient,problematic,priority,priority -usual,less_proper,completed,3,critical,convenient,problematic,not_recom,not_recom -usual,less_proper,completed,3,critical,inconv,nonprob,recommended,priority -usual,less_proper,completed,3,critical,inconv,nonprob,priority,priority -usual,less_proper,completed,3,critical,inconv,nonprob,not_recom,not_recom -usual,less_proper,completed,3,critical,inconv,slightly_prob,recommended,priority -usual,less_proper,completed,3,critical,inconv,slightly_prob,priority,priority -usual,less_proper,completed,3,critical,inconv,slightly_prob,not_recom,not_recom -usual,less_proper,completed,3,critical,inconv,problematic,recommended,priority -usual,less_proper,completed,3,critical,inconv,problematic,priority,priority -usual,less_proper,completed,3,critical,inconv,problematic,not_recom,not_recom -usual,less_proper,completed,more,convenient,convenient,nonprob,recommended,very_recom -usual,less_proper,completed,more,convenient,convenient,nonprob,priority,priority -usual,less_proper,completed,more,convenient,convenient,nonprob,not_recom,not_recom 
-usual,less_proper,completed,more,convenient,convenient,slightly_prob,recommended,very_recom -usual,less_proper,completed,more,convenient,convenient,slightly_prob,priority,priority -usual,less_proper,completed,more,convenient,convenient,slightly_prob,not_recom,not_recom -usual,less_proper,completed,more,convenient,convenient,problematic,recommended,priority -usual,less_proper,completed,more,convenient,convenient,problematic,priority,priority -usual,less_proper,completed,more,convenient,convenient,problematic,not_recom,not_recom -usual,less_proper,completed,more,convenient,inconv,nonprob,recommended,priority -usual,less_proper,completed,more,convenient,inconv,nonprob,priority,priority -usual,less_proper,completed,more,convenient,inconv,nonprob,not_recom,not_recom -usual,less_proper,completed,more,convenient,inconv,slightly_prob,recommended,priority -usual,less_proper,completed,more,convenient,inconv,slightly_prob,priority,priority -usual,less_proper,completed,more,convenient,inconv,slightly_prob,not_recom,not_recom -usual,less_proper,completed,more,convenient,inconv,problematic,recommended,priority -usual,less_proper,completed,more,convenient,inconv,problematic,priority,priority -usual,less_proper,completed,more,convenient,inconv,problematic,not_recom,not_recom -usual,less_proper,completed,more,less_conv,convenient,nonprob,recommended,priority -usual,less_proper,completed,more,less_conv,convenient,nonprob,priority,priority -usual,less_proper,completed,more,less_conv,convenient,nonprob,not_recom,not_recom -usual,less_proper,completed,more,less_conv,convenient,slightly_prob,recommended,priority -usual,less_proper,completed,more,less_conv,convenient,slightly_prob,priority,priority -usual,less_proper,completed,more,less_conv,convenient,slightly_prob,not_recom,not_recom -usual,less_proper,completed,more,less_conv,convenient,problematic,recommended,priority -usual,less_proper,completed,more,less_conv,convenient,problematic,priority,priority 
-usual,less_proper,completed,more,less_conv,convenient,problematic,not_recom,not_recom -usual,less_proper,completed,more,less_conv,inconv,nonprob,recommended,priority -usual,less_proper,completed,more,less_conv,inconv,nonprob,priority,priority -usual,less_proper,completed,more,less_conv,inconv,nonprob,not_recom,not_recom -usual,less_proper,completed,more,less_conv,inconv,slightly_prob,recommended,priority -usual,less_proper,completed,more,less_conv,inconv,slightly_prob,priority,priority -usual,less_proper,completed,more,less_conv,inconv,slightly_prob,not_recom,not_recom -usual,less_proper,completed,more,less_conv,inconv,problematic,recommended,priority -usual,less_proper,completed,more,less_conv,inconv,problematic,priority,priority -usual,less_proper,completed,more,less_conv,inconv,problematic,not_recom,not_recom -usual,less_proper,completed,more,critical,convenient,nonprob,recommended,priority -usual,less_proper,completed,more,critical,convenient,nonprob,priority,priority -usual,less_proper,completed,more,critical,convenient,nonprob,not_recom,not_recom -usual,less_proper,completed,more,critical,convenient,slightly_prob,recommended,priority -usual,less_proper,completed,more,critical,convenient,slightly_prob,priority,priority -usual,less_proper,completed,more,critical,convenient,slightly_prob,not_recom,not_recom -usual,less_proper,completed,more,critical,convenient,problematic,recommended,priority -usual,less_proper,completed,more,critical,convenient,problematic,priority,priority -usual,less_proper,completed,more,critical,convenient,problematic,not_recom,not_recom -usual,less_proper,completed,more,critical,inconv,nonprob,recommended,priority -usual,less_proper,completed,more,critical,inconv,nonprob,priority,priority -usual,less_proper,completed,more,critical,inconv,nonprob,not_recom,not_recom -usual,less_proper,completed,more,critical,inconv,slightly_prob,recommended,priority -usual,less_proper,completed,more,critical,inconv,slightly_prob,priority,priority 
-usual,less_proper,completed,more,critical,inconv,slightly_prob,not_recom,not_recom -usual,less_proper,completed,more,critical,inconv,problematic,recommended,priority -usual,less_proper,completed,more,critical,inconv,problematic,priority,priority -usual,less_proper,completed,more,critical,inconv,problematic,not_recom,not_recom -usual,less_proper,incomplete,1,convenient,convenient,nonprob,recommended,very_recom -usual,less_proper,incomplete,1,convenient,convenient,nonprob,priority,priority -usual,less_proper,incomplete,1,convenient,convenient,nonprob,not_recom,not_recom -usual,less_proper,incomplete,1,convenient,convenient,slightly_prob,recommended,very_recom -usual,less_proper,incomplete,1,convenient,convenient,slightly_prob,priority,priority -usual,less_proper,incomplete,1,convenient,convenient,slightly_prob,not_recom,not_recom -usual,less_proper,incomplete,1,convenient,convenient,problematic,recommended,priority -usual,less_proper,incomplete,1,convenient,convenient,problematic,priority,priority -usual,less_proper,incomplete,1,convenient,convenient,problematic,not_recom,not_recom -usual,less_proper,incomplete,1,convenient,inconv,nonprob,recommended,very_recom -usual,less_proper,incomplete,1,convenient,inconv,nonprob,priority,priority -usual,less_proper,incomplete,1,convenient,inconv,nonprob,not_recom,not_recom -usual,less_proper,incomplete,1,convenient,inconv,slightly_prob,recommended,very_recom -usual,less_proper,incomplete,1,convenient,inconv,slightly_prob,priority,priority -usual,less_proper,incomplete,1,convenient,inconv,slightly_prob,not_recom,not_recom -usual,less_proper,incomplete,1,convenient,inconv,problematic,recommended,priority -usual,less_proper,incomplete,1,convenient,inconv,problematic,priority,priority -usual,less_proper,incomplete,1,convenient,inconv,problematic,not_recom,not_recom -usual,less_proper,incomplete,1,less_conv,convenient,nonprob,recommended,very_recom -usual,less_proper,incomplete,1,less_conv,convenient,nonprob,priority,priority 
-usual,less_proper,incomplete,1,less_conv,convenient,nonprob,not_recom,not_recom -usual,less_proper,incomplete,1,less_conv,convenient,slightly_prob,recommended,very_recom -usual,less_proper,incomplete,1,less_conv,convenient,slightly_prob,priority,priority -usual,less_proper,incomplete,1,less_conv,convenient,slightly_prob,not_recom,not_recom -usual,less_proper,incomplete,1,less_conv,convenient,problematic,recommended,priority -usual,less_proper,incomplete,1,less_conv,convenient,problematic,priority,priority -usual,less_proper,incomplete,1,less_conv,convenient,problematic,not_recom,not_recom -usual,less_proper,incomplete,1,less_conv,inconv,nonprob,recommended,very_recom -usual,less_proper,incomplete,1,less_conv,inconv,nonprob,priority,priority -usual,less_proper,incomplete,1,less_conv,inconv,nonprob,not_recom,not_recom -usual,less_proper,incomplete,1,less_conv,inconv,slightly_prob,recommended,very_recom -usual,less_proper,incomplete,1,less_conv,inconv,slightly_prob,priority,priority -usual,less_proper,incomplete,1,less_conv,inconv,slightly_prob,not_recom,not_recom -usual,less_proper,incomplete,1,less_conv,inconv,problematic,recommended,priority -usual,less_proper,incomplete,1,less_conv,inconv,problematic,priority,priority -usual,less_proper,incomplete,1,less_conv,inconv,problematic,not_recom,not_recom -usual,less_proper,incomplete,1,critical,convenient,nonprob,recommended,priority -usual,less_proper,incomplete,1,critical,convenient,nonprob,priority,priority -usual,less_proper,incomplete,1,critical,convenient,nonprob,not_recom,not_recom -usual,less_proper,incomplete,1,critical,convenient,slightly_prob,recommended,priority -usual,less_proper,incomplete,1,critical,convenient,slightly_prob,priority,priority -usual,less_proper,incomplete,1,critical,convenient,slightly_prob,not_recom,not_recom -usual,less_proper,incomplete,1,critical,convenient,problematic,recommended,priority -usual,less_proper,incomplete,1,critical,convenient,problematic,priority,priority 
-usual,less_proper,incomplete,1,critical,convenient,problematic,not_recom,not_recom -usual,less_proper,incomplete,1,critical,inconv,nonprob,recommended,priority -usual,less_proper,incomplete,1,critical,inconv,nonprob,priority,priority -usual,less_proper,incomplete,1,critical,inconv,nonprob,not_recom,not_recom -usual,less_proper,incomplete,1,critical,inconv,slightly_prob,recommended,priority -usual,less_proper,incomplete,1,critical,inconv,slightly_prob,priority,priority -usual,less_proper,incomplete,1,critical,inconv,slightly_prob,not_recom,not_recom -usual,less_proper,incomplete,1,critical,inconv,problematic,recommended,priority -usual,less_proper,incomplete,1,critical,inconv,problematic,priority,priority -usual,less_proper,incomplete,1,critical,inconv,problematic,not_recom,not_recom -usual,less_proper,incomplete,2,convenient,convenient,nonprob,recommended,very_recom -usual,less_proper,incomplete,2,convenient,convenient,nonprob,priority,priority -usual,less_proper,incomplete,2,convenient,convenient,nonprob,not_recom,not_recom -usual,less_proper,incomplete,2,convenient,convenient,slightly_prob,recommended,very_recom -usual,less_proper,incomplete,2,convenient,convenient,slightly_prob,priority,priority -usual,less_proper,incomplete,2,convenient,convenient,slightly_prob,not_recom,not_recom -usual,less_proper,incomplete,2,convenient,convenient,problematic,recommended,priority -usual,less_proper,incomplete,2,convenient,convenient,problematic,priority,priority -usual,less_proper,incomplete,2,convenient,convenient,problematic,not_recom,not_recom -usual,less_proper,incomplete,2,convenient,inconv,nonprob,recommended,priority -usual,less_proper,incomplete,2,convenient,inconv,nonprob,priority,priority -usual,less_proper,incomplete,2,convenient,inconv,nonprob,not_recom,not_recom -usual,less_proper,incomplete,2,convenient,inconv,slightly_prob,recommended,priority -usual,less_proper,incomplete,2,convenient,inconv,slightly_prob,priority,priority 
-usual,less_proper,incomplete,2,convenient,inconv,slightly_prob,not_recom,not_recom -usual,less_proper,incomplete,2,convenient,inconv,problematic,recommended,priority -usual,less_proper,incomplete,2,convenient,inconv,problematic,priority,priority -usual,less_proper,incomplete,2,convenient,inconv,problematic,not_recom,not_recom -usual,less_proper,incomplete,2,less_conv,convenient,nonprob,recommended,priority -usual,less_proper,incomplete,2,less_conv,convenient,nonprob,priority,priority -usual,less_proper,incomplete,2,less_conv,convenient,nonprob,not_recom,not_recom -usual,less_proper,incomplete,2,less_conv,convenient,slightly_prob,recommended,priority -usual,less_proper,incomplete,2,less_conv,convenient,slightly_prob,priority,priority -usual,less_proper,incomplete,2,less_conv,convenient,slightly_prob,not_recom,not_recom -usual,less_proper,incomplete,2,less_conv,convenient,problematic,recommended,priority -usual,less_proper,incomplete,2,less_conv,convenient,problematic,priority,priority -usual,less_proper,incomplete,2,less_conv,convenient,problematic,not_recom,not_recom -usual,less_proper,incomplete,2,less_conv,inconv,nonprob,recommended,priority -usual,less_proper,incomplete,2,less_conv,inconv,nonprob,priority,priority -usual,less_proper,incomplete,2,less_conv,inconv,nonprob,not_recom,not_recom -usual,less_proper,incomplete,2,less_conv,inconv,slightly_prob,recommended,priority -usual,less_proper,incomplete,2,less_conv,inconv,slightly_prob,priority,priority -usual,less_proper,incomplete,2,less_conv,inconv,slightly_prob,not_recom,not_recom -usual,less_proper,incomplete,2,less_conv,inconv,problematic,recommended,priority -usual,less_proper,incomplete,2,less_conv,inconv,problematic,priority,priority -usual,less_proper,incomplete,2,less_conv,inconv,problematic,not_recom,not_recom -usual,less_proper,incomplete,2,critical,convenient,nonprob,recommended,priority -usual,less_proper,incomplete,2,critical,convenient,nonprob,priority,priority 
-usual,less_proper,incomplete,2,critical,convenient,nonprob,not_recom,not_recom -usual,less_proper,incomplete,2,critical,convenient,slightly_prob,recommended,priority -usual,less_proper,incomplete,2,critical,convenient,slightly_prob,priority,priority -usual,less_proper,incomplete,2,critical,convenient,slightly_prob,not_recom,not_recom -usual,less_proper,incomplete,2,critical,convenient,problematic,recommended,priority -usual,less_proper,incomplete,2,critical,convenient,problematic,priority,priority -usual,less_proper,incomplete,2,critical,convenient,problematic,not_recom,not_recom -usual,less_proper,incomplete,2,critical,inconv,nonprob,recommended,priority -usual,less_proper,incomplete,2,critical,inconv,nonprob,priority,priority -usual,less_proper,incomplete,2,critical,inconv,nonprob,not_recom,not_recom -usual,less_proper,incomplete,2,critical,inconv,slightly_prob,recommended,priority -usual,less_proper,incomplete,2,critical,inconv,slightly_prob,priority,priority -usual,less_proper,incomplete,2,critical,inconv,slightly_prob,not_recom,not_recom -usual,less_proper,incomplete,2,critical,inconv,problematic,recommended,priority -usual,less_proper,incomplete,2,critical,inconv,problematic,priority,priority -usual,less_proper,incomplete,2,critical,inconv,problematic,not_recom,not_recom -usual,less_proper,incomplete,3,convenient,convenient,nonprob,recommended,very_recom -usual,less_proper,incomplete,3,convenient,convenient,nonprob,priority,priority -usual,less_proper,incomplete,3,convenient,convenient,nonprob,not_recom,not_recom -usual,less_proper,incomplete,3,convenient,convenient,slightly_prob,recommended,very_recom -usual,less_proper,incomplete,3,convenient,convenient,slightly_prob,priority,priority -usual,less_proper,incomplete,3,convenient,convenient,slightly_prob,not_recom,not_recom -usual,less_proper,incomplete,3,convenient,convenient,problematic,recommended,priority -usual,less_proper,incomplete,3,convenient,convenient,problematic,priority,priority 
-usual,less_proper,incomplete,3,convenient,convenient,problematic,not_recom,not_recom -usual,less_proper,incomplete,3,convenient,inconv,nonprob,recommended,priority -usual,less_proper,incomplete,3,convenient,inconv,nonprob,priority,priority -usual,less_proper,incomplete,3,convenient,inconv,nonprob,not_recom,not_recom -usual,less_proper,incomplete,3,convenient,inconv,slightly_prob,recommended,priority -usual,less_proper,incomplete,3,convenient,inconv,slightly_prob,priority,priority -usual,less_proper,incomplete,3,convenient,inconv,slightly_prob,not_recom,not_recom -usual,less_proper,incomplete,3,convenient,inconv,problematic,recommended,priority -usual,less_proper,incomplete,3,convenient,inconv,problematic,priority,priority -usual,less_proper,incomplete,3,convenient,inconv,problematic,not_recom,not_recom -usual,less_proper,incomplete,3,less_conv,convenient,nonprob,recommended,priority -usual,less_proper,incomplete,3,less_conv,convenient,nonprob,priority,priority -usual,less_proper,incomplete,3,less_conv,convenient,nonprob,not_recom,not_recom -usual,less_proper,incomplete,3,less_conv,convenient,slightly_prob,recommended,priority -usual,less_proper,incomplete,3,less_conv,convenient,slightly_prob,priority,priority -usual,less_proper,incomplete,3,less_conv,convenient,slightly_prob,not_recom,not_recom -usual,less_proper,incomplete,3,less_conv,convenient,problematic,recommended,priority -usual,less_proper,incomplete,3,less_conv,convenient,problematic,priority,priority -usual,less_proper,incomplete,3,less_conv,convenient,problematic,not_recom,not_recom -usual,less_proper,incomplete,3,less_conv,inconv,nonprob,recommended,priority -usual,less_proper,incomplete,3,less_conv,inconv,nonprob,priority,priority -usual,less_proper,incomplete,3,less_conv,inconv,nonprob,not_recom,not_recom -usual,less_proper,incomplete,3,less_conv,inconv,slightly_prob,recommended,priority -usual,less_proper,incomplete,3,less_conv,inconv,slightly_prob,priority,priority 
-usual,less_proper,incomplete,3,less_conv,inconv,slightly_prob,not_recom,not_recom -usual,less_proper,incomplete,3,less_conv,inconv,problematic,recommended,priority -usual,less_proper,incomplete,3,less_conv,inconv,problematic,priority,priority -usual,less_proper,incomplete,3,less_conv,inconv,problematic,not_recom,not_recom -usual,less_proper,incomplete,3,critical,convenient,nonprob,recommended,priority -usual,less_proper,incomplete,3,critical,convenient,nonprob,priority,priority -usual,less_proper,incomplete,3,critical,convenient,nonprob,not_recom,not_recom -usual,less_proper,incomplete,3,critical,convenient,slightly_prob,recommended,priority -usual,less_proper,incomplete,3,critical,convenient,slightly_prob,priority,priority -usual,less_proper,incomplete,3,critical,convenient,slightly_prob,not_recom,not_recom -usual,less_proper,incomplete,3,critical,convenient,problematic,recommended,priority -usual,less_proper,incomplete,3,critical,convenient,problematic,priority,priority -usual,less_proper,incomplete,3,critical,convenient,problematic,not_recom,not_recom -usual,less_proper,incomplete,3,critical,inconv,nonprob,recommended,priority -usual,less_proper,incomplete,3,critical,inconv,nonprob,priority,priority -usual,less_proper,incomplete,3,critical,inconv,nonprob,not_recom,not_recom -usual,less_proper,incomplete,3,critical,inconv,slightly_prob,recommended,priority -usual,less_proper,incomplete,3,critical,inconv,slightly_prob,priority,priority -usual,less_proper,incomplete,3,critical,inconv,slightly_prob,not_recom,not_recom -usual,less_proper,incomplete,3,critical,inconv,problematic,recommended,priority -usual,less_proper,incomplete,3,critical,inconv,problematic,priority,priority -usual,less_proper,incomplete,3,critical,inconv,problematic,not_recom,not_recom -usual,less_proper,incomplete,more,convenient,convenient,nonprob,recommended,very_recom -usual,less_proper,incomplete,more,convenient,convenient,nonprob,priority,priority 
-usual,less_proper,incomplete,more,convenient,convenient,nonprob,not_recom,not_recom -usual,less_proper,incomplete,more,convenient,convenient,slightly_prob,recommended,very_recom -usual,less_proper,incomplete,more,convenient,convenient,slightly_prob,priority,priority -usual,less_proper,incomplete,more,convenient,convenient,slightly_prob,not_recom,not_recom -usual,less_proper,incomplete,more,convenient,convenient,problematic,recommended,priority -usual,less_proper,incomplete,more,convenient,convenient,problematic,priority,priority -usual,less_proper,incomplete,more,convenient,convenient,problematic,not_recom,not_recom -usual,less_proper,incomplete,more,convenient,inconv,nonprob,recommended,priority -usual,less_proper,incomplete,more,convenient,inconv,nonprob,priority,priority -usual,less_proper,incomplete,more,convenient,inconv,nonprob,not_recom,not_recom -usual,less_proper,incomplete,more,convenient,inconv,slightly_prob,recommended,priority -usual,less_proper,incomplete,more,convenient,inconv,slightly_prob,priority,priority -usual,less_proper,incomplete,more,convenient,inconv,slightly_prob,not_recom,not_recom -usual,less_proper,incomplete,more,convenient,inconv,problematic,recommended,priority -usual,less_proper,incomplete,more,convenient,inconv,problematic,priority,priority -usual,less_proper,incomplete,more,convenient,inconv,problematic,not_recom,not_recom -usual,less_proper,incomplete,more,less_conv,convenient,nonprob,recommended,priority -usual,less_proper,incomplete,more,less_conv,convenient,nonprob,priority,priority -usual,less_proper,incomplete,more,less_conv,convenient,nonprob,not_recom,not_recom -usual,less_proper,incomplete,more,less_conv,convenient,slightly_prob,recommended,priority -usual,less_proper,incomplete,more,less_conv,convenient,slightly_prob,priority,priority -usual,less_proper,incomplete,more,less_conv,convenient,slightly_prob,not_recom,not_recom -usual,less_proper,incomplete,more,less_conv,convenient,problematic,recommended,priority 
-usual,less_proper,incomplete,more,less_conv,convenient,problematic,priority,priority -usual,less_proper,incomplete,more,less_conv,convenient,problematic,not_recom,not_recom -usual,less_proper,incomplete,more,less_conv,inconv,nonprob,recommended,priority -usual,less_proper,incomplete,more,less_conv,inconv,nonprob,priority,priority -usual,less_proper,incomplete,more,less_conv,inconv,nonprob,not_recom,not_recom -usual,less_proper,incomplete,more,less_conv,inconv,slightly_prob,recommended,priority -usual,less_proper,incomplete,more,less_conv,inconv,slightly_prob,priority,priority -usual,less_proper,incomplete,more,less_conv,inconv,slightly_prob,not_recom,not_recom -usual,less_proper,incomplete,more,less_conv,inconv,problematic,recommended,priority -usual,less_proper,incomplete,more,less_conv,inconv,problematic,priority,priority -usual,less_proper,incomplete,more,less_conv,inconv,problematic,not_recom,not_recom -usual,less_proper,incomplete,more,critical,convenient,nonprob,recommended,priority -usual,less_proper,incomplete,more,critical,convenient,nonprob,priority,priority -usual,less_proper,incomplete,more,critical,convenient,nonprob,not_recom,not_recom -usual,less_proper,incomplete,more,critical,convenient,slightly_prob,recommended,priority -usual,less_proper,incomplete,more,critical,convenient,slightly_prob,priority,priority -usual,less_proper,incomplete,more,critical,convenient,slightly_prob,not_recom,not_recom -usual,less_proper,incomplete,more,critical,convenient,problematic,recommended,priority -usual,less_proper,incomplete,more,critical,convenient,problematic,priority,priority -usual,less_proper,incomplete,more,critical,convenient,problematic,not_recom,not_recom -usual,less_proper,incomplete,more,critical,inconv,nonprob,recommended,priority -usual,less_proper,incomplete,more,critical,inconv,nonprob,priority,priority -usual,less_proper,incomplete,more,critical,inconv,nonprob,not_recom,not_recom 
-usual,less_proper,incomplete,more,critical,inconv,slightly_prob,recommended,priority -usual,less_proper,incomplete,more,critical,inconv,slightly_prob,priority,priority -usual,less_proper,incomplete,more,critical,inconv,slightly_prob,not_recom,not_recom -usual,less_proper,incomplete,more,critical,inconv,problematic,recommended,priority -usual,less_proper,incomplete,more,critical,inconv,problematic,priority,priority -usual,less_proper,incomplete,more,critical,inconv,problematic,not_recom,not_recom -usual,less_proper,foster,1,convenient,convenient,nonprob,recommended,very_recom -usual,less_proper,foster,1,convenient,convenient,nonprob,priority,priority -usual,less_proper,foster,1,convenient,convenient,nonprob,not_recom,not_recom -usual,less_proper,foster,1,convenient,convenient,slightly_prob,recommended,very_recom -usual,less_proper,foster,1,convenient,convenient,slightly_prob,priority,priority -usual,less_proper,foster,1,convenient,convenient,slightly_prob,not_recom,not_recom -usual,less_proper,foster,1,convenient,convenient,problematic,recommended,priority -usual,less_proper,foster,1,convenient,convenient,problematic,priority,priority -usual,less_proper,foster,1,convenient,convenient,problematic,not_recom,not_recom -usual,less_proper,foster,1,convenient,inconv,nonprob,recommended,priority -usual,less_proper,foster,1,convenient,inconv,nonprob,priority,priority -usual,less_proper,foster,1,convenient,inconv,nonprob,not_recom,not_recom -usual,less_proper,foster,1,convenient,inconv,slightly_prob,recommended,priority -usual,less_proper,foster,1,convenient,inconv,slightly_prob,priority,priority -usual,less_proper,foster,1,convenient,inconv,slightly_prob,not_recom,not_recom -usual,less_proper,foster,1,convenient,inconv,problematic,recommended,priority -usual,less_proper,foster,1,convenient,inconv,problematic,priority,priority -usual,less_proper,foster,1,convenient,inconv,problematic,not_recom,not_recom 
-usual,less_proper,foster,1,less_conv,convenient,nonprob,recommended,priority -usual,less_proper,foster,1,less_conv,convenient,nonprob,priority,priority -usual,less_proper,foster,1,less_conv,convenient,nonprob,not_recom,not_recom -usual,less_proper,foster,1,less_conv,convenient,slightly_prob,recommended,priority -usual,less_proper,foster,1,less_conv,convenient,slightly_prob,priority,priority -usual,less_proper,foster,1,less_conv,convenient,slightly_prob,not_recom,not_recom -usual,less_proper,foster,1,less_conv,convenient,problematic,recommended,priority -usual,less_proper,foster,1,less_conv,convenient,problematic,priority,priority -usual,less_proper,foster,1,less_conv,convenient,problematic,not_recom,not_recom -usual,less_proper,foster,1,less_conv,inconv,nonprob,recommended,priority -usual,less_proper,foster,1,less_conv,inconv,nonprob,priority,priority -usual,less_proper,foster,1,less_conv,inconv,nonprob,not_recom,not_recom -usual,less_proper,foster,1,less_conv,inconv,slightly_prob,recommended,priority -usual,less_proper,foster,1,less_conv,inconv,slightly_prob,priority,priority -usual,less_proper,foster,1,less_conv,inconv,slightly_prob,not_recom,not_recom -usual,less_proper,foster,1,less_conv,inconv,problematic,recommended,priority -usual,less_proper,foster,1,less_conv,inconv,problematic,priority,priority -usual,less_proper,foster,1,less_conv,inconv,problematic,not_recom,not_recom -usual,less_proper,foster,1,critical,convenient,nonprob,recommended,priority -usual,less_proper,foster,1,critical,convenient,nonprob,priority,priority -usual,less_proper,foster,1,critical,convenient,nonprob,not_recom,not_recom -usual,less_proper,foster,1,critical,convenient,slightly_prob,recommended,priority -usual,less_proper,foster,1,critical,convenient,slightly_prob,priority,priority -usual,less_proper,foster,1,critical,convenient,slightly_prob,not_recom,not_recom -usual,less_proper,foster,1,critical,convenient,problematic,recommended,priority 
-usual,less_proper,foster,1,critical,convenient,problematic,priority,priority -usual,less_proper,foster,1,critical,convenient,problematic,not_recom,not_recom -usual,less_proper,foster,1,critical,inconv,nonprob,recommended,priority -usual,less_proper,foster,1,critical,inconv,nonprob,priority,priority -usual,less_proper,foster,1,critical,inconv,nonprob,not_recom,not_recom -usual,less_proper,foster,1,critical,inconv,slightly_prob,recommended,priority -usual,less_proper,foster,1,critical,inconv,slightly_prob,priority,priority -usual,less_proper,foster,1,critical,inconv,slightly_prob,not_recom,not_recom -usual,less_proper,foster,1,critical,inconv,problematic,recommended,priority -usual,less_proper,foster,1,critical,inconv,problematic,priority,priority -usual,less_proper,foster,1,critical,inconv,problematic,not_recom,not_recom -usual,less_proper,foster,2,convenient,convenient,nonprob,recommended,very_recom -usual,less_proper,foster,2,convenient,convenient,nonprob,priority,priority -usual,less_proper,foster,2,convenient,convenient,nonprob,not_recom,not_recom -usual,less_proper,foster,2,convenient,convenient,slightly_prob,recommended,very_recom -usual,less_proper,foster,2,convenient,convenient,slightly_prob,priority,priority -usual,less_proper,foster,2,convenient,convenient,slightly_prob,not_recom,not_recom -usual,less_proper,foster,2,convenient,convenient,problematic,recommended,priority -usual,less_proper,foster,2,convenient,convenient,problematic,priority,priority -usual,less_proper,foster,2,convenient,convenient,problematic,not_recom,not_recom -usual,less_proper,foster,2,convenient,inconv,nonprob,recommended,priority -usual,less_proper,foster,2,convenient,inconv,nonprob,priority,priority -usual,less_proper,foster,2,convenient,inconv,nonprob,not_recom,not_recom -usual,less_proper,foster,2,convenient,inconv,slightly_prob,recommended,priority -usual,less_proper,foster,2,convenient,inconv,slightly_prob,priority,priority 
-usual,less_proper,foster,2,convenient,inconv,slightly_prob,not_recom,not_recom -usual,less_proper,foster,2,convenient,inconv,problematic,recommended,priority -usual,less_proper,foster,2,convenient,inconv,problematic,priority,priority -usual,less_proper,foster,2,convenient,inconv,problematic,not_recom,not_recom -usual,less_proper,foster,2,less_conv,convenient,nonprob,recommended,priority -usual,less_proper,foster,2,less_conv,convenient,nonprob,priority,priority -usual,less_proper,foster,2,less_conv,convenient,nonprob,not_recom,not_recom -usual,less_proper,foster,2,less_conv,convenient,slightly_prob,recommended,priority -usual,less_proper,foster,2,less_conv,convenient,slightly_prob,priority,priority -usual,less_proper,foster,2,less_conv,convenient,slightly_prob,not_recom,not_recom -usual,less_proper,foster,2,less_conv,convenient,problematic,recommended,priority -usual,less_proper,foster,2,less_conv,convenient,problematic,priority,priority -usual,less_proper,foster,2,less_conv,convenient,problematic,not_recom,not_recom -usual,less_proper,foster,2,less_conv,inconv,nonprob,recommended,priority -usual,less_proper,foster,2,less_conv,inconv,nonprob,priority,priority -usual,less_proper,foster,2,less_conv,inconv,nonprob,not_recom,not_recom -usual,less_proper,foster,2,less_conv,inconv,slightly_prob,recommended,priority -usual,less_proper,foster,2,less_conv,inconv,slightly_prob,priority,priority -usual,less_proper,foster,2,less_conv,inconv,slightly_prob,not_recom,not_recom -usual,less_proper,foster,2,less_conv,inconv,problematic,recommended,priority -usual,less_proper,foster,2,less_conv,inconv,problematic,priority,priority -usual,less_proper,foster,2,less_conv,inconv,problematic,not_recom,not_recom -usual,less_proper,foster,2,critical,convenient,nonprob,recommended,priority -usual,less_proper,foster,2,critical,convenient,nonprob,priority,priority -usual,less_proper,foster,2,critical,convenient,nonprob,not_recom,not_recom 
-usual,less_proper,foster,2,critical,convenient,slightly_prob,recommended,priority -usual,less_proper,foster,2,critical,convenient,slightly_prob,priority,priority -usual,less_proper,foster,2,critical,convenient,slightly_prob,not_recom,not_recom -usual,less_proper,foster,2,critical,convenient,problematic,recommended,priority -usual,less_proper,foster,2,critical,convenient,problematic,priority,priority -usual,less_proper,foster,2,critical,convenient,problematic,not_recom,not_recom -usual,less_proper,foster,2,critical,inconv,nonprob,recommended,priority -usual,less_proper,foster,2,critical,inconv,nonprob,priority,priority -usual,less_proper,foster,2,critical,inconv,nonprob,not_recom,not_recom -usual,less_proper,foster,2,critical,inconv,slightly_prob,recommended,priority -usual,less_proper,foster,2,critical,inconv,slightly_prob,priority,priority -usual,less_proper,foster,2,critical,inconv,slightly_prob,not_recom,not_recom -usual,less_proper,foster,2,critical,inconv,problematic,recommended,priority -usual,less_proper,foster,2,critical,inconv,problematic,priority,priority -usual,less_proper,foster,2,critical,inconv,problematic,not_recom,not_recom -usual,less_proper,foster,3,convenient,convenient,nonprob,recommended,very_recom -usual,less_proper,foster,3,convenient,convenient,nonprob,priority,priority -usual,less_proper,foster,3,convenient,convenient,nonprob,not_recom,not_recom -usual,less_proper,foster,3,convenient,convenient,slightly_prob,recommended,very_recom -usual,less_proper,foster,3,convenient,convenient,slightly_prob,priority,priority -usual,less_proper,foster,3,convenient,convenient,slightly_prob,not_recom,not_recom -usual,less_proper,foster,3,convenient,convenient,problematic,recommended,priority -usual,less_proper,foster,3,convenient,convenient,problematic,priority,priority -usual,less_proper,foster,3,convenient,convenient,problematic,not_recom,not_recom -usual,less_proper,foster,3,convenient,inconv,nonprob,recommended,priority 
-usual,less_proper,foster,3,convenient,inconv,nonprob,priority,priority -usual,less_proper,foster,3,convenient,inconv,nonprob,not_recom,not_recom -usual,less_proper,foster,3,convenient,inconv,slightly_prob,recommended,priority -usual,less_proper,foster,3,convenient,inconv,slightly_prob,priority,priority -usual,less_proper,foster,3,convenient,inconv,slightly_prob,not_recom,not_recom -usual,less_proper,foster,3,convenient,inconv,problematic,recommended,priority -usual,less_proper,foster,3,convenient,inconv,problematic,priority,priority -usual,less_proper,foster,3,convenient,inconv,problematic,not_recom,not_recom -usual,less_proper,foster,3,less_conv,convenient,nonprob,recommended,priority -usual,less_proper,foster,3,less_conv,convenient,nonprob,priority,priority -usual,less_proper,foster,3,less_conv,convenient,nonprob,not_recom,not_recom -usual,less_proper,foster,3,less_conv,convenient,slightly_prob,recommended,priority -usual,less_proper,foster,3,less_conv,convenient,slightly_prob,priority,priority -usual,less_proper,foster,3,less_conv,convenient,slightly_prob,not_recom,not_recom -usual,less_proper,foster,3,less_conv,convenient,problematic,recommended,priority -usual,less_proper,foster,3,less_conv,convenient,problematic,priority,priority -usual,less_proper,foster,3,less_conv,convenient,problematic,not_recom,not_recom -usual,less_proper,foster,3,less_conv,inconv,nonprob,recommended,priority -usual,less_proper,foster,3,less_conv,inconv,nonprob,priority,priority -usual,less_proper,foster,3,less_conv,inconv,nonprob,not_recom,not_recom -usual,less_proper,foster,3,less_conv,inconv,slightly_prob,recommended,priority -usual,less_proper,foster,3,less_conv,inconv,slightly_prob,priority,priority -usual,less_proper,foster,3,less_conv,inconv,slightly_prob,not_recom,not_recom -usual,less_proper,foster,3,less_conv,inconv,problematic,recommended,priority -usual,less_proper,foster,3,less_conv,inconv,problematic,priority,priority 
-usual,less_proper,foster,3,less_conv,inconv,problematic,not_recom,not_recom -usual,less_proper,foster,3,critical,convenient,nonprob,recommended,priority -usual,less_proper,foster,3,critical,convenient,nonprob,priority,priority -usual,less_proper,foster,3,critical,convenient,nonprob,not_recom,not_recom -usual,less_proper,foster,3,critical,convenient,slightly_prob,recommended,priority -usual,less_proper,foster,3,critical,convenient,slightly_prob,priority,priority -usual,less_proper,foster,3,critical,convenient,slightly_prob,not_recom,not_recom -usual,less_proper,foster,3,critical,convenient,problematic,recommended,priority -usual,less_proper,foster,3,critical,convenient,problematic,priority,priority -usual,less_proper,foster,3,critical,convenient,problematic,not_recom,not_recom -usual,less_proper,foster,3,critical,inconv,nonprob,recommended,priority -usual,less_proper,foster,3,critical,inconv,nonprob,priority,priority -usual,less_proper,foster,3,critical,inconv,nonprob,not_recom,not_recom -usual,less_proper,foster,3,critical,inconv,slightly_prob,recommended,priority -usual,less_proper,foster,3,critical,inconv,slightly_prob,priority,priority -usual,less_proper,foster,3,critical,inconv,slightly_prob,not_recom,not_recom -usual,less_proper,foster,3,critical,inconv,problematic,recommended,priority -usual,less_proper,foster,3,critical,inconv,problematic,priority,priority -usual,less_proper,foster,3,critical,inconv,problematic,not_recom,not_recom -usual,less_proper,foster,more,convenient,convenient,nonprob,recommended,very_recom -usual,less_proper,foster,more,convenient,convenient,nonprob,priority,priority -usual,less_proper,foster,more,convenient,convenient,nonprob,not_recom,not_recom -usual,less_proper,foster,more,convenient,convenient,slightly_prob,recommended,very_recom -usual,less_proper,foster,more,convenient,convenient,slightly_prob,priority,priority -usual,less_proper,foster,more,convenient,convenient,slightly_prob,not_recom,not_recom 
-usual,less_proper,foster,more,convenient,convenient,problematic,recommended,priority -usual,less_proper,foster,more,convenient,convenient,problematic,priority,priority -usual,less_proper,foster,more,convenient,convenient,problematic,not_recom,not_recom -usual,less_proper,foster,more,convenient,inconv,nonprob,recommended,priority -usual,less_proper,foster,more,convenient,inconv,nonprob,priority,priority -usual,less_proper,foster,more,convenient,inconv,nonprob,not_recom,not_recom -usual,less_proper,foster,more,convenient,inconv,slightly_prob,recommended,priority -usual,less_proper,foster,more,convenient,inconv,slightly_prob,priority,priority -usual,less_proper,foster,more,convenient,inconv,slightly_prob,not_recom,not_recom -usual,less_proper,foster,more,convenient,inconv,problematic,recommended,priority -usual,less_proper,foster,more,convenient,inconv,problematic,priority,priority -usual,less_proper,foster,more,convenient,inconv,problematic,not_recom,not_recom -usual,less_proper,foster,more,less_conv,convenient,nonprob,recommended,priority -usual,less_proper,foster,more,less_conv,convenient,nonprob,priority,priority -usual,less_proper,foster,more,less_conv,convenient,nonprob,not_recom,not_recom -usual,less_proper,foster,more,less_conv,convenient,slightly_prob,recommended,priority -usual,less_proper,foster,more,less_conv,convenient,slightly_prob,priority,priority -usual,less_proper,foster,more,less_conv,convenient,slightly_prob,not_recom,not_recom -usual,less_proper,foster,more,less_conv,convenient,problematic,recommended,priority -usual,less_proper,foster,more,less_conv,convenient,problematic,priority,priority -usual,less_proper,foster,more,less_conv,convenient,problematic,not_recom,not_recom -usual,less_proper,foster,more,less_conv,inconv,nonprob,recommended,priority -usual,less_proper,foster,more,less_conv,inconv,nonprob,priority,priority -usual,less_proper,foster,more,less_conv,inconv,nonprob,not_recom,not_recom 
-usual,less_proper,foster,more,less_conv,inconv,slightly_prob,recommended,priority -usual,less_proper,foster,more,less_conv,inconv,slightly_prob,priority,priority -usual,less_proper,foster,more,less_conv,inconv,slightly_prob,not_recom,not_recom -usual,less_proper,foster,more,less_conv,inconv,problematic,recommended,priority -usual,less_proper,foster,more,less_conv,inconv,problematic,priority,priority -usual,less_proper,foster,more,less_conv,inconv,problematic,not_recom,not_recom -usual,less_proper,foster,more,critical,convenient,nonprob,recommended,priority -usual,less_proper,foster,more,critical,convenient,nonprob,priority,priority -usual,less_proper,foster,more,critical,convenient,nonprob,not_recom,not_recom -usual,less_proper,foster,more,critical,convenient,slightly_prob,recommended,priority -usual,less_proper,foster,more,critical,convenient,slightly_prob,priority,priority -usual,less_proper,foster,more,critical,convenient,slightly_prob,not_recom,not_recom -usual,less_proper,foster,more,critical,convenient,problematic,recommended,priority -usual,less_proper,foster,more,critical,convenient,problematic,priority,priority -usual,less_proper,foster,more,critical,convenient,problematic,not_recom,not_recom -usual,less_proper,foster,more,critical,inconv,nonprob,recommended,priority -usual,less_proper,foster,more,critical,inconv,nonprob,priority,priority -usual,less_proper,foster,more,critical,inconv,nonprob,not_recom,not_recom -usual,less_proper,foster,more,critical,inconv,slightly_prob,recommended,priority -usual,less_proper,foster,more,critical,inconv,slightly_prob,priority,priority -usual,less_proper,foster,more,critical,inconv,slightly_prob,not_recom,not_recom -usual,less_proper,foster,more,critical,inconv,problematic,recommended,priority -usual,less_proper,foster,more,critical,inconv,problematic,priority,priority -usual,less_proper,foster,more,critical,inconv,problematic,not_recom,not_recom 
-usual,improper,complete,1,convenient,convenient,nonprob,recommended,very_recom -usual,improper,complete,1,convenient,convenient,nonprob,priority,priority -usual,improper,complete,1,convenient,convenient,nonprob,not_recom,not_recom -usual,improper,complete,1,convenient,convenient,slightly_prob,recommended,very_recom -usual,improper,complete,1,convenient,convenient,slightly_prob,priority,priority -usual,improper,complete,1,convenient,convenient,slightly_prob,not_recom,not_recom -usual,improper,complete,1,convenient,convenient,problematic,recommended,priority -usual,improper,complete,1,convenient,convenient,problematic,priority,priority -usual,improper,complete,1,convenient,convenient,problematic,not_recom,not_recom -usual,improper,complete,1,convenient,inconv,nonprob,recommended,very_recom -usual,improper,complete,1,convenient,inconv,nonprob,priority,priority -usual,improper,complete,1,convenient,inconv,nonprob,not_recom,not_recom -usual,improper,complete,1,convenient,inconv,slightly_prob,recommended,very_recom -usual,improper,complete,1,convenient,inconv,slightly_prob,priority,priority -usual,improper,complete,1,convenient,inconv,slightly_prob,not_recom,not_recom -usual,improper,complete,1,convenient,inconv,problematic,recommended,priority -usual,improper,complete,1,convenient,inconv,problematic,priority,priority -usual,improper,complete,1,convenient,inconv,problematic,not_recom,not_recom -usual,improper,complete,1,less_conv,convenient,nonprob,recommended,very_recom -usual,improper,complete,1,less_conv,convenient,nonprob,priority,priority -usual,improper,complete,1,less_conv,convenient,nonprob,not_recom,not_recom -usual,improper,complete,1,less_conv,convenient,slightly_prob,recommended,very_recom -usual,improper,complete,1,less_conv,convenient,slightly_prob,priority,priority -usual,improper,complete,1,less_conv,convenient,slightly_prob,not_recom,not_recom -usual,improper,complete,1,less_conv,convenient,problematic,recommended,priority 
-usual,improper,complete,1,less_conv,convenient,problematic,priority,priority -usual,improper,complete,1,less_conv,convenient,problematic,not_recom,not_recom -usual,improper,complete,1,less_conv,inconv,nonprob,recommended,very_recom -usual,improper,complete,1,less_conv,inconv,nonprob,priority,priority -usual,improper,complete,1,less_conv,inconv,nonprob,not_recom,not_recom -usual,improper,complete,1,less_conv,inconv,slightly_prob,recommended,very_recom -usual,improper,complete,1,less_conv,inconv,slightly_prob,priority,priority -usual,improper,complete,1,less_conv,inconv,slightly_prob,not_recom,not_recom -usual,improper,complete,1,less_conv,inconv,problematic,recommended,priority -usual,improper,complete,1,less_conv,inconv,problematic,priority,priority -usual,improper,complete,1,less_conv,inconv,problematic,not_recom,not_recom -usual,improper,complete,1,critical,convenient,nonprob,recommended,very_recom -usual,improper,complete,1,critical,convenient,nonprob,priority,priority -usual,improper,complete,1,critical,convenient,nonprob,not_recom,not_recom -usual,improper,complete,1,critical,convenient,slightly_prob,recommended,very_recom -usual,improper,complete,1,critical,convenient,slightly_prob,priority,priority -usual,improper,complete,1,critical,convenient,slightly_prob,not_recom,not_recom -usual,improper,complete,1,critical,convenient,problematic,recommended,priority -usual,improper,complete,1,critical,convenient,problematic,priority,priority -usual,improper,complete,1,critical,convenient,problematic,not_recom,not_recom -usual,improper,complete,1,critical,inconv,nonprob,recommended,very_recom -usual,improper,complete,1,critical,inconv,nonprob,priority,priority -usual,improper,complete,1,critical,inconv,nonprob,not_recom,not_recom -usual,improper,complete,1,critical,inconv,slightly_prob,recommended,very_recom -usual,improper,complete,1,critical,inconv,slightly_prob,priority,priority -usual,improper,complete,1,critical,inconv,slightly_prob,not_recom,not_recom 
-usual,improper,complete,1,critical,inconv,problematic,recommended,priority -usual,improper,complete,1,critical,inconv,problematic,priority,priority -usual,improper,complete,1,critical,inconv,problematic,not_recom,not_recom -usual,improper,complete,2,convenient,convenient,nonprob,recommended,very_recom -usual,improper,complete,2,convenient,convenient,nonprob,priority,priority -usual,improper,complete,2,convenient,convenient,nonprob,not_recom,not_recom -usual,improper,complete,2,convenient,convenient,slightly_prob,recommended,very_recom -usual,improper,complete,2,convenient,convenient,slightly_prob,priority,priority -usual,improper,complete,2,convenient,convenient,slightly_prob,not_recom,not_recom -usual,improper,complete,2,convenient,convenient,problematic,recommended,priority -usual,improper,complete,2,convenient,convenient,problematic,priority,priority -usual,improper,complete,2,convenient,convenient,problematic,not_recom,not_recom -usual,improper,complete,2,convenient,inconv,nonprob,recommended,very_recom -usual,improper,complete,2,convenient,inconv,nonprob,priority,priority -usual,improper,complete,2,convenient,inconv,nonprob,not_recom,not_recom -usual,improper,complete,2,convenient,inconv,slightly_prob,recommended,very_recom -usual,improper,complete,2,convenient,inconv,slightly_prob,priority,priority -usual,improper,complete,2,convenient,inconv,slightly_prob,not_recom,not_recom -usual,improper,complete,2,convenient,inconv,problematic,recommended,priority -usual,improper,complete,2,convenient,inconv,problematic,priority,priority -usual,improper,complete,2,convenient,inconv,problematic,not_recom,not_recom -usual,improper,complete,2,less_conv,convenient,nonprob,recommended,very_recom -usual,improper,complete,2,less_conv,convenient,nonprob,priority,priority -usual,improper,complete,2,less_conv,convenient,nonprob,not_recom,not_recom -usual,improper,complete,2,less_conv,convenient,slightly_prob,recommended,very_recom 
-usual,improper,complete,2,less_conv,convenient,slightly_prob,priority,priority -usual,improper,complete,2,less_conv,convenient,slightly_prob,not_recom,not_recom -usual,improper,complete,2,less_conv,convenient,problematic,recommended,priority -usual,improper,complete,2,less_conv,convenient,problematic,priority,priority -usual,improper,complete,2,less_conv,convenient,problematic,not_recom,not_recom -usual,improper,complete,2,less_conv,inconv,nonprob,recommended,very_recom -usual,improper,complete,2,less_conv,inconv,nonprob,priority,priority -usual,improper,complete,2,less_conv,inconv,nonprob,not_recom,not_recom -usual,improper,complete,2,less_conv,inconv,slightly_prob,recommended,very_recom -usual,improper,complete,2,less_conv,inconv,slightly_prob,priority,priority -usual,improper,complete,2,less_conv,inconv,slightly_prob,not_recom,not_recom -usual,improper,complete,2,less_conv,inconv,problematic,recommended,priority -usual,improper,complete,2,less_conv,inconv,problematic,priority,priority -usual,improper,complete,2,less_conv,inconv,problematic,not_recom,not_recom -usual,improper,complete,2,critical,convenient,nonprob,recommended,priority -usual,improper,complete,2,critical,convenient,nonprob,priority,priority -usual,improper,complete,2,critical,convenient,nonprob,not_recom,not_recom -usual,improper,complete,2,critical,convenient,slightly_prob,recommended,priority -usual,improper,complete,2,critical,convenient,slightly_prob,priority,priority -usual,improper,complete,2,critical,convenient,slightly_prob,not_recom,not_recom -usual,improper,complete,2,critical,convenient,problematic,recommended,priority -usual,improper,complete,2,critical,convenient,problematic,priority,priority -usual,improper,complete,2,critical,convenient,problematic,not_recom,not_recom -usual,improper,complete,2,critical,inconv,nonprob,recommended,priority -usual,improper,complete,2,critical,inconv,nonprob,priority,priority -usual,improper,complete,2,critical,inconv,nonprob,not_recom,not_recom 
-usual,improper,complete,2,critical,inconv,slightly_prob,recommended,priority -usual,improper,complete,2,critical,inconv,slightly_prob,priority,priority -usual,improper,complete,2,critical,inconv,slightly_prob,not_recom,not_recom -usual,improper,complete,2,critical,inconv,problematic,recommended,priority -usual,improper,complete,2,critical,inconv,problematic,priority,priority -usual,improper,complete,2,critical,inconv,problematic,not_recom,not_recom -usual,improper,complete,3,convenient,convenient,nonprob,recommended,very_recom -usual,improper,complete,3,convenient,convenient,nonprob,priority,priority -usual,improper,complete,3,convenient,convenient,nonprob,not_recom,not_recom -usual,improper,complete,3,convenient,convenient,slightly_prob,recommended,very_recom -usual,improper,complete,3,convenient,convenient,slightly_prob,priority,priority -usual,improper,complete,3,convenient,convenient,slightly_prob,not_recom,not_recom -usual,improper,complete,3,convenient,convenient,problematic,recommended,priority -usual,improper,complete,3,convenient,convenient,problematic,priority,priority -usual,improper,complete,3,convenient,convenient,problematic,not_recom,not_recom -usual,improper,complete,3,convenient,inconv,nonprob,recommended,priority -usual,improper,complete,3,convenient,inconv,nonprob,priority,priority -usual,improper,complete,3,convenient,inconv,nonprob,not_recom,not_recom -usual,improper,complete,3,convenient,inconv,slightly_prob,recommended,priority -usual,improper,complete,3,convenient,inconv,slightly_prob,priority,priority -usual,improper,complete,3,convenient,inconv,slightly_prob,not_recom,not_recom -usual,improper,complete,3,convenient,inconv,problematic,recommended,priority -usual,improper,complete,3,convenient,inconv,problematic,priority,priority -usual,improper,complete,3,convenient,inconv,problematic,not_recom,not_recom -usual,improper,complete,3,less_conv,convenient,nonprob,recommended,priority 
-usual,improper,complete,3,less_conv,convenient,nonprob,priority,priority -usual,improper,complete,3,less_conv,convenient,nonprob,not_recom,not_recom -usual,improper,complete,3,less_conv,convenient,slightly_prob,recommended,priority -usual,improper,complete,3,less_conv,convenient,slightly_prob,priority,priority -usual,improper,complete,3,less_conv,convenient,slightly_prob,not_recom,not_recom -usual,improper,complete,3,less_conv,convenient,problematic,recommended,priority -usual,improper,complete,3,less_conv,convenient,problematic,priority,priority -usual,improper,complete,3,less_conv,convenient,problematic,not_recom,not_recom -usual,improper,complete,3,less_conv,inconv,nonprob,recommended,priority -usual,improper,complete,3,less_conv,inconv,nonprob,priority,priority -usual,improper,complete,3,less_conv,inconv,nonprob,not_recom,not_recom -usual,improper,complete,3,less_conv,inconv,slightly_prob,recommended,priority -usual,improper,complete,3,less_conv,inconv,slightly_prob,priority,priority -usual,improper,complete,3,less_conv,inconv,slightly_prob,not_recom,not_recom -usual,improper,complete,3,less_conv,inconv,problematic,recommended,priority -usual,improper,complete,3,less_conv,inconv,problematic,priority,priority -usual,improper,complete,3,less_conv,inconv,problematic,not_recom,not_recom -usual,improper,complete,3,critical,convenient,nonprob,recommended,priority -usual,improper,complete,3,critical,convenient,nonprob,priority,priority -usual,improper,complete,3,critical,convenient,nonprob,not_recom,not_recom -usual,improper,complete,3,critical,convenient,slightly_prob,recommended,priority -usual,improper,complete,3,critical,convenient,slightly_prob,priority,priority -usual,improper,complete,3,critical,convenient,slightly_prob,not_recom,not_recom -usual,improper,complete,3,critical,convenient,problematic,recommended,priority -usual,improper,complete,3,critical,convenient,problematic,priority,priority 
-usual,improper,complete,3,critical,convenient,problematic,not_recom,not_recom -usual,improper,complete,3,critical,inconv,nonprob,recommended,priority -usual,improper,complete,3,critical,inconv,nonprob,priority,priority -usual,improper,complete,3,critical,inconv,nonprob,not_recom,not_recom -usual,improper,complete,3,critical,inconv,slightly_prob,recommended,priority -usual,improper,complete,3,critical,inconv,slightly_prob,priority,priority -usual,improper,complete,3,critical,inconv,slightly_prob,not_recom,not_recom -usual,improper,complete,3,critical,inconv,problematic,recommended,priority -usual,improper,complete,3,critical,inconv,problematic,priority,priority -usual,improper,complete,3,critical,inconv,problematic,not_recom,not_recom -usual,improper,complete,more,convenient,convenient,nonprob,recommended,very_recom -usual,improper,complete,more,convenient,convenient,nonprob,priority,priority -usual,improper,complete,more,convenient,convenient,nonprob,not_recom,not_recom -usual,improper,complete,more,convenient,convenient,slightly_prob,recommended,very_recom -usual,improper,complete,more,convenient,convenient,slightly_prob,priority,priority -usual,improper,complete,more,convenient,convenient,slightly_prob,not_recom,not_recom -usual,improper,complete,more,convenient,convenient,problematic,recommended,priority -usual,improper,complete,more,convenient,convenient,problematic,priority,priority -usual,improper,complete,more,convenient,convenient,problematic,not_recom,not_recom -usual,improper,complete,more,convenient,inconv,nonprob,recommended,priority -usual,improper,complete,more,convenient,inconv,nonprob,priority,priority -usual,improper,complete,more,convenient,inconv,nonprob,not_recom,not_recom -usual,improper,complete,more,convenient,inconv,slightly_prob,recommended,priority -usual,improper,complete,more,convenient,inconv,slightly_prob,priority,priority -usual,improper,complete,more,convenient,inconv,slightly_prob,not_recom,not_recom 
-usual,improper,complete,more,convenient,inconv,problematic,recommended,priority -usual,improper,complete,more,convenient,inconv,problematic,priority,priority -usual,improper,complete,more,convenient,inconv,problematic,not_recom,not_recom -usual,improper,complete,more,less_conv,convenient,nonprob,recommended,priority -usual,improper,complete,more,less_conv,convenient,nonprob,priority,priority -usual,improper,complete,more,less_conv,convenient,nonprob,not_recom,not_recom -usual,improper,complete,more,less_conv,convenient,slightly_prob,recommended,priority -usual,improper,complete,more,less_conv,convenient,slightly_prob,priority,priority -usual,improper,complete,more,less_conv,convenient,slightly_prob,not_recom,not_recom -usual,improper,complete,more,less_conv,convenient,problematic,recommended,priority -usual,improper,complete,more,less_conv,convenient,problematic,priority,priority -usual,improper,complete,more,less_conv,convenient,problematic,not_recom,not_recom -usual,improper,complete,more,less_conv,inconv,nonprob,recommended,priority -usual,improper,complete,more,less_conv,inconv,nonprob,priority,priority -usual,improper,complete,more,less_conv,inconv,nonprob,not_recom,not_recom -usual,improper,complete,more,less_conv,inconv,slightly_prob,recommended,priority -usual,improper,complete,more,less_conv,inconv,slightly_prob,priority,priority -usual,improper,complete,more,less_conv,inconv,slightly_prob,not_recom,not_recom -usual,improper,complete,more,less_conv,inconv,problematic,recommended,priority -usual,improper,complete,more,less_conv,inconv,problematic,priority,priority -usual,improper,complete,more,less_conv,inconv,problematic,not_recom,not_recom -usual,improper,complete,more,critical,convenient,nonprob,recommended,priority -usual,improper,complete,more,critical,convenient,nonprob,priority,priority -usual,improper,complete,more,critical,convenient,nonprob,not_recom,not_recom -usual,improper,complete,more,critical,convenient,slightly_prob,recommended,priority 
-usual,improper,complete,more,critical,convenient,slightly_prob,priority,priority -usual,improper,complete,more,critical,convenient,slightly_prob,not_recom,not_recom -usual,improper,complete,more,critical,convenient,problematic,recommended,priority -usual,improper,complete,more,critical,convenient,problematic,priority,priority -usual,improper,complete,more,critical,convenient,problematic,not_recom,not_recom -usual,improper,complete,more,critical,inconv,nonprob,recommended,priority -usual,improper,complete,more,critical,inconv,nonprob,priority,priority -usual,improper,complete,more,critical,inconv,nonprob,not_recom,not_recom -usual,improper,complete,more,critical,inconv,slightly_prob,recommended,priority -usual,improper,complete,more,critical,inconv,slightly_prob,priority,priority -usual,improper,complete,more,critical,inconv,slightly_prob,not_recom,not_recom -usual,improper,complete,more,critical,inconv,problematic,recommended,priority -usual,improper,complete,more,critical,inconv,problematic,priority,priority -usual,improper,complete,more,critical,inconv,problematic,not_recom,not_recom -usual,improper,completed,1,convenient,convenient,nonprob,recommended,very_recom -usual,improper,completed,1,convenient,convenient,nonprob,priority,priority -usual,improper,completed,1,convenient,convenient,nonprob,not_recom,not_recom -usual,improper,completed,1,convenient,convenient,slightly_prob,recommended,very_recom -usual,improper,completed,1,convenient,convenient,slightly_prob,priority,priority -usual,improper,completed,1,convenient,convenient,slightly_prob,not_recom,not_recom -usual,improper,completed,1,convenient,convenient,problematic,recommended,priority -usual,improper,completed,1,convenient,convenient,problematic,priority,priority -usual,improper,completed,1,convenient,convenient,problematic,not_recom,not_recom -usual,improper,completed,1,convenient,inconv,nonprob,recommended,very_recom -usual,improper,completed,1,convenient,inconv,nonprob,priority,priority 
-usual,improper,completed,1,convenient,inconv,nonprob,not_recom,not_recom -usual,improper,completed,1,convenient,inconv,slightly_prob,recommended,very_recom -usual,improper,completed,1,convenient,inconv,slightly_prob,priority,priority -usual,improper,completed,1,convenient,inconv,slightly_prob,not_recom,not_recom -usual,improper,completed,1,convenient,inconv,problematic,recommended,priority -usual,improper,completed,1,convenient,inconv,problematic,priority,priority -usual,improper,completed,1,convenient,inconv,problematic,not_recom,not_recom -usual,improper,completed,1,less_conv,convenient,nonprob,recommended,very_recom -usual,improper,completed,1,less_conv,convenient,nonprob,priority,priority -usual,improper,completed,1,less_conv,convenient,nonprob,not_recom,not_recom -usual,improper,completed,1,less_conv,convenient,slightly_prob,recommended,very_recom -usual,improper,completed,1,less_conv,convenient,slightly_prob,priority,priority -usual,improper,completed,1,less_conv,convenient,slightly_prob,not_recom,not_recom -usual,improper,completed,1,less_conv,convenient,problematic,recommended,priority -usual,improper,completed,1,less_conv,convenient,problematic,priority,priority -usual,improper,completed,1,less_conv,convenient,problematic,not_recom,not_recom -usual,improper,completed,1,less_conv,inconv,nonprob,recommended,very_recom -usual,improper,completed,1,less_conv,inconv,nonprob,priority,priority -usual,improper,completed,1,less_conv,inconv,nonprob,not_recom,not_recom -usual,improper,completed,1,less_conv,inconv,slightly_prob,recommended,very_recom -usual,improper,completed,1,less_conv,inconv,slightly_prob,priority,priority -usual,improper,completed,1,less_conv,inconv,slightly_prob,not_recom,not_recom -usual,improper,completed,1,less_conv,inconv,problematic,recommended,priority -usual,improper,completed,1,less_conv,inconv,problematic,priority,priority -usual,improper,completed,1,less_conv,inconv,problematic,not_recom,not_recom 
-usual,improper,completed,1,critical,convenient,nonprob,recommended,priority -usual,improper,completed,1,critical,convenient,nonprob,priority,priority -usual,improper,completed,1,critical,convenient,nonprob,not_recom,not_recom -usual,improper,completed,1,critical,convenient,slightly_prob,recommended,priority -usual,improper,completed,1,critical,convenient,slightly_prob,priority,priority -usual,improper,completed,1,critical,convenient,slightly_prob,not_recom,not_recom -usual,improper,completed,1,critical,convenient,problematic,recommended,priority -usual,improper,completed,1,critical,convenient,problematic,priority,priority -usual,improper,completed,1,critical,convenient,problematic,not_recom,not_recom -usual,improper,completed,1,critical,inconv,nonprob,recommended,priority -usual,improper,completed,1,critical,inconv,nonprob,priority,priority -usual,improper,completed,1,critical,inconv,nonprob,not_recom,not_recom -usual,improper,completed,1,critical,inconv,slightly_prob,recommended,priority -usual,improper,completed,1,critical,inconv,slightly_prob,priority,priority -usual,improper,completed,1,critical,inconv,slightly_prob,not_recom,not_recom -usual,improper,completed,1,critical,inconv,problematic,recommended,priority -usual,improper,completed,1,critical,inconv,problematic,priority,priority -usual,improper,completed,1,critical,inconv,problematic,not_recom,not_recom -usual,improper,completed,2,convenient,convenient,nonprob,recommended,very_recom -usual,improper,completed,2,convenient,convenient,nonprob,priority,priority -usual,improper,completed,2,convenient,convenient,nonprob,not_recom,not_recom -usual,improper,completed,2,convenient,convenient,slightly_prob,recommended,very_recom -usual,improper,completed,2,convenient,convenient,slightly_prob,priority,priority -usual,improper,completed,2,convenient,convenient,slightly_prob,not_recom,not_recom -usual,improper,completed,2,convenient,convenient,problematic,recommended,priority 
-usual,improper,completed,2,convenient,convenient,problematic,priority,priority -usual,improper,completed,2,convenient,convenient,problematic,not_recom,not_recom -usual,improper,completed,2,convenient,inconv,nonprob,recommended,very_recom -usual,improper,completed,2,convenient,inconv,nonprob,priority,priority -usual,improper,completed,2,convenient,inconv,nonprob,not_recom,not_recom -usual,improper,completed,2,convenient,inconv,slightly_prob,recommended,very_recom -usual,improper,completed,2,convenient,inconv,slightly_prob,priority,priority -usual,improper,completed,2,convenient,inconv,slightly_prob,not_recom,not_recom -usual,improper,completed,2,convenient,inconv,problematic,recommended,priority -usual,improper,completed,2,convenient,inconv,problematic,priority,priority -usual,improper,completed,2,convenient,inconv,problematic,not_recom,not_recom -usual,improper,completed,2,less_conv,convenient,nonprob,recommended,very_recom -usual,improper,completed,2,less_conv,convenient,nonprob,priority,priority -usual,improper,completed,2,less_conv,convenient,nonprob,not_recom,not_recom -usual,improper,completed,2,less_conv,convenient,slightly_prob,recommended,very_recom -usual,improper,completed,2,less_conv,convenient,slightly_prob,priority,priority -usual,improper,completed,2,less_conv,convenient,slightly_prob,not_recom,not_recom -usual,improper,completed,2,less_conv,convenient,problematic,recommended,priority -usual,improper,completed,2,less_conv,convenient,problematic,priority,priority -usual,improper,completed,2,less_conv,convenient,problematic,not_recom,not_recom -usual,improper,completed,2,less_conv,inconv,nonprob,recommended,very_recom -usual,improper,completed,2,less_conv,inconv,nonprob,priority,priority -usual,improper,completed,2,less_conv,inconv,nonprob,not_recom,not_recom -usual,improper,completed,2,less_conv,inconv,slightly_prob,recommended,very_recom -usual,improper,completed,2,less_conv,inconv,slightly_prob,priority,priority 
-usual,improper,completed,2,less_conv,inconv,slightly_prob,not_recom,not_recom -usual,improper,completed,2,less_conv,inconv,problematic,recommended,priority -usual,improper,completed,2,less_conv,inconv,problematic,priority,priority -usual,improper,completed,2,less_conv,inconv,problematic,not_recom,not_recom -usual,improper,completed,2,critical,convenient,nonprob,recommended,priority -usual,improper,completed,2,critical,convenient,nonprob,priority,priority -usual,improper,completed,2,critical,convenient,nonprob,not_recom,not_recom -usual,improper,completed,2,critical,convenient,slightly_prob,recommended,priority -usual,improper,completed,2,critical,convenient,slightly_prob,priority,priority -usual,improper,completed,2,critical,convenient,slightly_prob,not_recom,not_recom -usual,improper,completed,2,critical,convenient,problematic,recommended,priority -usual,improper,completed,2,critical,convenient,problematic,priority,priority -usual,improper,completed,2,critical,convenient,problematic,not_recom,not_recom -usual,improper,completed,2,critical,inconv,nonprob,recommended,priority -usual,improper,completed,2,critical,inconv,nonprob,priority,priority -usual,improper,completed,2,critical,inconv,nonprob,not_recom,not_recom -usual,improper,completed,2,critical,inconv,slightly_prob,recommended,priority -usual,improper,completed,2,critical,inconv,slightly_prob,priority,priority -usual,improper,completed,2,critical,inconv,slightly_prob,not_recom,not_recom -usual,improper,completed,2,critical,inconv,problematic,recommended,priority -usual,improper,completed,2,critical,inconv,problematic,priority,priority -usual,improper,completed,2,critical,inconv,problematic,not_recom,not_recom -usual,improper,completed,3,convenient,convenient,nonprob,recommended,very_recom -usual,improper,completed,3,convenient,convenient,nonprob,priority,priority -usual,improper,completed,3,convenient,convenient,nonprob,not_recom,not_recom 
-usual,improper,completed,3,convenient,convenient,slightly_prob,recommended,very_recom -usual,improper,completed,3,convenient,convenient,slightly_prob,priority,priority -usual,improper,completed,3,convenient,convenient,slightly_prob,not_recom,not_recom -usual,improper,completed,3,convenient,convenient,problematic,recommended,priority -usual,improper,completed,3,convenient,convenient,problematic,priority,priority -usual,improper,completed,3,convenient,convenient,problematic,not_recom,not_recom -usual,improper,completed,3,convenient,inconv,nonprob,recommended,priority -usual,improper,completed,3,convenient,inconv,nonprob,priority,priority -usual,improper,completed,3,convenient,inconv,nonprob,not_recom,not_recom -usual,improper,completed,3,convenient,inconv,slightly_prob,recommended,priority -usual,improper,completed,3,convenient,inconv,slightly_prob,priority,priority -usual,improper,completed,3,convenient,inconv,slightly_prob,not_recom,not_recom -usual,improper,completed,3,convenient,inconv,problematic,recommended,priority -usual,improper,completed,3,convenient,inconv,problematic,priority,priority -usual,improper,completed,3,convenient,inconv,problematic,not_recom,not_recom -usual,improper,completed,3,less_conv,convenient,nonprob,recommended,priority -usual,improper,completed,3,less_conv,convenient,nonprob,priority,priority -usual,improper,completed,3,less_conv,convenient,nonprob,not_recom,not_recom -usual,improper,completed,3,less_conv,convenient,slightly_prob,recommended,priority -usual,improper,completed,3,less_conv,convenient,slightly_prob,priority,priority -usual,improper,completed,3,less_conv,convenient,slightly_prob,not_recom,not_recom -usual,improper,completed,3,less_conv,convenient,problematic,recommended,priority -usual,improper,completed,3,less_conv,convenient,problematic,priority,priority -usual,improper,completed,3,less_conv,convenient,problematic,not_recom,not_recom -usual,improper,completed,3,less_conv,inconv,nonprob,recommended,priority 
-usual,improper,completed,3,less_conv,inconv,nonprob,priority,priority -usual,improper,completed,3,less_conv,inconv,nonprob,not_recom,not_recom -usual,improper,completed,3,less_conv,inconv,slightly_prob,recommended,priority -usual,improper,completed,3,less_conv,inconv,slightly_prob,priority,priority -usual,improper,completed,3,less_conv,inconv,slightly_prob,not_recom,not_recom -usual,improper,completed,3,less_conv,inconv,problematic,recommended,priority -usual,improper,completed,3,less_conv,inconv,problematic,priority,priority -usual,improper,completed,3,less_conv,inconv,problematic,not_recom,not_recom -usual,improper,completed,3,critical,convenient,nonprob,recommended,priority -usual,improper,completed,3,critical,convenient,nonprob,priority,priority -usual,improper,completed,3,critical,convenient,nonprob,not_recom,not_recom -usual,improper,completed,3,critical,convenient,slightly_prob,recommended,priority -usual,improper,completed,3,critical,convenient,slightly_prob,priority,priority -usual,improper,completed,3,critical,convenient,slightly_prob,not_recom,not_recom -usual,improper,completed,3,critical,convenient,problematic,recommended,priority -usual,improper,completed,3,critical,convenient,problematic,priority,priority -usual,improper,completed,3,critical,convenient,problematic,not_recom,not_recom -usual,improper,completed,3,critical,inconv,nonprob,recommended,priority -usual,improper,completed,3,critical,inconv,nonprob,priority,priority -usual,improper,completed,3,critical,inconv,nonprob,not_recom,not_recom -usual,improper,completed,3,critical,inconv,slightly_prob,recommended,priority -usual,improper,completed,3,critical,inconv,slightly_prob,priority,priority -usual,improper,completed,3,critical,inconv,slightly_prob,not_recom,not_recom -usual,improper,completed,3,critical,inconv,problematic,recommended,priority -usual,improper,completed,3,critical,inconv,problematic,priority,priority -usual,improper,completed,3,critical,inconv,problematic,not_recom,not_recom 
-usual,improper,completed,more,convenient,convenient,nonprob,recommended,very_recom -usual,improper,completed,more,convenient,convenient,nonprob,priority,priority -usual,improper,completed,more,convenient,convenient,nonprob,not_recom,not_recom -usual,improper,completed,more,convenient,convenient,slightly_prob,recommended,very_recom -usual,improper,completed,more,convenient,convenient,slightly_prob,priority,priority -usual,improper,completed,more,convenient,convenient,slightly_prob,not_recom,not_recom -usual,improper,completed,more,convenient,convenient,problematic,recommended,priority -usual,improper,completed,more,convenient,convenient,problematic,priority,priority -usual,improper,completed,more,convenient,convenient,problematic,not_recom,not_recom -usual,improper,completed,more,convenient,inconv,nonprob,recommended,priority -usual,improper,completed,more,convenient,inconv,nonprob,priority,priority -usual,improper,completed,more,convenient,inconv,nonprob,not_recom,not_recom -usual,improper,completed,more,convenient,inconv,slightly_prob,recommended,priority -usual,improper,completed,more,convenient,inconv,slightly_prob,priority,priority -usual,improper,completed,more,convenient,inconv,slightly_prob,not_recom,not_recom -usual,improper,completed,more,convenient,inconv,problematic,recommended,priority -usual,improper,completed,more,convenient,inconv,problematic,priority,priority -usual,improper,completed,more,convenient,inconv,problematic,not_recom,not_recom -usual,improper,completed,more,less_conv,convenient,nonprob,recommended,priority -usual,improper,completed,more,less_conv,convenient,nonprob,priority,priority -usual,improper,completed,more,less_conv,convenient,nonprob,not_recom,not_recom -usual,improper,completed,more,less_conv,convenient,slightly_prob,recommended,priority -usual,improper,completed,more,less_conv,convenient,slightly_prob,priority,priority -usual,improper,completed,more,less_conv,convenient,slightly_prob,not_recom,not_recom 
-usual,improper,completed,more,less_conv,convenient,problematic,recommended,priority -usual,improper,completed,more,less_conv,convenient,problematic,priority,priority -usual,improper,completed,more,less_conv,convenient,problematic,not_recom,not_recom -usual,improper,completed,more,less_conv,inconv,nonprob,recommended,priority -usual,improper,completed,more,less_conv,inconv,nonprob,priority,priority -usual,improper,completed,more,less_conv,inconv,nonprob,not_recom,not_recom -usual,improper,completed,more,less_conv,inconv,slightly_prob,recommended,priority -usual,improper,completed,more,less_conv,inconv,slightly_prob,priority,priority -usual,improper,completed,more,less_conv,inconv,slightly_prob,not_recom,not_recom -usual,improper,completed,more,less_conv,inconv,problematic,recommended,priority -usual,improper,completed,more,less_conv,inconv,problematic,priority,priority -usual,improper,completed,more,less_conv,inconv,problematic,not_recom,not_recom -usual,improper,completed,more,critical,convenient,nonprob,recommended,priority -usual,improper,completed,more,critical,convenient,nonprob,priority,priority -usual,improper,completed,more,critical,convenient,nonprob,not_recom,not_recom -usual,improper,completed,more,critical,convenient,slightly_prob,recommended,priority -usual,improper,completed,more,critical,convenient,slightly_prob,priority,priority -usual,improper,completed,more,critical,convenient,slightly_prob,not_recom,not_recom -usual,improper,completed,more,critical,convenient,problematic,recommended,priority -usual,improper,completed,more,critical,convenient,problematic,priority,priority -usual,improper,completed,more,critical,convenient,problematic,not_recom,not_recom -usual,improper,completed,more,critical,inconv,nonprob,recommended,priority -usual,improper,completed,more,critical,inconv,nonprob,priority,priority -usual,improper,completed,more,critical,inconv,nonprob,not_recom,not_recom 
-usual,improper,completed,more,critical,inconv,slightly_prob,recommended,priority -usual,improper,completed,more,critical,inconv,slightly_prob,priority,priority -usual,improper,completed,more,critical,inconv,slightly_prob,not_recom,not_recom -usual,improper,completed,more,critical,inconv,problematic,recommended,priority -usual,improper,completed,more,critical,inconv,problematic,priority,priority -usual,improper,completed,more,critical,inconv,problematic,not_recom,not_recom -usual,improper,incomplete,1,convenient,convenient,nonprob,recommended,very_recom -usual,improper,incomplete,1,convenient,convenient,nonprob,priority,priority -usual,improper,incomplete,1,convenient,convenient,nonprob,not_recom,not_recom -usual,improper,incomplete,1,convenient,convenient,slightly_prob,recommended,very_recom -usual,improper,incomplete,1,convenient,convenient,slightly_prob,priority,priority -usual,improper,incomplete,1,convenient,convenient,slightly_prob,not_recom,not_recom -usual,improper,incomplete,1,convenient,convenient,problematic,recommended,priority -usual,improper,incomplete,1,convenient,convenient,problematic,priority,priority -usual,improper,incomplete,1,convenient,convenient,problematic,not_recom,not_recom -usual,improper,incomplete,1,convenient,inconv,nonprob,recommended,very_recom -usual,improper,incomplete,1,convenient,inconv,nonprob,priority,priority -usual,improper,incomplete,1,convenient,inconv,nonprob,not_recom,not_recom -usual,improper,incomplete,1,convenient,inconv,slightly_prob,recommended,very_recom -usual,improper,incomplete,1,convenient,inconv,slightly_prob,priority,priority -usual,improper,incomplete,1,convenient,inconv,slightly_prob,not_recom,not_recom -usual,improper,incomplete,1,convenient,inconv,problematic,recommended,priority -usual,improper,incomplete,1,convenient,inconv,problematic,priority,priority -usual,improper,incomplete,1,convenient,inconv,problematic,not_recom,not_recom 
-usual,improper,incomplete,1,less_conv,convenient,nonprob,recommended,very_recom -usual,improper,incomplete,1,less_conv,convenient,nonprob,priority,priority -usual,improper,incomplete,1,less_conv,convenient,nonprob,not_recom,not_recom -usual,improper,incomplete,1,less_conv,convenient,slightly_prob,recommended,very_recom -usual,improper,incomplete,1,less_conv,convenient,slightly_prob,priority,priority -usual,improper,incomplete,1,less_conv,convenient,slightly_prob,not_recom,not_recom -usual,improper,incomplete,1,less_conv,convenient,problematic,recommended,priority -usual,improper,incomplete,1,less_conv,convenient,problematic,priority,priority -usual,improper,incomplete,1,less_conv,convenient,problematic,not_recom,not_recom -usual,improper,incomplete,1,less_conv,inconv,nonprob,recommended,very_recom -usual,improper,incomplete,1,less_conv,inconv,nonprob,priority,priority -usual,improper,incomplete,1,less_conv,inconv,nonprob,not_recom,not_recom -usual,improper,incomplete,1,less_conv,inconv,slightly_prob,recommended,very_recom -usual,improper,incomplete,1,less_conv,inconv,slightly_prob,priority,priority -usual,improper,incomplete,1,less_conv,inconv,slightly_prob,not_recom,not_recom -usual,improper,incomplete,1,less_conv,inconv,problematic,recommended,priority -usual,improper,incomplete,1,less_conv,inconv,problematic,priority,priority -usual,improper,incomplete,1,less_conv,inconv,problematic,not_recom,not_recom -usual,improper,incomplete,1,critical,convenient,nonprob,recommended,priority -usual,improper,incomplete,1,critical,convenient,nonprob,priority,priority -usual,improper,incomplete,1,critical,convenient,nonprob,not_recom,not_recom -usual,improper,incomplete,1,critical,convenient,slightly_prob,recommended,priority -usual,improper,incomplete,1,critical,convenient,slightly_prob,priority,priority -usual,improper,incomplete,1,critical,convenient,slightly_prob,not_recom,not_recom -usual,improper,incomplete,1,critical,convenient,problematic,recommended,priority 
-usual,improper,incomplete,1,critical,convenient,problematic,priority,priority -usual,improper,incomplete,1,critical,convenient,problematic,not_recom,not_recom -usual,improper,incomplete,1,critical,inconv,nonprob,recommended,priority -usual,improper,incomplete,1,critical,inconv,nonprob,priority,priority -usual,improper,incomplete,1,critical,inconv,nonprob,not_recom,not_recom -usual,improper,incomplete,1,critical,inconv,slightly_prob,recommended,priority -usual,improper,incomplete,1,critical,inconv,slightly_prob,priority,priority -usual,improper,incomplete,1,critical,inconv,slightly_prob,not_recom,not_recom -usual,improper,incomplete,1,critical,inconv,problematic,recommended,priority -usual,improper,incomplete,1,critical,inconv,problematic,priority,priority -usual,improper,incomplete,1,critical,inconv,problematic,not_recom,not_recom -usual,improper,incomplete,2,convenient,convenient,nonprob,recommended,very_recom -usual,improper,incomplete,2,convenient,convenient,nonprob,priority,priority -usual,improper,incomplete,2,convenient,convenient,nonprob,not_recom,not_recom -usual,improper,incomplete,2,convenient,convenient,slightly_prob,recommended,very_recom -usual,improper,incomplete,2,convenient,convenient,slightly_prob,priority,priority -usual,improper,incomplete,2,convenient,convenient,slightly_prob,not_recom,not_recom -usual,improper,incomplete,2,convenient,convenient,problematic,recommended,priority -usual,improper,incomplete,2,convenient,convenient,problematic,priority,priority -usual,improper,incomplete,2,convenient,convenient,problematic,not_recom,not_recom -usual,improper,incomplete,2,convenient,inconv,nonprob,recommended,priority -usual,improper,incomplete,2,convenient,inconv,nonprob,priority,priority -usual,improper,incomplete,2,convenient,inconv,nonprob,not_recom,not_recom -usual,improper,incomplete,2,convenient,inconv,slightly_prob,recommended,priority -usual,improper,incomplete,2,convenient,inconv,slightly_prob,priority,priority 
-usual,improper,incomplete,2,convenient,inconv,slightly_prob,not_recom,not_recom -usual,improper,incomplete,2,convenient,inconv,problematic,recommended,priority -usual,improper,incomplete,2,convenient,inconv,problematic,priority,priority -usual,improper,incomplete,2,convenient,inconv,problematic,not_recom,not_recom -usual,improper,incomplete,2,less_conv,convenient,nonprob,recommended,priority -usual,improper,incomplete,2,less_conv,convenient,nonprob,priority,priority -usual,improper,incomplete,2,less_conv,convenient,nonprob,not_recom,not_recom -usual,improper,incomplete,2,less_conv,convenient,slightly_prob,recommended,priority -usual,improper,incomplete,2,less_conv,convenient,slightly_prob,priority,priority -usual,improper,incomplete,2,less_conv,convenient,slightly_prob,not_recom,not_recom -usual,improper,incomplete,2,less_conv,convenient,problematic,recommended,priority -usual,improper,incomplete,2,less_conv,convenient,problematic,priority,priority -usual,improper,incomplete,2,less_conv,convenient,problematic,not_recom,not_recom -usual,improper,incomplete,2,less_conv,inconv,nonprob,recommended,priority -usual,improper,incomplete,2,less_conv,inconv,nonprob,priority,priority -usual,improper,incomplete,2,less_conv,inconv,nonprob,not_recom,not_recom -usual,improper,incomplete,2,less_conv,inconv,slightly_prob,recommended,priority -usual,improper,incomplete,2,less_conv,inconv,slightly_prob,priority,priority -usual,improper,incomplete,2,less_conv,inconv,slightly_prob,not_recom,not_recom -usual,improper,incomplete,2,less_conv,inconv,problematic,recommended,priority -usual,improper,incomplete,2,less_conv,inconv,problematic,priority,priority -usual,improper,incomplete,2,less_conv,inconv,problematic,not_recom,not_recom -usual,improper,incomplete,2,critical,convenient,nonprob,recommended,priority -usual,improper,incomplete,2,critical,convenient,nonprob,priority,priority -usual,improper,incomplete,2,critical,convenient,nonprob,not_recom,not_recom 
-usual,improper,incomplete,2,critical,convenient,slightly_prob,recommended,priority -usual,improper,incomplete,2,critical,convenient,slightly_prob,priority,priority -usual,improper,incomplete,2,critical,convenient,slightly_prob,not_recom,not_recom -usual,improper,incomplete,2,critical,convenient,problematic,recommended,priority -usual,improper,incomplete,2,critical,convenient,problematic,priority,priority -usual,improper,incomplete,2,critical,convenient,problematic,not_recom,not_recom -usual,improper,incomplete,2,critical,inconv,nonprob,recommended,priority -usual,improper,incomplete,2,critical,inconv,nonprob,priority,priority -usual,improper,incomplete,2,critical,inconv,nonprob,not_recom,not_recom -usual,improper,incomplete,2,critical,inconv,slightly_prob,recommended,priority -usual,improper,incomplete,2,critical,inconv,slightly_prob,priority,priority -usual,improper,incomplete,2,critical,inconv,slightly_prob,not_recom,not_recom -usual,improper,incomplete,2,critical,inconv,problematic,recommended,priority -usual,improper,incomplete,2,critical,inconv,problematic,priority,priority -usual,improper,incomplete,2,critical,inconv,problematic,not_recom,not_recom -usual,improper,incomplete,3,convenient,convenient,nonprob,recommended,very_recom -usual,improper,incomplete,3,convenient,convenient,nonprob,priority,priority -usual,improper,incomplete,3,convenient,convenient,nonprob,not_recom,not_recom -usual,improper,incomplete,3,convenient,convenient,slightly_prob,recommended,very_recom -usual,improper,incomplete,3,convenient,convenient,slightly_prob,priority,priority -usual,improper,incomplete,3,convenient,convenient,slightly_prob,not_recom,not_recom -usual,improper,incomplete,3,convenient,convenient,problematic,recommended,priority -usual,improper,incomplete,3,convenient,convenient,problematic,priority,priority -usual,improper,incomplete,3,convenient,convenient,problematic,not_recom,not_recom -usual,improper,incomplete,3,convenient,inconv,nonprob,recommended,priority 
-usual,improper,incomplete,3,convenient,inconv,nonprob,priority,priority -usual,improper,incomplete,3,convenient,inconv,nonprob,not_recom,not_recom -usual,improper,incomplete,3,convenient,inconv,slightly_prob,recommended,priority -usual,improper,incomplete,3,convenient,inconv,slightly_prob,priority,priority -usual,improper,incomplete,3,convenient,inconv,slightly_prob,not_recom,not_recom -usual,improper,incomplete,3,convenient,inconv,problematic,recommended,priority -usual,improper,incomplete,3,convenient,inconv,problematic,priority,priority -usual,improper,incomplete,3,convenient,inconv,problematic,not_recom,not_recom -usual,improper,incomplete,3,less_conv,convenient,nonprob,recommended,priority -usual,improper,incomplete,3,less_conv,convenient,nonprob,priority,priority -usual,improper,incomplete,3,less_conv,convenient,nonprob,not_recom,not_recom -usual,improper,incomplete,3,less_conv,convenient,slightly_prob,recommended,priority -usual,improper,incomplete,3,less_conv,convenient,slightly_prob,priority,priority -usual,improper,incomplete,3,less_conv,convenient,slightly_prob,not_recom,not_recom -usual,improper,incomplete,3,less_conv,convenient,problematic,recommended,priority -usual,improper,incomplete,3,less_conv,convenient,problematic,priority,priority -usual,improper,incomplete,3,less_conv,convenient,problematic,not_recom,not_recom -usual,improper,incomplete,3,less_conv,inconv,nonprob,recommended,priority -usual,improper,incomplete,3,less_conv,inconv,nonprob,priority,priority -usual,improper,incomplete,3,less_conv,inconv,nonprob,not_recom,not_recom -usual,improper,incomplete,3,less_conv,inconv,slightly_prob,recommended,priority -usual,improper,incomplete,3,less_conv,inconv,slightly_prob,priority,priority -usual,improper,incomplete,3,less_conv,inconv,slightly_prob,not_recom,not_recom -usual,improper,incomplete,3,less_conv,inconv,problematic,recommended,priority -usual,improper,incomplete,3,less_conv,inconv,problematic,priority,priority 
-usual,improper,incomplete,3,less_conv,inconv,problematic,not_recom,not_recom -usual,improper,incomplete,3,critical,convenient,nonprob,recommended,priority -usual,improper,incomplete,3,critical,convenient,nonprob,priority,priority -usual,improper,incomplete,3,critical,convenient,nonprob,not_recom,not_recom -usual,improper,incomplete,3,critical,convenient,slightly_prob,recommended,priority -usual,improper,incomplete,3,critical,convenient,slightly_prob,priority,priority -usual,improper,incomplete,3,critical,convenient,slightly_prob,not_recom,not_recom -usual,improper,incomplete,3,critical,convenient,problematic,recommended,priority -usual,improper,incomplete,3,critical,convenient,problematic,priority,priority -usual,improper,incomplete,3,critical,convenient,problematic,not_recom,not_recom -usual,improper,incomplete,3,critical,inconv,nonprob,recommended,priority -usual,improper,incomplete,3,critical,inconv,nonprob,priority,priority -usual,improper,incomplete,3,critical,inconv,nonprob,not_recom,not_recom -usual,improper,incomplete,3,critical,inconv,slightly_prob,recommended,priority -usual,improper,incomplete,3,critical,inconv,slightly_prob,priority,priority -usual,improper,incomplete,3,critical,inconv,slightly_prob,not_recom,not_recom -usual,improper,incomplete,3,critical,inconv,problematic,recommended,priority -usual,improper,incomplete,3,critical,inconv,problematic,priority,priority -usual,improper,incomplete,3,critical,inconv,problematic,not_recom,not_recom -usual,improper,incomplete,more,convenient,convenient,nonprob,recommended,very_recom -usual,improper,incomplete,more,convenient,convenient,nonprob,priority,priority -usual,improper,incomplete,more,convenient,convenient,nonprob,not_recom,not_recom -usual,improper,incomplete,more,convenient,convenient,slightly_prob,recommended,very_recom -usual,improper,incomplete,more,convenient,convenient,slightly_prob,priority,priority -usual,improper,incomplete,more,convenient,convenient,slightly_prob,not_recom,not_recom 
-usual,improper,incomplete,more,convenient,convenient,problematic,recommended,priority -usual,improper,incomplete,more,convenient,convenient,problematic,priority,priority -usual,improper,incomplete,more,convenient,convenient,problematic,not_recom,not_recom -usual,improper,incomplete,more,convenient,inconv,nonprob,recommended,priority -usual,improper,incomplete,more,convenient,inconv,nonprob,priority,priority -usual,improper,incomplete,more,convenient,inconv,nonprob,not_recom,not_recom -usual,improper,incomplete,more,convenient,inconv,slightly_prob,recommended,priority -usual,improper,incomplete,more,convenient,inconv,slightly_prob,priority,priority -usual,improper,incomplete,more,convenient,inconv,slightly_prob,not_recom,not_recom -usual,improper,incomplete,more,convenient,inconv,problematic,recommended,priority -usual,improper,incomplete,more,convenient,inconv,problematic,priority,priority -usual,improper,incomplete,more,convenient,inconv,problematic,not_recom,not_recom -usual,improper,incomplete,more,less_conv,convenient,nonprob,recommended,priority -usual,improper,incomplete,more,less_conv,convenient,nonprob,priority,priority -usual,improper,incomplete,more,less_conv,convenient,nonprob,not_recom,not_recom -usual,improper,incomplete,more,less_conv,convenient,slightly_prob,recommended,priority -usual,improper,incomplete,more,less_conv,convenient,slightly_prob,priority,priority -usual,improper,incomplete,more,less_conv,convenient,slightly_prob,not_recom,not_recom -usual,improper,incomplete,more,less_conv,convenient,problematic,recommended,priority -usual,improper,incomplete,more,less_conv,convenient,problematic,priority,priority -usual,improper,incomplete,more,less_conv,convenient,problematic,not_recom,not_recom -usual,improper,incomplete,more,less_conv,inconv,nonprob,recommended,priority -usual,improper,incomplete,more,less_conv,inconv,nonprob,priority,priority -usual,improper,incomplete,more,less_conv,inconv,nonprob,not_recom,not_recom 
-usual,improper,incomplete,more,less_conv,inconv,slightly_prob,recommended,priority -usual,improper,incomplete,more,less_conv,inconv,slightly_prob,priority,priority -usual,improper,incomplete,more,less_conv,inconv,slightly_prob,not_recom,not_recom -usual,improper,incomplete,more,less_conv,inconv,problematic,recommended,priority -usual,improper,incomplete,more,less_conv,inconv,problematic,priority,priority -usual,improper,incomplete,more,less_conv,inconv,problematic,not_recom,not_recom -usual,improper,incomplete,more,critical,convenient,nonprob,recommended,priority -usual,improper,incomplete,more,critical,convenient,nonprob,priority,priority -usual,improper,incomplete,more,critical,convenient,nonprob,not_recom,not_recom -usual,improper,incomplete,more,critical,convenient,slightly_prob,recommended,priority -usual,improper,incomplete,more,critical,convenient,slightly_prob,priority,priority -usual,improper,incomplete,more,critical,convenient,slightly_prob,not_recom,not_recom -usual,improper,incomplete,more,critical,convenient,problematic,recommended,priority -usual,improper,incomplete,more,critical,convenient,problematic,priority,priority -usual,improper,incomplete,more,critical,convenient,problematic,not_recom,not_recom -usual,improper,incomplete,more,critical,inconv,nonprob,recommended,priority -usual,improper,incomplete,more,critical,inconv,nonprob,priority,priority -usual,improper,incomplete,more,critical,inconv,nonprob,not_recom,not_recom -usual,improper,incomplete,more,critical,inconv,slightly_prob,recommended,priority -usual,improper,incomplete,more,critical,inconv,slightly_prob,priority,priority -usual,improper,incomplete,more,critical,inconv,slightly_prob,not_recom,not_recom -usual,improper,incomplete,more,critical,inconv,problematic,recommended,priority -usual,improper,incomplete,more,critical,inconv,problematic,priority,priority -usual,improper,incomplete,more,critical,inconv,problematic,not_recom,not_recom 
-usual,improper,foster,1,convenient,convenient,nonprob,recommended,very_recom -usual,improper,foster,1,convenient,convenient,nonprob,priority,priority -usual,improper,foster,1,convenient,convenient,nonprob,not_recom,not_recom -usual,improper,foster,1,convenient,convenient,slightly_prob,recommended,very_recom -usual,improper,foster,1,convenient,convenient,slightly_prob,priority,priority -usual,improper,foster,1,convenient,convenient,slightly_prob,not_recom,not_recom -usual,improper,foster,1,convenient,convenient,problematic,recommended,priority -usual,improper,foster,1,convenient,convenient,problematic,priority,priority -usual,improper,foster,1,convenient,convenient,problematic,not_recom,not_recom -usual,improper,foster,1,convenient,inconv,nonprob,recommended,priority -usual,improper,foster,1,convenient,inconv,nonprob,priority,priority -usual,improper,foster,1,convenient,inconv,nonprob,not_recom,not_recom -usual,improper,foster,1,convenient,inconv,slightly_prob,recommended,priority -usual,improper,foster,1,convenient,inconv,slightly_prob,priority,priority -usual,improper,foster,1,convenient,inconv,slightly_prob,not_recom,not_recom -usual,improper,foster,1,convenient,inconv,problematic,recommended,priority -usual,improper,foster,1,convenient,inconv,problematic,priority,priority -usual,improper,foster,1,convenient,inconv,problematic,not_recom,not_recom -usual,improper,foster,1,less_conv,convenient,nonprob,recommended,priority -usual,improper,foster,1,less_conv,convenient,nonprob,priority,priority -usual,improper,foster,1,less_conv,convenient,nonprob,not_recom,not_recom -usual,improper,foster,1,less_conv,convenient,slightly_prob,recommended,priority -usual,improper,foster,1,less_conv,convenient,slightly_prob,priority,priority -usual,improper,foster,1,less_conv,convenient,slightly_prob,not_recom,not_recom -usual,improper,foster,1,less_conv,convenient,problematic,recommended,priority -usual,improper,foster,1,less_conv,convenient,problematic,priority,priority 
-usual,improper,foster,1,less_conv,convenient,problematic,not_recom,not_recom -usual,improper,foster,1,less_conv,inconv,nonprob,recommended,priority -usual,improper,foster,1,less_conv,inconv,nonprob,priority,priority -usual,improper,foster,1,less_conv,inconv,nonprob,not_recom,not_recom -usual,improper,foster,1,less_conv,inconv,slightly_prob,recommended,priority -usual,improper,foster,1,less_conv,inconv,slightly_prob,priority,priority -usual,improper,foster,1,less_conv,inconv,slightly_prob,not_recom,not_recom -usual,improper,foster,1,less_conv,inconv,problematic,recommended,priority -usual,improper,foster,1,less_conv,inconv,problematic,priority,priority -usual,improper,foster,1,less_conv,inconv,problematic,not_recom,not_recom -usual,improper,foster,1,critical,convenient,nonprob,recommended,priority -usual,improper,foster,1,critical,convenient,nonprob,priority,priority -usual,improper,foster,1,critical,convenient,nonprob,not_recom,not_recom -usual,improper,foster,1,critical,convenient,slightly_prob,recommended,priority -usual,improper,foster,1,critical,convenient,slightly_prob,priority,priority -usual,improper,foster,1,critical,convenient,slightly_prob,not_recom,not_recom -usual,improper,foster,1,critical,convenient,problematic,recommended,priority -usual,improper,foster,1,critical,convenient,problematic,priority,priority -usual,improper,foster,1,critical,convenient,problematic,not_recom,not_recom -usual,improper,foster,1,critical,inconv,nonprob,recommended,priority -usual,improper,foster,1,critical,inconv,nonprob,priority,priority -usual,improper,foster,1,critical,inconv,nonprob,not_recom,not_recom -usual,improper,foster,1,critical,inconv,slightly_prob,recommended,priority -usual,improper,foster,1,critical,inconv,slightly_prob,priority,priority -usual,improper,foster,1,critical,inconv,slightly_prob,not_recom,not_recom -usual,improper,foster,1,critical,inconv,problematic,recommended,priority -usual,improper,foster,1,critical,inconv,problematic,priority,priority 
-usual,improper,foster,1,critical,inconv,problematic,not_recom,not_recom -usual,improper,foster,2,convenient,convenient,nonprob,recommended,very_recom -usual,improper,foster,2,convenient,convenient,nonprob,priority,priority -usual,improper,foster,2,convenient,convenient,nonprob,not_recom,not_recom -usual,improper,foster,2,convenient,convenient,slightly_prob,recommended,very_recom -usual,improper,foster,2,convenient,convenient,slightly_prob,priority,priority -usual,improper,foster,2,convenient,convenient,slightly_prob,not_recom,not_recom -usual,improper,foster,2,convenient,convenient,problematic,recommended,priority -usual,improper,foster,2,convenient,convenient,problematic,priority,priority -usual,improper,foster,2,convenient,convenient,problematic,not_recom,not_recom -usual,improper,foster,2,convenient,inconv,nonprob,recommended,priority -usual,improper,foster,2,convenient,inconv,nonprob,priority,priority -usual,improper,foster,2,convenient,inconv,nonprob,not_recom,not_recom -usual,improper,foster,2,convenient,inconv,slightly_prob,recommended,priority -usual,improper,foster,2,convenient,inconv,slightly_prob,priority,priority -usual,improper,foster,2,convenient,inconv,slightly_prob,not_recom,not_recom -usual,improper,foster,2,convenient,inconv,problematic,recommended,priority -usual,improper,foster,2,convenient,inconv,problematic,priority,priority -usual,improper,foster,2,convenient,inconv,problematic,not_recom,not_recom -usual,improper,foster,2,less_conv,convenient,nonprob,recommended,priority -usual,improper,foster,2,less_conv,convenient,nonprob,priority,priority -usual,improper,foster,2,less_conv,convenient,nonprob,not_recom,not_recom -usual,improper,foster,2,less_conv,convenient,slightly_prob,recommended,priority -usual,improper,foster,2,less_conv,convenient,slightly_prob,priority,priority -usual,improper,foster,2,less_conv,convenient,slightly_prob,not_recom,not_recom -usual,improper,foster,2,less_conv,convenient,problematic,recommended,priority 
-usual,improper,foster,2,less_conv,convenient,problematic,priority,priority -usual,improper,foster,2,less_conv,convenient,problematic,not_recom,not_recom -usual,improper,foster,2,less_conv,inconv,nonprob,recommended,priority -usual,improper,foster,2,less_conv,inconv,nonprob,priority,priority -usual,improper,foster,2,less_conv,inconv,nonprob,not_recom,not_recom -usual,improper,foster,2,less_conv,inconv,slightly_prob,recommended,priority -usual,improper,foster,2,less_conv,inconv,slightly_prob,priority,priority -usual,improper,foster,2,less_conv,inconv,slightly_prob,not_recom,not_recom -usual,improper,foster,2,less_conv,inconv,problematic,recommended,priority -usual,improper,foster,2,less_conv,inconv,problematic,priority,priority -usual,improper,foster,2,less_conv,inconv,problematic,not_recom,not_recom -usual,improper,foster,2,critical,convenient,nonprob,recommended,priority -usual,improper,foster,2,critical,convenient,nonprob,priority,priority -usual,improper,foster,2,critical,convenient,nonprob,not_recom,not_recom -usual,improper,foster,2,critical,convenient,slightly_prob,recommended,priority -usual,improper,foster,2,critical,convenient,slightly_prob,priority,priority -usual,improper,foster,2,critical,convenient,slightly_prob,not_recom,not_recom -usual,improper,foster,2,critical,convenient,problematic,recommended,priority -usual,improper,foster,2,critical,convenient,problematic,priority,priority -usual,improper,foster,2,critical,convenient,problematic,not_recom,not_recom -usual,improper,foster,2,critical,inconv,nonprob,recommended,priority -usual,improper,foster,2,critical,inconv,nonprob,priority,priority -usual,improper,foster,2,critical,inconv,nonprob,not_recom,not_recom -usual,improper,foster,2,critical,inconv,slightly_prob,recommended,priority -usual,improper,foster,2,critical,inconv,slightly_prob,priority,priority -usual,improper,foster,2,critical,inconv,slightly_prob,not_recom,not_recom -usual,improper,foster,2,critical,inconv,problematic,recommended,priority 
-usual,improper,foster,2,critical,inconv,problematic,priority,priority -usual,improper,foster,2,critical,inconv,problematic,not_recom,not_recom -usual,improper,foster,3,convenient,convenient,nonprob,recommended,very_recom -usual,improper,foster,3,convenient,convenient,nonprob,priority,priority -usual,improper,foster,3,convenient,convenient,nonprob,not_recom,not_recom -usual,improper,foster,3,convenient,convenient,slightly_prob,recommended,very_recom -usual,improper,foster,3,convenient,convenient,slightly_prob,priority,priority -usual,improper,foster,3,convenient,convenient,slightly_prob,not_recom,not_recom -usual,improper,foster,3,convenient,convenient,problematic,recommended,priority -usual,improper,foster,3,convenient,convenient,problematic,priority,priority -usual,improper,foster,3,convenient,convenient,problematic,not_recom,not_recom -usual,improper,foster,3,convenient,inconv,nonprob,recommended,priority -usual,improper,foster,3,convenient,inconv,nonprob,priority,priority -usual,improper,foster,3,convenient,inconv,nonprob,not_recom,not_recom -usual,improper,foster,3,convenient,inconv,slightly_prob,recommended,priority -usual,improper,foster,3,convenient,inconv,slightly_prob,priority,priority -usual,improper,foster,3,convenient,inconv,slightly_prob,not_recom,not_recom -usual,improper,foster,3,convenient,inconv,problematic,recommended,priority -usual,improper,foster,3,convenient,inconv,problematic,priority,priority -usual,improper,foster,3,convenient,inconv,problematic,not_recom,not_recom -usual,improper,foster,3,less_conv,convenient,nonprob,recommended,priority -usual,improper,foster,3,less_conv,convenient,nonprob,priority,priority -usual,improper,foster,3,less_conv,convenient,nonprob,not_recom,not_recom -usual,improper,foster,3,less_conv,convenient,slightly_prob,recommended,priority -usual,improper,foster,3,less_conv,convenient,slightly_prob,priority,priority -usual,improper,foster,3,less_conv,convenient,slightly_prob,not_recom,not_recom 
-usual,improper,foster,3,less_conv,convenient,problematic,recommended,priority -usual,improper,foster,3,less_conv,convenient,problematic,priority,priority -usual,improper,foster,3,less_conv,convenient,problematic,not_recom,not_recom -usual,improper,foster,3,less_conv,inconv,nonprob,recommended,priority -usual,improper,foster,3,less_conv,inconv,nonprob,priority,priority -usual,improper,foster,3,less_conv,inconv,nonprob,not_recom,not_recom -usual,improper,foster,3,less_conv,inconv,slightly_prob,recommended,priority -usual,improper,foster,3,less_conv,inconv,slightly_prob,priority,priority -usual,improper,foster,3,less_conv,inconv,slightly_prob,not_recom,not_recom -usual,improper,foster,3,less_conv,inconv,problematic,recommended,priority -usual,improper,foster,3,less_conv,inconv,problematic,priority,priority -usual,improper,foster,3,less_conv,inconv,problematic,not_recom,not_recom -usual,improper,foster,3,critical,convenient,nonprob,recommended,priority -usual,improper,foster,3,critical,convenient,nonprob,priority,priority -usual,improper,foster,3,critical,convenient,nonprob,not_recom,not_recom -usual,improper,foster,3,critical,convenient,slightly_prob,recommended,priority -usual,improper,foster,3,critical,convenient,slightly_prob,priority,priority -usual,improper,foster,3,critical,convenient,slightly_prob,not_recom,not_recom -usual,improper,foster,3,critical,convenient,problematic,recommended,priority -usual,improper,foster,3,critical,convenient,problematic,priority,priority -usual,improper,foster,3,critical,convenient,problematic,not_recom,not_recom -usual,improper,foster,3,critical,inconv,nonprob,recommended,priority -usual,improper,foster,3,critical,inconv,nonprob,priority,priority -usual,improper,foster,3,critical,inconv,nonprob,not_recom,not_recom -usual,improper,foster,3,critical,inconv,slightly_prob,recommended,priority -usual,improper,foster,3,critical,inconv,slightly_prob,priority,priority 
-usual,improper,foster,3,critical,inconv,slightly_prob,not_recom,not_recom -usual,improper,foster,3,critical,inconv,problematic,recommended,priority -usual,improper,foster,3,critical,inconv,problematic,priority,priority -usual,improper,foster,3,critical,inconv,problematic,not_recom,not_recom -usual,improper,foster,more,convenient,convenient,nonprob,recommended,very_recom -usual,improper,foster,more,convenient,convenient,nonprob,priority,priority -usual,improper,foster,more,convenient,convenient,nonprob,not_recom,not_recom -usual,improper,foster,more,convenient,convenient,slightly_prob,recommended,very_recom -usual,improper,foster,more,convenient,convenient,slightly_prob,priority,priority -usual,improper,foster,more,convenient,convenient,slightly_prob,not_recom,not_recom -usual,improper,foster,more,convenient,convenient,problematic,recommended,priority -usual,improper,foster,more,convenient,convenient,problematic,priority,priority -usual,improper,foster,more,convenient,convenient,problematic,not_recom,not_recom -usual,improper,foster,more,convenient,inconv,nonprob,recommended,priority -usual,improper,foster,more,convenient,inconv,nonprob,priority,priority -usual,improper,foster,more,convenient,inconv,nonprob,not_recom,not_recom -usual,improper,foster,more,convenient,inconv,slightly_prob,recommended,priority -usual,improper,foster,more,convenient,inconv,slightly_prob,priority,priority -usual,improper,foster,more,convenient,inconv,slightly_prob,not_recom,not_recom -usual,improper,foster,more,convenient,inconv,problematic,recommended,priority -usual,improper,foster,more,convenient,inconv,problematic,priority,priority -usual,improper,foster,more,convenient,inconv,problematic,not_recom,not_recom -usual,improper,foster,more,less_conv,convenient,nonprob,recommended,priority -usual,improper,foster,more,less_conv,convenient,nonprob,priority,priority -usual,improper,foster,more,less_conv,convenient,nonprob,not_recom,not_recom 
-usual,improper,foster,more,less_conv,convenient,slightly_prob,recommended,priority -usual,improper,foster,more,less_conv,convenient,slightly_prob,priority,priority -usual,improper,foster,more,less_conv,convenient,slightly_prob,not_recom,not_recom -usual,improper,foster,more,less_conv,convenient,problematic,recommended,priority -usual,improper,foster,more,less_conv,convenient,problematic,priority,priority -usual,improper,foster,more,less_conv,convenient,problematic,not_recom,not_recom -usual,improper,foster,more,less_conv,inconv,nonprob,recommended,priority -usual,improper,foster,more,less_conv,inconv,nonprob,priority,priority -usual,improper,foster,more,less_conv,inconv,nonprob,not_recom,not_recom -usual,improper,foster,more,less_conv,inconv,slightly_prob,recommended,priority -usual,improper,foster,more,less_conv,inconv,slightly_prob,priority,priority -usual,improper,foster,more,less_conv,inconv,slightly_prob,not_recom,not_recom -usual,improper,foster,more,less_conv,inconv,problematic,recommended,priority -usual,improper,foster,more,less_conv,inconv,problematic,priority,priority -usual,improper,foster,more,less_conv,inconv,problematic,not_recom,not_recom -usual,improper,foster,more,critical,convenient,nonprob,recommended,priority -usual,improper,foster,more,critical,convenient,nonprob,priority,priority -usual,improper,foster,more,critical,convenient,nonprob,not_recom,not_recom -usual,improper,foster,more,critical,convenient,slightly_prob,recommended,priority -usual,improper,foster,more,critical,convenient,slightly_prob,priority,priority -usual,improper,foster,more,critical,convenient,slightly_prob,not_recom,not_recom -usual,improper,foster,more,critical,convenient,problematic,recommended,priority -usual,improper,foster,more,critical,convenient,problematic,priority,priority -usual,improper,foster,more,critical,convenient,problematic,not_recom,not_recom -usual,improper,foster,more,critical,inconv,nonprob,recommended,priority 
-usual,improper,foster,more,critical,inconv,nonprob,priority,priority -usual,improper,foster,more,critical,inconv,nonprob,not_recom,not_recom -usual,improper,foster,more,critical,inconv,slightly_prob,recommended,priority -usual,improper,foster,more,critical,inconv,slightly_prob,priority,priority -usual,improper,foster,more,critical,inconv,slightly_prob,not_recom,not_recom -usual,improper,foster,more,critical,inconv,problematic,recommended,priority -usual,improper,foster,more,critical,inconv,problematic,priority,priority -usual,improper,foster,more,critical,inconv,problematic,not_recom,not_recom -usual,critical,complete,1,convenient,convenient,nonprob,recommended,priority -usual,critical,complete,1,convenient,convenient,nonprob,priority,priority -usual,critical,complete,1,convenient,convenient,nonprob,not_recom,not_recom -usual,critical,complete,1,convenient,convenient,slightly_prob,recommended,priority -usual,critical,complete,1,convenient,convenient,slightly_prob,priority,priority -usual,critical,complete,1,convenient,convenient,slightly_prob,not_recom,not_recom -usual,critical,complete,1,convenient,convenient,problematic,recommended,priority -usual,critical,complete,1,convenient,convenient,problematic,priority,priority -usual,critical,complete,1,convenient,convenient,problematic,not_recom,not_recom -usual,critical,complete,1,convenient,inconv,nonprob,recommended,priority -usual,critical,complete,1,convenient,inconv,nonprob,priority,priority -usual,critical,complete,1,convenient,inconv,nonprob,not_recom,not_recom -usual,critical,complete,1,convenient,inconv,slightly_prob,recommended,priority -usual,critical,complete,1,convenient,inconv,slightly_prob,priority,priority -usual,critical,complete,1,convenient,inconv,slightly_prob,not_recom,not_recom -usual,critical,complete,1,convenient,inconv,problematic,recommended,priority -usual,critical,complete,1,convenient,inconv,problematic,priority,priority 
-usual,critical,complete,1,convenient,inconv,problematic,not_recom,not_recom -usual,critical,complete,1,less_conv,convenient,nonprob,recommended,priority -usual,critical,complete,1,less_conv,convenient,nonprob,priority,priority -usual,critical,complete,1,less_conv,convenient,nonprob,not_recom,not_recom -usual,critical,complete,1,less_conv,convenient,slightly_prob,recommended,priority -usual,critical,complete,1,less_conv,convenient,slightly_prob,priority,priority -usual,critical,complete,1,less_conv,convenient,slightly_prob,not_recom,not_recom -usual,critical,complete,1,less_conv,convenient,problematic,recommended,priority -usual,critical,complete,1,less_conv,convenient,problematic,priority,priority -usual,critical,complete,1,less_conv,convenient,problematic,not_recom,not_recom -usual,critical,complete,1,less_conv,inconv,nonprob,recommended,priority -usual,critical,complete,1,less_conv,inconv,nonprob,priority,priority -usual,critical,complete,1,less_conv,inconv,nonprob,not_recom,not_recom -usual,critical,complete,1,less_conv,inconv,slightly_prob,recommended,priority -usual,critical,complete,1,less_conv,inconv,slightly_prob,priority,priority -usual,critical,complete,1,less_conv,inconv,slightly_prob,not_recom,not_recom -usual,critical,complete,1,less_conv,inconv,problematic,recommended,priority -usual,critical,complete,1,less_conv,inconv,problematic,priority,priority -usual,critical,complete,1,less_conv,inconv,problematic,not_recom,not_recom -usual,critical,complete,1,critical,convenient,nonprob,recommended,priority -usual,critical,complete,1,critical,convenient,nonprob,priority,priority -usual,critical,complete,1,critical,convenient,nonprob,not_recom,not_recom -usual,critical,complete,1,critical,convenient,slightly_prob,recommended,priority -usual,critical,complete,1,critical,convenient,slightly_prob,priority,priority -usual,critical,complete,1,critical,convenient,slightly_prob,not_recom,not_recom 
-usual,critical,complete,1,critical,convenient,problematic,recommended,priority -usual,critical,complete,1,critical,convenient,problematic,priority,priority -usual,critical,complete,1,critical,convenient,problematic,not_recom,not_recom -usual,critical,complete,1,critical,inconv,nonprob,recommended,priority -usual,critical,complete,1,critical,inconv,nonprob,priority,priority -usual,critical,complete,1,critical,inconv,nonprob,not_recom,not_recom -usual,critical,complete,1,critical,inconv,slightly_prob,recommended,priority -usual,critical,complete,1,critical,inconv,slightly_prob,priority,priority -usual,critical,complete,1,critical,inconv,slightly_prob,not_recom,not_recom -usual,critical,complete,1,critical,inconv,problematic,recommended,priority -usual,critical,complete,1,critical,inconv,problematic,priority,priority -usual,critical,complete,1,critical,inconv,problematic,not_recom,not_recom -usual,critical,complete,2,convenient,convenient,nonprob,recommended,priority -usual,critical,complete,2,convenient,convenient,nonprob,priority,priority -usual,critical,complete,2,convenient,convenient,nonprob,not_recom,not_recom -usual,critical,complete,2,convenient,convenient,slightly_prob,recommended,priority -usual,critical,complete,2,convenient,convenient,slightly_prob,priority,priority -usual,critical,complete,2,convenient,convenient,slightly_prob,not_recom,not_recom -usual,critical,complete,2,convenient,convenient,problematic,recommended,priority -usual,critical,complete,2,convenient,convenient,problematic,priority,priority -usual,critical,complete,2,convenient,convenient,problematic,not_recom,not_recom -usual,critical,complete,2,convenient,inconv,nonprob,recommended,priority -usual,critical,complete,2,convenient,inconv,nonprob,priority,priority -usual,critical,complete,2,convenient,inconv,nonprob,not_recom,not_recom -usual,critical,complete,2,convenient,inconv,slightly_prob,recommended,priority -usual,critical,complete,2,convenient,inconv,slightly_prob,priority,priority 
-usual,critical,complete,2,convenient,inconv,slightly_prob,not_recom,not_recom -usual,critical,complete,2,convenient,inconv,problematic,recommended,priority -usual,critical,complete,2,convenient,inconv,problematic,priority,priority -usual,critical,complete,2,convenient,inconv,problematic,not_recom,not_recom -usual,critical,complete,2,less_conv,convenient,nonprob,recommended,priority -usual,critical,complete,2,less_conv,convenient,nonprob,priority,priority -usual,critical,complete,2,less_conv,convenient,nonprob,not_recom,not_recom -usual,critical,complete,2,less_conv,convenient,slightly_prob,recommended,priority -usual,critical,complete,2,less_conv,convenient,slightly_prob,priority,priority -usual,critical,complete,2,less_conv,convenient,slightly_prob,not_recom,not_recom -usual,critical,complete,2,less_conv,convenient,problematic,recommended,priority -usual,critical,complete,2,less_conv,convenient,problematic,priority,priority -usual,critical,complete,2,less_conv,convenient,problematic,not_recom,not_recom -usual,critical,complete,2,less_conv,inconv,nonprob,recommended,priority -usual,critical,complete,2,less_conv,inconv,nonprob,priority,priority -usual,critical,complete,2,less_conv,inconv,nonprob,not_recom,not_recom -usual,critical,complete,2,less_conv,inconv,slightly_prob,recommended,priority -usual,critical,complete,2,less_conv,inconv,slightly_prob,priority,priority -usual,critical,complete,2,less_conv,inconv,slightly_prob,not_recom,not_recom -usual,critical,complete,2,less_conv,inconv,problematic,recommended,priority -usual,critical,complete,2,less_conv,inconv,problematic,priority,priority -usual,critical,complete,2,less_conv,inconv,problematic,not_recom,not_recom -usual,critical,complete,2,critical,convenient,nonprob,recommended,priority -usual,critical,complete,2,critical,convenient,nonprob,priority,spec_prior -usual,critical,complete,2,critical,convenient,nonprob,not_recom,not_recom 
-usual,critical,complete,2,critical,convenient,slightly_prob,recommended,priority -usual,critical,complete,2,critical,convenient,slightly_prob,priority,spec_prior -usual,critical,complete,2,critical,convenient,slightly_prob,not_recom,not_recom -usual,critical,complete,2,critical,convenient,problematic,recommended,spec_prior -usual,critical,complete,2,critical,convenient,problematic,priority,spec_prior -usual,critical,complete,2,critical,convenient,problematic,not_recom,not_recom -usual,critical,complete,2,critical,inconv,nonprob,recommended,priority -usual,critical,complete,2,critical,inconv,nonprob,priority,spec_prior -usual,critical,complete,2,critical,inconv,nonprob,not_recom,not_recom -usual,critical,complete,2,critical,inconv,slightly_prob,recommended,priority -usual,critical,complete,2,critical,inconv,slightly_prob,priority,spec_prior -usual,critical,complete,2,critical,inconv,slightly_prob,not_recom,not_recom -usual,critical,complete,2,critical,inconv,problematic,recommended,spec_prior -usual,critical,complete,2,critical,inconv,problematic,priority,spec_prior -usual,critical,complete,2,critical,inconv,problematic,not_recom,not_recom -usual,critical,complete,3,convenient,convenient,nonprob,recommended,priority -usual,critical,complete,3,convenient,convenient,nonprob,priority,priority -usual,critical,complete,3,convenient,convenient,nonprob,not_recom,not_recom -usual,critical,complete,3,convenient,convenient,slightly_prob,recommended,priority -usual,critical,complete,3,convenient,convenient,slightly_prob,priority,priority -usual,critical,complete,3,convenient,convenient,slightly_prob,not_recom,not_recom -usual,critical,complete,3,convenient,convenient,problematic,recommended,priority -usual,critical,complete,3,convenient,convenient,problematic,priority,priority -usual,critical,complete,3,convenient,convenient,problematic,not_recom,not_recom -usual,critical,complete,3,convenient,inconv,nonprob,recommended,priority 
-usual,critical,complete,3,convenient,inconv,nonprob,priority,spec_prior -usual,critical,complete,3,convenient,inconv,nonprob,not_recom,not_recom -usual,critical,complete,3,convenient,inconv,slightly_prob,recommended,priority -usual,critical,complete,3,convenient,inconv,slightly_prob,priority,spec_prior -usual,critical,complete,3,convenient,inconv,slightly_prob,not_recom,not_recom -usual,critical,complete,3,convenient,inconv,problematic,recommended,spec_prior -usual,critical,complete,3,convenient,inconv,problematic,priority,spec_prior -usual,critical,complete,3,convenient,inconv,problematic,not_recom,not_recom -usual,critical,complete,3,less_conv,convenient,nonprob,recommended,priority -usual,critical,complete,3,less_conv,convenient,nonprob,priority,spec_prior -usual,critical,complete,3,less_conv,convenient,nonprob,not_recom,not_recom -usual,critical,complete,3,less_conv,convenient,slightly_prob,recommended,priority -usual,critical,complete,3,less_conv,convenient,slightly_prob,priority,spec_prior -usual,critical,complete,3,less_conv,convenient,slightly_prob,not_recom,not_recom -usual,critical,complete,3,less_conv,convenient,problematic,recommended,spec_prior -usual,critical,complete,3,less_conv,convenient,problematic,priority,spec_prior -usual,critical,complete,3,less_conv,convenient,problematic,not_recom,not_recom -usual,critical,complete,3,less_conv,inconv,nonprob,recommended,priority -usual,critical,complete,3,less_conv,inconv,nonprob,priority,spec_prior -usual,critical,complete,3,less_conv,inconv,nonprob,not_recom,not_recom -usual,critical,complete,3,less_conv,inconv,slightly_prob,recommended,priority -usual,critical,complete,3,less_conv,inconv,slightly_prob,priority,spec_prior -usual,critical,complete,3,less_conv,inconv,slightly_prob,not_recom,not_recom -usual,critical,complete,3,less_conv,inconv,problematic,recommended,spec_prior -usual,critical,complete,3,less_conv,inconv,problematic,priority,spec_prior 
-usual,critical,complete,3,less_conv,inconv,problematic,not_recom,not_recom -usual,critical,complete,3,critical,convenient,nonprob,recommended,priority -usual,critical,complete,3,critical,convenient,nonprob,priority,spec_prior -usual,critical,complete,3,critical,convenient,nonprob,not_recom,not_recom -usual,critical,complete,3,critical,convenient,slightly_prob,recommended,priority -usual,critical,complete,3,critical,convenient,slightly_prob,priority,spec_prior -usual,critical,complete,3,critical,convenient,slightly_prob,not_recom,not_recom -usual,critical,complete,3,critical,convenient,problematic,recommended,spec_prior -usual,critical,complete,3,critical,convenient,problematic,priority,spec_prior -usual,critical,complete,3,critical,convenient,problematic,not_recom,not_recom -usual,critical,complete,3,critical,inconv,nonprob,recommended,priority -usual,critical,complete,3,critical,inconv,nonprob,priority,spec_prior -usual,critical,complete,3,critical,inconv,nonprob,not_recom,not_recom -usual,critical,complete,3,critical,inconv,slightly_prob,recommended,priority -usual,critical,complete,3,critical,inconv,slightly_prob,priority,spec_prior -usual,critical,complete,3,critical,inconv,slightly_prob,not_recom,not_recom -usual,critical,complete,3,critical,inconv,problematic,recommended,spec_prior -usual,critical,complete,3,critical,inconv,problematic,priority,spec_prior -usual,critical,complete,3,critical,inconv,problematic,not_recom,not_recom -usual,critical,complete,more,convenient,convenient,nonprob,recommended,priority -usual,critical,complete,more,convenient,convenient,nonprob,priority,priority -usual,critical,complete,more,convenient,convenient,nonprob,not_recom,not_recom -usual,critical,complete,more,convenient,convenient,slightly_prob,recommended,priority -usual,critical,complete,more,convenient,convenient,slightly_prob,priority,priority -usual,critical,complete,more,convenient,convenient,slightly_prob,not_recom,not_recom 
-usual,critical,complete,more,convenient,convenient,problematic,recommended,priority -usual,critical,complete,more,convenient,convenient,problematic,priority,priority -usual,critical,complete,more,convenient,convenient,problematic,not_recom,not_recom -usual,critical,complete,more,convenient,inconv,nonprob,recommended,priority -usual,critical,complete,more,convenient,inconv,nonprob,priority,spec_prior -usual,critical,complete,more,convenient,inconv,nonprob,not_recom,not_recom -usual,critical,complete,more,convenient,inconv,slightly_prob,recommended,priority -usual,critical,complete,more,convenient,inconv,slightly_prob,priority,spec_prior -usual,critical,complete,more,convenient,inconv,slightly_prob,not_recom,not_recom -usual,critical,complete,more,convenient,inconv,problematic,recommended,spec_prior -usual,critical,complete,more,convenient,inconv,problematic,priority,spec_prior -usual,critical,complete,more,convenient,inconv,problematic,not_recom,not_recom -usual,critical,complete,more,less_conv,convenient,nonprob,recommended,priority -usual,critical,complete,more,less_conv,convenient,nonprob,priority,spec_prior -usual,critical,complete,more,less_conv,convenient,nonprob,not_recom,not_recom -usual,critical,complete,more,less_conv,convenient,slightly_prob,recommended,priority -usual,critical,complete,more,less_conv,convenient,slightly_prob,priority,spec_prior -usual,critical,complete,more,less_conv,convenient,slightly_prob,not_recom,not_recom -usual,critical,complete,more,less_conv,convenient,problematic,recommended,spec_prior -usual,critical,complete,more,less_conv,convenient,problematic,priority,spec_prior -usual,critical,complete,more,less_conv,convenient,problematic,not_recom,not_recom -usual,critical,complete,more,less_conv,inconv,nonprob,recommended,priority -usual,critical,complete,more,less_conv,inconv,nonprob,priority,spec_prior -usual,critical,complete,more,less_conv,inconv,nonprob,not_recom,not_recom 
-usual,critical,complete,more,less_conv,inconv,slightly_prob,recommended,priority -usual,critical,complete,more,less_conv,inconv,slightly_prob,priority,spec_prior -usual,critical,complete,more,less_conv,inconv,slightly_prob,not_recom,not_recom -usual,critical,complete,more,less_conv,inconv,problematic,recommended,spec_prior -usual,critical,complete,more,less_conv,inconv,problematic,priority,spec_prior -usual,critical,complete,more,less_conv,inconv,problematic,not_recom,not_recom -usual,critical,complete,more,critical,convenient,nonprob,recommended,priority -usual,critical,complete,more,critical,convenient,nonprob,priority,spec_prior -usual,critical,complete,more,critical,convenient,nonprob,not_recom,not_recom -usual,critical,complete,more,critical,convenient,slightly_prob,recommended,priority -usual,critical,complete,more,critical,convenient,slightly_prob,priority,spec_prior -usual,critical,complete,more,critical,convenient,slightly_prob,not_recom,not_recom -usual,critical,complete,more,critical,convenient,problematic,recommended,spec_prior -usual,critical,complete,more,critical,convenient,problematic,priority,spec_prior -usual,critical,complete,more,critical,convenient,problematic,not_recom,not_recom -usual,critical,complete,more,critical,inconv,nonprob,recommended,priority -usual,critical,complete,more,critical,inconv,nonprob,priority,spec_prior -usual,critical,complete,more,critical,inconv,nonprob,not_recom,not_recom -usual,critical,complete,more,critical,inconv,slightly_prob,recommended,priority -usual,critical,complete,more,critical,inconv,slightly_prob,priority,spec_prior -usual,critical,complete,more,critical,inconv,slightly_prob,not_recom,not_recom -usual,critical,complete,more,critical,inconv,problematic,recommended,spec_prior -usual,critical,complete,more,critical,inconv,problematic,priority,spec_prior -usual,critical,complete,more,critical,inconv,problematic,not_recom,not_recom 
-usual,critical,completed,1,convenient,convenient,nonprob,recommended,priority -usual,critical,completed,1,convenient,convenient,nonprob,priority,priority -usual,critical,completed,1,convenient,convenient,nonprob,not_recom,not_recom -usual,critical,completed,1,convenient,convenient,slightly_prob,recommended,priority -usual,critical,completed,1,convenient,convenient,slightly_prob,priority,priority -usual,critical,completed,1,convenient,convenient,slightly_prob,not_recom,not_recom -usual,critical,completed,1,convenient,convenient,problematic,recommended,priority -usual,critical,completed,1,convenient,convenient,problematic,priority,priority -usual,critical,completed,1,convenient,convenient,problematic,not_recom,not_recom -usual,critical,completed,1,convenient,inconv,nonprob,recommended,priority -usual,critical,completed,1,convenient,inconv,nonprob,priority,priority -usual,critical,completed,1,convenient,inconv,nonprob,not_recom,not_recom -usual,critical,completed,1,convenient,inconv,slightly_prob,recommended,priority -usual,critical,completed,1,convenient,inconv,slightly_prob,priority,priority -usual,critical,completed,1,convenient,inconv,slightly_prob,not_recom,not_recom -usual,critical,completed,1,convenient,inconv,problematic,recommended,priority -usual,critical,completed,1,convenient,inconv,problematic,priority,priority -usual,critical,completed,1,convenient,inconv,problematic,not_recom,not_recom -usual,critical,completed,1,less_conv,convenient,nonprob,recommended,priority -usual,critical,completed,1,less_conv,convenient,nonprob,priority,priority -usual,critical,completed,1,less_conv,convenient,nonprob,not_recom,not_recom -usual,critical,completed,1,less_conv,convenient,slightly_prob,recommended,priority -usual,critical,completed,1,less_conv,convenient,slightly_prob,priority,priority -usual,critical,completed,1,less_conv,convenient,slightly_prob,not_recom,not_recom -usual,critical,completed,1,less_conv,convenient,problematic,recommended,priority 
-usual,critical,completed,1,less_conv,convenient,problematic,priority,priority -usual,critical,completed,1,less_conv,convenient,problematic,not_recom,not_recom -usual,critical,completed,1,less_conv,inconv,nonprob,recommended,priority -usual,critical,completed,1,less_conv,inconv,nonprob,priority,priority -usual,critical,completed,1,less_conv,inconv,nonprob,not_recom,not_recom -usual,critical,completed,1,less_conv,inconv,slightly_prob,recommended,priority -usual,critical,completed,1,less_conv,inconv,slightly_prob,priority,priority -usual,critical,completed,1,less_conv,inconv,slightly_prob,not_recom,not_recom -usual,critical,completed,1,less_conv,inconv,problematic,recommended,priority -usual,critical,completed,1,less_conv,inconv,problematic,priority,priority -usual,critical,completed,1,less_conv,inconv,problematic,not_recom,not_recom -usual,critical,completed,1,critical,convenient,nonprob,recommended,priority -usual,critical,completed,1,critical,convenient,nonprob,priority,spec_prior -usual,critical,completed,1,critical,convenient,nonprob,not_recom,not_recom -usual,critical,completed,1,critical,convenient,slightly_prob,recommended,priority -usual,critical,completed,1,critical,convenient,slightly_prob,priority,spec_prior -usual,critical,completed,1,critical,convenient,slightly_prob,not_recom,not_recom -usual,critical,completed,1,critical,convenient,problematic,recommended,spec_prior -usual,critical,completed,1,critical,convenient,problematic,priority,spec_prior -usual,critical,completed,1,critical,convenient,problematic,not_recom,not_recom -usual,critical,completed,1,critical,inconv,nonprob,recommended,priority -usual,critical,completed,1,critical,inconv,nonprob,priority,spec_prior -usual,critical,completed,1,critical,inconv,nonprob,not_recom,not_recom -usual,critical,completed,1,critical,inconv,slightly_prob,recommended,priority -usual,critical,completed,1,critical,inconv,slightly_prob,priority,spec_prior 
-usual,critical,completed,1,critical,inconv,slightly_prob,not_recom,not_recom -usual,critical,completed,1,critical,inconv,problematic,recommended,spec_prior -usual,critical,completed,1,critical,inconv,problematic,priority,spec_prior -usual,critical,completed,1,critical,inconv,problematic,not_recom,not_recom -usual,critical,completed,2,convenient,convenient,nonprob,recommended,priority -usual,critical,completed,2,convenient,convenient,nonprob,priority,priority -usual,critical,completed,2,convenient,convenient,nonprob,not_recom,not_recom -usual,critical,completed,2,convenient,convenient,slightly_prob,recommended,priority -usual,critical,completed,2,convenient,convenient,slightly_prob,priority,priority -usual,critical,completed,2,convenient,convenient,slightly_prob,not_recom,not_recom -usual,critical,completed,2,convenient,convenient,problematic,recommended,priority -usual,critical,completed,2,convenient,convenient,problematic,priority,priority -usual,critical,completed,2,convenient,convenient,problematic,not_recom,not_recom -usual,critical,completed,2,convenient,inconv,nonprob,recommended,priority -usual,critical,completed,2,convenient,inconv,nonprob,priority,priority -usual,critical,completed,2,convenient,inconv,nonprob,not_recom,not_recom -usual,critical,completed,2,convenient,inconv,slightly_prob,recommended,priority -usual,critical,completed,2,convenient,inconv,slightly_prob,priority,priority -usual,critical,completed,2,convenient,inconv,slightly_prob,not_recom,not_recom -usual,critical,completed,2,convenient,inconv,problematic,recommended,priority -usual,critical,completed,2,convenient,inconv,problematic,priority,priority -usual,critical,completed,2,convenient,inconv,problematic,not_recom,not_recom -usual,critical,completed,2,less_conv,convenient,nonprob,recommended,priority -usual,critical,completed,2,less_conv,convenient,nonprob,priority,priority -usual,critical,completed,2,less_conv,convenient,nonprob,not_recom,not_recom 
-usual,critical,completed,2,less_conv,convenient,slightly_prob,recommended,priority -usual,critical,completed,2,less_conv,convenient,slightly_prob,priority,priority -usual,critical,completed,2,less_conv,convenient,slightly_prob,not_recom,not_recom -usual,critical,completed,2,less_conv,convenient,problematic,recommended,priority -usual,critical,completed,2,less_conv,convenient,problematic,priority,priority -usual,critical,completed,2,less_conv,convenient,problematic,not_recom,not_recom -usual,critical,completed,2,less_conv,inconv,nonprob,recommended,priority -usual,critical,completed,2,less_conv,inconv,nonprob,priority,priority -usual,critical,completed,2,less_conv,inconv,nonprob,not_recom,not_recom -usual,critical,completed,2,less_conv,inconv,slightly_prob,recommended,priority -usual,critical,completed,2,less_conv,inconv,slightly_prob,priority,priority -usual,critical,completed,2,less_conv,inconv,slightly_prob,not_recom,not_recom -usual,critical,completed,2,less_conv,inconv,problematic,recommended,priority -usual,critical,completed,2,less_conv,inconv,problematic,priority,priority -usual,critical,completed,2,less_conv,inconv,problematic,not_recom,not_recom -usual,critical,completed,2,critical,convenient,nonprob,recommended,priority -usual,critical,completed,2,critical,convenient,nonprob,priority,spec_prior -usual,critical,completed,2,critical,convenient,nonprob,not_recom,not_recom -usual,critical,completed,2,critical,convenient,slightly_prob,recommended,priority -usual,critical,completed,2,critical,convenient,slightly_prob,priority,spec_prior -usual,critical,completed,2,critical,convenient,slightly_prob,not_recom,not_recom -usual,critical,completed,2,critical,convenient,problematic,recommended,spec_prior -usual,critical,completed,2,critical,convenient,problematic,priority,spec_prior -usual,critical,completed,2,critical,convenient,problematic,not_recom,not_recom -usual,critical,completed,2,critical,inconv,nonprob,recommended,priority 
-usual,critical,completed,2,critical,inconv,nonprob,priority,spec_prior -usual,critical,completed,2,critical,inconv,nonprob,not_recom,not_recom -usual,critical,completed,2,critical,inconv,slightly_prob,recommended,priority -usual,critical,completed,2,critical,inconv,slightly_prob,priority,spec_prior -usual,critical,completed,2,critical,inconv,slightly_prob,not_recom,not_recom -usual,critical,completed,2,critical,inconv,problematic,recommended,spec_prior -usual,critical,completed,2,critical,inconv,problematic,priority,spec_prior -usual,critical,completed,2,critical,inconv,problematic,not_recom,not_recom -usual,critical,completed,3,convenient,convenient,nonprob,recommended,priority -usual,critical,completed,3,convenient,convenient,nonprob,priority,priority -usual,critical,completed,3,convenient,convenient,nonprob,not_recom,not_recom -usual,critical,completed,3,convenient,convenient,slightly_prob,recommended,priority -usual,critical,completed,3,convenient,convenient,slightly_prob,priority,priority -usual,critical,completed,3,convenient,convenient,slightly_prob,not_recom,not_recom -usual,critical,completed,3,convenient,convenient,problematic,recommended,priority -usual,critical,completed,3,convenient,convenient,problematic,priority,priority -usual,critical,completed,3,convenient,convenient,problematic,not_recom,not_recom -usual,critical,completed,3,convenient,inconv,nonprob,recommended,priority -usual,critical,completed,3,convenient,inconv,nonprob,priority,spec_prior -usual,critical,completed,3,convenient,inconv,nonprob,not_recom,not_recom -usual,critical,completed,3,convenient,inconv,slightly_prob,recommended,priority -usual,critical,completed,3,convenient,inconv,slightly_prob,priority,spec_prior -usual,critical,completed,3,convenient,inconv,slightly_prob,not_recom,not_recom -usual,critical,completed,3,convenient,inconv,problematic,recommended,spec_prior -usual,critical,completed,3,convenient,inconv,problematic,priority,spec_prior 
-usual,critical,completed,3,convenient,inconv,problematic,not_recom,not_recom -usual,critical,completed,3,less_conv,convenient,nonprob,recommended,priority -usual,critical,completed,3,less_conv,convenient,nonprob,priority,spec_prior -usual,critical,completed,3,less_conv,convenient,nonprob,not_recom,not_recom -usual,critical,completed,3,less_conv,convenient,slightly_prob,recommended,priority -usual,critical,completed,3,less_conv,convenient,slightly_prob,priority,spec_prior -usual,critical,completed,3,less_conv,convenient,slightly_prob,not_recom,not_recom -usual,critical,completed,3,less_conv,convenient,problematic,recommended,spec_prior -usual,critical,completed,3,less_conv,convenient,problematic,priority,spec_prior -usual,critical,completed,3,less_conv,convenient,problematic,not_recom,not_recom -usual,critical,completed,3,less_conv,inconv,nonprob,recommended,priority -usual,critical,completed,3,less_conv,inconv,nonprob,priority,spec_prior -usual,critical,completed,3,less_conv,inconv,nonprob,not_recom,not_recom -usual,critical,completed,3,less_conv,inconv,slightly_prob,recommended,priority -usual,critical,completed,3,less_conv,inconv,slightly_prob,priority,spec_prior -usual,critical,completed,3,less_conv,inconv,slightly_prob,not_recom,not_recom -usual,critical,completed,3,less_conv,inconv,problematic,recommended,spec_prior -usual,critical,completed,3,less_conv,inconv,problematic,priority,spec_prior -usual,critical,completed,3,less_conv,inconv,problematic,not_recom,not_recom -usual,critical,completed,3,critical,convenient,nonprob,recommended,priority -usual,critical,completed,3,critical,convenient,nonprob,priority,spec_prior -usual,critical,completed,3,critical,convenient,nonprob,not_recom,not_recom -usual,critical,completed,3,critical,convenient,slightly_prob,recommended,priority -usual,critical,completed,3,critical,convenient,slightly_prob,priority,spec_prior -usual,critical,completed,3,critical,convenient,slightly_prob,not_recom,not_recom 
-usual,critical,completed,3,critical,convenient,problematic,recommended,spec_prior -usual,critical,completed,3,critical,convenient,problematic,priority,spec_prior -usual,critical,completed,3,critical,convenient,problematic,not_recom,not_recom -usual,critical,completed,3,critical,inconv,nonprob,recommended,priority -usual,critical,completed,3,critical,inconv,nonprob,priority,spec_prior -usual,critical,completed,3,critical,inconv,nonprob,not_recom,not_recom -usual,critical,completed,3,critical,inconv,slightly_prob,recommended,priority -usual,critical,completed,3,critical,inconv,slightly_prob,priority,spec_prior -usual,critical,completed,3,critical,inconv,slightly_prob,not_recom,not_recom -usual,critical,completed,3,critical,inconv,problematic,recommended,spec_prior -usual,critical,completed,3,critical,inconv,problematic,priority,spec_prior -usual,critical,completed,3,critical,inconv,problematic,not_recom,not_recom -usual,critical,completed,more,convenient,convenient,nonprob,recommended,priority -usual,critical,completed,more,convenient,convenient,nonprob,priority,priority -usual,critical,completed,more,convenient,convenient,nonprob,not_recom,not_recom -usual,critical,completed,more,convenient,convenient,slightly_prob,recommended,priority -usual,critical,completed,more,convenient,convenient,slightly_prob,priority,priority -usual,critical,completed,more,convenient,convenient,slightly_prob,not_recom,not_recom -usual,critical,completed,more,convenient,convenient,problematic,recommended,priority -usual,critical,completed,more,convenient,convenient,problematic,priority,priority -usual,critical,completed,more,convenient,convenient,problematic,not_recom,not_recom -usual,critical,completed,more,convenient,inconv,nonprob,recommended,priority -usual,critical,completed,more,convenient,inconv,nonprob,priority,spec_prior -usual,critical,completed,more,convenient,inconv,nonprob,not_recom,not_recom -usual,critical,completed,more,convenient,inconv,slightly_prob,recommended,priority 
-usual,critical,completed,more,convenient,inconv,slightly_prob,priority,spec_prior -usual,critical,completed,more,convenient,inconv,slightly_prob,not_recom,not_recom -usual,critical,completed,more,convenient,inconv,problematic,recommended,spec_prior -usual,critical,completed,more,convenient,inconv,problematic,priority,spec_prior -usual,critical,completed,more,convenient,inconv,problematic,not_recom,not_recom -usual,critical,completed,more,less_conv,convenient,nonprob,recommended,priority -usual,critical,completed,more,less_conv,convenient,nonprob,priority,spec_prior -usual,critical,completed,more,less_conv,convenient,nonprob,not_recom,not_recom -usual,critical,completed,more,less_conv,convenient,slightly_prob,recommended,priority -usual,critical,completed,more,less_conv,convenient,slightly_prob,priority,spec_prior -usual,critical,completed,more,less_conv,convenient,slightly_prob,not_recom,not_recom -usual,critical,completed,more,less_conv,convenient,problematic,recommended,spec_prior -usual,critical,completed,more,less_conv,convenient,problematic,priority,spec_prior -usual,critical,completed,more,less_conv,convenient,problematic,not_recom,not_recom -usual,critical,completed,more,less_conv,inconv,nonprob,recommended,priority -usual,critical,completed,more,less_conv,inconv,nonprob,priority,spec_prior -usual,critical,completed,more,less_conv,inconv,nonprob,not_recom,not_recom -usual,critical,completed,more,less_conv,inconv,slightly_prob,recommended,priority -usual,critical,completed,more,less_conv,inconv,slightly_prob,priority,spec_prior -usual,critical,completed,more,less_conv,inconv,slightly_prob,not_recom,not_recom -usual,critical,completed,more,less_conv,inconv,problematic,recommended,spec_prior -usual,critical,completed,more,less_conv,inconv,problematic,priority,spec_prior -usual,critical,completed,more,less_conv,inconv,problematic,not_recom,not_recom -usual,critical,completed,more,critical,convenient,nonprob,recommended,priority 
-usual,critical,completed,more,critical,convenient,nonprob,priority,spec_prior -usual,critical,completed,more,critical,convenient,nonprob,not_recom,not_recom -usual,critical,completed,more,critical,convenient,slightly_prob,recommended,priority -usual,critical,completed,more,critical,convenient,slightly_prob,priority,spec_prior -usual,critical,completed,more,critical,convenient,slightly_prob,not_recom,not_recom -usual,critical,completed,more,critical,convenient,problematic,recommended,spec_prior -usual,critical,completed,more,critical,convenient,problematic,priority,spec_prior -usual,critical,completed,more,critical,convenient,problematic,not_recom,not_recom -usual,critical,completed,more,critical,inconv,nonprob,recommended,priority -usual,critical,completed,more,critical,inconv,nonprob,priority,spec_prior -usual,critical,completed,more,critical,inconv,nonprob,not_recom,not_recom -usual,critical,completed,more,critical,inconv,slightly_prob,recommended,priority -usual,critical,completed,more,critical,inconv,slightly_prob,priority,spec_prior -usual,critical,completed,more,critical,inconv,slightly_prob,not_recom,not_recom -usual,critical,completed,more,critical,inconv,problematic,recommended,spec_prior -usual,critical,completed,more,critical,inconv,problematic,priority,spec_prior -usual,critical,completed,more,critical,inconv,problematic,not_recom,not_recom -usual,critical,incomplete,1,convenient,convenient,nonprob,recommended,priority -usual,critical,incomplete,1,convenient,convenient,nonprob,priority,priority -usual,critical,incomplete,1,convenient,convenient,nonprob,not_recom,not_recom -usual,critical,incomplete,1,convenient,convenient,slightly_prob,recommended,priority -usual,critical,incomplete,1,convenient,convenient,slightly_prob,priority,priority -usual,critical,incomplete,1,convenient,convenient,slightly_prob,not_recom,not_recom -usual,critical,incomplete,1,convenient,convenient,problematic,recommended,priority 
-usual,critical,incomplete,1,convenient,convenient,problematic,priority,priority -usual,critical,incomplete,1,convenient,convenient,problematic,not_recom,not_recom -usual,critical,incomplete,1,convenient,inconv,nonprob,recommended,priority -usual,critical,incomplete,1,convenient,inconv,nonprob,priority,priority -usual,critical,incomplete,1,convenient,inconv,nonprob,not_recom,not_recom -usual,critical,incomplete,1,convenient,inconv,slightly_prob,recommended,priority -usual,critical,incomplete,1,convenient,inconv,slightly_prob,priority,priority -usual,critical,incomplete,1,convenient,inconv,slightly_prob,not_recom,not_recom -usual,critical,incomplete,1,convenient,inconv,problematic,recommended,priority -usual,critical,incomplete,1,convenient,inconv,problematic,priority,priority -usual,critical,incomplete,1,convenient,inconv,problematic,not_recom,not_recom -usual,critical,incomplete,1,less_conv,convenient,nonprob,recommended,priority -usual,critical,incomplete,1,less_conv,convenient,nonprob,priority,priority -usual,critical,incomplete,1,less_conv,convenient,nonprob,not_recom,not_recom -usual,critical,incomplete,1,less_conv,convenient,slightly_prob,recommended,priority -usual,critical,incomplete,1,less_conv,convenient,slightly_prob,priority,priority -usual,critical,incomplete,1,less_conv,convenient,slightly_prob,not_recom,not_recom -usual,critical,incomplete,1,less_conv,convenient,problematic,recommended,priority -usual,critical,incomplete,1,less_conv,convenient,problematic,priority,priority -usual,critical,incomplete,1,less_conv,convenient,problematic,not_recom,not_recom -usual,critical,incomplete,1,less_conv,inconv,nonprob,recommended,priority -usual,critical,incomplete,1,less_conv,inconv,nonprob,priority,priority -usual,critical,incomplete,1,less_conv,inconv,nonprob,not_recom,not_recom -usual,critical,incomplete,1,less_conv,inconv,slightly_prob,recommended,priority -usual,critical,incomplete,1,less_conv,inconv,slightly_prob,priority,priority 
-usual,critical,incomplete,1,less_conv,inconv,slightly_prob,not_recom,not_recom -usual,critical,incomplete,1,less_conv,inconv,problematic,recommended,priority -usual,critical,incomplete,1,less_conv,inconv,problematic,priority,priority -usual,critical,incomplete,1,less_conv,inconv,problematic,not_recom,not_recom -usual,critical,incomplete,1,critical,convenient,nonprob,recommended,priority -usual,critical,incomplete,1,critical,convenient,nonprob,priority,spec_prior -usual,critical,incomplete,1,critical,convenient,nonprob,not_recom,not_recom -usual,critical,incomplete,1,critical,convenient,slightly_prob,recommended,priority -usual,critical,incomplete,1,critical,convenient,slightly_prob,priority,spec_prior -usual,critical,incomplete,1,critical,convenient,slightly_prob,not_recom,not_recom -usual,critical,incomplete,1,critical,convenient,problematic,recommended,spec_prior -usual,critical,incomplete,1,critical,convenient,problematic,priority,spec_prior -usual,critical,incomplete,1,critical,convenient,problematic,not_recom,not_recom -usual,critical,incomplete,1,critical,inconv,nonprob,recommended,priority -usual,critical,incomplete,1,critical,inconv,nonprob,priority,spec_prior -usual,critical,incomplete,1,critical,inconv,nonprob,not_recom,not_recom -usual,critical,incomplete,1,critical,inconv,slightly_prob,recommended,priority -usual,critical,incomplete,1,critical,inconv,slightly_prob,priority,spec_prior -usual,critical,incomplete,1,critical,inconv,slightly_prob,not_recom,not_recom -usual,critical,incomplete,1,critical,inconv,problematic,recommended,spec_prior -usual,critical,incomplete,1,critical,inconv,problematic,priority,spec_prior -usual,critical,incomplete,1,critical,inconv,problematic,not_recom,not_recom -usual,critical,incomplete,2,convenient,convenient,nonprob,recommended,priority -usual,critical,incomplete,2,convenient,convenient,nonprob,priority,priority -usual,critical,incomplete,2,convenient,convenient,nonprob,not_recom,not_recom 
-usual,critical,incomplete,2,convenient,convenient,slightly_prob,recommended,priority -usual,critical,incomplete,2,convenient,convenient,slightly_prob,priority,priority -usual,critical,incomplete,2,convenient,convenient,slightly_prob,not_recom,not_recom -usual,critical,incomplete,2,convenient,convenient,problematic,recommended,priority -usual,critical,incomplete,2,convenient,convenient,problematic,priority,priority -usual,critical,incomplete,2,convenient,convenient,problematic,not_recom,not_recom -usual,critical,incomplete,2,convenient,inconv,nonprob,recommended,priority -usual,critical,incomplete,2,convenient,inconv,nonprob,priority,spec_prior -usual,critical,incomplete,2,convenient,inconv,nonprob,not_recom,not_recom -usual,critical,incomplete,2,convenient,inconv,slightly_prob,recommended,priority -usual,critical,incomplete,2,convenient,inconv,slightly_prob,priority,spec_prior -usual,critical,incomplete,2,convenient,inconv,slightly_prob,not_recom,not_recom -usual,critical,incomplete,2,convenient,inconv,problematic,recommended,spec_prior -usual,critical,incomplete,2,convenient,inconv,problematic,priority,spec_prior -usual,critical,incomplete,2,convenient,inconv,problematic,not_recom,not_recom -usual,critical,incomplete,2,less_conv,convenient,nonprob,recommended,priority -usual,critical,incomplete,2,less_conv,convenient,nonprob,priority,spec_prior -usual,critical,incomplete,2,less_conv,convenient,nonprob,not_recom,not_recom -usual,critical,incomplete,2,less_conv,convenient,slightly_prob,recommended,priority -usual,critical,incomplete,2,less_conv,convenient,slightly_prob,priority,spec_prior -usual,critical,incomplete,2,less_conv,convenient,slightly_prob,not_recom,not_recom -usual,critical,incomplete,2,less_conv,convenient,problematic,recommended,spec_prior -usual,critical,incomplete,2,less_conv,convenient,problematic,priority,spec_prior -usual,critical,incomplete,2,less_conv,convenient,problematic,not_recom,not_recom 
-usual,critical,incomplete,2,less_conv,inconv,nonprob,recommended,priority -usual,critical,incomplete,2,less_conv,inconv,nonprob,priority,spec_prior -usual,critical,incomplete,2,less_conv,inconv,nonprob,not_recom,not_recom -usual,critical,incomplete,2,less_conv,inconv,slightly_prob,recommended,priority -usual,critical,incomplete,2,less_conv,inconv,slightly_prob,priority,spec_prior -usual,critical,incomplete,2,less_conv,inconv,slightly_prob,not_recom,not_recom -usual,critical,incomplete,2,less_conv,inconv,problematic,recommended,spec_prior -usual,critical,incomplete,2,less_conv,inconv,problematic,priority,spec_prior -usual,critical,incomplete,2,less_conv,inconv,problematic,not_recom,not_recom -usual,critical,incomplete,2,critical,convenient,nonprob,recommended,priority -usual,critical,incomplete,2,critical,convenient,nonprob,priority,spec_prior -usual,critical,incomplete,2,critical,convenient,nonprob,not_recom,not_recom -usual,critical,incomplete,2,critical,convenient,slightly_prob,recommended,priority -usual,critical,incomplete,2,critical,convenient,slightly_prob,priority,spec_prior -usual,critical,incomplete,2,critical,convenient,slightly_prob,not_recom,not_recom -usual,critical,incomplete,2,critical,convenient,problematic,recommended,spec_prior -usual,critical,incomplete,2,critical,convenient,problematic,priority,spec_prior -usual,critical,incomplete,2,critical,convenient,problematic,not_recom,not_recom -usual,critical,incomplete,2,critical,inconv,nonprob,recommended,priority -usual,critical,incomplete,2,critical,inconv,nonprob,priority,spec_prior -usual,critical,incomplete,2,critical,inconv,nonprob,not_recom,not_recom -usual,critical,incomplete,2,critical,inconv,slightly_prob,recommended,priority -usual,critical,incomplete,2,critical,inconv,slightly_prob,priority,spec_prior -usual,critical,incomplete,2,critical,inconv,slightly_prob,not_recom,not_recom -usual,critical,incomplete,2,critical,inconv,problematic,recommended,spec_prior 
-usual,critical,incomplete,2,critical,inconv,problematic,priority,spec_prior -usual,critical,incomplete,2,critical,inconv,problematic,not_recom,not_recom -usual,critical,incomplete,3,convenient,convenient,nonprob,recommended,priority -usual,critical,incomplete,3,convenient,convenient,nonprob,priority,priority -usual,critical,incomplete,3,convenient,convenient,nonprob,not_recom,not_recom -usual,critical,incomplete,3,convenient,convenient,slightly_prob,recommended,priority -usual,critical,incomplete,3,convenient,convenient,slightly_prob,priority,priority -usual,critical,incomplete,3,convenient,convenient,slightly_prob,not_recom,not_recom -usual,critical,incomplete,3,convenient,convenient,problematic,recommended,priority -usual,critical,incomplete,3,convenient,convenient,problematic,priority,priority -usual,critical,incomplete,3,convenient,convenient,problematic,not_recom,not_recom -usual,critical,incomplete,3,convenient,inconv,nonprob,recommended,priority -usual,critical,incomplete,3,convenient,inconv,nonprob,priority,spec_prior -usual,critical,incomplete,3,convenient,inconv,nonprob,not_recom,not_recom -usual,critical,incomplete,3,convenient,inconv,slightly_prob,recommended,priority -usual,critical,incomplete,3,convenient,inconv,slightly_prob,priority,spec_prior -usual,critical,incomplete,3,convenient,inconv,slightly_prob,not_recom,not_recom -usual,critical,incomplete,3,convenient,inconv,problematic,recommended,spec_prior -usual,critical,incomplete,3,convenient,inconv,problematic,priority,spec_prior -usual,critical,incomplete,3,convenient,inconv,problematic,not_recom,not_recom -usual,critical,incomplete,3,less_conv,convenient,nonprob,recommended,priority -usual,critical,incomplete,3,less_conv,convenient,nonprob,priority,spec_prior -usual,critical,incomplete,3,less_conv,convenient,nonprob,not_recom,not_recom -usual,critical,incomplete,3,less_conv,convenient,slightly_prob,recommended,priority 
-usual,critical,incomplete,3,less_conv,convenient,slightly_prob,priority,spec_prior -usual,critical,incomplete,3,less_conv,convenient,slightly_prob,not_recom,not_recom -usual,critical,incomplete,3,less_conv,convenient,problematic,recommended,spec_prior -usual,critical,incomplete,3,less_conv,convenient,problematic,priority,spec_prior -usual,critical,incomplete,3,less_conv,convenient,problematic,not_recom,not_recom -usual,critical,incomplete,3,less_conv,inconv,nonprob,recommended,priority -usual,critical,incomplete,3,less_conv,inconv,nonprob,priority,spec_prior -usual,critical,incomplete,3,less_conv,inconv,nonprob,not_recom,not_recom -usual,critical,incomplete,3,less_conv,inconv,slightly_prob,recommended,priority -usual,critical,incomplete,3,less_conv,inconv,slightly_prob,priority,spec_prior -usual,critical,incomplete,3,less_conv,inconv,slightly_prob,not_recom,not_recom -usual,critical,incomplete,3,less_conv,inconv,problematic,recommended,spec_prior -usual,critical,incomplete,3,less_conv,inconv,problematic,priority,spec_prior -usual,critical,incomplete,3,less_conv,inconv,problematic,not_recom,not_recom -usual,critical,incomplete,3,critical,convenient,nonprob,recommended,priority -usual,critical,incomplete,3,critical,convenient,nonprob,priority,spec_prior -usual,critical,incomplete,3,critical,convenient,nonprob,not_recom,not_recom -usual,critical,incomplete,3,critical,convenient,slightly_prob,recommended,priority -usual,critical,incomplete,3,critical,convenient,slightly_prob,priority,spec_prior -usual,critical,incomplete,3,critical,convenient,slightly_prob,not_recom,not_recom -usual,critical,incomplete,3,critical,convenient,problematic,recommended,spec_prior -usual,critical,incomplete,3,critical,convenient,problematic,priority,spec_prior -usual,critical,incomplete,3,critical,convenient,problematic,not_recom,not_recom -usual,critical,incomplete,3,critical,inconv,nonprob,recommended,priority -usual,critical,incomplete,3,critical,inconv,nonprob,priority,spec_prior 
-usual,critical,incomplete,3,critical,inconv,nonprob,not_recom,not_recom -usual,critical,incomplete,3,critical,inconv,slightly_prob,recommended,priority -usual,critical,incomplete,3,critical,inconv,slightly_prob,priority,spec_prior -usual,critical,incomplete,3,critical,inconv,slightly_prob,not_recom,not_recom -usual,critical,incomplete,3,critical,inconv,problematic,recommended,spec_prior -usual,critical,incomplete,3,critical,inconv,problematic,priority,spec_prior -usual,critical,incomplete,3,critical,inconv,problematic,not_recom,not_recom -usual,critical,incomplete,more,convenient,convenient,nonprob,recommended,priority -usual,critical,incomplete,more,convenient,convenient,nonprob,priority,priority -usual,critical,incomplete,more,convenient,convenient,nonprob,not_recom,not_recom -usual,critical,incomplete,more,convenient,convenient,slightly_prob,recommended,priority -usual,critical,incomplete,more,convenient,convenient,slightly_prob,priority,priority -usual,critical,incomplete,more,convenient,convenient,slightly_prob,not_recom,not_recom -usual,critical,incomplete,more,convenient,convenient,problematic,recommended,priority -usual,critical,incomplete,more,convenient,convenient,problematic,priority,priority -usual,critical,incomplete,more,convenient,convenient,problematic,not_recom,not_recom -usual,critical,incomplete,more,convenient,inconv,nonprob,recommended,priority -usual,critical,incomplete,more,convenient,inconv,nonprob,priority,spec_prior -usual,critical,incomplete,more,convenient,inconv,nonprob,not_recom,not_recom -usual,critical,incomplete,more,convenient,inconv,slightly_prob,recommended,priority -usual,critical,incomplete,more,convenient,inconv,slightly_prob,priority,spec_prior -usual,critical,incomplete,more,convenient,inconv,slightly_prob,not_recom,not_recom -usual,critical,incomplete,more,convenient,inconv,problematic,recommended,spec_prior -usual,critical,incomplete,more,convenient,inconv,problematic,priority,spec_prior 
-usual,critical,incomplete,more,convenient,inconv,problematic,not_recom,not_recom -usual,critical,incomplete,more,less_conv,convenient,nonprob,recommended,priority -usual,critical,incomplete,more,less_conv,convenient,nonprob,priority,spec_prior -usual,critical,incomplete,more,less_conv,convenient,nonprob,not_recom,not_recom -usual,critical,incomplete,more,less_conv,convenient,slightly_prob,recommended,priority -usual,critical,incomplete,more,less_conv,convenient,slightly_prob,priority,spec_prior -usual,critical,incomplete,more,less_conv,convenient,slightly_prob,not_recom,not_recom -usual,critical,incomplete,more,less_conv,convenient,problematic,recommended,spec_prior -usual,critical,incomplete,more,less_conv,convenient,problematic,priority,spec_prior -usual,critical,incomplete,more,less_conv,convenient,problematic,not_recom,not_recom -usual,critical,incomplete,more,less_conv,inconv,nonprob,recommended,priority -usual,critical,incomplete,more,less_conv,inconv,nonprob,priority,spec_prior -usual,critical,incomplete,more,less_conv,inconv,nonprob,not_recom,not_recom -usual,critical,incomplete,more,less_conv,inconv,slightly_prob,recommended,priority -usual,critical,incomplete,more,less_conv,inconv,slightly_prob,priority,spec_prior -usual,critical,incomplete,more,less_conv,inconv,slightly_prob,not_recom,not_recom -usual,critical,incomplete,more,less_conv,inconv,problematic,recommended,spec_prior -usual,critical,incomplete,more,less_conv,inconv,problematic,priority,spec_prior -usual,critical,incomplete,more,less_conv,inconv,problematic,not_recom,not_recom -usual,critical,incomplete,more,critical,convenient,nonprob,recommended,priority -usual,critical,incomplete,more,critical,convenient,nonprob,priority,spec_prior -usual,critical,incomplete,more,critical,convenient,nonprob,not_recom,not_recom -usual,critical,incomplete,more,critical,convenient,slightly_prob,recommended,priority -usual,critical,incomplete,more,critical,convenient,slightly_prob,priority,spec_prior 
-usual,critical,incomplete,more,critical,convenient,slightly_prob,not_recom,not_recom -usual,critical,incomplete,more,critical,convenient,problematic,recommended,spec_prior -usual,critical,incomplete,more,critical,convenient,problematic,priority,spec_prior -usual,critical,incomplete,more,critical,convenient,problematic,not_recom,not_recom -usual,critical,incomplete,more,critical,inconv,nonprob,recommended,priority -usual,critical,incomplete,more,critical,inconv,nonprob,priority,spec_prior -usual,critical,incomplete,more,critical,inconv,nonprob,not_recom,not_recom -usual,critical,incomplete,more,critical,inconv,slightly_prob,recommended,priority -usual,critical,incomplete,more,critical,inconv,slightly_prob,priority,spec_prior -usual,critical,incomplete,more,critical,inconv,slightly_prob,not_recom,not_recom -usual,critical,incomplete,more,critical,inconv,problematic,recommended,spec_prior -usual,critical,incomplete,more,critical,inconv,problematic,priority,spec_prior -usual,critical,incomplete,more,critical,inconv,problematic,not_recom,not_recom -usual,critical,foster,1,convenient,convenient,nonprob,recommended,priority -usual,critical,foster,1,convenient,convenient,nonprob,priority,priority -usual,critical,foster,1,convenient,convenient,nonprob,not_recom,not_recom -usual,critical,foster,1,convenient,convenient,slightly_prob,recommended,priority -usual,critical,foster,1,convenient,convenient,slightly_prob,priority,priority -usual,critical,foster,1,convenient,convenient,slightly_prob,not_recom,not_recom -usual,critical,foster,1,convenient,convenient,problematic,recommended,priority -usual,critical,foster,1,convenient,convenient,problematic,priority,priority -usual,critical,foster,1,convenient,convenient,problematic,not_recom,not_recom -usual,critical,foster,1,convenient,inconv,nonprob,recommended,priority -usual,critical,foster,1,convenient,inconv,nonprob,priority,spec_prior -usual,critical,foster,1,convenient,inconv,nonprob,not_recom,not_recom 
-usual,critical,foster,1,convenient,inconv,slightly_prob,recommended,priority -usual,critical,foster,1,convenient,inconv,slightly_prob,priority,spec_prior -usual,critical,foster,1,convenient,inconv,slightly_prob,not_recom,not_recom -usual,critical,foster,1,convenient,inconv,problematic,recommended,spec_prior -usual,critical,foster,1,convenient,inconv,problematic,priority,spec_prior -usual,critical,foster,1,convenient,inconv,problematic,not_recom,not_recom -usual,critical,foster,1,less_conv,convenient,nonprob,recommended,priority -usual,critical,foster,1,less_conv,convenient,nonprob,priority,spec_prior -usual,critical,foster,1,less_conv,convenient,nonprob,not_recom,not_recom -usual,critical,foster,1,less_conv,convenient,slightly_prob,recommended,priority -usual,critical,foster,1,less_conv,convenient,slightly_prob,priority,spec_prior -usual,critical,foster,1,less_conv,convenient,slightly_prob,not_recom,not_recom -usual,critical,foster,1,less_conv,convenient,problematic,recommended,spec_prior -usual,critical,foster,1,less_conv,convenient,problematic,priority,spec_prior -usual,critical,foster,1,less_conv,convenient,problematic,not_recom,not_recom -usual,critical,foster,1,less_conv,inconv,nonprob,recommended,priority -usual,critical,foster,1,less_conv,inconv,nonprob,priority,spec_prior -usual,critical,foster,1,less_conv,inconv,nonprob,not_recom,not_recom -usual,critical,foster,1,less_conv,inconv,slightly_prob,recommended,priority -usual,critical,foster,1,less_conv,inconv,slightly_prob,priority,spec_prior -usual,critical,foster,1,less_conv,inconv,slightly_prob,not_recom,not_recom -usual,critical,foster,1,less_conv,inconv,problematic,recommended,spec_prior -usual,critical,foster,1,less_conv,inconv,problematic,priority,spec_prior -usual,critical,foster,1,less_conv,inconv,problematic,not_recom,not_recom -usual,critical,foster,1,critical,convenient,nonprob,recommended,priority -usual,critical,foster,1,critical,convenient,nonprob,priority,spec_prior 
-usual,critical,foster,1,critical,convenient,nonprob,not_recom,not_recom -usual,critical,foster,1,critical,convenient,slightly_prob,recommended,priority -usual,critical,foster,1,critical,convenient,slightly_prob,priority,spec_prior -usual,critical,foster,1,critical,convenient,slightly_prob,not_recom,not_recom -usual,critical,foster,1,critical,convenient,problematic,recommended,spec_prior -usual,critical,foster,1,critical,convenient,problematic,priority,spec_prior -usual,critical,foster,1,critical,convenient,problematic,not_recom,not_recom -usual,critical,foster,1,critical,inconv,nonprob,recommended,priority -usual,critical,foster,1,critical,inconv,nonprob,priority,spec_prior -usual,critical,foster,1,critical,inconv,nonprob,not_recom,not_recom -usual,critical,foster,1,critical,inconv,slightly_prob,recommended,priority -usual,critical,foster,1,critical,inconv,slightly_prob,priority,spec_prior -usual,critical,foster,1,critical,inconv,slightly_prob,not_recom,not_recom -usual,critical,foster,1,critical,inconv,problematic,recommended,spec_prior -usual,critical,foster,1,critical,inconv,problematic,priority,spec_prior -usual,critical,foster,1,critical,inconv,problematic,not_recom,not_recom -usual,critical,foster,2,convenient,convenient,nonprob,recommended,priority -usual,critical,foster,2,convenient,convenient,nonprob,priority,priority -usual,critical,foster,2,convenient,convenient,nonprob,not_recom,not_recom -usual,critical,foster,2,convenient,convenient,slightly_prob,recommended,priority -usual,critical,foster,2,convenient,convenient,slightly_prob,priority,priority -usual,critical,foster,2,convenient,convenient,slightly_prob,not_recom,not_recom -usual,critical,foster,2,convenient,convenient,problematic,recommended,priority -usual,critical,foster,2,convenient,convenient,problematic,priority,priority -usual,critical,foster,2,convenient,convenient,problematic,not_recom,not_recom -usual,critical,foster,2,convenient,inconv,nonprob,recommended,priority 
-usual,critical,foster,2,convenient,inconv,nonprob,priority,spec_prior -usual,critical,foster,2,convenient,inconv,nonprob,not_recom,not_recom -usual,critical,foster,2,convenient,inconv,slightly_prob,recommended,priority -usual,critical,foster,2,convenient,inconv,slightly_prob,priority,spec_prior -usual,critical,foster,2,convenient,inconv,slightly_prob,not_recom,not_recom -usual,critical,foster,2,convenient,inconv,problematic,recommended,spec_prior -usual,critical,foster,2,convenient,inconv,problematic,priority,spec_prior -usual,critical,foster,2,convenient,inconv,problematic,not_recom,not_recom -usual,critical,foster,2,less_conv,convenient,nonprob,recommended,priority -usual,critical,foster,2,less_conv,convenient,nonprob,priority,spec_prior -usual,critical,foster,2,less_conv,convenient,nonprob,not_recom,not_recom -usual,critical,foster,2,less_conv,convenient,slightly_prob,recommended,priority -usual,critical,foster,2,less_conv,convenient,slightly_prob,priority,spec_prior -usual,critical,foster,2,less_conv,convenient,slightly_prob,not_recom,not_recom -usual,critical,foster,2,less_conv,convenient,problematic,recommended,spec_prior -usual,critical,foster,2,less_conv,convenient,problematic,priority,spec_prior -usual,critical,foster,2,less_conv,convenient,problematic,not_recom,not_recom -usual,critical,foster,2,less_conv,inconv,nonprob,recommended,priority -usual,critical,foster,2,less_conv,inconv,nonprob,priority,spec_prior -usual,critical,foster,2,less_conv,inconv,nonprob,not_recom,not_recom -usual,critical,foster,2,less_conv,inconv,slightly_prob,recommended,priority -usual,critical,foster,2,less_conv,inconv,slightly_prob,priority,spec_prior -usual,critical,foster,2,less_conv,inconv,slightly_prob,not_recom,not_recom -usual,critical,foster,2,less_conv,inconv,problematic,recommended,spec_prior -usual,critical,foster,2,less_conv,inconv,problematic,priority,spec_prior -usual,critical,foster,2,less_conv,inconv,problematic,not_recom,not_recom 
-usual,critical,foster,2,critical,convenient,nonprob,recommended,priority -usual,critical,foster,2,critical,convenient,nonprob,priority,spec_prior -usual,critical,foster,2,critical,convenient,nonprob,not_recom,not_recom -usual,critical,foster,2,critical,convenient,slightly_prob,recommended,priority -usual,critical,foster,2,critical,convenient,slightly_prob,priority,spec_prior -usual,critical,foster,2,critical,convenient,slightly_prob,not_recom,not_recom -usual,critical,foster,2,critical,convenient,problematic,recommended,spec_prior -usual,critical,foster,2,critical,convenient,problematic,priority,spec_prior -usual,critical,foster,2,critical,convenient,problematic,not_recom,not_recom -usual,critical,foster,2,critical,inconv,nonprob,recommended,priority -usual,critical,foster,2,critical,inconv,nonprob,priority,spec_prior -usual,critical,foster,2,critical,inconv,nonprob,not_recom,not_recom -usual,critical,foster,2,critical,inconv,slightly_prob,recommended,priority -usual,critical,foster,2,critical,inconv,slightly_prob,priority,spec_prior -usual,critical,foster,2,critical,inconv,slightly_prob,not_recom,not_recom -usual,critical,foster,2,critical,inconv,problematic,recommended,spec_prior -usual,critical,foster,2,critical,inconv,problematic,priority,spec_prior -usual,critical,foster,2,critical,inconv,problematic,not_recom,not_recom -usual,critical,foster,3,convenient,convenient,nonprob,recommended,priority -usual,critical,foster,3,convenient,convenient,nonprob,priority,priority -usual,critical,foster,3,convenient,convenient,nonprob,not_recom,not_recom -usual,critical,foster,3,convenient,convenient,slightly_prob,recommended,priority -usual,critical,foster,3,convenient,convenient,slightly_prob,priority,priority -usual,critical,foster,3,convenient,convenient,slightly_prob,not_recom,not_recom -usual,critical,foster,3,convenient,convenient,problematic,recommended,priority -usual,critical,foster,3,convenient,convenient,problematic,priority,priority 
-usual,critical,foster,3,convenient,convenient,problematic,not_recom,not_recom -usual,critical,foster,3,convenient,inconv,nonprob,recommended,priority -usual,critical,foster,3,convenient,inconv,nonprob,priority,spec_prior -usual,critical,foster,3,convenient,inconv,nonprob,not_recom,not_recom -usual,critical,foster,3,convenient,inconv,slightly_prob,recommended,priority -usual,critical,foster,3,convenient,inconv,slightly_prob,priority,spec_prior -usual,critical,foster,3,convenient,inconv,slightly_prob,not_recom,not_recom -usual,critical,foster,3,convenient,inconv,problematic,recommended,spec_prior -usual,critical,foster,3,convenient,inconv,problematic,priority,spec_prior -usual,critical,foster,3,convenient,inconv,problematic,not_recom,not_recom -usual,critical,foster,3,less_conv,convenient,nonprob,recommended,priority -usual,critical,foster,3,less_conv,convenient,nonprob,priority,spec_prior -usual,critical,foster,3,less_conv,convenient,nonprob,not_recom,not_recom -usual,critical,foster,3,less_conv,convenient,slightly_prob,recommended,priority -usual,critical,foster,3,less_conv,convenient,slightly_prob,priority,spec_prior -usual,critical,foster,3,less_conv,convenient,slightly_prob,not_recom,not_recom -usual,critical,foster,3,less_conv,convenient,problematic,recommended,spec_prior -usual,critical,foster,3,less_conv,convenient,problematic,priority,spec_prior -usual,critical,foster,3,less_conv,convenient,problematic,not_recom,not_recom -usual,critical,foster,3,less_conv,inconv,nonprob,recommended,priority -usual,critical,foster,3,less_conv,inconv,nonprob,priority,spec_prior -usual,critical,foster,3,less_conv,inconv,nonprob,not_recom,not_recom -usual,critical,foster,3,less_conv,inconv,slightly_prob,recommended,priority -usual,critical,foster,3,less_conv,inconv,slightly_prob,priority,spec_prior -usual,critical,foster,3,less_conv,inconv,slightly_prob,not_recom,not_recom -usual,critical,foster,3,less_conv,inconv,problematic,recommended,spec_prior 
-usual,critical,foster,3,less_conv,inconv,problematic,priority,spec_prior -usual,critical,foster,3,less_conv,inconv,problematic,not_recom,not_recom -usual,critical,foster,3,critical,convenient,nonprob,recommended,priority -usual,critical,foster,3,critical,convenient,nonprob,priority,spec_prior -usual,critical,foster,3,critical,convenient,nonprob,not_recom,not_recom -usual,critical,foster,3,critical,convenient,slightly_prob,recommended,priority -usual,critical,foster,3,critical,convenient,slightly_prob,priority,spec_prior -usual,critical,foster,3,critical,convenient,slightly_prob,not_recom,not_recom -usual,critical,foster,3,critical,convenient,problematic,recommended,spec_prior -usual,critical,foster,3,critical,convenient,problematic,priority,spec_prior -usual,critical,foster,3,critical,convenient,problematic,not_recom,not_recom -usual,critical,foster,3,critical,inconv,nonprob,recommended,priority -usual,critical,foster,3,critical,inconv,nonprob,priority,spec_prior -usual,critical,foster,3,critical,inconv,nonprob,not_recom,not_recom -usual,critical,foster,3,critical,inconv,slightly_prob,recommended,priority -usual,critical,foster,3,critical,inconv,slightly_prob,priority,spec_prior -usual,critical,foster,3,critical,inconv,slightly_prob,not_recom,not_recom -usual,critical,foster,3,critical,inconv,problematic,recommended,spec_prior -usual,critical,foster,3,critical,inconv,problematic,priority,spec_prior -usual,critical,foster,3,critical,inconv,problematic,not_recom,not_recom -usual,critical,foster,more,convenient,convenient,nonprob,recommended,priority -usual,critical,foster,more,convenient,convenient,nonprob,priority,priority -usual,critical,foster,more,convenient,convenient,nonprob,not_recom,not_recom -usual,critical,foster,more,convenient,convenient,slightly_prob,recommended,priority -usual,critical,foster,more,convenient,convenient,slightly_prob,priority,priority -usual,critical,foster,more,convenient,convenient,slightly_prob,not_recom,not_recom 
-usual,critical,foster,more,convenient,convenient,problematic,recommended,priority -usual,critical,foster,more,convenient,convenient,problematic,priority,priority -usual,critical,foster,more,convenient,convenient,problematic,not_recom,not_recom -usual,critical,foster,more,convenient,inconv,nonprob,recommended,priority -usual,critical,foster,more,convenient,inconv,nonprob,priority,spec_prior -usual,critical,foster,more,convenient,inconv,nonprob,not_recom,not_recom -usual,critical,foster,more,convenient,inconv,slightly_prob,recommended,priority -usual,critical,foster,more,convenient,inconv,slightly_prob,priority,spec_prior -usual,critical,foster,more,convenient,inconv,slightly_prob,not_recom,not_recom -usual,critical,foster,more,convenient,inconv,problematic,recommended,spec_prior -usual,critical,foster,more,convenient,inconv,problematic,priority,spec_prior -usual,critical,foster,more,convenient,inconv,problematic,not_recom,not_recom -usual,critical,foster,more,less_conv,convenient,nonprob,recommended,priority -usual,critical,foster,more,less_conv,convenient,nonprob,priority,spec_prior -usual,critical,foster,more,less_conv,convenient,nonprob,not_recom,not_recom -usual,critical,foster,more,less_conv,convenient,slightly_prob,recommended,priority -usual,critical,foster,more,less_conv,convenient,slightly_prob,priority,spec_prior -usual,critical,foster,more,less_conv,convenient,slightly_prob,not_recom,not_recom -usual,critical,foster,more,less_conv,convenient,problematic,recommended,spec_prior -usual,critical,foster,more,less_conv,convenient,problematic,priority,spec_prior -usual,critical,foster,more,less_conv,convenient,problematic,not_recom,not_recom -usual,critical,foster,more,less_conv,inconv,nonprob,recommended,priority -usual,critical,foster,more,less_conv,inconv,nonprob,priority,spec_prior -usual,critical,foster,more,less_conv,inconv,nonprob,not_recom,not_recom -usual,critical,foster,more,less_conv,inconv,slightly_prob,recommended,priority 
-usual,critical,foster,more,less_conv,inconv,slightly_prob,priority,spec_prior -usual,critical,foster,more,less_conv,inconv,slightly_prob,not_recom,not_recom -usual,critical,foster,more,less_conv,inconv,problematic,recommended,spec_prior -usual,critical,foster,more,less_conv,inconv,problematic,priority,spec_prior -usual,critical,foster,more,less_conv,inconv,problematic,not_recom,not_recom -usual,critical,foster,more,critical,convenient,nonprob,recommended,priority -usual,critical,foster,more,critical,convenient,nonprob,priority,spec_prior -usual,critical,foster,more,critical,convenient,nonprob,not_recom,not_recom -usual,critical,foster,more,critical,convenient,slightly_prob,recommended,priority -usual,critical,foster,more,critical,convenient,slightly_prob,priority,spec_prior -usual,critical,foster,more,critical,convenient,slightly_prob,not_recom,not_recom -usual,critical,foster,more,critical,convenient,problematic,recommended,spec_prior -usual,critical,foster,more,critical,convenient,problematic,priority,spec_prior -usual,critical,foster,more,critical,convenient,problematic,not_recom,not_recom -usual,critical,foster,more,critical,inconv,nonprob,recommended,priority -usual,critical,foster,more,critical,inconv,nonprob,priority,spec_prior -usual,critical,foster,more,critical,inconv,nonprob,not_recom,not_recom -usual,critical,foster,more,critical,inconv,slightly_prob,recommended,priority -usual,critical,foster,more,critical,inconv,slightly_prob,priority,spec_prior -usual,critical,foster,more,critical,inconv,slightly_prob,not_recom,not_recom -usual,critical,foster,more,critical,inconv,problematic,recommended,spec_prior -usual,critical,foster,more,critical,inconv,problematic,priority,spec_prior -usual,critical,foster,more,critical,inconv,problematic,not_recom,not_recom -usual,very_crit,complete,1,convenient,convenient,nonprob,recommended,priority -usual,very_crit,complete,1,convenient,convenient,nonprob,priority,priority 
-usual,very_crit,complete,1,convenient,convenient,nonprob,not_recom,not_recom -usual,very_crit,complete,1,convenient,convenient,slightly_prob,recommended,priority -usual,very_crit,complete,1,convenient,convenient,slightly_prob,priority,priority -usual,very_crit,complete,1,convenient,convenient,slightly_prob,not_recom,not_recom -usual,very_crit,complete,1,convenient,convenient,problematic,recommended,priority -usual,very_crit,complete,1,convenient,convenient,problematic,priority,priority -usual,very_crit,complete,1,convenient,convenient,problematic,not_recom,not_recom -usual,very_crit,complete,1,convenient,inconv,nonprob,recommended,priority -usual,very_crit,complete,1,convenient,inconv,nonprob,priority,spec_prior -usual,very_crit,complete,1,convenient,inconv,nonprob,not_recom,not_recom -usual,very_crit,complete,1,convenient,inconv,slightly_prob,recommended,priority -usual,very_crit,complete,1,convenient,inconv,slightly_prob,priority,spec_prior -usual,very_crit,complete,1,convenient,inconv,slightly_prob,not_recom,not_recom -usual,very_crit,complete,1,convenient,inconv,problematic,recommended,spec_prior -usual,very_crit,complete,1,convenient,inconv,problematic,priority,spec_prior -usual,very_crit,complete,1,convenient,inconv,problematic,not_recom,not_recom -usual,very_crit,complete,1,less_conv,convenient,nonprob,recommended,priority -usual,very_crit,complete,1,less_conv,convenient,nonprob,priority,spec_prior -usual,very_crit,complete,1,less_conv,convenient,nonprob,not_recom,not_recom -usual,very_crit,complete,1,less_conv,convenient,slightly_prob,recommended,priority -usual,very_crit,complete,1,less_conv,convenient,slightly_prob,priority,spec_prior -usual,very_crit,complete,1,less_conv,convenient,slightly_prob,not_recom,not_recom -usual,very_crit,complete,1,less_conv,convenient,problematic,recommended,spec_prior -usual,very_crit,complete,1,less_conv,convenient,problematic,priority,spec_prior 
-usual,very_crit,complete,1,less_conv,convenient,problematic,not_recom,not_recom -usual,very_crit,complete,1,less_conv,inconv,nonprob,recommended,priority -usual,very_crit,complete,1,less_conv,inconv,nonprob,priority,spec_prior -usual,very_crit,complete,1,less_conv,inconv,nonprob,not_recom,not_recom -usual,very_crit,complete,1,less_conv,inconv,slightly_prob,recommended,priority -usual,very_crit,complete,1,less_conv,inconv,slightly_prob,priority,spec_prior -usual,very_crit,complete,1,less_conv,inconv,slightly_prob,not_recom,not_recom -usual,very_crit,complete,1,less_conv,inconv,problematic,recommended,spec_prior -usual,very_crit,complete,1,less_conv,inconv,problematic,priority,spec_prior -usual,very_crit,complete,1,less_conv,inconv,problematic,not_recom,not_recom -usual,very_crit,complete,1,critical,convenient,nonprob,recommended,priority -usual,very_crit,complete,1,critical,convenient,nonprob,priority,spec_prior -usual,very_crit,complete,1,critical,convenient,nonprob,not_recom,not_recom -usual,very_crit,complete,1,critical,convenient,slightly_prob,recommended,priority -usual,very_crit,complete,1,critical,convenient,slightly_prob,priority,spec_prior -usual,very_crit,complete,1,critical,convenient,slightly_prob,not_recom,not_recom -usual,very_crit,complete,1,critical,convenient,problematic,recommended,spec_prior -usual,very_crit,complete,1,critical,convenient,problematic,priority,spec_prior -usual,very_crit,complete,1,critical,convenient,problematic,not_recom,not_recom -usual,very_crit,complete,1,critical,inconv,nonprob,recommended,priority -usual,very_crit,complete,1,critical,inconv,nonprob,priority,spec_prior -usual,very_crit,complete,1,critical,inconv,nonprob,not_recom,not_recom -usual,very_crit,complete,1,critical,inconv,slightly_prob,recommended,priority -usual,very_crit,complete,1,critical,inconv,slightly_prob,priority,spec_prior -usual,very_crit,complete,1,critical,inconv,slightly_prob,not_recom,not_recom 
-usual,very_crit,complete,1,critical,inconv,problematic,recommended,spec_prior -usual,very_crit,complete,1,critical,inconv,problematic,priority,spec_prior -usual,very_crit,complete,1,critical,inconv,problematic,not_recom,not_recom -usual,very_crit,complete,2,convenient,convenient,nonprob,recommended,priority -usual,very_crit,complete,2,convenient,convenient,nonprob,priority,spec_prior -usual,very_crit,complete,2,convenient,convenient,nonprob,not_recom,not_recom -usual,very_crit,complete,2,convenient,convenient,slightly_prob,recommended,priority -usual,very_crit,complete,2,convenient,convenient,slightly_prob,priority,spec_prior -usual,very_crit,complete,2,convenient,convenient,slightly_prob,not_recom,not_recom -usual,very_crit,complete,2,convenient,convenient,problematic,recommended,spec_prior -usual,very_crit,complete,2,convenient,convenient,problematic,priority,spec_prior -usual,very_crit,complete,2,convenient,convenient,problematic,not_recom,not_recom -usual,very_crit,complete,2,convenient,inconv,nonprob,recommended,priority -usual,very_crit,complete,2,convenient,inconv,nonprob,priority,spec_prior -usual,very_crit,complete,2,convenient,inconv,nonprob,not_recom,not_recom -usual,very_crit,complete,2,convenient,inconv,slightly_prob,recommended,priority -usual,very_crit,complete,2,convenient,inconv,slightly_prob,priority,spec_prior -usual,very_crit,complete,2,convenient,inconv,slightly_prob,not_recom,not_recom -usual,very_crit,complete,2,convenient,inconv,problematic,recommended,spec_prior -usual,very_crit,complete,2,convenient,inconv,problematic,priority,spec_prior -usual,very_crit,complete,2,convenient,inconv,problematic,not_recom,not_recom -usual,very_crit,complete,2,less_conv,convenient,nonprob,recommended,priority -usual,very_crit,complete,2,less_conv,convenient,nonprob,priority,spec_prior -usual,very_crit,complete,2,less_conv,convenient,nonprob,not_recom,not_recom -usual,very_crit,complete,2,less_conv,convenient,slightly_prob,recommended,priority 
-usual,very_crit,complete,2,less_conv,convenient,slightly_prob,priority,spec_prior -usual,very_crit,complete,2,less_conv,convenient,slightly_prob,not_recom,not_recom -usual,very_crit,complete,2,less_conv,convenient,problematic,recommended,spec_prior -usual,very_crit,complete,2,less_conv,convenient,problematic,priority,spec_prior -usual,very_crit,complete,2,less_conv,convenient,problematic,not_recom,not_recom -usual,very_crit,complete,2,less_conv,inconv,nonprob,recommended,priority -usual,very_crit,complete,2,less_conv,inconv,nonprob,priority,spec_prior -usual,very_crit,complete,2,less_conv,inconv,nonprob,not_recom,not_recom -usual,very_crit,complete,2,less_conv,inconv,slightly_prob,recommended,priority -usual,very_crit,complete,2,less_conv,inconv,slightly_prob,priority,spec_prior -usual,very_crit,complete,2,less_conv,inconv,slightly_prob,not_recom,not_recom -usual,very_crit,complete,2,less_conv,inconv,problematic,recommended,spec_prior -usual,very_crit,complete,2,less_conv,inconv,problematic,priority,spec_prior -usual,very_crit,complete,2,less_conv,inconv,problematic,not_recom,not_recom -usual,very_crit,complete,2,critical,convenient,nonprob,recommended,spec_prior -usual,very_crit,complete,2,critical,convenient,nonprob,priority,spec_prior -usual,very_crit,complete,2,critical,convenient,nonprob,not_recom,not_recom -usual,very_crit,complete,2,critical,convenient,slightly_prob,recommended,spec_prior -usual,very_crit,complete,2,critical,convenient,slightly_prob,priority,spec_prior -usual,very_crit,complete,2,critical,convenient,slightly_prob,not_recom,not_recom -usual,very_crit,complete,2,critical,convenient,problematic,recommended,spec_prior -usual,very_crit,complete,2,critical,convenient,problematic,priority,spec_prior -usual,very_crit,complete,2,critical,convenient,problematic,not_recom,not_recom -usual,very_crit,complete,2,critical,inconv,nonprob,recommended,spec_prior -usual,very_crit,complete,2,critical,inconv,nonprob,priority,spec_prior 
-usual,very_crit,complete,2,critical,inconv,nonprob,not_recom,not_recom -usual,very_crit,complete,2,critical,inconv,slightly_prob,recommended,spec_prior -usual,very_crit,complete,2,critical,inconv,slightly_prob,priority,spec_prior -usual,very_crit,complete,2,critical,inconv,slightly_prob,not_recom,not_recom -usual,very_crit,complete,2,critical,inconv,problematic,recommended,spec_prior -usual,very_crit,complete,2,critical,inconv,problematic,priority,spec_prior -usual,very_crit,complete,2,critical,inconv,problematic,not_recom,not_recom -usual,very_crit,complete,3,convenient,convenient,nonprob,recommended,priority -usual,very_crit,complete,3,convenient,convenient,nonprob,priority,spec_prior -usual,very_crit,complete,3,convenient,convenient,nonprob,not_recom,not_recom -usual,very_crit,complete,3,convenient,convenient,slightly_prob,recommended,priority -usual,very_crit,complete,3,convenient,convenient,slightly_prob,priority,spec_prior -usual,very_crit,complete,3,convenient,convenient,slightly_prob,not_recom,not_recom -usual,very_crit,complete,3,convenient,convenient,problematic,recommended,spec_prior -usual,very_crit,complete,3,convenient,convenient,problematic,priority,spec_prior -usual,very_crit,complete,3,convenient,convenient,problematic,not_recom,not_recom -usual,very_crit,complete,3,convenient,inconv,nonprob,recommended,spec_prior -usual,very_crit,complete,3,convenient,inconv,nonprob,priority,spec_prior -usual,very_crit,complete,3,convenient,inconv,nonprob,not_recom,not_recom -usual,very_crit,complete,3,convenient,inconv,slightly_prob,recommended,spec_prior -usual,very_crit,complete,3,convenient,inconv,slightly_prob,priority,spec_prior -usual,very_crit,complete,3,convenient,inconv,slightly_prob,not_recom,not_recom -usual,very_crit,complete,3,convenient,inconv,problematic,recommended,spec_prior -usual,very_crit,complete,3,convenient,inconv,problematic,priority,spec_prior -usual,very_crit,complete,3,convenient,inconv,problematic,not_recom,not_recom 
-usual,very_crit,complete,3,less_conv,convenient,nonprob,recommended,spec_prior -usual,very_crit,complete,3,less_conv,convenient,nonprob,priority,spec_prior -usual,very_crit,complete,3,less_conv,convenient,nonprob,not_recom,not_recom -usual,very_crit,complete,3,less_conv,convenient,slightly_prob,recommended,spec_prior -usual,very_crit,complete,3,less_conv,convenient,slightly_prob,priority,spec_prior -usual,very_crit,complete,3,less_conv,convenient,slightly_prob,not_recom,not_recom -usual,very_crit,complete,3,less_conv,convenient,problematic,recommended,spec_prior -usual,very_crit,complete,3,less_conv,convenient,problematic,priority,spec_prior -usual,very_crit,complete,3,less_conv,convenient,problematic,not_recom,not_recom -usual,very_crit,complete,3,less_conv,inconv,nonprob,recommended,spec_prior -usual,very_crit,complete,3,less_conv,inconv,nonprob,priority,spec_prior -usual,very_crit,complete,3,less_conv,inconv,nonprob,not_recom,not_recom -usual,very_crit,complete,3,less_conv,inconv,slightly_prob,recommended,spec_prior -usual,very_crit,complete,3,less_conv,inconv,slightly_prob,priority,spec_prior -usual,very_crit,complete,3,less_conv,inconv,slightly_prob,not_recom,not_recom -usual,very_crit,complete,3,less_conv,inconv,problematic,recommended,spec_prior -usual,very_crit,complete,3,less_conv,inconv,problematic,priority,spec_prior -usual,very_crit,complete,3,less_conv,inconv,problematic,not_recom,not_recom -usual,very_crit,complete,3,critical,convenient,nonprob,recommended,spec_prior -usual,very_crit,complete,3,critical,convenient,nonprob,priority,spec_prior -usual,very_crit,complete,3,critical,convenient,nonprob,not_recom,not_recom -usual,very_crit,complete,3,critical,convenient,slightly_prob,recommended,spec_prior -usual,very_crit,complete,3,critical,convenient,slightly_prob,priority,spec_prior -usual,very_crit,complete,3,critical,convenient,slightly_prob,not_recom,not_recom -usual,very_crit,complete,3,critical,convenient,problematic,recommended,spec_prior 
-usual,very_crit,complete,3,critical,convenient,problematic,priority,spec_prior -usual,very_crit,complete,3,critical,convenient,problematic,not_recom,not_recom -usual,very_crit,complete,3,critical,inconv,nonprob,recommended,spec_prior -usual,very_crit,complete,3,critical,inconv,nonprob,priority,spec_prior -usual,very_crit,complete,3,critical,inconv,nonprob,not_recom,not_recom -usual,very_crit,complete,3,critical,inconv,slightly_prob,recommended,spec_prior -usual,very_crit,complete,3,critical,inconv,slightly_prob,priority,spec_prior -usual,very_crit,complete,3,critical,inconv,slightly_prob,not_recom,not_recom -usual,very_crit,complete,3,critical,inconv,problematic,recommended,spec_prior -usual,very_crit,complete,3,critical,inconv,problematic,priority,spec_prior -usual,very_crit,complete,3,critical,inconv,problematic,not_recom,not_recom -usual,very_crit,complete,more,convenient,convenient,nonprob,recommended,priority -usual,very_crit,complete,more,convenient,convenient,nonprob,priority,spec_prior -usual,very_crit,complete,more,convenient,convenient,nonprob,not_recom,not_recom -usual,very_crit,complete,more,convenient,convenient,slightly_prob,recommended,priority -usual,very_crit,complete,more,convenient,convenient,slightly_prob,priority,spec_prior -usual,very_crit,complete,more,convenient,convenient,slightly_prob,not_recom,not_recom -usual,very_crit,complete,more,convenient,convenient,problematic,recommended,spec_prior -usual,very_crit,complete,more,convenient,convenient,problematic,priority,spec_prior -usual,very_crit,complete,more,convenient,convenient,problematic,not_recom,not_recom -usual,very_crit,complete,more,convenient,inconv,nonprob,recommended,spec_prior -usual,very_crit,complete,more,convenient,inconv,nonprob,priority,spec_prior -usual,very_crit,complete,more,convenient,inconv,nonprob,not_recom,not_recom -usual,very_crit,complete,more,convenient,inconv,slightly_prob,recommended,spec_prior 
-usual,very_crit,complete,more,convenient,inconv,slightly_prob,priority,spec_prior -usual,very_crit,complete,more,convenient,inconv,slightly_prob,not_recom,not_recom -usual,very_crit,complete,more,convenient,inconv,problematic,recommended,spec_prior -usual,very_crit,complete,more,convenient,inconv,problematic,priority,spec_prior -usual,very_crit,complete,more,convenient,inconv,problematic,not_recom,not_recom -usual,very_crit,complete,more,less_conv,convenient,nonprob,recommended,spec_prior -usual,very_crit,complete,more,less_conv,convenient,nonprob,priority,spec_prior -usual,very_crit,complete,more,less_conv,convenient,nonprob,not_recom,not_recom -usual,very_crit,complete,more,less_conv,convenient,slightly_prob,recommended,spec_prior -usual,very_crit,complete,more,less_conv,convenient,slightly_prob,priority,spec_prior -usual,very_crit,complete,more,less_conv,convenient,slightly_prob,not_recom,not_recom -usual,very_crit,complete,more,less_conv,convenient,problematic,recommended,spec_prior -usual,very_crit,complete,more,less_conv,convenient,problematic,priority,spec_prior -usual,very_crit,complete,more,less_conv,convenient,problematic,not_recom,not_recom -usual,very_crit,complete,more,less_conv,inconv,nonprob,recommended,spec_prior -usual,very_crit,complete,more,less_conv,inconv,nonprob,priority,spec_prior -usual,very_crit,complete,more,less_conv,inconv,nonprob,not_recom,not_recom -usual,very_crit,complete,more,less_conv,inconv,slightly_prob,recommended,spec_prior -usual,very_crit,complete,more,less_conv,inconv,slightly_prob,priority,spec_prior -usual,very_crit,complete,more,less_conv,inconv,slightly_prob,not_recom,not_recom -usual,very_crit,complete,more,less_conv,inconv,problematic,recommended,spec_prior -usual,very_crit,complete,more,less_conv,inconv,problematic,priority,spec_prior -usual,very_crit,complete,more,less_conv,inconv,problematic,not_recom,not_recom -usual,very_crit,complete,more,critical,convenient,nonprob,recommended,spec_prior 
-usual,very_crit,complete,more,critical,convenient,nonprob,priority,spec_prior -usual,very_crit,complete,more,critical,convenient,nonprob,not_recom,not_recom -usual,very_crit,complete,more,critical,convenient,slightly_prob,recommended,spec_prior -usual,very_crit,complete,more,critical,convenient,slightly_prob,priority,spec_prior -usual,very_crit,complete,more,critical,convenient,slightly_prob,not_recom,not_recom -usual,very_crit,complete,more,critical,convenient,problematic,recommended,spec_prior -usual,very_crit,complete,more,critical,convenient,problematic,priority,spec_prior -usual,very_crit,complete,more,critical,convenient,problematic,not_recom,not_recom -usual,very_crit,complete,more,critical,inconv,nonprob,recommended,spec_prior -usual,very_crit,complete,more,critical,inconv,nonprob,priority,spec_prior -usual,very_crit,complete,more,critical,inconv,nonprob,not_recom,not_recom -usual,very_crit,complete,more,critical,inconv,slightly_prob,recommended,spec_prior -usual,very_crit,complete,more,critical,inconv,slightly_prob,priority,spec_prior -usual,very_crit,complete,more,critical,inconv,slightly_prob,not_recom,not_recom -usual,very_crit,complete,more,critical,inconv,problematic,recommended,spec_prior -usual,very_crit,complete,more,critical,inconv,problematic,priority,spec_prior -usual,very_crit,complete,more,critical,inconv,problematic,not_recom,not_recom -usual,very_crit,completed,1,convenient,convenient,nonprob,recommended,priority -usual,very_crit,completed,1,convenient,convenient,nonprob,priority,spec_prior -usual,very_crit,completed,1,convenient,convenient,nonprob,not_recom,not_recom -usual,very_crit,completed,1,convenient,convenient,slightly_prob,recommended,priority -usual,very_crit,completed,1,convenient,convenient,slightly_prob,priority,spec_prior -usual,very_crit,completed,1,convenient,convenient,slightly_prob,not_recom,not_recom -usual,very_crit,completed,1,convenient,convenient,problematic,recommended,spec_prior 
-usual,very_crit,completed,1,convenient,convenient,problematic,priority,spec_prior -usual,very_crit,completed,1,convenient,convenient,problematic,not_recom,not_recom -usual,very_crit,completed,1,convenient,inconv,nonprob,recommended,priority -usual,very_crit,completed,1,convenient,inconv,nonprob,priority,spec_prior -usual,very_crit,completed,1,convenient,inconv,nonprob,not_recom,not_recom -usual,very_crit,completed,1,convenient,inconv,slightly_prob,recommended,priority -usual,very_crit,completed,1,convenient,inconv,slightly_prob,priority,spec_prior -usual,very_crit,completed,1,convenient,inconv,slightly_prob,not_recom,not_recom -usual,very_crit,completed,1,convenient,inconv,problematic,recommended,spec_prior -usual,very_crit,completed,1,convenient,inconv,problematic,priority,spec_prior -usual,very_crit,completed,1,convenient,inconv,problematic,not_recom,not_recom -usual,very_crit,completed,1,less_conv,convenient,nonprob,recommended,priority -usual,very_crit,completed,1,less_conv,convenient,nonprob,priority,spec_prior -usual,very_crit,completed,1,less_conv,convenient,nonprob,not_recom,not_recom -usual,very_crit,completed,1,less_conv,convenient,slightly_prob,recommended,priority -usual,very_crit,completed,1,less_conv,convenient,slightly_prob,priority,spec_prior -usual,very_crit,completed,1,less_conv,convenient,slightly_prob,not_recom,not_recom -usual,very_crit,completed,1,less_conv,convenient,problematic,recommended,spec_prior -usual,very_crit,completed,1,less_conv,convenient,problematic,priority,spec_prior -usual,very_crit,completed,1,less_conv,convenient,problematic,not_recom,not_recom -usual,very_crit,completed,1,less_conv,inconv,nonprob,recommended,priority -usual,very_crit,completed,1,less_conv,inconv,nonprob,priority,spec_prior -usual,very_crit,completed,1,less_conv,inconv,nonprob,not_recom,not_recom -usual,very_crit,completed,1,less_conv,inconv,slightly_prob,recommended,priority -usual,very_crit,completed,1,less_conv,inconv,slightly_prob,priority,spec_prior 
-usual,very_crit,completed,1,less_conv,inconv,slightly_prob,not_recom,not_recom -usual,very_crit,completed,1,less_conv,inconv,problematic,recommended,spec_prior -usual,very_crit,completed,1,less_conv,inconv,problematic,priority,spec_prior -usual,very_crit,completed,1,less_conv,inconv,problematic,not_recom,not_recom -usual,very_crit,completed,1,critical,convenient,nonprob,recommended,spec_prior -usual,very_crit,completed,1,critical,convenient,nonprob,priority,spec_prior -usual,very_crit,completed,1,critical,convenient,nonprob,not_recom,not_recom -usual,very_crit,completed,1,critical,convenient,slightly_prob,recommended,spec_prior -usual,very_crit,completed,1,critical,convenient,slightly_prob,priority,spec_prior -usual,very_crit,completed,1,critical,convenient,slightly_prob,not_recom,not_recom -usual,very_crit,completed,1,critical,convenient,problematic,recommended,spec_prior -usual,very_crit,completed,1,critical,convenient,problematic,priority,spec_prior -usual,very_crit,completed,1,critical,convenient,problematic,not_recom,not_recom -usual,very_crit,completed,1,critical,inconv,nonprob,recommended,spec_prior -usual,very_crit,completed,1,critical,inconv,nonprob,priority,spec_prior -usual,very_crit,completed,1,critical,inconv,nonprob,not_recom,not_recom -usual,very_crit,completed,1,critical,inconv,slightly_prob,recommended,spec_prior -usual,very_crit,completed,1,critical,inconv,slightly_prob,priority,spec_prior -usual,very_crit,completed,1,critical,inconv,slightly_prob,not_recom,not_recom -usual,very_crit,completed,1,critical,inconv,problematic,recommended,spec_prior -usual,very_crit,completed,1,critical,inconv,problematic,priority,spec_prior -usual,very_crit,completed,1,critical,inconv,problematic,not_recom,not_recom -usual,very_crit,completed,2,convenient,convenient,nonprob,recommended,priority -usual,very_crit,completed,2,convenient,convenient,nonprob,priority,spec_prior -usual,very_crit,completed,2,convenient,convenient,nonprob,not_recom,not_recom 
-usual,very_crit,completed,2,convenient,convenient,slightly_prob,recommended,priority -usual,very_crit,completed,2,convenient,convenient,slightly_prob,priority,spec_prior -usual,very_crit,completed,2,convenient,convenient,slightly_prob,not_recom,not_recom -usual,very_crit,completed,2,convenient,convenient,problematic,recommended,spec_prior -usual,very_crit,completed,2,convenient,convenient,problematic,priority,spec_prior -usual,very_crit,completed,2,convenient,convenient,problematic,not_recom,not_recom -usual,very_crit,completed,2,convenient,inconv,nonprob,recommended,priority -usual,very_crit,completed,2,convenient,inconv,nonprob,priority,spec_prior -usual,very_crit,completed,2,convenient,inconv,nonprob,not_recom,not_recom -usual,very_crit,completed,2,convenient,inconv,slightly_prob,recommended,priority -usual,very_crit,completed,2,convenient,inconv,slightly_prob,priority,spec_prior -usual,very_crit,completed,2,convenient,inconv,slightly_prob,not_recom,not_recom -usual,very_crit,completed,2,convenient,inconv,problematic,recommended,spec_prior -usual,very_crit,completed,2,convenient,inconv,problematic,priority,spec_prior -usual,very_crit,completed,2,convenient,inconv,problematic,not_recom,not_recom -usual,very_crit,completed,2,less_conv,convenient,nonprob,recommended,priority -usual,very_crit,completed,2,less_conv,convenient,nonprob,priority,spec_prior -usual,very_crit,completed,2,less_conv,convenient,nonprob,not_recom,not_recom -usual,very_crit,completed,2,less_conv,convenient,slightly_prob,recommended,priority -usual,very_crit,completed,2,less_conv,convenient,slightly_prob,priority,spec_prior -usual,very_crit,completed,2,less_conv,convenient,slightly_prob,not_recom,not_recom -usual,very_crit,completed,2,less_conv,convenient,problematic,recommended,spec_prior -usual,very_crit,completed,2,less_conv,convenient,problematic,priority,spec_prior -usual,very_crit,completed,2,less_conv,convenient,problematic,not_recom,not_recom 
-usual,very_crit,completed,2,less_conv,inconv,nonprob,recommended,priority -usual,very_crit,completed,2,less_conv,inconv,nonprob,priority,spec_prior -usual,very_crit,completed,2,less_conv,inconv,nonprob,not_recom,not_recom -usual,very_crit,completed,2,less_conv,inconv,slightly_prob,recommended,priority -usual,very_crit,completed,2,less_conv,inconv,slightly_prob,priority,spec_prior -usual,very_crit,completed,2,less_conv,inconv,slightly_prob,not_recom,not_recom -usual,very_crit,completed,2,less_conv,inconv,problematic,recommended,spec_prior -usual,very_crit,completed,2,less_conv,inconv,problematic,priority,spec_prior -usual,very_crit,completed,2,less_conv,inconv,problematic,not_recom,not_recom -usual,very_crit,completed,2,critical,convenient,nonprob,recommended,spec_prior -usual,very_crit,completed,2,critical,convenient,nonprob,priority,spec_prior -usual,very_crit,completed,2,critical,convenient,nonprob,not_recom,not_recom -usual,very_crit,completed,2,critical,convenient,slightly_prob,recommended,spec_prior -usual,very_crit,completed,2,critical,convenient,slightly_prob,priority,spec_prior -usual,very_crit,completed,2,critical,convenient,slightly_prob,not_recom,not_recom -usual,very_crit,completed,2,critical,convenient,problematic,recommended,spec_prior -usual,very_crit,completed,2,critical,convenient,problematic,priority,spec_prior -usual,very_crit,completed,2,critical,convenient,problematic,not_recom,not_recom -usual,very_crit,completed,2,critical,inconv,nonprob,recommended,spec_prior -usual,very_crit,completed,2,critical,inconv,nonprob,priority,spec_prior -usual,very_crit,completed,2,critical,inconv,nonprob,not_recom,not_recom -usual,very_crit,completed,2,critical,inconv,slightly_prob,recommended,spec_prior -usual,very_crit,completed,2,critical,inconv,slightly_prob,priority,spec_prior -usual,very_crit,completed,2,critical,inconv,slightly_prob,not_recom,not_recom -usual,very_crit,completed,2,critical,inconv,problematic,recommended,spec_prior 
-usual,very_crit,completed,2,critical,inconv,problematic,priority,spec_prior -usual,very_crit,completed,2,critical,inconv,problematic,not_recom,not_recom -usual,very_crit,completed,3,convenient,convenient,nonprob,recommended,priority -usual,very_crit,completed,3,convenient,convenient,nonprob,priority,spec_prior -usual,very_crit,completed,3,convenient,convenient,nonprob,not_recom,not_recom -usual,very_crit,completed,3,convenient,convenient,slightly_prob,recommended,priority -usual,very_crit,completed,3,convenient,convenient,slightly_prob,priority,spec_prior -usual,very_crit,completed,3,convenient,convenient,slightly_prob,not_recom,not_recom -usual,very_crit,completed,3,convenient,convenient,problematic,recommended,spec_prior -usual,very_crit,completed,3,convenient,convenient,problematic,priority,spec_prior -usual,very_crit,completed,3,convenient,convenient,problematic,not_recom,not_recom -usual,very_crit,completed,3,convenient,inconv,nonprob,recommended,spec_prior -usual,very_crit,completed,3,convenient,inconv,nonprob,priority,spec_prior -usual,very_crit,completed,3,convenient,inconv,nonprob,not_recom,not_recom -usual,very_crit,completed,3,convenient,inconv,slightly_prob,recommended,spec_prior -usual,very_crit,completed,3,convenient,inconv,slightly_prob,priority,spec_prior -usual,very_crit,completed,3,convenient,inconv,slightly_prob,not_recom,not_recom -usual,very_crit,completed,3,convenient,inconv,problematic,recommended,spec_prior -usual,very_crit,completed,3,convenient,inconv,problematic,priority,spec_prior -usual,very_crit,completed,3,convenient,inconv,problematic,not_recom,not_recom -usual,very_crit,completed,3,less_conv,convenient,nonprob,recommended,spec_prior -usual,very_crit,completed,3,less_conv,convenient,nonprob,priority,spec_prior -usual,very_crit,completed,3,less_conv,convenient,nonprob,not_recom,not_recom -usual,very_crit,completed,3,less_conv,convenient,slightly_prob,recommended,spec_prior 
-usual,very_crit,completed,3,less_conv,convenient,slightly_prob,priority,spec_prior -usual,very_crit,completed,3,less_conv,convenient,slightly_prob,not_recom,not_recom -usual,very_crit,completed,3,less_conv,convenient,problematic,recommended,spec_prior -usual,very_crit,completed,3,less_conv,convenient,problematic,priority,spec_prior -usual,very_crit,completed,3,less_conv,convenient,problematic,not_recom,not_recom -usual,very_crit,completed,3,less_conv,inconv,nonprob,recommended,spec_prior -usual,very_crit,completed,3,less_conv,inconv,nonprob,priority,spec_prior -usual,very_crit,completed,3,less_conv,inconv,nonprob,not_recom,not_recom -usual,very_crit,completed,3,less_conv,inconv,slightly_prob,recommended,spec_prior -usual,very_crit,completed,3,less_conv,inconv,slightly_prob,priority,spec_prior -usual,very_crit,completed,3,less_conv,inconv,slightly_prob,not_recom,not_recom -usual,very_crit,completed,3,less_conv,inconv,problematic,recommended,spec_prior -usual,very_crit,completed,3,less_conv,inconv,problematic,priority,spec_prior -usual,very_crit,completed,3,less_conv,inconv,problematic,not_recom,not_recom -usual,very_crit,completed,3,critical,convenient,nonprob,recommended,spec_prior -usual,very_crit,completed,3,critical,convenient,nonprob,priority,spec_prior -usual,very_crit,completed,3,critical,convenient,nonprob,not_recom,not_recom -usual,very_crit,completed,3,critical,convenient,slightly_prob,recommended,spec_prior -usual,very_crit,completed,3,critical,convenient,slightly_prob,priority,spec_prior -usual,very_crit,completed,3,critical,convenient,slightly_prob,not_recom,not_recom -usual,very_crit,completed,3,critical,convenient,problematic,recommended,spec_prior -usual,very_crit,completed,3,critical,convenient,problematic,priority,spec_prior -usual,very_crit,completed,3,critical,convenient,problematic,not_recom,not_recom -usual,very_crit,completed,3,critical,inconv,nonprob,recommended,spec_prior 
-usual,very_crit,completed,3,critical,inconv,nonprob,priority,spec_prior -usual,very_crit,completed,3,critical,inconv,nonprob,not_recom,not_recom -usual,very_crit,completed,3,critical,inconv,slightly_prob,recommended,spec_prior -usual,very_crit,completed,3,critical,inconv,slightly_prob,priority,spec_prior -usual,very_crit,completed,3,critical,inconv,slightly_prob,not_recom,not_recom -usual,very_crit,completed,3,critical,inconv,problematic,recommended,spec_prior -usual,very_crit,completed,3,critical,inconv,problematic,priority,spec_prior -usual,very_crit,completed,3,critical,inconv,problematic,not_recom,not_recom -usual,very_crit,completed,more,convenient,convenient,nonprob,recommended,priority -usual,very_crit,completed,more,convenient,convenient,nonprob,priority,spec_prior -usual,very_crit,completed,more,convenient,convenient,nonprob,not_recom,not_recom -usual,very_crit,completed,more,convenient,convenient,slightly_prob,recommended,priority -usual,very_crit,completed,more,convenient,convenient,slightly_prob,priority,spec_prior -usual,very_crit,completed,more,convenient,convenient,slightly_prob,not_recom,not_recom -usual,very_crit,completed,more,convenient,convenient,problematic,recommended,spec_prior -usual,very_crit,completed,more,convenient,convenient,problematic,priority,spec_prior -usual,very_crit,completed,more,convenient,convenient,problematic,not_recom,not_recom -usual,very_crit,completed,more,convenient,inconv,nonprob,recommended,spec_prior -usual,very_crit,completed,more,convenient,inconv,nonprob,priority,spec_prior -usual,very_crit,completed,more,convenient,inconv,nonprob,not_recom,not_recom -usual,very_crit,completed,more,convenient,inconv,slightly_prob,recommended,spec_prior -usual,very_crit,completed,more,convenient,inconv,slightly_prob,priority,spec_prior -usual,very_crit,completed,more,convenient,inconv,slightly_prob,not_recom,not_recom -usual,very_crit,completed,more,convenient,inconv,problematic,recommended,spec_prior 
-usual,very_crit,completed,more,convenient,inconv,problematic,priority,spec_prior -usual,very_crit,completed,more,convenient,inconv,problematic,not_recom,not_recom -usual,very_crit,completed,more,less_conv,convenient,nonprob,recommended,spec_prior -usual,very_crit,completed,more,less_conv,convenient,nonprob,priority,spec_prior -usual,very_crit,completed,more,less_conv,convenient,nonprob,not_recom,not_recom -usual,very_crit,completed,more,less_conv,convenient,slightly_prob,recommended,spec_prior -usual,very_crit,completed,more,less_conv,convenient,slightly_prob,priority,spec_prior -usual,very_crit,completed,more,less_conv,convenient,slightly_prob,not_recom,not_recom -usual,very_crit,completed,more,less_conv,convenient,problematic,recommended,spec_prior -usual,very_crit,completed,more,less_conv,convenient,problematic,priority,spec_prior -usual,very_crit,completed,more,less_conv,convenient,problematic,not_recom,not_recom -usual,very_crit,completed,more,less_conv,inconv,nonprob,recommended,spec_prior -usual,very_crit,completed,more,less_conv,inconv,nonprob,priority,spec_prior -usual,very_crit,completed,more,less_conv,inconv,nonprob,not_recom,not_recom -usual,very_crit,completed,more,less_conv,inconv,slightly_prob,recommended,spec_prior -usual,very_crit,completed,more,less_conv,inconv,slightly_prob,priority,spec_prior -usual,very_crit,completed,more,less_conv,inconv,slightly_prob,not_recom,not_recom -usual,very_crit,completed,more,less_conv,inconv,problematic,recommended,spec_prior -usual,very_crit,completed,more,less_conv,inconv,problematic,priority,spec_prior -usual,very_crit,completed,more,less_conv,inconv,problematic,not_recom,not_recom -usual,very_crit,completed,more,critical,convenient,nonprob,recommended,spec_prior -usual,very_crit,completed,more,critical,convenient,nonprob,priority,spec_prior -usual,very_crit,completed,more,critical,convenient,nonprob,not_recom,not_recom -usual,very_crit,completed,more,critical,convenient,slightly_prob,recommended,spec_prior 
-usual,very_crit,completed,more,critical,convenient,slightly_prob,priority,spec_prior -usual,very_crit,completed,more,critical,convenient,slightly_prob,not_recom,not_recom -usual,very_crit,completed,more,critical,convenient,problematic,recommended,spec_prior -usual,very_crit,completed,more,critical,convenient,problematic,priority,spec_prior -usual,very_crit,completed,more,critical,convenient,problematic,not_recom,not_recom -usual,very_crit,completed,more,critical,inconv,nonprob,recommended,spec_prior -usual,very_crit,completed,more,critical,inconv,nonprob,priority,spec_prior -usual,very_crit,completed,more,critical,inconv,nonprob,not_recom,not_recom -usual,very_crit,completed,more,critical,inconv,slightly_prob,recommended,spec_prior -usual,very_crit,completed,more,critical,inconv,slightly_prob,priority,spec_prior -usual,very_crit,completed,more,critical,inconv,slightly_prob,not_recom,not_recom -usual,very_crit,completed,more,critical,inconv,problematic,recommended,spec_prior -usual,very_crit,completed,more,critical,inconv,problematic,priority,spec_prior -usual,very_crit,completed,more,critical,inconv,problematic,not_recom,not_recom -usual,very_crit,incomplete,1,convenient,convenient,nonprob,recommended,priority -usual,very_crit,incomplete,1,convenient,convenient,nonprob,priority,spec_prior -usual,very_crit,incomplete,1,convenient,convenient,nonprob,not_recom,not_recom -usual,very_crit,incomplete,1,convenient,convenient,slightly_prob,recommended,priority -usual,very_crit,incomplete,1,convenient,convenient,slightly_prob,priority,spec_prior -usual,very_crit,incomplete,1,convenient,convenient,slightly_prob,not_recom,not_recom -usual,very_crit,incomplete,1,convenient,convenient,problematic,recommended,spec_prior -usual,very_crit,incomplete,1,convenient,convenient,problematic,priority,spec_prior -usual,very_crit,incomplete,1,convenient,convenient,problematic,not_recom,not_recom -usual,very_crit,incomplete,1,convenient,inconv,nonprob,recommended,priority 
-usual,very_crit,incomplete,1,convenient,inconv,nonprob,priority,spec_prior -usual,very_crit,incomplete,1,convenient,inconv,nonprob,not_recom,not_recom -usual,very_crit,incomplete,1,convenient,inconv,slightly_prob,recommended,priority -usual,very_crit,incomplete,1,convenient,inconv,slightly_prob,priority,spec_prior -usual,very_crit,incomplete,1,convenient,inconv,slightly_prob,not_recom,not_recom -usual,very_crit,incomplete,1,convenient,inconv,problematic,recommended,spec_prior -usual,very_crit,incomplete,1,convenient,inconv,problematic,priority,spec_prior -usual,very_crit,incomplete,1,convenient,inconv,problematic,not_recom,not_recom -usual,very_crit,incomplete,1,less_conv,convenient,nonprob,recommended,priority -usual,very_crit,incomplete,1,less_conv,convenient,nonprob,priority,spec_prior -usual,very_crit,incomplete,1,less_conv,convenient,nonprob,not_recom,not_recom -usual,very_crit,incomplete,1,less_conv,convenient,slightly_prob,recommended,priority -usual,very_crit,incomplete,1,less_conv,convenient,slightly_prob,priority,spec_prior -usual,very_crit,incomplete,1,less_conv,convenient,slightly_prob,not_recom,not_recom -usual,very_crit,incomplete,1,less_conv,convenient,problematic,recommended,spec_prior -usual,very_crit,incomplete,1,less_conv,convenient,problematic,priority,spec_prior -usual,very_crit,incomplete,1,less_conv,convenient,problematic,not_recom,not_recom -usual,very_crit,incomplete,1,less_conv,inconv,nonprob,recommended,priority -usual,very_crit,incomplete,1,less_conv,inconv,nonprob,priority,spec_prior -usual,very_crit,incomplete,1,less_conv,inconv,nonprob,not_recom,not_recom -usual,very_crit,incomplete,1,less_conv,inconv,slightly_prob,recommended,priority -usual,very_crit,incomplete,1,less_conv,inconv,slightly_prob,priority,spec_prior -usual,very_crit,incomplete,1,less_conv,inconv,slightly_prob,not_recom,not_recom -usual,very_crit,incomplete,1,less_conv,inconv,problematic,recommended,spec_prior 
-usual,very_crit,incomplete,1,less_conv,inconv,problematic,priority,spec_prior -usual,very_crit,incomplete,1,less_conv,inconv,problematic,not_recom,not_recom -usual,very_crit,incomplete,1,critical,convenient,nonprob,recommended,spec_prior -usual,very_crit,incomplete,1,critical,convenient,nonprob,priority,spec_prior -usual,very_crit,incomplete,1,critical,convenient,nonprob,not_recom,not_recom -usual,very_crit,incomplete,1,critical,convenient,slightly_prob,recommended,spec_prior -usual,very_crit,incomplete,1,critical,convenient,slightly_prob,priority,spec_prior -usual,very_crit,incomplete,1,critical,convenient,slightly_prob,not_recom,not_recom -usual,very_crit,incomplete,1,critical,convenient,problematic,recommended,spec_prior -usual,very_crit,incomplete,1,critical,convenient,problematic,priority,spec_prior -usual,very_crit,incomplete,1,critical,convenient,problematic,not_recom,not_recom -usual,very_crit,incomplete,1,critical,inconv,nonprob,recommended,spec_prior -usual,very_crit,incomplete,1,critical,inconv,nonprob,priority,spec_prior -usual,very_crit,incomplete,1,critical,inconv,nonprob,not_recom,not_recom -usual,very_crit,incomplete,1,critical,inconv,slightly_prob,recommended,spec_prior -usual,very_crit,incomplete,1,critical,inconv,slightly_prob,priority,spec_prior -usual,very_crit,incomplete,1,critical,inconv,slightly_prob,not_recom,not_recom -usual,very_crit,incomplete,1,critical,inconv,problematic,recommended,spec_prior -usual,very_crit,incomplete,1,critical,inconv,problematic,priority,spec_prior -usual,very_crit,incomplete,1,critical,inconv,problematic,not_recom,not_recom -usual,very_crit,incomplete,2,convenient,convenient,nonprob,recommended,priority -usual,very_crit,incomplete,2,convenient,convenient,nonprob,priority,spec_prior -usual,very_crit,incomplete,2,convenient,convenient,nonprob,not_recom,not_recom -usual,very_crit,incomplete,2,convenient,convenient,slightly_prob,recommended,priority 
-usual,very_crit,incomplete,2,convenient,convenient,slightly_prob,priority,spec_prior -usual,very_crit,incomplete,2,convenient,convenient,slightly_prob,not_recom,not_recom -usual,very_crit,incomplete,2,convenient,convenient,problematic,recommended,spec_prior -usual,very_crit,incomplete,2,convenient,convenient,problematic,priority,spec_prior -usual,very_crit,incomplete,2,convenient,convenient,problematic,not_recom,not_recom -usual,very_crit,incomplete,2,convenient,inconv,nonprob,recommended,spec_prior -usual,very_crit,incomplete,2,convenient,inconv,nonprob,priority,spec_prior -usual,very_crit,incomplete,2,convenient,inconv,nonprob,not_recom,not_recom -usual,very_crit,incomplete,2,convenient,inconv,slightly_prob,recommended,spec_prior -usual,very_crit,incomplete,2,convenient,inconv,slightly_prob,priority,spec_prior -usual,very_crit,incomplete,2,convenient,inconv,slightly_prob,not_recom,not_recom -usual,very_crit,incomplete,2,convenient,inconv,problematic,recommended,spec_prior -usual,very_crit,incomplete,2,convenient,inconv,problematic,priority,spec_prior -usual,very_crit,incomplete,2,convenient,inconv,problematic,not_recom,not_recom -usual,very_crit,incomplete,2,less_conv,convenient,nonprob,recommended,spec_prior -usual,very_crit,incomplete,2,less_conv,convenient,nonprob,priority,spec_prior -usual,very_crit,incomplete,2,less_conv,convenient,nonprob,not_recom,not_recom -usual,very_crit,incomplete,2,less_conv,convenient,slightly_prob,recommended,spec_prior -usual,very_crit,incomplete,2,less_conv,convenient,slightly_prob,priority,spec_prior -usual,very_crit,incomplete,2,less_conv,convenient,slightly_prob,not_recom,not_recom -usual,very_crit,incomplete,2,less_conv,convenient,problematic,recommended,spec_prior -usual,very_crit,incomplete,2,less_conv,convenient,problematic,priority,spec_prior -usual,very_crit,incomplete,2,less_conv,convenient,problematic,not_recom,not_recom -usual,very_crit,incomplete,2,less_conv,inconv,nonprob,recommended,spec_prior 
-usual,very_crit,incomplete,2,less_conv,inconv,nonprob,priority,spec_prior -usual,very_crit,incomplete,2,less_conv,inconv,nonprob,not_recom,not_recom -usual,very_crit,incomplete,2,less_conv,inconv,slightly_prob,recommended,spec_prior -usual,very_crit,incomplete,2,less_conv,inconv,slightly_prob,priority,spec_prior -usual,very_crit,incomplete,2,less_conv,inconv,slightly_prob,not_recom,not_recom -usual,very_crit,incomplete,2,less_conv,inconv,problematic,recommended,spec_prior -usual,very_crit,incomplete,2,less_conv,inconv,problematic,priority,spec_prior -usual,very_crit,incomplete,2,less_conv,inconv,problematic,not_recom,not_recom -usual,very_crit,incomplete,2,critical,convenient,nonprob,recommended,spec_prior -usual,very_crit,incomplete,2,critical,convenient,nonprob,priority,spec_prior -usual,very_crit,incomplete,2,critical,convenient,nonprob,not_recom,not_recom -usual,very_crit,incomplete,2,critical,convenient,slightly_prob,recommended,spec_prior -usual,very_crit,incomplete,2,critical,convenient,slightly_prob,priority,spec_prior -usual,very_crit,incomplete,2,critical,convenient,slightly_prob,not_recom,not_recom -usual,very_crit,incomplete,2,critical,convenient,problematic,recommended,spec_prior -usual,very_crit,incomplete,2,critical,convenient,problematic,priority,spec_prior -usual,very_crit,incomplete,2,critical,convenient,problematic,not_recom,not_recom -usual,very_crit,incomplete,2,critical,inconv,nonprob,recommended,spec_prior -usual,very_crit,incomplete,2,critical,inconv,nonprob,priority,spec_prior -usual,very_crit,incomplete,2,critical,inconv,nonprob,not_recom,not_recom -usual,very_crit,incomplete,2,critical,inconv,slightly_prob,recommended,spec_prior -usual,very_crit,incomplete,2,critical,inconv,slightly_prob,priority,spec_prior -usual,very_crit,incomplete,2,critical,inconv,slightly_prob,not_recom,not_recom -usual,very_crit,incomplete,2,critical,inconv,problematic,recommended,spec_prior 
-usual,very_crit,incomplete,2,critical,inconv,problematic,priority,spec_prior -usual,very_crit,incomplete,2,critical,inconv,problematic,not_recom,not_recom -usual,very_crit,incomplete,3,convenient,convenient,nonprob,recommended,priority -usual,very_crit,incomplete,3,convenient,convenient,nonprob,priority,spec_prior -usual,very_crit,incomplete,3,convenient,convenient,nonprob,not_recom,not_recom -usual,very_crit,incomplete,3,convenient,convenient,slightly_prob,recommended,priority -usual,very_crit,incomplete,3,convenient,convenient,slightly_prob,priority,spec_prior -usual,very_crit,incomplete,3,convenient,convenient,slightly_prob,not_recom,not_recom -usual,very_crit,incomplete,3,convenient,convenient,problematic,recommended,spec_prior -usual,very_crit,incomplete,3,convenient,convenient,problematic,priority,spec_prior -usual,very_crit,incomplete,3,convenient,convenient,problematic,not_recom,not_recom -usual,very_crit,incomplete,3,convenient,inconv,nonprob,recommended,spec_prior -usual,very_crit,incomplete,3,convenient,inconv,nonprob,priority,spec_prior -usual,very_crit,incomplete,3,convenient,inconv,nonprob,not_recom,not_recom -usual,very_crit,incomplete,3,convenient,inconv,slightly_prob,recommended,spec_prior -usual,very_crit,incomplete,3,convenient,inconv,slightly_prob,priority,spec_prior -usual,very_crit,incomplete,3,convenient,inconv,slightly_prob,not_recom,not_recom -usual,very_crit,incomplete,3,convenient,inconv,problematic,recommended,spec_prior -usual,very_crit,incomplete,3,convenient,inconv,problematic,priority,spec_prior -usual,very_crit,incomplete,3,convenient,inconv,problematic,not_recom,not_recom -usual,very_crit,incomplete,3,less_conv,convenient,nonprob,recommended,spec_prior -usual,very_crit,incomplete,3,less_conv,convenient,nonprob,priority,spec_prior -usual,very_crit,incomplete,3,less_conv,convenient,nonprob,not_recom,not_recom -usual,very_crit,incomplete,3,less_conv,convenient,slightly_prob,recommended,spec_prior 
-usual,very_crit,incomplete,3,less_conv,convenient,slightly_prob,priority,spec_prior -usual,very_crit,incomplete,3,less_conv,convenient,slightly_prob,not_recom,not_recom -usual,very_crit,incomplete,3,less_conv,convenient,problematic,recommended,spec_prior -usual,very_crit,incomplete,3,less_conv,convenient,problematic,priority,spec_prior -usual,very_crit,incomplete,3,less_conv,convenient,problematic,not_recom,not_recom -usual,very_crit,incomplete,3,less_conv,inconv,nonprob,recommended,spec_prior -usual,very_crit,incomplete,3,less_conv,inconv,nonprob,priority,spec_prior -usual,very_crit,incomplete,3,less_conv,inconv,nonprob,not_recom,not_recom -usual,very_crit,incomplete,3,less_conv,inconv,slightly_prob,recommended,spec_prior -usual,very_crit,incomplete,3,less_conv,inconv,slightly_prob,priority,spec_prior -usual,very_crit,incomplete,3,less_conv,inconv,slightly_prob,not_recom,not_recom -usual,very_crit,incomplete,3,less_conv,inconv,problematic,recommended,spec_prior -usual,very_crit,incomplete,3,less_conv,inconv,problematic,priority,spec_prior -usual,very_crit,incomplete,3,less_conv,inconv,problematic,not_recom,not_recom -usual,very_crit,incomplete,3,critical,convenient,nonprob,recommended,spec_prior -usual,very_crit,incomplete,3,critical,convenient,nonprob,priority,spec_prior -usual,very_crit,incomplete,3,critical,convenient,nonprob,not_recom,not_recom -usual,very_crit,incomplete,3,critical,convenient,slightly_prob,recommended,spec_prior -usual,very_crit,incomplete,3,critical,convenient,slightly_prob,priority,spec_prior -usual,very_crit,incomplete,3,critical,convenient,slightly_prob,not_recom,not_recom -usual,very_crit,incomplete,3,critical,convenient,problematic,recommended,spec_prior -usual,very_crit,incomplete,3,critical,convenient,problematic,priority,spec_prior -usual,very_crit,incomplete,3,critical,convenient,problematic,not_recom,not_recom -usual,very_crit,incomplete,3,critical,inconv,nonprob,recommended,spec_prior 
-usual,very_crit,incomplete,3,critical,inconv,nonprob,priority,spec_prior -usual,very_crit,incomplete,3,critical,inconv,nonprob,not_recom,not_recom -usual,very_crit,incomplete,3,critical,inconv,slightly_prob,recommended,spec_prior -usual,very_crit,incomplete,3,critical,inconv,slightly_prob,priority,spec_prior -usual,very_crit,incomplete,3,critical,inconv,slightly_prob,not_recom,not_recom -usual,very_crit,incomplete,3,critical,inconv,problematic,recommended,spec_prior -usual,very_crit,incomplete,3,critical,inconv,problematic,priority,spec_prior -usual,very_crit,incomplete,3,critical,inconv,problematic,not_recom,not_recom -usual,very_crit,incomplete,more,convenient,convenient,nonprob,recommended,priority -usual,very_crit,incomplete,more,convenient,convenient,nonprob,priority,spec_prior -usual,very_crit,incomplete,more,convenient,convenient,nonprob,not_recom,not_recom -usual,very_crit,incomplete,more,convenient,convenient,slightly_prob,recommended,priority -usual,very_crit,incomplete,more,convenient,convenient,slightly_prob,priority,spec_prior -usual,very_crit,incomplete,more,convenient,convenient,slightly_prob,not_recom,not_recom -usual,very_crit,incomplete,more,convenient,convenient,problematic,recommended,spec_prior -usual,very_crit,incomplete,more,convenient,convenient,problematic,priority,spec_prior -usual,very_crit,incomplete,more,convenient,convenient,problematic,not_recom,not_recom -usual,very_crit,incomplete,more,convenient,inconv,nonprob,recommended,spec_prior -usual,very_crit,incomplete,more,convenient,inconv,nonprob,priority,spec_prior -usual,very_crit,incomplete,more,convenient,inconv,nonprob,not_recom,not_recom -usual,very_crit,incomplete,more,convenient,inconv,slightly_prob,recommended,spec_prior -usual,very_crit,incomplete,more,convenient,inconv,slightly_prob,priority,spec_prior -usual,very_crit,incomplete,more,convenient,inconv,slightly_prob,not_recom,not_recom -usual,very_crit,incomplete,more,convenient,inconv,problematic,recommended,spec_prior 
-usual,very_crit,incomplete,more,convenient,inconv,problematic,priority,spec_prior -usual,very_crit,incomplete,more,convenient,inconv,problematic,not_recom,not_recom -usual,very_crit,incomplete,more,less_conv,convenient,nonprob,recommended,spec_prior -usual,very_crit,incomplete,more,less_conv,convenient,nonprob,priority,spec_prior -usual,very_crit,incomplete,more,less_conv,convenient,nonprob,not_recom,not_recom -usual,very_crit,incomplete,more,less_conv,convenient,slightly_prob,recommended,spec_prior -usual,very_crit,incomplete,more,less_conv,convenient,slightly_prob,priority,spec_prior -usual,very_crit,incomplete,more,less_conv,convenient,slightly_prob,not_recom,not_recom -usual,very_crit,incomplete,more,less_conv,convenient,problematic,recommended,spec_prior -usual,very_crit,incomplete,more,less_conv,convenient,problematic,priority,spec_prior -usual,very_crit,incomplete,more,less_conv,convenient,problematic,not_recom,not_recom -usual,very_crit,incomplete,more,less_conv,inconv,nonprob,recommended,spec_prior -usual,very_crit,incomplete,more,less_conv,inconv,nonprob,priority,spec_prior -usual,very_crit,incomplete,more,less_conv,inconv,nonprob,not_recom,not_recom -usual,very_crit,incomplete,more,less_conv,inconv,slightly_prob,recommended,spec_prior -usual,very_crit,incomplete,more,less_conv,inconv,slightly_prob,priority,spec_prior -usual,very_crit,incomplete,more,less_conv,inconv,slightly_prob,not_recom,not_recom -usual,very_crit,incomplete,more,less_conv,inconv,problematic,recommended,spec_prior -usual,very_crit,incomplete,more,less_conv,inconv,problematic,priority,spec_prior -usual,very_crit,incomplete,more,less_conv,inconv,problematic,not_recom,not_recom -usual,very_crit,incomplete,more,critical,convenient,nonprob,recommended,spec_prior -usual,very_crit,incomplete,more,critical,convenient,nonprob,priority,spec_prior -usual,very_crit,incomplete,more,critical,convenient,nonprob,not_recom,not_recom 
-usual,very_crit,incomplete,more,critical,convenient,slightly_prob,recommended,spec_prior -usual,very_crit,incomplete,more,critical,convenient,slightly_prob,priority,spec_prior -usual,very_crit,incomplete,more,critical,convenient,slightly_prob,not_recom,not_recom -usual,very_crit,incomplete,more,critical,convenient,problematic,recommended,spec_prior -usual,very_crit,incomplete,more,critical,convenient,problematic,priority,spec_prior -usual,very_crit,incomplete,more,critical,convenient,problematic,not_recom,not_recom -usual,very_crit,incomplete,more,critical,inconv,nonprob,recommended,spec_prior -usual,very_crit,incomplete,more,critical,inconv,nonprob,priority,spec_prior -usual,very_crit,incomplete,more,critical,inconv,nonprob,not_recom,not_recom -usual,very_crit,incomplete,more,critical,inconv,slightly_prob,recommended,spec_prior -usual,very_crit,incomplete,more,critical,inconv,slightly_prob,priority,spec_prior -usual,very_crit,incomplete,more,critical,inconv,slightly_prob,not_recom,not_recom -usual,very_crit,incomplete,more,critical,inconv,problematic,recommended,spec_prior -usual,very_crit,incomplete,more,critical,inconv,problematic,priority,spec_prior -usual,very_crit,incomplete,more,critical,inconv,problematic,not_recom,not_recom -usual,very_crit,foster,1,convenient,convenient,nonprob,recommended,priority -usual,very_crit,foster,1,convenient,convenient,nonprob,priority,spec_prior -usual,very_crit,foster,1,convenient,convenient,nonprob,not_recom,not_recom -usual,very_crit,foster,1,convenient,convenient,slightly_prob,recommended,priority -usual,very_crit,foster,1,convenient,convenient,slightly_prob,priority,spec_prior -usual,very_crit,foster,1,convenient,convenient,slightly_prob,not_recom,not_recom -usual,very_crit,foster,1,convenient,convenient,problematic,recommended,spec_prior -usual,very_crit,foster,1,convenient,convenient,problematic,priority,spec_prior -usual,very_crit,foster,1,convenient,convenient,problematic,not_recom,not_recom 
-usual,very_crit,foster,1,convenient,inconv,nonprob,recommended,spec_prior -usual,very_crit,foster,1,convenient,inconv,nonprob,priority,spec_prior -usual,very_crit,foster,1,convenient,inconv,nonprob,not_recom,not_recom -usual,very_crit,foster,1,convenient,inconv,slightly_prob,recommended,spec_prior -usual,very_crit,foster,1,convenient,inconv,slightly_prob,priority,spec_prior -usual,very_crit,foster,1,convenient,inconv,slightly_prob,not_recom,not_recom -usual,very_crit,foster,1,convenient,inconv,problematic,recommended,spec_prior -usual,very_crit,foster,1,convenient,inconv,problematic,priority,spec_prior -usual,very_crit,foster,1,convenient,inconv,problematic,not_recom,not_recom -usual,very_crit,foster,1,less_conv,convenient,nonprob,recommended,spec_prior -usual,very_crit,foster,1,less_conv,convenient,nonprob,priority,spec_prior -usual,very_crit,foster,1,less_conv,convenient,nonprob,not_recom,not_recom -usual,very_crit,foster,1,less_conv,convenient,slightly_prob,recommended,spec_prior -usual,very_crit,foster,1,less_conv,convenient,slightly_prob,priority,spec_prior -usual,very_crit,foster,1,less_conv,convenient,slightly_prob,not_recom,not_recom -usual,very_crit,foster,1,less_conv,convenient,problematic,recommended,spec_prior -usual,very_crit,foster,1,less_conv,convenient,problematic,priority,spec_prior -usual,very_crit,foster,1,less_conv,convenient,problematic,not_recom,not_recom -usual,very_crit,foster,1,less_conv,inconv,nonprob,recommended,spec_prior -usual,very_crit,foster,1,less_conv,inconv,nonprob,priority,spec_prior -usual,very_crit,foster,1,less_conv,inconv,nonprob,not_recom,not_recom -usual,very_crit,foster,1,less_conv,inconv,slightly_prob,recommended,spec_prior -usual,very_crit,foster,1,less_conv,inconv,slightly_prob,priority,spec_prior -usual,very_crit,foster,1,less_conv,inconv,slightly_prob,not_recom,not_recom -usual,very_crit,foster,1,less_conv,inconv,problematic,recommended,spec_prior 
-usual,very_crit,foster,1,less_conv,inconv,problematic,priority,spec_prior -usual,very_crit,foster,1,less_conv,inconv,problematic,not_recom,not_recom -usual,very_crit,foster,1,critical,convenient,nonprob,recommended,spec_prior -usual,very_crit,foster,1,critical,convenient,nonprob,priority,spec_prior -usual,very_crit,foster,1,critical,convenient,nonprob,not_recom,not_recom -usual,very_crit,foster,1,critical,convenient,slightly_prob,recommended,spec_prior -usual,very_crit,foster,1,critical,convenient,slightly_prob,priority,spec_prior -usual,very_crit,foster,1,critical,convenient,slightly_prob,not_recom,not_recom -usual,very_crit,foster,1,critical,convenient,problematic,recommended,spec_prior -usual,very_crit,foster,1,critical,convenient,problematic,priority,spec_prior -usual,very_crit,foster,1,critical,convenient,problematic,not_recom,not_recom -usual,very_crit,foster,1,critical,inconv,nonprob,recommended,spec_prior -usual,very_crit,foster,1,critical,inconv,nonprob,priority,spec_prior -usual,very_crit,foster,1,critical,inconv,nonprob,not_recom,not_recom -usual,very_crit,foster,1,critical,inconv,slightly_prob,recommended,spec_prior -usual,very_crit,foster,1,critical,inconv,slightly_prob,priority,spec_prior -usual,very_crit,foster,1,critical,inconv,slightly_prob,not_recom,not_recom -usual,very_crit,foster,1,critical,inconv,problematic,recommended,spec_prior -usual,very_crit,foster,1,critical,inconv,problematic,priority,spec_prior -usual,very_crit,foster,1,critical,inconv,problematic,not_recom,not_recom -usual,very_crit,foster,2,convenient,convenient,nonprob,recommended,priority -usual,very_crit,foster,2,convenient,convenient,nonprob,priority,spec_prior -usual,very_crit,foster,2,convenient,convenient,nonprob,not_recom,not_recom -usual,very_crit,foster,2,convenient,convenient,slightly_prob,recommended,priority -usual,very_crit,foster,2,convenient,convenient,slightly_prob,priority,spec_prior -usual,very_crit,foster,2,convenient,convenient,slightly_prob,not_recom,not_recom 
-usual,very_crit,foster,2,convenient,convenient,problematic,recommended,spec_prior -usual,very_crit,foster,2,convenient,convenient,problematic,priority,spec_prior -usual,very_crit,foster,2,convenient,convenient,problematic,not_recom,not_recom -usual,very_crit,foster,2,convenient,inconv,nonprob,recommended,spec_prior -usual,very_crit,foster,2,convenient,inconv,nonprob,priority,spec_prior -usual,very_crit,foster,2,convenient,inconv,nonprob,not_recom,not_recom -usual,very_crit,foster,2,convenient,inconv,slightly_prob,recommended,spec_prior -usual,very_crit,foster,2,convenient,inconv,slightly_prob,priority,spec_prior -usual,very_crit,foster,2,convenient,inconv,slightly_prob,not_recom,not_recom -usual,very_crit,foster,2,convenient,inconv,problematic,recommended,spec_prior -usual,very_crit,foster,2,convenient,inconv,problematic,priority,spec_prior -usual,very_crit,foster,2,convenient,inconv,problematic,not_recom,not_recom -usual,very_crit,foster,2,less_conv,convenient,nonprob,recommended,spec_prior -usual,very_crit,foster,2,less_conv,convenient,nonprob,priority,spec_prior -usual,very_crit,foster,2,less_conv,convenient,nonprob,not_recom,not_recom -usual,very_crit,foster,2,less_conv,convenient,slightly_prob,recommended,spec_prior -usual,very_crit,foster,2,less_conv,convenient,slightly_prob,priority,spec_prior -usual,very_crit,foster,2,less_conv,convenient,slightly_prob,not_recom,not_recom -usual,very_crit,foster,2,less_conv,convenient,problematic,recommended,spec_prior -usual,very_crit,foster,2,less_conv,convenient,problematic,priority,spec_prior -usual,very_crit,foster,2,less_conv,convenient,problematic,not_recom,not_recom -usual,very_crit,foster,2,less_conv,inconv,nonprob,recommended,spec_prior -usual,very_crit,foster,2,less_conv,inconv,nonprob,priority,spec_prior -usual,very_crit,foster,2,less_conv,inconv,nonprob,not_recom,not_recom -usual,very_crit,foster,2,less_conv,inconv,slightly_prob,recommended,spec_prior 
-usual,very_crit,foster,2,less_conv,inconv,slightly_prob,priority,spec_prior -usual,very_crit,foster,2,less_conv,inconv,slightly_prob,not_recom,not_recom -usual,very_crit,foster,2,less_conv,inconv,problematic,recommended,spec_prior -usual,very_crit,foster,2,less_conv,inconv,problematic,priority,spec_prior -usual,very_crit,foster,2,less_conv,inconv,problematic,not_recom,not_recom -usual,very_crit,foster,2,critical,convenient,nonprob,recommended,spec_prior -usual,very_crit,foster,2,critical,convenient,nonprob,priority,spec_prior -usual,very_crit,foster,2,critical,convenient,nonprob,not_recom,not_recom -usual,very_crit,foster,2,critical,convenient,slightly_prob,recommended,spec_prior -usual,very_crit,foster,2,critical,convenient,slightly_prob,priority,spec_prior -usual,very_crit,foster,2,critical,convenient,slightly_prob,not_recom,not_recom -usual,very_crit,foster,2,critical,convenient,problematic,recommended,spec_prior -usual,very_crit,foster,2,critical,convenient,problematic,priority,spec_prior -usual,very_crit,foster,2,critical,convenient,problematic,not_recom,not_recom -usual,very_crit,foster,2,critical,inconv,nonprob,recommended,spec_prior -usual,very_crit,foster,2,critical,inconv,nonprob,priority,spec_prior -usual,very_crit,foster,2,critical,inconv,nonprob,not_recom,not_recom -usual,very_crit,foster,2,critical,inconv,slightly_prob,recommended,spec_prior -usual,very_crit,foster,2,critical,inconv,slightly_prob,priority,spec_prior -usual,very_crit,foster,2,critical,inconv,slightly_prob,not_recom,not_recom -usual,very_crit,foster,2,critical,inconv,problematic,recommended,spec_prior -usual,very_crit,foster,2,critical,inconv,problematic,priority,spec_prior -usual,very_crit,foster,2,critical,inconv,problematic,not_recom,not_recom -usual,very_crit,foster,3,convenient,convenient,nonprob,recommended,priority -usual,very_crit,foster,3,convenient,convenient,nonprob,priority,spec_prior -usual,very_crit,foster,3,convenient,convenient,nonprob,not_recom,not_recom 
-usual,very_crit,foster,3,convenient,convenient,slightly_prob,recommended,priority -usual,very_crit,foster,3,convenient,convenient,slightly_prob,priority,spec_prior -usual,very_crit,foster,3,convenient,convenient,slightly_prob,not_recom,not_recom -usual,very_crit,foster,3,convenient,convenient,problematic,recommended,spec_prior -usual,very_crit,foster,3,convenient,convenient,problematic,priority,spec_prior -usual,very_crit,foster,3,convenient,convenient,problematic,not_recom,not_recom -usual,very_crit,foster,3,convenient,inconv,nonprob,recommended,spec_prior -usual,very_crit,foster,3,convenient,inconv,nonprob,priority,spec_prior -usual,very_crit,foster,3,convenient,inconv,nonprob,not_recom,not_recom -usual,very_crit,foster,3,convenient,inconv,slightly_prob,recommended,spec_prior -usual,very_crit,foster,3,convenient,inconv,slightly_prob,priority,spec_prior -usual,very_crit,foster,3,convenient,inconv,slightly_prob,not_recom,not_recom -usual,very_crit,foster,3,convenient,inconv,problematic,recommended,spec_prior -usual,very_crit,foster,3,convenient,inconv,problematic,priority,spec_prior -usual,very_crit,foster,3,convenient,inconv,problematic,not_recom,not_recom -usual,very_crit,foster,3,less_conv,convenient,nonprob,recommended,spec_prior -usual,very_crit,foster,3,less_conv,convenient,nonprob,priority,spec_prior -usual,very_crit,foster,3,less_conv,convenient,nonprob,not_recom,not_recom -usual,very_crit,foster,3,less_conv,convenient,slightly_prob,recommended,spec_prior -usual,very_crit,foster,3,less_conv,convenient,slightly_prob,priority,spec_prior -usual,very_crit,foster,3,less_conv,convenient,slightly_prob,not_recom,not_recom -usual,very_crit,foster,3,less_conv,convenient,problematic,recommended,spec_prior -usual,very_crit,foster,3,less_conv,convenient,problematic,priority,spec_prior -usual,very_crit,foster,3,less_conv,convenient,problematic,not_recom,not_recom -usual,very_crit,foster,3,less_conv,inconv,nonprob,recommended,spec_prior 
-usual,very_crit,foster,3,less_conv,inconv,nonprob,priority,spec_prior -usual,very_crit,foster,3,less_conv,inconv,nonprob,not_recom,not_recom -usual,very_crit,foster,3,less_conv,inconv,slightly_prob,recommended,spec_prior -usual,very_crit,foster,3,less_conv,inconv,slightly_prob,priority,spec_prior -usual,very_crit,foster,3,less_conv,inconv,slightly_prob,not_recom,not_recom -usual,very_crit,foster,3,less_conv,inconv,problematic,recommended,spec_prior -usual,very_crit,foster,3,less_conv,inconv,problematic,priority,spec_prior -usual,very_crit,foster,3,less_conv,inconv,problematic,not_recom,not_recom -usual,very_crit,foster,3,critical,convenient,nonprob,recommended,spec_prior -usual,very_crit,foster,3,critical,convenient,nonprob,priority,spec_prior -usual,very_crit,foster,3,critical,convenient,nonprob,not_recom,not_recom -usual,very_crit,foster,3,critical,convenient,slightly_prob,recommended,spec_prior -usual,very_crit,foster,3,critical,convenient,slightly_prob,priority,spec_prior -usual,very_crit,foster,3,critical,convenient,slightly_prob,not_recom,not_recom -usual,very_crit,foster,3,critical,convenient,problematic,recommended,spec_prior -usual,very_crit,foster,3,critical,convenient,problematic,priority,spec_prior -usual,very_crit,foster,3,critical,convenient,problematic,not_recom,not_recom -usual,very_crit,foster,3,critical,inconv,nonprob,recommended,spec_prior -usual,very_crit,foster,3,critical,inconv,nonprob,priority,spec_prior -usual,very_crit,foster,3,critical,inconv,nonprob,not_recom,not_recom -usual,very_crit,foster,3,critical,inconv,slightly_prob,recommended,spec_prior -usual,very_crit,foster,3,critical,inconv,slightly_prob,priority,spec_prior -usual,very_crit,foster,3,critical,inconv,slightly_prob,not_recom,not_recom -usual,very_crit,foster,3,critical,inconv,problematic,recommended,spec_prior -usual,very_crit,foster,3,critical,inconv,problematic,priority,spec_prior -usual,very_crit,foster,3,critical,inconv,problematic,not_recom,not_recom 
-usual,very_crit,foster,more,convenient,convenient,nonprob,recommended,priority -usual,very_crit,foster,more,convenient,convenient,nonprob,priority,spec_prior -usual,very_crit,foster,more,convenient,convenient,nonprob,not_recom,not_recom -usual,very_crit,foster,more,convenient,convenient,slightly_prob,recommended,priority -usual,very_crit,foster,more,convenient,convenient,slightly_prob,priority,spec_prior -usual,very_crit,foster,more,convenient,convenient,slightly_prob,not_recom,not_recom -usual,very_crit,foster,more,convenient,convenient,problematic,recommended,spec_prior -usual,very_crit,foster,more,convenient,convenient,problematic,priority,spec_prior -usual,very_crit,foster,more,convenient,convenient,problematic,not_recom,not_recom -usual,very_crit,foster,more,convenient,inconv,nonprob,recommended,spec_prior -usual,very_crit,foster,more,convenient,inconv,nonprob,priority,spec_prior -usual,very_crit,foster,more,convenient,inconv,nonprob,not_recom,not_recom -usual,very_crit,foster,more,convenient,inconv,slightly_prob,recommended,spec_prior -usual,very_crit,foster,more,convenient,inconv,slightly_prob,priority,spec_prior -usual,very_crit,foster,more,convenient,inconv,slightly_prob,not_recom,not_recom -usual,very_crit,foster,more,convenient,inconv,problematic,recommended,spec_prior -usual,very_crit,foster,more,convenient,inconv,problematic,priority,spec_prior -usual,very_crit,foster,more,convenient,inconv,problematic,not_recom,not_recom -usual,very_crit,foster,more,less_conv,convenient,nonprob,recommended,spec_prior -usual,very_crit,foster,more,less_conv,convenient,nonprob,priority,spec_prior -usual,very_crit,foster,more,less_conv,convenient,nonprob,not_recom,not_recom -usual,very_crit,foster,more,less_conv,convenient,slightly_prob,recommended,spec_prior -usual,very_crit,foster,more,less_conv,convenient,slightly_prob,priority,spec_prior -usual,very_crit,foster,more,less_conv,convenient,slightly_prob,not_recom,not_recom 
-usual,very_crit,foster,more,less_conv,convenient,problematic,recommended,spec_prior -usual,very_crit,foster,more,less_conv,convenient,problematic,priority,spec_prior -usual,very_crit,foster,more,less_conv,convenient,problematic,not_recom,not_recom -usual,very_crit,foster,more,less_conv,inconv,nonprob,recommended,spec_prior -usual,very_crit,foster,more,less_conv,inconv,nonprob,priority,spec_prior -usual,very_crit,foster,more,less_conv,inconv,nonprob,not_recom,not_recom -usual,very_crit,foster,more,less_conv,inconv,slightly_prob,recommended,spec_prior -usual,very_crit,foster,more,less_conv,inconv,slightly_prob,priority,spec_prior -usual,very_crit,foster,more,less_conv,inconv,slightly_prob,not_recom,not_recom -usual,very_crit,foster,more,less_conv,inconv,problematic,recommended,spec_prior -usual,very_crit,foster,more,less_conv,inconv,problematic,priority,spec_prior -usual,very_crit,foster,more,less_conv,inconv,problematic,not_recom,not_recom -usual,very_crit,foster,more,critical,convenient,nonprob,recommended,spec_prior -usual,very_crit,foster,more,critical,convenient,nonprob,priority,spec_prior -usual,very_crit,foster,more,critical,convenient,nonprob,not_recom,not_recom -usual,very_crit,foster,more,critical,convenient,slightly_prob,recommended,spec_prior -usual,very_crit,foster,more,critical,convenient,slightly_prob,priority,spec_prior -usual,very_crit,foster,more,critical,convenient,slightly_prob,not_recom,not_recom -usual,very_crit,foster,more,critical,convenient,problematic,recommended,spec_prior -usual,very_crit,foster,more,critical,convenient,problematic,priority,spec_prior -usual,very_crit,foster,more,critical,convenient,problematic,not_recom,not_recom -usual,very_crit,foster,more,critical,inconv,nonprob,recommended,spec_prior -usual,very_crit,foster,more,critical,inconv,nonprob,priority,spec_prior -usual,very_crit,foster,more,critical,inconv,nonprob,not_recom,not_recom -usual,very_crit,foster,more,critical,inconv,slightly_prob,recommended,spec_prior 
-usual,very_crit,foster,more,critical,inconv,slightly_prob,priority,spec_prior -usual,very_crit,foster,more,critical,inconv,slightly_prob,not_recom,not_recom -usual,very_crit,foster,more,critical,inconv,problematic,recommended,spec_prior -usual,very_crit,foster,more,critical,inconv,problematic,priority,spec_prior -usual,very_crit,foster,more,critical,inconv,problematic,not_recom,not_recom -pretentious,proper,complete,1,convenient,convenient,nonprob,recommended,very_recom -pretentious,proper,complete,1,convenient,convenient,nonprob,priority,priority -pretentious,proper,complete,1,convenient,convenient,nonprob,not_recom,not_recom -pretentious,proper,complete,1,convenient,convenient,slightly_prob,recommended,very_recom -pretentious,proper,complete,1,convenient,convenient,slightly_prob,priority,priority -pretentious,proper,complete,1,convenient,convenient,slightly_prob,not_recom,not_recom -pretentious,proper,complete,1,convenient,convenient,problematic,recommended,priority -pretentious,proper,complete,1,convenient,convenient,problematic,priority,priority -pretentious,proper,complete,1,convenient,convenient,problematic,not_recom,not_recom -pretentious,proper,complete,1,convenient,inconv,nonprob,recommended,very_recom -pretentious,proper,complete,1,convenient,inconv,nonprob,priority,priority -pretentious,proper,complete,1,convenient,inconv,nonprob,not_recom,not_recom -pretentious,proper,complete,1,convenient,inconv,slightly_prob,recommended,very_recom -pretentious,proper,complete,1,convenient,inconv,slightly_prob,priority,priority -pretentious,proper,complete,1,convenient,inconv,slightly_prob,not_recom,not_recom -pretentious,proper,complete,1,convenient,inconv,problematic,recommended,priority -pretentious,proper,complete,1,convenient,inconv,problematic,priority,priority -pretentious,proper,complete,1,convenient,inconv,problematic,not_recom,not_recom -pretentious,proper,complete,1,less_conv,convenient,nonprob,recommended,very_recom 
-pretentious,proper,complete,1,less_conv,convenient,nonprob,priority,priority -pretentious,proper,complete,1,less_conv,convenient,nonprob,not_recom,not_recom -pretentious,proper,complete,1,less_conv,convenient,slightly_prob,recommended,very_recom -pretentious,proper,complete,1,less_conv,convenient,slightly_prob,priority,priority -pretentious,proper,complete,1,less_conv,convenient,slightly_prob,not_recom,not_recom -pretentious,proper,complete,1,less_conv,convenient,problematic,recommended,priority -pretentious,proper,complete,1,less_conv,convenient,problematic,priority,priority -pretentious,proper,complete,1,less_conv,convenient,problematic,not_recom,not_recom -pretentious,proper,complete,1,less_conv,inconv,nonprob,recommended,very_recom -pretentious,proper,complete,1,less_conv,inconv,nonprob,priority,priority -pretentious,proper,complete,1,less_conv,inconv,nonprob,not_recom,not_recom -pretentious,proper,complete,1,less_conv,inconv,slightly_prob,recommended,very_recom -pretentious,proper,complete,1,less_conv,inconv,slightly_prob,priority,priority -pretentious,proper,complete,1,less_conv,inconv,slightly_prob,not_recom,not_recom -pretentious,proper,complete,1,less_conv,inconv,problematic,recommended,priority -pretentious,proper,complete,1,less_conv,inconv,problematic,priority,priority -pretentious,proper,complete,1,less_conv,inconv,problematic,not_recom,not_recom -pretentious,proper,complete,1,critical,convenient,nonprob,recommended,very_recom -pretentious,proper,complete,1,critical,convenient,nonprob,priority,priority -pretentious,proper,complete,1,critical,convenient,nonprob,not_recom,not_recom -pretentious,proper,complete,1,critical,convenient,slightly_prob,recommended,very_recom -pretentious,proper,complete,1,critical,convenient,slightly_prob,priority,priority -pretentious,proper,complete,1,critical,convenient,slightly_prob,not_recom,not_recom -pretentious,proper,complete,1,critical,convenient,problematic,recommended,priority 
-pretentious,proper,complete,1,critical,convenient,problematic,priority,priority -pretentious,proper,complete,1,critical,convenient,problematic,not_recom,not_recom -pretentious,proper,complete,1,critical,inconv,nonprob,recommended,very_recom -pretentious,proper,complete,1,critical,inconv,nonprob,priority,priority -pretentious,proper,complete,1,critical,inconv,nonprob,not_recom,not_recom -pretentious,proper,complete,1,critical,inconv,slightly_prob,recommended,very_recom -pretentious,proper,complete,1,critical,inconv,slightly_prob,priority,priority -pretentious,proper,complete,1,critical,inconv,slightly_prob,not_recom,not_recom -pretentious,proper,complete,1,critical,inconv,problematic,recommended,priority -pretentious,proper,complete,1,critical,inconv,problematic,priority,priority -pretentious,proper,complete,1,critical,inconv,problematic,not_recom,not_recom -pretentious,proper,complete,2,convenient,convenient,nonprob,recommended,very_recom -pretentious,proper,complete,2,convenient,convenient,nonprob,priority,priority -pretentious,proper,complete,2,convenient,convenient,nonprob,not_recom,not_recom -pretentious,proper,complete,2,convenient,convenient,slightly_prob,recommended,very_recom -pretentious,proper,complete,2,convenient,convenient,slightly_prob,priority,priority -pretentious,proper,complete,2,convenient,convenient,slightly_prob,not_recom,not_recom -pretentious,proper,complete,2,convenient,convenient,problematic,recommended,priority -pretentious,proper,complete,2,convenient,convenient,problematic,priority,priority -pretentious,proper,complete,2,convenient,convenient,problematic,not_recom,not_recom -pretentious,proper,complete,2,convenient,inconv,nonprob,recommended,very_recom -pretentious,proper,complete,2,convenient,inconv,nonprob,priority,priority -pretentious,proper,complete,2,convenient,inconv,nonprob,not_recom,not_recom -pretentious,proper,complete,2,convenient,inconv,slightly_prob,recommended,very_recom 
-pretentious,proper,complete,2,convenient,inconv,slightly_prob,priority,priority -pretentious,proper,complete,2,convenient,inconv,slightly_prob,not_recom,not_recom -pretentious,proper,complete,2,convenient,inconv,problematic,recommended,priority -pretentious,proper,complete,2,convenient,inconv,problematic,priority,priority -pretentious,proper,complete,2,convenient,inconv,problematic,not_recom,not_recom -pretentious,proper,complete,2,less_conv,convenient,nonprob,recommended,very_recom -pretentious,proper,complete,2,less_conv,convenient,nonprob,priority,priority -pretentious,proper,complete,2,less_conv,convenient,nonprob,not_recom,not_recom -pretentious,proper,complete,2,less_conv,convenient,slightly_prob,recommended,very_recom -pretentious,proper,complete,2,less_conv,convenient,slightly_prob,priority,priority -pretentious,proper,complete,2,less_conv,convenient,slightly_prob,not_recom,not_recom -pretentious,proper,complete,2,less_conv,convenient,problematic,recommended,priority -pretentious,proper,complete,2,less_conv,convenient,problematic,priority,priority -pretentious,proper,complete,2,less_conv,convenient,problematic,not_recom,not_recom -pretentious,proper,complete,2,less_conv,inconv,nonprob,recommended,very_recom -pretentious,proper,complete,2,less_conv,inconv,nonprob,priority,priority -pretentious,proper,complete,2,less_conv,inconv,nonprob,not_recom,not_recom -pretentious,proper,complete,2,less_conv,inconv,slightly_prob,recommended,very_recom -pretentious,proper,complete,2,less_conv,inconv,slightly_prob,priority,priority -pretentious,proper,complete,2,less_conv,inconv,slightly_prob,not_recom,not_recom -pretentious,proper,complete,2,less_conv,inconv,problematic,recommended,priority -pretentious,proper,complete,2,less_conv,inconv,problematic,priority,priority -pretentious,proper,complete,2,less_conv,inconv,problematic,not_recom,not_recom -pretentious,proper,complete,2,critical,convenient,nonprob,recommended,priority 
-pretentious,proper,complete,2,critical,convenient,nonprob,priority,priority -pretentious,proper,complete,2,critical,convenient,nonprob,not_recom,not_recom -pretentious,proper,complete,2,critical,convenient,slightly_prob,recommended,priority -pretentious,proper,complete,2,critical,convenient,slightly_prob,priority,priority -pretentious,proper,complete,2,critical,convenient,slightly_prob,not_recom,not_recom -pretentious,proper,complete,2,critical,convenient,problematic,recommended,priority -pretentious,proper,complete,2,critical,convenient,problematic,priority,priority -pretentious,proper,complete,2,critical,convenient,problematic,not_recom,not_recom -pretentious,proper,complete,2,critical,inconv,nonprob,recommended,priority -pretentious,proper,complete,2,critical,inconv,nonprob,priority,priority -pretentious,proper,complete,2,critical,inconv,nonprob,not_recom,not_recom -pretentious,proper,complete,2,critical,inconv,slightly_prob,recommended,priority -pretentious,proper,complete,2,critical,inconv,slightly_prob,priority,priority -pretentious,proper,complete,2,critical,inconv,slightly_prob,not_recom,not_recom -pretentious,proper,complete,2,critical,inconv,problematic,recommended,priority -pretentious,proper,complete,2,critical,inconv,problematic,priority,priority -pretentious,proper,complete,2,critical,inconv,problematic,not_recom,not_recom -pretentious,proper,complete,3,convenient,convenient,nonprob,recommended,very_recom -pretentious,proper,complete,3,convenient,convenient,nonprob,priority,priority -pretentious,proper,complete,3,convenient,convenient,nonprob,not_recom,not_recom -pretentious,proper,complete,3,convenient,convenient,slightly_prob,recommended,very_recom -pretentious,proper,complete,3,convenient,convenient,slightly_prob,priority,priority -pretentious,proper,complete,3,convenient,convenient,slightly_prob,not_recom,not_recom -pretentious,proper,complete,3,convenient,convenient,problematic,recommended,priority 
-pretentious,proper,complete,3,convenient,convenient,problematic,priority,priority -pretentious,proper,complete,3,convenient,convenient,problematic,not_recom,not_recom -pretentious,proper,complete,3,convenient,inconv,nonprob,recommended,priority -pretentious,proper,complete,3,convenient,inconv,nonprob,priority,priority -pretentious,proper,complete,3,convenient,inconv,nonprob,not_recom,not_recom -pretentious,proper,complete,3,convenient,inconv,slightly_prob,recommended,priority -pretentious,proper,complete,3,convenient,inconv,slightly_prob,priority,priority -pretentious,proper,complete,3,convenient,inconv,slightly_prob,not_recom,not_recom -pretentious,proper,complete,3,convenient,inconv,problematic,recommended,priority -pretentious,proper,complete,3,convenient,inconv,problematic,priority,priority -pretentious,proper,complete,3,convenient,inconv,problematic,not_recom,not_recom -pretentious,proper,complete,3,less_conv,convenient,nonprob,recommended,priority -pretentious,proper,complete,3,less_conv,convenient,nonprob,priority,priority -pretentious,proper,complete,3,less_conv,convenient,nonprob,not_recom,not_recom -pretentious,proper,complete,3,less_conv,convenient,slightly_prob,recommended,priority -pretentious,proper,complete,3,less_conv,convenient,slightly_prob,priority,priority -pretentious,proper,complete,3,less_conv,convenient,slightly_prob,not_recom,not_recom -pretentious,proper,complete,3,less_conv,convenient,problematic,recommended,priority -pretentious,proper,complete,3,less_conv,convenient,problematic,priority,priority -pretentious,proper,complete,3,less_conv,convenient,problematic,not_recom,not_recom -pretentious,proper,complete,3,less_conv,inconv,nonprob,recommended,priority -pretentious,proper,complete,3,less_conv,inconv,nonprob,priority,priority -pretentious,proper,complete,3,less_conv,inconv,nonprob,not_recom,not_recom -pretentious,proper,complete,3,less_conv,inconv,slightly_prob,recommended,priority 
-pretentious,proper,complete,3,less_conv,inconv,slightly_prob,priority,priority -pretentious,proper,complete,3,less_conv,inconv,slightly_prob,not_recom,not_recom -pretentious,proper,complete,3,less_conv,inconv,problematic,recommended,priority -pretentious,proper,complete,3,less_conv,inconv,problematic,priority,priority -pretentious,proper,complete,3,less_conv,inconv,problematic,not_recom,not_recom -pretentious,proper,complete,3,critical,convenient,nonprob,recommended,priority -pretentious,proper,complete,3,critical,convenient,nonprob,priority,priority -pretentious,proper,complete,3,critical,convenient,nonprob,not_recom,not_recom -pretentious,proper,complete,3,critical,convenient,slightly_prob,recommended,priority -pretentious,proper,complete,3,critical,convenient,slightly_prob,priority,priority -pretentious,proper,complete,3,critical,convenient,slightly_prob,not_recom,not_recom -pretentious,proper,complete,3,critical,convenient,problematic,recommended,priority -pretentious,proper,complete,3,critical,convenient,problematic,priority,priority -pretentious,proper,complete,3,critical,convenient,problematic,not_recom,not_recom -pretentious,proper,complete,3,critical,inconv,nonprob,recommended,priority -pretentious,proper,complete,3,critical,inconv,nonprob,priority,priority -pretentious,proper,complete,3,critical,inconv,nonprob,not_recom,not_recom -pretentious,proper,complete,3,critical,inconv,slightly_prob,recommended,priority -pretentious,proper,complete,3,critical,inconv,slightly_prob,priority,priority -pretentious,proper,complete,3,critical,inconv,slightly_prob,not_recom,not_recom -pretentious,proper,complete,3,critical,inconv,problematic,recommended,priority -pretentious,proper,complete,3,critical,inconv,problematic,priority,priority -pretentious,proper,complete,3,critical,inconv,problematic,not_recom,not_recom -pretentious,proper,complete,more,convenient,convenient,nonprob,recommended,very_recom 
-pretentious,proper,complete,more,convenient,convenient,nonprob,priority,priority -pretentious,proper,complete,more,convenient,convenient,nonprob,not_recom,not_recom -pretentious,proper,complete,more,convenient,convenient,slightly_prob,recommended,very_recom -pretentious,proper,complete,more,convenient,convenient,slightly_prob,priority,priority -pretentious,proper,complete,more,convenient,convenient,slightly_prob,not_recom,not_recom -pretentious,proper,complete,more,convenient,convenient,problematic,recommended,priority -pretentious,proper,complete,more,convenient,convenient,problematic,priority,priority -pretentious,proper,complete,more,convenient,convenient,problematic,not_recom,not_recom -pretentious,proper,complete,more,convenient,inconv,nonprob,recommended,priority -pretentious,proper,complete,more,convenient,inconv,nonprob,priority,priority -pretentious,proper,complete,more,convenient,inconv,nonprob,not_recom,not_recom -pretentious,proper,complete,more,convenient,inconv,slightly_prob,recommended,priority -pretentious,proper,complete,more,convenient,inconv,slightly_prob,priority,priority -pretentious,proper,complete,more,convenient,inconv,slightly_prob,not_recom,not_recom -pretentious,proper,complete,more,convenient,inconv,problematic,recommended,priority -pretentious,proper,complete,more,convenient,inconv,problematic,priority,priority -pretentious,proper,complete,more,convenient,inconv,problematic,not_recom,not_recom -pretentious,proper,complete,more,less_conv,convenient,nonprob,recommended,priority -pretentious,proper,complete,more,less_conv,convenient,nonprob,priority,priority -pretentious,proper,complete,more,less_conv,convenient,nonprob,not_recom,not_recom -pretentious,proper,complete,more,less_conv,convenient,slightly_prob,recommended,priority -pretentious,proper,complete,more,less_conv,convenient,slightly_prob,priority,priority -pretentious,proper,complete,more,less_conv,convenient,slightly_prob,not_recom,not_recom 
-pretentious,proper,complete,more,less_conv,convenient,problematic,recommended,priority -pretentious,proper,complete,more,less_conv,convenient,problematic,priority,priority -pretentious,proper,complete,more,less_conv,convenient,problematic,not_recom,not_recom -pretentious,proper,complete,more,less_conv,inconv,nonprob,recommended,priority -pretentious,proper,complete,more,less_conv,inconv,nonprob,priority,priority -pretentious,proper,complete,more,less_conv,inconv,nonprob,not_recom,not_recom -pretentious,proper,complete,more,less_conv,inconv,slightly_prob,recommended,priority -pretentious,proper,complete,more,less_conv,inconv,slightly_prob,priority,priority -pretentious,proper,complete,more,less_conv,inconv,slightly_prob,not_recom,not_recom -pretentious,proper,complete,more,less_conv,inconv,problematic,recommended,priority -pretentious,proper,complete,more,less_conv,inconv,problematic,priority,priority -pretentious,proper,complete,more,less_conv,inconv,problematic,not_recom,not_recom -pretentious,proper,complete,more,critical,convenient,nonprob,recommended,priority -pretentious,proper,complete,more,critical,convenient,nonprob,priority,priority -pretentious,proper,complete,more,critical,convenient,nonprob,not_recom,not_recom -pretentious,proper,complete,more,critical,convenient,slightly_prob,recommended,priority -pretentious,proper,complete,more,critical,convenient,slightly_prob,priority,priority -pretentious,proper,complete,more,critical,convenient,slightly_prob,not_recom,not_recom -pretentious,proper,complete,more,critical,convenient,problematic,recommended,priority -pretentious,proper,complete,more,critical,convenient,problematic,priority,priority -pretentious,proper,complete,more,critical,convenient,problematic,not_recom,not_recom -pretentious,proper,complete,more,critical,inconv,nonprob,recommended,priority -pretentious,proper,complete,more,critical,inconv,nonprob,priority,priority -pretentious,proper,complete,more,critical,inconv,nonprob,not_recom,not_recom 
-pretentious,proper,complete,more,critical,inconv,slightly_prob,recommended,priority -pretentious,proper,complete,more,critical,inconv,slightly_prob,priority,priority -pretentious,proper,complete,more,critical,inconv,slightly_prob,not_recom,not_recom -pretentious,proper,complete,more,critical,inconv,problematic,recommended,priority -pretentious,proper,complete,more,critical,inconv,problematic,priority,priority -pretentious,proper,complete,more,critical,inconv,problematic,not_recom,not_recom -pretentious,proper,completed,1,convenient,convenient,nonprob,recommended,very_recom -pretentious,proper,completed,1,convenient,convenient,nonprob,priority,priority -pretentious,proper,completed,1,convenient,convenient,nonprob,not_recom,not_recom -pretentious,proper,completed,1,convenient,convenient,slightly_prob,recommended,very_recom -pretentious,proper,completed,1,convenient,convenient,slightly_prob,priority,priority -pretentious,proper,completed,1,convenient,convenient,slightly_prob,not_recom,not_recom -pretentious,proper,completed,1,convenient,convenient,problematic,recommended,priority -pretentious,proper,completed,1,convenient,convenient,problematic,priority,priority -pretentious,proper,completed,1,convenient,convenient,problematic,not_recom,not_recom -pretentious,proper,completed,1,convenient,inconv,nonprob,recommended,very_recom -pretentious,proper,completed,1,convenient,inconv,nonprob,priority,priority -pretentious,proper,completed,1,convenient,inconv,nonprob,not_recom,not_recom -pretentious,proper,completed,1,convenient,inconv,slightly_prob,recommended,very_recom -pretentious,proper,completed,1,convenient,inconv,slightly_prob,priority,priority -pretentious,proper,completed,1,convenient,inconv,slightly_prob,not_recom,not_recom -pretentious,proper,completed,1,convenient,inconv,problematic,recommended,priority -pretentious,proper,completed,1,convenient,inconv,problematic,priority,priority -pretentious,proper,completed,1,convenient,inconv,problematic,not_recom,not_recom 
-pretentious,proper,completed,1,less_conv,convenient,nonprob,recommended,very_recom -pretentious,proper,completed,1,less_conv,convenient,nonprob,priority,priority -pretentious,proper,completed,1,less_conv,convenient,nonprob,not_recom,not_recom -pretentious,proper,completed,1,less_conv,convenient,slightly_prob,recommended,very_recom -pretentious,proper,completed,1,less_conv,convenient,slightly_prob,priority,priority -pretentious,proper,completed,1,less_conv,convenient,slightly_prob,not_recom,not_recom -pretentious,proper,completed,1,less_conv,convenient,problematic,recommended,priority -pretentious,proper,completed,1,less_conv,convenient,problematic,priority,priority -pretentious,proper,completed,1,less_conv,convenient,problematic,not_recom,not_recom -pretentious,proper,completed,1,less_conv,inconv,nonprob,recommended,very_recom -pretentious,proper,completed,1,less_conv,inconv,nonprob,priority,priority -pretentious,proper,completed,1,less_conv,inconv,nonprob,not_recom,not_recom -pretentious,proper,completed,1,less_conv,inconv,slightly_prob,recommended,very_recom -pretentious,proper,completed,1,less_conv,inconv,slightly_prob,priority,priority -pretentious,proper,completed,1,less_conv,inconv,slightly_prob,not_recom,not_recom -pretentious,proper,completed,1,less_conv,inconv,problematic,recommended,priority -pretentious,proper,completed,1,less_conv,inconv,problematic,priority,priority -pretentious,proper,completed,1,less_conv,inconv,problematic,not_recom,not_recom -pretentious,proper,completed,1,critical,convenient,nonprob,recommended,priority -pretentious,proper,completed,1,critical,convenient,nonprob,priority,priority -pretentious,proper,completed,1,critical,convenient,nonprob,not_recom,not_recom -pretentious,proper,completed,1,critical,convenient,slightly_prob,recommended,priority -pretentious,proper,completed,1,critical,convenient,slightly_prob,priority,priority -pretentious,proper,completed,1,critical,convenient,slightly_prob,not_recom,not_recom 
-pretentious,proper,completed,1,critical,convenient,problematic,recommended,priority -pretentious,proper,completed,1,critical,convenient,problematic,priority,priority -pretentious,proper,completed,1,critical,convenient,problematic,not_recom,not_recom -pretentious,proper,completed,1,critical,inconv,nonprob,recommended,priority -pretentious,proper,completed,1,critical,inconv,nonprob,priority,priority -pretentious,proper,completed,1,critical,inconv,nonprob,not_recom,not_recom -pretentious,proper,completed,1,critical,inconv,slightly_prob,recommended,priority -pretentious,proper,completed,1,critical,inconv,slightly_prob,priority,priority -pretentious,proper,completed,1,critical,inconv,slightly_prob,not_recom,not_recom -pretentious,proper,completed,1,critical,inconv,problematic,recommended,priority -pretentious,proper,completed,1,critical,inconv,problematic,priority,priority -pretentious,proper,completed,1,critical,inconv,problematic,not_recom,not_recom -pretentious,proper,completed,2,convenient,convenient,nonprob,recommended,very_recom -pretentious,proper,completed,2,convenient,convenient,nonprob,priority,priority -pretentious,proper,completed,2,convenient,convenient,nonprob,not_recom,not_recom -pretentious,proper,completed,2,convenient,convenient,slightly_prob,recommended,very_recom -pretentious,proper,completed,2,convenient,convenient,slightly_prob,priority,priority -pretentious,proper,completed,2,convenient,convenient,slightly_prob,not_recom,not_recom -pretentious,proper,completed,2,convenient,convenient,problematic,recommended,priority -pretentious,proper,completed,2,convenient,convenient,problematic,priority,priority -pretentious,proper,completed,2,convenient,convenient,problematic,not_recom,not_recom -pretentious,proper,completed,2,convenient,inconv,nonprob,recommended,very_recom -pretentious,proper,completed,2,convenient,inconv,nonprob,priority,priority -pretentious,proper,completed,2,convenient,inconv,nonprob,not_recom,not_recom 
-pretentious,proper,completed,2,convenient,inconv,slightly_prob,recommended,very_recom -pretentious,proper,completed,2,convenient,inconv,slightly_prob,priority,priority -pretentious,proper,completed,2,convenient,inconv,slightly_prob,not_recom,not_recom -pretentious,proper,completed,2,convenient,inconv,problematic,recommended,priority -pretentious,proper,completed,2,convenient,inconv,problematic,priority,priority -pretentious,proper,completed,2,convenient,inconv,problematic,not_recom,not_recom -pretentious,proper,completed,2,less_conv,convenient,nonprob,recommended,very_recom -pretentious,proper,completed,2,less_conv,convenient,nonprob,priority,priority -pretentious,proper,completed,2,less_conv,convenient,nonprob,not_recom,not_recom -pretentious,proper,completed,2,less_conv,convenient,slightly_prob,recommended,very_recom -pretentious,proper,completed,2,less_conv,convenient,slightly_prob,priority,priority -pretentious,proper,completed,2,less_conv,convenient,slightly_prob,not_recom,not_recom -pretentious,proper,completed,2,less_conv,convenient,problematic,recommended,priority -pretentious,proper,completed,2,less_conv,convenient,problematic,priority,priority -pretentious,proper,completed,2,less_conv,convenient,problematic,not_recom,not_recom -pretentious,proper,completed,2,less_conv,inconv,nonprob,recommended,very_recom -pretentious,proper,completed,2,less_conv,inconv,nonprob,priority,priority -pretentious,proper,completed,2,less_conv,inconv,nonprob,not_recom,not_recom -pretentious,proper,completed,2,less_conv,inconv,slightly_prob,recommended,very_recom -pretentious,proper,completed,2,less_conv,inconv,slightly_prob,priority,priority -pretentious,proper,completed,2,less_conv,inconv,slightly_prob,not_recom,not_recom -pretentious,proper,completed,2,less_conv,inconv,problematic,recommended,priority -pretentious,proper,completed,2,less_conv,inconv,problematic,priority,priority -pretentious,proper,completed,2,less_conv,inconv,problematic,not_recom,not_recom 
-pretentious,proper,completed,2,critical,convenient,nonprob,recommended,priority -pretentious,proper,completed,2,critical,convenient,nonprob,priority,priority -pretentious,proper,completed,2,critical,convenient,nonprob,not_recom,not_recom -pretentious,proper,completed,2,critical,convenient,slightly_prob,recommended,priority -pretentious,proper,completed,2,critical,convenient,slightly_prob,priority,priority -pretentious,proper,completed,2,critical,convenient,slightly_prob,not_recom,not_recom -pretentious,proper,completed,2,critical,convenient,problematic,recommended,priority -pretentious,proper,completed,2,critical,convenient,problematic,priority,priority -pretentious,proper,completed,2,critical,convenient,problematic,not_recom,not_recom -pretentious,proper,completed,2,critical,inconv,nonprob,recommended,priority -pretentious,proper,completed,2,critical,inconv,nonprob,priority,priority -pretentious,proper,completed,2,critical,inconv,nonprob,not_recom,not_recom -pretentious,proper,completed,2,critical,inconv,slightly_prob,recommended,priority -pretentious,proper,completed,2,critical,inconv,slightly_prob,priority,priority -pretentious,proper,completed,2,critical,inconv,slightly_prob,not_recom,not_recom -pretentious,proper,completed,2,critical,inconv,problematic,recommended,priority -pretentious,proper,completed,2,critical,inconv,problematic,priority,priority -pretentious,proper,completed,2,critical,inconv,problematic,not_recom,not_recom -pretentious,proper,completed,3,convenient,convenient,nonprob,recommended,very_recom -pretentious,proper,completed,3,convenient,convenient,nonprob,priority,priority -pretentious,proper,completed,3,convenient,convenient,nonprob,not_recom,not_recom -pretentious,proper,completed,3,convenient,convenient,slightly_prob,recommended,very_recom -pretentious,proper,completed,3,convenient,convenient,slightly_prob,priority,priority -pretentious,proper,completed,3,convenient,convenient,slightly_prob,not_recom,not_recom 
-pretentious,proper,completed,3,convenient,convenient,problematic,recommended,priority -pretentious,proper,completed,3,convenient,convenient,problematic,priority,priority -pretentious,proper,completed,3,convenient,convenient,problematic,not_recom,not_recom -pretentious,proper,completed,3,convenient,inconv,nonprob,recommended,priority -pretentious,proper,completed,3,convenient,inconv,nonprob,priority,priority -pretentious,proper,completed,3,convenient,inconv,nonprob,not_recom,not_recom -pretentious,proper,completed,3,convenient,inconv,slightly_prob,recommended,priority -pretentious,proper,completed,3,convenient,inconv,slightly_prob,priority,priority -pretentious,proper,completed,3,convenient,inconv,slightly_prob,not_recom,not_recom -pretentious,proper,completed,3,convenient,inconv,problematic,recommended,priority -pretentious,proper,completed,3,convenient,inconv,problematic,priority,priority -pretentious,proper,completed,3,convenient,inconv,problematic,not_recom,not_recom -pretentious,proper,completed,3,less_conv,convenient,nonprob,recommended,priority -pretentious,proper,completed,3,less_conv,convenient,nonprob,priority,priority -pretentious,proper,completed,3,less_conv,convenient,nonprob,not_recom,not_recom -pretentious,proper,completed,3,less_conv,convenient,slightly_prob,recommended,priority -pretentious,proper,completed,3,less_conv,convenient,slightly_prob,priority,priority -pretentious,proper,completed,3,less_conv,convenient,slightly_prob,not_recom,not_recom -pretentious,proper,completed,3,less_conv,convenient,problematic,recommended,priority -pretentious,proper,completed,3,less_conv,convenient,problematic,priority,priority -pretentious,proper,completed,3,less_conv,convenient,problematic,not_recom,not_recom -pretentious,proper,completed,3,less_conv,inconv,nonprob,recommended,priority -pretentious,proper,completed,3,less_conv,inconv,nonprob,priority,priority -pretentious,proper,completed,3,less_conv,inconv,nonprob,not_recom,not_recom 
-pretentious,proper,completed,3,less_conv,inconv,slightly_prob,recommended,priority -pretentious,proper,completed,3,less_conv,inconv,slightly_prob,priority,priority -pretentious,proper,completed,3,less_conv,inconv,slightly_prob,not_recom,not_recom -pretentious,proper,completed,3,less_conv,inconv,problematic,recommended,priority -pretentious,proper,completed,3,less_conv,inconv,problematic,priority,priority -pretentious,proper,completed,3,less_conv,inconv,problematic,not_recom,not_recom -pretentious,proper,completed,3,critical,convenient,nonprob,recommended,priority -pretentious,proper,completed,3,critical,convenient,nonprob,priority,priority -pretentious,proper,completed,3,critical,convenient,nonprob,not_recom,not_recom -pretentious,proper,completed,3,critical,convenient,slightly_prob,recommended,priority -pretentious,proper,completed,3,critical,convenient,slightly_prob,priority,priority -pretentious,proper,completed,3,critical,convenient,slightly_prob,not_recom,not_recom -pretentious,proper,completed,3,critical,convenient,problematic,recommended,priority -pretentious,proper,completed,3,critical,convenient,problematic,priority,priority -pretentious,proper,completed,3,critical,convenient,problematic,not_recom,not_recom -pretentious,proper,completed,3,critical,inconv,nonprob,recommended,priority -pretentious,proper,completed,3,critical,inconv,nonprob,priority,priority -pretentious,proper,completed,3,critical,inconv,nonprob,not_recom,not_recom -pretentious,proper,completed,3,critical,inconv,slightly_prob,recommended,priority -pretentious,proper,completed,3,critical,inconv,slightly_prob,priority,priority -pretentious,proper,completed,3,critical,inconv,slightly_prob,not_recom,not_recom -pretentious,proper,completed,3,critical,inconv,problematic,recommended,priority -pretentious,proper,completed,3,critical,inconv,problematic,priority,priority -pretentious,proper,completed,3,critical,inconv,problematic,not_recom,not_recom 
-pretentious,proper,completed,more,convenient,convenient,nonprob,recommended,very_recom -pretentious,proper,completed,more,convenient,convenient,nonprob,priority,priority -pretentious,proper,completed,more,convenient,convenient,nonprob,not_recom,not_recom -pretentious,proper,completed,more,convenient,convenient,slightly_prob,recommended,very_recom -pretentious,proper,completed,more,convenient,convenient,slightly_prob,priority,priority -pretentious,proper,completed,more,convenient,convenient,slightly_prob,not_recom,not_recom -pretentious,proper,completed,more,convenient,convenient,problematic,recommended,priority -pretentious,proper,completed,more,convenient,convenient,problematic,priority,priority -pretentious,proper,completed,more,convenient,convenient,problematic,not_recom,not_recom -pretentious,proper,completed,more,convenient,inconv,nonprob,recommended,priority -pretentious,proper,completed,more,convenient,inconv,nonprob,priority,priority -pretentious,proper,completed,more,convenient,inconv,nonprob,not_recom,not_recom -pretentious,proper,completed,more,convenient,inconv,slightly_prob,recommended,priority -pretentious,proper,completed,more,convenient,inconv,slightly_prob,priority,priority -pretentious,proper,completed,more,convenient,inconv,slightly_prob,not_recom,not_recom -pretentious,proper,completed,more,convenient,inconv,problematic,recommended,priority -pretentious,proper,completed,more,convenient,inconv,problematic,priority,priority -pretentious,proper,completed,more,convenient,inconv,problematic,not_recom,not_recom -pretentious,proper,completed,more,less_conv,convenient,nonprob,recommended,priority -pretentious,proper,completed,more,less_conv,convenient,nonprob,priority,priority -pretentious,proper,completed,more,less_conv,convenient,nonprob,not_recom,not_recom -pretentious,proper,completed,more,less_conv,convenient,slightly_prob,recommended,priority -pretentious,proper,completed,more,less_conv,convenient,slightly_prob,priority,priority 
-pretentious,proper,completed,more,less_conv,convenient,slightly_prob,not_recom,not_recom -pretentious,proper,completed,more,less_conv,convenient,problematic,recommended,priority -pretentious,proper,completed,more,less_conv,convenient,problematic,priority,priority -pretentious,proper,completed,more,less_conv,convenient,problematic,not_recom,not_recom -pretentious,proper,completed,more,less_conv,inconv,nonprob,recommended,priority -pretentious,proper,completed,more,less_conv,inconv,nonprob,priority,priority -pretentious,proper,completed,more,less_conv,inconv,nonprob,not_recom,not_recom -pretentious,proper,completed,more,less_conv,inconv,slightly_prob,recommended,priority -pretentious,proper,completed,more,less_conv,inconv,slightly_prob,priority,priority -pretentious,proper,completed,more,less_conv,inconv,slightly_prob,not_recom,not_recom -pretentious,proper,completed,more,less_conv,inconv,problematic,recommended,priority -pretentious,proper,completed,more,less_conv,inconv,problematic,priority,priority -pretentious,proper,completed,more,less_conv,inconv,problematic,not_recom,not_recom -pretentious,proper,completed,more,critical,convenient,nonprob,recommended,priority -pretentious,proper,completed,more,critical,convenient,nonprob,priority,priority -pretentious,proper,completed,more,critical,convenient,nonprob,not_recom,not_recom -pretentious,proper,completed,more,critical,convenient,slightly_prob,recommended,priority -pretentious,proper,completed,more,critical,convenient,slightly_prob,priority,priority -pretentious,proper,completed,more,critical,convenient,slightly_prob,not_recom,not_recom -pretentious,proper,completed,more,critical,convenient,problematic,recommended,priority -pretentious,proper,completed,more,critical,convenient,problematic,priority,priority -pretentious,proper,completed,more,critical,convenient,problematic,not_recom,not_recom -pretentious,proper,completed,more,critical,inconv,nonprob,recommended,priority 
-pretentious,proper,completed,more,critical,inconv,nonprob,priority,priority -pretentious,proper,completed,more,critical,inconv,nonprob,not_recom,not_recom -pretentious,proper,completed,more,critical,inconv,slightly_prob,recommended,priority -pretentious,proper,completed,more,critical,inconv,slightly_prob,priority,priority -pretentious,proper,completed,more,critical,inconv,slightly_prob,not_recom,not_recom -pretentious,proper,completed,more,critical,inconv,problematic,recommended,priority -pretentious,proper,completed,more,critical,inconv,problematic,priority,priority -pretentious,proper,completed,more,critical,inconv,problematic,not_recom,not_recom -pretentious,proper,incomplete,1,convenient,convenient,nonprob,recommended,very_recom -pretentious,proper,incomplete,1,convenient,convenient,nonprob,priority,priority -pretentious,proper,incomplete,1,convenient,convenient,nonprob,not_recom,not_recom -pretentious,proper,incomplete,1,convenient,convenient,slightly_prob,recommended,very_recom -pretentious,proper,incomplete,1,convenient,convenient,slightly_prob,priority,priority -pretentious,proper,incomplete,1,convenient,convenient,slightly_prob,not_recom,not_recom -pretentious,proper,incomplete,1,convenient,convenient,problematic,recommended,priority -pretentious,proper,incomplete,1,convenient,convenient,problematic,priority,priority -pretentious,proper,incomplete,1,convenient,convenient,problematic,not_recom,not_recom -pretentious,proper,incomplete,1,convenient,inconv,nonprob,recommended,very_recom -pretentious,proper,incomplete,1,convenient,inconv,nonprob,priority,priority -pretentious,proper,incomplete,1,convenient,inconv,nonprob,not_recom,not_recom -pretentious,proper,incomplete,1,convenient,inconv,slightly_prob,recommended,very_recom -pretentious,proper,incomplete,1,convenient,inconv,slightly_prob,priority,priority -pretentious,proper,incomplete,1,convenient,inconv,slightly_prob,not_recom,not_recom 
-pretentious,proper,incomplete,1,convenient,inconv,problematic,recommended,priority -pretentious,proper,incomplete,1,convenient,inconv,problematic,priority,priority -pretentious,proper,incomplete,1,convenient,inconv,problematic,not_recom,not_recom -pretentious,proper,incomplete,1,less_conv,convenient,nonprob,recommended,very_recom -pretentious,proper,incomplete,1,less_conv,convenient,nonprob,priority,priority -pretentious,proper,incomplete,1,less_conv,convenient,nonprob,not_recom,not_recom -pretentious,proper,incomplete,1,less_conv,convenient,slightly_prob,recommended,very_recom -pretentious,proper,incomplete,1,less_conv,convenient,slightly_prob,priority,priority -pretentious,proper,incomplete,1,less_conv,convenient,slightly_prob,not_recom,not_recom -pretentious,proper,incomplete,1,less_conv,convenient,problematic,recommended,priority -pretentious,proper,incomplete,1,less_conv,convenient,problematic,priority,priority -pretentious,proper,incomplete,1,less_conv,convenient,problematic,not_recom,not_recom -pretentious,proper,incomplete,1,less_conv,inconv,nonprob,recommended,very_recom -pretentious,proper,incomplete,1,less_conv,inconv,nonprob,priority,priority -pretentious,proper,incomplete,1,less_conv,inconv,nonprob,not_recom,not_recom -pretentious,proper,incomplete,1,less_conv,inconv,slightly_prob,recommended,very_recom -pretentious,proper,incomplete,1,less_conv,inconv,slightly_prob,priority,priority -pretentious,proper,incomplete,1,less_conv,inconv,slightly_prob,not_recom,not_recom -pretentious,proper,incomplete,1,less_conv,inconv,problematic,recommended,priority -pretentious,proper,incomplete,1,less_conv,inconv,problematic,priority,priority -pretentious,proper,incomplete,1,less_conv,inconv,problematic,not_recom,not_recom -pretentious,proper,incomplete,1,critical,convenient,nonprob,recommended,priority -pretentious,proper,incomplete,1,critical,convenient,nonprob,priority,priority -pretentious,proper,incomplete,1,critical,convenient,nonprob,not_recom,not_recom 
-pretentious,proper,incomplete,1,critical,convenient,slightly_prob,recommended,priority -pretentious,proper,incomplete,1,critical,convenient,slightly_prob,priority,priority -pretentious,proper,incomplete,1,critical,convenient,slightly_prob,not_recom,not_recom -pretentious,proper,incomplete,1,critical,convenient,problematic,recommended,priority -pretentious,proper,incomplete,1,critical,convenient,problematic,priority,priority -pretentious,proper,incomplete,1,critical,convenient,problematic,not_recom,not_recom -pretentious,proper,incomplete,1,critical,inconv,nonprob,recommended,priority -pretentious,proper,incomplete,1,critical,inconv,nonprob,priority,priority -pretentious,proper,incomplete,1,critical,inconv,nonprob,not_recom,not_recom -pretentious,proper,incomplete,1,critical,inconv,slightly_prob,recommended,priority -pretentious,proper,incomplete,1,critical,inconv,slightly_prob,priority,priority -pretentious,proper,incomplete,1,critical,inconv,slightly_prob,not_recom,not_recom -pretentious,proper,incomplete,1,critical,inconv,problematic,recommended,priority -pretentious,proper,incomplete,1,critical,inconv,problematic,priority,priority -pretentious,proper,incomplete,1,critical,inconv,problematic,not_recom,not_recom -pretentious,proper,incomplete,2,convenient,convenient,nonprob,recommended,very_recom -pretentious,proper,incomplete,2,convenient,convenient,nonprob,priority,priority -pretentious,proper,incomplete,2,convenient,convenient,nonprob,not_recom,not_recom -pretentious,proper,incomplete,2,convenient,convenient,slightly_prob,recommended,very_recom -pretentious,proper,incomplete,2,convenient,convenient,slightly_prob,priority,priority -pretentious,proper,incomplete,2,convenient,convenient,slightly_prob,not_recom,not_recom -pretentious,proper,incomplete,2,convenient,convenient,problematic,recommended,priority -pretentious,proper,incomplete,2,convenient,convenient,problematic,priority,priority 
-pretentious,proper,incomplete,2,convenient,convenient,problematic,not_recom,not_recom -pretentious,proper,incomplete,2,convenient,inconv,nonprob,recommended,priority -pretentious,proper,incomplete,2,convenient,inconv,nonprob,priority,priority -pretentious,proper,incomplete,2,convenient,inconv,nonprob,not_recom,not_recom -pretentious,proper,incomplete,2,convenient,inconv,slightly_prob,recommended,priority -pretentious,proper,incomplete,2,convenient,inconv,slightly_prob,priority,priority -pretentious,proper,incomplete,2,convenient,inconv,slightly_prob,not_recom,not_recom -pretentious,proper,incomplete,2,convenient,inconv,problematic,recommended,priority -pretentious,proper,incomplete,2,convenient,inconv,problematic,priority,priority -pretentious,proper,incomplete,2,convenient,inconv,problematic,not_recom,not_recom -pretentious,proper,incomplete,2,less_conv,convenient,nonprob,recommended,priority -pretentious,proper,incomplete,2,less_conv,convenient,nonprob,priority,priority -pretentious,proper,incomplete,2,less_conv,convenient,nonprob,not_recom,not_recom -pretentious,proper,incomplete,2,less_conv,convenient,slightly_prob,recommended,priority -pretentious,proper,incomplete,2,less_conv,convenient,slightly_prob,priority,priority -pretentious,proper,incomplete,2,less_conv,convenient,slightly_prob,not_recom,not_recom -pretentious,proper,incomplete,2,less_conv,convenient,problematic,recommended,priority -pretentious,proper,incomplete,2,less_conv,convenient,problematic,priority,priority -pretentious,proper,incomplete,2,less_conv,convenient,problematic,not_recom,not_recom -pretentious,proper,incomplete,2,less_conv,inconv,nonprob,recommended,priority -pretentious,proper,incomplete,2,less_conv,inconv,nonprob,priority,priority -pretentious,proper,incomplete,2,less_conv,inconv,nonprob,not_recom,not_recom -pretentious,proper,incomplete,2,less_conv,inconv,slightly_prob,recommended,priority -pretentious,proper,incomplete,2,less_conv,inconv,slightly_prob,priority,priority 
-pretentious,proper,incomplete,2,less_conv,inconv,slightly_prob,not_recom,not_recom -pretentious,proper,incomplete,2,less_conv,inconv,problematic,recommended,priority -pretentious,proper,incomplete,2,less_conv,inconv,problematic,priority,priority -pretentious,proper,incomplete,2,less_conv,inconv,problematic,not_recom,not_recom -pretentious,proper,incomplete,2,critical,convenient,nonprob,recommended,priority -pretentious,proper,incomplete,2,critical,convenient,nonprob,priority,priority -pretentious,proper,incomplete,2,critical,convenient,nonprob,not_recom,not_recom -pretentious,proper,incomplete,2,critical,convenient,slightly_prob,recommended,priority -pretentious,proper,incomplete,2,critical,convenient,slightly_prob,priority,priority -pretentious,proper,incomplete,2,critical,convenient,slightly_prob,not_recom,not_recom -pretentious,proper,incomplete,2,critical,convenient,problematic,recommended,priority -pretentious,proper,incomplete,2,critical,convenient,problematic,priority,priority -pretentious,proper,incomplete,2,critical,convenient,problematic,not_recom,not_recom -pretentious,proper,incomplete,2,critical,inconv,nonprob,recommended,priority -pretentious,proper,incomplete,2,critical,inconv,nonprob,priority,priority -pretentious,proper,incomplete,2,critical,inconv,nonprob,not_recom,not_recom -pretentious,proper,incomplete,2,critical,inconv,slightly_prob,recommended,priority -pretentious,proper,incomplete,2,critical,inconv,slightly_prob,priority,priority -pretentious,proper,incomplete,2,critical,inconv,slightly_prob,not_recom,not_recom -pretentious,proper,incomplete,2,critical,inconv,problematic,recommended,priority -pretentious,proper,incomplete,2,critical,inconv,problematic,priority,priority -pretentious,proper,incomplete,2,critical,inconv,problematic,not_recom,not_recom -pretentious,proper,incomplete,3,convenient,convenient,nonprob,recommended,very_recom -pretentious,proper,incomplete,3,convenient,convenient,nonprob,priority,priority 
-pretentious,proper,incomplete,3,convenient,convenient,nonprob,not_recom,not_recom -pretentious,proper,incomplete,3,convenient,convenient,slightly_prob,recommended,very_recom -pretentious,proper,incomplete,3,convenient,convenient,slightly_prob,priority,priority -pretentious,proper,incomplete,3,convenient,convenient,slightly_prob,not_recom,not_recom -pretentious,proper,incomplete,3,convenient,convenient,problematic,recommended,priority -pretentious,proper,incomplete,3,convenient,convenient,problematic,priority,priority -pretentious,proper,incomplete,3,convenient,convenient,problematic,not_recom,not_recom -pretentious,proper,incomplete,3,convenient,inconv,nonprob,recommended,priority -pretentious,proper,incomplete,3,convenient,inconv,nonprob,priority,priority -pretentious,proper,incomplete,3,convenient,inconv,nonprob,not_recom,not_recom -pretentious,proper,incomplete,3,convenient,inconv,slightly_prob,recommended,priority -pretentious,proper,incomplete,3,convenient,inconv,slightly_prob,priority,priority -pretentious,proper,incomplete,3,convenient,inconv,slightly_prob,not_recom,not_recom -pretentious,proper,incomplete,3,convenient,inconv,problematic,recommended,priority -pretentious,proper,incomplete,3,convenient,inconv,problematic,priority,priority -pretentious,proper,incomplete,3,convenient,inconv,problematic,not_recom,not_recom -pretentious,proper,incomplete,3,less_conv,convenient,nonprob,recommended,priority -pretentious,proper,incomplete,3,less_conv,convenient,nonprob,priority,priority -pretentious,proper,incomplete,3,less_conv,convenient,nonprob,not_recom,not_recom -pretentious,proper,incomplete,3,less_conv,convenient,slightly_prob,recommended,priority -pretentious,proper,incomplete,3,less_conv,convenient,slightly_prob,priority,priority -pretentious,proper,incomplete,3,less_conv,convenient,slightly_prob,not_recom,not_recom -pretentious,proper,incomplete,3,less_conv,convenient,problematic,recommended,priority 
-pretentious,proper,incomplete,3,less_conv,convenient,problematic,priority,priority -pretentious,proper,incomplete,3,less_conv,convenient,problematic,not_recom,not_recom -pretentious,proper,incomplete,3,less_conv,inconv,nonprob,recommended,priority -pretentious,proper,incomplete,3,less_conv,inconv,nonprob,priority,priority -pretentious,proper,incomplete,3,less_conv,inconv,nonprob,not_recom,not_recom -pretentious,proper,incomplete,3,less_conv,inconv,slightly_prob,recommended,priority -pretentious,proper,incomplete,3,less_conv,inconv,slightly_prob,priority,priority -pretentious,proper,incomplete,3,less_conv,inconv,slightly_prob,not_recom,not_recom -pretentious,proper,incomplete,3,less_conv,inconv,problematic,recommended,priority -pretentious,proper,incomplete,3,less_conv,inconv,problematic,priority,priority -pretentious,proper,incomplete,3,less_conv,inconv,problematic,not_recom,not_recom -pretentious,proper,incomplete,3,critical,convenient,nonprob,recommended,priority -pretentious,proper,incomplete,3,critical,convenient,nonprob,priority,priority -pretentious,proper,incomplete,3,critical,convenient,nonprob,not_recom,not_recom -pretentious,proper,incomplete,3,critical,convenient,slightly_prob,recommended,priority -pretentious,proper,incomplete,3,critical,convenient,slightly_prob,priority,priority -pretentious,proper,incomplete,3,critical,convenient,slightly_prob,not_recom,not_recom -pretentious,proper,incomplete,3,critical,convenient,problematic,recommended,priority -pretentious,proper,incomplete,3,critical,convenient,problematic,priority,priority -pretentious,proper,incomplete,3,critical,convenient,problematic,not_recom,not_recom -pretentious,proper,incomplete,3,critical,inconv,nonprob,recommended,priority -pretentious,proper,incomplete,3,critical,inconv,nonprob,priority,priority -pretentious,proper,incomplete,3,critical,inconv,nonprob,not_recom,not_recom -pretentious,proper,incomplete,3,critical,inconv,slightly_prob,recommended,priority 
-pretentious,proper,incomplete,3,critical,inconv,slightly_prob,priority,priority -pretentious,proper,incomplete,3,critical,inconv,slightly_prob,not_recom,not_recom -pretentious,proper,incomplete,3,critical,inconv,problematic,recommended,priority -pretentious,proper,incomplete,3,critical,inconv,problematic,priority,priority -pretentious,proper,incomplete,3,critical,inconv,problematic,not_recom,not_recom -pretentious,proper,incomplete,more,convenient,convenient,nonprob,recommended,very_recom -pretentious,proper,incomplete,more,convenient,convenient,nonprob,priority,priority -pretentious,proper,incomplete,more,convenient,convenient,nonprob,not_recom,not_recom -pretentious,proper,incomplete,more,convenient,convenient,slightly_prob,recommended,very_recom -pretentious,proper,incomplete,more,convenient,convenient,slightly_prob,priority,priority -pretentious,proper,incomplete,more,convenient,convenient,slightly_prob,not_recom,not_recom -pretentious,proper,incomplete,more,convenient,convenient,problematic,recommended,priority -pretentious,proper,incomplete,more,convenient,convenient,problematic,priority,priority -pretentious,proper,incomplete,more,convenient,convenient,problematic,not_recom,not_recom -pretentious,proper,incomplete,more,convenient,inconv,nonprob,recommended,priority -pretentious,proper,incomplete,more,convenient,inconv,nonprob,priority,priority -pretentious,proper,incomplete,more,convenient,inconv,nonprob,not_recom,not_recom -pretentious,proper,incomplete,more,convenient,inconv,slightly_prob,recommended,priority -pretentious,proper,incomplete,more,convenient,inconv,slightly_prob,priority,priority -pretentious,proper,incomplete,more,convenient,inconv,slightly_prob,not_recom,not_recom -pretentious,proper,incomplete,more,convenient,inconv,problematic,recommended,priority -pretentious,proper,incomplete,more,convenient,inconv,problematic,priority,priority -pretentious,proper,incomplete,more,convenient,inconv,problematic,not_recom,not_recom 
-pretentious,proper,incomplete,more,less_conv,convenient,nonprob,recommended,priority -pretentious,proper,incomplete,more,less_conv,convenient,nonprob,priority,priority -pretentious,proper,incomplete,more,less_conv,convenient,nonprob,not_recom,not_recom -pretentious,proper,incomplete,more,less_conv,convenient,slightly_prob,recommended,priority -pretentious,proper,incomplete,more,less_conv,convenient,slightly_prob,priority,priority -pretentious,proper,incomplete,more,less_conv,convenient,slightly_prob,not_recom,not_recom -pretentious,proper,incomplete,more,less_conv,convenient,problematic,recommended,priority -pretentious,proper,incomplete,more,less_conv,convenient,problematic,priority,priority -pretentious,proper,incomplete,more,less_conv,convenient,problematic,not_recom,not_recom -pretentious,proper,incomplete,more,less_conv,inconv,nonprob,recommended,priority -pretentious,proper,incomplete,more,less_conv,inconv,nonprob,priority,priority -pretentious,proper,incomplete,more,less_conv,inconv,nonprob,not_recom,not_recom -pretentious,proper,incomplete,more,less_conv,inconv,slightly_prob,recommended,priority -pretentious,proper,incomplete,more,less_conv,inconv,slightly_prob,priority,priority -pretentious,proper,incomplete,more,less_conv,inconv,slightly_prob,not_recom,not_recom -pretentious,proper,incomplete,more,less_conv,inconv,problematic,recommended,priority -pretentious,proper,incomplete,more,less_conv,inconv,problematic,priority,priority -pretentious,proper,incomplete,more,less_conv,inconv,problematic,not_recom,not_recom -pretentious,proper,incomplete,more,critical,convenient,nonprob,recommended,priority -pretentious,proper,incomplete,more,critical,convenient,nonprob,priority,priority -pretentious,proper,incomplete,more,critical,convenient,nonprob,not_recom,not_recom -pretentious,proper,incomplete,more,critical,convenient,slightly_prob,recommended,priority -pretentious,proper,incomplete,more,critical,convenient,slightly_prob,priority,priority 
-pretentious,proper,incomplete,more,critical,convenient,slightly_prob,not_recom,not_recom -pretentious,proper,incomplete,more,critical,convenient,problematic,recommended,priority -pretentious,proper,incomplete,more,critical,convenient,problematic,priority,priority -pretentious,proper,incomplete,more,critical,convenient,problematic,not_recom,not_recom -pretentious,proper,incomplete,more,critical,inconv,nonprob,recommended,priority -pretentious,proper,incomplete,more,critical,inconv,nonprob,priority,priority -pretentious,proper,incomplete,more,critical,inconv,nonprob,not_recom,not_recom -pretentious,proper,incomplete,more,critical,inconv,slightly_prob,recommended,priority -pretentious,proper,incomplete,more,critical,inconv,slightly_prob,priority,priority -pretentious,proper,incomplete,more,critical,inconv,slightly_prob,not_recom,not_recom -pretentious,proper,incomplete,more,critical,inconv,problematic,recommended,priority -pretentious,proper,incomplete,more,critical,inconv,problematic,priority,priority -pretentious,proper,incomplete,more,critical,inconv,problematic,not_recom,not_recom -pretentious,proper,foster,1,convenient,convenient,nonprob,recommended,very_recom -pretentious,proper,foster,1,convenient,convenient,nonprob,priority,priority -pretentious,proper,foster,1,convenient,convenient,nonprob,not_recom,not_recom -pretentious,proper,foster,1,convenient,convenient,slightly_prob,recommended,very_recom -pretentious,proper,foster,1,convenient,convenient,slightly_prob,priority,priority -pretentious,proper,foster,1,convenient,convenient,slightly_prob,not_recom,not_recom -pretentious,proper,foster,1,convenient,convenient,problematic,recommended,priority -pretentious,proper,foster,1,convenient,convenient,problematic,priority,priority -pretentious,proper,foster,1,convenient,convenient,problematic,not_recom,not_recom -pretentious,proper,foster,1,convenient,inconv,nonprob,recommended,priority -pretentious,proper,foster,1,convenient,inconv,nonprob,priority,priority 
-pretentious,proper,foster,1,convenient,inconv,nonprob,not_recom,not_recom -pretentious,proper,foster,1,convenient,inconv,slightly_prob,recommended,priority -pretentious,proper,foster,1,convenient,inconv,slightly_prob,priority,priority -pretentious,proper,foster,1,convenient,inconv,slightly_prob,not_recom,not_recom -pretentious,proper,foster,1,convenient,inconv,problematic,recommended,priority -pretentious,proper,foster,1,convenient,inconv,problematic,priority,priority -pretentious,proper,foster,1,convenient,inconv,problematic,not_recom,not_recom -pretentious,proper,foster,1,less_conv,convenient,nonprob,recommended,priority -pretentious,proper,foster,1,less_conv,convenient,nonprob,priority,priority -pretentious,proper,foster,1,less_conv,convenient,nonprob,not_recom,not_recom -pretentious,proper,foster,1,less_conv,convenient,slightly_prob,recommended,priority -pretentious,proper,foster,1,less_conv,convenient,slightly_prob,priority,priority -pretentious,proper,foster,1,less_conv,convenient,slightly_prob,not_recom,not_recom -pretentious,proper,foster,1,less_conv,convenient,problematic,recommended,priority -pretentious,proper,foster,1,less_conv,convenient,problematic,priority,priority -pretentious,proper,foster,1,less_conv,convenient,problematic,not_recom,not_recom -pretentious,proper,foster,1,less_conv,inconv,nonprob,recommended,priority -pretentious,proper,foster,1,less_conv,inconv,nonprob,priority,priority -pretentious,proper,foster,1,less_conv,inconv,nonprob,not_recom,not_recom -pretentious,proper,foster,1,less_conv,inconv,slightly_prob,recommended,priority -pretentious,proper,foster,1,less_conv,inconv,slightly_prob,priority,priority -pretentious,proper,foster,1,less_conv,inconv,slightly_prob,not_recom,not_recom -pretentious,proper,foster,1,less_conv,inconv,problematic,recommended,priority -pretentious,proper,foster,1,less_conv,inconv,problematic,priority,priority -pretentious,proper,foster,1,less_conv,inconv,problematic,not_recom,not_recom 
-pretentious,proper,foster,1,critical,convenient,nonprob,recommended,priority -pretentious,proper,foster,1,critical,convenient,nonprob,priority,priority -pretentious,proper,foster,1,critical,convenient,nonprob,not_recom,not_recom -pretentious,proper,foster,1,critical,convenient,slightly_prob,recommended,priority -pretentious,proper,foster,1,critical,convenient,slightly_prob,priority,priority -pretentious,proper,foster,1,critical,convenient,slightly_prob,not_recom,not_recom -pretentious,proper,foster,1,critical,convenient,problematic,recommended,priority -pretentious,proper,foster,1,critical,convenient,problematic,priority,priority -pretentious,proper,foster,1,critical,convenient,problematic,not_recom,not_recom -pretentious,proper,foster,1,critical,inconv,nonprob,recommended,priority -pretentious,proper,foster,1,critical,inconv,nonprob,priority,priority -pretentious,proper,foster,1,critical,inconv,nonprob,not_recom,not_recom -pretentious,proper,foster,1,critical,inconv,slightly_prob,recommended,priority -pretentious,proper,foster,1,critical,inconv,slightly_prob,priority,priority -pretentious,proper,foster,1,critical,inconv,slightly_prob,not_recom,not_recom -pretentious,proper,foster,1,critical,inconv,problematic,recommended,priority -pretentious,proper,foster,1,critical,inconv,problematic,priority,priority -pretentious,proper,foster,1,critical,inconv,problematic,not_recom,not_recom -pretentious,proper,foster,2,convenient,convenient,nonprob,recommended,very_recom -pretentious,proper,foster,2,convenient,convenient,nonprob,priority,priority -pretentious,proper,foster,2,convenient,convenient,nonprob,not_recom,not_recom -pretentious,proper,foster,2,convenient,convenient,slightly_prob,recommended,very_recom -pretentious,proper,foster,2,convenient,convenient,slightly_prob,priority,priority -pretentious,proper,foster,2,convenient,convenient,slightly_prob,not_recom,not_recom -pretentious,proper,foster,2,convenient,convenient,problematic,recommended,priority 
-pretentious,proper,foster,2,convenient,convenient,problematic,priority,priority -pretentious,proper,foster,2,convenient,convenient,problematic,not_recom,not_recom -pretentious,proper,foster,2,convenient,inconv,nonprob,recommended,priority -pretentious,proper,foster,2,convenient,inconv,nonprob,priority,priority -pretentious,proper,foster,2,convenient,inconv,nonprob,not_recom,not_recom -pretentious,proper,foster,2,convenient,inconv,slightly_prob,recommended,priority -pretentious,proper,foster,2,convenient,inconv,slightly_prob,priority,priority -pretentious,proper,foster,2,convenient,inconv,slightly_prob,not_recom,not_recom -pretentious,proper,foster,2,convenient,inconv,problematic,recommended,priority -pretentious,proper,foster,2,convenient,inconv,problematic,priority,priority -pretentious,proper,foster,2,convenient,inconv,problematic,not_recom,not_recom -pretentious,proper,foster,2,less_conv,convenient,nonprob,recommended,priority -pretentious,proper,foster,2,less_conv,convenient,nonprob,priority,priority -pretentious,proper,foster,2,less_conv,convenient,nonprob,not_recom,not_recom -pretentious,proper,foster,2,less_conv,convenient,slightly_prob,recommended,priority -pretentious,proper,foster,2,less_conv,convenient,slightly_prob,priority,priority -pretentious,proper,foster,2,less_conv,convenient,slightly_prob,not_recom,not_recom -pretentious,proper,foster,2,less_conv,convenient,problematic,recommended,priority -pretentious,proper,foster,2,less_conv,convenient,problematic,priority,priority -pretentious,proper,foster,2,less_conv,convenient,problematic,not_recom,not_recom -pretentious,proper,foster,2,less_conv,inconv,nonprob,recommended,priority -pretentious,proper,foster,2,less_conv,inconv,nonprob,priority,priority -pretentious,proper,foster,2,less_conv,inconv,nonprob,not_recom,not_recom -pretentious,proper,foster,2,less_conv,inconv,slightly_prob,recommended,priority -pretentious,proper,foster,2,less_conv,inconv,slightly_prob,priority,priority 
-pretentious,proper,foster,2,less_conv,inconv,slightly_prob,not_recom,not_recom -pretentious,proper,foster,2,less_conv,inconv,problematic,recommended,priority -pretentious,proper,foster,2,less_conv,inconv,problematic,priority,priority -pretentious,proper,foster,2,less_conv,inconv,problematic,not_recom,not_recom -pretentious,proper,foster,2,critical,convenient,nonprob,recommended,priority -pretentious,proper,foster,2,critical,convenient,nonprob,priority,priority -pretentious,proper,foster,2,critical,convenient,nonprob,not_recom,not_recom -pretentious,proper,foster,2,critical,convenient,slightly_prob,recommended,priority -pretentious,proper,foster,2,critical,convenient,slightly_prob,priority,priority -pretentious,proper,foster,2,critical,convenient,slightly_prob,not_recom,not_recom -pretentious,proper,foster,2,critical,convenient,problematic,recommended,priority -pretentious,proper,foster,2,critical,convenient,problematic,priority,priority -pretentious,proper,foster,2,critical,convenient,problematic,not_recom,not_recom -pretentious,proper,foster,2,critical,inconv,nonprob,recommended,priority -pretentious,proper,foster,2,critical,inconv,nonprob,priority,priority -pretentious,proper,foster,2,critical,inconv,nonprob,not_recom,not_recom -pretentious,proper,foster,2,critical,inconv,slightly_prob,recommended,priority -pretentious,proper,foster,2,critical,inconv,slightly_prob,priority,priority -pretentious,proper,foster,2,critical,inconv,slightly_prob,not_recom,not_recom -pretentious,proper,foster,2,critical,inconv,problematic,recommended,priority -pretentious,proper,foster,2,critical,inconv,problematic,priority,priority -pretentious,proper,foster,2,critical,inconv,problematic,not_recom,not_recom -pretentious,proper,foster,3,convenient,convenient,nonprob,recommended,very_recom -pretentious,proper,foster,3,convenient,convenient,nonprob,priority,priority -pretentious,proper,foster,3,convenient,convenient,nonprob,not_recom,not_recom 
-pretentious,proper,foster,3,convenient,convenient,slightly_prob,recommended,very_recom -pretentious,proper,foster,3,convenient,convenient,slightly_prob,priority,priority -pretentious,proper,foster,3,convenient,convenient,slightly_prob,not_recom,not_recom -pretentious,proper,foster,3,convenient,convenient,problematic,recommended,priority -pretentious,proper,foster,3,convenient,convenient,problematic,priority,priority -pretentious,proper,foster,3,convenient,convenient,problematic,not_recom,not_recom -pretentious,proper,foster,3,convenient,inconv,nonprob,recommended,priority -pretentious,proper,foster,3,convenient,inconv,nonprob,priority,priority -pretentious,proper,foster,3,convenient,inconv,nonprob,not_recom,not_recom -pretentious,proper,foster,3,convenient,inconv,slightly_prob,recommended,priority -pretentious,proper,foster,3,convenient,inconv,slightly_prob,priority,priority -pretentious,proper,foster,3,convenient,inconv,slightly_prob,not_recom,not_recom -pretentious,proper,foster,3,convenient,inconv,problematic,recommended,priority -pretentious,proper,foster,3,convenient,inconv,problematic,priority,priority -pretentious,proper,foster,3,convenient,inconv,problematic,not_recom,not_recom -pretentious,proper,foster,3,less_conv,convenient,nonprob,recommended,priority -pretentious,proper,foster,3,less_conv,convenient,nonprob,priority,priority -pretentious,proper,foster,3,less_conv,convenient,nonprob,not_recom,not_recom -pretentious,proper,foster,3,less_conv,convenient,slightly_prob,recommended,priority -pretentious,proper,foster,3,less_conv,convenient,slightly_prob,priority,priority -pretentious,proper,foster,3,less_conv,convenient,slightly_prob,not_recom,not_recom -pretentious,proper,foster,3,less_conv,convenient,problematic,recommended,priority -pretentious,proper,foster,3,less_conv,convenient,problematic,priority,priority -pretentious,proper,foster,3,less_conv,convenient,problematic,not_recom,not_recom 
-pretentious,proper,foster,3,less_conv,inconv,nonprob,recommended,priority -pretentious,proper,foster,3,less_conv,inconv,nonprob,priority,priority -pretentious,proper,foster,3,less_conv,inconv,nonprob,not_recom,not_recom -pretentious,proper,foster,3,less_conv,inconv,slightly_prob,recommended,priority -pretentious,proper,foster,3,less_conv,inconv,slightly_prob,priority,priority -pretentious,proper,foster,3,less_conv,inconv,slightly_prob,not_recom,not_recom -pretentious,proper,foster,3,less_conv,inconv,problematic,recommended,priority -pretentious,proper,foster,3,less_conv,inconv,problematic,priority,priority -pretentious,proper,foster,3,less_conv,inconv,problematic,not_recom,not_recom -pretentious,proper,foster,3,critical,convenient,nonprob,recommended,priority -pretentious,proper,foster,3,critical,convenient,nonprob,priority,priority -pretentious,proper,foster,3,critical,convenient,nonprob,not_recom,not_recom -pretentious,proper,foster,3,critical,convenient,slightly_prob,recommended,priority -pretentious,proper,foster,3,critical,convenient,slightly_prob,priority,priority -pretentious,proper,foster,3,critical,convenient,slightly_prob,not_recom,not_recom -pretentious,proper,foster,3,critical,convenient,problematic,recommended,priority -pretentious,proper,foster,3,critical,convenient,problematic,priority,priority -pretentious,proper,foster,3,critical,convenient,problematic,not_recom,not_recom -pretentious,proper,foster,3,critical,inconv,nonprob,recommended,priority -pretentious,proper,foster,3,critical,inconv,nonprob,priority,priority -pretentious,proper,foster,3,critical,inconv,nonprob,not_recom,not_recom -pretentious,proper,foster,3,critical,inconv,slightly_prob,recommended,priority -pretentious,proper,foster,3,critical,inconv,slightly_prob,priority,priority -pretentious,proper,foster,3,critical,inconv,slightly_prob,not_recom,not_recom -pretentious,proper,foster,3,critical,inconv,problematic,recommended,priority 
-pretentious,proper,foster,3,critical,inconv,problematic,priority,priority -pretentious,proper,foster,3,critical,inconv,problematic,not_recom,not_recom -pretentious,proper,foster,more,convenient,convenient,nonprob,recommended,very_recom -pretentious,proper,foster,more,convenient,convenient,nonprob,priority,priority -pretentious,proper,foster,more,convenient,convenient,nonprob,not_recom,not_recom -pretentious,proper,foster,more,convenient,convenient,slightly_prob,recommended,very_recom -pretentious,proper,foster,more,convenient,convenient,slightly_prob,priority,priority -pretentious,proper,foster,more,convenient,convenient,slightly_prob,not_recom,not_recom -pretentious,proper,foster,more,convenient,convenient,problematic,recommended,priority -pretentious,proper,foster,more,convenient,convenient,problematic,priority,priority -pretentious,proper,foster,more,convenient,convenient,problematic,not_recom,not_recom -pretentious,proper,foster,more,convenient,inconv,nonprob,recommended,priority -pretentious,proper,foster,more,convenient,inconv,nonprob,priority,priority -pretentious,proper,foster,more,convenient,inconv,nonprob,not_recom,not_recom -pretentious,proper,foster,more,convenient,inconv,slightly_prob,recommended,priority -pretentious,proper,foster,more,convenient,inconv,slightly_prob,priority,priority -pretentious,proper,foster,more,convenient,inconv,slightly_prob,not_recom,not_recom -pretentious,proper,foster,more,convenient,inconv,problematic,recommended,priority -pretentious,proper,foster,more,convenient,inconv,problematic,priority,priority -pretentious,proper,foster,more,convenient,inconv,problematic,not_recom,not_recom -pretentious,proper,foster,more,less_conv,convenient,nonprob,recommended,priority -pretentious,proper,foster,more,less_conv,convenient,nonprob,priority,priority -pretentious,proper,foster,more,less_conv,convenient,nonprob,not_recom,not_recom -pretentious,proper,foster,more,less_conv,convenient,slightly_prob,recommended,priority 
-pretentious,proper,foster,more,less_conv,convenient,slightly_prob,priority,priority -pretentious,proper,foster,more,less_conv,convenient,slightly_prob,not_recom,not_recom -pretentious,proper,foster,more,less_conv,convenient,problematic,recommended,priority -pretentious,proper,foster,more,less_conv,convenient,problematic,priority,priority -pretentious,proper,foster,more,less_conv,convenient,problematic,not_recom,not_recom -pretentious,proper,foster,more,less_conv,inconv,nonprob,recommended,priority -pretentious,proper,foster,more,less_conv,inconv,nonprob,priority,priority -pretentious,proper,foster,more,less_conv,inconv,nonprob,not_recom,not_recom -pretentious,proper,foster,more,less_conv,inconv,slightly_prob,recommended,priority -pretentious,proper,foster,more,less_conv,inconv,slightly_prob,priority,priority -pretentious,proper,foster,more,less_conv,inconv,slightly_prob,not_recom,not_recom -pretentious,proper,foster,more,less_conv,inconv,problematic,recommended,priority -pretentious,proper,foster,more,less_conv,inconv,problematic,priority,priority -pretentious,proper,foster,more,less_conv,inconv,problematic,not_recom,not_recom -pretentious,proper,foster,more,critical,convenient,nonprob,recommended,priority -pretentious,proper,foster,more,critical,convenient,nonprob,priority,priority -pretentious,proper,foster,more,critical,convenient,nonprob,not_recom,not_recom -pretentious,proper,foster,more,critical,convenient,slightly_prob,recommended,priority -pretentious,proper,foster,more,critical,convenient,slightly_prob,priority,priority -pretentious,proper,foster,more,critical,convenient,slightly_prob,not_recom,not_recom -pretentious,proper,foster,more,critical,convenient,problematic,recommended,priority -pretentious,proper,foster,more,critical,convenient,problematic,priority,priority -pretentious,proper,foster,more,critical,convenient,problematic,not_recom,not_recom -pretentious,proper,foster,more,critical,inconv,nonprob,recommended,priority 
-pretentious,proper,foster,more,critical,inconv,nonprob,priority,priority -pretentious,proper,foster,more,critical,inconv,nonprob,not_recom,not_recom -pretentious,proper,foster,more,critical,inconv,slightly_prob,recommended,priority -pretentious,proper,foster,more,critical,inconv,slightly_prob,priority,priority -pretentious,proper,foster,more,critical,inconv,slightly_prob,not_recom,not_recom -pretentious,proper,foster,more,critical,inconv,problematic,recommended,priority -pretentious,proper,foster,more,critical,inconv,problematic,priority,priority -pretentious,proper,foster,more,critical,inconv,problematic,not_recom,not_recom -pretentious,less_proper,complete,1,convenient,convenient,nonprob,recommended,very_recom -pretentious,less_proper,complete,1,convenient,convenient,nonprob,priority,priority -pretentious,less_proper,complete,1,convenient,convenient,nonprob,not_recom,not_recom -pretentious,less_proper,complete,1,convenient,convenient,slightly_prob,recommended,very_recom -pretentious,less_proper,complete,1,convenient,convenient,slightly_prob,priority,priority -pretentious,less_proper,complete,1,convenient,convenient,slightly_prob,not_recom,not_recom -pretentious,less_proper,complete,1,convenient,convenient,problematic,recommended,priority -pretentious,less_proper,complete,1,convenient,convenient,problematic,priority,priority -pretentious,less_proper,complete,1,convenient,convenient,problematic,not_recom,not_recom -pretentious,less_proper,complete,1,convenient,inconv,nonprob,recommended,very_recom -pretentious,less_proper,complete,1,convenient,inconv,nonprob,priority,priority -pretentious,less_proper,complete,1,convenient,inconv,nonprob,not_recom,not_recom -pretentious,less_proper,complete,1,convenient,inconv,slightly_prob,recommended,very_recom -pretentious,less_proper,complete,1,convenient,inconv,slightly_prob,priority,priority -pretentious,less_proper,complete,1,convenient,inconv,slightly_prob,not_recom,not_recom 
-pretentious,less_proper,complete,1,convenient,inconv,problematic,recommended,priority -pretentious,less_proper,complete,1,convenient,inconv,problematic,priority,priority -pretentious,less_proper,complete,1,convenient,inconv,problematic,not_recom,not_recom -pretentious,less_proper,complete,1,less_conv,convenient,nonprob,recommended,very_recom -pretentious,less_proper,complete,1,less_conv,convenient,nonprob,priority,priority -pretentious,less_proper,complete,1,less_conv,convenient,nonprob,not_recom,not_recom -pretentious,less_proper,complete,1,less_conv,convenient,slightly_prob,recommended,very_recom -pretentious,less_proper,complete,1,less_conv,convenient,slightly_prob,priority,priority -pretentious,less_proper,complete,1,less_conv,convenient,slightly_prob,not_recom,not_recom -pretentious,less_proper,complete,1,less_conv,convenient,problematic,recommended,priority -pretentious,less_proper,complete,1,less_conv,convenient,problematic,priority,priority -pretentious,less_proper,complete,1,less_conv,convenient,problematic,not_recom,not_recom -pretentious,less_proper,complete,1,less_conv,inconv,nonprob,recommended,very_recom -pretentious,less_proper,complete,1,less_conv,inconv,nonprob,priority,priority -pretentious,less_proper,complete,1,less_conv,inconv,nonprob,not_recom,not_recom -pretentious,less_proper,complete,1,less_conv,inconv,slightly_prob,recommended,very_recom -pretentious,less_proper,complete,1,less_conv,inconv,slightly_prob,priority,priority -pretentious,less_proper,complete,1,less_conv,inconv,slightly_prob,not_recom,not_recom -pretentious,less_proper,complete,1,less_conv,inconv,problematic,recommended,priority -pretentious,less_proper,complete,1,less_conv,inconv,problematic,priority,priority -pretentious,less_proper,complete,1,less_conv,inconv,problematic,not_recom,not_recom -pretentious,less_proper,complete,1,critical,convenient,nonprob,recommended,very_recom -pretentious,less_proper,complete,1,critical,convenient,nonprob,priority,priority 
-pretentious,less_proper,complete,1,critical,convenient,nonprob,not_recom,not_recom -pretentious,less_proper,complete,1,critical,convenient,slightly_prob,recommended,very_recom -pretentious,less_proper,complete,1,critical,convenient,slightly_prob,priority,priority -pretentious,less_proper,complete,1,critical,convenient,slightly_prob,not_recom,not_recom -pretentious,less_proper,complete,1,critical,convenient,problematic,recommended,priority -pretentious,less_proper,complete,1,critical,convenient,problematic,priority,priority -pretentious,less_proper,complete,1,critical,convenient,problematic,not_recom,not_recom -pretentious,less_proper,complete,1,critical,inconv,nonprob,recommended,very_recom -pretentious,less_proper,complete,1,critical,inconv,nonprob,priority,priority -pretentious,less_proper,complete,1,critical,inconv,nonprob,not_recom,not_recom -pretentious,less_proper,complete,1,critical,inconv,slightly_prob,recommended,very_recom -pretentious,less_proper,complete,1,critical,inconv,slightly_prob,priority,priority -pretentious,less_proper,complete,1,critical,inconv,slightly_prob,not_recom,not_recom -pretentious,less_proper,complete,1,critical,inconv,problematic,recommended,priority -pretentious,less_proper,complete,1,critical,inconv,problematic,priority,priority -pretentious,less_proper,complete,1,critical,inconv,problematic,not_recom,not_recom -pretentious,less_proper,complete,2,convenient,convenient,nonprob,recommended,very_recom -pretentious,less_proper,complete,2,convenient,convenient,nonprob,priority,priority -pretentious,less_proper,complete,2,convenient,convenient,nonprob,not_recom,not_recom -pretentious,less_proper,complete,2,convenient,convenient,slightly_prob,recommended,very_recom -pretentious,less_proper,complete,2,convenient,convenient,slightly_prob,priority,priority -pretentious,less_proper,complete,2,convenient,convenient,slightly_prob,not_recom,not_recom -pretentious,less_proper,complete,2,convenient,convenient,problematic,recommended,priority 
-pretentious,less_proper,complete,2,convenient,convenient,problematic,priority,priority -pretentious,less_proper,complete,2,convenient,convenient,problematic,not_recom,not_recom -pretentious,less_proper,complete,2,convenient,inconv,nonprob,recommended,very_recom -pretentious,less_proper,complete,2,convenient,inconv,nonprob,priority,priority -pretentious,less_proper,complete,2,convenient,inconv,nonprob,not_recom,not_recom -pretentious,less_proper,complete,2,convenient,inconv,slightly_prob,recommended,very_recom -pretentious,less_proper,complete,2,convenient,inconv,slightly_prob,priority,priority -pretentious,less_proper,complete,2,convenient,inconv,slightly_prob,not_recom,not_recom -pretentious,less_proper,complete,2,convenient,inconv,problematic,recommended,priority -pretentious,less_proper,complete,2,convenient,inconv,problematic,priority,priority -pretentious,less_proper,complete,2,convenient,inconv,problematic,not_recom,not_recom -pretentious,less_proper,complete,2,less_conv,convenient,nonprob,recommended,very_recom -pretentious,less_proper,complete,2,less_conv,convenient,nonprob,priority,priority -pretentious,less_proper,complete,2,less_conv,convenient,nonprob,not_recom,not_recom -pretentious,less_proper,complete,2,less_conv,convenient,slightly_prob,recommended,very_recom -pretentious,less_proper,complete,2,less_conv,convenient,slightly_prob,priority,priority -pretentious,less_proper,complete,2,less_conv,convenient,slightly_prob,not_recom,not_recom -pretentious,less_proper,complete,2,less_conv,convenient,problematic,recommended,priority -pretentious,less_proper,complete,2,less_conv,convenient,problematic,priority,priority -pretentious,less_proper,complete,2,less_conv,convenient,problematic,not_recom,not_recom -pretentious,less_proper,complete,2,less_conv,inconv,nonprob,recommended,very_recom -pretentious,less_proper,complete,2,less_conv,inconv,nonprob,priority,priority -pretentious,less_proper,complete,2,less_conv,inconv,nonprob,not_recom,not_recom 
-pretentious,less_proper,complete,2,less_conv,inconv,slightly_prob,recommended,very_recom -pretentious,less_proper,complete,2,less_conv,inconv,slightly_prob,priority,priority -pretentious,less_proper,complete,2,less_conv,inconv,slightly_prob,not_recom,not_recom -pretentious,less_proper,complete,2,less_conv,inconv,problematic,recommended,priority -pretentious,less_proper,complete,2,less_conv,inconv,problematic,priority,priority -pretentious,less_proper,complete,2,less_conv,inconv,problematic,not_recom,not_recom -pretentious,less_proper,complete,2,critical,convenient,nonprob,recommended,priority -pretentious,less_proper,complete,2,critical,convenient,nonprob,priority,priority -pretentious,less_proper,complete,2,critical,convenient,nonprob,not_recom,not_recom -pretentious,less_proper,complete,2,critical,convenient,slightly_prob,recommended,priority -pretentious,less_proper,complete,2,critical,convenient,slightly_prob,priority,priority -pretentious,less_proper,complete,2,critical,convenient,slightly_prob,not_recom,not_recom -pretentious,less_proper,complete,2,critical,convenient,problematic,recommended,priority -pretentious,less_proper,complete,2,critical,convenient,problematic,priority,priority -pretentious,less_proper,complete,2,critical,convenient,problematic,not_recom,not_recom -pretentious,less_proper,complete,2,critical,inconv,nonprob,recommended,priority -pretentious,less_proper,complete,2,critical,inconv,nonprob,priority,priority -pretentious,less_proper,complete,2,critical,inconv,nonprob,not_recom,not_recom -pretentious,less_proper,complete,2,critical,inconv,slightly_prob,recommended,priority -pretentious,less_proper,complete,2,critical,inconv,slightly_prob,priority,priority -pretentious,less_proper,complete,2,critical,inconv,slightly_prob,not_recom,not_recom -pretentious,less_proper,complete,2,critical,inconv,problematic,recommended,priority -pretentious,less_proper,complete,2,critical,inconv,problematic,priority,priority 
-pretentious,less_proper,complete,2,critical,inconv,problematic,not_recom,not_recom -pretentious,less_proper,complete,3,convenient,convenient,nonprob,recommended,very_recom -pretentious,less_proper,complete,3,convenient,convenient,nonprob,priority,priority -pretentious,less_proper,complete,3,convenient,convenient,nonprob,not_recom,not_recom -pretentious,less_proper,complete,3,convenient,convenient,slightly_prob,recommended,very_recom -pretentious,less_proper,complete,3,convenient,convenient,slightly_prob,priority,priority -pretentious,less_proper,complete,3,convenient,convenient,slightly_prob,not_recom,not_recom -pretentious,less_proper,complete,3,convenient,convenient,problematic,recommended,priority -pretentious,less_proper,complete,3,convenient,convenient,problematic,priority,priority -pretentious,less_proper,complete,3,convenient,convenient,problematic,not_recom,not_recom -pretentious,less_proper,complete,3,convenient,inconv,nonprob,recommended,priority -pretentious,less_proper,complete,3,convenient,inconv,nonprob,priority,priority -pretentious,less_proper,complete,3,convenient,inconv,nonprob,not_recom,not_recom -pretentious,less_proper,complete,3,convenient,inconv,slightly_prob,recommended,priority -pretentious,less_proper,complete,3,convenient,inconv,slightly_prob,priority,priority -pretentious,less_proper,complete,3,convenient,inconv,slightly_prob,not_recom,not_recom -pretentious,less_proper,complete,3,convenient,inconv,problematic,recommended,priority -pretentious,less_proper,complete,3,convenient,inconv,problematic,priority,priority -pretentious,less_proper,complete,3,convenient,inconv,problematic,not_recom,not_recom -pretentious,less_proper,complete,3,less_conv,convenient,nonprob,recommended,priority -pretentious,less_proper,complete,3,less_conv,convenient,nonprob,priority,priority -pretentious,less_proper,complete,3,less_conv,convenient,nonprob,not_recom,not_recom -pretentious,less_proper,complete,3,less_conv,convenient,slightly_prob,recommended,priority 
-pretentious,less_proper,complete,3,less_conv,convenient,slightly_prob,priority,priority -pretentious,less_proper,complete,3,less_conv,convenient,slightly_prob,not_recom,not_recom -pretentious,less_proper,complete,3,less_conv,convenient,problematic,recommended,priority -pretentious,less_proper,complete,3,less_conv,convenient,problematic,priority,priority -pretentious,less_proper,complete,3,less_conv,convenient,problematic,not_recom,not_recom -pretentious,less_proper,complete,3,less_conv,inconv,nonprob,recommended,priority -pretentious,less_proper,complete,3,less_conv,inconv,nonprob,priority,priority -pretentious,less_proper,complete,3,less_conv,inconv,nonprob,not_recom,not_recom -pretentious,less_proper,complete,3,less_conv,inconv,slightly_prob,recommended,priority -pretentious,less_proper,complete,3,less_conv,inconv,slightly_prob,priority,priority -pretentious,less_proper,complete,3,less_conv,inconv,slightly_prob,not_recom,not_recom -pretentious,less_proper,complete,3,less_conv,inconv,problematic,recommended,priority -pretentious,less_proper,complete,3,less_conv,inconv,problematic,priority,priority -pretentious,less_proper,complete,3,less_conv,inconv,problematic,not_recom,not_recom -pretentious,less_proper,complete,3,critical,convenient,nonprob,recommended,priority -pretentious,less_proper,complete,3,critical,convenient,nonprob,priority,priority -pretentious,less_proper,complete,3,critical,convenient,nonprob,not_recom,not_recom -pretentious,less_proper,complete,3,critical,convenient,slightly_prob,recommended,priority -pretentious,less_proper,complete,3,critical,convenient,slightly_prob,priority,priority -pretentious,less_proper,complete,3,critical,convenient,slightly_prob,not_recom,not_recom -pretentious,less_proper,complete,3,critical,convenient,problematic,recommended,priority -pretentious,less_proper,complete,3,critical,convenient,problematic,priority,priority -pretentious,less_proper,complete,3,critical,convenient,problematic,not_recom,not_recom 
-pretentious,less_proper,complete,3,critical,inconv,nonprob,recommended,priority -pretentious,less_proper,complete,3,critical,inconv,nonprob,priority,priority -pretentious,less_proper,complete,3,critical,inconv,nonprob,not_recom,not_recom -pretentious,less_proper,complete,3,critical,inconv,slightly_prob,recommended,priority -pretentious,less_proper,complete,3,critical,inconv,slightly_prob,priority,priority -pretentious,less_proper,complete,3,critical,inconv,slightly_prob,not_recom,not_recom -pretentious,less_proper,complete,3,critical,inconv,problematic,recommended,priority -pretentious,less_proper,complete,3,critical,inconv,problematic,priority,priority -pretentious,less_proper,complete,3,critical,inconv,problematic,not_recom,not_recom -pretentious,less_proper,complete,more,convenient,convenient,nonprob,recommended,very_recom -pretentious,less_proper,complete,more,convenient,convenient,nonprob,priority,priority -pretentious,less_proper,complete,more,convenient,convenient,nonprob,not_recom,not_recom -pretentious,less_proper,complete,more,convenient,convenient,slightly_prob,recommended,very_recom -pretentious,less_proper,complete,more,convenient,convenient,slightly_prob,priority,priority -pretentious,less_proper,complete,more,convenient,convenient,slightly_prob,not_recom,not_recom -pretentious,less_proper,complete,more,convenient,convenient,problematic,recommended,priority -pretentious,less_proper,complete,more,convenient,convenient,problematic,priority,priority -pretentious,less_proper,complete,more,convenient,convenient,problematic,not_recom,not_recom -pretentious,less_proper,complete,more,convenient,inconv,nonprob,recommended,priority -pretentious,less_proper,complete,more,convenient,inconv,nonprob,priority,priority -pretentious,less_proper,complete,more,convenient,inconv,nonprob,not_recom,not_recom -pretentious,less_proper,complete,more,convenient,inconv,slightly_prob,recommended,priority 
-pretentious,less_proper,complete,more,convenient,inconv,slightly_prob,priority,priority -pretentious,less_proper,complete,more,convenient,inconv,slightly_prob,not_recom,not_recom -pretentious,less_proper,complete,more,convenient,inconv,problematic,recommended,priority -pretentious,less_proper,complete,more,convenient,inconv,problematic,priority,priority -pretentious,less_proper,complete,more,convenient,inconv,problematic,not_recom,not_recom -pretentious,less_proper,complete,more,less_conv,convenient,nonprob,recommended,priority -pretentious,less_proper,complete,more,less_conv,convenient,nonprob,priority,priority -pretentious,less_proper,complete,more,less_conv,convenient,nonprob,not_recom,not_recom -pretentious,less_proper,complete,more,less_conv,convenient,slightly_prob,recommended,priority -pretentious,less_proper,complete,more,less_conv,convenient,slightly_prob,priority,priority -pretentious,less_proper,complete,more,less_conv,convenient,slightly_prob,not_recom,not_recom -pretentious,less_proper,complete,more,less_conv,convenient,problematic,recommended,priority -pretentious,less_proper,complete,more,less_conv,convenient,problematic,priority,priority -pretentious,less_proper,complete,more,less_conv,convenient,problematic,not_recom,not_recom -pretentious,less_proper,complete,more,less_conv,inconv,nonprob,recommended,priority -pretentious,less_proper,complete,more,less_conv,inconv,nonprob,priority,priority -pretentious,less_proper,complete,more,less_conv,inconv,nonprob,not_recom,not_recom -pretentious,less_proper,complete,more,less_conv,inconv,slightly_prob,recommended,priority -pretentious,less_proper,complete,more,less_conv,inconv,slightly_prob,priority,priority -pretentious,less_proper,complete,more,less_conv,inconv,slightly_prob,not_recom,not_recom -pretentious,less_proper,complete,more,less_conv,inconv,problematic,recommended,priority -pretentious,less_proper,complete,more,less_conv,inconv,problematic,priority,priority 
-pretentious,less_proper,complete,more,less_conv,inconv,problematic,not_recom,not_recom -pretentious,less_proper,complete,more,critical,convenient,nonprob,recommended,priority -pretentious,less_proper,complete,more,critical,convenient,nonprob,priority,priority -pretentious,less_proper,complete,more,critical,convenient,nonprob,not_recom,not_recom -pretentious,less_proper,complete,more,critical,convenient,slightly_prob,recommended,priority -pretentious,less_proper,complete,more,critical,convenient,slightly_prob,priority,priority -pretentious,less_proper,complete,more,critical,convenient,slightly_prob,not_recom,not_recom -pretentious,less_proper,complete,more,critical,convenient,problematic,recommended,priority -pretentious,less_proper,complete,more,critical,convenient,problematic,priority,priority -pretentious,less_proper,complete,more,critical,convenient,problematic,not_recom,not_recom -pretentious,less_proper,complete,more,critical,inconv,nonprob,recommended,priority -pretentious,less_proper,complete,more,critical,inconv,nonprob,priority,priority -pretentious,less_proper,complete,more,critical,inconv,nonprob,not_recom,not_recom -pretentious,less_proper,complete,more,critical,inconv,slightly_prob,recommended,priority -pretentious,less_proper,complete,more,critical,inconv,slightly_prob,priority,priority -pretentious,less_proper,complete,more,critical,inconv,slightly_prob,not_recom,not_recom -pretentious,less_proper,complete,more,critical,inconv,problematic,recommended,priority -pretentious,less_proper,complete,more,critical,inconv,problematic,priority,priority -pretentious,less_proper,complete,more,critical,inconv,problematic,not_recom,not_recom -pretentious,less_proper,completed,1,convenient,convenient,nonprob,recommended,very_recom -pretentious,less_proper,completed,1,convenient,convenient,nonprob,priority,priority -pretentious,less_proper,completed,1,convenient,convenient,nonprob,not_recom,not_recom 
-pretentious,less_proper,completed,1,convenient,convenient,slightly_prob,recommended,very_recom -pretentious,less_proper,completed,1,convenient,convenient,slightly_prob,priority,priority -pretentious,less_proper,completed,1,convenient,convenient,slightly_prob,not_recom,not_recom -pretentious,less_proper,completed,1,convenient,convenient,problematic,recommended,priority -pretentious,less_proper,completed,1,convenient,convenient,problematic,priority,priority -pretentious,less_proper,completed,1,convenient,convenient,problematic,not_recom,not_recom -pretentious,less_proper,completed,1,convenient,inconv,nonprob,recommended,very_recom -pretentious,less_proper,completed,1,convenient,inconv,nonprob,priority,priority -pretentious,less_proper,completed,1,convenient,inconv,nonprob,not_recom,not_recom -pretentious,less_proper,completed,1,convenient,inconv,slightly_prob,recommended,very_recom -pretentious,less_proper,completed,1,convenient,inconv,slightly_prob,priority,priority -pretentious,less_proper,completed,1,convenient,inconv,slightly_prob,not_recom,not_recom -pretentious,less_proper,completed,1,convenient,inconv,problematic,recommended,priority -pretentious,less_proper,completed,1,convenient,inconv,problematic,priority,priority -pretentious,less_proper,completed,1,convenient,inconv,problematic,not_recom,not_recom -pretentious,less_proper,completed,1,less_conv,convenient,nonprob,recommended,very_recom -pretentious,less_proper,completed,1,less_conv,convenient,nonprob,priority,priority -pretentious,less_proper,completed,1,less_conv,convenient,nonprob,not_recom,not_recom -pretentious,less_proper,completed,1,less_conv,convenient,slightly_prob,recommended,very_recom -pretentious,less_proper,completed,1,less_conv,convenient,slightly_prob,priority,priority -pretentious,less_proper,completed,1,less_conv,convenient,slightly_prob,not_recom,not_recom -pretentious,less_proper,completed,1,less_conv,convenient,problematic,recommended,priority 
-pretentious,less_proper,completed,1,less_conv,convenient,problematic,priority,priority -pretentious,less_proper,completed,1,less_conv,convenient,problematic,not_recom,not_recom -pretentious,less_proper,completed,1,less_conv,inconv,nonprob,recommended,very_recom -pretentious,less_proper,completed,1,less_conv,inconv,nonprob,priority,priority -pretentious,less_proper,completed,1,less_conv,inconv,nonprob,not_recom,not_recom -pretentious,less_proper,completed,1,less_conv,inconv,slightly_prob,recommended,very_recom -pretentious,less_proper,completed,1,less_conv,inconv,slightly_prob,priority,priority -pretentious,less_proper,completed,1,less_conv,inconv,slightly_prob,not_recom,not_recom -pretentious,less_proper,completed,1,less_conv,inconv,problematic,recommended,priority -pretentious,less_proper,completed,1,less_conv,inconv,problematic,priority,priority -pretentious,less_proper,completed,1,less_conv,inconv,problematic,not_recom,not_recom -pretentious,less_proper,completed,1,critical,convenient,nonprob,recommended,priority -pretentious,less_proper,completed,1,critical,convenient,nonprob,priority,priority -pretentious,less_proper,completed,1,critical,convenient,nonprob,not_recom,not_recom -pretentious,less_proper,completed,1,critical,convenient,slightly_prob,recommended,priority -pretentious,less_proper,completed,1,critical,convenient,slightly_prob,priority,priority -pretentious,less_proper,completed,1,critical,convenient,slightly_prob,not_recom,not_recom -pretentious,less_proper,completed,1,critical,convenient,problematic,recommended,priority -pretentious,less_proper,completed,1,critical,convenient,problematic,priority,priority -pretentious,less_proper,completed,1,critical,convenient,problematic,not_recom,not_recom -pretentious,less_proper,completed,1,critical,inconv,nonprob,recommended,priority -pretentious,less_proper,completed,1,critical,inconv,nonprob,priority,priority -pretentious,less_proper,completed,1,critical,inconv,nonprob,not_recom,not_recom 
-pretentious,less_proper,completed,1,critical,inconv,slightly_prob,recommended,priority -pretentious,less_proper,completed,1,critical,inconv,slightly_prob,priority,priority -pretentious,less_proper,completed,1,critical,inconv,slightly_prob,not_recom,not_recom -pretentious,less_proper,completed,1,critical,inconv,problematic,recommended,priority -pretentious,less_proper,completed,1,critical,inconv,problematic,priority,priority -pretentious,less_proper,completed,1,critical,inconv,problematic,not_recom,not_recom -pretentious,less_proper,completed,2,convenient,convenient,nonprob,recommended,very_recom -pretentious,less_proper,completed,2,convenient,convenient,nonprob,priority,priority -pretentious,less_proper,completed,2,convenient,convenient,nonprob,not_recom,not_recom -pretentious,less_proper,completed,2,convenient,convenient,slightly_prob,recommended,very_recom -pretentious,less_proper,completed,2,convenient,convenient,slightly_prob,priority,priority -pretentious,less_proper,completed,2,convenient,convenient,slightly_prob,not_recom,not_recom -pretentious,less_proper,completed,2,convenient,convenient,problematic,recommended,priority -pretentious,less_proper,completed,2,convenient,convenient,problematic,priority,priority -pretentious,less_proper,completed,2,convenient,convenient,problematic,not_recom,not_recom -pretentious,less_proper,completed,2,convenient,inconv,nonprob,recommended,very_recom -pretentious,less_proper,completed,2,convenient,inconv,nonprob,priority,priority -pretentious,less_proper,completed,2,convenient,inconv,nonprob,not_recom,not_recom -pretentious,less_proper,completed,2,convenient,inconv,slightly_prob,recommended,very_recom -pretentious,less_proper,completed,2,convenient,inconv,slightly_prob,priority,priority -pretentious,less_proper,completed,2,convenient,inconv,slightly_prob,not_recom,not_recom -pretentious,less_proper,completed,2,convenient,inconv,problematic,recommended,priority 
-pretentious,less_proper,completed,2,convenient,inconv,problematic,priority,priority -pretentious,less_proper,completed,2,convenient,inconv,problematic,not_recom,not_recom -pretentious,less_proper,completed,2,less_conv,convenient,nonprob,recommended,very_recom -pretentious,less_proper,completed,2,less_conv,convenient,nonprob,priority,priority -pretentious,less_proper,completed,2,less_conv,convenient,nonprob,not_recom,not_recom -pretentious,less_proper,completed,2,less_conv,convenient,slightly_prob,recommended,very_recom -pretentious,less_proper,completed,2,less_conv,convenient,slightly_prob,priority,priority -pretentious,less_proper,completed,2,less_conv,convenient,slightly_prob,not_recom,not_recom -pretentious,less_proper,completed,2,less_conv,convenient,problematic,recommended,priority -pretentious,less_proper,completed,2,less_conv,convenient,problematic,priority,priority -pretentious,less_proper,completed,2,less_conv,convenient,problematic,not_recom,not_recom -pretentious,less_proper,completed,2,less_conv,inconv,nonprob,recommended,very_recom -pretentious,less_proper,completed,2,less_conv,inconv,nonprob,priority,priority -pretentious,less_proper,completed,2,less_conv,inconv,nonprob,not_recom,not_recom -pretentious,less_proper,completed,2,less_conv,inconv,slightly_prob,recommended,very_recom -pretentious,less_proper,completed,2,less_conv,inconv,slightly_prob,priority,priority -pretentious,less_proper,completed,2,less_conv,inconv,slightly_prob,not_recom,not_recom -pretentious,less_proper,completed,2,less_conv,inconv,problematic,recommended,priority -pretentious,less_proper,completed,2,less_conv,inconv,problematic,priority,priority -pretentious,less_proper,completed,2,less_conv,inconv,problematic,not_recom,not_recom -pretentious,less_proper,completed,2,critical,convenient,nonprob,recommended,priority -pretentious,less_proper,completed,2,critical,convenient,nonprob,priority,priority -pretentious,less_proper,completed,2,critical,convenient,nonprob,not_recom,not_recom 
-pretentious,less_proper,completed,2,critical,convenient,slightly_prob,recommended,priority -pretentious,less_proper,completed,2,critical,convenient,slightly_prob,priority,priority -pretentious,less_proper,completed,2,critical,convenient,slightly_prob,not_recom,not_recom -pretentious,less_proper,completed,2,critical,convenient,problematic,recommended,priority -pretentious,less_proper,completed,2,critical,convenient,problematic,priority,priority -pretentious,less_proper,completed,2,critical,convenient,problematic,not_recom,not_recom -pretentious,less_proper,completed,2,critical,inconv,nonprob,recommended,priority -pretentious,less_proper,completed,2,critical,inconv,nonprob,priority,priority -pretentious,less_proper,completed,2,critical,inconv,nonprob,not_recom,not_recom -pretentious,less_proper,completed,2,critical,inconv,slightly_prob,recommended,priority -pretentious,less_proper,completed,2,critical,inconv,slightly_prob,priority,priority -pretentious,less_proper,completed,2,critical,inconv,slightly_prob,not_recom,not_recom -pretentious,less_proper,completed,2,critical,inconv,problematic,recommended,priority -pretentious,less_proper,completed,2,critical,inconv,problematic,priority,priority -pretentious,less_proper,completed,2,critical,inconv,problematic,not_recom,not_recom -pretentious,less_proper,completed,3,convenient,convenient,nonprob,recommended,very_recom -pretentious,less_proper,completed,3,convenient,convenient,nonprob,priority,priority -pretentious,less_proper,completed,3,convenient,convenient,nonprob,not_recom,not_recom -pretentious,less_proper,completed,3,convenient,convenient,slightly_prob,recommended,very_recom -pretentious,less_proper,completed,3,convenient,convenient,slightly_prob,priority,priority -pretentious,less_proper,completed,3,convenient,convenient,slightly_prob,not_recom,not_recom -pretentious,less_proper,completed,3,convenient,convenient,problematic,recommended,priority 
-pretentious,less_proper,completed,3,convenient,convenient,problematic,priority,priority -pretentious,less_proper,completed,3,convenient,convenient,problematic,not_recom,not_recom -pretentious,less_proper,completed,3,convenient,inconv,nonprob,recommended,priority -pretentious,less_proper,completed,3,convenient,inconv,nonprob,priority,priority -pretentious,less_proper,completed,3,convenient,inconv,nonprob,not_recom,not_recom -pretentious,less_proper,completed,3,convenient,inconv,slightly_prob,recommended,priority -pretentious,less_proper,completed,3,convenient,inconv,slightly_prob,priority,priority -pretentious,less_proper,completed,3,convenient,inconv,slightly_prob,not_recom,not_recom -pretentious,less_proper,completed,3,convenient,inconv,problematic,recommended,priority -pretentious,less_proper,completed,3,convenient,inconv,problematic,priority,priority -pretentious,less_proper,completed,3,convenient,inconv,problematic,not_recom,not_recom -pretentious,less_proper,completed,3,less_conv,convenient,nonprob,recommended,priority -pretentious,less_proper,completed,3,less_conv,convenient,nonprob,priority,priority -pretentious,less_proper,completed,3,less_conv,convenient,nonprob,not_recom,not_recom -pretentious,less_proper,completed,3,less_conv,convenient,slightly_prob,recommended,priority -pretentious,less_proper,completed,3,less_conv,convenient,slightly_prob,priority,priority -pretentious,less_proper,completed,3,less_conv,convenient,slightly_prob,not_recom,not_recom -pretentious,less_proper,completed,3,less_conv,convenient,problematic,recommended,priority -pretentious,less_proper,completed,3,less_conv,convenient,problematic,priority,priority -pretentious,less_proper,completed,3,less_conv,convenient,problematic,not_recom,not_recom -pretentious,less_proper,completed,3,less_conv,inconv,nonprob,recommended,priority -pretentious,less_proper,completed,3,less_conv,inconv,nonprob,priority,priority -pretentious,less_proper,completed,3,less_conv,inconv,nonprob,not_recom,not_recom 
-pretentious,less_proper,completed,3,less_conv,inconv,slightly_prob,recommended,priority -pretentious,less_proper,completed,3,less_conv,inconv,slightly_prob,priority,priority -pretentious,less_proper,completed,3,less_conv,inconv,slightly_prob,not_recom,not_recom -pretentious,less_proper,completed,3,less_conv,inconv,problematic,recommended,priority -pretentious,less_proper,completed,3,less_conv,inconv,problematic,priority,priority -pretentious,less_proper,completed,3,less_conv,inconv,problematic,not_recom,not_recom -pretentious,less_proper,completed,3,critical,convenient,nonprob,recommended,priority -pretentious,less_proper,completed,3,critical,convenient,nonprob,priority,priority -pretentious,less_proper,completed,3,critical,convenient,nonprob,not_recom,not_recom -pretentious,less_proper,completed,3,critical,convenient,slightly_prob,recommended,priority -pretentious,less_proper,completed,3,critical,convenient,slightly_prob,priority,priority -pretentious,less_proper,completed,3,critical,convenient,slightly_prob,not_recom,not_recom -pretentious,less_proper,completed,3,critical,convenient,problematic,recommended,priority -pretentious,less_proper,completed,3,critical,convenient,problematic,priority,priority -pretentious,less_proper,completed,3,critical,convenient,problematic,not_recom,not_recom -pretentious,less_proper,completed,3,critical,inconv,nonprob,recommended,priority -pretentious,less_proper,completed,3,critical,inconv,nonprob,priority,priority -pretentious,less_proper,completed,3,critical,inconv,nonprob,not_recom,not_recom -pretentious,less_proper,completed,3,critical,inconv,slightly_prob,recommended,priority -pretentious,less_proper,completed,3,critical,inconv,slightly_prob,priority,priority -pretentious,less_proper,completed,3,critical,inconv,slightly_prob,not_recom,not_recom -pretentious,less_proper,completed,3,critical,inconv,problematic,recommended,priority -pretentious,less_proper,completed,3,critical,inconv,problematic,priority,priority 
-pretentious,less_proper,completed,3,critical,inconv,problematic,not_recom,not_recom -pretentious,less_proper,completed,more,convenient,convenient,nonprob,recommended,very_recom -pretentious,less_proper,completed,more,convenient,convenient,nonprob,priority,priority -pretentious,less_proper,completed,more,convenient,convenient,nonprob,not_recom,not_recom -pretentious,less_proper,completed,more,convenient,convenient,slightly_prob,recommended,very_recom -pretentious,less_proper,completed,more,convenient,convenient,slightly_prob,priority,priority -pretentious,less_proper,completed,more,convenient,convenient,slightly_prob,not_recom,not_recom -pretentious,less_proper,completed,more,convenient,convenient,problematic,recommended,priority -pretentious,less_proper,completed,more,convenient,convenient,problematic,priority,priority -pretentious,less_proper,completed,more,convenient,convenient,problematic,not_recom,not_recom -pretentious,less_proper,completed,more,convenient,inconv,nonprob,recommended,priority -pretentious,less_proper,completed,more,convenient,inconv,nonprob,priority,priority -pretentious,less_proper,completed,more,convenient,inconv,nonprob,not_recom,not_recom -pretentious,less_proper,completed,more,convenient,inconv,slightly_prob,recommended,priority -pretentious,less_proper,completed,more,convenient,inconv,slightly_prob,priority,priority -pretentious,less_proper,completed,more,convenient,inconv,slightly_prob,not_recom,not_recom -pretentious,less_proper,completed,more,convenient,inconv,problematic,recommended,priority -pretentious,less_proper,completed,more,convenient,inconv,problematic,priority,priority -pretentious,less_proper,completed,more,convenient,inconv,problematic,not_recom,not_recom -pretentious,less_proper,completed,more,less_conv,convenient,nonprob,recommended,priority -pretentious,less_proper,completed,more,less_conv,convenient,nonprob,priority,priority -pretentious,less_proper,completed,more,less_conv,convenient,nonprob,not_recom,not_recom 
-pretentious,less_proper,completed,more,less_conv,convenient,slightly_prob,recommended,priority -pretentious,less_proper,completed,more,less_conv,convenient,slightly_prob,priority,priority -pretentious,less_proper,completed,more,less_conv,convenient,slightly_prob,not_recom,not_recom -pretentious,less_proper,completed,more,less_conv,convenient,problematic,recommended,priority -pretentious,less_proper,completed,more,less_conv,convenient,problematic,priority,priority -pretentious,less_proper,completed,more,less_conv,convenient,problematic,not_recom,not_recom -pretentious,less_proper,completed,more,less_conv,inconv,nonprob,recommended,priority -pretentious,less_proper,completed,more,less_conv,inconv,nonprob,priority,priority -pretentious,less_proper,completed,more,less_conv,inconv,nonprob,not_recom,not_recom -pretentious,less_proper,completed,more,less_conv,inconv,slightly_prob,recommended,priority -pretentious,less_proper,completed,more,less_conv,inconv,slightly_prob,priority,priority -pretentious,less_proper,completed,more,less_conv,inconv,slightly_prob,not_recom,not_recom -pretentious,less_proper,completed,more,less_conv,inconv,problematic,recommended,priority -pretentious,less_proper,completed,more,less_conv,inconv,problematic,priority,priority -pretentious,less_proper,completed,more,less_conv,inconv,problematic,not_recom,not_recom -pretentious,less_proper,completed,more,critical,convenient,nonprob,recommended,priority -pretentious,less_proper,completed,more,critical,convenient,nonprob,priority,priority -pretentious,less_proper,completed,more,critical,convenient,nonprob,not_recom,not_recom -pretentious,less_proper,completed,more,critical,convenient,slightly_prob,recommended,priority -pretentious,less_proper,completed,more,critical,convenient,slightly_prob,priority,priority -pretentious,less_proper,completed,more,critical,convenient,slightly_prob,not_recom,not_recom -pretentious,less_proper,completed,more,critical,convenient,problematic,recommended,priority 
-pretentious,less_proper,completed,more,critical,convenient,problematic,priority,priority -pretentious,less_proper,completed,more,critical,convenient,problematic,not_recom,not_recom -pretentious,less_proper,completed,more,critical,inconv,nonprob,recommended,priority -pretentious,less_proper,completed,more,critical,inconv,nonprob,priority,priority -pretentious,less_proper,completed,more,critical,inconv,nonprob,not_recom,not_recom -pretentious,less_proper,completed,more,critical,inconv,slightly_prob,recommended,priority -pretentious,less_proper,completed,more,critical,inconv,slightly_prob,priority,priority -pretentious,less_proper,completed,more,critical,inconv,slightly_prob,not_recom,not_recom -pretentious,less_proper,completed,more,critical,inconv,problematic,recommended,priority -pretentious,less_proper,completed,more,critical,inconv,problematic,priority,priority -pretentious,less_proper,completed,more,critical,inconv,problematic,not_recom,not_recom -pretentious,less_proper,incomplete,1,convenient,convenient,nonprob,recommended,very_recom -pretentious,less_proper,incomplete,1,convenient,convenient,nonprob,priority,priority -pretentious,less_proper,incomplete,1,convenient,convenient,nonprob,not_recom,not_recom -pretentious,less_proper,incomplete,1,convenient,convenient,slightly_prob,recommended,very_recom -pretentious,less_proper,incomplete,1,convenient,convenient,slightly_prob,priority,priority -pretentious,less_proper,incomplete,1,convenient,convenient,slightly_prob,not_recom,not_recom -pretentious,less_proper,incomplete,1,convenient,convenient,problematic,recommended,priority -pretentious,less_proper,incomplete,1,convenient,convenient,problematic,priority,priority -pretentious,less_proper,incomplete,1,convenient,convenient,problematic,not_recom,not_recom -pretentious,less_proper,incomplete,1,convenient,inconv,nonprob,recommended,very_recom -pretentious,less_proper,incomplete,1,convenient,inconv,nonprob,priority,priority 
-pretentious,less_proper,incomplete,1,convenient,inconv,nonprob,not_recom,not_recom -pretentious,less_proper,incomplete,1,convenient,inconv,slightly_prob,recommended,very_recom -pretentious,less_proper,incomplete,1,convenient,inconv,slightly_prob,priority,priority -pretentious,less_proper,incomplete,1,convenient,inconv,slightly_prob,not_recom,not_recom -pretentious,less_proper,incomplete,1,convenient,inconv,problematic,recommended,priority -pretentious,less_proper,incomplete,1,convenient,inconv,problematic,priority,priority -pretentious,less_proper,incomplete,1,convenient,inconv,problematic,not_recom,not_recom -pretentious,less_proper,incomplete,1,less_conv,convenient,nonprob,recommended,very_recom -pretentious,less_proper,incomplete,1,less_conv,convenient,nonprob,priority,priority -pretentious,less_proper,incomplete,1,less_conv,convenient,nonprob,not_recom,not_recom -pretentious,less_proper,incomplete,1,less_conv,convenient,slightly_prob,recommended,very_recom -pretentious,less_proper,incomplete,1,less_conv,convenient,slightly_prob,priority,priority -pretentious,less_proper,incomplete,1,less_conv,convenient,slightly_prob,not_recom,not_recom -pretentious,less_proper,incomplete,1,less_conv,convenient,problematic,recommended,priority -pretentious,less_proper,incomplete,1,less_conv,convenient,problematic,priority,priority -pretentious,less_proper,incomplete,1,less_conv,convenient,problematic,not_recom,not_recom -pretentious,less_proper,incomplete,1,less_conv,inconv,nonprob,recommended,very_recom -pretentious,less_proper,incomplete,1,less_conv,inconv,nonprob,priority,priority -pretentious,less_proper,incomplete,1,less_conv,inconv,nonprob,not_recom,not_recom -pretentious,less_proper,incomplete,1,less_conv,inconv,slightly_prob,recommended,very_recom -pretentious,less_proper,incomplete,1,less_conv,inconv,slightly_prob,priority,priority -pretentious,less_proper,incomplete,1,less_conv,inconv,slightly_prob,not_recom,not_recom 
-pretentious,less_proper,incomplete,1,less_conv,inconv,problematic,recommended,priority -pretentious,less_proper,incomplete,1,less_conv,inconv,problematic,priority,priority -pretentious,less_proper,incomplete,1,less_conv,inconv,problematic,not_recom,not_recom -pretentious,less_proper,incomplete,1,critical,convenient,nonprob,recommended,priority -pretentious,less_proper,incomplete,1,critical,convenient,nonprob,priority,priority -pretentious,less_proper,incomplete,1,critical,convenient,nonprob,not_recom,not_recom -pretentious,less_proper,incomplete,1,critical,convenient,slightly_prob,recommended,priority -pretentious,less_proper,incomplete,1,critical,convenient,slightly_prob,priority,priority -pretentious,less_proper,incomplete,1,critical,convenient,slightly_prob,not_recom,not_recom -pretentious,less_proper,incomplete,1,critical,convenient,problematic,recommended,priority -pretentious,less_proper,incomplete,1,critical,convenient,problematic,priority,priority -pretentious,less_proper,incomplete,1,critical,convenient,problematic,not_recom,not_recom -pretentious,less_proper,incomplete,1,critical,inconv,nonprob,recommended,priority -pretentious,less_proper,incomplete,1,critical,inconv,nonprob,priority,priority -pretentious,less_proper,incomplete,1,critical,inconv,nonprob,not_recom,not_recom -pretentious,less_proper,incomplete,1,critical,inconv,slightly_prob,recommended,priority -pretentious,less_proper,incomplete,1,critical,inconv,slightly_prob,priority,priority -pretentious,less_proper,incomplete,1,critical,inconv,slightly_prob,not_recom,not_recom -pretentious,less_proper,incomplete,1,critical,inconv,problematic,recommended,priority -pretentious,less_proper,incomplete,1,critical,inconv,problematic,priority,priority -pretentious,less_proper,incomplete,1,critical,inconv,problematic,not_recom,not_recom -pretentious,less_proper,incomplete,2,convenient,convenient,nonprob,recommended,very_recom 
-pretentious,less_proper,incomplete,2,convenient,convenient,nonprob,priority,priority -pretentious,less_proper,incomplete,2,convenient,convenient,nonprob,not_recom,not_recom -pretentious,less_proper,incomplete,2,convenient,convenient,slightly_prob,recommended,very_recom -pretentious,less_proper,incomplete,2,convenient,convenient,slightly_prob,priority,priority -pretentious,less_proper,incomplete,2,convenient,convenient,slightly_prob,not_recom,not_recom -pretentious,less_proper,incomplete,2,convenient,convenient,problematic,recommended,priority -pretentious,less_proper,incomplete,2,convenient,convenient,problematic,priority,priority -pretentious,less_proper,incomplete,2,convenient,convenient,problematic,not_recom,not_recom -pretentious,less_proper,incomplete,2,convenient,inconv,nonprob,recommended,priority -pretentious,less_proper,incomplete,2,convenient,inconv,nonprob,priority,priority -pretentious,less_proper,incomplete,2,convenient,inconv,nonprob,not_recom,not_recom -pretentious,less_proper,incomplete,2,convenient,inconv,slightly_prob,recommended,priority -pretentious,less_proper,incomplete,2,convenient,inconv,slightly_prob,priority,priority -pretentious,less_proper,incomplete,2,convenient,inconv,slightly_prob,not_recom,not_recom -pretentious,less_proper,incomplete,2,convenient,inconv,problematic,recommended,priority -pretentious,less_proper,incomplete,2,convenient,inconv,problematic,priority,priority -pretentious,less_proper,incomplete,2,convenient,inconv,problematic,not_recom,not_recom -pretentious,less_proper,incomplete,2,less_conv,convenient,nonprob,recommended,priority -pretentious,less_proper,incomplete,2,less_conv,convenient,nonprob,priority,priority -pretentious,less_proper,incomplete,2,less_conv,convenient,nonprob,not_recom,not_recom -pretentious,less_proper,incomplete,2,less_conv,convenient,slightly_prob,recommended,priority -pretentious,less_proper,incomplete,2,less_conv,convenient,slightly_prob,priority,priority 
-pretentious,less_proper,incomplete,2,less_conv,convenient,slightly_prob,not_recom,not_recom -pretentious,less_proper,incomplete,2,less_conv,convenient,problematic,recommended,priority -pretentious,less_proper,incomplete,2,less_conv,convenient,problematic,priority,priority -pretentious,less_proper,incomplete,2,less_conv,convenient,problematic,not_recom,not_recom -pretentious,less_proper,incomplete,2,less_conv,inconv,nonprob,recommended,priority -pretentious,less_proper,incomplete,2,less_conv,inconv,nonprob,priority,priority -pretentious,less_proper,incomplete,2,less_conv,inconv,nonprob,not_recom,not_recom -pretentious,less_proper,incomplete,2,less_conv,inconv,slightly_prob,recommended,priority -pretentious,less_proper,incomplete,2,less_conv,inconv,slightly_prob,priority,priority -pretentious,less_proper,incomplete,2,less_conv,inconv,slightly_prob,not_recom,not_recom -pretentious,less_proper,incomplete,2,less_conv,inconv,problematic,recommended,priority -pretentious,less_proper,incomplete,2,less_conv,inconv,problematic,priority,priority -pretentious,less_proper,incomplete,2,less_conv,inconv,problematic,not_recom,not_recom -pretentious,less_proper,incomplete,2,critical,convenient,nonprob,recommended,priority -pretentious,less_proper,incomplete,2,critical,convenient,nonprob,priority,priority -pretentious,less_proper,incomplete,2,critical,convenient,nonprob,not_recom,not_recom -pretentious,less_proper,incomplete,2,critical,convenient,slightly_prob,recommended,priority -pretentious,less_proper,incomplete,2,critical,convenient,slightly_prob,priority,priority -pretentious,less_proper,incomplete,2,critical,convenient,slightly_prob,not_recom,not_recom -pretentious,less_proper,incomplete,2,critical,convenient,problematic,recommended,priority -pretentious,less_proper,incomplete,2,critical,convenient,problematic,priority,priority -pretentious,less_proper,incomplete,2,critical,convenient,problematic,not_recom,not_recom 
-pretentious,less_proper,incomplete,2,critical,inconv,nonprob,recommended,priority -pretentious,less_proper,incomplete,2,critical,inconv,nonprob,priority,priority -pretentious,less_proper,incomplete,2,critical,inconv,nonprob,not_recom,not_recom -pretentious,less_proper,incomplete,2,critical,inconv,slightly_prob,recommended,priority -pretentious,less_proper,incomplete,2,critical,inconv,slightly_prob,priority,priority -pretentious,less_proper,incomplete,2,critical,inconv,slightly_prob,not_recom,not_recom -pretentious,less_proper,incomplete,2,critical,inconv,problematic,recommended,priority -pretentious,less_proper,incomplete,2,critical,inconv,problematic,priority,priority -pretentious,less_proper,incomplete,2,critical,inconv,problematic,not_recom,not_recom -pretentious,less_proper,incomplete,3,convenient,convenient,nonprob,recommended,very_recom -pretentious,less_proper,incomplete,3,convenient,convenient,nonprob,priority,priority -pretentious,less_proper,incomplete,3,convenient,convenient,nonprob,not_recom,not_recom -pretentious,less_proper,incomplete,3,convenient,convenient,slightly_prob,recommended,very_recom -pretentious,less_proper,incomplete,3,convenient,convenient,slightly_prob,priority,priority -pretentious,less_proper,incomplete,3,convenient,convenient,slightly_prob,not_recom,not_recom -pretentious,less_proper,incomplete,3,convenient,convenient,problematic,recommended,priority -pretentious,less_proper,incomplete,3,convenient,convenient,problematic,priority,priority -pretentious,less_proper,incomplete,3,convenient,convenient,problematic,not_recom,not_recom -pretentious,less_proper,incomplete,3,convenient,inconv,nonprob,recommended,priority -pretentious,less_proper,incomplete,3,convenient,inconv,nonprob,priority,priority -pretentious,less_proper,incomplete,3,convenient,inconv,nonprob,not_recom,not_recom -pretentious,less_proper,incomplete,3,convenient,inconv,slightly_prob,recommended,priority 
-pretentious,less_proper,incomplete,3,convenient,inconv,slightly_prob,priority,priority -pretentious,less_proper,incomplete,3,convenient,inconv,slightly_prob,not_recom,not_recom -pretentious,less_proper,incomplete,3,convenient,inconv,problematic,recommended,priority -pretentious,less_proper,incomplete,3,convenient,inconv,problematic,priority,priority -pretentious,less_proper,incomplete,3,convenient,inconv,problematic,not_recom,not_recom -pretentious,less_proper,incomplete,3,less_conv,convenient,nonprob,recommended,priority -pretentious,less_proper,incomplete,3,less_conv,convenient,nonprob,priority,priority -pretentious,less_proper,incomplete,3,less_conv,convenient,nonprob,not_recom,not_recom -pretentious,less_proper,incomplete,3,less_conv,convenient,slightly_prob,recommended,priority -pretentious,less_proper,incomplete,3,less_conv,convenient,slightly_prob,priority,priority -pretentious,less_proper,incomplete,3,less_conv,convenient,slightly_prob,not_recom,not_recom -pretentious,less_proper,incomplete,3,less_conv,convenient,problematic,recommended,priority -pretentious,less_proper,incomplete,3,less_conv,convenient,problematic,priority,priority -pretentious,less_proper,incomplete,3,less_conv,convenient,problematic,not_recom,not_recom -pretentious,less_proper,incomplete,3,less_conv,inconv,nonprob,recommended,priority -pretentious,less_proper,incomplete,3,less_conv,inconv,nonprob,priority,priority -pretentious,less_proper,incomplete,3,less_conv,inconv,nonprob,not_recom,not_recom -pretentious,less_proper,incomplete,3,less_conv,inconv,slightly_prob,recommended,priority -pretentious,less_proper,incomplete,3,less_conv,inconv,slightly_prob,priority,priority -pretentious,less_proper,incomplete,3,less_conv,inconv,slightly_prob,not_recom,not_recom -pretentious,less_proper,incomplete,3,less_conv,inconv,problematic,recommended,priority -pretentious,less_proper,incomplete,3,less_conv,inconv,problematic,priority,priority 
-pretentious,less_proper,incomplete,3,less_conv,inconv,problematic,not_recom,not_recom -pretentious,less_proper,incomplete,3,critical,convenient,nonprob,recommended,priority -pretentious,less_proper,incomplete,3,critical,convenient,nonprob,priority,priority -pretentious,less_proper,incomplete,3,critical,convenient,nonprob,not_recom,not_recom -pretentious,less_proper,incomplete,3,critical,convenient,slightly_prob,recommended,priority -pretentious,less_proper,incomplete,3,critical,convenient,slightly_prob,priority,priority -pretentious,less_proper,incomplete,3,critical,convenient,slightly_prob,not_recom,not_recom -pretentious,less_proper,incomplete,3,critical,convenient,problematic,recommended,priority -pretentious,less_proper,incomplete,3,critical,convenient,problematic,priority,priority -pretentious,less_proper,incomplete,3,critical,convenient,problematic,not_recom,not_recom -pretentious,less_proper,incomplete,3,critical,inconv,nonprob,recommended,priority -pretentious,less_proper,incomplete,3,critical,inconv,nonprob,priority,priority -pretentious,less_proper,incomplete,3,critical,inconv,nonprob,not_recom,not_recom -pretentious,less_proper,incomplete,3,critical,inconv,slightly_prob,recommended,priority -pretentious,less_proper,incomplete,3,critical,inconv,slightly_prob,priority,priority -pretentious,less_proper,incomplete,3,critical,inconv,slightly_prob,not_recom,not_recom -pretentious,less_proper,incomplete,3,critical,inconv,problematic,recommended,priority -pretentious,less_proper,incomplete,3,critical,inconv,problematic,priority,priority -pretentious,less_proper,incomplete,3,critical,inconv,problematic,not_recom,not_recom -pretentious,less_proper,incomplete,more,convenient,convenient,nonprob,recommended,very_recom -pretentious,less_proper,incomplete,more,convenient,convenient,nonprob,priority,priority -pretentious,less_proper,incomplete,more,convenient,convenient,nonprob,not_recom,not_recom 
-pretentious,less_proper,incomplete,more,convenient,convenient,slightly_prob,recommended,very_recom -pretentious,less_proper,incomplete,more,convenient,convenient,slightly_prob,priority,priority -pretentious,less_proper,incomplete,more,convenient,convenient,slightly_prob,not_recom,not_recom -pretentious,less_proper,incomplete,more,convenient,convenient,problematic,recommended,priority -pretentious,less_proper,incomplete,more,convenient,convenient,problematic,priority,priority -pretentious,less_proper,incomplete,more,convenient,convenient,problematic,not_recom,not_recom -pretentious,less_proper,incomplete,more,convenient,inconv,nonprob,recommended,priority -pretentious,less_proper,incomplete,more,convenient,inconv,nonprob,priority,priority -pretentious,less_proper,incomplete,more,convenient,inconv,nonprob,not_recom,not_recom -pretentious,less_proper,incomplete,more,convenient,inconv,slightly_prob,recommended,priority -pretentious,less_proper,incomplete,more,convenient,inconv,slightly_prob,priority,priority -pretentious,less_proper,incomplete,more,convenient,inconv,slightly_prob,not_recom,not_recom -pretentious,less_proper,incomplete,more,convenient,inconv,problematic,recommended,priority -pretentious,less_proper,incomplete,more,convenient,inconv,problematic,priority,priority -pretentious,less_proper,incomplete,more,convenient,inconv,problematic,not_recom,not_recom -pretentious,less_proper,incomplete,more,less_conv,convenient,nonprob,recommended,priority -pretentious,less_proper,incomplete,more,less_conv,convenient,nonprob,priority,priority -pretentious,less_proper,incomplete,more,less_conv,convenient,nonprob,not_recom,not_recom -pretentious,less_proper,incomplete,more,less_conv,convenient,slightly_prob,recommended,priority -pretentious,less_proper,incomplete,more,less_conv,convenient,slightly_prob,priority,priority -pretentious,less_proper,incomplete,more,less_conv,convenient,slightly_prob,not_recom,not_recom 
-pretentious,less_proper,incomplete,more,less_conv,convenient,problematic,recommended,priority -pretentious,less_proper,incomplete,more,less_conv,convenient,problematic,priority,priority -pretentious,less_proper,incomplete,more,less_conv,convenient,problematic,not_recom,not_recom -pretentious,less_proper,incomplete,more,less_conv,inconv,nonprob,recommended,priority -pretentious,less_proper,incomplete,more,less_conv,inconv,nonprob,priority,priority -pretentious,less_proper,incomplete,more,less_conv,inconv,nonprob,not_recom,not_recom -pretentious,less_proper,incomplete,more,less_conv,inconv,slightly_prob,recommended,priority -pretentious,less_proper,incomplete,more,less_conv,inconv,slightly_prob,priority,priority -pretentious,less_proper,incomplete,more,less_conv,inconv,slightly_prob,not_recom,not_recom -pretentious,less_proper,incomplete,more,less_conv,inconv,problematic,recommended,priority -pretentious,less_proper,incomplete,more,less_conv,inconv,problematic,priority,priority -pretentious,less_proper,incomplete,more,less_conv,inconv,problematic,not_recom,not_recom -pretentious,less_proper,incomplete,more,critical,convenient,nonprob,recommended,priority -pretentious,less_proper,incomplete,more,critical,convenient,nonprob,priority,priority -pretentious,less_proper,incomplete,more,critical,convenient,nonprob,not_recom,not_recom -pretentious,less_proper,incomplete,more,critical,convenient,slightly_prob,recommended,priority -pretentious,less_proper,incomplete,more,critical,convenient,slightly_prob,priority,priority -pretentious,less_proper,incomplete,more,critical,convenient,slightly_prob,not_recom,not_recom -pretentious,less_proper,incomplete,more,critical,convenient,problematic,recommended,priority -pretentious,less_proper,incomplete,more,critical,convenient,problematic,priority,priority -pretentious,less_proper,incomplete,more,critical,convenient,problematic,not_recom,not_recom -pretentious,less_proper,incomplete,more,critical,inconv,nonprob,recommended,priority 
-pretentious,less_proper,incomplete,more,critical,inconv,nonprob,priority,priority -pretentious,less_proper,incomplete,more,critical,inconv,nonprob,not_recom,not_recom -pretentious,less_proper,incomplete,more,critical,inconv,slightly_prob,recommended,priority -pretentious,less_proper,incomplete,more,critical,inconv,slightly_prob,priority,priority -pretentious,less_proper,incomplete,more,critical,inconv,slightly_prob,not_recom,not_recom -pretentious,less_proper,incomplete,more,critical,inconv,problematic,recommended,priority -pretentious,less_proper,incomplete,more,critical,inconv,problematic,priority,priority -pretentious,less_proper,incomplete,more,critical,inconv,problematic,not_recom,not_recom -pretentious,less_proper,foster,1,convenient,convenient,nonprob,recommended,very_recom -pretentious,less_proper,foster,1,convenient,convenient,nonprob,priority,priority -pretentious,less_proper,foster,1,convenient,convenient,nonprob,not_recom,not_recom -pretentious,less_proper,foster,1,convenient,convenient,slightly_prob,recommended,very_recom -pretentious,less_proper,foster,1,convenient,convenient,slightly_prob,priority,priority -pretentious,less_proper,foster,1,convenient,convenient,slightly_prob,not_recom,not_recom -pretentious,less_proper,foster,1,convenient,convenient,problematic,recommended,priority -pretentious,less_proper,foster,1,convenient,convenient,problematic,priority,priority -pretentious,less_proper,foster,1,convenient,convenient,problematic,not_recom,not_recom -pretentious,less_proper,foster,1,convenient,inconv,nonprob,recommended,priority -pretentious,less_proper,foster,1,convenient,inconv,nonprob,priority,priority -pretentious,less_proper,foster,1,convenient,inconv,nonprob,not_recom,not_recom -pretentious,less_proper,foster,1,convenient,inconv,slightly_prob,recommended,priority -pretentious,less_proper,foster,1,convenient,inconv,slightly_prob,priority,priority -pretentious,less_proper,foster,1,convenient,inconv,slightly_prob,not_recom,not_recom 
-pretentious,less_proper,foster,1,convenient,inconv,problematic,recommended,priority -pretentious,less_proper,foster,1,convenient,inconv,problematic,priority,priority -pretentious,less_proper,foster,1,convenient,inconv,problematic,not_recom,not_recom -pretentious,less_proper,foster,1,less_conv,convenient,nonprob,recommended,priority -pretentious,less_proper,foster,1,less_conv,convenient,nonprob,priority,priority -pretentious,less_proper,foster,1,less_conv,convenient,nonprob,not_recom,not_recom -pretentious,less_proper,foster,1,less_conv,convenient,slightly_prob,recommended,priority -pretentious,less_proper,foster,1,less_conv,convenient,slightly_prob,priority,priority -pretentious,less_proper,foster,1,less_conv,convenient,slightly_prob,not_recom,not_recom -pretentious,less_proper,foster,1,less_conv,convenient,problematic,recommended,priority -pretentious,less_proper,foster,1,less_conv,convenient,problematic,priority,priority -pretentious,less_proper,foster,1,less_conv,convenient,problematic,not_recom,not_recom -pretentious,less_proper,foster,1,less_conv,inconv,nonprob,recommended,priority -pretentious,less_proper,foster,1,less_conv,inconv,nonprob,priority,priority -pretentious,less_proper,foster,1,less_conv,inconv,nonprob,not_recom,not_recom -pretentious,less_proper,foster,1,less_conv,inconv,slightly_prob,recommended,priority -pretentious,less_proper,foster,1,less_conv,inconv,slightly_prob,priority,priority -pretentious,less_proper,foster,1,less_conv,inconv,slightly_prob,not_recom,not_recom -pretentious,less_proper,foster,1,less_conv,inconv,problematic,recommended,priority -pretentious,less_proper,foster,1,less_conv,inconv,problematic,priority,priority -pretentious,less_proper,foster,1,less_conv,inconv,problematic,not_recom,not_recom -pretentious,less_proper,foster,1,critical,convenient,nonprob,recommended,priority -pretentious,less_proper,foster,1,critical,convenient,nonprob,priority,priority 
-pretentious,less_proper,foster,1,critical,convenient,nonprob,not_recom,not_recom -pretentious,less_proper,foster,1,critical,convenient,slightly_prob,recommended,priority -pretentious,less_proper,foster,1,critical,convenient,slightly_prob,priority,priority -pretentious,less_proper,foster,1,critical,convenient,slightly_prob,not_recom,not_recom -pretentious,less_proper,foster,1,critical,convenient,problematic,recommended,priority -pretentious,less_proper,foster,1,critical,convenient,problematic,priority,priority -pretentious,less_proper,foster,1,critical,convenient,problematic,not_recom,not_recom -pretentious,less_proper,foster,1,critical,inconv,nonprob,recommended,priority -pretentious,less_proper,foster,1,critical,inconv,nonprob,priority,priority -pretentious,less_proper,foster,1,critical,inconv,nonprob,not_recom,not_recom -pretentious,less_proper,foster,1,critical,inconv,slightly_prob,recommended,priority -pretentious,less_proper,foster,1,critical,inconv,slightly_prob,priority,priority -pretentious,less_proper,foster,1,critical,inconv,slightly_prob,not_recom,not_recom -pretentious,less_proper,foster,1,critical,inconv,problematic,recommended,priority -pretentious,less_proper,foster,1,critical,inconv,problematic,priority,priority -pretentious,less_proper,foster,1,critical,inconv,problematic,not_recom,not_recom -pretentious,less_proper,foster,2,convenient,convenient,nonprob,recommended,very_recom -pretentious,less_proper,foster,2,convenient,convenient,nonprob,priority,priority -pretentious,less_proper,foster,2,convenient,convenient,nonprob,not_recom,not_recom -pretentious,less_proper,foster,2,convenient,convenient,slightly_prob,recommended,very_recom -pretentious,less_proper,foster,2,convenient,convenient,slightly_prob,priority,priority -pretentious,less_proper,foster,2,convenient,convenient,slightly_prob,not_recom,not_recom -pretentious,less_proper,foster,2,convenient,convenient,problematic,recommended,priority 
-pretentious,less_proper,foster,2,convenient,convenient,problematic,priority,priority -pretentious,less_proper,foster,2,convenient,convenient,problematic,not_recom,not_recom -pretentious,less_proper,foster,2,convenient,inconv,nonprob,recommended,priority -pretentious,less_proper,foster,2,convenient,inconv,nonprob,priority,priority -pretentious,less_proper,foster,2,convenient,inconv,nonprob,not_recom,not_recom -pretentious,less_proper,foster,2,convenient,inconv,slightly_prob,recommended,priority -pretentious,less_proper,foster,2,convenient,inconv,slightly_prob,priority,priority -pretentious,less_proper,foster,2,convenient,inconv,slightly_prob,not_recom,not_recom -pretentious,less_proper,foster,2,convenient,inconv,problematic,recommended,priority -pretentious,less_proper,foster,2,convenient,inconv,problematic,priority,priority -pretentious,less_proper,foster,2,convenient,inconv,problematic,not_recom,not_recom -pretentious,less_proper,foster,2,less_conv,convenient,nonprob,recommended,priority -pretentious,less_proper,foster,2,less_conv,convenient,nonprob,priority,priority -pretentious,less_proper,foster,2,less_conv,convenient,nonprob,not_recom,not_recom -pretentious,less_proper,foster,2,less_conv,convenient,slightly_prob,recommended,priority -pretentious,less_proper,foster,2,less_conv,convenient,slightly_prob,priority,priority -pretentious,less_proper,foster,2,less_conv,convenient,slightly_prob,not_recom,not_recom -pretentious,less_proper,foster,2,less_conv,convenient,problematic,recommended,priority -pretentious,less_proper,foster,2,less_conv,convenient,problematic,priority,priority -pretentious,less_proper,foster,2,less_conv,convenient,problematic,not_recom,not_recom -pretentious,less_proper,foster,2,less_conv,inconv,nonprob,recommended,priority -pretentious,less_proper,foster,2,less_conv,inconv,nonprob,priority,priority -pretentious,less_proper,foster,2,less_conv,inconv,nonprob,not_recom,not_recom 
-pretentious,less_proper,foster,2,less_conv,inconv,slightly_prob,recommended,priority -pretentious,less_proper,foster,2,less_conv,inconv,slightly_prob,priority,priority -pretentious,less_proper,foster,2,less_conv,inconv,slightly_prob,not_recom,not_recom -pretentious,less_proper,foster,2,less_conv,inconv,problematic,recommended,priority -pretentious,less_proper,foster,2,less_conv,inconv,problematic,priority,priority -pretentious,less_proper,foster,2,less_conv,inconv,problematic,not_recom,not_recom -pretentious,less_proper,foster,2,critical,convenient,nonprob,recommended,priority -pretentious,less_proper,foster,2,critical,convenient,nonprob,priority,priority -pretentious,less_proper,foster,2,critical,convenient,nonprob,not_recom,not_recom -pretentious,less_proper,foster,2,critical,convenient,slightly_prob,recommended,priority -pretentious,less_proper,foster,2,critical,convenient,slightly_prob,priority,priority -pretentious,less_proper,foster,2,critical,convenient,slightly_prob,not_recom,not_recom -pretentious,less_proper,foster,2,critical,convenient,problematic,recommended,priority -pretentious,less_proper,foster,2,critical,convenient,problematic,priority,priority -pretentious,less_proper,foster,2,critical,convenient,problematic,not_recom,not_recom -pretentious,less_proper,foster,2,critical,inconv,nonprob,recommended,priority -pretentious,less_proper,foster,2,critical,inconv,nonprob,priority,priority -pretentious,less_proper,foster,2,critical,inconv,nonprob,not_recom,not_recom -pretentious,less_proper,foster,2,critical,inconv,slightly_prob,recommended,priority -pretentious,less_proper,foster,2,critical,inconv,slightly_prob,priority,priority -pretentious,less_proper,foster,2,critical,inconv,slightly_prob,not_recom,not_recom -pretentious,less_proper,foster,2,critical,inconv,problematic,recommended,priority -pretentious,less_proper,foster,2,critical,inconv,problematic,priority,priority -pretentious,less_proper,foster,2,critical,inconv,problematic,not_recom,not_recom 
-pretentious,less_proper,foster,3,convenient,convenient,nonprob,recommended,very_recom -pretentious,less_proper,foster,3,convenient,convenient,nonprob,priority,priority -pretentious,less_proper,foster,3,convenient,convenient,nonprob,not_recom,not_recom -pretentious,less_proper,foster,3,convenient,convenient,slightly_prob,recommended,very_recom -pretentious,less_proper,foster,3,convenient,convenient,slightly_prob,priority,priority -pretentious,less_proper,foster,3,convenient,convenient,slightly_prob,not_recom,not_recom -pretentious,less_proper,foster,3,convenient,convenient,problematic,recommended,priority -pretentious,less_proper,foster,3,convenient,convenient,problematic,priority,priority -pretentious,less_proper,foster,3,convenient,convenient,problematic,not_recom,not_recom -pretentious,less_proper,foster,3,convenient,inconv,nonprob,recommended,priority -pretentious,less_proper,foster,3,convenient,inconv,nonprob,priority,priority -pretentious,less_proper,foster,3,convenient,inconv,nonprob,not_recom,not_recom -pretentious,less_proper,foster,3,convenient,inconv,slightly_prob,recommended,priority -pretentious,less_proper,foster,3,convenient,inconv,slightly_prob,priority,priority -pretentious,less_proper,foster,3,convenient,inconv,slightly_prob,not_recom,not_recom -pretentious,less_proper,foster,3,convenient,inconv,problematic,recommended,priority -pretentious,less_proper,foster,3,convenient,inconv,problematic,priority,priority -pretentious,less_proper,foster,3,convenient,inconv,problematic,not_recom,not_recom -pretentious,less_proper,foster,3,less_conv,convenient,nonprob,recommended,priority -pretentious,less_proper,foster,3,less_conv,convenient,nonprob,priority,priority -pretentious,less_proper,foster,3,less_conv,convenient,nonprob,not_recom,not_recom -pretentious,less_proper,foster,3,less_conv,convenient,slightly_prob,recommended,priority -pretentious,less_proper,foster,3,less_conv,convenient,slightly_prob,priority,priority 
-pretentious,less_proper,foster,3,less_conv,convenient,slightly_prob,not_recom,not_recom -pretentious,less_proper,foster,3,less_conv,convenient,problematic,recommended,priority -pretentious,less_proper,foster,3,less_conv,convenient,problematic,priority,priority -pretentious,less_proper,foster,3,less_conv,convenient,problematic,not_recom,not_recom -pretentious,less_proper,foster,3,less_conv,inconv,nonprob,recommended,priority -pretentious,less_proper,foster,3,less_conv,inconv,nonprob,priority,priority -pretentious,less_proper,foster,3,less_conv,inconv,nonprob,not_recom,not_recom -pretentious,less_proper,foster,3,less_conv,inconv,slightly_prob,recommended,priority -pretentious,less_proper,foster,3,less_conv,inconv,slightly_prob,priority,priority -pretentious,less_proper,foster,3,less_conv,inconv,slightly_prob,not_recom,not_recom -pretentious,less_proper,foster,3,less_conv,inconv,problematic,recommended,priority -pretentious,less_proper,foster,3,less_conv,inconv,problematic,priority,priority -pretentious,less_proper,foster,3,less_conv,inconv,problematic,not_recom,not_recom -pretentious,less_proper,foster,3,critical,convenient,nonprob,recommended,priority -pretentious,less_proper,foster,3,critical,convenient,nonprob,priority,priority -pretentious,less_proper,foster,3,critical,convenient,nonprob,not_recom,not_recom -pretentious,less_proper,foster,3,critical,convenient,slightly_prob,recommended,priority -pretentious,less_proper,foster,3,critical,convenient,slightly_prob,priority,priority -pretentious,less_proper,foster,3,critical,convenient,slightly_prob,not_recom,not_recom -pretentious,less_proper,foster,3,critical,convenient,problematic,recommended,priority -pretentious,less_proper,foster,3,critical,convenient,problematic,priority,priority -pretentious,less_proper,foster,3,critical,convenient,problematic,not_recom,not_recom -pretentious,less_proper,foster,3,critical,inconv,nonprob,recommended,priority 
-pretentious,less_proper,foster,3,critical,inconv,nonprob,priority,priority -pretentious,less_proper,foster,3,critical,inconv,nonprob,not_recom,not_recom -pretentious,less_proper,foster,3,critical,inconv,slightly_prob,recommended,priority -pretentious,less_proper,foster,3,critical,inconv,slightly_prob,priority,priority -pretentious,less_proper,foster,3,critical,inconv,slightly_prob,not_recom,not_recom -pretentious,less_proper,foster,3,critical,inconv,problematic,recommended,priority -pretentious,less_proper,foster,3,critical,inconv,problematic,priority,priority -pretentious,less_proper,foster,3,critical,inconv,problematic,not_recom,not_recom -pretentious,less_proper,foster,more,convenient,convenient,nonprob,recommended,very_recom -pretentious,less_proper,foster,more,convenient,convenient,nonprob,priority,priority -pretentious,less_proper,foster,more,convenient,convenient,nonprob,not_recom,not_recom -pretentious,less_proper,foster,more,convenient,convenient,slightly_prob,recommended,very_recom -pretentious,less_proper,foster,more,convenient,convenient,slightly_prob,priority,priority -pretentious,less_proper,foster,more,convenient,convenient,slightly_prob,not_recom,not_recom -pretentious,less_proper,foster,more,convenient,convenient,problematic,recommended,priority -pretentious,less_proper,foster,more,convenient,convenient,problematic,priority,priority -pretentious,less_proper,foster,more,convenient,convenient,problematic,not_recom,not_recom -pretentious,less_proper,foster,more,convenient,inconv,nonprob,recommended,priority -pretentious,less_proper,foster,more,convenient,inconv,nonprob,priority,priority -pretentious,less_proper,foster,more,convenient,inconv,nonprob,not_recom,not_recom -pretentious,less_proper,foster,more,convenient,inconv,slightly_prob,recommended,priority -pretentious,less_proper,foster,more,convenient,inconv,slightly_prob,priority,priority -pretentious,less_proper,foster,more,convenient,inconv,slightly_prob,not_recom,not_recom 
-pretentious,less_proper,foster,more,convenient,inconv,problematic,recommended,priority -pretentious,less_proper,foster,more,convenient,inconv,problematic,priority,priority -pretentious,less_proper,foster,more,convenient,inconv,problematic,not_recom,not_recom -pretentious,less_proper,foster,more,less_conv,convenient,nonprob,recommended,priority -pretentious,less_proper,foster,more,less_conv,convenient,nonprob,priority,priority -pretentious,less_proper,foster,more,less_conv,convenient,nonprob,not_recom,not_recom -pretentious,less_proper,foster,more,less_conv,convenient,slightly_prob,recommended,priority -pretentious,less_proper,foster,more,less_conv,convenient,slightly_prob,priority,priority -pretentious,less_proper,foster,more,less_conv,convenient,slightly_prob,not_recom,not_recom -pretentious,less_proper,foster,more,less_conv,convenient,problematic,recommended,priority -pretentious,less_proper,foster,more,less_conv,convenient,problematic,priority,priority -pretentious,less_proper,foster,more,less_conv,convenient,problematic,not_recom,not_recom -pretentious,less_proper,foster,more,less_conv,inconv,nonprob,recommended,priority -pretentious,less_proper,foster,more,less_conv,inconv,nonprob,priority,priority -pretentious,less_proper,foster,more,less_conv,inconv,nonprob,not_recom,not_recom -pretentious,less_proper,foster,more,less_conv,inconv,slightly_prob,recommended,priority -pretentious,less_proper,foster,more,less_conv,inconv,slightly_prob,priority,priority -pretentious,less_proper,foster,more,less_conv,inconv,slightly_prob,not_recom,not_recom -pretentious,less_proper,foster,more,less_conv,inconv,problematic,recommended,priority -pretentious,less_proper,foster,more,less_conv,inconv,problematic,priority,priority -pretentious,less_proper,foster,more,less_conv,inconv,problematic,not_recom,not_recom -pretentious,less_proper,foster,more,critical,convenient,nonprob,recommended,priority -pretentious,less_proper,foster,more,critical,convenient,nonprob,priority,priority 
-pretentious,less_proper,foster,more,critical,convenient,nonprob,not_recom,not_recom -pretentious,less_proper,foster,more,critical,convenient,slightly_prob,recommended,priority -pretentious,less_proper,foster,more,critical,convenient,slightly_prob,priority,priority -pretentious,less_proper,foster,more,critical,convenient,slightly_prob,not_recom,not_recom -pretentious,less_proper,foster,more,critical,convenient,problematic,recommended,priority -pretentious,less_proper,foster,more,critical,convenient,problematic,priority,priority -pretentious,less_proper,foster,more,critical,convenient,problematic,not_recom,not_recom -pretentious,less_proper,foster,more,critical,inconv,nonprob,recommended,priority -pretentious,less_proper,foster,more,critical,inconv,nonprob,priority,priority -pretentious,less_proper,foster,more,critical,inconv,nonprob,not_recom,not_recom -pretentious,less_proper,foster,more,critical,inconv,slightly_prob,recommended,priority -pretentious,less_proper,foster,more,critical,inconv,slightly_prob,priority,priority -pretentious,less_proper,foster,more,critical,inconv,slightly_prob,not_recom,not_recom -pretentious,less_proper,foster,more,critical,inconv,problematic,recommended,priority -pretentious,less_proper,foster,more,critical,inconv,problematic,priority,priority -pretentious,less_proper,foster,more,critical,inconv,problematic,not_recom,not_recom -pretentious,improper,complete,1,convenient,convenient,nonprob,recommended,priority -pretentious,improper,complete,1,convenient,convenient,nonprob,priority,priority -pretentious,improper,complete,1,convenient,convenient,nonprob,not_recom,not_recom -pretentious,improper,complete,1,convenient,convenient,slightly_prob,recommended,priority -pretentious,improper,complete,1,convenient,convenient,slightly_prob,priority,priority -pretentious,improper,complete,1,convenient,convenient,slightly_prob,not_recom,not_recom -pretentious,improper,complete,1,convenient,convenient,problematic,recommended,priority 
-pretentious,improper,complete,1,convenient,convenient,problematic,priority,priority -pretentious,improper,complete,1,convenient,convenient,problematic,not_recom,not_recom -pretentious,improper,complete,1,convenient,inconv,nonprob,recommended,priority -pretentious,improper,complete,1,convenient,inconv,nonprob,priority,priority -pretentious,improper,complete,1,convenient,inconv,nonprob,not_recom,not_recom -pretentious,improper,complete,1,convenient,inconv,slightly_prob,recommended,priority -pretentious,improper,complete,1,convenient,inconv,slightly_prob,priority,priority -pretentious,improper,complete,1,convenient,inconv,slightly_prob,not_recom,not_recom -pretentious,improper,complete,1,convenient,inconv,problematic,recommended,priority -pretentious,improper,complete,1,convenient,inconv,problematic,priority,priority -pretentious,improper,complete,1,convenient,inconv,problematic,not_recom,not_recom -pretentious,improper,complete,1,less_conv,convenient,nonprob,recommended,priority -pretentious,improper,complete,1,less_conv,convenient,nonprob,priority,priority -pretentious,improper,complete,1,less_conv,convenient,nonprob,not_recom,not_recom -pretentious,improper,complete,1,less_conv,convenient,slightly_prob,recommended,priority -pretentious,improper,complete,1,less_conv,convenient,slightly_prob,priority,priority -pretentious,improper,complete,1,less_conv,convenient,slightly_prob,not_recom,not_recom -pretentious,improper,complete,1,less_conv,convenient,problematic,recommended,priority -pretentious,improper,complete,1,less_conv,convenient,problematic,priority,priority -pretentious,improper,complete,1,less_conv,convenient,problematic,not_recom,not_recom -pretentious,improper,complete,1,less_conv,inconv,nonprob,recommended,priority -pretentious,improper,complete,1,less_conv,inconv,nonprob,priority,priority -pretentious,improper,complete,1,less_conv,inconv,nonprob,not_recom,not_recom -pretentious,improper,complete,1,less_conv,inconv,slightly_prob,recommended,priority 
-pretentious,improper,complete,1,less_conv,inconv,slightly_prob,priority,priority -pretentious,improper,complete,1,less_conv,inconv,slightly_prob,not_recom,not_recom -pretentious,improper,complete,1,less_conv,inconv,problematic,recommended,priority -pretentious,improper,complete,1,less_conv,inconv,problematic,priority,priority -pretentious,improper,complete,1,less_conv,inconv,problematic,not_recom,not_recom -pretentious,improper,complete,1,critical,convenient,nonprob,recommended,priority -pretentious,improper,complete,1,critical,convenient,nonprob,priority,priority -pretentious,improper,complete,1,critical,convenient,nonprob,not_recom,not_recom -pretentious,improper,complete,1,critical,convenient,slightly_prob,recommended,priority -pretentious,improper,complete,1,critical,convenient,slightly_prob,priority,priority -pretentious,improper,complete,1,critical,convenient,slightly_prob,not_recom,not_recom -pretentious,improper,complete,1,critical,convenient,problematic,recommended,priority -pretentious,improper,complete,1,critical,convenient,problematic,priority,priority -pretentious,improper,complete,1,critical,convenient,problematic,not_recom,not_recom -pretentious,improper,complete,1,critical,inconv,nonprob,recommended,priority -pretentious,improper,complete,1,critical,inconv,nonprob,priority,priority -pretentious,improper,complete,1,critical,inconv,nonprob,not_recom,not_recom -pretentious,improper,complete,1,critical,inconv,slightly_prob,recommended,priority -pretentious,improper,complete,1,critical,inconv,slightly_prob,priority,priority -pretentious,improper,complete,1,critical,inconv,slightly_prob,not_recom,not_recom -pretentious,improper,complete,1,critical,inconv,problematic,recommended,priority -pretentious,improper,complete,1,critical,inconv,problematic,priority,priority -pretentious,improper,complete,1,critical,inconv,problematic,not_recom,not_recom -pretentious,improper,complete,2,convenient,convenient,nonprob,recommended,priority 
-pretentious,improper,complete,2,convenient,convenient,nonprob,priority,priority -pretentious,improper,complete,2,convenient,convenient,nonprob,not_recom,not_recom -pretentious,improper,complete,2,convenient,convenient,slightly_prob,recommended,priority -pretentious,improper,complete,2,convenient,convenient,slightly_prob,priority,priority -pretentious,improper,complete,2,convenient,convenient,slightly_prob,not_recom,not_recom -pretentious,improper,complete,2,convenient,convenient,problematic,recommended,priority -pretentious,improper,complete,2,convenient,convenient,problematic,priority,priority -pretentious,improper,complete,2,convenient,convenient,problematic,not_recom,not_recom -pretentious,improper,complete,2,convenient,inconv,nonprob,recommended,priority -pretentious,improper,complete,2,convenient,inconv,nonprob,priority,priority -pretentious,improper,complete,2,convenient,inconv,nonprob,not_recom,not_recom -pretentious,improper,complete,2,convenient,inconv,slightly_prob,recommended,priority -pretentious,improper,complete,2,convenient,inconv,slightly_prob,priority,priority -pretentious,improper,complete,2,convenient,inconv,slightly_prob,not_recom,not_recom -pretentious,improper,complete,2,convenient,inconv,problematic,recommended,priority -pretentious,improper,complete,2,convenient,inconv,problematic,priority,priority -pretentious,improper,complete,2,convenient,inconv,problematic,not_recom,not_recom -pretentious,improper,complete,2,less_conv,convenient,nonprob,recommended,priority -pretentious,improper,complete,2,less_conv,convenient,nonprob,priority,priority -pretentious,improper,complete,2,less_conv,convenient,nonprob,not_recom,not_recom -pretentious,improper,complete,2,less_conv,convenient,slightly_prob,recommended,priority -pretentious,improper,complete,2,less_conv,convenient,slightly_prob,priority,priority -pretentious,improper,complete,2,less_conv,convenient,slightly_prob,not_recom,not_recom 
-pretentious,improper,complete,2,less_conv,convenient,problematic,recommended,priority -pretentious,improper,complete,2,less_conv,convenient,problematic,priority,priority -pretentious,improper,complete,2,less_conv,convenient,problematic,not_recom,not_recom -pretentious,improper,complete,2,less_conv,inconv,nonprob,recommended,priority -pretentious,improper,complete,2,less_conv,inconv,nonprob,priority,priority -pretentious,improper,complete,2,less_conv,inconv,nonprob,not_recom,not_recom -pretentious,improper,complete,2,less_conv,inconv,slightly_prob,recommended,priority -pretentious,improper,complete,2,less_conv,inconv,slightly_prob,priority,priority -pretentious,improper,complete,2,less_conv,inconv,slightly_prob,not_recom,not_recom -pretentious,improper,complete,2,less_conv,inconv,problematic,recommended,priority -pretentious,improper,complete,2,less_conv,inconv,problematic,priority,priority -pretentious,improper,complete,2,less_conv,inconv,problematic,not_recom,not_recom -pretentious,improper,complete,2,critical,convenient,nonprob,recommended,priority -pretentious,improper,complete,2,critical,convenient,nonprob,priority,spec_prior -pretentious,improper,complete,2,critical,convenient,nonprob,not_recom,not_recom -pretentious,improper,complete,2,critical,convenient,slightly_prob,recommended,priority -pretentious,improper,complete,2,critical,convenient,slightly_prob,priority,spec_prior -pretentious,improper,complete,2,critical,convenient,slightly_prob,not_recom,not_recom -pretentious,improper,complete,2,critical,convenient,problematic,recommended,spec_prior -pretentious,improper,complete,2,critical,convenient,problematic,priority,spec_prior -pretentious,improper,complete,2,critical,convenient,problematic,not_recom,not_recom -pretentious,improper,complete,2,critical,inconv,nonprob,recommended,priority -pretentious,improper,complete,2,critical,inconv,nonprob,priority,spec_prior -pretentious,improper,complete,2,critical,inconv,nonprob,not_recom,not_recom 
-pretentious,improper,complete,2,critical,inconv,slightly_prob,recommended,priority -pretentious,improper,complete,2,critical,inconv,slightly_prob,priority,spec_prior -pretentious,improper,complete,2,critical,inconv,slightly_prob,not_recom,not_recom -pretentious,improper,complete,2,critical,inconv,problematic,recommended,spec_prior -pretentious,improper,complete,2,critical,inconv,problematic,priority,spec_prior -pretentious,improper,complete,2,critical,inconv,problematic,not_recom,not_recom -pretentious,improper,complete,3,convenient,convenient,nonprob,recommended,priority -pretentious,improper,complete,3,convenient,convenient,nonprob,priority,priority -pretentious,improper,complete,3,convenient,convenient,nonprob,not_recom,not_recom -pretentious,improper,complete,3,convenient,convenient,slightly_prob,recommended,priority -pretentious,improper,complete,3,convenient,convenient,slightly_prob,priority,priority -pretentious,improper,complete,3,convenient,convenient,slightly_prob,not_recom,not_recom -pretentious,improper,complete,3,convenient,convenient,problematic,recommended,priority -pretentious,improper,complete,3,convenient,convenient,problematic,priority,priority -pretentious,improper,complete,3,convenient,convenient,problematic,not_recom,not_recom -pretentious,improper,complete,3,convenient,inconv,nonprob,recommended,priority -pretentious,improper,complete,3,convenient,inconv,nonprob,priority,spec_prior -pretentious,improper,complete,3,convenient,inconv,nonprob,not_recom,not_recom -pretentious,improper,complete,3,convenient,inconv,slightly_prob,recommended,priority -pretentious,improper,complete,3,convenient,inconv,slightly_prob,priority,spec_prior -pretentious,improper,complete,3,convenient,inconv,slightly_prob,not_recom,not_recom -pretentious,improper,complete,3,convenient,inconv,problematic,recommended,spec_prior -pretentious,improper,complete,3,convenient,inconv,problematic,priority,spec_prior 
-pretentious,improper,complete,3,convenient,inconv,problematic,not_recom,not_recom -pretentious,improper,complete,3,less_conv,convenient,nonprob,recommended,priority -pretentious,improper,complete,3,less_conv,convenient,nonprob,priority,spec_prior -pretentious,improper,complete,3,less_conv,convenient,nonprob,not_recom,not_recom -pretentious,improper,complete,3,less_conv,convenient,slightly_prob,recommended,priority -pretentious,improper,complete,3,less_conv,convenient,slightly_prob,priority,spec_prior -pretentious,improper,complete,3,less_conv,convenient,slightly_prob,not_recom,not_recom -pretentious,improper,complete,3,less_conv,convenient,problematic,recommended,spec_prior -pretentious,improper,complete,3,less_conv,convenient,problematic,priority,spec_prior -pretentious,improper,complete,3,less_conv,convenient,problematic,not_recom,not_recom -pretentious,improper,complete,3,less_conv,inconv,nonprob,recommended,priority -pretentious,improper,complete,3,less_conv,inconv,nonprob,priority,spec_prior -pretentious,improper,complete,3,less_conv,inconv,nonprob,not_recom,not_recom -pretentious,improper,complete,3,less_conv,inconv,slightly_prob,recommended,priority -pretentious,improper,complete,3,less_conv,inconv,slightly_prob,priority,spec_prior -pretentious,improper,complete,3,less_conv,inconv,slightly_prob,not_recom,not_recom -pretentious,improper,complete,3,less_conv,inconv,problematic,recommended,spec_prior -pretentious,improper,complete,3,less_conv,inconv,problematic,priority,spec_prior -pretentious,improper,complete,3,less_conv,inconv,problematic,not_recom,not_recom -pretentious,improper,complete,3,critical,convenient,nonprob,recommended,priority -pretentious,improper,complete,3,critical,convenient,nonprob,priority,spec_prior -pretentious,improper,complete,3,critical,convenient,nonprob,not_recom,not_recom -pretentious,improper,complete,3,critical,convenient,slightly_prob,recommended,priority 
-pretentious,improper,complete,3,critical,convenient,slightly_prob,priority,spec_prior -pretentious,improper,complete,3,critical,convenient,slightly_prob,not_recom,not_recom -pretentious,improper,complete,3,critical,convenient,problematic,recommended,spec_prior -pretentious,improper,complete,3,critical,convenient,problematic,priority,spec_prior -pretentious,improper,complete,3,critical,convenient,problematic,not_recom,not_recom -pretentious,improper,complete,3,critical,inconv,nonprob,recommended,priority -pretentious,improper,complete,3,critical,inconv,nonprob,priority,spec_prior -pretentious,improper,complete,3,critical,inconv,nonprob,not_recom,not_recom -pretentious,improper,complete,3,critical,inconv,slightly_prob,recommended,priority -pretentious,improper,complete,3,critical,inconv,slightly_prob,priority,spec_prior -pretentious,improper,complete,3,critical,inconv,slightly_prob,not_recom,not_recom -pretentious,improper,complete,3,critical,inconv,problematic,recommended,spec_prior -pretentious,improper,complete,3,critical,inconv,problematic,priority,spec_prior -pretentious,improper,complete,3,critical,inconv,problematic,not_recom,not_recom -pretentious,improper,complete,more,convenient,convenient,nonprob,recommended,priority -pretentious,improper,complete,more,convenient,convenient,nonprob,priority,priority -pretentious,improper,complete,more,convenient,convenient,nonprob,not_recom,not_recom -pretentious,improper,complete,more,convenient,convenient,slightly_prob,recommended,priority -pretentious,improper,complete,more,convenient,convenient,slightly_prob,priority,priority -pretentious,improper,complete,more,convenient,convenient,slightly_prob,not_recom,not_recom -pretentious,improper,complete,more,convenient,convenient,problematic,recommended,priority -pretentious,improper,complete,more,convenient,convenient,problematic,priority,priority -pretentious,improper,complete,more,convenient,convenient,problematic,not_recom,not_recom 
-pretentious,improper,complete,more,convenient,inconv,nonprob,recommended,priority -pretentious,improper,complete,more,convenient,inconv,nonprob,priority,spec_prior -pretentious,improper,complete,more,convenient,inconv,nonprob,not_recom,not_recom -pretentious,improper,complete,more,convenient,inconv,slightly_prob,recommended,priority -pretentious,improper,complete,more,convenient,inconv,slightly_prob,priority,spec_prior -pretentious,improper,complete,more,convenient,inconv,slightly_prob,not_recom,not_recom -pretentious,improper,complete,more,convenient,inconv,problematic,recommended,spec_prior -pretentious,improper,complete,more,convenient,inconv,problematic,priority,spec_prior -pretentious,improper,complete,more,convenient,inconv,problematic,not_recom,not_recom -pretentious,improper,complete,more,less_conv,convenient,nonprob,recommended,priority -pretentious,improper,complete,more,less_conv,convenient,nonprob,priority,spec_prior -pretentious,improper,complete,more,less_conv,convenient,nonprob,not_recom,not_recom -pretentious,improper,complete,more,less_conv,convenient,slightly_prob,recommended,priority -pretentious,improper,complete,more,less_conv,convenient,slightly_prob,priority,spec_prior -pretentious,improper,complete,more,less_conv,convenient,slightly_prob,not_recom,not_recom -pretentious,improper,complete,more,less_conv,convenient,problematic,recommended,spec_prior -pretentious,improper,complete,more,less_conv,convenient,problematic,priority,spec_prior -pretentious,improper,complete,more,less_conv,convenient,problematic,not_recom,not_recom -pretentious,improper,complete,more,less_conv,inconv,nonprob,recommended,priority -pretentious,improper,complete,more,less_conv,inconv,nonprob,priority,spec_prior -pretentious,improper,complete,more,less_conv,inconv,nonprob,not_recom,not_recom -pretentious,improper,complete,more,less_conv,inconv,slightly_prob,recommended,priority -pretentious,improper,complete,more,less_conv,inconv,slightly_prob,priority,spec_prior 
-pretentious,improper,complete,more,less_conv,inconv,slightly_prob,not_recom,not_recom -pretentious,improper,complete,more,less_conv,inconv,problematic,recommended,spec_prior -pretentious,improper,complete,more,less_conv,inconv,problematic,priority,spec_prior -pretentious,improper,complete,more,less_conv,inconv,problematic,not_recom,not_recom -pretentious,improper,complete,more,critical,convenient,nonprob,recommended,priority -pretentious,improper,complete,more,critical,convenient,nonprob,priority,spec_prior -pretentious,improper,complete,more,critical,convenient,nonprob,not_recom,not_recom -pretentious,improper,complete,more,critical,convenient,slightly_prob,recommended,priority -pretentious,improper,complete,more,critical,convenient,slightly_prob,priority,spec_prior -pretentious,improper,complete,more,critical,convenient,slightly_prob,not_recom,not_recom -pretentious,improper,complete,more,critical,convenient,problematic,recommended,spec_prior -pretentious,improper,complete,more,critical,convenient,problematic,priority,spec_prior -pretentious,improper,complete,more,critical,convenient,problematic,not_recom,not_recom -pretentious,improper,complete,more,critical,inconv,nonprob,recommended,priority -pretentious,improper,complete,more,critical,inconv,nonprob,priority,spec_prior -pretentious,improper,complete,more,critical,inconv,nonprob,not_recom,not_recom -pretentious,improper,complete,more,critical,inconv,slightly_prob,recommended,priority -pretentious,improper,complete,more,critical,inconv,slightly_prob,priority,spec_prior -pretentious,improper,complete,more,critical,inconv,slightly_prob,not_recom,not_recom -pretentious,improper,complete,more,critical,inconv,problematic,recommended,spec_prior -pretentious,improper,complete,more,critical,inconv,problematic,priority,spec_prior -pretentious,improper,complete,more,critical,inconv,problematic,not_recom,not_recom -pretentious,improper,completed,1,convenient,convenient,nonprob,recommended,priority 
-pretentious,improper,completed,1,convenient,convenient,nonprob,priority,priority -pretentious,improper,completed,1,convenient,convenient,nonprob,not_recom,not_recom -pretentious,improper,completed,1,convenient,convenient,slightly_prob,recommended,priority -pretentious,improper,completed,1,convenient,convenient,slightly_prob,priority,priority -pretentious,improper,completed,1,convenient,convenient,slightly_prob,not_recom,not_recom -pretentious,improper,completed,1,convenient,convenient,problematic,recommended,priority -pretentious,improper,completed,1,convenient,convenient,problematic,priority,priority -pretentious,improper,completed,1,convenient,convenient,problematic,not_recom,not_recom -pretentious,improper,completed,1,convenient,inconv,nonprob,recommended,priority -pretentious,improper,completed,1,convenient,inconv,nonprob,priority,priority -pretentious,improper,completed,1,convenient,inconv,nonprob,not_recom,not_recom -pretentious,improper,completed,1,convenient,inconv,slightly_prob,recommended,priority -pretentious,improper,completed,1,convenient,inconv,slightly_prob,priority,priority -pretentious,improper,completed,1,convenient,inconv,slightly_prob,not_recom,not_recom -pretentious,improper,completed,1,convenient,inconv,problematic,recommended,priority -pretentious,improper,completed,1,convenient,inconv,problematic,priority,priority -pretentious,improper,completed,1,convenient,inconv,problematic,not_recom,not_recom -pretentious,improper,completed,1,less_conv,convenient,nonprob,recommended,priority -pretentious,improper,completed,1,less_conv,convenient,nonprob,priority,priority -pretentious,improper,completed,1,less_conv,convenient,nonprob,not_recom,not_recom -pretentious,improper,completed,1,less_conv,convenient,slightly_prob,recommended,priority -pretentious,improper,completed,1,less_conv,convenient,slightly_prob,priority,priority -pretentious,improper,completed,1,less_conv,convenient,slightly_prob,not_recom,not_recom 
-pretentious,improper,completed,1,less_conv,convenient,problematic,recommended,priority -pretentious,improper,completed,1,less_conv,convenient,problematic,priority,priority -pretentious,improper,completed,1,less_conv,convenient,problematic,not_recom,not_recom -pretentious,improper,completed,1,less_conv,inconv,nonprob,recommended,priority -pretentious,improper,completed,1,less_conv,inconv,nonprob,priority,priority -pretentious,improper,completed,1,less_conv,inconv,nonprob,not_recom,not_recom -pretentious,improper,completed,1,less_conv,inconv,slightly_prob,recommended,priority -pretentious,improper,completed,1,less_conv,inconv,slightly_prob,priority,priority -pretentious,improper,completed,1,less_conv,inconv,slightly_prob,not_recom,not_recom -pretentious,improper,completed,1,less_conv,inconv,problematic,recommended,priority -pretentious,improper,completed,1,less_conv,inconv,problematic,priority,priority -pretentious,improper,completed,1,less_conv,inconv,problematic,not_recom,not_recom -pretentious,improper,completed,1,critical,convenient,nonprob,recommended,priority -pretentious,improper,completed,1,critical,convenient,nonprob,priority,spec_prior -pretentious,improper,completed,1,critical,convenient,nonprob,not_recom,not_recom -pretentious,improper,completed,1,critical,convenient,slightly_prob,recommended,priority -pretentious,improper,completed,1,critical,convenient,slightly_prob,priority,spec_prior -pretentious,improper,completed,1,critical,convenient,slightly_prob,not_recom,not_recom -pretentious,improper,completed,1,critical,convenient,problematic,recommended,spec_prior -pretentious,improper,completed,1,critical,convenient,problematic,priority,spec_prior -pretentious,improper,completed,1,critical,convenient,problematic,not_recom,not_recom -pretentious,improper,completed,1,critical,inconv,nonprob,recommended,priority -pretentious,improper,completed,1,critical,inconv,nonprob,priority,spec_prior 
-pretentious,improper,completed,1,critical,inconv,nonprob,not_recom,not_recom
-pretentious,improper,completed,1,critical,inconv,slightly_prob,recommended,priority
-pretentious,improper,completed,1,critical,inconv,slightly_prob,priority,spec_prior
-pretentious,improper,completed,1,critical,inconv,slightly_prob,not_recom,not_recom
-pretentious,improper,completed,1,critical,inconv,problematic,recommended,spec_prior
-pretentious,improper,completed,1,critical,inconv,problematic,priority,spec_prior
-pretentious,improper,completed,1,critical,inconv,problematic,not_recom,not_recom
-pretentious,improper,completed,2,convenient,convenient,nonprob,recommended,priority
-pretentious,improper,completed,2,convenient,convenient,nonprob,priority,priority
-pretentious,improper,completed,2,convenient,convenient,nonprob,not_recom,not_recom
-pretentious,improper,completed,2,convenient,convenient,slightly_prob,recommended,priority
-pretentious,improper,completed,2,convenient,convenient,slightly_prob,priority,priority
-pretentious,improper,completed,2,convenient,convenient,slightly_prob,not_recom,not_recom
-pretentious,improper,completed,2,convenient,convenient,problematic,recommended,priority
-pretentious,improper,completed,2,convenient,convenient,problematic,priority,priority
-pretentious,improper,completed,2,convenient,convenient,problematic,not_recom,not_recom
-pretentious,improper,completed,2,convenient,inconv,nonprob,recommended,priority
-pretentious,improper,completed,2,convenient,inconv,nonprob,priority,priority
-pretentious,improper,completed,2,convenient,inconv,nonprob,not_recom,not_recom
-pretentious,improper,completed,2,convenient,inconv,slightly_prob,recommended,priority
-pretentious,improper,completed,2,convenient,inconv,slightly_prob,priority,priority
-pretentious,improper,completed,2,convenient,inconv,slightly_prob,not_recom,not_recom
-pretentious,improper,completed,2,convenient,inconv,problematic,recommended,priority
-pretentious,improper,completed,2,convenient,inconv,problematic,priority,priority
-pretentious,improper,completed,2,convenient,inconv,problematic,not_recom,not_recom
-pretentious,improper,completed,2,less_conv,convenient,nonprob,recommended,priority
-pretentious,improper,completed,2,less_conv,convenient,nonprob,priority,priority
-pretentious,improper,completed,2,less_conv,convenient,nonprob,not_recom,not_recom
-pretentious,improper,completed,2,less_conv,convenient,slightly_prob,recommended,priority
-pretentious,improper,completed,2,less_conv,convenient,slightly_prob,priority,priority
-pretentious,improper,completed,2,less_conv,convenient,slightly_prob,not_recom,not_recom
-pretentious,improper,completed,2,less_conv,convenient,problematic,recommended,priority
-pretentious,improper,completed,2,less_conv,convenient,problematic,priority,priority
-pretentious,improper,completed,2,less_conv,convenient,problematic,not_recom,not_recom
-pretentious,improper,completed,2,less_conv,inconv,nonprob,recommended,priority
-pretentious,improper,completed,2,less_conv,inconv,nonprob,priority,priority
-pretentious,improper,completed,2,less_conv,inconv,nonprob,not_recom,not_recom
-pretentious,improper,completed,2,less_conv,inconv,slightly_prob,recommended,priority
-pretentious,improper,completed,2,less_conv,inconv,slightly_prob,priority,priority
-pretentious,improper,completed,2,less_conv,inconv,slightly_prob,not_recom,not_recom
-pretentious,improper,completed,2,less_conv,inconv,problematic,recommended,priority
-pretentious,improper,completed,2,less_conv,inconv,problematic,priority,priority
-pretentious,improper,completed,2,less_conv,inconv,problematic,not_recom,not_recom
-pretentious,improper,completed,2,critical,convenient,nonprob,recommended,priority
-pretentious,improper,completed,2,critical,convenient,nonprob,priority,spec_prior
-pretentious,improper,completed,2,critical,convenient,nonprob,not_recom,not_recom
-pretentious,improper,completed,2,critical,convenient,slightly_prob,recommended,priority
-pretentious,improper,completed,2,critical,convenient,slightly_prob,priority,spec_prior
-pretentious,improper,completed,2,critical,convenient,slightly_prob,not_recom,not_recom
-pretentious,improper,completed,2,critical,convenient,problematic,recommended,spec_prior
-pretentious,improper,completed,2,critical,convenient,problematic,priority,spec_prior
-pretentious,improper,completed,2,critical,convenient,problematic,not_recom,not_recom
-pretentious,improper,completed,2,critical,inconv,nonprob,recommended,priority
-pretentious,improper,completed,2,critical,inconv,nonprob,priority,spec_prior
-pretentious,improper,completed,2,critical,inconv,nonprob,not_recom,not_recom
-pretentious,improper,completed,2,critical,inconv,slightly_prob,recommended,priority
-pretentious,improper,completed,2,critical,inconv,slightly_prob,priority,spec_prior
-pretentious,improper,completed,2,critical,inconv,slightly_prob,not_recom,not_recom
-pretentious,improper,completed,2,critical,inconv,problematic,recommended,spec_prior
-pretentious,improper,completed,2,critical,inconv,problematic,priority,spec_prior
-pretentious,improper,completed,2,critical,inconv,problematic,not_recom,not_recom
-pretentious,improper,completed,3,convenient,convenient,nonprob,recommended,priority
-pretentious,improper,completed,3,convenient,convenient,nonprob,priority,priority
-pretentious,improper,completed,3,convenient,convenient,nonprob,not_recom,not_recom
-pretentious,improper,completed,3,convenient,convenient,slightly_prob,recommended,priority
-pretentious,improper,completed,3,convenient,convenient,slightly_prob,priority,priority
-pretentious,improper,completed,3,convenient,convenient,slightly_prob,not_recom,not_recom
-pretentious,improper,completed,3,convenient,convenient,problematic,recommended,priority
-pretentious,improper,completed,3,convenient,convenient,problematic,priority,priority
-pretentious,improper,completed,3,convenient,convenient,problematic,not_recom,not_recom
-pretentious,improper,completed,3,convenient,inconv,nonprob,recommended,priority
-pretentious,improper,completed,3,convenient,inconv,nonprob,priority,spec_prior
-pretentious,improper,completed,3,convenient,inconv,nonprob,not_recom,not_recom
-pretentious,improper,completed,3,convenient,inconv,slightly_prob,recommended,priority
-pretentious,improper,completed,3,convenient,inconv,slightly_prob,priority,spec_prior
-pretentious,improper,completed,3,convenient,inconv,slightly_prob,not_recom,not_recom
-pretentious,improper,completed,3,convenient,inconv,problematic,recommended,spec_prior
-pretentious,improper,completed,3,convenient,inconv,problematic,priority,spec_prior
-pretentious,improper,completed,3,convenient,inconv,problematic,not_recom,not_recom
-pretentious,improper,completed,3,less_conv,convenient,nonprob,recommended,priority
-pretentious,improper,completed,3,less_conv,convenient,nonprob,priority,spec_prior
-pretentious,improper,completed,3,less_conv,convenient,nonprob,not_recom,not_recom
-pretentious,improper,completed,3,less_conv,convenient,slightly_prob,recommended,priority
-pretentious,improper,completed,3,less_conv,convenient,slightly_prob,priority,spec_prior
-pretentious,improper,completed,3,less_conv,convenient,slightly_prob,not_recom,not_recom
-pretentious,improper,completed,3,less_conv,convenient,problematic,recommended,spec_prior
-pretentious,improper,completed,3,less_conv,convenient,problematic,priority,spec_prior
-pretentious,improper,completed,3,less_conv,convenient,problematic,not_recom,not_recom
-pretentious,improper,completed,3,less_conv,inconv,nonprob,recommended,priority
-pretentious,improper,completed,3,less_conv,inconv,nonprob,priority,spec_prior
-pretentious,improper,completed,3,less_conv,inconv,nonprob,not_recom,not_recom
-pretentious,improper,completed,3,less_conv,inconv,slightly_prob,recommended,priority
-pretentious,improper,completed,3,less_conv,inconv,slightly_prob,priority,spec_prior
-pretentious,improper,completed,3,less_conv,inconv,slightly_prob,not_recom,not_recom
-pretentious,improper,completed,3,less_conv,inconv,problematic,recommended,spec_prior
-pretentious,improper,completed,3,less_conv,inconv,problematic,priority,spec_prior
-pretentious,improper,completed,3,less_conv,inconv,problematic,not_recom,not_recom
-pretentious,improper,completed,3,critical,convenient,nonprob,recommended,priority
-pretentious,improper,completed,3,critical,convenient,nonprob,priority,spec_prior
-pretentious,improper,completed,3,critical,convenient,nonprob,not_recom,not_recom
-pretentious,improper,completed,3,critical,convenient,slightly_prob,recommended,priority
-pretentious,improper,completed,3,critical,convenient,slightly_prob,priority,spec_prior
-pretentious,improper,completed,3,critical,convenient,slightly_prob,not_recom,not_recom
-pretentious,improper,completed,3,critical,convenient,problematic,recommended,spec_prior
-pretentious,improper,completed,3,critical,convenient,problematic,priority,spec_prior
-pretentious,improper,completed,3,critical,convenient,problematic,not_recom,not_recom
-pretentious,improper,completed,3,critical,inconv,nonprob,recommended,priority
-pretentious,improper,completed,3,critical,inconv,nonprob,priority,spec_prior
-pretentious,improper,completed,3,critical,inconv,nonprob,not_recom,not_recom
-pretentious,improper,completed,3,critical,inconv,slightly_prob,recommended,priority
-pretentious,improper,completed,3,critical,inconv,slightly_prob,priority,spec_prior
-pretentious,improper,completed,3,critical,inconv,slightly_prob,not_recom,not_recom
-pretentious,improper,completed,3,critical,inconv,problematic,recommended,spec_prior
-pretentious,improper,completed,3,critical,inconv,problematic,priority,spec_prior
-pretentious,improper,completed,3,critical,inconv,problematic,not_recom,not_recom
-pretentious,improper,completed,more,convenient,convenient,nonprob,recommended,priority
-pretentious,improper,completed,more,convenient,convenient,nonprob,priority,priority
-pretentious,improper,completed,more,convenient,convenient,nonprob,not_recom,not_recom
-pretentious,improper,completed,more,convenient,convenient,slightly_prob,recommended,priority
-pretentious,improper,completed,more,convenient,convenient,slightly_prob,priority,priority
-pretentious,improper,completed,more,convenient,convenient,slightly_prob,not_recom,not_recom
-pretentious,improper,completed,more,convenient,convenient,problematic,recommended,priority
-pretentious,improper,completed,more,convenient,convenient,problematic,priority,priority
-pretentious,improper,completed,more,convenient,convenient,problematic,not_recom,not_recom
-pretentious,improper,completed,more,convenient,inconv,nonprob,recommended,priority
-pretentious,improper,completed,more,convenient,inconv,nonprob,priority,spec_prior
-pretentious,improper,completed,more,convenient,inconv,nonprob,not_recom,not_recom
-pretentious,improper,completed,more,convenient,inconv,slightly_prob,recommended,priority
-pretentious,improper,completed,more,convenient,inconv,slightly_prob,priority,spec_prior
-pretentious,improper,completed,more,convenient,inconv,slightly_prob,not_recom,not_recom
-pretentious,improper,completed,more,convenient,inconv,problematic,recommended,spec_prior
-pretentious,improper,completed,more,convenient,inconv,problematic,priority,spec_prior
-pretentious,improper,completed,more,convenient,inconv,problematic,not_recom,not_recom
-pretentious,improper,completed,more,less_conv,convenient,nonprob,recommended,priority
-pretentious,improper,completed,more,less_conv,convenient,nonprob,priority,spec_prior
-pretentious,improper,completed,more,less_conv,convenient,nonprob,not_recom,not_recom
-pretentious,improper,completed,more,less_conv,convenient,slightly_prob,recommended,priority
-pretentious,improper,completed,more,less_conv,convenient,slightly_prob,priority,spec_prior
-pretentious,improper,completed,more,less_conv,convenient,slightly_prob,not_recom,not_recom
-pretentious,improper,completed,more,less_conv,convenient,problematic,recommended,spec_prior
-pretentious,improper,completed,more,less_conv,convenient,problematic,priority,spec_prior
-pretentious,improper,completed,more,less_conv,convenient,problematic,not_recom,not_recom
-pretentious,improper,completed,more,less_conv,inconv,nonprob,recommended,priority
-pretentious,improper,completed,more,less_conv,inconv,nonprob,priority,spec_prior
-pretentious,improper,completed,more,less_conv,inconv,nonprob,not_recom,not_recom
-pretentious,improper,completed,more,less_conv,inconv,slightly_prob,recommended,priority
-pretentious,improper,completed,more,less_conv,inconv,slightly_prob,priority,spec_prior
-pretentious,improper,completed,more,less_conv,inconv,slightly_prob,not_recom,not_recom
-pretentious,improper,completed,more,less_conv,inconv,problematic,recommended,spec_prior
-pretentious,improper,completed,more,less_conv,inconv,problematic,priority,spec_prior
-pretentious,improper,completed,more,less_conv,inconv,problematic,not_recom,not_recom
-pretentious,improper,completed,more,critical,convenient,nonprob,recommended,priority
-pretentious,improper,completed,more,critical,convenient,nonprob,priority,spec_prior
-pretentious,improper,completed,more,critical,convenient,nonprob,not_recom,not_recom
-pretentious,improper,completed,more,critical,convenient,slightly_prob,recommended,priority
-pretentious,improper,completed,more,critical,convenient,slightly_prob,priority,spec_prior
-pretentious,improper,completed,more,critical,convenient,slightly_prob,not_recom,not_recom
-pretentious,improper,completed,more,critical,convenient,problematic,recommended,spec_prior
-pretentious,improper,completed,more,critical,convenient,problematic,priority,spec_prior
-pretentious,improper,completed,more,critical,convenient,problematic,not_recom,not_recom
-pretentious,improper,completed,more,critical,inconv,nonprob,recommended,priority
-pretentious,improper,completed,more,critical,inconv,nonprob,priority,spec_prior
-pretentious,improper,completed,more,critical,inconv,nonprob,not_recom,not_recom
-pretentious,improper,completed,more,critical,inconv,slightly_prob,recommended,priority
-pretentious,improper,completed,more,critical,inconv,slightly_prob,priority,spec_prior
-pretentious,improper,completed,more,critical,inconv,slightly_prob,not_recom,not_recom
-pretentious,improper,completed,more,critical,inconv,problematic,recommended,spec_prior
-pretentious,improper,completed,more,critical,inconv,problematic,priority,spec_prior
-pretentious,improper,completed,more,critical,inconv,problematic,not_recom,not_recom
-pretentious,improper,incomplete,1,convenient,convenient,nonprob,recommended,priority
-pretentious,improper,incomplete,1,convenient,convenient,nonprob,priority,priority
-pretentious,improper,incomplete,1,convenient,convenient,nonprob,not_recom,not_recom
-pretentious,improper,incomplete,1,convenient,convenient,slightly_prob,recommended,priority
-pretentious,improper,incomplete,1,convenient,convenient,slightly_prob,priority,priority
-pretentious,improper,incomplete,1,convenient,convenient,slightly_prob,not_recom,not_recom
-pretentious,improper,incomplete,1,convenient,convenient,problematic,recommended,priority
-pretentious,improper,incomplete,1,convenient,convenient,problematic,priority,priority
-pretentious,improper,incomplete,1,convenient,convenient,problematic,not_recom,not_recom
-pretentious,improper,incomplete,1,convenient,inconv,nonprob,recommended,priority
-pretentious,improper,incomplete,1,convenient,inconv,nonprob,priority,priority
-pretentious,improper,incomplete,1,convenient,inconv,nonprob,not_recom,not_recom
-pretentious,improper,incomplete,1,convenient,inconv,slightly_prob,recommended,priority
-pretentious,improper,incomplete,1,convenient,inconv,slightly_prob,priority,priority
-pretentious,improper,incomplete,1,convenient,inconv,slightly_prob,not_recom,not_recom
-pretentious,improper,incomplete,1,convenient,inconv,problematic,recommended,priority
-pretentious,improper,incomplete,1,convenient,inconv,problematic,priority,priority
-pretentious,improper,incomplete,1,convenient,inconv,problematic,not_recom,not_recom
-pretentious,improper,incomplete,1,less_conv,convenient,nonprob,recommended,priority
-pretentious,improper,incomplete,1,less_conv,convenient,nonprob,priority,priority
-pretentious,improper,incomplete,1,less_conv,convenient,nonprob,not_recom,not_recom
-pretentious,improper,incomplete,1,less_conv,convenient,slightly_prob,recommended,priority
-pretentious,improper,incomplete,1,less_conv,convenient,slightly_prob,priority,priority
-pretentious,improper,incomplete,1,less_conv,convenient,slightly_prob,not_recom,not_recom
-pretentious,improper,incomplete,1,less_conv,convenient,problematic,recommended,priority
-pretentious,improper,incomplete,1,less_conv,convenient,problematic,priority,priority
-pretentious,improper,incomplete,1,less_conv,convenient,problematic,not_recom,not_recom
-pretentious,improper,incomplete,1,less_conv,inconv,nonprob,recommended,priority
-pretentious,improper,incomplete,1,less_conv,inconv,nonprob,priority,priority
-pretentious,improper,incomplete,1,less_conv,inconv,nonprob,not_recom,not_recom
-pretentious,improper,incomplete,1,less_conv,inconv,slightly_prob,recommended,priority
-pretentious,improper,incomplete,1,less_conv,inconv,slightly_prob,priority,priority
-pretentious,improper,incomplete,1,less_conv,inconv,slightly_prob,not_recom,not_recom
-pretentious,improper,incomplete,1,less_conv,inconv,problematic,recommended,priority
-pretentious,improper,incomplete,1,less_conv,inconv,problematic,priority,priority
-pretentious,improper,incomplete,1,less_conv,inconv,problematic,not_recom,not_recom
-pretentious,improper,incomplete,1,critical,convenient,nonprob,recommended,priority
-pretentious,improper,incomplete,1,critical,convenient,nonprob,priority,spec_prior
-pretentious,improper,incomplete,1,critical,convenient,nonprob,not_recom,not_recom
-pretentious,improper,incomplete,1,critical,convenient,slightly_prob,recommended,priority
-pretentious,improper,incomplete,1,critical,convenient,slightly_prob,priority,spec_prior
-pretentious,improper,incomplete,1,critical,convenient,slightly_prob,not_recom,not_recom
-pretentious,improper,incomplete,1,critical,convenient,problematic,recommended,spec_prior
-pretentious,improper,incomplete,1,critical,convenient,problematic,priority,spec_prior
-pretentious,improper,incomplete,1,critical,convenient,problematic,not_recom,not_recom
-pretentious,improper,incomplete,1,critical,inconv,nonprob,recommended,priority
-pretentious,improper,incomplete,1,critical,inconv,nonprob,priority,spec_prior
-pretentious,improper,incomplete,1,critical,inconv,nonprob,not_recom,not_recom
-pretentious,improper,incomplete,1,critical,inconv,slightly_prob,recommended,priority
-pretentious,improper,incomplete,1,critical,inconv,slightly_prob,priority,spec_prior
-pretentious,improper,incomplete,1,critical,inconv,slightly_prob,not_recom,not_recom
-pretentious,improper,incomplete,1,critical,inconv,problematic,recommended,spec_prior
-pretentious,improper,incomplete,1,critical,inconv,problematic,priority,spec_prior
-pretentious,improper,incomplete,1,critical,inconv,problematic,not_recom,not_recom
-pretentious,improper,incomplete,2,convenient,convenient,nonprob,recommended,priority
-pretentious,improper,incomplete,2,convenient,convenient,nonprob,priority,priority
-pretentious,improper,incomplete,2,convenient,convenient,nonprob,not_recom,not_recom
-pretentious,improper,incomplete,2,convenient,convenient,slightly_prob,recommended,priority
-pretentious,improper,incomplete,2,convenient,convenient,slightly_prob,priority,priority
-pretentious,improper,incomplete,2,convenient,convenient,slightly_prob,not_recom,not_recom
-pretentious,improper,incomplete,2,convenient,convenient,problematic,recommended,priority
-pretentious,improper,incomplete,2,convenient,convenient,problematic,priority,priority
-pretentious,improper,incomplete,2,convenient,convenient,problematic,not_recom,not_recom
-pretentious,improper,incomplete,2,convenient,inconv,nonprob,recommended,priority
-pretentious,improper,incomplete,2,convenient,inconv,nonprob,priority,spec_prior
-pretentious,improper,incomplete,2,convenient,inconv,nonprob,not_recom,not_recom
-pretentious,improper,incomplete,2,convenient,inconv,slightly_prob,recommended,priority
-pretentious,improper,incomplete,2,convenient,inconv,slightly_prob,priority,spec_prior
-pretentious,improper,incomplete,2,convenient,inconv,slightly_prob,not_recom,not_recom
-pretentious,improper,incomplete,2,convenient,inconv,problematic,recommended,spec_prior
-pretentious,improper,incomplete,2,convenient,inconv,problematic,priority,spec_prior
-pretentious,improper,incomplete,2,convenient,inconv,problematic,not_recom,not_recom
-pretentious,improper,incomplete,2,less_conv,convenient,nonprob,recommended,priority
-pretentious,improper,incomplete,2,less_conv,convenient,nonprob,priority,spec_prior
-pretentious,improper,incomplete,2,less_conv,convenient,nonprob,not_recom,not_recom
-pretentious,improper,incomplete,2,less_conv,convenient,slightly_prob,recommended,priority
-pretentious,improper,incomplete,2,less_conv,convenient,slightly_prob,priority,spec_prior
-pretentious,improper,incomplete,2,less_conv,convenient,slightly_prob,not_recom,not_recom
-pretentious,improper,incomplete,2,less_conv,convenient,problematic,recommended,spec_prior
-pretentious,improper,incomplete,2,less_conv,convenient,problematic,priority,spec_prior
-pretentious,improper,incomplete,2,less_conv,convenient,problematic,not_recom,not_recom
-pretentious,improper,incomplete,2,less_conv,inconv,nonprob,recommended,priority
-pretentious,improper,incomplete,2,less_conv,inconv,nonprob,priority,spec_prior
-pretentious,improper,incomplete,2,less_conv,inconv,nonprob,not_recom,not_recom
-pretentious,improper,incomplete,2,less_conv,inconv,slightly_prob,recommended,priority
-pretentious,improper,incomplete,2,less_conv,inconv,slightly_prob,priority,spec_prior
-pretentious,improper,incomplete,2,less_conv,inconv,slightly_prob,not_recom,not_recom
-pretentious,improper,incomplete,2,less_conv,inconv,problematic,recommended,spec_prior
-pretentious,improper,incomplete,2,less_conv,inconv,problematic,priority,spec_prior
-pretentious,improper,incomplete,2,less_conv,inconv,problematic,not_recom,not_recom
-pretentious,improper,incomplete,2,critical,convenient,nonprob,recommended,priority
-pretentious,improper,incomplete,2,critical,convenient,nonprob,priority,spec_prior
-pretentious,improper,incomplete,2,critical,convenient,nonprob,not_recom,not_recom
-pretentious,improper,incomplete,2,critical,convenient,slightly_prob,recommended,priority
-pretentious,improper,incomplete,2,critical,convenient,slightly_prob,priority,spec_prior
-pretentious,improper,incomplete,2,critical,convenient,slightly_prob,not_recom,not_recom
-pretentious,improper,incomplete,2,critical,convenient,problematic,recommended,spec_prior
-pretentious,improper,incomplete,2,critical,convenient,problematic,priority,spec_prior
-pretentious,improper,incomplete,2,critical,convenient,problematic,not_recom,not_recom
-pretentious,improper,incomplete,2,critical,inconv,nonprob,recommended,priority
-pretentious,improper,incomplete,2,critical,inconv,nonprob,priority,spec_prior
-pretentious,improper,incomplete,2,critical,inconv,nonprob,not_recom,not_recom
-pretentious,improper,incomplete,2,critical,inconv,slightly_prob,recommended,priority
-pretentious,improper,incomplete,2,critical,inconv,slightly_prob,priority,spec_prior
-pretentious,improper,incomplete,2,critical,inconv,slightly_prob,not_recom,not_recom
-pretentious,improper,incomplete,2,critical,inconv,problematic,recommended,spec_prior
-pretentious,improper,incomplete,2,critical,inconv,problematic,priority,spec_prior
-pretentious,improper,incomplete,2,critical,inconv,problematic,not_recom,not_recom
-pretentious,improper,incomplete,3,convenient,convenient,nonprob,recommended,priority
-pretentious,improper,incomplete,3,convenient,convenient,nonprob,priority,priority
-pretentious,improper,incomplete,3,convenient,convenient,nonprob,not_recom,not_recom
-pretentious,improper,incomplete,3,convenient,convenient,slightly_prob,recommended,priority
-pretentious,improper,incomplete,3,convenient,convenient,slightly_prob,priority,priority
-pretentious,improper,incomplete,3,convenient,convenient,slightly_prob,not_recom,not_recom
-pretentious,improper,incomplete,3,convenient,convenient,problematic,recommended,priority
-pretentious,improper,incomplete,3,convenient,convenient,problematic,priority,priority
-pretentious,improper,incomplete,3,convenient,convenient,problematic,not_recom,not_recom
-pretentious,improper,incomplete,3,convenient,inconv,nonprob,recommended,priority
-pretentious,improper,incomplete,3,convenient,inconv,nonprob,priority,spec_prior
-pretentious,improper,incomplete,3,convenient,inconv,nonprob,not_recom,not_recom
-pretentious,improper,incomplete,3,convenient,inconv,slightly_prob,recommended,priority
-pretentious,improper,incomplete,3,convenient,inconv,slightly_prob,priority,spec_prior
-pretentious,improper,incomplete,3,convenient,inconv,slightly_prob,not_recom,not_recom
-pretentious,improper,incomplete,3,convenient,inconv,problematic,recommended,spec_prior
-pretentious,improper,incomplete,3,convenient,inconv,problematic,priority,spec_prior
-pretentious,improper,incomplete,3,convenient,inconv,problematic,not_recom,not_recom
-pretentious,improper,incomplete,3,less_conv,convenient,nonprob,recommended,priority
-pretentious,improper,incomplete,3,less_conv,convenient,nonprob,priority,spec_prior
-pretentious,improper,incomplete,3,less_conv,convenient,nonprob,not_recom,not_recom
-pretentious,improper,incomplete,3,less_conv,convenient,slightly_prob,recommended,priority
-pretentious,improper,incomplete,3,less_conv,convenient,slightly_prob,priority,spec_prior
-pretentious,improper,incomplete,3,less_conv,convenient,slightly_prob,not_recom,not_recom
-pretentious,improper,incomplete,3,less_conv,convenient,problematic,recommended,spec_prior
-pretentious,improper,incomplete,3,less_conv,convenient,problematic,priority,spec_prior
-pretentious,improper,incomplete,3,less_conv,convenient,problematic,not_recom,not_recom
-pretentious,improper,incomplete,3,less_conv,inconv,nonprob,recommended,priority
-pretentious,improper,incomplete,3,less_conv,inconv,nonprob,priority,spec_prior
-pretentious,improper,incomplete,3,less_conv,inconv,nonprob,not_recom,not_recom
-pretentious,improper,incomplete,3,less_conv,inconv,slightly_prob,recommended,priority
-pretentious,improper,incomplete,3,less_conv,inconv,slightly_prob,priority,spec_prior
-pretentious,improper,incomplete,3,less_conv,inconv,slightly_prob,not_recom,not_recom
-pretentious,improper,incomplete,3,less_conv,inconv,problematic,recommended,spec_prior
-pretentious,improper,incomplete,3,less_conv,inconv,problematic,priority,spec_prior
-pretentious,improper,incomplete,3,less_conv,inconv,problematic,not_recom,not_recom
-pretentious,improper,incomplete,3,critical,convenient,nonprob,recommended,priority
-pretentious,improper,incomplete,3,critical,convenient,nonprob,priority,spec_prior
-pretentious,improper,incomplete,3,critical,convenient,nonprob,not_recom,not_recom
-pretentious,improper,incomplete,3,critical,convenient,slightly_prob,recommended,priority
-pretentious,improper,incomplete,3,critical,convenient,slightly_prob,priority,spec_prior
-pretentious,improper,incomplete,3,critical,convenient,slightly_prob,not_recom,not_recom
-pretentious,improper,incomplete,3,critical,convenient,problematic,recommended,spec_prior
-pretentious,improper,incomplete,3,critical,convenient,problematic,priority,spec_prior
-pretentious,improper,incomplete,3,critical,convenient,problematic,not_recom,not_recom
-pretentious,improper,incomplete,3,critical,inconv,nonprob,recommended,priority
-pretentious,improper,incomplete,3,critical,inconv,nonprob,priority,spec_prior
-pretentious,improper,incomplete,3,critical,inconv,nonprob,not_recom,not_recom
-pretentious,improper,incomplete,3,critical,inconv,slightly_prob,recommended,priority
-pretentious,improper,incomplete,3,critical,inconv,slightly_prob,priority,spec_prior
-pretentious,improper,incomplete,3,critical,inconv,slightly_prob,not_recom,not_recom
-pretentious,improper,incomplete,3,critical,inconv,problematic,recommended,spec_prior
-pretentious,improper,incomplete,3,critical,inconv,problematic,priority,spec_prior
-pretentious,improper,incomplete,3,critical,inconv,problematic,not_recom,not_recom
-pretentious,improper,incomplete,more,convenient,convenient,nonprob,recommended,priority
-pretentious,improper,incomplete,more,convenient,convenient,nonprob,priority,priority
-pretentious,improper,incomplete,more,convenient,convenient,nonprob,not_recom,not_recom
-pretentious,improper,incomplete,more,convenient,convenient,slightly_prob,recommended,priority
-pretentious,improper,incomplete,more,convenient,convenient,slightly_prob,priority,priority
-pretentious,improper,incomplete,more,convenient,convenient,slightly_prob,not_recom,not_recom
-pretentious,improper,incomplete,more,convenient,convenient,problematic,recommended,priority
-pretentious,improper,incomplete,more,convenient,convenient,problematic,priority,priority
-pretentious,improper,incomplete,more,convenient,convenient,problematic,not_recom,not_recom
-pretentious,improper,incomplete,more,convenient,inconv,nonprob,recommended,priority
-pretentious,improper,incomplete,more,convenient,inconv,nonprob,priority,spec_prior
-pretentious,improper,incomplete,more,convenient,inconv,nonprob,not_recom,not_recom
-pretentious,improper,incomplete,more,convenient,inconv,slightly_prob,recommended,priority
-pretentious,improper,incomplete,more,convenient,inconv,slightly_prob,priority,spec_prior
-pretentious,improper,incomplete,more,convenient,inconv,slightly_prob,not_recom,not_recom
-pretentious,improper,incomplete,more,convenient,inconv,problematic,recommended,spec_prior
-pretentious,improper,incomplete,more,convenient,inconv,problematic,priority,spec_prior
-pretentious,improper,incomplete,more,convenient,inconv,problematic,not_recom,not_recom
-pretentious,improper,incomplete,more,less_conv,convenient,nonprob,recommended,priority
-pretentious,improper,incomplete,more,less_conv,convenient,nonprob,priority,spec_prior
-pretentious,improper,incomplete,more,less_conv,convenient,nonprob,not_recom,not_recom
-pretentious,improper,incomplete,more,less_conv,convenient,slightly_prob,recommended,priority
-pretentious,improper,incomplete,more,less_conv,convenient,slightly_prob,priority,spec_prior
-pretentious,improper,incomplete,more,less_conv,convenient,slightly_prob,not_recom,not_recom
-pretentious,improper,incomplete,more,less_conv,convenient,problematic,recommended,spec_prior
-pretentious,improper,incomplete,more,less_conv,convenient,problematic,priority,spec_prior
-pretentious,improper,incomplete,more,less_conv,convenient,problematic,not_recom,not_recom
-pretentious,improper,incomplete,more,less_conv,inconv,nonprob,recommended,priority
-pretentious,improper,incomplete,more,less_conv,inconv,nonprob,priority,spec_prior
-pretentious,improper,incomplete,more,less_conv,inconv,nonprob,not_recom,not_recom
-pretentious,improper,incomplete,more,less_conv,inconv,slightly_prob,recommended,priority
-pretentious,improper,incomplete,more,less_conv,inconv,slightly_prob,priority,spec_prior
-pretentious,improper,incomplete,more,less_conv,inconv,slightly_prob,not_recom,not_recom
-pretentious,improper,incomplete,more,less_conv,inconv,problematic,recommended,spec_prior
-pretentious,improper,incomplete,more,less_conv,inconv,problematic,priority,spec_prior
-pretentious,improper,incomplete,more,less_conv,inconv,problematic,not_recom,not_recom
-pretentious,improper,incomplete,more,critical,convenient,nonprob,recommended,priority
-pretentious,improper,incomplete,more,critical,convenient,nonprob,priority,spec_prior
-pretentious,improper,incomplete,more,critical,convenient,nonprob,not_recom,not_recom
-pretentious,improper,incomplete,more,critical,convenient,slightly_prob,recommended,priority
-pretentious,improper,incomplete,more,critical,convenient,slightly_prob,priority,spec_prior
-pretentious,improper,incomplete,more,critical,convenient,slightly_prob,not_recom,not_recom
-pretentious,improper,incomplete,more,critical,convenient,problematic,recommended,spec_prior
-pretentious,improper,incomplete,more,critical,convenient,problematic,priority,spec_prior
-pretentious,improper,incomplete,more,critical,convenient,problematic,not_recom,not_recom
-pretentious,improper,incomplete,more,critical,inconv,nonprob,recommended,priority
-pretentious,improper,incomplete,more,critical,inconv,nonprob,priority,spec_prior
-pretentious,improper,incomplete,more,critical,inconv,nonprob,not_recom,not_recom
-pretentious,improper,incomplete,more,critical,inconv,slightly_prob,recommended,priority
-pretentious,improper,incomplete,more,critical,inconv,slightly_prob,priority,spec_prior
-pretentious,improper,incomplete,more,critical,inconv,slightly_prob,not_recom,not_recom
-pretentious,improper,incomplete,more,critical,inconv,problematic,recommended,spec_prior
-pretentious,improper,incomplete,more,critical,inconv,problematic,priority,spec_prior
-pretentious,improper,incomplete,more,critical,inconv,problematic,not_recom,not_recom
-pretentious,improper,foster,1,convenient,convenient,nonprob,recommended,priority
-pretentious,improper,foster,1,convenient,convenient,nonprob,priority,priority
-pretentious,improper,foster,1,convenient,convenient,nonprob,not_recom,not_recom
-pretentious,improper,foster,1,convenient,convenient,slightly_prob,recommended,priority
-pretentious,improper,foster,1,convenient,convenient,slightly_prob,priority,priority
-pretentious,improper,foster,1,convenient,convenient,slightly_prob,not_recom,not_recom
-pretentious,improper,foster,1,convenient,convenient,problematic,recommended,priority
-pretentious,improper,foster,1,convenient,convenient,problematic,priority,priority
-pretentious,improper,foster,1,convenient,convenient,problematic,not_recom,not_recom
-pretentious,improper,foster,1,convenient,inconv,nonprob,recommended,priority
-pretentious,improper,foster,1,convenient,inconv,nonprob,priority,spec_prior
-pretentious,improper,foster,1,convenient,inconv,nonprob,not_recom,not_recom
-pretentious,improper,foster,1,convenient,inconv,slightly_prob,recommended,priority
-pretentious,improper,foster,1,convenient,inconv,slightly_prob,priority,spec_prior
-pretentious,improper,foster,1,convenient,inconv,slightly_prob,not_recom,not_recom
-pretentious,improper,foster,1,convenient,inconv,problematic,recommended,spec_prior
-pretentious,improper,foster,1,convenient,inconv,problematic,priority,spec_prior
-pretentious,improper,foster,1,convenient,inconv,problematic,not_recom,not_recom
-pretentious,improper,foster,1,less_conv,convenient,nonprob,recommended,priority
-pretentious,improper,foster,1,less_conv,convenient,nonprob,priority,spec_prior
-pretentious,improper,foster,1,less_conv,convenient,nonprob,not_recom,not_recom
-pretentious,improper,foster,1,less_conv,convenient,slightly_prob,recommended,priority
-pretentious,improper,foster,1,less_conv,convenient,slightly_prob,priority,spec_prior
-pretentious,improper,foster,1,less_conv,convenient,slightly_prob,not_recom,not_recom
-pretentious,improper,foster,1,less_conv,convenient,problematic,recommended,spec_prior
-pretentious,improper,foster,1,less_conv,convenient,problematic,priority,spec_prior
-pretentious,improper,foster,1,less_conv,convenient,problematic,not_recom,not_recom
-pretentious,improper,foster,1,less_conv,inconv,nonprob,recommended,priority
-pretentious,improper,foster,1,less_conv,inconv,nonprob,priority,spec_prior
-pretentious,improper,foster,1,less_conv,inconv,nonprob,not_recom,not_recom
-pretentious,improper,foster,1,less_conv,inconv,slightly_prob,recommended,priority
-pretentious,improper,foster,1,less_conv,inconv,slightly_prob,priority,spec_prior
-pretentious,improper,foster,1,less_conv,inconv,slightly_prob,not_recom,not_recom
-pretentious,improper,foster,1,less_conv,inconv,problematic,recommended,spec_prior
-pretentious,improper,foster,1,less_conv,inconv,problematic,priority,spec_prior
-pretentious,improper,foster,1,less_conv,inconv,problematic,not_recom,not_recom
-pretentious,improper,foster,1,critical,convenient,nonprob,recommended,priority
-pretentious,improper,foster,1,critical,convenient,nonprob,priority,spec_prior
-pretentious,improper,foster,1,critical,convenient,nonprob,not_recom,not_recom
-pretentious,improper,foster,1,critical,convenient,slightly_prob,recommended,priority
-pretentious,improper,foster,1,critical,convenient,slightly_prob,priority,spec_prior
-pretentious,improper,foster,1,critical,convenient,slightly_prob,not_recom,not_recom
-pretentious,improper,foster,1,critical,convenient,problematic,recommended,spec_prior
-pretentious,improper,foster,1,critical,convenient,problematic,priority,spec_prior
-pretentious,improper,foster,1,critical,convenient,problematic,not_recom,not_recom
-pretentious,improper,foster,1,critical,inconv,nonprob,recommended,priority
-pretentious,improper,foster,1,critical,inconv,nonprob,priority,spec_prior
-pretentious,improper,foster,1,critical,inconv,nonprob,not_recom,not_recom
-pretentious,improper,foster,1,critical,inconv,slightly_prob,recommended,priority
-pretentious,improper,foster,1,critical,inconv,slightly_prob,priority,spec_prior
-pretentious,improper,foster,1,critical,inconv,slightly_prob,not_recom,not_recom
-pretentious,improper,foster,1,critical,inconv,problematic,recommended,spec_prior
-pretentious,improper,foster,1,critical,inconv,problematic,priority,spec_prior
-pretentious,improper,foster,1,critical,inconv,problematic,not_recom,not_recom
-pretentious,improper,foster,2,convenient,convenient,nonprob,recommended,priority
-pretentious,improper,foster,2,convenient,convenient,nonprob,priority,priority
-pretentious,improper,foster,2,convenient,convenient,nonprob,not_recom,not_recom
-pretentious,improper,foster,2,convenient,convenient,slightly_prob,recommended,priority
-pretentious,improper,foster,2,convenient,convenient,slightly_prob,priority,priority
-pretentious,improper,foster,2,convenient,convenient,slightly_prob,not_recom,not_recom
-pretentious,improper,foster,2,convenient,convenient,problematic,recommended,priority
-pretentious,improper,foster,2,convenient,convenient,problematic,priority,priority
-pretentious,improper,foster,2,convenient,convenient,problematic,not_recom,not_recom
-pretentious,improper,foster,2,convenient,inconv,nonprob,recommended,priority
-pretentious,improper,foster,2,convenient,inconv,nonprob,priority,spec_prior
-pretentious,improper,foster,2,convenient,inconv,nonprob,not_recom,not_recom
-pretentious,improper,foster,2,convenient,inconv,slightly_prob,recommended,priority
-pretentious,improper,foster,2,convenient,inconv,slightly_prob,priority,spec_prior
-pretentious,improper,foster,2,convenient,inconv,slightly_prob,not_recom,not_recom
-pretentious,improper,foster,2,convenient,inconv,problematic,recommended,spec_prior
-pretentious,improper,foster,2,convenient,inconv,problematic,priority,spec_prior
-pretentious,improper,foster,2,convenient,inconv,problematic,not_recom,not_recom
-pretentious,improper,foster,2,less_conv,convenient,nonprob,recommended,priority
-pretentious,improper,foster,2,less_conv,convenient,nonprob,priority,spec_prior
-pretentious,improper,foster,2,less_conv,convenient,nonprob,not_recom,not_recom
-pretentious,improper,foster,2,less_conv,convenient,slightly_prob,recommended,priority
-pretentious,improper,foster,2,less_conv,convenient,slightly_prob,priority,spec_prior
-pretentious,improper,foster,2,less_conv,convenient,slightly_prob,not_recom,not_recom
-pretentious,improper,foster,2,less_conv,convenient,problematic,recommended,spec_prior
-pretentious,improper,foster,2,less_conv,convenient,problematic,priority,spec_prior
-pretentious,improper,foster,2,less_conv,convenient,problematic,not_recom,not_recom
-pretentious,improper,foster,2,less_conv,inconv,nonprob,recommended,priority
-pretentious,improper,foster,2,less_conv,inconv,nonprob,priority,spec_prior
-pretentious,improper,foster,2,less_conv,inconv,nonprob,not_recom,not_recom
-pretentious,improper,foster,2,less_conv,inconv,slightly_prob,recommended,priority
-pretentious,improper,foster,2,less_conv,inconv,slightly_prob,priority,spec_prior
-pretentious,improper,foster,2,less_conv,inconv,slightly_prob,not_recom,not_recom
-pretentious,improper,foster,2,less_conv,inconv,problematic,recommended,spec_prior
-pretentious,improper,foster,2,less_conv,inconv,problematic,priority,spec_prior
-pretentious,improper,foster,2,less_conv,inconv,problematic,not_recom,not_recom
-pretentious,improper,foster,2,critical,convenient,nonprob,recommended,priority
-pretentious,improper,foster,2,critical,convenient,nonprob,priority,spec_prior
-pretentious,improper,foster,2,critical,convenient,nonprob,not_recom,not_recom
-pretentious,improper,foster,2,critical,convenient,slightly_prob,recommended,priority
-pretentious,improper,foster,2,critical,convenient,slightly_prob,priority,spec_prior
-pretentious,improper,foster,2,critical,convenient,slightly_prob,not_recom,not_recom
-pretentious,improper,foster,2,critical,convenient,problematic,recommended,spec_prior
-pretentious,improper,foster,2,critical,convenient,problematic,priority,spec_prior
-pretentious,improper,foster,2,critical,convenient,problematic,not_recom,not_recom
-pretentious,improper,foster,2,critical,inconv,nonprob,recommended,priority -pretentious,improper,foster,2,critical,inconv,nonprob,priority,spec_prior -pretentious,improper,foster,2,critical,inconv,nonprob,not_recom,not_recom -pretentious,improper,foster,2,critical,inconv,slightly_prob,recommended,priority -pretentious,improper,foster,2,critical,inconv,slightly_prob,priority,spec_prior -pretentious,improper,foster,2,critical,inconv,slightly_prob,not_recom,not_recom -pretentious,improper,foster,2,critical,inconv,problematic,recommended,spec_prior -pretentious,improper,foster,2,critical,inconv,problematic,priority,spec_prior -pretentious,improper,foster,2,critical,inconv,problematic,not_recom,not_recom -pretentious,improper,foster,3,convenient,convenient,nonprob,recommended,priority -pretentious,improper,foster,3,convenient,convenient,nonprob,priority,priority -pretentious,improper,foster,3,convenient,convenient,nonprob,not_recom,not_recom -pretentious,improper,foster,3,convenient,convenient,slightly_prob,recommended,priority -pretentious,improper,foster,3,convenient,convenient,slightly_prob,priority,priority -pretentious,improper,foster,3,convenient,convenient,slightly_prob,not_recom,not_recom -pretentious,improper,foster,3,convenient,convenient,problematic,recommended,priority -pretentious,improper,foster,3,convenient,convenient,problematic,priority,priority -pretentious,improper,foster,3,convenient,convenient,problematic,not_recom,not_recom -pretentious,improper,foster,3,convenient,inconv,nonprob,recommended,priority -pretentious,improper,foster,3,convenient,inconv,nonprob,priority,spec_prior -pretentious,improper,foster,3,convenient,inconv,nonprob,not_recom,not_recom -pretentious,improper,foster,3,convenient,inconv,slightly_prob,recommended,priority -pretentious,improper,foster,3,convenient,inconv,slightly_prob,priority,spec_prior -pretentious,improper,foster,3,convenient,inconv,slightly_prob,not_recom,not_recom 
-pretentious,improper,foster,3,convenient,inconv,problematic,recommended,spec_prior -pretentious,improper,foster,3,convenient,inconv,problematic,priority,spec_prior -pretentious,improper,foster,3,convenient,inconv,problematic,not_recom,not_recom -pretentious,improper,foster,3,less_conv,convenient,nonprob,recommended,priority -pretentious,improper,foster,3,less_conv,convenient,nonprob,priority,spec_prior -pretentious,improper,foster,3,less_conv,convenient,nonprob,not_recom,not_recom -pretentious,improper,foster,3,less_conv,convenient,slightly_prob,recommended,priority -pretentious,improper,foster,3,less_conv,convenient,slightly_prob,priority,spec_prior -pretentious,improper,foster,3,less_conv,convenient,slightly_prob,not_recom,not_recom -pretentious,improper,foster,3,less_conv,convenient,problematic,recommended,spec_prior -pretentious,improper,foster,3,less_conv,convenient,problematic,priority,spec_prior -pretentious,improper,foster,3,less_conv,convenient,problematic,not_recom,not_recom -pretentious,improper,foster,3,less_conv,inconv,nonprob,recommended,priority -pretentious,improper,foster,3,less_conv,inconv,nonprob,priority,spec_prior -pretentious,improper,foster,3,less_conv,inconv,nonprob,not_recom,not_recom -pretentious,improper,foster,3,less_conv,inconv,slightly_prob,recommended,priority -pretentious,improper,foster,3,less_conv,inconv,slightly_prob,priority,spec_prior -pretentious,improper,foster,3,less_conv,inconv,slightly_prob,not_recom,not_recom -pretentious,improper,foster,3,less_conv,inconv,problematic,recommended,spec_prior -pretentious,improper,foster,3,less_conv,inconv,problematic,priority,spec_prior -pretentious,improper,foster,3,less_conv,inconv,problematic,not_recom,not_recom -pretentious,improper,foster,3,critical,convenient,nonprob,recommended,priority -pretentious,improper,foster,3,critical,convenient,nonprob,priority,spec_prior -pretentious,improper,foster,3,critical,convenient,nonprob,not_recom,not_recom 
-pretentious,improper,foster,3,critical,convenient,slightly_prob,recommended,priority -pretentious,improper,foster,3,critical,convenient,slightly_prob,priority,spec_prior -pretentious,improper,foster,3,critical,convenient,slightly_prob,not_recom,not_recom -pretentious,improper,foster,3,critical,convenient,problematic,recommended,spec_prior -pretentious,improper,foster,3,critical,convenient,problematic,priority,spec_prior -pretentious,improper,foster,3,critical,convenient,problematic,not_recom,not_recom -pretentious,improper,foster,3,critical,inconv,nonprob,recommended,priority -pretentious,improper,foster,3,critical,inconv,nonprob,priority,spec_prior -pretentious,improper,foster,3,critical,inconv,nonprob,not_recom,not_recom -pretentious,improper,foster,3,critical,inconv,slightly_prob,recommended,priority -pretentious,improper,foster,3,critical,inconv,slightly_prob,priority,spec_prior -pretentious,improper,foster,3,critical,inconv,slightly_prob,not_recom,not_recom -pretentious,improper,foster,3,critical,inconv,problematic,recommended,spec_prior -pretentious,improper,foster,3,critical,inconv,problematic,priority,spec_prior -pretentious,improper,foster,3,critical,inconv,problematic,not_recom,not_recom -pretentious,improper,foster,more,convenient,convenient,nonprob,recommended,priority -pretentious,improper,foster,more,convenient,convenient,nonprob,priority,priority -pretentious,improper,foster,more,convenient,convenient,nonprob,not_recom,not_recom -pretentious,improper,foster,more,convenient,convenient,slightly_prob,recommended,priority -pretentious,improper,foster,more,convenient,convenient,slightly_prob,priority,priority -pretentious,improper,foster,more,convenient,convenient,slightly_prob,not_recom,not_recom -pretentious,improper,foster,more,convenient,convenient,problematic,recommended,priority -pretentious,improper,foster,more,convenient,convenient,problematic,priority,priority 
-pretentious,improper,foster,more,convenient,convenient,problematic,not_recom,not_recom -pretentious,improper,foster,more,convenient,inconv,nonprob,recommended,priority -pretentious,improper,foster,more,convenient,inconv,nonprob,priority,spec_prior -pretentious,improper,foster,more,convenient,inconv,nonprob,not_recom,not_recom -pretentious,improper,foster,more,convenient,inconv,slightly_prob,recommended,priority -pretentious,improper,foster,more,convenient,inconv,slightly_prob,priority,spec_prior -pretentious,improper,foster,more,convenient,inconv,slightly_prob,not_recom,not_recom -pretentious,improper,foster,more,convenient,inconv,problematic,recommended,spec_prior -pretentious,improper,foster,more,convenient,inconv,problematic,priority,spec_prior -pretentious,improper,foster,more,convenient,inconv,problematic,not_recom,not_recom -pretentious,improper,foster,more,less_conv,convenient,nonprob,recommended,priority -pretentious,improper,foster,more,less_conv,convenient,nonprob,priority,spec_prior -pretentious,improper,foster,more,less_conv,convenient,nonprob,not_recom,not_recom -pretentious,improper,foster,more,less_conv,convenient,slightly_prob,recommended,priority -pretentious,improper,foster,more,less_conv,convenient,slightly_prob,priority,spec_prior -pretentious,improper,foster,more,less_conv,convenient,slightly_prob,not_recom,not_recom -pretentious,improper,foster,more,less_conv,convenient,problematic,recommended,spec_prior -pretentious,improper,foster,more,less_conv,convenient,problematic,priority,spec_prior -pretentious,improper,foster,more,less_conv,convenient,problematic,not_recom,not_recom -pretentious,improper,foster,more,less_conv,inconv,nonprob,recommended,priority -pretentious,improper,foster,more,less_conv,inconv,nonprob,priority,spec_prior -pretentious,improper,foster,more,less_conv,inconv,nonprob,not_recom,not_recom -pretentious,improper,foster,more,less_conv,inconv,slightly_prob,recommended,priority 
-pretentious,improper,foster,more,less_conv,inconv,slightly_prob,priority,spec_prior -pretentious,improper,foster,more,less_conv,inconv,slightly_prob,not_recom,not_recom -pretentious,improper,foster,more,less_conv,inconv,problematic,recommended,spec_prior -pretentious,improper,foster,more,less_conv,inconv,problematic,priority,spec_prior -pretentious,improper,foster,more,less_conv,inconv,problematic,not_recom,not_recom -pretentious,improper,foster,more,critical,convenient,nonprob,recommended,priority -pretentious,improper,foster,more,critical,convenient,nonprob,priority,spec_prior -pretentious,improper,foster,more,critical,convenient,nonprob,not_recom,not_recom -pretentious,improper,foster,more,critical,convenient,slightly_prob,recommended,priority -pretentious,improper,foster,more,critical,convenient,slightly_prob,priority,spec_prior -pretentious,improper,foster,more,critical,convenient,slightly_prob,not_recom,not_recom -pretentious,improper,foster,more,critical,convenient,problematic,recommended,spec_prior -pretentious,improper,foster,more,critical,convenient,problematic,priority,spec_prior -pretentious,improper,foster,more,critical,convenient,problematic,not_recom,not_recom -pretentious,improper,foster,more,critical,inconv,nonprob,recommended,priority -pretentious,improper,foster,more,critical,inconv,nonprob,priority,spec_prior -pretentious,improper,foster,more,critical,inconv,nonprob,not_recom,not_recom -pretentious,improper,foster,more,critical,inconv,slightly_prob,recommended,priority -pretentious,improper,foster,more,critical,inconv,slightly_prob,priority,spec_prior -pretentious,improper,foster,more,critical,inconv,slightly_prob,not_recom,not_recom -pretentious,improper,foster,more,critical,inconv,problematic,recommended,spec_prior -pretentious,improper,foster,more,critical,inconv,problematic,priority,spec_prior -pretentious,improper,foster,more,critical,inconv,problematic,not_recom,not_recom 
-pretentious,critical,complete,1,convenient,convenient,nonprob,recommended,priority -pretentious,critical,complete,1,convenient,convenient,nonprob,priority,priority -pretentious,critical,complete,1,convenient,convenient,nonprob,not_recom,not_recom -pretentious,critical,complete,1,convenient,convenient,slightly_prob,recommended,priority -pretentious,critical,complete,1,convenient,convenient,slightly_prob,priority,priority -pretentious,critical,complete,1,convenient,convenient,slightly_prob,not_recom,not_recom -pretentious,critical,complete,1,convenient,convenient,problematic,recommended,priority -pretentious,critical,complete,1,convenient,convenient,problematic,priority,priority -pretentious,critical,complete,1,convenient,convenient,problematic,not_recom,not_recom -pretentious,critical,complete,1,convenient,inconv,nonprob,recommended,priority -pretentious,critical,complete,1,convenient,inconv,nonprob,priority,spec_prior -pretentious,critical,complete,1,convenient,inconv,nonprob,not_recom,not_recom -pretentious,critical,complete,1,convenient,inconv,slightly_prob,recommended,priority -pretentious,critical,complete,1,convenient,inconv,slightly_prob,priority,spec_prior -pretentious,critical,complete,1,convenient,inconv,slightly_prob,not_recom,not_recom -pretentious,critical,complete,1,convenient,inconv,problematic,recommended,spec_prior -pretentious,critical,complete,1,convenient,inconv,problematic,priority,spec_prior -pretentious,critical,complete,1,convenient,inconv,problematic,not_recom,not_recom -pretentious,critical,complete,1,less_conv,convenient,nonprob,recommended,priority -pretentious,critical,complete,1,less_conv,convenient,nonprob,priority,spec_prior -pretentious,critical,complete,1,less_conv,convenient,nonprob,not_recom,not_recom -pretentious,critical,complete,1,less_conv,convenient,slightly_prob,recommended,priority -pretentious,critical,complete,1,less_conv,convenient,slightly_prob,priority,spec_prior 
-pretentious,critical,complete,1,less_conv,convenient,slightly_prob,not_recom,not_recom -pretentious,critical,complete,1,less_conv,convenient,problematic,recommended,spec_prior -pretentious,critical,complete,1,less_conv,convenient,problematic,priority,spec_prior -pretentious,critical,complete,1,less_conv,convenient,problematic,not_recom,not_recom -pretentious,critical,complete,1,less_conv,inconv,nonprob,recommended,priority -pretentious,critical,complete,1,less_conv,inconv,nonprob,priority,spec_prior -pretentious,critical,complete,1,less_conv,inconv,nonprob,not_recom,not_recom -pretentious,critical,complete,1,less_conv,inconv,slightly_prob,recommended,priority -pretentious,critical,complete,1,less_conv,inconv,slightly_prob,priority,spec_prior -pretentious,critical,complete,1,less_conv,inconv,slightly_prob,not_recom,not_recom -pretentious,critical,complete,1,less_conv,inconv,problematic,recommended,spec_prior -pretentious,critical,complete,1,less_conv,inconv,problematic,priority,spec_prior -pretentious,critical,complete,1,less_conv,inconv,problematic,not_recom,not_recom -pretentious,critical,complete,1,critical,convenient,nonprob,recommended,priority -pretentious,critical,complete,1,critical,convenient,nonprob,priority,spec_prior -pretentious,critical,complete,1,critical,convenient,nonprob,not_recom,not_recom -pretentious,critical,complete,1,critical,convenient,slightly_prob,recommended,priority -pretentious,critical,complete,1,critical,convenient,slightly_prob,priority,spec_prior -pretentious,critical,complete,1,critical,convenient,slightly_prob,not_recom,not_recom -pretentious,critical,complete,1,critical,convenient,problematic,recommended,spec_prior -pretentious,critical,complete,1,critical,convenient,problematic,priority,spec_prior -pretentious,critical,complete,1,critical,convenient,problematic,not_recom,not_recom -pretentious,critical,complete,1,critical,inconv,nonprob,recommended,priority 
-pretentious,critical,complete,1,critical,inconv,nonprob,priority,spec_prior -pretentious,critical,complete,1,critical,inconv,nonprob,not_recom,not_recom -pretentious,critical,complete,1,critical,inconv,slightly_prob,recommended,priority -pretentious,critical,complete,1,critical,inconv,slightly_prob,priority,spec_prior -pretentious,critical,complete,1,critical,inconv,slightly_prob,not_recom,not_recom -pretentious,critical,complete,1,critical,inconv,problematic,recommended,spec_prior -pretentious,critical,complete,1,critical,inconv,problematic,priority,spec_prior -pretentious,critical,complete,1,critical,inconv,problematic,not_recom,not_recom -pretentious,critical,complete,2,convenient,convenient,nonprob,recommended,priority -pretentious,critical,complete,2,convenient,convenient,nonprob,priority,spec_prior -pretentious,critical,complete,2,convenient,convenient,nonprob,not_recom,not_recom -pretentious,critical,complete,2,convenient,convenient,slightly_prob,recommended,priority -pretentious,critical,complete,2,convenient,convenient,slightly_prob,priority,spec_prior -pretentious,critical,complete,2,convenient,convenient,slightly_prob,not_recom,not_recom -pretentious,critical,complete,2,convenient,convenient,problematic,recommended,spec_prior -pretentious,critical,complete,2,convenient,convenient,problematic,priority,spec_prior -pretentious,critical,complete,2,convenient,convenient,problematic,not_recom,not_recom -pretentious,critical,complete,2,convenient,inconv,nonprob,recommended,priority -pretentious,critical,complete,2,convenient,inconv,nonprob,priority,spec_prior -pretentious,critical,complete,2,convenient,inconv,nonprob,not_recom,not_recom -pretentious,critical,complete,2,convenient,inconv,slightly_prob,recommended,priority -pretentious,critical,complete,2,convenient,inconv,slightly_prob,priority,spec_prior -pretentious,critical,complete,2,convenient,inconv,slightly_prob,not_recom,not_recom 
-pretentious,critical,complete,2,convenient,inconv,problematic,recommended,spec_prior -pretentious,critical,complete,2,convenient,inconv,problematic,priority,spec_prior -pretentious,critical,complete,2,convenient,inconv,problematic,not_recom,not_recom -pretentious,critical,complete,2,less_conv,convenient,nonprob,recommended,priority -pretentious,critical,complete,2,less_conv,convenient,nonprob,priority,spec_prior -pretentious,critical,complete,2,less_conv,convenient,nonprob,not_recom,not_recom -pretentious,critical,complete,2,less_conv,convenient,slightly_prob,recommended,priority -pretentious,critical,complete,2,less_conv,convenient,slightly_prob,priority,spec_prior -pretentious,critical,complete,2,less_conv,convenient,slightly_prob,not_recom,not_recom -pretentious,critical,complete,2,less_conv,convenient,problematic,recommended,spec_prior -pretentious,critical,complete,2,less_conv,convenient,problematic,priority,spec_prior -pretentious,critical,complete,2,less_conv,convenient,problematic,not_recom,not_recom -pretentious,critical,complete,2,less_conv,inconv,nonprob,recommended,priority -pretentious,critical,complete,2,less_conv,inconv,nonprob,priority,spec_prior -pretentious,critical,complete,2,less_conv,inconv,nonprob,not_recom,not_recom -pretentious,critical,complete,2,less_conv,inconv,slightly_prob,recommended,priority -pretentious,critical,complete,2,less_conv,inconv,slightly_prob,priority,spec_prior -pretentious,critical,complete,2,less_conv,inconv,slightly_prob,not_recom,not_recom -pretentious,critical,complete,2,less_conv,inconv,problematic,recommended,spec_prior -pretentious,critical,complete,2,less_conv,inconv,problematic,priority,spec_prior -pretentious,critical,complete,2,less_conv,inconv,problematic,not_recom,not_recom -pretentious,critical,complete,2,critical,convenient,nonprob,recommended,spec_prior -pretentious,critical,complete,2,critical,convenient,nonprob,priority,spec_prior 
-pretentious,critical,complete,2,critical,convenient,nonprob,not_recom,not_recom -pretentious,critical,complete,2,critical,convenient,slightly_prob,recommended,spec_prior -pretentious,critical,complete,2,critical,convenient,slightly_prob,priority,spec_prior -pretentious,critical,complete,2,critical,convenient,slightly_prob,not_recom,not_recom -pretentious,critical,complete,2,critical,convenient,problematic,recommended,spec_prior -pretentious,critical,complete,2,critical,convenient,problematic,priority,spec_prior -pretentious,critical,complete,2,critical,convenient,problematic,not_recom,not_recom -pretentious,critical,complete,2,critical,inconv,nonprob,recommended,spec_prior -pretentious,critical,complete,2,critical,inconv,nonprob,priority,spec_prior -pretentious,critical,complete,2,critical,inconv,nonprob,not_recom,not_recom -pretentious,critical,complete,2,critical,inconv,slightly_prob,recommended,spec_prior -pretentious,critical,complete,2,critical,inconv,slightly_prob,priority,spec_prior -pretentious,critical,complete,2,critical,inconv,slightly_prob,not_recom,not_recom -pretentious,critical,complete,2,critical,inconv,problematic,recommended,spec_prior -pretentious,critical,complete,2,critical,inconv,problematic,priority,spec_prior -pretentious,critical,complete,2,critical,inconv,problematic,not_recom,not_recom -pretentious,critical,complete,3,convenient,convenient,nonprob,recommended,priority -pretentious,critical,complete,3,convenient,convenient,nonprob,priority,spec_prior -pretentious,critical,complete,3,convenient,convenient,nonprob,not_recom,not_recom -pretentious,critical,complete,3,convenient,convenient,slightly_prob,recommended,priority -pretentious,critical,complete,3,convenient,convenient,slightly_prob,priority,spec_prior -pretentious,critical,complete,3,convenient,convenient,slightly_prob,not_recom,not_recom -pretentious,critical,complete,3,convenient,convenient,problematic,recommended,spec_prior 
-pretentious,critical,complete,3,convenient,convenient,problematic,priority,spec_prior -pretentious,critical,complete,3,convenient,convenient,problematic,not_recom,not_recom -pretentious,critical,complete,3,convenient,inconv,nonprob,recommended,spec_prior -pretentious,critical,complete,3,convenient,inconv,nonprob,priority,spec_prior -pretentious,critical,complete,3,convenient,inconv,nonprob,not_recom,not_recom -pretentious,critical,complete,3,convenient,inconv,slightly_prob,recommended,spec_prior -pretentious,critical,complete,3,convenient,inconv,slightly_prob,priority,spec_prior -pretentious,critical,complete,3,convenient,inconv,slightly_prob,not_recom,not_recom -pretentious,critical,complete,3,convenient,inconv,problematic,recommended,spec_prior -pretentious,critical,complete,3,convenient,inconv,problematic,priority,spec_prior -pretentious,critical,complete,3,convenient,inconv,problematic,not_recom,not_recom -pretentious,critical,complete,3,less_conv,convenient,nonprob,recommended,spec_prior -pretentious,critical,complete,3,less_conv,convenient,nonprob,priority,spec_prior -pretentious,critical,complete,3,less_conv,convenient,nonprob,not_recom,not_recom -pretentious,critical,complete,3,less_conv,convenient,slightly_prob,recommended,spec_prior -pretentious,critical,complete,3,less_conv,convenient,slightly_prob,priority,spec_prior -pretentious,critical,complete,3,less_conv,convenient,slightly_prob,not_recom,not_recom -pretentious,critical,complete,3,less_conv,convenient,problematic,recommended,spec_prior -pretentious,critical,complete,3,less_conv,convenient,problematic,priority,spec_prior -pretentious,critical,complete,3,less_conv,convenient,problematic,not_recom,not_recom -pretentious,critical,complete,3,less_conv,inconv,nonprob,recommended,spec_prior -pretentious,critical,complete,3,less_conv,inconv,nonprob,priority,spec_prior -pretentious,critical,complete,3,less_conv,inconv,nonprob,not_recom,not_recom 
-pretentious,critical,complete,3,less_conv,inconv,slightly_prob,recommended,spec_prior -pretentious,critical,complete,3,less_conv,inconv,slightly_prob,priority,spec_prior -pretentious,critical,complete,3,less_conv,inconv,slightly_prob,not_recom,not_recom -pretentious,critical,complete,3,less_conv,inconv,problematic,recommended,spec_prior -pretentious,critical,complete,3,less_conv,inconv,problematic,priority,spec_prior -pretentious,critical,complete,3,less_conv,inconv,problematic,not_recom,not_recom -pretentious,critical,complete,3,critical,convenient,nonprob,recommended,spec_prior -pretentious,critical,complete,3,critical,convenient,nonprob,priority,spec_prior -pretentious,critical,complete,3,critical,convenient,nonprob,not_recom,not_recom -pretentious,critical,complete,3,critical,convenient,slightly_prob,recommended,spec_prior -pretentious,critical,complete,3,critical,convenient,slightly_prob,priority,spec_prior -pretentious,critical,complete,3,critical,convenient,slightly_prob,not_recom,not_recom -pretentious,critical,complete,3,critical,convenient,problematic,recommended,spec_prior -pretentious,critical,complete,3,critical,convenient,problematic,priority,spec_prior -pretentious,critical,complete,3,critical,convenient,problematic,not_recom,not_recom -pretentious,critical,complete,3,critical,inconv,nonprob,recommended,spec_prior -pretentious,critical,complete,3,critical,inconv,nonprob,priority,spec_prior -pretentious,critical,complete,3,critical,inconv,nonprob,not_recom,not_recom -pretentious,critical,complete,3,critical,inconv,slightly_prob,recommended,spec_prior -pretentious,critical,complete,3,critical,inconv,slightly_prob,priority,spec_prior -pretentious,critical,complete,3,critical,inconv,slightly_prob,not_recom,not_recom -pretentious,critical,complete,3,critical,inconv,problematic,recommended,spec_prior -pretentious,critical,complete,3,critical,inconv,problematic,priority,spec_prior 
-pretentious,critical,complete,3,critical,inconv,problematic,not_recom,not_recom -pretentious,critical,complete,more,convenient,convenient,nonprob,recommended,priority -pretentious,critical,complete,more,convenient,convenient,nonprob,priority,spec_prior -pretentious,critical,complete,more,convenient,convenient,nonprob,not_recom,not_recom -pretentious,critical,complete,more,convenient,convenient,slightly_prob,recommended,priority -pretentious,critical,complete,more,convenient,convenient,slightly_prob,priority,spec_prior -pretentious,critical,complete,more,convenient,convenient,slightly_prob,not_recom,not_recom -pretentious,critical,complete,more,convenient,convenient,problematic,recommended,spec_prior -pretentious,critical,complete,more,convenient,convenient,problematic,priority,spec_prior -pretentious,critical,complete,more,convenient,convenient,problematic,not_recom,not_recom -pretentious,critical,complete,more,convenient,inconv,nonprob,recommended,spec_prior -pretentious,critical,complete,more,convenient,inconv,nonprob,priority,spec_prior -pretentious,critical,complete,more,convenient,inconv,nonprob,not_recom,not_recom -pretentious,critical,complete,more,convenient,inconv,slightly_prob,recommended,spec_prior -pretentious,critical,complete,more,convenient,inconv,slightly_prob,priority,spec_prior -pretentious,critical,complete,more,convenient,inconv,slightly_prob,not_recom,not_recom -pretentious,critical,complete,more,convenient,inconv,problematic,recommended,spec_prior -pretentious,critical,complete,more,convenient,inconv,problematic,priority,spec_prior -pretentious,critical,complete,more,convenient,inconv,problematic,not_recom,not_recom -pretentious,critical,complete,more,less_conv,convenient,nonprob,recommended,spec_prior -pretentious,critical,complete,more,less_conv,convenient,nonprob,priority,spec_prior -pretentious,critical,complete,more,less_conv,convenient,nonprob,not_recom,not_recom 
-pretentious,critical,complete,more,less_conv,convenient,slightly_prob,recommended,spec_prior -pretentious,critical,complete,more,less_conv,convenient,slightly_prob,priority,spec_prior -pretentious,critical,complete,more,less_conv,convenient,slightly_prob,not_recom,not_recom -pretentious,critical,complete,more,less_conv,convenient,problematic,recommended,spec_prior -pretentious,critical,complete,more,less_conv,convenient,problematic,priority,spec_prior -pretentious,critical,complete,more,less_conv,convenient,problematic,not_recom,not_recom -pretentious,critical,complete,more,less_conv,inconv,nonprob,recommended,spec_prior -pretentious,critical,complete,more,less_conv,inconv,nonprob,priority,spec_prior -pretentious,critical,complete,more,less_conv,inconv,nonprob,not_recom,not_recom -pretentious,critical,complete,more,less_conv,inconv,slightly_prob,recommended,spec_prior -pretentious,critical,complete,more,less_conv,inconv,slightly_prob,priority,spec_prior -pretentious,critical,complete,more,less_conv,inconv,slightly_prob,not_recom,not_recom -pretentious,critical,complete,more,less_conv,inconv,problematic,recommended,spec_prior -pretentious,critical,complete,more,less_conv,inconv,problematic,priority,spec_prior -pretentious,critical,complete,more,less_conv,inconv,problematic,not_recom,not_recom -pretentious,critical,complete,more,critical,convenient,nonprob,recommended,spec_prior -pretentious,critical,complete,more,critical,convenient,nonprob,priority,spec_prior -pretentious,critical,complete,more,critical,convenient,nonprob,not_recom,not_recom -pretentious,critical,complete,more,critical,convenient,slightly_prob,recommended,spec_prior -pretentious,critical,complete,more,critical,convenient,slightly_prob,priority,spec_prior -pretentious,critical,complete,more,critical,convenient,slightly_prob,not_recom,not_recom -pretentious,critical,complete,more,critical,convenient,problematic,recommended,spec_prior 
-pretentious,critical,complete,more,critical,convenient,problematic,priority,spec_prior -pretentious,critical,complete,more,critical,convenient,problematic,not_recom,not_recom -pretentious,critical,complete,more,critical,inconv,nonprob,recommended,spec_prior -pretentious,critical,complete,more,critical,inconv,nonprob,priority,spec_prior -pretentious,critical,complete,more,critical,inconv,nonprob,not_recom,not_recom -pretentious,critical,complete,more,critical,inconv,slightly_prob,recommended,spec_prior -pretentious,critical,complete,more,critical,inconv,slightly_prob,priority,spec_prior -pretentious,critical,complete,more,critical,inconv,slightly_prob,not_recom,not_recom -pretentious,critical,complete,more,critical,inconv,problematic,recommended,spec_prior -pretentious,critical,complete,more,critical,inconv,problematic,priority,spec_prior -pretentious,critical,complete,more,critical,inconv,problematic,not_recom,not_recom -pretentious,critical,completed,1,convenient,convenient,nonprob,recommended,priority -pretentious,critical,completed,1,convenient,convenient,nonprob,priority,spec_prior -pretentious,critical,completed,1,convenient,convenient,nonprob,not_recom,not_recom -pretentious,critical,completed,1,convenient,convenient,slightly_prob,recommended,priority -pretentious,critical,completed,1,convenient,convenient,slightly_prob,priority,spec_prior -pretentious,critical,completed,1,convenient,convenient,slightly_prob,not_recom,not_recom -pretentious,critical,completed,1,convenient,convenient,problematic,recommended,spec_prior -pretentious,critical,completed,1,convenient,convenient,problematic,priority,spec_prior -pretentious,critical,completed,1,convenient,convenient,problematic,not_recom,not_recom -pretentious,critical,completed,1,convenient,inconv,nonprob,recommended,priority -pretentious,critical,completed,1,convenient,inconv,nonprob,priority,spec_prior -pretentious,critical,completed,1,convenient,inconv,nonprob,not_recom,not_recom 
-pretentious,critical,completed,1,convenient,inconv,slightly_prob,recommended,priority -pretentious,critical,completed,1,convenient,inconv,slightly_prob,priority,spec_prior -pretentious,critical,completed,1,convenient,inconv,slightly_prob,not_recom,not_recom -pretentious,critical,completed,1,convenient,inconv,problematic,recommended,spec_prior -pretentious,critical,completed,1,convenient,inconv,problematic,priority,spec_prior -pretentious,critical,completed,1,convenient,inconv,problematic,not_recom,not_recom -pretentious,critical,completed,1,less_conv,convenient,nonprob,recommended,priority -pretentious,critical,completed,1,less_conv,convenient,nonprob,priority,spec_prior -pretentious,critical,completed,1,less_conv,convenient,nonprob,not_recom,not_recom -pretentious,critical,completed,1,less_conv,convenient,slightly_prob,recommended,priority -pretentious,critical,completed,1,less_conv,convenient,slightly_prob,priority,spec_prior -pretentious,critical,completed,1,less_conv,convenient,slightly_prob,not_recom,not_recom -pretentious,critical,completed,1,less_conv,convenient,problematic,recommended,spec_prior -pretentious,critical,completed,1,less_conv,convenient,problematic,priority,spec_prior -pretentious,critical,completed,1,less_conv,convenient,problematic,not_recom,not_recom -pretentious,critical,completed,1,less_conv,inconv,nonprob,recommended,priority -pretentious,critical,completed,1,less_conv,inconv,nonprob,priority,spec_prior -pretentious,critical,completed,1,less_conv,inconv,nonprob,not_recom,not_recom -pretentious,critical,completed,1,less_conv,inconv,slightly_prob,recommended,priority -pretentious,critical,completed,1,less_conv,inconv,slightly_prob,priority,spec_prior -pretentious,critical,completed,1,less_conv,inconv,slightly_prob,not_recom,not_recom -pretentious,critical,completed,1,less_conv,inconv,problematic,recommended,spec_prior -pretentious,critical,completed,1,less_conv,inconv,problematic,priority,spec_prior 
-pretentious,critical,completed,1,less_conv,inconv,problematic,not_recom,not_recom -pretentious,critical,completed,1,critical,convenient,nonprob,recommended,spec_prior -pretentious,critical,completed,1,critical,convenient,nonprob,priority,spec_prior -pretentious,critical,completed,1,critical,convenient,nonprob,not_recom,not_recom -pretentious,critical,completed,1,critical,convenient,slightly_prob,recommended,spec_prior -pretentious,critical,completed,1,critical,convenient,slightly_prob,priority,spec_prior -pretentious,critical,completed,1,critical,convenient,slightly_prob,not_recom,not_recom -pretentious,critical,completed,1,critical,convenient,problematic,recommended,spec_prior -pretentious,critical,completed,1,critical,convenient,problematic,priority,spec_prior -pretentious,critical,completed,1,critical,convenient,problematic,not_recom,not_recom -pretentious,critical,completed,1,critical,inconv,nonprob,recommended,spec_prior -pretentious,critical,completed,1,critical,inconv,nonprob,priority,spec_prior -pretentious,critical,completed,1,critical,inconv,nonprob,not_recom,not_recom -pretentious,critical,completed,1,critical,inconv,slightly_prob,recommended,spec_prior -pretentious,critical,completed,1,critical,inconv,slightly_prob,priority,spec_prior -pretentious,critical,completed,1,critical,inconv,slightly_prob,not_recom,not_recom -pretentious,critical,completed,1,critical,inconv,problematic,recommended,spec_prior -pretentious,critical,completed,1,critical,inconv,problematic,priority,spec_prior -pretentious,critical,completed,1,critical,inconv,problematic,not_recom,not_recom -pretentious,critical,completed,2,convenient,convenient,nonprob,recommended,priority -pretentious,critical,completed,2,convenient,convenient,nonprob,priority,spec_prior -pretentious,critical,completed,2,convenient,convenient,nonprob,not_recom,not_recom -pretentious,critical,completed,2,convenient,convenient,slightly_prob,recommended,priority 
-pretentious,critical,completed,2,convenient,convenient,slightly_prob,priority,spec_prior -pretentious,critical,completed,2,convenient,convenient,slightly_prob,not_recom,not_recom -pretentious,critical,completed,2,convenient,convenient,problematic,recommended,spec_prior -pretentious,critical,completed,2,convenient,convenient,problematic,priority,spec_prior -pretentious,critical,completed,2,convenient,convenient,problematic,not_recom,not_recom -pretentious,critical,completed,2,convenient,inconv,nonprob,recommended,priority -pretentious,critical,completed,2,convenient,inconv,nonprob,priority,spec_prior -pretentious,critical,completed,2,convenient,inconv,nonprob,not_recom,not_recom -pretentious,critical,completed,2,convenient,inconv,slightly_prob,recommended,priority -pretentious,critical,completed,2,convenient,inconv,slightly_prob,priority,spec_prior -pretentious,critical,completed,2,convenient,inconv,slightly_prob,not_recom,not_recom -pretentious,critical,completed,2,convenient,inconv,problematic,recommended,spec_prior -pretentious,critical,completed,2,convenient,inconv,problematic,priority,spec_prior -pretentious,critical,completed,2,convenient,inconv,problematic,not_recom,not_recom -pretentious,critical,completed,2,less_conv,convenient,nonprob,recommended,priority -pretentious,critical,completed,2,less_conv,convenient,nonprob,priority,spec_prior -pretentious,critical,completed,2,less_conv,convenient,nonprob,not_recom,not_recom -pretentious,critical,completed,2,less_conv,convenient,slightly_prob,recommended,priority -pretentious,critical,completed,2,less_conv,convenient,slightly_prob,priority,spec_prior -pretentious,critical,completed,2,less_conv,convenient,slightly_prob,not_recom,not_recom -pretentious,critical,completed,2,less_conv,convenient,problematic,recommended,spec_prior -pretentious,critical,completed,2,less_conv,convenient,problematic,priority,spec_prior -pretentious,critical,completed,2,less_conv,convenient,problematic,not_recom,not_recom 
-pretentious,critical,completed,2,less_conv,inconv,nonprob,recommended,priority -pretentious,critical,completed,2,less_conv,inconv,nonprob,priority,spec_prior -pretentious,critical,completed,2,less_conv,inconv,nonprob,not_recom,not_recom -pretentious,critical,completed,2,less_conv,inconv,slightly_prob,recommended,priority -pretentious,critical,completed,2,less_conv,inconv,slightly_prob,priority,spec_prior -pretentious,critical,completed,2,less_conv,inconv,slightly_prob,not_recom,not_recom -pretentious,critical,completed,2,less_conv,inconv,problematic,recommended,spec_prior -pretentious,critical,completed,2,less_conv,inconv,problematic,priority,spec_prior -pretentious,critical,completed,2,less_conv,inconv,problematic,not_recom,not_recom -pretentious,critical,completed,2,critical,convenient,nonprob,recommended,spec_prior -pretentious,critical,completed,2,critical,convenient,nonprob,priority,spec_prior -pretentious,critical,completed,2,critical,convenient,nonprob,not_recom,not_recom -pretentious,critical,completed,2,critical,convenient,slightly_prob,recommended,spec_prior -pretentious,critical,completed,2,critical,convenient,slightly_prob,priority,spec_prior -pretentious,critical,completed,2,critical,convenient,slightly_prob,not_recom,not_recom -pretentious,critical,completed,2,critical,convenient,problematic,recommended,spec_prior -pretentious,critical,completed,2,critical,convenient,problematic,priority,spec_prior -pretentious,critical,completed,2,critical,convenient,problematic,not_recom,not_recom -pretentious,critical,completed,2,critical,inconv,nonprob,recommended,spec_prior -pretentious,critical,completed,2,critical,inconv,nonprob,priority,spec_prior -pretentious,critical,completed,2,critical,inconv,nonprob,not_recom,not_recom -pretentious,critical,completed,2,critical,inconv,slightly_prob,recommended,spec_prior -pretentious,critical,completed,2,critical,inconv,slightly_prob,priority,spec_prior 
-pretentious,critical,completed,2,critical,inconv,slightly_prob,not_recom,not_recom -pretentious,critical,completed,2,critical,inconv,problematic,recommended,spec_prior -pretentious,critical,completed,2,critical,inconv,problematic,priority,spec_prior -pretentious,critical,completed,2,critical,inconv,problematic,not_recom,not_recom -pretentious,critical,completed,3,convenient,convenient,nonprob,recommended,priority -pretentious,critical,completed,3,convenient,convenient,nonprob,priority,spec_prior -pretentious,critical,completed,3,convenient,convenient,nonprob,not_recom,not_recom -pretentious,critical,completed,3,convenient,convenient,slightly_prob,recommended,priority -pretentious,critical,completed,3,convenient,convenient,slightly_prob,priority,spec_prior -pretentious,critical,completed,3,convenient,convenient,slightly_prob,not_recom,not_recom -pretentious,critical,completed,3,convenient,convenient,problematic,recommended,spec_prior -pretentious,critical,completed,3,convenient,convenient,problematic,priority,spec_prior -pretentious,critical,completed,3,convenient,convenient,problematic,not_recom,not_recom -pretentious,critical,completed,3,convenient,inconv,nonprob,recommended,spec_prior -pretentious,critical,completed,3,convenient,inconv,nonprob,priority,spec_prior -pretentious,critical,completed,3,convenient,inconv,nonprob,not_recom,not_recom -pretentious,critical,completed,3,convenient,inconv,slightly_prob,recommended,spec_prior -pretentious,critical,completed,3,convenient,inconv,slightly_prob,priority,spec_prior -pretentious,critical,completed,3,convenient,inconv,slightly_prob,not_recom,not_recom -pretentious,critical,completed,3,convenient,inconv,problematic,recommended,spec_prior -pretentious,critical,completed,3,convenient,inconv,problematic,priority,spec_prior -pretentious,critical,completed,3,convenient,inconv,problematic,not_recom,not_recom -pretentious,critical,completed,3,less_conv,convenient,nonprob,recommended,spec_prior 
-pretentious,critical,completed,3,less_conv,convenient,nonprob,priority,spec_prior -pretentious,critical,completed,3,less_conv,convenient,nonprob,not_recom,not_recom -pretentious,critical,completed,3,less_conv,convenient,slightly_prob,recommended,spec_prior -pretentious,critical,completed,3,less_conv,convenient,slightly_prob,priority,spec_prior -pretentious,critical,completed,3,less_conv,convenient,slightly_prob,not_recom,not_recom -pretentious,critical,completed,3,less_conv,convenient,problematic,recommended,spec_prior -pretentious,critical,completed,3,less_conv,convenient,problematic,priority,spec_prior -pretentious,critical,completed,3,less_conv,convenient,problematic,not_recom,not_recom -pretentious,critical,completed,3,less_conv,inconv,nonprob,recommended,spec_prior -pretentious,critical,completed,3,less_conv,inconv,nonprob,priority,spec_prior -pretentious,critical,completed,3,less_conv,inconv,nonprob,not_recom,not_recom -pretentious,critical,completed,3,less_conv,inconv,slightly_prob,recommended,spec_prior -pretentious,critical,completed,3,less_conv,inconv,slightly_prob,priority,spec_prior -pretentious,critical,completed,3,less_conv,inconv,slightly_prob,not_recom,not_recom -pretentious,critical,completed,3,less_conv,inconv,problematic,recommended,spec_prior -pretentious,critical,completed,3,less_conv,inconv,problematic,priority,spec_prior -pretentious,critical,completed,3,less_conv,inconv,problematic,not_recom,not_recom -pretentious,critical,completed,3,critical,convenient,nonprob,recommended,spec_prior -pretentious,critical,completed,3,critical,convenient,nonprob,priority,spec_prior -pretentious,critical,completed,3,critical,convenient,nonprob,not_recom,not_recom -pretentious,critical,completed,3,critical,convenient,slightly_prob,recommended,spec_prior -pretentious,critical,completed,3,critical,convenient,slightly_prob,priority,spec_prior -pretentious,critical,completed,3,critical,convenient,slightly_prob,not_recom,not_recom 
-pretentious,critical,completed,3,critical,convenient,problematic,recommended,spec_prior -pretentious,critical,completed,3,critical,convenient,problematic,priority,spec_prior -pretentious,critical,completed,3,critical,convenient,problematic,not_recom,not_recom -pretentious,critical,completed,3,critical,inconv,nonprob,recommended,spec_prior -pretentious,critical,completed,3,critical,inconv,nonprob,priority,spec_prior -pretentious,critical,completed,3,critical,inconv,nonprob,not_recom,not_recom -pretentious,critical,completed,3,critical,inconv,slightly_prob,recommended,spec_prior -pretentious,critical,completed,3,critical,inconv,slightly_prob,priority,spec_prior -pretentious,critical,completed,3,critical,inconv,slightly_prob,not_recom,not_recom -pretentious,critical,completed,3,critical,inconv,problematic,recommended,spec_prior -pretentious,critical,completed,3,critical,inconv,problematic,priority,spec_prior -pretentious,critical,completed,3,critical,inconv,problematic,not_recom,not_recom -pretentious,critical,completed,more,convenient,convenient,nonprob,recommended,priority -pretentious,critical,completed,more,convenient,convenient,nonprob,priority,spec_prior -pretentious,critical,completed,more,convenient,convenient,nonprob,not_recom,not_recom -pretentious,critical,completed,more,convenient,convenient,slightly_prob,recommended,priority -pretentious,critical,completed,more,convenient,convenient,slightly_prob,priority,spec_prior -pretentious,critical,completed,more,convenient,convenient,slightly_prob,not_recom,not_recom -pretentious,critical,completed,more,convenient,convenient,problematic,recommended,spec_prior -pretentious,critical,completed,more,convenient,convenient,problematic,priority,spec_prior -pretentious,critical,completed,more,convenient,convenient,problematic,not_recom,not_recom -pretentious,critical,completed,more,convenient,inconv,nonprob,recommended,spec_prior -pretentious,critical,completed,more,convenient,inconv,nonprob,priority,spec_prior 
-pretentious,critical,completed,more,convenient,inconv,nonprob,not_recom,not_recom -pretentious,critical,completed,more,convenient,inconv,slightly_prob,recommended,spec_prior -pretentious,critical,completed,more,convenient,inconv,slightly_prob,priority,spec_prior -pretentious,critical,completed,more,convenient,inconv,slightly_prob,not_recom,not_recom -pretentious,critical,completed,more,convenient,inconv,problematic,recommended,spec_prior -pretentious,critical,completed,more,convenient,inconv,problematic,priority,spec_prior -pretentious,critical,completed,more,convenient,inconv,problematic,not_recom,not_recom -pretentious,critical,completed,more,less_conv,convenient,nonprob,recommended,spec_prior -pretentious,critical,completed,more,less_conv,convenient,nonprob,priority,spec_prior -pretentious,critical,completed,more,less_conv,convenient,nonprob,not_recom,not_recom -pretentious,critical,completed,more,less_conv,convenient,slightly_prob,recommended,spec_prior -pretentious,critical,completed,more,less_conv,convenient,slightly_prob,priority,spec_prior -pretentious,critical,completed,more,less_conv,convenient,slightly_prob,not_recom,not_recom -pretentious,critical,completed,more,less_conv,convenient,problematic,recommended,spec_prior -pretentious,critical,completed,more,less_conv,convenient,problematic,priority,spec_prior -pretentious,critical,completed,more,less_conv,convenient,problematic,not_recom,not_recom -pretentious,critical,completed,more,less_conv,inconv,nonprob,recommended,spec_prior -pretentious,critical,completed,more,less_conv,inconv,nonprob,priority,spec_prior -pretentious,critical,completed,more,less_conv,inconv,nonprob,not_recom,not_recom -pretentious,critical,completed,more,less_conv,inconv,slightly_prob,recommended,spec_prior -pretentious,critical,completed,more,less_conv,inconv,slightly_prob,priority,spec_prior -pretentious,critical,completed,more,less_conv,inconv,slightly_prob,not_recom,not_recom 
-pretentious,critical,completed,more,less_conv,inconv,problematic,recommended,spec_prior -pretentious,critical,completed,more,less_conv,inconv,problematic,priority,spec_prior -pretentious,critical,completed,more,less_conv,inconv,problematic,not_recom,not_recom -pretentious,critical,completed,more,critical,convenient,nonprob,recommended,spec_prior -pretentious,critical,completed,more,critical,convenient,nonprob,priority,spec_prior -pretentious,critical,completed,more,critical,convenient,nonprob,not_recom,not_recom -pretentious,critical,completed,more,critical,convenient,slightly_prob,recommended,spec_prior -pretentious,critical,completed,more,critical,convenient,slightly_prob,priority,spec_prior -pretentious,critical,completed,more,critical,convenient,slightly_prob,not_recom,not_recom -pretentious,critical,completed,more,critical,convenient,problematic,recommended,spec_prior -pretentious,critical,completed,more,critical,convenient,problematic,priority,spec_prior -pretentious,critical,completed,more,critical,convenient,problematic,not_recom,not_recom -pretentious,critical,completed,more,critical,inconv,nonprob,recommended,spec_prior -pretentious,critical,completed,more,critical,inconv,nonprob,priority,spec_prior -pretentious,critical,completed,more,critical,inconv,nonprob,not_recom,not_recom -pretentious,critical,completed,more,critical,inconv,slightly_prob,recommended,spec_prior -pretentious,critical,completed,more,critical,inconv,slightly_prob,priority,spec_prior -pretentious,critical,completed,more,critical,inconv,slightly_prob,not_recom,not_recom -pretentious,critical,completed,more,critical,inconv,problematic,recommended,spec_prior -pretentious,critical,completed,more,critical,inconv,problematic,priority,spec_prior -pretentious,critical,completed,more,critical,inconv,problematic,not_recom,not_recom -pretentious,critical,incomplete,1,convenient,convenient,nonprob,recommended,priority 
-pretentious,critical,incomplete,1,convenient,convenient,nonprob,priority,spec_prior -pretentious,critical,incomplete,1,convenient,convenient,nonprob,not_recom,not_recom -pretentious,critical,incomplete,1,convenient,convenient,slightly_prob,recommended,priority -pretentious,critical,incomplete,1,convenient,convenient,slightly_prob,priority,spec_prior -pretentious,critical,incomplete,1,convenient,convenient,slightly_prob,not_recom,not_recom -pretentious,critical,incomplete,1,convenient,convenient,problematic,recommended,spec_prior -pretentious,critical,incomplete,1,convenient,convenient,problematic,priority,spec_prior -pretentious,critical,incomplete,1,convenient,convenient,problematic,not_recom,not_recom -pretentious,critical,incomplete,1,convenient,inconv,nonprob,recommended,priority -pretentious,critical,incomplete,1,convenient,inconv,nonprob,priority,spec_prior -pretentious,critical,incomplete,1,convenient,inconv,nonprob,not_recom,not_recom -pretentious,critical,incomplete,1,convenient,inconv,slightly_prob,recommended,priority -pretentious,critical,incomplete,1,convenient,inconv,slightly_prob,priority,spec_prior -pretentious,critical,incomplete,1,convenient,inconv,slightly_prob,not_recom,not_recom -pretentious,critical,incomplete,1,convenient,inconv,problematic,recommended,spec_prior -pretentious,critical,incomplete,1,convenient,inconv,problematic,priority,spec_prior -pretentious,critical,incomplete,1,convenient,inconv,problematic,not_recom,not_recom -pretentious,critical,incomplete,1,less_conv,convenient,nonprob,recommended,priority -pretentious,critical,incomplete,1,less_conv,convenient,nonprob,priority,spec_prior -pretentious,critical,incomplete,1,less_conv,convenient,nonprob,not_recom,not_recom -pretentious,critical,incomplete,1,less_conv,convenient,slightly_prob,recommended,priority -pretentious,critical,incomplete,1,less_conv,convenient,slightly_prob,priority,spec_prior 
-pretentious,critical,incomplete,1,less_conv,convenient,slightly_prob,not_recom,not_recom -pretentious,critical,incomplete,1,less_conv,convenient,problematic,recommended,spec_prior -pretentious,critical,incomplete,1,less_conv,convenient,problematic,priority,spec_prior -pretentious,critical,incomplete,1,less_conv,convenient,problematic,not_recom,not_recom -pretentious,critical,incomplete,1,less_conv,inconv,nonprob,recommended,priority -pretentious,critical,incomplete,1,less_conv,inconv,nonprob,priority,spec_prior -pretentious,critical,incomplete,1,less_conv,inconv,nonprob,not_recom,not_recom -pretentious,critical,incomplete,1,less_conv,inconv,slightly_prob,recommended,priority -pretentious,critical,incomplete,1,less_conv,inconv,slightly_prob,priority,spec_prior -pretentious,critical,incomplete,1,less_conv,inconv,slightly_prob,not_recom,not_recom -pretentious,critical,incomplete,1,less_conv,inconv,problematic,recommended,spec_prior -pretentious,critical,incomplete,1,less_conv,inconv,problematic,priority,spec_prior -pretentious,critical,incomplete,1,less_conv,inconv,problematic,not_recom,not_recom -pretentious,critical,incomplete,1,critical,convenient,nonprob,recommended,spec_prior -pretentious,critical,incomplete,1,critical,convenient,nonprob,priority,spec_prior -pretentious,critical,incomplete,1,critical,convenient,nonprob,not_recom,not_recom -pretentious,critical,incomplete,1,critical,convenient,slightly_prob,recommended,spec_prior -pretentious,critical,incomplete,1,critical,convenient,slightly_prob,priority,spec_prior -pretentious,critical,incomplete,1,critical,convenient,slightly_prob,not_recom,not_recom -pretentious,critical,incomplete,1,critical,convenient,problematic,recommended,spec_prior -pretentious,critical,incomplete,1,critical,convenient,problematic,priority,spec_prior -pretentious,critical,incomplete,1,critical,convenient,problematic,not_recom,not_recom -pretentious,critical,incomplete,1,critical,inconv,nonprob,recommended,spec_prior 
-pretentious,critical,incomplete,1,critical,inconv,nonprob,priority,spec_prior -pretentious,critical,incomplete,1,critical,inconv,nonprob,not_recom,not_recom -pretentious,critical,incomplete,1,critical,inconv,slightly_prob,recommended,spec_prior -pretentious,critical,incomplete,1,critical,inconv,slightly_prob,priority,spec_prior -pretentious,critical,incomplete,1,critical,inconv,slightly_prob,not_recom,not_recom -pretentious,critical,incomplete,1,critical,inconv,problematic,recommended,spec_prior -pretentious,critical,incomplete,1,critical,inconv,problematic,priority,spec_prior -pretentious,critical,incomplete,1,critical,inconv,problematic,not_recom,not_recom -pretentious,critical,incomplete,2,convenient,convenient,nonprob,recommended,priority -pretentious,critical,incomplete,2,convenient,convenient,nonprob,priority,spec_prior -pretentious,critical,incomplete,2,convenient,convenient,nonprob,not_recom,not_recom -pretentious,critical,incomplete,2,convenient,convenient,slightly_prob,recommended,priority -pretentious,critical,incomplete,2,convenient,convenient,slightly_prob,priority,spec_prior -pretentious,critical,incomplete,2,convenient,convenient,slightly_prob,not_recom,not_recom -pretentious,critical,incomplete,2,convenient,convenient,problematic,recommended,spec_prior -pretentious,critical,incomplete,2,convenient,convenient,problematic,priority,spec_prior -pretentious,critical,incomplete,2,convenient,convenient,problematic,not_recom,not_recom -pretentious,critical,incomplete,2,convenient,inconv,nonprob,recommended,spec_prior -pretentious,critical,incomplete,2,convenient,inconv,nonprob,priority,spec_prior -pretentious,critical,incomplete,2,convenient,inconv,nonprob,not_recom,not_recom -pretentious,critical,incomplete,2,convenient,inconv,slightly_prob,recommended,spec_prior -pretentious,critical,incomplete,2,convenient,inconv,slightly_prob,priority,spec_prior -pretentious,critical,incomplete,2,convenient,inconv,slightly_prob,not_recom,not_recom 
-pretentious,critical,incomplete,2,convenient,inconv,problematic,recommended,spec_prior -pretentious,critical,incomplete,2,convenient,inconv,problematic,priority,spec_prior -pretentious,critical,incomplete,2,convenient,inconv,problematic,not_recom,not_recom -pretentious,critical,incomplete,2,less_conv,convenient,nonprob,recommended,spec_prior -pretentious,critical,incomplete,2,less_conv,convenient,nonprob,priority,spec_prior -pretentious,critical,incomplete,2,less_conv,convenient,nonprob,not_recom,not_recom -pretentious,critical,incomplete,2,less_conv,convenient,slightly_prob,recommended,spec_prior -pretentious,critical,incomplete,2,less_conv,convenient,slightly_prob,priority,spec_prior -pretentious,critical,incomplete,2,less_conv,convenient,slightly_prob,not_recom,not_recom -pretentious,critical,incomplete,2,less_conv,convenient,problematic,recommended,spec_prior -pretentious,critical,incomplete,2,less_conv,convenient,problematic,priority,spec_prior -pretentious,critical,incomplete,2,less_conv,convenient,problematic,not_recom,not_recom -pretentious,critical,incomplete,2,less_conv,inconv,nonprob,recommended,spec_prior -pretentious,critical,incomplete,2,less_conv,inconv,nonprob,priority,spec_prior -pretentious,critical,incomplete,2,less_conv,inconv,nonprob,not_recom,not_recom -pretentious,critical,incomplete,2,less_conv,inconv,slightly_prob,recommended,spec_prior -pretentious,critical,incomplete,2,less_conv,inconv,slightly_prob,priority,spec_prior -pretentious,critical,incomplete,2,less_conv,inconv,slightly_prob,not_recom,not_recom -pretentious,critical,incomplete,2,less_conv,inconv,problematic,recommended,spec_prior -pretentious,critical,incomplete,2,less_conv,inconv,problematic,priority,spec_prior -pretentious,critical,incomplete,2,less_conv,inconv,problematic,not_recom,not_recom -pretentious,critical,incomplete,2,critical,convenient,nonprob,recommended,spec_prior -pretentious,critical,incomplete,2,critical,convenient,nonprob,priority,spec_prior 
-pretentious,critical,incomplete,2,critical,convenient,nonprob,not_recom,not_recom -pretentious,critical,incomplete,2,critical,convenient,slightly_prob,recommended,spec_prior -pretentious,critical,incomplete,2,critical,convenient,slightly_prob,priority,spec_prior -pretentious,critical,incomplete,2,critical,convenient,slightly_prob,not_recom,not_recom -pretentious,critical,incomplete,2,critical,convenient,problematic,recommended,spec_prior -pretentious,critical,incomplete,2,critical,convenient,problematic,priority,spec_prior -pretentious,critical,incomplete,2,critical,convenient,problematic,not_recom,not_recom -pretentious,critical,incomplete,2,critical,inconv,nonprob,recommended,spec_prior -pretentious,critical,incomplete,2,critical,inconv,nonprob,priority,spec_prior -pretentious,critical,incomplete,2,critical,inconv,nonprob,not_recom,not_recom -pretentious,critical,incomplete,2,critical,inconv,slightly_prob,recommended,spec_prior -pretentious,critical,incomplete,2,critical,inconv,slightly_prob,priority,spec_prior -pretentious,critical,incomplete,2,critical,inconv,slightly_prob,not_recom,not_recom -pretentious,critical,incomplete,2,critical,inconv,problematic,recommended,spec_prior -pretentious,critical,incomplete,2,critical,inconv,problematic,priority,spec_prior -pretentious,critical,incomplete,2,critical,inconv,problematic,not_recom,not_recom -pretentious,critical,incomplete,3,convenient,convenient,nonprob,recommended,priority -pretentious,critical,incomplete,3,convenient,convenient,nonprob,priority,spec_prior -pretentious,critical,incomplete,3,convenient,convenient,nonprob,not_recom,not_recom -pretentious,critical,incomplete,3,convenient,convenient,slightly_prob,recommended,priority -pretentious,critical,incomplete,3,convenient,convenient,slightly_prob,priority,spec_prior -pretentious,critical,incomplete,3,convenient,convenient,slightly_prob,not_recom,not_recom -pretentious,critical,incomplete,3,convenient,convenient,problematic,recommended,spec_prior 
-pretentious,critical,incomplete,3,convenient,convenient,problematic,priority,spec_prior -pretentious,critical,incomplete,3,convenient,convenient,problematic,not_recom,not_recom -pretentious,critical,incomplete,3,convenient,inconv,nonprob,recommended,spec_prior -pretentious,critical,incomplete,3,convenient,inconv,nonprob,priority,spec_prior -pretentious,critical,incomplete,3,convenient,inconv,nonprob,not_recom,not_recom -pretentious,critical,incomplete,3,convenient,inconv,slightly_prob,recommended,spec_prior -pretentious,critical,incomplete,3,convenient,inconv,slightly_prob,priority,spec_prior -pretentious,critical,incomplete,3,convenient,inconv,slightly_prob,not_recom,not_recom -pretentious,critical,incomplete,3,convenient,inconv,problematic,recommended,spec_prior -pretentious,critical,incomplete,3,convenient,inconv,problematic,priority,spec_prior -pretentious,critical,incomplete,3,convenient,inconv,problematic,not_recom,not_recom -pretentious,critical,incomplete,3,less_conv,convenient,nonprob,recommended,spec_prior -pretentious,critical,incomplete,3,less_conv,convenient,nonprob,priority,spec_prior -pretentious,critical,incomplete,3,less_conv,convenient,nonprob,not_recom,not_recom -pretentious,critical,incomplete,3,less_conv,convenient,slightly_prob,recommended,spec_prior -pretentious,critical,incomplete,3,less_conv,convenient,slightly_prob,priority,spec_prior -pretentious,critical,incomplete,3,less_conv,convenient,slightly_prob,not_recom,not_recom -pretentious,critical,incomplete,3,less_conv,convenient,problematic,recommended,spec_prior -pretentious,critical,incomplete,3,less_conv,convenient,problematic,priority,spec_prior -pretentious,critical,incomplete,3,less_conv,convenient,problematic,not_recom,not_recom -pretentious,critical,incomplete,3,less_conv,inconv,nonprob,recommended,spec_prior -pretentious,critical,incomplete,3,less_conv,inconv,nonprob,priority,spec_prior -pretentious,critical,incomplete,3,less_conv,inconv,nonprob,not_recom,not_recom 
-pretentious,critical,incomplete,3,less_conv,inconv,slightly_prob,recommended,spec_prior -pretentious,critical,incomplete,3,less_conv,inconv,slightly_prob,priority,spec_prior -pretentious,critical,incomplete,3,less_conv,inconv,slightly_prob,not_recom,not_recom -pretentious,critical,incomplete,3,less_conv,inconv,problematic,recommended,spec_prior -pretentious,critical,incomplete,3,less_conv,inconv,problematic,priority,spec_prior -pretentious,critical,incomplete,3,less_conv,inconv,problematic,not_recom,not_recom -pretentious,critical,incomplete,3,critical,convenient,nonprob,recommended,spec_prior -pretentious,critical,incomplete,3,critical,convenient,nonprob,priority,spec_prior -pretentious,critical,incomplete,3,critical,convenient,nonprob,not_recom,not_recom -pretentious,critical,incomplete,3,critical,convenient,slightly_prob,recommended,spec_prior -pretentious,critical,incomplete,3,critical,convenient,slightly_prob,priority,spec_prior -pretentious,critical,incomplete,3,critical,convenient,slightly_prob,not_recom,not_recom -pretentious,critical,incomplete,3,critical,convenient,problematic,recommended,spec_prior -pretentious,critical,incomplete,3,critical,convenient,problematic,priority,spec_prior -pretentious,critical,incomplete,3,critical,convenient,problematic,not_recom,not_recom -pretentious,critical,incomplete,3,critical,inconv,nonprob,recommended,spec_prior -pretentious,critical,incomplete,3,critical,inconv,nonprob,priority,spec_prior -pretentious,critical,incomplete,3,critical,inconv,nonprob,not_recom,not_recom -pretentious,critical,incomplete,3,critical,inconv,slightly_prob,recommended,spec_prior -pretentious,critical,incomplete,3,critical,inconv,slightly_prob,priority,spec_prior -pretentious,critical,incomplete,3,critical,inconv,slightly_prob,not_recom,not_recom -pretentious,critical,incomplete,3,critical,inconv,problematic,recommended,spec_prior -pretentious,critical,incomplete,3,critical,inconv,problematic,priority,spec_prior 
-pretentious,critical,incomplete,3,critical,inconv,problematic,not_recom,not_recom -pretentious,critical,incomplete,more,convenient,convenient,nonprob,recommended,priority -pretentious,critical,incomplete,more,convenient,convenient,nonprob,priority,spec_prior -pretentious,critical,incomplete,more,convenient,convenient,nonprob,not_recom,not_recom -pretentious,critical,incomplete,more,convenient,convenient,slightly_prob,recommended,priority -pretentious,critical,incomplete,more,convenient,convenient,slightly_prob,priority,spec_prior -pretentious,critical,incomplete,more,convenient,convenient,slightly_prob,not_recom,not_recom -pretentious,critical,incomplete,more,convenient,convenient,problematic,recommended,spec_prior -pretentious,critical,incomplete,more,convenient,convenient,problematic,priority,spec_prior -pretentious,critical,incomplete,more,convenient,convenient,problematic,not_recom,not_recom -pretentious,critical,incomplete,more,convenient,inconv,nonprob,recommended,spec_prior -pretentious,critical,incomplete,more,convenient,inconv,nonprob,priority,spec_prior -pretentious,critical,incomplete,more,convenient,inconv,nonprob,not_recom,not_recom -pretentious,critical,incomplete,more,convenient,inconv,slightly_prob,recommended,spec_prior -pretentious,critical,incomplete,more,convenient,inconv,slightly_prob,priority,spec_prior -pretentious,critical,incomplete,more,convenient,inconv,slightly_prob,not_recom,not_recom -pretentious,critical,incomplete,more,convenient,inconv,problematic,recommended,spec_prior -pretentious,critical,incomplete,more,convenient,inconv,problematic,priority,spec_prior -pretentious,critical,incomplete,more,convenient,inconv,problematic,not_recom,not_recom -pretentious,critical,incomplete,more,less_conv,convenient,nonprob,recommended,spec_prior -pretentious,critical,incomplete,more,less_conv,convenient,nonprob,priority,spec_prior -pretentious,critical,incomplete,more,less_conv,convenient,nonprob,not_recom,not_recom 
-pretentious,critical,incomplete,more,less_conv,convenient,slightly_prob,recommended,spec_prior -pretentious,critical,incomplete,more,less_conv,convenient,slightly_prob,priority,spec_prior -pretentious,critical,incomplete,more,less_conv,convenient,slightly_prob,not_recom,not_recom -pretentious,critical,incomplete,more,less_conv,convenient,problematic,recommended,spec_prior -pretentious,critical,incomplete,more,less_conv,convenient,problematic,priority,spec_prior -pretentious,critical,incomplete,more,less_conv,convenient,problematic,not_recom,not_recom -pretentious,critical,incomplete,more,less_conv,inconv,nonprob,recommended,spec_prior -pretentious,critical,incomplete,more,less_conv,inconv,nonprob,priority,spec_prior -pretentious,critical,incomplete,more,less_conv,inconv,nonprob,not_recom,not_recom -pretentious,critical,incomplete,more,less_conv,inconv,slightly_prob,recommended,spec_prior -pretentious,critical,incomplete,more,less_conv,inconv,slightly_prob,priority,spec_prior -pretentious,critical,incomplete,more,less_conv,inconv,slightly_prob,not_recom,not_recom -pretentious,critical,incomplete,more,less_conv,inconv,problematic,recommended,spec_prior -pretentious,critical,incomplete,more,less_conv,inconv,problematic,priority,spec_prior -pretentious,critical,incomplete,more,less_conv,inconv,problematic,not_recom,not_recom -pretentious,critical,incomplete,more,critical,convenient,nonprob,recommended,spec_prior -pretentious,critical,incomplete,more,critical,convenient,nonprob,priority,spec_prior -pretentious,critical,incomplete,more,critical,convenient,nonprob,not_recom,not_recom -pretentious,critical,incomplete,more,critical,convenient,slightly_prob,recommended,spec_prior -pretentious,critical,incomplete,more,critical,convenient,slightly_prob,priority,spec_prior -pretentious,critical,incomplete,more,critical,convenient,slightly_prob,not_recom,not_recom -pretentious,critical,incomplete,more,critical,convenient,problematic,recommended,spec_prior 
-pretentious,critical,incomplete,more,critical,convenient,problematic,priority,spec_prior -pretentious,critical,incomplete,more,critical,convenient,problematic,not_recom,not_recom -pretentious,critical,incomplete,more,critical,inconv,nonprob,recommended,spec_prior -pretentious,critical,incomplete,more,critical,inconv,nonprob,priority,spec_prior -pretentious,critical,incomplete,more,critical,inconv,nonprob,not_recom,not_recom -pretentious,critical,incomplete,more,critical,inconv,slightly_prob,recommended,spec_prior -pretentious,critical,incomplete,more,critical,inconv,slightly_prob,priority,spec_prior -pretentious,critical,incomplete,more,critical,inconv,slightly_prob,not_recom,not_recom -pretentious,critical,incomplete,more,critical,inconv,problematic,recommended,spec_prior -pretentious,critical,incomplete,more,critical,inconv,problematic,priority,spec_prior -pretentious,critical,incomplete,more,critical,inconv,problematic,not_recom,not_recom -pretentious,critical,foster,1,convenient,convenient,nonprob,recommended,priority -pretentious,critical,foster,1,convenient,convenient,nonprob,priority,spec_prior -pretentious,critical,foster,1,convenient,convenient,nonprob,not_recom,not_recom -pretentious,critical,foster,1,convenient,convenient,slightly_prob,recommended,priority -pretentious,critical,foster,1,convenient,convenient,slightly_prob,priority,spec_prior -pretentious,critical,foster,1,convenient,convenient,slightly_prob,not_recom,not_recom -pretentious,critical,foster,1,convenient,convenient,problematic,recommended,spec_prior -pretentious,critical,foster,1,convenient,convenient,problematic,priority,spec_prior -pretentious,critical,foster,1,convenient,convenient,problematic,not_recom,not_recom -pretentious,critical,foster,1,convenient,inconv,nonprob,recommended,spec_prior -pretentious,critical,foster,1,convenient,inconv,nonprob,priority,spec_prior -pretentious,critical,foster,1,convenient,inconv,nonprob,not_recom,not_recom 
-pretentious,critical,foster,1,convenient,inconv,slightly_prob,recommended,spec_prior -pretentious,critical,foster,1,convenient,inconv,slightly_prob,priority,spec_prior -pretentious,critical,foster,1,convenient,inconv,slightly_prob,not_recom,not_recom -pretentious,critical,foster,1,convenient,inconv,problematic,recommended,spec_prior -pretentious,critical,foster,1,convenient,inconv,problematic,priority,spec_prior -pretentious,critical,foster,1,convenient,inconv,problematic,not_recom,not_recom -pretentious,critical,foster,1,less_conv,convenient,nonprob,recommended,spec_prior -pretentious,critical,foster,1,less_conv,convenient,nonprob,priority,spec_prior -pretentious,critical,foster,1,less_conv,convenient,nonprob,not_recom,not_recom -pretentious,critical,foster,1,less_conv,convenient,slightly_prob,recommended,spec_prior -pretentious,critical,foster,1,less_conv,convenient,slightly_prob,priority,spec_prior -pretentious,critical,foster,1,less_conv,convenient,slightly_prob,not_recom,not_recom -pretentious,critical,foster,1,less_conv,convenient,problematic,recommended,spec_prior -pretentious,critical,foster,1,less_conv,convenient,problematic,priority,spec_prior -pretentious,critical,foster,1,less_conv,convenient,problematic,not_recom,not_recom -pretentious,critical,foster,1,less_conv,inconv,nonprob,recommended,spec_prior -pretentious,critical,foster,1,less_conv,inconv,nonprob,priority,spec_prior -pretentious,critical,foster,1,less_conv,inconv,nonprob,not_recom,not_recom -pretentious,critical,foster,1,less_conv,inconv,slightly_prob,recommended,spec_prior -pretentious,critical,foster,1,less_conv,inconv,slightly_prob,priority,spec_prior -pretentious,critical,foster,1,less_conv,inconv,slightly_prob,not_recom,not_recom -pretentious,critical,foster,1,less_conv,inconv,problematic,recommended,spec_prior -pretentious,critical,foster,1,less_conv,inconv,problematic,priority,spec_prior -pretentious,critical,foster,1,less_conv,inconv,problematic,not_recom,not_recom 
-pretentious,critical,foster,1,critical,convenient,nonprob,recommended,spec_prior -pretentious,critical,foster,1,critical,convenient,nonprob,priority,spec_prior -pretentious,critical,foster,1,critical,convenient,nonprob,not_recom,not_recom -pretentious,critical,foster,1,critical,convenient,slightly_prob,recommended,spec_prior -pretentious,critical,foster,1,critical,convenient,slightly_prob,priority,spec_prior -pretentious,critical,foster,1,critical,convenient,slightly_prob,not_recom,not_recom -pretentious,critical,foster,1,critical,convenient,problematic,recommended,spec_prior -pretentious,critical,foster,1,critical,convenient,problematic,priority,spec_prior -pretentious,critical,foster,1,critical,convenient,problematic,not_recom,not_recom -pretentious,critical,foster,1,critical,inconv,nonprob,recommended,spec_prior -pretentious,critical,foster,1,critical,inconv,nonprob,priority,spec_prior -pretentious,critical,foster,1,critical,inconv,nonprob,not_recom,not_recom -pretentious,critical,foster,1,critical,inconv,slightly_prob,recommended,spec_prior -pretentious,critical,foster,1,critical,inconv,slightly_prob,priority,spec_prior -pretentious,critical,foster,1,critical,inconv,slightly_prob,not_recom,not_recom -pretentious,critical,foster,1,critical,inconv,problematic,recommended,spec_prior -pretentious,critical,foster,1,critical,inconv,problematic,priority,spec_prior -pretentious,critical,foster,1,critical,inconv,problematic,not_recom,not_recom -pretentious,critical,foster,2,convenient,convenient,nonprob,recommended,priority -pretentious,critical,foster,2,convenient,convenient,nonprob,priority,spec_prior -pretentious,critical,foster,2,convenient,convenient,nonprob,not_recom,not_recom -pretentious,critical,foster,2,convenient,convenient,slightly_prob,recommended,priority -pretentious,critical,foster,2,convenient,convenient,slightly_prob,priority,spec_prior -pretentious,critical,foster,2,convenient,convenient,slightly_prob,not_recom,not_recom 
-pretentious,critical,foster,2,convenient,convenient,problematic,recommended,spec_prior -pretentious,critical,foster,2,convenient,convenient,problematic,priority,spec_prior -pretentious,critical,foster,2,convenient,convenient,problematic,not_recom,not_recom -pretentious,critical,foster,2,convenient,inconv,nonprob,recommended,spec_prior -pretentious,critical,foster,2,convenient,inconv,nonprob,priority,spec_prior -pretentious,critical,foster,2,convenient,inconv,nonprob,not_recom,not_recom -pretentious,critical,foster,2,convenient,inconv,slightly_prob,recommended,spec_prior -pretentious,critical,foster,2,convenient,inconv,slightly_prob,priority,spec_prior -pretentious,critical,foster,2,convenient,inconv,slightly_prob,not_recom,not_recom -pretentious,critical,foster,2,convenient,inconv,problematic,recommended,spec_prior -pretentious,critical,foster,2,convenient,inconv,problematic,priority,spec_prior -pretentious,critical,foster,2,convenient,inconv,problematic,not_recom,not_recom -pretentious,critical,foster,2,less_conv,convenient,nonprob,recommended,spec_prior -pretentious,critical,foster,2,less_conv,convenient,nonprob,priority,spec_prior -pretentious,critical,foster,2,less_conv,convenient,nonprob,not_recom,not_recom -pretentious,critical,foster,2,less_conv,convenient,slightly_prob,recommended,spec_prior -pretentious,critical,foster,2,less_conv,convenient,slightly_prob,priority,spec_prior -pretentious,critical,foster,2,less_conv,convenient,slightly_prob,not_recom,not_recom -pretentious,critical,foster,2,less_conv,convenient,problematic,recommended,spec_prior -pretentious,critical,foster,2,less_conv,convenient,problematic,priority,spec_prior -pretentious,critical,foster,2,less_conv,convenient,problematic,not_recom,not_recom -pretentious,critical,foster,2,less_conv,inconv,nonprob,recommended,spec_prior -pretentious,critical,foster,2,less_conv,inconv,nonprob,priority,spec_prior -pretentious,critical,foster,2,less_conv,inconv,nonprob,not_recom,not_recom 
-pretentious,critical,foster,2,less_conv,inconv,slightly_prob,recommended,spec_prior -pretentious,critical,foster,2,less_conv,inconv,slightly_prob,priority,spec_prior -pretentious,critical,foster,2,less_conv,inconv,slightly_prob,not_recom,not_recom -pretentious,critical,foster,2,less_conv,inconv,problematic,recommended,spec_prior -pretentious,critical,foster,2,less_conv,inconv,problematic,priority,spec_prior -pretentious,critical,foster,2,less_conv,inconv,problematic,not_recom,not_recom -pretentious,critical,foster,2,critical,convenient,nonprob,recommended,spec_prior -pretentious,critical,foster,2,critical,convenient,nonprob,priority,spec_prior -pretentious,critical,foster,2,critical,convenient,nonprob,not_recom,not_recom -pretentious,critical,foster,2,critical,convenient,slightly_prob,recommended,spec_prior -pretentious,critical,foster,2,critical,convenient,slightly_prob,priority,spec_prior -pretentious,critical,foster,2,critical,convenient,slightly_prob,not_recom,not_recom -pretentious,critical,foster,2,critical,convenient,problematic,recommended,spec_prior -pretentious,critical,foster,2,critical,convenient,problematic,priority,spec_prior -pretentious,critical,foster,2,critical,convenient,problematic,not_recom,not_recom -pretentious,critical,foster,2,critical,inconv,nonprob,recommended,spec_prior -pretentious,critical,foster,2,critical,inconv,nonprob,priority,spec_prior -pretentious,critical,foster,2,critical,inconv,nonprob,not_recom,not_recom -pretentious,critical,foster,2,critical,inconv,slightly_prob,recommended,spec_prior -pretentious,critical,foster,2,critical,inconv,slightly_prob,priority,spec_prior -pretentious,critical,foster,2,critical,inconv,slightly_prob,not_recom,not_recom -pretentious,critical,foster,2,critical,inconv,problematic,recommended,spec_prior -pretentious,critical,foster,2,critical,inconv,problematic,priority,spec_prior -pretentious,critical,foster,2,critical,inconv,problematic,not_recom,not_recom 
-pretentious,critical,foster,3,convenient,convenient,nonprob,recommended,priority -pretentious,critical,foster,3,convenient,convenient,nonprob,priority,spec_prior -pretentious,critical,foster,3,convenient,convenient,nonprob,not_recom,not_recom -pretentious,critical,foster,3,convenient,convenient,slightly_prob,recommended,priority -pretentious,critical,foster,3,convenient,convenient,slightly_prob,priority,spec_prior -pretentious,critical,foster,3,convenient,convenient,slightly_prob,not_recom,not_recom -pretentious,critical,foster,3,convenient,convenient,problematic,recommended,spec_prior -pretentious,critical,foster,3,convenient,convenient,problematic,priority,spec_prior -pretentious,critical,foster,3,convenient,convenient,problematic,not_recom,not_recom -pretentious,critical,foster,3,convenient,inconv,nonprob,recommended,spec_prior -pretentious,critical,foster,3,convenient,inconv,nonprob,priority,spec_prior -pretentious,critical,foster,3,convenient,inconv,nonprob,not_recom,not_recom -pretentious,critical,foster,3,convenient,inconv,slightly_prob,recommended,spec_prior -pretentious,critical,foster,3,convenient,inconv,slightly_prob,priority,spec_prior -pretentious,critical,foster,3,convenient,inconv,slightly_prob,not_recom,not_recom -pretentious,critical,foster,3,convenient,inconv,problematic,recommended,spec_prior -pretentious,critical,foster,3,convenient,inconv,problematic,priority,spec_prior -pretentious,critical,foster,3,convenient,inconv,problematic,not_recom,not_recom -pretentious,critical,foster,3,less_conv,convenient,nonprob,recommended,spec_prior -pretentious,critical,foster,3,less_conv,convenient,nonprob,priority,spec_prior -pretentious,critical,foster,3,less_conv,convenient,nonprob,not_recom,not_recom -pretentious,critical,foster,3,less_conv,convenient,slightly_prob,recommended,spec_prior -pretentious,critical,foster,3,less_conv,convenient,slightly_prob,priority,spec_prior -pretentious,critical,foster,3,less_conv,convenient,slightly_prob,not_recom,not_recom 
-pretentious,critical,foster,3,less_conv,convenient,problematic,recommended,spec_prior -pretentious,critical,foster,3,less_conv,convenient,problematic,priority,spec_prior -pretentious,critical,foster,3,less_conv,convenient,problematic,not_recom,not_recom -pretentious,critical,foster,3,less_conv,inconv,nonprob,recommended,spec_prior -pretentious,critical,foster,3,less_conv,inconv,nonprob,priority,spec_prior -pretentious,critical,foster,3,less_conv,inconv,nonprob,not_recom,not_recom -pretentious,critical,foster,3,less_conv,inconv,slightly_prob,recommended,spec_prior -pretentious,critical,foster,3,less_conv,inconv,slightly_prob,priority,spec_prior -pretentious,critical,foster,3,less_conv,inconv,slightly_prob,not_recom,not_recom -pretentious,critical,foster,3,less_conv,inconv,problematic,recommended,spec_prior -pretentious,critical,foster,3,less_conv,inconv,problematic,priority,spec_prior -pretentious,critical,foster,3,less_conv,inconv,problematic,not_recom,not_recom -pretentious,critical,foster,3,critical,convenient,nonprob,recommended,spec_prior -pretentious,critical,foster,3,critical,convenient,nonprob,priority,spec_prior -pretentious,critical,foster,3,critical,convenient,nonprob,not_recom,not_recom -pretentious,critical,foster,3,critical,convenient,slightly_prob,recommended,spec_prior -pretentious,critical,foster,3,critical,convenient,slightly_prob,priority,spec_prior -pretentious,critical,foster,3,critical,convenient,slightly_prob,not_recom,not_recom -pretentious,critical,foster,3,critical,convenient,problematic,recommended,spec_prior -pretentious,critical,foster,3,critical,convenient,problematic,priority,spec_prior -pretentious,critical,foster,3,critical,convenient,problematic,not_recom,not_recom -pretentious,critical,foster,3,critical,inconv,nonprob,recommended,spec_prior -pretentious,critical,foster,3,critical,inconv,nonprob,priority,spec_prior -pretentious,critical,foster,3,critical,inconv,nonprob,not_recom,not_recom 
-pretentious,critical,foster,3,critical,inconv,slightly_prob,recommended,spec_prior -pretentious,critical,foster,3,critical,inconv,slightly_prob,priority,spec_prior -pretentious,critical,foster,3,critical,inconv,slightly_prob,not_recom,not_recom -pretentious,critical,foster,3,critical,inconv,problematic,recommended,spec_prior -pretentious,critical,foster,3,critical,inconv,problematic,priority,spec_prior -pretentious,critical,foster,3,critical,inconv,problematic,not_recom,not_recom -pretentious,critical,foster,more,convenient,convenient,nonprob,recommended,priority -pretentious,critical,foster,more,convenient,convenient,nonprob,priority,spec_prior -pretentious,critical,foster,more,convenient,convenient,nonprob,not_recom,not_recom -pretentious,critical,foster,more,convenient,convenient,slightly_prob,recommended,priority -pretentious,critical,foster,more,convenient,convenient,slightly_prob,priority,spec_prior -pretentious,critical,foster,more,convenient,convenient,slightly_prob,not_recom,not_recom -pretentious,critical,foster,more,convenient,convenient,problematic,recommended,spec_prior -pretentious,critical,foster,more,convenient,convenient,problematic,priority,spec_prior -pretentious,critical,foster,more,convenient,convenient,problematic,not_recom,not_recom -pretentious,critical,foster,more,convenient,inconv,nonprob,recommended,spec_prior -pretentious,critical,foster,more,convenient,inconv,nonprob,priority,spec_prior -pretentious,critical,foster,more,convenient,inconv,nonprob,not_recom,not_recom -pretentious,critical,foster,more,convenient,inconv,slightly_prob,recommended,spec_prior -pretentious,critical,foster,more,convenient,inconv,slightly_prob,priority,spec_prior -pretentious,critical,foster,more,convenient,inconv,slightly_prob,not_recom,not_recom -pretentious,critical,foster,more,convenient,inconv,problematic,recommended,spec_prior -pretentious,critical,foster,more,convenient,inconv,problematic,priority,spec_prior 
-pretentious,critical,foster,more,convenient,inconv,problematic,not_recom,not_recom -pretentious,critical,foster,more,less_conv,convenient,nonprob,recommended,spec_prior -pretentious,critical,foster,more,less_conv,convenient,nonprob,priority,spec_prior -pretentious,critical,foster,more,less_conv,convenient,nonprob,not_recom,not_recom -pretentious,critical,foster,more,less_conv,convenient,slightly_prob,recommended,spec_prior -pretentious,critical,foster,more,less_conv,convenient,slightly_prob,priority,spec_prior -pretentious,critical,foster,more,less_conv,convenient,slightly_prob,not_recom,not_recom -pretentious,critical,foster,more,less_conv,convenient,problematic,recommended,spec_prior -pretentious,critical,foster,more,less_conv,convenient,problematic,priority,spec_prior -pretentious,critical,foster,more,less_conv,convenient,problematic,not_recom,not_recom -pretentious,critical,foster,more,less_conv,inconv,nonprob,recommended,spec_prior -pretentious,critical,foster,more,less_conv,inconv,nonprob,priority,spec_prior -pretentious,critical,foster,more,less_conv,inconv,nonprob,not_recom,not_recom -pretentious,critical,foster,more,less_conv,inconv,slightly_prob,recommended,spec_prior -pretentious,critical,foster,more,less_conv,inconv,slightly_prob,priority,spec_prior -pretentious,critical,foster,more,less_conv,inconv,slightly_prob,not_recom,not_recom -pretentious,critical,foster,more,less_conv,inconv,problematic,recommended,spec_prior -pretentious,critical,foster,more,less_conv,inconv,problematic,priority,spec_prior -pretentious,critical,foster,more,less_conv,inconv,problematic,not_recom,not_recom -pretentious,critical,foster,more,critical,convenient,nonprob,recommended,spec_prior -pretentious,critical,foster,more,critical,convenient,nonprob,priority,spec_prior -pretentious,critical,foster,more,critical,convenient,nonprob,not_recom,not_recom -pretentious,critical,foster,more,critical,convenient,slightly_prob,recommended,spec_prior 
-pretentious,critical,foster,more,critical,convenient,slightly_prob,priority,spec_prior -pretentious,critical,foster,more,critical,convenient,slightly_prob,not_recom,not_recom -pretentious,critical,foster,more,critical,convenient,problematic,recommended,spec_prior -pretentious,critical,foster,more,critical,convenient,problematic,priority,spec_prior -pretentious,critical,foster,more,critical,convenient,problematic,not_recom,not_recom -pretentious,critical,foster,more,critical,inconv,nonprob,recommended,spec_prior -pretentious,critical,foster,more,critical,inconv,nonprob,priority,spec_prior -pretentious,critical,foster,more,critical,inconv,nonprob,not_recom,not_recom -pretentious,critical,foster,more,critical,inconv,slightly_prob,recommended,spec_prior -pretentious,critical,foster,more,critical,inconv,slightly_prob,priority,spec_prior -pretentious,critical,foster,more,critical,inconv,slightly_prob,not_recom,not_recom -pretentious,critical,foster,more,critical,inconv,problematic,recommended,spec_prior -pretentious,critical,foster,more,critical,inconv,problematic,priority,spec_prior -pretentious,critical,foster,more,critical,inconv,problematic,not_recom,not_recom -pretentious,very_crit,complete,1,convenient,convenient,nonprob,recommended,priority -pretentious,very_crit,complete,1,convenient,convenient,nonprob,priority,priority -pretentious,very_crit,complete,1,convenient,convenient,nonprob,not_recom,not_recom -pretentious,very_crit,complete,1,convenient,convenient,slightly_prob,recommended,priority -pretentious,very_crit,complete,1,convenient,convenient,slightly_prob,priority,priority -pretentious,very_crit,complete,1,convenient,convenient,slightly_prob,not_recom,not_recom -pretentious,very_crit,complete,1,convenient,convenient,problematic,recommended,priority -pretentious,very_crit,complete,1,convenient,convenient,problematic,priority,priority -pretentious,very_crit,complete,1,convenient,convenient,problematic,not_recom,not_recom 
-pretentious,very_crit,complete,1,convenient,inconv,nonprob,recommended,priority -pretentious,very_crit,complete,1,convenient,inconv,nonprob,priority,spec_prior -pretentious,very_crit,complete,1,convenient,inconv,nonprob,not_recom,not_recom -pretentious,very_crit,complete,1,convenient,inconv,slightly_prob,recommended,priority -pretentious,very_crit,complete,1,convenient,inconv,slightly_prob,priority,spec_prior -pretentious,very_crit,complete,1,convenient,inconv,slightly_prob,not_recom,not_recom -pretentious,very_crit,complete,1,convenient,inconv,problematic,recommended,spec_prior -pretentious,very_crit,complete,1,convenient,inconv,problematic,priority,spec_prior -pretentious,very_crit,complete,1,convenient,inconv,problematic,not_recom,not_recom -pretentious,very_crit,complete,1,less_conv,convenient,nonprob,recommended,priority -pretentious,very_crit,complete,1,less_conv,convenient,nonprob,priority,spec_prior -pretentious,very_crit,complete,1,less_conv,convenient,nonprob,not_recom,not_recom -pretentious,very_crit,complete,1,less_conv,convenient,slightly_prob,recommended,priority -pretentious,very_crit,complete,1,less_conv,convenient,slightly_prob,priority,spec_prior -pretentious,very_crit,complete,1,less_conv,convenient,slightly_prob,not_recom,not_recom -pretentious,very_crit,complete,1,less_conv,convenient,problematic,recommended,spec_prior -pretentious,very_crit,complete,1,less_conv,convenient,problematic,priority,spec_prior -pretentious,very_crit,complete,1,less_conv,convenient,problematic,not_recom,not_recom -pretentious,very_crit,complete,1,less_conv,inconv,nonprob,recommended,priority -pretentious,very_crit,complete,1,less_conv,inconv,nonprob,priority,spec_prior -pretentious,very_crit,complete,1,less_conv,inconv,nonprob,not_recom,not_recom -pretentious,very_crit,complete,1,less_conv,inconv,slightly_prob,recommended,priority -pretentious,very_crit,complete,1,less_conv,inconv,slightly_prob,priority,spec_prior 
-pretentious,very_crit,complete,1,less_conv,inconv,slightly_prob,not_recom,not_recom -pretentious,very_crit,complete,1,less_conv,inconv,problematic,recommended,spec_prior -pretentious,very_crit,complete,1,less_conv,inconv,problematic,priority,spec_prior -pretentious,very_crit,complete,1,less_conv,inconv,problematic,not_recom,not_recom -pretentious,very_crit,complete,1,critical,convenient,nonprob,recommended,priority -pretentious,very_crit,complete,1,critical,convenient,nonprob,priority,spec_prior -pretentious,very_crit,complete,1,critical,convenient,nonprob,not_recom,not_recom -pretentious,very_crit,complete,1,critical,convenient,slightly_prob,recommended,priority -pretentious,very_crit,complete,1,critical,convenient,slightly_prob,priority,spec_prior -pretentious,very_crit,complete,1,critical,convenient,slightly_prob,not_recom,not_recom -pretentious,very_crit,complete,1,critical,convenient,problematic,recommended,spec_prior -pretentious,very_crit,complete,1,critical,convenient,problematic,priority,spec_prior -pretentious,very_crit,complete,1,critical,convenient,problematic,not_recom,not_recom -pretentious,very_crit,complete,1,critical,inconv,nonprob,recommended,priority -pretentious,very_crit,complete,1,critical,inconv,nonprob,priority,spec_prior -pretentious,very_crit,complete,1,critical,inconv,nonprob,not_recom,not_recom -pretentious,very_crit,complete,1,critical,inconv,slightly_prob,recommended,priority -pretentious,very_crit,complete,1,critical,inconv,slightly_prob,priority,spec_prior -pretentious,very_crit,complete,1,critical,inconv,slightly_prob,not_recom,not_recom -pretentious,very_crit,complete,1,critical,inconv,problematic,recommended,spec_prior -pretentious,very_crit,complete,1,critical,inconv,problematic,priority,spec_prior -pretentious,very_crit,complete,1,critical,inconv,problematic,not_recom,not_recom -pretentious,very_crit,complete,2,convenient,convenient,nonprob,recommended,priority 
-pretentious,very_crit,complete,2,convenient,convenient,nonprob,priority,spec_prior -pretentious,very_crit,complete,2,convenient,convenient,nonprob,not_recom,not_recom -pretentious,very_crit,complete,2,convenient,convenient,slightly_prob,recommended,priority -pretentious,very_crit,complete,2,convenient,convenient,slightly_prob,priority,spec_prior -pretentious,very_crit,complete,2,convenient,convenient,slightly_prob,not_recom,not_recom -pretentious,very_crit,complete,2,convenient,convenient,problematic,recommended,spec_prior -pretentious,very_crit,complete,2,convenient,convenient,problematic,priority,spec_prior -pretentious,very_crit,complete,2,convenient,convenient,problematic,not_recom,not_recom -pretentious,very_crit,complete,2,convenient,inconv,nonprob,recommended,priority -pretentious,very_crit,complete,2,convenient,inconv,nonprob,priority,spec_prior -pretentious,very_crit,complete,2,convenient,inconv,nonprob,not_recom,not_recom -pretentious,very_crit,complete,2,convenient,inconv,slightly_prob,recommended,priority -pretentious,very_crit,complete,2,convenient,inconv,slightly_prob,priority,spec_prior -pretentious,very_crit,complete,2,convenient,inconv,slightly_prob,not_recom,not_recom -pretentious,very_crit,complete,2,convenient,inconv,problematic,recommended,spec_prior -pretentious,very_crit,complete,2,convenient,inconv,problematic,priority,spec_prior -pretentious,very_crit,complete,2,convenient,inconv,problematic,not_recom,not_recom -pretentious,very_crit,complete,2,less_conv,convenient,nonprob,recommended,priority -pretentious,very_crit,complete,2,less_conv,convenient,nonprob,priority,spec_prior -pretentious,very_crit,complete,2,less_conv,convenient,nonprob,not_recom,not_recom -pretentious,very_crit,complete,2,less_conv,convenient,slightly_prob,recommended,priority -pretentious,very_crit,complete,2,less_conv,convenient,slightly_prob,priority,spec_prior -pretentious,very_crit,complete,2,less_conv,convenient,slightly_prob,not_recom,not_recom 
-pretentious,very_crit,complete,2,less_conv,convenient,problematic,recommended,spec_prior -pretentious,very_crit,complete,2,less_conv,convenient,problematic,priority,spec_prior -pretentious,very_crit,complete,2,less_conv,convenient,problematic,not_recom,not_recom -pretentious,very_crit,complete,2,less_conv,inconv,nonprob,recommended,priority -pretentious,very_crit,complete,2,less_conv,inconv,nonprob,priority,spec_prior -pretentious,very_crit,complete,2,less_conv,inconv,nonprob,not_recom,not_recom -pretentious,very_crit,complete,2,less_conv,inconv,slightly_prob,recommended,priority -pretentious,very_crit,complete,2,less_conv,inconv,slightly_prob,priority,spec_prior -pretentious,very_crit,complete,2,less_conv,inconv,slightly_prob,not_recom,not_recom -pretentious,very_crit,complete,2,less_conv,inconv,problematic,recommended,spec_prior -pretentious,very_crit,complete,2,less_conv,inconv,problematic,priority,spec_prior -pretentious,very_crit,complete,2,less_conv,inconv,problematic,not_recom,not_recom -pretentious,very_crit,complete,2,critical,convenient,nonprob,recommended,spec_prior -pretentious,very_crit,complete,2,critical,convenient,nonprob,priority,spec_prior -pretentious,very_crit,complete,2,critical,convenient,nonprob,not_recom,not_recom -pretentious,very_crit,complete,2,critical,convenient,slightly_prob,recommended,spec_prior -pretentious,very_crit,complete,2,critical,convenient,slightly_prob,priority,spec_prior -pretentious,very_crit,complete,2,critical,convenient,slightly_prob,not_recom,not_recom -pretentious,very_crit,complete,2,critical,convenient,problematic,recommended,spec_prior -pretentious,very_crit,complete,2,critical,convenient,problematic,priority,spec_prior -pretentious,very_crit,complete,2,critical,convenient,problematic,not_recom,not_recom -pretentious,very_crit,complete,2,critical,inconv,nonprob,recommended,spec_prior -pretentious,very_crit,complete,2,critical,inconv,nonprob,priority,spec_prior 
-pretentious,very_crit,complete,2,critical,inconv,nonprob,not_recom,not_recom
-pretentious,very_crit,complete,2,critical,inconv,slightly_prob,recommended,spec_prior
-pretentious,very_crit,complete,2,critical,inconv,slightly_prob,priority,spec_prior
-pretentious,very_crit,complete,2,critical,inconv,slightly_prob,not_recom,not_recom
-pretentious,very_crit,complete,2,critical,inconv,problematic,recommended,spec_prior
-pretentious,very_crit,complete,2,critical,inconv,problematic,priority,spec_prior
-pretentious,very_crit,complete,2,critical,inconv,problematic,not_recom,not_recom
-pretentious,very_crit,complete,3,convenient,convenient,nonprob,recommended,priority
-pretentious,very_crit,complete,3,convenient,convenient,nonprob,priority,spec_prior
-pretentious,very_crit,complete,3,convenient,convenient,nonprob,not_recom,not_recom
-pretentious,very_crit,complete,3,convenient,convenient,slightly_prob,recommended,priority
-pretentious,very_crit,complete,3,convenient,convenient,slightly_prob,priority,spec_prior
-pretentious,very_crit,complete,3,convenient,convenient,slightly_prob,not_recom,not_recom
-pretentious,very_crit,complete,3,convenient,convenient,problematic,recommended,spec_prior
-pretentious,very_crit,complete,3,convenient,convenient,problematic,priority,spec_prior
-pretentious,very_crit,complete,3,convenient,convenient,problematic,not_recom,not_recom
-pretentious,very_crit,complete,3,convenient,inconv,nonprob,recommended,spec_prior
-pretentious,very_crit,complete,3,convenient,inconv,nonprob,priority,spec_prior
-pretentious,very_crit,complete,3,convenient,inconv,nonprob,not_recom,not_recom
-pretentious,very_crit,complete,3,convenient,inconv,slightly_prob,recommended,spec_prior
-pretentious,very_crit,complete,3,convenient,inconv,slightly_prob,priority,spec_prior
-pretentious,very_crit,complete,3,convenient,inconv,slightly_prob,not_recom,not_recom
-pretentious,very_crit,complete,3,convenient,inconv,problematic,recommended,spec_prior
-pretentious,very_crit,complete,3,convenient,inconv,problematic,priority,spec_prior
-pretentious,very_crit,complete,3,convenient,inconv,problematic,not_recom,not_recom
-pretentious,very_crit,complete,3,less_conv,convenient,nonprob,recommended,spec_prior
-pretentious,very_crit,complete,3,less_conv,convenient,nonprob,priority,spec_prior
-pretentious,very_crit,complete,3,less_conv,convenient,nonprob,not_recom,not_recom
-pretentious,very_crit,complete,3,less_conv,convenient,slightly_prob,recommended,spec_prior
-pretentious,very_crit,complete,3,less_conv,convenient,slightly_prob,priority,spec_prior
-pretentious,very_crit,complete,3,less_conv,convenient,slightly_prob,not_recom,not_recom
-pretentious,very_crit,complete,3,less_conv,convenient,problematic,recommended,spec_prior
-pretentious,very_crit,complete,3,less_conv,convenient,problematic,priority,spec_prior
-pretentious,very_crit,complete,3,less_conv,convenient,problematic,not_recom,not_recom
-pretentious,very_crit,complete,3,less_conv,inconv,nonprob,recommended,spec_prior
-pretentious,very_crit,complete,3,less_conv,inconv,nonprob,priority,spec_prior
-pretentious,very_crit,complete,3,less_conv,inconv,nonprob,not_recom,not_recom
-pretentious,very_crit,complete,3,less_conv,inconv,slightly_prob,recommended,spec_prior
-pretentious,very_crit,complete,3,less_conv,inconv,slightly_prob,priority,spec_prior
-pretentious,very_crit,complete,3,less_conv,inconv,slightly_prob,not_recom,not_recom
-pretentious,very_crit,complete,3,less_conv,inconv,problematic,recommended,spec_prior
-pretentious,very_crit,complete,3,less_conv,inconv,problematic,priority,spec_prior
-pretentious,very_crit,complete,3,less_conv,inconv,problematic,not_recom,not_recom
-pretentious,very_crit,complete,3,critical,convenient,nonprob,recommended,spec_prior
-pretentious,very_crit,complete,3,critical,convenient,nonprob,priority,spec_prior
-pretentious,very_crit,complete,3,critical,convenient,nonprob,not_recom,not_recom
-pretentious,very_crit,complete,3,critical,convenient,slightly_prob,recommended,spec_prior
-pretentious,very_crit,complete,3,critical,convenient,slightly_prob,priority,spec_prior
-pretentious,very_crit,complete,3,critical,convenient,slightly_prob,not_recom,not_recom
-pretentious,very_crit,complete,3,critical,convenient,problematic,recommended,spec_prior
-pretentious,very_crit,complete,3,critical,convenient,problematic,priority,spec_prior
-pretentious,very_crit,complete,3,critical,convenient,problematic,not_recom,not_recom
-pretentious,very_crit,complete,3,critical,inconv,nonprob,recommended,spec_prior
-pretentious,very_crit,complete,3,critical,inconv,nonprob,priority,spec_prior
-pretentious,very_crit,complete,3,critical,inconv,nonprob,not_recom,not_recom
-pretentious,very_crit,complete,3,critical,inconv,slightly_prob,recommended,spec_prior
-pretentious,very_crit,complete,3,critical,inconv,slightly_prob,priority,spec_prior
-pretentious,very_crit,complete,3,critical,inconv,slightly_prob,not_recom,not_recom
-pretentious,very_crit,complete,3,critical,inconv,problematic,recommended,spec_prior
-pretentious,very_crit,complete,3,critical,inconv,problematic,priority,spec_prior
-pretentious,very_crit,complete,3,critical,inconv,problematic,not_recom,not_recom
-pretentious,very_crit,complete,more,convenient,convenient,nonprob,recommended,priority
-pretentious,very_crit,complete,more,convenient,convenient,nonprob,priority,spec_prior
-pretentious,very_crit,complete,more,convenient,convenient,nonprob,not_recom,not_recom
-pretentious,very_crit,complete,more,convenient,convenient,slightly_prob,recommended,priority
-pretentious,very_crit,complete,more,convenient,convenient,slightly_prob,priority,spec_prior
-pretentious,very_crit,complete,more,convenient,convenient,slightly_prob,not_recom,not_recom
-pretentious,very_crit,complete,more,convenient,convenient,problematic,recommended,spec_prior
-pretentious,very_crit,complete,more,convenient,convenient,problematic,priority,spec_prior
-pretentious,very_crit,complete,more,convenient,convenient,problematic,not_recom,not_recom
-pretentious,very_crit,complete,more,convenient,inconv,nonprob,recommended,spec_prior
-pretentious,very_crit,complete,more,convenient,inconv,nonprob,priority,spec_prior
-pretentious,very_crit,complete,more,convenient,inconv,nonprob,not_recom,not_recom
-pretentious,very_crit,complete,more,convenient,inconv,slightly_prob,recommended,spec_prior
-pretentious,very_crit,complete,more,convenient,inconv,slightly_prob,priority,spec_prior
-pretentious,very_crit,complete,more,convenient,inconv,slightly_prob,not_recom,not_recom
-pretentious,very_crit,complete,more,convenient,inconv,problematic,recommended,spec_prior
-pretentious,very_crit,complete,more,convenient,inconv,problematic,priority,spec_prior
-pretentious,very_crit,complete,more,convenient,inconv,problematic,not_recom,not_recom
-pretentious,very_crit,complete,more,less_conv,convenient,nonprob,recommended,spec_prior
-pretentious,very_crit,complete,more,less_conv,convenient,nonprob,priority,spec_prior
-pretentious,very_crit,complete,more,less_conv,convenient,nonprob,not_recom,not_recom
-pretentious,very_crit,complete,more,less_conv,convenient,slightly_prob,recommended,spec_prior
-pretentious,very_crit,complete,more,less_conv,convenient,slightly_prob,priority,spec_prior
-pretentious,very_crit,complete,more,less_conv,convenient,slightly_prob,not_recom,not_recom
-pretentious,very_crit,complete,more,less_conv,convenient,problematic,recommended,spec_prior
-pretentious,very_crit,complete,more,less_conv,convenient,problematic,priority,spec_prior
-pretentious,very_crit,complete,more,less_conv,convenient,problematic,not_recom,not_recom
-pretentious,very_crit,complete,more,less_conv,inconv,nonprob,recommended,spec_prior
-pretentious,very_crit,complete,more,less_conv,inconv,nonprob,priority,spec_prior
-pretentious,very_crit,complete,more,less_conv,inconv,nonprob,not_recom,not_recom
-pretentious,very_crit,complete,more,less_conv,inconv,slightly_prob,recommended,spec_prior
-pretentious,very_crit,complete,more,less_conv,inconv,slightly_prob,priority,spec_prior
-pretentious,very_crit,complete,more,less_conv,inconv,slightly_prob,not_recom,not_recom
-pretentious,very_crit,complete,more,less_conv,inconv,problematic,recommended,spec_prior
-pretentious,very_crit,complete,more,less_conv,inconv,problematic,priority,spec_prior
-pretentious,very_crit,complete,more,less_conv,inconv,problematic,not_recom,not_recom
-pretentious,very_crit,complete,more,critical,convenient,nonprob,recommended,spec_prior
-pretentious,very_crit,complete,more,critical,convenient,nonprob,priority,spec_prior
-pretentious,very_crit,complete,more,critical,convenient,nonprob,not_recom,not_recom
-pretentious,very_crit,complete,more,critical,convenient,slightly_prob,recommended,spec_prior
-pretentious,very_crit,complete,more,critical,convenient,slightly_prob,priority,spec_prior
-pretentious,very_crit,complete,more,critical,convenient,slightly_prob,not_recom,not_recom
-pretentious,very_crit,complete,more,critical,convenient,problematic,recommended,spec_prior
-pretentious,very_crit,complete,more,critical,convenient,problematic,priority,spec_prior
-pretentious,very_crit,complete,more,critical,convenient,problematic,not_recom,not_recom
-pretentious,very_crit,complete,more,critical,inconv,nonprob,recommended,spec_prior
-pretentious,very_crit,complete,more,critical,inconv,nonprob,priority,spec_prior
-pretentious,very_crit,complete,more,critical,inconv,nonprob,not_recom,not_recom
-pretentious,very_crit,complete,more,critical,inconv,slightly_prob,recommended,spec_prior
-pretentious,very_crit,complete,more,critical,inconv,slightly_prob,priority,spec_prior
-pretentious,very_crit,complete,more,critical,inconv,slightly_prob,not_recom,not_recom
-pretentious,very_crit,complete,more,critical,inconv,problematic,recommended,spec_prior
-pretentious,very_crit,complete,more,critical,inconv,problematic,priority,spec_prior
-pretentious,very_crit,complete,more,critical,inconv,problematic,not_recom,not_recom
-pretentious,very_crit,completed,1,convenient,convenient,nonprob,recommended,priority
-pretentious,very_crit,completed,1,convenient,convenient,nonprob,priority,spec_prior
-pretentious,very_crit,completed,1,convenient,convenient,nonprob,not_recom,not_recom
-pretentious,very_crit,completed,1,convenient,convenient,slightly_prob,recommended,priority
-pretentious,very_crit,completed,1,convenient,convenient,slightly_prob,priority,spec_prior
-pretentious,very_crit,completed,1,convenient,convenient,slightly_prob,not_recom,not_recom
-pretentious,very_crit,completed,1,convenient,convenient,problematic,recommended,spec_prior
-pretentious,very_crit,completed,1,convenient,convenient,problematic,priority,spec_prior
-pretentious,very_crit,completed,1,convenient,convenient,problematic,not_recom,not_recom
-pretentious,very_crit,completed,1,convenient,inconv,nonprob,recommended,priority
-pretentious,very_crit,completed,1,convenient,inconv,nonprob,priority,spec_prior
-pretentious,very_crit,completed,1,convenient,inconv,nonprob,not_recom,not_recom
-pretentious,very_crit,completed,1,convenient,inconv,slightly_prob,recommended,priority
-pretentious,very_crit,completed,1,convenient,inconv,slightly_prob,priority,spec_prior
-pretentious,very_crit,completed,1,convenient,inconv,slightly_prob,not_recom,not_recom
-pretentious,very_crit,completed,1,convenient,inconv,problematic,recommended,spec_prior
-pretentious,very_crit,completed,1,convenient,inconv,problematic,priority,spec_prior
-pretentious,very_crit,completed,1,convenient,inconv,problematic,not_recom,not_recom
-pretentious,very_crit,completed,1,less_conv,convenient,nonprob,recommended,priority
-pretentious,very_crit,completed,1,less_conv,convenient,nonprob,priority,spec_prior
-pretentious,very_crit,completed,1,less_conv,convenient,nonprob,not_recom,not_recom
-pretentious,very_crit,completed,1,less_conv,convenient,slightly_prob,recommended,priority
-pretentious,very_crit,completed,1,less_conv,convenient,slightly_prob,priority,spec_prior
-pretentious,very_crit,completed,1,less_conv,convenient,slightly_prob,not_recom,not_recom
-pretentious,very_crit,completed,1,less_conv,convenient,problematic,recommended,spec_prior
-pretentious,very_crit,completed,1,less_conv,convenient,problematic,priority,spec_prior
-pretentious,very_crit,completed,1,less_conv,convenient,problematic,not_recom,not_recom
-pretentious,very_crit,completed,1,less_conv,inconv,nonprob,recommended,priority
-pretentious,very_crit,completed,1,less_conv,inconv,nonprob,priority,spec_prior
-pretentious,very_crit,completed,1,less_conv,inconv,nonprob,not_recom,not_recom
-pretentious,very_crit,completed,1,less_conv,inconv,slightly_prob,recommended,priority
-pretentious,very_crit,completed,1,less_conv,inconv,slightly_prob,priority,spec_prior
-pretentious,very_crit,completed,1,less_conv,inconv,slightly_prob,not_recom,not_recom
-pretentious,very_crit,completed,1,less_conv,inconv,problematic,recommended,spec_prior
-pretentious,very_crit,completed,1,less_conv,inconv,problematic,priority,spec_prior
-pretentious,very_crit,completed,1,less_conv,inconv,problematic,not_recom,not_recom
-pretentious,very_crit,completed,1,critical,convenient,nonprob,recommended,spec_prior
-pretentious,very_crit,completed,1,critical,convenient,nonprob,priority,spec_prior
-pretentious,very_crit,completed,1,critical,convenient,nonprob,not_recom,not_recom
-pretentious,very_crit,completed,1,critical,convenient,slightly_prob,recommended,spec_prior
-pretentious,very_crit,completed,1,critical,convenient,slightly_prob,priority,spec_prior
-pretentious,very_crit,completed,1,critical,convenient,slightly_prob,not_recom,not_recom
-pretentious,very_crit,completed,1,critical,convenient,problematic,recommended,spec_prior
-pretentious,very_crit,completed,1,critical,convenient,problematic,priority,spec_prior
-pretentious,very_crit,completed,1,critical,convenient,problematic,not_recom,not_recom
-pretentious,very_crit,completed,1,critical,inconv,nonprob,recommended,spec_prior
-pretentious,very_crit,completed,1,critical,inconv,nonprob,priority,spec_prior
-pretentious,very_crit,completed,1,critical,inconv,nonprob,not_recom,not_recom
-pretentious,very_crit,completed,1,critical,inconv,slightly_prob,recommended,spec_prior
-pretentious,very_crit,completed,1,critical,inconv,slightly_prob,priority,spec_prior
-pretentious,very_crit,completed,1,critical,inconv,slightly_prob,not_recom,not_recom
-pretentious,very_crit,completed,1,critical,inconv,problematic,recommended,spec_prior
-pretentious,very_crit,completed,1,critical,inconv,problematic,priority,spec_prior
-pretentious,very_crit,completed,1,critical,inconv,problematic,not_recom,not_recom
-pretentious,very_crit,completed,2,convenient,convenient,nonprob,recommended,priority
-pretentious,very_crit,completed,2,convenient,convenient,nonprob,priority,spec_prior
-pretentious,very_crit,completed,2,convenient,convenient,nonprob,not_recom,not_recom
-pretentious,very_crit,completed,2,convenient,convenient,slightly_prob,recommended,priority
-pretentious,very_crit,completed,2,convenient,convenient,slightly_prob,priority,spec_prior
-pretentious,very_crit,completed,2,convenient,convenient,slightly_prob,not_recom,not_recom
-pretentious,very_crit,completed,2,convenient,convenient,problematic,recommended,spec_prior
-pretentious,very_crit,completed,2,convenient,convenient,problematic,priority,spec_prior
-pretentious,very_crit,completed,2,convenient,convenient,problematic,not_recom,not_recom
-pretentious,very_crit,completed,2,convenient,inconv,nonprob,recommended,priority
-pretentious,very_crit,completed,2,convenient,inconv,nonprob,priority,spec_prior
-pretentious,very_crit,completed,2,convenient,inconv,nonprob,not_recom,not_recom
-pretentious,very_crit,completed,2,convenient,inconv,slightly_prob,recommended,priority
-pretentious,very_crit,completed,2,convenient,inconv,slightly_prob,priority,spec_prior
-pretentious,very_crit,completed,2,convenient,inconv,slightly_prob,not_recom,not_recom
-pretentious,very_crit,completed,2,convenient,inconv,problematic,recommended,spec_prior
-pretentious,very_crit,completed,2,convenient,inconv,problematic,priority,spec_prior
-pretentious,very_crit,completed,2,convenient,inconv,problematic,not_recom,not_recom
-pretentious,very_crit,completed,2,less_conv,convenient,nonprob,recommended,priority
-pretentious,very_crit,completed,2,less_conv,convenient,nonprob,priority,spec_prior
-pretentious,very_crit,completed,2,less_conv,convenient,nonprob,not_recom,not_recom
-pretentious,very_crit,completed,2,less_conv,convenient,slightly_prob,recommended,priority
-pretentious,very_crit,completed,2,less_conv,convenient,slightly_prob,priority,spec_prior
-pretentious,very_crit,completed,2,less_conv,convenient,slightly_prob,not_recom,not_recom
-pretentious,very_crit,completed,2,less_conv,convenient,problematic,recommended,spec_prior
-pretentious,very_crit,completed,2,less_conv,convenient,problematic,priority,spec_prior
-pretentious,very_crit,completed,2,less_conv,convenient,problematic,not_recom,not_recom
-pretentious,very_crit,completed,2,less_conv,inconv,nonprob,recommended,priority
-pretentious,very_crit,completed,2,less_conv,inconv,nonprob,priority,spec_prior
-pretentious,very_crit,completed,2,less_conv,inconv,nonprob,not_recom,not_recom
-pretentious,very_crit,completed,2,less_conv,inconv,slightly_prob,recommended,priority
-pretentious,very_crit,completed,2,less_conv,inconv,slightly_prob,priority,spec_prior
-pretentious,very_crit,completed,2,less_conv,inconv,slightly_prob,not_recom,not_recom
-pretentious,very_crit,completed,2,less_conv,inconv,problematic,recommended,spec_prior
-pretentious,very_crit,completed,2,less_conv,inconv,problematic,priority,spec_prior
-pretentious,very_crit,completed,2,less_conv,inconv,problematic,not_recom,not_recom
-pretentious,very_crit,completed,2,critical,convenient,nonprob,recommended,spec_prior
-pretentious,very_crit,completed,2,critical,convenient,nonprob,priority,spec_prior
-pretentious,very_crit,completed,2,critical,convenient,nonprob,not_recom,not_recom
-pretentious,very_crit,completed,2,critical,convenient,slightly_prob,recommended,spec_prior
-pretentious,very_crit,completed,2,critical,convenient,slightly_prob,priority,spec_prior
-pretentious,very_crit,completed,2,critical,convenient,slightly_prob,not_recom,not_recom
-pretentious,very_crit,completed,2,critical,convenient,problematic,recommended,spec_prior
-pretentious,very_crit,completed,2,critical,convenient,problematic,priority,spec_prior
-pretentious,very_crit,completed,2,critical,convenient,problematic,not_recom,not_recom
-pretentious,very_crit,completed,2,critical,inconv,nonprob,recommended,spec_prior
-pretentious,very_crit,completed,2,critical,inconv,nonprob,priority,spec_prior
-pretentious,very_crit,completed,2,critical,inconv,nonprob,not_recom,not_recom
-pretentious,very_crit,completed,2,critical,inconv,slightly_prob,recommended,spec_prior
-pretentious,very_crit,completed,2,critical,inconv,slightly_prob,priority,spec_prior
-pretentious,very_crit,completed,2,critical,inconv,slightly_prob,not_recom,not_recom
-pretentious,very_crit,completed,2,critical,inconv,problematic,recommended,spec_prior
-pretentious,very_crit,completed,2,critical,inconv,problematic,priority,spec_prior
-pretentious,very_crit,completed,2,critical,inconv,problematic,not_recom,not_recom
-pretentious,very_crit,completed,3,convenient,convenient,nonprob,recommended,priority
-pretentious,very_crit,completed,3,convenient,convenient,nonprob,priority,spec_prior
-pretentious,very_crit,completed,3,convenient,convenient,nonprob,not_recom,not_recom
-pretentious,very_crit,completed,3,convenient,convenient,slightly_prob,recommended,priority
-pretentious,very_crit,completed,3,convenient,convenient,slightly_prob,priority,spec_prior
-pretentious,very_crit,completed,3,convenient,convenient,slightly_prob,not_recom,not_recom
-pretentious,very_crit,completed,3,convenient,convenient,problematic,recommended,spec_prior
-pretentious,very_crit,completed,3,convenient,convenient,problematic,priority,spec_prior
-pretentious,very_crit,completed,3,convenient,convenient,problematic,not_recom,not_recom
-pretentious,very_crit,completed,3,convenient,inconv,nonprob,recommended,spec_prior
-pretentious,very_crit,completed,3,convenient,inconv,nonprob,priority,spec_prior
-pretentious,very_crit,completed,3,convenient,inconv,nonprob,not_recom,not_recom
-pretentious,very_crit,completed,3,convenient,inconv,slightly_prob,recommended,spec_prior
-pretentious,very_crit,completed,3,convenient,inconv,slightly_prob,priority,spec_prior
-pretentious,very_crit,completed,3,convenient,inconv,slightly_prob,not_recom,not_recom
-pretentious,very_crit,completed,3,convenient,inconv,problematic,recommended,spec_prior
-pretentious,very_crit,completed,3,convenient,inconv,problematic,priority,spec_prior
-pretentious,very_crit,completed,3,convenient,inconv,problematic,not_recom,not_recom
-pretentious,very_crit,completed,3,less_conv,convenient,nonprob,recommended,spec_prior
-pretentious,very_crit,completed,3,less_conv,convenient,nonprob,priority,spec_prior
-pretentious,very_crit,completed,3,less_conv,convenient,nonprob,not_recom,not_recom
-pretentious,very_crit,completed,3,less_conv,convenient,slightly_prob,recommended,spec_prior
-pretentious,very_crit,completed,3,less_conv,convenient,slightly_prob,priority,spec_prior
-pretentious,very_crit,completed,3,less_conv,convenient,slightly_prob,not_recom,not_recom
-pretentious,very_crit,completed,3,less_conv,convenient,problematic,recommended,spec_prior
-pretentious,very_crit,completed,3,less_conv,convenient,problematic,priority,spec_prior
-pretentious,very_crit,completed,3,less_conv,convenient,problematic,not_recom,not_recom
-pretentious,very_crit,completed,3,less_conv,inconv,nonprob,recommended,spec_prior
-pretentious,very_crit,completed,3,less_conv,inconv,nonprob,priority,spec_prior
-pretentious,very_crit,completed,3,less_conv,inconv,nonprob,not_recom,not_recom
-pretentious,very_crit,completed,3,less_conv,inconv,slightly_prob,recommended,spec_prior
-pretentious,very_crit,completed,3,less_conv,inconv,slightly_prob,priority,spec_prior
-pretentious,very_crit,completed,3,less_conv,inconv,slightly_prob,not_recom,not_recom
-pretentious,very_crit,completed,3,less_conv,inconv,problematic,recommended,spec_prior
-pretentious,very_crit,completed,3,less_conv,inconv,problematic,priority,spec_prior
-pretentious,very_crit,completed,3,less_conv,inconv,problematic,not_recom,not_recom
-pretentious,very_crit,completed,3,critical,convenient,nonprob,recommended,spec_prior
-pretentious,very_crit,completed,3,critical,convenient,nonprob,priority,spec_prior
-pretentious,very_crit,completed,3,critical,convenient,nonprob,not_recom,not_recom
-pretentious,very_crit,completed,3,critical,convenient,slightly_prob,recommended,spec_prior
-pretentious,very_crit,completed,3,critical,convenient,slightly_prob,priority,spec_prior
-pretentious,very_crit,completed,3,critical,convenient,slightly_prob,not_recom,not_recom
-pretentious,very_crit,completed,3,critical,convenient,problematic,recommended,spec_prior
-pretentious,very_crit,completed,3,critical,convenient,problematic,priority,spec_prior
-pretentious,very_crit,completed,3,critical,convenient,problematic,not_recom,not_recom
-pretentious,very_crit,completed,3,critical,inconv,nonprob,recommended,spec_prior
-pretentious,very_crit,completed,3,critical,inconv,nonprob,priority,spec_prior
-pretentious,very_crit,completed,3,critical,inconv,nonprob,not_recom,not_recom
-pretentious,very_crit,completed,3,critical,inconv,slightly_prob,recommended,spec_prior
-pretentious,very_crit,completed,3,critical,inconv,slightly_prob,priority,spec_prior
-pretentious,very_crit,completed,3,critical,inconv,slightly_prob,not_recom,not_recom
-pretentious,very_crit,completed,3,critical,inconv,problematic,recommended,spec_prior
-pretentious,very_crit,completed,3,critical,inconv,problematic,priority,spec_prior
-pretentious,very_crit,completed,3,critical,inconv,problematic,not_recom,not_recom
-pretentious,very_crit,completed,more,convenient,convenient,nonprob,recommended,priority
-pretentious,very_crit,completed,more,convenient,convenient,nonprob,priority,spec_prior
-pretentious,very_crit,completed,more,convenient,convenient,nonprob,not_recom,not_recom
-pretentious,very_crit,completed,more,convenient,convenient,slightly_prob,recommended,priority
-pretentious,very_crit,completed,more,convenient,convenient,slightly_prob,priority,spec_prior
-pretentious,very_crit,completed,more,convenient,convenient,slightly_prob,not_recom,not_recom
-pretentious,very_crit,completed,more,convenient,convenient,problematic,recommended,spec_prior
-pretentious,very_crit,completed,more,convenient,convenient,problematic,priority,spec_prior
-pretentious,very_crit,completed,more,convenient,convenient,problematic,not_recom,not_recom
-pretentious,very_crit,completed,more,convenient,inconv,nonprob,recommended,spec_prior
-pretentious,very_crit,completed,more,convenient,inconv,nonprob,priority,spec_prior
-pretentious,very_crit,completed,more,convenient,inconv,nonprob,not_recom,not_recom
-pretentious,very_crit,completed,more,convenient,inconv,slightly_prob,recommended,spec_prior
-pretentious,very_crit,completed,more,convenient,inconv,slightly_prob,priority,spec_prior
-pretentious,very_crit,completed,more,convenient,inconv,slightly_prob,not_recom,not_recom
-pretentious,very_crit,completed,more,convenient,inconv,problematic,recommended,spec_prior
-pretentious,very_crit,completed,more,convenient,inconv,problematic,priority,spec_prior
-pretentious,very_crit,completed,more,convenient,inconv,problematic,not_recom,not_recom
-pretentious,very_crit,completed,more,less_conv,convenient,nonprob,recommended,spec_prior
-pretentious,very_crit,completed,more,less_conv,convenient,nonprob,priority,spec_prior
-pretentious,very_crit,completed,more,less_conv,convenient,nonprob,not_recom,not_recom
-pretentious,very_crit,completed,more,less_conv,convenient,slightly_prob,recommended,spec_prior
-pretentious,very_crit,completed,more,less_conv,convenient,slightly_prob,priority,spec_prior
-pretentious,very_crit,completed,more,less_conv,convenient,slightly_prob,not_recom,not_recom
-pretentious,very_crit,completed,more,less_conv,convenient,problematic,recommended,spec_prior
-pretentious,very_crit,completed,more,less_conv,convenient,problematic,priority,spec_prior
-pretentious,very_crit,completed,more,less_conv,convenient,problematic,not_recom,not_recom
-pretentious,very_crit,completed,more,less_conv,inconv,nonprob,recommended,spec_prior
-pretentious,very_crit,completed,more,less_conv,inconv,nonprob,priority,spec_prior
-pretentious,very_crit,completed,more,less_conv,inconv,nonprob,not_recom,not_recom
-pretentious,very_crit,completed,more,less_conv,inconv,slightly_prob,recommended,spec_prior
-pretentious,very_crit,completed,more,less_conv,inconv,slightly_prob,priority,spec_prior
-pretentious,very_crit,completed,more,less_conv,inconv,slightly_prob,not_recom,not_recom
-pretentious,very_crit,completed,more,less_conv,inconv,problematic,recommended,spec_prior
-pretentious,very_crit,completed,more,less_conv,inconv,problematic,priority,spec_prior
-pretentious,very_crit,completed,more,less_conv,inconv,problematic,not_recom,not_recom
-pretentious,very_crit,completed,more,critical,convenient,nonprob,recommended,spec_prior
-pretentious,very_crit,completed,more,critical,convenient,nonprob,priority,spec_prior
-pretentious,very_crit,completed,more,critical,convenient,nonprob,not_recom,not_recom
-pretentious,very_crit,completed,more,critical,convenient,slightly_prob,recommended,spec_prior
-pretentious,very_crit,completed,more,critical,convenient,slightly_prob,priority,spec_prior
-pretentious,very_crit,completed,more,critical,convenient,slightly_prob,not_recom,not_recom
-pretentious,very_crit,completed,more,critical,convenient,problematic,recommended,spec_prior
-pretentious,very_crit,completed,more,critical,convenient,problematic,priority,spec_prior
-pretentious,very_crit,completed,more,critical,convenient,problematic,not_recom,not_recom
-pretentious,very_crit,completed,more,critical,inconv,nonprob,recommended,spec_prior
-pretentious,very_crit,completed,more,critical,inconv,nonprob,priority,spec_prior
-pretentious,very_crit,completed,more,critical,inconv,nonprob,not_recom,not_recom
-pretentious,very_crit,completed,more,critical,inconv,slightly_prob,recommended,spec_prior
-pretentious,very_crit,completed,more,critical,inconv,slightly_prob,priority,spec_prior
-pretentious,very_crit,completed,more,critical,inconv,slightly_prob,not_recom,not_recom
-pretentious,very_crit,completed,more,critical,inconv,problematic,recommended,spec_prior
-pretentious,very_crit,completed,more,critical,inconv,problematic,priority,spec_prior
-pretentious,very_crit,completed,more,critical,inconv,problematic,not_recom,not_recom
-pretentious,very_crit,incomplete,1,convenient,convenient,nonprob,recommended,priority
-pretentious,very_crit,incomplete,1,convenient,convenient,nonprob,priority,spec_prior
-pretentious,very_crit,incomplete,1,convenient,convenient,nonprob,not_recom,not_recom
-pretentious,very_crit,incomplete,1,convenient,convenient,slightly_prob,recommended,priority
-pretentious,very_crit,incomplete,1,convenient,convenient,slightly_prob,priority,spec_prior
-pretentious,very_crit,incomplete,1,convenient,convenient,slightly_prob,not_recom,not_recom
-pretentious,very_crit,incomplete,1,convenient,convenient,problematic,recommended,spec_prior
-pretentious,very_crit,incomplete,1,convenient,convenient,problematic,priority,spec_prior
-pretentious,very_crit,incomplete,1,convenient,convenient,problematic,not_recom,not_recom
-pretentious,very_crit,incomplete,1,convenient,inconv,nonprob,recommended,priority
-pretentious,very_crit,incomplete,1,convenient,inconv,nonprob,priority,spec_prior
-pretentious,very_crit,incomplete,1,convenient,inconv,nonprob,not_recom,not_recom
-pretentious,very_crit,incomplete,1,convenient,inconv,slightly_prob,recommended,priority
-pretentious,very_crit,incomplete,1,convenient,inconv,slightly_prob,priority,spec_prior
-pretentious,very_crit,incomplete,1,convenient,inconv,slightly_prob,not_recom,not_recom
-pretentious,very_crit,incomplete,1,convenient,inconv,problematic,recommended,spec_prior
-pretentious,very_crit,incomplete,1,convenient,inconv,problematic,priority,spec_prior
-pretentious,very_crit,incomplete,1,convenient,inconv,problematic,not_recom,not_recom
-pretentious,very_crit,incomplete,1,less_conv,convenient,nonprob,recommended,priority
-pretentious,very_crit,incomplete,1,less_conv,convenient,nonprob,priority,spec_prior
-pretentious,very_crit,incomplete,1,less_conv,convenient,nonprob,not_recom,not_recom
-pretentious,very_crit,incomplete,1,less_conv,convenient,slightly_prob,recommended,priority
-pretentious,very_crit,incomplete,1,less_conv,convenient,slightly_prob,priority,spec_prior
-pretentious,very_crit,incomplete,1,less_conv,convenient,slightly_prob,not_recom,not_recom
-pretentious,very_crit,incomplete,1,less_conv,convenient,problematic,recommended,spec_prior
-pretentious,very_crit,incomplete,1,less_conv,convenient,problematic,priority,spec_prior
-pretentious,very_crit,incomplete,1,less_conv,convenient,problematic,not_recom,not_recom
-pretentious,very_crit,incomplete,1,less_conv,inconv,nonprob,recommended,priority
-pretentious,very_crit,incomplete,1,less_conv,inconv,nonprob,priority,spec_prior
-pretentious,very_crit,incomplete,1,less_conv,inconv,nonprob,not_recom,not_recom
-pretentious,very_crit,incomplete,1,less_conv,inconv,slightly_prob,recommended,priority
-pretentious,very_crit,incomplete,1,less_conv,inconv,slightly_prob,priority,spec_prior
-pretentious,very_crit,incomplete,1,less_conv,inconv,slightly_prob,not_recom,not_recom
-pretentious,very_crit,incomplete,1,less_conv,inconv,problematic,recommended,spec_prior
-pretentious,very_crit,incomplete,1,less_conv,inconv,problematic,priority,spec_prior
-pretentious,very_crit,incomplete,1,less_conv,inconv,problematic,not_recom,not_recom
-pretentious,very_crit,incomplete,1,critical,convenient,nonprob,recommended,spec_prior
-pretentious,very_crit,incomplete,1,critical,convenient,nonprob,priority,spec_prior
-pretentious,very_crit,incomplete,1,critical,convenient,nonprob,not_recom,not_recom
-pretentious,very_crit,incomplete,1,critical,convenient,slightly_prob,recommended,spec_prior
-pretentious,very_crit,incomplete,1,critical,convenient,slightly_prob,priority,spec_prior
-pretentious,very_crit,incomplete,1,critical,convenient,slightly_prob,not_recom,not_recom
-pretentious,very_crit,incomplete,1,critical,convenient,problematic,recommended,spec_prior
-pretentious,very_crit,incomplete,1,critical,convenient,problematic,priority,spec_prior
-pretentious,very_crit,incomplete,1,critical,convenient,problematic,not_recom,not_recom
-pretentious,very_crit,incomplete,1,critical,inconv,nonprob,recommended,spec_prior
-pretentious,very_crit,incomplete,1,critical,inconv,nonprob,priority,spec_prior
-pretentious,very_crit,incomplete,1,critical,inconv,nonprob,not_recom,not_recom
-pretentious,very_crit,incomplete,1,critical,inconv,slightly_prob,recommended,spec_prior
-pretentious,very_crit,incomplete,1,critical,inconv,slightly_prob,priority,spec_prior
-pretentious,very_crit,incomplete,1,critical,inconv,slightly_prob,not_recom,not_recom
-pretentious,very_crit,incomplete,1,critical,inconv,problematic,recommended,spec_prior
-pretentious,very_crit,incomplete,1,critical,inconv,problematic,priority,spec_prior
-pretentious,very_crit,incomplete,1,critical,inconv,problematic,not_recom,not_recom
-pretentious,very_crit,incomplete,2,convenient,convenient,nonprob,recommended,priority
-pretentious,very_crit,incomplete,2,convenient,convenient,nonprob,priority,spec_prior
-pretentious,very_crit,incomplete,2,convenient,convenient,nonprob,not_recom,not_recom
-pretentious,very_crit,incomplete,2,convenient,convenient,slightly_prob,recommended,priority
-pretentious,very_crit,incomplete,2,convenient,convenient,slightly_prob,priority,spec_prior
-pretentious,very_crit,incomplete,2,convenient,convenient,slightly_prob,not_recom,not_recom
-pretentious,very_crit,incomplete,2,convenient,convenient,problematic,recommended,spec_prior
-pretentious,very_crit,incomplete,2,convenient,convenient,problematic,priority,spec_prior
-pretentious,very_crit,incomplete,2,convenient,convenient,problematic,not_recom,not_recom
-pretentious,very_crit,incomplete,2,convenient,inconv,nonprob,recommended,spec_prior
-pretentious,very_crit,incomplete,2,convenient,inconv,nonprob,priority,spec_prior
-pretentious,very_crit,incomplete,2,convenient,inconv,nonprob,not_recom,not_recom
-pretentious,very_crit,incomplete,2,convenient,inconv,slightly_prob,recommended,spec_prior
-pretentious,very_crit,incomplete,2,convenient,inconv,slightly_prob,priority,spec_prior
-pretentious,very_crit,incomplete,2,convenient,inconv,slightly_prob,not_recom,not_recom
-pretentious,very_crit,incomplete,2,convenient,inconv,problematic,recommended,spec_prior
-pretentious,very_crit,incomplete,2,convenient,inconv,problematic,priority,spec_prior
-pretentious,very_crit,incomplete,2,convenient,inconv,problematic,not_recom,not_recom
-pretentious,very_crit,incomplete,2,less_conv,convenient,nonprob,recommended,spec_prior
-pretentious,very_crit,incomplete,2,less_conv,convenient,nonprob,priority,spec_prior
-pretentious,very_crit,incomplete,2,less_conv,convenient,nonprob,not_recom,not_recom
-pretentious,very_crit,incomplete,2,less_conv,convenient,slightly_prob,recommended,spec_prior
-pretentious,very_crit,incomplete,2,less_conv,convenient,slightly_prob,priority,spec_prior
-pretentious,very_crit,incomplete,2,less_conv,convenient,slightly_prob,not_recom,not_recom
-pretentious,very_crit,incomplete,2,less_conv,convenient,problematic,recommended,spec_prior
-pretentious,very_crit,incomplete,2,less_conv,convenient,problematic,priority,spec_prior
-pretentious,very_crit,incomplete,2,less_conv,convenient,problematic,not_recom,not_recom
-pretentious,very_crit,incomplete,2,less_conv,inconv,nonprob,recommended,spec_prior
-pretentious,very_crit,incomplete,2,less_conv,inconv,nonprob,priority,spec_prior
-pretentious,very_crit,incomplete,2,less_conv,inconv,nonprob,not_recom,not_recom
-pretentious,very_crit,incomplete,2,less_conv,inconv,slightly_prob,recommended,spec_prior
-pretentious,very_crit,incomplete,2,less_conv,inconv,slightly_prob,priority,spec_prior
-pretentious,very_crit,incomplete,2,less_conv,inconv,slightly_prob,not_recom,not_recom
-pretentious,very_crit,incomplete,2,less_conv,inconv,problematic,recommended,spec_prior
-pretentious,very_crit,incomplete,2,less_conv,inconv,problematic,priority,spec_prior
-pretentious,very_crit,incomplete,2,less_conv,inconv,problematic,not_recom,not_recom
-pretentious,very_crit,incomplete,2,critical,convenient,nonprob,recommended,spec_prior
-pretentious,very_crit,incomplete,2,critical,convenient,nonprob,priority,spec_prior
-pretentious,very_crit,incomplete,2,critical,convenient,nonprob,not_recom,not_recom
-pretentious,very_crit,incomplete,2,critical,convenient,slightly_prob,recommended,spec_prior
-pretentious,very_crit,incomplete,2,critical,convenient,slightly_prob,priority,spec_prior
-pretentious,very_crit,incomplete,2,critical,convenient,slightly_prob,not_recom,not_recom
-pretentious,very_crit,incomplete,2,critical,convenient,problematic,recommended,spec_prior
-pretentious,very_crit,incomplete,2,critical,convenient,problematic,priority,spec_prior
-pretentious,very_crit,incomplete,2,critical,convenient,problematic,not_recom,not_recom
-pretentious,very_crit,incomplete,2,critical,inconv,nonprob,recommended,spec_prior
-pretentious,very_crit,incomplete,2,critical,inconv,nonprob,priority,spec_prior
-pretentious,very_crit,incomplete,2,critical,inconv,nonprob,not_recom,not_recom
-pretentious,very_crit,incomplete,2,critical,inconv,slightly_prob,recommended,spec_prior
-pretentious,very_crit,incomplete,2,critical,inconv,slightly_prob,priority,spec_prior
-pretentious,very_crit,incomplete,2,critical,inconv,slightly_prob,not_recom,not_recom
-pretentious,very_crit,incomplete,2,critical,inconv,problematic,recommended,spec_prior
-pretentious,very_crit,incomplete,2,critical,inconv,problematic,priority,spec_prior
-pretentious,very_crit,incomplete,2,critical,inconv,problematic,not_recom,not_recom
-pretentious,very_crit,incomplete,3,convenient,convenient,nonprob,recommended,priority
-pretentious,very_crit,incomplete,3,convenient,convenient,nonprob,priority,spec_prior
-pretentious,very_crit,incomplete,3,convenient,convenient,nonprob,not_recom,not_recom
-pretentious,very_crit,incomplete,3,convenient,convenient,slightly_prob,recommended,priority
-pretentious,very_crit,incomplete,3,convenient,convenient,slightly_prob,priority,spec_prior
-pretentious,very_crit,incomplete,3,convenient,convenient,slightly_prob,not_recom,not_recom
-pretentious,very_crit,incomplete,3,convenient,convenient,problematic,recommended,spec_prior
-pretentious,very_crit,incomplete,3,convenient,convenient,problematic,priority,spec_prior
-pretentious,very_crit,incomplete,3,convenient,convenient,problematic,not_recom,not_recom
-pretentious,very_crit,incomplete,3,convenient,inconv,nonprob,recommended,spec_prior
-pretentious,very_crit,incomplete,3,convenient,inconv,nonprob,priority,spec_prior
-pretentious,very_crit,incomplete,3,convenient,inconv,nonprob,not_recom,not_recom
-pretentious,very_crit,incomplete,3,convenient,inconv,slightly_prob,recommended,spec_prior
-pretentious,very_crit,incomplete,3,convenient,inconv,slightly_prob,priority,spec_prior
-pretentious,very_crit,incomplete,3,convenient,inconv,slightly_prob,not_recom,not_recom
-pretentious,very_crit,incomplete,3,convenient,inconv,problematic,recommended,spec_prior
-pretentious,very_crit,incomplete,3,convenient,inconv,problematic,priority,spec_prior
-pretentious,very_crit,incomplete,3,convenient,inconv,problematic,not_recom,not_recom
-pretentious,very_crit,incomplete,3,less_conv,convenient,nonprob,recommended,spec_prior
-pretentious,very_crit,incomplete,3,less_conv,convenient,nonprob,priority,spec_prior
-pretentious,very_crit,incomplete,3,less_conv,convenient,nonprob,not_recom,not_recom
-pretentious,very_crit,incomplete,3,less_conv,convenient,slightly_prob,recommended,spec_prior
-pretentious,very_crit,incomplete,3,less_conv,convenient,slightly_prob,priority,spec_prior
-pretentious,very_crit,incomplete,3,less_conv,convenient,slightly_prob,not_recom,not_recom
-pretentious,very_crit,incomplete,3,less_conv,convenient,problematic,recommended,spec_prior
-pretentious,very_crit,incomplete,3,less_conv,convenient,problematic,priority,spec_prior
-pretentious,very_crit,incomplete,3,less_conv,convenient,problematic,not_recom,not_recom
-pretentious,very_crit,incomplete,3,less_conv,inconv,nonprob,recommended,spec_prior
-pretentious,very_crit,incomplete,3,less_conv,inconv,nonprob,priority,spec_prior
-pretentious,very_crit,incomplete,3,less_conv,inconv,nonprob,not_recom,not_recom
-pretentious,very_crit,incomplete,3,less_conv,inconv,slightly_prob,recommended,spec_prior
-pretentious,very_crit,incomplete,3,less_conv,inconv,slightly_prob,priority,spec_prior
-pretentious,very_crit,incomplete,3,less_conv,inconv,slightly_prob,not_recom,not_recom
-pretentious,very_crit,incomplete,3,less_conv,inconv,problematic,recommended,spec_prior
-pretentious,very_crit,incomplete,3,less_conv,inconv,problematic,priority,spec_prior
-pretentious,very_crit,incomplete,3,less_conv,inconv,problematic,not_recom,not_recom -pretentious,very_crit,incomplete,3,critical,convenient,nonprob,recommended,spec_prior -pretentious,very_crit,incomplete,3,critical,convenient,nonprob,priority,spec_prior -pretentious,very_crit,incomplete,3,critical,convenient,nonprob,not_recom,not_recom -pretentious,very_crit,incomplete,3,critical,convenient,slightly_prob,recommended,spec_prior -pretentious,very_crit,incomplete,3,critical,convenient,slightly_prob,priority,spec_prior -pretentious,very_crit,incomplete,3,critical,convenient,slightly_prob,not_recom,not_recom -pretentious,very_crit,incomplete,3,critical,convenient,problematic,recommended,spec_prior -pretentious,very_crit,incomplete,3,critical,convenient,problematic,priority,spec_prior -pretentious,very_crit,incomplete,3,critical,convenient,problematic,not_recom,not_recom -pretentious,very_crit,incomplete,3,critical,inconv,nonprob,recommended,spec_prior -pretentious,very_crit,incomplete,3,critical,inconv,nonprob,priority,spec_prior -pretentious,very_crit,incomplete,3,critical,inconv,nonprob,not_recom,not_recom -pretentious,very_crit,incomplete,3,critical,inconv,slightly_prob,recommended,spec_prior -pretentious,very_crit,incomplete,3,critical,inconv,slightly_prob,priority,spec_prior -pretentious,very_crit,incomplete,3,critical,inconv,slightly_prob,not_recom,not_recom -pretentious,very_crit,incomplete,3,critical,inconv,problematic,recommended,spec_prior -pretentious,very_crit,incomplete,3,critical,inconv,problematic,priority,spec_prior -pretentious,very_crit,incomplete,3,critical,inconv,problematic,not_recom,not_recom -pretentious,very_crit,incomplete,more,convenient,convenient,nonprob,recommended,priority -pretentious,very_crit,incomplete,more,convenient,convenient,nonprob,priority,spec_prior -pretentious,very_crit,incomplete,more,convenient,convenient,nonprob,not_recom,not_recom 
-pretentious,very_crit,incomplete,more,convenient,convenient,slightly_prob,recommended,priority -pretentious,very_crit,incomplete,more,convenient,convenient,slightly_prob,priority,spec_prior -pretentious,very_crit,incomplete,more,convenient,convenient,slightly_prob,not_recom,not_recom -pretentious,very_crit,incomplete,more,convenient,convenient,problematic,recommended,spec_prior -pretentious,very_crit,incomplete,more,convenient,convenient,problematic,priority,spec_prior -pretentious,very_crit,incomplete,more,convenient,convenient,problematic,not_recom,not_recom -pretentious,very_crit,incomplete,more,convenient,inconv,nonprob,recommended,spec_prior -pretentious,very_crit,incomplete,more,convenient,inconv,nonprob,priority,spec_prior -pretentious,very_crit,incomplete,more,convenient,inconv,nonprob,not_recom,not_recom -pretentious,very_crit,incomplete,more,convenient,inconv,slightly_prob,recommended,spec_prior -pretentious,very_crit,incomplete,more,convenient,inconv,slightly_prob,priority,spec_prior -pretentious,very_crit,incomplete,more,convenient,inconv,slightly_prob,not_recom,not_recom -pretentious,very_crit,incomplete,more,convenient,inconv,problematic,recommended,spec_prior -pretentious,very_crit,incomplete,more,convenient,inconv,problematic,priority,spec_prior -pretentious,very_crit,incomplete,more,convenient,inconv,problematic,not_recom,not_recom -pretentious,very_crit,incomplete,more,less_conv,convenient,nonprob,recommended,spec_prior -pretentious,very_crit,incomplete,more,less_conv,convenient,nonprob,priority,spec_prior -pretentious,very_crit,incomplete,more,less_conv,convenient,nonprob,not_recom,not_recom -pretentious,very_crit,incomplete,more,less_conv,convenient,slightly_prob,recommended,spec_prior -pretentious,very_crit,incomplete,more,less_conv,convenient,slightly_prob,priority,spec_prior -pretentious,very_crit,incomplete,more,less_conv,convenient,slightly_prob,not_recom,not_recom 
-pretentious,very_crit,incomplete,more,less_conv,convenient,problematic,recommended,spec_prior -pretentious,very_crit,incomplete,more,less_conv,convenient,problematic,priority,spec_prior -pretentious,very_crit,incomplete,more,less_conv,convenient,problematic,not_recom,not_recom -pretentious,very_crit,incomplete,more,less_conv,inconv,nonprob,recommended,spec_prior -pretentious,very_crit,incomplete,more,less_conv,inconv,nonprob,priority,spec_prior -pretentious,very_crit,incomplete,more,less_conv,inconv,nonprob,not_recom,not_recom -pretentious,very_crit,incomplete,more,less_conv,inconv,slightly_prob,recommended,spec_prior -pretentious,very_crit,incomplete,more,less_conv,inconv,slightly_prob,priority,spec_prior -pretentious,very_crit,incomplete,more,less_conv,inconv,slightly_prob,not_recom,not_recom -pretentious,very_crit,incomplete,more,less_conv,inconv,problematic,recommended,spec_prior -pretentious,very_crit,incomplete,more,less_conv,inconv,problematic,priority,spec_prior -pretentious,very_crit,incomplete,more,less_conv,inconv,problematic,not_recom,not_recom -pretentious,very_crit,incomplete,more,critical,convenient,nonprob,recommended,spec_prior -pretentious,very_crit,incomplete,more,critical,convenient,nonprob,priority,spec_prior -pretentious,very_crit,incomplete,more,critical,convenient,nonprob,not_recom,not_recom -pretentious,very_crit,incomplete,more,critical,convenient,slightly_prob,recommended,spec_prior -pretentious,very_crit,incomplete,more,critical,convenient,slightly_prob,priority,spec_prior -pretentious,very_crit,incomplete,more,critical,convenient,slightly_prob,not_recom,not_recom -pretentious,very_crit,incomplete,more,critical,convenient,problematic,recommended,spec_prior -pretentious,very_crit,incomplete,more,critical,convenient,problematic,priority,spec_prior -pretentious,very_crit,incomplete,more,critical,convenient,problematic,not_recom,not_recom -pretentious,very_crit,incomplete,more,critical,inconv,nonprob,recommended,spec_prior 
-pretentious,very_crit,incomplete,more,critical,inconv,nonprob,priority,spec_prior -pretentious,very_crit,incomplete,more,critical,inconv,nonprob,not_recom,not_recom -pretentious,very_crit,incomplete,more,critical,inconv,slightly_prob,recommended,spec_prior -pretentious,very_crit,incomplete,more,critical,inconv,slightly_prob,priority,spec_prior -pretentious,very_crit,incomplete,more,critical,inconv,slightly_prob,not_recom,not_recom -pretentious,very_crit,incomplete,more,critical,inconv,problematic,recommended,spec_prior -pretentious,very_crit,incomplete,more,critical,inconv,problematic,priority,spec_prior -pretentious,very_crit,incomplete,more,critical,inconv,problematic,not_recom,not_recom -pretentious,very_crit,foster,1,convenient,convenient,nonprob,recommended,priority -pretentious,very_crit,foster,1,convenient,convenient,nonprob,priority,spec_prior -pretentious,very_crit,foster,1,convenient,convenient,nonprob,not_recom,not_recom -pretentious,very_crit,foster,1,convenient,convenient,slightly_prob,recommended,priority -pretentious,very_crit,foster,1,convenient,convenient,slightly_prob,priority,spec_prior -pretentious,very_crit,foster,1,convenient,convenient,slightly_prob,not_recom,not_recom -pretentious,very_crit,foster,1,convenient,convenient,problematic,recommended,spec_prior -pretentious,very_crit,foster,1,convenient,convenient,problematic,priority,spec_prior -pretentious,very_crit,foster,1,convenient,convenient,problematic,not_recom,not_recom -pretentious,very_crit,foster,1,convenient,inconv,nonprob,recommended,spec_prior -pretentious,very_crit,foster,1,convenient,inconv,nonprob,priority,spec_prior -pretentious,very_crit,foster,1,convenient,inconv,nonprob,not_recom,not_recom -pretentious,very_crit,foster,1,convenient,inconv,slightly_prob,recommended,spec_prior -pretentious,very_crit,foster,1,convenient,inconv,slightly_prob,priority,spec_prior -pretentious,very_crit,foster,1,convenient,inconv,slightly_prob,not_recom,not_recom 
-pretentious,very_crit,foster,1,convenient,inconv,problematic,recommended,spec_prior -pretentious,very_crit,foster,1,convenient,inconv,problematic,priority,spec_prior -pretentious,very_crit,foster,1,convenient,inconv,problematic,not_recom,not_recom -pretentious,very_crit,foster,1,less_conv,convenient,nonprob,recommended,spec_prior -pretentious,very_crit,foster,1,less_conv,convenient,nonprob,priority,spec_prior -pretentious,very_crit,foster,1,less_conv,convenient,nonprob,not_recom,not_recom -pretentious,very_crit,foster,1,less_conv,convenient,slightly_prob,recommended,spec_prior -pretentious,very_crit,foster,1,less_conv,convenient,slightly_prob,priority,spec_prior -pretentious,very_crit,foster,1,less_conv,convenient,slightly_prob,not_recom,not_recom -pretentious,very_crit,foster,1,less_conv,convenient,problematic,recommended,spec_prior -pretentious,very_crit,foster,1,less_conv,convenient,problematic,priority,spec_prior -pretentious,very_crit,foster,1,less_conv,convenient,problematic,not_recom,not_recom -pretentious,very_crit,foster,1,less_conv,inconv,nonprob,recommended,spec_prior -pretentious,very_crit,foster,1,less_conv,inconv,nonprob,priority,spec_prior -pretentious,very_crit,foster,1,less_conv,inconv,nonprob,not_recom,not_recom -pretentious,very_crit,foster,1,less_conv,inconv,slightly_prob,recommended,spec_prior -pretentious,very_crit,foster,1,less_conv,inconv,slightly_prob,priority,spec_prior -pretentious,very_crit,foster,1,less_conv,inconv,slightly_prob,not_recom,not_recom -pretentious,very_crit,foster,1,less_conv,inconv,problematic,recommended,spec_prior -pretentious,very_crit,foster,1,less_conv,inconv,problematic,priority,spec_prior -pretentious,very_crit,foster,1,less_conv,inconv,problematic,not_recom,not_recom -pretentious,very_crit,foster,1,critical,convenient,nonprob,recommended,spec_prior -pretentious,very_crit,foster,1,critical,convenient,nonprob,priority,spec_prior -pretentious,very_crit,foster,1,critical,convenient,nonprob,not_recom,not_recom 
-pretentious,very_crit,foster,1,critical,convenient,slightly_prob,recommended,spec_prior -pretentious,very_crit,foster,1,critical,convenient,slightly_prob,priority,spec_prior -pretentious,very_crit,foster,1,critical,convenient,slightly_prob,not_recom,not_recom -pretentious,very_crit,foster,1,critical,convenient,problematic,recommended,spec_prior -pretentious,very_crit,foster,1,critical,convenient,problematic,priority,spec_prior -pretentious,very_crit,foster,1,critical,convenient,problematic,not_recom,not_recom -pretentious,very_crit,foster,1,critical,inconv,nonprob,recommended,spec_prior -pretentious,very_crit,foster,1,critical,inconv,nonprob,priority,spec_prior -pretentious,very_crit,foster,1,critical,inconv,nonprob,not_recom,not_recom -pretentious,very_crit,foster,1,critical,inconv,slightly_prob,recommended,spec_prior -pretentious,very_crit,foster,1,critical,inconv,slightly_prob,priority,spec_prior -pretentious,very_crit,foster,1,critical,inconv,slightly_prob,not_recom,not_recom -pretentious,very_crit,foster,1,critical,inconv,problematic,recommended,spec_prior -pretentious,very_crit,foster,1,critical,inconv,problematic,priority,spec_prior -pretentious,very_crit,foster,1,critical,inconv,problematic,not_recom,not_recom -pretentious,very_crit,foster,2,convenient,convenient,nonprob,recommended,priority -pretentious,very_crit,foster,2,convenient,convenient,nonprob,priority,spec_prior -pretentious,very_crit,foster,2,convenient,convenient,nonprob,not_recom,not_recom -pretentious,very_crit,foster,2,convenient,convenient,slightly_prob,recommended,priority -pretentious,very_crit,foster,2,convenient,convenient,slightly_prob,priority,spec_prior -pretentious,very_crit,foster,2,convenient,convenient,slightly_prob,not_recom,not_recom -pretentious,very_crit,foster,2,convenient,convenient,problematic,recommended,spec_prior -pretentious,very_crit,foster,2,convenient,convenient,problematic,priority,spec_prior 
-pretentious,very_crit,foster,2,convenient,convenient,problematic,not_recom,not_recom -pretentious,very_crit,foster,2,convenient,inconv,nonprob,recommended,spec_prior -pretentious,very_crit,foster,2,convenient,inconv,nonprob,priority,spec_prior -pretentious,very_crit,foster,2,convenient,inconv,nonprob,not_recom,not_recom -pretentious,very_crit,foster,2,convenient,inconv,slightly_prob,recommended,spec_prior -pretentious,very_crit,foster,2,convenient,inconv,slightly_prob,priority,spec_prior -pretentious,very_crit,foster,2,convenient,inconv,slightly_prob,not_recom,not_recom -pretentious,very_crit,foster,2,convenient,inconv,problematic,recommended,spec_prior -pretentious,very_crit,foster,2,convenient,inconv,problematic,priority,spec_prior -pretentious,very_crit,foster,2,convenient,inconv,problematic,not_recom,not_recom -pretentious,very_crit,foster,2,less_conv,convenient,nonprob,recommended,spec_prior -pretentious,very_crit,foster,2,less_conv,convenient,nonprob,priority,spec_prior -pretentious,very_crit,foster,2,less_conv,convenient,nonprob,not_recom,not_recom -pretentious,very_crit,foster,2,less_conv,convenient,slightly_prob,recommended,spec_prior -pretentious,very_crit,foster,2,less_conv,convenient,slightly_prob,priority,spec_prior -pretentious,very_crit,foster,2,less_conv,convenient,slightly_prob,not_recom,not_recom -pretentious,very_crit,foster,2,less_conv,convenient,problematic,recommended,spec_prior -pretentious,very_crit,foster,2,less_conv,convenient,problematic,priority,spec_prior -pretentious,very_crit,foster,2,less_conv,convenient,problematic,not_recom,not_recom -pretentious,very_crit,foster,2,less_conv,inconv,nonprob,recommended,spec_prior -pretentious,very_crit,foster,2,less_conv,inconv,nonprob,priority,spec_prior -pretentious,very_crit,foster,2,less_conv,inconv,nonprob,not_recom,not_recom -pretentious,very_crit,foster,2,less_conv,inconv,slightly_prob,recommended,spec_prior -pretentious,very_crit,foster,2,less_conv,inconv,slightly_prob,priority,spec_prior 
-pretentious,very_crit,foster,2,less_conv,inconv,slightly_prob,not_recom,not_recom -pretentious,very_crit,foster,2,less_conv,inconv,problematic,recommended,spec_prior -pretentious,very_crit,foster,2,less_conv,inconv,problematic,priority,spec_prior -pretentious,very_crit,foster,2,less_conv,inconv,problematic,not_recom,not_recom -pretentious,very_crit,foster,2,critical,convenient,nonprob,recommended,spec_prior -pretentious,very_crit,foster,2,critical,convenient,nonprob,priority,spec_prior -pretentious,very_crit,foster,2,critical,convenient,nonprob,not_recom,not_recom -pretentious,very_crit,foster,2,critical,convenient,slightly_prob,recommended,spec_prior -pretentious,very_crit,foster,2,critical,convenient,slightly_prob,priority,spec_prior -pretentious,very_crit,foster,2,critical,convenient,slightly_prob,not_recom,not_recom -pretentious,very_crit,foster,2,critical,convenient,problematic,recommended,spec_prior -pretentious,very_crit,foster,2,critical,convenient,problematic,priority,spec_prior -pretentious,very_crit,foster,2,critical,convenient,problematic,not_recom,not_recom -pretentious,very_crit,foster,2,critical,inconv,nonprob,recommended,spec_prior -pretentious,very_crit,foster,2,critical,inconv,nonprob,priority,spec_prior -pretentious,very_crit,foster,2,critical,inconv,nonprob,not_recom,not_recom -pretentious,very_crit,foster,2,critical,inconv,slightly_prob,recommended,spec_prior -pretentious,very_crit,foster,2,critical,inconv,slightly_prob,priority,spec_prior -pretentious,very_crit,foster,2,critical,inconv,slightly_prob,not_recom,not_recom -pretentious,very_crit,foster,2,critical,inconv,problematic,recommended,spec_prior -pretentious,very_crit,foster,2,critical,inconv,problematic,priority,spec_prior -pretentious,very_crit,foster,2,critical,inconv,problematic,not_recom,not_recom -pretentious,very_crit,foster,3,convenient,convenient,nonprob,recommended,priority -pretentious,very_crit,foster,3,convenient,convenient,nonprob,priority,spec_prior 
-pretentious,very_crit,foster,3,convenient,convenient,nonprob,not_recom,not_recom -pretentious,very_crit,foster,3,convenient,convenient,slightly_prob,recommended,priority -pretentious,very_crit,foster,3,convenient,convenient,slightly_prob,priority,spec_prior -pretentious,very_crit,foster,3,convenient,convenient,slightly_prob,not_recom,not_recom -pretentious,very_crit,foster,3,convenient,convenient,problematic,recommended,spec_prior -pretentious,very_crit,foster,3,convenient,convenient,problematic,priority,spec_prior -pretentious,very_crit,foster,3,convenient,convenient,problematic,not_recom,not_recom -pretentious,very_crit,foster,3,convenient,inconv,nonprob,recommended,spec_prior -pretentious,very_crit,foster,3,convenient,inconv,nonprob,priority,spec_prior -pretentious,very_crit,foster,3,convenient,inconv,nonprob,not_recom,not_recom -pretentious,very_crit,foster,3,convenient,inconv,slightly_prob,recommended,spec_prior -pretentious,very_crit,foster,3,convenient,inconv,slightly_prob,priority,spec_prior -pretentious,very_crit,foster,3,convenient,inconv,slightly_prob,not_recom,not_recom -pretentious,very_crit,foster,3,convenient,inconv,problematic,recommended,spec_prior -pretentious,very_crit,foster,3,convenient,inconv,problematic,priority,spec_prior -pretentious,very_crit,foster,3,convenient,inconv,problematic,not_recom,not_recom -pretentious,very_crit,foster,3,less_conv,convenient,nonprob,recommended,spec_prior -pretentious,very_crit,foster,3,less_conv,convenient,nonprob,priority,spec_prior -pretentious,very_crit,foster,3,less_conv,convenient,nonprob,not_recom,not_recom -pretentious,very_crit,foster,3,less_conv,convenient,slightly_prob,recommended,spec_prior -pretentious,very_crit,foster,3,less_conv,convenient,slightly_prob,priority,spec_prior -pretentious,very_crit,foster,3,less_conv,convenient,slightly_prob,not_recom,not_recom -pretentious,very_crit,foster,3,less_conv,convenient,problematic,recommended,spec_prior 
-pretentious,very_crit,foster,3,less_conv,convenient,problematic,priority,spec_prior -pretentious,very_crit,foster,3,less_conv,convenient,problematic,not_recom,not_recom -pretentious,very_crit,foster,3,less_conv,inconv,nonprob,recommended,spec_prior -pretentious,very_crit,foster,3,less_conv,inconv,nonprob,priority,spec_prior -pretentious,very_crit,foster,3,less_conv,inconv,nonprob,not_recom,not_recom -pretentious,very_crit,foster,3,less_conv,inconv,slightly_prob,recommended,spec_prior -pretentious,very_crit,foster,3,less_conv,inconv,slightly_prob,priority,spec_prior -pretentious,very_crit,foster,3,less_conv,inconv,slightly_prob,not_recom,not_recom -pretentious,very_crit,foster,3,less_conv,inconv,problematic,recommended,spec_prior -pretentious,very_crit,foster,3,less_conv,inconv,problematic,priority,spec_prior -pretentious,very_crit,foster,3,less_conv,inconv,problematic,not_recom,not_recom -pretentious,very_crit,foster,3,critical,convenient,nonprob,recommended,spec_prior -pretentious,very_crit,foster,3,critical,convenient,nonprob,priority,spec_prior -pretentious,very_crit,foster,3,critical,convenient,nonprob,not_recom,not_recom -pretentious,very_crit,foster,3,critical,convenient,slightly_prob,recommended,spec_prior -pretentious,very_crit,foster,3,critical,convenient,slightly_prob,priority,spec_prior -pretentious,very_crit,foster,3,critical,convenient,slightly_prob,not_recom,not_recom -pretentious,very_crit,foster,3,critical,convenient,problematic,recommended,spec_prior -pretentious,very_crit,foster,3,critical,convenient,problematic,priority,spec_prior -pretentious,very_crit,foster,3,critical,convenient,problematic,not_recom,not_recom -pretentious,very_crit,foster,3,critical,inconv,nonprob,recommended,spec_prior -pretentious,very_crit,foster,3,critical,inconv,nonprob,priority,spec_prior -pretentious,very_crit,foster,3,critical,inconv,nonprob,not_recom,not_recom -pretentious,very_crit,foster,3,critical,inconv,slightly_prob,recommended,spec_prior 
-pretentious,very_crit,foster,3,critical,inconv,slightly_prob,priority,spec_prior -pretentious,very_crit,foster,3,critical,inconv,slightly_prob,not_recom,not_recom -pretentious,very_crit,foster,3,critical,inconv,problematic,recommended,spec_prior -pretentious,very_crit,foster,3,critical,inconv,problematic,priority,spec_prior -pretentious,very_crit,foster,3,critical,inconv,problematic,not_recom,not_recom -pretentious,very_crit,foster,more,convenient,convenient,nonprob,recommended,priority -pretentious,very_crit,foster,more,convenient,convenient,nonprob,priority,spec_prior -pretentious,very_crit,foster,more,convenient,convenient,nonprob,not_recom,not_recom -pretentious,very_crit,foster,more,convenient,convenient,slightly_prob,recommended,priority -pretentious,very_crit,foster,more,convenient,convenient,slightly_prob,priority,spec_prior -pretentious,very_crit,foster,more,convenient,convenient,slightly_prob,not_recom,not_recom -pretentious,very_crit,foster,more,convenient,convenient,problematic,recommended,spec_prior -pretentious,very_crit,foster,more,convenient,convenient,problematic,priority,spec_prior -pretentious,very_crit,foster,more,convenient,convenient,problematic,not_recom,not_recom -pretentious,very_crit,foster,more,convenient,inconv,nonprob,recommended,spec_prior -pretentious,very_crit,foster,more,convenient,inconv,nonprob,priority,spec_prior -pretentious,very_crit,foster,more,convenient,inconv,nonprob,not_recom,not_recom -pretentious,very_crit,foster,more,convenient,inconv,slightly_prob,recommended,spec_prior -pretentious,very_crit,foster,more,convenient,inconv,slightly_prob,priority,spec_prior -pretentious,very_crit,foster,more,convenient,inconv,slightly_prob,not_recom,not_recom -pretentious,very_crit,foster,more,convenient,inconv,problematic,recommended,spec_prior -pretentious,very_crit,foster,more,convenient,inconv,problematic,priority,spec_prior -pretentious,very_crit,foster,more,convenient,inconv,problematic,not_recom,not_recom 
-pretentious,very_crit,foster,more,less_conv,convenient,nonprob,recommended,spec_prior -pretentious,very_crit,foster,more,less_conv,convenient,nonprob,priority,spec_prior -pretentious,very_crit,foster,more,less_conv,convenient,nonprob,not_recom,not_recom -pretentious,very_crit,foster,more,less_conv,convenient,slightly_prob,recommended,spec_prior -pretentious,very_crit,foster,more,less_conv,convenient,slightly_prob,priority,spec_prior -pretentious,very_crit,foster,more,less_conv,convenient,slightly_prob,not_recom,not_recom -pretentious,very_crit,foster,more,less_conv,convenient,problematic,recommended,spec_prior -pretentious,very_crit,foster,more,less_conv,convenient,problematic,priority,spec_prior -pretentious,very_crit,foster,more,less_conv,convenient,problematic,not_recom,not_recom -pretentious,very_crit,foster,more,less_conv,inconv,nonprob,recommended,spec_prior -pretentious,very_crit,foster,more,less_conv,inconv,nonprob,priority,spec_prior -pretentious,very_crit,foster,more,less_conv,inconv,nonprob,not_recom,not_recom -pretentious,very_crit,foster,more,less_conv,inconv,slightly_prob,recommended,spec_prior -pretentious,very_crit,foster,more,less_conv,inconv,slightly_prob,priority,spec_prior -pretentious,very_crit,foster,more,less_conv,inconv,slightly_prob,not_recom,not_recom -pretentious,very_crit,foster,more,less_conv,inconv,problematic,recommended,spec_prior -pretentious,very_crit,foster,more,less_conv,inconv,problematic,priority,spec_prior -pretentious,very_crit,foster,more,less_conv,inconv,problematic,not_recom,not_recom -pretentious,very_crit,foster,more,critical,convenient,nonprob,recommended,spec_prior -pretentious,very_crit,foster,more,critical,convenient,nonprob,priority,spec_prior -pretentious,very_crit,foster,more,critical,convenient,nonprob,not_recom,not_recom -pretentious,very_crit,foster,more,critical,convenient,slightly_prob,recommended,spec_prior -pretentious,very_crit,foster,more,critical,convenient,slightly_prob,priority,spec_prior 
-pretentious,very_crit,foster,more,critical,convenient,slightly_prob,not_recom,not_recom -pretentious,very_crit,foster,more,critical,convenient,problematic,recommended,spec_prior -pretentious,very_crit,foster,more,critical,convenient,problematic,priority,spec_prior -pretentious,very_crit,foster,more,critical,convenient,problematic,not_recom,not_recom -pretentious,very_crit,foster,more,critical,inconv,nonprob,recommended,spec_prior -pretentious,very_crit,foster,more,critical,inconv,nonprob,priority,spec_prior -pretentious,very_crit,foster,more,critical,inconv,nonprob,not_recom,not_recom -pretentious,very_crit,foster,more,critical,inconv,slightly_prob,recommended,spec_prior -pretentious,very_crit,foster,more,critical,inconv,slightly_prob,priority,spec_prior -pretentious,very_crit,foster,more,critical,inconv,slightly_prob,not_recom,not_recom -pretentious,very_crit,foster,more,critical,inconv,problematic,recommended,spec_prior -pretentious,very_crit,foster,more,critical,inconv,problematic,priority,spec_prior -pretentious,very_crit,foster,more,critical,inconv,problematic,not_recom,not_recom -great_pret,proper,complete,1,convenient,convenient,nonprob,recommended,priority -great_pret,proper,complete,1,convenient,convenient,nonprob,priority,priority -great_pret,proper,complete,1,convenient,convenient,nonprob,not_recom,not_recom -great_pret,proper,complete,1,convenient,convenient,slightly_prob,recommended,priority -great_pret,proper,complete,1,convenient,convenient,slightly_prob,priority,priority -great_pret,proper,complete,1,convenient,convenient,slightly_prob,not_recom,not_recom -great_pret,proper,complete,1,convenient,convenient,problematic,recommended,priority -great_pret,proper,complete,1,convenient,convenient,problematic,priority,priority -great_pret,proper,complete,1,convenient,convenient,problematic,not_recom,not_recom -great_pret,proper,complete,1,convenient,inconv,nonprob,recommended,priority 
-great_pret,proper,complete,1,convenient,inconv,nonprob,priority,priority -great_pret,proper,complete,1,convenient,inconv,nonprob,not_recom,not_recom -great_pret,proper,complete,1,convenient,inconv,slightly_prob,recommended,priority -great_pret,proper,complete,1,convenient,inconv,slightly_prob,priority,priority -great_pret,proper,complete,1,convenient,inconv,slightly_prob,not_recom,not_recom -great_pret,proper,complete,1,convenient,inconv,problematic,recommended,priority -great_pret,proper,complete,1,convenient,inconv,problematic,priority,priority -great_pret,proper,complete,1,convenient,inconv,problematic,not_recom,not_recom -great_pret,proper,complete,1,less_conv,convenient,nonprob,recommended,priority -great_pret,proper,complete,1,less_conv,convenient,nonprob,priority,priority -great_pret,proper,complete,1,less_conv,convenient,nonprob,not_recom,not_recom -great_pret,proper,complete,1,less_conv,convenient,slightly_prob,recommended,priority -great_pret,proper,complete,1,less_conv,convenient,slightly_prob,priority,priority -great_pret,proper,complete,1,less_conv,convenient,slightly_prob,not_recom,not_recom -great_pret,proper,complete,1,less_conv,convenient,problematic,recommended,priority -great_pret,proper,complete,1,less_conv,convenient,problematic,priority,priority -great_pret,proper,complete,1,less_conv,convenient,problematic,not_recom,not_recom -great_pret,proper,complete,1,less_conv,inconv,nonprob,recommended,priority -great_pret,proper,complete,1,less_conv,inconv,nonprob,priority,priority -great_pret,proper,complete,1,less_conv,inconv,nonprob,not_recom,not_recom -great_pret,proper,complete,1,less_conv,inconv,slightly_prob,recommended,priority -great_pret,proper,complete,1,less_conv,inconv,slightly_prob,priority,priority -great_pret,proper,complete,1,less_conv,inconv,slightly_prob,not_recom,not_recom -great_pret,proper,complete,1,less_conv,inconv,problematic,recommended,priority -great_pret,proper,complete,1,less_conv,inconv,problematic,priority,priority 
-great_pret,proper,complete,1,less_conv,inconv,problematic,not_recom,not_recom -great_pret,proper,complete,1,critical,convenient,nonprob,recommended,priority -great_pret,proper,complete,1,critical,convenient,nonprob,priority,priority -great_pret,proper,complete,1,critical,convenient,nonprob,not_recom,not_recom -great_pret,proper,complete,1,critical,convenient,slightly_prob,recommended,priority -great_pret,proper,complete,1,critical,convenient,slightly_prob,priority,priority -great_pret,proper,complete,1,critical,convenient,slightly_prob,not_recom,not_recom -great_pret,proper,complete,1,critical,convenient,problematic,recommended,priority -great_pret,proper,complete,1,critical,convenient,problematic,priority,priority -great_pret,proper,complete,1,critical,convenient,problematic,not_recom,not_recom -great_pret,proper,complete,1,critical,inconv,nonprob,recommended,priority -great_pret,proper,complete,1,critical,inconv,nonprob,priority,priority -great_pret,proper,complete,1,critical,inconv,nonprob,not_recom,not_recom -great_pret,proper,complete,1,critical,inconv,slightly_prob,recommended,priority -great_pret,proper,complete,1,critical,inconv,slightly_prob,priority,priority -great_pret,proper,complete,1,critical,inconv,slightly_prob,not_recom,not_recom -great_pret,proper,complete,1,critical,inconv,problematic,recommended,priority -great_pret,proper,complete,1,critical,inconv,problematic,priority,priority -great_pret,proper,complete,1,critical,inconv,problematic,not_recom,not_recom -great_pret,proper,complete,2,convenient,convenient,nonprob,recommended,priority -great_pret,proper,complete,2,convenient,convenient,nonprob,priority,priority -great_pret,proper,complete,2,convenient,convenient,nonprob,not_recom,not_recom -great_pret,proper,complete,2,convenient,convenient,slightly_prob,recommended,priority -great_pret,proper,complete,2,convenient,convenient,slightly_prob,priority,priority -great_pret,proper,complete,2,convenient,convenient,slightly_prob,not_recom,not_recom 
-great_pret,proper,complete,2,convenient,convenient,problematic,recommended,priority -great_pret,proper,complete,2,convenient,convenient,problematic,priority,priority -great_pret,proper,complete,2,convenient,convenient,problematic,not_recom,not_recom -great_pret,proper,complete,2,convenient,inconv,nonprob,recommended,priority -great_pret,proper,complete,2,convenient,inconv,nonprob,priority,priority -great_pret,proper,complete,2,convenient,inconv,nonprob,not_recom,not_recom -great_pret,proper,complete,2,convenient,inconv,slightly_prob,recommended,priority -great_pret,proper,complete,2,convenient,inconv,slightly_prob,priority,priority -great_pret,proper,complete,2,convenient,inconv,slightly_prob,not_recom,not_recom -great_pret,proper,complete,2,convenient,inconv,problematic,recommended,priority -great_pret,proper,complete,2,convenient,inconv,problematic,priority,priority -great_pret,proper,complete,2,convenient,inconv,problematic,not_recom,not_recom -great_pret,proper,complete,2,less_conv,convenient,nonprob,recommended,priority -great_pret,proper,complete,2,less_conv,convenient,nonprob,priority,priority -great_pret,proper,complete,2,less_conv,convenient,nonprob,not_recom,not_recom -great_pret,proper,complete,2,less_conv,convenient,slightly_prob,recommended,priority -great_pret,proper,complete,2,less_conv,convenient,slightly_prob,priority,priority -great_pret,proper,complete,2,less_conv,convenient,slightly_prob,not_recom,not_recom -great_pret,proper,complete,2,less_conv,convenient,problematic,recommended,priority -great_pret,proper,complete,2,less_conv,convenient,problematic,priority,priority -great_pret,proper,complete,2,less_conv,convenient,problematic,not_recom,not_recom -great_pret,proper,complete,2,less_conv,inconv,nonprob,recommended,priority -great_pret,proper,complete,2,less_conv,inconv,nonprob,priority,priority -great_pret,proper,complete,2,less_conv,inconv,nonprob,not_recom,not_recom 
-great_pret,proper,complete,2,less_conv,inconv,slightly_prob,recommended,priority -great_pret,proper,complete,2,less_conv,inconv,slightly_prob,priority,priority -great_pret,proper,complete,2,less_conv,inconv,slightly_prob,not_recom,not_recom -great_pret,proper,complete,2,less_conv,inconv,problematic,recommended,priority -great_pret,proper,complete,2,less_conv,inconv,problematic,priority,priority -great_pret,proper,complete,2,less_conv,inconv,problematic,not_recom,not_recom -great_pret,proper,complete,2,critical,convenient,nonprob,recommended,priority -great_pret,proper,complete,2,critical,convenient,nonprob,priority,spec_prior -great_pret,proper,complete,2,critical,convenient,nonprob,not_recom,not_recom -great_pret,proper,complete,2,critical,convenient,slightly_prob,recommended,priority -great_pret,proper,complete,2,critical,convenient,slightly_prob,priority,spec_prior -great_pret,proper,complete,2,critical,convenient,slightly_prob,not_recom,not_recom -great_pret,proper,complete,2,critical,convenient,problematic,recommended,spec_prior -great_pret,proper,complete,2,critical,convenient,problematic,priority,spec_prior -great_pret,proper,complete,2,critical,convenient,problematic,not_recom,not_recom -great_pret,proper,complete,2,critical,inconv,nonprob,recommended,priority -great_pret,proper,complete,2,critical,inconv,nonprob,priority,spec_prior -great_pret,proper,complete,2,critical,inconv,nonprob,not_recom,not_recom -great_pret,proper,complete,2,critical,inconv,slightly_prob,recommended,priority -great_pret,proper,complete,2,critical,inconv,slightly_prob,priority,spec_prior -great_pret,proper,complete,2,critical,inconv,slightly_prob,not_recom,not_recom -great_pret,proper,complete,2,critical,inconv,problematic,recommended,spec_prior -great_pret,proper,complete,2,critical,inconv,problematic,priority,spec_prior -great_pret,proper,complete,2,critical,inconv,problematic,not_recom,not_recom -great_pret,proper,complete,3,convenient,convenient,nonprob,recommended,priority 
-great_pret,proper,complete,3,convenient,convenient,nonprob,priority,priority -great_pret,proper,complete,3,convenient,convenient,nonprob,not_recom,not_recom -great_pret,proper,complete,3,convenient,convenient,slightly_prob,recommended,priority -great_pret,proper,complete,3,convenient,convenient,slightly_prob,priority,priority -great_pret,proper,complete,3,convenient,convenient,slightly_prob,not_recom,not_recom -great_pret,proper,complete,3,convenient,convenient,problematic,recommended,priority -great_pret,proper,complete,3,convenient,convenient,problematic,priority,priority -great_pret,proper,complete,3,convenient,convenient,problematic,not_recom,not_recom -great_pret,proper,complete,3,convenient,inconv,nonprob,recommended,priority -great_pret,proper,complete,3,convenient,inconv,nonprob,priority,spec_prior -great_pret,proper,complete,3,convenient,inconv,nonprob,not_recom,not_recom -great_pret,proper,complete,3,convenient,inconv,slightly_prob,recommended,priority -great_pret,proper,complete,3,convenient,inconv,slightly_prob,priority,spec_prior -great_pret,proper,complete,3,convenient,inconv,slightly_prob,not_recom,not_recom -great_pret,proper,complete,3,convenient,inconv,problematic,recommended,spec_prior -great_pret,proper,complete,3,convenient,inconv,problematic,priority,spec_prior -great_pret,proper,complete,3,convenient,inconv,problematic,not_recom,not_recom -great_pret,proper,complete,3,less_conv,convenient,nonprob,recommended,priority -great_pret,proper,complete,3,less_conv,convenient,nonprob,priority,spec_prior -great_pret,proper,complete,3,less_conv,convenient,nonprob,not_recom,not_recom -great_pret,proper,complete,3,less_conv,convenient,slightly_prob,recommended,priority -great_pret,proper,complete,3,less_conv,convenient,slightly_prob,priority,spec_prior -great_pret,proper,complete,3,less_conv,convenient,slightly_prob,not_recom,not_recom -great_pret,proper,complete,3,less_conv,convenient,problematic,recommended,spec_prior 
-great_pret,proper,complete,3,less_conv,convenient,problematic,priority,spec_prior -great_pret,proper,complete,3,less_conv,convenient,problematic,not_recom,not_recom -great_pret,proper,complete,3,less_conv,inconv,nonprob,recommended,priority -great_pret,proper,complete,3,less_conv,inconv,nonprob,priority,spec_prior -great_pret,proper,complete,3,less_conv,inconv,nonprob,not_recom,not_recom -great_pret,proper,complete,3,less_conv,inconv,slightly_prob,recommended,priority -great_pret,proper,complete,3,less_conv,inconv,slightly_prob,priority,spec_prior -great_pret,proper,complete,3,less_conv,inconv,slightly_prob,not_recom,not_recom -great_pret,proper,complete,3,less_conv,inconv,problematic,recommended,spec_prior -great_pret,proper,complete,3,less_conv,inconv,problematic,priority,spec_prior -great_pret,proper,complete,3,less_conv,inconv,problematic,not_recom,not_recom -great_pret,proper,complete,3,critical,convenient,nonprob,recommended,priority -great_pret,proper,complete,3,critical,convenient,nonprob,priority,spec_prior -great_pret,proper,complete,3,critical,convenient,nonprob,not_recom,not_recom -great_pret,proper,complete,3,critical,convenient,slightly_prob,recommended,priority -great_pret,proper,complete,3,critical,convenient,slightly_prob,priority,spec_prior -great_pret,proper,complete,3,critical,convenient,slightly_prob,not_recom,not_recom -great_pret,proper,complete,3,critical,convenient,problematic,recommended,spec_prior -great_pret,proper,complete,3,critical,convenient,problematic,priority,spec_prior -great_pret,proper,complete,3,critical,convenient,problematic,not_recom,not_recom -great_pret,proper,complete,3,critical,inconv,nonprob,recommended,priority -great_pret,proper,complete,3,critical,inconv,nonprob,priority,spec_prior -great_pret,proper,complete,3,critical,inconv,nonprob,not_recom,not_recom -great_pret,proper,complete,3,critical,inconv,slightly_prob,recommended,priority -great_pret,proper,complete,3,critical,inconv,slightly_prob,priority,spec_prior 
-great_pret,proper,complete,3,critical,inconv,slightly_prob,not_recom,not_recom -great_pret,proper,complete,3,critical,inconv,problematic,recommended,spec_prior -great_pret,proper,complete,3,critical,inconv,problematic,priority,spec_prior -great_pret,proper,complete,3,critical,inconv,problematic,not_recom,not_recom -great_pret,proper,complete,more,convenient,convenient,nonprob,recommended,priority -great_pret,proper,complete,more,convenient,convenient,nonprob,priority,priority -great_pret,proper,complete,more,convenient,convenient,nonprob,not_recom,not_recom -great_pret,proper,complete,more,convenient,convenient,slightly_prob,recommended,priority -great_pret,proper,complete,more,convenient,convenient,slightly_prob,priority,priority -great_pret,proper,complete,more,convenient,convenient,slightly_prob,not_recom,not_recom -great_pret,proper,complete,more,convenient,convenient,problematic,recommended,priority -great_pret,proper,complete,more,convenient,convenient,problematic,priority,priority -great_pret,proper,complete,more,convenient,convenient,problematic,not_recom,not_recom -great_pret,proper,complete,more,convenient,inconv,nonprob,recommended,priority -great_pret,proper,complete,more,convenient,inconv,nonprob,priority,spec_prior -great_pret,proper,complete,more,convenient,inconv,nonprob,not_recom,not_recom -great_pret,proper,complete,more,convenient,inconv,slightly_prob,recommended,priority -great_pret,proper,complete,more,convenient,inconv,slightly_prob,priority,spec_prior -great_pret,proper,complete,more,convenient,inconv,slightly_prob,not_recom,not_recom -great_pret,proper,complete,more,convenient,inconv,problematic,recommended,spec_prior -great_pret,proper,complete,more,convenient,inconv,problematic,priority,spec_prior -great_pret,proper,complete,more,convenient,inconv,problematic,not_recom,not_recom -great_pret,proper,complete,more,less_conv,convenient,nonprob,recommended,priority 
-great_pret,proper,complete,more,less_conv,convenient,nonprob,priority,spec_prior -great_pret,proper,complete,more,less_conv,convenient,nonprob,not_recom,not_recom -great_pret,proper,complete,more,less_conv,convenient,slightly_prob,recommended,priority -great_pret,proper,complete,more,less_conv,convenient,slightly_prob,priority,spec_prior -great_pret,proper,complete,more,less_conv,convenient,slightly_prob,not_recom,not_recom -great_pret,proper,complete,more,less_conv,convenient,problematic,recommended,spec_prior -great_pret,proper,complete,more,less_conv,convenient,problematic,priority,spec_prior -great_pret,proper,complete,more,less_conv,convenient,problematic,not_recom,not_recom -great_pret,proper,complete,more,less_conv,inconv,nonprob,recommended,priority -great_pret,proper,complete,more,less_conv,inconv,nonprob,priority,spec_prior -great_pret,proper,complete,more,less_conv,inconv,nonprob,not_recom,not_recom -great_pret,proper,complete,more,less_conv,inconv,slightly_prob,recommended,priority -great_pret,proper,complete,more,less_conv,inconv,slightly_prob,priority,spec_prior -great_pret,proper,complete,more,less_conv,inconv,slightly_prob,not_recom,not_recom -great_pret,proper,complete,more,less_conv,inconv,problematic,recommended,spec_prior -great_pret,proper,complete,more,less_conv,inconv,problematic,priority,spec_prior -great_pret,proper,complete,more,less_conv,inconv,problematic,not_recom,not_recom -great_pret,proper,complete,more,critical,convenient,nonprob,recommended,priority -great_pret,proper,complete,more,critical,convenient,nonprob,priority,spec_prior -great_pret,proper,complete,more,critical,convenient,nonprob,not_recom,not_recom -great_pret,proper,complete,more,critical,convenient,slightly_prob,recommended,priority -great_pret,proper,complete,more,critical,convenient,slightly_prob,priority,spec_prior -great_pret,proper,complete,more,critical,convenient,slightly_prob,not_recom,not_recom 
-great_pret,proper,complete,more,critical,convenient,problematic,recommended,spec_prior -great_pret,proper,complete,more,critical,convenient,problematic,priority,spec_prior -great_pret,proper,complete,more,critical,convenient,problematic,not_recom,not_recom -great_pret,proper,complete,more,critical,inconv,nonprob,recommended,priority -great_pret,proper,complete,more,critical,inconv,nonprob,priority,spec_prior -great_pret,proper,complete,more,critical,inconv,nonprob,not_recom,not_recom -great_pret,proper,complete,more,critical,inconv,slightly_prob,recommended,priority -great_pret,proper,complete,more,critical,inconv,slightly_prob,priority,spec_prior -great_pret,proper,complete,more,critical,inconv,slightly_prob,not_recom,not_recom -great_pret,proper,complete,more,critical,inconv,problematic,recommended,spec_prior -great_pret,proper,complete,more,critical,inconv,problematic,priority,spec_prior -great_pret,proper,complete,more,critical,inconv,problematic,not_recom,not_recom -great_pret,proper,completed,1,convenient,convenient,nonprob,recommended,priority -great_pret,proper,completed,1,convenient,convenient,nonprob,priority,priority -great_pret,proper,completed,1,convenient,convenient,nonprob,not_recom,not_recom -great_pret,proper,completed,1,convenient,convenient,slightly_prob,recommended,priority -great_pret,proper,completed,1,convenient,convenient,slightly_prob,priority,priority -great_pret,proper,completed,1,convenient,convenient,slightly_prob,not_recom,not_recom -great_pret,proper,completed,1,convenient,convenient,problematic,recommended,priority -great_pret,proper,completed,1,convenient,convenient,problematic,priority,priority -great_pret,proper,completed,1,convenient,convenient,problematic,not_recom,not_recom -great_pret,proper,completed,1,convenient,inconv,nonprob,recommended,priority -great_pret,proper,completed,1,convenient,inconv,nonprob,priority,priority -great_pret,proper,completed,1,convenient,inconv,nonprob,not_recom,not_recom 
-great_pret,proper,completed,1,convenient,inconv,slightly_prob,recommended,priority -great_pret,proper,completed,1,convenient,inconv,slightly_prob,priority,priority -great_pret,proper,completed,1,convenient,inconv,slightly_prob,not_recom,not_recom -great_pret,proper,completed,1,convenient,inconv,problematic,recommended,priority -great_pret,proper,completed,1,convenient,inconv,problematic,priority,priority -great_pret,proper,completed,1,convenient,inconv,problematic,not_recom,not_recom -great_pret,proper,completed,1,less_conv,convenient,nonprob,recommended,priority -great_pret,proper,completed,1,less_conv,convenient,nonprob,priority,priority -great_pret,proper,completed,1,less_conv,convenient,nonprob,not_recom,not_recom -great_pret,proper,completed,1,less_conv,convenient,slightly_prob,recommended,priority -great_pret,proper,completed,1,less_conv,convenient,slightly_prob,priority,priority -great_pret,proper,completed,1,less_conv,convenient,slightly_prob,not_recom,not_recom -great_pret,proper,completed,1,less_conv,convenient,problematic,recommended,priority -great_pret,proper,completed,1,less_conv,convenient,problematic,priority,priority -great_pret,proper,completed,1,less_conv,convenient,problematic,not_recom,not_recom -great_pret,proper,completed,1,less_conv,inconv,nonprob,recommended,priority -great_pret,proper,completed,1,less_conv,inconv,nonprob,priority,priority -great_pret,proper,completed,1,less_conv,inconv,nonprob,not_recom,not_recom -great_pret,proper,completed,1,less_conv,inconv,slightly_prob,recommended,priority -great_pret,proper,completed,1,less_conv,inconv,slightly_prob,priority,priority -great_pret,proper,completed,1,less_conv,inconv,slightly_prob,not_recom,not_recom -great_pret,proper,completed,1,less_conv,inconv,problematic,recommended,priority -great_pret,proper,completed,1,less_conv,inconv,problematic,priority,priority -great_pret,proper,completed,1,less_conv,inconv,problematic,not_recom,not_recom 
-great_pret,proper,completed,1,critical,convenient,nonprob,recommended,priority -great_pret,proper,completed,1,critical,convenient,nonprob,priority,spec_prior -great_pret,proper,completed,1,critical,convenient,nonprob,not_recom,not_recom -great_pret,proper,completed,1,critical,convenient,slightly_prob,recommended,priority -great_pret,proper,completed,1,critical,convenient,slightly_prob,priority,spec_prior -great_pret,proper,completed,1,critical,convenient,slightly_prob,not_recom,not_recom -great_pret,proper,completed,1,critical,convenient,problematic,recommended,spec_prior -great_pret,proper,completed,1,critical,convenient,problematic,priority,spec_prior -great_pret,proper,completed,1,critical,convenient,problematic,not_recom,not_recom -great_pret,proper,completed,1,critical,inconv,nonprob,recommended,priority -great_pret,proper,completed,1,critical,inconv,nonprob,priority,spec_prior -great_pret,proper,completed,1,critical,inconv,nonprob,not_recom,not_recom -great_pret,proper,completed,1,critical,inconv,slightly_prob,recommended,priority -great_pret,proper,completed,1,critical,inconv,slightly_prob,priority,spec_prior -great_pret,proper,completed,1,critical,inconv,slightly_prob,not_recom,not_recom -great_pret,proper,completed,1,critical,inconv,problematic,recommended,spec_prior -great_pret,proper,completed,1,critical,inconv,problematic,priority,spec_prior -great_pret,proper,completed,1,critical,inconv,problematic,not_recom,not_recom -great_pret,proper,completed,2,convenient,convenient,nonprob,recommended,priority -great_pret,proper,completed,2,convenient,convenient,nonprob,priority,priority -great_pret,proper,completed,2,convenient,convenient,nonprob,not_recom,not_recom -great_pret,proper,completed,2,convenient,convenient,slightly_prob,recommended,priority -great_pret,proper,completed,2,convenient,convenient,slightly_prob,priority,priority -great_pret,proper,completed,2,convenient,convenient,slightly_prob,not_recom,not_recom 
-great_pret,proper,completed,2,convenient,convenient,problematic,recommended,priority -great_pret,proper,completed,2,convenient,convenient,problematic,priority,priority -great_pret,proper,completed,2,convenient,convenient,problematic,not_recom,not_recom -great_pret,proper,completed,2,convenient,inconv,nonprob,recommended,priority -great_pret,proper,completed,2,convenient,inconv,nonprob,priority,priority -great_pret,proper,completed,2,convenient,inconv,nonprob,not_recom,not_recom -great_pret,proper,completed,2,convenient,inconv,slightly_prob,recommended,priority -great_pret,proper,completed,2,convenient,inconv,slightly_prob,priority,priority -great_pret,proper,completed,2,convenient,inconv,slightly_prob,not_recom,not_recom -great_pret,proper,completed,2,convenient,inconv,problematic,recommended,priority -great_pret,proper,completed,2,convenient,inconv,problematic,priority,priority -great_pret,proper,completed,2,convenient,inconv,problematic,not_recom,not_recom -great_pret,proper,completed,2,less_conv,convenient,nonprob,recommended,priority -great_pret,proper,completed,2,less_conv,convenient,nonprob,priority,priority -great_pret,proper,completed,2,less_conv,convenient,nonprob,not_recom,not_recom -great_pret,proper,completed,2,less_conv,convenient,slightly_prob,recommended,priority -great_pret,proper,completed,2,less_conv,convenient,slightly_prob,priority,priority -great_pret,proper,completed,2,less_conv,convenient,slightly_prob,not_recom,not_recom -great_pret,proper,completed,2,less_conv,convenient,problematic,recommended,priority -great_pret,proper,completed,2,less_conv,convenient,problematic,priority,priority -great_pret,proper,completed,2,less_conv,convenient,problematic,not_recom,not_recom -great_pret,proper,completed,2,less_conv,inconv,nonprob,recommended,priority -great_pret,proper,completed,2,less_conv,inconv,nonprob,priority,priority -great_pret,proper,completed,2,less_conv,inconv,nonprob,not_recom,not_recom 
-great_pret,proper,completed,2,less_conv,inconv,slightly_prob,recommended,priority -great_pret,proper,completed,2,less_conv,inconv,slightly_prob,priority,priority -great_pret,proper,completed,2,less_conv,inconv,slightly_prob,not_recom,not_recom -great_pret,proper,completed,2,less_conv,inconv,problematic,recommended,priority -great_pret,proper,completed,2,less_conv,inconv,problematic,priority,priority -great_pret,proper,completed,2,less_conv,inconv,problematic,not_recom,not_recom -great_pret,proper,completed,2,critical,convenient,nonprob,recommended,priority -great_pret,proper,completed,2,critical,convenient,nonprob,priority,spec_prior -great_pret,proper,completed,2,critical,convenient,nonprob,not_recom,not_recom -great_pret,proper,completed,2,critical,convenient,slightly_prob,recommended,priority -great_pret,proper,completed,2,critical,convenient,slightly_prob,priority,spec_prior -great_pret,proper,completed,2,critical,convenient,slightly_prob,not_recom,not_recom -great_pret,proper,completed,2,critical,convenient,problematic,recommended,spec_prior -great_pret,proper,completed,2,critical,convenient,problematic,priority,spec_prior -great_pret,proper,completed,2,critical,convenient,problematic,not_recom,not_recom -great_pret,proper,completed,2,critical,inconv,nonprob,recommended,priority -great_pret,proper,completed,2,critical,inconv,nonprob,priority,spec_prior -great_pret,proper,completed,2,critical,inconv,nonprob,not_recom,not_recom -great_pret,proper,completed,2,critical,inconv,slightly_prob,recommended,priority -great_pret,proper,completed,2,critical,inconv,slightly_prob,priority,spec_prior -great_pret,proper,completed,2,critical,inconv,slightly_prob,not_recom,not_recom -great_pret,proper,completed,2,critical,inconv,problematic,recommended,spec_prior -great_pret,proper,completed,2,critical,inconv,problematic,priority,spec_prior -great_pret,proper,completed,2,critical,inconv,problematic,not_recom,not_recom 
-great_pret,proper,completed,3,convenient,convenient,nonprob,recommended,priority -great_pret,proper,completed,3,convenient,convenient,nonprob,priority,priority -great_pret,proper,completed,3,convenient,convenient,nonprob,not_recom,not_recom -great_pret,proper,completed,3,convenient,convenient,slightly_prob,recommended,priority -great_pret,proper,completed,3,convenient,convenient,slightly_prob,priority,priority -great_pret,proper,completed,3,convenient,convenient,slightly_prob,not_recom,not_recom -great_pret,proper,completed,3,convenient,convenient,problematic,recommended,priority -great_pret,proper,completed,3,convenient,convenient,problematic,priority,priority -great_pret,proper,completed,3,convenient,convenient,problematic,not_recom,not_recom -great_pret,proper,completed,3,convenient,inconv,nonprob,recommended,priority -great_pret,proper,completed,3,convenient,inconv,nonprob,priority,spec_prior -great_pret,proper,completed,3,convenient,inconv,nonprob,not_recom,not_recom -great_pret,proper,completed,3,convenient,inconv,slightly_prob,recommended,priority -great_pret,proper,completed,3,convenient,inconv,slightly_prob,priority,spec_prior -great_pret,proper,completed,3,convenient,inconv,slightly_prob,not_recom,not_recom -great_pret,proper,completed,3,convenient,inconv,problematic,recommended,spec_prior -great_pret,proper,completed,3,convenient,inconv,problematic,priority,spec_prior -great_pret,proper,completed,3,convenient,inconv,problematic,not_recom,not_recom -great_pret,proper,completed,3,less_conv,convenient,nonprob,recommended,priority -great_pret,proper,completed,3,less_conv,convenient,nonprob,priority,spec_prior -great_pret,proper,completed,3,less_conv,convenient,nonprob,not_recom,not_recom -great_pret,proper,completed,3,less_conv,convenient,slightly_prob,recommended,priority -great_pret,proper,completed,3,less_conv,convenient,slightly_prob,priority,spec_prior -great_pret,proper,completed,3,less_conv,convenient,slightly_prob,not_recom,not_recom 
-great_pret,proper,completed,3,less_conv,convenient,problematic,recommended,spec_prior -great_pret,proper,completed,3,less_conv,convenient,problematic,priority,spec_prior -great_pret,proper,completed,3,less_conv,convenient,problematic,not_recom,not_recom -great_pret,proper,completed,3,less_conv,inconv,nonprob,recommended,priority -great_pret,proper,completed,3,less_conv,inconv,nonprob,priority,spec_prior -great_pret,proper,completed,3,less_conv,inconv,nonprob,not_recom,not_recom -great_pret,proper,completed,3,less_conv,inconv,slightly_prob,recommended,priority -great_pret,proper,completed,3,less_conv,inconv,slightly_prob,priority,spec_prior -great_pret,proper,completed,3,less_conv,inconv,slightly_prob,not_recom,not_recom -great_pret,proper,completed,3,less_conv,inconv,problematic,recommended,spec_prior -great_pret,proper,completed,3,less_conv,inconv,problematic,priority,spec_prior -great_pret,proper,completed,3,less_conv,inconv,problematic,not_recom,not_recom -great_pret,proper,completed,3,critical,convenient,nonprob,recommended,priority -great_pret,proper,completed,3,critical,convenient,nonprob,priority,spec_prior -great_pret,proper,completed,3,critical,convenient,nonprob,not_recom,not_recom -great_pret,proper,completed,3,critical,convenient,slightly_prob,recommended,priority -great_pret,proper,completed,3,critical,convenient,slightly_prob,priority,spec_prior -great_pret,proper,completed,3,critical,convenient,slightly_prob,not_recom,not_recom -great_pret,proper,completed,3,critical,convenient,problematic,recommended,spec_prior -great_pret,proper,completed,3,critical,convenient,problematic,priority,spec_prior -great_pret,proper,completed,3,critical,convenient,problematic,not_recom,not_recom -great_pret,proper,completed,3,critical,inconv,nonprob,recommended,priority -great_pret,proper,completed,3,critical,inconv,nonprob,priority,spec_prior -great_pret,proper,completed,3,critical,inconv,nonprob,not_recom,not_recom 
-great_pret,proper,completed,3,critical,inconv,slightly_prob,recommended,priority -great_pret,proper,completed,3,critical,inconv,slightly_prob,priority,spec_prior -great_pret,proper,completed,3,critical,inconv,slightly_prob,not_recom,not_recom -great_pret,proper,completed,3,critical,inconv,problematic,recommended,spec_prior -great_pret,proper,completed,3,critical,inconv,problematic,priority,spec_prior -great_pret,proper,completed,3,critical,inconv,problematic,not_recom,not_recom -great_pret,proper,completed,more,convenient,convenient,nonprob,recommended,priority -great_pret,proper,completed,more,convenient,convenient,nonprob,priority,priority -great_pret,proper,completed,more,convenient,convenient,nonprob,not_recom,not_recom -great_pret,proper,completed,more,convenient,convenient,slightly_prob,recommended,priority -great_pret,proper,completed,more,convenient,convenient,slightly_prob,priority,priority -great_pret,proper,completed,more,convenient,convenient,slightly_prob,not_recom,not_recom -great_pret,proper,completed,more,convenient,convenient,problematic,recommended,priority -great_pret,proper,completed,more,convenient,convenient,problematic,priority,priority -great_pret,proper,completed,more,convenient,convenient,problematic,not_recom,not_recom -great_pret,proper,completed,more,convenient,inconv,nonprob,recommended,priority -great_pret,proper,completed,more,convenient,inconv,nonprob,priority,spec_prior -great_pret,proper,completed,more,convenient,inconv,nonprob,not_recom,not_recom -great_pret,proper,completed,more,convenient,inconv,slightly_prob,recommended,priority -great_pret,proper,completed,more,convenient,inconv,slightly_prob,priority,spec_prior -great_pret,proper,completed,more,convenient,inconv,slightly_prob,not_recom,not_recom -great_pret,proper,completed,more,convenient,inconv,problematic,recommended,spec_prior -great_pret,proper,completed,more,convenient,inconv,problematic,priority,spec_prior 
-great_pret,proper,completed,more,convenient,inconv,problematic,not_recom,not_recom -great_pret,proper,completed,more,less_conv,convenient,nonprob,recommended,priority -great_pret,proper,completed,more,less_conv,convenient,nonprob,priority,spec_prior -great_pret,proper,completed,more,less_conv,convenient,nonprob,not_recom,not_recom -great_pret,proper,completed,more,less_conv,convenient,slightly_prob,recommended,priority -great_pret,proper,completed,more,less_conv,convenient,slightly_prob,priority,spec_prior -great_pret,proper,completed,more,less_conv,convenient,slightly_prob,not_recom,not_recom -great_pret,proper,completed,more,less_conv,convenient,problematic,recommended,spec_prior -great_pret,proper,completed,more,less_conv,convenient,problematic,priority,spec_prior -great_pret,proper,completed,more,less_conv,convenient,problematic,not_recom,not_recom -great_pret,proper,completed,more,less_conv,inconv,nonprob,recommended,priority -great_pret,proper,completed,more,less_conv,inconv,nonprob,priority,spec_prior -great_pret,proper,completed,more,less_conv,inconv,nonprob,not_recom,not_recom -great_pret,proper,completed,more,less_conv,inconv,slightly_prob,recommended,priority -great_pret,proper,completed,more,less_conv,inconv,slightly_prob,priority,spec_prior -great_pret,proper,completed,more,less_conv,inconv,slightly_prob,not_recom,not_recom -great_pret,proper,completed,more,less_conv,inconv,problematic,recommended,spec_prior -great_pret,proper,completed,more,less_conv,inconv,problematic,priority,spec_prior -great_pret,proper,completed,more,less_conv,inconv,problematic,not_recom,not_recom -great_pret,proper,completed,more,critical,convenient,nonprob,recommended,priority -great_pret,proper,completed,more,critical,convenient,nonprob,priority,spec_prior -great_pret,proper,completed,more,critical,convenient,nonprob,not_recom,not_recom -great_pret,proper,completed,more,critical,convenient,slightly_prob,recommended,priority 
-great_pret,proper,completed,more,critical,convenient,slightly_prob,priority,spec_prior -great_pret,proper,completed,more,critical,convenient,slightly_prob,not_recom,not_recom -great_pret,proper,completed,more,critical,convenient,problematic,recommended,spec_prior -great_pret,proper,completed,more,critical,convenient,problematic,priority,spec_prior -great_pret,proper,completed,more,critical,convenient,problematic,not_recom,not_recom -great_pret,proper,completed,more,critical,inconv,nonprob,recommended,priority -great_pret,proper,completed,more,critical,inconv,nonprob,priority,spec_prior -great_pret,proper,completed,more,critical,inconv,nonprob,not_recom,not_recom -great_pret,proper,completed,more,critical,inconv,slightly_prob,recommended,priority -great_pret,proper,completed,more,critical,inconv,slightly_prob,priority,spec_prior -great_pret,proper,completed,more,critical,inconv,slightly_prob,not_recom,not_recom -great_pret,proper,completed,more,critical,inconv,problematic,recommended,spec_prior -great_pret,proper,completed,more,critical,inconv,problematic,priority,spec_prior -great_pret,proper,completed,more,critical,inconv,problematic,not_recom,not_recom -great_pret,proper,incomplete,1,convenient,convenient,nonprob,recommended,priority -great_pret,proper,incomplete,1,convenient,convenient,nonprob,priority,priority -great_pret,proper,incomplete,1,convenient,convenient,nonprob,not_recom,not_recom -great_pret,proper,incomplete,1,convenient,convenient,slightly_prob,recommended,priority -great_pret,proper,incomplete,1,convenient,convenient,slightly_prob,priority,priority -great_pret,proper,incomplete,1,convenient,convenient,slightly_prob,not_recom,not_recom -great_pret,proper,incomplete,1,convenient,convenient,problematic,recommended,priority -great_pret,proper,incomplete,1,convenient,convenient,problematic,priority,priority -great_pret,proper,incomplete,1,convenient,convenient,problematic,not_recom,not_recom 
-great_pret,proper,incomplete,1,convenient,inconv,nonprob,recommended,priority -great_pret,proper,incomplete,1,convenient,inconv,nonprob,priority,priority -great_pret,proper,incomplete,1,convenient,inconv,nonprob,not_recom,not_recom -great_pret,proper,incomplete,1,convenient,inconv,slightly_prob,recommended,priority -great_pret,proper,incomplete,1,convenient,inconv,slightly_prob,priority,priority -great_pret,proper,incomplete,1,convenient,inconv,slightly_prob,not_recom,not_recom -great_pret,proper,incomplete,1,convenient,inconv,problematic,recommended,priority -great_pret,proper,incomplete,1,convenient,inconv,problematic,priority,priority -great_pret,proper,incomplete,1,convenient,inconv,problematic,not_recom,not_recom -great_pret,proper,incomplete,1,less_conv,convenient,nonprob,recommended,priority -great_pret,proper,incomplete,1,less_conv,convenient,nonprob,priority,priority -great_pret,proper,incomplete,1,less_conv,convenient,nonprob,not_recom,not_recom -great_pret,proper,incomplete,1,less_conv,convenient,slightly_prob,recommended,priority -great_pret,proper,incomplete,1,less_conv,convenient,slightly_prob,priority,priority -great_pret,proper,incomplete,1,less_conv,convenient,slightly_prob,not_recom,not_recom -great_pret,proper,incomplete,1,less_conv,convenient,problematic,recommended,priority -great_pret,proper,incomplete,1,less_conv,convenient,problematic,priority,priority -great_pret,proper,incomplete,1,less_conv,convenient,problematic,not_recom,not_recom -great_pret,proper,incomplete,1,less_conv,inconv,nonprob,recommended,priority -great_pret,proper,incomplete,1,less_conv,inconv,nonprob,priority,priority -great_pret,proper,incomplete,1,less_conv,inconv,nonprob,not_recom,not_recom -great_pret,proper,incomplete,1,less_conv,inconv,slightly_prob,recommended,priority -great_pret,proper,incomplete,1,less_conv,inconv,slightly_prob,priority,priority -great_pret,proper,incomplete,1,less_conv,inconv,slightly_prob,not_recom,not_recom 
-great_pret,proper,incomplete,1,less_conv,inconv,problematic,recommended,priority -great_pret,proper,incomplete,1,less_conv,inconv,problematic,priority,priority -great_pret,proper,incomplete,1,less_conv,inconv,problematic,not_recom,not_recom -great_pret,proper,incomplete,1,critical,convenient,nonprob,recommended,priority -great_pret,proper,incomplete,1,critical,convenient,nonprob,priority,spec_prior -great_pret,proper,incomplete,1,critical,convenient,nonprob,not_recom,not_recom -great_pret,proper,incomplete,1,critical,convenient,slightly_prob,recommended,priority -great_pret,proper,incomplete,1,critical,convenient,slightly_prob,priority,spec_prior -great_pret,proper,incomplete,1,critical,convenient,slightly_prob,not_recom,not_recom -great_pret,proper,incomplete,1,critical,convenient,problematic,recommended,spec_prior -great_pret,proper,incomplete,1,critical,convenient,problematic,priority,spec_prior -great_pret,proper,incomplete,1,critical,convenient,problematic,not_recom,not_recom -great_pret,proper,incomplete,1,critical,inconv,nonprob,recommended,priority -great_pret,proper,incomplete,1,critical,inconv,nonprob,priority,spec_prior -great_pret,proper,incomplete,1,critical,inconv,nonprob,not_recom,not_recom -great_pret,proper,incomplete,1,critical,inconv,slightly_prob,recommended,priority -great_pret,proper,incomplete,1,critical,inconv,slightly_prob,priority,spec_prior -great_pret,proper,incomplete,1,critical,inconv,slightly_prob,not_recom,not_recom -great_pret,proper,incomplete,1,critical,inconv,problematic,recommended,spec_prior -great_pret,proper,incomplete,1,critical,inconv,problematic,priority,spec_prior -great_pret,proper,incomplete,1,critical,inconv,problematic,not_recom,not_recom -great_pret,proper,incomplete,2,convenient,convenient,nonprob,recommended,priority -great_pret,proper,incomplete,2,convenient,convenient,nonprob,priority,priority -great_pret,proper,incomplete,2,convenient,convenient,nonprob,not_recom,not_recom 
-great_pret,proper,incomplete,2,convenient,convenient,slightly_prob,recommended,priority -great_pret,proper,incomplete,2,convenient,convenient,slightly_prob,priority,priority -great_pret,proper,incomplete,2,convenient,convenient,slightly_prob,not_recom,not_recom -great_pret,proper,incomplete,2,convenient,convenient,problematic,recommended,priority -great_pret,proper,incomplete,2,convenient,convenient,problematic,priority,priority -great_pret,proper,incomplete,2,convenient,convenient,problematic,not_recom,not_recom -great_pret,proper,incomplete,2,convenient,inconv,nonprob,recommended,priority -great_pret,proper,incomplete,2,convenient,inconv,nonprob,priority,spec_prior -great_pret,proper,incomplete,2,convenient,inconv,nonprob,not_recom,not_recom -great_pret,proper,incomplete,2,convenient,inconv,slightly_prob,recommended,priority -great_pret,proper,incomplete,2,convenient,inconv,slightly_prob,priority,spec_prior -great_pret,proper,incomplete,2,convenient,inconv,slightly_prob,not_recom,not_recom -great_pret,proper,incomplete,2,convenient,inconv,problematic,recommended,spec_prior -great_pret,proper,incomplete,2,convenient,inconv,problematic,priority,spec_prior -great_pret,proper,incomplete,2,convenient,inconv,problematic,not_recom,not_recom -great_pret,proper,incomplete,2,less_conv,convenient,nonprob,recommended,priority -great_pret,proper,incomplete,2,less_conv,convenient,nonprob,priority,spec_prior -great_pret,proper,incomplete,2,less_conv,convenient,nonprob,not_recom,not_recom -great_pret,proper,incomplete,2,less_conv,convenient,slightly_prob,recommended,priority -great_pret,proper,incomplete,2,less_conv,convenient,slightly_prob,priority,spec_prior -great_pret,proper,incomplete,2,less_conv,convenient,slightly_prob,not_recom,not_recom -great_pret,proper,incomplete,2,less_conv,convenient,problematic,recommended,spec_prior -great_pret,proper,incomplete,2,less_conv,convenient,problematic,priority,spec_prior 
-great_pret,proper,incomplete,2,less_conv,convenient,problematic,not_recom,not_recom -great_pret,proper,incomplete,2,less_conv,inconv,nonprob,recommended,priority -great_pret,proper,incomplete,2,less_conv,inconv,nonprob,priority,spec_prior -great_pret,proper,incomplete,2,less_conv,inconv,nonprob,not_recom,not_recom -great_pret,proper,incomplete,2,less_conv,inconv,slightly_prob,recommended,priority -great_pret,proper,incomplete,2,less_conv,inconv,slightly_prob,priority,spec_prior -great_pret,proper,incomplete,2,less_conv,inconv,slightly_prob,not_recom,not_recom -great_pret,proper,incomplete,2,less_conv,inconv,problematic,recommended,spec_prior -great_pret,proper,incomplete,2,less_conv,inconv,problematic,priority,spec_prior -great_pret,proper,incomplete,2,less_conv,inconv,problematic,not_recom,not_recom -great_pret,proper,incomplete,2,critical,convenient,nonprob,recommended,priority -great_pret,proper,incomplete,2,critical,convenient,nonprob,priority,spec_prior -great_pret,proper,incomplete,2,critical,convenient,nonprob,not_recom,not_recom -great_pret,proper,incomplete,2,critical,convenient,slightly_prob,recommended,priority -great_pret,proper,incomplete,2,critical,convenient,slightly_prob,priority,spec_prior -great_pret,proper,incomplete,2,critical,convenient,slightly_prob,not_recom,not_recom -great_pret,proper,incomplete,2,critical,convenient,problematic,recommended,spec_prior -great_pret,proper,incomplete,2,critical,convenient,problematic,priority,spec_prior -great_pret,proper,incomplete,2,critical,convenient,problematic,not_recom,not_recom -great_pret,proper,incomplete,2,critical,inconv,nonprob,recommended,priority -great_pret,proper,incomplete,2,critical,inconv,nonprob,priority,spec_prior -great_pret,proper,incomplete,2,critical,inconv,nonprob,not_recom,not_recom -great_pret,proper,incomplete,2,critical,inconv,slightly_prob,recommended,priority -great_pret,proper,incomplete,2,critical,inconv,slightly_prob,priority,spec_prior 
-great_pret,proper,incomplete,2,critical,inconv,slightly_prob,not_recom,not_recom -great_pret,proper,incomplete,2,critical,inconv,problematic,recommended,spec_prior -great_pret,proper,incomplete,2,critical,inconv,problematic,priority,spec_prior -great_pret,proper,incomplete,2,critical,inconv,problematic,not_recom,not_recom -great_pret,proper,incomplete,3,convenient,convenient,nonprob,recommended,priority -great_pret,proper,incomplete,3,convenient,convenient,nonprob,priority,priority -great_pret,proper,incomplete,3,convenient,convenient,nonprob,not_recom,not_recom -great_pret,proper,incomplete,3,convenient,convenient,slightly_prob,recommended,priority -great_pret,proper,incomplete,3,convenient,convenient,slightly_prob,priority,priority -great_pret,proper,incomplete,3,convenient,convenient,slightly_prob,not_recom,not_recom -great_pret,proper,incomplete,3,convenient,convenient,problematic,recommended,priority -great_pret,proper,incomplete,3,convenient,convenient,problematic,priority,priority -great_pret,proper,incomplete,3,convenient,convenient,problematic,not_recom,not_recom -great_pret,proper,incomplete,3,convenient,inconv,nonprob,recommended,priority -great_pret,proper,incomplete,3,convenient,inconv,nonprob,priority,spec_prior -great_pret,proper,incomplete,3,convenient,inconv,nonprob,not_recom,not_recom -great_pret,proper,incomplete,3,convenient,inconv,slightly_prob,recommended,priority -great_pret,proper,incomplete,3,convenient,inconv,slightly_prob,priority,spec_prior -great_pret,proper,incomplete,3,convenient,inconv,slightly_prob,not_recom,not_recom -great_pret,proper,incomplete,3,convenient,inconv,problematic,recommended,spec_prior -great_pret,proper,incomplete,3,convenient,inconv,problematic,priority,spec_prior -great_pret,proper,incomplete,3,convenient,inconv,problematic,not_recom,not_recom -great_pret,proper,incomplete,3,less_conv,convenient,nonprob,recommended,priority -great_pret,proper,incomplete,3,less_conv,convenient,nonprob,priority,spec_prior 
-great_pret,proper,incomplete,3,less_conv,convenient,nonprob,not_recom,not_recom -great_pret,proper,incomplete,3,less_conv,convenient,slightly_prob,recommended,priority -great_pret,proper,incomplete,3,less_conv,convenient,slightly_prob,priority,spec_prior -great_pret,proper,incomplete,3,less_conv,convenient,slightly_prob,not_recom,not_recom -great_pret,proper,incomplete,3,less_conv,convenient,problematic,recommended,spec_prior -great_pret,proper,incomplete,3,less_conv,convenient,problematic,priority,spec_prior -great_pret,proper,incomplete,3,less_conv,convenient,problematic,not_recom,not_recom -great_pret,proper,incomplete,3,less_conv,inconv,nonprob,recommended,priority -great_pret,proper,incomplete,3,less_conv,inconv,nonprob,priority,spec_prior -great_pret,proper,incomplete,3,less_conv,inconv,nonprob,not_recom,not_recom -great_pret,proper,incomplete,3,less_conv,inconv,slightly_prob,recommended,priority -great_pret,proper,incomplete,3,less_conv,inconv,slightly_prob,priority,spec_prior -great_pret,proper,incomplete,3,less_conv,inconv,slightly_prob,not_recom,not_recom -great_pret,proper,incomplete,3,less_conv,inconv,problematic,recommended,spec_prior -great_pret,proper,incomplete,3,less_conv,inconv,problematic,priority,spec_prior -great_pret,proper,incomplete,3,less_conv,inconv,problematic,not_recom,not_recom -great_pret,proper,incomplete,3,critical,convenient,nonprob,recommended,priority -great_pret,proper,incomplete,3,critical,convenient,nonprob,priority,spec_prior -great_pret,proper,incomplete,3,critical,convenient,nonprob,not_recom,not_recom -great_pret,proper,incomplete,3,critical,convenient,slightly_prob,recommended,priority -great_pret,proper,incomplete,3,critical,convenient,slightly_prob,priority,spec_prior -great_pret,proper,incomplete,3,critical,convenient,slightly_prob,not_recom,not_recom -great_pret,proper,incomplete,3,critical,convenient,problematic,recommended,spec_prior -great_pret,proper,incomplete,3,critical,convenient,problematic,priority,spec_prior 
-great_pret,proper,incomplete,3,critical,convenient,problematic,not_recom,not_recom -great_pret,proper,incomplete,3,critical,inconv,nonprob,recommended,priority -great_pret,proper,incomplete,3,critical,inconv,nonprob,priority,spec_prior -great_pret,proper,incomplete,3,critical,inconv,nonprob,not_recom,not_recom -great_pret,proper,incomplete,3,critical,inconv,slightly_prob,recommended,priority -great_pret,proper,incomplete,3,critical,inconv,slightly_prob,priority,spec_prior -great_pret,proper,incomplete,3,critical,inconv,slightly_prob,not_recom,not_recom -great_pret,proper,incomplete,3,critical,inconv,problematic,recommended,spec_prior -great_pret,proper,incomplete,3,critical,inconv,problematic,priority,spec_prior -great_pret,proper,incomplete,3,critical,inconv,problematic,not_recom,not_recom -great_pret,proper,incomplete,more,convenient,convenient,nonprob,recommended,priority -great_pret,proper,incomplete,more,convenient,convenient,nonprob,priority,priority -great_pret,proper,incomplete,more,convenient,convenient,nonprob,not_recom,not_recom -great_pret,proper,incomplete,more,convenient,convenient,slightly_prob,recommended,priority -great_pret,proper,incomplete,more,convenient,convenient,slightly_prob,priority,priority -great_pret,proper,incomplete,more,convenient,convenient,slightly_prob,not_recom,not_recom -great_pret,proper,incomplete,more,convenient,convenient,problematic,recommended,priority -great_pret,proper,incomplete,more,convenient,convenient,problematic,priority,priority -great_pret,proper,incomplete,more,convenient,convenient,problematic,not_recom,not_recom -great_pret,proper,incomplete,more,convenient,inconv,nonprob,recommended,priority -great_pret,proper,incomplete,more,convenient,inconv,nonprob,priority,spec_prior -great_pret,proper,incomplete,more,convenient,inconv,nonprob,not_recom,not_recom -great_pret,proper,incomplete,more,convenient,inconv,slightly_prob,recommended,priority 
-great_pret,proper,incomplete,more,convenient,inconv,slightly_prob,priority,spec_prior -great_pret,proper,incomplete,more,convenient,inconv,slightly_prob,not_recom,not_recom -great_pret,proper,incomplete,more,convenient,inconv,problematic,recommended,spec_prior -great_pret,proper,incomplete,more,convenient,inconv,problematic,priority,spec_prior -great_pret,proper,incomplete,more,convenient,inconv,problematic,not_recom,not_recom -great_pret,proper,incomplete,more,less_conv,convenient,nonprob,recommended,priority -great_pret,proper,incomplete,more,less_conv,convenient,nonprob,priority,spec_prior -great_pret,proper,incomplete,more,less_conv,convenient,nonprob,not_recom,not_recom -great_pret,proper,incomplete,more,less_conv,convenient,slightly_prob,recommended,priority -great_pret,proper,incomplete,more,less_conv,convenient,slightly_prob,priority,spec_prior -great_pret,proper,incomplete,more,less_conv,convenient,slightly_prob,not_recom,not_recom -great_pret,proper,incomplete,more,less_conv,convenient,problematic,recommended,spec_prior -great_pret,proper,incomplete,more,less_conv,convenient,problematic,priority,spec_prior -great_pret,proper,incomplete,more,less_conv,convenient,problematic,not_recom,not_recom -great_pret,proper,incomplete,more,less_conv,inconv,nonprob,recommended,priority -great_pret,proper,incomplete,more,less_conv,inconv,nonprob,priority,spec_prior -great_pret,proper,incomplete,more,less_conv,inconv,nonprob,not_recom,not_recom -great_pret,proper,incomplete,more,less_conv,inconv,slightly_prob,recommended,priority -great_pret,proper,incomplete,more,less_conv,inconv,slightly_prob,priority,spec_prior -great_pret,proper,incomplete,more,less_conv,inconv,slightly_prob,not_recom,not_recom -great_pret,proper,incomplete,more,less_conv,inconv,problematic,recommended,spec_prior -great_pret,proper,incomplete,more,less_conv,inconv,problematic,priority,spec_prior -great_pret,proper,incomplete,more,less_conv,inconv,problematic,not_recom,not_recom 
-great_pret,proper,incomplete,more,critical,convenient,nonprob,recommended,priority -great_pret,proper,incomplete,more,critical,convenient,nonprob,priority,spec_prior -great_pret,proper,incomplete,more,critical,convenient,nonprob,not_recom,not_recom -great_pret,proper,incomplete,more,critical,convenient,slightly_prob,recommended,priority -great_pret,proper,incomplete,more,critical,convenient,slightly_prob,priority,spec_prior -great_pret,proper,incomplete,more,critical,convenient,slightly_prob,not_recom,not_recom -great_pret,proper,incomplete,more,critical,convenient,problematic,recommended,spec_prior -great_pret,proper,incomplete,more,critical,convenient,problematic,priority,spec_prior -great_pret,proper,incomplete,more,critical,convenient,problematic,not_recom,not_recom -great_pret,proper,incomplete,more,critical,inconv,nonprob,recommended,priority -great_pret,proper,incomplete,more,critical,inconv,nonprob,priority,spec_prior -great_pret,proper,incomplete,more,critical,inconv,nonprob,not_recom,not_recom -great_pret,proper,incomplete,more,critical,inconv,slightly_prob,recommended,priority -great_pret,proper,incomplete,more,critical,inconv,slightly_prob,priority,spec_prior -great_pret,proper,incomplete,more,critical,inconv,slightly_prob,not_recom,not_recom -great_pret,proper,incomplete,more,critical,inconv,problematic,recommended,spec_prior -great_pret,proper,incomplete,more,critical,inconv,problematic,priority,spec_prior -great_pret,proper,incomplete,more,critical,inconv,problematic,not_recom,not_recom -great_pret,proper,foster,1,convenient,convenient,nonprob,recommended,priority -great_pret,proper,foster,1,convenient,convenient,nonprob,priority,priority -great_pret,proper,foster,1,convenient,convenient,nonprob,not_recom,not_recom -great_pret,proper,foster,1,convenient,convenient,slightly_prob,recommended,priority -great_pret,proper,foster,1,convenient,convenient,slightly_prob,priority,priority 
-great_pret,proper,foster,1,convenient,convenient,slightly_prob,not_recom,not_recom -great_pret,proper,foster,1,convenient,convenient,problematic,recommended,priority -great_pret,proper,foster,1,convenient,convenient,problematic,priority,priority -great_pret,proper,foster,1,convenient,convenient,problematic,not_recom,not_recom -great_pret,proper,foster,1,convenient,inconv,nonprob,recommended,priority -great_pret,proper,foster,1,convenient,inconv,nonprob,priority,spec_prior -great_pret,proper,foster,1,convenient,inconv,nonprob,not_recom,not_recom -great_pret,proper,foster,1,convenient,inconv,slightly_prob,recommended,priority -great_pret,proper,foster,1,convenient,inconv,slightly_prob,priority,spec_prior -great_pret,proper,foster,1,convenient,inconv,slightly_prob,not_recom,not_recom -great_pret,proper,foster,1,convenient,inconv,problematic,recommended,spec_prior -great_pret,proper,foster,1,convenient,inconv,problematic,priority,spec_prior -great_pret,proper,foster,1,convenient,inconv,problematic,not_recom,not_recom -great_pret,proper,foster,1,less_conv,convenient,nonprob,recommended,priority -great_pret,proper,foster,1,less_conv,convenient,nonprob,priority,spec_prior -great_pret,proper,foster,1,less_conv,convenient,nonprob,not_recom,not_recom -great_pret,proper,foster,1,less_conv,convenient,slightly_prob,recommended,priority -great_pret,proper,foster,1,less_conv,convenient,slightly_prob,priority,spec_prior -great_pret,proper,foster,1,less_conv,convenient,slightly_prob,not_recom,not_recom -great_pret,proper,foster,1,less_conv,convenient,problematic,recommended,spec_prior -great_pret,proper,foster,1,less_conv,convenient,problematic,priority,spec_prior -great_pret,proper,foster,1,less_conv,convenient,problematic,not_recom,not_recom -great_pret,proper,foster,1,less_conv,inconv,nonprob,recommended,priority -great_pret,proper,foster,1,less_conv,inconv,nonprob,priority,spec_prior -great_pret,proper,foster,1,less_conv,inconv,nonprob,not_recom,not_recom 
-great_pret,proper,foster,1,less_conv,inconv,slightly_prob,recommended,priority -great_pret,proper,foster,1,less_conv,inconv,slightly_prob,priority,spec_prior -great_pret,proper,foster,1,less_conv,inconv,slightly_prob,not_recom,not_recom -great_pret,proper,foster,1,less_conv,inconv,problematic,recommended,spec_prior -great_pret,proper,foster,1,less_conv,inconv,problematic,priority,spec_prior -great_pret,proper,foster,1,less_conv,inconv,problematic,not_recom,not_recom -great_pret,proper,foster,1,critical,convenient,nonprob,recommended,priority -great_pret,proper,foster,1,critical,convenient,nonprob,priority,spec_prior -great_pret,proper,foster,1,critical,convenient,nonprob,not_recom,not_recom -great_pret,proper,foster,1,critical,convenient,slightly_prob,recommended,priority -great_pret,proper,foster,1,critical,convenient,slightly_prob,priority,spec_prior -great_pret,proper,foster,1,critical,convenient,slightly_prob,not_recom,not_recom -great_pret,proper,foster,1,critical,convenient,problematic,recommended,spec_prior -great_pret,proper,foster,1,critical,convenient,problematic,priority,spec_prior -great_pret,proper,foster,1,critical,convenient,problematic,not_recom,not_recom -great_pret,proper,foster,1,critical,inconv,nonprob,recommended,priority -great_pret,proper,foster,1,critical,inconv,nonprob,priority,spec_prior -great_pret,proper,foster,1,critical,inconv,nonprob,not_recom,not_recom -great_pret,proper,foster,1,critical,inconv,slightly_prob,recommended,priority -great_pret,proper,foster,1,critical,inconv,slightly_prob,priority,spec_prior -great_pret,proper,foster,1,critical,inconv,slightly_prob,not_recom,not_recom -great_pret,proper,foster,1,critical,inconv,problematic,recommended,spec_prior -great_pret,proper,foster,1,critical,inconv,problematic,priority,spec_prior -great_pret,proper,foster,1,critical,inconv,problematic,not_recom,not_recom -great_pret,proper,foster,2,convenient,convenient,nonprob,recommended,priority 
-great_pret,proper,foster,2,convenient,convenient,nonprob,priority,priority -great_pret,proper,foster,2,convenient,convenient,nonprob,not_recom,not_recom -great_pret,proper,foster,2,convenient,convenient,slightly_prob,recommended,priority -great_pret,proper,foster,2,convenient,convenient,slightly_prob,priority,priority -great_pret,proper,foster,2,convenient,convenient,slightly_prob,not_recom,not_recom -great_pret,proper,foster,2,convenient,convenient,problematic,recommended,priority -great_pret,proper,foster,2,convenient,convenient,problematic,priority,priority -great_pret,proper,foster,2,convenient,convenient,problematic,not_recom,not_recom -great_pret,proper,foster,2,convenient,inconv,nonprob,recommended,priority -great_pret,proper,foster,2,convenient,inconv,nonprob,priority,spec_prior -great_pret,proper,foster,2,convenient,inconv,nonprob,not_recom,not_recom -great_pret,proper,foster,2,convenient,inconv,slightly_prob,recommended,priority -great_pret,proper,foster,2,convenient,inconv,slightly_prob,priority,spec_prior -great_pret,proper,foster,2,convenient,inconv,slightly_prob,not_recom,not_recom -great_pret,proper,foster,2,convenient,inconv,problematic,recommended,spec_prior -great_pret,proper,foster,2,convenient,inconv,problematic,priority,spec_prior -great_pret,proper,foster,2,convenient,inconv,problematic,not_recom,not_recom -great_pret,proper,foster,2,less_conv,convenient,nonprob,recommended,priority -great_pret,proper,foster,2,less_conv,convenient,nonprob,priority,spec_prior -great_pret,proper,foster,2,less_conv,convenient,nonprob,not_recom,not_recom -great_pret,proper,foster,2,less_conv,convenient,slightly_prob,recommended,priority -great_pret,proper,foster,2,less_conv,convenient,slightly_prob,priority,spec_prior -great_pret,proper,foster,2,less_conv,convenient,slightly_prob,not_recom,not_recom -great_pret,proper,foster,2,less_conv,convenient,problematic,recommended,spec_prior -great_pret,proper,foster,2,less_conv,convenient,problematic,priority,spec_prior 
-great_pret,proper,foster,2,less_conv,convenient,problematic,not_recom,not_recom -great_pret,proper,foster,2,less_conv,inconv,nonprob,recommended,priority -great_pret,proper,foster,2,less_conv,inconv,nonprob,priority,spec_prior -great_pret,proper,foster,2,less_conv,inconv,nonprob,not_recom,not_recom -great_pret,proper,foster,2,less_conv,inconv,slightly_prob,recommended,priority -great_pret,proper,foster,2,less_conv,inconv,slightly_prob,priority,spec_prior -great_pret,proper,foster,2,less_conv,inconv,slightly_prob,not_recom,not_recom -great_pret,proper,foster,2,less_conv,inconv,problematic,recommended,spec_prior -great_pret,proper,foster,2,less_conv,inconv,problematic,priority,spec_prior -great_pret,proper,foster,2,less_conv,inconv,problematic,not_recom,not_recom -great_pret,proper,foster,2,critical,convenient,nonprob,recommended,priority -great_pret,proper,foster,2,critical,convenient,nonprob,priority,spec_prior -great_pret,proper,foster,2,critical,convenient,nonprob,not_recom,not_recom -great_pret,proper,foster,2,critical,convenient,slightly_prob,recommended,priority -great_pret,proper,foster,2,critical,convenient,slightly_prob,priority,spec_prior -great_pret,proper,foster,2,critical,convenient,slightly_prob,not_recom,not_recom -great_pret,proper,foster,2,critical,convenient,problematic,recommended,spec_prior -great_pret,proper,foster,2,critical,convenient,problematic,priority,spec_prior -great_pret,proper,foster,2,critical,convenient,problematic,not_recom,not_recom -great_pret,proper,foster,2,critical,inconv,nonprob,recommended,priority -great_pret,proper,foster,2,critical,inconv,nonprob,priority,spec_prior -great_pret,proper,foster,2,critical,inconv,nonprob,not_recom,not_recom -great_pret,proper,foster,2,critical,inconv,slightly_prob,recommended,priority -great_pret,proper,foster,2,critical,inconv,slightly_prob,priority,spec_prior -great_pret,proper,foster,2,critical,inconv,slightly_prob,not_recom,not_recom 
-great_pret,proper,foster,2,critical,inconv,problematic,recommended,spec_prior -great_pret,proper,foster,2,critical,inconv,problematic,priority,spec_prior -great_pret,proper,foster,2,critical,inconv,problematic,not_recom,not_recom -great_pret,proper,foster,3,convenient,convenient,nonprob,recommended,priority -great_pret,proper,foster,3,convenient,convenient,nonprob,priority,priority -great_pret,proper,foster,3,convenient,convenient,nonprob,not_recom,not_recom -great_pret,proper,foster,3,convenient,convenient,slightly_prob,recommended,priority -great_pret,proper,foster,3,convenient,convenient,slightly_prob,priority,priority -great_pret,proper,foster,3,convenient,convenient,slightly_prob,not_recom,not_recom -great_pret,proper,foster,3,convenient,convenient,problematic,recommended,priority -great_pret,proper,foster,3,convenient,convenient,problematic,priority,priority -great_pret,proper,foster,3,convenient,convenient,problematic,not_recom,not_recom -great_pret,proper,foster,3,convenient,inconv,nonprob,recommended,priority -great_pret,proper,foster,3,convenient,inconv,nonprob,priority,spec_prior -great_pret,proper,foster,3,convenient,inconv,nonprob,not_recom,not_recom -great_pret,proper,foster,3,convenient,inconv,slightly_prob,recommended,priority -great_pret,proper,foster,3,convenient,inconv,slightly_prob,priority,spec_prior -great_pret,proper,foster,3,convenient,inconv,slightly_prob,not_recom,not_recom -great_pret,proper,foster,3,convenient,inconv,problematic,recommended,spec_prior -great_pret,proper,foster,3,convenient,inconv,problematic,priority,spec_prior -great_pret,proper,foster,3,convenient,inconv,problematic,not_recom,not_recom -great_pret,proper,foster,3,less_conv,convenient,nonprob,recommended,priority -great_pret,proper,foster,3,less_conv,convenient,nonprob,priority,spec_prior -great_pret,proper,foster,3,less_conv,convenient,nonprob,not_recom,not_recom -great_pret,proper,foster,3,less_conv,convenient,slightly_prob,recommended,priority 
-great_pret,proper,foster,3,less_conv,convenient,slightly_prob,priority,spec_prior -great_pret,proper,foster,3,less_conv,convenient,slightly_prob,not_recom,not_recom -great_pret,proper,foster,3,less_conv,convenient,problematic,recommended,spec_prior -great_pret,proper,foster,3,less_conv,convenient,problematic,priority,spec_prior -great_pret,proper,foster,3,less_conv,convenient,problematic,not_recom,not_recom -great_pret,proper,foster,3,less_conv,inconv,nonprob,recommended,priority -great_pret,proper,foster,3,less_conv,inconv,nonprob,priority,spec_prior -great_pret,proper,foster,3,less_conv,inconv,nonprob,not_recom,not_recom -great_pret,proper,foster,3,less_conv,inconv,slightly_prob,recommended,priority -great_pret,proper,foster,3,less_conv,inconv,slightly_prob,priority,spec_prior -great_pret,proper,foster,3,less_conv,inconv,slightly_prob,not_recom,not_recom -great_pret,proper,foster,3,less_conv,inconv,problematic,recommended,spec_prior -great_pret,proper,foster,3,less_conv,inconv,problematic,priority,spec_prior -great_pret,proper,foster,3,less_conv,inconv,problematic,not_recom,not_recom -great_pret,proper,foster,3,critical,convenient,nonprob,recommended,priority -great_pret,proper,foster,3,critical,convenient,nonprob,priority,spec_prior -great_pret,proper,foster,3,critical,convenient,nonprob,not_recom,not_recom -great_pret,proper,foster,3,critical,convenient,slightly_prob,recommended,priority -great_pret,proper,foster,3,critical,convenient,slightly_prob,priority,spec_prior -great_pret,proper,foster,3,critical,convenient,slightly_prob,not_recom,not_recom -great_pret,proper,foster,3,critical,convenient,problematic,recommended,spec_prior -great_pret,proper,foster,3,critical,convenient,problematic,priority,spec_prior -great_pret,proper,foster,3,critical,convenient,problematic,not_recom,not_recom -great_pret,proper,foster,3,critical,inconv,nonprob,recommended,priority -great_pret,proper,foster,3,critical,inconv,nonprob,priority,spec_prior 
-great_pret,proper,foster,3,critical,inconv,nonprob,not_recom,not_recom -great_pret,proper,foster,3,critical,inconv,slightly_prob,recommended,priority -great_pret,proper,foster,3,critical,inconv,slightly_prob,priority,spec_prior -great_pret,proper,foster,3,critical,inconv,slightly_prob,not_recom,not_recom -great_pret,proper,foster,3,critical,inconv,problematic,recommended,spec_prior -great_pret,proper,foster,3,critical,inconv,problematic,priority,spec_prior -great_pret,proper,foster,3,critical,inconv,problematic,not_recom,not_recom -great_pret,proper,foster,more,convenient,convenient,nonprob,recommended,priority -great_pret,proper,foster,more,convenient,convenient,nonprob,priority,priority -great_pret,proper,foster,more,convenient,convenient,nonprob,not_recom,not_recom -great_pret,proper,foster,more,convenient,convenient,slightly_prob,recommended,priority -great_pret,proper,foster,more,convenient,convenient,slightly_prob,priority,priority -great_pret,proper,foster,more,convenient,convenient,slightly_prob,not_recom,not_recom -great_pret,proper,foster,more,convenient,convenient,problematic,recommended,priority -great_pret,proper,foster,more,convenient,convenient,problematic,priority,priority -great_pret,proper,foster,more,convenient,convenient,problematic,not_recom,not_recom -great_pret,proper,foster,more,convenient,inconv,nonprob,recommended,priority -great_pret,proper,foster,more,convenient,inconv,nonprob,priority,spec_prior -great_pret,proper,foster,more,convenient,inconv,nonprob,not_recom,not_recom -great_pret,proper,foster,more,convenient,inconv,slightly_prob,recommended,priority -great_pret,proper,foster,more,convenient,inconv,slightly_prob,priority,spec_prior -great_pret,proper,foster,more,convenient,inconv,slightly_prob,not_recom,not_recom -great_pret,proper,foster,more,convenient,inconv,problematic,recommended,spec_prior -great_pret,proper,foster,more,convenient,inconv,problematic,priority,spec_prior 
-great_pret,proper,foster,more,convenient,inconv,problematic,not_recom,not_recom -great_pret,proper,foster,more,less_conv,convenient,nonprob,recommended,priority -great_pret,proper,foster,more,less_conv,convenient,nonprob,priority,spec_prior -great_pret,proper,foster,more,less_conv,convenient,nonprob,not_recom,not_recom -great_pret,proper,foster,more,less_conv,convenient,slightly_prob,recommended,priority -great_pret,proper,foster,more,less_conv,convenient,slightly_prob,priority,spec_prior -great_pret,proper,foster,more,less_conv,convenient,slightly_prob,not_recom,not_recom -great_pret,proper,foster,more,less_conv,convenient,problematic,recommended,spec_prior -great_pret,proper,foster,more,less_conv,convenient,problematic,priority,spec_prior -great_pret,proper,foster,more,less_conv,convenient,problematic,not_recom,not_recom -great_pret,proper,foster,more,less_conv,inconv,nonprob,recommended,priority -great_pret,proper,foster,more,less_conv,inconv,nonprob,priority,spec_prior -great_pret,proper,foster,more,less_conv,inconv,nonprob,not_recom,not_recom -great_pret,proper,foster,more,less_conv,inconv,slightly_prob,recommended,priority -great_pret,proper,foster,more,less_conv,inconv,slightly_prob,priority,spec_prior -great_pret,proper,foster,more,less_conv,inconv,slightly_prob,not_recom,not_recom -great_pret,proper,foster,more,less_conv,inconv,problematic,recommended,spec_prior -great_pret,proper,foster,more,less_conv,inconv,problematic,priority,spec_prior -great_pret,proper,foster,more,less_conv,inconv,problematic,not_recom,not_recom -great_pret,proper,foster,more,critical,convenient,nonprob,recommended,priority -great_pret,proper,foster,more,critical,convenient,nonprob,priority,spec_prior -great_pret,proper,foster,more,critical,convenient,nonprob,not_recom,not_recom -great_pret,proper,foster,more,critical,convenient,slightly_prob,recommended,priority -great_pret,proper,foster,more,critical,convenient,slightly_prob,priority,spec_prior 
-great_pret,proper,foster,more,critical,convenient,slightly_prob,not_recom,not_recom -great_pret,proper,foster,more,critical,convenient,problematic,recommended,spec_prior -great_pret,proper,foster,more,critical,convenient,problematic,priority,spec_prior -great_pret,proper,foster,more,critical,convenient,problematic,not_recom,not_recom -great_pret,proper,foster,more,critical,inconv,nonprob,recommended,priority -great_pret,proper,foster,more,critical,inconv,nonprob,priority,spec_prior -great_pret,proper,foster,more,critical,inconv,nonprob,not_recom,not_recom -great_pret,proper,foster,more,critical,inconv,slightly_prob,recommended,priority -great_pret,proper,foster,more,critical,inconv,slightly_prob,priority,spec_prior -great_pret,proper,foster,more,critical,inconv,slightly_prob,not_recom,not_recom -great_pret,proper,foster,more,critical,inconv,problematic,recommended,spec_prior -great_pret,proper,foster,more,critical,inconv,problematic,priority,spec_prior -great_pret,proper,foster,more,critical,inconv,problematic,not_recom,not_recom -great_pret,less_proper,complete,1,convenient,convenient,nonprob,recommended,priority -great_pret,less_proper,complete,1,convenient,convenient,nonprob,priority,priority -great_pret,less_proper,complete,1,convenient,convenient,nonprob,not_recom,not_recom -great_pret,less_proper,complete,1,convenient,convenient,slightly_prob,recommended,priority -great_pret,less_proper,complete,1,convenient,convenient,slightly_prob,priority,priority -great_pret,less_proper,complete,1,convenient,convenient,slightly_prob,not_recom,not_recom -great_pret,less_proper,complete,1,convenient,convenient,problematic,recommended,priority -great_pret,less_proper,complete,1,convenient,convenient,problematic,priority,priority -great_pret,less_proper,complete,1,convenient,convenient,problematic,not_recom,not_recom -great_pret,less_proper,complete,1,convenient,inconv,nonprob,recommended,priority -great_pret,less_proper,complete,1,convenient,inconv,nonprob,priority,priority 
-great_pret,less_proper,complete,1,convenient,inconv,nonprob,not_recom,not_recom -great_pret,less_proper,complete,1,convenient,inconv,slightly_prob,recommended,priority -great_pret,less_proper,complete,1,convenient,inconv,slightly_prob,priority,priority -great_pret,less_proper,complete,1,convenient,inconv,slightly_prob,not_recom,not_recom -great_pret,less_proper,complete,1,convenient,inconv,problematic,recommended,priority -great_pret,less_proper,complete,1,convenient,inconv,problematic,priority,priority -great_pret,less_proper,complete,1,convenient,inconv,problematic,not_recom,not_recom -great_pret,less_proper,complete,1,less_conv,convenient,nonprob,recommended,priority -great_pret,less_proper,complete,1,less_conv,convenient,nonprob,priority,priority -great_pret,less_proper,complete,1,less_conv,convenient,nonprob,not_recom,not_recom -great_pret,less_proper,complete,1,less_conv,convenient,slightly_prob,recommended,priority -great_pret,less_proper,complete,1,less_conv,convenient,slightly_prob,priority,priority -great_pret,less_proper,complete,1,less_conv,convenient,slightly_prob,not_recom,not_recom -great_pret,less_proper,complete,1,less_conv,convenient,problematic,recommended,priority -great_pret,less_proper,complete,1,less_conv,convenient,problematic,priority,priority -great_pret,less_proper,complete,1,less_conv,convenient,problematic,not_recom,not_recom -great_pret,less_proper,complete,1,less_conv,inconv,nonprob,recommended,priority -great_pret,less_proper,complete,1,less_conv,inconv,nonprob,priority,priority -great_pret,less_proper,complete,1,less_conv,inconv,nonprob,not_recom,not_recom -great_pret,less_proper,complete,1,less_conv,inconv,slightly_prob,recommended,priority -great_pret,less_proper,complete,1,less_conv,inconv,slightly_prob,priority,priority -great_pret,less_proper,complete,1,less_conv,inconv,slightly_prob,not_recom,not_recom -great_pret,less_proper,complete,1,less_conv,inconv,problematic,recommended,priority 
-great_pret,less_proper,complete,1,less_conv,inconv,problematic,priority,priority -great_pret,less_proper,complete,1,less_conv,inconv,problematic,not_recom,not_recom -great_pret,less_proper,complete,1,critical,convenient,nonprob,recommended,priority -great_pret,less_proper,complete,1,critical,convenient,nonprob,priority,priority -great_pret,less_proper,complete,1,critical,convenient,nonprob,not_recom,not_recom -great_pret,less_proper,complete,1,critical,convenient,slightly_prob,recommended,priority -great_pret,less_proper,complete,1,critical,convenient,slightly_prob,priority,priority -great_pret,less_proper,complete,1,critical,convenient,slightly_prob,not_recom,not_recom -great_pret,less_proper,complete,1,critical,convenient,problematic,recommended,priority -great_pret,less_proper,complete,1,critical,convenient,problematic,priority,priority -great_pret,less_proper,complete,1,critical,convenient,problematic,not_recom,not_recom -great_pret,less_proper,complete,1,critical,inconv,nonprob,recommended,priority -great_pret,less_proper,complete,1,critical,inconv,nonprob,priority,priority -great_pret,less_proper,complete,1,critical,inconv,nonprob,not_recom,not_recom -great_pret,less_proper,complete,1,critical,inconv,slightly_prob,recommended,priority -great_pret,less_proper,complete,1,critical,inconv,slightly_prob,priority,priority -great_pret,less_proper,complete,1,critical,inconv,slightly_prob,not_recom,not_recom -great_pret,less_proper,complete,1,critical,inconv,problematic,recommended,priority -great_pret,less_proper,complete,1,critical,inconv,problematic,priority,priority -great_pret,less_proper,complete,1,critical,inconv,problematic,not_recom,not_recom -great_pret,less_proper,complete,2,convenient,convenient,nonprob,recommended,priority -great_pret,less_proper,complete,2,convenient,convenient,nonprob,priority,priority -great_pret,less_proper,complete,2,convenient,convenient,nonprob,not_recom,not_recom 
-great_pret,less_proper,complete,2,convenient,convenient,slightly_prob,recommended,priority -great_pret,less_proper,complete,2,convenient,convenient,slightly_prob,priority,priority -great_pret,less_proper,complete,2,convenient,convenient,slightly_prob,not_recom,not_recom -great_pret,less_proper,complete,2,convenient,convenient,problematic,recommended,priority -great_pret,less_proper,complete,2,convenient,convenient,problematic,priority,priority -great_pret,less_proper,complete,2,convenient,convenient,problematic,not_recom,not_recom -great_pret,less_proper,complete,2,convenient,inconv,nonprob,recommended,priority -great_pret,less_proper,complete,2,convenient,inconv,nonprob,priority,priority -great_pret,less_proper,complete,2,convenient,inconv,nonprob,not_recom,not_recom -great_pret,less_proper,complete,2,convenient,inconv,slightly_prob,recommended,priority -great_pret,less_proper,complete,2,convenient,inconv,slightly_prob,priority,priority -great_pret,less_proper,complete,2,convenient,inconv,slightly_prob,not_recom,not_recom -great_pret,less_proper,complete,2,convenient,inconv,problematic,recommended,priority -great_pret,less_proper,complete,2,convenient,inconv,problematic,priority,priority -great_pret,less_proper,complete,2,convenient,inconv,problematic,not_recom,not_recom -great_pret,less_proper,complete,2,less_conv,convenient,nonprob,recommended,priority -great_pret,less_proper,complete,2,less_conv,convenient,nonprob,priority,priority -great_pret,less_proper,complete,2,less_conv,convenient,nonprob,not_recom,not_recom -great_pret,less_proper,complete,2,less_conv,convenient,slightly_prob,recommended,priority -great_pret,less_proper,complete,2,less_conv,convenient,slightly_prob,priority,priority -great_pret,less_proper,complete,2,less_conv,convenient,slightly_prob,not_recom,not_recom -great_pret,less_proper,complete,2,less_conv,convenient,problematic,recommended,priority -great_pret,less_proper,complete,2,less_conv,convenient,problematic,priority,priority 
-great_pret,less_proper,complete,2,less_conv,convenient,problematic,not_recom,not_recom -great_pret,less_proper,complete,2,less_conv,inconv,nonprob,recommended,priority -great_pret,less_proper,complete,2,less_conv,inconv,nonprob,priority,priority -great_pret,less_proper,complete,2,less_conv,inconv,nonprob,not_recom,not_recom -great_pret,less_proper,complete,2,less_conv,inconv,slightly_prob,recommended,priority -great_pret,less_proper,complete,2,less_conv,inconv,slightly_prob,priority,priority -great_pret,less_proper,complete,2,less_conv,inconv,slightly_prob,not_recom,not_recom -great_pret,less_proper,complete,2,less_conv,inconv,problematic,recommended,priority -great_pret,less_proper,complete,2,less_conv,inconv,problematic,priority,priority -great_pret,less_proper,complete,2,less_conv,inconv,problematic,not_recom,not_recom -great_pret,less_proper,complete,2,critical,convenient,nonprob,recommended,priority -great_pret,less_proper,complete,2,critical,convenient,nonprob,priority,spec_prior -great_pret,less_proper,complete,2,critical,convenient,nonprob,not_recom,not_recom -great_pret,less_proper,complete,2,critical,convenient,slightly_prob,recommended,priority -great_pret,less_proper,complete,2,critical,convenient,slightly_prob,priority,spec_prior -great_pret,less_proper,complete,2,critical,convenient,slightly_prob,not_recom,not_recom -great_pret,less_proper,complete,2,critical,convenient,problematic,recommended,spec_prior -great_pret,less_proper,complete,2,critical,convenient,problematic,priority,spec_prior -great_pret,less_proper,complete,2,critical,convenient,problematic,not_recom,not_recom -great_pret,less_proper,complete,2,critical,inconv,nonprob,recommended,priority -great_pret,less_proper,complete,2,critical,inconv,nonprob,priority,spec_prior -great_pret,less_proper,complete,2,critical,inconv,nonprob,not_recom,not_recom -great_pret,less_proper,complete,2,critical,inconv,slightly_prob,recommended,priority 
-great_pret,less_proper,complete,2,critical,inconv,slightly_prob,priority,spec_prior -great_pret,less_proper,complete,2,critical,inconv,slightly_prob,not_recom,not_recom -great_pret,less_proper,complete,2,critical,inconv,problematic,recommended,spec_prior -great_pret,less_proper,complete,2,critical,inconv,problematic,priority,spec_prior -great_pret,less_proper,complete,2,critical,inconv,problematic,not_recom,not_recom -great_pret,less_proper,complete,3,convenient,convenient,nonprob,recommended,priority -great_pret,less_proper,complete,3,convenient,convenient,nonprob,priority,priority -great_pret,less_proper,complete,3,convenient,convenient,nonprob,not_recom,not_recom -great_pret,less_proper,complete,3,convenient,convenient,slightly_prob,recommended,priority -great_pret,less_proper,complete,3,convenient,convenient,slightly_prob,priority,priority -great_pret,less_proper,complete,3,convenient,convenient,slightly_prob,not_recom,not_recom -great_pret,less_proper,complete,3,convenient,convenient,problematic,recommended,priority -great_pret,less_proper,complete,3,convenient,convenient,problematic,priority,priority -great_pret,less_proper,complete,3,convenient,convenient,problematic,not_recom,not_recom -great_pret,less_proper,complete,3,convenient,inconv,nonprob,recommended,priority -great_pret,less_proper,complete,3,convenient,inconv,nonprob,priority,spec_prior -great_pret,less_proper,complete,3,convenient,inconv,nonprob,not_recom,not_recom -great_pret,less_proper,complete,3,convenient,inconv,slightly_prob,recommended,priority -great_pret,less_proper,complete,3,convenient,inconv,slightly_prob,priority,spec_prior -great_pret,less_proper,complete,3,convenient,inconv,slightly_prob,not_recom,not_recom -great_pret,less_proper,complete,3,convenient,inconv,problematic,recommended,spec_prior -great_pret,less_proper,complete,3,convenient,inconv,problematic,priority,spec_prior -great_pret,less_proper,complete,3,convenient,inconv,problematic,not_recom,not_recom 
-great_pret,less_proper,complete,3,less_conv,convenient,nonprob,recommended,priority -great_pret,less_proper,complete,3,less_conv,convenient,nonprob,priority,spec_prior -great_pret,less_proper,complete,3,less_conv,convenient,nonprob,not_recom,not_recom -great_pret,less_proper,complete,3,less_conv,convenient,slightly_prob,recommended,priority -great_pret,less_proper,complete,3,less_conv,convenient,slightly_prob,priority,spec_prior -great_pret,less_proper,complete,3,less_conv,convenient,slightly_prob,not_recom,not_recom -great_pret,less_proper,complete,3,less_conv,convenient,problematic,recommended,spec_prior -great_pret,less_proper,complete,3,less_conv,convenient,problematic,priority,spec_prior -great_pret,less_proper,complete,3,less_conv,convenient,problematic,not_recom,not_recom -great_pret,less_proper,complete,3,less_conv,inconv,nonprob,recommended,priority -great_pret,less_proper,complete,3,less_conv,inconv,nonprob,priority,spec_prior -great_pret,less_proper,complete,3,less_conv,inconv,nonprob,not_recom,not_recom -great_pret,less_proper,complete,3,less_conv,inconv,slightly_prob,recommended,priority -great_pret,less_proper,complete,3,less_conv,inconv,slightly_prob,priority,spec_prior -great_pret,less_proper,complete,3,less_conv,inconv,slightly_prob,not_recom,not_recom -great_pret,less_proper,complete,3,less_conv,inconv,problematic,recommended,spec_prior -great_pret,less_proper,complete,3,less_conv,inconv,problematic,priority,spec_prior -great_pret,less_proper,complete,3,less_conv,inconv,problematic,not_recom,not_recom -great_pret,less_proper,complete,3,critical,convenient,nonprob,recommended,priority -great_pret,less_proper,complete,3,critical,convenient,nonprob,priority,spec_prior -great_pret,less_proper,complete,3,critical,convenient,nonprob,not_recom,not_recom -great_pret,less_proper,complete,3,critical,convenient,slightly_prob,recommended,priority -great_pret,less_proper,complete,3,critical,convenient,slightly_prob,priority,spec_prior 
-great_pret,less_proper,complete,3,critical,convenient,slightly_prob,not_recom,not_recom -great_pret,less_proper,complete,3,critical,convenient,problematic,recommended,spec_prior -great_pret,less_proper,complete,3,critical,convenient,problematic,priority,spec_prior -great_pret,less_proper,complete,3,critical,convenient,problematic,not_recom,not_recom -great_pret,less_proper,complete,3,critical,inconv,nonprob,recommended,priority -great_pret,less_proper,complete,3,critical,inconv,nonprob,priority,spec_prior -great_pret,less_proper,complete,3,critical,inconv,nonprob,not_recom,not_recom -great_pret,less_proper,complete,3,critical,inconv,slightly_prob,recommended,priority -great_pret,less_proper,complete,3,critical,inconv,slightly_prob,priority,spec_prior -great_pret,less_proper,complete,3,critical,inconv,slightly_prob,not_recom,not_recom -great_pret,less_proper,complete,3,critical,inconv,problematic,recommended,spec_prior -great_pret,less_proper,complete,3,critical,inconv,problematic,priority,spec_prior -great_pret,less_proper,complete,3,critical,inconv,problematic,not_recom,not_recom -great_pret,less_proper,complete,more,convenient,convenient,nonprob,recommended,priority -great_pret,less_proper,complete,more,convenient,convenient,nonprob,priority,priority -great_pret,less_proper,complete,more,convenient,convenient,nonprob,not_recom,not_recom -great_pret,less_proper,complete,more,convenient,convenient,slightly_prob,recommended,priority -great_pret,less_proper,complete,more,convenient,convenient,slightly_prob,priority,priority -great_pret,less_proper,complete,more,convenient,convenient,slightly_prob,not_recom,not_recom -great_pret,less_proper,complete,more,convenient,convenient,problematic,recommended,priority -great_pret,less_proper,complete,more,convenient,convenient,problematic,priority,priority -great_pret,less_proper,complete,more,convenient,convenient,problematic,not_recom,not_recom 
-great_pret,less_proper,complete,more,convenient,inconv,nonprob,recommended,priority -great_pret,less_proper,complete,more,convenient,inconv,nonprob,priority,spec_prior -great_pret,less_proper,complete,more,convenient,inconv,nonprob,not_recom,not_recom -great_pret,less_proper,complete,more,convenient,inconv,slightly_prob,recommended,priority -great_pret,less_proper,complete,more,convenient,inconv,slightly_prob,priority,spec_prior -great_pret,less_proper,complete,more,convenient,inconv,slightly_prob,not_recom,not_recom -great_pret,less_proper,complete,more,convenient,inconv,problematic,recommended,spec_prior -great_pret,less_proper,complete,more,convenient,inconv,problematic,priority,spec_prior -great_pret,less_proper,complete,more,convenient,inconv,problematic,not_recom,not_recom -great_pret,less_proper,complete,more,less_conv,convenient,nonprob,recommended,priority -great_pret,less_proper,complete,more,less_conv,convenient,nonprob,priority,spec_prior -great_pret,less_proper,complete,more,less_conv,convenient,nonprob,not_recom,not_recom -great_pret,less_proper,complete,more,less_conv,convenient,slightly_prob,recommended,priority -great_pret,less_proper,complete,more,less_conv,convenient,slightly_prob,priority,spec_prior -great_pret,less_proper,complete,more,less_conv,convenient,slightly_prob,not_recom,not_recom -great_pret,less_proper,complete,more,less_conv,convenient,problematic,recommended,spec_prior -great_pret,less_proper,complete,more,less_conv,convenient,problematic,priority,spec_prior -great_pret,less_proper,complete,more,less_conv,convenient,problematic,not_recom,not_recom -great_pret,less_proper,complete,more,less_conv,inconv,nonprob,recommended,priority -great_pret,less_proper,complete,more,less_conv,inconv,nonprob,priority,spec_prior -great_pret,less_proper,complete,more,less_conv,inconv,nonprob,not_recom,not_recom -great_pret,less_proper,complete,more,less_conv,inconv,slightly_prob,recommended,priority 
-great_pret,less_proper,complete,more,less_conv,inconv,slightly_prob,priority,spec_prior -great_pret,less_proper,complete,more,less_conv,inconv,slightly_prob,not_recom,not_recom -great_pret,less_proper,complete,more,less_conv,inconv,problematic,recommended,spec_prior -great_pret,less_proper,complete,more,less_conv,inconv,problematic,priority,spec_prior -great_pret,less_proper,complete,more,less_conv,inconv,problematic,not_recom,not_recom -great_pret,less_proper,complete,more,critical,convenient,nonprob,recommended,priority -great_pret,less_proper,complete,more,critical,convenient,nonprob,priority,spec_prior -great_pret,less_proper,complete,more,critical,convenient,nonprob,not_recom,not_recom -great_pret,less_proper,complete,more,critical,convenient,slightly_prob,recommended,priority -great_pret,less_proper,complete,more,critical,convenient,slightly_prob,priority,spec_prior -great_pret,less_proper,complete,more,critical,convenient,slightly_prob,not_recom,not_recom -great_pret,less_proper,complete,more,critical,convenient,problematic,recommended,spec_prior -great_pret,less_proper,complete,more,critical,convenient,problematic,priority,spec_prior -great_pret,less_proper,complete,more,critical,convenient,problematic,not_recom,not_recom -great_pret,less_proper,complete,more,critical,inconv,nonprob,recommended,priority -great_pret,less_proper,complete,more,critical,inconv,nonprob,priority,spec_prior -great_pret,less_proper,complete,more,critical,inconv,nonprob,not_recom,not_recom -great_pret,less_proper,complete,more,critical,inconv,slightly_prob,recommended,priority -great_pret,less_proper,complete,more,critical,inconv,slightly_prob,priority,spec_prior -great_pret,less_proper,complete,more,critical,inconv,slightly_prob,not_recom,not_recom -great_pret,less_proper,complete,more,critical,inconv,problematic,recommended,spec_prior -great_pret,less_proper,complete,more,critical,inconv,problematic,priority,spec_prior 
-great_pret,less_proper,complete,more,critical,inconv,problematic,not_recom,not_recom -great_pret,less_proper,completed,1,convenient,convenient,nonprob,recommended,priority -great_pret,less_proper,completed,1,convenient,convenient,nonprob,priority,priority -great_pret,less_proper,completed,1,convenient,convenient,nonprob,not_recom,not_recom -great_pret,less_proper,completed,1,convenient,convenient,slightly_prob,recommended,priority -great_pret,less_proper,completed,1,convenient,convenient,slightly_prob,priority,priority -great_pret,less_proper,completed,1,convenient,convenient,slightly_prob,not_recom,not_recom -great_pret,less_proper,completed,1,convenient,convenient,problematic,recommended,priority -great_pret,less_proper,completed,1,convenient,convenient,problematic,priority,priority -great_pret,less_proper,completed,1,convenient,convenient,problematic,not_recom,not_recom -great_pret,less_proper,completed,1,convenient,inconv,nonprob,recommended,priority -great_pret,less_proper,completed,1,convenient,inconv,nonprob,priority,priority -great_pret,less_proper,completed,1,convenient,inconv,nonprob,not_recom,not_recom -great_pret,less_proper,completed,1,convenient,inconv,slightly_prob,recommended,priority -great_pret,less_proper,completed,1,convenient,inconv,slightly_prob,priority,priority -great_pret,less_proper,completed,1,convenient,inconv,slightly_prob,not_recom,not_recom -great_pret,less_proper,completed,1,convenient,inconv,problematic,recommended,priority -great_pret,less_proper,completed,1,convenient,inconv,problematic,priority,priority -great_pret,less_proper,completed,1,convenient,inconv,problematic,not_recom,not_recom -great_pret,less_proper,completed,1,less_conv,convenient,nonprob,recommended,priority -great_pret,less_proper,completed,1,less_conv,convenient,nonprob,priority,priority -great_pret,less_proper,completed,1,less_conv,convenient,nonprob,not_recom,not_recom -great_pret,less_proper,completed,1,less_conv,convenient,slightly_prob,recommended,priority 
-great_pret,less_proper,completed,1,less_conv,convenient,slightly_prob,priority,priority -great_pret,less_proper,completed,1,less_conv,convenient,slightly_prob,not_recom,not_recom -great_pret,less_proper,completed,1,less_conv,convenient,problematic,recommended,priority -great_pret,less_proper,completed,1,less_conv,convenient,problematic,priority,priority -great_pret,less_proper,completed,1,less_conv,convenient,problematic,not_recom,not_recom -great_pret,less_proper,completed,1,less_conv,inconv,nonprob,recommended,priority -great_pret,less_proper,completed,1,less_conv,inconv,nonprob,priority,priority -great_pret,less_proper,completed,1,less_conv,inconv,nonprob,not_recom,not_recom -great_pret,less_proper,completed,1,less_conv,inconv,slightly_prob,recommended,priority -great_pret,less_proper,completed,1,less_conv,inconv,slightly_prob,priority,priority -great_pret,less_proper,completed,1,less_conv,inconv,slightly_prob,not_recom,not_recom -great_pret,less_proper,completed,1,less_conv,inconv,problematic,recommended,priority -great_pret,less_proper,completed,1,less_conv,inconv,problematic,priority,priority -great_pret,less_proper,completed,1,less_conv,inconv,problematic,not_recom,not_recom -great_pret,less_proper,completed,1,critical,convenient,nonprob,recommended,priority -great_pret,less_proper,completed,1,critical,convenient,nonprob,priority,spec_prior -great_pret,less_proper,completed,1,critical,convenient,nonprob,not_recom,not_recom -great_pret,less_proper,completed,1,critical,convenient,slightly_prob,recommended,priority -great_pret,less_proper,completed,1,critical,convenient,slightly_prob,priority,spec_prior -great_pret,less_proper,completed,1,critical,convenient,slightly_prob,not_recom,not_recom -great_pret,less_proper,completed,1,critical,convenient,problematic,recommended,spec_prior -great_pret,less_proper,completed,1,critical,convenient,problematic,priority,spec_prior -great_pret,less_proper,completed,1,critical,convenient,problematic,not_recom,not_recom 
-great_pret,less_proper,completed,1,critical,inconv,nonprob,recommended,priority -great_pret,less_proper,completed,1,critical,inconv,nonprob,priority,spec_prior -great_pret,less_proper,completed,1,critical,inconv,nonprob,not_recom,not_recom -great_pret,less_proper,completed,1,critical,inconv,slightly_prob,recommended,priority -great_pret,less_proper,completed,1,critical,inconv,slightly_prob,priority,spec_prior -great_pret,less_proper,completed,1,critical,inconv,slightly_prob,not_recom,not_recom -great_pret,less_proper,completed,1,critical,inconv,problematic,recommended,spec_prior -great_pret,less_proper,completed,1,critical,inconv,problematic,priority,spec_prior -great_pret,less_proper,completed,1,critical,inconv,problematic,not_recom,not_recom -great_pret,less_proper,completed,2,convenient,convenient,nonprob,recommended,priority -great_pret,less_proper,completed,2,convenient,convenient,nonprob,priority,priority -great_pret,less_proper,completed,2,convenient,convenient,nonprob,not_recom,not_recom -great_pret,less_proper,completed,2,convenient,convenient,slightly_prob,recommended,priority -great_pret,less_proper,completed,2,convenient,convenient,slightly_prob,priority,priority -great_pret,less_proper,completed,2,convenient,convenient,slightly_prob,not_recom,not_recom -great_pret,less_proper,completed,2,convenient,convenient,problematic,recommended,priority -great_pret,less_proper,completed,2,convenient,convenient,problematic,priority,priority -great_pret,less_proper,completed,2,convenient,convenient,problematic,not_recom,not_recom -great_pret,less_proper,completed,2,convenient,inconv,nonprob,recommended,priority -great_pret,less_proper,completed,2,convenient,inconv,nonprob,priority,priority -great_pret,less_proper,completed,2,convenient,inconv,nonprob,not_recom,not_recom -great_pret,less_proper,completed,2,convenient,inconv,slightly_prob,recommended,priority -great_pret,less_proper,completed,2,convenient,inconv,slightly_prob,priority,priority 
-great_pret,less_proper,completed,2,convenient,inconv,slightly_prob,not_recom,not_recom -great_pret,less_proper,completed,2,convenient,inconv,problematic,recommended,priority -great_pret,less_proper,completed,2,convenient,inconv,problematic,priority,priority -great_pret,less_proper,completed,2,convenient,inconv,problematic,not_recom,not_recom -great_pret,less_proper,completed,2,less_conv,convenient,nonprob,recommended,priority -great_pret,less_proper,completed,2,less_conv,convenient,nonprob,priority,priority -great_pret,less_proper,completed,2,less_conv,convenient,nonprob,not_recom,not_recom -great_pret,less_proper,completed,2,less_conv,convenient,slightly_prob,recommended,priority -great_pret,less_proper,completed,2,less_conv,convenient,slightly_prob,priority,priority -great_pret,less_proper,completed,2,less_conv,convenient,slightly_prob,not_recom,not_recom -great_pret,less_proper,completed,2,less_conv,convenient,problematic,recommended,priority -great_pret,less_proper,completed,2,less_conv,convenient,problematic,priority,priority -great_pret,less_proper,completed,2,less_conv,convenient,problematic,not_recom,not_recom -great_pret,less_proper,completed,2,less_conv,inconv,nonprob,recommended,priority -great_pret,less_proper,completed,2,less_conv,inconv,nonprob,priority,priority -great_pret,less_proper,completed,2,less_conv,inconv,nonprob,not_recom,not_recom -great_pret,less_proper,completed,2,less_conv,inconv,slightly_prob,recommended,priority -great_pret,less_proper,completed,2,less_conv,inconv,slightly_prob,priority,priority -great_pret,less_proper,completed,2,less_conv,inconv,slightly_prob,not_recom,not_recom -great_pret,less_proper,completed,2,less_conv,inconv,problematic,recommended,priority -great_pret,less_proper,completed,2,less_conv,inconv,problematic,priority,priority -great_pret,less_proper,completed,2,less_conv,inconv,problematic,not_recom,not_recom -great_pret,less_proper,completed,2,critical,convenient,nonprob,recommended,priority 
-great_pret,less_proper,completed,2,critical,convenient,nonprob,priority,spec_prior -great_pret,less_proper,completed,2,critical,convenient,nonprob,not_recom,not_recom -great_pret,less_proper,completed,2,critical,convenient,slightly_prob,recommended,priority -great_pret,less_proper,completed,2,critical,convenient,slightly_prob,priority,spec_prior -great_pret,less_proper,completed,2,critical,convenient,slightly_prob,not_recom,not_recom -great_pret,less_proper,completed,2,critical,convenient,problematic,recommended,spec_prior -great_pret,less_proper,completed,2,critical,convenient,problematic,priority,spec_prior -great_pret,less_proper,completed,2,critical,convenient,problematic,not_recom,not_recom -great_pret,less_proper,completed,2,critical,inconv,nonprob,recommended,priority -great_pret,less_proper,completed,2,critical,inconv,nonprob,priority,spec_prior -great_pret,less_proper,completed,2,critical,inconv,nonprob,not_recom,not_recom -great_pret,less_proper,completed,2,critical,inconv,slightly_prob,recommended,priority -great_pret,less_proper,completed,2,critical,inconv,slightly_prob,priority,spec_prior -great_pret,less_proper,completed,2,critical,inconv,slightly_prob,not_recom,not_recom -great_pret,less_proper,completed,2,critical,inconv,problematic,recommended,spec_prior -great_pret,less_proper,completed,2,critical,inconv,problematic,priority,spec_prior -great_pret,less_proper,completed,2,critical,inconv,problematic,not_recom,not_recom -great_pret,less_proper,completed,3,convenient,convenient,nonprob,recommended,priority -great_pret,less_proper,completed,3,convenient,convenient,nonprob,priority,priority -great_pret,less_proper,completed,3,convenient,convenient,nonprob,not_recom,not_recom -great_pret,less_proper,completed,3,convenient,convenient,slightly_prob,recommended,priority -great_pret,less_proper,completed,3,convenient,convenient,slightly_prob,priority,priority -great_pret,less_proper,completed,3,convenient,convenient,slightly_prob,not_recom,not_recom 
-great_pret,less_proper,completed,3,convenient,convenient,problematic,recommended,priority -great_pret,less_proper,completed,3,convenient,convenient,problematic,priority,priority -great_pret,less_proper,completed,3,convenient,convenient,problematic,not_recom,not_recom -great_pret,less_proper,completed,3,convenient,inconv,nonprob,recommended,priority -great_pret,less_proper,completed,3,convenient,inconv,nonprob,priority,spec_prior -great_pret,less_proper,completed,3,convenient,inconv,nonprob,not_recom,not_recom -great_pret,less_proper,completed,3,convenient,inconv,slightly_prob,recommended,priority -great_pret,less_proper,completed,3,convenient,inconv,slightly_prob,priority,spec_prior -great_pret,less_proper,completed,3,convenient,inconv,slightly_prob,not_recom,not_recom -great_pret,less_proper,completed,3,convenient,inconv,problematic,recommended,spec_prior -great_pret,less_proper,completed,3,convenient,inconv,problematic,priority,spec_prior -great_pret,less_proper,completed,3,convenient,inconv,problematic,not_recom,not_recom -great_pret,less_proper,completed,3,less_conv,convenient,nonprob,recommended,priority -great_pret,less_proper,completed,3,less_conv,convenient,nonprob,priority,spec_prior -great_pret,less_proper,completed,3,less_conv,convenient,nonprob,not_recom,not_recom -great_pret,less_proper,completed,3,less_conv,convenient,slightly_prob,recommended,priority -great_pret,less_proper,completed,3,less_conv,convenient,slightly_prob,priority,spec_prior -great_pret,less_proper,completed,3,less_conv,convenient,slightly_prob,not_recom,not_recom -great_pret,less_proper,completed,3,less_conv,convenient,problematic,recommended,spec_prior -great_pret,less_proper,completed,3,less_conv,convenient,problematic,priority,spec_prior -great_pret,less_proper,completed,3,less_conv,convenient,problematic,not_recom,not_recom -great_pret,less_proper,completed,3,less_conv,inconv,nonprob,recommended,priority 
-great_pret,less_proper,completed,3,less_conv,inconv,nonprob,priority,spec_prior -great_pret,less_proper,completed,3,less_conv,inconv,nonprob,not_recom,not_recom -great_pret,less_proper,completed,3,less_conv,inconv,slightly_prob,recommended,priority -great_pret,less_proper,completed,3,less_conv,inconv,slightly_prob,priority,spec_prior -great_pret,less_proper,completed,3,less_conv,inconv,slightly_prob,not_recom,not_recom -great_pret,less_proper,completed,3,less_conv,inconv,problematic,recommended,spec_prior -great_pret,less_proper,completed,3,less_conv,inconv,problematic,priority,spec_prior -great_pret,less_proper,completed,3,less_conv,inconv,problematic,not_recom,not_recom -great_pret,less_proper,completed,3,critical,convenient,nonprob,recommended,priority -great_pret,less_proper,completed,3,critical,convenient,nonprob,priority,spec_prior -great_pret,less_proper,completed,3,critical,convenient,nonprob,not_recom,not_recom -great_pret,less_proper,completed,3,critical,convenient,slightly_prob,recommended,priority -great_pret,less_proper,completed,3,critical,convenient,slightly_prob,priority,spec_prior -great_pret,less_proper,completed,3,critical,convenient,slightly_prob,not_recom,not_recom -great_pret,less_proper,completed,3,critical,convenient,problematic,recommended,spec_prior -great_pret,less_proper,completed,3,critical,convenient,problematic,priority,spec_prior -great_pret,less_proper,completed,3,critical,convenient,problematic,not_recom,not_recom -great_pret,less_proper,completed,3,critical,inconv,nonprob,recommended,priority -great_pret,less_proper,completed,3,critical,inconv,nonprob,priority,spec_prior -great_pret,less_proper,completed,3,critical,inconv,nonprob,not_recom,not_recom -great_pret,less_proper,completed,3,critical,inconv,slightly_prob,recommended,priority -great_pret,less_proper,completed,3,critical,inconv,slightly_prob,priority,spec_prior -great_pret,less_proper,completed,3,critical,inconv,slightly_prob,not_recom,not_recom 
-great_pret,less_proper,completed,3,critical,inconv,problematic,recommended,spec_prior -great_pret,less_proper,completed,3,critical,inconv,problematic,priority,spec_prior -great_pret,less_proper,completed,3,critical,inconv,problematic,not_recom,not_recom -great_pret,less_proper,completed,more,convenient,convenient,nonprob,recommended,priority -great_pret,less_proper,completed,more,convenient,convenient,nonprob,priority,priority -great_pret,less_proper,completed,more,convenient,convenient,nonprob,not_recom,not_recom -great_pret,less_proper,completed,more,convenient,convenient,slightly_prob,recommended,priority -great_pret,less_proper,completed,more,convenient,convenient,slightly_prob,priority,priority -great_pret,less_proper,completed,more,convenient,convenient,slightly_prob,not_recom,not_recom -great_pret,less_proper,completed,more,convenient,convenient,problematic,recommended,priority -great_pret,less_proper,completed,more,convenient,convenient,problematic,priority,priority -great_pret,less_proper,completed,more,convenient,convenient,problematic,not_recom,not_recom -great_pret,less_proper,completed,more,convenient,inconv,nonprob,recommended,priority -great_pret,less_proper,completed,more,convenient,inconv,nonprob,priority,spec_prior -great_pret,less_proper,completed,more,convenient,inconv,nonprob,not_recom,not_recom -great_pret,less_proper,completed,more,convenient,inconv,slightly_prob,recommended,priority -great_pret,less_proper,completed,more,convenient,inconv,slightly_prob,priority,spec_prior -great_pret,less_proper,completed,more,convenient,inconv,slightly_prob,not_recom,not_recom -great_pret,less_proper,completed,more,convenient,inconv,problematic,recommended,spec_prior -great_pret,less_proper,completed,more,convenient,inconv,problematic,priority,spec_prior -great_pret,less_proper,completed,more,convenient,inconv,problematic,not_recom,not_recom -great_pret,less_proper,completed,more,less_conv,convenient,nonprob,recommended,priority 
-great_pret,less_proper,completed,more,less_conv,convenient,nonprob,priority,spec_prior -great_pret,less_proper,completed,more,less_conv,convenient,nonprob,not_recom,not_recom -great_pret,less_proper,completed,more,less_conv,convenient,slightly_prob,recommended,priority -great_pret,less_proper,completed,more,less_conv,convenient,slightly_prob,priority,spec_prior -great_pret,less_proper,completed,more,less_conv,convenient,slightly_prob,not_recom,not_recom -great_pret,less_proper,completed,more,less_conv,convenient,problematic,recommended,spec_prior -great_pret,less_proper,completed,more,less_conv,convenient,problematic,priority,spec_prior -great_pret,less_proper,completed,more,less_conv,convenient,problematic,not_recom,not_recom -great_pret,less_proper,completed,more,less_conv,inconv,nonprob,recommended,priority -great_pret,less_proper,completed,more,less_conv,inconv,nonprob,priority,spec_prior -great_pret,less_proper,completed,more,less_conv,inconv,nonprob,not_recom,not_recom -great_pret,less_proper,completed,more,less_conv,inconv,slightly_prob,recommended,priority -great_pret,less_proper,completed,more,less_conv,inconv,slightly_prob,priority,spec_prior -great_pret,less_proper,completed,more,less_conv,inconv,slightly_prob,not_recom,not_recom -great_pret,less_proper,completed,more,less_conv,inconv,problematic,recommended,spec_prior -great_pret,less_proper,completed,more,less_conv,inconv,problematic,priority,spec_prior -great_pret,less_proper,completed,more,less_conv,inconv,problematic,not_recom,not_recom -great_pret,less_proper,completed,more,critical,convenient,nonprob,recommended,priority -great_pret,less_proper,completed,more,critical,convenient,nonprob,priority,spec_prior -great_pret,less_proper,completed,more,critical,convenient,nonprob,not_recom,not_recom -great_pret,less_proper,completed,more,critical,convenient,slightly_prob,recommended,priority -great_pret,less_proper,completed,more,critical,convenient,slightly_prob,priority,spec_prior 
-great_pret,less_proper,completed,more,critical,convenient,slightly_prob,not_recom,not_recom -great_pret,less_proper,completed,more,critical,convenient,problematic,recommended,spec_prior -great_pret,less_proper,completed,more,critical,convenient,problematic,priority,spec_prior -great_pret,less_proper,completed,more,critical,convenient,problematic,not_recom,not_recom -great_pret,less_proper,completed,more,critical,inconv,nonprob,recommended,priority -great_pret,less_proper,completed,more,critical,inconv,nonprob,priority,spec_prior -great_pret,less_proper,completed,more,critical,inconv,nonprob,not_recom,not_recom -great_pret,less_proper,completed,more,critical,inconv,slightly_prob,recommended,priority -great_pret,less_proper,completed,more,critical,inconv,slightly_prob,priority,spec_prior -great_pret,less_proper,completed,more,critical,inconv,slightly_prob,not_recom,not_recom -great_pret,less_proper,completed,more,critical,inconv,problematic,recommended,spec_prior -great_pret,less_proper,completed,more,critical,inconv,problematic,priority,spec_prior -great_pret,less_proper,completed,more,critical,inconv,problematic,not_recom,not_recom -great_pret,less_proper,incomplete,1,convenient,convenient,nonprob,recommended,priority -great_pret,less_proper,incomplete,1,convenient,convenient,nonprob,priority,priority -great_pret,less_proper,incomplete,1,convenient,convenient,nonprob,not_recom,not_recom -great_pret,less_proper,incomplete,1,convenient,convenient,slightly_prob,recommended,priority -great_pret,less_proper,incomplete,1,convenient,convenient,slightly_prob,priority,priority -great_pret,less_proper,incomplete,1,convenient,convenient,slightly_prob,not_recom,not_recom -great_pret,less_proper,incomplete,1,convenient,convenient,problematic,recommended,priority -great_pret,less_proper,incomplete,1,convenient,convenient,problematic,priority,priority -great_pret,less_proper,incomplete,1,convenient,convenient,problematic,not_recom,not_recom 
-great_pret,less_proper,incomplete,1,convenient,inconv,nonprob,recommended,priority -great_pret,less_proper,incomplete,1,convenient,inconv,nonprob,priority,priority -great_pret,less_proper,incomplete,1,convenient,inconv,nonprob,not_recom,not_recom -great_pret,less_proper,incomplete,1,convenient,inconv,slightly_prob,recommended,priority -great_pret,less_proper,incomplete,1,convenient,inconv,slightly_prob,priority,priority -great_pret,less_proper,incomplete,1,convenient,inconv,slightly_prob,not_recom,not_recom -great_pret,less_proper,incomplete,1,convenient,inconv,problematic,recommended,priority -great_pret,less_proper,incomplete,1,convenient,inconv,problematic,priority,priority -great_pret,less_proper,incomplete,1,convenient,inconv,problematic,not_recom,not_recom -great_pret,less_proper,incomplete,1,less_conv,convenient,nonprob,recommended,priority -great_pret,less_proper,incomplete,1,less_conv,convenient,nonprob,priority,priority -great_pret,less_proper,incomplete,1,less_conv,convenient,nonprob,not_recom,not_recom -great_pret,less_proper,incomplete,1,less_conv,convenient,slightly_prob,recommended,priority -great_pret,less_proper,incomplete,1,less_conv,convenient,slightly_prob,priority,priority -great_pret,less_proper,incomplete,1,less_conv,convenient,slightly_prob,not_recom,not_recom -great_pret,less_proper,incomplete,1,less_conv,convenient,problematic,recommended,priority -great_pret,less_proper,incomplete,1,less_conv,convenient,problematic,priority,priority -great_pret,less_proper,incomplete,1,less_conv,convenient,problematic,not_recom,not_recom -great_pret,less_proper,incomplete,1,less_conv,inconv,nonprob,recommended,priority -great_pret,less_proper,incomplete,1,less_conv,inconv,nonprob,priority,priority -great_pret,less_proper,incomplete,1,less_conv,inconv,nonprob,not_recom,not_recom -great_pret,less_proper,incomplete,1,less_conv,inconv,slightly_prob,recommended,priority -great_pret,less_proper,incomplete,1,less_conv,inconv,slightly_prob,priority,priority 
-great_pret,less_proper,incomplete,1,less_conv,inconv,slightly_prob,not_recom,not_recom -great_pret,less_proper,incomplete,1,less_conv,inconv,problematic,recommended,priority -great_pret,less_proper,incomplete,1,less_conv,inconv,problematic,priority,priority -great_pret,less_proper,incomplete,1,less_conv,inconv,problematic,not_recom,not_recom -great_pret,less_proper,incomplete,1,critical,convenient,nonprob,recommended,priority -great_pret,less_proper,incomplete,1,critical,convenient,nonprob,priority,spec_prior -great_pret,less_proper,incomplete,1,critical,convenient,nonprob,not_recom,not_recom -great_pret,less_proper,incomplete,1,critical,convenient,slightly_prob,recommended,priority -great_pret,less_proper,incomplete,1,critical,convenient,slightly_prob,priority,spec_prior -great_pret,less_proper,incomplete,1,critical,convenient,slightly_prob,not_recom,not_recom -great_pret,less_proper,incomplete,1,critical,convenient,problematic,recommended,spec_prior -great_pret,less_proper,incomplete,1,critical,convenient,problematic,priority,spec_prior -great_pret,less_proper,incomplete,1,critical,convenient,problematic,not_recom,not_recom -great_pret,less_proper,incomplete,1,critical,inconv,nonprob,recommended,priority -great_pret,less_proper,incomplete,1,critical,inconv,nonprob,priority,spec_prior -great_pret,less_proper,incomplete,1,critical,inconv,nonprob,not_recom,not_recom -great_pret,less_proper,incomplete,1,critical,inconv,slightly_prob,recommended,priority -great_pret,less_proper,incomplete,1,critical,inconv,slightly_prob,priority,spec_prior -great_pret,less_proper,incomplete,1,critical,inconv,slightly_prob,not_recom,not_recom -great_pret,less_proper,incomplete,1,critical,inconv,problematic,recommended,spec_prior -great_pret,less_proper,incomplete,1,critical,inconv,problematic,priority,spec_prior -great_pret,less_proper,incomplete,1,critical,inconv,problematic,not_recom,not_recom -great_pret,less_proper,incomplete,2,convenient,convenient,nonprob,recommended,priority 
-great_pret,less_proper,incomplete,2,convenient,convenient,nonprob,priority,priority -great_pret,less_proper,incomplete,2,convenient,convenient,nonprob,not_recom,not_recom -great_pret,less_proper,incomplete,2,convenient,convenient,slightly_prob,recommended,priority -great_pret,less_proper,incomplete,2,convenient,convenient,slightly_prob,priority,priority -great_pret,less_proper,incomplete,2,convenient,convenient,slightly_prob,not_recom,not_recom -great_pret,less_proper,incomplete,2,convenient,convenient,problematic,recommended,priority -great_pret,less_proper,incomplete,2,convenient,convenient,problematic,priority,priority -great_pret,less_proper,incomplete,2,convenient,convenient,problematic,not_recom,not_recom -great_pret,less_proper,incomplete,2,convenient,inconv,nonprob,recommended,priority -great_pret,less_proper,incomplete,2,convenient,inconv,nonprob,priority,spec_prior -great_pret,less_proper,incomplete,2,convenient,inconv,nonprob,not_recom,not_recom -great_pret,less_proper,incomplete,2,convenient,inconv,slightly_prob,recommended,priority -great_pret,less_proper,incomplete,2,convenient,inconv,slightly_prob,priority,spec_prior -great_pret,less_proper,incomplete,2,convenient,inconv,slightly_prob,not_recom,not_recom -great_pret,less_proper,incomplete,2,convenient,inconv,problematic,recommended,spec_prior -great_pret,less_proper,incomplete,2,convenient,inconv,problematic,priority,spec_prior -great_pret,less_proper,incomplete,2,convenient,inconv,problematic,not_recom,not_recom -great_pret,less_proper,incomplete,2,less_conv,convenient,nonprob,recommended,priority -great_pret,less_proper,incomplete,2,less_conv,convenient,nonprob,priority,spec_prior -great_pret,less_proper,incomplete,2,less_conv,convenient,nonprob,not_recom,not_recom -great_pret,less_proper,incomplete,2,less_conv,convenient,slightly_prob,recommended,priority -great_pret,less_proper,incomplete,2,less_conv,convenient,slightly_prob,priority,spec_prior 
-great_pret,less_proper,incomplete,2,less_conv,convenient,slightly_prob,not_recom,not_recom -great_pret,less_proper,incomplete,2,less_conv,convenient,problematic,recommended,spec_prior -great_pret,less_proper,incomplete,2,less_conv,convenient,problematic,priority,spec_prior -great_pret,less_proper,incomplete,2,less_conv,convenient,problematic,not_recom,not_recom -great_pret,less_proper,incomplete,2,less_conv,inconv,nonprob,recommended,priority -great_pret,less_proper,incomplete,2,less_conv,inconv,nonprob,priority,spec_prior -great_pret,less_proper,incomplete,2,less_conv,inconv,nonprob,not_recom,not_recom -great_pret,less_proper,incomplete,2,less_conv,inconv,slightly_prob,recommended,priority -great_pret,less_proper,incomplete,2,less_conv,inconv,slightly_prob,priority,spec_prior -great_pret,less_proper,incomplete,2,less_conv,inconv,slightly_prob,not_recom,not_recom -great_pret,less_proper,incomplete,2,less_conv,inconv,problematic,recommended,spec_prior -great_pret,less_proper,incomplete,2,less_conv,inconv,problematic,priority,spec_prior -great_pret,less_proper,incomplete,2,less_conv,inconv,problematic,not_recom,not_recom -great_pret,less_proper,incomplete,2,critical,convenient,nonprob,recommended,priority -great_pret,less_proper,incomplete,2,critical,convenient,nonprob,priority,spec_prior -great_pret,less_proper,incomplete,2,critical,convenient,nonprob,not_recom,not_recom -great_pret,less_proper,incomplete,2,critical,convenient,slightly_prob,recommended,priority -great_pret,less_proper,incomplete,2,critical,convenient,slightly_prob,priority,spec_prior -great_pret,less_proper,incomplete,2,critical,convenient,slightly_prob,not_recom,not_recom -great_pret,less_proper,incomplete,2,critical,convenient,problematic,recommended,spec_prior -great_pret,less_proper,incomplete,2,critical,convenient,problematic,priority,spec_prior -great_pret,less_proper,incomplete,2,critical,convenient,problematic,not_recom,not_recom 
-great_pret,less_proper,incomplete,2,critical,inconv,nonprob,recommended,priority -great_pret,less_proper,incomplete,2,critical,inconv,nonprob,priority,spec_prior -great_pret,less_proper,incomplete,2,critical,inconv,nonprob,not_recom,not_recom -great_pret,less_proper,incomplete,2,critical,inconv,slightly_prob,recommended,priority -great_pret,less_proper,incomplete,2,critical,inconv,slightly_prob,priority,spec_prior -great_pret,less_proper,incomplete,2,critical,inconv,slightly_prob,not_recom,not_recom -great_pret,less_proper,incomplete,2,critical,inconv,problematic,recommended,spec_prior -great_pret,less_proper,incomplete,2,critical,inconv,problematic,priority,spec_prior -great_pret,less_proper,incomplete,2,critical,inconv,problematic,not_recom,not_recom -great_pret,less_proper,incomplete,3,convenient,convenient,nonprob,recommended,priority -great_pret,less_proper,incomplete,3,convenient,convenient,nonprob,priority,priority -great_pret,less_proper,incomplete,3,convenient,convenient,nonprob,not_recom,not_recom -great_pret,less_proper,incomplete,3,convenient,convenient,slightly_prob,recommended,priority -great_pret,less_proper,incomplete,3,convenient,convenient,slightly_prob,priority,priority -great_pret,less_proper,incomplete,3,convenient,convenient,slightly_prob,not_recom,not_recom -great_pret,less_proper,incomplete,3,convenient,convenient,problematic,recommended,priority -great_pret,less_proper,incomplete,3,convenient,convenient,problematic,priority,priority -great_pret,less_proper,incomplete,3,convenient,convenient,problematic,not_recom,not_recom -great_pret,less_proper,incomplete,3,convenient,inconv,nonprob,recommended,priority -great_pret,less_proper,incomplete,3,convenient,inconv,nonprob,priority,spec_prior -great_pret,less_proper,incomplete,3,convenient,inconv,nonprob,not_recom,not_recom -great_pret,less_proper,incomplete,3,convenient,inconv,slightly_prob,recommended,priority 
-great_pret,less_proper,incomplete,3,convenient,inconv,slightly_prob,priority,spec_prior -great_pret,less_proper,incomplete,3,convenient,inconv,slightly_prob,not_recom,not_recom -great_pret,less_proper,incomplete,3,convenient,inconv,problematic,recommended,spec_prior -great_pret,less_proper,incomplete,3,convenient,inconv,problematic,priority,spec_prior -great_pret,less_proper,incomplete,3,convenient,inconv,problematic,not_recom,not_recom -great_pret,less_proper,incomplete,3,less_conv,convenient,nonprob,recommended,priority -great_pret,less_proper,incomplete,3,less_conv,convenient,nonprob,priority,spec_prior -great_pret,less_proper,incomplete,3,less_conv,convenient,nonprob,not_recom,not_recom -great_pret,less_proper,incomplete,3,less_conv,convenient,slightly_prob,recommended,priority -great_pret,less_proper,incomplete,3,less_conv,convenient,slightly_prob,priority,spec_prior -great_pret,less_proper,incomplete,3,less_conv,convenient,slightly_prob,not_recom,not_recom -great_pret,less_proper,incomplete,3,less_conv,convenient,problematic,recommended,spec_prior -great_pret,less_proper,incomplete,3,less_conv,convenient,problematic,priority,spec_prior -great_pret,less_proper,incomplete,3,less_conv,convenient,problematic,not_recom,not_recom -great_pret,less_proper,incomplete,3,less_conv,inconv,nonprob,recommended,priority -great_pret,less_proper,incomplete,3,less_conv,inconv,nonprob,priority,spec_prior -great_pret,less_proper,incomplete,3,less_conv,inconv,nonprob,not_recom,not_recom -great_pret,less_proper,incomplete,3,less_conv,inconv,slightly_prob,recommended,priority -great_pret,less_proper,incomplete,3,less_conv,inconv,slightly_prob,priority,spec_prior -great_pret,less_proper,incomplete,3,less_conv,inconv,slightly_prob,not_recom,not_recom -great_pret,less_proper,incomplete,3,less_conv,inconv,problematic,recommended,spec_prior -great_pret,less_proper,incomplete,3,less_conv,inconv,problematic,priority,spec_prior 
-great_pret,less_proper,incomplete,3,less_conv,inconv,problematic,not_recom,not_recom -great_pret,less_proper,incomplete,3,critical,convenient,nonprob,recommended,priority -great_pret,less_proper,incomplete,3,critical,convenient,nonprob,priority,spec_prior -great_pret,less_proper,incomplete,3,critical,convenient,nonprob,not_recom,not_recom -great_pret,less_proper,incomplete,3,critical,convenient,slightly_prob,recommended,priority -great_pret,less_proper,incomplete,3,critical,convenient,slightly_prob,priority,spec_prior -great_pret,less_proper,incomplete,3,critical,convenient,slightly_prob,not_recom,not_recom -great_pret,less_proper,incomplete,3,critical,convenient,problematic,recommended,spec_prior -great_pret,less_proper,incomplete,3,critical,convenient,problematic,priority,spec_prior -great_pret,less_proper,incomplete,3,critical,convenient,problematic,not_recom,not_recom -great_pret,less_proper,incomplete,3,critical,inconv,nonprob,recommended,priority -great_pret,less_proper,incomplete,3,critical,inconv,nonprob,priority,spec_prior -great_pret,less_proper,incomplete,3,critical,inconv,nonprob,not_recom,not_recom -great_pret,less_proper,incomplete,3,critical,inconv,slightly_prob,recommended,priority -great_pret,less_proper,incomplete,3,critical,inconv,slightly_prob,priority,spec_prior -great_pret,less_proper,incomplete,3,critical,inconv,slightly_prob,not_recom,not_recom -great_pret,less_proper,incomplete,3,critical,inconv,problematic,recommended,spec_prior -great_pret,less_proper,incomplete,3,critical,inconv,problematic,priority,spec_prior -great_pret,less_proper,incomplete,3,critical,inconv,problematic,not_recom,not_recom -great_pret,less_proper,incomplete,more,convenient,convenient,nonprob,recommended,priority -great_pret,less_proper,incomplete,more,convenient,convenient,nonprob,priority,priority -great_pret,less_proper,incomplete,more,convenient,convenient,nonprob,not_recom,not_recom 
-great_pret,less_proper,incomplete,more,convenient,convenient,slightly_prob,recommended,priority -great_pret,less_proper,incomplete,more,convenient,convenient,slightly_prob,priority,priority -great_pret,less_proper,incomplete,more,convenient,convenient,slightly_prob,not_recom,not_recom -great_pret,less_proper,incomplete,more,convenient,convenient,problematic,recommended,priority -great_pret,less_proper,incomplete,more,convenient,convenient,problematic,priority,priority -great_pret,less_proper,incomplete,more,convenient,convenient,problematic,not_recom,not_recom -great_pret,less_proper,incomplete,more,convenient,inconv,nonprob,recommended,priority -great_pret,less_proper,incomplete,more,convenient,inconv,nonprob,priority,spec_prior -great_pret,less_proper,incomplete,more,convenient,inconv,nonprob,not_recom,not_recom -great_pret,less_proper,incomplete,more,convenient,inconv,slightly_prob,recommended,priority -great_pret,less_proper,incomplete,more,convenient,inconv,slightly_prob,priority,spec_prior -great_pret,less_proper,incomplete,more,convenient,inconv,slightly_prob,not_recom,not_recom -great_pret,less_proper,incomplete,more,convenient,inconv,problematic,recommended,spec_prior -great_pret,less_proper,incomplete,more,convenient,inconv,problematic,priority,spec_prior -great_pret,less_proper,incomplete,more,convenient,inconv,problematic,not_recom,not_recom -great_pret,less_proper,incomplete,more,less_conv,convenient,nonprob,recommended,priority -great_pret,less_proper,incomplete,more,less_conv,convenient,nonprob,priority,spec_prior -great_pret,less_proper,incomplete,more,less_conv,convenient,nonprob,not_recom,not_recom -great_pret,less_proper,incomplete,more,less_conv,convenient,slightly_prob,recommended,priority -great_pret,less_proper,incomplete,more,less_conv,convenient,slightly_prob,priority,spec_prior -great_pret,less_proper,incomplete,more,less_conv,convenient,slightly_prob,not_recom,not_recom 
-great_pret,less_proper,incomplete,more,less_conv,convenient,problematic,recommended,spec_prior -great_pret,less_proper,incomplete,more,less_conv,convenient,problematic,priority,spec_prior -great_pret,less_proper,incomplete,more,less_conv,convenient,problematic,not_recom,not_recom -great_pret,less_proper,incomplete,more,less_conv,inconv,nonprob,recommended,priority -great_pret,less_proper,incomplete,more,less_conv,inconv,nonprob,priority,spec_prior -great_pret,less_proper,incomplete,more,less_conv,inconv,nonprob,not_recom,not_recom -great_pret,less_proper,incomplete,more,less_conv,inconv,slightly_prob,recommended,priority -great_pret,less_proper,incomplete,more,less_conv,inconv,slightly_prob,priority,spec_prior -great_pret,less_proper,incomplete,more,less_conv,inconv,slightly_prob,not_recom,not_recom -great_pret,less_proper,incomplete,more,less_conv,inconv,problematic,recommended,spec_prior -great_pret,less_proper,incomplete,more,less_conv,inconv,problematic,priority,spec_prior -great_pret,less_proper,incomplete,more,less_conv,inconv,problematic,not_recom,not_recom -great_pret,less_proper,incomplete,more,critical,convenient,nonprob,recommended,priority -great_pret,less_proper,incomplete,more,critical,convenient,nonprob,priority,spec_prior -great_pret,less_proper,incomplete,more,critical,convenient,nonprob,not_recom,not_recom -great_pret,less_proper,incomplete,more,critical,convenient,slightly_prob,recommended,priority -great_pret,less_proper,incomplete,more,critical,convenient,slightly_prob,priority,spec_prior -great_pret,less_proper,incomplete,more,critical,convenient,slightly_prob,not_recom,not_recom -great_pret,less_proper,incomplete,more,critical,convenient,problematic,recommended,spec_prior -great_pret,less_proper,incomplete,more,critical,convenient,problematic,priority,spec_prior -great_pret,less_proper,incomplete,more,critical,convenient,problematic,not_recom,not_recom -great_pret,less_proper,incomplete,more,critical,inconv,nonprob,recommended,priority 
-great_pret,less_proper,incomplete,more,critical,inconv,nonprob,priority,spec_prior -great_pret,less_proper,incomplete,more,critical,inconv,nonprob,not_recom,not_recom -great_pret,less_proper,incomplete,more,critical,inconv,slightly_prob,recommended,priority -great_pret,less_proper,incomplete,more,critical,inconv,slightly_prob,priority,spec_prior -great_pret,less_proper,incomplete,more,critical,inconv,slightly_prob,not_recom,not_recom -great_pret,less_proper,incomplete,more,critical,inconv,problematic,recommended,spec_prior -great_pret,less_proper,incomplete,more,critical,inconv,problematic,priority,spec_prior -great_pret,less_proper,incomplete,more,critical,inconv,problematic,not_recom,not_recom -great_pret,less_proper,foster,1,convenient,convenient,nonprob,recommended,priority -great_pret,less_proper,foster,1,convenient,convenient,nonprob,priority,priority -great_pret,less_proper,foster,1,convenient,convenient,nonprob,not_recom,not_recom -great_pret,less_proper,foster,1,convenient,convenient,slightly_prob,recommended,priority -great_pret,less_proper,foster,1,convenient,convenient,slightly_prob,priority,priority -great_pret,less_proper,foster,1,convenient,convenient,slightly_prob,not_recom,not_recom -great_pret,less_proper,foster,1,convenient,convenient,problematic,recommended,priority -great_pret,less_proper,foster,1,convenient,convenient,problematic,priority,priority -great_pret,less_proper,foster,1,convenient,convenient,problematic,not_recom,not_recom -great_pret,less_proper,foster,1,convenient,inconv,nonprob,recommended,priority -great_pret,less_proper,foster,1,convenient,inconv,nonprob,priority,spec_prior -great_pret,less_proper,foster,1,convenient,inconv,nonprob,not_recom,not_recom -great_pret,less_proper,foster,1,convenient,inconv,slightly_prob,recommended,priority -great_pret,less_proper,foster,1,convenient,inconv,slightly_prob,priority,spec_prior -great_pret,less_proper,foster,1,convenient,inconv,slightly_prob,not_recom,not_recom 
-great_pret,less_proper,foster,1,convenient,inconv,problematic,recommended,spec_prior -great_pret,less_proper,foster,1,convenient,inconv,problematic,priority,spec_prior -great_pret,less_proper,foster,1,convenient,inconv,problematic,not_recom,not_recom -great_pret,less_proper,foster,1,less_conv,convenient,nonprob,recommended,priority -great_pret,less_proper,foster,1,less_conv,convenient,nonprob,priority,spec_prior -great_pret,less_proper,foster,1,less_conv,convenient,nonprob,not_recom,not_recom -great_pret,less_proper,foster,1,less_conv,convenient,slightly_prob,recommended,priority -great_pret,less_proper,foster,1,less_conv,convenient,slightly_prob,priority,spec_prior -great_pret,less_proper,foster,1,less_conv,convenient,slightly_prob,not_recom,not_recom -great_pret,less_proper,foster,1,less_conv,convenient,problematic,recommended,spec_prior -great_pret,less_proper,foster,1,less_conv,convenient,problematic,priority,spec_prior -great_pret,less_proper,foster,1,less_conv,convenient,problematic,not_recom,not_recom -great_pret,less_proper,foster,1,less_conv,inconv,nonprob,recommended,priority -great_pret,less_proper,foster,1,less_conv,inconv,nonprob,priority,spec_prior -great_pret,less_proper,foster,1,less_conv,inconv,nonprob,not_recom,not_recom -great_pret,less_proper,foster,1,less_conv,inconv,slightly_prob,recommended,priority -great_pret,less_proper,foster,1,less_conv,inconv,slightly_prob,priority,spec_prior -great_pret,less_proper,foster,1,less_conv,inconv,slightly_prob,not_recom,not_recom -great_pret,less_proper,foster,1,less_conv,inconv,problematic,recommended,spec_prior -great_pret,less_proper,foster,1,less_conv,inconv,problematic,priority,spec_prior -great_pret,less_proper,foster,1,less_conv,inconv,problematic,not_recom,not_recom -great_pret,less_proper,foster,1,critical,convenient,nonprob,recommended,priority -great_pret,less_proper,foster,1,critical,convenient,nonprob,priority,spec_prior 
-great_pret,less_proper,foster,1,critical,convenient,nonprob,not_recom,not_recom -great_pret,less_proper,foster,1,critical,convenient,slightly_prob,recommended,priority -great_pret,less_proper,foster,1,critical,convenient,slightly_prob,priority,spec_prior -great_pret,less_proper,foster,1,critical,convenient,slightly_prob,not_recom,not_recom -great_pret,less_proper,foster,1,critical,convenient,problematic,recommended,spec_prior -great_pret,less_proper,foster,1,critical,convenient,problematic,priority,spec_prior -great_pret,less_proper,foster,1,critical,convenient,problematic,not_recom,not_recom -great_pret,less_proper,foster,1,critical,inconv,nonprob,recommended,priority -great_pret,less_proper,foster,1,critical,inconv,nonprob,priority,spec_prior -great_pret,less_proper,foster,1,critical,inconv,nonprob,not_recom,not_recom -great_pret,less_proper,foster,1,critical,inconv,slightly_prob,recommended,priority -great_pret,less_proper,foster,1,critical,inconv,slightly_prob,priority,spec_prior -great_pret,less_proper,foster,1,critical,inconv,slightly_prob,not_recom,not_recom -great_pret,less_proper,foster,1,critical,inconv,problematic,recommended,spec_prior -great_pret,less_proper,foster,1,critical,inconv,problematic,priority,spec_prior -great_pret,less_proper,foster,1,critical,inconv,problematic,not_recom,not_recom -great_pret,less_proper,foster,2,convenient,convenient,nonprob,recommended,priority -great_pret,less_proper,foster,2,convenient,convenient,nonprob,priority,priority -great_pret,less_proper,foster,2,convenient,convenient,nonprob,not_recom,not_recom -great_pret,less_proper,foster,2,convenient,convenient,slightly_prob,recommended,priority -great_pret,less_proper,foster,2,convenient,convenient,slightly_prob,priority,priority -great_pret,less_proper,foster,2,convenient,convenient,slightly_prob,not_recom,not_recom -great_pret,less_proper,foster,2,convenient,convenient,problematic,recommended,priority 
-great_pret,less_proper,foster,2,convenient,convenient,problematic,priority,priority -great_pret,less_proper,foster,2,convenient,convenient,problematic,not_recom,not_recom -great_pret,less_proper,foster,2,convenient,inconv,nonprob,recommended,priority -great_pret,less_proper,foster,2,convenient,inconv,nonprob,priority,spec_prior -great_pret,less_proper,foster,2,convenient,inconv,nonprob,not_recom,not_recom -great_pret,less_proper,foster,2,convenient,inconv,slightly_prob,recommended,priority -great_pret,less_proper,foster,2,convenient,inconv,slightly_prob,priority,spec_prior -great_pret,less_proper,foster,2,convenient,inconv,slightly_prob,not_recom,not_recom -great_pret,less_proper,foster,2,convenient,inconv,problematic,recommended,spec_prior -great_pret,less_proper,foster,2,convenient,inconv,problematic,priority,spec_prior -great_pret,less_proper,foster,2,convenient,inconv,problematic,not_recom,not_recom -great_pret,less_proper,foster,2,less_conv,convenient,nonprob,recommended,priority -great_pret,less_proper,foster,2,less_conv,convenient,nonprob,priority,spec_prior -great_pret,less_proper,foster,2,less_conv,convenient,nonprob,not_recom,not_recom -great_pret,less_proper,foster,2,less_conv,convenient,slightly_prob,recommended,priority -great_pret,less_proper,foster,2,less_conv,convenient,slightly_prob,priority,spec_prior -great_pret,less_proper,foster,2,less_conv,convenient,slightly_prob,not_recom,not_recom -great_pret,less_proper,foster,2,less_conv,convenient,problematic,recommended,spec_prior -great_pret,less_proper,foster,2,less_conv,convenient,problematic,priority,spec_prior -great_pret,less_proper,foster,2,less_conv,convenient,problematic,not_recom,not_recom -great_pret,less_proper,foster,2,less_conv,inconv,nonprob,recommended,priority -great_pret,less_proper,foster,2,less_conv,inconv,nonprob,priority,spec_prior -great_pret,less_proper,foster,2,less_conv,inconv,nonprob,not_recom,not_recom 
-great_pret,less_proper,foster,2,less_conv,inconv,slightly_prob,recommended,priority -great_pret,less_proper,foster,2,less_conv,inconv,slightly_prob,priority,spec_prior -great_pret,less_proper,foster,2,less_conv,inconv,slightly_prob,not_recom,not_recom -great_pret,less_proper,foster,2,less_conv,inconv,problematic,recommended,spec_prior -great_pret,less_proper,foster,2,less_conv,inconv,problematic,priority,spec_prior -great_pret,less_proper,foster,2,less_conv,inconv,problematic,not_recom,not_recom -great_pret,less_proper,foster,2,critical,convenient,nonprob,recommended,priority -great_pret,less_proper,foster,2,critical,convenient,nonprob,priority,spec_prior -great_pret,less_proper,foster,2,critical,convenient,nonprob,not_recom,not_recom -great_pret,less_proper,foster,2,critical,convenient,slightly_prob,recommended,priority -great_pret,less_proper,foster,2,critical,convenient,slightly_prob,priority,spec_prior -great_pret,less_proper,foster,2,critical,convenient,slightly_prob,not_recom,not_recom -great_pret,less_proper,foster,2,critical,convenient,problematic,recommended,spec_prior -great_pret,less_proper,foster,2,critical,convenient,problematic,priority,spec_prior -great_pret,less_proper,foster,2,critical,convenient,problematic,not_recom,not_recom -great_pret,less_proper,foster,2,critical,inconv,nonprob,recommended,priority -great_pret,less_proper,foster,2,critical,inconv,nonprob,priority,spec_prior -great_pret,less_proper,foster,2,critical,inconv,nonprob,not_recom,not_recom -great_pret,less_proper,foster,2,critical,inconv,slightly_prob,recommended,priority -great_pret,less_proper,foster,2,critical,inconv,slightly_prob,priority,spec_prior -great_pret,less_proper,foster,2,critical,inconv,slightly_prob,not_recom,not_recom -great_pret,less_proper,foster,2,critical,inconv,problematic,recommended,spec_prior -great_pret,less_proper,foster,2,critical,inconv,problematic,priority,spec_prior -great_pret,less_proper,foster,2,critical,inconv,problematic,not_recom,not_recom 
-great_pret,less_proper,foster,3,convenient,convenient,nonprob,recommended,priority -great_pret,less_proper,foster,3,convenient,convenient,nonprob,priority,priority -great_pret,less_proper,foster,3,convenient,convenient,nonprob,not_recom,not_recom -great_pret,less_proper,foster,3,convenient,convenient,slightly_prob,recommended,priority -great_pret,less_proper,foster,3,convenient,convenient,slightly_prob,priority,priority -great_pret,less_proper,foster,3,convenient,convenient,slightly_prob,not_recom,not_recom -great_pret,less_proper,foster,3,convenient,convenient,problematic,recommended,priority -great_pret,less_proper,foster,3,convenient,convenient,problematic,priority,priority -great_pret,less_proper,foster,3,convenient,convenient,problematic,not_recom,not_recom -great_pret,less_proper,foster,3,convenient,inconv,nonprob,recommended,priority -great_pret,less_proper,foster,3,convenient,inconv,nonprob,priority,spec_prior -great_pret,less_proper,foster,3,convenient,inconv,nonprob,not_recom,not_recom -great_pret,less_proper,foster,3,convenient,inconv,slightly_prob,recommended,priority -great_pret,less_proper,foster,3,convenient,inconv,slightly_prob,priority,spec_prior -great_pret,less_proper,foster,3,convenient,inconv,slightly_prob,not_recom,not_recom -great_pret,less_proper,foster,3,convenient,inconv,problematic,recommended,spec_prior -great_pret,less_proper,foster,3,convenient,inconv,problematic,priority,spec_prior -great_pret,less_proper,foster,3,convenient,inconv,problematic,not_recom,not_recom -great_pret,less_proper,foster,3,less_conv,convenient,nonprob,recommended,priority -great_pret,less_proper,foster,3,less_conv,convenient,nonprob,priority,spec_prior -great_pret,less_proper,foster,3,less_conv,convenient,nonprob,not_recom,not_recom -great_pret,less_proper,foster,3,less_conv,convenient,slightly_prob,recommended,priority -great_pret,less_proper,foster,3,less_conv,convenient,slightly_prob,priority,spec_prior 
-great_pret,less_proper,foster,3,less_conv,convenient,slightly_prob,not_recom,not_recom -great_pret,less_proper,foster,3,less_conv,convenient,problematic,recommended,spec_prior -great_pret,less_proper,foster,3,less_conv,convenient,problematic,priority,spec_prior -great_pret,less_proper,foster,3,less_conv,convenient,problematic,not_recom,not_recom -great_pret,less_proper,foster,3,less_conv,inconv,nonprob,recommended,priority -great_pret,less_proper,foster,3,less_conv,inconv,nonprob,priority,spec_prior -great_pret,less_proper,foster,3,less_conv,inconv,nonprob,not_recom,not_recom -great_pret,less_proper,foster,3,less_conv,inconv,slightly_prob,recommended,priority -great_pret,less_proper,foster,3,less_conv,inconv,slightly_prob,priority,spec_prior -great_pret,less_proper,foster,3,less_conv,inconv,slightly_prob,not_recom,not_recom -great_pret,less_proper,foster,3,less_conv,inconv,problematic,recommended,spec_prior -great_pret,less_proper,foster,3,less_conv,inconv,problematic,priority,spec_prior -great_pret,less_proper,foster,3,less_conv,inconv,problematic,not_recom,not_recom -great_pret,less_proper,foster,3,critical,convenient,nonprob,recommended,priority -great_pret,less_proper,foster,3,critical,convenient,nonprob,priority,spec_prior -great_pret,less_proper,foster,3,critical,convenient,nonprob,not_recom,not_recom -great_pret,less_proper,foster,3,critical,convenient,slightly_prob,recommended,priority -great_pret,less_proper,foster,3,critical,convenient,slightly_prob,priority,spec_prior -great_pret,less_proper,foster,3,critical,convenient,slightly_prob,not_recom,not_recom -great_pret,less_proper,foster,3,critical,convenient,problematic,recommended,spec_prior -great_pret,less_proper,foster,3,critical,convenient,problematic,priority,spec_prior -great_pret,less_proper,foster,3,critical,convenient,problematic,not_recom,not_recom -great_pret,less_proper,foster,3,critical,inconv,nonprob,recommended,priority 
-great_pret,less_proper,foster,3,critical,inconv,nonprob,priority,spec_prior -great_pret,less_proper,foster,3,critical,inconv,nonprob,not_recom,not_recom -great_pret,less_proper,foster,3,critical,inconv,slightly_prob,recommended,priority -great_pret,less_proper,foster,3,critical,inconv,slightly_prob,priority,spec_prior -great_pret,less_proper,foster,3,critical,inconv,slightly_prob,not_recom,not_recom -great_pret,less_proper,foster,3,critical,inconv,problematic,recommended,spec_prior -great_pret,less_proper,foster,3,critical,inconv,problematic,priority,spec_prior -great_pret,less_proper,foster,3,critical,inconv,problematic,not_recom,not_recom -great_pret,less_proper,foster,more,convenient,convenient,nonprob,recommended,priority -great_pret,less_proper,foster,more,convenient,convenient,nonprob,priority,priority -great_pret,less_proper,foster,more,convenient,convenient,nonprob,not_recom,not_recom -great_pret,less_proper,foster,more,convenient,convenient,slightly_prob,recommended,priority -great_pret,less_proper,foster,more,convenient,convenient,slightly_prob,priority,priority -great_pret,less_proper,foster,more,convenient,convenient,slightly_prob,not_recom,not_recom -great_pret,less_proper,foster,more,convenient,convenient,problematic,recommended,priority -great_pret,less_proper,foster,more,convenient,convenient,problematic,priority,priority -great_pret,less_proper,foster,more,convenient,convenient,problematic,not_recom,not_recom -great_pret,less_proper,foster,more,convenient,inconv,nonprob,recommended,priority -great_pret,less_proper,foster,more,convenient,inconv,nonprob,priority,spec_prior -great_pret,less_proper,foster,more,convenient,inconv,nonprob,not_recom,not_recom -great_pret,less_proper,foster,more,convenient,inconv,slightly_prob,recommended,priority -great_pret,less_proper,foster,more,convenient,inconv,slightly_prob,priority,spec_prior -great_pret,less_proper,foster,more,convenient,inconv,slightly_prob,not_recom,not_recom 
-great_pret,less_proper,foster,more,convenient,inconv,problematic,recommended,spec_prior -great_pret,less_proper,foster,more,convenient,inconv,problematic,priority,spec_prior -great_pret,less_proper,foster,more,convenient,inconv,problematic,not_recom,not_recom -great_pret,less_proper,foster,more,less_conv,convenient,nonprob,recommended,priority -great_pret,less_proper,foster,more,less_conv,convenient,nonprob,priority,spec_prior -great_pret,less_proper,foster,more,less_conv,convenient,nonprob,not_recom,not_recom -great_pret,less_proper,foster,more,less_conv,convenient,slightly_prob,recommended,priority -great_pret,less_proper,foster,more,less_conv,convenient,slightly_prob,priority,spec_prior -great_pret,less_proper,foster,more,less_conv,convenient,slightly_prob,not_recom,not_recom -great_pret,less_proper,foster,more,less_conv,convenient,problematic,recommended,spec_prior -great_pret,less_proper,foster,more,less_conv,convenient,problematic,priority,spec_prior -great_pret,less_proper,foster,more,less_conv,convenient,problematic,not_recom,not_recom -great_pret,less_proper,foster,more,less_conv,inconv,nonprob,recommended,priority -great_pret,less_proper,foster,more,less_conv,inconv,nonprob,priority,spec_prior -great_pret,less_proper,foster,more,less_conv,inconv,nonprob,not_recom,not_recom -great_pret,less_proper,foster,more,less_conv,inconv,slightly_prob,recommended,priority -great_pret,less_proper,foster,more,less_conv,inconv,slightly_prob,priority,spec_prior -great_pret,less_proper,foster,more,less_conv,inconv,slightly_prob,not_recom,not_recom -great_pret,less_proper,foster,more,less_conv,inconv,problematic,recommended,spec_prior -great_pret,less_proper,foster,more,less_conv,inconv,problematic,priority,spec_prior -great_pret,less_proper,foster,more,less_conv,inconv,problematic,not_recom,not_recom -great_pret,less_proper,foster,more,critical,convenient,nonprob,recommended,priority -great_pret,less_proper,foster,more,critical,convenient,nonprob,priority,spec_prior 
-great_pret,less_proper,foster,more,critical,convenient,nonprob,not_recom,not_recom -great_pret,less_proper,foster,more,critical,convenient,slightly_prob,recommended,priority -great_pret,less_proper,foster,more,critical,convenient,slightly_prob,priority,spec_prior -great_pret,less_proper,foster,more,critical,convenient,slightly_prob,not_recom,not_recom -great_pret,less_proper,foster,more,critical,convenient,problematic,recommended,spec_prior -great_pret,less_proper,foster,more,critical,convenient,problematic,priority,spec_prior -great_pret,less_proper,foster,more,critical,convenient,problematic,not_recom,not_recom -great_pret,less_proper,foster,more,critical,inconv,nonprob,recommended,priority -great_pret,less_proper,foster,more,critical,inconv,nonprob,priority,spec_prior -great_pret,less_proper,foster,more,critical,inconv,nonprob,not_recom,not_recom -great_pret,less_proper,foster,more,critical,inconv,slightly_prob,recommended,priority -great_pret,less_proper,foster,more,critical,inconv,slightly_prob,priority,spec_prior -great_pret,less_proper,foster,more,critical,inconv,slightly_prob,not_recom,not_recom -great_pret,less_proper,foster,more,critical,inconv,problematic,recommended,spec_prior -great_pret,less_proper,foster,more,critical,inconv,problematic,priority,spec_prior -great_pret,less_proper,foster,more,critical,inconv,problematic,not_recom,not_recom -great_pret,improper,complete,1,convenient,convenient,nonprob,recommended,priority -great_pret,improper,complete,1,convenient,convenient,nonprob,priority,priority -great_pret,improper,complete,1,convenient,convenient,nonprob,not_recom,not_recom -great_pret,improper,complete,1,convenient,convenient,slightly_prob,recommended,priority -great_pret,improper,complete,1,convenient,convenient,slightly_prob,priority,priority -great_pret,improper,complete,1,convenient,convenient,slightly_prob,not_recom,not_recom -great_pret,improper,complete,1,convenient,convenient,problematic,recommended,priority 
-great_pret,improper,complete,1,convenient,convenient,problematic,priority,priority -great_pret,improper,complete,1,convenient,convenient,problematic,not_recom,not_recom -great_pret,improper,complete,1,convenient,inconv,nonprob,recommended,priority -great_pret,improper,complete,1,convenient,inconv,nonprob,priority,spec_prior -great_pret,improper,complete,1,convenient,inconv,nonprob,not_recom,not_recom -great_pret,improper,complete,1,convenient,inconv,slightly_prob,recommended,priority -great_pret,improper,complete,1,convenient,inconv,slightly_prob,priority,spec_prior -great_pret,improper,complete,1,convenient,inconv,slightly_prob,not_recom,not_recom -great_pret,improper,complete,1,convenient,inconv,problematic,recommended,spec_prior -great_pret,improper,complete,1,convenient,inconv,problematic,priority,spec_prior -great_pret,improper,complete,1,convenient,inconv,problematic,not_recom,not_recom -great_pret,improper,complete,1,less_conv,convenient,nonprob,recommended,priority -great_pret,improper,complete,1,less_conv,convenient,nonprob,priority,spec_prior -great_pret,improper,complete,1,less_conv,convenient,nonprob,not_recom,not_recom -great_pret,improper,complete,1,less_conv,convenient,slightly_prob,recommended,priority -great_pret,improper,complete,1,less_conv,convenient,slightly_prob,priority,spec_prior -great_pret,improper,complete,1,less_conv,convenient,slightly_prob,not_recom,not_recom -great_pret,improper,complete,1,less_conv,convenient,problematic,recommended,spec_prior -great_pret,improper,complete,1,less_conv,convenient,problematic,priority,spec_prior -great_pret,improper,complete,1,less_conv,convenient,problematic,not_recom,not_recom -great_pret,improper,complete,1,less_conv,inconv,nonprob,recommended,priority -great_pret,improper,complete,1,less_conv,inconv,nonprob,priority,spec_prior -great_pret,improper,complete,1,less_conv,inconv,nonprob,not_recom,not_recom -great_pret,improper,complete,1,less_conv,inconv,slightly_prob,recommended,priority 
-great_pret,improper,complete,1,less_conv,inconv,slightly_prob,priority,spec_prior -great_pret,improper,complete,1,less_conv,inconv,slightly_prob,not_recom,not_recom -great_pret,improper,complete,1,less_conv,inconv,problematic,recommended,spec_prior -great_pret,improper,complete,1,less_conv,inconv,problematic,priority,spec_prior -great_pret,improper,complete,1,less_conv,inconv,problematic,not_recom,not_recom -great_pret,improper,complete,1,critical,convenient,nonprob,recommended,priority -great_pret,improper,complete,1,critical,convenient,nonprob,priority,spec_prior -great_pret,improper,complete,1,critical,convenient,nonprob,not_recom,not_recom -great_pret,improper,complete,1,critical,convenient,slightly_prob,recommended,priority -great_pret,improper,complete,1,critical,convenient,slightly_prob,priority,spec_prior -great_pret,improper,complete,1,critical,convenient,slightly_prob,not_recom,not_recom -great_pret,improper,complete,1,critical,convenient,problematic,recommended,spec_prior -great_pret,improper,complete,1,critical,convenient,problematic,priority,spec_prior -great_pret,improper,complete,1,critical,convenient,problematic,not_recom,not_recom -great_pret,improper,complete,1,critical,inconv,nonprob,recommended,priority -great_pret,improper,complete,1,critical,inconv,nonprob,priority,spec_prior -great_pret,improper,complete,1,critical,inconv,nonprob,not_recom,not_recom -great_pret,improper,complete,1,critical,inconv,slightly_prob,recommended,priority -great_pret,improper,complete,1,critical,inconv,slightly_prob,priority,spec_prior -great_pret,improper,complete,1,critical,inconv,slightly_prob,not_recom,not_recom -great_pret,improper,complete,1,critical,inconv,problematic,recommended,spec_prior -great_pret,improper,complete,1,critical,inconv,problematic,priority,spec_prior -great_pret,improper,complete,1,critical,inconv,problematic,not_recom,not_recom -great_pret,improper,complete,2,convenient,convenient,nonprob,recommended,priority 
-great_pret,improper,complete,2,convenient,convenient,nonprob,priority,spec_prior -great_pret,improper,complete,2,convenient,convenient,nonprob,not_recom,not_recom -great_pret,improper,complete,2,convenient,convenient,slightly_prob,recommended,priority -great_pret,improper,complete,2,convenient,convenient,slightly_prob,priority,spec_prior -great_pret,improper,complete,2,convenient,convenient,slightly_prob,not_recom,not_recom -great_pret,improper,complete,2,convenient,convenient,problematic,recommended,spec_prior -great_pret,improper,complete,2,convenient,convenient,problematic,priority,spec_prior -great_pret,improper,complete,2,convenient,convenient,problematic,not_recom,not_recom -great_pret,improper,complete,2,convenient,inconv,nonprob,recommended,priority -great_pret,improper,complete,2,convenient,inconv,nonprob,priority,spec_prior -great_pret,improper,complete,2,convenient,inconv,nonprob,not_recom,not_recom -great_pret,improper,complete,2,convenient,inconv,slightly_prob,recommended,priority -great_pret,improper,complete,2,convenient,inconv,slightly_prob,priority,spec_prior -great_pret,improper,complete,2,convenient,inconv,slightly_prob,not_recom,not_recom -great_pret,improper,complete,2,convenient,inconv,problematic,recommended,spec_prior -great_pret,improper,complete,2,convenient,inconv,problematic,priority,spec_prior -great_pret,improper,complete,2,convenient,inconv,problematic,not_recom,not_recom -great_pret,improper,complete,2,less_conv,convenient,nonprob,recommended,priority -great_pret,improper,complete,2,less_conv,convenient,nonprob,priority,spec_prior -great_pret,improper,complete,2,less_conv,convenient,nonprob,not_recom,not_recom -great_pret,improper,complete,2,less_conv,convenient,slightly_prob,recommended,priority -great_pret,improper,complete,2,less_conv,convenient,slightly_prob,priority,spec_prior -great_pret,improper,complete,2,less_conv,convenient,slightly_prob,not_recom,not_recom 
-great_pret,improper,complete,2,less_conv,convenient,problematic,recommended,spec_prior -great_pret,improper,complete,2,less_conv,convenient,problematic,priority,spec_prior -great_pret,improper,complete,2,less_conv,convenient,problematic,not_recom,not_recom -great_pret,improper,complete,2,less_conv,inconv,nonprob,recommended,priority -great_pret,improper,complete,2,less_conv,inconv,nonprob,priority,spec_prior -great_pret,improper,complete,2,less_conv,inconv,nonprob,not_recom,not_recom -great_pret,improper,complete,2,less_conv,inconv,slightly_prob,recommended,priority -great_pret,improper,complete,2,less_conv,inconv,slightly_prob,priority,spec_prior -great_pret,improper,complete,2,less_conv,inconv,slightly_prob,not_recom,not_recom -great_pret,improper,complete,2,less_conv,inconv,problematic,recommended,spec_prior -great_pret,improper,complete,2,less_conv,inconv,problematic,priority,spec_prior -great_pret,improper,complete,2,less_conv,inconv,problematic,not_recom,not_recom -great_pret,improper,complete,2,critical,convenient,nonprob,recommended,spec_prior -great_pret,improper,complete,2,critical,convenient,nonprob,priority,spec_prior -great_pret,improper,complete,2,critical,convenient,nonprob,not_recom,not_recom -great_pret,improper,complete,2,critical,convenient,slightly_prob,recommended,spec_prior -great_pret,improper,complete,2,critical,convenient,slightly_prob,priority,spec_prior -great_pret,improper,complete,2,critical,convenient,slightly_prob,not_recom,not_recom -great_pret,improper,complete,2,critical,convenient,problematic,recommended,spec_prior -great_pret,improper,complete,2,critical,convenient,problematic,priority,spec_prior -great_pret,improper,complete,2,critical,convenient,problematic,not_recom,not_recom -great_pret,improper,complete,2,critical,inconv,nonprob,recommended,spec_prior -great_pret,improper,complete,2,critical,inconv,nonprob,priority,spec_prior -great_pret,improper,complete,2,critical,inconv,nonprob,not_recom,not_recom 
-great_pret,improper,complete,2,critical,inconv,slightly_prob,recommended,spec_prior -great_pret,improper,complete,2,critical,inconv,slightly_prob,priority,spec_prior -great_pret,improper,complete,2,critical,inconv,slightly_prob,not_recom,not_recom -great_pret,improper,complete,2,critical,inconv,problematic,recommended,spec_prior -great_pret,improper,complete,2,critical,inconv,problematic,priority,spec_prior -great_pret,improper,complete,2,critical,inconv,problematic,not_recom,not_recom -great_pret,improper,complete,3,convenient,convenient,nonprob,recommended,priority -great_pret,improper,complete,3,convenient,convenient,nonprob,priority,spec_prior -great_pret,improper,complete,3,convenient,convenient,nonprob,not_recom,not_recom -great_pret,improper,complete,3,convenient,convenient,slightly_prob,recommended,priority -great_pret,improper,complete,3,convenient,convenient,slightly_prob,priority,spec_prior -great_pret,improper,complete,3,convenient,convenient,slightly_prob,not_recom,not_recom -great_pret,improper,complete,3,convenient,convenient,problematic,recommended,spec_prior -great_pret,improper,complete,3,convenient,convenient,problematic,priority,spec_prior -great_pret,improper,complete,3,convenient,convenient,problematic,not_recom,not_recom -great_pret,improper,complete,3,convenient,inconv,nonprob,recommended,spec_prior -great_pret,improper,complete,3,convenient,inconv,nonprob,priority,spec_prior -great_pret,improper,complete,3,convenient,inconv,nonprob,not_recom,not_recom -great_pret,improper,complete,3,convenient,inconv,slightly_prob,recommended,spec_prior -great_pret,improper,complete,3,convenient,inconv,slightly_prob,priority,spec_prior -great_pret,improper,complete,3,convenient,inconv,slightly_prob,not_recom,not_recom -great_pret,improper,complete,3,convenient,inconv,problematic,recommended,spec_prior -great_pret,improper,complete,3,convenient,inconv,problematic,priority,spec_prior 
-great_pret,improper,complete,3,convenient,inconv,problematic,not_recom,not_recom -great_pret,improper,complete,3,less_conv,convenient,nonprob,recommended,spec_prior -great_pret,improper,complete,3,less_conv,convenient,nonprob,priority,spec_prior -great_pret,improper,complete,3,less_conv,convenient,nonprob,not_recom,not_recom -great_pret,improper,complete,3,less_conv,convenient,slightly_prob,recommended,spec_prior -great_pret,improper,complete,3,less_conv,convenient,slightly_prob,priority,spec_prior -great_pret,improper,complete,3,less_conv,convenient,slightly_prob,not_recom,not_recom -great_pret,improper,complete,3,less_conv,convenient,problematic,recommended,spec_prior -great_pret,improper,complete,3,less_conv,convenient,problematic,priority,spec_prior -great_pret,improper,complete,3,less_conv,convenient,problematic,not_recom,not_recom -great_pret,improper,complete,3,less_conv,inconv,nonprob,recommended,spec_prior -great_pret,improper,complete,3,less_conv,inconv,nonprob,priority,spec_prior -great_pret,improper,complete,3,less_conv,inconv,nonprob,not_recom,not_recom -great_pret,improper,complete,3,less_conv,inconv,slightly_prob,recommended,spec_prior -great_pret,improper,complete,3,less_conv,inconv,slightly_prob,priority,spec_prior -great_pret,improper,complete,3,less_conv,inconv,slightly_prob,not_recom,not_recom -great_pret,improper,complete,3,less_conv,inconv,problematic,recommended,spec_prior -great_pret,improper,complete,3,less_conv,inconv,problematic,priority,spec_prior -great_pret,improper,complete,3,less_conv,inconv,problematic,not_recom,not_recom -great_pret,improper,complete,3,critical,convenient,nonprob,recommended,spec_prior -great_pret,improper,complete,3,critical,convenient,nonprob,priority,spec_prior -great_pret,improper,complete,3,critical,convenient,nonprob,not_recom,not_recom -great_pret,improper,complete,3,critical,convenient,slightly_prob,recommended,spec_prior -great_pret,improper,complete,3,critical,convenient,slightly_prob,priority,spec_prior 
-great_pret,improper,complete,3,critical,convenient,slightly_prob,not_recom,not_recom -great_pret,improper,complete,3,critical,convenient,problematic,recommended,spec_prior -great_pret,improper,complete,3,critical,convenient,problematic,priority,spec_prior -great_pret,improper,complete,3,critical,convenient,problematic,not_recom,not_recom -great_pret,improper,complete,3,critical,inconv,nonprob,recommended,spec_prior -great_pret,improper,complete,3,critical,inconv,nonprob,priority,spec_prior -great_pret,improper,complete,3,critical,inconv,nonprob,not_recom,not_recom -great_pret,improper,complete,3,critical,inconv,slightly_prob,recommended,spec_prior -great_pret,improper,complete,3,critical,inconv,slightly_prob,priority,spec_prior -great_pret,improper,complete,3,critical,inconv,slightly_prob,not_recom,not_recom -great_pret,improper,complete,3,critical,inconv,problematic,recommended,spec_prior -great_pret,improper,complete,3,critical,inconv,problematic,priority,spec_prior -great_pret,improper,complete,3,critical,inconv,problematic,not_recom,not_recom -great_pret,improper,complete,more,convenient,convenient,nonprob,recommended,priority -great_pret,improper,complete,more,convenient,convenient,nonprob,priority,spec_prior -great_pret,improper,complete,more,convenient,convenient,nonprob,not_recom,not_recom -great_pret,improper,complete,more,convenient,convenient,slightly_prob,recommended,priority -great_pret,improper,complete,more,convenient,convenient,slightly_prob,priority,spec_prior -great_pret,improper,complete,more,convenient,convenient,slightly_prob,not_recom,not_recom -great_pret,improper,complete,more,convenient,convenient,problematic,recommended,spec_prior -great_pret,improper,complete,more,convenient,convenient,problematic,priority,spec_prior -great_pret,improper,complete,more,convenient,convenient,problematic,not_recom,not_recom -great_pret,improper,complete,more,convenient,inconv,nonprob,recommended,spec_prior 
-great_pret,improper,complete,more,convenient,inconv,nonprob,priority,spec_prior -great_pret,improper,complete,more,convenient,inconv,nonprob,not_recom,not_recom -great_pret,improper,complete,more,convenient,inconv,slightly_prob,recommended,spec_prior -great_pret,improper,complete,more,convenient,inconv,slightly_prob,priority,spec_prior -great_pret,improper,complete,more,convenient,inconv,slightly_prob,not_recom,not_recom -great_pret,improper,complete,more,convenient,inconv,problematic,recommended,spec_prior -great_pret,improper,complete,more,convenient,inconv,problematic,priority,spec_prior -great_pret,improper,complete,more,convenient,inconv,problematic,not_recom,not_recom -great_pret,improper,complete,more,less_conv,convenient,nonprob,recommended,spec_prior -great_pret,improper,complete,more,less_conv,convenient,nonprob,priority,spec_prior -great_pret,improper,complete,more,less_conv,convenient,nonprob,not_recom,not_recom -great_pret,improper,complete,more,less_conv,convenient,slightly_prob,recommended,spec_prior -great_pret,improper,complete,more,less_conv,convenient,slightly_prob,priority,spec_prior -great_pret,improper,complete,more,less_conv,convenient,slightly_prob,not_recom,not_recom -great_pret,improper,complete,more,less_conv,convenient,problematic,recommended,spec_prior -great_pret,improper,complete,more,less_conv,convenient,problematic,priority,spec_prior -great_pret,improper,complete,more,less_conv,convenient,problematic,not_recom,not_recom -great_pret,improper,complete,more,less_conv,inconv,nonprob,recommended,spec_prior -great_pret,improper,complete,more,less_conv,inconv,nonprob,priority,spec_prior -great_pret,improper,complete,more,less_conv,inconv,nonprob,not_recom,not_recom -great_pret,improper,complete,more,less_conv,inconv,slightly_prob,recommended,spec_prior -great_pret,improper,complete,more,less_conv,inconv,slightly_prob,priority,spec_prior -great_pret,improper,complete,more,less_conv,inconv,slightly_prob,not_recom,not_recom 
-great_pret,improper,complete,more,less_conv,inconv,problematic,recommended,spec_prior -great_pret,improper,complete,more,less_conv,inconv,problematic,priority,spec_prior -great_pret,improper,complete,more,less_conv,inconv,problematic,not_recom,not_recom -great_pret,improper,complete,more,critical,convenient,nonprob,recommended,spec_prior -great_pret,improper,complete,more,critical,convenient,nonprob,priority,spec_prior -great_pret,improper,complete,more,critical,convenient,nonprob,not_recom,not_recom -great_pret,improper,complete,more,critical,convenient,slightly_prob,recommended,spec_prior -great_pret,improper,complete,more,critical,convenient,slightly_prob,priority,spec_prior -great_pret,improper,complete,more,critical,convenient,slightly_prob,not_recom,not_recom -great_pret,improper,complete,more,critical,convenient,problematic,recommended,spec_prior -great_pret,improper,complete,more,critical,convenient,problematic,priority,spec_prior -great_pret,improper,complete,more,critical,convenient,problematic,not_recom,not_recom -great_pret,improper,complete,more,critical,inconv,nonprob,recommended,spec_prior -great_pret,improper,complete,more,critical,inconv,nonprob,priority,spec_prior -great_pret,improper,complete,more,critical,inconv,nonprob,not_recom,not_recom -great_pret,improper,complete,more,critical,inconv,slightly_prob,recommended,spec_prior -great_pret,improper,complete,more,critical,inconv,slightly_prob,priority,spec_prior -great_pret,improper,complete,more,critical,inconv,slightly_prob,not_recom,not_recom -great_pret,improper,complete,more,critical,inconv,problematic,recommended,spec_prior -great_pret,improper,complete,more,critical,inconv,problematic,priority,spec_prior -great_pret,improper,complete,more,critical,inconv,problematic,not_recom,not_recom -great_pret,improper,completed,1,convenient,convenient,nonprob,recommended,priority -great_pret,improper,completed,1,convenient,convenient,nonprob,priority,spec_prior 
-great_pret,improper,completed,1,convenient,convenient,nonprob,not_recom,not_recom -great_pret,improper,completed,1,convenient,convenient,slightly_prob,recommended,priority -great_pret,improper,completed,1,convenient,convenient,slightly_prob,priority,spec_prior -great_pret,improper,completed,1,convenient,convenient,slightly_prob,not_recom,not_recom -great_pret,improper,completed,1,convenient,convenient,problematic,recommended,spec_prior -great_pret,improper,completed,1,convenient,convenient,problematic,priority,spec_prior -great_pret,improper,completed,1,convenient,convenient,problematic,not_recom,not_recom -great_pret,improper,completed,1,convenient,inconv,nonprob,recommended,priority -great_pret,improper,completed,1,convenient,inconv,nonprob,priority,spec_prior -great_pret,improper,completed,1,convenient,inconv,nonprob,not_recom,not_recom -great_pret,improper,completed,1,convenient,inconv,slightly_prob,recommended,priority -great_pret,improper,completed,1,convenient,inconv,slightly_prob,priority,spec_prior -great_pret,improper,completed,1,convenient,inconv,slightly_prob,not_recom,not_recom -great_pret,improper,completed,1,convenient,inconv,problematic,recommended,spec_prior -great_pret,improper,completed,1,convenient,inconv,problematic,priority,spec_prior -great_pret,improper,completed,1,convenient,inconv,problematic,not_recom,not_recom -great_pret,improper,completed,1,less_conv,convenient,nonprob,recommended,priority -great_pret,improper,completed,1,less_conv,convenient,nonprob,priority,spec_prior -great_pret,improper,completed,1,less_conv,convenient,nonprob,not_recom,not_recom -great_pret,improper,completed,1,less_conv,convenient,slightly_prob,recommended,priority -great_pret,improper,completed,1,less_conv,convenient,slightly_prob,priority,spec_prior -great_pret,improper,completed,1,less_conv,convenient,slightly_prob,not_recom,not_recom -great_pret,improper,completed,1,less_conv,convenient,problematic,recommended,spec_prior 
-great_pret,improper,completed,1,less_conv,convenient,problematic,priority,spec_prior -great_pret,improper,completed,1,less_conv,convenient,problematic,not_recom,not_recom -great_pret,improper,completed,1,less_conv,inconv,nonprob,recommended,priority -great_pret,improper,completed,1,less_conv,inconv,nonprob,priority,spec_prior -great_pret,improper,completed,1,less_conv,inconv,nonprob,not_recom,not_recom -great_pret,improper,completed,1,less_conv,inconv,slightly_prob,recommended,priority -great_pret,improper,completed,1,less_conv,inconv,slightly_prob,priority,spec_prior -great_pret,improper,completed,1,less_conv,inconv,slightly_prob,not_recom,not_recom -great_pret,improper,completed,1,less_conv,inconv,problematic,recommended,spec_prior -great_pret,improper,completed,1,less_conv,inconv,problematic,priority,spec_prior -great_pret,improper,completed,1,less_conv,inconv,problematic,not_recom,not_recom -great_pret,improper,completed,1,critical,convenient,nonprob,recommended,spec_prior -great_pret,improper,completed,1,critical,convenient,nonprob,priority,spec_prior -great_pret,improper,completed,1,critical,convenient,nonprob,not_recom,not_recom -great_pret,improper,completed,1,critical,convenient,slightly_prob,recommended,spec_prior -great_pret,improper,completed,1,critical,convenient,slightly_prob,priority,spec_prior -great_pret,improper,completed,1,critical,convenient,slightly_prob,not_recom,not_recom -great_pret,improper,completed,1,critical,convenient,problematic,recommended,spec_prior -great_pret,improper,completed,1,critical,convenient,problematic,priority,spec_prior -great_pret,improper,completed,1,critical,convenient,problematic,not_recom,not_recom -great_pret,improper,completed,1,critical,inconv,nonprob,recommended,spec_prior -great_pret,improper,completed,1,critical,inconv,nonprob,priority,spec_prior -great_pret,improper,completed,1,critical,inconv,nonprob,not_recom,not_recom -great_pret,improper,completed,1,critical,inconv,slightly_prob,recommended,spec_prior 
-great_pret,improper,completed,1,critical,inconv,slightly_prob,priority,spec_prior -great_pret,improper,completed,1,critical,inconv,slightly_prob,not_recom,not_recom -great_pret,improper,completed,1,critical,inconv,problematic,recommended,spec_prior -great_pret,improper,completed,1,critical,inconv,problematic,priority,spec_prior -great_pret,improper,completed,1,critical,inconv,problematic,not_recom,not_recom -great_pret,improper,completed,2,convenient,convenient,nonprob,recommended,priority -great_pret,improper,completed,2,convenient,convenient,nonprob,priority,spec_prior -great_pret,improper,completed,2,convenient,convenient,nonprob,not_recom,not_recom -great_pret,improper,completed,2,convenient,convenient,slightly_prob,recommended,priority -great_pret,improper,completed,2,convenient,convenient,slightly_prob,priority,spec_prior -great_pret,improper,completed,2,convenient,convenient,slightly_prob,not_recom,not_recom -great_pret,improper,completed,2,convenient,convenient,problematic,recommended,spec_prior -great_pret,improper,completed,2,convenient,convenient,problematic,priority,spec_prior -great_pret,improper,completed,2,convenient,convenient,problematic,not_recom,not_recom -great_pret,improper,completed,2,convenient,inconv,nonprob,recommended,priority -great_pret,improper,completed,2,convenient,inconv,nonprob,priority,spec_prior -great_pret,improper,completed,2,convenient,inconv,nonprob,not_recom,not_recom -great_pret,improper,completed,2,convenient,inconv,slightly_prob,recommended,priority -great_pret,improper,completed,2,convenient,inconv,slightly_prob,priority,spec_prior -great_pret,improper,completed,2,convenient,inconv,slightly_prob,not_recom,not_recom -great_pret,improper,completed,2,convenient,inconv,problematic,recommended,spec_prior -great_pret,improper,completed,2,convenient,inconv,problematic,priority,spec_prior -great_pret,improper,completed,2,convenient,inconv,problematic,not_recom,not_recom 
-great_pret,improper,completed,2,less_conv,convenient,nonprob,recommended,priority -great_pret,improper,completed,2,less_conv,convenient,nonprob,priority,spec_prior -great_pret,improper,completed,2,less_conv,convenient,nonprob,not_recom,not_recom -great_pret,improper,completed,2,less_conv,convenient,slightly_prob,recommended,priority -great_pret,improper,completed,2,less_conv,convenient,slightly_prob,priority,spec_prior -great_pret,improper,completed,2,less_conv,convenient,slightly_prob,not_recom,not_recom -great_pret,improper,completed,2,less_conv,convenient,problematic,recommended,spec_prior -great_pret,improper,completed,2,less_conv,convenient,problematic,priority,spec_prior -great_pret,improper,completed,2,less_conv,convenient,problematic,not_recom,not_recom -great_pret,improper,completed,2,less_conv,inconv,nonprob,recommended,priority -great_pret,improper,completed,2,less_conv,inconv,nonprob,priority,spec_prior -great_pret,improper,completed,2,less_conv,inconv,nonprob,not_recom,not_recom -great_pret,improper,completed,2,less_conv,inconv,slightly_prob,recommended,priority -great_pret,improper,completed,2,less_conv,inconv,slightly_prob,priority,spec_prior -great_pret,improper,completed,2,less_conv,inconv,slightly_prob,not_recom,not_recom -great_pret,improper,completed,2,less_conv,inconv,problematic,recommended,spec_prior -great_pret,improper,completed,2,less_conv,inconv,problematic,priority,spec_prior -great_pret,improper,completed,2,less_conv,inconv,problematic,not_recom,not_recom -great_pret,improper,completed,2,critical,convenient,nonprob,recommended,spec_prior -great_pret,improper,completed,2,critical,convenient,nonprob,priority,spec_prior -great_pret,improper,completed,2,critical,convenient,nonprob,not_recom,not_recom -great_pret,improper,completed,2,critical,convenient,slightly_prob,recommended,spec_prior -great_pret,improper,completed,2,critical,convenient,slightly_prob,priority,spec_prior 
-great_pret,improper,completed,2,critical,convenient,slightly_prob,not_recom,not_recom -great_pret,improper,completed,2,critical,convenient,problematic,recommended,spec_prior -great_pret,improper,completed,2,critical,convenient,problematic,priority,spec_prior -great_pret,improper,completed,2,critical,convenient,problematic,not_recom,not_recom -great_pret,improper,completed,2,critical,inconv,nonprob,recommended,spec_prior -great_pret,improper,completed,2,critical,inconv,nonprob,priority,spec_prior -great_pret,improper,completed,2,critical,inconv,nonprob,not_recom,not_recom -great_pret,improper,completed,2,critical,inconv,slightly_prob,recommended,spec_prior -great_pret,improper,completed,2,critical,inconv,slightly_prob,priority,spec_prior -great_pret,improper,completed,2,critical,inconv,slightly_prob,not_recom,not_recom -great_pret,improper,completed,2,critical,inconv,problematic,recommended,spec_prior -great_pret,improper,completed,2,critical,inconv,problematic,priority,spec_prior -great_pret,improper,completed,2,critical,inconv,problematic,not_recom,not_recom -great_pret,improper,completed,3,convenient,convenient,nonprob,recommended,priority -great_pret,improper,completed,3,convenient,convenient,nonprob,priority,spec_prior -great_pret,improper,completed,3,convenient,convenient,nonprob,not_recom,not_recom -great_pret,improper,completed,3,convenient,convenient,slightly_prob,recommended,priority -great_pret,improper,completed,3,convenient,convenient,slightly_prob,priority,spec_prior -great_pret,improper,completed,3,convenient,convenient,slightly_prob,not_recom,not_recom -great_pret,improper,completed,3,convenient,convenient,problematic,recommended,spec_prior -great_pret,improper,completed,3,convenient,convenient,problematic,priority,spec_prior -great_pret,improper,completed,3,convenient,convenient,problematic,not_recom,not_recom -great_pret,improper,completed,3,convenient,inconv,nonprob,recommended,spec_prior 
-great_pret,improper,completed,3,convenient,inconv,nonprob,priority,spec_prior -great_pret,improper,completed,3,convenient,inconv,nonprob,not_recom,not_recom -great_pret,improper,completed,3,convenient,inconv,slightly_prob,recommended,spec_prior -great_pret,improper,completed,3,convenient,inconv,slightly_prob,priority,spec_prior -great_pret,improper,completed,3,convenient,inconv,slightly_prob,not_recom,not_recom -great_pret,improper,completed,3,convenient,inconv,problematic,recommended,spec_prior -great_pret,improper,completed,3,convenient,inconv,problematic,priority,spec_prior -great_pret,improper,completed,3,convenient,inconv,problematic,not_recom,not_recom -great_pret,improper,completed,3,less_conv,convenient,nonprob,recommended,spec_prior -great_pret,improper,completed,3,less_conv,convenient,nonprob,priority,spec_prior -great_pret,improper,completed,3,less_conv,convenient,nonprob,not_recom,not_recom -great_pret,improper,completed,3,less_conv,convenient,slightly_prob,recommended,spec_prior -great_pret,improper,completed,3,less_conv,convenient,slightly_prob,priority,spec_prior -great_pret,improper,completed,3,less_conv,convenient,slightly_prob,not_recom,not_recom -great_pret,improper,completed,3,less_conv,convenient,problematic,recommended,spec_prior -great_pret,improper,completed,3,less_conv,convenient,problematic,priority,spec_prior -great_pret,improper,completed,3,less_conv,convenient,problematic,not_recom,not_recom -great_pret,improper,completed,3,less_conv,inconv,nonprob,recommended,spec_prior -great_pret,improper,completed,3,less_conv,inconv,nonprob,priority,spec_prior -great_pret,improper,completed,3,less_conv,inconv,nonprob,not_recom,not_recom -great_pret,improper,completed,3,less_conv,inconv,slightly_prob,recommended,spec_prior -great_pret,improper,completed,3,less_conv,inconv,slightly_prob,priority,spec_prior -great_pret,improper,completed,3,less_conv,inconv,slightly_prob,not_recom,not_recom 
-great_pret,improper,completed,3,less_conv,inconv,problematic,recommended,spec_prior -great_pret,improper,completed,3,less_conv,inconv,problematic,priority,spec_prior -great_pret,improper,completed,3,less_conv,inconv,problematic,not_recom,not_recom -great_pret,improper,completed,3,critical,convenient,nonprob,recommended,spec_prior -great_pret,improper,completed,3,critical,convenient,nonprob,priority,spec_prior -great_pret,improper,completed,3,critical,convenient,nonprob,not_recom,not_recom -great_pret,improper,completed,3,critical,convenient,slightly_prob,recommended,spec_prior -great_pret,improper,completed,3,critical,convenient,slightly_prob,priority,spec_prior -great_pret,improper,completed,3,critical,convenient,slightly_prob,not_recom,not_recom -great_pret,improper,completed,3,critical,convenient,problematic,recommended,spec_prior -great_pret,improper,completed,3,critical,convenient,problematic,priority,spec_prior -great_pret,improper,completed,3,critical,convenient,problematic,not_recom,not_recom -great_pret,improper,completed,3,critical,inconv,nonprob,recommended,spec_prior -great_pret,improper,completed,3,critical,inconv,nonprob,priority,spec_prior -great_pret,improper,completed,3,critical,inconv,nonprob,not_recom,not_recom -great_pret,improper,completed,3,critical,inconv,slightly_prob,recommended,spec_prior -great_pret,improper,completed,3,critical,inconv,slightly_prob,priority,spec_prior -great_pret,improper,completed,3,critical,inconv,slightly_prob,not_recom,not_recom -great_pret,improper,completed,3,critical,inconv,problematic,recommended,spec_prior -great_pret,improper,completed,3,critical,inconv,problematic,priority,spec_prior -great_pret,improper,completed,3,critical,inconv,problematic,not_recom,not_recom -great_pret,improper,completed,more,convenient,convenient,nonprob,recommended,priority -great_pret,improper,completed,more,convenient,convenient,nonprob,priority,spec_prior 
-great_pret,improper,completed,more,convenient,convenient,nonprob,not_recom,not_recom -great_pret,improper,completed,more,convenient,convenient,slightly_prob,recommended,priority -great_pret,improper,completed,more,convenient,convenient,slightly_prob,priority,spec_prior -great_pret,improper,completed,more,convenient,convenient,slightly_prob,not_recom,not_recom -great_pret,improper,completed,more,convenient,convenient,problematic,recommended,spec_prior -great_pret,improper,completed,more,convenient,convenient,problematic,priority,spec_prior -great_pret,improper,completed,more,convenient,convenient,problematic,not_recom,not_recom -great_pret,improper,completed,more,convenient,inconv,nonprob,recommended,spec_prior -great_pret,improper,completed,more,convenient,inconv,nonprob,priority,spec_prior -great_pret,improper,completed,more,convenient,inconv,nonprob,not_recom,not_recom -great_pret,improper,completed,more,convenient,inconv,slightly_prob,recommended,spec_prior -great_pret,improper,completed,more,convenient,inconv,slightly_prob,priority,spec_prior -great_pret,improper,completed,more,convenient,inconv,slightly_prob,not_recom,not_recom -great_pret,improper,completed,more,convenient,inconv,problematic,recommended,spec_prior -great_pret,improper,completed,more,convenient,inconv,problematic,priority,spec_prior -great_pret,improper,completed,more,convenient,inconv,problematic,not_recom,not_recom -great_pret,improper,completed,more,less_conv,convenient,nonprob,recommended,spec_prior -great_pret,improper,completed,more,less_conv,convenient,nonprob,priority,spec_prior -great_pret,improper,completed,more,less_conv,convenient,nonprob,not_recom,not_recom -great_pret,improper,completed,more,less_conv,convenient,slightly_prob,recommended,spec_prior -great_pret,improper,completed,more,less_conv,convenient,slightly_prob,priority,spec_prior -great_pret,improper,completed,more,less_conv,convenient,slightly_prob,not_recom,not_recom 
-great_pret,improper,completed,more,less_conv,convenient,problematic,recommended,spec_prior -great_pret,improper,completed,more,less_conv,convenient,problematic,priority,spec_prior -great_pret,improper,completed,more,less_conv,convenient,problematic,not_recom,not_recom -great_pret,improper,completed,more,less_conv,inconv,nonprob,recommended,spec_prior -great_pret,improper,completed,more,less_conv,inconv,nonprob,priority,spec_prior -great_pret,improper,completed,more,less_conv,inconv,nonprob,not_recom,not_recom -great_pret,improper,completed,more,less_conv,inconv,slightly_prob,recommended,spec_prior -great_pret,improper,completed,more,less_conv,inconv,slightly_prob,priority,spec_prior -great_pret,improper,completed,more,less_conv,inconv,slightly_prob,not_recom,not_recom -great_pret,improper,completed,more,less_conv,inconv,problematic,recommended,spec_prior -great_pret,improper,completed,more,less_conv,inconv,problematic,priority,spec_prior -great_pret,improper,completed,more,less_conv,inconv,problematic,not_recom,not_recom -great_pret,improper,completed,more,critical,convenient,nonprob,recommended,spec_prior -great_pret,improper,completed,more,critical,convenient,nonprob,priority,spec_prior -great_pret,improper,completed,more,critical,convenient,nonprob,not_recom,not_recom -great_pret,improper,completed,more,critical,convenient,slightly_prob,recommended,spec_prior -great_pret,improper,completed,more,critical,convenient,slightly_prob,priority,spec_prior -great_pret,improper,completed,more,critical,convenient,slightly_prob,not_recom,not_recom -great_pret,improper,completed,more,critical,convenient,problematic,recommended,spec_prior -great_pret,improper,completed,more,critical,convenient,problematic,priority,spec_prior -great_pret,improper,completed,more,critical,convenient,problematic,not_recom,not_recom -great_pret,improper,completed,more,critical,inconv,nonprob,recommended,spec_prior -great_pret,improper,completed,more,critical,inconv,nonprob,priority,spec_prior 
-great_pret,improper,completed,more,critical,inconv,nonprob,not_recom,not_recom -great_pret,improper,completed,more,critical,inconv,slightly_prob,recommended,spec_prior -great_pret,improper,completed,more,critical,inconv,slightly_prob,priority,spec_prior -great_pret,improper,completed,more,critical,inconv,slightly_prob,not_recom,not_recom -great_pret,improper,completed,more,critical,inconv,problematic,recommended,spec_prior -great_pret,improper,completed,more,critical,inconv,problematic,priority,spec_prior -great_pret,improper,completed,more,critical,inconv,problematic,not_recom,not_recom -great_pret,improper,incomplete,1,convenient,convenient,nonprob,recommended,priority -great_pret,improper,incomplete,1,convenient,convenient,nonprob,priority,spec_prior -great_pret,improper,incomplete,1,convenient,convenient,nonprob,not_recom,not_recom -great_pret,improper,incomplete,1,convenient,convenient,slightly_prob,recommended,priority -great_pret,improper,incomplete,1,convenient,convenient,slightly_prob,priority,spec_prior -great_pret,improper,incomplete,1,convenient,convenient,slightly_prob,not_recom,not_recom -great_pret,improper,incomplete,1,convenient,convenient,problematic,recommended,spec_prior -great_pret,improper,incomplete,1,convenient,convenient,problematic,priority,spec_prior -great_pret,improper,incomplete,1,convenient,convenient,problematic,not_recom,not_recom -great_pret,improper,incomplete,1,convenient,inconv,nonprob,recommended,priority -great_pret,improper,incomplete,1,convenient,inconv,nonprob,priority,spec_prior -great_pret,improper,incomplete,1,convenient,inconv,nonprob,not_recom,not_recom -great_pret,improper,incomplete,1,convenient,inconv,slightly_prob,recommended,priority -great_pret,improper,incomplete,1,convenient,inconv,slightly_prob,priority,spec_prior -great_pret,improper,incomplete,1,convenient,inconv,slightly_prob,not_recom,not_recom -great_pret,improper,incomplete,1,convenient,inconv,problematic,recommended,spec_prior 
-great_pret,improper,incomplete,1,convenient,inconv,problematic,priority,spec_prior -great_pret,improper,incomplete,1,convenient,inconv,problematic,not_recom,not_recom -great_pret,improper,incomplete,1,less_conv,convenient,nonprob,recommended,priority -great_pret,improper,incomplete,1,less_conv,convenient,nonprob,priority,spec_prior -great_pret,improper,incomplete,1,less_conv,convenient,nonprob,not_recom,not_recom -great_pret,improper,incomplete,1,less_conv,convenient,slightly_prob,recommended,priority -great_pret,improper,incomplete,1,less_conv,convenient,slightly_prob,priority,spec_prior -great_pret,improper,incomplete,1,less_conv,convenient,slightly_prob,not_recom,not_recom -great_pret,improper,incomplete,1,less_conv,convenient,problematic,recommended,spec_prior -great_pret,improper,incomplete,1,less_conv,convenient,problematic,priority,spec_prior -great_pret,improper,incomplete,1,less_conv,convenient,problematic,not_recom,not_recom -great_pret,improper,incomplete,1,less_conv,inconv,nonprob,recommended,priority -great_pret,improper,incomplete,1,less_conv,inconv,nonprob,priority,spec_prior -great_pret,improper,incomplete,1,less_conv,inconv,nonprob,not_recom,not_recom -great_pret,improper,incomplete,1,less_conv,inconv,slightly_prob,recommended,priority -great_pret,improper,incomplete,1,less_conv,inconv,slightly_prob,priority,spec_prior -great_pret,improper,incomplete,1,less_conv,inconv,slightly_prob,not_recom,not_recom -great_pret,improper,incomplete,1,less_conv,inconv,problematic,recommended,spec_prior -great_pret,improper,incomplete,1,less_conv,inconv,problematic,priority,spec_prior -great_pret,improper,incomplete,1,less_conv,inconv,problematic,not_recom,not_recom -great_pret,improper,incomplete,1,critical,convenient,nonprob,recommended,spec_prior -great_pret,improper,incomplete,1,critical,convenient,nonprob,priority,spec_prior -great_pret,improper,incomplete,1,critical,convenient,nonprob,not_recom,not_recom 
-great_pret,improper,incomplete,1,critical,convenient,slightly_prob,recommended,spec_prior -great_pret,improper,incomplete,1,critical,convenient,slightly_prob,priority,spec_prior -great_pret,improper,incomplete,1,critical,convenient,slightly_prob,not_recom,not_recom -great_pret,improper,incomplete,1,critical,convenient,problematic,recommended,spec_prior -great_pret,improper,incomplete,1,critical,convenient,problematic,priority,spec_prior -great_pret,improper,incomplete,1,critical,convenient,problematic,not_recom,not_recom -great_pret,improper,incomplete,1,critical,inconv,nonprob,recommended,spec_prior -great_pret,improper,incomplete,1,critical,inconv,nonprob,priority,spec_prior -great_pret,improper,incomplete,1,critical,inconv,nonprob,not_recom,not_recom -great_pret,improper,incomplete,1,critical,inconv,slightly_prob,recommended,spec_prior -great_pret,improper,incomplete,1,critical,inconv,slightly_prob,priority,spec_prior -great_pret,improper,incomplete,1,critical,inconv,slightly_prob,not_recom,not_recom -great_pret,improper,incomplete,1,critical,inconv,problematic,recommended,spec_prior -great_pret,improper,incomplete,1,critical,inconv,problematic,priority,spec_prior -great_pret,improper,incomplete,1,critical,inconv,problematic,not_recom,not_recom -great_pret,improper,incomplete,2,convenient,convenient,nonprob,recommended,priority -great_pret,improper,incomplete,2,convenient,convenient,nonprob,priority,spec_prior -great_pret,improper,incomplete,2,convenient,convenient,nonprob,not_recom,not_recom -great_pret,improper,incomplete,2,convenient,convenient,slightly_prob,recommended,priority -great_pret,improper,incomplete,2,convenient,convenient,slightly_prob,priority,spec_prior -great_pret,improper,incomplete,2,convenient,convenient,slightly_prob,not_recom,not_recom -great_pret,improper,incomplete,2,convenient,convenient,problematic,recommended,spec_prior -great_pret,improper,incomplete,2,convenient,convenient,problematic,priority,spec_prior 
-great_pret,improper,incomplete,2,convenient,convenient,problematic,not_recom,not_recom -great_pret,improper,incomplete,2,convenient,inconv,nonprob,recommended,spec_prior -great_pret,improper,incomplete,2,convenient,inconv,nonprob,priority,spec_prior -great_pret,improper,incomplete,2,convenient,inconv,nonprob,not_recom,not_recom -great_pret,improper,incomplete,2,convenient,inconv,slightly_prob,recommended,spec_prior -great_pret,improper,incomplete,2,convenient,inconv,slightly_prob,priority,spec_prior -great_pret,improper,incomplete,2,convenient,inconv,slightly_prob,not_recom,not_recom -great_pret,improper,incomplete,2,convenient,inconv,problematic,recommended,spec_prior -great_pret,improper,incomplete,2,convenient,inconv,problematic,priority,spec_prior -great_pret,improper,incomplete,2,convenient,inconv,problematic,not_recom,not_recom -great_pret,improper,incomplete,2,less_conv,convenient,nonprob,recommended,spec_prior -great_pret,improper,incomplete,2,less_conv,convenient,nonprob,priority,spec_prior -great_pret,improper,incomplete,2,less_conv,convenient,nonprob,not_recom,not_recom -great_pret,improper,incomplete,2,less_conv,convenient,slightly_prob,recommended,spec_prior -great_pret,improper,incomplete,2,less_conv,convenient,slightly_prob,priority,spec_prior -great_pret,improper,incomplete,2,less_conv,convenient,slightly_prob,not_recom,not_recom -great_pret,improper,incomplete,2,less_conv,convenient,problematic,recommended,spec_prior -great_pret,improper,incomplete,2,less_conv,convenient,problematic,priority,spec_prior -great_pret,improper,incomplete,2,less_conv,convenient,problematic,not_recom,not_recom -great_pret,improper,incomplete,2,less_conv,inconv,nonprob,recommended,spec_prior -great_pret,improper,incomplete,2,less_conv,inconv,nonprob,priority,spec_prior -great_pret,improper,incomplete,2,less_conv,inconv,nonprob,not_recom,not_recom -great_pret,improper,incomplete,2,less_conv,inconv,slightly_prob,recommended,spec_prior 
-great_pret,improper,incomplete,2,less_conv,inconv,slightly_prob,priority,spec_prior -great_pret,improper,incomplete,2,less_conv,inconv,slightly_prob,not_recom,not_recom -great_pret,improper,incomplete,2,less_conv,inconv,problematic,recommended,spec_prior -great_pret,improper,incomplete,2,less_conv,inconv,problematic,priority,spec_prior -great_pret,improper,incomplete,2,less_conv,inconv,problematic,not_recom,not_recom -great_pret,improper,incomplete,2,critical,convenient,nonprob,recommended,spec_prior -great_pret,improper,incomplete,2,critical,convenient,nonprob,priority,spec_prior -great_pret,improper,incomplete,2,critical,convenient,nonprob,not_recom,not_recom -great_pret,improper,incomplete,2,critical,convenient,slightly_prob,recommended,spec_prior -great_pret,improper,incomplete,2,critical,convenient,slightly_prob,priority,spec_prior -great_pret,improper,incomplete,2,critical,convenient,slightly_prob,not_recom,not_recom -great_pret,improper,incomplete,2,critical,convenient,problematic,recommended,spec_prior -great_pret,improper,incomplete,2,critical,convenient,problematic,priority,spec_prior -great_pret,improper,incomplete,2,critical,convenient,problematic,not_recom,not_recom -great_pret,improper,incomplete,2,critical,inconv,nonprob,recommended,spec_prior -great_pret,improper,incomplete,2,critical,inconv,nonprob,priority,spec_prior -great_pret,improper,incomplete,2,critical,inconv,nonprob,not_recom,not_recom -great_pret,improper,incomplete,2,critical,inconv,slightly_prob,recommended,spec_prior -great_pret,improper,incomplete,2,critical,inconv,slightly_prob,priority,spec_prior -great_pret,improper,incomplete,2,critical,inconv,slightly_prob,not_recom,not_recom -great_pret,improper,incomplete,2,critical,inconv,problematic,recommended,spec_prior -great_pret,improper,incomplete,2,critical,inconv,problematic,priority,spec_prior -great_pret,improper,incomplete,2,critical,inconv,problematic,not_recom,not_recom 
-great_pret,improper,incomplete,3,convenient,convenient,nonprob,recommended,priority -great_pret,improper,incomplete,3,convenient,convenient,nonprob,priority,spec_prior -great_pret,improper,incomplete,3,convenient,convenient,nonprob,not_recom,not_recom -great_pret,improper,incomplete,3,convenient,convenient,slightly_prob,recommended,priority -great_pret,improper,incomplete,3,convenient,convenient,slightly_prob,priority,spec_prior -great_pret,improper,incomplete,3,convenient,convenient,slightly_prob,not_recom,not_recom -great_pret,improper,incomplete,3,convenient,convenient,problematic,recommended,spec_prior -great_pret,improper,incomplete,3,convenient,convenient,problematic,priority,spec_prior -great_pret,improper,incomplete,3,convenient,convenient,problematic,not_recom,not_recom -great_pret,improper,incomplete,3,convenient,inconv,nonprob,recommended,spec_prior -great_pret,improper,incomplete,3,convenient,inconv,nonprob,priority,spec_prior -great_pret,improper,incomplete,3,convenient,inconv,nonprob,not_recom,not_recom -great_pret,improper,incomplete,3,convenient,inconv,slightly_prob,recommended,spec_prior -great_pret,improper,incomplete,3,convenient,inconv,slightly_prob,priority,spec_prior -great_pret,improper,incomplete,3,convenient,inconv,slightly_prob,not_recom,not_recom -great_pret,improper,incomplete,3,convenient,inconv,problematic,recommended,spec_prior -great_pret,improper,incomplete,3,convenient,inconv,problematic,priority,spec_prior -great_pret,improper,incomplete,3,convenient,inconv,problematic,not_recom,not_recom -great_pret,improper,incomplete,3,less_conv,convenient,nonprob,recommended,spec_prior -great_pret,improper,incomplete,3,less_conv,convenient,nonprob,priority,spec_prior -great_pret,improper,incomplete,3,less_conv,convenient,nonprob,not_recom,not_recom -great_pret,improper,incomplete,3,less_conv,convenient,slightly_prob,recommended,spec_prior -great_pret,improper,incomplete,3,less_conv,convenient,slightly_prob,priority,spec_prior 
-great_pret,improper,incomplete,3,less_conv,convenient,slightly_prob,not_recom,not_recom -great_pret,improper,incomplete,3,less_conv,convenient,problematic,recommended,spec_prior -great_pret,improper,incomplete,3,less_conv,convenient,problematic,priority,spec_prior -great_pret,improper,incomplete,3,less_conv,convenient,problematic,not_recom,not_recom -great_pret,improper,incomplete,3,less_conv,inconv,nonprob,recommended,spec_prior -great_pret,improper,incomplete,3,less_conv,inconv,nonprob,priority,spec_prior -great_pret,improper,incomplete,3,less_conv,inconv,nonprob,not_recom,not_recom -great_pret,improper,incomplete,3,less_conv,inconv,slightly_prob,recommended,spec_prior -great_pret,improper,incomplete,3,less_conv,inconv,slightly_prob,priority,spec_prior -great_pret,improper,incomplete,3,less_conv,inconv,slightly_prob,not_recom,not_recom -great_pret,improper,incomplete,3,less_conv,inconv,problematic,recommended,spec_prior -great_pret,improper,incomplete,3,less_conv,inconv,problematic,priority,spec_prior -great_pret,improper,incomplete,3,less_conv,inconv,problematic,not_recom,not_recom -great_pret,improper,incomplete,3,critical,convenient,nonprob,recommended,spec_prior -great_pret,improper,incomplete,3,critical,convenient,nonprob,priority,spec_prior -great_pret,improper,incomplete,3,critical,convenient,nonprob,not_recom,not_recom -great_pret,improper,incomplete,3,critical,convenient,slightly_prob,recommended,spec_prior -great_pret,improper,incomplete,3,critical,convenient,slightly_prob,priority,spec_prior -great_pret,improper,incomplete,3,critical,convenient,slightly_prob,not_recom,not_recom -great_pret,improper,incomplete,3,critical,convenient,problematic,recommended,spec_prior -great_pret,improper,incomplete,3,critical,convenient,problematic,priority,spec_prior -great_pret,improper,incomplete,3,critical,convenient,problematic,not_recom,not_recom -great_pret,improper,incomplete,3,critical,inconv,nonprob,recommended,spec_prior 
-great_pret,improper,incomplete,3,critical,inconv,nonprob,priority,spec_prior -great_pret,improper,incomplete,3,critical,inconv,nonprob,not_recom,not_recom -great_pret,improper,incomplete,3,critical,inconv,slightly_prob,recommended,spec_prior -great_pret,improper,incomplete,3,critical,inconv,slightly_prob,priority,spec_prior -great_pret,improper,incomplete,3,critical,inconv,slightly_prob,not_recom,not_recom -great_pret,improper,incomplete,3,critical,inconv,problematic,recommended,spec_prior -great_pret,improper,incomplete,3,critical,inconv,problematic,priority,spec_prior -great_pret,improper,incomplete,3,critical,inconv,problematic,not_recom,not_recom -great_pret,improper,incomplete,more,convenient,convenient,nonprob,recommended,priority -great_pret,improper,incomplete,more,convenient,convenient,nonprob,priority,spec_prior -great_pret,improper,incomplete,more,convenient,convenient,nonprob,not_recom,not_recom -great_pret,improper,incomplete,more,convenient,convenient,slightly_prob,recommended,priority -great_pret,improper,incomplete,more,convenient,convenient,slightly_prob,priority,spec_prior -great_pret,improper,incomplete,more,convenient,convenient,slightly_prob,not_recom,not_recom -great_pret,improper,incomplete,more,convenient,convenient,problematic,recommended,spec_prior -great_pret,improper,incomplete,more,convenient,convenient,problematic,priority,spec_prior -great_pret,improper,incomplete,more,convenient,convenient,problematic,not_recom,not_recom -great_pret,improper,incomplete,more,convenient,inconv,nonprob,recommended,spec_prior -great_pret,improper,incomplete,more,convenient,inconv,nonprob,priority,spec_prior -great_pret,improper,incomplete,more,convenient,inconv,nonprob,not_recom,not_recom -great_pret,improper,incomplete,more,convenient,inconv,slightly_prob,recommended,spec_prior -great_pret,improper,incomplete,more,convenient,inconv,slightly_prob,priority,spec_prior -great_pret,improper,incomplete,more,convenient,inconv,slightly_prob,not_recom,not_recom 
-great_pret,improper,incomplete,more,convenient,inconv,problematic,recommended,spec_prior -great_pret,improper,incomplete,more,convenient,inconv,problematic,priority,spec_prior -great_pret,improper,incomplete,more,convenient,inconv,problematic,not_recom,not_recom -great_pret,improper,incomplete,more,less_conv,convenient,nonprob,recommended,spec_prior -great_pret,improper,incomplete,more,less_conv,convenient,nonprob,priority,spec_prior -great_pret,improper,incomplete,more,less_conv,convenient,nonprob,not_recom,not_recom -great_pret,improper,incomplete,more,less_conv,convenient,slightly_prob,recommended,spec_prior -great_pret,improper,incomplete,more,less_conv,convenient,slightly_prob,priority,spec_prior -great_pret,improper,incomplete,more,less_conv,convenient,slightly_prob,not_recom,not_recom -great_pret,improper,incomplete,more,less_conv,convenient,problematic,recommended,spec_prior -great_pret,improper,incomplete,more,less_conv,convenient,problematic,priority,spec_prior -great_pret,improper,incomplete,more,less_conv,convenient,problematic,not_recom,not_recom -great_pret,improper,incomplete,more,less_conv,inconv,nonprob,recommended,spec_prior -great_pret,improper,incomplete,more,less_conv,inconv,nonprob,priority,spec_prior -great_pret,improper,incomplete,more,less_conv,inconv,nonprob,not_recom,not_recom -great_pret,improper,incomplete,more,less_conv,inconv,slightly_prob,recommended,spec_prior -great_pret,improper,incomplete,more,less_conv,inconv,slightly_prob,priority,spec_prior -great_pret,improper,incomplete,more,less_conv,inconv,slightly_prob,not_recom,not_recom -great_pret,improper,incomplete,more,less_conv,inconv,problematic,recommended,spec_prior -great_pret,improper,incomplete,more,less_conv,inconv,problematic,priority,spec_prior -great_pret,improper,incomplete,more,less_conv,inconv,problematic,not_recom,not_recom -great_pret,improper,incomplete,more,critical,convenient,nonprob,recommended,spec_prior 
-great_pret,improper,incomplete,more,critical,convenient,nonprob,priority,spec_prior -great_pret,improper,incomplete,more,critical,convenient,nonprob,not_recom,not_recom -great_pret,improper,incomplete,more,critical,convenient,slightly_prob,recommended,spec_prior -great_pret,improper,incomplete,more,critical,convenient,slightly_prob,priority,spec_prior -great_pret,improper,incomplete,more,critical,convenient,slightly_prob,not_recom,not_recom -great_pret,improper,incomplete,more,critical,convenient,problematic,recommended,spec_prior -great_pret,improper,incomplete,more,critical,convenient,problematic,priority,spec_prior -great_pret,improper,incomplete,more,critical,convenient,problematic,not_recom,not_recom -great_pret,improper,incomplete,more,critical,inconv,nonprob,recommended,spec_prior -great_pret,improper,incomplete,more,critical,inconv,nonprob,priority,spec_prior -great_pret,improper,incomplete,more,critical,inconv,nonprob,not_recom,not_recom -great_pret,improper,incomplete,more,critical,inconv,slightly_prob,recommended,spec_prior -great_pret,improper,incomplete,more,critical,inconv,slightly_prob,priority,spec_prior -great_pret,improper,incomplete,more,critical,inconv,slightly_prob,not_recom,not_recom -great_pret,improper,incomplete,more,critical,inconv,problematic,recommended,spec_prior -great_pret,improper,incomplete,more,critical,inconv,problematic,priority,spec_prior -great_pret,improper,incomplete,more,critical,inconv,problematic,not_recom,not_recom -great_pret,improper,foster,1,convenient,convenient,nonprob,recommended,priority -great_pret,improper,foster,1,convenient,convenient,nonprob,priority,spec_prior -great_pret,improper,foster,1,convenient,convenient,nonprob,not_recom,not_recom -great_pret,improper,foster,1,convenient,convenient,slightly_prob,recommended,priority -great_pret,improper,foster,1,convenient,convenient,slightly_prob,priority,spec_prior -great_pret,improper,foster,1,convenient,convenient,slightly_prob,not_recom,not_recom 
-great_pret,improper,foster,1,convenient,convenient,problematic,recommended,spec_prior -great_pret,improper,foster,1,convenient,convenient,problematic,priority,spec_prior -great_pret,improper,foster,1,convenient,convenient,problematic,not_recom,not_recom -great_pret,improper,foster,1,convenient,inconv,nonprob,recommended,spec_prior -great_pret,improper,foster,1,convenient,inconv,nonprob,priority,spec_prior -great_pret,improper,foster,1,convenient,inconv,nonprob,not_recom,not_recom -great_pret,improper,foster,1,convenient,inconv,slightly_prob,recommended,spec_prior -great_pret,improper,foster,1,convenient,inconv,slightly_prob,priority,spec_prior -great_pret,improper,foster,1,convenient,inconv,slightly_prob,not_recom,not_recom -great_pret,improper,foster,1,convenient,inconv,problematic,recommended,spec_prior -great_pret,improper,foster,1,convenient,inconv,problematic,priority,spec_prior -great_pret,improper,foster,1,convenient,inconv,problematic,not_recom,not_recom -great_pret,improper,foster,1,less_conv,convenient,nonprob,recommended,spec_prior -great_pret,improper,foster,1,less_conv,convenient,nonprob,priority,spec_prior -great_pret,improper,foster,1,less_conv,convenient,nonprob,not_recom,not_recom -great_pret,improper,foster,1,less_conv,convenient,slightly_prob,recommended,spec_prior -great_pret,improper,foster,1,less_conv,convenient,slightly_prob,priority,spec_prior -great_pret,improper,foster,1,less_conv,convenient,slightly_prob,not_recom,not_recom -great_pret,improper,foster,1,less_conv,convenient,problematic,recommended,spec_prior -great_pret,improper,foster,1,less_conv,convenient,problematic,priority,spec_prior -great_pret,improper,foster,1,less_conv,convenient,problematic,not_recom,not_recom -great_pret,improper,foster,1,less_conv,inconv,nonprob,recommended,spec_prior -great_pret,improper,foster,1,less_conv,inconv,nonprob,priority,spec_prior -great_pret,improper,foster,1,less_conv,inconv,nonprob,not_recom,not_recom 
-great_pret,improper,foster,1,less_conv,inconv,slightly_prob,recommended,spec_prior -great_pret,improper,foster,1,less_conv,inconv,slightly_prob,priority,spec_prior -great_pret,improper,foster,1,less_conv,inconv,slightly_prob,not_recom,not_recom -great_pret,improper,foster,1,less_conv,inconv,problematic,recommended,spec_prior -great_pret,improper,foster,1,less_conv,inconv,problematic,priority,spec_prior -great_pret,improper,foster,1,less_conv,inconv,problematic,not_recom,not_recom -great_pret,improper,foster,1,critical,convenient,nonprob,recommended,spec_prior -great_pret,improper,foster,1,critical,convenient,nonprob,priority,spec_prior -great_pret,improper,foster,1,critical,convenient,nonprob,not_recom,not_recom -great_pret,improper,foster,1,critical,convenient,slightly_prob,recommended,spec_prior -great_pret,improper,foster,1,critical,convenient,slightly_prob,priority,spec_prior -great_pret,improper,foster,1,critical,convenient,slightly_prob,not_recom,not_recom -great_pret,improper,foster,1,critical,convenient,problematic,recommended,spec_prior -great_pret,improper,foster,1,critical,convenient,problematic,priority,spec_prior -great_pret,improper,foster,1,critical,convenient,problematic,not_recom,not_recom -great_pret,improper,foster,1,critical,inconv,nonprob,recommended,spec_prior -great_pret,improper,foster,1,critical,inconv,nonprob,priority,spec_prior -great_pret,improper,foster,1,critical,inconv,nonprob,not_recom,not_recom -great_pret,improper,foster,1,critical,inconv,slightly_prob,recommended,spec_prior -great_pret,improper,foster,1,critical,inconv,slightly_prob,priority,spec_prior -great_pret,improper,foster,1,critical,inconv,slightly_prob,not_recom,not_recom -great_pret,improper,foster,1,critical,inconv,problematic,recommended,spec_prior -great_pret,improper,foster,1,critical,inconv,problematic,priority,spec_prior -great_pret,improper,foster,1,critical,inconv,problematic,not_recom,not_recom 
-great_pret,improper,foster,2,convenient,convenient,nonprob,recommended,priority -great_pret,improper,foster,2,convenient,convenient,nonprob,priority,spec_prior -great_pret,improper,foster,2,convenient,convenient,nonprob,not_recom,not_recom -great_pret,improper,foster,2,convenient,convenient,slightly_prob,recommended,priority -great_pret,improper,foster,2,convenient,convenient,slightly_prob,priority,spec_prior -great_pret,improper,foster,2,convenient,convenient,slightly_prob,not_recom,not_recom -great_pret,improper,foster,2,convenient,convenient,problematic,recommended,spec_prior -great_pret,improper,foster,2,convenient,convenient,problematic,priority,spec_prior -great_pret,improper,foster,2,convenient,convenient,problematic,not_recom,not_recom -great_pret,improper,foster,2,convenient,inconv,nonprob,recommended,spec_prior -great_pret,improper,foster,2,convenient,inconv,nonprob,priority,spec_prior -great_pret,improper,foster,2,convenient,inconv,nonprob,not_recom,not_recom -great_pret,improper,foster,2,convenient,inconv,slightly_prob,recommended,spec_prior -great_pret,improper,foster,2,convenient,inconv,slightly_prob,priority,spec_prior -great_pret,improper,foster,2,convenient,inconv,slightly_prob,not_recom,not_recom -great_pret,improper,foster,2,convenient,inconv,problematic,recommended,spec_prior -great_pret,improper,foster,2,convenient,inconv,problematic,priority,spec_prior -great_pret,improper,foster,2,convenient,inconv,problematic,not_recom,not_recom -great_pret,improper,foster,2,less_conv,convenient,nonprob,recommended,spec_prior -great_pret,improper,foster,2,less_conv,convenient,nonprob,priority,spec_prior -great_pret,improper,foster,2,less_conv,convenient,nonprob,not_recom,not_recom -great_pret,improper,foster,2,less_conv,convenient,slightly_prob,recommended,spec_prior -great_pret,improper,foster,2,less_conv,convenient,slightly_prob,priority,spec_prior -great_pret,improper,foster,2,less_conv,convenient,slightly_prob,not_recom,not_recom 
-great_pret,improper,foster,2,less_conv,convenient,problematic,recommended,spec_prior -great_pret,improper,foster,2,less_conv,convenient,problematic,priority,spec_prior -great_pret,improper,foster,2,less_conv,convenient,problematic,not_recom,not_recom -great_pret,improper,foster,2,less_conv,inconv,nonprob,recommended,spec_prior -great_pret,improper,foster,2,less_conv,inconv,nonprob,priority,spec_prior -great_pret,improper,foster,2,less_conv,inconv,nonprob,not_recom,not_recom -great_pret,improper,foster,2,less_conv,inconv,slightly_prob,recommended,spec_prior -great_pret,improper,foster,2,less_conv,inconv,slightly_prob,priority,spec_prior -great_pret,improper,foster,2,less_conv,inconv,slightly_prob,not_recom,not_recom -great_pret,improper,foster,2,less_conv,inconv,problematic,recommended,spec_prior -great_pret,improper,foster,2,less_conv,inconv,problematic,priority,spec_prior -great_pret,improper,foster,2,less_conv,inconv,problematic,not_recom,not_recom -great_pret,improper,foster,2,critical,convenient,nonprob,recommended,spec_prior -great_pret,improper,foster,2,critical,convenient,nonprob,priority,spec_prior -great_pret,improper,foster,2,critical,convenient,nonprob,not_recom,not_recom -great_pret,improper,foster,2,critical,convenient,slightly_prob,recommended,spec_prior -great_pret,improper,foster,2,critical,convenient,slightly_prob,priority,spec_prior -great_pret,improper,foster,2,critical,convenient,slightly_prob,not_recom,not_recom -great_pret,improper,foster,2,critical,convenient,problematic,recommended,spec_prior -great_pret,improper,foster,2,critical,convenient,problematic,priority,spec_prior -great_pret,improper,foster,2,critical,convenient,problematic,not_recom,not_recom -great_pret,improper,foster,2,critical,inconv,nonprob,recommended,spec_prior -great_pret,improper,foster,2,critical,inconv,nonprob,priority,spec_prior -great_pret,improper,foster,2,critical,inconv,nonprob,not_recom,not_recom 
-great_pret,improper,foster,2,critical,inconv,slightly_prob,recommended,spec_prior -great_pret,improper,foster,2,critical,inconv,slightly_prob,priority,spec_prior -great_pret,improper,foster,2,critical,inconv,slightly_prob,not_recom,not_recom -great_pret,improper,foster,2,critical,inconv,problematic,recommended,spec_prior -great_pret,improper,foster,2,critical,inconv,problematic,priority,spec_prior -great_pret,improper,foster,2,critical,inconv,problematic,not_recom,not_recom -great_pret,improper,foster,3,convenient,convenient,nonprob,recommended,priority -great_pret,improper,foster,3,convenient,convenient,nonprob,priority,spec_prior -great_pret,improper,foster,3,convenient,convenient,nonprob,not_recom,not_recom -great_pret,improper,foster,3,convenient,convenient,slightly_prob,recommended,priority -great_pret,improper,foster,3,convenient,convenient,slightly_prob,priority,spec_prior -great_pret,improper,foster,3,convenient,convenient,slightly_prob,not_recom,not_recom -great_pret,improper,foster,3,convenient,convenient,problematic,recommended,spec_prior -great_pret,improper,foster,3,convenient,convenient,problematic,priority,spec_prior -great_pret,improper,foster,3,convenient,convenient,problematic,not_recom,not_recom -great_pret,improper,foster,3,convenient,inconv,nonprob,recommended,spec_prior -great_pret,improper,foster,3,convenient,inconv,nonprob,priority,spec_prior -great_pret,improper,foster,3,convenient,inconv,nonprob,not_recom,not_recom -great_pret,improper,foster,3,convenient,inconv,slightly_prob,recommended,spec_prior -great_pret,improper,foster,3,convenient,inconv,slightly_prob,priority,spec_prior -great_pret,improper,foster,3,convenient,inconv,slightly_prob,not_recom,not_recom -great_pret,improper,foster,3,convenient,inconv,problematic,recommended,spec_prior -great_pret,improper,foster,3,convenient,inconv,problematic,priority,spec_prior -great_pret,improper,foster,3,convenient,inconv,problematic,not_recom,not_recom 
-great_pret,improper,foster,3,less_conv,convenient,nonprob,recommended,spec_prior -great_pret,improper,foster,3,less_conv,convenient,nonprob,priority,spec_prior -great_pret,improper,foster,3,less_conv,convenient,nonprob,not_recom,not_recom -great_pret,improper,foster,3,less_conv,convenient,slightly_prob,recommended,spec_prior -great_pret,improper,foster,3,less_conv,convenient,slightly_prob,priority,spec_prior -great_pret,improper,foster,3,less_conv,convenient,slightly_prob,not_recom,not_recom -great_pret,improper,foster,3,less_conv,convenient,problematic,recommended,spec_prior -great_pret,improper,foster,3,less_conv,convenient,problematic,priority,spec_prior -great_pret,improper,foster,3,less_conv,convenient,problematic,not_recom,not_recom -great_pret,improper,foster,3,less_conv,inconv,nonprob,recommended,spec_prior -great_pret,improper,foster,3,less_conv,inconv,nonprob,priority,spec_prior -great_pret,improper,foster,3,less_conv,inconv,nonprob,not_recom,not_recom -great_pret,improper,foster,3,less_conv,inconv,slightly_prob,recommended,spec_prior -great_pret,improper,foster,3,less_conv,inconv,slightly_prob,priority,spec_prior -great_pret,improper,foster,3,less_conv,inconv,slightly_prob,not_recom,not_recom -great_pret,improper,foster,3,less_conv,inconv,problematic,recommended,spec_prior -great_pret,improper,foster,3,less_conv,inconv,problematic,priority,spec_prior -great_pret,improper,foster,3,less_conv,inconv,problematic,not_recom,not_recom -great_pret,improper,foster,3,critical,convenient,nonprob,recommended,spec_prior -great_pret,improper,foster,3,critical,convenient,nonprob,priority,spec_prior -great_pret,improper,foster,3,critical,convenient,nonprob,not_recom,not_recom -great_pret,improper,foster,3,critical,convenient,slightly_prob,recommended,spec_prior -great_pret,improper,foster,3,critical,convenient,slightly_prob,priority,spec_prior -great_pret,improper,foster,3,critical,convenient,slightly_prob,not_recom,not_recom 
-great_pret,improper,foster,3,critical,convenient,problematic,recommended,spec_prior -great_pret,improper,foster,3,critical,convenient,problematic,priority,spec_prior -great_pret,improper,foster,3,critical,convenient,problematic,not_recom,not_recom -great_pret,improper,foster,3,critical,inconv,nonprob,recommended,spec_prior -great_pret,improper,foster,3,critical,inconv,nonprob,priority,spec_prior -great_pret,improper,foster,3,critical,inconv,nonprob,not_recom,not_recom -great_pret,improper,foster,3,critical,inconv,slightly_prob,recommended,spec_prior -great_pret,improper,foster,3,critical,inconv,slightly_prob,priority,spec_prior -great_pret,improper,foster,3,critical,inconv,slightly_prob,not_recom,not_recom -great_pret,improper,foster,3,critical,inconv,problematic,recommended,spec_prior -great_pret,improper,foster,3,critical,inconv,problematic,priority,spec_prior -great_pret,improper,foster,3,critical,inconv,problematic,not_recom,not_recom -great_pret,improper,foster,more,convenient,convenient,nonprob,recommended,priority -great_pret,improper,foster,more,convenient,convenient,nonprob,priority,spec_prior -great_pret,improper,foster,more,convenient,convenient,nonprob,not_recom,not_recom -great_pret,improper,foster,more,convenient,convenient,slightly_prob,recommended,priority -great_pret,improper,foster,more,convenient,convenient,slightly_prob,priority,spec_prior -great_pret,improper,foster,more,convenient,convenient,slightly_prob,not_recom,not_recom -great_pret,improper,foster,more,convenient,convenient,problematic,recommended,spec_prior -great_pret,improper,foster,more,convenient,convenient,problematic,priority,spec_prior -great_pret,improper,foster,more,convenient,convenient,problematic,not_recom,not_recom -great_pret,improper,foster,more,convenient,inconv,nonprob,recommended,spec_prior -great_pret,improper,foster,more,convenient,inconv,nonprob,priority,spec_prior -great_pret,improper,foster,more,convenient,inconv,nonprob,not_recom,not_recom 
-great_pret,improper,foster,more,convenient,inconv,slightly_prob,recommended,spec_prior -great_pret,improper,foster,more,convenient,inconv,slightly_prob,priority,spec_prior -great_pret,improper,foster,more,convenient,inconv,slightly_prob,not_recom,not_recom -great_pret,improper,foster,more,convenient,inconv,problematic,recommended,spec_prior -great_pret,improper,foster,more,convenient,inconv,problematic,priority,spec_prior -great_pret,improper,foster,more,convenient,inconv,problematic,not_recom,not_recom -great_pret,improper,foster,more,less_conv,convenient,nonprob,recommended,spec_prior -great_pret,improper,foster,more,less_conv,convenient,nonprob,priority,spec_prior -great_pret,improper,foster,more,less_conv,convenient,nonprob,not_recom,not_recom -great_pret,improper,foster,more,less_conv,convenient,slightly_prob,recommended,spec_prior -great_pret,improper,foster,more,less_conv,convenient,slightly_prob,priority,spec_prior -great_pret,improper,foster,more,less_conv,convenient,slightly_prob,not_recom,not_recom -great_pret,improper,foster,more,less_conv,convenient,problematic,recommended,spec_prior -great_pret,improper,foster,more,less_conv,convenient,problematic,priority,spec_prior -great_pret,improper,foster,more,less_conv,convenient,problematic,not_recom,not_recom -great_pret,improper,foster,more,less_conv,inconv,nonprob,recommended,spec_prior -great_pret,improper,foster,more,less_conv,inconv,nonprob,priority,spec_prior -great_pret,improper,foster,more,less_conv,inconv,nonprob,not_recom,not_recom -great_pret,improper,foster,more,less_conv,inconv,slightly_prob,recommended,spec_prior -great_pret,improper,foster,more,less_conv,inconv,slightly_prob,priority,spec_prior -great_pret,improper,foster,more,less_conv,inconv,slightly_prob,not_recom,not_recom -great_pret,improper,foster,more,less_conv,inconv,problematic,recommended,spec_prior -great_pret,improper,foster,more,less_conv,inconv,problematic,priority,spec_prior 
-great_pret,improper,foster,more,less_conv,inconv,problematic,not_recom,not_recom -great_pret,improper,foster,more,critical,convenient,nonprob,recommended,spec_prior -great_pret,improper,foster,more,critical,convenient,nonprob,priority,spec_prior -great_pret,improper,foster,more,critical,convenient,nonprob,not_recom,not_recom -great_pret,improper,foster,more,critical,convenient,slightly_prob,recommended,spec_prior -great_pret,improper,foster,more,critical,convenient,slightly_prob,priority,spec_prior -great_pret,improper,foster,more,critical,convenient,slightly_prob,not_recom,not_recom -great_pret,improper,foster,more,critical,convenient,problematic,recommended,spec_prior -great_pret,improper,foster,more,critical,convenient,problematic,priority,spec_prior -great_pret,improper,foster,more,critical,convenient,problematic,not_recom,not_recom -great_pret,improper,foster,more,critical,inconv,nonprob,recommended,spec_prior -great_pret,improper,foster,more,critical,inconv,nonprob,priority,spec_prior -great_pret,improper,foster,more,critical,inconv,nonprob,not_recom,not_recom -great_pret,improper,foster,more,critical,inconv,slightly_prob,recommended,spec_prior -great_pret,improper,foster,more,critical,inconv,slightly_prob,priority,spec_prior -great_pret,improper,foster,more,critical,inconv,slightly_prob,not_recom,not_recom -great_pret,improper,foster,more,critical,inconv,problematic,recommended,spec_prior -great_pret,improper,foster,more,critical,inconv,problematic,priority,spec_prior -great_pret,improper,foster,more,critical,inconv,problematic,not_recom,not_recom -great_pret,critical,complete,1,convenient,convenient,nonprob,recommended,priority -great_pret,critical,complete,1,convenient,convenient,nonprob,priority,priority -great_pret,critical,complete,1,convenient,convenient,nonprob,not_recom,not_recom -great_pret,critical,complete,1,convenient,convenient,slightly_prob,recommended,priority 
-great_pret,critical,complete,1,convenient,convenient,slightly_prob,priority,priority -great_pret,critical,complete,1,convenient,convenient,slightly_prob,not_recom,not_recom -great_pret,critical,complete,1,convenient,convenient,problematic,recommended,priority -great_pret,critical,complete,1,convenient,convenient,problematic,priority,priority -great_pret,critical,complete,1,convenient,convenient,problematic,not_recom,not_recom -great_pret,critical,complete,1,convenient,inconv,nonprob,recommended,priority -great_pret,critical,complete,1,convenient,inconv,nonprob,priority,spec_prior -great_pret,critical,complete,1,convenient,inconv,nonprob,not_recom,not_recom -great_pret,critical,complete,1,convenient,inconv,slightly_prob,recommended,priority -great_pret,critical,complete,1,convenient,inconv,slightly_prob,priority,spec_prior -great_pret,critical,complete,1,convenient,inconv,slightly_prob,not_recom,not_recom -great_pret,critical,complete,1,convenient,inconv,problematic,recommended,spec_prior -great_pret,critical,complete,1,convenient,inconv,problematic,priority,spec_prior -great_pret,critical,complete,1,convenient,inconv,problematic,not_recom,not_recom -great_pret,critical,complete,1,less_conv,convenient,nonprob,recommended,priority -great_pret,critical,complete,1,less_conv,convenient,nonprob,priority,spec_prior -great_pret,critical,complete,1,less_conv,convenient,nonprob,not_recom,not_recom -great_pret,critical,complete,1,less_conv,convenient,slightly_prob,recommended,priority -great_pret,critical,complete,1,less_conv,convenient,slightly_prob,priority,spec_prior -great_pret,critical,complete,1,less_conv,convenient,slightly_prob,not_recom,not_recom -great_pret,critical,complete,1,less_conv,convenient,problematic,recommended,spec_prior -great_pret,critical,complete,1,less_conv,convenient,problematic,priority,spec_prior -great_pret,critical,complete,1,less_conv,convenient,problematic,not_recom,not_recom 
-great_pret,critical,complete,1,less_conv,inconv,nonprob,recommended,priority -great_pret,critical,complete,1,less_conv,inconv,nonprob,priority,spec_prior -great_pret,critical,complete,1,less_conv,inconv,nonprob,not_recom,not_recom -great_pret,critical,complete,1,less_conv,inconv,slightly_prob,recommended,priority -great_pret,critical,complete,1,less_conv,inconv,slightly_prob,priority,spec_prior -great_pret,critical,complete,1,less_conv,inconv,slightly_prob,not_recom,not_recom -great_pret,critical,complete,1,less_conv,inconv,problematic,recommended,spec_prior -great_pret,critical,complete,1,less_conv,inconv,problematic,priority,spec_prior -great_pret,critical,complete,1,less_conv,inconv,problematic,not_recom,not_recom -great_pret,critical,complete,1,critical,convenient,nonprob,recommended,priority -great_pret,critical,complete,1,critical,convenient,nonprob,priority,spec_prior -great_pret,critical,complete,1,critical,convenient,nonprob,not_recom,not_recom -great_pret,critical,complete,1,critical,convenient,slightly_prob,recommended,priority -great_pret,critical,complete,1,critical,convenient,slightly_prob,priority,spec_prior -great_pret,critical,complete,1,critical,convenient,slightly_prob,not_recom,not_recom -great_pret,critical,complete,1,critical,convenient,problematic,recommended,spec_prior -great_pret,critical,complete,1,critical,convenient,problematic,priority,spec_prior -great_pret,critical,complete,1,critical,convenient,problematic,not_recom,not_recom -great_pret,critical,complete,1,critical,inconv,nonprob,recommended,priority -great_pret,critical,complete,1,critical,inconv,nonprob,priority,spec_prior -great_pret,critical,complete,1,critical,inconv,nonprob,not_recom,not_recom -great_pret,critical,complete,1,critical,inconv,slightly_prob,recommended,priority -great_pret,critical,complete,1,critical,inconv,slightly_prob,priority,spec_prior -great_pret,critical,complete,1,critical,inconv,slightly_prob,not_recom,not_recom 
-great_pret,critical,complete,1,critical,inconv,problematic,recommended,spec_prior -great_pret,critical,complete,1,critical,inconv,problematic,priority,spec_prior -great_pret,critical,complete,1,critical,inconv,problematic,not_recom,not_recom -great_pret,critical,complete,2,convenient,convenient,nonprob,recommended,priority -great_pret,critical,complete,2,convenient,convenient,nonprob,priority,spec_prior -great_pret,critical,complete,2,convenient,convenient,nonprob,not_recom,not_recom -great_pret,critical,complete,2,convenient,convenient,slightly_prob,recommended,priority -great_pret,critical,complete,2,convenient,convenient,slightly_prob,priority,spec_prior -great_pret,critical,complete,2,convenient,convenient,slightly_prob,not_recom,not_recom -great_pret,critical,complete,2,convenient,convenient,problematic,recommended,spec_prior -great_pret,critical,complete,2,convenient,convenient,problematic,priority,spec_prior -great_pret,critical,complete,2,convenient,convenient,problematic,not_recom,not_recom -great_pret,critical,complete,2,convenient,inconv,nonprob,recommended,priority -great_pret,critical,complete,2,convenient,inconv,nonprob,priority,spec_prior -great_pret,critical,complete,2,convenient,inconv,nonprob,not_recom,not_recom -great_pret,critical,complete,2,convenient,inconv,slightly_prob,recommended,priority -great_pret,critical,complete,2,convenient,inconv,slightly_prob,priority,spec_prior -great_pret,critical,complete,2,convenient,inconv,slightly_prob,not_recom,not_recom -great_pret,critical,complete,2,convenient,inconv,problematic,recommended,spec_prior -great_pret,critical,complete,2,convenient,inconv,problematic,priority,spec_prior -great_pret,critical,complete,2,convenient,inconv,problematic,not_recom,not_recom -great_pret,critical,complete,2,less_conv,convenient,nonprob,recommended,priority -great_pret,critical,complete,2,less_conv,convenient,nonprob,priority,spec_prior -great_pret,critical,complete,2,less_conv,convenient,nonprob,not_recom,not_recom 
-great_pret,critical,complete,2,less_conv,convenient,slightly_prob,recommended,priority -great_pret,critical,complete,2,less_conv,convenient,slightly_prob,priority,spec_prior -great_pret,critical,complete,2,less_conv,convenient,slightly_prob,not_recom,not_recom -great_pret,critical,complete,2,less_conv,convenient,problematic,recommended,spec_prior -great_pret,critical,complete,2,less_conv,convenient,problematic,priority,spec_prior -great_pret,critical,complete,2,less_conv,convenient,problematic,not_recom,not_recom -great_pret,critical,complete,2,less_conv,inconv,nonprob,recommended,priority -great_pret,critical,complete,2,less_conv,inconv,nonprob,priority,spec_prior -great_pret,critical,complete,2,less_conv,inconv,nonprob,not_recom,not_recom -great_pret,critical,complete,2,less_conv,inconv,slightly_prob,recommended,priority -great_pret,critical,complete,2,less_conv,inconv,slightly_prob,priority,spec_prior -great_pret,critical,complete,2,less_conv,inconv,slightly_prob,not_recom,not_recom -great_pret,critical,complete,2,less_conv,inconv,problematic,recommended,spec_prior -great_pret,critical,complete,2,less_conv,inconv,problematic,priority,spec_prior -great_pret,critical,complete,2,less_conv,inconv,problematic,not_recom,not_recom -great_pret,critical,complete,2,critical,convenient,nonprob,recommended,spec_prior -great_pret,critical,complete,2,critical,convenient,nonprob,priority,spec_prior -great_pret,critical,complete,2,critical,convenient,nonprob,not_recom,not_recom -great_pret,critical,complete,2,critical,convenient,slightly_prob,recommended,spec_prior -great_pret,critical,complete,2,critical,convenient,slightly_prob,priority,spec_prior -great_pret,critical,complete,2,critical,convenient,slightly_prob,not_recom,not_recom -great_pret,critical,complete,2,critical,convenient,problematic,recommended,spec_prior -great_pret,critical,complete,2,critical,convenient,problematic,priority,spec_prior 
-great_pret,critical,complete,2,critical,convenient,problematic,not_recom,not_recom -great_pret,critical,complete,2,critical,inconv,nonprob,recommended,spec_prior -great_pret,critical,complete,2,critical,inconv,nonprob,priority,spec_prior -great_pret,critical,complete,2,critical,inconv,nonprob,not_recom,not_recom -great_pret,critical,complete,2,critical,inconv,slightly_prob,recommended,spec_prior -great_pret,critical,complete,2,critical,inconv,slightly_prob,priority,spec_prior -great_pret,critical,complete,2,critical,inconv,slightly_prob,not_recom,not_recom -great_pret,critical,complete,2,critical,inconv,problematic,recommended,spec_prior -great_pret,critical,complete,2,critical,inconv,problematic,priority,spec_prior -great_pret,critical,complete,2,critical,inconv,problematic,not_recom,not_recom -great_pret,critical,complete,3,convenient,convenient,nonprob,recommended,priority -great_pret,critical,complete,3,convenient,convenient,nonprob,priority,spec_prior -great_pret,critical,complete,3,convenient,convenient,nonprob,not_recom,not_recom -great_pret,critical,complete,3,convenient,convenient,slightly_prob,recommended,priority -great_pret,critical,complete,3,convenient,convenient,slightly_prob,priority,spec_prior -great_pret,critical,complete,3,convenient,convenient,slightly_prob,not_recom,not_recom -great_pret,critical,complete,3,convenient,convenient,problematic,recommended,spec_prior -great_pret,critical,complete,3,convenient,convenient,problematic,priority,spec_prior -great_pret,critical,complete,3,convenient,convenient,problematic,not_recom,not_recom -great_pret,critical,complete,3,convenient,inconv,nonprob,recommended,spec_prior -great_pret,critical,complete,3,convenient,inconv,nonprob,priority,spec_prior -great_pret,critical,complete,3,convenient,inconv,nonprob,not_recom,not_recom -great_pret,critical,complete,3,convenient,inconv,slightly_prob,recommended,spec_prior -great_pret,critical,complete,3,convenient,inconv,slightly_prob,priority,spec_prior 
-great_pret,critical,complete,3,convenient,inconv,slightly_prob,not_recom,not_recom -great_pret,critical,complete,3,convenient,inconv,problematic,recommended,spec_prior -great_pret,critical,complete,3,convenient,inconv,problematic,priority,spec_prior -great_pret,critical,complete,3,convenient,inconv,problematic,not_recom,not_recom -great_pret,critical,complete,3,less_conv,convenient,nonprob,recommended,spec_prior -great_pret,critical,complete,3,less_conv,convenient,nonprob,priority,spec_prior -great_pret,critical,complete,3,less_conv,convenient,nonprob,not_recom,not_recom -great_pret,critical,complete,3,less_conv,convenient,slightly_prob,recommended,spec_prior -great_pret,critical,complete,3,less_conv,convenient,slightly_prob,priority,spec_prior -great_pret,critical,complete,3,less_conv,convenient,slightly_prob,not_recom,not_recom -great_pret,critical,complete,3,less_conv,convenient,problematic,recommended,spec_prior -great_pret,critical,complete,3,less_conv,convenient,problematic,priority,spec_prior -great_pret,critical,complete,3,less_conv,convenient,problematic,not_recom,not_recom -great_pret,critical,complete,3,less_conv,inconv,nonprob,recommended,spec_prior -great_pret,critical,complete,3,less_conv,inconv,nonprob,priority,spec_prior -great_pret,critical,complete,3,less_conv,inconv,nonprob,not_recom,not_recom -great_pret,critical,complete,3,less_conv,inconv,slightly_prob,recommended,spec_prior -great_pret,critical,complete,3,less_conv,inconv,slightly_prob,priority,spec_prior -great_pret,critical,complete,3,less_conv,inconv,slightly_prob,not_recom,not_recom -great_pret,critical,complete,3,less_conv,inconv,problematic,recommended,spec_prior -great_pret,critical,complete,3,less_conv,inconv,problematic,priority,spec_prior -great_pret,critical,complete,3,less_conv,inconv,problematic,not_recom,not_recom -great_pret,critical,complete,3,critical,convenient,nonprob,recommended,spec_prior -great_pret,critical,complete,3,critical,convenient,nonprob,priority,spec_prior 
-great_pret,critical,complete,3,critical,convenient,nonprob,not_recom,not_recom -great_pret,critical,complete,3,critical,convenient,slightly_prob,recommended,spec_prior -great_pret,critical,complete,3,critical,convenient,slightly_prob,priority,spec_prior -great_pret,critical,complete,3,critical,convenient,slightly_prob,not_recom,not_recom -great_pret,critical,complete,3,critical,convenient,problematic,recommended,spec_prior -great_pret,critical,complete,3,critical,convenient,problematic,priority,spec_prior -great_pret,critical,complete,3,critical,convenient,problematic,not_recom,not_recom -great_pret,critical,complete,3,critical,inconv,nonprob,recommended,spec_prior -great_pret,critical,complete,3,critical,inconv,nonprob,priority,spec_prior -great_pret,critical,complete,3,critical,inconv,nonprob,not_recom,not_recom -great_pret,critical,complete,3,critical,inconv,slightly_prob,recommended,spec_prior -great_pret,critical,complete,3,critical,inconv,slightly_prob,priority,spec_prior -great_pret,critical,complete,3,critical,inconv,slightly_prob,not_recom,not_recom -great_pret,critical,complete,3,critical,inconv,problematic,recommended,spec_prior -great_pret,critical,complete,3,critical,inconv,problematic,priority,spec_prior -great_pret,critical,complete,3,critical,inconv,problematic,not_recom,not_recom -great_pret,critical,complete,more,convenient,convenient,nonprob,recommended,priority -great_pret,critical,complete,more,convenient,convenient,nonprob,priority,spec_prior -great_pret,critical,complete,more,convenient,convenient,nonprob,not_recom,not_recom -great_pret,critical,complete,more,convenient,convenient,slightly_prob,recommended,priority -great_pret,critical,complete,more,convenient,convenient,slightly_prob,priority,spec_prior -great_pret,critical,complete,more,convenient,convenient,slightly_prob,not_recom,not_recom -great_pret,critical,complete,more,convenient,convenient,problematic,recommended,spec_prior 
-great_pret,critical,complete,more,convenient,convenient,problematic,priority,spec_prior -great_pret,critical,complete,more,convenient,convenient,problematic,not_recom,not_recom -great_pret,critical,complete,more,convenient,inconv,nonprob,recommended,spec_prior -great_pret,critical,complete,more,convenient,inconv,nonprob,priority,spec_prior -great_pret,critical,complete,more,convenient,inconv,nonprob,not_recom,not_recom -great_pret,critical,complete,more,convenient,inconv,slightly_prob,recommended,spec_prior -great_pret,critical,complete,more,convenient,inconv,slightly_prob,priority,spec_prior -great_pret,critical,complete,more,convenient,inconv,slightly_prob,not_recom,not_recom -great_pret,critical,complete,more,convenient,inconv,problematic,recommended,spec_prior -great_pret,critical,complete,more,convenient,inconv,problematic,priority,spec_prior -great_pret,critical,complete,more,convenient,inconv,problematic,not_recom,not_recom -great_pret,critical,complete,more,less_conv,convenient,nonprob,recommended,spec_prior -great_pret,critical,complete,more,less_conv,convenient,nonprob,priority,spec_prior -great_pret,critical,complete,more,less_conv,convenient,nonprob,not_recom,not_recom -great_pret,critical,complete,more,less_conv,convenient,slightly_prob,recommended,spec_prior -great_pret,critical,complete,more,less_conv,convenient,slightly_prob,priority,spec_prior -great_pret,critical,complete,more,less_conv,convenient,slightly_prob,not_recom,not_recom -great_pret,critical,complete,more,less_conv,convenient,problematic,recommended,spec_prior -great_pret,critical,complete,more,less_conv,convenient,problematic,priority,spec_prior -great_pret,critical,complete,more,less_conv,convenient,problematic,not_recom,not_recom -great_pret,critical,complete,more,less_conv,inconv,nonprob,recommended,spec_prior -great_pret,critical,complete,more,less_conv,inconv,nonprob,priority,spec_prior -great_pret,critical,complete,more,less_conv,inconv,nonprob,not_recom,not_recom 
-great_pret,critical,complete,more,less_conv,inconv,slightly_prob,recommended,spec_prior -great_pret,critical,complete,more,less_conv,inconv,slightly_prob,priority,spec_prior -great_pret,critical,complete,more,less_conv,inconv,slightly_prob,not_recom,not_recom -great_pret,critical,complete,more,less_conv,inconv,problematic,recommended,spec_prior -great_pret,critical,complete,more,less_conv,inconv,problematic,priority,spec_prior -great_pret,critical,complete,more,less_conv,inconv,problematic,not_recom,not_recom -great_pret,critical,complete,more,critical,convenient,nonprob,recommended,spec_prior -great_pret,critical,complete,more,critical,convenient,nonprob,priority,spec_prior -great_pret,critical,complete,more,critical,convenient,nonprob,not_recom,not_recom -great_pret,critical,complete,more,critical,convenient,slightly_prob,recommended,spec_prior -great_pret,critical,complete,more,critical,convenient,slightly_prob,priority,spec_prior -great_pret,critical,complete,more,critical,convenient,slightly_prob,not_recom,not_recom -great_pret,critical,complete,more,critical,convenient,problematic,recommended,spec_prior -great_pret,critical,complete,more,critical,convenient,problematic,priority,spec_prior -great_pret,critical,complete,more,critical,convenient,problematic,not_recom,not_recom -great_pret,critical,complete,more,critical,inconv,nonprob,recommended,spec_prior -great_pret,critical,complete,more,critical,inconv,nonprob,priority,spec_prior -great_pret,critical,complete,more,critical,inconv,nonprob,not_recom,not_recom -great_pret,critical,complete,more,critical,inconv,slightly_prob,recommended,spec_prior -great_pret,critical,complete,more,critical,inconv,slightly_prob,priority,spec_prior -great_pret,critical,complete,more,critical,inconv,slightly_prob,not_recom,not_recom -great_pret,critical,complete,more,critical,inconv,problematic,recommended,spec_prior -great_pret,critical,complete,more,critical,inconv,problematic,priority,spec_prior 
-great_pret,critical,complete,more,critical,inconv,problematic,not_recom,not_recom -great_pret,critical,completed,1,convenient,convenient,nonprob,recommended,priority -great_pret,critical,completed,1,convenient,convenient,nonprob,priority,spec_prior -great_pret,critical,completed,1,convenient,convenient,nonprob,not_recom,not_recom -great_pret,critical,completed,1,convenient,convenient,slightly_prob,recommended,priority -great_pret,critical,completed,1,convenient,convenient,slightly_prob,priority,spec_prior -great_pret,critical,completed,1,convenient,convenient,slightly_prob,not_recom,not_recom -great_pret,critical,completed,1,convenient,convenient,problematic,recommended,spec_prior -great_pret,critical,completed,1,convenient,convenient,problematic,priority,spec_prior -great_pret,critical,completed,1,convenient,convenient,problematic,not_recom,not_recom -great_pret,critical,completed,1,convenient,inconv,nonprob,recommended,priority -great_pret,critical,completed,1,convenient,inconv,nonprob,priority,spec_prior -great_pret,critical,completed,1,convenient,inconv,nonprob,not_recom,not_recom -great_pret,critical,completed,1,convenient,inconv,slightly_prob,recommended,priority -great_pret,critical,completed,1,convenient,inconv,slightly_prob,priority,spec_prior -great_pret,critical,completed,1,convenient,inconv,slightly_prob,not_recom,not_recom -great_pret,critical,completed,1,convenient,inconv,problematic,recommended,spec_prior -great_pret,critical,completed,1,convenient,inconv,problematic,priority,spec_prior -great_pret,critical,completed,1,convenient,inconv,problematic,not_recom,not_recom -great_pret,critical,completed,1,less_conv,convenient,nonprob,recommended,priority -great_pret,critical,completed,1,less_conv,convenient,nonprob,priority,spec_prior -great_pret,critical,completed,1,less_conv,convenient,nonprob,not_recom,not_recom -great_pret,critical,completed,1,less_conv,convenient,slightly_prob,recommended,priority 
-great_pret,critical,completed,1,less_conv,convenient,slightly_prob,priority,spec_prior -great_pret,critical,completed,1,less_conv,convenient,slightly_prob,not_recom,not_recom -great_pret,critical,completed,1,less_conv,convenient,problematic,recommended,spec_prior -great_pret,critical,completed,1,less_conv,convenient,problematic,priority,spec_prior -great_pret,critical,completed,1,less_conv,convenient,problematic,not_recom,not_recom -great_pret,critical,completed,1,less_conv,inconv,nonprob,recommended,priority -great_pret,critical,completed,1,less_conv,inconv,nonprob,priority,spec_prior -great_pret,critical,completed,1,less_conv,inconv,nonprob,not_recom,not_recom -great_pret,critical,completed,1,less_conv,inconv,slightly_prob,recommended,priority -great_pret,critical,completed,1,less_conv,inconv,slightly_prob,priority,spec_prior -great_pret,critical,completed,1,less_conv,inconv,slightly_prob,not_recom,not_recom -great_pret,critical,completed,1,less_conv,inconv,problematic,recommended,spec_prior -great_pret,critical,completed,1,less_conv,inconv,problematic,priority,spec_prior -great_pret,critical,completed,1,less_conv,inconv,problematic,not_recom,not_recom -great_pret,critical,completed,1,critical,convenient,nonprob,recommended,spec_prior -great_pret,critical,completed,1,critical,convenient,nonprob,priority,spec_prior -great_pret,critical,completed,1,critical,convenient,nonprob,not_recom,not_recom -great_pret,critical,completed,1,critical,convenient,slightly_prob,recommended,spec_prior -great_pret,critical,completed,1,critical,convenient,slightly_prob,priority,spec_prior -great_pret,critical,completed,1,critical,convenient,slightly_prob,not_recom,not_recom -great_pret,critical,completed,1,critical,convenient,problematic,recommended,spec_prior -great_pret,critical,completed,1,critical,convenient,problematic,priority,spec_prior -great_pret,critical,completed,1,critical,convenient,problematic,not_recom,not_recom 
-great_pret,critical,completed,1,critical,inconv,nonprob,recommended,spec_prior -great_pret,critical,completed,1,critical,inconv,nonprob,priority,spec_prior -great_pret,critical,completed,1,critical,inconv,nonprob,not_recom,not_recom -great_pret,critical,completed,1,critical,inconv,slightly_prob,recommended,spec_prior -great_pret,critical,completed,1,critical,inconv,slightly_prob,priority,spec_prior -great_pret,critical,completed,1,critical,inconv,slightly_prob,not_recom,not_recom -great_pret,critical,completed,1,critical,inconv,problematic,recommended,spec_prior -great_pret,critical,completed,1,critical,inconv,problematic,priority,spec_prior -great_pret,critical,completed,1,critical,inconv,problematic,not_recom,not_recom -great_pret,critical,completed,2,convenient,convenient,nonprob,recommended,priority -great_pret,critical,completed,2,convenient,convenient,nonprob,priority,spec_prior -great_pret,critical,completed,2,convenient,convenient,nonprob,not_recom,not_recom -great_pret,critical,completed,2,convenient,convenient,slightly_prob,recommended,priority -great_pret,critical,completed,2,convenient,convenient,slightly_prob,priority,spec_prior -great_pret,critical,completed,2,convenient,convenient,slightly_prob,not_recom,not_recom -great_pret,critical,completed,2,convenient,convenient,problematic,recommended,spec_prior -great_pret,critical,completed,2,convenient,convenient,problematic,priority,spec_prior -great_pret,critical,completed,2,convenient,convenient,problematic,not_recom,not_recom -great_pret,critical,completed,2,convenient,inconv,nonprob,recommended,priority -great_pret,critical,completed,2,convenient,inconv,nonprob,priority,spec_prior -great_pret,critical,completed,2,convenient,inconv,nonprob,not_recom,not_recom -great_pret,critical,completed,2,convenient,inconv,slightly_prob,recommended,priority -great_pret,critical,completed,2,convenient,inconv,slightly_prob,priority,spec_prior 
-great_pret,critical,completed,2,convenient,inconv,slightly_prob,not_recom,not_recom -great_pret,critical,completed,2,convenient,inconv,problematic,recommended,spec_prior -great_pret,critical,completed,2,convenient,inconv,problematic,priority,spec_prior -great_pret,critical,completed,2,convenient,inconv,problematic,not_recom,not_recom -great_pret,critical,completed,2,less_conv,convenient,nonprob,recommended,priority -great_pret,critical,completed,2,less_conv,convenient,nonprob,priority,spec_prior -great_pret,critical,completed,2,less_conv,convenient,nonprob,not_recom,not_recom -great_pret,critical,completed,2,less_conv,convenient,slightly_prob,recommended,priority -great_pret,critical,completed,2,less_conv,convenient,slightly_prob,priority,spec_prior -great_pret,critical,completed,2,less_conv,convenient,slightly_prob,not_recom,not_recom -great_pret,critical,completed,2,less_conv,convenient,problematic,recommended,spec_prior -great_pret,critical,completed,2,less_conv,convenient,problematic,priority,spec_prior -great_pret,critical,completed,2,less_conv,convenient,problematic,not_recom,not_recom -great_pret,critical,completed,2,less_conv,inconv,nonprob,recommended,priority -great_pret,critical,completed,2,less_conv,inconv,nonprob,priority,spec_prior -great_pret,critical,completed,2,less_conv,inconv,nonprob,not_recom,not_recom -great_pret,critical,completed,2,less_conv,inconv,slightly_prob,recommended,priority -great_pret,critical,completed,2,less_conv,inconv,slightly_prob,priority,spec_prior -great_pret,critical,completed,2,less_conv,inconv,slightly_prob,not_recom,not_recom -great_pret,critical,completed,2,less_conv,inconv,problematic,recommended,spec_prior -great_pret,critical,completed,2,less_conv,inconv,problematic,priority,spec_prior -great_pret,critical,completed,2,less_conv,inconv,problematic,not_recom,not_recom -great_pret,critical,completed,2,critical,convenient,nonprob,recommended,spec_prior 
-great_pret,critical,completed,2,critical,convenient,nonprob,priority,spec_prior -great_pret,critical,completed,2,critical,convenient,nonprob,not_recom,not_recom -great_pret,critical,completed,2,critical,convenient,slightly_prob,recommended,spec_prior -great_pret,critical,completed,2,critical,convenient,slightly_prob,priority,spec_prior -great_pret,critical,completed,2,critical,convenient,slightly_prob,not_recom,not_recom -great_pret,critical,completed,2,critical,convenient,problematic,recommended,spec_prior -great_pret,critical,completed,2,critical,convenient,problematic,priority,spec_prior -great_pret,critical,completed,2,critical,convenient,problematic,not_recom,not_recom -great_pret,critical,completed,2,critical,inconv,nonprob,recommended,spec_prior -great_pret,critical,completed,2,critical,inconv,nonprob,priority,spec_prior -great_pret,critical,completed,2,critical,inconv,nonprob,not_recom,not_recom -great_pret,critical,completed,2,critical,inconv,slightly_prob,recommended,spec_prior -great_pret,critical,completed,2,critical,inconv,slightly_prob,priority,spec_prior -great_pret,critical,completed,2,critical,inconv,slightly_prob,not_recom,not_recom -great_pret,critical,completed,2,critical,inconv,problematic,recommended,spec_prior -great_pret,critical,completed,2,critical,inconv,problematic,priority,spec_prior -great_pret,critical,completed,2,critical,inconv,problematic,not_recom,not_recom -great_pret,critical,completed,3,convenient,convenient,nonprob,recommended,priority -great_pret,critical,completed,3,convenient,convenient,nonprob,priority,spec_prior -great_pret,critical,completed,3,convenient,convenient,nonprob,not_recom,not_recom -great_pret,critical,completed,3,convenient,convenient,slightly_prob,recommended,priority -great_pret,critical,completed,3,convenient,convenient,slightly_prob,priority,spec_prior -great_pret,critical,completed,3,convenient,convenient,slightly_prob,not_recom,not_recom 
-great_pret,critical,completed,3,convenient,convenient,problematic,recommended,spec_prior -great_pret,critical,completed,3,convenient,convenient,problematic,priority,spec_prior -great_pret,critical,completed,3,convenient,convenient,problematic,not_recom,not_recom -great_pret,critical,completed,3,convenient,inconv,nonprob,recommended,spec_prior -great_pret,critical,completed,3,convenient,inconv,nonprob,priority,spec_prior -great_pret,critical,completed,3,convenient,inconv,nonprob,not_recom,not_recom -great_pret,critical,completed,3,convenient,inconv,slightly_prob,recommended,spec_prior -great_pret,critical,completed,3,convenient,inconv,slightly_prob,priority,spec_prior -great_pret,critical,completed,3,convenient,inconv,slightly_prob,not_recom,not_recom -great_pret,critical,completed,3,convenient,inconv,problematic,recommended,spec_prior -great_pret,critical,completed,3,convenient,inconv,problematic,priority,spec_prior -great_pret,critical,completed,3,convenient,inconv,problematic,not_recom,not_recom -great_pret,critical,completed,3,less_conv,convenient,nonprob,recommended,spec_prior -great_pret,critical,completed,3,less_conv,convenient,nonprob,priority,spec_prior -great_pret,critical,completed,3,less_conv,convenient,nonprob,not_recom,not_recom -great_pret,critical,completed,3,less_conv,convenient,slightly_prob,recommended,spec_prior -great_pret,critical,completed,3,less_conv,convenient,slightly_prob,priority,spec_prior -great_pret,critical,completed,3,less_conv,convenient,slightly_prob,not_recom,not_recom -great_pret,critical,completed,3,less_conv,convenient,problematic,recommended,spec_prior -great_pret,critical,completed,3,less_conv,convenient,problematic,priority,spec_prior -great_pret,critical,completed,3,less_conv,convenient,problematic,not_recom,not_recom -great_pret,critical,completed,3,less_conv,inconv,nonprob,recommended,spec_prior -great_pret,critical,completed,3,less_conv,inconv,nonprob,priority,spec_prior 
-great_pret,critical,completed,3,less_conv,inconv,nonprob,not_recom,not_recom -great_pret,critical,completed,3,less_conv,inconv,slightly_prob,recommended,spec_prior -great_pret,critical,completed,3,less_conv,inconv,slightly_prob,priority,spec_prior -great_pret,critical,completed,3,less_conv,inconv,slightly_prob,not_recom,not_recom -great_pret,critical,completed,3,less_conv,inconv,problematic,recommended,spec_prior -great_pret,critical,completed,3,less_conv,inconv,problematic,priority,spec_prior -great_pret,critical,completed,3,less_conv,inconv,problematic,not_recom,not_recom -great_pret,critical,completed,3,critical,convenient,nonprob,recommended,spec_prior -great_pret,critical,completed,3,critical,convenient,nonprob,priority,spec_prior -great_pret,critical,completed,3,critical,convenient,nonprob,not_recom,not_recom -great_pret,critical,completed,3,critical,convenient,slightly_prob,recommended,spec_prior -great_pret,critical,completed,3,critical,convenient,slightly_prob,priority,spec_prior -great_pret,critical,completed,3,critical,convenient,slightly_prob,not_recom,not_recom -great_pret,critical,completed,3,critical,convenient,problematic,recommended,spec_prior -great_pret,critical,completed,3,critical,convenient,problematic,priority,spec_prior -great_pret,critical,completed,3,critical,convenient,problematic,not_recom,not_recom -great_pret,critical,completed,3,critical,inconv,nonprob,recommended,spec_prior -great_pret,critical,completed,3,critical,inconv,nonprob,priority,spec_prior -great_pret,critical,completed,3,critical,inconv,nonprob,not_recom,not_recom -great_pret,critical,completed,3,critical,inconv,slightly_prob,recommended,spec_prior -great_pret,critical,completed,3,critical,inconv,slightly_prob,priority,spec_prior -great_pret,critical,completed,3,critical,inconv,slightly_prob,not_recom,not_recom -great_pret,critical,completed,3,critical,inconv,problematic,recommended,spec_prior 
-great_pret,critical,completed,3,critical,inconv,problematic,priority,spec_prior -great_pret,critical,completed,3,critical,inconv,problematic,not_recom,not_recom -great_pret,critical,completed,more,convenient,convenient,nonprob,recommended,priority -great_pret,critical,completed,more,convenient,convenient,nonprob,priority,spec_prior -great_pret,critical,completed,more,convenient,convenient,nonprob,not_recom,not_recom -great_pret,critical,completed,more,convenient,convenient,slightly_prob,recommended,priority -great_pret,critical,completed,more,convenient,convenient,slightly_prob,priority,spec_prior -great_pret,critical,completed,more,convenient,convenient,slightly_prob,not_recom,not_recom -great_pret,critical,completed,more,convenient,convenient,problematic,recommended,spec_prior -great_pret,critical,completed,more,convenient,convenient,problematic,priority,spec_prior -great_pret,critical,completed,more,convenient,convenient,problematic,not_recom,not_recom -great_pret,critical,completed,more,convenient,inconv,nonprob,recommended,spec_prior -great_pret,critical,completed,more,convenient,inconv,nonprob,priority,spec_prior -great_pret,critical,completed,more,convenient,inconv,nonprob,not_recom,not_recom -great_pret,critical,completed,more,convenient,inconv,slightly_prob,recommended,spec_prior -great_pret,critical,completed,more,convenient,inconv,slightly_prob,priority,spec_prior -great_pret,critical,completed,more,convenient,inconv,slightly_prob,not_recom,not_recom -great_pret,critical,completed,more,convenient,inconv,problematic,recommended,spec_prior -great_pret,critical,completed,more,convenient,inconv,problematic,priority,spec_prior -great_pret,critical,completed,more,convenient,inconv,problematic,not_recom,not_recom -great_pret,critical,completed,more,less_conv,convenient,nonprob,recommended,spec_prior -great_pret,critical,completed,more,less_conv,convenient,nonprob,priority,spec_prior 
-great_pret,critical,completed,more,less_conv,convenient,nonprob,not_recom,not_recom -great_pret,critical,completed,more,less_conv,convenient,slightly_prob,recommended,spec_prior -great_pret,critical,completed,more,less_conv,convenient,slightly_prob,priority,spec_prior -great_pret,critical,completed,more,less_conv,convenient,slightly_prob,not_recom,not_recom -great_pret,critical,completed,more,less_conv,convenient,problematic,recommended,spec_prior -great_pret,critical,completed,more,less_conv,convenient,problematic,priority,spec_prior -great_pret,critical,completed,more,less_conv,convenient,problematic,not_recom,not_recom -great_pret,critical,completed,more,less_conv,inconv,nonprob,recommended,spec_prior -great_pret,critical,completed,more,less_conv,inconv,nonprob,priority,spec_prior -great_pret,critical,completed,more,less_conv,inconv,nonprob,not_recom,not_recom -great_pret,critical,completed,more,less_conv,inconv,slightly_prob,recommended,spec_prior -great_pret,critical,completed,more,less_conv,inconv,slightly_prob,priority,spec_prior -great_pret,critical,completed,more,less_conv,inconv,slightly_prob,not_recom,not_recom -great_pret,critical,completed,more,less_conv,inconv,problematic,recommended,spec_prior -great_pret,critical,completed,more,less_conv,inconv,problematic,priority,spec_prior -great_pret,critical,completed,more,less_conv,inconv,problematic,not_recom,not_recom -great_pret,critical,completed,more,critical,convenient,nonprob,recommended,spec_prior -great_pret,critical,completed,more,critical,convenient,nonprob,priority,spec_prior -great_pret,critical,completed,more,critical,convenient,nonprob,not_recom,not_recom -great_pret,critical,completed,more,critical,convenient,slightly_prob,recommended,spec_prior -great_pret,critical,completed,more,critical,convenient,slightly_prob,priority,spec_prior -great_pret,critical,completed,more,critical,convenient,slightly_prob,not_recom,not_recom 
-great_pret,critical,completed,more,critical,convenient,problematic,recommended,spec_prior -great_pret,critical,completed,more,critical,convenient,problematic,priority,spec_prior -great_pret,critical,completed,more,critical,convenient,problematic,not_recom,not_recom -great_pret,critical,completed,more,critical,inconv,nonprob,recommended,spec_prior -great_pret,critical,completed,more,critical,inconv,nonprob,priority,spec_prior -great_pret,critical,completed,more,critical,inconv,nonprob,not_recom,not_recom -great_pret,critical,completed,more,critical,inconv,slightly_prob,recommended,spec_prior -great_pret,critical,completed,more,critical,inconv,slightly_prob,priority,spec_prior -great_pret,critical,completed,more,critical,inconv,slightly_prob,not_recom,not_recom -great_pret,critical,completed,more,critical,inconv,problematic,recommended,spec_prior -great_pret,critical,completed,more,critical,inconv,problematic,priority,spec_prior -great_pret,critical,completed,more,critical,inconv,problematic,not_recom,not_recom -great_pret,critical,incomplete,1,convenient,convenient,nonprob,recommended,priority -great_pret,critical,incomplete,1,convenient,convenient,nonprob,priority,spec_prior -great_pret,critical,incomplete,1,convenient,convenient,nonprob,not_recom,not_recom -great_pret,critical,incomplete,1,convenient,convenient,slightly_prob,recommended,priority -great_pret,critical,incomplete,1,convenient,convenient,slightly_prob,priority,spec_prior -great_pret,critical,incomplete,1,convenient,convenient,slightly_prob,not_recom,not_recom -great_pret,critical,incomplete,1,convenient,convenient,problematic,recommended,spec_prior -great_pret,critical,incomplete,1,convenient,convenient,problematic,priority,spec_prior -great_pret,critical,incomplete,1,convenient,convenient,problematic,not_recom,not_recom -great_pret,critical,incomplete,1,convenient,inconv,nonprob,recommended,priority -great_pret,critical,incomplete,1,convenient,inconv,nonprob,priority,spec_prior 
-great_pret,critical,incomplete,1,convenient,inconv,nonprob,not_recom,not_recom -great_pret,critical,incomplete,1,convenient,inconv,slightly_prob,recommended,priority -great_pret,critical,incomplete,1,convenient,inconv,slightly_prob,priority,spec_prior -great_pret,critical,incomplete,1,convenient,inconv,slightly_prob,not_recom,not_recom -great_pret,critical,incomplete,1,convenient,inconv,problematic,recommended,spec_prior -great_pret,critical,incomplete,1,convenient,inconv,problematic,priority,spec_prior -great_pret,critical,incomplete,1,convenient,inconv,problematic,not_recom,not_recom -great_pret,critical,incomplete,1,less_conv,convenient,nonprob,recommended,priority -great_pret,critical,incomplete,1,less_conv,convenient,nonprob,priority,spec_prior -great_pret,critical,incomplete,1,less_conv,convenient,nonprob,not_recom,not_recom -great_pret,critical,incomplete,1,less_conv,convenient,slightly_prob,recommended,priority -great_pret,critical,incomplete,1,less_conv,convenient,slightly_prob,priority,spec_prior -great_pret,critical,incomplete,1,less_conv,convenient,slightly_prob,not_recom,not_recom -great_pret,critical,incomplete,1,less_conv,convenient,problematic,recommended,spec_prior -great_pret,critical,incomplete,1,less_conv,convenient,problematic,priority,spec_prior -great_pret,critical,incomplete,1,less_conv,convenient,problematic,not_recom,not_recom -great_pret,critical,incomplete,1,less_conv,inconv,nonprob,recommended,priority -great_pret,critical,incomplete,1,less_conv,inconv,nonprob,priority,spec_prior -great_pret,critical,incomplete,1,less_conv,inconv,nonprob,not_recom,not_recom -great_pret,critical,incomplete,1,less_conv,inconv,slightly_prob,recommended,priority -great_pret,critical,incomplete,1,less_conv,inconv,slightly_prob,priority,spec_prior -great_pret,critical,incomplete,1,less_conv,inconv,slightly_prob,not_recom,not_recom -great_pret,critical,incomplete,1,less_conv,inconv,problematic,recommended,spec_prior 
-great_pret,critical,incomplete,1,less_conv,inconv,problematic,priority,spec_prior -great_pret,critical,incomplete,1,less_conv,inconv,problematic,not_recom,not_recom -great_pret,critical,incomplete,1,critical,convenient,nonprob,recommended,spec_prior -great_pret,critical,incomplete,1,critical,convenient,nonprob,priority,spec_prior -great_pret,critical,incomplete,1,critical,convenient,nonprob,not_recom,not_recom -great_pret,critical,incomplete,1,critical,convenient,slightly_prob,recommended,spec_prior -great_pret,critical,incomplete,1,critical,convenient,slightly_prob,priority,spec_prior -great_pret,critical,incomplete,1,critical,convenient,slightly_prob,not_recom,not_recom -great_pret,critical,incomplete,1,critical,convenient,problematic,recommended,spec_prior -great_pret,critical,incomplete,1,critical,convenient,problematic,priority,spec_prior -great_pret,critical,incomplete,1,critical,convenient,problematic,not_recom,not_recom -great_pret,critical,incomplete,1,critical,inconv,nonprob,recommended,spec_prior -great_pret,critical,incomplete,1,critical,inconv,nonprob,priority,spec_prior -great_pret,critical,incomplete,1,critical,inconv,nonprob,not_recom,not_recom -great_pret,critical,incomplete,1,critical,inconv,slightly_prob,recommended,spec_prior -great_pret,critical,incomplete,1,critical,inconv,slightly_prob,priority,spec_prior -great_pret,critical,incomplete,1,critical,inconv,slightly_prob,not_recom,not_recom -great_pret,critical,incomplete,1,critical,inconv,problematic,recommended,spec_prior -great_pret,critical,incomplete,1,critical,inconv,problematic,priority,spec_prior -great_pret,critical,incomplete,1,critical,inconv,problematic,not_recom,not_recom -great_pret,critical,incomplete,2,convenient,convenient,nonprob,recommended,priority -great_pret,critical,incomplete,2,convenient,convenient,nonprob,priority,spec_prior -great_pret,critical,incomplete,2,convenient,convenient,nonprob,not_recom,not_recom 
-great_pret,critical,incomplete,2,convenient,convenient,slightly_prob,recommended,priority -great_pret,critical,incomplete,2,convenient,convenient,slightly_prob,priority,spec_prior -great_pret,critical,incomplete,2,convenient,convenient,slightly_prob,not_recom,not_recom -great_pret,critical,incomplete,2,convenient,convenient,problematic,recommended,spec_prior -great_pret,critical,incomplete,2,convenient,convenient,problematic,priority,spec_prior -great_pret,critical,incomplete,2,convenient,convenient,problematic,not_recom,not_recom -great_pret,critical,incomplete,2,convenient,inconv,nonprob,recommended,spec_prior -great_pret,critical,incomplete,2,convenient,inconv,nonprob,priority,spec_prior -great_pret,critical,incomplete,2,convenient,inconv,nonprob,not_recom,not_recom -great_pret,critical,incomplete,2,convenient,inconv,slightly_prob,recommended,spec_prior -great_pret,critical,incomplete,2,convenient,inconv,slightly_prob,priority,spec_prior -great_pret,critical,incomplete,2,convenient,inconv,slightly_prob,not_recom,not_recom -great_pret,critical,incomplete,2,convenient,inconv,problematic,recommended,spec_prior -great_pret,critical,incomplete,2,convenient,inconv,problematic,priority,spec_prior -great_pret,critical,incomplete,2,convenient,inconv,problematic,not_recom,not_recom -great_pret,critical,incomplete,2,less_conv,convenient,nonprob,recommended,spec_prior -great_pret,critical,incomplete,2,less_conv,convenient,nonprob,priority,spec_prior -great_pret,critical,incomplete,2,less_conv,convenient,nonprob,not_recom,not_recom -great_pret,critical,incomplete,2,less_conv,convenient,slightly_prob,recommended,spec_prior -great_pret,critical,incomplete,2,less_conv,convenient,slightly_prob,priority,spec_prior -great_pret,critical,incomplete,2,less_conv,convenient,slightly_prob,not_recom,not_recom -great_pret,critical,incomplete,2,less_conv,convenient,problematic,recommended,spec_prior -great_pret,critical,incomplete,2,less_conv,convenient,problematic,priority,spec_prior 
-great_pret,critical,incomplete,2,less_conv,convenient,problematic,not_recom,not_recom -great_pret,critical,incomplete,2,less_conv,inconv,nonprob,recommended,spec_prior -great_pret,critical,incomplete,2,less_conv,inconv,nonprob,priority,spec_prior -great_pret,critical,incomplete,2,less_conv,inconv,nonprob,not_recom,not_recom -great_pret,critical,incomplete,2,less_conv,inconv,slightly_prob,recommended,spec_prior -great_pret,critical,incomplete,2,less_conv,inconv,slightly_prob,priority,spec_prior -great_pret,critical,incomplete,2,less_conv,inconv,slightly_prob,not_recom,not_recom -great_pret,critical,incomplete,2,less_conv,inconv,problematic,recommended,spec_prior -great_pret,critical,incomplete,2,less_conv,inconv,problematic,priority,spec_prior -great_pret,critical,incomplete,2,less_conv,inconv,problematic,not_recom,not_recom -great_pret,critical,incomplete,2,critical,convenient,nonprob,recommended,spec_prior -great_pret,critical,incomplete,2,critical,convenient,nonprob,priority,spec_prior -great_pret,critical,incomplete,2,critical,convenient,nonprob,not_recom,not_recom -great_pret,critical,incomplete,2,critical,convenient,slightly_prob,recommended,spec_prior -great_pret,critical,incomplete,2,critical,convenient,slightly_prob,priority,spec_prior -great_pret,critical,incomplete,2,critical,convenient,slightly_prob,not_recom,not_recom -great_pret,critical,incomplete,2,critical,convenient,problematic,recommended,spec_prior -great_pret,critical,incomplete,2,critical,convenient,problematic,priority,spec_prior -great_pret,critical,incomplete,2,critical,convenient,problematic,not_recom,not_recom -great_pret,critical,incomplete,2,critical,inconv,nonprob,recommended,spec_prior -great_pret,critical,incomplete,2,critical,inconv,nonprob,priority,spec_prior -great_pret,critical,incomplete,2,critical,inconv,nonprob,not_recom,not_recom -great_pret,critical,incomplete,2,critical,inconv,slightly_prob,recommended,spec_prior 
-great_pret,critical,incomplete,2,critical,inconv,slightly_prob,priority,spec_prior -great_pret,critical,incomplete,2,critical,inconv,slightly_prob,not_recom,not_recom -great_pret,critical,incomplete,2,critical,inconv,problematic,recommended,spec_prior -great_pret,critical,incomplete,2,critical,inconv,problematic,priority,spec_prior -great_pret,critical,incomplete,2,critical,inconv,problematic,not_recom,not_recom -great_pret,critical,incomplete,3,convenient,convenient,nonprob,recommended,priority -great_pret,critical,incomplete,3,convenient,convenient,nonprob,priority,spec_prior -great_pret,critical,incomplete,3,convenient,convenient,nonprob,not_recom,not_recom -great_pret,critical,incomplete,3,convenient,convenient,slightly_prob,recommended,priority -great_pret,critical,incomplete,3,convenient,convenient,slightly_prob,priority,spec_prior -great_pret,critical,incomplete,3,convenient,convenient,slightly_prob,not_recom,not_recom -great_pret,critical,incomplete,3,convenient,convenient,problematic,recommended,spec_prior -great_pret,critical,incomplete,3,convenient,convenient,problematic,priority,spec_prior -great_pret,critical,incomplete,3,convenient,convenient,problematic,not_recom,not_recom -great_pret,critical,incomplete,3,convenient,inconv,nonprob,recommended,spec_prior -great_pret,critical,incomplete,3,convenient,inconv,nonprob,priority,spec_prior -great_pret,critical,incomplete,3,convenient,inconv,nonprob,not_recom,not_recom -great_pret,critical,incomplete,3,convenient,inconv,slightly_prob,recommended,spec_prior -great_pret,critical,incomplete,3,convenient,inconv,slightly_prob,priority,spec_prior -great_pret,critical,incomplete,3,convenient,inconv,slightly_prob,not_recom,not_recom -great_pret,critical,incomplete,3,convenient,inconv,problematic,recommended,spec_prior -great_pret,critical,incomplete,3,convenient,inconv,problematic,priority,spec_prior -great_pret,critical,incomplete,3,convenient,inconv,problematic,not_recom,not_recom 
-great_pret,critical,incomplete,3,less_conv,convenient,nonprob,recommended,spec_prior -great_pret,critical,incomplete,3,less_conv,convenient,nonprob,priority,spec_prior -great_pret,critical,incomplete,3,less_conv,convenient,nonprob,not_recom,not_recom -great_pret,critical,incomplete,3,less_conv,convenient,slightly_prob,recommended,spec_prior -great_pret,critical,incomplete,3,less_conv,convenient,slightly_prob,priority,spec_prior -great_pret,critical,incomplete,3,less_conv,convenient,slightly_prob,not_recom,not_recom -great_pret,critical,incomplete,3,less_conv,convenient,problematic,recommended,spec_prior -great_pret,critical,incomplete,3,less_conv,convenient,problematic,priority,spec_prior -great_pret,critical,incomplete,3,less_conv,convenient,problematic,not_recom,not_recom -great_pret,critical,incomplete,3,less_conv,inconv,nonprob,recommended,spec_prior -great_pret,critical,incomplete,3,less_conv,inconv,nonprob,priority,spec_prior -great_pret,critical,incomplete,3,less_conv,inconv,nonprob,not_recom,not_recom -great_pret,critical,incomplete,3,less_conv,inconv,slightly_prob,recommended,spec_prior -great_pret,critical,incomplete,3,less_conv,inconv,slightly_prob,priority,spec_prior -great_pret,critical,incomplete,3,less_conv,inconv,slightly_prob,not_recom,not_recom -great_pret,critical,incomplete,3,less_conv,inconv,problematic,recommended,spec_prior -great_pret,critical,incomplete,3,less_conv,inconv,problematic,priority,spec_prior -great_pret,critical,incomplete,3,less_conv,inconv,problematic,not_recom,not_recom -great_pret,critical,incomplete,3,critical,convenient,nonprob,recommended,spec_prior -great_pret,critical,incomplete,3,critical,convenient,nonprob,priority,spec_prior -great_pret,critical,incomplete,3,critical,convenient,nonprob,not_recom,not_recom -great_pret,critical,incomplete,3,critical,convenient,slightly_prob,recommended,spec_prior -great_pret,critical,incomplete,3,critical,convenient,slightly_prob,priority,spec_prior 
-great_pret,critical,incomplete,3,critical,convenient,slightly_prob,not_recom,not_recom -great_pret,critical,incomplete,3,critical,convenient,problematic,recommended,spec_prior -great_pret,critical,incomplete,3,critical,convenient,problematic,priority,spec_prior -great_pret,critical,incomplete,3,critical,convenient,problematic,not_recom,not_recom -great_pret,critical,incomplete,3,critical,inconv,nonprob,recommended,spec_prior -great_pret,critical,incomplete,3,critical,inconv,nonprob,priority,spec_prior -great_pret,critical,incomplete,3,critical,inconv,nonprob,not_recom,not_recom -great_pret,critical,incomplete,3,critical,inconv,slightly_prob,recommended,spec_prior -great_pret,critical,incomplete,3,critical,inconv,slightly_prob,priority,spec_prior -great_pret,critical,incomplete,3,critical,inconv,slightly_prob,not_recom,not_recom -great_pret,critical,incomplete,3,critical,inconv,problematic,recommended,spec_prior -great_pret,critical,incomplete,3,critical,inconv,problematic,priority,spec_prior -great_pret,critical,incomplete,3,critical,inconv,problematic,not_recom,not_recom -great_pret,critical,incomplete,more,convenient,convenient,nonprob,recommended,priority -great_pret,critical,incomplete,more,convenient,convenient,nonprob,priority,spec_prior -great_pret,critical,incomplete,more,convenient,convenient,nonprob,not_recom,not_recom -great_pret,critical,incomplete,more,convenient,convenient,slightly_prob,recommended,priority -great_pret,critical,incomplete,more,convenient,convenient,slightly_prob,priority,spec_prior -great_pret,critical,incomplete,more,convenient,convenient,slightly_prob,not_recom,not_recom -great_pret,critical,incomplete,more,convenient,convenient,problematic,recommended,spec_prior -great_pret,critical,incomplete,more,convenient,convenient,problematic,priority,spec_prior -great_pret,critical,incomplete,more,convenient,convenient,problematic,not_recom,not_recom -great_pret,critical,incomplete,more,convenient,inconv,nonprob,recommended,spec_prior 
-great_pret,critical,incomplete,more,convenient,inconv,nonprob,priority,spec_prior -great_pret,critical,incomplete,more,convenient,inconv,nonprob,not_recom,not_recom -great_pret,critical,incomplete,more,convenient,inconv,slightly_prob,recommended,spec_prior -great_pret,critical,incomplete,more,convenient,inconv,slightly_prob,priority,spec_prior -great_pret,critical,incomplete,more,convenient,inconv,slightly_prob,not_recom,not_recom -great_pret,critical,incomplete,more,convenient,inconv,problematic,recommended,spec_prior -great_pret,critical,incomplete,more,convenient,inconv,problematic,priority,spec_prior -great_pret,critical,incomplete,more,convenient,inconv,problematic,not_recom,not_recom -great_pret,critical,incomplete,more,less_conv,convenient,nonprob,recommended,spec_prior -great_pret,critical,incomplete,more,less_conv,convenient,nonprob,priority,spec_prior -great_pret,critical,incomplete,more,less_conv,convenient,nonprob,not_recom,not_recom -great_pret,critical,incomplete,more,less_conv,convenient,slightly_prob,recommended,spec_prior -great_pret,critical,incomplete,more,less_conv,convenient,slightly_prob,priority,spec_prior -great_pret,critical,incomplete,more,less_conv,convenient,slightly_prob,not_recom,not_recom -great_pret,critical,incomplete,more,less_conv,convenient,problematic,recommended,spec_prior -great_pret,critical,incomplete,more,less_conv,convenient,problematic,priority,spec_prior -great_pret,critical,incomplete,more,less_conv,convenient,problematic,not_recom,not_recom -great_pret,critical,incomplete,more,less_conv,inconv,nonprob,recommended,spec_prior -great_pret,critical,incomplete,more,less_conv,inconv,nonprob,priority,spec_prior -great_pret,critical,incomplete,more,less_conv,inconv,nonprob,not_recom,not_recom -great_pret,critical,incomplete,more,less_conv,inconv,slightly_prob,recommended,spec_prior -great_pret,critical,incomplete,more,less_conv,inconv,slightly_prob,priority,spec_prior 
-great_pret,critical,incomplete,more,less_conv,inconv,slightly_prob,not_recom,not_recom -great_pret,critical,incomplete,more,less_conv,inconv,problematic,recommended,spec_prior -great_pret,critical,incomplete,more,less_conv,inconv,problematic,priority,spec_prior -great_pret,critical,incomplete,more,less_conv,inconv,problematic,not_recom,not_recom -great_pret,critical,incomplete,more,critical,convenient,nonprob,recommended,spec_prior -great_pret,critical,incomplete,more,critical,convenient,nonprob,priority,spec_prior -great_pret,critical,incomplete,more,critical,convenient,nonprob,not_recom,not_recom -great_pret,critical,incomplete,more,critical,convenient,slightly_prob,recommended,spec_prior -great_pret,critical,incomplete,more,critical,convenient,slightly_prob,priority,spec_prior -great_pret,critical,incomplete,more,critical,convenient,slightly_prob,not_recom,not_recom -great_pret,critical,incomplete,more,critical,convenient,problematic,recommended,spec_prior -great_pret,critical,incomplete,more,critical,convenient,problematic,priority,spec_prior -great_pret,critical,incomplete,more,critical,convenient,problematic,not_recom,not_recom -great_pret,critical,incomplete,more,critical,inconv,nonprob,recommended,spec_prior -great_pret,critical,incomplete,more,critical,inconv,nonprob,priority,spec_prior -great_pret,critical,incomplete,more,critical,inconv,nonprob,not_recom,not_recom -great_pret,critical,incomplete,more,critical,inconv,slightly_prob,recommended,spec_prior -great_pret,critical,incomplete,more,critical,inconv,slightly_prob,priority,spec_prior -great_pret,critical,incomplete,more,critical,inconv,slightly_prob,not_recom,not_recom -great_pret,critical,incomplete,more,critical,inconv,problematic,recommended,spec_prior -great_pret,critical,incomplete,more,critical,inconv,problematic,priority,spec_prior -great_pret,critical,incomplete,more,critical,inconv,problematic,not_recom,not_recom 
-great_pret,critical,foster,1,convenient,convenient,nonprob,recommended,priority -great_pret,critical,foster,1,convenient,convenient,nonprob,priority,spec_prior -great_pret,critical,foster,1,convenient,convenient,nonprob,not_recom,not_recom -great_pret,critical,foster,1,convenient,convenient,slightly_prob,recommended,priority -great_pret,critical,foster,1,convenient,convenient,slightly_prob,priority,spec_prior -great_pret,critical,foster,1,convenient,convenient,slightly_prob,not_recom,not_recom -great_pret,critical,foster,1,convenient,convenient,problematic,recommended,spec_prior -great_pret,critical,foster,1,convenient,convenient,problematic,priority,spec_prior -great_pret,critical,foster,1,convenient,convenient,problematic,not_recom,not_recom -great_pret,critical,foster,1,convenient,inconv,nonprob,recommended,spec_prior -great_pret,critical,foster,1,convenient,inconv,nonprob,priority,spec_prior -great_pret,critical,foster,1,convenient,inconv,nonprob,not_recom,not_recom -great_pret,critical,foster,1,convenient,inconv,slightly_prob,recommended,spec_prior -great_pret,critical,foster,1,convenient,inconv,slightly_prob,priority,spec_prior -great_pret,critical,foster,1,convenient,inconv,slightly_prob,not_recom,not_recom -great_pret,critical,foster,1,convenient,inconv,problematic,recommended,spec_prior -great_pret,critical,foster,1,convenient,inconv,problematic,priority,spec_prior -great_pret,critical,foster,1,convenient,inconv,problematic,not_recom,not_recom -great_pret,critical,foster,1,less_conv,convenient,nonprob,recommended,spec_prior -great_pret,critical,foster,1,less_conv,convenient,nonprob,priority,spec_prior -great_pret,critical,foster,1,less_conv,convenient,nonprob,not_recom,not_recom -great_pret,critical,foster,1,less_conv,convenient,slightly_prob,recommended,spec_prior -great_pret,critical,foster,1,less_conv,convenient,slightly_prob,priority,spec_prior -great_pret,critical,foster,1,less_conv,convenient,slightly_prob,not_recom,not_recom 
-great_pret,critical,foster,1,less_conv,convenient,problematic,recommended,spec_prior -great_pret,critical,foster,1,less_conv,convenient,problematic,priority,spec_prior -great_pret,critical,foster,1,less_conv,convenient,problematic,not_recom,not_recom -great_pret,critical,foster,1,less_conv,inconv,nonprob,recommended,spec_prior -great_pret,critical,foster,1,less_conv,inconv,nonprob,priority,spec_prior -great_pret,critical,foster,1,less_conv,inconv,nonprob,not_recom,not_recom -great_pret,critical,foster,1,less_conv,inconv,slightly_prob,recommended,spec_prior -great_pret,critical,foster,1,less_conv,inconv,slightly_prob,priority,spec_prior -great_pret,critical,foster,1,less_conv,inconv,slightly_prob,not_recom,not_recom -great_pret,critical,foster,1,less_conv,inconv,problematic,recommended,spec_prior -great_pret,critical,foster,1,less_conv,inconv,problematic,priority,spec_prior -great_pret,critical,foster,1,less_conv,inconv,problematic,not_recom,not_recom -great_pret,critical,foster,1,critical,convenient,nonprob,recommended,spec_prior -great_pret,critical,foster,1,critical,convenient,nonprob,priority,spec_prior -great_pret,critical,foster,1,critical,convenient,nonprob,not_recom,not_recom -great_pret,critical,foster,1,critical,convenient,slightly_prob,recommended,spec_prior -great_pret,critical,foster,1,critical,convenient,slightly_prob,priority,spec_prior -great_pret,critical,foster,1,critical,convenient,slightly_prob,not_recom,not_recom -great_pret,critical,foster,1,critical,convenient,problematic,recommended,spec_prior -great_pret,critical,foster,1,critical,convenient,problematic,priority,spec_prior -great_pret,critical,foster,1,critical,convenient,problematic,not_recom,not_recom -great_pret,critical,foster,1,critical,inconv,nonprob,recommended,spec_prior -great_pret,critical,foster,1,critical,inconv,nonprob,priority,spec_prior -great_pret,critical,foster,1,critical,inconv,nonprob,not_recom,not_recom 
-great_pret,critical,foster,1,critical,inconv,slightly_prob,recommended,spec_prior -great_pret,critical,foster,1,critical,inconv,slightly_prob,priority,spec_prior -great_pret,critical,foster,1,critical,inconv,slightly_prob,not_recom,not_recom -great_pret,critical,foster,1,critical,inconv,problematic,recommended,spec_prior -great_pret,critical,foster,1,critical,inconv,problematic,priority,spec_prior -great_pret,critical,foster,1,critical,inconv,problematic,not_recom,not_recom -great_pret,critical,foster,2,convenient,convenient,nonprob,recommended,priority -great_pret,critical,foster,2,convenient,convenient,nonprob,priority,spec_prior -great_pret,critical,foster,2,convenient,convenient,nonprob,not_recom,not_recom -great_pret,critical,foster,2,convenient,convenient,slightly_prob,recommended,priority -great_pret,critical,foster,2,convenient,convenient,slightly_prob,priority,spec_prior -great_pret,critical,foster,2,convenient,convenient,slightly_prob,not_recom,not_recom -great_pret,critical,foster,2,convenient,convenient,problematic,recommended,spec_prior -great_pret,critical,foster,2,convenient,convenient,problematic,priority,spec_prior -great_pret,critical,foster,2,convenient,convenient,problematic,not_recom,not_recom -great_pret,critical,foster,2,convenient,inconv,nonprob,recommended,spec_prior -great_pret,critical,foster,2,convenient,inconv,nonprob,priority,spec_prior -great_pret,critical,foster,2,convenient,inconv,nonprob,not_recom,not_recom -great_pret,critical,foster,2,convenient,inconv,slightly_prob,recommended,spec_prior -great_pret,critical,foster,2,convenient,inconv,slightly_prob,priority,spec_prior -great_pret,critical,foster,2,convenient,inconv,slightly_prob,not_recom,not_recom -great_pret,critical,foster,2,convenient,inconv,problematic,recommended,spec_prior -great_pret,critical,foster,2,convenient,inconv,problematic,priority,spec_prior -great_pret,critical,foster,2,convenient,inconv,problematic,not_recom,not_recom 
-great_pret,critical,foster,2,less_conv,convenient,nonprob,recommended,spec_prior -great_pret,critical,foster,2,less_conv,convenient,nonprob,priority,spec_prior -great_pret,critical,foster,2,less_conv,convenient,nonprob,not_recom,not_recom -great_pret,critical,foster,2,less_conv,convenient,slightly_prob,recommended,spec_prior -great_pret,critical,foster,2,less_conv,convenient,slightly_prob,priority,spec_prior -great_pret,critical,foster,2,less_conv,convenient,slightly_prob,not_recom,not_recom -great_pret,critical,foster,2,less_conv,convenient,problematic,recommended,spec_prior -great_pret,critical,foster,2,less_conv,convenient,problematic,priority,spec_prior -great_pret,critical,foster,2,less_conv,convenient,problematic,not_recom,not_recom -great_pret,critical,foster,2,less_conv,inconv,nonprob,recommended,spec_prior -great_pret,critical,foster,2,less_conv,inconv,nonprob,priority,spec_prior -great_pret,critical,foster,2,less_conv,inconv,nonprob,not_recom,not_recom -great_pret,critical,foster,2,less_conv,inconv,slightly_prob,recommended,spec_prior -great_pret,critical,foster,2,less_conv,inconv,slightly_prob,priority,spec_prior -great_pret,critical,foster,2,less_conv,inconv,slightly_prob,not_recom,not_recom -great_pret,critical,foster,2,less_conv,inconv,problematic,recommended,spec_prior -great_pret,critical,foster,2,less_conv,inconv,problematic,priority,spec_prior -great_pret,critical,foster,2,less_conv,inconv,problematic,not_recom,not_recom -great_pret,critical,foster,2,critical,convenient,nonprob,recommended,spec_prior -great_pret,critical,foster,2,critical,convenient,nonprob,priority,spec_prior -great_pret,critical,foster,2,critical,convenient,nonprob,not_recom,not_recom -great_pret,critical,foster,2,critical,convenient,slightly_prob,recommended,spec_prior -great_pret,critical,foster,2,critical,convenient,slightly_prob,priority,spec_prior -great_pret,critical,foster,2,critical,convenient,slightly_prob,not_recom,not_recom 
-great_pret,critical,foster,2,critical,convenient,problematic,recommended,spec_prior -great_pret,critical,foster,2,critical,convenient,problematic,priority,spec_prior -great_pret,critical,foster,2,critical,convenient,problematic,not_recom,not_recom -great_pret,critical,foster,2,critical,inconv,nonprob,recommended,spec_prior -great_pret,critical,foster,2,critical,inconv,nonprob,priority,spec_prior -great_pret,critical,foster,2,critical,inconv,nonprob,not_recom,not_recom -great_pret,critical,foster,2,critical,inconv,slightly_prob,recommended,spec_prior -great_pret,critical,foster,2,critical,inconv,slightly_prob,priority,spec_prior -great_pret,critical,foster,2,critical,inconv,slightly_prob,not_recom,not_recom -great_pret,critical,foster,2,critical,inconv,problematic,recommended,spec_prior -great_pret,critical,foster,2,critical,inconv,problematic,priority,spec_prior -great_pret,critical,foster,2,critical,inconv,problematic,not_recom,not_recom -great_pret,critical,foster,3,convenient,convenient,nonprob,recommended,priority -great_pret,critical,foster,3,convenient,convenient,nonprob,priority,spec_prior -great_pret,critical,foster,3,convenient,convenient,nonprob,not_recom,not_recom -great_pret,critical,foster,3,convenient,convenient,slightly_prob,recommended,priority -great_pret,critical,foster,3,convenient,convenient,slightly_prob,priority,spec_prior -great_pret,critical,foster,3,convenient,convenient,slightly_prob,not_recom,not_recom -great_pret,critical,foster,3,convenient,convenient,problematic,recommended,spec_prior -great_pret,critical,foster,3,convenient,convenient,problematic,priority,spec_prior -great_pret,critical,foster,3,convenient,convenient,problematic,not_recom,not_recom -great_pret,critical,foster,3,convenient,inconv,nonprob,recommended,spec_prior -great_pret,critical,foster,3,convenient,inconv,nonprob,priority,spec_prior -great_pret,critical,foster,3,convenient,inconv,nonprob,not_recom,not_recom 
-great_pret,critical,foster,3,convenient,inconv,slightly_prob,recommended,spec_prior -great_pret,critical,foster,3,convenient,inconv,slightly_prob,priority,spec_prior -great_pret,critical,foster,3,convenient,inconv,slightly_prob,not_recom,not_recom -great_pret,critical,foster,3,convenient,inconv,problematic,recommended,spec_prior -great_pret,critical,foster,3,convenient,inconv,problematic,priority,spec_prior -great_pret,critical,foster,3,convenient,inconv,problematic,not_recom,not_recom -great_pret,critical,foster,3,less_conv,convenient,nonprob,recommended,spec_prior -great_pret,critical,foster,3,less_conv,convenient,nonprob,priority,spec_prior -great_pret,critical,foster,3,less_conv,convenient,nonprob,not_recom,not_recom -great_pret,critical,foster,3,less_conv,convenient,slightly_prob,recommended,spec_prior -great_pret,critical,foster,3,less_conv,convenient,slightly_prob,priority,spec_prior -great_pret,critical,foster,3,less_conv,convenient,slightly_prob,not_recom,not_recom -great_pret,critical,foster,3,less_conv,convenient,problematic,recommended,spec_prior -great_pret,critical,foster,3,less_conv,convenient,problematic,priority,spec_prior -great_pret,critical,foster,3,less_conv,convenient,problematic,not_recom,not_recom -great_pret,critical,foster,3,less_conv,inconv,nonprob,recommended,spec_prior -great_pret,critical,foster,3,less_conv,inconv,nonprob,priority,spec_prior -great_pret,critical,foster,3,less_conv,inconv,nonprob,not_recom,not_recom -great_pret,critical,foster,3,less_conv,inconv,slightly_prob,recommended,spec_prior -great_pret,critical,foster,3,less_conv,inconv,slightly_prob,priority,spec_prior -great_pret,critical,foster,3,less_conv,inconv,slightly_prob,not_recom,not_recom -great_pret,critical,foster,3,less_conv,inconv,problematic,recommended,spec_prior -great_pret,critical,foster,3,less_conv,inconv,problematic,priority,spec_prior -great_pret,critical,foster,3,less_conv,inconv,problematic,not_recom,not_recom 
-great_pret,critical,foster,3,critical,convenient,nonprob,recommended,spec_prior -great_pret,critical,foster,3,critical,convenient,nonprob,priority,spec_prior -great_pret,critical,foster,3,critical,convenient,nonprob,not_recom,not_recom -great_pret,critical,foster,3,critical,convenient,slightly_prob,recommended,spec_prior -great_pret,critical,foster,3,critical,convenient,slightly_prob,priority,spec_prior -great_pret,critical,foster,3,critical,convenient,slightly_prob,not_recom,not_recom -great_pret,critical,foster,3,critical,convenient,problematic,recommended,spec_prior -great_pret,critical,foster,3,critical,convenient,problematic,priority,spec_prior -great_pret,critical,foster,3,critical,convenient,problematic,not_recom,not_recom -great_pret,critical,foster,3,critical,inconv,nonprob,recommended,spec_prior -great_pret,critical,foster,3,critical,inconv,nonprob,priority,spec_prior -great_pret,critical,foster,3,critical,inconv,nonprob,not_recom,not_recom -great_pret,critical,foster,3,critical,inconv,slightly_prob,recommended,spec_prior -great_pret,critical,foster,3,critical,inconv,slightly_prob,priority,spec_prior -great_pret,critical,foster,3,critical,inconv,slightly_prob,not_recom,not_recom -great_pret,critical,foster,3,critical,inconv,problematic,recommended,spec_prior -great_pret,critical,foster,3,critical,inconv,problematic,priority,spec_prior -great_pret,critical,foster,3,critical,inconv,problematic,not_recom,not_recom -great_pret,critical,foster,more,convenient,convenient,nonprob,recommended,priority -great_pret,critical,foster,more,convenient,convenient,nonprob,priority,spec_prior -great_pret,critical,foster,more,convenient,convenient,nonprob,not_recom,not_recom -great_pret,critical,foster,more,convenient,convenient,slightly_prob,recommended,priority -great_pret,critical,foster,more,convenient,convenient,slightly_prob,priority,spec_prior -great_pret,critical,foster,more,convenient,convenient,slightly_prob,not_recom,not_recom 
-great_pret,critical,foster,more,convenient,convenient,problematic,recommended,spec_prior -great_pret,critical,foster,more,convenient,convenient,problematic,priority,spec_prior -great_pret,critical,foster,more,convenient,convenient,problematic,not_recom,not_recom -great_pret,critical,foster,more,convenient,inconv,nonprob,recommended,spec_prior -great_pret,critical,foster,more,convenient,inconv,nonprob,priority,spec_prior -great_pret,critical,foster,more,convenient,inconv,nonprob,not_recom,not_recom -great_pret,critical,foster,more,convenient,inconv,slightly_prob,recommended,spec_prior -great_pret,critical,foster,more,convenient,inconv,slightly_prob,priority,spec_prior -great_pret,critical,foster,more,convenient,inconv,slightly_prob,not_recom,not_recom -great_pret,critical,foster,more,convenient,inconv,problematic,recommended,spec_prior -great_pret,critical,foster,more,convenient,inconv,problematic,priority,spec_prior -great_pret,critical,foster,more,convenient,inconv,problematic,not_recom,not_recom -great_pret,critical,foster,more,less_conv,convenient,nonprob,recommended,spec_prior -great_pret,critical,foster,more,less_conv,convenient,nonprob,priority,spec_prior -great_pret,critical,foster,more,less_conv,convenient,nonprob,not_recom,not_recom -great_pret,critical,foster,more,less_conv,convenient,slightly_prob,recommended,spec_prior -great_pret,critical,foster,more,less_conv,convenient,slightly_prob,priority,spec_prior -great_pret,critical,foster,more,less_conv,convenient,slightly_prob,not_recom,not_recom -great_pret,critical,foster,more,less_conv,convenient,problematic,recommended,spec_prior -great_pret,critical,foster,more,less_conv,convenient,problematic,priority,spec_prior -great_pret,critical,foster,more,less_conv,convenient,problematic,not_recom,not_recom -great_pret,critical,foster,more,less_conv,inconv,nonprob,recommended,spec_prior -great_pret,critical,foster,more,less_conv,inconv,nonprob,priority,spec_prior 
-great_pret,critical,foster,more,less_conv,inconv,nonprob,not_recom,not_recom -great_pret,critical,foster,more,less_conv,inconv,slightly_prob,recommended,spec_prior -great_pret,critical,foster,more,less_conv,inconv,slightly_prob,priority,spec_prior -great_pret,critical,foster,more,less_conv,inconv,slightly_prob,not_recom,not_recom -great_pret,critical,foster,more,less_conv,inconv,problematic,recommended,spec_prior -great_pret,critical,foster,more,less_conv,inconv,problematic,priority,spec_prior -great_pret,critical,foster,more,less_conv,inconv,problematic,not_recom,not_recom -great_pret,critical,foster,more,critical,convenient,nonprob,recommended,spec_prior -great_pret,critical,foster,more,critical,convenient,nonprob,priority,spec_prior -great_pret,critical,foster,more,critical,convenient,nonprob,not_recom,not_recom -great_pret,critical,foster,more,critical,convenient,slightly_prob,recommended,spec_prior -great_pret,critical,foster,more,critical,convenient,slightly_prob,priority,spec_prior -great_pret,critical,foster,more,critical,convenient,slightly_prob,not_recom,not_recom -great_pret,critical,foster,more,critical,convenient,problematic,recommended,spec_prior -great_pret,critical,foster,more,critical,convenient,problematic,priority,spec_prior -great_pret,critical,foster,more,critical,convenient,problematic,not_recom,not_recom -great_pret,critical,foster,more,critical,inconv,nonprob,recommended,spec_prior -great_pret,critical,foster,more,critical,inconv,nonprob,priority,spec_prior -great_pret,critical,foster,more,critical,inconv,nonprob,not_recom,not_recom -great_pret,critical,foster,more,critical,inconv,slightly_prob,recommended,spec_prior -great_pret,critical,foster,more,critical,inconv,slightly_prob,priority,spec_prior -great_pret,critical,foster,more,critical,inconv,slightly_prob,not_recom,not_recom -great_pret,critical,foster,more,critical,inconv,problematic,recommended,spec_prior 
-great_pret,critical,foster,more,critical,inconv,problematic,priority,spec_prior -great_pret,critical,foster,more,critical,inconv,problematic,not_recom,not_recom -great_pret,very_crit,complete,1,convenient,convenient,nonprob,recommended,priority -great_pret,very_crit,complete,1,convenient,convenient,nonprob,priority,priority -great_pret,very_crit,complete,1,convenient,convenient,nonprob,not_recom,not_recom -great_pret,very_crit,complete,1,convenient,convenient,slightly_prob,recommended,priority -great_pret,very_crit,complete,1,convenient,convenient,slightly_prob,priority,priority -great_pret,very_crit,complete,1,convenient,convenient,slightly_prob,not_recom,not_recom -great_pret,very_crit,complete,1,convenient,convenient,problematic,recommended,priority -great_pret,very_crit,complete,1,convenient,convenient,problematic,priority,priority -great_pret,very_crit,complete,1,convenient,convenient,problematic,not_recom,not_recom -great_pret,very_crit,complete,1,convenient,inconv,nonprob,recommended,priority -great_pret,very_crit,complete,1,convenient,inconv,nonprob,priority,spec_prior -great_pret,very_crit,complete,1,convenient,inconv,nonprob,not_recom,not_recom -great_pret,very_crit,complete,1,convenient,inconv,slightly_prob,recommended,priority -great_pret,very_crit,complete,1,convenient,inconv,slightly_prob,priority,spec_prior -great_pret,very_crit,complete,1,convenient,inconv,slightly_prob,not_recom,not_recom -great_pret,very_crit,complete,1,convenient,inconv,problematic,recommended,spec_prior -great_pret,very_crit,complete,1,convenient,inconv,problematic,priority,spec_prior -great_pret,very_crit,complete,1,convenient,inconv,problematic,not_recom,not_recom -great_pret,very_crit,complete,1,less_conv,convenient,nonprob,recommended,priority -great_pret,very_crit,complete,1,less_conv,convenient,nonprob,priority,spec_prior -great_pret,very_crit,complete,1,less_conv,convenient,nonprob,not_recom,not_recom 
-great_pret,very_crit,complete,1,less_conv,convenient,slightly_prob,recommended,priority -great_pret,very_crit,complete,1,less_conv,convenient,slightly_prob,priority,spec_prior -great_pret,very_crit,complete,1,less_conv,convenient,slightly_prob,not_recom,not_recom -great_pret,very_crit,complete,1,less_conv,convenient,problematic,recommended,spec_prior -great_pret,very_crit,complete,1,less_conv,convenient,problematic,priority,spec_prior -great_pret,very_crit,complete,1,less_conv,convenient,problematic,not_recom,not_recom -great_pret,very_crit,complete,1,less_conv,inconv,nonprob,recommended,priority -great_pret,very_crit,complete,1,less_conv,inconv,nonprob,priority,spec_prior -great_pret,very_crit,complete,1,less_conv,inconv,nonprob,not_recom,not_recom -great_pret,very_crit,complete,1,less_conv,inconv,slightly_prob,recommended,priority -great_pret,very_crit,complete,1,less_conv,inconv,slightly_prob,priority,spec_prior -great_pret,very_crit,complete,1,less_conv,inconv,slightly_prob,not_recom,not_recom -great_pret,very_crit,complete,1,less_conv,inconv,problematic,recommended,spec_prior -great_pret,very_crit,complete,1,less_conv,inconv,problematic,priority,spec_prior -great_pret,very_crit,complete,1,less_conv,inconv,problematic,not_recom,not_recom -great_pret,very_crit,complete,1,critical,convenient,nonprob,recommended,priority -great_pret,very_crit,complete,1,critical,convenient,nonprob,priority,spec_prior -great_pret,very_crit,complete,1,critical,convenient,nonprob,not_recom,not_recom -great_pret,very_crit,complete,1,critical,convenient,slightly_prob,recommended,priority -great_pret,very_crit,complete,1,critical,convenient,slightly_prob,priority,spec_prior -great_pret,very_crit,complete,1,critical,convenient,slightly_prob,not_recom,not_recom -great_pret,very_crit,complete,1,critical,convenient,problematic,recommended,spec_prior -great_pret,very_crit,complete,1,critical,convenient,problematic,priority,spec_prior 
-great_pret,very_crit,complete,1,critical,convenient,problematic,not_recom,not_recom -great_pret,very_crit,complete,1,critical,inconv,nonprob,recommended,priority -great_pret,very_crit,complete,1,critical,inconv,nonprob,priority,spec_prior -great_pret,very_crit,complete,1,critical,inconv,nonprob,not_recom,not_recom -great_pret,very_crit,complete,1,critical,inconv,slightly_prob,recommended,priority -great_pret,very_crit,complete,1,critical,inconv,slightly_prob,priority,spec_prior -great_pret,very_crit,complete,1,critical,inconv,slightly_prob,not_recom,not_recom -great_pret,very_crit,complete,1,critical,inconv,problematic,recommended,spec_prior -great_pret,very_crit,complete,1,critical,inconv,problematic,priority,spec_prior -great_pret,very_crit,complete,1,critical,inconv,problematic,not_recom,not_recom -great_pret,very_crit,complete,2,convenient,convenient,nonprob,recommended,priority -great_pret,very_crit,complete,2,convenient,convenient,nonprob,priority,spec_prior -great_pret,very_crit,complete,2,convenient,convenient,nonprob,not_recom,not_recom -great_pret,very_crit,complete,2,convenient,convenient,slightly_prob,recommended,priority -great_pret,very_crit,complete,2,convenient,convenient,slightly_prob,priority,spec_prior -great_pret,very_crit,complete,2,convenient,convenient,slightly_prob,not_recom,not_recom -great_pret,very_crit,complete,2,convenient,convenient,problematic,recommended,spec_prior -great_pret,very_crit,complete,2,convenient,convenient,problematic,priority,spec_prior -great_pret,very_crit,complete,2,convenient,convenient,problematic,not_recom,not_recom -great_pret,very_crit,complete,2,convenient,inconv,nonprob,recommended,priority -great_pret,very_crit,complete,2,convenient,inconv,nonprob,priority,spec_prior -great_pret,very_crit,complete,2,convenient,inconv,nonprob,not_recom,not_recom -great_pret,very_crit,complete,2,convenient,inconv,slightly_prob,recommended,priority 
-great_pret,very_crit,complete,2,convenient,inconv,slightly_prob,priority,spec_prior -great_pret,very_crit,complete,2,convenient,inconv,slightly_prob,not_recom,not_recom -great_pret,very_crit,complete,2,convenient,inconv,problematic,recommended,spec_prior -great_pret,very_crit,complete,2,convenient,inconv,problematic,priority,spec_prior -great_pret,very_crit,complete,2,convenient,inconv,problematic,not_recom,not_recom -great_pret,very_crit,complete,2,less_conv,convenient,nonprob,recommended,priority -great_pret,very_crit,complete,2,less_conv,convenient,nonprob,priority,spec_prior -great_pret,very_crit,complete,2,less_conv,convenient,nonprob,not_recom,not_recom -great_pret,very_crit,complete,2,less_conv,convenient,slightly_prob,recommended,priority -great_pret,very_crit,complete,2,less_conv,convenient,slightly_prob,priority,spec_prior -great_pret,very_crit,complete,2,less_conv,convenient,slightly_prob,not_recom,not_recom -great_pret,very_crit,complete,2,less_conv,convenient,problematic,recommended,spec_prior -great_pret,very_crit,complete,2,less_conv,convenient,problematic,priority,spec_prior -great_pret,very_crit,complete,2,less_conv,convenient,problematic,not_recom,not_recom -great_pret,very_crit,complete,2,less_conv,inconv,nonprob,recommended,priority -great_pret,very_crit,complete,2,less_conv,inconv,nonprob,priority,spec_prior -great_pret,very_crit,complete,2,less_conv,inconv,nonprob,not_recom,not_recom -great_pret,very_crit,complete,2,less_conv,inconv,slightly_prob,recommended,priority -great_pret,very_crit,complete,2,less_conv,inconv,slightly_prob,priority,spec_prior -great_pret,very_crit,complete,2,less_conv,inconv,slightly_prob,not_recom,not_recom -great_pret,very_crit,complete,2,less_conv,inconv,problematic,recommended,spec_prior -great_pret,very_crit,complete,2,less_conv,inconv,problematic,priority,spec_prior -great_pret,very_crit,complete,2,less_conv,inconv,problematic,not_recom,not_recom 
-great_pret,very_crit,complete,2,critical,convenient,nonprob,recommended,spec_prior -great_pret,very_crit,complete,2,critical,convenient,nonprob,priority,spec_prior -great_pret,very_crit,complete,2,critical,convenient,nonprob,not_recom,not_recom -great_pret,very_crit,complete,2,critical,convenient,slightly_prob,recommended,spec_prior -great_pret,very_crit,complete,2,critical,convenient,slightly_prob,priority,spec_prior -great_pret,very_crit,complete,2,critical,convenient,slightly_prob,not_recom,not_recom -great_pret,very_crit,complete,2,critical,convenient,problematic,recommended,spec_prior -great_pret,very_crit,complete,2,critical,convenient,problematic,priority,spec_prior -great_pret,very_crit,complete,2,critical,convenient,problematic,not_recom,not_recom -great_pret,very_crit,complete,2,critical,inconv,nonprob,recommended,spec_prior -great_pret,very_crit,complete,2,critical,inconv,nonprob,priority,spec_prior -great_pret,very_crit,complete,2,critical,inconv,nonprob,not_recom,not_recom -great_pret,very_crit,complete,2,critical,inconv,slightly_prob,recommended,spec_prior -great_pret,very_crit,complete,2,critical,inconv,slightly_prob,priority,spec_prior -great_pret,very_crit,complete,2,critical,inconv,slightly_prob,not_recom,not_recom -great_pret,very_crit,complete,2,critical,inconv,problematic,recommended,spec_prior -great_pret,very_crit,complete,2,critical,inconv,problematic,priority,spec_prior -great_pret,very_crit,complete,2,critical,inconv,problematic,not_recom,not_recom -great_pret,very_crit,complete,3,convenient,convenient,nonprob,recommended,priority -great_pret,very_crit,complete,3,convenient,convenient,nonprob,priority,spec_prior -great_pret,very_crit,complete,3,convenient,convenient,nonprob,not_recom,not_recom -great_pret,very_crit,complete,3,convenient,convenient,slightly_prob,recommended,priority -great_pret,very_crit,complete,3,convenient,convenient,slightly_prob,priority,spec_prior 
-great_pret,very_crit,complete,3,convenient,convenient,slightly_prob,not_recom,not_recom -great_pret,very_crit,complete,3,convenient,convenient,problematic,recommended,spec_prior -great_pret,very_crit,complete,3,convenient,convenient,problematic,priority,spec_prior -great_pret,very_crit,complete,3,convenient,convenient,problematic,not_recom,not_recom -great_pret,very_crit,complete,3,convenient,inconv,nonprob,recommended,spec_prior -great_pret,very_crit,complete,3,convenient,inconv,nonprob,priority,spec_prior -great_pret,very_crit,complete,3,convenient,inconv,nonprob,not_recom,not_recom -great_pret,very_crit,complete,3,convenient,inconv,slightly_prob,recommended,spec_prior -great_pret,very_crit,complete,3,convenient,inconv,slightly_prob,priority,spec_prior -great_pret,very_crit,complete,3,convenient,inconv,slightly_prob,not_recom,not_recom -great_pret,very_crit,complete,3,convenient,inconv,problematic,recommended,spec_prior -great_pret,very_crit,complete,3,convenient,inconv,problematic,priority,spec_prior -great_pret,very_crit,complete,3,convenient,inconv,problematic,not_recom,not_recom -great_pret,very_crit,complete,3,less_conv,convenient,nonprob,recommended,spec_prior -great_pret,very_crit,complete,3,less_conv,convenient,nonprob,priority,spec_prior -great_pret,very_crit,complete,3,less_conv,convenient,nonprob,not_recom,not_recom -great_pret,very_crit,complete,3,less_conv,convenient,slightly_prob,recommended,spec_prior -great_pret,very_crit,complete,3,less_conv,convenient,slightly_prob,priority,spec_prior -great_pret,very_crit,complete,3,less_conv,convenient,slightly_prob,not_recom,not_recom -great_pret,very_crit,complete,3,less_conv,convenient,problematic,recommended,spec_prior -great_pret,very_crit,complete,3,less_conv,convenient,problematic,priority,spec_prior -great_pret,very_crit,complete,3,less_conv,convenient,problematic,not_recom,not_recom -great_pret,very_crit,complete,3,less_conv,inconv,nonprob,recommended,spec_prior 
-great_pret,very_crit,complete,3,less_conv,inconv,nonprob,priority,spec_prior -great_pret,very_crit,complete,3,less_conv,inconv,nonprob,not_recom,not_recom -great_pret,very_crit,complete,3,less_conv,inconv,slightly_prob,recommended,spec_prior -great_pret,very_crit,complete,3,less_conv,inconv,slightly_prob,priority,spec_prior -great_pret,very_crit,complete,3,less_conv,inconv,slightly_prob,not_recom,not_recom -great_pret,very_crit,complete,3,less_conv,inconv,problematic,recommended,spec_prior -great_pret,very_crit,complete,3,less_conv,inconv,problematic,priority,spec_prior -great_pret,very_crit,complete,3,less_conv,inconv,problematic,not_recom,not_recom -great_pret,very_crit,complete,3,critical,convenient,nonprob,recommended,spec_prior -great_pret,very_crit,complete,3,critical,convenient,nonprob,priority,spec_prior -great_pret,very_crit,complete,3,critical,convenient,nonprob,not_recom,not_recom -great_pret,very_crit,complete,3,critical,convenient,slightly_prob,recommended,spec_prior -great_pret,very_crit,complete,3,critical,convenient,slightly_prob,priority,spec_prior -great_pret,very_crit,complete,3,critical,convenient,slightly_prob,not_recom,not_recom -great_pret,very_crit,complete,3,critical,convenient,problematic,recommended,spec_prior -great_pret,very_crit,complete,3,critical,convenient,problematic,priority,spec_prior -great_pret,very_crit,complete,3,critical,convenient,problematic,not_recom,not_recom -great_pret,very_crit,complete,3,critical,inconv,nonprob,recommended,spec_prior -great_pret,very_crit,complete,3,critical,inconv,nonprob,priority,spec_prior -great_pret,very_crit,complete,3,critical,inconv,nonprob,not_recom,not_recom -great_pret,very_crit,complete,3,critical,inconv,slightly_prob,recommended,spec_prior -great_pret,very_crit,complete,3,critical,inconv,slightly_prob,priority,spec_prior -great_pret,very_crit,complete,3,critical,inconv,slightly_prob,not_recom,not_recom -great_pret,very_crit,complete,3,critical,inconv,problematic,recommended,spec_prior 
-great_pret,very_crit,complete,3,critical,inconv,problematic,priority,spec_prior -great_pret,very_crit,complete,3,critical,inconv,problematic,not_recom,not_recom -great_pret,very_crit,complete,more,convenient,convenient,nonprob,recommended,priority -great_pret,very_crit,complete,more,convenient,convenient,nonprob,priority,spec_prior -great_pret,very_crit,complete,more,convenient,convenient,nonprob,not_recom,not_recom -great_pret,very_crit,complete,more,convenient,convenient,slightly_prob,recommended,priority -great_pret,very_crit,complete,more,convenient,convenient,slightly_prob,priority,spec_prior -great_pret,very_crit,complete,more,convenient,convenient,slightly_prob,not_recom,not_recom -great_pret,very_crit,complete,more,convenient,convenient,problematic,recommended,spec_prior -great_pret,very_crit,complete,more,convenient,convenient,problematic,priority,spec_prior -great_pret,very_crit,complete,more,convenient,convenient,problematic,not_recom,not_recom -great_pret,very_crit,complete,more,convenient,inconv,nonprob,recommended,spec_prior -great_pret,very_crit,complete,more,convenient,inconv,nonprob,priority,spec_prior -great_pret,very_crit,complete,more,convenient,inconv,nonprob,not_recom,not_recom -great_pret,very_crit,complete,more,convenient,inconv,slightly_prob,recommended,spec_prior -great_pret,very_crit,complete,more,convenient,inconv,slightly_prob,priority,spec_prior -great_pret,very_crit,complete,more,convenient,inconv,slightly_prob,not_recom,not_recom -great_pret,very_crit,complete,more,convenient,inconv,problematic,recommended,spec_prior -great_pret,very_crit,complete,more,convenient,inconv,problematic,priority,spec_prior -great_pret,very_crit,complete,more,convenient,inconv,problematic,not_recom,not_recom -great_pret,very_crit,complete,more,less_conv,convenient,nonprob,recommended,spec_prior -great_pret,very_crit,complete,more,less_conv,convenient,nonprob,priority,spec_prior 
-great_pret,very_crit,complete,more,less_conv,convenient,nonprob,not_recom,not_recom -great_pret,very_crit,complete,more,less_conv,convenient,slightly_prob,recommended,spec_prior -great_pret,very_crit,complete,more,less_conv,convenient,slightly_prob,priority,spec_prior -great_pret,very_crit,complete,more,less_conv,convenient,slightly_prob,not_recom,not_recom -great_pret,very_crit,complete,more,less_conv,convenient,problematic,recommended,spec_prior -great_pret,very_crit,complete,more,less_conv,convenient,problematic,priority,spec_prior -great_pret,very_crit,complete,more,less_conv,convenient,problematic,not_recom,not_recom -great_pret,very_crit,complete,more,less_conv,inconv,nonprob,recommended,spec_prior -great_pret,very_crit,complete,more,less_conv,inconv,nonprob,priority,spec_prior -great_pret,very_crit,complete,more,less_conv,inconv,nonprob,not_recom,not_recom -great_pret,very_crit,complete,more,less_conv,inconv,slightly_prob,recommended,spec_prior -great_pret,very_crit,complete,more,less_conv,inconv,slightly_prob,priority,spec_prior -great_pret,very_crit,complete,more,less_conv,inconv,slightly_prob,not_recom,not_recom -great_pret,very_crit,complete,more,less_conv,inconv,problematic,recommended,spec_prior -great_pret,very_crit,complete,more,less_conv,inconv,problematic,priority,spec_prior -great_pret,very_crit,complete,more,less_conv,inconv,problematic,not_recom,not_recom -great_pret,very_crit,complete,more,critical,convenient,nonprob,recommended,spec_prior -great_pret,very_crit,complete,more,critical,convenient,nonprob,priority,spec_prior -great_pret,very_crit,complete,more,critical,convenient,nonprob,not_recom,not_recom -great_pret,very_crit,complete,more,critical,convenient,slightly_prob,recommended,spec_prior -great_pret,very_crit,complete,more,critical,convenient,slightly_prob,priority,spec_prior -great_pret,very_crit,complete,more,critical,convenient,slightly_prob,not_recom,not_recom 
-great_pret,very_crit,complete,more,critical,convenient,problematic,recommended,spec_prior -great_pret,very_crit,complete,more,critical,convenient,problematic,priority,spec_prior -great_pret,very_crit,complete,more,critical,convenient,problematic,not_recom,not_recom -great_pret,very_crit,complete,more,critical,inconv,nonprob,recommended,spec_prior -great_pret,very_crit,complete,more,critical,inconv,nonprob,priority,spec_prior -great_pret,very_crit,complete,more,critical,inconv,nonprob,not_recom,not_recom -great_pret,very_crit,complete,more,critical,inconv,slightly_prob,recommended,spec_prior -great_pret,very_crit,complete,more,critical,inconv,slightly_prob,priority,spec_prior -great_pret,very_crit,complete,more,critical,inconv,slightly_prob,not_recom,not_recom -great_pret,very_crit,complete,more,critical,inconv,problematic,recommended,spec_prior -great_pret,very_crit,complete,more,critical,inconv,problematic,priority,spec_prior -great_pret,very_crit,complete,more,critical,inconv,problematic,not_recom,not_recom -great_pret,very_crit,completed,1,convenient,convenient,nonprob,recommended,priority -great_pret,very_crit,completed,1,convenient,convenient,nonprob,priority,spec_prior -great_pret,very_crit,completed,1,convenient,convenient,nonprob,not_recom,not_recom -great_pret,very_crit,completed,1,convenient,convenient,slightly_prob,recommended,priority -great_pret,very_crit,completed,1,convenient,convenient,slightly_prob,priority,spec_prior -great_pret,very_crit,completed,1,convenient,convenient,slightly_prob,not_recom,not_recom -great_pret,very_crit,completed,1,convenient,convenient,problematic,recommended,spec_prior -great_pret,very_crit,completed,1,convenient,convenient,problematic,priority,spec_prior -great_pret,very_crit,completed,1,convenient,convenient,problematic,not_recom,not_recom -great_pret,very_crit,completed,1,convenient,inconv,nonprob,recommended,priority -great_pret,very_crit,completed,1,convenient,inconv,nonprob,priority,spec_prior 
-great_pret,very_crit,completed,1,convenient,inconv,nonprob,not_recom,not_recom -great_pret,very_crit,completed,1,convenient,inconv,slightly_prob,recommended,priority -great_pret,very_crit,completed,1,convenient,inconv,slightly_prob,priority,spec_prior -great_pret,very_crit,completed,1,convenient,inconv,slightly_prob,not_recom,not_recom -great_pret,very_crit,completed,1,convenient,inconv,problematic,recommended,spec_prior -great_pret,very_crit,completed,1,convenient,inconv,problematic,priority,spec_prior -great_pret,very_crit,completed,1,convenient,inconv,problematic,not_recom,not_recom -great_pret,very_crit,completed,1,less_conv,convenient,nonprob,recommended,priority -great_pret,very_crit,completed,1,less_conv,convenient,nonprob,priority,spec_prior -great_pret,very_crit,completed,1,less_conv,convenient,nonprob,not_recom,not_recom -great_pret,very_crit,completed,1,less_conv,convenient,slightly_prob,recommended,priority -great_pret,very_crit,completed,1,less_conv,convenient,slightly_prob,priority,spec_prior -great_pret,very_crit,completed,1,less_conv,convenient,slightly_prob,not_recom,not_recom -great_pret,very_crit,completed,1,less_conv,convenient,problematic,recommended,spec_prior -great_pret,very_crit,completed,1,less_conv,convenient,problematic,priority,spec_prior -great_pret,very_crit,completed,1,less_conv,convenient,problematic,not_recom,not_recom -great_pret,very_crit,completed,1,less_conv,inconv,nonprob,recommended,priority -great_pret,very_crit,completed,1,less_conv,inconv,nonprob,priority,spec_prior -great_pret,very_crit,completed,1,less_conv,inconv,nonprob,not_recom,not_recom -great_pret,very_crit,completed,1,less_conv,inconv,slightly_prob,recommended,priority -great_pret,very_crit,completed,1,less_conv,inconv,slightly_prob,priority,spec_prior -great_pret,very_crit,completed,1,less_conv,inconv,slightly_prob,not_recom,not_recom -great_pret,very_crit,completed,1,less_conv,inconv,problematic,recommended,spec_prior 
-great_pret,very_crit,completed,1,less_conv,inconv,problematic,priority,spec_prior -great_pret,very_crit,completed,1,less_conv,inconv,problematic,not_recom,not_recom -great_pret,very_crit,completed,1,critical,convenient,nonprob,recommended,spec_prior -great_pret,very_crit,completed,1,critical,convenient,nonprob,priority,spec_prior -great_pret,very_crit,completed,1,critical,convenient,nonprob,not_recom,not_recom -great_pret,very_crit,completed,1,critical,convenient,slightly_prob,recommended,spec_prior -great_pret,very_crit,completed,1,critical,convenient,slightly_prob,priority,spec_prior -great_pret,very_crit,completed,1,critical,convenient,slightly_prob,not_recom,not_recom -great_pret,very_crit,completed,1,critical,convenient,problematic,recommended,spec_prior -great_pret,very_crit,completed,1,critical,convenient,problematic,priority,spec_prior -great_pret,very_crit,completed,1,critical,convenient,problematic,not_recom,not_recom -great_pret,very_crit,completed,1,critical,inconv,nonprob,recommended,spec_prior -great_pret,very_crit,completed,1,critical,inconv,nonprob,priority,spec_prior -great_pret,very_crit,completed,1,critical,inconv,nonprob,not_recom,not_recom -great_pret,very_crit,completed,1,critical,inconv,slightly_prob,recommended,spec_prior -great_pret,very_crit,completed,1,critical,inconv,slightly_prob,priority,spec_prior -great_pret,very_crit,completed,1,critical,inconv,slightly_prob,not_recom,not_recom -great_pret,very_crit,completed,1,critical,inconv,problematic,recommended,spec_prior -great_pret,very_crit,completed,1,critical,inconv,problematic,priority,spec_prior -great_pret,very_crit,completed,1,critical,inconv,problematic,not_recom,not_recom -great_pret,very_crit,completed,2,convenient,convenient,nonprob,recommended,priority -great_pret,very_crit,completed,2,convenient,convenient,nonprob,priority,spec_prior -great_pret,very_crit,completed,2,convenient,convenient,nonprob,not_recom,not_recom 
-great_pret,very_crit,completed,2,convenient,convenient,slightly_prob,recommended,priority -great_pret,very_crit,completed,2,convenient,convenient,slightly_prob,priority,spec_prior -great_pret,very_crit,completed,2,convenient,convenient,slightly_prob,not_recom,not_recom -great_pret,very_crit,completed,2,convenient,convenient,problematic,recommended,spec_prior -great_pret,very_crit,completed,2,convenient,convenient,problematic,priority,spec_prior -great_pret,very_crit,completed,2,convenient,convenient,problematic,not_recom,not_recom -great_pret,very_crit,completed,2,convenient,inconv,nonprob,recommended,priority -great_pret,very_crit,completed,2,convenient,inconv,nonprob,priority,spec_prior -great_pret,very_crit,completed,2,convenient,inconv,nonprob,not_recom,not_recom -great_pret,very_crit,completed,2,convenient,inconv,slightly_prob,recommended,priority -great_pret,very_crit,completed,2,convenient,inconv,slightly_prob,priority,spec_prior -great_pret,very_crit,completed,2,convenient,inconv,slightly_prob,not_recom,not_recom -great_pret,very_crit,completed,2,convenient,inconv,problematic,recommended,spec_prior -great_pret,very_crit,completed,2,convenient,inconv,problematic,priority,spec_prior -great_pret,very_crit,completed,2,convenient,inconv,problematic,not_recom,not_recom -great_pret,very_crit,completed,2,less_conv,convenient,nonprob,recommended,priority -great_pret,very_crit,completed,2,less_conv,convenient,nonprob,priority,spec_prior -great_pret,very_crit,completed,2,less_conv,convenient,nonprob,not_recom,not_recom -great_pret,very_crit,completed,2,less_conv,convenient,slightly_prob,recommended,priority -great_pret,very_crit,completed,2,less_conv,convenient,slightly_prob,priority,spec_prior -great_pret,very_crit,completed,2,less_conv,convenient,slightly_prob,not_recom,not_recom -great_pret,very_crit,completed,2,less_conv,convenient,problematic,recommended,spec_prior -great_pret,very_crit,completed,2,less_conv,convenient,problematic,priority,spec_prior 
-great_pret,very_crit,completed,2,less_conv,convenient,problematic,not_recom,not_recom -great_pret,very_crit,completed,2,less_conv,inconv,nonprob,recommended,priority -great_pret,very_crit,completed,2,less_conv,inconv,nonprob,priority,spec_prior -great_pret,very_crit,completed,2,less_conv,inconv,nonprob,not_recom,not_recom -great_pret,very_crit,completed,2,less_conv,inconv,slightly_prob,recommended,priority -great_pret,very_crit,completed,2,less_conv,inconv,slightly_prob,priority,spec_prior -great_pret,very_crit,completed,2,less_conv,inconv,slightly_prob,not_recom,not_recom -great_pret,very_crit,completed,2,less_conv,inconv,problematic,recommended,spec_prior -great_pret,very_crit,completed,2,less_conv,inconv,problematic,priority,spec_prior -great_pret,very_crit,completed,2,less_conv,inconv,problematic,not_recom,not_recom -great_pret,very_crit,completed,2,critical,convenient,nonprob,recommended,spec_prior -great_pret,very_crit,completed,2,critical,convenient,nonprob,priority,spec_prior -great_pret,very_crit,completed,2,critical,convenient,nonprob,not_recom,not_recom -great_pret,very_crit,completed,2,critical,convenient,slightly_prob,recommended,spec_prior -great_pret,very_crit,completed,2,critical,convenient,slightly_prob,priority,spec_prior -great_pret,very_crit,completed,2,critical,convenient,slightly_prob,not_recom,not_recom -great_pret,very_crit,completed,2,critical,convenient,problematic,recommended,spec_prior -great_pret,very_crit,completed,2,critical,convenient,problematic,priority,spec_prior -great_pret,very_crit,completed,2,critical,convenient,problematic,not_recom,not_recom -great_pret,very_crit,completed,2,critical,inconv,nonprob,recommended,spec_prior -great_pret,very_crit,completed,2,critical,inconv,nonprob,priority,spec_prior -great_pret,very_crit,completed,2,critical,inconv,nonprob,not_recom,not_recom -great_pret,very_crit,completed,2,critical,inconv,slightly_prob,recommended,spec_prior 
-great_pret,very_crit,completed,2,critical,inconv,slightly_prob,priority,spec_prior -great_pret,very_crit,completed,2,critical,inconv,slightly_prob,not_recom,not_recom -great_pret,very_crit,completed,2,critical,inconv,problematic,recommended,spec_prior -great_pret,very_crit,completed,2,critical,inconv,problematic,priority,spec_prior -great_pret,very_crit,completed,2,critical,inconv,problematic,not_recom,not_recom -great_pret,very_crit,completed,3,convenient,convenient,nonprob,recommended,priority -great_pret,very_crit,completed,3,convenient,convenient,nonprob,priority,spec_prior -great_pret,very_crit,completed,3,convenient,convenient,nonprob,not_recom,not_recom -great_pret,very_crit,completed,3,convenient,convenient,slightly_prob,recommended,priority -great_pret,very_crit,completed,3,convenient,convenient,slightly_prob,priority,spec_prior -great_pret,very_crit,completed,3,convenient,convenient,slightly_prob,not_recom,not_recom -great_pret,very_crit,completed,3,convenient,convenient,problematic,recommended,spec_prior -great_pret,very_crit,completed,3,convenient,convenient,problematic,priority,spec_prior -great_pret,very_crit,completed,3,convenient,convenient,problematic,not_recom,not_recom -great_pret,very_crit,completed,3,convenient,inconv,nonprob,recommended,spec_prior -great_pret,very_crit,completed,3,convenient,inconv,nonprob,priority,spec_prior -great_pret,very_crit,completed,3,convenient,inconv,nonprob,not_recom,not_recom -great_pret,very_crit,completed,3,convenient,inconv,slightly_prob,recommended,spec_prior -great_pret,very_crit,completed,3,convenient,inconv,slightly_prob,priority,spec_prior -great_pret,very_crit,completed,3,convenient,inconv,slightly_prob,not_recom,not_recom -great_pret,very_crit,completed,3,convenient,inconv,problematic,recommended,spec_prior -great_pret,very_crit,completed,3,convenient,inconv,problematic,priority,spec_prior -great_pret,very_crit,completed,3,convenient,inconv,problematic,not_recom,not_recom 
-great_pret,very_crit,completed,3,less_conv,convenient,nonprob,recommended,spec_prior -great_pret,very_crit,completed,3,less_conv,convenient,nonprob,priority,spec_prior -great_pret,very_crit,completed,3,less_conv,convenient,nonprob,not_recom,not_recom -great_pret,very_crit,completed,3,less_conv,convenient,slightly_prob,recommended,spec_prior -great_pret,very_crit,completed,3,less_conv,convenient,slightly_prob,priority,spec_prior -great_pret,very_crit,completed,3,less_conv,convenient,slightly_prob,not_recom,not_recom -great_pret,very_crit,completed,3,less_conv,convenient,problematic,recommended,spec_prior -great_pret,very_crit,completed,3,less_conv,convenient,problematic,priority,spec_prior -great_pret,very_crit,completed,3,less_conv,convenient,problematic,not_recom,not_recom -great_pret,very_crit,completed,3,less_conv,inconv,nonprob,recommended,spec_prior -great_pret,very_crit,completed,3,less_conv,inconv,nonprob,priority,spec_prior -great_pret,very_crit,completed,3,less_conv,inconv,nonprob,not_recom,not_recom -great_pret,very_crit,completed,3,less_conv,inconv,slightly_prob,recommended,spec_prior -great_pret,very_crit,completed,3,less_conv,inconv,slightly_prob,priority,spec_prior -great_pret,very_crit,completed,3,less_conv,inconv,slightly_prob,not_recom,not_recom -great_pret,very_crit,completed,3,less_conv,inconv,problematic,recommended,spec_prior -great_pret,very_crit,completed,3,less_conv,inconv,problematic,priority,spec_prior -great_pret,very_crit,completed,3,less_conv,inconv,problematic,not_recom,not_recom -great_pret,very_crit,completed,3,critical,convenient,nonprob,recommended,spec_prior -great_pret,very_crit,completed,3,critical,convenient,nonprob,priority,spec_prior -great_pret,very_crit,completed,3,critical,convenient,nonprob,not_recom,not_recom -great_pret,very_crit,completed,3,critical,convenient,slightly_prob,recommended,spec_prior -great_pret,very_crit,completed,3,critical,convenient,slightly_prob,priority,spec_prior 
-great_pret,very_crit,completed,3,critical,convenient,slightly_prob,not_recom,not_recom -great_pret,very_crit,completed,3,critical,convenient,problematic,recommended,spec_prior -great_pret,very_crit,completed,3,critical,convenient,problematic,priority,spec_prior -great_pret,very_crit,completed,3,critical,convenient,problematic,not_recom,not_recom -great_pret,very_crit,completed,3,critical,inconv,nonprob,recommended,spec_prior -great_pret,very_crit,completed,3,critical,inconv,nonprob,priority,spec_prior -great_pret,very_crit,completed,3,critical,inconv,nonprob,not_recom,not_recom -great_pret,very_crit,completed,3,critical,inconv,slightly_prob,recommended,spec_prior -great_pret,very_crit,completed,3,critical,inconv,slightly_prob,priority,spec_prior -great_pret,very_crit,completed,3,critical,inconv,slightly_prob,not_recom,not_recom -great_pret,very_crit,completed,3,critical,inconv,problematic,recommended,spec_prior -great_pret,very_crit,completed,3,critical,inconv,problematic,priority,spec_prior -great_pret,very_crit,completed,3,critical,inconv,problematic,not_recom,not_recom -great_pret,very_crit,completed,more,convenient,convenient,nonprob,recommended,priority -great_pret,very_crit,completed,more,convenient,convenient,nonprob,priority,spec_prior -great_pret,very_crit,completed,more,convenient,convenient,nonprob,not_recom,not_recom -great_pret,very_crit,completed,more,convenient,convenient,slightly_prob,recommended,priority -great_pret,very_crit,completed,more,convenient,convenient,slightly_prob,priority,spec_prior -great_pret,very_crit,completed,more,convenient,convenient,slightly_prob,not_recom,not_recom -great_pret,very_crit,completed,more,convenient,convenient,problematic,recommended,spec_prior -great_pret,very_crit,completed,more,convenient,convenient,problematic,priority,spec_prior -great_pret,very_crit,completed,more,convenient,convenient,problematic,not_recom,not_recom -great_pret,very_crit,completed,more,convenient,inconv,nonprob,recommended,spec_prior 
-great_pret,very_crit,completed,more,convenient,inconv,nonprob,priority,spec_prior -great_pret,very_crit,completed,more,convenient,inconv,nonprob,not_recom,not_recom -great_pret,very_crit,completed,more,convenient,inconv,slightly_prob,recommended,spec_prior -great_pret,very_crit,completed,more,convenient,inconv,slightly_prob,priority,spec_prior -great_pret,very_crit,completed,more,convenient,inconv,slightly_prob,not_recom,not_recom -great_pret,very_crit,completed,more,convenient,inconv,problematic,recommended,spec_prior -great_pret,very_crit,completed,more,convenient,inconv,problematic,priority,spec_prior -great_pret,very_crit,completed,more,convenient,inconv,problematic,not_recom,not_recom -great_pret,very_crit,completed,more,less_conv,convenient,nonprob,recommended,spec_prior -great_pret,very_crit,completed,more,less_conv,convenient,nonprob,priority,spec_prior -great_pret,very_crit,completed,more,less_conv,convenient,nonprob,not_recom,not_recom -great_pret,very_crit,completed,more,less_conv,convenient,slightly_prob,recommended,spec_prior -great_pret,very_crit,completed,more,less_conv,convenient,slightly_prob,priority,spec_prior -great_pret,very_crit,completed,more,less_conv,convenient,slightly_prob,not_recom,not_recom -great_pret,very_crit,completed,more,less_conv,convenient,problematic,recommended,spec_prior -great_pret,very_crit,completed,more,less_conv,convenient,problematic,priority,spec_prior -great_pret,very_crit,completed,more,less_conv,convenient,problematic,not_recom,not_recom -great_pret,very_crit,completed,more,less_conv,inconv,nonprob,recommended,spec_prior -great_pret,very_crit,completed,more,less_conv,inconv,nonprob,priority,spec_prior -great_pret,very_crit,completed,more,less_conv,inconv,nonprob,not_recom,not_recom -great_pret,very_crit,completed,more,less_conv,inconv,slightly_prob,recommended,spec_prior -great_pret,very_crit,completed,more,less_conv,inconv,slightly_prob,priority,spec_prior 
-great_pret,very_crit,completed,more,less_conv,inconv,slightly_prob,not_recom,not_recom -great_pret,very_crit,completed,more,less_conv,inconv,problematic,recommended,spec_prior -great_pret,very_crit,completed,more,less_conv,inconv,problematic,priority,spec_prior -great_pret,very_crit,completed,more,less_conv,inconv,problematic,not_recom,not_recom -great_pret,very_crit,completed,more,critical,convenient,nonprob,recommended,spec_prior -great_pret,very_crit,completed,more,critical,convenient,nonprob,priority,spec_prior -great_pret,very_crit,completed,more,critical,convenient,nonprob,not_recom,not_recom -great_pret,very_crit,completed,more,critical,convenient,slightly_prob,recommended,spec_prior -great_pret,very_crit,completed,more,critical,convenient,slightly_prob,priority,spec_prior -great_pret,very_crit,completed,more,critical,convenient,slightly_prob,not_recom,not_recom -great_pret,very_crit,completed,more,critical,convenient,problematic,recommended,spec_prior -great_pret,very_crit,completed,more,critical,convenient,problematic,priority,spec_prior -great_pret,very_crit,completed,more,critical,convenient,problematic,not_recom,not_recom -great_pret,very_crit,completed,more,critical,inconv,nonprob,recommended,spec_prior -great_pret,very_crit,completed,more,critical,inconv,nonprob,priority,spec_prior -great_pret,very_crit,completed,more,critical,inconv,nonprob,not_recom,not_recom -great_pret,very_crit,completed,more,critical,inconv,slightly_prob,recommended,spec_prior -great_pret,very_crit,completed,more,critical,inconv,slightly_prob,priority,spec_prior -great_pret,very_crit,completed,more,critical,inconv,slightly_prob,not_recom,not_recom -great_pret,very_crit,completed,more,critical,inconv,problematic,recommended,spec_prior -great_pret,very_crit,completed,more,critical,inconv,problematic,priority,spec_prior -great_pret,very_crit,completed,more,critical,inconv,problematic,not_recom,not_recom 
-great_pret,very_crit,incomplete,1,convenient,convenient,nonprob,recommended,priority -great_pret,very_crit,incomplete,1,convenient,convenient,nonprob,priority,spec_prior -great_pret,very_crit,incomplete,1,convenient,convenient,nonprob,not_recom,not_recom -great_pret,very_crit,incomplete,1,convenient,convenient,slightly_prob,recommended,priority -great_pret,very_crit,incomplete,1,convenient,convenient,slightly_prob,priority,spec_prior -great_pret,very_crit,incomplete,1,convenient,convenient,slightly_prob,not_recom,not_recom -great_pret,very_crit,incomplete,1,convenient,convenient,problematic,recommended,spec_prior -great_pret,very_crit,incomplete,1,convenient,convenient,problematic,priority,spec_prior -great_pret,very_crit,incomplete,1,convenient,convenient,problematic,not_recom,not_recom -great_pret,very_crit,incomplete,1,convenient,inconv,nonprob,recommended,priority -great_pret,very_crit,incomplete,1,convenient,inconv,nonprob,priority,spec_prior -great_pret,very_crit,incomplete,1,convenient,inconv,nonprob,not_recom,not_recom -great_pret,very_crit,incomplete,1,convenient,inconv,slightly_prob,recommended,priority -great_pret,very_crit,incomplete,1,convenient,inconv,slightly_prob,priority,spec_prior -great_pret,very_crit,incomplete,1,convenient,inconv,slightly_prob,not_recom,not_recom -great_pret,very_crit,incomplete,1,convenient,inconv,problematic,recommended,spec_prior -great_pret,very_crit,incomplete,1,convenient,inconv,problematic,priority,spec_prior -great_pret,very_crit,incomplete,1,convenient,inconv,problematic,not_recom,not_recom -great_pret,very_crit,incomplete,1,less_conv,convenient,nonprob,recommended,priority -great_pret,very_crit,incomplete,1,less_conv,convenient,nonprob,priority,spec_prior -great_pret,very_crit,incomplete,1,less_conv,convenient,nonprob,not_recom,not_recom -great_pret,very_crit,incomplete,1,less_conv,convenient,slightly_prob,recommended,priority -great_pret,very_crit,incomplete,1,less_conv,convenient,slightly_prob,priority,spec_prior 
-great_pret,very_crit,incomplete,1,less_conv,convenient,slightly_prob,not_recom,not_recom -great_pret,very_crit,incomplete,1,less_conv,convenient,problematic,recommended,spec_prior -great_pret,very_crit,incomplete,1,less_conv,convenient,problematic,priority,spec_prior -great_pret,very_crit,incomplete,1,less_conv,convenient,problematic,not_recom,not_recom -great_pret,very_crit,incomplete,1,less_conv,inconv,nonprob,recommended,priority -great_pret,very_crit,incomplete,1,less_conv,inconv,nonprob,priority,spec_prior -great_pret,very_crit,incomplete,1,less_conv,inconv,nonprob,not_recom,not_recom -great_pret,very_crit,incomplete,1,less_conv,inconv,slightly_prob,recommended,priority -great_pret,very_crit,incomplete,1,less_conv,inconv,slightly_prob,priority,spec_prior -great_pret,very_crit,incomplete,1,less_conv,inconv,slightly_prob,not_recom,not_recom -great_pret,very_crit,incomplete,1,less_conv,inconv,problematic,recommended,spec_prior -great_pret,very_crit,incomplete,1,less_conv,inconv,problematic,priority,spec_prior -great_pret,very_crit,incomplete,1,less_conv,inconv,problematic,not_recom,not_recom -great_pret,very_crit,incomplete,1,critical,convenient,nonprob,recommended,spec_prior -great_pret,very_crit,incomplete,1,critical,convenient,nonprob,priority,spec_prior -great_pret,very_crit,incomplete,1,critical,convenient,nonprob,not_recom,not_recom -great_pret,very_crit,incomplete,1,critical,convenient,slightly_prob,recommended,spec_prior -great_pret,very_crit,incomplete,1,critical,convenient,slightly_prob,priority,spec_prior -great_pret,very_crit,incomplete,1,critical,convenient,slightly_prob,not_recom,not_recom -great_pret,very_crit,incomplete,1,critical,convenient,problematic,recommended,spec_prior -great_pret,very_crit,incomplete,1,critical,convenient,problematic,priority,spec_prior -great_pret,very_crit,incomplete,1,critical,convenient,problematic,not_recom,not_recom -great_pret,very_crit,incomplete,1,critical,inconv,nonprob,recommended,spec_prior 
-great_pret,very_crit,incomplete,1,critical,inconv,nonprob,priority,spec_prior -great_pret,very_crit,incomplete,1,critical,inconv,nonprob,not_recom,not_recom -great_pret,very_crit,incomplete,1,critical,inconv,slightly_prob,recommended,spec_prior -great_pret,very_crit,incomplete,1,critical,inconv,slightly_prob,priority,spec_prior -great_pret,very_crit,incomplete,1,critical,inconv,slightly_prob,not_recom,not_recom -great_pret,very_crit,incomplete,1,critical,inconv,problematic,recommended,spec_prior -great_pret,very_crit,incomplete,1,critical,inconv,problematic,priority,spec_prior -great_pret,very_crit,incomplete,1,critical,inconv,problematic,not_recom,not_recom -great_pret,very_crit,incomplete,2,convenient,convenient,nonprob,recommended,priority -great_pret,very_crit,incomplete,2,convenient,convenient,nonprob,priority,spec_prior -great_pret,very_crit,incomplete,2,convenient,convenient,nonprob,not_recom,not_recom -great_pret,very_crit,incomplete,2,convenient,convenient,slightly_prob,recommended,priority -great_pret,very_crit,incomplete,2,convenient,convenient,slightly_prob,priority,spec_prior -great_pret,very_crit,incomplete,2,convenient,convenient,slightly_prob,not_recom,not_recom -great_pret,very_crit,incomplete,2,convenient,convenient,problematic,recommended,spec_prior -great_pret,very_crit,incomplete,2,convenient,convenient,problematic,priority,spec_prior -great_pret,very_crit,incomplete,2,convenient,convenient,problematic,not_recom,not_recom -great_pret,very_crit,incomplete,2,convenient,inconv,nonprob,recommended,spec_prior -great_pret,very_crit,incomplete,2,convenient,inconv,nonprob,priority,spec_prior -great_pret,very_crit,incomplete,2,convenient,inconv,nonprob,not_recom,not_recom -great_pret,very_crit,incomplete,2,convenient,inconv,slightly_prob,recommended,spec_prior -great_pret,very_crit,incomplete,2,convenient,inconv,slightly_prob,priority,spec_prior -great_pret,very_crit,incomplete,2,convenient,inconv,slightly_prob,not_recom,not_recom 
-great_pret,very_crit,incomplete,2,convenient,inconv,problematic,recommended,spec_prior -great_pret,very_crit,incomplete,2,convenient,inconv,problematic,priority,spec_prior -great_pret,very_crit,incomplete,2,convenient,inconv,problematic,not_recom,not_recom -great_pret,very_crit,incomplete,2,less_conv,convenient,nonprob,recommended,spec_prior -great_pret,very_crit,incomplete,2,less_conv,convenient,nonprob,priority,spec_prior -great_pret,very_crit,incomplete,2,less_conv,convenient,nonprob,not_recom,not_recom -great_pret,very_crit,incomplete,2,less_conv,convenient,slightly_prob,recommended,spec_prior -great_pret,very_crit,incomplete,2,less_conv,convenient,slightly_prob,priority,spec_prior -great_pret,very_crit,incomplete,2,less_conv,convenient,slightly_prob,not_recom,not_recom -great_pret,very_crit,incomplete,2,less_conv,convenient,problematic,recommended,spec_prior -great_pret,very_crit,incomplete,2,less_conv,convenient,problematic,priority,spec_prior -great_pret,very_crit,incomplete,2,less_conv,convenient,problematic,not_recom,not_recom -great_pret,very_crit,incomplete,2,less_conv,inconv,nonprob,recommended,spec_prior -great_pret,very_crit,incomplete,2,less_conv,inconv,nonprob,priority,spec_prior -great_pret,very_crit,incomplete,2,less_conv,inconv,nonprob,not_recom,not_recom -great_pret,very_crit,incomplete,2,less_conv,inconv,slightly_prob,recommended,spec_prior -great_pret,very_crit,incomplete,2,less_conv,inconv,slightly_prob,priority,spec_prior -great_pret,very_crit,incomplete,2,less_conv,inconv,slightly_prob,not_recom,not_recom -great_pret,very_crit,incomplete,2,less_conv,inconv,problematic,recommended,spec_prior -great_pret,very_crit,incomplete,2,less_conv,inconv,problematic,priority,spec_prior -great_pret,very_crit,incomplete,2,less_conv,inconv,problematic,not_recom,not_recom -great_pret,very_crit,incomplete,2,critical,convenient,nonprob,recommended,spec_prior -great_pret,very_crit,incomplete,2,critical,convenient,nonprob,priority,spec_prior 
-great_pret,very_crit,incomplete,2,critical,convenient,nonprob,not_recom,not_recom -great_pret,very_crit,incomplete,2,critical,convenient,slightly_prob,recommended,spec_prior -great_pret,very_crit,incomplete,2,critical,convenient,slightly_prob,priority,spec_prior -great_pret,very_crit,incomplete,2,critical,convenient,slightly_prob,not_recom,not_recom -great_pret,very_crit,incomplete,2,critical,convenient,problematic,recommended,spec_prior -great_pret,very_crit,incomplete,2,critical,convenient,problematic,priority,spec_prior -great_pret,very_crit,incomplete,2,critical,convenient,problematic,not_recom,not_recom -great_pret,very_crit,incomplete,2,critical,inconv,nonprob,recommended,spec_prior -great_pret,very_crit,incomplete,2,critical,inconv,nonprob,priority,spec_prior -great_pret,very_crit,incomplete,2,critical,inconv,nonprob,not_recom,not_recom -great_pret,very_crit,incomplete,2,critical,inconv,slightly_prob,recommended,spec_prior -great_pret,very_crit,incomplete,2,critical,inconv,slightly_prob,priority,spec_prior -great_pret,very_crit,incomplete,2,critical,inconv,slightly_prob,not_recom,not_recom -great_pret,very_crit,incomplete,2,critical,inconv,problematic,recommended,spec_prior -great_pret,very_crit,incomplete,2,critical,inconv,problematic,priority,spec_prior -great_pret,very_crit,incomplete,2,critical,inconv,problematic,not_recom,not_recom -great_pret,very_crit,incomplete,3,convenient,convenient,nonprob,recommended,priority -great_pret,very_crit,incomplete,3,convenient,convenient,nonprob,priority,spec_prior -great_pret,very_crit,incomplete,3,convenient,convenient,nonprob,not_recom,not_recom -great_pret,very_crit,incomplete,3,convenient,convenient,slightly_prob,recommended,priority -great_pret,very_crit,incomplete,3,convenient,convenient,slightly_prob,priority,spec_prior -great_pret,very_crit,incomplete,3,convenient,convenient,slightly_prob,not_recom,not_recom -great_pret,very_crit,incomplete,3,convenient,convenient,problematic,recommended,spec_prior 
-great_pret,very_crit,incomplete,3,convenient,convenient,problematic,priority,spec_prior -great_pret,very_crit,incomplete,3,convenient,convenient,problematic,not_recom,not_recom -great_pret,very_crit,incomplete,3,convenient,inconv,nonprob,recommended,spec_prior -great_pret,very_crit,incomplete,3,convenient,inconv,nonprob,priority,spec_prior -great_pret,very_crit,incomplete,3,convenient,inconv,nonprob,not_recom,not_recom -great_pret,very_crit,incomplete,3,convenient,inconv,slightly_prob,recommended,spec_prior -great_pret,very_crit,incomplete,3,convenient,inconv,slightly_prob,priority,spec_prior -great_pret,very_crit,incomplete,3,convenient,inconv,slightly_prob,not_recom,not_recom -great_pret,very_crit,incomplete,3,convenient,inconv,problematic,recommended,spec_prior -great_pret,very_crit,incomplete,3,convenient,inconv,problematic,priority,spec_prior -great_pret,very_crit,incomplete,3,convenient,inconv,problematic,not_recom,not_recom -great_pret,very_crit,incomplete,3,less_conv,convenient,nonprob,recommended,spec_prior -great_pret,very_crit,incomplete,3,less_conv,convenient,nonprob,priority,spec_prior -great_pret,very_crit,incomplete,3,less_conv,convenient,nonprob,not_recom,not_recom -great_pret,very_crit,incomplete,3,less_conv,convenient,slightly_prob,recommended,spec_prior -great_pret,very_crit,incomplete,3,less_conv,convenient,slightly_prob,priority,spec_prior -great_pret,very_crit,incomplete,3,less_conv,convenient,slightly_prob,not_recom,not_recom -great_pret,very_crit,incomplete,3,less_conv,convenient,problematic,recommended,spec_prior -great_pret,very_crit,incomplete,3,less_conv,convenient,problematic,priority,spec_prior -great_pret,very_crit,incomplete,3,less_conv,convenient,problematic,not_recom,not_recom -great_pret,very_crit,incomplete,3,less_conv,inconv,nonprob,recommended,spec_prior -great_pret,very_crit,incomplete,3,less_conv,inconv,nonprob,priority,spec_prior -great_pret,very_crit,incomplete,3,less_conv,inconv,nonprob,not_recom,not_recom 
-great_pret,very_crit,incomplete,3,less_conv,inconv,slightly_prob,recommended,spec_prior -great_pret,very_crit,incomplete,3,less_conv,inconv,slightly_prob,priority,spec_prior -great_pret,very_crit,incomplete,3,less_conv,inconv,slightly_prob,not_recom,not_recom -great_pret,very_crit,incomplete,3,less_conv,inconv,problematic,recommended,spec_prior -great_pret,very_crit,incomplete,3,less_conv,inconv,problematic,priority,spec_prior -great_pret,very_crit,incomplete,3,less_conv,inconv,problematic,not_recom,not_recom -great_pret,very_crit,incomplete,3,critical,convenient,nonprob,recommended,spec_prior -great_pret,very_crit,incomplete,3,critical,convenient,nonprob,priority,spec_prior -great_pret,very_crit,incomplete,3,critical,convenient,nonprob,not_recom,not_recom -great_pret,very_crit,incomplete,3,critical,convenient,slightly_prob,recommended,spec_prior -great_pret,very_crit,incomplete,3,critical,convenient,slightly_prob,priority,spec_prior -great_pret,very_crit,incomplete,3,critical,convenient,slightly_prob,not_recom,not_recom -great_pret,very_crit,incomplete,3,critical,convenient,problematic,recommended,spec_prior -great_pret,very_crit,incomplete,3,critical,convenient,problematic,priority,spec_prior -great_pret,very_crit,incomplete,3,critical,convenient,problematic,not_recom,not_recom -great_pret,very_crit,incomplete,3,critical,inconv,nonprob,recommended,spec_prior -great_pret,very_crit,incomplete,3,critical,inconv,nonprob,priority,spec_prior -great_pret,very_crit,incomplete,3,critical,inconv,nonprob,not_recom,not_recom -great_pret,very_crit,incomplete,3,critical,inconv,slightly_prob,recommended,spec_prior -great_pret,very_crit,incomplete,3,critical,inconv,slightly_prob,priority,spec_prior -great_pret,very_crit,incomplete,3,critical,inconv,slightly_prob,not_recom,not_recom -great_pret,very_crit,incomplete,3,critical,inconv,problematic,recommended,spec_prior -great_pret,very_crit,incomplete,3,critical,inconv,problematic,priority,spec_prior 
-great_pret,very_crit,incomplete,3,critical,inconv,problematic,not_recom,not_recom -great_pret,very_crit,incomplete,more,convenient,convenient,nonprob,recommended,priority -great_pret,very_crit,incomplete,more,convenient,convenient,nonprob,priority,spec_prior -great_pret,very_crit,incomplete,more,convenient,convenient,nonprob,not_recom,not_recom -great_pret,very_crit,incomplete,more,convenient,convenient,slightly_prob,recommended,priority -great_pret,very_crit,incomplete,more,convenient,convenient,slightly_prob,priority,spec_prior -great_pret,very_crit,incomplete,more,convenient,convenient,slightly_prob,not_recom,not_recom -great_pret,very_crit,incomplete,more,convenient,convenient,problematic,recommended,spec_prior -great_pret,very_crit,incomplete,more,convenient,convenient,problematic,priority,spec_prior -great_pret,very_crit,incomplete,more,convenient,convenient,problematic,not_recom,not_recom -great_pret,very_crit,incomplete,more,convenient,inconv,nonprob,recommended,spec_prior -great_pret,very_crit,incomplete,more,convenient,inconv,nonprob,priority,spec_prior -great_pret,very_crit,incomplete,more,convenient,inconv,nonprob,not_recom,not_recom -great_pret,very_crit,incomplete,more,convenient,inconv,slightly_prob,recommended,spec_prior -great_pret,very_crit,incomplete,more,convenient,inconv,slightly_prob,priority,spec_prior -great_pret,very_crit,incomplete,more,convenient,inconv,slightly_prob,not_recom,not_recom -great_pret,very_crit,incomplete,more,convenient,inconv,problematic,recommended,spec_prior -great_pret,very_crit,incomplete,more,convenient,inconv,problematic,priority,spec_prior -great_pret,very_crit,incomplete,more,convenient,inconv,problematic,not_recom,not_recom -great_pret,very_crit,incomplete,more,less_conv,convenient,nonprob,recommended,spec_prior -great_pret,very_crit,incomplete,more,less_conv,convenient,nonprob,priority,spec_prior -great_pret,very_crit,incomplete,more,less_conv,convenient,nonprob,not_recom,not_recom 
-great_pret,very_crit,incomplete,more,less_conv,convenient,slightly_prob,recommended,spec_prior -great_pret,very_crit,incomplete,more,less_conv,convenient,slightly_prob,priority,spec_prior -great_pret,very_crit,incomplete,more,less_conv,convenient,slightly_prob,not_recom,not_recom -great_pret,very_crit,incomplete,more,less_conv,convenient,problematic,recommended,spec_prior -great_pret,very_crit,incomplete,more,less_conv,convenient,problematic,priority,spec_prior -great_pret,very_crit,incomplete,more,less_conv,convenient,problematic,not_recom,not_recom -great_pret,very_crit,incomplete,more,less_conv,inconv,nonprob,recommended,spec_prior -great_pret,very_crit,incomplete,more,less_conv,inconv,nonprob,priority,spec_prior -great_pret,very_crit,incomplete,more,less_conv,inconv,nonprob,not_recom,not_recom -great_pret,very_crit,incomplete,more,less_conv,inconv,slightly_prob,recommended,spec_prior -great_pret,very_crit,incomplete,more,less_conv,inconv,slightly_prob,priority,spec_prior -great_pret,very_crit,incomplete,more,less_conv,inconv,slightly_prob,not_recom,not_recom -great_pret,very_crit,incomplete,more,less_conv,inconv,problematic,recommended,spec_prior -great_pret,very_crit,incomplete,more,less_conv,inconv,problematic,priority,spec_prior -great_pret,very_crit,incomplete,more,less_conv,inconv,problematic,not_recom,not_recom -great_pret,very_crit,incomplete,more,critical,convenient,nonprob,recommended,spec_prior -great_pret,very_crit,incomplete,more,critical,convenient,nonprob,priority,spec_prior -great_pret,very_crit,incomplete,more,critical,convenient,nonprob,not_recom,not_recom -great_pret,very_crit,incomplete,more,critical,convenient,slightly_prob,recommended,spec_prior -great_pret,very_crit,incomplete,more,critical,convenient,slightly_prob,priority,spec_prior -great_pret,very_crit,incomplete,more,critical,convenient,slightly_prob,not_recom,not_recom -great_pret,very_crit,incomplete,more,critical,convenient,problematic,recommended,spec_prior 
-great_pret,very_crit,incomplete,more,critical,convenient,problematic,priority,spec_prior -great_pret,very_crit,incomplete,more,critical,convenient,problematic,not_recom,not_recom -great_pret,very_crit,incomplete,more,critical,inconv,nonprob,recommended,spec_prior -great_pret,very_crit,incomplete,more,critical,inconv,nonprob,priority,spec_prior -great_pret,very_crit,incomplete,more,critical,inconv,nonprob,not_recom,not_recom -great_pret,very_crit,incomplete,more,critical,inconv,slightly_prob,recommended,spec_prior -great_pret,very_crit,incomplete,more,critical,inconv,slightly_prob,priority,spec_prior -great_pret,very_crit,incomplete,more,critical,inconv,slightly_prob,not_recom,not_recom -great_pret,very_crit,incomplete,more,critical,inconv,problematic,recommended,spec_prior -great_pret,very_crit,incomplete,more,critical,inconv,problematic,priority,spec_prior -great_pret,very_crit,incomplete,more,critical,inconv,problematic,not_recom,not_recom -great_pret,very_crit,foster,1,convenient,convenient,nonprob,recommended,priority -great_pret,very_crit,foster,1,convenient,convenient,nonprob,priority,spec_prior -great_pret,very_crit,foster,1,convenient,convenient,nonprob,not_recom,not_recom -great_pret,very_crit,foster,1,convenient,convenient,slightly_prob,recommended,priority -great_pret,very_crit,foster,1,convenient,convenient,slightly_prob,priority,spec_prior -great_pret,very_crit,foster,1,convenient,convenient,slightly_prob,not_recom,not_recom -great_pret,very_crit,foster,1,convenient,convenient,problematic,recommended,spec_prior -great_pret,very_crit,foster,1,convenient,convenient,problematic,priority,spec_prior -great_pret,very_crit,foster,1,convenient,convenient,problematic,not_recom,not_recom -great_pret,very_crit,foster,1,convenient,inconv,nonprob,recommended,spec_prior -great_pret,very_crit,foster,1,convenient,inconv,nonprob,priority,spec_prior -great_pret,very_crit,foster,1,convenient,inconv,nonprob,not_recom,not_recom 
-great_pret,very_crit,foster,1,convenient,inconv,slightly_prob,recommended,spec_prior -great_pret,very_crit,foster,1,convenient,inconv,slightly_prob,priority,spec_prior -great_pret,very_crit,foster,1,convenient,inconv,slightly_prob,not_recom,not_recom -great_pret,very_crit,foster,1,convenient,inconv,problematic,recommended,spec_prior -great_pret,very_crit,foster,1,convenient,inconv,problematic,priority,spec_prior -great_pret,very_crit,foster,1,convenient,inconv,problematic,not_recom,not_recom -great_pret,very_crit,foster,1,less_conv,convenient,nonprob,recommended,spec_prior -great_pret,very_crit,foster,1,less_conv,convenient,nonprob,priority,spec_prior -great_pret,very_crit,foster,1,less_conv,convenient,nonprob,not_recom,not_recom -great_pret,very_crit,foster,1,less_conv,convenient,slightly_prob,recommended,spec_prior -great_pret,very_crit,foster,1,less_conv,convenient,slightly_prob,priority,spec_prior -great_pret,very_crit,foster,1,less_conv,convenient,slightly_prob,not_recom,not_recom -great_pret,very_crit,foster,1,less_conv,convenient,problematic,recommended,spec_prior -great_pret,very_crit,foster,1,less_conv,convenient,problematic,priority,spec_prior -great_pret,very_crit,foster,1,less_conv,convenient,problematic,not_recom,not_recom -great_pret,very_crit,foster,1,less_conv,inconv,nonprob,recommended,spec_prior -great_pret,very_crit,foster,1,less_conv,inconv,nonprob,priority,spec_prior -great_pret,very_crit,foster,1,less_conv,inconv,nonprob,not_recom,not_recom -great_pret,very_crit,foster,1,less_conv,inconv,slightly_prob,recommended,spec_prior -great_pret,very_crit,foster,1,less_conv,inconv,slightly_prob,priority,spec_prior -great_pret,very_crit,foster,1,less_conv,inconv,slightly_prob,not_recom,not_recom -great_pret,very_crit,foster,1,less_conv,inconv,problematic,recommended,spec_prior -great_pret,very_crit,foster,1,less_conv,inconv,problematic,priority,spec_prior -great_pret,very_crit,foster,1,less_conv,inconv,problematic,not_recom,not_recom 
-great_pret,very_crit,foster,1,critical,convenient,nonprob,recommended,spec_prior -great_pret,very_crit,foster,1,critical,convenient,nonprob,priority,spec_prior -great_pret,very_crit,foster,1,critical,convenient,nonprob,not_recom,not_recom -great_pret,very_crit,foster,1,critical,convenient,slightly_prob,recommended,spec_prior -great_pret,very_crit,foster,1,critical,convenient,slightly_prob,priority,spec_prior -great_pret,very_crit,foster,1,critical,convenient,slightly_prob,not_recom,not_recom -great_pret,very_crit,foster,1,critical,convenient,problematic,recommended,spec_prior -great_pret,very_crit,foster,1,critical,convenient,problematic,priority,spec_prior -great_pret,very_crit,foster,1,critical,convenient,problematic,not_recom,not_recom -great_pret,very_crit,foster,1,critical,inconv,nonprob,recommended,spec_prior -great_pret,very_crit,foster,1,critical,inconv,nonprob,priority,spec_prior -great_pret,very_crit,foster,1,critical,inconv,nonprob,not_recom,not_recom -great_pret,very_crit,foster,1,critical,inconv,slightly_prob,recommended,spec_prior -great_pret,very_crit,foster,1,critical,inconv,slightly_prob,priority,spec_prior -great_pret,very_crit,foster,1,critical,inconv,slightly_prob,not_recom,not_recom -great_pret,very_crit,foster,1,critical,inconv,problematic,recommended,spec_prior -great_pret,very_crit,foster,1,critical,inconv,problematic,priority,spec_prior -great_pret,very_crit,foster,1,critical,inconv,problematic,not_recom,not_recom -great_pret,very_crit,foster,2,convenient,convenient,nonprob,recommended,priority -great_pret,very_crit,foster,2,convenient,convenient,nonprob,priority,spec_prior -great_pret,very_crit,foster,2,convenient,convenient,nonprob,not_recom,not_recom -great_pret,very_crit,foster,2,convenient,convenient,slightly_prob,recommended,priority -great_pret,very_crit,foster,2,convenient,convenient,slightly_prob,priority,spec_prior -great_pret,very_crit,foster,2,convenient,convenient,slightly_prob,not_recom,not_recom 
-great_pret,very_crit,foster,2,convenient,convenient,problematic,recommended,spec_prior -great_pret,very_crit,foster,2,convenient,convenient,problematic,priority,spec_prior -great_pret,very_crit,foster,2,convenient,convenient,problematic,not_recom,not_recom -great_pret,very_crit,foster,2,convenient,inconv,nonprob,recommended,spec_prior -great_pret,very_crit,foster,2,convenient,inconv,nonprob,priority,spec_prior -great_pret,very_crit,foster,2,convenient,inconv,nonprob,not_recom,not_recom -great_pret,very_crit,foster,2,convenient,inconv,slightly_prob,recommended,spec_prior -great_pret,very_crit,foster,2,convenient,inconv,slightly_prob,priority,spec_prior -great_pret,very_crit,foster,2,convenient,inconv,slightly_prob,not_recom,not_recom -great_pret,very_crit,foster,2,convenient,inconv,problematic,recommended,spec_prior -great_pret,very_crit,foster,2,convenient,inconv,problematic,priority,spec_prior -great_pret,very_crit,foster,2,convenient,inconv,problematic,not_recom,not_recom -great_pret,very_crit,foster,2,less_conv,convenient,nonprob,recommended,spec_prior -great_pret,very_crit,foster,2,less_conv,convenient,nonprob,priority,spec_prior -great_pret,very_crit,foster,2,less_conv,convenient,nonprob,not_recom,not_recom -great_pret,very_crit,foster,2,less_conv,convenient,slightly_prob,recommended,spec_prior -great_pret,very_crit,foster,2,less_conv,convenient,slightly_prob,priority,spec_prior -great_pret,very_crit,foster,2,less_conv,convenient,slightly_prob,not_recom,not_recom -great_pret,very_crit,foster,2,less_conv,convenient,problematic,recommended,spec_prior -great_pret,very_crit,foster,2,less_conv,convenient,problematic,priority,spec_prior -great_pret,very_crit,foster,2,less_conv,convenient,problematic,not_recom,not_recom -great_pret,very_crit,foster,2,less_conv,inconv,nonprob,recommended,spec_prior -great_pret,very_crit,foster,2,less_conv,inconv,nonprob,priority,spec_prior -great_pret,very_crit,foster,2,less_conv,inconv,nonprob,not_recom,not_recom 
-great_pret,very_crit,foster,2,less_conv,inconv,slightly_prob,recommended,spec_prior -great_pret,very_crit,foster,2,less_conv,inconv,slightly_prob,priority,spec_prior -great_pret,very_crit,foster,2,less_conv,inconv,slightly_prob,not_recom,not_recom -great_pret,very_crit,foster,2,less_conv,inconv,problematic,recommended,spec_prior -great_pret,very_crit,foster,2,less_conv,inconv,problematic,priority,spec_prior -great_pret,very_crit,foster,2,less_conv,inconv,problematic,not_recom,not_recom -great_pret,very_crit,foster,2,critical,convenient,nonprob,recommended,spec_prior -great_pret,very_crit,foster,2,critical,convenient,nonprob,priority,spec_prior -great_pret,very_crit,foster,2,critical,convenient,nonprob,not_recom,not_recom -great_pret,very_crit,foster,2,critical,convenient,slightly_prob,recommended,spec_prior -great_pret,very_crit,foster,2,critical,convenient,slightly_prob,priority,spec_prior -great_pret,very_crit,foster,2,critical,convenient,slightly_prob,not_recom,not_recom -great_pret,very_crit,foster,2,critical,convenient,problematic,recommended,spec_prior -great_pret,very_crit,foster,2,critical,convenient,problematic,priority,spec_prior -great_pret,very_crit,foster,2,critical,convenient,problematic,not_recom,not_recom -great_pret,very_crit,foster,2,critical,inconv,nonprob,recommended,spec_prior -great_pret,very_crit,foster,2,critical,inconv,nonprob,priority,spec_prior -great_pret,very_crit,foster,2,critical,inconv,nonprob,not_recom,not_recom -great_pret,very_crit,foster,2,critical,inconv,slightly_prob,recommended,spec_prior -great_pret,very_crit,foster,2,critical,inconv,slightly_prob,priority,spec_prior -great_pret,very_crit,foster,2,critical,inconv,slightly_prob,not_recom,not_recom -great_pret,very_crit,foster,2,critical,inconv,problematic,recommended,spec_prior -great_pret,very_crit,foster,2,critical,inconv,problematic,priority,spec_prior -great_pret,very_crit,foster,2,critical,inconv,problematic,not_recom,not_recom 
-great_pret,very_crit,foster,3,convenient,convenient,nonprob,recommended,priority -great_pret,very_crit,foster,3,convenient,convenient,nonprob,priority,spec_prior -great_pret,very_crit,foster,3,convenient,convenient,nonprob,not_recom,not_recom -great_pret,very_crit,foster,3,convenient,convenient,slightly_prob,recommended,priority -great_pret,very_crit,foster,3,convenient,convenient,slightly_prob,priority,spec_prior -great_pret,very_crit,foster,3,convenient,convenient,slightly_prob,not_recom,not_recom -great_pret,very_crit,foster,3,convenient,convenient,problematic,recommended,spec_prior -great_pret,very_crit,foster,3,convenient,convenient,problematic,priority,spec_prior -great_pret,very_crit,foster,3,convenient,convenient,problematic,not_recom,not_recom -great_pret,very_crit,foster,3,convenient,inconv,nonprob,recommended,spec_prior -great_pret,very_crit,foster,3,convenient,inconv,nonprob,priority,spec_prior -great_pret,very_crit,foster,3,convenient,inconv,nonprob,not_recom,not_recom -great_pret,very_crit,foster,3,convenient,inconv,slightly_prob,recommended,spec_prior -great_pret,very_crit,foster,3,convenient,inconv,slightly_prob,priority,spec_prior -great_pret,very_crit,foster,3,convenient,inconv,slightly_prob,not_recom,not_recom -great_pret,very_crit,foster,3,convenient,inconv,problematic,recommended,spec_prior -great_pret,very_crit,foster,3,convenient,inconv,problematic,priority,spec_prior -great_pret,very_crit,foster,3,convenient,inconv,problematic,not_recom,not_recom -great_pret,very_crit,foster,3,less_conv,convenient,nonprob,recommended,spec_prior -great_pret,very_crit,foster,3,less_conv,convenient,nonprob,priority,spec_prior -great_pret,very_crit,foster,3,less_conv,convenient,nonprob,not_recom,not_recom -great_pret,very_crit,foster,3,less_conv,convenient,slightly_prob,recommended,spec_prior -great_pret,very_crit,foster,3,less_conv,convenient,slightly_prob,priority,spec_prior -great_pret,very_crit,foster,3,less_conv,convenient,slightly_prob,not_recom,not_recom 
-great_pret,very_crit,foster,3,less_conv,convenient,problematic,recommended,spec_prior -great_pret,very_crit,foster,3,less_conv,convenient,problematic,priority,spec_prior -great_pret,very_crit,foster,3,less_conv,convenient,problematic,not_recom,not_recom -great_pret,very_crit,foster,3,less_conv,inconv,nonprob,recommended,spec_prior -great_pret,very_crit,foster,3,less_conv,inconv,nonprob,priority,spec_prior -great_pret,very_crit,foster,3,less_conv,inconv,nonprob,not_recom,not_recom -great_pret,very_crit,foster,3,less_conv,inconv,slightly_prob,recommended,spec_prior -great_pret,very_crit,foster,3,less_conv,inconv,slightly_prob,priority,spec_prior -great_pret,very_crit,foster,3,less_conv,inconv,slightly_prob,not_recom,not_recom -great_pret,very_crit,foster,3,less_conv,inconv,problematic,recommended,spec_prior -great_pret,very_crit,foster,3,less_conv,inconv,problematic,priority,spec_prior -great_pret,very_crit,foster,3,less_conv,inconv,problematic,not_recom,not_recom -great_pret,very_crit,foster,3,critical,convenient,nonprob,recommended,spec_prior -great_pret,very_crit,foster,3,critical,convenient,nonprob,priority,spec_prior -great_pret,very_crit,foster,3,critical,convenient,nonprob,not_recom,not_recom -great_pret,very_crit,foster,3,critical,convenient,slightly_prob,recommended,spec_prior -great_pret,very_crit,foster,3,critical,convenient,slightly_prob,priority,spec_prior -great_pret,very_crit,foster,3,critical,convenient,slightly_prob,not_recom,not_recom -great_pret,very_crit,foster,3,critical,convenient,problematic,recommended,spec_prior -great_pret,very_crit,foster,3,critical,convenient,problematic,priority,spec_prior -great_pret,very_crit,foster,3,critical,convenient,problematic,not_recom,not_recom -great_pret,very_crit,foster,3,critical,inconv,nonprob,recommended,spec_prior -great_pret,very_crit,foster,3,critical,inconv,nonprob,priority,spec_prior -great_pret,very_crit,foster,3,critical,inconv,nonprob,not_recom,not_recom 
-great_pret,very_crit,foster,3,critical,inconv,slightly_prob,recommended,spec_prior -great_pret,very_crit,foster,3,critical,inconv,slightly_prob,priority,spec_prior -great_pret,very_crit,foster,3,critical,inconv,slightly_prob,not_recom,not_recom -great_pret,very_crit,foster,3,critical,inconv,problematic,recommended,spec_prior -great_pret,very_crit,foster,3,critical,inconv,problematic,priority,spec_prior -great_pret,very_crit,foster,3,critical,inconv,problematic,not_recom,not_recom -great_pret,very_crit,foster,more,convenient,convenient,nonprob,recommended,priority -great_pret,very_crit,foster,more,convenient,convenient,nonprob,priority,spec_prior -great_pret,very_crit,foster,more,convenient,convenient,nonprob,not_recom,not_recom -great_pret,very_crit,foster,more,convenient,convenient,slightly_prob,recommended,priority -great_pret,very_crit,foster,more,convenient,convenient,slightly_prob,priority,spec_prior -great_pret,very_crit,foster,more,convenient,convenient,slightly_prob,not_recom,not_recom -great_pret,very_crit,foster,more,convenient,convenient,problematic,recommended,spec_prior -great_pret,very_crit,foster,more,convenient,convenient,problematic,priority,spec_prior -great_pret,very_crit,foster,more,convenient,convenient,problematic,not_recom,not_recom -great_pret,very_crit,foster,more,convenient,inconv,nonprob,recommended,spec_prior -great_pret,very_crit,foster,more,convenient,inconv,nonprob,priority,spec_prior -great_pret,very_crit,foster,more,convenient,inconv,nonprob,not_recom,not_recom -great_pret,very_crit,foster,more,convenient,inconv,slightly_prob,recommended,spec_prior -great_pret,very_crit,foster,more,convenient,inconv,slightly_prob,priority,spec_prior -great_pret,very_crit,foster,more,convenient,inconv,slightly_prob,not_recom,not_recom -great_pret,very_crit,foster,more,convenient,inconv,problematic,recommended,spec_prior -great_pret,very_crit,foster,more,convenient,inconv,problematic,priority,spec_prior 
-great_pret,very_crit,foster,more,convenient,inconv,problematic,not_recom,not_recom -great_pret,very_crit,foster,more,less_conv,convenient,nonprob,recommended,spec_prior -great_pret,very_crit,foster,more,less_conv,convenient,nonprob,priority,spec_prior -great_pret,very_crit,foster,more,less_conv,convenient,nonprob,not_recom,not_recom -great_pret,very_crit,foster,more,less_conv,convenient,slightly_prob,recommended,spec_prior -great_pret,very_crit,foster,more,less_conv,convenient,slightly_prob,priority,spec_prior -great_pret,very_crit,foster,more,less_conv,convenient,slightly_prob,not_recom,not_recom -great_pret,very_crit,foster,more,less_conv,convenient,problematic,recommended,spec_prior -great_pret,very_crit,foster,more,less_conv,convenient,problematic,priority,spec_prior -great_pret,very_crit,foster,more,less_conv,convenient,problematic,not_recom,not_recom -great_pret,very_crit,foster,more,less_conv,inconv,nonprob,recommended,spec_prior -great_pret,very_crit,foster,more,less_conv,inconv,nonprob,priority,spec_prior -great_pret,very_crit,foster,more,less_conv,inconv,nonprob,not_recom,not_recom -great_pret,very_crit,foster,more,less_conv,inconv,slightly_prob,recommended,spec_prior -great_pret,very_crit,foster,more,less_conv,inconv,slightly_prob,priority,spec_prior -great_pret,very_crit,foster,more,less_conv,inconv,slightly_prob,not_recom,not_recom -great_pret,very_crit,foster,more,less_conv,inconv,problematic,recommended,spec_prior -great_pret,very_crit,foster,more,less_conv,inconv,problematic,priority,spec_prior -great_pret,very_crit,foster,more,less_conv,inconv,problematic,not_recom,not_recom -great_pret,very_crit,foster,more,critical,convenient,nonprob,recommended,spec_prior -great_pret,very_crit,foster,more,critical,convenient,nonprob,priority,spec_prior -great_pret,very_crit,foster,more,critical,convenient,nonprob,not_recom,not_recom -great_pret,very_crit,foster,more,critical,convenient,slightly_prob,recommended,spec_prior 
-great_pret,very_crit,foster,more,critical,convenient,slightly_prob,priority,spec_prior -great_pret,very_crit,foster,more,critical,convenient,slightly_prob,not_recom,not_recom -great_pret,very_crit,foster,more,critical,convenient,problematic,recommended,spec_prior -great_pret,very_crit,foster,more,critical,convenient,problematic,priority,spec_prior -great_pret,very_crit,foster,more,critical,convenient,problematic,not_recom,not_recom -great_pret,very_crit,foster,more,critical,inconv,nonprob,recommended,spec_prior -great_pret,very_crit,foster,more,critical,inconv,nonprob,priority,spec_prior -great_pret,very_crit,foster,more,critical,inconv,nonprob,not_recom,not_recom -great_pret,very_crit,foster,more,critical,inconv,slightly_prob,recommended,spec_prior -great_pret,very_crit,foster,more,critical,inconv,slightly_prob,priority,spec_prior -great_pret,very_crit,foster,more,critical,inconv,slightly_prob,not_recom,not_recom -great_pret,very_crit,foster,more,critical,inconv,problematic,recommended,spec_prior -great_pret,very_crit,foster,more,critical,inconv,problematic,priority,spec_prior -great_pret,very_crit,foster,more,critical,inconv,problematic,not_recom,not_recom - diff --git a/katabatic/nursery/nursery.names b/katabatic/nursery/nursery.names deleted file mode 100644 index 09b489f..0000000 --- a/katabatic/nursery/nursery.names +++ /dev/null @@ -1,101 +0,0 @@ -1. Title: Nursery Database - -2. Sources: - (a) Creator: Vladislav Rajkovic et al. (13 experts) - (b) Donors: Marko Bohanec (marko.bohanec@ijs.si) - Blaz Zupan (blaz.zupan@ijs.si) - (c) Date: June, 1997 - -3. Past Usage: - - The hierarchical decision model, from which this dataset is - derived, was first presented in - - M. Olave, V. Rajkovic, M. Bohanec: An application for admission in - public school systems. In (I. Th. M. Snellen and W. B. H. J. van de - Donk and J.-P. Baquiast, editors) Expert Systems in Public - Administration, pages 145-160. Elsevier Science Publishers (North - Holland)}, 1989. 
- - Within machine-learning, this dataset was used for the evaluation - of HINT (Hierarchy INduction Tool), which was proved to be able to - completely reconstruct the original hierarchical model. This, - together with a comparison with C4.5, is presented in - - B. Zupan, M. Bohanec, I. Bratko, J. Demsar: Machine learning by - function decomposition. ICML-97, Nashville, TN. 1997 (to appear) - -4. Relevant Information Paragraph: - - Nursery Database was derived from a hierarchical decision model - originally developed to rank applications for nursery schools. It - was used during several years in 1980's when there was excessive - enrollment to these schools in Ljubljana, Slovenia, and the - rejected applications frequently needed an objective - explanation. The final decision depended on three subproblems: - occupation of parents and child's nursery, family structure and - financial standing, and social and health picture of the family. - The model was developed within expert system shell for decision - making DEX (M. Bohanec, V. Rajkovic: Expert system for decision - making. Sistemica 1(1), pp. 145-157, 1990.). - - The hierarchical model ranks nursery-school applications according - to the following concept structure: - - NURSERY Evaluation of applications for nursery schools - . EMPLOY Employment of parents and child's nursery - . . parents Parents' occupation - . . has_nurs Child's nursery - . STRUCT_FINAN Family structure and financial standings - . . STRUCTURE Family structure - . . . form Form of the family - . . . children Number of children - . . housing Housing conditions - . . finance Financial standing of the family - . SOC_HEALTH Social and health picture of the family - . . social Social conditions - . . health Health conditions - - Input attributes are printed in lowercase. Besides the target - concept (NURSERY) the model includes four intermediate concepts: - EMPLOY, STRUCT_FINAN, STRUCTURE, SOC_HEALTH. 
Every concept is in - the original model related to its lower level descendants by a set - of examples (for these examples sets see - http://www-ai.ijs.si/BlazZupan/nursery.html). - - The Nursery Database contains examples with the structural - information removed, i.e., directly relates NURSERY to the eight input - attributes: parents, has_nurs, form, children, housing, finance, - social, health. - - Because of known underlying concept structure, this database may be - particularly useful for testing constructive induction and - structure discovery methods. - -5. Number of Instances: 12960 - (instances completely cover the attribute space) - -6. Number of Attributes: 8 - -7. Attribute Values: - - parents usual, pretentious, great_pret - has_nurs proper, less_proper, improper, critical, very_crit - form complete, completed, incomplete, foster - children 1, 2, 3, more - housing convenient, less_conv, critical - finance convenient, inconv - social non-prob, slightly_prob, problematic - health recommended, priority, not_recom - -8. Missing Attribute Values: none - -9. 
Class Distribution (number of instances per class) - - class N N[%] - ------------------------------ - not_recom 4320 (33.333 %) - recommend 2 ( 0.015 %) - very_recom 328 ( 2.531 %) - priority 4266 (32.917 %) - spec_prior 4044 (31.204 %) diff --git a/katabatic/synthethic_datasets/output.csv b/katabatic/synthethic_datasets/output.csv deleted file mode 100644 index 7f0afe4..0000000 --- a/katabatic/synthethic_datasets/output.csv +++ /dev/null @@ -1,24422 +0,0 @@ -,0,1,2,3,4,5,6,7,8,9,10,11,12,13,14 -0,0,2,0,0,0,0,9,2,3,0,2,3,4,10,0 -1,0,8,0,7,1,6,11,0,4,1,9,3,1,31,1 -2,1,7,0,14,6,0,10,3,2,1,8,13,0,24,0 -3,1,0,0,12,6,6,10,4,0,1,14,3,0,19,0 -4,3,4,0,3,6,1,13,3,3,1,6,16,3,35,0 -5,1,0,0,13,2,6,0,0,0,1,12,15,1,12,0 -6,10,0,0,3,0,0,1,1,4,0,3,13,0,40,0 -7,0,8,0,4,6,0,7,3,0,1,0,2,0,20,0 -8,9,5,0,2,1,0,14,3,0,0,13,2,5,33,0 -9,4,5,0,7,1,6,14,3,1,1,8,2,1,20,0 -10,0,3,0,12,0,4,9,2,0,1,12,15,5,7,0 -11,7,8,0,6,0,3,10,1,4,0,12,10,1,28,0 -12,2,8,0,5,2,6,6,4,1,0,6,0,0,38,0 -13,7,6,0,2,3,1,13,3,4,0,18,17,5,16,1 -14,4,5,0,3,1,5,3,2,2,0,11,12,3,37,0 -15,1,6,0,7,5,5,3,2,2,1,10,18,0,2,0 -16,5,0,0,12,5,2,13,5,0,1,7,17,3,21,1 -17,4,2,0,13,0,2,8,3,0,1,3,2,2,29,0 -18,8,2,0,9,4,2,8,1,0,1,7,19,3,14,1 -19,5,3,0,6,1,4,8,2,3,0,13,9,0,4,0 -20,6,3,0,5,0,3,8,5,0,1,9,10,5,13,1 -21,9,8,0,13,0,2,7,4,1,1,13,9,0,11,0 -22,6,6,0,2,0,6,12,0,1,1,4,3,0,0,0 -23,0,6,0,8,5,3,2,3,3,0,6,18,3,13,0 -24,0,4,0,9,0,2,0,0,4,1,17,18,5,25,0 -25,9,4,0,3,4,1,7,1,0,0,18,19,3,3,1 -26,1,1,0,2,5,1,13,5,4,0,16,7,1,40,1 -27,4,3,0,2,3,2,12,3,0,0,6,18,1,7,0 -28,2,0,0,10,2,6,11,3,2,1,2,8,5,10,0 -29,1,2,0,0,1,4,4,5,0,1,5,0,0,35,0 -30,1,8,0,12,0,6,2,5,0,0,6,19,1,19,0 -31,3,6,0,11,2,1,11,3,1,0,2,11,5,29,0 -32,1,6,0,6,6,5,2,1,4,1,13,15,4,5,0 -33,9,0,0,1,1,3,14,1,0,1,11,20,2,40,0 -34,5,0,0,12,6,6,3,2,3,0,2,0,0,2,0 -35,0,7,0,10,1,0,8,1,0,1,17,15,5,3,0 -36,6,8,0,5,6,4,5,5,4,0,17,12,5,9,0 -37,6,1,0,9,2,0,4,4,4,1,18,14,4,8,1 -38,1,0,0,3,5,3,9,3,2,0,11,13,4,0,0 -39,0,2,0,13,6,6,1,1,0,1,12,2,1,12,0 -40,7,0,0,0,2,1,1,0,3,0,9,1,3,8,1 
-41,9,5,0,7,2,0,4,3,1,1,5,8,4,38,1 -42,6,4,0,5,4,5,2,2,0,1,0,18,2,16,1 -43,7,1,0,3,2,6,5,3,3,0,10,20,3,4,0 -44,9,8,0,8,0,4,7,4,0,0,7,3,5,33,0 -45,8,6,0,8,0,4,5,2,3,1,13,15,0,12,0 -46,7,4,0,12,1,2,1,5,1,0,5,17,1,22,1 -47,2,7,0,2,2,0,0,1,2,1,17,0,5,12,0 -48,1,4,0,4,5,4,13,4,0,0,6,0,4,34,0 -49,5,3,0,10,2,2,1,3,1,0,5,17,5,25,1 -50,9,8,0,13,6,6,10,3,3,0,2,11,1,21,0 -51,2,0,0,7,5,4,0,0,1,0,8,16,2,7,0 -52,10,0,0,15,2,3,1,1,1,1,4,2,2,21,0 -53,8,8,0,1,5,2,10,1,3,0,9,1,1,10,1 -54,6,1,0,6,6,0,1,2,4,1,0,4,2,38,0 -55,10,1,0,7,5,5,1,4,4,1,18,3,1,30,1 -56,3,8,0,5,4,1,6,2,3,0,2,1,0,14,0 -57,9,2,0,1,3,0,10,5,2,1,16,14,0,20,1 -58,3,1,0,6,6,1,2,0,0,0,2,18,5,16,0 -59,9,8,0,14,0,0,9,3,0,0,2,12,0,0,0 -60,8,8,0,6,0,2,2,0,4,1,9,1,5,4,1 -61,3,6,0,6,0,2,0,3,3,1,13,11,2,4,0 -62,2,8,0,12,5,5,7,4,1,0,6,11,3,31,0 -63,6,6,0,1,4,1,9,0,0,1,3,18,3,19,0 -64,2,5,0,1,6,5,14,4,0,0,2,13,2,1,0 -65,0,3,0,13,0,0,3,0,3,1,17,2,5,35,0 -66,9,8,0,12,3,2,7,4,0,1,11,4,0,19,0 -67,2,4,0,13,1,1,12,2,3,1,10,19,5,17,0 -68,10,0,0,13,1,0,10,4,2,0,13,2,4,29,0 -69,6,7,0,11,5,5,2,2,3,0,0,19,0,0,0 -70,0,8,0,1,0,6,11,4,0,1,13,15,2,27,0 -71,3,7,0,10,0,1,3,2,0,0,0,5,4,22,0 -72,8,1,0,6,4,2,12,0,3,1,10,8,1,39,1 -73,10,5,0,13,2,3,5,0,1,0,10,11,2,21,0 -74,1,6,0,15,6,6,11,4,0,1,6,18,2,30,0 -75,9,1,0,15,6,0,2,0,3,0,7,16,2,0,1 -76,10,1,0,5,6,6,2,2,1,1,18,10,3,14,1 -77,1,4,0,4,6,2,3,2,0,0,6,2,0,12,0 -78,5,8,0,7,6,0,11,1,0,1,12,18,5,0,0 -79,2,0,0,11,3,6,3,1,4,1,18,8,3,19,1 -80,8,5,0,3,3,4,10,1,2,1,17,2,0,4,0 -81,10,0,0,3,5,4,3,4,2,1,6,9,0,20,0 -82,8,0,0,5,6,5,7,1,0,1,18,18,4,13,1 -83,2,5,0,2,4,0,10,5,0,0,3,20,4,39,1 -84,0,2,0,11,5,4,1,1,1,0,8,17,2,38,0 -85,8,6,0,3,6,0,14,2,0,1,6,11,4,25,0 -86,0,4,0,2,0,3,8,4,0,1,8,2,2,27,0 -87,2,2,0,5,0,3,1,3,2,0,11,11,2,35,0 -88,2,1,0,1,6,0,4,1,0,1,18,10,4,21,1 -89,3,1,0,8,4,6,10,4,3,1,1,5,3,25,1 -90,6,6,0,9,0,5,14,3,0,0,16,6,1,20,0 -91,8,6,0,12,1,0,11,5,0,0,12,18,1,37,0 -92,10,0,0,4,1,6,1,4,2,0,8,11,1,19,0 -93,2,7,0,13,4,0,11,5,4,0,4,2,5,10,0 -94,6,4,0,8,0,0,11,3,0,0,6,11,1,17,0 
-95,9,0,0,0,5,3,2,0,2,1,8,2,2,19,0 -96,8,2,0,5,3,1,0,2,0,1,4,2,0,13,0 -97,1,6,0,10,0,4,7,4,0,0,0,11,3,21,0 -98,5,7,0,14,4,6,0,4,4,1,18,5,5,25,1 -99,8,1,0,15,6,4,11,2,0,0,6,10,4,18,1 -100,0,5,0,4,1,0,4,1,0,1,0,6,0,39,0 -101,2,4,0,1,0,3,5,0,4,1,2,6,4,30,0 -102,7,4,0,10,0,6,9,2,0,0,14,20,4,17,1 -103,6,1,0,8,1,1,5,2,0,0,4,11,5,40,0 -104,1,0,0,11,5,4,8,3,1,0,7,9,1,34,0 -105,2,2,0,5,2,0,8,4,0,0,2,6,4,33,0 -106,7,3,0,8,6,5,5,5,3,1,10,11,0,29,0 -107,3,1,0,9,4,1,7,2,1,0,4,15,2,19,1 -108,5,5,0,6,0,0,8,0,1,0,15,2,3,23,0 -109,9,6,0,9,6,5,10,5,0,0,13,12,4,40,0 -110,5,7,0,1,4,3,0,5,2,0,5,19,0,12,0 -111,4,6,0,4,1,2,12,5,2,0,14,14,5,8,1 -112,8,0,0,13,5,6,5,2,0,1,14,16,3,24,0 -113,5,3,0,7,0,5,4,0,4,0,2,6,3,24,0 -114,10,0,0,11,6,4,10,2,1,1,15,19,0,13,0 -115,7,5,0,9,4,3,3,2,3,0,13,2,1,11,0 -116,2,1,0,15,4,5,3,1,0,0,18,18,4,12,1 -117,5,2,0,10,1,1,5,5,1,0,13,11,3,38,0 -118,2,3,0,8,0,0,11,3,4,0,15,0,4,26,0 -119,2,7,0,5,4,6,10,3,1,1,12,15,0,5,0 -120,10,7,0,0,0,0,0,1,2,1,6,13,1,7,0 -121,10,5,0,8,1,1,0,0,1,1,7,7,4,20,1 -122,9,2,0,14,3,6,12,5,3,1,16,10,5,39,1 -123,10,7,0,3,2,0,1,0,1,1,2,4,1,29,0 -124,8,8,0,5,2,4,6,3,0,0,4,9,0,25,0 -125,4,8,0,3,2,2,10,4,0,0,18,5,4,1,1 -126,7,6,0,10,3,1,4,0,4,1,3,10,5,30,1 -127,2,6,0,1,1,2,8,1,4,1,13,2,3,4,0 -128,9,2,0,15,5,5,11,1,2,1,18,6,2,8,1 -129,3,3,0,0,0,0,7,1,3,1,3,13,0,40,0 -130,0,4,0,3,1,5,6,5,1,0,2,11,2,38,0 -131,5,8,0,8,2,4,3,3,3,1,5,14,2,10,0 -132,1,3,0,5,5,6,11,0,4,0,8,4,4,17,0 -133,10,6,0,5,5,1,6,2,3,0,2,20,4,32,0 -134,8,6,0,14,1,2,3,4,4,0,8,4,1,6,0 -135,6,7,0,3,0,4,6,4,0,0,2,11,2,24,0 -136,5,8,0,10,0,5,6,1,2,1,8,12,2,5,0 -137,3,6,0,15,3,2,6,0,2,0,8,9,1,27,0 -138,5,8,0,15,2,1,3,4,1,1,17,8,5,40,1 -139,1,0,0,1,1,3,10,3,1,1,6,7,0,33,0 -140,1,2,0,0,6,2,5,2,1,1,0,6,1,29,0 -141,10,6,0,14,0,4,10,1,3,1,17,6,0,10,0 -142,9,0,0,14,2,5,1,4,1,1,5,10,5,22,1 -143,3,7,0,11,6,3,6,4,0,1,16,16,3,5,0 -144,5,2,0,9,5,5,4,4,2,1,4,9,4,25,0 -145,0,6,0,3,0,3,6,0,1,1,4,19,0,3,0 -146,7,4,0,9,0,5,9,1,3,1,17,1,3,31,0 -147,10,4,0,13,0,4,4,0,4,0,8,2,5,28,0 
-148,8,1,0,11,3,0,4,4,0,1,1,2,5,16,0 -149,2,1,0,6,0,4,1,3,2,0,12,20,1,40,0 -150,10,8,0,5,1,3,9,0,3,0,16,11,2,10,0 -151,10,6,0,2,6,3,5,2,2,0,0,17,1,25,0 -152,0,4,0,4,5,1,8,3,3,0,13,2,1,22,0 -153,1,2,0,12,0,4,0,3,1,0,4,1,0,18,0 -154,2,6,0,11,0,4,0,3,4,1,2,11,5,20,0 -155,2,6,0,13,6,0,4,1,0,0,15,20,4,1,0 -156,6,0,0,12,0,3,9,2,0,0,2,2,3,13,0 -157,6,1,0,13,0,2,2,2,1,1,1,17,5,14,1 -158,2,2,0,11,1,3,8,0,0,1,6,6,3,29,0 -159,0,7,0,9,0,6,9,4,3,0,2,6,0,6,0 -160,0,4,0,7,5,1,0,3,4,0,2,6,2,24,0 -161,3,8,0,3,1,5,5,3,4,0,17,8,0,0,0 -162,3,8,0,1,6,5,14,3,1,1,2,20,3,13,0 -163,10,0,0,3,5,5,2,0,0,0,4,3,5,24,0 -164,10,8,0,13,5,5,5,4,0,0,13,13,1,13,0 -165,0,1,0,15,1,3,9,1,2,1,12,11,0,19,0 -166,9,2,0,7,1,4,7,5,0,1,18,19,5,10,1 -167,1,4,0,10,0,0,14,1,0,1,0,2,4,13,0 -168,4,3,0,5,1,0,0,1,0,1,12,16,4,40,0 -169,6,1,0,11,0,3,11,5,1,0,9,10,4,33,1 -170,0,8,0,8,0,0,0,1,0,0,13,4,0,27,0 -171,7,4,0,4,4,6,3,5,3,1,13,14,0,9,0 -172,2,5,0,8,6,1,11,5,3,1,12,5,3,6,1 -173,8,2,0,12,4,6,7,2,1,0,2,11,1,24,0 -174,4,6,0,7,1,0,2,0,3,1,11,7,3,7,1 -175,10,6,0,2,6,3,14,3,0,1,11,6,5,32,0 -176,3,8,0,13,5,5,10,3,1,1,2,6,0,27,0 -177,9,7,0,7,5,1,7,5,4,0,12,18,3,9,0 -178,3,8,0,2,3,0,0,5,3,1,10,4,4,16,0 -179,0,3,0,11,4,5,6,4,0,0,10,11,2,7,0 -180,1,5,0,7,5,4,3,4,3,1,8,19,0,17,0 -181,8,8,0,11,3,4,10,5,4,0,1,18,4,8,1 -182,6,8,0,10,0,6,8,1,2,0,6,16,3,19,0 -183,6,7,0,0,2,4,9,0,4,1,13,2,0,7,0 -184,1,1,0,7,2,0,12,4,0,1,4,17,0,4,0 -185,6,0,0,6,4,3,11,4,1,1,13,2,4,29,0 -186,8,1,0,9,0,6,13,4,0,1,7,7,3,16,1 -187,9,5,0,1,6,0,13,1,1,1,13,0,3,26,0 -188,5,8,0,6,0,4,3,0,3,1,16,10,2,2,1 -189,6,2,0,6,6,0,1,4,4,0,12,19,3,16,0 -190,10,5,0,13,6,0,11,5,1,0,14,3,5,16,1 -191,0,8,0,5,1,0,10,0,2,1,2,9,1,20,0 -192,8,7,0,8,0,4,0,3,1,0,15,3,0,32,0 -193,4,0,0,1,1,2,8,1,2,1,0,0,1,13,0 -194,9,2,0,0,6,5,13,1,0,1,4,16,1,37,0 -195,1,3,0,14,0,2,6,1,3,0,0,16,0,26,0 -196,10,1,0,0,3,2,4,4,3,0,10,6,2,29,0 -197,7,6,0,11,0,5,14,5,2,0,13,16,0,3,0 -198,0,1,0,5,1,6,1,4,1,1,2,0,2,21,0 -199,2,8,0,2,0,5,9,0,2,0,17,6,5,7,0 -200,0,7,0,7,6,2,4,3,2,0,13,6,1,34,0 
-201,0,3,0,4,2,4,12,0,0,0,13,2,0,33,0 -202,4,4,0,7,0,5,2,2,0,0,17,20,1,31,0 -203,7,0,0,8,0,1,5,3,2,1,2,5,3,11,0 -204,2,8,0,7,5,1,1,0,4,0,4,4,4,10,0 -205,1,7,0,13,0,4,3,3,2,1,2,4,5,6,0 -206,3,4,0,8,4,3,2,1,0,1,13,2,4,14,0 -207,3,6,0,9,0,5,12,3,3,0,11,18,1,36,0 -208,4,2,0,9,3,6,12,2,0,0,17,9,2,38,0 -209,1,6,0,12,0,4,14,4,0,0,0,9,2,24,0 -210,4,4,0,13,0,4,9,0,2,0,13,9,1,3,0 -211,9,4,0,3,4,3,8,5,0,0,5,15,3,16,1 -212,3,4,0,6,1,4,2,3,0,1,4,11,2,13,0 -213,6,2,0,13,1,2,0,3,0,0,7,8,5,18,0 -214,0,6,0,15,2,4,6,5,1,0,0,0,5,37,0 -215,8,1,0,13,5,2,14,5,3,1,3,11,3,39,1 -216,10,2,0,10,5,2,3,5,2,0,3,4,3,24,1 -217,9,8,0,0,1,4,3,4,3,1,4,15,2,14,0 -218,0,4,0,11,1,2,0,1,3,0,8,8,1,19,0 -219,1,3,0,13,0,1,6,5,3,1,2,6,4,20,0 -220,4,8,0,9,1,6,10,2,2,0,13,15,2,32,0 -221,3,0,0,7,5,6,13,0,3,1,8,1,1,5,0 -222,0,5,0,12,3,0,8,2,0,1,2,11,3,31,0 -223,0,7,0,6,3,6,5,3,3,0,13,1,3,33,0 -224,10,2,0,8,2,2,0,0,4,1,5,10,0,37,1 -225,5,1,0,12,3,1,12,1,3,1,18,10,3,24,1 -226,7,2,0,9,0,6,14,5,1,0,12,11,4,35,0 -227,2,3,0,2,5,5,2,4,4,1,2,2,3,10,0 -228,9,4,0,7,6,3,5,2,4,1,9,0,5,19,0 -229,2,5,0,6,1,5,7,4,0,1,8,16,3,8,0 -230,0,0,0,7,5,1,11,3,4,1,2,2,0,28,0 -231,2,8,0,11,5,5,1,1,3,0,11,14,5,18,0 -232,6,3,0,6,1,1,9,4,2,1,2,13,4,37,0 -233,3,2,0,10,5,3,3,0,0,1,13,6,2,8,0 -234,2,3,0,6,3,4,8,4,2,0,13,2,0,26,0 -235,6,8,0,8,1,5,2,4,2,0,13,12,0,11,0 -236,9,1,0,13,6,0,9,3,2,0,0,4,4,23,0 -237,1,0,0,2,6,0,3,3,4,1,14,15,4,2,0 -238,8,2,0,7,2,3,0,5,2,1,3,0,2,21,1 -239,9,5,0,12,1,3,10,2,0,0,13,13,1,25,0 -240,10,3,0,10,6,6,9,4,0,0,8,6,4,37,0 -241,5,2,0,4,0,0,8,1,1,0,15,0,0,6,0 -242,6,1,0,9,6,5,4,2,0,0,13,2,0,6,0 -243,3,1,0,11,2,1,0,4,1,0,8,4,5,18,0 -244,3,2,0,1,3,6,8,4,1,1,6,18,5,17,0 -245,8,6,0,8,6,1,2,0,2,0,9,5,4,10,1 -246,0,5,0,11,3,3,14,4,2,1,8,14,5,39,0 -247,3,1,0,6,6,4,1,3,0,1,15,15,0,18,0 -248,8,8,0,15,1,3,11,3,4,1,18,7,4,31,1 -249,0,7,0,5,5,4,5,4,1,1,2,3,0,21,0 -250,1,4,0,1,3,5,5,2,0,1,7,16,0,22,0 -251,10,8,0,4,6,5,5,2,0,0,13,4,3,6,0 -252,6,3,0,6,6,5,13,4,1,0,13,11,1,10,0 -253,6,7,0,0,5,0,13,2,0,0,7,6,3,32,0 
-254,7,0,0,8,4,0,9,3,0,0,7,10,4,0,1 -255,6,6,0,10,1,3,8,0,0,0,4,11,0,20,0 -256,5,7,0,7,5,5,14,1,2,0,8,6,4,33,0 -257,1,1,0,0,4,4,8,3,4,0,17,6,5,31,0 -258,1,5,0,15,0,4,11,1,0,0,13,15,3,40,0 -259,8,8,0,7,0,2,11,0,0,1,3,5,1,35,1 -260,8,7,0,5,2,0,4,4,1,1,2,11,4,37,0 -261,0,6,0,5,2,4,6,3,0,0,12,2,2,35,0 -262,10,0,0,13,6,2,1,1,1,0,12,9,0,36,0 -263,2,7,0,13,0,2,13,4,4,1,0,18,1,0,0 -264,8,0,0,0,3,1,10,5,4,1,7,14,3,26,1 -265,3,0,0,0,0,6,8,3,4,1,4,2,1,25,0 -266,1,0,0,7,6,4,11,2,1,0,15,16,0,41,0 -267,1,5,0,15,4,4,0,2,2,1,2,0,1,25,0 -268,9,8,0,6,5,3,7,5,0,0,2,2,0,0,0 -269,8,2,0,10,5,4,1,2,3,0,3,6,2,0,1 -270,3,1,0,7,1,5,0,1,2,0,13,11,3,19,0 -271,3,2,0,10,2,1,10,5,0,0,9,4,2,17,1 -272,1,2,0,6,6,4,6,4,4,1,8,7,5,18,0 -273,3,2,0,3,0,3,0,5,3,1,11,18,2,18,0 -274,0,8,0,7,1,5,5,3,3,1,4,6,2,5,0 -275,5,8,0,6,1,1,10,2,2,0,0,14,1,7,0 -276,7,5,0,6,5,5,3,4,3,0,17,19,4,31,0 -277,2,0,0,14,6,3,8,2,0,1,2,11,4,37,0 -278,3,6,0,6,1,0,4,2,4,0,10,3,3,39,0 -279,0,2,0,12,2,5,7,4,1,1,8,9,0,10,0 -280,2,7,0,1,6,1,7,1,3,0,2,11,3,0,0 -281,3,7,0,12,0,3,6,5,0,0,4,16,0,12,0 -282,1,0,0,1,6,3,14,5,4,0,6,11,2,35,0 -283,4,6,0,9,1,2,4,5,2,1,18,7,2,35,1 -284,2,8,0,6,3,5,9,1,3,0,3,19,5,32,1 -285,2,7,0,3,5,1,6,3,2,0,13,6,3,16,0 -286,9,3,0,13,4,2,14,1,2,1,13,3,0,7,0 -287,0,7,0,7,5,0,8,0,3,1,13,11,5,1,0 -288,6,6,0,15,5,5,11,0,1,1,1,5,3,36,1 -289,6,2,0,14,0,3,11,3,4,0,4,18,0,31,0 -290,1,2,0,4,4,4,9,4,3,1,12,13,3,28,0 -291,7,7,0,3,6,1,3,3,0,1,11,19,0,41,1 -292,3,3,0,2,0,4,6,4,1,1,2,6,0,39,0 -293,4,4,0,11,1,3,0,4,1,0,9,20,2,9,0 -294,6,6,0,5,2,1,0,0,1,1,6,6,4,19,0 -295,0,5,0,1,2,6,6,1,2,1,4,11,0,31,0 -296,0,8,0,5,6,0,9,3,3,1,13,18,2,16,0 -297,2,2,0,2,0,0,6,2,2,0,13,0,0,3,0 -298,4,0,0,7,2,6,8,4,0,1,8,4,2,2,0 -299,6,8,0,8,6,0,12,2,4,1,8,20,0,24,0 -300,6,3,0,0,2,3,8,1,0,1,2,4,0,32,0 -301,7,0,0,12,4,1,12,0,2,0,1,14,5,22,1 -302,8,8,0,15,1,6,1,0,1,1,0,4,0,29,0 -303,8,1,0,7,3,6,8,0,1,1,17,7,4,37,1 -304,4,4,0,3,0,4,5,3,4,0,10,2,0,3,0 -305,2,3,0,0,0,6,12,2,0,0,11,0,5,40,0 -306,10,6,0,0,0,0,12,1,1,0,6,12,5,41,0 
-307,2,8,0,9,0,1,5,2,0,0,0,11,2,35,0 -308,8,1,0,15,0,3,5,3,2,1,16,14,5,1,1 -309,8,6,0,2,1,0,9,2,3,0,10,13,1,30,0 -310,2,7,0,0,1,6,5,3,3,0,8,3,0,19,0 -311,1,4,0,2,6,0,13,4,1,1,10,0,1,0,0 -312,1,8,0,13,6,5,9,3,2,0,2,11,3,35,0 -313,3,2,0,9,4,5,12,4,0,1,8,9,3,2,0 -314,5,1,0,13,5,3,2,1,0,1,2,12,4,4,0 -315,9,3,0,6,4,4,14,0,2,0,14,13,3,8,0 -316,9,3,0,14,4,0,1,0,2,0,9,5,1,39,1 -317,7,0,0,7,1,1,6,5,0,1,7,7,2,26,1 -318,7,5,0,7,2,0,11,5,4,1,18,14,5,30,1 -319,7,7,0,15,3,3,1,1,0,1,18,17,0,6,1 -320,2,2,0,3,0,6,5,1,0,1,17,15,0,3,0 -321,8,2,0,3,0,6,5,4,0,0,16,14,5,32,1 -322,2,0,0,11,6,5,10,4,0,1,4,15,0,25,0 -323,3,7,0,7,0,3,1,2,1,0,13,1,1,29,0 -324,7,2,0,5,5,0,0,0,4,0,18,18,0,5,1 -325,6,2,0,5,6,5,0,1,4,1,2,11,2,10,0 -326,1,7,0,2,5,5,5,3,3,0,15,3,1,23,0 -327,9,6,0,13,6,2,5,1,3,0,0,15,5,23,0 -328,3,1,0,3,5,5,1,1,0,0,13,13,0,21,0 -329,8,4,0,13,2,0,12,4,3,0,16,10,1,29,1 -330,10,0,0,3,1,5,2,2,0,0,0,7,3,11,0 -331,9,7,0,5,6,0,13,4,3,1,2,9,1,6,0 -332,0,8,0,3,5,0,0,3,2,0,12,10,2,38,0 -333,1,3,0,8,0,3,6,2,1,0,2,11,4,39,0 -334,3,7,0,2,0,1,1,1,2,0,2,3,1,34,0 -335,1,3,0,5,6,0,8,5,3,1,2,13,0,2,0 -336,7,7,0,8,2,4,3,0,2,1,8,9,2,40,0 -337,7,0,0,8,3,2,12,1,0,1,11,17,1,27,1 -338,2,5,0,7,0,4,2,5,4,1,8,16,2,30,0 -339,10,5,0,2,3,2,0,2,0,1,11,8,1,17,1 -340,0,4,0,3,5,0,14,3,2,1,16,9,0,3,0 -341,10,0,0,6,3,5,3,2,4,0,4,4,3,41,0 -342,2,7,0,2,0,4,12,3,1,1,8,4,0,14,0 -343,6,6,0,6,6,3,6,2,4,1,9,0,1,7,0 -344,9,4,0,13,0,3,6,4,3,1,2,12,2,27,0 -345,5,3,0,1,1,3,9,1,3,0,6,18,0,9,0 -346,4,7,0,3,0,0,9,5,0,0,13,0,0,18,0 -347,9,4,0,8,1,6,9,3,1,1,2,20,4,25,0 -348,5,2,0,9,0,0,8,1,0,0,2,9,5,27,0 -349,5,7,0,1,0,6,5,5,0,0,2,14,5,3,0 -350,10,2,0,1,5,6,11,1,3,1,8,15,1,18,0 -351,2,2,0,11,0,4,10,5,2,1,7,19,5,6,1 -352,2,5,0,13,0,5,4,3,4,0,2,13,1,31,0 -353,6,3,0,9,0,4,5,3,1,1,2,11,4,28,0 -354,1,0,0,13,1,4,0,4,0,0,17,19,3,25,0 -355,2,2,0,4,5,1,14,3,4,1,8,11,1,32,0 -356,2,3,0,5,2,1,9,4,2,0,5,5,4,30,1 -357,6,6,0,5,1,3,9,0,0,0,4,18,0,41,0 -358,10,4,0,9,0,0,7,1,2,1,13,0,1,30,0 -359,3,2,0,1,2,0,14,1,0,1,13,11,1,38,0 
-360,3,4,0,8,0,4,14,1,0,1,0,2,1,9,0 -361,0,1,0,4,1,2,1,2,3,1,5,6,4,29,0 -362,1,4,0,5,3,1,7,4,1,0,16,0,0,6,0 -363,10,6,0,10,0,0,4,1,0,1,2,16,3,14,0 -364,3,4,0,11,4,6,8,2,3,1,14,17,4,0,1 -365,5,1,0,9,0,5,7,3,0,1,13,11,4,39,0 -366,6,2,0,7,0,0,13,0,3,0,2,16,1,24,0 -367,1,4,0,5,3,0,5,1,3,0,11,2,4,4,0 -368,8,0,0,6,4,6,5,0,0,0,9,8,2,32,1 -369,1,6,0,11,0,2,11,5,4,1,13,11,4,19,0 -370,0,7,0,0,6,5,0,2,2,0,13,9,0,36,0 -371,3,0,0,3,4,4,4,1,0,1,18,18,1,4,0 -372,5,4,0,12,2,0,14,4,2,0,8,15,2,3,0 -373,5,7,0,11,3,0,5,2,1,1,18,19,1,20,1 -374,8,8,0,13,6,4,10,3,1,0,16,11,2,4,0 -375,8,1,0,0,3,1,4,4,1,1,6,2,4,38,1 -376,1,7,0,10,1,4,8,3,0,1,18,5,5,28,1 -377,0,0,0,3,2,3,0,5,4,0,2,20,4,39,0 -378,8,2,0,15,4,4,6,5,3,1,15,7,5,21,1 -379,9,7,0,7,1,0,0,0,4,1,3,10,4,14,1 -380,0,2,0,1,1,6,6,0,0,1,2,12,1,8,0 -381,7,2,0,15,1,2,8,3,0,0,18,6,0,21,1 -382,2,4,0,6,0,1,9,4,2,0,13,9,0,13,0 -383,2,8,0,14,5,4,1,1,0,0,13,9,2,0,0 -384,5,6,0,7,0,5,8,1,0,1,11,15,1,21,0 -385,10,8,0,10,6,5,4,1,1,1,2,2,3,23,0 -386,8,4,0,9,3,1,14,1,0,0,5,8,3,21,1 -387,6,6,0,1,0,4,5,1,2,0,4,11,5,12,0 -388,10,6,0,2,1,1,6,3,1,0,10,11,5,13,0 -389,8,6,0,13,3,4,14,5,0,1,4,20,1,1,0 -390,2,6,0,10,2,5,9,0,0,0,15,2,2,14,0 -391,10,7,0,14,5,2,12,0,1,1,9,8,1,20,1 -392,6,5,0,14,0,5,1,5,3,1,16,3,0,19,1 -393,5,3,0,13,0,4,11,4,1,1,13,0,5,29,0 -394,1,2,0,1,6,6,14,1,1,0,3,5,5,20,1 -395,0,8,0,1,1,3,5,0,0,0,13,9,0,32,0 -396,9,4,0,14,1,1,4,0,1,1,18,1,4,8,1 -397,3,4,0,1,0,5,9,0,3,1,5,11,0,2,0 -398,10,8,0,10,2,5,7,5,0,1,14,14,5,28,1 -399,7,5,0,11,3,4,7,2,0,0,13,11,0,36,0 -400,0,6,0,2,4,0,0,2,0,1,4,4,4,26,0 -401,3,3,0,15,5,5,3,1,3,0,6,11,1,18,0 -402,1,3,0,7,5,3,2,1,3,0,0,18,0,14,0 -403,9,3,0,10,5,4,9,4,3,1,13,11,3,1,0 -404,3,8,0,3,0,6,3,0,3,1,16,18,2,13,0 -405,6,2,0,6,0,2,14,0,2,1,7,0,2,6,0 -406,0,3,0,6,5,3,6,2,3,0,6,11,4,14,0 -407,0,5,0,8,0,3,5,0,3,0,4,2,1,27,0 -408,4,4,0,1,0,3,5,0,3,1,13,2,0,22,0 -409,0,6,0,7,5,0,7,3,0,1,8,11,1,32,0 -410,6,7,0,7,0,5,5,5,3,1,13,11,4,35,0 -411,2,2,0,10,1,3,7,0,0,0,16,1,0,22,1 -412,1,0,0,11,3,5,3,0,0,1,2,9,0,35,0 
-413,9,0,0,5,2,0,13,0,4,0,7,6,5,2,1 -414,9,8,0,9,3,1,5,0,0,0,17,18,2,7,0 -415,7,0,0,3,1,5,4,1,3,0,0,15,4,11,0 -416,2,8,0,13,0,0,10,3,4,0,17,0,5,10,0 -417,7,8,0,1,6,0,11,1,4,1,0,6,4,20,1 -418,9,7,0,13,3,3,5,4,2,1,17,4,0,23,0 -419,2,4,0,15,4,4,6,0,2,1,2,2,3,13,0 -420,2,7,0,12,4,4,9,5,3,1,13,20,0,30,0 -421,4,7,0,0,6,3,0,1,3,0,12,13,4,21,0 -422,2,1,0,11,0,0,14,0,2,0,13,14,1,41,0 -423,8,1,0,10,5,1,3,4,1,0,18,16,2,23,1 -424,1,1,0,15,3,4,11,4,2,1,4,15,0,9,0 -425,8,2,0,6,2,2,7,3,1,1,12,8,0,2,1 -426,7,7,0,11,2,5,9,3,2,0,13,13,0,1,0 -427,2,7,0,3,0,3,9,0,1,0,10,2,5,35,0 -428,8,5,0,9,5,0,8,1,0,0,3,12,5,30,0 -429,4,7,0,5,2,5,6,3,4,1,1,5,2,5,0 -430,0,3,0,1,4,5,5,3,2,0,2,6,3,3,0 -431,0,2,0,0,6,3,9,1,0,1,0,15,4,0,0 -432,4,2,0,12,3,5,1,1,4,0,13,15,4,13,0 -433,1,7,0,12,4,6,9,3,2,0,2,11,3,40,0 -434,4,1,0,5,2,1,7,3,0,0,7,20,4,36,0 -435,2,8,0,0,6,3,9,2,3,0,8,0,3,31,0 -436,5,5,0,10,2,0,3,1,0,1,2,6,0,20,0 -437,1,5,0,13,3,3,9,0,1,1,4,1,1,23,1 -438,0,4,0,3,3,0,2,2,0,0,2,2,5,12,0 -439,0,3,0,6,6,0,6,0,1,1,13,15,2,8,0 -440,1,7,0,1,6,4,1,1,0,0,6,2,4,26,0 -441,1,6,0,0,5,4,14,4,1,1,6,6,2,1,0 -442,10,1,0,8,3,2,4,4,3,1,9,2,1,26,0 -443,9,6,0,10,0,5,12,2,2,0,2,2,5,37,0 -444,10,2,0,2,5,0,7,3,0,1,10,19,0,25,0 -445,2,3,0,14,4,5,13,0,1,0,4,15,5,29,0 -446,1,6,0,13,2,4,9,4,2,0,4,3,0,8,0 -447,0,6,0,6,0,4,3,3,2,1,13,12,5,3,0 -448,1,8,0,6,0,4,5,3,0,0,17,11,0,26,0 -449,9,2,0,3,2,3,7,0,0,0,4,11,3,16,0 -450,0,1,0,1,6,2,12,0,4,1,2,2,3,13,0 -451,1,8,0,1,1,0,3,0,1,1,8,2,3,4,0 -452,2,5,0,3,5,4,9,3,3,1,11,0,5,35,0 -453,0,5,0,11,1,5,9,2,0,0,8,11,0,41,0 -454,3,7,0,12,1,6,12,0,0,1,3,17,5,21,1 -455,8,2,0,5,3,2,10,1,0,0,0,2,0,29,0 -456,0,6,0,4,1,4,9,3,3,1,17,2,5,40,0 -457,2,8,0,10,6,4,14,2,0,1,14,11,2,2,0 -458,10,8,0,7,4,0,11,0,3,1,9,3,4,31,1 -459,0,7,0,9,4,4,1,1,0,0,15,6,1,21,0 -460,4,5,0,15,1,4,2,5,0,1,5,16,4,0,1 -461,7,8,0,11,2,1,5,4,3,0,13,0,1,29,0 -462,3,0,0,13,4,4,13,0,1,0,0,1,0,25,0 -463,4,8,0,14,6,0,6,1,0,1,2,11,1,21,0 -464,5,5,0,9,5,5,2,1,1,0,2,14,2,25,0 -465,2,8,0,5,0,4,14,2,0,1,8,15,4,22,0 
-466,5,4,0,2,1,0,11,1,3,1,10,15,2,2,0 -467,6,0,0,4,4,5,10,1,4,0,1,16,5,1,1 -468,6,2,0,11,0,3,3,4,0,1,12,20,2,11,0 -469,10,2,0,5,6,0,5,1,0,0,10,2,2,30,0 -470,7,5,0,9,4,4,3,5,3,0,3,7,0,37,1 -471,0,2,0,9,1,4,4,4,0,0,2,4,0,33,0 -472,3,3,0,4,0,5,2,1,4,1,0,6,4,33,0 -473,1,1,0,8,2,2,11,3,3,1,13,11,5,25,0 -474,0,1,0,14,2,5,6,3,2,0,2,0,5,29,0 -475,2,6,0,11,0,2,0,3,4,1,17,2,0,22,0 -476,2,6,0,6,0,6,8,2,0,1,6,2,1,22,0 -477,3,0,0,9,0,1,9,4,2,0,4,10,0,10,0 -478,9,8,0,0,6,4,0,1,0,1,17,4,0,27,0 -479,3,6,0,5,4,2,12,0,3,1,16,8,2,26,1 -480,7,6,0,0,5,0,0,1,4,1,10,19,5,3,1 -481,0,7,0,0,5,4,11,4,0,1,13,9,5,3,0 -482,6,7,0,8,1,5,7,4,0,1,3,2,1,13,0 -483,0,7,0,9,5,5,6,3,2,1,2,9,3,40,0 -484,6,1,0,10,0,1,3,0,3,0,18,7,4,22,1 -485,1,2,0,5,2,3,8,3,4,1,2,2,1,35,0 -486,6,8,0,10,5,5,8,1,3,0,11,15,1,27,0 -487,2,1,0,7,2,4,0,3,1,1,15,11,0,23,0 -488,6,5,0,7,0,2,13,5,2,1,7,15,5,22,1 -489,1,1,0,3,1,0,4,1,0,0,12,6,1,17,0 -490,0,7,0,2,0,5,6,0,0,1,13,20,2,36,0 -491,2,0,0,1,6,1,0,2,4,1,17,0,0,41,0 -492,10,5,0,1,3,2,9,0,0,1,7,17,4,29,1 -493,10,4,0,9,1,3,3,4,0,1,4,7,4,21,1 -494,7,8,0,11,4,2,4,4,4,1,13,6,1,14,0 -495,5,8,0,1,5,5,2,3,3,0,8,11,1,28,0 -496,1,6,0,2,3,6,4,0,0,1,18,19,0,5,1 -497,7,8,0,9,6,1,4,2,4,0,9,7,0,11,1 -498,2,0,0,14,1,1,3,5,4,0,15,11,1,33,0 -499,5,7,0,8,0,5,10,2,0,1,2,2,0,27,0 -500,0,6,0,3,2,0,13,2,3,1,2,6,1,38,0 -501,9,5,0,10,5,6,3,0,3,1,18,5,1,33,1 -502,2,4,0,0,1,3,0,3,3,0,2,4,0,26,0 -503,9,8,0,5,2,6,5,3,4,1,3,7,3,19,1 -504,7,6,0,5,0,2,12,0,4,0,2,16,3,24,0 -505,6,7,0,14,4,6,6,0,4,0,1,7,4,32,1 -506,6,3,0,10,1,5,4,5,4,1,12,0,0,11,1 -507,2,2,0,12,6,5,9,0,1,0,13,15,1,30,0 -508,1,7,0,7,6,0,5,4,3,0,2,9,3,39,0 -509,5,2,0,6,1,0,12,3,2,1,0,13,5,34,0 -510,0,2,0,3,4,5,1,3,0,0,2,20,2,33,0 -511,2,8,0,14,1,4,14,1,1,0,0,2,5,27,0 -512,8,5,0,14,6,1,0,3,2,0,1,17,4,22,1 -513,4,0,0,2,4,4,12,3,0,1,18,2,5,38,1 -514,3,5,0,4,1,2,7,3,2,1,2,2,1,14,0 -515,3,3,0,3,6,4,9,0,3,0,15,9,3,5,0 -516,5,1,0,14,4,3,5,5,4,0,16,9,2,10,1 -517,4,7,0,11,4,4,4,5,4,0,5,4,0,12,1 -518,9,3,0,6,5,6,6,5,3,0,2,13,5,19,0 
-519,1,8,0,15,5,6,8,1,4,0,18,18,3,4,1 -520,6,0,0,1,6,5,9,0,4,1,6,15,4,18,0 -521,4,3,0,8,5,5,1,0,2,1,10,2,1,23,0 -522,10,7,0,1,5,6,2,0,0,1,12,17,1,38,0 -523,7,7,0,13,2,0,0,1,0,1,5,12,5,20,1 -524,2,5,0,9,3,3,11,2,0,0,10,20,5,14,0 -525,1,4,0,8,6,0,0,3,3,1,3,11,4,17,0 -526,4,1,0,12,6,6,3,5,1,1,9,19,5,16,1 -527,8,5,0,14,1,4,5,0,1,1,6,10,0,37,1 -528,9,7,0,5,1,4,6,2,2,1,8,2,1,22,0 -529,1,6,0,3,1,0,4,3,0,0,8,20,2,10,0 -530,5,0,0,11,2,4,4,3,2,1,13,2,3,5,0 -531,10,4,0,7,4,6,9,3,2,0,6,2,4,17,0 -532,4,1,0,3,0,4,8,0,0,1,13,2,2,34,0 -533,9,1,0,3,6,4,1,2,4,1,2,0,2,37,0 -534,7,6,0,15,1,5,13,2,2,0,10,9,1,9,0 -535,7,0,0,0,6,4,7,5,4,1,2,10,5,25,0 -536,5,2,0,9,1,5,8,4,0,0,8,3,5,24,0 -537,6,8,0,11,4,1,3,1,2,0,14,7,4,5,1 -538,1,0,0,11,2,4,9,3,2,0,8,8,3,8,0 -539,3,3,0,15,2,1,0,0,2,1,8,11,0,7,0 -540,3,3,0,12,1,3,6,3,2,0,16,13,0,25,0 -541,2,2,0,7,2,2,7,2,0,1,8,5,3,23,0 -542,8,7,0,3,1,4,1,4,1,0,8,15,1,22,0 -543,10,4,0,11,4,3,5,4,2,1,15,9,4,24,0 -544,5,3,0,1,4,0,1,0,1,1,16,20,1,39,0 -545,6,3,0,12,6,6,1,0,4,1,1,17,0,35,1 -546,6,8,0,4,1,3,5,4,3,0,2,19,3,37,0 -547,2,1,0,15,2,3,4,5,3,1,2,2,3,10,0 -548,7,1,0,13,2,0,2,4,0,1,2,18,1,27,0 -549,0,0,0,12,4,2,6,5,0,0,2,6,0,5,0 -550,8,0,0,15,4,0,7,2,0,0,2,12,5,4,0 -551,5,5,0,10,6,2,0,1,1,1,16,14,2,2,1 -552,1,5,0,7,0,3,10,4,4,0,12,15,4,6,0 -553,5,7,0,5,2,4,11,4,0,1,17,15,5,25,0 -554,1,6,0,11,0,5,0,3,3,0,17,18,1,4,0 -555,8,6,0,3,1,0,3,0,3,1,4,4,0,17,0 -556,4,2,0,11,6,0,2,2,4,1,16,4,3,7,1 -557,7,1,0,6,4,4,8,2,2,1,17,14,5,36,0 -558,4,1,0,12,0,5,7,2,4,1,10,13,5,33,0 -559,2,7,0,5,2,4,6,3,1,1,8,2,0,34,0 -560,0,7,0,6,6,5,0,1,3,0,13,2,1,37,0 -561,0,6,0,5,6,0,10,2,0,1,2,2,0,40,0 -562,9,7,0,3,1,6,6,0,3,0,8,2,2,37,0 -563,9,1,0,2,1,1,10,3,4,0,6,10,4,16,1 -564,0,2,0,13,3,5,7,3,1,0,8,11,4,19,0 -565,0,4,0,13,6,0,0,1,4,0,13,2,4,24,0 -566,4,3,0,3,5,6,0,4,1,1,8,0,0,11,0 -567,10,3,0,10,2,1,11,5,3,0,5,20,4,1,1 -568,2,3,0,1,6,6,9,3,4,0,13,18,1,14,0 -569,9,7,0,15,0,0,8,0,3,1,2,15,3,30,0 -570,1,6,0,11,3,5,9,4,1,1,8,15,0,23,0 -571,6,6,0,4,0,3,12,0,1,1,9,12,5,37,1 
-572,3,2,0,14,1,4,6,5,0,1,18,19,1,13,1 -573,4,6,0,7,1,6,7,1,0,0,11,20,1,0,0 -574,3,4,0,5,2,2,14,1,4,1,10,9,2,31,1 -575,4,1,0,13,1,0,3,3,1,1,17,11,2,32,0 -576,4,1,0,3,5,5,9,3,4,0,9,0,2,6,0 -577,0,1,0,3,6,5,9,0,0,1,2,11,2,6,0 -578,2,6,0,3,1,0,7,0,1,1,15,17,0,5,1 -579,0,6,0,0,4,5,5,3,2,0,13,6,3,2,0 -580,5,1,0,9,1,6,5,4,3,0,2,9,3,8,0 -581,3,4,0,13,4,5,13,0,4,1,18,19,5,1,1 -582,1,4,0,6,5,0,6,3,0,1,7,2,4,25,0 -583,0,3,0,0,3,1,8,4,2,1,10,6,3,26,0 -584,8,8,0,1,3,1,10,0,4,0,5,8,1,25,1 -585,1,0,0,4,2,1,0,3,3,0,2,2,3,36,0 -586,0,7,0,10,1,2,14,0,0,1,13,2,2,9,0 -587,6,3,0,3,5,6,12,3,2,1,8,9,4,11,0 -588,8,8,0,6,0,6,9,3,4,1,6,2,0,8,0 -589,10,7,0,0,2,6,2,4,2,0,2,4,4,29,0 -590,8,8,0,3,2,5,12,3,3,1,15,20,0,1,0 -591,0,6,0,15,2,1,10,1,2,0,6,7,3,16,0 -592,8,0,0,3,6,6,13,1,0,1,13,18,4,0,0 -593,4,8,0,3,3,0,13,3,2,1,10,6,0,36,0 -594,8,7,0,12,5,5,9,0,1,0,13,12,5,7,0 -595,5,8,0,13,2,0,4,1,2,1,17,14,1,6,0 -596,1,5,0,13,4,1,12,5,1,0,7,7,5,16,1 -597,7,3,0,7,1,0,9,3,0,0,8,11,0,19,0 -598,10,7,0,10,0,3,4,4,1,0,18,5,0,37,1 -599,2,3,0,9,0,1,2,4,3,1,8,15,2,11,0 -600,0,4,0,14,6,0,14,1,3,0,4,2,4,11,0 -601,0,5,0,6,6,3,0,1,2,1,6,11,1,6,0 -602,0,6,0,2,5,5,9,1,1,1,2,2,1,39,0 -603,0,4,0,15,0,6,7,1,2,0,17,13,2,13,0 -604,1,0,0,13,1,4,10,3,1,1,7,10,0,12,1 -605,0,3,0,13,3,0,8,1,3,1,15,2,1,25,0 -606,9,5,0,2,2,2,0,0,3,1,18,3,0,31,1 -607,6,4,0,12,1,4,9,5,0,1,18,11,2,28,1 -608,3,8,0,1,0,0,12,3,2,1,2,4,1,7,0 -609,6,4,0,4,6,0,8,4,4,1,13,18,5,39,0 -610,10,8,0,5,2,5,6,2,0,1,14,0,0,30,0 -611,1,3,0,5,0,0,6,2,2,0,2,16,5,1,0 -612,8,1,0,14,3,5,6,0,1,1,18,13,2,23,1 -613,2,4,0,10,0,1,7,5,3,1,9,10,0,24,1 -614,9,7,0,14,1,6,9,3,2,0,7,13,0,3,0 -615,0,2,0,10,1,6,13,5,2,0,2,18,4,17,0 -616,4,0,0,1,3,0,6,2,2,1,4,15,0,29,0 -617,0,3,0,13,2,0,2,4,2,0,8,16,2,16,0 -618,5,1,0,10,4,3,4,1,1,1,18,9,5,27,1 -619,2,3,0,12,0,0,12,1,2,0,2,16,4,26,0 -620,2,2,0,3,0,5,8,4,1,1,10,8,0,26,0 -621,1,7,0,1,5,6,0,2,2,0,2,3,1,10,0 -622,2,3,0,0,4,5,12,3,4,0,8,13,4,27,0 -623,3,4,0,3,3,5,13,1,0,0,4,18,1,31,0 -624,5,4,0,13,3,2,4,4,1,0,10,2,0,10,0 
-625,9,6,0,12,5,5,1,5,2,0,3,19,5,36,1 -626,4,2,0,5,0,3,2,0,4,0,12,9,3,2,0 -627,1,1,0,8,3,0,8,5,4,1,2,13,0,39,0 -628,0,6,0,3,4,5,6,1,1,1,13,1,2,9,0 -629,0,1,0,9,3,5,10,1,0,0,2,9,5,12,0 -630,7,7,0,13,0,5,4,2,0,1,8,20,5,0,0 -631,7,3,0,4,3,6,2,5,4,1,1,6,5,36,1 -632,9,2,0,13,5,4,2,5,0,1,2,18,2,4,0 -633,0,8,0,7,2,0,5,0,2,0,14,2,3,3,0 -634,6,0,0,0,5,5,6,1,1,0,15,2,3,29,0 -635,9,1,0,10,5,0,2,0,1,1,5,1,4,18,1 -636,9,0,0,4,3,5,0,0,2,0,2,4,1,4,0 -637,8,6,0,7,0,0,8,3,2,1,17,20,4,4,0 -638,0,8,0,8,0,4,0,1,4,1,8,6,4,35,0 -639,3,0,0,2,0,4,9,0,2,1,17,2,4,27,0 -640,0,6,0,15,0,6,7,1,3,0,2,14,3,37,0 -641,5,0,0,11,5,2,7,4,1,0,5,8,2,23,1 -642,5,0,0,3,4,6,9,2,2,0,3,4,4,38,0 -643,0,7,0,0,6,6,11,3,3,1,14,2,2,28,0 -644,2,8,0,3,4,6,10,5,1,1,8,11,5,11,1 -645,0,4,0,11,0,6,12,2,2,1,16,20,0,41,0 -646,8,2,0,9,0,3,8,2,4,1,17,9,0,6,0 -647,9,3,0,10,3,0,13,5,3,1,14,6,1,35,1 -648,0,3,0,1,5,0,12,1,0,1,17,11,2,38,0 -649,3,4,0,3,0,5,11,4,4,1,17,10,2,10,0 -650,10,7,0,13,4,3,9,1,4,1,17,6,0,0,0 -651,7,7,0,12,2,0,10,1,2,1,18,14,2,14,1 -652,1,2,0,13,0,5,5,1,2,0,10,11,4,24,0 -653,10,0,0,11,0,5,9,1,4,1,17,3,0,19,0 -654,3,5,0,3,2,6,12,5,2,1,9,19,0,10,1 -655,10,5,0,4,5,5,14,5,1,1,14,5,3,10,1 -656,4,4,0,5,3,2,0,1,2,0,3,15,4,29,0 -657,9,6,0,4,2,5,2,0,4,1,10,6,2,24,0 -658,6,6,0,6,0,2,8,2,0,0,3,11,0,17,0 -659,4,6,0,2,5,2,7,5,2,0,11,2,5,7,0 -660,7,5,0,14,1,4,7,4,3,1,13,11,4,23,0 -661,0,0,0,2,0,1,3,4,0,1,3,2,0,26,0 -662,10,2,0,13,6,0,4,4,0,1,6,19,2,23,0 -663,0,5,0,3,2,0,6,3,1,1,8,2,2,32,0 -664,2,6,0,6,0,5,10,1,2,1,8,4,1,35,0 -665,0,0,0,8,0,1,3,4,3,1,18,19,1,7,1 -666,2,6,0,12,6,4,2,0,0,1,1,20,3,39,0 -667,9,7,0,6,2,0,3,3,1,1,2,20,4,29,0 -668,8,8,0,1,4,6,1,4,3,0,3,14,4,6,0 -669,1,2,0,0,3,4,13,0,3,1,2,2,0,8,0 -670,1,2,0,3,3,3,6,1,1,0,17,13,0,34,0 -671,1,8,0,6,4,3,6,4,4,1,18,19,3,1,1 -672,9,6,0,6,0,5,4,3,3,1,4,11,3,3,0 -673,0,7,0,13,4,3,4,5,2,0,13,5,3,24,0 -674,10,5,0,15,5,1,0,0,0,0,18,8,2,18,1 -675,10,0,0,14,1,5,3,5,1,0,15,20,0,33,0 -676,5,8,0,4,0,4,5,4,2,1,12,2,4,11,0 -677,6,3,0,6,5,6,8,3,2,0,8,6,2,17,0 
-678,1,3,0,1,5,1,11,4,0,0,13,7,2,21,0 -679,9,2,0,3,4,0,13,4,3,0,9,10,3,36,1 -680,0,5,0,7,1,5,4,1,3,0,15,2,0,10,0 -681,5,3,0,1,0,4,9,3,0,0,17,6,4,24,0 -682,9,2,0,7,6,4,9,1,3,0,4,9,0,21,0 -683,4,7,0,7,0,1,2,2,2,1,16,17,1,20,1 -684,7,8,0,7,6,1,11,2,0,1,2,8,0,23,0 -685,7,7,0,11,0,6,3,0,1,0,13,11,2,26,0 -686,10,8,0,11,6,5,1,2,1,0,13,9,5,12,0 -687,1,1,0,0,1,2,12,4,4,1,18,8,3,36,1 -688,8,8,0,8,4,1,6,5,3,1,18,12,3,12,1 -689,9,3,0,12,5,5,14,3,2,1,10,11,0,4,0 -690,6,6,0,3,4,4,0,4,2,1,17,9,1,31,0 -691,0,1,0,5,6,2,13,3,4,0,4,11,5,27,0 -692,2,4,0,3,5,5,0,1,2,0,15,6,1,3,0 -693,1,5,0,13,0,3,8,1,0,0,2,16,0,4,0 -694,1,3,0,0,4,1,12,1,3,0,3,10,5,4,1 -695,2,8,0,7,6,0,14,2,4,1,9,10,2,1,1 -696,4,4,0,4,3,4,6,0,1,1,0,2,5,7,0 -697,6,3,0,1,2,0,8,3,1,0,15,2,5,4,0 -698,8,5,0,4,3,4,7,0,1,1,15,8,1,41,1 -699,2,0,0,11,3,3,10,4,0,1,4,8,5,27,0 -700,3,8,0,13,5,5,0,3,0,0,10,20,5,27,0 -701,1,8,0,4,6,5,5,3,0,0,17,2,4,6,0 -702,0,7,0,15,0,4,11,0,0,0,12,15,4,16,0 -703,0,6,0,14,0,0,9,3,0,0,0,9,3,23,0 -704,0,5,0,14,3,1,11,3,1,1,11,19,3,5,1 -705,4,1,0,12,0,6,3,3,3,1,0,13,4,14,0 -706,8,3,0,12,0,6,7,0,0,1,13,18,5,4,0 -707,1,2,0,8,1,0,5,0,3,0,2,2,1,3,0 -708,3,7,0,0,0,5,7,1,1,1,4,11,2,23,0 -709,6,5,0,1,6,6,9,0,2,0,3,9,1,11,1 -710,0,8,0,11,3,6,12,5,0,1,6,4,3,36,0 -711,5,7,0,3,1,3,9,4,4,1,8,19,4,24,0 -712,8,8,0,3,6,4,11,5,4,0,9,14,3,30,1 -713,10,1,0,14,2,5,6,5,4,0,8,13,0,0,0 -714,10,1,0,0,4,2,3,1,2,1,18,13,0,18,1 -715,7,6,0,10,3,0,7,5,0,1,15,16,5,27,1 -716,10,6,0,15,0,1,13,0,2,1,18,10,0,27,1 -717,5,8,0,8,1,5,13,1,3,1,13,10,0,23,0 -718,8,0,0,0,0,5,12,4,4,1,5,5,5,34,1 -719,0,2,0,0,4,5,9,4,1,0,13,11,0,29,0 -720,2,4,0,13,5,0,3,3,1,1,17,15,0,9,0 -721,0,8,0,14,3,0,1,3,4,1,2,17,2,25,0 -722,8,2,0,14,6,2,2,5,0,1,18,16,5,36,1 -723,2,2,0,1,2,0,6,4,2,0,15,16,4,28,0 -724,1,6,0,11,4,4,5,4,1,0,15,0,5,39,0 -725,6,8,0,3,2,5,13,1,4,1,2,2,1,4,0 -726,2,1,0,5,1,5,5,2,2,0,13,15,4,34,0 -727,3,7,0,9,2,5,8,4,2,1,4,17,3,2,0 -728,6,0,0,3,4,3,8,4,4,1,11,5,2,16,1 -729,8,6,0,7,6,4,2,0,2,0,13,20,2,3,0 -730,2,1,0,3,2,6,6,1,3,1,12,12,0,40,0 
-731,9,3,0,4,0,6,3,5,0,1,16,5,0,8,1 -732,4,0,0,14,1,5,9,3,0,0,3,11,5,27,0 -733,2,2,0,10,1,3,5,1,2,1,6,7,3,36,0 -734,9,5,0,14,5,2,2,2,4,1,10,17,3,27,1 -735,5,7,0,10,2,4,10,3,2,0,15,11,1,38,0 -736,8,7,0,10,2,1,4,1,3,1,1,17,2,29,1 -737,6,0,0,7,4,1,3,5,1,1,18,17,2,31,1 -738,0,3,0,11,3,0,2,0,2,0,4,11,2,4,0 -739,0,6,0,0,2,1,14,3,0,0,15,13,3,21,0 -740,9,5,0,2,5,0,13,4,3,0,10,9,3,6,1 -741,6,6,0,15,6,5,6,3,1,0,17,18,2,25,0 -742,10,2,0,4,0,4,9,0,3,0,13,11,2,18,0 -743,9,7,0,10,1,0,12,1,0,0,14,8,0,21,1 -744,1,0,0,3,3,6,6,5,0,1,2,4,0,37,0 -745,0,2,0,12,3,0,1,3,4,0,13,2,3,22,0 -746,1,7,0,12,1,1,6,1,2,1,1,10,3,35,1 -747,10,7,0,1,5,1,12,0,3,1,15,18,4,38,1 -748,1,6,0,5,0,3,6,3,3,1,8,11,0,14,0 -749,0,2,0,8,0,3,1,0,3,0,13,16,4,27,0 -750,6,4,0,12,0,0,11,2,3,1,11,1,1,14,1 -751,2,2,0,6,6,0,12,5,1,1,1,10,1,39,1 -752,2,6,0,8,5,2,3,3,4,1,18,7,4,8,1 -753,3,6,0,3,2,3,1,2,0,0,10,16,5,36,0 -754,0,3,0,5,5,2,14,2,3,1,18,17,4,12,1 -755,9,1,0,6,3,1,4,5,4,0,18,19,5,14,1 -756,9,6,0,6,0,6,1,3,4,1,10,6,1,35,0 -757,6,2,0,0,5,5,14,2,2,1,18,10,3,6,1 -758,8,7,0,14,5,2,0,3,1,1,4,10,4,0,0 -759,0,2,0,1,2,4,7,3,1,0,15,9,5,8,0 -760,0,8,0,6,4,5,2,0,3,0,2,19,4,11,0 -761,1,4,0,4,4,2,1,1,1,0,2,11,4,6,0 -762,9,7,0,13,6,0,9,3,0,1,4,6,4,18,0 -763,4,3,0,2,5,1,1,5,2,0,15,1,5,2,0 -764,0,6,0,10,1,2,9,0,1,1,18,19,4,38,1 -765,4,2,0,15,1,1,10,5,4,1,18,7,5,23,1 -766,3,0,0,2,4,1,4,5,3,0,15,5,2,35,1 -767,1,4,0,2,0,4,2,2,2,1,9,8,3,41,1 -768,10,2,0,11,6,6,10,2,3,0,6,4,4,21,0 -769,3,1,0,15,6,2,7,3,1,1,1,1,0,19,1 -770,9,0,0,5,2,5,14,0,2,1,2,15,0,4,0 -771,6,4,0,6,1,3,5,3,2,1,13,15,1,0,0 -772,5,0,0,3,0,0,9,2,2,0,4,2,0,40,0 -773,2,5,0,0,3,5,10,1,2,1,8,18,1,32,0 -774,9,8,0,3,0,5,8,2,2,1,15,18,4,38,0 -775,4,6,0,11,6,0,12,4,2,0,8,11,4,39,0 -776,0,7,0,13,2,6,14,4,3,0,2,11,5,1,0 -777,0,2,0,5,0,3,3,1,0,0,6,4,0,5,0 -778,1,3,0,14,6,5,5,2,0,0,8,9,4,13,0 -779,6,5,0,3,0,5,5,1,3,0,2,2,0,18,0 -780,8,4,0,0,6,2,1,5,4,0,17,14,3,21,1 -781,1,3,0,1,0,6,13,0,3,1,2,2,5,24,0 -782,1,1,0,5,6,1,10,5,0,1,9,8,4,22,1 -783,0,4,0,2,1,3,8,5,1,1,12,2,5,24,0 
-784,2,5,0,10,1,0,11,2,3,0,3,6,0,1,0 -785,6,4,0,8,4,3,13,5,1,1,18,5,5,17,1 -786,7,8,0,15,0,0,8,0,2,0,10,19,0,16,1 -787,2,4,0,6,1,6,9,0,0,0,13,2,1,16,0 -788,4,3,0,6,4,0,10,4,1,0,5,17,3,22,1 -789,3,5,0,3,6,5,3,3,0,0,8,18,3,24,0 -790,10,4,0,4,6,3,2,4,3,0,16,5,4,41,1 -791,0,6,0,3,2,6,11,4,2,1,2,4,3,12,0 -792,2,1,0,1,3,5,6,1,1,1,16,17,3,1,1 -793,0,0,0,11,5,3,3,2,2,0,12,9,0,14,0 -794,2,3,0,9,0,5,2,3,0,0,2,14,1,22,0 -795,2,3,0,3,4,0,6,3,3,1,6,11,0,41,0 -796,2,7,0,15,0,6,2,4,0,1,13,16,5,39,0 -797,5,2,0,13,6,4,9,0,1,0,9,7,0,9,0 -798,8,1,0,9,4,0,9,1,4,1,13,2,2,24,0 -799,4,7,0,0,0,4,10,2,3,0,13,11,0,35,0 -800,7,3,0,15,0,0,0,3,2,1,2,0,5,26,0 -801,5,6,0,0,5,1,8,3,3,0,13,13,0,6,0 -802,0,1,0,6,6,0,8,0,0,1,6,10,5,35,0 -803,5,4,0,2,2,5,7,5,3,0,13,11,5,11,0 -804,6,7,0,4,2,2,3,5,4,1,18,10,0,25,1 -805,2,4,0,1,0,1,9,4,4,0,8,6,3,27,0 -806,3,6,0,13,2,3,3,1,4,1,13,12,0,18,0 -807,0,1,0,4,2,3,14,4,3,1,0,11,0,10,0 -808,1,7,0,9,1,6,1,3,2,0,4,6,0,26,0 -809,4,0,0,5,5,5,14,4,1,0,13,15,0,14,0 -810,2,4,0,12,4,1,4,2,3,0,11,7,0,20,1 -811,0,8,0,9,5,6,2,3,2,1,11,18,0,23,0 -812,3,5,0,13,3,3,7,4,0,0,2,3,5,38,0 -813,5,4,0,10,3,0,9,3,0,1,17,19,0,8,0 -814,6,8,0,14,1,1,7,4,3,1,11,3,1,22,1 -815,0,0,0,5,1,0,12,1,2,0,12,13,4,32,0 -816,3,8,0,4,3,4,2,0,0,1,16,16,1,13,1 -817,10,2,0,1,4,0,12,0,2,0,17,3,1,18,0 -818,1,0,0,15,6,1,9,0,0,1,17,2,5,22,0 -819,7,0,0,12,1,0,5,5,2,0,8,2,5,32,0 -820,3,4,0,13,6,4,7,4,3,1,13,8,4,20,0 -821,0,1,0,11,2,2,5,3,0,0,2,2,3,5,0 -822,4,0,0,10,2,0,1,2,0,0,9,18,3,25,0 -823,10,5,0,4,6,6,9,0,0,0,6,2,0,4,0 -824,4,5,0,10,4,4,6,2,1,0,12,4,2,14,0 -825,2,8,0,8,4,6,11,0,0,0,2,19,5,6,0 -826,6,8,0,10,0,5,11,2,4,0,10,9,0,23,0 -827,7,8,0,13,6,4,4,3,0,1,13,5,1,22,0 -828,9,5,0,5,5,5,9,4,0,0,10,6,5,18,0 -829,7,1,0,9,6,1,5,4,1,0,5,7,5,17,1 -830,4,3,0,0,6,2,11,3,3,1,2,9,2,28,0 -831,0,0,0,6,2,6,4,1,2,1,3,15,0,26,0 -832,4,5,0,4,4,0,8,3,0,0,0,11,0,12,0 -833,1,5,0,1,1,4,8,3,0,0,2,2,1,3,0 -834,9,5,0,10,2,1,4,3,4,1,5,1,3,10,1 -835,5,6,0,8,4,6,9,2,0,0,10,7,2,13,0 -836,7,0,0,12,5,0,14,1,1,1,8,4,3,29,0 
-837,1,6,0,0,6,0,8,2,3,0,11,2,4,20,0 -838,3,2,0,8,6,4,9,4,1,0,15,11,3,35,0 -839,10,1,0,2,5,2,10,2,0,1,14,20,3,10,1 -840,6,3,0,10,2,3,6,0,2,0,14,12,0,20,1 -841,1,6,0,3,0,6,5,4,1,0,8,20,2,0,0 -842,1,1,0,8,1,0,5,4,3,0,17,6,0,32,0 -843,9,3,0,15,6,4,6,2,1,0,4,9,1,38,0 -844,7,8,0,9,6,6,11,3,0,1,0,7,0,32,1 -845,9,7,0,0,0,3,8,0,2,0,17,11,4,21,0 -846,3,8,0,13,0,3,8,4,0,1,4,11,4,9,0 -847,9,4,0,14,6,4,6,0,0,0,17,3,1,33,0 -848,2,8,0,15,1,3,10,4,2,0,2,19,5,17,0 -849,9,8,0,10,2,3,7,4,0,0,2,2,3,37,0 -850,3,2,0,3,2,5,9,0,0,0,13,11,3,23,0 -851,2,6,0,8,6,6,6,1,0,1,13,20,0,9,0 -852,9,8,0,7,5,5,14,1,1,0,8,19,5,10,0 -853,3,6,0,3,4,0,3,3,2,1,14,15,1,27,0 -854,8,1,0,6,2,1,3,2,3,0,1,7,1,35,1 -855,5,6,0,7,5,3,9,3,4,1,13,4,3,35,0 -856,7,8,0,8,2,6,3,3,1,0,18,0,4,8,1 -857,8,7,0,14,4,0,9,0,1,0,9,5,4,36,1 -858,1,0,0,1,0,4,6,3,1,1,4,12,2,17,0 -859,1,7,0,12,6,5,4,2,3,1,10,2,2,37,0 -860,0,2,0,12,0,0,1,3,4,1,13,16,2,4,0 -861,6,8,0,11,4,2,9,3,0,0,13,0,0,3,0 -862,9,5,0,6,3,0,3,5,0,1,18,16,4,0,1 -863,8,5,0,4,5,3,4,3,2,1,14,8,1,18,1 -864,1,8,0,5,0,5,0,1,4,1,4,18,3,1,0 -865,2,7,0,6,2,6,1,4,4,1,13,13,3,18,0 -866,4,8,0,0,0,0,6,1,0,0,13,12,3,32,0 -867,8,2,0,5,1,0,4,4,2,1,1,12,5,41,1 -868,0,5,0,11,6,4,9,3,0,0,6,13,0,5,0 -869,10,8,0,1,5,1,11,4,4,0,13,19,4,28,0 -870,7,3,0,2,6,6,14,0,2,1,6,11,0,27,0 -871,2,0,0,3,1,2,6,5,1,0,15,3,5,35,0 -872,5,1,0,1,0,3,0,0,4,1,16,9,1,10,0 -873,3,6,0,9,4,5,6,2,3,1,14,10,1,7,1 -874,6,6,0,11,5,4,8,1,1,0,13,11,1,39,0 -875,10,0,0,14,0,5,8,2,3,1,13,2,1,11,0 -876,9,8,0,1,6,2,11,1,3,1,18,1,4,2,1 -877,6,6,0,12,0,1,14,3,1,0,2,13,4,4,0 -878,7,1,0,9,1,5,13,1,4,0,18,14,1,41,1 -879,6,5,0,8,1,6,10,5,2,0,8,15,3,3,0 -880,9,8,0,14,3,1,10,3,0,0,3,18,2,41,0 -881,4,7,0,5,4,1,9,5,2,0,11,17,5,18,1 -882,6,6,0,11,2,6,12,1,2,0,8,11,3,12,0 -883,10,5,0,13,3,1,8,5,3,1,18,1,2,36,1 -884,4,2,0,12,6,5,1,5,0,0,9,3,3,29,1 -885,3,7,0,11,0,5,5,2,2,0,2,15,0,0,0 -886,1,8,0,15,4,2,7,3,2,1,18,2,2,33,0 -887,2,0,0,5,1,5,6,0,3,1,15,14,0,40,0 -888,1,7,0,5,0,1,0,2,4,0,3,16,5,26,0 -889,0,6,0,7,2,4,14,5,0,0,8,2,5,14,0 
-890,2,7,0,0,3,6,2,3,4,0,13,12,0,14,0 -891,2,0,0,4,4,6,10,4,1,0,17,20,2,0,0 -892,0,0,0,3,1,5,2,5,0,0,13,11,0,3,0 -893,6,6,0,8,1,4,9,1,1,0,17,0,0,12,0 -894,3,1,0,6,0,3,0,5,1,0,10,2,3,13,0 -895,8,8,0,1,0,3,12,4,2,1,13,6,0,28,0 -896,2,5,0,6,6,6,11,0,0,0,9,16,5,12,1 -897,5,2,0,3,4,4,0,4,4,0,8,13,1,17,0 -898,2,6,0,2,2,4,9,2,4,1,13,11,1,0,0 -899,8,5,0,2,2,4,3,2,3,1,0,14,0,11,0 -900,4,8,0,8,0,2,6,2,2,1,17,16,5,21,0 -901,2,3,0,0,5,6,6,0,0,1,7,16,0,7,0 -902,3,7,0,9,1,2,13,5,0,1,18,10,4,31,1 -903,4,2,0,15,6,2,5,0,0,1,8,15,0,34,0 -904,7,5,0,12,3,5,11,1,4,1,9,3,5,21,1 -905,4,2,0,3,4,5,12,3,4,1,4,9,3,14,0 -906,9,5,0,6,0,3,9,3,3,1,13,11,3,3,0 -907,6,7,0,11,0,0,9,2,0,1,18,7,5,24,1 -908,1,0,0,3,5,2,3,3,4,0,2,5,1,26,0 -909,8,5,0,7,2,5,13,1,1,1,7,14,3,12,1 -910,1,7,0,12,6,2,13,3,4,0,15,11,2,26,0 -911,8,7,0,1,5,0,0,0,2,1,13,11,4,5,0 -912,5,2,0,5,3,1,13,4,4,1,3,4,5,13,1 -913,0,5,0,1,0,0,2,3,2,0,2,20,0,10,0 -914,2,0,0,7,6,5,8,0,4,1,7,15,0,16,0 -915,5,4,0,14,1,3,13,0,0,1,18,5,4,7,1 -916,0,2,0,0,2,2,1,4,1,1,17,0,5,27,0 -917,1,3,0,8,5,5,6,4,0,0,8,2,1,32,0 -918,1,5,0,2,4,6,7,4,3,0,10,13,1,23,0 -919,9,5,0,1,1,0,9,0,3,0,3,16,1,18,0 -920,4,6,0,6,5,5,10,4,0,1,2,11,1,20,0 -921,2,2,0,8,4,3,14,2,0,1,8,0,5,10,0 -922,4,6,0,3,0,4,7,3,2,1,2,1,2,32,0 -923,1,1,0,3,0,3,10,4,0,0,6,6,0,27,0 -924,8,3,0,13,2,5,7,1,2,1,0,4,2,10,0 -925,3,8,0,5,0,0,5,3,1,1,10,2,2,38,0 -926,4,8,0,6,4,5,0,0,0,1,4,11,1,7,0 -927,3,2,0,3,2,2,5,1,3,0,2,15,2,22,0 -928,2,6,0,1,6,0,9,1,0,1,2,10,4,22,0 -929,7,4,0,13,6,2,14,1,4,1,16,6,5,34,0 -930,3,4,0,5,5,1,8,3,1,0,13,11,0,3,0 -931,3,7,0,5,4,3,14,2,1,0,13,15,1,3,0 -932,0,3,0,6,2,6,0,0,4,1,8,18,0,6,0 -933,6,4,0,10,3,6,8,0,0,1,13,4,0,41,0 -934,3,4,0,7,2,6,1,3,3,0,4,16,5,27,0 -935,0,7,0,13,4,5,7,4,2,0,8,2,4,39,0 -936,3,1,0,8,1,0,0,4,0,1,2,2,0,6,0 -937,8,2,0,1,2,6,11,1,1,0,9,10,1,7,1 -938,3,8,0,1,4,4,14,0,3,1,13,20,4,5,0 -939,7,6,0,9,3,6,3,4,4,1,13,15,2,33,0 -940,4,5,0,6,5,2,10,0,1,0,17,7,2,29,1 -941,1,1,0,15,0,1,3,4,3,1,0,13,1,21,0 -942,0,3,0,10,0,4,7,3,2,0,2,0,3,19,0 
-943,1,6,0,12,6,4,9,0,3,0,8,9,2,6,0 -944,9,5,0,3,6,2,7,3,2,1,18,19,1,35,1 -945,1,3,0,12,3,0,0,3,0,1,18,16,2,6,1 -946,7,0,0,15,2,4,2,1,0,1,2,11,1,16,0 -947,2,1,0,11,6,0,12,4,1,0,0,11,3,14,0 -948,4,1,0,2,0,1,11,0,3,1,18,10,5,2,1 -949,2,7,0,5,5,2,3,3,0,1,4,0,2,37,0 -950,9,3,0,9,0,0,8,3,0,1,4,11,4,26,0 -951,1,2,0,9,6,4,7,3,1,1,2,19,3,11,0 -952,0,5,0,1,0,3,2,3,1,0,8,20,1,12,0 -953,9,2,0,2,2,5,0,3,1,1,4,20,1,19,0 -954,3,3,0,0,0,4,1,4,2,0,6,11,0,20,0 -955,7,1,0,11,1,1,13,0,3,1,16,17,5,33,1 -956,10,3,0,6,4,2,8,1,2,1,11,19,5,3,1 -957,2,3,0,9,6,3,9,3,1,1,18,7,0,3,1 -958,1,2,0,13,3,4,8,2,0,1,5,9,0,31,0 -959,3,8,0,2,1,0,7,3,2,0,8,20,1,34,0 -960,3,1,0,3,6,5,12,1,4,1,12,11,2,21,0 -961,7,2,0,12,4,0,9,3,2,0,0,2,0,24,0 -962,5,7,0,7,2,2,13,2,2,1,8,20,4,38,0 -963,0,8,0,2,6,5,6,3,3,0,10,15,2,6,0 -964,4,0,0,3,3,0,0,3,0,1,2,0,2,12,0 -965,2,8,0,6,6,6,9,2,1,1,8,13,4,6,0 -966,3,5,0,12,1,4,12,2,3,1,2,9,1,36,0 -967,10,2,0,8,3,1,13,3,3,0,2,16,0,31,0 -968,1,8,0,6,1,2,5,4,3,1,9,1,0,18,1 -969,9,7,0,4,1,3,11,4,0,1,18,1,2,0,1 -970,7,4,0,11,2,0,4,1,2,0,16,8,3,16,1 -971,7,8,0,4,4,5,13,1,4,0,8,13,2,29,0 -972,10,1,0,11,4,1,7,4,4,1,5,1,4,31,1 -973,6,7,0,5,2,2,5,4,0,1,0,16,2,24,0 -974,7,1,0,7,4,4,6,3,0,1,2,13,0,37,0 -975,10,6,0,8,3,1,4,1,1,1,7,5,3,31,1 -976,4,5,0,14,4,3,8,3,3,0,18,10,4,1,1 -977,0,8,0,13,6,6,8,2,3,1,0,7,2,23,0 -978,2,8,0,0,6,3,5,3,1,0,1,0,5,14,0 -979,1,7,0,10,5,4,5,0,0,0,8,18,2,38,0 -980,5,5,0,8,3,1,8,3,1,1,13,1,1,12,1 -981,2,0,0,6,0,0,5,2,2,1,13,1,0,2,0 -982,0,2,0,5,4,6,4,3,1,0,3,11,0,32,0 -983,10,4,0,9,6,5,2,0,1,1,17,2,1,37,0 -984,8,0,0,10,0,2,11,5,0,1,10,19,1,8,1 -985,4,5,0,12,4,2,10,5,3,1,9,10,2,36,1 -986,10,0,0,0,6,6,3,4,0,1,3,2,3,34,0 -987,1,7,0,15,6,0,10,4,3,0,8,3,0,18,0 -988,0,2,0,14,6,2,5,5,1,1,2,11,0,35,0 -989,10,2,0,6,6,1,0,1,0,1,13,4,5,4,0 -990,1,1,0,6,6,4,9,5,3,1,14,18,2,25,0 -991,7,2,0,4,4,1,2,1,0,1,18,13,3,27,1 -992,8,2,0,5,6,4,3,4,4,1,5,9,4,36,0 -993,7,3,0,0,6,6,1,1,4,1,13,11,2,11,0 -994,0,7,0,11,0,5,1,4,2,0,5,11,0,35,0 -995,9,6,0,0,5,3,8,1,4,0,13,1,2,20,0 
-996,1,3,0,2,3,3,9,3,1,0,13,13,4,32,0 -997,4,0,0,6,0,5,8,4,0,0,6,16,1,25,0 -998,0,6,0,5,0,3,1,1,1,1,10,2,1,32,0 -999,10,7,0,2,1,1,10,4,3,1,7,19,5,35,1 -1000,0,0,0,13,0,3,5,5,0,1,4,20,5,41,0 -1001,0,8,0,5,2,4,5,4,4,0,13,15,2,4,0 -1002,8,2,0,5,4,0,0,1,0,1,13,1,0,8,0 -1003,4,4,0,3,2,5,9,0,1,0,18,7,1,35,1 -1004,3,5,0,11,2,3,0,5,1,1,1,1,2,28,1 -1005,2,1,0,15,1,0,2,2,2,0,13,0,1,6,0 -1006,2,5,0,9,6,0,5,1,2,1,8,4,5,29,0 -1007,6,7,0,6,0,4,7,0,2,1,2,6,2,35,0 -1008,5,0,0,8,2,1,7,5,1,0,17,5,1,24,1 -1009,2,6,0,3,4,5,2,2,2,0,13,9,0,20,0 -1010,3,5,0,11,0,0,9,2,3,0,12,15,2,4,0 -1011,8,4,0,11,3,3,1,1,2,0,8,9,5,39,0 -1012,6,3,0,11,5,2,9,0,1,0,10,17,1,24,1 -1013,0,2,0,3,2,0,5,4,2,1,18,2,3,29,0 -1014,8,8,0,5,0,4,4,2,0,0,4,15,5,39,0 -1015,3,8,0,3,0,0,5,4,0,1,2,3,0,27,0 -1016,10,8,0,3,2,2,0,2,3,0,2,2,1,18,0 -1017,1,4,0,6,5,5,13,4,3,1,13,11,4,26,0 -1018,8,4,0,0,0,0,0,1,1,0,13,1,1,20,0 -1019,0,4,0,13,5,3,1,1,0,0,2,6,4,7,0 -1020,7,4,0,11,2,2,7,2,3,0,1,2,2,17,1 -1021,10,3,0,5,0,5,12,5,2,1,5,7,3,9,1 -1022,1,5,0,13,0,4,3,1,3,0,11,12,0,7,0 -1023,10,6,0,10,4,1,8,2,3,1,12,11,2,38,0 -1024,9,1,0,2,0,1,9,3,1,1,8,11,4,14,0 -1025,5,1,0,9,4,1,12,0,0,1,14,12,0,3,1 -1026,3,8,0,0,0,5,11,4,2,1,18,11,4,37,1 -1027,5,4,0,11,5,1,4,5,0,0,5,8,2,36,1 -1028,6,0,0,10,2,1,4,0,2,1,5,5,4,20,1 -1029,8,1,0,13,1,4,11,1,2,1,0,18,4,37,0 -1030,1,6,0,2,6,6,3,0,0,1,7,16,0,7,0 -1031,0,3,0,15,4,0,12,0,2,0,0,18,0,38,0 -1032,4,8,0,3,5,4,7,2,0,1,17,6,1,22,0 -1033,1,3,0,15,1,3,5,1,0,0,8,16,0,32,0 -1034,2,6,0,10,6,2,0,1,3,0,2,20,4,2,0 -1035,9,1,0,13,6,2,13,0,0,0,10,15,0,6,0 -1036,4,7,0,12,0,4,14,1,0,0,8,2,2,32,0 -1037,3,6,0,10,4,6,12,3,4,0,4,11,5,9,0 -1038,7,2,0,8,6,6,5,2,1,1,8,2,1,5,0 -1039,0,4,0,1,6,1,11,4,2,0,2,6,2,29,0 -1040,1,5,0,7,4,3,1,4,0,1,13,2,3,30,0 -1041,6,6,0,13,0,4,2,5,2,1,4,0,3,18,0 -1042,3,3,0,14,5,5,10,3,3,1,10,9,3,17,0 -1043,9,7,0,5,6,1,5,3,0,1,15,11,2,32,0 -1044,4,4,0,6,0,5,9,0,3,0,2,9,2,0,0 -1045,10,3,0,15,5,5,11,1,0,1,6,9,0,9,0 -1046,0,7,0,14,1,0,14,0,2,0,8,20,0,31,0 -1047,5,3,0,15,4,1,13,0,4,1,17,10,4,19,1 
-1048,5,6,0,6,1,2,12,3,0,1,18,9,0,24,1 -1049,7,5,0,1,5,6,13,4,0,0,15,7,4,18,1 -1050,1,6,0,8,4,4,9,4,1,0,2,12,4,41,0 -1051,3,5,0,2,6,0,4,2,1,1,13,14,0,39,0 -1052,8,4,0,9,4,0,2,0,1,1,6,20,4,33,0 -1053,7,3,0,8,2,1,8,2,4,1,0,0,4,20,0 -1054,9,2,0,10,3,5,9,5,0,1,5,10,2,35,1 -1055,9,7,0,1,0,0,2,1,1,0,11,18,0,40,0 -1056,3,3,0,7,3,3,8,0,3,1,2,2,5,4,0 -1057,5,7,0,0,1,6,9,2,2,0,4,2,0,30,0 -1058,0,1,0,12,4,2,5,3,1,1,9,5,4,4,1 -1059,10,5,0,1,1,0,5,1,0,1,3,2,3,24,0 -1060,9,5,0,7,2,5,5,0,1,1,7,15,5,40,0 -1061,6,7,0,6,3,1,0,0,1,0,18,17,1,32,1 -1062,4,7,0,13,0,2,9,5,2,0,12,11,4,0,0 -1063,10,5,0,12,3,1,1,4,2,0,15,11,0,39,0 -1064,9,6,0,15,2,6,14,0,4,0,13,2,5,4,0 -1065,2,8,0,3,5,6,7,4,4,1,6,11,1,33,0 -1066,9,0,0,11,4,0,11,4,0,1,14,19,5,22,1 -1067,9,7,0,13,1,4,0,5,3,0,9,8,5,33,1 -1068,10,1,0,5,5,5,3,1,1,1,17,9,1,14,0 -1069,2,6,0,5,6,5,12,2,0,1,10,9,0,14,0 -1070,1,5,0,1,6,0,3,2,2,0,2,16,1,4,0 -1071,10,1,0,15,4,5,8,1,2,1,3,11,5,35,0 -1072,2,0,0,0,0,0,14,1,0,0,15,2,5,3,0 -1073,1,0,0,8,1,0,4,1,1,0,17,12,4,41,0 -1074,0,1,0,5,0,1,13,2,2,0,13,18,4,41,0 -1075,9,2,0,6,5,0,14,3,4,1,18,7,2,34,1 -1076,0,7,0,8,1,6,1,4,0,1,2,6,4,27,0 -1077,4,4,0,6,6,3,11,0,2,1,5,7,2,25,1 -1078,0,3,0,12,0,0,11,3,2,0,13,6,2,35,0 -1079,2,8,0,1,6,4,7,3,0,1,12,20,4,34,0 -1080,2,0,0,2,6,5,0,1,1,1,13,11,1,18,0 -1081,10,6,0,0,6,6,3,4,2,1,14,17,3,4,1 -1082,9,5,0,3,0,5,8,5,0,0,4,14,3,16,0 -1083,8,6,0,3,0,2,8,4,0,1,17,0,5,2,0 -1084,0,8,0,11,1,6,13,2,3,0,6,15,2,32,0 -1085,9,7,0,0,0,0,10,5,0,0,2,9,3,7,0 -1086,8,6,0,4,3,2,6,2,2,0,12,2,0,22,0 -1087,7,1,0,4,3,3,7,3,1,1,17,7,3,8,0 -1088,0,6,0,10,3,4,8,4,2,0,12,2,0,37,0 -1089,3,4,0,5,2,3,3,1,4,1,4,5,1,11,0 -1090,1,7,0,7,5,3,6,0,0,0,6,11,3,19,0 -1091,8,3,0,13,2,4,8,0,4,1,15,19,1,6,0 -1092,10,3,0,15,5,3,1,2,0,1,16,2,1,29,0 -1093,7,4,0,7,4,1,10,0,3,0,18,12,3,31,1 -1094,9,7,0,3,0,3,11,3,2,1,2,11,1,7,0 -1095,10,6,0,10,5,6,11,0,1,1,3,15,5,10,1 -1096,1,5,0,5,6,3,10,2,1,1,13,15,0,33,0 -1097,4,5,0,0,2,3,11,1,2,0,8,2,0,27,0 -1098,3,4,0,7,3,3,4,4,0,1,13,11,0,35,0 
-1099,7,6,0,5,2,0,5,2,1,0,13,15,3,33,0 -1100,7,1,0,3,4,3,4,3,3,1,9,20,3,35,1 -1101,7,2,0,15,5,6,7,3,3,1,13,20,1,29,0 -1102,2,6,0,13,0,4,3,0,4,1,6,4,5,26,0 -1103,3,4,0,5,2,2,9,2,0,1,4,2,2,17,0 -1104,7,3,0,13,2,4,7,2,3,1,8,11,3,11,0 -1105,8,6,0,10,5,3,13,0,1,0,9,6,5,20,1 -1106,9,3,0,11,0,0,12,4,0,0,13,9,2,37,0 -1107,0,2,0,1,6,3,3,1,0,1,4,15,1,24,0 -1108,6,8,0,6,6,0,14,1,3,0,6,4,2,0,0 -1109,10,8,0,0,0,4,5,3,1,0,8,19,2,12,0 -1110,0,4,0,1,5,3,10,3,4,1,6,2,4,14,0 -1111,0,0,0,10,6,6,10,1,4,0,1,11,4,4,0 -1112,1,2,0,9,5,0,14,4,3,1,8,16,4,34,0 -1113,5,1,0,0,6,1,13,5,2,1,18,5,4,21,1 -1114,2,2,0,4,2,2,5,3,1,1,7,16,3,29,0 -1115,1,4,0,13,0,0,5,2,0,0,13,13,1,31,0 -1116,10,6,0,2,1,3,2,3,2,1,13,15,0,26,0 -1117,9,8,0,12,2,6,4,5,2,0,1,7,1,0,1 -1118,4,8,0,6,5,4,10,4,3,0,11,20,5,37,0 -1119,10,4,0,5,3,0,4,3,3,0,17,12,2,22,0 -1120,7,5,0,3,6,3,3,5,4,1,0,1,3,24,1 -1121,0,2,0,4,1,5,14,4,2,0,10,6,4,28,0 -1122,3,6,0,3,0,3,9,2,1,0,13,11,2,36,0 -1123,0,7,0,6,4,6,3,0,0,0,17,2,5,7,0 -1124,3,4,0,7,2,5,5,4,0,1,3,20,4,32,1 -1125,4,8,0,1,5,0,3,5,0,1,14,9,0,31,0 -1126,9,5,0,3,1,5,9,2,4,0,2,6,1,32,0 -1127,3,5,0,9,4,2,3,1,4,1,11,18,2,18,1 -1128,0,7,0,3,1,5,8,0,2,1,17,2,0,4,0 -1129,6,3,0,9,0,3,9,3,3,1,4,2,0,27,0 -1130,2,7,0,5,2,4,2,0,3,0,2,15,0,18,0 -1131,0,5,0,0,3,1,2,3,0,1,3,7,3,20,1 -1132,8,2,0,12,3,1,14,0,1,0,6,11,3,4,0 -1133,1,1,0,5,6,1,2,4,3,0,13,18,3,20,0 -1134,7,0,0,0,3,1,11,0,2,0,11,2,2,36,1 -1135,9,5,0,15,4,0,10,3,2,1,18,13,1,32,1 -1136,1,8,0,15,2,4,8,1,4,0,10,11,2,34,0 -1137,4,3,0,9,6,6,6,2,4,1,2,0,0,12,0 -1138,0,8,0,11,5,5,8,2,3,1,14,18,1,2,0 -1139,7,6,0,9,2,0,2,3,0,1,10,12,1,5,0 -1140,4,4,0,5,4,3,14,4,0,0,13,17,1,16,0 -1141,1,4,0,10,0,3,13,3,4,0,4,18,0,6,0 -1142,1,8,0,1,3,5,9,0,0,1,5,2,4,6,0 -1143,1,2,0,4,3,1,12,5,3,0,18,11,1,24,1 -1144,10,0,0,1,2,6,8,3,3,1,13,9,0,22,0 -1145,0,7,0,0,2,3,7,3,1,0,7,11,2,18,0 -1146,6,7,0,9,3,5,14,1,3,1,17,2,1,18,0 -1147,1,0,0,7,3,5,11,0,4,0,0,4,1,26,0 -1148,10,4,0,14,2,1,0,4,0,1,18,19,1,1,1 -1149,7,0,0,12,2,2,7,5,3,0,13,13,2,32,0 
-1150,8,4,0,6,0,2,2,3,3,1,4,13,4,29,0 -1151,1,0,0,14,5,2,7,4,0,1,10,20,4,28,0 -1152,2,3,0,8,0,5,6,4,2,0,13,11,2,1,0 -1153,6,8,0,1,2,1,3,5,0,0,13,0,0,6,0 -1154,10,1,0,0,1,0,11,5,4,0,7,2,1,39,0 -1155,0,8,0,13,3,5,2,5,1,1,2,18,2,28,0 -1156,1,3,0,7,4,2,14,0,1,1,6,10,1,14,1 -1157,0,2,0,3,4,3,5,2,2,1,0,13,0,31,0 -1158,9,7,0,6,6,4,4,1,2,1,1,14,1,23,1 -1159,6,1,0,6,4,6,4,0,2,1,0,2,2,1,0 -1160,0,0,0,13,6,4,6,1,0,1,1,16,0,4,0 -1161,1,3,0,3,4,1,9,4,0,1,5,7,4,20,1 -1162,0,6,0,12,3,0,8,4,0,0,13,14,0,24,0 -1163,4,6,0,0,1,6,13,2,3,1,5,0,3,0,1 -1164,3,4,0,11,2,6,5,3,0,1,6,2,5,26,0 -1165,8,2,0,5,3,5,12,5,3,1,0,15,3,8,0 -1166,3,8,0,7,2,5,5,1,1,0,8,11,4,22,0 -1167,2,4,0,1,1,6,2,4,4,1,13,11,0,13,0 -1168,0,3,0,0,1,6,14,4,4,1,15,9,0,31,0 -1169,5,4,0,6,1,5,3,0,2,0,18,4,1,41,1 -1170,8,3,0,11,0,0,0,4,2,0,2,18,0,31,0 -1171,6,4,0,3,0,2,9,4,2,1,8,2,4,34,0 -1172,5,8,0,3,2,1,1,0,1,0,1,17,3,12,1 -1173,3,6,0,9,0,6,0,2,1,0,10,16,1,20,0 -1174,2,4,0,8,0,2,12,4,3,1,5,1,3,5,1 -1175,10,5,0,8,4,6,0,0,4,1,9,10,2,2,1 -1176,3,4,0,10,1,3,2,0,1,0,0,13,1,29,0 -1177,5,4,0,5,4,3,13,1,2,1,9,7,2,20,1 -1178,0,0,0,0,0,6,4,1,4,1,3,10,1,16,0 -1179,2,8,0,13,4,6,5,3,1,1,13,11,0,39,0 -1180,10,3,0,8,3,0,8,1,4,1,8,3,5,23,0 -1181,4,5,0,11,1,2,3,1,2,1,17,11,0,22,0 -1182,10,0,0,7,4,0,1,2,0,1,13,11,5,31,0 -1183,0,7,0,3,6,0,0,1,0,0,5,11,5,34,0 -1184,9,6,0,0,3,6,14,2,4,1,18,8,4,38,1 -1185,1,2,0,10,6,1,5,5,0,0,11,9,2,19,0 -1186,1,8,0,0,0,5,0,3,1,0,5,5,4,13,0 -1187,2,0,0,14,0,1,8,1,4,0,15,2,0,33,0 -1188,2,5,0,6,1,5,9,4,0,1,8,15,0,29,0 -1189,9,0,0,10,0,5,10,5,1,1,16,1,0,19,1 -1190,3,2,0,1,6,2,7,3,4,0,8,11,5,12,0 -1191,5,8,0,0,5,6,13,0,4,0,18,4,3,18,1 -1192,5,7,0,6,1,5,6,1,1,0,8,1,2,8,0 -1193,4,5,0,2,0,0,4,5,3,0,13,2,2,10,0 -1194,6,7,0,11,3,0,7,1,0,1,18,5,5,41,1 -1195,4,3,0,13,2,1,6,0,3,0,18,1,3,27,1 -1196,7,3,0,12,0,5,14,3,4,1,8,13,0,0,0 -1197,8,2,0,9,4,1,7,3,0,1,11,0,1,12,0 -1198,6,0,0,13,6,0,9,5,3,0,0,9,5,22,0 -1199,7,8,0,1,6,1,11,5,4,0,5,3,2,32,1 -1200,10,8,0,5,5,2,0,3,0,1,2,11,1,20,0 -1201,0,8,0,13,5,4,5,4,0,0,13,11,3,4,0 
-1202,0,2,0,3,1,5,3,1,2,0,2,6,2,32,0 -1203,8,0,0,6,5,4,7,1,1,1,13,15,5,7,0 -1204,1,1,0,10,6,6,6,2,0,1,1,15,2,26,0 -1205,2,4,0,11,2,6,9,1,2,0,9,3,4,41,1 -1206,2,5,0,7,4,5,9,1,3,0,4,16,4,34,0 -1207,3,8,0,0,2,1,10,3,3,0,16,9,3,10,1 -1208,0,3,0,4,2,2,13,5,1,1,3,1,2,9,1 -1209,10,7,0,1,0,0,3,2,4,1,17,11,3,14,0 -1210,3,0,0,1,4,5,1,0,4,1,4,13,1,23,0 -1211,0,7,0,11,3,4,9,4,3,0,14,0,5,23,0 -1212,3,4,0,0,1,4,7,2,0,1,11,0,0,18,0 -1213,8,3,0,6,0,3,7,5,3,0,8,0,2,32,0 -1214,2,7,0,6,3,4,2,1,2,1,7,9,0,19,0 -1215,7,3,0,3,5,5,12,1,3,1,13,18,3,36,0 -1216,9,6,0,15,4,4,5,2,3,1,13,11,1,6,0 -1217,9,2,0,3,4,6,5,1,0,1,13,11,0,6,0 -1218,3,6,0,15,5,3,4,1,3,0,6,2,5,7,0 -1219,0,1,0,13,0,5,6,2,2,0,4,20,4,14,0 -1220,2,7,0,11,1,5,6,5,0,1,13,11,4,41,0 -1221,9,8,0,13,6,2,1,3,0,1,2,15,3,4,0 -1222,1,6,0,13,6,3,14,2,3,1,0,9,0,2,0 -1223,7,2,0,1,6,5,10,5,3,0,2,2,0,14,0 -1224,2,7,0,12,0,3,8,1,0,0,7,20,1,14,0 -1225,5,6,0,13,6,5,14,3,4,1,16,2,3,12,0 -1226,10,8,0,3,4,2,8,2,0,1,13,11,5,40,0 -1227,6,1,0,11,0,3,6,1,3,0,2,11,3,3,0 -1228,10,6,0,3,0,0,14,0,2,0,17,20,5,26,0 -1229,3,6,0,9,2,0,4,2,1,0,2,6,2,21,0 -1230,8,3,0,3,1,6,6,1,3,0,13,16,2,4,0 -1231,1,8,0,4,4,6,5,2,0,0,0,13,0,11,0 -1232,1,0,0,6,1,5,9,3,4,1,17,7,5,23,0 -1233,3,6,0,12,6,1,2,3,2,1,0,14,3,38,1 -1234,3,8,0,14,4,4,6,1,1,0,16,3,3,32,1 -1235,10,6,0,10,0,4,8,4,0,1,1,13,4,4,0 -1236,2,2,0,3,3,5,9,4,3,0,10,5,3,23,0 -1237,4,3,0,1,5,3,2,1,0,0,10,3,2,16,0 -1238,3,0,0,4,1,3,14,3,1,0,8,0,1,1,0 -1239,3,6,0,9,1,0,9,4,1,0,10,10,2,16,0 -1240,5,4,0,10,1,0,3,1,4,0,12,9,4,37,0 -1241,3,8,0,11,0,0,4,4,4,0,12,9,2,21,0 -1242,3,0,0,12,1,4,4,1,0,0,8,4,2,6,0 -1243,6,3,0,8,5,1,2,5,4,0,18,6,4,6,1 -1244,1,3,0,11,6,3,7,1,1,1,13,19,3,36,0 -1245,0,2,0,12,1,3,13,1,3,1,0,2,1,26,0 -1246,10,5,0,10,6,1,12,2,3,1,7,1,2,20,1 -1247,4,7,0,0,4,4,13,2,3,1,1,19,1,39,1 -1248,7,0,0,11,1,2,1,3,3,1,4,11,1,10,0 -1249,1,3,0,5,2,0,14,1,0,0,4,15,2,22,0 -1250,5,8,0,12,4,4,5,2,0,0,13,11,2,17,0 -1251,6,5,0,8,0,2,8,2,4,1,8,1,3,3,0 -1252,5,7,0,7,0,1,13,5,2,1,7,1,5,11,1 -1253,2,1,0,8,2,6,11,0,3,0,1,1,1,41,1 
-1254,0,8,0,7,1,5,6,0,2,0,13,15,3,7,0 -1255,10,7,0,12,0,5,13,3,1,0,13,4,1,5,0 -1256,0,0,0,9,1,1,14,5,3,0,7,2,5,13,1 -1257,0,6,0,15,3,0,10,1,3,1,8,6,1,0,0 -1258,2,2,0,11,6,5,9,0,0,0,17,13,2,7,0 -1259,8,0,0,2,0,5,8,4,0,0,6,19,4,8,0 -1260,0,6,0,11,4,5,8,4,0,0,2,19,1,10,0 -1261,8,4,0,9,6,1,11,1,0,1,18,17,4,9,1 -1262,1,4,0,0,2,4,8,2,3,1,9,15,0,32,0 -1263,9,6,0,13,0,0,14,4,2,0,3,11,1,41,0 -1264,8,8,0,13,1,1,0,2,0,1,3,13,1,2,0 -1265,0,8,0,6,5,2,11,3,4,1,6,18,1,27,0 -1266,7,1,0,14,6,0,12,0,3,1,18,12,4,10,1 -1267,1,1,0,1,2,6,11,4,3,1,2,10,0,4,0 -1268,8,7,0,1,0,6,1,1,4,1,13,11,2,8,0 -1269,7,4,0,4,4,1,2,3,0,1,1,7,5,11,1 -1270,9,0,0,14,6,0,7,2,2,0,18,7,3,12,1 -1271,4,1,0,6,1,5,2,4,3,1,13,15,2,37,0 -1272,1,2,0,4,1,6,12,1,2,0,18,5,4,0,1 -1273,5,6,0,2,2,4,12,2,1,1,14,19,1,24,1 -1274,6,4,0,15,0,4,10,3,4,1,17,4,0,32,0 -1275,3,5,0,15,5,0,3,3,2,1,2,11,3,0,0 -1276,9,5,0,12,5,5,8,1,3,1,18,7,5,18,1 -1277,2,7,0,15,6,3,3,3,1,1,3,2,0,27,0 -1278,1,8,0,0,0,5,9,3,2,1,2,14,0,5,0 -1279,10,7,0,11,1,0,1,5,3,0,2,18,2,18,0 -1280,1,5,0,5,2,0,3,0,1,0,4,15,0,27,0 -1281,6,0,0,4,2,1,6,5,3,1,18,10,5,30,1 -1282,8,0,0,5,6,5,10,2,3,1,9,5,1,13,1 -1283,7,6,0,9,0,1,4,5,3,1,14,16,1,33,1 -1284,4,1,0,7,1,6,11,5,4,1,4,15,4,0,0 -1285,0,7,0,5,0,3,12,1,4,1,0,13,1,32,0 -1286,3,5,0,11,0,5,6,5,3,0,13,9,0,4,0 -1287,6,5,0,3,6,4,3,5,2,1,8,8,2,29,1 -1288,0,0,0,6,3,5,0,4,3,0,15,2,1,13,0 -1289,2,6,0,11,3,2,5,1,1,0,13,13,4,39,0 -1290,2,6,0,0,1,6,14,2,2,1,8,6,2,41,0 -1291,1,6,0,10,1,3,10,4,2,0,8,9,5,1,0 -1292,1,6,0,13,1,1,5,4,0,0,2,8,2,21,0 -1293,6,8,0,5,3,0,1,0,1,1,6,14,5,6,0 -1294,0,3,0,5,0,5,14,3,4,1,17,18,1,4,0 -1295,0,0,0,6,1,2,3,2,0,0,8,15,1,38,0 -1296,0,8,0,13,0,4,8,4,4,1,13,2,0,23,0 -1297,3,5,0,13,4,1,11,2,3,1,4,4,0,33,0 -1298,0,4,0,4,3,1,0,4,2,1,4,2,3,30,0 -1299,8,6,0,5,5,5,6,0,0,0,4,12,4,16,0 -1300,0,6,0,4,0,1,10,1,0,0,13,11,1,6,0 -1301,2,2,0,15,5,4,3,0,3,1,15,17,3,41,1 -1302,9,6,0,13,6,0,14,3,0,0,15,10,0,6,0 -1303,5,6,0,4,0,5,13,0,0,1,17,13,4,13,0 -1304,10,0,0,9,0,0,11,0,2,1,11,10,0,5,1 
-1305,5,8,0,5,5,3,14,5,3,1,2,12,2,16,0 -1306,3,3,0,8,5,5,2,4,3,1,8,3,0,29,0 -1307,0,8,0,10,0,5,13,2,2,1,13,6,4,17,0 -1308,5,6,0,1,2,5,0,3,0,1,2,11,4,11,0 -1309,2,7,0,6,4,0,3,0,3,0,11,15,0,29,0 -1310,5,0,0,13,2,4,3,3,3,0,2,6,0,26,0 -1311,2,0,0,14,6,5,14,4,2,0,0,16,0,40,0 -1312,10,3,0,3,2,0,0,4,1,1,1,3,3,19,1 -1313,0,8,0,1,6,4,6,1,3,0,13,11,1,38,0 -1314,7,8,0,14,1,3,1,0,3,0,2,15,5,12,0 -1315,4,5,0,7,1,0,14,3,0,0,9,19,5,1,1 -1316,5,1,0,6,1,4,6,2,1,0,8,6,5,27,0 -1317,0,4,0,0,6,0,5,0,2,1,2,20,1,31,0 -1318,5,5,0,12,4,4,2,1,1,0,8,7,3,2,0 -1319,0,1,0,12,5,1,1,5,2,1,4,12,1,6,1 -1320,6,7,0,4,2,5,7,1,0,0,2,0,1,34,0 -1321,0,7,0,8,0,1,0,3,4,1,17,13,4,17,0 -1322,0,5,0,7,5,3,0,0,1,1,16,9,4,16,0 -1323,9,6,0,3,1,4,2,5,0,1,2,3,2,21,0 -1324,3,6,0,2,0,6,9,3,2,0,17,2,0,22,0 -1325,2,2,0,8,6,0,7,2,2,1,8,2,0,26,0 -1326,7,5,0,9,2,0,9,1,0,1,5,8,3,30,1 -1327,7,7,0,3,4,6,2,2,0,1,1,13,2,21,1 -1328,2,8,0,13,3,0,9,2,3,0,17,13,0,18,0 -1329,0,0,0,3,1,0,4,4,4,0,13,19,1,35,0 -1330,4,3,0,3,0,4,8,2,3,1,13,2,5,34,0 -1331,0,4,0,14,1,3,3,2,3,1,10,4,4,26,0 -1332,0,3,0,11,0,2,6,1,4,1,2,6,0,7,0 -1333,3,6,0,2,1,3,11,0,2,1,17,2,3,27,0 -1334,2,2,0,12,5,0,14,3,0,0,15,9,5,10,0 -1335,6,3,0,3,3,6,4,5,4,1,11,10,4,6,1 -1336,1,6,0,15,4,3,6,1,2,0,2,0,1,25,0 -1337,8,3,0,14,2,6,14,2,4,1,14,8,1,36,1 -1338,7,1,0,15,6,1,13,0,4,0,18,7,4,28,1 -1339,2,7,0,7,1,4,9,2,2,0,2,9,5,31,0 -1340,0,3,0,1,0,4,6,0,4,0,4,18,2,3,0 -1341,0,8,0,10,6,1,13,5,4,1,14,15,2,13,1 -1342,10,1,0,6,6,2,7,1,3,1,7,5,2,20,1 -1343,6,3,0,3,0,5,12,5,2,0,2,20,1,23,0 -1344,0,5,0,9,1,5,9,1,2,0,0,18,4,26,0 -1345,2,1,0,7,1,5,10,3,1,1,10,2,2,37,0 -1346,4,8,0,13,1,2,5,1,3,0,2,6,5,41,0 -1347,0,1,0,13,5,4,2,0,3,0,10,0,2,9,0 -1348,2,4,0,8,5,5,4,0,4,0,1,2,0,25,0 -1349,1,1,0,11,6,4,6,4,4,1,18,20,4,12,0 -1350,1,8,0,9,3,1,13,3,0,1,17,2,5,32,0 -1351,9,2,0,14,6,6,3,3,2,1,13,0,5,23,0 -1352,2,4,0,8,3,4,0,1,2,0,15,2,5,32,0 -1353,10,5,0,3,3,4,10,0,3,0,4,2,3,41,0 -1354,5,7,0,7,0,2,2,4,0,0,13,19,4,35,0 -1355,5,3,0,7,2,2,5,5,0,1,9,19,5,41,1 -1356,0,7,0,4,5,0,6,3,1,1,11,16,5,39,0 
-1357,0,5,0,3,3,0,14,0,0,0,8,9,3,7,0 -1358,10,1,0,11,5,1,9,4,1,1,5,3,1,0,1 -1359,4,3,0,0,4,0,9,3,1,1,0,17,1,16,0 -1360,4,4,0,9,4,0,14,4,1,0,9,20,3,22,1 -1361,0,2,0,13,3,6,14,2,1,1,13,0,1,27,0 -1362,9,1,0,11,1,6,13,1,2,0,2,6,2,29,0 -1363,10,1,0,10,0,0,11,2,4,1,18,7,4,30,1 -1364,3,7,0,13,1,3,11,5,0,0,13,9,4,32,0 -1365,9,1,0,0,1,1,0,4,2,0,10,16,5,6,0 -1366,1,8,0,3,0,4,1,3,2,1,8,16,3,22,0 -1367,7,2,0,3,6,5,10,1,0,1,8,13,5,19,0 -1368,6,3,0,12,3,2,6,4,0,0,6,17,0,37,0 -1369,10,6,0,11,1,0,4,1,0,1,13,11,2,14,0 -1370,8,6,0,13,0,1,3,0,4,0,0,13,1,29,0 -1371,0,1,0,3,4,5,9,5,0,0,0,4,1,19,0 -1372,9,8,0,13,2,4,0,4,0,1,15,11,4,6,0 -1373,3,5,0,14,5,6,9,5,3,1,2,2,5,31,0 -1374,0,5,0,4,5,2,9,5,3,1,9,1,5,20,1 -1375,4,5,0,5,1,4,0,5,2,0,2,4,2,9,0 -1376,0,4,0,11,0,4,2,2,2,0,15,13,4,19,0 -1377,8,6,0,3,6,1,4,0,4,1,17,16,0,21,1 -1378,8,8,0,3,4,4,7,5,0,0,7,14,3,28,1 -1379,3,6,0,5,1,1,5,3,0,1,2,18,0,34,0 -1380,7,0,0,5,0,0,3,5,2,1,16,7,5,6,1 -1381,0,4,0,6,2,4,1,2,0,0,2,7,3,34,0 -1382,0,3,0,11,2,5,0,2,4,1,2,13,4,28,0 -1383,3,5,0,4,1,2,11,2,4,1,17,3,1,0,0 -1384,2,4,0,13,4,3,0,0,0,1,2,13,0,4,0 -1385,0,2,0,2,2,4,1,0,3,0,6,13,4,3,0 -1386,0,5,0,3,1,2,11,1,0,1,2,2,5,26,0 -1387,10,1,0,4,2,4,7,2,4,1,2,14,4,35,0 -1388,10,2,0,12,0,1,8,3,0,0,14,11,1,20,0 -1389,4,1,0,4,2,1,3,1,1,1,4,13,5,28,0 -1390,0,5,0,2,6,3,3,5,4,1,13,15,0,38,0 -1391,1,8,0,4,2,6,0,4,2,0,13,14,0,8,0 -1392,0,7,0,14,3,0,14,5,2,1,17,19,3,40,1 -1393,0,4,0,15,6,5,2,3,4,1,6,13,1,11,0 -1394,3,7,0,2,0,2,5,0,3,1,17,15,5,40,0 -1395,0,5,0,6,0,5,3,2,1,1,13,13,3,4,0 -1396,1,6,0,10,2,1,6,5,0,1,16,8,5,1,1 -1397,8,8,0,10,6,0,13,0,4,1,16,9,3,38,1 -1398,8,6,0,7,2,5,3,0,3,0,2,6,2,14,0 -1399,10,5,0,9,0,0,5,3,4,0,4,2,0,29,0 -1400,0,0,0,15,6,5,5,3,0,1,2,2,2,4,0 -1401,2,4,0,6,4,1,13,3,2,1,9,10,5,33,1 -1402,10,2,0,0,5,4,2,3,4,0,13,6,1,22,0 -1403,7,6,0,6,0,5,5,1,3,1,2,12,3,20,0 -1404,9,5,0,12,2,4,6,0,0,1,17,11,0,16,0 -1405,9,6,0,13,2,0,7,3,4,0,2,11,4,35,0 -1406,6,2,0,9,1,5,5,4,1,1,12,15,0,3,0 -1407,2,7,0,2,3,6,13,3,4,0,12,18,2,37,0 
-1408,4,8,0,11,5,1,9,2,2,1,8,13,0,23,0 -1409,5,7,0,10,6,0,14,0,1,0,11,10,1,41,1 -1410,9,6,0,7,2,3,11,5,0,0,0,11,0,5,0 -1411,2,7,0,3,6,0,13,2,0,0,10,6,4,26,0 -1412,10,3,0,7,1,6,4,2,2,0,8,18,0,13,0 -1413,5,1,0,14,5,2,13,0,3,1,3,6,2,8,1 -1414,9,3,0,6,2,3,8,3,0,0,12,6,0,40,0 -1415,1,8,0,3,3,6,0,2,1,0,6,18,0,40,0 -1416,2,0,0,11,6,0,8,5,2,1,13,6,0,30,0 -1417,8,2,0,0,5,3,3,4,2,0,13,15,0,19,0 -1418,2,5,0,4,0,6,6,5,0,1,18,5,0,10,1 -1419,4,8,0,9,0,4,9,4,4,0,4,11,2,19,0 -1420,5,3,0,1,0,4,4,3,2,0,13,11,1,1,0 -1421,9,4,0,15,4,6,5,4,0,0,6,11,5,10,0 -1422,0,0,0,8,0,4,4,1,3,1,8,5,2,5,0 -1423,1,8,0,2,1,6,1,2,0,1,10,6,2,27,0 -1424,9,7,0,6,3,3,12,0,3,1,0,12,4,5,0 -1425,9,6,0,14,4,2,11,4,1,1,15,17,0,22,1 -1426,3,4,0,10,6,4,7,3,2,1,13,12,1,5,0 -1427,9,6,0,4,6,1,8,4,4,1,9,10,2,29,1 -1428,3,6,0,15,1,1,5,4,3,1,13,0,5,4,0 -1429,5,3,0,1,2,0,5,3,0,1,8,4,2,40,0 -1430,8,0,0,0,3,6,8,4,3,1,9,10,1,10,1 -1431,8,1,0,8,2,1,10,0,0,1,16,7,3,39,1 -1432,2,3,0,0,4,2,5,5,1,1,2,13,1,35,0 -1433,10,8,0,14,1,4,4,3,4,0,13,1,1,11,0 -1434,2,4,0,8,5,0,9,3,2,1,8,2,0,31,0 -1435,10,4,0,10,1,4,9,3,1,1,14,7,3,20,1 -1436,0,2,0,13,0,1,8,4,0,1,2,5,3,28,0 -1437,9,8,0,1,5,6,5,4,2,0,7,10,2,9,1 -1438,0,2,0,13,0,0,3,2,2,0,8,6,1,23,0 -1439,0,0,0,15,1,0,4,3,0,0,4,5,4,35,0 -1440,2,0,0,10,2,1,12,0,0,0,18,16,2,16,1 -1441,0,6,0,2,4,4,5,3,0,1,4,17,3,23,0 -1442,4,1,0,12,2,0,12,5,1,0,5,8,2,33,1 -1443,2,7,0,10,1,1,14,5,4,0,2,13,0,2,0 -1444,2,2,0,1,5,2,9,3,0,0,13,7,1,21,0 -1445,5,4,0,2,5,0,3,3,0,1,11,19,1,13,1 -1446,2,4,0,3,2,6,12,2,0,1,2,11,4,14,0 -1447,9,1,0,11,6,1,10,5,1,1,1,3,3,41,1 -1448,6,3,0,6,1,5,8,2,3,1,0,11,1,27,0 -1449,5,4,0,8,5,4,9,3,2,1,17,17,2,25,0 -1450,10,6,0,9,1,1,5,1,3,0,6,2,0,16,0 -1451,9,8,0,6,5,3,0,5,2,0,2,18,4,13,0 -1452,6,7,0,2,5,0,1,0,4,0,11,3,2,18,0 -1453,6,7,0,7,6,0,8,4,4,0,18,5,3,35,1 -1454,1,0,0,0,4,1,13,4,2,0,13,17,3,32,0 -1455,6,6,0,1,3,5,10,1,0,1,2,2,2,27,0 -1456,10,1,0,12,3,5,11,5,1,0,14,16,0,30,0 -1457,7,7,0,12,3,0,10,3,0,1,17,6,3,6,0 -1458,2,2,0,1,0,2,0,3,0,0,10,3,5,3,0 
-1459,1,5,0,13,0,3,7,2,0,1,5,2,0,14,0 -1460,1,3,0,1,6,5,9,3,0,0,8,2,2,2,0 -1461,8,3,0,14,4,2,11,5,3,1,18,14,5,30,1 -1462,3,6,0,4,5,2,10,1,1,1,3,4,1,18,0 -1463,0,8,0,15,1,6,5,2,4,1,4,2,0,31,0 -1464,10,2,0,6,0,2,5,2,4,1,2,2,1,22,0 -1465,9,5,0,13,3,6,8,3,2,0,15,11,1,19,0 -1466,10,5,0,4,6,0,0,3,2,1,8,0,2,21,0 -1467,1,2,0,10,6,3,8,1,3,1,8,3,1,38,0 -1468,10,0,0,15,0,5,1,2,4,1,2,2,1,35,0 -1469,10,6,0,4,0,2,11,0,0,1,8,15,4,26,0 -1470,3,2,0,12,6,1,0,4,2,0,12,4,3,30,0 -1471,1,1,0,13,1,2,9,0,3,1,8,19,1,17,0 -1472,4,7,0,4,0,2,7,2,4,1,8,18,5,30,0 -1473,5,5,0,5,4,5,14,5,0,1,14,1,2,0,1 -1474,1,5,0,10,1,2,7,0,2,0,8,15,2,19,0 -1475,3,8,0,5,6,6,5,2,0,0,10,3,2,38,0 -1476,2,0,0,13,3,6,14,3,3,0,2,11,5,6,0 -1477,1,7,0,5,5,6,11,5,2,1,17,11,0,40,0 -1478,0,6,0,3,6,6,0,0,2,0,2,9,4,3,0 -1479,2,1,0,4,2,0,6,4,0,1,13,11,4,1,0 -1480,1,1,0,13,1,5,8,4,2,0,13,14,5,29,0 -1481,8,8,0,0,4,0,3,4,2,0,11,14,5,17,0 -1482,9,8,0,12,5,2,10,4,2,0,18,7,1,5,1 -1483,1,6,0,2,6,2,8,2,4,1,2,2,3,26,0 -1484,0,4,0,1,5,2,7,3,3,0,8,9,5,29,0 -1485,7,6,0,8,0,4,10,1,1,0,10,2,5,26,0 -1486,10,3,0,10,0,0,14,5,1,0,17,18,0,3,0 -1487,1,1,0,10,3,2,12,1,1,1,1,17,2,4,1 -1488,2,2,0,15,0,3,8,4,0,0,0,6,3,38,0 -1489,7,4,0,14,1,2,5,1,2,1,0,6,0,23,0 -1490,6,6,0,3,2,3,5,1,1,0,13,1,1,18,0 -1491,7,7,0,11,5,0,9,2,1,1,2,18,1,0,0 -1492,0,3,0,5,0,3,1,3,1,1,8,2,2,30,0 -1493,10,5,0,6,2,0,4,2,2,0,13,11,4,14,0 -1494,2,0,0,11,1,0,4,3,4,0,4,15,3,26,0 -1495,4,0,0,10,2,3,10,1,4,1,13,6,0,41,0 -1496,10,1,0,5,0,5,2,4,3,1,2,0,4,13,0 -1497,9,8,0,13,0,1,14,4,3,0,17,17,3,17,0 -1498,6,3,0,3,0,4,0,4,4,1,2,0,5,21,0 -1499,7,7,0,2,6,1,7,1,4,1,18,13,3,10,1 -1500,1,6,0,3,4,5,4,3,1,0,6,9,4,39,0 -1501,9,5,0,7,2,5,5,2,3,1,1,7,5,35,1 -1502,0,2,0,2,6,0,5,3,4,1,13,11,4,9,0 -1503,4,1,0,14,0,6,8,3,4,0,18,10,5,27,1 -1504,7,2,0,5,5,1,9,2,3,1,2,17,5,6,0 -1505,9,6,0,4,3,5,6,1,4,0,6,2,0,25,0 -1506,0,7,0,3,6,6,9,4,2,0,13,11,2,0,0 -1507,9,5,0,8,3,0,13,4,1,1,8,0,0,14,0 -1508,8,0,0,14,2,1,9,2,3,1,4,10,2,24,1 -1509,2,6,0,11,4,0,3,4,4,0,12,19,1,36,0 
-1510,6,0,0,7,4,2,3,0,0,1,11,8,4,28,1 -1511,1,7,0,9,1,6,11,4,4,1,8,2,4,39,0 -1512,2,0,0,6,1,0,8,4,0,0,0,2,0,8,0 -1513,10,4,0,1,1,4,9,0,0,1,10,11,0,26,0 -1514,2,5,0,12,4,1,2,5,3,0,5,10,2,39,1 -1515,8,7,0,8,3,1,6,4,0,1,1,20,4,6,1 -1516,2,7,0,6,0,3,9,5,0,1,13,2,4,22,0 -1517,9,5,0,8,6,2,8,2,2,0,2,19,0,4,0 -1518,1,4,0,9,6,0,2,0,3,0,8,11,2,23,0 -1519,8,2,0,5,2,0,5,2,0,1,8,0,2,33,0 -1520,0,6,0,0,0,0,10,2,4,0,6,13,0,38,0 -1521,2,6,0,5,3,3,6,2,4,0,15,13,4,22,0 -1522,5,5,0,1,3,0,13,2,1,1,13,16,5,21,0 -1523,2,5,0,13,1,4,9,0,2,0,13,11,2,8,0 -1524,2,5,0,10,2,1,1,2,0,1,13,5,1,25,0 -1525,7,3,0,6,1,1,6,0,4,1,7,10,0,18,1 -1526,2,0,0,0,1,2,5,4,1,0,0,3,1,31,0 -1527,5,1,0,5,2,1,14,1,4,0,18,5,2,32,1 -1528,6,3,0,13,1,6,6,2,2,0,8,18,5,37,0 -1529,0,4,0,1,0,5,6,1,3,0,13,15,1,34,0 -1530,9,5,0,14,2,2,9,4,2,1,7,6,3,1,1 -1531,0,0,0,6,0,6,0,3,3,1,16,5,2,32,0 -1532,5,5,0,3,5,4,10,5,2,1,7,19,5,21,1 -1533,9,6,0,14,4,1,12,3,1,0,9,17,5,9,1 -1534,9,8,0,0,2,3,6,3,4,1,9,3,0,11,1 -1535,4,6,0,0,1,2,14,5,1,1,5,14,0,10,1 -1536,0,8,0,4,3,2,0,4,0,1,11,4,2,9,0 -1537,3,7,0,8,0,4,9,4,2,0,8,6,2,7,0 -1538,7,1,0,7,1,5,6,5,2,0,12,14,0,28,0 -1539,2,5,0,8,1,1,12,5,0,0,13,6,3,30,0 -1540,9,3,0,10,3,1,9,5,2,1,0,1,1,22,1 -1541,1,7,0,5,0,5,2,4,3,1,6,17,1,14,0 -1542,8,2,0,0,1,6,10,5,4,1,14,8,1,4,1 -1543,2,6,0,15,5,4,4,0,0,0,15,20,0,32,0 -1544,3,0,0,6,6,2,1,3,3,0,1,15,1,28,1 -1545,5,6,0,13,6,0,1,0,2,1,16,7,3,5,1 -1546,2,2,0,14,2,1,1,5,1,1,18,5,1,24,1 -1547,2,7,0,0,5,0,12,0,3,1,8,2,1,31,0 -1548,0,0,0,11,3,3,9,2,1,1,8,13,1,24,0 -1549,9,7,0,0,4,0,10,4,3,0,9,7,2,21,1 -1550,0,0,0,8,5,0,4,4,1,0,6,9,1,29,0 -1551,10,4,0,1,3,0,12,2,2,0,2,18,5,13,0 -1552,2,4,0,13,2,6,6,5,3,0,10,20,3,17,0 -1553,7,6,0,5,1,3,3,5,3,0,5,17,1,2,1 -1554,10,6,0,11,2,1,1,5,1,1,10,4,0,26,1 -1555,8,3,0,5,4,3,9,1,4,0,6,13,2,4,0 -1556,3,6,0,5,6,3,9,5,3,1,13,11,2,35,0 -1557,2,5,0,0,1,5,6,1,3,0,12,13,4,12,0 -1558,2,2,0,2,6,4,4,0,2,1,8,13,2,8,0 -1559,2,3,0,11,0,2,9,5,1,0,2,2,0,38,0 -1560,4,5,0,10,5,6,8,1,3,1,17,16,5,23,0 -1561,5,6,0,14,2,3,8,3,2,1,10,18,0,33,0 
-1562,3,1,0,1,1,4,11,1,0,1,2,13,0,19,0 -1563,8,1,0,8,4,1,14,0,2,1,1,5,0,17,1 -1564,2,2,0,3,3,5,13,2,1,1,7,2,5,41,0 -1565,8,6,0,14,2,2,10,1,0,0,8,16,5,7,0 -1566,9,3,0,8,3,4,5,3,2,0,2,2,1,12,0 -1567,4,3,0,11,0,5,9,2,1,0,13,2,5,2,0 -1568,9,7,0,2,4,3,10,4,0,1,13,2,0,28,0 -1569,5,1,0,13,2,2,1,5,0,1,16,10,5,2,1 -1570,1,2,0,9,1,5,10,3,1,1,2,13,4,8,0 -1571,0,8,0,15,0,6,9,3,0,1,13,11,1,9,0 -1572,10,0,0,7,3,3,8,5,3,0,18,19,1,28,1 -1573,8,1,0,0,2,5,4,0,3,1,0,11,2,27,0 -1574,6,3,0,8,3,3,2,4,4,1,18,7,3,18,1 -1575,7,7,0,11,6,4,14,0,2,1,12,5,2,4,0 -1576,1,1,0,0,1,6,14,3,4,0,6,4,3,11,0 -1577,2,6,0,14,5,3,2,3,0,1,16,11,5,16,0 -1578,1,4,0,10,5,5,8,3,3,1,13,6,1,3,0 -1579,0,8,0,11,0,6,3,0,4,1,4,5,1,34,0 -1580,5,5,0,15,0,6,2,1,0,0,13,2,3,27,0 -1581,0,7,0,14,4,4,7,0,0,1,4,13,1,38,0 -1582,1,3,0,4,5,5,3,4,4,0,12,6,4,28,0 -1583,0,4,0,10,3,0,13,4,4,0,7,13,3,8,0 -1584,3,8,0,4,0,4,6,5,4,0,2,11,1,18,0 -1585,9,1,0,3,2,5,1,4,0,1,8,4,1,6,0 -1586,0,1,0,3,0,5,1,1,2,0,13,20,1,6,0 -1587,10,8,0,6,5,6,7,1,2,1,10,6,0,11,0 -1588,7,6,0,3,1,6,11,3,0,0,13,15,2,7,0 -1589,9,8,0,9,5,0,4,5,3,1,18,6,5,12,1 -1590,3,7,0,2,3,5,1,3,3,0,17,2,2,29,0 -1591,7,0,0,0,4,5,11,2,3,0,4,17,2,38,1 -1592,1,1,0,7,1,4,0,2,3,0,7,9,1,25,0 -1593,3,7,0,11,3,4,14,1,0,0,2,19,0,29,0 -1594,10,3,0,3,5,0,7,2,2,1,4,6,0,30,0 -1595,9,6,0,11,4,4,2,4,3,0,13,13,3,32,0 -1596,3,6,0,12,6,2,6,0,3,1,4,15,0,37,0 -1597,10,5,0,0,1,2,14,3,3,0,14,16,0,38,1 -1598,6,8,0,10,3,2,3,0,1,1,5,10,3,12,1 -1599,3,8,0,11,4,5,8,4,0,1,2,11,4,31,0 -1600,9,5,0,0,6,1,9,2,0,1,0,7,2,29,1 -1601,5,5,0,9,6,3,4,2,2,1,4,3,1,4,1 -1602,9,3,0,6,3,2,10,0,2,0,3,7,4,35,1 -1603,2,8,0,5,6,0,8,2,1,1,3,16,4,16,0 -1604,1,3,0,8,2,4,0,5,1,1,13,1,3,26,0 -1605,8,1,0,15,1,2,10,5,3,1,9,6,2,18,1 -1606,5,4,0,13,1,6,3,0,3,1,1,5,0,19,1 -1607,3,7,0,15,1,3,11,3,1,0,10,11,0,24,0 -1608,8,6,0,11,6,0,9,0,1,1,0,11,3,17,0 -1609,2,7,0,15,1,6,9,4,0,0,2,16,3,3,0 -1610,1,6,0,3,5,3,10,4,4,1,10,2,0,31,0 -1611,10,8,0,4,4,0,10,0,4,0,4,9,0,19,0 -1612,9,5,0,13,0,0,9,2,0,1,17,2,1,37,0 
-1613,1,1,0,5,5,4,9,5,2,0,2,9,0,0,0 -1614,9,7,0,15,0,6,1,2,1,1,13,6,3,17,0 -1615,5,6,0,15,1,0,2,4,2,1,13,11,0,20,0 -1616,9,0,0,5,6,5,6,1,2,0,4,3,0,22,0 -1617,7,6,0,1,1,0,11,3,2,0,8,20,2,23,0 -1618,6,2,0,8,3,4,0,2,0,0,2,18,3,18,0 -1619,5,7,0,4,6,1,9,1,0,1,1,14,1,2,1 -1620,3,7,0,7,5,2,13,4,2,0,13,0,1,31,1 -1621,6,5,0,6,1,2,3,1,3,1,13,11,3,2,0 -1622,9,2,0,14,5,0,7,3,3,0,8,13,0,11,0 -1623,7,6,0,8,6,0,1,3,3,0,8,2,1,26,0 -1624,3,7,0,5,5,6,8,3,3,0,17,11,4,31,0 -1625,4,0,0,1,0,5,13,2,4,0,2,4,0,0,0 -1626,2,5,0,6,1,4,5,1,2,1,6,2,0,2,0 -1627,9,6,0,2,2,2,12,5,4,0,8,2,0,4,0 -1628,0,4,0,1,2,4,0,5,2,1,2,9,4,17,0 -1629,0,0,0,12,6,4,8,2,0,1,6,20,4,25,0 -1630,8,2,0,4,0,4,0,2,2,1,18,5,5,23,1 -1631,2,8,0,9,6,5,9,1,4,1,17,15,3,13,0 -1632,0,1,0,9,6,3,6,2,0,1,6,11,4,4,0 -1633,8,2,0,3,6,5,8,3,4,1,2,0,3,24,0 -1634,4,4,0,11,6,2,1,1,1,1,4,9,0,37,0 -1635,6,1,0,5,0,5,0,4,4,1,14,12,3,25,0 -1636,2,0,0,15,4,3,13,4,1,0,15,12,5,32,0 -1637,6,5,0,9,0,6,13,5,4,1,1,5,5,40,1 -1638,6,7,0,0,6,4,10,3,1,0,14,9,5,41,0 -1639,6,0,0,13,0,0,12,0,0,1,13,5,0,26,0 -1640,9,0,0,6,6,5,9,3,2,1,13,4,0,33,0 -1641,3,6,0,3,1,0,6,3,1,0,17,12,0,6,0 -1642,7,6,0,5,1,6,10,2,3,1,6,14,4,5,0 -1643,1,6,0,2,1,3,8,1,0,0,11,16,3,3,0 -1644,6,2,0,13,4,0,13,4,3,1,14,5,5,32,1 -1645,4,2,0,6,2,3,5,3,3,1,14,20,5,7,0 -1646,7,0,0,8,1,1,0,2,0,0,5,19,2,22,1 -1647,3,1,0,13,0,4,6,3,4,1,4,3,3,29,0 -1648,5,3,0,3,0,2,7,4,0,1,13,9,3,30,0 -1649,1,6,0,11,6,4,8,3,4,0,8,20,0,28,0 -1650,5,8,0,0,4,0,1,4,3,0,5,5,5,16,1 -1651,2,7,0,10,6,6,8,2,3,0,11,15,5,11,0 -1652,9,1,0,11,2,1,5,0,1,0,14,15,3,14,1 -1653,0,8,0,5,4,0,7,5,0,0,13,3,0,10,0 -1654,0,1,0,7,5,4,2,2,2,0,3,2,4,40,0 -1655,3,2,0,5,5,4,4,1,2,0,17,13,0,23,0 -1656,6,6,0,8,5,3,5,3,1,1,17,11,1,7,0 -1657,0,8,0,5,4,6,3,5,4,0,8,11,0,37,0 -1658,9,6,0,13,1,6,10,0,4,0,8,0,5,19,0 -1659,9,5,0,10,1,0,9,2,1,1,18,19,5,23,1 -1660,2,7,0,4,2,0,1,2,4,1,13,19,0,7,0 -1661,1,4,0,2,0,0,11,3,3,1,15,9,5,0,0 -1662,10,8,0,1,6,6,6,5,2,1,8,9,0,18,0 -1663,2,6,0,8,6,0,6,4,3,1,10,9,5,22,0 -1664,3,4,0,4,1,0,0,3,4,1,4,13,2,22,0 
-1665,6,8,0,15,3,1,6,1,3,0,3,10,1,26,1 -1666,2,6,0,3,0,4,7,3,1,1,13,6,3,7,0 -1667,1,6,0,3,6,6,5,3,0,1,8,7,2,4,0 -1668,2,0,0,1,1,2,13,1,0,1,2,0,5,14,1 -1669,2,1,0,3,0,0,5,1,4,0,11,11,4,0,0 -1670,7,8,0,10,4,2,13,1,1,1,18,10,3,8,1 -1671,0,2,0,14,6,3,8,2,3,0,2,15,5,7,0 -1672,4,3,0,11,3,4,2,4,4,1,8,2,0,22,0 -1673,1,6,0,0,2,0,9,2,2,0,8,9,4,36,0 -1674,4,0,0,13,1,0,7,2,1,0,2,12,1,31,0 -1675,10,1,0,13,1,0,5,3,2,1,12,20,1,36,0 -1676,1,0,0,11,0,4,9,4,4,1,18,16,4,40,0 -1677,0,0,0,11,2,5,5,5,0,1,2,16,4,33,0 -1678,8,6,0,15,2,3,3,4,2,0,18,16,3,28,1 -1679,1,4,0,11,2,3,5,1,4,1,4,20,0,4,0 -1680,1,7,0,9,2,4,0,1,1,0,2,2,5,36,0 -1681,2,5,0,9,2,6,6,3,4,0,2,9,5,31,0 -1682,9,3,0,13,3,5,4,1,4,1,6,0,2,38,0 -1683,3,7,0,3,1,2,1,3,3,1,13,13,5,29,0 -1684,1,6,0,15,1,0,9,3,1,0,0,2,0,33,0 -1685,2,2,0,15,1,4,4,5,0,1,2,15,0,30,0 -1686,3,4,0,4,6,1,11,5,0,0,18,16,1,27,1 -1687,2,8,0,2,0,3,14,4,4,0,8,20,5,0,0 -1688,10,0,0,14,0,5,2,0,0,0,7,8,2,16,1 -1689,3,6,0,8,3,3,9,4,0,1,12,18,4,5,0 -1690,1,6,0,3,1,6,14,2,1,1,13,11,2,31,0 -1691,4,8,0,3,0,2,4,5,2,0,2,0,2,35,0 -1692,7,4,0,0,3,4,4,4,0,1,7,8,2,33,1 -1693,1,8,0,0,5,0,5,2,3,0,2,13,1,29,0 -1694,0,6,0,13,0,5,4,0,0,0,4,11,2,39,0 -1695,0,5,0,6,6,3,5,4,2,0,17,8,5,3,0 -1696,2,5,0,10,5,2,13,0,3,0,18,5,2,30,1 -1697,4,0,0,13,3,4,8,0,2,1,1,9,1,33,0 -1698,0,0,0,5,1,3,8,5,0,0,7,9,3,39,0 -1699,3,4,0,3,1,6,13,1,1,1,2,20,5,5,0 -1700,10,2,0,7,0,6,14,3,1,0,7,0,0,27,0 -1701,2,6,0,8,6,5,5,2,3,1,17,6,4,40,0 -1702,6,3,0,2,2,4,9,5,4,1,5,11,0,0,0 -1703,4,8,0,13,2,5,8,5,2,1,18,15,0,36,1 -1704,6,7,0,4,2,0,8,3,0,0,3,10,4,25,1 -1705,6,3,0,9,1,6,9,0,2,1,0,11,2,8,0 -1706,3,6,0,12,3,5,14,2,4,0,2,11,3,41,0 -1707,1,8,0,13,3,0,12,0,0,0,8,11,5,32,0 -1708,4,2,0,9,6,2,0,1,1,0,8,15,5,20,0 -1709,6,1,0,1,2,4,9,2,2,1,15,19,1,1,1 -1710,7,1,0,4,1,0,3,1,4,0,3,3,5,31,1 -1711,4,7,0,6,0,0,12,2,0,1,12,12,0,13,0 -1712,0,2,0,3,6,4,0,3,3,1,17,0,2,36,0 -1713,9,2,0,1,1,4,5,5,0,0,8,11,0,31,0 -1714,8,3,0,2,0,3,12,1,0,0,13,9,1,1,0 -1715,9,2,0,6,1,1,12,5,4,0,1,6,2,28,1 -1716,2,0,0,2,0,5,6,1,1,1,13,9,1,29,0 
-1717,0,0,0,3,4,4,10,2,3,1,2,15,5,23,0 -1718,6,3,0,3,2,1,6,4,0,0,3,17,0,28,1 -1719,4,5,0,13,0,1,8,2,1,1,4,11,1,36,0 -1720,0,0,0,1,3,5,4,2,4,1,4,11,3,3,0 -1721,3,6,0,2,1,6,8,3,1,0,8,18,0,28,0 -1722,7,8,0,12,3,2,8,3,2,0,5,14,5,37,1 -1723,10,4,0,5,2,4,5,0,1,0,2,15,0,35,0 -1724,9,6,0,14,0,2,2,5,3,1,1,19,1,6,1 -1725,0,4,0,10,1,0,3,3,0,0,4,11,2,29,0 -1726,0,6,0,13,4,4,6,4,0,0,8,9,0,34,0 -1727,10,4,0,3,6,4,6,0,0,1,14,0,3,35,0 -1728,4,4,0,6,6,2,2,1,2,0,4,0,1,14,0 -1729,8,5,0,10,2,1,7,2,0,0,9,16,0,22,1 -1730,4,5,0,11,0,6,5,5,3,0,2,20,2,40,0 -1731,1,5,0,4,0,0,7,1,3,0,1,15,0,31,0 -1732,3,0,0,3,2,1,13,5,2,0,9,5,2,7,1 -1733,3,7,0,0,4,5,7,1,3,1,13,0,1,29,0 -1734,8,5,0,13,1,5,1,5,4,0,13,6,0,8,0 -1735,6,4,0,3,4,6,9,1,4,0,10,9,0,18,0 -1736,0,2,0,7,4,1,7,3,1,1,8,2,0,5,0 -1737,1,4,0,2,6,0,9,1,4,0,17,11,1,9,0 -1738,4,1,0,9,5,0,12,4,1,1,15,10,3,12,0 -1739,2,2,0,2,0,0,0,4,3,1,4,4,0,36,0 -1740,1,4,0,14,0,6,7,1,4,0,13,13,3,1,0 -1741,4,5,0,4,6,2,3,2,1,1,16,7,4,31,1 -1742,5,7,0,10,5,6,14,5,0,1,6,6,3,1,0 -1743,0,2,0,1,6,1,6,5,2,1,2,16,3,6,0 -1744,7,6,0,5,0,5,14,3,4,1,13,11,3,9,0 -1745,0,2,0,3,2,4,8,4,0,0,6,6,3,22,0 -1746,0,0,0,14,1,6,1,2,4,0,2,2,1,24,0 -1747,6,8,0,0,1,1,8,1,0,1,2,6,3,36,0 -1748,0,5,0,1,1,5,12,5,3,1,6,0,4,29,0 -1749,6,7,0,9,2,3,10,1,0,0,2,14,1,23,0 -1750,0,8,0,2,3,0,4,3,2,1,2,4,2,21,0 -1751,5,1,0,4,1,3,12,1,2,0,13,2,4,40,0 -1752,10,3,0,2,5,4,8,3,3,1,8,18,5,27,0 -1753,3,3,0,3,1,0,6,1,1,0,3,13,1,38,0 -1754,10,0,0,12,1,1,11,1,4,1,15,17,4,39,1 -1755,1,6,0,1,2,4,10,0,2,1,13,11,5,6,0 -1756,4,8,0,14,2,2,0,1,2,1,3,7,2,2,1 -1757,2,3,0,3,4,3,14,0,1,0,8,9,5,19,0 -1758,1,3,0,11,4,6,14,2,4,1,12,2,5,20,0 -1759,2,6,0,5,5,2,8,4,1,1,3,14,5,18,1 -1760,10,4,0,11,4,6,2,1,3,1,1,10,0,37,1 -1761,0,0,0,7,6,5,8,3,0,0,17,18,4,29,0 -1762,3,8,0,6,2,6,2,4,2,0,2,2,0,29,0 -1763,9,7,0,7,2,6,0,5,2,1,7,0,5,33,1 -1764,1,3,0,0,2,4,14,0,0,0,2,16,4,27,0 -1765,1,7,0,14,0,0,2,2,3,1,8,7,0,9,0 -1766,9,1,0,4,1,5,8,4,0,1,2,20,2,9,0 -1767,1,5,0,4,5,4,12,5,3,1,18,10,5,9,1 -1768,10,8,0,8,2,1,9,4,0,1,4,0,0,9,0 
-1769,3,6,0,6,6,0,8,3,2,0,12,11,1,20,0 -1770,5,8,0,5,0,6,3,4,0,0,13,3,4,26,0 -1771,8,5,0,11,6,4,9,1,0,0,8,11,2,5,0 -1772,0,6,0,9,3,6,5,3,0,0,13,11,3,38,0 -1773,3,1,0,0,1,1,7,3,0,1,8,3,5,11,0 -1774,1,2,0,3,6,3,11,3,2,1,10,11,1,19,0 -1775,10,8,0,3,4,2,3,3,1,1,5,1,4,18,1 -1776,2,8,0,5,6,2,3,4,3,1,2,13,2,20,0 -1777,8,7,0,7,1,0,4,1,0,1,11,10,5,20,1 -1778,0,0,0,13,1,6,9,2,0,1,10,11,0,8,0 -1779,6,1,0,4,2,2,14,0,3,1,8,2,1,3,0 -1780,1,6,0,1,0,5,9,0,3,0,4,17,5,35,0 -1781,0,7,0,11,0,3,2,1,3,1,13,10,0,11,0 -1782,4,1,0,2,6,5,0,5,1,1,18,7,3,22,1 -1783,0,2,0,10,0,4,8,1,0,1,2,4,0,27,0 -1784,9,6,0,2,0,1,5,1,3,0,0,11,1,23,0 -1785,0,1,0,8,6,1,5,1,0,1,12,17,5,41,0 -1786,0,4,0,15,1,4,12,1,0,1,2,3,1,2,0 -1787,9,4,0,3,3,1,0,5,3,1,11,12,3,39,1 -1788,1,4,0,13,2,4,4,2,0,0,2,19,5,6,0 -1789,10,1,0,9,6,3,6,1,0,0,17,11,5,29,0 -1790,7,8,0,3,5,3,0,2,3,0,13,13,2,40,0 -1791,0,8,0,9,0,5,0,3,0,0,4,19,1,38,0 -1792,0,7,0,7,1,0,0,5,1,1,1,8,3,19,1 -1793,6,4,0,6,2,0,8,2,1,1,12,5,1,5,0 -1794,9,1,0,9,1,2,4,4,3,1,7,18,4,1,1 -1795,2,2,0,14,1,0,3,2,3,0,9,10,2,34,1 -1796,5,8,0,7,0,0,7,4,2,1,4,11,4,21,0 -1797,6,5,0,5,1,1,0,2,2,1,1,7,0,20,1 -1798,8,8,0,9,4,5,5,2,3,0,13,15,4,4,0 -1799,8,6,0,2,0,3,8,2,3,0,4,6,0,3,0 -1800,1,8,0,15,5,6,4,4,1,0,13,15,1,35,0 -1801,0,5,0,1,0,5,5,1,4,0,0,7,1,23,0 -1802,7,3,0,15,5,2,5,5,2,0,18,3,3,28,1 -1803,1,8,0,1,6,5,1,0,0,0,3,18,1,18,0 -1804,9,2,0,8,0,2,4,3,1,0,0,6,2,31,0 -1805,1,2,0,1,0,3,8,2,4,1,12,13,3,26,0 -1806,7,8,0,4,4,2,11,0,0,1,1,19,1,19,1 -1807,4,6,0,12,1,6,6,2,1,1,0,4,1,3,0 -1808,1,5,0,0,2,0,1,3,0,0,8,9,3,32,0 -1809,1,8,0,3,5,5,4,2,2,0,4,15,0,14,0 -1810,0,5,0,12,3,6,4,3,1,0,15,6,1,33,0 -1811,9,6,0,1,6,0,10,5,4,0,5,8,4,24,0 -1812,1,3,0,1,5,6,12,3,2,0,4,2,0,5,0 -1813,4,7,0,15,1,5,4,0,1,0,14,13,5,41,1 -1814,4,2,0,9,1,4,8,1,2,0,6,15,4,32,0 -1815,9,3,0,8,6,6,9,0,1,0,2,6,2,4,0 -1816,7,6,0,2,3,1,8,5,0,0,9,7,5,0,1 -1817,2,6,0,5,6,5,12,1,2,0,13,19,1,13,0 -1818,7,5,0,7,3,1,7,0,0,1,11,10,5,41,1 -1819,9,6,0,15,0,4,13,5,4,1,11,17,3,9,1 -1820,5,6,0,3,2,2,9,2,3,0,3,2,0,17,0 
-1821,1,6,0,0,1,0,9,2,1,1,4,2,3,7,0 -1822,5,3,0,3,1,4,14,1,2,1,4,9,1,33,0 -1823,7,5,0,3,0,1,10,0,0,0,5,16,0,37,1 -1824,0,6,0,13,6,2,5,3,3,1,7,2,0,14,0 -1825,2,8,0,10,3,6,3,0,4,0,2,9,0,37,0 -1826,2,7,0,1,6,6,13,0,0,1,13,9,4,9,0 -1827,6,6,0,12,6,2,10,3,2,1,8,0,0,31,0 -1828,10,4,0,3,2,3,3,3,2,0,4,18,5,41,0 -1829,1,5,0,0,3,1,1,5,3,0,16,17,0,38,1 -1830,1,6,0,2,0,4,9,4,3,1,8,0,3,0,0 -1831,2,8,0,13,3,4,0,4,2,1,6,9,4,41,0 -1832,3,5,0,9,6,5,6,0,2,1,1,2,3,24,1 -1833,10,6,0,4,5,3,14,2,2,1,14,17,1,1,1 -1834,10,2,0,15,5,5,9,5,0,0,1,11,1,35,0 -1835,8,7,0,13,4,0,9,1,1,1,8,2,0,35,0 -1836,6,2,0,6,4,0,6,4,0,1,2,11,5,6,0 -1837,5,0,0,9,3,0,6,0,0,0,13,13,3,32,0 -1838,3,8,0,13,3,6,8,4,1,1,2,2,0,3,0 -1839,0,8,0,6,4,4,8,3,4,0,8,12,5,40,0 -1840,7,4,0,6,2,4,4,0,0,0,9,10,4,36,1 -1841,1,5,0,9,1,5,7,3,1,0,13,11,5,26,0 -1842,9,1,0,4,2,4,1,2,1,1,18,7,1,7,1 -1843,0,4,0,8,2,5,7,3,1,0,2,15,2,36,0 -1844,7,5,0,10,3,1,4,5,2,1,0,5,5,7,1 -1845,0,7,0,0,0,1,5,5,1,1,3,15,1,3,0 -1846,6,3,0,6,1,6,6,3,4,1,11,7,0,28,1 -1847,0,7,0,0,0,6,4,2,3,0,4,2,2,10,0 -1848,2,8,0,7,3,4,8,3,1,1,11,20,3,28,0 -1849,6,4,0,1,4,3,9,2,0,1,12,11,0,33,0 -1850,1,6,0,11,1,3,6,3,1,0,13,20,0,10,0 -1851,9,0,0,15,1,4,14,3,4,1,7,5,1,31,1 -1852,6,6,0,14,2,6,4,5,4,1,6,17,1,19,1 -1853,10,8,0,5,2,6,14,4,0,1,15,20,3,2,0 -1854,1,5,0,15,0,6,8,4,2,0,0,11,4,7,0 -1855,5,3,0,12,6,4,7,3,0,0,4,2,0,22,0 -1856,6,8,0,5,0,5,4,1,3,1,13,15,0,18,0 -1857,6,2,0,11,5,1,0,2,1,1,17,15,0,11,0 -1858,5,1,0,6,4,4,3,1,2,1,5,5,4,23,1 -1859,2,2,0,2,5,1,4,3,3,1,7,7,5,9,1 -1860,1,8,0,4,5,0,1,1,4,0,1,2,1,14,0 -1861,5,3,0,14,5,2,13,3,0,0,18,18,4,8,1 -1862,10,5,0,0,0,4,7,1,1,0,8,0,4,6,0 -1863,9,4,0,12,0,5,10,3,3,0,17,11,1,4,0 -1864,3,3,0,6,2,5,7,2,1,1,15,17,1,30,1 -1865,9,7,0,6,0,6,1,5,0,1,18,5,4,28,1 -1866,3,4,0,10,6,4,8,0,1,1,3,7,2,21,1 -1867,0,2,0,13,5,0,9,3,0,1,18,12,3,5,0 -1868,8,0,0,4,6,4,12,5,2,0,17,1,4,37,0 -1869,5,8,0,3,6,4,4,0,2,0,17,4,5,4,0 -1870,9,4,0,3,3,2,9,0,4,0,13,6,0,14,0 -1871,10,8,0,7,1,6,11,4,3,0,18,1,5,30,1 -1872,0,1,0,2,4,4,0,2,1,0,8,6,1,31,0 
-1873,4,7,0,12,2,6,9,1,2,0,13,2,2,28,0 -1874,1,0,0,13,0,4,6,2,3,1,13,2,0,18,0 -1875,2,3,0,4,4,4,4,0,3,1,16,1,4,37,1 -1876,10,3,0,10,2,1,6,4,1,1,1,17,4,20,1 -1877,8,3,0,2,2,4,9,4,1,1,4,11,1,3,0 -1878,2,4,0,3,2,6,8,4,2,1,11,2,0,9,0 -1879,7,6,0,8,3,4,9,0,2,1,17,15,5,24,0 -1880,6,6,0,15,2,1,12,4,4,0,17,19,1,16,0 -1881,1,2,0,8,0,0,11,2,1,0,2,2,0,4,0 -1882,5,3,0,10,1,4,12,4,3,1,2,0,0,14,0 -1883,8,8,0,5,0,2,13,1,4,1,4,11,0,3,0 -1884,0,7,0,7,5,0,5,1,2,0,4,0,5,26,0 -1885,2,2,0,1,6,0,7,1,3,0,4,11,3,16,0 -1886,2,4,0,9,4,0,12,3,0,0,2,11,1,8,0 -1887,2,2,0,11,5,5,8,4,0,1,13,4,2,12,0 -1888,2,6,0,12,2,4,9,3,4,0,2,11,5,31,0 -1889,5,8,0,11,3,1,11,3,0,1,14,0,0,33,0 -1890,0,0,0,10,5,5,5,4,0,1,8,6,3,17,0 -1891,0,4,0,1,6,1,8,0,0,1,2,19,5,36,0 -1892,2,8,0,4,6,4,12,1,4,1,10,2,0,12,0 -1893,0,0,0,0,1,4,14,1,3,1,4,11,4,28,0 -1894,5,6,0,8,1,1,5,0,2,0,1,1,1,38,1 -1895,0,6,0,3,2,0,6,3,2,1,0,4,4,7,0 -1896,9,2,0,0,3,6,9,2,4,1,8,13,2,36,0 -1897,5,8,0,10,0,5,8,2,1,0,13,8,0,28,0 -1898,3,8,0,15,1,0,9,1,3,0,8,2,1,5,0 -1899,0,5,0,0,1,0,11,0,1,0,10,11,5,22,0 -1900,5,8,0,2,5,1,1,3,2,0,8,9,1,3,0 -1901,10,5,0,15,1,5,10,3,2,0,6,13,0,21,0 -1902,8,3,0,1,0,2,11,1,0,1,13,11,1,21,0 -1903,4,3,0,11,1,0,3,1,0,1,18,8,3,6,1 -1904,0,7,0,11,6,3,14,4,4,1,13,2,3,26,0 -1905,1,2,0,1,6,4,5,3,4,1,11,6,2,2,0 -1906,0,0,0,6,0,5,14,1,0,1,2,0,0,8,0 -1907,2,3,0,2,0,0,3,3,4,0,2,11,3,14,0 -1908,3,6,0,13,1,0,11,3,0,0,2,3,5,27,0 -1909,0,6,0,7,2,6,11,4,3,0,7,17,1,26,0 -1910,9,0,0,4,1,3,4,1,1,0,13,20,5,17,0 -1911,7,5,0,8,3,1,11,1,2,0,9,8,2,26,1 -1912,4,1,0,0,4,5,4,4,2,1,9,10,1,29,1 -1913,2,3,0,14,4,3,9,0,0,1,2,16,0,7,0 -1914,0,0,0,13,4,3,2,2,2,0,16,9,1,12,0 -1915,2,7,0,3,0,5,2,2,3,1,6,11,3,40,0 -1916,5,4,0,3,4,1,14,1,2,1,13,2,5,17,0 -1917,6,7,0,13,5,2,5,4,1,0,16,5,3,4,1 -1918,6,4,0,6,0,4,2,4,3,0,2,7,4,26,0 -1919,0,2,0,15,2,0,6,1,2,1,0,11,4,21,0 -1920,10,5,0,8,6,5,11,2,0,0,5,11,0,39,0 -1921,4,7,0,10,2,5,5,2,2,0,7,18,4,16,0 -1922,2,0,0,7,3,1,0,1,2,0,8,18,5,30,0 -1923,2,2,0,13,1,4,4,2,0,0,13,11,3,8,0 -1924,2,8,0,1,3,5,11,2,2,1,3,11,5,4,0 
-1925,8,8,0,0,4,2,7,1,4,0,14,14,2,19,1 -1926,6,1,0,12,5,1,0,3,1,1,7,10,1,21,1 -1927,2,7,0,0,6,6,12,3,4,0,13,1,0,4,0 -1928,6,5,0,14,1,2,0,5,0,0,18,14,2,28,1 -1929,7,7,0,12,0,2,4,5,1,1,5,5,0,5,1 -1930,8,2,0,14,6,2,10,5,4,0,5,14,3,32,1 -1931,7,2,0,13,0,5,5,4,2,1,8,9,3,13,0 -1932,3,2,0,14,6,4,9,3,2,1,4,2,3,5,0 -1933,9,5,0,6,2,3,2,1,1,1,3,1,1,23,1 -1934,5,4,0,8,2,2,7,4,1,0,8,15,4,22,0 -1935,1,4,0,14,0,5,11,4,2,1,2,2,1,14,0 -1936,1,3,0,2,0,6,0,4,2,1,8,11,2,34,0 -1937,8,2,0,5,6,0,4,0,1,0,3,20,1,20,1 -1938,5,6,0,13,0,3,7,1,1,1,11,12,2,10,0 -1939,10,7,0,10,1,2,14,3,0,0,4,0,1,12,0 -1940,3,0,0,4,5,2,1,1,2,1,13,6,2,31,0 -1941,3,4,0,5,6,1,4,5,4,0,18,8,4,40,1 -1942,1,8,0,3,0,5,8,0,4,0,8,11,2,28,0 -1943,4,7,0,14,2,3,14,4,2,0,12,20,1,34,0 -1944,0,3,0,0,2,2,6,1,1,0,4,0,3,30,0 -1945,1,7,0,0,5,0,12,4,0,1,6,2,2,41,0 -1946,5,4,0,5,2,0,8,1,4,1,8,16,0,3,0 -1947,2,5,0,3,5,4,6,1,2,1,13,2,0,37,0 -1948,6,5,0,5,2,0,12,4,0,0,13,20,2,24,0 -1949,9,7,0,4,2,6,0,5,1,0,2,10,4,32,1 -1950,2,8,0,6,1,6,2,0,3,1,13,10,4,25,0 -1951,3,2,0,4,2,6,11,1,2,0,13,13,2,0,0 -1952,3,8,0,13,1,3,5,2,0,1,16,18,0,20,0 -1953,10,4,0,4,1,4,12,4,0,1,6,11,4,31,0 -1954,8,1,0,1,6,6,12,4,3,1,5,10,2,40,1 -1955,1,5,0,4,4,3,9,1,4,1,18,8,3,20,1 -1956,3,7,0,4,3,4,9,3,0,0,13,2,3,39,0 -1957,0,7,0,13,3,4,3,1,3,0,17,11,0,33,0 -1958,0,6,0,2,5,0,2,3,0,1,17,2,5,19,0 -1959,0,4,0,8,2,0,8,3,1,1,8,6,0,21,0 -1960,7,6,0,6,4,5,6,0,0,1,13,2,0,3,0 -1961,6,3,0,5,1,0,11,5,2,1,13,20,3,41,1 -1962,1,7,0,13,6,2,6,2,4,0,13,20,4,21,0 -1963,3,5,0,14,4,1,14,1,4,0,14,10,4,32,1 -1964,0,1,0,14,2,4,8,4,2,0,8,9,0,30,0 -1965,6,6,0,1,5,6,7,3,0,0,10,0,0,36,0 -1966,0,2,0,13,5,0,6,4,4,0,5,2,3,12,0 -1967,3,7,0,2,1,3,13,5,4,1,11,7,3,28,1 -1968,1,3,0,4,5,2,5,0,0,0,2,2,4,6,0 -1969,8,2,0,12,1,3,8,2,1,1,0,13,5,33,0 -1970,5,8,0,3,5,1,1,3,3,1,7,5,5,18,1 -1971,10,3,0,10,0,6,10,2,2,0,17,0,2,16,0 -1972,10,0,0,11,0,6,12,5,1,1,15,3,2,17,1 -1973,4,2,0,4,3,2,13,2,1,0,18,16,2,2,1 -1974,2,7,0,8,2,0,5,2,2,0,17,9,4,5,0 -1975,0,2,0,11,0,4,5,2,0,0,2,11,5,7,0 
-1976,8,7,0,15,0,6,6,4,0,0,12,6,2,11,0 -1977,10,0,0,4,4,3,12,2,0,0,8,12,1,24,0 -1978,9,4,0,14,5,4,12,0,0,0,10,12,4,39,0 -1979,6,4,0,13,2,2,11,4,4,1,8,3,4,41,0 -1980,9,6,0,11,1,0,8,4,4,0,15,9,4,39,0 -1981,10,7,0,1,4,2,3,3,1,1,2,2,4,36,0 -1982,0,5,0,5,4,1,0,4,1,0,8,1,5,19,0 -1983,7,8,0,6,3,5,10,4,1,1,14,7,0,6,1 -1984,6,7,0,11,0,2,9,2,1,0,8,13,5,8,0 -1985,5,8,0,2,6,5,6,0,2,1,3,9,5,25,1 -1986,7,6,0,7,0,5,9,3,3,1,6,2,5,4,0 -1987,7,6,0,9,6,5,3,4,1,1,2,11,0,39,0 -1988,6,6,0,14,2,0,7,3,0,0,8,15,3,4,0 -1989,0,7,0,12,0,4,1,3,1,1,4,0,1,38,0 -1990,7,1,0,2,0,3,9,4,0,1,17,0,1,16,0 -1991,7,4,0,11,0,0,0,0,4,0,2,20,5,35,0 -1992,2,2,0,10,5,4,6,4,1,0,15,19,5,17,0 -1993,3,1,0,3,2,3,11,5,2,1,0,19,1,10,1 -1994,9,1,0,11,0,5,1,5,4,0,5,16,5,32,1 -1995,1,3,0,11,2,2,0,2,1,0,2,15,3,30,0 -1996,5,4,0,0,1,6,3,4,1,1,7,19,3,22,1 -1997,1,6,0,13,4,0,5,5,0,1,3,19,1,24,0 -1998,1,0,0,3,0,0,1,3,0,0,7,2,0,3,0 -1999,6,2,0,3,3,6,13,3,0,1,2,13,2,38,0 -2000,6,0,0,10,0,2,2,0,2,1,13,2,4,8,0 -2001,5,5,0,3,4,2,10,1,0,0,0,0,3,22,0 -2002,9,7,0,13,2,0,5,1,1,1,4,12,2,29,0 -2003,5,2,0,3,2,2,0,5,1,0,18,7,3,20,1 -2004,2,8,0,9,2,6,9,3,3,1,8,4,0,12,0 -2005,2,2,0,0,4,0,6,3,1,1,4,6,0,34,0 -2006,4,6,0,0,6,4,6,1,0,0,8,4,1,8,0 -2007,8,0,0,6,0,1,2,4,4,0,8,11,5,28,0 -2008,0,5,0,3,5,2,11,1,4,0,6,20,0,19,0 -2009,1,7,0,7,0,2,9,1,1,1,6,11,1,19,0 -2010,1,7,0,14,3,0,14,4,2,0,13,11,1,2,0 -2011,10,7,0,14,4,0,11,3,0,0,8,16,0,12,0 -2012,1,6,0,5,5,3,8,2,4,0,0,11,0,6,0 -2013,2,6,0,9,0,0,1,0,2,0,13,11,0,5,0 -2014,5,1,0,15,2,0,2,3,0,0,13,18,0,23,0 -2015,3,1,0,13,3,3,9,0,1,1,18,10,1,7,1 -2016,0,4,0,11,0,2,14,4,4,0,12,6,1,40,0 -2017,10,6,0,9,0,2,14,5,4,1,18,5,1,32,1 -2018,0,6,0,10,4,0,7,1,2,1,2,9,2,25,0 -2019,7,5,0,9,6,6,14,0,3,0,9,7,1,10,1 -2020,7,6,0,3,0,3,0,3,1,0,8,2,1,19,0 -2021,8,3,0,0,0,4,12,3,2,1,2,2,0,35,0 -2022,2,7,0,3,6,5,11,2,4,1,10,9,3,7,0 -2023,1,2,0,2,0,2,7,2,0,0,8,15,2,20,0 -2024,3,7,0,1,5,2,3,4,0,1,8,2,1,6,0 -2025,8,4,0,1,1,5,12,5,0,1,5,13,0,0,0 -2026,0,5,0,7,0,4,12,4,0,1,2,19,0,21,0 -2027,1,5,0,13,2,2,9,0,4,1,2,18,3,29,0 
-2028,0,4,0,7,4,3,1,2,4,0,4,2,1,6,0 -2029,9,6,0,8,2,2,9,1,3,1,10,2,4,24,1 -2030,4,8,0,3,6,3,5,0,2,0,2,13,5,25,0 -2031,5,2,0,2,5,3,6,0,3,0,7,17,4,35,1 -2032,10,4,0,5,2,4,7,3,0,1,12,11,5,25,0 -2033,4,6,0,1,4,0,4,3,3,0,6,2,0,34,0 -2034,9,8,0,7,5,6,2,4,3,0,1,20,1,17,0 -2035,1,6,0,6,0,0,9,2,0,0,4,6,0,29,0 -2036,0,3,0,13,2,1,8,1,4,1,7,2,2,36,0 -2037,5,8,0,12,1,1,9,5,4,1,2,2,5,31,0 -2038,6,6,0,9,6,1,4,0,3,1,13,0,0,18,0 -2039,6,7,0,5,1,4,1,4,0,1,16,8,4,20,0 -2040,7,8,0,7,1,1,6,5,4,0,18,12,5,29,1 -2041,2,8,0,0,6,3,3,1,0,0,14,11,4,5,0 -2042,9,6,0,10,5,6,9,2,3,0,7,10,4,16,1 -2043,9,6,0,10,1,5,2,1,0,0,3,10,2,35,1 -2044,0,2,0,9,6,0,0,4,2,0,8,0,0,30,0 -2045,8,6,0,10,4,3,2,0,2,1,18,14,0,25,1 -2046,3,4,0,15,0,0,9,4,0,1,18,5,3,17,1 -2047,4,7,0,11,3,6,8,3,0,0,13,0,0,27,0 -2048,0,1,0,3,1,1,9,0,1,1,5,2,1,32,0 -2049,9,6,0,2,6,2,4,1,2,1,5,10,1,18,1 -2050,1,1,0,13,3,5,1,0,2,1,7,18,3,11,1 -2051,4,2,0,14,5,5,11,2,1,0,8,15,0,35,0 -2052,0,0,0,12,6,5,8,2,0,1,0,2,4,27,0 -2053,1,2,0,0,3,3,12,2,2,0,13,4,1,33,0 -2054,3,6,0,10,0,2,11,1,0,0,8,6,5,37,0 -2055,6,3,0,10,1,0,3,3,0,1,6,12,0,14,0 -2056,9,0,0,11,1,3,6,4,0,0,18,12,2,2,1 -2057,1,5,0,1,6,3,11,3,3,0,9,4,4,24,0 -2058,2,8,0,13,2,5,9,2,0,1,2,9,0,24,0 -2059,10,8,0,13,4,5,9,5,2,0,13,20,0,27,0 -2060,7,8,0,3,4,3,4,0,3,1,17,2,4,26,0 -2061,3,7,0,5,2,4,6,4,4,0,4,11,0,6,0 -2062,5,6,0,13,1,4,0,3,2,0,13,8,2,0,0 -2063,2,5,0,4,6,5,8,3,2,1,13,11,5,38,0 -2064,3,4,0,9,5,0,3,1,2,1,7,20,0,0,0 -2065,6,0,0,11,3,4,3,4,2,1,13,6,1,23,0 -2066,4,1,0,5,6,0,6,4,4,1,13,18,0,28,0 -2067,7,5,0,5,6,2,9,4,3,1,2,16,4,34,1 -2068,4,7,0,5,5,6,0,3,0,0,13,10,2,10,0 -2069,3,4,0,4,1,0,9,4,1,0,11,11,2,29,0 -2070,0,0,0,7,4,5,0,0,4,1,2,6,1,38,0 -2071,0,6,0,10,1,5,5,2,4,1,4,6,5,4,0 -2072,8,4,0,0,0,5,5,4,0,0,13,11,3,3,0 -2073,10,3,0,2,0,6,5,3,2,1,0,0,4,27,0 -2074,2,5,0,2,6,5,6,0,2,0,8,2,0,7,0 -2075,7,7,0,2,0,4,11,4,2,0,8,2,0,11,0 -2076,0,0,0,6,5,3,11,2,3,0,2,13,1,31,0 -2077,9,6,0,10,0,4,8,3,4,0,8,6,3,30,0 -2078,4,3,0,14,1,1,10,0,0,0,18,7,2,14,1 -2079,0,6,0,6,0,5,0,0,1,0,10,12,4,21,0 
-2080,1,2,0,12,4,6,7,5,2,1,2,2,0,33,0 -2081,1,4,0,1,4,6,10,0,0,0,2,13,0,16,0 -2082,8,7,0,8,6,6,3,3,0,0,2,15,0,5,0 -2083,0,2,0,6,5,0,0,0,1,0,10,20,1,8,0 -2084,0,4,0,7,6,5,0,3,0,0,6,11,0,4,0 -2085,0,8,0,5,5,0,13,3,0,1,2,2,1,41,0 -2086,10,0,0,10,4,1,3,3,0,0,4,2,5,27,0 -2087,4,8,0,11,1,4,7,0,1,0,12,2,1,31,0 -2088,2,3,0,2,2,5,10,0,2,0,8,6,5,40,0 -2089,0,0,0,13,3,5,2,1,2,0,11,15,1,20,0 -2090,10,7,0,3,5,0,7,5,1,1,15,9,1,7,0 -2091,10,4,0,4,1,0,10,2,4,0,0,8,1,5,0 -2092,2,8,0,4,0,6,0,4,2,1,11,15,0,3,0 -2093,8,6,0,2,4,0,2,0,4,1,18,5,0,7,1 -2094,2,3,0,6,2,4,2,1,0,1,6,5,0,40,0 -2095,0,8,0,1,0,6,6,0,2,1,17,11,1,29,0 -2096,2,5,0,11,3,0,14,4,3,0,2,11,5,14,0 -2097,5,5,0,8,0,0,2,3,1,1,6,10,3,27,0 -2098,1,6,0,7,5,1,13,3,2,1,0,6,0,25,0 -2099,0,1,0,5,2,2,5,3,0,1,2,15,3,32,0 -2100,7,0,0,5,2,4,14,4,1,1,2,2,4,5,0 -2101,3,0,0,15,0,5,6,4,4,0,15,15,2,30,0 -2102,0,6,0,10,0,6,0,2,2,0,8,18,4,14,0 -2103,2,1,0,13,1,6,6,1,1,0,17,16,4,3,0 -2104,3,5,0,1,2,0,10,1,0,1,2,16,2,36,0 -2105,6,2,0,8,6,6,10,1,4,1,18,16,1,2,1 -2106,1,4,0,0,0,4,9,1,2,0,8,13,5,5,0 -2107,5,5,0,1,5,4,7,0,4,0,17,0,3,12,0 -2108,8,7,0,15,2,3,6,3,1,1,9,8,5,26,1 -2109,3,5,0,1,4,4,2,1,3,1,15,2,1,9,0 -2110,8,6,0,11,2,2,8,3,3,0,1,7,0,22,1 -2111,3,6,0,14,1,0,10,0,4,1,16,12,0,20,1 -2112,1,2,0,4,0,5,14,5,2,0,2,4,1,41,0 -2113,2,7,0,1,2,0,6,2,0,0,10,6,1,3,0 -2114,6,8,0,1,0,6,3,4,4,0,13,2,3,37,0 -2115,5,3,0,5,2,4,9,4,3,0,13,18,4,17,0 -2116,8,8,0,9,5,5,14,5,2,0,2,1,3,12,1 -2117,2,8,0,2,1,1,3,0,4,1,9,8,5,22,1 -2118,3,8,0,11,2,4,7,3,3,0,17,1,5,14,0 -2119,9,1,0,7,1,5,6,1,0,0,13,6,2,23,0 -2120,9,3,0,10,5,1,5,1,0,1,8,15,3,20,0 -2121,2,8,0,5,0,2,4,0,2,0,17,9,3,3,0 -2122,2,0,0,7,1,6,0,0,4,0,0,6,4,40,0 -2123,7,5,0,10,3,5,4,5,1,1,11,18,3,26,1 -2124,2,8,0,3,5,5,8,1,1,0,8,13,0,3,0 -2125,9,5,0,10,1,2,2,2,0,1,6,20,1,21,0 -2126,3,5,0,9,6,2,0,4,0,1,8,4,5,1,0 -2127,2,6,0,13,0,0,6,4,3,1,13,2,3,20,0 -2128,4,1,0,14,1,6,6,2,0,1,4,12,5,13,0 -2129,1,6,0,15,5,4,5,1,3,0,13,15,5,40,0 -2130,4,7,0,13,0,6,10,4,0,1,2,11,0,38,0 -2131,2,0,0,8,2,0,9,0,0,1,13,15,4,6,0 
-2132,0,3,0,7,5,1,9,5,3,1,13,2,1,25,0 -2133,8,4,0,13,0,0,0,0,0,1,4,15,3,14,0 -2134,5,4,0,9,0,3,11,3,3,0,13,7,2,6,0 -2135,0,4,0,3,2,4,5,2,2,1,15,10,3,7,0 -2136,7,6,0,5,0,0,5,4,4,0,12,19,4,2,0 -2137,5,1,0,9,4,1,3,2,4,1,18,19,5,17,1 -2138,0,1,0,8,0,5,3,4,1,0,8,15,5,40,0 -2139,7,1,0,7,1,6,9,5,0,0,9,14,5,12,1 -2140,2,8,0,12,1,2,6,1,3,1,2,15,2,38,0 -2141,5,7,0,13,4,2,11,4,1,0,8,14,0,4,0 -2142,7,7,0,10,2,2,3,4,4,1,13,6,3,5,0 -2143,5,4,0,1,5,3,6,5,0,0,2,9,4,19,0 -2144,5,2,0,2,4,1,8,3,1,1,13,18,3,6,0 -2145,1,8,0,14,3,1,6,1,3,1,13,15,5,11,0 -2146,10,2,0,4,1,6,2,4,1,1,2,11,3,16,0 -2147,5,0,0,0,1,3,4,4,0,1,8,6,0,4,0 -2148,4,7,0,5,2,0,4,1,0,0,5,5,1,2,1 -2149,3,2,0,1,5,0,8,3,3,1,10,15,3,31,0 -2150,4,0,0,13,0,0,3,1,0,1,16,11,4,36,0 -2151,10,7,0,3,0,6,0,4,2,1,9,7,1,31,1 -2152,5,1,0,13,4,2,11,5,3,1,16,7,4,28,1 -2153,0,6,0,13,2,6,13,1,1,0,13,2,5,11,0 -2154,8,2,0,2,3,0,5,2,3,1,10,11,4,39,0 -2155,3,4,0,11,0,5,7,4,0,0,13,18,2,6,0 -2156,6,8,0,2,4,4,7,4,2,1,13,15,5,19,0 -2157,8,2,0,15,3,6,5,2,2,0,15,10,1,5,1 -2158,6,6,0,6,4,6,8,5,0,0,4,11,5,8,0 -2159,4,8,0,7,1,0,2,2,4,1,15,14,2,32,0 -2160,1,8,0,12,0,0,8,1,2,1,2,7,1,28,0 -2161,7,5,0,10,4,2,1,5,2,1,9,13,3,21,1 -2162,6,5,0,7,4,0,8,5,1,1,9,8,1,1,1 -2163,6,1,0,12,6,0,5,0,0,1,0,13,4,5,0 -2164,10,5,0,14,3,5,12,5,3,1,12,10,0,2,1 -2165,9,8,0,4,0,3,14,3,0,1,8,11,0,18,0 -2166,2,2,0,4,3,5,9,5,4,1,2,6,0,39,0 -2167,6,0,0,10,0,3,7,2,4,0,8,9,4,24,0 -2168,7,0,0,10,5,6,4,5,4,0,1,8,3,5,1 -2169,8,7,0,14,3,4,13,5,4,1,0,12,0,0,1 -2170,3,3,0,2,0,1,7,1,0,1,8,4,1,14,0 -2171,7,3,0,15,5,3,7,5,1,0,10,4,0,35,0 -2172,4,2,0,9,1,3,7,1,2,1,12,20,0,37,0 -2173,0,8,0,6,5,0,7,1,1,0,13,11,0,25,0 -2174,8,6,0,9,4,4,12,1,1,1,18,18,3,30,1 -2175,8,7,0,10,0,6,4,3,0,1,9,17,2,0,1 -2176,0,3,0,7,0,0,2,4,3,0,13,14,5,33,0 -2177,1,7,0,3,0,0,0,3,2,0,13,0,3,31,0 -2178,3,7,0,15,1,4,12,2,0,0,10,10,0,31,1 -2179,7,3,0,2,6,6,7,0,0,1,18,17,4,2,1 -2180,0,6,0,11,6,1,2,0,0,1,6,17,2,34,0 -2181,0,0,0,4,0,0,6,0,0,1,8,17,0,25,0 -2182,1,6,0,13,0,2,10,4,0,1,4,12,4,32,0 
-2183,3,0,0,0,4,4,6,3,2,1,10,2,2,41,0 -2184,0,7,0,2,1,0,14,2,0,0,13,3,3,16,0 -2185,5,6,0,5,6,6,3,2,3,0,12,12,3,25,0 -2186,10,1,0,7,3,2,6,0,4,1,18,5,4,14,1 -2187,2,4,0,1,0,5,13,2,3,1,2,2,5,6,0 -2188,8,4,0,11,4,2,0,1,3,1,5,5,4,23,1 -2189,10,5,0,6,4,2,4,3,2,1,0,1,4,17,1 -2190,10,2,0,6,5,1,7,2,3,0,15,10,1,0,1 -2191,3,5,0,10,0,5,8,3,1,1,13,9,1,12,0 -2192,6,5,0,8,1,5,8,3,0,1,10,15,0,3,0 -2193,1,5,0,13,0,3,8,4,4,0,4,11,2,30,0 -2194,1,0,0,11,4,4,7,3,4,0,13,2,0,3,0 -2195,1,1,0,10,0,5,8,3,1,0,5,15,2,35,0 -2196,4,4,0,10,5,2,5,4,1,1,8,12,2,28,0 -2197,10,4,0,12,3,2,13,1,0,1,1,14,1,0,1 -2198,2,4,0,4,1,0,3,5,2,1,0,12,5,16,0 -2199,8,7,0,6,3,6,8,5,4,0,14,13,2,35,0 -2200,7,6,0,0,1,0,12,1,0,1,0,11,2,4,0 -2201,2,4,0,6,1,3,1,2,0,1,2,16,1,17,0 -2202,2,8,0,4,0,6,6,1,4,1,13,18,0,17,0 -2203,9,0,0,12,0,3,4,4,0,1,17,2,1,37,0 -2204,10,3,0,4,5,2,8,0,4,0,16,19,3,31,1 -2205,2,2,0,1,4,5,14,3,4,1,2,11,4,11,0 -2206,8,7,0,11,4,1,9,2,4,0,2,19,0,19,0 -2207,9,8,0,14,3,2,7,5,3,1,9,1,3,32,1 -2208,9,4,0,8,4,2,8,0,2,0,2,16,3,0,1 -2209,0,5,0,7,6,0,2,4,0,1,10,3,2,11,1 -2210,2,4,0,5,2,3,5,3,1,0,11,11,1,22,0 -2211,3,6,0,15,5,3,12,4,1,0,6,13,4,24,0 -2212,0,8,0,15,3,1,14,0,3,0,9,7,1,12,1 -2213,1,6,0,4,3,4,1,0,0,0,15,15,2,4,0 -2214,9,5,0,13,5,5,10,4,0,0,2,2,1,10,0 -2215,9,8,0,7,1,5,11,0,2,1,13,11,1,12,0 -2216,1,0,0,4,1,5,4,2,2,1,11,18,3,26,0 -2217,10,6,0,13,0,5,2,4,0,1,2,6,1,4,0 -2218,2,1,0,0,1,5,8,0,1,0,13,13,3,29,0 -2219,6,0,0,5,5,6,5,1,2,0,17,2,1,22,0 -2220,7,5,0,9,6,4,9,1,2,1,2,20,1,31,0 -2221,10,3,0,4,6,3,2,2,4,0,4,16,1,30,0 -2222,4,3,0,13,6,1,10,4,1,0,18,5,4,9,1 -2223,7,3,0,4,3,1,3,5,0,1,7,19,0,32,1 -2224,3,0,0,0,1,2,4,4,4,0,7,10,2,26,1 -2225,10,5,0,13,6,5,8,1,0,0,0,11,5,3,0 -2226,3,5,0,0,5,1,12,0,4,0,4,0,1,9,0 -2227,1,5,0,7,6,2,1,4,1,1,13,2,0,27,0 -2228,8,2,0,13,3,1,5,4,0,1,15,8,0,13,1 -2229,5,2,0,4,3,2,1,5,3,1,5,17,4,30,1 -2230,0,7,0,1,6,1,6,3,4,0,12,2,1,35,0 -2231,0,7,0,5,6,6,11,3,0,1,2,2,2,2,0 -2232,2,6,0,4,1,1,4,5,2,0,14,1,0,19,1 -2233,8,8,0,1,6,1,9,2,0,1,13,0,4,0,0 -2234,1,8,0,3,0,6,3,3,2,1,2,6,3,16,0 
-2235,3,2,0,10,0,6,6,2,0,0,17,6,0,19,0 -2236,7,6,0,1,4,1,14,3,3,1,4,16,1,39,0 -2237,3,3,0,5,6,3,12,0,2,0,4,11,4,28,0 -2238,5,7,0,11,2,0,6,4,0,0,18,6,1,23,0 -2239,7,4,0,3,2,2,12,0,1,1,3,0,2,3,1 -2240,10,2,0,1,1,6,8,3,1,0,0,11,5,35,0 -2241,9,5,0,9,1,0,10,0,2,0,2,10,1,41,0 -2242,1,5,0,7,6,3,7,2,1,0,13,1,2,38,0 -2243,0,3,0,7,6,2,6,5,2,1,3,1,2,27,1 -2244,4,3,0,14,5,2,6,0,4,0,18,17,3,36,1 -2245,8,4,0,14,0,0,9,0,0,0,0,15,2,32,0 -2246,8,1,0,0,6,3,0,3,0,1,13,6,2,18,0 -2247,5,6,0,0,2,4,13,2,2,0,8,18,2,7,0 -2248,6,2,0,6,2,5,3,1,0,0,4,0,5,20,0 -2249,4,7,0,14,0,5,12,5,0,0,5,5,4,36,1 -2250,7,7,0,3,4,3,10,0,0,1,14,5,4,35,1 -2251,3,8,0,8,6,6,4,1,0,0,17,2,2,21,0 -2252,10,1,0,6,1,3,1,1,3,1,8,18,0,31,0 -2253,4,8,0,9,6,0,6,4,2,0,17,19,1,37,0 -2254,1,8,0,0,0,0,6,4,3,1,8,12,3,32,0 -2255,4,1,0,11,3,3,12,3,2,0,13,4,0,29,0 -2256,7,3,0,10,3,2,2,5,2,0,15,7,0,1,1 -2257,3,4,0,2,0,3,5,1,0,0,15,13,5,29,0 -2258,8,5,0,5,1,0,9,1,0,1,2,9,4,0,0 -2259,1,8,0,12,5,0,7,0,2,0,13,18,0,17,0 -2260,1,3,0,1,6,2,7,4,3,0,13,11,3,12,0 -2261,7,8,0,8,0,2,12,5,0,1,16,7,1,24,1 -2262,10,2,0,9,4,0,4,1,3,1,2,15,1,6,0 -2263,2,2,0,11,4,6,13,0,3,0,8,2,0,17,0 -2264,6,4,0,4,0,2,9,5,0,0,1,16,5,21,0 -2265,0,1,0,9,0,3,11,2,4,1,16,18,3,26,0 -2266,0,8,0,1,6,6,12,2,4,0,13,2,0,0,0 -2267,0,2,0,7,6,0,6,2,2,0,17,13,1,12,0 -2268,7,4,0,7,0,4,10,5,1,1,9,5,2,8,1 -2269,2,8,0,7,0,0,5,2,0,0,13,5,1,40,0 -2270,6,6,0,3,1,1,7,0,0,1,8,6,3,23,0 -2271,6,8,0,0,4,2,2,3,4,0,14,7,3,9,1 -2272,0,6,0,13,1,6,9,2,4,0,6,15,0,3,0 -2273,4,7,0,1,5,1,12,3,0,0,13,17,0,35,0 -2274,2,7,0,15,0,4,8,0,3,1,2,4,5,25,0 -2275,5,8,0,0,2,1,10,1,1,0,5,17,3,25,1 -2276,3,8,0,5,0,5,4,4,0,1,8,0,2,40,0 -2277,9,8,0,12,1,3,3,3,1,1,18,17,5,21,1 -2278,6,7,0,7,6,6,11,3,3,1,5,18,5,1,1 -2279,1,7,0,5,4,4,10,5,4,0,6,9,0,0,0 -2280,2,0,0,5,3,0,4,1,0,0,13,11,3,33,0 -2281,8,3,0,9,5,0,9,5,1,0,18,6,5,0,1 -2282,10,8,0,1,0,6,12,4,1,1,4,11,4,38,0 -2283,2,7,0,4,6,1,1,0,4,0,2,0,5,38,0 -2284,7,5,0,7,5,2,8,4,4,0,8,9,4,11,0 -2285,2,1,0,2,0,4,12,1,1,1,4,11,1,26,0 -2286,10,4,0,3,4,5,11,5,3,1,18,17,5,25,1 
-2287,4,7,0,11,1,3,5,0,2,1,17,18,0,38,0 -2288,3,2,0,3,6,4,7,4,2,1,13,20,4,18,0 -2289,2,0,0,3,3,3,9,3,3,0,13,15,2,16,0 -2290,8,6,0,6,0,4,5,2,2,0,13,0,1,31,0 -2291,10,5,0,12,4,3,8,1,0,0,17,11,4,8,0 -2292,10,5,0,15,4,0,2,0,3,1,5,20,3,5,1 -2293,10,1,0,11,3,1,3,0,0,1,14,10,4,8,1 -2294,5,6,0,13,0,0,3,3,3,1,15,2,2,25,0 -2295,0,2,0,7,4,5,0,3,0,1,7,6,0,27,0 -2296,4,1,0,1,2,1,14,1,0,1,7,5,0,35,1 -2297,2,5,0,8,3,3,0,2,4,1,5,1,4,0,1 -2298,10,0,0,10,2,4,12,4,4,1,18,7,0,39,1 -2299,4,2,0,14,3,1,0,2,0,0,2,6,3,6,0 -2300,5,1,0,15,2,0,4,3,0,0,12,0,5,30,0 -2301,10,5,0,6,0,5,11,0,4,1,4,2,0,33,0 -2302,9,4,0,6,2,4,0,1,3,0,4,0,0,16,0 -2303,8,7,0,9,0,4,4,2,0,1,9,8,0,30,0 -2304,3,6,0,9,4,0,9,1,1,1,2,11,0,17,0 -2305,10,0,0,2,0,6,3,2,0,1,2,2,2,3,0 -2306,10,3,0,3,5,4,3,2,0,0,4,19,3,37,0 -2307,3,5,0,1,2,2,13,0,1,1,18,18,3,28,1 -2308,9,2,0,9,5,6,10,4,3,0,1,7,4,10,1 -2309,8,6,0,0,1,1,0,1,1,1,2,17,5,29,1 -2310,8,5,0,3,6,0,7,2,3,1,8,9,2,35,0 -2311,0,4,0,5,0,0,13,0,2,0,8,5,1,3,0 -2312,8,5,0,0,6,5,11,3,2,1,0,0,1,41,0 -2313,0,0,0,1,2,4,3,1,1,1,13,11,1,32,0 -2314,7,3,0,3,6,0,8,1,2,1,1,10,5,23,1 -2315,8,4,0,11,0,0,1,5,4,1,16,8,4,14,1 -2316,2,6,0,10,1,0,7,1,2,1,6,19,0,18,0 -2317,0,1,0,6,4,3,13,2,3,1,13,20,1,37,0 -2318,7,3,0,13,5,0,7,2,0,1,2,10,2,32,0 -2319,1,7,0,13,0,5,10,0,1,1,10,14,1,1,0 -2320,4,8,0,11,2,4,8,0,0,0,13,12,3,30,0 -2321,1,2,0,5,0,3,11,4,0,1,11,2,4,17,0 -2322,1,2,0,7,1,3,7,3,0,1,3,1,1,37,1 -2323,1,4,0,10,6,5,1,0,2,1,2,6,2,41,0 -2324,4,8,0,5,0,5,0,1,1,1,13,14,1,24,0 -2325,10,0,0,1,0,5,1,0,0,1,8,20,5,1,0 -2326,1,2,0,10,5,1,3,5,1,1,8,18,5,38,1 -2327,5,0,0,6,6,2,3,5,0,0,3,10,2,18,1 -2328,3,1,0,5,5,6,5,3,0,0,2,6,2,38,0 -2329,4,0,0,11,3,1,5,5,1,1,5,0,3,8,1 -2330,7,6,0,8,2,2,8,5,2,0,9,10,1,12,1 -2331,3,7,0,1,0,3,2,1,0,1,8,6,0,37,0 -2332,7,4,0,1,3,2,10,5,3,1,5,14,5,8,1 -2333,3,5,0,5,0,0,1,0,3,0,2,15,2,2,0 -2334,5,1,0,8,5,2,4,0,0,1,8,16,0,33,0 -2335,4,4,0,2,2,0,11,3,0,0,8,11,3,7,0 -2336,6,8,0,1,1,4,0,1,0,0,17,10,1,22,0 -2337,5,2,0,4,1,5,14,2,4,1,8,18,0,7,0 -2338,8,0,0,10,0,3,8,3,2,1,15,11,2,29,0 
-2339,9,6,0,9,3,4,13,5,3,0,18,17,2,9,1 -2340,10,7,0,7,6,5,6,4,2,1,0,11,3,4,0 -2341,9,1,0,7,0,0,11,5,2,1,5,9,4,3,1 -2342,1,5,0,11,6,5,7,3,2,0,6,6,5,29,0 -2343,4,0,0,1,1,0,6,4,3,1,10,17,3,27,1 -2344,3,8,0,0,6,5,3,2,2,1,6,20,5,23,0 -2345,0,2,0,3,6,4,9,0,2,0,13,14,5,25,0 -2346,10,8,0,0,5,4,13,1,3,1,13,2,1,20,0 -2347,4,7,0,5,0,5,13,4,0,0,2,13,3,1,0 -2348,2,7,0,11,1,4,8,2,4,1,0,11,2,0,0 -2349,2,6,0,11,6,5,7,1,0,1,13,4,0,26,0 -2350,6,8,0,6,5,0,1,2,4,0,15,20,5,35,0 -2351,0,6,0,5,1,2,3,5,3,0,14,13,5,10,1 -2352,2,7,0,14,4,0,5,2,4,1,8,15,0,9,0 -2353,9,1,0,8,4,2,13,5,2,1,18,7,0,2,1 -2354,10,1,0,12,1,4,7,2,2,0,13,18,0,35,0 -2355,1,6,0,1,0,0,9,0,2,0,15,11,0,29,0 -2356,10,3,0,9,1,1,3,3,1,1,11,17,3,40,1 -2357,2,2,0,5,2,5,10,3,4,0,6,13,1,9,0 -2358,3,3,0,0,5,5,6,4,2,1,15,2,0,28,0 -2359,0,8,0,3,2,5,8,3,1,0,8,2,1,38,0 -2360,2,6,0,2,0,6,3,3,4,0,8,11,5,12,0 -2361,1,6,0,6,1,5,2,0,1,1,6,16,2,10,0 -2362,6,8,0,9,6,0,2,5,3,1,1,17,3,30,1 -2363,0,6,0,0,6,6,7,5,1,1,18,17,0,32,1 -2364,6,3,0,0,1,5,0,3,2,0,1,7,4,26,0 -2365,10,7,0,10,0,2,1,1,4,1,1,4,5,28,0 -2366,6,8,0,10,2,5,10,3,4,0,4,15,2,16,0 -2367,10,7,0,6,0,3,1,1,4,1,0,13,3,1,0 -2368,2,5,0,13,4,6,14,3,4,1,11,0,0,3,0 -2369,9,1,0,7,1,6,12,3,4,1,10,16,0,3,0 -2370,2,1,0,12,3,3,10,0,2,0,8,15,4,5,0 -2371,1,7,0,15,5,1,6,5,3,0,13,2,5,13,0 -2372,0,6,0,3,6,4,7,5,0,0,17,2,5,17,0 -2373,10,8,0,3,0,6,14,3,2,0,8,0,2,12,0 -2374,2,1,0,3,2,2,6,4,0,0,4,2,0,5,0 -2375,7,5,0,8,2,5,7,4,3,1,17,6,2,0,0 -2376,3,6,0,5,4,1,6,1,1,1,2,11,0,21,0 -2377,3,1,0,15,3,2,14,5,4,0,9,10,1,22,1 -2378,2,5,0,7,1,5,14,1,1,0,13,13,0,30,0 -2379,0,3,0,13,2,0,8,1,2,1,4,20,1,8,0 -2380,6,5,0,3,2,0,12,0,2,0,13,11,1,2,0 -2381,9,1,0,4,2,5,10,5,1,1,9,5,2,36,1 -2382,4,5,0,5,6,3,5,2,1,0,2,8,2,37,0 -2383,2,8,0,3,1,0,9,3,0,0,2,5,2,27,0 -2384,1,0,0,4,6,3,1,0,1,1,0,11,0,18,0 -2385,0,6,0,10,5,1,6,3,0,1,2,11,2,5,0 -2386,1,5,0,12,2,0,2,1,0,1,6,7,4,25,1 -2387,3,0,0,13,1,1,9,3,0,1,18,4,0,5,0 -2388,1,2,0,11,4,4,2,3,3,0,11,2,0,10,0 -2389,1,8,0,0,2,4,4,5,0,1,0,18,5,35,0 -2390,4,8,0,0,0,2,3,1,2,0,14,14,4,37,1 
-2391,2,3,0,5,2,1,6,4,4,1,0,18,3,25,0 -2392,0,0,0,2,2,5,9,1,3,0,0,2,4,20,0 -2393,7,3,0,0,1,3,10,4,3,0,10,13,3,30,0 -2394,5,6,0,5,6,1,9,3,2,0,14,5,2,8,0 -2395,0,2,0,1,5,0,6,3,2,1,13,0,0,39,0 -2396,3,5,0,4,0,2,3,4,3,0,18,8,5,12,1 -2397,5,7,0,9,6,0,4,5,1,1,11,6,3,14,0 -2398,7,4,0,8,1,1,11,4,0,1,13,19,1,21,0 -2399,4,6,0,9,3,4,7,2,2,1,17,11,0,32,0 -2400,5,3,0,15,2,0,7,4,3,0,10,2,2,21,0 -2401,7,2,0,10,0,1,10,5,1,1,18,7,5,10,1 -2402,3,6,0,5,4,0,8,5,0,0,11,10,4,40,1 -2403,2,4,0,2,5,2,5,3,0,1,4,4,3,4,0 -2404,9,4,0,9,6,2,3,0,1,0,18,11,0,7,1 -2405,2,6,0,10,3,5,2,1,0,1,13,4,2,13,0 -2406,7,4,0,11,2,5,8,1,2,0,2,13,5,1,0 -2407,2,8,0,13,0,4,5,3,1,0,12,6,4,7,0 -2408,6,7,0,7,0,6,7,4,0,0,13,12,2,3,0 -2409,2,1,0,5,0,6,6,2,0,1,13,6,1,6,0 -2410,1,7,0,2,1,1,10,4,1,1,0,6,3,32,0 -2411,3,4,0,1,5,5,4,5,0,0,2,14,3,39,0 -2412,5,8,0,13,3,5,0,4,4,1,8,8,5,9,0 -2413,0,8,0,5,5,0,10,1,2,0,2,16,5,34,0 -2414,7,7,0,9,2,5,10,1,0,0,4,2,1,5,0 -2415,1,0,0,6,4,5,6,1,0,0,6,4,1,12,0 -2416,7,0,0,6,1,4,3,5,3,0,9,7,4,10,1 -2417,1,6,0,6,4,4,4,0,3,0,13,2,4,10,0 -2418,10,1,0,9,6,2,5,1,3,1,14,20,4,41,1 -2419,4,1,0,10,3,5,13,2,3,0,7,16,2,40,1 -2420,9,0,0,0,2,0,5,4,0,1,12,18,1,3,0 -2421,4,8,0,6,5,0,7,1,2,0,12,2,1,3,0 -2422,9,0,0,15,3,1,6,5,2,0,9,7,4,28,1 -2423,4,7,0,5,1,0,4,4,4,0,14,11,3,33,0 -2424,4,2,0,10,2,5,9,4,4,1,3,4,0,22,0 -2425,7,0,0,10,1,0,2,4,1,0,9,17,4,19,1 -2426,10,6,0,4,0,6,14,3,2,1,13,11,3,5,0 -2427,3,3,0,10,5,4,11,5,2,1,9,10,3,12,1 -2428,1,0,0,6,0,5,13,5,2,1,17,11,2,3,0 -2429,1,1,0,3,0,0,8,3,0,0,12,7,3,10,0 -2430,0,8,0,1,4,0,1,5,2,0,2,9,3,10,0 -2431,8,8,0,5,5,4,6,1,0,0,4,2,2,3,0 -2432,3,2,0,4,1,0,14,2,2,0,9,8,4,0,0 -2433,0,2,0,4,0,4,6,3,3,1,8,11,3,7,0 -2434,8,4,0,12,5,2,5,5,0,0,18,7,4,30,1 -2435,1,6,0,10,6,0,1,3,3,0,6,18,0,7,0 -2436,9,3,0,0,6,3,4,2,1,1,5,10,2,31,1 -2437,2,6,0,13,4,5,3,5,1,1,2,6,2,13,0 -2438,2,8,0,8,1,3,8,1,3,0,2,9,1,27,0 -2439,10,5,0,3,0,5,5,3,0,0,10,18,4,34,0 -2440,10,1,0,3,0,1,2,0,1,1,6,2,4,24,0 -2441,7,2,0,0,6,0,7,1,1,1,17,11,5,39,0 -2442,4,2,0,11,1,6,4,4,0,0,5,5,2,22,1 
-2443,0,4,0,6,3,6,11,0,0,0,2,6,0,19,0 -2444,4,0,0,14,6,1,5,3,0,1,11,13,4,1,0 -2445,9,0,0,11,5,1,3,2,3,0,14,2,5,11,0 -2446,4,7,0,12,0,0,14,2,3,0,2,13,4,32,0 -2447,4,1,0,10,3,2,12,5,4,0,11,12,3,34,1 -2448,1,4,0,14,1,5,1,5,0,1,13,2,0,20,0 -2449,9,6,0,5,5,3,2,0,2,1,4,6,2,32,0 -2450,0,6,0,6,0,6,12,0,1,1,4,6,0,35,0 -2451,6,0,0,11,0,6,9,1,4,0,15,18,4,39,0 -2452,6,1,0,14,4,1,9,5,4,1,11,12,4,35,1 -2453,4,2,0,4,0,5,12,2,3,1,4,13,0,40,0 -2454,0,3,0,10,2,3,0,3,0,1,15,2,0,16,0 -2455,2,4,0,1,5,4,2,3,0,1,12,9,0,8,0 -2456,6,8,0,0,4,5,5,2,1,1,7,7,3,17,0 -2457,1,8,0,0,2,2,8,4,0,0,18,11,5,24,0 -2458,5,7,0,1,6,4,9,0,0,1,2,11,1,1,0 -2459,0,1,0,3,3,5,6,1,0,0,5,13,0,5,0 -2460,3,5,0,2,5,1,0,3,0,0,8,7,1,31,0 -2461,0,1,0,12,0,5,9,4,0,1,0,8,4,1,0 -2462,8,6,0,8,6,3,0,0,1,1,13,9,5,6,0 -2463,5,6,0,8,0,0,7,5,3,0,9,14,1,18,1 -2464,1,7,0,1,5,4,8,0,3,0,4,15,4,14,0 -2465,4,4,0,15,0,0,3,0,1,0,6,4,1,11,0 -2466,0,4,0,4,5,0,1,0,0,0,4,15,0,40,0 -2467,3,2,0,8,1,1,11,3,0,1,4,15,2,17,0 -2468,7,0,0,4,6,5,7,1,0,1,3,3,4,16,0 -2469,0,0,0,6,5,2,1,1,0,1,2,2,4,16,0 -2470,0,7,0,5,3,2,10,1,2,0,6,6,4,32,0 -2471,0,8,0,14,5,6,1,4,4,0,3,14,4,18,0 -2472,0,8,0,6,6,0,11,0,3,0,2,13,5,23,0 -2473,0,7,0,14,1,4,7,4,4,0,8,0,0,35,0 -2474,10,7,0,4,3,3,2,0,2,0,9,10,4,12,1 -2475,4,8,0,11,1,1,1,3,3,1,13,15,1,4,0 -2476,8,7,0,2,5,5,11,1,3,1,8,2,0,25,0 -2477,9,4,0,9,2,0,5,4,2,1,13,3,0,8,0 -2478,0,1,0,4,6,5,1,3,2,0,9,12,0,3,0 -2479,8,2,0,6,6,5,10,5,3,1,7,1,3,23,1 -2480,0,6,0,4,4,2,7,1,0,1,3,18,0,38,0 -2481,6,1,0,1,1,6,5,0,1,0,13,6,2,3,0 -2482,0,8,0,6,5,1,5,5,4,0,1,6,1,38,0 -2483,8,1,0,11,3,2,7,1,0,0,18,14,2,9,1 -2484,8,6,0,3,5,1,14,2,0,1,17,15,3,0,0 -2485,9,3,0,12,2,5,9,0,3,1,11,7,0,32,1 -2486,1,4,0,6,1,0,11,4,1,0,2,11,0,25,0 -2487,10,8,0,9,3,0,3,1,4,1,12,15,4,6,0 -2488,4,6,0,11,1,3,12,3,3,0,17,2,3,5,0 -2489,4,1,0,2,1,2,6,5,1,1,13,0,5,6,0 -2490,0,7,0,6,0,3,0,1,1,0,12,2,1,41,0 -2491,3,3,0,13,2,0,12,2,0,0,0,19,2,5,0 -2492,3,7,0,15,2,5,5,4,4,0,9,13,2,31,0 -2493,1,0,0,5,1,3,7,1,2,0,13,12,2,30,0 -2494,1,4,0,13,3,0,0,2,4,0,17,12,2,0,0 
-2495,9,6,0,2,2,6,12,1,1,1,18,18,4,33,1 -2496,0,2,0,15,0,0,7,4,1,0,4,11,0,3,0 -2497,10,7,0,6,2,5,5,2,0,1,0,2,3,24,0 -2498,3,4,0,14,2,2,7,5,1,1,5,1,4,14,1 -2499,7,5,0,3,4,3,6,1,0,1,2,18,4,34,0 -2500,0,5,0,9,2,5,10,2,3,0,13,6,3,34,0 -2501,0,7,0,5,0,0,6,1,2,0,15,15,3,23,0 -2502,4,2,0,14,0,5,8,2,1,0,17,11,0,7,0 -2503,3,4,0,15,2,6,13,0,2,0,5,3,5,22,0 -2504,4,2,0,3,2,5,3,4,1,1,2,13,5,40,0 -2505,7,1,0,2,2,3,2,5,1,1,3,10,1,34,1 -2506,3,6,0,13,0,5,4,1,2,1,2,15,3,16,0 -2507,5,6,0,4,2,5,5,4,3,1,8,0,2,11,0 -2508,9,1,0,10,5,3,7,1,0,0,7,14,1,14,0 -2509,10,6,0,9,0,4,6,1,3,0,6,6,2,12,0 -2510,10,3,0,0,0,4,6,3,0,0,0,12,1,30,0 -2511,4,6,0,13,5,3,2,3,2,1,17,15,5,11,0 -2512,9,0,0,11,0,1,3,1,2,1,6,11,0,25,0 -2513,9,0,0,11,1,1,4,5,4,1,10,7,3,29,1 -2514,4,5,0,6,0,0,1,1,0,0,9,10,3,32,1 -2515,0,8,0,11,1,4,3,3,3,1,12,19,4,23,0 -2516,10,6,0,13,3,1,6,0,4,1,8,13,1,33,0 -2517,0,2,0,7,6,5,5,3,0,1,10,0,0,25,0 -2518,3,5,0,12,0,6,12,2,2,0,2,0,0,16,0 -2519,9,6,0,0,2,6,0,3,2,1,6,15,3,1,0 -2520,6,6,0,6,4,0,7,4,4,0,2,2,0,8,0 -2521,3,6,0,13,2,4,1,4,1,1,13,3,2,28,0 -2522,7,7,0,0,2,5,8,2,3,0,4,0,0,4,0 -2523,8,1,0,9,4,4,2,1,3,0,13,15,2,4,0 -2524,4,5,0,5,0,3,13,2,4,0,8,18,0,36,0 -2525,3,8,0,6,4,3,3,1,2,1,2,18,0,13,0 -2526,4,1,0,14,0,1,11,3,3,1,9,17,3,5,1 -2527,3,6,0,9,5,2,0,0,1,1,4,10,3,35,1 -2528,10,8,0,13,0,1,2,0,3,0,1,4,2,31,1 -2529,3,8,0,15,4,2,7,4,0,0,4,15,1,8,0 -2530,9,6,0,15,2,0,4,4,0,0,13,16,0,33,0 -2531,3,7,0,15,3,5,8,4,4,0,10,20,5,3,0 -2532,2,1,0,1,1,4,3,5,1,1,14,0,1,0,1 -2533,7,8,0,12,0,5,4,3,3,0,8,2,1,5,0 -2534,1,1,0,11,5,0,8,1,3,1,5,2,5,19,0 -2535,10,8,0,8,0,6,5,4,2,1,18,14,2,37,1 -2536,9,6,0,1,1,6,11,1,0,1,2,11,3,6,0 -2537,1,3,0,15,0,4,9,0,3,1,2,19,0,23,0 -2538,4,3,0,13,1,2,11,1,3,0,13,11,1,23,0 -2539,7,8,0,7,3,2,3,3,0,0,1,7,5,23,1 -2540,1,0,0,6,5,3,5,4,2,0,8,7,0,8,0 -2541,2,8,0,14,4,1,6,0,0,1,4,4,1,32,0 -2542,0,8,0,12,4,0,9,2,0,1,8,3,0,41,0 -2543,6,7,0,13,1,5,8,2,2,1,8,11,3,13,0 -2544,0,7,0,0,2,0,8,3,4,0,1,2,2,21,0 -2545,1,4,0,9,5,1,3,0,0,1,1,10,1,39,1 -2546,0,1,0,5,0,4,9,5,4,0,13,0,2,14,0 
-2547,4,1,0,9,0,6,11,0,0,1,1,12,4,5,1 -2548,0,3,0,9,5,0,12,4,2,0,2,11,2,27,0 -2549,8,6,0,6,3,5,12,2,0,1,15,11,2,13,0 -2550,6,6,0,9,4,6,2,5,2,1,8,2,0,33,0 -2551,3,2,0,3,4,4,6,1,1,1,18,10,5,31,0 -2552,3,0,0,3,2,6,14,3,1,0,18,8,1,27,1 -2553,0,8,0,15,3,5,9,4,3,1,6,2,2,29,0 -2554,3,7,0,0,5,2,3,2,2,0,10,18,1,25,0 -2555,0,1,0,11,4,4,13,2,4,0,5,0,3,30,0 -2556,9,1,0,6,1,3,5,5,1,0,13,2,2,7,0 -2557,1,4,0,14,1,3,3,3,2,1,18,18,1,7,0 -2558,1,6,0,0,1,3,3,0,3,0,6,4,1,20,0 -2559,3,3,0,9,1,5,6,1,1,0,17,7,1,18,0 -2560,9,5,0,1,0,1,9,4,4,0,2,3,0,27,0 -2561,0,4,0,15,4,3,9,0,0,0,4,11,0,32,0 -2562,4,1,0,6,0,0,0,0,4,0,17,2,2,28,0 -2563,1,8,0,2,5,0,5,3,4,0,10,20,5,24,0 -2564,3,8,0,11,1,5,14,1,4,0,8,13,3,3,0 -2565,4,8,0,11,2,1,3,1,1,1,14,18,4,0,0 -2566,0,1,0,8,2,5,8,2,3,1,2,1,0,28,0 -2567,0,4,0,5,5,4,4,5,0,1,2,0,0,18,0 -2568,4,6,0,3,0,3,13,2,1,0,6,15,1,3,0 -2569,3,8,0,5,0,2,11,0,4,0,2,15,5,40,0 -2570,3,3,0,12,2,4,1,0,3,0,13,16,2,25,0 -2571,0,0,0,14,6,0,7,2,3,1,13,16,3,13,0 -2572,3,1,0,0,6,2,3,2,2,0,7,5,5,8,1 -2573,6,1,0,0,1,6,9,3,1,0,0,11,2,5,0 -2574,6,1,0,1,2,2,9,0,4,0,18,10,4,31,1 -2575,3,2,0,13,3,3,6,2,3,0,8,4,4,35,0 -2576,3,6,0,11,4,6,14,3,3,1,13,9,2,10,0 -2577,2,1,0,7,4,0,3,1,2,0,3,7,0,35,1 -2578,1,7,0,4,6,6,12,5,3,0,12,9,0,13,0 -2579,5,6,0,7,3,3,12,0,1,0,7,10,4,8,1 -2580,2,7,0,9,0,3,12,3,2,0,8,2,2,29,0 -2581,4,2,0,0,6,1,5,3,0,1,7,3,2,30,1 -2582,5,8,0,6,2,6,11,4,0,0,13,11,2,36,0 -2583,6,1,0,3,1,0,14,5,0,0,8,0,4,39,0 -2584,8,6,0,10,1,6,8,0,1,1,2,18,3,10,0 -2585,5,3,0,5,4,0,9,1,1,1,5,4,1,8,0 -2586,10,4,0,14,3,5,12,3,0,0,13,10,4,12,0 -2587,1,7,0,5,6,5,11,3,3,0,13,20,4,26,0 -2588,10,2,0,3,1,1,10,4,0,0,0,2,0,35,0 -2589,0,2,0,14,6,3,5,1,4,0,12,11,3,28,0 -2590,0,4,0,6,6,5,8,2,2,0,6,6,0,38,0 -2591,5,5,0,15,0,0,3,2,4,1,11,15,5,4,0 -2592,10,0,0,5,6,4,8,2,2,1,4,4,2,11,0 -2593,5,8,0,7,2,4,6,5,1,1,13,4,0,29,0 -2594,0,2,0,1,0,0,6,0,0,1,13,1,3,28,0 -2595,1,0,0,12,4,3,8,5,1,1,10,15,4,25,0 -2596,8,6,0,6,5,2,9,1,2,1,12,9,5,8,0 -2597,4,1,0,4,0,0,12,1,0,0,16,14,4,2,0 -2598,1,8,0,4,6,2,7,1,4,0,17,18,0,3,0 
-2599,6,6,0,11,6,3,8,3,0,0,6,20,1,21,0 -2600,0,6,0,3,3,4,9,3,1,0,8,2,1,25,0 -2601,4,0,0,13,0,2,2,3,1,0,5,3,5,20,0 -2602,6,4,0,7,0,2,11,1,0,0,5,10,2,30,1 -2603,3,5,0,6,2,2,7,1,4,1,1,7,4,40,1 -2604,6,5,0,3,3,1,9,2,1,0,13,0,4,4,0 -2605,10,7,0,14,0,4,11,2,0,1,2,13,0,26,0 -2606,4,2,0,4,2,6,10,4,1,0,6,9,0,25,0 -2607,9,6,0,8,3,6,5,3,0,1,8,2,5,20,0 -2608,6,8,0,2,4,0,0,1,2,1,9,18,2,2,1 -2609,7,7,0,7,6,2,1,3,1,0,9,7,0,37,1 -2610,7,2,0,0,1,2,10,5,3,0,9,8,5,23,1 -2611,4,2,0,6,0,5,14,5,0,1,8,20,4,9,0 -2612,10,4,0,15,2,0,10,3,3,0,17,11,0,9,0 -2613,2,5,0,3,4,0,12,4,3,1,10,2,2,12,0 -2614,7,5,0,2,6,0,11,5,2,1,5,1,4,37,1 -2615,10,1,0,15,0,1,6,4,2,1,13,15,4,17,0 -2616,6,6,0,3,5,4,9,5,1,1,6,9,0,29,0 -2617,0,4,0,5,0,4,3,5,2,0,2,0,3,10,0 -2618,4,8,0,7,3,4,3,2,2,1,13,2,1,17,0 -2619,2,5,0,11,6,2,10,2,0,0,2,0,0,12,0 -2620,2,3,0,0,5,4,7,3,3,0,13,4,3,29,0 -2621,0,2,0,5,0,5,10,0,3,1,7,15,5,2,0 -2622,0,5,0,12,0,6,3,5,3,0,13,6,0,32,0 -2623,3,8,0,0,4,0,1,5,3,1,16,14,2,12,1 -2624,7,0,0,12,0,6,0,0,4,0,8,19,4,13,0 -2625,9,7,0,13,2,4,9,0,2,0,11,12,5,4,0 -2626,2,7,0,5,3,3,12,2,4,1,15,16,0,39,0 -2627,2,4,0,14,5,0,0,4,4,0,5,11,3,7,0 -2628,0,1,0,8,4,4,6,0,2,1,1,20,4,10,1 -2629,5,8,0,5,0,6,9,5,4,1,0,17,3,39,1 -2630,2,4,0,0,1,6,14,2,2,0,2,8,3,29,0 -2631,0,6,0,3,5,1,12,2,2,0,14,2,0,3,0 -2632,0,4,0,0,3,2,12,4,1,1,13,2,0,6,0 -2633,10,6,0,4,4,1,2,4,0,1,9,17,1,41,1 -2634,6,0,0,6,5,6,6,1,2,0,14,2,2,4,0 -2635,10,6,0,5,3,6,7,1,4,0,8,2,4,18,0 -2636,3,4,0,6,2,4,0,1,2,0,10,8,3,35,0 -2637,4,1,0,13,0,2,2,3,2,0,2,2,0,25,0 -2638,4,1,0,2,6,6,13,5,1,0,9,5,2,23,1 -2639,0,5,0,3,4,4,6,3,4,0,8,2,5,30,0 -2640,10,2,0,3,1,1,10,1,2,0,5,2,5,12,1 -2641,10,4,0,3,5,5,4,1,1,1,15,18,1,28,0 -2642,0,0,0,13,4,0,7,2,0,0,6,1,0,13,0 -2643,3,4,0,13,3,6,9,0,4,1,12,1,0,35,0 -2644,0,0,0,5,3,1,4,2,4,1,13,2,5,28,0 -2645,1,1,0,2,2,1,1,0,4,1,0,14,3,19,1 -2646,0,7,0,8,2,4,11,4,2,1,10,1,5,26,0 -2647,3,0,0,3,6,3,6,2,4,1,15,15,4,25,0 -2648,6,8,0,13,6,5,5,3,4,0,8,0,0,16,0 -2649,5,5,0,9,0,3,10,4,2,0,2,10,1,29,0 -2650,0,1,0,3,1,2,5,4,1,0,13,4,1,28,0 
-2651,10,7,0,10,5,4,0,3,1,0,8,13,1,30,0 -2652,3,3,0,10,0,0,6,4,3,0,12,11,1,24,0 -2653,2,8,0,5,2,4,2,4,0,0,17,15,3,23,0 -2654,2,0,0,5,6,1,8,2,1,0,4,8,1,37,0 -2655,6,8,0,12,1,5,12,3,4,1,11,2,0,26,0 -2656,1,1,0,0,0,2,6,1,0,0,2,19,5,33,0 -2657,0,8,0,0,3,5,5,0,4,0,13,6,2,38,0 -2658,3,8,0,4,6,5,13,2,1,1,13,4,2,27,0 -2659,8,0,0,14,0,5,10,5,0,0,8,15,0,8,0 -2660,1,7,0,5,1,3,13,3,3,0,15,8,0,34,0 -2661,9,6,0,3,0,6,10,3,3,0,12,10,4,32,0 -2662,0,1,0,15,6,1,11,3,4,0,13,2,5,26,0 -2663,6,8,0,9,6,1,13,0,3,1,7,0,4,19,1 -2664,1,1,0,8,6,5,3,2,0,0,3,13,4,11,0 -2665,2,6,0,10,4,5,10,3,3,1,10,20,5,26,0 -2666,7,7,0,8,0,0,3,0,3,1,18,13,1,21,1 -2667,5,3,0,14,1,4,9,1,3,0,2,18,2,28,0 -2668,8,6,0,3,1,5,9,5,4,1,8,6,5,19,0 -2669,6,1,0,10,4,3,12,0,1,1,14,20,0,9,1 -2670,1,8,0,4,1,1,0,1,1,0,6,19,3,27,0 -2671,10,3,0,11,6,1,9,4,2,0,2,4,5,29,0 -2672,6,8,0,0,4,0,9,2,2,1,4,2,2,36,0 -2673,2,3,0,9,0,0,10,2,2,0,13,2,5,29,0 -2674,9,1,0,8,0,3,3,2,1,0,8,17,1,35,0 -2675,7,2,0,4,2,1,9,2,0,1,6,16,4,12,0 -2676,3,7,0,13,0,0,8,2,1,0,13,16,4,26,0 -2677,3,5,0,10,1,5,12,2,0,1,17,6,5,6,0 -2678,1,6,0,9,1,6,10,3,0,1,11,2,1,35,0 -2679,1,6,0,0,4,1,8,0,2,0,13,15,4,14,0 -2680,3,7,0,8,4,6,10,2,0,0,13,13,5,31,0 -2681,7,5,0,6,3,1,9,3,2,1,5,11,2,18,0 -2682,6,2,0,9,6,1,0,2,2,1,11,13,1,21,0 -2683,10,5,0,14,4,6,5,3,4,0,13,6,2,29,0 -2684,10,4,0,5,1,4,5,5,0,1,17,2,4,39,0 -2685,6,1,0,3,5,3,9,0,2,0,1,7,3,3,1 -2686,0,4,0,2,0,4,14,2,1,0,10,19,5,12,0 -2687,3,2,0,12,6,4,0,0,1,1,5,17,1,10,1 -2688,4,8,0,3,2,3,6,0,0,1,13,2,2,35,0 -2689,4,1,0,3,3,5,8,0,0,1,1,7,1,16,1 -2690,7,2,0,12,2,6,1,0,4,1,3,17,3,39,1 -2691,6,7,0,5,2,4,10,1,2,1,4,3,1,37,0 -2692,2,0,0,7,6,3,14,2,4,0,9,10,4,31,1 -2693,1,8,0,11,1,2,10,3,1,0,2,11,0,0,0 -2694,2,5,0,4,3,5,6,1,3,0,2,1,1,10,0 -2695,8,4,0,5,2,5,9,0,0,1,13,1,0,18,0 -2696,9,1,0,3,0,3,9,3,4,1,18,3,4,25,1 -2697,1,7,0,10,5,6,11,3,0,1,10,2,3,19,0 -2698,9,7,0,4,3,0,11,5,4,1,5,17,2,13,1 -2699,10,2,0,1,3,6,11,1,2,1,16,3,2,19,0 -2700,3,8,0,15,0,1,8,1,0,0,13,11,0,41,0 -2701,8,4,0,5,2,2,6,2,3,0,13,2,5,12,0 
-2702,8,4,0,0,1,0,9,3,3,0,2,6,4,10,0 -2703,2,2,0,1,0,0,7,3,3,1,2,9,4,10,0 -2704,1,1,0,13,4,5,14,3,1,0,13,11,0,36,0 -2705,0,5,0,1,3,6,6,5,0,0,5,9,2,0,0 -2706,8,8,0,4,5,0,9,0,3,0,13,0,2,22,0 -2707,5,2,0,11,1,1,10,3,2,0,2,2,3,27,0 -2708,0,8,0,10,3,6,6,0,4,0,8,13,2,6,0 -2709,9,7,0,15,1,2,5,1,0,1,8,9,2,0,0 -2710,4,6,0,1,4,3,14,1,3,1,3,11,0,24,0 -2711,0,4,0,13,6,6,12,5,1,1,8,2,5,29,0 -2712,2,0,0,3,1,6,12,0,3,0,4,0,4,38,0 -2713,8,1,0,12,1,1,2,5,0,0,18,6,1,17,1 -2714,4,5,0,10,6,0,12,2,1,0,2,20,5,29,0 -2715,1,3,0,11,6,1,1,0,4,1,13,6,3,22,0 -2716,9,1,0,10,3,1,5,0,3,0,3,10,0,36,1 -2717,7,8,0,10,4,4,5,4,0,1,5,12,0,32,1 -2718,0,4,0,11,4,2,6,3,3,0,15,11,2,0,0 -2719,1,8,0,4,4,6,13,4,4,0,16,5,2,39,1 -2720,0,5,0,2,4,6,8,3,3,1,4,20,5,13,0 -2721,0,0,0,8,0,0,7,5,1,0,0,10,0,8,0 -2722,7,8,0,8,6,3,1,4,1,1,18,10,3,30,1 -2723,1,2,0,0,4,5,5,1,3,0,5,20,3,26,1 -2724,2,3,0,15,6,1,13,4,0,1,10,2,1,36,0 -2725,9,5,0,14,5,1,11,2,0,1,16,5,0,20,1 -2726,9,3,0,6,6,3,13,0,2,0,17,13,5,14,0 -2727,3,8,0,8,5,0,5,3,2,0,6,20,3,20,0 -2728,9,3,0,14,6,2,13,0,3,0,7,10,5,7,1 -2729,0,1,0,1,6,4,9,0,0,1,13,0,2,30,0 -2730,2,7,0,7,0,0,6,3,1,1,13,2,4,6,0 -2731,7,3,0,7,6,6,11,5,3,1,7,10,1,36,1 -2732,10,6,0,9,1,6,6,4,2,1,15,11,5,12,0 -2733,2,8,0,3,0,3,11,0,0,0,17,9,4,3,0 -2734,5,3,0,13,2,5,5,3,0,1,2,11,4,9,0 -2735,2,5,0,1,1,4,8,0,2,1,8,3,2,3,0 -2736,0,5,0,1,1,0,11,5,1,1,5,18,3,20,1 -2737,3,5,0,4,6,2,2,4,0,0,18,7,4,20,1 -2738,1,7,0,5,5,4,7,0,1,0,6,11,3,36,0 -2739,1,7,0,9,6,0,12,1,2,0,17,15,4,2,0 -2740,6,8,0,10,2,3,4,3,3,0,13,4,4,12,1 -2741,3,6,0,13,6,5,2,2,0,1,0,13,0,20,0 -2742,6,4,0,4,2,6,14,1,3,1,2,11,3,28,0 -2743,5,7,0,3,1,0,9,3,0,1,10,13,0,27,0 -2744,9,4,0,3,5,3,3,3,1,0,8,11,0,5,0 -2745,5,8,0,6,1,0,12,5,2,1,18,10,4,23,1 -2746,4,8,0,1,5,2,9,2,2,0,13,12,2,6,0 -2747,0,3,0,10,0,5,9,4,0,1,2,0,0,2,0 -2748,9,7,0,9,3,3,2,2,0,0,14,7,1,29,1 -2749,5,7,0,2,1,4,11,1,1,1,5,20,0,1,1 -2750,10,1,0,2,0,2,9,4,3,1,16,9,1,25,1 -2751,6,2,0,14,2,1,2,0,4,1,9,8,3,24,1 -2752,7,1,0,11,1,2,0,0,2,0,17,2,4,7,0 -2753,6,2,0,9,6,0,3,5,1,0,4,5,5,40,1 
-2754,8,2,0,15,0,3,2,1,2,0,13,15,5,7,0 -2755,9,3,0,4,5,4,7,2,2,1,9,3,5,32,1 -2756,10,8,0,0,5,6,4,2,3,0,2,8,0,33,0 -2757,6,6,0,8,1,6,3,5,0,1,16,16,3,13,1 -2758,3,7,0,10,4,3,12,4,0,0,10,11,0,27,0 -2759,0,8,0,7,2,3,12,0,4,0,5,16,5,36,1 -2760,9,5,0,5,0,1,12,0,2,1,15,19,2,33,1 -2761,3,5,0,10,3,5,14,3,4,0,10,9,1,20,0 -2762,3,1,0,3,2,5,13,0,1,1,14,8,3,19,1 -2763,7,1,0,0,3,5,2,5,1,1,18,0,3,22,1 -2764,6,5,0,10,5,4,3,3,3,1,11,9,3,16,1 -2765,9,5,0,6,6,1,0,4,3,0,13,3,0,25,0 -2766,5,1,0,6,4,1,5,5,4,0,17,7,2,36,1 -2767,5,6,0,11,0,4,12,3,2,1,7,3,0,39,0 -2768,0,5,0,8,1,0,5,3,2,1,15,20,3,9,0 -2769,2,2,0,5,6,1,0,2,1,0,13,4,1,2,0 -2770,7,3,0,10,1,3,13,4,2,1,5,10,2,0,1 -2771,6,1,0,6,5,1,12,0,4,0,18,17,4,21,1 -2772,4,1,0,4,0,3,4,5,1,1,7,1,5,21,1 -2773,9,6,0,1,4,0,10,3,2,0,8,3,0,4,0 -2774,0,8,0,7,0,3,10,3,2,0,2,8,0,9,0 -2775,9,0,0,1,5,3,10,5,3,0,18,14,3,41,1 -2776,10,5,0,8,3,1,13,0,0,1,3,10,4,0,1 -2777,0,0,0,2,1,6,9,3,2,0,13,11,1,17,0 -2778,2,2,0,9,6,0,10,2,1,0,17,15,0,1,0 -2779,8,1,0,10,1,2,4,1,2,1,7,7,5,31,1 -2780,6,7,0,14,5,1,2,3,0,1,12,11,1,8,0 -2781,9,7,0,5,0,4,10,5,2,1,9,7,3,20,1 -2782,1,3,0,7,6,4,8,5,1,0,13,2,4,9,0 -2783,0,7,0,9,5,5,10,1,3,1,2,8,2,20,0 -2784,2,0,0,12,3,3,14,4,4,0,16,5,4,17,1 -2785,5,2,0,13,2,5,2,1,3,1,13,11,2,11,0 -2786,8,8,0,2,6,3,2,2,3,1,8,19,2,37,0 -2787,1,1,0,7,0,0,5,3,1,0,10,13,2,8,0 -2788,6,6,0,3,1,4,14,2,0,0,6,5,2,6,0 -2789,5,1,0,2,6,5,10,2,2,0,0,1,5,41,0 -2790,8,6,0,5,3,1,12,3,0,1,16,10,4,39,1 -2791,0,7,0,6,0,5,0,3,3,0,17,0,2,3,0 -2792,0,8,0,10,4,4,0,0,3,1,18,13,2,23,1 -2793,0,3,0,4,2,2,2,1,1,0,9,13,2,11,0 -2794,1,5,0,6,4,4,10,3,3,0,8,20,5,38,0 -2795,1,7,0,8,2,6,9,3,4,0,4,8,1,23,0 -2796,2,5,0,12,2,6,11,2,3,1,2,11,5,4,0 -2797,6,8,0,7,5,4,14,4,1,1,0,5,3,21,0 -2798,5,6,0,2,5,1,7,1,0,1,11,5,4,29,1 -2799,4,1,0,8,2,0,0,1,4,0,13,7,5,1,1 -2800,6,2,0,15,0,5,6,3,3,1,8,6,2,38,0 -2801,7,7,0,4,0,6,10,4,3,0,7,0,1,17,0 -2802,7,0,0,3,5,2,9,1,4,0,1,10,1,30,1 -2803,6,3,0,10,3,5,6,0,2,0,7,13,1,9,1 -2804,10,2,0,1,2,6,14,0,3,1,8,16,2,18,0 
-2805,1,7,0,11,0,4,4,1,0,1,10,13,3,18,0 -2806,10,8,0,0,6,4,7,3,3,0,8,1,0,2,0 -2807,1,6,0,5,6,2,14,4,4,1,2,10,4,41,0 -2808,9,5,0,12,2,3,0,3,3,0,13,18,2,13,0 -2809,0,7,0,5,6,0,9,2,1,1,1,2,3,13,0 -2810,0,3,0,12,2,4,7,0,4,1,16,8,4,25,1 -2811,10,2,0,7,5,1,9,3,3,1,13,2,5,19,0 -2812,7,4,0,8,5,0,9,3,2,1,8,13,0,34,0 -2813,1,7,0,10,5,0,11,4,3,1,6,16,0,4,0 -2814,9,1,0,3,6,6,0,1,1,1,2,13,4,26,0 -2815,0,2,0,2,5,4,3,5,2,0,10,20,0,17,0 -2816,8,8,0,1,1,3,9,2,3,1,8,7,5,21,1 -2817,5,3,0,6,6,5,5,5,1,1,8,11,3,22,0 -2818,10,6,0,10,1,3,14,3,2,1,15,7,5,13,1 -2819,6,3,0,12,2,1,6,3,3,0,5,10,3,32,1 -2820,7,0,0,13,6,3,0,4,0,0,0,11,5,26,0 -2821,4,0,0,13,1,5,7,0,3,0,4,15,0,34,0 -2822,0,3,0,5,6,4,5,4,0,1,12,11,0,27,0 -2823,1,6,0,6,1,4,9,2,4,0,15,11,0,1,0 -2824,3,3,0,15,0,3,14,4,1,1,8,15,3,27,0 -2825,3,5,0,15,0,0,3,3,3,0,17,6,2,37,0 -2826,1,3,0,10,1,6,8,1,0,1,8,12,3,1,0 -2827,2,5,0,7,4,0,5,1,4,1,9,2,3,39,0 -2828,0,6,0,12,0,1,6,2,4,1,10,17,3,25,0 -2829,5,3,0,13,5,2,13,3,1,0,11,11,5,0,0 -2830,10,2,0,6,2,0,5,4,0,1,8,20,2,19,0 -2831,3,1,0,11,5,5,4,1,4,0,10,6,1,28,0 -2832,1,8,0,12,6,2,9,2,3,1,8,18,1,40,0 -2833,9,6,0,3,4,4,1,5,0,0,2,9,0,16,0 -2834,5,8,0,0,1,1,3,5,0,0,3,7,5,28,1 -2835,4,8,0,0,4,5,10,2,2,0,4,11,1,32,0 -2836,8,3,0,7,3,5,12,0,1,1,3,1,0,5,1 -2837,7,1,0,10,6,4,11,0,3,1,0,10,5,8,1 -2838,8,3,0,3,0,4,9,3,2,1,6,12,3,25,0 -2839,4,3,0,11,5,0,7,1,3,0,18,19,5,41,1 -2840,0,2,0,10,0,5,14,3,0,0,0,3,3,2,0 -2841,0,8,0,7,1,4,8,4,2,1,13,11,5,40,0 -2842,2,3,0,3,0,5,4,0,3,1,13,11,3,6,0 -2843,10,6,0,4,1,1,1,1,0,1,2,6,5,1,0 -2844,3,7,0,3,0,0,6,4,0,0,4,20,2,1,0 -2845,7,1,0,14,3,1,4,0,2,1,3,16,2,23,1 -2846,0,8,0,4,4,0,9,3,4,0,17,17,4,10,0 -2847,5,8,0,4,5,4,3,2,2,1,17,0,0,39,0 -2848,8,6,0,3,1,0,14,0,4,1,8,18,0,8,0 -2849,8,1,0,0,6,2,3,2,3,1,14,5,4,39,1 -2850,7,6,0,12,1,4,12,5,2,0,18,5,5,29,1 -2851,6,7,0,5,0,5,5,0,2,1,16,6,0,8,1 -2852,7,1,0,9,6,6,3,4,3,0,17,17,2,34,1 -2853,1,8,0,12,2,5,5,2,0,1,13,5,5,28,0 -2854,10,1,0,6,4,4,5,0,2,0,18,7,2,1,1 -2855,10,6,0,3,6,0,7,4,3,1,4,15,0,3,0 
-2856,10,1,0,4,6,3,12,3,3,1,8,2,1,35,0 -2857,6,3,0,13,0,4,10,3,1,0,2,4,2,34,0 -2858,10,2,0,5,2,1,14,1,2,0,4,20,0,33,0 -2859,6,2,0,11,1,5,3,5,1,1,3,17,1,11,1 -2860,2,7,0,6,2,2,14,5,2,0,13,11,2,6,0 -2861,4,2,0,13,6,6,13,0,2,0,2,11,0,12,0 -2862,0,0,0,6,3,2,0,3,2,1,0,9,0,4,0 -2863,9,8,0,11,5,5,5,4,0,1,13,0,0,9,0 -2864,0,2,0,13,4,0,0,0,4,0,1,0,0,20,0 -2865,2,0,0,5,1,5,7,4,0,1,4,19,5,20,0 -2866,4,0,0,3,3,0,0,4,1,1,16,5,2,23,1 -2867,6,0,0,1,1,5,5,4,0,0,13,15,3,24,0 -2868,3,0,0,2,6,1,11,0,2,1,18,1,2,39,1 -2869,2,1,0,8,6,4,9,4,1,1,10,11,1,3,0 -2870,2,2,0,11,2,3,9,3,4,1,2,15,0,38,0 -2871,3,7,0,14,5,4,10,1,1,1,2,5,2,18,0 -2872,7,2,0,8,2,6,5,5,1,1,18,5,5,17,1 -2873,0,6,0,4,6,2,12,3,0,0,13,20,1,1,0 -2874,6,3,0,6,2,2,11,3,0,1,14,1,0,1,1 -2875,0,2,0,12,0,4,4,3,3,1,3,18,1,26,0 -2876,10,6,0,6,4,2,5,3,0,1,8,11,1,26,0 -2877,1,0,0,1,6,4,4,0,2,1,7,7,2,9,1 -2878,10,2,0,3,4,6,8,4,2,1,2,11,3,35,0 -2879,0,5,0,13,6,6,8,0,3,1,4,5,0,21,0 -2880,1,3,0,8,0,1,10,1,2,0,1,9,3,0,0 -2881,2,0,0,10,1,4,2,1,0,1,13,2,1,1,0 -2882,0,3,0,9,6,6,3,4,1,1,13,11,2,14,0 -2883,3,3,0,0,4,3,1,4,3,0,8,14,4,22,0 -2884,3,6,0,9,6,0,14,2,3,1,2,11,3,27,0 -2885,1,5,0,10,0,2,0,0,2,1,8,11,0,18,0 -2886,2,3,0,5,5,3,5,3,3,0,10,15,3,34,0 -2887,8,6,0,12,3,3,12,5,2,1,11,17,1,18,1 -2888,4,8,0,15,1,0,6,0,2,1,6,2,0,13,0 -2889,0,3,0,2,5,6,8,2,1,0,13,6,0,5,0 -2890,7,4,0,10,1,3,9,4,3,1,13,6,5,27,0 -2891,2,5,0,6,0,1,6,1,0,1,2,4,4,6,0 -2892,9,6,0,10,4,3,11,2,3,1,11,0,1,14,1 -2893,6,0,0,0,0,3,0,0,3,0,6,9,4,0,0 -2894,2,3,0,2,0,6,13,4,3,1,3,0,1,36,0 -2895,3,3,0,3,1,6,7,0,3,0,6,7,0,28,0 -2896,1,5,0,0,1,1,8,5,3,1,8,11,0,12,0 -2897,8,4,0,2,0,2,9,0,3,0,2,2,3,36,0 -2898,0,4,0,10,4,0,0,0,2,1,3,2,0,16,0 -2899,6,1,0,9,3,1,7,3,4,0,18,7,5,2,1 -2900,10,7,0,14,6,2,0,5,2,0,8,6,5,28,0 -2901,6,5,0,13,3,1,8,3,4,0,13,4,2,22,0 -2902,0,6,0,3,2,0,3,3,2,0,8,8,4,3,0 -2903,3,7,0,4,2,0,6,2,4,1,6,0,5,4,0 -2904,10,0,0,13,2,0,7,0,2,1,17,18,4,3,0 -2905,0,2,0,3,3,6,8,4,4,0,6,11,5,4,0 -2906,6,4,0,1,0,4,12,3,4,1,0,2,1,0,0 -2907,8,5,0,12,1,1,0,3,3,1,9,17,3,14,1 
-2908,5,4,0,10,1,6,9,1,0,0,9,7,5,13,1 -2909,2,4,0,13,1,5,5,3,2,0,15,20,0,34,0 -2910,1,8,0,13,4,2,2,4,1,0,12,15,0,27,0 -2911,2,6,0,1,5,2,12,0,1,1,2,11,3,35,0 -2912,0,2,0,12,3,4,13,1,4,1,2,16,4,10,0 -2913,7,7,0,0,2,3,9,1,0,1,17,9,0,13,0 -2914,10,7,0,1,6,0,2,4,3,0,18,17,5,10,1 -2915,0,3,0,8,1,6,8,2,4,0,8,12,1,38,0 -2916,3,1,0,7,5,0,9,2,3,0,18,10,3,17,1 -2917,6,8,0,7,5,3,12,1,1,0,8,5,0,9,0 -2918,5,1,0,6,5,3,11,3,1,0,8,2,1,0,0 -2919,10,4,0,3,4,5,13,4,4,0,2,2,0,16,0 -2920,5,3,0,7,0,5,14,3,2,1,8,16,2,26,0 -2921,5,6,0,6,0,2,5,1,4,1,6,20,4,18,0 -2922,10,7,0,0,6,5,3,3,0,1,17,2,2,20,0 -2923,1,4,0,7,0,5,5,3,3,0,0,16,5,27,0 -2924,2,8,0,10,6,2,8,5,2,0,1,8,4,41,1 -2925,9,3,0,12,6,0,4,2,4,0,12,2,4,40,0 -2926,2,8,0,0,5,2,14,5,2,0,13,11,0,22,0 -2927,2,6,0,1,2,0,14,1,4,0,10,18,5,31,0 -2928,0,0,0,5,2,5,0,4,2,1,10,6,5,4,0 -2929,5,8,0,3,6,6,14,3,0,1,6,7,1,22,1 -2930,10,7,0,4,2,0,3,4,2,0,4,15,3,25,0 -2931,9,6,0,10,3,1,6,0,1,0,8,11,5,34,0 -2932,5,1,0,7,3,0,8,1,0,1,0,20,3,16,0 -2933,3,4,0,15,4,3,8,2,4,1,13,20,0,27,0 -2934,9,0,0,12,5,0,7,4,1,1,2,5,4,4,0 -2935,8,7,0,13,5,0,1,4,0,1,2,18,5,40,0 -2936,7,6,0,9,3,4,1,2,0,0,4,6,0,5,0 -2937,6,5,0,7,1,5,0,5,1,1,18,15,0,6,0 -2938,9,1,0,7,5,2,1,2,2,1,18,10,0,7,1 -2939,0,5,0,0,0,4,11,3,1,1,8,2,3,18,0 -2940,2,5,0,0,0,3,7,1,3,0,2,8,2,36,0 -2941,5,0,0,3,6,0,9,1,4,1,10,2,2,7,0 -2942,5,5,0,1,0,5,7,5,4,0,10,11,0,20,0 -2943,5,1,0,6,4,4,0,2,3,1,8,16,4,26,0 -2944,1,7,0,3,0,0,13,4,0,1,8,15,1,30,0 -2945,0,6,0,14,6,4,5,3,0,0,8,15,2,40,0 -2946,3,8,0,0,0,2,6,4,2,0,10,19,3,35,0 -2947,5,6,0,2,3,1,3,5,1,0,8,17,5,19,1 -2948,2,3,0,11,1,0,5,3,3,0,2,11,1,7,0 -2949,0,1,0,7,3,1,13,3,3,0,8,18,4,21,0 -2950,0,6,0,4,4,6,10,3,4,1,8,2,0,27,0 -2951,2,6,0,15,6,5,10,4,2,1,13,18,2,38,0 -2952,2,0,0,15,6,1,2,2,4,0,15,7,0,30,1 -2953,9,1,0,0,1,1,0,5,1,1,18,7,5,2,1 -2954,2,6,0,2,1,0,9,1,2,0,4,2,5,38,0 -2955,5,6,0,8,2,2,11,4,4,1,18,10,4,26,1 -2956,10,1,0,1,3,4,4,4,4,1,6,15,0,11,0 -2957,2,7,0,10,2,0,10,1,1,1,10,2,0,3,0 -2958,3,3,0,10,2,5,11,2,3,0,4,7,2,36,1 
-2959,6,2,0,1,0,3,0,0,2,1,2,16,2,24,0 -2960,2,3,0,12,4,6,6,0,1,0,18,13,0,19,1 -2961,2,5,0,9,3,6,4,2,4,0,5,14,2,36,1 -2962,4,3,0,13,0,4,7,2,1,0,2,11,3,32,0 -2963,6,1,0,5,5,1,4,4,3,1,8,11,2,12,0 -2964,8,2,0,10,0,1,9,0,1,1,18,19,4,31,1 -2965,2,7,0,13,0,5,6,4,0,0,10,2,0,33,0 -2966,8,6,0,6,4,6,13,5,0,0,5,3,3,40,1 -2967,0,6,0,12,0,5,4,4,1,0,13,6,0,20,0 -2968,0,2,0,12,6,0,0,4,0,0,8,13,3,9,0 -2969,8,3,0,15,4,5,6,2,0,0,0,14,2,37,0 -2970,2,6,0,6,3,4,0,3,0,1,2,18,3,8,0 -2971,4,7,0,10,3,6,3,1,3,0,3,17,2,17,1 -2972,5,2,0,3,3,2,8,4,3,0,1,14,1,12,1 -2973,8,6,0,8,5,6,0,3,4,0,13,5,5,17,0 -2974,1,1,0,13,5,0,2,4,3,0,13,11,2,9,0 -2975,4,1,0,7,4,5,8,4,0,1,11,2,1,8,0 -2976,7,4,0,7,3,5,8,5,2,1,12,12,3,10,1 -2977,3,3,0,10,6,6,2,1,4,0,11,6,2,10,0 -2978,9,6,0,13,4,4,3,4,1,1,7,20,2,33,0 -2979,2,8,0,4,0,0,13,4,0,0,8,18,0,27,0 -2980,8,6,0,2,6,0,3,1,2,0,2,5,1,14,0 -2981,2,5,0,8,0,2,9,1,1,1,4,6,0,23,0 -2982,2,8,0,13,3,3,1,5,4,1,13,9,4,23,0 -2983,9,3,0,14,0,5,1,5,3,1,18,6,3,9,1 -2984,5,8,0,10,4,1,1,5,3,1,18,17,5,27,1 -2985,6,3,0,13,5,2,2,3,0,1,2,15,0,16,0 -2986,1,2,0,8,2,5,5,2,1,1,8,11,1,28,0 -2987,6,3,0,6,5,5,10,3,4,0,2,15,1,2,0 -2988,2,3,0,7,0,5,0,0,0,0,18,12,4,28,1 -2989,10,1,0,10,4,3,14,4,2,0,4,2,0,21,0 -2990,4,6,0,4,0,0,3,4,0,1,8,13,4,28,0 -2991,1,2,0,2,1,2,4,1,0,0,9,15,2,27,0 -2992,3,6,0,7,0,4,8,0,0,1,3,17,2,7,0 -2993,6,6,0,2,5,4,1,0,2,0,1,12,0,16,1 -2994,9,8,0,3,4,0,10,0,0,1,4,15,0,6,0 -2995,2,6,0,11,1,1,7,1,0,1,10,11,3,24,0 -2996,10,2,0,15,5,4,1,5,0,1,3,17,1,32,1 -2997,0,1,0,4,6,1,9,2,3,1,2,15,0,17,0 -2998,7,7,0,6,6,6,7,2,1,0,6,9,1,7,0 -2999,3,3,0,11,1,5,9,4,1,1,4,2,0,9,0 -3000,0,5,0,15,4,4,8,0,2,1,10,4,1,25,0 -3001,3,5,0,14,2,0,8,4,3,1,16,7,5,35,1 -3002,0,1,0,4,5,5,5,5,4,0,2,0,5,12,0 -3003,10,8,0,14,4,1,4,5,0,1,5,14,4,31,1 -3004,2,0,0,12,5,3,5,1,2,1,0,0,5,30,0 -3005,7,5,0,4,1,3,5,4,4,1,10,19,0,17,0 -3006,9,0,0,11,5,5,12,5,3,0,13,11,0,35,0 -3007,10,6,0,4,3,1,13,3,3,1,13,19,0,35,1 -3008,3,3,0,10,6,3,8,5,1,1,13,11,2,31,0 -3009,10,1,0,3,6,0,7,3,1,1,2,17,4,38,0 
-3010,7,7,0,13,5,6,12,4,0,0,8,16,0,7,0 -3011,4,1,0,11,6,4,3,4,4,1,0,16,1,7,0 -3012,6,1,0,13,0,4,1,3,1,0,4,20,2,38,0 -3013,6,5,0,11,2,2,0,2,2,0,0,17,4,6,0 -3014,7,4,0,11,6,1,9,0,2,0,18,5,4,9,1 -3015,1,2,0,6,0,4,6,0,1,1,8,11,1,31,0 -3016,10,3,0,3,0,2,5,4,2,1,8,2,4,3,0 -3017,0,7,0,14,0,1,6,1,3,0,15,13,1,28,0 -3018,6,2,0,3,2,5,11,2,4,1,18,16,2,31,0 -3019,9,1,0,7,1,1,2,2,4,1,5,7,0,34,1 -3020,7,8,0,9,5,4,11,5,1,1,5,1,4,14,1 -3021,4,6,0,7,6,0,0,3,3,0,18,11,0,6,1 -3022,7,7,0,6,2,6,10,4,4,1,11,9,4,20,1 -3023,3,6,0,0,1,3,9,3,3,0,13,15,5,13,0 -3024,3,6,0,13,4,5,5,2,4,0,2,15,4,23,0 -3025,6,4,0,1,3,1,3,4,1,0,16,7,1,24,1 -3026,0,2,0,14,2,1,2,3,0,1,18,17,1,41,1 -3027,9,5,0,3,1,5,14,4,0,0,10,11,0,16,0 -3028,3,1,0,5,3,0,6,4,0,0,10,8,1,32,0 -3029,4,8,0,13,5,0,2,3,2,0,8,15,1,10,0 -3030,4,8,0,6,2,6,8,4,3,0,18,4,2,36,0 -3031,0,7,0,0,5,1,5,1,0,0,8,13,1,28,0 -3032,0,6,0,14,6,2,12,3,2,1,6,7,2,39,1 -3033,2,1,0,2,6,0,0,4,3,1,6,2,5,30,0 -3034,9,5,0,1,2,0,3,4,4,0,1,6,0,5,0 -3035,7,7,0,11,2,2,2,2,0,1,18,6,4,24,1 -3036,6,3,0,15,5,0,0,3,2,1,6,15,2,5,0 -3037,10,1,0,7,0,6,9,5,1,1,13,2,5,4,0 -3038,1,3,0,11,6,1,0,5,0,1,2,14,0,10,0 -3039,9,8,0,7,6,4,7,2,0,0,13,0,1,4,0 -3040,1,4,0,13,5,6,5,3,3,0,7,6,1,38,0 -3041,4,6,0,12,4,2,2,3,0,1,18,17,1,19,1 -3042,0,8,0,14,3,1,7,1,2,0,4,16,4,33,0 -3043,7,5,0,5,5,0,14,3,1,1,4,6,0,25,0 -3044,0,0,0,4,0,5,3,1,0,0,8,6,5,16,0 -3045,0,6,0,0,2,0,7,0,2,1,6,0,2,13,0 -3046,2,4,0,10,4,5,4,5,3,0,11,19,2,33,1 -3047,0,4,0,8,5,4,6,5,0,0,4,2,0,3,0 -3048,6,8,0,4,5,5,13,1,3,0,16,17,4,34,1 -3049,7,1,0,4,0,4,1,3,3,0,8,20,5,7,0 -3050,5,0,0,9,1,4,7,5,3,1,12,6,0,0,0 -3051,0,6,0,12,6,5,8,1,2,0,3,15,5,4,0 -3052,3,7,0,12,0,0,9,0,1,0,13,16,2,31,0 -3053,3,4,0,6,0,4,2,0,2,0,15,2,1,7,0 -3054,4,6,0,3,1,3,7,4,0,0,2,11,4,18,0 -3055,10,3,0,12,4,4,11,5,0,1,16,1,2,37,1 -3056,3,5,0,13,0,6,6,1,3,0,13,11,4,3,0 -3057,8,6,0,5,1,4,5,5,2,0,13,11,1,0,0 -3058,0,0,0,1,1,6,9,3,4,1,15,0,0,38,0 -3059,4,6,0,8,1,0,9,0,1,1,13,13,3,25,0 -3060,7,7,0,12,2,2,13,4,0,0,5,5,0,7,1 -3061,5,1,0,12,5,0,3,0,4,1,5,9,4,10,1 
-3062,6,3,0,10,3,6,7,1,1,1,14,12,2,39,1 -3063,9,8,0,11,0,6,5,2,2,1,10,2,2,25,0 -3064,4,8,0,8,6,1,10,5,4,1,11,17,4,28,1 -3065,6,0,0,3,3,5,4,2,1,0,1,19,3,10,1 -3066,2,8,0,11,3,0,10,2,0,1,2,0,0,4,0 -3067,3,6,0,4,6,0,9,2,0,0,7,13,2,35,0 -3068,9,6,0,14,2,6,5,3,3,1,6,1,0,13,0 -3069,4,0,0,10,5,5,11,2,0,1,9,17,1,31,1 -3070,5,7,0,14,5,2,6,0,3,0,2,18,2,4,0 -3071,9,8,0,7,6,6,7,2,2,0,17,15,1,1,0 -3072,6,0,0,0,5,5,8,5,3,0,10,18,1,3,0 -3073,9,4,0,1,2,6,9,2,3,1,8,10,3,11,0 -3074,10,3,0,1,3,1,4,1,3,1,0,5,3,33,1 -3075,8,5,0,3,0,6,9,4,4,0,2,16,0,33,0 -3076,5,8,0,13,4,5,2,2,2,1,1,14,2,33,1 -3077,8,2,0,4,4,5,3,5,1,0,14,15,5,6,1 -3078,0,3,0,10,0,5,0,3,0,0,10,11,1,25,0 -3079,0,1,0,4,6,2,3,3,3,1,6,9,4,38,0 -3080,9,0,0,4,5,3,5,4,4,0,14,0,3,3,1 -3081,5,1,0,10,5,3,3,3,2,1,9,19,2,35,1 -3082,6,6,0,10,4,6,12,3,3,1,9,5,5,9,1 -3083,6,3,0,9,2,0,5,5,0,1,3,4,1,33,0 -3084,3,1,0,14,5,0,0,0,4,1,9,11,4,21,1 -3085,5,6,0,1,4,5,9,3,4,1,13,17,1,38,0 -3086,4,6,0,0,6,1,8,4,1,1,0,10,4,19,1 -3087,1,4,0,10,2,4,5,4,1,1,5,7,4,2,1 -3088,2,4,0,5,3,1,8,0,3,1,14,16,2,22,0 -3089,9,6,0,0,0,5,0,2,4,0,2,1,4,28,0 -3090,2,0,0,2,0,3,2,5,3,0,17,2,5,21,0 -3091,3,0,0,4,3,1,5,3,1,1,17,2,5,22,0 -3092,5,3,0,7,0,5,0,1,3,1,8,1,1,6,0 -3093,3,7,0,5,3,0,6,5,3,1,2,11,3,4,0 -3094,6,4,0,6,6,5,0,1,2,0,13,18,5,24,0 -3095,2,8,0,3,3,5,8,3,1,0,14,6,0,5,0 -3096,8,0,0,5,5,6,8,2,3,0,15,6,4,12,0 -3097,1,8,0,8,6,0,12,4,2,0,8,19,1,2,0 -3098,0,3,0,3,4,5,10,0,3,0,5,0,1,39,0 -3099,2,7,0,12,2,4,1,4,2,0,2,9,0,12,0 -3100,8,5,0,4,3,6,13,5,3,1,16,15,3,32,1 -3101,0,3,0,13,0,3,4,0,0,0,2,20,4,9,0 -3102,6,8,0,2,2,1,3,0,4,1,7,8,2,25,1 -3103,7,6,0,11,2,4,9,4,3,1,13,5,3,8,0 -3104,1,4,0,2,5,3,8,4,2,1,4,11,2,38,0 -3105,9,8,0,1,3,3,4,4,0,1,6,4,1,19,0 -3106,6,3,0,11,4,6,9,1,3,0,2,0,1,23,0 -3107,3,3,0,0,4,3,7,5,2,1,2,2,3,38,0 -3108,9,3,0,5,0,4,6,2,0,0,13,3,5,39,0 -3109,7,5,0,12,2,5,4,3,1,1,15,3,2,24,0 -3110,0,5,0,13,6,0,3,4,0,1,4,9,5,22,0 -3111,0,0,0,15,6,4,0,2,3,0,2,9,0,38,0 -3112,0,3,0,8,4,6,8,1,2,0,8,12,0,29,0 -3113,1,4,0,1,4,4,10,0,2,0,8,7,3,36,0 
-3114,8,7,0,8,6,0,8,3,1,1,15,2,1,37,0 -3115,3,0,0,14,4,5,0,2,3,1,16,12,3,35,1 -3116,2,6,0,10,0,4,12,1,3,0,8,0,1,40,0 -3117,7,5,0,3,3,4,0,1,3,0,18,1,4,34,1 -3118,1,1,0,9,0,5,5,4,4,1,0,15,5,19,0 -3119,4,3,0,10,0,5,3,3,0,0,13,3,1,38,0 -3120,6,6,0,14,3,5,11,4,3,1,2,11,4,35,0 -3121,3,7,0,6,6,4,6,3,4,0,7,0,3,40,0 -3122,8,5,0,11,4,0,3,3,2,1,15,0,2,22,1 -3123,3,2,0,0,3,1,14,5,3,0,4,10,0,38,0 -3124,10,0,0,3,0,6,1,1,3,1,13,15,4,3,0 -3125,1,7,0,11,5,4,13,1,0,1,16,2,5,25,0 -3126,3,4,0,13,1,4,10,3,1,0,3,16,0,6,0 -3127,4,3,0,15,2,1,13,0,4,0,3,12,5,39,1 -3128,9,1,0,15,0,5,4,5,0,1,16,17,3,8,1 -3129,10,3,0,3,1,5,3,4,4,1,17,15,1,11,0 -3130,6,2,0,10,6,0,0,3,0,0,13,9,1,1,0 -3131,8,5,0,5,4,4,0,5,3,1,5,9,0,16,0 -3132,3,2,0,9,1,4,10,2,4,0,2,0,3,24,0 -3133,10,6,0,10,0,3,14,2,1,1,0,11,0,23,0 -3134,0,6,0,6,1,5,9,2,0,1,10,19,0,17,0 -3135,2,8,0,1,5,0,10,2,2,1,4,11,4,27,0 -3136,1,1,0,11,1,2,0,2,3,0,11,11,1,21,0 -3137,10,1,0,11,2,6,1,1,3,1,5,3,4,39,1 -3138,10,8,0,4,4,6,10,4,4,0,5,7,4,12,1 -3139,0,3,0,8,5,1,12,0,1,0,13,4,2,3,0 -3140,4,6,0,15,1,4,2,2,2,1,13,2,0,8,0 -3141,3,4,0,1,4,5,14,3,1,0,0,15,1,20,0 -3142,5,5,0,14,6,1,11,3,4,0,17,11,2,23,0 -3143,5,5,0,6,0,3,7,0,2,1,2,15,3,16,0 -3144,10,7,0,15,2,0,6,3,3,0,17,20,2,31,0 -3145,0,0,0,9,5,4,1,2,0,0,2,2,4,13,0 -3146,7,0,0,15,4,5,0,2,4,1,8,14,5,4,0 -3147,2,0,0,6,1,6,4,2,0,1,13,12,5,8,0 -3148,1,8,0,13,0,2,8,2,1,1,0,9,3,4,0 -3149,2,4,0,3,3,4,8,3,0,0,4,11,5,23,0 -3150,10,0,0,11,4,1,2,3,3,0,9,16,1,26,1 -3151,2,4,0,14,2,6,7,4,4,0,6,0,1,28,0 -3152,2,6,0,13,4,2,11,4,3,1,2,11,1,0,0 -3153,0,5,0,4,6,1,2,4,2,1,15,16,0,0,0 -3154,8,2,0,0,0,2,4,5,1,1,1,10,4,0,1 -3155,10,4,0,10,5,0,3,4,4,1,13,2,0,29,0 -3156,6,2,0,9,4,6,14,1,1,1,17,11,5,40,0 -3157,9,7,0,11,0,6,9,2,3,1,17,0,0,19,0 -3158,7,2,0,1,3,4,4,0,2,0,18,7,5,31,1 -3159,2,4,0,0,6,1,9,4,0,1,10,16,0,7,0 -3160,6,5,0,2,1,0,7,2,2,0,0,13,2,8,0 -3161,1,5,0,15,1,0,6,1,1,0,4,15,3,5,0 -3162,9,4,0,8,1,0,6,3,2,0,16,11,1,1,0 -3163,3,8,0,0,0,4,14,1,3,0,2,11,4,0,0 -3164,3,7,0,6,6,0,13,3,0,0,2,20,3,6,0 
-3165,0,0,0,11,1,0,14,3,2,0,17,11,0,25,0 -3166,0,4,0,15,6,3,0,0,3,0,13,9,0,7,0 -3167,2,2,0,5,2,4,1,0,4,0,17,11,4,6,0 -3168,2,2,0,1,5,1,11,1,2,1,18,13,5,34,1 -3169,1,0,0,9,6,1,11,1,4,1,3,18,3,25,0 -3170,0,6,0,10,4,0,5,2,4,0,7,3,1,1,0 -3171,8,0,0,2,6,3,5,3,3,1,0,16,1,38,0 -3172,5,3,0,9,1,5,8,3,0,1,10,12,3,22,0 -3173,9,0,0,7,0,2,14,5,2,0,18,4,5,21,1 -3174,4,5,0,3,3,3,4,0,3,1,18,13,3,27,1 -3175,5,6,0,8,6,5,14,0,3,0,1,15,5,10,1 -3176,3,5,0,3,0,4,13,2,0,0,16,5,4,37,1 -3177,10,5,0,12,6,1,11,5,1,0,14,4,4,5,1 -3178,7,1,0,15,3,5,10,3,3,1,2,4,2,2,0 -3179,4,8,0,11,1,5,9,3,3,0,2,14,0,33,0 -3180,7,6,0,10,2,6,7,0,3,1,9,6,4,12,1 -3181,0,7,0,5,6,3,11,1,3,0,17,0,5,30,0 -3182,10,5,0,5,1,6,11,2,1,1,4,6,2,1,0 -3183,10,8,0,9,3,3,8,3,3,0,10,2,2,5,0 -3184,2,2,0,13,0,3,4,2,1,1,2,16,4,29,0 -3185,9,5,0,5,3,0,3,0,0,0,2,7,0,2,0 -3186,6,5,0,8,0,0,6,0,1,0,12,11,1,29,0 -3187,2,2,0,4,5,5,3,3,2,1,13,2,1,35,0 -3188,2,6,0,15,5,2,3,0,2,1,18,7,0,26,1 -3189,10,2,0,5,2,2,2,4,4,1,9,17,4,9,1 -3190,4,8,0,8,2,1,13,2,4,0,3,2,2,27,0 -3191,0,6,0,3,6,3,5,3,0,1,13,11,2,28,0 -3192,7,0,0,1,6,5,14,5,0,1,6,5,5,35,1 -3193,0,6,0,7,0,5,10,4,3,1,2,9,3,7,0 -3194,9,6,0,7,0,5,2,3,2,1,13,9,3,37,0 -3195,9,3,0,3,0,6,13,0,0,0,9,1,4,40,1 -3196,3,0,0,0,3,2,13,2,3,1,12,7,5,37,1 -3197,5,3,0,6,2,3,2,3,1,1,4,2,4,24,0 -3198,2,8,0,8,2,6,13,4,0,1,1,9,3,26,0 -3199,1,4,0,12,6,6,5,0,3,1,17,2,4,30,0 -3200,5,6,0,3,0,3,10,1,0,0,15,11,2,8,0 -3201,0,3,0,15,6,5,9,3,0,1,0,2,0,31,0 -3202,3,6,0,10,0,4,1,1,3,1,4,0,0,39,0 -3203,10,5,0,8,0,2,10,4,0,0,17,7,3,37,1 -3204,0,4,0,8,1,5,5,0,0,0,4,9,0,41,0 -3205,9,1,0,15,6,2,13,3,3,1,1,7,4,18,1 -3206,3,0,0,2,1,4,10,3,2,0,6,0,0,4,0 -3207,9,6,0,1,0,0,5,5,0,0,15,11,3,19,0 -3208,0,4,0,5,4,6,10,0,4,1,4,9,2,3,0 -3209,4,8,0,13,0,0,9,2,0,0,13,2,0,34,0 -3210,9,2,0,9,3,5,1,0,4,1,8,3,1,19,0 -3211,3,6,0,2,2,1,6,2,0,1,7,15,4,34,0 -3212,0,6,0,4,3,5,5,3,0,1,8,15,5,13,0 -3213,3,4,0,5,2,4,4,1,2,0,8,9,0,5,0 -3214,2,8,0,3,1,2,3,4,2,0,15,2,4,28,0 -3215,5,8,0,6,2,0,12,1,1,1,5,1,3,32,1 -3216,2,3,0,11,0,4,8,4,4,1,13,7,1,10,0 
-3217,0,8,0,0,6,2,14,3,3,0,4,12,1,21,0 -3218,6,7,0,9,1,5,12,4,2,1,17,4,0,38,0 -3219,0,1,0,1,3,1,12,4,1,0,13,5,1,38,0 -3220,6,4,0,0,2,1,5,0,0,0,1,6,2,14,1 -3221,3,3,0,11,3,1,11,1,1,1,14,14,3,26,1 -3222,0,2,0,14,5,3,4,3,2,1,4,15,4,10,0 -3223,0,8,0,0,2,4,7,3,0,0,10,2,0,5,0 -3224,3,3,0,0,0,0,11,0,3,1,18,1,4,24,1 -3225,5,8,0,8,6,4,13,1,1,0,4,6,3,29,0 -3226,3,2,0,11,1,5,3,2,0,0,16,20,5,31,0 -3227,5,4,0,6,3,3,14,3,4,1,13,15,0,32,0 -3228,10,6,0,4,6,2,4,1,0,0,12,11,3,29,0 -3229,1,8,0,7,0,5,5,0,4,0,8,11,0,26,0 -3230,1,5,0,8,0,0,8,3,1,0,10,15,5,18,0 -3231,7,2,0,10,0,5,5,3,3,1,13,0,2,6,0 -3232,10,1,0,6,6,1,3,0,0,1,1,3,0,32,1 -3233,10,4,0,7,1,4,9,2,0,0,0,13,2,17,0 -3234,9,1,0,2,4,5,12,5,0,1,9,3,2,19,1 -3235,7,8,0,0,3,3,9,3,2,0,4,7,3,21,1 -3236,3,7,0,1,1,1,5,0,2,1,13,16,1,24,0 -3237,0,6,0,15,1,3,1,1,0,0,13,2,5,10,0 -3238,2,1,0,11,0,3,3,3,2,1,13,11,1,24,0 -3239,10,5,0,14,2,2,6,5,1,0,3,10,5,34,1 -3240,2,8,0,8,2,0,7,5,0,1,1,9,0,22,0 -3241,4,8,0,11,6,5,5,1,0,1,13,18,5,36,0 -3242,2,2,0,6,5,3,9,0,0,1,9,2,0,10,0 -3243,0,6,0,3,0,0,6,2,0,0,2,0,3,37,0 -3244,6,7,0,1,3,5,4,2,3,1,8,7,5,6,1 -3245,4,1,0,4,0,3,6,3,3,0,18,7,2,19,1 -3246,1,6,0,3,3,0,4,3,0,1,11,0,4,11,0 -3247,0,5,0,1,4,0,9,2,4,0,4,6,4,2,0 -3248,4,0,0,4,5,3,7,1,4,1,10,20,4,23,1 -3249,4,1,0,6,1,2,5,5,3,1,18,8,3,0,1 -3250,1,2,0,7,4,5,0,4,4,1,8,3,2,3,0 -3251,0,3,0,1,1,0,5,3,3,0,13,11,5,17,0 -3252,0,4,0,15,3,5,2,2,1,1,13,18,0,26,0 -3253,6,1,0,12,6,6,13,0,1,1,11,7,2,1,1 -3254,0,6,0,6,0,4,11,5,2,1,8,6,4,17,0 -3255,4,8,0,1,0,6,9,1,1,1,16,7,0,35,0 -3256,8,6,0,11,1,1,10,5,2,0,18,7,3,26,1 -3257,10,4,0,3,5,4,9,5,0,0,8,11,5,22,0 -3258,9,5,0,5,5,4,3,4,2,0,2,16,4,37,0 -3259,1,3,0,3,0,1,9,3,0,0,13,11,1,32,0 -3260,0,7,0,7,3,3,6,1,1,0,4,6,2,23,0 -3261,8,1,0,0,0,0,10,4,3,0,8,18,0,8,0 -3262,3,1,0,11,4,4,14,1,2,1,2,2,4,40,0 -3263,4,7,0,2,0,3,14,5,4,0,11,10,3,7,1 -3264,3,4,0,1,6,0,10,1,0,0,17,20,3,37,0 -3265,2,7,0,1,4,3,5,3,2,1,13,18,4,25,0 -3266,5,3,0,1,0,5,5,3,0,1,0,2,2,21,0 -3267,8,6,0,5,5,4,10,1,3,0,15,10,2,18,1 
-3268,9,5,0,14,0,6,12,5,2,1,10,16,2,14,0 -3269,5,5,0,0,6,5,9,2,3,0,6,0,1,9,0 -3270,3,0,0,1,1,5,11,3,4,1,6,0,0,7,0 -3271,3,4,0,8,1,0,9,2,0,0,8,7,3,31,0 -3272,2,6,0,2,6,3,13,3,4,1,18,16,2,39,1 -3273,3,2,0,2,4,6,6,2,2,0,6,11,0,26,0 -3274,4,8,0,3,2,3,14,3,0,1,2,4,3,20,0 -3275,2,0,0,13,5,0,10,1,4,1,15,2,5,13,0 -3276,3,3,0,0,2,4,5,0,3,1,4,16,1,20,0 -3277,3,6,0,4,0,1,12,0,2,1,13,0,3,12,0 -3278,10,6,0,11,3,0,6,2,0,0,17,6,2,6,0 -3279,0,8,0,6,0,6,2,1,2,1,8,8,2,2,0 -3280,7,4,0,10,5,6,4,1,2,0,18,5,1,3,1 -3281,1,1,0,8,1,3,0,1,4,1,13,6,2,37,0 -3282,0,3,0,13,0,6,0,3,2,1,5,11,5,7,0 -3283,2,4,0,8,6,5,3,4,2,0,4,11,0,10,0 -3284,6,6,0,0,0,6,5,4,0,0,10,11,5,28,0 -3285,9,8,0,12,5,6,1,5,2,1,1,8,2,25,1 -3286,2,8,0,4,2,1,11,0,3,0,5,11,0,3,0 -3287,4,2,0,8,3,1,4,3,2,1,1,7,4,16,1 -3288,3,4,0,11,0,0,5,0,3,1,12,11,0,22,0 -3289,7,7,0,1,3,5,9,2,1,1,5,8,4,35,1 -3290,2,1,0,0,6,2,1,0,2,0,5,15,0,1,0 -3291,2,4,0,3,1,0,5,3,2,1,13,15,0,24,0 -3292,2,3,0,4,0,5,5,3,3,0,13,5,1,37,0 -3293,2,1,0,11,6,1,10,5,0,0,8,2,2,0,0 -3294,10,6,0,9,6,0,10,5,3,1,0,7,0,19,1 -3295,3,0,0,8,2,4,8,2,1,1,0,15,1,38,0 -3296,5,3,0,3,0,5,12,4,1,0,1,6,1,39,0 -3297,8,5,0,6,3,5,9,3,2,1,17,16,4,35,0 -3298,7,0,0,4,2,2,0,3,3,0,11,8,5,4,1 -3299,1,8,0,1,2,5,5,0,2,0,16,9,0,19,0 -3300,7,0,0,8,3,2,1,3,0,1,18,17,2,12,1 -3301,0,8,0,2,6,0,4,0,3,1,2,0,4,19,0 -3302,7,6,0,2,0,0,3,0,0,1,13,3,5,8,0 -3303,2,6,0,11,1,6,14,4,4,0,4,6,0,34,0 -3304,2,1,0,8,3,6,10,5,3,1,3,2,0,4,0 -3305,8,3,0,14,6,6,11,5,3,1,10,20,4,25,1 -3306,8,5,0,7,3,1,10,3,4,1,1,14,1,4,1 -3307,3,1,0,11,6,6,14,0,0,1,8,20,0,6,0 -3308,1,6,0,11,2,4,8,5,2,1,13,6,1,19,0 -3309,1,4,0,1,0,1,2,4,0,0,2,11,1,36,0 -3310,0,8,0,10,2,3,13,5,2,0,5,7,4,7,1 -3311,0,6,0,13,6,1,14,1,4,1,5,11,0,12,0 -3312,6,3,0,9,2,2,6,0,3,0,8,20,0,36,0 -3313,7,8,0,6,2,6,9,0,4,0,3,11,3,14,0 -3314,3,3,0,12,3,2,10,3,3,0,13,2,0,33,0 -3315,5,4,0,10,3,2,1,3,3,1,18,13,1,22,1 -3316,10,8,0,3,1,5,9,3,0,0,15,2,1,7,0 -3317,0,6,0,9,3,6,9,3,2,1,8,17,3,20,0 -3318,2,0,0,7,0,0,8,5,3,1,3,15,5,9,0 -3319,7,8,0,4,3,0,1,2,0,0,3,14,4,3,0 
-3320,1,4,0,11,1,4,14,5,4,1,9,1,1,17,1 -3321,1,7,0,11,0,5,4,0,0,1,12,6,5,7,0 -3322,8,7,0,4,0,4,0,5,0,0,13,11,5,30,0 -3323,9,5,0,3,0,1,9,3,0,0,10,16,1,35,0 -3324,8,7,0,2,0,2,2,0,0,1,8,2,5,9,0 -3325,4,4,0,1,0,2,9,0,2,1,9,17,4,23,1 -3326,9,8,0,3,6,5,9,0,2,0,12,18,4,28,0 -3327,6,7,0,12,5,4,1,1,4,0,15,2,0,21,0 -3328,9,4,0,6,0,6,7,5,4,0,16,20,4,16,1 -3329,3,8,0,3,1,0,14,4,1,0,13,11,2,23,0 -3330,1,1,0,3,3,4,10,3,2,0,2,2,1,38,0 -3331,5,6,0,14,3,5,13,4,4,1,13,14,5,9,0 -3332,6,3,0,1,5,5,14,4,1,1,2,4,1,14,0 -3333,10,5,0,13,5,3,2,2,3,0,16,5,2,29,0 -3334,2,4,0,5,1,0,6,4,1,0,4,18,2,26,0 -3335,6,1,0,3,2,1,2,5,1,1,16,1,0,17,1 -3336,1,8,0,2,3,2,11,3,1,0,15,2,0,26,0 -3337,0,2,0,8,0,3,14,2,0,1,13,11,4,29,0 -3338,8,6,0,6,0,1,4,3,1,0,12,11,1,8,0 -3339,3,8,0,6,3,1,8,1,3,1,2,11,1,39,0 -3340,8,8,0,6,1,3,3,5,3,1,18,17,4,21,1 -3341,4,1,0,4,5,2,11,0,0,0,15,17,5,21,1 -3342,5,0,0,15,0,6,0,2,1,1,15,1,3,6,0 -3343,2,0,0,1,6,5,9,4,1,0,8,13,4,33,0 -3344,7,2,0,12,1,4,1,5,0,1,5,1,3,19,1 -3345,3,6,0,13,4,3,10,3,2,1,13,6,0,37,0 -3346,7,0,0,14,4,4,0,4,4,1,18,16,3,5,1 -3347,7,3,0,9,0,0,9,2,0,1,13,15,0,0,0 -3348,5,1,0,4,5,5,5,3,0,0,12,2,0,8,0 -3349,0,4,0,8,4,5,7,4,0,1,2,11,1,38,0 -3350,9,2,0,11,5,4,12,2,1,1,2,18,2,9,0 -3351,1,4,0,3,1,4,9,1,0,1,2,13,1,34,0 -3352,2,7,0,11,0,0,4,0,0,1,17,11,2,21,0 -3353,10,8,0,6,2,4,12,4,3,0,4,6,0,27,0 -3354,0,7,0,14,0,6,7,1,0,0,12,2,4,38,0 -3355,1,1,0,3,3,4,5,4,2,1,8,16,1,41,0 -3356,2,4,0,4,0,5,4,1,1,1,2,18,0,14,0 -3357,3,7,0,11,6,1,0,3,4,0,16,11,3,20,0 -3358,10,2,0,12,6,0,4,4,1,1,1,0,4,6,1 -3359,6,7,0,9,2,0,2,4,0,0,17,11,5,38,0 -3360,2,8,0,11,5,0,10,2,0,1,17,2,4,34,0 -3361,0,0,0,9,5,2,14,3,3,0,6,2,4,40,0 -3362,0,0,0,6,6,3,8,3,1,0,6,13,1,22,0 -3363,0,8,0,9,5,1,7,3,2,1,13,15,1,29,0 -3364,9,4,0,1,1,4,9,4,1,1,2,9,5,20,0 -3365,5,0,0,8,6,4,7,3,2,0,0,16,3,30,0 -3366,3,1,0,14,1,5,9,2,4,0,13,15,3,38,0 -3367,7,7,0,7,3,2,2,5,4,1,18,10,4,9,1 -3368,1,6,0,11,5,4,3,4,3,1,2,2,1,3,0 -3369,0,8,0,11,0,6,11,0,3,1,12,18,0,13,0 -3370,0,6,0,15,0,3,7,3,0,1,2,9,5,29,0 
-3371,9,3,0,5,3,0,13,1,3,1,17,0,0,16,0 -3372,6,0,0,10,6,3,4,4,1,1,8,18,3,0,0 -3373,1,1,0,1,5,0,9,2,2,1,4,14,0,29,0 -3374,6,1,0,0,4,0,11,1,2,0,13,15,4,39,0 -3375,4,6,0,6,0,3,3,0,2,0,12,18,2,8,0 -3376,7,1,0,12,4,1,10,3,3,0,7,10,4,21,1 -3377,9,0,0,6,0,4,12,3,1,0,17,15,0,21,0 -3378,3,8,0,5,5,0,9,3,1,0,0,15,5,33,0 -3379,0,2,0,7,0,4,14,3,2,1,12,17,4,10,0 -3380,1,8,0,7,5,3,3,1,4,0,0,2,2,26,0 -3381,3,6,0,4,1,3,3,0,0,1,17,0,1,33,0 -3382,2,7,0,0,2,5,8,3,3,1,2,4,2,14,0 -3383,4,8,0,2,6,4,9,3,4,0,8,18,1,3,0 -3384,2,1,0,6,1,2,14,5,1,1,13,9,4,26,0 -3385,1,3,0,13,3,5,10,0,0,0,13,16,5,32,0 -3386,10,5,0,3,6,6,3,1,0,0,2,11,2,1,0 -3387,2,0,0,1,3,6,7,4,2,0,0,18,2,16,0 -3388,4,3,0,11,3,0,8,3,2,1,13,11,1,23,0 -3389,1,3,0,8,3,2,0,1,3,1,8,18,1,4,0 -3390,0,5,0,10,6,3,2,2,0,1,8,18,0,27,0 -3391,8,2,0,5,0,5,12,5,1,0,6,9,0,37,0 -3392,4,0,0,2,1,0,5,3,1,0,8,15,5,19,0 -3393,1,5,0,15,3,0,13,0,4,0,0,2,0,4,0 -3394,5,6,0,15,6,5,6,3,1,0,3,20,5,21,0 -3395,2,7,0,3,1,5,9,0,1,0,4,13,3,4,0 -3396,0,4,0,1,3,5,11,2,1,1,18,0,5,7,1 -3397,9,2,0,6,0,4,5,3,4,1,2,0,2,27,0 -3398,1,7,0,8,2,1,6,3,1,0,15,13,2,3,0 -3399,8,5,0,4,1,0,3,1,1,0,2,4,2,39,0 -3400,0,2,0,12,1,2,8,4,2,0,2,2,4,36,0 -3401,1,8,0,3,4,0,3,3,1,0,1,2,4,39,0 -3402,8,8,0,4,1,1,1,1,4,1,15,15,1,39,0 -3403,10,6,0,2,5,3,13,5,1,1,8,8,0,30,1 -3404,1,6,0,9,0,5,3,2,0,1,4,20,0,26,0 -3405,10,0,0,12,0,2,6,5,3,1,13,2,2,14,0 -3406,0,1,0,12,6,2,5,0,0,1,11,15,0,17,0 -3407,4,6,0,13,5,0,13,5,1,0,16,14,0,40,1 -3408,5,7,0,10,2,5,14,1,2,0,13,1,0,33,0 -3409,6,0,0,15,5,1,3,0,0,1,16,17,2,41,1 -3410,1,0,0,10,1,0,2,4,4,0,2,2,0,32,0 -3411,10,8,0,7,0,0,9,0,4,1,2,14,1,4,0 -3412,9,8,0,8,4,2,1,3,0,0,4,1,1,35,1 -3413,0,8,0,13,1,2,11,4,3,0,8,11,3,30,0 -3414,2,0,0,4,5,3,3,4,0,1,10,11,0,8,0 -3415,8,3,0,13,3,2,5,5,3,1,7,6,3,20,1 -3416,7,2,0,12,4,4,4,0,0,1,15,7,2,19,1 -3417,0,0,0,6,6,6,3,4,3,0,8,3,5,33,0 -3418,3,4,0,10,0,6,4,1,1,1,13,1,0,8,0 -3419,2,0,0,13,1,5,13,0,3,0,2,3,0,30,0 -3420,3,8,0,7,6,0,0,5,0,1,3,10,2,18,1 -3421,1,6,0,3,4,6,4,1,1,1,13,9,1,24,0 -3422,9,7,0,6,2,5,2,5,0,0,2,15,2,25,0 
-3423,5,4,0,1,5,0,4,5,1,0,16,17,4,14,1 -3424,1,2,0,3,6,6,13,4,1,1,2,2,5,9,0 -3425,3,6,0,11,6,0,5,3,1,1,16,19,2,18,0 -3426,0,8,0,13,6,5,0,5,2,1,2,11,4,0,0 -3427,1,3,0,12,3,6,6,2,3,0,13,19,1,40,0 -3428,3,4,0,14,0,0,12,3,0,0,8,18,1,34,0 -3429,1,7,0,15,5,0,14,1,1,0,2,13,4,20,0 -3430,9,7,0,8,6,0,6,2,4,1,5,7,4,31,1 -3431,9,2,0,8,1,5,9,1,4,0,15,13,3,25,0 -3432,2,6,0,4,0,5,8,2,2,1,4,9,1,40,0 -3433,0,2,0,0,6,4,8,4,1,1,13,2,1,8,0 -3434,0,1,0,5,2,5,2,0,1,1,13,8,1,35,0 -3435,3,4,0,15,6,1,14,5,3,0,10,17,5,18,1 -3436,3,4,0,15,6,1,11,5,4,0,7,6,4,16,1 -3437,0,2,0,5,6,0,5,1,4,1,13,5,0,2,0 -3438,10,6,0,13,4,1,4,3,0,1,4,0,5,19,0 -3439,5,1,0,7,4,1,13,0,0,1,18,19,2,13,1 -3440,10,0,0,6,1,3,9,3,3,1,0,17,0,35,0 -3441,9,6,0,6,4,2,9,1,4,1,9,10,5,41,1 -3442,0,0,0,3,2,5,11,4,4,1,8,15,4,22,0 -3443,9,5,0,0,3,5,9,1,3,0,8,2,0,27,0 -3444,4,3,0,11,6,2,4,5,1,0,18,16,0,11,1 -3445,9,5,0,14,2,1,2,5,0,1,3,1,1,23,1 -3446,7,0,0,3,3,0,14,3,4,1,4,11,4,37,0 -3447,6,2,0,15,0,5,1,5,1,0,9,7,4,6,1 -3448,8,5,0,4,6,1,4,5,3,0,13,11,1,7,0 -3449,4,4,0,12,5,1,14,3,0,1,16,7,2,41,1 -3450,8,4,0,12,2,2,9,5,2,1,18,19,5,29,1 -3451,7,8,0,3,1,6,14,0,2,1,4,5,3,28,1 -3452,0,2,0,7,2,5,8,4,3,1,8,13,0,32,0 -3453,4,6,0,8,4,4,12,0,3,1,15,11,5,9,0 -3454,7,5,0,0,6,1,8,4,1,0,18,7,4,33,1 -3455,5,2,0,7,0,3,4,3,4,0,5,15,0,23,0 -3456,7,5,0,7,2,0,10,3,4,1,18,7,3,16,1 -3457,0,8,0,15,1,1,6,5,0,1,12,10,0,33,1 -3458,1,1,0,13,4,1,8,0,3,1,11,19,4,41,1 -3459,0,1,0,1,4,3,14,0,4,1,8,11,5,8,0 -3460,7,1,0,9,2,3,2,0,3,1,11,7,1,22,1 -3461,9,5,0,0,0,1,5,2,3,0,12,15,0,8,0 -3462,1,0,0,6,6,4,7,0,3,0,10,18,4,24,0 -3463,1,5,0,7,0,1,5,4,2,1,8,1,1,40,0 -3464,8,7,0,11,1,0,2,1,3,1,5,7,4,27,1 -3465,10,8,0,15,0,3,11,5,4,0,18,7,4,31,1 -3466,9,0,0,7,3,0,9,4,1,1,18,17,2,23,1 -3467,4,6,0,15,1,3,7,4,1,1,4,15,3,24,0 -3468,7,6,0,13,2,4,0,5,1,1,7,15,2,0,1 -3469,5,7,0,11,1,6,3,1,0,0,17,2,0,4,0 -3470,10,4,0,10,2,1,3,5,4,0,16,7,2,38,1 -3471,1,7,0,7,0,6,11,3,0,1,13,17,1,18,0 -3472,3,6,0,2,6,6,5,1,3,1,5,13,2,4,0 -3473,10,0,0,9,5,3,11,5,3,0,18,1,5,1,1 
-3474,5,1,0,13,0,4,14,5,3,0,17,11,5,3,0 -3475,9,1,0,12,2,2,12,1,0,1,5,10,3,27,1 -3476,8,4,0,8,2,0,0,5,1,1,5,20,1,31,0 -3477,5,1,0,14,6,2,10,4,3,1,16,5,4,11,1 -3478,5,2,0,8,4,2,4,5,1,0,4,2,0,25,1 -3479,6,6,0,10,6,1,12,4,3,0,10,18,1,31,1 -3480,0,5,0,8,5,3,4,3,1,1,15,0,2,12,0 -3481,7,5,0,7,3,0,9,3,1,0,8,17,2,4,0 -3482,0,3,0,8,6,5,8,2,1,1,8,13,5,23,0 -3483,3,6,0,15,6,5,12,5,2,0,1,5,0,26,1 -3484,2,7,0,15,2,2,14,3,0,0,18,7,4,0,1 -3485,1,7,0,1,2,5,3,3,1,0,15,6,3,10,0 -3486,9,4,0,14,6,3,8,2,0,0,8,20,5,13,0 -3487,9,4,0,8,2,0,5,1,2,1,10,2,0,22,0 -3488,3,7,0,5,3,0,13,0,2,1,8,11,3,24,0 -3489,7,2,0,6,4,3,1,5,0,1,18,10,2,30,1 -3490,10,8,0,8,4,0,2,2,3,0,13,11,4,28,0 -3491,0,2,0,14,0,6,5,3,2,0,2,0,3,27,0 -3492,6,8,0,0,5,5,4,3,1,0,10,2,2,29,0 -3493,0,6,0,10,0,4,9,1,2,1,13,11,2,11,0 -3494,9,7,0,12,0,5,2,2,0,1,2,13,0,11,0 -3495,3,4,0,12,1,1,3,3,3,1,3,0,0,0,0 -3496,0,0,0,5,6,3,6,4,4,0,12,2,3,26,0 -3497,5,6,0,10,0,0,1,1,1,0,17,13,0,10,0 -3498,2,6,0,1,4,1,13,3,4,0,6,9,3,6,0 -3499,0,5,0,5,6,4,8,5,2,1,2,18,2,24,0 -3500,8,4,0,0,3,5,10,3,1,0,5,20,1,9,0 -3501,3,0,0,12,4,5,9,2,0,0,8,0,1,2,0 -3502,1,7,0,6,1,5,5,3,2,1,8,2,1,23,0 -3503,3,5,0,12,6,0,8,0,1,1,8,13,3,36,0 -3504,0,5,0,12,3,6,2,1,2,1,13,2,0,19,0 -3505,9,4,0,10,4,1,14,2,4,1,17,10,1,23,1 -3506,0,6,0,6,0,5,4,3,4,1,15,2,3,32,0 -3507,5,8,0,12,6,6,5,5,2,1,18,16,4,2,1 -3508,5,7,0,1,0,0,7,0,3,0,3,8,3,23,1 -3509,8,1,0,15,2,2,3,0,0,0,5,15,3,18,0 -3510,9,8,0,5,6,6,9,3,2,0,2,18,1,31,0 -3511,10,4,0,13,1,0,10,4,0,1,16,11,4,18,0 -3512,4,5,0,3,2,6,1,3,3,1,8,4,1,21,0 -3513,6,2,0,10,4,2,0,4,0,1,18,6,2,3,1 -3514,1,2,0,13,0,2,1,5,1,1,10,15,1,7,0 -3515,2,0,0,1,4,6,11,3,0,1,4,0,3,4,0 -3516,5,8,0,7,2,5,6,4,1,0,15,12,4,12,0 -3517,2,0,0,0,6,0,0,2,4,0,8,3,2,37,0 -3518,8,5,0,5,1,4,4,5,1,1,7,7,5,8,1 -3519,1,1,0,8,4,2,0,3,0,1,15,19,1,6,0 -3520,6,0,0,1,6,6,9,1,0,0,15,6,4,22,0 -3521,0,1,0,6,3,2,9,3,2,1,10,11,0,36,0 -3522,3,1,0,4,3,3,10,3,3,0,16,3,5,21,0 -3523,0,1,0,3,6,4,11,5,0,1,13,9,1,29,0 -3524,6,5,0,6,5,6,0,1,3,1,6,2,2,41,0 -3525,5,8,0,5,6,0,9,3,3,0,8,5,2,17,0 
-3526,9,6,0,13,1,0,3,1,3,0,4,3,0,27,0 -3527,4,2,0,14,3,6,12,3,2,1,8,5,4,36,1 -3528,0,1,0,11,4,1,13,2,4,0,1,1,3,19,1 -3529,3,7,0,7,3,0,7,3,0,1,10,11,0,17,0 -3530,2,6,0,10,0,0,9,1,4,1,2,3,1,2,0 -3531,2,6,0,1,1,6,7,4,3,0,2,2,4,29,0 -3532,7,8,0,6,5,4,10,0,2,0,8,4,0,26,0 -3533,9,5,0,13,5,6,3,0,0,0,10,2,0,26,0 -3534,0,0,0,12,2,5,5,2,4,1,13,13,2,4,0 -3535,3,4,0,5,5,6,5,4,3,0,10,6,3,23,0 -3536,3,2,0,2,0,4,1,2,4,1,13,18,4,16,0 -3537,1,1,0,11,1,0,4,3,1,0,13,11,1,23,0 -3538,1,0,0,4,1,6,10,4,2,0,2,13,1,29,0 -3539,2,6,0,6,1,6,2,4,1,0,8,3,4,33,0 -3540,9,4,0,12,0,0,7,2,3,1,10,2,2,17,0 -3541,6,2,0,7,6,5,10,5,2,1,18,1,0,19,1 -3542,9,5,0,9,3,3,10,0,3,0,13,9,5,12,0 -3543,5,4,0,3,0,4,8,2,1,1,13,18,4,30,0 -3544,3,4,0,12,1,6,3,3,1,0,2,9,1,4,0 -3545,0,7,0,1,5,2,8,0,3,1,18,16,3,13,1 -3546,0,8,0,10,0,6,5,1,2,1,16,0,1,14,0 -3547,0,0,0,11,0,4,0,5,4,1,4,6,2,0,0 -3548,9,0,0,3,4,1,14,5,0,1,5,9,3,38,1 -3549,9,3,0,4,0,1,4,0,1,0,17,0,1,9,0 -3550,1,3,0,1,4,0,7,1,3,1,10,14,4,37,0 -3551,7,8,0,9,0,3,11,3,1,1,0,3,3,11,0 -3552,0,3,0,8,3,0,5,4,4,0,4,20,0,22,0 -3553,0,0,0,13,1,3,6,0,1,1,8,11,2,23,0 -3554,4,5,0,12,6,1,11,1,2,0,5,19,3,9,1 -3555,2,6,0,11,3,2,5,1,0,0,4,6,1,3,0 -3556,3,2,0,12,2,2,6,0,4,1,16,18,3,33,1 -3557,9,4,0,12,1,1,9,5,0,0,16,8,5,23,1 -3558,7,1,0,6,4,2,6,1,1,0,9,0,3,3,1 -3559,7,3,0,0,0,6,11,1,0,0,17,20,2,0,0 -3560,7,8,0,0,5,5,0,4,3,0,2,9,1,8,0 -3561,7,5,0,2,4,2,13,5,0,1,16,7,5,2,1 -3562,2,6,0,3,6,1,9,4,3,0,8,2,4,27,0 -3563,0,7,0,14,6,4,0,3,4,0,6,2,3,25,0 -3564,0,8,0,9,0,4,5,0,1,1,12,15,3,14,0 -3565,9,4,0,14,3,5,10,3,0,0,8,15,2,10,0 -3566,2,1,0,13,5,0,11,3,4,0,4,6,2,25,0 -3567,9,5,0,6,4,6,2,0,3,0,15,13,3,8,0 -3568,5,5,0,1,0,0,12,1,0,0,17,19,2,4,0 -3569,5,3,0,8,4,3,0,0,4,1,13,20,0,23,0 -3570,0,8,0,8,6,5,13,0,2,0,13,5,0,38,0 -3571,2,8,0,9,4,6,14,2,2,0,2,6,3,40,0 -3572,5,8,0,8,6,5,10,2,2,0,14,6,2,38,0 -3573,8,6,0,6,4,2,7,5,1,1,1,1,3,25,1 -3574,8,2,0,8,3,1,10,1,4,0,3,8,2,38,1 -3575,8,1,0,15,6,2,4,0,2,1,18,10,3,37,1 -3576,9,8,0,5,1,0,14,4,4,1,13,2,1,5,0 -3577,0,0,0,13,1,3,1,0,0,1,4,0,0,0,0 
-3578,9,8,0,11,6,4,4,5,4,0,13,20,5,24,0 -3579,3,6,0,15,0,0,5,3,3,0,13,13,3,33,0 -3580,5,7,0,1,5,0,6,1,4,0,13,10,0,13,0 -3581,0,6,0,13,3,6,0,4,0,1,13,5,2,5,0 -3582,8,1,0,1,0,2,5,4,0,0,16,7,2,1,1 -3583,8,3,0,9,1,6,12,5,1,1,15,17,4,40,1 -3584,3,5,0,13,3,4,12,0,0,0,9,5,4,27,1 -3585,9,6,0,13,3,3,12,0,1,0,5,16,5,12,1 -3586,0,2,0,4,6,5,2,3,4,1,15,18,0,38,0 -3587,3,5,0,0,6,6,6,2,1,0,2,14,0,32,0 -3588,6,3,0,11,6,5,9,3,2,1,1,0,0,33,0 -3589,6,0,0,10,5,0,9,4,2,1,3,2,4,26,0 -3590,3,1,0,13,0,6,13,5,2,0,10,13,5,20,0 -3591,5,7,0,10,6,0,9,3,0,1,4,2,2,7,0 -3592,1,0,0,5,6,4,10,2,3,0,12,6,2,38,0 -3593,2,0,0,14,6,0,1,1,2,0,2,3,5,39,0 -3594,5,5,0,6,3,3,10,2,0,1,13,9,1,3,0 -3595,2,6,0,5,1,4,1,0,0,0,10,11,4,4,0 -3596,5,0,0,1,6,4,7,2,3,1,0,0,3,1,0 -3597,4,8,0,11,6,5,1,2,0,0,4,20,0,3,0 -3598,4,0,0,2,1,2,10,5,1,1,1,5,2,10,1 -3599,2,8,0,7,1,1,6,3,3,1,10,2,0,31,0 -3600,10,6,0,6,2,4,1,2,2,1,13,11,0,4,0 -3601,1,5,0,5,0,4,1,4,2,0,4,2,5,12,0 -3602,2,6,0,10,0,0,8,0,0,1,0,11,1,8,0 -3603,0,7,0,15,6,1,3,0,3,0,8,9,1,40,0 -3604,2,3,0,2,6,0,11,4,0,1,8,9,4,33,0 -3605,6,0,0,10,4,2,10,0,1,1,15,7,4,14,1 -3606,2,3,0,3,4,3,3,1,2,1,10,0,1,33,0 -3607,6,6,0,2,2,3,2,3,2,0,3,5,4,16,1 -3608,7,7,0,12,1,0,0,0,2,0,2,9,0,37,0 -3609,5,2,0,3,1,5,9,3,0,0,2,2,0,37,0 -3610,10,2,0,9,0,6,2,0,0,1,14,11,1,7,0 -3611,0,7,0,5,5,5,2,5,0,1,0,15,5,31,0 -3612,1,8,0,1,2,1,4,2,2,0,13,2,4,0,0 -3613,0,4,0,4,5,0,9,5,0,0,0,11,4,14,0 -3614,5,8,0,2,3,0,1,0,3,1,6,6,2,24,0 -3615,2,3,0,9,5,5,4,3,0,0,6,0,2,31,0 -3616,2,6,0,13,0,5,2,3,4,0,6,11,0,22,0 -3617,0,7,0,13,6,4,5,4,4,1,8,13,3,35,0 -3618,3,3,0,9,6,4,9,4,4,0,14,11,0,19,0 -3619,0,8,0,2,6,2,12,4,1,1,2,13,5,38,0 -3620,8,0,0,0,4,6,5,5,4,1,5,18,3,13,1 -3621,1,7,0,10,0,5,6,1,0,0,9,13,3,2,0 -3622,10,0,0,13,0,2,1,0,4,1,12,20,0,38,0 -3623,5,8,0,3,2,6,11,3,0,0,2,2,0,19,0 -3624,5,5,0,6,6,1,8,5,0,1,12,15,2,27,0 -3625,7,8,0,11,5,2,12,1,4,0,2,0,5,21,0 -3626,10,3,0,13,3,3,6,5,3,1,9,10,5,23,1 -3627,0,7,0,3,1,2,8,2,3,0,4,11,2,7,0 -3628,1,7,0,1,6,0,14,3,2,0,12,0,4,6,0 -3629,0,3,0,9,5,0,6,1,2,0,8,0,1,29,0 
-3630,6,3,0,7,3,3,13,1,0,1,5,10,0,6,1 -3631,0,8,0,6,0,5,0,2,1,0,15,6,4,3,0 -3632,3,2,0,9,2,1,5,4,1,1,13,11,2,11,0 -3633,2,7,0,9,2,5,14,0,2,0,13,13,5,37,0 -3634,1,4,0,5,3,6,8,2,4,1,8,0,5,29,0 -3635,3,6,0,2,5,5,14,0,3,1,16,9,0,14,0 -3636,0,0,0,10,0,3,6,2,1,0,13,18,1,27,0 -3637,8,3,0,5,6,0,14,1,3,0,3,20,4,20,0 -3638,4,6,0,6,5,6,0,4,1,0,9,20,5,10,1 -3639,7,1,0,10,4,5,8,3,0,0,2,2,3,25,0 -3640,5,2,0,15,0,4,12,3,0,1,13,15,4,8,0 -3641,5,1,0,13,3,6,0,3,2,1,17,6,0,16,0 -3642,4,7,0,5,5,6,3,4,3,0,2,11,2,26,0 -3643,7,2,0,5,0,5,0,5,2,1,2,11,0,3,0 -3644,6,2,0,3,0,5,1,3,0,1,0,2,2,33,0 -3645,10,8,0,2,1,2,14,4,3,0,11,15,0,4,0 -3646,0,6,0,15,2,4,1,1,4,0,13,11,5,3,0 -3647,9,8,0,1,1,4,5,3,1,0,13,17,1,25,0 -3648,0,5,0,4,6,2,3,1,3,0,8,0,1,39,0 -3649,10,1,0,1,2,6,13,3,4,1,3,5,5,23,1 -3650,6,6,0,9,4,5,9,5,1,1,13,18,3,11,0 -3651,8,2,0,10,6,0,12,3,4,0,4,15,0,28,0 -3652,0,1,0,5,1,2,0,2,4,1,0,15,2,5,0 -3653,0,2,0,15,6,0,1,4,1,0,14,13,3,30,0 -3654,0,8,0,13,2,5,14,2,0,0,4,15,3,4,0 -3655,9,8,0,7,0,6,10,3,1,1,13,12,5,11,0 -3656,2,7,0,15,5,3,14,3,0,1,8,2,3,16,0 -3657,8,3,0,5,5,3,6,0,1,0,18,8,1,16,1 -3658,2,2,0,4,2,6,5,3,3,0,6,19,5,19,1 -3659,0,0,0,3,0,0,9,4,3,0,2,2,0,14,0 -3660,7,6,0,1,3,5,12,3,4,1,8,11,5,37,0 -3661,3,6,0,12,0,4,6,0,4,0,2,2,1,21,0 -3662,10,5,0,8,5,6,6,3,4,1,4,16,1,2,0 -3663,10,4,0,0,6,2,14,1,1,1,1,16,0,39,1 -3664,9,6,0,6,1,2,7,0,0,0,5,18,0,13,1 -3665,1,5,0,4,3,6,0,0,1,0,9,18,2,21,1 -3666,0,7,0,13,6,0,5,4,3,0,2,12,5,23,0 -3667,6,2,0,1,5,0,0,1,4,1,7,1,5,5,0 -3668,2,5,0,11,4,1,8,4,3,1,6,20,0,22,0 -3669,2,0,0,3,1,0,4,4,1,0,6,18,5,19,1 -3670,5,1,0,3,1,0,13,0,4,1,12,8,3,40,1 -3671,2,4,0,6,2,1,5,5,1,1,17,2,2,22,0 -3672,8,4,0,7,0,4,2,4,2,0,17,0,4,24,0 -3673,9,0,0,0,2,5,14,0,3,1,16,13,3,4,0 -3674,3,3,0,0,2,6,9,4,1,1,13,11,1,29,0 -3675,10,5,0,0,3,6,5,5,2,0,1,0,5,37,1 -3676,8,4,0,11,3,5,9,3,2,1,0,6,5,36,0 -3677,0,2,0,2,1,6,0,5,2,1,7,19,1,29,1 -3678,10,8,0,14,6,3,9,0,2,1,8,2,2,1,0 -3679,6,5,0,15,2,5,8,0,1,0,1,1,1,27,1 -3680,5,6,0,7,5,5,8,3,3,0,1,6,3,22,0 -3681,7,3,0,5,2,4,6,5,0,1,13,3,4,14,0 
-3682,2,8,0,13,0,5,5,1,3,1,4,15,4,38,0 -3683,10,2,0,0,0,1,5,3,0,0,8,11,5,22,0 -3684,9,5,0,15,6,2,4,3,4,0,5,1,2,0,1 -3685,0,7,0,14,5,0,9,2,1,0,8,9,4,4,0 -3686,2,6,0,13,6,1,11,1,1,1,11,10,4,6,1 -3687,2,5,0,5,2,0,6,0,2,0,13,2,5,38,0 -3688,1,2,0,8,6,5,4,1,3,0,4,19,0,16,0 -3689,2,1,0,5,6,0,12,1,0,0,2,9,1,33,0 -3690,10,4,0,4,5,6,14,5,3,1,1,5,3,41,1 -3691,2,3,0,13,5,4,4,2,2,0,4,0,0,11,0 -3692,1,8,0,5,0,0,10,2,0,1,8,13,5,8,0 -3693,6,0,0,4,0,0,10,4,3,0,8,14,0,35,0 -3694,0,6,0,7,6,6,9,3,4,0,4,17,2,25,0 -3695,2,2,0,1,2,5,9,4,2,1,13,18,0,9,0 -3696,0,6,0,3,5,6,14,3,2,0,3,6,2,26,0 -3697,3,8,0,1,5,5,10,2,0,1,15,11,0,9,0 -3698,1,3,0,13,0,3,0,4,1,1,2,0,0,31,0 -3699,0,8,0,8,6,1,12,3,1,1,0,19,5,8,0 -3700,0,0,0,12,1,5,7,5,3,0,2,0,5,3,0 -3701,10,7,0,0,5,6,7,1,2,0,4,2,2,13,0 -3702,9,5,0,0,5,5,5,2,3,0,17,15,4,37,0 -3703,2,4,0,5,6,5,3,2,1,1,13,15,0,38,0 -3704,3,4,0,6,1,0,8,1,2,1,17,11,2,23,0 -3705,2,4,0,10,1,3,8,4,4,0,13,2,5,23,0 -3706,2,4,0,7,1,6,11,4,4,0,0,12,2,0,0 -3707,10,5,0,15,1,0,9,5,3,0,7,7,3,25,0 -3708,2,0,0,7,4,3,6,1,4,0,10,4,4,4,0 -3709,0,4,0,11,1,0,9,4,3,1,2,14,1,24,0 -3710,7,5,0,12,0,1,6,2,4,1,14,4,3,12,0 -3711,0,4,0,7,2,5,8,4,1,0,2,16,1,12,0 -3712,0,3,0,1,0,1,12,3,0,1,8,20,1,11,0 -3713,5,7,0,4,6,4,8,2,0,1,10,12,0,33,0 -3714,2,2,0,0,0,4,8,0,2,1,10,14,1,5,0 -3715,3,7,0,5,5,4,0,1,0,1,8,15,2,4,0 -3716,1,2,0,5,3,4,9,0,4,0,17,15,3,11,0 -3717,1,4,0,5,0,0,5,1,4,0,13,11,5,27,0 -3718,10,2,0,8,5,1,3,0,4,1,18,14,4,32,1 -3719,0,7,0,9,3,0,9,4,0,0,15,2,3,34,0 -3720,1,7,0,5,0,0,8,1,1,1,2,0,0,26,0 -3721,3,8,0,13,1,1,5,4,3,0,2,20,0,7,0 -3722,1,2,0,2,6,0,8,0,0,0,13,11,1,39,0 -3723,7,2,0,8,6,6,10,2,2,0,16,7,4,40,1 -3724,2,3,0,0,1,4,10,3,0,0,12,2,4,26,0 -3725,10,5,0,12,4,5,2,1,3,1,18,4,1,27,1 -3726,2,6,0,9,6,6,10,3,4,1,5,6,2,20,1 -3727,10,5,0,10,6,3,0,3,0,0,9,7,4,36,1 -3728,1,2,0,14,4,5,2,1,4,1,7,17,5,12,1 -3729,9,1,0,7,0,5,9,1,0,0,12,20,4,6,0 -3730,5,5,0,5,5,0,10,5,1,0,6,3,4,16,1 -3731,4,8,0,6,1,4,14,5,0,1,2,16,1,8,0 -3732,6,1,0,3,6,0,2,3,0,1,17,16,1,10,0 -3733,7,7,0,6,1,6,9,0,3,1,6,9,2,8,1 
-3734,4,8,0,11,0,4,9,5,0,0,12,13,0,8,0 -3735,5,3,0,4,6,4,12,2,0,1,10,11,0,30,0 -3736,1,6,0,12,2,2,10,0,3,1,2,13,5,33,0 -3737,1,2,0,13,6,2,8,1,3,0,2,12,4,23,0 -3738,6,1,0,3,6,0,1,5,1,0,7,19,0,33,1 -3739,3,2,0,6,0,5,12,1,1,1,11,7,3,1,0 -3740,9,8,0,6,2,3,14,1,2,1,8,0,2,26,0 -3741,2,1,0,13,0,3,0,3,0,1,13,5,3,32,0 -3742,2,7,0,1,2,6,13,3,2,1,12,6,2,0,0 -3743,6,7,0,10,0,5,11,4,4,1,2,6,4,3,0 -3744,2,7,0,3,2,3,2,3,1,1,9,3,4,37,1 -3745,4,6,0,6,5,2,0,0,2,1,2,18,2,26,0 -3746,0,4,0,7,2,4,0,4,0,1,2,9,0,33,0 -3747,7,1,0,14,5,2,8,2,0,0,1,10,4,33,1 -3748,0,8,0,10,1,5,14,4,2,1,2,2,5,21,0 -3749,2,3,0,15,0,4,7,3,2,0,1,14,5,6,0 -3750,8,8,0,5,6,4,1,3,0,0,0,13,3,38,0 -3751,6,6,0,0,4,1,11,0,4,1,18,10,5,3,1 -3752,3,2,0,6,0,0,9,2,3,0,8,1,0,3,0 -3753,0,4,0,12,1,6,0,4,2,0,7,7,0,0,1 -3754,0,1,0,7,2,6,9,2,2,0,2,16,3,10,0 -3755,2,8,0,5,5,4,0,2,0,1,0,18,2,6,0 -3756,9,3,0,0,2,5,5,1,2,1,6,11,3,39,0 -3757,3,4,0,2,1,3,4,0,3,0,16,1,3,30,1 -3758,8,6,0,3,1,6,5,5,2,1,2,6,0,21,0 -3759,1,3,0,0,3,5,0,0,1,0,11,9,0,19,0 -3760,9,3,0,4,2,3,13,3,0,1,2,2,1,24,0 -3761,0,8,0,8,1,0,14,0,1,0,10,2,4,26,0 -3762,0,1,0,5,0,5,8,1,4,0,17,14,0,35,0 -3763,10,2,0,13,3,1,4,0,0,1,11,11,1,12,0 -3764,1,6,0,5,6,0,14,5,4,0,10,2,3,30,0 -3765,5,6,0,6,0,3,13,0,1,0,18,10,1,35,1 -3766,4,2,0,3,4,0,4,4,1,1,9,11,0,6,0 -3767,1,3,0,12,0,4,0,3,4,0,4,6,0,14,0 -3768,1,4,0,3,1,2,2,1,1,0,9,15,0,11,0 -3769,5,1,0,6,2,0,7,4,4,1,14,11,0,31,0 -3770,0,3,0,14,1,0,2,3,0,0,7,6,5,26,0 -3771,0,2,0,10,4,2,14,4,2,0,8,0,2,12,0 -3772,1,2,0,9,0,5,3,0,4,0,1,9,2,7,1 -3773,2,5,0,3,6,4,7,5,2,0,8,6,5,35,0 -3774,0,4,0,6,0,1,9,0,4,1,2,6,0,13,0 -3775,2,1,0,1,2,1,14,0,0,1,8,16,0,41,0 -3776,2,6,0,2,1,0,3,4,2,1,10,0,0,17,0 -3777,1,5,0,0,4,4,4,2,0,1,2,20,4,17,0 -3778,3,1,0,6,2,0,14,5,1,0,4,20,1,38,0 -3779,7,7,0,8,1,2,5,1,4,1,18,1,4,12,1 -3780,8,1,0,0,6,2,12,1,2,0,2,13,5,12,0 -3781,3,0,0,11,6,3,12,1,3,0,13,17,2,16,0 -3782,6,7,0,12,1,5,7,5,1,1,10,17,5,34,1 -3783,9,5,0,14,5,1,11,0,4,0,14,17,4,22,1 -3784,2,7,0,4,0,6,1,3,4,1,5,19,1,8,0 -3785,1,5,0,5,5,0,11,5,1,0,17,11,2,11,0 
-3786,0,6,0,3,0,5,2,4,2,0,4,13,1,9,0 -3787,1,7,0,1,6,0,12,3,3,0,10,2,5,34,0 -3788,2,7,0,11,4,5,1,1,4,0,7,1,5,24,1 -3789,8,8,0,2,2,3,9,3,2,0,4,11,5,21,0 -3790,0,2,0,2,4,6,8,0,0,0,2,6,5,30,0 -3791,0,6,0,13,0,6,11,3,2,1,13,7,0,10,0 -3792,0,6,0,9,2,0,6,4,4,1,13,13,1,26,0 -3793,6,7,0,15,4,3,0,0,4,0,0,16,1,33,0 -3794,9,3,0,5,6,6,13,4,4,0,2,20,2,25,0 -3795,10,8,0,8,2,1,11,1,2,0,18,7,4,29,1 -3796,6,2,0,7,3,3,8,4,0,1,17,12,4,31,1 -3797,5,6,0,5,1,4,13,1,1,1,0,11,1,34,0 -3798,0,0,0,3,5,3,14,4,1,1,2,11,3,11,0 -3799,3,2,0,7,2,5,7,5,3,1,2,2,0,22,0 -3800,6,5,0,13,1,0,2,4,0,0,2,2,0,7,0 -3801,8,0,0,4,0,1,8,4,0,0,10,4,3,6,0 -3802,5,1,0,4,4,4,12,1,4,1,2,12,5,9,0 -3803,5,1,0,11,6,0,8,4,0,0,13,18,4,0,0 -3804,2,8,0,10,5,0,2,3,0,1,10,11,3,41,0 -3805,5,7,0,6,0,5,2,3,1,0,8,9,3,37,0 -3806,10,0,0,13,3,4,5,1,4,1,8,9,0,26,0 -3807,6,7,0,9,0,2,11,5,3,1,6,17,5,38,1 -3808,0,2,0,3,6,6,6,2,1,0,4,18,5,7,0 -3809,3,1,0,8,2,5,11,4,3,0,5,14,1,7,0 -3810,4,2,0,9,4,0,5,5,4,1,15,5,1,11,1 -3811,1,2,0,4,3,6,3,1,2,1,8,8,5,4,0 -3812,2,0,0,9,1,4,1,0,4,0,13,2,3,30,0 -3813,0,3,0,0,6,5,5,4,3,0,8,12,5,39,0 -3814,10,1,0,0,3,2,13,0,2,0,9,18,1,41,1 -3815,8,3,0,6,1,1,3,2,1,0,8,15,1,27,0 -3816,5,2,0,11,6,5,11,1,0,1,8,9,0,35,0 -3817,9,0,0,8,3,4,6,1,1,1,4,13,0,12,0 -3818,10,4,0,5,5,0,8,1,0,1,0,11,1,23,0 -3819,2,1,0,15,5,0,11,2,0,0,8,2,0,40,0 -3820,3,4,0,8,0,0,10,3,3,1,8,9,0,27,0 -3821,8,1,0,8,4,2,0,0,1,1,7,7,4,6,1 -3822,0,4,0,0,1,5,1,4,4,0,6,15,1,13,0 -3823,3,2,0,0,5,0,1,4,0,0,2,15,4,9,0 -3824,5,1,0,14,1,6,7,4,1,0,16,4,4,8,1 -3825,1,3,0,0,3,3,10,1,0,0,17,9,4,21,0 -3826,6,8,0,14,6,5,0,5,4,0,15,20,2,30,0 -3827,5,1,0,7,6,0,12,5,0,1,1,12,4,33,0 -3828,4,3,0,5,1,5,11,1,3,1,6,11,3,19,0 -3829,9,7,0,9,6,3,9,3,4,0,0,13,2,18,0 -3830,4,8,0,12,0,1,9,1,1,1,17,18,5,26,0 -3831,0,6,0,5,4,5,14,5,3,1,4,0,3,1,0 -3832,0,1,0,14,3,2,1,5,2,1,18,14,5,9,1 -3833,3,0,0,11,6,1,2,1,2,0,11,19,3,19,1 -3834,3,5,0,13,5,0,0,1,0,1,8,18,0,18,0 -3835,5,4,0,2,0,0,3,4,2,0,8,5,3,35,0 -3836,10,5,0,8,4,3,7,0,4,1,9,17,2,21,1 -3837,0,4,0,9,5,4,5,0,0,1,13,11,0,5,0 
-3838,1,4,0,13,0,2,14,0,1,0,6,2,2,35,0 -3839,2,3,0,7,0,3,5,1,3,0,13,11,5,20,0 -3840,0,4,0,7,5,3,3,4,2,1,0,5,0,3,0 -3841,7,7,0,15,5,2,0,0,1,1,8,7,4,41,1 -3842,9,2,0,5,3,0,8,4,2,1,2,2,5,7,0 -3843,6,4,0,11,2,2,4,3,3,1,3,19,2,23,1 -3844,7,6,0,8,0,0,8,4,2,1,8,0,1,27,0 -3845,7,0,0,13,0,0,3,3,0,0,13,3,5,11,0 -3846,0,7,0,10,5,0,5,2,3,0,2,18,1,3,0 -3847,10,7,0,12,3,6,11,0,2,1,2,9,1,19,0 -3848,3,7,0,6,3,4,14,2,1,0,8,3,0,29,0 -3849,6,3,0,6,6,5,2,5,3,0,13,5,1,22,0 -3850,4,0,0,9,0,5,14,4,2,1,15,9,5,12,0 -3851,0,7,0,4,0,4,8,0,2,0,13,20,1,22,0 -3852,4,8,0,3,4,2,4,5,2,0,13,4,1,16,0 -3853,4,4,0,10,4,4,1,4,4,0,3,0,1,10,0 -3854,9,1,0,7,3,1,2,5,0,1,18,10,2,7,1 -3855,0,4,0,13,5,1,1,1,0,0,13,2,2,25,0 -3856,2,8,0,2,4,0,10,2,4,1,17,2,1,19,0 -3857,3,0,0,3,5,5,5,2,3,0,13,13,1,20,0 -3858,0,2,0,8,6,6,12,4,1,0,8,2,0,13,0 -3859,2,3,0,11,2,5,3,1,3,1,17,16,2,5,0 -3860,7,7,0,9,6,2,10,1,3,0,16,10,4,31,1 -3861,9,0,0,9,1,6,7,5,0,1,1,16,4,12,1 -3862,9,1,0,0,6,5,3,5,4,1,15,13,1,1,1 -3863,0,2,0,2,1,4,9,2,2,1,13,12,5,16,0 -3864,10,0,0,7,6,6,10,2,0,0,2,11,1,22,0 -3865,6,8,0,4,0,4,6,0,1,0,13,0,3,26,0 -3866,7,0,0,14,0,5,12,0,2,0,2,11,1,39,0 -3867,9,0,0,7,0,6,0,0,4,1,8,9,3,22,0 -3868,7,7,0,0,5,4,8,1,2,0,10,16,4,1,0 -3869,3,0,0,8,3,3,5,2,3,0,12,11,3,0,0 -3870,6,8,0,1,3,1,8,1,1,0,17,8,0,37,0 -3871,9,8,0,10,5,5,3,3,4,0,10,2,2,5,0 -3872,1,6,0,5,2,5,10,0,3,0,2,15,2,34,0 -3873,6,0,0,11,3,0,11,3,0,0,10,15,0,0,0 -3874,5,7,0,3,2,0,9,3,3,0,8,9,2,32,0 -3875,0,8,0,1,3,5,0,2,0,0,4,4,3,19,0 -3876,1,3,0,11,4,4,1,1,4,0,6,6,2,31,0 -3877,0,5,0,14,6,3,2,4,2,1,13,13,0,3,0 -3878,1,6,0,8,6,6,6,0,1,0,18,1,4,26,1 -3879,1,3,0,15,1,4,5,1,0,0,13,18,3,1,0 -3880,0,4,0,8,6,5,5,3,3,0,6,7,2,40,0 -3881,2,1,0,15,1,4,11,2,1,1,10,2,1,30,0 -3882,7,5,0,6,0,2,3,0,2,1,4,13,0,18,0 -3883,3,3,0,9,3,1,4,5,2,1,6,10,4,32,1 -3884,2,6,0,11,4,5,10,3,2,1,7,18,5,5,0 -3885,1,6,0,6,3,1,6,0,4,1,0,2,5,10,0 -3886,0,6,0,9,6,4,6,1,1,0,0,11,1,3,0 -3887,3,3,0,3,1,5,7,2,1,0,11,11,5,9,0 -3888,8,6,0,9,1,0,11,0,0,1,8,9,4,3,0 -3889,1,0,0,7,0,5,7,2,3,1,13,19,1,24,0 
-3890,7,2,0,14,6,4,3,0,4,1,18,1,4,18,1 -3891,1,6,0,9,6,5,11,3,2,1,8,11,0,28,0 -3892,0,6,0,13,0,4,7,0,2,1,2,2,2,36,0 -3893,2,3,0,1,0,5,5,3,0,1,4,13,3,37,0 -3894,9,8,0,2,4,5,4,0,2,0,8,13,2,26,0 -3895,5,1,0,8,0,5,5,3,0,1,2,19,3,34,0 -3896,8,2,0,1,6,3,8,4,2,0,5,19,5,7,1 -3897,1,6,0,9,2,3,2,4,0,1,13,15,0,19,0 -3898,5,3,0,15,3,4,5,3,2,1,4,11,1,17,0 -3899,4,8,0,15,6,4,8,2,2,1,8,15,1,5,0 -3900,1,8,0,1,1,5,11,0,4,0,4,20,0,9,0 -3901,0,5,0,5,0,6,7,5,0,0,0,6,3,22,0 -3902,3,3,0,13,3,3,10,0,0,1,11,0,0,39,0 -3903,1,0,0,1,5,6,7,1,0,0,1,2,5,39,0 -3904,9,6,0,11,2,5,14,2,2,0,2,18,1,40,0 -3905,2,5,0,6,6,5,12,4,0,0,1,4,4,33,0 -3906,2,6,0,13,0,0,14,1,1,1,9,2,0,37,0 -3907,8,0,0,3,5,4,0,2,2,1,2,17,0,5,0 -3908,4,2,0,2,6,6,9,5,4,0,13,2,5,6,0 -3909,2,6,0,6,1,0,2,2,3,0,2,15,1,36,0 -3910,7,7,0,0,0,5,5,2,0,0,17,9,3,20,0 -3911,3,8,0,0,5,4,5,2,2,1,17,20,3,11,0 -3912,1,8,0,9,0,6,0,1,3,1,5,0,3,10,0 -3913,0,7,0,0,1,3,11,2,0,0,2,2,4,0,0 -3914,3,4,0,5,6,6,10,3,0,0,4,13,1,38,0 -3915,10,1,0,5,6,0,12,2,1,1,5,7,2,21,1 -3916,9,4,0,8,0,5,11,3,1,0,2,18,1,11,0 -3917,2,6,0,8,4,6,7,2,0,1,15,14,3,26,0 -3918,8,3,0,4,0,4,12,1,3,1,4,6,2,27,0 -3919,1,3,0,14,3,2,9,3,1,1,13,9,1,22,0 -3920,0,0,0,13,0,0,13,1,0,0,8,15,2,14,0 -3921,9,3,0,0,1,0,9,1,4,0,2,8,3,18,0 -3922,0,4,0,1,0,5,0,0,0,1,12,0,3,25,0 -3923,10,5,0,0,6,5,8,0,2,0,9,8,0,25,1 -3924,1,0,0,13,2,0,3,1,3,1,5,13,4,25,0 -3925,10,5,0,2,6,5,7,4,1,0,18,5,4,36,1 -3926,2,5,0,4,6,5,8,5,1,1,4,2,4,26,0 -3927,2,1,0,12,6,4,11,3,1,0,16,9,0,23,0 -3928,8,0,0,11,5,1,0,3,2,0,14,5,0,19,1 -3929,0,5,0,10,5,5,5,0,4,1,0,2,1,6,0 -3930,1,4,0,12,3,6,14,3,2,0,17,13,0,39,0 -3931,9,4,0,6,0,2,3,2,4,0,5,19,3,27,1 -3932,6,0,0,2,0,3,14,4,2,0,12,11,3,3,0 -3933,6,2,0,9,6,6,1,2,0,1,14,7,5,19,1 -3934,0,5,0,7,6,4,2,1,2,1,3,2,1,7,0 -3935,5,6,0,12,0,5,14,3,3,1,17,2,0,14,0 -3936,1,3,0,3,6,6,3,1,1,1,8,6,0,8,0 -3937,1,0,0,12,5,4,9,4,0,0,10,20,2,0,0 -3938,6,4,0,7,4,4,7,3,0,0,15,15,2,7,0 -3939,0,4,0,9,2,4,9,5,1,0,5,6,5,0,0 -3940,10,6,0,4,6,1,5,4,2,1,5,17,1,1,1 -3941,1,1,0,13,5,0,9,4,3,1,6,13,4,22,0 
-3942,1,5,0,2,1,1,5,4,1,1,18,5,5,19,1 -3943,8,8,0,2,0,3,13,3,4,0,6,15,3,24,0 -3944,2,7,0,2,0,0,5,1,0,1,8,4,4,41,0 -3945,7,8,0,15,4,4,3,2,1,1,8,2,2,39,0 -3946,9,2,0,1,0,0,3,5,4,0,13,18,4,26,0 -3947,6,7,0,0,1,4,2,4,4,0,8,20,2,7,0 -3948,1,7,0,13,5,6,10,2,2,1,6,20,5,30,0 -3949,1,0,0,10,4,4,0,4,3,0,2,2,1,32,0 -3950,3,0,0,11,6,1,4,4,3,0,8,0,5,36,0 -3951,0,8,0,11,0,1,1,0,0,1,0,11,5,30,0 -3952,6,8,0,11,2,5,12,0,0,1,18,12,3,31,1 -3953,2,8,0,3,2,4,13,0,2,0,17,13,0,12,0 -3954,3,7,0,4,6,3,2,1,0,1,12,6,2,13,0 -3955,0,8,0,12,6,3,0,0,4,0,9,18,4,38,0 -3956,7,1,0,6,1,5,4,4,0,0,9,5,5,14,1 -3957,6,2,0,6,5,0,4,5,3,0,16,14,4,33,1 -3958,3,5,0,1,4,0,8,4,0,1,6,17,2,27,0 -3959,6,8,0,13,4,0,4,3,0,0,17,2,2,35,0 -3960,0,3,0,9,2,3,7,2,3,0,17,17,5,20,0 -3961,1,6,0,6,2,0,9,0,0,0,13,16,1,7,0 -3962,0,0,0,11,5,1,7,4,2,0,2,0,0,22,0 -3963,0,0,0,8,6,5,7,0,2,1,13,15,3,13,0 -3964,0,3,0,0,1,3,0,0,0,0,7,9,4,27,0 -3965,3,2,0,4,0,5,6,3,4,0,15,11,3,31,0 -3966,9,2,0,12,5,1,7,5,1,1,5,18,4,24,1 -3967,1,8,0,11,0,5,5,3,2,0,12,2,2,21,0 -3968,0,2,0,13,1,5,8,3,3,1,11,7,1,2,1 -3969,7,1,0,1,6,3,2,4,0,1,7,0,0,41,1 -3970,0,8,0,4,2,6,6,5,0,0,5,10,4,33,1 -3971,6,8,0,4,3,4,8,0,3,1,0,0,4,17,0 -3972,9,2,0,13,4,0,9,3,0,1,9,9,4,26,1 -3973,4,3,0,9,1,1,7,2,1,0,2,15,1,28,0 -3974,6,8,0,15,5,5,10,4,3,1,6,2,0,6,1 -3975,4,6,0,3,0,5,4,1,4,0,6,2,2,25,0 -3976,0,3,0,5,2,0,1,5,0,1,13,9,4,34,0 -3977,1,5,0,6,0,0,0,0,2,0,13,2,5,12,0 -3978,6,7,0,6,0,5,6,2,3,0,13,9,1,6,0 -3979,6,2,0,4,0,4,6,0,0,1,10,9,2,41,0 -3980,5,7,0,0,5,6,14,5,4,1,18,1,4,14,1 -3981,5,4,0,10,3,6,3,3,0,1,5,5,5,20,1 -3982,9,7,0,2,2,5,1,4,4,0,2,6,3,39,0 -3983,0,7,0,2,6,0,7,0,1,1,2,11,3,10,0 -3984,8,0,0,9,6,3,1,4,0,1,12,0,4,38,0 -3985,5,4,0,11,1,2,5,5,1,0,14,1,5,3,1 -3986,1,5,0,9,5,2,8,4,1,1,3,18,1,29,0 -3987,2,8,0,11,0,0,9,4,4,0,2,8,1,36,0 -3988,2,6,0,15,5,2,0,5,4,0,17,0,0,33,0 -3989,7,3,0,15,5,4,8,2,2,1,13,11,2,20,0 -3990,8,6,0,4,6,5,8,2,2,0,10,11,3,40,0 -3991,2,8,0,4,2,2,4,1,1,1,2,15,2,24,0 -3992,6,7,0,5,5,4,14,2,3,0,13,2,3,16,0 -3993,1,4,0,11,2,1,14,3,3,0,13,11,0,25,0 
-3994,0,5,0,13,2,3,12,2,1,1,2,11,0,14,0 -3995,2,7,0,8,5,3,1,4,2,1,2,11,4,10,0 -3996,10,4,0,15,0,6,4,3,1,1,17,11,4,21,0 -3997,7,0,0,3,2,2,7,1,3,1,2,15,0,4,0 -3998,9,0,0,12,0,6,5,0,1,1,7,11,2,5,0 -3999,2,1,0,3,1,5,8,4,2,1,15,20,2,35,0 -4000,0,6,0,0,0,0,10,5,2,0,13,3,5,26,0 -4001,3,3,0,4,6,5,5,3,3,0,8,0,0,20,0 -4002,0,6,0,3,1,0,3,0,0,1,1,18,4,35,0 -4003,10,3,0,3,1,5,14,1,0,0,2,18,3,33,0 -4004,0,0,0,7,1,4,7,4,0,0,15,15,0,18,0 -4005,2,0,0,7,5,6,3,5,3,1,18,8,5,1,1 -4006,0,3,0,0,2,5,1,5,1,1,5,5,5,12,1 -4007,4,5,0,9,6,1,0,1,1,0,1,1,2,23,1 -4008,1,8,0,1,0,4,1,1,0,1,2,20,4,8,0 -4009,5,0,0,9,0,5,2,4,1,0,18,2,4,29,0 -4010,0,8,0,1,0,3,13,3,2,0,2,15,2,12,0 -4011,0,5,0,8,6,0,3,3,0,0,17,11,2,1,0 -4012,8,8,0,1,2,1,14,1,3,1,5,17,5,14,1 -4013,8,1,0,7,4,1,8,4,3,1,18,7,2,2,1 -4014,2,8,0,5,5,1,1,0,0,1,17,10,1,37,0 -4015,10,3,0,12,6,1,14,5,3,1,9,7,4,19,1 -4016,2,2,0,15,1,2,2,2,4,0,2,20,5,25,0 -4017,3,3,0,1,1,3,2,0,0,1,8,2,2,3,0 -4018,6,6,0,15,3,3,8,4,1,1,11,11,4,26,0 -4019,9,8,0,3,1,6,2,1,4,1,3,7,3,19,1 -4020,9,0,0,9,2,2,14,4,3,0,2,2,4,38,0 -4021,4,1,0,9,4,3,14,2,2,1,7,8,2,21,1 -4022,2,2,0,2,1,6,11,2,0,0,2,20,1,39,0 -4023,0,8,0,4,1,1,10,3,3,0,7,15,4,6,0 -4024,0,8,0,15,6,0,4,5,0,1,5,10,1,29,1 -4025,4,8,0,5,5,4,11,4,0,1,2,20,1,10,0 -4026,9,1,0,7,4,0,0,4,3,0,2,20,1,0,0 -4027,2,5,0,15,4,3,8,5,1,1,10,12,4,21,0 -4028,7,0,0,14,3,1,3,5,2,1,11,19,1,32,1 -4029,2,0,0,13,1,4,1,1,0,1,10,15,0,35,0 -4030,4,6,0,9,5,4,13,4,4,0,2,9,2,17,0 -4031,2,4,0,15,6,5,3,4,1,0,10,17,1,33,1 -4032,8,5,0,8,0,2,7,1,4,1,16,11,4,8,1 -4033,5,4,0,10,2,0,14,4,0,1,2,0,1,28,0 -4034,2,6,0,8,5,4,0,3,2,0,17,13,5,2,0 -4035,5,6,0,10,0,6,10,4,2,1,9,17,5,14,1 -4036,1,0,0,8,2,6,13,3,0,0,8,11,0,6,0 -4037,1,7,0,12,4,0,4,1,2,0,0,11,4,1,0 -4038,3,0,0,2,0,2,1,4,0,0,9,11,2,28,0 -4039,10,4,0,7,4,0,4,4,0,0,13,15,4,21,0 -4040,1,8,0,7,4,2,3,5,0,0,17,9,3,4,0 -4041,6,3,0,15,1,5,14,3,1,1,9,1,4,22,1 -4042,10,1,0,7,2,2,12,3,3,0,10,2,4,26,0 -4043,10,3,0,13,1,2,1,3,0,1,6,20,2,36,0 -4044,8,2,0,12,2,5,0,5,1,1,3,7,5,38,1 
-4045,0,0,0,3,0,3,5,3,2,1,10,18,1,26,0 -4046,1,5,0,0,5,5,9,0,1,1,17,20,3,0,0 -4047,0,2,0,11,0,3,2,3,4,0,6,14,0,34,0 -4048,0,3,0,0,4,5,9,0,3,1,8,20,0,0,0 -4049,3,3,0,14,5,2,14,3,1,0,4,11,4,0,0 -4050,9,0,0,4,5,4,0,5,3,0,6,10,5,30,1 -4051,5,6,0,2,5,5,13,3,3,0,18,9,1,5,1 -4052,0,8,0,0,0,4,3,5,3,1,16,4,5,29,0 -4053,3,7,0,5,4,4,11,2,3,0,13,4,4,7,0 -4054,5,2,0,9,1,5,1,1,4,1,17,9,0,22,0 -4055,9,2,0,13,5,4,6,2,0,1,13,4,1,38,0 -4056,4,8,0,13,2,0,12,2,1,0,4,2,1,10,0 -4057,1,5,0,0,1,4,8,4,3,1,8,4,2,37,0 -4058,4,0,0,13,1,1,12,2,3,1,17,18,4,21,0 -4059,2,0,0,8,6,5,0,3,2,1,8,20,0,7,0 -4060,10,6,0,6,1,0,8,4,4,0,8,2,3,27,0 -4061,0,8,0,2,0,5,1,5,1,1,2,11,1,33,0 -4062,2,4,0,11,0,4,2,3,1,0,13,9,5,37,0 -4063,9,0,0,13,2,4,4,2,0,1,18,14,4,27,1 -4064,10,6,0,6,3,5,10,4,4,0,10,13,0,26,0 -4065,1,7,0,6,1,5,5,4,1,1,13,6,1,17,0 -4066,2,3,0,1,1,6,1,2,2,1,8,11,3,21,0 -4067,4,6,0,10,5,5,0,3,0,0,1,8,3,4,1 -4068,6,6,0,1,6,2,9,5,0,1,18,13,3,2,1 -4069,6,6,0,6,2,1,2,5,4,1,5,17,2,3,1 -4070,9,2,0,1,6,4,9,1,4,1,12,20,0,8,0 -4071,0,3,0,7,1,2,3,2,4,0,0,14,0,30,0 -4072,10,4,0,12,2,1,13,5,0,1,12,10,2,10,1 -4073,3,4,0,8,6,0,5,0,4,0,10,2,4,20,0 -4074,2,7,0,2,6,3,6,4,1,0,4,14,5,41,0 -4075,8,1,0,5,1,0,7,1,1,0,4,13,4,5,0 -4076,4,4,0,11,2,2,5,3,3,1,3,2,0,19,0 -4077,10,4,0,8,4,4,2,2,4,1,17,15,5,12,0 -4078,1,4,0,9,6,2,9,5,1,0,2,6,2,37,0 -4079,3,4,0,3,4,1,7,0,1,1,17,4,2,29,0 -4080,4,4,0,14,0,0,0,2,4,0,13,13,4,35,0 -4081,3,8,0,15,1,0,6,0,1,1,4,15,0,0,0 -4082,10,0,0,13,5,5,5,1,3,0,12,15,3,31,0 -4083,8,6,0,12,0,5,6,0,3,1,10,15,1,29,0 -4084,7,2,0,9,0,5,5,4,4,0,8,2,1,25,0 -4085,1,3,0,12,2,5,1,2,4,0,17,9,2,32,0 -4086,5,5,0,5,3,4,9,4,0,1,12,11,0,21,0 -4087,2,6,0,10,5,4,14,3,2,1,11,18,0,29,0 -4088,0,0,0,9,1,0,0,5,4,0,11,18,0,26,0 -4089,2,4,0,3,2,5,9,3,3,0,13,15,1,16,0 -4090,2,4,0,14,4,1,7,3,0,1,13,16,1,23,0 -4091,9,6,0,12,2,5,12,1,2,1,8,11,4,24,0 -4092,4,2,0,6,4,4,10,4,2,1,4,6,3,5,0 -4093,0,2,0,4,3,5,14,4,0,1,2,4,0,8,0 -4094,4,4,0,11,4,5,9,0,0,1,2,11,0,37,0 -4095,9,3,0,14,4,0,0,4,4,1,2,0,1,37,0 
-4096,6,6,0,10,1,2,11,1,2,0,15,1,3,11,1 -4097,3,1,0,6,3,0,3,0,3,1,10,12,1,27,0 -4098,0,0,0,7,3,2,5,4,3,0,13,2,2,12,0 -4099,5,1,0,13,6,5,13,2,0,0,13,6,1,35,0 -4100,6,8,0,1,6,5,2,3,3,0,2,0,4,13,0 -4101,0,5,0,5,2,0,5,0,0,0,2,8,3,7,0 -4102,6,6,0,14,1,4,7,3,2,1,13,3,4,14,0 -4103,7,1,0,4,0,4,7,1,2,0,13,7,4,12,0 -4104,7,4,0,2,3,3,6,0,3,1,1,8,2,19,1 -4105,3,8,0,9,0,6,0,3,0,1,4,9,0,27,0 -4106,8,7,0,0,5,1,10,0,1,0,0,16,2,19,1 -4107,0,7,0,8,3,6,12,4,4,0,2,13,1,11,0 -4108,6,3,0,13,3,0,11,4,0,0,0,11,2,32,0 -4109,2,3,0,13,1,4,11,0,2,0,17,6,5,23,0 -4110,1,1,0,12,3,0,10,4,1,0,2,18,5,31,0 -4111,0,6,0,1,0,0,10,1,4,1,15,17,0,29,0 -4112,4,3,0,4,0,5,3,4,3,1,14,8,2,12,0 -4113,0,7,0,13,0,5,2,5,2,0,2,2,1,6,0 -4114,5,8,0,12,1,6,9,4,1,0,13,2,3,40,0 -4115,6,1,0,4,1,4,8,5,2,1,13,0,5,21,0 -4116,8,8,0,2,4,5,5,2,0,0,0,11,4,9,0 -4117,9,1,0,1,0,5,1,1,3,1,6,15,4,20,0 -4118,9,4,0,10,2,1,14,4,3,0,16,1,4,10,1 -4119,3,6,0,3,6,3,9,3,1,1,4,13,1,26,0 -4120,2,8,0,3,0,6,3,4,4,1,0,15,2,8,0 -4121,6,2,0,7,1,3,12,3,4,1,10,10,1,13,1 -4122,5,0,0,9,1,2,13,1,4,0,18,8,3,3,1 -4123,7,1,0,10,0,3,7,4,0,0,4,2,0,24,0 -4124,1,7,0,7,1,6,2,0,1,0,14,12,4,10,1 -4125,4,1,0,0,4,5,9,1,2,1,8,10,2,8,0 -4126,0,8,0,6,1,1,5,5,4,0,13,2,0,6,0 -4127,9,2,0,1,3,3,7,5,0,1,9,5,2,30,1 -4128,2,2,0,5,6,3,6,3,3,1,17,1,2,8,0 -4129,6,1,0,0,4,1,0,5,0,1,18,20,4,1,1 -4130,10,6,0,0,0,3,7,4,2,0,8,11,2,36,0 -4131,5,0,0,13,3,5,14,1,1,1,15,9,1,33,0 -4132,9,5,0,3,5,4,4,1,1,0,17,14,3,3,0 -4133,7,4,0,2,2,2,9,4,1,1,13,0,1,2,0 -4134,1,7,0,8,2,5,9,1,2,1,16,14,0,8,0 -4135,4,1,0,15,6,3,14,4,1,0,9,2,5,2,0 -4136,5,2,0,7,4,1,3,5,4,0,15,12,4,8,1 -4137,1,7,0,13,1,0,1,2,4,1,13,6,4,22,0 -4138,7,3,0,9,4,0,2,4,1,0,12,10,2,9,1 -4139,0,2,0,4,1,5,0,0,3,0,13,11,5,10,0 -4140,8,5,0,14,3,1,10,5,1,1,16,20,2,4,1 -4141,3,5,0,8,1,4,5,0,3,0,12,11,3,18,0 -4142,1,5,0,0,1,6,3,4,4,1,6,0,1,5,0 -4143,7,8,0,14,6,1,5,1,1,0,2,2,2,14,0 -4144,1,1,0,13,0,0,10,4,2,0,13,15,0,11,0 -4145,0,3,0,3,3,2,5,2,2,0,11,11,2,27,0 -4146,0,3,0,15,6,4,3,2,4,0,8,7,3,24,0 -4147,8,1,0,12,2,0,1,5,0,0,1,7,3,20,1 
-4148,7,8,0,7,1,3,12,1,2,1,4,11,1,11,0 -4149,0,6,0,15,5,4,9,4,2,1,17,2,1,38,0 -4150,2,6,0,2,6,6,8,3,2,0,5,13,1,37,0 -4151,2,2,0,12,1,6,1,3,0,1,8,0,3,4,0 -4152,10,5,0,1,6,5,0,4,2,0,15,18,2,25,0 -4153,2,2,0,2,3,1,4,2,3,0,9,3,4,19,1 -4154,2,0,0,14,0,2,5,3,4,0,7,2,5,21,0 -4155,3,8,0,10,5,2,9,2,4,0,2,8,0,35,0 -4156,0,6,0,9,0,5,2,4,2,1,10,16,1,35,0 -4157,8,8,0,12,0,4,3,0,0,1,7,17,5,4,1 -4158,2,7,0,15,6,5,9,0,0,1,8,1,3,2,0 -4159,10,3,0,8,3,3,1,1,3,1,18,5,1,13,1 -4160,7,3,0,6,1,2,13,0,0,1,18,19,5,28,1 -4161,9,5,0,8,1,6,11,4,0,0,17,2,0,36,0 -4162,7,3,0,12,6,4,5,5,4,1,5,19,4,30,1 -4163,9,6,0,1,1,1,8,5,2,1,14,1,3,25,1 -4164,3,1,0,3,1,5,9,3,2,0,13,14,0,30,0 -4165,1,1,0,9,0,1,5,4,0,1,0,15,5,23,0 -4166,1,7,0,15,1,4,11,0,0,1,13,2,3,10,0 -4167,1,7,0,14,3,4,7,4,1,1,2,16,0,21,0 -4168,9,2,0,9,3,6,2,1,2,0,13,13,3,23,0 -4169,5,8,0,3,1,5,4,2,2,1,13,13,0,31,0 -4170,0,1,0,0,5,3,14,1,2,0,0,11,5,38,0 -4171,8,5,0,9,0,2,13,3,0,0,0,7,3,12,1 -4172,1,6,0,11,3,0,8,5,4,0,9,1,0,30,0 -4173,4,2,0,1,4,4,13,0,2,0,2,11,5,36,0 -4174,5,4,0,0,6,3,3,3,2,0,13,6,0,37,0 -4175,2,5,0,11,4,3,13,2,4,0,13,16,1,3,0 -4176,3,7,0,1,2,4,4,3,1,0,2,11,5,40,0 -4177,7,2,0,13,2,0,10,0,4,0,16,17,5,25,1 -4178,1,3,0,15,0,4,11,1,2,0,8,11,2,24,0 -4179,6,7,0,11,5,3,2,0,3,0,6,6,3,26,0 -4180,6,1,0,9,6,6,6,3,0,0,17,11,1,10,0 -4181,5,1,0,2,1,2,5,4,2,0,16,14,2,38,0 -4182,7,5,0,4,5,1,4,4,3,1,5,7,5,0,1 -4183,0,0,0,6,3,5,9,2,0,1,13,2,0,0,0 -4184,4,3,0,9,6,2,9,5,2,1,18,14,1,31,1 -4185,4,4,0,12,0,2,11,5,0,1,18,5,2,9,1 -4186,2,7,0,7,5,4,12,3,0,0,8,14,2,24,0 -4187,4,2,0,12,1,5,5,0,1,0,13,3,0,2,0 -4188,2,0,0,1,2,1,11,0,1,1,8,9,4,3,0 -4189,8,4,0,7,3,6,10,5,0,1,4,11,0,6,0 -4190,0,1,0,9,2,3,9,4,0,1,2,18,0,2,0 -4191,1,6,0,13,4,5,0,4,1,0,17,1,0,12,0 -4192,7,4,0,11,1,3,7,2,1,1,2,16,0,19,0 -4193,2,8,0,6,6,4,3,3,4,0,8,6,5,33,0 -4194,3,6,0,2,2,3,3,3,1,1,17,16,2,38,0 -4195,0,3,0,11,1,4,8,3,3,0,13,2,5,23,0 -4196,8,3,0,12,0,2,4,3,2,0,10,0,0,8,0 -4197,0,6,0,1,5,3,9,3,4,1,0,14,1,29,0 -4198,0,0,0,12,5,3,14,3,1,1,2,17,2,32,0 
-4199,3,4,0,6,2,6,13,5,4,0,9,9,1,18,1 -4200,4,5,0,12,1,3,11,0,0,1,11,3,1,18,1 -4201,2,0,0,14,2,0,7,5,0,1,8,11,3,25,0 -4202,5,6,0,8,0,4,4,2,1,0,8,4,3,37,0 -4203,5,8,0,1,3,0,9,4,2,0,2,11,1,11,0 -4204,3,0,0,9,3,6,0,5,0,0,10,10,1,5,1 -4205,1,3,0,4,1,4,14,1,0,1,7,11,2,9,0 -4206,9,1,0,0,2,0,5,3,0,0,2,2,3,28,0 -4207,1,6,0,8,1,3,8,1,3,1,4,15,1,12,0 -4208,2,6,0,15,0,2,6,4,0,1,18,12,4,39,1 -4209,2,3,0,2,0,1,0,5,1,1,18,10,3,14,1 -4210,8,3,0,0,6,1,12,3,1,1,17,2,0,34,0 -4211,8,5,0,14,0,1,8,4,1,1,7,11,5,10,1 -4212,8,2,0,13,5,2,6,5,2,0,14,4,0,21,1 -4213,4,2,0,5,4,6,8,4,0,0,13,9,2,26,0 -4214,6,1,0,1,3,1,6,5,0,1,16,7,4,2,1 -4215,10,4,0,13,3,3,7,4,3,0,13,8,2,13,0 -4216,5,6,0,8,6,1,13,0,4,1,13,13,0,1,0 -4217,8,2,0,11,5,4,9,3,1,0,13,20,1,38,0 -4218,8,7,0,8,6,0,13,0,0,0,3,11,4,2,0 -4219,9,4,0,7,1,1,7,5,0,1,0,1,2,5,1 -4220,1,5,0,4,3,0,4,3,4,0,17,2,5,2,0 -4221,2,0,0,3,6,6,0,5,3,0,13,13,3,29,0 -4222,9,7,0,6,1,4,8,3,0,0,13,8,1,20,0 -4223,8,0,0,4,3,5,2,5,1,1,16,10,1,14,1 -4224,4,4,0,2,6,5,12,1,2,1,8,11,0,24,0 -4225,5,5,0,11,3,5,10,4,3,1,13,2,2,38,0 -4226,1,8,0,13,2,5,5,3,3,1,8,2,0,2,0 -4227,8,0,0,11,4,6,5,1,2,0,0,2,2,37,0 -4228,8,2,0,7,0,3,3,3,2,1,17,6,1,21,0 -4229,6,1,0,0,2,2,4,5,3,1,5,5,2,3,1 -4230,2,8,0,3,0,2,8,4,0,1,8,20,1,22,0 -4231,2,6,0,13,4,5,3,2,3,0,8,2,5,4,0 -4232,7,8,0,6,0,0,9,3,0,1,1,15,5,2,0 -4233,1,0,0,5,2,2,0,2,0,0,17,6,0,4,0 -4234,0,1,0,2,5,4,2,0,4,1,2,2,2,35,0 -4235,2,7,0,5,2,0,5,3,3,0,16,4,2,35,0 -4236,1,4,0,0,6,2,1,1,3,1,6,8,5,18,1 -4237,0,5,0,6,1,4,6,4,0,0,13,13,4,22,0 -4238,3,6,0,4,1,0,5,2,2,0,12,14,5,7,0 -4239,0,6,0,3,2,5,0,3,3,0,8,18,3,37,0 -4240,8,3,0,7,0,3,13,4,0,0,11,16,0,38,0 -4241,9,2,0,1,0,5,7,0,3,1,18,5,1,41,1 -4242,2,8,0,10,2,6,4,1,4,1,18,3,5,9,1 -4243,5,1,0,10,0,5,2,2,1,1,13,3,0,12,0 -4244,6,7,0,9,2,4,9,0,1,1,17,16,0,8,0 -4245,6,3,0,8,3,1,0,5,2,0,18,20,4,19,1 -4246,2,4,0,13,5,2,5,3,4,1,5,3,5,24,0 -4247,0,2,0,7,0,4,6,1,1,1,13,0,5,17,0 -4248,0,2,0,13,2,3,2,0,0,0,2,11,1,31,0 -4249,1,2,0,11,6,4,9,2,0,1,8,8,0,11,0 -4250,3,1,0,11,0,4,9,4,2,0,12,9,2,26,0 
-4251,2,7,0,2,0,4,10,2,4,0,6,4,1,39,0 -4252,0,3,0,10,3,6,2,1,4,1,0,2,0,41,0 -4253,0,4,0,15,1,6,14,4,0,1,0,9,5,3,0 -4254,8,3,0,5,6,3,2,1,2,1,15,3,3,7,0 -4255,10,0,0,2,6,6,1,0,4,0,4,11,5,32,0 -4256,9,4,0,11,2,2,0,2,1,1,13,2,0,29,0 -4257,5,8,0,7,5,5,7,1,0,1,13,18,3,22,0 -4258,0,5,0,14,6,0,0,3,3,0,1,2,3,2,0 -4259,7,5,0,12,0,6,8,0,1,0,18,14,5,0,1 -4260,6,7,0,4,3,4,0,1,0,0,12,11,0,32,0 -4261,0,2,0,0,1,4,9,0,0,0,10,15,4,24,0 -4262,10,2,0,4,0,0,8,4,0,0,5,9,1,22,0 -4263,7,7,0,14,4,3,14,1,3,0,5,19,4,28,1 -4264,9,5,0,13,6,1,9,2,4,0,13,20,5,25,0 -4265,3,5,0,10,1,2,4,4,3,1,18,18,5,27,1 -4266,7,7,0,11,3,4,5,2,4,1,18,7,5,6,1 -4267,1,2,0,7,3,5,9,3,4,1,12,15,5,19,0 -4268,6,8,0,14,5,4,12,4,0,1,18,4,5,19,1 -4269,5,6,0,13,1,5,6,2,0,1,13,18,2,28,0 -4270,3,3,0,14,3,6,4,3,1,1,18,13,1,14,1 -4271,3,5,0,13,5,0,3,1,3,1,4,6,1,13,0 -4272,0,4,0,10,6,0,9,0,3,0,13,7,4,32,0 -4273,0,2,0,11,3,2,10,3,4,1,17,11,4,9,0 -4274,0,5,0,12,6,4,14,3,4,1,4,2,4,24,0 -4275,10,8,0,11,3,5,6,3,3,0,13,12,3,29,0 -4276,5,8,0,4,4,4,9,5,4,1,8,19,2,14,0 -4277,5,4,0,2,0,5,0,2,0,1,13,6,3,9,0 -4278,0,2,0,10,4,1,14,5,2,1,6,9,3,3,0 -4279,1,7,0,1,2,5,1,4,1,1,13,15,5,33,0 -4280,9,2,0,5,6,6,5,5,2,1,18,7,3,31,1 -4281,2,3,0,8,2,0,5,4,2,0,12,4,5,24,0 -4282,1,5,0,7,0,1,10,4,3,1,8,18,3,12,0 -4283,0,1,0,1,5,0,7,0,0,0,6,11,4,19,0 -4284,5,7,0,7,0,6,6,0,2,0,12,19,5,5,1 -4285,7,1,0,14,5,6,2,4,4,1,9,14,0,20,1 -4286,0,6,0,5,2,5,11,3,0,0,4,18,1,24,0 -4287,7,2,0,15,5,1,13,1,1,0,5,1,3,19,1 -4288,0,2,0,0,3,0,10,1,0,0,13,11,4,39,0 -4289,8,7,0,4,2,5,13,2,0,1,17,20,2,27,0 -4290,3,5,0,5,1,5,12,1,0,1,2,13,1,12,0 -4291,9,1,0,6,3,0,0,1,3,0,18,18,4,28,1 -4292,0,1,0,5,2,5,8,4,2,1,13,6,5,30,0 -4293,10,2,0,9,0,1,13,0,2,1,5,1,0,13,1 -4294,2,0,0,2,3,0,6,3,3,1,16,10,1,3,0 -4295,10,4,0,7,2,1,1,4,4,1,6,2,2,26,0 -4296,1,4,0,5,6,0,5,0,3,1,17,3,2,27,0 -4297,10,3,0,10,5,5,8,2,3,1,9,17,5,6,1 -4298,0,0,0,7,4,3,4,4,2,0,6,18,1,3,0 -4299,8,2,0,6,3,4,9,3,3,0,4,14,5,32,0 -4300,1,6,0,5,2,4,5,2,1,0,13,11,1,26,0 -4301,0,5,0,10,6,5,9,2,4,1,1,16,4,1,1 
-4302,1,3,0,8,0,3,7,2,0,0,2,2,1,4,0 -4303,9,6,0,9,5,1,11,1,1,1,2,2,5,10,0 -4304,5,0,0,15,4,0,0,0,4,0,7,16,0,31,0 -4305,1,5,0,5,2,0,14,5,0,1,8,5,4,16,0 -4306,4,6,0,11,5,5,11,3,4,0,13,0,0,37,0 -4307,0,8,0,5,0,5,8,5,3,0,2,2,2,41,0 -4308,3,0,0,0,4,0,3,4,4,0,13,2,1,14,0 -4309,0,8,0,6,5,4,10,3,1,1,2,14,4,12,0 -4310,1,5,0,8,6,0,10,1,0,0,16,6,2,0,0 -4311,2,5,0,3,6,5,2,4,1,1,4,9,3,20,0 -4312,1,0,0,12,1,4,10,2,2,1,17,11,1,17,0 -4313,3,6,0,8,3,2,1,2,3,0,10,15,1,26,0 -4314,5,8,0,6,2,0,9,2,3,0,2,20,4,9,0 -4315,4,3,0,13,5,3,5,4,2,1,0,7,2,8,0 -4316,5,1,0,3,5,0,2,2,2,0,14,3,1,8,0 -4317,2,2,0,7,6,4,7,3,2,0,2,11,2,22,0 -4318,9,2,0,4,4,5,4,1,1,0,4,6,5,20,0 -4319,9,2,0,7,1,6,1,5,4,0,18,13,0,5,0 -4320,2,3,0,3,0,3,12,4,3,0,0,2,1,23,0 -4321,2,2,0,14,1,2,11,1,0,1,5,17,5,36,1 -4322,6,0,0,12,4,6,7,5,3,1,18,1,2,24,1 -4323,9,5,0,12,0,1,4,5,3,1,6,7,5,30,1 -4324,3,4,0,6,6,5,2,1,1,0,6,12,3,35,0 -4325,0,1,0,2,2,0,3,3,1,1,4,6,1,27,0 -4326,9,0,0,1,6,2,8,5,0,1,1,10,4,23,1 -4327,5,0,0,4,3,4,6,3,2,0,0,20,5,2,0 -4328,1,5,0,4,4,0,4,5,0,0,1,5,4,35,1 -4329,8,3,0,7,4,1,1,3,3,1,18,1,3,24,1 -4330,3,5,0,4,5,2,13,0,3,0,2,11,4,37,0 -4331,0,3,0,6,3,3,13,2,1,1,13,15,1,23,0 -4332,0,8,0,3,6,5,6,4,4,0,15,2,5,4,0 -4333,10,3,0,12,6,4,4,5,1,1,16,10,0,20,1 -4334,0,5,0,6,5,4,12,5,3,1,8,14,1,32,0 -4335,1,6,0,6,2,5,13,3,4,0,17,14,2,26,0 -4336,9,0,0,6,5,1,4,5,1,0,14,7,4,19,1 -4337,3,6,0,7,5,5,7,0,4,0,17,0,0,40,0 -4338,3,1,0,14,4,1,5,2,4,0,12,0,2,29,0 -4339,2,0,0,6,6,4,6,4,2,1,6,11,2,0,0 -4340,3,7,0,14,4,2,10,1,0,1,9,10,1,23,1 -4341,7,5,0,10,5,5,14,4,0,0,17,11,5,1,0 -4342,4,2,0,2,5,0,11,1,0,1,5,6,2,32,0 -4343,7,2,0,4,4,3,4,1,4,0,18,1,2,0,1 -4344,10,6,0,4,5,3,11,2,1,1,17,10,0,4,1 -4345,7,3,0,1,5,6,9,5,1,1,18,7,3,19,1 -4346,10,2,0,2,1,6,9,1,0,0,18,17,3,34,1 -4347,2,4,0,2,0,0,6,5,3,0,6,16,2,12,0 -4348,3,8,0,10,5,6,0,4,3,0,3,12,4,23,0 -4349,2,8,0,12,4,5,4,3,4,1,10,2,3,22,0 -4350,1,7,0,1,5,4,5,2,3,0,16,20,5,3,0 -4351,1,2,0,10,4,4,9,4,4,1,8,3,2,40,0 -4352,4,1,0,3,4,5,0,0,3,0,18,10,2,17,1 -4353,0,3,0,9,0,4,5,4,2,1,14,12,3,27,0 
-4354,3,0,0,0,2,0,11,0,1,1,13,11,1,40,0 -4355,9,2,0,3,0,1,3,0,4,1,3,12,5,16,1 -4356,5,4,0,8,1,5,8,4,0,1,11,11,2,32,0 -4357,6,3,0,10,4,3,1,0,1,1,1,8,5,6,1 -4358,5,8,0,7,6,0,11,1,3,1,12,18,0,30,0 -4359,2,4,0,5,0,0,6,0,0,1,12,8,0,14,0 -4360,10,0,0,5,4,6,3,4,3,0,3,7,4,14,1 -4361,5,2,0,3,5,2,8,4,1,0,18,19,2,7,0 -4362,3,5,0,6,2,2,0,4,3,1,0,9,2,18,1 -4363,3,6,0,7,2,4,4,4,2,0,12,15,0,34,0 -4364,2,5,0,2,6,3,1,2,1,1,13,11,2,40,0 -4365,7,3,0,12,3,5,7,4,1,0,3,2,4,0,1 -4366,6,8,0,1,1,0,6,3,3,0,2,18,0,30,0 -4367,2,2,0,4,0,5,4,1,2,1,8,15,1,37,0 -4368,0,8,0,12,2,0,8,5,4,0,2,9,2,0,0 -4369,0,2,0,6,6,4,7,1,4,1,13,2,4,29,0 -4370,1,6,0,7,2,6,9,0,0,0,8,0,3,5,0 -4371,2,5,0,0,4,0,2,1,1,1,12,16,2,41,0 -4372,3,0,0,4,2,3,12,2,0,0,13,18,4,26,0 -4373,8,6,0,6,1,0,7,1,2,1,4,12,4,20,0 -4374,0,6,0,4,5,5,4,4,1,0,13,15,2,35,0 -4375,1,5,0,13,3,4,5,4,4,1,0,2,0,28,0 -4376,7,3,0,10,4,5,11,0,3,1,11,17,2,13,1 -4377,10,1,0,3,2,6,7,3,2,1,12,18,0,34,0 -4378,6,7,0,3,3,5,7,4,2,0,8,13,4,6,0 -4379,0,2,0,7,2,1,2,2,4,1,13,18,0,21,0 -4380,2,6,0,9,5,0,5,1,1,1,5,14,1,8,0 -4381,0,3,0,0,6,4,8,2,1,0,2,1,1,17,0 -4382,7,3,0,12,0,6,5,4,0,1,8,20,0,18,0 -4383,0,0,0,11,0,2,1,2,2,0,1,7,3,2,0 -4384,0,1,0,7,6,5,8,3,1,1,2,16,1,2,0 -4385,2,3,0,0,1,3,8,4,1,0,10,15,5,22,0 -4386,4,8,0,9,0,6,3,4,0,1,5,15,4,10,0 -4387,3,3,0,7,6,4,5,1,0,1,15,11,1,3,0 -4388,5,5,0,0,6,5,6,2,2,1,15,5,0,31,1 -4389,4,8,0,6,1,5,6,2,0,1,6,11,0,13,0 -4390,10,4,0,14,2,2,0,2,2,1,18,15,4,22,1 -4391,1,4,0,11,6,0,7,3,2,0,8,0,4,4,0 -4392,6,8,0,8,5,0,9,3,2,0,13,16,0,8,0 -4393,6,6,0,6,0,1,8,4,0,0,3,2,5,10,0 -4394,10,0,0,13,2,3,5,2,4,1,9,2,2,3,0 -4395,1,8,0,6,0,2,5,3,2,0,3,11,4,31,0 -4396,6,2,0,13,2,4,0,4,3,0,13,6,1,41,0 -4397,4,7,0,8,5,3,6,0,1,1,8,18,0,30,0 -4398,7,1,0,4,3,4,6,0,1,1,16,16,3,25,1 -4399,10,1,0,10,6,2,3,2,0,1,17,11,5,29,0 -4400,4,0,0,6,5,4,14,3,1,0,6,9,0,17,0 -4401,1,4,0,1,1,1,1,3,3,1,1,3,1,38,1 -4402,2,4,0,0,6,2,0,3,0,0,0,14,4,7,0 -4403,10,8,0,2,0,4,13,2,2,0,3,7,4,31,1 -4404,0,6,0,12,4,3,10,4,3,0,13,4,4,25,0 -4405,2,2,0,4,2,4,9,0,0,1,0,2,0,7,0 
-4406,5,5,0,15,0,6,9,4,0,1,17,20,4,30,0 -4407,7,3,0,15,0,6,3,3,4,1,18,4,3,4,1 -4408,3,3,0,10,4,2,13,1,1,1,18,10,5,25,1 -4409,9,7,0,4,2,0,11,2,3,1,18,5,1,22,1 -4410,10,0,0,4,6,1,9,5,2,1,16,13,1,24,1 -4411,8,4,0,11,0,5,1,3,2,1,2,15,3,38,0 -4412,7,5,0,11,5,4,0,0,2,0,8,18,0,20,0 -4413,7,7,0,10,4,1,1,5,2,1,9,10,5,16,1 -4414,0,4,0,12,3,5,12,4,3,0,13,20,1,38,0 -4415,4,4,0,10,2,4,10,5,4,0,15,17,4,22,1 -4416,2,6,0,10,6,5,0,3,3,1,17,9,0,25,0 -4417,5,7,0,4,5,4,6,4,0,1,6,19,5,6,0 -4418,0,4,0,5,0,0,8,5,2,0,13,0,0,14,0 -4419,2,3,0,10,4,1,11,4,0,1,13,7,2,22,0 -4420,5,6,0,7,6,0,4,2,3,1,18,11,0,19,0 -4421,6,7,0,13,0,2,13,5,2,0,17,3,1,30,0 -4422,8,2,0,1,5,5,9,0,3,1,0,0,4,29,0 -4423,0,7,0,3,1,5,7,2,3,1,2,0,1,28,0 -4424,0,4,0,3,3,4,8,1,3,1,15,2,5,19,0 -4425,9,2,0,1,1,5,13,3,3,1,6,15,0,37,0 -4426,3,7,0,7,5,4,3,4,2,1,13,2,1,38,0 -4427,6,3,0,5,6,3,4,5,4,0,12,11,5,21,0 -4428,3,3,0,2,2,0,9,2,3,1,2,7,3,34,0 -4429,9,2,0,1,6,4,5,3,0,1,2,19,5,30,0 -4430,0,5,0,0,2,5,5,1,3,1,0,9,4,20,0 -4431,10,6,0,11,1,1,8,0,1,1,4,19,2,22,0 -4432,2,3,0,1,1,5,2,1,4,1,9,1,3,26,1 -4433,2,2,0,10,2,4,14,1,2,1,2,2,0,35,0 -4434,1,1,0,15,0,4,6,4,1,0,10,13,0,22,0 -4435,5,8,0,0,3,1,11,3,4,1,13,3,0,24,0 -4436,1,2,0,2,0,3,9,4,4,1,12,11,0,36,0 -4437,6,4,0,6,6,5,10,3,2,0,5,4,0,14,0 -4438,7,0,0,11,3,6,10,5,0,0,8,0,4,7,0 -4439,10,5,0,2,6,0,1,3,0,0,13,3,0,33,0 -4440,1,7,0,0,4,0,0,0,3,0,2,2,3,25,0 -4441,10,7,0,0,6,2,13,1,0,0,2,10,2,12,1 -4442,2,8,0,6,5,6,6,3,3,0,0,0,3,10,0 -4443,0,2,0,11,0,4,7,2,0,0,13,13,0,26,0 -4444,6,0,0,13,6,3,5,4,2,1,8,2,2,21,0 -4445,10,6,0,7,3,2,2,5,1,1,18,5,2,0,1 -4446,1,7,0,6,5,1,9,0,0,1,0,18,0,29,0 -4447,2,8,0,8,0,4,11,0,1,1,7,10,1,25,1 -4448,6,5,0,12,2,5,8,0,1,0,2,8,4,6,0 -4449,9,3,0,10,5,6,5,4,3,1,2,14,1,26,0 -4450,0,5,0,0,5,1,5,1,0,1,9,2,0,5,0 -4451,5,8,0,9,1,5,13,1,2,1,4,2,2,18,0 -4452,0,4,0,0,3,1,9,1,1,0,2,9,4,23,0 -4453,8,8,0,4,0,2,12,1,3,1,18,17,5,36,1 -4454,1,2,0,0,4,3,7,1,2,0,15,9,1,10,0 -4455,0,4,0,9,5,5,4,1,4,1,12,20,4,16,0 -4456,3,0,0,11,0,3,5,1,0,1,8,11,0,20,0 
-4457,8,5,0,15,0,0,14,5,0,0,0,13,5,23,0 -4458,10,8,0,1,2,1,12,0,3,0,5,5,2,3,1 -4459,7,4,0,14,6,3,14,5,0,1,18,1,4,35,1 -4460,2,0,0,3,6,4,12,4,0,0,2,11,2,23,0 -4461,1,7,0,5,3,4,8,4,0,1,2,16,0,29,0 -4462,6,1,0,14,5,2,7,0,1,1,13,4,1,24,0 -4463,0,7,0,3,6,6,6,1,1,0,2,0,1,29,0 -4464,8,5,0,13,4,5,7,3,2,1,13,2,4,25,0 -4465,7,8,0,15,4,3,13,0,4,1,5,5,4,14,1 -4466,5,6,0,10,4,6,1,1,1,1,3,2,1,25,0 -4467,3,2,0,2,5,4,4,3,4,0,1,15,4,24,0 -4468,3,7,0,0,6,4,14,3,0,0,2,11,0,39,0 -4469,2,2,0,1,1,0,3,2,0,0,4,18,2,1,0 -4470,0,8,0,8,6,5,13,3,2,1,5,9,1,32,0 -4471,9,3,0,14,5,3,12,2,2,1,1,7,5,36,1 -4472,9,7,0,7,6,0,11,5,0,1,1,13,4,21,1 -4473,5,6,0,4,2,6,6,5,2,0,14,7,0,30,1 -4474,2,5,0,14,0,4,6,1,1,1,13,19,1,5,0 -4475,5,5,0,10,4,1,7,5,1,1,18,1,0,20,1 -4476,6,6,0,11,6,3,1,1,4,1,17,2,4,22,0 -4477,9,5,0,9,0,0,7,3,0,1,6,2,1,27,0 -4478,5,7,0,13,5,5,6,3,1,1,1,11,1,10,0 -4479,2,3,0,5,1,0,5,2,3,1,10,9,2,37,0 -4480,3,4,0,13,3,3,8,0,0,0,4,11,2,16,0 -4481,7,4,0,10,3,0,9,3,1,1,1,0,2,27,0 -4482,3,2,0,3,2,0,0,0,0,0,8,12,5,23,0 -4483,5,2,0,10,5,0,7,5,3,0,3,19,3,30,1 -4484,5,6,0,8,2,1,3,3,1,1,3,7,2,2,0 -4485,5,3,0,8,5,5,12,3,2,0,8,18,2,39,0 -4486,3,5,0,8,2,1,8,3,2,0,12,11,1,7,0 -4487,0,4,0,8,2,0,6,0,2,1,17,20,4,11,0 -4488,0,0,0,3,0,3,12,0,3,0,13,6,1,3,0 -4489,0,1,0,12,1,2,13,1,4,0,1,15,2,0,1 -4490,0,8,0,6,2,5,9,3,4,1,4,2,2,38,0 -4491,1,1,0,11,3,1,2,5,2,0,11,17,2,32,1 -4492,3,0,0,13,5,3,2,2,1,0,4,6,1,36,0 -4493,1,8,0,8,6,0,1,5,4,0,10,11,0,21,0 -4494,6,6,0,6,5,2,1,0,0,1,5,6,1,12,0 -4495,9,4,0,12,3,1,10,5,0,1,14,12,4,6,1 -4496,1,0,0,14,5,1,9,4,2,0,17,2,0,38,0 -4497,0,7,0,4,5,5,9,4,1,0,7,2,4,2,0 -4498,4,3,0,6,4,4,10,5,0,0,13,15,4,20,0 -4499,2,7,0,10,5,6,11,1,1,0,7,17,3,29,1 -4500,3,8,0,4,4,6,5,0,3,0,10,6,3,22,0 -4501,3,4,0,14,2,6,4,2,0,0,10,2,4,8,0 -4502,0,0,0,8,3,0,9,4,3,0,4,0,5,19,0 -4503,1,0,0,15,0,6,7,3,3,0,3,13,4,23,0 -4504,5,1,0,11,0,5,5,5,4,0,15,3,3,7,0 -4505,2,6,0,7,2,0,7,1,1,1,14,10,5,9,1 -4506,5,5,0,0,2,2,3,2,1,1,11,7,2,29,1 -4507,0,1,0,1,1,4,13,0,2,1,2,15,3,41,0 -4508,10,2,0,0,5,5,7,1,2,0,8,3,5,21,0 
-4509,4,3,0,4,2,3,1,3,3,1,2,2,4,21,0 -4510,3,4,0,6,2,0,10,1,1,1,6,18,4,17,0 -4511,0,4,0,7,0,3,0,1,0,1,18,5,0,17,1 -4512,0,3,0,5,0,0,9,4,4,1,12,11,1,40,0 -4513,7,4,0,10,5,6,0,1,0,1,14,5,0,32,0 -4514,3,3,0,5,6,5,8,1,3,1,13,9,0,21,0 -4515,0,8,0,11,5,1,9,5,1,0,0,15,1,28,0 -4516,2,2,0,2,0,1,14,0,4,0,13,9,5,9,0 -4517,1,4,0,5,5,0,8,3,3,1,8,5,4,3,0 -4518,2,7,0,0,3,4,14,3,1,1,10,15,0,41,0 -4519,7,3,0,5,5,2,1,3,2,1,16,19,2,39,1 -4520,6,7,0,7,1,5,11,0,0,0,13,4,0,16,0 -4521,5,4,0,1,1,5,12,5,0,0,4,13,1,8,0 -4522,2,2,0,13,6,2,7,0,4,1,0,6,0,39,0 -4523,10,0,0,10,6,2,6,5,1,0,16,1,5,41,1 -4524,2,7,0,7,5,6,1,4,1,0,13,19,3,30,0 -4525,1,8,0,6,2,4,9,5,0,1,2,11,0,8,0 -4526,3,6,0,8,1,2,7,5,4,1,4,20,2,20,0 -4527,10,5,0,8,1,6,7,3,3,1,13,13,1,12,0 -4528,6,0,0,9,6,5,11,3,3,0,18,7,2,28,1 -4529,6,5,0,2,4,5,13,5,3,1,9,7,2,12,1 -4530,0,0,0,13,0,4,0,3,1,1,11,0,1,8,0 -4531,3,3,0,9,2,3,7,4,3,0,8,18,0,29,0 -4532,2,6,0,14,1,6,0,4,0,1,18,17,3,37,1 -4533,7,5,0,14,3,4,14,0,3,1,13,4,4,32,0 -4534,1,3,0,11,6,4,4,0,1,0,8,2,5,37,0 -4535,2,8,0,3,6,1,0,4,2,1,13,13,1,12,0 -4536,4,6,0,9,4,6,6,5,1,1,6,9,1,8,0 -4537,6,8,0,15,4,4,5,2,3,1,0,12,2,1,0 -4538,1,2,0,0,0,4,9,4,3,0,8,13,2,3,0 -4539,0,2,0,0,4,0,14,1,4,0,13,12,1,31,0 -4540,10,4,0,10,6,0,14,1,1,0,4,15,0,11,0 -4541,10,1,0,3,5,0,4,0,3,1,9,10,4,1,1 -4542,0,4,0,0,4,6,14,3,2,1,0,11,5,9,0 -4543,7,0,0,13,6,2,13,0,3,1,13,2,2,6,0 -4544,4,6,0,12,4,1,11,2,2,0,16,19,3,16,0 -4545,2,2,0,11,6,2,5,0,3,1,8,2,0,5,0 -4546,3,5,0,5,5,4,4,2,4,1,4,19,5,40,0 -4547,8,2,0,6,6,1,12,1,1,1,11,10,3,16,1 -4548,6,7,0,15,1,6,9,1,0,1,13,20,0,6,0 -4549,1,5,0,1,4,2,10,3,4,1,18,10,4,35,1 -4550,5,8,0,10,5,1,4,5,4,1,9,9,1,1,1 -4551,8,3,0,10,6,1,11,0,2,0,5,10,5,31,1 -4552,10,5,0,8,0,0,0,5,0,0,2,11,5,38,0 -4553,9,5,0,10,4,2,10,3,0,1,10,1,5,18,1 -4554,3,0,0,8,6,0,14,4,4,1,6,11,4,24,0 -4555,0,7,0,14,4,0,1,3,3,0,8,3,1,22,0 -4556,5,8,0,7,0,5,5,4,1,1,3,11,0,31,0 -4557,3,0,0,13,6,6,4,2,0,1,16,2,2,38,0 -4558,1,8,0,2,2,6,8,0,3,1,17,2,3,11,0 -4559,8,0,0,5,3,5,5,2,0,1,5,14,5,35,0 
-4560,5,0,0,5,2,5,10,2,1,1,8,15,0,38,0 -4561,9,0,0,10,6,1,14,1,3,0,5,7,0,40,1 -4562,10,8,0,11,2,3,9,4,3,0,6,19,5,12,0 -4563,1,8,0,13,0,4,3,5,0,0,13,11,0,37,0 -4564,3,7,0,9,3,5,10,2,2,0,15,9,5,7,0 -4565,8,7,0,15,3,4,2,1,3,0,2,6,1,31,0 -4566,7,0,0,15,6,3,9,5,0,0,18,1,4,13,1 -4567,0,1,0,1,0,4,11,2,4,1,8,2,2,23,0 -4568,10,4,0,6,2,6,9,0,0,0,13,11,0,8,0 -4569,4,6,0,3,3,6,7,2,3,1,2,11,2,21,0 -4570,0,2,0,5,0,0,9,2,3,1,8,3,3,24,0 -4571,1,2,0,9,4,6,9,5,0,1,11,0,3,33,1 -4572,2,2,0,15,4,4,9,2,3,1,10,15,4,32,0 -4573,10,0,0,3,5,6,1,3,3,1,13,2,2,16,0 -4574,8,6,0,13,3,0,0,3,1,1,9,7,1,16,1 -4575,10,6,0,0,0,0,11,2,1,0,13,8,3,28,0 -4576,8,5,0,10,5,6,6,4,0,1,0,11,2,31,0 -4577,4,6,0,13,0,5,14,1,2,1,15,11,0,34,0 -4578,3,0,0,14,6,3,10,4,3,0,13,13,0,33,0 -4579,6,7,0,1,6,0,5,3,3,1,10,2,0,27,0 -4580,4,5,0,14,1,4,2,3,0,1,10,2,0,6,0 -4581,2,7,0,13,3,5,9,4,0,1,2,11,0,24,0 -4582,10,7,0,6,0,6,5,4,0,1,2,19,2,18,0 -4583,5,3,0,13,3,5,4,2,3,1,13,13,1,38,0 -4584,6,8,0,7,4,3,3,0,0,0,5,7,1,39,1 -4585,10,8,0,11,2,0,10,1,1,1,8,2,2,26,0 -4586,9,1,0,9,4,2,5,5,2,0,16,5,3,9,1 -4587,0,0,0,15,3,0,0,0,1,1,18,16,4,1,1 -4588,8,7,0,10,6,2,12,1,1,1,14,11,3,39,1 -4589,3,7,0,3,5,5,10,1,1,1,2,11,2,17,0 -4590,0,4,0,8,0,5,3,2,2,1,2,18,2,23,0 -4591,2,7,0,11,5,6,11,1,2,0,8,19,0,33,0 -4592,4,5,0,7,1,2,0,1,1,1,5,13,2,31,1 -4593,7,8,0,7,6,0,2,1,4,1,13,6,0,31,0 -4594,0,5,0,15,0,5,5,3,2,1,3,11,2,25,0 -4595,9,1,0,14,0,6,8,5,2,1,10,2,0,26,0 -4596,2,7,0,12,2,4,8,3,0,1,13,2,3,41,0 -4597,9,8,0,7,2,1,5,5,2,1,18,19,1,36,1 -4598,0,4,0,6,1,3,0,2,2,0,2,20,4,16,0 -4599,8,1,0,15,2,6,6,1,4,1,14,15,4,28,0 -4600,6,2,0,5,5,4,5,0,0,1,13,3,2,18,0 -4601,4,7,0,3,6,0,4,3,4,1,6,16,1,19,1 -4602,7,2,0,5,6,4,1,0,2,0,16,2,0,29,0 -4603,2,3,0,4,5,5,9,2,2,1,6,11,4,39,0 -4604,0,4,0,3,1,2,0,1,3,1,0,20,1,25,0 -4605,1,3,0,4,3,6,4,1,2,0,17,6,3,0,0 -4606,0,6,0,11,5,4,7,3,3,0,18,0,3,12,0 -4607,8,7,0,6,1,6,3,3,1,0,2,6,0,31,0 -4608,0,7,0,10,4,6,7,5,3,1,4,20,1,29,0 -4609,3,4,0,3,2,0,0,0,0,1,2,20,4,25,0 -4610,6,0,0,15,2,1,1,0,2,1,9,14,4,21,1 
-4611,6,3,0,8,4,3,2,3,3,1,7,9,4,41,0 -4612,2,0,0,7,6,0,10,1,0,1,12,13,2,13,0 -4613,2,4,0,8,5,3,1,4,1,1,11,5,5,39,1 -4614,1,0,0,4,6,0,9,1,2,1,2,0,4,5,0 -4615,4,8,0,4,4,1,14,3,1,0,11,8,1,26,1 -4616,4,6,0,2,6,5,6,2,1,1,17,2,2,38,0 -4617,10,1,0,5,2,0,0,1,0,1,13,1,1,27,0 -4618,7,8,0,13,2,3,8,0,0,1,17,11,5,30,0 -4619,9,2,0,2,2,2,0,3,3,0,14,7,1,14,1 -4620,3,8,0,13,2,2,6,4,1,0,11,6,1,24,0 -4621,4,6,0,4,1,5,12,0,4,0,13,6,4,22,0 -4622,0,1,0,0,5,4,7,1,0,1,0,11,0,37,0 -4623,5,2,0,13,0,6,12,3,2,0,18,7,2,26,1 -4624,1,5,0,7,6,5,9,0,2,0,2,6,3,8,0 -4625,10,8,0,3,0,1,13,1,0,0,2,11,0,27,0 -4626,2,8,0,12,3,0,1,2,0,0,11,16,0,32,0 -4627,7,1,0,3,1,3,12,0,3,0,9,1,4,31,1 -4628,0,0,0,10,4,5,7,3,2,0,16,2,2,4,0 -4629,8,5,0,13,6,1,10,4,3,1,12,19,4,30,1 -4630,1,7,0,4,2,6,14,1,3,0,8,15,4,13,0 -4631,10,2,0,6,3,5,7,3,3,1,17,18,1,13,0 -4632,2,8,0,15,6,0,2,3,0,1,13,1,5,6,0 -4633,0,4,0,11,0,4,12,3,0,1,8,2,1,32,0 -4634,1,0,0,6,2,0,10,5,1,1,2,13,1,29,0 -4635,4,1,0,14,4,6,8,5,0,1,1,13,4,17,1 -4636,8,1,0,10,2,1,0,0,2,1,16,19,2,23,1 -4637,10,1,0,2,0,0,9,5,0,1,1,5,5,9,1 -4638,8,3,0,6,6,6,8,5,3,1,18,14,1,37,1 -4639,2,4,0,8,6,6,11,4,1,0,13,19,2,19,0 -4640,7,2,0,2,3,0,11,3,2,1,7,1,0,11,1 -4641,7,2,0,2,1,0,10,1,4,0,0,18,4,14,0 -4642,0,7,0,13,0,2,7,2,0,1,16,5,2,37,1 -4643,4,0,0,0,4,5,5,5,3,0,8,2,5,6,0 -4644,0,5,0,5,6,2,12,1,4,0,2,6,2,4,0 -4645,10,5,0,7,2,5,12,4,0,1,7,3,0,28,1 -4646,6,1,0,14,6,4,9,3,1,1,9,16,5,7,1 -4647,6,7,0,12,1,4,8,0,2,1,0,18,4,33,0 -4648,3,3,0,3,0,4,6,4,2,1,1,0,3,25,0 -4649,2,4,0,7,6,6,6,3,0,0,2,11,5,30,0 -4650,0,7,0,3,2,5,4,1,1,0,13,11,5,23,0 -4651,0,6,0,13,0,1,14,2,3,0,6,9,4,23,0 -4652,10,1,0,6,0,5,0,3,3,1,8,2,5,9,0 -4653,1,8,0,13,3,0,8,0,0,1,13,9,1,38,0 -4654,0,4,0,1,6,6,12,1,1,0,13,2,5,14,0 -4655,8,4,0,4,1,3,0,3,4,1,1,1,4,1,1 -4656,0,8,0,6,5,1,0,3,0,0,13,15,0,37,0 -4657,4,2,0,15,2,2,8,5,4,0,9,7,1,36,1 -4658,10,3,0,7,5,1,11,5,2,0,9,10,4,18,1 -4659,0,5,0,13,4,0,2,2,0,1,15,11,1,14,0 -4660,9,3,0,11,5,2,12,2,2,0,7,13,5,22,1 -4661,5,8,0,4,5,0,9,4,1,1,7,2,1,36,0 
-4662,9,4,0,12,4,0,1,2,4,1,0,7,3,26,1 -4663,6,6,0,3,2,0,5,5,1,1,0,20,1,2,0 -4664,9,5,0,11,3,5,10,0,0,1,16,14,0,36,1 -4665,2,6,0,4,1,1,4,2,1,0,5,2,2,0,0 -4666,2,8,0,8,6,3,2,1,0,0,8,11,2,12,0 -4667,6,8,0,13,1,0,1,5,3,1,10,6,2,16,0 -4668,0,1,0,0,2,6,2,0,1,0,2,15,1,40,0 -4669,10,2,0,1,1,5,12,2,2,1,13,13,5,6,0 -4670,10,3,0,15,4,0,4,4,3,1,4,15,1,28,0 -4671,2,1,0,11,0,6,12,4,3,0,0,20,1,35,0 -4672,9,2,0,0,5,2,12,0,3,1,3,0,4,41,1 -4673,4,5,0,15,4,4,8,5,3,1,5,10,2,32,1 -4674,1,5,0,3,0,4,1,2,1,0,10,3,1,35,0 -4675,1,1,0,3,6,4,6,4,4,0,10,2,1,9,0 -4676,0,2,0,4,4,4,8,4,1,0,13,4,0,4,0 -4677,1,0,0,3,6,2,5,3,4,1,13,18,3,21,0 -4678,9,7,0,4,2,4,6,4,3,1,12,9,0,29,0 -4679,0,7,0,15,0,0,12,3,2,1,13,12,3,0,0 -4680,6,8,0,11,0,4,12,3,0,1,2,15,4,33,0 -4681,7,7,0,14,0,2,14,4,2,0,12,1,1,12,1 -4682,0,8,0,11,4,1,11,2,0,1,8,10,3,26,0 -4683,5,4,0,6,4,5,1,1,1,1,6,14,0,14,1 -4684,3,6,0,15,0,6,6,4,2,1,2,15,1,22,0 -4685,1,4,0,8,5,5,1,1,4,0,13,2,4,35,0 -4686,0,5,0,0,5,4,6,2,3,0,6,6,1,18,0 -4687,6,7,0,9,3,4,13,5,3,0,8,11,0,20,0 -4688,5,0,0,9,5,0,8,1,2,0,6,6,5,7,0 -4689,10,7,0,6,4,4,14,0,3,0,3,5,1,38,0 -4690,5,8,0,11,4,0,9,0,4,1,2,18,1,26,0 -4691,7,1,0,11,2,4,4,2,3,0,2,20,0,38,0 -4692,4,3,0,1,3,4,4,2,0,0,15,18,2,6,0 -4693,0,3,0,0,0,5,12,1,1,1,2,17,5,13,0 -4694,9,1,0,13,0,5,1,3,0,1,13,2,2,31,0 -4695,0,0,0,0,6,5,14,2,4,0,16,2,3,26,0 -4696,0,2,0,14,6,3,2,1,2,0,2,18,5,34,0 -4697,3,3,0,12,1,0,6,5,4,1,0,12,0,26,0 -4698,2,7,0,13,0,3,11,0,0,0,13,0,4,14,0 -4699,9,8,0,0,6,2,3,5,4,1,9,1,3,24,1 -4700,3,0,0,9,1,6,5,3,1,1,16,7,4,40,1 -4701,3,7,0,11,1,4,1,4,2,1,4,19,1,0,0 -4702,8,8,0,15,1,5,3,2,0,1,8,1,5,0,0 -4703,0,2,0,7,0,0,0,4,0,1,16,15,3,6,0 -4704,0,0,0,13,4,0,0,5,3,1,10,11,3,11,0 -4705,8,7,0,8,0,4,4,3,0,0,13,8,5,6,0 -4706,2,3,0,1,6,5,7,3,2,1,8,11,2,14,0 -4707,8,6,0,10,0,4,13,3,0,1,8,15,0,35,0 -4708,0,6,0,6,0,5,0,2,1,0,17,15,5,40,0 -4709,0,4,0,5,4,6,1,2,3,1,2,11,0,17,0 -4710,8,6,0,15,6,1,14,3,2,0,12,9,3,17,1 -4711,3,1,0,7,4,4,12,1,1,0,2,13,0,37,0 -4712,0,1,0,10,6,4,6,5,1,0,2,11,0,41,0 
-4713,9,8,0,11,0,6,1,4,0,0,4,2,1,34,0 -4714,2,6,0,8,1,3,4,2,1,0,9,5,4,28,1 -4715,0,6,0,11,2,5,7,1,0,0,2,11,3,22,0 -4716,8,4,0,12,6,3,10,0,2,1,16,17,3,3,1 -4717,7,4,0,4,1,4,1,1,0,0,18,8,3,31,1 -4718,4,6,0,13,2,1,9,2,2,0,16,8,3,40,1 -4719,7,3,0,7,3,2,10,5,2,1,16,5,0,28,1 -4720,3,1,0,14,4,3,5,2,4,0,13,15,1,22,0 -4721,2,1,0,14,5,3,13,3,4,0,17,14,1,27,0 -4722,8,0,0,6,0,2,2,2,1,1,13,6,5,9,0 -4723,1,8,0,2,2,1,10,0,0,0,13,11,0,22,0 -4724,6,5,0,4,1,6,8,3,2,1,9,20,2,2,1 -4725,10,8,0,12,6,1,5,1,0,0,5,12,0,7,0 -4726,0,1,0,10,4,2,11,2,0,1,4,12,4,1,1 -4727,9,3,0,5,5,0,1,5,0,1,8,19,0,28,0 -4728,6,6,0,8,1,5,12,1,4,1,10,4,1,30,0 -4729,3,5,0,2,6,4,5,2,1,1,8,11,3,29,0 -4730,0,1,0,6,4,4,7,4,0,0,13,2,4,17,0 -4731,0,7,0,6,1,5,7,3,4,0,6,6,0,11,0 -4732,7,4,0,12,2,5,6,5,0,0,4,7,5,36,1 -4733,2,1,0,11,0,1,4,1,3,0,5,15,0,9,0 -4734,7,4,0,7,2,6,5,5,0,0,14,17,4,37,1 -4735,0,2,0,5,2,4,2,2,0,0,17,11,3,25,0 -4736,3,3,0,9,0,4,5,4,2,1,4,18,0,4,0 -4737,1,5,0,5,4,4,10,1,0,0,1,4,3,11,1 -4738,3,6,0,1,6,0,13,1,4,0,4,4,3,18,0 -4739,4,2,0,3,4,2,6,1,1,0,2,15,0,29,0 -4740,8,3,0,5,5,2,11,5,3,1,18,1,5,5,1 -4741,2,1,0,8,0,0,0,4,1,0,12,11,0,10,0 -4742,1,4,0,4,0,3,2,0,2,0,16,11,1,3,0 -4743,4,8,0,15,5,2,9,3,4,1,2,20,0,17,0 -4744,7,0,0,7,5,6,2,1,0,1,18,4,2,12,1 -4745,1,7,0,12,3,4,12,4,0,1,13,18,3,12,0 -4746,8,2,0,13,4,3,1,5,2,1,3,10,0,37,1 -4747,0,3,0,4,5,0,10,3,4,1,2,20,2,38,0 -4748,1,6,0,11,5,5,7,3,3,0,10,11,0,41,0 -4749,1,6,0,2,1,1,6,4,3,1,13,4,3,12,0 -4750,5,3,0,5,4,1,0,0,1,1,1,14,2,19,1 -4751,5,0,0,5,1,1,14,1,4,1,1,13,3,5,1 -4752,9,8,0,15,2,5,2,2,3,0,3,17,2,20,0 -4753,9,5,0,3,4,1,14,4,1,1,5,7,5,18,1 -4754,3,6,0,6,6,2,4,4,3,0,9,11,5,31,0 -4755,3,0,0,8,0,2,1,2,4,0,8,15,3,6,0 -4756,5,3,0,13,5,6,13,2,2,1,13,11,0,29,0 -4757,2,5,0,8,5,1,12,3,2,1,11,17,4,3,1 -4758,3,8,0,12,6,0,14,3,1,1,13,11,4,33,0 -4759,3,7,0,10,2,3,1,1,0,0,18,10,4,3,1 -4760,1,5,0,9,6,1,13,1,0,1,18,17,4,41,1 -4761,0,2,0,13,2,2,0,3,0,0,8,2,2,29,0 -4762,3,3,0,5,6,0,4,0,0,0,2,6,2,22,0 -4763,4,8,0,8,0,2,11,4,2,1,0,19,2,41,0 
-4764,0,8,0,13,2,5,0,5,0,1,13,11,0,33,0 -4765,2,4,0,2,4,4,0,2,0,1,10,18,0,6,0 -4766,2,3,0,3,5,2,3,2,4,0,10,9,1,0,0 -4767,4,7,0,4,6,2,13,0,4,1,7,17,4,0,1 -4768,6,5,0,15,6,0,11,5,3,1,13,0,2,40,0 -4769,5,4,0,5,5,5,2,2,3,0,2,2,4,12,0 -4770,10,5,0,8,6,3,1,3,0,1,12,1,3,30,1 -4771,2,4,0,5,5,4,2,3,4,0,4,15,4,0,0 -4772,9,8,0,14,5,6,10,5,2,0,11,8,4,35,1 -4773,3,8,0,2,5,1,3,4,0,0,10,18,2,41,0 -4774,1,5,0,13,4,0,3,3,3,0,13,13,0,9,0 -4775,2,3,0,13,6,5,3,1,3,0,4,4,0,35,0 -4776,7,8,0,3,5,5,9,4,0,0,15,11,0,26,0 -4777,8,0,0,5,6,1,5,0,3,0,8,10,2,13,0 -4778,9,7,0,15,4,6,9,3,3,1,4,11,1,7,0 -4779,0,3,0,0,0,3,9,3,1,1,13,2,2,9,0 -4780,1,2,0,5,1,0,6,3,0,1,13,13,5,4,0 -4781,2,8,0,8,0,0,6,1,1,0,2,11,2,19,0 -4782,0,0,0,7,5,2,12,2,3,0,14,12,0,32,0 -4783,2,5,0,5,1,4,1,5,0,1,13,15,3,6,0 -4784,6,3,0,6,1,2,12,0,0,1,9,0,3,10,1 -4785,2,3,0,11,4,0,2,3,3,0,8,2,1,4,0 -4786,6,3,0,3,5,4,0,0,3,0,13,6,2,11,0 -4787,10,4,0,13,0,3,6,0,3,0,8,11,5,2,0 -4788,2,8,0,3,0,3,4,1,3,0,13,16,3,40,0 -4789,3,8,0,12,0,6,12,0,4,1,18,7,5,36,1 -4790,7,8,0,5,6,2,5,0,0,1,8,20,0,34,0 -4791,9,1,0,12,4,1,9,2,1,1,18,19,3,0,1 -4792,4,0,0,12,6,5,6,4,4,1,15,11,0,34,0 -4793,10,7,0,7,6,6,8,2,2,1,13,2,3,35,0 -4794,5,6,0,8,5,6,0,1,4,0,13,11,2,24,0 -4795,0,5,0,1,0,5,9,4,4,1,13,2,1,31,0 -4796,10,3,0,5,0,0,10,2,0,0,17,3,1,0,0 -4797,7,2,0,1,6,1,2,4,1,1,12,1,3,29,1 -4798,9,8,0,3,4,0,1,2,0,0,0,4,1,7,0 -4799,8,1,0,0,0,6,8,5,4,0,13,2,3,20,0 -4800,2,0,0,1,0,0,0,1,0,1,4,18,4,41,0 -4801,0,4,0,3,1,3,11,0,4,0,2,10,0,34,0 -4802,1,8,0,3,5,4,3,3,4,0,13,8,2,19,0 -4803,3,3,0,1,5,5,2,3,3,1,8,10,2,38,0 -4804,2,2,0,1,1,0,3,4,2,0,2,2,1,28,0 -4805,2,5,0,3,3,0,3,0,1,1,15,19,5,33,0 -4806,7,0,0,5,0,6,9,0,1,0,13,11,3,25,0 -4807,6,3,0,12,6,5,5,2,2,1,13,4,4,38,0 -4808,8,2,0,4,3,6,3,3,3,0,0,8,3,40,0 -4809,2,5,0,15,2,5,1,2,2,1,2,2,1,4,0 -4810,2,1,0,11,0,4,7,0,0,0,3,13,1,17,0 -4811,4,2,0,13,2,5,8,1,0,0,13,11,0,34,0 -4812,0,3,0,6,4,5,5,3,0,0,17,12,0,18,0 -4813,4,0,0,0,4,6,9,5,0,0,18,8,4,8,1 -4814,2,2,0,13,1,3,13,0,3,0,5,9,1,17,0 -4815,0,4,0,7,5,5,6,0,4,0,2,2,4,30,0 
-4816,1,3,0,10,5,5,9,2,3,0,13,14,0,29,0 -4817,1,0,0,14,0,3,4,0,1,1,13,13,4,40,0 -4818,3,5,0,0,4,0,9,3,4,1,4,6,1,16,0 -4819,1,1,0,1,5,1,6,2,0,1,2,6,0,21,0 -4820,1,0,0,6,1,5,0,2,2,1,13,7,2,36,0 -4821,2,5,0,1,2,0,8,1,0,0,2,19,3,34,0 -4822,1,3,0,8,5,3,13,2,1,0,13,6,5,1,0 -4823,7,7,0,13,2,2,0,4,4,1,1,7,4,14,1 -4824,4,3,0,10,0,5,2,1,0,1,9,17,0,6,1 -4825,9,7,0,0,3,1,11,2,0,1,6,5,4,5,1 -4826,4,8,0,10,4,0,9,0,4,1,4,0,0,9,0 -4827,0,7,0,10,1,3,13,4,4,1,17,3,2,24,0 -4828,3,4,0,9,5,4,9,2,2,0,2,20,0,41,0 -4829,2,8,0,0,1,6,6,3,4,1,8,19,3,16,0 -4830,6,0,0,9,4,0,11,3,1,0,18,17,4,22,1 -4831,6,4,0,13,1,3,14,3,4,1,8,16,1,13,0 -4832,3,7,0,9,2,4,9,4,0,0,1,7,5,37,1 -4833,0,3,0,15,3,5,8,5,2,0,2,11,5,1,0 -4834,3,2,0,6,3,4,7,3,2,0,8,20,0,3,0 -4835,0,7,0,15,6,5,5,4,3,1,15,3,1,14,0 -4836,8,2,0,5,0,1,3,4,0,0,17,4,0,28,0 -4837,2,4,0,7,0,6,5,1,3,1,2,18,0,0,0 -4838,0,0,0,10,2,6,11,1,0,0,8,10,5,18,0 -4839,1,5,0,2,3,3,3,0,0,1,13,2,4,13,0 -4840,10,6,0,13,2,6,0,2,4,0,12,9,5,16,0 -4841,8,6,0,0,0,3,13,4,3,1,9,6,0,22,0 -4842,8,3,0,15,2,0,0,5,4,1,12,10,1,1,1 -4843,3,3,0,7,3,1,8,5,2,1,5,8,1,27,1 -4844,1,0,0,9,2,6,0,3,0,0,18,4,0,14,0 -4845,3,2,0,13,5,2,4,0,3,1,6,15,0,26,0 -4846,1,0,0,13,2,0,6,4,2,1,8,13,4,7,0 -4847,6,8,0,13,0,1,5,2,4,1,2,15,1,30,0 -4848,9,7,0,6,0,6,7,5,4,0,17,11,0,27,0 -4849,7,1,0,8,4,4,8,4,4,0,5,1,5,5,1 -4850,9,0,0,9,4,2,5,4,4,1,2,20,0,0,0 -4851,10,7,0,13,0,0,8,5,1,1,11,11,0,2,0 -4852,0,7,0,7,6,2,1,5,4,0,13,15,1,24,0 -4853,1,3,0,13,2,4,12,3,0,0,8,11,2,3,0 -4854,0,4,0,12,2,5,10,4,2,1,16,11,3,31,0 -4855,7,4,0,7,6,2,11,4,3,1,14,20,0,37,1 -4856,10,3,0,15,4,4,0,5,3,1,5,20,4,23,1 -4857,0,6,0,11,2,1,6,5,0,1,4,11,3,8,0 -4858,9,4,0,3,3,0,7,1,1,1,15,11,2,0,0 -4859,2,4,0,6,2,6,4,2,0,1,6,11,2,35,0 -4860,8,4,0,7,4,5,14,5,0,0,18,19,2,8,1 -4861,4,8,0,8,4,1,13,1,3,1,5,1,2,9,1 -4862,2,2,0,8,0,5,6,1,1,0,8,11,3,29,0 -4863,1,0,0,4,6,5,14,0,3,1,6,2,1,7,0 -4864,9,1,0,4,1,2,3,5,1,1,9,20,1,37,1 -4865,0,1,0,13,0,5,13,1,0,0,8,19,2,35,0 -4866,0,3,0,11,2,0,10,3,1,0,13,18,0,39,0 
-4867,9,7,0,14,5,1,9,3,4,1,13,11,2,38,0 -4868,6,2,0,11,1,0,13,3,0,1,2,13,5,27,0 -4869,2,4,0,3,1,1,14,4,3,1,4,2,1,2,0 -4870,10,2,0,15,4,1,11,5,0,1,18,7,5,2,1 -4871,0,4,0,3,0,2,8,0,3,0,2,15,0,37,0 -4872,0,8,0,3,4,0,9,3,3,1,13,11,0,6,0 -4873,4,0,0,5,3,3,11,5,4,0,7,17,4,36,1 -4874,10,2,0,10,1,4,10,1,3,1,10,17,4,11,1 -4875,9,1,0,12,0,6,14,1,3,0,11,2,0,23,0 -4876,0,3,0,3,3,2,0,4,0,1,2,18,2,19,0 -4877,7,3,0,0,4,6,4,1,1,0,13,0,0,37,0 -4878,5,6,0,6,2,4,10,2,0,0,2,10,1,0,0 -4879,2,6,0,3,3,4,9,4,4,1,13,13,1,0,0 -4880,10,4,0,6,0,4,8,3,3,1,8,15,0,14,0 -4881,7,5,0,1,1,4,11,0,2,1,1,10,1,3,1 -4882,2,0,0,6,3,0,11,0,0,1,13,2,2,6,0 -4883,8,8,0,9,6,4,8,1,2,1,13,0,1,40,0 -4884,3,5,0,8,3,6,6,0,1,1,9,10,4,9,1 -4885,0,3,0,3,0,4,13,3,1,1,8,18,5,24,0 -4886,7,0,0,7,6,3,1,4,0,0,2,6,1,10,0 -4887,2,4,0,15,0,6,13,4,2,1,8,15,2,0,0 -4888,9,5,0,11,0,4,3,4,3,1,1,7,2,21,1 -4889,6,5,0,13,3,4,0,1,2,0,13,6,2,25,0 -4890,1,6,0,3,5,1,5,2,3,1,10,18,4,40,0 -4891,3,2,0,6,5,5,9,3,1,0,2,13,1,9,0 -4892,8,0,0,15,5,2,11,3,2,1,5,7,5,17,1 -4893,1,1,0,12,0,1,8,3,3,1,13,9,0,26,0 -4894,3,0,0,4,1,5,8,1,2,0,13,6,0,5,0 -4895,3,6,0,7,6,5,4,3,0,0,13,9,1,19,0 -4896,6,5,0,1,6,0,8,0,3,0,9,3,0,24,1 -4897,1,7,0,3,6,2,4,2,4,0,4,11,4,32,0 -4898,8,4,0,3,0,0,14,0,0,0,8,16,2,8,0 -4899,0,3,0,1,3,4,3,4,0,0,8,12,3,21,0 -4900,5,8,0,11,6,0,9,3,3,1,6,18,1,25,0 -4901,2,3,0,6,0,4,3,1,1,0,6,9,5,24,0 -4902,6,1,0,9,6,0,4,3,2,1,13,20,0,28,0 -4903,3,7,0,1,0,0,9,4,2,0,17,4,1,31,0 -4904,8,4,0,7,1,2,13,4,1,1,18,8,5,7,1 -4905,0,2,0,5,0,6,7,1,1,1,8,19,0,13,0 -4906,3,7,0,15,6,5,12,3,2,0,10,17,1,35,0 -4907,0,6,0,0,5,5,8,3,4,0,14,10,5,1,0 -4908,0,5,0,8,6,1,1,1,4,0,12,9,3,29,0 -4909,5,1,0,10,4,6,3,5,3,0,18,8,3,39,1 -4910,9,1,0,5,3,2,14,3,3,0,6,15,2,28,0 -4911,7,1,0,10,2,0,1,1,4,0,9,6,0,9,1 -4912,3,8,0,9,5,6,11,0,4,0,3,15,4,31,0 -4913,6,0,0,13,2,5,11,4,3,1,2,13,2,24,0 -4914,8,0,0,1,2,5,8,1,3,1,5,4,5,33,1 -4915,6,6,0,13,1,0,13,3,3,1,17,11,3,3,0 -4916,3,4,0,4,1,5,5,4,0,0,17,15,0,1,0 -4917,0,4,0,8,2,4,4,4,4,0,2,6,5,33,0 -4918,9,1,0,10,1,1,1,4,1,0,13,15,3,34,0 
-4919,2,6,0,11,4,3,0,4,0,0,4,11,2,23,0 -4920,2,7,0,10,0,0,9,3,4,0,4,6,0,36,0 -4921,6,0,0,11,2,6,11,0,0,0,4,2,5,3,0 -4922,0,3,0,12,5,2,14,2,4,0,3,9,0,19,0 -4923,3,1,0,10,1,0,10,5,3,1,8,11,0,29,0 -4924,4,4,0,10,2,6,0,2,1,1,11,1,4,39,1 -4925,5,2,0,1,6,0,9,3,0,0,17,9,4,35,0 -4926,0,3,0,4,1,4,8,1,2,0,8,19,0,23,0 -4927,7,4,0,11,4,3,10,5,3,1,1,19,0,6,1 -4928,0,3,0,8,0,5,2,3,2,1,2,20,5,36,0 -4929,9,6,0,12,1,6,13,1,4,1,9,16,3,6,1 -4930,1,6,0,3,5,5,8,2,0,0,2,18,0,2,0 -4931,9,0,0,1,0,0,5,3,2,0,8,7,3,40,0 -4932,2,5,0,12,5,2,0,4,4,0,13,16,1,5,0 -4933,4,2,0,9,2,6,8,3,1,0,18,9,1,26,0 -4934,4,7,0,3,2,6,5,1,2,1,7,10,3,21,1 -4935,4,7,0,1,5,5,13,5,2,1,5,17,1,1,1 -4936,0,4,0,15,2,4,14,2,2,1,17,6,5,30,0 -4937,1,7,0,2,5,4,1,3,0,0,12,4,0,27,0 -4938,1,4,0,14,6,5,8,5,1,0,9,2,2,35,0 -4939,2,8,0,3,6,3,1,1,1,1,0,7,0,22,0 -4940,6,8,0,0,0,6,6,4,0,1,10,6,1,23,0 -4941,7,0,0,7,5,1,0,0,0,0,14,14,5,4,1 -4942,7,3,0,13,2,1,1,1,4,0,0,3,3,13,0 -4943,2,8,0,10,4,0,13,1,2,1,16,7,4,21,1 -4944,7,8,0,1,4,3,6,2,1,0,0,11,3,20,0 -4945,6,3,0,6,4,6,11,0,1,0,18,10,0,33,1 -4946,7,3,0,8,0,6,8,0,1,0,13,9,0,0,0 -4947,9,0,0,13,5,5,5,2,0,1,4,2,3,39,0 -4948,5,1,0,1,6,0,11,2,4,0,6,12,0,17,1 -4949,9,5,0,11,3,3,11,5,1,0,18,0,0,9,1 -4950,3,8,0,6,6,0,11,2,3,1,2,2,2,5,0 -4951,1,1,0,15,6,5,0,3,3,1,2,11,3,4,0 -4952,5,5,0,15,0,2,9,3,4,1,8,2,0,11,0 -4953,0,0,0,11,3,0,9,4,4,1,13,12,0,23,0 -4954,10,2,0,15,0,1,13,1,3,1,5,5,0,16,1 -4955,10,6,0,13,5,1,6,3,0,0,13,6,0,16,0 -4956,2,7,0,4,5,0,3,2,0,0,7,19,0,16,0 -4957,7,3,0,8,6,4,9,2,0,0,8,9,4,9,0 -4958,1,8,0,3,6,2,1,4,1,1,2,11,2,25,0 -4959,5,5,0,12,3,3,2,4,3,1,9,10,5,39,1 -4960,1,6,0,14,4,2,13,3,4,1,1,10,4,11,1 -4961,9,0,0,10,5,5,2,4,3,0,8,11,5,33,0 -4962,8,2,0,11,1,5,1,0,0,1,3,9,1,20,0 -4963,2,8,0,4,0,0,7,3,2,1,8,11,1,29,0 -4964,0,3,0,13,6,0,7,4,3,0,15,20,2,4,0 -4965,7,3,0,3,2,4,7,1,2,0,10,15,4,11,0 -4966,7,4,0,1,5,5,14,5,1,0,11,0,2,14,0 -4967,1,6,0,1,4,0,0,0,4,0,7,2,0,14,0 -4968,3,7,0,4,2,3,3,3,4,1,11,18,4,3,0 -4969,8,5,0,10,4,5,13,3,1,1,1,7,4,33,1 -4970,2,6,0,3,5,3,1,0,0,1,4,2,1,14,0 
-4971,1,7,0,13,1,6,5,3,3,1,13,2,5,3,0 -4972,5,1,0,1,1,0,0,2,0,0,10,13,4,6,0 -4973,3,7,0,12,0,1,3,0,4,1,1,13,2,1,0 -4974,8,5,0,14,0,2,0,2,1,0,1,17,0,34,1 -4975,6,6,0,10,1,4,14,3,4,1,0,18,0,20,0 -4976,0,2,0,6,1,0,8,0,1,0,6,11,3,7,0 -4977,1,5,0,13,4,4,4,2,0,0,1,2,3,22,0 -4978,9,4,0,4,6,1,6,3,3,1,8,2,2,10,0 -4979,6,2,0,10,4,1,7,1,0,0,7,0,4,38,0 -4980,6,6,0,15,4,0,5,0,4,0,13,2,1,3,0 -4981,4,4,0,11,3,2,11,0,0,0,9,8,3,12,1 -4982,5,5,0,0,6,1,8,1,3,0,1,15,1,7,0 -4983,0,4,0,15,0,4,11,5,4,1,15,13,3,35,0 -4984,2,3,0,8,0,5,11,5,1,0,5,19,5,16,1 -4985,9,3,0,10,0,3,2,2,2,0,18,13,2,19,1 -4986,7,3,0,13,6,4,0,0,1,0,8,13,3,12,0 -4987,2,5,0,3,6,3,7,3,1,0,17,9,4,25,0 -4988,2,6,0,3,2,1,2,0,2,0,4,19,3,22,1 -4989,5,7,0,15,0,1,8,0,3,1,14,15,3,12,1 -4990,9,8,0,12,0,5,5,3,1,1,2,2,0,6,0 -4991,3,4,0,4,2,0,7,3,2,1,17,9,3,17,0 -4992,0,7,0,1,6,5,11,5,3,0,13,2,1,26,0 -4993,6,6,0,13,0,6,3,1,0,0,2,19,2,13,0 -4994,0,8,0,3,0,0,2,3,1,1,8,13,0,19,0 -4995,5,3,0,5,1,2,5,1,2,1,0,0,2,18,0 -4996,1,4,0,0,6,3,1,3,0,1,5,20,1,32,0 -4997,8,7,0,11,5,0,11,5,0,1,15,12,0,13,1 -4998,1,8,0,0,2,4,11,1,1,0,2,15,5,7,0 -4999,10,5,0,6,2,6,4,3,4,1,11,8,0,12,1 -5000,2,8,0,5,0,0,3,0,2,1,13,11,1,30,0 -5001,4,1,0,9,0,4,14,2,0,0,13,8,1,8,0 -5002,5,7,0,4,5,6,0,1,4,1,14,15,0,13,1 -5003,0,2,0,0,3,5,5,3,3,1,2,0,3,11,0 -5004,0,8,0,3,0,4,12,4,2,1,12,13,2,2,0 -5005,6,1,0,14,2,3,2,3,2,0,9,1,3,41,1 -5006,4,0,0,5,0,2,4,3,2,0,4,13,0,16,1 -5007,0,6,0,11,1,0,3,2,3,1,0,2,0,23,0 -5008,9,8,0,8,3,2,12,1,0,0,18,19,4,37,1 -5009,0,1,0,0,4,4,3,0,2,0,17,2,5,23,0 -5010,1,3,0,1,3,4,2,4,0,0,6,2,2,13,0 -5011,2,2,0,0,0,4,4,3,0,0,13,6,0,8,0 -5012,5,2,0,3,3,4,9,5,2,1,9,6,3,38,0 -5013,10,6,0,13,5,5,8,4,4,0,4,11,3,12,0 -5014,2,4,0,4,0,4,4,4,0,1,16,14,3,27,0 -5015,9,5,0,12,6,2,10,5,3,0,14,5,0,5,1 -5016,10,2,0,4,0,3,8,4,1,0,8,15,5,17,0 -5017,5,0,0,3,2,5,12,2,0,0,0,15,4,39,0 -5018,2,6,0,4,0,0,12,2,4,0,17,9,1,24,0 -5019,2,1,0,15,1,1,12,1,0,0,2,11,0,19,0 -5020,7,0,0,1,4,4,14,4,2,1,16,10,4,11,1 -5021,0,0,0,10,2,0,8,3,1,0,13,2,0,32,0 
-5022,10,7,0,10,3,2,1,4,1,1,9,5,2,1,1 -5023,7,7,0,13,4,4,0,5,4,1,5,0,2,2,1 -5024,0,2,0,4,1,0,2,2,2,1,8,9,4,35,0 -5025,1,0,0,13,4,5,5,5,0,0,13,7,1,8,0 -5026,9,3,0,9,6,2,5,3,3,0,13,0,1,13,0 -5027,2,1,0,0,2,3,11,3,3,1,6,2,1,21,0 -5028,2,5,0,11,2,3,0,1,0,1,14,15,5,38,0 -5029,1,6,0,14,0,0,6,1,4,0,0,15,3,12,0 -5030,2,0,0,0,0,5,4,4,4,1,15,16,1,38,0 -5031,8,5,0,10,0,2,8,5,4,0,3,13,2,21,0 -5032,1,8,0,13,2,0,12,0,0,1,10,15,4,0,0 -5033,3,6,0,0,6,0,14,3,0,1,17,18,2,14,0 -5034,7,8,0,2,1,0,9,1,4,1,5,4,1,32,0 -5035,2,8,0,2,2,4,2,3,4,0,10,19,4,33,0 -5036,5,5,0,10,0,4,4,4,3,1,18,19,2,35,1 -5037,0,1,0,1,5,2,5,5,3,1,2,14,4,33,0 -5038,4,8,0,7,5,0,3,1,3,1,2,2,1,14,0 -5039,6,5,0,13,1,5,7,1,4,1,13,13,4,9,0 -5040,2,8,0,13,2,6,6,0,2,0,13,15,4,29,0 -5041,1,5,0,1,1,5,8,0,2,1,4,13,2,11,0 -5042,0,1,0,11,5,6,5,3,4,0,13,9,1,37,0 -5043,7,5,0,15,3,1,8,2,1,1,1,8,0,26,1 -5044,2,8,0,14,2,3,4,4,3,0,17,13,5,4,0 -5045,9,1,0,5,3,5,11,3,1,1,18,5,5,40,1 -5046,7,8,0,2,0,2,3,3,3,1,2,20,4,30,0 -5047,0,4,0,5,0,4,4,2,2,1,13,11,2,39,0 -5048,9,2,0,8,2,6,0,3,1,0,6,3,3,4,0 -5049,8,1,0,3,3,3,4,5,1,0,5,7,4,0,1 -5050,6,6,0,5,6,5,2,2,3,0,15,2,2,5,0 -5051,10,7,0,3,3,1,8,1,0,1,0,11,1,14,0 -5052,6,0,0,6,1,6,10,4,4,1,2,11,1,21,0 -5053,2,6,0,7,1,4,7,0,0,0,17,20,1,39,0 -5054,1,3,0,13,0,4,2,4,2,0,9,3,1,28,0 -5055,9,5,0,9,6,2,11,1,4,1,5,4,5,8,1 -5056,8,1,0,4,1,2,0,1,1,1,2,16,2,29,0 -5057,3,8,0,13,4,0,11,3,4,0,15,15,5,33,0 -5058,4,4,0,9,1,0,5,3,2,0,2,11,0,13,0 -5059,1,6,0,11,5,3,12,4,1,1,2,18,4,35,0 -5060,3,3,0,9,4,1,7,4,0,0,2,11,4,3,0 -5061,7,8,0,12,5,1,13,1,0,1,3,10,5,26,1 -5062,1,6,0,2,6,2,11,2,3,0,18,8,3,36,1 -5063,3,7,0,10,4,4,11,3,3,1,9,18,2,5,1 -5064,0,5,0,12,0,3,2,4,3,0,17,4,2,34,0 -5065,0,2,0,1,5,6,5,3,0,0,8,16,5,7,0 -5066,1,4,0,4,0,6,8,4,2,0,4,2,0,0,0 -5067,2,7,0,15,5,0,6,4,4,0,12,1,1,32,0 -5068,0,7,0,12,0,3,0,3,1,0,16,11,4,37,0 -5069,6,7,0,1,2,1,0,2,1,1,8,0,2,9,0 -5070,10,7,0,3,3,4,4,3,0,0,6,5,0,13,0 -5071,6,5,0,6,0,1,13,4,3,0,2,11,0,7,0 -5072,1,8,0,2,1,3,4,3,1,1,14,0,1,1,0 -5073,1,2,0,2,1,1,14,2,3,1,1,10,5,9,1 
-5074,2,5,0,13,2,1,13,0,4,1,18,14,5,7,1 -5075,0,8,0,15,0,0,11,4,4,0,0,11,1,11,0 -5076,9,7,0,15,4,5,12,2,0,1,11,7,5,25,1 -5077,9,5,0,0,3,4,6,5,4,1,11,3,5,27,1 -5078,5,0,0,0,2,5,11,1,2,0,12,2,3,13,0 -5079,0,0,0,1,1,5,2,0,4,0,8,0,3,40,0 -5080,2,6,0,7,3,3,9,1,0,0,12,20,3,37,0 -5081,7,1,0,15,0,0,8,2,1,0,6,15,4,18,0 -5082,5,6,0,3,3,1,0,5,3,1,4,17,3,13,0 -5083,1,5,0,0,6,3,6,5,4,1,13,12,0,38,0 -5084,10,8,0,0,2,3,6,2,2,0,8,6,5,6,0 -5085,7,5,0,9,0,5,11,1,2,0,2,8,1,10,1 -5086,5,7,0,6,5,0,13,3,1,0,17,0,5,36,0 -5087,3,2,0,4,2,4,4,4,0,0,8,0,3,32,0 -5088,2,4,0,15,0,0,7,0,0,0,4,15,1,8,0 -5089,0,7,0,3,6,0,4,3,0,0,0,14,4,23,0 -5090,8,0,0,12,4,2,12,0,2,1,12,8,2,1,1 -5091,10,0,0,5,0,2,6,4,4,1,6,10,2,31,1 -5092,0,4,0,5,1,0,2,3,1,0,2,2,1,34,0 -5093,4,5,0,3,4,0,8,1,4,0,18,11,5,11,1 -5094,10,5,0,3,4,2,3,5,1,0,16,19,4,7,1 -5095,4,4,0,8,6,3,13,0,0,0,2,2,3,26,0 -5096,10,7,0,3,2,2,3,2,2,1,2,20,3,37,0 -5097,1,7,0,11,3,6,8,2,1,1,8,9,5,26,0 -5098,2,2,0,13,1,4,12,1,4,1,10,11,5,31,0 -5099,2,3,0,1,6,1,12,5,2,0,18,6,4,28,0 -5100,4,8,0,3,5,4,7,4,3,0,6,2,2,21,0 -5101,1,1,0,1,0,5,9,0,0,1,8,16,3,10,0 -5102,7,5,0,10,4,5,10,1,0,1,12,1,1,41,1 -5103,3,4,0,13,4,5,5,4,2,1,17,18,2,35,0 -5104,10,6,0,8,2,3,1,3,1,1,6,13,1,9,0 -5105,9,6,0,1,1,1,5,1,1,1,3,7,1,1,1 -5106,3,4,0,13,5,1,7,2,3,0,2,15,2,1,0 -5107,10,1,0,0,6,2,1,2,3,0,16,10,1,9,1 -5108,1,5,0,2,3,0,5,1,1,1,13,16,2,1,0 -5109,0,6,0,12,0,0,8,4,0,0,6,2,4,32,0 -5110,1,3,0,6,6,5,1,5,0,1,6,2,5,6,0 -5111,4,8,0,1,0,5,5,2,3,0,0,11,2,4,0 -5112,3,0,0,7,5,6,14,0,2,0,6,7,0,11,0 -5113,7,2,0,0,3,6,10,0,4,0,18,7,0,18,1 -5114,10,7,0,10,5,6,8,1,3,0,18,5,3,19,1 -5115,2,1,0,14,4,5,1,3,3,1,1,1,0,13,1 -5116,3,5,0,11,1,5,11,0,0,1,1,18,3,7,0 -5117,3,4,0,15,5,4,1,2,1,1,13,13,1,22,0 -5118,7,8,0,1,0,1,14,1,0,0,13,2,0,1,0 -5119,2,4,0,1,2,3,0,3,4,0,17,2,1,0,0 -5120,9,2,0,1,5,0,8,4,1,0,2,16,1,9,0 -5121,7,4,0,5,2,1,0,3,3,1,0,11,5,25,0 -5122,7,5,0,0,5,5,0,4,3,1,0,2,2,31,0 -5123,5,4,0,0,5,0,2,1,0,0,2,11,0,34,0 -5124,8,5,0,2,6,6,4,0,4,1,18,10,5,20,1 -5125,0,3,0,5,3,5,0,1,3,1,2,15,0,40,0 
-5126,5,6,0,7,3,5,1,5,1,1,7,20,4,38,1 -5127,0,4,0,14,6,5,7,3,3,1,15,8,0,35,0 -5128,5,8,0,11,0,6,14,2,1,1,18,8,2,38,1 -5129,2,1,0,4,0,4,11,1,4,0,17,2,3,3,0 -5130,9,7,0,15,0,0,8,1,4,1,13,3,1,14,0 -5131,9,1,0,0,3,3,0,4,1,0,8,14,4,12,0 -5132,0,6,0,15,4,4,2,5,3,0,4,10,0,29,0 -5133,5,8,0,6,4,6,8,3,1,0,2,4,1,13,0 -5134,4,5,0,10,6,6,1,5,2,1,10,10,2,1,1 -5135,1,8,0,13,1,3,14,3,0,0,2,2,5,36,0 -5136,3,3,0,3,2,1,0,1,1,0,11,18,4,3,0 -5137,0,7,0,10,4,6,3,4,3,0,10,15,0,23,0 -5138,6,1,0,1,2,0,1,1,3,0,18,9,0,38,1 -5139,7,4,0,12,1,2,10,4,4,0,9,18,5,36,1 -5140,5,7,0,2,0,5,0,5,2,0,1,2,4,30,0 -5141,0,3,0,11,5,3,9,3,3,0,4,2,4,12,0 -5142,5,8,0,1,0,4,8,1,0,0,13,15,3,12,0 -5143,2,2,0,2,1,4,3,5,3,0,11,15,2,26,0 -5144,8,3,0,13,6,0,7,4,2,1,13,9,0,11,0 -5145,4,0,0,0,3,4,5,0,3,1,18,17,0,31,1 -5146,6,0,0,9,5,1,4,4,1,0,8,20,1,18,0 -5147,2,6,0,5,2,0,7,2,2,1,2,18,4,40,0 -5148,5,8,0,1,5,3,1,5,3,0,16,20,4,40,0 -5149,0,6,0,14,5,3,3,5,3,0,2,2,3,38,0 -5150,0,8,0,2,3,3,5,1,0,1,8,16,5,4,0 -5151,1,0,0,5,5,6,1,3,1,0,2,13,4,13,0 -5152,2,2,0,0,2,3,0,1,2,1,15,15,2,6,0 -5153,5,8,0,13,1,0,0,4,3,0,2,11,5,20,0 -5154,5,6,0,10,5,4,5,2,1,1,6,13,5,18,0 -5155,5,2,0,13,4,1,13,0,2,0,3,10,0,10,1 -5156,6,6,0,13,0,2,11,4,0,0,12,13,2,27,0 -5157,2,0,0,4,1,5,4,3,2,0,8,2,0,5,0 -5158,9,3,0,7,4,6,3,1,3,1,9,15,5,1,1 -5159,3,8,0,15,1,3,7,4,0,1,2,13,2,7,0 -5160,10,6,0,11,0,3,11,3,0,0,6,2,0,38,0 -5161,0,6,0,0,0,6,3,0,4,0,4,20,3,21,0 -5162,3,1,0,10,1,2,3,2,4,1,13,6,0,37,0 -5163,0,8,0,13,3,4,6,0,2,1,8,2,0,22,0 -5164,1,8,0,15,0,4,2,3,3,1,7,11,0,14,0 -5165,6,5,0,2,4,0,13,1,1,0,18,3,3,29,1 -5166,1,3,0,12,5,6,0,4,2,0,7,11,1,34,0 -5167,7,3,0,7,2,0,0,3,4,1,2,0,1,29,0 -5168,10,0,0,6,5,3,8,4,3,1,2,15,0,39,0 -5169,2,2,0,8,4,3,6,1,0,0,15,11,2,10,0 -5170,4,8,0,2,0,5,13,3,2,1,15,2,3,2,0 -5171,5,7,0,13,5,1,2,3,0,1,11,10,5,22,1 -5172,0,5,0,7,2,2,14,2,4,1,2,6,3,11,0 -5173,6,2,0,12,1,5,14,3,4,0,1,20,5,7,1 -5174,3,8,0,14,2,5,5,5,4,0,17,20,1,7,0 -5175,9,6,0,2,2,0,4,1,3,1,15,11,0,10,0 -5176,2,6,0,12,3,4,4,4,0,0,0,7,1,35,0 
-5177,6,7,0,10,6,6,5,4,0,0,2,11,0,30,0 -5178,2,2,0,5,4,4,7,0,3,0,17,15,4,4,0 -5179,6,2,0,11,6,4,4,4,0,1,6,3,5,38,0 -5180,10,5,0,3,4,1,0,5,1,1,12,16,5,10,1 -5181,10,2,0,14,6,4,1,3,3,0,18,19,4,41,1 -5182,1,4,0,11,1,1,11,3,1,1,17,11,3,17,0 -5183,2,0,0,15,6,4,5,5,4,0,2,6,1,26,0 -5184,8,8,0,14,2,3,6,1,2,0,13,9,0,41,0 -5185,2,7,0,8,6,3,10,5,4,1,18,0,2,9,1 -5186,6,4,0,14,4,3,10,5,0,1,10,18,4,22,1 -5187,5,8,0,11,5,0,14,5,3,1,10,17,5,9,1 -5188,9,6,0,5,5,3,4,0,4,1,13,11,0,27,0 -5189,2,6,0,15,5,0,4,4,4,1,0,12,5,26,0 -5190,0,0,0,8,0,4,12,4,0,1,6,2,5,24,0 -5191,0,3,0,5,1,1,6,1,1,0,0,6,2,8,0 -5192,5,5,0,7,6,3,4,5,3,0,11,8,4,12,1 -5193,7,8,0,6,1,0,9,1,3,0,15,2,0,41,0 -5194,9,6,0,11,4,1,7,1,0,1,0,1,1,19,1 -5195,8,7,0,1,6,4,13,0,4,0,7,17,0,28,0 -5196,10,5,0,8,3,2,2,1,0,1,13,11,0,6,0 -5197,5,6,0,2,3,4,10,3,1,0,17,13,3,23,0 -5198,7,0,0,13,0,5,5,0,3,0,13,0,0,34,0 -5199,4,8,0,13,6,1,12,3,2,0,11,0,1,24,0 -5200,2,2,0,3,2,2,5,1,0,0,13,3,4,36,0 -5201,0,1,0,12,5,6,12,5,2,0,13,11,3,5,0 -5202,3,1,0,3,1,3,1,3,1,0,8,18,5,18,0 -5203,8,4,0,11,4,0,6,4,0,1,8,19,5,19,0 -5204,9,2,0,12,6,4,9,3,3,0,16,15,3,4,0 -5205,7,7,0,15,0,6,10,1,2,0,17,15,4,31,0 -5206,3,1,0,12,5,5,9,5,1,1,18,19,3,7,1 -5207,0,5,0,1,4,6,12,4,3,1,6,2,5,30,0 -5208,8,7,0,2,2,3,12,1,3,1,0,19,2,32,1 -5209,7,6,0,11,5,5,10,4,3,0,15,17,4,27,0 -5210,5,1,0,3,6,6,0,3,1,1,0,11,2,41,0 -5211,2,0,0,3,1,0,1,3,4,1,12,16,4,19,0 -5212,0,2,0,7,6,0,0,2,0,0,4,11,0,10,0 -5213,0,1,0,1,0,5,7,0,1,0,14,17,0,21,0 -5214,7,7,0,12,2,3,0,4,3,1,15,12,1,35,0 -5215,10,1,0,3,2,4,14,3,0,1,7,2,4,1,0 -5216,2,1,0,3,4,0,0,4,3,0,12,6,2,6,0 -5217,0,4,0,2,4,3,1,1,1,1,4,1,4,32,0 -5218,8,5,0,4,6,1,4,5,4,0,4,10,1,18,1 -5219,8,5,0,4,1,6,7,2,1,0,3,9,3,29,0 -5220,7,3,0,0,4,2,13,5,4,1,1,7,0,1,1 -5221,6,0,0,4,2,0,14,2,0,0,10,2,2,28,0 -5222,2,7,0,13,5,0,6,4,0,1,13,6,3,33,0 -5223,4,2,0,7,4,0,2,3,1,0,18,10,1,28,1 -5224,4,0,0,13,2,3,1,4,2,0,2,4,0,29,0 -5225,5,2,0,4,6,0,0,1,0,1,8,11,2,41,0 -5226,6,1,0,7,5,6,7,4,4,1,2,7,1,10,0 -5227,1,3,0,10,6,2,2,2,4,1,5,17,3,11,1 
-5228,5,0,0,13,6,6,6,2,0,0,6,17,1,25,0 -5229,6,5,0,4,0,1,5,1,3,1,5,5,5,37,1 -5230,5,8,0,2,5,4,5,0,3,0,17,18,2,26,0 -5231,2,6,0,1,3,3,5,4,0,0,2,9,5,17,0 -5232,7,3,0,5,1,2,12,5,1,1,9,10,0,30,1 -5233,10,5,0,13,1,3,4,0,0,0,2,2,3,33,0 -5234,8,8,0,14,0,5,3,4,1,0,2,13,1,31,0 -5235,5,3,0,3,3,5,2,4,4,1,6,9,1,7,0 -5236,0,8,0,7,3,5,5,3,3,1,13,2,5,10,0 -5237,6,1,0,11,0,0,5,4,4,1,10,4,1,20,0 -5238,3,0,0,5,0,5,6,4,2,0,3,0,3,21,0 -5239,9,3,0,13,1,3,6,2,0,1,13,8,4,41,0 -5240,10,3,0,6,0,4,0,4,4,0,13,20,5,22,0 -5241,1,1,0,0,3,2,2,3,2,1,18,7,4,18,1 -5242,3,0,0,11,0,1,0,1,2,0,12,11,0,3,0 -5243,5,5,0,1,3,4,11,0,2,1,15,2,3,30,0 -5244,3,8,0,5,1,1,5,3,2,1,9,11,2,41,0 -5245,2,5,0,5,0,1,4,3,0,0,8,1,4,4,0 -5246,0,5,0,10,5,4,3,3,2,0,13,0,4,4,0 -5247,5,6,0,15,1,4,1,4,0,0,6,6,3,17,0 -5248,5,2,0,15,1,0,4,3,2,0,13,11,1,6,0 -5249,4,0,0,3,6,4,0,1,0,1,12,11,0,27,0 -5250,10,5,0,4,4,2,8,4,2,0,10,11,1,39,0 -5251,1,3,0,5,0,2,5,1,3,0,17,20,1,26,0 -5252,2,2,0,0,2,3,14,2,4,0,15,2,2,39,0 -5253,3,7,0,4,5,5,3,5,1,1,1,16,2,32,1 -5254,5,2,0,0,6,6,5,3,4,0,13,16,2,7,0 -5255,9,5,0,13,6,3,4,5,0,1,18,5,1,1,1 -5256,6,5,0,0,4,1,11,4,2,1,18,3,5,27,1 -5257,8,1,0,15,3,6,5,4,0,0,12,2,5,26,0 -5258,7,5,0,15,6,1,11,1,1,1,12,2,3,36,0 -5259,3,8,0,4,2,5,9,4,2,1,13,20,1,40,0 -5260,4,6,0,3,6,4,3,4,4,0,2,11,3,5,0 -5261,9,1,0,3,3,1,14,4,3,1,6,11,0,32,0 -5262,5,7,0,3,6,4,3,3,4,0,12,17,3,14,0 -5263,4,8,0,5,0,5,0,3,0,1,0,20,4,2,0 -5264,1,0,0,5,2,5,3,2,1,0,13,18,1,28,0 -5265,4,0,0,0,6,4,10,1,1,0,2,15,1,20,0 -5266,10,5,0,13,5,5,4,0,1,0,7,7,3,4,1 -5267,2,8,0,15,1,0,3,1,2,1,10,4,4,6,0 -5268,10,4,0,5,2,0,13,0,4,0,3,7,0,14,1 -5269,1,7,0,8,6,2,7,0,3,1,5,9,4,5,1 -5270,10,1,0,13,0,1,9,1,3,1,13,11,3,12,0 -5271,1,8,0,8,1,5,8,3,2,0,2,15,4,4,0 -5272,1,3,0,13,4,3,9,3,3,1,7,2,3,2,0 -5273,6,2,0,9,0,0,9,4,1,0,2,1,1,31,0 -5274,9,4,0,4,6,0,14,4,2,0,2,16,3,27,0 -5275,4,4,0,12,3,3,0,4,2,1,16,11,5,31,0 -5276,7,1,0,14,1,2,13,0,1,0,16,13,3,32,1 -5277,1,3,0,7,0,6,0,4,3,1,2,15,2,38,0 -5278,0,4,0,3,1,3,3,3,0,1,8,11,1,11,0 -5279,7,8,0,5,5,3,13,3,1,1,2,11,0,27,0 
-5280,2,5,0,12,6,4,7,4,0,0,17,11,0,22,0 -5281,1,8,0,11,4,6,8,5,4,0,5,10,4,6,1 -5282,3,8,0,15,0,1,14,3,0,0,4,2,3,23,0 -5283,6,4,0,5,4,2,10,3,0,0,13,12,5,22,0 -5284,7,2,0,12,3,2,3,0,3,0,5,9,3,30,1 -5285,3,6,0,1,6,2,7,4,1,0,6,11,5,37,0 -5286,5,3,0,12,5,5,12,2,2,0,2,2,3,41,0 -5287,1,7,0,10,2,0,11,2,1,1,7,13,2,17,1 -5288,6,3,0,9,3,0,9,1,4,1,14,14,3,12,1 -5289,0,8,0,10,1,1,8,4,3,0,4,2,2,24,0 -5290,10,8,0,11,4,5,8,4,1,1,0,9,0,33,0 -5291,3,8,0,1,3,4,3,2,3,0,8,11,0,20,0 -5292,2,6,0,11,2,5,8,3,3,1,0,0,0,31,0 -5293,8,6,0,4,5,1,3,5,0,1,1,10,4,9,1 -5294,10,1,0,6,3,2,6,5,1,1,14,1,5,32,1 -5295,5,4,0,6,0,5,9,3,1,1,13,0,4,35,0 -5296,2,3,0,12,0,3,8,0,3,0,15,11,2,30,0 -5297,0,6,0,13,0,4,7,3,3,1,13,2,1,29,0 -5298,3,5,0,0,3,5,6,5,2,0,2,11,4,16,0 -5299,8,3,0,8,4,4,11,1,2,0,14,9,2,38,1 -5300,2,1,0,6,2,5,11,5,3,1,4,19,2,37,1 -5301,6,7,0,4,0,4,0,3,4,0,17,0,3,7,0 -5302,8,8,0,0,0,4,14,2,3,0,16,12,0,27,0 -5303,2,8,0,6,1,1,5,1,4,1,10,14,1,31,0 -5304,0,3,0,15,5,1,9,4,1,1,8,13,0,19,0 -5305,3,3,0,13,5,0,8,0,2,0,17,10,3,25,0 -5306,1,1,0,1,0,2,11,2,4,0,11,11,0,10,0 -5307,0,8,0,13,0,2,7,1,0,1,13,20,4,20,0 -5308,3,8,0,12,0,5,14,4,2,0,10,18,5,4,0 -5309,1,6,0,8,6,1,7,3,0,1,12,17,3,21,1 -5310,10,7,0,14,6,4,11,2,0,0,13,18,5,19,0 -5311,1,2,0,15,6,4,11,4,4,0,11,3,2,23,0 -5312,0,6,0,12,4,2,14,1,1,1,14,11,0,11,0 -5313,9,2,0,12,2,0,13,1,1,1,7,7,5,3,1 -5314,1,0,0,10,5,3,6,1,0,0,8,10,1,29,0 -5315,4,5,0,1,5,4,6,1,4,1,7,1,2,1,1 -5316,0,0,0,1,4,6,1,2,4,0,10,13,0,30,0 -5317,1,2,0,1,5,2,2,5,1,1,2,9,0,4,0 -5318,7,5,0,0,0,4,14,4,0,1,17,14,2,19,0 -5319,4,5,0,9,3,6,3,2,4,1,11,19,4,13,1 -5320,8,8,0,15,6,2,8,3,0,1,6,11,2,39,0 -5321,10,8,0,2,0,2,12,3,1,1,2,10,2,10,0 -5322,5,1,0,1,1,6,3,4,2,1,0,11,4,35,0 -5323,2,5,0,13,2,0,4,4,2,1,17,19,0,37,0 -5324,3,7,0,1,2,3,5,3,2,1,4,15,5,20,0 -5325,5,7,0,0,1,5,12,0,2,0,0,11,3,19,0 -5326,5,7,0,2,3,3,6,2,2,0,17,2,0,27,0 -5327,9,6,0,0,6,2,6,2,0,0,2,13,0,33,0 -5328,4,6,0,5,3,3,10,5,4,0,18,19,5,16,1 -5329,7,2,0,11,0,6,13,1,3,0,8,12,4,26,0 -5330,3,7,0,8,5,0,5,1,0,0,8,11,0,17,0 
-5331,3,8,0,0,4,5,2,3,3,0,17,9,0,27,0 -5332,4,7,0,15,1,4,10,5,2,0,17,13,1,21,0 -5333,0,2,0,12,4,3,5,4,4,0,12,2,3,16,0 -5334,10,3,0,1,6,2,10,5,4,0,14,6,5,29,0 -5335,7,5,0,10,3,5,0,4,0,1,8,0,4,41,0 -5336,2,2,0,7,0,6,14,4,0,0,1,7,5,26,1 -5337,2,5,0,15,0,0,8,3,0,0,2,12,1,16,0 -5338,7,3,0,6,6,1,4,3,3,1,8,6,0,31,0 -5339,3,6,0,11,4,4,5,3,1,0,1,2,0,5,0 -5340,1,7,0,8,3,5,12,4,2,1,14,5,5,1,1 -5341,7,6,0,8,2,1,5,0,1,1,12,2,3,8,0 -5342,0,5,0,13,6,4,5,3,0,0,11,13,1,13,0 -5343,2,8,0,4,6,1,8,2,0,0,13,6,2,36,0 -5344,1,6,0,12,0,2,8,1,1,0,2,18,0,27,0 -5345,7,0,0,15,5,0,2,5,2,0,9,5,1,13,1 -5346,4,6,0,0,0,0,3,1,3,0,6,9,0,23,0 -5347,0,1,0,3,2,0,5,3,2,0,13,18,1,5,0 -5348,10,0,0,0,5,6,8,1,2,0,13,2,2,3,0 -5349,9,8,0,12,6,1,0,5,2,1,9,5,4,10,1 -5350,5,8,0,1,6,5,0,2,0,1,6,2,4,39,0 -5351,4,4,0,13,6,5,7,4,2,0,2,20,1,19,0 -5352,0,6,0,2,2,3,9,4,1,1,2,11,5,8,0 -5353,9,3,0,10,3,4,8,2,4,1,18,7,4,30,1 -5354,3,6,0,8,2,6,7,5,4,0,16,10,2,27,1 -5355,2,4,0,13,0,0,1,3,4,1,0,11,1,31,0 -5356,7,4,0,13,6,1,1,3,0,0,18,4,4,28,1 -5357,8,6,0,5,4,0,13,2,3,1,9,19,3,37,1 -5358,9,0,0,9,0,4,6,0,3,0,9,10,0,7,1 -5359,7,3,0,4,6,5,0,2,0,0,8,11,1,22,0 -5360,4,5,0,0,5,3,0,1,1,1,13,2,0,12,0 -5361,3,6,0,12,2,0,2,3,1,0,15,19,3,18,1 -5362,0,0,0,15,3,3,7,4,2,0,2,12,5,4,0 -5363,3,2,0,11,3,4,7,3,0,1,6,9,2,35,0 -5364,8,1,0,10,6,1,9,1,2,0,18,10,4,29,1 -5365,8,8,0,4,3,2,0,4,0,1,13,6,1,3,0 -5366,1,6,0,1,6,6,8,5,0,0,2,11,2,26,0 -5367,6,6,0,1,6,4,6,3,2,1,13,2,4,40,0 -5368,6,2,0,11,1,4,0,5,1,1,13,0,4,40,0 -5369,9,4,0,9,5,4,6,0,2,1,8,20,4,11,0 -5370,1,4,0,10,5,4,13,0,4,1,13,0,2,28,0 -5371,5,4,0,3,4,2,11,1,2,0,11,3,2,20,1 -5372,8,1,0,14,2,2,14,4,4,1,11,7,5,13,1 -5373,0,4,0,7,5,2,11,1,1,1,18,1,4,20,1 -5374,6,6,0,9,6,5,5,1,2,1,15,2,2,10,0 -5375,6,5,0,3,1,3,10,0,2,1,2,0,0,18,0 -5376,2,1,0,10,3,5,4,4,4,0,13,10,2,26,0 -5377,2,1,0,12,0,5,5,1,2,0,13,11,5,36,0 -5378,8,4,0,8,1,1,7,3,3,1,18,12,3,19,1 -5379,0,0,0,11,0,6,13,1,3,1,2,2,1,8,0 -5380,6,8,0,5,6,1,14,4,4,0,4,0,3,6,0 -5381,9,4,0,8,5,2,9,2,2,1,6,3,1,33,0 -5382,2,0,0,13,4,5,12,0,3,0,13,2,1,20,0 
-5383,9,8,0,14,1,2,0,4,0,0,8,9,3,17,0 -5384,10,8,0,8,6,5,11,2,2,1,2,11,1,28,0 -5385,0,4,0,11,0,4,5,4,0,1,4,2,0,38,0 -5386,2,5,0,5,1,3,8,1,1,1,15,20,1,19,0 -5387,0,6,0,7,0,6,11,5,0,0,10,2,5,7,0 -5388,0,0,0,6,5,5,11,5,4,0,13,4,4,22,0 -5389,5,4,0,5,0,6,10,3,2,0,13,6,2,8,0 -5390,1,5,0,3,3,0,4,5,2,1,2,17,5,29,0 -5391,0,0,0,15,0,0,8,2,2,0,10,15,0,2,0 -5392,3,0,0,5,0,1,2,1,3,0,8,3,0,40,0 -5393,3,8,0,0,5,3,0,3,2,1,0,19,1,4,0 -5394,1,0,0,9,1,5,9,3,4,0,13,2,2,39,0 -5395,5,6,0,4,1,5,9,1,0,1,2,11,4,36,0 -5396,3,5,0,7,1,6,1,2,3,1,8,9,3,12,0 -5397,10,7,0,13,4,5,8,0,1,0,1,6,1,35,0 -5398,2,8,0,2,6,0,2,0,1,0,4,11,2,3,0 -5399,1,3,0,8,0,3,0,5,4,0,17,16,3,7,0 -5400,10,3,0,12,0,5,11,4,4,0,0,2,0,25,0 -5401,1,2,0,0,2,1,10,1,0,0,2,8,4,8,0 -5402,2,5,0,11,2,2,6,1,1,1,7,12,2,33,0 -5403,0,8,0,11,0,5,13,1,1,1,14,12,1,26,0 -5404,9,8,0,10,6,3,9,4,3,1,18,3,2,22,1 -5405,3,3,0,4,4,2,10,1,4,1,0,13,4,32,1 -5406,0,8,0,12,4,5,6,3,4,0,12,15,2,6,0 -5407,10,6,0,5,0,3,1,2,0,1,8,4,2,6,0 -5408,1,6,0,5,2,5,3,3,4,1,13,16,2,13,0 -5409,8,3,0,7,3,3,3,2,2,0,8,15,0,4,0 -5410,1,4,0,12,3,0,8,3,4,1,6,2,2,35,0 -5411,3,8,0,3,0,4,5,3,1,0,13,18,1,3,0 -5412,5,2,0,4,4,2,10,3,2,1,18,7,5,1,1 -5413,7,2,0,4,0,4,7,3,0,0,8,12,2,0,0 -5414,0,2,0,14,5,3,14,2,2,1,4,3,2,25,0 -5415,3,6,0,3,5,0,6,0,4,0,13,15,2,20,0 -5416,4,1,0,3,0,3,3,4,3,1,11,7,4,28,1 -5417,1,5,0,5,4,0,8,1,4,0,12,15,0,29,0 -5418,8,2,0,5,2,4,11,5,4,1,8,5,0,23,0 -5419,10,6,0,5,6,4,9,3,2,0,0,6,0,12,0 -5420,6,2,0,15,0,5,5,0,3,1,13,13,3,26,0 -5421,10,3,0,6,6,0,8,4,0,1,8,3,2,30,0 -5422,6,6,0,5,4,5,4,5,0,1,12,12,3,4,1 -5423,2,4,0,13,2,1,7,3,1,1,8,18,3,39,0 -5424,1,2,0,10,6,0,8,0,2,0,0,18,3,3,0 -5425,3,4,0,6,2,3,8,4,0,1,8,2,0,32,0 -5426,5,7,0,2,3,6,1,0,1,0,15,19,5,37,0 -5427,0,5,0,13,2,3,14,3,0,1,15,5,4,3,1 -5428,7,1,0,13,2,3,6,0,1,0,9,14,3,38,1 -5429,2,8,0,1,6,5,10,3,1,0,2,1,0,19,0 -5430,5,1,0,2,0,5,5,2,2,1,12,11,5,37,0 -5431,9,6,0,13,3,6,7,5,0,0,13,20,1,39,0 -5432,4,5,0,9,1,5,12,2,4,1,18,9,4,16,1 -5433,1,7,0,15,0,2,12,2,0,1,8,5,3,2,0 -5434,8,3,0,10,3,0,10,3,2,0,9,10,5,10,1 
-5435,8,6,0,12,3,1,5,2,1,1,16,20,1,28,1 -5436,1,0,0,0,3,4,5,2,2,0,13,15,1,32,0 -5437,7,4,0,5,6,5,4,2,1,1,8,6,0,4,0 -5438,6,0,0,7,1,0,1,5,0,0,0,9,1,41,0 -5439,0,6,0,13,1,0,7,4,1,0,4,20,3,29,0 -5440,0,3,0,5,6,1,14,4,1,0,1,18,3,18,0 -5441,0,0,0,8,4,0,12,0,3,0,2,2,1,5,0 -5442,2,0,0,7,0,6,11,5,4,1,16,10,4,18,1 -5443,1,6,0,2,4,4,8,3,1,1,3,12,4,0,1 -5444,0,6,0,8,6,5,12,2,3,1,10,0,4,37,0 -5445,4,2,0,4,0,1,14,1,3,1,2,18,2,30,0 -5446,1,6,0,5,1,0,8,3,0,0,8,11,1,27,0 -5447,0,7,0,3,3,5,3,3,1,1,13,2,0,14,0 -5448,10,8,0,15,0,5,8,3,1,1,15,3,0,39,0 -5449,9,7,0,9,0,1,12,1,0,0,14,4,0,3,0 -5450,0,8,0,13,6,4,14,1,3,0,0,4,1,25,0 -5451,8,6,0,2,6,0,0,2,1,0,8,11,3,1,0 -5452,6,3,0,0,0,0,13,5,3,1,8,6,5,23,0 -5453,1,6,0,8,4,5,0,4,0,1,14,12,5,40,0 -5454,9,6,0,2,3,6,11,1,2,1,2,0,1,22,0 -5455,9,1,0,0,6,6,2,2,3,1,4,6,1,11,0 -5456,0,2,0,1,4,4,4,2,4,1,13,18,1,9,0 -5457,0,4,0,15,4,6,2,2,1,1,1,0,1,12,1 -5458,1,8,0,2,4,0,14,4,1,1,13,12,4,39,0 -5459,9,8,0,15,6,1,9,2,2,0,2,15,4,12,0 -5460,8,3,0,6,0,0,2,3,3,0,3,6,2,6,0 -5461,1,1,0,10,5,6,11,0,3,0,10,20,0,41,0 -5462,9,0,0,14,4,0,4,0,4,1,0,5,4,10,1 -5463,2,7,0,2,0,4,3,3,0,0,8,19,2,9,0 -5464,2,6,0,0,5,1,1,2,4,1,13,2,2,13,0 -5465,4,7,0,11,3,2,10,3,3,1,9,7,3,18,1 -5466,6,6,0,11,5,4,2,3,3,1,0,18,3,10,0 -5467,3,8,0,13,1,2,9,4,0,0,0,11,2,39,0 -5468,6,5,0,8,2,6,5,2,1,0,9,15,2,4,1 -5469,2,0,0,8,1,1,8,4,2,1,8,2,5,31,0 -5470,2,2,0,7,4,0,9,1,2,0,2,13,0,24,0 -5471,6,0,0,1,2,6,14,3,3,0,13,2,2,29,0 -5472,2,6,0,0,0,0,8,0,2,0,13,0,4,31,0 -5473,4,0,0,8,0,6,13,0,4,0,10,11,3,26,0 -5474,2,5,0,7,6,5,0,3,1,0,8,11,2,17,0 -5475,3,6,0,7,6,0,1,3,2,1,13,20,4,0,0 -5476,7,3,0,6,6,1,12,1,4,0,13,15,4,36,0 -5477,5,6,0,4,0,3,5,5,0,1,17,3,1,4,0 -5478,3,7,0,11,3,6,9,1,2,1,8,4,1,12,0 -5479,3,5,0,12,2,4,3,0,0,1,1,1,4,28,1 -5480,6,6,0,11,0,2,1,5,1,1,1,5,5,31,1 -5481,3,4,0,12,0,4,5,2,2,1,4,2,0,8,0 -5482,10,8,0,13,6,5,7,4,1,0,2,20,4,0,0 -5483,2,7,0,7,0,5,8,3,2,1,5,14,3,34,0 -5484,6,8,0,8,6,3,14,2,2,0,17,2,2,6,0 -5485,5,4,0,6,4,0,6,2,0,1,4,5,3,21,1 -5486,8,6,0,11,1,6,10,3,3,1,17,13,3,29,0 
-5487,3,6,0,12,2,5,1,1,3,0,13,12,1,0,0 -5488,9,5,0,14,5,3,9,0,2,1,4,13,1,5,0 -5489,10,3,0,3,1,3,8,0,3,1,8,11,0,25,0 -5490,1,0,0,13,0,4,14,2,0,0,0,2,5,38,0 -5491,7,8,0,13,6,3,9,4,3,1,2,6,2,31,0 -5492,6,7,0,8,0,4,2,1,1,1,12,5,1,7,1 -5493,0,2,0,7,5,4,7,5,1,1,11,18,2,32,0 -5494,5,1,0,2,3,3,2,0,1,1,3,7,0,21,1 -5495,8,3,0,9,0,5,0,1,0,0,2,2,1,3,0 -5496,3,5,0,13,2,0,6,5,2,1,1,15,5,32,1 -5497,5,7,0,8,4,4,2,3,0,0,13,14,2,22,0 -5498,2,2,0,8,5,4,9,2,3,1,4,0,5,9,0 -5499,9,3,0,0,0,5,7,2,2,0,6,20,0,34,0 -5500,0,4,0,4,2,0,4,3,1,0,0,15,4,8,0 -5501,1,0,0,3,2,4,9,4,4,1,15,15,3,32,0 -5502,4,1,0,11,6,5,6,3,2,1,13,0,5,4,0 -5503,8,7,0,13,2,5,1,3,1,0,9,8,1,7,1 -5504,9,6,0,15,3,0,11,0,2,0,2,5,5,22,0 -5505,9,4,0,3,0,3,8,4,3,1,17,11,3,11,0 -5506,2,3,0,1,0,2,8,0,1,0,4,2,0,30,0 -5507,7,5,0,2,3,1,9,1,0,0,13,1,3,25,0 -5508,7,0,0,0,1,4,3,5,3,1,17,19,3,11,1 -5509,0,7,0,13,3,6,13,4,4,0,4,6,0,8,0 -5510,9,1,0,2,6,2,7,5,4,1,18,19,2,35,1 -5511,0,3,0,12,0,6,0,0,1,1,11,11,0,14,0 -5512,2,6,0,2,0,1,2,0,0,0,13,20,2,23,0 -5513,9,0,0,6,1,3,5,4,2,0,17,2,2,37,0 -5514,5,7,0,6,4,6,11,4,2,1,11,15,5,30,1 -5515,10,8,0,11,2,2,14,4,1,1,5,17,2,11,1 -5516,5,8,0,9,3,3,3,4,1,0,4,20,0,21,0 -5517,0,5,0,3,5,1,5,4,3,1,13,11,5,40,0 -5518,5,5,0,8,3,5,8,0,1,0,6,18,2,12,1 -5519,9,6,0,1,1,2,11,2,4,1,5,7,5,38,1 -5520,2,8,0,13,3,2,2,1,0,0,17,11,2,30,0 -5521,0,0,0,12,1,3,0,0,0,1,1,2,1,21,0 -5522,5,7,0,10,3,4,5,4,2,1,4,4,5,11,0 -5523,8,8,0,5,2,0,10,3,3,0,3,7,5,6,1 -5524,2,1,0,15,6,6,10,0,2,1,18,19,3,10,1 -5525,5,4,0,5,1,4,9,5,0,1,13,6,2,12,0 -5526,8,0,0,0,2,4,3,3,4,0,3,2,2,22,0 -5527,3,8,0,1,4,0,5,3,0,1,17,4,4,19,0 -5528,1,6,0,0,0,0,10,0,0,0,10,14,1,31,0 -5529,10,3,0,11,2,3,1,3,0,0,12,20,5,27,0 -5530,8,0,0,5,6,5,2,2,3,0,4,3,2,31,0 -5531,5,1,0,14,3,1,1,1,0,0,15,13,0,13,0 -5532,0,0,0,10,4,6,3,2,4,0,13,18,0,35,0 -5533,1,1,0,9,4,0,12,1,3,1,11,17,1,11,1 -5534,9,0,0,13,6,6,3,4,0,1,8,6,2,18,0 -5535,8,7,0,9,6,0,6,0,2,0,17,7,2,1,1 -5536,0,4,0,13,1,4,1,3,0,0,2,11,1,38,0 -5537,2,5,0,13,0,5,7,4,0,0,6,18,4,33,0 -5538,2,2,0,12,6,0,7,2,4,1,1,9,2,8,0 
-5539,0,0,0,15,0,6,12,3,4,0,8,2,4,38,0 -5540,5,8,0,6,1,5,5,0,0,0,16,13,4,29,0 -5541,3,3,0,5,3,3,10,4,0,0,12,19,0,4,0 -5542,0,5,0,13,1,6,11,4,1,1,6,8,0,19,0 -5543,6,3,0,3,3,0,14,1,2,1,17,14,5,9,0 -5544,3,3,0,5,3,1,4,0,1,0,13,11,5,24,0 -5545,4,2,0,4,2,5,2,0,4,0,11,15,0,0,0 -5546,2,0,0,12,2,4,1,2,4,0,8,11,1,31,0 -5547,6,0,0,3,5,2,7,3,3,1,8,11,0,31,0 -5548,7,5,0,14,6,1,4,5,1,1,3,4,3,31,1 -5549,1,8,0,5,0,6,9,2,4,1,6,15,3,0,0 -5550,2,6,0,13,0,4,5,0,2,0,6,11,1,22,0 -5551,10,6,0,1,2,0,8,1,4,0,6,2,3,18,0 -5552,10,5,0,1,2,1,1,0,4,1,6,1,2,18,1 -5553,6,4,0,14,1,3,4,3,4,0,11,2,0,1,0 -5554,9,5,0,4,0,6,5,3,3,1,2,14,5,3,0 -5555,9,5,0,15,1,2,4,1,1,1,14,8,1,21,1 -5556,7,5,0,0,4,5,2,0,0,1,18,6,4,37,1 -5557,9,8,0,3,3,2,8,3,2,0,2,0,5,39,0 -5558,7,6,0,8,0,5,9,1,2,0,6,0,1,4,0 -5559,6,2,0,14,2,4,0,2,0,1,13,9,3,12,0 -5560,3,5,0,2,0,3,8,3,1,1,2,20,0,40,0 -5561,2,4,0,9,0,4,3,4,3,1,0,2,0,3,0 -5562,5,7,0,9,1,5,0,5,3,1,18,7,3,9,1 -5563,6,6,0,6,5,5,9,4,0,0,17,18,0,32,0 -5564,9,7,0,5,2,4,9,3,0,0,17,11,0,10,0 -5565,5,3,0,1,4,4,3,5,1,1,5,5,1,26,1 -5566,8,5,0,9,4,5,11,1,0,0,2,2,1,30,0 -5567,2,5,0,15,6,5,4,4,1,1,8,11,1,0,0 -5568,5,8,0,3,0,1,8,3,1,1,11,14,0,17,0 -5569,9,1,0,8,6,1,10,5,3,0,1,3,4,25,1 -5570,5,1,0,3,0,2,12,5,0,0,6,13,0,29,0 -5571,5,8,0,3,6,4,14,5,2,1,18,5,3,12,1 -5572,7,1,0,2,3,3,10,0,1,0,3,4,0,6,0 -5573,2,4,0,3,6,6,14,1,3,1,17,9,0,24,0 -5574,1,0,0,5,1,3,2,0,3,0,4,15,5,38,0 -5575,4,1,0,11,2,5,8,2,2,0,13,16,5,34,0 -5576,3,1,0,11,0,1,6,5,2,0,0,7,4,21,1 -5577,5,0,0,5,0,4,14,3,0,0,12,9,5,26,0 -5578,0,4,0,11,0,0,11,5,1,0,2,2,2,19,0 -5579,4,1,0,10,1,1,13,2,2,1,5,19,3,26,1 -5580,9,5,0,2,4,6,1,5,3,0,18,7,0,39,1 -5581,10,1,0,9,1,6,9,1,0,0,8,15,4,30,0 -5582,2,7,0,1,1,0,11,2,1,0,6,3,1,31,0 -5583,7,7,0,4,0,0,8,0,3,1,0,4,0,13,0 -5584,2,4,0,5,5,5,2,0,3,1,12,11,0,18,0 -5585,3,3,0,2,1,1,0,1,3,0,4,7,4,7,1 -5586,3,5,0,14,4,3,4,1,1,1,2,4,4,4,0 -5587,2,3,0,10,3,4,11,5,2,1,9,10,0,31,1 -5588,8,7,0,7,3,2,3,2,0,0,13,6,1,17,0 -5589,1,0,0,1,1,0,0,2,0,0,8,2,2,28,0 -5590,6,6,0,3,0,5,12,5,4,0,13,11,3,10,0 
-5591,1,2,0,8,0,2,7,4,0,0,8,2,5,30,0 -5592,8,8,0,15,5,4,1,1,2,1,13,14,4,33,0 -5593,9,0,0,7,2,1,3,0,4,1,8,11,2,26,0 -5594,0,8,0,3,3,4,13,4,2,1,17,11,1,41,0 -5595,9,1,0,9,1,1,3,3,0,1,18,7,5,33,1 -5596,8,7,0,1,6,0,5,4,3,1,2,8,4,21,0 -5597,10,2,0,2,3,5,12,0,1,1,11,10,2,29,0 -5598,7,6,0,8,5,5,7,0,1,0,0,9,2,19,0 -5599,10,2,0,13,0,4,1,4,0,0,2,19,0,28,0 -5600,6,1,0,1,6,0,8,1,0,0,2,2,5,2,0 -5601,5,8,0,6,5,2,11,1,4,1,18,16,3,5,1 -5602,3,7,0,13,1,2,11,1,0,1,5,5,5,13,1 -5603,2,0,0,14,5,2,0,0,2,1,17,15,5,36,0 -5604,0,2,0,0,2,2,13,0,2,1,6,16,5,7,0 -5605,7,0,0,2,2,3,12,2,3,0,1,3,1,31,1 -5606,3,8,0,8,0,4,3,3,4,0,3,11,4,4,0 -5607,3,5,0,14,0,4,5,3,2,0,13,2,0,35,0 -5608,2,6,0,11,4,5,1,4,1,1,8,13,1,37,0 -5609,2,8,0,11,1,0,13,3,1,1,13,2,1,37,0 -5610,4,3,0,7,0,4,9,4,3,1,10,15,2,25,0 -5611,10,7,0,11,5,2,14,5,2,1,16,17,5,16,1 -5612,2,2,0,2,5,6,9,1,2,1,8,11,2,17,0 -5613,0,5,0,14,0,0,7,4,0,0,8,13,2,23,0 -5614,2,8,0,4,3,4,0,1,2,0,4,16,5,7,0 -5615,2,0,0,4,1,5,7,2,0,0,2,2,4,36,0 -5616,6,5,0,3,3,1,2,0,3,0,2,11,3,29,0 -5617,9,7,0,2,5,5,8,4,0,1,12,10,4,27,0 -5618,0,6,0,10,3,6,6,4,3,0,17,9,3,36,0 -5619,0,5,0,4,2,4,3,4,3,0,8,6,0,24,0 -5620,4,2,0,10,0,3,13,0,2,1,15,5,5,4,1 -5621,1,6,0,12,2,0,1,3,0,0,13,11,1,17,0 -5622,2,7,0,11,6,1,11,0,3,1,18,7,3,6,1 -5623,9,1,0,12,3,4,14,0,2,1,12,7,3,16,1 -5624,7,5,0,13,1,1,7,0,4,0,1,2,5,19,0 -5625,2,3,0,0,3,4,3,3,4,0,14,19,1,26,0 -5626,9,5,0,7,0,5,14,2,1,0,4,2,1,1,0 -5627,10,1,0,5,4,6,3,0,3,1,11,10,4,30,1 -5628,3,7,0,3,0,2,3,2,0,1,2,1,3,16,0 -5629,6,1,0,5,5,6,14,5,2,1,0,5,2,3,1 -5630,4,7,0,3,2,6,10,3,2,0,2,2,1,3,0 -5631,2,7,0,1,5,6,6,3,3,1,13,17,4,8,0 -5632,0,8,0,7,6,5,6,0,2,1,13,11,0,23,0 -5633,6,1,0,6,1,5,0,2,3,0,13,11,2,4,0 -5634,5,0,0,7,1,5,7,2,1,1,0,13,5,10,0 -5635,1,3,0,13,2,5,9,4,4,1,7,6,4,8,0 -5636,2,2,0,0,5,5,9,1,1,0,8,15,0,23,0 -5637,8,3,0,9,4,6,6,5,2,1,11,5,3,36,1 -5638,10,3,0,0,3,0,14,0,1,1,2,0,0,5,0 -5639,0,2,0,13,0,2,5,5,2,1,2,2,0,4,0 -5640,1,6,0,9,4,2,4,4,0,1,15,1,2,24,0 -5641,3,1,0,0,0,2,4,5,4,0,1,16,1,30,1 -5642,10,6,0,8,4,5,5,4,4,1,2,6,0,30,0 
-5643,10,7,0,12,6,5,7,1,3,1,2,11,0,14,0 -5644,1,1,0,13,0,6,1,3,0,0,8,3,1,26,0 -5645,10,1,0,11,4,4,12,5,2,1,1,5,5,35,1 -5646,9,2,0,13,3,4,8,3,1,1,2,2,4,2,0 -5647,10,0,0,14,0,5,8,4,1,1,0,9,3,5,0 -5648,5,6,0,0,0,6,9,2,1,1,2,13,1,6,0 -5649,9,8,0,8,0,6,4,4,1,1,9,5,2,2,1 -5650,7,8,0,7,1,5,8,3,4,1,3,2,3,38,0 -5651,5,5,0,12,5,5,8,4,3,0,8,11,2,27,0 -5652,9,4,0,11,6,5,0,4,2,1,15,2,4,6,0 -5653,5,4,0,5,3,1,0,0,3,0,14,7,1,10,1 -5654,4,6,0,0,4,1,11,0,1,0,1,19,0,26,1 -5655,2,4,0,11,1,5,5,3,4,1,13,14,2,10,0 -5656,6,5,0,12,5,4,2,5,0,0,13,20,2,19,0 -5657,10,6,0,8,5,5,9,0,3,0,2,0,0,6,0 -5658,1,3,0,2,6,5,11,0,1,0,13,11,3,32,0 -5659,3,1,0,3,0,1,13,5,1,1,1,8,4,26,1 -5660,6,0,0,7,6,4,4,2,3,0,15,11,0,30,0 -5661,2,0,0,8,1,2,1,4,4,0,16,18,5,7,0 -5662,8,4,0,7,2,1,10,5,2,1,1,1,2,1,1 -5663,2,1,0,14,1,1,4,3,1,0,7,19,2,4,1 -5664,9,5,0,6,4,1,3,2,4,1,5,17,1,41,1 -5665,1,5,0,0,6,3,11,3,2,0,15,15,1,21,0 -5666,6,7,0,3,4,6,7,5,1,0,3,17,5,23,1 -5667,1,0,0,11,2,4,13,1,2,1,15,2,2,33,0 -5668,0,0,0,6,5,0,6,0,4,1,2,18,4,35,0 -5669,2,5,0,3,1,4,0,3,3,1,10,18,1,11,0 -5670,2,3,0,4,6,3,8,2,0,0,2,11,2,4,0 -5671,7,4,0,15,5,4,3,3,0,0,8,3,1,18,0 -5672,7,7,0,9,4,1,14,1,0,1,16,1,1,3,1 -5673,0,7,0,3,2,0,3,3,3,1,13,2,2,3,0 -5674,1,7,0,0,3,4,4,3,3,0,10,6,3,13,0 -5675,0,6,0,4,1,4,8,2,0,1,6,18,0,3,0 -5676,6,6,0,2,0,6,5,1,3,1,18,5,3,9,1 -5677,4,8,0,8,0,4,5,2,4,0,6,18,5,31,0 -5678,9,2,0,5,4,3,4,0,4,0,4,14,0,26,0 -5679,1,8,0,6,6,5,7,3,0,0,14,6,1,4,0 -5680,7,4,0,8,0,0,7,3,2,0,6,6,2,7,0 -5681,0,8,0,11,1,1,12,4,3,0,12,4,4,37,0 -5682,7,1,0,12,2,6,2,4,2,1,2,2,1,23,0 -5683,1,0,0,1,0,4,12,2,1,1,15,9,0,6,0 -5684,2,6,0,3,2,4,12,2,0,1,17,16,2,24,0 -5685,1,6,0,15,6,0,10,1,3,1,2,2,4,20,0 -5686,10,5,0,1,1,3,12,0,2,1,18,10,4,17,1 -5687,10,1,0,2,5,0,5,4,0,0,11,16,0,26,1 -5688,6,8,0,10,2,5,11,5,3,1,8,16,3,0,0 -5689,5,8,0,9,6,3,7,4,2,1,17,16,1,6,0 -5690,8,6,0,11,2,4,11,0,1,1,8,8,4,21,1 -5691,3,7,0,14,0,2,1,1,0,1,6,15,0,2,0 -5692,0,5,0,11,4,3,5,3,3,0,2,4,2,20,0 -5693,0,5,0,4,4,4,7,4,0,0,13,11,5,5,0 -5694,3,2,0,6,2,0,14,2,2,1,13,20,2,33,0 
-5695,3,8,0,15,4,3,13,5,0,1,8,5,4,28,0 -5696,10,1,0,0,1,4,14,2,2,0,0,13,5,3,0 -5697,5,4,0,15,2,3,6,1,4,0,11,1,5,33,1 -5698,1,8,0,8,1,1,11,3,1,0,14,5,1,25,0 -5699,4,0,0,0,1,4,14,3,4,1,11,13,4,40,0 -5700,9,4,0,5,5,6,5,2,1,0,4,6,0,12,0 -5701,1,3,0,11,2,0,4,3,4,0,8,2,1,40,0 -5702,5,8,0,5,4,5,14,0,0,0,13,2,4,17,0 -5703,7,4,0,5,6,4,4,4,1,0,8,15,2,13,0 -5704,5,7,0,7,1,1,14,0,0,1,8,6,1,26,0 -5705,5,5,0,8,4,6,7,1,3,1,18,7,1,4,1 -5706,6,4,0,15,2,2,4,2,2,1,6,2,5,31,0 -5707,9,5,0,15,1,5,1,3,4,1,9,12,3,21,1 -5708,0,5,0,10,2,2,5,3,3,1,2,4,0,27,0 -5709,9,2,0,9,0,0,2,4,2,0,2,6,1,19,0 -5710,9,2,0,8,4,6,8,1,0,0,3,9,0,4,0 -5711,5,6,0,6,6,6,13,3,4,1,13,11,0,3,0 -5712,1,7,0,13,5,6,10,0,0,1,13,20,5,40,0 -5713,0,5,0,14,3,2,1,5,0,1,0,2,2,40,0 -5714,0,8,0,11,0,4,10,4,4,0,4,16,0,29,0 -5715,8,6,0,9,6,2,9,3,3,1,3,1,4,0,1 -5716,5,8,0,3,5,1,8,2,4,1,12,15,4,32,0 -5717,1,0,0,7,3,4,6,5,0,1,2,15,3,24,0 -5718,10,2,0,9,1,1,9,1,1,1,0,17,2,0,1 -5719,10,1,0,8,4,1,9,4,2,1,18,16,4,9,1 -5720,9,7,0,4,3,5,12,5,1,1,5,10,3,30,1 -5721,0,7,0,6,2,0,1,3,1,0,10,9,4,19,0 -5722,4,7,0,5,3,0,11,3,1,0,0,9,4,22,0 -5723,2,4,0,10,3,5,10,0,3,1,17,4,1,41,0 -5724,3,6,0,3,6,5,9,0,0,0,10,7,5,2,0 -5725,3,6,0,6,0,2,3,0,4,1,8,20,4,14,0 -5726,6,6,0,12,1,0,8,4,1,1,2,14,0,0,0 -5727,0,8,0,4,1,4,13,1,2,1,6,18,0,14,0 -5728,3,1,0,4,1,0,1,2,0,1,13,6,3,25,0 -5729,0,8,0,13,6,4,14,0,3,1,7,12,5,33,0 -5730,9,5,0,8,1,1,7,5,0,0,5,7,0,10,1 -5731,2,5,0,11,0,5,8,3,2,1,0,9,3,37,0 -5732,7,5,0,3,5,6,5,5,2,0,2,6,0,0,0 -5733,10,7,0,5,2,0,0,2,0,1,12,11,2,35,0 -5734,0,6,0,1,6,4,2,0,2,0,15,11,2,31,0 -5735,2,6,0,0,2,5,7,2,1,0,8,19,3,25,0 -5736,7,6,0,8,3,1,12,5,3,1,5,7,4,1,1 -5737,3,6,0,5,3,6,3,5,2,1,3,9,2,5,0 -5738,7,2,0,3,4,4,11,1,0,1,8,18,4,19,0 -5739,8,8,0,13,3,3,0,5,0,0,11,16,2,29,0 -5740,0,7,0,15,2,0,9,4,4,0,8,13,1,6,0 -5741,10,7,0,8,1,6,2,5,0,0,18,5,5,41,1 -5742,3,3,0,9,2,0,5,1,4,1,4,2,0,18,0 -5743,0,3,0,11,3,6,2,3,3,0,10,18,4,35,0 -5744,4,8,0,13,5,2,11,2,4,0,3,11,3,30,0 -5745,0,2,0,15,4,2,5,3,2,0,2,15,4,24,0 -5746,10,4,0,6,1,6,4,1,1,1,12,16,0,16,1 
-5747,0,7,0,6,2,0,11,5,1,0,0,13,5,14,0 -5748,6,2,0,14,2,4,0,0,2,1,12,11,1,3,0 -5749,3,6,0,15,5,2,7,0,4,0,2,16,0,5,0 -5750,3,6,0,15,6,1,2,2,3,1,6,9,0,33,0 -5751,9,5,0,3,4,0,2,2,3,0,14,10,5,8,1 -5752,5,7,0,1,3,5,5,5,3,1,7,2,1,16,0 -5753,1,0,0,14,4,6,3,5,4,0,11,2,3,22,1 -5754,9,7,0,2,1,4,1,1,3,0,6,13,3,38,0 -5755,9,2,0,8,2,2,12,2,3,0,1,7,3,6,1 -5756,4,6,0,0,2,3,4,0,2,1,14,10,4,6,1 -5757,0,6,0,9,0,4,12,1,4,0,5,2,3,5,0 -5758,6,8,0,0,4,5,6,0,4,1,18,12,2,23,1 -5759,0,8,0,1,5,2,14,0,1,1,6,2,3,10,0 -5760,0,3,0,6,2,0,3,2,2,1,2,20,2,16,0 -5761,7,1,0,15,6,2,5,5,3,1,15,2,1,41,0 -5762,9,7,0,6,2,2,3,2,4,1,16,7,2,2,1 -5763,2,7,0,3,1,4,4,0,2,0,13,2,0,4,0 -5764,10,7,0,7,1,0,7,3,0,1,6,3,0,10,0 -5765,5,1,0,3,0,6,2,3,0,0,10,6,5,20,0 -5766,6,7,0,10,1,3,7,1,2,0,14,11,1,19,0 -5767,0,0,0,8,0,2,12,4,2,0,6,9,4,38,0 -5768,5,1,0,8,3,5,9,1,0,1,10,7,5,16,1 -5769,1,0,0,6,3,6,8,3,0,1,18,7,3,10,1 -5770,0,7,0,4,5,4,1,3,1,1,17,17,5,20,0 -5771,3,0,0,1,4,3,2,3,4,1,8,11,1,38,0 -5772,10,5,0,14,6,4,11,3,4,0,6,17,3,37,1 -5773,9,4,0,13,2,4,6,3,3,1,17,15,5,17,0 -5774,8,2,0,11,4,0,4,1,3,0,9,3,4,26,1 -5775,9,1,0,0,1,5,1,3,1,1,5,12,2,23,1 -5776,7,2,0,10,3,3,9,4,0,1,14,18,1,5,0 -5777,1,5,0,13,1,6,11,0,4,0,13,5,1,27,0 -5778,8,6,0,0,6,0,12,1,1,1,13,20,0,32,0 -5779,2,3,0,9,6,4,8,4,4,0,0,18,1,3,0 -5780,3,2,0,10,4,4,12,2,2,1,1,19,2,41,1 -5781,3,1,0,8,3,6,6,0,1,1,16,10,3,25,1 -5782,7,1,0,1,1,6,0,5,0,0,0,14,4,4,0 -5783,1,1,0,14,6,2,3,1,0,0,13,15,2,10,0 -5784,3,0,0,4,5,0,9,5,0,0,9,2,0,4,0 -5785,7,8,0,13,1,5,3,4,0,1,2,0,0,6,0 -5786,7,8,0,6,1,6,5,1,2,0,15,18,4,5,0 -5787,2,6,0,8,6,5,3,1,0,0,0,13,2,12,0 -5788,1,4,0,5,0,4,2,5,2,1,2,0,0,34,0 -5789,0,4,0,0,5,5,4,1,3,0,13,11,1,6,0 -5790,0,7,0,2,0,5,3,5,2,0,8,5,2,18,0 -5791,10,4,0,10,2,2,4,3,0,0,1,8,4,38,1 -5792,1,3,0,15,3,4,14,3,4,1,0,2,2,3,0 -5793,7,2,0,0,6,0,4,2,0,0,2,18,1,8,0 -5794,0,3,0,10,0,2,10,1,0,0,14,11,2,20,0 -5795,1,7,0,2,0,5,7,2,0,0,2,9,0,3,0 -5796,4,1,0,10,4,0,14,2,2,0,18,0,2,29,1 -5797,8,2,0,5,1,2,12,3,3,0,5,15,3,16,0 -5798,3,5,0,1,1,0,5,1,4,1,2,20,5,10,0 
-5799,0,0,0,1,6,0,3,1,4,0,0,18,0,27,0 -5800,3,5,0,3,2,5,13,1,3,1,13,13,2,32,0 -5801,9,7,0,3,2,0,8,0,3,0,2,13,1,22,0 -5802,9,8,0,7,5,1,9,2,2,1,2,8,3,40,0 -5803,2,5,0,4,5,4,7,0,4,0,2,6,3,26,0 -5804,3,4,0,6,5,5,5,1,0,0,8,18,3,39,0 -5805,1,5,0,3,6,0,5,0,0,1,2,0,3,19,0 -5806,4,5,0,5,5,4,12,5,4,1,8,2,2,22,0 -5807,2,7,0,7,4,5,1,4,2,1,13,11,0,21,0 -5808,5,4,0,11,4,1,8,3,3,0,2,4,1,0,0 -5809,4,0,0,15,0,0,14,5,3,1,2,10,0,16,0 -5810,2,2,0,3,3,6,12,4,2,1,12,14,4,21,0 -5811,1,4,0,9,5,0,14,0,3,1,6,3,2,26,0 -5812,0,2,0,5,0,0,1,1,3,0,8,15,2,14,0 -5813,3,8,0,4,6,5,13,5,4,1,12,16,2,3,0 -5814,5,4,0,0,0,1,5,2,4,1,3,7,4,38,1 -5815,8,3,0,6,3,0,0,3,2,0,12,2,3,11,0 -5816,1,1,0,13,2,4,6,1,0,1,8,7,2,21,0 -5817,0,5,0,1,6,4,11,0,4,0,1,5,4,24,1 -5818,1,8,0,6,1,0,11,3,3,0,13,11,4,6,0 -5819,3,1,0,6,4,1,10,3,3,0,13,20,0,5,0 -5820,3,6,0,4,3,0,0,5,1,0,18,7,3,29,1 -5821,3,5,0,11,3,4,5,0,2,1,18,7,1,12,1 -5822,6,4,0,8,0,4,10,5,3,1,6,7,1,12,1 -5823,0,5,0,0,4,6,0,1,1,0,13,11,4,8,0 -5824,5,4,0,1,0,0,8,2,4,1,4,11,4,2,0 -5825,5,0,0,1,6,1,11,5,2,1,18,12,2,28,1 -5826,4,8,0,4,2,0,5,0,4,1,12,18,3,29,0 -5827,1,0,0,11,2,5,3,3,1,1,13,19,5,34,0 -5828,9,8,0,9,3,2,10,5,2,1,18,4,5,13,1 -5829,1,0,0,9,5,1,10,3,2,1,0,11,2,35,0 -5830,10,2,0,0,4,3,7,3,4,1,11,7,4,21,1 -5831,1,6,0,5,2,2,10,2,4,1,1,16,3,31,1 -5832,0,2,0,11,0,6,2,0,1,0,16,13,0,7,0 -5833,0,1,0,1,6,5,6,4,3,1,8,9,0,34,0 -5834,6,6,0,1,6,4,12,4,4,0,4,20,3,14,0 -5835,0,1,0,10,0,0,9,2,0,0,4,18,0,41,0 -5836,3,7,0,2,5,2,2,1,4,1,10,15,4,26,0 -5837,8,5,0,8,1,1,4,3,2,0,18,10,0,17,1 -5838,3,3,0,1,3,0,8,3,3,0,4,0,1,29,0 -5839,1,7,0,8,1,0,8,4,0,1,6,19,3,31,0 -5840,2,3,0,5,0,0,4,1,3,1,0,11,5,8,0 -5841,6,1,0,3,1,0,4,2,4,1,18,10,3,7,1 -5842,1,7,0,8,5,6,2,1,1,1,10,15,0,18,0 -5843,0,2,0,3,6,6,9,2,3,1,12,19,0,39,0 -5844,6,7,0,9,2,3,6,0,2,1,2,6,0,38,0 -5845,5,7,0,10,4,5,11,3,0,0,2,10,0,26,0 -5846,1,2,0,13,2,2,12,3,2,0,17,15,5,6,0 -5847,7,1,0,4,4,3,0,1,1,1,18,5,1,31,1 -5848,2,1,0,4,3,5,5,5,0,0,17,15,1,8,0 -5849,0,6,0,3,0,2,11,3,1,0,4,14,3,29,0 -5850,3,2,0,11,5,5,12,1,1,0,6,6,4,1,0 
-5851,9,3,0,6,6,3,9,0,2,0,4,2,2,29,0 -5852,3,6,0,2,5,5,14,0,1,0,13,4,0,3,0 -5853,0,6,0,5,5,1,13,3,1,0,10,18,0,34,0 -5854,9,8,0,11,5,3,0,0,4,0,0,4,0,40,0 -5855,5,1,0,15,6,4,10,4,1,0,17,17,1,0,0 -5856,6,6,0,13,2,0,13,2,2,1,13,15,2,36,0 -5857,4,3,0,7,1,0,4,3,4,0,13,9,0,33,0 -5858,0,2,0,5,0,2,3,2,2,0,2,5,4,7,0 -5859,0,5,0,11,0,0,9,4,0,0,6,12,1,30,0 -5860,1,2,0,3,6,5,1,1,1,1,17,16,4,12,0 -5861,7,2,0,5,6,4,9,5,0,1,4,18,4,28,1 -5862,2,8,0,4,2,5,10,0,0,0,15,0,3,38,0 -5863,0,7,0,11,6,3,2,1,2,1,1,9,0,25,0 -5864,6,0,0,11,0,3,4,0,3,0,16,9,1,9,0 -5865,5,1,0,2,2,6,9,1,0,1,0,0,1,25,1 -5866,9,6,0,1,5,5,2,2,1,1,17,15,1,38,0 -5867,10,2,0,5,5,6,10,1,4,0,13,13,0,33,0 -5868,9,1,0,12,3,4,5,5,1,1,14,19,5,22,1 -5869,2,5,0,0,3,5,8,0,0,0,13,2,1,9,0 -5870,10,5,0,15,5,4,10,4,2,0,13,6,0,3,0 -5871,10,2,0,5,6,6,1,3,3,0,17,2,2,22,0 -5872,9,6,0,5,4,1,4,5,0,1,16,8,3,22,1 -5873,5,4,0,10,4,0,0,3,4,0,6,6,5,30,0 -5874,9,8,0,9,6,4,13,1,2,1,18,9,5,4,1 -5875,1,2,0,10,3,5,13,4,2,1,4,15,1,30,0 -5876,2,7,0,15,3,2,1,2,2,0,0,0,1,12,0 -5877,0,6,0,3,0,1,9,3,0,0,5,11,1,10,0 -5878,4,6,0,12,0,3,5,1,4,1,8,15,3,41,0 -5879,9,2,0,13,1,6,6,4,1,1,13,11,1,39,0 -5880,7,5,0,11,1,2,9,1,3,1,13,4,1,37,0 -5881,3,4,0,13,0,6,0,3,2,0,11,16,4,35,0 -5882,5,2,0,4,4,1,8,4,4,1,2,0,4,14,0 -5883,1,5,0,2,6,4,14,2,1,0,5,2,2,30,0 -5884,9,4,0,13,5,4,8,2,2,1,0,2,4,9,0 -5885,8,7,0,6,0,0,8,5,3,0,8,5,1,30,0 -5886,4,6,0,14,6,1,9,4,4,1,3,10,2,4,1 -5887,2,6,0,0,6,0,5,4,0,0,4,20,0,8,0 -5888,5,4,0,13,0,4,4,1,2,0,12,12,5,38,0 -5889,8,7,0,11,3,2,7,3,2,0,7,11,1,14,0 -5890,0,8,0,11,2,0,9,2,4,0,8,3,1,21,0 -5891,10,0,0,6,1,4,4,3,4,1,0,19,1,4,0 -5892,5,5,0,12,0,0,6,1,2,1,13,18,3,28,0 -5893,1,7,0,5,6,5,6,3,4,1,10,16,4,26,0 -5894,10,4,0,9,0,1,7,4,1,0,0,13,4,10,0 -5895,3,8,0,2,3,3,6,1,0,0,6,0,2,3,0 -5896,2,7,0,5,1,1,11,1,2,0,4,6,5,8,0 -5897,3,7,0,3,5,1,3,4,4,0,13,18,3,2,0 -5898,2,1,0,13,1,3,10,3,3,1,6,6,4,3,0 -5899,3,3,0,14,5,6,13,5,3,0,9,10,4,6,1 -5900,1,6,0,5,0,4,0,3,0,0,13,6,1,2,0 -5901,0,7,0,1,6,1,3,0,0,1,13,6,4,27,0 -5902,6,5,0,0,5,4,9,2,2,0,10,2,4,14,0 
-5903,4,5,0,6,4,6,14,5,3,1,18,7,5,19,1 -5904,0,2,0,14,0,2,4,0,0,0,8,16,3,16,0 -5905,4,5,0,11,5,2,0,2,1,1,13,18,3,10,0 -5906,1,7,0,1,1,3,12,1,3,1,13,2,0,22,0 -5907,4,7,0,8,4,3,5,4,3,0,4,6,5,8,0 -5908,4,1,0,13,3,0,10,5,4,0,15,1,1,41,1 -5909,2,5,0,2,0,3,10,3,1,0,8,11,1,13,0 -5910,0,2,0,13,0,5,8,3,3,1,10,11,5,22,0 -5911,9,4,0,15,2,5,3,0,4,0,2,19,0,16,0 -5912,1,4,0,3,5,1,0,1,1,0,17,6,0,34,0 -5913,4,2,0,5,2,3,5,1,1,0,13,19,0,40,0 -5914,2,3,0,9,4,4,1,1,2,0,16,2,2,2,0 -5915,3,6,0,1,6,0,2,0,1,1,1,11,1,6,0 -5916,1,0,0,10,4,2,13,2,1,0,2,2,5,22,0 -5917,8,4,0,5,5,3,5,0,0,0,2,11,0,6,0 -5918,2,8,0,13,4,6,3,3,2,0,13,11,0,9,0 -5919,0,7,0,15,2,3,5,2,2,0,17,11,5,7,0 -5920,1,0,0,12,2,5,6,0,4,0,17,18,5,23,0 -5921,10,4,0,14,4,6,13,3,2,1,10,11,1,32,0 -5922,0,3,0,3,3,4,14,2,2,0,16,2,2,25,0 -5923,7,7,0,3,3,6,6,5,3,0,1,9,5,26,1 -5924,0,7,0,5,0,5,10,1,0,0,6,0,0,33,0 -5925,8,1,0,9,2,2,1,1,2,1,13,0,1,27,0 -5926,3,2,0,3,1,1,6,3,1,0,8,3,5,40,0 -5927,8,0,0,1,4,5,1,1,3,1,5,10,1,12,1 -5928,1,2,0,15,0,0,8,3,2,1,15,17,2,0,0 -5929,2,8,0,3,6,1,8,0,0,0,2,4,4,5,0 -5930,5,6,0,11,6,4,6,4,1,0,6,11,1,12,0 -5931,8,4,0,7,3,2,13,0,2,1,16,8,5,19,1 -5932,9,8,0,6,0,1,0,3,3,0,2,11,3,16,0 -5933,0,5,0,4,6,0,5,0,0,0,17,4,1,37,0 -5934,8,7,0,3,0,4,2,0,2,0,13,19,0,41,0 -5935,1,6,0,9,0,5,11,1,1,1,15,13,4,6,0 -5936,1,7,0,1,6,5,7,3,1,0,2,2,2,20,0 -5937,6,6,0,15,5,6,9,1,1,0,9,0,0,11,0 -5938,0,3,0,6,6,3,9,0,0,1,2,11,4,31,0 -5939,4,3,0,3,5,4,12,2,0,0,13,20,2,12,0 -5940,2,6,0,11,0,4,13,2,3,0,13,11,3,16,0 -5941,3,2,0,3,0,0,8,3,3,0,0,11,0,13,0 -5942,0,7,0,2,5,5,14,5,4,0,2,2,1,30,0 -5943,0,6,0,1,6,5,2,4,1,0,13,6,3,23,0 -5944,0,8,0,3,5,6,11,1,3,0,8,0,3,25,0 -5945,0,8,0,6,6,6,13,2,2,1,0,4,5,29,0 -5946,7,5,0,0,5,5,12,1,1,0,13,10,0,4,0 -5947,8,2,0,11,3,5,11,3,2,1,6,16,0,32,0 -5948,5,5,0,13,6,5,12,2,2,1,4,15,4,9,0 -5949,4,6,0,13,6,1,12,2,0,0,0,11,1,25,0 -5950,2,4,0,11,1,0,12,4,2,1,2,3,2,5,0 -5951,8,6,0,3,6,5,0,4,2,1,2,11,5,4,0 -5952,6,3,0,8,3,0,6,0,0,0,15,0,5,4,0 -5953,10,3,0,12,1,4,10,1,4,1,6,11,0,17,0 
-5954,6,2,0,8,1,0,10,1,2,0,10,6,4,18,0 -5955,3,7,0,9,0,1,13,4,2,0,5,0,2,8,1 -5956,1,3,0,0,0,5,1,4,3,0,4,18,1,30,0 -5957,3,0,0,12,2,2,9,2,0,0,13,6,0,11,0 -5958,3,5,0,7,6,3,11,5,1,1,16,1,4,34,1 -5959,5,3,0,3,6,2,9,0,0,1,13,9,1,3,0 -5960,5,5,0,3,4,2,14,1,4,1,1,4,4,1,1 -5961,0,8,0,5,6,5,11,4,0,0,17,8,2,26,0 -5962,1,7,0,13,0,3,9,4,4,1,4,20,3,14,0 -5963,2,0,0,13,1,5,12,2,2,0,8,11,5,5,0 -5964,1,4,0,7,1,1,11,2,0,0,17,2,2,10,0 -5965,3,2,0,2,3,4,5,1,2,1,2,2,2,33,0 -5966,7,6,0,3,0,0,5,2,3,0,13,12,3,24,0 -5967,4,7,0,11,5,6,5,4,2,0,2,15,0,24,0 -5968,0,6,0,9,0,0,1,4,3,1,5,9,4,19,0 -5969,7,7,0,0,6,6,10,1,1,0,13,18,1,5,0 -5970,0,6,0,8,0,5,6,5,2,1,13,13,2,12,0 -5971,0,8,0,14,2,5,4,3,1,0,13,15,0,17,0 -5972,10,0,0,8,6,4,14,1,0,0,8,6,4,4,0 -5973,6,1,0,2,5,2,11,4,2,1,18,4,4,32,1 -5974,0,0,0,7,1,3,3,2,2,1,13,6,3,6,0 -5975,0,2,0,8,4,0,9,2,3,0,17,0,4,22,0 -5976,3,0,0,15,0,6,14,2,4,1,11,3,3,38,1 -5977,9,7,0,0,2,4,5,2,3,1,4,13,2,17,0 -5978,0,6,0,14,1,0,0,5,3,1,13,2,5,14,0 -5979,2,1,0,3,6,5,8,3,0,1,17,4,2,9,0 -5980,0,4,0,8,1,0,9,3,0,0,13,11,0,6,0 -5981,9,4,0,2,6,6,6,4,4,0,2,20,5,22,0 -5982,7,7,0,11,3,1,2,1,4,0,2,12,2,33,0 -5983,1,3,0,7,6,3,12,1,0,1,1,2,1,4,0 -5984,6,1,0,11,2,3,11,3,0,1,3,7,1,29,1 -5985,0,0,0,3,1,4,6,4,3,1,11,0,3,32,0 -5986,10,0,0,10,4,0,6,0,4,0,18,14,4,28,1 -5987,2,7,0,4,5,6,3,3,1,0,0,3,3,8,0 -5988,6,8,0,10,1,0,12,3,0,1,2,2,3,3,0 -5989,7,4,0,9,0,5,14,5,0,1,3,12,3,22,0 -5990,7,3,0,8,0,6,13,2,4,0,8,11,2,5,0 -5991,0,2,0,1,0,5,11,0,3,1,2,6,2,0,0 -5992,9,4,0,12,5,6,3,5,2,1,17,13,3,6,0 -5993,9,7,0,4,5,0,3,3,0,0,11,6,4,34,0 -5994,1,3,0,14,5,3,2,3,3,0,10,2,3,27,0 -5995,6,4,0,11,0,4,6,1,0,0,0,9,5,22,0 -5996,1,7,0,9,1,5,8,4,4,0,12,20,4,4,0 -5997,8,0,0,6,5,4,11,4,2,0,5,9,3,17,0 -5998,0,5,0,5,5,5,3,0,3,0,13,9,0,18,0 -5999,8,0,0,8,4,6,3,5,2,0,5,7,1,9,1 -6000,8,6,0,0,3,4,8,2,1,0,11,11,1,21,0 -6001,7,0,0,8,2,0,1,2,0,0,5,11,4,30,0 -6002,9,7,0,10,5,0,8,3,1,1,12,11,2,12,0 -6003,3,2,0,8,6,5,8,1,2,0,0,2,1,13,0 -6004,1,8,0,15,2,4,6,4,2,0,15,15,2,39,0 -6005,2,6,0,9,6,6,4,5,3,0,9,7,3,2,1 
-6006,6,0,0,5,6,3,0,1,3,0,13,13,2,27,0 -6007,10,7,0,5,0,2,14,2,2,1,8,15,5,33,0 -6008,2,2,0,12,4,1,8,3,2,1,10,5,5,24,1 -6009,0,8,0,11,0,5,14,4,2,0,13,15,3,8,0 -6010,2,3,0,5,6,3,1,1,1,0,2,4,2,8,0 -6011,2,6,0,14,0,5,6,3,0,0,2,2,5,19,0 -6012,0,5,0,10,0,0,4,1,3,0,17,6,3,34,0 -6013,2,0,0,5,5,6,0,1,1,0,16,16,3,5,0 -6014,2,5,0,11,1,6,7,4,3,0,8,9,4,27,0 -6015,5,8,0,9,6,3,8,4,0,0,13,8,2,35,0 -6016,2,4,0,15,1,1,10,2,3,0,15,10,3,28,1 -6017,3,6,0,9,1,0,12,2,0,1,3,2,0,8,0 -6018,0,3,0,5,0,5,9,4,1,1,5,0,2,6,0 -6019,2,5,0,11,4,5,7,1,0,0,13,18,0,37,0 -6020,8,4,0,10,4,4,9,5,3,1,9,7,0,2,1 -6021,1,5,0,4,4,6,0,3,0,0,11,2,0,8,0 -6022,0,5,0,1,5,5,2,3,4,0,6,2,4,12,0 -6023,5,6,0,0,4,6,1,1,0,1,10,15,5,38,0 -6024,2,2,0,14,6,5,2,3,4,0,0,6,5,33,0 -6025,2,8,0,8,1,4,3,3,4,0,4,18,1,33,0 -6026,1,5,0,5,2,6,9,3,2,0,5,7,4,41,1 -6027,5,6,0,8,1,2,9,1,0,0,10,0,4,40,0 -6028,3,6,0,5,3,4,12,4,0,0,13,19,2,8,0 -6029,10,8,0,0,6,3,3,2,0,0,13,11,5,16,0 -6030,9,3,0,4,3,1,8,0,1,1,8,11,3,22,0 -6031,0,8,0,8,0,3,3,5,2,0,13,3,2,38,0 -6032,3,4,0,14,2,0,14,5,2,0,18,3,4,20,1 -6033,6,3,0,4,5,2,6,5,4,0,5,7,5,11,1 -6034,8,4,0,3,1,4,8,1,1,1,12,19,3,8,1 -6035,4,4,0,0,3,5,3,4,1,1,6,17,1,25,0 -6036,5,0,0,12,1,3,8,3,3,0,13,15,4,14,0 -6037,7,5,0,15,1,2,6,4,2,1,8,10,4,7,0 -6038,1,3,0,12,2,6,7,4,0,1,18,1,5,9,1 -6039,7,5,0,5,0,3,6,4,3,1,8,0,5,3,0 -6040,9,1,0,0,3,5,11,5,2,0,1,8,3,25,1 -6041,5,1,0,7,4,2,12,5,4,0,18,8,2,2,1 -6042,8,8,0,3,3,0,14,1,3,1,8,15,0,12,0 -6043,2,6,0,4,3,5,1,3,0,0,2,2,0,38,0 -6044,4,4,0,6,0,6,5,1,2,1,10,14,1,3,0 -6045,1,0,0,8,3,1,3,2,0,1,11,13,4,8,0 -6046,3,5,0,4,6,5,14,2,0,1,13,6,3,2,0 -6047,10,8,0,7,5,3,8,3,1,1,2,15,5,41,0 -6048,9,5,0,6,5,3,10,3,4,1,15,17,1,41,1 -6049,2,8,0,3,5,5,11,4,1,0,13,9,5,8,0 -6050,5,7,0,4,1,4,2,1,4,1,4,9,3,13,0 -6051,8,6,0,12,6,2,8,0,3,0,15,4,5,29,0 -6052,1,4,0,10,2,6,8,4,3,0,12,3,5,41,0 -6053,8,5,0,12,2,2,13,5,3,1,18,8,3,9,1 -6054,5,6,0,0,4,4,7,4,2,0,4,2,0,19,0 -6055,5,2,0,3,2,6,6,2,4,0,12,12,4,21,0 -6056,10,6,0,13,6,5,4,0,2,0,10,2,0,40,0 -6057,1,5,0,9,5,0,1,0,1,0,0,10,0,22,1 
-6058,8,7,0,3,3,1,1,5,0,1,9,7,1,37,1 -6059,1,4,0,5,0,6,2,2,0,1,2,18,2,2,0 -6060,7,7,0,15,4,3,1,2,2,0,16,2,3,11,0 -6061,8,2,0,5,5,5,13,2,0,1,8,0,2,21,0 -6062,9,5,0,12,0,0,13,5,3,1,9,11,5,35,1 -6063,3,8,0,3,4,0,13,3,1,0,2,9,1,35,0 -6064,2,5,0,6,1,4,8,1,2,0,9,7,2,16,0 -6065,1,7,0,0,5,1,3,2,0,0,2,1,0,14,0 -6066,4,0,0,12,1,5,2,0,4,1,17,1,2,39,0 -6067,9,8,0,11,4,1,6,4,1,1,11,8,1,22,1 -6068,1,2,0,5,6,6,6,2,2,0,2,13,1,21,0 -6069,3,8,0,3,5,2,11,0,4,0,14,2,0,10,0 -6070,6,8,0,1,4,1,8,4,4,1,18,8,0,21,1 -6071,2,1,0,13,5,0,8,2,0,0,6,4,5,29,0 -6072,5,8,0,12,3,4,0,0,1,1,6,17,4,8,0 -6073,6,3,0,8,2,0,9,0,0,0,8,11,0,33,0 -6074,7,0,0,1,1,4,4,0,2,0,14,8,5,36,1 -6075,8,3,0,8,5,2,11,3,0,1,18,5,2,23,1 -6076,1,0,0,7,6,5,0,1,3,0,13,13,4,17,0 -6077,1,2,0,2,0,5,5,5,3,1,4,9,0,28,0 -6078,4,1,0,13,2,5,10,3,0,0,15,14,4,6,0 -6079,2,5,0,0,5,4,7,3,2,1,1,7,1,0,1 -6080,0,5,0,7,1,4,5,4,0,0,2,2,4,1,0 -6081,7,4,0,10,2,5,4,4,0,0,8,9,5,16,0 -6082,8,1,0,1,6,3,3,2,3,0,13,18,3,11,0 -6083,9,7,0,10,6,6,11,3,2,0,8,2,5,10,0 -6084,2,5,0,3,4,2,14,5,0,0,5,7,1,23,1 -6085,0,7,0,15,6,1,4,2,4,0,16,12,0,23,0 -6086,1,8,0,11,0,0,12,2,2,0,2,15,2,19,0 -6087,3,7,0,8,2,1,9,0,1,0,4,0,3,8,0 -6088,10,2,0,2,2,5,5,4,2,0,13,11,1,18,0 -6089,1,6,0,15,1,6,8,4,4,0,8,6,1,6,0 -6090,6,2,0,1,4,5,5,4,0,1,13,2,2,6,0 -6091,2,4,0,1,4,4,0,0,3,1,6,6,5,14,0 -6092,8,1,0,14,4,3,1,4,2,1,9,10,3,11,1 -6093,0,5,0,14,1,3,0,4,3,1,8,15,3,36,0 -6094,9,4,0,5,1,1,10,4,3,0,4,13,5,39,0 -6095,7,7,0,0,3,2,7,5,0,0,4,10,2,38,1 -6096,2,2,0,11,6,1,11,3,2,1,6,2,3,12,0 -6097,6,0,0,9,6,1,11,2,3,0,12,15,1,17,0 -6098,6,4,0,2,0,4,12,4,4,0,15,11,3,41,0 -6099,7,7,0,6,2,5,8,5,0,1,1,7,5,29,1 -6100,3,0,0,6,6,5,9,2,0,0,6,0,1,10,0 -6101,10,7,0,10,3,1,0,5,1,1,16,7,5,29,1 -6102,6,5,0,7,3,3,5,2,0,0,2,2,1,19,0 -6103,7,6,0,5,4,1,8,2,2,0,8,2,1,38,0 -6104,9,6,0,12,6,6,13,5,3,0,18,7,4,41,1 -6105,7,8,0,12,0,6,14,0,2,0,13,8,0,33,1 -6106,1,1,0,13,3,0,7,2,4,1,17,2,1,39,0 -6107,3,2,0,13,5,3,9,2,2,0,15,20,0,3,0 -6108,3,5,0,0,5,5,11,2,1,1,18,11,2,33,1 -6109,3,4,0,13,4,4,2,0,4,0,10,0,1,25,0 
-6110,1,5,0,13,1,4,2,1,2,0,2,2,0,30,0 -6111,4,6,0,6,6,0,8,1,0,0,13,0,3,18,0 -6112,8,5,0,7,4,6,6,4,2,1,7,7,1,11,1 -6113,5,3,0,0,4,2,13,5,0,1,5,8,5,32,1 -6114,0,3,0,15,0,0,2,5,0,0,2,3,5,38,0 -6115,0,7,0,3,0,0,13,0,0,0,2,2,5,33,0 -6116,8,5,0,2,3,1,4,5,4,0,18,1,1,26,1 -6117,0,6,0,1,4,4,5,0,0,0,6,15,1,13,0 -6118,10,2,0,9,3,4,4,5,1,1,1,17,5,2,1 -6119,6,5,0,14,1,5,4,0,0,1,3,5,2,20,1 -6120,9,4,0,13,0,5,9,1,2,0,12,9,5,7,0 -6121,9,4,0,1,0,4,10,4,0,1,8,9,3,16,0 -6122,2,0,0,1,1,0,6,4,2,1,10,0,5,16,0 -6123,1,6,0,5,0,0,9,1,3,1,2,11,1,17,0 -6124,1,0,0,4,0,0,14,0,3,0,2,17,3,16,0 -6125,2,3,0,1,3,0,9,5,4,0,14,3,1,26,0 -6126,6,3,0,1,5,3,11,1,1,1,4,14,3,38,0 -6127,8,4,0,1,1,5,5,2,3,1,8,13,2,29,0 -6128,5,4,0,9,5,3,11,2,1,0,16,5,5,34,1 -6129,0,2,0,5,2,6,1,3,0,0,13,4,1,13,0 -6130,0,3,0,1,4,4,10,0,4,0,8,2,3,14,0 -6131,2,7,0,14,1,0,2,2,4,0,6,11,0,11,0 -6132,2,6,0,14,0,4,4,3,3,0,3,0,5,40,0 -6133,8,4,0,6,6,1,13,3,3,1,7,8,5,36,1 -6134,1,2,0,2,3,3,3,3,2,0,4,2,5,16,0 -6135,1,4,0,4,3,0,7,3,1,1,2,2,1,26,0 -6136,1,4,0,4,0,3,8,3,3,0,4,20,1,10,0 -6137,3,2,0,11,3,2,6,1,1,1,2,19,2,17,1 -6138,5,0,0,10,2,0,10,4,0,1,6,11,3,38,0 -6139,0,3,0,12,5,5,14,0,3,1,13,6,1,7,0 -6140,9,3,0,11,4,3,0,0,4,0,8,9,0,25,0 -6141,2,6,0,8,1,3,4,5,0,1,10,3,4,1,0 -6142,3,0,0,9,0,1,8,2,2,0,17,11,0,11,0 -6143,3,2,0,0,0,0,0,0,1,0,10,11,1,7,0 -6144,2,3,0,5,5,5,6,0,0,0,4,2,0,36,0 -6145,7,0,0,10,4,5,2,0,0,0,9,8,0,14,1 -6146,0,8,0,8,6,4,7,4,0,1,6,2,5,14,0 -6147,3,4,0,12,4,2,2,5,1,0,7,13,0,29,1 -6148,2,7,0,4,1,5,13,4,2,0,6,4,2,24,0 -6149,9,1,0,8,0,5,2,1,1,0,9,13,1,4,0 -6150,0,8,0,0,4,0,2,3,1,1,13,11,1,3,0 -6151,0,6,0,10,0,0,7,3,3,1,12,15,1,4,0 -6152,9,5,0,5,2,2,7,0,0,1,18,7,2,7,1 -6153,1,1,0,3,2,0,1,2,4,1,9,1,4,10,1 -6154,0,3,0,9,3,5,13,3,0,0,8,20,2,4,0 -6155,0,3,0,7,1,3,14,0,0,0,15,11,4,34,0 -6156,10,6,0,2,0,0,14,0,2,0,8,13,1,36,0 -6157,1,5,0,0,5,0,2,0,1,0,1,10,2,14,1 -6158,4,1,0,7,0,1,7,2,0,1,18,10,2,26,1 -6159,5,5,0,6,5,6,6,3,2,1,2,15,1,14,0 -6160,9,5,0,12,3,6,9,4,1,1,8,12,1,3,0 -6161,4,1,0,12,5,6,2,4,0,1,7,9,5,24,1 
-6162,9,0,0,13,6,0,14,3,3,1,8,13,0,10,0 -6163,8,3,0,14,6,1,7,5,3,0,7,17,5,32,1 -6164,9,2,0,15,1,5,9,4,3,0,9,13,1,37,0 -6165,2,2,0,4,0,0,2,3,4,1,16,2,2,3,0 -6166,1,2,0,11,1,0,9,3,0,0,11,6,0,28,0 -6167,3,4,0,4,0,4,8,2,0,1,2,13,1,36,0 -6168,4,2,0,13,0,1,8,0,4,1,6,2,3,13,0 -6169,1,6,0,15,3,0,0,0,4,1,17,9,2,24,0 -6170,8,3,0,11,2,2,2,3,2,0,4,20,4,33,1 -6171,9,8,0,14,1,1,6,5,2,0,6,18,2,32,0 -6172,4,6,0,15,2,3,9,1,3,0,2,14,1,39,0 -6173,2,8,0,13,0,5,9,3,3,0,2,2,1,18,0 -6174,9,1,0,7,0,1,1,2,3,1,16,19,2,13,1 -6175,0,8,0,14,1,3,9,3,4,0,2,8,3,40,0 -6176,2,2,0,3,0,2,11,5,3,0,17,13,4,0,0 -6177,9,0,0,7,0,1,1,1,0,1,1,10,4,26,1 -6178,5,8,0,9,4,5,4,0,1,0,18,1,3,30,1 -6179,5,6,0,6,2,2,3,4,1,0,18,8,3,28,1 -6180,2,2,0,13,2,5,9,3,0,0,2,6,4,28,0 -6181,0,8,0,5,1,6,0,4,0,1,0,4,1,27,0 -6182,6,3,0,14,2,5,10,1,2,0,8,3,0,41,0 -6183,6,2,0,9,5,5,12,5,2,1,9,11,4,41,1 -6184,8,8,0,15,3,2,2,2,3,0,18,7,5,27,1 -6185,1,4,0,5,0,0,8,2,4,1,2,6,0,41,0 -6186,10,8,0,1,6,3,13,0,1,1,18,5,4,23,1 -6187,0,2,0,1,2,6,13,4,1,1,13,12,2,20,0 -6188,2,8,0,4,0,1,9,3,3,0,8,2,5,20,0 -6189,10,5,0,0,3,0,1,4,2,1,13,15,1,1,0 -6190,3,6,0,9,6,5,8,0,3,1,17,10,0,35,0 -6191,7,2,0,8,2,3,6,5,1,0,7,5,3,13,1 -6192,8,2,0,3,5,4,13,0,0,1,9,10,0,39,1 -6193,10,7,0,15,0,3,6,5,0,1,8,11,3,25,0 -6194,1,8,0,2,3,2,4,5,1,0,8,4,0,26,1 -6195,3,6,0,5,0,5,5,0,2,0,6,20,3,26,0 -6196,9,7,0,12,0,0,9,2,0,1,17,2,0,29,0 -6197,3,3,0,15,4,6,0,0,2,0,2,3,2,13,0 -6198,8,3,0,13,2,0,2,2,3,1,8,14,3,40,0 -6199,3,1,0,5,3,0,8,3,3,0,14,10,3,11,1 -6200,0,1,0,2,0,4,6,4,3,1,17,2,0,28,0 -6201,6,7,0,1,2,1,14,3,1,1,8,2,3,2,0 -6202,9,4,0,13,3,2,5,0,2,0,3,11,0,41,0 -6203,0,5,0,6,2,4,7,4,1,0,12,6,1,6,0 -6204,5,4,0,13,2,1,3,1,1,0,7,10,2,3,1 -6205,0,5,0,2,6,4,11,3,0,0,2,11,0,33,0 -6206,10,6,0,10,1,0,4,4,4,1,13,13,5,21,0 -6207,7,2,0,8,5,3,2,3,0,1,0,18,5,19,0 -6208,1,4,0,1,1,0,8,3,3,0,2,11,0,5,0 -6209,1,6,0,7,0,4,12,4,0,1,8,9,2,26,0 -6210,4,2,0,13,0,2,2,0,1,1,5,10,1,1,1 -6211,1,7,0,11,0,4,8,4,3,0,0,11,5,37,0 -6212,9,6,0,6,1,4,13,3,0,1,8,20,5,7,0 -6213,0,8,0,11,1,6,4,2,4,1,16,1,1,20,1 
-6214,3,4,0,4,1,2,14,4,1,1,8,11,2,39,0 -6215,8,4,0,9,6,0,0,3,2,0,13,14,4,8,0 -6216,3,5,0,6,2,4,11,0,1,0,8,18,0,6,0 -6217,5,8,0,4,2,4,3,3,3,1,17,9,5,8,0 -6218,9,8,0,2,1,3,9,4,2,1,12,11,5,38,0 -6219,3,2,0,8,5,4,2,3,1,1,1,12,2,2,0 -6220,8,8,0,5,5,3,12,3,1,0,9,10,4,32,1 -6221,0,2,0,2,0,4,13,0,0,0,13,7,4,20,0 -6222,6,7,0,4,0,5,2,2,0,1,2,2,3,41,0 -6223,7,2,0,3,0,5,11,0,0,1,2,11,1,26,0 -6224,0,7,0,2,2,0,9,0,0,0,13,1,2,3,0 -6225,3,7,0,4,6,5,0,4,0,1,1,15,5,13,0 -6226,6,1,0,14,4,4,14,2,2,1,11,1,5,17,1 -6227,3,4,0,1,6,5,0,2,4,1,2,3,0,22,0 -6228,1,1,0,0,4,2,10,0,3,1,0,19,2,5,1 -6229,7,2,0,4,2,1,8,4,4,1,2,2,1,31,0 -6230,0,0,0,6,4,0,14,3,3,1,10,10,0,16,0 -6231,8,0,0,14,4,1,9,2,0,1,3,7,3,26,1 -6232,10,8,0,12,3,2,4,4,2,1,18,12,1,5,1 -6233,2,3,0,7,5,5,8,2,1,0,2,6,2,18,0 -6234,8,6,0,14,4,0,10,0,4,1,16,3,4,26,1 -6235,7,2,0,15,3,4,3,4,1,1,10,15,1,4,0 -6236,10,2,0,8,2,3,5,0,3,0,17,4,0,4,0 -6237,5,8,0,2,6,5,13,3,2,1,2,2,0,29,0 -6238,8,4,0,0,2,0,5,5,1,1,18,3,1,1,1 -6239,3,1,0,3,0,0,12,4,3,0,13,9,3,4,0 -6240,7,6,0,10,3,6,13,3,0,1,18,5,3,39,1 -6241,6,6,0,9,2,2,1,1,0,1,1,3,2,12,1 -6242,7,4,0,11,2,0,7,5,4,1,14,18,5,36,1 -6243,3,1,0,6,6,4,8,4,4,1,4,9,0,7,0 -6244,1,2,0,5,1,6,3,4,0,0,6,11,2,27,0 -6245,10,7,0,7,2,5,11,1,0,1,6,6,2,7,0 -6246,4,1,0,13,1,4,11,3,2,0,2,20,5,32,0 -6247,2,6,0,10,6,1,2,5,0,0,13,11,5,28,0 -6248,5,8,0,14,0,1,1,0,3,0,18,9,2,0,1 -6249,0,7,0,10,1,5,7,1,3,1,2,9,1,12,0 -6250,5,7,0,4,5,5,2,2,0,1,14,15,1,30,0 -6251,2,5,0,13,1,4,7,4,0,0,7,4,1,19,0 -6252,4,6,0,5,5,6,10,2,0,1,12,0,3,31,1 -6253,4,1,0,3,3,0,4,2,0,0,15,9,0,26,0 -6254,6,1,0,10,5,4,12,1,3,0,2,11,0,9,0 -6255,7,5,0,5,3,0,3,0,3,1,16,16,1,24,0 -6256,3,0,0,13,1,0,5,5,2,1,18,17,4,4,1 -6257,7,7,0,9,4,4,4,0,3,0,9,5,0,1,1 -6258,1,1,0,15,1,1,1,4,3,0,11,13,3,10,0 -6259,5,6,0,12,3,0,0,3,4,0,17,13,0,34,0 -6260,0,1,0,15,1,0,7,0,4,0,2,2,4,37,0 -6261,0,6,0,5,4,2,5,0,4,1,2,18,2,10,0 -6262,1,7,0,13,6,5,11,0,0,1,2,16,0,32,0 -6263,10,8,0,8,3,3,2,4,3,0,2,14,0,19,0 -6264,1,2,0,2,4,0,7,4,0,1,13,20,1,0,0 -6265,1,8,0,1,0,1,13,3,3,1,15,17,1,34,1 
-6266,2,4,0,3,0,1,9,2,0,0,4,15,2,27,0 -6267,10,6,0,6,3,3,12,4,4,1,4,11,1,35,0 -6268,9,8,0,7,0,2,11,0,2,1,18,4,0,36,1 -6269,4,1,0,9,1,0,10,5,1,1,2,18,2,22,0 -6270,6,1,0,1,0,4,2,1,1,0,2,6,0,29,0 -6271,2,7,0,1,0,0,4,3,0,1,8,20,3,27,0 -6272,8,2,0,9,3,6,13,3,0,1,2,0,3,19,0 -6273,3,3,0,0,3,1,10,2,1,0,10,18,0,1,1 -6274,0,2,0,12,5,0,9,2,0,1,6,11,4,17,0 -6275,10,0,0,9,4,3,1,0,0,1,11,7,0,11,1 -6276,6,2,0,10,3,4,13,1,4,1,16,5,5,34,1 -6277,1,6,0,2,5,5,14,4,3,0,3,2,1,38,0 -6278,4,2,0,0,4,1,7,0,3,1,18,3,5,12,1 -6279,7,2,0,2,0,3,3,3,0,0,1,18,2,17,0 -6280,9,8,0,9,3,5,14,1,2,0,13,13,2,4,0 -6281,6,4,0,14,5,1,4,5,2,0,3,9,4,36,1 -6282,8,7,0,5,6,4,5,4,1,1,8,15,0,35,0 -6283,2,2,0,15,3,4,6,0,4,1,10,9,3,34,0 -6284,1,6,0,12,5,3,8,3,4,0,5,6,1,14,0 -6285,8,1,0,10,4,3,8,1,1,0,14,16,5,37,1 -6286,8,4,0,3,0,5,14,1,0,1,0,6,4,25,0 -6287,7,6,0,15,1,5,0,5,0,0,0,11,5,24,0 -6288,1,4,0,4,6,2,10,2,1,0,4,15,4,0,0 -6289,4,5,0,8,4,2,7,1,0,1,10,2,3,3,0 -6290,8,6,0,10,6,0,8,2,2,0,13,20,0,17,0 -6291,8,0,0,7,0,2,9,1,1,0,5,18,5,33,0 -6292,1,0,0,14,2,5,13,5,1,1,8,0,0,35,0 -6293,0,8,0,13,1,4,12,2,4,1,2,2,3,35,0 -6294,2,3,0,15,2,4,12,0,2,1,8,11,3,17,0 -6295,0,8,0,0,3,6,14,5,2,1,0,16,3,25,0 -6296,6,8,0,1,0,6,6,3,2,1,6,11,0,28,0 -6297,7,4,0,7,0,6,6,1,3,0,8,4,0,23,0 -6298,0,7,0,13,0,4,3,3,0,0,4,6,0,32,0 -6299,3,6,0,8,0,1,13,3,0,1,15,6,1,12,0 -6300,3,6,0,3,4,3,11,2,0,1,18,8,2,34,1 -6301,10,8,0,14,1,4,5,2,3,1,13,19,3,33,0 -6302,6,6,0,10,1,6,5,1,0,0,13,16,3,1,0 -6303,7,5,0,8,1,0,9,1,3,1,3,12,3,22,0 -6304,6,7,0,14,1,6,4,1,2,1,8,11,0,7,0 -6305,5,7,0,1,4,2,2,1,3,0,10,7,4,22,1 -6306,10,3,0,0,6,0,3,3,1,1,13,7,3,17,0 -6307,9,8,0,12,3,1,2,1,3,0,17,8,0,12,0 -6308,1,6,0,2,1,5,7,0,3,0,17,11,3,37,0 -6309,8,4,0,0,6,4,5,2,1,1,2,2,4,3,0 -6310,9,4,0,14,5,5,9,1,0,1,18,8,0,37,1 -6311,10,2,0,5,0,0,5,0,0,1,2,2,1,35,0 -6312,0,1,0,1,0,4,1,3,1,1,4,13,5,8,0 -6313,0,1,0,11,6,5,8,1,2,1,6,12,1,1,0 -6314,2,5,0,0,3,4,7,3,1,1,7,11,0,9,0 -6315,6,5,0,2,0,4,4,3,0,1,2,0,0,8,0 -6316,3,5,0,5,1,3,9,0,3,0,17,11,1,26,0 -6317,5,3,0,11,0,5,11,3,1,0,2,2,2,29,0 
-6318,1,6,0,3,3,4,9,1,1,0,8,0,3,5,0 -6319,3,3,0,12,2,5,5,4,2,0,4,13,5,40,0 -6320,0,3,0,8,2,5,8,2,2,1,13,15,1,1,0 -6321,1,5,0,12,5,4,2,5,2,1,0,17,3,40,1 -6322,0,6,0,0,2,5,14,3,0,0,13,17,5,19,0 -6323,8,4,0,5,6,0,6,3,1,1,12,11,0,26,0 -6324,5,2,0,7,4,0,2,5,0,0,18,14,4,7,1 -6325,4,3,0,8,0,0,14,3,0,1,15,11,2,34,0 -6326,7,5,0,8,4,1,5,4,2,0,12,0,0,34,0 -6327,1,2,0,8,3,0,7,3,1,0,17,5,1,19,0 -6328,9,5,0,1,5,3,14,0,1,0,13,12,1,5,0 -6329,5,2,0,10,2,2,11,5,1,0,3,1,5,11,1 -6330,0,7,0,9,6,3,1,3,4,1,13,17,5,36,0 -6331,0,0,0,15,0,6,13,1,1,0,4,20,5,7,0 -6332,2,8,0,6,4,4,5,2,4,0,0,12,4,31,0 -6333,1,1,0,5,3,6,6,3,1,0,13,15,1,1,0 -6334,7,6,0,9,3,1,2,1,0,1,9,7,1,24,1 -6335,0,6,0,0,0,6,8,3,0,1,0,18,3,10,0 -6336,0,6,0,11,0,5,13,2,0,1,17,15,0,8,0 -6337,0,1,0,11,2,1,9,0,0,1,7,16,1,38,0 -6338,4,8,0,13,4,3,10,0,1,0,10,13,4,16,0 -6339,3,7,0,3,0,4,14,2,4,1,17,12,5,31,0 -6340,5,4,0,8,0,0,9,3,3,1,11,13,3,41,0 -6341,0,6,0,12,0,5,14,3,4,1,18,11,3,17,0 -6342,4,1,0,9,5,3,9,5,1,0,15,15,5,21,0 -6343,2,3,0,10,0,0,5,5,0,0,6,4,2,10,0 -6344,10,2,0,0,2,6,3,0,0,1,5,10,5,8,1 -6345,5,6,0,15,0,5,6,3,0,1,1,4,5,31,0 -6346,6,4,0,15,1,0,1,0,3,0,13,19,5,10,0 -6347,0,6,0,10,3,1,2,3,3,1,17,16,4,13,0 -6348,5,2,0,13,2,4,12,4,1,1,10,13,1,40,0 -6349,6,7,0,5,0,2,3,3,0,0,16,4,4,14,1 -6350,4,1,0,15,0,2,10,3,4,0,13,11,5,35,0 -6351,6,5,0,14,0,5,8,4,4,1,13,11,2,28,0 -6352,6,6,0,3,0,4,7,2,1,1,8,17,1,37,0 -6353,4,8,0,15,0,5,5,3,4,0,13,10,1,9,0 -6354,2,0,0,1,2,3,3,0,3,1,6,2,3,41,0 -6355,0,7,0,12,1,3,9,2,2,1,2,6,5,26,0 -6356,5,4,0,6,0,1,3,5,0,0,2,6,3,23,0 -6357,4,3,0,3,0,1,1,1,1,0,15,12,1,35,0 -6358,4,8,0,2,0,3,2,2,1,1,15,14,4,36,0 -6359,4,0,0,3,6,2,9,1,0,0,13,9,3,11,0 -6360,0,3,0,4,5,0,14,1,2,0,2,9,2,12,0 -6361,9,1,0,13,0,0,1,2,4,1,14,2,1,10,0 -6362,1,5,0,0,6,3,1,3,1,1,13,13,2,9,0 -6363,4,3,0,7,1,6,3,0,3,0,18,19,3,35,1 -6364,7,5,0,0,4,4,10,1,0,1,8,6,5,4,0 -6365,4,5,0,3,1,6,5,0,1,0,2,2,2,27,0 -6366,5,2,0,10,3,3,6,4,0,0,2,6,4,19,0 -6367,8,4,0,5,1,4,3,2,0,0,11,14,3,19,0 -6368,9,4,0,12,3,4,8,4,4,0,2,7,5,28,0 
-6369,2,7,0,2,1,5,9,4,2,0,4,11,5,22,0 -6370,0,4,0,15,5,0,7,3,0,1,15,11,5,37,0 -6371,6,3,0,1,1,1,6,1,4,0,16,13,5,25,0 -6372,2,0,0,3,2,6,9,4,0,0,8,5,0,26,0 -6373,2,8,0,6,2,3,10,4,4,0,13,18,5,4,0 -6374,4,6,0,13,2,2,8,3,4,1,14,11,5,30,0 -6375,8,6,0,13,1,1,4,0,2,0,9,0,5,18,1 -6376,4,0,0,7,6,5,5,3,3,1,4,15,4,8,0 -6377,8,0,0,15,5,4,13,1,3,0,0,15,0,22,0 -6378,5,1,0,4,0,6,0,4,3,1,17,16,0,32,0 -6379,9,3,0,13,0,6,3,2,4,1,15,9,2,1,0 -6380,1,4,0,15,0,6,0,2,0,1,0,15,4,21,0 -6381,5,1,0,15,3,2,2,4,3,1,18,19,0,28,1 -6382,7,8,0,0,6,0,8,4,3,0,17,6,3,16,0 -6383,9,6,0,5,3,5,5,2,3,1,17,4,5,28,0 -6384,9,1,0,1,5,5,6,5,2,1,18,5,5,36,1 -6385,7,2,0,12,2,5,1,3,1,1,10,17,2,30,1 -6386,2,5,0,7,2,5,11,2,1,1,0,6,2,39,0 -6387,9,4,0,3,1,2,8,2,3,1,2,11,0,38,0 -6388,0,0,0,12,3,1,11,4,0,0,6,0,4,5,0 -6389,2,5,0,4,4,0,8,3,4,1,18,3,5,26,1 -6390,10,8,0,4,6,5,1,0,4,1,9,10,4,23,1 -6391,7,7,0,10,4,3,10,4,2,1,5,7,1,0,1 -6392,0,8,0,1,0,2,14,4,2,0,6,2,3,4,0 -6393,0,2,0,15,3,4,6,0,2,0,8,2,0,17,0 -6394,8,0,0,13,6,5,12,5,2,0,1,17,2,7,1 -6395,4,5,0,11,1,2,5,0,4,0,5,6,0,40,1 -6396,0,8,0,7,2,3,5,4,1,0,17,11,2,17,0 -6397,2,8,0,15,0,1,12,1,2,1,17,11,5,13,0 -6398,0,2,0,7,6,0,3,4,4,1,2,0,3,18,0 -6399,0,2,0,8,0,6,6,1,4,1,11,11,4,39,0 -6400,3,7,0,9,5,3,5,0,1,0,17,12,3,14,0 -6401,6,2,0,14,4,0,4,2,2,0,5,5,1,11,1 -6402,9,6,0,8,5,3,10,1,3,0,4,4,0,2,0 -6403,3,6,0,15,2,0,13,3,4,0,13,11,1,35,0 -6404,0,3,0,11,2,4,9,4,1,1,2,0,1,30,0 -6405,6,0,0,8,0,5,11,3,0,1,13,11,5,21,0 -6406,2,6,0,8,1,2,2,1,1,0,14,2,2,13,0 -6407,7,5,0,9,2,6,10,0,2,1,1,17,4,18,1 -6408,3,7,0,1,0,5,3,1,2,1,10,13,2,5,0 -6409,0,3,0,8,0,5,11,1,1,1,8,2,5,16,0 -6410,0,6,0,1,3,4,0,3,4,0,10,20,5,6,0 -6411,10,1,0,6,2,6,2,1,3,1,13,10,4,26,0 -6412,8,6,0,6,0,5,1,1,1,1,18,16,4,38,1 -6413,5,6,0,4,6,1,11,2,3,0,17,11,2,14,0 -6414,2,1,0,6,0,6,14,1,0,1,4,6,1,0,0 -6415,5,7,0,0,4,4,13,4,1,1,18,10,4,1,1 -6416,3,1,0,8,1,0,13,3,1,0,10,2,4,4,0 -6417,1,6,0,3,6,5,13,1,0,0,6,20,0,27,0 -6418,2,1,0,12,5,0,4,3,4,0,8,4,0,2,0 -6419,1,3,0,15,0,0,1,3,0,1,6,11,0,19,0 
-6420,8,1,0,12,3,3,3,0,3,1,18,19,1,22,1 -6421,1,3,0,12,2,5,14,4,3,0,2,6,1,39,0 -6422,10,3,0,15,4,5,13,3,1,1,13,18,3,33,0 -6423,7,8,0,2,3,6,8,1,0,0,13,0,5,25,0 -6424,10,3,0,11,0,1,10,1,1,0,4,11,3,31,0 -6425,0,3,0,5,2,4,1,5,2,0,6,11,5,19,0 -6426,0,3,0,1,0,4,9,0,0,0,10,15,1,13,0 -6427,10,3,0,1,3,1,0,3,0,1,14,19,5,14,1 -6428,1,8,0,8,5,4,5,1,4,1,8,15,2,23,0 -6429,9,5,0,8,2,2,13,5,1,0,13,15,5,40,0 -6430,9,7,0,15,1,3,3,3,1,1,8,5,5,24,0 -6431,6,1,0,8,0,3,5,4,3,0,17,6,1,2,0 -6432,7,2,0,1,6,4,5,1,0,0,2,20,3,9,0 -6433,8,4,0,1,6,4,0,4,0,1,13,18,3,14,0 -6434,2,3,0,1,0,5,6,3,3,0,5,15,1,36,0 -6435,1,8,0,9,3,6,8,4,0,0,17,2,1,0,0 -6436,1,0,0,6,3,5,12,5,2,0,8,6,5,29,0 -6437,5,1,0,5,4,3,9,3,0,1,17,13,2,5,0 -6438,8,4,0,9,3,2,0,5,2,0,11,6,3,26,1 -6439,10,2,0,3,4,5,4,4,1,0,2,11,4,18,0 -6440,2,8,0,2,4,6,11,2,4,0,12,2,0,31,0 -6441,6,7,0,8,1,1,4,5,3,0,11,16,4,24,1 -6442,5,0,0,15,5,4,0,5,3,0,6,4,2,7,0 -6443,0,0,0,15,2,5,2,0,0,0,13,16,1,7,0 -6444,0,0,0,11,5,5,8,4,0,1,2,20,5,12,0 -6445,1,6,0,3,3,2,9,4,2,1,6,11,0,3,0 -6446,0,1,0,4,1,5,11,2,0,0,8,5,0,38,0 -6447,6,6,0,2,0,0,12,4,2,0,17,11,3,38,0 -6448,0,8,0,13,0,3,3,1,1,0,13,11,0,8,0 -6449,7,2,0,7,4,2,10,1,2,0,18,1,1,26,1 -6450,8,5,0,9,0,4,9,4,4,1,17,2,3,20,0 -6451,0,2,0,3,0,4,13,4,1,0,12,2,5,19,0 -6452,9,1,0,6,5,0,0,3,3,1,6,19,4,23,1 -6453,3,4,0,5,3,0,2,5,4,0,2,2,0,10,0 -6454,10,8,0,13,0,5,0,2,4,1,13,6,0,3,0 -6455,5,0,0,14,1,5,3,0,1,0,2,9,2,28,0 -6456,5,8,0,1,4,6,11,5,4,0,14,16,2,36,1 -6457,0,6,0,2,5,6,9,2,3,0,2,9,4,6,0 -6458,9,3,0,1,5,6,3,5,3,1,4,8,4,31,1 -6459,1,0,0,8,4,5,12,0,2,1,18,7,4,3,1 -6460,5,5,0,3,2,0,14,5,3,1,0,8,5,24,1 -6461,9,8,0,3,6,4,5,2,2,0,8,11,5,29,0 -6462,7,8,0,8,0,1,8,1,0,0,17,9,4,22,0 -6463,6,2,0,9,0,2,11,0,0,1,14,14,5,41,1 -6464,2,1,0,3,1,4,0,2,0,1,13,19,3,31,0 -6465,0,1,0,11,5,6,12,3,4,0,0,13,4,24,0 -6466,9,5,0,3,2,2,4,1,1,0,2,2,1,24,0 -6467,1,2,0,6,6,4,6,1,1,1,4,11,0,11,0 -6468,9,2,0,12,0,0,2,4,3,1,12,15,5,30,0 -6469,6,8,0,11,0,3,8,3,3,1,2,16,2,27,0 -6470,5,6,0,8,2,4,13,0,1,1,18,4,1,8,1 
-6471,8,2,0,5,0,6,5,5,2,1,3,11,4,7,0 -6472,2,4,0,10,4,6,7,5,0,1,6,9,4,4,0 -6473,0,6,0,1,4,1,10,0,3,1,4,2,1,33,0 -6474,9,7,0,10,3,6,3,5,4,0,14,7,0,16,1 -6475,10,4,0,4,2,4,7,3,3,0,2,6,3,13,0 -6476,2,2,0,1,6,6,12,3,2,1,1,17,5,33,1 -6477,4,3,0,15,3,0,11,5,3,1,14,7,0,28,1 -6478,3,3,0,3,2,4,11,3,3,0,18,5,3,20,1 -6479,4,8,0,10,4,6,7,1,0,1,14,7,3,28,1 -6480,5,6,0,11,3,5,11,0,2,0,14,20,3,38,0 -6481,1,8,0,13,2,0,10,0,3,0,14,0,2,34,0 -6482,10,8,0,13,6,0,10,1,1,0,1,13,5,25,0 -6483,3,6,0,5,0,0,5,2,4,0,8,20,2,16,0 -6484,2,1,0,0,1,6,9,4,2,0,8,4,4,33,0 -6485,0,4,0,2,1,5,14,4,0,1,12,2,2,8,0 -6486,9,6,0,1,3,6,10,1,1,1,18,16,2,39,1 -6487,9,3,0,6,4,6,4,4,3,0,18,6,4,39,1 -6488,0,2,0,4,6,1,8,2,4,0,13,4,3,38,0 -6489,3,3,0,0,0,2,9,3,2,0,8,3,5,2,0 -6490,6,6,0,15,4,0,6,3,4,1,3,2,3,26,0 -6491,6,1,0,7,6,5,14,3,2,1,0,18,0,29,0 -6492,3,6,0,0,2,4,9,2,3,0,2,2,5,17,0 -6493,7,2,0,12,0,3,6,3,1,0,13,4,0,17,0 -6494,9,6,0,10,1,4,0,4,2,0,8,9,3,5,0 -6495,0,6,0,11,3,0,7,5,2,0,13,15,3,35,0 -6496,8,8,0,12,3,4,9,4,2,0,6,10,5,9,0 -6497,0,3,0,9,6,3,8,4,3,1,1,14,5,16,1 -6498,3,4,0,3,3,0,9,4,2,0,13,3,4,23,0 -6499,2,6,0,15,2,3,4,2,4,0,13,13,5,4,0 -6500,5,2,0,5,5,5,8,5,3,0,2,14,5,29,0 -6501,3,3,0,6,4,5,1,1,3,0,8,2,2,19,0 -6502,4,0,0,1,2,6,6,3,0,1,11,8,5,25,1 -6503,2,6,0,15,0,4,6,0,2,0,12,11,0,3,0 -6504,10,0,0,3,3,4,5,1,1,0,15,6,4,41,0 -6505,8,0,0,6,2,6,9,1,3,0,2,6,4,33,0 -6506,7,2,0,0,2,5,5,3,4,1,2,10,5,12,0 -6507,7,4,0,13,0,3,9,1,4,1,10,18,2,5,0 -6508,3,6,0,5,6,6,14,3,1,0,14,15,5,13,0 -6509,3,4,0,8,1,1,11,3,2,0,5,10,5,1,1 -6510,6,0,0,3,3,4,2,1,2,1,2,9,0,34,0 -6511,8,3,0,15,0,2,13,5,3,1,3,14,4,9,1 -6512,1,0,0,12,6,4,3,0,4,1,17,20,0,28,0 -6513,7,3,0,4,3,0,0,2,4,1,9,13,3,23,1 -6514,9,4,0,13,5,4,13,3,0,0,4,3,3,14,0 -6515,3,5,0,10,4,2,5,0,2,1,5,12,0,35,1 -6516,0,0,0,13,0,3,14,4,3,1,2,11,2,38,0 -6517,2,8,0,14,0,2,12,2,3,1,2,6,1,18,0 -6518,0,3,0,7,5,6,6,5,3,0,13,0,0,32,0 -6519,1,0,0,3,1,2,13,4,1,1,13,0,3,37,0 -6520,8,1,0,8,5,6,7,3,2,1,1,5,1,30,1 -6521,3,0,0,0,0,5,8,0,2,1,0,2,5,13,0 -6522,1,2,0,5,6,6,10,1,4,0,17,0,0,9,0 
-6523,8,3,0,9,1,0,4,5,4,0,16,3,2,1,1 -6524,3,5,0,1,2,1,13,4,4,0,5,8,1,21,1 -6525,5,2,0,2,2,1,10,0,2,1,5,19,5,27,1 -6526,0,8,0,11,1,0,4,2,2,1,2,6,1,10,0 -6527,0,2,0,12,5,4,6,4,1,1,2,6,5,23,0 -6528,10,7,0,8,6,0,2,1,4,0,2,20,4,32,0 -6529,0,2,0,7,1,2,1,4,0,0,8,11,0,36,0 -6530,2,8,0,7,0,1,13,0,1,0,17,11,2,20,0 -6531,0,6,0,0,0,5,9,2,3,1,15,15,1,2,0 -6532,8,4,0,13,5,0,14,1,1,0,10,15,2,29,0 -6533,3,4,0,12,6,0,5,0,2,0,0,13,4,24,0 -6534,1,6,0,5,3,2,8,2,4,1,2,0,4,10,0 -6535,9,8,0,0,4,1,2,0,2,0,1,7,0,27,1 -6536,4,8,0,11,3,6,5,4,3,0,10,6,3,32,0 -6537,5,2,0,12,1,4,12,3,0,0,6,2,1,19,0 -6538,2,1,0,9,2,1,4,1,2,0,11,2,5,11,0 -6539,2,0,0,0,4,1,0,5,3,0,18,10,4,17,1 -6540,1,2,0,1,1,2,8,5,2,0,4,13,0,24,0 -6541,5,8,0,0,0,0,10,4,3,1,13,20,0,38,0 -6542,1,4,0,14,2,5,7,3,0,1,8,20,3,2,0 -6543,3,0,0,12,1,0,2,3,3,0,4,19,0,1,0 -6544,6,1,0,1,2,0,12,0,4,0,18,14,1,37,1 -6545,4,6,0,11,5,2,12,5,2,1,5,18,0,1,1 -6546,4,5,0,9,5,1,2,3,2,0,15,11,1,36,0 -6547,6,8,0,1,1,1,4,3,0,1,9,10,4,36,1 -6548,0,2,0,9,6,3,8,3,1,0,8,11,1,14,0 -6549,7,4,0,15,4,6,12,3,2,1,7,5,4,11,1 -6550,1,5,0,14,1,3,12,2,1,0,4,11,1,36,0 -6551,10,2,0,7,2,1,11,0,4,0,18,5,3,29,1 -6552,4,7,0,4,3,6,2,4,1,0,2,9,3,1,0 -6553,1,3,0,5,4,5,14,1,2,0,13,6,0,11,0 -6554,5,3,0,7,0,3,5,2,1,0,17,20,1,0,0 -6555,2,3,0,2,0,5,9,4,2,1,5,13,0,33,0 -6556,7,6,0,13,4,2,4,5,4,1,7,19,0,14,1 -6557,9,2,0,3,0,3,10,3,3,1,2,14,5,39,0 -6558,1,2,0,6,0,0,5,3,3,1,16,11,5,3,0 -6559,7,6,0,2,6,6,5,4,4,0,13,0,4,18,0 -6560,4,6,0,7,0,5,3,2,1,0,2,9,5,9,0 -6561,6,5,0,7,1,3,4,1,4,1,18,16,1,33,1 -6562,5,2,0,12,1,4,14,3,4,1,16,15,1,25,0 -6563,8,4,0,10,0,5,11,1,3,1,2,4,2,34,0 -6564,6,6,0,8,0,0,3,5,4,1,13,15,1,1,0 -6565,10,0,0,2,6,4,0,5,0,1,10,5,5,30,1 -6566,6,1,0,7,6,5,12,0,2,1,9,19,3,0,1 -6567,3,0,0,8,2,2,1,5,4,1,18,1,3,13,1 -6568,9,6,0,5,4,3,0,3,3,0,13,6,2,40,0 -6569,2,0,0,0,6,6,7,4,3,1,0,0,0,8,0 -6570,6,5,0,1,6,4,8,4,1,0,6,2,0,2,0 -6571,0,3,0,4,0,1,10,3,1,1,10,6,1,19,0 -6572,0,8,0,2,6,4,2,3,3,0,10,11,2,10,0 -6573,4,7,0,11,1,6,8,0,1,0,15,2,2,29,0 -6574,1,2,0,9,2,6,5,0,2,1,18,10,2,16,1 
-6575,6,2,0,11,0,0,13,5,2,0,7,13,2,31,0 -6576,2,4,0,6,2,3,10,5,1,0,15,11,5,10,0 -6577,2,4,0,0,0,6,10,2,2,0,9,15,0,14,0 -6578,4,0,0,5,6,1,9,1,0,0,14,10,4,26,1 -6579,8,4,0,7,6,0,11,0,0,0,3,1,3,23,1 -6580,8,5,0,3,6,0,7,3,4,1,17,11,3,4,0 -6581,4,8,0,1,0,3,9,3,3,0,4,11,5,11,0 -6582,9,0,0,15,2,5,2,0,3,0,12,12,5,13,0 -6583,9,7,0,5,2,6,8,3,1,0,17,9,0,2,0 -6584,4,8,0,5,2,0,3,0,0,1,10,1,0,38,1 -6585,10,8,0,3,5,1,6,0,1,1,17,11,4,6,0 -6586,2,1,0,8,6,2,4,3,4,1,7,7,4,7,1 -6587,1,5,0,12,0,4,12,1,2,0,13,13,2,23,0 -6588,6,3,0,13,0,6,5,3,3,0,5,11,2,33,0 -6589,1,6,0,11,3,6,6,1,0,1,2,15,0,4,0 -6590,4,3,0,7,3,2,6,3,0,1,18,19,0,21,1 -6591,2,4,0,8,0,4,2,3,0,1,10,2,2,19,0 -6592,9,1,0,9,1,5,1,1,1,0,11,8,5,9,1 -6593,10,5,0,13,5,6,5,4,2,1,16,13,0,7,0 -6594,8,8,0,9,4,6,13,0,4,1,18,18,2,31,1 -6595,2,2,0,7,6,2,6,5,1,0,2,2,5,19,0 -6596,3,2,0,11,0,3,5,0,0,1,13,9,2,3,0 -6597,1,4,0,13,0,4,5,4,0,0,3,20,0,9,0 -6598,1,6,0,1,1,1,11,0,4,1,2,6,3,4,0 -6599,1,7,0,7,6,6,6,0,0,1,2,2,4,21,0 -6600,0,6,0,11,1,0,10,0,2,1,15,6,1,20,0 -6601,10,6,0,13,2,5,7,0,3,1,12,2,2,33,0 -6602,1,4,0,2,2,4,12,2,2,1,13,2,5,0,0 -6603,1,0,0,1,5,5,6,2,3,1,8,2,5,13,0 -6604,2,8,0,11,2,6,7,3,2,1,0,13,1,25,0 -6605,10,8,0,1,6,0,2,0,1,1,2,2,2,39,0 -6606,2,4,0,3,5,3,13,2,0,0,8,2,0,35,0 -6607,3,8,0,11,2,6,3,3,1,1,13,12,1,28,0 -6608,9,1,0,2,2,2,12,0,0,1,18,8,3,28,1 -6609,0,4,0,4,0,4,8,1,3,1,2,15,3,17,0 -6610,0,8,0,6,6,0,6,1,3,1,12,11,1,16,0 -6611,6,4,0,14,4,6,4,0,1,1,3,14,0,20,1 -6612,2,2,0,5,6,0,7,1,2,1,0,18,0,27,0 -6613,0,8,0,15,5,0,3,1,0,0,14,19,5,5,0 -6614,0,1,0,3,0,4,13,3,2,0,7,2,5,14,0 -6615,0,3,0,13,2,3,7,2,3,0,13,13,2,40,0 -6616,0,2,0,12,1,0,13,0,0,1,13,11,3,4,0 -6617,3,5,0,11,0,4,9,2,0,1,10,11,0,31,0 -6618,0,2,0,13,2,4,5,3,1,1,7,6,1,0,0 -6619,2,6,0,3,1,1,1,4,1,0,8,12,4,21,0 -6620,8,8,0,0,1,0,11,4,3,1,5,17,2,31,1 -6621,10,8,0,13,0,2,14,4,3,1,5,2,2,1,0 -6622,5,6,0,0,4,6,1,4,2,0,11,12,0,1,1 -6623,0,0,0,11,0,5,14,1,3,1,2,14,1,12,0 -6624,2,4,0,13,5,1,11,5,2,0,10,6,0,31,0 -6625,1,8,0,10,3,0,14,4,4,0,8,0,1,30,0 
-6626,6,6,0,12,3,0,5,5,0,0,17,11,2,28,0 -6627,8,2,0,7,2,6,8,1,1,1,2,11,5,4,0 -6628,2,6,0,6,1,5,8,3,0,0,13,11,0,11,0 -6629,0,0,0,4,2,2,14,4,1,0,18,1,1,0,1 -6630,4,0,0,1,5,0,11,5,3,0,3,7,2,16,1 -6631,2,4,0,1,5,5,3,3,2,1,8,15,0,24,0 -6632,5,0,0,4,2,2,10,0,4,1,16,10,3,24,1 -6633,1,3,0,12,3,2,8,1,3,1,9,7,3,39,1 -6634,2,4,0,7,0,3,0,3,4,1,13,0,4,25,0 -6635,8,7,0,8,1,6,7,3,0,1,4,11,3,11,0 -6636,7,7,0,5,5,2,14,1,0,0,15,5,2,21,1 -6637,0,3,0,1,2,4,2,3,1,0,2,11,0,37,0 -6638,0,6,0,8,0,4,8,2,2,0,17,6,0,26,0 -6639,7,1,0,1,5,3,9,4,0,1,13,0,5,40,0 -6640,7,2,0,9,2,5,5,5,2,0,14,7,3,23,1 -6641,1,8,0,15,1,5,10,1,3,0,2,1,4,34,0 -6642,1,1,0,2,3,6,8,3,3,1,17,2,1,3,0 -6643,9,5,0,5,0,1,7,3,0,0,8,16,0,14,0 -6644,9,1,0,4,3,6,4,1,3,1,9,18,4,39,1 -6645,5,0,0,10,0,2,7,0,0,1,2,6,0,35,0 -6646,9,7,0,10,6,6,8,2,3,1,10,11,0,16,0 -6647,5,6,0,1,6,4,7,5,3,1,17,13,0,26,0 -6648,9,2,0,7,6,4,2,3,1,0,12,2,4,27,0 -6649,1,8,0,1,5,0,4,4,2,1,13,11,0,32,0 -6650,2,2,0,6,0,5,5,0,4,1,0,8,5,5,0 -6651,1,8,0,14,4,5,8,4,4,1,10,13,1,13,0 -6652,0,1,0,8,0,6,2,4,2,0,13,2,3,25,0 -6653,6,5,0,12,2,6,13,1,0,1,13,12,2,27,0 -6654,10,1,0,1,4,6,11,2,2,1,0,2,2,37,0 -6655,0,0,0,0,1,0,9,1,1,0,2,15,4,36,0 -6656,9,3,0,7,5,2,13,4,0,1,2,2,2,5,0 -6657,9,5,0,15,3,0,3,4,0,0,7,9,1,1,0 -6658,0,1,0,14,1,0,14,0,3,1,12,15,0,33,0 -6659,8,1,0,0,1,5,8,4,0,0,2,15,5,29,0 -6660,9,6,0,14,2,1,2,5,1,0,10,10,4,39,1 -6661,9,4,0,8,5,5,14,2,0,0,15,11,5,26,0 -6662,5,0,0,7,5,1,4,4,2,0,13,20,4,37,0 -6663,10,3,0,1,6,3,0,2,1,0,2,11,0,40,0 -6664,6,1,0,1,0,4,8,2,2,0,13,1,3,10,0 -6665,2,1,0,7,3,3,13,0,2,1,15,7,1,35,1 -6666,7,0,0,5,4,3,13,4,1,0,9,20,4,22,1 -6667,0,8,0,14,0,0,0,3,1,0,8,0,2,21,0 -6668,7,1,0,2,4,2,11,5,2,1,5,19,2,30,1 -6669,8,4,0,6,3,1,7,2,0,1,2,11,0,21,0 -6670,8,1,0,5,1,4,12,2,3,1,4,6,3,37,0 -6671,7,5,0,0,5,5,8,4,0,0,15,8,3,16,1 -6672,2,0,0,4,2,2,0,4,0,1,8,11,5,0,0 -6673,2,5,0,4,0,1,1,1,1,0,13,11,0,35,0 -6674,1,8,0,6,6,0,7,2,0,0,2,11,0,4,0 -6675,0,8,0,8,2,6,8,3,0,0,17,13,2,29,0 -6676,1,6,0,0,1,0,2,3,3,1,13,17,1,23,0 -6677,2,1,0,0,5,3,9,4,2,1,13,11,3,3,0 
-6678,1,4,0,0,5,2,0,3,0,1,13,3,0,40,0 -6679,1,7,0,13,0,6,11,4,4,0,7,4,1,23,0 -6680,5,1,0,11,0,1,10,3,0,1,5,6,5,3,0 -6681,8,1,0,0,2,4,8,4,3,0,9,10,5,5,1 -6682,2,3,0,3,4,4,2,1,2,0,15,12,4,3,0 -6683,5,1,0,9,5,6,1,5,0,0,0,1,0,0,1 -6684,2,2,0,15,0,6,9,0,0,1,7,0,4,25,0 -6685,2,8,0,12,1,4,14,2,0,1,14,11,1,21,0 -6686,0,1,0,5,0,5,9,2,3,0,17,12,0,10,0 -6687,8,2,0,2,3,2,3,5,4,1,7,1,3,19,1 -6688,5,3,0,13,1,2,11,1,0,0,14,14,0,28,1 -6689,0,1,0,1,6,3,8,5,4,0,13,11,1,10,0 -6690,0,4,0,9,0,5,11,3,0,1,13,6,2,32,0 -6691,0,1,0,9,4,4,14,1,0,1,13,2,1,5,0 -6692,0,0,0,5,3,3,0,3,2,0,10,15,2,7,0 -6693,1,1,0,0,0,6,3,3,1,1,10,9,4,39,0 -6694,7,2,0,3,3,1,6,3,0,1,13,20,4,12,0 -6695,2,8,0,3,6,0,4,3,1,1,13,0,0,17,0 -6696,0,0,0,7,1,1,1,5,3,0,15,0,3,13,1 -6697,9,6,0,5,6,4,0,1,0,1,13,3,0,18,0 -6698,3,3,0,15,1,4,11,0,4,0,12,1,1,25,1 -6699,6,1,0,15,4,3,7,1,2,1,1,16,3,0,1 -6700,3,8,0,15,5,5,7,4,3,1,13,6,5,18,0 -6701,0,1,0,7,6,5,9,1,2,0,4,6,1,17,0 -6702,0,5,0,1,0,0,8,4,0,1,4,11,5,23,0 -6703,4,1,0,6,2,6,14,1,3,1,17,12,3,25,0 -6704,2,0,0,3,0,4,6,3,4,1,2,2,2,10,0 -6705,0,1,0,4,1,4,3,1,2,0,7,11,1,9,0 -6706,0,2,0,10,0,3,2,4,3,1,8,11,5,9,0 -6707,6,0,0,6,1,0,3,3,0,0,10,14,3,4,0 -6708,4,5,0,14,1,3,9,5,2,0,4,10,3,12,1 -6709,5,5,0,8,4,5,2,0,2,0,9,4,5,14,1 -6710,0,5,0,0,4,3,11,1,3,0,8,2,3,1,0 -6711,0,6,0,12,6,4,7,0,3,1,14,8,3,29,0 -6712,6,0,0,11,3,6,8,2,3,1,13,11,4,22,0 -6713,0,3,0,3,1,1,2,0,3,0,6,3,0,16,0 -6714,1,5,0,10,4,4,2,5,2,1,13,20,0,5,0 -6715,6,1,0,1,4,1,9,1,1,1,3,19,5,10,1 -6716,8,3,0,4,4,0,14,5,4,0,5,14,3,9,1 -6717,0,2,0,3,3,5,2,0,1,0,13,2,2,41,0 -6718,0,7,0,10,5,1,5,5,2,1,2,2,2,1,0 -6719,5,3,0,3,2,1,4,2,1,0,16,19,4,22,0 -6720,5,4,0,0,2,1,11,2,1,1,4,14,1,41,0 -6721,7,2,0,9,0,4,3,3,4,0,2,4,3,6,0 -6722,3,1,0,1,1,2,3,3,3,0,2,2,0,21,0 -6723,10,7,0,13,1,3,5,5,2,0,13,13,2,4,0 -6724,1,2,0,1,6,3,2,1,0,0,8,10,0,10,0 -6725,2,3,0,15,6,5,5,2,1,0,6,3,3,1,0 -6726,0,7,0,12,2,4,1,2,4,1,6,17,0,6,0 -6727,7,0,0,5,2,0,7,4,1,1,18,17,3,9,1 -6728,1,1,0,1,1,4,6,1,1,0,12,15,2,29,0 -6729,9,3,0,12,0,6,3,0,3,0,9,13,4,31,1 
-6730,9,3,0,5,1,5,8,0,0,1,6,14,0,28,0 -6731,10,8,0,3,6,5,11,3,1,0,0,18,5,10,0 -6732,2,1,0,2,2,5,8,1,1,0,8,4,2,11,0 -6733,7,1,0,11,0,4,12,3,3,0,2,15,0,21,0 -6734,0,6,0,0,1,5,9,3,4,0,2,15,3,24,0 -6735,8,1,0,9,3,6,12,5,1,0,18,10,1,16,1 -6736,2,2,0,15,1,2,8,5,4,1,6,2,5,28,0 -6737,5,6,0,13,5,0,6,1,1,1,6,2,0,11,0 -6738,3,2,0,5,1,5,8,3,1,0,4,6,5,21,0 -6739,3,2,0,1,1,5,10,1,1,0,12,14,0,16,0 -6740,7,2,0,3,1,2,5,1,4,0,9,17,1,12,0 -6741,0,4,0,11,5,3,0,0,2,1,15,20,1,31,0 -6742,9,7,0,11,6,4,7,3,2,0,2,2,0,26,0 -6743,10,6,0,15,2,0,7,1,0,1,2,19,1,31,0 -6744,2,1,0,11,3,3,3,1,1,0,13,0,1,34,0 -6745,10,3,0,6,3,0,9,2,3,0,8,2,5,11,0 -6746,3,4,0,11,3,6,13,0,4,0,14,13,0,40,0 -6747,7,2,0,9,5,4,8,4,2,0,10,11,4,27,0 -6748,0,8,0,14,0,0,4,2,0,1,17,2,3,33,0 -6749,0,6,0,9,1,6,9,1,0,1,12,2,1,27,0 -6750,8,7,0,9,6,1,5,3,2,0,8,12,0,26,0 -6751,2,8,0,13,2,2,9,4,0,1,10,11,3,16,0 -6752,10,6,0,5,2,3,10,2,3,1,15,6,1,18,1 -6753,2,8,0,3,5,4,5,2,2,0,10,14,5,1,0 -6754,0,7,0,6,1,3,5,1,2,1,6,3,2,28,0 -6755,10,0,0,15,4,1,5,5,2,1,5,5,4,13,1 -6756,2,2,0,9,5,0,7,3,0,1,0,2,2,18,0 -6757,8,4,0,13,4,0,3,0,1,1,11,16,1,38,1 -6758,1,6,0,14,6,6,9,3,0,0,10,2,0,34,0 -6759,8,6,0,10,2,1,13,1,3,1,12,8,4,32,1 -6760,10,8,0,11,6,4,12,2,0,0,0,16,0,29,0 -6761,3,8,0,13,4,1,9,3,1,0,18,1,5,23,1 -6762,0,8,0,12,1,4,5,0,3,0,13,17,5,30,0 -6763,3,8,0,10,0,0,14,0,3,0,17,3,5,24,0 -6764,1,3,0,6,6,0,8,3,0,1,8,9,5,0,0 -6765,10,4,0,1,4,4,10,2,3,1,9,1,0,3,1 -6766,6,3,0,5,2,5,14,4,3,1,10,9,2,10,1 -6767,1,4,0,11,0,0,2,0,4,1,12,0,3,41,0 -6768,10,8,0,15,1,6,7,4,2,1,12,4,0,41,0 -6769,9,3,0,0,3,5,8,5,2,0,10,8,4,33,0 -6770,3,6,0,3,2,0,12,5,4,0,8,16,3,19,0 -6771,7,6,0,10,0,3,5,0,3,1,15,2,1,12,0 -6772,5,3,0,11,1,6,13,3,2,0,12,13,5,26,0 -6773,6,4,0,8,6,1,8,4,3,0,10,10,0,24,0 -6774,2,0,0,5,0,5,8,4,1,0,8,18,0,10,0 -6775,0,8,0,10,4,3,4,4,4,0,9,17,3,31,1 -6776,8,1,0,6,2,1,11,4,0,0,16,13,2,38,1 -6777,2,2,0,4,3,4,12,2,0,0,8,16,0,19,0 -6778,2,8,0,11,5,1,4,4,0,1,15,20,1,21,0 -6779,2,6,0,6,0,0,0,1,3,0,13,18,4,37,0 -6780,7,2,0,14,2,5,5,2,1,1,2,18,4,4,0 
-6781,10,6,0,4,6,3,11,1,3,1,2,19,1,5,0 -6782,6,8,0,2,3,2,10,1,2,1,5,8,3,29,1 -6783,2,1,0,13,0,2,0,4,3,1,12,18,0,11,0 -6784,3,6,0,15,6,4,2,3,4,0,0,11,0,6,0 -6785,10,5,0,10,5,3,6,5,2,1,8,11,4,17,0 -6786,9,7,0,13,0,5,0,4,2,0,0,2,0,33,0 -6787,1,7,0,15,1,5,1,2,0,1,13,8,0,23,0 -6788,6,6,0,13,1,3,5,4,2,0,5,12,4,22,1 -6789,1,0,0,14,2,2,8,0,0,1,12,2,5,35,0 -6790,10,3,0,12,5,4,4,4,3,1,1,12,5,7,1 -6791,9,4,0,1,4,1,13,0,2,0,15,6,0,34,0 -6792,5,0,0,0,5,0,2,3,2,1,2,11,5,14,0 -6793,10,0,0,2,6,4,6,1,0,0,2,0,0,18,0 -6794,2,7,0,2,6,5,9,0,2,1,3,7,4,19,1 -6795,1,4,0,4,1,2,6,0,2,0,15,2,4,13,0 -6796,9,4,0,13,3,2,4,2,3,0,18,17,2,18,1 -6797,8,6,0,5,1,0,4,4,3,0,0,11,0,19,0 -6798,6,2,0,5,4,4,5,4,4,0,14,10,1,36,1 -6799,1,6,0,8,5,5,3,0,0,1,13,9,0,29,0 -6800,0,1,0,5,3,0,5,2,0,0,4,11,5,32,0 -6801,4,8,0,9,1,5,9,0,3,0,2,0,0,27,0 -6802,10,2,0,13,6,4,3,1,3,0,16,4,3,7,1 -6803,4,1,0,2,3,0,1,1,1,1,2,18,3,9,0 -6804,2,4,0,8,1,4,5,2,2,1,12,0,4,16,0 -6805,5,8,0,5,1,6,0,0,2,1,2,0,3,32,0 -6806,9,7,0,3,6,3,8,3,0,0,2,2,4,22,0 -6807,8,4,0,5,1,0,11,2,3,1,13,11,0,37,0 -6808,2,0,0,15,1,0,5,3,0,1,4,11,5,25,0 -6809,3,2,0,5,3,4,13,3,4,0,13,2,0,29,0 -6810,1,1,0,14,2,1,3,1,2,1,11,17,3,31,1 -6811,6,0,0,0,4,0,3,1,1,1,7,17,4,27,1 -6812,10,0,0,3,1,4,4,4,2,0,2,11,1,18,0 -6813,5,8,0,3,5,3,1,3,3,0,8,0,3,25,0 -6814,5,8,0,5,0,5,5,4,1,0,13,13,2,22,0 -6815,9,2,0,9,4,0,12,2,1,1,9,19,1,18,1 -6816,1,4,0,13,4,1,7,3,1,0,2,15,5,1,0 -6817,2,3,0,5,0,3,7,3,2,1,3,2,0,31,0 -6818,8,3,0,11,2,3,14,1,2,0,4,9,1,6,0 -6819,9,0,0,5,1,4,7,3,2,1,13,11,3,41,0 -6820,4,2,0,15,1,5,11,4,3,1,0,11,0,39,0 -6821,0,2,0,1,0,0,5,4,4,0,13,11,2,37,0 -6822,2,5,0,7,6,4,12,4,4,1,13,18,5,7,0 -6823,0,7,0,5,0,3,3,3,0,1,2,20,4,36,0 -6824,0,3,0,1,6,3,11,2,2,0,1,18,2,16,0 -6825,9,8,0,0,3,4,0,5,2,0,12,6,4,39,0 -6826,2,6,0,0,2,4,3,5,0,0,17,6,0,14,0 -6827,9,6,0,9,1,6,13,3,1,0,18,7,3,2,1 -6828,8,1,0,0,1,1,4,5,4,0,3,10,3,23,1 -6829,2,1,0,6,2,1,3,0,2,1,11,1,5,4,1 -6830,2,7,0,12,5,5,7,4,0,1,12,6,1,9,0 -6831,1,3,0,3,3,2,8,4,2,1,8,18,4,30,0 -6832,7,7,0,15,5,1,12,0,1,0,14,5,2,19,1 
-6833,3,6,0,12,2,0,3,2,1,0,2,19,4,28,0 -6834,8,6,0,8,0,3,1,5,1,1,17,18,3,26,0 -6835,1,6,0,5,5,3,7,3,3,0,4,1,5,3,0 -6836,4,0,0,10,0,5,9,4,1,1,18,6,2,28,0 -6837,8,6,0,6,0,3,8,3,3,0,16,2,1,5,0 -6838,3,0,0,15,6,3,6,3,0,0,17,2,5,32,0 -6839,7,0,0,2,4,2,13,0,2,1,14,5,3,36,1 -6840,1,1,0,10,2,5,7,0,4,1,4,16,1,5,0 -6841,3,4,0,7,6,4,4,2,1,1,8,3,2,35,0 -6842,9,1,0,2,3,3,8,3,1,1,13,6,0,4,0 -6843,10,3,0,1,0,1,0,0,0,1,2,13,2,5,0 -6844,4,1,0,9,6,5,3,2,1,0,13,11,5,3,0 -6845,0,3,0,5,1,6,2,3,3,0,13,20,0,19,0 -6846,2,5,0,7,2,3,8,1,3,0,10,3,5,37,0 -6847,6,7,0,13,0,3,13,4,3,0,8,14,4,25,0 -6848,8,8,0,10,6,5,2,0,0,1,18,3,0,9,1 -6849,0,0,0,11,5,1,7,4,0,0,2,8,3,3,0 -6850,10,1,0,10,2,0,6,3,3,1,5,11,1,7,0 -6851,0,3,0,15,6,2,2,4,0,0,2,11,0,25,0 -6852,1,0,0,3,0,0,8,0,4,0,16,19,4,18,0 -6853,4,5,0,4,3,3,11,1,3,0,2,15,1,20,0 -6854,9,4,0,13,6,3,13,0,1,1,13,13,3,4,0 -6855,3,4,0,15,3,6,14,3,2,1,2,13,1,3,0 -6856,8,1,0,6,5,1,0,3,1,1,1,10,4,18,1 -6857,1,2,0,6,6,4,11,0,1,1,15,11,3,6,0 -6858,9,1,0,14,3,2,10,3,1,0,18,17,3,30,1 -6859,1,7,0,4,6,3,3,4,3,1,4,2,5,23,0 -6860,2,7,0,13,2,6,6,5,2,1,13,0,0,20,0 -6861,2,5,0,0,1,4,8,4,0,0,13,15,2,19,0 -6862,8,6,0,15,6,0,0,4,1,1,12,7,3,2,0 -6863,9,3,0,7,6,6,11,1,4,1,7,2,1,34,0 -6864,6,3,0,15,6,4,11,5,4,0,5,10,3,8,1 -6865,4,5,0,12,6,0,5,3,2,1,2,2,0,20,0 -6866,0,3,0,6,0,5,9,0,1,0,17,2,3,0,0 -6867,5,6,0,12,1,6,4,5,1,0,0,17,2,35,1 -6868,5,8,0,3,0,4,9,0,2,0,7,3,0,22,0 -6869,1,8,0,15,5,4,10,1,0,1,15,12,1,20,0 -6870,3,8,0,9,1,6,11,5,4,0,17,13,0,11,0 -6871,1,3,0,12,6,5,10,4,1,0,2,17,0,4,0 -6872,5,6,0,5,0,4,12,4,4,0,7,6,3,31,0 -6873,4,4,0,5,0,6,9,1,2,0,13,2,0,30,0 -6874,7,7,0,4,0,6,6,3,1,0,14,18,5,13,0 -6875,5,6,0,2,3,0,12,3,2,0,13,13,4,33,0 -6876,0,1,0,8,0,5,13,3,0,0,13,15,4,5,0 -6877,5,4,0,10,0,1,11,3,2,1,18,14,4,19,1 -6878,2,5,0,3,6,2,2,2,3,1,13,2,2,4,0 -6879,5,7,0,4,4,4,8,4,1,1,2,11,3,5,0 -6880,0,5,0,14,2,2,9,5,2,1,9,7,3,4,1 -6881,9,3,0,4,2,5,5,3,3,1,13,18,3,12,0 -6882,8,2,0,13,6,4,7,3,3,1,2,2,0,28,0 -6883,6,8,0,1,2,1,0,2,0,0,10,18,5,37,0 
-6884,6,3,0,6,5,5,2,4,4,0,13,17,2,24,0 -6885,0,6,0,13,6,0,7,5,3,0,13,2,0,30,0 -6886,2,6,0,10,3,4,3,1,0,0,0,6,2,6,0 -6887,10,6,0,5,3,5,0,2,3,0,2,8,3,22,0 -6888,3,3,0,8,5,5,10,4,0,0,13,0,3,33,0 -6889,8,2,0,8,6,0,11,0,4,1,9,14,1,33,1 -6890,6,8,0,9,2,0,7,4,2,0,11,11,0,0,0 -6891,2,8,0,4,5,5,0,5,2,1,2,4,1,28,0 -6892,3,8,0,4,2,5,4,4,0,1,2,5,0,35,0 -6893,10,4,0,0,1,4,5,4,4,0,8,6,0,25,0 -6894,9,6,0,5,0,3,1,3,3,0,14,15,3,24,0 -6895,6,4,0,6,6,5,6,1,4,0,6,13,5,24,0 -6896,9,3,0,8,6,5,7,0,0,1,2,9,2,12,0 -6897,8,4,0,0,3,5,0,1,2,1,18,7,3,18,1 -6898,6,2,0,13,5,1,0,1,4,0,13,16,0,9,0 -6899,10,4,0,13,2,5,9,1,4,0,15,11,0,14,0 -6900,6,6,0,15,5,6,11,0,1,1,8,13,0,36,0 -6901,8,5,0,7,4,5,4,5,2,0,17,1,0,26,1 -6902,1,4,0,8,3,1,3,4,3,0,14,6,0,21,0 -6903,0,8,0,15,3,6,7,4,3,0,2,13,2,25,0 -6904,10,5,0,5,0,4,12,0,1,0,3,17,5,29,1 -6905,3,6,0,5,1,6,13,4,3,1,13,11,3,27,0 -6906,9,4,0,3,4,5,7,1,3,0,7,11,1,5,0 -6907,9,3,0,4,0,6,11,3,2,1,4,11,0,22,0 -6908,6,3,0,11,1,3,8,0,0,0,13,3,3,7,0 -6909,5,6,0,1,5,3,0,4,3,0,0,2,2,19,0 -6910,4,5,0,3,3,1,14,1,0,1,13,2,5,9,0 -6911,0,2,0,7,3,2,5,3,0,1,11,2,4,30,0 -6912,3,1,0,11,0,5,3,3,0,0,2,20,2,38,0 -6913,0,4,0,0,1,2,4,0,2,0,2,18,1,28,0 -6914,8,0,0,6,2,4,8,2,3,0,4,16,0,39,0 -6915,0,2,0,14,6,2,5,2,3,1,8,20,3,39,0 -6916,7,0,0,1,6,2,6,4,2,1,17,6,4,5,0 -6917,7,1,0,13,3,2,0,0,3,0,18,4,3,12,1 -6918,7,1,0,5,4,0,10,1,0,0,12,6,3,25,0 -6919,1,3,0,7,4,5,7,1,4,0,4,7,4,22,0 -6920,2,7,0,13,6,3,1,1,2,1,8,11,2,28,0 -6921,1,7,0,7,3,1,9,2,0,0,13,6,2,27,0 -6922,7,2,0,11,3,6,7,1,2,1,10,1,0,18,0 -6923,4,6,0,14,4,5,2,5,4,0,18,12,5,8,1 -6924,2,3,0,5,1,1,9,1,3,1,17,2,5,17,0 -6925,9,5,0,9,1,6,7,2,3,1,14,7,4,36,1 -6926,6,5,0,4,0,5,11,2,4,1,7,5,4,20,1 -6927,1,7,0,11,3,4,0,0,0,1,6,0,2,9,0 -6928,1,4,0,3,4,3,5,4,2,0,0,9,4,8,0 -6929,4,3,0,3,6,2,0,4,0,1,4,2,0,13,0 -6930,0,8,0,5,0,3,9,5,3,1,8,5,0,38,0 -6931,9,8,0,9,0,0,14,1,2,0,15,20,5,20,0 -6932,1,6,0,1,5,5,10,0,2,1,4,9,4,17,0 -6933,6,2,0,15,6,0,8,0,2,1,7,7,5,9,0 -6934,9,1,0,11,2,5,13,2,1,1,13,16,2,13,0 -6935,5,6,0,8,2,1,3,2,3,0,2,5,5,4,0 
-6936,4,5,0,1,0,4,5,2,3,0,13,2,1,38,0 -6937,0,6,0,3,0,1,8,0,3,1,2,20,2,16,0 -6938,1,6,0,5,4,1,8,5,1,1,9,19,1,5,1 -6939,9,5,0,10,5,2,9,5,1,1,11,17,5,13,1 -6940,2,4,0,5,1,1,11,1,0,1,16,11,0,17,0 -6941,0,6,0,11,2,1,3,2,3,1,8,20,0,27,0 -6942,10,6,0,1,6,6,12,1,3,0,10,11,3,8,0 -6943,9,5,0,11,4,2,1,0,3,1,18,5,0,10,1 -6944,8,5,0,2,4,2,5,4,0,0,2,5,3,27,0 -6945,9,1,0,7,2,2,11,5,1,1,1,10,1,34,1 -6946,3,1,0,0,4,4,10,2,4,0,2,18,0,29,0 -6947,5,1,0,9,2,2,5,5,0,1,18,17,1,1,1 -6948,1,8,0,5,6,3,7,4,2,0,5,15,1,6,0 -6949,6,7,0,1,5,2,3,2,4,0,2,18,4,40,0 -6950,8,8,0,4,4,0,3,3,0,1,2,2,5,2,0 -6951,1,6,0,3,4,0,13,4,0,1,13,9,4,29,0 -6952,0,5,0,15,6,0,14,3,4,0,2,1,1,24,0 -6953,0,6,0,4,0,0,12,0,0,1,13,11,4,3,0 -6954,5,6,0,3,3,4,0,0,4,1,4,17,5,9,0 -6955,8,7,0,3,4,4,9,0,0,1,7,14,3,11,1 -6956,0,7,0,13,2,6,11,0,0,0,13,4,0,35,0 -6957,0,0,0,15,5,6,3,2,2,1,4,15,3,22,0 -6958,2,4,0,7,6,0,13,5,1,0,0,13,2,26,0 -6959,2,8,0,7,4,1,12,5,2,1,1,13,1,18,1 -6960,8,1,0,12,3,6,12,5,1,1,18,10,3,6,1 -6961,8,6,0,14,3,3,10,3,1,1,1,20,1,13,1 -6962,10,5,0,8,2,4,9,4,0,0,13,11,0,38,0 -6963,3,7,0,0,5,2,5,2,2,1,17,4,1,3,0 -6964,5,8,0,10,2,6,5,0,2,1,15,6,3,12,0 -6965,0,5,0,7,3,1,10,5,4,1,16,17,4,18,1 -6966,3,7,0,9,5,0,5,1,2,0,13,9,0,11,0 -6967,8,8,0,8,2,2,1,1,4,0,18,5,1,24,1 -6968,3,8,0,13,3,1,3,5,1,0,7,7,0,14,1 -6969,5,6,0,0,4,5,1,2,3,1,6,3,3,36,0 -6970,5,7,0,9,2,4,5,3,0,0,11,2,3,30,0 -6971,8,5,0,4,2,1,13,1,3,1,1,3,4,29,1 -6972,8,8,0,10,2,3,5,3,2,1,2,20,5,29,0 -6973,1,2,0,4,5,5,5,2,1,1,12,20,4,26,0 -6974,5,6,0,15,2,4,10,4,1,0,15,11,3,20,0 -6975,6,2,0,12,4,0,1,3,2,1,10,6,3,27,0 -6976,3,2,0,0,1,4,5,0,4,1,4,15,0,22,0 -6977,3,0,0,11,3,0,8,3,3,0,10,9,0,0,0 -6978,4,2,0,0,6,1,10,5,0,1,1,15,2,11,1 -6979,9,2,0,12,2,5,12,5,1,1,18,6,1,23,1 -6980,9,7,0,13,1,4,10,3,2,0,17,11,2,22,0 -6981,8,1,0,12,4,1,4,0,4,1,15,8,4,11,1 -6982,3,7,0,0,3,1,2,1,2,1,2,20,4,18,0 -6983,7,8,0,5,4,1,3,4,3,1,7,19,4,9,1 -6984,2,7,0,5,0,6,8,3,0,1,13,13,0,5,0 -6985,10,3,0,1,2,0,12,4,3,0,5,11,3,4,0 -6986,10,7,0,7,5,3,1,5,3,1,2,9,0,38,0 
-6987,2,8,0,0,1,6,5,5,3,1,4,15,0,39,0 -6988,0,0,0,3,0,2,8,4,2,0,3,2,3,9,0 -6989,0,2,0,3,4,0,8,1,0,0,4,11,4,24,0 -6990,8,6,0,13,6,0,8,4,3,0,13,10,1,29,0 -6991,7,8,0,15,3,1,14,2,2,0,12,19,2,38,0 -6992,0,7,0,10,1,4,9,4,2,1,13,11,5,18,0 -6993,8,8,0,15,3,6,3,4,2,1,13,20,0,3,0 -6994,2,1,0,13,1,5,6,2,3,1,5,1,5,31,1 -6995,8,8,0,5,2,6,13,5,4,1,12,12,0,4,0 -6996,4,4,0,10,4,1,6,5,0,1,16,4,3,33,1 -6997,1,6,0,4,1,4,7,0,0,0,1,13,2,37,0 -6998,1,2,0,3,0,0,7,0,1,0,4,7,1,3,0 -6999,2,3,0,5,4,5,5,4,0,0,15,18,1,3,0 -7000,2,0,0,4,0,5,5,4,3,0,2,1,3,33,0 -7001,0,5,0,3,3,2,9,2,3,1,17,11,0,34,0 -7002,8,5,0,1,2,3,4,0,2,0,2,11,0,10,0 -7003,1,7,0,9,0,6,9,0,0,1,13,15,4,3,0 -7004,8,4,0,11,6,4,14,4,1,0,18,12,5,27,1 -7005,2,7,0,9,2,0,9,1,3,1,2,14,0,9,0 -7006,1,5,0,1,0,5,11,5,1,1,17,11,1,7,0 -7007,1,4,0,0,1,5,1,5,0,0,8,20,0,31,0 -7008,6,3,0,10,0,1,10,1,3,1,18,1,2,17,1 -7009,6,8,0,8,2,3,9,1,1,0,15,9,2,6,0 -7010,3,3,0,3,6,5,12,4,3,0,17,9,1,33,0 -7011,2,3,0,3,0,1,5,0,3,1,18,10,5,18,1 -7012,9,7,0,10,0,1,0,4,2,0,15,7,2,0,1 -7013,0,6,0,3,5,3,0,2,0,0,2,17,3,14,0 -7014,0,7,0,3,0,4,7,4,3,1,10,6,2,21,0 -7015,4,3,0,10,0,3,8,2,0,0,13,11,0,26,0 -7016,3,5,0,1,4,4,5,3,3,0,17,2,2,28,0 -7017,8,6,0,7,1,1,12,5,2,0,11,18,3,25,1 -7018,8,5,0,6,5,2,1,0,4,1,5,6,5,39,1 -7019,4,4,0,13,1,5,3,2,3,0,1,2,2,29,0 -7020,1,7,0,6,2,0,6,1,2,0,17,13,4,4,0 -7021,4,2,0,2,5,5,10,4,2,1,14,1,2,11,1 -7022,0,5,0,11,2,0,12,3,1,0,6,0,4,23,0 -7023,0,0,0,13,6,5,0,0,3,1,14,7,3,22,1 -7024,2,6,0,11,1,5,5,1,1,0,13,2,1,22,0 -7025,5,0,0,1,5,2,4,0,3,0,5,15,0,3,0 -7026,0,4,0,9,6,0,0,1,4,1,2,19,0,8,0 -7027,0,5,0,1,4,0,9,3,3,0,13,12,1,7,0 -7028,9,0,0,13,6,6,8,1,1,1,13,9,4,30,0 -7029,1,8,0,9,1,3,1,2,3,1,0,11,2,13,0 -7030,1,3,0,4,3,3,2,4,2,1,2,6,3,29,0 -7031,3,8,0,2,0,4,11,3,0,1,2,15,4,40,0 -7032,1,3,0,12,1,0,0,1,2,0,0,2,1,30,0 -7033,4,3,0,7,2,0,13,2,2,1,0,20,3,39,0 -7034,5,5,0,15,4,0,14,1,0,1,18,10,4,6,1 -7035,6,8,0,5,3,6,9,1,0,0,1,2,0,6,0 -7036,8,1,0,14,6,4,3,5,1,1,1,19,4,27,1 -7037,7,6,0,14,2,0,10,0,1,1,18,7,2,11,1 -7038,0,1,0,14,5,4,7,4,1,0,13,11,5,25,0 
-7039,10,7,0,8,0,5,6,2,1,0,8,7,2,0,0 -7040,0,0,0,6,0,2,13,0,0,0,4,11,2,35,0 -7041,0,8,0,2,0,3,7,3,2,1,2,0,5,31,0 -7042,0,8,0,13,5,2,3,2,3,0,6,5,0,30,0 -7043,2,6,0,10,5,6,14,1,0,0,15,20,1,34,0 -7044,0,3,0,11,1,1,11,3,3,0,2,18,0,3,0 -7045,4,8,0,10,1,3,5,1,0,1,0,20,1,24,0 -7046,5,0,0,13,1,0,5,1,0,1,3,13,5,21,0 -7047,2,6,0,3,5,4,10,2,0,0,17,2,1,10,0 -7048,2,1,0,13,0,2,7,0,0,1,8,11,0,32,0 -7049,10,6,0,2,5,5,7,4,2,0,13,0,2,17,0 -7050,1,8,0,2,0,4,2,5,0,1,6,11,0,22,0 -7051,10,7,0,13,5,5,6,2,1,0,8,0,5,36,0 -7052,9,2,0,13,2,0,5,0,2,0,17,13,4,39,0 -7053,2,7,0,7,6,2,8,3,3,1,4,0,0,22,0 -7054,9,8,0,2,4,5,7,4,2,0,8,9,3,32,0 -7055,0,6,0,9,0,4,11,1,0,0,13,6,1,36,0 -7056,0,8,0,0,1,1,3,3,1,1,10,8,5,11,0 -7057,8,7,0,15,0,3,12,4,0,1,18,16,3,3,1 -7058,2,6,0,7,0,0,8,2,0,0,3,13,2,16,0 -7059,3,8,0,0,6,3,3,1,1,0,5,10,0,25,1 -7060,6,8,0,1,6,2,6,1,1,1,1,12,4,36,0 -7061,5,7,0,7,1,2,6,1,4,0,2,16,0,6,0 -7062,10,2,0,6,1,2,4,5,4,1,9,13,3,37,1 -7063,2,8,0,13,5,4,7,3,2,1,12,13,3,28,0 -7064,0,3,0,13,2,6,2,3,0,0,0,0,2,1,0 -7065,1,0,0,3,0,3,13,1,4,1,2,4,2,27,0 -7066,1,7,0,13,3,5,11,1,0,1,0,2,1,27,0 -7067,10,4,0,8,4,2,0,0,0,0,17,20,0,10,0 -7068,0,0,0,8,1,0,5,4,4,1,13,15,1,7,0 -7069,0,0,0,6,5,5,10,3,3,1,6,15,4,26,0 -7070,10,0,0,10,2,6,2,1,0,1,5,0,1,33,1 -7071,10,6,0,13,5,1,6,1,2,0,10,2,5,13,0 -7072,0,4,0,11,3,1,3,4,0,1,0,11,0,19,0 -7073,2,7,0,3,6,6,6,4,2,1,17,13,3,6,0 -7074,2,8,0,4,4,0,14,2,4,0,12,20,2,31,0 -7075,3,4,0,2,2,4,10,4,1,0,12,0,5,36,0 -7076,2,1,0,12,4,0,5,5,2,1,8,17,5,24,0 -7077,2,6,0,13,6,6,2,3,1,1,4,15,2,6,0 -7078,2,2,0,9,1,3,7,1,0,0,8,2,0,1,0 -7079,1,4,0,9,3,1,6,1,2,0,0,11,2,16,0 -7080,10,3,0,13,4,0,10,0,3,0,17,18,3,19,0 -7081,1,1,0,3,6,1,8,1,4,0,2,20,1,18,0 -7082,0,4,0,15,0,0,11,1,2,1,8,18,1,13,0 -7083,5,6,0,6,5,1,10,3,3,0,10,15,4,26,0 -7084,1,8,0,14,0,1,8,0,3,1,13,20,1,37,0 -7085,3,5,0,8,5,4,7,2,4,1,18,19,0,20,1 -7086,2,3,0,14,6,2,1,1,1,0,10,13,2,11,0 -7087,5,0,0,10,2,1,3,5,3,1,18,10,4,1,1 -7088,4,8,0,9,4,1,5,5,3,0,5,5,5,25,1 -7089,1,3,0,0,0,5,5,4,4,0,2,17,0,7,0 
-7090,0,3,0,12,2,2,9,2,3,1,13,2,3,4,0 -7091,10,6,0,6,2,2,0,2,0,0,7,14,1,36,1 -7092,3,3,0,1,0,3,6,2,2,1,6,15,1,26,0 -7093,7,1,0,8,4,2,0,1,1,0,2,2,0,17,0 -7094,0,6,0,12,2,3,4,4,0,0,2,0,0,39,0 -7095,10,4,0,4,0,5,1,2,2,1,17,17,0,9,0 -7096,8,8,0,7,3,2,7,1,1,0,14,15,5,11,1 -7097,1,7,0,9,3,0,5,0,2,0,13,20,1,8,0 -7098,9,3,0,2,5,3,11,5,0,1,14,0,1,11,0 -7099,10,0,0,9,2,5,7,1,3,1,13,9,0,19,0 -7100,1,5,0,7,5,4,8,2,2,1,13,18,5,9,0 -7101,1,7,0,4,0,3,8,0,4,0,2,18,4,32,0 -7102,3,3,0,12,0,6,8,0,0,1,13,20,0,0,0 -7103,3,8,0,10,2,1,14,1,3,0,2,2,4,3,0 -7104,0,8,0,10,5,3,13,2,2,1,2,13,2,0,0 -7105,2,3,0,8,1,6,12,0,0,0,17,11,5,26,0 -7106,9,8,0,5,6,5,12,1,0,0,13,15,4,41,0 -7107,6,3,0,5,4,1,11,2,0,0,3,17,3,12,1 -7108,0,7,0,13,2,4,5,1,1,1,14,15,2,29,0 -7109,1,4,0,1,2,1,14,2,0,0,8,14,2,41,0 -7110,3,8,0,3,6,0,12,4,3,0,4,20,5,10,0 -7111,0,4,0,12,4,6,9,3,4,1,2,2,0,4,0 -7112,3,8,0,3,0,5,8,2,4,0,8,2,0,10,0 -7113,6,6,0,2,4,0,4,0,0,1,16,9,0,18,1 -7114,3,1,0,5,2,4,11,5,4,1,17,13,0,28,0 -7115,4,3,0,1,4,5,6,2,0,1,13,19,4,6,0 -7116,4,4,0,4,6,6,6,4,3,1,6,2,5,30,0 -7117,1,0,0,11,6,5,0,2,3,0,2,11,0,13,0 -7118,4,5,0,13,4,5,3,2,2,1,3,8,2,16,1 -7119,6,5,0,7,6,4,5,1,1,1,11,10,1,23,1 -7120,2,1,0,8,0,4,8,0,1,1,6,18,0,3,0 -7121,9,7,0,6,4,0,8,0,2,0,18,7,5,14,1 -7122,2,3,0,2,0,2,3,2,1,0,8,20,1,3,0 -7123,0,7,0,6,2,6,5,4,1,0,8,11,1,38,0 -7124,7,0,0,12,4,3,12,2,1,0,1,5,2,41,1 -7125,0,5,0,3,0,4,0,3,1,0,6,11,5,26,0 -7126,1,3,0,15,5,3,6,3,2,0,12,9,0,6,0 -7127,1,3,0,15,6,5,0,1,2,0,2,9,3,38,0 -7128,4,6,0,7,6,0,6,3,2,1,16,20,2,37,0 -7129,4,6,0,13,2,5,6,5,2,0,13,19,5,14,0 -7130,9,3,0,10,1,4,8,3,1,0,8,8,5,19,0 -7131,0,1,0,14,6,3,4,5,2,1,14,11,1,29,0 -7132,9,4,0,13,1,1,6,3,4,0,13,11,0,21,0 -7133,3,3,0,7,2,2,0,3,2,1,4,15,2,0,0 -7134,10,3,0,10,0,4,5,4,3,0,14,15,0,0,0 -7135,2,8,0,2,2,0,14,1,3,0,2,9,1,21,0 -7136,5,6,0,4,6,5,4,3,4,1,16,18,3,32,0 -7137,3,7,0,13,5,5,3,0,3,0,8,12,1,35,0 -7138,2,4,0,8,0,5,6,1,0,1,4,11,2,21,0 -7139,2,1,0,11,1,5,5,3,3,0,6,2,2,19,0 -7140,2,0,0,4,0,4,7,2,3,0,2,2,0,23,0 -7141,5,0,0,9,4,3,4,0,1,1,7,19,2,1,1 
-7142,7,2,0,11,6,5,0,4,1,0,17,15,0,37,0 -7143,1,8,0,3,1,1,1,1,2,0,7,20,1,4,0 -7144,0,4,0,7,2,6,9,1,1,1,2,11,3,28,0 -7145,1,4,0,3,0,3,9,4,2,0,4,13,3,30,0 -7146,3,0,0,1,5,6,12,2,0,1,13,11,5,14,0 -7147,0,6,0,7,3,0,5,1,0,1,4,0,0,11,0 -7148,1,8,0,15,6,0,0,2,1,1,2,14,1,6,0 -7149,1,8,0,3,1,5,2,2,1,1,15,2,5,25,0 -7150,3,8,0,10,2,0,3,4,0,0,12,2,0,23,0 -7151,1,5,0,15,1,3,0,3,0,1,8,2,3,5,0 -7152,9,2,0,8,0,5,9,4,2,0,13,16,0,12,0 -7153,10,0,0,3,3,2,2,2,0,1,3,12,3,13,0 -7154,2,3,0,6,3,3,2,2,3,0,2,0,4,0,0 -7155,2,2,0,13,6,2,0,3,0,1,10,12,4,26,0 -7156,1,1,0,1,3,0,1,2,1,0,6,9,5,20,0 -7157,6,8,0,2,5,5,13,5,2,1,5,7,4,0,1 -7158,0,0,0,2,1,5,5,2,4,1,6,9,2,29,0 -7159,3,7,0,15,0,3,1,2,3,0,5,17,0,23,0 -7160,0,0,0,3,6,5,14,2,2,1,2,11,0,22,0 -7161,10,4,0,6,6,6,9,4,4,1,15,16,0,20,0 -7162,4,7,0,15,0,5,6,3,2,1,18,15,0,12,0 -7163,6,0,0,10,6,3,12,1,1,1,11,1,2,19,1 -7164,4,0,0,8,1,6,7,4,0,0,17,2,4,25,0 -7165,0,0,0,9,4,5,8,3,4,1,8,8,1,6,0 -7166,3,3,0,10,0,3,0,3,0,0,2,14,5,29,0 -7167,0,8,0,3,0,0,0,3,0,1,4,2,0,39,0 -7168,7,3,0,11,3,2,0,3,1,0,9,9,4,29,1 -7169,6,8,0,1,6,5,2,1,2,1,6,16,0,18,0 -7170,9,4,0,10,1,6,12,1,0,0,3,7,4,40,1 -7171,10,6,0,5,6,5,9,4,1,1,13,6,1,38,0 -7172,6,2,0,3,6,4,8,3,4,1,17,16,5,24,0 -7173,2,3,0,3,1,5,14,1,2,1,14,11,0,1,0 -7174,8,5,0,1,6,4,6,3,1,0,2,2,0,21,0 -7175,10,6,0,6,0,0,8,1,2,1,10,15,2,21,0 -7176,6,6,0,7,4,5,7,3,1,1,6,19,1,36,0 -7177,2,5,0,8,3,2,2,0,3,1,4,6,0,37,0 -7178,1,8,0,10,0,5,11,4,3,0,11,0,4,8,0 -7179,10,6,0,6,2,5,8,2,0,1,13,15,0,27,0 -7180,0,1,0,1,2,6,0,2,3,0,17,6,4,37,0 -7181,7,7,0,4,0,5,11,1,3,0,16,11,0,20,1 -7182,8,6,0,10,4,5,14,2,3,1,0,2,0,4,0 -7183,0,6,0,7,1,0,9,2,1,0,1,16,0,16,0 -7184,9,8,0,12,6,3,0,5,0,0,14,7,3,23,1 -7185,0,6,0,0,4,4,6,5,3,0,13,15,5,31,0 -7186,0,6,0,10,5,5,13,2,2,1,15,3,3,4,0 -7187,4,8,0,7,1,6,8,4,4,0,8,2,5,7,0 -7188,7,2,0,10,1,4,8,4,2,0,10,12,4,35,0 -7189,5,1,0,12,3,0,3,3,4,1,2,2,4,33,0 -7190,2,0,0,14,1,2,13,0,1,1,2,11,3,20,0 -7191,0,2,0,3,3,0,8,2,2,0,13,16,2,5,0 -7192,2,5,0,7,4,5,12,0,3,0,7,11,4,34,0 -7193,9,8,0,1,6,0,10,5,3,1,18,5,4,27,1 
-7194,9,1,0,3,1,1,6,2,4,0,9,1,1,38,1 -7195,5,8,0,3,2,6,1,2,3,0,2,15,5,0,0 -7196,7,2,0,4,3,4,11,0,4,1,11,10,4,32,1 -7197,8,6,0,7,3,4,7,2,4,0,17,3,0,1,0 -7198,2,5,0,8,4,1,13,1,0,0,1,2,5,38,0 -7199,10,7,0,2,6,5,3,4,1,1,17,12,0,18,0 -7200,5,7,0,0,2,6,5,2,0,0,2,11,2,13,0 -7201,7,2,0,10,1,6,14,2,1,1,12,15,3,9,0 -7202,10,4,0,8,1,2,8,1,4,1,16,17,3,9,1 -7203,0,3,0,14,3,2,6,4,2,0,13,3,0,22,0 -7204,7,4,0,3,1,3,5,2,1,1,13,9,0,39,0 -7205,5,2,0,0,1,4,0,0,0,0,2,18,5,9,0 -7206,4,1,0,15,6,6,12,5,0,0,5,7,3,17,1 -7207,5,6,0,9,0,6,8,4,4,1,11,11,1,10,0 -7208,2,4,0,6,2,3,9,0,2,1,6,13,0,6,0 -7209,1,6,0,6,6,0,8,3,1,0,14,2,1,39,0 -7210,4,4,0,6,2,0,9,3,4,1,0,2,0,20,0 -7211,7,5,0,3,3,5,0,5,2,1,11,15,3,11,0 -7212,7,6,0,5,3,4,0,4,0,1,4,6,0,6,0 -7213,7,4,0,1,6,1,14,0,4,1,9,7,3,18,1 -7214,3,5,0,6,3,4,7,4,0,1,13,15,2,7,0 -7215,10,7,0,5,3,3,5,3,0,1,4,4,0,14,0 -7216,1,8,0,13,2,4,3,2,4,0,8,11,2,22,0 -7217,3,8,0,8,5,1,3,2,0,0,2,2,1,24,0 -7218,0,5,0,8,2,5,1,1,1,0,13,2,4,31,0 -7219,2,7,0,3,4,6,7,4,3,1,0,11,2,3,0 -7220,9,4,0,15,0,6,11,1,3,1,9,7,4,32,1 -7221,10,3,0,2,0,4,13,3,4,0,10,2,4,20,0 -7222,3,6,0,9,2,0,9,2,3,1,16,18,1,13,0 -7223,0,8,0,9,1,4,8,0,3,0,9,3,0,23,0 -7224,9,5,0,11,0,5,1,1,3,0,8,12,3,34,0 -7225,8,1,0,2,1,0,14,0,4,1,1,11,0,18,0 -7226,2,7,0,6,3,6,2,2,0,1,0,2,0,24,0 -7227,7,4,0,2,3,3,10,1,4,1,8,3,0,10,0 -7228,6,6,0,7,2,4,10,1,2,0,2,20,4,18,0 -7229,10,2,0,0,0,6,8,0,0,0,0,2,5,40,0 -7230,6,4,0,5,2,2,8,5,3,0,13,20,4,21,0 -7231,4,6,0,1,6,2,8,4,2,1,2,1,1,30,0 -7232,5,2,0,0,1,1,0,4,1,1,14,1,2,0,1 -7233,3,4,0,3,6,5,6,5,2,1,14,7,3,4,1 -7234,4,7,0,3,0,5,13,1,3,1,13,4,0,20,0 -7235,8,1,0,0,0,2,5,5,0,0,12,0,0,22,0 -7236,1,8,0,13,2,4,9,3,2,1,8,11,3,26,0 -7237,10,3,0,8,6,1,2,3,1,0,10,11,0,37,0 -7238,2,3,0,1,0,0,2,3,4,0,2,11,3,24,0 -7239,1,6,0,2,2,5,3,3,4,1,2,5,0,19,0 -7240,3,3,0,8,3,3,8,4,0,1,15,11,3,5,0 -7241,2,3,0,1,1,5,9,5,3,0,2,13,3,3,0 -7242,7,2,0,12,2,0,0,4,1,1,9,14,3,7,1 -7243,6,2,0,1,0,4,7,4,0,0,8,20,1,0,0 -7244,2,6,0,3,2,5,6,3,0,0,2,3,0,28,0 -7245,9,1,0,2,6,5,3,5,1,0,8,5,0,38,0 
-7246,6,6,0,4,1,2,13,3,3,1,11,14,1,38,1 -7247,5,5,0,7,6,1,3,0,2,0,17,16,4,35,1 -7248,8,0,0,14,2,5,0,0,1,1,6,2,2,30,0 -7249,7,4,0,15,1,3,2,0,3,1,7,9,3,2,1 -7250,6,8,0,0,5,2,4,1,3,1,2,9,1,26,0 -7251,2,4,0,0,1,2,14,4,3,1,7,2,4,11,0 -7252,4,7,0,14,5,0,6,4,0,0,2,6,0,26,0 -7253,1,5,0,3,6,3,7,2,3,0,2,12,5,19,0 -7254,0,0,0,2,0,6,9,2,4,1,8,11,1,38,0 -7255,2,2,0,12,0,5,10,0,3,1,8,11,3,24,0 -7256,10,3,0,4,4,6,8,0,3,1,1,10,0,14,1 -7257,3,2,0,15,6,4,5,3,2,0,15,15,0,2,0 -7258,8,3,0,4,4,0,13,2,1,1,18,19,4,20,1 -7259,5,3,0,12,1,2,9,4,3,1,14,8,5,10,1 -7260,10,8,0,15,2,0,10,1,2,0,14,6,2,7,0 -7261,9,2,0,14,2,6,11,1,4,1,9,17,4,22,1 -7262,3,1,0,10,3,0,4,0,0,1,17,14,5,21,0 -7263,2,3,0,7,4,1,6,1,4,0,5,9,2,31,0 -7264,10,2,0,13,1,3,14,1,0,1,11,2,1,41,0 -7265,5,6,0,5,0,3,5,3,3,0,0,2,5,30,0 -7266,3,4,0,13,0,5,8,5,1,0,2,1,3,8,0 -7267,0,3,0,6,6,5,11,3,2,0,13,2,2,25,0 -7268,7,7,0,5,1,6,2,5,3,1,6,15,0,30,0 -7269,4,8,0,10,4,0,5,4,4,0,13,11,4,32,0 -7270,2,6,0,6,0,1,12,3,2,0,13,2,3,27,0 -7271,3,6,0,13,3,3,3,2,3,1,13,11,0,39,0 -7272,9,3,0,7,0,5,6,4,0,1,17,3,2,7,0 -7273,3,8,0,2,0,0,4,2,1,1,7,13,5,13,0 -7274,8,7,0,8,1,3,1,4,1,0,11,2,4,35,0 -7275,0,5,0,13,1,1,8,1,2,1,0,4,0,33,0 -7276,3,8,0,11,1,3,1,4,1,0,8,20,3,6,0 -7277,1,2,0,13,6,2,9,4,2,0,2,18,0,1,0 -7278,1,8,0,2,1,4,11,3,0,0,8,10,0,29,0 -7279,1,5,0,6,4,6,6,3,4,0,15,2,1,8,0 -7280,6,6,0,10,1,4,5,2,1,1,2,1,2,9,0 -7281,8,8,0,3,3,4,3,1,4,1,2,9,5,6,0 -7282,4,2,0,15,1,0,13,4,4,0,2,13,1,22,0 -7283,3,5,0,11,0,6,5,1,0,0,10,11,5,10,0 -7284,3,8,0,4,1,6,3,3,1,0,6,7,5,18,0 -7285,2,7,0,4,0,5,10,1,0,0,13,20,0,30,0 -7286,5,2,0,0,4,5,0,0,3,1,14,5,4,30,1 -7287,1,7,0,5,1,4,6,3,0,1,8,0,0,4,0 -7288,9,5,0,7,6,0,7,0,1,1,18,19,4,39,1 -7289,2,8,0,2,0,4,9,2,3,1,13,9,0,13,0 -7290,1,6,0,15,6,4,2,1,2,1,10,5,0,17,0 -7291,4,1,0,11,2,2,1,5,0,1,15,5,3,36,1 -7292,9,7,0,11,6,6,6,1,1,1,18,7,0,4,1 -7293,1,6,0,2,0,6,3,0,1,1,12,13,3,18,0 -7294,5,6,0,0,2,3,0,2,2,0,2,7,1,7,0 -7295,6,0,0,2,2,3,3,5,0,1,7,17,5,32,1 -7296,4,1,0,8,4,3,7,5,1,0,17,5,4,5,1 -7297,7,7,0,9,4,6,3,4,0,0,5,16,4,20,1 
-7298,4,0,0,4,6,6,11,5,0,1,9,1,3,36,1 -7299,1,6,0,7,6,3,12,4,0,1,17,11,2,23,0 -7300,1,8,0,3,0,6,1,2,3,1,2,4,0,34,0 -7301,3,1,0,9,6,1,0,2,1,1,8,2,0,33,0 -7302,10,5,0,0,0,4,4,0,4,1,17,3,4,7,0 -7303,4,3,0,7,2,3,11,0,0,0,0,4,4,37,0 -7304,1,5,0,6,3,5,5,0,3,1,2,15,4,12,0 -7305,2,2,0,5,6,5,6,3,0,1,16,12,1,12,0 -7306,1,0,0,12,5,0,2,2,0,0,2,8,5,1,0 -7307,9,5,0,0,1,1,5,3,2,0,6,15,5,3,0 -7308,0,2,0,4,3,2,10,2,0,1,16,13,5,10,0 -7309,0,4,0,13,4,4,14,0,4,1,5,11,1,13,0 -7310,10,6,0,4,6,0,8,2,1,1,8,18,5,38,0 -7311,10,7,0,8,2,4,3,3,3,0,0,6,4,35,0 -7312,2,7,0,9,5,6,10,1,2,0,8,20,5,16,0 -7313,9,1,0,8,2,4,9,1,1,1,0,6,0,40,0 -7314,1,6,0,7,0,6,8,3,2,0,13,11,3,10,0 -7315,9,2,0,8,6,3,7,0,3,0,10,13,3,17,0 -7316,3,5,0,13,3,5,9,1,2,0,10,7,1,41,0 -7317,1,8,0,13,5,6,0,2,2,0,10,9,4,22,0 -7318,1,6,0,6,5,0,14,2,2,1,13,15,4,40,0 -7319,0,5,0,9,0,3,9,0,2,0,3,2,3,11,0 -7320,6,1,0,14,1,0,5,3,2,0,4,2,2,7,0 -7321,2,8,0,5,5,3,2,0,2,0,9,9,1,30,0 -7322,2,3,0,4,0,0,14,3,2,1,13,19,4,40,0 -7323,4,3,0,5,3,5,14,2,4,0,9,7,4,6,1 -7324,0,7,0,15,0,0,0,0,2,1,13,2,2,21,0 -7325,6,4,0,1,0,4,1,1,2,0,2,8,2,3,0 -7326,2,3,0,2,5,5,9,4,3,1,10,2,5,25,0 -7327,1,6,0,13,4,5,2,5,2,1,0,13,4,17,0 -7328,0,8,0,2,0,6,4,0,1,1,13,15,2,26,0 -7329,3,8,0,0,3,1,1,5,3,1,3,7,5,14,1 -7330,6,8,0,13,3,0,3,4,2,1,11,17,0,26,1 -7331,0,3,0,7,0,0,11,4,0,1,2,4,5,26,0 -7332,5,4,0,9,4,6,5,1,2,0,8,19,0,30,1 -7333,10,2,0,3,2,6,2,3,3,0,2,2,5,23,0 -7334,3,5,0,5,0,0,0,2,2,1,8,3,0,3,0 -7335,6,8,0,5,6,5,8,4,1,0,14,0,5,7,0 -7336,1,2,0,11,4,6,1,0,0,1,0,18,0,26,0 -7337,9,8,0,3,0,2,1,3,0,0,11,10,4,40,1 -7338,2,2,0,15,0,5,6,3,4,1,6,15,0,26,0 -7339,1,8,0,7,0,5,5,1,4,0,12,11,0,6,0 -7340,2,4,0,14,3,3,5,5,1,1,2,17,0,19,0 -7341,8,2,0,6,4,6,7,1,0,0,10,7,5,10,0 -7342,0,4,0,12,0,4,10,1,3,1,4,18,4,28,0 -7343,7,3,0,11,1,1,9,4,3,1,13,2,3,29,0 -7344,0,6,0,9,1,1,0,1,0,1,17,13,3,29,0 -7345,6,7,0,8,5,4,4,5,4,0,8,13,5,35,0 -7346,10,6,0,9,4,2,12,1,3,1,1,13,0,5,1 -7347,3,3,0,8,5,0,3,4,2,0,8,2,4,0,0 -7348,8,2,0,13,3,6,10,1,3,0,18,14,0,1,1 -7349,6,5,0,3,1,0,4,3,0,0,6,5,3,16,1 
-7350,7,7,0,8,6,5,14,2,4,1,4,8,1,36,0 -7351,0,6,0,5,0,1,12,1,2,0,0,3,4,33,0 -7352,5,3,0,11,6,0,12,4,1,0,5,13,0,1,0 -7353,9,7,0,15,5,6,5,4,1,0,2,13,4,41,0 -7354,5,3,0,12,0,6,3,3,2,0,2,6,0,4,0 -7355,2,3,0,14,4,3,8,3,0,0,0,17,1,5,0 -7356,1,1,0,15,2,0,9,1,0,0,13,4,1,34,0 -7357,1,3,0,7,0,6,7,2,3,0,4,10,3,22,0 -7358,2,7,0,13,6,1,6,1,3,0,15,9,0,3,0 -7359,0,4,0,12,2,1,9,4,2,0,2,11,0,17,0 -7360,6,3,0,1,1,1,11,5,2,1,4,5,4,7,1 -7361,1,8,0,1,6,5,11,3,4,1,2,14,3,31,0 -7362,9,6,0,15,4,3,5,3,0,0,13,8,0,18,0 -7363,9,6,0,0,1,4,7,1,2,1,8,13,3,14,0 -7364,3,0,0,15,4,3,4,2,3,1,1,1,5,36,0 -7365,0,2,0,2,0,5,7,0,1,0,8,4,0,28,0 -7366,0,7,0,3,0,1,13,3,2,1,13,16,5,24,0 -7367,6,1,0,15,3,1,12,0,3,1,9,4,1,1,1 -7368,3,0,0,1,2,0,5,1,0,0,10,20,0,18,0 -7369,6,4,0,3,6,0,11,4,3,0,13,13,3,25,0 -7370,9,8,0,6,4,0,2,5,1,0,4,2,4,17,0 -7371,8,8,0,7,6,4,10,2,0,1,10,13,3,3,0 -7372,2,2,0,13,1,0,9,0,0,0,14,13,0,22,0 -7373,9,1,0,9,4,3,2,1,1,0,7,14,0,6,1 -7374,8,0,0,10,3,1,5,0,1,0,16,13,4,2,1 -7375,1,6,0,1,3,1,5,5,2,1,17,14,0,17,0 -7376,7,2,0,0,1,1,0,4,0,0,6,2,0,9,0 -7377,4,2,0,5,0,6,6,4,0,0,8,2,5,18,0 -7378,6,8,0,9,6,2,12,5,4,0,1,17,4,23,1 -7379,2,5,0,4,5,0,8,2,2,1,2,9,2,19,0 -7380,1,4,0,2,1,0,5,5,1,0,4,20,2,27,0 -7381,9,2,0,6,6,3,9,3,4,0,12,5,2,10,1 -7382,5,0,0,0,2,3,0,4,3,1,2,11,3,17,0 -7383,9,0,0,7,2,2,2,0,2,0,5,8,5,3,1 -7384,0,5,0,12,4,4,3,2,0,1,10,11,2,38,0 -7385,3,5,0,12,5,5,10,0,0,1,5,5,3,13,1 -7386,9,4,0,9,6,4,12,3,3,0,13,11,0,4,0 -7387,0,3,0,3,6,0,13,4,0,0,2,8,3,18,0 -7388,8,2,0,0,0,0,0,1,1,1,10,18,0,0,0 -7389,7,8,0,12,1,3,9,4,0,1,5,2,3,29,0 -7390,7,4,0,10,4,5,9,2,1,0,2,13,2,8,0 -7391,2,4,0,2,2,6,2,3,1,1,13,19,0,19,0 -7392,1,7,0,1,0,5,0,1,2,1,10,2,1,24,0 -7393,8,8,0,1,0,3,5,2,3,1,17,19,5,12,0 -7394,8,6,0,14,3,1,14,3,1,1,8,15,5,13,0 -7395,5,4,0,15,2,2,12,0,1,1,7,19,5,33,1 -7396,5,2,0,3,6,4,8,3,4,0,9,1,5,24,1 -7397,1,1,0,11,0,5,3,3,3,1,6,11,4,8,0 -7398,10,5,0,8,1,0,8,5,0,0,9,18,5,7,1 -7399,1,6,0,3,0,6,8,1,1,0,4,6,0,29,0 -7400,4,5,0,4,5,5,6,1,3,1,13,11,0,35,0 -7401,1,6,0,3,4,0,8,2,2,0,4,13,1,7,0 
-7402,0,8,0,6,3,4,10,4,1,1,6,2,2,27,0 -7403,3,2,0,5,1,3,5,3,1,1,1,20,3,27,0 -7404,0,2,0,8,6,1,3,0,4,0,0,15,5,26,0 -7405,4,3,0,1,5,5,8,4,2,1,5,14,0,26,0 -7406,5,8,0,11,4,1,8,4,0,1,13,20,0,23,0 -7407,0,1,0,13,6,3,5,4,0,0,13,6,0,4,0 -7408,4,5,0,14,6,0,3,2,3,1,13,4,5,32,0 -7409,6,8,0,12,6,1,9,0,2,0,13,2,1,22,0 -7410,6,4,0,5,3,4,3,0,1,0,13,11,0,31,0 -7411,6,1,0,6,2,0,9,1,1,0,12,19,3,21,0 -7412,4,8,0,7,6,6,4,1,3,1,8,2,3,24,0 -7413,3,5,0,2,6,2,8,0,0,1,13,9,0,9,0 -7414,2,8,0,10,1,4,14,4,4,0,13,2,5,25,0 -7415,8,7,0,11,4,0,0,0,4,1,17,17,3,33,0 -7416,8,4,0,13,5,3,5,3,4,0,8,2,1,33,0 -7417,5,7,0,1,3,2,5,5,4,1,0,15,5,29,1 -7418,0,6,0,5,6,2,10,3,0,0,13,16,3,17,0 -7419,2,3,0,5,0,4,9,3,4,0,9,15,5,35,0 -7420,10,5,0,2,1,3,4,5,4,1,5,10,5,6,1 -7421,8,7,0,5,4,0,0,2,3,0,6,6,4,11,0 -7422,6,5,0,10,0,3,7,4,1,1,6,2,5,14,0 -7423,0,6,0,3,4,0,7,3,1,1,4,0,3,31,0 -7424,0,0,0,4,0,6,7,0,1,0,15,11,2,2,0 -7425,8,1,0,5,1,2,0,5,2,1,18,19,4,29,1 -7426,5,2,0,6,5,5,1,0,2,0,18,10,5,27,0 -7427,0,3,0,11,5,6,3,0,1,0,13,5,5,0,0 -7428,10,5,0,0,4,4,1,3,0,1,18,9,4,22,1 -7429,8,7,0,2,1,2,3,2,3,0,13,4,3,35,0 -7430,6,5,0,14,0,6,8,1,1,1,4,15,1,24,0 -7431,4,6,0,11,5,0,1,4,0,0,0,11,0,14,0 -7432,0,2,0,3,0,1,0,1,4,0,2,6,0,11,0 -7433,8,7,0,12,6,5,3,2,2,0,0,0,5,40,0 -7434,7,5,0,3,0,5,9,1,1,1,4,20,2,7,0 -7435,0,4,0,11,5,3,4,5,0,0,13,12,2,19,0 -7436,4,7,0,6,1,0,12,2,0,1,7,12,1,31,1 -7437,6,8,0,15,2,0,9,2,2,1,5,0,1,5,0 -7438,2,0,0,4,5,2,5,1,0,0,8,9,2,13,0 -7439,5,8,0,9,1,6,2,5,1,1,9,5,3,28,1 -7440,3,4,0,5,0,5,1,2,3,0,0,11,4,14,0 -7441,1,4,0,12,1,6,1,4,4,1,8,11,3,21,0 -7442,2,8,0,11,1,6,5,1,4,1,2,2,5,5,0 -7443,3,3,0,13,3,4,9,1,3,1,12,11,1,6,0 -7444,2,4,0,3,1,4,10,1,2,0,13,6,2,24,0 -7445,1,3,0,2,0,0,14,2,4,0,12,4,1,16,0 -7446,3,1,0,15,5,4,9,2,0,0,15,18,2,38,0 -7447,4,6,0,6,0,2,3,2,1,1,2,2,5,38,0 -7448,0,5,0,11,6,6,9,1,0,1,12,2,0,17,0 -7449,6,5,0,14,6,1,13,5,2,0,13,0,0,18,0 -7450,1,0,0,6,1,4,14,0,3,1,10,3,0,7,0 -7451,3,6,0,3,0,5,2,2,3,0,8,2,2,18,0 -7452,9,5,0,5,6,0,8,2,0,0,12,16,2,38,0 -7453,0,2,0,0,4,0,0,2,0,0,13,4,4,14,0 
-7454,6,4,0,12,6,5,12,3,1,0,4,2,0,24,0 -7455,1,7,0,5,4,3,1,2,2,0,8,11,2,20,0 -7456,2,8,0,9,5,5,2,4,2,0,8,11,0,14,0 -7457,3,2,0,7,6,1,9,4,4,0,17,18,2,6,0 -7458,3,1,0,11,5,5,14,3,0,0,2,6,1,19,0 -7459,0,4,0,3,3,6,2,0,2,1,14,20,3,28,0 -7460,0,4,0,11,6,4,10,1,1,0,2,2,5,13,0 -7461,6,6,0,2,4,0,0,2,4,0,8,2,0,10,0 -7462,8,8,0,7,6,0,2,5,4,0,18,4,1,13,1 -7463,1,0,0,0,0,2,3,2,4,0,8,0,0,6,0 -7464,5,0,0,2,5,5,11,3,1,0,15,6,4,5,0 -7465,0,3,0,7,6,6,12,4,0,0,10,11,1,25,0 -7466,5,1,0,1,1,5,3,0,0,0,1,2,1,8,0 -7467,9,6,0,6,1,3,0,0,2,1,11,4,2,37,0 -7468,2,2,0,2,0,4,5,3,4,1,12,15,2,12,0 -7469,0,0,0,1,1,5,5,4,1,0,13,11,0,35,0 -7470,2,6,0,15,0,6,9,4,2,1,8,2,5,14,0 -7471,2,0,0,3,2,3,6,2,2,0,5,9,1,18,0 -7472,1,2,0,12,0,3,9,4,2,1,17,6,1,20,0 -7473,10,8,0,6,4,4,7,3,4,0,8,6,4,6,0 -7474,0,8,0,3,2,4,0,3,0,0,10,15,4,38,0 -7475,6,2,0,5,4,3,4,3,0,1,5,7,5,7,1 -7476,1,2,0,11,1,6,5,4,0,1,13,2,5,28,0 -7477,6,3,0,11,5,3,2,0,2,1,13,18,0,39,0 -7478,4,0,0,13,5,4,7,1,0,0,8,20,3,40,0 -7479,10,3,0,0,1,5,9,0,2,1,13,13,0,6,0 -7480,8,3,0,13,2,4,9,2,1,0,6,15,1,35,0 -7481,5,8,0,1,4,4,1,1,1,1,2,6,5,26,0 -7482,2,0,0,14,6,4,3,2,0,1,13,3,1,41,0 -7483,10,8,0,12,5,0,6,5,0,1,17,16,5,27,0 -7484,7,7,0,0,0,4,0,0,1,1,13,9,5,7,0 -7485,3,2,0,14,6,5,8,1,0,0,4,2,3,6,0 -7486,9,4,0,1,0,6,7,1,4,0,13,13,4,26,0 -7487,5,5,0,13,3,6,1,4,2,0,6,13,4,28,0 -7488,3,7,0,6,6,5,1,2,2,0,6,11,1,27,0 -7489,3,1,0,3,4,5,0,4,2,1,7,4,0,12,1 -7490,3,7,0,6,0,5,2,0,2,0,2,8,3,27,0 -7491,6,4,0,4,6,1,4,3,0,1,12,3,0,1,0 -7492,0,6,0,14,5,0,5,0,2,0,10,8,0,37,0 -7493,7,1,0,15,2,1,4,4,1,1,11,5,2,21,1 -7494,10,3,0,9,0,5,3,1,0,0,2,15,2,12,0 -7495,6,4,0,14,4,2,1,1,1,1,1,13,4,19,1 -7496,3,2,0,13,4,3,13,1,0,0,2,2,4,0,0 -7497,7,6,0,13,0,4,8,3,2,1,12,15,1,24,0 -7498,4,0,0,15,6,0,1,3,1,0,1,17,0,24,1 -7499,0,1,0,13,0,5,6,0,3,1,14,18,1,11,0 -7500,2,6,0,9,6,0,12,4,4,1,2,3,3,21,0 -7501,2,0,0,10,4,0,5,4,3,1,2,13,3,11,0 -7502,1,2,0,2,1,5,1,1,1,0,8,9,0,22,0 -7503,6,3,0,3,5,0,7,3,3,0,13,20,2,8,0 -7504,3,0,0,9,3,0,7,5,4,1,5,10,0,20,1 -7505,6,1,0,12,3,1,8,3,4,1,10,16,5,40,1 
-7506,7,3,0,6,6,0,5,3,0,1,6,6,2,6,0 -7507,1,4,0,13,6,5,4,3,2,1,13,19,2,8,0 -7508,1,3,0,2,2,1,5,5,4,1,8,18,4,3,0 -7509,1,1,0,11,4,6,6,0,3,0,11,3,0,26,0 -7510,0,3,0,3,2,5,1,1,2,1,7,15,0,5,0 -7511,6,1,0,13,0,1,12,1,3,0,8,4,5,10,0 -7512,0,3,0,7,2,4,3,5,2,1,2,11,0,0,0 -7513,2,8,0,12,2,4,3,2,2,0,13,2,4,5,0 -7514,1,4,0,0,4,4,8,4,0,0,8,18,2,18,0 -7515,10,0,0,15,6,6,2,3,1,1,14,1,0,6,1 -7516,0,0,0,13,0,6,8,0,2,1,4,2,2,14,0 -7517,6,4,0,12,0,6,13,2,3,0,0,16,2,5,0 -7518,8,1,0,3,5,3,14,5,2,1,13,2,0,21,0 -7519,10,8,0,11,5,6,8,2,1,1,14,11,0,1,0 -7520,1,6,0,2,5,5,9,2,2,0,10,15,1,26,0 -7521,5,7,0,13,4,5,7,0,1,1,7,7,0,23,1 -7522,2,8,0,14,0,3,6,4,3,0,13,9,0,37,0 -7523,1,6,0,0,6,6,14,2,0,1,4,11,3,11,0 -7524,2,6,0,4,0,4,5,1,4,1,2,4,3,28,0 -7525,9,4,0,3,2,0,12,2,4,1,16,10,3,6,1 -7526,6,5,0,11,3,1,6,5,2,1,11,4,1,5,1 -7527,10,5,0,8,0,0,7,4,3,1,2,11,3,22,0 -7528,5,8,0,8,5,0,11,1,1,1,12,1,1,14,0 -7529,0,7,0,1,1,0,5,2,0,0,13,8,4,14,0 -7530,7,1,0,3,4,4,7,2,3,1,10,6,2,8,0 -7531,2,5,0,2,5,1,13,1,0,1,8,2,3,39,0 -7532,3,7,0,6,3,5,11,0,3,1,13,11,0,7,0 -7533,9,1,0,1,6,3,1,5,3,0,8,18,3,40,0 -7534,7,1,0,8,6,4,6,5,0,0,0,4,0,9,0 -7535,8,3,0,3,0,5,8,5,3,0,0,2,4,17,0 -7536,1,6,0,4,1,6,12,3,2,0,4,8,1,29,0 -7537,0,6,0,13,2,6,2,3,3,0,13,2,1,6,0 -7538,4,8,0,13,0,0,10,1,0,1,4,9,0,17,0 -7539,5,0,0,9,5,4,4,1,4,0,15,18,0,30,1 -7540,10,1,0,4,6,1,8,4,0,0,6,17,2,32,1 -7541,8,7,0,9,5,6,6,3,3,1,13,15,1,19,0 -7542,2,1,0,15,0,4,13,5,1,1,8,9,3,36,0 -7543,2,8,0,11,2,0,9,1,1,1,7,11,5,29,0 -7544,6,6,0,5,4,2,1,5,1,1,3,20,4,32,0 -7545,2,1,0,10,6,0,5,2,3,0,8,13,1,2,0 -7546,10,7,0,8,5,3,12,3,3,1,18,4,4,21,1 -7547,1,5,0,3,1,6,6,1,1,0,8,12,3,3,0 -7548,0,0,0,13,0,5,11,3,4,0,11,15,2,4,0 -7549,8,0,0,0,0,4,8,5,3,0,7,5,5,13,1 -7550,6,2,0,7,1,4,9,3,3,0,13,15,5,26,0 -7551,1,6,0,13,0,4,4,1,0,0,17,2,1,19,0 -7552,0,6,0,3,0,2,8,2,0,0,12,2,0,8,0 -7553,1,8,0,13,0,3,6,5,0,1,0,17,2,35,0 -7554,5,8,0,4,3,0,9,0,0,1,18,14,2,4,1 -7555,3,3,0,5,2,6,5,2,0,0,12,11,4,12,0 -7556,0,2,0,11,0,5,10,4,0,1,13,20,3,14,0 -7557,0,6,0,13,0,0,6,3,0,1,13,11,1,30,0 
-7558,5,4,0,11,0,3,4,5,3,1,2,1,5,22,0 -7559,2,1,0,11,2,1,2,1,3,1,18,3,5,32,1 -7560,4,5,0,4,4,2,0,5,4,1,16,14,4,36,1 -7561,7,6,0,5,4,2,13,1,3,1,18,12,2,30,1 -7562,5,6,0,13,1,2,12,2,0,0,10,6,1,16,0 -7563,0,6,0,3,1,4,10,5,2,0,2,15,3,12,0 -7564,2,2,0,10,4,4,0,0,0,0,17,16,1,40,0 -7565,2,4,0,4,2,2,9,4,1,1,0,2,5,27,0 -7566,10,3,0,3,3,6,5,3,4,0,5,7,3,5,1 -7567,10,3,0,3,2,5,14,3,4,0,12,20,1,24,0 -7568,10,4,0,10,6,6,9,3,1,1,3,19,0,35,1 -7569,1,4,0,13,1,6,9,2,0,0,0,2,5,17,0 -7570,1,7,0,6,1,5,13,2,0,1,2,0,0,5,0 -7571,9,6,0,15,3,5,12,3,3,1,13,2,4,6,0 -7572,0,6,0,11,0,3,8,4,3,1,10,17,3,34,0 -7573,4,1,0,4,3,3,6,3,1,1,10,2,2,29,0 -7574,6,2,0,5,6,1,14,4,3,1,18,19,4,38,1 -7575,2,4,0,1,5,0,3,3,1,1,18,7,5,16,1 -7576,0,2,0,13,5,3,6,0,0,0,8,11,2,0,0 -7577,3,8,0,13,2,5,11,3,4,1,13,11,1,5,0 -7578,10,1,0,10,6,1,0,2,1,0,13,17,2,11,1 -7579,0,6,0,13,1,3,12,2,3,1,4,6,3,3,0 -7580,2,6,0,8,0,2,0,2,2,1,8,11,0,5,0 -7581,0,6,0,3,0,6,9,3,0,0,16,15,0,36,0 -7582,1,0,0,12,5,4,6,5,4,0,17,2,2,10,0 -7583,3,8,0,14,1,2,5,2,3,1,4,2,1,19,0 -7584,7,1,0,15,4,2,5,3,4,1,9,5,3,17,1 -7585,0,4,0,1,4,0,6,3,2,1,2,11,5,36,0 -7586,9,2,0,0,5,0,11,3,3,0,4,14,3,30,0 -7587,3,3,0,15,2,4,3,0,1,0,7,3,3,6,1 -7588,3,4,0,4,5,6,9,5,1,1,7,13,1,9,1 -7589,10,3,0,15,3,1,12,2,1,1,16,6,1,37,0 -7590,7,5,0,0,1,2,11,1,1,1,3,7,5,6,1 -7591,9,0,0,10,3,5,12,3,4,1,10,15,0,3,0 -7592,3,2,0,12,0,4,6,4,3,0,8,0,3,10,0 -7593,1,4,0,15,2,3,2,0,4,0,4,0,0,33,0 -7594,2,3,0,1,0,2,11,1,3,0,17,4,3,41,0 -7595,1,8,0,9,6,1,2,0,0,1,8,11,3,34,0 -7596,7,5,0,0,2,0,8,3,2,0,11,2,4,4,0 -7597,4,0,0,13,1,2,7,4,0,0,2,13,4,37,0 -7598,0,1,0,4,5,5,9,1,2,1,3,1,2,0,0 -7599,1,6,0,6,0,5,0,2,0,1,13,12,2,6,0 -7600,1,7,0,1,5,3,4,1,1,0,17,18,1,7,0 -7601,0,0,0,15,2,4,5,4,3,1,0,6,1,19,0 -7602,1,3,0,0,0,5,3,5,2,1,13,18,0,37,0 -7603,9,5,0,11,2,5,7,3,3,0,6,2,4,35,0 -7604,6,6,0,14,3,6,4,2,0,1,18,5,3,38,1 -7605,4,7,0,4,3,5,4,0,1,1,13,3,5,4,0 -7606,0,5,0,3,6,5,12,2,4,0,2,6,2,29,0 -7607,0,3,0,11,3,5,6,2,0,1,13,11,4,40,0 -7608,1,4,0,5,0,0,12,2,4,1,5,20,5,33,0 
-7609,1,4,0,2,4,4,2,3,4,0,17,6,4,37,0 -7610,1,5,0,1,6,0,0,2,4,1,4,15,1,22,0 -7611,9,7,0,5,2,0,11,1,2,0,8,17,5,21,0 -7612,5,0,0,1,0,3,5,1,3,0,15,3,4,8,0 -7613,1,2,0,2,3,2,10,3,1,1,5,19,4,19,1 -7614,7,6,0,4,2,0,4,4,4,1,16,20,3,14,0 -7615,8,8,0,7,6,3,6,3,0,0,11,14,4,29,1 -7616,0,7,0,6,2,6,8,4,0,0,8,11,3,22,0 -7617,2,8,0,8,2,1,2,0,4,0,2,10,5,41,0 -7618,5,0,0,3,2,1,5,0,3,1,14,2,3,19,0 -7619,9,6,0,1,2,1,14,5,2,1,16,19,1,37,1 -7620,5,6,0,0,0,5,13,3,2,0,9,5,4,4,1 -7621,10,3,0,13,0,5,11,3,3,1,0,14,4,27,0 -7622,6,2,0,10,6,5,3,2,4,0,2,11,1,17,0 -7623,7,0,0,10,3,0,10,5,4,0,18,14,4,9,1 -7624,6,0,0,14,0,1,5,0,3,1,8,11,4,0,0 -7625,1,8,0,2,5,3,6,0,2,0,17,3,2,18,0 -7626,0,2,0,0,5,6,1,1,0,0,17,20,5,16,0 -7627,9,6,0,5,0,6,9,3,0,0,2,18,1,35,0 -7628,7,1,0,11,3,4,11,0,1,1,3,1,1,38,1 -7629,8,5,0,8,4,6,0,0,2,1,18,8,1,18,1 -7630,9,2,0,11,6,0,5,1,0,0,13,15,1,26,0 -7631,2,1,0,4,2,0,8,0,0,0,15,20,4,23,0 -7632,10,6,0,8,6,5,14,5,0,1,18,7,3,6,1 -7633,0,3,0,13,5,6,6,4,1,0,2,9,3,14,0 -7634,5,3,0,3,2,5,7,3,0,0,0,13,3,30,0 -7635,4,7,0,4,3,5,9,3,2,1,13,2,2,7,0 -7636,3,8,0,11,4,0,5,4,1,1,2,4,3,38,0 -7637,7,7,0,10,5,1,11,1,0,1,0,8,5,39,1 -7638,10,0,0,14,6,6,5,3,2,1,13,2,5,26,0 -7639,6,0,0,5,3,6,6,4,1,1,8,14,0,32,1 -7640,6,7,0,3,0,1,0,4,1,0,13,3,5,6,0 -7641,9,7,0,2,2,6,5,0,2,0,8,15,4,39,0 -7642,1,8,0,9,6,5,5,4,0,1,13,2,0,4,0 -7643,2,7,0,7,5,0,1,2,0,0,13,6,2,35,0 -7644,1,4,0,11,2,3,11,4,1,0,13,4,0,28,0 -7645,2,6,0,3,1,4,10,3,2,1,8,12,2,33,0 -7646,0,6,0,6,0,3,2,2,0,0,6,0,2,40,0 -7647,1,0,0,10,2,0,10,2,0,1,9,20,1,32,0 -7648,10,5,0,11,6,3,5,2,3,1,2,11,0,35,0 -7649,7,3,0,9,3,3,3,2,2,1,1,7,5,37,1 -7650,2,6,0,1,3,3,9,5,3,0,16,14,5,19,0 -7651,4,0,0,1,0,0,13,4,2,0,13,4,1,12,0 -7652,0,8,0,10,0,4,12,2,1,0,13,11,1,25,0 -7653,2,4,0,6,6,2,5,2,0,1,13,17,0,32,0 -7654,5,8,0,8,0,2,1,1,1,0,6,14,2,29,0 -7655,6,7,0,13,0,6,0,3,4,0,4,15,1,7,0 -7656,1,5,0,0,1,4,13,1,0,1,8,2,0,32,0 -7657,0,0,0,12,3,3,11,3,3,0,2,10,3,35,0 -7658,0,8,0,5,5,3,8,3,4,0,17,2,0,40,0 -7659,2,6,0,4,0,4,3,4,0,1,8,11,2,18,0 
-7660,0,7,0,11,2,6,11,5,0,1,0,6,2,22,0 -7661,9,3,0,3,1,6,1,4,3,1,2,20,0,31,0 -7662,8,8,0,14,6,2,5,0,4,1,8,18,2,27,0 -7663,3,8,0,13,6,4,12,1,1,1,13,4,3,37,0 -7664,7,8,0,1,1,3,13,0,2,0,14,10,2,20,1 -7665,9,8,0,3,6,0,6,0,0,0,4,11,2,13,0 -7666,0,3,0,6,2,5,6,2,3,1,13,2,1,25,0 -7667,4,4,0,5,2,5,13,3,1,0,2,19,3,7,0 -7668,7,0,0,10,6,1,14,4,2,0,7,15,2,40,0 -7669,1,3,0,3,6,3,1,1,3,1,4,0,1,6,0 -7670,2,6,0,8,2,6,10,1,3,1,2,2,0,4,0 -7671,5,7,0,13,5,6,3,0,2,0,8,2,0,35,0 -7672,0,8,0,5,0,4,10,2,2,0,8,15,3,36,0 -7673,7,4,0,8,4,0,14,2,4,0,5,20,4,9,1 -7674,3,0,0,0,3,0,6,1,2,1,13,15,1,19,0 -7675,1,5,0,10,5,6,9,1,1,0,4,18,5,4,0 -7676,6,1,0,12,5,0,4,0,0,0,4,17,4,20,1 -7677,3,8,0,10,0,6,3,2,3,0,4,5,4,38,0 -7678,7,4,0,13,6,2,5,1,4,0,7,8,3,5,1 -7679,2,1,0,4,4,0,14,5,2,0,14,1,1,6,1 -7680,8,0,0,15,0,1,10,2,4,1,10,17,4,31,1 -7681,9,4,0,7,0,3,9,3,3,0,10,15,1,21,0 -7682,10,1,0,2,2,1,13,5,4,0,9,14,4,41,1 -7683,10,7,0,2,6,1,3,0,4,0,11,7,1,18,1 -7684,2,2,0,11,2,1,0,1,4,0,13,15,5,35,0 -7685,4,4,0,11,1,3,0,0,1,1,8,4,1,35,0 -7686,6,5,0,1,4,6,12,3,0,1,18,8,0,4,1 -7687,0,7,0,6,1,3,3,1,1,0,13,14,1,34,0 -7688,10,3,0,13,4,4,1,1,4,1,18,4,3,34,1 -7689,7,6,0,6,5,5,9,1,1,1,9,18,5,21,1 -7690,2,8,0,4,4,4,4,0,4,0,12,5,1,8,0 -7691,7,7,0,12,1,6,4,5,4,1,10,0,1,21,0 -7692,0,5,0,11,1,4,11,2,0,0,0,2,0,32,0 -7693,7,3,0,9,4,2,8,5,1,1,7,5,2,8,1 -7694,3,4,0,7,0,6,7,0,2,1,0,19,4,8,1 -7695,4,7,0,5,1,5,14,1,0,0,13,20,0,7,0 -7696,2,7,0,8,6,2,6,3,2,0,13,2,1,29,0 -7697,0,6,0,4,2,0,7,1,0,1,17,17,2,20,0 -7698,9,8,0,1,5,0,8,1,4,1,2,0,1,9,0 -7699,10,1,0,8,4,5,11,5,4,1,12,18,2,38,1 -7700,1,3,0,0,2,4,9,4,3,1,7,2,3,24,0 -7701,2,6,0,0,0,1,1,5,0,1,13,6,3,30,0 -7702,2,8,0,0,4,6,6,3,3,1,15,2,2,37,0 -7703,5,7,0,12,3,0,3,0,4,1,17,12,3,27,1 -7704,9,3,0,15,1,2,4,4,2,1,8,11,0,14,0 -7705,0,6,0,1,6,0,0,3,4,1,8,0,1,3,0 -7706,8,8,0,8,6,5,6,2,3,1,17,13,1,19,0 -7707,8,4,0,7,0,4,3,2,0,0,13,9,2,21,0 -7708,2,6,0,5,0,6,11,4,1,1,0,9,0,40,0 -7709,9,1,0,9,6,5,14,1,0,1,4,6,2,12,0 -7710,4,8,0,13,0,4,2,2,1,0,4,2,0,24,0 -7711,5,7,0,9,1,0,14,0,2,0,4,2,3,17,0 
-7712,0,4,0,0,2,2,8,4,2,1,8,13,1,12,0 -7713,2,3,0,9,0,6,14,4,3,0,2,2,2,6,0 -7714,0,6,0,8,0,0,9,4,1,0,13,11,4,12,0 -7715,2,7,0,13,0,6,5,1,2,0,2,6,2,24,0 -7716,2,6,0,1,2,6,7,4,0,1,6,6,0,3,0 -7717,5,6,0,6,3,2,9,4,2,1,16,8,5,37,1 -7718,2,2,0,11,6,5,0,3,1,1,6,11,4,25,0 -7719,0,7,0,5,6,6,9,0,3,1,10,20,0,21,0 -7720,10,0,0,7,3,2,12,5,2,1,5,7,3,21,1 -7721,2,7,0,13,5,5,5,3,0,1,13,16,2,11,0 -7722,3,6,0,7,6,5,13,2,4,0,15,2,0,36,0 -7723,2,6,0,3,4,2,8,3,3,0,8,18,5,31,0 -7724,5,3,0,3,1,3,4,4,4,1,3,5,3,27,1 -7725,3,6,0,11,6,3,10,2,1,0,0,9,0,7,0 -7726,3,7,0,10,3,6,6,1,2,1,10,13,2,21,0 -7727,10,8,0,5,0,6,0,4,1,0,11,0,1,12,0 -7728,1,5,0,2,2,5,4,0,3,1,18,7,2,35,1 -7729,1,5,0,5,2,2,5,3,2,0,13,20,3,35,0 -7730,2,3,0,14,0,0,8,1,0,0,10,9,1,35,0 -7731,0,2,0,4,2,4,11,3,2,0,13,15,5,33,0 -7732,3,3,0,5,1,2,0,5,3,0,11,9,3,11,0 -7733,2,2,0,6,1,6,8,2,0,0,12,11,4,35,0 -7734,3,3,0,7,0,0,7,5,0,1,6,0,0,18,0 -7735,8,7,0,7,2,2,4,0,0,1,5,8,4,36,1 -7736,8,7,0,2,3,6,0,1,4,1,11,19,4,5,1 -7737,1,5,0,15,1,2,14,1,3,1,1,17,3,6,1 -7738,6,2,0,15,4,0,8,3,1,0,15,2,4,13,0 -7739,2,3,0,3,5,6,7,3,4,1,10,2,1,7,0 -7740,8,8,0,2,0,4,13,3,0,0,15,3,3,33,0 -7741,1,1,0,15,6,4,5,0,2,0,2,15,5,38,0 -7742,5,0,0,1,0,2,14,0,0,1,1,18,3,0,0 -7743,7,7,0,13,6,0,0,3,3,0,13,20,0,20,0 -7744,0,5,0,12,4,0,13,0,0,0,4,20,1,9,1 -7745,3,3,0,11,6,5,13,3,2,1,8,20,0,19,0 -7746,3,6,0,5,6,3,5,1,2,0,2,16,5,28,0 -7747,1,8,0,14,1,4,14,1,1,1,15,13,0,34,0 -7748,10,7,0,5,3,4,7,0,2,0,17,6,1,12,0 -7749,3,3,0,6,0,2,9,1,3,0,4,11,5,37,0 -7750,9,5,0,8,1,0,10,2,1,1,18,4,2,21,1 -7751,1,7,0,3,6,5,12,1,0,0,15,5,3,29,0 -7752,1,0,0,11,3,4,1,4,3,0,13,2,5,9,0 -7753,8,0,0,15,2,1,1,5,4,1,3,7,2,35,1 -7754,6,2,0,11,4,3,11,3,2,1,9,1,1,41,1 -7755,9,6,0,0,5,0,8,0,2,0,14,15,2,13,0 -7756,8,8,0,7,2,0,0,1,4,1,18,4,4,24,1 -7757,5,1,0,0,5,5,4,5,4,1,2,9,0,39,0 -7758,8,1,0,3,6,2,14,5,2,1,14,8,5,9,1 -7759,0,3,0,7,2,4,5,4,3,0,2,9,1,4,0 -7760,2,2,0,4,1,5,11,4,2,0,8,0,5,7,0 -7761,10,7,0,6,2,3,12,0,0,0,11,2,0,3,0 -7762,6,7,0,8,5,2,2,5,0,1,9,10,3,37,1 -7763,9,1,0,12,3,0,9,0,2,0,3,15,2,6,0 
-7764,2,1,0,8,5,6,12,5,0,1,2,12,3,18,0 -7765,5,5,0,2,3,5,3,5,4,0,9,7,5,19,1 -7766,0,4,0,15,5,4,9,4,2,1,5,13,5,34,0 -7767,3,7,0,7,6,1,9,1,0,0,16,11,4,21,0 -7768,6,0,0,11,6,1,4,2,4,1,9,12,4,36,1 -7769,7,7,0,15,0,1,14,3,1,0,18,14,3,8,1 -7770,3,2,0,13,5,3,5,0,4,1,17,3,0,37,0 -7771,2,0,0,2,0,4,5,3,2,1,17,9,0,34,0 -7772,2,5,0,12,0,3,14,2,4,1,18,0,2,19,1 -7773,10,8,0,9,0,3,8,4,4,0,2,13,1,30,0 -7774,2,0,0,6,5,5,12,4,0,0,13,11,0,2,0 -7775,1,2,0,4,1,1,7,2,0,1,17,16,0,3,0 -7776,3,3,0,13,5,4,3,3,0,0,8,0,2,18,0 -7777,3,8,0,12,4,5,10,1,4,0,10,5,2,3,0 -7778,4,8,0,13,1,1,4,4,1,0,10,18,3,30,0 -7779,3,6,0,10,4,0,5,4,0,0,17,6,0,32,0 -7780,1,0,0,11,0,4,4,3,1,1,13,2,5,40,0 -7781,4,6,0,5,0,5,10,2,2,0,3,15,2,18,0 -7782,9,4,0,10,1,2,8,2,1,0,15,5,0,7,0 -7783,0,2,0,3,2,5,5,3,0,0,6,11,1,28,0 -7784,2,0,0,6,6,2,0,4,1,1,10,16,0,27,0 -7785,9,7,0,0,4,6,14,2,4,0,5,19,3,7,1 -7786,1,6,0,1,4,0,6,0,0,0,2,9,4,30,0 -7787,5,1,0,5,0,6,12,5,1,0,8,6,1,26,0 -7788,9,7,0,1,0,5,0,0,4,0,10,2,3,24,0 -7789,7,3,0,10,0,2,5,5,2,1,5,5,5,25,1 -7790,9,3,0,1,1,6,9,2,3,0,8,11,2,12,0 -7791,4,1,0,2,3,2,10,2,2,1,11,1,5,19,1 -7792,2,6,0,9,0,5,0,1,0,0,10,2,1,20,0 -7793,1,2,0,9,0,6,12,4,0,1,17,15,3,38,0 -7794,1,8,0,13,5,1,9,4,4,0,1,7,3,34,0 -7795,7,8,0,7,3,6,1,3,2,1,1,8,4,33,1 -7796,3,6,0,13,2,0,1,5,0,1,2,5,2,0,0 -7797,8,7,0,13,5,6,5,1,4,0,18,2,2,1,0 -7798,3,1,0,6,0,4,9,2,2,0,13,4,2,19,0 -7799,2,4,0,8,0,5,2,4,1,0,4,2,2,38,0 -7800,2,3,0,0,4,5,3,1,3,0,7,18,2,11,1 -7801,1,1,0,5,4,0,14,1,2,0,13,18,1,20,0 -7802,0,1,0,6,6,0,7,1,4,1,4,16,2,11,0 -7803,5,3,0,11,5,0,7,3,1,0,13,12,4,20,0 -7804,0,2,0,12,2,4,0,4,1,1,15,18,1,20,0 -7805,8,2,0,0,0,2,8,4,3,0,2,2,3,31,0 -7806,10,3,0,0,6,0,10,1,3,1,2,6,0,1,0 -7807,0,3,0,8,5,3,6,5,1,1,15,2,0,28,0 -7808,5,8,0,0,4,1,4,5,1,1,16,17,5,3,1 -7809,0,1,0,5,1,1,6,3,3,1,2,2,4,30,0 -7810,1,6,0,13,1,3,5,3,3,0,10,20,5,9,0 -7811,1,6,0,10,6,2,3,0,2,1,18,8,2,27,1 -7812,5,3,0,5,3,5,7,0,0,0,13,19,0,16,0 -7813,3,8,0,15,5,3,8,4,0,1,2,15,0,20,0 -7814,7,2,0,15,2,6,1,3,4,0,13,15,2,38,0 
-7815,5,7,0,1,1,0,8,2,4,0,2,17,3,38,0 -7816,2,3,0,2,5,6,2,3,2,1,12,2,3,31,0 -7817,10,1,0,0,4,3,6,2,3,0,11,6,0,20,0 -7818,1,2,0,13,2,3,8,1,3,1,8,14,0,31,0 -7819,1,3,0,1,0,4,0,1,1,1,8,2,0,31,0 -7820,0,0,0,0,0,0,11,3,0,0,4,2,2,26,0 -7821,7,4,0,9,4,4,7,5,4,1,18,7,1,41,1 -7822,7,2,0,6,4,5,14,0,3,0,1,9,1,29,1 -7823,10,1,0,9,3,5,10,3,0,0,13,11,5,1,0 -7824,8,5,0,8,0,3,7,1,4,0,13,11,5,10,0 -7825,3,3,0,1,2,1,10,5,1,0,6,3,4,41,1 -7826,2,6,0,4,2,5,0,4,1,1,17,12,1,8,0 -7827,3,3,0,11,0,0,1,1,2,0,8,16,0,9,0 -7828,4,4,0,10,5,4,9,5,0,0,5,7,3,18,1 -7829,8,1,0,6,0,5,5,1,0,1,10,6,0,13,0 -7830,6,3,0,2,0,4,2,2,4,0,17,1,2,22,0 -7831,0,6,0,7,2,4,12,0,3,1,0,9,0,36,0 -7832,10,6,0,14,5,4,9,3,0,1,14,3,1,19,0 -7833,6,7,0,8,3,2,12,0,0,0,16,17,4,11,1 -7834,6,6,0,11,3,1,0,3,3,0,2,2,2,17,0 -7835,0,5,0,13,4,4,3,2,3,1,13,2,0,37,0 -7836,1,1,0,3,2,1,0,5,1,0,12,11,3,19,1 -7837,0,0,0,5,1,6,2,2,4,1,15,0,2,9,0 -7838,10,5,0,14,1,0,9,1,0,0,17,15,1,25,0 -7839,10,0,0,4,0,2,10,4,3,1,0,4,3,4,0 -7840,2,5,0,13,0,2,5,4,3,0,4,0,2,16,0 -7841,6,7,0,8,5,0,5,0,1,0,5,12,1,14,1 -7842,2,5,0,2,0,0,11,3,0,0,13,18,2,16,0 -7843,4,8,0,10,1,0,10,2,1,1,8,18,1,2,0 -7844,9,8,0,5,6,0,3,1,0,1,13,20,1,41,0 -7845,0,5,0,13,3,2,6,5,3,0,13,9,4,33,0 -7846,2,4,0,0,1,6,2,4,2,1,2,3,1,32,0 -7847,1,1,0,2,3,1,13,5,3,0,5,17,1,13,1 -7848,3,7,0,13,1,0,12,5,3,1,13,18,3,18,0 -7849,10,4,0,4,4,4,8,2,1,1,2,17,0,11,0 -7850,5,3,0,14,0,0,6,3,1,1,2,6,4,5,0 -7851,9,0,0,9,3,4,6,0,0,1,15,3,3,34,1 -7852,8,5,0,15,4,0,13,3,0,1,8,2,0,21,1 -7853,3,7,0,13,6,6,8,1,2,1,2,18,0,18,0 -7854,10,8,0,11,4,5,9,4,0,1,13,2,1,12,0 -7855,4,3,0,13,1,5,3,3,0,1,2,10,3,9,0 -7856,8,1,0,0,6,2,3,0,0,1,9,17,4,3,1 -7857,0,7,0,7,4,3,7,4,2,0,13,13,4,39,0 -7858,8,3,0,11,5,1,4,3,0,1,17,18,1,1,0 -7859,6,4,0,12,5,5,0,2,0,1,13,11,0,31,0 -7860,2,8,0,1,0,4,3,3,3,1,17,10,3,26,0 -7861,3,4,0,15,4,1,6,4,4,1,13,14,5,38,0 -7862,4,6,0,5,3,5,13,3,2,0,16,13,4,6,0 -7863,0,3,0,7,3,4,8,1,3,0,8,13,4,32,0 -7864,2,8,0,8,2,3,13,3,0,0,2,20,4,29,0 -7865,1,1,0,13,6,5,9,0,0,0,0,11,4,12,0 
-7866,0,4,0,0,4,4,0,2,3,1,0,6,4,26,0 -7867,5,8,0,11,0,2,14,2,0,1,10,6,0,13,0 -7868,2,5,0,7,4,3,1,5,1,0,9,14,2,32,1 -7869,0,5,0,12,6,4,7,1,0,0,2,10,0,23,0 -7870,2,7,0,5,1,0,9,4,3,1,4,2,2,29,0 -7871,10,6,0,14,2,5,5,3,0,0,15,6,3,27,0 -7872,3,7,0,9,6,0,13,1,1,1,2,19,3,27,0 -7873,7,0,0,2,4,4,6,2,3,1,2,6,3,37,0 -7874,4,7,0,10,4,0,0,0,4,1,2,0,0,27,0 -7875,10,4,0,3,5,6,5,0,2,1,2,11,3,14,0 -7876,1,8,0,2,0,6,14,3,2,0,17,13,0,11,0 -7877,0,3,0,15,1,4,14,1,4,0,6,1,3,38,0 -7878,6,6,0,8,1,2,12,1,3,1,2,4,5,39,0 -7879,0,4,0,13,1,2,10,3,0,1,13,0,0,7,0 -7880,8,0,0,14,1,2,4,3,2,0,14,7,3,40,1 -7881,1,0,0,3,2,0,2,2,4,0,10,9,0,18,0 -7882,2,7,0,15,4,5,6,2,0,1,10,15,1,7,0 -7883,7,0,0,9,2,5,1,0,1,1,18,13,5,37,1 -7884,7,8,0,9,6,0,1,5,0,1,18,7,3,26,1 -7885,7,7,0,12,6,6,1,3,4,1,18,5,4,40,1 -7886,6,2,0,4,6,4,8,2,2,1,13,13,0,6,0 -7887,10,5,0,10,1,2,11,4,3,1,13,14,4,19,0 -7888,1,1,0,3,2,3,6,4,1,1,18,1,4,23,1 -7889,9,7,0,8,0,4,10,5,2,1,11,3,3,38,1 -7890,7,8,0,10,0,2,10,5,0,1,0,7,2,40,1 -7891,5,3,0,2,1,5,0,1,0,0,4,15,0,32,0 -7892,2,4,0,13,2,0,1,5,2,1,13,9,4,16,0 -7893,1,8,0,1,4,1,11,0,4,1,2,6,2,4,0 -7894,3,6,0,2,2,4,5,3,2,0,13,6,4,23,0 -7895,2,2,0,5,2,1,11,4,1,0,2,2,2,29,0 -7896,9,7,0,4,5,2,6,1,3,1,13,0,4,41,0 -7897,8,1,0,15,1,0,0,5,4,1,9,5,1,25,1 -7898,10,5,0,7,2,5,5,3,0,1,4,20,4,36,0 -7899,9,8,0,10,3,0,8,0,3,0,2,13,1,28,0 -7900,7,8,0,10,3,1,9,4,1,1,5,1,5,40,1 -7901,2,7,0,11,5,3,6,3,1,0,13,13,2,24,0 -7902,0,4,0,15,3,5,1,0,1,0,3,8,0,37,1 -7903,2,2,0,14,5,0,14,0,2,1,5,15,5,0,0 -7904,10,1,0,7,6,5,5,5,3,1,16,16,2,12,0 -7905,0,8,0,4,4,6,5,4,2,0,15,15,0,40,0 -7906,3,3,0,11,2,1,14,2,2,0,13,2,0,14,0 -7907,2,7,0,11,6,6,11,4,1,1,10,11,0,37,0 -7908,3,6,0,8,2,6,4,3,4,0,2,11,0,4,0 -7909,7,7,0,8,2,4,5,5,0,1,0,2,1,29,0 -7910,2,5,0,13,0,3,8,5,0,1,0,0,1,11,0 -7911,6,0,0,10,6,0,13,4,4,0,4,17,4,30,1 -7912,9,6,0,12,0,5,14,0,0,1,6,11,2,11,0 -7913,1,4,0,0,1,4,7,2,1,0,0,2,0,39,0 -7914,0,1,0,4,6,4,11,0,1,0,13,20,1,16,0 -7915,0,2,0,9,3,0,8,2,0,0,5,2,5,25,0 -7916,1,8,0,11,6,0,12,3,4,0,2,16,4,11,0 
-7917,6,7,0,15,5,3,8,2,0,1,15,18,5,25,0 -7918,8,8,0,14,6,6,6,0,3,1,1,15,3,31,1 -7919,4,1,0,11,6,1,5,4,0,1,9,17,5,5,1 -7920,10,7,0,3,5,5,11,2,2,1,7,7,5,40,1 -7921,3,8,0,13,1,0,3,5,2,1,6,1,5,9,0 -7922,0,6,0,12,3,5,10,1,1,0,5,13,5,3,0 -7923,9,1,0,4,5,4,0,3,2,1,13,2,5,4,0 -7924,4,6,0,5,2,1,0,1,3,1,2,11,2,3,0 -7925,0,1,0,12,6,5,5,5,3,0,13,9,2,9,0 -7926,7,1,0,15,4,6,4,4,1,0,1,8,2,21,1 -7927,7,0,0,0,3,6,8,3,2,0,8,11,5,30,0 -7928,4,1,0,7,0,5,5,3,0,1,4,13,1,12,0 -7929,5,3,0,1,5,1,9,5,1,1,13,9,1,2,0 -7930,10,1,0,13,5,4,12,4,3,1,13,2,3,20,0 -7931,4,4,0,1,6,1,11,5,3,0,18,5,4,8,1 -7932,0,4,0,9,1,3,1,2,2,0,13,10,4,6,0 -7933,4,1,0,14,1,2,1,1,1,1,16,1,3,6,1 -7934,7,4,0,5,5,5,6,3,3,0,2,0,2,21,0 -7935,8,4,0,6,2,5,7,2,3,0,0,2,2,4,0 -7936,0,1,0,8,6,2,11,4,0,1,9,16,2,38,1 -7937,0,5,0,7,5,4,8,1,3,0,17,0,1,0,0 -7938,9,8,0,13,3,3,8,0,4,1,13,6,5,20,0 -7939,3,1,0,13,2,5,8,1,4,1,11,15,1,29,0 -7940,2,6,0,2,0,6,14,3,0,1,7,12,2,27,0 -7941,0,5,0,1,3,1,6,3,0,0,2,17,2,38,0 -7942,7,8,0,13,4,4,2,3,4,1,18,6,0,9,0 -7943,7,7,0,8,6,6,12,5,4,0,1,7,5,12,1 -7944,2,8,0,6,5,4,3,3,0,1,13,11,3,24,0 -7945,0,6,0,14,1,2,9,4,1,0,12,3,3,33,0 -7946,0,7,0,13,4,6,4,2,0,1,12,2,2,31,0 -7947,0,2,0,7,0,6,2,3,2,0,16,2,0,12,0 -7948,4,5,0,15,0,0,10,1,2,0,7,11,1,10,0 -7949,0,3,0,12,1,4,12,2,0,0,13,11,5,18,0 -7950,0,5,0,13,1,0,9,5,3,0,4,4,5,26,0 -7951,2,8,0,3,0,3,6,0,2,1,12,4,0,35,0 -7952,7,8,0,7,1,5,9,0,2,0,4,2,3,14,0 -7953,6,1,0,2,2,5,12,1,1,1,9,1,1,28,1 -7954,0,5,0,5,2,4,1,4,4,1,2,11,5,22,0 -7955,9,8,0,3,1,5,3,2,0,0,2,9,4,26,0 -7956,1,4,0,6,0,1,14,3,4,1,13,9,5,6,0 -7957,0,7,0,9,2,2,14,2,4,1,0,2,5,3,0 -7958,10,2,0,2,3,6,8,4,1,1,13,2,3,21,0 -7959,5,4,0,11,0,0,10,1,4,0,4,20,4,41,0 -7960,1,2,0,7,1,6,12,0,4,1,4,2,2,4,0 -7961,0,1,0,6,1,3,6,4,1,1,2,15,3,21,0 -7962,3,3,0,5,5,6,5,1,4,0,10,11,0,40,0 -7963,0,7,0,2,2,1,1,5,4,1,1,2,1,21,0 -7964,0,5,0,5,4,6,5,5,1,0,0,0,0,3,0 -7965,9,7,0,10,4,6,7,0,4,1,18,6,0,20,1 -7966,3,4,0,12,6,6,4,3,4,0,2,12,3,21,0 -7967,10,0,0,15,0,2,5,5,0,1,0,13,0,10,0 -7968,8,4,0,3,3,3,11,2,0,0,17,9,4,0,0 
-7969,0,2,0,0,5,5,11,5,1,1,2,11,5,12,0 -7970,0,0,0,2,6,0,14,1,1,0,17,11,3,41,0 -7971,3,1,0,2,3,0,11,5,1,1,11,17,2,21,1 -7972,0,6,0,3,0,3,3,2,1,0,16,6,4,36,0 -7973,7,4,0,12,1,4,6,0,4,1,6,4,5,23,1 -7974,0,6,0,3,1,6,14,4,0,1,4,6,0,22,0 -7975,2,1,0,4,1,2,6,0,0,1,18,1,2,0,1 -7976,10,4,0,0,0,6,0,2,3,1,8,18,4,33,0 -7977,0,8,0,8,5,5,8,1,4,1,8,9,0,12,0 -7978,6,0,0,13,1,4,10,4,3,1,2,2,4,27,0 -7979,4,6,0,8,6,0,9,2,4,0,14,18,0,23,0 -7980,1,4,0,15,0,3,5,3,4,1,9,9,5,18,0 -7981,2,5,0,6,0,5,14,3,2,1,4,15,0,27,0 -7982,5,6,0,5,0,4,3,1,3,0,6,2,2,10,0 -7983,3,3,0,15,0,6,8,3,1,0,17,2,5,20,0 -7984,2,8,0,3,0,0,14,1,1,1,2,11,0,29,0 -7985,0,3,0,15,5,6,8,5,1,1,13,2,1,14,0 -7986,8,4,0,0,2,6,5,5,2,1,12,12,3,4,1 -7987,3,7,0,13,6,3,5,3,1,1,6,20,0,14,0 -7988,6,4,0,12,2,5,6,3,0,1,3,3,1,28,0 -7989,2,6,0,11,2,1,8,4,0,1,2,6,3,28,0 -7990,0,3,0,8,2,5,3,4,4,0,10,15,4,13,0 -7991,4,6,0,10,5,2,8,5,0,1,12,8,2,1,1 -7992,3,0,0,0,1,1,6,3,0,0,18,16,0,18,1 -7993,8,0,0,10,2,1,2,5,0,1,18,7,4,37,1 -7994,6,5,0,13,0,4,4,2,1,0,16,2,2,0,0 -7995,9,1,0,10,0,5,5,3,4,1,2,2,4,34,0 -7996,0,6,0,5,4,2,8,3,3,0,6,11,0,11,0 -7997,10,1,0,2,5,1,0,3,2,0,5,5,2,1,1 -7998,9,0,0,4,4,5,12,4,0,1,16,10,2,36,1 -7999,0,6,0,1,3,2,0,1,2,1,18,10,3,11,1 -8000,7,3,0,8,0,4,4,1,1,0,17,0,0,7,0 -8001,0,7,0,4,0,3,0,0,1,0,1,5,2,29,0 -8002,8,2,0,12,6,2,10,0,2,1,5,19,1,8,1 -8003,2,4,0,9,0,3,10,0,3,0,13,1,4,17,0 -8004,2,6,0,5,6,3,5,1,3,0,17,18,1,33,0 -8005,4,5,0,8,2,5,7,4,1,0,13,15,3,22,0 -8006,0,7,0,2,3,5,5,4,0,1,3,11,5,33,0 -8007,2,3,0,14,0,3,5,2,0,1,8,13,4,7,0 -8008,10,7,0,10,0,4,0,3,0,1,13,17,0,19,0 -8009,1,5,0,13,4,0,1,3,0,0,12,13,2,2,0 -8010,6,8,0,13,1,4,11,4,1,1,12,1,0,3,0 -8011,3,6,0,13,2,0,3,4,1,0,5,3,1,11,0 -8012,3,8,0,4,0,0,5,1,3,1,2,11,4,6,0 -8013,0,8,0,11,0,4,0,3,1,1,2,13,5,32,0 -8014,3,7,0,7,2,4,6,1,0,0,14,18,0,6,0 -8015,2,7,0,12,6,5,5,2,3,0,10,2,1,37,0 -8016,7,7,0,1,1,6,8,4,0,1,10,6,0,3,0 -8017,9,3,0,3,1,0,6,3,3,0,13,11,1,22,0 -8018,9,7,0,4,4,5,5,3,2,0,13,2,2,4,0 -8019,2,7,0,2,0,0,3,1,3,1,8,2,1,25,0 -8020,3,2,0,15,3,1,9,1,2,0,4,12,0,26,1 
-8021,10,4,0,4,6,5,11,5,2,1,11,10,2,40,1 -8022,6,0,0,0,6,0,9,0,0,0,10,9,5,38,0 -8023,5,6,0,14,5,3,10,5,4,1,5,10,3,13,1 -8024,5,1,0,8,5,0,13,1,3,0,13,11,2,11,0 -8025,7,0,0,11,1,6,12,5,2,1,5,1,4,6,1 -8026,7,8,0,3,1,6,4,4,0,1,8,5,1,24,1 -8027,10,3,0,10,1,5,8,3,0,1,13,2,4,7,0 -8028,0,8,0,13,4,0,7,4,2,0,13,15,4,36,0 -8029,9,7,0,8,0,2,9,5,0,0,13,20,4,2,0 -8030,6,4,0,15,3,6,3,2,3,1,8,16,3,23,0 -8031,9,7,0,14,1,5,14,2,4,1,3,10,4,33,1 -8032,0,2,0,3,0,4,1,1,3,1,13,11,5,6,0 -8033,6,6,0,0,0,2,1,3,4,0,13,2,1,0,0 -8034,10,6,0,3,4,4,14,1,0,0,17,5,4,16,0 -8035,7,8,0,1,0,4,4,4,2,1,10,15,0,25,0 -8036,1,7,0,14,2,3,8,4,3,0,13,11,3,27,0 -8037,1,7,0,6,4,6,8,1,1,1,18,13,3,18,1 -8038,0,0,0,13,6,4,7,3,4,1,6,4,0,37,0 -8039,1,1,0,4,0,4,9,2,0,1,17,14,5,5,0 -8040,1,8,0,14,1,3,8,2,2,1,15,20,2,7,0 -8041,7,7,0,12,1,0,13,2,3,1,17,14,3,11,0 -8042,2,5,0,8,0,6,14,1,1,0,13,11,1,14,0 -8043,1,7,0,12,0,4,9,0,2,1,2,16,3,19,0 -8044,6,5,0,15,0,3,8,4,2,0,13,14,2,26,0 -8045,0,8,0,2,0,0,5,3,1,0,10,6,5,38,0 -8046,6,6,0,5,4,1,9,1,1,0,12,20,4,6,0 -8047,7,1,0,5,1,1,8,5,3,0,18,7,3,19,1 -8048,2,4,0,15,6,6,7,1,2,1,8,11,3,22,0 -8049,2,7,0,5,4,0,2,2,2,0,4,2,4,33,0 -8050,6,4,0,9,3,0,5,2,1,1,15,19,4,32,1 -8051,0,0,0,2,0,1,3,0,0,1,1,0,5,6,0 -8052,0,8,0,5,2,3,9,3,3,0,13,16,5,33,0 -8053,6,2,0,10,4,4,4,1,3,1,11,9,1,18,0 -8054,3,8,0,1,5,4,11,3,0,1,13,11,2,1,0 -8055,0,6,0,6,0,3,3,2,3,1,17,2,0,40,0 -8056,1,8,0,3,3,6,8,3,3,1,15,2,0,6,0 -8057,6,7,0,4,0,0,11,1,3,0,10,0,0,18,0 -8058,0,0,0,5,5,5,3,0,2,1,13,17,4,28,0 -8059,2,6,0,5,1,5,6,3,3,0,13,11,0,19,0 -8060,9,6,0,2,0,3,7,3,3,1,2,2,5,7,0 -8061,10,4,0,6,0,4,3,1,2,0,13,2,0,9,0 -8062,3,8,0,13,4,3,12,4,4,1,0,11,0,41,0 -8063,9,1,0,5,1,5,11,1,2,0,0,18,2,32,0 -8064,3,0,0,14,0,1,7,0,2,1,14,11,3,36,0 -8065,5,8,0,7,1,0,14,1,1,0,8,5,4,4,0 -8066,4,6,0,3,1,6,5,2,1,0,4,14,1,33,0 -8067,2,1,0,1,3,5,13,2,1,0,2,11,0,21,0 -8068,1,8,0,9,1,0,3,4,2,1,17,2,4,24,0 -8069,5,7,0,9,1,0,6,5,0,1,11,12,3,38,0 -8070,7,7,0,15,6,1,9,2,2,0,3,2,1,31,0 -8071,7,5,0,10,1,2,13,5,0,1,5,7,1,32,1 
-8072,6,2,0,10,4,1,3,4,2,1,9,14,5,7,1 -8073,6,0,0,8,6,0,9,2,0,1,15,15,2,6,0 -8074,5,0,0,7,4,4,5,1,0,0,13,2,0,40,0 -8075,2,0,0,5,1,0,7,3,0,1,8,0,5,16,0 -8076,3,3,0,6,3,5,0,1,4,0,10,6,3,30,0 -8077,5,8,0,11,2,4,1,2,2,0,11,2,0,21,0 -8078,8,6,0,0,5,4,11,5,4,1,17,1,0,38,0 -8079,10,2,0,11,5,0,13,3,0,1,17,11,5,38,0 -8080,8,3,0,13,2,4,9,4,4,1,16,2,2,19,0 -8081,2,1,0,15,3,0,2,1,4,1,13,9,0,29,0 -8082,7,0,0,14,3,5,14,5,2,1,16,11,0,17,0 -8083,5,3,0,6,0,2,11,5,1,0,7,5,3,12,1 -8084,10,4,0,11,6,0,5,0,4,1,16,5,5,23,1 -8085,10,7,0,6,4,4,9,0,2,0,11,5,4,23,1 -8086,7,2,0,10,3,0,9,0,1,0,14,19,4,0,1 -8087,7,7,0,10,0,4,8,2,0,0,6,20,1,17,0 -8088,10,6,0,4,6,3,0,1,2,1,5,19,5,33,1 -8089,0,5,0,1,5,4,10,4,2,1,13,11,3,40,0 -8090,10,5,0,13,1,4,9,0,2,0,13,0,5,19,0 -8091,5,3,0,11,6,4,0,1,2,0,2,18,4,27,0 -8092,2,1,0,13,2,1,9,3,2,0,4,12,1,38,0 -8093,8,7,0,13,3,5,6,2,0,0,2,6,4,25,0 -8094,0,5,0,0,6,0,0,5,2,1,7,3,1,39,0 -8095,0,6,0,12,6,4,8,2,0,1,10,9,4,29,0 -8096,2,0,0,3,4,5,7,1,0,1,2,11,1,25,0 -8097,8,7,0,15,1,5,14,0,0,0,7,10,3,17,1 -8098,7,5,0,10,6,1,6,1,1,1,6,11,5,18,1 -8099,6,8,0,5,4,5,14,4,4,0,2,13,0,27,0 -8100,6,6,0,5,0,5,6,0,0,1,4,2,4,29,0 -8101,3,4,0,8,1,4,5,1,3,1,17,6,4,35,0 -8102,4,4,0,12,6,4,14,2,1,1,8,7,2,4,0 -8103,2,8,0,15,5,0,9,5,0,1,13,13,4,4,0 -8104,4,8,0,5,0,3,8,5,2,1,10,13,4,19,0 -8105,3,5,0,15,4,1,11,5,1,0,11,4,3,32,1 -8106,9,7,0,13,1,0,4,0,0,0,6,11,2,29,0 -8107,1,0,0,14,1,1,8,5,3,0,17,13,4,7,0 -8108,10,2,0,12,0,6,7,0,3,1,18,17,4,11,1 -8109,2,6,0,7,0,4,7,1,1,1,1,19,0,1,1 -8110,6,8,0,1,4,2,12,1,0,1,13,4,1,6,0 -8111,10,6,0,13,0,0,8,1,0,1,0,12,1,36,0 -8112,0,4,0,2,1,0,5,2,2,0,17,12,2,26,0 -8113,10,8,0,7,6,0,4,2,1,1,2,11,3,28,0 -8114,2,1,0,15,1,4,1,4,4,0,17,20,0,28,0 -8115,3,2,0,0,2,4,0,0,2,0,13,11,3,14,0 -8116,5,5,0,13,0,5,9,0,1,0,13,11,1,38,0 -8117,9,8,0,14,1,5,2,0,4,0,18,11,5,22,1 -8118,0,1,0,1,6,0,1,1,0,1,11,10,3,0,1 -8119,0,0,0,3,1,2,0,2,2,1,13,0,0,35,0 -8120,2,6,0,2,0,0,8,1,0,0,4,4,2,36,0 -8121,0,2,0,15,1,4,8,3,0,1,13,15,0,21,0 -8122,1,2,0,13,4,3,12,1,2,0,4,2,5,20,0 
-8123,7,1,0,12,2,2,14,1,2,0,2,0,3,9,0 -8124,4,7,0,3,0,5,11,2,0,0,13,15,0,39,0 -8125,3,8,0,0,0,0,3,1,1,0,6,4,4,34,0 -8126,6,3,0,4,2,0,14,3,3,0,4,17,0,5,0 -8127,8,7,0,11,6,1,0,0,4,0,18,1,0,41,1 -8128,1,2,0,5,2,3,11,1,4,0,2,6,2,17,0 -8129,9,6,0,9,4,0,0,5,2,1,11,10,1,1,1 -8130,2,5,0,3,0,3,0,2,3,1,0,10,0,29,0 -8131,0,6,0,5,0,5,13,3,2,0,0,13,2,26,0 -8132,4,8,0,13,6,1,9,2,0,1,13,2,5,36,0 -8133,5,2,0,11,2,6,1,3,2,0,17,15,3,40,0 -8134,3,5,0,3,3,3,6,4,3,0,10,20,1,6,0 -8135,7,3,0,9,1,5,9,3,4,1,6,16,5,32,0 -8136,9,5,0,8,4,1,11,3,1,1,18,19,5,28,1 -8137,1,0,0,15,4,4,0,3,0,0,17,2,5,41,0 -8138,0,0,0,8,0,1,3,3,1,0,15,11,2,40,0 -8139,6,4,0,5,6,0,12,3,1,0,2,3,1,35,0 -8140,7,5,0,5,6,1,8,3,1,0,2,20,4,13,0 -8141,9,8,0,4,2,0,2,1,0,0,6,2,0,9,0 -8142,0,3,0,13,0,6,6,1,0,0,2,2,0,20,0 -8143,5,6,0,0,3,1,0,5,0,1,10,5,3,8,1 -8144,0,3,0,7,3,3,14,3,0,0,11,20,4,9,1 -8145,1,8,0,7,6,6,8,3,1,0,8,18,5,17,0 -8146,3,6,0,6,5,6,7,1,2,1,10,9,1,33,0 -8147,1,6,0,9,4,0,10,5,1,0,8,13,4,6,0 -8148,2,4,0,15,1,0,4,3,2,1,7,20,5,18,0 -8149,5,8,0,4,3,0,9,1,2,1,15,13,4,3,0 -8150,0,6,0,13,2,6,7,3,2,0,2,18,3,33,0 -8151,0,4,0,1,5,4,12,4,0,0,4,11,0,27,0 -8152,5,3,0,12,0,0,8,1,2,1,4,2,5,22,0 -8153,0,7,0,3,0,5,14,0,0,1,17,13,1,31,0 -8154,2,8,0,1,6,5,9,2,0,0,17,18,4,24,0 -8155,4,4,0,1,2,1,8,4,3,1,6,0,1,10,1 -8156,9,2,0,13,0,5,7,5,0,1,18,10,3,6,1 -8157,3,5,0,13,0,0,14,0,2,0,8,15,3,23,0 -8158,0,1,0,6,0,5,5,3,0,1,13,12,5,4,0 -8159,1,0,0,8,6,1,8,5,0,0,17,11,0,0,0 -8160,0,2,0,0,4,0,10,4,2,0,8,14,2,22,0 -8161,8,5,0,10,6,1,1,5,0,0,1,4,3,7,1 -8162,1,3,0,4,2,6,5,3,3,1,17,6,5,1,0 -8163,9,4,0,8,1,4,9,2,1,0,4,4,4,21,0 -8164,3,1,0,11,0,5,2,4,3,0,2,16,1,20,0 -8165,9,0,0,11,1,1,10,4,0,0,4,0,0,13,0 -8166,2,6,0,7,6,4,0,4,4,1,2,2,0,41,0 -8167,6,1,0,3,0,4,3,4,2,1,11,10,0,41,1 -8168,10,3,0,13,2,4,8,3,4,0,13,6,0,17,0 -8169,0,1,0,15,6,0,4,4,3,0,4,11,3,40,0 -8170,5,8,0,2,4,0,7,3,2,0,17,18,0,5,0 -8171,7,6,0,10,6,0,2,1,3,1,15,11,0,12,0 -8172,6,1,0,6,6,6,12,5,2,0,9,10,1,24,1 -8173,0,4,0,5,3,4,3,2,1,1,6,11,5,34,0 -8174,0,3,0,15,0,3,0,3,4,1,5,11,1,18,0 
-8175,3,5,0,7,6,5,10,2,1,1,4,11,5,38,0 -8176,3,1,0,1,4,5,8,0,0,0,8,10,4,25,0 -8177,3,6,0,11,4,0,14,5,0,1,8,11,0,37,0 -8178,10,2,0,3,1,5,6,1,4,0,10,20,1,7,0 -8179,9,4,0,13,2,3,5,3,2,1,12,6,4,19,0 -8180,1,2,0,15,5,0,2,3,4,0,10,11,1,31,0 -8181,6,7,0,1,5,5,9,1,0,1,14,3,0,16,0 -8182,8,8,0,3,4,2,8,0,3,0,9,3,4,32,1 -8183,5,3,0,0,5,4,5,3,3,0,13,9,3,33,0 -8184,9,4,0,3,0,4,0,4,1,1,15,2,2,31,0 -8185,2,7,0,4,1,6,8,2,1,0,8,6,3,21,0 -8186,3,8,0,11,0,2,6,2,3,1,13,12,1,8,0 -8187,3,7,0,4,6,4,8,0,2,1,13,11,0,3,0 -8188,1,2,0,13,6,1,3,0,1,1,6,18,5,23,0 -8189,2,5,0,5,3,5,6,2,2,1,0,16,0,29,0 -8190,6,6,0,6,4,4,9,3,2,1,0,15,0,6,0 -8191,9,6,0,15,2,0,2,0,4,1,9,7,4,40,1 -8192,9,1,0,13,2,4,9,0,0,1,8,6,4,16,0 -8193,0,2,0,15,6,4,5,3,4,1,13,6,1,31,0 -8194,5,7,0,7,2,2,5,5,0,0,5,17,1,11,1 -8195,5,2,0,14,5,1,9,1,2,1,4,12,4,40,1 -8196,10,1,0,0,5,2,2,5,4,1,11,10,1,39,1 -8197,0,8,0,8,3,4,10,1,3,0,14,15,4,6,0 -8198,0,0,0,3,0,3,5,3,1,1,8,11,5,14,0 -8199,3,4,0,8,6,1,7,2,3,1,7,13,2,27,1 -8200,0,2,0,6,5,0,9,3,4,1,13,16,2,35,0 -8201,8,3,0,2,3,6,12,5,3,0,14,6,4,19,1 -8202,0,2,0,2,0,0,3,3,1,1,8,20,5,8,0 -8203,9,2,0,13,1,2,3,2,3,1,8,11,4,7,0 -8204,4,2,0,11,1,5,9,3,0,0,6,7,2,13,0 -8205,0,8,0,3,0,1,10,5,0,1,0,19,4,9,1 -8206,9,1,0,1,1,5,9,4,2,0,4,20,3,22,0 -8207,2,5,0,0,5,6,13,1,0,0,2,11,2,9,0 -8208,1,2,0,13,0,5,7,1,4,1,2,8,0,14,0 -8209,4,3,0,8,5,5,5,4,0,1,6,6,3,25,0 -8210,1,1,0,13,1,5,5,2,4,0,7,11,5,40,0 -8211,9,5,0,6,5,2,5,4,0,0,4,4,4,34,0 -8212,3,4,0,6,1,5,8,3,1,0,8,18,3,35,0 -8213,8,5,0,15,2,4,11,3,3,1,4,7,2,32,0 -8214,7,4,0,8,0,5,7,3,4,0,11,0,1,32,0 -8215,0,4,0,11,3,5,4,5,1,1,2,0,2,38,0 -8216,1,0,0,4,6,5,7,0,4,1,13,9,4,30,0 -8217,3,5,0,2,6,0,2,2,0,1,2,6,1,11,0 -8218,4,4,0,2,1,0,12,0,1,1,9,7,3,6,1 -8219,1,6,0,4,6,0,6,4,0,1,4,15,3,28,0 -8220,1,6,0,3,0,0,6,3,0,1,7,15,0,28,0 -8221,1,5,0,1,0,1,11,1,1,1,13,9,5,30,0 -8222,10,7,0,15,5,3,6,1,4,1,8,16,0,41,0 -8223,1,7,0,6,1,5,13,1,3,0,8,0,3,39,0 -8224,4,3,0,5,0,2,7,0,1,1,10,14,0,35,0 -8225,0,0,0,12,6,1,9,3,4,0,15,2,3,18,0 -8226,1,6,0,10,2,6,13,0,0,1,11,16,1,25,1 
-8227,1,0,0,4,0,4,13,0,0,0,17,2,2,6,0 -8228,2,6,0,6,6,0,13,1,3,0,17,9,1,25,0 -8229,10,1,0,8,1,4,7,0,4,1,13,0,2,14,0 -8230,3,0,0,0,5,0,2,4,0,1,13,10,0,37,0 -8231,6,3,0,2,5,3,4,2,4,1,16,9,5,23,1 -8232,0,8,0,14,6,2,3,3,1,0,13,18,2,31,0 -8233,8,8,0,8,4,4,10,4,1,1,1,11,0,31,0 -8234,0,3,0,3,3,5,5,2,1,0,8,9,0,16,0 -8235,0,7,0,14,6,6,0,0,0,1,12,17,3,23,0 -8236,8,2,0,14,5,5,6,3,2,1,2,4,4,2,0 -8237,3,8,0,7,1,2,10,4,2,1,13,19,2,32,0 -8238,4,2,0,2,1,5,0,3,0,1,8,19,2,8,0 -8239,7,6,0,12,3,5,4,4,4,1,13,5,4,41,1 -8240,2,8,0,1,1,0,2,1,0,0,5,14,4,35,0 -8241,6,7,0,12,1,6,3,4,1,1,13,6,1,19,0 -8242,1,7,0,6,5,4,12,4,2,0,15,14,0,25,0 -8243,8,6,0,15,4,5,14,0,1,1,10,2,4,26,0 -8244,9,4,0,12,4,1,9,5,3,1,18,5,4,29,1 -8245,9,0,0,3,4,6,13,1,1,1,17,1,0,21,0 -8246,0,2,0,4,5,1,7,3,3,1,13,6,4,8,0 -8247,5,1,0,10,2,4,10,5,2,0,7,20,4,2,1 -8248,9,4,0,2,3,4,5,3,1,0,8,11,0,29,0 -8249,1,4,0,13,2,2,4,0,2,1,8,2,4,19,0 -8250,5,2,0,6,1,1,2,0,1,1,5,10,4,18,1 -8251,9,3,0,12,0,6,5,5,3,1,6,9,0,1,0 -8252,2,3,0,6,0,2,5,4,0,0,1,15,0,20,0 -8253,4,4,0,12,0,5,5,3,1,1,2,13,2,0,0 -8254,6,3,0,10,0,4,2,1,4,0,18,17,5,20,1 -8255,1,2,0,11,3,0,6,3,3,1,11,18,2,14,0 -8256,4,4,0,2,4,0,3,3,1,1,18,19,5,3,1 -8257,2,8,0,2,4,0,2,1,2,1,6,2,1,13,0 -8258,8,5,0,14,4,3,11,1,2,1,3,16,0,11,1 -8259,8,1,0,10,4,5,3,0,0,1,16,1,5,34,1 -8260,0,7,0,13,0,4,4,0,2,0,12,18,3,4,0 -8261,4,6,0,5,5,5,9,0,3,1,13,0,0,13,0 -8262,5,7,0,2,4,4,12,1,2,1,9,17,3,10,1 -8263,9,3,0,7,0,2,2,2,0,1,4,2,0,35,0 -8264,5,2,0,9,6,2,2,1,1,0,8,2,0,18,0 -8265,7,2,0,5,2,1,9,0,3,0,7,1,4,11,1 -8266,9,7,0,4,3,5,9,1,3,0,4,18,0,34,0 -8267,8,7,0,6,0,4,2,3,2,1,18,7,1,20,1 -8268,2,5,0,11,6,0,7,3,3,1,8,2,4,33,0 -8269,3,6,0,12,1,5,7,2,0,0,13,3,5,27,0 -8270,3,7,0,9,3,3,10,5,1,1,3,7,3,20,1 -8271,7,6,0,7,1,4,5,3,0,1,8,2,0,23,0 -8272,0,7,0,7,1,4,14,4,0,1,2,2,3,21,0 -8273,5,0,0,15,6,4,5,5,0,0,18,17,2,17,1 -8274,9,5,0,15,1,0,7,0,2,1,18,2,4,3,1 -8275,0,0,0,0,6,4,2,2,1,0,4,6,5,39,0 -8276,3,5,0,13,5,5,8,3,2,1,13,11,5,13,0 -8277,2,8,0,10,0,0,14,2,2,1,2,20,5,11,0 -8278,7,5,0,4,5,6,0,4,3,0,2,2,0,17,0 
-8279,4,6,0,3,2,0,14,0,1,1,11,1,3,7,1 -8280,0,1,0,1,0,4,4,1,2,0,13,16,1,29,0 -8281,9,3,0,3,3,5,6,2,0,1,13,13,5,24,0 -8282,1,6,0,13,0,2,12,3,4,0,9,15,3,13,0 -8283,6,8,0,0,6,4,7,0,2,1,4,11,4,6,0 -8284,4,0,0,6,0,3,7,3,3,0,13,15,1,19,0 -8285,3,1,0,15,1,3,1,3,3,1,13,14,2,20,0 -8286,10,1,0,1,6,0,13,5,4,0,5,1,2,22,1 -8287,9,1,0,15,1,4,12,0,3,1,17,2,2,22,0 -8288,2,4,0,2,5,0,6,4,2,1,17,20,2,32,0 -8289,3,6,0,12,2,2,0,2,4,0,17,9,2,1,0 -8290,9,3,0,8,0,0,14,3,1,1,11,1,1,37,1 -8291,6,7,0,5,1,0,6,0,0,0,8,18,1,25,0 -8292,5,6,0,7,4,6,8,4,1,0,4,6,1,41,0 -8293,0,2,0,0,0,3,4,2,2,1,10,8,3,39,1 -8294,0,7,0,3,0,0,3,3,2,1,7,18,3,31,0 -8295,1,3,0,9,6,0,3,1,2,1,15,0,1,30,0 -8296,2,3,0,1,5,6,5,1,2,1,18,7,1,26,1 -8297,9,6,0,7,0,6,7,3,4,0,5,17,5,24,1 -8298,8,2,0,6,2,4,10,5,1,0,3,10,2,30,1 -8299,0,3,0,3,2,5,7,2,0,0,13,17,1,0,0 -8300,2,3,0,4,0,2,5,2,3,1,4,19,3,34,0 -8301,9,5,0,11,1,6,5,3,4,0,15,8,4,16,0 -8302,4,7,0,0,2,6,2,3,2,0,1,14,3,6,0 -8303,7,2,0,10,2,2,2,5,2,0,3,1,2,0,1 -8304,0,7,0,6,0,5,0,4,3,1,4,6,5,38,0 -8305,1,8,0,0,1,0,0,1,0,1,13,10,3,10,0 -8306,1,0,0,3,2,0,5,4,0,0,17,0,4,6,0 -8307,1,6,0,0,6,1,4,5,1,0,0,11,0,25,0 -8308,5,8,0,9,2,0,6,1,0,0,17,16,1,22,0 -8309,9,0,0,0,3,4,14,5,2,0,16,2,1,4,0 -8310,1,0,0,8,6,4,0,3,2,0,2,14,2,25,0 -8311,4,2,0,15,3,6,6,4,0,0,16,0,3,7,0 -8312,0,7,0,15,5,1,10,0,3,0,15,11,0,33,0 -8313,0,6,0,0,4,6,0,0,3,0,13,18,2,16,0 -8314,3,6,0,10,1,3,0,3,3,1,18,20,3,11,1 -8315,0,3,0,15,4,0,8,2,1,1,10,12,3,2,0 -8316,9,8,0,11,2,3,11,2,4,1,8,11,0,31,0 -8317,1,5,0,12,2,6,3,3,0,1,6,9,0,8,0 -8318,5,5,0,11,1,1,0,3,3,1,11,7,0,23,1 -8319,1,3,0,1,4,6,9,0,1,1,7,17,2,26,1 -8320,0,8,0,5,5,0,0,3,1,0,8,2,1,3,0 -8321,8,7,0,12,6,1,13,5,0,0,9,5,3,16,1 -8322,5,1,0,0,1,0,3,0,3,0,11,11,3,41,0 -8323,9,1,0,11,6,3,13,5,4,0,3,8,0,1,1 -8324,2,8,0,6,3,5,9,4,0,0,15,17,4,7,0 -8325,7,2,0,1,6,2,1,1,0,1,18,20,2,17,1 -8326,1,2,0,3,1,2,7,1,4,0,3,19,2,23,0 -8327,3,6,0,13,0,3,8,3,0,0,2,4,1,35,0 -8328,0,6,0,3,3,0,1,4,3,1,10,4,0,24,0 -8329,0,8,0,10,0,6,0,1,4,0,5,7,4,3,1 -8330,6,1,0,11,1,6,5,3,3,0,13,9,5,6,0 
-8331,2,2,0,5,6,6,11,2,1,1,12,20,1,35,0 -8332,2,7,0,4,5,0,14,1,0,1,2,18,0,35,0 -8333,9,8,0,6,5,4,12,5,2,0,2,1,0,35,0 -8334,9,3,0,1,2,5,1,3,0,1,2,16,3,41,0 -8335,3,1,0,12,6,2,4,5,2,1,18,17,5,41,1 -8336,2,3,0,13,1,6,14,2,1,1,13,18,5,24,0 -8337,6,8,0,6,2,3,2,0,1,1,18,20,2,27,1 -8338,3,7,0,6,0,0,7,0,2,1,4,1,4,40,0 -8339,1,3,0,1,4,2,5,5,2,0,13,13,2,35,0 -8340,0,3,0,0,3,4,8,0,3,1,4,0,2,25,0 -8341,4,3,0,5,6,4,12,3,0,0,8,13,0,5,0 -8342,8,4,0,6,2,2,3,4,2,1,13,9,5,9,0 -8343,1,2,0,12,6,5,9,3,3,0,10,15,5,21,0 -8344,7,6,0,15,6,6,7,3,1,0,13,19,3,31,0 -8345,6,6,0,4,1,3,3,2,1,1,13,2,3,5,0 -8346,7,6,0,5,4,3,11,1,4,0,4,8,0,31,0 -8347,10,2,0,11,6,3,0,4,1,0,13,5,3,29,0 -8348,3,0,0,2,0,6,7,4,0,1,8,6,0,29,0 -8349,9,1,0,0,1,1,14,1,1,1,5,10,3,30,1 -8350,4,4,0,2,4,1,14,0,3,0,14,5,3,40,1 -8351,7,8,0,9,5,4,8,5,2,1,3,18,4,1,1 -8352,6,3,0,10,0,4,1,4,0,0,8,13,5,4,0 -8353,0,8,0,4,1,1,10,2,1,0,17,2,2,10,0 -8354,1,6,0,13,6,4,6,5,4,0,13,2,5,2,0 -8355,2,0,0,6,2,1,6,3,4,0,2,2,0,18,0 -8356,3,5,0,12,4,1,12,2,4,0,18,17,1,5,1 -8357,2,6,0,1,1,2,12,3,3,1,2,15,5,28,0 -8358,4,1,0,2,6,4,9,1,2,0,12,8,3,12,1 -8359,0,2,0,13,5,1,0,5,3,1,12,13,0,0,0 -8360,0,4,0,9,6,0,11,4,4,0,6,11,2,13,0 -8361,1,7,0,9,3,6,13,0,0,1,5,18,1,16,0 -8362,1,2,0,4,5,3,7,3,2,1,1,15,0,8,0 -8363,10,1,0,4,0,0,6,4,2,1,17,11,3,8,0 -8364,0,5,0,13,3,0,5,4,3,1,13,12,5,40,0 -8365,2,8,0,15,1,0,10,4,1,1,2,3,2,25,0 -8366,3,7,0,4,0,2,0,0,2,1,6,7,5,20,1 -8367,4,8,0,2,0,1,8,0,1,0,10,11,3,11,0 -8368,0,7,0,0,6,6,11,2,0,0,4,20,1,39,0 -8369,1,7,0,4,2,4,8,3,1,0,17,7,3,19,0 -8370,5,6,0,2,1,0,11,4,1,0,10,8,2,29,0 -8371,0,5,0,3,5,4,5,4,4,1,0,18,3,7,0 -8372,0,5,0,3,3,4,4,1,2,0,13,2,0,27,0 -8373,10,8,0,15,6,0,10,0,0,0,8,13,5,17,0 -8374,7,1,0,1,3,2,12,4,2,0,15,19,0,25,1 -8375,0,8,0,9,0,0,1,1,3,1,13,0,3,23,0 -8376,9,5,0,15,4,3,14,1,2,0,9,7,3,41,0 -8377,8,7,0,11,3,1,14,2,3,1,18,14,4,31,1 -8378,5,7,0,11,0,1,14,4,0,0,8,11,1,26,0 -8379,1,0,0,12,6,1,13,3,3,0,6,10,0,2,0 -8380,10,3,0,1,2,5,3,3,4,0,2,2,3,18,0 -8381,0,1,0,3,1,5,3,5,2,0,2,11,0,25,0 
-8382,0,1,0,1,2,0,7,1,1,0,2,17,1,13,0 -8383,3,4,0,5,6,0,14,0,4,0,2,15,2,33,0 -8384,1,5,0,8,5,3,0,4,2,0,13,2,0,18,0 -8385,7,2,0,3,4,4,9,3,0,0,11,7,4,17,1 -8386,5,7,0,0,6,6,8,1,4,1,2,11,2,31,0 -8387,5,5,0,14,3,0,11,4,1,1,10,6,2,7,0 -8388,9,4,0,6,6,3,3,5,3,1,9,3,0,29,1 -8389,3,6,0,4,3,0,14,2,3,0,5,2,0,28,0 -8390,3,2,0,5,1,5,1,3,3,0,7,2,2,37,0 -8391,3,4,0,15,0,4,10,5,2,0,4,2,3,29,0 -8392,3,7,0,3,0,2,9,4,4,1,17,14,2,19,0 -8393,6,6,0,2,0,4,2,4,3,1,12,11,0,12,0 -8394,4,6,0,14,5,4,3,4,0,0,13,15,0,28,0 -8395,3,1,0,12,6,3,10,1,0,0,11,17,3,34,1 -8396,7,7,0,11,6,4,8,5,2,0,13,9,0,13,0 -8397,0,0,0,5,5,2,7,0,4,0,2,0,2,12,0 -8398,10,1,0,10,0,3,4,5,0,1,13,0,2,33,0 -8399,7,8,0,7,5,2,10,3,1,0,2,16,0,26,0 -8400,3,0,0,5,1,3,11,0,4,1,0,3,3,39,1 -8401,1,2,0,3,1,5,3,1,1,1,8,11,5,28,0 -8402,5,2,0,5,0,5,2,4,0,1,14,13,0,41,1 -8403,3,0,0,15,4,6,1,4,0,1,12,18,0,4,1 -8404,7,6,0,2,5,4,8,1,0,1,0,17,5,9,0 -8405,3,7,0,6,1,0,0,4,2,1,15,15,1,30,0 -8406,9,5,0,8,3,1,8,5,2,0,16,10,3,30,1 -8407,8,7,0,14,1,2,2,3,4,0,2,7,2,41,1 -8408,8,8,0,11,5,5,10,1,3,0,17,6,4,13,0 -8409,9,3,0,11,5,5,6,2,1,1,14,17,0,17,1 -8410,3,0,0,12,4,2,3,5,2,1,18,14,3,26,1 -8411,0,0,0,13,1,4,5,2,4,1,17,4,0,13,0 -8412,0,0,0,5,6,0,14,2,4,0,16,20,4,17,0 -8413,7,5,0,14,5,4,0,2,3,1,8,20,0,23,0 -8414,7,6,0,9,5,0,7,0,0,0,8,2,1,22,0 -8415,9,8,0,11,6,4,8,4,3,1,2,2,5,30,0 -8416,5,1,0,14,1,6,2,3,4,0,13,7,4,38,0 -8417,9,6,0,0,6,4,3,4,0,1,14,13,3,18,1 -8418,1,1,0,14,6,5,5,3,4,0,6,17,3,35,0 -8419,9,0,0,11,1,5,1,2,0,1,13,10,4,19,0 -8420,1,4,0,5,0,0,7,5,0,0,0,2,5,31,0 -8421,6,1,0,7,4,6,2,4,1,1,6,5,5,24,1 -8422,0,2,0,15,0,4,2,1,3,0,10,9,2,13,0 -8423,7,7,0,4,0,0,9,5,0,1,18,0,5,26,1 -8424,0,2,0,3,1,5,13,2,1,1,10,12,1,4,0 -8425,2,0,0,15,4,2,3,1,0,1,1,17,1,27,1 -8426,10,1,0,12,4,5,13,3,0,1,11,5,2,12,1 -8427,8,5,0,15,2,5,11,5,0,1,9,1,2,38,1 -8428,3,8,0,6,0,3,13,2,2,1,17,4,1,20,0 -8429,2,4,0,10,3,3,5,2,4,1,8,6,3,3,0 -8430,0,3,0,9,3,3,9,4,2,0,0,13,1,36,0 -8431,0,2,0,3,2,3,7,0,0,0,10,2,3,13,0 -8432,5,3,0,10,3,0,14,5,0,1,16,10,5,14,1 
-8433,3,8,0,5,6,2,3,0,2,0,1,4,5,2,0 -8434,10,4,0,11,0,0,1,5,0,1,18,10,3,41,1 -8435,7,5,0,2,2,2,5,1,4,0,15,20,5,23,1 -8436,4,8,0,12,3,6,0,1,4,1,14,11,5,36,1 -8437,0,7,0,13,6,3,11,2,0,0,17,9,0,37,0 -8438,3,1,0,14,3,0,8,4,3,0,1,18,4,24,0 -8439,0,4,0,3,1,6,9,4,4,1,12,4,1,39,0 -8440,5,8,0,13,6,0,1,0,4,1,18,5,2,33,1 -8441,3,7,0,0,4,1,1,5,3,1,14,13,3,1,1 -8442,9,5,0,4,3,4,12,3,4,0,8,2,0,19,0 -8443,0,4,0,5,4,3,5,3,0,1,2,20,0,37,0 -8444,1,8,0,1,0,6,1,3,4,0,13,13,2,14,0 -8445,8,1,0,6,0,4,7,4,3,1,5,0,2,9,0 -8446,4,7,0,9,2,5,11,0,1,0,10,2,2,31,0 -8447,0,6,0,15,0,5,5,5,2,1,2,13,4,26,0 -8448,5,4,0,4,0,6,2,4,2,0,10,18,1,13,0 -8449,6,8,0,5,5,6,12,0,0,0,2,2,1,9,0 -8450,9,4,0,15,6,6,3,4,2,0,4,1,3,3,0 -8451,7,2,0,4,3,0,5,5,0,1,10,7,4,8,1 -8452,0,0,0,15,4,4,3,4,2,0,13,0,5,22,0 -8453,2,6,0,15,5,5,5,4,2,1,12,19,1,29,0 -8454,5,4,0,11,5,4,9,4,0,0,4,15,4,18,0 -8455,9,6,0,13,6,3,12,4,1,1,4,14,4,20,1 -8456,3,7,0,14,0,4,6,2,3,1,13,2,3,11,0 -8457,7,8,0,8,1,3,8,1,1,0,12,6,0,41,0 -8458,0,0,0,12,1,4,12,2,4,1,13,0,0,40,0 -8459,0,1,0,7,6,4,12,3,0,1,6,2,0,16,0 -8460,0,8,0,11,2,5,8,1,2,1,12,11,3,33,0 -8461,0,2,0,13,0,4,8,1,1,0,8,3,1,16,0 -8462,5,7,0,6,2,0,1,4,0,0,1,2,2,40,0 -8463,4,5,0,2,0,4,11,5,3,1,10,17,5,23,1 -8464,4,5,0,15,2,5,8,3,0,0,5,11,4,30,0 -8465,9,4,0,1,1,1,10,0,1,0,4,15,0,9,0 -8466,10,2,0,2,4,6,11,5,0,0,18,7,1,10,1 -8467,6,7,0,2,4,2,13,1,3,0,6,20,2,20,0 -8468,0,2,0,12,0,0,9,4,1,0,2,19,4,8,0 -8469,0,7,0,3,1,3,9,1,0,0,13,15,4,11,0 -8470,2,3,0,4,1,5,5,2,4,0,6,11,4,12,0 -8471,5,6,0,14,3,2,7,5,3,0,12,2,1,34,0 -8472,10,1,0,12,2,3,9,0,4,0,1,10,5,30,1 -8473,2,5,0,3,3,4,11,3,3,0,9,18,2,18,1 -8474,2,0,0,8,0,6,1,4,0,1,13,2,1,41,0 -8475,4,1,0,15,4,3,2,0,2,1,15,10,5,24,1 -8476,0,6,0,1,0,1,11,4,4,1,10,15,2,19,0 -8477,7,3,0,6,2,2,13,3,0,0,2,2,2,21,0 -8478,8,3,0,6,6,6,5,2,0,1,11,5,5,23,1 -8479,2,6,0,9,5,2,2,2,0,0,10,20,1,6,0 -8480,2,6,0,15,5,5,8,1,4,0,0,2,4,35,0 -8481,7,4,0,10,2,0,3,2,0,1,4,8,3,9,1 -8482,0,8,0,4,3,2,5,3,3,0,4,20,0,24,0 -8483,7,8,0,9,3,6,9,1,0,0,3,7,4,28,1 
-8484,4,2,0,0,0,3,2,0,3,1,2,11,0,37,0 -8485,1,8,0,5,3,4,8,2,0,1,17,11,4,13,0 -8486,4,5,0,0,1,0,10,0,4,1,6,2,3,20,0 -8487,6,8,0,12,1,6,6,3,1,1,2,2,4,26,0 -8488,8,8,0,4,6,6,8,3,4,1,14,13,3,27,0 -8489,8,1,0,2,4,4,7,1,0,1,1,7,5,38,1 -8490,10,7,0,10,2,5,7,1,2,0,11,19,1,7,1 -8491,1,4,0,9,0,2,4,1,1,0,2,11,0,8,0 -8492,2,0,0,3,1,1,14,1,1,0,13,10,1,3,0 -8493,8,2,0,5,4,3,5,1,3,1,12,20,4,9,1 -8494,2,0,0,15,0,5,9,2,3,1,2,3,3,10,0 -8495,4,7,0,9,4,1,4,1,3,0,1,14,4,9,1 -8496,3,1,0,0,3,0,5,1,1,0,11,11,3,16,0 -8497,0,8,0,7,6,1,14,0,3,0,17,11,5,12,0 -8498,0,2,0,4,1,0,8,2,0,0,4,13,2,0,0 -8499,5,6,0,6,6,2,14,5,2,1,12,5,5,19,1 -8500,3,5,0,8,3,0,10,5,0,1,7,10,4,28,1 -8501,10,5,0,13,2,3,14,4,0,0,3,15,1,16,0 -8502,1,1,0,9,0,0,1,0,2,0,3,4,5,23,1 -8503,3,3,0,11,2,2,4,3,3,0,12,14,1,30,1 -8504,7,4,0,4,2,2,6,2,4,1,2,11,1,28,0 -8505,2,0,0,13,3,6,14,3,3,1,17,16,4,4,0 -8506,5,7,0,6,5,3,4,1,1,1,8,9,4,2,0 -8507,10,8,0,0,6,0,9,4,1,0,10,20,2,16,0 -8508,2,8,0,8,5,2,6,4,3,1,17,2,4,6,0 -8509,10,7,0,2,2,3,4,0,3,0,2,5,0,41,1 -8510,0,4,0,3,1,6,4,0,2,0,13,6,3,27,0 -8511,2,5,0,2,4,5,1,3,2,0,0,11,2,6,0 -8512,8,6,0,6,1,0,11,2,0,0,6,3,4,32,0 -8513,10,5,0,2,5,1,12,4,3,1,0,15,4,34,0 -8514,6,8,0,13,5,5,10,0,0,1,6,6,4,12,0 -8515,0,1,0,13,3,4,5,0,3,0,11,6,2,0,0 -8516,1,4,0,4,6,0,9,1,0,0,15,6,4,3,0 -8517,1,8,0,10,1,4,0,1,0,0,13,11,3,10,0 -8518,5,3,0,13,0,3,1,4,0,1,0,6,2,1,0 -8519,0,3,0,2,1,6,0,3,1,1,6,6,1,38,0 -8520,1,3,0,13,2,4,2,2,4,1,13,9,4,25,0 -8521,8,3,0,14,4,3,11,3,2,0,5,20,3,22,1 -8522,4,7,0,10,6,2,4,3,4,1,17,7,5,31,1 -8523,9,3,0,1,1,2,2,3,4,0,11,2,4,0,1 -8524,0,6,0,14,6,5,11,3,0,0,14,2,3,40,0 -8525,3,7,0,14,4,5,12,3,2,1,4,18,0,8,0 -8526,8,7,0,13,1,0,6,1,4,1,8,0,4,28,0 -8527,3,6,0,15,1,2,4,0,0,1,3,0,1,0,0 -8528,4,7,0,1,3,5,11,5,4,0,13,18,2,20,0 -8529,2,0,0,6,0,4,0,0,4,1,13,11,1,38,0 -8530,9,8,0,14,1,1,0,1,2,1,7,7,4,31,1 -8531,9,6,0,8,4,0,11,2,3,1,6,7,5,25,1 -8532,5,0,0,15,5,2,3,1,2,1,16,1,5,30,1 -8533,2,8,0,4,6,5,6,0,0,0,8,9,5,31,0 -8534,2,8,0,3,1,3,3,2,3,1,2,15,3,30,0 -8535,0,0,0,15,0,5,6,3,4,0,17,2,0,3,0 
-8536,1,7,0,3,4,2,1,0,0,0,0,4,4,30,0 -8537,2,6,0,12,3,5,6,0,1,1,15,13,1,16,0 -8538,10,4,0,0,0,3,4,5,2,0,5,12,1,26,1 -8539,4,8,0,11,3,5,13,1,2,0,2,11,1,2,0 -8540,0,0,0,7,1,0,9,0,3,1,4,12,4,12,0 -8541,8,6,0,13,2,5,3,1,2,1,0,2,2,10,0 -8542,1,7,0,1,5,0,14,3,2,0,13,11,3,8,0 -8543,8,0,0,15,5,0,6,4,2,0,13,13,0,4,0 -8544,9,2,0,13,0,0,9,3,1,0,2,2,4,22,0 -8545,5,4,0,9,6,3,6,4,2,1,18,1,2,19,1 -8546,4,7,0,12,0,6,8,3,3,1,4,9,0,29,0 -8547,2,2,0,2,5,5,8,4,3,0,6,4,5,7,0 -8548,0,3,0,2,2,4,4,1,2,0,3,11,0,21,0 -8549,1,3,0,0,0,4,0,1,0,1,16,0,0,13,0 -8550,0,7,0,4,3,4,9,2,3,1,0,11,3,29,0 -8551,2,4,0,13,1,5,7,5,0,0,2,15,1,2,0 -8552,3,3,0,4,2,5,5,3,4,0,10,2,4,10,0 -8553,10,2,0,8,1,0,8,5,2,1,13,12,1,10,0 -8554,5,1,0,1,1,1,10,1,2,1,16,10,3,2,1 -8555,3,2,0,3,2,6,12,3,3,0,10,4,2,24,0 -8556,6,6,0,14,1,5,2,2,1,0,14,12,4,8,0 -8557,0,3,0,6,2,5,3,3,4,1,10,2,3,18,0 -8558,4,1,0,11,6,5,5,4,0,1,15,18,1,25,0 -8559,9,0,0,5,4,5,4,1,2,1,2,2,2,26,0 -8560,10,4,0,12,3,1,8,2,2,0,10,18,0,20,0 -8561,7,8,0,11,3,5,9,4,0,0,0,13,3,12,0 -8562,9,4,0,13,0,3,8,0,4,0,8,18,5,31,0 -8563,0,8,0,4,3,6,9,2,3,1,12,14,4,39,0 -8564,10,7,0,1,6,0,10,3,3,1,18,0,1,28,0 -8565,5,8,0,14,5,1,12,1,2,1,2,18,3,1,0 -8566,7,2,0,8,0,4,13,3,2,1,8,2,5,34,0 -8567,8,4,0,1,4,6,6,4,3,1,17,11,2,7,0 -8568,9,0,0,13,0,0,14,2,0,1,14,11,0,34,0 -8569,0,5,0,7,2,3,3,2,1,0,10,2,0,36,0 -8570,0,8,0,4,6,5,8,4,3,0,10,16,0,37,0 -8571,6,1,0,11,3,0,14,5,3,1,17,3,0,21,0 -8572,8,5,0,14,1,1,4,5,0,1,6,9,2,38,1 -8573,5,7,0,15,4,4,10,0,4,0,8,11,5,27,0 -8574,7,0,0,3,1,3,3,2,2,0,13,11,1,24,0 -8575,0,4,0,13,4,3,13,2,1,1,3,15,5,4,0 -8576,3,6,0,10,2,6,5,0,1,0,6,2,2,8,0 -8577,1,0,0,6,2,0,1,3,2,1,7,13,2,8,0 -8578,10,3,0,13,6,4,5,3,2,0,8,2,0,19,0 -8579,2,3,0,13,1,3,9,3,1,1,2,8,1,23,0 -8580,0,5,0,9,2,3,4,3,1,0,8,2,4,5,0 -8581,6,0,0,15,0,6,8,3,0,1,8,11,2,25,0 -8582,2,7,0,7,6,4,2,3,0,1,8,2,1,13,0 -8583,9,7,0,11,4,5,1,4,4,0,8,18,4,30,0 -8584,2,4,0,3,2,0,3,3,3,1,15,11,5,38,0 -8585,0,6,0,13,1,5,14,3,0,0,2,11,0,13,0 -8586,3,5,0,4,4,1,0,3,1,0,10,1,0,18,1 
-8587,1,6,0,13,2,6,0,2,1,0,2,11,3,26,0 -8588,10,7,0,11,2,0,12,5,3,0,14,3,2,1,1 -8589,3,5,0,13,4,4,8,1,0,0,8,13,1,25,0 -8590,10,0,0,7,3,5,5,1,0,1,1,10,2,18,1 -8591,8,2,0,14,1,1,13,5,1,0,5,2,2,9,1 -8592,2,3,0,0,0,0,14,4,0,1,17,2,5,40,0 -8593,8,4,0,5,6,6,10,3,1,1,18,7,1,29,1 -8594,4,2,0,15,4,1,9,4,3,1,12,7,4,13,1 -8595,9,5,0,3,1,4,3,0,2,1,11,7,2,19,1 -8596,1,4,0,13,2,6,1,2,1,0,2,9,5,14,0 -8597,3,2,0,5,0,5,10,1,4,0,10,11,5,13,0 -8598,0,1,0,9,6,0,9,0,1,0,13,18,3,12,0 -8599,9,7,0,1,2,5,12,2,4,0,6,18,0,27,0 -8600,8,4,0,0,0,6,5,2,0,1,9,12,4,3,0 -8601,0,5,0,9,3,5,7,3,2,1,8,15,0,30,0 -8602,8,2,0,14,0,4,10,2,4,0,3,20,5,37,1 -8603,3,6,0,6,2,4,7,1,4,0,17,9,5,28,0 -8604,10,6,0,7,3,6,9,4,1,0,13,13,4,2,0 -8605,10,7,0,12,6,6,6,4,2,1,10,0,0,22,0 -8606,8,8,0,6,2,4,4,5,0,1,13,13,4,38,0 -8607,10,6,0,3,4,3,14,3,3,0,17,2,4,33,0 -8608,0,1,0,4,1,1,13,0,2,0,10,2,2,17,0 -8609,7,0,0,4,1,2,13,5,0,1,18,16,0,31,1 -8610,4,7,0,0,0,5,8,2,2,1,2,12,4,33,0 -8611,1,1,0,13,2,3,12,3,4,0,2,10,0,23,0 -8612,3,5,0,4,1,4,9,1,4,0,13,2,5,28,0 -8613,6,0,0,12,0,5,1,3,2,0,13,15,4,31,0 -8614,0,0,0,13,4,4,11,1,0,1,17,20,1,10,0 -8615,10,7,0,13,2,6,0,3,2,0,0,20,0,31,0 -8616,4,6,0,6,3,5,0,5,2,0,16,4,0,33,0 -8617,3,1,0,6,2,3,4,1,1,1,18,16,3,5,1 -8618,0,5,0,4,1,1,14,1,4,0,2,0,2,3,0 -8619,1,0,0,2,0,0,13,3,3,0,17,8,5,20,0 -8620,10,2,0,1,0,5,0,3,0,1,17,2,1,5,0 -8621,9,1,0,13,0,6,1,0,0,0,5,17,3,27,1 -8622,6,4,0,4,0,6,13,4,1,1,0,18,1,30,0 -8623,6,1,0,13,5,6,10,5,4,1,4,2,0,7,0 -8624,3,3,0,6,6,5,0,3,1,1,15,13,0,6,0 -8625,7,0,0,10,6,6,13,5,4,0,9,8,4,41,1 -8626,9,6,0,15,3,6,4,4,2,1,18,5,3,39,1 -8627,7,5,0,13,1,4,13,4,4,0,8,13,1,11,0 -8628,10,3,0,10,3,4,4,4,0,0,3,10,4,32,1 -8629,7,4,0,14,2,5,4,5,3,0,18,12,2,22,1 -8630,1,3,0,6,6,2,9,2,0,1,4,2,1,0,0 -8631,2,1,0,11,6,0,12,3,4,0,10,14,3,29,0 -8632,0,7,0,8,0,6,6,2,3,0,17,6,0,1,0 -8633,3,0,0,6,3,0,14,1,2,1,4,14,1,17,0 -8634,7,0,0,2,2,3,13,1,3,1,13,17,4,34,1 -8635,2,2,0,9,6,6,10,1,3,1,13,2,1,10,0 -8636,1,5,0,2,2,5,14,0,2,1,0,12,2,25,0 -8637,0,0,0,4,2,4,0,1,0,1,11,4,3,33,0 
-8638,5,6,0,15,3,1,11,4,2,1,7,9,0,1,1 -8639,0,1,0,8,0,0,13,3,0,1,17,18,5,7,0 -8640,3,7,0,5,3,6,6,4,4,0,10,1,1,25,0 -8641,1,0,0,3,1,6,9,4,0,0,8,11,0,20,0 -8642,7,5,0,1,4,1,13,4,2,0,1,5,1,0,1 -8643,0,5,0,2,6,3,12,5,1,0,17,16,4,26,0 -8644,1,2,0,3,4,4,14,3,0,0,8,15,4,21,0 -8645,2,8,0,3,0,5,9,4,0,0,13,0,0,21,0 -8646,2,8,0,13,6,6,12,4,4,1,13,18,5,8,0 -8647,8,6,0,14,6,4,9,0,4,1,18,12,5,37,1 -8648,4,2,0,7,0,4,4,1,3,1,13,6,0,8,0 -8649,2,2,0,5,3,3,1,1,2,0,2,13,0,12,0 -8650,0,6,0,0,2,0,13,1,2,1,10,19,5,28,0 -8651,9,2,0,6,4,2,14,1,4,0,16,7,4,27,1 -8652,6,5,0,14,4,2,14,1,4,0,18,3,1,5,1 -8653,3,2,0,5,6,6,10,4,2,1,7,7,1,18,1 -8654,0,2,0,3,0,0,0,1,3,0,16,6,0,18,0 -8655,3,6,0,8,1,6,14,3,2,1,0,11,5,26,0 -8656,7,3,0,14,2,0,4,4,2,1,16,13,1,23,1 -8657,7,3,0,1,6,0,2,4,0,0,17,11,4,16,0 -8658,5,0,0,0,5,6,1,0,1,1,14,18,4,23,0 -8659,10,4,0,11,4,1,0,1,0,1,16,10,3,41,1 -8660,1,1,0,14,6,0,5,2,1,0,2,18,2,3,0 -8661,1,4,0,6,1,0,7,4,4,1,8,13,1,21,0 -8662,0,4,0,3,4,3,8,3,0,1,10,2,1,9,0 -8663,10,4,0,15,5,1,5,4,2,0,9,17,4,22,1 -8664,7,7,0,13,4,1,1,2,1,1,9,4,5,33,1 -8665,5,7,0,9,2,4,12,2,0,0,18,15,5,25,1 -8666,4,7,0,6,1,4,0,2,3,0,12,11,1,6,0 -8667,5,8,0,1,6,0,12,4,4,0,4,13,3,38,0 -8668,6,1,0,6,5,5,7,0,0,0,3,14,1,35,0 -8669,2,4,0,14,3,1,4,1,1,1,17,5,1,24,0 -8670,7,2,0,7,0,6,14,3,2,0,4,2,1,38,0 -8671,2,5,0,2,0,0,9,0,0,0,11,5,1,39,0 -8672,4,3,0,5,1,4,12,5,3,1,18,17,2,41,1 -8673,4,5,0,8,2,5,11,5,1,1,18,10,2,9,1 -8674,1,4,0,2,6,1,0,3,2,0,13,15,2,10,0 -8675,4,8,0,3,5,5,9,3,4,0,0,3,4,19,0 -8676,5,2,0,2,1,2,13,5,2,0,0,1,4,21,1 -8677,2,6,0,7,1,4,0,2,2,1,2,7,0,7,0 -8678,9,5,0,2,4,3,14,0,0,1,9,18,4,36,1 -8679,0,5,0,5,5,4,0,3,4,0,3,14,3,8,1 -8680,4,5,0,4,2,2,14,1,4,1,2,19,3,19,0 -8681,2,8,0,6,6,2,5,5,2,0,2,2,5,25,0 -8682,0,3,0,15,5,0,5,0,3,1,15,13,3,34,0 -8683,7,0,0,10,4,4,8,2,4,1,7,0,5,3,0 -8684,0,4,0,2,3,1,8,4,3,0,8,11,1,36,0 -8685,0,8,0,5,0,4,11,3,1,1,13,15,1,29,0 -8686,6,2,0,7,3,6,0,0,4,1,2,9,2,8,0 -8687,9,4,0,2,0,3,5,5,2,0,10,0,1,16,0 -8688,1,5,0,6,6,1,9,4,3,1,13,3,1,26,0 
-8689,2,4,0,2,1,5,2,1,3,0,12,11,5,40,0 -8690,6,6,0,12,2,1,11,0,3,1,5,17,5,6,1 -8691,0,4,0,12,0,4,2,3,4,0,16,9,4,7,0 -8692,1,4,0,3,0,6,8,5,1,0,15,9,2,34,0 -8693,4,8,0,9,1,2,9,5,3,1,13,6,2,4,0 -8694,1,8,0,5,5,4,12,3,3,1,13,13,2,23,0 -8695,3,5,0,13,2,5,7,4,0,0,12,2,5,18,0 -8696,5,5,0,8,4,1,13,5,0,0,10,14,3,35,1 -8697,1,8,0,6,4,0,5,3,2,1,4,12,5,23,0 -8698,8,3,0,13,2,1,7,1,1,1,10,11,1,41,0 -8699,1,5,0,13,1,0,2,5,0,0,13,6,5,20,0 -8700,8,7,0,9,6,2,14,1,3,1,7,8,4,29,1 -8701,0,7,0,4,1,2,6,3,1,1,17,16,0,27,0 -8702,2,2,0,3,3,5,1,2,3,1,13,0,2,12,0 -8703,0,0,0,5,5,4,9,1,4,0,17,11,1,12,0 -8704,10,7,0,7,0,5,6,4,3,1,8,15,3,20,0 -8705,0,4,0,11,4,5,2,3,1,0,4,20,3,31,0 -8706,1,8,0,8,6,5,9,3,0,1,6,1,1,20,0 -8707,0,3,0,11,0,4,1,1,1,0,2,2,2,32,0 -8708,5,7,0,12,2,4,9,4,4,0,2,9,5,23,0 -8709,4,0,0,10,0,3,7,1,2,0,14,6,2,23,0 -8710,0,8,0,14,6,0,9,3,3,1,13,0,0,37,0 -8711,7,6,0,5,4,1,10,1,1,0,3,17,5,22,1 -8712,8,7,0,0,6,1,14,4,4,0,13,15,4,27,0 -8713,1,4,0,0,0,0,0,5,4,0,8,13,1,18,0 -8714,0,6,0,10,3,4,7,3,1,0,6,15,5,5,0 -8715,8,8,0,1,0,5,0,2,0,1,17,20,5,37,0 -8716,5,6,0,8,5,2,5,3,0,1,0,0,2,1,0 -8717,9,1,0,11,0,1,7,4,1,1,3,4,4,26,0 -8718,4,1,0,7,0,6,7,3,1,0,13,11,5,3,0 -8719,7,3,0,5,3,1,2,0,1,1,18,7,5,34,1 -8720,4,7,0,2,6,0,6,0,3,0,0,20,2,34,0 -8721,3,7,0,13,6,4,2,5,3,1,2,2,3,17,0 -8722,2,7,0,1,1,2,4,4,2,0,5,17,1,7,1 -8723,2,6,0,12,0,5,11,3,0,0,13,2,3,4,0 -8724,5,5,0,15,5,1,1,1,0,1,9,17,3,41,1 -8725,1,0,0,11,0,4,0,2,1,0,13,20,5,7,0 -8726,0,7,0,11,2,6,3,3,2,0,11,3,1,36,0 -8727,0,0,0,8,0,0,14,5,1,1,4,11,5,41,0 -8728,6,2,0,4,4,0,6,4,3,0,2,20,4,23,0 -8729,5,1,0,1,2,0,8,4,4,0,2,4,1,27,0 -8730,0,7,0,10,3,6,2,5,3,0,2,15,1,26,0 -8731,2,6,0,1,1,4,9,5,3,1,12,5,4,13,1 -8732,5,0,0,15,2,0,5,4,3,1,6,20,3,6,0 -8733,1,7,0,1,1,6,6,3,1,1,11,9,5,35,0 -8734,8,3,0,0,1,0,0,3,1,0,16,2,0,6,0 -8735,7,3,0,3,3,3,7,2,1,1,13,14,2,22,0 -8736,1,6,0,1,3,6,5,3,2,0,1,19,2,5,1 -8737,9,4,0,7,0,6,0,2,3,1,4,20,1,8,0 -8738,4,4,0,15,1,6,8,3,1,1,0,11,4,11,0 -8739,10,7,0,3,0,3,2,5,2,0,13,11,5,6,0 -8740,3,2,0,4,0,6,8,1,0,1,8,0,0,29,0 
-8741,7,3,0,7,2,5,6,1,1,1,8,3,5,32,0 -8742,10,0,0,10,6,3,11,1,3,1,11,1,3,9,1 -8743,0,8,0,9,2,5,14,3,1,0,2,9,2,37,0 -8744,0,6,0,4,4,0,10,3,0,0,15,11,0,28,0 -8745,0,8,0,6,5,5,3,4,0,0,8,13,3,28,0 -8746,3,8,0,15,1,4,0,0,4,1,18,1,4,36,1 -8747,2,8,0,4,6,5,8,5,0,0,8,20,4,20,0 -8748,4,7,0,5,5,6,1,5,3,0,13,2,0,36,0 -8749,1,2,0,13,4,4,0,1,3,0,8,11,5,7,0 -8750,2,5,0,9,4,1,3,3,4,1,5,18,1,36,1 -8751,5,6,0,13,4,6,3,1,0,1,1,17,1,33,1 -8752,6,3,0,13,0,2,4,4,4,1,10,2,2,26,0 -8753,9,5,0,3,2,0,5,3,4,1,2,10,3,34,0 -8754,2,3,0,11,0,0,6,0,0,1,8,11,2,16,0 -8755,0,3,0,0,6,4,2,0,4,0,10,6,3,20,0 -8756,2,7,0,7,1,5,11,4,2,1,9,4,0,27,0 -8757,2,7,0,6,1,5,2,0,3,1,0,5,0,12,0 -8758,3,6,0,5,6,6,9,4,1,0,13,6,0,9,0 -8759,1,3,0,5,0,4,9,1,0,1,2,11,1,28,0 -8760,1,8,0,9,2,6,1,2,4,1,16,3,1,0,0 -8761,0,6,0,3,5,0,8,4,2,0,0,2,0,22,0 -8762,3,2,0,8,6,0,8,1,4,1,17,11,4,1,0 -8763,10,3,0,13,6,3,10,3,2,0,2,15,1,8,0 -8764,0,4,0,3,4,6,3,5,0,0,0,17,1,35,0 -8765,3,4,0,3,5,5,3,0,3,0,18,14,1,6,0 -8766,6,8,0,3,0,0,5,5,1,0,13,11,2,3,0 -8767,6,1,0,13,6,0,4,0,0,1,0,11,4,40,0 -8768,5,7,0,12,5,5,10,3,2,0,15,5,1,27,0 -8769,1,6,0,10,3,5,4,0,2,0,1,14,1,17,1 -8770,1,3,0,3,5,1,0,3,4,0,8,18,3,18,0 -8771,0,8,0,10,1,2,9,3,1,0,17,17,2,0,0 -8772,9,0,0,14,2,1,14,4,1,1,9,14,3,4,1 -8773,10,0,0,8,0,6,5,1,0,0,17,0,0,32,0 -8774,0,5,0,1,6,0,12,3,3,0,2,2,3,31,0 -8775,0,6,0,3,2,0,14,1,3,1,2,0,5,6,0 -8776,7,1,0,9,0,0,3,5,3,0,13,13,0,3,0 -8777,10,4,0,3,4,0,8,3,4,0,17,11,0,33,0 -8778,0,8,0,14,0,4,10,3,4,0,0,13,5,10,0 -8779,1,8,0,14,3,0,4,1,1,1,13,20,5,22,0 -8780,1,0,0,5,6,0,5,0,3,1,8,15,0,0,0 -8781,0,5,0,14,2,6,4,0,3,0,6,4,0,33,0 -8782,2,1,0,13,0,4,5,0,2,1,6,1,1,31,0 -8783,10,2,0,6,2,3,9,4,1,1,3,5,2,10,0 -8784,3,0,0,3,6,6,0,1,1,1,15,11,4,16,0 -8785,5,5,0,6,0,3,4,5,1,1,13,15,5,20,0 -8786,7,8,0,10,5,6,9,1,2,0,13,20,1,21,0 -8787,0,3,0,6,2,0,7,2,1,1,12,11,5,11,0 -8788,2,2,0,3,2,0,5,0,2,0,17,9,2,41,0 -8789,6,7,0,2,1,5,9,4,3,0,16,14,0,4,0 -8790,0,2,0,3,3,2,7,3,4,0,13,2,1,8,0 -8791,7,8,0,4,4,1,13,3,3,0,5,7,3,24,1 -8792,4,5,0,4,1,1,0,5,1,0,7,10,5,23,1 
-8793,1,5,0,6,6,3,2,4,2,0,2,12,5,36,0 -8794,0,0,0,12,6,0,8,3,3,0,2,2,5,37,0 -8795,6,5,0,3,2,6,1,4,3,0,7,15,3,17,0 -8796,8,3,0,5,1,1,2,4,4,1,5,17,3,27,1 -8797,6,4,0,12,1,4,8,3,0,0,2,13,0,10,0 -8798,0,1,0,14,5,0,11,2,0,0,9,17,5,39,1 -8799,5,5,0,14,5,6,13,1,0,1,16,1,5,24,1 -8800,9,4,0,12,0,5,4,4,3,1,2,2,0,4,0 -8801,10,6,0,3,3,3,14,1,1,0,13,20,3,27,0 -8802,0,5,0,7,0,0,8,4,0,1,13,13,0,40,0 -8803,2,1,0,11,2,3,7,2,0,1,13,16,0,27,0 -8804,5,4,0,13,1,4,8,0,3,0,4,6,1,23,0 -8805,2,8,0,15,1,5,8,1,4,0,12,2,0,38,0 -8806,4,6,0,3,2,2,8,2,1,1,4,0,3,5,0 -8807,1,6,0,0,2,3,12,1,2,0,8,16,3,25,0 -8808,9,0,0,0,4,4,12,0,4,1,1,1,3,38,1 -8809,8,5,0,11,5,5,10,0,0,1,18,10,3,27,1 -8810,10,1,0,11,6,2,11,3,4,0,7,4,4,6,1 -8811,8,4,0,3,6,0,5,3,4,0,15,4,2,6,0 -8812,8,0,0,5,5,4,2,2,2,1,8,4,0,23,0 -8813,3,2,0,14,1,5,14,3,2,1,3,12,4,35,1 -8814,4,4,0,0,2,0,3,1,4,0,14,10,1,40,1 -8815,9,4,0,1,3,0,7,4,2,1,6,17,0,3,0 -8816,2,3,0,5,0,4,0,4,1,0,17,11,1,35,0 -8817,5,7,0,0,4,1,7,4,1,0,18,17,4,26,1 -8818,5,7,0,4,3,0,9,2,4,1,3,18,4,34,0 -8819,3,7,0,15,1,2,2,3,1,0,13,7,1,16,0 -8820,1,8,0,14,4,1,12,4,4,1,4,10,0,18,1 -8821,8,1,0,0,6,3,10,0,0,1,8,20,3,20,0 -8822,0,8,0,12,6,1,5,4,0,1,2,11,3,13,0 -8823,1,8,0,15,4,5,2,0,4,1,11,9,0,41,0 -8824,0,3,0,2,0,5,0,4,1,0,8,13,5,11,0 -8825,6,6,0,13,0,0,12,4,3,1,2,6,5,14,0 -8826,1,4,0,2,5,4,12,3,2,0,13,2,0,33,0 -8827,8,6,0,5,3,2,10,3,1,1,1,14,2,19,1 -8828,0,1,0,13,6,2,4,4,2,1,16,6,1,0,0 -8829,2,1,0,6,5,5,0,1,0,1,13,5,1,14,0 -8830,1,6,0,15,6,0,9,5,3,0,10,11,4,13,0 -8831,0,6,0,15,0,0,5,1,2,0,11,2,0,26,0 -8832,3,6,0,6,6,5,6,3,1,1,4,2,2,26,0 -8833,2,2,0,12,0,6,9,0,1,1,10,9,0,8,0 -8834,1,4,0,10,2,5,13,0,1,1,13,11,0,20,0 -8835,3,0,0,13,1,5,1,4,4,0,13,4,0,28,0 -8836,3,0,0,13,6,5,4,1,3,1,5,0,5,5,0 -8837,1,2,0,1,2,5,9,2,0,0,13,11,2,41,0 -8838,1,8,0,13,0,5,7,3,2,1,15,16,5,23,0 -8839,7,2,0,15,6,5,2,5,1,1,15,10,5,31,1 -8840,3,4,0,4,6,4,4,4,0,1,4,16,5,17,0 -8841,1,8,0,1,4,3,8,0,0,1,0,16,1,7,0 -8842,2,4,0,4,3,5,9,0,1,0,13,6,0,41,0 -8843,3,4,0,2,3,5,12,4,0,1,13,11,0,30,0 
-8844,2,5,0,7,0,4,1,1,3,0,11,14,5,24,0 -8845,3,2,0,9,2,6,10,4,4,0,5,16,2,38,1 -8846,9,8,0,1,6,6,9,0,2,1,0,0,1,0,0 -8847,10,1,0,15,1,0,14,3,0,1,14,17,4,37,1 -8848,9,6,0,2,5,0,9,4,3,1,0,2,4,6,0 -8849,9,5,0,1,6,4,10,3,0,1,15,19,1,40,0 -8850,0,2,0,9,0,4,2,3,2,0,8,2,3,31,0 -8851,5,8,0,12,6,5,9,3,2,1,13,15,1,13,0 -8852,10,0,0,13,0,0,7,3,0,0,2,13,2,40,0 -8853,6,3,0,5,4,5,9,3,1,0,13,2,0,3,0 -8854,7,8,0,15,0,5,14,0,0,1,13,11,2,12,0 -8855,8,8,0,14,5,5,9,1,1,1,18,7,5,8,0 -8856,10,3,0,9,2,6,4,3,3,1,3,10,3,21,1 -8857,2,8,0,12,2,1,4,1,2,1,8,20,0,24,0 -8858,9,5,0,5,1,6,3,5,3,1,14,19,4,27,1 -8859,7,2,0,11,3,4,14,1,2,1,2,0,1,39,0 -8860,8,0,0,2,4,4,9,0,3,1,13,6,1,5,0 -8861,7,2,0,7,2,6,9,4,2,0,8,3,3,8,0 -8862,10,7,0,6,0,5,6,4,4,1,13,20,2,31,0 -8863,8,4,0,5,0,0,11,3,2,1,18,7,0,39,1 -8864,1,8,0,2,2,4,13,3,0,1,8,11,3,38,0 -8865,0,1,0,10,1,1,8,5,1,1,7,1,1,35,1 -8866,0,8,0,15,1,2,8,1,3,0,0,1,3,6,0 -8867,4,3,0,5,2,5,4,4,1,0,16,12,2,30,0 -8868,0,6,0,6,6,2,2,2,1,1,6,17,1,39,0 -8869,2,2,0,5,2,1,6,5,1,1,4,2,0,1,0 -8870,2,7,0,10,0,5,9,1,4,1,2,0,3,30,0 -8871,2,8,0,13,6,1,8,1,1,1,8,13,3,29,0 -8872,0,1,0,15,5,6,11,3,0,0,4,9,3,30,0 -8873,10,6,0,12,2,0,1,4,3,0,9,10,3,20,1 -8874,6,7,0,2,5,4,13,3,2,0,8,0,5,13,0 -8875,0,8,0,7,0,0,5,1,2,0,6,2,3,3,0 -8876,0,5,0,3,0,3,8,1,2,0,2,9,0,9,0 -8877,8,1,0,3,4,2,10,3,1,0,9,3,0,23,1 -8878,10,4,0,1,5,3,12,5,0,1,7,1,5,13,1 -8879,8,0,0,15,6,2,4,2,0,0,13,2,1,33,0 -8880,0,2,0,1,0,3,7,2,0,0,2,6,4,34,0 -8881,5,4,0,8,3,1,5,4,3,1,2,13,0,22,0 -8882,1,4,0,4,6,5,0,2,1,0,7,8,4,14,0 -8883,0,6,0,11,0,0,8,0,4,1,0,5,4,12,0 -8884,7,3,0,10,5,4,13,4,0,0,5,5,5,19,1 -8885,9,0,0,2,3,1,4,5,0,0,18,17,3,23,1 -8886,1,8,0,15,6,0,6,3,2,1,17,11,1,21,0 -8887,4,0,0,10,0,3,0,0,0,1,6,3,0,11,0 -8888,0,1,0,15,0,0,8,5,3,1,13,11,3,19,0 -8889,2,0,0,4,2,4,9,3,0,1,18,0,2,27,0 -8890,0,8,0,13,6,2,7,2,2,1,4,2,1,9,0 -8891,0,1,0,1,5,3,9,1,4,1,18,10,4,9,1 -8892,4,6,0,5,0,6,1,1,0,1,6,20,3,9,0 -8893,2,6,0,1,5,2,12,3,2,0,14,1,4,18,0 -8894,4,8,0,6,0,1,3,3,1,1,14,6,2,18,0 -8895,7,0,0,8,3,2,5,3,1,1,3,8,0,25,1 
-8896,7,0,0,2,4,6,5,4,0,1,8,0,0,36,0 -8897,0,2,0,5,6,3,2,0,2,1,10,18,2,10,0 -8898,1,8,0,1,1,0,9,1,0,1,11,15,0,6,0 -8899,1,6,0,1,4,6,12,3,4,1,18,8,3,20,1 -8900,9,2,0,13,1,0,0,1,0,1,13,16,2,14,0 -8901,4,8,0,12,0,4,8,3,2,1,8,6,3,12,0 -8902,4,8,0,9,6,1,9,4,2,1,13,11,2,23,0 -8903,9,8,0,11,2,5,14,1,2,1,13,10,4,7,0 -8904,4,2,0,12,2,5,0,1,2,0,9,10,4,9,1 -8905,2,6,0,1,6,4,0,0,3,1,2,8,0,19,0 -8906,3,8,0,9,5,6,2,0,1,0,2,2,5,33,0 -8907,1,8,0,8,5,3,8,0,2,0,2,11,0,26,0 -8908,0,3,0,3,2,2,2,3,4,0,12,2,1,27,0 -8909,8,1,0,11,0,3,12,0,1,1,8,0,3,26,0 -8910,8,6,0,15,6,2,13,3,2,0,5,1,3,10,1 -8911,6,1,0,7,6,6,3,1,2,0,2,2,1,7,0 -8912,3,1,0,13,6,0,8,1,0,0,2,11,4,5,0 -8913,3,4,0,12,4,4,1,0,0,0,17,2,2,14,0 -8914,5,6,0,6,3,1,1,4,0,0,1,14,2,23,1 -8915,7,7,0,1,1,6,11,0,1,0,14,6,4,38,1 -8916,5,6,0,9,0,2,9,1,3,0,12,15,0,19,0 -8917,4,6,0,9,6,2,0,1,4,0,18,0,4,40,1 -8918,10,5,0,7,0,5,9,1,2,0,0,3,1,38,0 -8919,0,1,0,11,2,2,10,4,2,0,12,6,4,37,0 -8920,2,7,0,14,1,0,3,1,3,1,13,12,5,23,0 -8921,1,8,0,13,2,3,5,1,2,0,2,0,0,23,0 -8922,1,4,0,7,1,6,7,3,0,0,14,20,0,41,0 -8923,10,8,0,13,5,2,9,4,3,0,2,19,5,4,0 -8924,1,5,0,12,1,4,13,3,4,0,10,11,4,31,0 -8925,9,2,0,3,2,2,9,5,2,0,5,7,2,41,1 -8926,0,0,0,2,2,0,5,3,4,1,13,2,0,34,0 -8927,6,5,0,2,1,1,11,3,1,0,17,4,1,23,0 -8928,5,1,0,3,5,5,14,0,3,1,2,9,3,31,0 -8929,9,4,0,3,2,6,5,4,4,0,12,7,2,40,0 -8930,0,2,0,4,0,4,1,3,0,0,0,11,3,10,0 -8931,3,3,0,2,6,5,8,1,0,0,2,2,4,2,0 -8932,1,5,0,12,6,5,14,0,4,1,14,10,2,19,1 -8933,3,6,0,0,3,5,6,0,2,1,4,13,5,2,0 -8934,6,2,0,12,6,0,6,0,3,0,6,18,3,9,0 -8935,7,4,0,7,3,5,2,1,4,1,3,17,5,7,1 -8936,0,8,0,1,4,0,5,1,3,0,4,2,4,17,0 -8937,2,3,0,3,5,1,13,0,0,0,5,2,2,21,0 -8938,1,6,0,1,6,5,7,1,2,0,8,4,0,38,0 -8939,10,2,0,13,1,3,12,0,0,1,9,8,0,19,1 -8940,5,7,0,2,0,4,14,0,3,1,9,5,0,26,1 -8941,2,0,0,10,5,0,3,4,4,1,2,20,0,14,0 -8942,2,7,0,0,4,0,9,0,2,0,13,2,0,4,0 -8943,0,5,0,6,0,1,14,5,0,0,18,12,4,29,0 -8944,9,6,0,7,0,0,7,4,4,1,7,9,0,19,0 -8945,4,6,0,11,0,6,3,1,4,0,13,11,4,3,0 -8946,5,4,0,6,4,0,8,2,0,1,6,9,5,12,0 -8947,7,8,0,13,3,0,12,3,1,1,14,15,3,2,1 
-8948,9,4,0,0,3,1,10,1,0,0,1,3,4,38,1 -8949,0,8,0,11,5,5,1,2,0,0,13,11,0,21,0 -8950,6,5,0,12,1,2,4,1,1,1,13,2,5,28,0 -8951,2,7,0,15,1,4,14,0,3,1,0,13,4,3,0 -8952,1,4,0,3,2,2,0,0,3,1,15,6,0,21,0 -8953,8,8,0,10,0,1,13,0,3,0,18,7,5,2,1 -8954,2,8,0,7,1,6,2,1,4,1,4,13,4,21,0 -8955,1,2,0,1,4,1,3,5,0,0,9,7,3,1,1 -8956,9,7,0,5,2,6,7,4,0,1,14,6,4,7,0 -8957,4,3,0,13,3,0,6,5,3,1,18,5,1,35,1 -8958,0,3,0,9,3,4,1,4,0,0,2,4,0,6,0 -8959,1,2,0,9,2,1,7,0,3,0,17,11,4,37,0 -8960,5,7,0,3,5,6,9,3,2,0,0,3,5,10,0 -8961,5,7,0,0,2,5,4,1,0,1,13,6,4,3,0 -8962,9,6,0,13,3,3,6,2,3,1,2,2,3,30,0 -8963,6,7,0,11,2,0,4,5,2,1,3,10,0,25,1 -8964,4,1,0,13,4,2,1,3,4,1,16,19,3,7,1 -8965,0,1,0,13,2,5,6,3,4,1,6,6,5,22,0 -8966,9,2,0,11,0,4,1,0,0,1,14,15,3,32,0 -8967,6,7,0,10,6,1,9,2,1,1,12,8,1,13,1 -8968,1,7,0,4,6,1,6,4,0,0,17,2,5,22,0 -8969,0,4,0,13,2,3,7,2,3,1,12,2,0,8,0 -8970,9,4,0,14,1,3,10,2,0,1,13,17,4,36,1 -8971,4,5,0,6,2,1,2,2,1,1,9,1,2,28,1 -8972,1,3,0,5,1,0,5,4,3,1,17,0,5,32,0 -8973,0,6,0,7,4,3,5,4,3,1,8,15,5,19,0 -8974,5,4,0,12,4,6,3,3,3,1,6,5,1,16,1 -8975,3,6,0,10,3,6,4,4,0,0,2,0,0,33,0 -8976,0,6,0,6,4,4,14,1,2,1,14,20,4,24,1 -8977,3,6,0,0,6,3,8,1,1,1,0,13,2,8,0 -8978,3,7,0,9,0,2,2,1,3,1,4,9,0,25,0 -8979,0,7,0,0,0,3,2,0,1,1,6,9,3,30,0 -8980,2,0,0,0,1,0,11,2,1,0,8,13,4,35,0 -8981,10,8,0,3,5,5,7,2,2,1,6,2,2,9,0 -8982,2,7,0,8,4,5,4,0,1,0,13,11,5,18,0 -8983,6,2,0,3,0,2,0,1,2,1,6,7,2,19,1 -8984,8,5,0,4,2,0,5,0,3,0,13,1,0,4,0 -8985,8,2,0,12,5,6,7,4,4,1,14,2,5,36,0 -8986,1,8,0,8,0,5,7,1,1,0,2,4,0,40,0 -8987,2,0,0,13,2,6,0,3,3,0,8,13,5,34,0 -8988,6,6,0,3,6,4,7,2,0,1,2,2,1,20,0 -8989,5,5,0,13,6,2,10,0,0,0,13,12,0,3,0 -8990,6,4,0,7,1,4,6,0,1,1,1,7,5,21,1 -8991,5,0,0,15,0,3,8,4,1,1,2,14,3,35,0 -8992,9,6,0,8,6,3,4,5,4,1,9,7,3,38,1 -8993,6,8,0,10,4,0,13,4,4,0,0,7,4,37,1 -8994,0,7,0,0,3,6,8,5,3,1,9,7,5,23,1 -8995,3,1,0,10,6,0,8,4,1,0,13,11,4,35,0 -8996,3,2,0,8,3,4,9,5,2,0,8,11,0,26,0 -8997,0,3,0,3,5,3,10,1,0,0,8,2,4,13,0 -8998,10,0,0,3,0,0,3,3,4,1,17,10,0,30,0 -8999,10,5,0,3,2,3,9,1,2,1,16,20,5,36,0 
-9000,4,4,0,1,3,2,13,0,2,1,0,12,0,31,0 -9001,5,0,0,9,3,2,8,0,0,1,2,4,2,34,0 -9002,5,5,0,11,6,5,6,1,0,0,0,1,4,34,0 -9003,2,8,0,13,6,5,12,2,2,0,8,2,4,22,0 -9004,9,3,0,15,2,4,14,3,2,1,2,9,4,25,0 -9005,1,0,0,13,0,6,9,0,4,1,8,2,0,38,0 -9006,6,7,0,15,3,2,10,1,2,1,5,1,5,37,1 -9007,2,6,0,3,5,0,1,2,0,0,13,11,4,9,0 -9008,4,6,0,6,1,3,7,3,2,1,13,11,3,4,0 -9009,10,8,0,14,5,4,11,5,4,1,18,16,2,29,1 -9010,3,4,0,1,0,6,9,3,4,0,17,17,5,19,0 -9011,2,7,0,6,6,2,5,3,1,0,13,2,2,14,0 -9012,7,3,0,15,4,5,9,2,1,1,16,17,4,21,1 -9013,1,6,0,14,1,5,4,4,3,0,13,4,4,20,0 -9014,2,6,0,12,0,0,0,2,2,1,8,1,0,40,0 -9015,3,5,0,11,5,3,14,2,4,1,0,2,4,21,0 -9016,9,6,0,11,6,2,10,3,2,1,11,14,3,3,1 -9017,0,8,0,7,3,4,8,1,0,1,8,18,1,24,0 -9018,6,4,0,1,6,3,4,1,1,1,16,13,5,41,0 -9019,10,4,0,2,2,1,14,2,0,1,18,10,5,10,1 -9020,4,5,0,9,1,5,4,2,1,0,9,8,5,9,1 -9021,10,3,0,6,5,2,2,2,2,0,0,12,3,2,0 -9022,0,3,0,11,1,5,3,0,3,1,2,20,4,19,0 -9023,1,0,0,4,0,5,5,0,0,1,11,19,5,41,0 -9024,10,2,0,13,5,5,3,1,4,1,18,10,1,40,1 -9025,8,1,0,14,1,5,14,1,1,1,18,1,4,9,1 -9026,2,6,0,8,3,6,1,2,0,0,10,13,4,21,0 -9027,0,6,0,6,0,5,11,1,3,0,13,14,3,0,0 -9028,4,0,0,6,4,0,8,0,2,0,4,6,2,5,0 -9029,8,5,0,15,1,6,3,5,4,0,9,1,3,1,1 -9030,6,6,0,3,5,4,8,2,2,0,8,6,1,17,0 -9031,3,1,0,2,0,5,12,3,3,0,13,17,2,33,0 -9032,0,6,0,10,0,4,1,4,0,1,9,0,5,2,0 -9033,5,3,0,11,2,4,2,0,4,0,12,7,4,29,0 -9034,3,6,0,12,5,5,9,2,0,1,7,11,1,17,0 -9035,4,2,0,1,0,5,10,1,3,1,13,9,3,2,0 -9036,0,3,0,8,1,1,4,1,1,0,6,15,0,26,0 -9037,9,0,0,8,5,6,9,1,0,0,2,12,4,29,0 -9038,0,7,0,8,3,0,2,4,2,1,17,13,0,34,0 -9039,7,6,0,14,2,1,11,1,2,1,14,10,1,41,1 -9040,10,4,0,2,0,5,9,5,3,0,13,11,5,39,0 -9041,2,7,0,3,0,6,7,1,1,0,17,0,1,8,0 -9042,6,8,0,12,3,6,10,3,4,1,5,7,1,37,1 -9043,1,7,0,1,0,2,4,2,0,0,0,1,3,9,0 -9044,3,1,0,15,4,0,2,4,4,0,13,0,2,29,0 -9045,4,5,0,11,0,6,9,5,3,0,2,11,2,29,0 -9046,1,3,0,8,0,4,2,2,1,1,0,6,2,2,0 -9047,8,5,0,1,4,0,12,4,0,1,4,14,2,23,0 -9048,7,4,0,1,3,3,4,4,1,0,18,1,0,12,1 -9049,9,5,0,12,3,1,9,5,4,1,9,7,4,21,1 -9050,0,2,0,7,0,4,9,1,3,0,13,4,3,16,0 
-9051,10,3,0,15,4,1,6,0,1,0,9,7,1,9,1 -9052,2,7,0,15,6,3,8,1,1,0,17,2,3,24,0 -9053,0,7,0,9,0,1,7,5,0,0,6,2,2,17,0 -9054,10,7,0,11,6,6,9,5,0,0,1,1,4,23,1 -9055,9,6,0,11,3,1,14,1,1,0,6,16,4,27,0 -9056,9,0,0,8,6,4,7,1,4,0,2,2,1,0,0 -9057,5,2,0,2,5,1,1,4,4,1,16,19,5,7,1 -9058,7,5,0,8,6,0,0,5,2,1,18,12,2,3,1 -9059,9,5,0,15,0,4,3,4,2,0,17,17,1,2,0 -9060,8,2,0,0,6,1,0,4,3,0,8,2,2,11,0 -9061,8,2,0,6,1,3,7,0,1,1,7,0,1,0,0 -9062,1,4,0,12,0,1,3,5,0,1,14,7,4,35,1 -9063,6,3,0,8,5,5,10,3,2,0,13,6,3,1,0 -9064,9,8,0,11,6,6,5,2,0,0,8,9,0,0,0 -9065,0,0,0,13,0,1,9,0,0,0,8,5,0,13,0 -9066,1,4,0,9,0,2,14,0,0,0,18,7,1,36,1 -9067,10,8,0,3,2,1,5,1,1,1,17,9,2,7,0 -9068,3,6,0,14,5,0,9,5,0,0,12,2,3,37,0 -9069,3,0,0,14,0,4,11,5,2,0,0,11,0,17,0 -9070,0,3,0,13,1,4,10,1,1,0,8,18,4,34,0 -9071,10,5,0,11,0,0,3,3,4,1,2,15,1,37,0 -9072,2,3,0,15,0,3,1,4,0,0,4,0,1,34,0 -9073,10,3,0,12,0,2,3,3,1,1,1,7,1,6,1 -9074,2,2,0,7,2,2,10,4,2,0,14,15,1,17,0 -9075,0,7,0,14,5,0,6,2,3,0,8,0,0,20,0 -9076,2,7,0,3,6,2,3,2,4,1,13,18,2,29,0 -9077,10,4,0,3,2,3,0,0,0,0,2,2,2,41,0 -9078,3,1,0,5,0,0,3,5,2,1,9,14,2,4,1 -9079,2,3,0,13,0,2,0,3,2,1,2,15,0,30,0 -9080,0,4,0,7,1,3,13,2,4,0,3,15,3,17,0 -9081,0,0,0,7,0,1,12,3,3,0,13,6,0,30,0 -9082,5,6,0,15,6,6,3,0,1,0,15,13,3,31,0 -9083,8,4,0,3,6,4,14,0,2,0,2,11,1,27,0 -9084,3,0,0,6,3,1,12,5,4,1,0,2,1,16,0 -9085,6,5,0,9,2,0,0,4,1,0,2,11,4,23,0 -9086,5,6,0,12,5,5,8,3,4,1,13,1,3,18,0 -9087,9,8,0,13,1,5,7,3,0,0,17,18,2,16,0 -9088,9,8,0,12,3,5,8,3,3,1,2,14,1,5,0 -9089,5,4,0,8,1,5,6,1,3,0,0,11,1,7,0 -9090,1,5,0,7,0,4,2,3,0,1,13,9,0,7,0 -9091,2,1,0,6,6,0,1,3,3,0,13,18,3,8,0 -9092,7,0,0,15,4,6,13,0,3,1,15,3,1,28,0 -9093,1,0,0,5,1,0,5,3,1,1,0,6,0,39,0 -9094,0,3,0,2,2,5,14,2,0,1,17,0,1,36,0 -9095,0,2,0,1,4,4,10,2,0,1,11,17,1,16,0 -9096,4,3,0,7,0,4,8,0,1,1,2,0,3,34,0 -9097,4,4,0,15,2,4,10,0,4,0,18,5,1,33,1 -9098,0,3,0,10,6,0,5,4,2,0,2,18,4,8,0 -9099,1,2,0,5,6,1,8,1,3,0,8,14,5,3,0 -9100,1,7,0,2,2,1,14,1,1,1,0,19,3,2,1 -9101,0,7,0,2,3,5,6,0,1,1,0,5,3,33,0 -9102,8,6,0,4,6,1,7,4,0,1,17,11,5,28,0 
-9103,8,8,0,14,4,1,3,0,2,1,9,19,3,12,1 -9104,1,7,0,7,2,5,8,4,1,0,13,0,0,39,0 -9105,1,8,0,14,2,1,2,2,3,1,13,20,2,13,0 -9106,9,6,0,9,4,5,2,4,1,1,2,16,0,25,0 -9107,5,2,0,14,1,5,14,5,4,1,0,0,2,24,0 -9108,1,5,0,13,2,0,12,0,0,0,2,20,0,5,0 -9109,2,3,0,12,2,4,3,5,1,1,18,8,4,21,1 -9110,8,0,0,15,1,1,10,2,0,1,8,17,3,7,0 -9111,3,8,0,13,1,5,7,1,4,1,4,2,1,11,0 -9112,4,4,0,6,2,4,5,4,2,0,13,0,5,30,0 -9113,3,2,0,3,6,2,10,0,4,0,13,12,1,22,0 -9114,2,7,0,0,5,5,4,0,1,1,8,0,2,21,0 -9115,5,0,0,10,4,2,0,5,4,0,15,18,0,7,1 -9116,9,3,0,12,5,0,0,3,1,1,4,19,2,14,0 -9117,5,6,0,11,3,5,11,4,0,0,18,16,0,24,1 -9118,0,5,0,11,2,1,4,5,2,1,18,5,3,12,1 -9119,9,6,0,14,0,5,1,0,2,1,13,20,2,0,0 -9120,1,3,0,7,4,5,5,3,4,1,17,0,2,38,1 -9121,7,0,0,13,6,5,3,4,4,0,13,14,4,22,0 -9122,7,1,0,8,0,6,8,5,2,1,5,2,5,29,0 -9123,0,6,0,10,0,0,7,0,1,1,17,12,3,33,0 -9124,0,2,0,4,0,0,10,4,0,1,0,18,4,1,0 -9125,7,0,0,7,0,6,11,0,2,1,12,2,2,12,0 -9126,8,0,0,7,3,3,3,4,0,0,0,17,1,21,0 -9127,5,8,0,13,2,6,1,1,1,1,1,7,4,37,1 -9128,2,2,0,4,5,5,9,3,4,1,10,6,3,20,0 -9129,3,8,0,1,6,3,8,0,4,1,13,13,3,19,0 -9130,10,1,0,15,6,1,2,5,0,0,7,7,3,7,1 -9131,6,8,0,2,1,1,2,0,2,1,8,16,5,23,0 -9132,0,2,0,15,6,5,9,5,4,0,12,2,5,12,0 -9133,2,8,0,15,0,4,1,3,2,0,13,12,0,5,0 -9134,2,8,0,13,0,5,5,2,1,0,17,9,0,21,0 -9135,10,8,0,0,5,5,12,5,3,1,1,5,5,1,1 -9136,3,2,0,14,6,5,11,2,0,0,15,2,3,17,0 -9137,3,8,0,11,6,3,6,4,3,0,2,2,2,37,0 -9138,4,3,0,14,5,0,6,3,3,0,17,0,3,16,0 -9139,8,8,0,10,0,4,3,1,2,1,10,15,3,34,0 -9140,8,3,0,1,0,4,0,2,2,1,12,3,0,13,0 -9141,0,8,0,4,1,0,5,2,0,0,13,2,2,41,0 -9142,6,0,0,12,4,6,2,3,3,0,16,20,4,35,0 -9143,1,6,0,7,0,4,9,2,4,1,3,2,3,6,0 -9144,4,6,0,1,6,5,12,1,4,0,15,13,1,27,0 -9145,8,6,0,4,4,6,4,4,2,0,7,14,3,24,1 -9146,0,7,0,7,4,2,13,2,0,0,10,6,4,26,0 -9147,9,3,0,15,0,4,11,1,4,0,8,7,4,6,0 -9148,9,6,0,12,6,6,8,2,1,0,1,18,5,36,0 -9149,2,0,0,14,0,1,14,3,2,0,1,8,1,30,1 -9150,3,1,0,13,4,3,10,2,1,0,13,1,0,36,0 -9151,3,8,0,7,0,5,3,3,0,0,11,2,0,7,0 -9152,0,1,0,7,1,4,3,1,3,0,13,8,2,10,0 -9153,3,6,0,0,3,0,7,4,2,0,2,11,1,9,0 
-9154,7,3,0,7,4,0,9,5,2,1,3,10,4,20,1 -9155,10,6,0,3,0,4,4,4,4,0,6,0,5,1,0 -9156,7,6,0,3,4,0,10,3,1,1,15,15,1,38,0 -9157,8,7,0,7,5,0,12,1,3,0,12,11,0,2,0 -9158,2,3,0,10,3,6,11,1,2,0,2,20,4,36,0 -9159,3,2,0,11,4,4,1,4,0,0,8,4,1,34,0 -9160,0,6,0,3,6,4,2,1,2,0,0,8,3,41,0 -9161,9,5,0,4,4,1,14,0,3,0,15,13,2,20,0 -9162,5,1,0,7,6,1,13,5,3,0,12,1,5,21,1 -9163,3,6,0,5,1,0,8,3,2,0,4,18,2,19,0 -9164,0,7,0,13,6,0,14,4,0,1,17,11,0,35,0 -9165,8,4,0,1,6,5,3,0,2,1,13,20,2,40,0 -9166,0,5,0,7,0,0,12,0,2,1,2,11,1,10,0 -9167,5,1,0,4,0,0,0,4,2,1,8,15,0,25,0 -9168,8,2,0,4,1,6,8,2,0,1,10,0,4,10,0 -9169,4,6,0,3,6,4,8,2,2,1,8,16,5,29,0 -9170,2,0,0,10,3,5,10,4,0,0,14,2,3,5,0 -9171,10,7,0,14,4,1,7,0,4,1,18,8,3,26,1 -9172,1,3,0,5,0,4,7,3,1,1,4,11,2,8,0 -9173,6,4,0,0,6,5,1,1,3,0,8,6,0,17,0 -9174,0,1,0,6,3,2,0,5,2,0,15,6,1,36,0 -9175,0,8,0,1,4,1,8,2,1,1,2,20,0,1,0 -9176,2,0,0,0,4,1,5,4,0,1,13,2,5,8,0 -9177,9,6,0,14,3,0,5,4,2,0,13,11,1,9,0 -9178,3,8,0,0,2,0,2,1,0,0,6,15,0,4,0 -9179,0,7,0,1,6,5,4,4,4,0,2,2,1,35,0 -9180,8,4,0,10,4,1,14,2,1,0,18,16,1,18,1 -9181,8,7,0,11,5,6,11,2,3,1,4,12,2,20,1 -9182,4,2,0,1,2,6,12,2,2,0,2,2,1,11,0 -9183,7,2,0,5,5,0,0,4,0,1,8,11,1,31,0 -9184,8,7,0,2,6,4,14,5,2,1,18,10,5,27,1 -9185,8,4,0,15,4,3,0,5,0,1,7,8,2,22,1 -9186,1,3,0,10,1,1,3,3,2,0,15,17,2,30,0 -9187,9,3,0,9,2,0,9,1,2,1,18,7,4,38,1 -9188,9,3,0,4,4,1,7,1,0,1,18,14,3,41,1 -9189,10,0,0,4,4,2,9,3,0,1,4,7,2,36,0 -9190,3,3,0,0,0,3,4,0,0,0,2,18,2,35,0 -9191,5,8,0,7,6,5,10,3,0,0,11,6,0,3,0 -9192,0,2,0,11,4,4,12,4,0,0,4,10,1,14,0 -9193,9,6,0,6,0,1,9,4,0,0,8,2,0,8,0 -9194,2,2,0,2,1,3,0,2,0,1,2,9,5,4,0 -9195,1,0,0,12,2,1,8,3,0,0,2,9,0,16,0 -9196,1,8,0,8,5,0,14,2,3,1,6,2,5,4,0 -9197,9,4,0,4,6,4,3,5,0,1,1,3,4,1,1 -9198,3,2,0,11,6,5,13,5,2,1,18,15,1,5,1 -9199,0,5,0,3,0,4,6,1,0,0,12,17,1,21,0 -9200,2,8,0,4,6,2,5,1,3,1,4,11,1,39,0 -9201,0,8,0,15,1,6,3,4,2,1,13,2,3,22,0 -9202,3,4,0,13,1,0,11,4,1,1,2,15,2,37,0 -9203,1,8,0,1,0,6,3,1,0,0,13,0,5,19,0 -9204,2,8,0,15,4,5,5,2,2,0,2,15,3,35,0 -9205,6,7,0,13,3,5,5,4,2,1,13,2,5,36,0 
-9206,6,8,0,13,6,4,5,1,0,1,18,12,2,24,0 -9207,7,4,0,6,0,5,2,0,0,0,13,14,1,27,0 -9208,2,6,0,6,4,3,10,3,3,1,2,2,3,6,0 -9209,5,8,0,7,0,1,4,1,0,1,7,18,3,27,1 -9210,7,3,0,9,2,1,3,0,3,0,18,10,0,39,1 -9211,2,3,0,5,0,5,8,3,0,1,0,11,1,23,0 -9212,9,8,0,8,0,5,13,5,2,1,8,6,1,12,0 -9213,2,4,0,2,1,0,12,1,2,0,8,2,3,31,0 -9214,3,2,0,13,4,3,2,0,3,1,0,0,1,7,0 -9215,5,6,0,9,0,2,6,1,0,0,9,7,2,21,1 -9216,9,3,0,13,5,5,14,4,4,1,2,3,2,10,0 -9217,8,2,0,7,1,0,7,5,0,1,10,19,0,0,1 -9218,4,6,0,1,6,1,5,3,0,0,8,6,5,3,0 -9219,9,2,0,11,6,4,9,3,4,0,9,6,1,7,0 -9220,2,1,0,14,6,0,13,0,0,1,13,4,1,33,0 -9221,4,2,0,6,0,3,10,1,0,1,6,15,5,5,0 -9222,9,6,0,6,3,1,13,3,4,1,9,7,2,37,1 -9223,10,2,0,11,0,1,5,3,0,0,2,12,2,12,0 -9224,1,6,0,6,6,5,8,0,3,1,8,3,4,25,0 -9225,2,7,0,7,3,4,11,1,4,0,2,13,0,27,0 -9226,2,1,0,13,0,6,3,3,2,1,13,9,0,34,0 -9227,5,5,0,15,6,0,12,5,0,1,13,15,0,25,0 -9228,0,2,0,0,6,5,11,0,3,0,7,3,3,11,0 -9229,3,4,0,2,6,3,1,1,1,1,5,11,4,0,0 -9230,4,6,0,11,0,6,8,0,1,0,0,6,4,3,0 -9231,1,4,0,14,6,5,1,2,0,0,2,6,1,25,0 -9232,4,7,0,3,6,6,9,2,0,1,8,0,5,8,0 -9233,2,6,0,3,0,1,9,2,3,1,17,2,0,7,0 -9234,2,1,0,8,6,4,4,3,1,1,5,11,5,36,0 -9235,9,6,0,10,6,1,1,5,2,0,13,6,1,37,0 -9236,9,0,0,13,0,2,9,2,3,0,13,8,0,41,0 -9237,8,1,0,8,2,6,5,3,1,0,13,18,2,21,0 -9238,2,6,0,0,3,5,3,2,2,0,2,6,0,25,0 -9239,3,3,0,8,5,3,4,3,1,0,2,2,3,27,0 -9240,10,6,0,4,3,4,5,4,4,0,2,7,1,25,0 -9241,8,1,0,9,2,6,5,2,4,1,18,17,3,16,1 -9242,0,5,0,13,0,4,0,1,4,1,17,2,4,25,0 -9243,10,8,0,4,4,1,7,4,1,0,13,2,0,0,0 -9244,3,7,0,6,0,2,9,2,2,0,13,2,0,13,0 -9245,9,8,0,13,0,0,7,1,4,1,15,2,4,40,0 -9246,7,2,0,5,3,6,7,4,4,1,4,11,4,30,0 -9247,0,2,0,12,5,0,6,3,3,1,6,16,2,17,0 -9248,1,5,0,9,3,3,1,0,0,1,6,2,0,1,0 -9249,0,2,0,6,0,5,4,3,0,0,8,0,0,6,0 -9250,6,0,0,13,1,4,8,4,2,0,2,7,2,10,0 -9251,7,0,0,6,2,1,6,2,2,1,13,18,5,30,0 -9252,0,1,0,0,2,2,7,4,4,1,17,11,0,6,0 -9253,0,0,0,5,0,5,1,1,1,0,13,9,4,20,0 -9254,8,2,0,2,5,3,9,1,3,1,16,1,5,7,1 -9255,0,6,0,8,0,4,13,5,3,1,13,20,1,28,0 -9256,9,5,0,3,6,5,11,1,2,0,17,2,1,19,0 -9257,10,6,0,5,1,0,8,3,4,0,10,2,0,23,0 
-9258,6,0,0,14,4,1,8,0,4,1,18,19,3,30,1 -9259,10,3,0,8,0,0,11,2,3,1,11,2,0,6,0 -9260,4,2,0,1,6,0,5,4,3,1,17,19,1,8,0 -9261,5,6,0,9,5,4,1,2,0,0,2,11,2,32,0 -9262,9,3,0,2,1,3,9,2,4,1,8,2,2,35,0 -9263,10,3,0,3,1,4,0,1,0,1,12,2,1,39,0 -9264,3,8,0,5,3,6,6,3,0,0,8,13,5,29,0 -9265,2,1,0,10,0,2,11,4,4,1,1,3,1,6,1 -9266,0,0,0,3,3,4,8,1,3,0,8,15,4,7,0 -9267,7,6,0,14,3,3,5,4,1,0,18,10,4,22,1 -9268,8,4,0,2,3,0,13,2,4,0,17,2,4,5,0 -9269,1,3,0,3,6,1,0,4,0,0,2,11,0,26,0 -9270,2,6,0,8,0,3,3,1,2,1,5,17,3,8,1 -9271,1,5,0,7,2,6,4,1,0,0,12,1,5,26,0 -9272,3,5,0,4,1,2,8,3,4,1,4,16,0,33,0 -9273,0,6,0,4,0,5,0,3,3,0,10,20,4,37,0 -9274,9,6,0,12,1,4,5,3,3,1,12,0,3,10,0 -9275,3,7,0,7,3,4,7,1,3,0,10,15,2,32,0 -9276,3,8,0,13,2,5,3,2,0,0,13,15,4,34,0 -9277,0,8,0,0,1,0,11,1,2,1,2,1,3,36,0 -9278,4,4,0,5,0,0,5,0,0,0,13,7,0,29,0 -9279,1,8,0,11,6,0,1,3,3,1,2,18,5,4,0 -9280,8,2,0,12,0,2,12,0,0,1,18,7,4,5,1 -9281,1,3,0,13,2,0,6,4,3,1,17,16,5,31,0 -9282,9,1,0,11,3,5,1,0,4,0,13,0,0,5,0 -9283,0,8,0,0,3,4,6,5,4,1,13,15,1,5,0 -9284,4,1,0,5,0,5,8,2,1,1,17,11,0,27,0 -9285,7,6,0,7,0,1,7,5,3,1,18,7,2,0,1 -9286,1,8,0,15,6,4,1,3,1,0,5,13,2,32,1 -9287,7,0,0,3,6,0,5,0,2,1,8,14,0,6,0 -9288,0,1,0,12,6,2,9,3,0,1,15,0,1,18,0 -9289,2,3,0,12,5,1,6,0,2,0,13,5,1,27,1 -9290,5,4,0,0,2,0,11,1,2,1,16,4,3,21,1 -9291,1,0,0,0,5,2,9,1,0,0,4,9,5,14,0 -9292,4,8,0,8,0,5,10,2,1,1,8,0,1,14,0 -9293,6,8,0,4,0,6,7,3,3,0,4,0,2,8,0 -9294,0,4,0,3,6,2,7,2,0,1,13,20,0,37,0 -9295,10,3,0,4,1,1,9,5,4,1,9,17,1,13,1 -9296,3,4,0,15,1,5,9,4,3,1,7,2,4,36,0 -9297,9,6,0,1,5,3,11,5,2,1,14,13,4,3,0 -9298,10,5,0,0,1,2,1,4,1,1,9,5,5,7,1 -9299,0,8,0,5,1,6,5,3,4,0,13,2,0,23,0 -9300,6,0,0,7,6,1,5,0,4,0,5,4,2,27,0 -9301,6,6,0,6,0,0,3,1,0,1,8,13,3,35,0 -9302,4,1,0,0,1,1,10,3,2,0,12,8,4,12,1 -9303,0,2,0,3,2,4,11,1,0,1,13,19,5,40,0 -9304,1,0,0,5,1,3,14,5,0,1,8,11,0,36,0 -9305,6,0,0,14,4,3,8,0,2,0,3,14,3,39,1 -9306,2,2,0,7,4,4,9,1,1,1,14,3,3,21,1 -9307,0,7,0,0,3,5,3,3,1,1,8,9,5,24,0 -9308,9,8,0,12,2,2,9,3,4,1,13,6,0,2,0 -9309,7,4,0,9,6,6,4,1,1,1,6,17,5,38,1 
-9310,3,6,0,1,2,5,3,2,0,0,15,0,3,4,0 -9311,1,2,0,11,4,5,8,0,1,1,10,9,0,30,0 -9312,5,2,0,10,1,3,9,2,0,1,2,2,1,17,0 -9313,9,6,0,10,0,5,8,1,1,1,15,11,2,40,0 -9314,0,3,0,5,0,5,5,3,0,1,5,2,0,4,0 -9315,7,6,0,14,3,5,12,4,0,1,18,19,0,8,1 -9316,2,2,0,0,1,5,12,2,4,1,6,11,1,12,0 -9317,4,5,0,6,4,4,11,2,1,0,14,5,0,36,1 -9318,7,1,0,5,0,6,6,1,4,1,7,1,3,29,1 -9319,7,2,0,1,1,5,4,3,0,0,8,15,2,16,0 -9320,1,8,0,13,1,6,6,0,1,0,13,18,0,40,0 -9321,3,6,0,1,4,0,1,0,4,0,15,6,3,19,0 -9322,1,0,0,0,3,0,1,4,4,1,3,8,4,40,1 -9323,6,8,0,6,0,0,8,1,0,0,2,8,0,40,0 -9324,6,8,0,1,0,3,14,5,4,1,4,9,0,22,0 -9325,8,7,0,4,0,5,12,4,0,1,13,11,1,0,0 -9326,5,8,0,9,6,6,5,2,1,0,15,6,1,26,0 -9327,9,4,0,6,5,5,5,2,3,0,2,20,0,12,0 -9328,1,1,0,0,4,5,11,1,1,0,8,2,1,11,0 -9329,3,8,0,11,0,4,5,0,0,0,2,2,2,5,0 -9330,0,0,0,3,0,5,0,4,3,1,14,11,1,28,0 -9331,8,6,0,10,0,0,8,2,4,0,0,9,0,38,0 -9332,7,7,0,9,5,2,7,1,3,1,10,2,1,6,0 -9333,0,3,0,14,3,0,9,2,3,0,4,11,0,2,0 -9334,2,8,0,1,6,2,2,3,0,0,13,2,4,9,0 -9335,9,2,0,15,1,4,5,1,4,0,9,5,1,20,1 -9336,0,6,0,3,1,2,9,1,2,0,15,13,4,29,0 -9337,2,0,0,12,1,1,0,1,0,0,2,5,1,13,0 -9338,5,6,0,3,5,5,12,2,0,0,8,6,4,13,0 -9339,5,3,0,8,0,0,4,4,4,0,16,0,5,19,0 -9340,10,7,0,12,1,0,9,2,0,0,13,13,2,3,0 -9341,6,4,0,0,2,5,6,5,3,1,11,10,4,28,1 -9342,0,6,0,10,0,6,10,3,4,1,8,4,4,12,0 -9343,7,8,0,2,5,1,1,5,3,0,3,2,0,39,0 -9344,0,4,0,8,2,4,0,4,0,1,0,4,2,32,0 -9345,4,1,0,8,4,6,8,3,0,0,8,11,4,39,0 -9346,6,3,0,15,0,0,2,1,0,1,8,6,4,38,0 -9347,6,2,0,13,6,2,10,5,0,0,14,5,1,18,1 -9348,4,4,0,13,1,6,12,0,0,1,2,2,1,10,0 -9349,1,0,0,6,2,1,11,5,3,0,17,1,2,10,0 -9350,3,8,0,2,5,5,0,4,0,0,9,16,4,25,0 -9351,9,3,0,13,1,1,4,0,2,1,18,8,4,19,1 -9352,2,6,0,0,1,1,6,1,3,1,17,0,4,13,0 -9353,1,3,0,9,4,6,4,3,3,1,18,17,0,16,1 -9354,1,2,0,7,5,5,0,2,0,0,2,2,0,21,0 -9355,6,8,0,11,4,4,7,5,0,0,7,7,1,21,1 -9356,9,4,0,13,2,2,9,3,1,1,13,11,5,12,0 -9357,0,4,0,6,3,5,0,2,4,1,2,2,3,40,0 -9358,2,8,0,10,5,4,10,2,0,1,0,2,0,28,0 -9359,5,7,0,3,6,3,0,4,4,1,15,2,3,34,0 -9360,8,3,0,11,3,0,9,3,0,0,2,8,0,21,0 -9361,9,4,0,5,6,6,4,4,2,1,15,12,0,11,0 
-9362,9,8,0,10,6,3,8,4,3,0,0,10,1,30,1 -9363,5,2,0,3,2,6,9,4,4,1,13,19,1,29,0 -9364,6,4,0,15,4,2,9,0,2,1,5,17,5,29,1 -9365,4,5,0,2,1,0,7,3,3,1,13,0,0,36,0 -9366,3,4,0,0,5,2,6,5,0,0,3,2,0,27,0 -9367,3,3,0,14,2,5,2,2,2,1,2,15,0,14,0 -9368,0,7,0,9,1,5,7,4,0,1,1,11,5,39,0 -9369,10,4,0,10,6,6,7,1,4,0,17,7,4,25,1 -9370,5,4,0,11,2,5,11,1,2,1,6,4,3,21,0 -9371,1,7,0,12,1,5,6,4,1,0,2,9,4,6,0 -9372,7,6,0,1,0,5,7,3,2,0,6,11,2,0,0 -9373,4,4,0,10,1,0,7,4,2,1,16,5,3,1,1 -9374,8,2,0,1,6,5,14,0,2,1,8,2,5,3,0 -9375,3,0,0,9,4,2,0,0,3,1,14,7,2,7,1 -9376,1,1,0,8,0,5,6,2,4,1,1,11,5,20,0 -9377,4,7,0,7,3,5,6,0,4,0,5,1,3,40,1 -9378,2,5,0,15,0,0,5,0,3,0,0,15,5,3,0 -9379,6,5,0,15,0,4,8,3,4,1,8,6,1,27,0 -9380,10,8,0,15,2,5,11,5,4,0,0,4,0,36,1 -9381,9,8,0,0,3,3,14,0,1,1,5,5,3,10,1 -9382,0,4,0,15,0,3,13,3,3,1,3,14,2,6,0 -9383,2,3,0,6,2,6,0,2,2,1,17,6,2,33,0 -9384,0,6,0,13,0,2,8,3,3,0,10,2,1,40,0 -9385,3,7,0,7,2,4,8,4,2,0,8,11,1,39,0 -9386,3,4,0,5,3,4,6,1,3,0,0,13,2,16,0 -9387,10,3,0,6,2,6,14,3,0,1,13,9,0,14,0 -9388,0,0,0,12,1,0,0,3,3,0,13,13,0,35,0 -9389,0,1,0,11,6,3,6,2,2,1,8,12,3,4,0 -9390,4,1,0,2,3,2,3,5,0,1,13,18,3,29,0 -9391,3,5,0,7,0,3,12,3,0,0,6,6,3,8,0 -9392,8,3,0,10,2,4,13,5,2,0,15,14,1,32,1 -9393,0,0,0,4,6,3,12,4,1,1,17,3,4,0,0 -9394,8,0,0,0,2,3,13,3,0,0,18,8,3,7,1 -9395,1,0,0,11,0,3,0,1,4,1,3,11,5,31,0 -9396,7,2,0,4,3,2,7,5,3,0,18,0,5,22,1 -9397,2,3,0,2,3,1,8,2,4,1,8,13,1,2,0 -9398,1,0,0,11,5,5,11,3,4,1,2,13,4,5,0 -9399,3,5,0,0,6,2,8,1,1,1,13,6,4,24,0 -9400,6,0,0,7,6,2,4,4,0,1,6,7,3,31,1 -9401,8,7,0,1,2,4,5,4,0,0,0,0,2,8,0 -9402,4,7,0,5,6,5,2,3,0,0,8,11,0,1,0 -9403,6,2,0,6,0,0,9,3,1,1,2,4,3,35,0 -9404,5,8,0,0,0,5,6,3,2,0,13,0,0,3,0 -9405,0,3,0,7,2,0,13,4,4,0,8,13,4,12,0 -9406,6,4,0,10,0,4,13,4,2,1,0,11,1,11,0 -9407,0,2,0,11,2,6,13,0,4,1,13,6,3,4,0 -9408,2,8,0,13,5,6,13,4,4,1,16,10,5,25,0 -9409,4,6,0,11,5,3,4,5,2,0,11,9,1,12,1 -9410,2,2,0,3,3,5,4,2,2,1,6,2,3,34,0 -9411,5,5,0,11,1,6,5,4,1,0,18,13,3,30,0 -9412,2,7,0,9,5,0,4,4,3,1,8,0,0,25,0 -9413,2,8,0,15,5,5,4,4,0,1,15,19,5,3,0 
-9414,9,6,0,5,4,1,5,1,2,0,0,11,3,9,0 -9415,4,2,0,11,1,6,7,2,2,0,13,11,1,5,0 -9416,1,2,0,8,0,3,9,4,0,0,2,2,1,33,0 -9417,6,6,0,14,1,0,6,0,2,1,0,14,0,28,1 -9418,2,5,0,2,1,2,11,0,0,0,6,3,0,29,0 -9419,1,8,0,1,6,2,3,5,3,0,13,13,5,19,0 -9420,3,2,0,9,5,3,6,2,4,0,1,11,5,8,0 -9421,1,6,0,14,6,3,3,4,2,0,8,15,0,28,0 -9422,0,8,0,1,6,3,3,4,3,1,6,15,0,0,0 -9423,1,3,0,1,3,4,3,1,4,1,7,5,0,12,1 -9424,2,6,0,13,5,4,10,1,4,0,2,13,5,38,0 -9425,0,5,0,9,1,6,6,2,1,1,2,18,5,3,0 -9426,3,1,0,15,3,6,1,3,0,0,4,11,0,18,0 -9427,6,1,0,2,4,5,5,2,4,0,10,2,4,25,0 -9428,10,1,0,2,6,4,11,2,0,1,8,6,0,26,0 -9429,8,6,0,3,3,6,3,0,3,0,8,11,4,0,0 -9430,8,2,0,13,6,2,13,3,3,1,15,9,2,25,0 -9431,5,8,0,15,6,0,12,1,1,0,2,9,4,24,0 -9432,3,7,0,5,4,1,0,4,3,1,14,17,1,20,1 -9433,10,5,0,1,0,0,2,5,2,1,17,2,1,16,0 -9434,4,4,0,12,2,4,2,2,2,0,18,13,4,11,1 -9435,7,4,0,12,5,5,14,3,1,0,18,10,4,33,1 -9436,2,7,0,2,2,2,9,4,3,0,13,2,4,4,0 -9437,3,1,0,4,6,6,14,2,4,1,13,11,1,12,0 -9438,3,1,0,9,3,6,8,4,4,0,8,20,4,4,0 -9439,7,4,0,2,4,3,2,0,4,0,15,15,0,29,0 -9440,2,7,0,5,3,2,5,0,4,1,8,8,0,37,0 -9441,1,5,0,9,5,0,0,0,1,1,13,11,0,40,0 -9442,1,1,0,3,5,5,13,4,1,1,2,13,4,28,0 -9443,1,5,0,9,2,3,14,3,2,0,6,15,1,39,0 -9444,4,0,0,12,6,4,6,3,2,1,13,3,2,31,0 -9445,9,4,0,6,2,4,9,2,2,0,13,13,1,3,0 -9446,2,7,0,14,5,6,8,3,4,0,10,13,3,12,0 -9447,9,3,0,4,0,0,14,3,2,0,14,7,2,14,1 -9448,0,0,0,6,1,4,0,2,0,0,13,11,3,38,0 -9449,4,3,0,13,3,4,14,5,2,0,13,13,0,27,0 -9450,0,3,0,5,2,5,1,5,3,0,6,10,0,13,0 -9451,6,0,0,7,2,1,5,1,1,0,10,15,3,35,0 -9452,8,8,0,9,4,0,11,3,4,0,18,19,4,22,1 -9453,2,2,0,6,5,0,14,1,2,0,4,2,0,4,0 -9454,6,1,0,6,6,0,1,4,0,0,9,1,3,27,1 -9455,3,8,0,12,6,0,5,1,1,0,8,16,2,39,0 -9456,6,4,0,0,6,6,10,1,4,0,14,0,4,23,1 -9457,2,7,0,1,2,0,9,3,2,1,4,2,3,37,0 -9458,4,4,0,7,2,1,11,5,3,1,9,7,1,5,1 -9459,9,1,0,13,1,3,9,3,2,1,13,11,2,35,0 -9460,0,2,0,7,6,0,7,2,1,1,1,8,1,28,0 -9461,3,2,0,3,1,6,5,5,0,0,0,9,1,9,0 -9462,2,2,0,1,0,2,9,0,0,1,13,18,2,25,0 -9463,9,7,0,6,6,4,1,3,0,1,13,11,4,6,0 -9464,0,2,0,13,1,3,13,5,4,1,8,19,2,4,0 -9465,3,7,0,4,6,2,5,3,3,0,2,11,2,4,0 
-9466,0,5,0,15,0,4,5,0,4,1,13,13,3,7,0 -9467,8,5,0,8,2,2,2,2,2,1,5,7,4,17,1 -9468,5,5,0,15,1,1,9,3,3,1,8,0,5,35,0 -9469,6,8,0,12,6,5,3,5,0,0,6,0,4,3,0 -9470,2,7,0,6,0,2,9,2,3,0,8,19,0,14,0 -9471,7,8,0,10,5,5,2,1,1,1,5,7,2,36,1 -9472,2,1,0,14,3,5,11,3,3,0,14,19,2,30,1 -9473,7,3,0,1,1,2,7,1,2,0,0,2,1,17,0 -9474,0,3,0,1,3,5,2,0,2,0,13,20,2,2,0 -9475,7,0,0,14,1,5,3,4,3,1,5,19,2,40,1 -9476,6,2,0,6,0,1,12,2,2,1,17,15,0,16,0 -9477,3,6,0,0,6,4,13,4,1,0,2,16,2,20,0 -9478,6,3,0,14,0,0,2,0,2,1,9,7,3,12,1 -9479,0,0,0,8,6,0,8,5,0,1,13,2,2,34,0 -9480,0,7,0,12,0,5,1,4,4,1,4,2,4,14,0 -9481,0,2,0,6,0,4,0,2,0,1,2,10,2,26,0 -9482,2,7,0,1,6,5,6,0,2,1,6,6,1,2,0 -9483,3,1,0,7,6,6,5,0,3,0,15,2,0,9,0 -9484,0,1,0,5,1,4,7,4,2,1,2,4,5,33,0 -9485,4,7,0,1,1,0,2,3,2,0,13,3,5,8,0 -9486,2,4,0,6,2,4,9,2,4,1,13,2,2,0,0 -9487,0,6,0,11,6,0,6,4,3,0,4,6,1,34,0 -9488,3,4,0,11,4,3,1,1,3,1,13,16,1,2,0 -9489,1,7,0,13,2,6,7,5,3,1,2,17,1,21,0 -9490,0,6,0,1,4,0,2,1,0,1,7,19,4,4,0 -9491,7,0,0,7,0,5,13,5,2,0,11,5,5,10,1 -9492,1,2,0,1,6,5,3,0,0,0,2,11,0,4,0 -9493,0,5,0,3,1,3,8,4,4,0,15,20,3,36,0 -9494,7,7,0,14,4,4,2,5,2,1,3,17,2,34,1 -9495,3,8,0,1,5,0,0,0,0,0,4,4,5,0,0 -9496,0,5,0,3,0,1,6,3,1,1,2,2,4,37,0 -9497,2,2,0,14,5,5,11,5,3,1,11,7,3,9,1 -9498,9,2,0,15,2,5,10,3,2,0,0,4,4,33,1 -9499,10,1,0,0,3,3,3,1,3,0,8,19,3,17,0 -9500,8,5,0,6,3,2,7,4,3,1,17,16,3,32,0 -9501,9,2,0,6,6,4,0,0,0,1,13,4,0,0,0 -9502,4,7,0,4,3,0,7,1,2,1,18,1,5,2,1 -9503,1,7,0,15,0,1,0,2,2,1,3,11,5,6,0 -9504,2,1,0,3,0,0,11,1,3,0,17,16,0,24,0 -9505,5,2,0,11,2,5,12,4,2,0,10,0,3,2,0 -9506,0,6,0,5,6,3,12,3,3,0,15,1,3,27,0 -9507,5,5,0,8,1,5,13,0,4,0,12,18,0,29,0 -9508,0,0,0,9,1,5,6,2,3,1,0,2,2,7,0 -9509,1,1,0,6,0,0,0,4,1,1,18,8,2,24,1 -9510,7,4,0,1,4,4,3,0,4,1,8,11,0,36,0 -9511,1,5,0,6,5,2,5,0,1,1,18,8,4,10,1 -9512,7,8,0,8,4,5,5,3,3,0,13,3,3,5,0 -9513,5,6,0,14,0,5,3,1,2,1,13,15,3,14,0 -9514,4,7,0,14,1,5,6,1,3,1,8,11,0,26,0 -9515,1,2,0,11,6,0,13,0,2,0,5,14,2,14,1 -9516,3,5,0,10,1,4,10,3,0,1,14,15,4,29,1 -9517,2,4,0,2,6,1,5,4,3,1,12,13,3,23,0 
-9518,4,6,0,10,3,6,9,5,0,1,9,13,3,40,1 -9519,1,8,0,15,6,4,13,0,1,0,8,0,0,17,0 -9520,8,6,0,4,6,3,9,5,0,0,17,6,1,18,0 -9521,2,6,0,0,5,3,4,0,0,0,2,20,1,39,0 -9522,5,8,0,4,4,0,6,1,0,0,13,13,1,2,0 -9523,10,6,0,10,2,4,8,1,3,0,2,9,4,8,0 -9524,10,5,0,3,2,0,8,4,1,0,4,12,2,22,0 -9525,7,2,0,8,4,0,2,3,0,1,16,10,0,10,1 -9526,2,3,0,0,4,6,8,0,3,0,10,11,3,19,0 -9527,4,6,0,7,3,1,11,4,1,1,5,10,4,24,1 -9528,1,5,0,4,2,6,6,1,1,0,9,12,2,12,1 -9529,10,6,0,13,2,5,0,1,1,0,2,14,2,1,0 -9530,0,7,0,3,6,0,14,1,3,0,2,6,4,16,0 -9531,9,0,0,8,5,5,9,0,3,1,8,2,1,23,0 -9532,5,8,0,13,1,4,7,1,2,0,4,6,3,34,0 -9533,3,4,0,10,6,4,6,2,1,1,8,2,5,2,0 -9534,1,4,0,10,6,2,10,2,0,0,2,6,0,4,0 -9535,0,7,0,6,2,0,12,5,1,0,10,18,5,26,0 -9536,2,8,0,4,2,0,8,4,2,0,11,13,5,11,0 -9537,5,5,0,15,2,0,4,3,0,0,13,18,1,38,0 -9538,0,4,0,1,2,0,9,2,3,1,0,15,5,6,0 -9539,9,6,0,4,1,5,13,3,2,1,15,0,2,26,0 -9540,1,7,0,0,1,5,13,2,3,1,2,14,4,19,0 -9541,10,4,0,5,3,0,2,0,3,0,16,2,1,32,0 -9542,6,8,0,2,2,0,6,2,3,1,2,2,0,2,0 -9543,3,0,0,13,3,5,6,1,1,0,15,11,3,19,0 -9544,6,8,0,4,3,5,6,0,3,0,0,5,0,22,0 -9545,0,1,0,11,6,6,7,3,0,1,16,9,2,23,0 -9546,3,4,0,2,1,5,7,0,1,0,5,0,0,13,0 -9547,3,0,0,2,1,0,12,0,1,1,1,8,4,17,1 -9548,0,2,0,10,6,3,9,0,1,1,13,2,0,12,0 -9549,3,0,0,13,6,5,4,1,1,0,13,13,5,4,0 -9550,0,7,0,11,2,2,7,1,1,0,8,18,3,14,0 -9551,1,4,0,11,4,5,9,0,1,1,5,5,3,25,1 -9552,0,7,0,13,6,2,4,1,3,1,11,6,0,41,0 -9553,2,0,0,1,3,6,1,3,1,1,13,11,2,3,0 -9554,8,7,0,11,1,1,9,4,3,0,13,0,1,16,0 -9555,8,6,0,12,2,5,9,3,3,1,7,1,5,28,1 -9556,7,5,0,0,0,0,11,4,3,1,2,18,1,1,0 -9557,7,7,0,3,5,4,6,2,0,0,6,12,3,21,0 -9558,2,6,0,11,4,4,4,1,0,0,7,14,2,33,0 -9559,5,1,0,6,1,2,5,4,0,1,8,20,0,3,0 -9560,1,3,0,15,4,4,3,3,4,0,8,2,0,13,0 -9561,7,5,0,8,2,3,0,2,1,1,1,12,0,35,1 -9562,1,5,0,15,0,1,4,0,4,1,18,7,2,33,1 -9563,1,6,0,12,0,5,11,0,0,1,17,2,2,35,0 -9564,0,3,0,12,0,5,11,4,1,1,2,13,3,39,0 -9565,8,4,0,2,5,2,7,5,0,1,5,13,3,9,1 -9566,7,3,0,13,2,6,1,1,4,0,17,13,1,3,0 -9567,3,2,0,2,6,2,7,2,2,0,13,11,5,34,0 -9568,0,8,0,13,5,6,14,2,4,0,13,18,0,11,0 -9569,1,6,0,7,4,1,9,5,2,1,8,11,4,2,1 
-9570,10,5,0,8,0,5,4,1,2,1,9,3,3,23,1 -9571,10,4,0,5,6,6,8,0,1,0,13,11,0,38,0 -9572,1,0,0,1,6,0,4,1,2,1,13,6,2,38,0 -9573,6,7,0,5,6,0,0,3,4,1,17,2,4,41,0 -9574,6,2,0,10,6,5,11,2,3,1,0,6,2,5,0 -9575,3,7,0,3,2,5,2,0,3,0,9,11,3,26,0 -9576,1,0,0,4,3,5,7,2,3,0,16,20,2,36,0 -9577,7,3,0,9,0,6,8,5,4,1,10,15,0,37,0 -9578,5,8,0,6,3,2,6,2,0,0,8,13,4,7,0 -9579,0,2,0,9,3,0,9,2,0,1,13,11,2,2,0 -9580,6,8,0,14,6,3,12,4,2,1,18,10,0,32,1 -9581,1,0,0,0,2,1,14,1,1,0,2,2,5,33,0 -9582,1,8,0,0,3,6,9,4,1,0,18,5,4,38,1 -9583,5,1,0,8,6,6,12,3,0,1,3,20,4,37,1 -9584,0,1,0,11,1,0,5,3,1,1,8,13,1,32,0 -9585,0,8,0,15,0,5,14,2,0,0,8,13,5,3,0 -9586,5,7,0,0,0,5,0,2,2,1,16,5,4,19,1 -9587,8,0,0,2,4,3,6,4,3,1,1,5,5,13,1 -9588,3,4,0,11,3,4,8,3,3,1,13,4,4,38,0 -9589,1,8,0,0,6,3,6,5,1,1,0,18,3,14,0 -9590,3,8,0,12,1,3,9,0,1,1,13,18,3,7,0 -9591,6,3,0,10,6,2,0,4,3,0,4,2,1,25,0 -9592,6,8,0,15,1,3,0,1,3,0,17,14,0,9,0 -9593,5,8,0,15,3,2,12,2,0,0,5,9,3,32,1 -9594,9,0,0,1,3,4,0,0,1,1,13,20,1,33,0 -9595,4,8,0,8,4,3,12,1,1,1,2,4,0,34,0 -9596,1,2,0,5,6,0,6,4,0,0,13,19,0,27,0 -9597,7,2,0,12,2,2,6,3,2,0,1,8,2,36,1 -9598,9,2,0,14,1,0,8,1,0,1,17,11,0,20,0 -9599,4,3,0,8,4,5,13,3,0,0,13,18,3,6,0 -9600,0,7,0,1,3,5,9,3,1,1,4,17,0,26,0 -9601,0,1,0,11,6,2,5,5,0,1,13,6,1,38,0 -9602,1,2,0,2,6,0,9,4,2,1,8,2,3,32,0 -9603,5,0,0,14,3,2,8,3,1,1,13,7,3,41,1 -9604,6,4,0,7,4,1,2,3,2,0,18,8,4,12,1 -9605,6,3,0,11,3,6,9,5,1,1,5,11,0,21,0 -9606,9,3,0,12,6,6,2,3,2,1,12,15,1,19,0 -9607,2,2,0,8,0,2,10,2,0,1,14,18,2,39,1 -9608,2,6,0,15,5,3,6,0,4,1,8,12,2,41,0 -9609,5,5,0,10,6,5,1,1,4,1,1,5,1,34,1 -9610,0,7,0,1,6,6,5,2,1,0,17,18,0,29,0 -9611,0,6,0,0,6,2,5,3,2,1,7,9,0,2,0 -9612,8,6,0,8,3,1,1,1,4,1,5,8,3,11,1 -9613,5,6,0,11,5,6,0,2,1,0,7,12,0,28,0 -9614,1,6,0,4,2,3,9,1,0,0,2,11,0,34,0 -9615,3,7,0,5,0,1,8,2,3,1,2,9,0,9,0 -9616,2,7,0,3,6,0,8,2,1,0,6,12,2,38,0 -9617,1,3,0,4,1,3,0,3,2,1,8,4,4,19,0 -9618,4,2,0,4,5,4,14,5,0,0,18,17,4,36,1 -9619,6,1,0,10,1,1,14,4,2,0,4,19,5,10,0 -9620,4,5,0,9,2,3,5,5,3,1,17,16,4,20,1 
-9621,3,7,0,4,0,3,13,5,3,0,2,14,3,25,0 -9622,8,8,0,7,4,6,8,5,4,1,11,20,4,2,1 -9623,8,7,0,7,1,5,0,3,4,1,14,12,0,28,1 -9624,9,0,0,11,4,4,13,1,3,1,10,0,5,2,0 -9625,3,4,0,8,2,0,9,2,1,1,13,14,3,16,0 -9626,2,4,0,4,2,1,0,3,1,1,10,9,4,33,0 -9627,1,5,0,2,6,2,13,5,3,0,9,16,2,40,1 -9628,6,8,0,11,4,4,8,3,0,1,17,2,1,36,0 -9629,9,8,0,15,3,3,5,1,0,0,9,10,1,22,1 -9630,0,8,0,13,5,6,8,4,2,0,2,18,0,40,0 -9631,0,8,0,4,0,5,5,4,0,1,13,9,4,0,0 -9632,3,5,0,5,3,6,9,1,1,1,4,2,3,25,0 -9633,2,4,0,0,5,0,10,2,3,0,14,11,4,9,0 -9634,5,8,0,2,2,4,3,3,3,0,13,6,0,28,0 -9635,8,8,0,13,3,3,8,0,3,1,5,6,5,2,1 -9636,2,6,0,11,5,2,1,4,4,1,2,2,2,0,0 -9637,9,0,0,15,6,6,13,4,4,0,9,7,3,39,1 -9638,8,5,0,8,2,0,2,0,2,1,1,7,5,10,1 -9639,2,7,0,7,0,5,5,4,2,1,13,0,0,10,0 -9640,8,6,0,13,5,1,5,5,0,1,13,15,2,29,0 -9641,8,2,0,15,2,4,2,5,2,0,10,2,5,10,0 -9642,0,0,0,4,5,1,4,3,3,0,2,11,2,22,0 -9643,4,6,0,3,0,4,6,4,4,1,14,2,2,29,0 -9644,2,0,0,15,0,6,14,0,1,0,6,11,0,26,0 -9645,3,1,0,13,2,6,5,0,2,1,17,11,2,10,0 -9646,1,0,0,7,5,5,8,4,4,0,2,2,4,33,0 -9647,0,0,0,6,6,1,2,0,0,0,14,17,5,36,1 -9648,3,1,0,14,0,6,9,3,3,0,14,18,0,41,0 -9649,8,5,0,0,6,4,1,3,3,0,17,2,1,16,0 -9650,3,1,0,3,2,0,6,3,4,1,8,14,4,20,0 -9651,7,3,0,9,2,1,8,1,2,1,13,11,5,37,1 -9652,6,4,0,11,1,1,2,5,1,1,16,10,1,34,1 -9653,3,0,0,12,6,5,6,0,2,1,18,14,0,16,0 -9654,7,3,0,8,3,0,3,1,3,0,18,15,2,6,1 -9655,0,7,0,6,0,3,4,1,0,0,2,2,0,35,0 -9656,7,3,0,0,1,1,14,2,2,0,3,4,0,19,0 -9657,6,1,0,7,6,2,12,0,0,0,9,17,2,16,1 -9658,4,3,0,8,4,2,2,2,2,1,9,10,4,16,1 -9659,3,2,0,3,2,6,5,4,3,1,4,3,0,18,0 -9660,5,2,0,10,3,2,2,5,1,1,5,17,3,1,1 -9661,7,3,0,9,3,4,12,3,0,0,2,15,3,25,0 -9662,6,7,0,2,2,4,13,4,4,0,13,18,0,38,0 -9663,10,8,0,14,0,5,3,3,0,0,13,3,1,3,0 -9664,1,7,0,6,5,0,9,2,4,1,16,15,0,37,0 -9665,1,4,0,3,3,4,14,3,0,1,3,11,3,7,0 -9666,5,8,0,14,6,4,10,5,0,1,11,7,4,5,1 -9667,3,3,0,3,2,5,3,2,0,1,2,20,0,20,0 -9668,5,0,0,0,6,4,9,2,4,1,13,15,4,1,0 -9669,5,6,0,1,2,0,8,1,3,1,10,4,0,7,0 -9670,1,0,0,15,6,4,8,2,2,0,11,15,3,38,0 -9671,0,6,0,2,6,4,10,1,3,0,0,4,1,25,0 
-9672,2,6,0,13,2,3,14,2,1,1,13,15,0,29,0 -9673,0,5,0,9,2,6,14,0,1,0,13,1,4,9,0 -9674,8,0,0,3,3,4,6,3,3,0,0,13,0,2,0 -9675,2,1,0,13,0,0,11,2,0,1,2,0,2,37,0 -9676,2,0,0,12,0,1,5,3,1,0,2,13,2,39,0 -9677,3,0,0,13,1,5,6,3,1,0,4,13,2,29,0 -9678,4,8,0,14,4,5,9,4,2,1,3,0,0,41,0 -9679,3,8,0,10,3,3,6,1,0,0,13,6,0,5,0 -9680,1,4,0,5,0,4,0,1,0,1,2,12,1,28,0 -9681,3,0,0,6,1,5,10,0,0,1,17,18,4,41,0 -9682,9,2,0,3,0,5,13,2,0,1,13,2,4,17,0 -9683,2,2,0,5,1,4,8,4,4,0,2,2,0,21,0 -9684,7,2,0,3,3,2,10,5,2,1,3,7,2,14,1 -9685,0,1,0,13,0,0,3,5,2,1,0,0,2,29,0 -9686,0,2,0,0,4,0,10,0,3,0,8,18,1,18,0 -9687,0,8,0,5,2,6,8,3,0,1,17,20,3,1,0 -9688,4,6,0,0,3,2,11,5,4,1,5,7,3,4,1 -9689,7,2,0,12,6,3,0,4,2,1,13,10,5,14,1 -9690,9,0,0,5,3,5,4,4,2,0,0,14,4,23,0 -9691,9,7,0,13,2,4,2,1,0,0,13,18,2,12,0 -9692,3,5,0,13,0,4,3,1,3,0,17,0,2,30,0 -9693,0,8,0,2,4,2,5,4,4,1,0,11,0,25,0 -9694,0,7,0,15,0,6,12,2,3,0,2,16,0,3,0 -9695,1,0,0,3,2,0,3,1,2,1,8,8,2,26,0 -9696,9,0,0,7,2,5,2,5,0,1,3,5,5,26,1 -9697,8,0,0,13,3,3,8,0,3,1,5,3,3,9,1 -9698,9,2,0,8,3,3,3,2,3,1,8,13,2,23,0 -9699,5,5,0,6,1,4,5,4,2,0,17,18,2,19,0 -9700,9,7,0,5,4,4,5,4,4,0,5,4,2,3,0 -9701,4,4,0,8,2,3,0,5,0,1,5,12,4,28,1 -9702,0,3,0,9,2,4,9,0,4,0,8,2,0,17,0 -9703,1,0,0,8,0,3,3,3,3,1,8,2,1,8,0 -9704,0,8,0,13,6,2,3,4,2,1,9,2,5,8,0 -9705,10,3,0,4,0,2,3,3,3,1,9,3,3,11,1 -9706,2,6,0,3,1,4,12,1,1,0,10,0,5,10,0 -9707,3,3,0,5,2,0,0,0,3,1,13,13,3,24,0 -9708,3,5,0,4,4,6,14,0,3,0,14,3,4,21,1 -9709,0,4,0,4,5,2,9,0,0,0,6,11,3,12,0 -9710,3,8,0,3,4,2,10,3,1,1,5,5,0,10,1 -9711,7,5,0,12,3,2,14,3,0,0,8,11,3,8,0 -9712,7,2,0,10,4,2,2,0,1,0,1,10,2,23,1 -9713,1,1,0,11,2,4,9,0,0,1,13,11,0,40,0 -9714,10,5,0,14,0,1,8,1,1,1,18,1,3,7,1 -9715,2,1,0,8,0,3,6,0,2,0,13,17,3,20,0 -9716,5,0,0,7,3,5,8,0,4,1,13,14,1,38,0 -9717,0,7,0,1,5,4,6,3,2,0,13,4,0,41,0 -9718,1,8,0,3,4,2,12,2,2,1,2,11,4,9,0 -9719,6,8,0,10,5,2,6,3,1,1,3,6,4,25,1 -9720,10,3,0,1,4,3,4,5,4,1,2,11,1,4,0 -9721,2,6,0,11,6,4,10,4,2,1,13,15,3,0,0 -9722,9,6,0,10,6,3,6,5,0,1,14,12,4,36,0 -9723,3,3,0,14,0,4,11,2,4,0,12,11,0,17,0 
-9724,5,6,0,4,5,1,2,1,1,0,9,8,3,24,1 -9725,1,1,0,10,6,6,6,2,0,0,7,1,2,22,1 -9726,4,4,0,2,4,4,9,1,2,0,2,11,0,26,0 -9727,7,5,0,15,6,2,0,2,4,1,9,4,5,21,1 -9728,5,8,0,2,0,5,5,3,1,0,15,13,0,40,0 -9729,5,3,0,6,0,6,4,5,2,0,18,13,0,26,0 -9730,0,8,0,5,1,5,3,3,4,1,8,11,3,3,0 -9731,4,8,0,1,4,5,4,1,1,1,12,10,0,2,1 -9732,8,6,0,14,0,4,7,1,3,1,13,15,0,32,0 -9733,2,7,0,5,6,0,0,4,4,1,17,15,1,11,0 -9734,7,0,0,9,5,0,12,5,3,1,7,0,3,8,1 -9735,1,0,0,5,2,3,14,4,2,1,0,15,4,17,0 -9736,2,3,0,2,0,4,9,0,1,1,13,19,4,20,0 -9737,9,8,0,1,1,5,4,2,0,0,14,6,4,5,0 -9738,6,4,0,11,1,2,13,1,1,0,8,11,3,37,0 -9739,0,2,0,3,3,4,9,1,0,0,13,2,2,30,0 -9740,9,8,0,11,2,5,8,1,3,1,17,15,0,2,0 -9741,1,8,0,3,0,0,5,4,3,1,8,9,1,30,0 -9742,5,2,0,3,6,3,9,1,2,1,9,17,2,3,1 -9743,2,0,0,2,3,2,14,2,3,1,9,12,2,41,1 -9744,2,6,0,1,6,4,1,3,2,1,2,0,3,36,0 -9745,10,6,0,12,2,4,9,3,0,0,13,6,0,12,0 -9746,6,2,0,5,2,4,5,0,2,0,2,6,0,10,0 -9747,10,0,0,8,5,4,2,1,2,1,9,0,3,24,0 -9748,9,5,0,0,4,3,6,1,1,0,9,12,0,10,1 -9749,0,0,0,14,0,1,9,3,3,1,2,0,2,4,0 -9750,0,4,0,6,5,1,5,4,0,1,12,2,4,24,0 -9751,10,0,0,5,3,3,5,4,0,0,10,18,5,36,0 -9752,7,1,0,7,0,5,5,3,3,0,8,15,0,14,0 -9753,1,5,0,1,0,0,11,2,4,0,8,14,0,30,0 -9754,3,6,0,1,0,5,10,4,4,1,2,6,2,17,0 -9755,3,0,0,11,0,6,7,1,0,1,2,13,0,23,0 -9756,2,4,0,3,0,2,8,2,0,0,3,2,1,19,0 -9757,1,3,0,15,4,4,3,1,3,1,3,19,4,0,1 -9758,0,7,0,2,5,1,12,3,4,1,13,9,3,39,0 -9759,9,8,0,9,0,1,11,5,1,0,18,7,5,25,1 -9760,10,6,0,4,1,2,11,5,1,0,18,10,2,17,1 -9761,5,7,0,12,2,6,6,4,3,0,2,11,1,28,0 -9762,2,0,0,5,1,1,7,2,0,0,13,11,0,3,0 -9763,2,1,0,10,2,4,0,1,3,1,2,1,2,37,0 -9764,10,3,0,13,2,4,4,0,2,0,9,16,3,31,1 -9765,0,6,0,5,6,1,2,3,1,1,13,20,2,18,0 -9766,1,4,0,13,4,3,6,0,3,0,7,2,1,31,0 -9767,9,1,0,1,4,2,9,0,2,0,16,16,1,5,1 -9768,8,0,0,14,0,4,5,3,0,1,1,8,3,1,1 -9769,3,2,0,5,0,4,9,1,3,1,3,4,3,6,0 -9770,2,1,0,2,1,2,14,2,1,1,2,19,1,35,0 -9771,7,2,0,14,5,2,12,5,1,1,18,12,2,38,1 -9772,2,6,0,15,0,6,9,4,4,1,13,17,2,36,0 -9773,2,5,0,13,3,3,4,2,4,0,2,12,0,18,0 -9774,7,2,0,5,6,2,1,0,4,1,7,7,5,13,1 -9775,9,1,0,15,2,6,3,3,4,1,13,13,2,9,0 
-9776,2,5,0,10,0,5,13,2,1,1,0,1,3,27,0 -9777,10,5,0,9,6,1,11,4,1,0,3,4,3,31,1 -9778,4,4,0,14,6,1,11,1,2,1,1,10,2,9,1 -9779,6,3,0,2,1,0,12,2,2,0,8,11,2,6,0 -9780,8,3,0,7,0,4,10,0,0,1,13,4,1,31,0 -9781,0,2,0,4,6,5,10,3,4,1,10,2,0,20,0 -9782,7,8,0,12,6,1,11,2,3,1,5,10,3,34,1 -9783,4,2,0,4,1,1,6,1,0,0,18,7,5,25,1 -9784,5,0,0,9,2,2,6,4,0,0,7,18,5,4,0 -9785,8,6,0,0,3,4,8,4,4,1,2,18,2,23,0 -9786,6,0,0,7,0,1,13,0,1,1,0,15,1,34,0 -9787,1,7,0,7,5,1,7,0,3,0,16,11,0,26,0 -9788,7,8,0,13,2,1,1,0,1,1,9,5,3,2,1 -9789,10,3,0,0,6,5,11,3,0,0,13,3,1,0,0 -9790,4,5,0,11,3,4,3,5,1,0,7,17,4,32,1 -9791,4,1,0,5,0,6,2,1,0,1,3,1,3,12,0 -9792,3,0,0,10,0,4,4,2,0,1,7,16,4,4,0 -9793,5,3,0,9,6,1,0,5,2,0,3,12,2,28,1 -9794,9,3,0,2,0,3,0,3,2,1,7,13,0,2,0 -9795,2,8,0,9,1,2,9,2,0,1,6,18,5,25,0 -9796,6,1,0,0,5,0,9,0,4,0,13,11,3,33,0 -9797,6,5,0,3,3,6,4,0,0,1,18,10,1,38,1 -9798,0,8,0,13,6,0,6,4,3,0,2,11,0,35,0 -9799,2,3,0,8,2,4,6,4,0,1,4,19,5,22,0 -9800,8,5,0,3,1,5,10,4,2,0,15,2,0,29,0 -9801,5,7,0,5,3,5,9,3,2,0,13,18,5,2,0 -9802,2,5,0,1,5,0,6,3,3,0,2,10,1,9,0 -9803,3,8,0,1,1,4,3,1,3,1,17,5,1,37,0 -9804,9,3,0,5,5,3,13,5,3,0,1,7,2,30,1 -9805,4,4,0,6,0,4,9,3,1,1,2,9,2,7,0 -9806,6,3,0,15,6,0,8,3,4,1,6,2,4,9,0 -9807,9,7,0,7,3,3,10,3,3,0,18,6,1,27,1 -9808,1,2,0,5,0,6,9,2,0,0,2,8,5,0,0 -9809,2,7,0,15,5,6,1,2,0,0,2,15,0,3,0 -9810,10,7,0,1,4,2,3,1,1,1,1,17,2,40,1 -9811,0,3,0,5,0,5,11,1,4,1,0,20,4,17,0 -9812,10,5,0,8,0,2,9,2,0,0,1,16,2,40,0 -9813,10,8,0,8,0,1,9,1,1,1,2,16,0,7,0 -9814,2,5,0,1,4,5,4,2,4,0,2,0,5,25,0 -9815,2,2,0,8,2,5,14,4,1,1,6,20,0,10,0 -9816,6,6,0,12,6,0,12,1,3,1,14,4,2,18,1 -9817,1,8,0,6,6,3,8,4,2,1,13,6,4,18,0 -9818,3,4,0,3,2,6,1,5,3,1,5,5,4,5,1 -9819,6,0,0,5,0,4,3,1,4,0,17,2,0,32,0 -9820,9,5,0,1,3,2,11,1,0,0,2,16,3,30,0 -9821,2,4,0,6,4,5,9,3,0,0,10,6,1,0,0 -9822,2,2,0,9,2,4,10,4,1,1,13,13,4,11,0 -9823,10,3,0,8,6,0,6,4,1,0,0,0,2,26,0 -9824,9,2,0,0,1,5,3,0,2,0,0,11,0,21,0 -9825,5,7,0,6,6,1,14,0,0,1,13,16,1,9,0 -9826,8,0,0,4,1,5,6,2,0,0,4,4,4,16,0 -9827,6,2,0,10,0,2,7,2,3,0,2,20,5,19,0 
-9828,0,8,0,5,0,3,6,2,4,1,13,12,0,29,0 -9829,5,7,0,6,0,4,14,2,0,0,13,6,2,25,0 -9830,1,1,0,12,2,0,6,3,1,0,2,2,2,35,0 -9831,5,6,0,0,0,5,5,4,2,1,2,6,0,7,0 -9832,1,3,0,6,1,0,5,1,2,1,13,11,0,37,0 -9833,8,2,0,6,1,6,10,4,1,1,4,20,0,19,0 -9834,7,1,0,9,2,3,8,1,4,0,18,16,1,8,1 -9835,0,4,0,14,0,0,0,1,4,0,0,10,1,6,0 -9836,9,1,0,1,5,2,10,5,4,0,15,7,4,5,1 -9837,10,4,0,3,3,4,8,3,3,0,13,2,2,14,0 -9838,7,7,0,9,6,2,0,2,1,1,6,15,2,32,0 -9839,9,7,0,12,2,0,4,3,4,0,17,11,5,22,0 -9840,10,5,0,8,6,5,9,4,3,0,3,20,5,25,1 -9841,0,1,0,13,6,0,1,0,4,0,10,13,1,40,0 -9842,2,1,0,14,4,4,1,2,4,0,8,15,1,30,0 -9843,6,0,0,2,0,4,11,2,1,0,17,2,1,9,0 -9844,6,6,0,2,2,3,5,2,0,1,18,7,3,40,1 -9845,0,8,0,4,1,5,5,1,4,1,10,15,0,24,0 -9846,0,8,0,3,4,6,7,1,4,0,15,18,0,3,0 -9847,1,6,0,0,0,5,7,4,3,0,13,4,1,12,0 -9848,7,3,0,2,3,0,6,0,1,1,7,10,4,20,1 -9849,8,4,0,12,6,2,8,3,3,1,17,10,5,39,1 -9850,3,6,0,7,5,0,3,1,1,0,13,13,2,8,0 -9851,9,3,0,11,0,6,3,3,0,0,4,3,5,28,0 -9852,4,8,0,7,1,1,0,1,2,0,5,7,0,12,1 -9853,0,1,0,6,3,0,6,4,4,0,8,18,4,6,0 -9854,0,7,0,5,6,1,7,5,0,1,0,1,3,7,0 -9855,7,5,0,10,4,6,0,0,3,1,3,14,0,5,1 -9856,10,4,0,0,3,1,9,0,3,0,9,13,5,41,1 -9857,7,8,0,2,3,5,11,5,3,0,5,10,4,38,1 -9858,0,0,0,9,0,4,11,2,0,1,15,19,1,13,0 -9859,10,8,0,1,6,3,5,2,1,0,4,9,4,27,0 -9860,10,2,0,12,0,5,10,4,2,1,10,13,1,36,0 -9861,9,6,0,0,1,3,9,4,2,1,2,20,0,3,0 -9862,3,0,0,15,0,0,3,3,3,0,13,11,2,3,0 -9863,4,8,0,15,3,5,2,3,1,0,9,0,2,16,0 -9864,4,8,0,5,1,0,3,4,2,1,2,6,4,3,0 -9865,2,6,0,9,4,6,6,1,0,0,6,2,3,20,0 -9866,5,5,0,11,3,2,0,1,0,0,7,11,1,34,0 -9867,9,5,0,1,0,6,14,2,4,0,2,6,4,7,0 -9868,2,8,0,12,2,5,4,4,3,0,17,16,0,17,0 -9869,2,8,0,10,6,0,8,5,2,1,2,11,5,24,0 -9870,3,8,0,13,0,0,9,2,1,1,7,18,3,0,0 -9871,4,2,0,6,6,3,14,2,4,1,3,20,4,29,0 -9872,9,5,0,4,5,0,12,4,4,1,2,18,5,19,0 -9873,10,6,0,5,0,5,0,0,2,1,5,11,1,41,0 -9874,2,5,0,11,2,0,8,1,4,1,4,11,0,31,0 -9875,0,1,0,9,3,1,4,0,0,0,9,12,5,5,1 -9876,2,0,0,2,1,1,7,3,3,1,7,0,1,2,0 -9877,1,1,0,11,3,4,6,0,3,1,4,3,2,3,0 -9878,7,4,0,9,4,4,2,5,2,0,18,20,4,7,1 -9879,2,3,0,6,0,1,5,1,3,1,8,2,2,31,0 
-9880,9,4,0,8,5,2,8,1,3,1,10,10,4,16,0 -9881,4,1,0,12,1,0,6,0,2,1,2,4,2,41,0 -9882,8,0,0,0,4,2,9,3,1,1,10,15,0,30,0 -9883,9,1,0,5,6,5,6,0,0,0,8,2,3,22,0 -9884,4,8,0,15,5,4,4,4,2,1,2,0,3,37,0 -9885,0,0,0,9,4,0,6,3,4,1,8,2,2,13,0 -9886,6,3,0,13,2,0,9,2,3,1,6,11,1,21,0 -9887,6,6,0,5,0,5,4,1,2,1,2,6,0,30,0 -9888,7,7,0,7,5,2,1,2,3,1,2,9,1,33,0 -9889,3,7,0,13,0,5,8,2,1,0,2,5,1,38,0 -9890,9,3,0,3,0,3,12,4,4,0,15,16,0,39,0 -9891,2,8,0,6,4,0,4,4,4,0,2,2,5,4,0 -9892,10,4,0,5,6,0,8,4,3,0,13,12,3,7,0 -9893,1,7,0,3,1,3,6,4,4,1,13,11,5,21,0 -9894,3,3,0,5,2,5,3,0,0,0,5,13,0,37,0 -9895,0,8,0,14,5,3,9,3,4,0,6,11,0,37,0 -9896,2,2,0,3,6,0,4,3,1,0,0,11,0,7,0 -9897,6,4,0,1,1,2,13,2,3,1,18,1,1,20,1 -9898,8,4,0,15,0,4,10,5,2,0,5,19,4,22,1 -9899,7,7,0,6,0,1,2,1,0,1,6,9,4,37,0 -9900,6,6,0,7,0,4,12,2,2,0,15,11,4,40,0 -9901,10,6,0,9,3,2,12,5,4,0,3,5,2,12,1 -9902,0,8,0,3,0,2,14,3,3,0,13,11,1,4,0 -9903,6,2,0,0,6,3,0,2,3,0,15,18,0,22,0 -9904,5,0,0,15,0,3,1,4,2,1,18,8,2,0,1 -9905,7,8,0,2,6,4,4,4,3,1,13,11,0,25,0 -9906,6,7,0,9,5,1,4,4,3,0,14,10,4,41,1 -9907,0,3,0,1,1,4,10,1,3,0,4,18,3,10,0 -9908,9,0,0,4,5,5,1,1,4,0,7,11,3,4,0 -9909,0,8,0,4,2,5,8,5,2,1,0,11,3,6,0 -9910,10,6,0,10,0,5,11,1,2,1,15,13,0,27,0 -9911,6,0,0,6,5,2,4,0,0,0,8,11,0,11,0 -9912,0,8,0,5,2,5,5,2,2,0,2,15,3,12,0 -9913,9,3,0,6,2,5,0,5,3,1,9,17,5,17,1 -9914,1,7,0,3,6,0,5,5,4,0,4,15,1,19,0 -9915,0,0,0,7,3,4,9,0,2,0,8,4,0,20,0 -9916,9,5,0,10,0,2,4,0,2,0,18,19,1,8,1 -9917,4,6,0,3,2,5,5,2,2,1,13,15,4,36,0 -9918,2,7,0,3,0,5,9,4,0,0,2,10,0,28,0 -9919,7,2,0,8,0,0,7,3,0,1,9,12,0,1,1 -9920,2,8,0,2,0,4,2,4,3,1,15,2,2,18,0 -9921,8,8,0,10,3,6,8,3,2,0,14,7,5,36,1 -9922,7,7,0,10,0,1,11,1,3,0,2,0,3,17,0 -9923,3,4,0,11,1,6,5,3,0,0,10,18,0,5,0 -9924,9,0,0,15,3,5,9,4,1,0,17,20,2,17,0 -9925,2,0,0,11,3,4,0,0,1,0,13,9,0,27,0 -9926,5,8,0,1,4,1,11,5,2,0,7,17,4,9,1 -9927,3,5,0,4,2,0,14,4,0,0,10,9,5,23,0 -9928,4,4,0,12,5,0,8,4,4,1,13,11,1,18,0 -9929,8,3,0,10,5,2,13,2,1,0,18,7,3,27,1 -9930,0,5,0,9,0,6,6,0,3,1,18,18,0,18,0 -9931,4,3,0,5,1,3,13,3,0,0,6,2,2,3,0 
-9932,0,3,0,2,3,5,10,1,0,0,6,9,1,34,0 -9933,2,2,0,14,5,0,0,4,4,0,6,9,1,23,0 -9934,4,3,0,4,0,6,8,2,4,0,13,10,5,1,0 -9935,5,7,0,8,6,2,3,5,1,0,16,19,1,11,1 -9936,8,4,0,3,3,3,13,0,2,1,11,2,5,6,0 -9937,3,8,0,11,3,4,9,3,1,1,12,18,4,32,0 -9938,3,3,0,8,6,1,10,3,1,1,18,7,1,41,1 -9939,9,8,0,12,6,5,4,1,4,1,18,8,1,38,1 -9940,2,0,0,12,3,0,3,3,2,0,14,9,2,38,0 -9941,0,6,0,4,0,3,3,3,2,1,2,11,0,12,0 -9942,2,4,0,5,1,0,12,3,1,0,10,15,0,30,0 -9943,6,7,0,2,4,1,7,3,0,0,12,18,0,29,0 -9944,2,5,0,3,5,6,4,1,1,0,4,6,5,26,0 -9945,4,8,0,7,2,4,5,2,4,1,1,11,5,13,0 -9946,6,1,0,10,4,5,5,1,0,1,18,14,3,8,1 -9947,0,3,0,5,0,0,11,1,1,0,2,14,4,30,0 -9948,2,5,0,11,6,3,2,1,4,0,2,6,3,4,0 -9949,9,6,0,14,2,1,14,4,1,1,14,10,5,34,1 -9950,0,6,0,2,1,3,9,0,1,0,3,12,0,10,0 -9951,3,4,0,1,1,4,7,1,3,0,8,2,1,31,0 -9952,10,2,0,7,6,6,9,5,4,1,9,16,4,41,1 -9953,7,8,0,14,4,5,2,1,0,1,4,2,1,20,0 -9954,6,8,0,9,2,5,8,2,1,0,10,5,1,1,1 -9955,3,7,0,4,6,3,5,4,3,0,13,18,3,23,0 -9956,1,3,0,1,1,4,13,2,0,0,2,6,1,35,0 -9957,4,3,0,10,3,0,3,3,0,0,11,11,0,35,0 -9958,3,6,0,5,5,1,7,5,4,0,1,10,0,1,1 -9959,2,7,0,5,1,5,6,1,0,1,2,12,4,36,0 -9960,0,8,0,5,6,4,5,0,1,1,8,9,1,10,0 -9961,1,7,0,6,2,4,14,2,2,1,16,0,0,4,0 -9962,2,1,0,0,2,6,13,2,1,0,13,2,0,38,0 -9963,9,0,0,3,4,0,6,2,1,1,13,2,5,14,0 -9964,3,4,0,6,3,3,9,3,2,0,8,20,3,22,0 -9965,3,0,0,13,4,0,8,3,2,0,2,2,3,18,0 -9966,1,5,0,3,0,0,0,0,3,0,4,2,2,9,0 -9967,5,3,0,3,3,3,13,0,2,1,1,7,4,14,1 -9968,3,8,0,15,5,5,0,4,0,0,4,20,1,38,0 -9969,9,8,0,0,0,4,13,5,1,0,13,11,0,38,0 -9970,1,3,0,12,5,6,5,2,3,0,17,6,4,14,0 -9971,9,8,0,0,2,6,0,1,0,1,2,4,5,12,0 -9972,9,8,0,10,4,5,4,0,1,0,18,14,4,5,1 -9973,1,4,0,6,2,0,4,2,2,0,6,6,5,31,0 -9974,5,7,0,12,3,4,14,5,3,0,13,11,0,22,0 -9975,1,3,0,3,6,3,9,1,1,0,13,4,4,25,0 -9976,3,2,0,5,0,4,12,3,0,1,4,6,5,8,0 -9977,6,7,0,13,4,5,5,1,0,0,14,11,3,17,0 -9978,10,8,0,6,6,0,8,0,4,1,7,15,5,6,0 -9979,4,7,0,2,2,4,2,4,3,1,18,11,4,5,1 -9980,1,0,0,13,1,0,5,3,0,0,10,16,5,25,0 -9981,9,4,0,11,3,2,8,0,2,1,18,10,5,3,1 -9982,0,3,0,6,4,6,7,1,1,1,9,2,1,9,0 -9983,0,3,0,7,5,5,9,3,1,0,0,14,1,24,0 
-9984,1,3,0,3,3,2,14,4,1,0,5,17,4,9,1 -9985,2,1,0,2,3,6,14,3,2,1,13,15,0,24,0 -9986,9,7,0,8,5,6,1,0,0,1,13,14,1,39,0 -9987,0,4,0,12,6,6,14,2,0,0,0,2,0,4,0 -9988,3,0,0,11,5,3,8,3,0,1,13,11,5,29,0 -9989,9,3,0,1,4,2,12,5,1,1,2,14,0,39,0 -9990,10,7,0,13,3,4,6,4,1,0,2,13,4,21,0 -9991,1,8,0,5,0,6,4,1,2,0,12,6,0,0,0 -9992,3,6,0,6,0,3,14,2,3,0,11,11,3,7,0 -9993,6,2,0,5,1,1,7,0,2,0,13,11,3,34,0 -9994,10,0,0,6,6,0,5,2,3,1,4,18,0,18,0 -9995,2,5,0,0,5,2,1,5,3,1,14,18,3,29,0 -9996,6,5,0,3,0,6,13,2,3,1,14,19,4,14,0 -9997,10,1,0,6,0,4,5,0,2,1,0,16,2,30,0 -9998,4,2,0,4,0,5,0,3,4,0,2,18,3,26,0 -9999,10,0,0,9,6,0,11,3,1,1,4,2,0,41,0 -10000,5,8,0,1,5,0,2,2,1,1,17,13,5,25,0 -10001,8,0,0,9,0,0,6,4,0,0,4,18,0,28,0 -10002,10,0,0,8,2,4,9,3,2,0,13,9,0,34,0 -10003,1,4,0,8,4,5,3,0,4,0,6,4,3,25,0 -10004,9,2,0,2,4,5,13,4,1,1,0,18,0,2,0 -10005,7,7,0,6,1,5,13,2,4,1,8,2,3,16,0 -10006,9,8,0,8,6,3,14,2,1,1,14,5,3,38,1 -10007,2,2,0,10,3,4,0,2,1,0,4,6,1,14,0 -10008,2,3,0,1,4,3,12,4,4,1,13,18,2,28,0 -10009,1,0,0,1,6,0,12,4,3,1,2,11,5,11,0 -10010,2,1,0,15,5,3,2,4,1,1,4,10,3,10,0 -10011,0,3,0,14,0,0,9,0,0,0,13,4,0,8,0 -10012,10,8,0,14,6,4,7,4,0,0,1,15,0,4,0 -10013,5,4,0,4,0,5,9,1,2,0,4,20,5,27,0 -10014,0,2,0,1,3,2,7,3,3,0,13,0,1,40,0 -10015,7,4,0,10,6,4,4,5,4,1,12,12,2,0,1 -10016,1,2,0,3,1,0,7,0,0,0,2,6,4,30,0 -10017,5,2,0,11,1,6,3,0,2,0,12,6,4,8,0 -10018,7,4,0,7,3,0,6,0,1,0,8,4,1,28,0 -10019,1,8,0,1,5,4,3,1,0,0,9,8,3,37,0 -10020,3,0,0,8,1,0,6,2,1,0,6,15,1,9,0 -10021,5,2,0,9,3,1,14,1,4,1,1,7,3,23,1 -10022,10,3,0,3,0,5,0,3,4,1,13,6,2,4,0 -10023,9,8,0,0,5,2,11,2,1,1,8,11,1,30,0 -10024,8,0,0,3,2,4,1,0,0,0,2,9,1,18,0 -10025,0,5,0,1,5,5,6,4,2,0,8,0,3,3,0 -10026,10,7,0,11,0,5,8,0,1,0,4,18,1,35,0 -10027,4,8,0,13,6,4,11,3,2,1,7,2,0,35,0 -10028,0,7,0,1,6,6,14,4,3,1,10,14,0,1,0 -10029,4,8,0,1,5,3,5,1,2,0,13,5,2,3,0 -10030,0,2,0,8,1,4,3,5,2,0,8,6,4,29,0 -10031,9,8,0,15,0,2,12,2,3,1,2,11,3,39,0 -10032,7,4,0,1,5,1,3,3,3,0,9,7,3,20,1 -10033,1,3,0,9,0,6,4,2,2,0,8,15,1,37,0 -10034,5,8,0,11,4,6,10,4,1,1,4,1,4,9,1 
-10035,3,7,0,3,0,3,4,4,1,1,6,11,1,3,0 -10036,3,1,0,2,2,3,4,4,2,0,13,0,0,27,0 -10037,2,1,0,11,6,4,5,1,4,1,17,2,5,31,0 -10038,3,0,0,13,6,5,10,3,3,0,8,20,0,13,0 -10039,3,6,0,3,2,2,8,3,1,0,6,18,5,35,0 -10040,10,2,0,3,5,4,10,4,0,1,2,9,0,3,0 -10041,0,8,0,11,4,4,2,3,4,1,17,3,0,25,0 -10042,2,8,0,7,0,4,12,4,0,0,5,0,4,20,1 -10043,0,1,0,1,0,6,3,1,3,1,17,2,2,10,0 -10044,0,8,0,3,3,6,3,0,2,1,13,9,0,9,0 -10045,7,3,0,11,6,4,6,1,0,1,2,0,5,18,0 -10046,10,7,0,0,4,3,11,0,0,1,5,10,0,38,1 -10047,3,5,0,11,1,5,4,5,0,0,18,6,0,24,0 -10048,3,4,0,9,2,5,6,2,0,0,0,2,1,30,0 -10049,0,3,0,4,0,4,5,3,2,1,13,11,4,27,0 -10050,7,7,0,2,3,2,12,1,3,0,14,5,0,35,1 -10051,5,4,0,4,1,4,7,0,1,1,15,0,3,16,0 -10052,0,8,0,4,0,0,14,0,4,1,17,1,0,30,0 -10053,3,5,0,1,4,6,0,1,4,1,2,20,2,1,0 -10054,3,2,0,1,6,0,0,5,4,0,16,8,3,34,1 -10055,7,4,0,9,2,6,10,5,3,1,10,6,5,19,0 -10056,6,6,0,11,2,6,5,2,4,0,6,11,5,31,0 -10057,0,8,0,8,4,0,14,5,4,1,18,10,0,9,1 -10058,7,4,0,13,0,5,14,4,2,0,2,6,2,34,0 -10059,9,3,0,2,2,4,0,4,1,1,16,4,2,8,0 -10060,2,7,0,7,6,0,10,3,1,0,15,9,3,6,0 -10061,8,7,0,1,3,3,8,3,0,0,13,15,2,27,0 -10062,5,5,0,15,0,3,14,3,0,1,2,6,5,19,0 -10063,0,5,0,10,6,4,5,1,4,0,4,6,0,12,0 -10064,4,8,0,11,4,1,13,0,1,0,2,2,2,37,0 -10065,6,7,0,12,0,6,4,5,1,0,12,2,4,32,0 -10066,8,1,0,1,4,4,2,3,4,1,8,9,5,18,0 -10067,8,2,0,11,1,5,9,4,3,0,10,11,1,23,0 -10068,0,4,0,3,5,6,12,3,2,1,13,2,0,36,0 -10069,4,1,0,8,2,4,6,3,3,1,4,6,2,27,0 -10070,5,4,0,9,3,1,4,3,3,1,18,5,2,14,1 -10071,1,1,0,15,2,6,4,2,1,1,16,17,0,4,1 -10072,2,0,0,6,0,5,11,3,0,1,16,20,3,28,0 -10073,9,5,0,15,1,2,13,2,0,0,11,4,3,9,1 -10074,0,1,0,5,6,2,0,1,0,0,0,12,3,1,0 -10075,8,4,0,1,3,1,10,1,3,1,9,3,0,7,1 -10076,5,1,0,15,3,0,7,4,2,0,2,2,5,3,0 -10077,0,5,0,11,1,0,8,3,3,1,4,15,3,12,0 -10078,5,0,0,3,3,2,12,2,4,0,12,16,4,41,0 -10079,5,5,0,10,4,1,12,5,0,1,18,8,2,5,1 -10080,10,2,0,4,4,2,9,4,1,1,4,9,4,30,0 -10081,3,5,0,15,2,3,12,0,0,1,13,16,0,23,0 -10082,7,4,0,13,5,3,9,2,1,1,12,6,1,36,0 -10083,7,1,0,9,3,4,3,1,1,0,7,14,3,27,1 -10084,0,4,0,9,6,6,7,1,2,0,7,2,0,31,0 
-10085,6,0,0,10,1,2,0,1,4,1,11,7,1,10,1 -10086,9,2,0,6,6,2,6,3,2,0,13,8,4,24,0 -10087,9,3,0,12,5,3,4,1,1,1,17,8,4,13,1 -10088,4,6,0,14,2,2,13,2,1,0,3,6,0,10,1 -10089,9,8,0,11,1,1,2,4,0,0,2,6,3,7,0 -10090,0,6,0,0,3,0,2,3,2,1,13,11,2,16,0 -10091,6,1,0,1,0,2,5,1,0,0,13,15,1,29,0 -10092,1,2,0,15,4,5,8,5,0,0,12,2,2,27,0 -10093,5,8,0,4,3,0,12,5,2,0,5,20,0,6,0 -10094,2,7,0,1,6,2,0,1,4,0,4,11,1,31,0 -10095,10,1,0,1,3,2,3,1,0,0,3,1,1,28,1 -10096,3,4,0,8,6,4,10,2,0,0,8,2,5,1,0 -10097,4,6,0,3,2,4,2,4,3,1,13,20,1,18,0 -10098,4,5,0,14,1,3,11,5,1,0,9,1,2,5,1 -10099,3,5,0,3,4,5,0,3,1,0,13,19,3,26,0 -10100,10,4,0,8,2,5,6,2,2,1,18,17,4,16,1 -10101,4,1,0,4,3,4,11,1,2,1,5,4,2,14,1 -10102,0,8,0,11,0,4,13,4,3,1,11,6,2,14,0 -10103,2,2,0,3,0,1,14,2,4,1,5,2,2,14,0 -10104,1,5,0,0,0,4,8,1,0,1,12,0,4,20,0 -10105,8,7,0,12,1,2,1,5,0,1,18,7,2,5,1 -10106,0,8,0,3,6,3,3,0,2,0,17,17,0,39,0 -10107,10,5,0,6,6,3,0,0,2,1,8,11,1,12,0 -10108,1,7,0,0,0,6,2,4,3,0,13,0,0,22,0 -10109,7,2,0,13,2,5,11,5,0,0,2,11,1,3,0 -10110,3,5,0,6,6,5,3,0,3,1,0,4,3,5,0 -10111,2,5,0,13,5,3,5,4,0,0,1,0,2,4,0 -10112,3,2,0,11,0,6,10,0,4,1,10,9,0,36,0 -10113,9,5,0,15,0,3,0,3,4,0,10,16,1,6,0 -10114,4,4,0,4,0,5,3,4,2,1,15,11,0,7,0 -10115,3,4,0,8,5,4,7,3,0,0,15,9,0,3,0 -10116,6,3,0,12,0,2,5,3,3,0,17,14,2,7,0 -10117,7,6,0,9,5,4,12,0,0,1,17,18,0,38,0 -10118,4,7,0,13,1,5,2,2,2,1,13,12,3,18,0 -10119,1,4,0,11,3,5,13,2,3,1,10,11,0,29,0 -10120,1,5,0,14,1,3,2,5,0,0,2,2,5,38,0 -10121,8,1,0,2,2,0,13,1,3,0,1,16,1,26,1 -10122,7,4,0,0,2,6,13,4,3,1,2,9,2,28,0 -10123,2,2,0,4,1,3,0,2,3,1,12,0,3,18,0 -10124,4,0,0,6,6,0,2,2,2,1,4,14,1,25,0 -10125,1,6,0,15,6,4,12,1,2,1,8,8,3,0,0 -10126,9,2,0,15,3,2,10,3,1,1,17,1,5,32,1 -10127,9,0,0,1,6,4,4,3,0,1,13,9,5,20,0 -10128,0,6,0,3,6,6,0,2,2,0,2,15,0,27,0 -10129,2,7,0,3,5,5,4,2,3,0,5,2,4,10,0 -10130,2,6,0,7,0,0,8,4,1,1,2,18,0,37,0 -10131,1,7,0,7,0,6,4,1,2,0,13,11,1,8,0 -10132,1,2,0,3,6,2,2,4,3,0,13,6,1,11,0 -10133,8,3,0,0,2,4,6,1,2,0,2,13,2,4,0 -10134,10,5,0,8,2,1,11,2,2,0,18,19,5,11,1 
-10135,1,1,0,9,0,2,10,0,4,0,3,15,4,7,0 -10136,8,4,0,9,6,1,14,4,3,1,3,20,5,37,0 -10137,2,4,0,8,2,3,14,5,4,0,13,6,4,33,0 -10138,0,6,0,0,2,2,7,3,0,1,2,6,0,33,0 -10139,2,5,0,1,1,5,1,1,0,1,17,6,4,11,0 -10140,9,1,0,3,2,0,0,5,3,0,8,17,2,17,1 -10141,0,6,0,13,6,0,6,3,2,1,1,11,3,6,0 -10142,4,7,0,4,4,6,13,5,4,1,4,3,1,27,0 -10143,9,3,0,13,5,2,8,1,2,0,6,13,1,29,0 -10144,8,1,0,0,4,5,13,3,1,1,0,15,0,5,0 -10145,0,7,0,10,1,0,9,4,4,1,13,0,0,26,0 -10146,7,4,0,7,6,3,1,1,2,0,15,0,3,36,0 -10147,2,8,0,6,6,3,13,0,0,1,15,17,1,2,0 -10148,8,4,0,14,4,5,11,5,2,1,15,14,1,19,1 -10149,3,4,0,3,4,5,10,5,2,0,13,0,5,33,0 -10150,9,6,0,11,1,0,9,1,2,1,2,4,2,16,0 -10151,3,4,0,12,0,3,14,4,4,0,8,11,1,5,0 -10152,1,3,0,14,2,4,12,0,1,0,12,2,1,4,0 -10153,2,4,0,12,5,1,12,5,3,1,18,9,4,8,1 -10154,7,1,0,2,5,6,5,3,1,0,13,13,4,8,0 -10155,6,8,0,12,4,2,14,0,0,1,3,7,3,18,1 -10156,4,8,0,13,2,2,4,4,4,0,5,7,5,28,1 -10157,6,6,0,8,0,0,11,2,3,0,2,3,4,36,0 -10158,3,2,0,4,6,0,3,3,2,1,2,2,3,2,0 -10159,6,5,0,8,0,6,0,5,0,0,13,13,0,31,0 -10160,0,6,0,2,3,1,12,1,3,0,4,18,5,10,0 -10161,3,1,0,12,2,6,12,5,3,1,0,5,4,19,1 -10162,2,4,0,10,1,3,12,3,0,1,3,15,4,14,0 -10163,7,8,0,4,0,4,8,0,1,0,2,13,5,34,0 -10164,10,1,0,14,6,5,10,3,0,1,6,9,2,30,0 -10165,4,4,0,8,4,0,0,3,4,1,2,2,2,17,0 -10166,5,4,0,7,5,6,10,1,4,1,18,16,4,39,1 -10167,8,8,0,2,0,4,4,3,1,1,8,13,0,34,0 -10168,3,6,0,5,5,5,8,0,2,1,13,15,1,9,0 -10169,7,1,0,12,4,6,11,0,3,1,18,8,4,11,1 -10170,10,7,0,8,6,6,6,2,0,1,2,6,0,0,0 -10171,10,0,0,11,1,1,7,4,0,0,8,2,1,2,0 -10172,4,8,0,15,6,6,6,1,2,0,18,8,1,20,1 -10173,7,3,0,3,2,0,0,4,1,1,18,16,5,22,1 -10174,5,3,0,5,3,0,13,3,3,0,8,8,3,3,0 -10175,9,6,0,7,1,5,5,1,2,0,13,13,2,18,0 -10176,8,4,0,7,5,0,5,4,3,0,6,2,4,3,0 -10177,3,7,0,3,6,4,1,2,0,1,4,13,4,24,0 -10178,9,5,0,11,6,6,0,1,3,0,10,2,0,13,0 -10179,2,1,0,1,5,6,3,4,2,0,8,20,0,26,0 -10180,2,7,0,11,0,3,11,4,0,0,13,11,1,13,0 -10181,0,0,0,1,0,2,12,2,3,0,0,20,3,7,0 -10182,5,0,0,10,5,1,2,5,2,1,18,10,2,0,1 -10183,5,8,0,13,6,2,14,1,3,0,6,6,3,30,0 -10184,2,6,0,5,3,2,13,4,0,0,2,11,4,4,0 
-10185,10,8,0,10,1,4,12,0,4,1,1,15,1,18,1 -10186,2,4,0,15,5,5,10,3,2,1,1,2,5,39,0 -10187,6,8,0,6,5,6,7,4,0,0,18,17,2,33,1 -10188,4,8,0,5,1,4,8,1,0,1,13,14,1,6,0 -10189,0,5,0,5,0,5,5,1,4,0,0,18,0,41,0 -10190,10,2,0,3,0,5,11,0,0,0,8,2,2,17,0 -10191,1,4,0,5,0,6,11,4,3,0,1,4,5,13,0 -10192,4,1,0,0,2,3,6,3,0,0,12,0,4,28,0 -10193,1,2,0,7,4,3,5,0,0,1,14,8,3,8,1 -10194,4,5,0,14,2,6,5,4,3,0,8,4,2,7,0 -10195,8,8,0,5,0,5,5,5,4,1,13,9,5,38,0 -10196,10,5,0,12,4,6,4,5,4,1,16,9,3,14,1 -10197,7,5,0,8,6,0,8,2,1,1,2,11,0,39,0 -10198,0,5,0,5,1,1,8,2,4,0,8,16,3,5,0 -10199,1,4,0,6,6,1,3,2,0,1,11,11,0,39,0 -10200,8,1,0,0,3,0,10,5,3,0,3,2,4,5,0 -10201,8,2,0,10,2,5,3,2,0,1,2,13,1,28,0 -10202,0,0,0,4,5,6,1,2,4,0,3,1,1,20,1 -10203,2,2,0,5,6,6,13,4,0,1,13,2,0,22,0 -10204,8,5,0,14,0,4,6,4,2,0,10,2,3,26,0 -10205,4,6,0,5,0,4,11,2,4,1,12,15,1,16,0 -10206,6,1,0,4,6,0,11,1,2,0,18,5,1,21,1 -10207,6,5,0,10,2,1,11,0,1,1,18,5,4,12,1 -10208,8,7,0,4,2,5,8,3,3,1,16,11,0,7,0 -10209,0,3,0,4,0,6,0,2,4,0,8,11,5,31,0 -10210,10,2,0,14,6,6,13,5,0,1,18,10,4,25,1 -10211,3,7,0,8,0,5,4,3,1,1,13,18,2,16,0 -10212,10,8,0,8,0,6,4,1,0,1,18,19,4,35,1 -10213,5,1,0,12,2,6,3,4,1,1,11,2,4,21,0 -10214,0,3,0,11,5,5,0,1,0,0,2,15,4,28,0 -10215,9,2,0,3,5,5,2,0,0,0,12,6,0,14,0 -10216,10,5,0,15,1,2,6,5,2,0,12,7,5,5,1 -10217,7,1,0,11,6,0,3,1,4,1,14,4,0,27,1 -10218,3,1,0,7,4,4,6,5,0,0,2,2,4,16,0 -10219,0,6,0,1,5,4,0,3,3,0,13,18,3,13,0 -10220,2,7,0,3,2,5,5,2,2,0,13,6,2,9,0 -10221,1,8,0,6,0,3,9,3,3,1,13,11,1,21,0 -10222,8,4,0,4,0,5,6,3,4,0,13,2,4,36,0 -10223,6,0,0,0,1,4,9,4,0,1,0,9,5,29,0 -10224,2,0,0,0,1,0,2,3,2,0,0,2,4,12,0 -10225,1,6,0,5,3,0,7,1,1,1,2,11,0,10,0 -10226,10,6,0,7,2,4,13,0,1,0,16,0,4,8,0 -10227,3,6,0,0,0,4,10,3,4,0,2,2,0,0,0 -10228,2,5,0,13,5,0,6,3,2,0,15,13,3,6,0 -10229,0,5,0,7,0,4,6,0,3,1,2,18,3,19,0 -10230,2,5,0,5,5,4,13,1,2,1,0,2,1,3,0 -10231,9,6,0,12,6,0,3,1,2,1,13,11,1,10,0 -10232,0,6,0,15,3,1,9,2,1,1,2,7,1,28,0 -10233,3,7,0,14,1,0,2,5,1,0,16,8,5,20,1 -10234,10,4,0,5,3,6,3,5,2,1,6,2,3,7,0 
-10235,8,2,0,8,6,5,14,0,0,1,10,11,0,28,0 -10236,3,0,0,2,2,0,10,4,3,1,15,12,0,3,0 -10237,9,0,0,15,6,0,14,3,0,1,13,15,2,29,0 -10238,2,5,0,1,0,4,14,2,3,0,4,6,0,3,0 -10239,6,4,0,12,0,6,13,5,2,1,5,15,5,0,1 -10240,3,7,0,11,4,2,2,2,2,1,2,18,1,7,0 -10241,1,2,0,11,1,5,1,0,3,1,2,6,1,18,0 -10242,0,8,0,15,2,4,5,0,0,0,2,11,3,19,0 -10243,0,0,0,9,3,4,0,4,0,0,12,17,1,33,0 -10244,10,3,0,7,1,0,5,2,2,1,1,17,2,7,1 -10245,6,5,0,4,1,3,8,1,0,0,4,15,0,16,0 -10246,0,3,0,1,2,3,12,4,1,1,13,20,3,34,0 -10247,9,4,0,0,4,1,0,4,2,1,8,13,1,27,0 -10248,8,3,0,15,4,6,0,5,2,1,18,10,0,16,1 -10249,9,3,0,1,0,6,6,4,4,0,15,20,3,4,0 -10250,7,8,0,12,1,3,3,5,2,1,18,1,3,0,1 -10251,3,6,0,10,5,4,5,3,0,0,0,9,2,13,0 -10252,2,6,0,3,5,6,6,0,4,0,18,16,5,8,1 -10253,9,8,0,6,5,1,9,2,0,0,0,2,3,21,0 -10254,7,4,0,15,5,2,3,1,1,0,2,20,1,37,0 -10255,2,8,0,5,1,4,7,3,2,0,7,15,0,17,0 -10256,2,7,0,4,1,0,3,0,0,0,0,3,1,37,0 -10257,8,3,0,0,5,6,8,2,1,1,4,16,2,36,0 -10258,3,6,0,0,0,0,10,4,1,1,14,10,0,37,0 -10259,5,0,0,12,1,5,9,2,2,0,8,6,4,35,0 -10260,0,5,0,2,1,0,11,1,2,1,18,5,1,27,1 -10261,4,0,0,13,2,0,4,0,3,1,1,7,4,37,1 -10262,7,0,0,13,4,2,1,0,0,1,2,2,5,12,0 -10263,4,2,0,10,5,6,0,5,4,1,7,17,5,29,1 -10264,10,7,0,15,1,4,7,1,1,1,8,9,4,37,0 -10265,4,6,0,0,0,4,8,1,1,1,13,15,1,10,0 -10266,2,2,0,13,0,5,1,0,2,1,13,17,3,8,0 -10267,7,5,0,12,0,5,3,2,2,1,18,6,5,27,1 -10268,9,6,0,9,6,0,7,4,4,1,4,20,3,21,0 -10269,8,8,0,6,0,1,9,1,0,0,2,9,1,2,0 -10270,8,8,0,5,4,1,6,2,4,0,2,11,2,28,0 -10271,2,6,0,8,0,5,0,4,0,1,1,11,3,3,0 -10272,7,5,0,1,2,1,13,4,2,1,5,19,3,33,1 -10273,5,0,0,6,0,5,8,3,4,1,6,20,2,28,0 -10274,1,3,0,11,4,1,12,3,1,1,8,11,2,14,0 -10275,2,0,0,4,1,5,8,2,3,1,13,9,5,4,0 -10276,6,5,0,9,2,1,8,5,1,0,14,7,2,27,1 -10277,9,3,0,8,6,4,7,0,4,1,2,18,2,14,0 -10278,3,8,0,5,4,1,2,1,1,0,15,14,1,8,0 -10279,0,3,0,15,0,3,3,4,4,0,6,19,5,40,0 -10280,0,1,0,11,2,1,13,5,2,0,2,2,1,10,0 -10281,6,4,0,5,6,4,5,1,2,1,4,4,0,41,0 -10282,8,3,0,2,3,5,14,5,0,1,13,2,3,31,0 -10283,10,8,0,6,3,5,1,1,4,1,10,3,4,5,0 -10284,2,4,0,13,5,3,8,3,2,0,3,6,5,37,0 
-10285,9,3,0,12,6,4,1,4,0,1,13,20,1,23,0 -10286,0,1,0,1,2,4,13,5,0,0,6,13,1,7,0 -10287,5,1,0,0,0,6,8,3,4,1,9,5,3,19,1 -10288,0,0,0,13,6,0,13,0,0,1,12,20,3,27,0 -10289,1,4,0,9,1,0,14,2,1,1,0,13,3,33,0 -10290,1,0,0,3,2,3,1,3,1,0,6,7,5,1,0 -10291,6,4,0,5,2,5,13,0,4,1,18,7,0,1,1 -10292,1,0,0,14,0,0,6,3,1,0,4,2,4,3,0 -10293,8,6,0,2,4,0,13,4,3,0,0,2,2,23,0 -10294,9,8,0,3,6,5,11,3,3,1,13,12,4,21,0 -10295,7,4,0,5,6,0,1,4,3,1,2,11,5,27,0 -10296,1,8,0,7,4,5,10,2,3,1,8,20,3,6,0 -10297,9,7,0,4,5,4,14,4,2,1,13,16,2,35,0 -10298,9,5,0,0,0,5,14,1,0,0,17,11,5,17,0 -10299,5,0,0,9,6,2,6,0,4,1,1,8,5,9,1 -10300,3,4,0,5,4,5,1,2,0,0,17,18,2,6,0 -10301,1,5,0,12,5,3,11,4,4,0,0,2,4,27,0 -10302,1,8,0,9,0,4,9,1,4,1,2,9,0,4,0 -10303,8,2,0,15,0,5,0,0,2,0,6,15,4,19,0 -10304,0,2,0,2,5,3,5,5,3,0,8,15,1,34,0 -10305,0,8,0,9,5,2,0,3,0,0,7,11,5,14,0 -10306,0,8,0,8,5,1,2,3,1,1,15,17,4,0,1 -10307,5,5,0,10,1,0,2,4,0,0,8,11,4,14,0 -10308,10,5,0,12,3,1,6,3,0,1,14,8,3,30,1 -10309,3,1,0,4,1,4,5,5,0,1,8,9,4,22,0 -10310,4,7,0,5,0,0,3,1,4,1,12,20,0,31,1 -10311,5,7,0,15,0,0,3,0,0,0,8,11,4,38,0 -10312,6,8,0,14,2,2,9,3,0,1,8,15,1,37,0 -10313,2,5,0,8,4,1,12,5,3,1,1,15,1,31,1 -10314,3,3,0,1,1,4,12,3,3,1,0,17,0,37,0 -10315,6,1,0,2,6,6,6,2,1,0,10,18,4,31,0 -10316,5,0,0,7,3,6,4,1,3,1,2,15,2,7,0 -10317,6,4,0,13,2,0,3,2,0,0,17,2,2,14,0 -10318,3,6,0,11,4,5,13,3,1,0,18,1,4,13,1 -10319,10,6,0,2,6,4,11,3,3,1,13,11,4,4,0 -10320,9,6,0,1,0,6,5,3,3,0,10,2,0,29,0 -10321,0,7,0,8,0,5,7,1,2,0,8,3,4,19,0 -10322,5,4,0,10,2,1,7,0,2,1,8,18,3,24,0 -10323,2,3,0,11,2,1,8,1,0,1,2,9,5,41,0 -10324,1,5,0,15,1,3,8,3,1,0,2,15,0,12,0 -10325,0,1,0,0,5,0,12,4,3,0,2,11,3,21,0 -10326,1,3,0,3,0,0,11,2,0,0,2,12,0,9,0 -10327,9,1,0,1,5,3,7,4,0,1,2,11,5,24,0 -10328,8,8,0,1,3,1,12,3,0,0,3,20,1,3,1 -10329,2,6,0,11,6,1,8,1,3,0,0,6,0,41,0 -10330,0,8,0,4,2,4,7,5,4,0,16,4,0,30,0 -10331,9,7,0,6,1,4,2,0,1,0,12,17,1,32,0 -10332,2,3,0,9,0,5,11,3,2,0,2,0,1,0,0 -10333,10,3,0,11,0,5,9,2,3,0,4,16,0,31,0 -10334,1,7,0,3,6,5,8,3,0,0,16,13,2,37,0 
-10335,0,2,0,14,1,0,2,0,3,0,12,0,2,19,0 -10336,10,5,0,3,6,3,7,3,0,1,8,11,1,0,0 -10337,2,3,0,11,6,0,3,3,0,1,17,17,3,26,0 -10338,3,0,0,5,1,0,6,4,0,0,14,11,2,5,0 -10339,5,3,0,4,6,5,4,3,0,0,17,11,5,9,0 -10340,2,6,0,0,2,0,4,3,0,0,15,9,2,19,0 -10341,2,3,0,15,3,5,3,2,2,0,8,17,3,6,0 -10342,8,1,0,6,5,2,5,5,4,0,9,17,0,39,1 -10343,1,8,0,4,1,6,5,4,1,0,8,6,0,12,0 -10344,3,2,0,2,6,5,6,2,4,0,12,20,5,8,0 -10345,2,0,0,3,2,0,8,4,0,0,17,0,2,12,0 -10346,10,0,0,1,4,4,8,1,0,0,6,20,0,20,0 -10347,3,2,0,6,2,5,5,4,3,0,8,6,1,14,0 -10348,0,3,0,3,3,6,7,4,1,0,6,15,4,11,0 -10349,0,4,0,3,0,4,14,0,3,0,3,2,5,14,0 -10350,1,4,0,5,4,4,12,0,0,0,13,19,0,5,0 -10351,4,4,0,1,1,4,5,2,3,1,0,18,5,3,0 -10352,0,6,0,7,5,5,11,4,2,0,9,17,1,38,1 -10353,9,0,0,14,5,6,0,2,2,1,5,7,3,22,1 -10354,7,7,0,15,1,2,9,1,0,0,2,20,1,0,0 -10355,5,4,0,14,4,5,7,4,0,1,13,9,1,21,0 -10356,10,1,0,5,2,6,2,1,3,1,5,7,1,34,1 -10357,2,8,0,11,0,0,4,4,4,0,17,15,3,19,0 -10358,10,0,0,0,6,5,2,3,3,0,8,11,4,31,0 -10359,2,2,0,3,0,0,0,1,2,0,13,13,4,8,0 -10360,3,3,0,15,1,0,8,1,1,1,8,11,2,24,0 -10361,10,0,0,15,0,4,8,0,3,1,9,14,1,29,1 -10362,4,8,0,15,6,4,12,4,1,1,17,13,1,11,0 -10363,8,6,0,4,5,5,5,1,1,1,8,6,1,39,0 -10364,2,6,0,7,0,3,8,3,0,1,0,3,2,26,0 -10365,5,2,0,12,4,6,11,2,4,0,5,11,5,29,1 -10366,7,6,0,15,1,3,14,1,4,0,10,1,1,1,0 -10367,0,5,0,13,3,1,9,4,0,0,10,11,0,5,0 -10368,3,2,0,11,5,6,12,4,2,0,16,9,0,23,0 -10369,5,3,0,4,1,1,2,4,1,1,5,19,1,32,1 -10370,7,6,0,13,4,1,3,0,4,0,4,6,0,23,0 -10371,4,8,0,4,4,0,12,0,1,1,11,17,3,19,1 -10372,2,1,0,6,1,0,13,4,0,1,0,0,0,20,0 -10373,1,0,0,12,0,4,3,1,0,0,8,15,3,10,0 -10374,6,0,0,4,0,6,8,1,0,0,6,6,5,39,0 -10375,3,3,0,5,3,0,4,1,0,0,2,2,3,3,0 -10376,1,3,0,8,5,2,2,2,4,0,2,9,3,26,0 -10377,4,7,0,2,0,2,11,1,0,1,13,11,1,9,0 -10378,7,7,0,14,3,2,6,0,2,0,10,17,1,12,1 -10379,0,1,0,6,5,2,7,4,3,0,5,6,0,38,0 -10380,5,5,0,2,4,0,2,2,3,1,0,2,1,22,0 -10381,2,0,0,9,6,4,9,4,2,1,10,11,4,34,0 -10382,4,6,0,11,5,4,2,1,0,0,0,13,1,6,0 -10383,7,0,0,7,6,2,11,4,3,1,16,7,4,35,1 -10384,7,3,0,4,4,5,2,3,0,0,12,12,5,10,0 
-10385,1,0,0,3,5,2,3,1,0,1,13,9,3,28,0 -10386,8,0,0,5,3,0,10,5,1,0,18,17,5,30,1 -10387,5,8,0,14,0,0,1,2,0,1,17,0,4,27,0 -10388,10,3,0,15,1,5,11,0,4,0,6,13,1,35,0 -10389,1,2,0,0,2,0,7,1,2,0,15,16,3,38,0 -10390,1,3,0,3,0,1,6,0,3,1,8,16,1,40,0 -10391,7,8,0,11,5,1,4,1,1,0,16,5,3,7,1 -10392,0,5,0,1,1,5,0,4,4,1,2,20,2,39,0 -10393,7,2,0,11,0,2,3,3,4,1,13,18,4,13,0 -10394,10,0,0,3,5,0,7,2,0,0,17,13,1,3,0 -10395,2,4,0,8,5,0,6,2,0,0,4,18,4,6,0 -10396,7,2,0,5,2,0,6,1,4,1,17,11,2,11,0 -10397,9,7,0,3,0,3,14,2,0,0,16,18,1,8,0 -10398,10,7,0,12,4,4,1,2,4,1,10,15,0,6,0 -10399,9,8,0,8,2,6,9,2,2,1,0,2,3,40,0 -10400,1,2,0,0,4,2,12,5,2,1,15,0,1,38,1 -10401,0,3,0,0,2,3,13,0,0,0,1,13,0,21,0 -10402,10,5,0,10,0,5,0,1,1,0,0,18,1,9,0 -10403,0,4,0,13,0,5,12,1,4,1,6,6,3,28,0 -10404,2,3,0,12,6,1,9,2,0,0,8,6,4,1,0 -10405,5,1,0,3,4,6,8,1,1,0,2,2,3,40,0 -10406,3,8,0,3,0,4,7,4,1,1,2,17,1,13,0 -10407,7,6,0,7,3,1,4,1,3,1,18,10,1,13,1 -10408,0,2,0,6,6,1,1,4,2,1,2,11,1,18,0 -10409,1,1,0,15,5,3,12,2,3,1,4,3,0,26,0 -10410,0,4,0,6,1,0,6,4,0,1,2,0,1,0,0 -10411,2,7,0,6,0,4,0,5,0,0,10,3,5,11,0 -10412,10,0,0,5,4,3,1,5,2,1,9,9,3,1,1 -10413,9,6,0,10,5,3,7,5,2,1,18,12,4,33,1 -10414,8,1,0,1,6,6,6,1,4,1,9,17,3,29,1 -10415,7,3,0,6,4,3,7,4,0,0,9,0,3,30,1 -10416,0,2,0,4,5,4,8,2,2,0,11,20,0,31,0 -10417,2,8,0,6,3,0,5,1,0,0,12,15,0,41,0 -10418,0,1,0,10,6,0,10,5,1,0,2,2,4,36,0 -10419,2,7,0,0,6,4,8,3,4,1,2,15,1,13,0 -10420,4,4,0,15,4,0,13,4,2,1,13,11,2,41,0 -10421,2,5,0,1,0,3,6,3,0,0,2,13,5,27,0 -10422,6,5,0,13,1,0,12,0,1,0,8,4,5,8,0 -10423,2,7,0,11,0,3,5,0,0,1,11,17,0,16,0 -10424,5,3,0,3,4,0,3,3,1,0,2,15,5,10,0 -10425,0,4,0,2,1,6,8,0,1,0,17,2,0,37,0 -10426,6,6,0,5,3,1,3,2,2,0,8,18,3,21,0 -10427,2,7,0,8,0,4,8,3,4,0,1,11,2,39,0 -10428,6,2,0,14,6,2,10,0,1,1,18,10,1,27,1 -10429,0,3,0,2,2,4,1,3,1,1,12,11,0,16,0 -10430,6,7,0,7,0,0,2,0,4,0,13,14,0,17,0 -10431,6,5,0,10,6,2,5,0,4,1,18,12,1,30,1 -10432,5,4,0,6,5,4,4,5,0,0,15,18,1,10,0 -10433,6,8,0,13,6,4,6,4,3,1,12,9,2,23,0 -10434,8,0,0,9,0,0,4,5,0,0,14,1,1,20,1 
-10435,1,5,0,5,0,4,1,3,0,0,8,15,2,10,0 -10436,0,0,0,15,5,0,1,2,1,1,0,16,4,38,0 -10437,3,7,0,9,2,0,12,2,2,1,0,0,0,14,0 -10438,5,1,0,2,4,4,9,1,3,0,4,2,5,29,0 -10439,1,0,0,0,1,4,9,2,0,0,2,6,1,38,0 -10440,4,7,0,1,1,0,1,4,1,0,0,6,0,1,0 -10441,0,8,0,14,2,2,14,3,2,0,17,6,0,4,0 -10442,0,4,0,7,5,0,9,0,4,0,13,6,1,1,0 -10443,3,6,0,10,6,4,1,2,0,0,0,11,0,24,0 -10444,2,7,0,15,1,3,6,0,4,1,0,4,1,35,0 -10445,3,5,0,13,4,2,2,3,0,1,0,9,1,9,0 -10446,1,8,0,13,0,5,11,1,2,1,12,2,2,28,0 -10447,7,1,0,10,2,3,12,3,0,0,5,7,0,19,1 -10448,7,7,0,7,6,0,2,4,0,1,7,9,2,18,0 -10449,1,3,0,5,3,4,14,4,1,0,8,11,0,7,0 -10450,6,3,0,8,5,3,13,5,3,1,7,19,4,24,1 -10451,2,6,0,6,0,6,8,4,3,0,13,0,0,25,0 -10452,5,7,0,14,1,0,6,1,2,0,2,0,5,9,0 -10453,10,0,0,3,2,2,10,4,3,1,4,12,3,7,1 -10454,1,1,0,1,0,3,11,3,0,1,8,7,3,7,0 -10455,1,1,0,13,0,4,8,4,0,1,2,4,0,4,0 -10456,0,3,0,8,4,0,5,1,3,0,4,0,1,41,0 -10457,10,4,0,5,3,1,3,2,4,1,9,8,4,28,1 -10458,9,5,0,0,1,4,9,3,1,1,13,11,4,20,0 -10459,9,4,0,11,1,4,1,2,2,0,13,20,1,37,0 -10460,1,1,0,6,6,5,9,4,4,0,4,18,4,17,0 -10461,3,0,0,5,4,2,2,5,0,0,7,16,3,21,1 -10462,9,1,0,14,1,2,12,1,0,0,10,0,5,17,0 -10463,0,0,0,3,6,4,9,1,2,1,8,9,2,2,0 -10464,1,4,0,8,4,0,2,0,4,1,18,10,3,31,1 -10465,3,4,0,2,6,0,6,2,0,1,3,1,5,30,0 -10466,1,6,0,11,4,5,14,3,3,1,9,5,1,4,1 -10467,7,8,0,5,0,0,8,3,1,0,15,0,2,26,0 -10468,9,6,0,10,1,3,14,3,1,1,8,6,5,31,0 -10469,1,6,0,8,6,1,12,5,1,1,9,14,1,3,1 -10470,4,7,0,7,6,1,8,2,2,0,8,19,3,18,0 -10471,6,1,0,4,4,1,0,5,1,0,3,8,4,3,1 -10472,2,7,0,15,6,4,5,5,2,0,13,2,4,4,0 -10473,10,1,0,14,1,3,10,4,0,1,14,19,5,24,1 -10474,3,7,0,0,3,1,9,5,0,0,2,2,1,20,0 -10475,10,1,0,15,3,0,2,5,3,1,16,11,1,41,0 -10476,5,7,0,2,6,6,0,0,0,1,18,1,3,22,1 -10477,2,6,0,5,2,4,12,4,1,1,18,8,3,8,1 -10478,10,1,0,10,6,5,7,3,2,1,8,16,2,29,0 -10479,1,0,0,4,2,2,9,3,4,1,11,8,1,3,0 -10480,4,5,0,12,0,0,9,1,1,0,13,2,2,13,0 -10481,3,7,0,1,2,2,11,3,1,1,18,8,0,31,1 -10482,0,7,0,0,6,5,13,0,0,0,13,2,2,27,0 -10483,4,1,0,8,2,0,5,0,3,1,13,15,1,39,0 -10484,5,8,0,8,0,3,6,1,2,0,15,11,0,8,0 
-10485,3,5,0,14,1,2,9,2,2,0,10,16,3,14,0 -10486,9,3,0,9,5,5,11,4,0,0,18,5,5,13,1 -10487,9,2,0,11,1,1,13,2,2,1,10,11,3,6,0 -10488,3,7,0,12,5,3,10,4,1,1,8,2,4,27,0 -10489,8,4,0,3,0,0,8,2,3,0,17,15,1,18,0 -10490,9,8,0,5,6,4,6,3,2,1,0,15,3,36,0 -10491,1,5,0,3,6,0,8,5,0,1,3,13,1,29,0 -10492,9,8,0,9,0,4,10,3,2,1,13,18,4,28,0 -10493,10,2,0,15,4,4,1,3,1,0,13,2,0,34,0 -10494,2,8,0,13,0,5,13,3,3,0,10,2,0,21,0 -10495,4,1,0,13,6,3,0,1,0,0,15,0,2,11,0 -10496,9,5,0,3,2,2,10,0,2,1,16,17,1,31,1 -10497,6,0,0,3,2,5,5,1,3,1,2,6,1,3,0 -10498,1,4,0,4,6,1,2,4,4,1,13,4,5,16,0 -10499,5,1,0,5,5,2,9,3,0,1,3,10,0,37,1 -10500,9,5,0,10,4,3,7,0,3,0,16,17,3,24,1 -10501,9,8,0,5,0,3,14,4,2,0,8,13,1,5,0 -10502,3,4,0,12,6,3,7,2,1,1,13,18,0,17,0 -10503,2,8,0,13,1,3,4,4,3,1,17,11,3,18,0 -10504,8,4,0,9,6,0,10,0,4,0,18,5,1,20,1 -10505,3,2,0,15,3,1,1,1,4,1,7,5,1,30,1 -10506,6,5,0,2,0,4,3,1,1,1,13,0,2,10,0 -10507,4,6,0,6,0,2,10,4,4,1,0,20,4,3,0 -10508,7,5,0,14,4,1,6,1,2,0,14,7,5,20,1 -10509,10,8,0,6,6,5,6,2,2,1,7,7,1,30,0 -10510,0,7,0,11,2,6,7,3,4,0,14,2,2,26,0 -10511,2,0,0,9,4,6,9,2,4,1,9,6,3,37,0 -10512,8,4,0,2,0,6,1,5,2,0,9,1,2,21,1 -10513,10,0,0,9,3,0,12,1,4,1,9,1,2,31,1 -10514,7,6,0,2,2,6,7,4,0,1,0,10,1,5,1 -10515,8,8,0,10,5,1,9,1,4,1,18,8,1,20,1 -10516,2,7,0,10,0,5,11,1,0,1,2,2,5,40,0 -10517,7,1,0,3,2,0,1,1,3,0,18,19,4,35,1 -10518,6,0,0,3,3,6,5,2,1,0,2,4,2,34,0 -10519,8,8,0,13,0,3,12,4,0,1,8,0,3,8,0 -10520,1,8,0,8,2,6,9,0,3,1,16,11,1,26,0 -10521,10,0,0,6,2,2,7,3,3,0,8,4,5,26,0 -10522,7,8,0,13,2,4,8,1,4,0,8,0,5,2,0 -10523,0,7,0,11,3,4,5,2,2,1,2,16,4,39,0 -10524,0,6,0,14,5,6,13,1,1,1,10,2,1,32,0 -10525,7,6,0,6,5,5,1,3,0,1,13,9,1,36,0 -10526,3,0,0,7,4,4,7,4,3,1,8,2,4,4,0 -10527,9,8,0,0,1,4,8,5,4,1,7,1,4,32,1 -10528,5,7,0,15,3,5,0,5,4,0,14,12,1,2,0 -10529,10,5,0,9,5,2,6,3,4,1,10,17,0,33,0 -10530,2,4,0,13,0,0,13,3,2,0,17,11,1,4,0 -10531,9,5,0,4,6,6,8,1,1,0,1,5,3,26,1 -10532,6,5,0,8,4,1,2,3,3,1,18,19,1,24,1 -10533,9,1,0,2,2,2,14,3,4,1,2,8,4,12,0 -10534,1,7,0,6,1,5,3,1,1,1,3,19,0,13,0 
-10535,7,8,0,7,0,3,2,4,4,0,5,18,4,31,0 -10536,8,8,0,13,3,2,3,5,0,1,10,9,4,1,0 -10537,0,3,0,8,1,5,3,1,4,1,8,2,4,34,0 -10538,9,0,0,1,6,5,13,0,3,1,18,5,3,11,1 -10539,1,3,0,4,6,1,14,2,1,0,10,2,1,10,0 -10540,7,6,0,12,5,4,6,4,4,1,9,12,2,31,1 -10541,5,5,0,8,0,1,7,1,1,0,5,1,2,0,1 -10542,0,6,0,1,1,3,1,4,2,0,8,2,3,0,0 -10543,3,4,0,13,4,5,12,4,3,1,13,6,2,8,0 -10544,5,7,0,14,1,0,7,2,4,0,17,11,3,33,0 -10545,10,3,0,6,0,0,7,2,4,0,15,6,0,33,0 -10546,0,2,0,2,0,0,6,0,4,0,12,16,2,23,0 -10547,0,7,0,8,3,0,0,5,4,0,13,11,4,27,0 -10548,2,5,0,12,2,6,14,5,2,1,6,5,4,26,1 -10549,2,7,0,4,0,4,0,3,1,1,8,11,0,3,0 -10550,2,7,0,15,1,4,1,4,3,1,15,11,2,2,0 -10551,1,6,0,2,6,0,10,1,0,1,16,8,5,32,1 -10552,3,3,0,10,0,4,1,1,4,1,6,18,4,27,0 -10553,3,4,0,9,1,1,11,2,3,0,18,2,4,5,1 -10554,2,0,0,8,1,4,0,0,2,1,10,0,5,16,0 -10555,6,8,0,9,3,0,14,0,3,1,12,4,5,32,1 -10556,2,3,0,10,6,4,7,5,0,1,16,9,0,17,0 -10557,10,6,0,3,2,0,1,3,2,0,17,10,4,23,0 -10558,0,6,0,15,2,3,10,0,3,1,12,10,2,6,1 -10559,9,8,0,6,1,2,1,2,4,1,16,10,4,8,1 -10560,3,0,0,9,5,3,12,5,3,1,11,5,0,41,1 -10561,3,0,0,15,6,1,1,1,1,1,11,20,3,4,0 -10562,3,5,0,2,3,1,8,2,2,1,13,2,1,0,0 -10563,8,6,0,9,5,5,14,4,3,0,13,20,3,41,0 -10564,9,7,0,13,1,0,3,1,4,0,1,10,2,22,1 -10565,7,3,0,4,0,4,4,1,3,1,8,11,2,19,0 -10566,2,7,0,15,6,5,10,3,2,1,0,15,2,30,0 -10567,2,2,0,7,6,5,1,1,1,1,8,20,4,2,0 -10568,6,4,0,9,4,0,14,0,2,1,15,10,1,7,1 -10569,4,5,0,14,5,6,7,5,2,0,13,20,3,14,0 -10570,3,7,0,11,0,0,8,3,2,1,13,1,5,27,0 -10571,9,0,0,4,5,2,0,5,2,1,18,10,2,9,1 -10572,10,6,0,7,6,3,10,5,3,1,18,13,2,27,1 -10573,2,3,0,6,3,4,12,4,1,0,13,17,4,6,0 -10574,0,7,0,10,0,0,7,3,0,1,13,13,4,28,0 -10575,0,8,0,14,3,1,0,4,1,1,0,0,2,3,0 -10576,7,5,0,4,6,0,7,1,1,1,11,1,1,23,1 -10577,2,3,0,8,1,6,9,4,1,1,2,1,0,24,0 -10578,8,7,0,13,3,4,13,4,1,1,4,2,2,5,0 -10579,8,2,0,12,6,2,11,0,3,0,5,13,5,38,1 -10580,9,5,0,14,1,4,11,1,0,1,9,10,5,0,1 -10581,10,8,0,8,1,1,14,2,4,0,6,11,0,37,0 -10582,7,4,0,9,4,5,11,2,0,1,0,6,3,27,0 -10583,3,0,0,3,0,5,1,0,3,1,15,13,2,41,0 -10584,8,4,0,10,0,1,0,1,1,1,7,19,4,20,1 
-10585,2,0,0,13,4,6,12,3,0,1,8,13,3,21,0 -10586,2,7,0,2,4,0,8,4,3,1,13,15,2,24,0 -10587,0,8,0,5,6,3,9,3,3,0,13,13,3,32,0 -10588,3,0,0,3,2,5,1,3,3,1,3,18,0,31,0 -10589,0,6,0,8,6,5,5,3,2,1,2,7,3,39,0 -10590,9,3,0,1,2,0,9,1,3,0,4,18,4,1,0 -10591,3,6,0,7,1,1,9,3,1,1,10,19,0,9,0 -10592,3,1,0,4,0,5,12,4,3,1,16,20,1,11,1 -10593,3,8,0,5,5,5,11,2,3,0,2,3,1,40,0 -10594,2,3,0,7,3,5,14,0,4,0,11,6,2,41,0 -10595,4,1,0,5,4,6,4,2,3,0,10,3,5,12,1 -10596,10,2,0,13,0,4,0,5,3,1,13,8,0,29,0 -10597,5,3,0,8,5,1,4,3,4,1,18,17,2,31,1 -10598,0,0,0,12,0,0,4,5,3,1,8,8,2,17,0 -10599,10,0,0,3,5,0,14,1,4,0,11,15,5,35,0 -10600,10,1,0,5,4,2,0,5,2,1,1,8,3,30,1 -10601,2,6,0,7,1,4,4,5,3,1,6,15,2,35,0 -10602,9,7,0,7,2,5,3,1,0,1,13,6,1,35,0 -10603,10,6,0,1,6,5,6,1,2,0,12,6,3,19,0 -10604,1,0,0,6,6,0,0,3,2,0,4,2,3,29,0 -10605,8,2,0,3,2,1,8,2,1,1,2,0,5,10,0 -10606,9,5,0,3,3,3,0,1,2,0,2,16,4,9,0 -10607,2,7,0,5,5,4,4,4,3,0,17,11,0,32,0 -10608,3,8,0,8,3,4,10,2,0,0,2,0,0,18,0 -10609,5,3,0,2,4,0,3,3,0,0,2,2,5,12,0 -10610,9,1,0,10,2,0,4,0,0,1,8,7,2,39,0 -10611,8,2,0,9,0,4,14,4,1,0,17,11,2,36,0 -10612,6,1,0,14,0,4,13,5,4,1,18,7,3,22,1 -10613,10,3,0,9,0,4,4,4,2,1,17,2,2,13,0 -10614,2,5,0,11,4,6,9,4,3,1,18,16,5,16,1 -10615,1,8,0,8,0,5,5,4,0,1,8,2,2,8,0 -10616,10,1,0,2,0,1,9,2,3,0,13,2,4,37,0 -10617,0,6,0,13,0,6,9,3,1,1,4,0,5,6,0 -10618,2,5,0,8,1,6,9,0,2,1,2,11,4,31,0 -10619,0,7,0,2,2,4,10,3,2,0,4,2,1,37,0 -10620,10,8,0,10,5,2,9,2,0,0,18,7,0,16,1 -10621,6,5,0,3,2,4,5,3,1,1,13,20,1,26,0 -10622,3,3,0,6,1,1,5,0,0,1,18,0,3,3,1 -10623,2,3,0,7,0,3,11,3,4,0,4,15,1,3,0 -10624,10,6,0,13,0,5,8,1,3,0,8,4,3,33,0 -10625,1,6,0,1,2,6,1,1,2,1,11,15,0,34,0 -10626,6,1,0,10,1,1,8,0,2,1,10,6,4,0,0 -10627,9,1,0,8,0,4,9,1,4,0,8,11,2,18,0 -10628,8,1,0,10,6,5,14,0,1,0,9,10,0,9,1 -10629,4,5,0,4,6,4,3,4,2,1,17,6,1,26,0 -10630,3,4,0,4,6,6,5,0,2,0,11,3,3,5,0 -10631,6,8,0,14,0,2,6,0,0,1,1,8,4,9,1 -10632,1,5,0,0,5,5,13,4,2,1,5,2,3,6,0 -10633,1,3,0,5,6,1,0,1,1,1,13,2,0,5,0 -10634,4,6,0,12,0,3,9,4,1,0,2,3,4,38,0 
-10635,0,4,0,12,4,1,3,0,3,0,17,11,5,16,0 -10636,9,2,0,0,1,0,1,1,3,0,8,18,0,6,0 -10637,5,1,0,1,6,4,7,1,3,0,13,20,2,3,0 -10638,3,5,0,6,3,4,5,4,3,1,4,11,1,22,0 -10639,4,5,0,1,1,5,4,3,3,0,12,2,5,1,0 -10640,5,8,0,11,0,5,8,5,0,1,18,7,1,19,1 -10641,2,3,0,8,6,5,3,4,1,1,14,20,0,8,0 -10642,6,1,0,8,5,1,14,2,2,1,5,13,1,0,1 -10643,7,8,0,12,5,5,9,0,4,1,7,11,5,37,0 -10644,1,5,0,9,0,0,8,5,3,0,17,14,3,38,0 -10645,0,7,0,2,0,3,9,4,0,1,16,6,5,26,0 -10646,6,8,0,13,6,0,10,4,0,0,2,2,3,6,0 -10647,6,8,0,5,2,2,0,1,4,0,18,19,4,6,1 -10648,2,5,0,7,1,0,1,0,0,0,10,9,0,4,0 -10649,9,7,0,7,5,5,6,5,1,1,4,3,0,25,0 -10650,2,8,0,1,4,3,8,0,4,0,8,20,2,33,0 -10651,5,6,0,13,0,0,9,1,4,0,13,16,3,17,0 -10652,5,7,0,6,1,4,2,2,4,0,15,2,2,28,0 -10653,10,7,0,11,0,5,10,1,1,1,2,12,5,18,0 -10654,1,1,0,15,5,6,14,3,3,1,10,15,2,4,0 -10655,7,3,0,8,1,1,12,1,2,0,2,9,5,32,0 -10656,5,4,0,7,6,1,4,0,4,0,3,9,4,14,1 -10657,8,5,0,11,1,6,8,4,1,0,4,15,1,11,0 -10658,4,6,0,1,1,0,3,2,0,0,16,12,4,0,0 -10659,2,5,0,7,2,5,0,4,2,1,13,15,0,40,0 -10660,2,8,0,0,1,4,2,0,3,0,15,16,0,7,0 -10661,3,1,0,7,2,6,1,4,4,1,8,8,4,40,0 -10662,1,1,0,5,5,0,4,4,4,0,15,20,5,32,0 -10663,5,0,0,14,6,1,4,5,2,1,7,19,0,39,1 -10664,8,2,0,3,5,1,10,3,1,0,5,7,3,21,1 -10665,0,4,0,12,3,2,14,3,2,1,8,16,1,24,0 -10666,0,2,0,1,1,4,9,1,3,1,12,11,0,24,0 -10667,7,5,0,6,5,4,7,2,2,1,6,16,4,17,0 -10668,2,5,0,1,3,0,1,2,2,0,2,8,5,38,0 -10669,1,8,0,11,0,0,10,1,4,1,11,13,0,16,0 -10670,3,0,0,7,0,0,8,4,4,1,17,5,2,6,0 -10671,0,4,0,0,1,5,12,5,1,0,13,19,4,9,0 -10672,2,2,0,7,0,3,0,1,4,0,4,15,2,31,0 -10673,4,2,0,8,6,3,12,3,1,0,12,20,1,2,0 -10674,4,7,0,4,1,6,1,0,0,0,2,0,4,41,0 -10675,4,6,0,4,1,4,3,5,2,1,3,17,1,0,1 -10676,7,2,0,10,3,6,3,5,4,1,18,13,0,38,1 -10677,9,8,0,10,2,0,3,0,4,1,4,11,5,26,0 -10678,0,0,0,5,6,0,8,2,1,0,13,13,2,31,0 -10679,2,4,0,11,1,0,1,2,1,0,6,13,5,16,0 -10680,9,8,0,9,0,0,6,3,3,0,10,12,0,34,0 -10681,7,8,0,15,4,1,1,0,4,0,11,10,4,9,1 -10682,0,4,0,13,0,4,7,0,2,0,13,6,1,2,0 -10683,4,3,0,3,1,1,10,5,2,1,14,8,4,19,1 -10684,9,5,0,9,3,0,11,4,4,0,7,10,4,8,1 
-10685,4,2,0,15,2,3,9,1,1,1,13,3,0,3,0 -10686,3,7,0,9,6,3,5,2,4,1,2,6,1,33,0 -10687,1,4,0,14,0,4,2,0,4,0,17,17,5,0,0 -10688,3,0,0,14,1,4,13,2,2,0,16,11,0,37,0 -10689,8,8,0,14,1,1,3,0,2,1,1,7,2,40,1 -10690,2,1,0,12,6,2,11,1,4,0,6,14,2,24,1 -10691,0,6,0,5,1,2,5,3,3,1,2,2,5,31,0 -10692,9,6,0,8,1,5,5,4,2,0,8,20,2,20,0 -10693,9,0,0,4,6,3,10,0,2,1,17,0,1,16,0 -10694,0,0,0,12,4,5,6,3,4,1,11,13,5,36,0 -10695,9,4,0,15,2,4,5,4,1,0,6,20,4,33,0 -10696,6,1,0,9,0,0,5,5,4,0,7,7,2,12,1 -10697,4,7,0,13,5,6,5,3,1,0,6,4,3,37,0 -10698,7,6,0,4,6,3,3,3,0,1,4,12,3,41,0 -10699,9,7,0,11,6,5,7,1,0,1,2,2,2,21,0 -10700,3,1,0,1,0,4,0,4,1,1,11,0,5,14,0 -10701,9,6,0,14,5,1,1,0,2,1,4,15,0,18,0 -10702,2,3,0,0,4,6,4,4,0,1,4,6,0,41,0 -10703,10,4,0,15,3,0,0,1,1,1,8,6,3,8,0 -10704,9,6,0,8,2,0,0,2,0,0,8,2,0,1,0 -10705,8,4,0,8,2,3,8,4,4,0,10,11,3,29,0 -10706,4,7,0,0,4,1,2,3,3,0,6,3,3,7,0 -10707,7,8,0,11,0,4,8,4,2,0,8,2,1,30,0 -10708,6,2,0,7,4,4,12,2,4,1,18,5,2,13,1 -10709,2,1,0,0,0,4,14,1,1,0,4,11,1,32,0 -10710,0,6,0,5,0,4,1,3,3,1,13,0,2,11,0 -10711,0,8,0,15,6,5,9,4,0,1,2,2,2,29,0 -10712,5,7,0,2,0,1,7,2,0,0,10,12,4,30,0 -10713,3,3,0,14,2,2,9,1,2,1,13,4,0,8,0 -10714,4,8,0,3,0,4,10,4,1,0,15,15,5,11,0 -10715,6,7,0,14,5,0,12,0,4,1,9,19,3,38,1 -10716,8,7,0,9,3,4,0,3,1,1,8,9,5,32,0 -10717,3,1,0,0,2,6,11,1,1,0,2,6,1,22,0 -10718,3,4,0,11,2,6,5,2,0,0,4,15,1,19,0 -10719,2,2,0,1,0,4,9,4,1,0,2,6,5,7,0 -10720,10,2,0,11,5,3,3,3,4,1,2,2,4,17,0 -10721,0,3,0,0,6,5,5,2,2,0,13,15,1,1,0 -10722,2,2,0,14,2,2,3,5,1,0,5,1,5,20,1 -10723,2,7,0,13,4,5,2,1,4,0,10,13,0,9,0 -10724,10,5,0,5,4,0,12,2,2,1,16,17,1,17,1 -10725,5,3,0,5,5,6,9,4,3,0,8,8,1,8,0 -10726,6,3,0,10,0,3,5,3,2,1,13,13,0,16,0 -10727,2,4,0,11,4,4,5,5,1,0,14,11,4,40,0 -10728,6,3,0,2,1,2,10,2,0,1,15,5,1,8,1 -10729,4,3,0,14,2,1,11,2,4,0,12,7,2,9,1 -10730,9,7,0,6,4,5,11,0,4,1,4,15,1,41,0 -10731,8,0,0,4,1,2,14,3,0,0,18,10,5,20,1 -10732,5,1,0,9,6,3,9,0,0,0,9,19,5,6,1 -10733,3,1,0,11,5,6,3,4,0,0,3,8,0,13,1 -10734,2,3,0,4,6,2,5,2,4,0,0,2,0,3,0 
-10735,0,8,0,0,6,6,0,0,1,1,16,10,1,21,1 -10736,1,4,0,2,5,0,7,1,4,0,10,2,1,3,0 -10737,2,3,0,10,2,6,11,0,1,0,2,19,2,16,0 -10738,0,8,0,11,1,0,3,3,0,0,3,11,3,31,0 -10739,3,1,0,14,0,5,9,0,3,0,4,11,1,11,0 -10740,8,7,0,2,4,4,3,3,0,0,2,2,3,31,0 -10741,5,8,0,10,0,2,12,2,4,0,15,7,2,11,1 -10742,6,8,0,1,3,4,5,1,0,0,11,11,1,29,0 -10743,9,8,0,10,2,0,7,5,0,1,16,12,4,14,1 -10744,2,7,0,4,1,2,7,0,0,0,5,11,0,6,0 -10745,8,2,0,7,6,1,7,4,0,0,16,8,4,38,1 -10746,9,6,0,10,5,1,6,5,0,1,10,8,0,18,1 -10747,2,4,0,6,5,3,8,1,2,0,8,2,1,11,0 -10748,4,7,0,1,1,1,5,5,4,1,10,20,5,29,0 -10749,2,5,0,9,0,4,6,4,0,1,13,16,5,10,0 -10750,1,5,0,7,1,0,3,4,2,1,5,11,3,7,0 -10751,1,1,0,2,6,0,4,4,1,1,8,2,1,16,0 -10752,7,2,0,10,1,3,0,5,4,1,5,1,4,37,1 -10753,2,6,0,8,2,5,14,1,1,1,6,13,1,22,0 -10754,4,3,0,15,1,1,0,3,3,1,4,2,0,0,0 -10755,9,2,0,2,2,6,5,1,2,0,18,8,4,30,1 -10756,7,8,0,14,4,3,13,1,3,1,9,5,3,29,1 -10757,10,4,0,4,5,0,9,1,0,0,2,2,4,22,0 -10758,6,1,0,7,0,3,8,5,4,1,2,14,5,3,0 -10759,1,2,0,2,2,5,11,0,1,1,13,2,1,36,0 -10760,5,2,0,7,4,1,9,3,0,0,17,2,0,29,0 -10761,5,0,0,8,0,5,12,1,0,1,10,13,4,6,0 -10762,2,5,0,2,2,3,5,2,1,1,8,16,3,9,0 -10763,0,1,0,15,0,5,0,4,3,1,15,6,4,9,0 -10764,0,0,0,8,5,5,9,4,1,0,0,11,1,2,0 -10765,10,0,0,13,4,2,14,5,0,1,11,1,2,22,1 -10766,0,8,0,13,2,5,6,4,2,0,11,2,1,18,0 -10767,8,3,0,14,3,5,11,3,3,1,5,7,0,39,1 -10768,5,5,0,14,3,2,10,5,3,1,16,7,1,4,1 -10769,10,4,0,3,0,0,0,2,1,0,15,18,5,27,0 -10770,7,8,0,13,0,0,9,1,3,0,4,11,1,31,0 -10771,9,4,0,13,1,4,8,0,0,0,1,2,2,7,0 -10772,6,1,0,12,4,2,5,4,3,1,3,7,0,2,1 -10773,0,8,0,14,0,3,9,1,0,0,17,2,2,19,0 -10774,0,4,0,12,3,1,8,4,0,1,8,6,1,4,0 -10775,10,0,0,11,6,6,5,0,4,0,6,15,0,6,0 -10776,6,8,0,6,5,4,12,3,4,0,17,19,3,31,1 -10777,9,5,0,9,1,3,11,4,3,0,12,2,0,41,0 -10778,0,3,0,5,5,6,0,4,4,0,12,2,0,16,0 -10779,0,4,0,10,0,6,12,4,0,0,18,11,5,21,0 -10780,2,8,0,9,4,1,5,3,3,1,12,7,3,25,1 -10781,1,4,0,12,1,4,8,1,4,0,13,2,0,26,0 -10782,3,0,0,9,2,3,5,2,0,1,0,2,4,12,0 -10783,2,1,0,5,2,4,6,4,1,0,8,11,2,25,0 -10784,0,7,0,10,6,4,5,1,1,0,13,4,1,23,0 
-10785,0,1,0,5,4,2,5,1,0,1,3,12,4,28,0 -10786,10,0,0,13,0,3,4,2,1,0,8,11,1,32,0 -10787,0,6,0,8,4,5,2,3,0,0,0,11,4,38,0 -10788,4,6,0,13,0,6,5,5,0,0,8,2,1,7,0 -10789,0,7,0,11,3,3,5,3,3,0,11,2,3,16,0 -10790,1,8,0,5,0,4,9,5,2,1,4,11,0,27,0 -10791,10,6,0,13,2,4,14,4,0,1,14,9,0,35,0 -10792,1,8,0,11,5,0,14,3,2,0,8,0,1,0,0 -10793,1,5,0,11,3,4,7,4,0,1,4,2,0,26,0 -10794,3,6,0,14,6,4,4,1,1,1,8,18,3,38,0 -10795,3,0,0,11,4,0,14,5,3,1,16,10,3,12,1 -10796,1,8,0,6,3,4,10,1,0,1,18,14,4,5,1 -10797,3,7,0,12,2,1,10,2,0,1,17,2,5,30,0 -10798,3,6,0,1,4,5,3,1,2,1,9,10,4,37,1 -10799,9,1,0,2,0,0,14,3,2,1,0,8,1,11,0 -10800,5,1,0,3,0,5,4,2,3,1,2,2,4,36,0 -10801,8,1,0,12,2,4,4,2,0,0,10,9,0,24,0 -10802,10,4,0,10,4,5,2,0,2,0,1,5,3,32,1 -10803,2,3,0,10,0,2,14,0,3,1,18,8,1,26,1 -10804,6,6,0,6,2,0,8,3,3,0,0,4,5,28,0 -10805,5,1,0,3,2,5,2,0,1,1,9,10,2,8,1 -10806,7,5,0,0,2,3,1,3,0,1,13,10,0,8,0 -10807,1,7,0,13,1,2,0,5,4,0,18,6,4,30,1 -10808,3,7,0,12,5,6,8,2,2,0,8,6,4,14,0 -10809,1,5,0,8,1,1,0,3,3,0,7,4,3,17,0 -10810,2,2,0,3,0,2,14,1,3,1,4,0,2,10,0 -10811,8,6,0,6,4,4,7,0,2,1,2,0,1,13,0 -10812,0,4,0,3,5,1,1,3,0,0,4,2,1,6,0 -10813,3,2,0,11,1,0,11,0,0,1,10,2,3,19,0 -10814,2,6,0,14,3,5,0,3,2,1,13,9,2,7,0 -10815,1,3,0,0,3,0,3,4,3,1,2,4,5,2,0 -10816,1,4,0,4,4,6,8,1,0,1,2,11,0,5,0 -10817,7,5,0,12,1,3,10,1,0,1,10,12,4,8,0 -10818,1,5,0,6,0,2,7,1,1,1,3,18,5,8,0 -10819,6,2,0,9,4,3,11,4,4,1,9,19,5,0,1 -10820,10,7,0,0,1,2,12,0,4,0,1,11,3,29,0 -10821,4,8,0,8,4,3,5,4,1,0,4,18,0,16,0 -10822,1,6,0,15,1,4,9,1,0,1,15,9,1,0,0 -10823,7,1,0,7,2,4,8,3,0,0,0,4,0,11,0 -10824,0,8,0,13,6,4,0,4,3,0,5,9,0,6,0 -10825,0,7,0,4,0,0,12,1,1,0,13,8,3,22,0 -10826,0,7,0,15,5,2,14,2,3,0,15,15,0,11,0 -10827,0,5,0,9,3,0,0,4,2,1,7,17,4,7,1 -10828,8,8,0,13,1,4,4,4,1,0,8,11,5,31,0 -10829,0,3,0,5,0,5,8,1,3,1,13,6,1,20,0 -10830,1,1,0,11,5,6,7,5,4,1,14,14,5,1,1 -10831,0,8,0,14,6,4,9,0,0,0,9,7,5,37,1 -10832,6,5,0,2,6,3,4,1,2,1,14,14,2,30,1 -10833,3,5,0,1,1,0,3,4,2,0,16,15,1,16,0 -10834,7,4,0,3,3,0,8,4,1,1,2,11,0,22,0 
-10835,7,3,0,3,1,6,11,4,2,0,8,15,1,36,0 -10836,1,4,0,12,2,5,1,4,0,0,0,2,1,3,0 -10837,3,1,0,1,3,5,13,5,4,1,13,11,1,11,0 -10838,9,6,0,2,5,0,8,4,0,1,2,15,1,36,0 -10839,8,0,0,8,0,1,10,0,3,1,3,17,4,21,1 -10840,2,3,0,3,1,3,9,2,1,0,0,2,2,23,0 -10841,0,0,0,1,0,1,0,0,4,0,2,12,0,40,0 -10842,0,3,0,12,4,1,12,2,2,1,15,2,5,38,0 -10843,0,6,0,11,0,4,9,4,2,1,13,15,4,30,0 -10844,6,6,0,6,2,0,12,2,3,1,6,15,0,36,0 -10845,9,6,0,2,3,1,0,4,1,0,8,11,0,8,0 -10846,5,7,0,15,5,2,14,0,0,0,14,6,2,7,0 -10847,4,7,0,1,6,5,13,3,4,0,18,7,1,9,1 -10848,6,0,0,14,1,3,7,3,3,0,4,11,3,28,0 -10849,0,2,0,3,5,3,14,3,3,1,9,9,5,9,0 -10850,0,1,0,5,2,4,9,1,4,0,10,2,1,5,0 -10851,8,4,0,11,4,4,9,0,1,1,13,2,0,11,0 -10852,5,4,0,3,0,3,12,5,1,1,2,15,3,22,0 -10853,1,5,0,1,5,0,10,3,4,0,10,11,3,30,0 -10854,2,6,0,15,2,6,6,4,3,0,17,20,2,23,0 -10855,1,8,0,0,1,5,3,2,4,1,2,11,0,24,0 -10856,2,2,0,1,0,1,9,3,1,1,8,13,0,22,0 -10857,9,4,0,3,5,0,1,1,4,1,13,11,2,26,0 -10858,0,6,0,7,6,0,8,1,0,1,8,2,5,9,0 -10859,10,5,0,9,6,4,0,5,1,0,13,9,2,10,0 -10860,3,8,0,7,1,2,5,3,0,0,17,9,4,24,0 -10861,1,7,0,6,0,5,10,1,3,1,2,12,0,18,0 -10862,0,5,0,0,1,0,7,2,0,1,15,0,0,31,0 -10863,0,2,0,9,5,3,4,4,0,1,8,11,2,31,0 -10864,4,5,0,15,6,1,13,5,2,1,11,13,3,18,1 -10865,6,4,0,5,0,2,14,1,3,1,0,11,3,35,0 -10866,7,6,0,14,2,0,8,5,3,1,16,18,4,8,0 -10867,0,0,0,2,3,3,11,2,1,1,6,9,0,6,0 -10868,7,2,0,0,2,4,2,3,1,0,5,19,1,37,0 -10869,5,8,0,5,5,0,14,3,0,1,8,0,1,24,0 -10870,7,8,0,2,4,4,1,3,0,1,6,16,4,26,0 -10871,8,8,0,3,6,5,8,4,3,0,13,11,5,10,0 -10872,0,3,0,13,1,1,14,3,2,0,2,3,0,10,0 -10873,2,5,0,6,0,6,5,3,0,1,13,15,1,29,0 -10874,2,2,0,15,3,4,5,3,1,1,12,3,2,21,0 -10875,6,2,0,12,4,2,5,1,2,1,11,10,5,5,1 -10876,10,2,0,13,6,3,14,4,0,1,13,5,3,22,0 -10877,1,2,0,5,6,2,2,0,4,0,2,2,3,3,0 -10878,2,7,0,8,1,4,6,1,0,0,16,11,4,37,0 -10879,1,1,0,8,0,0,1,1,3,0,13,9,2,16,0 -10880,4,7,0,4,0,4,10,4,1,1,8,2,2,5,0 -10881,5,0,0,0,6,4,10,2,2,1,12,2,3,19,0 -10882,9,7,0,14,3,0,10,2,3,0,16,14,1,23,1 -10883,3,0,0,10,6,4,2,2,0,0,2,4,3,24,0 -10884,0,4,0,11,0,4,5,1,1,1,13,17,0,3,0 
-10885,9,5,0,10,2,0,12,3,2,1,4,14,2,30,0 -10886,9,7,0,3,2,2,13,1,1,1,13,9,0,30,0 -10887,6,5,0,15,2,6,9,2,3,1,18,13,5,8,1 -10888,1,2,0,4,6,5,8,4,2,1,17,18,0,26,0 -10889,0,3,0,7,6,5,1,2,1,1,10,11,1,21,0 -10890,5,4,0,4,1,0,2,4,1,1,13,11,2,14,0 -10891,10,5,0,6,3,2,6,4,4,0,9,10,5,20,1 -10892,0,5,0,13,4,4,11,2,1,1,4,2,2,28,0 -10893,2,8,0,11,6,6,0,3,1,0,17,16,3,33,0 -10894,2,1,0,6,6,3,5,0,1,0,17,2,5,32,0 -10895,2,5,0,10,4,0,11,1,0,0,11,16,0,31,0 -10896,3,1,0,3,1,5,9,2,2,1,2,0,3,26,0 -10897,0,3,0,13,0,1,5,0,2,1,2,11,0,5,0 -10898,9,6,0,1,6,2,6,4,0,1,14,3,1,27,0 -10899,1,3,0,1,0,4,8,3,3,1,1,0,5,19,0 -10900,0,4,0,1,2,5,7,4,2,0,0,15,0,40,0 -10901,0,0,0,10,3,6,13,0,2,1,13,11,1,38,0 -10902,9,7,0,15,1,0,1,5,1,1,13,2,5,23,0 -10903,4,6,0,4,2,4,3,3,2,1,3,15,0,8,1 -10904,9,1,0,14,1,2,9,1,4,1,15,11,4,26,0 -10905,2,2,0,4,3,5,7,5,0,1,6,11,5,0,0 -10906,1,1,0,14,6,0,6,4,2,0,10,11,1,0,0 -10907,0,5,0,6,0,6,12,0,4,1,12,6,2,11,0 -10908,10,0,0,6,5,0,2,4,4,1,2,9,0,24,0 -10909,4,2,0,9,6,0,6,4,2,1,8,2,3,27,0 -10910,10,7,0,7,3,1,8,5,2,0,5,17,3,31,1 -10911,5,6,0,11,1,6,0,2,0,0,4,20,3,19,0 -10912,2,7,0,5,1,0,7,3,2,1,14,11,0,26,0 -10913,4,6,0,6,0,3,11,0,3,0,13,8,1,3,0 -10914,4,3,0,12,2,2,2,2,2,1,2,10,2,35,1 -10915,6,4,0,10,6,1,10,2,1,1,7,5,4,30,1 -10916,5,5,0,4,6,1,9,1,4,0,6,1,3,40,0 -10917,0,3,0,6,2,3,13,2,2,0,8,11,1,38,0 -10918,4,8,0,13,3,2,8,3,1,1,13,17,3,37,0 -10919,1,0,0,9,2,2,1,1,4,0,8,3,3,16,0 -10920,2,2,0,14,2,5,2,2,4,0,10,2,2,4,0 -10921,0,2,0,6,0,6,1,0,4,1,8,4,0,31,0 -10922,2,6,0,15,3,3,8,2,0,1,13,2,0,1,0 -10923,8,2,0,5,1,4,3,4,3,1,15,9,3,6,0 -10924,7,4,0,5,2,6,6,3,2,1,6,14,5,41,0 -10925,2,4,0,10,3,1,10,4,2,1,1,4,1,21,0 -10926,8,3,0,5,2,6,10,4,0,1,13,6,0,0,0 -10927,4,5,0,2,3,0,2,1,4,1,18,3,3,17,1 -10928,3,3,0,3,0,0,4,2,4,0,13,9,0,21,0 -10929,10,1,0,7,5,3,14,2,3,0,10,0,4,26,0 -10930,0,0,0,4,1,4,14,4,2,1,17,0,0,38,0 -10931,10,6,0,11,4,4,7,1,4,1,2,8,4,0,0 -10932,0,7,0,10,6,6,11,1,4,0,0,5,5,27,0 -10933,0,8,0,13,3,4,12,5,4,0,8,6,4,21,0 -10934,10,1,0,10,1,3,2,1,2,0,14,17,3,34,1 
-10935,0,2,0,3,1,1,12,0,0,1,2,16,0,28,0 -10936,6,8,0,12,1,1,14,0,0,1,1,1,1,10,1 -10937,9,7,0,3,1,6,12,3,1,0,11,15,2,13,0 -10938,9,8,0,7,2,0,2,3,1,1,9,10,2,11,1 -10939,7,4,0,3,2,1,11,1,3,0,4,6,4,7,0 -10940,4,1,0,2,2,5,6,1,0,0,13,0,4,3,0 -10941,7,0,0,10,0,1,10,0,1,1,9,9,5,3,1 -10942,10,8,0,4,5,0,4,3,4,0,7,1,5,11,0 -10943,0,7,0,0,2,0,11,1,2,0,13,2,2,31,0 -10944,3,1,0,4,6,3,9,3,2,0,2,2,3,39,0 -10945,1,6,0,10,2,0,10,0,2,1,2,13,1,22,0 -10946,1,3,0,3,6,4,1,0,2,0,6,20,5,27,0 -10947,3,4,0,2,6,0,0,4,0,0,16,6,0,2,0 -10948,0,7,0,3,5,1,8,4,3,1,13,2,5,12,0 -10949,4,4,0,3,0,1,9,4,3,0,14,18,1,24,0 -10950,5,7,0,13,2,5,9,3,3,0,0,15,0,14,0 -10951,10,7,0,13,6,4,7,4,2,0,8,18,5,26,0 -10952,9,3,0,3,3,1,3,5,2,1,3,7,1,36,1 -10953,1,6,0,14,0,3,8,3,0,0,14,11,2,13,0 -10954,8,4,0,10,6,6,11,3,0,0,18,10,2,23,1 -10955,1,7,0,7,0,5,3,3,2,1,14,13,2,27,0 -10956,3,0,0,2,2,0,1,3,0,0,3,19,4,5,1 -10957,10,6,0,6,4,2,3,0,0,1,2,15,2,20,0 -10958,9,6,0,11,4,0,0,2,4,1,8,11,0,31,0 -10959,10,5,0,8,0,1,12,3,0,1,13,11,0,13,0 -10960,3,1,0,13,4,6,14,2,3,0,8,11,0,25,0 -10961,8,8,0,3,2,5,9,1,0,0,7,0,4,4,0 -10962,2,6,0,9,5,3,4,2,0,1,7,13,5,39,0 -10963,8,3,0,2,2,4,5,1,2,0,5,11,0,34,0 -10964,6,1,0,6,6,1,14,5,0,0,1,17,5,39,1 -10965,3,7,0,15,6,0,2,3,3,1,8,0,1,7,0 -10966,5,2,0,6,6,4,9,3,2,0,2,2,4,31,0 -10967,5,5,0,11,2,5,3,5,1,1,1,8,5,11,1 -10968,6,6,0,8,4,5,9,4,4,1,4,6,0,41,0 -10969,2,2,0,6,3,2,13,5,0,1,14,0,5,40,1 -10970,0,2,0,13,6,5,13,3,1,0,7,10,3,9,0 -10971,0,0,0,5,3,0,12,3,2,0,4,2,4,20,0 -10972,10,7,0,6,1,3,11,3,3,0,5,9,5,8,0 -10973,9,2,0,8,2,5,10,0,4,1,14,19,3,27,1 -10974,2,0,0,15,1,1,8,4,3,0,2,11,5,36,0 -10975,2,8,0,9,3,5,14,3,0,0,2,19,0,12,0 -10976,8,4,0,11,0,0,12,1,0,0,8,4,5,17,0 -10977,0,1,0,6,5,0,11,4,0,0,6,13,0,5,0 -10978,3,1,0,12,6,3,9,3,0,1,2,13,4,7,0 -10979,9,2,0,13,4,4,7,2,1,0,13,3,5,0,0 -10980,4,1,0,9,4,6,13,1,2,1,16,17,2,21,1 -10981,9,4,0,2,5,0,2,5,1,0,13,11,0,35,0 -10982,7,5,0,8,6,0,10,3,0,0,18,16,2,39,1 -10983,6,5,0,11,6,2,11,3,0,1,13,13,2,21,0 -10984,4,4,0,6,0,0,3,4,0,0,16,5,5,17,0 
-10985,10,6,0,14,6,5,0,4,1,0,2,18,3,21,0 -10986,4,7,0,5,6,4,7,1,1,0,13,2,0,22,0 -10987,1,6,0,2,6,2,6,1,4,1,12,20,1,8,0 -10988,3,1,0,3,3,4,3,4,0,1,8,2,5,5,0 -10989,0,3,0,6,0,2,0,3,1,1,13,18,5,3,0 -10990,9,1,0,10,0,3,0,2,0,0,6,2,3,16,0 -10991,0,0,0,15,1,1,7,1,1,1,17,18,3,35,0 -10992,7,4,0,11,6,4,6,2,1,0,12,6,0,1,0 -10993,2,8,0,7,0,6,6,3,1,0,10,4,4,7,0 -10994,2,3,0,6,3,5,4,5,0,0,17,18,1,40,0 -10995,1,5,0,1,1,0,0,2,1,1,13,14,1,38,0 -10996,3,6,0,12,0,0,4,3,1,0,0,6,2,3,0 -10997,5,0,0,9,3,4,5,5,4,0,14,7,0,16,1 -10998,10,1,0,3,6,3,14,5,0,0,13,3,3,16,0 -10999,6,6,0,4,6,1,10,2,3,1,14,10,5,41,1 -11000,3,5,0,12,5,5,5,3,1,0,8,2,4,0,0 -11001,3,0,0,10,6,5,5,4,4,0,9,1,4,3,1 -11002,0,8,0,0,1,3,3,2,4,0,14,2,3,22,0 -11003,6,1,0,14,1,5,12,5,0,0,6,7,4,35,1 -11004,3,4,0,6,6,0,13,4,1,1,17,2,3,16,0 -11005,4,4,0,15,2,6,8,3,3,0,3,19,3,12,1 -11006,6,8,0,11,4,5,8,0,3,1,13,15,0,35,0 -11007,0,7,0,15,4,2,9,5,1,0,18,1,1,16,1 -11008,1,2,0,4,6,2,14,3,1,1,13,9,0,40,0 -11009,8,4,0,5,1,0,9,3,3,1,13,15,4,16,0 -11010,5,2,0,10,5,3,2,2,2,0,11,2,2,20,0 -11011,3,7,0,13,0,0,0,0,2,1,2,2,4,22,0 -11012,8,8,0,4,5,5,9,1,4,1,15,13,0,5,0 -11013,3,4,0,8,4,4,2,0,1,0,13,7,0,37,0 -11014,5,6,0,13,0,6,6,3,2,1,11,3,4,21,0 -11015,1,1,0,12,1,0,1,0,2,0,0,9,1,1,0 -11016,0,0,0,2,6,1,9,4,0,1,17,15,4,2,0 -11017,1,3,0,5,3,2,14,3,1,1,13,2,5,8,0 -11018,8,0,0,5,2,5,1,1,0,1,9,12,4,32,1 -11019,5,3,0,15,1,0,0,5,1,0,2,18,1,16,0 -11020,7,0,0,8,2,4,0,3,2,0,2,2,0,4,0 -11021,9,7,0,7,2,0,1,4,2,0,12,12,4,1,1 -11022,9,0,0,1,0,4,12,0,0,1,6,4,2,18,0 -11023,8,6,0,13,4,3,0,0,4,0,11,11,1,29,0 -11024,6,1,0,13,6,1,8,4,1,1,5,5,2,21,1 -11025,5,5,0,12,4,1,0,4,1,0,9,14,4,5,1 -11026,9,4,0,12,5,3,6,4,4,1,9,7,4,9,1 -11027,1,7,0,15,2,0,14,5,1,1,11,2,5,24,0 -11028,6,0,0,3,1,0,8,2,2,0,6,11,5,16,0 -11029,3,5,0,14,4,2,0,0,4,1,5,8,2,25,1 -11030,3,7,0,5,5,6,9,2,4,1,6,4,5,14,0 -11031,3,0,0,13,4,2,14,4,3,0,2,11,4,13,0 -11032,0,7,0,7,3,3,14,4,2,0,13,8,4,10,0 -11033,0,7,0,7,6,4,2,5,3,1,10,15,4,22,0 -11034,4,6,0,10,6,4,3,4,2,1,2,2,4,35,0 
-11035,3,8,0,13,1,0,5,4,3,0,6,0,5,20,0 -11036,2,8,0,13,5,0,2,4,2,0,4,0,2,24,0 -11037,9,0,0,5,5,4,9,5,3,1,0,17,5,25,0 -11038,2,2,0,5,1,5,13,2,1,1,15,20,1,26,0 -11039,6,2,0,13,0,0,14,2,0,0,13,13,1,40,0 -11040,3,4,0,11,5,0,1,0,3,0,13,0,1,35,0 -11041,1,1,0,13,6,5,8,1,3,1,8,13,2,18,0 -11042,6,7,0,12,3,1,3,5,4,1,14,7,3,16,1 -11043,6,8,0,9,1,4,0,2,2,0,2,2,4,41,0 -11044,0,6,0,3,0,4,3,3,4,1,8,11,2,23,0 -11045,1,6,0,13,3,4,0,4,0,0,13,2,5,34,0 -11046,3,3,0,8,5,4,13,4,4,1,6,20,4,7,0 -11047,1,0,0,6,4,2,5,2,0,0,13,18,1,40,0 -11048,5,0,0,5,0,5,0,2,2,0,2,17,0,13,0 -11049,8,4,0,2,5,3,5,4,2,0,13,20,0,2,0 -11050,3,4,0,12,2,0,5,5,2,0,14,1,4,41,1 -11051,3,2,0,6,0,0,1,3,3,1,8,6,4,5,0 -11052,4,2,0,9,0,5,12,1,0,1,6,7,3,34,0 -11053,5,1,0,11,5,5,1,5,4,0,13,6,0,6,0 -11054,10,0,0,7,4,6,1,5,4,0,2,16,5,7,0 -11055,4,0,0,3,1,1,10,5,0,1,5,14,2,31,1 -11056,9,6,0,0,4,5,8,3,0,1,18,19,1,19,1 -11057,4,5,0,11,2,1,12,3,0,1,5,14,5,21,1 -11058,9,8,0,5,0,3,11,3,0,1,6,2,5,29,0 -11059,7,3,0,15,6,4,6,3,4,1,17,2,1,13,0 -11060,2,2,0,0,6,6,12,2,0,0,11,20,0,21,0 -11061,7,2,0,13,1,5,4,0,1,1,16,20,5,28,1 -11062,0,7,0,12,4,6,0,4,4,0,2,20,3,35,0 -11063,3,0,0,14,2,2,13,0,4,0,2,4,2,23,0 -11064,0,8,0,5,1,5,11,1,2,0,13,17,4,3,0 -11065,5,6,0,5,0,4,6,2,2,0,10,16,3,23,0 -11066,0,6,0,5,6,0,1,1,3,0,6,9,1,39,0 -11067,5,7,0,7,2,5,3,4,3,1,4,18,3,33,0 -11068,2,7,0,15,5,6,13,2,1,1,4,11,3,6,0 -11069,2,6,0,1,4,3,8,2,0,0,15,2,2,27,0 -11070,9,1,0,15,5,5,12,3,0,1,8,0,4,14,0 -11071,1,8,0,11,0,5,5,1,1,1,1,11,4,16,0 -11072,3,2,0,7,1,5,6,0,3,0,15,2,0,24,0 -11073,1,3,0,1,6,4,9,3,0,0,13,2,1,34,0 -11074,3,7,0,2,4,1,1,0,3,0,14,1,2,37,1 -11075,3,7,0,0,6,6,2,5,3,1,4,7,1,1,1 -11076,10,1,0,3,2,4,7,2,1,0,0,15,3,10,0 -11077,7,1,0,1,0,5,2,1,1,0,15,11,1,40,0 -11078,2,7,0,3,0,5,2,1,4,0,8,11,3,8,0 -11079,2,5,0,12,4,6,6,1,3,1,13,11,3,18,0 -11080,6,5,0,4,0,5,13,4,0,1,8,11,4,37,0 -11081,10,4,0,10,1,1,9,5,1,1,3,19,0,13,1 -11082,8,8,0,2,4,0,12,0,0,1,5,18,2,1,1 -11083,5,0,0,0,1,4,5,0,0,1,11,9,0,2,0 -11084,9,0,0,8,1,4,8,1,1,1,2,6,2,40,0 
-11085,6,2,0,12,2,4,5,3,0,1,13,9,3,38,0 -11086,9,1,0,1,4,5,5,4,2,0,8,6,2,25,0 -11087,4,8,0,0,3,4,11,2,1,0,14,6,1,17,0 -11088,10,0,0,15,2,6,12,0,4,0,2,2,0,33,0 -11089,7,6,0,1,4,6,13,5,1,1,5,8,4,34,1 -11090,3,4,0,8,0,0,10,5,0,0,13,11,2,27,0 -11091,3,6,0,9,5,1,14,1,1,0,4,2,1,21,0 -11092,0,5,0,11,3,0,13,1,0,0,4,11,0,23,0 -11093,1,3,0,1,0,6,13,3,0,1,13,10,5,25,0 -11094,2,6,0,3,4,0,5,3,0,0,13,2,0,29,0 -11095,0,1,0,8,6,0,9,1,2,1,15,2,0,0,0 -11096,9,0,0,7,2,3,9,2,3,1,0,4,3,25,0 -11097,8,6,0,10,1,4,6,1,2,1,3,2,1,0,0 -11098,2,1,0,11,6,5,6,1,3,1,8,8,0,40,0 -11099,1,1,0,8,1,4,5,4,2,0,10,9,1,33,0 -11100,1,6,0,11,3,1,2,0,4,1,18,7,4,14,1 -11101,6,4,0,11,2,5,6,3,1,1,6,15,3,32,0 -11102,0,4,0,12,6,0,8,2,4,1,13,9,5,35,0 -11103,9,2,0,5,4,2,8,2,1,1,13,15,0,28,0 -11104,0,6,0,12,0,5,9,2,3,0,13,14,1,12,0 -11105,3,3,0,6,6,4,14,2,3,0,17,2,4,20,0 -11106,1,6,0,14,5,3,6,5,0,0,13,2,5,8,0 -11107,9,2,0,6,6,6,11,0,0,0,13,15,1,12,0 -11108,0,2,0,6,6,4,11,2,3,0,4,0,0,16,0 -11109,8,4,0,11,6,1,14,4,4,1,2,11,3,5,0 -11110,9,1,0,0,5,5,12,1,2,1,4,2,3,31,0 -11111,4,0,0,9,1,4,0,3,2,0,15,15,1,29,0 -11112,9,3,0,1,3,6,14,5,0,1,1,8,3,25,1 -11113,7,8,0,6,6,2,5,1,2,0,2,0,1,35,0 -11114,4,0,0,12,0,5,3,1,0,1,8,9,5,38,0 -11115,10,7,0,13,0,6,10,0,4,1,3,5,2,17,1 -11116,0,2,0,7,6,0,9,4,2,1,8,2,3,37,0 -11117,7,4,0,9,4,1,6,5,2,0,9,5,5,8,1 -11118,1,7,0,3,6,6,8,4,1,0,2,4,5,0,0 -11119,0,7,0,5,4,6,7,3,0,1,5,2,0,39,0 -11120,9,7,0,4,0,3,1,4,1,1,18,9,5,38,1 -11121,0,6,0,4,0,4,14,1,4,0,11,2,5,36,0 -11122,4,2,0,3,5,3,14,5,3,0,2,14,2,0,0 -11123,10,8,0,12,1,4,0,2,4,0,2,18,2,16,0 -11124,8,4,0,0,1,0,8,2,0,0,11,16,4,35,0 -11125,9,8,0,6,6,4,11,5,3,1,2,2,1,22,0 -11126,4,3,0,8,1,6,9,1,2,1,6,2,2,30,0 -11127,1,0,0,3,1,3,14,3,3,1,12,13,0,25,0 -11128,3,0,0,1,4,5,2,3,4,0,6,18,5,38,0 -11129,3,1,0,8,4,3,6,4,3,1,9,7,3,20,1 -11130,3,5,0,8,4,5,8,2,0,0,13,3,1,18,0 -11131,0,4,0,7,4,6,9,5,3,1,6,6,1,32,0 -11132,1,1,0,6,4,3,4,5,1,0,1,10,1,41,1 -11133,7,7,0,3,0,0,6,4,0,0,2,11,1,41,0 -11134,0,8,0,3,0,4,6,4,4,1,8,2,3,25,0 
-11135,5,8,0,8,5,6,8,3,4,0,15,15,4,3,0 -11136,0,4,0,10,6,2,3,2,4,1,5,1,3,38,1 -11137,3,8,0,3,3,2,12,4,3,0,10,19,0,35,0 -11138,5,1,0,5,1,4,8,1,2,0,13,11,2,31,0 -11139,2,6,0,4,2,4,9,4,0,0,8,11,2,39,0 -11140,9,6,0,1,2,1,7,2,1,1,2,2,0,19,0 -11141,3,4,0,2,1,4,2,3,0,0,8,4,3,14,0 -11142,6,7,0,2,1,6,12,4,2,0,13,6,0,12,0 -11143,2,3,0,11,0,6,5,2,2,0,17,0,3,9,0 -11144,3,5,0,5,3,2,8,4,4,0,15,11,1,21,0 -11145,1,5,0,3,5,4,0,0,2,0,4,20,3,5,0 -11146,5,3,0,3,5,4,2,3,0,1,17,2,1,9,0 -11147,6,8,0,14,4,3,13,4,0,1,13,15,0,10,0 -11148,10,3,0,13,0,4,13,2,2,1,13,11,5,27,0 -11149,3,0,0,0,3,6,2,1,1,0,13,6,0,30,0 -11150,9,8,0,4,1,2,2,5,4,0,18,10,3,29,1 -11151,8,6,0,11,6,6,6,3,2,1,14,2,0,40,0 -11152,0,8,0,13,5,2,13,1,0,1,11,19,0,23,1 -11153,3,3,0,5,3,6,10,1,0,1,13,9,5,0,0 -11154,8,8,0,6,4,0,1,3,2,0,8,17,3,7,0 -11155,1,6,0,15,6,0,11,2,4,0,5,9,0,25,0 -11156,0,6,0,4,3,4,5,2,4,1,13,11,3,9,0 -11157,0,7,0,11,3,5,3,1,3,1,2,2,5,11,0 -11158,0,6,0,3,4,0,3,5,3,0,2,6,3,35,0 -11159,1,8,0,13,2,1,2,4,3,0,7,18,0,7,0 -11160,4,1,0,13,6,1,11,2,0,0,9,13,4,1,1 -11161,0,4,0,7,2,3,0,3,0,0,13,0,2,4,0 -11162,5,7,0,0,2,3,3,0,3,0,18,18,0,7,1 -11163,3,6,0,12,6,0,7,1,4,1,16,11,3,33,0 -11164,8,2,0,5,2,3,13,3,1,0,2,12,1,29,0 -11165,0,1,0,3,1,4,9,0,3,1,2,13,3,34,0 -11166,1,8,0,6,2,2,2,2,0,1,2,11,2,8,0 -11167,5,4,0,13,2,2,12,1,1,1,15,11,0,3,0 -11168,3,5,0,0,4,1,14,1,2,0,17,6,1,2,0 -11169,5,5,0,13,0,5,10,1,0,1,11,10,3,14,0 -11170,7,0,0,15,1,0,2,3,1,0,10,4,0,21,0 -11171,8,8,0,4,1,6,3,2,1,0,7,1,0,4,1 -11172,0,1,0,8,6,1,8,3,3,1,7,12,2,5,0 -11173,5,2,0,2,2,5,14,4,0,0,4,3,0,5,0 -11174,9,8,0,2,6,5,2,0,2,0,2,13,4,4,0 -11175,1,6,0,4,2,5,10,4,1,0,13,15,5,0,0 -11176,0,7,0,15,2,2,6,2,0,0,2,17,4,26,0 -11177,4,0,0,8,5,1,12,1,1,1,13,2,2,2,0 -11178,3,4,0,7,0,5,7,4,2,0,8,9,1,24,0 -11179,10,1,0,0,1,6,14,2,2,1,16,15,4,31,1 -11180,3,1,0,8,0,0,6,3,2,1,13,11,0,31,0 -11181,2,6,0,1,0,4,8,2,0,1,0,2,2,37,0 -11182,2,0,0,13,0,5,11,4,2,0,5,18,1,26,0 -11183,6,0,0,8,5,2,5,2,1,1,0,8,2,18,1 -11184,0,1,0,5,3,0,5,2,1,0,6,6,0,24,0 
-11185,3,0,0,1,4,3,5,4,1,1,13,2,5,17,0 -11186,1,6,0,9,0,0,2,2,0,0,13,11,4,11,0 -11187,2,0,0,14,2,6,4,5,2,0,5,18,3,19,1 -11188,2,4,0,15,4,3,2,2,1,0,12,9,3,39,0 -11189,10,4,0,7,2,1,10,5,2,0,14,7,5,0,1 -11190,2,6,0,3,1,4,14,2,3,0,18,9,4,28,0 -11191,2,7,0,15,5,3,9,4,0,1,12,20,2,17,0 -11192,1,8,0,15,6,3,14,0,0,0,4,16,2,40,0 -11193,7,0,0,11,5,2,4,4,4,1,13,0,4,32,0 -11194,8,1,0,1,6,1,4,0,2,0,11,1,2,20,1 -11195,0,4,0,7,3,1,6,3,3,0,18,0,3,36,0 -11196,1,4,0,6,2,5,11,0,4,1,18,13,4,16,1 -11197,2,4,0,8,6,5,8,3,4,0,7,15,2,17,0 -11198,5,4,0,11,3,5,6,4,1,0,10,8,3,0,0 -11199,8,7,0,11,6,3,13,4,2,1,8,11,2,41,0 -11200,3,2,0,0,0,3,7,3,4,1,4,17,1,20,0 -11201,7,3,0,0,3,0,11,0,0,0,2,2,4,24,0 -11202,5,0,0,9,3,6,9,3,3,1,4,14,0,17,0 -11203,6,3,0,6,2,1,5,3,2,1,13,0,4,19,0 -11204,0,6,0,6,1,4,6,0,0,1,10,2,0,31,0 -11205,0,8,0,13,0,0,2,3,3,1,4,1,3,26,0 -11206,3,5,0,12,3,2,11,2,4,1,7,10,1,2,1 -11207,1,1,0,4,5,3,6,4,0,1,15,20,1,35,0 -11208,3,2,0,12,1,2,14,5,0,1,18,3,1,20,1 -11209,1,4,0,9,1,5,8,1,0,0,2,11,3,3,0 -11210,0,0,0,6,3,5,0,0,1,0,6,17,0,33,0 -11211,3,6,0,2,6,5,9,4,0,1,13,6,0,17,0 -11212,2,4,0,3,0,4,3,2,2,0,13,6,3,12,0 -11213,9,8,0,8,4,1,9,2,4,1,18,7,5,9,1 -11214,6,0,0,15,5,4,13,5,2,0,9,0,2,32,1 -11215,5,1,0,5,6,6,0,5,4,1,18,7,4,17,1 -11216,0,2,0,10,1,5,10,2,1,0,18,7,3,21,1 -11217,8,6,0,0,4,2,0,0,0,1,17,4,0,21,1 -11218,2,4,0,5,1,2,6,3,0,0,15,5,0,9,0 -11219,4,7,0,1,0,4,5,2,3,1,6,2,2,4,0 -11220,10,0,0,13,2,4,3,4,3,0,9,19,5,20,1 -11221,1,4,0,9,5,5,6,0,0,0,0,12,3,25,0 -11222,8,4,0,5,1,1,7,2,0,1,13,2,5,30,0 -11223,1,7,0,1,0,0,14,4,0,0,8,6,3,29,0 -11224,10,2,0,15,2,3,6,2,0,0,18,19,4,25,1 -11225,8,4,0,10,2,4,4,3,2,1,9,9,5,25,1 -11226,7,5,0,1,4,4,13,1,4,1,18,1,5,4,1 -11227,1,4,0,3,5,1,8,0,3,0,2,9,0,34,0 -11228,7,2,0,5,5,4,1,0,2,0,0,12,1,8,0 -11229,10,3,0,11,0,0,9,3,0,0,5,3,5,1,0 -11230,5,4,0,9,0,3,12,5,4,1,3,7,0,35,1 -11231,9,0,0,0,5,5,5,3,0,0,12,8,3,0,0 -11232,3,4,0,14,6,5,14,0,1,1,6,6,0,23,0 -11233,7,1,0,11,5,4,9,0,0,1,15,0,2,21,1 -11234,4,3,0,4,2,2,12,3,3,0,17,2,1,9,0 
-11235,0,3,0,5,2,0,10,1,2,1,2,11,5,2,0 -11236,3,8,0,3,0,6,7,2,4,1,2,11,0,30,0 -11237,4,3,0,14,1,1,0,1,4,0,6,12,2,26,1 -11238,0,6,0,14,6,3,4,1,1,0,2,7,1,11,0 -11239,0,7,0,2,6,4,14,3,2,1,13,2,3,22,0 -11240,2,8,0,8,6,6,7,3,4,1,8,11,0,38,0 -11241,10,3,0,5,6,5,14,1,2,1,7,3,0,23,0 -11242,0,5,0,15,4,1,0,3,4,1,2,2,4,6,0 -11243,1,6,0,14,1,4,6,2,4,0,4,9,2,27,0 -11244,6,2,0,13,2,1,13,5,0,1,1,7,1,39,1 -11245,4,7,0,5,0,0,11,4,2,0,6,2,5,16,0 -11246,9,2,0,7,6,5,3,2,3,1,13,11,5,38,0 -11247,7,4,0,5,6,3,14,2,2,0,6,0,0,24,0 -11248,0,3,0,0,3,6,6,3,0,0,14,0,1,20,0 -11249,10,2,0,0,3,3,0,3,4,1,5,3,1,19,1 -11250,4,5,0,1,5,5,2,5,4,1,9,5,5,29,1 -11251,2,3,0,15,3,0,3,4,2,0,3,11,4,6,0 -11252,1,4,0,15,5,4,0,3,0,0,2,20,1,7,0 -11253,2,8,0,6,3,6,3,1,2,1,10,0,2,24,0 -11254,7,0,0,4,0,5,1,5,3,1,7,2,1,40,0 -11255,0,2,0,0,4,2,10,3,2,1,15,11,3,30,0 -11256,8,2,0,15,2,2,9,2,3,1,13,5,3,37,0 -11257,0,3,0,4,5,0,9,2,4,0,15,5,2,10,0 -11258,9,6,0,1,1,4,11,2,0,1,18,10,2,29,1 -11259,1,8,0,2,4,1,7,2,1,1,13,11,4,24,0 -11260,1,6,0,12,0,0,9,4,1,1,13,20,1,16,0 -11261,8,8,0,2,0,0,4,4,0,0,15,11,1,9,0 -11262,1,8,0,11,5,6,9,5,3,1,8,6,0,30,0 -11263,1,1,0,2,2,3,2,3,0,0,9,7,1,9,1 -11264,0,1,0,13,0,6,7,2,2,1,8,16,1,10,0 -11265,6,4,0,3,4,5,7,3,3,1,14,14,5,2,1 -11266,0,1,0,2,4,3,5,2,1,0,2,2,5,35,0 -11267,1,3,0,3,6,3,3,2,1,1,8,20,0,13,0 -11268,5,3,0,12,3,0,14,1,3,0,18,11,0,27,0 -11269,9,4,0,10,0,6,4,4,1,0,12,6,1,14,0 -11270,0,6,0,3,4,0,3,3,4,0,2,9,1,30,0 -11271,0,2,0,10,4,6,10,1,4,0,8,16,0,13,0 -11272,2,3,0,6,0,2,12,3,2,0,4,13,3,33,0 -11273,8,7,0,14,2,5,0,5,3,1,12,6,0,27,1 -11274,8,4,0,1,1,5,14,1,3,1,5,16,0,4,1 -11275,0,1,0,2,3,4,3,2,1,0,8,15,0,7,0 -11276,2,6,0,10,3,0,1,2,4,1,18,10,0,31,1 -11277,0,0,0,6,1,4,4,0,3,1,2,11,1,4,0 -11278,6,6,0,5,1,5,13,5,1,1,7,14,4,5,1 -11279,3,2,0,12,0,0,11,0,4,0,15,3,1,23,0 -11280,6,8,0,14,2,5,8,5,1,1,2,19,2,32,0 -11281,4,7,0,6,1,6,14,1,0,0,14,2,3,26,0 -11282,8,0,0,14,1,6,11,1,4,0,18,6,4,23,0 -11283,10,4,0,0,6,5,6,2,2,0,2,13,4,36,0 -11284,2,5,0,13,2,0,3,4,2,0,10,6,0,6,0 
-11285,10,6,0,13,0,1,0,2,0,1,8,9,2,13,0 -11286,7,0,0,6,6,4,3,4,4,0,8,15,1,19,0 -11287,0,3,0,1,0,6,3,4,4,0,13,2,3,7,0 -11288,3,2,0,9,3,0,2,4,3,0,2,6,0,9,0 -11289,7,6,0,0,6,3,10,4,0,0,18,14,0,5,1 -11290,3,4,0,13,6,2,3,1,3,0,7,11,3,28,0 -11291,8,3,0,0,6,6,7,0,0,0,13,9,0,8,0 -11292,1,6,0,10,4,3,3,4,2,1,7,6,0,13,0 -11293,3,3,0,8,0,5,4,3,4,1,13,4,5,16,0 -11294,3,8,0,15,4,5,9,5,3,1,17,14,2,37,0 -11295,2,8,0,7,6,5,5,1,4,1,17,2,2,7,0 -11296,6,6,0,12,4,5,5,2,0,0,3,7,4,4,1 -11297,5,8,0,10,3,1,10,2,2,0,7,3,5,41,1 -11298,10,5,0,0,0,0,2,1,0,0,17,17,0,17,0 -11299,2,2,0,15,5,5,5,1,0,0,16,11,4,6,0 -11300,3,7,0,0,2,3,9,4,4,0,17,2,4,22,0 -11301,9,8,0,7,1,4,14,1,4,0,3,11,4,34,0 -11302,9,7,0,3,2,5,4,5,4,0,18,14,5,22,1 -11303,2,4,0,11,1,1,13,4,4,0,3,19,5,36,1 -11304,2,1,0,9,6,3,14,2,3,1,18,18,4,12,1 -11305,9,3,0,2,3,3,5,3,3,1,8,17,1,0,0 -11306,9,7,0,11,3,0,3,5,3,0,4,17,3,11,1 -11307,0,3,0,15,2,0,10,3,0,1,0,2,3,3,0 -11308,1,2,0,15,4,5,8,0,4,0,18,3,4,18,1 -11309,3,8,0,12,3,0,14,5,4,0,1,7,5,37,1 -11310,8,3,0,10,6,5,7,3,4,1,6,20,1,29,0 -11311,6,4,0,5,3,3,7,5,1,1,18,1,3,32,1 -11312,1,3,0,4,6,4,5,4,2,0,8,9,0,7,0 -11313,9,8,0,15,5,0,10,3,4,1,13,3,4,30,0 -11314,7,4,0,10,4,6,7,3,2,1,18,3,2,33,1 -11315,1,3,0,2,2,2,13,0,4,0,17,12,3,27,0 -11316,6,1,0,3,6,6,9,2,0,1,18,12,1,22,0 -11317,1,8,0,14,1,0,5,0,0,1,8,20,0,28,0 -11318,3,2,0,0,1,5,1,0,1,0,17,18,3,37,0 -11319,7,1,0,13,4,3,8,5,0,0,17,14,2,11,0 -11320,9,1,0,10,0,2,2,5,0,0,1,10,4,31,1 -11321,6,7,0,15,6,4,8,4,0,0,4,13,2,28,0 -11322,3,1,0,11,0,0,11,4,2,1,17,15,5,6,0 -11323,6,2,0,3,4,6,14,3,4,1,8,15,1,23,0 -11324,0,2,0,4,5,0,0,4,1,0,4,4,4,37,0 -11325,0,6,0,3,6,4,5,0,3,0,13,13,2,19,0 -11326,5,8,0,9,2,5,9,4,3,0,10,3,3,13,0 -11327,5,3,0,15,5,2,11,5,1,1,18,19,2,33,1 -11328,0,3,0,10,0,4,13,3,4,0,10,11,0,40,0 -11329,9,3,0,12,2,1,7,3,4,0,9,17,0,32,1 -11330,0,3,0,6,5,5,9,5,2,0,4,2,5,3,0 -11331,0,5,0,2,2,5,14,2,0,0,17,11,0,0,0 -11332,2,5,0,13,6,4,8,2,1,0,13,2,0,17,0 -11333,1,2,0,5,2,4,8,4,4,1,7,8,1,8,0 -11334,3,2,0,3,2,4,1,4,0,1,4,17,1,26,0 
-11335,6,7,0,14,3,1,11,5,4,0,14,17,3,21,1 -11336,9,3,0,13,0,6,0,1,1,0,8,16,2,26,0 -11337,5,5,0,15,0,0,4,4,1,1,9,7,5,18,1 -11338,3,8,0,11,4,2,12,4,1,1,2,9,0,28,0 -11339,4,0,0,10,6,2,9,1,1,1,1,5,3,38,1 -11340,0,7,0,12,3,0,8,3,0,0,2,18,0,11,0 -11341,5,4,0,15,2,4,2,0,2,1,12,9,1,32,0 -11342,2,2,0,8,1,4,13,0,1,1,13,19,5,27,0 -11343,1,7,0,6,0,5,9,0,1,0,10,8,0,0,0 -11344,10,0,0,7,1,5,14,4,0,1,4,15,0,41,0 -11345,8,7,0,9,2,4,8,3,0,0,6,4,5,0,0 -11346,1,4,0,15,6,1,4,4,4,1,2,0,5,11,0 -11347,0,2,0,5,1,2,2,4,1,0,13,9,0,0,0 -11348,9,8,0,11,2,4,1,2,4,1,17,6,3,41,0 -11349,2,0,0,2,0,0,2,1,2,1,8,15,5,35,0 -11350,10,3,0,2,4,2,4,1,4,0,18,12,0,12,1 -11351,3,5,0,6,4,5,9,2,2,0,8,17,2,38,0 -11352,2,4,0,5,2,5,10,2,4,1,8,2,5,1,0 -11353,6,3,0,3,1,3,12,3,1,0,8,11,2,12,0 -11354,6,2,0,6,1,0,4,0,0,1,16,16,3,4,0 -11355,1,5,0,5,5,6,7,5,0,1,13,2,0,16,0 -11356,10,1,0,10,4,1,2,4,2,1,0,10,3,35,1 -11357,1,2,0,15,2,4,11,2,0,0,8,16,5,23,0 -11358,3,8,0,2,6,5,8,1,2,0,17,1,0,17,0 -11359,7,7,0,1,4,5,7,4,2,1,13,11,3,1,0 -11360,5,3,0,11,5,3,4,3,0,1,13,4,0,16,0 -11361,9,4,0,0,1,6,11,1,1,1,8,9,4,22,1 -11362,1,8,0,0,4,5,12,1,1,0,13,11,1,36,0 -11363,8,4,0,1,0,6,5,4,0,0,10,11,1,1,0 -11364,3,7,0,8,1,1,1,0,1,1,18,8,0,21,1 -11365,4,4,0,0,2,3,5,1,1,0,8,6,5,8,0 -11366,2,8,0,11,6,2,2,0,0,1,4,11,1,23,0 -11367,9,2,0,8,4,5,14,2,0,1,5,1,2,35,1 -11368,3,2,0,1,1,1,10,5,0,1,5,3,5,24,1 -11369,1,6,0,7,3,1,4,4,3,1,17,11,2,4,0 -11370,10,2,0,12,6,0,6,5,0,0,16,8,3,35,1 -11371,7,6,0,10,5,4,4,4,1,0,2,4,4,7,0 -11372,3,2,0,10,6,3,8,0,0,1,6,15,0,13,0 -11373,9,8,0,15,6,3,3,2,1,1,16,11,1,34,0 -11374,0,3,0,2,1,1,9,2,0,0,8,20,1,9,0 -11375,1,6,0,15,3,2,8,5,1,1,17,19,0,27,0 -11376,6,6,0,13,1,3,6,0,1,0,2,2,0,5,0 -11377,7,6,0,10,3,2,4,5,2,0,18,0,3,16,1 -11378,5,1,0,2,2,3,13,5,3,0,9,3,4,18,1 -11379,1,2,0,9,0,5,9,3,3,0,14,0,0,31,0 -11380,3,2,0,5,6,5,6,3,0,1,8,10,2,40,0 -11381,0,8,0,10,0,6,1,5,4,1,11,0,4,22,1 -11382,1,6,0,5,5,3,5,2,0,0,5,11,2,13,0 -11383,6,0,0,2,6,0,2,3,2,1,6,13,0,31,0 -11384,2,5,0,13,6,5,0,1,0,1,13,18,2,14,0 
-11385,0,1,0,3,5,2,9,1,0,0,10,15,1,22,0 -11386,7,1,0,5,5,5,1,4,4,0,4,19,2,27,0 -11387,2,5,0,15,5,6,13,1,0,0,10,10,0,25,0 -11388,9,2,0,15,0,1,3,1,4,1,13,2,3,18,0 -11389,1,8,0,11,0,6,4,1,3,0,13,13,1,20,0 -11390,2,2,0,1,5,0,10,3,0,1,8,14,0,29,0 -11391,7,3,0,3,5,0,12,2,3,0,17,11,5,12,0 -11392,7,4,0,9,4,2,4,0,3,0,18,16,2,36,1 -11393,0,6,0,15,2,6,9,1,1,1,9,2,5,5,0 -11394,7,0,0,15,0,3,11,0,4,0,15,12,1,13,1 -11395,6,2,0,6,5,3,9,3,2,1,8,2,5,33,0 -11396,0,4,0,10,4,4,5,3,4,1,13,0,2,40,0 -11397,5,7,0,12,0,0,4,5,3,1,17,9,0,35,0 -11398,8,6,0,8,5,5,4,0,2,1,5,14,2,35,1 -11399,10,3,0,1,0,0,2,3,4,0,13,2,1,34,0 -11400,2,6,0,1,2,6,2,0,3,1,5,17,4,16,1 -11401,10,4,0,10,4,0,12,3,0,1,13,0,5,6,0 -11402,2,6,0,0,0,5,3,5,4,0,8,9,0,2,0 -11403,3,7,0,1,5,0,3,4,1,0,13,2,1,23,0 -11404,7,4,0,3,4,2,13,1,0,1,18,12,5,9,1 -11405,4,5,0,5,1,3,8,0,2,0,10,12,1,31,0 -11406,9,0,0,8,1,1,9,2,2,1,17,13,1,2,0 -11407,7,3,0,12,1,4,0,2,2,0,8,2,0,33,0 -11408,1,4,0,5,1,3,10,5,2,0,1,12,5,13,1 -11409,5,7,0,13,2,4,6,4,3,0,12,8,2,22,0 -11410,0,5,0,10,2,5,5,0,0,0,8,18,3,7,0 -11411,4,7,0,11,2,1,1,3,4,0,17,2,5,1,0 -11412,8,3,0,1,0,5,11,1,0,0,3,6,4,8,0 -11413,0,6,0,14,1,1,2,0,4,1,5,7,1,5,1 -11414,4,6,0,10,6,0,6,0,3,1,8,9,3,7,0 -11415,1,4,0,13,3,1,1,5,0,1,0,8,0,23,0 -11416,10,8,0,0,5,5,10,5,0,0,2,9,2,38,0 -11417,7,7,0,10,6,3,2,2,4,1,13,20,1,35,0 -11418,4,6,0,5,1,0,11,3,0,1,10,2,3,11,0 -11419,0,3,0,13,1,5,9,5,3,0,11,4,5,5,0 -11420,9,4,0,10,4,6,14,2,4,1,9,5,3,20,1 -11421,3,8,0,14,0,5,10,2,4,0,17,10,5,0,0 -11422,3,8,0,11,6,4,6,3,0,1,2,9,1,30,0 -11423,7,1,0,2,1,2,14,5,2,1,3,11,4,35,1 -11424,1,3,0,0,3,4,0,5,0,1,8,18,3,19,0 -11425,5,0,0,12,3,2,9,2,4,0,16,19,3,23,1 -11426,10,6,0,12,4,1,7,4,1,1,18,8,4,18,1 -11427,0,0,0,11,0,0,1,2,2,1,17,9,2,7,0 -11428,3,8,0,1,4,5,2,5,0,1,2,14,5,4,0 -11429,2,4,0,7,1,4,2,1,0,0,13,16,5,32,0 -11430,10,0,0,6,2,2,4,2,4,1,7,12,1,10,1 -11431,10,7,0,6,0,4,9,4,1,0,1,11,3,35,0 -11432,4,7,0,0,3,6,11,2,3,0,8,11,5,9,0 -11433,0,6,0,4,0,3,5,5,4,0,8,3,2,18,0 -11434,0,6,0,13,0,5,9,0,0,1,2,15,3,14,0 
-11435,3,1,0,11,5,2,3,0,0,1,8,17,2,22,0 -11436,10,8,0,4,0,3,6,3,3,0,4,8,3,27,0 -11437,0,0,0,15,4,5,0,2,3,0,2,4,4,20,0 -11438,5,0,0,9,6,2,7,3,0,0,16,4,2,41,1 -11439,2,1,0,14,5,1,11,1,3,1,14,5,2,28,1 -11440,10,1,0,6,0,1,14,0,3,0,4,10,5,25,0 -11441,7,0,0,6,3,2,9,0,0,0,3,6,1,31,0 -11442,0,0,0,11,3,2,8,5,4,0,6,13,4,18,0 -11443,1,0,0,13,6,2,0,0,0,1,8,0,2,27,0 -11444,7,3,0,11,3,2,7,0,0,0,2,2,1,3,0 -11445,1,1,0,3,2,2,14,5,2,0,17,10,3,0,0 -11446,7,3,0,11,6,5,3,1,0,1,14,6,2,3,0 -11447,4,4,0,3,0,0,5,4,0,0,2,2,0,19,0 -11448,0,5,0,15,0,2,2,4,4,1,13,2,0,41,0 -11449,1,3,0,3,6,6,4,4,3,0,10,11,4,29,0 -11450,0,6,0,0,0,5,1,1,1,1,2,13,3,35,0 -11451,10,6,0,4,5,1,11,1,2,1,9,14,2,31,1 -11452,4,0,0,11,5,4,11,3,4,1,13,20,1,35,0 -11453,7,2,0,13,0,3,14,2,0,0,2,2,1,7,0 -11454,10,4,0,3,2,2,3,4,1,1,9,19,2,36,1 -11455,0,7,0,13,6,5,7,4,0,1,13,13,1,34,0 -11456,1,1,0,2,5,3,2,0,1,0,4,15,0,32,0 -11457,3,6,0,4,0,6,7,3,3,0,13,3,2,27,0 -11458,9,1,0,3,0,1,2,5,0,1,18,17,5,7,1 -11459,3,8,0,14,1,2,8,3,0,1,6,12,4,0,0 -11460,1,8,0,10,6,0,6,2,2,0,8,10,3,26,0 -11461,10,6,0,13,0,5,6,4,2,1,11,14,5,4,0 -11462,2,2,0,0,2,5,9,5,1,0,2,6,3,2,0 -11463,2,1,0,13,6,0,8,3,0,0,7,15,2,33,0 -11464,2,3,0,2,4,2,10,3,3,0,12,1,1,29,0 -11465,4,4,0,10,3,6,1,0,3,1,17,10,5,1,1 -11466,0,0,0,1,6,4,13,5,2,0,13,17,0,37,0 -11467,3,7,0,8,5,0,5,3,0,1,13,2,0,39,0 -11468,1,2,0,9,3,5,7,0,1,0,17,4,0,11,0 -11469,1,5,0,8,5,5,9,3,1,0,4,11,1,38,0 -11470,2,6,0,1,6,0,13,1,3,1,0,2,0,40,0 -11471,8,5,0,4,6,4,7,2,0,1,13,2,0,37,0 -11472,2,8,0,0,5,6,11,2,0,1,4,2,5,40,0 -11473,3,0,0,5,5,0,10,2,3,0,12,15,0,39,0 -11474,4,8,0,8,1,0,9,3,3,0,2,1,5,3,0 -11475,2,0,0,5,1,4,9,1,1,0,4,15,3,6,0 -11476,0,8,0,0,0,4,7,3,2,0,10,9,0,27,0 -11477,2,3,0,13,1,3,6,2,3,0,10,4,4,26,0 -11478,7,7,0,15,0,1,0,3,2,1,18,17,2,13,1 -11479,7,6,0,9,1,4,10,2,2,1,13,3,0,14,0 -11480,4,2,0,1,1,5,6,5,0,1,13,11,2,29,0 -11481,9,7,0,11,6,1,7,0,1,0,8,0,1,9,0 -11482,0,0,0,0,0,4,1,2,1,1,3,15,1,40,0 -11483,1,7,0,11,2,1,4,2,3,1,12,12,2,16,0 -11484,2,5,0,13,0,3,2,5,3,0,8,2,0,7,0 
-11485,7,3,0,8,1,0,1,1,0,0,18,3,3,2,1 -11486,1,4,0,10,4,5,5,3,4,0,8,9,0,39,0 -11487,7,0,0,11,1,2,0,1,0,0,13,11,0,4,0 -11488,9,6,0,13,5,6,13,1,4,1,0,0,0,24,0 -11489,0,3,0,0,2,5,0,3,3,0,4,11,0,39,0 -11490,0,4,0,9,5,5,12,2,4,0,15,15,2,12,0 -11491,2,6,0,2,1,5,6,5,2,0,4,1,5,38,0 -11492,4,5,0,14,2,3,8,0,0,1,10,11,4,1,0 -11493,8,6,0,10,2,5,13,3,2,0,12,4,3,38,0 -11494,7,5,0,1,3,3,12,1,0,0,10,14,0,0,0 -11495,7,0,0,10,4,4,11,2,4,1,18,16,3,33,1 -11496,2,4,0,6,6,5,12,2,3,1,13,2,0,16,0 -11497,7,3,0,7,0,4,10,3,0,1,2,15,2,26,0 -11498,3,6,0,9,6,2,5,0,1,1,1,8,5,13,1 -11499,6,6,0,8,1,0,6,4,0,0,6,3,5,36,0 -11500,9,5,0,7,3,4,5,0,2,0,2,16,1,0,0 -11501,4,8,0,4,0,4,3,0,0,1,13,11,0,35,0 -11502,10,2,0,3,1,4,9,5,0,1,4,20,1,0,0 -11503,10,6,0,9,5,6,5,0,3,0,13,2,2,29,0 -11504,9,6,0,1,6,5,10,0,0,0,2,2,5,40,0 -11505,2,8,0,6,2,5,13,0,2,0,4,0,1,1,0 -11506,7,4,0,6,6,4,9,4,3,0,7,15,1,27,0 -11507,6,3,0,3,5,6,10,3,1,0,13,15,5,1,0 -11508,8,4,0,1,4,4,14,3,0,1,4,9,1,9,0 -11509,6,2,0,8,5,1,9,5,4,0,17,17,0,38,1 -11510,2,7,0,3,5,0,6,1,2,1,15,16,5,35,0 -11511,1,6,0,3,3,4,8,4,4,1,8,11,0,40,0 -11512,5,1,0,10,4,3,14,2,4,1,11,7,0,38,1 -11513,10,3,0,0,5,3,4,4,4,1,2,15,5,8,0 -11514,2,0,0,11,6,3,2,4,0,0,4,9,1,18,0 -11515,1,5,0,14,0,2,14,1,2,1,18,10,0,2,1 -11516,2,7,0,1,1,3,8,5,1,0,15,11,3,21,0 -11517,10,1,0,3,2,4,9,5,0,1,18,12,5,13,1 -11518,10,6,0,1,2,2,5,2,2,1,0,9,1,7,0 -11519,1,5,0,14,5,3,3,2,2,1,13,1,3,33,0 -11520,1,8,0,1,2,4,10,3,4,1,8,2,1,37,0 -11521,4,6,0,10,5,0,12,3,2,0,13,13,5,14,0 -11522,0,8,0,2,1,4,0,2,3,0,13,11,3,35,0 -11523,10,6,0,13,6,2,14,2,0,1,4,17,1,13,1 -11524,9,3,0,0,2,0,0,4,1,1,11,7,4,40,1 -11525,3,2,0,15,2,0,1,4,1,0,13,19,1,7,0 -11526,2,8,0,8,5,5,11,0,2,1,17,11,4,7,0 -11527,4,1,0,6,2,3,11,0,1,1,12,1,1,31,1 -11528,9,3,0,15,0,0,4,2,0,1,2,8,1,12,0 -11529,9,5,0,14,6,5,12,0,4,1,18,17,1,38,1 -11530,0,1,0,7,1,1,8,0,1,0,4,17,0,29,0 -11531,1,7,0,5,1,4,11,3,1,0,8,6,0,2,0 -11532,3,5,0,8,6,1,12,3,4,0,8,2,0,36,0 -11533,6,1,0,12,0,4,14,3,1,0,2,15,0,18,0 -11534,5,6,0,8,2,4,5,2,2,1,15,2,0,18,0 
-11535,8,6,0,1,3,5,9,0,3,1,1,14,2,21,1 -11536,9,1,0,12,1,0,12,0,0,0,4,6,4,28,0 -11537,5,0,0,5,4,1,5,1,1,0,13,15,4,11,0 -11538,1,0,0,0,5,5,8,3,1,0,6,15,5,19,0 -11539,3,8,0,13,3,3,8,3,2,1,0,2,2,0,0 -11540,3,1,0,2,4,4,5,3,0,1,15,11,4,39,0 -11541,3,4,0,0,0,1,10,5,1,0,16,7,0,40,1 -11542,4,8,0,9,0,2,11,1,3,0,0,7,0,35,1 -11543,9,8,0,14,5,3,10,2,0,1,8,6,1,11,0 -11544,0,5,0,0,0,4,8,1,3,1,10,11,3,33,0 -11545,2,7,0,7,2,3,14,2,3,0,10,12,0,34,0 -11546,4,1,0,12,6,5,10,3,0,1,2,11,2,14,0 -11547,3,3,0,6,0,2,8,2,0,0,10,12,1,27,0 -11548,3,3,0,13,3,3,14,1,3,1,2,3,5,35,0 -11549,2,3,0,14,1,6,3,0,0,1,2,2,0,34,0 -11550,0,0,0,13,1,1,6,2,0,0,8,3,2,8,0 -11551,0,2,0,6,4,3,13,1,3,1,2,11,2,13,0 -11552,6,0,0,13,3,4,10,2,3,1,18,8,4,41,1 -11553,10,8,0,10,1,1,8,4,1,0,5,14,3,22,1 -11554,2,0,0,14,1,6,14,2,2,1,13,0,4,14,0 -11555,1,4,0,7,4,4,3,1,0,1,9,10,4,28,1 -11556,5,5,0,9,3,2,2,4,2,1,1,7,3,7,1 -11557,4,7,0,5,0,3,11,3,4,1,8,9,1,2,0 -11558,8,2,0,8,5,3,9,1,1,1,1,17,3,41,1 -11559,0,1,0,4,3,4,5,2,2,0,8,2,5,14,0 -11560,9,6,0,15,6,5,12,1,2,1,1,8,3,27,1 -11561,3,3,0,0,0,6,6,5,4,0,17,20,0,23,0 -11562,0,4,0,6,3,2,4,1,0,0,2,2,1,29,0 -11563,0,7,0,2,3,4,11,0,4,0,12,11,3,30,0 -11564,10,2,0,8,0,6,8,2,3,0,13,9,2,4,0 -11565,3,8,0,7,1,2,8,0,3,0,0,0,1,3,0 -11566,1,6,0,7,3,4,12,4,1,0,17,16,2,28,0 -11567,9,1,0,8,0,5,13,2,4,1,7,15,3,24,0 -11568,8,7,0,12,4,0,12,5,1,1,5,10,4,28,1 -11569,3,6,0,5,6,5,13,5,2,0,15,18,0,2,0 -11570,3,3,0,14,0,0,1,4,4,1,8,2,4,24,0 -11571,6,6,0,0,3,1,10,1,3,0,8,2,0,32,0 -11572,1,1,0,3,6,3,3,4,0,0,13,13,3,25,0 -11573,10,4,0,8,2,6,6,3,3,0,11,18,0,4,0 -11574,0,4,0,0,0,3,12,4,0,1,6,13,0,16,0 -11575,4,4,0,11,6,3,12,3,0,1,2,9,1,20,0 -11576,9,3,0,13,2,2,10,0,1,0,2,5,0,21,0 -11577,5,4,0,2,0,1,3,1,3,1,0,2,4,31,0 -11578,0,7,0,7,6,3,6,5,1,0,13,17,3,40,0 -11579,0,2,0,10,3,4,8,3,3,1,4,2,0,24,0 -11580,1,6,0,2,6,0,7,3,0,1,14,11,1,40,0 -11581,7,6,0,12,1,5,5,0,4,0,13,2,3,40,0 -11582,1,0,0,13,5,3,1,3,4,1,3,2,0,3,0 -11583,7,5,0,7,5,1,11,5,2,1,16,3,0,27,0 -11584,1,7,0,15,6,0,4,4,2,1,8,2,3,3,0 
-11585,10,8,0,15,3,1,10,2,0,0,9,12,2,14,1 -11586,7,3,0,0,3,2,5,2,0,1,11,0,1,16,0 -11587,6,0,0,6,4,2,3,1,1,0,11,14,4,36,1 -11588,1,4,0,3,1,4,8,3,4,0,2,13,5,31,0 -11589,5,4,0,11,0,1,11,1,1,1,3,5,3,4,1 -11590,9,6,0,8,0,6,8,0,0,0,10,11,5,39,0 -11591,10,7,0,3,2,3,11,5,0,0,14,7,5,37,1 -11592,1,1,0,6,6,3,11,2,3,0,8,15,0,19,0 -11593,0,1,0,5,6,1,5,1,0,0,4,17,0,17,0 -11594,0,5,0,14,1,3,6,2,0,1,13,6,2,24,0 -11595,9,2,0,14,0,6,14,3,1,0,2,9,2,11,0 -11596,5,3,0,13,0,3,9,0,2,0,17,20,1,24,0 -11597,5,7,0,5,6,4,10,5,4,1,18,19,0,20,1 -11598,3,8,0,9,1,5,3,3,1,1,13,2,0,2,0 -11599,3,3,0,2,5,0,7,0,2,0,5,7,3,31,1 -11600,9,6,0,3,0,6,14,1,4,0,17,10,4,30,0 -11601,1,3,0,7,6,5,8,4,2,1,13,4,4,31,0 -11602,10,1,0,13,4,5,14,5,0,0,13,15,5,25,0 -11603,0,4,0,6,1,4,6,3,3,0,2,2,4,20,0 -11604,0,1,0,4,6,3,9,3,1,1,12,20,1,7,0 -11605,5,5,0,11,4,2,4,4,4,1,4,0,1,4,0 -11606,4,6,0,15,3,1,9,0,2,0,13,20,1,6,0 -11607,7,3,0,13,1,0,5,3,3,1,12,15,2,7,0 -11608,4,0,0,4,6,5,2,0,1,0,8,13,0,17,0 -11609,4,4,0,8,1,1,14,0,2,1,5,16,3,18,1 -11610,4,7,0,13,6,5,14,2,4,1,13,0,4,37,0 -11611,6,7,0,4,5,3,1,5,1,1,14,14,3,19,0 -11612,7,4,0,5,6,5,13,4,0,0,15,15,3,9,0 -11613,7,2,0,3,6,1,4,4,3,1,15,0,5,16,0 -11614,0,8,0,5,2,4,5,0,3,0,4,13,1,26,0 -11615,3,5,0,0,0,0,3,5,1,1,5,5,4,24,1 -11616,2,0,0,10,4,1,14,4,3,0,15,16,3,30,0 -11617,0,3,0,9,1,4,0,3,0,0,6,16,3,19,0 -11618,1,6,0,15,5,4,1,3,0,1,4,14,5,27,0 -11619,0,3,0,15,2,6,7,0,3,1,2,15,4,8,0 -11620,7,6,0,8,1,5,9,1,0,0,4,16,1,19,0 -11621,4,2,0,10,0,3,10,0,2,0,2,1,5,10,0 -11622,2,5,0,3,6,2,14,4,3,1,0,17,2,21,1 -11623,2,8,0,3,2,2,4,5,0,1,13,18,4,28,0 -11624,1,1,0,10,0,0,9,4,4,0,6,20,2,18,0 -11625,9,4,0,13,6,1,2,2,3,1,13,0,5,8,0 -11626,0,5,0,10,6,5,2,3,4,1,2,2,5,33,0 -11627,0,5,0,11,5,4,2,2,3,0,2,11,4,8,0 -11628,0,8,0,5,5,4,9,1,1,1,15,2,2,32,0 -11629,9,1,0,1,0,0,12,5,2,0,13,2,3,0,0 -11630,5,1,0,2,6,6,6,2,4,1,4,17,1,9,0 -11631,1,7,0,13,4,3,5,1,0,1,8,18,4,3,0 -11632,1,1,0,9,4,5,5,5,0,0,14,7,5,16,1 -11633,4,2,0,9,6,3,7,5,0,1,3,7,3,38,0 -11634,10,4,0,2,0,0,6,2,1,1,4,15,5,29,0 
-11635,3,8,0,1,1,4,11,3,1,0,4,6,0,19,0 -11636,2,1,0,5,0,0,3,4,4,0,1,18,1,29,0 -11637,1,8,0,1,5,4,8,1,2,1,13,6,0,6,0 -11638,7,6,0,9,0,3,9,2,2,1,13,13,0,13,0 -11639,2,1,0,9,0,4,5,3,2,1,4,2,0,31,0 -11640,5,3,0,10,2,4,7,3,0,1,1,5,5,28,1 -11641,5,7,0,7,3,0,2,5,0,0,11,7,0,13,1 -11642,10,2,0,10,4,6,6,2,0,1,15,2,5,20,0 -11643,5,7,0,13,4,0,11,3,0,0,17,0,4,21,0 -11644,6,6,0,5,4,0,4,3,4,1,5,2,4,7,0 -11645,3,3,0,9,2,1,8,1,3,1,4,5,1,13,1 -11646,0,1,0,8,3,4,3,4,3,0,4,19,3,32,0 -11647,3,7,0,13,4,6,8,0,4,0,10,18,1,10,0 -11648,1,3,0,13,0,6,1,1,3,0,17,2,5,6,0 -11649,10,0,0,6,1,0,6,3,3,0,13,2,2,2,0 -11650,2,2,0,2,4,5,1,3,0,1,17,0,1,24,0 -11651,1,6,0,12,0,0,14,2,4,0,6,3,5,1,0 -11652,1,0,0,12,6,4,8,4,3,0,13,4,2,26,0 -11653,0,2,0,15,5,5,2,5,1,0,4,20,0,37,0 -11654,5,0,0,9,5,0,11,0,4,0,13,15,4,6,0 -11655,3,3,0,8,3,0,11,0,0,1,12,10,4,37,1 -11656,1,2,0,5,3,4,14,3,3,0,6,20,5,24,0 -11657,2,6,0,5,6,2,8,5,2,0,13,13,4,18,0 -11658,5,3,0,7,5,0,0,4,3,1,8,15,5,17,0 -11659,8,1,0,0,6,1,4,1,3,0,5,14,3,1,1 -11660,5,6,0,8,6,5,7,2,3,0,13,14,5,23,0 -11661,9,3,0,8,2,1,14,1,2,0,0,10,2,34,0 -11662,7,8,0,14,1,6,6,4,3,0,13,13,0,3,0 -11663,2,8,0,5,1,6,10,3,4,0,0,18,2,25,0 -11664,0,7,0,9,1,1,8,3,4,0,2,8,1,26,0 -11665,3,7,0,9,3,5,0,0,1,0,5,2,0,22,0 -11666,10,2,0,11,1,3,12,0,1,1,16,0,2,17,1 -11667,4,0,0,10,5,0,11,5,1,1,8,20,0,22,0 -11668,0,1,0,10,2,4,13,5,2,0,8,0,3,31,0 -11669,2,3,0,2,3,2,1,0,2,0,17,11,2,9,0 -11670,1,1,0,14,4,5,4,1,0,1,15,5,1,38,0 -11671,1,6,0,5,6,5,13,1,1,0,12,5,5,20,0 -11672,1,1,0,15,5,6,2,0,0,0,2,4,5,27,0 -11673,8,4,0,11,6,0,1,0,0,1,13,2,0,27,0 -11674,9,7,0,13,0,1,8,2,0,0,8,11,4,23,0 -11675,10,4,0,0,6,2,13,0,2,0,8,5,0,20,0 -11676,2,7,0,10,3,0,14,0,1,0,7,6,3,32,0 -11677,1,3,0,15,2,1,5,4,4,0,8,2,4,26,0 -11678,3,6,0,6,0,0,8,0,3,1,8,16,0,16,0 -11679,0,0,0,11,2,0,3,3,3,0,4,15,3,6,0 -11680,0,0,0,1,6,0,7,3,2,0,10,15,2,10,0 -11681,9,7,0,13,1,4,14,2,0,1,2,12,0,26,0 -11682,3,5,0,3,5,0,2,4,4,1,2,11,5,0,0 -11683,0,5,0,14,6,0,7,2,0,0,13,18,4,39,0 -11684,2,3,0,7,4,6,4,3,0,1,11,13,3,9,0 
-11685,9,4,0,3,0,6,13,3,1,1,6,0,0,13,0 -11686,4,6,0,0,2,6,12,4,3,1,8,4,1,17,0 -11687,3,4,0,13,1,6,10,1,2,0,8,0,2,19,0 -11688,7,0,0,12,2,0,4,2,2,1,18,7,2,5,1 -11689,4,8,0,11,0,1,9,4,3,0,13,2,3,5,0 -11690,5,2,0,15,4,1,11,5,2,0,14,7,0,29,1 -11691,7,3,0,1,0,4,14,0,3,1,11,11,1,10,0 -11692,7,3,0,13,5,2,6,2,0,0,5,2,2,3,0 -11693,1,2,0,11,4,0,9,4,0,0,12,2,0,33,0 -11694,4,1,0,13,0,5,2,1,0,0,16,0,1,3,0 -11695,2,8,0,1,0,5,8,4,4,0,8,20,3,24,0 -11696,1,5,0,14,6,1,2,2,4,0,0,11,5,29,0 -11697,0,6,0,5,5,0,14,0,2,1,2,0,0,4,0 -11698,0,1,0,5,2,3,11,1,0,1,15,13,1,11,0 -11699,5,1,0,14,4,3,12,5,4,1,18,17,1,11,1 -11700,7,7,0,5,0,1,4,0,2,1,16,8,5,35,1 -11701,6,4,0,4,3,2,1,0,2,1,18,8,4,2,1 -11702,5,4,0,15,3,1,2,5,2,1,18,7,1,30,1 -11703,6,6,0,9,6,4,6,3,1,1,13,0,0,33,0 -11704,1,8,0,15,5,0,2,3,1,1,17,4,0,33,0 -11705,8,0,0,2,1,2,2,3,0,0,8,0,0,33,0 -11706,0,3,0,9,0,0,9,1,2,1,2,9,1,29,0 -11707,1,4,0,4,2,3,0,4,0,1,2,9,5,13,0 -11708,1,6,0,4,1,0,6,5,4,0,13,6,5,31,0 -11709,9,1,0,2,2,0,4,5,1,1,1,8,5,14,1 -11710,3,4,0,7,2,0,6,4,1,0,6,2,1,18,0 -11711,0,2,0,15,3,5,4,0,1,1,0,7,2,2,1 -11712,1,1,0,1,5,4,9,0,0,0,6,2,1,27,0 -11713,3,8,0,11,6,4,9,1,1,1,13,18,4,1,0 -11714,10,2,0,9,5,1,5,0,2,1,15,7,0,27,1 -11715,5,3,0,1,4,5,1,0,3,0,1,8,3,35,1 -11716,9,6,0,8,0,4,7,5,3,0,8,14,3,3,0 -11717,2,2,0,12,1,0,1,1,4,0,12,18,4,31,0 -11718,1,3,0,2,4,2,14,2,4,1,1,11,2,39,0 -11719,2,7,0,13,0,4,9,4,3,1,13,15,2,25,0 -11720,2,8,0,13,4,0,2,1,0,1,8,2,0,33,0 -11721,7,5,0,1,5,1,9,2,0,1,17,9,0,5,0 -11722,1,3,0,12,1,2,4,5,0,1,7,15,0,3,0 -11723,9,6,0,11,4,0,4,5,3,1,3,19,5,13,1 -11724,3,3,0,0,1,4,2,4,1,0,8,15,0,0,0 -11725,2,4,0,0,6,1,7,3,1,1,11,7,4,38,1 -11726,6,5,0,5,4,3,5,5,1,1,12,0,0,12,0 -11727,8,1,0,11,6,5,5,1,2,1,18,12,2,16,1 -11728,6,2,0,12,0,6,9,1,1,1,4,15,0,6,0 -11729,5,3,0,1,2,6,10,1,2,0,10,17,3,6,1 -11730,2,7,0,3,0,6,9,3,4,1,4,2,0,37,0 -11731,4,8,0,0,0,4,9,0,1,0,15,13,5,33,0 -11732,0,3,0,14,2,5,1,1,2,1,6,0,2,26,0 -11733,3,3,0,4,3,6,8,1,4,1,8,13,2,21,0 -11734,10,2,0,3,3,1,5,1,1,1,14,17,1,23,1 
-11735,6,7,0,2,6,0,13,1,1,0,8,7,1,21,0 -11736,0,8,0,15,2,5,12,5,2,1,15,6,1,38,0 -11737,7,8,0,9,1,3,11,5,3,1,18,7,5,23,1 -11738,1,5,0,1,4,1,5,4,2,1,4,20,5,25,0 -11739,4,4,0,7,2,6,10,2,3,0,12,19,1,27,0 -11740,3,7,0,0,2,3,6,3,3,1,9,6,1,30,0 -11741,2,1,0,12,3,6,1,2,0,0,4,11,1,27,0 -11742,1,2,0,5,0,4,9,2,3,1,11,11,0,28,0 -11743,9,1,0,11,5,5,2,2,1,0,8,2,4,3,0 -11744,2,8,0,3,0,5,10,0,0,0,11,19,2,3,0 -11745,9,1,0,12,2,2,10,1,2,1,9,10,1,8,1 -11746,5,6,0,5,2,2,4,3,3,1,17,11,2,2,0 -11747,0,5,0,1,2,4,3,3,3,0,4,20,5,26,0 -11748,2,2,0,1,0,4,0,2,1,1,13,1,0,6,0 -11749,10,5,0,15,6,6,5,1,4,0,2,13,3,26,0 -11750,1,4,0,4,2,3,11,4,2,1,6,16,1,0,0 -11751,6,7,0,15,1,4,2,0,2,0,12,7,4,27,1 -11752,5,3,0,3,6,4,9,2,0,0,4,11,4,5,0 -11753,2,5,0,12,3,3,0,1,0,0,15,11,1,9,0 -11754,10,5,0,6,3,4,7,5,0,1,16,1,1,7,1 -11755,4,8,0,15,6,1,7,1,3,1,14,1,2,18,1 -11756,9,7,0,5,2,5,14,4,3,1,4,18,2,40,0 -11757,1,2,0,15,0,1,14,3,0,1,4,18,3,14,0 -11758,6,3,0,12,2,3,1,3,2,0,13,9,3,2,0 -11759,2,3,0,9,3,5,3,5,4,1,7,10,4,9,1 -11760,0,3,0,13,6,0,0,4,4,0,17,11,4,17,0 -11761,1,3,0,4,1,3,11,2,2,1,0,2,0,17,0 -11762,2,7,0,4,0,0,1,3,0,1,2,1,4,34,0 -11763,4,0,0,9,5,5,0,1,3,1,2,4,1,7,0 -11764,2,8,0,13,1,5,14,4,0,1,13,3,3,14,0 -11765,1,3,0,13,0,2,10,5,0,0,13,11,2,1,0 -11766,9,6,0,9,6,2,12,5,2,0,3,1,3,27,1 -11767,1,5,0,0,5,2,7,0,0,0,12,0,1,37,0 -11768,0,5,0,5,6,3,5,3,3,0,13,0,0,3,0 -11769,8,2,0,6,4,6,0,1,2,1,3,0,0,39,0 -11770,10,7,0,11,6,5,1,2,2,1,8,15,3,16,0 -11771,0,1,0,13,2,5,5,4,1,0,2,6,4,29,0 -11772,2,7,0,0,2,1,2,1,3,0,0,17,4,27,0 -11773,8,5,0,4,1,2,0,1,3,1,13,6,0,36,0 -11774,5,0,0,9,5,5,5,0,1,1,11,8,4,33,1 -11775,9,1,0,4,1,5,4,2,0,0,0,9,0,5,0 -11776,7,1,0,3,1,0,13,0,2,0,7,3,5,35,0 -11777,2,0,0,4,0,4,9,4,4,1,8,20,0,32,0 -11778,1,5,0,0,6,1,12,5,0,1,14,0,3,35,1 -11779,2,8,0,13,3,0,0,2,1,0,2,16,2,3,0 -11780,6,3,0,5,6,2,7,0,2,0,18,10,4,21,1 -11781,9,6,0,5,1,4,8,2,4,1,8,3,3,0,0 -11782,7,5,0,5,2,2,0,0,1,0,2,10,2,1,1 -11783,1,3,0,0,0,6,3,0,3,0,3,12,0,20,0 -11784,7,8,0,5,1,0,0,0,3,1,13,11,5,1,0 
-11785,1,8,0,7,0,3,8,4,0,0,16,7,3,39,0 -11786,10,4,0,12,4,0,13,3,4,0,1,7,1,24,1 -11787,3,8,0,10,4,2,0,3,2,1,8,3,3,17,0 -11788,1,7,0,13,0,5,0,0,0,1,17,15,1,41,0 -11789,7,0,0,7,2,2,3,5,1,0,18,10,3,30,1 -11790,8,6,0,3,1,0,12,5,2,0,9,18,2,17,1 -11791,7,3,0,1,2,0,0,3,4,0,4,3,0,34,0 -11792,8,5,0,1,5,2,2,4,2,1,3,14,4,24,1 -11793,10,8,0,0,2,0,5,2,4,1,4,15,5,13,0 -11794,0,7,0,5,3,0,9,4,0,0,4,13,5,31,0 -11795,9,3,0,7,6,3,8,3,2,0,8,9,0,32,0 -11796,8,8,0,5,5,0,14,5,4,1,17,6,3,11,1 -11797,2,6,0,6,1,5,2,1,2,0,6,13,1,3,0 -11798,1,2,0,0,1,5,9,4,2,1,2,11,0,12,0 -11799,7,8,0,15,1,4,6,2,4,0,6,15,2,1,0 -11800,0,4,0,13,0,3,9,4,4,1,4,20,2,10,0 -11801,6,6,0,3,0,2,5,4,0,0,4,16,1,32,0 -11802,3,1,0,4,3,1,13,4,4,0,5,7,4,24,1 -11803,10,1,0,10,0,0,10,3,2,0,4,9,2,19,0 -11804,9,1,0,9,3,1,11,0,2,1,18,20,5,0,1 -11805,4,4,0,6,6,3,11,4,4,1,2,11,2,22,0 -11806,6,1,0,14,6,5,2,2,3,1,7,11,1,29,0 -11807,3,2,0,1,2,5,11,5,2,1,13,2,1,10,0 -11808,2,0,0,14,5,5,11,0,0,1,13,18,5,14,0 -11809,4,5,0,5,1,5,6,4,2,1,13,12,3,30,0 -11810,9,3,0,11,1,4,3,0,3,1,14,5,4,8,1 -11811,0,8,0,6,0,4,14,2,4,1,2,4,5,19,0 -11812,6,2,0,15,3,5,11,5,1,1,18,10,1,2,1 -11813,6,4,0,6,1,4,0,2,0,1,4,14,2,34,0 -11814,0,0,0,1,5,1,11,1,0,0,5,0,3,26,0 -11815,2,0,0,3,0,5,9,3,3,1,11,11,1,26,0 -11816,1,6,0,0,2,0,6,2,0,1,0,17,4,4,0 -11817,6,8,0,3,6,6,2,1,4,0,10,4,3,3,0 -11818,7,7,0,9,1,4,14,3,2,0,2,6,2,40,0 -11819,1,8,0,6,2,0,14,5,0,0,1,10,1,28,1 -11820,4,1,0,8,3,6,4,2,2,1,9,5,1,17,1 -11821,5,4,0,8,5,6,12,1,0,0,18,5,4,19,1 -11822,5,0,0,15,2,4,7,3,0,0,8,11,5,36,0 -11823,1,4,0,3,5,5,8,0,1,0,17,0,4,1,0 -11824,3,1,0,7,5,0,7,5,1,0,8,13,5,6,0 -11825,10,6,0,0,5,6,8,5,0,0,10,11,2,36,0 -11826,0,6,0,8,3,3,11,0,1,0,4,13,5,38,0 -11827,2,7,0,13,5,1,7,3,3,1,8,2,0,34,0 -11828,7,0,0,12,6,5,9,1,1,0,13,15,0,35,0 -11829,5,8,0,4,6,6,9,2,3,0,2,6,1,32,0 -11830,3,7,0,13,2,6,11,3,2,1,2,18,0,6,0 -11831,8,3,0,2,0,3,9,1,4,0,8,16,2,24,0 -11832,0,7,0,6,0,0,3,0,0,1,4,2,0,9,0 -11833,5,8,0,0,4,0,12,5,2,1,17,2,2,23,0 -11834,3,8,0,15,5,4,12,0,1,0,2,9,3,41,0 
-11835,0,5,0,9,5,2,9,2,2,0,2,11,4,7,0 -11836,10,5,0,11,5,0,14,2,2,0,4,6,5,0,0 -11837,1,1,0,0,5,1,14,1,0,0,13,11,5,20,0 -11838,8,0,0,7,6,3,9,1,1,0,13,15,0,33,0 -11839,6,5,0,15,2,3,9,2,0,0,4,18,5,20,0 -11840,7,6,0,4,0,1,0,4,0,1,14,19,1,5,1 -11841,3,2,0,0,2,3,0,4,1,0,13,11,5,21,0 -11842,3,1,0,9,2,5,12,3,1,0,18,10,0,18,1 -11843,9,1,0,9,1,0,7,4,4,1,11,11,5,6,0 -11844,4,3,0,4,2,2,13,0,4,0,18,14,2,27,1 -11845,1,1,0,6,0,1,14,5,1,1,13,2,0,3,0 -11846,4,4,0,2,0,6,10,2,3,0,13,2,2,41,0 -11847,3,6,0,8,0,1,14,2,3,1,2,11,2,8,0 -11848,4,6,0,8,2,0,5,0,1,1,2,15,2,2,0 -11849,5,0,0,7,3,2,0,4,3,0,15,11,2,21,0 -11850,3,6,0,8,3,6,3,2,4,1,1,19,4,18,1 -11851,10,1,0,3,5,1,9,0,1,0,0,5,4,6,1 -11852,1,6,0,13,0,5,8,0,0,1,10,13,3,3,0 -11853,9,5,0,8,0,2,0,3,1,0,1,5,4,11,1 -11854,0,8,0,4,0,6,2,4,0,0,15,14,2,13,0 -11855,0,4,0,6,0,5,5,2,4,0,5,20,5,28,0 -11856,1,7,0,8,2,5,7,3,1,0,3,11,5,24,0 -11857,2,6,0,9,3,1,4,2,3,1,2,11,0,40,0 -11858,1,2,0,5,4,1,7,1,3,1,17,18,1,16,0 -11859,2,2,0,10,2,3,11,5,0,0,4,9,2,41,0 -11860,9,7,0,1,3,4,6,1,3,1,3,11,0,26,0 -11861,3,6,0,12,0,0,4,2,2,0,2,11,0,4,0 -11862,4,3,0,12,4,5,7,5,0,1,9,20,2,2,1 -11863,3,5,0,12,2,5,3,4,3,0,18,2,0,34,0 -11864,6,6,0,1,5,1,7,0,2,1,11,17,2,23,1 -11865,1,3,0,14,1,1,14,5,3,1,4,13,1,7,0 -11866,5,6,0,12,0,1,3,5,3,0,5,7,5,2,1 -11867,4,4,0,8,4,0,11,3,3,1,18,4,2,31,1 -11868,8,8,0,0,3,0,12,4,4,0,5,7,2,12,1 -11869,9,2,0,13,4,0,4,1,2,1,2,2,4,31,0 -11870,6,1,0,7,5,3,0,0,3,0,13,11,5,27,0 -11871,9,4,0,5,2,6,5,1,2,1,2,9,0,4,0 -11872,3,7,0,8,1,4,8,2,3,1,18,7,0,6,1 -11873,4,1,0,9,5,1,14,5,3,0,2,20,3,29,0 -11874,10,5,0,3,1,6,11,4,4,0,16,1,4,31,1 -11875,3,7,0,11,1,4,3,3,0,1,2,2,0,40,0 -11876,3,4,0,15,5,5,0,3,3,0,6,11,1,10,0 -11877,2,7,0,13,1,4,9,1,4,1,6,2,0,30,0 -11878,1,2,0,10,2,0,7,2,4,1,2,11,2,4,0 -11879,0,2,0,6,0,1,5,3,4,1,15,20,4,6,0 -11880,8,7,0,5,1,4,7,4,3,1,6,6,3,5,0 -11881,1,3,0,2,4,5,8,4,2,0,7,3,4,9,1 -11882,4,5,0,14,6,4,5,2,1,0,1,9,1,39,1 -11883,5,2,0,3,3,4,1,1,4,0,8,13,3,40,0 -11884,1,1,0,5,5,3,8,5,2,0,9,19,5,24,1 
-11885,2,2,0,12,2,3,11,3,1,1,2,1,0,22,0 -11886,0,0,0,3,4,5,6,5,1,1,8,15,0,31,0 -11887,2,8,0,10,2,4,8,2,4,0,6,8,0,41,0 -11888,7,1,0,9,1,4,8,1,3,0,13,11,5,30,0 -11889,1,8,0,14,6,4,14,5,4,0,11,8,5,37,1 -11890,2,1,0,12,2,5,10,4,1,1,11,12,2,41,0 -11891,3,2,0,15,4,0,8,3,4,1,0,0,2,19,0 -11892,5,4,0,3,2,3,13,3,0,0,12,2,0,32,0 -11893,6,8,0,2,5,4,2,0,1,1,7,16,5,35,1 -11894,1,4,0,7,1,2,8,0,0,1,6,3,1,2,0 -11895,3,4,0,2,3,4,0,5,0,1,7,16,4,16,1 -11896,0,2,0,15,1,3,9,4,0,1,13,17,2,14,0 -11897,0,8,0,8,0,0,2,3,0,1,17,6,1,5,0 -11898,4,4,0,7,5,0,9,2,4,1,13,13,5,11,0 -11899,4,4,0,1,1,1,13,0,3,0,0,6,4,29,0 -11900,4,6,0,2,6,4,9,5,0,1,14,2,5,16,0 -11901,2,0,0,6,4,5,7,1,2,1,4,2,3,20,0 -11902,0,1,0,6,0,4,0,2,0,0,17,9,3,25,0 -11903,1,8,0,3,0,2,1,3,0,0,13,17,0,0,0 -11904,7,7,0,3,1,0,8,2,3,1,2,11,2,10,0 -11905,6,1,0,9,3,5,5,1,2,0,3,6,0,20,0 -11906,1,6,0,13,6,3,9,0,0,0,8,15,5,14,0 -11907,6,7,0,4,4,0,8,0,3,1,5,8,2,1,1 -11908,10,8,0,5,1,2,4,4,4,1,3,11,0,9,0 -11909,3,4,0,14,5,6,11,1,3,1,11,7,3,39,1 -11910,9,8,0,3,0,4,0,5,1,0,13,16,3,39,0 -11911,6,4,0,5,1,3,14,0,0,1,18,5,5,12,1 -11912,0,5,0,13,4,6,1,1,4,0,8,11,2,14,0 -11913,4,8,0,4,6,3,12,4,1,1,0,0,3,21,0 -11914,7,2,0,7,1,0,12,3,1,0,15,12,1,25,0 -11915,6,8,0,13,5,4,9,3,2,1,17,6,4,36,0 -11916,9,7,0,2,4,4,8,1,0,1,0,13,3,35,0 -11917,4,1,0,11,0,5,7,1,0,1,4,11,0,26,0 -11918,1,2,0,13,1,2,12,3,0,1,8,11,4,39,0 -11919,1,4,0,4,4,0,13,3,0,0,9,10,2,24,1 -11920,1,2,0,7,2,4,10,2,1,0,15,2,5,7,0 -11921,0,2,0,3,6,0,10,3,2,0,4,9,5,0,0 -11922,8,8,0,8,1,3,1,0,2,1,9,3,3,11,1 -11923,6,7,0,1,6,4,11,3,2,1,14,5,4,16,1 -11924,7,2,0,5,6,5,0,2,3,0,0,9,0,5,0 -11925,5,0,0,10,5,5,5,0,1,0,11,20,0,25,0 -11926,2,0,0,9,5,5,14,4,1,1,2,11,2,19,0 -11927,1,5,0,13,2,5,9,3,2,0,17,4,0,26,0 -11928,0,6,0,1,0,2,9,3,0,1,4,2,4,32,0 -11929,3,2,0,13,6,5,9,2,1,1,8,8,0,29,0 -11930,6,8,0,5,1,2,6,3,0,0,16,7,1,28,1 -11931,1,8,0,1,5,6,12,0,4,1,18,12,2,16,1 -11932,0,0,0,1,5,5,2,3,2,0,0,2,2,22,0 -11933,0,5,0,10,2,1,8,5,0,1,16,1,1,13,1 -11934,6,2,0,15,0,1,3,0,3,1,7,17,5,29,1 
-11935,3,6,0,9,1,6,2,0,4,0,7,1,3,40,1 -11936,4,7,0,13,0,1,9,3,0,0,13,18,5,8,0 -11937,0,0,0,9,3,4,6,1,1,1,17,18,5,40,0 -11938,4,0,0,15,3,2,11,3,1,0,17,9,3,28,0 -11939,0,3,0,4,6,3,2,4,2,0,13,18,4,0,0 -11940,0,1,0,4,6,4,6,2,2,1,0,11,3,16,0 -11941,1,5,0,4,5,0,4,2,0,0,0,12,2,30,0 -11942,0,0,0,8,0,4,0,3,0,0,13,2,4,35,0 -11943,0,7,0,7,3,4,13,4,3,1,13,11,1,37,0 -11944,9,2,0,6,6,5,5,0,0,1,13,11,0,1,0 -11945,1,8,0,10,0,3,6,1,4,0,10,2,4,7,0 -11946,2,6,0,8,3,4,3,0,3,1,2,11,4,6,0 -11947,2,6,0,8,3,0,2,1,3,0,10,15,0,27,0 -11948,8,7,0,1,4,2,2,2,0,1,18,10,3,17,1 -11949,9,3,0,15,6,6,8,1,4,0,13,6,0,0,0 -11950,8,0,0,13,0,2,5,2,2,1,0,14,3,4,0 -11951,4,7,0,13,6,6,0,5,1,1,6,5,4,28,1 -11952,7,7,0,0,6,5,3,3,0,0,10,2,2,8,0 -11953,10,1,0,1,1,5,13,2,1,0,12,19,2,13,0 -11954,0,5,0,1,6,4,3,2,2,0,4,18,2,25,0 -11955,1,4,0,0,1,4,12,2,1,1,2,17,5,2,0 -11956,0,3,0,3,0,1,13,3,0,0,6,14,1,19,0 -11957,0,8,0,15,0,0,8,5,4,0,12,6,4,17,0 -11958,7,7,0,11,2,6,10,3,1,1,4,2,2,29,0 -11959,1,6,0,8,2,0,4,3,2,1,2,18,1,9,0 -11960,9,6,0,1,0,0,6,2,2,1,18,7,2,18,1 -11961,0,8,0,14,0,5,9,2,0,0,13,2,1,14,0 -11962,4,7,0,3,0,6,5,3,2,1,8,9,4,30,0 -11963,9,6,0,0,6,0,3,3,3,0,13,17,5,39,0 -11964,1,1,0,2,1,5,5,2,3,0,13,15,3,2,0 -11965,0,0,0,3,3,4,12,5,0,0,13,2,1,20,0 -11966,6,7,0,9,6,0,11,3,3,0,16,3,2,41,0 -11967,6,3,0,1,6,1,13,2,3,0,3,11,4,5,1 -11968,0,6,0,13,0,1,7,1,0,0,8,8,4,31,0 -11969,7,2,0,12,4,4,8,5,2,0,13,12,0,8,0 -11970,2,8,0,11,0,6,8,1,0,0,10,15,2,21,0 -11971,0,5,0,2,6,0,1,0,2,1,13,0,0,35,0 -11972,4,8,0,0,2,6,9,2,3,0,17,9,1,32,0 -11973,5,5,0,7,2,3,14,3,3,0,3,10,5,4,1 -11974,6,2,0,14,5,0,11,4,0,1,11,15,2,38,1 -11975,0,2,0,5,0,6,3,0,3,1,13,20,0,31,0 -11976,2,7,0,10,0,6,14,1,4,0,13,20,0,5,0 -11977,1,8,0,15,0,0,12,4,1,0,0,9,1,37,0 -11978,9,2,0,13,4,4,10,3,2,1,18,14,0,27,1 -11979,0,0,0,11,2,0,9,3,4,1,15,9,0,16,0 -11980,0,3,0,13,0,1,0,0,0,0,10,8,2,13,0 -11981,0,3,0,9,5,5,5,4,1,1,8,16,3,7,0 -11982,3,5,0,3,4,2,2,2,3,0,16,0,5,21,1 -11983,4,1,0,13,2,5,9,3,1,0,8,18,3,7,0 -11984,5,5,0,8,4,1,10,5,1,0,3,11,3,2,1 
-11985,7,2,0,15,6,0,13,0,1,1,13,20,0,36,0 -11986,6,0,0,3,5,5,9,2,0,0,9,20,1,40,0 -11987,2,3,0,11,0,3,0,4,0,0,0,9,0,19,0 -11988,8,3,0,10,6,4,0,2,2,0,17,15,0,31,0 -11989,1,7,0,14,2,5,3,1,4,0,2,13,4,35,0 -11990,3,0,0,3,1,4,13,5,2,0,10,6,1,7,0 -11991,1,6,0,2,1,2,0,0,1,1,2,0,4,3,0 -11992,7,0,0,10,5,2,2,2,4,0,18,17,5,25,1 -11993,5,6,0,10,0,5,10,5,1,0,18,18,2,6,1 -11994,2,0,0,1,0,4,0,4,0,0,12,2,0,28,0 -11995,7,0,0,5,1,5,9,4,1,1,10,2,4,1,0 -11996,6,6,0,2,1,0,0,1,0,0,2,11,2,5,0 -11997,10,5,0,6,6,0,1,4,0,1,13,2,2,35,0 -11998,4,5,0,3,3,3,2,5,3,0,5,4,2,20,1 -11999,8,0,0,15,6,3,4,4,2,1,13,14,2,7,0 -12000,2,7,0,3,2,0,2,1,0,0,15,20,1,4,0 -12001,1,0,0,1,1,4,13,1,1,0,13,9,1,23,0 -12002,4,2,0,15,0,5,14,2,0,1,2,16,5,35,0 -12003,10,3,0,2,3,0,8,5,1,1,18,10,1,10,1 -12004,9,7,0,15,1,3,9,3,3,1,17,3,0,1,0 -12005,3,8,0,4,1,6,4,3,4,1,16,19,0,20,0 -12006,9,5,0,7,0,3,5,1,3,1,11,9,3,21,0 -12007,1,6,0,0,6,4,5,3,3,1,13,15,3,3,0 -12008,2,8,0,11,3,3,0,2,1,1,0,15,0,11,0 -12009,10,0,0,9,4,5,4,5,4,1,18,8,4,1,1 -12010,9,7,0,10,6,1,12,5,1,0,14,12,3,21,1 -12011,0,7,0,2,1,0,12,2,2,1,8,11,0,12,0 -12012,9,7,0,1,1,4,9,1,0,1,0,2,3,9,0 -12013,10,3,0,8,5,4,1,4,2,0,6,0,5,18,0 -12014,9,1,0,1,5,0,11,5,1,1,7,5,5,40,1 -12015,1,3,0,2,4,5,0,2,0,1,17,9,0,22,0 -12016,2,3,0,7,5,6,13,1,0,0,2,6,0,32,0 -12017,3,5,0,1,1,2,9,1,0,1,11,2,0,6,0 -12018,9,1,0,6,6,0,3,5,4,0,16,5,3,31,1 -12019,10,1,0,1,2,6,14,4,2,1,0,7,3,7,1 -12020,3,3,0,4,1,4,6,3,3,0,10,3,4,38,0 -12021,5,2,0,5,5,4,4,0,4,1,10,4,5,14,0 -12022,3,4,0,12,0,4,8,2,4,0,8,3,4,22,0 -12023,8,3,0,6,5,1,13,4,2,0,4,4,4,37,0 -12024,0,6,0,1,0,5,12,1,3,1,2,18,4,28,0 -12025,2,0,0,14,0,6,5,2,3,1,13,2,1,33,0 -12026,7,0,0,2,6,6,9,2,3,0,15,15,1,29,0 -12027,8,6,0,14,3,1,11,5,1,0,7,15,4,7,0 -12028,10,2,0,14,2,3,9,4,4,1,8,18,0,10,0 -12029,0,1,0,9,2,5,11,2,2,0,2,8,2,6,0 -12030,4,3,0,8,0,3,13,4,4,0,15,6,0,21,0 -12031,5,4,0,3,3,4,7,1,0,0,8,11,5,3,0 -12032,0,5,0,8,0,2,12,0,2,1,8,9,0,27,0 -12033,6,0,0,6,5,3,11,1,4,0,16,6,1,13,0 -12034,10,0,0,9,6,5,1,4,3,1,2,19,4,14,0 
-12035,7,0,0,8,0,4,0,4,4,1,13,4,4,13,0 -12036,10,6,0,10,2,1,1,4,3,1,18,14,2,29,1 -12037,9,4,0,11,6,0,10,5,1,1,14,7,4,10,1 -12038,2,7,0,13,3,5,9,2,0,1,2,13,3,3,0 -12039,0,0,0,6,4,5,14,3,0,0,4,14,0,19,0 -12040,0,1,0,14,4,5,6,1,4,0,13,17,5,7,0 -12041,1,6,0,3,6,5,2,3,3,1,6,3,4,41,0 -12042,9,2,0,4,4,6,11,0,3,0,1,10,5,2,1 -12043,0,4,0,14,1,0,1,5,2,0,6,13,4,38,0 -12044,4,8,0,5,1,0,13,5,0,1,13,2,0,0,0 -12045,2,8,0,8,6,1,2,5,1,1,5,7,0,9,1 -12046,2,4,0,3,3,2,6,3,1,1,8,11,5,37,0 -12047,5,3,0,5,1,1,5,0,1,1,16,4,3,1,0 -12048,4,4,0,4,2,0,10,3,4,0,12,20,4,12,0 -12049,0,5,0,7,5,1,9,1,4,0,11,13,0,16,0 -12050,0,3,0,15,0,1,11,5,3,1,10,10,0,12,1 -12051,7,0,0,2,3,3,2,0,4,1,9,3,0,16,1 -12052,8,2,0,11,1,5,14,1,3,1,6,6,5,13,0 -12053,0,3,0,13,2,4,9,0,1,1,8,2,4,23,0 -12054,7,3,0,9,0,1,7,5,1,1,6,15,4,7,1 -12055,0,7,0,6,5,5,7,2,2,1,6,2,3,18,0 -12056,7,2,0,14,2,6,10,5,1,1,9,19,1,5,1 -12057,10,6,0,9,2,1,5,3,4,1,11,4,0,2,1 -12058,0,3,0,4,4,4,9,5,2,0,2,6,2,35,0 -12059,9,5,0,3,4,6,2,5,4,1,12,1,2,11,1 -12060,5,0,0,11,4,4,9,3,2,0,5,11,0,5,0 -12061,6,2,0,15,0,6,9,4,1,1,0,0,0,34,0 -12062,9,8,0,6,3,1,7,2,3,0,14,19,3,12,1 -12063,0,2,0,5,0,1,9,2,1,1,2,20,3,20,0 -12064,2,3,0,3,4,2,5,3,0,0,10,2,4,4,0 -12065,7,6,0,15,1,6,2,3,1,1,1,15,1,28,0 -12066,3,7,0,6,2,4,8,3,3,1,4,9,0,2,0 -12067,8,7,0,15,6,5,7,5,0,0,18,0,2,36,1 -12068,1,6,0,11,0,6,13,5,0,0,2,15,0,38,0 -12069,4,6,0,6,6,4,14,3,2,0,13,9,4,26,0 -12070,0,3,0,12,6,6,12,0,0,0,11,13,2,2,1 -12071,1,8,0,11,2,2,0,1,1,1,18,7,3,25,1 -12072,9,7,0,5,0,0,11,5,4,0,2,11,3,0,0 -12073,3,6,0,14,0,5,4,5,4,0,2,17,5,29,0 -12074,0,0,0,13,2,0,7,0,4,1,12,9,0,34,0 -12075,9,4,0,13,1,5,9,3,3,0,17,11,3,25,0 -12076,3,5,0,15,1,0,11,4,0,0,13,13,0,34,0 -12077,1,2,0,11,1,3,9,4,0,1,6,2,1,41,0 -12078,1,5,0,10,6,1,8,1,4,0,11,19,1,23,1 -12079,0,8,0,3,2,5,8,4,1,1,13,9,4,19,0 -12080,8,7,0,8,1,0,9,3,3,0,17,15,2,13,0 -12081,1,4,0,1,0,3,6,1,4,0,3,0,5,38,0 -12082,3,0,0,2,1,6,2,2,0,0,4,13,1,26,0 -12083,7,8,0,14,2,2,7,5,1,0,4,6,2,6,0 -12084,3,8,0,8,2,1,8,4,2,0,17,0,1,24,0 
-12085,6,6,0,4,6,3,6,1,4,0,7,7,5,1,1 -12086,9,0,0,15,5,1,3,1,0,0,13,13,5,10,0 -12087,9,6,0,5,4,2,6,0,3,0,9,12,3,6,0 -12088,5,6,0,11,3,6,11,3,1,0,12,2,5,40,0 -12089,8,5,0,6,1,1,9,5,1,0,6,7,3,39,1 -12090,3,8,0,3,1,5,4,4,0,0,13,6,0,28,0 -12091,10,5,0,9,3,1,5,0,0,0,15,15,1,27,0 -12092,10,2,0,8,2,1,9,5,3,1,9,7,0,11,1 -12093,10,4,0,0,1,4,3,4,0,0,8,11,0,6,0 -12094,4,3,0,6,2,1,9,2,2,0,8,2,1,39,0 -12095,3,4,0,12,1,0,2,4,1,0,5,11,4,25,0 -12096,0,3,0,6,2,5,0,5,4,1,13,9,4,12,0 -12097,6,0,0,3,1,2,9,3,3,0,8,18,1,3,0 -12098,3,7,0,4,2,0,2,5,1,1,8,0,5,22,0 -12099,8,8,0,1,6,5,11,3,1,1,2,3,4,29,0 -12100,3,3,0,14,6,0,14,1,0,0,17,9,2,6,0 -12101,2,6,0,12,6,5,8,4,4,0,11,2,2,29,0 -12102,0,8,0,2,4,1,11,1,0,0,16,12,1,5,0 -12103,1,7,0,5,0,0,2,3,1,1,13,13,2,39,0 -12104,0,1,0,13,0,0,2,3,1,0,13,20,1,21,0 -12105,5,0,0,10,4,3,2,5,0,0,17,5,4,5,1 -12106,3,4,0,14,3,6,12,3,0,1,10,14,5,23,0 -12107,9,5,0,5,3,5,5,2,1,0,7,11,0,16,0 -12108,3,6,0,9,2,1,11,4,4,1,1,10,4,32,1 -12109,0,6,0,9,1,0,9,4,3,0,2,9,0,16,0 -12110,3,4,0,0,2,4,3,1,3,1,13,15,0,40,0 -12111,5,7,0,4,1,4,1,0,2,1,13,11,4,41,0 -12112,9,6,0,12,0,4,5,3,2,1,6,0,5,27,0 -12113,2,5,0,14,6,5,1,3,3,1,11,9,1,17,0 -12114,8,3,0,4,4,2,4,5,1,0,5,8,4,32,1 -12115,0,7,0,1,1,4,10,0,2,0,6,20,3,23,0 -12116,1,0,0,11,6,0,5,4,1,1,17,16,1,18,0 -12117,5,1,0,12,1,2,1,3,1,1,18,17,4,8,1 -12118,0,6,0,4,1,3,12,4,3,0,2,20,4,22,0 -12119,2,8,0,7,2,0,13,3,1,0,8,6,1,8,0 -12120,1,4,0,0,5,5,1,2,2,0,12,18,0,6,0 -12121,4,3,0,15,0,0,3,2,4,0,6,2,4,19,0 -12122,5,4,0,1,2,0,6,0,2,0,4,20,4,0,0 -12123,0,8,0,10,6,4,5,5,3,0,12,11,4,18,0 -12124,8,4,0,6,0,6,3,3,4,1,2,18,1,23,0 -12125,6,3,0,10,4,4,11,3,4,1,18,5,1,29,1 -12126,7,2,0,4,4,1,13,2,4,0,5,18,5,18,1 -12127,1,2,0,9,6,6,12,3,1,0,4,8,4,26,0 -12128,10,6,0,0,1,1,4,2,0,1,10,2,0,30,0 -12129,0,7,0,10,5,4,9,4,0,0,8,11,3,12,0 -12130,10,0,0,3,3,1,9,2,3,0,9,13,5,27,0 -12131,3,5,0,5,2,2,0,4,4,1,15,11,5,21,0 -12132,8,2,0,12,0,6,5,3,1,1,8,6,1,13,0 -12133,8,4,0,2,4,0,11,5,4,1,18,8,3,35,1 -12134,4,8,0,10,3,6,14,4,4,1,14,16,3,28,1 
-12135,5,7,0,15,6,1,6,5,2,1,18,7,3,12,1 -12136,1,3,0,3,0,3,2,3,3,1,14,15,5,10,0 -12137,3,5,0,14,4,5,6,3,3,1,18,14,4,12,1 -12138,3,3,0,13,3,4,5,4,0,1,17,2,1,35,0 -12139,4,6,0,13,4,5,8,1,0,0,8,17,0,1,0 -12140,10,6,0,7,0,3,10,0,2,1,2,6,4,35,0 -12141,6,6,0,15,6,4,0,3,3,1,2,4,3,27,0 -12142,1,8,0,10,6,5,0,1,2,0,4,17,2,36,0 -12143,5,6,0,6,0,1,11,5,4,0,9,10,4,1,1 -12144,4,8,0,6,5,6,4,1,4,1,10,11,0,9,0 -12145,2,3,0,3,6,1,11,3,3,0,7,11,5,2,1 -12146,1,1,0,0,2,6,0,2,2,1,13,20,5,40,0 -12147,2,5,0,3,4,3,4,2,4,1,4,13,4,1,0 -12148,1,8,0,6,3,4,3,4,4,1,17,2,0,28,0 -12149,2,6,0,14,4,5,10,1,0,1,2,17,0,19,0 -12150,10,7,0,9,0,3,3,4,2,1,16,8,4,38,1 -12151,10,5,0,15,6,0,9,4,0,0,8,13,2,27,0 -12152,10,4,0,4,1,0,4,1,3,0,14,1,5,30,1 -12153,2,6,0,9,2,1,3,0,1,0,2,2,0,19,0 -12154,2,0,0,2,5,1,7,1,4,0,6,11,5,8,0 -12155,8,2,0,14,1,5,1,3,2,0,8,8,5,39,1 -12156,3,7,0,4,5,4,4,4,2,1,13,6,0,26,0 -12157,9,7,0,3,2,5,1,0,0,0,10,6,0,38,0 -12158,1,8,0,0,6,0,3,0,3,1,5,1,2,4,1 -12159,8,2,0,14,0,0,7,3,4,1,0,18,0,11,0 -12160,6,3,0,3,4,1,11,4,0,1,14,7,4,18,1 -12161,4,8,0,4,4,4,7,4,3,1,12,13,4,14,0 -12162,10,8,0,11,5,0,13,0,1,1,3,10,4,14,1 -12163,9,3,0,5,0,4,3,4,2,1,10,15,0,19,0 -12164,0,6,0,15,2,5,3,4,3,1,6,11,2,41,0 -12165,9,8,0,12,5,6,2,4,1,0,11,14,5,5,1 -12166,9,8,0,14,6,3,8,4,3,1,17,19,3,0,0 -12167,4,6,0,11,0,1,11,4,4,0,2,11,1,18,0 -12168,0,2,0,3,6,3,6,4,1,1,3,13,1,41,0 -12169,1,1,0,6,5,3,1,0,4,1,14,3,4,22,1 -12170,7,4,0,5,6,0,3,1,0,0,3,11,2,36,0 -12171,0,3,0,12,0,5,8,5,0,1,13,11,1,7,0 -12172,0,7,0,3,4,4,3,3,0,0,10,9,1,33,0 -12173,7,0,0,7,3,6,10,2,2,1,6,13,1,8,0 -12174,8,0,0,3,4,1,2,5,2,0,3,10,3,23,1 -12175,2,4,0,5,0,0,12,2,3,1,13,13,3,19,0 -12176,1,6,0,5,1,2,5,3,2,0,16,7,1,1,1 -12177,3,2,0,12,2,0,9,2,0,0,17,7,5,1,0 -12178,6,4,0,4,1,0,7,1,0,1,12,0,5,41,0 -12179,1,4,0,4,6,5,3,4,4,1,13,17,2,4,0 -12180,6,0,0,12,6,5,0,5,0,1,10,2,5,41,0 -12181,10,0,0,11,4,5,8,1,4,1,2,9,2,7,0 -12182,2,0,0,14,4,0,4,3,0,1,1,14,4,2,1 -12183,4,6,0,12,1,6,4,1,1,0,15,13,5,36,0 -12184,6,1,0,8,4,3,6,3,4,1,8,0,4,14,0 
-12185,0,1,0,4,6,0,1,2,1,1,16,13,0,35,0 -12186,5,0,0,5,1,2,11,0,4,0,10,2,2,11,0 -12187,4,8,0,4,2,3,4,5,2,1,8,2,2,25,0 -12188,5,7,0,0,3,1,4,5,1,1,11,5,0,18,1 -12189,0,6,0,13,2,2,7,1,3,1,15,7,2,6,0 -12190,10,0,0,0,6,2,11,2,4,0,13,11,0,27,0 -12191,4,7,0,14,4,1,8,0,3,1,9,8,4,21,1 -12192,8,8,0,6,0,0,5,1,0,1,13,4,0,13,0 -12193,4,6,0,4,5,0,13,2,2,0,12,5,2,34,0 -12194,0,0,0,4,5,0,14,0,3,1,8,4,4,22,0 -12195,9,7,0,11,6,5,9,3,3,0,6,11,0,29,0 -12196,3,1,0,11,3,4,12,0,1,0,14,2,2,8,0 -12197,0,3,0,8,2,0,3,1,4,1,2,5,0,10,0 -12198,7,6,0,12,4,1,13,0,2,1,18,7,2,4,1 -12199,9,8,0,8,5,0,4,5,4,0,14,7,4,3,1 -12200,5,6,0,0,1,1,12,1,2,1,8,11,3,6,0 -12201,2,6,0,5,2,5,13,4,3,1,2,11,0,27,0 -12202,2,8,0,8,3,5,9,2,0,1,11,18,1,36,0 -12203,0,0,0,6,6,6,4,4,0,0,13,13,1,17,0 -12204,0,7,0,13,0,2,5,3,2,1,13,13,1,31,0 -12205,7,5,0,5,4,6,4,5,0,1,4,18,1,33,0 -12206,5,3,0,10,6,6,12,5,4,0,7,17,5,12,1 -12207,0,0,0,2,0,4,2,1,2,1,2,6,3,16,0 -12208,10,1,0,6,6,2,2,5,0,0,9,10,1,38,1 -12209,5,3,0,0,5,5,2,1,4,0,17,2,1,22,0 -12210,2,5,0,15,6,6,5,0,0,0,8,7,1,10,0 -12211,7,0,0,6,1,5,5,4,4,0,2,13,0,5,0 -12212,6,2,0,12,2,0,4,3,3,1,10,11,0,16,0 -12213,8,1,0,4,5,6,3,3,4,1,6,4,2,37,0 -12214,5,0,0,7,5,1,4,2,4,0,18,7,2,29,1 -12215,1,6,0,4,3,5,5,2,0,1,8,2,1,26,0 -12216,9,7,0,13,3,1,8,5,0,1,11,1,5,31,1 -12217,8,0,0,9,1,5,2,4,0,1,1,14,3,18,1 -12218,10,0,0,10,3,5,4,4,0,1,18,5,5,30,1 -12219,2,7,0,1,0,3,9,0,3,1,8,9,3,12,0 -12220,1,2,0,15,0,6,9,4,1,0,15,9,5,38,0 -12221,1,8,0,5,0,0,10,0,3,1,2,6,5,10,0 -12222,9,1,0,14,0,0,10,1,4,0,4,2,1,24,0 -12223,3,8,0,12,2,2,13,0,1,1,8,9,4,6,0 -12224,2,6,0,4,3,4,10,3,3,1,13,11,3,34,0 -12225,8,6,0,6,4,1,9,1,3,0,16,10,3,17,1 -12226,9,1,0,7,4,1,7,0,4,1,18,13,3,23,1 -12227,0,8,0,8,6,0,10,0,0,0,15,15,0,12,0 -12228,10,3,0,9,5,5,9,4,4,0,4,8,4,29,0 -12229,2,5,0,2,6,4,8,0,3,1,5,2,1,33,0 -12230,6,7,0,13,2,5,3,2,1,1,7,9,5,9,0 -12231,5,2,0,0,4,3,8,1,4,1,18,7,4,20,1 -12232,8,5,0,3,5,4,2,0,0,0,4,2,2,8,0 -12233,5,1,0,10,1,1,4,3,4,1,7,20,2,19,1 -12234,5,3,0,12,1,4,10,5,0,1,17,0,2,4,0 
-12235,1,5,0,13,6,6,5,3,0,1,0,9,4,41,0 -12236,4,2,0,7,1,1,14,2,2,0,2,14,4,18,0 -12237,9,4,0,2,6,4,5,3,2,0,13,6,2,0,0 -12238,2,2,0,2,3,0,2,2,0,0,13,20,1,40,0 -12239,0,0,0,3,5,3,2,4,0,1,4,3,3,3,0 -12240,0,2,0,10,2,2,14,5,3,1,14,14,3,8,1 -12241,2,6,0,15,3,5,8,2,3,1,10,16,1,18,0 -12242,2,2,0,1,2,3,9,2,3,0,13,15,5,6,0 -12243,1,0,0,1,0,3,5,2,0,0,4,2,4,6,0 -12244,5,1,0,15,3,6,6,3,0,0,17,9,2,3,0 -12245,2,1,0,9,0,0,2,1,3,0,8,0,2,28,0 -12246,0,0,0,5,0,1,12,0,4,0,2,9,3,29,0 -12247,2,2,0,3,0,0,5,3,2,0,6,20,5,27,0 -12248,7,6,0,3,0,5,8,3,3,1,4,13,1,32,0 -12249,1,6,0,5,0,0,9,2,4,1,9,13,2,7,0 -12250,9,2,0,13,5,4,14,4,2,1,10,11,1,19,0 -12251,7,8,0,7,6,5,2,5,0,1,12,11,0,13,0 -12252,4,6,0,11,6,0,14,1,0,0,17,20,0,23,0 -12253,3,3,0,6,6,5,13,1,0,1,12,0,2,19,0 -12254,3,8,0,6,6,0,2,1,3,1,13,9,0,40,0 -12255,3,4,0,12,4,5,0,2,0,1,8,11,5,28,0 -12256,1,7,0,3,4,3,4,3,1,0,17,2,4,8,0 -12257,0,3,0,8,2,1,4,1,1,0,2,9,4,31,0 -12258,1,2,0,2,1,5,13,4,0,0,6,20,3,6,0 -12259,8,1,0,8,0,1,1,1,1,1,4,12,0,17,0 -12260,4,6,0,0,0,0,4,5,2,1,9,12,2,20,1 -12261,1,3,0,6,6,0,3,0,1,0,2,2,5,34,0 -12262,8,1,0,15,2,0,10,3,4,0,2,18,0,5,0 -12263,2,2,0,11,6,5,9,1,1,0,2,15,5,4,0 -12264,10,8,0,2,6,1,6,3,1,0,1,8,2,39,1 -12265,2,3,0,13,3,3,3,4,0,1,17,9,1,14,0 -12266,5,7,0,14,0,4,4,2,2,1,2,5,5,35,0 -12267,7,6,0,2,3,0,9,0,2,0,13,6,0,29,0 -12268,3,7,0,2,2,4,1,5,3,1,2,2,0,14,0 -12269,10,2,0,12,1,2,1,4,4,0,9,19,3,9,1 -12270,6,7,0,8,1,4,13,0,0,0,13,19,3,14,0 -12271,4,2,0,10,6,6,4,5,1,0,5,5,5,22,1 -12272,9,0,0,3,0,6,14,4,0,1,3,11,5,26,0 -12273,10,7,0,7,4,5,2,0,3,0,12,8,1,3,1 -12274,1,7,0,5,0,0,7,3,1,0,10,6,1,24,0 -12275,0,3,0,15,0,5,5,1,1,0,0,0,1,12,0 -12276,8,2,0,8,3,1,4,0,0,1,0,5,5,30,1 -12277,7,7,0,7,0,2,2,1,2,1,6,9,2,12,1 -12278,5,6,0,5,0,1,2,0,4,1,10,20,1,27,1 -12279,9,7,0,2,1,3,3,1,4,0,17,11,4,27,0 -12280,6,2,0,2,2,1,7,1,0,1,2,14,0,22,0 -12281,4,4,0,3,6,0,8,0,1,1,13,15,0,29,0 -12282,0,1,0,5,2,1,10,1,3,1,5,18,2,21,1 -12283,0,6,0,13,3,1,3,1,3,1,6,19,1,8,0 -12284,6,4,0,5,3,1,4,2,2,1,18,10,2,29,1 
-12285,8,7,0,14,4,5,7,3,0,1,5,3,0,9,1 -12286,9,8,0,10,1,1,9,3,1,1,13,6,3,4,0 -12287,0,1,0,15,4,1,14,4,1,0,0,6,4,32,0 -12288,1,1,0,15,3,5,14,5,4,0,4,9,1,41,0 -12289,4,1,0,0,2,1,4,3,0,1,9,8,4,2,1 -12290,6,6,0,15,3,0,8,4,4,1,7,2,3,27,0 -12291,2,7,0,0,2,6,7,0,3,1,8,16,1,37,0 -12292,1,1,0,0,4,5,7,3,3,1,12,0,0,12,0 -12293,1,0,0,12,4,2,4,5,1,1,18,17,1,22,1 -12294,1,2,0,10,5,3,11,4,1,1,17,13,3,17,0 -12295,2,0,0,10,2,5,5,2,4,0,16,2,4,40,0 -12296,5,4,0,7,4,0,5,3,3,0,11,16,4,24,0 -12297,4,5,0,5,5,0,7,1,0,0,10,13,2,14,0 -12298,0,7,0,15,0,6,1,4,2,1,13,13,2,9,0 -12299,8,2,0,11,3,0,5,4,2,1,10,11,3,21,0 -12300,1,6,0,8,2,2,14,1,3,1,18,1,0,20,1 -12301,2,1,0,12,3,5,10,0,2,1,14,19,2,8,1 -12302,0,6,0,3,5,5,7,5,1,0,8,6,0,16,0 -12303,9,6,0,2,4,5,14,1,1,0,1,6,3,0,0 -12304,9,4,0,5,6,2,12,2,4,1,2,15,3,28,0 -12305,0,3,0,13,1,0,13,1,4,0,17,2,4,7,0 -12306,2,7,0,2,1,6,6,5,0,0,14,10,3,31,1 -12307,3,1,0,2,1,3,5,0,2,0,12,3,5,39,0 -12308,7,2,0,5,3,0,1,1,0,0,15,15,4,9,0 -12309,10,8,0,9,0,5,7,1,3,0,8,6,3,2,0 -12310,7,2,0,13,5,4,1,2,0,0,4,11,2,34,0 -12311,7,3,0,0,3,5,13,5,2,0,3,10,4,0,1 -12312,3,2,0,15,4,5,9,1,4,1,9,4,3,32,1 -12313,7,1,0,13,4,1,12,5,4,1,18,5,5,16,1 -12314,3,4,0,0,0,0,8,5,2,1,4,13,3,7,0 -12315,6,7,0,15,0,1,3,5,0,1,9,12,1,22,1 -12316,7,3,0,4,0,0,4,2,1,1,6,15,3,33,0 -12317,5,8,0,6,6,4,6,3,1,0,4,17,1,38,0 -12318,0,3,0,6,5,6,8,5,2,0,4,11,2,21,0 -12319,0,3,0,8,6,6,3,2,2,1,13,4,1,31,0 -12320,6,2,0,0,3,1,13,1,1,1,6,13,2,28,1 -12321,5,1,0,8,2,5,3,0,1,1,8,11,1,1,0 -12322,2,2,0,10,4,0,1,3,3,0,18,4,2,11,1 -12323,2,2,0,12,4,2,3,1,0,1,9,17,2,3,1 -12324,5,1,0,11,5,4,12,3,2,1,4,20,0,27,0 -12325,2,4,0,3,5,3,8,0,4,1,7,11,1,4,0 -12326,0,6,0,11,1,5,1,5,1,1,10,9,0,8,0 -12327,0,4,0,13,0,4,3,0,0,0,2,16,5,8,0 -12328,8,4,0,3,0,0,8,1,4,0,10,11,3,14,0 -12329,8,8,0,14,3,2,6,3,0,1,13,10,3,7,0 -12330,4,6,0,8,0,4,12,2,4,1,6,18,5,26,0 -12331,5,5,0,11,4,2,12,2,3,0,14,7,1,11,1 -12332,9,0,0,10,1,2,7,0,1,1,11,1,3,31,1 -12333,5,7,0,2,1,3,1,3,4,0,13,11,2,20,0 -12334,2,2,0,1,6,0,9,4,4,1,4,2,0,26,0 
-12335,10,3,0,11,1,1,10,1,1,1,9,5,5,14,1 -12336,7,8,0,3,5,3,1,3,1,0,2,2,5,37,0 -12337,10,2,0,12,3,0,8,5,2,1,18,0,4,30,1 -12338,9,6,0,6,0,3,14,2,3,1,13,1,3,21,0 -12339,3,6,0,9,0,3,14,2,0,1,13,0,3,29,0 -12340,1,8,0,5,0,2,7,4,4,0,17,9,1,34,0 -12341,6,3,0,9,1,4,14,4,3,1,13,2,5,9,0 -12342,0,0,0,0,1,5,6,0,3,1,2,19,5,35,0 -12343,2,2,0,8,1,0,5,0,1,0,13,11,5,18,0 -12344,2,1,0,2,2,1,9,1,2,1,13,4,0,11,0 -12345,1,6,0,8,6,3,8,5,2,1,0,17,0,14,0 -12346,9,5,0,3,0,1,10,4,0,1,8,13,0,14,0 -12347,7,6,0,1,0,3,3,4,4,1,4,15,5,2,0 -12348,8,8,0,9,3,0,4,3,4,0,5,3,1,2,1 -12349,1,8,0,8,0,0,11,1,1,0,4,11,2,37,0 -12350,7,6,0,7,3,1,4,0,0,0,1,7,0,35,1 -12351,1,3,0,3,6,4,12,4,3,1,16,7,1,35,0 -12352,0,8,0,5,0,0,6,0,4,1,2,15,0,24,0 -12353,4,6,0,7,0,0,0,3,3,0,8,2,1,31,0 -12354,2,2,0,4,4,1,5,4,0,1,11,15,0,39,0 -12355,0,0,0,10,1,2,4,0,3,1,1,1,5,35,1 -12356,9,8,0,2,4,1,8,5,3,1,9,7,3,23,1 -12357,8,4,0,3,0,1,10,5,0,1,4,10,3,23,1 -12358,0,5,0,1,6,5,13,0,0,0,13,6,0,22,0 -12359,5,6,0,5,5,5,6,3,4,0,10,2,2,6,0 -12360,6,3,0,4,6,5,8,3,2,0,17,4,3,27,0 -12361,7,0,0,0,0,2,7,0,4,1,2,2,3,17,0 -12362,0,7,0,5,4,0,3,0,3,1,18,16,5,19,1 -12363,1,6,0,15,4,3,8,3,4,1,9,1,5,26,0 -12364,8,3,0,1,4,1,7,2,3,1,18,19,5,13,1 -12365,9,4,0,2,3,3,13,0,0,0,12,8,0,2,1 -12366,5,8,0,12,0,1,2,1,3,1,5,6,3,27,0 -12367,9,3,0,5,1,5,3,4,0,1,8,18,1,8,0 -12368,5,3,0,5,4,6,2,2,2,1,7,17,0,23,1 -12369,6,1,0,15,3,1,3,2,4,0,7,7,1,11,1 -12370,3,1,0,11,6,5,12,2,2,1,2,2,1,11,0 -12371,2,2,0,5,6,5,11,1,1,1,13,0,5,13,0 -12372,4,3,0,15,1,6,13,1,0,1,13,0,0,31,0 -12373,6,8,0,13,5,4,9,0,0,0,13,3,5,37,0 -12374,1,0,0,11,3,3,2,3,4,1,5,13,0,26,0 -12375,2,0,0,1,0,3,8,3,1,1,13,18,5,0,0 -12376,2,0,0,2,0,4,12,3,4,0,2,9,5,40,0 -12377,9,3,0,11,0,2,0,4,1,0,16,9,1,11,0 -12378,9,6,0,15,3,6,14,5,3,0,14,7,3,35,1 -12379,3,5,0,4,2,0,9,2,2,1,16,3,0,7,0 -12380,0,7,0,11,2,5,12,5,2,1,13,12,3,33,0 -12381,0,3,0,9,2,2,14,2,1,0,9,15,0,0,0 -12382,5,2,0,4,5,0,2,0,3,0,6,16,1,4,1 -12383,3,8,0,15,1,0,13,5,4,1,7,19,4,1,1 -12384,8,7,0,5,5,1,0,3,3,1,9,7,2,29,1 
-12385,3,7,0,1,6,5,2,3,2,0,2,2,3,27,0 -12386,0,6,0,0,5,0,5,2,0,1,0,10,2,22,0 -12387,8,2,0,7,5,3,8,1,1,0,18,5,5,16,1 -12388,4,5,0,5,0,5,5,1,0,1,12,2,0,26,0 -12389,2,6,0,14,3,5,3,0,1,0,8,15,0,20,0 -12390,9,7,0,6,0,4,1,2,2,0,13,15,3,9,0 -12391,5,0,0,7,2,2,5,0,0,1,11,2,1,27,0 -12392,0,8,0,15,6,5,6,0,1,1,0,16,1,24,0 -12393,10,8,0,12,1,5,5,2,0,1,2,5,1,4,0 -12394,0,8,0,13,4,4,14,5,2,0,7,11,2,3,0 -12395,6,6,0,14,3,4,11,2,2,0,12,1,4,24,0 -12396,8,6,0,3,6,6,10,5,3,0,6,10,1,13,1 -12397,10,6,0,3,1,4,11,1,3,0,7,6,5,27,0 -12398,5,6,0,2,2,0,9,1,3,0,0,16,5,26,0 -12399,3,5,0,10,0,2,3,5,1,0,14,5,5,9,1 -12400,5,8,0,13,2,4,1,0,0,0,8,2,3,20,0 -12401,6,6,0,11,1,1,9,2,2,0,0,9,0,12,0 -12402,2,6,0,12,0,1,5,2,0,0,2,20,0,11,0 -12403,7,8,0,12,4,5,3,4,3,1,2,6,0,10,0 -12404,0,7,0,6,5,5,6,4,3,1,11,0,3,29,0 -12405,9,5,0,10,0,1,3,0,2,0,18,7,1,33,1 -12406,8,0,0,8,5,1,1,3,1,1,18,7,4,32,1 -12407,10,1,0,12,5,5,3,5,3,0,4,18,5,20,1 -12408,6,3,0,15,2,3,4,4,1,0,11,0,4,2,1 -12409,2,8,0,1,6,4,8,4,3,1,4,2,5,31,0 -12410,4,1,0,1,0,6,13,4,1,1,4,2,0,8,0 -12411,8,1,0,10,1,4,1,3,0,0,0,2,2,35,0 -12412,0,2,0,13,2,5,8,4,3,1,10,4,5,8,0 -12413,8,6,0,11,0,0,8,0,4,0,13,0,1,14,0 -12414,3,5,0,14,0,3,13,1,0,1,2,4,5,14,0 -12415,9,4,0,5,1,0,8,3,2,0,8,4,4,16,0 -12416,0,8,0,6,4,6,6,4,1,0,13,20,2,25,0 -12417,1,3,0,3,0,4,14,3,1,1,15,15,3,26,0 -12418,9,3,0,3,6,4,6,4,2,1,10,14,1,41,0 -12419,7,5,0,13,3,5,6,2,2,0,5,10,0,17,0 -12420,7,2,0,0,1,4,14,3,2,0,18,7,3,22,1 -12421,7,0,0,6,2,6,5,5,1,1,10,11,3,40,0 -12422,7,0,0,7,6,4,11,2,1,1,18,10,1,2,1 -12423,9,3,0,0,2,6,3,1,4,0,4,11,5,12,0 -12424,6,0,0,13,2,0,11,4,1,0,12,0,3,13,0 -12425,6,8,0,2,6,4,9,2,2,0,17,2,3,13,0 -12426,9,1,0,12,5,6,0,1,4,1,2,0,3,32,0 -12427,6,7,0,8,2,4,6,0,0,1,16,7,1,31,1 -12428,9,0,0,9,2,3,13,0,2,1,17,2,1,24,0 -12429,0,0,0,3,0,6,14,2,2,1,17,5,1,14,1 -12430,5,0,0,4,0,2,8,1,4,1,10,14,2,13,1 -12431,3,3,0,4,0,1,6,4,3,1,6,2,5,27,0 -12432,2,8,0,14,1,3,6,1,0,0,8,11,1,33,0 -12433,1,8,0,3,5,5,8,1,3,1,6,15,1,11,0 -12434,1,3,0,9,0,5,0,4,4,1,18,18,2,6,0 
-12435,5,4,0,11,0,1,4,5,2,0,10,11,0,26,0 -12436,8,7,0,3,3,2,8,2,3,0,8,12,3,5,0 -12437,0,8,0,1,4,4,12,4,1,1,0,4,0,4,0 -12438,1,0,0,1,6,0,10,1,1,0,6,20,0,27,0 -12439,7,1,0,13,2,0,10,5,4,1,15,11,5,8,1 -12440,6,0,0,2,3,3,11,4,0,0,18,12,0,31,1 -12441,6,0,0,2,0,0,12,5,0,1,6,9,0,30,0 -12442,10,5,0,3,4,0,5,4,0,0,8,6,2,14,0 -12443,4,2,0,15,4,5,2,4,4,1,9,5,4,41,1 -12444,9,2,0,5,1,6,8,4,0,1,13,2,3,0,0 -12445,0,5,0,0,1,0,0,1,2,0,5,11,0,0,0 -12446,3,1,0,13,3,4,6,3,1,0,10,15,1,0,0 -12447,1,7,0,0,2,5,2,4,3,0,8,6,3,20,0 -12448,2,0,0,14,6,2,0,3,1,1,2,11,1,39,0 -12449,4,3,0,15,2,5,5,4,0,1,2,2,2,37,0 -12450,9,7,0,12,2,6,13,3,1,0,2,11,5,11,0 -12451,4,1,0,12,4,4,0,4,0,1,2,15,2,38,0 -12452,2,3,0,9,4,3,10,3,0,0,9,10,5,28,1 -12453,4,8,0,7,6,6,6,3,4,1,18,15,5,17,0 -12454,7,1,0,12,3,1,12,4,1,1,1,14,5,13,1 -12455,3,6,0,13,5,1,7,4,2,0,17,2,0,12,0 -12456,2,2,0,2,4,3,10,0,1,1,13,11,2,28,0 -12457,0,7,0,3,3,3,5,0,2,0,9,18,1,31,0 -12458,4,0,0,2,4,6,5,5,3,0,5,0,5,11,1 -12459,4,3,0,13,1,3,2,0,0,0,13,18,5,28,0 -12460,6,8,0,14,0,1,2,5,0,1,11,8,3,11,1 -12461,6,3,0,11,6,0,8,5,0,1,18,8,2,14,1 -12462,7,1,0,1,2,3,2,1,3,0,4,18,4,1,0 -12463,0,5,0,1,6,4,2,3,1,0,4,11,3,4,0 -12464,3,4,0,3,4,0,7,5,3,1,13,14,5,6,0 -12465,0,7,0,14,0,1,8,4,3,0,10,6,2,22,0 -12466,9,6,0,10,5,2,3,1,1,0,2,11,2,35,0 -12467,10,7,0,10,3,0,8,4,1,1,2,13,1,0,0 -12468,9,8,0,6,5,6,12,4,4,1,2,4,4,3,0 -12469,4,8,0,12,2,5,9,4,2,1,6,18,0,25,0 -12470,9,8,0,13,4,5,6,3,0,0,16,2,5,6,0 -12471,0,3,0,3,5,6,7,1,3,1,2,11,5,6,0 -12472,7,5,0,10,1,4,13,5,2,0,18,1,0,16,1 -12473,10,1,0,13,0,6,7,1,0,1,10,9,5,10,0 -12474,0,6,0,2,1,6,9,1,4,1,2,2,1,8,0 -12475,1,2,0,4,6,5,2,2,2,0,13,2,1,36,0 -12476,4,0,0,6,4,4,6,2,3,1,2,9,4,26,0 -12477,4,0,0,11,2,2,8,5,3,0,13,15,0,2,0 -12478,10,7,0,9,4,5,3,2,0,0,13,2,0,34,0 -12479,3,7,0,10,0,0,14,1,4,0,0,6,0,17,0 -12480,8,1,0,11,4,6,4,1,3,0,7,8,2,27,1 -12481,5,0,0,9,5,0,12,2,0,1,13,10,3,4,0 -12482,1,8,0,0,6,4,13,4,0,1,2,15,0,30,0 -12483,2,7,0,12,2,3,14,0,3,0,8,13,3,23,0 -12484,2,0,0,4,0,0,7,1,2,0,2,2,0,31,0 
-12485,8,8,0,11,4,2,4,5,0,1,14,10,2,0,1 -12486,6,5,0,15,5,0,6,0,2,0,2,2,2,6,0 -12487,3,7,0,14,0,0,4,5,0,1,8,6,3,10,0 -12488,6,6,0,4,5,3,3,4,0,0,8,15,1,22,0 -12489,1,6,0,0,2,3,13,0,3,0,13,6,2,13,0 -12490,9,1,0,7,5,5,5,5,4,0,13,4,2,8,0 -12491,6,2,0,14,0,5,5,5,0,1,12,13,2,6,0 -12492,0,8,0,15,3,4,9,2,1,0,17,19,4,18,0 -12493,0,4,0,13,6,4,8,5,0,0,4,9,0,16,0 -12494,6,7,0,11,4,4,12,4,4,0,2,6,5,34,0 -12495,6,4,0,11,5,5,5,2,2,0,12,13,0,41,0 -12496,8,6,0,5,0,4,1,1,2,1,11,16,4,31,0 -12497,2,2,0,2,3,5,11,4,0,1,4,11,5,9,0 -12498,4,7,0,6,3,5,9,1,2,1,13,15,3,3,0 -12499,6,1,0,1,3,6,9,2,4,1,10,2,3,10,0 -12500,0,7,0,2,0,5,5,1,4,1,3,15,4,40,0 -12501,6,5,0,8,3,6,0,1,4,1,18,7,0,7,1 -12502,0,1,0,12,6,1,10,5,1,0,13,20,4,35,0 -12503,3,2,0,15,2,5,9,1,3,0,4,14,1,27,0 -12504,10,1,0,5,2,0,4,4,0,0,2,3,0,6,0 -12505,9,7,0,3,1,2,11,0,0,1,16,11,4,33,0 -12506,1,6,0,9,0,5,7,2,0,1,13,6,5,0,0 -12507,5,4,0,3,4,4,11,3,3,0,6,18,4,35,0 -12508,7,6,0,15,1,6,11,3,3,1,12,11,5,37,0 -12509,9,5,0,12,6,1,6,2,0,1,16,5,0,14,1 -12510,2,1,0,15,2,4,5,4,0,1,2,13,3,11,0 -12511,8,6,0,3,0,0,13,3,2,0,8,2,2,2,0 -12512,1,3,0,2,2,5,13,0,1,1,13,2,1,29,0 -12513,5,5,0,13,1,6,3,1,1,1,2,18,3,34,0 -12514,6,0,0,4,3,1,14,4,3,1,2,16,4,9,0 -12515,0,2,0,6,2,4,8,4,4,0,8,18,2,24,0 -12516,8,7,0,13,6,0,6,5,0,0,15,2,0,27,0 -12517,0,4,0,11,2,1,6,2,4,1,4,14,1,5,0 -12518,10,5,0,15,3,0,8,1,3,0,10,19,2,5,1 -12519,9,6,0,10,1,6,5,4,1,0,18,7,4,33,1 -12520,10,7,0,6,1,0,0,5,4,0,11,3,5,39,0 -12521,10,2,0,2,3,4,0,0,0,0,7,7,0,14,1 -12522,0,5,0,11,6,5,10,1,3,0,2,3,2,28,0 -12523,6,0,0,7,2,5,14,1,0,1,7,16,2,34,0 -12524,7,7,0,14,4,4,12,1,2,0,9,18,2,5,1 -12525,2,6,0,6,1,5,13,4,1,1,13,7,2,3,0 -12526,7,2,0,0,2,3,0,4,3,1,17,5,4,8,0 -12527,0,3,0,10,3,5,2,5,3,1,18,7,3,2,1 -12528,2,8,0,11,0,0,0,2,2,0,4,13,1,36,0 -12529,8,8,0,7,3,0,6,0,3,1,13,2,3,3,0 -12530,7,3,0,13,0,0,3,3,4,0,7,2,4,40,0 -12531,2,4,0,1,0,2,9,1,1,0,8,12,2,41,0 -12532,10,8,0,13,2,1,8,2,0,1,17,0,1,11,0 -12533,8,0,0,3,6,6,4,3,2,1,8,19,3,5,0 -12534,1,0,0,11,5,5,10,4,0,0,12,14,5,39,0 
-12535,1,5,0,1,3,1,0,3,1,0,8,12,0,26,0 -12536,8,0,0,10,4,0,9,5,2,0,14,10,4,19,1 -12537,6,1,0,12,5,2,8,2,1,0,9,3,4,27,1 -12538,10,4,0,11,4,1,0,2,0,1,10,13,4,0,1 -12539,9,1,0,6,6,6,9,1,0,0,9,17,4,35,1 -12540,7,4,0,4,1,0,9,2,3,1,16,20,2,2,0 -12541,3,6,0,6,2,5,8,1,3,0,8,16,4,19,0 -12542,10,8,0,6,5,6,4,5,2,0,13,2,2,26,0 -12543,4,1,0,4,1,5,7,4,2,1,0,5,2,20,1 -12544,2,0,0,13,1,3,0,1,1,1,0,2,0,9,0 -12545,1,6,0,11,2,0,6,4,1,0,6,6,3,4,0 -12546,1,0,0,5,2,0,2,3,1,0,8,18,3,12,0 -12547,5,2,0,5,1,2,3,0,4,1,2,13,2,37,0 -12548,6,1,0,3,6,3,0,0,4,1,14,0,0,36,0 -12549,1,7,0,14,5,5,14,3,3,0,0,3,1,4,0 -12550,4,2,0,14,6,4,4,0,0,1,2,17,2,2,1 -12551,6,2,0,3,4,4,3,3,0,0,4,0,3,19,0 -12552,5,0,0,3,0,5,3,5,0,0,4,6,1,30,0 -12553,9,5,0,3,4,0,9,4,0,0,15,13,2,6,0 -12554,10,8,0,3,2,4,10,1,2,1,4,15,5,3,0 -12555,0,3,0,12,1,0,2,2,0,1,8,11,0,29,0 -12556,3,8,0,14,3,2,4,4,3,0,2,16,1,34,0 -12557,0,4,0,6,0,4,4,5,1,0,11,9,0,34,0 -12558,5,7,0,10,5,0,5,3,1,1,4,6,1,37,0 -12559,10,1,0,3,3,3,4,5,4,1,5,2,0,25,0 -12560,6,4,0,3,4,1,0,3,4,1,11,1,4,7,1 -12561,1,5,0,1,6,5,2,3,3,0,17,2,0,33,0 -12562,1,6,0,10,4,3,11,5,4,0,6,20,1,24,0 -12563,0,8,0,3,3,6,2,2,0,1,2,2,3,37,0 -12564,9,8,0,3,1,5,4,0,1,0,9,10,1,16,1 -12565,2,2,0,0,0,0,8,4,1,1,11,6,0,16,0 -12566,2,6,0,1,5,1,3,3,1,0,6,20,1,30,0 -12567,3,3,0,9,0,3,1,4,2,0,13,9,4,4,0 -12568,0,1,0,10,2,2,6,0,1,0,6,18,0,27,0 -12569,9,7,0,15,5,5,11,0,1,1,18,5,1,19,1 -12570,2,5,0,15,0,0,5,3,0,0,7,5,5,37,0 -12571,1,5,0,0,2,1,8,3,2,1,10,8,3,3,0 -12572,1,0,0,3,6,6,12,5,1,1,18,19,2,39,1 -12573,8,4,0,14,5,5,3,5,0,0,9,15,5,2,1 -12574,7,1,0,11,0,0,14,3,0,0,4,2,2,38,0 -12575,0,2,0,1,0,2,7,4,2,0,13,16,0,11,0 -12576,3,1,0,2,3,5,8,2,0,0,4,2,3,27,0 -12577,0,4,0,11,4,1,12,3,1,0,2,16,3,30,0 -12578,5,5,0,4,3,6,2,5,3,1,5,10,3,4,1 -12579,6,7,0,1,1,4,7,1,2,0,17,11,4,14,0 -12580,4,1,0,5,6,4,3,4,0,1,18,11,4,31,1 -12581,2,2,0,9,5,2,0,5,2,1,6,11,1,26,0 -12582,4,3,0,2,4,6,5,3,4,1,13,9,5,13,0 -12583,8,6,0,3,1,3,9,4,2,1,2,9,4,1,0 -12584,2,4,0,11,6,5,6,2,2,0,12,16,3,8,0 
-12585,0,8,0,12,0,2,13,0,1,0,12,15,0,10,0 -12586,5,1,0,2,6,6,4,5,2,1,1,17,1,14,1 -12587,3,2,0,7,5,4,3,5,0,0,8,17,2,38,0 -12588,0,7,0,1,6,4,9,4,0,1,17,20,0,33,0 -12589,8,0,0,6,3,4,12,3,3,1,3,11,4,35,0 -12590,7,6,0,7,3,2,14,5,1,0,1,10,0,24,1 -12591,8,1,0,6,6,4,2,2,0,0,1,17,5,23,1 -12592,6,2,0,3,4,1,11,4,4,1,13,9,3,28,0 -12593,2,2,0,2,6,0,7,1,3,1,2,9,0,28,0 -12594,6,7,0,13,0,1,9,4,3,1,8,17,3,37,0 -12595,0,8,0,3,5,2,4,0,0,0,10,11,5,27,0 -12596,7,3,0,12,6,0,1,4,4,0,2,2,4,3,0 -12597,10,8,0,2,3,4,4,3,1,0,14,10,4,41,1 -12598,7,0,0,14,0,0,6,1,2,0,0,14,1,24,1 -12599,0,2,0,1,1,6,6,4,4,0,12,4,5,22,0 -12600,2,6,0,0,6,1,9,2,4,1,13,3,4,4,0 -12601,0,1,0,15,2,0,6,1,0,0,13,2,2,0,0 -12602,10,7,0,5,0,4,5,0,0,1,13,18,3,5,0 -12603,7,5,0,5,6,1,3,1,2,1,11,5,4,0,1 -12604,5,8,0,0,1,3,0,0,1,1,13,4,0,2,0 -12605,4,1,0,14,4,2,2,0,3,1,5,17,1,2,1 -12606,7,2,0,2,4,2,11,1,4,1,3,8,5,38,1 -12607,10,1,0,13,0,4,7,2,3,0,15,0,0,34,0 -12608,6,6,0,0,1,5,13,0,0,0,8,2,2,20,0 -12609,1,6,0,1,6,0,6,1,3,1,0,11,5,12,0 -12610,0,1,0,1,4,4,8,2,3,1,8,15,0,36,0 -12611,3,4,0,14,1,5,12,3,1,0,14,15,1,35,1 -12612,0,7,0,15,1,4,5,2,2,0,5,20,4,26,0 -12613,10,3,0,9,6,3,12,5,3,1,11,17,5,2,1 -12614,3,5,0,11,0,3,1,4,2,1,16,16,5,1,1 -12615,9,6,0,9,2,5,12,5,4,1,13,2,3,23,0 -12616,4,7,0,2,0,0,5,2,1,0,8,15,3,11,0 -12617,1,6,0,2,0,5,6,2,2,0,6,6,4,24,0 -12618,7,6,0,2,3,4,4,2,3,1,1,12,3,29,1 -12619,7,0,0,6,5,6,6,5,1,0,12,11,2,0,0 -12620,10,2,0,8,1,1,4,4,4,1,5,11,1,20,1 -12621,2,4,0,7,1,1,2,3,2,0,8,15,5,37,0 -12622,1,8,0,11,6,3,3,5,4,0,18,1,1,13,1 -12623,3,7,0,1,5,5,8,1,2,1,13,4,0,32,0 -12624,8,8,0,11,5,0,12,2,1,1,0,18,1,25,0 -12625,7,2,0,10,6,5,0,1,1,0,14,16,5,5,1 -12626,8,0,0,10,2,0,10,3,4,0,13,17,2,41,0 -12627,3,3,0,14,0,1,11,4,4,1,4,0,0,4,0 -12628,4,1,0,9,3,6,3,4,2,0,18,1,4,34,1 -12629,10,4,0,8,5,3,6,4,0,0,18,9,0,3,0 -12630,8,0,0,9,3,4,10,5,4,1,9,8,5,18,1 -12631,6,4,0,13,0,5,14,0,0,0,4,12,0,0,0 -12632,3,1,0,10,4,3,10,4,2,1,11,6,2,35,0 -12633,2,0,0,3,1,4,6,3,3,0,7,11,3,33,0 -12634,5,8,0,11,3,6,9,5,2,0,4,11,0,5,0 
-12635,0,3,0,2,1,5,7,1,4,1,2,0,1,41,0 -12636,6,3,0,8,5,5,8,4,1,1,2,10,1,38,0 -12637,4,5,0,8,1,2,11,3,1,0,13,15,1,13,0 -12638,0,7,0,8,6,6,9,3,1,1,2,14,1,23,0 -12639,3,3,0,5,6,4,9,5,0,1,6,3,5,22,1 -12640,0,2,0,9,2,0,8,0,1,0,2,20,1,2,0 -12641,8,1,0,12,3,0,5,1,1,0,2,2,0,2,0 -12642,1,2,0,3,2,0,9,5,3,1,10,6,0,27,0 -12643,0,6,0,9,1,4,10,1,0,0,10,2,2,11,0 -12644,0,0,0,4,1,3,14,1,0,0,2,0,0,28,0 -12645,4,2,0,3,4,0,4,1,2,1,14,12,0,39,1 -12646,0,4,0,12,4,6,10,5,3,1,18,10,0,2,1 -12647,0,8,0,5,0,2,5,5,3,1,8,9,0,41,0 -12648,6,6,0,6,6,5,6,5,0,0,15,3,0,28,0 -12649,2,6,0,11,2,4,14,5,0,1,8,10,3,18,0 -12650,3,6,0,12,2,4,9,3,1,0,8,2,0,40,0 -12651,6,0,0,5,4,1,13,0,3,0,5,13,3,10,1 -12652,0,6,0,5,3,3,6,3,4,0,14,3,0,41,0 -12653,2,8,0,6,2,5,11,3,3,0,16,0,4,3,0 -12654,0,5,0,6,2,3,10,4,2,1,0,15,3,17,0 -12655,0,5,0,10,0,5,8,5,0,1,13,15,5,19,0 -12656,4,2,0,10,4,4,6,2,1,0,17,18,3,39,0 -12657,6,3,0,11,6,3,4,2,4,0,13,11,1,22,0 -12658,2,4,0,1,3,6,11,3,0,1,17,2,3,40,0 -12659,1,3,0,8,1,5,3,4,1,0,17,15,2,4,0 -12660,0,8,0,1,6,4,6,5,1,1,4,13,1,0,0 -12661,2,8,0,11,3,3,2,4,2,0,18,16,5,33,1 -12662,1,0,0,5,2,6,6,0,3,0,2,14,1,0,0 -12663,5,7,0,10,3,0,5,0,0,1,2,11,4,26,0 -12664,10,2,0,10,2,6,6,5,1,0,3,2,2,28,0 -12665,3,5,0,3,5,4,9,5,4,0,15,0,4,31,0 -12666,10,0,0,4,4,5,1,5,0,0,2,2,1,34,0 -12667,1,8,0,9,0,0,12,2,3,0,5,13,4,6,0 -12668,6,3,0,0,0,6,7,3,0,1,2,2,2,17,0 -12669,4,8,0,3,6,5,0,0,0,0,4,11,0,4,0 -12670,10,0,0,15,1,4,2,3,1,1,12,12,1,5,0 -12671,1,4,0,11,0,3,4,4,4,1,4,3,0,28,0 -12672,9,8,0,9,4,2,2,2,3,1,9,17,4,12,1 -12673,0,4,0,10,4,2,0,1,2,0,5,3,0,10,0 -12674,9,2,0,4,2,0,14,2,1,0,18,7,1,12,1 -12675,10,1,0,5,4,5,13,2,4,1,18,1,1,18,1 -12676,0,5,0,6,1,3,5,5,0,1,18,10,5,28,1 -12677,4,7,0,8,5,2,8,4,1,1,1,7,1,10,1 -12678,10,8,0,4,4,5,4,1,0,0,10,6,2,40,0 -12679,7,6,0,13,6,4,5,4,0,1,10,6,2,28,0 -12680,9,5,0,11,5,6,11,0,3,1,6,15,3,28,0 -12681,4,5,0,1,5,0,7,2,3,1,10,2,2,2,0 -12682,5,6,0,13,3,6,14,3,2,0,17,20,5,6,0 -12683,2,7,0,15,3,2,5,3,3,0,12,9,5,23,0 -12684,1,1,0,12,4,5,0,2,4,0,13,2,0,10,0 
-12685,7,0,0,15,4,3,2,3,0,0,8,18,2,10,0 -12686,0,6,0,8,0,0,12,4,2,1,2,0,2,36,0 -12687,8,6,0,7,2,4,0,2,2,1,8,3,2,30,0 -12688,1,3,0,12,0,6,0,1,4,0,13,10,3,22,0 -12689,1,2,0,8,1,2,14,0,4,0,13,2,0,9,0 -12690,6,4,0,12,5,0,11,4,3,1,16,5,5,17,1 -12691,1,7,0,13,6,1,12,2,2,1,2,11,4,5,0 -12692,8,6,0,10,5,2,9,2,2,1,2,20,0,12,0 -12693,2,3,0,5,0,4,1,4,2,0,13,9,1,28,0 -12694,7,8,0,12,4,1,12,4,2,1,14,5,0,5,1 -12695,8,7,0,3,0,3,2,4,3,0,8,15,0,2,0 -12696,8,8,0,0,1,5,6,0,3,1,2,0,0,6,0 -12697,7,0,0,10,2,1,10,5,1,0,7,17,5,8,1 -12698,8,6,0,15,1,2,4,1,2,0,18,8,4,10,1 -12699,9,5,0,15,5,6,3,0,2,0,4,20,1,7,0 -12700,1,6,0,1,3,1,12,0,0,1,6,11,2,11,0 -12701,1,5,0,12,5,2,0,1,2,0,13,2,3,26,0 -12702,8,6,0,14,2,3,8,0,0,0,8,2,0,36,0 -12703,9,0,0,10,2,2,0,4,3,1,15,9,2,17,1 -12704,6,8,0,13,2,0,3,1,1,1,13,2,3,27,0 -12705,8,0,0,11,6,4,10,4,2,1,12,4,1,29,0 -12706,0,0,0,3,4,6,0,1,2,1,13,15,0,40,0 -12707,10,3,0,2,4,2,2,5,4,1,2,15,0,1,0 -12708,2,3,0,3,0,0,3,3,4,1,8,1,5,10,0 -12709,1,6,0,4,0,6,1,1,4,1,13,11,0,2,0 -12710,1,1,0,2,0,0,4,0,1,0,13,9,5,24,0 -12711,1,7,0,6,2,5,6,3,4,0,17,0,5,20,0 -12712,9,5,0,2,3,2,1,5,0,1,9,7,5,39,1 -12713,8,8,0,1,2,6,4,2,4,0,11,7,4,31,1 -12714,0,8,0,15,6,0,3,1,4,1,10,6,0,2,0 -12715,1,6,0,13,3,3,4,0,1,0,13,20,0,30,0 -12716,2,6,0,1,0,1,0,5,1,0,1,9,0,35,0 -12717,3,7,0,1,2,4,5,1,3,1,11,11,2,17,0 -12718,9,8,0,6,1,0,0,0,3,1,16,5,3,14,1 -12719,9,5,0,13,1,3,2,0,4,1,18,7,4,21,1 -12720,10,4,0,10,1,2,12,1,1,1,1,8,4,40,1 -12721,3,5,0,2,0,4,14,0,1,0,4,2,1,0,0 -12722,0,7,0,13,4,4,5,3,4,0,2,0,4,40,0 -12723,4,0,0,4,2,3,14,3,4,0,7,13,4,30,0 -12724,0,3,0,13,0,5,14,4,1,0,13,4,1,41,0 -12725,2,8,0,12,6,0,6,3,4,1,10,6,5,27,0 -12726,8,5,0,0,4,6,10,5,4,1,10,19,0,19,1 -12727,4,0,0,11,4,0,6,1,0,1,3,15,0,27,0 -12728,8,1,0,9,2,6,13,5,1,1,18,7,0,28,1 -12729,6,1,0,13,2,0,12,4,2,1,1,7,1,20,1 -12730,7,8,0,1,2,6,9,0,1,1,2,2,0,6,0 -12731,7,0,0,4,2,5,7,5,4,1,11,5,5,8,1 -12732,5,3,0,10,6,1,12,0,4,0,12,20,0,2,0 -12733,10,6,0,4,6,4,6,0,4,1,4,9,1,14,0 -12734,0,2,0,3,3,3,0,0,3,0,7,2,3,17,0 
-12735,8,8,0,15,6,0,4,0,0,1,8,19,0,39,0 -12736,9,1,0,13,1,3,13,1,4,1,4,2,1,30,0 -12737,8,6,0,14,6,0,14,4,4,0,8,12,5,21,0 -12738,0,7,0,11,4,3,10,1,4,1,6,15,4,18,0 -12739,4,6,0,3,5,0,0,1,1,0,13,0,0,33,0 -12740,3,4,0,6,6,6,14,1,3,1,4,5,1,0,1 -12741,5,3,0,13,6,5,7,5,1,0,14,12,2,41,0 -12742,9,3,0,7,6,6,2,3,4,0,13,9,3,38,0 -12743,5,5,0,5,2,1,9,3,1,0,8,6,0,5,0 -12744,5,4,0,0,3,5,1,3,1,1,11,9,5,40,0 -12745,3,2,0,8,3,4,6,1,0,0,8,13,4,23,0 -12746,4,6,0,12,5,3,14,5,3,1,7,13,2,11,1 -12747,5,5,0,13,5,0,5,3,0,0,13,9,0,12,0 -12748,0,8,0,12,2,3,6,4,0,1,4,19,5,5,0 -12749,3,5,0,13,5,5,9,0,3,0,2,2,0,31,0 -12750,4,5,0,5,6,4,6,5,3,1,18,3,0,33,1 -12751,2,8,0,8,1,3,7,2,2,0,4,13,3,35,0 -12752,0,2,0,7,2,1,9,4,3,1,7,8,5,21,1 -12753,4,4,0,3,5,6,7,0,0,0,13,13,1,2,0 -12754,1,3,0,6,1,3,5,3,4,0,6,9,0,1,0 -12755,4,4,0,6,4,3,13,0,1,0,13,2,4,34,0 -12756,2,3,0,0,0,0,14,2,1,1,17,11,3,12,0 -12757,10,1,0,5,2,2,10,5,0,1,1,3,3,20,1 -12758,5,6,0,2,3,3,9,0,0,1,16,10,5,31,1 -12759,0,3,0,13,1,0,6,2,1,0,4,11,0,14,0 -12760,10,3,0,5,0,4,6,4,4,0,3,11,5,38,0 -12761,6,5,0,8,2,5,9,2,4,0,8,12,0,14,0 -12762,2,7,0,14,0,4,0,1,3,1,13,2,0,8,0 -12763,0,0,0,10,4,6,7,2,0,1,13,12,1,41,0 -12764,3,0,0,10,5,6,1,0,0,0,2,4,0,10,0 -12765,1,8,0,6,0,0,1,1,2,1,13,1,2,21,0 -12766,7,5,0,2,0,1,14,2,4,0,4,7,2,19,1 -12767,8,1,0,7,6,0,3,4,3,0,5,19,3,37,1 -12768,5,3,0,14,1,6,14,5,3,0,4,1,2,36,1 -12769,5,8,0,15,4,0,11,2,4,0,2,16,0,17,0 -12770,0,1,0,2,5,4,1,5,1,0,17,9,5,12,0 -12771,1,5,0,6,3,5,11,1,1,0,3,6,0,29,0 -12772,2,5,0,9,3,2,9,3,2,0,14,18,2,0,0 -12773,6,1,0,6,1,4,14,1,3,0,6,2,1,10,0 -12774,6,0,0,0,2,5,10,2,0,1,5,13,0,30,0 -12775,6,6,0,15,5,3,5,3,0,1,10,2,0,23,0 -12776,0,5,0,12,6,4,5,1,2,0,2,13,1,30,0 -12777,9,5,0,1,0,0,4,3,3,0,4,13,1,22,0 -12778,9,8,0,7,0,5,9,0,0,0,13,13,5,26,0 -12779,6,4,0,0,3,5,2,0,0,1,16,15,3,37,0 -12780,10,0,0,5,5,6,8,4,2,1,11,6,4,41,0 -12781,10,4,0,9,5,3,9,4,2,0,10,17,0,5,1 -12782,9,6,0,2,4,0,11,1,0,0,13,16,2,9,0 -12783,0,1,0,3,0,1,7,4,1,1,0,11,2,41,0 -12784,5,8,0,14,3,0,9,4,2,0,15,6,5,4,0 
-12785,0,4,0,6,2,2,8,1,1,0,6,11,1,11,0 -12786,0,8,0,8,2,0,7,1,4,0,10,11,1,17,0 -12787,9,7,0,13,2,6,6,4,4,0,2,11,2,24,0 -12788,1,5,0,0,5,6,9,1,4,1,13,13,0,13,0 -12789,3,7,0,2,0,4,4,1,0,1,6,2,5,0,0 -12790,5,2,0,0,0,4,0,1,4,1,7,6,0,30,0 -12791,1,6,0,14,3,1,4,5,3,1,12,5,3,34,0 -12792,6,1,0,3,1,2,11,2,2,0,5,10,2,16,1 -12793,0,5,0,10,6,4,8,1,1,1,3,13,1,30,1 -12794,0,6,0,2,0,4,9,1,4,0,17,3,3,20,0 -12795,10,7,0,5,3,4,5,2,2,0,13,14,1,11,0 -12796,2,4,0,7,6,0,9,5,4,0,4,11,5,36,0 -12797,0,3,0,5,5,4,3,3,0,1,2,2,4,13,0 -12798,0,2,0,12,1,5,13,1,3,0,15,2,5,5,0 -12799,7,6,0,9,3,5,7,2,0,1,4,6,2,10,0 -12800,8,3,0,12,4,1,9,1,4,0,16,8,1,22,1 -12801,3,7,0,0,5,3,9,3,2,1,8,6,2,38,0 -12802,6,8,0,13,0,5,3,2,2,0,17,11,2,28,0 -12803,6,6,0,15,4,0,7,1,1,0,2,15,0,21,0 -12804,4,7,0,11,4,0,5,2,4,1,2,6,1,28,0 -12805,2,4,0,0,6,4,5,4,4,0,13,11,3,30,0 -12806,2,8,0,1,6,0,11,3,0,1,13,11,3,33,0 -12807,10,5,0,4,2,6,5,4,2,0,4,11,3,7,0 -12808,0,1,0,9,1,5,11,3,2,0,12,9,3,16,0 -12809,2,7,0,2,0,6,14,0,1,0,13,14,2,4,0 -12810,4,8,0,11,1,2,8,1,0,0,8,18,0,6,0 -12811,6,5,0,15,6,2,9,5,2,0,7,1,4,39,1 -12812,3,7,0,12,0,3,7,0,3,1,2,9,1,36,0 -12813,1,6,0,9,3,2,14,4,0,0,2,12,2,18,0 -12814,9,0,0,0,2,2,9,2,4,1,1,14,3,21,1 -12815,9,2,0,12,5,6,10,0,0,0,3,7,5,35,1 -12816,7,6,0,8,6,6,9,4,0,0,18,7,2,28,1 -12817,8,8,0,8,2,0,10,4,2,0,17,15,4,3,0 -12818,4,1,0,15,1,2,2,4,4,1,1,8,2,1,1 -12819,10,8,0,8,2,0,14,2,3,1,12,2,2,10,0 -12820,10,3,0,3,0,4,11,2,4,1,4,15,0,38,0 -12821,5,8,0,9,0,4,12,0,2,0,2,2,0,8,0 -12822,1,5,0,14,0,6,9,4,3,0,2,7,2,33,0 -12823,7,0,0,7,4,5,8,3,2,0,1,10,4,22,1 -12824,5,0,0,9,2,0,12,5,3,1,18,0,2,39,1 -12825,7,8,0,3,3,2,7,0,2,0,9,0,1,18,1 -12826,9,1,0,12,1,5,14,0,0,0,0,5,5,13,1 -12827,0,8,0,2,0,4,9,3,0,0,0,15,0,30,0 -12828,10,6,0,12,6,5,1,1,1,1,18,8,3,1,1 -12829,5,5,0,3,5,6,11,0,2,1,7,19,2,41,1 -12830,7,6,0,13,2,4,9,4,2,1,13,2,1,22,0 -12831,1,3,0,4,1,6,8,1,4,1,2,11,0,4,0 -12832,8,3,0,10,2,4,3,2,0,0,5,6,5,3,1 -12833,10,2,0,5,1,6,13,3,3,1,13,2,4,27,0 -12834,3,4,0,13,6,3,0,3,3,0,4,11,0,29,0 
-12835,9,0,0,1,5,5,6,5,3,0,13,6,3,3,0 -12836,2,6,0,7,0,0,7,5,4,0,4,15,1,0,0 -12837,2,2,0,8,4,4,4,0,3,1,10,13,0,29,0 -12838,2,0,0,3,0,0,5,0,0,0,13,4,4,18,0 -12839,1,8,0,4,5,3,9,5,3,0,3,11,5,7,0 -12840,9,6,0,0,0,4,0,1,2,1,6,11,5,5,0 -12841,8,6,0,13,5,5,6,3,1,1,8,0,0,27,0 -12842,2,6,0,12,0,2,9,3,1,0,17,0,4,10,0 -12843,0,0,0,15,0,6,1,4,4,0,8,6,2,4,0 -12844,8,1,0,8,3,5,5,1,3,1,13,0,3,6,0 -12845,4,3,0,14,4,1,14,2,1,0,13,2,1,32,0 -12846,10,8,0,7,6,3,14,2,0,1,16,5,5,5,1 -12847,7,3,0,10,1,5,5,1,0,0,8,11,0,0,0 -12848,5,6,0,11,6,2,4,3,0,0,13,11,3,34,0 -12849,10,3,0,11,1,5,8,4,4,1,4,2,2,38,0 -12850,0,7,0,2,0,5,11,1,3,0,8,15,0,31,0 -12851,5,6,0,10,1,4,0,1,2,0,10,2,2,28,0 -12852,4,8,0,11,0,0,9,1,4,0,2,9,1,11,0 -12853,1,7,0,9,1,6,14,2,4,0,13,15,3,27,0 -12854,5,7,0,7,6,1,12,0,1,1,16,5,0,11,1 -12855,3,5,0,3,2,3,10,3,0,1,13,4,5,33,0 -12856,8,4,0,2,0,4,0,3,1,1,10,4,1,38,0 -12857,2,5,0,5,3,0,13,3,0,1,17,2,4,41,0 -12858,0,6,0,9,0,1,9,3,2,1,11,11,3,1,0 -12859,8,5,0,14,4,3,14,5,1,0,16,13,0,16,1 -12860,0,6,0,8,6,3,3,5,4,0,15,7,2,23,0 -12861,2,6,0,3,1,5,1,1,2,1,10,4,0,16,0 -12862,1,7,0,5,4,4,0,2,0,1,8,0,3,21,0 -12863,3,0,0,12,6,2,1,5,0,0,4,0,4,6,1 -12864,0,1,0,0,6,6,6,3,2,0,13,11,4,26,0 -12865,6,8,0,6,0,4,8,2,3,0,13,14,2,2,0 -12866,1,3,0,6,1,3,0,0,4,0,17,15,1,7,0 -12867,10,5,0,3,6,4,6,2,2,1,8,2,3,16,0 -12868,9,6,0,2,3,0,3,1,0,1,7,10,4,6,0 -12869,0,1,0,11,6,5,11,4,0,0,13,0,2,35,0 -12870,2,8,0,8,1,3,6,1,3,1,3,15,0,7,0 -12871,4,2,0,15,4,0,7,0,0,0,13,16,0,29,0 -12872,8,8,0,7,5,0,7,1,1,0,13,9,0,34,0 -12873,4,5,0,1,1,4,3,3,0,0,8,11,4,8,0 -12874,1,1,0,2,2,3,4,4,2,1,14,17,3,23,1 -12875,1,4,0,5,0,5,2,4,0,0,11,13,2,10,0 -12876,1,7,0,3,0,4,11,2,2,1,17,6,5,7,0 -12877,2,5,0,3,0,5,3,2,0,1,12,13,2,40,0 -12878,3,3,0,7,5,4,11,0,3,0,10,12,5,0,0 -12879,7,2,0,1,2,6,12,5,0,1,1,17,2,22,1 -12880,0,3,0,7,3,2,6,4,3,0,8,5,0,19,0 -12881,2,4,0,0,1,0,4,4,0,1,2,2,2,30,0 -12882,2,8,0,6,1,4,6,1,2,0,2,4,1,20,0 -12883,0,6,0,0,5,4,10,2,1,1,15,13,0,7,0 -12884,1,1,0,15,2,0,0,1,0,1,18,7,0,7,1 
-12885,4,6,0,2,6,2,7,0,1,1,13,0,0,17,0 -12886,0,7,0,15,0,4,0,2,4,0,8,19,3,4,0 -12887,3,8,0,0,4,4,11,1,1,1,16,18,1,22,0 -12888,10,8,0,13,4,3,9,4,1,1,10,15,2,13,0 -12889,1,0,0,2,0,4,14,1,3,0,14,11,3,29,0 -12890,0,7,0,10,6,0,14,3,0,1,8,10,2,16,0 -12891,1,5,0,12,3,4,9,4,2,1,10,3,4,26,0 -12892,3,4,0,13,0,5,14,4,0,1,17,0,5,18,0 -12893,1,6,0,1,6,5,10,1,1,0,0,6,5,29,0 -12894,0,7,0,3,1,6,1,4,2,1,2,2,5,5,0 -12895,9,0,0,4,4,5,8,2,4,0,6,7,2,37,1 -12896,1,2,0,13,0,6,0,3,0,1,17,18,2,33,0 -12897,5,7,0,1,0,4,5,1,1,0,0,18,0,34,0 -12898,7,5,0,1,3,6,4,5,4,1,18,19,3,1,1 -12899,0,8,0,11,3,3,11,2,3,1,8,19,3,27,0 -12900,1,2,0,7,0,0,6,3,4,1,7,11,1,6,0 -12901,6,5,0,8,1,6,8,4,1,0,12,11,2,11,0 -12902,7,2,0,4,6,0,14,3,2,1,18,17,4,41,1 -12903,0,8,0,13,6,1,2,0,2,1,3,15,2,18,0 -12904,5,8,0,7,2,6,8,0,0,0,9,3,3,11,1 -12905,7,2,0,3,3,5,7,4,2,0,2,6,4,7,0 -12906,3,8,0,10,3,5,7,2,0,0,17,0,5,38,0 -12907,3,0,0,8,0,2,6,1,0,1,12,10,5,20,0 -12908,6,3,0,13,3,6,10,0,1,1,0,3,1,6,0 -12909,7,7,0,1,2,3,5,3,2,1,10,2,0,33,0 -12910,1,1,0,6,0,3,9,3,0,0,13,2,1,39,0 -12911,2,6,0,3,6,6,13,5,0,1,0,16,0,6,0 -12912,2,2,0,8,6,3,14,2,2,1,7,8,0,22,0 -12913,9,6,0,13,4,4,6,3,3,0,12,9,3,34,0 -12914,10,0,0,15,2,3,8,3,0,0,15,16,4,13,0 -12915,1,4,0,13,0,5,4,1,0,1,13,20,2,4,0 -12916,4,3,0,12,6,6,5,4,2,1,13,19,0,8,0 -12917,0,8,0,15,5,5,3,2,3,0,4,14,2,19,0 -12918,7,2,0,7,4,2,3,2,2,1,18,5,3,33,1 -12919,4,6,0,5,5,5,6,5,3,1,18,8,1,21,1 -12920,5,6,0,7,1,1,1,3,4,1,1,5,2,16,1 -12921,5,5,0,2,0,3,6,0,3,1,5,7,0,27,1 -12922,5,6,0,0,2,0,8,3,1,0,4,2,2,36,0 -12923,1,8,0,4,1,4,12,5,2,0,18,10,0,20,1 -12924,9,7,0,11,1,1,4,0,1,1,0,9,4,33,0 -12925,6,8,0,4,4,4,12,5,2,0,16,5,0,21,1 -12926,1,3,0,15,5,0,6,4,1,0,10,10,4,3,1 -12927,3,7,0,10,1,1,8,1,2,0,13,16,1,25,0 -12928,2,3,0,5,6,4,5,4,0,1,14,6,2,27,0 -12929,3,1,0,12,5,4,11,1,1,1,12,7,0,27,1 -12930,5,8,0,3,6,4,8,3,0,0,2,11,0,28,0 -12931,8,6,0,11,2,3,5,5,0,0,8,11,0,33,0 -12932,4,6,0,1,0,0,9,5,4,0,13,2,2,37,0 -12933,3,7,0,2,1,1,0,5,3,1,8,17,3,35,0 -12934,2,0,0,0,5,0,5,3,2,0,2,13,0,7,0 
-12935,4,1,0,13,1,5,4,0,4,1,8,11,0,27,0 -12936,7,4,0,10,2,5,0,4,3,0,18,19,4,7,1 -12937,9,6,0,7,0,0,6,0,1,0,12,9,4,29,0 -12938,0,6,0,2,2,1,8,4,4,1,13,13,1,36,0 -12939,0,3,0,13,0,0,0,1,4,1,4,16,3,25,0 -12940,3,5,0,1,5,5,5,3,1,0,12,15,2,41,0 -12941,5,7,0,6,5,2,8,0,4,1,2,6,0,7,0 -12942,6,4,0,15,0,5,12,4,1,1,5,10,4,39,1 -12943,1,2,0,10,5,6,5,2,3,0,13,11,1,19,0 -12944,6,0,0,15,0,5,7,1,1,1,13,11,5,38,0 -12945,10,5,0,5,5,0,2,0,1,0,2,11,1,12,0 -12946,10,8,0,3,4,5,12,0,0,1,13,6,3,35,0 -12947,2,3,0,14,2,0,5,3,0,1,8,0,5,37,0 -12948,0,0,0,5,0,5,9,3,3,1,4,11,5,30,0 -12949,8,6,0,10,0,0,4,1,2,0,13,5,4,38,0 -12950,7,3,0,12,5,4,5,1,0,0,10,18,0,34,0 -12951,0,6,0,11,1,0,11,2,1,1,0,11,2,26,0 -12952,5,3,0,9,0,4,8,2,1,0,17,13,3,4,0 -12953,8,8,0,6,0,4,6,5,4,0,13,2,1,11,0 -12954,0,0,0,14,4,5,2,5,0,0,8,2,5,30,0 -12955,3,8,0,12,3,0,2,2,1,0,18,5,5,38,1 -12956,0,6,0,10,1,1,3,2,2,0,15,11,0,28,0 -12957,10,7,0,2,5,2,4,0,0,0,10,8,2,20,1 -12958,9,1,0,12,2,5,5,1,0,1,18,8,3,19,1 -12959,2,1,0,6,3,4,6,1,3,1,15,2,5,25,0 -12960,5,2,0,4,0,2,4,3,4,0,1,5,0,12,1 -12961,0,6,0,3,2,6,11,1,3,1,4,11,5,29,0 -12962,6,8,0,2,0,3,5,3,1,1,17,3,1,14,0 -12963,3,4,0,13,1,3,1,1,4,0,15,15,4,24,0 -12964,9,0,0,8,3,4,6,0,4,1,18,4,1,21,1 -12965,8,6,0,5,2,4,10,4,0,0,17,11,1,0,0 -12966,8,2,0,3,0,0,1,1,0,0,13,9,0,37,0 -12967,5,3,0,4,3,5,2,0,4,1,5,7,5,22,1 -12968,3,2,0,3,0,4,4,4,4,0,8,0,3,27,0 -12969,9,0,0,12,5,2,3,3,0,1,17,18,1,25,0 -12970,0,8,0,1,2,6,8,3,4,0,17,6,0,35,0 -12971,0,6,0,14,5,4,0,0,0,0,13,1,0,41,0 -12972,2,7,0,1,0,5,14,3,4,0,17,13,2,5,0 -12973,3,1,0,6,6,0,9,0,2,1,10,0,1,7,0 -12974,5,6,0,10,2,6,0,2,3,0,0,16,1,14,0 -12975,0,4,0,5,1,4,10,4,0,0,6,16,0,16,0 -12976,1,7,0,4,2,2,9,4,4,0,11,2,0,37,0 -12977,6,0,0,3,0,6,5,3,0,1,13,2,2,28,0 -12978,0,3,0,2,5,6,14,0,4,1,8,9,3,14,0 -12979,10,1,0,1,0,6,0,3,4,1,15,13,2,32,0 -12980,9,4,0,3,3,6,0,2,1,1,18,8,4,3,1 -12981,10,7,0,15,1,5,4,0,4,1,7,8,0,26,1 -12982,1,4,0,10,2,1,2,5,1,0,16,17,1,14,1 -12983,3,6,0,5,5,0,3,2,2,1,8,11,1,17,0 -12984,5,7,0,6,3,4,5,3,0,0,10,11,1,10,0 
-12985,4,1,0,15,6,6,5,1,1,0,13,9,2,39,0 -12986,0,5,0,13,1,0,13,3,4,1,6,9,1,3,0 -12987,3,6,0,6,2,3,1,1,4,1,17,2,5,27,0 -12988,8,1,0,5,6,1,1,0,3,0,18,5,1,38,1 -12989,7,2,0,9,2,1,5,1,4,1,18,19,2,17,1 -12990,2,6,0,11,4,5,10,2,3,1,0,18,1,30,0 -12991,3,3,0,11,0,2,8,3,3,0,13,15,4,17,0 -12992,0,0,0,4,6,0,6,0,1,0,17,2,0,35,0 -12993,5,4,0,11,6,4,14,1,3,0,8,16,2,14,0 -12994,2,2,0,10,3,5,14,4,2,1,10,9,1,7,0 -12995,0,6,0,8,6,5,14,1,2,0,2,1,2,24,0 -12996,9,3,0,10,4,1,11,5,3,0,9,8,4,34,1 -12997,3,2,0,8,2,4,7,1,0,0,14,2,0,38,0 -12998,0,1,0,8,2,3,1,3,1,0,13,20,5,40,0 -12999,8,1,0,8,6,3,1,5,3,1,5,14,1,38,1 -13000,7,0,0,10,4,2,13,3,0,1,18,10,3,32,1 -13001,6,3,0,6,6,5,12,3,1,1,13,8,1,36,0 -13002,9,7,0,4,4,5,0,4,1,0,2,15,0,10,0 -13003,1,0,0,2,2,3,13,3,3,0,13,7,1,29,0 -13004,4,2,0,7,6,0,2,1,3,0,18,19,3,17,1 -13005,0,6,0,12,1,4,1,2,0,1,2,15,3,5,0 -13006,4,4,0,0,0,1,2,5,0,0,15,7,5,41,1 -13007,4,6,0,12,5,4,12,3,1,1,0,2,1,6,0 -13008,0,2,0,13,4,3,0,3,0,0,2,19,0,38,0 -13009,3,0,0,0,4,2,3,2,0,0,16,2,5,22,0 -13010,0,0,0,1,6,6,14,3,3,1,4,11,0,31,0 -13011,1,8,0,2,6,4,0,1,0,0,8,19,1,26,0 -13012,6,0,0,5,3,4,10,0,3,1,2,2,2,26,0 -13013,4,3,0,4,4,2,6,5,1,1,14,7,3,31,1 -13014,2,8,0,6,3,5,3,3,0,1,3,11,3,29,0 -13015,0,2,0,11,5,1,7,3,2,0,8,7,4,1,0 -13016,10,4,0,15,6,5,6,0,0,1,2,20,0,29,0 -13017,3,3,0,6,0,1,9,1,2,1,2,4,5,13,0 -13018,3,1,0,12,1,3,10,3,3,1,8,6,0,14,0 -13019,1,0,0,4,1,1,9,3,3,0,13,9,0,13,0 -13020,2,6,0,6,3,3,9,1,0,1,4,19,2,34,0 -13021,8,4,0,6,6,0,6,4,0,0,13,6,2,9,0 -13022,1,6,0,4,6,6,14,2,0,0,2,5,3,11,0 -13023,3,7,0,13,4,4,4,5,0,0,9,2,3,34,1 -13024,9,0,0,3,6,6,0,4,2,1,13,3,0,33,0 -13025,0,0,0,11,6,6,9,3,4,0,15,3,5,1,0 -13026,0,3,0,3,0,1,3,4,1,0,2,0,0,33,0 -13027,8,3,0,13,0,6,0,5,0,0,13,11,0,20,0 -13028,2,6,0,12,1,2,8,1,2,0,0,3,0,11,0 -13029,0,1,0,8,1,6,1,1,1,1,18,10,2,0,1 -13030,6,2,0,11,0,1,8,3,4,0,10,3,0,13,0 -13031,0,8,0,13,4,5,12,3,0,0,8,2,1,27,0 -13032,9,6,0,6,2,5,0,4,1,1,18,5,1,32,1 -13033,1,1,0,13,6,0,12,2,2,0,16,6,4,29,0 -13034,3,1,0,0,0,6,14,2,1,0,3,6,1,7,0 
-13035,0,6,0,5,2,0,9,3,4,1,12,2,2,22,0 -13036,4,3,0,7,6,6,2,4,3,1,13,13,4,8,0 -13037,3,3,0,6,0,2,1,3,2,0,12,4,3,2,0 -13038,1,5,0,13,1,3,7,2,1,0,13,13,0,28,0 -13039,1,2,0,6,5,5,9,1,3,1,8,9,2,3,0 -13040,0,8,0,13,2,0,12,0,0,0,8,2,3,12,0 -13041,7,0,0,13,1,0,1,1,0,0,8,13,1,8,0 -13042,9,6,0,5,0,4,7,4,3,0,4,9,0,27,0 -13043,1,6,0,5,4,0,6,0,2,1,13,15,4,31,0 -13044,3,8,0,4,5,1,8,4,3,1,11,0,4,11,0 -13045,2,7,0,3,6,1,7,1,2,0,2,0,0,5,0 -13046,2,2,0,6,6,0,3,0,0,1,13,15,5,27,0 -13047,5,3,0,15,0,0,3,4,1,1,13,5,0,19,0 -13048,6,3,0,9,4,5,14,5,2,1,6,11,2,41,0 -13049,9,5,0,10,6,3,3,2,0,1,8,18,3,38,0 -13050,3,6,0,9,0,3,5,1,0,0,2,11,3,36,0 -13051,1,2,0,4,5,4,11,5,4,0,12,7,2,33,1 -13052,9,8,0,11,4,5,8,0,4,1,4,6,1,1,0 -13053,5,1,0,8,1,3,14,4,4,0,18,8,1,27,1 -13054,7,3,0,1,5,4,11,2,4,1,8,11,2,21,0 -13055,0,3,0,3,0,0,8,3,4,0,5,11,2,37,0 -13056,0,3,0,2,6,6,9,5,1,1,17,19,2,17,0 -13057,9,6,0,2,2,3,9,4,0,0,13,16,1,35,0 -13058,2,0,0,11,0,1,11,3,1,1,13,2,1,28,0 -13059,8,6,0,6,3,5,3,5,4,1,13,4,0,19,0 -13060,6,3,0,7,2,1,4,1,3,1,5,18,5,40,1 -13061,9,2,0,13,2,5,5,4,4,0,0,20,5,11,0 -13062,3,7,0,15,0,2,5,2,0,0,16,13,1,34,0 -13063,6,8,0,12,0,0,8,4,1,0,8,11,0,27,0 -13064,1,2,0,8,4,2,9,0,3,1,2,5,2,26,0 -13065,10,3,0,5,2,3,1,3,4,0,17,12,1,3,0 -13066,3,8,0,13,6,5,2,3,1,0,2,9,3,10,0 -13067,2,6,0,3,6,6,7,4,1,1,8,0,0,7,0 -13068,7,6,0,1,4,1,4,2,1,1,14,7,2,39,1 -13069,10,0,0,1,3,3,9,2,4,0,13,3,2,5,0 -13070,6,8,0,15,3,5,8,0,0,0,13,19,1,34,0 -13071,6,8,0,7,4,3,13,4,1,1,14,8,0,30,1 -13072,0,2,0,3,3,0,5,4,4,0,13,9,1,39,0 -13073,5,0,0,10,2,2,5,4,1,0,5,14,4,37,1 -13074,8,5,0,0,3,6,2,0,2,0,16,18,0,32,0 -13075,0,7,0,8,0,4,3,0,4,1,17,11,0,6,0 -13076,9,6,0,14,4,3,4,0,1,1,11,6,3,35,1 -13077,1,0,0,11,1,5,6,1,3,0,15,12,5,3,0 -13078,6,5,0,7,1,0,5,2,4,1,8,2,0,7,0 -13079,8,6,0,2,3,3,7,1,3,0,3,16,4,6,1 -13080,3,8,0,9,0,4,8,0,4,0,12,11,4,16,0 -13081,10,6,0,6,1,6,5,1,2,1,0,9,5,0,0 -13082,10,7,0,8,5,1,5,1,1,1,13,11,4,35,0 -13083,4,0,0,13,3,5,11,3,3,0,13,6,2,3,0 -13084,1,5,0,6,0,0,9,0,1,0,2,11,5,16,0 
-13085,2,8,0,9,1,6,11,4,2,1,10,3,1,8,0 -13086,3,8,0,15,6,1,5,3,2,0,12,18,5,37,0 -13087,6,1,0,9,3,3,12,4,0,1,1,5,0,27,1 -13088,0,6,0,1,6,3,4,0,2,0,10,12,5,35,0 -13089,6,6,0,11,1,5,5,3,4,0,18,10,4,40,1 -13090,2,0,0,3,3,3,3,2,3,1,6,18,1,34,0 -13091,9,6,0,4,6,3,14,4,2,1,2,16,0,7,0 -13092,10,6,0,12,1,4,4,1,1,1,13,18,1,7,0 -13093,5,6,0,7,0,3,4,0,2,1,18,7,2,1,1 -13094,2,3,0,15,2,5,9,4,0,0,13,2,1,9,0 -13095,0,7,0,3,4,2,11,3,2,0,10,0,0,24,0 -13096,0,7,0,1,2,4,4,2,2,0,14,13,2,29,0 -13097,7,4,0,13,2,0,10,4,2,1,17,3,2,5,0 -13098,4,1,0,13,4,0,1,3,3,1,14,18,4,25,1 -13099,2,3,0,8,4,1,0,4,0,1,2,15,3,19,0 -13100,2,5,0,14,1,3,8,1,0,1,13,4,2,2,0 -13101,1,8,0,7,4,5,9,4,1,0,2,8,5,31,0 -13102,7,7,0,3,1,0,0,0,0,0,14,2,1,39,0 -13103,1,0,0,13,2,3,1,3,3,1,12,9,2,12,0 -13104,6,1,0,1,2,4,11,5,4,1,7,4,2,9,1 -13105,10,6,0,7,0,5,2,0,4,0,13,2,2,21,0 -13106,2,4,0,13,0,1,7,1,1,0,17,9,4,27,0 -13107,7,3,0,4,0,2,10,1,3,0,2,9,0,39,0 -13108,0,3,0,3,2,5,12,3,3,1,8,5,1,1,0 -13109,6,8,0,4,4,1,2,3,4,1,14,17,4,5,1 -13110,3,5,0,15,1,1,10,2,3,0,18,8,4,28,1 -13111,8,8,0,15,6,1,3,0,3,0,3,10,5,22,1 -13112,2,0,0,15,2,1,11,1,1,1,12,11,5,26,0 -13113,0,5,0,1,4,5,8,4,4,1,4,18,2,20,0 -13114,6,2,0,13,0,0,1,5,3,0,3,7,2,4,0 -13115,0,6,0,1,0,2,10,1,4,0,2,12,2,14,0 -13116,3,4,0,9,2,2,1,5,0,1,5,10,2,13,1 -13117,0,6,0,6,6,4,9,4,1,0,13,0,1,37,0 -13118,1,4,0,10,5,5,7,3,3,0,4,15,1,35,0 -13119,0,4,0,3,6,5,7,5,0,0,10,10,4,19,0 -13120,7,5,0,0,1,2,3,1,0,1,14,7,4,19,1 -13121,8,6,0,15,2,5,7,2,2,0,15,6,1,14,0 -13122,1,4,0,7,2,1,11,2,1,1,9,5,5,9,1 -13123,8,4,0,12,1,0,13,1,2,0,8,2,5,6,0 -13124,6,8,0,0,1,2,6,1,3,1,13,2,4,10,0 -13125,0,0,0,8,2,4,14,3,4,0,13,11,4,16,0 -13126,2,6,0,3,2,4,12,3,3,0,13,20,5,4,0 -13127,8,0,0,10,0,5,8,0,2,1,13,6,5,16,0 -13128,8,6,0,13,6,6,4,3,3,0,12,7,4,37,1 -13129,1,8,0,7,5,1,0,5,1,1,6,4,5,28,1 -13130,3,2,0,13,3,4,8,2,0,0,17,15,5,10,0 -13131,5,0,0,8,5,5,14,2,0,0,13,17,5,34,0 -13132,6,5,0,13,0,6,6,5,3,1,17,2,5,40,0 -13133,0,2,0,11,3,1,5,2,4,0,10,11,2,34,0 -13134,5,7,0,11,5,5,11,3,4,0,13,13,1,32,0 
-13135,1,3,0,6,0,2,6,0,3,1,4,11,5,13,0 -13136,2,5,0,7,6,0,8,4,2,1,8,18,0,35,0 -13137,4,5,0,14,0,4,12,4,4,0,13,7,5,21,1 -13138,5,1,0,14,3,0,11,4,2,0,1,7,2,7,1 -13139,9,6,0,6,0,3,5,4,3,0,13,6,0,33,0 -13140,3,8,0,10,0,3,4,5,2,1,7,10,1,7,1 -13141,1,3,0,9,2,4,14,0,0,0,2,11,3,16,0 -13142,0,6,0,13,1,3,2,4,2,1,16,11,4,26,0 -13143,3,5,0,13,0,4,10,2,0,1,2,0,1,40,0 -13144,2,7,0,11,0,1,12,4,1,1,2,20,0,12,0 -13145,1,6,0,0,3,5,6,1,2,0,7,0,0,6,0 -13146,1,3,0,0,6,0,3,2,2,1,16,2,3,6,0 -13147,3,5,0,8,0,5,2,4,3,1,8,11,4,7,0 -13148,9,0,0,14,0,3,3,5,0,0,5,10,2,19,1 -13149,1,1,0,6,6,5,8,2,0,0,4,16,2,25,0 -13150,8,5,0,0,0,5,7,3,1,0,0,18,0,13,0 -13151,0,5,0,12,6,1,14,4,1,1,15,2,2,35,0 -13152,4,0,0,6,0,5,6,1,3,0,13,13,0,29,0 -13153,1,8,0,3,0,5,9,0,0,0,4,4,5,34,0 -13154,6,6,0,15,2,1,2,0,0,1,11,4,1,22,1 -13155,5,6,0,13,6,4,2,1,2,1,6,7,2,8,1 -13156,0,4,0,6,0,6,8,0,1,1,17,18,1,4,0 -13157,10,8,0,0,1,1,4,0,0,1,2,15,3,7,0 -13158,7,7,0,10,6,1,13,1,3,1,7,14,4,29,1 -13159,3,4,0,7,3,5,3,0,4,0,2,9,4,36,0 -13160,6,6,0,1,1,6,9,3,2,0,0,6,4,4,0 -13161,9,5,0,10,1,5,4,5,3,0,18,8,2,11,1 -13162,0,3,0,8,6,4,14,3,2,1,6,2,3,31,0 -13163,1,0,0,7,4,2,4,0,3,0,3,17,1,16,1 -13164,0,7,0,11,2,2,2,5,3,0,2,18,5,13,0 -13165,2,0,0,5,1,3,4,3,1,1,10,4,2,12,0 -13166,2,8,0,6,6,6,9,2,1,1,4,2,1,34,0 -13167,4,5,0,14,4,5,10,4,4,0,17,2,2,28,0 -13168,0,5,0,13,2,3,9,5,0,0,0,14,1,34,0 -13169,0,5,0,5,4,3,3,3,2,0,5,11,5,4,0 -13170,10,6,0,3,0,0,12,0,1,1,7,6,3,6,0 -13171,1,0,0,1,0,6,2,3,1,1,17,2,2,11,0 -13172,0,6,0,15,2,0,6,2,0,1,4,15,1,25,0 -13173,4,8,0,7,5,3,10,2,2,0,0,2,2,26,0 -13174,0,7,0,13,2,4,8,1,3,0,11,2,5,13,0 -13175,3,8,0,9,0,3,6,2,1,1,13,0,1,30,0 -13176,8,8,0,1,4,5,2,5,0,0,13,3,0,13,0 -13177,0,2,0,3,2,0,10,4,1,1,4,3,4,12,0 -13178,4,7,0,7,3,5,9,0,0,0,13,9,2,4,0 -13179,7,7,0,3,3,1,13,3,3,0,18,17,4,17,1 -13180,1,6,0,5,5,2,7,4,3,1,4,9,4,35,0 -13181,7,6,0,9,6,6,5,0,0,0,2,18,0,26,0 -13182,6,7,0,7,4,2,11,3,2,1,5,12,3,2,1 -13183,9,7,0,0,5,6,10,3,0,1,6,11,4,21,0 -13184,6,1,0,6,2,5,8,1,3,1,4,10,2,11,0 
-13185,8,4,0,0,4,2,0,5,0,0,10,10,4,5,1 -13186,5,8,0,7,0,0,12,4,3,0,13,11,3,10,0 -13187,3,7,0,7,0,1,4,3,2,1,2,16,1,33,0 -13188,9,5,0,14,2,0,13,5,0,1,18,20,5,16,1 -13189,0,2,0,2,6,6,9,0,2,1,2,2,3,28,0 -13190,3,4,0,5,5,3,6,2,2,0,13,15,3,29,0 -13191,2,5,0,10,5,5,6,0,0,0,6,6,3,0,0 -13192,9,8,0,5,0,3,0,0,3,0,8,13,3,16,0 -13193,9,4,0,1,2,6,13,1,0,1,13,13,0,40,0 -13194,10,1,0,12,6,6,10,0,1,0,2,13,0,13,0 -13195,6,4,0,1,0,5,14,2,2,1,13,9,0,27,0 -13196,4,3,0,11,3,4,7,2,0,0,17,9,1,21,0 -13197,5,7,0,3,0,5,13,3,3,0,10,20,2,41,0 -13198,5,1,0,3,3,6,6,4,1,0,2,4,5,10,0 -13199,7,6,0,3,6,0,7,2,0,1,8,12,5,2,0 -13200,2,5,0,1,2,6,6,1,1,0,0,6,0,37,0 -13201,7,7,0,13,1,0,14,1,2,0,4,13,1,25,0 -13202,6,0,0,3,0,5,3,1,2,1,13,11,4,14,0 -13203,2,7,0,2,2,0,1,1,2,1,6,2,3,10,0 -13204,1,4,0,5,5,4,6,1,0,0,6,11,2,2,0 -13205,6,1,0,15,5,5,2,3,1,0,8,15,1,33,0 -13206,1,8,0,6,2,2,7,0,2,1,2,2,4,35,0 -13207,2,2,0,12,3,5,4,0,3,0,13,11,0,20,0 -13208,5,4,0,15,1,5,12,2,0,0,0,3,2,41,0 -13209,7,1,0,6,1,5,11,4,3,1,2,1,0,22,0 -13210,0,3,0,2,3,4,9,2,3,0,13,13,5,6,0 -13211,9,0,0,12,4,4,6,3,1,1,16,7,1,31,1 -13212,8,1,0,13,4,4,9,3,4,1,13,13,1,28,0 -13213,0,5,0,4,1,4,5,2,0,1,2,11,1,14,0 -13214,0,6,0,6,4,5,5,4,3,0,2,15,5,1,0 -13215,2,2,0,3,1,1,6,3,3,0,8,2,3,5,0 -13216,6,2,0,3,4,5,14,2,0,0,13,9,5,0,0 -13217,5,4,0,1,3,4,0,0,1,1,15,14,3,24,0 -13218,6,3,0,14,4,5,9,4,1,1,13,4,2,37,0 -13219,5,5,0,2,0,0,13,1,1,0,7,20,0,25,1 -13220,6,8,0,3,2,0,14,3,4,0,13,6,3,10,0 -13221,7,6,0,8,3,4,8,3,4,0,13,9,0,13,0 -13222,2,3,0,6,1,1,5,4,3,0,13,9,4,1,0 -13223,7,8,0,13,2,5,11,2,0,0,3,7,3,1,1 -13224,3,3,0,12,3,6,11,4,0,1,8,2,1,8,0 -13225,0,6,0,3,4,1,2,2,2,0,2,3,3,6,0 -13226,2,8,0,1,0,3,1,3,3,1,17,20,1,8,0 -13227,1,1,0,5,5,4,11,1,0,0,13,20,2,7,0 -13228,8,0,0,13,5,4,5,1,4,0,13,15,2,33,0 -13229,2,8,0,0,5,5,6,2,2,1,6,11,2,6,0 -13230,0,3,0,0,5,4,0,3,1,1,15,6,0,38,0 -13231,7,2,0,3,2,2,14,4,4,1,2,2,0,17,0 -13232,2,2,0,9,3,2,7,0,3,0,0,6,2,16,0 -13233,0,8,0,13,5,2,0,1,0,1,10,2,0,29,0 -13234,7,0,0,12,4,1,5,1,4,1,6,11,0,20,0 
-13235,7,2,0,3,2,4,14,4,1,0,4,11,0,27,0 -13236,0,8,0,9,2,4,13,0,0,1,6,2,1,16,0 -13237,8,6,0,13,0,2,6,2,0,0,2,3,1,35,0 -13238,7,7,0,1,3,0,12,0,1,1,17,15,2,11,0 -13239,5,5,0,3,1,5,3,3,3,0,4,4,5,2,0 -13240,4,0,0,0,4,4,5,3,4,0,18,9,4,21,0 -13241,2,3,0,5,4,5,1,0,3,1,13,9,2,2,0 -13242,0,5,0,4,1,4,4,5,3,1,0,10,2,23,1 -13243,1,7,0,11,0,6,1,2,4,1,12,17,4,24,0 -13244,5,8,0,0,2,5,4,0,1,1,12,10,4,14,1 -13245,7,8,0,10,5,0,8,1,1,1,6,6,1,30,0 -13246,7,8,0,9,3,1,13,1,4,1,16,9,2,23,1 -13247,4,1,0,1,2,0,9,3,0,0,13,9,4,4,0 -13248,1,4,0,15,4,4,6,3,3,0,8,18,4,16,0 -13249,5,7,0,8,0,0,9,0,3,0,1,2,3,1,0 -13250,1,0,0,5,6,2,5,2,1,1,0,19,3,37,0 -13251,10,8,0,11,3,6,14,5,1,1,15,15,4,3,0 -13252,6,5,0,7,6,6,4,5,1,1,9,8,4,9,1 -13253,1,7,0,6,2,4,11,0,2,0,10,3,0,10,0 -13254,9,6,0,6,0,3,11,4,1,0,13,4,5,34,0 -13255,2,5,0,9,0,6,4,0,3,0,15,2,2,19,0 -13256,7,8,0,11,6,0,13,0,1,0,10,18,0,16,0 -13257,4,8,0,5,5,5,4,1,1,0,9,10,4,7,1 -13258,2,2,0,9,1,1,9,4,0,0,2,2,1,30,0 -13259,6,5,0,10,3,6,4,4,0,1,0,9,5,11,0 -13260,3,6,0,1,0,4,7,5,0,1,2,6,4,30,0 -13261,2,5,0,10,0,5,7,4,2,1,2,11,1,22,0 -13262,1,4,0,7,2,0,7,4,0,0,13,20,1,36,0 -13263,7,4,0,14,4,4,10,5,3,0,18,17,3,13,1 -13264,8,6,0,14,0,0,2,0,0,0,3,11,3,30,0 -13265,8,6,0,8,0,4,8,2,0,1,13,18,1,1,0 -13266,6,1,0,11,3,1,6,4,0,1,18,17,4,30,1 -13267,2,4,0,13,3,0,5,4,3,0,1,12,4,25,0 -13268,4,6,0,11,2,6,4,2,1,1,2,0,5,18,0 -13269,1,5,0,15,3,3,6,5,0,0,2,4,2,27,0 -13270,0,5,0,1,1,4,0,3,3,1,0,2,2,30,0 -13271,3,4,0,12,0,0,12,0,0,1,2,0,0,13,0 -13272,7,8,0,10,2,5,6,1,0,1,13,15,2,32,0 -13273,5,2,0,7,6,5,7,1,1,1,2,14,0,40,0 -13274,0,8,0,0,3,3,8,2,4,0,8,6,2,41,0 -13275,3,8,0,11,5,0,12,5,4,0,0,4,0,26,0 -13276,3,2,0,4,0,1,9,3,1,0,8,19,3,25,0 -13277,6,1,0,3,2,6,9,1,3,1,13,6,5,37,0 -13278,2,3,0,8,4,6,11,3,3,0,3,5,4,11,1 -13279,3,7,0,3,0,0,4,4,4,1,12,5,3,21,0 -13280,9,1,0,4,0,3,5,5,4,0,17,2,1,23,0 -13281,4,5,0,4,5,5,0,0,1,1,9,12,5,29,1 -13282,4,7,0,3,6,0,3,3,1,1,13,7,0,32,0 -13283,8,5,0,10,2,4,7,2,0,1,3,16,5,26,0 -13284,6,2,0,15,5,2,3,3,1,0,2,11,2,25,0 
-13285,5,6,0,6,0,3,9,1,1,1,2,4,1,6,0 -13286,4,6,0,8,5,6,10,2,2,0,15,11,3,24,0 -13287,6,6,0,2,6,2,3,4,4,0,13,6,4,27,0 -13288,9,6,0,4,3,0,6,0,4,0,2,1,1,28,0 -13289,4,3,0,8,1,3,6,3,4,0,13,15,2,28,0 -13290,0,0,0,5,5,5,10,1,1,0,13,6,0,1,0 -13291,0,4,0,5,1,5,4,3,0,0,4,5,0,30,0 -13292,9,8,0,8,2,2,9,5,2,1,18,19,4,36,1 -13293,9,5,0,15,1,4,14,2,4,1,10,13,2,3,0 -13294,1,5,0,2,1,0,9,4,1,0,3,13,5,38,0 -13295,4,4,0,10,4,0,10,1,1,1,13,2,1,4,0 -13296,2,6,0,3,5,0,3,1,4,1,2,15,3,25,0 -13297,10,0,0,3,0,1,10,0,1,0,14,17,5,26,1 -13298,8,1,0,2,4,3,11,1,4,0,18,7,4,31,1 -13299,5,2,0,3,0,6,6,2,1,0,8,13,1,34,0 -13300,8,6,0,1,6,0,9,2,2,1,2,6,2,0,0 -13301,1,4,0,14,3,4,10,3,0,1,13,16,2,0,0 -13302,8,0,0,9,0,0,5,4,1,1,4,16,4,33,1 -13303,1,2,0,6,4,0,6,3,0,0,7,15,0,11,0 -13304,1,0,0,10,6,4,7,2,1,1,12,6,5,30,0 -13305,2,7,0,0,0,0,13,0,3,0,4,3,0,8,0 -13306,0,6,0,10,0,1,14,3,2,0,4,15,2,14,0 -13307,5,2,0,0,0,1,3,2,2,1,2,0,0,5,0 -13308,0,2,0,11,3,5,9,5,4,0,0,20,2,26,0 -13309,6,5,0,8,4,6,10,1,1,1,1,5,0,18,1 -13310,5,2,0,13,2,5,0,3,2,0,13,5,1,39,0 -13311,5,0,0,4,0,6,3,4,1,1,11,2,1,21,0 -13312,5,2,0,6,4,5,9,0,0,0,8,8,1,32,0 -13313,3,5,0,0,3,5,10,0,1,0,8,9,2,32,0 -13314,3,5,0,0,3,4,3,0,2,0,5,11,0,40,0 -13315,9,3,0,6,2,5,5,4,2,0,10,6,3,39,0 -13316,0,0,0,13,0,0,6,5,1,0,13,2,5,30,0 -13317,9,3,0,11,0,3,12,2,0,1,12,14,5,23,1 -13318,0,4,0,6,5,0,10,4,2,1,17,17,4,1,0 -13319,7,8,0,7,0,0,0,5,0,0,8,14,3,31,1 -13320,2,1,0,12,5,4,7,1,3,0,8,2,5,27,0 -13321,7,6,0,13,0,6,5,3,0,0,14,16,0,1,0 -13322,0,8,0,1,3,1,5,4,1,0,16,13,4,2,0 -13323,0,2,0,14,0,3,11,4,1,1,13,9,2,24,0 -13324,0,0,0,14,6,0,1,4,0,1,13,12,1,12,0 -13325,10,4,0,11,1,5,0,4,2,1,2,17,2,41,0 -13326,8,6,0,4,5,6,10,4,1,0,1,17,5,17,1 -13327,5,7,0,14,5,2,6,0,4,0,13,9,4,38,0 -13328,8,7,0,10,4,4,8,0,3,0,4,18,3,4,0 -13329,10,3,0,2,1,2,5,4,0,0,9,4,5,8,0 -13330,0,6,0,9,1,5,11,3,1,1,13,11,3,18,0 -13331,3,1,0,3,5,5,11,1,0,0,2,20,0,23,0 -13332,8,5,0,14,3,4,11,2,3,0,15,7,3,25,1 -13333,9,1,0,15,1,5,11,4,3,0,6,15,0,28,0 -13334,2,1,0,7,0,6,2,2,1,0,8,9,3,24,0 
-13335,5,7,0,9,4,4,4,4,1,1,11,17,4,28,1 -13336,4,2,0,8,6,1,8,0,4,0,2,6,3,32,0 -13337,10,2,0,11,4,6,1,1,4,1,7,5,4,10,1 -13338,5,3,0,6,3,4,6,5,2,0,13,6,5,17,0 -13339,7,2,0,14,6,4,12,4,0,1,17,11,5,21,0 -13340,4,2,0,7,0,4,11,1,0,0,4,19,4,38,0 -13341,1,1,0,15,5,2,4,1,0,1,9,1,1,12,1 -13342,1,6,0,0,0,2,1,4,4,1,0,18,0,20,0 -13343,6,1,0,5,6,2,7,4,1,0,8,2,0,25,0 -13344,10,4,0,11,4,0,13,2,0,0,2,11,2,17,0 -13345,8,3,0,13,4,2,8,1,3,0,7,17,5,19,1 -13346,6,6,0,11,1,2,2,2,0,0,0,17,4,8,0 -13347,2,3,0,14,6,5,5,1,4,0,13,11,0,21,0 -13348,3,4,0,6,6,2,8,2,3,1,7,6,1,37,0 -13349,9,7,0,11,5,0,8,4,1,1,15,6,0,26,0 -13350,0,0,0,5,5,4,7,1,0,0,13,2,4,18,0 -13351,4,4,0,7,4,4,13,5,1,0,1,16,5,23,1 -13352,1,8,0,3,6,1,8,3,3,1,0,2,3,19,0 -13353,1,8,0,8,5,0,1,0,0,1,13,4,0,28,0 -13354,10,5,0,13,1,6,0,1,4,0,9,13,2,23,1 -13355,3,0,0,4,0,0,1,0,1,1,12,2,0,5,0 -13356,4,5,0,5,1,5,8,0,2,0,17,2,2,5,0 -13357,5,0,0,10,5,6,3,3,3,0,1,17,0,24,1 -13358,7,6,0,0,3,5,10,1,1,1,15,9,3,6,0 -13359,4,6,0,2,2,0,9,1,0,0,17,15,5,21,0 -13360,0,0,0,0,0,6,4,3,0,1,2,18,3,24,0 -13361,9,8,0,1,6,5,2,0,4,0,13,9,4,37,0 -13362,0,7,0,3,5,3,14,0,0,0,13,5,5,27,0 -13363,0,2,0,5,3,0,4,1,1,0,17,9,5,32,0 -13364,9,5,0,4,1,2,5,4,4,0,13,6,1,41,0 -13365,9,8,0,0,0,1,6,3,3,1,2,12,4,32,0 -13366,2,7,0,11,0,6,1,1,1,0,6,7,0,29,0 -13367,0,6,0,15,0,5,1,2,3,1,8,13,1,20,0 -13368,7,3,0,15,1,5,0,5,1,1,13,18,3,26,0 -13369,7,7,0,3,4,0,9,4,4,0,2,20,2,37,0 -13370,9,5,0,13,0,3,6,3,2,1,10,11,1,27,0 -13371,0,5,0,11,0,0,12,3,0,1,0,9,1,17,0 -13372,5,1,0,0,6,2,12,5,3,1,14,17,5,39,1 -13373,6,2,0,15,5,0,7,3,0,0,8,11,1,10,0 -13374,9,8,0,15,4,2,0,4,0,0,6,2,1,24,0 -13375,1,2,0,12,4,2,11,5,4,1,18,14,5,1,1 -13376,8,0,0,11,6,6,1,1,0,1,13,2,5,28,0 -13377,3,2,0,3,5,0,3,3,0,1,2,2,0,9,0 -13378,0,4,0,5,5,1,9,3,1,0,13,18,1,20,0 -13379,10,3,0,9,5,1,10,5,2,1,18,8,4,19,1 -13380,10,3,0,10,6,2,10,0,0,1,18,19,5,2,1 -13381,6,0,0,5,0,6,6,0,3,0,6,9,1,9,0 -13382,8,2,0,5,2,2,7,5,2,1,14,8,4,0,1 -13383,3,8,0,12,0,6,9,4,3,0,11,2,2,29,0 -13384,2,7,0,4,6,0,14,3,4,0,17,2,3,22,0 
-13385,1,8,0,2,6,1,5,1,3,0,13,18,1,2,0 -13386,3,4,0,0,2,6,6,0,4,0,0,18,3,17,0 -13387,6,4,0,6,5,2,4,5,0,0,16,10,3,32,1 -13388,0,7,0,10,6,5,6,3,1,0,17,11,0,0,0 -13389,1,1,0,11,2,4,3,1,3,0,9,10,3,12,1 -13390,0,8,0,0,3,0,9,5,3,0,8,2,4,20,0 -13391,2,4,0,15,0,4,2,4,3,1,10,17,3,34,0 -13392,1,1,0,11,3,6,0,3,1,0,15,3,5,39,0 -13393,1,2,0,1,0,6,2,0,3,0,13,15,3,37,0 -13394,7,1,0,9,5,0,14,1,0,0,13,4,1,35,0 -13395,7,8,0,4,2,1,6,4,1,1,12,7,1,30,1 -13396,7,0,0,13,3,0,9,2,2,0,0,13,1,2,0 -13397,8,0,0,1,0,6,6,0,1,0,17,2,2,36,0 -13398,6,5,0,5,4,4,13,1,0,1,6,11,4,22,0 -13399,0,0,0,2,0,5,11,2,4,1,13,8,1,1,0 -13400,7,2,0,4,6,4,6,4,1,1,13,20,1,26,0 -13401,10,8,0,15,5,1,9,5,4,1,6,13,3,41,0 -13402,4,0,0,1,0,1,3,3,4,0,8,8,2,4,0 -13403,0,3,0,6,5,3,0,2,0,0,0,11,5,2,0 -13404,2,8,0,9,3,4,8,4,2,0,10,4,0,36,0 -13405,10,4,0,12,2,5,11,3,1,1,18,17,4,10,1 -13406,3,3,0,11,5,3,2,0,1,0,4,6,1,17,0 -13407,1,1,0,9,0,0,14,1,4,1,8,2,5,27,0 -13408,5,3,0,6,1,1,14,2,3,0,8,4,5,16,0 -13409,6,3,0,5,1,2,8,0,4,1,10,2,3,38,0 -13410,10,6,0,11,1,5,2,1,2,0,14,13,2,33,0 -13411,3,0,0,2,3,4,0,3,1,1,6,9,0,37,0 -13412,10,4,0,3,5,0,10,3,0,1,15,6,4,8,0 -13413,8,1,0,13,1,0,3,1,1,0,13,13,0,31,0 -13414,5,7,0,1,3,1,8,3,0,1,2,2,4,22,0 -13415,6,1,0,4,6,6,5,5,4,1,9,10,2,9,1 -13416,1,8,0,6,5,5,7,3,1,0,2,6,2,23,0 -13417,2,7,0,3,6,4,10,2,4,1,9,19,1,30,1 -13418,2,2,0,11,5,5,4,1,1,0,13,17,0,21,0 -13419,8,7,0,3,2,2,12,3,1,0,3,5,3,36,1 -13420,4,5,0,3,0,4,8,3,4,0,2,2,1,38,0 -13421,9,4,0,15,2,6,7,5,2,1,17,10,4,13,1 -13422,6,3,0,3,0,0,10,1,1,0,8,2,0,33,0 -13423,5,2,0,12,5,4,14,1,1,0,10,2,0,19,0 -13424,6,1,0,14,4,6,4,0,3,1,9,5,5,37,1 -13425,7,3,0,5,0,0,8,4,1,0,6,2,2,22,0 -13426,0,8,0,0,0,0,11,0,4,0,2,1,1,3,0 -13427,7,0,0,15,5,6,11,2,0,1,13,9,3,13,0 -13428,8,8,0,4,2,0,13,2,0,1,4,20,3,35,0 -13429,5,5,0,15,1,2,9,4,1,1,8,19,0,34,0 -13430,4,5,0,5,2,2,8,4,3,0,13,11,0,40,0 -13431,6,6,0,10,2,4,3,4,3,0,2,15,4,31,0 -13432,1,6,0,1,3,3,14,3,0,0,6,11,0,31,0 -13433,5,3,0,3,4,4,9,3,3,1,2,15,0,14,0 -13434,1,7,0,8,1,4,2,5,0,0,6,6,4,23,0 
-13435,3,5,0,13,2,3,9,0,2,1,7,4,4,41,0 -13436,3,8,0,3,2,3,14,0,4,1,13,20,1,7,0 -13437,4,0,0,10,6,4,13,0,1,1,2,15,2,37,0 -13438,5,1,0,13,0,4,7,2,0,1,17,9,0,0,0 -13439,1,7,0,1,1,5,10,3,1,0,13,2,5,13,0 -13440,0,2,0,15,1,4,9,2,4,1,17,2,5,14,0 -13441,9,3,0,6,0,3,12,4,2,0,10,4,0,31,0 -13442,1,6,0,4,6,3,14,3,2,1,8,3,3,0,0 -13443,0,4,0,4,0,2,13,4,1,1,2,5,1,26,0 -13444,9,6,0,9,0,5,3,0,1,1,13,9,0,35,0 -13445,3,8,0,0,3,5,6,0,0,1,8,2,0,23,0 -13446,9,2,0,15,4,3,2,5,4,0,18,19,2,4,1 -13447,6,8,0,14,0,0,12,2,1,1,15,20,4,29,0 -13448,10,2,0,2,0,3,9,1,0,1,13,9,0,0,0 -13449,1,4,0,4,0,5,14,4,2,1,7,9,2,4,0 -13450,0,8,0,1,5,6,7,2,0,1,0,14,0,37,0 -13451,1,8,0,9,5,0,14,2,0,1,10,15,4,41,0 -13452,2,8,0,2,2,6,2,2,4,0,4,10,4,37,0 -13453,8,6,0,10,3,0,1,3,3,0,1,7,3,41,1 -13454,0,4,0,3,0,3,0,3,4,0,6,2,3,27,0 -13455,10,8,0,9,2,2,5,3,0,1,2,10,5,29,0 -13456,9,8,0,9,4,1,5,5,4,1,7,16,0,22,1 -13457,6,0,0,3,2,6,14,3,3,0,8,11,1,27,0 -13458,0,5,0,6,1,0,12,3,3,1,2,13,1,1,0 -13459,8,3,0,8,1,4,11,2,4,1,14,3,2,33,0 -13460,5,3,0,4,3,3,2,1,3,1,3,5,2,7,1 -13461,9,4,0,3,6,2,10,5,2,1,9,10,1,7,1 -13462,6,6,0,12,1,0,4,1,0,1,5,16,5,18,1 -13463,1,6,0,12,2,6,14,4,1,0,16,11,0,23,0 -13464,1,5,0,9,5,0,4,3,0,1,10,16,2,12,0 -13465,1,0,0,3,6,3,7,3,3,0,8,9,3,25,0 -13466,0,8,0,4,0,2,14,1,0,1,4,14,5,14,0 -13467,10,8,0,12,3,3,7,4,2,0,10,2,0,0,0 -13468,3,7,0,1,5,2,9,4,3,0,17,2,1,20,0 -13469,7,6,0,12,6,4,8,4,0,1,1,19,2,30,1 -13470,0,0,0,7,1,6,7,4,0,1,13,15,1,13,0 -13471,6,4,0,5,4,2,8,3,0,0,5,19,1,3,1 -13472,8,5,0,11,6,3,9,1,0,0,2,4,4,0,0 -13473,1,7,0,13,6,0,6,0,2,1,15,13,2,2,0 -13474,2,0,0,10,3,5,7,4,2,0,13,9,5,39,0 -13475,2,2,0,5,2,2,13,1,1,1,3,19,2,18,1 -13476,2,7,0,5,0,6,4,2,0,0,0,16,0,12,0 -13477,2,6,0,13,3,2,6,4,1,0,4,11,5,27,0 -13478,9,3,0,12,6,6,10,4,2,0,13,18,0,33,0 -13479,0,6,0,3,0,4,7,3,0,0,13,2,0,0,0 -13480,1,0,0,6,6,1,4,0,1,1,17,4,1,9,0 -13481,1,1,0,1,1,4,8,3,4,1,2,11,0,4,0 -13482,3,5,0,1,6,0,2,1,3,1,13,11,1,4,0 -13483,3,7,0,1,2,5,10,2,0,1,2,17,1,5,0 -13484,9,0,0,13,2,6,5,3,2,0,15,2,2,37,0 
-13485,5,3,0,0,3,2,3,1,2,0,9,7,3,29,1 -13486,6,6,0,11,6,2,0,3,1,1,2,9,3,7,0 -13487,4,8,0,15,5,5,7,1,3,1,13,6,1,20,0 -13488,3,2,0,11,5,5,1,4,2,0,6,4,5,1,0 -13489,6,7,0,8,0,1,9,2,0,0,4,9,5,16,0 -13490,2,6,0,8,3,1,2,5,2,0,12,5,1,3,0 -13491,0,8,0,4,0,5,1,3,3,1,13,11,1,41,0 -13492,6,2,0,0,0,4,6,3,1,0,7,13,2,21,0 -13493,2,6,0,9,1,5,5,4,2,0,14,19,4,35,1 -13494,6,2,0,6,6,3,11,2,1,0,9,8,1,9,1 -13495,8,2,0,10,5,1,7,1,4,0,18,17,5,31,1 -13496,6,2,0,4,2,4,8,4,1,0,5,11,5,33,0 -13497,10,1,0,9,4,5,3,2,0,0,1,0,5,29,1 -13498,5,6,0,13,5,6,0,0,2,0,17,6,5,34,0 -13499,0,7,0,15,0,3,4,2,2,1,8,9,1,24,0 -13500,9,7,0,1,3,3,9,5,2,0,12,11,0,31,0 -13501,10,6,0,15,2,4,9,2,3,0,17,15,1,26,0 -13502,7,8,0,12,2,0,1,1,2,0,9,17,0,1,1 -13503,2,2,0,15,0,5,7,3,3,1,10,2,0,1,0 -13504,0,7,0,3,6,5,8,2,3,0,13,9,0,19,0 -13505,1,3,0,3,5,1,8,3,3,1,16,10,3,13,0 -13506,5,5,0,10,5,0,8,3,0,0,17,15,0,28,0 -13507,2,4,0,1,0,6,13,4,4,0,6,9,1,40,0 -13508,8,5,0,14,0,1,11,4,1,1,18,16,5,17,1 -13509,10,2,0,1,0,4,4,3,0,0,10,11,5,5,0 -13510,1,2,0,15,5,0,8,0,1,0,8,2,1,14,0 -13511,6,5,0,12,5,5,6,4,0,1,10,18,5,30,0 -13512,0,6,0,5,0,5,9,3,2,1,2,15,1,16,0 -13513,7,6,0,2,0,5,2,2,0,1,8,11,1,21,0 -13514,0,8,0,10,0,5,4,3,2,0,4,11,0,12,0 -13515,3,4,0,13,3,1,0,3,3,1,13,11,4,28,0 -13516,0,0,0,0,0,3,5,0,0,1,8,15,4,14,0 -13517,2,7,0,11,0,3,4,3,3,0,8,15,0,8,0 -13518,3,1,0,10,0,2,9,0,4,1,1,7,2,21,1 -13519,4,3,0,6,2,0,6,4,0,1,12,19,1,12,0 -13520,10,2,0,4,1,0,6,3,0,0,16,11,3,23,0 -13521,9,0,0,11,4,0,8,4,3,1,2,2,5,6,0 -13522,0,8,0,1,4,2,9,0,1,1,13,2,3,14,0 -13523,6,2,0,2,1,1,2,3,3,1,7,20,1,29,0 -13524,3,3,0,2,0,3,11,1,3,0,13,15,4,31,0 -13525,8,0,0,15,3,4,13,1,1,0,18,13,2,17,1 -13526,3,2,0,6,4,0,6,0,0,0,10,11,0,10,0 -13527,1,2,0,2,1,3,12,5,4,0,10,13,5,33,0 -13528,4,0,0,1,4,6,0,2,3,1,0,9,3,8,0 -13529,7,7,0,0,4,0,12,5,2,1,13,3,5,26,0 -13530,9,5,0,9,6,5,9,3,0,0,10,2,3,7,0 -13531,0,8,0,8,5,4,12,0,4,0,8,3,2,38,0 -13532,3,1,0,13,3,4,6,4,1,1,4,2,5,1,0 -13533,9,6,0,9,0,6,9,1,3,1,18,7,2,39,1 -13534,7,3,0,10,2,6,2,3,2,1,5,10,1,5,1 
-13535,5,8,0,11,3,5,8,2,4,1,17,9,2,9,0 -13536,1,5,0,13,1,2,9,3,4,0,12,12,2,38,0 -13537,1,5,0,2,6,2,8,4,2,0,4,15,1,13,0 -13538,3,3,0,5,0,4,1,2,1,0,6,17,3,4,0 -13539,0,5,0,5,1,5,9,3,0,0,2,2,4,13,0 -13540,6,6,0,3,0,0,13,1,2,0,12,0,5,2,0 -13541,2,5,0,0,6,0,6,0,4,1,2,8,3,26,0 -13542,2,0,0,0,6,0,8,4,3,1,2,11,1,12,0 -13543,1,2,0,3,1,6,10,1,2,0,12,13,1,29,0 -13544,8,1,0,14,0,2,9,4,1,1,8,1,4,27,1 -13545,5,1,0,12,3,2,10,0,4,0,11,1,3,21,1 -13546,3,2,0,9,3,4,9,1,3,1,8,0,4,5,0 -13547,7,7,0,3,4,0,9,2,0,0,17,11,5,38,0 -13548,7,6,0,7,4,5,11,5,0,0,16,8,4,13,1 -13549,0,8,0,11,0,1,7,3,0,0,13,6,0,26,0 -13550,6,2,0,13,6,2,0,2,3,1,2,9,0,6,0 -13551,3,3,0,12,0,2,8,4,1,0,0,20,0,29,0 -13552,0,4,0,14,1,0,1,3,2,1,13,4,2,30,0 -13553,6,2,0,6,4,0,5,3,4,0,13,2,5,23,0 -13554,10,2,0,6,1,2,7,0,0,1,18,12,3,27,1 -13555,3,0,0,1,3,3,8,2,0,1,17,11,0,38,0 -13556,6,6,0,9,5,5,12,3,0,1,18,20,3,27,1 -13557,1,5,0,15,3,4,12,5,0,1,5,4,0,9,0 -13558,2,6,0,3,1,0,5,2,1,0,2,2,4,0,0 -13559,3,6,0,4,0,4,10,2,2,1,8,2,2,14,0 -13560,7,3,0,3,2,2,0,0,4,1,4,11,0,3,0 -13561,9,0,0,10,6,0,8,2,1,1,12,0,4,1,0 -13562,9,3,0,3,1,0,9,2,0,1,4,11,3,28,0 -13563,3,5,0,7,6,6,7,4,3,0,13,11,0,25,0 -13564,6,1,0,12,5,1,5,3,2,0,17,4,1,19,0 -13565,8,1,0,6,2,6,8,2,2,0,2,10,5,41,0 -13566,10,6,0,11,6,3,3,4,3,0,2,10,0,11,0 -13567,2,5,0,9,3,6,5,0,3,1,17,18,1,40,0 -13568,7,5,0,2,4,3,14,0,2,1,18,10,4,6,1 -13569,0,0,0,3,0,3,8,2,1,0,8,1,4,12,0 -13570,1,5,0,3,5,2,13,1,0,0,12,16,2,28,0 -13571,5,2,0,5,6,5,0,4,0,0,6,9,1,25,0 -13572,0,0,0,5,0,2,14,1,0,0,4,11,0,32,0 -13573,5,7,0,13,4,3,9,1,2,0,6,19,4,11,1 -13574,0,1,0,3,0,2,9,4,0,0,15,3,0,28,0 -13575,3,7,0,9,2,3,4,0,4,1,1,3,3,37,1 -13576,10,7,0,1,0,2,7,2,3,0,12,2,5,33,0 -13577,5,0,0,10,1,6,6,2,3,0,6,14,5,5,0 -13578,1,1,0,7,1,6,8,2,1,1,2,2,4,29,0 -13579,0,4,0,11,0,4,10,3,0,1,4,18,4,26,0 -13580,1,7,0,3,3,4,9,2,2,0,2,11,0,30,0 -13581,4,0,0,5,3,1,10,5,2,1,5,8,1,0,1 -13582,7,1,0,7,5,1,2,5,0,1,18,10,1,40,1 -13583,2,6,0,12,0,0,5,2,3,0,13,15,2,16,0 -13584,6,0,0,14,2,3,14,1,1,1,10,3,5,31,0 
-13585,9,8,0,13,0,3,11,1,1,0,13,2,0,13,0 -13586,7,3,0,3,6,5,13,1,3,0,13,2,1,31,0 -13587,8,5,0,11,4,5,10,0,0,1,5,10,2,33,1 -13588,7,5,0,11,5,4,9,3,2,0,8,15,4,38,0 -13589,5,1,0,13,5,4,8,1,2,1,11,10,3,16,0 -13590,10,3,0,12,0,5,14,3,1,0,18,3,4,30,1 -13591,4,2,0,11,6,5,11,0,1,0,16,13,5,20,0 -13592,3,8,0,7,6,1,6,2,2,1,8,11,1,33,0 -13593,4,4,0,0,6,5,6,5,2,1,2,13,2,25,0 -13594,9,0,0,1,6,4,8,3,2,1,13,13,2,4,0 -13595,0,6,0,11,3,5,9,3,4,1,1,18,4,14,0 -13596,1,0,0,3,5,4,9,1,3,0,13,11,4,4,0 -13597,2,8,0,6,6,4,7,4,3,1,6,9,4,25,0 -13598,0,3,0,5,0,5,0,3,3,0,13,7,1,18,0 -13599,10,6,0,2,0,3,2,4,1,0,8,1,1,28,0 -13600,4,4,0,10,5,0,2,3,2,1,8,10,1,12,0 -13601,1,5,0,9,0,5,9,1,2,1,14,9,0,39,0 -13602,10,2,0,9,1,1,6,3,0,0,4,10,3,18,0 -13603,6,3,0,13,3,3,9,4,2,0,8,18,1,33,0 -13604,3,2,0,2,4,4,6,3,3,0,2,6,4,7,0 -13605,6,0,0,11,6,1,1,3,1,0,13,18,4,27,0 -13606,0,0,0,5,6,1,5,0,2,0,2,18,0,34,0 -13607,4,8,0,6,2,2,12,2,0,0,10,11,3,34,0 -13608,8,6,0,4,5,5,14,0,3,0,6,14,3,2,1 -13609,10,4,0,11,1,0,11,4,0,1,2,12,5,16,0 -13610,1,3,0,0,4,4,3,1,1,1,2,9,1,12,0 -13611,5,5,0,4,1,6,12,1,1,1,18,3,4,3,1 -13612,5,1,0,12,6,3,7,1,1,1,1,7,3,6,1 -13613,5,3,0,6,3,1,10,0,4,1,18,1,2,33,1 -13614,3,6,0,12,0,3,0,2,4,1,4,13,1,35,0 -13615,1,6,0,15,0,3,1,1,4,1,4,11,1,37,0 -13616,9,4,0,8,4,6,6,3,3,1,1,17,0,36,1 -13617,0,6,0,12,6,2,13,0,2,1,2,15,3,37,0 -13618,0,7,0,13,6,2,6,2,0,1,17,11,3,35,0 -13619,4,0,0,2,0,4,10,5,4,1,13,17,3,17,0 -13620,0,8,0,2,5,3,5,0,2,0,6,6,3,6,0 -13621,9,0,0,12,5,0,8,4,4,1,10,11,4,41,0 -13622,3,7,0,5,3,1,9,3,3,1,4,19,2,11,0 -13623,1,6,0,1,4,5,9,3,0,0,4,4,2,7,0 -13624,2,7,0,9,6,5,1,4,2,1,0,0,3,35,0 -13625,4,2,0,3,5,4,0,3,2,0,0,2,4,6,0 -13626,0,6,0,2,0,0,3,4,0,1,13,7,1,22,0 -13627,2,7,0,10,6,3,13,0,2,0,13,7,2,4,0 -13628,2,6,0,15,1,0,0,3,3,1,8,2,0,17,0 -13629,8,1,0,3,4,6,14,5,2,1,5,20,1,23,1 -13630,6,8,0,2,0,3,7,3,2,0,2,11,3,5,0 -13631,6,0,0,9,6,6,4,0,1,1,1,3,1,32,1 -13632,8,7,0,2,1,3,0,4,1,1,3,12,1,36,0 -13633,3,7,0,13,1,6,4,0,2,0,18,8,2,13,1 -13634,5,3,0,12,3,3,9,2,4,1,8,20,5,39,0 
-13635,9,1,0,8,6,2,5,0,4,0,18,17,4,16,1 -13636,6,5,0,11,3,6,9,4,1,1,15,2,0,27,0 -13637,3,3,0,3,5,5,9,3,3,1,17,18,0,37,0 -13638,10,8,0,14,5,3,4,5,2,0,14,19,5,7,1 -13639,10,4,0,11,2,5,1,1,3,1,6,9,1,40,0 -13640,6,6,0,1,2,5,0,2,1,0,1,6,0,22,0 -13641,1,2,0,15,0,4,13,1,1,0,2,6,0,31,0 -13642,2,4,0,15,6,3,1,2,1,1,4,18,0,36,0 -13643,0,8,0,2,6,2,1,1,0,1,2,15,3,2,0 -13644,3,7,0,6,2,3,1,4,4,1,18,7,5,32,1 -13645,0,6,0,1,0,6,0,3,2,1,8,13,2,3,0 -13646,0,3,0,7,0,6,4,3,2,1,13,11,3,38,0 -13647,0,3,0,13,4,4,4,3,4,0,6,2,0,3,0 -13648,3,3,0,1,0,4,9,4,0,1,15,11,3,6,0 -13649,6,3,0,5,1,6,0,3,3,0,6,2,1,35,0 -13650,2,5,0,6,0,4,12,5,3,1,13,14,2,29,0 -13651,2,1,0,13,5,0,14,1,3,1,17,11,0,21,0 -13652,6,6,0,11,2,0,2,5,2,0,9,5,0,25,1 -13653,2,4,0,9,0,1,4,1,1,0,2,11,3,17,0 -13654,3,4,0,4,0,5,7,4,4,0,9,20,3,19,0 -13655,9,4,0,5,1,1,2,4,4,1,9,10,1,30,1 -13656,0,6,0,3,3,0,9,3,0,0,2,2,5,11,0 -13657,1,2,0,5,1,2,5,0,2,1,2,20,2,25,0 -13658,7,6,0,12,0,5,5,4,2,0,10,11,2,10,0 -13659,1,8,0,10,6,1,0,3,1,1,13,11,4,3,0 -13660,0,7,0,13,2,5,14,2,0,0,3,4,5,7,0 -13661,2,5,0,2,1,5,3,2,2,0,16,2,0,38,0 -13662,0,8,0,3,6,0,14,4,4,0,4,16,3,7,0 -13663,8,0,0,2,3,6,2,0,4,0,2,3,1,35,0 -13664,10,5,0,11,0,6,13,0,1,1,11,7,5,32,1 -13665,6,7,0,1,6,1,0,4,0,1,2,11,2,20,0 -13666,9,0,0,13,6,4,10,1,1,1,8,13,1,19,0 -13667,0,1,0,9,3,4,2,4,4,1,5,17,4,31,1 -13668,4,2,0,0,0,2,0,1,3,0,12,6,3,7,0 -13669,5,1,0,2,1,3,11,5,4,0,9,7,4,3,1 -13670,3,8,0,0,1,0,6,4,0,0,8,9,3,30,0 -13671,2,8,0,11,0,4,2,3,2,1,13,18,2,24,0 -13672,10,8,0,11,3,1,10,1,0,0,4,15,2,4,0 -13673,9,1,0,1,4,4,10,3,1,0,0,11,3,34,0 -13674,3,3,0,13,3,6,9,0,1,1,17,19,1,30,0 -13675,10,2,0,3,1,6,6,3,0,1,2,0,2,31,0 -13676,0,5,0,4,0,5,8,2,2,0,17,15,4,16,0 -13677,1,4,0,11,1,4,0,2,4,1,8,18,0,26,0 -13678,10,0,0,10,6,4,2,0,2,0,17,11,2,4,0 -13679,0,5,0,9,0,6,5,2,4,0,8,11,4,9,0 -13680,2,0,0,8,0,1,3,1,1,1,8,14,2,36,0 -13681,5,3,0,13,1,4,5,4,3,0,6,5,0,29,0 -13682,9,8,0,9,4,2,11,4,2,1,18,4,4,28,1 -13683,1,2,0,8,0,3,12,2,2,0,8,18,0,25,0 -13684,3,6,0,4,5,5,0,1,0,1,17,11,1,22,0 
-13685,5,6,0,15,5,4,12,2,2,1,8,12,0,16,0 -13686,4,8,0,0,0,5,6,2,4,1,12,11,3,23,0 -13687,0,1,0,14,5,3,8,4,0,0,10,19,0,37,0 -13688,9,8,0,4,3,1,10,4,4,1,16,14,5,26,1 -13689,0,7,0,12,1,4,14,3,0,1,15,5,1,0,0 -13690,5,7,0,0,0,5,8,5,0,0,4,9,1,35,0 -13691,0,3,0,13,5,2,4,5,3,1,2,9,0,29,0 -13692,3,7,0,13,0,5,4,5,0,1,0,9,5,21,0 -13693,7,7,0,4,1,2,1,5,1,1,13,7,1,12,1 -13694,7,0,0,3,6,5,0,3,2,1,0,5,3,40,0 -13695,2,2,0,3,2,4,5,4,4,1,0,2,3,33,0 -13696,2,4,0,6,0,4,6,1,1,0,17,13,2,3,0 -13697,10,3,0,15,2,0,4,4,1,0,2,2,1,30,0 -13698,9,1,0,11,5,6,12,3,4,1,18,5,3,20,1 -13699,0,8,0,11,0,0,10,0,4,0,8,15,0,40,0 -13700,8,8,0,5,5,6,12,1,4,0,14,3,4,22,1 -13701,6,3,0,1,0,0,7,4,2,0,2,2,2,25,0 -13702,9,7,0,2,6,3,10,1,0,1,15,11,5,3,0 -13703,9,1,0,12,6,3,3,3,3,0,18,12,4,2,1 -13704,5,3,0,0,6,1,8,0,1,1,13,11,1,31,0 -13705,2,3,0,1,4,0,8,1,1,0,8,1,5,6,0 -13706,2,8,0,1,1,5,14,4,1,1,2,18,1,41,0 -13707,2,3,0,0,6,6,11,3,3,0,8,18,4,3,0 -13708,0,0,0,3,3,5,0,1,3,0,15,2,0,11,0 -13709,7,3,0,10,3,6,5,4,4,1,18,17,1,9,1 -13710,4,4,0,14,0,3,3,3,0,0,13,2,0,37,0 -13711,2,3,0,13,6,2,9,2,3,1,11,11,0,6,0 -13712,7,5,0,2,5,3,5,1,0,1,13,5,5,33,0 -13713,1,8,0,9,5,2,3,0,2,0,10,4,4,35,0 -13714,2,8,0,10,5,4,12,1,0,0,2,0,3,26,0 -13715,0,3,0,14,4,1,7,3,0,0,14,6,0,20,0 -13716,9,2,0,9,3,6,2,5,3,1,8,12,5,35,0 -13717,7,0,0,7,0,2,9,3,4,1,7,19,5,34,1 -13718,8,2,0,1,0,0,6,4,1,0,17,11,5,14,0 -13719,7,8,0,2,0,4,10,3,3,1,1,14,1,21,1 -13720,4,0,0,7,5,1,4,3,1,0,16,11,5,21,0 -13721,5,4,0,11,2,5,1,0,1,0,8,2,4,16,0 -13722,2,5,0,4,3,4,9,3,3,1,17,12,2,7,0 -13723,5,7,0,13,1,0,5,3,0,0,13,2,4,35,0 -13724,0,0,0,8,6,1,3,3,4,1,15,2,5,32,0 -13725,5,0,0,14,6,5,1,5,2,1,12,5,2,10,1 -13726,7,1,0,1,2,0,3,3,3,0,13,11,4,6,0 -13727,9,3,0,0,0,3,10,4,3,1,2,11,0,19,0 -13728,2,0,0,3,6,6,14,1,0,1,13,0,3,4,0 -13729,6,3,0,7,3,3,2,2,1,1,16,19,2,17,1 -13730,6,6,0,2,3,6,8,1,4,0,13,11,3,22,0 -13731,7,7,0,12,1,3,11,4,2,0,8,15,1,26,0 -13732,0,7,0,11,5,4,5,4,1,0,7,15,0,23,0 -13733,0,8,0,2,5,1,9,4,2,1,17,4,0,17,0 -13734,0,3,0,15,0,4,4,1,3,0,2,2,0,26,0 
-13735,2,7,0,15,0,0,9,2,1,0,2,18,4,21,0 -13736,0,3,0,13,2,4,8,4,0,0,13,14,2,12,0 -13737,1,7,0,4,4,4,7,2,4,1,5,16,2,38,1 -13738,2,8,0,4,3,5,5,5,0,1,12,11,0,37,0 -13739,8,1,0,4,1,6,11,0,2,1,0,3,2,39,1 -13740,0,5,0,0,5,1,8,2,3,0,2,11,3,41,0 -13741,1,2,0,0,0,0,0,4,0,0,2,11,1,7,0 -13742,0,7,0,13,0,0,12,1,1,0,8,3,0,27,0 -13743,7,3,0,12,5,4,7,4,4,1,2,11,2,4,0 -13744,3,3,0,7,0,2,6,2,2,0,13,13,1,17,0 -13745,0,6,0,11,0,1,3,2,1,1,17,9,0,29,0 -13746,2,8,0,13,6,3,3,4,3,0,3,13,5,16,0 -13747,10,5,0,15,1,6,0,3,1,0,10,18,1,12,0 -13748,3,7,0,12,2,6,1,5,1,0,12,8,0,41,0 -13749,9,8,0,7,2,6,9,0,2,1,8,11,4,41,0 -13750,6,5,0,3,1,4,0,4,2,1,17,18,1,14,0 -13751,0,2,0,6,4,6,0,3,2,0,13,0,5,8,0 -13752,2,1,0,8,0,2,5,3,0,1,7,13,0,38,0 -13753,6,3,0,1,6,4,7,2,2,0,0,15,1,22,0 -13754,0,6,0,15,2,4,9,0,1,1,18,5,0,39,1 -13755,8,5,0,15,0,6,3,4,1,1,12,12,4,33,0 -13756,7,2,0,8,5,1,5,0,2,0,15,8,4,16,1 -13757,4,2,0,11,4,3,0,2,0,0,13,11,5,8,0 -13758,3,1,0,9,0,4,2,2,2,0,17,2,1,39,0 -13759,6,5,0,1,1,0,12,1,3,0,7,9,3,7,0 -13760,1,0,0,4,1,1,9,1,3,1,15,0,0,38,0 -13761,2,1,0,8,0,5,12,5,3,0,13,15,0,34,0 -13762,2,8,0,12,5,5,9,4,1,1,17,3,0,33,0 -13763,0,2,0,8,5,4,9,4,0,1,13,2,3,38,0 -13764,2,1,0,8,0,6,3,4,2,1,17,3,4,18,0 -13765,9,0,0,13,2,2,9,0,2,0,8,16,1,6,0 -13766,8,3,0,4,2,4,5,5,3,1,1,7,2,24,1 -13767,6,0,0,5,0,0,8,5,2,0,2,18,1,40,0 -13768,1,2,0,3,3,5,4,2,1,0,13,20,2,10,0 -13769,0,1,0,7,4,6,5,4,4,1,17,7,2,10,1 -13770,5,6,0,13,1,6,14,5,1,0,2,18,0,34,0 -13771,5,8,0,4,5,6,5,0,3,0,18,14,5,12,1 -13772,7,7,0,7,5,2,14,5,1,1,5,17,1,2,1 -13773,5,5,0,4,5,0,6,0,1,0,18,12,5,8,1 -13774,9,2,0,1,1,2,1,1,0,1,6,1,4,22,1 -13775,10,3,0,7,0,4,6,4,4,1,2,18,0,11,0 -13776,0,6,0,13,2,1,9,0,3,0,12,15,5,37,0 -13777,5,1,0,14,1,2,4,5,3,1,16,7,5,13,1 -13778,6,6,0,10,0,5,11,4,0,1,6,9,4,0,0 -13779,10,2,0,7,1,5,10,1,1,1,12,20,1,16,0 -13780,1,2,0,10,0,1,14,3,0,1,7,5,2,17,1 -13781,10,2,0,2,2,2,6,4,0,1,0,18,3,32,0 -13782,7,7,0,9,2,1,8,4,1,0,17,11,0,23,0 -13783,6,5,0,3,5,5,9,3,2,0,8,13,4,34,0 -13784,5,1,0,10,2,3,13,5,2,1,9,14,0,38,1 
-13785,5,8,0,4,6,1,1,5,3,0,18,10,0,14,1 -13786,3,8,0,8,5,2,8,4,2,1,10,19,0,19,0 -13787,8,5,0,5,5,6,14,2,3,0,9,3,1,0,1 -13788,8,6,0,14,1,4,2,2,1,0,6,20,5,11,1 -13789,2,6,0,15,2,4,0,5,0,1,3,17,1,7,1 -13790,5,7,0,0,3,5,11,5,1,1,16,17,4,7,1 -13791,2,2,0,15,6,1,12,2,1,1,10,13,0,33,0 -13792,4,8,0,15,0,3,6,3,0,0,15,20,0,35,0 -13793,3,2,0,7,3,1,13,5,2,0,7,8,3,39,1 -13794,1,0,0,0,1,5,0,2,0,1,8,20,0,8,0 -13795,10,8,0,5,5,5,10,4,3,0,11,13,5,6,1 -13796,9,7,0,13,2,4,14,1,3,0,2,3,3,20,0 -13797,0,5,0,12,4,3,5,0,3,0,8,11,4,5,0 -13798,0,4,0,15,2,4,10,3,4,0,12,2,1,27,0 -13799,10,2,0,4,1,5,6,3,3,0,2,20,0,31,0 -13800,1,2,0,5,2,0,12,4,2,1,12,20,5,33,0 -13801,0,4,0,0,1,2,10,2,4,0,6,2,1,1,0 -13802,3,1,0,3,6,5,5,4,1,1,13,2,0,4,0 -13803,2,5,0,13,0,5,8,1,0,1,2,2,5,10,0 -13804,0,3,0,2,6,6,7,0,3,0,17,0,4,26,0 -13805,5,4,0,0,3,2,2,3,3,0,13,0,4,34,0 -13806,8,0,0,13,6,1,6,5,0,0,18,15,3,22,1 -13807,8,3,0,15,6,5,5,1,3,0,12,2,2,3,0 -13808,10,3,0,11,1,4,11,4,4,1,10,5,2,18,1 -13809,10,2,0,15,6,3,6,1,0,0,0,13,5,13,0 -13810,2,7,0,10,0,1,4,2,1,0,2,0,3,26,0 -13811,3,3,0,12,6,5,3,1,0,1,2,6,0,10,0 -13812,0,3,0,5,6,3,3,3,4,1,13,0,1,18,0 -13813,4,2,0,11,4,3,14,2,1,1,5,7,1,12,1 -13814,0,4,0,13,0,1,5,2,0,1,6,17,2,31,0 -13815,4,7,0,14,5,4,5,4,4,1,0,11,2,27,0 -13816,10,8,0,14,4,3,3,4,4,1,6,6,0,34,0 -13817,9,4,0,4,6,3,12,3,0,0,8,11,5,35,0 -13818,6,7,0,13,2,3,7,2,1,0,8,12,1,36,0 -13819,1,5,0,8,2,1,14,4,3,1,12,17,5,12,1 -13820,5,3,0,3,4,4,13,5,3,1,18,17,0,25,1 -13821,6,5,0,9,2,3,6,4,4,1,0,11,3,34,0 -13822,6,4,0,0,2,1,14,5,3,0,18,8,0,26,1 -13823,7,1,0,13,4,0,1,2,4,0,1,16,4,8,1 -13824,1,5,0,1,1,1,13,3,4,1,4,2,2,36,0 -13825,1,6,0,5,6,2,5,5,0,0,2,9,5,11,0 -13826,6,7,0,15,4,5,1,5,4,0,9,8,5,10,1 -13827,4,8,0,5,0,3,0,2,0,0,4,4,1,27,0 -13828,8,8,0,10,6,5,5,3,0,1,2,2,4,34,0 -13829,1,2,0,6,3,5,0,3,2,0,15,2,1,7,0 -13830,1,8,0,13,4,2,5,2,1,0,2,2,4,7,0 -13831,2,0,0,3,5,3,5,4,2,0,13,15,2,12,0 -13832,2,6,0,2,2,2,7,5,3,1,18,13,1,22,1 -13833,6,7,0,5,0,4,6,0,0,0,13,9,1,5,0 -13834,3,1,0,9,4,3,8,3,0,1,12,4,5,6,0 
-13835,5,6,0,14,0,6,7,5,3,0,6,15,0,9,0 -13836,5,6,0,12,5,5,7,1,4,1,4,9,2,0,0 -13837,8,8,0,4,2,5,13,0,1,0,8,13,2,39,0 -13838,2,6,0,6,6,1,9,4,3,0,0,2,5,13,0 -13839,1,8,0,10,0,6,9,5,2,1,8,9,0,33,0 -13840,0,8,0,8,3,1,8,0,3,1,9,1,1,3,1 -13841,3,0,0,2,1,5,8,4,3,0,8,9,3,37,0 -13842,7,4,0,7,3,1,8,4,3,1,18,17,4,3,1 -13843,4,1,0,5,6,1,9,1,0,1,4,3,0,17,0 -13844,2,4,0,6,3,0,3,5,1,0,16,17,0,34,0 -13845,1,0,0,5,4,5,13,0,2,0,6,11,0,9,0 -13846,4,3,0,0,6,0,13,2,1,0,5,2,0,24,0 -13847,10,3,0,4,3,1,2,4,4,0,13,9,4,16,0 -13848,6,3,0,14,1,1,2,4,2,0,10,14,4,39,0 -13849,3,7,0,13,0,5,2,1,3,1,8,2,2,18,0 -13850,0,2,0,3,5,5,11,4,2,0,2,11,5,3,0 -13851,10,5,0,3,3,6,0,5,0,1,2,7,5,31,1 -13852,9,1,0,2,0,4,11,4,1,0,2,14,1,7,0 -13853,3,1,0,15,0,3,0,1,0,0,12,15,0,7,0 -13854,10,5,0,14,0,1,6,3,2,1,17,3,1,12,0 -13855,1,8,0,2,6,6,2,1,2,0,17,9,3,11,0 -13856,10,5,0,0,4,0,11,2,1,0,17,10,0,35,1 -13857,2,5,0,13,2,6,12,3,1,0,14,11,0,32,0 -13858,8,7,0,7,0,6,4,2,4,0,5,3,1,34,1 -13859,1,2,0,14,3,0,1,4,2,1,11,12,2,38,0 -13860,7,5,0,13,3,2,13,5,4,0,10,1,5,8,1 -13861,9,4,0,3,0,0,3,1,0,0,3,20,2,12,0 -13862,8,7,0,15,1,0,7,3,1,0,6,2,5,2,0 -13863,3,3,0,3,0,3,6,2,0,0,8,16,1,10,0 -13864,8,8,0,10,3,6,10,0,1,1,5,7,3,8,1 -13865,4,6,0,15,4,6,6,2,4,0,6,20,5,3,0 -13866,6,0,0,6,4,1,9,5,4,1,16,16,4,14,1 -13867,1,8,0,0,2,3,4,4,1,0,5,14,3,4,0 -13868,2,1,0,8,6,5,8,0,0,0,17,9,4,7,0 -13869,0,8,0,0,2,3,12,0,4,0,18,7,5,40,1 -13870,0,8,0,5,5,5,5,0,2,0,13,2,3,23,0 -13871,2,0,0,10,1,0,10,0,4,0,8,6,1,38,0 -13872,1,0,0,12,1,1,5,3,2,0,10,0,0,0,0 -13873,1,5,0,11,5,1,8,0,1,0,8,11,4,18,0 -13874,3,4,0,8,6,0,7,2,2,0,17,19,1,7,0 -13875,10,0,0,10,2,4,13,0,3,1,18,14,3,3,1 -13876,7,0,0,9,4,6,6,0,4,1,1,10,3,20,1 -13877,4,5,0,3,6,4,12,0,0,0,17,15,5,39,0 -13878,1,5,0,6,1,0,5,4,4,0,13,20,0,1,0 -13879,9,0,0,7,6,4,11,5,1,1,5,7,0,27,1 -13880,1,1,0,14,3,5,7,2,2,1,2,6,3,38,0 -13881,3,2,0,11,5,0,14,3,0,0,4,5,4,38,0 -13882,7,7,0,5,6,5,9,2,2,1,15,2,5,25,0 -13883,5,8,0,6,0,0,3,3,2,0,2,9,2,18,0 -13884,6,7,0,0,6,2,11,5,0,1,18,17,4,28,1 
-13885,1,0,0,2,3,6,11,1,3,0,0,8,2,7,0 -13886,0,7,0,5,3,3,3,5,0,0,6,13,1,4,0 -13887,4,8,0,1,0,5,3,4,1,0,2,11,3,30,0 -13888,7,2,0,4,0,5,5,3,2,1,0,4,0,6,0 -13889,6,1,0,11,4,0,3,4,0,0,14,2,3,40,0 -13890,7,3,0,9,2,3,6,0,3,1,18,5,4,9,1 -13891,5,6,0,12,1,5,8,1,0,1,0,18,4,37,0 -13892,6,7,0,5,1,5,2,4,3,1,18,15,4,5,0 -13893,2,8,0,3,0,5,5,0,4,0,6,2,4,5,0 -13894,3,4,0,5,2,5,3,2,1,1,2,0,0,28,0 -13895,2,7,0,9,0,0,11,4,3,0,16,19,0,36,0 -13896,3,3,0,5,2,0,12,3,3,1,9,2,0,7,0 -13897,0,0,0,9,0,5,0,3,1,1,17,11,5,5,0 -13898,6,8,0,12,6,2,4,5,3,1,1,7,0,35,1 -13899,0,7,0,1,0,4,9,2,3,1,13,18,3,37,0 -13900,0,3,0,0,1,0,4,2,0,0,4,0,5,7,0 -13901,0,8,0,5,5,1,4,4,3,1,8,9,3,33,1 -13902,0,0,0,6,5,1,7,0,3,1,0,20,3,6,0 -13903,0,8,0,9,0,1,8,2,4,0,8,4,4,8,1 -13904,0,4,0,9,3,0,9,5,4,0,13,18,3,26,0 -13905,8,2,0,14,6,2,8,1,4,1,1,8,0,6,1 -13906,0,2,0,14,5,5,8,4,2,0,2,11,2,24,0 -13907,4,8,0,8,6,6,8,3,1,0,4,2,2,36,0 -13908,6,4,0,3,6,3,6,5,0,1,2,9,2,23,0 -13909,3,3,0,7,0,2,2,0,3,1,16,4,5,4,1 -13910,6,7,0,6,3,6,9,3,0,1,13,13,5,6,0 -13911,2,3,0,8,0,3,4,3,1,0,4,4,1,23,0 -13912,10,8,0,3,2,0,14,1,0,1,2,11,1,19,0 -13913,6,2,0,1,0,2,2,1,0,0,10,2,4,14,0 -13914,9,8,0,8,6,4,2,5,0,1,4,20,3,35,0 -13915,1,4,0,1,0,4,8,2,0,1,2,11,1,31,0 -13916,3,5,0,12,6,4,8,3,3,0,4,2,0,34,0 -13917,1,4,0,0,6,5,0,2,1,1,3,7,2,41,1 -13918,9,7,0,2,2,2,10,0,0,1,18,7,1,20,1 -13919,5,1,0,8,3,5,13,3,4,1,18,7,5,31,0 -13920,1,5,0,3,4,4,8,1,3,0,2,11,1,3,0 -13921,3,2,0,6,2,3,11,4,4,0,2,13,0,16,0 -13922,7,6,0,0,0,4,13,1,2,0,2,15,4,38,0 -13923,9,5,0,0,0,6,9,2,3,0,8,15,0,18,0 -13924,9,2,0,15,4,6,8,1,2,0,1,9,1,18,0 -13925,5,1,0,6,0,2,12,0,3,1,12,1,4,16,1 -13926,9,4,0,10,6,5,8,2,3,0,18,8,2,30,1 -13927,0,0,0,8,1,4,9,4,4,0,13,2,4,14,0 -13928,0,4,0,14,1,0,9,1,0,1,17,3,5,18,0 -13929,0,2,0,12,6,4,5,3,1,1,17,4,0,3,0 -13930,9,7,0,4,4,1,12,2,1,1,1,14,4,22,1 -13931,3,5,0,3,0,0,2,2,3,0,0,3,2,3,0 -13932,1,8,0,5,6,3,5,0,2,1,3,13,3,27,0 -13933,0,0,0,2,5,5,9,2,2,0,12,3,0,23,0 -13934,9,8,0,7,2,5,7,4,4,0,14,6,3,40,1 -13935,2,3,0,0,0,4,2,1,2,0,2,0,4,11,0 
-13936,0,6,0,2,6,0,7,3,4,0,6,11,4,9,0 -13937,0,8,0,13,5,0,14,2,3,0,6,18,1,23,0 -13938,2,6,0,6,0,2,13,0,3,0,2,13,2,16,0 -13939,3,1,0,7,4,5,4,5,1,1,5,14,5,19,0 -13940,5,0,0,12,3,6,13,1,4,0,17,17,1,21,1 -13941,1,0,0,0,1,2,6,5,4,0,13,11,0,9,0 -13942,7,4,0,13,1,3,8,4,0,1,8,11,4,14,0 -13943,9,2,0,4,6,1,9,0,4,1,2,4,2,14,0 -13944,9,2,0,5,1,6,11,4,0,0,12,6,0,2,0 -13945,2,6,0,4,2,1,1,3,2,1,13,6,2,3,0 -13946,1,8,0,5,6,5,5,0,3,0,10,5,2,32,0 -13947,2,1,0,1,0,4,5,3,4,1,6,9,5,10,0 -13948,0,2,0,2,5,6,5,3,3,0,17,2,4,14,0 -13949,10,8,0,14,2,4,7,2,3,1,4,14,0,27,0 -13950,10,3,0,15,0,0,8,3,0,0,12,18,2,39,0 -13951,0,2,0,13,3,1,1,5,4,0,13,11,0,27,0 -13952,4,1,0,13,4,3,9,4,3,1,12,13,0,19,1 -13953,0,4,0,6,0,0,0,3,0,0,8,11,1,0,0 -13954,0,0,0,10,2,0,5,4,4,0,10,3,3,19,0 -13955,0,1,0,13,1,1,14,3,2,0,15,20,4,40,0 -13956,10,6,0,10,0,6,13,2,0,0,13,2,3,16,0 -13957,4,2,0,2,0,3,9,2,2,0,11,13,2,25,0 -13958,9,5,0,12,6,5,3,5,3,1,8,18,1,39,0 -13959,2,0,0,0,5,2,9,2,1,0,4,8,1,11,0 -13960,2,6,0,14,0,6,3,3,0,0,13,11,2,32,0 -13961,2,1,0,4,4,0,12,4,4,0,13,18,0,14,0 -13962,1,1,0,13,0,0,2,2,4,1,13,6,1,8,0 -13963,9,5,0,15,0,0,1,0,4,1,0,11,1,20,0 -13964,3,5,0,10,3,0,3,2,2,0,9,17,4,36,1 -13965,1,6,0,12,6,6,0,5,0,1,7,7,2,28,1 -13966,1,1,0,3,5,4,10,5,3,1,2,15,5,1,0 -13967,5,1,0,12,6,1,6,5,1,0,2,16,2,10,0 -13968,10,4,0,10,0,2,11,3,1,1,3,4,0,25,1 -13969,8,2,0,1,0,6,1,5,2,0,15,10,3,27,1 -13970,0,0,0,15,1,4,10,2,4,1,14,6,3,4,0 -13971,10,2,0,11,0,3,13,3,3,1,13,16,2,10,0 -13972,1,2,0,3,2,1,11,1,0,0,13,6,1,31,0 -13973,0,4,0,1,4,2,9,2,3,1,8,6,4,19,0 -13974,3,4,0,14,5,6,14,3,0,0,13,13,3,26,0 -13975,0,7,0,5,2,1,10,5,2,0,16,2,4,28,0 -13976,3,2,0,10,6,6,2,4,4,1,5,14,1,1,1 -13977,0,2,0,1,5,1,6,2,0,1,4,15,0,35,0 -13978,10,2,0,0,0,5,7,2,0,0,8,0,1,21,0 -13979,6,7,0,2,5,3,7,4,3,1,8,18,1,9,0 -13980,9,4,0,1,3,5,5,3,0,0,15,9,0,26,0 -13981,9,5,0,2,6,3,0,4,1,0,13,18,5,24,0 -13982,1,7,0,4,5,4,0,4,2,1,12,4,4,34,0 -13983,0,1,0,11,4,5,14,0,1,0,10,4,5,35,0 -13984,3,1,0,12,3,5,9,5,3,0,5,10,5,28,1 -13985,1,5,0,12,2,6,10,4,2,0,16,0,2,2,1 
-13986,0,6,0,12,0,5,1,2,2,1,13,2,3,26,0 -13987,9,5,0,7,5,6,2,4,1,1,7,17,2,13,1 -13988,7,1,0,8,2,2,12,4,3,0,1,17,2,5,1 -13989,4,5,0,8,6,5,0,0,3,1,2,13,5,32,0 -13990,2,7,0,13,4,3,8,5,0,0,16,5,1,4,0 -13991,8,3,0,5,3,1,5,0,1,1,7,11,0,39,0 -13992,8,1,0,3,6,2,3,2,0,0,9,0,0,18,1 -13993,9,5,0,8,5,0,3,4,1,0,1,5,2,9,1 -13994,6,6,0,11,1,2,9,4,0,1,13,6,4,4,0 -13995,1,7,0,4,6,2,6,1,0,0,8,2,4,8,0 -13996,4,4,0,14,0,1,10,1,0,1,13,2,1,8,0 -13997,0,6,0,4,5,5,10,5,4,1,9,6,5,41,0 -13998,1,2,0,10,3,2,10,1,4,0,1,7,4,9,1 -13999,9,1,0,13,4,2,7,4,2,1,7,5,2,0,1 -14000,4,4,0,12,1,2,6,3,3,1,13,2,0,26,0 -14001,1,3,0,13,0,5,9,2,0,0,13,18,2,5,0 -14002,8,3,0,9,2,1,1,3,1,1,18,7,2,13,1 -14003,5,7,0,7,4,6,13,3,2,0,5,19,1,36,1 -14004,9,4,0,0,6,5,5,0,1,0,2,16,0,23,0 -14005,5,3,0,9,1,4,10,3,1,1,2,9,5,0,0 -14006,5,5,0,13,2,2,10,0,3,0,9,1,3,1,1 -14007,8,5,0,5,4,5,7,3,3,0,2,2,0,38,0 -14008,6,0,0,3,4,2,2,5,0,1,9,15,4,21,1 -14009,1,8,0,9,0,3,12,2,0,1,2,2,5,21,0 -14010,3,8,0,14,0,6,13,3,1,0,8,12,2,35,0 -14011,2,7,0,0,0,5,5,3,3,1,6,15,1,24,0 -14012,2,0,0,1,0,6,6,3,2,1,8,15,5,16,0 -14013,9,1,0,10,1,1,0,4,0,0,0,5,2,41,1 -14014,5,5,0,15,0,6,9,5,4,1,10,16,5,36,0 -14015,0,1,0,11,0,4,10,4,0,0,16,3,1,37,0 -14016,10,8,0,13,5,5,3,4,1,0,12,18,0,29,0 -14017,4,0,0,13,3,6,2,4,3,0,6,2,5,5,0 -14018,3,6,0,15,6,3,5,0,3,1,2,18,5,13,0 -14019,8,0,0,14,2,0,9,3,0,0,2,3,2,40,0 -14020,2,3,0,9,2,0,8,0,2,1,11,14,1,39,0 -14021,2,8,0,11,0,5,1,2,4,1,13,16,3,16,0 -14022,2,4,0,5,5,5,5,1,0,1,4,3,0,34,0 -14023,1,0,0,5,5,5,2,3,3,0,8,16,0,8,0 -14024,7,4,0,7,5,3,13,4,4,1,8,3,2,18,1 -14025,6,5,0,8,0,4,5,5,3,0,2,15,1,3,0 -14026,9,7,0,5,0,3,7,0,2,0,5,6,1,28,1 -14027,0,3,0,1,2,3,12,1,2,1,17,2,0,40,0 -14028,0,4,0,13,5,3,13,2,3,0,17,2,5,33,0 -14029,3,0,0,7,3,6,9,1,1,1,1,17,3,36,1 -14030,1,7,0,7,6,4,14,3,3,1,13,18,2,4,0 -14031,10,2,0,13,5,5,9,4,0,0,13,15,2,13,0 -14032,2,6,0,7,6,3,0,3,1,1,8,12,2,20,0 -14033,10,1,0,3,6,4,3,3,0,1,4,3,1,41,0 -14034,0,6,0,14,6,0,0,1,1,0,4,13,5,26,0 -14035,1,7,0,0,1,0,13,0,0,1,8,2,1,36,0 
-14036,2,6,0,14,0,4,11,3,1,1,8,6,0,34,0 -14037,3,8,0,3,6,0,7,4,1,1,12,15,2,19,0 -14038,5,3,0,11,6,6,2,2,0,1,4,18,2,0,0 -14039,0,2,0,15,0,4,5,2,3,0,2,15,2,29,0 -14040,6,0,0,13,3,3,10,0,4,1,18,19,2,41,1 -14041,6,3,0,1,5,5,12,1,4,0,13,0,3,26,0 -14042,10,6,0,11,0,6,12,3,0,0,8,6,5,26,0 -14043,6,2,0,0,5,3,14,3,3,1,0,11,4,22,0 -14044,6,2,0,3,2,4,4,3,2,1,12,18,1,29,0 -14045,1,0,0,2,1,4,9,3,1,1,6,2,3,28,0 -14046,9,6,0,10,4,4,8,3,4,1,16,19,0,3,0 -14047,0,8,0,1,6,0,9,4,1,1,13,15,0,35,0 -14048,0,6,0,7,0,6,4,1,4,0,13,3,2,19,0 -14049,1,8,0,5,0,4,6,3,2,0,2,1,4,26,0 -14050,9,1,0,3,6,4,11,4,4,1,1,13,0,20,0 -14051,2,2,0,5,3,1,3,5,1,1,2,17,1,7,0 -14052,0,5,0,11,3,5,14,3,0,0,13,16,4,20,0 -14053,0,8,0,5,6,5,12,5,3,0,6,11,0,24,0 -14054,8,2,0,14,1,3,13,3,1,0,18,7,4,34,1 -14055,9,7,0,11,0,5,5,1,3,0,12,9,0,21,0 -14056,3,4,0,0,5,2,5,4,3,0,13,0,1,1,0 -14057,5,6,0,7,5,5,7,5,3,1,17,9,1,30,0 -14058,8,8,0,13,2,4,7,0,0,0,16,15,0,1,0 -14059,5,5,0,7,1,5,2,5,0,0,9,5,2,16,1 -14060,9,2,0,10,4,1,1,5,1,0,1,1,4,20,1 -14061,2,3,0,2,1,2,0,0,0,1,6,8,3,31,1 -14062,2,4,0,2,5,2,11,0,4,0,7,1,0,11,1 -14063,5,8,0,3,5,0,13,5,0,0,5,11,2,9,0 -14064,0,6,0,7,0,0,0,2,1,1,16,0,3,6,0 -14065,2,0,0,6,0,4,14,4,1,0,13,12,2,26,0 -14066,0,3,0,11,1,0,10,4,4,0,1,13,2,4,0 -14067,2,7,0,13,0,1,5,4,3,1,15,20,0,18,0 -14068,1,6,0,8,3,2,5,2,3,1,13,13,1,6,0 -14069,2,7,0,3,2,1,9,0,4,0,6,16,1,18,0 -14070,1,8,0,9,4,3,5,2,3,1,7,0,5,32,0 -14071,1,2,0,8,0,5,9,3,0,0,17,4,3,3,0 -14072,3,2,0,3,1,3,7,4,3,1,3,18,3,38,0 -14073,0,6,0,6,0,3,2,5,1,1,15,2,5,25,0 -14074,1,3,0,6,4,5,10,0,1,1,2,3,1,35,0 -14075,6,5,0,4,4,6,8,4,0,1,2,2,4,38,0 -14076,3,4,0,13,4,6,6,3,2,1,2,2,5,24,0 -14077,7,3,0,14,2,5,7,4,1,0,7,7,3,18,1 -14078,9,5,0,15,3,5,11,4,0,1,10,17,5,16,1 -14079,3,6,0,5,4,0,0,1,3,0,4,14,3,0,0 -14080,2,0,0,12,3,0,2,5,0,0,3,17,3,12,1 -14081,3,4,0,9,1,5,3,2,2,0,3,0,1,11,0 -14082,1,2,0,0,0,5,11,2,4,1,4,2,5,7,0 -14083,10,7,0,12,4,5,8,0,1,1,7,17,4,37,1 -14084,3,8,0,1,1,6,6,4,4,1,13,4,1,29,0 -14085,2,3,0,2,2,3,1,1,0,0,8,6,0,22,0 
-14086,1,2,0,8,3,6,12,2,1,0,13,2,4,33,0 -14087,1,4,0,5,3,0,13,4,0,1,15,11,0,29,0 -14088,7,8,0,5,5,0,8,2,1,1,17,2,4,22,0 -14089,6,7,0,13,0,3,0,2,2,0,13,2,0,16,0 -14090,7,4,0,10,1,0,4,1,0,0,18,16,3,10,0 -14091,7,1,0,1,6,3,3,5,1,1,11,0,4,34,1 -14092,8,0,0,8,1,2,6,1,4,0,7,13,0,38,0 -14093,2,1,0,0,3,6,3,4,2,1,13,14,0,26,0 -14094,3,1,0,12,6,5,0,0,4,1,5,14,3,38,1 -14095,4,3,0,2,2,2,10,1,4,0,5,15,5,32,1 -14096,1,8,0,13,3,5,8,4,2,0,11,2,4,25,0 -14097,0,2,0,0,1,4,9,3,2,1,0,19,1,17,0 -14098,10,5,0,10,1,2,6,5,2,1,9,7,5,41,1 -14099,0,4,0,5,0,6,3,2,2,0,0,15,3,31,0 -14100,9,3,0,5,6,5,10,2,0,1,14,1,1,4,0 -14101,2,6,0,3,3,5,4,5,1,0,11,18,4,25,0 -14102,1,5,0,12,1,3,6,4,0,1,13,11,5,3,0 -14103,0,6,0,12,4,6,14,0,3,0,12,13,2,9,0 -14104,1,3,0,8,0,0,4,1,1,1,4,5,4,17,0 -14105,8,6,0,3,0,5,4,1,4,1,2,2,1,1,0 -14106,0,5,0,6,0,0,5,0,2,1,8,11,4,8,0 -14107,6,8,0,0,6,6,3,5,1,1,9,12,2,39,1 -14108,2,0,0,14,0,6,14,5,3,0,10,17,0,17,0 -14109,2,5,0,14,0,6,9,0,3,1,4,20,2,0,0 -14110,7,6,0,11,2,0,0,3,4,0,0,0,3,2,0 -14111,5,1,0,11,1,5,1,2,3,0,4,11,4,33,0 -14112,7,4,0,0,6,3,14,5,1,1,13,1,5,23,1 -14113,6,1,0,11,3,3,2,0,1,1,16,12,0,37,1 -14114,9,3,0,5,6,3,8,2,3,1,13,12,0,4,0 -14115,4,6,0,5,6,0,13,1,3,1,18,10,1,41,1 -14116,0,5,0,7,0,0,6,0,0,1,6,5,5,10,0 -14117,6,2,0,15,2,6,6,0,1,1,9,12,4,11,1 -14118,5,8,0,4,6,3,9,1,4,0,17,9,1,20,0 -14119,3,1,0,8,3,1,2,2,2,0,8,9,0,22,0 -14120,0,1,0,0,0,0,0,4,2,1,4,9,2,27,0 -14121,4,3,0,0,6,5,9,1,0,0,17,9,5,30,0 -14122,10,0,0,13,6,3,8,1,4,1,4,19,2,30,0 -14123,9,2,0,12,0,4,4,4,4,0,4,8,4,40,0 -14124,8,6,0,5,5,3,13,0,2,1,4,11,3,9,0 -14125,8,6,0,1,1,4,2,4,3,0,2,20,0,9,0 -14126,7,0,0,4,0,5,14,5,0,0,5,2,3,35,0 -14127,10,5,0,11,0,5,8,2,3,1,0,6,0,38,0 -14128,1,6,0,9,1,2,8,3,0,0,17,11,0,9,0 -14129,8,6,0,0,4,0,10,5,0,1,3,7,3,14,1 -14130,1,6,0,7,1,0,8,2,0,0,2,6,1,21,0 -14131,6,6,0,3,6,1,4,0,1,0,12,1,5,16,1 -14132,1,3,0,6,1,6,1,0,4,1,12,0,0,7,0 -14133,0,8,0,9,3,0,10,3,4,1,2,15,1,12,0 -14134,7,6,0,13,3,5,2,0,0,1,13,2,2,40,0 -14135,8,5,0,2,2,2,11,4,2,1,1,14,5,21,1 
-14136,0,6,0,4,5,4,7,5,2,0,2,16,5,24,0 -14137,2,2,0,5,5,4,6,4,4,0,4,20,0,34,0 -14138,6,2,0,14,1,2,4,1,2,0,0,16,3,26,0 -14139,3,3,0,13,2,0,13,0,2,0,0,13,0,9,0 -14140,1,7,0,6,1,2,5,0,2,1,8,11,3,37,0 -14141,0,8,0,5,2,6,14,3,2,0,2,11,4,18,0 -14142,4,0,0,7,6,1,8,4,0,1,0,2,3,6,0 -14143,0,2,0,0,2,5,8,2,1,0,2,16,5,25,0 -14144,1,4,0,8,0,5,8,1,4,1,0,4,4,13,0 -14145,5,7,0,4,6,5,10,4,4,1,2,9,0,2,0 -14146,1,1,0,3,4,3,11,4,3,1,4,20,3,40,0 -14147,3,5,0,4,2,4,6,0,0,1,8,9,2,35,0 -14148,7,8,0,5,2,4,14,0,2,0,13,11,0,37,0 -14149,10,2,0,5,5,1,13,0,0,0,13,15,4,22,0 -14150,9,4,0,3,6,0,2,1,3,1,4,20,0,29,0 -14151,2,0,0,14,4,6,0,2,4,1,2,13,2,26,0 -14152,2,2,0,10,0,4,3,1,3,0,13,11,4,38,0 -14153,8,1,0,5,1,6,10,0,1,1,1,7,3,41,1 -14154,4,1,0,15,4,1,3,1,2,1,16,10,3,21,1 -14155,2,7,0,11,3,1,8,2,0,1,2,11,0,29,0 -14156,0,6,0,6,3,4,6,0,1,1,13,4,4,38,0 -14157,2,7,0,12,0,5,4,4,3,1,13,0,0,24,0 -14158,1,1,0,11,0,0,5,3,4,0,2,3,5,38,0 -14159,2,7,0,2,6,4,12,3,3,0,2,19,0,22,0 -14160,3,8,0,1,5,5,12,0,2,1,0,18,3,11,0 -14161,5,4,0,1,5,5,3,4,3,0,17,18,4,23,0 -14162,4,3,0,14,2,1,3,0,3,1,15,1,4,6,1 -14163,1,4,0,13,6,6,0,2,0,1,0,2,2,13,0 -14164,7,3,0,9,6,4,3,1,0,1,12,10,1,30,1 -14165,7,3,0,14,0,5,8,1,0,0,2,17,0,30,0 -14166,4,0,0,14,0,6,8,3,4,1,8,5,1,23,0 -14167,5,2,0,10,4,5,14,5,4,0,10,5,4,8,1 -14168,6,5,0,6,1,6,0,1,1,1,13,1,4,0,0 -14169,8,5,0,4,4,0,1,4,3,1,2,9,2,0,0 -14170,1,4,0,12,5,4,11,1,0,1,11,2,5,29,0 -14171,3,3,0,3,4,0,14,0,4,0,0,3,1,18,0 -14172,5,4,0,9,1,6,9,5,0,1,16,8,5,5,1 -14173,4,0,0,8,4,3,12,1,1,1,0,18,0,22,0 -14174,0,3,0,6,0,3,8,4,3,0,17,9,2,39,0 -14175,10,3,0,7,2,6,7,0,1,1,1,17,5,39,1 -14176,0,8,0,8,0,3,2,2,4,1,16,16,2,5,0 -14177,0,0,0,14,1,0,1,4,0,0,6,17,0,36,0 -14178,0,3,0,1,5,6,11,4,0,1,4,4,3,34,0 -14179,7,1,0,2,2,3,7,2,0,1,16,0,4,40,0 -14180,2,7,0,6,4,4,5,1,3,0,2,0,0,13,0 -14181,1,4,0,3,6,3,9,3,1,1,8,2,2,1,0 -14182,0,2,0,7,5,0,12,1,2,1,2,12,3,0,0 -14183,4,8,0,5,5,4,6,2,3,1,15,18,0,12,0 -14184,10,0,0,5,3,0,9,1,0,0,13,15,2,28,0 -14185,2,4,0,9,6,5,9,3,3,1,13,11,1,29,0 
-14186,10,5,0,9,6,6,12,2,3,1,18,10,5,3,1 -14187,9,7,0,6,6,1,14,1,3,0,3,16,3,2,1 -14188,2,3,0,3,3,0,9,3,4,1,0,16,0,8,0 -14189,2,2,0,14,3,4,8,1,4,1,17,11,4,33,0 -14190,6,8,0,10,4,2,11,0,3,0,5,19,0,37,1 -14191,5,0,0,15,6,2,2,5,3,0,18,17,3,6,1 -14192,4,6,0,2,0,1,8,2,0,1,6,2,2,0,0 -14193,1,6,0,5,4,4,4,1,3,1,14,20,4,28,1 -14194,7,0,0,9,4,0,5,0,1,0,2,15,0,18,0 -14195,2,3,0,8,2,3,6,0,0,1,13,6,3,37,0 -14196,0,7,0,3,4,5,9,2,2,1,4,0,5,7,0 -14197,3,7,0,1,1,5,12,1,3,1,10,2,0,35,0 -14198,6,2,0,12,3,3,9,4,1,1,10,6,2,13,0 -14199,10,0,0,14,4,4,11,4,1,1,1,20,5,39,1 -14200,9,6,0,12,6,6,10,4,3,1,4,2,1,40,0 -14201,1,5,0,5,3,3,14,1,3,0,4,5,1,29,0 -14202,4,6,0,13,0,4,11,3,0,1,11,2,3,23,0 -14203,2,0,0,5,0,4,9,1,4,0,6,2,0,19,0 -14204,1,7,0,6,0,0,5,1,2,0,8,13,5,39,0 -14205,0,3,0,3,0,6,14,3,0,0,14,11,0,37,0 -14206,3,2,0,5,0,3,0,0,1,0,0,2,0,10,0 -14207,6,0,0,3,6,4,5,4,0,0,13,4,4,3,0 -14208,2,6,0,3,1,2,6,4,1,0,2,2,3,5,0 -14209,0,2,0,15,4,5,0,3,4,0,8,2,1,2,0 -14210,6,1,0,3,6,5,10,0,2,1,2,15,4,0,0 -14211,6,8,0,2,0,1,4,0,4,0,13,9,1,13,0 -14212,10,3,0,2,0,4,13,2,2,1,0,16,5,26,0 -14213,1,5,0,5,3,5,14,2,2,1,8,9,1,13,0 -14214,1,8,0,15,2,4,3,1,3,1,12,1,4,22,0 -14215,0,5,0,5,0,0,5,1,3,0,10,15,0,6,0 -14216,10,1,0,6,0,3,12,2,2,1,7,1,3,23,0 -14217,6,5,0,11,3,4,13,0,3,1,8,8,2,33,0 -14218,5,6,0,9,1,0,13,0,0,1,3,14,5,22,1 -14219,3,1,0,10,2,1,11,5,1,1,7,7,3,20,1 -14220,8,8,0,12,0,2,6,5,0,1,9,0,4,36,1 -14221,9,4,0,3,5,4,11,3,1,0,5,2,5,40,0 -14222,8,0,0,1,1,1,6,4,3,1,2,4,0,26,0 -14223,1,5,0,12,1,1,12,1,1,1,10,3,5,14,1 -14224,9,6,0,10,2,3,14,2,4,0,6,8,5,30,0 -14225,0,8,0,13,6,6,1,5,0,1,13,18,2,29,0 -14226,5,0,0,5,2,5,4,5,4,0,2,7,1,4,0 -14227,8,1,0,13,1,2,10,4,4,0,13,17,5,7,0 -14228,7,7,0,5,1,3,11,4,3,1,6,18,5,12,0 -14229,2,2,0,7,6,6,3,3,2,0,0,6,4,34,0 -14230,8,6,0,5,5,6,6,2,3,0,6,0,2,3,0 -14231,0,6,0,3,2,5,6,0,4,0,4,15,3,7,0 -14232,8,4,0,11,6,5,9,2,2,1,18,18,4,11,1 -14233,2,3,0,0,6,5,12,1,2,0,17,2,0,7,0 -14234,1,8,0,3,0,5,2,2,0,1,2,11,5,22,0 -14235,2,8,0,2,0,2,2,4,0,0,11,11,3,13,0 
-14236,6,8,0,1,2,6,10,2,0,1,8,0,1,10,0 -14237,6,7,0,14,3,0,14,2,4,1,8,8,0,27,0 -14238,3,5,0,6,2,4,9,4,4,0,13,11,0,13,0 -14239,9,0,0,6,5,4,5,4,0,0,11,6,3,39,0 -14240,0,6,0,13,2,0,2,4,3,1,15,15,0,23,0 -14241,7,6,0,3,6,5,5,4,0,1,8,15,5,31,0 -14242,2,4,0,2,0,6,13,0,3,0,18,19,3,5,1 -14243,4,7,0,11,4,5,12,0,3,0,2,2,4,18,0 -14244,2,5,0,9,4,4,8,4,4,1,13,2,0,12,0 -14245,1,4,0,13,2,6,5,1,0,0,13,6,0,27,0 -14246,7,6,0,3,1,1,4,5,4,0,5,5,4,5,1 -14247,0,6,0,1,1,5,0,3,2,1,8,2,0,4,0 -14248,6,7,0,9,5,2,11,4,1,0,0,11,5,9,0 -14249,3,7,0,1,3,4,14,2,4,1,14,13,2,5,0 -14250,8,0,0,3,2,1,2,1,4,0,1,1,4,12,1 -14251,9,3,0,6,5,0,12,4,3,0,0,10,0,0,0 -14252,3,0,0,11,0,4,12,0,2,1,0,11,1,7,0 -14253,5,7,0,7,0,6,6,5,4,0,0,0,2,24,0 -14254,1,1,0,8,2,0,7,1,0,1,13,11,1,34,0 -14255,1,6,0,12,0,6,8,3,0,1,10,20,2,39,0 -14256,6,8,0,12,6,5,0,1,1,1,13,1,3,3,0 -14257,4,3,0,15,6,1,3,2,0,1,15,17,4,37,0 -14258,9,4,0,12,3,1,4,0,0,1,18,4,2,32,1 -14259,1,5,0,7,6,5,8,4,1,1,0,0,5,7,0 -14260,4,2,0,12,6,6,7,3,3,0,0,11,1,28,0 -14261,0,0,0,1,2,0,4,2,2,1,2,16,0,12,0 -14262,2,3,0,5,4,3,8,0,3,0,4,18,5,29,0 -14263,2,3,0,5,4,0,13,5,3,0,17,2,0,22,0 -14264,9,5,0,2,3,6,8,1,0,0,13,3,0,8,0 -14265,6,5,0,2,1,3,5,0,4,0,8,8,3,20,0 -14266,4,6,0,6,1,0,3,1,0,0,2,11,2,14,0 -14267,1,5,0,12,0,4,7,4,4,0,10,11,4,8,0 -14268,10,0,0,4,5,1,4,0,1,1,1,5,5,5,1 -14269,6,6,0,8,5,3,11,4,4,0,13,17,1,12,0 -14270,4,0,0,6,4,2,9,5,0,0,9,7,1,23,1 -14271,2,2,0,2,6,0,13,0,2,1,6,18,1,27,0 -14272,0,6,0,13,0,5,14,3,1,1,15,11,3,19,0 -14273,0,3,0,13,5,5,10,3,3,1,8,18,3,13,0 -14274,9,0,0,10,5,1,0,0,3,0,18,14,0,21,1 -14275,6,6,0,0,6,2,1,1,2,1,3,17,1,20,1 -14276,0,8,0,8,2,0,9,4,1,1,16,14,2,20,0 -14277,8,7,0,12,3,4,3,4,1,0,8,7,2,37,0 -14278,7,3,0,15,0,6,9,1,4,0,2,17,0,37,0 -14279,2,3,0,0,1,0,1,3,3,0,6,13,3,37,0 -14280,10,1,0,9,3,2,10,2,0,1,9,10,1,24,1 -14281,10,1,0,4,0,3,7,4,3,0,13,18,4,8,0 -14282,9,6,0,4,5,2,12,4,3,0,4,18,5,17,0 -14283,4,8,0,9,1,5,7,2,3,1,2,6,3,11,0 -14284,5,6,0,13,6,4,9,1,0,1,8,18,1,38,0 -14285,4,8,0,11,2,0,8,4,4,0,17,2,0,36,0 
-14286,1,1,0,3,2,5,13,2,0,0,6,20,3,8,0 -14287,3,6,0,4,1,2,6,4,1,0,8,2,3,6,0 -14288,1,0,0,8,2,3,8,1,2,0,12,10,2,40,0 -14289,0,4,0,2,4,3,14,5,0,1,16,17,2,17,1 -14290,10,6,0,4,2,3,6,0,0,0,17,20,4,31,0 -14291,8,1,0,12,5,5,10,5,0,0,12,1,5,12,1 -14292,3,1,0,7,3,2,5,4,0,1,5,13,3,31,1 -14293,0,6,0,11,4,0,8,1,3,0,14,11,4,36,0 -14294,10,3,0,6,0,0,14,3,2,1,15,2,0,3,0 -14295,5,4,0,0,3,6,10,2,0,0,5,11,4,23,0 -14296,2,7,0,0,2,0,7,0,0,0,12,11,5,21,0 -14297,6,5,0,5,5,2,13,5,2,1,18,7,5,7,1 -14298,4,8,0,1,0,4,3,2,3,0,2,3,1,12,0 -14299,9,0,0,5,0,0,1,3,0,1,13,13,4,11,0 -14300,2,0,0,2,1,3,2,1,4,1,0,5,4,37,1 -14301,0,1,0,6,5,0,13,3,2,1,13,9,4,37,0 -14302,3,2,0,1,0,2,14,3,0,1,0,4,4,29,0 -14303,1,3,0,14,0,0,0,3,0,0,10,4,2,26,0 -14304,10,8,0,7,2,4,14,4,1,1,14,10,4,18,1 -14305,2,1,0,0,6,1,12,4,0,0,1,14,5,25,0 -14306,6,7,0,11,6,4,0,2,3,1,15,18,2,35,0 -14307,0,3,0,11,3,2,13,3,1,0,10,2,3,8,0 -14308,3,0,0,0,2,0,0,5,2,1,13,0,1,19,0 -14309,2,3,0,13,4,5,14,1,2,1,13,6,4,8,0 -14310,2,6,0,3,4,5,14,1,3,1,13,2,4,35,0 -14311,3,5,0,11,5,6,11,5,0,0,1,14,2,1,1 -14312,3,2,0,0,3,4,0,0,3,0,1,12,3,31,1 -14313,2,7,0,3,3,2,3,5,3,1,16,15,3,29,1 -14314,10,7,0,5,6,0,10,2,1,0,13,15,0,30,0 -14315,8,1,0,0,4,0,0,2,2,0,8,16,1,35,0 -14316,10,1,0,11,6,6,2,1,0,0,13,0,3,25,0 -14317,3,1,0,4,0,5,6,0,0,1,3,11,1,8,0 -14318,0,8,0,9,0,4,8,4,4,1,2,2,3,17,0 -14319,10,6,0,1,2,5,3,4,3,1,16,15,1,29,0 -14320,0,1,0,4,3,5,8,2,1,1,15,13,5,17,0 -14321,8,6,0,14,4,3,7,4,0,0,13,9,1,0,0 -14322,7,6,0,14,4,3,4,3,3,0,13,15,4,38,1 -14323,3,2,0,14,6,6,14,2,1,0,2,17,0,18,0 -14324,0,1,0,14,2,5,2,4,2,0,8,18,3,30,0 -14325,1,5,0,8,4,6,6,4,0,0,9,11,1,36,0 -14326,9,8,0,5,0,4,1,3,4,0,15,15,2,12,0 -14327,1,0,0,3,4,1,9,4,1,1,5,17,3,29,1 -14328,1,8,0,13,0,2,5,4,3,0,6,13,0,5,0 -14329,10,5,0,3,1,3,8,4,0,0,17,18,0,9,0 -14330,7,2,0,5,2,3,10,0,4,1,18,1,3,16,1 -14331,7,3,0,13,5,5,8,0,2,1,17,11,0,24,0 -14332,5,8,0,2,1,1,6,5,0,0,2,2,0,10,0 -14333,1,6,0,15,0,0,3,3,0,0,2,0,3,32,0 -14334,9,3,0,0,3,0,7,4,3,1,2,18,2,21,0 -14335,0,5,0,15,4,3,0,4,2,1,1,5,2,11,1 
-14336,4,4,0,12,0,4,6,4,1,0,17,11,5,18,0 -14337,1,3,0,11,0,1,5,4,0,1,17,6,0,4,0 -14338,3,2,0,8,1,5,0,3,1,1,4,4,0,33,0 -14339,9,5,0,9,3,4,1,1,0,0,1,3,3,17,0 -14340,8,6,0,5,3,3,3,1,1,1,0,4,3,17,0 -14341,3,7,0,10,2,5,0,0,1,0,13,6,5,38,0 -14342,9,2,0,11,2,5,0,1,2,1,2,15,4,34,0 -14343,10,7,0,6,3,0,6,4,1,0,9,1,3,5,1 -14344,4,8,0,13,5,4,14,0,2,0,15,11,5,22,0 -14345,3,8,0,0,3,4,1,2,4,0,2,11,2,10,0 -14346,9,2,0,11,1,2,2,4,3,1,5,8,0,22,1 -14347,7,6,0,15,3,5,8,3,0,1,13,2,5,33,0 -14348,0,0,0,2,6,1,13,3,1,0,6,4,1,6,0 -14349,3,6,0,8,0,5,4,2,0,1,11,16,0,17,0 -14350,1,8,0,11,5,1,5,3,0,0,7,13,0,31,0 -14351,5,8,0,0,6,0,5,0,0,0,10,3,5,19,0 -14352,9,3,0,5,0,0,13,3,4,0,10,14,5,3,0 -14353,9,0,0,8,2,5,1,0,0,1,9,8,1,22,1 -14354,8,3,0,13,3,4,7,5,0,0,10,18,1,10,0 -14355,7,1,0,1,1,3,2,5,1,0,9,7,2,26,1 -14356,0,3,0,1,2,3,11,5,4,0,8,13,4,38,0 -14357,1,0,0,10,0,4,13,5,0,0,17,2,1,34,0 -14358,5,4,0,6,0,4,14,3,2,0,13,5,3,28,0 -14359,3,5,0,13,6,4,5,3,3,0,2,18,1,8,0 -14360,3,0,0,7,6,5,11,0,0,1,8,11,2,12,0 -14361,1,7,0,3,5,4,7,3,3,1,13,11,5,5,0 -14362,6,5,0,4,0,2,12,4,0,0,10,2,3,28,0 -14363,5,8,0,13,0,4,2,3,1,1,3,9,3,1,0 -14364,10,3,0,13,3,0,12,5,0,1,5,5,5,40,0 -14365,7,3,0,3,6,0,13,2,4,1,12,11,0,6,0 -14366,6,4,0,1,1,1,4,4,4,1,2,2,0,1,0 -14367,0,7,0,6,5,4,4,5,4,1,1,1,4,8,1 -14368,6,4,0,1,5,3,14,0,4,1,17,9,5,14,0 -14369,0,5,0,11,6,0,12,2,3,0,4,13,1,19,0 -14370,2,7,0,14,5,5,9,1,3,0,16,11,1,30,0 -14371,10,3,0,10,2,0,12,2,2,0,8,14,4,37,0 -14372,4,7,0,15,0,6,12,3,2,0,17,11,3,21,0 -14373,4,4,0,6,4,3,0,5,3,0,3,8,5,0,1 -14374,2,6,0,13,3,5,12,2,3,1,6,6,0,2,0 -14375,8,5,0,11,4,1,14,5,4,1,7,8,0,25,1 -14376,6,0,0,9,0,6,5,2,1,1,0,20,1,13,0 -14377,2,8,0,13,0,1,0,4,3,1,8,2,4,11,0 -14378,6,6,0,0,0,0,3,4,0,1,13,11,0,12,0 -14379,4,6,0,5,0,1,13,3,2,0,12,3,0,9,0 -14380,8,3,0,14,4,1,11,1,4,1,1,10,2,19,1 -14381,6,3,0,4,2,3,13,5,4,0,18,7,5,4,1 -14382,6,1,0,2,2,2,1,5,1,0,16,19,4,18,1 -14383,0,2,0,3,3,3,8,4,3,1,16,16,2,34,0 -14384,3,1,0,15,0,5,4,5,1,1,11,15,1,22,1 -14385,1,8,0,7,2,2,1,1,2,0,18,13,3,13,1 
-14386,0,3,0,4,2,1,5,5,3,0,10,6,1,22,0 -14387,0,0,0,5,2,4,4,5,3,1,15,1,0,23,0 -14388,3,3,0,6,0,0,10,1,0,0,6,8,3,8,0 -14389,9,7,0,3,4,1,11,0,4,0,8,6,1,14,1 -14390,5,1,0,8,2,4,8,4,0,1,14,18,4,6,0 -14391,3,1,0,2,0,1,1,1,1,1,7,7,2,26,1 -14392,7,8,0,9,4,1,11,5,4,0,12,7,3,4,1 -14393,1,0,0,13,0,4,8,1,1,0,6,8,1,17,0 -14394,7,8,0,12,1,3,13,0,4,1,5,10,3,13,1 -14395,7,4,0,5,5,6,4,0,2,1,6,3,0,4,0 -14396,1,4,0,14,1,5,13,3,3,1,2,15,4,24,0 -14397,0,7,0,4,2,4,6,3,0,0,0,9,4,31,0 -14398,5,0,0,12,0,2,10,3,3,1,8,2,3,26,0 -14399,10,4,0,1,6,4,6,1,4,1,8,16,0,36,0 -14400,4,6,0,4,6,2,0,5,3,1,12,1,2,8,1 -14401,2,6,0,8,5,3,11,0,2,1,13,0,4,34,0 -14402,5,2,0,7,0,4,4,3,0,0,17,11,0,36,0 -14403,9,3,0,6,2,0,2,2,0,0,13,2,1,8,0 -14404,4,7,0,5,0,1,5,1,0,1,8,4,0,18,0 -14405,4,3,0,0,5,3,0,4,0,1,10,2,1,8,0 -14406,9,3,0,5,2,6,14,1,2,0,8,15,4,5,0 -14407,1,0,0,1,0,0,4,3,0,0,9,1,1,11,1 -14408,6,6,0,2,0,0,14,1,4,0,16,10,5,24,1 -14409,8,3,0,7,2,2,11,5,2,0,11,4,2,17,1 -14410,7,4,0,4,0,5,6,4,3,1,2,19,1,14,0 -14411,8,6,0,13,1,3,13,2,3,1,7,5,0,20,1 -14412,2,0,0,3,6,4,0,3,2,1,2,2,0,10,0 -14413,9,3,0,4,3,3,4,4,3,0,0,16,4,12,0 -14414,7,4,0,12,4,2,2,5,0,0,5,7,2,4,1 -14415,3,4,0,4,1,0,9,4,2,1,13,4,0,29,0 -14416,3,4,0,14,3,2,6,1,4,0,11,12,4,18,1 -14417,8,0,0,13,4,4,7,5,3,1,18,17,0,0,1 -14418,4,3,0,5,0,5,11,1,1,0,0,11,4,22,0 -14419,1,1,0,7,4,2,9,2,4,0,1,8,2,27,1 -14420,7,2,0,1,6,2,3,5,2,1,11,17,5,16,1 -14421,3,1,0,9,3,2,10,4,2,1,14,6,3,11,1 -14422,2,1,0,9,5,1,0,4,1,1,12,2,1,14,0 -14423,3,3,0,10,1,4,13,1,3,1,2,15,1,8,0 -14424,1,8,0,15,4,6,6,0,0,0,2,2,3,34,0 -14425,8,3,0,10,6,4,7,0,0,1,14,16,3,13,1 -14426,3,2,0,11,5,3,5,3,4,1,2,18,5,27,0 -14427,5,7,0,7,5,3,7,4,0,1,8,11,0,10,0 -14428,5,5,0,1,0,4,9,0,0,0,2,11,3,1,0 -14429,6,3,0,8,2,4,14,1,3,0,15,6,5,37,0 -14430,2,8,0,14,0,2,3,2,0,1,18,0,3,25,1 -14431,7,1,0,8,6,3,2,5,3,1,11,5,4,25,1 -14432,1,3,0,1,3,1,14,2,2,1,12,4,2,35,0 -14433,2,0,0,1,6,4,8,1,0,0,2,18,5,19,0 -14434,3,1,0,6,1,5,6,2,3,0,15,16,1,9,0 -14435,10,0,0,13,4,3,9,3,3,1,13,1,3,41,0 
-14436,3,4,0,10,5,6,1,2,2,1,5,14,5,2,0 -14437,10,0,0,13,6,2,14,4,2,1,6,20,4,20,1 -14438,2,0,0,4,2,1,6,0,2,0,13,14,0,8,0 -14439,10,6,0,6,6,2,12,5,3,0,5,12,4,19,1 -14440,1,4,0,14,2,3,8,4,3,1,12,14,1,22,0 -14441,10,4,0,11,1,4,8,4,0,0,7,3,3,32,1 -14442,1,0,0,1,4,2,10,4,2,1,8,19,4,41,1 -14443,1,8,0,10,0,0,8,3,0,0,13,2,5,8,0 -14444,2,2,0,2,3,4,4,4,4,0,6,16,4,9,1 -14445,5,1,0,14,5,0,6,1,2,0,18,15,2,2,1 -14446,1,1,0,6,6,1,2,4,0,1,15,2,1,20,0 -14447,10,5,0,13,0,1,11,1,1,0,13,19,3,28,0 -14448,0,3,0,15,5,2,3,0,1,1,8,15,5,7,0 -14449,1,6,0,13,0,4,11,2,3,0,13,2,1,1,0 -14450,8,7,0,6,3,0,7,3,0,1,4,20,1,24,0 -14451,0,1,0,4,6,3,10,4,4,1,17,2,2,27,0 -14452,4,0,0,8,5,5,2,0,4,1,2,11,2,24,0 -14453,9,6,0,0,0,0,2,1,1,0,17,20,4,6,0 -14454,7,2,0,10,6,4,2,5,0,1,14,7,5,37,1 -14455,10,3,0,2,2,0,8,3,4,0,4,6,2,21,0 -14456,1,7,0,14,0,1,8,4,0,1,13,9,2,40,0 -14457,0,7,0,14,1,3,13,1,1,1,4,18,0,18,0 -14458,5,0,0,14,6,0,5,3,0,0,17,20,2,27,0 -14459,1,7,0,13,1,5,13,0,0,1,15,20,0,39,0 -14460,0,6,0,0,5,1,12,0,1,0,6,20,5,16,0 -14461,3,6,0,13,0,3,9,4,0,0,6,2,1,34,0 -14462,8,0,0,4,5,3,9,2,4,1,6,11,1,4,0 -14463,1,2,0,1,6,6,5,1,2,0,2,15,5,30,0 -14464,9,2,0,2,6,5,1,0,0,1,13,6,0,38,0 -14465,6,5,0,12,3,0,13,0,2,1,9,10,2,8,1 -14466,0,7,0,5,6,4,10,1,1,1,13,15,3,8,0 -14467,5,8,0,10,4,5,4,5,3,0,0,7,4,21,1 -14468,8,8,0,11,1,5,8,2,0,0,14,18,2,7,0 -14469,8,2,0,0,3,1,3,2,4,1,18,1,5,20,1 -14470,0,0,0,4,3,5,8,4,0,0,8,13,3,29,0 -14471,2,7,0,6,5,5,3,3,1,1,3,14,0,27,0 -14472,3,7,0,15,2,3,13,2,4,0,12,3,0,35,0 -14473,5,3,0,9,1,3,7,3,3,1,4,15,1,34,0 -14474,2,0,0,3,0,1,5,1,3,0,2,15,0,36,0 -14475,5,1,0,11,5,4,2,0,0,1,10,5,4,18,1 -14476,3,2,0,7,4,4,5,5,4,0,6,18,1,12,0 -14477,7,7,0,1,6,4,8,0,3,1,0,19,2,26,0 -14478,10,7,0,5,6,0,0,3,4,1,9,15,3,0,0 -14479,9,4,0,5,1,2,0,5,0,1,3,14,4,23,1 -14480,8,7,0,10,2,1,5,0,0,1,11,4,4,39,1 -14481,2,7,0,0,6,4,8,1,1,1,5,10,3,10,0 -14482,0,6,0,0,0,3,11,4,3,0,2,1,0,16,0 -14483,0,6,0,9,1,5,5,4,2,0,13,10,1,17,0 -14484,6,1,0,8,6,3,12,0,0,1,12,17,3,11,1 -14485,6,8,0,10,3,6,0,0,4,0,16,12,1,11,1 
-14486,0,4,0,2,6,4,14,4,0,1,10,3,0,5,0 -14487,2,2,0,14,4,4,9,3,1,0,2,2,0,13,0 -14488,8,1,0,9,4,3,12,5,1,0,11,10,2,34,1 -14489,3,5,0,8,6,4,14,2,0,1,4,2,5,31,0 -14490,8,6,0,12,2,3,1,3,4,0,10,13,3,13,0 -14491,2,7,0,6,0,5,14,1,0,1,2,11,2,21,0 -14492,2,1,0,3,3,3,14,3,1,0,8,11,5,22,0 -14493,4,7,0,11,6,0,3,5,0,1,17,0,1,20,1 -14494,6,3,0,12,4,5,9,1,2,0,0,17,3,17,1 -14495,1,6,0,13,0,6,5,2,1,0,0,13,5,26,0 -14496,8,5,0,14,5,4,13,0,4,1,18,5,1,13,1 -14497,6,8,0,3,4,0,14,4,1,0,15,13,0,26,0 -14498,2,7,0,13,4,0,8,0,2,1,8,15,0,0,0 -14499,9,2,0,13,4,6,5,1,4,1,9,8,4,3,1 -14500,0,4,0,5,0,2,14,4,4,0,8,18,1,5,0 -14501,4,0,0,9,1,6,11,5,1,0,9,17,3,21,1 -14502,4,6,0,8,1,4,8,1,1,1,8,19,3,19,0 -14503,7,6,0,6,6,1,0,2,3,0,4,15,4,29,0 -14504,9,8,0,12,3,6,9,1,1,1,0,6,5,11,0 -14505,5,5,0,12,2,3,5,2,4,0,0,4,0,0,0 -14506,1,5,0,5,1,3,10,1,3,0,8,18,2,28,0 -14507,0,1,0,7,0,4,5,3,4,1,17,6,2,37,0 -14508,2,2,0,14,0,6,6,1,4,1,8,6,4,3,0 -14509,2,5,0,14,2,3,3,4,2,1,2,9,1,20,0 -14510,0,3,0,2,2,3,5,3,4,0,17,2,5,11,0 -14511,7,7,0,9,6,2,11,5,3,0,18,7,4,28,1 -14512,1,5,0,13,0,5,8,4,4,1,13,18,1,36,0 -14513,3,6,0,10,1,2,5,5,0,0,5,11,0,18,0 -14514,0,1,0,5,6,3,7,4,3,1,9,6,1,14,0 -14515,6,8,0,6,0,4,11,2,4,0,13,11,0,16,0 -14516,1,6,0,4,6,6,8,5,4,1,0,20,2,33,0 -14517,4,8,0,14,6,2,9,0,2,0,13,2,5,9,0 -14518,5,3,0,1,6,0,9,3,0,0,2,0,4,29,0 -14519,4,7,0,11,1,4,1,4,2,1,6,2,4,27,0 -14520,3,6,0,5,5,4,6,0,4,1,0,2,4,3,0 -14521,7,6,0,1,5,6,11,5,1,0,5,1,4,31,1 -14522,1,4,0,2,1,0,7,1,0,1,12,11,5,27,0 -14523,5,7,0,1,1,6,0,3,0,0,5,4,3,38,0 -14524,3,8,0,0,5,0,12,4,3,0,2,9,2,12,0 -14525,10,5,0,11,4,1,4,3,1,1,9,1,2,35,1 -14526,2,3,0,1,2,0,12,2,1,0,2,0,4,28,0 -14527,5,7,0,5,3,1,14,3,2,1,7,10,0,30,1 -14528,2,7,0,5,3,4,9,2,1,1,8,15,2,31,0 -14529,6,4,0,2,2,5,9,1,2,1,2,11,0,37,0 -14530,1,7,0,9,4,2,10,1,4,1,18,5,3,21,1 -14531,5,3,0,7,1,0,9,0,3,0,0,2,0,1,0 -14532,2,6,0,13,5,5,2,3,4,1,1,11,0,10,0 -14533,3,1,0,10,3,1,6,3,4,0,17,7,0,20,1 -14534,4,4,0,10,6,2,14,0,2,1,1,1,3,38,1 -14535,3,3,0,7,3,3,0,0,0,0,12,16,3,14,0 
-14536,3,2,0,15,5,4,5,2,0,1,17,5,4,14,0 -14537,5,5,0,2,0,0,13,5,1,1,17,19,3,4,0 -14538,5,2,0,7,5,3,3,4,1,0,11,16,4,33,1 -14539,7,5,0,2,0,3,8,2,4,1,11,5,3,14,0 -14540,3,7,0,0,1,3,2,1,4,0,5,18,4,35,0 -14541,2,8,0,5,3,4,4,1,3,1,8,19,0,8,0 -14542,0,7,0,10,0,5,1,3,3,0,13,20,3,31,0 -14543,9,6,0,6,2,5,11,2,4,1,13,0,0,25,0 -14544,6,2,0,3,1,4,14,3,1,0,4,6,2,30,0 -14545,1,1,0,3,3,6,2,1,3,0,12,0,1,12,0 -14546,2,6,0,1,2,6,3,3,4,1,13,0,0,39,0 -14547,9,5,0,1,1,2,11,5,0,0,18,9,3,22,1 -14548,5,6,0,6,0,2,9,2,0,0,12,11,4,6,0 -14549,3,3,0,15,4,1,14,1,2,0,8,13,3,14,0 -14550,10,0,0,13,0,2,9,4,1,1,13,2,3,38,0 -14551,1,5,0,15,3,6,1,0,1,0,13,14,5,20,0 -14552,5,6,0,11,2,0,13,4,3,0,12,16,2,24,0 -14553,9,3,0,4,1,4,11,4,3,1,2,11,0,26,0 -14554,3,3,0,11,5,0,2,2,4,1,8,13,1,11,0 -14555,10,0,0,12,5,2,8,4,1,1,10,6,2,1,0 -14556,2,8,0,10,0,0,3,2,0,0,0,18,0,12,0 -14557,1,6,0,11,1,6,3,2,0,0,6,13,2,26,0 -14558,0,2,0,11,0,5,8,4,4,0,7,2,2,28,0 -14559,0,2,0,10,5,0,14,2,0,0,2,6,0,9,0 -14560,1,1,0,12,5,5,8,5,2,1,7,7,2,1,1 -14561,6,5,0,14,6,2,6,4,1,1,18,12,1,13,1 -14562,0,1,0,15,0,1,14,3,2,0,8,5,0,21,0 -14563,2,0,0,1,0,0,5,0,0,0,17,15,2,32,0 -14564,4,5,0,4,0,6,12,5,2,1,1,12,2,22,1 -14565,2,2,0,14,0,5,14,1,2,0,15,16,5,0,0 -14566,5,6,0,5,4,1,8,4,1,0,10,20,2,17,0 -14567,3,3,0,2,0,2,6,2,4,1,13,2,2,16,0 -14568,10,8,0,3,5,1,8,2,4,0,0,11,5,12,0 -14569,5,5,0,15,5,1,3,0,1,1,12,6,4,30,0 -14570,9,2,0,5,5,6,14,2,1,1,6,11,2,38,0 -14571,1,1,0,3,0,3,0,1,3,0,6,9,4,4,0 -14572,5,1,0,3,6,6,8,5,0,1,9,5,2,27,1 -14573,3,0,0,7,1,5,3,4,1,1,7,1,4,21,1 -14574,3,8,0,14,1,5,8,4,1,1,17,6,5,9,0 -14575,5,2,0,15,1,0,9,0,2,1,2,6,1,14,0 -14576,3,4,0,6,5,6,1,1,3,0,0,11,2,6,0 -14577,8,8,0,6,3,3,9,3,1,0,8,2,5,29,0 -14578,2,6,0,1,5,2,9,2,1,1,2,11,1,23,0 -14579,5,8,0,13,5,2,0,2,0,0,8,17,3,18,0 -14580,9,1,0,8,1,3,3,1,0,1,6,14,4,33,1 -14581,4,3,0,9,1,3,5,0,2,1,4,11,0,9,0 -14582,5,8,0,14,0,6,14,3,3,1,6,0,1,33,0 -14583,3,2,0,11,5,6,2,5,1,1,6,2,2,22,0 -14584,8,7,0,12,5,5,13,3,1,1,8,11,0,39,0 -14585,5,8,0,5,0,0,9,1,2,1,17,17,2,40,0 
-14586,7,6,0,7,6,1,2,0,4,1,5,14,1,28,1 -14587,10,5,0,2,2,2,1,4,1,1,7,10,1,22,1 -14588,1,8,0,8,2,4,4,3,3,0,11,16,1,20,0 -14589,2,3,0,9,0,2,12,4,4,1,5,10,5,36,1 -14590,2,7,0,11,1,5,11,2,2,0,13,18,1,10,0 -14591,1,1,0,12,2,0,11,4,4,0,6,2,0,37,0 -14592,0,7,0,6,5,3,5,2,3,0,4,9,2,30,0 -14593,3,1,0,1,0,5,13,1,0,0,8,12,4,40,0 -14594,5,6,0,11,0,4,12,1,2,1,2,2,2,4,0 -14595,6,7,0,8,6,3,13,0,2,0,5,1,4,39,1 -14596,6,1,0,1,4,2,13,5,1,0,9,7,4,8,1 -14597,4,0,0,4,0,6,3,0,1,0,17,13,3,13,0 -14598,3,3,0,2,0,3,2,4,1,0,6,11,2,33,0 -14599,9,8,0,4,3,1,12,2,3,1,15,16,3,40,1 -14600,3,6,0,6,6,2,11,3,4,1,1,11,4,38,1 -14601,10,8,0,6,6,6,7,2,3,0,0,16,3,4,0 -14602,2,7,0,0,6,5,7,4,0,1,13,2,2,13,0 -14603,1,6,0,2,1,3,10,5,0,1,8,11,1,26,0 -14604,9,2,0,10,4,0,7,0,1,0,4,15,0,33,0 -14605,3,6,0,3,0,3,2,0,3,0,3,2,5,29,0 -14606,2,7,0,13,3,4,4,2,0,1,0,11,0,7,0 -14607,6,4,0,1,6,2,11,5,0,1,14,19,3,23,1 -14608,2,6,0,3,5,5,7,0,1,1,4,2,4,25,0 -14609,9,2,0,11,6,0,6,3,0,1,4,9,0,8,0 -14610,7,7,0,9,4,2,4,5,2,0,11,19,1,20,1 -14611,4,1,0,3,0,1,12,2,2,1,18,17,4,10,1 -14612,5,4,0,13,4,5,2,4,1,0,18,3,3,5,1 -14613,9,2,0,6,3,0,8,1,4,0,2,6,0,12,0 -14614,1,1,0,5,0,5,11,2,3,1,13,4,3,32,0 -14615,7,8,0,12,5,4,8,1,2,0,9,10,3,1,1 -14616,10,3,0,13,1,3,8,5,2,0,10,11,0,3,0 -14617,8,3,0,8,1,1,12,2,1,1,6,8,3,16,1 -14618,0,6,0,4,4,5,3,0,1,1,13,2,3,27,0 -14619,7,6,0,4,0,0,8,0,0,1,13,2,5,33,0 -14620,0,4,0,14,6,2,11,5,0,1,15,19,4,27,1 -14621,6,5,0,15,0,6,3,1,2,0,17,8,0,24,0 -14622,10,2,0,2,2,0,7,1,2,1,0,17,5,27,0 -14623,6,2,0,10,3,2,3,5,3,0,18,7,0,32,1 -14624,10,2,0,3,6,5,9,2,0,0,8,18,4,10,0 -14625,3,7,0,0,2,2,3,0,3,1,18,8,2,23,1 -14626,1,1,0,4,1,3,1,0,3,1,2,2,2,36,0 -14627,1,3,0,4,3,5,6,3,3,1,8,12,2,35,0 -14628,1,0,0,3,1,0,5,3,0,1,11,2,0,32,0 -14629,10,8,0,14,4,3,14,1,0,1,2,12,3,17,0 -14630,7,4,0,4,6,0,6,4,2,1,15,16,3,34,0 -14631,7,7,0,14,6,0,1,3,2,1,2,20,5,6,0 -14632,2,8,0,13,0,0,4,1,2,1,8,2,3,34,0 -14633,10,2,0,3,5,0,2,2,0,0,13,20,0,9,0 -14634,2,4,0,11,0,2,8,4,4,1,8,18,4,2,0 -14635,10,5,0,15,5,0,4,3,1,0,11,3,3,27,1 
-14636,2,8,0,14,6,1,12,2,2,0,8,18,0,19,0 -14637,10,5,0,0,1,0,6,2,2,0,18,14,4,8,1 -14638,1,2,0,8,6,1,4,0,1,0,9,8,1,24,1 -14639,5,2,0,5,5,5,9,1,2,1,17,2,5,40,0 -14640,3,4,0,3,0,4,4,4,0,0,2,0,0,31,0 -14641,2,3,0,13,2,4,8,5,3,0,15,20,5,29,0 -14642,7,8,0,6,6,0,7,4,0,0,17,9,0,34,0 -14643,0,2,0,1,3,5,9,4,3,0,0,13,2,24,0 -14644,3,0,0,3,3,5,0,4,3,1,0,13,5,35,0 -14645,2,7,0,9,3,3,3,4,4,1,2,9,0,4,0 -14646,9,5,0,7,6,2,5,4,3,1,15,0,4,26,0 -14647,5,2,0,14,2,4,3,1,1,0,2,14,3,36,0 -14648,7,4,0,8,4,0,12,1,2,1,13,11,0,6,0 -14649,0,5,0,10,5,0,4,4,0,0,8,11,5,8,0 -14650,8,3,0,7,3,5,10,1,1,1,18,10,3,0,1 -14651,9,6,0,5,0,3,4,1,0,0,17,2,0,35,0 -14652,2,8,0,5,0,3,6,1,2,0,4,20,2,31,0 -14653,1,5,0,9,5,4,5,3,4,0,4,18,2,31,0 -14654,10,1,0,13,1,4,6,4,3,0,6,13,5,17,0 -14655,4,5,0,3,5,4,7,4,2,1,17,11,5,33,0 -14656,2,0,0,8,3,4,0,4,2,0,12,4,3,22,0 -14657,7,3,0,3,0,3,11,3,3,1,4,11,3,31,0 -14658,9,0,0,11,3,0,6,3,4,0,7,2,5,23,0 -14659,2,7,0,5,6,3,6,1,0,1,2,9,1,11,0 -14660,2,3,0,13,0,4,10,0,2,0,13,2,2,16,0 -14661,1,4,0,3,4,6,14,4,2,1,4,10,5,33,0 -14662,0,7,0,13,5,5,14,2,2,0,4,9,2,4,0 -14663,10,1,0,11,5,0,14,4,3,1,4,5,0,9,0 -14664,0,3,0,3,2,5,7,2,2,0,13,2,0,29,0 -14665,10,2,0,15,5,1,11,3,2,1,3,17,4,23,1 -14666,3,5,0,10,5,4,13,2,3,1,17,9,2,32,0 -14667,2,6,0,11,0,0,9,1,0,1,4,2,1,16,0 -14668,0,6,0,0,1,1,12,2,0,1,13,11,0,19,0 -14669,3,6,0,2,2,2,13,2,1,0,13,11,1,24,0 -14670,10,4,0,6,3,5,9,0,2,1,12,14,3,11,1 -14671,9,3,0,1,6,5,9,5,1,1,3,4,5,18,1 -14672,7,1,0,15,3,4,10,1,0,1,18,18,3,1,1 -14673,7,2,0,6,1,3,6,1,3,0,14,0,2,14,0 -14674,1,2,0,6,3,0,5,2,2,0,2,6,1,9,0 -14675,1,4,0,2,4,4,5,5,0,0,13,2,3,24,0 -14676,7,3,0,11,2,2,2,3,4,1,4,9,0,22,0 -14677,10,4,0,11,0,4,13,3,4,0,2,6,2,7,0 -14678,6,4,0,5,3,0,9,1,1,0,2,2,5,14,0 -14679,3,6,0,1,3,0,12,0,1,1,13,20,1,40,0 -14680,3,2,0,8,1,1,9,0,0,0,17,17,1,22,0 -14681,3,4,0,5,2,2,11,0,4,0,15,17,1,28,1 -14682,9,6,0,11,3,0,3,1,4,1,17,2,0,10,0 -14683,6,8,0,10,0,3,11,0,0,1,9,17,3,39,1 -14684,8,0,0,2,3,6,4,5,3,1,2,9,4,35,0 -14685,1,8,0,2,0,5,8,1,0,1,17,0,0,8,0 
-14686,1,6,0,8,3,0,4,0,1,0,13,7,2,36,1 -14687,4,6,0,0,5,2,8,2,4,1,13,2,5,26,0 -14688,10,6,0,0,6,3,6,5,4,0,15,8,3,34,1 -14689,3,3,0,4,6,3,12,3,4,1,17,11,4,35,0 -14690,3,7,0,10,1,0,0,2,4,0,17,2,5,9,0 -14691,5,5,0,15,1,5,7,5,4,1,5,8,4,0,1 -14692,6,8,0,12,4,3,9,3,0,0,10,16,0,7,0 -14693,2,7,0,15,1,4,0,4,0,1,1,1,2,30,0 -14694,0,6,0,5,5,3,9,0,1,1,8,3,4,23,0 -14695,2,0,0,5,0,3,6,3,1,0,8,19,0,28,0 -14696,10,4,0,3,1,0,14,2,2,1,0,18,1,14,0 -14697,0,0,0,14,3,5,1,1,4,0,14,0,4,9,0 -14698,8,3,0,15,6,1,10,3,3,1,7,7,5,41,1 -14699,1,0,0,0,4,0,9,3,1,1,16,0,2,13,0 -14700,1,2,0,3,3,1,8,2,3,1,2,9,0,37,0 -14701,4,8,0,7,0,0,2,5,3,1,3,0,0,1,0 -14702,3,1,0,5,1,3,0,0,1,1,2,6,0,3,0 -14703,5,7,0,2,6,2,5,2,0,0,2,9,2,35,0 -14704,0,4,0,13,1,2,10,1,3,1,13,6,3,6,0 -14705,1,3,0,13,4,4,9,2,1,0,13,14,1,18,0 -14706,4,3,0,9,4,0,3,3,3,1,1,10,4,11,1 -14707,0,0,0,13,0,4,13,1,3,0,8,11,4,26,0 -14708,2,0,0,11,1,0,2,2,4,1,8,2,4,35,0 -14709,5,5,0,14,1,5,5,3,3,0,8,16,4,26,0 -14710,0,3,0,9,0,1,5,5,0,1,8,4,0,24,0 -14711,0,1,0,7,5,0,10,3,2,0,13,14,4,6,0 -14712,0,1,0,6,0,0,7,4,3,0,14,15,3,20,0 -14713,2,8,0,4,2,5,11,5,3,0,17,20,5,5,0 -14714,6,7,0,11,1,2,10,0,0,0,4,10,0,31,0 -14715,2,8,0,1,4,3,10,1,0,0,13,7,3,21,0 -14716,9,2,0,13,3,2,12,4,4,1,4,12,4,29,0 -14717,9,3,0,9,0,3,5,3,3,0,16,16,4,29,0 -14718,3,6,0,10,6,5,1,2,1,1,13,2,0,2,0 -14719,1,7,0,7,5,1,8,3,2,0,13,4,5,17,0 -14720,6,7,0,7,0,3,14,4,0,1,12,6,3,37,0 -14721,0,0,0,12,2,3,8,0,0,0,8,18,0,9,0 -14722,3,0,0,14,3,5,12,2,1,1,13,11,3,26,0 -14723,2,4,0,6,3,5,12,0,2,0,5,8,3,5,1 -14724,0,5,0,9,3,1,11,4,0,1,3,19,1,3,1 -14725,2,8,0,13,5,3,0,4,1,0,9,13,4,29,0 -14726,5,4,0,14,0,6,4,3,2,0,18,14,3,23,1 -14727,8,5,0,5,4,1,14,1,3,1,9,17,3,20,1 -14728,5,8,0,10,6,3,4,0,4,1,11,7,3,23,1 -14729,9,5,0,6,1,1,0,5,0,1,3,7,3,30,1 -14730,0,4,0,6,1,0,13,4,1,1,3,0,5,22,0 -14731,1,5,0,1,2,4,12,3,0,0,4,11,2,36,0 -14732,0,7,0,10,3,0,10,0,2,1,2,18,3,29,0 -14733,6,1,0,3,0,2,1,1,1,1,13,2,0,40,0 -14734,10,7,0,11,6,4,14,3,1,0,13,11,5,24,0 -14735,3,2,0,11,3,2,11,5,4,1,18,5,3,5,1 
-14736,6,8,0,8,3,4,11,3,1,0,8,18,4,19,0 -14737,5,6,0,8,4,4,6,0,3,1,18,6,2,9,0 -14738,4,3,0,3,4,2,13,2,0,0,11,14,4,33,1 -14739,2,3,0,6,0,6,4,0,1,0,4,2,4,3,0 -14740,1,6,0,3,4,5,9,4,1,0,1,10,1,40,1 -14741,4,8,0,8,2,2,8,5,0,1,0,10,2,20,1 -14742,0,3,0,11,6,4,2,1,1,1,2,14,0,27,0 -14743,6,1,0,5,3,0,1,3,2,1,2,15,1,22,0 -14744,6,2,0,14,3,1,11,5,1,1,1,17,0,36,1 -14745,9,0,0,9,0,3,12,5,4,1,9,12,3,21,1 -14746,3,4,0,14,0,1,9,4,0,1,10,2,3,26,0 -14747,3,8,0,6,2,3,5,1,1,1,4,13,4,11,0 -14748,6,6,0,11,4,3,12,3,0,1,2,11,5,37,0 -14749,3,7,0,12,5,6,6,0,3,0,17,3,2,16,0 -14750,0,6,0,10,4,6,9,1,3,0,13,11,2,16,0 -14751,2,0,0,8,5,0,1,4,3,0,8,1,0,38,0 -14752,9,0,0,0,4,3,10,3,2,1,13,8,0,27,0 -14753,5,3,0,14,2,3,8,3,2,0,4,18,1,4,0 -14754,2,7,0,0,2,0,2,4,0,1,16,7,0,18,0 -14755,3,1,0,5,0,2,3,4,3,0,10,16,3,32,0 -14756,2,6,0,11,6,4,4,3,1,1,17,19,3,8,0 -14757,0,2,0,8,5,0,4,0,4,0,2,6,1,39,0 -14758,5,7,0,14,3,5,9,2,4,0,7,0,4,25,0 -14759,3,8,0,4,0,3,4,1,3,1,5,6,2,4,0 -14760,8,2,0,3,3,5,6,0,0,0,2,9,4,38,0 -14761,1,4,0,7,0,0,1,4,4,1,2,14,5,10,0 -14762,9,3,0,7,4,2,0,5,4,1,12,17,4,14,1 -14763,2,5,0,1,6,5,11,4,1,0,0,2,3,35,0 -14764,3,7,0,14,6,2,0,1,3,0,2,15,5,24,0 -14765,0,5,0,13,6,4,9,1,1,1,16,13,2,26,0 -14766,8,2,0,5,6,3,13,5,1,1,14,5,4,8,1 -14767,3,4,0,0,0,6,4,1,4,1,10,2,1,41,0 -14768,1,2,0,6,1,0,5,4,2,0,15,20,2,14,0 -14769,3,2,0,15,2,1,10,5,2,0,9,7,2,21,1 -14770,0,2,0,5,6,4,14,3,0,0,0,8,4,6,0 -14771,4,8,0,11,2,4,9,3,2,0,4,15,4,26,0 -14772,6,2,0,10,4,0,14,0,4,1,5,5,4,21,1 -14773,1,7,0,15,6,3,10,2,1,1,10,11,0,31,0 -14774,1,6,0,13,3,4,9,0,0,0,13,18,3,12,0 -14775,2,3,0,6,0,3,13,2,3,1,3,2,2,37,0 -14776,0,4,0,3,1,3,0,3,4,0,13,2,4,40,0 -14777,3,8,0,0,4,3,8,1,3,0,0,0,0,38,0 -14778,9,1,0,11,5,0,14,2,3,1,2,11,4,18,0 -14779,7,3,0,11,1,2,10,0,2,1,2,0,4,6,0 -14780,7,6,0,15,4,1,4,5,1,1,7,19,5,14,1 -14781,0,0,0,14,6,2,12,0,2,1,0,16,0,17,0 -14782,6,1,0,7,3,4,8,0,3,1,8,20,4,16,0 -14783,0,7,0,3,2,5,10,4,3,1,8,2,5,1,0 -14784,4,3,0,10,6,4,4,4,4,1,16,19,4,41,1 -14785,7,8,0,3,2,5,0,5,2,1,18,7,3,5,1 
-14786,7,1,0,15,3,1,0,1,2,0,9,17,3,3,1 -14787,4,0,0,6,2,5,7,0,1,1,18,10,5,39,1 -14788,1,5,0,15,0,6,10,3,0,0,2,11,2,17,0 -14789,8,7,0,13,1,5,14,3,0,1,13,0,1,41,0 -14790,5,0,0,13,2,4,3,2,4,1,18,7,1,31,1 -14791,7,0,0,1,3,6,9,5,4,1,9,1,0,37,1 -14792,5,7,0,9,1,5,8,4,0,1,8,0,4,28,0 -14793,1,7,0,14,1,4,7,0,3,0,8,18,0,2,0 -14794,0,6,0,5,0,4,7,0,4,0,13,6,0,5,0 -14795,1,0,0,6,6,0,0,4,4,0,5,20,4,28,0 -14796,0,4,0,5,2,5,9,1,4,0,8,19,3,5,0 -14797,3,3,0,13,1,5,0,2,0,1,3,11,1,29,0 -14798,2,1,0,4,2,5,5,2,2,1,11,11,5,12,0 -14799,6,3,0,10,4,1,11,3,2,1,5,8,4,32,1 -14800,7,8,0,3,3,6,8,3,4,1,10,2,2,22,0 -14801,1,0,0,0,2,4,8,3,0,1,13,18,0,29,0 -14802,2,3,0,13,2,5,12,2,0,1,10,2,1,19,0 -14803,2,8,0,6,1,5,5,2,4,0,4,11,0,5,0 -14804,4,4,0,10,6,2,7,4,0,1,13,18,5,33,0 -14805,0,6,0,1,2,5,12,5,2,1,16,16,1,23,0 -14806,8,7,0,1,5,1,5,0,4,0,5,20,5,27,0 -14807,9,2,0,1,6,0,7,5,0,0,12,5,4,36,1 -14808,6,6,0,14,4,4,3,2,3,0,7,19,4,21,1 -14809,5,4,0,4,2,5,5,2,1,0,12,11,4,5,0 -14810,5,1,0,10,1,2,3,5,4,1,3,2,3,36,1 -14811,3,4,0,3,0,4,12,1,1,0,0,2,5,37,0 -14812,0,1,0,3,2,0,13,3,0,1,10,2,1,22,0 -14813,2,7,0,3,6,6,12,2,4,0,4,14,0,28,0 -14814,10,8,0,15,0,4,1,3,2,1,15,11,0,33,0 -14815,1,2,0,15,3,1,12,3,0,0,2,2,4,29,0 -14816,3,2,0,9,3,5,13,3,3,1,17,12,0,21,0 -14817,1,7,0,11,0,3,5,0,4,1,13,12,1,26,0 -14818,7,7,0,0,2,3,5,5,1,0,8,9,5,26,0 -14819,6,5,0,3,6,1,9,1,2,1,18,0,5,11,1 -14820,2,5,0,9,2,0,2,2,1,0,8,20,2,16,0 -14821,1,1,0,1,6,0,0,3,1,1,4,6,0,7,0 -14822,2,6,0,3,4,4,9,0,4,1,18,1,0,1,1 -14823,7,8,0,8,5,3,2,5,2,1,8,18,1,5,0 -14824,6,7,0,5,3,1,4,5,1,1,11,1,5,3,1 -14825,10,5,0,6,0,6,14,1,3,0,8,2,4,17,0 -14826,10,6,0,1,5,2,4,5,4,0,16,19,4,19,1 -14827,1,7,0,12,5,0,0,5,4,0,13,7,0,30,0 -14828,2,8,0,3,0,3,7,2,2,0,2,2,4,26,0 -14829,0,6,0,1,0,3,14,2,4,0,8,9,0,22,0 -14830,0,1,0,1,3,3,1,0,2,1,13,2,4,18,0 -14831,5,6,0,6,4,2,7,5,3,1,16,15,3,18,1 -14832,1,5,0,5,2,0,5,3,4,0,17,4,1,25,0 -14833,0,8,0,6,6,2,7,3,4,1,13,11,5,1,0 -14834,0,2,0,3,2,6,4,1,0,1,11,18,1,10,0 -14835,7,2,0,0,1,1,6,2,0,1,8,19,4,31,1 
-14836,1,3,0,3,0,2,5,3,1,0,2,9,5,6,0 -14837,0,2,0,13,2,5,0,2,4,0,17,9,2,0,0 -14838,2,6,0,7,5,4,5,2,1,1,17,2,0,40,0 -14839,2,6,0,7,2,3,14,3,0,0,2,15,2,18,0 -14840,10,2,0,11,5,5,7,1,2,0,13,9,0,24,0 -14841,1,8,0,2,6,1,0,4,1,1,18,12,2,28,1 -14842,7,6,0,9,4,0,0,4,3,0,5,4,4,26,1 -14843,4,5,0,10,0,4,1,1,3,1,8,6,5,19,0 -14844,2,2,0,2,2,1,11,0,0,1,0,13,5,18,0 -14845,4,0,0,7,6,3,10,0,0,0,13,11,5,29,0 -14846,1,5,0,11,1,4,2,1,0,0,4,12,1,35,0 -14847,2,0,0,11,0,1,11,4,0,1,17,11,2,7,0 -14848,3,2,0,15,6,2,8,0,2,0,13,11,3,31,0 -14849,1,6,0,4,3,2,5,1,4,0,4,2,1,29,0 -14850,1,0,0,5,0,2,10,4,0,0,1,7,5,3,1 -14851,3,5,0,15,4,3,0,3,4,0,2,18,1,16,0 -14852,4,6,0,15,4,4,4,3,0,1,9,8,2,23,1 -14853,4,6,0,11,5,2,4,0,4,1,9,18,5,25,1 -14854,6,6,0,12,1,0,5,0,1,1,13,0,0,26,0 -14855,1,3,0,3,0,5,14,1,2,0,2,9,0,6,0 -14856,5,3,0,12,0,5,11,1,4,0,4,0,4,2,0 -14857,3,5,0,3,4,1,12,3,3,1,10,15,1,6,0 -14858,8,8,0,2,0,3,7,0,3,0,14,18,2,26,0 -14859,1,7,0,4,1,3,6,4,4,0,13,18,0,10,0 -14860,9,6,0,10,0,4,8,4,2,0,0,11,0,0,0 -14861,8,2,0,6,1,1,0,0,1,0,5,10,5,14,1 -14862,4,6,0,7,6,5,2,1,4,0,4,2,5,14,0 -14863,8,1,0,12,0,0,3,5,3,0,5,15,5,14,1 -14864,3,5,0,9,6,0,0,1,4,0,13,20,4,0,0 -14865,3,8,0,9,5,2,10,2,4,1,8,4,5,38,0 -14866,6,6,0,10,0,4,5,4,4,0,13,20,1,40,0 -14867,7,6,0,14,6,4,9,5,1,1,1,3,5,31,1 -14868,5,6,0,3,4,4,10,5,1,1,5,5,2,4,1 -14869,2,4,0,15,5,1,9,4,2,0,13,4,5,26,0 -14870,10,6,0,5,6,6,5,1,1,1,13,15,4,3,0 -14871,1,8,0,2,1,5,3,4,2,0,13,20,4,16,0 -14872,2,7,0,2,6,0,0,4,0,0,2,9,4,17,0 -14873,8,1,0,8,4,6,10,2,1,0,2,0,0,6,0 -14874,9,5,0,5,6,3,4,0,0,1,17,18,0,39,0 -14875,6,0,0,8,2,5,0,5,1,1,12,4,5,31,0 -14876,1,7,0,7,2,6,3,4,3,0,18,2,3,0,1 -14877,0,5,0,14,2,0,3,0,1,1,5,7,1,1,1 -14878,4,2,0,3,2,6,1,1,0,0,7,15,4,9,0 -14879,1,1,0,15,0,1,12,3,1,0,2,6,0,27,0 -14880,8,8,0,6,0,4,14,4,2,0,12,20,3,16,0 -14881,8,8,0,3,6,3,12,5,3,0,11,16,2,25,0 -14882,2,3,0,0,5,5,14,2,3,0,13,2,1,2,0 -14883,4,8,0,4,4,0,2,4,0,0,13,0,0,4,0 -14884,0,3,0,1,0,5,8,1,2,0,2,13,4,7,0 -14885,5,1,0,5,6,6,6,3,0,1,14,15,0,8,0 
-14886,6,1,0,8,1,3,3,0,4,1,17,19,5,31,1 -14887,10,1,0,1,5,5,11,2,1,1,10,4,4,38,1 -14888,3,4,0,8,5,4,5,4,2,0,10,2,2,20,0 -14889,2,8,0,14,0,0,6,1,4,1,4,14,2,19,0 -14890,9,6,0,3,5,1,0,1,3,1,10,11,5,0,0 -14891,10,3,0,9,6,2,8,1,3,1,18,3,3,35,1 -14892,0,4,0,2,0,5,12,4,2,1,8,11,1,40,0 -14893,8,4,0,2,2,6,10,5,0,1,18,8,5,18,1 -14894,4,7,0,7,6,0,6,4,0,1,12,13,5,19,0 -14895,9,0,0,11,0,6,13,4,2,1,15,7,5,32,1 -14896,7,2,0,8,5,4,11,4,0,0,16,11,5,8,0 -14897,4,4,0,13,5,3,7,0,3,1,13,2,5,35,0 -14898,4,7,0,1,2,3,14,1,4,1,13,8,4,17,0 -14899,0,1,0,5,0,4,11,1,4,0,10,0,1,34,0 -14900,6,6,0,4,1,2,11,2,3,0,13,2,3,28,0 -14901,0,8,0,4,6,4,6,2,0,1,9,11,2,37,0 -14902,2,8,0,15,0,2,11,3,2,1,9,14,3,9,1 -14903,6,2,0,1,0,4,10,3,2,0,15,15,5,3,0 -14904,3,7,0,2,4,3,3,4,0,0,11,2,5,26,0 -14905,2,8,0,8,0,1,9,1,1,1,13,2,1,7,0 -14906,3,2,0,15,5,5,12,4,0,0,15,20,4,2,0 -14907,2,5,0,2,5,6,9,2,2,0,17,1,4,21,0 -14908,4,1,0,8,5,4,9,3,0,0,2,6,2,13,0 -14909,6,1,0,11,6,2,1,2,3,0,6,2,1,25,0 -14910,8,0,0,11,5,5,0,1,1,0,8,3,3,2,0 -14911,1,4,0,2,0,6,12,3,2,0,17,13,5,41,0 -14912,3,2,0,15,3,6,11,0,1,0,9,19,4,12,1 -14913,8,5,0,4,1,2,6,0,3,1,0,11,5,20,0 -14914,10,2,0,4,0,4,0,3,1,1,8,11,3,34,0 -14915,0,3,0,6,6,4,2,0,0,0,17,11,5,26,0 -14916,4,7,0,13,6,6,5,4,1,1,13,11,4,21,0 -14917,7,2,0,13,2,3,11,0,4,1,18,7,4,31,1 -14918,1,2,0,0,2,0,5,4,0,0,6,11,4,20,0 -14919,7,5,0,0,6,2,5,4,4,1,18,7,0,2,1 -14920,8,0,0,12,6,1,11,1,2,1,9,14,3,20,1 -14921,0,8,0,15,2,5,6,2,0,1,6,17,2,32,0 -14922,10,0,0,1,2,5,4,1,0,1,2,9,1,14,0 -14923,1,4,0,3,0,3,14,4,1,1,6,19,0,35,0 -14924,5,1,0,5,3,3,14,5,1,1,14,12,1,16,1 -14925,10,2,0,14,4,6,2,0,3,0,18,14,4,1,1 -14926,4,4,0,10,0,6,1,2,3,1,2,11,0,20,0 -14927,2,0,0,1,1,5,6,0,0,0,2,11,0,23,0 -14928,0,4,0,12,2,2,2,3,2,1,13,16,5,20,0 -14929,3,7,0,15,3,5,9,0,4,0,4,9,1,33,0 -14930,5,6,0,12,6,6,4,3,0,1,13,15,0,2,0 -14931,10,8,0,3,3,3,9,4,2,1,17,11,0,12,0 -14932,6,8,0,3,0,5,6,4,4,1,7,0,2,13,0 -14933,0,6,0,4,0,3,9,2,3,0,12,5,4,20,0 -14934,1,8,0,1,2,5,11,0,0,0,6,13,2,31,0 -14935,4,0,0,6,6,1,10,4,2,0,13,2,4,8,0 
-14936,0,6,0,14,6,5,8,1,2,0,13,11,4,3,0 -14937,8,0,0,0,5,3,6,5,4,1,7,1,5,35,1 -14938,1,8,0,12,4,0,8,3,0,0,4,18,4,32,0 -14939,1,0,0,13,0,5,8,4,4,1,2,2,1,22,0 -14940,6,8,0,11,5,6,7,0,1,1,16,16,1,32,1 -14941,4,6,0,4,0,5,1,2,3,0,17,0,1,30,0 -14942,10,7,0,14,4,6,1,1,4,1,4,6,2,4,0 -14943,1,3,0,0,4,5,8,3,4,0,2,16,5,14,0 -14944,0,8,0,14,6,1,5,3,1,1,18,19,3,2,1 -14945,0,2,0,15,1,5,8,0,3,0,8,15,0,27,0 -14946,0,1,0,8,3,0,6,2,0,0,8,11,4,33,0 -14947,1,2,0,5,2,5,13,1,0,1,4,11,2,28,0 -14948,1,4,0,10,5,5,12,2,3,1,12,13,1,17,0 -14949,3,2,0,1,6,0,8,0,3,1,17,9,1,6,0 -14950,7,0,0,3,4,6,3,5,1,1,18,10,1,21,1 -14951,9,5,0,5,5,5,5,0,2,1,9,1,4,31,1 -14952,9,1,0,14,0,3,3,4,1,0,9,4,5,33,1 -14953,7,4,0,7,6,2,13,1,2,0,13,13,0,4,0 -14954,3,6,0,11,0,3,5,0,1,1,7,11,2,36,0 -14955,3,3,0,4,1,3,7,5,4,1,8,2,4,7,0 -14956,0,7,0,5,6,0,0,4,0,0,8,12,0,26,0 -14957,0,4,0,6,4,0,12,0,0,0,17,18,0,28,0 -14958,8,0,0,14,2,3,10,3,2,1,18,8,4,20,1 -14959,4,0,0,10,4,1,14,1,4,1,5,5,3,18,1 -14960,1,5,0,13,4,3,0,3,1,0,15,20,3,20,0 -14961,10,2,0,1,4,2,4,2,0,1,11,11,1,0,0 -14962,0,6,0,12,0,5,8,3,3,1,2,16,2,19,0 -14963,3,0,0,2,5,4,5,4,3,1,10,18,4,33,0 -14964,10,0,0,10,6,5,1,5,4,1,18,5,4,5,1 -14965,8,8,0,3,1,2,5,4,0,1,7,2,3,32,0 -14966,7,1,0,2,5,2,13,5,0,1,5,17,4,21,1 -14967,8,6,0,8,1,5,14,5,1,1,18,17,2,37,1 -14968,0,8,0,10,6,4,7,2,3,1,0,2,2,35,0 -14969,7,2,0,12,1,0,2,4,0,0,13,6,0,4,0 -14970,4,6,0,3,5,0,7,2,3,1,12,11,5,14,0 -14971,4,6,0,13,5,6,8,3,2,0,17,4,0,38,0 -14972,0,2,0,5,1,6,12,3,1,1,8,2,5,29,0 -14973,10,2,0,9,6,5,13,2,4,0,12,10,1,9,0 -14974,3,3,0,14,3,4,0,0,0,0,16,13,1,13,0 -14975,0,6,0,0,6,5,1,1,4,1,8,15,4,19,0 -14976,4,3,0,14,6,1,1,5,4,1,18,7,5,12,1 -14977,3,8,0,8,3,2,6,1,3,1,15,3,2,19,0 -14978,0,2,0,3,1,3,6,1,2,1,16,3,2,5,0 -14979,0,6,0,11,6,4,13,4,1,1,8,13,5,20,0 -14980,2,5,0,4,4,0,10,3,0,0,10,13,1,0,0 -14981,1,6,0,12,1,4,9,2,3,1,17,11,2,37,0 -14982,9,3,0,4,2,6,11,0,4,1,5,1,1,26,1 -14983,10,6,0,5,3,6,4,5,4,0,18,5,2,10,1 -14984,8,0,0,12,3,0,5,0,1,1,5,4,1,35,1 -14985,3,8,0,5,4,4,12,2,3,1,2,20,5,23,0 
-14986,8,1,0,9,2,5,3,2,2,0,0,11,0,14,0 -14987,0,1,0,5,6,1,4,3,2,0,4,14,1,1,0 -14988,6,4,0,0,2,1,6,2,1,1,18,14,1,25,1 -14989,5,0,0,10,0,0,6,0,2,0,4,20,1,30,0 -14990,4,6,0,8,2,0,5,3,3,1,14,11,5,5,0 -14991,0,5,0,10,4,3,2,2,2,0,11,3,5,34,1 -14992,2,8,0,0,6,0,12,2,4,1,2,20,0,25,0 -14993,0,1,0,13,6,5,8,4,3,1,10,18,2,30,0 -14994,5,7,0,4,1,2,14,4,3,0,17,11,2,30,0 -14995,9,1,0,0,2,0,8,5,3,1,12,2,0,8,0 -14996,4,2,0,2,0,0,1,4,0,0,3,15,5,1,0 -14997,2,3,0,13,1,4,12,2,2,0,13,19,4,39,0 -14998,10,2,0,5,3,2,1,0,0,0,18,3,5,40,1 -14999,4,2,0,6,2,0,12,3,4,0,8,16,0,10,0 -15000,1,1,0,11,0,4,10,2,1,0,8,13,3,7,0 -15001,2,6,0,11,0,2,9,4,2,1,5,2,3,22,0 -15002,10,3,0,8,0,2,14,0,4,1,6,11,4,18,0 -15003,1,7,0,4,1,0,9,1,2,1,0,13,1,39,0 -15004,5,6,0,9,6,3,0,0,3,0,2,4,2,38,0 -15005,4,8,0,13,5,0,8,3,1,0,12,11,3,9,0 -15006,0,0,0,3,5,5,6,4,1,1,13,6,4,21,0 -15007,1,2,0,0,4,1,2,1,1,1,3,9,4,22,1 -15008,0,3,0,11,2,0,0,4,2,1,0,18,0,14,0 -15009,8,3,0,12,2,1,13,2,1,0,6,7,3,38,1 -15010,0,6,0,3,0,6,8,4,4,1,13,2,5,17,0 -15011,2,2,0,13,3,3,9,3,1,1,12,11,2,38,0 -15012,4,6,0,3,2,0,0,0,1,0,13,0,2,5,0 -15013,2,8,0,11,1,4,0,4,2,0,16,3,3,21,0 -15014,10,5,0,2,0,3,5,3,0,1,9,13,1,38,0 -15015,3,8,0,11,0,5,1,5,3,0,16,6,2,12,0 -15016,10,7,0,8,1,3,12,4,0,0,12,11,1,1,0 -15017,2,7,0,5,6,3,1,1,3,1,14,2,3,37,0 -15018,9,4,0,3,4,5,1,2,4,1,2,11,0,22,0 -15019,6,0,0,13,3,5,4,3,0,1,13,11,0,29,0 -15020,10,8,0,15,4,4,1,1,1,0,16,2,1,11,0 -15021,8,3,0,0,1,1,10,5,3,1,14,10,3,10,1 -15022,8,0,0,6,6,0,4,3,0,0,3,10,3,40,1 -15023,5,6,0,0,6,3,1,0,1,1,16,7,5,10,1 -15024,5,5,0,7,5,4,14,2,3,0,6,11,0,13,0 -15025,1,8,0,13,5,5,9,4,1,1,11,11,3,38,0 -15026,6,4,0,7,5,1,3,4,0,0,9,8,5,23,1 -15027,1,2,0,3,3,3,14,4,1,1,1,16,5,18,0 -15028,6,6,0,9,4,1,5,1,4,0,17,4,0,13,0 -15029,2,6,0,13,1,0,12,5,4,0,13,13,5,23,0 -15030,0,3,0,1,5,5,9,4,2,1,8,20,3,18,0 -15031,2,1,0,10,1,0,4,0,2,0,15,6,3,5,0 -15032,3,8,0,11,2,0,8,0,0,1,12,11,0,27,0 -15033,9,6,0,13,6,0,0,0,0,0,13,9,1,39,0 -15034,1,4,0,3,0,4,9,2,3,1,7,18,1,8,0 -15035,2,7,0,15,0,6,8,4,0,0,2,8,5,35,0 
-15036,1,1,0,11,6,6,10,3,1,0,2,0,0,10,0 -15037,9,0,0,7,5,2,1,0,1,1,18,5,1,21,1 -15038,9,7,0,3,2,4,6,4,2,0,5,18,4,29,0 -15039,8,2,0,8,0,0,0,0,3,0,2,11,3,11,0 -15040,1,7,0,12,2,5,4,5,3,0,5,14,3,12,1 -15041,1,2,0,1,1,0,1,0,3,0,2,13,0,12,0 -15042,9,8,0,13,2,2,4,1,4,0,4,9,0,34,0 -15043,10,5,0,4,6,0,10,3,4,1,6,2,1,25,0 -15044,3,1,0,7,0,6,0,1,3,0,5,5,4,8,1 -15045,5,6,0,5,0,0,10,1,1,1,13,6,4,11,0 -15046,5,4,0,5,3,2,11,3,4,1,8,18,5,39,0 -15047,2,8,0,3,5,4,8,1,3,1,2,18,0,34,0 -15048,9,1,0,14,0,3,12,1,2,1,2,13,0,27,0 -15049,3,8,0,12,4,2,3,4,1,0,17,11,5,22,0 -15050,5,2,0,9,1,0,10,4,3,0,0,13,2,41,0 -15051,10,6,0,12,3,5,4,5,4,1,3,7,2,25,1 -15052,1,2,0,10,0,6,11,2,4,1,2,11,0,26,0 -15053,0,3,0,15,4,0,7,5,0,0,17,11,5,7,0 -15054,6,8,0,1,5,4,8,3,4,0,13,0,1,39,0 -15055,9,8,0,2,0,0,9,3,3,1,13,15,5,25,0 -15056,2,7,0,11,5,5,5,5,1,0,6,9,4,27,0 -15057,0,2,0,0,6,4,3,3,0,1,17,18,2,6,0 -15058,3,7,0,5,2,0,1,4,0,0,8,9,3,22,0 -15059,7,5,0,0,2,0,14,5,0,1,5,0,0,21,1 -15060,10,0,0,8,1,6,13,2,2,0,2,11,3,31,0 -15061,0,8,0,9,0,1,7,2,3,1,17,15,0,24,0 -15062,10,7,0,3,6,5,2,0,2,1,18,15,5,2,0 -15063,3,7,0,12,5,6,0,4,1,1,2,2,4,18,0 -15064,9,4,0,6,6,5,7,3,1,0,8,16,4,17,0 -15065,8,6,0,11,0,5,4,2,4,1,13,14,3,19,0 -15066,7,6,0,8,0,2,4,3,3,1,9,1,3,21,1 -15067,10,4,0,9,2,5,6,4,0,0,15,6,1,26,0 -15068,6,6,0,6,5,3,6,1,4,0,2,4,1,21,0 -15069,10,3,0,5,4,4,10,3,0,1,2,4,4,10,0 -15070,1,0,0,7,5,2,13,0,1,0,13,12,5,30,0 -15071,0,0,0,3,5,6,9,3,3,1,4,11,1,41,0 -15072,3,8,0,11,3,2,10,2,3,1,5,17,2,28,1 -15073,3,7,0,3,1,2,8,4,2,0,4,0,2,12,0 -15074,10,6,0,1,4,1,2,0,4,1,9,1,1,36,1 -15075,2,2,0,3,0,3,8,3,2,1,8,2,5,6,0 -15076,8,7,0,10,4,3,8,1,3,0,1,19,3,27,1 -15077,0,7,0,6,5,0,14,0,3,0,13,6,3,14,0 -15078,3,2,0,13,5,0,5,0,3,1,17,9,1,9,0 -15079,4,4,0,12,3,2,14,0,1,0,3,7,5,0,1 -15080,3,8,0,1,0,6,10,0,4,0,13,12,0,4,0 -15081,0,0,0,5,0,3,0,2,0,0,2,2,4,18,0 -15082,9,0,0,0,0,4,0,2,1,1,17,16,0,4,0 -15083,3,4,0,0,2,4,3,4,0,1,8,15,5,25,0 -15084,3,7,0,13,2,5,7,1,0,0,10,11,3,14,0 -15085,2,6,0,9,4,3,14,3,1,0,13,10,1,41,0 
-15086,1,0,0,2,2,3,3,1,3,0,17,15,0,26,0 -15087,2,2,0,13,0,3,2,2,3,1,8,18,1,18,0 -15088,2,6,0,6,4,5,14,3,4,1,4,4,2,38,0 -15089,0,8,0,15,1,4,6,1,0,1,13,18,4,0,0 -15090,0,5,0,6,1,4,2,3,0,0,2,11,0,10,0 -15091,9,0,0,13,3,5,8,2,3,1,6,2,0,4,0 -15092,6,4,0,3,5,5,4,5,0,0,11,5,3,41,1 -15093,0,6,0,2,0,4,12,5,0,0,2,12,2,29,0 -15094,10,0,0,9,4,5,8,0,4,1,9,18,1,41,1 -15095,2,6,0,15,2,1,13,1,2,0,13,14,1,7,0 -15096,9,2,0,3,2,3,6,1,3,0,8,20,4,2,0 -15097,6,4,0,8,2,5,9,0,3,1,18,0,4,29,1 -15098,8,5,0,4,2,5,9,3,0,1,5,2,2,38,0 -15099,6,2,0,12,0,5,8,0,4,0,4,0,1,19,0 -15100,1,2,0,3,2,1,0,3,2,0,5,6,4,16,0 -15101,0,6,0,0,6,3,6,0,0,1,17,2,0,23,0 -15102,3,3,0,1,2,1,8,5,3,1,15,7,3,0,1 -15103,4,5,0,0,0,3,1,2,3,1,13,13,4,31,0 -15104,5,7,0,15,4,1,9,2,2,1,10,2,2,3,0 -15105,2,6,0,1,3,4,12,0,0,0,13,9,0,35,0 -15106,1,0,0,12,6,2,7,0,0,1,1,19,2,16,1 -15107,1,3,0,13,6,6,5,2,0,1,16,2,1,30,0 -15108,9,2,0,11,0,4,6,2,4,0,2,0,0,10,0 -15109,4,3,0,2,3,1,14,5,1,1,13,14,5,22,0 -15110,1,7,0,6,4,3,10,4,4,0,13,2,5,32,0 -15111,2,2,0,1,1,0,9,5,1,1,15,6,5,34,0 -15112,2,5,0,14,0,3,12,3,3,1,8,11,0,0,0 -15113,0,7,0,6,1,0,3,3,1,0,17,8,0,7,0 -15114,1,7,0,15,1,1,0,4,4,1,13,12,4,38,0 -15115,2,5,0,2,6,6,6,1,3,1,13,9,2,24,0 -15116,5,2,0,0,6,5,11,2,1,1,2,3,1,7,0 -15117,2,8,0,7,6,4,10,2,0,1,0,6,0,18,0 -15118,1,3,0,3,3,5,10,3,3,1,13,6,5,25,0 -15119,2,8,0,5,4,0,6,4,0,0,10,11,2,41,0 -15120,2,7,0,11,2,3,9,1,2,0,2,2,5,22,0 -15121,1,6,0,13,2,6,2,1,4,0,4,18,0,7,0 -15122,6,2,0,12,0,0,8,2,0,0,4,11,3,30,0 -15123,4,8,0,5,3,3,0,2,2,1,17,4,4,28,0 -15124,1,5,0,1,0,1,9,3,0,1,0,9,0,21,0 -15125,10,6,0,5,5,0,1,2,2,1,0,4,1,18,0 -15126,2,2,0,3,0,0,6,0,0,1,12,15,3,23,0 -15127,5,6,0,1,2,5,10,4,0,1,2,16,3,10,0 -15128,2,3,0,13,1,0,11,3,2,0,6,11,5,14,0 -15129,0,1,0,1,3,0,9,4,0,0,12,16,2,40,0 -15130,2,6,0,15,1,5,6,0,2,1,2,2,2,1,0 -15131,1,5,0,0,0,4,13,0,0,1,2,15,3,9,0 -15132,4,8,0,10,1,2,14,3,3,1,13,16,0,24,0 -15133,6,5,0,12,6,1,6,4,3,1,18,5,0,35,1 -15134,3,6,0,0,5,4,11,3,2,1,0,13,1,8,0 -15135,2,8,0,5,0,6,11,2,0,0,15,15,1,35,0 
-15136,4,3,0,13,6,0,6,2,0,1,2,18,3,21,0 -15137,1,2,0,5,6,4,11,3,3,1,13,5,1,23,0 -15138,8,6,0,4,4,3,1,5,4,1,14,19,0,12,1 -15139,0,8,0,9,0,5,3,3,3,0,12,2,1,24,0 -15140,0,2,0,15,1,0,0,1,1,0,13,2,0,24,0 -15141,4,8,0,12,0,2,6,1,0,0,2,2,4,27,0 -15142,9,1,0,14,4,2,4,4,0,1,1,10,4,41,1 -15143,1,4,0,0,3,6,13,3,2,1,8,18,5,2,0 -15144,0,4,0,8,0,0,7,0,3,0,17,4,3,14,0 -15145,10,6,0,3,0,0,14,3,2,0,17,16,0,22,0 -15146,8,0,0,7,4,1,6,1,3,0,15,2,0,18,0 -15147,1,0,0,3,4,6,5,3,0,1,8,15,0,14,0 -15148,7,7,0,8,2,1,10,0,3,1,12,15,1,24,1 -15149,4,8,0,15,1,6,8,0,2,1,1,20,0,9,0 -15150,6,4,0,2,2,6,0,5,4,1,5,13,4,35,1 -15151,0,8,0,11,1,4,10,3,0,0,18,7,3,20,1 -15152,5,8,0,4,0,0,3,1,1,1,4,2,3,36,0 -15153,3,0,0,0,0,4,7,2,1,0,6,4,2,13,0 -15154,9,6,0,8,0,4,7,1,0,1,4,9,5,9,0 -15155,0,7,0,6,0,0,5,3,2,1,6,6,1,36,0 -15156,10,1,0,9,1,0,6,3,0,1,2,18,4,39,0 -15157,4,1,0,9,6,6,1,5,0,1,14,19,1,17,1 -15158,10,8,0,2,2,0,14,0,2,0,8,11,1,32,0 -15159,1,6,0,12,5,0,5,1,1,1,9,15,0,28,1 -15160,1,1,0,15,1,3,0,0,3,0,18,14,2,21,1 -15161,0,7,0,10,2,3,3,4,4,1,13,11,4,7,0 -15162,8,7,0,1,0,2,0,0,3,0,13,3,1,0,0 -15163,5,5,0,5,3,1,9,0,0,0,0,2,4,31,0 -15164,2,0,0,4,5,1,10,5,3,0,14,2,0,30,0 -15165,0,0,0,3,3,4,1,2,0,1,2,11,4,22,0 -15166,6,0,0,4,3,4,5,4,2,1,14,19,4,12,1 -15167,4,6,0,4,0,6,2,1,0,0,6,18,2,22,0 -15168,4,5,0,12,4,2,11,1,3,1,16,19,0,5,1 -15169,2,2,0,13,2,3,11,5,2,1,8,0,1,25,0 -15170,6,4,0,5,0,6,8,1,4,1,13,13,3,27,0 -15171,2,1,0,13,4,4,8,0,1,1,4,13,1,29,0 -15172,2,5,0,7,3,4,8,3,0,1,3,15,4,21,1 -15173,0,7,0,9,0,5,9,3,1,1,13,4,0,41,0 -15174,6,0,0,5,1,4,4,4,0,0,13,11,1,0,0 -15175,8,7,0,12,0,0,3,3,2,1,11,2,3,30,0 -15176,5,8,0,6,0,4,12,3,1,0,13,11,0,39,0 -15177,2,1,0,1,2,3,6,0,2,0,17,6,2,24,0 -15178,2,2,0,12,5,4,8,2,0,1,13,16,0,13,0 -15179,1,4,0,15,2,2,14,0,3,1,6,14,4,24,1 -15180,4,1,0,5,0,6,13,0,1,0,12,2,4,33,0 -15181,0,4,0,13,6,0,2,4,4,0,17,18,0,22,0 -15182,3,3,0,7,3,2,11,5,2,1,5,8,4,40,1 -15183,3,4,0,6,2,6,11,0,0,1,2,11,2,25,0 -15184,0,8,0,4,4,4,3,2,4,1,0,0,0,18,0 -15185,10,5,0,7,5,4,14,2,0,1,13,18,1,25,0 
-15186,2,6,0,10,2,0,9,3,4,0,17,2,5,0,0 -15187,1,8,0,13,5,4,10,3,1,1,2,13,5,24,0 -15188,9,1,0,9,2,0,9,2,2,1,13,15,2,27,0 -15189,7,7,0,8,0,6,4,1,0,0,2,14,2,13,0 -15190,7,6,0,5,4,2,4,1,2,0,1,7,1,21,1 -15191,4,1,0,1,0,2,4,0,0,1,11,1,0,3,1 -15192,4,1,0,4,5,6,11,3,3,1,1,7,3,7,1 -15193,6,3,0,15,5,4,10,1,3,1,13,9,2,39,0 -15194,0,8,0,8,1,5,9,2,2,0,11,18,1,32,0 -15195,9,4,0,9,0,5,8,5,2,0,18,5,2,19,1 -15196,10,0,0,0,0,3,6,1,4,0,8,9,5,40,0 -15197,7,1,0,2,0,5,5,4,4,1,2,15,0,18,0 -15198,7,1,0,9,0,2,9,2,4,1,12,7,4,25,1 -15199,7,3,0,5,2,5,3,2,0,1,6,11,1,26,0 -15200,1,2,0,8,0,1,6,3,0,1,12,2,0,19,0 -15201,8,2,0,5,3,6,8,5,3,1,9,14,1,29,1 -15202,9,3,0,11,6,1,8,4,3,1,8,11,3,19,0 -15203,2,1,0,12,6,0,1,4,3,0,18,17,5,26,1 -15204,2,8,0,9,6,0,0,4,2,0,13,3,4,29,0 -15205,5,3,0,5,6,5,7,0,1,0,0,11,1,21,0 -15206,0,8,0,6,6,0,14,5,0,0,12,4,2,23,0 -15207,10,1,0,15,1,1,7,0,0,1,11,8,5,22,1 -15208,6,1,0,9,4,5,8,5,3,1,5,19,3,5,1 -15209,9,5,0,9,6,3,9,2,3,1,18,13,5,6,0 -15210,4,0,0,2,0,6,13,5,0,1,17,9,3,18,0 -15211,6,4,0,9,0,0,1,3,2,1,15,18,0,6,0 -15212,3,6,0,7,2,1,6,1,4,1,6,4,4,2,1 -15213,5,6,0,11,6,3,8,4,2,1,2,15,5,23,0 -15214,9,1,0,11,0,6,9,5,0,1,3,20,0,21,0 -15215,5,1,0,6,6,1,8,1,3,1,2,9,1,7,0 -15216,7,3,0,10,0,3,5,2,0,1,3,15,4,7,1 -15217,6,8,0,3,3,0,5,1,4,0,6,10,4,19,0 -15218,4,5,0,8,2,4,9,5,4,0,17,2,1,24,0 -15219,1,6,0,0,0,0,5,2,3,1,2,11,4,11,0 -15220,0,2,0,7,0,2,12,4,1,1,18,10,4,29,1 -15221,3,4,0,2,6,3,8,1,1,1,0,7,4,37,1 -15222,7,0,0,4,2,1,8,1,3,1,13,12,1,38,0 -15223,3,8,0,6,3,6,9,2,3,0,14,4,3,5,0 -15224,0,8,0,0,4,0,7,4,0,1,17,14,5,9,0 -15225,0,5,0,14,2,4,13,5,2,0,2,5,4,34,0 -15226,3,8,0,5,4,1,0,3,3,0,13,12,5,24,0 -15227,8,5,0,11,3,5,0,1,1,1,3,2,0,26,0 -15228,0,7,0,12,5,3,4,5,3,1,13,11,4,4,0 -15229,3,5,0,0,0,0,5,3,3,0,10,15,1,18,0 -15230,4,3,0,1,6,5,6,3,1,1,13,9,5,9,0 -15231,3,1,0,13,3,4,5,0,0,0,17,2,3,23,0 -15232,0,3,0,1,0,0,4,3,4,1,13,2,0,36,0 -15233,8,1,0,5,0,4,13,1,4,0,0,2,0,25,0 -15234,9,0,0,15,5,3,11,5,1,1,2,5,5,35,0 -15235,10,3,0,6,0,4,3,0,1,1,5,19,4,39,1 
-15236,6,0,0,2,4,5,5,3,2,1,18,7,5,16,1 -15237,0,7,0,3,0,3,9,3,1,1,4,11,0,22,0 -15238,2,7,0,0,0,0,0,1,3,1,6,2,1,24,0 -15239,2,0,0,10,0,0,3,3,2,1,4,2,0,20,0 -15240,1,2,0,14,0,0,5,1,3,0,0,15,4,31,0 -15241,0,6,0,1,6,3,8,3,4,1,16,2,4,38,0 -15242,9,1,0,10,6,3,3,5,4,0,1,7,5,18,1 -15243,0,4,0,5,4,0,14,2,3,1,14,1,2,29,1 -15244,3,1,0,0,4,6,12,3,2,0,4,0,0,20,0 -15245,5,6,0,2,6,1,14,1,3,0,8,6,3,21,0 -15246,5,1,0,0,4,6,5,5,2,1,18,10,0,9,1 -15247,3,2,0,11,1,0,14,2,4,0,16,13,1,9,0 -15248,0,5,0,9,6,5,5,2,1,1,8,2,4,29,0 -15249,8,6,0,4,0,2,8,3,2,1,4,6,4,9,0 -15250,4,0,0,8,4,1,1,5,1,0,12,7,5,32,1 -15251,10,7,0,4,1,6,14,4,0,0,13,2,4,24,0 -15252,0,6,0,0,6,6,14,5,1,1,0,18,1,32,0 -15253,9,4,0,13,3,0,0,2,4,0,13,2,4,20,0 -15254,5,2,0,5,1,3,0,2,1,1,16,10,3,8,1 -15255,3,3,0,14,3,3,12,0,1,0,4,2,5,21,0 -15256,2,1,0,8,4,2,10,4,3,0,0,2,4,25,0 -15257,7,0,0,10,5,1,10,0,1,0,3,17,5,9,1 -15258,5,5,0,8,6,1,14,5,1,1,7,7,3,11,1 -15259,4,4,0,8,5,0,0,3,3,0,2,9,2,27,0 -15260,1,3,0,3,1,5,7,4,3,0,15,2,1,20,0 -15261,2,6,0,5,6,4,12,0,3,0,13,13,3,14,0 -15262,8,3,0,6,2,3,7,2,2,1,13,5,2,38,0 -15263,0,4,0,9,2,0,10,4,4,0,15,13,4,19,0 -15264,9,7,0,14,2,5,12,4,0,1,3,11,0,13,0 -15265,5,2,0,8,0,5,4,0,1,0,17,4,3,35,0 -15266,3,5,0,6,2,1,7,1,4,1,18,5,4,21,1 -15267,6,1,0,1,0,2,7,5,1,1,8,9,2,5,0 -15268,10,8,0,3,1,6,12,4,3,0,13,6,1,27,0 -15269,4,6,0,12,1,0,6,2,3,0,8,0,4,38,0 -15270,1,0,0,7,2,5,8,4,0,1,13,15,0,40,0 -15271,7,8,0,3,3,0,7,4,0,0,7,11,1,7,0 -15272,4,3,0,8,5,4,7,4,3,0,13,2,3,0,0 -15273,8,3,0,8,6,5,1,2,0,0,7,6,0,29,0 -15274,0,3,0,1,0,5,2,0,1,0,0,17,3,4,0 -15275,2,6,0,8,6,0,8,1,2,1,10,4,1,10,0 -15276,6,5,0,9,3,3,3,4,1,0,2,18,0,20,0 -15277,1,3,0,10,0,1,1,0,0,0,17,6,2,34,0 -15278,1,4,0,4,6,1,6,5,0,1,14,6,1,7,0 -15279,3,6,0,7,0,5,9,2,3,1,13,12,5,24,0 -15280,2,4,0,2,0,0,11,4,0,1,11,0,2,35,0 -15281,1,3,0,11,0,3,8,0,4,1,0,2,2,9,0 -15282,10,6,0,11,6,0,5,4,2,0,2,14,5,35,0 -15283,1,0,0,10,5,5,1,3,0,0,11,2,4,7,0 -15284,2,7,0,15,5,3,1,3,3,1,8,6,5,26,0 -15285,5,7,0,11,6,4,8,3,1,0,4,2,4,28,0 
-15286,5,8,0,15,0,1,0,3,0,1,13,16,0,18,0 -15287,7,0,0,12,1,6,0,4,0,0,15,2,3,17,0 -15288,2,5,0,0,6,4,0,1,4,1,8,13,5,26,0 -15289,4,1,0,13,3,2,5,5,3,1,15,9,5,3,0 -15290,0,4,0,1,5,1,5,2,0,0,2,2,0,16,0 -15291,4,3,0,0,0,3,0,3,0,0,10,6,0,21,0 -15292,9,6,0,13,1,5,11,0,1,1,13,2,2,26,0 -15293,2,8,0,4,6,6,14,1,0,0,12,16,0,20,0 -15294,2,3,0,11,5,1,2,2,0,0,4,2,4,9,0 -15295,0,2,0,0,6,1,11,3,4,1,17,7,5,0,0 -15296,6,1,0,4,2,4,11,1,0,1,1,17,1,24,1 -15297,1,0,0,1,4,5,13,2,3,1,4,14,0,22,0 -15298,10,2,0,2,4,3,5,0,0,0,2,7,2,0,1 -15299,5,5,0,11,1,1,10,4,3,1,1,9,1,7,1 -15300,7,7,0,10,6,4,10,3,4,1,18,18,3,38,1 -15301,0,7,0,12,0,0,14,4,3,0,12,12,3,25,0 -15302,4,4,0,3,1,0,9,4,0,0,10,11,2,31,0 -15303,5,7,0,3,2,0,4,5,1,1,17,18,3,2,1 -15304,3,1,0,3,2,4,9,2,0,1,13,18,5,17,0 -15305,8,6,0,3,0,0,10,1,3,1,8,2,3,38,0 -15306,7,5,0,7,2,5,14,5,1,1,6,16,3,25,0 -15307,0,5,0,8,6,3,6,1,4,1,0,15,5,19,0 -15308,10,7,0,15,0,3,0,5,3,0,16,3,4,2,1 -15309,1,3,0,4,6,2,2,4,3,0,8,2,1,27,0 -15310,5,2,0,9,2,1,8,5,4,0,5,5,1,23,1 -15311,7,4,0,7,4,3,4,5,3,0,18,17,4,1,1 -15312,9,8,0,11,1,1,6,1,2,1,1,8,5,41,1 -15313,9,1,0,5,3,0,14,4,4,0,14,5,5,13,1 -15314,0,4,0,5,5,0,10,4,1,1,8,11,4,12,0 -15315,5,0,0,6,1,6,14,1,2,1,13,6,3,11,0 -15316,8,8,0,8,4,6,13,0,4,1,11,7,5,35,1 -15317,3,0,0,3,6,5,9,2,3,1,10,2,1,25,0 -15318,5,8,0,10,6,3,7,2,2,1,8,11,0,16,0 -15319,1,2,0,3,3,3,4,3,2,1,15,15,1,4,0 -15320,10,0,0,1,0,5,10,0,0,0,13,9,0,34,0 -15321,9,5,0,9,0,4,12,1,2,0,9,10,3,27,1 -15322,0,8,0,8,3,4,14,1,1,1,2,0,5,1,0 -15323,3,3,0,13,4,6,12,0,3,0,13,11,2,41,0 -15324,3,2,0,10,0,2,8,3,0,0,8,11,2,0,0 -15325,0,6,0,11,5,1,14,5,3,1,13,2,4,0,0 -15326,6,4,0,2,4,5,4,5,0,0,1,8,4,20,1 -15327,1,7,0,3,6,0,9,3,1,1,10,6,0,40,0 -15328,2,4,0,4,4,3,8,2,2,0,0,15,2,20,0 -15329,7,0,0,15,5,2,8,4,2,1,5,11,5,19,0 -15330,4,1,0,12,6,4,11,4,1,0,13,18,0,26,0 -15331,4,2,0,15,0,4,14,2,1,0,2,0,1,5,0 -15332,5,4,0,4,6,3,4,0,0,1,9,5,3,19,1 -15333,7,1,0,14,2,4,8,2,2,1,15,16,2,21,1 -15334,5,6,0,13,6,6,8,3,3,0,1,18,5,37,0 -15335,8,2,0,10,4,4,7,0,0,0,6,11,2,36,0 
-15336,9,1,0,4,2,4,12,4,3,0,18,10,5,21,1 -15337,5,2,0,11,0,1,14,5,4,0,0,1,4,26,1 -15338,0,6,0,2,5,0,14,0,0,1,10,9,2,31,0 -15339,2,0,0,4,3,0,9,4,0,0,2,12,3,9,0 -15340,10,1,0,14,4,5,5,1,2,1,18,16,1,9,1 -15341,3,8,0,10,5,6,1,2,2,1,2,2,0,38,0 -15342,5,6,0,3,0,2,12,2,0,1,6,11,0,9,0 -15343,4,2,0,13,1,4,8,4,0,1,8,9,0,39,0 -15344,4,3,0,7,2,2,2,0,3,0,17,9,1,25,0 -15345,6,5,0,13,3,2,6,5,0,1,5,1,3,13,1 -15346,5,8,0,3,6,5,5,3,4,0,3,17,2,34,0 -15347,6,5,0,11,4,6,6,2,4,0,3,8,2,21,1 -15348,7,5,0,0,4,4,10,2,1,1,13,16,3,4,0 -15349,8,0,0,7,2,0,9,1,1,1,7,18,4,39,0 -15350,3,8,0,6,1,6,0,0,4,0,2,9,4,41,0 -15351,8,4,0,14,5,4,13,3,2,1,18,15,2,7,1 -15352,0,0,0,15,4,0,7,4,2,1,0,13,0,33,0 -15353,5,2,0,0,3,4,9,1,2,1,16,2,0,33,0 -15354,4,8,0,5,2,4,10,3,3,0,14,2,1,32,0 -15355,0,3,0,10,4,0,5,1,4,1,2,0,5,8,0 -15356,3,5,0,1,3,1,3,5,2,0,3,19,1,11,1 -15357,7,4,0,13,0,1,13,0,1,0,18,3,3,0,1 -15358,5,7,0,8,5,2,8,3,0,1,8,9,1,24,0 -15359,5,5,0,0,2,4,4,4,3,0,8,10,4,1,1 -15360,2,8,0,13,3,3,3,0,3,0,13,6,2,5,0 -15361,10,0,0,0,2,5,11,3,3,0,2,11,0,16,0 -15362,10,6,0,13,2,2,2,1,4,0,9,0,4,23,1 -15363,10,5,0,4,3,2,5,1,0,0,13,12,1,0,0 -15364,2,5,0,11,3,4,0,1,3,1,17,13,1,34,0 -15365,1,5,0,5,1,0,12,3,0,0,17,11,5,24,0 -15366,0,6,0,12,6,2,9,0,0,0,2,2,4,19,0 -15367,0,2,0,5,6,3,5,4,2,0,10,15,5,38,0 -15368,7,8,0,14,4,5,1,1,0,1,1,3,1,23,1 -15369,1,7,0,11,5,2,8,1,3,0,17,11,0,7,0 -15370,9,5,0,2,6,5,7,2,2,0,17,15,5,34,0 -15371,2,0,0,12,5,0,12,0,4,0,11,20,4,12,0 -15372,10,1,0,14,0,6,9,3,1,1,13,11,1,20,0 -15373,2,4,0,1,0,2,9,1,3,1,2,18,2,37,0 -15374,10,3,0,2,0,2,5,5,3,1,3,5,4,41,1 -15375,1,1,0,14,6,5,9,5,2,1,2,15,3,27,0 -15376,7,1,0,4,4,1,0,5,3,1,18,12,1,12,1 -15377,5,1,0,9,4,2,7,5,0,1,3,7,5,2,1 -15378,10,6,0,11,2,6,3,1,4,1,10,18,0,35,0 -15379,0,1,0,10,1,4,3,5,2,0,10,13,4,33,0 -15380,1,2,0,12,6,1,2,3,2,1,4,15,0,27,0 -15381,1,0,0,6,1,2,12,0,2,1,1,10,3,3,1 -15382,0,4,0,5,6,4,0,2,1,0,5,13,0,22,0 -15383,2,6,0,3,0,5,9,4,1,0,8,13,5,24,0 -15384,2,0,0,9,0,5,12,2,1,0,13,6,4,14,0 -15385,5,7,0,0,0,3,10,2,1,1,12,2,0,32,0 
-15386,0,6,0,4,1,3,3,0,4,0,13,14,4,29,0 -15387,5,1,0,11,3,0,10,0,0,1,5,7,1,8,1 -15388,7,2,0,2,5,4,10,5,1,0,6,16,5,0,0 -15389,6,4,0,3,3,6,9,3,2,1,2,11,5,34,0 -15390,0,8,0,10,5,1,6,2,2,0,13,2,1,16,0 -15391,2,2,0,6,0,3,3,4,2,0,13,11,3,11,0 -15392,8,1,0,5,5,2,4,5,3,1,18,4,3,22,1 -15393,0,0,0,12,6,4,2,4,1,0,2,9,0,33,0 -15394,8,5,0,9,4,1,2,5,0,1,14,1,2,26,1 -15395,5,8,0,0,0,6,4,1,0,1,8,4,2,14,0 -15396,1,4,0,7,5,5,7,5,1,1,0,13,1,20,0 -15397,6,2,0,6,1,5,5,0,3,0,12,2,5,18,0 -15398,2,2,0,9,1,0,0,4,1,1,8,9,2,38,0 -15399,2,0,0,11,6,2,8,5,1,1,13,11,0,7,0 -15400,3,2,0,5,4,1,13,3,4,1,16,7,4,22,1 -15401,0,7,0,1,4,1,7,5,4,1,16,7,3,25,1 -15402,2,4,0,11,6,4,9,2,2,0,13,2,4,40,0 -15403,6,3,0,1,2,4,7,2,3,0,13,15,4,17,0 -15404,7,4,0,15,4,1,12,1,3,1,14,17,5,25,1 -15405,4,3,0,11,0,0,4,3,0,0,17,1,4,38,0 -15406,0,6,0,14,5,0,1,3,4,0,4,6,1,5,0 -15407,4,5,0,14,3,1,13,5,4,1,14,1,5,13,1 -15408,5,2,0,0,6,1,8,2,4,1,13,15,0,33,0 -15409,10,6,0,3,3,0,13,2,3,1,2,5,0,13,0 -15410,6,3,0,13,0,5,8,2,2,0,2,18,5,38,0 -15411,5,6,0,13,1,4,9,3,0,1,2,2,0,28,0 -15412,9,5,0,10,2,1,6,5,4,1,14,3,3,16,1 -15413,5,3,0,4,3,4,6,1,3,0,2,2,4,2,0 -15414,6,7,0,5,4,4,9,4,2,1,2,11,0,26,0 -15415,0,8,0,13,2,6,9,4,0,0,2,3,0,10,0 -15416,1,5,0,1,3,2,14,1,4,0,12,6,5,6,0 -15417,1,2,0,6,3,4,9,2,0,1,6,11,0,11,0 -15418,3,6,0,3,0,4,9,3,0,1,17,9,2,29,0 -15419,8,7,0,12,0,6,2,4,2,1,13,11,0,8,0 -15420,2,7,0,13,2,4,8,1,3,0,2,11,4,6,0 -15421,7,7,0,3,0,4,12,5,4,1,2,9,2,39,0 -15422,5,5,0,13,6,0,6,4,1,0,15,11,2,23,0 -15423,0,4,0,1,0,5,0,4,2,0,12,2,3,23,0 -15424,2,7,0,12,0,5,7,1,4,0,17,2,4,7,0 -15425,3,2,0,9,6,6,5,3,2,1,16,4,1,34,0 -15426,2,1,0,15,2,4,12,1,0,0,17,3,0,40,0 -15427,6,1,0,0,4,5,0,0,2,1,18,14,0,32,1 -15428,5,5,0,5,4,0,12,1,2,1,14,2,0,30,0 -15429,8,4,0,13,4,3,9,2,0,0,10,4,3,28,0 -15430,6,6,0,5,6,3,5,0,1,0,4,11,5,24,0 -15431,7,5,0,15,3,3,6,4,4,1,1,13,5,12,1 -15432,3,6,0,12,2,3,14,0,4,1,6,13,5,8,1 -15433,6,0,0,6,5,0,8,0,1,0,3,2,1,40,0 -15434,4,8,0,5,0,0,1,1,2,0,1,12,1,8,1 -15435,2,6,0,9,3,1,6,1,4,0,11,0,1,13,0 
-15436,0,4,0,10,5,5,11,0,0,0,17,16,4,23,0 -15437,6,0,0,8,5,5,2,0,0,0,7,5,0,20,1 -15438,0,1,0,9,0,3,3,1,4,1,3,14,5,19,0 -15439,7,2,0,0,0,1,14,2,3,1,7,7,1,13,1 -15440,2,5,0,3,6,5,7,2,4,0,2,11,2,26,0 -15441,0,6,0,14,6,6,6,1,2,1,2,6,5,27,0 -15442,10,2,0,3,0,5,0,2,0,1,4,11,4,13,0 -15443,1,3,0,1,1,1,14,1,2,0,16,10,3,24,1 -15444,7,8,0,8,1,2,1,1,4,1,4,13,2,13,0 -15445,1,6,0,1,5,6,5,0,0,0,18,11,0,28,0 -15446,1,2,0,9,0,4,0,0,0,1,8,2,0,33,0 -15447,1,4,0,0,6,6,4,2,0,0,12,7,1,30,1 -15448,10,1,0,15,0,6,6,0,0,0,4,11,0,30,0 -15449,2,8,0,14,3,5,0,3,2,0,2,11,3,12,0 -15450,6,3,0,4,1,4,5,4,1,0,7,4,2,6,0 -15451,1,4,0,5,6,5,8,3,2,1,2,9,0,0,0 -15452,10,1,0,1,4,4,10,0,1,1,4,1,2,12,1 -15453,0,3,0,3,1,6,3,5,0,0,2,6,2,10,0 -15454,3,3,0,0,1,1,9,2,0,1,2,2,5,4,0 -15455,0,5,0,8,6,5,2,3,0,0,14,11,1,25,0 -15456,5,1,0,13,4,2,9,4,4,1,13,9,2,0,0 -15457,5,7,0,15,2,1,0,1,0,0,8,11,4,27,0 -15458,0,4,0,8,5,0,8,4,0,0,2,20,0,39,0 -15459,5,2,0,12,4,2,4,1,4,1,8,4,1,30,0 -15460,5,7,0,14,3,4,3,5,1,0,18,17,4,28,1 -15461,1,2,0,10,1,5,9,2,1,1,0,2,1,40,0 -15462,2,4,0,11,3,6,0,1,0,1,16,3,2,28,1 -15463,9,2,0,9,6,1,10,2,1,0,13,9,0,35,0 -15464,1,8,0,0,0,2,0,5,3,1,8,11,3,38,0 -15465,2,7,0,10,0,2,8,3,1,0,2,11,3,30,0 -15466,3,8,0,0,0,6,6,5,0,1,6,0,0,11,0 -15467,0,6,0,11,0,2,2,1,0,1,8,2,2,39,0 -15468,2,8,0,12,0,0,4,1,0,0,15,1,2,3,0 -15469,7,1,0,3,0,0,0,3,1,1,2,5,4,6,1 -15470,2,4,0,14,3,0,1,1,0,1,8,9,4,32,0 -15471,10,4,0,14,0,3,8,4,3,1,13,2,2,10,0 -15472,3,4,0,4,5,0,8,1,0,1,13,13,0,28,0 -15473,1,4,0,0,1,0,14,2,0,0,12,11,1,20,0 -15474,7,1,0,9,2,1,3,5,1,1,6,14,2,11,1 -15475,7,1,0,14,6,2,1,0,3,0,18,1,2,18,1 -15476,0,8,0,14,4,3,4,5,0,0,17,11,1,0,0 -15477,2,3,0,11,5,1,14,2,3,1,12,9,4,21,0 -15478,3,0,0,8,1,0,11,0,2,1,17,6,4,25,0 -15479,0,0,0,11,6,6,2,3,2,0,2,15,1,35,0 -15480,7,6,0,14,4,0,14,1,0,1,13,3,2,27,0 -15481,1,7,0,9,6,3,9,2,0,0,15,20,1,19,0 -15482,4,6,0,5,6,4,4,5,0,0,13,13,2,34,0 -15483,6,3,0,12,1,1,12,0,3,0,1,17,2,35,1 -15484,5,7,0,1,1,4,9,2,1,1,13,6,4,6,0 -15485,0,8,0,14,4,5,9,1,2,1,13,18,1,36,0 
-15486,1,6,0,13,6,5,2,4,2,0,8,18,1,29,0 -15487,3,5,0,4,5,6,10,2,2,0,8,3,1,22,0 -15488,6,6,0,15,0,4,2,3,2,0,13,17,4,25,0 -15489,10,8,0,8,1,5,3,3,4,0,2,18,3,33,0 -15490,3,5,0,10,0,5,11,4,2,0,10,15,0,7,0 -15491,6,2,0,3,0,4,3,2,2,1,10,2,5,32,0 -15492,1,8,0,11,0,0,0,1,2,1,4,2,1,8,0 -15493,6,4,0,13,4,0,0,4,1,1,17,11,2,30,0 -15494,0,2,0,3,1,5,7,0,0,0,11,11,2,19,0 -15495,2,0,0,1,0,1,1,5,1,1,0,18,5,35,0 -15496,0,5,0,11,0,0,8,4,2,0,8,17,4,23,0 -15497,6,7,0,14,1,2,6,2,1,1,9,10,5,26,1 -15498,10,5,0,8,3,3,4,2,2,1,1,17,2,11,1 -15499,2,5,0,0,1,0,4,2,2,1,13,6,4,31,0 -15500,6,6,0,6,0,0,6,2,3,1,13,15,0,10,0 -15501,2,0,0,1,3,3,6,2,1,0,11,11,4,13,0 -15502,5,5,0,1,6,6,3,3,1,1,8,7,5,0,0 -15503,6,7,0,9,5,5,1,1,2,0,12,6,4,19,0 -15504,2,6,0,10,5,0,6,5,3,0,8,9,0,37,0 -15505,6,0,0,5,1,5,2,3,3,1,2,11,3,7,0 -15506,7,1,0,4,0,1,11,1,2,0,17,9,0,27,0 -15507,3,8,0,11,0,5,12,4,3,1,16,18,1,24,0 -15508,0,8,0,4,0,6,9,1,3,1,9,18,2,7,1 -15509,0,6,0,11,4,5,3,3,0,1,6,4,3,29,0 -15510,7,2,0,13,6,1,2,2,0,0,13,0,2,8,0 -15511,3,8,0,3,0,4,14,2,3,0,6,15,0,37,0 -15512,2,7,0,8,6,3,0,1,4,0,2,4,3,19,0 -15513,9,1,0,13,4,0,7,3,2,0,2,6,1,16,0 -15514,0,8,0,13,4,4,11,1,3,1,2,13,2,35,0 -15515,1,0,0,0,1,3,11,3,1,0,16,10,0,4,0 -15516,5,1,0,7,1,1,12,3,4,1,9,10,2,8,1 -15517,5,7,0,1,3,0,9,4,2,0,7,16,0,18,0 -15518,2,3,0,0,2,0,0,3,2,1,13,9,2,39,0 -15519,0,0,0,0,4,1,3,1,4,1,17,11,3,40,0 -15520,1,5,0,1,6,1,3,2,4,0,2,0,0,29,0 -15521,3,6,0,10,2,3,13,3,4,1,1,20,2,19,0 -15522,1,7,0,7,0,3,5,1,3,1,17,2,2,33,0 -15523,1,5,0,15,0,5,4,1,4,1,6,11,3,30,0 -15524,1,0,0,5,2,3,13,3,1,1,5,3,2,37,1 -15525,5,4,0,13,1,2,10,0,1,1,5,16,0,32,1 -15526,9,1,0,3,2,5,5,5,0,1,18,17,5,14,1 -15527,6,2,0,12,0,5,4,2,1,0,2,11,1,38,0 -15528,5,0,0,6,3,4,4,2,4,1,14,6,1,8,0 -15529,3,5,0,3,6,4,10,2,4,0,14,11,1,41,0 -15530,2,7,0,9,6,6,8,1,0,0,13,11,1,39,0 -15531,2,4,0,14,5,0,11,5,1,0,10,10,1,10,1 -15532,7,5,0,0,1,1,0,4,0,0,15,15,0,3,0 -15533,7,4,0,3,0,1,9,4,1,0,13,2,2,3,0 -15534,3,8,0,4,5,4,2,3,4,0,13,8,1,24,0 -15535,4,2,0,13,6,0,4,0,0,0,18,14,1,13,1 
-15536,1,0,0,2,6,4,4,2,1,1,3,7,2,19,1 -15537,10,5,0,12,5,2,8,3,0,0,12,16,1,11,0 -15538,7,8,0,15,3,0,9,1,4,1,8,2,4,38,0 -15539,5,1,0,6,6,0,8,0,4,0,17,2,4,19,0 -15540,1,2,0,14,6,5,11,2,1,0,14,7,0,8,1 -15541,6,8,0,2,2,1,5,4,4,0,7,7,5,7,1 -15542,5,2,0,8,6,4,0,0,4,0,8,9,1,7,0 -15543,8,4,0,14,4,1,11,5,1,1,8,19,4,19,1 -15544,7,5,0,6,5,1,3,0,3,0,15,14,1,12,0 -15545,3,3,0,6,6,6,11,5,1,0,6,20,0,3,0 -15546,0,5,0,13,2,2,7,0,0,0,15,11,0,32,0 -15547,2,8,0,0,5,4,7,2,0,0,15,11,5,27,0 -15548,5,4,0,0,4,2,3,4,0,1,1,8,5,8,1 -15549,10,8,0,1,4,5,4,5,2,1,14,6,5,3,1 -15550,8,1,0,3,5,6,6,2,0,0,3,15,2,6,0 -15551,2,0,0,2,5,0,6,5,2,0,2,0,5,35,0 -15552,2,3,0,7,0,5,6,0,1,1,2,20,5,30,0 -15553,1,0,0,11,6,3,10,5,0,0,1,10,2,26,1 -15554,4,1,0,5,1,0,4,4,0,0,9,5,1,2,1 -15555,0,6,0,8,6,4,11,4,4,0,11,11,5,31,0 -15556,6,1,0,9,5,3,5,2,1,1,7,2,5,37,0 -15557,6,4,0,11,2,4,8,4,2,0,5,0,3,21,1 -15558,1,6,0,13,0,4,8,5,1,0,0,2,1,26,0 -15559,0,4,0,7,2,5,5,2,1,0,8,6,0,4,0 -15560,0,8,0,3,1,1,13,1,0,1,13,15,1,26,0 -15561,10,0,0,8,1,3,5,3,1,1,10,17,2,17,1 -15562,9,4,0,11,1,6,6,1,3,1,8,9,5,8,0 -15563,8,6,0,5,1,4,14,2,1,1,1,16,3,5,1 -15564,0,3,0,11,3,0,9,3,2,0,16,19,1,36,0 -15565,2,3,0,3,0,0,7,3,0,0,6,15,3,19,0 -15566,3,0,0,9,2,3,6,0,4,0,17,18,2,2,0 -15567,1,3,0,7,5,5,9,4,4,1,13,2,1,23,0 -15568,0,4,0,6,6,1,0,4,3,0,13,15,0,35,0 -15569,0,2,0,8,0,5,10,4,2,0,15,9,0,2,0 -15570,5,7,0,2,2,1,4,1,0,0,13,9,0,33,0 -15571,6,2,0,11,1,0,8,0,0,1,13,9,3,26,0 -15572,0,7,0,8,6,5,2,5,0,0,8,15,0,8,0 -15573,0,6,0,6,1,0,9,4,3,1,10,11,5,11,0 -15574,1,0,0,15,6,4,2,4,2,1,6,18,0,30,0 -15575,5,2,0,5,4,3,8,1,2,1,5,9,1,27,0 -15576,8,2,0,2,4,1,10,1,3,1,2,11,2,38,0 -15577,0,5,0,1,0,5,10,3,3,1,0,2,0,14,0 -15578,2,5,0,15,2,6,6,0,1,0,0,13,3,27,0 -15579,10,2,0,13,0,0,1,0,3,1,13,13,1,27,0 -15580,8,4,0,0,1,4,2,2,0,1,15,9,3,9,0 -15581,9,7,0,3,3,4,3,1,3,0,2,18,2,17,0 -15582,6,6,0,2,2,4,3,4,3,0,13,11,1,40,0 -15583,1,4,0,0,0,0,3,2,4,0,2,15,1,11,0 -15584,7,1,0,9,5,4,14,5,3,0,9,9,3,5,1 -15585,9,2,0,13,4,3,11,2,3,1,7,2,3,39,0 -15586,4,8,0,0,5,0,9,2,4,0,13,2,3,3,0 
-15587,1,8,0,14,1,4,9,2,4,0,13,2,4,6,0 -15588,6,2,0,14,1,2,12,5,1,1,9,19,3,11,1 -15589,5,6,0,11,2,1,6,0,1,1,6,2,0,19,0 -15590,1,1,0,5,0,2,13,1,2,0,13,13,5,25,0 -15591,0,2,0,4,0,1,14,5,4,1,13,11,0,5,0 -15592,0,3,0,12,6,4,6,3,4,0,2,11,1,19,0 -15593,2,0,0,6,0,3,9,4,4,0,4,14,4,1,0 -15594,6,7,0,14,3,2,11,0,1,1,18,14,5,30,1 -15595,6,0,0,14,6,5,7,2,0,0,6,2,3,14,0 -15596,3,3,0,4,3,2,12,4,1,1,18,0,0,27,1 -15597,2,6,0,0,1,6,6,4,0,0,2,13,3,10,0 -15598,0,6,0,11,0,1,11,3,2,0,2,2,5,26,0 -15599,4,6,0,15,0,0,3,4,3,1,15,8,0,39,0 -15600,0,8,0,4,6,0,5,0,2,1,17,2,0,6,0 -15601,7,3,0,2,5,0,8,5,3,1,18,3,3,18,1 -15602,3,0,0,11,6,6,8,5,1,1,18,6,2,38,1 -15603,3,7,0,5,5,5,8,4,3,0,5,15,0,8,0 -15604,3,7,0,0,2,4,8,3,1,0,7,2,5,19,0 -15605,0,6,0,6,0,2,10,0,1,1,13,19,1,3,0 -15606,0,3,0,15,5,5,12,2,0,0,4,9,5,31,0 -15607,1,4,0,8,0,2,5,4,0,1,17,16,0,23,0 -15608,3,3,0,6,2,5,1,1,0,1,13,13,2,33,0 -15609,8,5,0,14,1,1,11,5,3,0,18,5,4,11,1 -15610,1,6,0,1,5,0,1,3,1,0,16,4,5,32,0 -15611,0,3,0,12,6,6,12,1,4,1,10,11,0,7,0 -15612,9,2,0,1,1,0,9,1,0,1,0,6,0,6,0 -15613,2,4,0,9,6,0,13,3,1,1,13,15,0,12,0 -15614,2,1,0,13,6,4,13,1,2,1,7,10,4,17,1 -15615,0,6,0,4,0,3,0,3,2,1,2,6,0,26,0 -15616,9,7,0,3,6,1,4,0,1,0,17,18,4,34,0 -15617,7,0,0,15,2,5,7,0,1,1,5,9,2,33,1 -15618,10,8,0,12,3,0,5,3,3,1,18,19,2,8,1 -15619,2,1,0,5,5,2,2,5,3,1,18,10,2,17,1 -15620,10,5,0,14,1,0,13,1,3,1,11,12,5,41,1 -15621,1,8,0,11,0,1,13,5,2,0,8,15,2,41,0 -15622,10,7,0,15,6,5,13,2,1,1,4,1,3,11,1 -15623,9,0,0,12,6,0,5,2,0,0,2,11,3,39,0 -15624,5,3,0,3,0,0,5,4,2,1,17,16,1,12,0 -15625,0,2,0,6,0,5,10,0,3,1,4,11,5,6,0 -15626,9,4,0,12,6,6,6,5,1,1,15,15,5,29,0 -15627,9,0,0,15,3,3,1,5,2,1,1,3,3,16,1 -15628,2,2,0,2,5,3,3,4,0,1,2,9,5,7,0 -15629,1,6,0,11,3,4,9,4,2,0,0,19,1,39,0 -15630,1,0,0,3,6,4,8,2,3,0,8,2,3,33,0 -15631,3,0,0,9,2,1,12,1,3,1,7,20,2,38,1 -15632,2,2,0,6,5,4,8,3,3,1,15,20,5,4,0 -15633,6,4,0,3,0,6,11,4,0,0,2,6,5,11,0 -15634,3,2,0,13,3,0,6,1,4,1,13,4,3,28,0 -15635,5,8,0,6,1,0,3,2,0,0,3,14,1,33,1 -15636,0,6,0,0,5,5,0,2,3,0,4,2,5,0,0 
-15637,2,8,0,15,0,1,5,1,0,1,4,2,5,7,0 -15638,0,7,0,9,6,6,8,0,0,1,2,4,4,37,0 -15639,3,8,0,6,2,3,9,4,1,0,17,15,1,32,0 -15640,6,5,0,2,6,0,12,2,0,0,17,11,0,41,0 -15641,3,5,0,10,6,1,11,5,2,0,5,18,0,34,0 -15642,0,2,0,0,5,5,5,3,1,1,0,2,5,14,0 -15643,3,7,0,7,6,3,10,3,4,0,1,5,5,30,1 -15644,10,1,0,12,5,3,8,3,3,1,14,2,4,30,0 -15645,1,2,0,3,0,0,11,2,0,0,13,4,0,11,0 -15646,2,0,0,7,5,2,3,5,1,0,13,9,2,41,0 -15647,8,0,0,0,4,5,13,5,1,1,11,7,3,7,1 -15648,5,6,0,3,4,0,12,3,1,1,8,11,2,41,0 -15649,0,1,0,4,6,4,7,3,1,1,2,9,4,13,0 -15650,0,3,0,3,6,6,2,0,0,1,13,6,3,4,0 -15651,4,3,0,2,0,6,0,2,3,0,15,8,2,37,0 -15652,8,8,0,4,2,6,10,0,4,1,7,14,4,39,1 -15653,1,8,0,6,6,4,5,4,0,0,5,2,5,17,0 -15654,9,4,0,0,6,1,4,0,4,0,16,11,5,38,0 -15655,2,1,0,7,4,3,10,4,4,0,2,3,0,4,0 -15656,2,6,0,10,2,3,11,1,1,0,0,9,2,24,0 -15657,6,4,0,12,4,6,2,3,1,1,17,4,4,34,0 -15658,0,7,0,13,0,0,3,2,2,1,4,12,2,0,0 -15659,7,2,0,5,6,5,8,2,1,0,8,15,4,33,0 -15660,10,8,0,13,6,4,8,4,0,1,13,16,5,16,0 -15661,10,4,0,1,1,3,9,4,2,1,13,11,1,16,0 -15662,3,7,0,5,3,6,4,5,4,1,18,18,0,35,0 -15663,0,1,0,15,5,3,10,4,2,1,13,2,1,37,0 -15664,2,7,0,0,2,4,1,2,2,0,0,12,4,20,0 -15665,6,4,0,13,1,0,1,1,4,0,13,11,2,20,0 -15666,8,8,0,6,5,5,0,4,2,0,2,11,2,0,0 -15667,9,6,0,15,1,3,7,2,3,0,13,11,2,37,0 -15668,0,2,0,14,6,0,10,0,2,0,2,1,1,19,0 -15669,9,4,0,13,4,0,0,1,2,0,16,11,0,26,0 -15670,0,2,0,0,0,1,1,5,4,1,11,20,2,0,0 -15671,10,1,0,10,4,5,1,5,2,1,1,10,2,9,1 -15672,9,0,0,6,5,0,5,2,2,0,13,2,3,24,0 -15673,6,6,0,9,0,4,1,0,3,0,8,13,3,33,0 -15674,2,4,0,11,1,3,9,3,2,1,13,7,0,4,0 -15675,4,2,0,0,3,3,9,0,1,1,9,12,3,38,1 -15676,4,3,0,13,6,5,6,2,2,1,2,8,5,21,0 -15677,1,7,0,7,2,0,8,3,2,0,2,11,3,9,0 -15678,10,4,0,4,0,3,0,4,1,0,2,2,0,6,0 -15679,9,8,0,5,0,5,1,0,1,0,5,6,1,2,0 -15680,8,0,0,0,6,2,13,4,2,1,5,17,5,16,1 -15681,1,0,0,15,5,6,3,0,0,0,13,0,5,38,0 -15682,1,6,0,5,3,1,10,0,0,1,9,17,4,17,1 -15683,9,3,0,7,2,0,3,4,1,1,5,13,4,40,1 -15684,7,4,0,5,6,5,3,3,2,0,2,16,0,1,0 -15685,9,1,0,4,6,5,5,1,2,1,4,4,5,18,0 -15686,0,0,0,3,3,0,8,5,0,1,10,20,3,1,0 
-15687,0,7,0,5,5,2,0,5,1,1,1,2,0,20,0 -15688,0,7,0,3,2,3,4,0,4,1,18,8,4,38,1 -15689,5,7,0,15,6,0,9,3,0,1,8,20,1,33,0 -15690,5,6,0,11,4,0,8,3,2,1,18,17,1,9,1 -15691,9,1,0,10,5,3,10,3,0,0,8,2,2,26,0 -15692,6,5,0,7,0,1,3,2,4,1,5,10,5,9,1 -15693,2,1,0,13,2,3,2,5,0,0,13,2,0,27,0 -15694,0,2,0,0,6,5,0,4,2,1,2,11,0,36,0 -15695,9,1,0,4,6,3,10,2,0,1,11,0,5,14,1 -15696,8,6,0,8,0,2,9,4,2,0,0,0,2,18,0 -15697,8,6,0,10,2,5,7,3,1,1,5,8,4,9,1 -15698,4,7,0,13,5,2,0,3,4,1,13,9,0,34,0 -15699,2,0,0,3,4,2,8,3,4,1,2,6,3,38,0 -15700,4,8,0,9,6,5,5,0,0,0,8,1,1,13,0 -15701,10,2,0,2,6,4,10,0,4,1,12,5,1,1,1 -15702,9,7,0,7,0,4,11,0,2,0,0,4,1,35,0 -15703,1,2,0,7,4,3,3,4,1,1,11,14,2,8,1 -15704,7,8,0,5,2,2,5,3,0,1,0,6,5,27,0 -15705,4,6,0,8,3,2,2,4,0,1,9,7,2,11,1 -15706,3,8,0,13,3,3,10,1,2,1,4,8,4,9,1 -15707,1,4,0,15,5,4,9,0,0,1,2,19,0,34,0 -15708,0,1,0,11,0,0,7,3,1,0,8,11,5,16,0 -15709,4,6,0,3,0,4,1,1,4,1,2,6,3,4,0 -15710,10,5,0,15,6,5,9,5,2,0,18,0,4,9,1 -15711,4,0,0,8,3,2,0,3,4,0,0,18,3,30,0 -15712,6,5,0,8,6,1,10,0,3,0,0,2,1,37,0 -15713,6,8,0,7,1,2,9,1,0,1,15,20,2,13,0 -15714,2,5,0,5,4,5,8,5,3,0,2,11,3,32,0 -15715,5,7,0,10,5,0,0,1,1,0,8,2,4,18,0 -15716,9,3,0,2,6,1,7,5,2,1,4,12,3,3,0 -15717,2,1,0,2,6,2,5,3,0,1,5,10,3,39,1 -15718,2,8,0,2,1,5,2,1,0,0,4,19,3,34,0 -15719,7,5,0,15,2,5,7,2,0,1,14,17,2,13,1 -15720,10,2,0,11,1,4,14,2,2,1,4,6,2,7,0 -15721,3,8,0,14,6,6,8,4,4,1,2,5,1,37,0 -15722,10,8,0,6,0,4,7,3,0,1,13,2,0,12,0 -15723,10,6,0,15,2,2,4,0,1,1,18,17,4,27,1 -15724,9,5,0,13,0,3,6,3,2,1,13,13,2,10,0 -15725,2,5,0,5,5,2,13,1,0,0,17,2,0,14,0 -15726,8,7,0,1,4,1,10,5,1,1,5,5,3,0,1 -15727,3,1,0,13,6,0,11,1,0,1,14,6,5,19,0 -15728,7,0,0,14,3,1,6,5,0,1,18,17,3,29,1 -15729,0,6,0,3,3,0,12,1,0,1,8,2,2,7,0 -15730,0,5,0,11,2,0,10,2,1,1,2,13,5,19,0 -15731,5,7,0,6,6,0,9,4,3,1,10,18,0,22,0 -15732,2,8,0,11,6,5,8,0,1,1,13,20,4,23,0 -15733,6,3,0,2,2,4,10,3,4,1,17,2,0,7,0 -15734,0,2,0,13,1,2,9,3,1,0,12,15,4,29,0 -15735,1,3,0,3,1,0,11,4,2,1,10,19,0,26,0 -15736,3,5,0,13,0,1,6,3,2,0,2,17,0,18,0 
-15737,4,6,0,5,5,2,3,3,3,1,10,11,2,7,0 -15738,2,4,0,9,0,1,9,4,2,0,0,5,2,5,0 -15739,1,2,0,6,2,6,0,4,0,0,14,14,1,13,0 -15740,2,7,0,12,5,6,8,2,0,1,8,2,2,13,0 -15741,7,3,0,6,4,6,7,3,3,1,4,11,0,7,0 -15742,4,0,0,8,5,1,10,4,2,1,18,17,2,1,1 -15743,0,1,0,11,1,0,10,4,3,1,2,12,2,31,0 -15744,3,6,0,0,6,5,5,2,3,1,4,12,0,21,0 -15745,0,0,0,11,0,3,0,0,0,0,8,8,0,14,0 -15746,10,2,0,0,3,1,2,5,3,1,18,10,5,11,1 -15747,7,0,0,4,6,3,0,2,2,1,17,13,4,37,0 -15748,4,6,0,0,1,3,2,5,4,1,18,14,5,12,1 -15749,8,2,0,7,4,2,14,0,3,0,1,18,0,17,1 -15750,2,6,0,13,3,2,4,4,3,1,17,4,5,25,0 -15751,3,0,0,5,0,4,0,4,2,0,17,20,1,23,0 -15752,2,8,0,12,3,1,8,5,2,1,13,15,3,32,0 -15753,7,7,0,2,2,0,7,0,2,1,5,13,5,6,0 -15754,1,2,0,15,6,3,0,4,0,0,11,4,2,34,0 -15755,2,2,0,13,6,4,5,3,2,0,13,12,1,28,0 -15756,3,7,0,9,6,0,7,1,0,0,13,11,0,27,0 -15757,3,3,0,1,3,4,5,1,4,0,8,2,3,22,0 -15758,8,6,0,1,1,0,0,3,3,0,13,11,2,34,0 -15759,0,6,0,0,6,5,3,2,0,0,10,15,3,12,0 -15760,4,2,0,5,5,0,4,0,0,0,13,8,2,36,0 -15761,0,6,0,0,0,5,6,4,3,1,5,4,4,8,0 -15762,9,1,0,9,2,5,12,0,1,1,7,5,4,8,1 -15763,0,7,0,4,0,0,7,2,1,0,13,4,0,4,0 -15764,5,8,0,14,6,0,11,0,2,0,2,2,0,25,0 -15765,3,2,0,14,2,3,2,1,0,0,13,4,4,16,0 -15766,9,4,0,8,0,3,3,2,1,1,8,9,3,29,0 -15767,0,2,0,12,6,6,10,4,0,1,8,15,3,7,0 -15768,4,2,0,2,0,6,11,3,0,1,18,7,1,40,1 -15769,2,5,0,13,4,3,0,2,3,1,17,3,2,20,0 -15770,0,7,0,3,5,2,1,5,4,0,6,2,0,26,0 -15771,5,6,0,3,0,4,8,4,2,1,13,10,5,36,0 -15772,4,8,0,5,6,4,9,1,2,0,8,20,1,3,0 -15773,1,5,0,13,6,6,1,3,2,0,2,0,0,7,0 -15774,2,1,0,11,4,5,2,3,3,1,7,19,2,17,1 -15775,0,7,0,3,0,1,2,0,2,1,0,13,3,17,0 -15776,3,5,0,15,1,6,0,4,3,1,2,6,2,21,0 -15777,4,6,0,5,3,0,7,4,3,0,4,0,0,6,0 -15778,1,6,0,8,0,0,7,4,0,1,12,3,0,30,0 -15779,2,3,0,15,4,0,3,2,0,1,0,11,4,5,0 -15780,10,8,0,9,2,0,8,4,2,0,2,9,1,28,0 -15781,8,4,0,15,4,3,14,3,3,0,12,5,2,19,1 -15782,6,2,0,7,2,0,8,5,2,0,6,2,4,10,0 -15783,6,6,0,13,1,5,9,3,2,0,10,15,1,31,0 -15784,0,3,0,2,1,0,4,1,1,0,14,0,2,14,0 -15785,3,3,0,12,2,5,1,2,0,0,0,12,2,26,0 -15786,0,7,0,3,5,0,5,2,0,1,2,0,2,32,0 
-15787,9,6,0,0,0,0,14,4,0,0,2,6,1,6,0 -15788,6,2,0,12,2,1,9,5,3,1,18,19,3,35,1 -15789,1,4,0,7,5,0,11,2,2,0,3,12,2,40,0 -15790,0,4,0,5,0,3,0,1,2,0,2,3,2,13,0 -15791,2,0,0,13,6,4,8,1,3,1,13,11,1,25,0 -15792,4,2,0,8,6,1,8,5,4,1,17,6,4,9,0 -15793,10,3,0,6,3,5,5,3,4,1,13,13,0,5,0 -15794,9,8,0,9,4,3,8,3,0,0,15,5,5,37,0 -15795,7,0,0,9,6,5,11,1,3,1,8,3,0,31,0 -15796,0,8,0,5,2,2,10,2,4,1,0,2,1,4,0 -15797,3,0,0,15,2,1,8,3,3,0,10,18,2,24,0 -15798,4,6,0,8,2,4,7,2,0,1,13,6,4,11,0 -15799,2,1,0,2,0,5,4,4,2,0,10,3,2,22,0 -15800,7,3,0,4,2,6,11,1,1,1,9,1,1,33,1 -15801,6,0,0,2,4,4,13,4,1,0,3,4,4,22,1 -15802,8,0,0,10,5,6,13,4,3,1,7,19,4,13,1 -15803,1,2,0,0,6,2,2,4,2,0,17,2,3,27,0 -15804,10,8,0,4,4,0,6,1,3,0,4,0,1,37,0 -15805,5,1,0,8,5,0,9,5,2,1,5,11,2,20,0 -15806,4,8,0,13,5,0,10,3,0,0,4,9,2,38,0 -15807,6,7,0,9,0,6,0,4,4,0,4,6,0,27,0 -15808,9,1,0,5,4,6,12,4,3,1,0,17,1,32,1 -15809,7,1,0,11,1,0,12,2,2,0,12,18,0,27,0 -15810,0,7,0,7,5,5,2,0,2,1,4,4,2,0,0 -15811,8,4,0,0,4,0,11,4,3,0,17,5,3,24,1 -15812,4,5,0,7,0,3,14,0,3,1,13,13,4,14,0 -15813,0,7,0,6,5,5,5,3,3,0,13,20,3,14,0 -15814,0,3,0,11,3,1,7,2,1,0,6,20,4,28,0 -15815,0,0,0,15,1,0,7,3,2,1,4,15,3,9,0 -15816,4,8,0,3,0,1,2,2,3,1,2,11,0,18,0 -15817,9,5,0,3,4,3,6,1,2,1,7,3,2,17,1 -15818,7,6,0,7,2,5,13,5,3,1,5,10,5,13,1 -15819,2,8,0,4,5,4,14,1,3,0,17,16,0,26,0 -15820,4,3,0,5,5,3,8,1,2,0,17,6,0,6,0 -15821,3,0,0,13,4,4,0,2,3,0,2,2,2,22,0 -15822,7,7,0,7,3,4,3,3,0,1,12,13,0,7,0 -15823,2,7,0,15,5,4,9,2,0,0,8,11,0,14,0 -15824,3,0,0,13,2,3,10,4,1,1,2,19,2,36,0 -15825,1,8,0,12,5,0,5,1,3,1,4,0,2,14,0 -15826,7,3,0,9,0,6,13,2,0,0,16,14,1,37,1 -15827,9,8,0,9,6,5,5,1,0,1,13,9,1,29,0 -15828,9,4,0,3,0,1,1,1,0,1,8,13,1,37,0 -15829,7,3,0,0,2,0,8,2,4,0,4,15,2,22,0 -15830,10,1,0,4,0,4,5,1,2,1,4,2,3,19,0 -15831,5,8,0,14,6,0,5,4,0,1,2,14,0,40,0 -15832,10,6,0,1,1,1,0,5,1,1,9,7,5,14,1 -15833,9,6,0,9,2,1,2,5,0,1,2,6,4,10,0 -15834,6,2,0,11,0,2,13,4,3,0,13,13,5,11,0 -15835,1,7,0,5,0,1,0,1,3,1,17,13,1,38,0 -15836,1,7,0,6,2,2,10,3,3,0,13,11,1,32,0 
-15837,1,2,0,5,5,0,8,4,2,1,11,20,5,30,0 -15838,8,4,0,6,1,1,6,5,0,1,8,2,5,14,0 -15839,9,0,0,11,4,4,10,0,2,0,0,15,2,3,0 -15840,3,2,0,12,6,6,14,1,0,1,4,4,5,40,0 -15841,0,4,0,15,0,4,4,3,4,1,14,0,5,14,0 -15842,0,6,0,15,0,0,8,4,1,0,13,11,0,35,0 -15843,0,6,0,5,0,4,7,2,2,1,13,18,1,23,0 -15844,4,0,0,12,0,1,6,0,4,1,10,11,2,18,0 -15845,2,5,0,2,3,0,2,1,3,1,3,8,1,16,1 -15846,7,1,0,3,0,4,4,5,1,0,3,17,2,8,1 -15847,5,8,0,2,0,6,8,3,0,0,10,2,4,33,0 -15848,5,2,0,15,3,2,13,3,4,0,7,1,3,8,1 -15849,6,5,0,1,3,4,4,1,2,0,9,5,3,28,1 -15850,7,5,0,5,2,3,9,3,3,1,13,6,3,3,0 -15851,3,7,0,5,4,5,11,3,2,0,4,2,2,27,0 -15852,5,8,0,7,3,5,5,4,1,1,13,6,2,23,0 -15853,7,2,0,14,1,6,12,4,1,1,9,4,0,11,1 -15854,0,8,0,1,6,5,5,1,0,1,13,20,0,14,0 -15855,9,7,0,1,1,2,11,3,1,0,16,2,0,14,0 -15856,3,1,0,4,2,4,14,3,2,0,13,12,0,11,0 -15857,8,2,0,1,2,3,6,1,4,1,15,2,3,35,0 -15858,7,7,0,13,2,0,3,5,4,1,0,11,4,14,0 -15859,7,2,0,9,1,5,14,0,1,0,5,10,1,41,1 -15860,2,3,0,9,2,3,14,2,0,0,16,14,0,2,0 -15861,0,6,0,8,4,6,8,4,0,1,2,20,0,36,0 -15862,7,1,0,13,2,0,12,4,3,1,17,4,5,17,0 -15863,1,3,0,3,6,4,1,3,4,0,11,13,4,21,0 -15864,1,0,0,2,3,5,13,5,3,0,4,12,0,27,0 -15865,6,7,0,14,2,1,2,0,4,1,9,7,4,1,1 -15866,0,0,0,13,0,0,7,1,3,1,15,16,1,20,0 -15867,10,3,0,7,2,2,13,1,2,1,16,16,2,37,1 -15868,9,3,0,4,4,5,5,2,4,1,4,4,1,3,0 -15869,9,0,0,10,4,3,2,2,1,1,5,4,1,26,0 -15870,3,1,0,13,6,6,7,0,0,1,13,2,4,35,0 -15871,9,8,0,5,4,5,1,2,2,1,9,3,4,30,1 -15872,2,2,0,13,2,4,12,2,2,0,17,2,5,31,0 -15873,5,7,0,15,0,5,4,2,0,1,13,0,0,4,0 -15874,10,8,0,12,5,2,11,0,0,0,9,12,5,22,1 -15875,4,1,0,9,5,3,12,2,3,0,9,1,0,10,1 -15876,6,2,0,6,3,6,1,3,1,0,13,9,3,19,0 -15877,0,1,0,6,0,0,9,5,0,1,4,9,1,32,0 -15878,1,3,0,13,2,5,8,2,2,1,10,15,3,27,0 -15879,6,8,0,5,6,3,1,4,2,1,8,15,0,29,0 -15880,1,7,0,1,0,4,3,2,0,1,17,6,3,0,0 -15881,5,7,0,9,5,3,9,1,0,1,2,16,2,8,0 -15882,4,0,0,10,6,2,10,2,1,1,10,0,2,14,0 -15883,9,6,0,8,0,6,3,0,2,1,11,17,1,8,1 -15884,10,8,0,6,1,2,2,2,1,0,18,7,2,31,1 -15885,2,6,0,7,1,4,9,4,0,1,13,11,1,12,0 -15886,2,7,0,0,2,5,13,1,3,1,15,5,2,25,1 
-15887,1,5,0,14,1,1,13,5,1,1,5,5,4,35,1 -15888,6,1,0,3,3,6,6,1,0,1,18,8,3,32,1 -15889,0,3,0,8,0,4,5,0,1,0,2,13,3,2,0 -15890,2,8,0,4,0,4,12,5,4,1,4,8,1,8,0 -15891,2,7,0,10,0,1,3,4,4,0,4,19,4,10,1 -15892,0,1,0,15,6,4,0,2,3,0,8,6,0,4,0 -15893,3,3,0,8,6,0,14,0,0,0,13,2,4,38,0 -15894,0,3,0,3,0,2,9,0,1,1,13,2,0,25,0 -15895,3,6,0,0,5,3,0,0,2,0,6,15,0,16,0 -15896,3,3,0,6,2,5,8,2,2,0,13,3,0,5,0 -15897,10,4,0,11,1,5,6,2,3,0,7,11,1,12,0 -15898,5,2,0,5,2,6,0,0,2,1,14,2,5,6,0 -15899,0,0,0,4,1,0,8,1,4,0,2,13,3,37,0 -15900,1,4,0,2,2,1,6,5,0,0,13,20,2,27,0 -15901,1,2,0,3,6,3,14,0,2,1,5,8,1,23,1 -15902,3,5,0,14,0,3,4,0,3,0,11,11,2,25,0 -15903,9,1,0,14,1,5,10,4,4,1,13,11,1,4,0 -15904,9,7,0,5,6,2,1,3,0,0,15,14,0,13,0 -15905,7,3,0,0,0,5,7,2,2,0,2,4,1,28,0 -15906,5,3,0,10,3,4,1,4,4,0,18,1,1,8,1 -15907,8,2,0,9,6,5,7,3,4,1,9,7,4,18,1 -15908,5,3,0,12,0,2,6,5,1,1,2,17,0,25,0 -15909,10,3,0,0,5,2,0,3,1,1,16,7,5,3,1 -15910,2,8,0,7,6,6,11,0,4,1,4,19,5,30,1 -15911,0,6,0,9,0,1,6,3,0,0,8,16,2,41,0 -15912,10,5,0,15,0,2,13,5,1,1,9,7,5,25,1 -15913,1,1,0,13,0,2,8,5,2,1,0,11,2,6,0 -15914,0,8,0,11,3,0,5,0,3,1,10,2,0,21,0 -15915,0,1,0,1,6,0,9,4,0,1,2,11,2,27,0 -15916,0,8,0,3,0,0,11,3,3,0,13,18,1,20,0 -15917,0,3,0,3,6,0,12,2,2,0,8,11,3,18,0 -15918,2,8,0,9,0,1,3,4,4,1,6,6,1,3,0 -15919,2,0,0,11,0,2,4,2,1,1,13,13,5,11,0 -15920,7,5,0,14,4,5,11,3,2,0,16,6,5,5,1 -15921,3,4,0,15,0,4,8,3,1,0,8,20,4,38,0 -15922,8,6,0,7,6,3,8,5,2,0,5,10,1,10,1 -15923,4,2,0,14,2,5,7,4,3,1,12,13,4,20,0 -15924,2,7,0,13,6,4,2,0,0,0,13,0,2,18,0 -15925,0,2,0,11,0,0,8,1,1,0,0,15,2,27,0 -15926,2,5,0,10,4,3,0,3,0,1,8,15,5,30,0 -15927,0,6,0,5,5,2,2,3,2,0,6,0,5,4,0 -15928,5,2,0,11,3,3,10,5,2,1,1,19,1,14,1 -15929,0,6,0,13,1,4,5,2,4,0,4,18,3,39,0 -15930,2,4,0,0,6,1,9,2,0,1,2,11,4,28,0 -15931,5,8,0,4,0,3,7,3,2,0,18,6,1,7,0 -15932,1,3,0,5,3,3,6,0,2,1,8,4,4,33,0 -15933,5,4,0,9,6,5,12,0,4,1,1,13,1,28,1 -15934,2,1,0,11,2,5,7,0,0,0,4,20,0,31,0 -15935,1,8,0,5,3,5,12,4,0,0,17,6,4,6,0 -15936,7,7,0,0,4,3,3,1,4,1,10,7,2,7,1 
-15937,2,7,0,12,2,2,7,5,3,0,2,2,0,1,0 -15938,4,6,0,9,5,0,14,0,4,0,8,2,2,16,0 -15939,0,7,0,6,0,0,1,3,1,1,13,9,0,37,0 -15940,4,8,0,6,5,5,12,0,4,1,13,15,3,38,0 -15941,3,8,0,1,1,3,8,4,3,1,4,3,4,6,0 -15942,0,4,0,3,2,1,6,4,0,0,3,9,0,26,0 -15943,8,6,0,9,0,3,8,5,0,1,8,16,5,29,0 -15944,6,2,0,2,1,5,6,1,1,0,4,2,0,29,0 -15945,0,3,0,7,1,4,5,1,3,1,2,17,4,10,0 -15946,0,5,0,12,0,0,9,0,2,0,17,11,2,8,0 -15947,8,6,0,15,0,2,5,5,3,1,7,18,1,32,1 -15948,8,7,0,8,0,1,3,1,2,1,10,18,1,4,0 -15949,8,3,0,11,0,1,9,3,0,0,2,9,2,14,0 -15950,3,8,0,4,3,3,5,4,2,1,13,20,2,8,0 -15951,1,7,0,3,1,3,5,4,2,0,13,12,0,12,0 -15952,6,8,0,4,5,0,12,4,4,0,17,4,3,13,0 -15953,10,8,0,15,0,0,14,0,2,0,2,20,0,16,0 -15954,1,8,0,6,3,2,9,3,0,0,13,9,1,19,0 -15955,2,6,0,13,5,4,3,0,0,0,4,4,4,30,0 -15956,2,3,0,11,6,6,12,4,3,1,13,11,4,13,0 -15957,3,3,0,1,0,5,1,3,2,0,4,13,5,22,0 -15958,0,6,0,11,1,5,10,2,1,1,3,11,3,34,0 -15959,6,6,0,12,5,4,9,4,4,1,13,11,3,19,0 -15960,7,1,0,10,3,1,13,4,2,1,9,7,4,10,1 -15961,6,4,0,14,0,0,3,4,1,0,0,11,0,23,0 -15962,0,3,0,9,3,0,5,0,2,1,13,16,0,12,0 -15963,5,7,0,3,1,6,7,0,0,0,11,11,0,8,0 -15964,9,8,0,3,5,5,14,5,3,1,13,3,2,36,1 -15965,6,1,0,9,1,6,6,4,1,0,13,11,2,7,0 -15966,0,0,0,7,5,0,14,0,2,0,2,13,2,37,0 -15967,0,8,0,15,1,4,0,1,4,0,13,11,3,20,0 -15968,2,6,0,12,3,1,0,2,0,0,2,2,2,18,0 -15969,4,7,0,2,6,3,11,3,1,1,15,9,2,25,0 -15970,2,4,0,12,0,4,6,5,4,0,9,9,0,13,0 -15971,0,0,0,15,0,3,5,2,2,1,2,0,1,22,0 -15972,10,7,0,5,1,1,9,5,1,0,10,20,2,27,0 -15973,10,8,0,15,6,5,9,4,1,0,7,13,1,17,0 -15974,3,7,0,1,2,4,5,3,3,0,4,11,0,5,0 -15975,1,0,0,0,2,5,12,4,3,0,13,11,4,3,0 -15976,1,4,0,15,0,3,2,1,1,0,15,11,2,18,0 -15977,6,2,0,7,6,2,4,0,4,1,12,5,1,36,1 -15978,0,6,0,14,6,6,4,5,1,1,12,11,4,4,0 -15979,6,5,0,9,3,5,10,5,1,1,16,15,2,24,1 -15980,7,6,0,15,0,3,11,3,2,1,2,2,0,6,0 -15981,0,6,0,13,5,1,12,4,0,1,13,6,3,25,0 -15982,10,7,0,0,6,4,5,0,1,0,15,12,4,38,0 -15983,9,4,0,10,1,6,9,1,2,1,0,2,2,17,0 -15984,9,6,0,9,5,4,6,0,3,1,6,5,5,10,0 -15985,0,8,0,1,4,2,8,1,0,0,0,15,5,34,0 -15986,7,6,0,0,5,4,8,4,4,1,13,8,0,22,0 
-15987,9,0,0,1,1,5,3,0,3,0,4,20,0,35,0 -15988,1,3,0,5,3,4,6,4,1,0,2,18,0,30,0 -15989,5,5,0,11,4,6,7,0,4,1,18,6,3,7,1 -15990,9,2,0,0,0,5,12,3,3,0,11,11,1,31,0 -15991,0,6,0,13,3,4,4,5,0,0,8,11,1,13,0 -15992,5,4,0,12,3,5,0,4,2,1,9,4,5,39,1 -15993,3,0,0,2,0,0,14,1,2,1,2,2,5,41,0 -15994,2,8,0,11,5,0,14,2,0,0,8,6,5,10,0 -15995,2,5,0,11,0,4,7,2,1,1,10,1,0,9,0 -15996,1,7,0,6,0,3,10,2,2,1,3,11,5,4,0 -15997,7,4,0,14,5,1,13,0,4,0,11,9,3,33,1 -15998,10,4,0,8,1,1,11,0,4,0,3,10,1,11,1 -15999,0,3,0,13,0,5,9,5,2,0,10,9,5,5,0 -16000,2,2,0,6,0,5,11,0,0,0,4,8,0,25,0 -16001,1,4,0,12,4,5,13,4,3,1,4,2,5,24,0 -16002,8,6,0,15,6,4,2,1,1,1,6,13,1,4,0 -16003,0,2,0,11,0,3,5,5,1,1,3,15,0,27,0 -16004,4,4,0,8,2,3,12,1,3,0,4,17,2,4,0 -16005,0,4,0,12,6,5,4,1,1,0,3,16,3,16,0 -16006,5,0,0,1,0,5,6,2,0,0,10,11,4,25,0 -16007,2,8,0,0,3,3,5,3,1,0,2,2,5,20,0 -16008,9,0,0,5,1,3,6,2,3,1,14,5,1,24,1 -16009,1,7,0,5,1,1,9,5,2,1,10,20,1,3,0 -16010,4,0,0,7,0,3,14,4,1,1,13,1,0,20,0 -16011,8,7,0,9,1,1,12,1,1,0,16,8,1,16,1 -16012,6,7,0,12,4,1,9,2,3,1,9,9,0,2,1 -16013,1,8,0,15,5,0,1,1,1,1,2,3,0,7,0 -16014,9,1,0,13,5,4,11,2,4,1,7,2,0,35,0 -16015,10,1,0,13,0,5,5,4,3,1,2,0,1,16,0 -16016,0,6,0,12,0,2,14,3,2,1,2,2,1,38,0 -16017,3,0,0,7,1,0,6,4,2,0,0,19,2,31,0 -16018,0,5,0,1,4,5,6,3,2,1,17,11,0,19,0 -16019,4,7,0,3,1,4,13,5,0,1,18,7,3,5,1 -16020,3,6,0,13,4,4,12,0,4,0,2,3,1,30,0 -16021,9,7,0,14,1,6,0,3,0,0,6,20,2,37,0 -16022,2,6,0,15,0,4,14,1,1,1,0,2,0,17,0 -16023,0,6,0,2,3,1,4,2,4,1,4,9,1,22,0 -16024,0,5,0,0,2,2,13,3,3,1,13,12,0,10,0 -16025,1,3,0,11,0,4,7,2,0,0,0,2,4,24,0 -16026,3,8,0,8,3,5,7,1,0,0,4,2,2,26,0 -16027,4,6,0,15,1,0,6,2,0,1,4,3,2,6,0 -16028,8,5,0,6,2,5,7,3,0,0,18,10,4,13,1 -16029,5,2,0,11,6,1,2,2,2,0,15,13,5,0,0 -16030,2,6,0,1,5,5,8,2,0,0,15,4,2,7,0 -16031,9,4,0,7,6,1,3,5,0,0,9,4,2,20,1 -16032,1,4,0,14,0,4,9,3,1,0,12,2,1,21,0 -16033,4,2,0,14,5,0,6,3,3,1,13,18,5,36,0 -16034,1,4,0,6,3,6,13,1,4,0,2,11,5,30,0 -16035,3,6,0,12,3,6,7,0,2,0,17,15,0,19,0 -16036,3,8,0,0,5,0,11,3,4,1,5,12,2,16,1 
-16037,10,6,0,11,1,3,0,3,0,0,13,11,1,8,0 -16038,0,5,0,15,3,6,3,3,0,1,13,20,0,27,0 -16039,4,8,0,4,0,0,0,5,0,1,0,0,2,17,0 -16040,8,6,0,4,6,0,12,3,1,0,6,17,4,9,0 -16041,9,4,0,15,1,1,2,3,0,0,0,17,3,23,1 -16042,6,6,0,2,0,4,1,4,0,0,10,6,4,13,0 -16043,1,1,0,14,6,3,4,2,1,0,13,20,0,26,0 -16044,2,0,0,0,0,6,0,3,0,1,13,19,5,40,0 -16045,0,3,0,5,4,0,9,4,0,1,10,9,5,9,0 -16046,9,4,0,0,0,6,14,3,4,1,13,3,1,5,0 -16047,4,7,0,10,2,1,11,4,4,1,18,6,4,0,1 -16048,8,8,0,3,6,0,8,5,0,0,16,7,2,31,1 -16049,0,8,0,4,2,4,9,2,0,0,13,2,1,21,0 -16050,0,4,0,12,0,0,5,5,1,1,18,17,5,6,1 -16051,2,4,0,3,3,5,5,3,4,1,2,6,4,13,0 -16052,10,0,0,13,4,5,5,1,3,1,8,3,1,5,0 -16053,7,7,0,12,0,2,3,0,2,1,0,16,3,14,1 -16054,7,1,0,7,0,4,10,3,2,1,9,5,3,23,1 -16055,5,6,0,5,1,5,12,4,3,1,11,12,2,19,1 -16056,0,0,0,0,0,5,6,5,0,0,4,18,2,1,0 -16057,7,2,0,14,0,0,11,1,4,0,5,15,4,40,0 -16058,0,4,0,8,5,4,5,5,4,1,2,1,5,11,0 -16059,2,8,0,2,2,5,8,4,2,1,8,2,0,22,0 -16060,7,1,0,10,3,3,5,3,2,1,12,20,2,28,0 -16061,0,0,0,15,0,0,1,2,2,1,3,15,5,2,0 -16062,6,4,0,8,2,1,13,5,4,0,4,12,1,40,1 -16063,1,7,0,13,2,4,1,5,4,0,8,13,2,18,0 -16064,0,2,0,11,6,5,1,1,2,0,9,2,5,29,0 -16065,0,8,0,1,0,0,3,4,2,1,2,16,0,29,0 -16066,5,3,0,1,2,6,7,4,3,0,13,2,3,12,0 -16067,8,8,0,13,1,2,6,3,2,0,9,11,2,6,0 -16068,4,8,0,0,2,0,8,1,0,0,2,15,1,25,0 -16069,3,3,0,12,6,1,3,5,3,0,7,1,3,11,1 -16070,6,2,0,0,3,5,8,0,3,0,11,13,1,21,0 -16071,0,6,0,3,0,5,14,1,0,1,6,2,2,4,0 -16072,6,0,0,2,6,4,5,2,1,1,5,17,4,30,1 -16073,0,4,0,11,1,0,1,4,4,0,13,6,0,36,0 -16074,5,8,0,4,2,4,11,5,4,1,9,17,4,16,1 -16075,0,7,0,3,4,0,1,1,3,1,2,1,3,39,0 -16076,8,7,0,13,5,3,9,4,0,1,13,15,1,38,0 -16077,2,7,0,7,2,5,10,4,2,0,10,2,1,38,0 -16078,1,4,0,2,1,5,10,0,4,1,17,16,5,40,0 -16079,9,1,0,10,6,2,1,5,4,0,18,7,4,28,1 -16080,2,3,0,11,3,5,8,2,3,0,13,2,5,22,0 -16081,3,6,0,6,1,4,13,2,0,1,6,11,1,28,0 -16082,4,0,0,4,3,5,3,1,3,1,0,2,2,39,0 -16083,6,4,0,11,5,4,6,0,4,0,2,19,5,28,0 -16084,0,8,0,1,5,0,9,4,2,0,13,2,0,40,0 -16085,10,1,0,9,6,5,0,1,3,0,7,3,0,21,0 -16086,0,1,0,15,4,3,6,2,2,1,14,13,3,32,0 
-16087,6,5,0,8,6,2,9,1,1,1,9,6,1,29,1 -16088,6,6,0,9,3,0,1,5,0,1,14,10,5,35,1 -16089,8,4,0,11,5,1,12,5,2,1,18,16,4,19,1 -16090,6,1,0,4,3,1,13,4,2,1,5,7,4,7,1 -16091,3,7,0,3,5,0,4,2,1,0,15,17,0,19,0 -16092,2,8,0,15,6,3,0,0,0,0,17,11,2,37,0 -16093,0,6,0,5,6,6,3,3,0,1,17,2,5,27,0 -16094,5,2,0,3,4,5,6,5,4,1,1,7,4,37,1 -16095,7,8,0,6,3,2,2,5,3,1,9,1,3,11,1 -16096,5,7,0,8,0,5,9,3,4,0,16,15,2,27,0 -16097,4,1,0,13,1,2,13,2,1,1,10,12,0,23,1 -16098,6,8,0,0,6,6,6,3,0,1,8,6,0,22,0 -16099,6,2,0,3,3,0,2,4,3,1,7,1,3,0,0 -16100,2,2,0,5,5,4,10,3,0,0,0,6,0,17,0 -16101,2,6,0,0,1,4,4,3,4,1,8,15,2,37,0 -16102,9,1,0,1,2,6,0,2,3,0,8,20,3,7,0 -16103,0,1,0,15,6,3,14,1,4,1,13,11,0,19,0 -16104,5,2,0,3,1,5,11,3,2,1,8,20,5,38,0 -16105,1,4,0,13,2,5,5,2,1,1,17,9,0,25,0 -16106,0,1,0,0,6,5,6,3,4,0,7,9,2,27,0 -16107,1,7,0,9,5,5,0,0,4,1,8,6,0,9,0 -16108,2,7,0,15,6,0,8,0,3,1,15,9,4,24,0 -16109,0,4,0,8,0,4,13,3,3,0,12,8,0,16,0 -16110,9,5,0,15,3,4,13,0,3,0,6,6,4,0,0 -16111,6,2,0,10,4,6,11,5,1,0,18,11,1,8,1 -16112,0,1,0,5,2,2,10,5,3,0,2,18,0,40,0 -16113,7,6,0,7,3,3,5,0,1,1,15,9,3,18,0 -16114,0,8,0,5,0,5,13,1,4,1,8,13,2,40,0 -16115,0,4,0,5,3,2,12,2,2,1,15,15,4,12,0 -16116,3,7,0,2,6,0,0,1,3,0,2,3,5,39,0 -16117,1,1,0,10,6,6,12,3,0,0,2,14,0,0,0 -16118,2,3,0,11,5,1,1,4,3,0,15,6,5,26,0 -16119,0,8,0,13,3,6,3,3,4,0,17,16,4,24,0 -16120,0,5,0,5,5,5,4,4,2,1,8,14,2,28,0 -16121,0,4,0,12,0,5,14,3,4,1,14,12,2,10,0 -16122,2,6,0,6,6,4,14,0,4,0,10,3,5,3,0 -16123,3,5,0,3,1,4,1,1,4,0,17,9,3,25,0 -16124,1,7,0,8,0,0,2,2,1,0,4,5,5,10,1 -16125,8,2,0,13,4,1,3,1,3,1,3,10,4,23,1 -16126,1,8,0,9,0,4,8,4,4,1,13,2,1,18,0 -16127,6,6,0,3,3,3,9,4,4,0,10,13,5,23,0 -16128,4,4,0,4,2,5,12,3,3,0,17,18,2,29,0 -16129,3,1,0,12,0,0,5,2,4,0,8,6,4,31,0 -16130,7,7,0,8,0,5,0,5,3,1,1,8,5,38,1 -16131,0,3,0,14,1,4,11,0,0,1,13,6,5,18,0 -16132,9,3,0,13,5,5,12,4,0,1,0,20,5,25,0 -16133,0,0,0,11,1,3,13,2,0,1,0,0,2,2,0 -16134,2,0,0,9,6,4,10,0,2,0,2,9,3,29,0 -16135,8,7,0,15,0,1,1,4,4,0,13,14,3,14,0 -16136,6,0,0,8,1,6,14,4,3,1,0,16,0,37,0 
-16137,0,7,0,3,6,4,5,2,2,0,13,9,2,37,0 -16138,5,8,0,12,0,5,2,4,1,0,8,2,1,33,0 -16139,6,7,0,11,2,0,9,3,2,0,8,20,1,1,0 -16140,1,8,0,2,1,3,8,1,0,0,17,9,0,38,0 -16141,9,8,0,2,0,2,0,3,0,1,17,2,3,24,0 -16142,10,3,0,6,3,5,5,2,2,1,2,17,1,28,0 -16143,7,1,0,3,0,6,6,3,1,1,8,11,4,21,0 -16144,10,1,0,5,2,6,0,5,4,0,18,17,2,30,1 -16145,6,3,0,13,6,4,6,2,4,0,7,9,0,23,0 -16146,5,2,0,10,1,3,10,4,1,1,14,3,0,9,1 -16147,0,4,0,9,1,0,10,3,0,1,0,11,0,32,0 -16148,0,4,0,15,4,5,7,2,4,0,8,5,3,27,0 -16149,3,3,0,9,2,1,13,3,0,1,8,20,3,41,0 -16150,8,1,0,12,6,1,0,5,2,1,18,14,1,26,1 -16151,4,0,0,6,1,3,14,0,0,0,6,1,2,19,0 -16152,7,3,0,4,5,6,1,0,2,0,18,7,0,31,1 -16153,9,2,0,8,5,3,1,4,1,1,8,0,2,27,0 -16154,0,8,0,7,2,0,1,4,2,0,8,9,0,11,0 -16155,2,0,0,9,6,5,6,3,4,1,13,11,0,35,0 -16156,0,4,0,12,2,4,1,5,1,1,8,9,3,12,0 -16157,3,1,0,0,1,1,13,5,1,1,18,16,5,41,1 -16158,7,8,0,7,6,2,10,1,0,1,5,1,4,29,1 -16159,10,4,0,7,0,0,5,3,2,0,2,11,4,24,0 -16160,3,7,0,11,3,0,14,4,1,1,6,13,0,6,0 -16161,10,5,0,7,4,5,2,3,3,1,15,18,2,31,0 -16162,6,6,0,3,2,5,8,2,0,0,12,16,5,30,0 -16163,1,1,0,2,6,4,10,2,0,0,12,7,3,33,0 -16164,4,1,0,15,5,6,9,1,0,1,16,19,5,35,1 -16165,0,8,0,13,2,5,4,2,4,0,17,11,3,20,0 -16166,3,3,0,9,0,3,9,4,1,1,2,11,0,26,0 -16167,0,5,0,15,6,2,14,5,1,1,17,9,5,13,0 -16168,5,4,0,12,0,4,2,1,0,0,1,8,1,13,1 -16169,4,7,0,7,6,5,7,4,0,0,4,9,4,25,0 -16170,1,7,0,11,5,0,9,0,2,1,10,0,0,25,0 -16171,0,5,0,0,3,4,5,1,0,1,2,9,2,13,0 -16172,10,4,0,13,5,6,11,1,1,1,9,19,4,19,1 -16173,9,2,0,0,5,4,8,4,4,0,8,2,3,26,0 -16174,0,1,0,4,0,4,8,4,2,0,4,13,4,19,0 -16175,1,0,0,4,3,0,3,1,3,1,8,2,1,20,0 -16176,6,5,0,14,2,2,3,3,3,1,5,9,4,21,1 -16177,6,5,0,0,3,2,6,1,2,1,15,18,0,22,0 -16178,2,3,0,15,1,0,12,4,1,0,4,11,0,33,0 -16179,6,4,0,0,3,1,7,1,4,1,16,1,1,6,1 -16180,1,6,0,11,3,6,4,2,1,1,15,4,0,38,0 -16181,8,3,0,0,2,5,8,4,3,0,7,0,4,22,0 -16182,5,0,0,0,5,6,10,1,4,1,5,5,2,29,1 -16183,5,7,0,0,0,1,0,4,2,0,8,9,1,18,0 -16184,3,4,0,13,5,0,14,3,2,0,14,4,5,23,0 -16185,9,5,0,6,2,5,9,0,0,0,5,19,3,28,1 -16186,4,4,0,3,3,6,8,1,4,0,6,20,4,26,0 
-16187,8,2,0,6,4,2,2,3,0,1,9,17,5,41,1 -16188,1,8,0,4,5,5,10,3,3,0,10,6,2,8,0 -16189,5,2,0,2,1,0,2,1,2,1,15,12,5,31,1 -16190,0,7,0,6,6,4,14,2,2,0,1,7,0,22,1 -16191,1,7,0,3,6,1,6,0,1,0,17,5,2,0,1 -16192,4,7,0,4,5,2,2,3,0,1,2,2,4,3,0 -16193,0,7,0,2,0,0,7,0,3,0,17,2,1,8,0 -16194,9,2,0,1,6,3,9,4,3,0,13,11,0,39,0 -16195,0,7,0,5,6,0,9,4,0,0,2,9,5,12,0 -16196,9,2,0,13,5,5,14,4,2,1,3,12,1,24,0 -16197,2,1,0,5,3,1,7,2,0,1,13,15,5,12,0 -16198,10,2,0,13,4,6,10,4,2,1,6,13,4,1,0 -16199,5,8,0,4,6,2,5,5,4,0,18,10,3,40,1 -16200,1,2,0,2,2,5,14,0,1,0,13,15,1,29,0 -16201,8,6,0,7,3,6,8,3,0,1,12,5,5,32,0 -16202,1,2,0,5,6,4,9,1,0,0,4,0,4,14,0 -16203,1,7,0,3,5,5,10,2,3,1,15,13,4,31,0 -16204,0,0,0,1,3,4,4,4,0,0,3,10,1,12,0 -16205,1,3,0,0,1,5,12,5,2,0,4,6,5,19,0 -16206,10,8,0,0,2,3,8,1,1,1,4,11,1,28,0 -16207,2,2,0,6,1,1,0,3,3,0,4,2,3,35,0 -16208,3,7,0,5,6,6,6,1,1,1,2,9,0,5,0 -16209,2,7,0,13,4,4,14,4,2,0,17,18,1,3,0 -16210,5,0,0,12,1,0,8,3,3,0,6,2,4,12,0 -16211,9,5,0,5,2,4,8,4,0,1,6,13,5,27,0 -16212,1,7,0,8,1,1,14,3,3,0,3,8,1,21,1 -16213,10,2,0,1,1,4,9,0,4,0,13,11,4,21,0 -16214,4,4,0,0,5,4,8,0,2,0,16,13,5,32,0 -16215,10,1,0,11,4,6,11,0,1,0,14,3,5,3,1 -16216,7,6,0,13,4,3,0,1,4,0,11,11,2,40,0 -16217,1,1,0,8,4,3,13,0,0,1,1,10,5,25,1 -16218,8,1,0,1,2,4,2,1,3,1,8,11,1,27,0 -16219,3,0,0,13,6,3,9,0,0,0,10,2,3,26,0 -16220,8,0,0,11,0,2,14,1,2,0,0,3,0,26,0 -16221,0,8,0,7,2,1,8,4,0,0,4,18,1,35,0 -16222,3,5,0,3,1,2,2,3,3,1,0,6,5,19,0 -16223,9,7,0,8,0,6,10,4,4,0,6,15,3,0,0 -16224,0,1,0,8,0,5,9,2,4,1,12,17,0,17,0 -16225,2,1,0,6,6,0,9,4,0,1,14,2,2,18,0 -16226,2,4,0,10,6,0,11,0,2,1,0,10,5,40,0 -16227,0,7,0,14,0,4,13,1,4,1,17,15,5,10,0 -16228,5,4,0,2,0,1,11,5,0,1,2,9,5,26,0 -16229,0,4,0,3,1,4,8,0,4,0,3,16,0,39,0 -16230,0,1,0,8,6,3,5,5,4,0,18,17,5,35,1 -16231,4,8,0,10,0,5,2,5,3,1,8,10,1,2,0 -16232,1,2,0,12,0,3,5,0,0,1,12,2,3,7,0 -16233,9,6,0,0,0,4,9,5,2,1,17,14,5,1,0 -16234,6,3,0,13,2,5,3,5,1,1,2,20,4,13,0 -16235,3,6,0,14,6,4,9,2,0,1,18,10,0,8,1 -16236,0,8,0,12,3,3,8,3,2,0,13,15,4,35,0 
-16237,6,2,0,14,6,4,6,0,4,0,0,11,5,36,0 -16238,3,8,0,12,5,5,4,3,2,1,0,8,0,18,1 -16239,3,3,0,11,2,6,8,3,3,0,2,18,3,17,0 -16240,6,6,0,0,6,1,13,1,0,1,9,5,0,20,1 -16241,0,5,0,13,3,2,8,3,2,0,15,11,1,7,0 -16242,0,5,0,0,1,0,1,3,4,0,13,13,4,4,0 -16243,1,2,0,6,4,5,5,3,2,0,8,19,4,28,0 -16244,3,1,0,5,0,3,11,2,2,1,17,2,1,18,0 -16245,8,2,0,7,4,6,6,0,0,0,1,10,3,41,1 -16246,0,3,0,13,4,0,3,2,4,1,10,14,3,26,0 -16247,0,6,0,5,6,3,0,3,2,1,11,14,0,35,0 -16248,10,3,0,15,1,6,8,2,0,0,15,15,1,9,0 -16249,0,4,0,3,2,4,3,3,2,1,8,11,0,33,0 -16250,9,8,0,12,1,3,14,3,3,0,2,2,4,8,0 -16251,2,6,0,9,2,0,3,4,2,1,13,15,0,11,0 -16252,5,5,0,10,0,3,6,5,0,0,4,6,5,16,0 -16253,5,0,0,6,6,0,3,2,2,1,10,9,2,3,0 -16254,5,1,0,3,5,4,13,3,3,1,13,11,5,20,0 -16255,7,0,0,13,0,4,0,1,0,0,14,11,4,19,0 -16256,3,7,0,1,0,3,2,4,0,1,4,15,4,17,0 -16257,0,6,0,14,0,4,5,3,4,0,0,11,1,0,0 -16258,3,1,0,5,3,1,12,4,2,1,17,14,2,5,1 -16259,3,7,0,10,1,4,6,3,0,1,2,6,5,4,0 -16260,5,4,0,3,4,2,4,4,4,1,0,10,4,1,1 -16261,9,3,0,13,0,3,3,3,0,0,13,11,1,6,0 -16262,2,6,0,5,6,6,13,2,0,1,13,0,1,3,0 -16263,0,3,0,2,0,4,0,1,3,1,5,19,2,35,1 -16264,1,6,0,13,5,4,3,3,0,1,11,10,0,8,0 -16265,8,3,0,14,3,5,8,4,4,1,2,2,1,6,0 -16266,4,0,0,13,6,6,13,2,0,0,2,16,2,23,0 -16267,3,6,0,13,2,0,9,5,2,1,4,2,3,35,0 -16268,10,3,0,3,1,5,7,3,1,0,0,9,1,2,0 -16269,1,1,0,10,5,1,11,0,1,0,5,17,0,17,1 -16270,1,6,0,13,0,4,7,3,1,0,7,6,3,20,0 -16271,8,1,0,13,4,2,9,2,3,1,8,6,1,12,0 -16272,6,5,0,9,4,4,3,4,4,1,17,10,5,5,1 -16273,6,0,0,11,5,0,7,0,2,0,13,20,0,17,0 -16274,1,8,0,1,3,2,5,1,2,1,10,11,0,5,0 -16275,1,4,0,9,1,4,0,0,0,1,13,2,0,9,0 -16276,1,2,0,2,3,6,5,5,4,1,17,5,5,2,0 -16277,1,6,0,4,0,3,14,0,0,0,2,2,0,25,0 -16278,2,8,0,2,4,6,3,3,1,1,0,11,5,1,0 -16279,5,1,0,14,5,3,12,5,2,0,3,8,2,32,0 -16280,8,7,0,9,3,1,11,1,0,1,3,7,0,17,1 -16281,0,7,0,13,5,3,13,4,0,1,17,14,0,38,0 -16282,8,7,0,5,4,1,9,3,4,1,14,10,4,39,0 -16283,3,7,0,5,6,5,7,0,0,1,10,16,5,7,0 -16284,6,2,0,10,0,3,13,2,0,0,4,18,0,19,0 -16285,4,2,0,12,2,6,14,3,3,1,9,6,0,7,0 -16286,3,8,0,12,5,3,8,3,1,1,12,2,2,4,0 
-16287,7,6,0,6,4,5,13,3,1,0,8,2,1,18,0 -16288,10,3,0,8,5,3,7,3,3,0,12,15,2,13,0 -16289,9,8,0,13,0,0,12,1,2,0,0,11,5,26,0 -16290,7,0,0,15,2,1,8,3,4,1,8,4,1,31,0 -16291,5,4,0,10,6,6,9,1,2,0,0,2,1,5,0 -16292,1,6,0,4,5,0,10,4,3,0,8,11,0,14,0 -16293,8,7,0,2,0,4,8,5,1,0,2,18,4,0,0 -16294,0,4,0,2,2,4,6,1,1,0,8,17,0,8,0 -16295,3,5,0,0,5,2,2,1,0,1,8,11,3,19,0 -16296,10,3,0,7,1,0,13,5,1,1,11,1,2,7,1 -16297,1,1,0,3,4,1,4,3,4,0,7,1,3,0,1 -16298,9,3,0,2,6,6,2,3,4,0,13,9,2,22,0 -16299,6,4,0,7,6,1,0,1,1,1,5,14,4,21,1 -16300,9,8,0,15,4,3,9,1,0,0,14,13,2,28,0 -16301,10,6,0,15,1,6,8,3,1,0,8,11,0,38,0 -16302,10,4,0,1,2,0,3,1,1,0,9,15,4,38,0 -16303,2,8,0,13,2,1,9,1,2,0,13,11,0,8,0 -16304,9,4,0,13,4,5,2,2,1,1,1,5,5,8,1 -16305,10,6,0,9,6,0,0,4,2,0,2,7,0,26,0 -16306,3,6,0,8,6,4,6,0,0,1,2,11,4,38,0 -16307,1,1,0,13,6,2,4,0,0,0,18,3,0,11,1 -16308,0,3,0,9,5,0,0,0,2,1,9,2,1,3,0 -16309,3,6,0,3,1,5,14,2,0,1,13,11,4,20,0 -16310,1,5,0,9,6,5,8,3,3,0,13,13,0,38,0 -16311,4,7,0,9,1,1,3,3,4,1,1,5,1,31,1 -16312,0,4,0,11,1,1,0,2,3,1,12,15,2,14,0 -16313,4,0,0,7,3,2,14,3,3,1,14,12,5,37,1 -16314,8,6,0,6,6,5,3,1,3,0,8,6,4,29,0 -16315,7,5,0,11,1,4,0,0,3,0,0,2,2,14,0 -16316,10,5,0,1,0,0,2,3,1,1,2,10,4,33,0 -16317,3,7,0,10,0,3,10,4,0,1,16,15,4,25,0 -16318,10,2,0,8,6,2,8,5,3,1,5,5,2,25,1 -16319,6,4,0,3,4,5,8,4,3,0,13,11,4,33,0 -16320,0,8,0,13,0,2,9,2,4,1,8,2,2,32,0 -16321,4,0,0,5,2,5,12,2,4,1,17,2,1,36,0 -16322,9,8,0,13,5,6,11,0,1,0,13,6,0,11,0 -16323,2,4,0,11,0,4,1,2,1,1,13,20,5,36,0 -16324,10,1,0,8,3,5,7,0,3,0,8,4,1,26,0 -16325,2,3,0,3,0,4,0,1,1,1,8,19,0,14,0 -16326,4,7,0,3,2,2,4,4,3,1,12,19,0,26,0 -16327,8,4,0,5,2,5,8,5,3,1,2,11,5,0,0 -16328,2,1,0,3,0,3,2,1,4,0,10,6,5,8,0 -16329,2,1,0,6,5,1,9,2,1,0,13,2,1,0,0 -16330,0,1,0,1,5,4,1,1,2,1,12,18,0,32,0 -16331,2,6,0,15,1,4,13,3,3,1,2,0,0,5,0 -16332,7,1,0,1,5,1,2,3,4,1,18,8,1,26,1 -16333,3,1,0,3,6,3,7,4,0,0,0,2,5,39,0 -16334,10,3,0,4,5,5,10,1,4,1,2,7,4,41,0 -16335,4,8,0,2,5,5,5,3,3,1,2,17,5,27,0 -16336,2,7,0,6,1,2,6,2,1,0,12,20,4,0,0 
-16337,9,2,0,8,0,5,4,3,1,1,13,18,1,8,0 -16338,0,8,0,2,2,0,4,4,4,1,12,0,1,25,0 -16339,0,1,0,11,6,5,0,1,3,0,6,20,3,19,0 -16340,10,4,0,7,6,3,4,0,3,1,9,18,5,22,1 -16341,10,3,0,6,5,2,9,2,3,0,2,6,4,5,0 -16342,6,2,0,12,4,1,8,5,0,1,7,14,3,29,1 -16343,3,7,0,3,0,0,13,5,2,1,8,9,4,37,0 -16344,1,1,0,0,4,0,2,2,1,1,15,0,2,22,0 -16345,7,3,0,8,2,0,6,3,0,1,10,20,0,16,0 -16346,7,7,0,3,2,6,7,4,0,1,13,2,3,39,0 -16347,7,6,0,2,0,5,11,2,4,0,17,11,2,24,0 -16348,10,6,0,0,0,5,2,2,3,0,8,15,1,26,0 -16349,0,6,0,3,0,4,8,5,1,0,13,11,3,26,0 -16350,1,2,0,7,2,4,11,1,3,1,1,15,5,18,0 -16351,10,5,0,5,6,5,5,3,4,0,8,16,1,27,0 -16352,0,4,0,12,0,5,4,2,1,0,2,2,2,41,0 -16353,5,2,0,10,0,5,1,2,0,1,0,2,0,33,0 -16354,4,7,0,0,5,6,8,2,1,1,13,20,5,17,0 -16355,0,6,0,7,6,4,0,1,4,1,13,2,0,8,0 -16356,9,5,0,0,0,4,4,4,2,0,17,11,0,34,0 -16357,1,0,0,13,6,4,0,3,1,0,12,0,2,21,0 -16358,7,4,0,4,4,2,2,2,2,1,10,4,5,29,1 -16359,2,6,0,0,0,5,13,0,1,0,17,8,1,38,0 -16360,10,6,0,13,2,5,13,0,3,0,8,0,0,36,0 -16361,5,1,0,3,0,0,0,4,1,1,8,13,0,40,0 -16362,6,6,0,13,0,0,14,1,4,0,6,19,2,11,0 -16363,1,0,0,0,5,0,11,1,2,1,16,15,1,41,0 -16364,1,7,0,1,2,2,7,3,1,1,6,0,1,41,0 -16365,1,5,0,1,0,2,5,0,1,0,2,2,1,29,0 -16366,3,8,0,1,0,3,0,4,0,0,13,11,2,40,0 -16367,7,6,0,15,2,1,4,0,4,0,1,10,1,32,1 -16368,9,3,0,6,0,3,2,0,1,1,4,20,0,28,0 -16369,0,3,0,14,0,2,1,2,0,1,8,0,0,24,0 -16370,4,3,0,1,6,2,3,5,4,0,7,1,3,8,1 -16371,2,2,0,12,1,0,9,2,3,1,12,12,3,4,0 -16372,3,3,0,1,4,3,14,2,2,0,6,13,1,31,0 -16373,0,2,0,14,4,3,1,2,2,1,9,19,4,23,1 -16374,7,5,0,2,6,3,10,3,2,1,13,11,3,22,0 -16375,1,5,0,12,0,0,8,2,3,1,5,0,2,4,0 -16376,4,4,0,13,0,0,9,2,4,1,0,17,0,4,0 -16377,1,5,0,13,6,0,2,3,2,0,6,0,2,2,0 -16378,2,5,0,15,1,1,12,0,1,1,14,17,1,41,1 -16379,7,2,0,7,4,0,7,0,3,1,5,9,4,26,1 -16380,7,6,0,14,4,2,6,5,1,0,18,10,3,1,1 -16381,9,3,0,0,0,5,12,4,1,0,17,20,0,12,0 -16382,10,1,0,1,6,0,10,0,1,1,12,1,2,35,1 -16383,8,7,0,4,6,3,5,1,0,0,6,18,0,30,0 -16384,9,4,0,13,1,5,7,3,4,0,7,20,4,9,0 -16385,2,1,0,6,5,6,12,5,3,1,15,2,2,13,0 -16386,4,7,0,3,3,1,3,3,1,0,8,13,1,4,0 
-16387,8,8,0,10,5,1,3,4,0,1,7,5,4,10,1 -16388,0,7,0,1,0,2,3,2,3,1,8,2,4,23,0 -16389,3,2,0,1,5,0,10,4,0,1,13,2,5,4,0 -16390,0,2,0,4,3,2,9,1,1,1,8,11,1,14,0 -16391,2,1,0,6,2,2,9,3,0,1,4,4,4,29,0 -16392,9,7,0,12,5,4,14,4,4,1,8,2,0,31,0 -16393,9,1,0,9,4,3,2,5,3,1,5,10,3,32,1 -16394,2,5,0,4,6,6,1,0,2,1,13,15,2,32,0 -16395,4,0,0,15,2,0,12,3,0,0,12,15,0,11,0 -16396,2,8,0,15,1,6,6,5,0,0,0,5,2,20,0 -16397,6,5,0,3,6,0,1,5,0,1,0,7,3,26,1 -16398,3,8,0,11,2,6,8,1,4,1,15,3,0,41,0 -16399,1,8,0,13,1,4,5,1,1,1,4,12,2,18,0 -16400,1,2,0,14,1,4,5,3,0,1,10,15,1,17,0 -16401,0,6,0,1,1,6,9,4,3,1,8,2,0,11,0 -16402,0,3,0,3,3,2,1,3,4,1,6,9,5,12,0 -16403,6,5,0,3,6,0,6,5,3,0,2,15,4,29,0 -16404,0,4,0,4,3,6,6,1,0,0,17,16,3,26,0 -16405,2,8,0,13,0,5,6,3,1,0,2,1,1,6,0 -16406,0,6,0,11,6,5,9,0,0,0,4,2,3,0,0 -16407,9,3,0,4,1,5,8,3,2,1,4,17,0,39,0 -16408,9,1,0,7,1,2,0,1,1,1,13,18,0,34,0 -16409,0,4,0,0,1,5,3,1,2,0,8,9,1,9,0 -16410,0,0,0,0,1,5,1,4,1,1,2,5,5,22,0 -16411,1,2,0,12,1,6,6,3,1,1,2,9,0,31,0 -16412,4,7,0,12,3,5,7,5,0,1,14,1,4,41,1 -16413,9,5,0,4,6,5,0,2,4,0,4,13,3,19,0 -16414,9,0,0,13,0,2,4,2,0,0,13,13,3,6,0 -16415,5,5,0,9,4,0,11,5,1,0,9,8,2,40,1 -16416,0,0,0,13,2,0,7,4,2,1,17,4,4,37,0 -16417,10,8,0,11,5,2,2,2,0,0,13,11,2,10,0 -16418,0,2,0,5,0,3,4,4,1,1,13,2,0,34,0 -16419,9,7,0,5,0,3,14,4,2,1,2,11,2,14,0 -16420,3,2,0,4,1,2,0,3,3,0,13,14,0,14,0 -16421,4,8,0,2,6,1,12,5,3,0,9,10,2,19,1 -16422,1,1,0,1,4,1,8,5,4,0,2,9,5,30,0 -16423,8,2,0,12,0,0,4,2,4,1,16,11,5,10,0 -16424,6,6,0,5,3,5,13,4,3,0,15,2,2,26,0 -16425,3,5,0,7,3,1,2,3,4,0,13,13,2,3,0 -16426,0,5,0,3,6,5,4,1,4,1,6,18,0,2,0 -16427,1,6,0,5,4,5,2,0,4,1,13,20,2,29,0 -16428,1,3,0,1,0,5,9,3,2,0,8,10,3,22,0 -16429,9,6,0,1,4,1,9,2,0,1,11,15,0,2,1 -16430,3,3,0,3,2,0,9,1,2,0,0,2,0,3,0 -16431,2,6,0,11,2,1,5,5,2,1,16,0,3,40,0 -16432,3,8,0,13,5,1,13,1,3,0,0,11,5,41,0 -16433,10,3,0,14,1,5,7,1,0,1,0,2,5,21,0 -16434,0,5,0,6,6,6,9,4,3,1,2,20,5,6,0 -16435,2,8,0,14,4,5,5,5,4,1,9,15,5,20,1 -16436,3,3,0,11,5,5,9,3,1,0,12,6,4,33,0 
-16437,4,6,0,15,1,3,5,0,3,1,13,0,3,7,0 -16438,9,1,0,7,6,6,0,2,3,1,18,17,1,24,1 -16439,10,3,0,10,1,6,13,1,3,0,9,5,2,40,1 -16440,7,8,0,15,1,2,4,0,0,0,7,8,3,25,1 -16441,8,6,0,2,4,2,7,0,1,0,16,3,4,34,1 -16442,0,2,0,0,6,0,6,0,3,1,6,11,5,33,0 -16443,6,4,0,12,5,5,6,5,1,0,3,7,3,27,1 -16444,3,5,0,1,0,5,8,0,1,0,11,15,3,30,0 -16445,5,2,0,7,0,1,12,5,4,1,5,16,2,30,1 -16446,9,4,0,12,0,5,2,3,1,0,2,1,3,29,0 -16447,9,0,0,3,1,4,7,0,4,0,9,5,3,37,1 -16448,0,4,0,4,6,1,2,1,0,0,11,2,5,2,0 -16449,1,0,0,13,0,5,14,0,2,0,13,6,4,39,0 -16450,5,6,0,13,1,3,5,1,0,0,17,11,1,17,0 -16451,3,2,0,10,0,5,13,2,4,0,13,11,5,17,0 -16452,2,6,0,6,1,5,9,0,2,0,15,16,2,3,0 -16453,2,7,0,10,1,2,14,4,2,0,1,11,3,23,1 -16454,1,3,0,14,2,6,3,0,4,1,13,16,5,7,0 -16455,6,6,0,14,4,1,8,3,2,1,12,18,1,9,0 -16456,7,8,0,10,4,0,2,2,0,0,18,18,4,22,1 -16457,0,2,0,3,4,6,4,3,3,0,12,11,2,30,0 -16458,7,2,0,8,3,2,3,0,1,1,9,16,0,30,1 -16459,10,1,0,6,1,6,5,3,2,1,13,2,4,35,0 -16460,1,8,0,13,5,3,11,1,2,0,1,2,5,8,0 -16461,8,6,0,9,3,5,14,0,4,0,16,5,4,28,1 -16462,6,1,0,6,3,6,13,2,3,0,9,17,4,2,1 -16463,3,3,0,1,0,6,5,4,0,0,13,11,0,32,0 -16464,0,3,0,15,5,3,9,4,0,0,4,11,2,25,0 -16465,10,8,0,14,6,5,11,2,0,0,9,11,3,41,0 -16466,2,8,0,6,3,3,12,5,2,1,13,1,5,3,1 -16467,9,0,0,6,6,3,0,0,3,0,13,13,2,32,0 -16468,2,2,0,8,0,2,7,5,0,0,15,9,0,31,0 -16469,6,2,0,13,4,5,8,3,0,1,2,20,4,19,0 -16470,3,0,0,14,2,1,8,4,3,1,6,2,0,18,0 -16471,1,3,0,0,2,3,7,4,0,1,10,10,5,21,1 -16472,4,7,0,6,4,0,12,3,0,0,8,2,1,33,0 -16473,1,2,0,1,1,5,1,1,4,1,2,20,0,3,0 -16474,8,2,0,3,0,2,0,4,0,1,13,11,3,5,0 -16475,3,6,0,8,6,4,9,1,2,1,2,18,2,12,0 -16476,0,8,0,14,2,5,10,2,4,1,17,14,5,3,0 -16477,9,3,0,15,2,6,2,0,2,0,12,14,0,41,0 -16478,8,7,0,11,6,4,13,2,0,1,17,11,1,35,0 -16479,9,2,0,2,0,4,7,4,3,1,4,2,1,18,0 -16480,3,1,0,10,5,5,13,0,2,0,0,9,0,23,0 -16481,1,0,0,1,0,5,14,3,2,0,8,20,0,38,0 -16482,7,6,0,6,6,4,8,1,2,1,8,20,2,36,0 -16483,0,7,0,4,6,0,13,4,2,1,0,13,0,40,0 -16484,3,6,0,6,1,0,8,2,4,1,8,2,4,5,0 -16485,2,3,0,1,2,6,11,5,1,0,2,15,1,14,0 -16486,8,1,0,6,4,1,14,5,2,1,1,5,1,24,1 
-16487,0,1,0,14,2,0,4,0,2,1,18,16,4,11,1 -16488,7,0,0,12,5,1,0,5,4,1,17,3,5,24,1 -16489,1,3,0,5,0,1,11,4,3,1,15,8,4,17,0 -16490,2,1,0,7,1,5,5,3,3,0,4,5,0,29,0 -16491,9,6,0,14,6,5,1,1,1,0,4,16,5,40,0 -16492,0,6,0,14,4,3,9,4,3,0,0,1,1,21,0 -16493,9,0,0,10,3,2,9,5,2,0,5,7,3,24,1 -16494,2,0,0,0,0,4,3,4,2,1,8,11,3,1,0 -16495,3,4,0,0,0,4,14,4,0,1,8,2,1,37,0 -16496,0,6,0,5,0,5,10,3,2,1,8,14,1,18,0 -16497,0,2,0,3,6,3,2,1,2,1,4,11,0,29,0 -16498,8,3,0,2,1,5,8,4,0,1,2,2,0,9,0 -16499,5,0,0,5,6,3,6,0,4,0,10,2,2,29,0 -16500,0,3,0,14,2,1,6,2,3,0,6,11,1,17,0 -16501,8,1,0,4,2,1,6,5,3,0,4,6,0,38,0 -16502,2,2,0,8,0,3,5,0,0,1,2,2,1,29,0 -16503,0,7,0,13,0,5,8,1,2,1,13,15,2,3,0 -16504,6,5,0,0,0,5,4,3,3,0,7,11,0,30,0 -16505,2,3,0,4,2,6,12,1,1,0,4,17,4,19,1 -16506,6,1,0,9,6,5,5,2,4,1,13,15,1,2,0 -16507,7,0,0,10,0,2,13,0,2,1,18,7,1,0,1 -16508,1,2,0,11,4,5,1,0,0,1,2,15,3,25,0 -16509,0,1,0,13,0,6,8,4,1,1,12,13,2,25,0 -16510,10,5,0,9,3,4,8,3,0,0,12,11,3,39,0 -16511,2,4,0,11,6,6,9,1,4,0,4,15,4,26,0 -16512,9,0,0,6,2,2,3,2,2,0,6,13,5,30,0 -16513,3,1,0,3,3,2,5,3,0,0,2,11,2,34,0 -16514,0,4,0,3,5,0,0,2,0,1,4,16,3,34,0 -16515,3,0,0,12,3,1,0,3,4,1,1,14,4,12,1 -16516,4,4,0,3,5,3,14,1,4,1,11,18,2,4,0 -16517,1,1,0,11,2,0,14,0,3,1,8,15,1,29,0 -16518,6,1,0,9,2,3,0,5,0,0,1,14,5,21,1 -16519,9,2,0,3,6,4,6,3,2,1,0,14,5,3,0 -16520,8,4,0,8,2,4,11,5,3,0,10,16,0,22,0 -16521,0,3,0,6,0,5,9,1,0,1,4,2,5,29,0 -16522,7,7,0,5,0,3,3,3,4,1,11,11,1,41,0 -16523,2,3,0,0,2,6,3,1,0,1,2,9,1,0,0 -16524,10,3,0,13,5,6,8,3,2,1,6,10,1,35,0 -16525,3,2,0,2,3,2,8,1,1,0,4,12,5,31,0 -16526,5,7,0,7,6,2,4,4,3,0,4,18,2,21,0 -16527,1,6,0,11,2,4,5,4,2,1,0,15,0,40,0 -16528,7,8,0,15,2,0,3,1,2,1,18,5,4,20,1 -16529,1,5,0,2,5,3,7,0,0,1,18,10,5,23,1 -16530,5,5,0,10,6,3,2,0,0,1,16,1,3,24,1 -16531,2,3,0,10,0,6,0,3,4,0,13,6,0,27,0 -16532,9,6,0,11,6,6,1,2,1,1,4,3,3,1,0 -16533,1,3,0,15,6,5,9,0,4,0,17,15,1,39,0 -16534,2,5,0,1,2,1,6,4,2,1,0,4,2,1,0 -16535,6,4,0,10,2,2,13,5,1,0,10,10,2,37,1 -16536,2,0,0,3,0,0,0,3,4,0,13,9,0,12,0 
-16537,9,2,0,15,2,6,5,3,2,0,2,4,2,17,0 -16538,1,6,0,5,4,3,6,4,0,1,15,11,1,23,0 -16539,7,4,0,9,3,5,14,0,3,0,0,5,1,10,1 -16540,8,3,0,8,0,5,5,3,1,0,14,6,3,36,0 -16541,2,4,0,13,1,2,4,0,0,0,17,4,1,33,0 -16542,0,7,0,7,1,5,7,3,1,0,8,20,4,7,0 -16543,5,2,0,6,2,4,8,1,2,0,7,13,0,27,0 -16544,9,2,0,12,0,5,10,2,1,1,4,15,0,25,0 -16545,4,4,0,10,0,5,8,3,4,1,4,6,2,27,0 -16546,9,7,0,3,0,4,14,4,2,1,14,2,0,41,0 -16547,3,7,0,5,3,4,13,3,0,0,13,12,5,1,0 -16548,3,2,0,15,0,0,10,2,3,1,8,2,2,11,0 -16549,0,5,0,14,6,6,4,0,3,1,11,7,3,16,1 -16550,9,4,0,14,3,4,10,3,3,1,17,2,0,14,0 -16551,1,0,0,10,2,5,5,1,0,0,10,11,0,16,0 -16552,0,8,0,5,5,3,7,3,2,1,16,6,0,18,0 -16553,0,2,0,3,6,3,5,5,1,0,13,11,3,31,0 -16554,7,2,0,8,3,4,13,3,0,0,18,7,5,32,1 -16555,1,3,0,0,2,4,11,5,2,0,2,2,4,14,0 -16556,9,1,0,13,6,3,9,3,2,1,6,2,3,34,0 -16557,0,7,0,5,0,0,13,0,2,0,3,18,4,27,0 -16558,7,5,0,7,3,3,4,5,4,1,3,10,5,36,1 -16559,2,6,0,15,0,3,14,2,3,0,8,11,1,31,0 -16560,1,4,0,8,0,0,12,2,1,1,2,4,1,18,0 -16561,0,0,0,3,0,5,5,4,4,1,12,2,0,17,0 -16562,0,1,0,8,3,6,3,4,2,1,13,2,0,11,0 -16563,9,8,0,5,1,3,6,2,0,1,8,11,1,36,0 -16564,0,8,0,1,6,1,8,4,0,0,1,18,2,35,1 -16565,0,4,0,0,2,1,9,1,2,0,17,3,4,40,0 -16566,9,4,0,8,0,1,7,5,2,0,2,9,1,12,0 -16567,0,3,0,7,1,3,14,3,0,0,6,16,3,17,0 -16568,7,5,0,10,2,5,10,0,3,0,1,10,1,17,1 -16569,2,2,0,6,4,4,1,4,0,0,13,15,0,8,0 -16570,0,1,0,1,5,6,0,3,4,0,10,11,0,34,0 -16571,8,8,0,11,6,4,6,5,2,1,2,13,1,24,0 -16572,8,2,0,6,6,0,14,1,2,0,13,2,0,5,0 -16573,0,6,0,3,1,4,13,2,2,0,2,13,4,29,0 -16574,4,0,0,2,6,4,0,0,2,0,14,11,1,19,0 -16575,5,4,0,2,6,6,10,4,0,1,13,6,3,30,0 -16576,10,3,0,1,5,4,1,4,2,0,2,11,5,20,0 -16577,1,5,0,5,2,2,1,1,1,1,18,8,4,20,1 -16578,2,8,0,3,2,2,3,5,4,0,4,2,1,27,0 -16579,10,1,0,4,3,2,9,2,2,0,2,4,1,30,0 -16580,5,6,0,10,2,5,5,0,0,1,14,1,1,39,1 -16581,8,2,0,1,6,3,12,5,0,1,18,10,3,35,1 -16582,1,4,0,11,1,5,0,3,4,0,0,11,2,28,0 -16583,10,2,0,6,3,0,2,2,3,0,0,9,0,32,0 -16584,6,6,0,11,3,3,2,4,1,1,11,6,4,5,0 -16585,1,3,0,13,6,0,3,3,0,1,15,5,0,23,0 -16586,2,4,0,4,5,6,0,3,3,1,4,2,5,16,0 
-16587,3,8,0,15,6,4,5,1,4,1,5,15,4,21,0 -16588,6,6,0,0,6,5,6,5,2,0,17,18,4,29,0 -16589,4,8,0,0,6,6,4,2,1,1,14,16,3,36,1 -16590,10,5,0,4,4,6,5,3,3,0,6,0,5,14,0 -16591,3,0,0,8,5,5,2,3,2,1,2,11,0,34,0 -16592,6,3,0,9,4,3,6,5,2,0,17,7,5,20,1 -16593,7,0,0,3,5,1,8,3,1,1,7,14,1,24,1 -16594,2,1,0,7,1,4,1,3,4,1,2,2,3,5,0 -16595,3,8,0,14,5,6,8,2,4,1,11,5,1,30,1 -16596,10,7,0,1,5,5,11,2,0,0,8,0,0,38,0 -16597,2,8,0,13,5,4,0,0,0,1,6,13,0,31,0 -16598,10,3,0,7,4,2,14,4,3,1,18,12,4,36,1 -16599,10,8,0,11,1,0,3,4,0,0,2,15,1,11,0 -16600,2,6,0,3,1,6,0,0,0,0,10,11,4,27,0 -16601,6,0,0,10,5,2,11,5,0,1,3,12,2,41,1 -16602,0,3,0,0,6,5,2,4,1,1,15,5,4,13,0 -16603,9,5,0,13,4,6,11,5,0,0,14,1,4,2,1 -16604,4,8,0,0,2,0,14,0,0,1,17,2,4,14,0 -16605,6,2,0,11,0,2,12,0,2,1,2,11,0,12,0 -16606,1,4,0,5,3,4,3,0,3,1,6,2,1,33,0 -16607,1,6,0,8,3,2,11,2,0,0,4,4,2,33,0 -16608,1,8,0,3,0,6,3,4,0,1,12,2,1,19,0 -16609,1,7,0,13,0,6,11,2,3,1,13,11,4,18,0 -16610,1,6,0,11,5,0,2,3,3,0,15,18,5,27,0 -16611,7,3,0,10,3,2,8,0,2,1,9,1,4,30,1 -16612,4,1,0,10,1,1,0,5,3,0,6,7,3,19,1 -16613,0,5,0,15,1,4,1,4,1,0,4,2,0,37,0 -16614,7,2,0,10,2,2,3,5,3,1,18,16,1,10,1 -16615,8,6,0,3,3,4,3,2,0,0,2,10,1,22,0 -16616,6,2,0,12,6,0,2,4,1,0,1,11,2,6,0 -16617,7,7,0,14,0,1,9,0,0,1,14,1,0,11,1 -16618,9,3,0,4,4,1,3,3,0,1,13,2,5,24,0 -16619,10,4,0,1,0,5,0,2,0,0,2,18,5,4,0 -16620,1,8,0,6,2,4,5,4,1,0,2,11,0,14,0 -16621,4,5,0,12,6,4,10,5,0,1,14,7,3,6,1 -16622,9,0,0,2,0,6,8,0,3,0,2,16,5,32,0 -16623,8,4,0,1,1,5,7,5,0,1,0,3,0,29,0 -16624,5,7,0,4,2,1,5,2,1,1,9,5,4,36,1 -16625,6,5,0,15,4,1,9,3,4,1,18,6,5,41,1 -16626,4,4,0,1,0,2,13,3,2,0,11,13,3,37,0 -16627,2,4,0,4,6,5,11,1,0,1,17,13,2,38,0 -16628,2,2,0,0,4,1,13,5,2,1,18,10,5,11,1 -16629,9,7,0,3,4,1,4,5,3,1,9,5,5,32,1 -16630,9,8,0,9,0,0,7,2,0,1,6,15,5,26,0 -16631,10,8,0,4,0,5,4,0,1,1,15,11,0,25,0 -16632,5,2,0,4,2,4,6,2,0,0,13,6,2,37,0 -16633,5,6,0,11,5,5,6,1,4,1,1,0,2,12,0 -16634,3,6,0,14,0,6,11,5,4,1,8,13,0,35,0 -16635,8,5,0,8,0,5,0,4,4,1,17,17,2,40,0 -16636,2,8,0,4,6,3,8,0,1,1,13,8,1,35,0 
-16637,1,8,0,5,4,4,12,3,0,1,7,10,3,0,1 -16638,9,6,0,2,1,3,9,2,2,0,16,11,5,4,0 -16639,1,7,0,3,2,2,4,4,4,0,2,6,0,26,0 -16640,1,6,0,3,5,1,8,2,2,1,4,11,1,14,0 -16641,8,4,0,9,3,0,6,1,4,1,9,12,1,3,1 -16642,6,1,0,2,2,4,14,5,3,1,5,8,1,31,1 -16643,1,5,0,3,0,0,14,3,0,1,10,20,0,21,0 -16644,5,3,0,1,0,2,13,5,4,1,3,11,5,12,0 -16645,9,1,0,6,4,1,10,2,2,0,10,17,2,36,1 -16646,1,2,0,13,1,6,7,1,4,0,2,2,2,9,0 -16647,2,3,0,14,1,0,12,2,1,0,4,2,4,13,0 -16648,8,8,0,5,2,5,2,0,0,0,17,6,5,34,0 -16649,4,1,0,13,5,5,5,0,3,0,8,13,5,16,0 -16650,0,8,0,15,0,5,9,0,0,1,13,11,5,4,0 -16651,2,8,0,3,1,2,12,5,0,0,18,7,5,40,1 -16652,9,0,0,13,0,3,6,3,4,0,2,11,4,0,0 -16653,6,5,0,0,5,0,0,4,1,1,13,11,3,30,0 -16654,6,0,0,11,3,5,13,1,0,1,14,10,2,2,1 -16655,7,6,0,15,3,1,12,1,1,0,18,17,2,21,1 -16656,1,2,0,13,0,6,6,3,2,0,12,1,0,30,0 -16657,2,4,0,15,0,1,0,0,1,0,0,2,1,26,0 -16658,6,1,0,5,0,2,3,2,2,1,13,2,5,29,0 -16659,0,6,0,6,4,4,4,2,0,0,2,13,1,36,0 -16660,10,0,0,11,4,0,12,3,3,0,8,19,1,14,0 -16661,1,7,0,12,6,2,9,3,4,0,12,11,0,23,0 -16662,1,8,0,1,6,3,12,0,3,0,17,4,4,28,0 -16663,7,6,0,12,2,3,1,4,3,0,3,1,4,28,1 -16664,8,0,0,3,4,5,12,4,2,1,18,5,1,23,1 -16665,0,6,0,6,0,2,11,3,2,0,13,11,0,40,0 -16666,6,4,0,13,5,4,11,2,4,0,0,20,2,5,0 -16667,3,3,0,5,5,0,8,4,1,1,2,9,3,10,0 -16668,3,0,0,0,2,5,12,1,3,0,4,0,5,26,0 -16669,2,3,0,13,6,4,12,4,1,1,4,4,1,16,0 -16670,0,3,0,3,0,0,5,3,1,1,13,20,0,17,0 -16671,3,2,0,1,2,3,11,5,3,1,18,16,5,40,1 -16672,5,7,0,3,5,4,0,3,0,1,14,2,0,4,0 -16673,5,1,0,15,0,0,4,5,3,1,0,2,5,4,0 -16674,7,3,0,6,3,2,5,0,4,1,13,9,1,18,0 -16675,3,5,0,11,0,2,3,1,4,1,2,6,5,7,0 -16676,0,7,0,3,3,1,0,3,0,0,17,11,5,20,0 -16677,9,7,0,0,1,4,14,4,2,0,15,1,5,31,1 -16678,1,0,0,12,5,0,13,0,0,0,14,11,4,30,0 -16679,5,4,0,10,0,1,3,0,0,1,18,8,3,9,1 -16680,4,6,0,9,1,1,0,1,2,0,14,7,0,29,1 -16681,9,2,0,7,3,4,4,2,4,0,15,18,0,32,0 -16682,3,3,0,5,1,1,13,3,3,0,8,11,2,13,0 -16683,7,5,0,7,1,1,6,1,0,1,8,11,2,29,0 -16684,0,1,0,8,6,1,9,4,4,0,7,15,5,11,0 -16685,5,5,0,9,3,4,2,1,1,1,18,7,0,7,1 -16686,1,6,0,11,2,5,12,3,4,0,13,11,2,7,0 
-16687,10,8,0,8,4,1,5,5,4,1,7,17,5,18,1 -16688,3,1,0,15,6,4,1,3,2,1,2,6,0,38,0 -16689,1,6,0,6,0,0,9,1,1,0,7,2,2,17,0 -16690,4,8,0,13,6,6,8,3,4,1,9,18,1,40,0 -16691,3,5,0,10,4,0,12,1,4,0,8,15,1,10,0 -16692,7,6,0,12,3,1,2,2,3,1,6,18,3,32,0 -16693,7,5,0,10,2,0,14,3,2,1,14,7,5,18,1 -16694,1,6,0,12,2,4,12,3,2,0,2,13,2,30,0 -16695,0,6,0,14,0,3,14,1,4,1,15,3,0,8,0 -16696,6,5,0,5,4,0,11,1,1,1,5,17,4,29,1 -16697,8,4,0,7,5,4,9,3,1,0,4,2,0,41,0 -16698,1,8,0,13,0,3,6,5,4,1,17,7,1,18,0 -16699,9,7,0,8,6,4,5,1,3,1,3,19,0,17,0 -16700,9,0,0,11,0,6,9,0,1,1,11,15,1,1,0 -16701,0,2,0,9,1,4,3,4,3,0,2,15,1,8,0 -16702,1,5,0,14,5,0,6,1,1,1,2,6,0,36,0 -16703,2,8,0,6,6,3,4,1,4,1,8,13,2,3,0 -16704,4,4,0,3,2,6,5,0,4,1,11,8,5,28,1 -16705,9,2,0,8,4,2,0,5,2,0,16,7,2,24,1 -16706,8,8,0,10,2,0,6,4,2,1,15,14,2,11,1 -16707,5,4,0,1,4,1,0,3,3,1,4,19,4,12,1 -16708,4,1,0,13,1,0,1,3,2,1,7,17,0,4,1 -16709,0,7,0,1,3,6,6,4,1,0,7,14,4,21,0 -16710,2,0,0,5,2,5,6,3,3,1,14,17,1,13,0 -16711,9,5,0,5,0,1,14,4,1,0,12,0,3,26,0 -16712,8,0,0,5,0,2,5,5,1,0,12,15,1,7,0 -16713,2,1,0,8,2,6,0,0,0,0,5,14,3,9,1 -16714,0,7,0,4,6,0,7,1,3,0,4,11,4,12,0 -16715,1,2,0,8,0,0,6,4,2,1,2,15,0,18,0 -16716,10,1,0,2,1,2,4,4,3,1,14,17,1,20,1 -16717,9,6,0,12,5,0,13,3,3,1,9,10,3,16,1 -16718,3,4,0,8,4,1,8,2,2,0,13,7,1,11,0 -16719,8,8,0,4,6,0,5,4,0,1,6,11,0,11,0 -16720,1,8,0,8,0,4,5,3,0,0,2,9,3,3,0 -16721,0,5,0,14,5,4,9,0,0,1,12,2,2,36,0 -16722,5,5,0,14,6,6,4,3,4,0,2,4,2,24,0 -16723,10,5,0,3,1,5,5,3,0,1,8,11,2,3,0 -16724,2,4,0,14,5,0,14,4,4,1,13,3,5,30,0 -16725,6,6,0,1,2,4,6,1,2,1,14,11,2,12,0 -16726,2,0,0,15,0,4,7,2,4,0,2,18,3,0,0 -16727,3,2,0,9,4,2,3,2,0,0,4,11,2,21,0 -16728,0,7,0,2,5,5,9,1,1,0,12,8,3,26,0 -16729,4,6,0,13,0,0,11,3,3,1,2,3,1,21,0 -16730,2,7,0,0,5,0,11,0,0,1,8,11,3,2,0 -16731,3,7,0,5,6,5,12,4,2,0,3,2,1,2,0 -16732,4,6,0,13,2,3,0,1,3,0,17,18,4,12,0 -16733,10,2,0,8,0,2,2,1,2,1,17,16,5,36,0 -16734,1,4,0,2,0,3,11,3,3,0,8,4,0,12,0 -16735,10,6,0,6,0,3,6,0,0,1,17,9,3,21,0 -16736,9,2,0,7,0,6,0,4,3,0,3,4,5,2,0 
-16737,1,0,0,3,2,1,7,2,0,1,13,13,2,19,0 -16738,9,1,0,1,3,4,2,2,0,1,14,7,2,27,1 -16739,10,0,0,1,6,5,8,0,3,0,2,9,2,4,0 -16740,6,4,0,1,6,4,8,0,4,0,2,11,5,40,0 -16741,3,2,0,2,3,4,14,4,3,1,2,11,4,36,0 -16742,2,8,0,1,6,5,6,2,4,0,13,2,5,8,0 -16743,0,7,0,4,1,5,6,1,0,1,12,17,2,0,0 -16744,6,6,0,8,1,2,14,2,2,0,8,4,2,13,0 -16745,4,0,0,0,3,5,8,4,1,1,15,6,0,41,0 -16746,6,0,0,5,6,0,10,5,0,0,9,14,1,1,1 -16747,3,6,0,11,6,6,8,3,0,1,2,15,2,18,0 -16748,3,7,0,11,2,1,5,3,1,1,2,17,1,29,0 -16749,4,3,0,8,6,5,8,5,0,1,12,12,4,20,1 -16750,8,1,0,1,6,4,2,4,1,0,17,9,1,33,0 -16751,5,2,0,14,4,4,6,4,2,1,7,2,2,10,0 -16752,10,7,0,11,6,6,1,4,2,0,6,5,3,25,0 -16753,6,6,0,11,1,6,0,2,3,0,17,12,3,4,0 -16754,0,1,0,15,1,3,14,4,3,0,8,11,3,11,0 -16755,2,8,0,8,2,6,11,3,1,1,17,5,0,10,0 -16756,4,0,0,10,5,2,7,4,2,1,2,2,5,27,0 -16757,1,2,0,4,6,4,5,4,4,0,17,9,2,32,0 -16758,2,6,0,13,0,4,14,2,0,0,2,17,3,25,0 -16759,0,7,0,8,3,2,3,2,0,1,4,5,1,4,0 -16760,6,8,0,3,1,0,14,2,0,1,12,0,5,27,0 -16761,10,4,0,6,4,0,4,5,1,1,6,3,3,0,1 -16762,1,6,0,2,2,5,10,4,0,0,5,12,2,3,1 -16763,0,4,0,9,6,6,8,5,0,1,18,1,2,28,1 -16764,7,7,0,3,0,3,5,5,2,0,18,1,5,9,1 -16765,5,2,0,13,3,3,11,5,0,1,3,5,3,8,1 -16766,4,3,0,4,0,4,6,1,0,1,3,5,1,24,1 -16767,1,4,0,0,5,3,8,4,0,1,2,0,4,2,0 -16768,6,3,0,3,3,2,5,2,2,0,2,11,2,21,0 -16769,2,4,0,15,5,3,7,1,1,1,13,6,0,25,0 -16770,8,0,0,5,1,5,2,2,4,0,2,19,1,22,0 -16771,8,1,0,10,2,5,1,4,0,1,2,0,3,33,0 -16772,8,6,0,14,0,6,4,0,4,0,18,7,5,3,1 -16773,10,6,0,13,5,0,6,2,2,1,8,6,4,34,0 -16774,5,6,0,2,5,2,8,5,1,1,14,16,3,14,1 -16775,1,6,0,13,6,6,2,0,0,1,2,9,5,3,0 -16776,4,7,0,3,4,1,4,0,1,0,10,16,3,2,1 -16777,2,8,0,6,1,5,9,2,1,0,17,2,2,0,0 -16778,8,8,0,5,3,5,4,2,3,1,18,17,5,24,1 -16779,6,1,0,7,2,2,3,2,3,0,18,1,3,27,1 -16780,2,8,0,5,6,6,0,1,2,0,16,15,0,31,0 -16781,8,6,0,12,4,5,12,0,0,1,16,7,1,20,1 -16782,5,4,0,8,1,6,1,1,2,1,9,20,1,11,1 -16783,1,4,0,4,0,6,11,3,0,1,15,12,0,13,0 -16784,8,7,0,13,3,3,2,1,4,1,9,9,2,38,0 -16785,0,3,0,2,0,1,2,2,2,1,3,3,1,12,1 -16786,6,0,0,10,2,5,13,1,0,0,13,19,5,24,0 
-16787,1,0,0,5,2,4,6,4,0,1,13,2,5,12,0 -16788,8,6,0,1,6,0,3,0,3,0,8,4,3,9,0 -16789,0,0,0,8,1,5,11,3,4,0,13,20,3,3,0 -16790,6,4,0,8,4,5,12,1,0,1,5,1,4,17,1 -16791,6,7,0,0,4,4,12,0,1,0,17,9,4,14,0 -16792,6,3,0,6,6,4,12,5,2,0,2,6,0,11,0 -16793,7,6,0,15,3,4,14,1,4,0,13,6,2,34,0 -16794,8,5,0,10,2,5,2,0,4,1,3,6,5,29,0 -16795,6,1,0,4,4,0,4,4,2,1,8,9,3,17,0 -16796,2,6,0,11,3,0,4,5,2,0,13,2,0,0,0 -16797,1,8,0,15,0,6,10,0,0,1,13,18,5,2,0 -16798,5,3,0,2,2,2,12,3,1,1,14,7,4,2,1 -16799,9,4,0,11,3,5,13,1,2,0,14,5,4,40,1 -16800,3,2,0,10,0,5,1,2,2,1,17,11,4,39,0 -16801,2,5,0,5,0,4,8,1,2,0,2,2,0,28,0 -16802,1,7,0,14,5,2,4,1,2,1,9,8,3,1,1 -16803,0,8,0,13,3,0,10,2,2,0,17,16,1,17,0 -16804,9,1,0,4,1,4,7,2,1,1,8,11,3,36,0 -16805,2,7,0,8,3,5,1,0,1,0,2,9,0,36,0 -16806,2,8,0,9,0,2,8,5,0,0,10,9,4,30,0 -16807,1,3,0,13,1,2,9,5,1,1,18,16,2,40,1 -16808,1,4,0,2,0,4,0,2,1,0,13,2,1,7,0 -16809,0,0,0,10,0,0,14,4,2,1,6,18,3,8,0 -16810,7,3,0,7,0,2,14,0,1,0,8,15,0,13,0 -16811,7,4,0,7,5,4,13,2,2,1,2,2,0,11,0 -16812,10,8,0,5,4,1,11,2,2,1,9,19,3,18,1 -16813,4,6,0,6,1,1,13,5,0,1,11,18,2,34,1 -16814,2,7,0,12,3,1,3,1,0,1,8,2,1,25,0 -16815,3,4,0,9,1,1,1,1,2,0,1,15,5,4,0 -16816,10,4,0,13,6,4,3,2,0,1,8,9,2,1,0 -16817,0,6,0,11,5,1,9,0,3,1,11,11,4,31,0 -16818,6,3,0,7,5,1,1,3,3,1,18,2,3,31,1 -16819,10,8,0,1,0,5,7,4,4,0,12,11,3,21,0 -16820,2,2,0,3,1,1,2,2,0,1,6,6,2,19,0 -16821,0,8,0,0,1,1,9,3,4,1,18,12,2,41,1 -16822,1,0,0,5,0,4,9,4,2,0,2,11,0,37,0 -16823,7,4,0,2,4,2,0,4,2,0,16,5,4,31,1 -16824,4,8,0,8,5,2,13,1,0,0,9,5,1,2,1 -16825,1,3,0,11,0,0,0,1,0,1,17,2,4,3,0 -16826,0,6,0,15,0,5,2,4,1,0,3,13,2,24,0 -16827,0,8,0,13,6,4,8,5,1,1,4,15,3,37,0 -16828,0,0,0,4,5,3,13,5,1,0,2,11,0,3,0 -16829,1,5,0,0,6,0,2,2,1,1,12,11,2,7,0 -16830,6,6,0,10,0,5,1,3,0,1,15,12,1,0,0 -16831,5,2,0,14,2,3,14,5,4,0,3,16,5,3,1 -16832,2,2,0,15,1,4,8,4,2,1,4,2,3,1,0 -16833,5,7,0,9,0,1,11,4,0,0,8,19,5,21,1 -16834,0,6,0,7,5,4,5,1,0,0,2,11,5,6,0 -16835,2,4,0,13,2,4,7,0,0,1,2,9,0,19,0 -16836,0,5,0,5,2,5,5,3,0,1,8,2,2,24,0 
-16837,10,4,0,10,2,6,7,5,4,1,14,5,1,28,1 -16838,0,4,0,11,4,1,5,4,0,1,8,0,2,31,0 -16839,8,5,0,15,5,6,0,2,4,0,7,9,1,25,0 -16840,7,4,0,0,2,2,12,5,3,0,5,3,1,4,1 -16841,9,2,0,10,3,1,9,4,2,1,18,10,1,22,1 -16842,0,1,0,0,5,4,8,0,2,0,17,19,1,24,0 -16843,8,6,0,0,2,1,8,4,3,0,11,9,5,26,0 -16844,0,3,0,13,5,5,9,5,2,0,2,6,1,12,0 -16845,3,3,0,4,5,5,14,4,3,1,12,9,0,28,0 -16846,5,8,0,5,0,3,14,4,0,0,10,6,3,29,0 -16847,5,7,0,8,4,4,8,2,1,0,13,15,3,35,0 -16848,8,2,0,7,3,2,11,3,4,0,7,7,1,25,1 -16849,8,6,0,10,5,3,6,4,0,0,15,9,2,28,0 -16850,2,8,0,0,6,4,5,3,1,0,15,4,5,2,0 -16851,10,0,0,0,6,0,14,4,1,0,6,3,4,25,1 -16852,6,1,0,2,1,1,2,5,2,1,1,13,5,24,1 -16853,0,2,0,5,0,6,6,3,2,1,8,20,4,33,0 -16854,0,6,0,5,5,5,1,4,0,1,11,11,2,33,0 -16855,0,8,0,3,0,3,1,1,1,0,8,11,1,4,0 -16856,4,1,0,9,5,5,5,3,3,1,17,6,4,6,0 -16857,6,8,0,13,1,2,5,5,4,0,8,0,1,0,0 -16858,3,6,0,1,0,3,4,1,0,0,8,9,3,18,0 -16859,1,5,0,5,5,5,13,2,4,0,18,10,0,35,1 -16860,3,2,0,7,6,6,6,5,3,1,1,6,1,27,0 -16861,9,5,0,13,2,4,0,5,2,1,18,19,0,1,1 -16862,0,5,0,1,0,4,5,1,1,0,4,15,5,34,0 -16863,2,2,0,1,2,2,12,1,2,0,13,11,5,12,0 -16864,2,6,0,11,1,2,11,3,4,0,13,17,3,37,0 -16865,0,5,0,11,6,6,2,2,2,1,13,7,4,28,0 -16866,1,2,0,4,2,5,10,3,4,0,2,2,0,19,0 -16867,2,2,0,12,0,4,3,0,0,1,8,15,0,38,0 -16868,0,6,0,1,4,5,10,1,4,0,8,6,4,32,0 -16869,0,8,0,1,0,2,5,5,1,0,13,18,2,4,0 -16870,1,2,0,0,1,0,13,5,4,0,7,20,3,37,0 -16871,9,0,0,3,2,4,10,5,3,0,12,7,2,26,1 -16872,8,6,0,7,1,6,4,1,0,0,2,2,5,17,0 -16873,3,0,0,6,3,5,1,0,1,1,6,5,5,25,1 -16874,4,7,0,8,0,3,3,3,1,1,2,11,0,32,0 -16875,2,7,0,5,1,5,9,2,0,1,4,14,4,24,0 -16876,3,6,0,7,4,6,11,3,0,1,9,10,1,40,1 -16877,8,4,0,0,0,5,4,5,4,1,18,7,2,36,1 -16878,3,1,0,14,5,5,10,5,0,0,8,15,0,28,0 -16879,3,4,0,3,6,0,5,0,3,1,2,13,3,23,0 -16880,4,1,0,13,2,0,4,1,3,1,8,0,5,2,0 -16881,0,1,0,11,1,1,0,0,2,1,17,2,1,35,0 -16882,4,3,0,7,4,1,14,5,0,1,18,14,5,19,1 -16883,6,8,0,4,6,0,8,2,2,0,17,11,1,5,0 -16884,10,1,0,9,0,0,13,4,3,1,6,0,5,19,0 -16885,4,0,0,13,6,0,6,1,3,0,2,2,1,11,0 -16886,5,3,0,3,0,0,5,4,2,0,13,15,0,30,0 
-16887,0,3,0,9,5,5,5,0,0,1,2,0,0,0,0 -16888,3,5,0,7,1,5,9,3,4,0,11,1,3,31,1 -16889,3,5,0,5,4,3,6,0,3,1,6,5,1,2,1 -16890,7,5,0,7,0,1,6,2,1,0,13,11,0,27,0 -16891,9,2,0,12,2,2,4,5,0,0,9,8,4,13,1 -16892,9,2,0,2,2,6,3,3,0,1,18,13,4,0,1 -16893,0,8,0,0,0,2,8,0,3,0,8,17,3,22,0 -16894,2,7,0,1,0,1,8,0,2,0,8,11,1,3,0 -16895,7,3,0,6,2,2,3,1,2,1,8,12,1,34,0 -16896,5,0,0,9,0,5,8,0,1,0,15,2,0,4,0 -16897,7,3,0,15,6,0,6,3,2,1,13,12,0,26,0 -16898,1,2,0,7,0,2,4,1,4,1,10,11,1,6,0 -16899,2,8,0,9,2,2,5,0,0,1,7,12,4,19,1 -16900,1,3,0,4,5,1,9,4,0,0,13,12,5,10,0 -16901,0,7,0,6,2,5,4,0,0,1,15,20,5,37,0 -16902,2,1,0,5,1,0,6,3,0,0,15,13,1,34,0 -16903,10,5,0,2,5,3,13,3,2,1,12,16,2,20,0 -16904,6,7,0,12,0,1,0,1,0,0,18,5,0,21,1 -16905,7,8,0,1,2,1,1,2,1,0,4,20,1,9,0 -16906,3,4,0,12,0,6,12,3,3,1,13,6,2,36,0 -16907,3,7,0,15,2,0,10,5,4,0,14,10,2,34,1 -16908,0,8,0,3,1,4,6,1,2,1,2,4,3,28,0 -16909,0,7,0,1,4,0,0,4,3,0,12,9,4,35,0 -16910,0,4,0,0,3,5,0,4,4,0,10,2,2,31,0 -16911,0,5,0,5,2,4,7,3,2,0,0,20,5,29,0 -16912,2,6,0,5,2,2,6,2,3,1,2,9,2,18,0 -16913,0,6,0,4,0,0,0,1,3,0,15,6,0,4,0 -16914,2,2,0,2,2,5,7,4,4,1,13,6,3,19,0 -16915,3,1,0,13,0,5,0,3,0,1,13,15,2,3,0 -16916,1,8,0,11,0,1,5,0,1,0,2,3,5,2,0 -16917,3,8,0,3,6,0,8,3,0,1,4,16,2,0,0 -16918,4,4,0,12,3,0,8,5,4,0,0,19,2,34,1 -16919,0,3,0,7,4,2,12,1,2,0,18,20,2,16,1 -16920,10,4,0,2,6,6,7,1,1,0,18,7,4,41,1 -16921,9,4,0,6,4,1,0,1,2,0,13,15,5,3,0 -16922,0,7,0,12,3,0,5,4,2,0,2,11,0,21,0 -16923,0,0,0,15,0,4,0,1,3,0,12,3,1,6,0 -16924,3,8,0,6,0,6,8,3,1,1,12,16,0,14,0 -16925,1,1,0,5,3,1,13,2,2,0,18,7,5,13,1 -16926,0,7,0,3,6,0,9,1,2,0,17,2,0,16,0 -16927,2,2,0,3,0,4,7,3,1,1,2,2,3,26,0 -16928,10,4,0,0,1,6,1,1,0,0,4,6,1,22,0 -16929,5,5,0,3,4,5,8,0,3,0,9,6,5,13,0 -16930,8,8,0,1,1,1,13,0,4,1,9,7,3,11,1 -16931,1,0,0,6,6,6,5,1,3,0,10,11,1,16,0 -16932,1,6,0,4,6,0,9,0,4,1,2,8,0,3,0 -16933,5,7,0,11,0,5,4,4,1,0,14,20,0,7,0 -16934,1,1,0,14,0,3,2,2,2,1,2,11,2,25,0 -16935,5,6,0,10,4,3,7,3,0,0,6,2,3,3,0 -16936,4,6,0,1,0,3,13,3,0,0,9,18,5,25,0 -16937,7,8,0,10,5,2,2,3,3,1,4,4,0,3,0 
-16938,4,0,0,9,4,0,10,0,0,0,2,1,1,7,0 -16939,2,6,0,6,0,0,6,3,0,0,2,11,2,29,0 -16940,4,7,0,3,3,3,8,1,0,0,9,2,2,23,0 -16941,7,7,0,1,3,0,7,5,2,1,18,16,3,31,1 -16942,0,2,0,6,1,3,7,2,2,1,4,5,3,27,0 -16943,9,7,0,4,4,5,14,0,3,0,16,15,0,29,0 -16944,7,3,0,7,4,1,10,0,3,0,1,7,1,39,1 -16945,3,1,0,1,1,6,12,2,3,1,12,16,2,39,0 -16946,2,8,0,6,0,6,4,5,2,1,0,20,1,26,0 -16947,7,4,0,0,2,5,7,3,1,0,14,15,4,32,0 -16948,4,8,0,13,0,1,11,1,2,1,2,7,2,31,0 -16949,1,2,0,6,2,1,8,0,3,0,6,0,0,0,1 -16950,10,3,0,3,2,5,11,4,1,0,13,11,5,14,0 -16951,3,8,0,8,6,6,10,0,4,1,7,11,0,16,0 -16952,10,4,0,1,5,5,8,0,2,0,4,2,5,9,0 -16953,2,3,0,3,1,4,11,5,2,0,18,12,2,3,1 -16954,2,4,0,0,0,5,9,0,2,1,13,11,3,31,0 -16955,3,5,0,5,0,4,5,3,4,1,4,2,4,23,0 -16956,7,2,0,14,5,1,4,3,0,1,18,1,5,17,1 -16957,10,7,0,10,6,4,12,5,1,1,0,11,0,8,0 -16958,7,3,0,6,1,0,9,1,3,1,2,6,1,37,0 -16959,4,0,0,4,3,3,13,5,0,1,0,8,3,41,1 -16960,8,0,0,9,1,0,3,5,3,1,10,7,5,36,1 -16961,2,0,0,10,4,5,8,1,4,1,2,11,3,4,0 -16962,2,5,0,5,2,5,3,4,4,1,8,11,1,26,0 -16963,4,8,0,3,4,6,4,3,1,1,4,0,5,26,0 -16964,2,5,0,8,4,0,8,2,0,0,2,17,0,13,0 -16965,2,8,0,6,6,6,0,0,4,0,16,14,3,40,1 -16966,5,2,0,5,5,5,5,1,3,1,8,2,3,35,0 -16967,9,6,0,3,3,3,3,2,4,1,5,17,3,23,1 -16968,1,8,0,13,5,2,5,0,1,1,12,6,3,3,1 -16969,5,2,0,13,4,5,1,3,2,0,2,13,2,12,0 -16970,2,6,0,6,6,4,9,3,4,0,2,0,1,22,0 -16971,9,7,0,11,5,3,2,2,3,1,15,3,0,9,0 -16972,0,8,0,3,5,4,13,1,1,1,4,17,0,5,0 -16973,2,4,0,10,1,4,8,3,2,1,15,15,2,26,0 -16974,1,7,0,8,1,1,9,1,2,0,10,11,1,19,0 -16975,2,2,0,5,0,3,8,4,0,1,8,11,1,29,0 -16976,2,5,0,13,3,4,5,4,2,1,13,11,4,21,0 -16977,6,7,0,13,2,5,4,1,1,1,17,18,1,11,0 -16978,10,7,0,9,5,5,14,4,3,1,4,20,0,26,0 -16979,0,4,0,15,5,4,6,3,0,1,10,20,1,4,0 -16980,0,4,0,9,0,4,7,1,4,1,6,11,4,34,0 -16981,6,2,0,12,3,3,13,3,1,0,9,7,1,23,1 -16982,3,1,0,4,0,1,10,1,0,0,13,15,2,23,0 -16983,2,8,0,6,3,5,1,4,4,0,2,14,0,37,0 -16984,8,1,0,15,4,2,4,5,3,0,18,16,1,4,1 -16985,3,4,0,3,5,5,8,5,0,0,16,0,2,7,0 -16986,0,2,0,2,6,5,9,4,2,0,10,16,4,13,0 -16987,2,5,0,2,5,5,6,5,3,1,12,6,0,41,0 
-16988,5,7,0,5,2,6,9,1,1,1,2,11,3,12,0 -16989,4,5,0,11,0,5,13,4,3,1,2,2,1,18,0 -16990,8,5,0,5,0,0,5,3,3,0,4,9,1,29,0 -16991,3,6,0,3,2,6,12,1,4,1,8,2,5,5,0 -16992,3,0,0,2,0,4,11,1,2,1,2,13,1,22,0 -16993,4,0,0,6,6,5,5,3,2,0,12,9,1,34,0 -16994,2,0,0,1,3,4,1,5,1,0,18,1,1,23,1 -16995,10,3,0,7,1,5,14,1,2,1,16,8,5,41,1 -16996,9,0,0,0,0,4,6,3,0,0,13,2,3,8,0 -16997,1,3,0,7,6,4,2,2,4,0,6,13,5,40,0 -16998,1,6,0,12,1,1,1,4,3,0,18,18,2,28,1 -16999,9,4,0,12,4,1,11,4,1,0,16,1,4,21,1 -17000,5,7,0,3,5,5,0,1,0,0,8,0,2,8,0 -17001,3,2,0,12,3,5,0,2,1,1,18,2,3,12,0 -17002,7,7,0,3,1,3,7,4,1,0,6,2,3,16,0 -17003,0,5,0,13,4,4,5,0,0,0,2,11,2,17,0 -17004,10,6,0,15,5,3,9,3,1,1,8,6,0,32,0 -17005,9,1,0,8,4,1,11,5,0,0,18,19,5,8,1 -17006,3,8,0,4,0,4,11,0,1,1,8,5,5,3,0 -17007,10,0,0,11,2,6,0,3,1,0,8,16,1,17,0 -17008,10,5,0,10,4,5,6,2,3,0,13,15,4,30,0 -17009,0,0,0,7,0,4,11,3,3,1,8,15,4,13,0 -17010,3,2,0,2,3,3,14,3,4,1,9,1,2,17,1 -17011,0,2,0,8,1,1,7,1,0,1,9,12,5,10,0 -17012,7,6,0,10,1,0,4,2,1,1,15,8,3,3,1 -17013,5,4,0,8,2,2,14,1,2,0,12,15,0,16,0 -17014,9,1,0,14,5,2,4,0,4,0,5,19,5,6,1 -17015,4,1,0,2,1,1,0,1,3,1,11,7,5,41,1 -17016,5,3,0,9,5,5,8,2,1,0,8,14,0,28,0 -17017,9,1,0,14,0,6,8,2,0,1,11,11,0,5,0 -17018,5,3,0,15,0,2,9,2,3,0,12,15,2,8,0 -17019,7,3,0,5,4,4,3,3,1,1,9,8,1,41,1 -17020,10,3,0,2,1,1,6,0,3,1,17,2,3,13,0 -17021,7,5,0,6,0,6,12,4,4,0,13,11,4,5,0 -17022,0,7,0,10,1,3,13,5,3,0,0,6,1,24,0 -17023,5,2,0,0,2,0,9,1,3,0,15,6,5,25,0 -17024,8,0,0,11,2,1,11,0,3,1,2,2,0,30,0 -17025,0,8,0,3,6,5,2,3,4,1,8,15,1,6,0 -17026,1,1,0,12,1,1,11,1,0,0,2,16,2,36,0 -17027,1,1,0,3,0,6,9,1,0,0,0,4,2,29,0 -17028,0,8,0,2,0,0,5,0,0,1,2,6,0,0,0 -17029,6,6,0,1,1,5,7,2,4,1,16,11,5,31,0 -17030,0,5,0,2,0,5,12,2,3,1,2,19,5,2,0 -17031,2,2,0,13,2,0,4,4,1,0,2,9,2,4,0 -17032,0,8,0,8,6,3,5,4,0,0,16,20,0,38,0 -17033,9,7,0,3,5,3,4,1,0,1,2,6,5,4,0 -17034,9,7,0,12,6,4,2,1,2,0,2,14,0,21,0 -17035,2,7,0,15,4,4,4,0,4,1,17,4,4,5,0 -17036,3,4,0,15,5,6,0,3,3,1,16,11,5,18,0 -17037,6,1,0,14,4,5,9,2,4,1,18,18,1,8,1 
-17038,1,8,0,15,3,2,6,0,4,0,7,19,2,40,0 -17039,0,2,0,3,0,4,7,1,1,1,13,2,0,25,0 -17040,9,0,0,12,0,1,7,5,3,1,18,4,4,37,1 -17041,1,5,0,1,1,0,3,2,4,1,2,6,0,24,0 -17042,6,4,0,9,4,0,2,4,0,1,18,13,3,9,1 -17043,7,7,0,13,1,1,2,2,0,1,14,5,5,6,1 -17044,2,7,0,11,5,4,2,1,4,1,9,10,1,30,1 -17045,0,2,0,0,6,4,14,3,0,0,10,0,4,21,0 -17046,3,5,0,12,1,5,0,1,0,1,12,10,2,4,0 -17047,6,1,0,10,4,1,11,4,3,1,11,7,2,34,1 -17048,9,4,0,5,0,6,9,0,4,1,4,4,3,23,0 -17049,0,8,0,2,0,3,14,4,4,1,2,3,1,25,0 -17050,3,3,0,11,2,5,0,4,0,0,8,6,0,3,0 -17051,1,1,0,15,2,0,6,4,0,0,8,2,5,39,0 -17052,7,3,0,14,1,4,1,1,1,0,7,10,4,7,1 -17053,9,8,0,5,3,0,7,3,0,1,1,13,1,22,0 -17054,5,8,0,7,5,2,6,1,0,0,13,6,4,36,0 -17055,4,3,0,9,6,0,9,1,3,1,12,20,5,37,0 -17056,3,6,0,0,6,1,11,1,4,0,8,15,4,4,0 -17057,2,0,0,7,5,0,2,2,3,1,13,6,3,19,0 -17058,10,1,0,12,1,3,10,4,0,0,8,15,0,13,0 -17059,0,7,0,1,1,4,12,0,1,1,14,12,1,3,0 -17060,5,6,0,4,1,6,13,4,1,0,1,13,1,0,0 -17061,10,7,0,6,6,3,12,1,2,0,13,2,0,34,0 -17062,5,0,0,14,2,3,10,1,0,1,7,7,2,16,1 -17063,10,3,0,3,6,2,7,5,3,1,18,7,1,40,1 -17064,6,8,0,13,6,1,8,3,3,1,7,20,2,14,1 -17065,10,6,0,0,6,2,13,2,3,0,15,11,5,27,0 -17066,2,4,0,14,1,0,8,3,0,0,13,16,3,7,0 -17067,0,2,0,12,5,0,2,1,1,0,10,13,0,17,0 -17068,3,1,0,5,3,6,3,1,2,1,2,2,1,30,0 -17069,2,7,0,6,1,0,10,0,2,1,13,18,5,27,0 -17070,0,8,0,13,0,4,5,1,0,0,11,2,5,38,0 -17071,8,4,0,1,3,5,8,4,1,1,2,5,0,29,0 -17072,1,6,0,0,5,6,1,4,2,0,5,18,1,0,0 -17073,7,4,0,6,2,3,1,2,4,0,8,20,4,26,0 -17074,0,1,0,15,2,0,6,3,3,1,2,11,3,11,0 -17075,9,2,0,4,4,0,13,5,4,1,13,3,1,38,0 -17076,4,4,0,12,1,5,5,4,2,0,11,11,1,12,0 -17077,0,0,0,3,2,5,9,5,3,1,13,2,0,29,0 -17078,8,6,0,13,1,0,9,4,0,1,15,1,1,2,0 -17079,10,3,0,8,5,3,10,0,3,1,13,4,1,3,0 -17080,10,0,0,12,1,4,3,5,3,1,2,6,2,37,0 -17081,9,0,0,6,4,3,14,0,1,0,18,10,2,28,1 -17082,9,2,0,5,3,4,5,1,0,1,8,11,3,4,0 -17083,10,6,0,7,0,6,9,5,1,0,13,2,2,3,0 -17084,4,3,0,2,0,0,5,1,3,0,17,13,3,6,0 -17085,6,1,0,5,0,3,6,0,0,0,14,11,4,14,0 -17086,0,2,0,9,5,4,9,1,0,0,15,0,4,25,0 -17087,2,2,0,6,6,2,14,5,1,1,0,20,0,9,0 
-17088,0,5,0,12,3,2,13,0,0,0,0,20,1,5,0 -17089,3,3,0,8,1,0,6,3,2,0,8,4,1,5,0 -17090,5,7,0,4,5,1,3,5,1,1,1,10,0,18,1 -17091,2,8,0,13,0,2,4,4,0,1,15,18,0,23,0 -17092,2,8,0,4,2,1,14,0,4,0,2,18,4,33,0 -17093,7,3,0,5,2,4,8,0,2,1,4,11,1,33,0 -17094,3,0,0,2,5,6,9,4,4,0,17,11,1,22,0 -17095,10,7,0,8,5,4,0,1,2,1,3,11,2,26,0 -17096,8,6,0,13,0,3,8,3,3,1,2,3,0,22,0 -17097,0,5,0,9,5,5,12,1,1,0,13,2,1,4,0 -17098,10,1,0,0,6,6,10,3,4,1,5,13,5,16,0 -17099,4,6,0,14,6,4,6,3,4,0,4,20,3,28,0 -17100,0,8,0,8,0,0,8,3,0,1,14,11,0,27,0 -17101,6,4,0,3,6,0,9,3,1,1,17,2,2,3,0 -17102,2,4,0,6,0,3,6,2,4,0,14,13,4,24,0 -17103,5,0,0,3,3,0,10,1,0,0,4,20,1,19,0 -17104,7,3,0,14,4,1,2,5,3,1,5,5,3,1,1 -17105,0,8,0,15,2,4,9,1,3,1,2,2,0,21,0 -17106,0,0,0,7,6,2,11,4,2,0,14,6,0,29,0 -17107,2,7,0,1,4,4,8,4,2,0,8,0,3,23,0 -17108,6,6,0,6,1,6,9,4,0,1,2,7,0,25,0 -17109,6,0,0,4,1,4,6,5,1,1,6,6,0,12,0 -17110,1,1,0,1,6,0,8,1,2,0,6,11,2,5,0 -17111,0,3,0,3,2,6,8,2,4,0,12,9,1,16,0 -17112,9,7,0,4,0,3,2,2,1,0,8,6,3,20,0 -17113,5,1,0,15,3,6,4,1,0,1,18,12,5,27,1 -17114,0,2,0,0,5,6,14,5,2,0,13,11,1,16,0 -17115,2,6,0,4,0,5,5,1,2,0,12,17,0,10,0 -17116,2,5,0,5,0,0,12,1,2,1,8,13,0,29,0 -17117,7,3,0,13,0,5,10,3,2,0,0,16,1,34,0 -17118,1,8,0,6,6,5,7,3,1,1,4,11,0,5,0 -17119,3,8,0,10,2,6,8,0,0,1,6,9,1,18,0 -17120,0,2,0,14,0,1,7,4,3,0,8,9,1,13,0 -17121,5,6,0,3,2,5,1,1,3,1,8,16,0,13,0 -17122,1,3,0,0,0,6,9,3,2,1,15,15,4,33,0 -17123,10,6,0,9,3,1,9,2,0,0,3,18,4,24,1 -17124,3,6,0,6,0,4,3,2,2,0,6,13,0,27,0 -17125,1,6,0,13,5,4,1,4,2,1,4,11,1,35,0 -17126,8,4,0,0,3,3,0,4,1,1,13,6,4,3,0 -17127,10,3,0,11,4,2,1,3,0,1,1,8,1,2,1 -17128,5,0,0,6,3,2,3,0,4,1,5,10,3,31,1 -17129,0,2,0,7,3,5,10,4,4,1,13,19,0,5,0 -17130,7,6,0,15,5,1,12,0,2,1,0,7,5,30,1 -17131,0,0,0,10,0,5,3,2,3,1,13,9,2,3,0 -17132,9,4,0,14,1,3,14,4,1,1,12,10,2,19,1 -17133,0,7,0,6,1,0,1,1,0,1,4,8,0,30,0 -17134,8,0,0,13,0,4,10,3,0,0,2,20,1,13,0 -17135,1,5,0,2,6,5,3,1,4,0,5,16,3,27,0 -17136,3,8,0,3,4,3,4,1,3,1,13,3,0,28,0 -17137,2,3,0,13,0,0,0,3,3,0,4,2,4,30,0 
-17138,0,8,0,6,1,3,14,3,1,1,0,12,4,11,0 -17139,0,8,0,6,4,4,9,5,2,1,4,2,1,34,0 -17140,7,5,0,13,0,5,1,3,0,1,6,2,5,2,0 -17141,8,2,0,7,6,5,6,1,4,1,18,9,2,16,0 -17142,7,3,0,8,5,0,11,2,3,1,2,15,5,17,0 -17143,6,1,0,9,3,4,6,1,2,0,8,2,0,40,0 -17144,5,3,0,5,6,2,9,5,4,0,16,13,2,3,0 -17145,7,1,0,1,3,4,4,4,0,1,11,4,1,40,0 -17146,0,2,0,15,0,4,6,3,1,1,17,17,0,18,1 -17147,9,0,0,12,1,6,4,1,3,1,4,10,0,12,1 -17148,8,5,0,9,3,4,12,0,4,1,14,17,4,28,1 -17149,3,5,0,9,1,0,9,4,2,1,1,2,5,16,0 -17150,7,2,0,0,4,3,14,3,2,0,18,8,5,30,1 -17151,8,8,0,3,6,1,2,2,2,0,5,17,4,37,1 -17152,0,6,0,0,1,5,14,2,1,0,8,2,1,27,0 -17153,1,3,0,6,5,4,0,1,1,1,17,4,1,29,0 -17154,2,2,0,6,0,2,11,4,0,1,8,6,5,2,0 -17155,7,3,0,0,1,3,0,5,0,1,6,18,5,35,0 -17156,0,1,0,5,3,1,9,3,0,0,13,4,3,38,0 -17157,5,6,0,5,5,3,1,5,4,1,14,1,2,32,1 -17158,10,1,0,1,3,1,5,1,1,0,12,0,2,28,0 -17159,6,8,0,12,0,4,8,0,3,0,8,0,4,39,0 -17160,3,8,0,9,3,1,10,1,2,1,9,19,3,4,1 -17161,3,4,0,12,6,4,14,3,0,0,17,14,3,9,0 -17162,8,3,0,14,4,0,8,3,3,1,13,11,1,1,0 -17163,5,7,0,13,4,0,2,0,3,0,2,13,1,20,0 -17164,1,7,0,4,0,5,8,4,4,0,4,10,0,34,0 -17165,4,2,0,11,2,4,6,3,4,0,13,11,5,28,0 -17166,9,2,0,12,3,3,11,1,2,1,0,20,5,21,0 -17167,1,5,0,11,0,5,4,0,2,1,9,17,4,9,1 -17168,9,1,0,10,4,1,3,0,0,1,5,8,5,21,1 -17169,6,5,0,10,5,3,1,0,4,1,8,15,0,24,0 -17170,1,8,0,5,4,0,4,3,2,0,8,5,4,4,0 -17171,4,3,0,10,5,3,3,3,0,1,8,15,2,39,0 -17172,6,8,0,1,5,6,9,4,2,1,6,9,2,11,0 -17173,7,2,0,11,3,1,4,2,0,0,18,0,0,1,1 -17174,0,7,0,1,6,2,6,4,3,1,8,11,2,25,0 -17175,2,6,0,6,1,0,0,2,2,1,2,20,4,14,0 -17176,7,0,0,0,0,1,14,0,3,0,2,11,2,0,0 -17177,6,6,0,6,6,1,5,4,3,0,6,9,5,11,0 -17178,3,2,0,13,5,5,13,1,0,0,13,2,5,35,0 -17179,2,2,0,3,3,6,8,3,4,0,13,4,0,32,0 -17180,6,4,0,8,6,0,12,4,2,0,10,11,2,27,1 -17181,9,2,0,6,5,2,6,3,2,1,5,10,3,22,1 -17182,0,8,0,0,5,5,1,4,1,0,13,11,0,20,0 -17183,10,4,0,5,3,2,9,4,4,1,16,11,2,25,0 -17184,4,1,0,6,1,2,9,0,0,1,5,7,4,34,1 -17185,4,5,0,10,0,4,6,2,1,1,17,10,4,36,1 -17186,1,6,0,10,0,5,0,0,0,0,13,3,1,37,0 -17187,2,8,0,12,3,4,14,4,0,1,17,9,1,39,0 
-17188,1,6,0,4,0,0,7,3,2,1,1,14,2,41,0 -17189,0,1,0,5,6,5,0,2,0,1,10,6,0,23,0 -17190,8,1,0,15,3,5,11,1,0,0,14,7,2,26,1 -17191,10,6,0,5,5,4,4,4,4,0,12,17,3,21,0 -17192,9,2,0,3,6,2,11,1,4,1,8,20,0,31,0 -17193,2,0,0,3,6,4,1,3,0,1,2,14,1,13,0 -17194,0,2,0,8,0,0,5,4,2,1,2,7,5,29,0 -17195,0,1,0,12,6,5,13,2,1,1,6,18,0,38,0 -17196,2,5,0,8,3,4,8,4,0,1,13,14,2,22,0 -17197,1,2,0,0,0,5,6,1,0,0,6,11,1,22,0 -17198,0,8,0,4,6,3,1,1,0,0,2,6,2,38,0 -17199,3,0,0,3,2,0,8,4,4,0,17,16,5,22,0 -17200,0,7,0,6,1,0,10,2,3,0,13,13,0,14,0 -17201,10,3,0,7,6,6,2,5,0,0,4,15,2,19,0 -17202,0,3,0,5,3,5,14,0,0,1,16,10,4,25,0 -17203,6,3,0,10,2,3,10,3,2,0,1,14,4,19,1 -17204,0,5,0,15,4,6,9,5,0,0,8,11,4,11,0 -17205,7,8,0,7,5,0,5,2,1,1,7,13,3,16,0 -17206,2,2,0,2,5,0,2,2,4,0,2,9,1,35,0 -17207,6,2,0,3,0,5,0,0,1,1,9,7,4,21,1 -17208,2,1,0,4,4,4,7,1,4,1,17,2,3,3,0 -17209,5,7,0,15,0,0,7,1,3,1,13,8,4,5,0 -17210,1,2,0,1,3,6,11,4,1,0,15,19,1,35,0 -17211,2,3,0,0,3,4,5,1,2,0,8,19,5,3,0 -17212,3,8,0,9,2,3,14,2,0,1,8,11,4,20,0 -17213,9,5,0,2,4,3,4,0,0,1,11,5,3,9,1 -17214,2,6,0,5,4,4,1,2,2,0,15,11,2,29,0 -17215,8,3,0,2,4,1,7,0,4,1,5,10,3,21,1 -17216,1,4,0,1,2,4,2,0,0,1,15,13,2,41,0 -17217,0,6,0,0,1,4,0,3,3,1,13,0,5,35,0 -17218,1,3,0,0,6,6,2,0,2,1,4,11,2,41,0 -17219,9,1,0,9,6,6,10,4,2,0,18,7,3,19,1 -17220,7,8,0,11,4,6,1,4,4,1,16,16,1,19,1 -17221,6,2,0,1,1,4,0,4,4,0,8,12,5,31,0 -17222,5,7,0,9,4,2,11,1,2,0,5,3,3,17,1 -17223,1,8,0,3,3,5,0,5,3,1,6,7,2,35,1 -17224,10,6,0,6,0,4,2,0,4,1,4,15,4,3,0 -17225,3,5,0,0,0,0,9,4,2,1,13,15,3,26,0 -17226,6,1,0,4,0,2,8,5,0,0,0,17,0,6,0 -17227,3,5,0,8,6,3,14,1,3,0,10,13,1,24,0 -17228,10,6,0,4,5,4,5,2,3,0,17,2,2,14,0 -17229,10,8,0,11,6,2,2,5,0,0,17,11,5,10,0 -17230,9,5,0,7,4,0,13,1,4,0,5,11,4,24,1 -17231,10,6,0,6,2,6,3,1,1,0,1,6,5,9,1 -17232,9,6,0,7,0,6,4,0,0,0,18,7,4,3,1 -17233,0,2,0,1,2,0,12,2,2,0,13,11,0,38,0 -17234,1,0,0,0,3,0,6,3,2,0,0,3,1,38,0 -17235,5,5,0,7,4,0,2,4,0,0,18,10,5,9,1 -17236,9,4,0,13,5,6,1,0,3,1,5,18,2,28,0 -17237,3,7,0,12,3,4,14,2,4,0,13,8,4,6,0 
-17238,1,0,0,2,0,4,12,0,2,0,13,13,0,19,0 -17239,0,6,0,2,5,3,9,4,4,0,6,2,3,23,0 -17240,3,2,0,8,0,0,3,4,2,0,6,11,1,11,0 -17241,9,2,0,13,1,5,9,2,4,1,11,8,2,26,0 -17242,0,4,0,10,6,1,2,2,0,0,4,15,0,19,0 -17243,9,6,0,1,1,3,6,0,3,0,2,18,4,9,0 -17244,0,4,0,14,1,2,8,1,2,0,2,9,3,17,0 -17245,2,4,0,8,4,5,2,4,2,0,2,1,0,31,0 -17246,6,6,0,11,1,0,0,2,3,0,8,12,2,11,0 -17247,7,0,0,13,1,0,11,1,3,1,18,12,0,12,1 -17248,0,1,0,5,2,1,11,5,1,1,2,11,0,32,0 -17249,6,5,0,2,3,1,11,0,1,1,18,17,5,5,1 -17250,6,5,0,3,0,1,1,3,2,1,6,2,4,5,0 -17251,1,0,0,14,5,1,8,5,4,1,0,3,1,23,0 -17252,3,7,0,5,6,0,0,4,4,0,13,11,1,38,0 -17253,6,2,0,5,6,5,14,4,0,1,8,6,2,24,0 -17254,9,0,0,0,4,0,3,4,1,1,13,13,1,10,0 -17255,3,1,0,1,0,1,6,4,4,1,2,20,4,19,0 -17256,2,0,0,10,4,0,9,3,0,1,2,15,1,25,0 -17257,4,2,0,10,6,6,8,2,1,1,0,11,0,27,0 -17258,2,7,0,0,6,2,8,2,3,0,4,8,4,10,1 -17259,0,1,0,9,3,4,13,2,1,1,18,18,5,23,1 -17260,8,1,0,14,0,6,10,3,3,1,18,16,0,38,1 -17261,10,8,0,3,5,4,8,1,4,0,18,7,5,6,1 -17262,2,7,0,6,5,0,10,0,4,0,6,19,1,20,0 -17263,3,4,0,9,5,0,5,2,3,1,4,6,1,16,0 -17264,10,7,0,7,0,5,14,0,2,1,17,1,0,32,0 -17265,1,1,0,13,6,5,6,1,1,1,2,0,0,3,0 -17266,0,8,0,1,4,0,13,4,2,1,15,2,2,5,0 -17267,2,1,0,13,0,6,13,5,4,1,3,14,5,16,1 -17268,10,7,0,15,4,2,10,4,0,0,18,20,5,6,1 -17269,2,8,0,12,6,4,4,5,4,0,9,15,3,13,1 -17270,0,3,0,0,0,0,5,3,2,0,0,13,2,25,0 -17271,2,0,0,2,5,4,7,2,0,0,3,13,0,31,0 -17272,7,6,0,14,6,6,3,2,3,1,2,2,2,34,0 -17273,0,7,0,13,6,3,9,5,2,1,2,10,5,13,0 -17274,5,2,0,13,1,3,9,3,3,0,2,15,1,5,0 -17275,8,8,0,15,0,2,13,4,1,0,7,19,4,26,1 -17276,6,3,0,14,4,4,12,1,4,0,2,11,0,6,0 -17277,10,2,0,8,2,5,9,4,1,1,2,11,0,9,0 -17278,10,0,0,3,3,3,9,1,2,1,18,8,1,32,1 -17279,10,0,0,9,3,0,8,5,0,1,2,2,4,37,0 -17280,10,5,0,10,1,5,14,1,4,0,10,6,0,4,0 -17281,10,5,0,12,6,4,11,3,3,1,12,2,0,24,0 -17282,0,5,0,13,0,0,12,3,0,0,0,2,5,17,0 -17283,4,3,0,0,1,2,14,4,3,0,2,18,1,25,0 -17284,8,1,0,8,3,4,5,2,4,1,12,11,4,33,0 -17285,0,2,0,12,4,0,0,1,4,0,6,15,1,30,0 -17286,6,8,0,3,2,5,14,0,0,1,6,15,3,39,0 -17287,7,8,0,12,4,5,11,3,3,1,8,11,3,13,0 
-17288,3,6,0,12,6,4,1,4,2,0,3,11,5,0,0 -17289,0,4,0,13,1,6,4,0,1,1,2,2,4,34,0 -17290,1,6,0,11,6,6,2,1,4,1,17,20,4,39,0 -17291,7,1,0,14,4,3,3,5,3,1,5,1,4,20,1 -17292,6,3,0,5,5,6,11,1,1,0,5,16,4,30,1 -17293,2,7,0,3,0,5,2,3,4,0,12,9,1,33,0 -17294,1,0,0,10,4,4,7,5,1,0,15,8,4,32,1 -17295,6,1,0,11,4,2,0,2,0,0,7,19,1,10,1 -17296,10,3,0,6,4,5,6,4,1,0,0,11,5,21,0 -17297,2,5,0,0,0,6,11,2,3,1,2,0,3,41,0 -17298,9,5,0,5,5,3,12,0,1,0,2,11,1,12,0 -17299,5,8,0,0,0,0,3,3,2,1,13,4,1,3,0 -17300,9,4,0,4,4,3,7,2,4,1,18,11,4,22,1 -17301,5,5,0,13,1,5,3,5,1,1,1,7,5,27,1 -17302,10,6,0,3,1,5,9,1,0,1,2,5,0,3,0 -17303,0,4,0,13,0,5,9,2,1,1,11,2,2,14,0 -17304,5,1,0,6,4,3,1,4,3,0,2,20,2,16,0 -17305,2,6,0,15,2,4,7,2,0,1,12,2,3,10,0 -17306,4,1,0,8,3,2,13,1,3,0,9,19,3,17,1 -17307,1,0,0,10,3,2,7,5,0,0,8,2,1,12,0 -17308,0,5,0,14,6,3,10,0,4,0,8,4,0,13,0 -17309,3,3,0,13,5,0,5,4,4,0,2,2,1,12,0 -17310,0,4,0,3,0,5,6,1,3,0,8,19,5,40,0 -17311,7,2,0,2,6,1,3,5,4,1,16,10,3,30,1 -17312,0,7,0,15,2,2,0,1,0,1,10,13,1,11,0 -17313,1,6,0,5,0,0,8,4,0,1,7,0,2,24,0 -17314,8,0,0,10,1,3,14,3,4,0,12,10,0,14,0 -17315,0,7,0,4,3,3,7,2,1,0,15,13,5,36,0 -17316,8,4,0,15,2,3,9,1,3,0,8,12,5,0,0 -17317,2,6,0,15,1,0,3,0,3,0,4,12,0,19,0 -17318,2,7,0,8,0,6,9,3,0,0,14,11,3,7,0 -17319,3,6,0,11,5,5,11,0,2,0,12,7,4,35,0 -17320,5,7,0,13,0,2,1,1,0,0,8,4,0,6,0 -17321,0,3,0,13,2,0,14,3,1,0,8,16,2,31,0 -17322,5,0,0,8,6,0,12,1,4,1,4,17,3,39,1 -17323,4,8,0,1,2,1,3,0,4,1,8,1,0,29,0 -17324,10,3,0,0,0,3,3,5,1,1,2,4,0,27,0 -17325,7,6,0,0,6,4,9,2,0,0,13,20,0,35,0 -17326,5,2,0,3,2,5,10,1,4,1,1,16,1,17,0 -17327,3,0,0,3,6,3,2,1,4,1,1,4,1,22,0 -17328,2,2,0,9,6,6,8,4,1,0,4,20,5,33,0 -17329,2,6,0,2,0,3,5,0,4,1,0,11,0,2,0 -17330,2,7,0,10,6,0,3,1,1,0,6,19,4,12,1 -17331,4,3,0,11,5,5,7,0,0,1,1,5,5,25,1 -17332,2,4,0,14,6,5,5,3,0,0,9,14,5,7,0 -17333,9,2,0,5,4,1,5,0,1,1,18,17,4,9,1 -17334,3,8,0,15,0,3,4,3,3,1,0,0,5,28,0 -17335,4,5,0,7,4,1,2,5,4,1,10,15,2,5,1 -17336,4,2,0,13,4,6,0,3,4,1,13,14,0,12,0 -17337,0,2,0,11,3,0,1,0,2,0,2,2,5,11,0 
-17338,0,8,0,13,0,3,14,3,1,0,2,15,2,26,0 -17339,4,2,0,2,0,0,8,3,1,0,13,15,2,16,0 -17340,4,8,0,11,0,0,5,4,4,0,8,13,0,35,0 -17341,2,3,0,3,4,1,9,3,2,0,4,4,2,38,0 -17342,0,6,0,0,6,0,4,0,2,1,3,1,1,29,0 -17343,4,6,0,7,3,6,7,3,1,0,8,2,5,11,0 -17344,0,7,0,2,1,5,13,1,0,0,7,12,2,19,1 -17345,0,1,0,3,6,4,8,2,2,1,1,2,2,8,0 -17346,3,2,0,5,0,1,9,2,0,1,2,2,0,37,0 -17347,8,0,0,1,6,4,5,3,2,1,8,0,2,22,0 -17348,2,6,0,8,3,4,0,3,2,0,4,1,1,25,0 -17349,9,6,0,13,5,0,1,4,4,0,4,2,5,14,0 -17350,9,7,0,15,5,5,14,5,1,0,12,2,0,36,0 -17351,10,4,0,3,5,0,7,4,0,1,2,15,1,14,0 -17352,5,1,0,7,0,0,11,3,3,0,2,20,2,26,0 -17353,2,6,0,15,4,4,12,3,0,0,10,0,0,31,0 -17354,0,4,0,14,2,0,2,1,0,1,2,16,2,34,0 -17355,7,5,0,14,3,5,11,2,4,1,7,12,5,19,1 -17356,6,7,0,0,4,0,12,1,2,0,9,4,3,9,1 -17357,7,8,0,9,1,3,6,0,1,0,14,7,2,39,1 -17358,9,4,0,5,4,4,5,2,0,0,8,6,1,35,0 -17359,5,0,0,9,0,0,0,0,0,0,3,11,0,33,0 -17360,9,6,0,12,6,0,11,1,3,1,8,9,2,4,0 -17361,5,4,0,11,6,3,9,3,3,0,8,16,3,17,0 -17362,6,2,0,14,4,1,10,2,0,1,18,12,5,12,1 -17363,9,1,0,7,5,5,9,3,1,0,5,6,2,4,0 -17364,2,6,0,11,5,2,0,4,1,1,8,15,0,40,0 -17365,7,7,0,6,6,5,10,4,3,0,15,2,5,36,0 -17366,7,5,0,11,0,0,0,4,0,1,14,10,2,29,1 -17367,3,1,0,3,2,5,4,2,1,1,13,2,1,29,0 -17368,6,7,0,3,0,1,13,2,3,1,17,14,5,6,0 -17369,7,2,0,10,1,1,8,5,2,1,9,1,5,25,1 -17370,1,1,0,7,0,5,14,3,2,0,13,0,5,18,0 -17371,5,5,0,10,3,4,8,2,3,1,12,11,1,9,0 -17372,1,2,0,12,2,6,9,3,2,1,8,15,1,16,0 -17373,0,7,0,12,6,0,2,1,1,1,2,18,0,35,0 -17374,2,8,0,8,3,0,14,1,2,0,13,6,0,2,0 -17375,5,5,0,14,5,0,2,3,4,1,6,0,5,18,0 -17376,0,2,0,10,0,4,5,3,3,0,13,0,2,6,0 -17377,10,8,0,8,6,1,3,1,0,0,16,2,4,8,0 -17378,2,5,0,15,4,1,12,3,2,1,7,17,3,8,1 -17379,2,5,0,5,6,0,11,0,1,0,4,11,0,17,0 -17380,0,6,0,12,5,0,4,0,2,0,13,9,4,30,0 -17381,8,3,0,8,4,3,9,5,1,0,1,8,4,22,1 -17382,3,3,0,14,6,2,0,0,4,1,18,12,0,23,1 -17383,8,1,0,3,0,1,6,3,0,1,8,10,1,39,0 -17384,9,1,0,0,3,0,0,3,4,1,2,15,4,16,0 -17385,7,5,0,0,3,1,4,5,0,1,14,7,3,9,1 -17386,0,2,0,5,5,3,0,4,3,0,2,4,2,10,0 -17387,6,0,0,13,1,0,5,3,2,0,7,13,1,25,0 
-17388,1,8,0,4,5,0,8,1,2,1,8,6,0,23,0 -17389,9,1,0,6,5,0,12,2,3,0,8,11,4,4,0 -17390,2,4,0,2,1,5,3,2,2,0,11,5,1,30,0 -17391,10,8,0,1,4,0,13,1,2,0,2,16,4,4,0 -17392,9,8,0,15,5,3,12,1,4,0,5,15,4,32,0 -17393,9,1,0,6,5,2,12,3,3,1,4,13,1,1,0 -17394,10,7,0,1,5,6,9,0,4,1,1,14,1,30,1 -17395,4,2,0,4,2,2,0,2,3,0,2,6,0,28,0 -17396,0,2,0,8,5,5,13,1,3,0,13,11,4,39,0 -17397,2,0,0,5,6,1,2,4,3,0,7,1,3,30,1 -17398,4,5,0,13,3,6,11,0,0,0,8,0,0,25,0 -17399,9,3,0,11,3,1,13,4,0,1,16,15,3,18,1 -17400,10,6,0,10,5,5,11,2,2,1,17,11,5,10,0 -17401,0,4,0,5,2,3,13,3,3,0,12,2,4,24,0 -17402,5,2,0,2,0,1,0,0,1,0,14,8,5,28,1 -17403,3,1,0,3,6,4,0,4,4,1,8,17,1,11,0 -17404,6,7,0,5,3,5,6,2,1,1,7,5,0,16,1 -17405,8,7,0,2,2,5,1,4,3,1,13,6,1,23,0 -17406,0,6,0,9,0,4,1,2,2,1,4,3,3,17,0 -17407,10,8,0,5,1,5,2,2,3,1,0,6,5,14,0 -17408,4,4,0,9,4,0,13,1,2,0,2,11,0,5,0 -17409,8,5,0,5,0,6,10,0,3,0,18,5,5,21,1 -17410,0,4,0,6,5,5,8,3,3,1,2,9,4,26,0 -17411,0,6,0,14,6,1,5,3,4,0,2,6,1,24,0 -17412,3,8,0,8,5,3,4,3,4,0,13,11,3,20,0 -17413,1,6,0,7,1,0,13,3,3,1,8,2,0,37,0 -17414,6,3,0,12,0,1,9,1,0,1,10,2,0,10,0 -17415,3,2,0,8,0,6,9,3,2,1,6,0,1,7,0 -17416,1,5,0,3,0,0,7,4,3,1,12,17,0,2,0 -17417,1,6,0,0,6,5,9,0,4,0,1,11,5,29,0 -17418,1,3,0,6,1,0,12,4,4,0,14,2,2,19,0 -17419,10,4,0,6,1,3,14,1,1,1,12,11,5,17,0 -17420,5,8,0,6,1,1,12,0,2,1,13,15,2,18,0 -17421,3,7,0,4,5,3,9,2,2,0,7,11,3,10,0 -17422,1,2,0,2,0,2,8,4,1,1,4,11,4,37,0 -17423,10,6,0,1,2,0,13,0,0,0,4,15,5,39,0 -17424,1,0,0,2,5,0,8,4,2,0,16,3,5,24,0 -17425,4,0,0,11,1,4,9,2,4,0,13,20,3,35,0 -17426,4,1,0,6,1,2,8,4,3,0,17,9,0,4,0 -17427,4,3,0,1,5,6,2,3,2,1,4,4,0,26,0 -17428,1,7,0,0,3,1,3,3,2,1,4,11,0,40,0 -17429,3,8,0,12,0,4,14,3,1,0,13,15,0,4,0 -17430,0,4,0,3,2,0,1,4,2,1,10,2,5,5,0 -17431,7,6,0,5,3,1,7,0,3,0,6,10,4,31,1 -17432,0,3,0,4,3,0,6,5,3,0,13,3,2,1,0 -17433,6,6,0,11,6,6,2,3,1,1,11,10,4,12,1 -17434,0,6,0,11,0,6,14,4,4,1,8,11,3,12,0 -17435,8,2,0,11,2,3,14,0,4,1,12,17,3,20,1 -17436,4,8,0,10,6,5,6,3,2,1,18,4,1,3,0 -17437,3,3,0,9,6,4,2,4,2,0,13,15,2,4,0 
-17438,8,4,0,1,1,2,12,5,3,1,7,17,4,33,1 -17439,9,7,0,9,1,3,14,3,0,1,0,9,2,13,0 -17440,7,6,0,8,0,1,8,5,2,0,17,12,0,22,0 -17441,9,1,0,15,0,5,9,5,1,0,7,18,4,1,0 -17442,2,0,0,14,6,3,9,3,2,1,4,2,2,14,0 -17443,8,2,0,11,2,4,0,5,3,0,2,2,3,38,0 -17444,0,3,0,15,6,6,8,1,1,1,15,2,4,23,0 -17445,5,6,0,3,2,2,11,5,1,0,5,12,3,10,1 -17446,4,6,0,3,1,6,6,5,2,0,13,18,2,41,0 -17447,9,3,0,3,4,0,12,5,2,1,1,10,4,1,1 -17448,1,3,0,9,5,0,7,5,3,0,8,7,0,0,0 -17449,0,5,0,5,4,0,14,4,0,1,5,18,2,14,0 -17450,10,0,0,5,1,4,9,4,2,1,2,1,4,34,0 -17451,3,8,0,2,6,3,10,1,3,1,13,4,1,31,0 -17452,10,1,0,3,3,4,0,2,3,0,12,4,5,22,0 -17453,8,5,0,7,1,3,13,1,1,1,13,18,1,6,0 -17454,8,3,0,11,0,1,8,5,1,0,16,5,2,23,1 -17455,8,7,0,5,6,0,8,3,0,1,4,5,3,25,0 -17456,4,4,0,6,5,3,3,0,3,1,9,13,1,9,1 -17457,3,8,0,6,2,1,8,4,2,0,8,2,0,8,0 -17458,6,5,0,14,6,0,3,1,2,0,10,3,0,3,0 -17459,4,6,0,4,3,4,13,0,4,1,1,5,3,35,1 -17460,8,5,0,6,6,4,10,1,0,0,2,2,0,1,0 -17461,1,7,0,1,2,3,7,3,3,1,2,11,0,3,0 -17462,1,2,0,5,0,5,5,5,0,1,13,13,2,29,0 -17463,8,4,0,2,3,1,14,2,0,0,5,1,1,21,1 -17464,2,8,0,1,1,6,9,3,3,0,16,0,4,39,0 -17465,6,5,0,14,3,0,12,4,4,1,7,17,5,12,1 -17466,8,1,0,7,1,0,2,3,3,0,17,2,4,40,0 -17467,3,2,0,2,4,5,8,2,1,1,0,12,5,34,0 -17468,3,7,0,5,6,0,12,2,0,1,2,17,0,29,0 -17469,7,4,0,7,6,2,3,5,0,1,18,8,3,0,1 -17470,3,1,0,10,1,3,13,1,4,0,5,5,3,34,1 -17471,2,8,0,11,0,0,12,1,4,1,17,6,4,38,0 -17472,0,8,0,6,1,5,2,2,3,1,13,15,1,17,0 -17473,5,0,0,3,1,2,13,5,2,1,15,10,4,12,1 -17474,2,4,0,14,0,5,3,3,0,1,17,0,3,21,0 -17475,5,8,0,9,0,0,10,1,4,0,2,2,0,6,0 -17476,9,7,0,0,4,3,14,0,4,1,18,5,0,30,1 -17477,8,5,0,12,0,1,14,0,2,1,10,19,2,9,0 -17478,10,8,0,4,0,6,7,4,3,1,6,9,4,13,0 -17479,1,6,0,3,6,6,2,0,0,1,4,2,1,37,0 -17480,2,2,0,8,6,4,4,2,1,0,13,11,4,40,0 -17481,3,7,0,15,1,0,0,1,2,1,3,3,0,16,0 -17482,0,6,0,1,0,3,9,3,0,0,12,11,4,26,0 -17483,1,8,0,5,4,3,11,0,2,0,1,9,2,24,1 -17484,0,7,0,8,0,4,3,3,1,1,6,1,5,33,0 -17485,1,1,0,13,3,2,2,4,2,1,8,5,3,30,0 -17486,1,8,0,9,4,0,6,1,1,1,2,0,1,8,0 -17487,5,1,0,15,2,0,10,4,0,1,0,14,0,27,0 
-17488,8,4,0,15,2,3,5,4,2,1,18,7,4,39,1 -17489,4,3,0,0,0,3,3,2,1,1,2,11,1,37,0 -17490,1,8,0,15,2,0,0,3,4,0,11,11,4,23,0 -17491,3,6,0,12,2,0,3,2,1,1,17,12,0,24,0 -17492,1,3,0,7,1,1,13,2,2,1,10,2,0,24,0 -17493,6,8,0,7,2,3,1,3,0,0,2,9,0,6,0 -17494,6,4,0,11,6,5,8,2,1,1,8,12,5,33,0 -17495,2,5,0,7,3,0,7,2,3,0,11,19,4,6,1 -17496,0,4,0,10,5,0,9,1,0,0,2,15,4,2,0 -17497,1,4,0,13,3,1,7,3,1,0,13,4,3,4,0 -17498,6,3,0,7,4,0,5,3,3,0,13,9,2,5,0 -17499,0,1,0,2,6,3,6,5,3,1,14,15,2,10,1 -17500,5,2,0,10,1,5,2,4,0,0,1,17,5,32,1 -17501,6,6,0,6,1,0,6,3,0,1,10,13,5,6,0 -17502,1,0,0,6,5,5,5,2,3,1,17,20,0,37,0 -17503,2,2,0,10,0,4,2,4,3,1,13,15,2,7,0 -17504,0,1,0,3,3,6,13,3,2,0,4,4,5,7,0 -17505,3,8,0,11,6,5,5,1,2,0,13,20,2,13,0 -17506,1,3,0,12,3,2,12,1,3,1,13,15,5,40,0 -17507,9,7,0,2,6,4,5,3,4,0,13,13,0,29,0 -17508,2,2,0,5,0,3,8,2,2,0,13,0,3,22,0 -17509,0,6,0,6,4,1,9,2,4,1,6,0,4,16,0 -17510,6,5,0,8,6,1,10,5,4,1,15,17,5,12,1 -17511,1,6,0,6,0,5,4,3,4,0,2,13,4,19,0 -17512,0,6,0,10,0,5,5,0,4,1,17,13,1,28,0 -17513,2,4,0,14,6,6,5,5,0,1,8,2,3,30,0 -17514,4,4,0,10,4,3,4,1,4,1,18,8,1,30,1 -17515,9,8,0,2,4,0,1,0,2,1,6,9,3,13,0 -17516,0,4,0,3,6,0,4,2,3,0,8,11,2,13,0 -17517,2,7,0,5,4,0,8,2,3,1,2,3,0,28,0 -17518,3,8,0,0,0,6,5,1,1,1,15,9,2,40,0 -17519,3,4,0,7,1,5,14,2,3,1,8,19,1,26,0 -17520,10,4,0,0,5,2,12,0,1,0,5,15,1,1,1 -17521,3,5,0,15,5,0,12,0,3,1,1,17,5,35,1 -17522,5,6,0,10,4,6,2,1,3,0,5,10,5,33,1 -17523,0,0,0,2,6,5,10,3,0,1,7,1,2,16,1 -17524,6,7,0,13,0,0,3,2,1,1,6,20,5,26,0 -17525,8,7,0,11,6,4,7,4,4,0,7,20,4,38,0 -17526,1,5,0,3,2,2,3,4,3,1,6,11,0,18,0 -17527,9,2,0,2,2,0,13,1,0,0,13,16,0,34,0 -17528,10,3,0,0,6,0,4,4,3,1,4,13,4,13,0 -17529,2,4,0,12,1,0,5,1,2,0,13,13,1,31,0 -17530,7,0,0,5,6,6,10,1,0,1,18,1,5,16,1 -17531,1,6,0,8,4,1,9,3,0,1,17,2,0,34,0 -17532,0,4,0,5,0,6,8,2,4,1,10,2,5,28,0 -17533,3,5,0,3,1,5,2,3,3,0,10,6,1,6,0 -17534,9,5,0,12,1,2,8,3,3,1,3,10,2,22,1 -17535,8,5,0,2,5,5,0,2,0,0,8,11,3,10,0 -17536,4,3,0,8,3,1,4,1,4,1,9,7,5,29,1 -17537,8,2,0,2,5,5,8,2,0,0,12,20,5,34,0 
-17538,4,0,0,6,1,2,7,1,4,0,13,2,4,16,0 -17539,6,0,0,8,4,1,10,5,0,1,12,20,1,37,1 -17540,0,7,0,8,6,3,7,3,3,1,10,5,2,0,0 -17541,1,1,0,4,0,2,6,0,2,1,8,2,1,6,0 -17542,1,4,0,15,1,0,1,1,1,0,2,17,5,2,0 -17543,9,7,0,11,3,1,8,4,0,0,12,1,3,4,0 -17544,3,2,0,10,5,0,9,3,3,1,8,13,1,10,0 -17545,0,3,0,13,6,6,5,1,1,1,17,13,1,23,0 -17546,0,4,0,10,0,5,13,1,1,0,13,20,5,40,0 -17547,6,6,0,14,2,0,0,0,2,1,2,6,0,19,0 -17548,0,1,0,14,4,4,0,3,0,0,13,0,3,14,0 -17549,7,2,0,1,2,1,7,2,2,1,13,11,5,8,0 -17550,3,5,0,6,0,6,7,3,2,1,13,2,4,26,0 -17551,5,5,0,5,2,2,10,4,1,0,8,20,1,0,0 -17552,5,3,0,13,6,5,6,4,0,0,9,15,1,1,0 -17553,0,6,0,1,5,0,2,2,1,1,13,15,5,33,0 -17554,3,5,0,13,4,2,0,1,4,1,0,18,4,21,0 -17555,0,8,0,12,0,5,1,5,0,1,16,1,5,22,0 -17556,0,6,0,1,3,5,5,1,0,0,8,8,1,20,0 -17557,9,1,0,1,1,6,14,0,0,0,8,17,3,35,0 -17558,7,5,0,3,0,5,6,1,2,0,4,0,0,7,0 -17559,1,1,0,2,4,3,13,1,3,1,1,5,5,11,1 -17560,5,8,0,11,5,6,10,1,4,0,8,4,0,12,0 -17561,0,7,0,11,0,6,11,1,2,0,2,2,4,21,0 -17562,3,4,0,10,0,5,11,3,0,0,9,8,4,37,1 -17563,1,8,0,2,6,0,0,4,2,1,0,14,1,26,0 -17564,0,5,0,9,2,0,4,1,0,0,5,9,5,37,0 -17565,0,7,0,0,0,2,8,4,0,1,8,2,4,19,0 -17566,3,8,0,3,0,5,3,0,2,1,13,20,3,8,0 -17567,0,0,0,5,0,6,6,3,1,0,6,13,5,36,0 -17568,2,8,0,2,0,6,11,3,3,1,4,6,0,20,0 -17569,8,1,0,6,1,3,4,2,3,1,13,6,2,23,0 -17570,5,5,0,4,1,1,11,0,2,1,11,6,5,40,1 -17571,4,7,0,10,4,3,0,4,4,0,2,6,0,11,0 -17572,8,1,0,14,1,2,0,3,4,0,5,16,1,5,1 -17573,6,5,0,10,1,5,9,1,0,1,3,1,4,13,1 -17574,2,3,0,3,2,5,3,5,2,0,16,11,1,37,0 -17575,7,2,0,13,3,0,11,0,1,0,5,13,5,36,1 -17576,0,4,0,10,5,3,6,3,3,1,6,6,1,11,0 -17577,3,8,0,5,2,4,5,1,4,1,5,2,3,34,0 -17578,9,8,0,12,4,1,1,2,1,1,18,5,2,23,1 -17579,10,4,0,2,3,2,7,2,1,1,1,19,3,8,1 -17580,4,3,0,1,0,0,10,5,1,1,13,14,3,11,0 -17581,3,0,0,15,5,2,2,4,0,0,13,3,4,7,0 -17582,9,6,0,8,0,1,8,3,3,0,8,6,0,6,0 -17583,0,2,0,1,0,5,14,1,2,0,13,2,0,0,0 -17584,2,7,0,0,4,3,0,1,4,1,13,2,2,29,0 -17585,10,5,0,12,5,1,3,1,4,0,18,3,2,0,1 -17586,4,1,0,10,4,1,0,1,1,1,9,14,3,36,1 -17587,10,8,0,14,2,2,9,1,1,1,1,19,4,0,1 
-17588,9,6,0,15,4,6,6,3,1,0,8,15,1,1,0 -17589,5,8,0,2,5,0,2,4,0,0,2,12,0,1,0 -17590,1,5,0,15,6,2,14,0,4,0,2,6,5,10,0 -17591,7,5,0,0,6,0,10,5,2,1,0,5,4,28,1 -17592,2,1,0,1,2,0,13,4,1,0,3,0,0,31,0 -17593,1,4,0,0,0,1,6,1,0,1,14,11,0,24,0 -17594,3,6,0,6,6,6,0,3,0,1,2,11,5,33,0 -17595,2,3,0,3,0,5,0,3,4,0,8,9,2,29,0 -17596,7,4,0,6,6,6,7,1,1,1,14,7,3,19,1 -17597,5,5,0,2,2,6,13,4,3,1,9,0,2,40,1 -17598,8,2,0,10,6,6,8,5,1,0,0,4,5,11,1 -17599,4,4,0,7,1,1,8,0,2,0,18,10,5,16,1 -17600,2,6,0,6,6,4,9,4,3,0,4,6,3,23,0 -17601,5,7,0,14,3,3,12,1,4,0,16,5,5,18,1 -17602,8,0,0,9,4,1,4,2,0,1,14,7,2,37,1 -17603,6,4,0,14,5,2,11,0,0,1,9,7,4,27,1 -17604,5,4,0,11,6,1,10,3,3,0,11,1,4,41,1 -17605,7,4,0,11,2,4,5,3,3,1,6,6,0,38,0 -17606,6,5,0,14,1,2,4,1,4,1,17,19,5,41,1 -17607,0,7,0,13,0,6,5,4,3,0,2,15,4,21,0 -17608,1,3,0,8,2,4,1,1,3,0,8,19,1,1,0 -17609,0,8,0,0,0,0,6,2,3,1,13,0,5,4,0 -17610,7,7,0,7,4,5,2,4,4,0,0,2,2,5,0 -17611,10,2,0,8,1,3,0,3,2,0,8,4,5,22,0 -17612,7,2,0,2,3,6,10,1,3,1,18,10,3,37,1 -17613,7,3,0,9,3,5,4,2,0,0,8,12,4,36,0 -17614,0,7,0,13,2,4,11,3,0,0,2,13,3,35,0 -17615,9,5,0,4,0,1,3,1,1,0,12,11,2,13,0 -17616,0,6,0,1,0,5,3,4,1,1,0,11,3,38,0 -17617,1,0,0,0,0,1,8,4,3,1,13,9,5,18,0 -17618,0,0,0,7,5,5,0,1,3,0,10,13,4,11,0 -17619,1,3,0,5,5,3,5,1,0,0,2,0,0,2,0 -17620,2,3,0,13,2,4,6,5,0,1,8,17,2,23,0 -17621,10,5,0,0,0,2,7,5,1,1,1,14,3,8,1 -17622,8,5,0,10,0,6,11,3,3,1,8,0,3,34,0 -17623,2,1,0,5,3,1,7,5,3,1,5,5,3,24,1 -17624,10,0,0,10,6,5,0,2,0,1,9,12,3,1,1 -17625,6,1,0,7,2,1,8,3,1,0,16,17,2,20,1 -17626,1,5,0,5,0,6,2,1,0,0,12,14,2,26,0 -17627,10,1,0,7,1,4,5,1,0,0,12,6,3,14,0 -17628,7,1,0,13,6,4,0,4,3,0,0,18,1,33,0 -17629,4,6,0,2,3,5,11,5,3,1,2,2,0,17,0 -17630,7,6,0,5,0,3,6,4,2,0,15,2,3,18,0 -17631,1,1,0,15,6,3,5,1,1,0,2,11,3,33,0 -17632,1,1,0,15,5,5,0,0,4,1,2,12,0,24,0 -17633,8,7,0,0,0,0,14,5,4,1,16,10,0,1,1 -17634,1,8,0,6,1,3,6,4,2,0,17,3,1,7,0 -17635,1,0,0,8,4,5,12,2,0,0,15,10,4,11,0 -17636,3,0,0,9,5,3,5,4,4,1,6,2,1,23,0 -17637,2,2,0,5,1,0,0,4,3,0,13,13,5,4,0 
-17638,4,7,0,11,6,4,4,1,2,1,9,0,4,22,1 -17639,10,4,0,0,2,4,4,1,4,1,5,9,1,4,0 -17640,9,6,0,0,3,2,11,5,1,1,18,7,5,32,1 -17641,6,1,0,14,6,2,6,0,4,1,2,2,0,36,0 -17642,6,8,0,1,2,3,4,4,1,1,2,11,1,32,0 -17643,3,1,0,0,2,5,13,4,3,1,14,18,0,21,0 -17644,4,2,0,5,2,1,8,2,1,1,5,13,3,2,1 -17645,8,1,0,5,2,3,8,3,0,1,0,13,4,2,0 -17646,6,7,0,3,5,3,7,5,1,1,16,19,3,28,1 -17647,4,8,0,4,6,5,5,1,4,1,13,13,3,13,0 -17648,1,7,0,9,0,1,12,5,1,1,12,12,3,25,1 -17649,1,2,0,3,2,6,8,4,1,0,2,13,5,25,0 -17650,0,4,0,13,6,3,9,3,1,0,3,20,0,22,0 -17651,0,7,0,14,6,6,12,4,0,0,10,2,5,31,0 -17652,0,4,0,9,6,0,5,1,4,1,2,20,0,34,0 -17653,0,8,0,13,0,5,6,3,0,0,2,3,2,3,0 -17654,3,1,0,3,2,6,7,5,0,1,9,4,4,28,1 -17655,3,3,0,6,0,0,8,2,3,1,4,2,3,0,0 -17656,4,6,0,5,3,2,9,3,1,1,10,20,2,13,0 -17657,5,7,0,13,2,5,3,1,3,0,0,7,2,16,0 -17658,2,6,0,11,5,5,7,2,3,0,17,13,3,17,0 -17659,2,8,0,3,3,0,12,0,0,0,2,0,0,12,0 -17660,2,8,0,1,0,2,5,2,4,1,13,11,1,4,0 -17661,7,8,0,3,2,0,0,2,0,0,4,0,1,32,0 -17662,0,0,0,15,0,4,10,2,3,0,17,20,3,17,0 -17663,2,8,0,5,5,5,7,3,2,0,8,9,3,5,0 -17664,0,6,0,15,5,6,3,2,2,1,13,11,2,6,0 -17665,7,3,0,13,6,1,3,5,2,0,18,7,5,34,1 -17666,2,3,0,4,3,4,7,3,0,0,2,13,3,37,0 -17667,7,1,0,13,3,3,4,0,2,1,1,5,0,10,1 -17668,8,8,0,9,3,5,0,3,1,1,5,2,0,39,0 -17669,5,4,0,13,0,1,10,1,4,1,17,4,4,26,0 -17670,10,0,0,2,0,1,3,3,4,0,4,2,2,0,0 -17671,2,4,0,2,0,4,13,4,0,1,17,12,2,0,0 -17672,1,1,0,13,0,5,8,4,4,0,13,3,0,41,0 -17673,10,5,0,6,5,1,10,4,3,1,18,5,2,32,1 -17674,9,8,0,8,0,0,14,3,3,0,13,14,4,33,0 -17675,8,1,0,7,4,5,14,0,0,1,4,18,2,29,0 -17676,8,5,0,2,4,6,1,1,1,0,6,14,4,1,1 -17677,3,3,0,1,6,4,9,2,2,0,15,0,0,22,0 -17678,6,0,0,12,2,4,2,0,4,1,18,5,4,11,1 -17679,8,5,0,8,0,5,10,1,0,0,10,11,4,0,0 -17680,1,3,0,5,3,5,8,4,2,1,8,15,5,40,0 -17681,0,1,0,11,6,3,12,1,2,0,12,11,4,2,0 -17682,10,5,0,5,2,0,11,2,4,1,18,10,4,11,1 -17683,8,5,0,1,1,6,10,0,2,0,4,16,0,8,0 -17684,0,7,0,9,6,6,5,5,3,0,13,9,0,7,0 -17685,10,7,0,7,2,1,2,5,0,1,16,8,5,3,1 -17686,8,6,0,4,0,5,4,3,0,1,7,7,4,19,1 -17687,5,8,0,5,6,6,0,5,4,0,13,0,1,18,0 
-17688,8,5,0,0,3,2,11,5,4,0,11,8,1,14,1 -17689,1,0,0,13,0,3,14,4,3,0,2,4,0,22,0 -17690,10,0,0,1,0,2,2,1,3,1,4,0,3,26,0 -17691,1,6,0,9,0,0,5,2,0,0,5,18,1,39,0 -17692,8,0,0,11,6,4,13,0,4,0,18,10,0,12,1 -17693,4,2,0,8,6,6,8,4,3,1,2,20,0,7,0 -17694,0,4,0,4,3,3,13,3,0,1,17,10,4,35,0 -17695,9,8,0,2,6,4,5,3,2,1,4,16,4,16,0 -17696,0,4,0,0,3,4,6,0,1,1,4,4,5,12,0 -17697,2,7,0,12,0,6,9,3,0,0,8,2,2,31,0 -17698,3,6,0,6,2,5,9,1,0,0,13,15,0,7,0 -17699,0,3,0,13,4,6,13,2,1,0,4,11,0,40,0 -17700,5,8,0,9,0,5,7,4,2,0,13,6,0,35,0 -17701,2,5,0,11,4,3,5,0,3,1,13,2,2,30,0 -17702,0,5,0,6,2,0,6,3,1,0,4,9,0,29,0 -17703,1,3,0,2,5,4,12,4,3,0,16,5,5,41,0 -17704,7,2,0,3,2,5,10,5,3,1,14,13,5,29,0 -17705,4,8,0,13,6,4,10,3,0,0,17,11,0,32,0 -17706,4,7,0,6,1,4,1,3,3,1,6,11,0,10,0 -17707,2,2,0,3,0,0,6,4,1,1,0,15,1,13,0 -17708,2,7,0,8,0,2,1,3,1,1,8,11,4,31,0 -17709,5,3,0,0,0,3,12,1,4,0,6,2,4,31,0 -17710,2,5,0,12,0,0,7,3,4,1,8,9,1,3,0 -17711,2,3,0,5,1,5,14,3,3,0,8,14,2,24,0 -17712,2,7,0,4,1,4,13,3,0,1,2,11,0,6,0 -17713,7,3,0,15,6,6,6,4,0,1,15,2,1,4,0 -17714,4,4,0,8,5,6,13,2,0,1,2,2,3,35,0 -17715,0,7,0,13,5,0,9,0,0,1,2,3,4,30,0 -17716,0,8,0,6,1,4,12,2,0,1,2,15,5,12,0 -17717,8,4,0,8,4,4,7,5,0,1,10,6,4,29,0 -17718,9,5,0,8,0,5,9,3,3,1,4,8,4,28,0 -17719,7,8,0,15,2,3,7,3,2,0,2,2,1,25,0 -17720,5,7,0,12,0,3,0,5,4,0,5,12,3,2,1 -17721,2,5,0,14,1,0,0,1,4,1,17,4,0,30,0 -17722,1,4,0,6,6,2,9,3,0,0,2,18,4,17,0 -17723,10,5,0,15,4,4,2,5,0,0,4,2,5,11,0 -17724,0,2,0,10,6,6,8,4,4,0,0,3,5,20,0 -17725,2,8,0,4,0,0,11,2,3,0,2,6,5,16,0 -17726,1,3,0,3,0,5,9,0,0,0,8,19,2,9,0 -17727,2,1,0,6,4,4,12,3,0,0,13,11,0,29,0 -17728,3,2,0,10,2,3,0,0,0,0,14,20,0,2,0 -17729,3,8,0,8,4,4,8,3,0,1,17,16,5,8,0 -17730,8,4,0,4,6,4,9,1,3,0,13,4,4,16,0 -17731,4,8,0,15,0,5,7,2,0,1,8,8,1,11,0 -17732,0,1,0,1,0,6,6,4,1,0,2,4,1,16,0 -17733,0,6,0,11,5,4,13,1,4,0,0,8,0,26,0 -17734,10,2,0,3,5,5,7,1,0,0,13,2,0,10,0 -17735,5,5,0,10,0,5,3,1,2,1,9,8,2,24,1 -17736,10,5,0,6,1,0,5,4,3,1,10,10,0,22,1 -17737,1,5,0,3,0,0,12,3,1,0,2,16,5,18,0 
-17738,3,3,0,15,2,5,5,5,1,0,13,16,2,24,0 -17739,10,0,0,15,5,6,3,5,1,1,0,2,0,28,0 -17740,6,6,0,10,5,4,3,3,1,1,18,8,1,18,1 -17741,10,8,0,1,4,4,13,4,1,1,1,10,4,2,1 -17742,8,3,0,7,0,5,14,4,2,0,13,2,3,25,0 -17743,9,0,0,6,4,4,3,3,1,0,13,9,0,10,0 -17744,3,5,0,12,0,2,2,2,2,1,13,9,3,6,0 -17745,0,8,0,11,5,5,7,1,4,0,13,4,2,3,0 -17746,6,5,0,5,5,1,4,5,4,0,5,7,5,27,1 -17747,7,8,0,4,1,6,12,4,0,0,2,5,0,34,0 -17748,10,5,0,13,3,5,1,0,1,0,17,0,2,21,0 -17749,5,7,0,5,2,5,10,4,0,1,4,20,0,4,0 -17750,2,5,0,11,1,4,3,0,0,0,8,19,2,36,0 -17751,0,2,0,6,5,0,7,4,0,1,2,2,0,34,0 -17752,6,4,0,3,1,3,11,4,1,0,4,15,2,3,0 -17753,3,5,0,2,0,3,14,3,0,0,13,6,1,4,0 -17754,6,6,0,15,1,6,12,5,3,1,0,7,5,2,1 -17755,3,8,0,6,6,6,9,2,1,1,12,2,5,36,0 -17756,7,3,0,7,0,3,9,2,2,1,15,6,0,39,0 -17757,0,8,0,0,2,5,14,0,1,0,0,15,4,0,0 -17758,0,0,0,10,6,2,9,4,1,0,1,15,0,33,0 -17759,8,4,0,8,0,5,14,0,0,0,4,2,1,8,0 -17760,7,1,0,3,0,1,3,3,3,1,4,11,1,0,0 -17761,5,3,0,2,0,0,4,4,1,1,12,18,3,3,0 -17762,6,3,0,5,2,5,4,5,0,1,13,16,0,13,0 -17763,5,5,0,11,6,5,13,0,1,0,2,14,1,1,0 -17764,0,4,0,14,6,2,5,3,3,1,18,10,2,26,1 -17765,9,5,0,9,5,1,5,4,0,1,11,10,5,3,1 -17766,1,5,0,8,4,5,0,5,1,1,18,5,3,18,1 -17767,6,5,0,9,5,5,4,2,0,1,9,7,3,16,1 -17768,1,6,0,5,4,3,0,2,1,1,6,9,2,26,0 -17769,2,5,0,3,0,6,2,3,4,1,5,17,0,30,1 -17770,1,5,0,13,5,4,7,5,2,0,12,2,0,10,0 -17771,2,7,0,5,4,5,7,1,3,0,17,13,4,36,0 -17772,1,2,0,13,0,5,2,0,4,1,8,9,1,30,0 -17773,1,0,0,9,3,5,6,1,2,1,13,15,0,30,0 -17774,2,2,0,7,1,3,11,1,4,1,2,6,2,38,0 -17775,6,7,0,15,0,2,10,5,0,1,7,7,3,30,1 -17776,4,1,0,10,4,3,11,2,0,1,13,3,5,5,1 -17777,6,4,0,3,0,3,5,1,3,0,17,11,2,4,0 -17778,8,1,0,14,1,3,4,4,0,1,2,13,1,30,0 -17779,0,4,0,3,5,5,5,3,4,0,13,4,0,3,0 -17780,1,6,0,5,5,6,4,2,3,1,13,11,4,23,0 -17781,1,5,0,9,1,2,9,5,1,1,17,2,3,13,0 -17782,8,0,0,2,4,4,12,5,4,1,14,19,0,1,1 -17783,3,0,0,11,0,5,5,1,1,0,2,9,2,7,0 -17784,9,2,0,13,3,3,9,5,2,1,17,15,1,39,0 -17785,9,6,0,9,6,0,8,1,1,1,11,9,5,20,0 -17786,1,6,0,9,6,4,6,5,1,0,15,2,2,36,0 -17787,1,8,0,3,5,3,12,3,2,0,2,0,1,35,0 
-17788,2,6,0,4,4,5,5,1,2,0,4,11,0,26,0 -17789,0,6,0,1,2,0,11,1,0,0,13,11,1,31,0 -17790,1,3,0,13,0,1,0,5,0,1,0,16,2,5,0 -17791,0,5,0,10,4,3,8,3,0,1,4,13,3,30,0 -17792,4,7,0,10,5,2,10,3,1,1,5,20,4,28,1 -17793,5,8,0,13,3,6,11,2,3,1,9,12,2,11,1 -17794,1,3,0,8,1,5,2,2,2,0,13,0,0,37,0 -17795,0,3,0,2,5,4,8,4,0,0,17,9,0,6,0 -17796,6,2,0,12,2,6,10,0,0,0,8,13,3,16,0 -17797,3,6,0,15,5,5,5,2,2,1,1,16,4,31,0 -17798,0,4,0,1,0,0,7,5,3,1,17,11,5,37,0 -17799,10,6,0,3,4,0,0,5,4,0,14,8,5,9,1 -17800,3,4,0,11,6,1,7,0,3,1,9,1,3,18,1 -17801,8,6,0,12,6,5,11,1,1,1,9,12,1,31,1 -17802,7,7,0,9,0,6,13,3,3,1,5,5,4,37,1 -17803,6,5,0,4,4,2,13,5,3,1,18,16,1,12,1 -17804,1,2,0,8,2,3,5,0,0,1,13,2,5,21,0 -17805,9,8,0,14,1,1,10,4,1,1,5,19,3,0,1 -17806,3,0,0,5,3,0,10,1,2,0,17,0,0,12,0 -17807,2,8,0,15,0,1,8,3,0,1,17,2,2,7,0 -17808,4,0,0,9,0,1,7,0,0,1,10,2,4,30,0 -17809,6,2,0,9,6,3,4,4,1,1,5,12,1,36,0 -17810,5,5,0,14,6,1,7,1,0,1,18,10,1,10,1 -17811,2,5,0,1,2,3,1,3,0,1,8,11,0,24,0 -17812,2,1,0,5,6,0,13,4,3,0,17,3,5,2,0 -17813,0,4,0,11,5,3,2,1,2,1,8,18,3,19,0 -17814,9,6,0,3,4,1,13,5,2,1,18,14,1,25,1 -17815,7,3,0,11,3,0,0,1,0,0,2,11,4,27,0 -17816,3,2,0,4,3,3,11,0,2,1,12,2,0,25,0 -17817,7,6,0,0,6,3,7,1,0,1,0,15,3,18,0 -17818,0,7,0,3,6,5,4,4,3,1,17,6,0,20,0 -17819,9,4,0,6,1,0,10,2,3,0,13,2,2,37,0 -17820,2,1,0,5,5,5,6,4,4,0,8,4,1,22,0 -17821,7,8,0,3,2,3,2,5,1,0,3,5,4,12,1 -17822,4,0,0,6,0,0,4,2,0,1,8,2,0,9,0 -17823,1,1,0,1,2,5,6,4,1,0,4,15,4,9,0 -17824,10,5,0,7,4,1,4,5,0,1,18,8,3,7,1 -17825,5,8,0,10,0,5,12,2,3,0,9,17,0,4,1 -17826,7,7,0,13,1,6,6,4,4,1,13,13,0,26,0 -17827,0,5,0,3,2,4,1,1,0,1,13,2,0,39,0 -17828,0,6,0,7,0,0,0,3,0,1,2,0,1,32,0 -17829,5,5,0,14,0,2,8,0,0,0,13,11,5,8,0 -17830,6,2,0,1,6,5,11,3,0,1,10,16,3,30,0 -17831,9,1,0,8,4,3,10,3,0,1,7,19,4,5,1 -17832,9,1,0,5,6,5,8,1,3,0,17,11,0,37,0 -17833,0,5,0,15,0,5,5,4,3,1,2,13,2,13,0 -17834,3,6,0,3,6,0,9,2,2,0,13,9,2,34,0 -17835,3,0,0,8,0,3,11,0,0,0,12,18,2,3,0 -17836,7,8,0,6,3,1,14,5,2,1,9,7,0,1,1 -17837,7,1,0,9,5,1,12,3,4,1,18,3,3,41,1 
-17838,4,1,0,0,0,5,0,4,3,1,13,2,0,40,0 -17839,5,4,0,5,6,6,0,3,2,0,13,2,0,29,0 -17840,0,1,0,4,0,0,3,4,0,1,8,0,1,14,0 -17841,9,2,0,12,5,5,9,2,2,0,8,11,0,24,0 -17842,9,5,0,7,6,4,9,1,0,0,17,15,2,6,0 -17843,3,5,0,12,0,4,13,0,0,1,5,7,4,23,1 -17844,7,4,0,0,1,6,7,1,4,0,6,20,5,35,0 -17845,4,2,0,12,6,5,6,1,0,1,0,20,2,11,0 -17846,3,3,0,10,1,0,1,4,3,1,2,0,5,3,0 -17847,5,4,0,1,3,0,13,4,3,0,13,13,0,3,0 -17848,1,8,0,13,4,3,0,1,0,1,8,9,4,9,0 -17849,5,3,0,4,0,3,1,3,0,1,13,16,2,27,0 -17850,9,3,0,1,0,3,12,0,0,0,13,15,0,12,0 -17851,10,6,0,11,2,3,11,3,2,1,6,2,2,27,0 -17852,4,2,0,3,6,5,7,4,3,1,11,4,5,35,0 -17853,0,8,0,12,0,0,6,1,1,1,14,2,0,39,0 -17854,5,4,0,8,6,1,3,5,1,1,9,5,5,38,1 -17855,0,1,0,6,1,6,7,1,4,1,15,19,5,16,0 -17856,5,0,0,9,1,3,14,3,3,0,2,15,5,33,0 -17857,0,2,0,6,3,2,6,4,2,1,13,2,1,41,0 -17858,2,6,0,0,4,0,6,4,0,0,14,11,0,22,0 -17859,0,5,0,15,6,2,8,3,2,0,4,11,3,14,0 -17860,5,5,0,8,2,2,12,4,2,1,12,5,0,23,1 -17861,2,7,0,1,1,5,5,4,3,1,2,4,0,0,0 -17862,0,8,0,1,2,4,10,3,2,1,3,18,1,24,0 -17863,7,7,0,15,5,1,10,3,1,1,5,5,4,40,1 -17864,9,0,0,13,6,3,11,4,2,0,13,0,2,31,0 -17865,4,1,0,9,3,3,4,0,1,0,1,14,1,39,1 -17866,1,1,0,1,6,3,11,0,2,0,4,2,0,19,0 -17867,0,0,0,5,3,4,8,2,0,1,13,2,5,6,0 -17868,4,1,0,6,3,0,9,0,3,1,6,3,5,22,1 -17869,2,8,0,8,3,0,5,0,0,1,7,18,4,19,0 -17870,7,1,0,5,4,2,0,5,1,0,18,3,2,9,1 -17871,9,0,0,11,5,2,2,3,0,1,9,8,1,21,1 -17872,4,1,0,9,5,5,9,3,3,0,13,10,0,4,0 -17873,2,0,0,12,0,5,1,1,0,1,15,11,5,14,0 -17874,7,2,0,1,5,6,2,4,0,1,2,13,2,27,0 -17875,0,0,0,4,4,4,2,0,2,1,4,0,0,29,0 -17876,10,5,0,12,4,5,4,4,3,1,11,16,0,2,1 -17877,6,5,0,11,2,2,11,0,3,0,6,1,2,23,1 -17878,0,8,0,3,6,6,14,4,1,0,4,11,4,19,0 -17879,1,3,0,15,2,3,8,1,1,1,13,2,4,25,0 -17880,9,8,0,14,6,4,6,0,2,1,13,15,3,27,0 -17881,3,0,0,14,0,5,9,1,0,0,12,2,0,2,0 -17882,1,7,0,11,2,4,13,2,1,1,2,6,5,17,0 -17883,1,8,0,13,2,1,6,2,4,1,2,14,0,6,0 -17884,3,6,0,5,6,1,8,5,0,1,2,9,0,13,0 -17885,7,6,0,3,2,2,4,4,3,0,18,7,5,16,1 -17886,5,0,0,8,0,4,8,3,3,1,12,3,2,16,0 -17887,6,8,0,8,5,5,5,1,3,0,2,13,1,4,0 
-17888,0,3,0,11,1,3,10,1,3,1,13,2,0,3,0 -17889,10,7,0,6,6,3,4,2,0,1,0,12,3,0,0 -17890,8,7,0,1,5,4,14,3,2,0,0,15,2,2,0 -17891,2,6,0,7,0,6,0,3,1,1,2,6,3,6,0 -17892,4,0,0,14,1,4,8,0,4,1,13,2,5,21,0 -17893,2,1,0,1,0,0,11,3,2,0,13,13,5,20,0 -17894,3,3,0,4,4,5,8,3,3,1,6,1,0,11,1 -17895,1,6,0,6,6,5,3,5,3,0,13,2,0,22,0 -17896,6,8,0,11,3,5,12,3,1,0,12,2,1,37,0 -17897,0,0,0,7,2,5,8,2,3,0,2,12,4,31,0 -17898,6,6,0,3,0,6,6,4,1,1,2,9,3,5,0 -17899,1,4,0,3,5,5,9,5,0,1,12,7,0,9,0 -17900,3,6,0,13,0,0,4,2,3,0,13,7,2,6,0 -17901,1,6,0,7,2,6,12,1,2,0,6,2,4,30,0 -17902,2,2,0,9,4,2,3,5,3,1,18,3,2,12,1 -17903,3,5,0,0,0,4,12,2,4,0,2,13,4,26,0 -17904,6,1,0,13,1,4,4,0,2,1,4,2,1,22,0 -17905,10,8,0,8,1,5,1,0,0,0,18,3,1,13,1 -17906,5,5,0,7,5,0,12,1,4,1,1,5,5,0,1 -17907,8,5,0,10,6,5,10,5,1,0,17,16,4,14,0 -17908,7,4,0,5,6,1,0,1,3,1,13,3,4,11,0 -17909,8,2,0,14,4,2,12,3,0,0,17,12,4,29,0 -17910,3,2,0,11,0,3,10,0,1,0,2,11,2,9,0 -17911,2,0,0,2,0,2,5,3,1,1,13,11,1,0,0 -17912,0,7,0,9,6,2,8,4,1,1,13,9,4,33,0 -17913,6,1,0,4,2,5,0,0,2,1,8,19,2,6,0 -17914,0,1,0,13,6,0,4,5,3,1,17,3,2,21,1 -17915,6,3,0,13,0,5,12,5,3,0,13,2,4,38,0 -17916,3,1,0,10,2,5,0,0,4,1,2,2,5,32,0 -17917,10,8,0,15,1,4,4,5,1,1,13,2,3,20,0 -17918,2,7,0,3,3,4,6,2,1,1,6,13,4,7,0 -17919,0,4,0,10,1,4,14,3,2,0,15,14,3,19,0 -17920,7,5,0,2,0,4,14,1,0,1,16,11,3,11,0 -17921,8,4,0,2,6,2,5,5,0,1,14,10,2,10,1 -17922,3,6,0,15,2,6,5,0,4,1,4,13,0,37,0 -17923,2,6,0,15,3,6,11,1,2,0,5,2,2,41,0 -17924,4,6,0,7,4,4,5,4,4,0,1,11,5,9,1 -17925,8,6,0,2,3,4,12,1,0,1,15,7,4,34,1 -17926,3,3,0,12,6,1,5,2,2,1,10,6,0,4,0 -17927,0,7,0,3,5,3,6,1,4,0,10,6,2,28,0 -17928,3,8,0,1,2,5,8,3,2,1,2,2,1,25,0 -17929,4,1,0,3,6,1,6,2,0,0,7,10,4,34,1 -17930,5,2,0,5,3,3,4,2,2,1,12,13,0,40,0 -17931,3,7,0,4,2,1,8,4,0,0,13,19,1,11,0 -17932,8,0,0,15,6,0,3,4,3,1,7,17,5,19,1 -17933,0,2,0,7,1,6,9,5,3,1,5,2,2,26,0 -17934,0,0,0,0,0,0,10,3,0,1,2,16,5,11,0 -17935,0,4,0,8,0,5,1,3,1,0,13,6,1,12,0 -17936,7,3,0,10,4,3,0,3,0,0,8,16,3,19,0 -17937,0,4,0,12,4,4,3,0,0,0,4,11,0,26,0 
-17938,9,0,0,9,3,1,11,3,0,0,6,7,0,2,1 -17939,6,3,0,12,2,3,3,3,4,1,10,19,3,10,1 -17940,0,0,0,13,2,0,10,1,1,1,2,11,2,29,0 -17941,2,7,0,12,1,6,5,1,2,0,13,11,0,16,0 -17942,7,0,0,15,2,4,11,5,0,0,8,2,2,3,0 -17943,4,5,0,11,5,3,9,4,3,0,1,12,4,5,1 -17944,8,5,0,10,5,2,6,3,2,0,18,12,1,26,1 -17945,1,5,0,3,2,4,6,4,3,1,13,2,1,27,0 -17946,4,8,0,12,4,0,11,1,4,1,18,7,2,32,1 -17947,4,4,0,12,5,0,14,2,4,0,15,11,4,13,0 -17948,1,3,0,1,0,0,7,5,3,0,12,9,0,11,0 -17949,9,2,0,13,5,1,6,4,3,1,18,2,1,9,0 -17950,0,6,0,0,2,5,5,1,1,0,2,11,2,37,0 -17951,7,8,0,5,1,3,5,4,0,1,7,2,4,25,0 -17952,9,8,0,8,6,3,11,3,3,0,6,9,2,34,0 -17953,10,7,0,6,6,4,5,3,3,1,0,11,0,18,0 -17954,0,6,0,3,1,2,13,1,2,0,8,14,0,4,0 -17955,4,2,0,10,4,3,6,5,1,1,9,7,5,5,1 -17956,7,3,0,3,4,3,13,5,4,0,3,5,5,25,0 -17957,8,8,0,9,3,1,5,5,3,0,9,1,0,40,1 -17958,5,3,0,5,5,1,8,3,3,1,5,8,5,13,1 -17959,2,8,0,0,2,0,10,4,1,1,2,5,2,14,0 -17960,0,1,0,12,6,1,5,1,1,1,13,9,3,11,0 -17961,10,3,0,1,1,4,8,5,1,1,2,11,2,4,0 -17962,1,8,0,5,2,4,2,2,2,0,16,6,1,11,0 -17963,6,4,0,8,5,6,10,2,3,1,18,7,5,7,1 -17964,0,1,0,8,6,5,5,2,3,0,0,12,4,10,0 -17965,8,6,0,12,6,2,6,0,3,0,2,5,2,18,1 -17966,10,5,0,15,6,2,6,1,2,1,3,10,3,20,1 -17967,4,4,0,4,6,5,8,4,4,1,2,1,3,8,0 -17968,0,4,0,2,5,5,10,5,1,0,2,6,5,22,0 -17969,7,1,0,9,2,1,4,4,1,1,18,10,3,35,1 -17970,1,6,0,4,4,1,13,3,1,0,15,0,4,25,0 -17971,1,6,0,13,5,0,5,4,2,0,8,2,4,17,0 -17972,9,6,0,2,4,3,1,5,0,1,9,7,2,26,1 -17973,0,8,0,11,5,5,10,3,1,0,13,2,5,4,0 -17974,0,4,0,4,1,2,1,0,4,1,13,19,0,0,0 -17975,6,7,0,9,6,0,13,4,0,1,11,2,3,13,0 -17976,0,8,0,1,3,1,6,2,2,1,17,2,1,39,0 -17977,0,4,0,13,2,2,10,3,2,1,2,9,5,18,0 -17978,0,6,0,9,0,5,5,4,0,0,8,15,5,29,0 -17979,1,5,0,6,3,2,6,3,3,1,10,15,0,8,0 -17980,0,1,0,13,4,1,1,2,4,1,13,6,0,38,0 -17981,9,3,0,0,5,2,9,2,0,1,8,11,0,4,0 -17982,2,5,0,7,2,5,1,4,1,1,13,8,2,26,0 -17983,9,4,0,0,2,0,9,1,1,1,2,2,0,7,0 -17984,7,2,0,13,1,5,13,4,3,0,2,14,2,11,0 -17985,9,0,0,12,4,1,0,5,0,0,11,7,4,11,1 -17986,1,6,0,15,2,3,0,3,3,1,13,14,4,7,0 -17987,10,6,0,6,0,1,10,4,0,0,11,1,4,24,1 
-17988,0,2,0,15,0,5,8,3,0,0,8,19,0,19,0 -17989,3,6,0,0,1,4,4,2,1,0,12,15,2,25,0 -17990,8,3,0,6,0,1,10,4,3,1,0,15,2,0,0 -17991,2,8,0,4,2,1,10,0,4,0,4,4,1,20,0 -17992,0,4,0,15,6,5,6,4,0,1,2,11,2,35,0 -17993,1,8,0,1,1,6,5,0,2,0,0,9,1,31,0 -17994,0,5,0,8,4,0,9,1,2,1,3,17,4,27,1 -17995,4,3,0,2,6,3,7,2,4,0,17,3,1,12,0 -17996,6,1,0,9,2,2,0,0,1,1,9,18,4,11,1 -17997,9,4,0,5,1,3,8,5,0,0,15,2,0,31,0 -17998,6,3,0,14,4,4,3,5,2,1,1,8,4,38,1 -17999,1,8,0,9,2,1,10,1,4,0,2,11,2,7,0 -18000,4,8,0,13,2,4,12,5,1,1,13,9,0,29,0 -18001,8,2,0,5,5,3,12,4,1,0,3,17,5,6,1 -18002,7,6,0,7,5,4,9,4,3,1,18,11,3,3,0 -18003,2,7,0,2,0,4,0,2,2,0,2,3,2,11,0 -18004,7,6,0,14,0,5,5,4,0,0,8,11,5,5,0 -18005,10,6,0,2,6,1,11,5,2,0,16,7,4,38,1 -18006,0,6,0,15,6,2,8,0,0,1,13,11,0,3,0 -18007,3,2,0,5,1,5,12,3,2,0,4,11,2,24,0 -18008,6,5,0,7,3,4,14,1,3,1,0,0,3,17,0 -18009,4,6,0,6,0,1,9,5,0,1,15,1,3,16,1 -18010,10,8,0,15,3,0,5,3,0,0,15,15,1,2,0 -18011,1,6,0,10,1,0,7,5,2,1,0,13,4,20,0 -18012,1,1,0,4,2,3,2,4,3,0,13,13,0,29,0 -18013,1,8,0,2,6,0,12,2,2,1,13,6,3,36,0 -18014,3,3,0,0,1,6,9,4,0,1,10,19,5,28,0 -18015,3,0,0,0,5,5,2,1,3,1,2,0,4,36,0 -18016,5,0,0,0,3,0,2,3,3,1,8,9,5,14,0 -18017,4,6,0,3,2,5,6,3,3,0,8,18,3,17,0 -18018,6,8,0,15,1,4,3,3,4,1,17,11,3,35,0 -18019,1,6,0,15,2,1,13,3,3,0,0,2,4,31,0 -18020,7,4,0,10,0,0,1,1,3,1,3,4,3,40,1 -18021,1,5,0,10,2,6,9,3,1,0,17,10,3,0,1 -18022,7,8,0,13,6,0,9,1,4,1,13,6,4,18,0 -18023,1,1,0,5,2,2,8,4,3,0,17,17,1,13,0 -18024,7,1,0,0,0,5,3,4,0,0,14,11,5,31,1 -18025,1,1,0,3,6,0,11,3,4,1,10,14,4,18,1 -18026,5,7,0,0,1,6,11,2,4,1,17,13,0,24,0 -18027,8,2,0,0,6,3,8,3,1,0,14,16,4,13,0 -18028,0,6,0,5,1,5,2,4,1,0,8,2,0,3,0 -18029,9,0,0,5,1,3,8,1,0,1,4,13,3,3,0 -18030,5,6,0,8,3,3,5,0,4,1,5,10,3,1,1 -18031,0,3,0,11,0,0,2,1,2,1,14,15,4,24,0 -18032,3,7,0,10,5,0,4,3,1,0,18,9,1,12,1 -18033,5,6,0,6,0,0,14,2,1,0,10,16,2,0,0 -18034,9,8,0,11,3,1,7,4,0,0,2,6,5,29,0 -18035,3,3,0,6,2,5,13,4,1,1,10,3,4,31,0 -18036,5,0,0,9,5,6,9,3,4,0,11,14,3,13,0 -18037,1,8,0,13,1,4,4,4,3,1,6,6,5,40,0 
-18038,9,0,0,3,4,1,10,2,3,1,3,16,2,40,1 -18039,3,1,0,9,1,4,14,2,1,0,5,4,1,20,1 -18040,9,1,0,7,4,6,7,1,0,0,12,5,2,4,0 -18041,8,5,0,0,0,5,1,3,0,0,2,4,1,2,0 -18042,0,3,0,14,2,4,14,1,4,1,10,2,0,19,0 -18043,9,3,0,8,4,4,9,0,0,1,8,6,4,18,0 -18044,5,6,0,15,5,3,0,4,4,1,13,9,1,27,0 -18045,10,5,0,4,6,5,4,1,1,1,16,17,5,10,1 -18046,9,6,0,7,0,4,0,4,2,1,8,7,5,13,0 -18047,2,5,0,7,5,2,9,5,2,0,8,9,2,34,0 -18048,5,5,0,13,3,6,3,3,1,0,8,12,5,11,0 -18049,9,7,0,10,5,2,12,5,3,0,14,10,3,8,1 -18050,6,4,0,10,0,6,13,5,2,0,10,5,5,35,1 -18051,3,2,0,7,2,6,3,2,1,1,8,20,4,27,0 -18052,1,6,0,11,0,0,7,4,4,0,8,18,3,4,0 -18053,9,1,0,3,0,3,11,4,2,1,6,1,1,21,1 -18054,8,7,0,2,3,6,9,1,0,1,1,8,3,28,1 -18055,6,6,0,8,4,6,5,0,2,1,14,13,4,21,1 -18056,6,2,0,4,0,5,8,3,0,0,2,12,0,5,0 -18057,6,5,0,0,6,5,13,5,1,0,5,10,3,23,1 -18058,1,4,0,11,5,0,14,1,4,1,10,6,2,33,0 -18059,0,0,0,10,1,6,14,4,4,0,14,6,0,4,0 -18060,0,6,0,2,2,4,11,3,2,1,13,6,1,28,0 -18061,7,6,0,3,6,3,6,3,4,1,2,17,0,20,0 -18062,10,5,0,10,1,6,6,1,4,0,15,6,0,25,0 -18063,1,2,0,15,4,6,9,0,1,0,7,13,1,34,0 -18064,0,6,0,3,0,4,10,3,4,0,17,11,1,18,0 -18065,0,0,0,13,5,5,11,1,4,0,11,1,4,25,0 -18066,3,1,0,14,4,3,10,4,0,0,12,13,0,28,0 -18067,0,5,0,8,6,5,3,2,3,1,3,0,0,26,0 -18068,0,7,0,3,3,0,5,4,4,0,18,10,5,22,1 -18069,3,4,0,5,2,3,10,2,2,0,6,11,3,33,0 -18070,3,1,0,6,3,3,5,5,2,0,17,7,4,35,0 -18071,7,1,0,3,2,5,7,1,0,0,18,7,5,40,1 -18072,1,8,0,1,0,4,14,0,4,1,8,11,0,14,0 -18073,10,4,0,7,0,2,4,5,1,1,9,9,5,7,1 -18074,7,1,0,0,3,6,8,1,2,1,18,10,3,7,1 -18075,4,8,0,1,1,0,9,2,1,0,8,2,0,4,0 -18076,5,8,0,13,5,1,11,1,1,0,4,11,0,3,0 -18077,7,2,0,5,6,4,0,2,4,1,2,18,2,5,0 -18078,1,5,0,13,0,6,12,2,2,0,6,2,4,35,0 -18079,8,1,0,6,3,2,1,2,1,0,14,7,2,31,1 -18080,1,4,0,2,0,1,14,0,1,1,17,9,2,14,0 -18081,5,2,0,13,5,4,6,2,1,1,6,3,3,21,1 -18082,2,6,0,11,6,6,6,1,4,1,8,2,5,33,0 -18083,0,1,0,10,1,2,5,4,1,0,8,13,0,34,0 -18084,3,3,0,6,0,4,3,5,2,0,13,2,4,1,0 -18085,2,2,0,15,6,6,12,5,0,1,0,11,5,31,0 -18086,5,2,0,13,0,3,2,4,1,1,12,13,3,40,0 -18087,8,8,0,12,2,5,1,4,3,0,8,0,3,34,0 
-18088,4,7,0,8,2,1,9,2,4,0,8,6,2,10,0 -18089,2,0,0,11,6,0,5,5,0,0,13,20,4,6,0 -18090,0,2,0,4,0,4,4,4,2,1,2,13,0,23,0 -18091,1,8,0,3,1,1,10,5,0,0,5,10,2,8,1 -18092,2,2,0,3,0,5,7,2,4,0,8,0,3,18,0 -18093,1,8,0,6,0,0,10,1,2,1,8,15,4,21,0 -18094,9,5,0,2,6,0,11,3,0,1,2,6,2,10,0 -18095,7,3,0,11,3,0,14,4,1,1,13,5,0,33,0 -18096,4,2,0,7,1,3,14,5,1,1,11,18,5,30,1 -18097,3,6,0,15,0,6,11,1,0,0,14,8,0,29,0 -18098,4,4,0,1,6,1,5,3,3,0,18,12,5,10,1 -18099,9,3,0,0,4,4,4,0,1,0,13,13,5,14,0 -18100,5,6,0,3,5,0,0,3,4,1,13,2,1,38,0 -18101,8,2,0,6,1,3,3,4,3,0,4,18,2,11,0 -18102,2,7,0,13,5,4,9,4,1,0,0,3,0,32,0 -18103,9,0,0,8,3,1,8,2,2,0,12,11,1,4,0 -18104,4,8,0,0,6,0,5,0,4,0,14,2,5,33,0 -18105,4,0,0,10,2,5,12,3,4,0,6,11,1,6,0 -18106,2,5,0,9,2,6,14,5,2,1,3,7,0,10,1 -18107,8,4,0,14,3,5,12,0,2,0,9,7,0,22,1 -18108,0,5,0,4,0,5,10,2,1,0,13,6,5,29,0 -18109,6,1,0,2,3,3,6,2,1,0,6,2,4,12,0 -18110,9,8,0,11,4,5,13,5,0,1,8,12,2,26,0 -18111,7,5,0,1,0,6,10,1,2,1,2,8,1,39,0 -18112,3,8,0,8,6,4,4,1,2,0,9,20,5,31,0 -18113,1,8,0,3,1,4,7,3,0,0,13,13,3,2,0 -18114,1,4,0,10,0,4,7,3,1,1,6,11,2,6,0 -18115,9,6,0,0,3,0,5,1,4,1,2,16,0,10,0 -18116,7,1,0,12,4,5,0,1,0,1,2,16,4,10,0 -18117,0,3,0,4,6,0,9,2,1,1,2,12,3,6,0 -18118,7,2,0,10,0,0,0,1,2,0,6,2,4,8,0 -18119,5,5,0,10,2,4,8,3,2,0,13,13,1,33,0 -18120,6,5,0,14,2,1,4,5,4,1,1,7,0,25,1 -18121,0,7,0,3,6,5,8,3,3,0,7,9,2,13,0 -18122,8,4,0,0,2,1,5,0,0,0,8,0,0,34,0 -18123,5,6,0,12,0,5,7,3,0,0,8,5,0,31,0 -18124,5,2,0,11,4,6,9,5,3,1,18,17,5,17,1 -18125,6,6,0,7,6,3,1,4,4,1,13,20,0,6,0 -18126,1,2,0,2,5,4,2,0,1,1,2,2,2,25,0 -18127,0,1,0,0,0,6,3,4,4,1,2,15,1,1,0 -18128,8,5,0,12,0,0,6,1,1,0,10,15,1,21,0 -18129,5,8,0,9,4,6,0,0,1,1,1,14,2,4,1 -18130,7,8,0,4,5,0,2,0,4,1,18,1,5,8,1 -18131,1,4,0,11,5,0,0,2,1,1,4,9,0,20,0 -18132,3,2,0,0,4,2,2,5,3,0,13,13,5,18,0 -18133,0,5,0,13,5,6,9,2,1,1,2,2,4,2,0 -18134,9,1,0,6,5,5,7,4,4,0,15,13,3,28,0 -18135,6,7,0,11,4,6,6,2,2,1,10,15,1,9,0 -18136,1,6,0,12,2,5,13,5,2,0,13,15,0,1,0 -18137,3,8,0,4,1,4,0,1,4,1,13,15,0,38,0 
-18138,0,2,0,1,6,5,7,3,3,0,4,6,0,7,0 -18139,3,6,0,8,4,4,14,4,4,1,10,1,0,2,1 -18140,2,1,0,11,4,0,12,2,3,1,13,20,3,38,0 -18141,5,5,0,2,0,6,4,1,0,0,12,1,2,39,1 -18142,10,7,0,4,0,6,9,0,2,0,7,2,1,16,0 -18143,3,6,0,13,5,0,13,0,2,1,13,15,1,34,0 -18144,9,7,0,5,3,5,0,5,3,0,2,2,3,2,0 -18145,0,7,0,12,5,0,8,1,4,0,13,2,1,2,0 -18146,0,1,0,12,5,3,0,5,4,1,12,7,5,0,1 -18147,2,0,0,8,2,4,1,1,0,0,2,15,1,25,0 -18148,10,6,0,4,3,6,10,0,0,0,0,12,0,9,0 -18149,1,6,0,3,0,5,3,5,3,1,3,11,5,27,0 -18150,0,7,0,14,5,4,4,4,1,0,6,9,1,33,0 -18151,1,3,0,15,2,4,7,0,0,1,4,16,2,20,0 -18152,1,4,0,2,6,6,8,3,2,1,2,13,5,6,0 -18153,8,1,0,9,2,6,3,3,2,1,16,14,3,4,1 -18154,0,4,0,15,4,3,8,0,2,1,7,3,2,12,0 -18155,9,2,0,12,5,2,5,4,4,1,18,19,4,24,1 -18156,1,7,0,0,0,5,4,0,4,1,8,2,2,33,0 -18157,7,0,0,5,4,2,9,3,2,0,14,20,2,38,0 -18158,6,7,0,11,3,4,7,4,0,0,17,2,2,12,0 -18159,4,3,0,4,0,5,6,4,2,0,17,12,2,17,0 -18160,7,6,0,7,3,6,13,5,1,1,10,8,5,7,1 -18161,3,2,0,3,1,0,10,4,3,1,2,10,0,27,0 -18162,0,2,0,9,4,1,12,3,0,1,2,6,3,14,0 -18163,2,1,0,15,5,5,8,1,3,1,8,11,0,3,0 -18164,0,6,0,2,1,3,9,3,1,1,0,2,0,4,0 -18165,2,0,0,5,6,5,0,1,0,1,8,11,0,40,0 -18166,1,6,0,7,1,4,14,5,0,1,5,11,3,4,0 -18167,9,6,0,5,5,2,2,5,3,0,14,0,3,12,1 -18168,8,5,0,0,1,3,13,4,1,1,17,16,4,20,0 -18169,4,4,0,9,4,4,4,5,1,1,4,8,0,1,1 -18170,0,7,0,8,2,5,11,3,3,0,13,2,4,14,0 -18171,6,7,0,7,0,0,7,3,1,1,6,3,4,31,0 -18172,5,8,0,2,5,5,12,3,2,0,17,3,0,37,0 -18173,3,4,0,0,6,0,0,1,4,1,3,3,3,8,0 -18174,8,8,0,6,2,2,2,5,0,1,14,2,5,9,1 -18175,2,3,0,15,2,0,1,1,0,0,6,9,3,11,0 -18176,1,4,0,6,2,5,5,3,3,0,6,17,1,38,0 -18177,0,1,0,8,4,1,13,0,0,0,5,10,3,13,1 -18178,0,8,0,3,0,5,5,5,4,1,17,2,1,28,0 -18179,8,0,0,1,0,6,6,1,3,1,4,11,1,41,0 -18180,2,2,0,14,1,5,2,3,0,0,2,6,0,35,0 -18181,6,6,0,14,6,0,10,1,3,0,10,15,2,7,0 -18182,7,5,0,5,3,1,0,3,2,1,4,1,1,33,1 -18183,2,6,0,15,5,5,3,4,0,0,17,15,0,3,0 -18184,4,7,0,9,0,0,8,1,0,1,8,2,2,16,0 -18185,5,1,0,3,5,4,4,0,2,0,2,16,4,33,0 -18186,7,6,0,14,6,0,11,5,4,0,1,20,1,35,1 -18187,3,3,0,3,6,4,5,0,2,1,13,11,3,7,0 -18188,8,7,0,4,2,6,3,0,3,0,18,7,3,35,1 
-18189,1,5,0,1,0,3,4,1,3,0,16,1,0,28,1 -18190,10,4,0,8,2,3,2,4,0,1,5,7,4,35,1 -18191,0,8,0,12,6,2,3,3,3,0,17,20,4,9,0 -18192,9,3,0,11,5,0,0,4,4,1,11,1,1,1,1 -18193,5,3,0,0,0,5,3,2,2,0,8,2,4,32,0 -18194,5,8,0,5,5,4,9,3,0,0,13,2,0,29,0 -18195,6,6,0,10,0,4,12,3,0,0,17,2,0,40,0 -18196,2,0,0,1,0,3,4,2,4,1,13,13,1,3,0 -18197,5,6,0,13,1,5,12,4,0,1,11,2,0,8,0 -18198,0,4,0,11,2,2,9,5,3,0,2,2,3,14,0 -18199,4,1,0,11,5,0,1,3,0,0,8,2,5,11,0 -18200,2,7,0,6,2,6,11,5,4,0,8,6,0,7,0 -18201,0,3,0,11,2,4,10,4,4,1,13,7,1,29,0 -18202,8,1,0,12,3,0,14,3,2,1,17,20,0,39,0 -18203,1,6,0,3,0,5,6,2,1,0,6,15,3,21,0 -18204,0,4,0,6,6,1,1,5,0,1,12,19,2,31,1 -18205,10,1,0,5,3,0,5,4,1,0,5,5,4,40,1 -18206,0,5,0,7,3,4,12,5,3,1,13,9,5,28,0 -18207,0,5,0,14,1,6,10,2,0,0,8,20,1,7,0 -18208,4,6,0,6,0,6,6,4,2,0,1,14,1,28,0 -18209,0,1,0,1,6,0,8,4,2,0,13,11,1,39,0 -18210,4,8,0,6,6,4,2,5,4,0,18,7,4,22,1 -18211,7,1,0,11,5,5,6,4,0,0,11,2,0,14,0 -18212,10,7,0,1,0,2,1,4,3,0,18,7,5,38,1 -18213,6,8,0,4,0,3,4,3,3,1,5,7,2,27,1 -18214,2,8,0,3,2,3,3,2,3,1,2,18,4,27,0 -18215,7,0,0,3,1,2,14,1,0,1,4,18,2,7,0 -18216,0,0,0,13,5,0,10,1,4,0,4,2,4,22,0 -18217,6,5,0,2,2,4,12,1,0,1,16,11,5,30,0 -18218,5,6,0,4,1,3,3,0,4,1,8,12,0,2,0 -18219,9,0,0,1,0,4,6,1,1,0,11,2,5,2,0 -18220,0,8,0,12,5,6,12,4,3,1,13,6,2,16,0 -18221,1,3,0,1,4,5,9,3,0,1,17,20,4,30,0 -18222,8,7,0,0,6,5,8,0,0,1,13,2,1,25,0 -18223,5,2,0,11,0,4,2,1,2,0,13,2,5,5,0 -18224,6,8,0,15,2,0,11,1,2,0,4,11,3,8,0 -18225,0,7,0,11,0,1,3,4,2,0,18,12,1,30,1 -18226,9,2,0,12,6,4,6,2,3,1,8,11,2,14,0 -18227,8,1,0,2,3,1,8,5,0,1,18,8,1,21,1 -18228,0,1,0,0,0,0,12,3,4,1,14,20,1,11,0 -18229,1,6,0,14,5,3,5,1,0,1,8,7,0,13,0 -18230,1,2,0,9,1,4,14,1,4,0,2,13,4,22,0 -18231,2,6,0,1,1,5,8,2,4,0,12,13,5,0,0 -18232,4,8,0,12,0,1,8,3,1,1,16,20,2,22,1 -18233,4,0,0,10,5,4,5,1,3,0,4,15,1,27,0 -18234,3,5,0,4,3,6,14,3,1,0,2,13,5,37,0 -18235,2,4,0,11,0,6,1,2,2,0,2,11,5,10,0 -18236,5,2,0,15,0,4,11,4,1,0,7,8,3,20,0 -18237,10,1,0,13,6,4,4,3,0,1,6,13,2,28,0 -18238,3,5,0,1,4,4,0,3,1,0,2,6,1,21,0 
-18239,5,5,0,8,0,0,9,2,1,0,3,16,4,21,1 -18240,3,5,0,7,0,0,3,0,4,0,17,1,5,11,1 -18241,1,1,0,12,6,0,13,5,0,1,5,5,3,17,1 -18242,0,0,0,6,0,5,6,4,2,0,8,8,1,8,0 -18243,0,2,0,6,4,0,13,3,1,1,13,11,1,41,0 -18244,7,8,0,13,6,0,11,3,0,1,2,10,4,37,0 -18245,9,8,0,2,2,1,2,0,3,0,11,7,1,12,1 -18246,7,4,0,6,2,6,10,5,4,1,5,12,5,30,1 -18247,2,6,0,11,1,5,6,3,0,1,12,14,4,41,0 -18248,0,3,0,14,0,4,3,4,4,0,8,6,4,29,0 -18249,3,5,0,1,2,2,4,1,1,0,8,4,1,16,0 -18250,0,5,0,9,5,3,10,3,3,1,10,2,2,5,0 -18251,0,2,0,15,6,4,7,1,4,1,13,11,1,22,0 -18252,0,2,0,15,0,0,14,5,4,0,12,18,4,14,0 -18253,0,0,0,15,2,3,4,4,0,0,15,15,4,35,0 -18254,9,1,0,1,6,4,5,3,0,0,8,15,5,10,0 -18255,5,4,0,9,6,4,12,3,1,0,17,6,4,10,0 -18256,10,8,0,15,0,0,8,0,0,1,13,0,3,18,0 -18257,2,2,0,13,6,4,3,5,3,1,17,0,5,20,0 -18258,6,4,0,3,5,4,10,4,0,1,11,11,1,23,0 -18259,2,2,0,8,0,1,10,2,1,0,8,15,0,0,0 -18260,4,0,0,7,1,0,6,0,2,1,10,2,3,33,0 -18261,1,6,0,11,1,0,1,3,4,1,8,11,3,28,0 -18262,0,2,0,14,2,1,1,3,1,0,0,4,4,14,1 -18263,2,2,0,12,4,5,0,2,1,0,0,13,0,27,0 -18264,0,7,0,1,1,2,8,3,3,0,15,10,4,26,0 -18265,1,6,0,1,4,5,14,0,4,0,4,15,4,13,0 -18266,4,7,0,1,1,6,9,4,0,0,17,3,5,37,0 -18267,1,6,0,4,3,6,6,2,4,0,8,9,0,25,0 -18268,9,8,0,6,0,2,13,1,3,0,2,4,4,35,0 -18269,4,2,0,13,4,0,10,1,1,0,18,19,5,31,1 -18270,5,7,0,9,5,3,6,1,1,0,18,2,4,25,0 -18271,8,6,0,13,0,5,0,2,2,0,13,16,4,16,0 -18272,9,7,0,14,3,2,12,1,1,1,9,8,5,27,1 -18273,8,4,0,7,5,0,11,3,3,0,10,2,4,39,0 -18274,4,2,0,12,6,4,0,2,1,0,11,6,2,20,0 -18275,3,8,0,1,6,5,5,1,0,0,13,2,3,26,0 -18276,9,5,0,8,0,4,3,3,0,0,3,10,4,39,1 -18277,0,1,0,1,0,3,2,5,3,1,2,9,2,39,0 -18278,8,6,0,4,1,5,4,2,4,0,15,0,2,29,0 -18279,2,3,0,7,5,2,14,4,4,0,6,9,3,4,0 -18280,3,0,0,0,0,6,1,1,2,0,15,15,0,29,0 -18281,8,8,0,4,2,6,5,5,2,0,0,9,0,4,0 -18282,10,6,0,3,1,5,6,5,4,0,16,9,3,38,0 -18283,4,2,0,6,6,2,10,5,2,0,18,1,2,0,1 -18284,2,0,0,1,2,3,8,3,1,0,10,15,1,13,0 -18285,10,6,0,2,0,1,14,3,4,0,9,17,5,32,1 -18286,9,2,0,5,1,0,0,1,2,0,8,0,3,7,0 -18287,0,8,0,11,0,5,4,4,1,0,6,2,2,38,0 -18288,6,1,0,1,6,5,1,2,2,0,18,5,0,39,1 
-18289,7,2,0,8,2,5,0,4,0,1,14,3,3,40,1 -18290,0,5,0,0,5,2,8,4,1,0,2,13,5,4,0 -18291,3,6,0,5,0,5,8,3,1,0,6,18,0,36,0 -18292,3,7,0,7,0,5,3,1,3,1,10,12,2,29,0 -18293,4,8,0,6,2,5,8,2,3,1,13,2,0,13,0 -18294,0,7,0,5,1,5,8,2,0,0,17,19,0,3,0 -18295,1,4,0,3,0,5,8,2,0,1,0,9,0,16,0 -18296,1,2,0,14,1,3,11,4,3,1,15,15,5,4,0 -18297,5,8,0,10,2,5,7,3,2,1,17,6,4,17,0 -18298,3,5,0,6,0,6,6,1,1,0,4,9,4,5,0 -18299,0,1,0,10,1,4,5,4,0,1,8,0,4,13,0 -18300,1,2,0,2,2,3,14,1,1,0,13,15,0,29,0 -18301,4,5,0,15,6,1,11,0,4,1,1,7,5,36,1 -18302,10,1,0,12,4,4,7,5,4,0,18,7,2,17,1 -18303,10,1,0,10,2,1,7,1,1,0,11,19,4,21,1 -18304,4,4,0,5,3,0,14,3,4,1,0,3,2,14,0 -18305,0,2,0,3,0,5,7,4,0,1,0,18,3,18,0 -18306,5,4,0,11,5,0,9,3,2,1,4,3,1,13,0 -18307,1,1,0,1,2,0,3,3,1,1,1,7,5,7,1 -18308,1,6,0,3,2,0,9,0,3,1,8,0,0,36,0 -18309,2,3,0,4,0,5,14,4,2,1,2,2,0,26,0 -18310,0,1,0,13,3,3,4,5,3,0,11,0,5,3,0 -18311,0,1,0,8,6,6,0,4,2,1,7,16,4,39,1 -18312,7,0,0,14,2,5,12,5,1,1,18,1,2,38,1 -18313,6,6,0,3,3,3,11,4,0,1,2,18,2,0,0 -18314,1,5,0,1,1,0,6,2,3,1,0,3,0,32,0 -18315,3,4,0,3,2,4,11,4,2,1,8,8,0,36,0 -18316,6,6,0,10,6,1,10,1,3,1,9,17,2,22,1 -18317,1,5,0,14,6,1,0,1,2,0,2,15,0,35,0 -18318,10,5,0,9,5,0,3,2,1,0,9,8,4,5,1 -18319,6,8,0,7,3,2,11,0,1,0,13,11,1,0,0 -18320,1,4,0,7,2,4,9,4,0,1,6,13,2,38,0 -18321,4,2,0,5,6,5,9,5,0,1,17,15,3,24,0 -18322,2,1,0,3,2,2,4,3,2,1,2,6,0,39,0 -18323,1,1,0,13,1,1,6,0,3,1,18,19,2,33,1 -18324,5,2,0,3,6,0,13,2,2,0,13,20,5,31,0 -18325,1,1,0,0,0,6,5,4,4,0,13,2,2,0,0 -18326,1,0,0,1,0,2,3,0,4,1,0,9,3,41,0 -18327,8,8,0,2,2,5,14,2,3,0,6,17,0,16,0 -18328,1,6,0,5,6,5,6,0,4,1,13,11,2,29,0 -18329,2,0,0,9,1,1,2,2,1,1,7,19,5,32,1 -18330,0,2,0,11,0,3,1,0,2,1,17,15,4,0,0 -18331,2,0,0,11,2,1,13,4,2,0,17,16,3,14,0 -18332,9,2,0,6,1,6,6,1,3,1,13,15,1,2,0 -18333,0,5,0,1,4,3,6,4,0,0,4,18,3,33,0 -18334,2,8,0,13,0,3,12,4,2,0,2,13,0,27,0 -18335,9,5,0,11,3,4,7,4,2,1,9,19,4,10,1 -18336,1,3,0,3,6,4,6,3,2,1,12,17,3,6,0 -18337,5,6,0,0,3,2,14,5,2,0,14,7,4,18,1 -18338,0,7,0,15,0,6,3,0,0,1,2,2,2,39,0 
-18339,2,6,0,7,0,6,12,3,1,0,2,11,2,4,0 -18340,1,4,0,11,1,3,13,5,4,1,1,13,4,7,0 -18341,4,8,0,13,5,5,12,0,0,0,8,2,5,5,0 -18342,3,2,0,0,6,2,4,4,1,0,7,6,3,31,0 -18343,5,3,0,7,3,4,4,0,0,0,2,11,5,19,0 -18344,4,3,0,4,1,0,3,3,0,0,2,0,2,41,0 -18345,7,4,0,9,6,6,13,2,3,1,11,13,5,16,0 -18346,10,4,0,9,6,4,6,0,2,1,10,14,0,39,0 -18347,10,3,0,3,2,3,0,0,0,1,4,2,3,17,0 -18348,1,8,0,11,1,0,14,3,0,0,4,9,3,29,0 -18349,3,1,0,3,0,3,6,4,1,0,13,15,0,12,0 -18350,10,5,0,13,5,4,8,2,0,0,13,2,3,35,0 -18351,0,0,0,10,5,0,5,2,3,1,8,9,0,25,0 -18352,2,1,0,5,0,4,3,3,0,1,4,9,2,20,0 -18353,0,0,0,3,0,0,5,2,4,0,2,2,5,40,0 -18354,2,6,0,4,6,5,5,0,2,0,2,18,0,26,0 -18355,3,8,0,13,2,2,6,4,0,1,2,9,1,36,0 -18356,2,8,0,11,6,4,3,2,2,0,15,3,0,35,0 -18357,10,7,0,15,1,0,8,3,1,1,2,16,1,17,0 -18358,4,7,0,3,0,3,9,1,1,1,2,2,1,14,0 -18359,9,1,0,9,5,5,3,1,4,1,13,6,0,29,0 -18360,1,7,0,14,6,0,9,4,1,1,8,11,4,2,0 -18361,0,1,0,3,6,3,4,5,0,0,0,12,4,26,0 -18362,1,5,0,0,1,2,0,3,1,0,6,11,5,35,0 -18363,4,3,0,15,1,4,3,4,3,0,8,13,0,37,0 -18364,8,1,0,12,2,0,6,1,4,1,13,16,0,30,0 -18365,10,2,0,9,4,2,12,3,2,1,12,18,0,36,1 -18366,3,6,0,3,3,3,7,5,3,1,2,11,4,35,0 -18367,3,0,0,5,0,5,7,1,0,1,11,6,0,5,0 -18368,5,3,0,2,0,0,5,2,3,0,4,2,0,16,0 -18369,6,2,0,10,6,0,1,5,1,0,4,8,0,39,1 -18370,2,7,0,7,1,4,9,2,4,0,13,6,2,33,0 -18371,5,2,0,10,0,1,3,5,1,0,2,9,1,31,0 -18372,1,0,0,13,4,4,3,1,0,1,8,6,0,26,0 -18373,8,2,0,3,4,2,9,1,4,0,8,9,0,30,0 -18374,10,7,0,13,2,4,9,4,2,1,13,20,4,23,0 -18375,2,2,0,5,2,4,12,2,2,1,4,11,3,38,0 -18376,2,3,0,6,3,0,11,1,0,1,8,6,0,37,0 -18377,9,4,0,1,0,2,2,5,2,0,9,13,2,39,1 -18378,1,2,0,10,2,2,6,5,0,1,17,18,2,18,0 -18379,1,3,0,0,6,5,7,3,4,1,13,2,0,6,0 -18380,0,6,0,15,0,6,6,0,3,0,8,0,0,7,0 -18381,1,3,0,4,4,3,14,4,2,0,16,15,0,22,0 -18382,6,6,0,10,4,6,1,5,4,1,9,1,4,21,1 -18383,3,6,0,9,4,4,2,1,1,1,13,9,4,36,0 -18384,10,4,0,8,3,4,14,3,0,0,17,6,5,16,0 -18385,0,5,0,3,1,2,1,0,2,0,13,13,2,6,0 -18386,0,4,0,1,2,2,14,2,1,0,15,6,0,41,0 -18387,2,4,0,14,6,1,9,0,0,1,0,7,5,3,0 -18388,2,6,0,14,1,0,0,3,3,0,17,20,0,8,0 
-18389,0,5,0,5,1,4,13,3,1,1,9,14,5,32,1 -18390,3,8,0,4,6,1,13,4,4,1,15,7,2,37,1 -18391,9,7,0,1,0,3,4,3,3,0,2,11,0,34,0 -18392,5,6,0,14,0,0,1,3,1,0,2,18,0,24,0 -18393,4,8,0,5,3,5,12,1,0,1,2,1,1,13,0 -18394,9,8,0,3,0,6,0,3,0,1,10,2,2,40,0 -18395,0,2,0,13,0,5,5,1,3,0,13,2,2,11,0 -18396,0,8,0,10,5,5,5,2,3,0,14,9,3,16,0 -18397,0,6,0,2,3,6,6,4,2,0,15,18,5,32,0 -18398,1,5,0,13,6,1,9,3,4,0,12,2,2,0,0 -18399,0,3,0,10,0,4,5,3,4,0,0,11,0,14,0 -18400,0,7,0,3,0,1,3,4,0,0,10,11,5,12,0 -18401,0,2,0,2,0,0,11,4,4,0,13,10,4,35,0 -18402,1,7,0,3,0,6,3,4,3,1,13,9,2,14,0 -18403,3,8,0,9,6,0,4,2,0,1,13,19,1,8,0 -18404,0,2,0,13,1,4,6,1,0,0,11,0,4,5,1 -18405,0,3,0,5,3,6,3,3,2,1,6,3,1,23,0 -18406,9,6,0,4,0,6,7,4,2,0,2,2,0,39,0 -18407,1,7,0,6,1,1,11,2,1,0,8,2,1,25,0 -18408,1,4,0,0,6,0,4,3,2,0,8,8,2,37,0 -18409,2,6,0,5,3,5,7,5,3,1,6,11,5,7,0 -18410,3,0,0,5,6,1,0,5,2,1,7,10,5,34,1 -18411,8,0,0,13,4,5,1,5,0,1,13,20,4,0,0 -18412,9,7,0,1,5,5,13,0,2,0,11,4,1,10,0 -18413,3,4,0,6,5,4,13,4,3,1,8,0,3,31,0 -18414,0,6,0,11,3,6,0,3,2,1,3,13,2,5,0 -18415,8,4,0,2,5,5,9,2,1,0,18,10,0,16,1 -18416,1,5,0,11,3,4,0,0,1,1,9,5,3,41,1 -18417,5,3,0,13,5,3,3,0,1,0,2,4,5,17,0 -18418,0,7,0,13,5,5,6,0,2,1,14,2,1,0,0 -18419,10,6,0,13,1,0,8,4,1,0,2,18,0,39,0 -18420,7,6,0,6,2,2,10,1,2,0,2,2,1,27,0 -18421,3,1,0,7,5,6,4,0,2,0,13,9,2,28,0 -18422,4,4,0,13,5,2,5,5,2,0,0,2,2,35,0 -18423,1,6,0,15,5,1,3,4,4,1,6,7,0,3,0 -18424,9,0,0,13,0,5,0,0,2,0,13,18,2,22,0 -18425,3,7,0,5,6,5,10,1,0,1,4,4,5,36,0 -18426,8,8,0,13,0,0,6,3,0,0,17,15,4,19,0 -18427,3,0,0,5,3,3,10,2,4,1,13,2,0,34,0 -18428,3,6,0,9,6,4,6,4,4,1,2,6,5,27,0 -18429,10,5,0,10,1,0,6,3,1,0,10,11,3,36,0 -18430,8,1,0,5,2,5,12,3,0,1,8,4,3,39,0 -18431,8,4,0,7,0,1,10,0,4,1,11,12,4,29,1 -18432,2,0,0,9,6,5,2,4,4,0,9,15,4,33,0 -18433,3,8,0,11,0,6,14,1,1,0,13,11,5,34,0 -18434,3,6,0,4,2,4,9,2,0,1,8,4,5,4,0 -18435,7,7,0,10,4,5,14,5,3,0,18,6,4,33,1 -18436,8,6,0,0,0,3,10,3,4,0,3,7,5,8,1 -18437,8,3,0,0,6,5,4,0,1,1,14,3,0,7,1 -18438,8,3,0,9,2,6,7,5,4,0,18,0,4,26,1 
-18439,8,6,0,3,1,2,13,3,1,1,2,3,0,8,0 -18440,2,8,0,4,2,0,5,3,3,1,17,6,2,8,0 -18441,8,4,0,3,1,0,11,3,4,1,2,13,4,19,0 -18442,0,3,0,15,0,4,9,3,0,1,8,11,5,0,0 -18443,3,5,0,11,1,1,5,3,4,1,4,14,0,41,0 -18444,6,0,0,14,4,1,4,5,3,1,6,10,2,6,1 -18445,7,0,0,2,1,0,14,3,0,0,7,2,5,22,0 -18446,7,6,0,6,5,3,14,0,3,0,3,9,5,7,1 -18447,10,8,0,4,4,1,0,0,4,1,17,3,4,4,0 -18448,5,2,0,5,2,2,13,5,2,0,5,17,2,22,1 -18449,7,5,0,1,1,3,8,3,1,1,4,16,3,37,0 -18450,5,7,0,10,4,1,9,0,0,1,3,10,3,12,1 -18451,0,3,0,12,0,0,9,3,4,0,2,20,3,6,0 -18452,3,1,0,5,6,5,1,2,1,0,12,8,4,14,1 -18453,0,7,0,13,0,6,0,3,2,0,4,2,4,9,0 -18454,0,3,0,3,1,0,4,1,0,0,8,3,0,40,0 -18455,7,0,0,15,3,3,3,2,3,1,14,11,0,19,0 -18456,2,6,0,6,2,3,9,5,3,1,15,2,4,3,0 -18457,1,3,0,13,0,4,5,2,2,0,10,11,1,32,0 -18458,1,2,0,3,6,1,1,2,4,0,10,15,0,11,0 -18459,1,8,0,6,6,4,10,5,2,1,2,13,4,18,0 -18460,8,1,0,3,2,4,10,1,0,1,13,6,5,16,0 -18461,3,1,0,2,0,4,14,5,3,1,8,3,2,31,0 -18462,0,7,0,1,1,4,14,4,3,0,6,11,0,22,0 -18463,0,6,0,8,0,5,7,3,2,1,8,9,0,1,0 -18464,9,6,0,3,4,0,4,0,1,0,13,6,0,6,0 -18465,2,1,0,13,2,5,7,1,2,0,13,15,0,4,0 -18466,9,6,0,13,3,4,14,3,0,1,9,17,3,39,1 -18467,6,2,0,4,4,1,3,5,0,1,18,7,3,4,1 -18468,6,8,0,7,4,6,12,0,0,0,18,7,5,23,1 -18469,1,6,0,0,0,0,9,1,0,1,13,15,3,38,0 -18470,2,0,0,10,5,4,12,2,3,1,16,11,5,27,0 -18471,2,2,0,10,6,5,9,3,0,1,0,14,4,35,0 -18472,8,2,0,5,5,5,12,5,2,0,1,10,3,29,1 -18473,8,5,0,3,0,0,9,3,3,1,13,18,0,29,0 -18474,2,6,0,5,0,1,7,5,4,0,2,9,1,8,0 -18475,7,5,0,11,6,0,14,4,4,1,18,17,1,33,1 -18476,8,2,0,11,4,4,4,2,4,1,17,13,4,30,1 -18477,8,2,0,14,3,3,10,4,1,0,9,8,3,5,1 -18478,9,6,0,12,4,5,5,2,0,0,5,6,2,30,0 -18479,1,3,0,3,2,3,9,2,1,1,6,14,4,23,0 -18480,0,0,0,10,6,0,13,5,0,0,18,4,2,29,1 -18481,2,2,0,14,0,6,4,1,2,0,18,7,5,31,1 -18482,5,8,0,11,2,1,0,5,0,0,11,9,3,22,1 -18483,5,2,0,10,3,1,10,1,4,1,11,12,0,41,1 -18484,0,3,0,5,4,6,6,2,3,1,2,11,0,38,0 -18485,9,7,0,1,3,2,0,2,0,1,18,17,2,23,1 -18486,0,2,0,2,1,0,11,0,1,0,0,10,1,30,0 -18487,0,3,0,0,6,6,5,4,0,0,10,3,1,21,0 -18488,4,2,0,10,3,1,8,1,1,1,11,6,4,36,1 
-18489,2,5,0,1,0,3,0,4,0,0,12,13,2,13,0 -18490,5,4,0,3,0,6,6,1,3,0,17,13,1,10,0 -18491,4,2,0,12,2,1,10,0,2,1,18,14,4,25,1 -18492,1,4,0,6,6,5,6,4,2,1,1,2,3,2,0 -18493,0,1,0,2,0,6,7,2,4,0,12,2,4,40,0 -18494,5,3,0,14,6,6,0,5,3,1,18,5,3,13,1 -18495,1,3,0,5,0,3,5,0,2,1,2,12,4,19,0 -18496,10,1,0,7,6,3,10,0,2,1,12,10,5,13,1 -18497,8,3,0,11,5,5,3,4,2,0,6,15,2,11,0 -18498,6,6,0,1,5,5,6,3,0,0,15,12,4,34,1 -18499,1,7,0,4,6,5,4,5,3,1,12,5,5,14,0 -18500,3,1,0,13,0,5,6,4,4,0,2,0,1,25,0 -18501,8,0,0,12,0,5,10,2,0,0,14,7,0,26,1 -18502,10,0,0,14,1,0,7,1,3,1,3,13,2,8,0 -18503,7,6,0,14,4,1,13,1,4,0,14,1,4,19,1 -18504,2,4,0,0,4,6,11,4,4,0,10,10,1,23,1 -18505,0,6,0,4,5,5,5,4,0,1,17,5,5,32,0 -18506,0,4,0,9,0,1,11,5,2,1,9,12,4,38,1 -18507,0,7,0,3,5,6,3,4,4,0,12,2,3,19,0 -18508,0,4,0,2,6,5,2,3,2,0,0,4,0,38,0 -18509,6,7,0,3,2,1,13,0,4,1,1,11,1,8,1 -18510,7,1,0,10,0,5,7,1,3,0,8,2,4,24,0 -18511,6,2,0,3,2,5,4,1,4,1,17,6,4,23,0 -18512,3,6,0,14,5,1,13,0,0,1,5,19,3,6,1 -18513,6,1,0,8,6,1,8,2,0,0,12,18,4,25,1 -18514,1,5,0,15,6,5,14,4,3,1,10,15,0,25,0 -18515,0,2,0,1,6,4,6,3,1,0,7,3,2,36,0 -18516,6,1,0,6,1,2,9,1,0,0,13,13,3,1,0 -18517,4,5,0,0,3,4,12,4,0,0,8,4,4,39,0 -18518,7,4,0,14,4,3,10,5,4,1,18,19,2,41,1 -18519,8,2,0,4,6,1,14,2,0,1,18,4,1,13,1 -18520,3,1,0,6,1,4,4,2,2,1,15,0,4,5,0 -18521,2,3,0,15,5,4,10,4,0,1,10,18,3,27,0 -18522,8,1,0,10,4,3,2,1,4,1,18,14,4,10,1 -18523,6,0,0,5,0,0,2,3,1,1,12,17,3,30,0 -18524,3,2,0,2,1,4,7,1,2,1,6,8,4,7,1 -18525,0,6,0,4,0,0,10,1,0,0,10,15,1,7,0 -18526,8,5,0,3,4,3,14,5,4,1,9,5,4,7,1 -18527,9,6,0,8,2,1,5,2,0,1,13,6,3,24,0 -18528,10,7,0,8,2,4,14,0,2,1,13,19,3,5,0 -18529,4,4,0,2,1,2,11,5,3,1,9,15,3,30,1 -18530,9,7,0,3,2,2,3,5,2,1,14,7,0,7,1 -18531,0,2,0,6,5,4,0,5,0,0,18,5,4,34,1 -18532,4,1,0,13,0,5,2,4,4,1,4,9,5,1,0 -18533,7,5,0,4,3,4,9,4,0,1,13,2,0,22,0 -18534,4,3,0,7,6,0,6,3,2,1,2,13,1,4,0 -18535,7,4,0,4,1,3,2,3,1,1,6,6,4,30,0 -18536,3,8,0,13,2,6,14,1,2,1,2,18,1,34,0 -18537,9,8,0,11,5,4,10,5,4,0,1,19,2,16,1 -18538,3,2,0,10,5,6,7,2,0,1,17,0,0,16,0 
-18539,9,5,0,15,6,3,10,2,4,1,6,20,3,33,0 -18540,9,0,0,12,6,5,0,2,1,0,12,20,0,38,0 -18541,2,8,0,8,6,1,10,5,2,1,0,10,2,41,1 -18542,10,0,0,11,0,5,9,3,1,0,8,6,0,0,0 -18543,7,4,0,13,2,4,11,3,3,0,13,12,0,34,0 -18544,2,8,0,9,0,1,3,5,2,1,13,15,5,17,0 -18545,8,6,0,11,2,0,14,0,3,1,10,11,0,7,0 -18546,0,8,0,4,5,6,13,0,0,1,6,15,1,40,0 -18547,4,4,0,5,6,2,12,1,2,1,6,14,4,20,0 -18548,5,6,0,10,4,3,10,3,3,0,9,17,4,32,0 -18549,4,7,0,15,1,4,6,2,3,0,8,20,0,7,0 -18550,0,7,0,6,1,2,1,1,2,1,0,1,3,30,1 -18551,2,2,0,7,2,0,3,1,4,0,8,9,4,24,0 -18552,4,6,0,8,3,2,12,5,3,1,11,13,3,25,1 -18553,2,2,0,4,0,3,14,4,1,0,5,3,2,12,0 -18554,1,2,0,4,0,6,5,3,2,1,13,12,0,14,0 -18555,3,6,0,12,4,1,9,3,4,1,13,2,0,16,0 -18556,2,7,0,13,4,0,0,2,3,1,8,11,4,8,0 -18557,0,4,0,3,5,3,4,5,0,0,13,2,5,21,0 -18558,2,8,0,0,0,1,7,4,3,1,8,20,2,39,0 -18559,8,5,0,10,6,1,2,5,0,0,18,4,3,20,1 -18560,9,2,0,13,1,4,1,3,0,1,4,2,0,20,0 -18561,8,5,0,4,4,1,14,5,4,0,1,1,4,2,1 -18562,7,4,0,5,3,0,3,3,0,0,18,1,3,28,1 -18563,10,6,0,1,5,3,7,2,1,1,13,6,5,0,0 -18564,8,0,0,2,6,1,9,3,0,0,4,2,2,19,0 -18565,4,3,0,0,2,3,3,1,1,0,2,2,5,31,0 -18566,9,7,0,11,1,0,0,1,1,1,8,13,0,27,0 -18567,8,6,0,11,1,6,6,1,0,1,16,11,5,0,0 -18568,9,3,0,5,5,5,5,4,0,0,6,2,1,33,0 -18569,8,6,0,10,1,3,7,4,2,1,17,2,2,8,0 -18570,9,2,0,3,1,3,2,2,4,1,2,2,0,4,0 -18571,1,1,0,6,3,3,6,0,4,1,8,11,0,22,0 -18572,10,7,0,13,1,5,12,5,4,1,10,7,3,26,0 -18573,2,5,0,10,6,5,8,0,4,1,17,2,5,8,0 -18574,3,7,0,6,4,0,8,3,1,1,2,2,3,9,0 -18575,3,8,0,10,0,0,7,3,2,1,12,6,1,35,0 -18576,2,7,0,15,4,4,8,0,1,1,8,11,3,26,0 -18577,6,0,0,1,3,1,1,4,2,1,18,14,3,9,1 -18578,0,8,0,12,2,0,11,4,0,1,2,0,3,19,0 -18579,5,7,0,1,1,2,0,3,4,1,9,5,4,8,1 -18580,1,6,0,9,3,6,9,2,2,1,2,2,5,37,0 -18581,1,1,0,1,5,0,4,4,2,1,1,11,4,23,0 -18582,0,7,0,6,5,6,12,4,3,0,13,15,0,20,0 -18583,1,6,0,9,4,1,12,1,4,0,16,20,0,6,0 -18584,3,3,0,3,5,2,9,2,2,1,13,2,0,41,0 -18585,7,8,0,14,5,1,0,4,2,1,9,1,5,32,1 -18586,10,2,0,5,2,2,9,3,3,1,4,11,0,26,0 -18587,4,3,0,14,3,2,3,3,2,0,13,4,0,36,0 -18588,5,2,0,2,1,6,13,5,3,1,9,10,4,38,1 
-18589,1,1,0,3,1,5,5,2,4,1,2,9,0,23,0 -18590,5,7,0,9,4,2,1,1,0,0,14,9,5,0,0 -18591,4,4,0,2,0,0,0,3,4,1,13,10,0,40,0 -18592,2,0,0,6,2,4,12,3,3,0,13,2,5,9,0 -18593,10,6,0,0,0,5,10,1,2,0,13,2,1,13,0 -18594,0,2,0,5,1,5,0,3,2,1,4,6,1,5,0 -18595,0,7,0,11,0,0,9,5,2,1,17,6,0,18,0 -18596,0,0,0,9,1,5,6,4,4,1,17,6,0,14,0 -18597,9,2,0,13,6,1,10,0,0,1,10,0,0,4,1 -18598,4,6,0,11,1,6,1,2,0,1,13,0,2,14,0 -18599,1,8,0,2,1,5,9,3,1,0,4,14,0,39,0 -18600,7,5,0,13,5,0,0,1,3,1,13,11,0,10,0 -18601,7,4,0,15,1,5,9,4,0,1,15,14,0,19,0 -18602,6,3,0,11,6,4,0,3,3,1,8,3,0,12,0 -18603,7,0,0,7,2,4,11,4,3,1,13,11,2,41,0 -18604,0,5,0,15,1,1,5,4,2,1,1,0,3,26,0 -18605,1,4,0,7,3,1,6,5,1,0,9,8,4,19,1 -18606,8,7,0,1,0,5,6,4,4,0,8,6,5,37,0 -18607,2,6,0,14,1,4,9,1,3,1,4,18,3,35,0 -18608,0,8,0,3,0,1,7,3,0,1,8,9,2,1,0 -18609,7,7,0,5,6,3,8,2,1,0,14,0,5,22,0 -18610,0,2,0,13,1,1,6,0,3,1,0,12,3,36,0 -18611,0,0,0,13,0,6,6,4,1,0,4,13,1,27,0 -18612,2,1,0,14,2,5,9,1,4,0,0,4,3,39,0 -18613,1,8,0,10,6,0,5,1,1,0,13,2,5,12,0 -18614,0,2,0,11,2,3,3,2,2,1,4,0,0,37,0 -18615,0,8,0,9,5,1,3,4,2,1,6,14,5,7,0 -18616,10,6,0,10,3,2,4,3,4,1,5,7,1,29,1 -18617,1,0,0,0,0,3,9,2,1,0,2,2,3,19,0 -18618,10,7,0,13,0,0,5,3,3,1,17,11,1,38,0 -18619,10,8,0,0,1,1,2,1,0,1,16,5,5,0,1 -18620,2,8,0,13,6,4,4,2,3,1,13,2,0,26,0 -18621,2,8,0,11,0,2,2,3,0,1,2,7,1,8,0 -18622,2,0,0,0,5,5,14,3,0,0,10,7,1,4,0 -18623,10,7,0,5,0,4,3,1,0,0,8,15,4,11,0 -18624,9,6,0,5,0,1,7,1,0,1,0,2,0,22,0 -18625,0,6,0,3,1,4,8,3,0,0,14,15,1,36,0 -18626,4,0,0,12,3,5,6,4,3,0,8,15,0,12,0 -18627,8,2,0,6,0,1,1,1,3,1,8,6,0,6,0 -18628,9,8,0,2,2,4,3,4,1,1,5,2,1,32,0 -18629,9,7,0,13,1,3,5,3,4,0,13,2,0,13,0 -18630,3,1,0,15,3,4,8,4,1,1,8,16,2,6,0 -18631,2,2,0,10,6,5,7,4,2,0,1,1,2,35,0 -18632,10,3,0,13,5,4,9,4,2,1,1,11,5,17,0 -18633,7,7,0,14,0,4,11,4,1,1,0,4,4,24,0 -18634,6,6,0,10,6,1,8,0,0,1,1,19,2,2,1 -18635,0,1,0,3,4,5,5,5,3,1,2,2,5,39,0 -18636,4,2,0,8,0,5,0,1,1,0,9,12,3,21,1 -18637,6,6,0,5,0,3,7,1,1,1,11,2,1,6,0 -18638,2,2,0,3,0,0,10,1,4,1,8,11,4,28,0 
-18639,1,3,0,2,5,0,9,3,3,0,13,5,2,32,0 -18640,1,7,0,10,0,2,12,4,2,1,6,2,4,29,0 -18641,4,3,0,7,2,4,12,0,4,0,10,1,2,30,1 -18642,0,3,0,1,0,5,13,5,0,1,3,10,1,36,1 -18643,5,6,0,13,6,3,9,0,1,1,4,11,4,5,0 -18644,10,8,0,8,2,5,10,5,0,0,7,10,5,14,1 -18645,10,7,0,3,6,5,4,1,0,0,10,1,3,34,1 -18646,1,3,0,3,0,1,1,0,0,1,2,14,0,7,0 -18647,10,2,0,1,3,5,8,1,2,1,2,0,5,28,0 -18648,6,6,0,5,3,0,0,4,4,1,17,2,5,18,0 -18649,1,8,0,3,3,4,5,2,1,1,11,15,2,14,0 -18650,9,6,0,15,3,0,13,1,1,0,18,5,2,29,1 -18651,7,5,0,15,1,4,2,3,1,0,15,0,0,6,0 -18652,4,2,0,12,2,5,11,3,4,1,1,14,0,10,1 -18653,0,5,0,7,0,5,2,1,0,0,8,0,4,32,0 -18654,2,6,0,5,5,4,11,3,1,0,8,13,2,10,0 -18655,6,3,0,3,5,6,2,0,3,1,1,14,5,28,1 -18656,3,6,0,9,0,5,9,3,0,1,13,2,5,30,0 -18657,2,4,0,1,1,5,10,5,3,1,13,2,0,37,0 -18658,1,1,0,11,0,5,12,4,0,0,11,11,1,20,0 -18659,8,8,0,14,0,3,14,4,2,0,11,9,3,23,0 -18660,8,6,0,0,5,1,13,2,0,0,2,0,2,21,0 -18661,1,6,0,10,1,5,5,3,0,1,8,13,4,13,0 -18662,1,4,0,6,5,5,7,3,0,0,12,13,5,37,0 -18663,10,2,0,6,5,0,4,2,1,1,2,2,5,21,0 -18664,2,5,0,15,5,1,10,3,0,1,13,14,4,31,0 -18665,3,5,0,3,2,4,7,0,4,0,8,3,2,17,0 -18666,0,3,0,2,1,5,0,3,2,1,6,2,4,18,0 -18667,10,1,0,15,4,1,4,5,2,1,14,17,4,14,1 -18668,9,4,0,7,0,5,0,2,2,1,2,6,1,35,0 -18669,0,4,0,3,6,1,5,4,1,0,4,11,5,19,0 -18670,7,2,0,8,6,4,8,2,0,0,10,11,1,36,0 -18671,3,7,0,15,4,6,0,0,3,1,9,1,4,35,1 -18672,9,1,0,10,6,2,10,1,0,1,9,12,2,6,1 -18673,7,6,0,12,4,1,6,3,4,1,14,19,1,14,0 -18674,0,8,0,4,1,1,2,4,1,0,6,4,0,25,0 -18675,4,8,0,3,2,0,13,3,0,1,2,9,1,0,0 -18676,5,4,0,4,5,5,4,1,1,0,14,19,2,2,1 -18677,6,7,0,4,1,1,12,1,2,1,4,18,1,36,1 -18678,9,5,0,11,4,0,14,2,4,1,6,11,3,35,0 -18679,5,5,0,8,1,1,10,0,0,1,5,7,3,21,1 -18680,9,2,0,6,5,5,11,0,0,0,5,19,3,17,1 -18681,3,6,0,12,6,4,4,0,0,1,13,0,2,3,0 -18682,4,1,0,6,4,5,10,3,2,0,15,15,2,0,0 -18683,2,0,0,3,2,5,11,2,2,1,13,15,1,31,0 -18684,1,7,0,0,0,0,9,1,2,0,13,0,1,35,0 -18685,3,0,0,5,6,4,9,0,0,1,16,7,5,5,1 -18686,9,0,0,8,6,2,14,5,1,0,18,1,4,23,1 -18687,6,6,0,5,4,3,6,2,4,1,11,5,4,39,1 -18688,4,7,0,0,3,6,5,2,1,0,16,17,4,12,1 
-18689,4,3,0,11,4,3,2,3,4,0,13,19,4,18,1 -18690,7,4,0,12,1,1,11,0,0,1,18,10,3,35,1 -18691,10,5,0,9,3,6,12,5,0,0,9,20,3,37,1 -18692,3,4,0,13,0,3,0,4,1,1,0,11,1,3,0 -18693,7,4,0,15,4,0,9,2,0,1,11,3,4,39,1 -18694,10,7,0,5,0,5,12,1,0,1,13,9,2,36,0 -18695,9,5,0,3,6,5,10,3,3,1,9,18,1,10,0 -18696,1,3,0,0,1,5,11,1,3,1,17,20,3,0,0 -18697,7,2,0,11,4,3,1,3,1,1,11,18,5,4,0 -18698,9,2,0,11,4,0,14,4,3,0,8,11,5,34,0 -18699,0,2,0,6,2,1,12,0,0,1,10,3,2,17,0 -18700,2,7,0,10,0,3,8,1,0,0,13,20,5,24,0 -18701,5,1,0,7,5,5,9,0,3,0,13,9,5,26,0 -18702,0,8,0,15,5,0,14,4,2,1,12,9,1,12,0 -18703,0,2,0,8,1,4,0,4,4,1,11,11,4,41,0 -18704,3,2,0,4,0,5,11,4,1,1,13,18,2,33,0 -18705,2,7,0,6,0,5,13,0,3,1,10,11,2,22,0 -18706,0,1,0,9,0,2,3,4,4,0,12,15,3,23,0 -18707,7,3,0,7,6,0,8,0,3,1,18,7,3,32,1 -18708,5,0,0,11,4,1,4,1,0,0,17,2,5,29,0 -18709,1,1,0,14,0,6,12,3,0,1,8,11,2,37,0 -18710,7,6,0,0,0,3,0,1,4,1,2,2,1,26,0 -18711,0,3,0,13,0,6,12,4,3,1,13,13,5,14,0 -18712,6,5,0,4,1,4,10,4,2,1,18,5,4,19,1 -18713,7,2,0,5,0,0,12,3,2,1,4,13,4,21,0 -18714,7,7,0,3,2,6,9,2,4,0,8,3,1,10,0 -18715,2,2,0,13,4,0,6,3,4,1,13,15,1,4,0 -18716,9,2,0,1,5,3,3,3,0,0,17,6,0,7,0 -18717,5,6,0,10,4,5,8,3,1,0,1,1,5,14,1 -18718,7,5,0,5,3,4,6,2,2,1,2,16,2,18,0 -18719,2,0,0,11,1,4,9,3,0,0,13,17,3,25,0 -18720,6,2,0,15,0,3,11,4,1,1,7,10,4,8,1 -18721,3,2,0,4,0,3,4,1,2,0,15,12,0,3,0 -18722,0,2,0,6,0,3,9,1,0,1,2,19,1,37,0 -18723,5,6,0,2,3,5,5,0,0,1,2,14,0,10,0 -18724,0,7,0,9,5,2,10,5,2,0,18,9,3,22,1 -18725,5,3,0,3,2,6,13,1,0,1,2,11,0,39,0 -18726,0,6,0,5,2,1,7,1,0,1,13,0,2,35,0 -18727,5,3,0,7,0,4,7,4,0,0,13,2,3,8,0 -18728,4,6,0,4,1,4,10,3,1,1,13,2,2,0,0 -18729,3,6,0,2,6,0,9,3,4,1,4,8,0,6,0 -18730,10,3,0,3,0,3,0,2,4,0,10,11,2,10,0 -18731,2,6,0,9,2,3,10,4,2,0,8,9,0,26,0 -18732,1,6,0,6,2,5,1,4,3,0,2,9,1,30,0 -18733,0,7,0,13,0,5,12,4,0,0,13,9,5,7,0 -18734,0,7,0,15,4,5,11,5,0,0,18,18,5,6,1 -18735,1,0,0,0,1,4,9,2,4,0,15,2,3,12,0 -18736,1,4,0,10,3,5,12,5,2,1,15,2,0,3,0 -18737,2,2,0,3,0,4,6,0,2,1,10,15,4,10,0 -18738,8,6,0,1,2,0,4,5,0,1,18,16,4,35,1 
-18739,6,3,0,3,3,0,7,1,2,1,8,13,2,12,0 -18740,0,8,0,14,1,6,5,3,4,0,2,4,1,28,0 -18741,0,7,0,0,0,6,6,0,1,0,17,20,5,17,0 -18742,3,6,0,13,3,5,5,3,1,0,3,0,5,40,0 -18743,8,5,0,5,0,3,11,1,2,0,13,11,1,7,0 -18744,6,7,0,6,0,4,2,4,1,1,15,11,1,5,0 -18745,6,3,0,1,5,0,13,4,2,1,12,0,1,28,0 -18746,1,2,0,6,0,3,6,4,3,0,4,0,3,29,0 -18747,0,8,0,13,1,3,9,4,0,1,17,9,2,1,0 -18748,1,6,0,11,0,2,9,3,4,0,2,11,3,25,0 -18749,3,1,0,14,3,4,8,2,1,1,2,2,3,25,0 -18750,2,4,0,7,1,3,9,1,4,1,10,11,1,12,0 -18751,7,8,0,5,0,1,4,1,1,0,5,7,2,3,1 -18752,0,2,0,0,1,4,5,2,3,0,6,6,3,14,0 -18753,9,8,0,14,3,4,4,2,4,1,5,7,1,11,1 -18754,2,6,0,15,1,4,9,1,0,0,13,7,3,2,0 -18755,6,6,0,1,5,6,9,5,2,0,13,6,3,35,0 -18756,4,8,0,13,0,5,6,4,3,1,12,2,3,18,0 -18757,2,5,0,5,6,5,9,3,1,1,5,12,4,27,0 -18758,5,7,0,10,0,0,7,4,3,0,2,0,4,3,0 -18759,4,1,0,14,5,5,13,0,2,0,18,8,3,37,1 -18760,2,5,0,7,2,2,5,3,2,1,4,4,2,38,0 -18761,10,4,0,10,3,0,2,1,1,1,14,0,2,11,1 -18762,7,2,0,7,3,6,2,5,2,0,11,7,5,8,1 -18763,10,8,0,10,3,3,7,3,1,0,2,2,1,29,0 -18764,4,1,0,0,1,4,12,1,0,1,6,4,1,7,0 -18765,8,3,0,5,2,5,13,5,4,1,5,7,5,12,1 -18766,4,3,0,1,6,1,10,5,3,0,2,2,4,38,0 -18767,0,7,0,13,6,0,4,3,2,0,13,15,0,32,0 -18768,6,5,0,10,6,6,3,1,4,1,14,7,3,27,1 -18769,3,7,0,6,1,5,8,4,3,0,2,13,4,17,0 -18770,0,8,0,3,0,0,3,3,3,1,13,11,5,14,0 -18771,7,3,0,0,0,0,8,4,0,0,2,15,0,21,0 -18772,5,0,0,3,4,1,1,4,1,1,0,6,3,7,0 -18773,0,8,0,1,5,4,14,1,0,1,2,2,3,27,0 -18774,2,1,0,3,3,3,8,1,3,0,3,20,1,0,0 -18775,4,3,0,9,3,3,11,1,3,1,18,5,4,28,1 -18776,6,3,0,15,5,4,14,4,1,1,4,2,4,9,0 -18777,5,3,0,2,1,3,12,0,3,1,5,10,2,1,1 -18778,3,1,0,13,1,5,2,5,2,0,13,16,5,21,0 -18779,2,1,0,0,1,0,1,0,3,1,11,12,4,2,1 -18780,8,8,0,10,1,4,5,5,3,1,1,10,5,37,1 -18781,1,3,0,4,1,3,13,3,2,1,4,9,5,22,0 -18782,7,7,0,6,0,0,3,2,3,0,4,9,1,3,0 -18783,9,4,0,12,6,4,4,4,0,0,0,11,0,31,0 -18784,6,8,0,13,6,5,9,0,2,1,2,4,0,3,0 -18785,9,0,0,15,3,1,11,5,0,1,5,4,2,39,1 -18786,1,5,0,8,1,0,7,3,4,1,10,6,0,0,0 -18787,5,7,0,12,0,4,1,3,3,0,9,13,1,30,0 -18788,5,6,0,6,6,6,4,3,0,0,7,13,5,10,0 -18789,8,7,0,2,0,1,4,4,2,0,8,6,4,16,0 
-18790,0,8,0,5,0,1,12,3,0,0,4,6,0,38,0 -18791,9,6,0,7,2,4,14,5,1,0,8,10,0,4,0 -18792,0,7,0,11,6,3,9,0,0,1,4,11,0,9,0 -18793,9,3,0,9,5,3,4,4,3,1,18,8,5,11,1 -18794,2,5,0,11,4,2,9,5,2,1,9,8,0,3,1 -18795,1,0,0,10,1,5,5,3,0,0,2,11,0,22,0 -18796,10,5,0,13,0,4,7,3,2,1,8,0,1,7,0 -18797,1,7,0,4,1,1,9,1,4,0,18,17,2,3,1 -18798,9,1,0,14,5,5,3,3,1,0,8,1,2,10,0 -18799,2,8,0,8,3,1,4,2,1,1,9,7,2,4,1 -18800,4,3,0,15,0,5,5,4,4,1,13,15,3,23,0 -18801,0,1,0,11,4,1,14,5,4,0,16,1,0,41,1 -18802,1,0,0,0,4,0,3,0,2,0,4,1,0,0,0 -18803,1,8,0,11,6,0,12,4,4,1,12,18,5,7,0 -18804,1,7,0,12,4,5,10,0,2,0,2,2,5,19,0 -18805,10,0,0,15,0,6,2,5,3,1,13,20,0,9,0 -18806,8,3,0,2,0,3,7,1,0,1,17,11,1,31,0 -18807,2,1,0,4,0,5,8,1,4,1,16,4,5,39,0 -18808,3,6,0,11,0,1,9,3,2,0,2,13,5,36,0 -18809,2,8,0,5,0,5,6,2,1,1,2,2,4,1,0 -18810,9,2,0,1,6,4,6,2,1,1,17,11,3,12,0 -18811,4,1,0,12,1,6,6,1,0,1,13,9,0,14,0 -18812,8,2,0,3,0,3,6,2,4,1,5,13,0,21,0 -18813,2,4,0,4,2,2,2,0,1,1,18,18,1,36,1 -18814,5,4,0,6,0,3,2,2,3,0,2,2,4,35,0 -18815,1,5,0,14,5,0,11,3,4,0,17,11,1,28,0 -18816,2,5,0,0,0,6,11,2,1,0,13,11,2,29,0 -18817,10,1,0,12,5,5,7,1,3,1,18,4,3,31,1 -18818,1,8,0,1,0,3,6,0,4,0,17,2,5,13,0 -18819,5,0,0,4,1,6,14,0,0,0,13,19,2,38,0 -18820,9,3,0,9,2,5,13,2,4,1,0,11,5,19,0 -18821,3,7,0,6,6,4,6,0,0,0,0,18,4,14,0 -18822,0,0,0,7,0,4,9,3,1,0,8,6,4,1,0 -18823,0,7,0,7,0,5,8,5,3,0,4,11,2,18,0 -18824,9,7,0,7,3,3,8,4,2,0,18,5,1,9,1 -18825,2,6,0,6,1,2,8,0,3,1,18,7,1,0,1 -18826,2,3,0,10,6,1,0,1,0,1,5,1,3,1,1 -18827,3,7,0,2,5,0,11,2,3,1,18,1,0,9,1 -18828,9,8,0,4,0,1,9,2,2,0,17,0,5,12,0 -18829,4,5,0,3,5,6,4,3,2,1,2,2,1,14,0 -18830,0,7,0,12,0,4,3,0,0,0,4,9,1,5,0 -18831,9,2,0,7,1,4,7,0,1,0,2,9,2,31,0 -18832,9,0,0,2,1,4,11,4,2,1,14,7,0,33,1 -18833,9,8,0,7,6,5,12,1,0,1,11,10,2,23,1 -18834,1,0,0,3,6,5,0,1,3,0,6,14,1,11,0 -18835,0,6,0,0,5,3,5,3,0,1,13,4,4,40,0 -18836,0,2,0,6,6,0,8,0,3,1,15,2,1,4,0 -18837,5,0,0,9,0,5,3,4,0,0,2,2,4,17,0 -18838,8,7,0,6,6,3,0,5,1,1,5,1,0,31,1 -18839,3,8,0,6,5,5,6,3,2,1,12,16,3,10,0 -18840,9,3,0,0,6,2,8,2,0,1,8,1,0,34,0 
-18841,1,0,0,4,6,3,8,1,3,0,4,15,1,39,0 -18842,0,6,0,4,0,0,6,3,4,1,2,18,5,6,0 -18843,4,7,0,4,0,6,5,3,4,1,13,9,2,21,0 -18844,4,1,0,3,0,5,13,3,0,0,0,2,3,24,0 -18845,2,2,0,11,6,0,5,1,4,0,10,2,4,28,0 -18846,3,2,0,3,6,2,10,1,0,1,18,12,2,19,1 -18847,10,7,0,3,1,1,8,3,0,0,13,2,0,3,0 -18848,0,5,0,6,5,3,2,2,3,1,13,2,1,0,0 -18849,9,7,0,15,4,5,5,3,2,1,3,2,2,39,0 -18850,2,7,0,3,3,6,5,1,2,1,6,7,3,31,0 -18851,8,5,0,5,0,4,12,5,0,0,16,8,3,22,1 -18852,3,5,0,5,3,4,3,1,2,1,0,11,2,6,0 -18853,4,6,0,2,0,6,14,1,1,1,2,4,1,22,0 -18854,0,3,0,2,5,2,4,5,0,1,0,15,4,11,1 -18855,7,2,0,10,1,3,9,1,2,0,18,7,0,30,1 -18856,1,3,0,3,5,6,9,0,4,1,12,11,5,5,0 -18857,0,3,0,6,6,1,12,1,1,1,0,9,2,5,0 -18858,9,8,0,10,1,4,1,1,3,1,10,10,0,8,1 -18859,2,2,0,15,0,5,10,2,2,1,13,20,0,26,0 -18860,9,2,0,15,6,6,7,3,2,0,3,16,3,22,1 -18861,7,5,0,3,4,2,2,5,0,0,18,14,0,27,1 -18862,7,5,0,3,2,6,3,1,1,1,13,2,0,18,0 -18863,7,8,0,12,4,2,4,0,0,1,18,5,2,13,1 -18864,9,8,0,1,3,5,8,0,1,1,17,11,0,21,0 -18865,0,1,0,0,2,3,10,2,0,0,2,11,3,21,0 -18866,4,7,0,3,2,2,0,3,3,1,16,5,4,12,1 -18867,3,5,0,14,0,0,9,1,0,0,8,6,0,9,0 -18868,1,4,0,11,1,6,9,4,2,0,11,11,5,26,0 -18869,0,2,0,2,5,5,11,4,0,0,10,6,4,33,0 -18870,4,5,0,9,3,1,9,0,4,0,8,18,0,22,0 -18871,0,4,0,2,6,4,10,2,2,0,13,15,3,21,0 -18872,4,0,0,9,5,6,1,4,2,0,5,10,0,41,1 -18873,6,0,0,4,6,2,14,1,0,1,8,6,0,9,0 -18874,8,0,0,14,6,5,8,4,2,1,8,6,0,19,0 -18875,2,8,0,5,0,5,3,3,4,0,2,9,4,13,0 -18876,1,3,0,0,2,3,1,1,3,1,9,18,5,14,0 -18877,2,1,0,11,6,0,1,2,2,1,2,9,0,23,0 -18878,3,8,0,6,5,5,4,4,4,0,3,0,5,26,0 -18879,10,4,0,1,3,3,12,1,1,1,12,12,4,33,1 -18880,1,5,0,7,1,4,8,5,1,1,13,14,2,26,0 -18881,10,6,0,4,2,5,4,1,3,1,4,2,1,36,0 -18882,2,6,0,9,2,4,5,3,0,0,13,15,4,13,0 -18883,9,1,0,9,6,1,11,5,2,0,5,10,4,30,1 -18884,8,6,0,9,2,5,13,5,2,0,11,17,3,30,1 -18885,3,2,0,4,6,5,5,3,4,1,2,4,4,3,0 -18886,1,7,0,13,6,0,6,3,0,1,8,2,3,37,0 -18887,10,2,0,13,3,3,10,2,2,0,4,3,0,25,0 -18888,10,6,0,4,2,4,13,2,4,0,17,7,4,22,0 -18889,2,1,0,6,0,5,14,5,1,0,2,11,1,34,0 -18890,2,2,0,12,6,5,14,3,2,0,8,2,3,27,0 
-18891,1,5,0,3,0,5,6,2,0,1,2,9,1,18,0 -18892,0,7,0,2,1,0,7,0,4,1,11,6,0,19,0 -18893,1,6,0,13,6,5,5,3,4,0,15,6,4,19,0 -18894,6,3,0,13,5,4,10,3,2,0,8,16,0,35,0 -18895,5,2,0,3,2,6,10,4,2,0,15,9,0,1,0 -18896,2,7,0,11,0,1,14,0,0,0,10,20,1,6,0 -18897,0,0,0,5,1,0,14,0,4,1,2,18,3,21,0 -18898,4,6,0,5,1,1,9,1,3,1,2,17,5,37,0 -18899,0,6,0,11,6,3,6,3,3,0,2,15,5,35,0 -18900,5,3,0,6,2,4,5,3,4,0,10,6,2,36,0 -18901,0,0,0,8,0,4,3,2,4,0,8,11,0,38,0 -18902,2,3,0,4,0,2,3,0,1,1,7,2,5,0,0 -18903,2,3,0,10,6,5,3,0,1,1,13,9,5,38,0 -18904,5,1,0,8,4,4,6,0,0,1,8,14,0,16,0 -18905,8,5,0,14,0,5,6,3,3,0,8,16,1,21,0 -18906,2,1,0,2,1,0,0,5,0,0,14,10,3,28,1 -18907,3,1,0,8,4,2,5,2,0,0,13,11,5,18,0 -18908,9,6,0,15,0,5,8,0,0,0,2,2,2,7,0 -18909,7,4,0,9,0,0,3,5,2,1,2,12,3,27,0 -18910,0,7,0,15,0,6,12,4,1,1,2,19,2,24,1 -18911,1,4,0,5,1,6,11,5,2,0,10,11,5,6,0 -18912,10,6,0,13,2,0,14,2,1,1,4,2,4,21,0 -18913,7,4,0,1,2,5,8,1,4,0,9,19,2,2,1 -18914,2,8,0,15,0,4,12,5,0,0,16,11,2,22,0 -18915,9,6,0,9,6,1,5,1,1,0,2,12,2,0,0 -18916,3,0,0,5,3,6,0,0,3,1,17,11,5,24,0 -18917,8,5,0,11,0,1,3,1,4,1,4,7,3,31,1 -18918,7,8,0,9,6,3,10,5,2,1,1,3,5,23,1 -18919,10,5,0,7,1,2,12,5,0,0,18,7,0,19,1 -18920,1,7,0,1,0,6,0,4,3,1,10,12,1,21,0 -18921,8,7,0,4,0,1,1,5,2,0,14,7,2,32,1 -18922,0,2,0,8,6,4,1,2,0,1,18,7,5,32,1 -18923,1,0,0,4,6,5,8,0,2,0,2,0,1,5,0 -18924,7,5,0,5,2,2,2,0,1,1,8,8,4,34,1 -18925,9,4,0,13,4,6,8,4,0,0,13,9,4,16,0 -18926,0,1,0,4,6,0,4,3,3,1,11,9,5,11,0 -18927,3,6,0,13,5,5,8,2,0,1,6,2,5,1,0 -18928,1,4,0,4,6,4,5,3,2,0,17,15,5,29,0 -18929,3,0,0,8,5,6,9,2,1,1,2,15,5,19,0 -18930,2,1,0,14,5,3,14,3,0,1,12,7,5,35,0 -18931,10,1,0,9,2,1,5,0,3,1,5,19,3,18,1 -18932,10,7,0,2,0,0,11,3,2,1,6,3,0,27,0 -18933,8,3,0,13,6,5,0,5,4,1,11,7,0,36,1 -18934,5,3,0,11,2,4,11,0,1,1,16,2,4,11,0 -18935,4,8,0,2,6,4,6,5,2,0,2,6,5,36,0 -18936,10,8,0,0,2,1,5,5,2,1,11,19,0,18,1 -18937,0,4,0,9,0,3,4,3,4,1,13,9,0,25,0 -18938,1,7,0,5,1,1,13,5,0,1,9,10,1,3,1 -18939,10,4,0,6,1,0,6,0,1,0,13,6,1,36,0 -18940,4,1,0,5,0,0,7,0,3,0,13,2,0,22,0 
-18941,8,2,0,5,1,4,6,5,2,0,18,4,3,38,1 -18942,5,8,0,5,3,5,14,1,2,0,1,1,2,20,1 -18943,4,0,0,3,3,4,10,2,2,1,2,2,4,7,0 -18944,0,2,0,6,1,5,2,1,0,1,2,4,0,21,0 -18945,10,8,0,10,6,1,12,5,3,0,16,3,2,3,1 -18946,1,2,0,3,0,5,12,2,4,0,18,7,3,5,1 -18947,1,5,0,3,6,5,9,1,4,1,2,13,0,27,0 -18948,0,7,0,15,6,4,3,1,4,0,13,13,4,35,0 -18949,6,8,0,1,4,0,0,2,0,1,5,7,0,32,1 -18950,3,3,0,14,5,5,3,3,4,1,9,11,2,0,0 -18951,5,6,0,13,3,4,0,3,2,0,0,3,3,8,0 -18952,4,6,0,9,0,2,11,4,0,1,7,7,2,31,1 -18953,2,0,0,7,4,5,12,1,2,0,18,9,1,9,1 -18954,2,6,0,13,0,2,11,1,3,1,3,7,0,10,1 -18955,8,6,0,0,2,4,3,0,1,0,15,15,2,2,0 -18956,2,1,0,11,5,5,0,3,1,0,8,15,1,34,0 -18957,10,1,0,7,4,1,3,1,0,0,9,14,2,21,1 -18958,9,3,0,6,4,4,5,3,1,1,9,17,3,26,0 -18959,6,6,0,11,3,5,8,1,4,0,6,20,4,31,0 -18960,0,3,0,5,5,6,2,2,0,1,11,7,3,36,1 -18961,3,6,0,7,1,0,3,0,3,0,6,12,4,2,0 -18962,3,1,0,1,3,1,13,0,2,1,9,15,4,9,1 -18963,6,7,0,2,5,2,8,0,4,0,3,19,0,37,1 -18964,10,1,0,1,2,5,4,3,0,0,10,6,5,14,0 -18965,5,0,0,5,2,1,8,5,3,1,9,4,1,37,1 -18966,6,2,0,4,5,1,1,4,4,1,8,10,1,38,0 -18967,1,7,0,2,1,0,1,3,3,1,2,12,0,6,0 -18968,0,3,0,11,4,5,1,4,4,1,4,2,4,38,0 -18969,3,5,0,0,0,0,8,5,4,1,8,1,4,40,0 -18970,6,7,0,9,6,6,14,4,4,1,4,13,3,37,0 -18971,0,7,0,9,6,2,2,4,3,0,13,6,4,29,0 -18972,1,0,0,6,6,2,2,1,4,1,13,9,5,7,0 -18973,9,6,0,3,0,3,3,5,3,1,2,6,4,39,0 -18974,7,5,0,10,0,3,11,4,0,1,11,12,1,30,1 -18975,2,5,0,12,6,4,4,2,0,1,6,2,0,2,0 -18976,1,8,0,3,6,6,9,2,2,1,8,6,1,7,0 -18977,8,2,0,2,3,3,6,4,4,1,8,2,0,13,0 -18978,6,2,0,3,1,5,5,1,3,1,13,2,5,10,0 -18979,8,8,0,9,5,3,3,3,4,0,9,10,3,27,1 -18980,4,2,0,5,6,0,8,2,0,1,4,6,4,34,0 -18981,0,6,0,5,0,4,9,3,0,0,2,9,2,21,0 -18982,2,7,0,5,1,4,9,3,3,0,8,6,0,17,0 -18983,0,1,0,1,5,0,3,2,0,0,8,7,4,40,0 -18984,0,1,0,5,0,1,1,3,1,0,17,12,3,0,0 -18985,3,7,0,10,6,3,9,1,3,1,13,11,1,4,0 -18986,5,0,0,7,4,6,3,0,4,0,15,10,0,36,1 -18987,4,6,0,5,0,6,4,1,0,0,4,9,2,28,0 -18988,10,8,0,15,6,6,0,0,3,1,6,1,3,34,1 -18989,0,0,0,7,5,3,2,3,0,1,2,6,1,26,0 -18990,9,3,0,0,0,6,2,4,1,1,3,12,4,18,1 -18991,4,3,0,12,1,2,7,2,0,0,13,13,0,0,0 
-18992,5,6,0,13,4,0,5,0,0,1,2,11,4,4,0 -18993,7,7,0,5,4,4,6,1,3,1,17,19,2,38,0 -18994,9,0,0,2,1,0,1,1,4,0,3,14,0,30,0 -18995,2,0,0,3,5,5,0,3,4,0,12,2,5,13,0 -18996,0,8,0,11,1,4,12,1,3,1,13,5,2,5,0 -18997,1,6,0,3,4,5,8,2,4,1,17,15,0,3,0 -18998,4,5,0,6,6,1,4,2,2,1,8,11,4,40,0 -18999,0,3,0,7,1,4,8,3,2,0,17,0,0,39,0 -19000,9,4,0,4,3,5,7,1,0,1,8,3,5,23,0 -19001,7,3,0,9,0,4,8,2,3,0,16,16,4,30,0 -19002,8,1,0,9,3,6,10,5,4,1,18,16,0,22,1 -19003,3,1,0,15,1,4,8,4,2,1,6,11,1,21,0 -19004,4,4,0,2,1,6,3,0,1,1,2,2,5,39,0 -19005,2,5,0,4,5,3,6,0,2,0,6,11,4,9,0 -19006,1,6,0,0,1,1,11,4,3,0,13,9,4,26,0 -19007,4,1,0,7,1,6,3,3,4,1,10,4,5,27,0 -19008,6,0,0,8,1,1,10,5,3,1,8,19,1,9,0 -19009,6,1,0,1,3,4,2,4,3,0,2,1,2,21,1 -19010,0,6,0,11,5,6,0,1,3,0,8,0,5,1,0 -19011,10,8,0,2,2,5,6,2,0,1,4,2,3,25,0 -19012,5,4,0,5,5,0,7,1,3,0,12,19,1,39,0 -19013,0,8,0,2,2,5,8,3,3,0,13,2,0,25,0 -19014,3,5,0,12,2,5,7,5,4,1,6,16,5,10,1 -19015,5,4,0,11,2,4,7,2,3,0,13,9,3,30,0 -19016,0,3,0,6,1,1,13,0,2,0,6,4,0,35,0 -19017,4,8,0,14,4,4,1,3,4,0,18,17,4,23,1 -19018,4,5,0,2,2,6,14,5,4,0,3,10,1,27,1 -19019,10,7,0,6,6,4,14,5,3,1,13,11,0,7,0 -19020,2,0,0,1,2,1,8,3,2,0,5,11,4,18,0 -19021,5,4,0,14,0,4,12,0,3,1,10,14,5,24,0 -19022,4,8,0,8,3,4,8,0,2,1,2,19,5,30,0 -19023,10,6,0,15,1,2,11,3,2,0,1,2,5,1,1 -19024,5,8,0,5,4,6,3,3,1,1,15,12,0,4,0 -19025,9,1,0,0,0,6,4,3,2,0,13,9,4,7,0 -19026,0,6,0,10,2,1,6,4,2,0,11,11,1,29,0 -19027,2,4,0,14,4,0,2,5,0,0,17,11,2,17,0 -19028,0,7,0,3,0,0,10,1,4,0,2,18,1,19,0 -19029,5,6,0,3,5,1,13,3,3,0,18,1,1,23,1 -19030,8,8,0,2,1,0,9,4,2,0,13,11,1,0,0 -19031,6,7,0,9,1,5,7,3,4,1,2,20,1,6,0 -19032,0,0,0,2,1,0,9,3,3,0,17,3,0,20,0 -19033,2,8,0,6,2,2,4,1,4,0,13,18,0,13,0 -19034,1,4,0,0,1,5,9,1,2,1,10,5,2,3,1 -19035,7,7,0,7,0,0,1,1,2,0,0,6,1,26,0 -19036,2,1,0,10,6,1,3,0,0,0,11,12,1,40,1 -19037,4,6,0,0,0,5,6,3,2,1,8,2,5,40,0 -19038,1,8,0,13,2,6,9,1,4,1,2,15,5,26,0 -19039,5,1,0,1,1,4,3,1,1,1,16,2,1,40,0 -19040,8,4,0,13,2,4,9,1,3,1,2,2,0,7,0 -19041,4,7,0,3,5,6,3,0,2,0,15,19,0,27,1 
-19042,0,2,0,0,6,2,6,4,0,1,0,4,3,37,0 -19043,3,5,0,8,2,1,6,5,4,0,18,16,3,10,1 -19044,4,7,0,12,6,4,7,4,0,0,6,15,1,17,0 -19045,8,4,0,10,2,6,0,5,2,1,18,7,5,9,1 -19046,8,0,0,10,2,2,11,2,3,0,5,8,3,5,1 -19047,1,8,0,5,4,4,0,2,2,0,9,1,5,41,1 -19048,0,4,0,0,0,2,9,4,0,1,13,15,4,39,0 -19049,1,2,0,3,6,0,3,4,0,0,6,2,3,4,0 -19050,1,2,0,1,4,5,14,1,1,0,8,10,3,41,1 -19051,0,7,0,11,1,6,7,4,0,1,13,15,5,40,0 -19052,1,6,0,13,6,4,1,1,2,1,7,9,4,4,0 -19053,9,8,0,6,3,1,6,1,1,0,1,7,2,33,1 -19054,8,7,0,1,6,0,3,1,0,0,9,1,4,24,1 -19055,0,0,0,6,3,2,9,0,3,0,15,11,0,19,0 -19056,2,8,0,15,5,5,9,1,2,1,13,3,4,16,0 -19057,7,7,0,11,5,6,13,0,0,0,2,3,0,14,0 -19058,4,5,0,13,6,0,1,2,0,0,13,18,2,4,0 -19059,3,8,0,11,0,5,12,4,2,0,13,16,0,37,0 -19060,7,3,0,3,6,2,11,3,1,1,2,11,2,19,0 -19061,3,4,0,10,3,4,2,0,0,1,4,11,0,39,0 -19062,0,1,0,2,0,2,14,1,0,0,14,2,3,3,0 -19063,1,7,0,14,6,6,12,4,1,0,13,0,2,33,0 -19064,6,5,0,15,0,2,0,3,1,0,13,4,4,26,0 -19065,1,6,0,6,4,4,6,2,0,0,13,0,0,34,0 -19066,7,3,0,5,3,6,3,3,0,1,8,0,4,7,0 -19067,5,0,0,8,6,2,3,0,2,1,18,10,5,33,1 -19068,3,0,0,9,1,3,9,3,2,0,13,13,4,14,0 -19069,2,7,0,5,0,0,11,2,1,0,13,2,5,24,0 -19070,2,6,0,7,1,6,7,3,3,1,17,4,5,3,0 -19071,1,8,0,13,0,1,10,2,1,0,2,6,4,28,0 -19072,4,1,0,6,3,5,3,5,1,0,1,19,2,10,1 -19073,0,5,0,0,1,5,0,3,1,0,10,18,1,30,0 -19074,8,4,0,12,1,2,1,1,4,1,18,10,3,38,1 -19075,2,8,0,10,0,5,9,4,2,0,2,6,1,0,0 -19076,0,3,0,3,4,0,14,1,3,0,8,2,1,4,0 -19077,3,6,0,13,3,2,2,2,4,1,6,14,4,35,0 -19078,4,1,0,3,5,5,2,4,0,1,18,19,1,27,1 -19079,10,7,0,5,0,1,0,0,0,1,18,19,3,23,1 -19080,10,1,0,14,0,3,9,1,1,0,4,18,0,19,0 -19081,2,8,0,6,5,4,5,4,2,1,4,12,2,30,0 -19082,7,0,0,12,3,2,4,3,3,0,14,8,4,18,1 -19083,4,6,0,15,2,4,6,1,4,1,7,2,0,24,0 -19084,10,2,0,13,4,4,13,0,3,1,0,4,3,3,1 -19085,6,0,0,5,6,3,13,5,0,0,5,10,3,36,1 -19086,2,0,0,3,3,0,8,3,0,0,2,15,5,27,0 -19087,9,4,0,5,6,0,8,0,3,0,15,12,2,2,0 -19088,0,8,0,4,2,6,7,4,4,1,17,6,2,30,0 -19089,6,1,0,12,0,1,4,4,4,1,5,10,4,18,1 -19090,2,2,0,1,4,0,3,3,2,1,0,11,3,37,0 -19091,0,0,0,4,6,2,5,2,1,0,4,13,5,40,0 
-19092,9,8,0,6,6,4,5,4,0,1,2,0,1,36,0 -19093,2,2,0,4,5,2,1,3,0,0,2,16,0,29,0 -19094,2,6,0,11,0,5,0,3,0,0,8,2,5,3,0 -19095,6,4,0,11,5,5,4,5,1,1,18,17,1,4,1 -19096,7,6,0,15,0,5,5,5,3,1,17,20,2,9,0 -19097,0,0,0,14,0,5,10,3,0,1,17,6,3,37,0 -19098,0,5,0,2,0,4,2,1,0,0,13,1,1,6,0 -19099,0,0,0,8,0,6,2,5,0,0,1,15,3,8,0 -19100,5,4,0,8,5,2,2,2,4,1,8,18,1,27,0 -19101,0,8,0,3,1,3,9,2,1,1,12,2,1,23,0 -19102,2,8,0,10,3,2,10,5,3,0,18,3,5,13,1 -19103,4,8,0,11,0,0,3,5,0,1,4,13,4,38,0 -19104,2,6,0,8,6,1,11,4,3,1,8,18,4,6,0 -19105,9,5,0,8,2,2,9,3,3,0,8,12,3,37,0 -19106,0,6,0,0,5,6,0,3,1,0,10,11,3,25,0 -19107,9,5,0,12,3,2,7,1,2,1,8,11,2,29,0 -19108,3,8,0,9,2,2,6,4,4,0,14,9,2,23,0 -19109,8,6,0,13,0,0,14,1,4,1,8,0,2,2,0 -19110,10,2,0,10,1,5,5,3,2,1,13,16,0,12,0 -19111,5,8,0,11,0,6,4,2,3,1,2,11,4,40,0 -19112,6,6,0,7,4,1,1,4,1,1,18,19,4,24,1 -19113,1,8,0,11,0,4,9,1,1,1,8,11,4,31,0 -19114,7,2,0,6,1,5,11,3,4,0,13,9,4,32,0 -19115,7,8,0,13,6,4,5,2,0,1,2,9,3,17,0 -19116,8,0,0,5,6,0,11,0,3,0,16,7,0,2,1 -19117,6,5,0,13,0,2,6,3,4,0,15,15,3,30,0 -19118,1,2,0,1,0,5,0,3,0,1,0,11,2,21,0 -19119,1,4,0,10,1,4,3,4,2,0,0,10,3,4,0 -19120,6,4,0,11,5,0,14,2,4,0,2,6,2,11,0 -19121,10,7,0,10,0,4,8,0,4,0,8,2,1,8,0 -19122,0,7,0,1,2,4,6,4,0,0,3,15,2,3,0 -19123,8,7,0,15,4,5,5,3,3,0,2,6,2,32,0 -19124,5,6,0,0,1,6,7,4,0,0,11,13,4,19,0 -19125,2,1,0,9,4,3,0,0,1,1,5,14,4,37,1 -19126,6,7,0,1,2,3,2,4,3,0,2,2,4,35,0 -19127,3,0,0,13,2,4,11,4,0,0,5,2,0,29,0 -19128,6,8,0,10,2,2,6,1,0,0,11,4,5,1,1 -19129,9,6,0,10,2,2,13,5,3,0,11,7,2,4,1 -19130,10,6,0,13,6,5,7,4,2,0,11,20,2,16,0 -19131,10,5,0,7,6,1,12,3,1,1,18,17,3,12,1 -19132,7,8,0,4,6,0,2,3,3,1,13,11,1,10,0 -19133,1,7,0,15,2,5,4,2,2,1,2,10,5,30,0 -19134,8,2,0,0,0,1,1,5,4,1,4,16,2,16,1 -19135,8,5,0,10,3,2,2,3,3,1,5,7,2,26,1 -19136,2,2,0,1,5,4,9,4,2,1,1,0,3,23,0 -19137,2,8,0,4,0,0,13,5,2,0,0,6,3,30,0 -19138,2,4,0,12,1,4,3,3,1,0,17,7,2,23,0 -19139,10,3,0,3,3,4,0,2,2,1,8,2,3,16,0 -19140,0,6,0,11,5,4,6,2,0,1,3,11,2,18,0 -19141,7,1,0,13,3,4,6,5,2,0,8,2,3,39,0 
-19142,0,0,0,7,4,5,12,1,3,0,2,0,2,33,0 -19143,2,6,0,11,0,5,8,4,4,1,4,14,0,6,0 -19144,0,8,0,0,6,6,6,0,0,0,17,6,0,7,0 -19145,2,4,0,13,6,5,10,4,3,0,14,13,2,0,0 -19146,0,3,0,12,1,2,11,5,0,0,1,10,0,5,1 -19147,2,3,0,11,5,3,7,0,3,1,12,20,2,14,0 -19148,3,6,0,14,0,5,7,1,2,0,5,13,4,20,0 -19149,8,7,0,13,1,5,6,5,1,0,12,11,4,23,0 -19150,7,0,0,13,0,0,0,3,4,1,2,18,0,26,0 -19151,3,3,0,9,6,3,0,1,0,1,2,11,1,3,0 -19152,1,1,0,10,0,1,4,2,2,0,16,18,1,30,1 -19153,1,8,0,8,1,0,6,4,2,0,13,18,1,9,0 -19154,0,3,0,13,4,2,9,3,3,0,13,15,1,16,0 -19155,6,6,0,7,4,6,7,0,3,1,4,15,1,16,0 -19156,5,5,0,8,2,6,10,1,1,0,7,10,1,20,1 -19157,0,3,0,4,4,2,14,1,4,0,13,2,3,13,0 -19158,0,4,0,1,1,5,9,0,2,1,15,1,3,19,0 -19159,7,0,0,10,3,1,7,1,3,1,14,10,4,22,1 -19160,9,2,0,1,2,4,9,0,4,0,16,8,3,7,1 -19161,9,8,0,9,4,0,12,1,1,1,4,15,3,10,0 -19162,4,8,0,11,1,0,5,1,4,0,15,20,0,31,0 -19163,6,4,0,4,0,5,13,1,4,0,8,2,0,26,0 -19164,10,0,0,6,5,4,3,0,4,0,6,6,4,14,0 -19165,7,7,0,13,0,5,5,1,0,0,2,15,2,41,0 -19166,1,3,0,13,6,0,8,1,4,1,6,2,0,6,0 -19167,3,6,0,13,0,5,10,1,2,0,12,2,5,38,0 -19168,0,8,0,13,3,4,9,3,1,1,10,16,1,39,0 -19169,10,0,0,10,2,6,6,0,1,1,13,15,0,3,0 -19170,1,8,0,15,3,0,0,2,0,0,16,12,1,38,0 -19171,3,8,0,10,0,6,11,5,0,1,11,14,1,12,0 -19172,9,1,0,14,4,2,6,1,1,0,9,19,2,11,1 -19173,7,3,0,4,5,6,12,5,1,1,18,5,2,11,1 -19174,2,3,0,7,0,0,11,3,2,0,2,8,5,26,0 -19175,2,7,0,0,0,2,9,1,4,0,8,19,0,20,0 -19176,5,5,0,10,1,2,1,4,3,0,10,6,3,9,0 -19177,0,5,0,2,6,0,6,2,4,1,2,5,0,0,0 -19178,10,6,0,3,2,0,4,5,2,1,13,2,2,11,0 -19179,9,8,0,4,4,6,3,2,0,0,10,9,2,32,0 -19180,1,8,0,13,5,1,8,4,4,0,8,13,0,38,0 -19181,4,6,0,2,4,5,3,0,0,0,8,9,2,33,0 -19182,9,4,0,5,0,6,6,2,2,0,13,9,0,38,0 -19183,6,4,0,3,3,2,0,1,2,1,5,14,0,17,1 -19184,4,2,0,6,2,5,5,2,0,1,16,0,1,37,0 -19185,10,6,0,12,2,1,7,1,0,0,1,14,0,16,0 -19186,7,4,0,3,4,4,10,2,2,1,7,9,5,9,0 -19187,7,3,0,9,4,6,1,5,1,1,18,14,0,18,1 -19188,4,3,0,9,0,0,7,2,3,1,4,4,4,25,0 -19189,9,8,0,10,6,1,10,5,3,1,14,7,5,9,1 -19190,0,0,0,13,2,5,9,3,2,0,2,11,1,33,0 -19191,10,6,0,4,5,5,11,1,3,0,5,10,4,1,1 
-19192,2,6,0,6,6,4,1,2,0,0,4,1,3,9,0 -19193,0,0,0,9,3,0,13,4,3,0,0,20,0,28,0 -19194,6,4,0,9,5,5,13,1,0,1,8,0,5,26,0 -19195,0,0,0,4,1,5,0,2,2,0,2,14,3,16,0 -19196,0,6,0,7,5,1,5,1,2,1,13,18,2,16,0 -19197,1,7,0,13,6,3,6,3,3,0,13,11,1,30,0 -19198,2,8,0,4,3,6,7,3,3,0,13,11,0,34,0 -19199,7,3,0,0,4,4,9,5,0,0,18,12,3,28,1 -19200,10,1,0,9,4,5,4,5,4,0,16,5,2,8,1 -19201,4,8,0,7,0,5,12,0,2,1,5,7,5,10,1 -19202,6,8,0,10,0,4,6,3,4,0,8,2,4,17,0 -19203,2,2,0,4,5,4,13,3,2,1,17,2,4,7,0 -19204,2,6,0,2,1,3,7,3,0,0,8,11,2,6,0 -19205,5,5,0,9,2,4,0,0,3,1,16,4,2,31,0 -19206,0,7,0,2,5,2,5,4,4,0,2,11,2,30,0 -19207,3,7,0,6,6,4,3,5,1,1,5,10,2,10,1 -19208,2,3,0,3,4,4,14,1,2,0,12,18,3,10,0 -19209,9,4,0,8,1,4,11,5,0,1,15,8,2,9,1 -19210,2,7,0,4,1,2,5,3,2,1,17,2,5,16,0 -19211,0,6,0,5,6,5,0,4,1,1,6,4,0,29,0 -19212,4,7,0,13,1,6,0,3,0,1,0,11,3,7,0 -19213,6,3,0,0,0,5,9,3,0,0,13,2,5,11,0 -19214,8,5,0,6,6,1,9,0,0,0,7,16,5,20,1 -19215,10,1,0,7,3,5,11,0,1,0,1,7,0,23,1 -19216,3,2,0,6,5,3,11,0,0,0,1,17,3,21,1 -19217,4,4,0,5,5,5,11,2,0,0,8,15,5,39,0 -19218,6,8,0,0,6,6,9,5,3,1,4,7,2,10,1 -19219,10,7,0,5,2,2,6,4,2,0,2,11,0,38,0 -19220,6,5,0,8,2,1,1,0,4,1,0,12,4,38,1 -19221,2,0,0,1,0,4,7,4,3,0,15,13,2,13,0 -19222,8,2,0,5,2,1,6,0,0,1,8,0,0,11,0 -19223,3,2,0,13,1,5,8,4,0,0,8,11,5,29,0 -19224,10,0,0,4,6,3,7,0,4,0,4,7,5,20,1 -19225,1,3,0,5,2,6,6,4,0,1,13,14,0,19,0 -19226,10,6,0,6,3,1,14,1,0,1,9,7,2,22,1 -19227,2,4,0,3,6,2,6,1,2,0,12,0,0,30,0 -19228,1,6,0,14,2,3,3,2,2,0,13,1,1,3,0 -19229,2,6,0,3,6,0,9,3,1,0,4,11,2,3,0 -19230,1,0,0,0,2,0,2,0,0,0,3,4,4,20,0 -19231,3,2,0,1,5,1,5,3,3,0,2,15,5,6,0 -19232,5,3,0,11,6,2,0,3,1,0,13,11,5,22,0 -19233,3,3,0,3,1,0,11,0,4,1,0,11,1,40,0 -19234,6,8,0,3,0,5,12,2,2,0,13,11,5,18,0 -19235,3,2,0,11,2,1,8,4,3,0,16,3,1,34,0 -19236,2,8,0,15,3,0,2,4,0,1,1,15,0,30,0 -19237,8,1,0,3,5,3,0,1,2,0,13,6,2,25,0 -19238,0,8,0,4,6,2,1,0,2,1,2,6,0,21,0 -19239,7,3,0,12,4,2,10,0,0,0,4,15,1,13,1 -19240,5,8,0,13,2,3,5,0,0,0,10,0,0,31,0 -19241,4,1,0,9,6,4,14,4,0,0,10,2,3,28,0 
-19242,10,7,0,6,2,4,9,2,3,1,10,11,0,12,0 -19243,0,6,0,0,0,1,10,4,1,0,18,7,5,21,1 -19244,3,7,0,11,1,3,5,0,2,1,4,2,1,27,0 -19245,5,8,0,1,6,1,14,1,3,0,15,11,5,35,0 -19246,2,3,0,5,6,0,6,4,0,0,6,16,1,9,0 -19247,4,8,0,12,0,0,8,4,4,0,13,15,3,1,0 -19248,3,0,0,10,6,5,0,3,4,0,17,3,5,19,0 -19249,9,3,0,1,0,2,4,5,0,1,5,7,2,7,1 -19250,4,2,0,2,3,3,14,4,2,0,18,10,2,21,1 -19251,0,7,0,4,6,1,11,3,0,1,8,1,0,14,0 -19252,1,3,0,0,5,3,13,5,0,0,13,2,2,5,0 -19253,10,5,0,3,5,6,8,5,4,1,4,9,5,37,0 -19254,1,8,0,15,6,4,14,1,4,0,6,2,4,35,0 -19255,5,1,0,8,3,2,6,1,1,1,16,15,1,38,0 -19256,2,0,0,6,1,0,5,0,0,0,13,3,3,13,0 -19257,0,5,0,9,1,5,0,3,3,1,7,9,2,40,0 -19258,9,3,0,15,6,4,4,3,3,1,15,11,3,6,0 -19259,0,5,0,4,1,3,2,4,4,0,0,11,2,9,0 -19260,1,3,0,10,1,2,11,4,3,1,18,1,0,35,1 -19261,0,3,0,9,3,0,9,2,1,0,15,0,0,3,0 -19262,4,7,0,13,5,5,11,2,2,1,17,18,1,6,0 -19263,7,6,0,1,4,0,9,3,1,1,9,19,4,13,1 -19264,3,2,0,14,1,4,1,3,1,1,13,1,1,20,0 -19265,0,6,0,2,4,0,7,0,4,1,4,3,0,16,0 -19266,4,7,0,13,3,3,3,4,3,0,9,14,1,14,1 -19267,10,5,0,9,1,2,13,2,0,1,16,17,5,29,0 -19268,4,8,0,1,1,0,9,0,1,0,10,3,0,29,0 -19269,0,8,0,8,4,0,4,2,0,1,17,13,3,35,0 -19270,4,5,0,2,6,5,9,5,0,0,8,2,0,2,0 -19271,4,3,0,1,6,1,0,2,4,0,2,5,3,5,0 -19272,1,8,0,12,4,5,8,4,2,1,6,2,5,10,0 -19273,0,6,0,0,6,0,3,4,2,1,2,9,2,6,0 -19274,9,7,0,7,6,1,11,5,1,0,3,7,3,37,1 -19275,3,4,0,2,6,3,0,1,0,1,2,13,5,38,0 -19276,0,8,0,4,1,1,10,3,3,1,10,6,4,32,0 -19277,2,8,0,7,6,2,11,4,4,1,8,2,5,9,0 -19278,3,5,0,7,1,2,5,5,4,1,15,10,1,0,1 -19279,2,7,0,10,4,2,1,4,0,0,9,8,3,16,1 -19280,3,4,0,9,5,2,8,0,0,1,13,2,3,33,0 -19281,2,5,0,7,5,0,11,1,3,1,8,16,5,1,0 -19282,8,5,0,11,1,5,2,1,4,0,4,2,4,4,0 -19283,3,0,0,5,5,5,0,3,3,0,13,3,0,0,0 -19284,1,8,0,2,2,4,3,1,2,1,13,20,1,35,0 -19285,8,4,0,2,2,0,0,1,0,1,18,20,1,19,1 -19286,8,6,0,9,2,3,7,0,2,0,12,7,2,19,1 -19287,3,7,0,0,6,6,13,3,0,1,5,19,4,20,1 -19288,1,0,0,3,5,3,4,3,3,1,13,5,3,3,0 -19289,1,8,0,11,0,6,2,0,1,0,13,0,2,5,0 -19290,5,2,0,3,0,4,10,3,2,1,4,12,0,38,0 -19291,3,7,0,8,0,4,6,0,0,1,8,2,1,35,0 -19292,1,8,0,9,2,0,2,2,0,0,2,16,5,18,0 
-19293,4,4,0,3,1,4,5,0,0,1,2,2,4,21,0 -19294,0,3,0,7,5,2,11,0,0,0,8,16,0,30,0 -19295,0,3,0,7,6,2,12,2,2,0,6,2,5,8,0 -19296,6,8,0,14,2,4,3,4,2,0,7,0,3,24,0 -19297,9,0,0,4,0,6,0,5,2,0,5,19,5,11,1 -19298,2,4,0,9,3,2,10,1,0,1,11,11,4,9,0 -19299,7,1,0,3,2,2,1,1,4,1,8,4,1,33,0 -19300,3,3,0,14,1,3,6,1,3,1,8,2,0,0,0 -19301,8,7,0,0,2,1,2,1,1,0,18,7,3,19,1 -19302,2,4,0,5,4,4,0,4,1,0,18,5,2,10,1 -19303,3,7,0,9,3,4,0,1,2,1,9,7,4,26,1 -19304,8,4,0,5,3,3,9,5,4,1,9,7,1,30,1 -19305,2,5,0,3,6,3,0,3,4,1,6,6,2,0,0 -19306,1,7,0,3,6,6,3,2,0,1,2,6,0,4,0 -19307,1,8,0,14,1,3,5,1,1,0,5,6,4,6,0 -19308,4,1,0,7,4,1,0,4,1,0,18,5,2,16,1 -19309,5,4,0,6,4,6,7,4,4,1,8,15,3,5,0 -19310,4,5,0,11,3,6,5,3,2,1,6,11,1,14,0 -19311,10,2,0,10,5,4,4,5,1,1,18,1,2,29,1 -19312,9,2,0,12,1,4,8,5,2,1,18,5,1,12,1 -19313,0,2,0,15,3,4,2,0,1,1,10,11,0,16,0 -19314,4,8,0,14,0,5,1,4,3,0,17,11,4,22,0 -19315,0,8,0,1,6,5,10,2,2,1,13,3,4,3,0 -19316,6,1,0,2,1,3,5,0,3,1,18,19,4,5,1 -19317,8,8,0,8,4,4,1,3,2,0,13,2,5,3,0 -19318,1,8,0,13,0,5,9,3,3,0,0,11,1,16,0 -19319,1,7,0,6,2,4,3,3,3,1,13,11,0,36,0 -19320,2,2,0,9,2,3,0,4,0,0,15,16,2,8,0 -19321,3,8,0,12,0,3,5,1,0,0,13,17,0,24,0 -19322,9,8,0,14,3,2,6,3,0,0,17,11,1,25,0 -19323,8,0,0,15,5,2,4,2,2,0,0,7,5,23,1 -19324,1,2,0,1,2,5,7,2,2,1,13,6,0,38,0 -19325,0,0,0,15,0,6,5,5,4,0,13,18,1,26,0 -19326,3,5,0,13,1,0,5,5,3,0,2,2,1,7,0 -19327,0,8,0,7,2,2,14,1,0,0,7,10,5,41,1 -19328,0,7,0,0,2,5,0,2,1,0,8,4,0,3,0 -19329,0,7,0,5,0,3,4,3,0,0,13,4,2,37,0 -19330,6,6,0,11,0,5,0,3,3,1,14,3,0,0,0 -19331,5,8,0,6,5,3,14,4,3,0,17,2,2,21,0 -19332,5,0,0,15,1,5,11,4,1,0,13,2,0,37,0 -19333,2,1,0,13,4,4,5,4,1,1,7,20,2,8,0 -19334,5,2,0,5,4,1,14,2,4,0,18,8,4,12,1 -19335,7,7,0,6,0,4,5,4,4,1,0,9,0,26,0 -19336,1,5,0,7,3,3,1,4,1,1,8,16,4,39,0 -19337,9,5,0,3,2,4,11,5,3,1,18,1,4,18,1 -19338,5,6,0,1,2,0,2,5,1,1,11,17,3,34,1 -19339,0,7,0,10,6,0,2,2,4,1,7,7,5,12,1 -19340,2,7,0,6,4,6,11,5,3,1,3,8,4,21,1 -19341,4,2,0,10,1,0,7,5,0,0,6,6,1,35,0 -19342,2,2,0,6,3,0,5,0,1,0,10,18,2,12,0 -19343,7,7,0,15,3,4,4,3,0,0,14,10,0,2,0 
-19344,4,7,0,13,2,5,12,2,2,1,8,15,2,19,0 -19345,9,2,0,3,0,3,9,2,2,1,2,11,5,35,0 -19346,2,0,0,3,6,2,9,5,0,0,3,0,4,8,0 -19347,0,1,0,2,2,3,11,1,1,0,18,12,2,19,1 -19348,0,8,0,7,4,4,7,0,1,1,13,16,0,40,0 -19349,0,2,0,2,3,0,4,4,2,0,8,9,2,4,0 -19350,8,8,0,12,5,5,6,3,0,1,2,6,1,29,0 -19351,0,5,0,12,5,5,4,1,1,0,4,6,1,20,0 -19352,6,1,0,11,6,1,2,3,0,1,2,15,3,2,0 -19353,8,4,0,5,2,0,9,4,0,0,1,11,0,40,0 -19354,8,4,0,7,0,0,14,2,4,1,16,6,0,19,0 -19355,1,6,0,14,5,1,9,2,2,1,7,2,0,0,0 -19356,4,3,0,10,2,2,11,4,1,1,18,3,5,20,1 -19357,2,3,0,3,0,1,0,5,0,1,6,5,1,3,0 -19358,8,0,0,2,2,1,3,1,1,1,14,7,3,39,1 -19359,7,2,0,3,3,4,14,2,4,0,18,4,2,14,0 -19360,0,8,0,8,2,0,3,4,3,1,0,10,4,9,0 -19361,10,6,0,8,6,3,1,3,4,0,15,2,3,14,0 -19362,0,4,0,10,6,4,4,3,4,0,13,13,0,38,0 -19363,8,6,0,3,6,4,0,1,2,0,13,11,1,20,0 -19364,0,2,0,4,1,5,14,4,4,0,9,15,3,8,0 -19365,10,2,0,14,4,1,0,5,1,1,16,5,5,24,1 -19366,10,1,0,1,6,5,10,3,1,1,7,2,1,38,0 -19367,2,2,0,2,2,6,9,3,2,0,4,15,5,4,0 -19368,5,8,0,5,0,1,2,3,1,0,2,5,5,0,0 -19369,9,7,0,3,4,5,3,3,0,0,2,13,5,25,0 -19370,4,4,0,15,5,3,10,4,2,1,18,7,2,6,1 -19371,0,1,0,5,2,3,10,4,3,0,2,4,0,25,0 -19372,3,6,0,8,4,5,9,3,0,1,8,11,2,32,0 -19373,6,7,0,3,1,5,6,4,3,1,11,11,0,26,0 -19374,2,2,0,12,6,0,5,3,3,1,10,13,1,9,0 -19375,1,3,0,15,6,0,6,0,4,0,16,10,4,20,1 -19376,0,5,0,9,6,0,4,0,4,0,13,9,0,31,0 -19377,8,1,0,12,0,2,1,1,1,1,5,13,1,19,0 -19378,10,1,0,2,6,6,9,2,0,1,2,15,5,39,0 -19379,2,4,0,2,4,1,2,2,0,1,13,4,2,9,0 -19380,1,7,0,8,2,5,6,3,1,0,2,16,2,19,0 -19381,3,1,0,11,6,0,0,5,4,1,8,18,1,39,0 -19382,10,2,0,6,6,3,10,0,4,0,18,14,0,2,1 -19383,3,5,0,14,1,6,5,2,3,1,10,10,3,3,0 -19384,6,4,0,14,1,5,14,4,0,1,13,6,0,38,0 -19385,10,3,0,7,1,4,5,2,0,1,6,15,4,31,0 -19386,8,4,0,12,2,3,9,3,0,1,11,12,2,38,0 -19387,8,2,0,3,1,4,1,2,3,0,8,6,5,14,0 -19388,5,8,0,5,1,3,14,4,1,1,10,12,5,38,0 -19389,0,2,0,2,0,4,9,2,2,1,2,6,1,26,0 -19390,5,5,0,13,6,5,0,1,2,1,8,18,1,16,0 -19391,8,1,0,2,6,5,7,3,0,1,14,4,5,35,0 -19392,10,8,0,6,4,1,12,3,3,0,13,13,1,36,0 -19393,9,7,0,9,5,6,7,0,0,0,2,18,0,27,0 
-19394,1,8,0,1,0,2,11,1,0,0,2,3,5,29,0 -19395,6,4,0,1,1,4,8,2,1,0,4,18,1,14,0 -19396,1,6,0,12,0,3,1,0,1,0,15,4,3,25,0 -19397,5,8,0,1,2,3,0,3,3,1,2,2,1,4,0 -19398,0,1,0,7,3,4,10,2,3,1,13,17,1,4,0 -19399,9,0,0,2,2,3,8,2,4,0,10,14,1,11,0 -19400,10,5,0,14,0,5,6,5,2,0,17,19,4,28,0 -19401,10,3,0,12,3,4,6,1,3,0,9,6,3,7,0 -19402,1,0,0,5,0,4,5,0,3,0,15,11,1,35,0 -19403,10,2,0,3,6,5,4,4,1,1,4,13,3,8,0 -19404,1,5,0,15,3,1,4,4,2,0,15,19,2,26,0 -19405,1,3,0,11,0,4,4,5,3,1,8,16,2,28,0 -19406,5,4,0,1,6,5,6,3,3,0,2,8,1,20,0 -19407,0,1,0,0,1,5,14,4,1,0,2,13,1,32,0 -19408,1,0,0,1,5,0,1,1,1,0,2,11,3,26,0 -19409,10,5,0,11,4,2,0,3,1,0,3,7,3,21,1 -19410,0,3,0,5,5,1,0,1,1,1,8,6,3,18,0 -19411,3,6,0,4,2,0,14,3,0,1,3,13,4,19,0 -19412,7,2,0,9,0,3,4,3,0,1,16,19,2,9,1 -19413,8,6,0,14,6,1,11,0,3,1,18,19,4,25,1 -19414,2,6,0,1,3,0,8,2,3,0,11,4,3,38,0 -19415,2,5,0,7,5,4,0,3,1,0,6,2,2,10,0 -19416,2,8,0,9,0,4,5,3,1,1,17,6,5,35,0 -19417,3,5,0,13,1,5,9,3,0,0,4,18,2,5,0 -19418,1,3,0,3,3,6,11,3,1,0,17,0,0,6,0 -19419,9,7,0,9,1,5,2,5,3,1,15,5,1,30,1 -19420,0,3,0,15,2,0,6,3,1,1,7,0,2,14,0 -19421,0,7,0,0,1,0,9,1,2,0,12,14,1,1,0 -19422,8,2,0,10,6,2,5,4,3,1,2,17,5,9,1 -19423,9,0,0,5,5,5,10,2,1,0,13,20,5,14,0 -19424,1,2,0,3,0,1,3,4,4,0,16,0,1,20,0 -19425,5,1,0,14,6,5,6,5,1,1,4,11,1,32,0 -19426,9,5,0,2,0,3,6,0,0,1,7,2,4,13,0 -19427,2,7,0,9,0,1,0,4,0,0,13,10,3,31,0 -19428,1,7,0,6,4,1,5,3,2,1,13,12,0,24,0 -19429,10,2,0,11,1,0,9,1,0,1,2,11,1,39,0 -19430,2,7,0,11,2,6,0,1,1,0,1,20,0,26,0 -19431,3,2,0,2,1,6,1,2,1,1,2,6,2,6,0 -19432,2,6,0,8,0,5,6,2,1,0,4,15,5,10,0 -19433,2,6,0,3,0,6,7,4,1,0,6,9,0,36,0 -19434,10,1,0,13,3,6,2,2,3,0,6,10,4,2,1 -19435,8,6,0,10,6,4,9,0,3,0,13,13,4,22,0 -19436,1,1,0,2,4,6,5,0,4,0,15,2,4,11,0 -19437,0,4,0,5,5,4,1,1,2,1,13,3,4,33,0 -19438,1,7,0,10,0,0,12,5,0,0,2,6,5,29,0 -19439,3,6,0,8,2,6,6,2,4,0,7,16,3,14,0 -19440,6,1,0,11,2,1,8,0,2,1,9,7,2,41,1 -19441,1,4,0,1,1,1,3,4,3,1,2,14,2,9,0 -19442,1,8,0,5,0,4,5,0,4,1,11,11,3,30,0 -19443,3,8,0,1,5,0,6,3,2,1,6,13,4,6,0 
-19444,2,2,0,1,5,5,9,4,0,0,8,2,5,6,0 -19445,9,2,0,11,1,4,2,4,3,1,13,18,0,3,0 -19446,10,2,0,14,3,6,5,0,4,0,8,14,3,39,0 -19447,8,6,0,0,1,5,0,1,0,0,16,0,0,7,0 -19448,1,2,0,12,6,3,6,4,0,0,13,7,1,27,0 -19449,10,7,0,6,2,3,11,4,4,1,10,5,4,6,0 -19450,8,6,0,15,1,4,0,4,1,0,13,0,0,4,0 -19451,5,5,0,0,4,2,1,1,0,1,14,17,4,5,1 -19452,10,8,0,0,3,0,0,0,3,1,8,0,0,39,0 -19453,3,4,0,3,0,4,2,3,3,1,6,13,4,34,0 -19454,1,1,0,8,5,5,5,4,4,0,2,11,4,29,0 -19455,3,7,0,11,0,3,5,2,3,1,8,3,2,29,0 -19456,6,6,0,4,4,2,11,2,3,0,4,8,2,34,1 -19457,2,4,0,1,0,0,2,1,0,0,4,20,0,29,0 -19458,0,2,0,11,6,4,5,3,2,0,13,20,1,26,0 -19459,3,6,0,0,2,6,0,2,3,0,8,6,0,41,0 -19460,10,5,0,10,0,4,7,4,0,0,2,9,2,35,0 -19461,9,0,0,14,6,0,0,2,2,1,17,11,0,8,0 -19462,10,6,0,10,5,3,12,4,3,0,8,8,3,22,0 -19463,0,0,0,6,2,0,9,3,3,1,2,11,0,10,0 -19464,9,6,0,4,4,5,5,2,3,1,14,2,2,16,0 -19465,4,1,0,14,1,6,8,1,0,1,0,2,0,34,0 -19466,4,1,0,1,4,5,9,2,1,1,0,12,1,8,0 -19467,4,6,0,12,6,2,12,0,2,0,7,7,1,33,1 -19468,6,5,0,7,2,5,1,2,0,1,6,6,3,17,0 -19469,3,1,0,4,0,2,9,1,3,0,12,13,0,32,0 -19470,7,6,0,10,3,0,6,4,3,0,9,8,1,2,1 -19471,6,7,0,15,3,6,5,2,0,0,8,11,3,38,0 -19472,0,7,0,13,0,5,1,2,1,0,13,15,1,38,0 -19473,10,2,0,8,5,1,3,0,2,0,4,0,3,9,1 -19474,8,4,0,15,6,3,5,2,0,0,4,0,2,6,0 -19475,7,2,0,0,6,2,1,1,3,1,18,17,2,5,1 -19476,9,1,0,3,1,6,8,0,1,0,8,11,4,12,0 -19477,6,6,0,8,2,4,8,1,1,0,15,15,0,28,0 -19478,6,7,0,5,3,4,7,0,1,0,13,6,2,40,0 -19479,10,6,0,2,0,0,4,5,4,1,14,5,5,33,1 -19480,5,6,0,7,4,1,12,1,0,1,5,1,4,0,1 -19481,0,1,0,5,1,5,14,5,0,0,2,15,4,1,0 -19482,2,4,0,5,4,0,5,1,2,1,10,2,5,18,0 -19483,10,3,0,4,4,5,9,2,2,0,0,6,5,37,0 -19484,9,4,0,15,6,2,14,5,4,1,12,7,2,28,1 -19485,0,7,0,11,1,5,5,4,0,1,2,14,3,11,0 -19486,8,6,0,3,0,2,5,3,3,1,7,11,2,12,0 -19487,0,2,0,0,4,4,4,1,0,1,8,4,0,0,0 -19488,6,4,0,13,4,5,14,2,0,1,13,4,0,19,0 -19489,2,7,0,0,2,6,9,2,3,1,15,9,3,30,0 -19490,9,4,0,4,1,4,9,0,0,1,1,13,4,33,0 -19491,4,8,0,3,5,4,6,2,0,1,4,2,3,7,0 -19492,5,6,0,5,1,1,7,1,2,0,2,4,1,24,0 -19493,0,4,0,1,0,4,7,3,2,0,13,9,0,24,0 -19494,7,1,0,9,2,0,2,2,0,1,7,6,1,24,0 
-19495,1,3,0,11,2,4,0,5,3,0,6,6,1,1,0 -19496,3,0,0,5,2,1,12,5,4,1,18,14,1,7,1 -19497,10,3,0,10,4,3,4,1,2,0,2,10,1,22,1 -19498,2,8,0,5,3,1,3,0,2,0,2,17,4,6,0 -19499,1,5,0,1,5,3,2,2,0,0,5,16,1,25,0 -19500,2,7,0,0,5,5,6,2,3,0,13,13,2,10,0 -19501,2,3,0,13,1,0,9,0,4,0,8,4,2,3,0 -19502,3,1,0,0,6,2,14,3,1,1,2,11,4,13,0 -19503,4,5,0,9,3,5,12,5,1,1,11,8,2,30,1 -19504,2,1,0,15,0,5,0,2,2,0,12,12,0,10,1 -19505,1,8,0,1,1,0,8,3,1,1,2,2,0,3,0 -19506,7,2,0,15,4,1,12,5,3,1,7,7,3,32,1 -19507,6,3,0,1,6,2,6,3,2,1,0,10,2,30,0 -19508,10,5,0,5,0,5,11,0,4,0,4,13,0,17,0 -19509,10,7,0,5,3,3,9,1,2,1,18,10,3,21,1 -19510,2,2,0,7,2,5,7,2,4,0,17,2,3,27,0 -19511,9,2,0,5,2,3,3,0,0,0,14,11,2,14,0 -19512,0,8,0,0,4,4,8,3,0,0,2,3,4,25,0 -19513,9,7,0,4,4,5,5,1,4,0,18,6,0,36,0 -19514,1,1,0,1,0,1,5,4,3,1,3,2,0,17,0 -19515,7,2,0,1,0,4,8,5,1,0,13,6,2,35,0 -19516,10,6,0,3,5,4,2,3,4,0,2,16,4,26,0 -19517,9,1,0,13,3,1,5,5,2,1,2,8,5,24,0 -19518,4,0,0,5,6,0,8,4,2,0,9,2,3,4,0 -19519,3,6,0,9,2,6,0,2,3,0,11,13,1,26,0 -19520,1,6,0,0,0,5,9,4,4,1,17,2,0,19,0 -19521,8,5,0,5,2,1,10,0,0,0,4,6,5,4,0 -19522,2,5,0,5,0,3,10,3,4,0,13,6,1,38,0 -19523,2,8,0,3,4,0,3,4,2,0,13,11,1,23,0 -19524,4,8,0,4,5,3,14,1,4,0,8,2,5,23,0 -19525,3,8,0,10,6,0,6,3,4,0,8,2,0,38,0 -19526,2,2,0,8,4,4,8,0,0,0,17,2,5,6,0 -19527,2,0,0,4,4,4,6,4,0,0,18,7,3,41,1 -19528,1,0,0,13,4,1,13,0,2,0,9,1,0,36,1 -19529,3,6,0,6,0,6,9,4,1,0,12,2,5,40,0 -19530,8,3,0,9,6,3,7,3,1,0,18,12,2,20,1 -19531,5,8,0,10,6,4,6,4,2,0,6,9,5,3,0 -19532,7,0,0,3,4,0,8,4,4,0,17,9,4,17,0 -19533,9,0,0,10,4,2,0,1,1,1,18,5,2,14,1 -19534,4,4,0,3,2,6,8,0,2,1,0,2,0,31,0 -19535,2,2,0,13,6,0,7,3,0,1,17,6,1,40,0 -19536,5,3,0,1,0,4,5,3,0,0,12,6,2,26,0 -19537,10,1,0,15,5,0,3,5,3,0,13,18,1,3,0 -19538,8,6,0,11,1,1,14,5,0,1,9,20,4,24,1 -19539,1,3,0,3,3,3,2,4,3,0,10,15,3,25,0 -19540,6,1,0,11,1,0,2,0,3,0,0,3,1,4,1 -19541,1,0,0,2,0,4,5,1,2,0,13,15,0,18,0 -19542,4,3,0,15,0,3,14,4,0,1,13,2,5,19,0 -19543,0,7,0,10,6,4,12,5,4,0,13,3,1,5,0 -19544,0,8,0,13,5,4,7,3,1,0,8,11,0,21,0 
-19545,0,5,0,7,1,0,8,1,3,1,13,15,1,13,0 -19546,4,4,0,3,3,2,7,0,2,1,2,11,1,11,0 -19547,3,1,0,9,2,5,7,1,1,1,4,6,5,35,0 -19548,8,2,0,11,6,4,13,0,2,1,8,11,3,14,0 -19549,6,2,0,1,6,6,8,2,2,0,2,13,2,16,0 -19550,0,3,0,2,2,4,8,4,2,0,4,15,5,30,0 -19551,3,8,0,14,6,2,7,2,4,0,17,6,5,27,0 -19552,5,8,0,9,4,5,6,4,1,1,18,6,2,36,1 -19553,0,4,0,4,0,4,6,2,3,1,13,14,0,13,0 -19554,9,8,0,2,5,1,3,1,3,1,18,5,1,7,1 -19555,7,0,0,1,2,3,5,1,4,1,17,3,2,8,0 -19556,8,4,0,12,5,6,10,0,1,1,1,7,4,38,1 -19557,10,2,0,1,6,4,1,4,0,1,0,3,0,14,1 -19558,5,5,0,3,0,5,9,5,4,1,2,14,4,6,0 -19559,0,2,0,7,6,5,13,3,4,1,15,0,3,14,0 -19560,5,1,0,7,1,4,9,0,4,1,4,18,0,34,0 -19561,1,6,0,3,4,0,6,3,0,1,16,8,4,6,0 -19562,7,8,0,4,3,0,0,3,3,0,13,0,1,28,0 -19563,4,1,0,10,2,1,0,1,3,1,5,14,1,1,1 -19564,7,2,0,3,2,0,8,0,0,1,0,9,4,31,0 -19565,7,5,0,4,5,1,8,3,2,1,6,6,1,1,0 -19566,0,4,0,3,4,5,0,4,4,1,2,2,1,6,0 -19567,0,1,0,2,3,2,8,4,4,1,4,13,5,10,0 -19568,10,1,0,1,4,4,6,0,1,0,18,17,5,13,1 -19569,4,4,0,6,3,6,10,3,0,0,2,3,1,7,0 -19570,8,6,0,15,0,6,8,4,2,0,7,5,3,37,0 -19571,2,0,0,7,4,3,3,4,2,1,13,11,0,0,0 -19572,0,7,0,7,6,1,3,1,4,1,16,7,4,30,1 -19573,4,7,0,4,5,2,5,0,2,1,8,15,2,7,1 -19574,2,4,0,12,2,5,2,4,3,1,2,3,4,20,0 -19575,4,2,0,10,5,1,10,1,0,1,9,13,2,4,0 -19576,7,1,0,3,2,6,0,5,3,0,17,3,3,8,1 -19577,9,0,0,6,2,0,12,3,3,1,5,10,3,4,1 -19578,0,3,0,14,6,2,9,0,0,1,0,9,1,39,0 -19579,8,6,0,2,2,4,12,1,4,1,13,6,1,25,0 -19580,9,8,0,3,0,1,1,1,3,0,12,16,4,1,0 -19581,1,3,0,7,0,0,8,4,3,1,0,9,1,25,0 -19582,4,2,0,15,2,4,12,5,1,1,3,1,3,18,1 -19583,2,3,0,1,4,2,2,2,4,0,17,7,4,22,1 -19584,9,8,0,4,1,2,4,5,0,0,13,16,1,37,0 -19585,1,2,0,4,5,2,3,2,4,0,2,16,5,8,0 -19586,3,7,0,11,4,3,9,1,2,0,2,11,0,27,0 -19587,5,5,0,4,5,6,9,2,0,0,13,11,3,28,0 -19588,3,2,0,2,2,5,10,5,3,1,2,17,2,8,0 -19589,9,1,0,3,5,6,8,3,1,1,10,16,4,38,0 -19590,5,6,0,15,2,6,13,0,2,1,5,7,0,22,1 -19591,8,8,0,8,2,5,3,5,1,1,18,17,3,19,1 -19592,4,8,0,6,2,5,14,3,0,0,3,13,0,9,0 -19593,8,0,0,5,5,6,3,3,1,1,14,14,3,1,1 -19594,2,2,0,0,6,0,0,2,0,0,12,2,2,3,0 -19595,10,5,0,10,3,1,12,3,0,0,1,7,5,10,1 
-19596,10,4,0,13,2,6,6,2,0,1,2,1,2,9,0 -19597,4,1,0,0,0,6,7,1,1,1,15,9,2,6,0 -19598,2,0,0,12,0,3,10,5,4,1,18,7,4,12,1 -19599,7,6,0,10,3,5,9,3,4,0,12,9,3,27,0 -19600,2,6,0,13,1,4,0,0,1,0,10,2,1,7,0 -19601,3,2,0,5,6,3,3,2,2,1,0,8,0,29,0 -19602,3,3,0,5,5,2,9,2,1,1,8,0,5,2,0 -19603,6,0,0,6,0,0,9,1,0,1,13,13,0,4,0 -19604,0,3,0,6,0,5,5,2,0,1,10,18,3,22,0 -19605,4,6,0,9,6,2,11,2,1,1,10,2,1,21,0 -19606,0,2,0,1,0,2,14,3,2,0,8,11,4,0,0 -19607,0,4,0,6,6,3,5,3,1,1,0,2,1,33,0 -19608,4,1,0,12,6,5,4,5,2,0,5,7,1,18,1 -19609,1,3,0,4,3,2,0,5,3,1,8,7,5,34,1 -19610,1,5,0,4,5,1,6,4,1,0,8,15,0,22,0 -19611,9,0,0,2,3,1,0,1,4,1,17,7,4,31,1 -19612,10,3,0,0,6,1,6,4,3,1,13,2,0,23,0 -19613,8,8,0,1,3,4,10,0,2,1,7,17,1,2,1 -19614,5,8,0,3,4,2,6,1,1,1,0,6,1,4,0 -19615,4,1,0,8,6,2,2,0,3,1,14,12,5,6,1 -19616,5,5,0,14,0,4,6,1,3,1,2,11,5,8,0 -19617,10,3,0,2,3,5,4,1,3,0,1,10,2,18,1 -19618,6,5,0,2,0,5,6,0,1,1,4,7,2,29,0 -19619,1,3,0,0,0,5,13,5,2,0,8,12,2,14,0 -19620,2,4,0,8,4,5,14,5,3,0,17,9,2,13,0 -19621,0,0,0,11,2,3,5,4,4,0,2,1,1,19,0 -19622,0,6,0,7,4,4,5,3,2,0,8,4,4,14,0 -19623,9,5,0,8,5,3,2,4,4,1,6,2,3,7,0 -19624,3,7,0,2,1,0,4,5,2,1,17,19,2,33,0 -19625,8,5,0,14,6,2,11,5,0,1,14,7,3,22,1 -19626,1,5,0,10,0,3,12,0,2,0,12,12,1,38,0 -19627,8,3,0,12,0,1,14,5,1,1,6,9,2,37,1 -19628,0,1,0,7,0,5,5,4,3,0,8,16,3,2,0 -19629,1,8,0,4,2,4,0,4,1,1,13,2,3,27,0 -19630,9,8,0,3,0,3,5,3,2,0,13,0,1,39,0 -19631,5,7,0,2,1,3,14,1,4,0,4,13,1,8,0 -19632,0,5,0,1,5,3,5,3,2,1,6,18,0,18,0 -19633,5,1,0,9,1,3,7,3,3,0,2,2,4,30,0 -19634,10,2,0,2,0,0,9,5,4,0,15,11,5,17,0 -19635,10,8,0,0,3,6,12,5,4,1,0,7,0,8,1 -19636,2,7,0,3,0,5,13,1,2,1,14,10,4,40,1 -19637,8,8,0,6,0,1,9,1,1,0,3,16,2,4,0 -19638,8,2,0,4,1,2,4,4,0,0,2,15,1,11,0 -19639,8,4,0,15,2,5,8,3,0,1,4,9,3,24,0 -19640,5,4,0,13,6,0,3,2,0,0,4,6,2,38,0 -19641,0,8,0,14,0,0,5,2,4,1,0,6,5,20,0 -19642,10,7,0,11,1,4,4,1,0,0,2,2,0,0,0 -19643,10,0,0,5,0,1,5,0,0,0,17,11,5,9,0 -19644,8,7,0,13,0,0,9,3,1,0,4,2,5,16,0 -19645,9,1,0,12,5,5,4,4,4,0,11,3,3,41,1 -19646,2,0,0,1,4,3,0,4,1,0,4,11,5,31,0 
-19647,0,1,0,14,6,2,0,3,4,1,2,1,5,31,0 -19648,6,3,0,11,0,4,0,0,0,1,8,4,0,8,0 -19649,4,8,0,13,1,5,11,1,4,1,17,2,1,34,0 -19650,5,0,0,0,0,2,5,5,2,0,4,15,1,5,0 -19651,5,8,0,0,0,0,12,3,0,0,2,13,0,7,0 -19652,0,6,0,12,2,1,12,4,0,1,13,1,3,22,1 -19653,1,4,0,6,3,2,7,3,4,0,17,15,0,36,0 -19654,10,8,0,4,5,4,5,1,4,0,8,2,0,38,0 -19655,9,1,0,1,4,0,7,0,4,1,7,5,2,41,1 -19656,0,2,0,8,6,4,12,5,4,0,4,9,5,29,0 -19657,1,3,0,10,3,4,4,2,0,0,6,0,0,4,0 -19658,6,1,0,11,5,1,5,4,0,0,13,4,0,19,0 -19659,2,6,0,1,1,0,0,1,4,0,16,13,1,16,0 -19660,6,0,0,3,3,1,10,3,3,0,13,2,4,6,0 -19661,6,4,0,3,3,0,7,0,0,0,6,11,1,40,0 -19662,4,8,0,10,1,1,9,0,3,0,7,1,2,6,1 -19663,0,4,0,2,0,2,12,0,4,1,9,16,1,5,0 -19664,5,4,0,13,5,6,10,1,0,0,13,11,1,18,0 -19665,3,2,0,1,4,4,4,5,3,1,2,6,3,40,0 -19666,0,5,0,6,2,1,8,1,2,0,8,20,3,17,0 -19667,9,4,0,8,6,3,6,0,3,0,7,5,2,38,0 -19668,0,7,0,1,4,6,2,0,0,0,13,2,0,14,0 -19669,1,1,0,1,3,0,2,3,0,0,13,18,2,16,0 -19670,6,6,0,14,0,5,1,5,3,0,15,4,1,27,0 -19671,7,4,0,10,1,1,6,2,0,1,12,17,1,37,1 -19672,9,7,0,15,6,6,8,0,0,1,2,18,0,36,0 -19673,9,8,0,12,6,1,7,3,4,1,4,2,0,13,0 -19674,2,0,0,13,4,6,10,3,2,0,10,2,3,3,0 -19675,5,6,0,9,1,0,8,0,2,0,2,6,2,0,0 -19676,9,1,0,2,1,3,0,2,1,1,14,7,3,5,1 -19677,4,7,0,11,5,4,11,1,0,0,13,20,2,36,0 -19678,2,8,0,15,2,5,12,4,0,0,6,3,2,31,0 -19679,2,3,0,1,4,3,3,1,3,0,2,18,3,38,0 -19680,0,6,0,6,3,3,14,3,0,1,0,18,0,33,0 -19681,10,6,0,15,5,0,0,1,1,0,6,12,4,31,0 -19682,4,3,0,1,1,5,8,0,0,0,4,11,2,22,0 -19683,0,4,0,1,0,3,13,4,0,1,14,2,1,14,0 -19684,4,8,0,13,1,4,5,5,4,0,11,2,5,36,0 -19685,0,3,0,7,0,5,14,4,0,1,13,13,3,36,0 -19686,1,5,0,13,6,4,2,3,1,0,15,2,0,13,0 -19687,10,3,0,6,5,4,6,3,2,1,17,12,2,40,0 -19688,2,8,0,3,3,6,2,4,4,0,2,0,3,35,0 -19689,6,8,0,4,4,2,13,0,3,1,10,4,5,33,1 -19690,4,7,0,3,0,0,0,0,3,0,10,15,5,27,0 -19691,1,3,0,5,1,1,6,4,3,1,7,15,4,19,0 -19692,7,1,0,6,3,0,8,2,4,0,5,15,2,23,0 -19693,3,3,0,13,0,6,9,2,3,1,17,2,1,24,0 -19694,7,7,0,10,0,1,14,3,3,0,13,15,1,27,0 -19695,0,4,0,13,5,3,5,3,1,1,4,11,0,1,0 -19696,8,0,0,2,3,5,8,3,3,0,2,2,5,24,0 
-19697,5,8,0,4,1,5,6,1,4,1,13,15,0,35,0 -19698,4,1,0,14,1,2,2,3,4,1,2,6,1,16,0 -19699,1,6,0,5,2,5,0,0,0,0,17,11,0,5,0 -19700,7,2,0,0,1,1,1,4,0,1,1,8,1,39,1 -19701,1,6,0,6,0,3,0,2,4,0,13,7,2,39,0 -19702,4,4,0,15,1,1,0,4,0,1,17,18,0,26,0 -19703,0,8,0,5,2,5,11,1,2,0,2,4,5,28,0 -19704,2,2,0,12,6,0,2,1,0,0,0,11,2,38,0 -19705,10,5,0,7,5,0,11,1,3,1,18,14,5,9,1 -19706,0,1,0,8,2,5,1,2,4,1,17,6,2,2,0 -19707,6,4,0,8,3,1,7,1,2,1,7,11,5,31,0 -19708,2,5,0,0,0,5,1,3,0,0,12,15,4,25,0 -19709,1,0,0,11,2,4,9,3,2,0,16,18,1,4,0 -19710,5,6,0,11,6,4,8,3,3,1,13,6,3,10,0 -19711,9,6,0,9,3,2,4,4,3,1,11,3,5,18,1 -19712,4,7,0,8,5,3,3,0,4,1,4,2,4,35,0 -19713,6,6,0,11,3,0,8,3,0,0,4,6,2,29,0 -19714,1,4,0,3,5,3,9,4,1,1,17,2,1,31,0 -19715,2,7,0,15,1,6,6,5,4,0,0,11,5,5,0 -19716,1,3,0,2,6,2,0,2,4,1,0,15,3,6,0 -19717,6,2,0,1,2,4,6,1,2,0,13,17,3,19,0 -19718,0,5,0,15,2,5,3,5,0,0,2,9,2,11,0 -19719,1,8,0,4,5,0,6,2,3,1,13,4,0,40,0 -19720,10,7,0,13,0,6,8,1,0,1,8,2,0,0,0 -19721,2,5,0,12,5,2,13,3,3,0,2,2,2,11,0 -19722,0,8,0,0,1,5,0,4,0,0,12,13,0,12,0 -19723,1,6,0,12,4,5,4,5,2,0,13,11,3,4,0 -19724,9,0,0,5,6,6,14,1,2,0,10,10,1,20,1 -19725,4,8,0,5,0,6,9,4,3,1,8,3,5,18,0 -19726,0,5,0,8,0,3,1,4,0,1,13,12,2,31,0 -19727,5,4,0,7,1,0,11,2,0,0,13,2,5,11,0 -19728,9,5,0,0,2,2,9,1,3,0,10,18,3,22,0 -19729,1,3,0,9,4,4,12,3,0,1,14,20,3,31,1 -19730,8,5,0,14,6,3,13,1,1,0,12,2,3,0,0 -19731,9,6,0,1,5,0,3,2,0,1,11,5,5,2,1 -19732,9,5,0,0,2,1,2,2,3,0,4,11,3,7,0 -19733,10,4,0,1,3,1,8,2,4,0,1,10,3,35,1 -19734,0,5,0,3,6,4,9,2,0,0,13,2,3,20,0 -19735,0,2,0,3,1,3,9,4,0,0,13,20,4,37,0 -19736,2,8,0,2,4,2,4,3,4,1,8,6,3,3,0 -19737,4,6,0,12,6,2,7,0,3,1,4,15,3,13,0 -19738,3,2,0,6,6,6,7,1,2,1,14,7,2,28,1 -19739,3,0,0,12,6,4,8,3,1,1,8,15,2,20,0 -19740,10,5,0,3,4,3,11,0,4,0,11,11,0,1,0 -19741,0,0,0,3,0,0,8,4,0,1,13,13,1,23,0 -19742,0,1,0,4,5,4,3,2,3,0,17,11,4,29,0 -19743,0,5,0,1,6,0,9,0,0,0,4,11,4,30,0 -19744,10,4,0,7,5,0,13,1,0,0,11,11,4,0,0 -19745,3,4,0,3,0,3,2,0,1,1,12,11,5,34,0 -19746,10,6,0,3,6,0,3,5,1,0,10,14,0,21,0 
-19747,0,2,0,4,6,1,14,4,1,0,10,19,1,5,0 -19748,9,3,0,0,0,4,14,1,0,1,13,18,3,10,0 -19749,6,5,0,7,6,3,3,3,4,0,8,14,5,19,1 -19750,0,7,0,9,4,5,10,2,1,1,13,13,2,26,0 -19751,6,2,0,6,5,0,9,4,3,0,13,11,4,3,0 -19752,2,1,0,15,0,1,0,1,1,1,2,13,3,36,0 -19753,7,8,0,2,0,1,4,2,2,1,6,19,3,8,1 -19754,3,5,0,8,6,4,11,3,1,1,4,5,4,17,1 -19755,1,0,0,13,6,4,8,1,2,1,7,14,4,8,0 -19756,3,3,0,4,3,5,11,5,0,1,16,7,3,40,1 -19757,2,2,0,2,6,1,7,2,3,0,4,13,0,16,0 -19758,1,3,0,5,1,5,0,4,1,1,2,2,1,38,0 -19759,0,3,0,1,0,3,14,4,4,1,13,2,3,24,0 -19760,1,3,0,12,5,1,9,1,3,0,10,11,2,17,0 -19761,9,1,0,9,4,2,11,4,1,0,6,19,2,25,1 -19762,5,1,0,14,0,6,12,0,4,0,18,11,0,3,0 -19763,2,3,0,10,3,2,5,1,3,0,8,20,2,41,0 -19764,0,6,0,6,0,1,0,4,1,1,12,3,4,9,0 -19765,2,8,0,2,4,5,1,4,1,1,9,12,2,21,1 -19766,0,7,0,8,2,3,13,4,1,0,18,1,5,17,1 -19767,3,6,0,2,0,4,2,4,4,0,6,6,3,12,0 -19768,9,4,0,12,2,4,12,4,0,0,13,2,2,12,0 -19769,6,7,0,2,5,3,3,0,3,1,8,19,4,37,0 -19770,0,0,0,6,5,2,0,0,1,1,17,11,0,29,0 -19771,0,7,0,6,3,4,8,0,1,0,13,6,0,9,0 -19772,6,1,0,12,4,2,2,4,3,1,6,19,2,14,1 -19773,9,6,0,13,4,5,12,0,1,0,2,2,2,20,0 -19774,5,7,0,5,3,2,13,5,2,0,8,4,4,13,0 -19775,9,0,0,10,2,6,0,1,0,0,5,17,1,26,1 -19776,2,6,0,3,6,5,4,2,0,0,15,20,5,32,0 -19777,0,2,0,11,5,3,5,1,0,1,17,18,2,36,0 -19778,2,4,0,3,3,6,9,2,0,1,10,15,4,25,0 -19779,0,8,0,3,6,4,6,2,0,1,13,15,0,4,0 -19780,9,7,0,3,2,3,9,2,4,1,6,5,2,1,1 -19781,7,8,0,5,3,2,9,3,2,1,18,0,3,29,0 -19782,4,6,0,10,2,6,1,1,3,1,5,1,3,20,1 -19783,0,0,0,0,2,3,7,4,0,1,6,18,3,33,0 -19784,4,1,0,11,0,1,6,3,4,0,18,12,5,16,1 -19785,6,4,0,0,0,5,10,0,1,0,4,18,4,1,0 -19786,6,5,0,5,0,6,2,1,2,1,13,6,3,4,0 -19787,5,2,0,5,4,2,10,0,3,0,8,2,4,10,0 -19788,6,2,0,1,3,1,1,5,4,1,9,12,0,14,1 -19789,0,4,0,6,1,0,2,4,2,0,2,19,0,41,0 -19790,1,0,0,4,0,0,7,3,2,0,10,3,2,39,0 -19791,8,3,0,6,0,0,12,2,3,1,1,5,0,22,1 -19792,9,0,0,1,3,3,12,1,1,0,8,13,3,36,0 -19793,8,4,0,6,2,1,12,3,3,1,9,10,5,5,1 -19794,2,3,0,1,3,5,9,0,4,1,15,3,4,20,0 -19795,10,8,0,2,3,0,11,2,1,0,10,2,2,7,0 -19796,9,6,0,2,6,2,10,4,0,0,1,0,3,18,0 
-19797,0,2,0,14,0,6,8,5,3,1,13,15,1,28,0 -19798,3,6,0,0,5,6,3,3,0,0,13,11,0,5,0 -19799,0,7,0,13,5,5,7,1,0,1,13,11,1,37,0 -19800,9,3,0,1,5,3,8,1,0,0,17,9,2,39,0 -19801,9,5,0,3,0,5,1,1,1,0,13,20,0,8,0 -19802,6,7,0,11,5,4,5,4,1,0,14,11,1,12,0 -19803,5,5,0,14,2,3,13,3,3,0,9,18,2,17,1 -19804,9,4,0,9,4,1,4,5,1,0,9,14,1,40,1 -19805,6,3,0,10,4,2,14,2,2,1,14,2,1,0,1 -19806,6,3,0,4,1,0,5,3,0,0,9,6,3,6,0 -19807,2,0,0,1,0,0,2,2,4,0,17,16,5,23,0 -19808,10,4,0,7,2,2,12,5,4,0,9,16,4,11,1 -19809,10,4,0,6,0,4,13,3,1,0,2,2,2,37,0 -19810,9,0,0,4,4,5,13,0,1,0,13,2,1,40,0 -19811,5,5,0,7,6,1,4,0,4,1,9,18,5,16,1 -19812,0,4,0,4,6,1,14,0,3,1,16,16,2,10,1 -19813,0,4,0,13,5,1,5,3,1,1,13,11,3,31,0 -19814,2,4,0,3,0,3,0,1,0,1,13,16,3,1,0 -19815,0,6,0,1,0,0,14,4,1,0,6,15,2,19,0 -19816,10,6,0,0,4,1,3,5,2,1,18,12,2,17,1 -19817,0,6,0,6,1,0,9,4,0,0,8,2,0,27,0 -19818,4,8,0,8,0,2,11,2,3,0,17,15,3,29,0 -19819,6,5,0,7,5,1,0,3,2,1,7,16,2,11,1 -19820,8,8,0,7,5,5,11,1,1,0,17,15,0,33,0 -19821,2,2,0,14,3,6,3,0,1,1,0,17,0,26,0 -19822,3,4,0,1,1,0,1,2,2,1,6,11,3,7,0 -19823,0,5,0,10,1,4,14,2,1,0,4,6,3,21,0 -19824,3,2,0,12,0,4,12,0,4,1,2,15,4,35,0 -19825,3,2,0,4,3,4,12,2,2,0,6,11,5,13,0 -19826,3,0,0,11,4,1,2,0,1,1,18,17,3,1,1 -19827,1,5,0,4,4,6,9,1,1,1,9,2,0,9,1 -19828,4,2,0,13,1,5,5,1,3,1,2,0,3,25,0 -19829,0,4,0,2,2,0,12,4,0,0,4,0,3,39,0 -19830,7,2,0,0,0,0,6,1,2,1,2,9,1,3,0 -19831,10,8,0,13,4,5,5,4,2,1,11,6,4,2,0 -19832,2,7,0,8,6,3,7,3,4,1,15,18,3,32,0 -19833,9,6,0,13,2,4,11,0,0,1,4,12,3,16,0 -19834,0,6,0,2,5,5,12,3,2,1,13,2,0,5,0 -19835,4,4,0,6,0,3,4,0,1,1,3,10,5,9,1 -19836,7,5,0,9,2,3,8,5,3,0,6,10,4,23,1 -19837,4,1,0,13,0,3,11,2,2,0,10,2,0,26,0 -19838,6,6,0,3,6,6,3,1,3,1,6,13,3,0,0 -19839,0,1,0,13,5,3,14,1,3,1,8,16,1,14,0 -19840,0,0,0,13,1,2,12,4,1,1,10,11,2,16,0 -19841,10,7,0,15,4,3,8,2,0,0,18,10,5,19,1 -19842,0,6,0,0,2,4,10,2,4,0,2,0,2,18,0 -19843,1,7,0,7,1,6,8,1,4,0,4,11,5,9,0 -19844,6,1,0,11,2,1,10,1,4,1,14,3,5,38,1 -19845,10,7,0,12,3,5,7,4,0,0,13,2,0,7,0 -19846,4,2,0,3,6,6,8,3,1,1,8,11,0,35,0 
-19847,8,4,0,1,1,1,11,1,3,0,0,14,1,41,1 -19848,0,3,0,0,0,0,0,3,3,0,11,2,4,31,0 -19849,2,8,0,7,1,5,9,0,0,0,2,6,1,21,0 -19850,2,3,0,3,6,5,4,2,1,1,13,15,1,40,0 -19851,1,5,0,5,0,6,11,0,0,0,2,2,3,5,0 -19852,8,4,0,0,4,3,0,1,2,1,17,13,3,26,0 -19853,3,8,0,7,5,0,14,3,1,0,8,9,5,33,0 -19854,2,2,0,15,6,6,2,3,2,0,13,2,1,5,0 -19855,7,0,0,11,4,5,5,4,2,1,1,5,0,13,1 -19856,3,4,0,6,0,5,10,0,4,1,5,5,4,32,1 -19857,8,1,0,0,2,6,14,5,4,1,14,13,1,31,1 -19858,6,4,0,5,4,4,12,3,3,1,13,11,5,26,0 -19859,7,7,0,10,2,4,11,3,1,0,18,10,2,19,1 -19860,6,6,0,11,0,0,8,2,3,0,4,6,0,35,0 -19861,7,1,0,0,0,0,9,1,1,0,12,2,1,17,0 -19862,3,8,0,6,0,3,2,2,3,1,5,15,3,39,0 -19863,10,0,0,15,5,0,7,2,3,1,11,6,3,35,0 -19864,2,8,0,13,6,1,9,4,3,1,2,18,2,21,0 -19865,8,7,0,9,6,3,9,1,0,0,10,18,5,8,0 -19866,5,1,0,1,4,3,13,2,2,1,3,20,2,20,0 -19867,3,4,0,9,6,1,9,1,2,1,18,7,4,1,1 -19868,2,5,0,2,1,2,9,3,0,0,8,6,5,0,0 -19869,1,4,0,3,4,1,0,0,1,0,13,2,2,31,0 -19870,3,4,0,0,5,4,13,5,0,1,8,11,0,39,0 -19871,3,2,0,11,2,2,2,3,1,1,0,16,2,16,0 -19872,2,4,0,14,5,6,2,0,4,0,10,5,1,27,1 -19873,0,4,0,4,6,4,4,1,3,0,11,5,0,33,0 -19874,7,1,0,12,1,0,3,4,3,1,7,2,0,19,0 -19875,7,8,0,0,0,1,9,3,3,0,8,12,2,23,0 -19876,5,2,0,10,2,2,4,0,1,1,5,7,1,41,1 -19877,6,6,0,6,4,1,13,0,2,1,5,10,3,0,1 -19878,7,7,0,7,2,5,1,0,3,1,2,13,4,30,0 -19879,2,3,0,11,0,3,13,3,0,0,0,20,4,32,0 -19880,0,8,0,7,6,3,10,4,0,0,13,1,4,25,0 -19881,2,5,0,5,2,3,9,4,3,0,0,3,4,38,0 -19882,0,3,0,14,6,6,3,1,4,1,4,3,2,8,0 -19883,7,7,0,12,0,5,3,3,2,1,13,9,5,9,0 -19884,3,0,0,6,0,0,5,3,2,0,8,18,5,27,0 -19885,1,7,0,1,6,4,5,0,0,1,4,5,1,21,0 -19886,1,2,0,9,0,3,0,3,0,0,2,2,2,1,0 -19887,7,8,0,5,3,0,8,2,0,1,2,2,1,2,0 -19888,0,7,0,11,2,5,6,2,0,1,6,11,0,14,0 -19889,5,3,0,15,0,3,4,3,1,1,17,11,1,14,0 -19890,0,0,0,0,1,5,4,3,3,0,6,6,4,4,0 -19891,0,8,0,7,0,2,10,3,1,0,4,6,5,2,0 -19892,5,4,0,1,0,3,6,2,2,0,2,2,3,22,0 -19893,0,7,0,8,6,0,9,1,0,0,2,0,3,28,0 -19894,2,0,0,7,4,1,12,3,3,1,2,11,5,5,0 -19895,9,3,0,3,5,6,1,3,3,1,2,17,4,3,0 -19896,1,1,0,10,1,1,6,0,2,1,1,5,2,19,1 -19897,6,2,0,1,5,0,11,3,2,1,8,13,0,40,0 
-19898,1,0,0,5,1,5,9,2,2,0,4,18,0,7,0 -19899,9,1,0,6,1,4,5,1,3,0,17,4,3,31,0 -19900,10,2,0,2,0,1,1,5,3,1,18,10,2,18,1 -19901,9,8,0,5,3,6,0,0,4,1,14,18,0,37,1 -19902,9,1,0,11,6,2,13,4,2,0,5,8,5,22,1 -19903,2,1,0,6,2,4,5,4,2,1,6,6,4,17,0 -19904,0,4,0,13,0,5,0,4,3,0,2,10,2,30,0 -19905,10,3,0,14,1,2,5,0,4,1,11,18,4,0,1 -19906,1,7,0,11,0,2,14,4,3,0,13,6,5,26,0 -19907,2,6,0,9,1,3,9,2,3,0,13,2,0,41,0 -19908,0,7,0,15,3,0,3,5,2,0,3,13,3,8,0 -19909,9,1,0,10,2,6,8,0,0,1,11,19,4,38,1 -19910,0,6,0,11,1,5,13,3,3,0,2,7,5,7,0 -19911,1,0,0,6,0,0,2,2,0,1,6,11,0,19,0 -19912,4,8,0,2,0,0,0,2,4,0,3,11,5,7,0 -19913,7,0,0,7,5,3,0,5,1,1,7,5,3,22,1 -19914,5,2,0,0,3,1,4,2,1,1,9,5,4,4,1 -19915,9,2,0,3,4,3,4,4,1,0,3,20,1,33,1 -19916,4,2,0,13,0,1,1,5,4,0,4,9,1,29,0 -19917,0,5,0,14,0,0,11,5,3,0,8,20,2,10,0 -19918,0,2,0,6,2,0,1,3,0,1,11,11,0,10,0 -19919,7,3,0,13,2,5,5,3,1,1,18,10,0,6,1 -19920,2,1,0,8,5,1,4,5,4,1,9,8,2,13,1 -19921,0,3,0,11,5,5,12,2,0,1,6,0,1,4,0 -19922,0,0,0,0,3,4,13,3,1,0,8,11,0,20,0 -19923,0,1,0,6,2,4,1,4,2,1,13,5,5,4,0 -19924,0,5,0,6,2,5,11,2,0,1,8,6,5,3,0 -19925,9,1,0,13,6,5,5,2,3,1,10,17,1,33,0 -19926,2,2,0,7,0,5,10,2,3,1,3,11,0,32,0 -19927,5,4,0,11,6,4,12,0,4,1,2,11,1,25,0 -19928,3,8,0,14,3,5,14,2,0,0,8,19,0,33,0 -19929,0,2,0,4,0,6,5,2,4,0,17,16,5,2,0 -19930,2,2,0,7,0,3,0,5,4,0,13,11,1,26,0 -19931,10,5,0,11,4,4,6,3,2,1,17,9,1,27,0 -19932,1,5,0,2,2,1,14,5,4,1,13,6,2,38,0 -19933,0,7,0,12,5,6,3,3,0,0,13,18,1,38,0 -19934,2,7,0,0,2,0,7,0,1,1,4,18,2,23,0 -19935,4,6,0,5,4,2,0,1,0,1,18,10,2,32,1 -19936,1,4,0,0,3,4,8,0,2,0,15,11,3,37,0 -19937,8,0,0,10,2,6,4,0,0,0,3,1,3,34,1 -19938,4,4,0,3,0,6,5,4,2,0,4,7,0,31,0 -19939,5,6,0,6,6,5,3,4,0,0,13,11,5,23,0 -19940,2,2,0,15,6,6,14,1,4,1,8,2,0,35,0 -19941,3,7,0,9,1,4,5,3,2,0,13,6,0,32,0 -19942,6,4,0,7,5,6,10,2,2,0,0,2,3,25,0 -19943,10,7,0,8,3,6,3,1,3,0,4,18,3,5,0 -19944,1,8,0,0,3,6,0,0,4,0,2,0,4,19,0 -19945,2,8,0,0,4,4,9,1,4,1,13,14,0,14,0 -19946,6,8,0,10,0,4,7,3,2,0,13,11,3,22,0 -19947,8,5,0,3,1,1,7,3,3,1,9,7,4,4,1 
-19948,8,7,0,10,4,4,9,4,0,1,4,10,5,24,0 -19949,0,6,0,11,5,6,12,1,3,0,6,11,3,2,0 -19950,5,7,0,1,6,0,6,1,1,1,3,15,0,5,0 -19951,7,7,0,10,6,5,6,4,4,0,12,6,5,1,0 -19952,2,2,0,7,0,4,0,3,0,0,8,9,1,14,0 -19953,4,6,0,7,4,4,10,5,2,1,9,12,5,10,1 -19954,9,0,0,7,6,2,10,2,2,0,8,15,5,3,0 -19955,7,5,0,3,6,5,4,2,1,1,9,10,2,29,1 -19956,1,2,0,13,6,0,1,1,0,0,13,6,2,4,0 -19957,5,6,0,15,0,1,7,5,3,0,4,20,0,31,0 -19958,0,2,0,3,0,0,7,1,4,1,3,2,0,30,0 -19959,5,0,0,13,6,5,8,3,0,1,7,12,0,5,0 -19960,0,2,0,13,1,0,1,4,0,0,6,9,0,33,0 -19961,0,6,0,1,6,5,11,5,2,0,13,0,5,40,0 -19962,3,7,0,1,3,3,7,4,2,0,8,9,4,7,0 -19963,9,0,0,7,5,2,3,0,4,0,7,8,1,41,1 -19964,2,6,0,3,6,3,3,5,4,1,13,13,1,27,1 -19965,6,4,0,6,6,5,14,0,2,0,0,2,0,22,0 -19966,10,1,0,10,6,0,12,5,4,1,5,18,5,3,1 -19967,2,5,0,0,4,6,4,5,3,1,10,8,2,31,1 -19968,0,2,0,6,4,6,0,0,0,0,4,15,0,26,0 -19969,6,8,0,2,5,5,5,2,1,0,17,2,0,23,0 -19970,0,1,0,6,3,4,11,2,2,1,10,15,5,38,0 -19971,4,6,0,2,0,5,4,3,0,0,12,11,2,40,0 -19972,7,1,0,2,0,3,7,3,2,0,8,2,5,12,0 -19973,7,7,0,0,2,0,1,4,0,0,8,2,1,19,0 -19974,3,1,0,7,1,5,8,3,0,1,13,11,1,32,0 -19975,2,2,0,6,5,4,10,1,3,0,0,0,5,14,0 -19976,5,4,0,2,2,4,5,3,0,0,6,9,4,11,0 -19977,2,5,0,12,6,2,14,5,1,0,3,10,1,20,1 -19978,4,5,0,2,6,5,7,5,0,0,0,6,0,9,0 -19979,10,4,0,7,2,3,0,5,4,0,6,2,1,25,0 -19980,4,0,0,3,0,0,5,0,0,1,7,20,3,1,0 -19981,0,4,0,15,5,6,13,2,4,0,8,13,5,9,0 -19982,5,5,0,12,0,0,6,1,3,0,8,6,0,20,0 -19983,3,8,0,0,0,4,10,2,4,0,2,16,5,32,0 -19984,9,1,0,8,2,6,0,0,0,1,6,17,1,9,1 -19985,3,4,0,6,3,1,7,3,3,1,9,9,0,1,1 -19986,1,6,0,11,6,2,11,1,2,0,6,11,5,21,0 -19987,9,3,0,12,1,4,0,4,4,0,2,20,5,5,0 -19988,1,6,0,7,2,4,6,4,4,0,12,11,5,14,0 -19989,3,2,0,0,0,5,10,2,0,0,8,10,3,19,0 -19990,2,5,0,14,6,3,14,0,2,1,13,15,4,35,0 -19991,4,8,0,6,0,1,1,3,4,0,2,16,2,30,0 -19992,7,1,0,3,4,5,11,4,3,1,13,11,3,9,0 -19993,1,1,0,5,0,1,9,4,0,0,17,2,1,5,0 -19994,4,4,0,13,6,4,12,3,1,0,16,15,2,4,0 -19995,6,5,0,2,6,4,4,3,3,1,13,3,4,37,0 -19996,10,2,0,13,0,0,9,3,2,0,0,3,0,0,0 -19997,7,6,0,13,1,4,13,2,3,0,13,18,1,24,0 
-19998,9,0,0,14,2,0,12,0,0,1,18,7,3,28,1 -19999,1,1,0,2,1,4,7,3,0,0,2,18,0,25,0 -20000,9,3,0,14,5,0,2,2,2,1,4,2,5,29,0 -20001,0,2,0,1,5,5,0,2,3,1,16,17,5,24,0 -20002,3,1,0,8,6,5,4,0,1,1,16,19,4,4,1 -20003,1,6,0,5,6,1,11,3,3,1,5,11,2,17,0 -20004,9,1,0,5,5,2,1,0,1,0,5,8,4,41,1 -20005,6,1,0,6,2,3,7,3,2,1,13,9,1,21,0 -20006,2,0,0,7,1,5,12,1,0,1,3,14,1,2,1 -20007,6,4,0,2,2,6,6,2,2,0,7,2,5,23,0 -20008,0,5,0,0,0,6,8,2,4,1,17,11,3,0,0 -20009,6,7,0,8,4,4,0,3,4,0,11,11,2,38,0 -20010,0,2,0,6,3,2,1,3,0,0,8,18,1,36,0 -20011,3,0,0,3,0,3,5,3,1,1,13,11,0,3,0 -20012,7,1,0,1,1,3,5,0,4,1,5,18,1,29,1 -20013,0,4,0,2,0,3,13,2,0,0,12,11,3,0,0 -20014,2,6,0,5,0,4,4,4,1,1,13,11,4,14,0 -20015,7,3,0,6,1,4,11,1,4,1,13,6,2,34,0 -20016,8,4,0,12,0,0,1,2,0,1,0,2,2,5,0 -20017,6,7,0,2,2,4,11,5,3,1,5,5,3,22,1 -20018,10,2,0,12,3,6,8,2,4,1,15,14,0,5,0 -20019,9,2,0,4,3,4,4,2,3,1,15,15,0,32,0 -20020,4,2,0,13,4,4,14,4,4,0,10,15,3,37,0 -20021,6,3,0,1,1,4,1,5,0,1,6,12,5,30,1 -20022,10,6,0,7,4,6,7,1,2,1,9,7,0,12,1 -20023,3,3,0,2,6,0,7,2,2,1,13,0,0,26,0 -20024,2,8,0,5,0,6,3,0,0,0,13,20,1,35,0 -20025,8,8,0,12,5,3,10,3,0,1,2,9,1,25,0 -20026,4,7,0,2,3,0,14,5,2,0,0,1,5,34,0 -20027,6,4,0,10,0,2,10,5,2,0,11,7,3,36,1 -20028,10,7,0,11,1,0,0,4,2,1,12,2,4,3,0 -20029,10,5,0,13,1,5,8,3,4,0,0,11,5,12,0 -20030,3,5,0,8,0,1,14,2,1,1,17,6,1,0,0 -20031,9,3,0,8,2,2,12,0,1,1,4,11,0,4,0 -20032,1,8,0,3,4,4,3,4,3,0,8,6,5,28,0 -20033,2,8,0,14,0,0,9,3,4,1,2,13,0,23,0 -20034,7,8,0,8,4,5,6,0,0,0,9,13,2,12,1 -20035,0,2,0,9,3,2,10,1,4,0,4,6,3,4,0 -20036,9,8,0,5,1,5,5,3,3,1,4,11,0,22,0 -20037,1,4,0,7,1,5,10,3,4,0,2,13,4,6,0 -20038,10,4,0,5,4,3,5,4,2,0,2,4,4,3,0 -20039,0,0,0,13,6,2,5,4,1,1,13,9,1,9,0 -20040,5,2,0,2,0,3,8,2,0,0,13,15,3,20,0 -20041,0,2,0,14,6,0,14,3,0,1,13,11,4,3,0 -20042,1,2,0,3,4,3,0,1,4,1,8,9,2,10,0 -20043,6,7,0,10,1,1,11,1,3,1,7,14,0,20,1 -20044,1,4,0,0,3,6,4,4,4,1,16,10,2,32,1 -20045,1,7,0,12,1,0,7,3,0,0,13,12,4,4,0 -20046,8,1,0,12,2,1,13,0,0,0,16,9,1,24,1 -20047,7,0,0,0,3,5,0,1,0,0,4,20,0,16,0 
-20048,4,8,0,6,5,6,10,3,1,1,17,15,4,39,0 -20049,0,2,0,13,0,6,8,3,1,0,2,0,0,30,0 -20050,5,5,0,9,1,0,3,5,0,1,9,5,3,30,1 -20051,10,4,0,7,3,3,3,2,1,1,16,15,0,38,1 -20052,0,5,0,7,2,6,1,3,3,0,17,5,0,30,0 -20053,6,5,0,10,1,4,6,0,3,1,9,8,2,20,1 -20054,4,8,0,8,6,1,12,5,0,0,8,15,0,5,0 -20055,0,4,0,2,0,0,0,3,0,1,13,11,1,23,0 -20056,3,2,0,3,0,0,7,3,3,0,8,2,0,22,0 -20057,1,3,0,6,5,6,8,4,4,0,5,9,1,0,0 -20058,2,8,0,13,1,3,5,1,4,1,15,17,1,1,0 -20059,5,8,0,2,4,5,11,0,3,1,2,2,1,41,0 -20060,10,1,0,6,0,5,11,1,4,1,11,18,4,4,0 -20061,2,5,0,13,0,0,12,3,1,0,0,18,5,30,0 -20062,1,2,0,5,0,0,9,4,3,0,2,5,5,4,0 -20063,9,7,0,14,1,1,1,0,2,0,17,6,5,22,1 -20064,0,8,0,3,2,5,6,1,3,0,8,19,0,20,0 -20065,8,8,0,1,1,6,13,4,4,0,10,20,1,18,1 -20066,2,7,0,1,5,6,9,2,4,1,2,2,3,40,0 -20067,2,0,0,13,4,5,4,2,0,0,2,19,4,9,0 -20068,9,1,0,5,6,4,10,2,0,1,14,4,0,27,0 -20069,5,8,0,1,1,4,9,2,0,1,14,2,3,28,0 -20070,8,0,0,15,1,5,12,5,0,1,3,14,5,26,1 -20071,2,5,0,4,6,1,5,3,0,1,16,5,1,16,1 -20072,4,0,0,11,4,6,1,0,3,0,12,15,1,5,0 -20073,1,7,0,9,1,6,1,4,0,0,16,6,1,13,0 -20074,1,8,0,3,1,3,2,2,0,1,13,13,2,25,0 -20075,5,8,0,5,5,2,3,1,2,1,6,0,1,30,0 -20076,9,4,0,1,4,3,13,5,1,1,18,10,3,14,1 -20077,10,0,0,0,0,6,9,4,3,0,2,6,4,9,0 -20078,3,8,0,3,0,2,1,4,1,1,2,0,0,29,0 -20079,5,2,0,14,3,3,12,1,1,0,12,9,1,17,0 -20080,4,6,0,1,4,4,13,0,0,1,10,17,3,36,1 -20081,4,7,0,0,3,5,3,2,4,1,2,15,3,13,0 -20082,7,8,0,12,2,0,12,4,2,1,7,17,4,17,1 -20083,7,6,0,5,5,1,3,0,1,1,12,8,5,20,1 -20084,6,2,0,10,6,5,2,2,0,1,18,10,4,20,1 -20085,5,3,0,4,5,3,13,0,1,1,16,12,4,40,1 -20086,10,7,0,6,6,1,8,0,2,1,18,16,3,7,1 -20087,1,0,0,12,4,0,10,4,4,1,4,20,0,0,0 -20088,0,8,0,15,6,4,9,1,2,1,17,11,1,40,0 -20089,8,3,0,3,2,2,11,0,2,0,2,6,2,20,0 -20090,5,8,0,13,2,3,2,2,1,0,6,16,2,35,0 -20091,10,2,0,1,0,5,10,4,4,1,13,8,5,21,0 -20092,1,1,0,11,2,4,0,3,2,1,15,18,4,32,0 -20093,9,7,0,1,5,4,4,3,1,0,9,2,1,3,0 -20094,3,3,0,5,2,3,4,4,0,0,6,9,2,32,0 -20095,7,7,0,13,3,2,8,0,4,1,13,11,1,35,0 -20096,1,8,0,0,6,4,12,3,1,0,6,18,4,9,0 -20097,2,7,0,1,1,1,9,0,2,0,2,10,1,20,0 
-20098,0,3,0,5,0,0,8,1,0,1,17,18,1,4,0 -20099,2,8,0,0,5,1,13,3,2,1,8,19,1,1,0 -20100,10,5,0,14,4,4,14,3,2,0,2,2,4,22,0 -20101,10,6,0,10,0,0,3,4,3,1,18,8,2,31,1 -20102,10,8,0,13,2,6,4,1,0,0,4,6,2,27,0 -20103,1,6,0,2,6,2,6,1,0,0,0,18,0,9,0 -20104,8,0,0,6,6,2,5,5,4,1,3,5,3,24,1 -20105,3,0,0,7,6,0,12,0,1,0,13,11,3,5,0 -20106,2,2,0,13,2,3,13,1,4,0,6,9,3,31,0 -20107,6,2,0,11,4,3,3,4,4,1,8,9,0,10,0 -20108,0,3,0,0,3,3,8,3,0,1,8,15,1,37,0 -20109,5,8,0,14,6,5,5,0,2,1,8,15,0,1,0 -20110,1,3,0,0,5,4,3,3,4,0,13,11,0,10,0 -20111,6,3,0,4,4,3,1,5,0,1,8,2,1,37,0 -20112,10,5,0,9,1,5,12,5,0,1,5,1,0,24,1 -20113,3,6,0,15,2,6,8,3,3,0,17,17,4,37,0 -20114,7,5,0,14,1,4,8,2,4,0,18,10,5,19,1 -20115,2,0,0,12,6,5,8,1,0,0,13,11,5,28,0 -20116,2,7,0,4,0,4,5,3,0,0,11,20,1,12,0 -20117,2,5,0,12,1,3,0,4,4,0,1,9,1,0,0 -20118,0,4,0,3,5,0,0,2,2,0,7,0,2,19,0 -20119,0,5,0,5,5,3,8,4,2,1,17,2,1,25,0 -20120,3,3,0,15,1,1,12,3,0,0,17,15,1,41,0 -20121,5,7,0,12,1,3,9,2,1,0,17,13,0,11,0 -20122,4,4,0,15,5,6,6,5,0,1,8,11,2,36,0 -20123,6,5,0,13,6,3,2,0,3,0,14,7,2,39,1 -20124,10,5,0,10,0,1,1,4,0,1,18,7,5,35,1 -20125,7,6,0,12,1,2,13,5,2,1,3,5,1,41,1 -20126,9,6,0,1,4,0,13,2,0,1,15,19,3,24,0 -20127,1,1,0,7,4,5,4,1,2,1,13,9,1,31,0 -20128,8,8,0,11,1,0,2,0,4,0,13,9,2,39,0 -20129,0,1,0,3,2,5,7,2,2,1,17,2,2,35,0 -20130,4,0,0,11,1,1,7,1,0,1,9,9,2,6,1 -20131,6,3,0,15,5,4,7,4,3,0,9,20,1,13,0 -20132,1,3,0,11,3,0,0,2,3,1,13,8,1,19,0 -20133,10,5,0,13,5,4,6,1,4,1,2,16,4,27,0 -20134,4,5,0,2,6,2,10,2,0,0,13,9,4,26,0 -20135,2,4,0,8,1,0,8,3,1,0,2,17,4,37,0 -20136,1,5,0,13,6,6,8,5,2,0,17,13,3,4,0 -20137,8,2,0,12,6,4,2,3,4,0,13,0,2,4,0 -20138,0,7,0,0,5,3,13,2,4,1,0,13,5,27,0 -20139,0,3,0,9,6,4,6,2,3,0,17,6,3,32,0 -20140,6,5,0,1,1,0,6,1,1,1,10,4,0,5,0 -20141,7,4,0,5,1,1,9,2,2,0,2,16,0,5,0 -20142,0,1,0,7,6,6,14,1,1,0,18,18,2,10,1 -20143,0,0,0,6,0,6,8,3,2,0,13,2,3,35,0 -20144,6,7,0,9,1,2,11,4,3,1,10,5,3,7,1 -20145,8,2,0,3,3,3,5,5,3,1,9,19,3,34,1 -20146,7,4,0,3,4,1,13,4,3,1,18,1,4,2,1 -20147,0,3,0,1,1,6,11,3,2,1,0,9,1,28,0 
-20148,7,0,0,0,6,6,11,5,1,1,12,5,2,7,1 -20149,5,6,0,9,0,1,12,3,3,0,4,15,1,6,0 -20150,2,8,0,10,1,0,10,4,0,1,8,15,1,36,0 -20151,3,6,0,3,6,5,8,5,0,0,11,10,4,14,0 -20152,0,0,0,6,2,0,8,2,3,1,2,2,5,32,0 -20153,9,3,0,3,1,0,1,3,3,0,10,17,5,33,1 -20154,6,1,0,9,1,1,0,5,1,0,6,10,3,31,1 -20155,10,5,0,5,5,6,11,1,0,1,6,11,0,0,0 -20156,4,7,0,14,1,5,14,2,0,0,5,0,4,35,0 -20157,1,8,0,12,3,5,9,5,0,1,13,15,0,2,0 -20158,9,1,0,7,5,5,1,0,0,1,3,0,3,14,1 -20159,8,5,0,10,0,4,3,1,2,1,5,19,3,25,1 -20160,0,5,0,9,1,0,5,2,1,1,6,6,0,4,0 -20161,8,3,0,11,2,5,13,0,4,1,5,19,3,41,1 -20162,10,7,0,1,0,3,10,2,0,0,2,11,0,23,0 -20163,3,8,0,1,4,6,0,1,1,0,11,0,1,37,0 -20164,1,4,0,5,2,1,9,4,1,1,8,2,1,40,0 -20165,0,3,0,5,5,6,0,0,4,1,13,9,0,29,0 -20166,7,1,0,10,5,2,5,0,4,1,1,0,5,13,1 -20167,4,3,0,10,1,6,3,2,3,0,18,3,3,19,1 -20168,2,8,0,3,0,6,14,3,3,0,12,4,3,7,0 -20169,2,7,0,8,0,4,5,1,2,1,10,15,4,13,0 -20170,4,0,0,15,0,2,1,0,0,0,8,11,5,32,0 -20171,8,5,0,9,4,5,11,3,2,1,14,3,5,41,1 -20172,3,3,0,6,6,5,6,4,2,0,14,11,5,0,0 -20173,5,0,0,15,5,5,10,1,4,0,2,2,4,37,0 -20174,3,4,0,12,0,5,4,1,4,0,2,17,5,21,0 -20175,4,6,0,1,6,6,0,3,0,0,0,4,0,40,0 -20176,2,8,0,5,6,5,9,2,4,0,13,14,4,40,0 -20177,2,4,0,10,5,4,4,1,2,1,18,19,5,11,1 -20178,6,7,0,8,2,1,8,4,1,1,18,8,3,7,1 -20179,7,7,0,12,4,0,6,0,0,0,13,11,3,27,0 -20180,10,5,0,15,5,2,8,1,2,0,2,18,4,26,0 -20181,9,1,0,6,5,3,1,4,4,0,7,11,2,10,0 -20182,8,3,0,3,6,0,1,0,0,1,15,2,2,14,0 -20183,8,6,0,5,4,1,3,0,4,1,14,14,3,13,1 -20184,10,8,0,3,0,6,5,4,0,1,13,13,0,28,0 -20185,3,2,0,2,2,1,4,0,2,1,12,15,0,3,0 -20186,2,1,0,7,5,5,12,3,2,1,6,13,3,5,0 -20187,5,1,0,3,6,6,9,4,0,1,10,2,0,23,0 -20188,6,5,0,11,6,1,10,1,3,0,0,16,1,32,0 -20189,6,8,0,6,2,1,6,4,4,0,6,11,5,7,0 -20190,3,7,0,11,1,5,5,1,2,1,6,11,0,14,0 -20191,7,1,0,5,1,3,5,3,4,0,3,11,5,1,1 -20192,5,6,0,15,0,3,7,5,0,1,7,18,2,16,0 -20193,2,6,0,6,0,5,7,3,2,1,18,15,0,26,0 -20194,8,4,0,2,0,5,9,4,3,0,17,0,0,4,0 -20195,2,8,0,0,3,4,0,4,0,1,15,6,2,33,0 -20196,6,6,0,0,2,3,6,1,4,0,14,20,0,30,0 -20197,7,0,0,14,3,5,7,5,4,0,18,20,1,9,1 
-20198,8,1,0,11,6,4,5,3,3,1,8,6,5,28,0 -20199,1,0,0,7,1,4,8,1,4,0,2,11,1,28,0 -20200,9,5,0,9,2,0,3,0,3,1,12,3,1,17,1 -20201,9,8,0,12,2,4,6,5,0,0,6,9,4,31,0 -20202,1,4,0,6,5,0,4,2,4,0,4,11,1,0,0 -20203,4,4,0,13,1,6,7,3,4,1,13,8,5,14,0 -20204,6,8,0,8,3,2,2,1,2,1,13,11,3,37,0 -20205,7,3,0,6,1,6,12,0,3,1,9,16,2,41,1 -20206,3,8,0,0,6,1,10,2,0,0,2,18,1,41,0 -20207,10,7,0,7,0,3,7,4,1,1,11,4,1,38,0 -20208,8,6,0,5,1,2,14,2,4,0,8,11,2,20,0 -20209,2,3,0,15,4,4,1,1,1,0,0,13,3,3,0 -20210,0,2,0,4,3,6,13,4,3,0,4,11,5,6,0 -20211,8,4,0,10,3,2,1,5,2,1,3,19,3,12,1 -20212,7,1,0,4,3,3,13,5,1,1,3,5,1,16,1 -20213,2,1,0,1,6,1,12,1,0,1,14,18,2,6,1 -20214,3,0,0,13,2,6,14,3,4,0,9,5,2,8,1 -20215,3,0,0,2,3,2,14,5,1,1,11,3,2,25,1 -20216,6,3,0,14,6,1,14,2,3,0,2,2,3,36,0 -20217,6,7,0,0,1,6,10,2,0,0,0,11,5,40,0 -20218,8,4,0,13,2,4,3,3,0,0,4,11,5,10,0 -20219,6,6,0,2,1,1,6,1,2,1,7,17,3,20,1 -20220,8,2,0,9,0,2,13,0,4,0,7,7,5,21,1 -20221,10,4,0,14,0,1,13,5,0,1,5,2,5,26,0 -20222,3,0,0,11,0,5,5,2,4,1,13,3,0,38,0 -20223,2,2,0,13,6,6,12,3,1,1,2,11,5,25,0 -20224,10,1,0,0,1,5,1,5,0,0,9,11,2,26,0 -20225,2,5,0,11,1,3,14,3,0,0,13,0,2,36,0 -20226,2,0,0,0,1,5,12,1,0,1,13,9,5,39,0 -20227,4,8,0,9,0,2,13,1,3,0,13,15,2,24,0 -20228,4,4,0,13,1,4,8,1,4,0,8,13,2,18,0 -20229,3,0,0,5,6,5,9,4,2,1,8,1,0,18,0 -20230,3,0,0,13,0,2,13,5,4,0,2,18,2,22,0 -20231,0,1,0,2,4,5,9,4,0,0,13,11,5,40,0 -20232,7,0,0,2,3,3,11,4,2,1,5,10,5,32,1 -20233,5,1,0,5,3,0,13,0,2,1,9,17,4,7,1 -20234,1,3,0,3,6,1,4,5,2,0,10,11,0,40,0 -20235,8,8,0,1,5,2,2,5,1,0,18,1,2,7,1 -20236,3,3,0,3,3,5,4,2,3,0,6,13,4,34,0 -20237,0,0,0,3,1,4,0,1,3,0,3,20,0,5,0 -20238,3,2,0,7,6,0,8,1,4,1,2,2,5,36,0 -20239,4,7,0,0,0,5,14,4,0,0,13,11,1,2,0 -20240,3,8,0,5,5,4,8,5,3,0,8,20,1,2,0 -20241,8,3,0,1,6,2,13,5,2,1,0,5,3,16,1 -20242,0,4,0,11,0,6,7,3,0,0,2,5,5,8,0 -20243,0,4,0,15,0,1,14,1,2,1,10,6,1,35,0 -20244,2,0,0,1,6,2,6,3,4,1,4,11,4,29,0 -20245,5,2,0,5,5,5,8,5,3,1,8,16,4,30,0 -20246,1,8,0,3,6,1,7,3,0,0,8,0,1,7,0 -20247,5,2,0,3,0,5,7,0,4,1,15,17,2,10,1 
-20248,10,1,0,0,0,0,7,0,2,1,13,15,3,18,0 -20249,9,5,0,3,2,1,2,0,1,1,10,6,2,6,0 -20250,3,6,0,2,6,5,3,4,0,1,13,3,0,39,0 -20251,1,8,0,5,6,2,9,1,3,0,13,9,4,27,0 -20252,5,1,0,10,1,5,7,0,4,0,7,8,3,31,1 -20253,9,6,0,6,6,1,2,2,3,0,8,2,5,5,0 -20254,4,7,0,12,4,2,6,0,0,1,10,10,0,24,1 -20255,2,8,0,6,6,0,7,2,0,0,18,19,4,5,1 -20256,4,5,0,2,0,0,6,5,4,1,8,9,1,24,0 -20257,2,1,0,6,5,5,8,4,2,1,17,11,1,4,0 -20258,9,5,0,1,1,4,7,5,0,1,3,1,2,2,1 -20259,9,2,0,9,4,0,4,5,4,1,18,7,0,13,1 -20260,1,0,0,1,6,1,7,4,2,0,2,2,4,9,0 -20261,1,5,0,10,1,0,3,2,3,0,6,20,4,40,0 -20262,7,1,0,9,1,1,5,0,2,0,5,11,3,39,0 -20263,1,4,0,13,1,5,3,2,0,0,8,3,2,27,0 -20264,7,6,0,1,4,5,8,2,0,1,17,11,1,30,0 -20265,0,4,0,4,0,3,5,3,1,1,2,20,1,4,0 -20266,0,1,0,10,6,3,4,2,4,1,17,6,5,22,0 -20267,8,8,0,5,6,5,12,0,3,0,16,5,4,39,1 -20268,0,8,0,5,6,3,5,3,2,0,8,2,4,31,0 -20269,7,3,0,11,1,5,9,2,1,0,16,13,5,38,0 -20270,0,2,0,13,3,6,0,4,3,0,1,20,5,37,0 -20271,1,7,0,5,2,6,9,4,1,1,16,10,1,18,0 -20272,6,7,0,10,5,4,13,2,4,1,9,5,1,8,1 -20273,2,3,0,10,2,1,12,0,3,1,5,19,2,28,1 -20274,7,1,0,1,0,1,2,3,2,1,8,5,4,25,0 -20275,3,7,0,8,5,3,7,0,1,0,13,13,0,29,0 -20276,6,3,0,10,6,0,8,0,2,1,2,16,2,38,0 -20277,9,5,0,0,3,2,0,3,1,0,10,5,5,6,0 -20278,1,4,0,10,4,3,9,5,3,1,0,17,3,6,1 -20279,5,5,0,15,0,4,14,4,2,1,6,13,0,30,0 -20280,2,4,0,11,1,6,9,3,3,0,7,9,4,35,0 -20281,3,6,0,5,1,0,14,0,2,1,14,19,4,8,0 -20282,4,5,0,0,0,4,13,3,0,0,8,2,1,9,0 -20283,3,6,0,15,0,5,13,2,0,0,10,3,4,26,0 -20284,5,2,0,15,1,1,14,2,2,1,11,2,4,2,0 -20285,10,5,0,12,4,2,2,1,2,0,17,17,0,14,0 -20286,3,4,0,0,0,3,8,3,3,1,2,0,0,22,0 -20287,0,8,0,4,1,4,5,3,4,1,2,11,3,38,0 -20288,10,2,0,13,4,0,5,2,0,1,6,0,3,3,0 -20289,0,2,0,6,2,0,0,1,2,0,5,5,0,12,1 -20290,2,7,0,0,5,1,11,1,0,0,12,2,2,23,0 -20291,5,8,0,9,5,1,8,1,1,1,1,11,1,14,1 -20292,0,5,0,9,2,1,0,3,0,1,2,3,2,26,0 -20293,7,5,0,11,1,4,8,4,0,1,12,2,5,32,0 -20294,0,2,0,0,1,0,14,2,4,0,2,11,3,5,0 -20295,0,4,0,14,0,5,8,0,2,1,16,8,1,6,1 -20296,3,4,0,15,2,3,5,0,2,1,16,10,4,2,1 -20297,2,8,0,12,6,4,8,0,1,1,6,19,2,21,0 
-20298,7,4,0,3,3,0,4,2,4,0,8,16,3,26,0 -20299,1,5,0,13,1,0,8,3,1,0,13,15,0,27,0 -20300,10,6,0,3,2,4,6,2,2,0,8,15,0,5,0 -20301,6,6,0,0,3,1,4,2,2,0,16,10,1,21,1 -20302,4,0,0,5,0,4,2,3,2,0,1,2,0,0,0 -20303,2,6,0,13,1,3,1,0,1,0,9,10,1,40,1 -20304,8,6,0,4,3,0,0,4,3,1,16,8,0,9,1 -20305,7,7,0,1,2,5,7,2,1,1,0,4,0,22,0 -20306,10,0,0,6,2,5,8,1,4,0,11,11,0,39,0 -20307,3,4,0,13,3,4,5,3,1,0,8,6,2,20,0 -20308,7,0,0,4,2,0,13,1,2,1,17,11,5,5,0 -20309,7,4,0,11,3,6,2,2,3,1,9,6,1,32,0 -20310,0,7,0,6,6,3,0,5,1,1,16,0,4,38,0 -20311,7,7,0,0,5,5,1,0,1,1,10,2,5,41,0 -20312,7,5,0,10,2,0,9,0,1,0,15,9,0,41,0 -20313,6,2,0,12,1,5,7,5,2,1,9,19,4,18,1 -20314,5,8,0,0,0,0,13,2,0,1,17,15,5,33,0 -20315,8,0,0,2,2,5,8,0,1,1,10,18,2,35,0 -20316,9,2,0,12,0,4,1,0,4,1,15,15,3,4,0 -20317,1,4,0,11,6,2,0,0,0,1,11,12,4,13,0 -20318,10,1,0,5,3,6,6,0,3,1,12,11,2,3,0 -20319,3,4,0,0,1,3,13,1,1,1,8,15,0,35,0 -20320,3,6,0,14,0,0,8,2,4,1,10,6,1,3,0 -20321,1,5,0,15,5,4,11,2,1,0,2,2,4,30,0 -20322,0,3,0,3,5,0,2,5,4,1,3,19,4,29,0 -20323,7,2,0,0,3,5,12,3,0,1,12,20,3,29,0 -20324,4,8,0,7,2,1,12,1,1,0,7,10,1,8,1 -20325,5,7,0,0,1,2,2,1,2,1,11,7,3,20,1 -20326,9,7,0,5,5,3,3,0,3,0,4,13,3,28,0 -20327,3,6,0,2,0,6,7,4,4,0,15,2,1,4,0 -20328,1,0,0,4,0,6,3,3,3,0,6,15,1,40,0 -20329,3,7,0,7,6,4,7,3,2,0,8,19,0,38,0 -20330,1,7,0,1,0,0,12,0,0,1,6,17,1,35,0 -20331,10,1,0,15,5,6,14,1,0,1,3,12,3,13,1 -20332,5,5,0,10,3,0,14,0,2,1,16,5,2,32,1 -20333,2,2,0,3,4,4,8,3,0,0,12,2,5,31,0 -20334,8,0,0,13,1,4,11,4,2,0,17,2,2,6,0 -20335,0,3,0,3,1,5,9,5,2,0,2,0,4,13,0 -20336,2,5,0,8,2,1,5,3,4,1,17,15,3,5,0 -20337,6,0,0,12,3,5,8,5,4,1,16,3,3,4,1 -20338,1,5,0,1,3,6,7,0,1,1,12,18,3,31,0 -20339,4,5,0,1,4,0,7,4,2,1,2,2,4,2,0 -20340,7,0,0,15,1,4,0,1,1,1,15,11,2,28,0 -20341,7,7,0,13,0,6,9,4,3,1,15,9,4,14,0 -20342,7,1,0,4,2,6,14,2,1,0,11,5,3,27,1 -20343,8,3,0,10,6,5,11,4,0,1,9,5,5,33,1 -20344,1,7,0,6,2,5,11,4,1,1,8,2,5,24,0 -20345,0,8,0,14,4,4,7,4,0,0,0,13,0,18,0 -20346,0,6,0,13,1,6,12,1,0,1,12,13,3,38,0 -20347,2,8,0,11,0,0,9,4,0,0,0,2,2,7,0 
-20348,9,6,0,2,2,3,3,5,1,0,0,10,3,19,1 -20349,9,6,0,7,5,3,14,2,1,0,1,1,5,39,1 -20350,1,8,0,15,2,5,7,2,3,0,13,16,3,38,0 -20351,8,4,0,3,5,1,5,4,2,0,6,12,3,28,0 -20352,8,0,0,0,5,5,11,4,1,1,17,17,3,34,1 -20353,9,3,0,2,6,3,8,0,3,0,6,17,1,10,1 -20354,3,8,0,5,4,0,12,5,0,0,7,2,1,9,0 -20355,9,1,0,9,6,4,2,1,3,1,13,2,2,30,0 -20356,3,8,0,12,4,3,9,5,0,1,4,2,0,35,0 -20357,9,6,0,0,2,0,3,0,4,0,18,8,1,34,1 -20358,0,2,0,9,0,5,8,4,4,0,17,15,1,19,0 -20359,6,8,0,3,0,5,14,4,2,1,14,6,4,2,0 -20360,1,3,0,0,4,3,6,1,1,0,17,2,3,39,0 -20361,8,4,0,4,2,5,0,3,2,0,6,2,5,37,0 -20362,2,7,0,0,1,0,4,4,4,0,15,15,2,27,0 -20363,3,4,0,8,1,1,3,1,3,1,2,11,1,31,0 -20364,8,6,0,10,1,1,0,2,2,0,1,7,4,22,1 -20365,3,2,0,5,3,1,5,3,3,0,9,14,5,17,1 -20366,7,7,0,14,0,5,5,2,0,1,8,11,1,8,0 -20367,6,6,0,14,0,4,1,4,4,1,4,7,2,31,1 -20368,8,0,0,10,4,0,11,5,2,1,8,19,3,25,1 -20369,7,0,0,11,2,1,1,1,4,1,8,17,1,6,0 -20370,8,2,0,6,2,5,10,4,3,0,13,18,0,10,0 -20371,2,0,0,10,4,0,0,5,2,1,8,20,5,25,0 -20372,0,6,0,13,5,3,2,1,0,1,8,2,5,27,0 -20373,3,0,0,11,5,0,2,4,1,0,2,9,0,29,0 -20374,1,6,0,3,0,5,5,3,0,0,2,20,1,21,0 -20375,1,4,0,11,6,5,5,4,0,1,2,2,4,3,0 -20376,2,0,0,15,0,5,12,1,3,1,8,0,0,18,0 -20377,8,1,0,14,3,6,14,2,3,1,18,17,3,6,1 -20378,10,5,0,9,3,6,0,2,0,0,13,6,3,38,0 -20379,10,0,0,9,2,5,4,1,4,0,18,17,5,40,1 -20380,2,6,0,8,5,2,2,4,0,1,6,11,5,16,0 -20381,8,3,0,3,1,5,1,1,1,1,4,11,2,30,1 -20382,4,1,0,4,2,6,5,5,2,0,9,10,1,18,1 -20383,8,1,0,10,3,0,8,1,0,0,5,10,4,39,1 -20384,8,0,0,5,0,4,5,1,3,0,13,2,1,37,0 -20385,0,8,0,3,3,6,13,3,2,0,0,2,0,33,0 -20386,2,1,0,15,6,1,7,1,2,0,9,1,4,22,1 -20387,3,8,0,13,6,3,14,4,0,1,8,6,3,26,0 -20388,8,3,0,3,0,6,8,0,1,0,13,12,1,14,0 -20389,1,6,0,0,3,4,1,0,0,1,18,5,5,34,1 -20390,0,2,0,5,3,0,5,2,3,0,8,2,5,16,0 -20391,2,3,0,5,0,2,9,5,2,0,8,13,0,26,0 -20392,10,1,0,0,6,0,9,3,1,1,0,2,4,5,0 -20393,2,6,0,10,4,2,8,2,0,1,10,2,0,12,0 -20394,3,5,0,2,2,4,9,4,0,0,4,13,1,7,0 -20395,3,3,0,15,0,0,10,5,3,0,13,16,5,34,0 -20396,5,3,0,6,2,4,11,2,0,1,16,19,0,9,0 -20397,6,4,0,8,2,2,3,5,2,1,5,7,2,3,1 
-20398,7,8,0,3,1,0,1,4,4,1,2,9,2,28,0 -20399,5,1,0,5,2,6,3,5,1,1,1,3,2,25,1 -20400,4,0,0,8,4,1,9,3,0,1,17,17,3,34,0 -20401,0,7,0,4,5,2,14,1,0,1,8,2,0,26,0 -20402,8,4,0,3,2,2,0,4,3,1,13,11,0,23,0 -20403,1,0,0,11,0,5,11,4,4,1,13,10,3,9,0 -20404,8,2,0,5,6,6,9,4,4,0,10,15,5,35,0 -20405,3,8,0,4,1,5,12,1,4,1,17,18,5,32,0 -20406,7,2,0,0,4,3,4,1,1,1,9,10,4,19,1 -20407,10,1,0,14,1,4,9,1,1,0,8,6,4,7,0 -20408,0,6,0,2,0,1,2,4,2,0,8,2,0,21,0 -20409,9,2,0,5,2,5,10,3,0,0,2,6,2,28,0 -20410,0,3,0,7,4,5,5,4,3,1,12,2,5,20,0 -20411,7,7,0,7,2,0,8,3,1,0,9,11,1,2,0 -20412,0,1,0,3,2,5,5,4,2,0,8,18,2,0,0 -20413,0,5,0,14,3,1,14,3,3,1,2,20,2,11,0 -20414,9,6,0,5,3,5,9,3,1,0,7,13,3,22,0 -20415,7,5,0,15,5,1,1,4,3,1,9,8,1,1,1 -20416,3,5,0,3,1,0,0,2,3,0,12,2,4,36,0 -20417,0,3,0,15,1,1,14,0,2,0,16,20,4,4,0 -20418,1,0,0,3,5,2,6,2,1,1,2,15,5,20,0 -20419,8,3,0,4,1,5,11,3,2,1,9,2,3,32,0 -20420,2,3,0,14,1,3,8,3,1,0,2,4,5,33,0 -20421,9,1,0,4,1,0,0,0,1,1,18,14,4,10,1 -20422,2,7,0,3,6,1,10,3,4,0,1,7,0,41,1 -20423,0,2,0,7,6,3,6,3,0,0,13,3,3,41,0 -20424,3,0,0,0,5,5,0,2,3,0,13,19,0,39,0 -20425,10,1,0,10,1,4,7,1,0,1,5,9,1,36,0 -20426,4,2,0,2,4,2,0,2,0,0,8,13,0,24,0 -20427,0,7,0,0,0,5,10,4,4,1,13,0,4,35,0 -20428,9,7,0,0,5,4,4,1,3,1,12,7,0,33,1 -20429,1,5,0,3,2,3,9,5,4,1,8,11,4,7,0 -20430,9,0,0,15,2,1,3,2,3,1,8,2,3,10,0 -20431,0,1,0,10,0,0,4,2,2,0,6,2,1,41,0 -20432,0,7,0,3,6,3,5,2,3,1,13,0,3,4,0 -20433,3,8,0,7,0,3,0,2,3,1,0,2,5,17,0 -20434,6,3,0,10,4,4,0,5,0,1,15,10,2,28,1 -20435,10,2,0,4,2,0,0,1,2,1,10,16,0,9,0 -20436,10,2,0,6,5,1,3,2,0,0,4,7,3,28,1 -20437,2,2,0,6,1,3,9,0,0,0,13,11,3,25,0 -20438,3,3,0,12,2,3,8,2,0,1,1,20,2,37,0 -20439,3,5,0,9,4,5,13,0,3,0,13,9,0,17,0 -20440,9,4,0,7,3,6,13,2,4,0,9,10,5,4,1 -20441,4,4,0,2,2,6,2,5,2,0,1,19,0,36,1 -20442,1,8,0,10,4,6,9,3,2,1,9,12,3,25,1 -20443,4,0,0,10,0,3,3,2,0,1,12,1,1,24,1 -20444,1,3,0,15,6,3,2,4,4,1,2,9,3,40,0 -20445,6,1,0,0,1,0,7,3,4,0,8,9,1,31,0 -20446,1,1,0,14,5,4,4,5,3,0,8,13,3,11,0 -20447,1,7,0,11,3,3,6,4,3,0,17,20,2,37,0 
-20448,9,2,0,12,2,1,0,1,1,0,4,5,1,18,1 -20449,2,8,0,2,0,5,14,0,4,0,10,11,2,32,0 -20450,0,6,0,8,5,0,9,0,3,1,6,2,3,17,0 -20451,0,1,0,15,0,0,10,0,0,1,13,12,4,37,0 -20452,3,3,0,13,2,1,12,1,0,1,13,15,1,26,0 -20453,0,4,0,8,0,6,1,3,0,1,2,13,0,24,0 -20454,3,4,0,1,5,4,4,3,0,0,2,13,1,29,0 -20455,2,8,0,5,2,4,14,4,3,0,8,6,1,26,0 -20456,5,1,0,12,2,4,1,5,2,0,7,7,4,37,1 -20457,1,2,0,5,0,4,14,4,2,0,7,9,0,14,0 -20458,0,7,0,0,3,1,8,4,2,0,6,5,5,39,0 -20459,8,7,0,4,1,6,6,1,1,0,1,11,5,26,0 -20460,5,3,0,6,1,4,6,3,0,0,8,11,1,41,0 -20461,0,5,0,5,6,4,10,3,0,0,2,1,4,25,0 -20462,3,3,0,11,5,3,14,1,0,0,4,18,0,29,0 -20463,8,1,0,14,6,1,3,3,2,1,5,17,5,16,1 -20464,0,6,0,10,5,3,9,5,4,1,12,4,2,4,0 -20465,1,4,0,10,0,3,6,5,3,1,9,2,5,27,0 -20466,1,0,0,13,1,5,7,1,0,1,8,6,5,13,0 -20467,3,7,0,3,6,2,6,3,3,0,13,2,4,11,0 -20468,9,2,0,7,0,0,7,1,3,0,0,15,0,13,0 -20469,10,4,0,8,1,2,5,0,4,1,13,2,0,40,0 -20470,7,6,0,13,6,6,0,2,0,0,11,3,4,7,0 -20471,2,6,0,11,1,6,5,3,2,0,2,11,5,3,0 -20472,10,5,0,2,1,6,13,5,1,0,10,17,2,5,1 -20473,8,6,0,7,5,4,4,0,0,1,1,12,0,36,1 -20474,0,0,0,5,1,2,7,3,0,1,13,11,2,33,0 -20475,2,1,0,9,6,5,5,5,4,1,9,7,5,2,1 -20476,1,1,0,11,0,0,5,4,0,1,10,15,5,11,0 -20477,7,2,0,2,4,4,1,5,0,1,14,4,5,24,1 -20478,1,5,0,0,2,4,5,1,0,0,6,15,3,10,0 -20479,1,5,0,11,0,4,6,3,4,0,6,11,5,35,0 -20480,2,8,0,8,6,3,7,3,1,0,6,18,3,20,0 -20481,6,4,0,15,6,0,11,2,3,1,6,11,2,3,0 -20482,10,0,0,0,2,5,0,3,4,1,17,8,5,19,0 -20483,2,8,0,13,4,3,13,4,2,0,2,20,3,4,0 -20484,6,0,0,1,0,3,11,3,1,0,2,15,2,12,0 -20485,2,7,0,7,1,5,7,1,2,0,8,6,2,29,0 -20486,1,0,0,3,0,1,3,1,3,1,2,2,5,25,0 -20487,3,1,0,15,0,0,2,3,1,1,3,16,2,10,0 -20488,10,0,0,11,6,2,2,4,0,1,1,5,3,22,1 -20489,2,5,0,11,0,6,3,0,2,1,1,15,3,24,0 -20490,1,6,0,3,0,0,10,1,2,1,7,20,0,3,0 -20491,9,4,0,6,1,6,10,3,2,0,10,9,2,0,0 -20492,1,0,0,12,4,2,13,2,3,1,18,10,2,27,1 -20493,0,0,0,3,1,4,1,1,1,0,6,2,1,33,0 -20494,4,6,0,1,2,6,3,4,0,1,10,13,0,30,0 -20495,3,8,0,9,0,6,5,4,2,1,2,11,4,29,0 -20496,5,6,0,15,0,6,4,4,3,0,17,20,4,35,0 -20497,9,3,0,13,2,5,1,1,4,0,14,2,5,19,0 
-20498,1,6,0,1,6,6,10,4,1,1,8,2,4,33,0 -20499,5,1,0,8,5,1,5,0,2,1,8,4,5,29,0 -20500,9,0,0,8,6,2,1,1,2,0,18,1,4,13,1 -20501,7,1,0,12,3,6,11,0,4,0,15,7,5,26,1 -20502,0,5,0,8,5,0,11,1,2,0,13,10,2,21,0 -20503,9,2,0,15,0,5,6,1,3,0,10,13,0,9,0 -20504,3,1,0,14,2,0,11,3,3,1,0,15,5,2,0 -20505,0,8,0,3,4,3,2,1,3,0,2,16,1,22,0 -20506,4,5,0,14,6,1,12,2,3,0,1,16,5,5,1 -20507,3,1,0,4,6,0,12,3,2,0,15,20,4,41,0 -20508,10,2,0,3,0,3,12,0,1,0,10,20,2,36,0 -20509,2,1,0,15,1,5,10,0,3,0,13,11,5,10,0 -20510,4,7,0,5,3,2,6,1,2,1,13,11,1,36,0 -20511,0,1,0,1,5,1,13,3,4,1,13,3,0,34,0 -20512,8,5,0,13,6,0,13,5,3,1,10,11,1,7,0 -20513,10,6,0,3,0,1,14,4,1,1,12,17,1,23,1 -20514,3,6,0,9,4,0,3,3,1,1,7,4,0,14,0 -20515,7,2,0,4,6,4,14,3,2,0,2,2,1,20,0 -20516,7,1,0,5,4,2,8,2,2,1,3,10,0,8,1 -20517,8,7,0,5,2,5,2,0,0,0,13,2,2,22,0 -20518,0,7,0,13,0,4,14,2,2,0,2,2,2,17,0 -20519,2,6,0,14,0,3,11,1,3,0,16,0,5,37,0 -20520,2,5,0,9,1,0,2,5,1,0,12,17,5,0,1 -20521,6,0,0,3,5,3,3,5,0,0,13,0,1,2,0 -20522,4,3,0,13,6,1,8,1,2,0,13,6,0,14,0 -20523,3,5,0,2,0,1,11,4,2,0,12,16,5,5,0 -20524,0,3,0,2,5,5,14,3,2,0,2,13,0,35,0 -20525,4,8,0,6,6,0,10,2,1,1,18,8,3,3,1 -20526,7,8,0,10,0,3,13,5,4,1,18,13,3,4,1 -20527,5,7,0,14,5,6,3,3,0,0,13,9,4,19,0 -20528,2,4,0,0,1,3,8,3,2,0,15,0,3,6,0 -20529,3,7,0,8,0,2,5,4,4,0,13,2,5,38,0 -20530,2,6,0,0,6,5,12,0,4,1,16,12,3,11,0 -20531,2,7,0,2,0,0,14,3,2,0,13,6,1,19,0 -20532,4,6,0,13,0,5,14,3,1,1,8,2,3,23,0 -20533,8,3,0,3,0,0,3,5,1,0,2,13,0,6,0 -20534,2,8,0,3,6,0,0,5,1,0,13,12,5,28,0 -20535,1,6,0,13,5,3,7,4,3,0,1,0,0,38,0 -20536,2,4,0,4,0,4,2,2,4,1,4,18,1,32,0 -20537,2,7,0,5,3,2,10,3,0,0,8,17,3,10,0 -20538,8,1,0,1,6,1,0,4,1,0,11,5,4,18,1 -20539,4,8,0,15,5,5,3,3,4,1,2,0,0,37,0 -20540,1,2,0,2,1,6,9,3,0,1,4,15,0,31,0 -20541,9,1,0,12,1,1,5,0,0,0,1,7,0,40,1 -20542,6,7,0,15,2,5,5,1,2,1,13,2,4,19,0 -20543,2,8,0,8,0,5,2,4,4,1,4,7,3,12,1 -20544,0,4,0,9,2,1,1,4,3,0,2,16,2,40,0 -20545,1,1,0,3,3,6,14,1,4,1,0,13,0,39,0 -20546,9,0,0,12,6,1,5,0,2,1,6,7,4,29,1 -20547,5,7,0,3,4,4,6,4,4,0,2,18,3,25,0 
-20548,5,4,0,9,1,0,7,5,4,0,18,6,1,24,1 -20549,7,0,0,0,0,3,1,1,0,1,15,10,2,21,0 -20550,2,2,0,14,5,0,6,3,0,1,13,3,1,5,0 -20551,2,6,0,2,2,5,10,3,3,0,15,1,1,7,0 -20552,0,5,0,14,4,3,5,0,2,1,2,2,1,30,0 -20553,3,0,0,13,4,2,13,0,2,1,11,5,2,41,1 -20554,6,6,0,1,0,0,3,2,2,0,4,2,5,25,0 -20555,1,8,0,13,3,0,11,5,3,0,8,16,4,37,0 -20556,0,7,0,5,6,3,3,1,2,0,4,18,4,30,0 -20557,4,7,0,8,2,1,10,3,4,0,2,17,0,34,0 -20558,3,8,0,1,5,4,11,3,0,1,7,10,4,16,1 -20559,8,8,0,2,0,0,0,4,3,0,8,3,4,6,0 -20560,6,3,0,3,5,0,6,1,2,1,11,19,5,40,1 -20561,1,1,0,3,6,6,10,1,2,1,9,19,4,31,1 -20562,0,8,0,2,0,1,1,3,0,1,6,6,1,8,0 -20563,1,5,0,6,2,0,14,0,3,0,2,19,2,16,0 -20564,9,5,0,0,3,3,13,4,0,1,5,16,0,25,0 -20565,3,0,0,1,1,5,10,1,2,0,6,4,5,5,0 -20566,1,8,0,6,5,2,6,2,2,0,8,2,5,35,0 -20567,2,8,0,13,1,4,9,3,1,1,13,9,4,0,0 -20568,0,7,0,3,1,4,3,0,0,0,15,2,2,21,0 -20569,3,2,0,4,0,5,13,3,1,1,12,2,0,36,0 -20570,9,8,0,10,3,2,3,1,2,0,14,10,2,10,1 -20571,2,2,0,2,3,3,5,4,2,0,12,16,3,27,0 -20572,9,0,0,8,0,5,4,4,0,1,2,6,5,1,0 -20573,5,7,0,13,2,2,7,1,3,0,6,18,0,21,0 -20574,2,3,0,8,1,1,2,5,4,0,6,9,1,40,0 -20575,0,6,0,9,5,4,6,0,3,0,13,9,4,4,0 -20576,8,6,0,7,0,1,6,5,0,1,13,9,5,12,0 -20577,5,6,0,0,6,5,13,1,0,0,2,11,2,21,0 -20578,4,7,0,2,6,0,3,5,0,1,2,4,1,20,0 -20579,1,4,0,10,5,3,6,4,1,1,17,16,4,34,0 -20580,3,5,0,1,2,2,5,0,2,0,2,9,2,26,0 -20581,1,7,0,9,2,5,2,0,3,1,18,9,4,13,0 -20582,7,4,0,2,5,1,10,0,1,1,15,0,5,18,1 -20583,9,5,0,7,2,1,5,5,1,1,12,4,3,21,1 -20584,2,3,0,10,5,0,5,2,1,0,13,2,1,8,0 -20585,3,8,0,4,1,4,13,1,0,1,6,16,3,38,0 -20586,7,6,0,7,5,3,0,4,0,1,16,10,5,21,1 -20587,7,4,0,14,5,3,13,0,0,0,9,17,0,21,1 -20588,0,5,0,2,2,5,6,0,0,0,2,11,3,22,0 -20589,10,7,0,7,3,6,9,1,1,1,2,0,1,4,0 -20590,0,3,0,4,0,5,2,4,2,0,6,2,1,41,0 -20591,1,7,0,13,0,0,6,3,1,0,8,13,1,10,0 -20592,10,7,0,7,6,6,11,2,2,1,5,15,4,39,1 -20593,7,2,0,9,2,1,2,1,4,0,2,13,2,37,0 -20594,2,7,0,0,3,1,5,2,2,0,4,19,2,10,0 -20595,7,6,0,7,4,2,2,0,4,1,18,10,1,14,1 -20596,3,8,0,6,5,4,10,4,0,0,13,6,3,34,0 -20597,8,0,0,4,6,6,5,1,1,1,8,15,1,34,0 
-20598,2,2,0,1,6,1,0,1,4,0,3,0,2,6,0 -20599,0,3,0,13,1,6,1,2,0,1,11,11,5,39,0 -20600,5,5,0,3,6,0,12,4,1,1,10,2,2,11,0 -20601,9,2,0,1,5,2,1,4,1,0,7,6,5,31,0 -20602,10,2,0,7,5,1,9,0,0,0,16,10,4,17,1 -20603,9,2,0,6,1,3,2,4,4,0,8,1,4,28,0 -20604,0,6,0,6,3,6,1,2,0,1,2,11,5,6,0 -20605,2,4,0,14,5,3,0,4,0,0,13,11,5,25,0 -20606,8,5,0,8,4,2,8,4,1,1,7,1,2,21,1 -20607,2,4,0,2,4,5,5,1,0,1,12,17,2,31,0 -20608,2,1,0,7,2,0,9,5,1,1,1,10,5,16,1 -20609,8,4,0,0,4,4,10,5,1,0,15,5,0,10,1 -20610,1,1,0,1,6,2,4,1,1,1,14,5,5,11,1 -20611,10,2,0,2,3,1,10,3,4,1,2,15,2,20,0 -20612,7,1,0,7,4,3,12,5,2,1,9,1,2,8,1 -20613,4,3,0,6,0,5,10,5,3,0,2,2,0,14,0 -20614,0,0,0,6,2,4,0,3,4,1,6,11,4,25,0 -20615,10,8,0,0,2,1,3,3,1,0,9,19,3,2,1 -20616,0,6,0,0,0,6,14,2,1,0,11,2,2,17,0 -20617,2,7,0,5,4,2,2,5,4,1,4,12,0,35,0 -20618,2,0,0,4,2,4,5,3,3,1,4,2,4,24,0 -20619,1,4,0,6,6,4,0,1,0,1,4,11,0,2,0 -20620,10,5,0,2,2,1,2,2,3,0,2,20,1,36,0 -20621,0,2,0,9,3,0,13,0,2,0,8,15,1,16,0 -20622,6,6,0,12,0,3,1,4,0,0,8,9,0,38,0 -20623,0,8,0,14,5,1,1,3,3,0,14,11,0,20,0 -20624,0,1,0,9,5,5,7,4,3,0,4,2,0,30,0 -20625,6,0,0,6,2,4,9,3,2,0,17,2,0,20,0 -20626,1,1,0,12,3,4,2,1,1,1,17,4,3,36,0 -20627,0,5,0,6,4,0,4,4,0,1,6,15,1,7,0 -20628,4,6,0,5,1,5,2,4,0,1,2,11,3,16,0 -20629,3,7,0,4,1,0,9,3,0,1,8,6,2,7,0 -20630,0,2,0,11,1,5,9,3,0,1,4,6,3,20,0 -20631,10,8,0,2,0,0,3,3,0,0,13,16,3,0,0 -20632,2,6,0,6,4,0,4,5,2,1,17,6,5,19,0 -20633,0,3,0,1,0,4,7,4,1,0,5,17,1,21,0 -20634,7,0,0,3,2,5,14,0,3,0,5,8,0,35,1 -20635,5,8,0,15,0,6,5,4,3,0,17,14,1,1,0 -20636,0,0,0,7,5,5,11,4,1,1,13,2,2,27,0 -20637,0,6,0,13,3,4,9,5,3,0,16,8,2,18,0 -20638,2,1,0,9,5,2,10,5,4,1,1,8,0,0,1 -20639,0,6,0,0,3,4,8,3,2,0,8,2,2,32,0 -20640,7,2,0,9,6,6,7,5,0,0,1,13,1,14,0 -20641,1,4,0,4,1,6,4,3,3,1,2,15,0,4,0 -20642,6,8,0,14,5,5,10,1,4,1,9,1,1,9,1 -20643,9,0,0,3,1,5,0,4,1,0,3,18,2,19,0 -20644,6,4,0,3,2,0,3,1,4,0,2,11,3,27,0 -20645,0,2,0,13,5,3,11,3,3,0,7,11,1,37,0 -20646,0,1,0,8,5,5,8,0,1,0,0,11,3,17,0 -20647,7,2,0,3,0,5,5,1,1,0,13,15,0,19,0 -20648,8,1,0,1,2,0,2,5,1,0,14,7,5,24,1 
-20649,10,8,0,12,3,3,4,5,0,0,5,13,3,21,1 -20650,3,2,0,13,1,1,10,4,3,0,12,3,2,21,0 -20651,2,6,0,13,2,4,10,4,4,1,16,17,2,13,1 -20652,0,6,0,4,0,0,7,2,4,0,4,18,1,37,0 -20653,6,7,0,11,0,4,7,0,3,0,2,2,4,36,0 -20654,0,7,0,11,6,5,14,3,3,1,2,13,1,25,0 -20655,5,3,0,1,2,3,7,0,3,1,8,13,0,0,0 -20656,3,5,0,4,5,6,11,1,0,0,13,11,0,16,0 -20657,2,0,0,14,1,6,5,5,2,0,8,13,0,38,0 -20658,3,7,0,2,0,4,9,2,0,0,8,11,5,29,0 -20659,0,2,0,6,0,6,7,0,0,1,2,9,4,16,0 -20660,10,8,0,7,5,3,10,2,3,1,14,10,0,37,1 -20661,3,8,0,11,6,0,10,1,2,0,8,18,5,29,0 -20662,0,7,0,12,5,0,2,2,3,0,10,6,3,0,0 -20663,2,3,0,1,6,5,14,3,4,0,0,2,5,8,0 -20664,6,4,0,12,6,3,14,1,4,1,16,9,2,16,1 -20665,2,8,0,13,5,0,12,3,0,0,17,2,5,4,0 -20666,10,6,0,13,5,3,1,1,3,1,13,2,2,13,0 -20667,2,8,0,3,6,3,3,1,0,1,9,10,3,36,1 -20668,8,7,0,8,0,5,7,2,2,1,8,11,5,6,0 -20669,10,6,0,15,6,6,6,2,1,0,9,12,0,40,0 -20670,5,2,0,0,4,2,2,0,1,1,3,19,3,1,1 -20671,0,1,0,5,4,1,0,3,2,0,13,7,1,23,0 -20672,0,5,0,0,5,2,14,0,3,0,13,3,5,7,0 -20673,3,2,0,14,3,5,12,5,4,1,8,13,4,6,0 -20674,0,0,0,14,2,5,12,5,2,0,18,7,0,2,1 -20675,4,1,0,4,4,5,8,3,3,1,4,11,2,8,0 -20676,1,6,0,11,4,5,10,2,1,1,13,18,3,19,0 -20677,4,2,0,7,6,1,5,5,4,0,12,6,4,20,0 -20678,1,7,0,9,4,4,11,4,2,0,14,13,1,13,0 -20679,2,1,0,10,4,5,13,3,2,0,2,20,0,23,0 -20680,5,7,0,13,6,6,2,4,0,1,10,5,1,29,0 -20681,0,3,0,11,2,5,5,5,2,0,4,2,5,3,0 -20682,6,6,0,13,6,6,5,4,4,0,6,6,3,8,0 -20683,0,4,0,11,4,4,1,4,4,0,4,3,0,11,0 -20684,5,0,0,5,1,5,4,4,0,1,8,11,0,21,0 -20685,4,7,0,4,4,5,9,3,0,0,3,2,4,9,0 -20686,10,7,0,6,2,5,3,3,3,0,13,2,5,23,0 -20687,4,8,0,11,6,6,5,4,0,0,15,13,3,40,0 -20688,2,6,0,13,0,4,14,3,3,0,6,15,5,34,0 -20689,9,5,0,9,6,5,5,4,3,0,18,17,2,20,1 -20690,0,5,0,12,2,4,6,4,0,0,2,14,5,38,0 -20691,0,6,0,11,5,6,4,0,3,0,13,2,4,29,0 -20692,9,6,0,3,6,0,0,3,1,1,13,20,4,26,0 -20693,0,4,0,10,6,3,6,2,3,0,13,2,2,2,0 -20694,0,2,0,5,4,6,0,1,0,1,10,16,1,13,0 -20695,0,7,0,0,4,5,1,1,1,1,2,11,5,8,0 -20696,4,4,0,13,3,5,8,1,3,0,2,0,2,13,0 -20697,9,8,0,4,2,3,9,4,1,0,11,6,1,30,0 -20698,4,7,0,12,0,4,5,0,3,0,4,15,0,17,0 
-20699,5,0,0,1,5,1,14,3,1,1,9,7,5,2,1 -20700,3,1,0,10,5,1,13,2,2,1,11,19,0,32,1 -20701,2,8,0,11,3,5,1,1,1,0,2,6,0,32,0 -20702,0,2,0,5,3,2,2,5,4,1,10,19,5,20,0 -20703,2,6,0,1,5,6,5,1,4,1,2,6,4,23,0 -20704,9,7,0,15,5,6,6,1,2,1,4,9,5,35,0 -20705,3,7,0,6,0,6,9,4,0,1,17,13,2,17,0 -20706,6,6,0,6,2,5,6,1,0,0,16,7,2,8,0 -20707,7,0,0,11,6,2,11,0,2,1,2,10,4,13,1 -20708,3,8,0,0,0,6,3,2,4,1,7,14,1,23,0 -20709,1,7,0,2,5,4,5,1,4,0,17,0,2,1,0 -20710,1,6,0,4,5,1,9,2,4,1,13,4,5,20,0 -20711,10,8,0,7,5,0,14,5,0,0,9,7,2,22,1 -20712,0,0,0,13,5,0,6,1,4,1,13,11,4,6,0 -20713,0,4,0,3,0,5,12,3,0,0,2,9,0,5,0 -20714,2,4,0,13,2,3,9,1,4,0,2,11,4,19,0 -20715,2,6,0,1,4,6,3,4,0,0,2,11,0,24,0 -20716,2,5,0,14,6,1,12,4,0,1,0,0,5,14,0 -20717,3,6,0,2,5,4,10,0,0,0,14,11,0,35,0 -20718,0,6,0,13,0,5,12,4,3,0,13,15,2,27,0 -20719,1,8,0,5,0,4,11,3,2,0,4,11,1,22,0 -20720,2,3,0,8,4,2,5,3,3,1,8,2,3,41,0 -20721,2,2,0,5,5,0,5,2,4,1,13,11,1,20,0 -20722,2,6,0,15,0,0,13,1,0,0,8,9,3,33,0 -20723,2,1,0,4,0,6,5,5,0,1,8,11,2,27,0 -20724,4,0,0,15,1,2,5,3,3,1,8,4,0,26,0 -20725,5,5,0,0,0,3,7,0,0,1,13,15,2,27,0 -20726,0,6,0,3,6,4,10,3,3,1,12,11,0,23,0 -20727,9,3,0,12,5,3,4,0,2,1,0,15,4,3,0 -20728,2,2,0,15,2,0,10,3,0,1,4,17,4,40,0 -20729,0,0,0,5,0,4,0,1,4,0,0,11,1,16,0 -20730,0,7,0,3,4,3,3,1,2,1,9,11,5,18,0 -20731,2,8,0,9,3,1,8,4,1,1,13,11,1,41,0 -20732,3,7,0,9,6,0,13,3,0,1,7,9,1,4,0 -20733,3,1,0,1,1,4,9,2,1,1,10,0,2,6,0 -20734,5,5,0,1,1,4,1,3,4,1,17,9,1,10,0 -20735,1,3,0,13,1,5,3,1,2,1,8,20,3,30,0 -20736,7,5,0,7,1,6,8,5,4,0,11,10,4,17,1 -20737,3,8,0,4,1,4,13,2,2,0,13,15,2,6,0 -20738,7,0,0,15,4,2,11,0,4,0,15,8,4,7,1 -20739,9,2,0,2,2,2,12,4,0,1,7,5,1,14,1 -20740,8,0,0,8,2,4,7,1,4,0,8,9,0,20,0 -20741,10,6,0,15,3,6,9,3,4,0,2,13,0,27,0 -20742,0,5,0,3,3,0,8,3,1,0,10,0,1,32,0 -20743,3,1,0,3,2,1,1,2,2,0,2,2,1,21,0 -20744,7,7,0,0,2,2,8,3,3,0,14,19,2,13,1 -20745,3,3,0,7,6,6,3,2,3,0,5,19,1,25,1 -20746,7,8,0,0,4,3,13,0,4,1,5,5,5,34,1 -20747,6,7,0,13,3,4,8,3,2,1,2,11,2,38,0 -20748,8,4,0,2,1,5,2,2,1,0,2,2,5,13,0 
-20749,6,7,0,11,5,5,9,1,0,0,10,19,5,41,0 -20750,1,8,0,6,0,4,5,3,2,1,15,11,1,36,0 -20751,1,7,0,15,2,4,7,4,3,0,13,9,0,33,0 -20752,7,7,0,15,6,1,7,0,0,0,6,16,2,25,1 -20753,10,3,0,12,3,1,1,2,3,1,16,17,5,18,1 -20754,0,7,0,8,5,5,5,5,1,0,15,0,1,3,0 -20755,1,2,0,4,2,5,10,3,3,0,2,16,1,3,0 -20756,8,5,0,5,0,2,5,5,0,1,15,8,5,13,1 -20757,0,0,0,0,5,0,0,2,1,0,10,9,5,17,0 -20758,3,8,0,1,3,0,7,3,3,0,18,4,3,12,0 -20759,5,7,0,10,0,3,9,5,1,1,9,17,1,30,1 -20760,2,2,0,10,6,0,3,2,3,0,17,6,2,30,0 -20761,8,5,0,1,0,6,2,0,3,1,15,14,0,35,1 -20762,4,8,0,2,5,2,8,4,0,0,13,0,4,6,0 -20763,0,0,0,0,0,0,8,2,3,1,7,6,3,17,0 -20764,0,4,0,14,5,6,8,3,3,0,8,4,4,7,0 -20765,0,6,0,13,3,4,4,5,0,0,0,11,0,40,0 -20766,8,3,0,5,6,3,4,5,4,0,9,7,2,7,1 -20767,9,7,0,9,2,5,10,4,2,0,2,2,3,39,0 -20768,7,2,0,4,5,4,7,1,3,0,15,2,0,31,0 -20769,6,3,0,7,5,2,0,2,3,0,16,1,0,2,1 -20770,10,8,0,1,1,2,1,5,0,0,1,5,3,6,1 -20771,5,0,0,0,5,5,11,5,4,1,18,10,2,12,1 -20772,8,2,0,13,1,3,4,3,1,1,6,17,4,31,1 -20773,10,4,0,12,6,3,0,1,3,1,3,3,2,35,1 -20774,6,3,0,5,6,6,11,1,1,0,3,17,3,25,1 -20775,4,7,0,5,2,0,13,4,0,1,2,17,1,6,0 -20776,1,1,0,3,6,4,12,2,0,1,15,14,3,30,0 -20777,1,4,0,13,5,1,11,3,2,1,2,9,5,22,0 -20778,3,6,0,0,1,4,5,3,0,1,10,6,2,31,0 -20779,10,0,0,1,0,4,1,1,0,0,8,15,0,6,0 -20780,1,7,0,11,0,1,8,4,1,0,12,1,1,18,0 -20781,1,4,0,7,0,6,3,1,0,1,1,16,2,33,0 -20782,9,1,0,13,0,4,11,0,0,0,17,11,3,16,0 -20783,1,7,0,1,6,2,7,4,0,1,8,16,5,4,0 -20784,7,5,0,4,2,2,11,1,0,0,13,18,4,0,0 -20785,5,1,0,7,5,6,14,3,0,0,16,13,0,38,0 -20786,10,3,0,5,2,1,11,3,0,0,8,2,2,29,0 -20787,7,3,0,14,4,4,14,2,3,1,8,8,3,31,0 -20788,1,5,0,7,6,0,5,4,0,0,6,2,0,17,0 -20789,2,6,0,2,3,4,2,1,3,0,2,20,5,36,0 -20790,0,8,0,6,2,4,5,3,3,0,12,2,0,31,0 -20791,10,8,0,9,0,6,3,0,0,1,13,2,0,34,0 -20792,9,2,0,7,6,4,7,5,3,0,1,5,0,33,1 -20793,10,2,0,15,1,2,11,3,4,1,18,9,5,39,1 -20794,4,8,0,4,0,3,10,1,3,0,4,2,1,3,0 -20795,5,2,0,12,2,6,4,0,3,1,5,10,1,23,1 -20796,2,2,0,1,3,0,7,1,0,1,2,6,3,18,0 -20797,10,0,0,4,6,2,6,2,4,1,10,13,4,32,0 -20798,9,3,0,8,0,5,8,3,1,1,4,12,4,7,0 
-20799,6,3,0,14,4,1,1,0,4,0,18,19,2,32,1 -20800,3,8,0,0,3,4,2,3,2,1,18,10,1,31,1 -20801,9,1,0,10,4,5,3,5,2,1,14,17,4,35,1 -20802,0,1,0,10,0,0,12,2,4,0,16,11,5,5,0 -20803,8,7,0,7,3,6,4,5,4,1,1,9,4,8,1 -20804,0,3,0,15,6,6,5,3,3,0,0,9,3,19,0 -20805,3,3,0,3,5,5,13,0,0,0,2,11,5,38,0 -20806,6,2,0,7,1,5,14,3,3,0,6,4,0,1,0 -20807,3,0,0,3,6,0,12,1,4,1,8,15,5,17,0 -20808,2,0,0,0,5,4,9,3,2,1,13,20,1,40,0 -20809,0,0,0,2,3,0,4,0,1,0,6,9,1,8,0 -20810,7,0,0,5,4,4,6,3,1,0,2,9,4,7,0 -20811,1,8,0,6,0,0,13,3,0,0,15,3,0,13,0 -20812,7,3,0,0,2,3,1,0,1,0,16,7,5,41,1 -20813,0,0,0,6,2,5,14,4,2,0,2,15,1,34,0 -20814,8,0,0,10,3,2,0,0,0,1,8,6,1,13,0 -20815,2,4,0,4,0,3,6,4,0,0,10,2,0,12,0 -20816,1,1,0,8,5,6,8,3,1,1,7,13,1,3,0 -20817,6,6,0,15,0,5,5,3,4,1,8,2,2,4,0 -20818,3,3,0,8,2,1,6,4,0,0,15,10,2,11,1 -20819,6,0,0,15,5,2,8,3,0,1,14,2,2,7,0 -20820,7,4,0,0,3,2,13,3,4,1,9,16,0,20,1 -20821,2,8,0,13,2,5,14,0,3,1,8,19,2,4,0 -20822,6,6,0,11,0,6,11,4,1,0,2,13,0,1,0 -20823,1,5,0,11,4,5,8,4,1,0,10,16,2,4,1 -20824,8,1,0,13,0,6,3,4,3,0,2,7,2,13,0 -20825,4,7,0,12,3,5,6,4,4,1,3,17,5,14,1 -20826,2,5,0,4,6,6,2,3,4,1,8,2,5,25,0 -20827,8,2,0,15,2,3,12,3,1,0,13,11,4,30,0 -20828,0,8,0,15,6,5,14,1,3,1,2,6,0,23,0 -20829,4,1,0,4,6,3,3,5,0,0,4,1,0,21,1 -20830,8,1,0,5,0,2,9,5,1,0,3,8,4,21,1 -20831,0,4,0,11,1,0,8,3,1,1,1,11,2,7,0 -20832,8,6,0,14,6,4,8,0,3,0,14,2,3,36,0 -20833,0,7,0,4,0,1,10,5,1,0,2,19,0,27,0 -20834,3,8,0,12,6,4,14,4,4,0,13,13,1,41,0 -20835,4,5,0,6,5,5,1,5,1,1,0,1,4,35,1 -20836,2,3,0,12,5,2,4,0,3,0,17,11,0,1,0 -20837,8,5,0,13,2,4,6,1,0,0,2,6,0,23,0 -20838,2,5,0,13,6,5,1,5,4,0,6,11,5,7,0 -20839,4,2,0,1,3,6,11,4,4,0,14,11,2,12,0 -20840,5,5,0,13,5,5,8,4,3,0,16,0,5,8,0 -20841,8,6,0,9,2,2,8,0,1,0,9,7,0,10,1 -20842,7,1,0,4,0,4,14,5,1,1,5,15,5,24,0 -20843,2,3,0,15,1,0,12,3,4,1,2,19,1,3,0 -20844,1,1,0,6,6,5,0,4,0,1,0,2,1,28,0 -20845,2,8,0,8,0,5,9,4,3,1,10,2,5,5,0 -20846,10,5,0,2,2,6,10,2,0,0,15,11,4,8,0 -20847,0,5,0,8,6,5,0,0,1,1,8,6,4,26,0 -20848,9,7,0,7,0,1,2,4,2,0,17,5,4,26,0 
-20849,4,1,0,6,5,3,12,4,2,0,2,11,0,38,0 -20850,6,1,0,6,5,4,4,0,2,0,1,8,0,28,0 -20851,1,4,0,9,0,0,14,1,0,0,2,11,1,36,0 -20852,7,5,0,2,4,0,1,2,3,1,5,10,2,7,1 -20853,0,8,0,15,6,4,12,3,4,0,17,12,0,37,0 -20854,1,3,0,4,0,6,13,3,2,1,8,9,5,26,0 -20855,8,7,0,5,1,2,0,4,0,1,6,2,4,18,0 -20856,4,1,0,11,0,5,7,0,0,1,13,15,1,26,0 -20857,6,4,0,11,4,3,9,3,0,1,2,13,4,35,0 -20858,7,6,0,10,4,6,10,5,2,1,11,6,1,14,0 -20859,6,2,0,6,1,5,14,4,2,0,8,2,3,12,0 -20860,0,8,0,15,0,0,13,2,1,0,0,11,4,17,0 -20861,2,4,0,2,3,6,11,4,2,1,3,18,4,24,0 -20862,5,8,0,5,5,1,2,2,4,1,13,2,0,25,0 -20863,9,6,0,12,1,1,11,2,1,1,5,8,4,39,1 -20864,2,1,0,8,4,3,6,2,2,0,15,15,2,24,0 -20865,5,0,0,13,0,6,13,4,3,1,11,15,3,34,1 -20866,10,6,0,8,6,2,9,0,2,1,2,6,4,17,0 -20867,4,7,0,5,0,3,13,3,3,0,10,13,2,22,0 -20868,7,1,0,12,0,6,2,5,0,0,18,3,4,25,1 -20869,4,2,0,4,3,4,7,5,0,1,2,16,5,41,0 -20870,2,7,0,0,1,3,14,4,3,0,6,0,1,33,0 -20871,5,2,0,13,0,6,0,2,1,1,11,17,4,35,0 -20872,0,1,0,4,6,2,5,4,1,1,8,13,1,6,0 -20873,2,6,0,8,0,2,8,0,0,1,6,14,2,3,0 -20874,9,3,0,6,2,1,1,1,3,0,18,1,0,31,1 -20875,5,8,0,8,6,5,12,4,0,0,2,2,3,33,0 -20876,3,5,0,12,6,6,13,1,2,0,0,13,1,0,0 -20877,6,4,0,11,0,0,12,4,0,0,13,2,1,7,0 -20878,3,1,0,6,5,6,2,0,2,0,14,9,4,12,0 -20879,3,6,0,8,5,3,9,3,4,0,2,2,5,22,0 -20880,7,1,0,15,5,4,14,5,0,0,17,11,0,5,0 -20881,0,0,0,1,1,4,5,3,2,1,13,18,1,11,0 -20882,4,1,0,9,2,5,3,3,0,1,10,18,5,8,0 -20883,9,7,0,0,3,5,11,3,3,0,8,20,2,37,0 -20884,10,2,0,15,6,2,4,0,3,1,0,7,4,30,1 -20885,3,7,0,15,6,0,12,1,0,1,8,11,0,3,0 -20886,1,0,0,0,4,3,1,5,4,1,8,16,5,19,1 -20887,8,6,0,6,5,2,2,0,2,1,5,3,5,4,1 -20888,4,4,0,5,5,4,12,1,0,0,6,3,5,18,0 -20889,0,5,0,0,1,5,12,4,4,0,9,3,2,6,1 -20890,2,6,0,1,0,4,6,5,1,0,8,17,5,23,0 -20891,1,1,0,4,0,4,13,3,3,1,8,9,1,32,0 -20892,2,8,0,0,1,1,14,1,4,0,2,10,1,37,0 -20893,0,6,0,15,3,3,6,1,2,1,8,12,0,26,0 -20894,0,2,0,4,6,3,14,5,3,0,4,19,2,37,0 -20895,1,1,0,14,6,3,3,4,0,1,4,8,3,17,0 -20896,7,4,0,7,0,2,14,4,1,0,18,10,5,10,1 -20897,7,0,0,7,0,3,4,0,4,1,14,19,2,36,1 -20898,9,2,0,6,6,3,4,4,1,1,4,2,2,26,0 
-20899,0,0,0,4,0,4,8,4,0,1,17,14,2,36,0 -20900,5,2,0,12,4,3,6,2,3,1,2,11,1,41,0 -20901,0,1,0,0,1,0,3,2,3,0,8,5,0,17,0 -20902,7,5,0,8,1,5,13,5,0,0,6,7,2,33,1 -20903,0,2,0,1,4,3,14,2,0,1,13,17,0,21,0 -20904,4,4,0,3,1,0,3,0,1,1,17,20,0,3,0 -20905,0,2,0,3,6,0,14,0,2,1,2,11,5,0,0 -20906,7,6,0,10,6,4,5,3,0,0,8,9,1,40,0 -20907,1,4,0,15,1,4,11,4,3,0,12,11,4,27,0 -20908,0,8,0,2,0,5,5,4,4,0,13,3,0,7,0 -20909,5,6,0,8,5,2,9,4,2,1,13,7,2,24,0 -20910,4,2,0,12,4,4,5,2,2,1,4,2,2,28,0 -20911,9,6,0,6,6,6,12,0,2,1,9,18,3,1,1 -20912,7,6,0,3,5,1,4,0,1,0,18,12,2,1,1 -20913,8,6,0,7,0,5,7,0,0,0,13,9,2,34,0 -20914,6,0,0,5,5,3,0,0,4,0,12,0,1,8,0 -20915,3,8,0,3,2,5,12,4,1,0,13,2,3,6,0 -20916,1,8,0,9,2,1,6,3,1,0,7,6,2,3,0 -20917,2,6,0,10,4,0,14,4,3,1,6,0,4,30,0 -20918,4,2,0,1,2,4,0,5,1,1,18,17,4,9,1 -20919,6,5,0,12,4,5,10,5,4,1,5,14,4,1,1 -20920,5,7,0,14,4,1,7,2,1,0,8,0,5,5,0 -20921,10,1,0,8,2,5,7,3,1,0,16,20,3,25,0 -20922,5,6,0,3,1,4,3,3,1,1,8,18,2,0,0 -20923,4,4,0,5,1,5,14,2,2,1,8,2,5,10,0 -20924,4,4,0,11,5,1,4,5,1,1,4,19,5,29,1 -20925,6,8,0,12,5,4,8,1,2,0,2,6,2,19,0 -20926,9,2,0,1,3,4,6,3,0,0,15,15,1,1,0 -20927,1,0,0,5,0,6,9,1,2,1,0,7,4,5,0 -20928,8,0,0,8,4,6,4,1,1,1,9,8,5,40,1 -20929,9,0,0,13,3,3,1,4,1,1,3,7,3,1,1 -20930,0,6,0,9,6,5,4,1,3,0,13,0,0,24,0 -20931,0,8,0,6,5,5,5,0,2,1,2,17,1,20,0 -20932,3,5,0,4,0,6,9,3,0,0,2,20,4,13,0 -20933,1,4,0,6,5,2,14,1,4,0,8,10,1,2,1 -20934,0,8,0,2,3,0,7,4,3,0,12,11,2,12,0 -20935,2,8,0,11,0,5,3,0,2,0,17,2,0,3,0 -20936,7,7,0,3,0,3,9,0,1,0,8,6,0,17,0 -20937,1,6,0,15,1,2,6,1,4,0,2,4,2,29,0 -20938,5,1,0,10,5,6,4,4,0,1,8,2,3,19,0 -20939,9,5,0,14,3,2,2,2,2,1,1,7,1,24,1 -20940,10,3,0,14,0,3,8,0,0,0,8,20,1,24,0 -20941,2,6,0,9,1,6,9,4,1,1,13,15,0,39,0 -20942,0,1,0,11,0,2,1,1,3,1,4,3,2,2,0 -20943,8,5,0,0,5,6,0,0,2,1,5,5,5,10,1 -20944,3,6,0,10,1,6,6,2,3,0,8,11,0,41,0 -20945,7,4,0,13,5,1,12,1,4,0,2,11,4,20,0 -20946,6,7,0,15,6,3,3,1,3,0,8,2,4,22,0 -20947,3,5,0,10,6,5,5,4,0,1,8,6,0,3,0 -20948,2,4,0,9,0,3,11,3,0,0,5,6,2,28,0 -20949,8,1,0,10,4,4,0,0,1,1,13,20,2,8,0 
-20950,5,4,0,11,3,4,7,1,2,1,4,13,1,22,0 -20951,0,0,0,3,0,0,10,5,1,1,0,16,3,7,0 -20952,8,7,0,8,0,1,3,0,1,1,8,15,2,35,0 -20953,2,4,0,1,0,0,10,1,1,1,0,15,2,40,0 -20954,0,1,0,3,3,0,8,3,3,0,2,2,1,29,0 -20955,9,1,0,0,0,6,2,5,0,1,9,5,5,34,1 -20956,1,6,0,7,1,4,8,3,4,1,13,13,2,26,0 -20957,2,1,0,3,2,2,5,3,0,0,4,6,0,2,0 -20958,7,3,0,13,4,5,8,0,2,0,16,10,5,40,1 -20959,0,8,0,15,0,1,11,2,0,0,4,9,5,37,0 -20960,1,7,0,5,3,4,4,4,0,1,0,8,5,24,1 -20961,8,0,0,8,3,4,10,4,0,0,2,6,2,23,0 -20962,6,8,0,3,0,3,10,4,3,0,0,18,0,9,0 -20963,2,4,0,11,0,5,6,0,4,0,13,11,3,12,0 -20964,2,8,0,7,4,3,1,3,4,0,5,2,2,21,0 -20965,5,6,0,3,6,5,13,5,0,0,16,20,2,6,1 -20966,5,0,0,13,5,5,2,3,1,0,5,2,3,28,0 -20967,2,1,0,12,2,3,6,4,1,1,2,11,0,40,0 -20968,2,0,0,12,2,4,4,2,2,0,12,0,3,4,0 -20969,1,4,0,8,2,2,11,5,2,0,3,1,3,16,1 -20970,3,7,0,13,6,2,13,0,2,1,2,19,5,23,0 -20971,1,7,0,2,3,1,2,3,2,1,18,10,2,9,1 -20972,2,4,0,0,5,3,9,3,2,0,16,2,5,13,0 -20973,2,0,0,9,6,4,0,1,2,0,14,11,4,0,0 -20974,4,1,0,5,2,3,6,3,0,1,4,11,0,18,0 -20975,10,6,0,11,5,5,5,2,1,0,8,13,3,33,0 -20976,6,6,0,14,3,0,10,2,2,1,14,8,4,31,1 -20977,0,2,0,13,2,3,9,1,2,0,17,4,1,24,0 -20978,2,6,0,5,0,6,2,3,0,0,13,2,4,26,0 -20979,1,3,0,3,4,0,4,3,0,1,10,15,0,35,0 -20980,3,3,0,11,1,4,8,3,0,0,17,20,4,37,0 -20981,8,0,0,3,6,0,11,0,2,1,13,19,2,7,0 -20982,6,8,0,4,2,1,3,3,4,0,18,4,5,10,1 -20983,1,1,0,0,6,1,12,0,3,0,4,20,1,14,0 -20984,7,6,0,9,1,6,7,3,2,0,13,10,2,8,0 -20985,8,4,0,1,4,5,3,4,2,1,13,18,0,5,0 -20986,10,0,0,12,6,0,0,1,3,1,17,5,5,5,1 -20987,0,6,0,6,2,0,9,4,1,0,2,13,2,25,0 -20988,3,7,0,3,3,0,2,1,1,1,2,13,0,21,0 -20989,0,7,0,7,2,6,0,2,4,0,4,2,2,34,0 -20990,0,4,0,11,6,2,3,5,0,1,14,18,2,17,1 -20991,8,2,0,2,0,4,2,3,0,1,6,11,4,4,0 -20992,3,8,0,8,1,5,8,1,3,1,17,2,5,31,0 -20993,0,8,0,13,6,5,4,1,3,0,10,16,2,35,0 -20994,8,2,0,15,4,0,10,4,2,0,10,4,2,29,0 -20995,8,7,0,11,1,1,10,1,2,0,6,10,3,7,1 -20996,0,3,0,15,1,3,6,4,3,0,12,3,1,18,0 -20997,5,1,0,8,0,1,7,3,1,0,17,11,1,20,0 -20998,5,2,0,13,6,5,9,3,3,0,2,9,4,33,0 -20999,0,8,0,7,4,3,7,3,1,0,0,4,4,39,0 
-21000,7,7,0,5,6,6,10,1,3,0,14,19,3,1,1 -21001,3,5,0,15,4,5,3,0,1,1,2,2,1,4,0 -21002,1,7,0,10,4,3,3,3,3,0,8,13,0,26,0 -21003,0,4,0,13,2,5,14,3,0,1,8,9,3,25,0 -21004,4,3,0,0,5,5,7,2,3,0,6,6,0,38,0 -21005,7,6,0,9,4,6,6,0,1,1,11,10,5,26,1 -21006,6,1,0,9,1,6,11,0,2,1,18,4,4,9,1 -21007,5,4,0,15,2,6,11,4,0,1,18,5,5,14,1 -21008,1,8,0,15,5,4,8,3,1,1,2,12,5,1,0 -21009,10,5,0,11,0,4,0,0,0,0,4,0,1,23,0 -21010,6,6,0,5,0,0,4,5,0,1,11,12,0,4,0 -21011,6,8,0,8,1,5,13,5,4,0,18,19,3,39,1 -21012,9,8,0,3,5,5,6,4,0,0,7,18,1,35,0 -21013,2,5,0,13,2,1,12,3,1,1,13,13,3,23,0 -21014,3,3,0,2,3,5,4,2,3,1,8,2,3,10,0 -21015,4,5,0,10,3,4,14,1,1,1,12,6,1,6,0 -21016,7,8,0,6,0,5,9,4,4,0,8,11,0,28,0 -21017,8,2,0,7,4,2,5,0,2,0,9,5,2,1,1 -21018,3,6,0,15,2,4,6,3,2,0,10,11,3,20,0 -21019,2,8,0,0,6,5,5,3,3,1,18,8,0,6,1 -21020,8,2,0,3,0,5,6,2,2,0,2,13,5,25,0 -21021,10,6,0,6,6,6,10,0,4,0,0,10,3,39,1 -21022,0,8,0,3,1,5,13,5,3,1,14,13,1,35,0 -21023,6,3,0,5,6,5,11,4,3,0,17,7,4,29,0 -21024,2,4,0,0,0,4,3,2,0,0,2,13,0,20,0 -21025,10,1,0,10,1,5,7,2,2,1,14,2,5,38,0 -21026,1,3,0,3,3,3,9,4,0,1,17,1,5,35,0 -21027,3,7,0,12,0,3,6,5,0,1,2,10,0,13,0 -21028,1,8,0,9,3,2,5,2,3,1,3,6,5,26,0 -21029,6,1,0,5,3,1,11,5,0,1,14,7,2,39,1 -21030,8,0,0,15,4,2,1,3,4,0,12,15,5,4,0 -21031,8,7,0,11,2,3,8,1,2,0,8,18,0,38,0 -21032,8,6,0,15,0,5,9,5,3,0,4,9,3,37,0 -21033,0,6,0,3,5,6,13,3,4,0,12,16,5,3,0 -21034,7,7,0,13,5,1,1,2,2,1,18,10,0,38,1 -21035,8,6,0,13,5,5,9,3,4,0,5,11,3,12,0 -21036,0,1,0,6,2,4,5,5,3,0,17,6,2,41,0 -21037,5,8,0,0,5,5,9,3,3,0,10,11,3,0,0 -21038,7,6,0,13,5,1,4,0,4,0,10,18,0,9,0 -21039,10,8,0,3,5,2,4,5,2,0,1,7,2,37,1 -21040,5,7,0,15,6,4,8,1,4,0,2,6,1,39,0 -21041,3,1,0,6,1,6,4,0,0,0,15,20,2,34,0 -21042,0,8,0,15,0,0,5,4,3,0,2,16,4,25,0 -21043,3,6,0,8,1,3,7,3,1,1,6,20,1,3,0 -21044,9,1,0,4,2,1,4,2,4,0,8,11,3,26,0 -21045,6,4,0,0,0,1,1,0,4,0,14,12,5,9,1 -21046,10,2,0,15,0,0,9,0,2,1,2,18,3,26,0 -21047,10,1,0,4,3,4,0,3,4,1,8,8,4,17,0 -21048,1,5,0,0,1,5,9,3,2,0,17,6,2,20,0 -21049,9,3,0,6,5,0,1,4,0,1,17,6,0,12,0 
-21050,2,3,0,3,1,4,14,3,0,1,14,16,2,9,0 -21051,8,7,0,13,4,0,0,3,0,0,12,11,2,26,0 -21052,5,5,0,7,2,2,5,5,0,0,9,10,4,35,1 -21053,4,5,0,8,4,5,8,4,1,1,8,18,0,7,0 -21054,2,3,0,13,0,1,9,2,3,0,0,13,5,27,0 -21055,0,6,0,3,5,6,11,5,0,0,2,6,3,28,0 -21056,2,8,0,9,2,4,13,3,0,1,8,13,5,38,0 -21057,1,3,0,7,0,3,13,4,1,0,13,11,3,38,0 -21058,3,6,0,4,6,1,11,0,4,1,18,10,4,20,1 -21059,9,0,0,7,1,1,3,5,4,0,14,5,1,10,1 -21060,0,4,0,6,1,1,0,3,3,0,2,3,2,14,0 -21061,6,3,0,0,4,6,1,2,4,0,4,3,5,41,0 -21062,0,8,0,8,0,6,9,1,0,1,13,15,1,37,0 -21063,3,4,0,3,0,5,0,4,4,0,8,6,5,27,0 -21064,7,6,0,8,4,5,12,0,2,0,2,2,5,5,0 -21065,4,6,0,6,4,4,10,5,0,0,9,1,1,24,1 -21066,3,0,0,15,3,0,0,2,2,1,17,15,4,29,0 -21067,10,8,0,4,1,0,6,4,0,0,12,2,4,26,0 -21068,0,6,0,13,5,3,2,2,4,0,7,6,3,13,0 -21069,10,2,0,2,5,3,13,2,0,0,2,11,2,25,0 -21070,1,2,0,5,6,6,14,2,0,1,17,8,5,30,0 -21071,0,2,0,15,6,4,6,4,1,0,10,2,0,29,0 -21072,5,1,0,14,6,2,6,4,3,1,6,3,3,2,0 -21073,0,4,0,7,3,2,6,3,0,0,8,18,4,0,0 -21074,1,8,0,12,5,5,9,4,0,0,12,2,0,22,0 -21075,10,0,0,1,3,4,5,4,3,1,8,13,1,17,0 -21076,0,6,0,4,5,5,12,2,0,0,5,7,0,28,1 -21077,4,6,0,9,0,6,10,1,3,0,14,9,1,4,0 -21078,3,7,0,13,0,4,11,3,3,1,0,2,1,26,0 -21079,1,8,0,11,0,6,9,4,0,0,2,17,2,17,0 -21080,8,1,0,14,5,4,11,2,3,1,13,15,0,32,0 -21081,5,1,0,6,5,4,8,4,2,1,8,2,0,7,0 -21082,2,8,0,12,6,5,9,3,0,0,6,20,1,6,0 -21083,9,2,0,14,5,4,11,4,2,1,13,15,5,28,0 -21084,7,4,0,12,0,5,11,3,1,1,13,2,5,32,0 -21085,7,0,0,7,1,3,0,2,1,0,2,16,0,1,0 -21086,3,0,0,3,0,1,0,1,4,1,4,11,0,32,0 -21087,0,3,0,5,0,4,6,3,1,1,3,12,1,14,0 -21088,0,6,0,2,6,4,2,4,2,0,2,2,1,34,0 -21089,10,6,0,12,5,4,7,2,1,0,16,0,5,31,0 -21090,7,4,0,3,1,6,11,1,3,1,13,9,4,35,0 -21091,3,2,0,6,3,0,7,0,3,1,17,4,1,10,0 -21092,9,8,0,3,0,4,4,2,0,0,2,15,0,3,0 -21093,0,1,0,3,6,0,5,3,4,0,13,20,0,33,0 -21094,0,1,0,15,0,5,4,0,1,1,8,16,2,39,0 -21095,4,6,0,6,3,4,2,3,3,0,3,14,5,19,0 -21096,4,3,0,10,1,5,13,2,0,1,9,5,2,5,1 -21097,5,6,0,2,2,3,6,0,1,0,8,0,4,18,0 -21098,6,1,0,12,4,6,0,1,3,1,9,20,2,13,1 -21099,7,0,0,5,2,3,5,1,1,1,3,2,1,8,0 
-21100,7,8,0,11,1,3,12,2,3,1,1,10,5,18,1 -21101,7,1,0,14,5,2,3,5,4,1,9,10,4,21,1 -21102,9,0,0,1,2,0,9,1,0,1,2,2,0,12,0 -21103,2,6,0,4,3,5,6,2,0,1,2,20,4,8,0 -21104,7,2,0,14,0,0,9,5,2,0,2,11,5,36,0 -21105,0,7,0,3,0,0,7,2,1,1,2,19,4,41,0 -21106,0,0,0,5,0,4,7,4,1,1,13,11,5,34,0 -21107,3,6,0,4,1,4,11,2,3,0,2,2,2,26,0 -21108,7,6,0,6,0,5,13,0,1,1,17,15,3,26,0 -21109,4,8,0,13,6,4,2,3,2,1,13,16,0,13,0 -21110,10,3,0,3,3,1,7,2,2,1,10,20,0,0,0 -21111,2,0,0,3,5,3,12,3,1,0,7,6,5,10,0 -21112,3,6,0,11,5,4,0,4,1,1,2,18,0,22,0 -21113,0,1,0,5,3,4,4,4,4,0,4,2,2,19,0 -21114,8,3,0,2,3,1,3,3,0,1,18,5,4,1,1 -21115,8,5,0,5,0,4,12,4,3,1,2,2,1,27,0 -21116,8,8,0,13,5,4,2,0,0,0,10,2,1,19,0 -21117,5,8,0,0,0,0,5,5,0,1,4,1,4,19,0 -21118,0,1,0,11,0,5,12,5,0,0,2,11,1,39,0 -21119,4,3,0,6,0,4,5,2,2,1,17,13,0,5,0 -21120,4,2,0,9,0,5,2,1,3,0,14,13,0,7,0 -21121,2,6,0,5,0,6,12,1,3,0,4,6,0,12,0 -21122,2,6,0,2,6,2,6,1,0,0,13,11,0,0,0 -21123,2,6,0,8,2,5,11,0,4,1,12,7,1,3,1 -21124,4,8,0,14,6,3,9,0,0,1,13,2,5,40,0 -21125,8,2,0,13,5,0,3,3,2,1,6,1,3,12,0 -21126,4,8,0,0,0,5,14,3,2,0,11,2,5,13,0 -21127,3,6,0,2,1,3,0,2,1,1,8,11,3,0,0 -21128,2,0,0,4,4,1,1,4,1,0,14,4,4,16,0 -21129,3,1,0,0,6,4,3,0,4,0,1,8,1,23,1 -21130,4,7,0,6,0,5,2,0,4,1,18,14,0,5,1 -21131,3,1,0,10,0,4,9,3,3,0,11,3,3,27,0 -21132,6,2,0,3,1,1,8,5,0,1,1,19,3,13,1 -21133,2,8,0,1,6,4,11,0,3,1,8,14,2,39,0 -21134,8,4,0,6,0,4,4,2,0,0,16,3,1,2,1 -21135,1,8,0,11,4,6,6,2,0,1,9,19,2,12,1 -21136,2,5,0,11,4,4,0,4,0,0,7,13,2,25,0 -21137,4,3,0,3,5,0,4,4,3,1,4,11,5,29,0 -21138,7,3,0,6,6,3,7,4,3,0,15,11,3,40,0 -21139,0,8,0,1,0,6,1,3,1,1,0,0,3,24,0 -21140,2,0,0,7,2,4,5,0,0,0,2,13,0,5,0 -21141,3,0,0,9,3,1,12,5,4,0,16,19,4,13,1 -21142,9,5,0,15,6,1,3,1,1,0,17,11,5,20,0 -21143,1,0,0,13,0,4,8,5,2,1,7,2,5,35,0 -21144,9,6,0,6,3,0,12,2,1,1,10,11,3,40,0 -21145,0,8,0,6,4,4,0,4,4,1,2,2,0,29,0 -21146,10,0,0,1,3,5,7,1,2,1,8,20,5,2,0 -21147,1,7,0,15,3,3,13,1,4,1,13,17,1,7,0 -21148,9,0,0,1,6,3,14,0,4,1,13,3,2,39,0 -21149,4,8,0,3,6,2,10,1,2,1,13,1,2,1,0 
-21150,3,0,0,15,1,5,6,5,0,0,12,9,5,23,0 -21151,1,7,0,0,3,3,9,3,0,0,2,2,2,31,0 -21152,5,2,0,15,2,5,7,4,3,0,13,18,4,21,0 -21153,10,4,0,3,2,4,4,3,1,0,16,7,2,41,1 -21154,1,0,0,2,2,0,0,5,0,1,13,13,3,1,0 -21155,0,7,0,7,6,3,8,2,4,1,0,18,1,26,0 -21156,7,7,0,7,4,1,4,0,4,1,3,7,2,24,1 -21157,2,4,0,8,2,4,9,0,2,1,13,2,0,0,0 -21158,0,0,0,11,4,6,7,5,3,1,0,4,3,35,0 -21159,8,0,0,10,2,1,14,5,2,0,14,19,2,20,1 -21160,1,7,0,9,6,4,10,3,3,0,2,3,4,18,0 -21161,4,1,0,8,4,1,9,3,1,0,1,15,1,40,0 -21162,4,4,0,8,4,2,6,0,0,1,18,10,1,39,1 -21163,0,2,0,3,5,0,7,4,3,1,2,9,0,24,0 -21164,3,2,0,9,0,1,5,4,1,0,13,11,2,37,0 -21165,0,3,0,5,4,0,7,3,3,1,13,11,0,19,0 -21166,2,7,0,0,3,5,8,4,4,1,8,18,0,6,0 -21167,4,5,0,6,2,3,4,2,2,1,15,2,2,37,0 -21168,0,1,0,11,0,2,14,4,2,1,1,0,0,30,0 -21169,8,1,0,10,4,3,10,5,3,1,1,5,3,30,1 -21170,7,0,0,12,2,6,9,2,3,1,18,5,3,11,1 -21171,9,5,0,2,5,2,13,0,1,0,16,8,3,21,1 -21172,9,7,0,1,1,4,5,3,2,1,8,0,4,3,0 -21173,6,5,0,3,0,4,8,4,4,1,8,5,0,14,0 -21174,1,2,0,7,6,4,14,3,4,0,2,20,3,22,0 -21175,8,7,0,13,0,5,13,2,4,0,6,17,4,10,0 -21176,7,8,0,11,1,1,7,0,1,0,12,15,0,27,0 -21177,5,6,0,9,1,6,7,4,3,1,2,7,2,20,0 -21178,6,7,0,0,3,5,8,3,1,1,8,11,0,33,0 -21179,2,1,0,7,4,5,3,5,0,1,14,17,5,29,1 -21180,8,4,0,12,2,3,8,5,0,1,16,17,1,9,1 -21181,6,4,0,2,3,3,2,0,1,1,2,14,0,31,0 -21182,9,1,0,5,2,0,0,3,1,1,12,13,3,32,1 -21183,3,8,0,1,0,5,9,0,1,0,13,6,2,14,0 -21184,4,6,0,12,1,6,8,5,4,1,6,11,0,19,0 -21185,4,2,0,5,1,4,12,4,2,0,2,0,1,24,0 -21186,0,3,0,12,1,4,2,3,0,0,15,2,2,38,0 -21187,6,1,0,1,3,1,13,1,1,0,15,10,1,10,0 -21188,1,8,0,0,0,0,7,3,0,0,7,2,4,21,0 -21189,4,2,0,6,5,4,5,0,4,1,13,9,1,23,0 -21190,2,4,0,1,2,4,1,4,0,1,8,2,3,11,0 -21191,1,0,0,4,2,3,7,3,4,0,8,16,3,34,0 -21192,5,1,0,0,0,4,3,2,0,1,6,19,3,22,0 -21193,0,6,0,15,2,0,9,4,0,0,2,2,5,12,0 -21194,0,3,0,1,4,4,3,1,1,1,8,6,5,22,0 -21195,7,4,0,6,6,2,6,1,0,0,4,2,1,26,0 -21196,1,1,0,12,3,0,7,5,1,1,2,4,5,30,0 -21197,1,3,0,11,5,0,9,2,2,1,0,18,3,40,0 -21198,0,5,0,8,6,0,7,1,2,0,13,9,0,14,0 -21199,1,8,0,10,5,2,12,5,4,1,18,10,3,0,1 
-21200,5,1,0,12,1,2,6,1,3,0,17,18,1,30,0 -21201,5,8,0,13,4,3,9,2,4,1,8,4,1,0,0 -21202,2,7,0,10,3,3,4,5,0,0,5,17,1,11,1 -21203,10,0,0,4,5,0,5,2,2,1,8,11,3,41,0 -21204,8,2,0,9,3,4,1,0,0,1,3,1,4,18,1 -21205,4,8,0,3,4,2,4,3,3,1,8,10,1,2,1 -21206,2,1,0,3,5,6,5,4,2,0,1,11,0,21,0 -21207,7,5,0,3,2,0,0,3,4,0,9,20,2,5,1 -21208,0,7,0,14,0,4,11,1,4,0,2,1,4,12,0 -21209,0,7,0,1,4,1,5,2,1,0,12,11,0,27,0 -21210,7,5,0,8,0,1,10,1,2,0,13,9,1,0,0 -21211,9,6,0,3,5,4,8,4,3,0,12,6,1,22,0 -21212,5,3,0,2,2,4,9,2,3,0,9,11,4,19,0 -21213,4,4,0,7,4,3,10,2,0,1,18,7,4,29,1 -21214,0,2,0,11,5,3,3,1,2,0,13,14,5,33,0 -21215,4,2,0,2,2,3,1,3,3,1,9,8,1,34,1 -21216,9,7,0,10,2,5,11,0,3,1,6,9,3,6,0 -21217,0,3,0,13,1,6,6,4,4,0,2,12,0,31,0 -21218,10,2,0,10,1,2,4,0,1,0,9,7,3,21,1 -21219,8,6,0,3,2,0,4,3,4,0,13,7,5,26,0 -21220,2,5,0,7,2,1,12,4,1,1,16,18,4,9,1 -21221,0,4,0,2,4,5,8,3,0,0,6,20,1,11,0 -21222,8,5,0,8,5,1,4,5,3,1,7,5,3,0,1 -21223,6,8,0,11,6,2,0,2,2,0,1,14,1,4,1 -21224,9,6,0,12,2,4,13,1,4,1,16,19,4,6,1 -21225,2,0,0,15,6,1,6,4,1,1,14,12,1,36,0 -21226,7,1,0,6,2,4,14,2,1,0,4,20,0,37,0 -21227,0,0,0,13,0,4,3,4,2,0,13,13,1,28,0 -21228,0,6,0,9,3,0,13,5,0,1,9,14,3,41,1 -21229,0,2,0,11,1,0,8,5,4,0,13,9,2,21,0 -21230,5,7,0,6,0,1,9,1,2,1,8,6,0,7,0 -21231,6,6,0,8,0,0,6,4,0,1,4,18,2,39,0 -21232,4,5,0,15,4,0,10,0,2,1,17,15,2,4,0 -21233,1,0,0,9,0,0,10,5,4,1,2,0,1,36,0 -21234,8,6,0,14,5,3,10,0,2,1,18,7,0,11,1 -21235,0,7,0,6,6,3,7,0,0,0,8,19,1,3,0 -21236,8,5,0,4,1,2,4,4,4,1,9,19,0,7,1 -21237,0,4,0,12,2,1,13,5,1,1,18,8,1,33,1 -21238,7,5,0,11,2,0,0,0,0,0,8,20,0,10,0 -21239,6,3,0,15,1,4,1,2,2,0,13,14,1,3,0 -21240,9,5,0,10,0,1,3,1,4,1,0,17,5,2,1 -21241,9,8,0,0,3,1,13,1,3,1,3,9,1,41,1 -21242,6,2,0,14,4,6,6,0,3,1,16,1,5,18,1 -21243,4,3,0,5,5,3,5,1,1,1,12,3,0,34,0 -21244,5,6,0,15,1,0,7,1,0,0,8,6,3,14,0 -21245,4,3,0,3,0,4,4,2,3,0,3,0,3,2,0 -21246,9,3,0,3,4,0,11,5,0,1,18,20,1,2,1 -21247,2,6,0,11,2,1,10,2,0,0,15,0,0,31,0 -21248,0,1,0,9,2,3,5,0,3,1,2,6,1,11,0 -21249,7,4,0,2,5,5,9,1,0,1,14,5,4,27,1 
-21250,3,4,0,6,1,5,14,0,0,0,13,6,0,37,0 -21251,10,2,0,4,3,1,9,3,4,0,11,6,5,26,1 -21252,9,4,0,11,1,5,0,3,2,1,16,17,5,41,1 -21253,7,4,0,4,4,2,9,0,2,1,14,7,4,0,1 -21254,4,0,0,0,5,4,8,1,4,1,10,7,5,32,1 -21255,3,2,0,11,4,5,6,4,0,0,12,2,0,0,0 -21256,8,7,0,7,4,2,10,1,3,1,18,15,0,33,1 -21257,4,0,0,15,4,4,4,4,2,0,2,6,1,4,0 -21258,5,0,0,8,1,4,2,2,4,0,12,11,0,41,0 -21259,4,2,0,4,4,6,1,4,2,1,2,11,0,23,0 -21260,1,4,0,0,1,1,3,4,2,0,10,11,1,5,0 -21261,3,4,0,0,3,1,0,4,3,1,6,3,1,6,0 -21262,1,7,0,13,0,5,5,3,0,0,13,13,0,4,0 -21263,5,4,0,8,1,5,5,1,2,1,9,8,0,29,0 -21264,8,0,0,2,6,6,6,1,4,1,5,10,3,35,1 -21265,7,3,0,5,5,6,8,5,3,1,2,2,5,13,0 -21266,0,4,0,4,0,3,0,5,3,1,15,10,2,23,1 -21267,9,6,0,3,2,0,6,1,3,0,17,4,1,17,0 -21268,1,0,0,9,3,1,11,3,2,1,9,7,0,12,1 -21269,3,0,0,7,0,0,0,2,3,1,6,9,5,41,0 -21270,7,8,0,6,5,0,8,5,3,1,10,13,3,31,0 -21271,4,1,0,0,1,4,3,2,4,1,12,18,3,30,1 -21272,0,8,0,11,3,5,12,3,4,1,8,6,1,21,0 -21273,5,4,0,12,1,1,1,0,1,1,2,0,1,4,0 -21274,1,2,0,6,5,6,10,0,1,0,15,18,0,20,0 -21275,5,5,0,10,5,0,10,0,0,1,13,11,0,6,0 -21276,5,4,0,7,2,2,0,5,1,0,9,8,5,5,1 -21277,3,8,0,14,0,1,5,1,0,1,4,6,5,6,0 -21278,1,2,0,5,1,2,8,3,1,0,6,2,2,19,0 -21279,6,3,0,9,4,0,10,5,3,0,5,10,2,18,1 -21280,0,6,0,1,6,0,12,3,3,1,8,11,5,11,0 -21281,2,3,0,5,1,6,9,2,3,0,2,5,0,3,0 -21282,3,4,0,3,0,4,13,1,0,1,2,11,3,38,0 -21283,7,3,0,12,0,2,0,5,2,1,18,16,3,20,1 -21284,0,4,0,3,0,3,14,4,1,1,8,12,0,26,0 -21285,6,1,0,0,0,4,10,2,3,0,2,11,1,16,0 -21286,2,6,0,15,5,5,14,5,4,0,7,15,2,6,0 -21287,6,3,0,1,2,5,2,3,1,1,2,11,4,35,0 -21288,2,4,0,6,5,1,6,1,4,1,16,0,1,13,0 -21289,0,6,0,1,6,0,1,2,3,0,4,3,5,27,0 -21290,3,7,0,5,6,3,6,3,2,0,1,2,0,2,0 -21291,3,8,0,15,2,5,9,3,3,1,11,14,4,1,0 -21292,7,4,0,14,0,3,5,2,3,0,5,7,3,13,1 -21293,4,5,0,1,4,5,7,3,1,1,2,20,0,16,0 -21294,1,7,0,3,1,1,14,5,0,1,9,10,1,5,1 -21295,7,1,0,0,6,2,5,0,4,1,4,8,1,19,1 -21296,5,8,0,11,6,3,6,3,2,0,13,0,2,17,0 -21297,2,4,0,6,5,4,2,1,3,0,8,12,1,23,0 -21298,9,6,0,13,0,6,11,3,3,0,6,13,0,34,0 -21299,8,1,0,5,1,0,5,4,2,0,9,17,3,39,0 -21300,8,1,0,2,3,1,14,3,0,1,7,15,0,31,0 
-21301,0,0,0,13,0,3,4,1,0,1,8,2,3,2,0 -21302,2,5,0,14,6,0,2,1,4,1,15,17,3,10,1 -21303,1,0,0,2,0,5,8,3,3,0,15,2,1,13,0 -21304,3,8,0,6,0,0,6,2,2,1,2,13,3,30,0 -21305,1,2,0,3,5,6,13,0,0,0,15,18,0,38,0 -21306,2,3,0,13,1,3,5,1,3,1,8,0,0,30,0 -21307,9,1,0,12,1,5,1,0,0,0,0,2,1,0,0 -21308,2,4,0,10,6,3,6,4,3,1,1,13,3,8,1 -21309,2,8,0,4,3,0,2,2,0,0,2,20,0,6,0 -21310,10,0,0,4,4,1,6,5,1,0,18,7,5,3,1 -21311,0,2,0,13,2,4,13,1,3,0,8,2,4,40,0 -21312,7,8,0,8,5,0,7,5,1,0,8,11,0,41,0 -21313,7,6,0,14,2,3,0,2,0,0,2,2,0,22,0 -21314,4,6,0,15,6,5,0,1,1,0,13,2,1,35,0 -21315,4,1,0,5,6,3,2,1,0,0,13,18,3,39,0 -21316,10,7,0,7,1,4,13,3,0,0,4,2,2,36,0 -21317,1,0,0,14,6,3,12,3,1,1,12,2,0,4,0 -21318,1,3,0,10,4,5,3,1,3,0,11,18,4,41,0 -21319,9,7,0,4,5,6,13,1,3,0,11,5,1,32,1 -21320,7,7,0,11,0,0,0,2,1,0,2,15,1,40,0 -21321,0,0,0,2,5,5,6,5,2,1,12,11,3,37,0 -21322,3,0,0,9,2,5,8,1,0,1,13,6,1,28,0 -21323,2,7,0,3,0,5,9,4,0,0,8,18,2,40,0 -21324,10,1,0,3,4,6,0,0,2,1,0,7,4,9,1 -21325,3,4,0,2,4,6,13,5,4,1,18,19,3,25,1 -21326,8,6,0,0,3,3,3,3,4,0,7,15,0,1,0 -21327,9,8,0,3,2,6,12,3,4,0,15,9,2,6,0 -21328,3,4,0,10,6,3,11,4,4,1,18,10,4,16,1 -21329,10,2,0,8,5,1,0,0,0,1,18,6,3,2,1 -21330,0,6,0,6,3,4,5,0,2,1,13,6,0,27,0 -21331,3,2,0,14,6,0,2,4,3,1,0,2,4,35,0 -21332,7,8,0,3,5,5,1,3,3,0,12,12,1,37,0 -21333,6,0,0,3,0,0,1,4,4,0,14,11,1,34,0 -21334,0,8,0,1,0,5,4,1,4,1,13,2,1,9,0 -21335,0,7,0,3,3,0,5,3,4,0,0,2,5,40,0 -21336,2,1,0,13,6,5,6,5,0,0,0,11,5,13,0 -21337,3,6,0,10,5,3,11,2,4,1,13,13,4,41,0 -21338,10,8,0,15,0,4,12,0,1,1,2,6,1,8,0 -21339,1,7,0,0,4,4,2,0,3,0,7,13,0,38,0 -21340,10,8,0,2,1,4,9,4,0,1,13,16,2,37,0 -21341,10,8,0,10,4,4,4,0,3,0,7,11,5,11,1 -21342,0,3,0,4,1,6,3,2,1,1,13,18,3,28,0 -21343,8,1,0,14,5,1,6,5,4,1,5,7,5,8,1 -21344,10,6,0,10,6,4,5,3,4,1,2,0,1,26,0 -21345,7,6,0,13,2,4,14,0,1,0,13,11,0,12,0 -21346,1,1,0,3,0,5,8,4,3,0,2,2,5,5,0 -21347,4,0,0,5,6,2,4,5,4,1,9,5,4,41,1 -21348,2,4,0,14,3,1,5,2,0,0,13,9,0,30,0 -21349,9,8,0,4,4,5,12,2,0,1,10,13,4,41,0 -21350,10,6,0,8,5,2,6,2,3,1,11,2,0,0,0 
-21351,0,7,0,8,6,4,8,4,2,1,8,18,0,11,0 -21352,0,5,0,13,0,1,3,2,0,0,2,16,2,6,0 -21353,6,4,0,5,0,0,10,3,3,1,10,13,1,8,0 -21354,8,5,0,8,2,1,2,2,4,0,9,5,2,27,1 -21355,10,8,0,7,2,1,1,1,1,0,15,11,4,0,0 -21356,8,0,0,12,1,4,8,5,1,1,0,7,2,25,0 -21357,0,6,0,12,5,5,7,3,3,1,4,20,2,29,0 -21358,0,4,0,6,4,1,12,4,1,0,8,0,0,21,0 -21359,6,0,0,5,2,1,3,2,1,1,0,19,2,25,1 -21360,9,8,0,1,2,5,8,1,4,0,0,15,5,38,0 -21361,2,7,0,15,3,2,1,1,0,1,5,0,2,40,0 -21362,6,4,0,4,6,6,12,3,3,1,2,4,1,29,0 -21363,1,7,0,1,0,4,14,3,1,0,4,6,5,7,0 -21364,7,6,0,4,1,6,5,0,2,0,8,11,1,29,0 -21365,2,6,0,2,0,5,9,1,1,0,15,7,3,35,0 -21366,10,5,0,2,5,0,13,2,4,1,5,8,2,18,1 -21367,9,3,0,1,1,0,2,4,3,0,1,7,3,0,1 -21368,1,8,0,3,1,5,6,1,0,1,18,5,5,39,1 -21369,1,6,0,8,2,1,4,5,3,0,9,17,3,30,0 -21370,10,7,0,12,5,6,13,4,0,1,1,10,2,41,1 -21371,9,7,0,12,6,5,0,3,2,0,2,19,1,23,0 -21372,6,0,0,10,0,0,11,5,1,1,18,20,0,30,1 -21373,2,6,0,13,6,3,3,1,1,1,9,17,3,19,0 -21374,2,4,0,5,0,3,8,1,0,0,13,4,1,6,0 -21375,0,8,0,7,2,1,5,3,0,0,13,11,2,31,0 -21376,0,5,0,7,3,4,5,0,0,1,2,13,5,8,0 -21377,4,3,0,13,0,5,8,4,1,1,8,12,0,40,0 -21378,6,2,0,1,6,0,7,0,0,1,17,11,0,9,0 -21379,1,8,0,13,5,0,5,5,1,0,6,9,5,24,0 -21380,0,7,0,2,0,0,0,1,2,0,12,11,4,12,0 -21381,3,6,0,8,5,2,4,3,2,1,2,2,5,18,0 -21382,6,5,0,1,5,2,11,5,0,1,16,9,3,25,1 -21383,9,0,0,3,0,6,10,2,3,0,18,12,1,10,1 -21384,7,1,0,4,6,2,13,4,1,0,18,5,3,18,1 -21385,9,7,0,0,3,2,5,3,0,0,13,1,3,24,0 -21386,10,8,0,0,3,6,5,3,2,1,4,17,4,37,0 -21387,10,3,0,3,0,1,0,5,1,1,3,8,4,1,1 -21388,2,3,0,13,6,0,3,2,3,1,8,0,0,6,0 -21389,8,3,0,0,3,3,11,2,3,0,8,1,2,25,0 -21390,2,7,0,8,5,1,0,4,3,1,15,19,5,5,0 -21391,6,5,0,3,3,0,6,5,4,1,0,10,5,20,1 -21392,0,8,0,0,2,0,6,2,0,0,17,11,2,17,0 -21393,6,4,0,3,5,0,5,2,1,1,2,11,0,38,0 -21394,6,6,0,12,5,3,5,1,3,0,17,0,1,28,0 -21395,5,6,0,9,2,2,2,4,0,0,11,14,1,8,1 -21396,2,1,0,12,1,5,11,0,0,0,8,0,1,37,0 -21397,0,2,0,5,4,0,3,5,0,0,15,9,0,40,0 -21398,10,3,0,15,0,0,0,0,4,0,0,11,5,22,0 -21399,0,1,0,0,6,4,9,3,1,0,15,4,5,20,0 -21400,4,6,0,3,5,0,14,1,1,0,8,20,4,22,0 
-21401,1,7,0,10,0,0,12,2,1,1,4,11,0,27,0 -21402,9,5,0,13,3,4,5,3,1,1,2,11,2,29,0 -21403,2,1,0,8,1,2,6,0,0,0,2,2,0,39,0 -21404,7,2,0,12,4,6,9,4,0,1,2,13,5,26,0 -21405,2,4,0,2,4,6,11,0,3,1,16,13,0,21,1 -21406,3,0,0,11,0,3,9,2,2,0,17,9,2,4,0 -21407,3,1,0,7,2,6,2,5,2,1,9,17,0,27,1 -21408,2,3,0,0,6,2,3,2,1,1,0,2,2,14,0 -21409,0,7,0,15,6,0,13,1,2,1,3,4,1,13,0 -21410,3,0,0,14,5,2,6,1,2,1,2,2,1,11,0 -21411,5,6,0,12,4,0,2,5,4,0,18,7,2,19,1 -21412,5,2,0,14,4,5,4,0,0,1,18,1,4,28,1 -21413,1,7,0,3,3,6,13,5,1,0,14,16,5,25,0 -21414,3,2,0,4,0,1,6,5,3,1,18,17,3,27,1 -21415,6,3,0,5,0,4,11,0,2,1,13,20,3,27,0 -21416,5,0,0,10,4,3,14,5,1,1,3,7,3,27,1 -21417,3,0,0,7,0,0,8,3,0,0,4,9,0,14,0 -21418,9,5,0,0,2,5,14,1,3,1,18,18,2,11,1 -21419,9,1,0,3,3,5,1,0,2,0,9,5,3,11,1 -21420,0,3,0,5,0,1,8,4,3,0,17,2,1,37,0 -21421,3,8,0,6,0,4,1,1,3,1,15,19,2,26,0 -21422,3,0,0,2,0,0,3,0,4,1,2,18,1,41,0 -21423,10,2,0,14,5,5,12,0,3,0,0,2,3,4,0 -21424,1,2,0,11,3,3,4,3,1,0,2,18,2,9,0 -21425,1,3,0,11,4,5,0,3,4,0,13,18,3,35,0 -21426,10,8,0,3,6,0,4,5,3,1,18,7,5,11,1 -21427,9,1,0,2,2,6,4,4,0,1,14,12,4,31,1 -21428,1,2,0,1,3,4,1,0,1,0,18,16,0,41,1 -21429,2,2,0,4,4,1,1,3,0,1,13,11,0,14,0 -21430,0,8,0,13,5,1,5,3,0,0,2,0,1,3,0 -21431,5,1,0,13,0,0,13,1,3,1,5,17,1,37,1 -21432,10,2,0,11,3,3,13,3,0,1,13,14,2,38,0 -21433,1,5,0,3,5,2,5,3,1,1,13,12,0,11,0 -21434,3,6,0,15,5,4,9,0,1,1,10,0,0,22,0 -21435,0,2,0,3,5,6,10,4,2,1,13,16,1,1,0 -21436,0,3,0,3,3,5,12,2,2,1,17,11,4,41,0 -21437,2,6,0,15,5,3,7,2,1,1,18,2,1,17,1 -21438,5,2,0,9,0,5,0,3,0,0,10,11,3,5,0 -21439,7,2,0,11,5,1,7,1,0,0,5,14,3,4,1 -21440,10,7,0,7,1,0,13,5,2,0,18,19,2,2,1 -21441,0,0,0,12,1,4,8,1,0,0,4,18,3,18,0 -21442,10,8,0,3,4,2,9,3,2,1,13,17,4,39,0 -21443,2,1,0,9,6,4,0,1,0,1,6,11,1,33,0 -21444,8,6,0,4,3,4,13,2,3,1,0,20,3,11,0 -21445,3,2,0,3,6,5,3,3,4,1,12,0,3,25,0 -21446,8,5,0,2,2,3,10,1,1,0,5,7,0,9,1 -21447,0,1,0,6,0,4,1,3,2,0,8,3,2,27,0 -21448,0,1,0,5,0,3,1,1,2,1,12,2,0,12,0 -21449,8,8,0,8,6,4,11,4,1,0,13,11,2,17,0 -21450,6,2,0,11,6,1,12,0,0,0,14,1,4,12,1 
-21451,8,1,0,4,0,6,10,3,0,0,2,2,0,16,0 -21452,3,1,0,13,0,3,3,4,3,1,13,6,5,3,0 -21453,6,1,0,4,1,4,6,5,4,1,14,17,2,31,1 -21454,7,7,0,5,5,5,3,0,1,1,1,20,2,35,1 -21455,4,2,0,8,2,1,5,3,3,1,13,20,1,2,0 -21456,0,2,0,3,3,3,8,0,2,1,13,9,1,17,0 -21457,8,6,0,2,6,2,14,2,1,1,1,10,5,33,1 -21458,0,6,0,10,3,5,2,1,2,1,8,9,0,31,0 -21459,4,8,0,6,3,5,3,5,3,0,7,12,0,21,1 -21460,5,0,0,1,2,3,5,5,4,1,18,5,5,20,1 -21461,2,8,0,13,0,5,2,0,2,0,1,3,0,14,0 -21462,6,0,0,1,0,4,9,5,0,1,11,11,5,1,0 -21463,1,7,0,8,0,0,11,1,4,1,13,15,1,16,0 -21464,6,6,0,13,0,5,14,3,2,0,15,13,3,3,0 -21465,8,7,0,9,5,5,8,1,4,1,2,11,3,16,0 -21466,1,6,0,3,5,4,10,0,0,0,16,6,1,14,0 -21467,7,2,0,1,1,0,13,3,0,1,5,12,4,23,1 -21468,8,1,0,0,0,4,0,3,0,0,5,7,1,10,1 -21469,2,7,0,3,1,2,3,1,0,1,8,11,4,21,0 -21470,4,1,0,7,0,6,4,5,0,1,15,2,2,22,0 -21471,0,6,0,4,6,2,10,5,4,1,1,19,3,13,1 -21472,6,0,0,3,0,3,8,0,4,0,2,14,1,8,0 -21473,10,1,0,4,5,5,12,3,0,1,1,17,1,13,1 -21474,7,2,0,12,4,0,0,5,2,1,16,12,4,18,1 -21475,9,1,0,10,4,3,1,4,2,1,11,14,5,41,1 -21476,9,7,0,4,1,5,12,2,1,1,2,18,5,1,0 -21477,10,8,0,6,2,5,8,1,4,0,8,5,2,12,0 -21478,10,5,0,4,6,5,10,1,3,1,15,14,3,36,1 -21479,3,2,0,12,3,1,10,1,4,0,4,0,3,22,0 -21480,6,3,0,11,5,6,5,2,1,0,13,0,3,26,0 -21481,1,2,0,0,2,6,5,3,3,0,2,9,0,3,0 -21482,6,3,0,6,3,5,9,3,1,0,13,6,4,16,0 -21483,8,7,0,7,2,6,3,1,4,0,8,0,3,40,0 -21484,2,0,0,7,0,0,0,4,0,1,10,15,0,27,0 -21485,0,1,0,13,1,5,5,3,3,0,13,9,1,8,0 -21486,3,8,0,1,5,6,2,2,3,1,13,11,2,40,0 -21487,3,3,0,15,1,5,5,0,2,0,2,6,2,6,0 -21488,1,5,0,1,6,6,9,4,3,1,15,11,0,19,0 -21489,6,5,0,13,0,3,9,4,3,0,8,2,3,7,0 -21490,0,8,0,6,1,1,5,4,1,1,17,18,3,38,0 -21491,1,2,0,10,6,1,0,5,1,0,18,0,2,30,1 -21492,1,8,0,2,6,4,4,3,4,0,8,10,1,21,1 -21493,3,4,0,0,6,5,4,4,1,1,7,18,2,22,0 -21494,0,8,0,4,6,1,14,4,3,1,15,10,1,37,0 -21495,10,7,0,8,2,5,13,5,3,1,6,15,0,24,0 -21496,7,0,0,2,2,0,12,1,3,0,0,2,1,37,0 -21497,10,4,0,1,6,2,5,4,4,0,2,12,2,4,0 -21498,0,1,0,7,1,0,14,0,4,0,2,15,0,33,0 -21499,8,8,0,12,0,5,1,2,2,1,16,1,1,39,1 -21500,1,8,0,0,6,0,8,0,0,1,4,9,2,14,0 
-21501,5,3,0,1,0,3,0,3,0,1,17,3,4,17,0 -21502,1,8,0,12,4,0,9,4,0,0,11,7,1,26,1 -21503,6,7,0,7,2,2,5,2,4,1,3,17,5,0,1 -21504,2,8,0,15,3,3,3,0,3,0,6,6,2,39,0 -21505,7,2,0,15,1,3,3,5,3,1,18,16,1,16,1 -21506,8,8,0,7,1,0,7,4,0,0,18,5,3,34,1 -21507,7,7,0,8,6,3,4,2,4,0,13,1,0,1,0 -21508,7,5,0,14,6,5,10,4,0,1,3,12,3,0,1 -21509,2,6,0,13,2,0,10,0,0,0,15,20,3,10,0 -21510,1,0,0,14,6,1,13,5,4,0,18,3,0,32,1 -21511,1,8,0,5,3,4,8,3,4,1,2,19,0,8,0 -21512,5,2,0,5,0,6,0,2,4,1,4,2,1,37,0 -21513,0,4,0,13,4,6,3,2,2,1,4,9,3,0,0 -21514,9,8,0,15,0,0,1,5,3,0,13,0,0,41,0 -21515,0,7,0,15,1,5,5,0,3,0,4,4,1,6,0 -21516,2,6,0,15,3,5,9,0,3,1,13,6,1,35,0 -21517,6,5,0,4,1,5,6,3,1,1,2,18,4,14,0 -21518,3,3,0,6,6,4,7,3,2,0,16,11,0,4,0 -21519,8,8,0,15,6,5,6,3,2,1,12,14,0,19,0 -21520,5,3,0,8,0,0,1,1,0,0,6,2,0,5,0 -21521,9,6,0,7,4,1,11,0,1,0,3,8,3,14,1 -21522,1,6,0,7,2,4,5,1,3,1,10,18,0,6,0 -21523,8,4,0,2,4,4,0,3,0,0,4,2,0,4,0 -21524,4,1,0,2,5,5,9,3,1,0,8,19,0,14,0 -21525,9,0,0,15,1,5,11,0,4,0,4,6,3,7,0 -21526,6,7,0,12,0,0,3,2,4,0,13,6,0,3,0 -21527,4,2,0,2,6,5,0,4,1,0,13,15,5,14,0 -21528,1,1,0,0,1,4,1,5,0,0,17,15,0,7,0 -21529,3,5,0,15,2,6,2,3,3,1,2,11,5,33,0 -21530,7,7,0,13,2,0,1,5,2,0,16,9,4,11,0 -21531,1,3,0,12,0,6,10,1,4,1,2,2,5,26,0 -21532,6,2,0,3,1,0,6,0,1,0,13,9,4,10,0 -21533,5,2,0,9,0,4,6,0,3,1,17,13,2,38,0 -21534,0,2,0,2,1,6,6,3,2,0,15,18,0,18,0 -21535,8,1,0,3,2,0,8,4,0,0,4,11,0,19,0 -21536,4,0,0,0,2,0,5,3,2,1,8,3,1,18,0 -21537,4,4,0,8,3,6,4,5,3,0,3,10,5,2,1 -21538,8,6,0,14,3,5,7,0,4,1,7,7,5,41,1 -21539,2,3,0,6,1,2,10,1,0,0,13,13,1,14,0 -21540,5,7,0,8,1,2,12,3,3,1,5,10,1,21,1 -21541,9,4,0,6,1,4,9,4,1,0,2,4,2,26,0 -21542,10,3,0,12,3,1,14,2,3,0,8,20,0,6,0 -21543,3,3,0,15,5,4,3,3,2,0,6,20,1,28,0 -21544,0,7,0,5,6,1,7,3,2,0,2,17,0,2,0 -21545,10,0,0,7,4,6,0,0,1,1,5,5,4,32,1 -21546,2,0,0,3,6,6,2,3,1,1,13,14,3,18,0 -21547,8,0,0,4,5,6,14,5,2,1,5,17,4,11,1 -21548,3,3,0,10,3,0,11,4,0,1,13,12,2,16,0 -21549,4,6,0,9,5,1,5,2,4,0,2,2,4,5,0 -21550,0,2,0,1,4,2,0,3,4,1,10,2,5,38,0 
-21551,5,4,0,14,2,5,13,4,3,1,18,7,5,25,1 -21552,5,4,0,9,1,1,7,4,0,0,18,6,0,3,0 -21553,0,3,0,3,5,4,6,1,2,1,13,2,0,29,0 -21554,7,0,0,9,5,0,4,2,4,1,15,14,4,23,0 -21555,0,8,0,5,0,3,7,1,2,1,3,4,2,13,0 -21556,5,1,0,1,2,2,4,3,1,0,13,2,1,6,0 -21557,2,8,0,12,5,3,9,3,0,0,8,20,4,0,0 -21558,3,4,0,5,5,6,5,1,1,1,16,15,4,38,0 -21559,6,4,0,1,4,3,13,0,3,0,0,11,0,13,0 -21560,6,0,0,14,2,3,8,5,0,1,4,10,1,38,1 -21561,7,2,0,4,0,4,5,1,0,1,13,16,1,33,0 -21562,3,6,0,0,0,0,5,2,4,0,8,11,1,40,0 -21563,4,4,0,11,0,6,12,1,3,0,0,11,3,21,0 -21564,2,3,0,15,1,0,3,5,1,1,15,11,0,38,0 -21565,1,1,0,12,4,4,13,2,3,0,17,15,0,25,0 -21566,1,6,0,11,3,4,13,1,4,0,8,6,5,27,0 -21567,3,8,0,8,6,4,1,2,1,0,4,11,1,38,0 -21568,5,4,0,2,2,0,11,2,2,1,12,3,3,1,0 -21569,9,0,0,14,1,1,3,5,3,1,18,10,1,32,1 -21570,3,8,0,11,1,5,3,4,0,0,2,9,5,21,0 -21571,6,8,0,9,3,5,11,1,3,1,13,2,5,0,0 -21572,3,6,0,8,5,4,3,0,3,0,10,11,3,10,0 -21573,5,2,0,7,2,6,11,4,3,1,3,11,1,20,0 -21574,0,4,0,9,2,5,8,0,2,0,12,6,3,37,0 -21575,3,0,0,4,1,0,5,2,3,0,7,11,4,34,0 -21576,2,3,0,1,0,6,5,0,0,1,3,17,0,0,1 -21577,2,4,0,13,5,5,5,2,3,0,13,11,2,29,0 -21578,5,5,0,6,4,6,9,3,2,0,10,2,0,11,0 -21579,9,8,0,2,1,4,6,5,2,1,12,4,0,3,0 -21580,8,3,0,0,5,1,14,0,1,0,2,0,4,0,0 -21581,3,0,0,3,4,4,2,1,2,1,6,18,5,5,0 -21582,7,5,0,3,1,3,5,1,3,1,8,4,2,13,0 -21583,7,5,0,8,3,0,3,2,2,1,1,11,3,4,0 -21584,10,5,0,13,5,4,11,4,1,0,8,4,4,35,0 -21585,6,6,0,8,2,5,6,3,4,1,8,11,2,29,0 -21586,7,6,0,13,3,5,2,0,2,1,8,20,3,19,0 -21587,1,3,0,1,3,3,5,4,4,0,6,19,0,24,0 -21588,4,5,0,4,6,4,12,2,1,1,2,2,0,17,0 -21589,10,2,0,14,5,5,6,1,3,1,11,8,4,18,1 -21590,1,6,0,3,6,1,2,1,2,1,16,4,4,21,0 -21591,6,6,0,11,6,4,2,4,1,1,2,2,0,40,0 -21592,2,8,0,0,1,4,0,3,0,0,4,19,3,35,0 -21593,1,4,0,2,4,0,0,5,2,1,9,19,0,31,1 -21594,9,6,0,9,6,1,11,2,2,0,18,14,5,9,0 -21595,3,7,0,4,0,0,6,2,1,0,13,18,5,29,0 -21596,0,7,0,4,1,2,1,4,1,0,13,18,1,33,0 -21597,0,6,0,11,5,5,10,0,0,0,13,2,3,4,0 -21598,5,4,0,0,1,6,5,4,3,1,8,18,4,34,0 -21599,7,7,0,8,3,0,12,3,0,1,4,2,3,2,0 -21600,0,1,0,11,6,6,9,2,0,0,2,0,5,5,0 
-21601,1,1,0,11,4,5,0,1,1,1,17,18,0,18,0 -21602,2,3,0,13,4,3,1,4,1,0,2,15,0,16,0 -21603,6,5,0,6,4,6,7,3,4,1,1,7,4,41,1 -21604,8,3,0,3,2,0,10,5,0,0,13,8,5,30,0 -21605,7,8,0,7,6,3,3,4,4,1,2,12,4,17,0 -21606,9,7,0,13,0,5,14,3,4,1,2,20,1,37,0 -21607,8,1,0,6,2,1,5,1,4,0,18,19,0,10,1 -21608,9,4,0,8,5,3,14,3,0,0,7,7,4,13,1 -21609,9,3,0,6,6,1,1,4,1,0,17,0,1,6,0 -21610,0,6,0,15,2,5,0,0,2,0,10,15,0,4,0 -21611,0,4,0,4,1,0,2,4,3,1,13,11,0,0,0 -21612,0,6,0,10,3,5,2,1,2,0,6,18,3,24,0 -21613,9,2,0,3,4,0,0,5,3,0,17,3,5,38,0 -21614,0,3,0,12,2,0,8,4,1,0,8,19,4,3,0 -21615,8,1,0,3,1,4,8,2,4,1,2,19,0,25,0 -21616,0,8,0,14,1,4,12,0,0,1,13,11,2,31,0 -21617,7,4,0,4,4,2,8,0,0,1,11,19,3,18,1 -21618,2,0,0,4,1,4,3,5,0,0,10,6,2,12,0 -21619,0,2,0,0,2,4,2,4,2,1,16,15,3,7,0 -21620,5,5,0,8,4,2,3,0,2,1,18,7,5,18,1 -21621,7,1,0,13,6,5,12,1,2,1,3,7,1,39,1 -21622,8,1,0,7,4,3,1,0,0,0,0,13,0,26,0 -21623,8,0,0,0,0,0,4,4,0,1,2,13,2,14,0 -21624,10,1,0,12,6,1,2,1,1,0,12,19,5,32,1 -21625,9,2,0,10,0,0,12,4,0,0,2,15,3,33,0 -21626,2,7,0,14,6,1,6,3,0,1,2,11,4,9,0 -21627,3,8,0,13,2,3,5,2,1,0,13,15,5,10,0 -21628,7,8,0,1,1,4,13,3,2,1,0,14,0,0,0 -21629,9,1,0,1,3,2,8,5,0,1,14,2,3,7,0 -21630,7,5,0,7,1,4,2,2,1,0,9,18,1,31,1 -21631,7,5,0,4,5,2,10,5,1,1,5,7,2,35,1 -21632,10,6,0,1,0,5,5,3,1,1,13,2,3,5,0 -21633,3,0,0,1,2,3,5,0,4,1,8,0,0,34,0 -21634,0,5,0,7,0,5,4,4,2,0,10,2,2,35,0 -21635,10,2,0,5,1,5,5,1,0,1,16,15,5,35,0 -21636,8,3,0,5,3,1,13,5,0,1,1,8,4,18,1 -21637,0,5,0,11,0,4,12,4,2,0,13,20,1,2,0 -21638,2,1,0,12,1,0,14,0,0,1,8,6,3,7,0 -21639,6,2,0,2,5,5,11,3,2,0,14,7,0,5,1 -21640,0,1,0,2,1,5,6,4,3,1,8,14,4,25,0 -21641,2,8,0,2,6,4,1,2,1,1,13,2,1,36,0 -21642,1,4,0,12,6,2,2,1,2,0,2,2,3,23,0 -21643,9,5,0,9,5,1,9,5,0,1,18,7,2,12,1 -21644,2,8,0,8,1,4,1,3,1,1,2,3,0,35,0 -21645,2,1,0,1,1,6,12,0,4,1,13,17,2,3,0 -21646,1,4,0,5,2,4,12,3,3,0,6,2,0,26,0 -21647,0,6,0,14,6,3,5,0,2,0,14,14,4,4,0 -21648,4,8,0,3,5,4,1,1,1,1,17,6,2,19,0 -21649,1,2,0,8,5,4,7,0,0,0,15,20,1,30,0 -21650,4,7,0,13,6,4,4,3,0,1,13,8,0,27,0 
-21651,0,4,0,14,2,1,0,1,4,0,13,19,0,14,0 -21652,2,7,0,11,2,3,6,5,2,1,1,18,3,22,1 -21653,2,1,0,13,0,6,8,3,0,1,8,14,0,18,0 -21654,1,6,0,14,0,0,9,0,3,0,2,0,1,30,0 -21655,1,8,0,4,4,0,3,4,0,1,2,10,2,33,0 -21656,5,4,0,0,5,4,10,1,0,0,1,1,1,28,1 -21657,10,7,0,2,5,0,2,2,4,1,7,12,1,7,1 -21658,4,2,0,13,6,1,12,0,0,0,13,15,3,40,0 -21659,3,4,0,3,2,6,3,3,1,0,9,17,4,33,0 -21660,8,6,0,13,3,1,5,3,1,1,13,4,5,0,0 -21661,3,7,0,1,6,0,7,1,4,1,0,11,2,14,0 -21662,5,7,0,9,6,6,5,3,1,1,17,2,4,10,0 -21663,0,4,0,3,0,2,6,5,3,1,6,0,4,34,0 -21664,8,8,0,5,3,0,2,2,2,0,10,12,0,20,0 -21665,1,3,0,8,5,3,9,4,3,1,15,2,1,14,0 -21666,4,2,0,10,4,6,4,3,1,1,7,12,3,7,1 -21667,4,6,0,5,1,3,6,2,3,1,7,20,5,14,0 -21668,6,1,0,12,5,1,11,3,4,1,18,7,3,16,1 -21669,9,1,0,2,4,0,5,0,2,0,1,7,1,5,1 -21670,8,1,0,8,6,3,0,5,2,0,9,14,1,31,1 -21671,10,8,0,3,5,1,4,0,4,0,2,11,3,39,0 -21672,0,5,0,14,1,0,0,3,1,1,0,15,2,21,0 -21673,0,7,0,7,3,2,13,4,2,1,14,6,0,23,0 -21674,0,1,0,1,6,0,1,3,3,1,13,10,0,3,0 -21675,4,7,0,3,5,3,0,2,2,0,2,18,1,9,0 -21676,4,6,0,12,3,2,10,3,0,1,1,5,4,34,1 -21677,1,7,0,9,2,5,2,0,0,1,9,17,2,20,1 -21678,1,2,0,2,0,6,9,4,0,1,10,11,5,3,0 -21679,2,7,0,1,1,0,6,1,3,1,4,11,2,27,0 -21680,2,7,0,4,2,6,5,2,0,1,10,2,0,34,0 -21681,8,2,0,3,3,6,2,5,0,0,18,10,1,30,1 -21682,0,6,0,3,0,4,5,2,2,1,2,4,4,26,0 -21683,9,1,0,7,3,3,10,4,3,1,11,15,2,40,0 -21684,8,6,0,2,1,2,13,5,3,1,16,13,5,4,0 -21685,1,8,0,12,4,0,8,0,4,1,13,6,1,10,0 -21686,0,6,0,13,1,0,6,5,1,1,13,6,0,26,0 -21687,0,0,0,7,5,5,5,4,0,1,4,2,0,34,0 -21688,3,0,0,1,5,0,10,3,1,0,8,11,3,12,0 -21689,5,3,0,11,2,2,8,2,2,1,14,20,3,30,0 -21690,10,8,0,1,0,0,12,1,1,1,2,15,5,37,0 -21691,9,2,0,13,2,1,14,4,2,1,12,9,0,21,0 -21692,0,6,0,1,0,5,9,1,1,1,4,11,4,31,0 -21693,9,3,0,7,6,0,9,1,3,1,5,2,5,37,0 -21694,10,4,0,0,2,1,1,1,4,1,7,7,4,17,1 -21695,2,7,0,10,6,0,5,1,2,1,8,2,3,4,0 -21696,0,8,0,4,6,4,5,3,2,1,8,2,0,35,0 -21697,0,6,0,8,4,1,14,3,0,1,2,15,0,30,0 -21698,8,8,0,13,6,2,3,2,2,1,10,6,3,14,0 -21699,6,1,0,12,0,3,10,3,3,1,17,9,2,14,0 -21700,9,6,0,5,3,4,6,1,0,1,13,13,2,19,0 
-21701,10,0,0,2,6,4,3,2,3,1,13,11,2,7,0 -21702,2,6,0,0,0,5,6,0,1,1,12,4,1,36,0 -21703,5,7,0,7,4,1,4,5,2,1,18,12,5,12,1 -21704,5,1,0,0,2,0,3,1,0,0,18,19,1,8,1 -21705,7,7,0,11,5,1,7,1,2,1,9,11,0,23,1 -21706,3,6,0,8,0,3,12,1,4,1,13,12,5,25,0 -21707,8,3,0,3,2,3,9,2,3,1,5,2,3,16,0 -21708,0,8,0,12,6,5,2,1,4,1,2,16,2,40,0 -21709,9,6,0,11,0,5,11,1,3,1,3,2,3,35,0 -21710,2,5,0,10,5,3,5,3,0,1,13,9,0,24,0 -21711,4,7,0,8,4,0,9,1,2,1,18,10,1,2,1 -21712,7,5,0,7,3,5,11,5,0,1,1,1,4,31,1 -21713,4,8,0,1,0,2,8,5,3,1,10,11,5,13,0 -21714,0,3,0,3,2,1,1,3,2,1,12,0,0,26,0 -21715,2,6,0,2,0,6,0,0,0,0,3,11,2,32,0 -21716,0,6,0,7,1,4,2,1,1,1,17,15,3,14,0 -21717,5,8,0,7,3,6,3,4,3,1,17,11,1,3,0 -21718,0,4,0,13,0,4,1,0,0,1,13,15,1,2,0 -21719,0,6,0,3,1,5,8,3,3,1,8,2,2,10,0 -21720,3,8,0,4,0,3,13,3,3,0,6,15,4,3,0 -21721,8,1,0,9,2,0,1,5,0,0,14,5,2,17,1 -21722,2,4,0,15,1,3,6,0,3,1,8,6,2,1,0 -21723,5,4,0,6,4,6,8,0,1,1,1,10,2,19,1 -21724,3,8,0,5,0,5,0,2,0,1,8,6,4,31,0 -21725,0,8,0,1,5,4,10,4,2,0,13,15,5,33,0 -21726,10,0,0,14,3,2,4,5,4,1,17,5,1,20,1 -21727,0,8,0,10,0,0,8,1,4,0,4,7,2,10,0 -21728,4,8,0,12,2,3,9,4,1,1,13,13,1,22,0 -21729,10,8,0,10,5,3,10,5,4,0,18,10,0,5,1 -21730,3,4,0,9,3,0,4,5,0,0,13,4,1,23,0 -21731,6,8,0,6,2,4,14,0,0,1,2,16,5,29,0 -21732,0,2,0,1,0,4,6,3,3,0,11,2,4,36,0 -21733,0,0,0,2,6,0,5,3,1,1,2,19,3,18,0 -21734,4,8,0,12,4,5,9,0,3,1,8,15,0,16,0 -21735,5,3,0,3,3,5,5,5,4,0,6,0,0,39,0 -21736,5,6,0,14,2,4,4,3,1,0,11,9,4,41,1 -21737,3,6,0,13,3,3,2,0,4,0,1,7,0,35,1 -21738,0,6,0,0,6,3,6,2,0,1,13,2,4,10,0 -21739,4,1,0,0,4,5,8,3,2,1,14,1,2,8,1 -21740,3,0,0,4,2,6,13,5,2,1,9,14,2,35,1 -21741,9,3,0,5,3,4,5,5,0,1,11,7,2,20,1 -21742,0,4,0,6,1,6,1,1,0,1,2,17,0,6,0 -21743,5,4,0,6,4,0,6,0,1,1,2,18,1,1,0 -21744,7,3,0,4,4,2,1,1,1,1,3,5,5,13,1 -21745,1,2,0,0,3,6,7,2,0,0,13,11,3,23,0 -21746,5,5,0,8,1,4,9,2,0,1,11,7,5,18,1 -21747,2,0,0,5,0,6,4,4,3,0,0,14,3,33,0 -21748,9,0,0,2,0,5,5,4,0,0,6,20,2,8,0 -21749,1,5,0,11,5,0,8,2,3,1,10,11,0,13,0 -21750,6,6,0,10,3,2,8,0,3,0,14,7,2,10,1 
-21751,1,0,0,3,5,6,9,3,2,0,18,10,5,7,1 -21752,0,6,0,8,0,5,0,4,4,1,8,2,5,4,0 -21753,6,1,0,9,4,4,4,5,2,0,0,5,4,10,1 -21754,5,4,0,10,1,3,4,4,3,1,15,3,2,41,1 -21755,0,7,0,11,0,6,3,4,4,1,12,20,5,4,0 -21756,6,3,0,5,1,0,6,3,0,1,13,2,1,22,0 -21757,8,7,0,2,3,4,2,3,3,0,18,5,4,8,1 -21758,8,3,0,11,4,5,8,1,2,0,4,14,4,20,0 -21759,0,8,0,6,5,0,5,3,1,0,0,11,1,9,0 -21760,7,3,0,2,6,2,11,1,4,0,9,4,5,8,1 -21761,7,6,0,0,0,5,11,3,0,0,1,5,3,34,1 -21762,6,0,0,5,2,1,6,4,1,1,1,10,3,1,1 -21763,3,5,0,14,3,5,3,5,2,1,15,16,3,29,1 -21764,0,5,0,3,2,0,9,4,1,1,13,12,4,16,0 -21765,1,8,0,15,6,3,13,0,4,1,11,13,3,13,0 -21766,0,5,0,14,0,2,3,3,3,1,8,16,5,21,0 -21767,2,4,0,1,6,2,0,1,2,1,1,9,1,30,0 -21768,9,7,0,3,6,6,2,1,3,0,13,15,2,8,0 -21769,9,7,0,2,0,5,9,3,4,0,13,6,0,7,0 -21770,8,0,0,11,2,3,4,4,2,1,17,17,1,29,0 -21771,4,8,0,1,6,4,3,0,0,0,2,11,5,6,0 -21772,10,4,0,3,4,2,14,4,2,1,9,15,2,12,1 -21773,3,6,0,12,1,6,7,2,0,1,17,11,0,25,0 -21774,5,7,0,15,5,6,8,3,4,0,10,17,1,10,1 -21775,9,8,0,3,6,3,13,1,0,0,2,2,0,0,0 -21776,6,2,0,2,1,1,10,1,0,1,10,13,3,35,0 -21777,8,8,0,4,4,0,12,1,3,0,2,2,5,27,0 -21778,4,7,0,7,5,3,2,5,3,1,15,4,5,25,1 -21779,2,8,0,1,2,4,14,1,2,1,15,15,0,25,0 -21780,0,7,0,0,1,1,9,3,3,0,2,9,0,33,0 -21781,4,7,0,3,1,0,1,4,3,1,4,2,3,11,0 -21782,2,8,0,13,0,4,7,2,0,0,10,18,5,9,0 -21783,4,6,0,5,0,4,8,0,0,0,17,0,3,31,0 -21784,6,1,0,3,0,4,7,4,4,0,0,4,4,30,0 -21785,10,3,0,1,6,6,8,1,0,1,2,19,4,40,0 -21786,3,8,0,11,5,5,7,3,1,1,13,11,0,4,0 -21787,8,2,0,8,1,0,1,1,3,1,2,10,5,12,0 -21788,10,2,0,1,5,6,5,5,3,1,13,11,5,3,0 -21789,3,3,0,1,5,6,14,3,3,0,2,16,0,38,0 -21790,0,7,0,5,0,5,4,1,2,1,4,2,0,38,0 -21791,0,3,0,14,3,5,6,5,0,0,4,3,0,17,0 -21792,2,0,0,13,1,4,3,4,3,1,10,20,1,17,0 -21793,1,1,0,12,6,2,9,1,2,0,11,10,5,32,1 -21794,0,3,0,4,1,0,9,1,3,1,0,17,1,12,0 -21795,9,0,0,7,2,6,5,4,2,0,13,13,2,0,0 -21796,0,4,0,8,6,3,2,0,2,0,4,19,2,12,0 -21797,8,6,0,1,5,0,12,4,0,0,13,15,2,18,0 -21798,3,1,0,0,3,1,7,5,4,0,5,20,4,22,1 -21799,7,1,0,11,3,4,13,4,1,1,9,1,1,12,1 -21800,5,6,0,1,4,4,14,2,2,0,2,0,0,33,0 
-21801,0,4,0,15,4,0,9,3,2,1,8,2,1,38,0 -21802,0,0,0,5,6,4,9,3,2,0,4,2,0,16,0 -21803,6,5,0,6,6,5,1,4,4,0,2,2,3,21,0 -21804,0,0,0,14,0,2,7,1,1,1,8,11,1,17,0 -21805,1,0,0,7,0,0,0,5,1,1,10,14,1,40,0 -21806,3,5,0,4,4,0,12,3,0,1,6,20,4,2,0 -21807,10,3,0,6,6,6,4,2,3,0,4,2,3,40,0 -21808,6,8,0,13,6,3,7,5,1,1,13,11,5,27,0 -21809,5,7,0,12,3,3,12,4,1,0,8,17,4,17,0 -21810,7,8,0,5,2,2,1,2,0,1,13,11,3,34,0 -21811,4,7,0,7,1,1,11,2,0,1,14,10,3,23,1 -21812,10,8,0,2,1,3,3,2,2,0,2,2,3,3,0 -21813,0,2,0,1,0,0,8,3,2,0,17,11,1,7,0 -21814,3,3,0,10,4,6,8,1,3,0,13,2,0,16,0 -21815,5,1,0,0,2,1,14,1,3,1,0,14,5,34,1 -21816,8,0,0,12,5,1,13,4,0,1,10,5,3,5,1 -21817,0,3,0,15,6,1,10,3,0,1,17,2,4,10,0 -21818,2,1,0,11,6,4,0,2,4,0,4,20,3,5,0 -21819,8,8,0,15,2,5,10,5,3,1,10,1,0,27,0 -21820,5,4,0,5,5,6,2,5,3,0,18,8,4,5,1 -21821,7,7,0,9,5,4,3,1,0,0,6,2,5,14,0 -21822,0,7,0,4,2,3,8,4,1,1,2,2,2,1,0 -21823,9,5,0,14,6,2,0,5,2,1,9,10,1,40,1 -21824,1,0,0,6,6,3,1,2,3,0,11,0,3,21,0 -21825,10,2,0,10,6,2,6,0,2,1,7,17,5,38,1 -21826,4,7,0,1,0,2,9,1,2,0,12,18,0,35,0 -21827,2,1,0,9,4,6,6,5,1,0,14,5,4,14,1 -21828,3,8,0,5,5,0,7,4,3,0,17,2,1,8,0 -21829,2,6,0,4,1,6,3,0,1,1,2,16,2,33,0 -21830,8,1,0,11,6,0,9,0,4,0,1,7,5,28,1 -21831,1,7,0,12,3,3,7,5,3,0,8,4,2,41,0 -21832,2,7,0,0,6,1,0,0,0,0,0,4,4,17,0 -21833,4,2,0,12,3,2,11,5,3,1,11,3,4,23,1 -21834,0,4,0,11,2,6,14,2,4,1,2,6,0,4,0 -21835,0,3,0,8,2,4,0,5,2,1,17,3,3,3,0 -21836,3,6,0,14,4,0,4,3,4,1,4,19,5,27,1 -21837,0,7,0,9,5,0,1,2,0,0,8,9,1,29,0 -21838,6,7,0,15,0,1,8,3,2,0,8,0,3,14,0 -21839,1,4,0,1,0,0,4,1,2,1,13,2,0,26,0 -21840,9,0,0,4,2,0,11,3,4,0,18,12,4,13,1 -21841,3,3,0,13,6,6,5,5,3,1,13,6,1,16,0 -21842,7,6,0,4,4,4,8,3,3,0,1,18,1,18,0 -21843,1,7,0,10,1,1,10,1,0,1,3,15,0,36,1 -21844,8,4,0,4,3,1,14,3,3,1,10,2,3,24,0 -21845,9,6,0,6,1,5,6,3,0,1,17,0,1,8,0 -21846,6,8,0,3,2,4,9,2,2,0,13,6,5,24,0 -21847,3,3,0,0,0,0,0,3,3,0,10,8,3,11,0 -21848,0,2,0,7,6,5,2,4,3,0,2,14,3,37,0 -21849,7,3,0,9,0,1,12,4,3,1,9,19,5,25,1 -21850,8,4,0,1,6,4,14,3,0,0,13,13,4,33,0 
-21851,5,3,0,4,1,6,13,1,0,0,2,2,3,26,0 -21852,5,6,0,15,5,1,0,2,4,0,2,18,4,20,0 -21853,9,6,0,4,6,0,14,3,1,0,14,16,4,27,0 -21854,5,3,0,15,6,6,10,1,1,0,8,4,1,14,0 -21855,5,2,0,3,3,4,8,0,2,1,8,18,0,7,0 -21856,7,3,0,0,6,5,1,2,4,1,16,11,0,13,0 -21857,4,6,0,0,2,4,4,1,0,0,16,7,0,14,1 -21858,1,6,0,5,6,4,7,2,1,1,2,16,0,12,0 -21859,1,6,0,9,6,1,10,4,0,0,2,18,5,33,0 -21860,0,0,0,0,0,3,0,1,3,1,13,0,0,31,0 -21861,3,3,0,2,5,5,5,5,3,0,18,11,0,32,1 -21862,2,5,0,5,3,0,11,3,0,0,8,2,0,0,0 -21863,1,8,0,13,5,4,12,2,3,0,13,9,1,18,0 -21864,7,3,0,11,5,0,3,2,0,1,11,13,3,33,0 -21865,1,1,0,12,6,5,0,4,0,0,13,12,2,17,0 -21866,5,6,0,11,2,4,5,4,2,0,12,11,4,31,0 -21867,2,0,0,5,4,3,11,2,0,1,17,20,5,27,0 -21868,7,6,0,3,4,6,6,3,1,0,18,3,4,23,1 -21869,10,7,0,3,1,1,8,3,4,0,17,15,2,0,0 -21870,0,4,0,5,0,1,11,0,1,1,9,8,3,21,1 -21871,9,5,0,13,1,3,3,2,3,0,2,9,5,35,0 -21872,3,3,0,4,5,6,9,3,3,1,2,15,0,11,0 -21873,4,1,0,6,6,0,2,5,3,0,2,6,1,19,0 -21874,6,2,0,0,6,4,10,5,1,0,17,7,3,8,1 -21875,4,4,0,2,5,0,10,1,3,0,10,0,3,18,0 -21876,3,6,0,7,6,4,14,4,1,1,13,15,1,2,0 -21877,0,6,0,12,6,1,5,3,0,0,4,15,5,28,0 -21878,6,8,0,2,4,1,5,3,1,0,8,17,0,28,0 -21879,4,5,0,11,4,1,8,1,0,0,8,11,3,23,0 -21880,2,5,0,8,2,3,6,4,1,0,14,9,2,0,0 -21881,1,4,0,10,1,4,3,2,0,1,17,2,1,2,0 -21882,4,2,0,15,5,4,14,0,0,1,1,9,0,11,0 -21883,2,4,0,3,1,0,9,5,1,0,17,2,1,7,0 -21884,0,1,0,0,5,6,2,3,0,0,8,11,1,13,0 -21885,4,8,0,8,2,5,4,0,2,1,3,19,3,10,1 -21886,7,0,0,7,4,1,1,3,1,0,13,10,5,23,0 -21887,9,5,0,2,1,3,14,4,4,1,10,2,5,16,0 -21888,10,2,0,6,2,5,12,4,0,1,15,0,3,4,0 -21889,8,8,0,8,2,3,12,5,4,1,1,12,5,22,1 -21890,10,4,0,8,3,4,6,2,0,1,2,0,1,37,0 -21891,6,5,0,5,6,1,8,4,3,0,18,5,4,9,1 -21892,0,8,0,3,6,4,9,3,4,0,6,6,0,4,0 -21893,2,4,0,3,5,0,6,3,0,0,8,8,0,13,0 -21894,7,1,0,7,4,0,9,3,3,1,12,2,3,6,0 -21895,4,3,0,12,0,4,11,1,1,0,8,6,3,9,0 -21896,5,0,0,10,3,2,2,5,3,1,7,10,5,8,1 -21897,8,3,0,4,1,4,4,1,0,1,0,17,5,4,0 -21898,3,6,0,12,4,3,0,4,1,1,18,19,0,18,1 -21899,10,1,0,6,5,3,3,2,3,0,14,20,0,2,0 -21900,10,5,0,9,4,1,13,2,3,1,9,12,4,13,1 
-21901,4,8,0,15,3,6,0,2,2,1,6,4,3,13,0 -21902,10,8,0,8,0,1,8,5,2,1,13,0,1,30,0 -21903,5,1,0,3,0,2,9,5,2,1,18,7,1,33,1 -21904,0,7,0,1,6,6,14,5,4,1,13,13,1,14,0 -21905,6,1,0,1,0,5,5,3,1,1,4,11,2,19,0 -21906,8,7,0,3,2,5,14,4,2,0,2,0,3,13,0 -21907,0,0,0,3,0,3,2,4,4,0,2,2,3,38,0 -21908,5,7,0,1,2,4,6,0,2,0,13,6,0,37,0 -21909,10,6,0,5,6,5,5,4,3,1,0,18,3,3,0 -21910,1,3,0,0,2,6,5,4,3,0,10,12,2,4,0 -21911,1,6,0,3,0,3,4,1,4,1,13,0,1,19,0 -21912,7,5,0,7,1,4,4,2,0,1,18,1,4,39,1 -21913,3,5,0,7,2,5,9,4,4,0,5,18,3,21,1 -21914,9,4,0,0,3,4,9,5,4,1,2,11,0,31,0 -21915,9,8,0,14,2,6,10,5,4,0,18,17,1,38,1 -21916,5,6,0,13,6,6,9,1,1,1,3,17,4,16,0 -21917,5,5,0,5,6,2,4,4,3,1,5,15,2,23,0 -21918,2,3,0,10,6,6,14,4,1,0,13,0,5,19,0 -21919,3,2,0,7,3,6,10,1,0,1,2,19,0,27,0 -21920,1,0,0,8,6,0,9,1,0,0,12,13,2,14,0 -21921,7,3,0,2,0,5,5,2,1,1,8,11,2,33,0 -21922,6,1,0,10,3,5,8,3,4,1,18,5,5,9,1 -21923,0,0,0,11,5,5,0,4,4,0,2,19,1,16,0 -21924,2,4,0,11,0,5,13,1,1,0,17,19,4,34,0 -21925,6,7,0,9,2,1,5,0,3,0,14,0,2,39,1 -21926,10,0,0,12,3,6,14,2,2,1,3,10,5,2,1 -21927,1,1,0,13,5,3,4,4,1,0,12,6,2,26,0 -21928,0,2,0,13,4,0,9,3,2,1,8,4,5,29,0 -21929,9,7,0,11,6,5,5,4,3,1,8,6,5,3,0 -21930,2,8,0,13,1,0,13,1,0,1,8,11,2,13,0 -21931,3,7,0,15,1,4,9,5,1,1,9,10,1,2,1 -21932,10,4,0,7,0,5,13,2,0,1,18,8,2,16,1 -21933,0,4,0,2,4,1,11,5,0,0,11,1,0,29,1 -21934,2,6,0,5,5,0,9,0,4,0,13,6,3,1,0 -21935,5,7,0,11,0,5,0,4,3,1,17,18,1,14,0 -21936,10,8,0,6,1,0,0,2,0,1,4,2,0,13,0 -21937,0,8,0,13,3,4,2,1,0,1,13,4,3,41,0 -21938,1,0,0,13,0,5,0,2,3,0,6,11,0,38,0 -21939,2,2,0,1,2,0,13,1,3,0,8,4,1,17,0 -21940,7,5,0,9,1,4,11,2,2,1,3,9,5,29,0 -21941,5,8,0,1,0,5,0,3,4,1,17,11,5,31,0 -21942,4,3,0,3,5,3,7,3,1,1,4,20,5,16,0 -21943,1,5,0,0,6,5,1,4,1,1,17,20,1,1,0 -21944,3,4,0,3,0,4,14,1,2,1,16,2,5,11,0 -21945,2,4,0,6,2,2,4,1,0,0,18,19,0,10,1 -21946,7,1,0,8,4,4,8,4,1,1,14,4,5,32,1 -21947,2,2,0,10,6,5,8,4,2,0,8,11,2,17,0 -21948,1,4,0,8,6,0,9,2,0,0,15,0,0,16,0 -21949,3,4,0,11,2,6,12,1,1,1,2,11,3,9,0 -21950,5,8,0,0,4,1,9,2,1,1,6,5,1,9,0 
-21951,4,0,0,0,2,6,13,2,0,0,17,16,1,34,0 -21952,7,6,0,7,3,5,2,3,2,0,6,13,0,20,0 -21953,2,6,0,0,2,3,9,1,4,1,4,11,3,22,0 -21954,6,7,0,13,3,0,4,3,3,1,13,3,4,28,0 -21955,1,7,0,14,4,1,14,2,3,1,16,8,1,0,1 -21956,5,5,0,5,4,3,10,5,3,1,11,7,4,19,1 -21957,6,8,0,8,3,6,12,4,4,0,7,6,2,36,0 -21958,1,4,0,2,3,3,6,3,0,0,4,4,1,38,0 -21959,10,7,0,13,1,4,2,0,0,0,13,19,5,18,0 -21960,4,8,0,10,0,5,8,4,2,1,2,0,3,13,0 -21961,1,6,0,7,3,5,1,5,4,0,2,16,1,30,0 -21962,8,3,0,6,5,3,8,1,2,1,5,20,3,37,1 -21963,6,6,0,5,3,6,5,2,3,0,13,2,1,13,0 -21964,1,3,0,1,2,3,5,4,1,1,0,2,2,4,0 -21965,10,5,0,4,3,1,8,2,0,0,6,13,1,1,0 -21966,1,6,0,15,1,3,6,4,0,0,8,20,0,20,0 -21967,9,4,0,8,5,6,0,3,3,1,5,3,2,41,1 -21968,0,6,0,6,1,0,3,3,4,0,13,3,3,29,0 -21969,2,2,0,3,3,4,11,5,0,0,5,2,3,38,0 -21970,3,1,0,7,1,6,13,0,4,1,14,20,3,28,1 -21971,5,2,0,8,1,0,14,3,2,1,5,1,5,10,1 -21972,6,3,0,13,6,4,1,1,1,0,1,15,3,3,0 -21973,9,5,0,6,3,2,14,3,0,1,17,6,4,23,0 -21974,0,1,0,10,5,4,12,4,3,1,8,0,0,9,0 -21975,2,3,0,5,3,6,7,2,3,0,2,3,2,18,0 -21976,2,0,0,3,6,5,8,4,4,1,6,15,0,28,0 -21977,8,8,0,2,5,3,14,4,4,0,5,4,3,19,0 -21978,1,6,0,6,0,0,13,4,2,1,2,6,1,3,0 -21979,9,2,0,3,0,3,9,3,0,1,7,16,3,18,0 -21980,0,8,0,13,1,2,6,5,3,1,4,9,5,26,0 -21981,2,3,0,11,3,4,12,1,0,0,12,11,3,28,0 -21982,7,7,0,8,4,2,2,0,0,0,15,7,5,0,1 -21983,4,5,0,4,4,1,6,5,3,1,16,8,0,23,1 -21984,8,5,0,10,4,2,13,2,4,0,18,1,5,35,1 -21985,1,6,0,2,5,2,7,4,1,0,6,9,1,34,0 -21986,1,0,0,4,1,4,12,4,0,0,2,14,3,32,0 -21987,5,0,0,0,1,2,9,5,1,1,2,2,5,21,0 -21988,2,8,0,1,3,4,5,3,0,0,0,16,5,25,0 -21989,5,4,0,11,4,6,5,5,0,0,13,18,1,33,0 -21990,1,4,0,8,2,6,8,3,0,1,8,13,4,26,0 -21991,0,3,0,6,4,4,1,3,1,0,13,2,4,3,0 -21992,0,7,0,4,6,3,12,1,2,0,0,9,4,31,0 -21993,10,8,0,14,3,5,1,2,2,0,15,2,4,2,0 -21994,9,5,0,10,6,2,12,0,2,1,9,19,5,8,1 -21995,9,1,0,13,3,5,6,0,2,1,0,0,3,31,0 -21996,8,8,0,7,4,0,11,4,0,1,8,6,2,30,0 -21997,0,6,0,5,5,4,7,2,2,0,6,2,5,32,0 -21998,0,6,0,2,2,3,10,1,3,0,2,9,0,34,0 -21999,8,0,0,0,1,1,11,5,1,1,1,1,4,34,1 -22000,0,1,0,6,5,2,3,3,1,0,6,13,1,5,0 -22001,0,8,0,12,5,6,11,5,0,1,4,8,5,0,0 
-22002,7,3,0,4,0,6,14,1,3,1,0,11,4,28,0 -22003,9,7,0,6,0,0,7,3,3,0,13,6,2,0,0 -22004,6,1,0,0,0,1,8,5,0,0,18,3,1,5,1 -22005,1,7,0,3,1,6,8,3,4,0,8,2,1,4,0 -22006,8,8,0,10,2,6,7,4,4,1,17,11,2,12,0 -22007,4,6,0,5,0,0,11,4,3,0,2,0,0,38,0 -22008,10,3,0,8,3,1,9,2,0,1,9,5,2,12,1 -22009,7,8,0,2,3,5,3,2,2,0,13,16,0,34,0 -22010,9,5,0,11,1,1,5,5,4,1,7,12,4,27,1 -22011,9,4,0,1,5,1,1,4,2,0,15,15,5,32,0 -22012,0,3,0,10,1,0,6,2,1,1,13,11,0,7,0 -22013,1,6,0,12,3,0,7,3,3,1,6,11,0,3,0 -22014,1,7,0,15,4,3,5,3,3,1,3,6,0,22,0 -22015,0,7,0,11,5,2,7,0,0,1,8,3,1,30,0 -22016,9,8,0,10,3,2,1,4,2,1,8,7,1,6,0 -22017,8,4,0,1,4,3,9,1,1,1,17,3,1,29,0 -22018,10,7,0,3,6,2,0,0,0,0,5,2,5,2,0 -22019,0,2,0,0,2,2,14,4,3,0,17,2,1,37,0 -22020,5,0,0,0,5,4,6,0,1,1,10,11,0,3,0 -22021,5,0,0,3,5,0,0,5,0,0,5,6,3,7,1 -22022,6,6,0,13,3,5,10,4,1,0,1,15,0,38,0 -22023,2,8,0,10,3,0,9,1,1,1,15,11,2,11,0 -22024,7,5,0,15,2,4,5,0,3,1,18,19,3,5,1 -22025,6,6,0,15,3,1,8,0,0,1,17,10,3,21,1 -22026,9,2,0,8,5,4,6,1,2,1,13,11,1,30,0 -22027,4,0,0,5,0,5,8,2,1,0,2,15,1,20,0 -22028,6,2,0,11,1,0,14,1,3,0,11,2,1,9,0 -22029,7,8,0,7,1,5,14,1,0,1,6,2,0,12,0 -22030,0,0,0,5,4,4,14,4,0,1,8,20,2,14,0 -22031,2,7,0,8,2,6,7,3,0,0,13,13,0,8,0 -22032,0,3,0,13,0,3,11,1,2,1,13,13,1,41,0 -22033,2,3,0,3,1,1,6,4,0,0,8,0,1,39,0 -22034,7,6,0,1,1,0,3,2,0,1,9,7,1,23,1 -22035,0,8,0,12,6,5,10,4,1,0,8,2,1,5,0 -22036,3,8,0,7,1,3,0,4,3,1,11,11,1,25,0 -22037,8,5,0,7,1,6,6,4,0,0,2,10,0,4,0 -22038,3,5,0,7,2,0,2,1,2,0,8,2,2,12,0 -22039,6,5,0,9,4,3,5,5,2,0,6,7,4,19,1 -22040,6,2,0,9,2,4,0,1,2,0,7,8,2,6,1 -22041,8,5,0,2,1,3,14,0,3,1,1,7,5,23,1 -22042,6,0,0,1,5,1,10,2,1,1,0,14,0,20,0 -22043,1,6,0,5,4,1,0,0,2,0,8,14,5,23,0 -22044,3,5,0,14,1,5,5,2,3,0,14,4,0,26,0 -22045,0,0,0,3,2,5,9,3,3,0,12,2,0,29,0 -22046,1,6,0,0,6,0,11,3,4,0,10,18,3,5,0 -22047,3,6,0,3,2,4,12,4,3,0,13,11,2,3,0 -22048,3,4,0,10,3,1,4,0,2,1,17,15,5,23,0 -22049,0,0,0,7,6,4,11,2,3,1,2,15,4,25,0 -22050,2,0,0,9,6,5,5,5,0,1,18,19,0,24,1 -22051,1,1,0,4,6,0,1,3,0,1,2,6,5,22,0 
-22052,8,3,0,2,0,1,7,2,1,0,1,17,5,18,1 -22053,7,6,0,9,4,5,13,3,1,1,5,7,4,19,1 -22054,0,2,0,2,4,5,12,4,1,1,17,0,1,23,0 -22055,1,6,0,9,1,6,9,0,2,0,13,18,5,10,0 -22056,5,3,0,6,2,2,11,3,4,0,18,19,3,1,1 -22057,3,8,0,14,3,2,6,1,0,1,18,10,5,2,1 -22058,8,1,0,0,0,3,6,3,2,1,15,11,0,32,0 -22059,8,6,0,12,6,4,13,5,4,1,7,10,3,34,1 -22060,8,2,0,10,3,5,1,5,0,0,3,1,5,13,1 -22061,9,2,0,14,4,0,3,0,2,0,9,10,1,27,1 -22062,0,3,0,13,4,2,0,2,0,0,6,15,3,18,0 -22063,5,2,0,8,0,3,9,3,2,1,13,2,2,37,0 -22064,10,3,0,5,6,3,12,1,0,0,15,0,0,36,0 -22065,8,0,0,0,6,5,13,1,0,0,10,15,5,41,0 -22066,9,3,0,6,0,1,7,1,2,1,4,20,0,32,0 -22067,4,8,0,12,3,5,5,0,0,0,4,13,3,20,0 -22068,6,8,0,5,0,1,14,3,2,1,13,6,5,40,0 -22069,3,3,0,2,4,4,9,2,1,1,4,2,0,41,0 -22070,0,3,0,3,0,2,3,3,4,1,11,15,4,0,0 -22071,3,3,0,0,2,6,6,0,0,1,17,6,2,33,0 -22072,9,1,0,9,4,4,5,4,1,1,5,5,4,38,1 -22073,7,6,0,8,4,6,0,3,3,1,4,4,5,9,0 -22074,8,4,0,1,0,2,14,3,0,0,9,17,4,14,1 -22075,1,8,0,5,4,1,14,1,1,1,10,8,1,16,0 -22076,5,8,0,11,0,0,9,1,0,0,9,0,3,33,0 -22077,7,0,0,12,6,4,2,4,0,0,13,0,1,26,0 -22078,3,0,0,1,6,0,3,4,2,0,13,2,5,21,0 -22079,0,3,0,2,1,6,1,1,4,1,2,11,3,20,0 -22080,2,6,0,13,6,5,0,1,2,1,17,9,0,19,0 -22081,9,5,0,11,5,6,6,2,4,1,6,11,3,4,0 -22082,4,6,0,14,6,1,1,2,0,1,17,11,0,7,0 -22083,7,8,0,0,3,1,9,4,1,1,18,8,4,12,1 -22084,3,6,0,10,6,3,0,5,1,0,2,13,0,21,0 -22085,6,5,0,15,5,1,4,4,1,1,3,5,0,34,1 -22086,2,4,0,15,3,0,0,3,1,0,17,14,1,1,0 -22087,1,4,0,8,2,3,12,5,1,0,2,2,2,27,0 -22088,7,3,0,3,4,0,2,3,4,0,11,15,5,38,0 -22089,4,7,0,2,2,5,2,4,2,0,13,4,0,27,0 -22090,5,5,0,9,5,2,7,5,3,1,15,10,5,4,1 -22091,2,8,0,10,6,4,9,4,4,1,2,2,2,27,0 -22092,6,4,0,7,5,0,6,1,2,0,13,9,0,35,0 -22093,4,6,0,9,6,1,13,3,2,1,2,9,3,5,0 -22094,4,1,0,15,3,1,2,4,1,0,9,10,4,24,1 -22095,4,8,0,13,6,0,7,5,3,1,9,5,2,24,1 -22096,1,7,0,1,0,2,9,1,0,1,12,8,2,1,0 -22097,0,0,0,2,6,1,3,0,1,0,13,19,0,19,0 -22098,7,7,0,7,1,4,13,2,3,0,13,2,0,1,0 -22099,0,0,0,10,4,3,4,0,4,0,13,16,2,2,0 -22100,2,2,0,1,0,4,6,5,0,1,16,16,4,3,0 -22101,0,6,0,0,4,3,7,1,3,0,8,2,5,27,0 
-22102,10,0,0,0,6,0,8,1,1,0,17,11,1,41,0 -22103,8,0,0,5,1,2,3,5,1,1,9,5,4,5,1 -22104,0,7,0,5,3,5,14,3,1,0,17,11,3,29,0 -22105,10,3,0,10,2,5,10,1,3,0,0,17,2,34,0 -22106,6,0,0,15,1,1,5,3,1,1,15,15,1,11,1 -22107,8,4,0,3,0,3,6,1,3,0,8,14,2,21,0 -22108,10,5,0,0,2,6,3,0,1,0,3,11,1,6,0 -22109,0,3,0,5,2,6,7,5,0,0,6,13,5,23,0 -22110,10,7,0,5,1,3,3,2,4,0,14,11,5,29,0 -22111,0,5,0,0,4,2,12,3,1,0,2,2,4,31,0 -22112,4,0,0,15,4,1,6,1,4,1,18,12,0,18,1 -22113,7,1,0,9,6,4,5,4,1,1,4,12,2,33,0 -22114,9,2,0,4,1,1,2,4,3,0,2,15,3,34,0 -22115,4,6,0,6,6,3,4,1,4,0,3,15,0,33,0 -22116,9,5,0,1,5,5,0,4,2,1,14,1,4,21,1 -22117,2,1,0,8,1,6,6,4,4,0,8,19,3,5,0 -22118,0,6,0,13,0,5,9,1,1,1,17,2,5,30,0 -22119,7,6,0,5,6,2,0,0,0,0,2,8,3,0,0 -22120,4,8,0,15,0,2,6,3,4,1,11,11,4,38,0 -22121,5,4,0,8,3,4,9,1,2,0,2,16,5,41,0 -22122,0,3,0,15,0,4,1,4,0,0,6,16,0,6,0 -22123,1,6,0,3,2,1,8,2,2,0,2,6,2,5,0 -22124,0,4,0,8,1,0,14,3,1,1,6,4,4,6,0 -22125,3,2,0,0,0,0,11,2,2,0,17,11,4,17,0 -22126,3,6,0,15,0,5,2,0,3,0,3,4,2,27,0 -22127,0,5,0,11,1,6,4,4,3,1,9,1,3,0,1 -22128,0,8,0,3,3,5,0,1,0,0,2,13,2,40,0 -22129,4,6,0,8,0,1,10,2,4,0,8,11,2,36,0 -22130,0,2,0,8,2,6,8,3,4,0,13,1,2,26,0 -22131,4,4,0,10,4,1,14,5,0,0,1,7,3,17,1 -22132,2,4,0,14,6,1,5,3,0,1,4,15,5,3,0 -22133,1,8,0,9,5,0,3,1,0,1,13,11,1,37,0 -22134,0,7,0,14,1,4,2,2,2,1,2,15,5,16,0 -22135,2,5,0,10,6,6,7,1,2,1,8,4,4,14,0 -22136,6,7,0,7,0,5,5,1,0,1,8,11,1,7,0 -22137,7,5,0,8,0,3,4,5,3,1,18,7,5,30,1 -22138,3,3,0,1,3,1,6,2,0,1,17,0,2,4,0 -22139,3,8,0,3,6,4,13,4,4,0,2,15,4,35,0 -22140,3,6,0,6,1,2,4,3,2,1,18,3,5,8,1 -22141,3,8,0,0,2,4,13,1,0,0,13,2,0,16,0 -22142,1,7,0,11,5,2,12,1,4,1,1,0,1,38,1 -22143,10,4,0,1,1,1,2,5,4,1,9,10,1,21,1 -22144,3,0,0,3,0,4,7,5,1,0,12,1,5,22,0 -22145,1,5,0,10,2,5,8,4,2,0,17,2,0,17,0 -22146,7,3,0,10,1,1,13,1,2,1,11,3,5,40,1 -22147,9,2,0,4,3,3,5,2,2,0,16,0,0,35,0 -22148,0,8,0,3,4,4,8,1,2,0,4,2,3,14,0 -22149,5,8,0,0,6,4,8,3,0,0,2,0,2,39,0 -22150,2,7,0,13,2,5,9,3,4,0,13,12,5,39,0 -22151,7,4,0,11,0,5,5,5,2,0,13,2,2,8,0 
-22152,1,1,0,11,1,5,1,4,1,0,10,2,0,0,0 -22153,9,5,0,0,0,0,5,2,0,1,4,2,3,36,0 -22154,0,4,0,10,1,4,5,4,0,1,17,3,2,41,0 -22155,7,6,0,3,6,1,10,5,2,0,18,12,4,41,1 -22156,2,8,0,12,3,4,11,2,0,0,13,0,5,40,0 -22157,1,1,0,10,3,5,9,0,4,0,8,6,4,17,0 -22158,2,7,0,5,6,4,10,3,2,0,3,11,1,4,0 -22159,0,3,0,1,0,6,3,4,3,1,13,17,5,19,0 -22160,10,2,0,7,1,4,4,1,4,1,4,1,2,3,0 -22161,1,0,0,13,5,0,6,2,0,0,2,18,0,6,0 -22162,9,2,0,13,3,5,10,3,3,1,17,7,3,7,0 -22163,2,7,0,13,5,0,11,5,3,0,10,18,1,12,0 -22164,4,2,0,4,6,4,14,0,1,0,2,15,0,19,0 -22165,0,6,0,3,5,3,6,3,2,1,17,0,5,13,0 -22166,4,3,0,2,6,2,13,5,2,1,7,15,3,27,1 -22167,0,1,0,1,1,0,3,2,4,1,12,2,1,0,0 -22168,1,4,0,12,2,5,0,1,0,1,13,12,1,3,0 -22169,2,7,0,2,3,0,12,4,2,0,6,6,2,0,0 -22170,2,4,0,5,6,1,7,5,2,1,16,12,0,40,1 -22171,0,2,0,11,3,0,4,0,3,1,13,2,5,28,0 -22172,10,7,0,13,4,0,14,4,0,1,8,11,3,4,0 -22173,5,2,0,8,0,4,0,3,4,0,13,19,2,35,0 -22174,5,7,0,5,3,0,9,4,2,0,8,15,3,20,0 -22175,8,6,0,12,1,6,7,1,1,1,18,17,2,6,1 -22176,6,6,0,7,4,4,7,5,3,1,18,10,3,36,1 -22177,4,6,0,8,4,2,10,0,1,0,16,7,5,35,1 -22178,1,0,0,10,0,0,6,2,0,1,6,2,1,20,0 -22179,2,1,0,9,6,3,10,5,1,1,2,6,0,0,0 -22180,8,8,0,3,2,5,10,0,1,1,6,14,1,20,1 -22181,6,6,0,3,2,6,1,0,0,1,2,9,4,16,0 -22182,6,1,0,1,1,1,5,3,0,0,13,15,4,30,0 -22183,2,8,0,4,6,2,3,3,3,0,5,0,1,26,0 -22184,3,6,0,7,1,2,4,5,2,1,13,2,0,30,0 -22185,6,0,0,5,5,2,5,3,4,1,2,17,1,37,0 -22186,0,0,0,12,0,5,5,4,1,0,8,12,2,0,0 -22187,5,3,0,3,1,0,14,2,0,1,2,11,4,25,0 -22188,2,8,0,15,0,3,3,1,3,0,2,11,5,30,0 -22189,5,0,0,13,5,4,9,3,4,0,9,3,4,11,0 -22190,4,8,0,11,6,1,1,5,2,1,3,17,0,0,1 -22191,0,7,0,13,4,3,9,4,1,0,10,6,2,37,0 -22192,8,4,0,10,2,5,0,1,3,0,2,11,0,41,0 -22193,4,3,0,1,6,6,11,1,2,0,0,7,3,11,0 -22194,1,8,0,12,3,4,7,2,3,1,15,3,1,3,0 -22195,9,3,0,15,4,5,10,2,4,1,2,16,2,30,0 -22196,3,3,0,1,1,4,14,4,1,0,12,14,5,29,0 -22197,6,8,0,14,2,0,9,5,1,1,9,0,5,16,1 -22198,8,6,0,11,3,1,4,0,0,0,9,10,3,21,1 -22199,5,7,0,11,1,2,3,4,0,0,6,2,2,23,0 -22200,2,8,0,8,6,6,7,4,0,0,2,4,0,26,0 -22201,5,8,0,5,4,0,14,2,2,1,2,2,1,9,0 
-22202,3,4,0,12,2,1,4,3,0,1,18,17,3,23,1 -22203,7,0,0,4,3,1,4,4,1,1,3,10,0,12,1 -22204,5,1,0,0,4,3,3,2,2,1,11,13,3,2,1 -22205,2,3,0,6,5,0,5,2,1,0,13,0,0,38,0 -22206,2,3,0,4,5,1,9,3,0,1,4,18,0,23,0 -22207,1,1,0,5,5,2,0,3,3,0,3,18,5,26,1 -22208,10,0,0,11,6,4,7,5,0,1,18,10,3,7,1 -22209,0,6,0,10,0,2,2,1,0,1,6,2,2,11,0 -22210,1,1,0,7,6,2,7,0,0,0,13,12,1,4,0 -22211,4,7,0,11,0,4,9,3,0,1,10,0,0,28,0 -22212,4,5,0,8,0,3,14,2,0,1,13,11,1,12,0 -22213,3,1,0,15,6,6,7,4,3,1,2,2,1,40,0 -22214,5,7,0,6,2,4,1,3,4,0,15,2,2,25,0 -22215,9,7,0,7,3,2,10,1,3,0,0,17,3,27,0 -22216,0,6,0,1,1,3,5,3,0,0,17,2,2,14,0 -22217,3,5,0,1,0,4,11,3,0,0,8,13,4,7,0 -22218,3,1,0,12,1,0,0,1,0,1,8,15,0,4,0 -22219,0,1,0,5,1,0,9,1,1,1,8,16,3,23,0 -22220,5,0,0,11,2,0,8,5,4,0,10,4,2,3,0 -22221,0,4,0,7,0,4,9,0,0,0,13,6,0,25,0 -22222,3,5,0,11,0,4,5,3,0,0,3,15,2,14,0 -22223,10,8,0,3,5,2,7,0,0,0,4,2,2,30,0 -22224,0,4,0,4,5,3,6,1,3,1,2,6,1,25,0 -22225,5,8,0,15,1,4,14,0,2,1,3,2,2,4,0 -22226,4,3,0,6,5,2,14,1,0,0,5,12,2,40,1 -22227,5,2,0,13,2,6,11,2,1,0,16,19,0,38,1 -22228,7,5,0,10,6,1,6,5,3,0,5,10,1,37,1 -22229,4,3,0,0,5,0,13,2,2,0,3,11,0,31,0 -22230,6,1,0,4,4,6,13,3,0,1,5,13,3,36,1 -22231,10,6,0,4,0,1,9,4,2,1,8,12,3,33,0 -22232,7,7,0,0,4,5,1,5,2,1,18,10,3,1,1 -22233,10,8,0,4,6,6,14,2,0,0,18,16,5,37,1 -22234,8,3,0,13,0,3,5,2,4,0,3,13,0,33,0 -22235,9,4,0,1,6,6,12,0,4,1,11,16,4,38,0 -22236,10,1,0,2,0,5,7,0,0,0,7,2,0,19,0 -22237,10,3,0,6,1,2,7,5,3,1,17,20,3,21,0 -22238,4,6,0,11,0,0,2,3,2,0,2,2,0,22,0 -22239,3,1,0,8,4,3,13,5,3,1,18,1,0,18,1 -22240,3,1,0,9,1,6,10,2,0,0,12,18,5,26,1 -22241,8,8,0,7,1,1,13,2,3,1,18,3,5,10,1 -22242,2,8,0,3,5,0,5,2,2,1,4,6,5,35,0 -22243,1,2,0,6,1,6,13,2,1,1,17,2,1,0,0 -22244,4,5,0,14,6,3,14,3,3,0,5,17,5,26,1 -22245,2,1,0,4,5,1,7,4,4,1,11,16,5,22,1 -22246,9,0,0,11,6,5,1,5,1,0,3,8,2,19,1 -22247,9,4,0,7,2,5,12,4,3,1,18,12,3,34,1 -22248,5,0,0,4,1,4,10,5,3,1,18,8,0,22,1 -22249,1,8,0,11,2,5,11,4,0,0,13,6,2,12,0 -22250,0,7,0,4,4,5,11,2,1,0,1,8,4,18,1 -22251,0,6,0,12,2,0,5,1,2,0,13,6,2,36,0 
-22252,7,5,0,13,0,4,3,3,3,0,2,20,5,35,0 -22253,3,4,0,7,0,6,9,4,0,0,8,2,2,28,0 -22254,9,4,0,4,1,6,12,3,0,1,0,2,5,25,0 -22255,2,0,0,0,1,1,1,0,4,1,13,9,2,22,0 -22256,2,1,0,5,3,2,6,5,2,0,2,20,4,24,0 -22257,1,2,0,11,0,3,14,2,4,0,13,11,5,16,0 -22258,3,1,0,2,6,5,3,2,1,1,15,2,1,13,0 -22259,0,6,0,6,0,2,12,4,2,0,8,11,2,40,0 -22260,8,3,0,13,6,3,3,4,0,0,2,3,1,20,0 -22261,2,0,0,11,1,2,7,4,4,1,2,6,0,25,0 -22262,10,8,0,11,3,0,1,5,4,1,18,14,3,24,1 -22263,0,0,0,11,3,5,8,3,3,1,13,2,1,39,0 -22264,0,8,0,12,1,3,7,4,3,0,8,17,4,29,0 -22265,10,1,0,7,6,5,7,0,0,0,10,6,0,3,0 -22266,0,5,0,9,5,5,13,1,4,1,13,0,0,24,0 -22267,3,8,0,0,2,4,5,3,1,0,13,16,3,6,0 -22268,0,1,0,11,6,5,14,0,0,1,15,11,5,27,0 -22269,6,8,0,0,1,1,7,4,3,0,18,10,2,11,1 -22270,10,0,0,4,6,6,1,0,3,1,9,7,4,18,1 -22271,3,1,0,10,3,0,4,2,3,1,1,1,4,7,1 -22272,6,8,0,8,5,4,14,1,1,1,11,15,0,35,0 -22273,7,4,0,8,4,4,5,4,2,1,2,7,0,23,0 -22274,9,7,0,1,0,0,1,5,2,1,9,3,1,24,1 -22275,1,4,0,2,0,3,5,2,4,0,2,15,3,25,0 -22276,9,8,0,13,1,3,7,2,4,1,0,7,2,10,1 -22277,7,1,0,1,2,0,5,0,0,0,0,10,1,31,1 -22278,10,2,0,8,3,5,11,2,2,1,18,1,4,19,1 -22279,9,3,0,4,1,6,7,0,3,0,4,2,2,28,0 -22280,0,3,0,0,2,1,2,0,4,1,13,15,2,9,0 -22281,0,2,0,12,1,1,6,2,3,1,5,20,5,21,1 -22282,0,7,0,1,1,5,8,3,4,1,15,4,3,33,0 -22283,2,8,0,5,4,5,14,1,2,0,13,2,3,5,0 -22284,2,7,0,9,2,0,8,1,1,0,5,15,1,8,0 -22285,2,5,0,4,3,6,0,4,1,0,6,11,4,38,0 -22286,10,1,0,11,4,5,0,2,0,0,1,17,3,20,1 -22287,6,3,0,10,6,4,1,0,1,1,12,6,1,18,0 -22288,3,1,0,13,5,4,9,4,0,1,13,13,2,38,0 -22289,3,6,0,5,2,5,12,1,2,1,13,12,1,8,0 -22290,8,6,0,3,3,2,0,0,3,0,18,16,4,16,1 -22291,3,7,0,8,5,6,5,1,3,1,4,16,0,14,0 -22292,4,8,0,1,6,1,11,4,4,0,17,10,3,12,1 -22293,8,0,0,8,4,6,14,1,4,0,18,7,4,29,1 -22294,1,8,0,10,5,1,4,1,1,1,9,8,0,40,1 -22295,2,5,0,14,0,5,3,3,4,1,10,11,0,28,0 -22296,7,4,0,8,5,1,10,0,1,1,18,10,4,27,1 -22297,0,4,0,12,3,2,8,3,3,0,6,0,0,8,0 -22298,5,1,0,3,3,5,0,1,0,1,13,15,0,36,0 -22299,7,8,0,0,2,6,0,4,1,1,0,7,2,41,1 -22300,2,4,0,7,2,5,5,1,3,1,8,2,3,12,0 -22301,0,4,0,8,1,5,2,2,0,0,2,2,5,41,0 
-22302,5,4,0,12,3,0,4,3,3,0,7,16,5,36,0 -22303,0,4,0,5,0,0,8,4,4,0,13,15,5,4,0 -22304,1,7,0,7,4,5,1,3,2,0,17,9,1,29,0 -22305,5,8,0,11,1,0,2,1,2,1,2,0,0,27,0 -22306,7,7,0,6,6,2,3,1,0,1,14,10,3,38,1 -22307,2,6,0,0,2,3,13,1,2,0,2,15,2,6,0 -22308,0,1,0,3,4,4,11,2,2,1,6,18,4,5,0 -22309,6,0,0,1,0,5,8,2,2,1,0,9,5,13,0 -22310,10,7,0,3,5,3,3,4,4,1,13,20,0,32,0 -22311,9,6,0,14,5,0,11,2,0,0,8,4,5,24,0 -22312,6,0,0,5,3,5,7,5,2,1,18,3,5,27,1 -22313,7,3,0,9,4,3,10,2,0,1,5,14,3,23,1 -22314,0,0,0,3,5,6,13,0,0,1,8,11,0,40,0 -22315,7,0,0,10,6,6,12,0,0,0,14,12,3,2,1 -22316,10,3,0,8,5,6,14,5,2,1,9,5,5,0,1 -22317,3,5,0,9,4,1,2,0,2,1,9,0,2,32,1 -22318,2,6,0,7,2,4,0,5,4,0,14,19,2,24,1 -22319,2,7,0,9,2,3,7,4,3,1,1,9,1,18,0 -22320,8,8,0,6,1,6,3,4,2,0,2,9,0,31,0 -22321,3,7,0,4,2,2,14,2,2,1,1,3,1,35,1 -22322,1,7,0,0,0,4,9,2,1,0,13,11,0,9,0 -22323,1,5,0,2,5,1,10,1,2,1,17,14,4,19,1 -22324,8,4,0,12,6,4,12,3,2,0,5,7,5,19,1 -22325,4,3,0,5,1,4,5,3,4,1,8,6,4,0,0 -22326,1,8,0,13,0,6,8,3,3,0,8,9,2,3,0 -22327,0,7,0,13,0,4,0,0,2,0,7,11,2,28,0 -22328,2,4,0,0,4,4,2,4,1,0,4,11,1,12,0 -22329,5,1,0,14,6,5,9,3,4,0,17,11,0,31,0 -22330,3,3,0,4,2,1,11,4,1,0,8,11,3,27,0 -22331,9,2,0,6,2,4,12,5,0,1,13,11,5,23,0 -22332,3,8,0,5,2,6,9,3,3,1,13,11,4,32,0 -22333,0,6,0,7,6,3,1,0,1,1,11,2,2,40,0 -22334,3,6,0,2,5,3,9,1,3,1,2,11,1,38,0 -22335,10,3,0,6,3,5,11,2,4,0,5,10,4,38,1 -22336,10,8,0,15,4,1,0,1,4,0,0,13,3,7,0 -22337,0,0,0,4,0,2,0,4,2,0,13,11,3,0,0 -22338,9,8,0,3,1,5,14,3,0,0,2,9,4,29,0 -22339,1,5,0,5,2,0,9,4,3,1,11,11,3,0,1 -22340,0,6,0,1,6,4,12,2,1,0,8,6,4,6,0 -22341,0,4,0,12,3,2,3,4,2,1,16,11,0,24,0 -22342,1,3,0,13,4,4,7,3,0,1,5,4,5,17,0 -22343,0,2,0,3,0,0,1,1,0,0,13,9,1,29,0 -22344,10,3,0,11,6,4,7,2,0,1,13,6,4,27,0 -22345,6,4,0,5,2,5,0,0,0,1,8,13,3,9,0 -22346,0,4,0,5,6,6,7,3,3,1,6,20,1,23,0 -22347,5,5,0,5,3,1,0,4,0,0,18,7,0,36,1 -22348,9,2,0,14,4,5,9,4,0,0,13,9,1,13,0 -22349,1,5,0,11,0,0,9,4,3,0,2,11,0,27,0 -22350,8,1,0,0,5,3,14,5,1,1,15,5,3,13,1 -22351,2,8,0,10,0,5,11,4,4,1,10,16,5,0,0 
-22352,0,2,0,12,6,6,2,1,2,1,10,15,0,17,0 -22353,4,8,0,12,5,3,9,1,0,0,3,4,3,6,0 -22354,9,7,0,0,3,3,5,0,2,1,15,11,5,23,0 -22355,0,2,0,12,2,4,5,0,0,0,5,2,4,14,0 -22356,7,6,0,10,4,2,14,5,0,1,3,19,2,9,1 -22357,0,3,0,15,5,3,3,2,2,1,4,9,1,28,0 -22358,2,6,0,2,4,0,7,2,0,1,0,20,1,6,0 -22359,4,6,0,10,2,6,6,0,1,1,14,17,5,11,1 -22360,0,6,0,11,3,1,10,1,1,0,8,2,1,39,0 -22361,10,3,0,7,2,1,8,1,0,0,13,20,0,9,0 -22362,0,5,0,2,2,0,8,4,0,0,2,12,0,14,0 -22363,4,6,0,13,4,5,9,2,0,1,10,16,2,3,0 -22364,0,6,0,5,0,4,4,4,3,1,6,11,3,26,0 -22365,1,1,0,15,0,0,1,0,2,1,6,2,1,31,0 -22366,1,0,0,13,0,5,9,4,2,0,8,11,0,13,0 -22367,1,2,0,13,3,0,9,1,3,0,10,14,0,8,0 -22368,2,6,0,13,4,4,12,4,1,0,2,11,1,3,0 -22369,4,8,0,13,1,6,9,5,3,1,8,13,0,34,0 -22370,9,6,0,15,5,2,2,5,4,1,18,14,0,25,1 -22371,2,6,0,2,0,3,8,2,3,0,2,11,5,25,0 -22372,1,8,0,12,0,6,14,2,2,1,13,6,5,4,0 -22373,3,5,0,6,0,5,5,2,2,0,17,5,2,32,0 -22374,2,3,0,14,3,4,10,4,4,0,13,6,0,34,0 -22375,1,2,0,15,1,3,13,1,4,1,8,13,1,35,0 -22376,6,5,0,0,0,4,9,0,0,1,13,15,0,19,0 -22377,10,4,0,6,6,5,5,0,2,1,0,11,1,16,0 -22378,0,7,0,1,0,6,7,1,2,1,13,11,2,0,0 -22379,2,8,0,9,4,4,4,3,3,1,15,2,4,23,0 -22380,9,0,0,3,6,5,10,2,0,0,4,9,0,29,0 -22381,3,8,0,3,1,2,10,2,3,0,5,2,2,32,0 -22382,1,0,0,11,5,2,6,5,4,1,16,7,1,21,1 -22383,0,6,0,14,3,2,5,4,3,1,14,2,2,6,0 -22384,10,4,0,5,6,4,1,3,0,1,15,2,4,0,0 -22385,6,6,0,12,0,2,13,1,1,1,2,11,3,3,0 -22386,1,0,0,8,2,6,12,4,0,1,2,2,0,38,0 -22387,10,7,0,8,0,4,11,3,0,1,17,11,4,26,0 -22388,4,4,0,9,4,6,14,3,4,1,14,18,5,34,1 -22389,2,0,0,0,0,0,7,3,0,0,13,2,5,9,0 -22390,1,6,0,14,6,0,12,2,1,1,5,8,3,17,1 -22391,3,0,0,15,4,5,11,2,3,1,9,5,4,5,1 -22392,9,1,0,14,4,4,2,5,2,1,3,7,5,12,1 -22393,6,2,0,1,5,2,9,5,4,0,9,7,5,41,1 -22394,9,6,0,12,3,5,4,0,2,0,5,7,0,13,1 -22395,10,4,0,10,3,0,4,2,4,1,0,1,5,30,1 -22396,7,4,0,15,0,6,7,1,1,0,17,15,3,1,0 -22397,5,4,0,3,1,5,7,5,3,1,0,2,2,3,0 -22398,8,6,0,11,1,5,12,0,4,1,14,17,3,30,1 -22399,9,6,0,0,5,5,12,3,2,1,4,2,0,30,0 -22400,0,3,0,14,6,3,1,0,4,1,4,4,0,20,0 -22401,0,8,0,13,6,0,11,3,4,0,8,4,3,30,0 
-22402,0,8,0,9,0,5,11,1,0,1,6,12,0,31,0 -22403,10,8,0,2,2,1,12,3,0,0,6,2,0,17,0 -22404,9,6,0,7,1,4,9,0,2,0,11,14,4,38,0 -22405,6,3,0,5,6,2,2,5,3,1,7,7,4,40,1 -22406,0,7,0,8,1,1,0,3,1,1,18,11,3,4,0 -22407,5,7,0,14,1,4,14,4,2,0,2,2,1,28,0 -22408,7,6,0,0,0,1,4,0,4,1,16,10,1,3,1 -22409,1,6,0,12,1,4,12,0,0,1,15,2,4,18,0 -22410,10,0,0,9,1,2,9,0,2,1,11,10,1,9,1 -22411,6,4,0,6,5,5,1,1,1,0,0,2,0,41,0 -22412,7,4,0,2,3,2,7,1,0,1,2,2,1,22,0 -22413,9,4,0,8,3,5,3,0,1,0,11,3,4,21,1 -22414,3,4,0,14,6,4,8,0,1,0,0,20,1,22,0 -22415,9,6,0,6,3,0,0,1,0,1,0,11,2,3,0 -22416,9,2,0,6,0,0,1,1,1,1,15,2,0,14,0 -22417,6,3,0,3,4,0,5,3,0,1,7,11,0,41,0 -22418,6,7,0,13,1,4,14,2,3,1,14,20,3,28,0 -22419,8,0,0,8,6,1,2,5,2,1,17,10,4,39,1 -22420,2,0,0,4,0,5,2,3,2,0,13,15,3,30,0 -22421,9,5,0,14,2,6,7,0,2,0,9,16,2,41,1 -22422,2,4,0,3,1,5,11,1,2,0,6,17,2,4,0 -22423,0,3,0,3,5,3,13,0,4,0,3,15,2,1,0 -22424,10,6,0,12,6,4,9,3,0,0,2,18,4,0,0 -22425,1,7,0,13,3,5,8,4,3,1,15,18,0,25,0 -22426,0,8,0,13,0,5,14,0,2,1,14,0,0,0,0 -22427,6,7,0,2,4,4,10,3,1,0,3,13,2,12,0 -22428,7,0,0,1,6,3,8,3,4,1,13,6,3,18,0 -22429,5,6,0,10,1,0,5,1,2,1,17,6,0,18,0 -22430,0,7,0,15,6,0,3,0,2,0,12,15,0,18,0 -22431,1,7,0,10,5,0,0,2,4,0,0,3,5,38,0 -22432,3,4,0,12,5,4,7,3,0,1,2,3,5,13,0 -22433,3,2,0,13,3,4,0,4,2,0,13,4,0,8,0 -22434,3,3,0,11,5,5,6,3,2,1,4,18,1,9,0 -22435,3,0,0,3,6,5,7,3,1,1,8,4,4,1,0 -22436,8,1,0,15,0,5,7,1,0,0,13,20,5,31,0 -22437,2,6,0,7,1,3,5,3,3,1,10,15,3,6,0 -22438,5,7,0,11,1,5,8,4,1,0,18,7,3,10,1 -22439,7,6,0,0,6,3,7,0,0,0,13,6,5,41,0 -22440,1,5,0,1,2,3,0,2,0,1,6,19,1,29,0 -22441,1,2,0,1,5,6,14,2,0,0,13,9,5,27,0 -22442,10,3,0,1,0,2,7,2,0,0,13,12,2,26,0 -22443,8,4,0,1,2,6,8,2,4,1,17,11,0,35,0 -22444,5,3,0,15,2,2,13,4,0,1,13,2,2,5,0 -22445,3,4,0,15,3,3,5,3,4,0,10,5,1,39,0 -22446,0,5,0,13,3,6,9,5,0,1,17,15,0,31,0 -22447,7,6,0,3,5,0,8,3,0,1,6,17,2,10,0 -22448,1,3,0,11,2,2,13,4,3,1,17,1,2,21,0 -22449,5,6,0,15,6,3,7,0,4,0,2,19,2,25,0 -22450,1,4,0,6,2,3,9,1,2,0,17,2,1,7,0 -22451,6,6,0,3,0,3,6,4,1,0,2,2,4,40,0 
-22452,9,0,0,15,6,4,3,1,2,1,2,6,5,12,0 -22453,5,0,0,5,0,5,9,3,2,0,13,11,0,6,0 -22454,5,3,0,11,2,4,13,3,3,0,2,11,3,7,0 -22455,6,5,0,15,4,6,9,5,2,1,11,7,4,16,1 -22456,2,2,0,12,1,3,8,4,4,0,13,11,0,1,0 -22457,8,3,0,0,0,1,8,1,1,0,2,6,5,23,0 -22458,3,6,0,11,1,4,14,3,1,1,13,20,2,8,0 -22459,3,2,0,7,5,6,5,4,0,0,13,13,4,29,0 -22460,9,6,0,6,4,0,3,3,4,0,8,0,4,6,0 -22461,8,8,0,3,0,0,10,1,2,1,17,6,3,26,0 -22462,2,6,0,3,5,3,11,0,1,1,0,10,3,28,1 -22463,1,4,0,10,6,1,6,0,4,1,6,15,0,3,0 -22464,0,7,0,5,6,4,14,2,4,0,13,9,3,24,0 -22465,3,6,0,4,0,6,6,4,3,0,17,4,0,21,0 -22466,1,2,0,13,3,0,8,5,0,1,10,20,2,11,0 -22467,0,7,0,6,1,3,0,1,1,0,17,6,3,17,0 -22468,5,7,0,4,6,3,11,1,2,0,17,14,1,8,1 -22469,8,6,0,6,1,4,5,1,0,0,2,2,0,4,0 -22470,2,8,0,0,0,1,13,3,3,0,2,19,4,31,0 -22471,8,1,0,11,1,1,8,5,4,1,18,5,4,0,1 -22472,5,5,0,1,4,1,2,0,0,1,18,17,1,26,1 -22473,7,6,0,12,3,3,5,0,2,1,17,7,1,4,0 -22474,6,6,0,3,5,0,14,4,3,0,4,11,2,30,0 -22475,6,8,0,13,2,3,10,4,0,0,13,2,0,16,0 -22476,0,2,0,5,6,6,10,0,0,0,13,4,2,17,0 -22477,10,1,0,13,3,1,9,4,0,1,18,10,0,16,1 -22478,7,8,0,2,2,5,0,5,3,0,2,2,0,23,0 -22479,0,4,0,0,6,2,3,5,4,1,4,6,4,33,0 -22480,1,8,0,4,5,6,2,2,4,1,10,2,4,0,0 -22481,5,1,0,14,0,0,6,4,2,1,2,5,0,27,0 -22482,0,0,0,12,5,6,2,0,1,0,17,6,3,8,0 -22483,6,4,0,8,2,3,3,1,3,1,10,15,2,26,0 -22484,2,6,0,3,5,5,14,1,1,0,10,2,0,9,0 -22485,3,4,0,9,2,0,2,2,1,0,4,14,0,36,0 -22486,10,8,0,12,6,0,0,1,4,0,2,16,1,18,0 -22487,5,2,0,11,1,4,0,4,2,0,8,20,3,38,0 -22488,2,0,0,3,0,1,6,4,4,1,0,11,3,30,0 -22489,4,3,0,4,2,1,11,3,0,1,4,8,5,14,1 -22490,0,2,0,7,6,0,8,4,2,1,12,10,1,37,0 -22491,7,1,0,7,2,2,3,0,4,1,1,5,3,20,1 -22492,4,5,0,8,1,1,10,1,2,1,3,20,0,1,1 -22493,9,2,0,6,0,1,10,0,1,1,18,8,1,33,1 -22494,0,5,0,8,2,5,1,0,3,1,16,12,0,21,0 -22495,2,6,0,7,6,0,12,1,0,0,13,11,2,23,0 -22496,4,0,0,11,0,0,1,1,3,0,13,16,0,3,0 -22497,10,5,0,0,2,5,8,2,1,1,5,3,3,28,1 -22498,5,4,0,3,5,4,14,3,4,1,8,6,5,34,0 -22499,7,0,0,8,3,2,6,1,0,1,15,8,1,7,1 -22500,4,1,0,5,0,5,6,4,1,0,17,19,0,12,0 -22501,1,7,0,15,1,5,10,1,0,1,16,15,1,25,0 
-22502,1,7,0,7,6,6,12,4,4,0,8,8,0,23,0 -22503,5,4,0,2,1,4,5,2,3,1,9,7,5,7,1 -22504,4,3,0,12,3,5,11,1,4,1,18,17,0,31,1 -22505,2,0,0,4,5,6,6,4,0,0,4,2,4,5,0 -22506,3,2,0,6,1,4,14,1,0,1,10,8,2,38,1 -22507,2,3,0,3,0,2,8,4,0,1,13,11,1,33,0 -22508,1,4,0,13,1,5,13,0,0,0,14,17,5,25,1 -22509,9,1,0,10,6,4,7,2,2,1,18,3,3,11,1 -22510,5,5,0,7,5,0,2,0,2,1,18,14,2,14,1 -22511,7,1,0,3,0,6,1,0,0,1,8,2,1,19,0 -22512,9,3,0,12,1,0,8,1,3,1,7,7,2,31,1 -22513,8,2,0,7,6,4,14,0,0,1,18,18,0,33,1 -22514,6,3,0,0,1,2,10,0,3,0,8,2,0,9,0 -22515,0,8,0,12,6,5,12,0,1,0,13,16,0,36,0 -22516,5,1,0,8,5,1,14,0,4,1,9,17,0,19,1 -22517,6,8,0,0,4,1,14,2,2,1,2,11,1,27,0 -22518,0,7,0,12,1,0,11,4,1,1,8,18,0,4,0 -22519,8,0,0,13,6,2,11,4,0,0,9,10,5,12,1 -22520,2,1,0,5,0,4,8,5,3,1,10,11,4,11,0 -22521,4,8,0,8,2,0,12,5,4,1,17,7,4,9,1 -22522,9,4,0,11,1,4,9,5,2,0,12,11,3,21,0 -22523,2,2,0,13,4,4,5,0,0,0,4,15,2,29,0 -22524,3,4,0,4,2,5,4,1,3,1,13,11,2,30,0 -22525,0,6,0,11,2,6,5,2,2,0,13,15,3,20,0 -22526,0,6,0,6,5,2,5,0,0,1,4,13,2,26,0 -22527,1,6,0,12,3,1,7,3,2,0,0,0,1,26,0 -22528,0,0,0,6,6,5,5,0,3,1,12,17,4,22,1 -22529,3,5,0,9,6,3,4,0,2,1,14,11,1,22,0 -22530,8,7,0,8,5,3,10,1,0,0,9,7,3,30,1 -22531,9,3,0,4,0,4,1,3,4,0,15,11,1,20,0 -22532,9,0,0,8,2,6,3,2,4,1,17,11,1,25,0 -22533,1,2,0,2,4,4,11,1,2,1,9,5,5,27,1 -22534,10,4,0,13,6,0,6,1,2,0,2,18,4,28,0 -22535,4,6,0,11,3,2,14,3,2,1,11,3,2,24,1 -22536,2,5,0,0,3,3,13,3,0,1,2,15,0,5,0 -22537,2,2,0,5,3,4,9,0,0,1,4,13,2,10,0 -22538,5,4,0,11,4,1,2,1,1,1,5,8,5,18,1 -22539,1,8,0,6,5,0,14,1,1,1,8,20,3,33,0 -22540,3,6,0,13,3,4,2,4,4,0,8,14,3,26,0 -22541,9,8,0,1,1,2,6,3,4,1,10,7,4,35,1 -22542,7,8,0,3,1,3,11,0,3,1,7,10,5,30,1 -22543,6,2,0,5,6,4,8,1,0,1,1,10,5,11,1 -22544,6,6,0,6,2,6,12,2,2,0,7,20,4,13,1 -22545,1,6,0,13,6,3,3,0,4,1,11,9,4,23,0 -22546,2,1,0,1,4,0,2,5,4,0,18,7,3,33,1 -22547,2,2,0,11,0,1,14,2,3,0,4,2,1,13,0 -22548,0,1,0,0,2,2,6,4,4,1,6,15,0,34,0 -22549,4,2,0,5,2,1,11,3,3,1,8,8,4,8,0 -22550,0,0,0,15,0,4,7,3,3,1,13,9,2,17,0 -22551,5,3,0,12,2,5,10,5,3,0,16,7,3,5,1 
-22552,0,3,0,8,0,1,5,3,1,0,17,11,0,4,0 -22553,2,7,0,12,4,1,14,2,3,1,14,20,3,35,1 -22554,3,8,0,7,0,1,6,5,3,1,13,12,1,34,0 -22555,0,4,0,13,6,4,9,4,3,0,8,2,3,12,0 -22556,5,2,0,3,3,0,3,0,2,1,16,9,1,33,0 -22557,2,0,0,2,5,5,1,1,4,0,17,4,2,10,0 -22558,10,3,0,1,6,4,11,1,1,1,18,7,5,3,1 -22559,7,8,0,6,0,5,6,1,3,0,8,11,5,14,0 -22560,2,3,0,1,0,5,7,2,0,1,13,4,3,6,0 -22561,2,2,0,10,6,2,11,2,1,1,8,6,1,1,0 -22562,1,2,0,9,5,3,8,5,2,0,2,0,4,34,0 -22563,0,6,0,13,5,6,9,5,3,0,15,0,3,22,0 -22564,7,5,0,10,1,5,1,2,1,0,5,10,4,29,1 -22565,0,7,0,13,6,0,8,0,2,1,1,11,2,3,0 -22566,0,5,0,1,6,5,9,2,0,1,2,5,0,31,0 -22567,9,3,0,10,2,2,6,5,1,0,9,2,5,24,1 -22568,6,2,0,13,1,3,10,0,2,1,8,0,0,18,0 -22569,9,2,0,6,3,5,14,4,0,1,6,8,1,29,0 -22570,1,2,0,13,5,6,8,3,0,0,3,6,5,26,0 -22571,0,3,0,6,2,0,8,4,3,1,13,9,1,19,0 -22572,5,8,0,0,6,2,9,5,0,0,15,15,1,36,0 -22573,1,5,0,14,0,3,3,5,0,1,17,13,0,26,0 -22574,0,0,0,12,4,5,8,2,0,1,17,18,3,37,0 -22575,0,1,0,14,4,3,8,5,2,1,13,2,3,32,0 -22576,6,7,0,15,3,6,12,5,3,1,18,7,0,3,1 -22577,4,8,0,9,5,0,13,0,0,1,18,6,1,37,0 -22578,6,8,0,13,6,6,9,3,3,1,4,16,2,33,0 -22579,5,6,0,3,5,4,4,1,1,0,2,11,5,11,0 -22580,10,1,0,14,6,5,6,2,1,0,8,0,2,36,0 -22581,0,7,0,6,0,3,6,3,1,1,15,11,1,39,0 -22582,0,5,0,7,5,3,13,0,0,1,8,18,0,27,0 -22583,8,4,0,6,5,5,8,2,0,1,8,20,0,7,0 -22584,7,6,0,10,3,1,12,2,3,0,9,12,2,11,1 -22585,8,2,0,9,0,2,1,5,1,1,13,18,2,6,0 -22586,4,1,0,4,2,3,6,3,0,0,14,18,3,0,0 -22587,9,4,0,8,2,1,3,0,3,1,11,7,3,24,1 -22588,3,3,0,11,1,0,3,5,1,1,9,14,3,20,1 -22589,1,7,0,12,5,6,14,3,0,0,15,2,2,28,0 -22590,9,2,0,15,0,6,0,0,2,1,11,9,1,29,0 -22591,8,7,0,4,3,5,12,5,2,1,17,1,4,31,1 -22592,5,8,0,5,3,1,13,3,4,0,0,6,2,36,0 -22593,5,3,0,3,0,5,6,1,0,1,0,11,0,25,0 -22594,3,8,0,6,0,4,6,1,2,0,13,7,0,41,0 -22595,3,6,0,6,6,0,9,3,2,1,15,13,2,29,0 -22596,4,7,0,4,2,5,0,3,2,0,17,11,5,34,0 -22597,6,2,0,3,4,5,9,5,3,0,8,2,4,16,0 -22598,5,7,0,3,0,3,5,3,1,0,13,8,3,20,0 -22599,10,2,0,12,0,2,4,4,3,1,16,7,2,21,1 -22600,7,2,0,3,2,6,9,4,1,1,13,15,0,26,0 -22601,7,1,0,15,6,1,0,3,3,1,4,8,4,12,1 
-22602,10,5,0,2,0,5,3,0,3,0,12,17,3,38,1 -22603,2,4,0,8,2,0,8,4,0,1,4,13,4,4,0 -22604,6,7,0,15,6,2,3,5,0,0,14,7,5,14,1 -22605,0,4,0,13,4,5,0,0,1,1,2,9,4,30,0 -22606,0,2,0,6,0,0,5,0,0,0,14,15,4,34,0 -22607,10,1,0,8,2,0,9,4,3,1,0,19,1,25,0 -22608,3,8,0,6,0,3,5,0,1,1,2,6,2,7,0 -22609,3,5,0,12,0,0,4,0,3,0,18,12,0,11,1 -22610,9,8,0,11,2,6,12,4,1,1,18,19,4,28,1 -22611,10,1,0,7,1,1,10,3,2,0,7,17,4,36,1 -22612,5,5,0,9,6,0,9,0,0,1,18,7,2,4,1 -22613,0,6,0,4,4,1,14,5,3,1,13,4,3,8,0 -22614,9,4,0,4,2,1,11,1,2,1,5,11,5,13,0 -22615,8,2,0,6,3,2,5,0,3,1,12,11,3,6,0 -22616,0,2,0,13,6,5,5,4,1,1,13,12,1,7,0 -22617,1,4,0,13,0,3,6,1,0,1,0,18,2,19,0 -22618,6,1,0,11,3,2,11,5,2,1,1,7,3,36,1 -22619,10,3,0,6,3,6,9,5,3,0,11,10,1,28,1 -22620,4,7,0,1,2,2,3,4,4,1,7,8,4,21,1 -22621,9,5,0,14,3,1,13,0,3,0,18,7,0,36,1 -22622,3,5,0,6,1,4,3,1,0,1,8,2,3,36,0 -22623,2,1,0,1,0,0,11,4,0,1,3,16,4,40,0 -22624,6,2,0,7,3,2,14,4,3,1,15,20,4,22,0 -22625,8,3,0,12,5,1,8,2,0,0,6,13,1,22,0 -22626,10,4,0,0,5,0,6,1,1,0,4,11,5,21,0 -22627,7,8,0,11,0,3,10,3,1,1,8,7,3,32,1 -22628,2,3,0,13,5,6,6,1,3,0,0,15,4,5,0 -22629,6,8,0,0,4,5,9,4,3,1,6,20,5,10,1 -22630,3,8,0,0,3,6,5,3,0,0,13,6,4,1,0 -22631,1,2,0,1,6,4,5,4,1,0,17,8,3,13,0 -22632,0,8,0,15,4,4,6,0,4,1,6,11,2,28,0 -22633,6,2,0,2,0,0,7,1,3,1,0,8,5,8,0 -22634,6,1,0,9,2,4,8,0,0,0,9,9,5,2,1 -22635,6,4,0,14,2,5,7,3,0,1,13,9,4,12,0 -22636,2,3,0,11,0,5,5,3,0,0,2,15,0,12,0 -22637,5,6,0,12,2,6,9,3,0,1,8,10,0,24,0 -22638,6,6,0,5,0,6,7,1,1,1,10,1,2,35,0 -22639,8,3,0,9,0,6,4,2,3,0,13,6,0,23,0 -22640,2,7,0,0,6,0,2,4,3,1,8,2,0,35,0 -22641,7,8,0,4,2,6,7,0,3,1,8,9,1,9,0 -22642,7,0,0,15,1,1,3,0,0,1,14,13,5,20,1 -22643,10,2,0,15,0,6,5,5,0,1,1,11,3,14,0 -22644,5,2,0,6,4,4,5,3,2,0,0,11,1,17,0 -22645,0,2,0,3,2,4,7,5,1,1,8,13,4,14,0 -22646,0,0,0,5,6,1,0,4,4,0,13,13,2,7,0 -22647,6,7,0,0,0,4,9,4,1,1,8,20,0,37,0 -22648,1,6,0,6,5,2,9,1,0,0,2,15,4,35,0 -22649,0,1,0,15,0,6,2,2,2,0,10,13,0,39,0 -22650,1,3,0,8,5,4,11,3,3,0,13,15,0,18,0 -22651,0,3,0,10,1,0,3,0,1,0,8,2,2,28,0 
-22652,3,4,0,6,5,3,10,0,0,0,10,15,5,13,0 -22653,1,2,0,1,5,5,6,3,2,0,2,13,0,14,0 -22654,10,2,0,13,5,0,1,0,1,1,8,11,4,16,0 -22655,1,1,0,11,4,0,13,2,2,0,2,13,5,17,0 -22656,1,5,0,0,5,2,7,2,3,1,0,16,3,10,0 -22657,8,1,0,7,1,6,5,2,1,0,13,15,2,2,0 -22658,1,1,0,5,5,6,6,0,1,0,2,15,4,19,0 -22659,3,7,0,7,4,3,6,3,1,1,15,11,5,13,0 -22660,7,5,0,15,1,5,8,4,4,1,15,20,3,0,0 -22661,0,3,0,8,5,6,14,4,4,0,15,15,1,25,0 -22662,0,8,0,10,0,0,0,3,2,0,6,18,2,40,0 -22663,6,2,0,8,5,1,11,0,4,1,0,17,4,13,1 -22664,8,3,0,12,3,3,14,0,0,1,18,6,2,1,1 -22665,1,7,0,10,3,6,0,1,4,0,7,7,2,29,0 -22666,8,3,0,1,6,5,3,4,0,1,13,5,2,33,1 -22667,9,4,0,1,2,3,0,4,0,0,18,18,2,8,0 -22668,0,7,0,5,0,6,12,2,0,1,13,20,2,3,0 -22669,0,2,0,0,1,5,5,4,0,0,10,11,2,3,0 -22670,0,4,0,3,2,1,3,3,0,0,16,0,5,23,0 -22671,8,8,0,2,1,3,7,3,2,0,7,18,1,9,0 -22672,0,6,0,10,2,0,14,5,3,1,3,7,1,41,1 -22673,4,3,0,7,3,3,7,0,0,0,4,18,1,33,0 -22674,1,4,0,8,6,1,11,5,3,1,0,6,5,31,1 -22675,5,6,0,6,2,4,8,5,2,0,18,1,2,35,1 -22676,10,8,0,15,5,6,12,3,2,1,8,20,0,21,0 -22677,3,8,0,4,1,6,8,0,4,1,13,2,5,27,0 -22678,0,2,0,9,0,5,13,1,1,1,11,12,2,17,0 -22679,3,7,0,15,5,4,7,1,1,1,2,20,0,26,0 -22680,5,5,0,4,6,6,14,2,4,0,15,7,2,26,1 -22681,0,8,0,1,0,6,3,3,1,1,2,18,2,6,0 -22682,1,3,0,10,0,2,4,4,0,0,11,6,3,39,0 -22683,0,4,0,6,0,5,10,0,0,1,6,11,1,38,0 -22684,1,5,0,13,5,4,8,3,2,1,13,17,2,10,0 -22685,1,3,0,0,1,6,11,4,1,1,13,6,0,19,0 -22686,0,0,0,11,6,5,11,2,3,1,8,11,4,31,0 -22687,10,2,0,13,6,0,13,0,2,1,8,6,0,17,0 -22688,7,1,0,13,1,0,3,3,2,1,8,3,3,4,0 -22689,10,1,0,1,2,0,0,3,1,1,8,11,4,10,0 -22690,8,8,0,5,4,5,0,0,2,1,5,3,3,36,1 -22691,7,4,0,12,3,1,6,4,4,0,9,13,1,13,1 -22692,1,8,0,1,2,5,5,0,1,1,6,0,3,37,1 -22693,7,6,0,5,3,1,11,1,0,0,14,17,2,39,1 -22694,5,0,0,7,5,1,0,4,0,0,13,11,0,36,0 -22695,2,7,0,7,6,0,8,5,2,1,2,17,1,20,0 -22696,8,2,0,10,1,5,14,4,1,0,16,1,2,38,0 -22697,0,7,0,14,3,6,11,2,0,1,17,16,2,12,0 -22698,1,0,0,14,2,5,6,3,2,1,2,9,1,5,0 -22699,6,0,0,7,0,0,5,3,3,0,2,2,1,17,0 -22700,1,5,0,15,0,5,2,4,4,0,13,10,0,16,0 -22701,1,7,0,8,6,0,2,2,1,1,2,18,1,0,0 
-22702,9,3,0,12,1,4,4,5,4,0,18,9,4,40,1 -22703,2,1,0,4,2,3,9,3,4,1,11,9,0,14,0 -22704,7,6,0,4,2,0,4,0,1,0,4,6,3,3,0 -22705,1,7,0,2,3,1,3,5,1,1,10,0,4,38,0 -22706,5,2,0,12,4,5,0,1,0,0,9,16,5,25,1 -22707,2,2,0,10,1,6,4,0,2,0,18,7,2,27,1 -22708,1,4,0,1,5,6,2,2,0,0,13,13,0,7,0 -22709,3,7,0,7,2,0,7,2,2,1,2,1,3,37,0 -22710,9,4,0,6,0,6,8,2,1,0,8,19,4,29,0 -22711,2,6,0,11,6,2,10,3,1,0,15,2,1,18,0 -22712,3,4,0,6,6,2,5,5,1,1,3,1,3,20,1 -22713,4,2,0,6,3,6,0,3,4,1,0,18,4,36,0 -22714,9,7,0,9,5,0,3,5,3,0,14,4,5,26,0 -22715,4,7,0,4,5,1,10,2,2,0,17,6,5,37,0 -22716,9,2,0,4,5,1,8,0,3,1,18,7,3,24,1 -22717,5,4,0,6,4,1,14,1,1,1,1,20,4,30,1 -22718,9,4,0,11,1,5,5,0,1,0,14,11,5,3,0 -22719,3,4,0,1,2,5,10,3,2,1,16,16,5,14,0 -22720,10,2,0,13,3,0,4,3,0,0,17,4,2,32,0 -22721,5,2,0,1,3,1,8,0,3,0,2,9,4,23,0 -22722,1,1,0,4,2,2,9,4,0,0,1,13,1,36,0 -22723,0,3,0,15,5,5,6,0,1,1,2,20,1,10,0 -22724,3,0,0,14,2,3,4,4,2,1,8,2,1,41,0 -22725,2,4,0,13,0,3,12,3,1,1,13,6,0,37,0 -22726,10,0,0,7,1,5,8,0,0,1,13,15,5,11,0 -22727,6,7,0,15,5,1,11,3,1,1,5,14,2,21,1 -22728,8,8,0,0,2,1,14,1,0,0,6,6,3,41,0 -22729,1,5,0,0,0,3,1,4,0,1,8,0,1,34,0 -22730,6,8,0,9,1,6,0,2,1,0,13,2,5,27,0 -22731,8,6,0,12,0,5,14,3,1,0,13,12,2,22,0 -22732,2,2,0,14,0,0,4,3,3,0,8,13,1,5,0 -22733,6,5,0,3,1,2,9,0,4,0,2,20,4,6,0 -22734,5,5,0,6,4,5,3,5,3,1,18,9,1,1,1 -22735,4,7,0,9,5,1,4,1,1,1,11,19,3,23,1 -22736,2,3,0,2,0,2,9,3,0,0,15,11,1,34,0 -22737,4,7,0,7,0,2,3,0,1,1,9,15,4,1,1 -22738,4,5,0,12,2,3,12,3,1,0,2,9,1,24,0 -22739,4,2,0,14,0,5,8,1,1,1,8,18,1,26,0 -22740,1,6,0,1,1,3,6,4,0,1,2,2,1,9,0 -22741,10,3,0,0,3,1,4,3,1,1,9,7,2,9,1 -22742,4,1,0,9,2,1,14,5,4,1,18,5,3,28,1 -22743,2,7,0,9,1,4,14,0,2,1,13,11,0,27,0 -22744,9,7,0,7,5,1,7,4,1,1,5,7,5,12,1 -22745,1,4,0,7,5,4,0,1,2,1,10,0,5,6,0 -22746,1,8,0,4,4,6,5,0,3,0,10,2,4,20,0 -22747,0,4,0,8,0,3,5,4,0,0,0,2,2,20,0 -22748,6,1,0,11,0,6,9,1,4,0,10,2,3,38,0 -22749,9,8,0,3,1,5,10,0,1,1,12,0,4,16,0 -22750,0,6,0,8,1,4,8,3,1,1,6,2,5,19,0 -22751,0,8,0,11,6,5,11,4,0,1,15,15,2,13,0 
-22752,5,1,0,7,2,5,1,1,3,1,4,1,1,23,1 -22753,4,0,0,2,5,0,6,3,3,1,0,4,1,9,0 -22754,5,8,0,1,5,1,13,1,2,1,7,5,2,37,1 -22755,8,6,0,5,6,5,5,5,2,0,8,18,3,20,0 -22756,0,8,0,11,6,3,5,0,2,1,10,13,0,29,0 -22757,2,6,0,15,6,6,6,3,1,1,8,13,0,0,0 -22758,0,6,0,15,0,0,4,3,2,0,13,19,5,3,0 -22759,3,3,0,0,2,1,1,3,4,1,7,11,0,5,0 -22760,1,4,0,15,0,6,9,0,2,1,14,7,2,17,1 -22761,10,1,0,10,1,1,9,4,0,1,12,5,3,22,1 -22762,8,3,0,12,2,2,10,5,2,1,6,5,5,39,1 -22763,2,2,0,4,1,3,9,5,1,1,8,18,0,10,0 -22764,0,5,0,10,0,4,12,4,1,1,4,11,0,32,0 -22765,2,1,0,1,0,0,2,0,1,0,8,9,0,22,0 -22766,1,3,0,4,1,0,13,3,4,0,13,9,1,9,0 -22767,9,0,0,4,4,2,9,3,2,0,13,12,0,34,0 -22768,2,2,0,12,3,5,10,4,3,1,15,2,5,25,0 -22769,7,6,0,8,6,1,1,4,1,1,10,13,1,4,0 -22770,7,1,0,3,4,2,4,5,4,1,14,10,4,10,1 -22771,2,7,0,12,2,1,14,1,2,1,10,12,3,12,0 -22772,1,7,0,3,1,4,12,4,1,1,7,9,2,29,0 -22773,4,3,0,13,0,1,8,1,3,1,2,1,1,31,0 -22774,5,5,0,1,1,5,3,3,2,0,17,12,1,27,0 -22775,3,7,0,8,2,4,5,1,2,0,17,12,0,5,0 -22776,1,6,0,12,6,3,0,4,3,1,10,15,0,39,0 -22777,8,8,0,8,2,2,1,5,3,1,9,8,5,38,1 -22778,6,5,0,10,1,4,13,1,3,1,18,17,3,31,1 -22779,0,7,0,13,6,3,14,2,1,0,10,4,2,37,0 -22780,4,7,0,15,0,0,12,5,3,0,2,13,0,5,0 -22781,5,4,0,5,0,1,10,2,4,1,6,13,2,6,0 -22782,0,7,0,11,6,0,6,2,3,0,0,9,0,21,0 -22783,9,3,0,12,2,5,2,3,0,1,13,18,3,28,0 -22784,4,5,0,6,4,4,2,1,2,0,12,11,1,9,0 -22785,7,4,0,6,4,5,7,1,1,0,18,16,4,23,1 -22786,0,7,0,8,4,5,13,3,4,0,2,15,0,4,0 -22787,6,3,0,4,0,0,10,3,4,1,8,9,5,6,0 -22788,7,1,0,2,1,6,8,0,3,1,7,10,4,16,1 -22789,4,2,0,11,5,2,3,0,2,0,14,3,3,19,0 -22790,7,0,0,11,0,1,1,2,3,1,8,11,2,8,0 -22791,6,1,0,6,6,1,14,2,2,1,14,7,1,3,1 -22792,2,1,0,8,0,0,10,2,3,0,13,11,3,17,0 -22793,0,2,0,9,6,0,0,4,1,0,12,2,4,41,0 -22794,3,7,0,11,6,4,10,1,1,1,16,2,3,13,0 -22795,4,2,0,7,6,5,14,2,2,0,15,20,5,18,0 -22796,2,8,0,9,5,4,9,5,2,1,2,2,0,21,0 -22797,7,8,0,7,2,4,5,1,1,1,2,15,1,10,0 -22798,4,8,0,10,6,5,4,0,2,0,18,2,4,16,1 -22799,7,1,0,0,1,2,7,5,1,0,18,7,5,18,1 -22800,10,1,0,5,5,4,0,1,3,0,2,9,2,6,0 -22801,7,5,0,6,3,2,11,0,0,0,5,5,4,16,1 
-22802,0,4,0,3,2,5,1,3,2,1,16,15,4,17,0 -22803,7,5,0,11,3,0,0,5,1,0,5,7,1,25,1 -22804,7,2,0,4,2,1,0,5,0,1,6,18,1,25,0 -22805,2,3,0,10,4,6,10,0,1,0,6,5,0,26,1 -22806,9,6,0,15,0,1,14,0,2,1,2,13,2,30,0 -22807,7,7,0,15,2,4,10,3,2,0,4,12,3,32,0 -22808,2,2,0,13,4,2,6,4,3,0,4,15,0,26,0 -22809,7,5,0,14,1,2,11,0,0,0,0,19,1,8,1 -22810,4,5,0,9,2,6,5,1,0,1,8,2,5,16,0 -22811,6,7,0,15,2,2,6,5,0,0,15,13,1,5,0 -22812,5,5,0,15,0,4,6,3,1,0,12,4,0,36,0 -22813,0,3,0,5,2,1,1,0,2,0,13,14,3,1,0 -22814,1,8,0,10,3,3,7,3,0,0,2,3,5,22,0 -22815,1,2,0,6,2,1,12,2,0,1,17,15,1,17,0 -22816,10,3,0,0,2,4,14,3,1,0,2,10,3,35,0 -22817,0,3,0,1,2,3,2,3,4,1,8,14,4,38,0 -22818,5,6,0,2,6,4,8,0,1,0,10,2,0,9,0 -22819,5,1,0,13,5,4,9,1,2,0,0,11,0,39,0 -22820,1,6,0,3,0,3,13,4,4,1,4,13,1,9,0 -22821,6,3,0,13,3,3,12,4,4,1,4,9,0,2,0 -22822,0,6,0,7,2,6,11,2,0,1,10,13,3,23,1 -22823,6,6,0,13,6,2,1,2,0,0,14,0,0,37,0 -22824,3,2,0,6,0,2,2,3,0,0,0,6,0,8,0 -22825,0,8,0,3,0,3,14,4,0,1,15,20,4,0,0 -22826,2,7,0,1,3,5,13,2,0,0,1,13,4,4,0 -22827,9,6,0,13,3,4,5,5,0,0,5,7,0,18,1 -22828,0,6,0,8,1,4,12,1,4,1,8,18,4,4,0 -22829,5,0,0,14,2,0,12,5,1,1,18,19,3,25,1 -22830,8,5,0,8,2,1,12,0,2,0,5,1,4,40,1 -22831,3,7,0,0,5,4,8,4,3,0,1,2,2,38,0 -22832,9,0,0,15,0,3,14,1,4,1,16,0,5,28,0 -22833,8,5,0,12,1,2,8,5,4,0,18,7,1,1,1 -22834,0,2,0,7,0,6,14,5,3,1,3,1,4,8,0 -22835,1,5,0,2,2,2,14,4,4,1,0,2,4,17,0 -22836,7,5,0,12,0,6,0,3,4,0,11,2,1,34,0 -22837,2,3,0,8,0,6,5,3,3,1,4,13,2,16,0 -22838,4,4,0,13,2,4,5,4,2,1,17,13,0,0,0 -22839,2,2,0,10,6,0,9,0,0,0,0,15,3,19,0 -22840,2,8,0,7,1,5,5,1,0,0,2,15,1,31,0 -22841,3,4,0,14,1,6,10,3,3,1,16,7,2,31,1 -22842,10,8,0,5,6,3,0,4,4,0,6,2,0,23,0 -22843,1,7,0,9,1,0,4,1,1,1,0,11,2,22,0 -22844,0,8,0,5,0,0,5,4,0,0,2,11,5,14,0 -22845,3,1,0,3,0,3,6,3,0,1,2,19,3,22,0 -22846,1,7,0,13,2,0,5,3,0,1,13,11,2,14,0 -22847,6,3,0,11,5,2,0,3,0,0,13,18,2,20,0 -22848,8,3,0,11,5,4,11,0,2,1,9,8,5,36,1 -22849,5,8,0,9,5,2,11,5,2,1,5,18,0,9,1 -22850,9,7,0,5,1,1,14,4,0,1,2,2,4,23,0 -22851,0,3,0,14,6,3,14,4,3,1,13,11,3,19,0 
-22852,0,2,0,15,0,4,10,3,0,1,13,2,5,30,0 -22853,8,7,0,12,5,4,13,5,0,1,14,5,1,3,1 -22854,4,5,0,5,1,4,8,4,3,0,5,14,1,33,0 -22855,2,3,0,7,4,3,4,4,4,1,10,2,3,35,0 -22856,3,2,0,14,2,6,5,3,3,1,1,4,4,24,1 -22857,6,6,0,6,6,6,3,3,0,1,6,6,2,22,0 -22858,1,2,0,2,0,6,1,2,3,1,13,11,5,39,0 -22859,10,8,0,4,4,2,14,2,0,1,18,10,5,34,1 -22860,6,0,0,2,5,0,9,3,4,0,5,20,5,7,0 -22861,0,6,0,0,2,3,7,4,3,1,4,20,2,5,0 -22862,10,4,0,6,0,5,8,2,4,0,13,11,0,25,0 -22863,4,0,0,13,0,6,8,0,3,0,12,13,2,24,0 -22864,7,0,0,2,4,5,13,0,4,1,2,6,2,13,0 -22865,3,7,0,13,6,4,0,1,3,1,2,20,0,3,0 -22866,5,7,0,11,6,2,4,3,3,1,1,5,4,33,1 -22867,6,8,0,5,0,5,6,1,0,0,13,16,3,0,0 -22868,1,3,0,3,2,6,3,3,3,0,0,15,0,6,0 -22869,0,4,0,1,0,1,14,0,3,1,18,11,0,34,0 -22870,4,7,0,8,3,2,12,2,2,1,8,9,5,32,0 -22871,4,3,0,11,4,4,13,5,3,1,15,8,4,33,1 -22872,5,6,0,5,5,2,0,3,2,0,11,9,0,34,0 -22873,2,8,0,13,0,2,0,4,2,0,2,6,0,12,0 -22874,6,1,0,14,1,3,13,3,0,0,2,4,0,38,0 -22875,1,6,0,12,6,6,5,2,0,0,2,2,4,14,0 -22876,8,4,0,14,2,2,5,1,2,1,1,8,0,30,1 -22877,1,7,0,3,3,5,14,3,4,0,6,9,4,29,0 -22878,5,8,0,4,2,3,12,1,3,1,13,13,3,17,0 -22879,6,0,0,3,3,5,6,1,4,0,8,15,2,32,0 -22880,6,1,0,3,1,2,3,1,4,1,18,10,2,3,1 -22881,6,8,0,0,2,3,14,4,1,1,4,15,5,40,0 -22882,1,2,0,7,0,3,0,5,2,1,17,11,2,2,0 -22883,9,4,0,4,4,4,8,4,2,0,4,4,2,12,0 -22884,2,5,0,9,4,5,2,4,3,0,7,6,4,4,0 -22885,2,4,0,0,2,1,7,2,3,1,18,6,0,38,0 -22886,0,5,0,2,1,5,3,2,1,1,13,2,4,24,0 -22887,10,3,0,4,6,0,5,0,2,1,2,3,1,3,0 -22888,7,3,0,0,4,6,14,4,0,0,12,1,0,7,0 -22889,1,0,0,5,4,3,2,3,0,1,11,14,2,31,1 -22890,6,1,0,11,0,3,13,1,0,0,13,9,0,24,0 -22891,0,6,0,9,2,1,5,2,0,1,17,2,5,24,0 -22892,2,7,0,11,2,0,2,3,0,0,17,8,0,39,0 -22893,2,5,0,5,1,4,5,1,1,0,2,9,0,30,0 -22894,3,1,0,3,3,2,5,3,0,0,17,9,1,3,0 -22895,8,4,0,0,1,0,9,2,4,1,12,19,0,32,0 -22896,9,0,0,7,4,5,11,0,1,1,18,14,2,38,1 -22897,1,7,0,8,6,5,3,0,3,1,1,1,4,1,1 -22898,2,8,0,1,6,5,12,1,0,0,17,5,1,38,0 -22899,10,5,0,15,6,1,1,0,3,0,5,18,4,28,1 -22900,5,7,0,1,0,2,11,0,4,1,18,5,0,8,1 -22901,6,3,0,15,6,1,13,0,0,1,14,17,5,25,1 
-22902,3,0,0,15,3,6,6,5,3,0,8,13,3,7,0 -22903,2,3,0,3,2,4,8,1,0,1,17,13,2,41,0 -22904,4,6,0,6,2,6,3,3,0,1,10,2,5,6,0 -22905,1,7,0,13,3,5,3,4,2,1,8,4,0,21,0 -22906,10,1,0,11,4,5,2,5,2,1,13,2,2,10,0 -22907,6,5,0,10,6,5,5,0,1,1,4,17,5,39,0 -22908,2,2,0,6,5,0,8,1,3,0,17,2,3,29,0 -22909,2,0,0,15,2,1,3,4,3,1,13,13,2,13,0 -22910,1,1,0,5,5,0,12,3,3,0,2,2,3,1,0 -22911,1,6,0,1,0,4,8,2,4,0,1,13,0,20,0 -22912,5,4,0,6,5,5,5,3,0,0,13,13,5,28,0 -22913,2,2,0,8,0,0,4,5,3,0,8,2,0,8,0 -22914,9,8,0,7,2,2,6,0,1,0,13,0,3,22,0 -22915,9,4,0,3,0,5,7,1,4,0,10,7,5,29,1 -22916,9,2,0,6,0,5,1,4,2,1,2,16,4,21,0 -22917,6,7,0,12,3,5,14,1,1,1,1,7,4,30,1 -22918,2,1,0,11,1,2,13,2,2,0,17,0,0,40,0 -22919,0,3,0,12,2,3,7,4,1,1,8,12,1,40,0 -22920,1,6,0,10,4,1,2,3,0,1,18,7,3,8,1 -22921,10,6,0,1,2,2,14,1,3,0,16,11,1,37,0 -22922,2,2,0,12,6,1,11,5,2,0,6,16,2,22,1 -22923,8,5,0,2,0,5,14,5,0,0,17,9,5,1,0 -22924,9,7,0,3,3,1,4,4,0,1,3,0,5,9,1 -22925,9,6,0,11,0,1,14,1,3,0,7,11,0,18,0 -22926,0,1,0,8,3,3,3,5,1,0,12,20,2,13,0 -22927,6,4,0,7,4,1,11,2,0,1,15,8,0,20,1 -22928,7,3,0,4,6,1,11,4,2,1,4,14,5,24,1 -22929,0,8,0,5,6,6,0,1,1,0,15,2,4,10,0 -22930,2,8,0,1,2,6,7,1,3,0,17,9,3,24,0 -22931,8,3,0,8,1,0,2,5,4,1,2,2,2,2,0 -22932,1,3,0,13,3,2,0,2,1,1,1,2,4,0,0 -22933,7,6,0,4,4,1,6,3,2,1,11,8,2,32,1 -22934,7,4,0,1,5,6,12,3,0,0,17,15,5,17,0 -22935,4,5,0,1,0,5,9,2,1,0,8,4,1,2,0 -22936,9,5,0,0,1,1,7,5,0,1,8,6,1,10,0 -22937,1,8,0,4,0,3,12,3,3,1,2,3,1,30,0 -22938,6,6,0,9,0,5,6,5,4,0,2,15,3,24,0 -22939,2,3,0,4,4,5,14,1,1,1,6,15,3,17,0 -22940,10,3,0,6,6,6,2,5,1,1,18,20,5,2,1 -22941,2,7,0,1,3,5,10,3,4,0,2,16,5,9,0 -22942,4,8,0,11,0,3,0,1,2,1,2,18,4,7,0 -22943,0,7,0,5,0,2,8,2,2,1,8,12,0,39,0 -22944,8,7,0,3,6,2,2,0,1,1,16,6,3,13,0 -22945,4,2,0,9,3,5,14,3,1,1,4,0,3,16,0 -22946,6,7,0,2,3,1,2,4,1,1,5,5,4,9,1 -22947,7,2,0,15,3,0,10,2,1,0,8,9,5,40,0 -22948,2,8,0,14,3,3,10,4,4,1,3,19,2,16,0 -22949,0,6,0,12,2,1,3,0,4,0,13,2,0,22,0 -22950,10,5,0,2,5,2,11,0,2,1,18,17,1,37,1 -22951,1,3,0,3,5,4,9,4,3,0,2,8,5,7,0 
-22952,3,1,0,11,1,0,3,0,4,0,17,0,4,21,0 -22953,8,8,0,0,6,2,14,5,4,1,18,19,1,12,1 -22954,7,1,0,1,5,3,4,1,3,1,18,13,0,5,1 -22955,2,6,0,12,0,2,8,3,3,1,4,9,0,20,0 -22956,0,0,0,11,2,6,8,3,0,0,17,20,2,10,0 -22957,1,7,0,9,2,1,10,3,3,0,17,14,4,24,1 -22958,9,2,0,7,4,5,6,0,1,1,12,8,2,22,1 -22959,10,7,0,8,5,0,7,1,1,1,8,11,4,24,0 -22960,2,2,0,3,1,3,2,3,3,0,13,2,3,3,0 -22961,10,2,0,8,6,2,1,2,2,1,17,2,1,9,0 -22962,8,8,0,3,4,5,3,5,2,1,9,3,4,24,1 -22963,4,8,0,7,5,2,1,2,3,0,6,1,1,10,1 -22964,7,6,0,6,1,4,9,1,4,1,13,20,5,4,0 -22965,4,5,0,0,4,5,8,5,1,1,5,11,0,35,0 -22966,5,8,0,3,0,3,3,4,4,0,2,14,4,19,0 -22967,9,1,0,3,0,3,9,4,3,1,2,9,1,7,0 -22968,9,6,0,2,6,4,12,2,2,1,9,9,4,26,0 -22969,0,0,0,0,1,5,9,2,4,0,2,4,4,7,0 -22970,3,0,0,14,3,0,6,3,2,1,18,3,5,5,1 -22971,8,5,0,10,1,5,0,1,3,1,11,11,5,21,0 -22972,10,4,0,0,6,1,3,1,4,1,5,9,5,14,0 -22973,2,7,0,15,2,5,6,4,2,0,13,20,3,23,0 -22974,1,0,0,12,1,5,14,5,4,0,13,16,3,33,0 -22975,0,3,0,4,2,6,8,4,4,0,2,20,0,12,0 -22976,5,1,0,15,6,1,1,4,0,1,8,13,3,33,0 -22977,0,8,0,7,2,6,0,4,0,0,17,3,0,14,0 -22978,4,1,0,0,3,0,14,3,0,0,2,0,5,11,0 -22979,2,4,0,4,2,4,1,3,3,0,10,2,0,12,0 -22980,8,6,0,14,3,6,13,0,3,1,4,10,1,2,1 -22981,5,4,0,5,3,0,6,3,1,0,17,12,3,24,0 -22982,6,5,0,6,2,6,9,0,2,0,2,12,0,3,0 -22983,1,4,0,5,5,5,13,3,3,1,5,14,5,12,1 -22984,10,4,0,9,5,4,4,2,3,1,18,1,5,37,1 -22985,6,1,0,3,0,1,12,3,4,0,17,1,5,26,0 -22986,1,7,0,6,5,4,12,3,1,0,8,11,1,32,0 -22987,9,5,0,13,3,4,7,2,0,1,13,11,1,19,0 -22988,0,6,0,1,5,4,5,1,1,1,12,13,0,20,0 -22989,0,3,0,11,0,4,10,5,4,0,4,13,1,33,0 -22990,1,6,0,1,6,5,12,1,0,0,4,4,2,19,0 -22991,2,8,0,13,5,4,7,0,0,0,13,15,4,28,0 -22992,5,2,0,1,1,1,6,3,0,0,13,0,1,2,0 -22993,4,5,0,7,6,3,12,5,4,0,16,13,5,20,0 -22994,8,7,0,12,3,5,5,0,3,0,4,17,2,22,0 -22995,10,6,0,11,2,2,8,2,2,1,6,11,3,41,0 -22996,3,2,0,3,3,1,2,3,1,1,7,13,0,33,0 -22997,0,2,0,11,6,4,9,1,4,0,8,13,2,35,0 -22998,3,6,0,3,1,4,3,2,4,1,12,20,5,7,0 -22999,3,1,0,7,1,2,1,1,0,0,13,18,0,28,0 -23000,3,0,0,8,5,4,10,1,3,1,4,2,0,5,0 -23001,8,1,0,6,2,6,14,4,2,1,15,18,0,0,0 
-23002,9,2,0,12,6,2,13,0,2,0,4,15,1,0,0 -23003,2,6,0,3,6,4,8,2,0,0,10,0,5,18,0 -23004,6,8,0,6,0,1,0,3,2,0,8,11,3,28,0 -23005,2,1,0,12,3,6,7,0,2,1,6,12,5,27,0 -23006,2,3,0,3,2,1,10,2,1,1,13,6,3,23,0 -23007,8,5,0,7,4,6,5,1,0,0,10,16,4,16,0 -23008,1,0,0,7,1,0,4,2,1,1,4,6,0,40,0 -23009,9,6,0,15,0,2,12,1,0,0,10,2,0,8,0 -23010,8,7,0,2,2,5,6,1,3,1,6,19,1,23,0 -23011,2,0,0,15,5,5,8,0,1,0,10,6,0,37,0 -23012,10,3,0,5,5,6,2,3,1,0,0,20,0,4,0 -23013,10,6,0,12,5,5,14,2,3,1,8,2,1,19,0 -23014,0,8,0,3,1,1,8,3,4,1,12,15,5,19,0 -23015,8,6,0,2,3,4,1,4,4,0,4,18,5,3,0 -23016,2,6,0,1,4,6,9,4,0,0,17,2,0,41,0 -23017,9,6,0,3,6,3,0,3,3,0,8,3,3,37,0 -23018,4,5,0,2,4,0,8,5,0,0,8,9,4,37,0 -23019,4,2,0,9,0,6,12,3,2,0,7,9,2,39,0 -23020,5,3,0,11,1,6,12,5,3,1,5,16,5,20,1 -23021,2,4,0,0,1,3,8,4,0,1,2,2,1,37,0 -23022,2,0,0,3,5,3,0,3,2,1,10,18,5,23,0 -23023,7,3,0,3,1,4,2,3,2,0,2,18,0,23,0 -23024,9,1,0,3,6,3,6,2,0,1,13,11,3,3,0 -23025,8,0,0,3,0,3,1,1,2,0,13,6,1,16,0 -23026,5,8,0,1,0,2,11,0,0,0,2,15,4,24,0 -23027,10,0,0,5,2,3,12,0,0,0,0,10,4,22,1 -23028,8,2,0,7,2,6,1,4,0,0,2,2,5,40,0 -23029,0,2,0,3,6,5,5,1,3,0,17,2,0,7,0 -23030,7,4,0,14,2,3,1,1,3,1,17,2,1,2,0 -23031,0,8,0,4,6,1,10,1,0,1,13,11,5,0,0 -23032,10,1,0,1,6,5,6,1,4,0,15,2,0,18,0 -23033,6,1,0,8,5,5,1,0,3,0,6,13,5,38,0 -23034,3,0,0,15,0,6,2,0,0,0,17,2,3,31,0 -23035,10,5,0,11,4,2,14,3,3,0,5,15,2,29,1 -23036,3,3,0,1,0,6,7,4,3,1,17,2,1,34,0 -23037,10,8,0,3,1,5,2,0,0,0,2,11,0,35,0 -23038,4,1,0,15,0,0,7,4,2,1,12,15,5,10,0 -23039,8,2,0,2,6,1,7,1,0,0,15,7,1,22,1 -23040,10,5,0,9,0,4,8,4,0,0,8,6,5,8,0 -23041,10,4,0,11,0,0,12,3,1,1,2,13,5,12,0 -23042,1,5,0,2,0,4,6,0,0,0,2,6,4,4,0 -23043,1,7,0,14,6,2,4,1,3,0,3,8,4,18,1 -23044,3,0,0,15,0,0,0,1,3,1,4,4,5,36,0 -23045,2,1,0,3,5,5,9,2,2,0,13,15,1,10,0 -23046,0,3,0,3,0,5,9,1,3,1,4,18,2,5,0 -23047,9,8,0,7,0,5,9,5,0,0,2,9,1,8,0 -23048,10,0,0,15,2,6,3,0,1,1,18,10,3,17,1 -23049,4,8,0,5,3,2,5,2,2,1,6,13,1,26,0 -23050,1,6,0,10,5,0,6,5,3,0,2,11,4,41,0 -23051,7,1,0,15,4,5,12,1,3,1,3,6,2,17,0 
-23052,0,2,0,15,4,4,2,4,0,0,17,11,5,18,0 -23053,0,7,0,1,6,6,10,2,0,1,2,11,4,19,0 -23054,1,0,0,8,5,5,1,1,1,0,2,13,5,13,0 -23055,2,5,0,8,3,0,11,0,3,0,15,18,1,31,0 -23056,2,2,0,5,0,5,9,5,4,0,13,9,4,38,0 -23057,8,7,0,8,6,4,14,0,0,1,13,17,4,16,0 -23058,8,6,0,1,6,2,11,0,1,0,4,0,5,10,1 -23059,0,2,0,14,1,3,4,4,2,0,6,11,4,11,0 -23060,0,8,0,11,0,4,3,1,1,1,0,16,0,18,0 -23061,3,6,0,3,1,0,11,5,4,0,17,0,0,22,0 -23062,7,5,0,9,2,5,1,0,1,0,18,8,4,22,1 -23063,6,7,0,9,2,5,14,2,3,0,4,11,0,19,0 -23064,2,4,0,8,6,3,3,1,1,0,6,19,3,35,0 -23065,0,4,0,1,5,5,7,3,1,1,13,15,2,20,0 -23066,5,3,0,4,1,2,12,5,1,1,9,3,4,2,1 -23067,6,1,0,7,6,1,12,3,0,0,16,13,1,30,1 -23068,8,8,0,3,0,4,13,2,1,1,0,18,5,11,0 -23069,0,7,0,11,6,5,12,1,0,1,9,2,2,17,0 -23070,10,8,0,3,1,4,8,1,1,0,6,9,0,36,0 -23071,1,4,0,6,4,0,0,2,0,0,15,0,2,19,0 -23072,2,8,0,6,0,4,7,0,3,0,5,11,0,27,0 -23073,6,8,0,1,2,5,0,0,3,1,2,2,1,30,0 -23074,4,3,0,11,2,4,0,5,0,0,2,15,5,28,0 -23075,1,3,0,0,4,1,6,5,0,1,13,6,0,22,0 -23076,8,7,0,12,6,4,11,0,3,0,9,16,5,0,1 -23077,9,4,0,3,3,0,14,4,3,1,16,0,5,7,0 -23078,3,3,0,2,1,2,11,5,0,0,10,15,0,34,0 -23079,5,8,0,15,0,2,5,1,2,1,8,15,3,27,0 -23080,4,6,0,7,2,1,2,4,3,0,1,1,2,5,1 -23081,0,2,0,0,1,2,6,5,2,1,2,16,0,0,0 -23082,9,8,0,2,5,3,11,0,0,0,4,17,5,1,1 -23083,8,5,0,4,1,3,1,2,1,0,4,1,0,26,0 -23084,2,2,0,9,3,4,14,0,3,1,14,19,3,23,1 -23085,7,5,0,11,5,0,3,4,4,0,2,15,0,36,0 -23086,7,0,0,4,0,2,5,0,1,1,13,11,3,38,0 -23087,5,8,0,1,2,1,0,4,0,1,17,6,1,21,0 -23088,7,0,0,4,4,1,1,5,4,1,18,1,2,7,1 -23089,3,0,0,13,5,4,11,1,1,1,12,13,0,25,0 -23090,6,2,0,11,0,5,2,0,2,1,15,11,0,34,0 -23091,0,2,0,4,0,3,0,4,0,0,15,13,2,4,0 -23092,0,0,0,11,3,3,0,3,1,0,13,18,2,13,0 -23093,3,8,0,1,1,1,8,1,2,0,15,11,5,9,0 -23094,1,5,0,11,0,2,13,0,2,0,12,4,5,14,0 -23095,5,0,0,3,5,6,1,5,2,1,3,15,0,26,0 -23096,9,6,0,6,6,6,11,3,2,0,13,4,3,35,0 -23097,2,2,0,5,2,0,12,0,3,1,2,6,4,5,0 -23098,3,8,0,9,2,1,14,2,4,1,13,2,2,26,0 -23099,0,8,0,15,1,4,10,3,3,1,8,11,0,3,0 -23100,5,2,0,11,2,6,1,4,3,0,15,3,1,25,1 -23101,7,7,0,4,0,3,1,4,3,1,0,3,0,8,0 
-23102,4,7,0,11,0,3,0,2,1,0,13,6,5,21,0 -23103,5,6,0,2,2,4,2,0,3,1,13,13,5,34,0 -23104,3,3,0,5,5,0,11,4,1,0,3,11,4,27,0 -23105,7,3,0,11,2,4,10,5,2,1,16,11,4,21,0 -23106,4,4,0,8,0,1,11,5,2,0,17,11,4,23,0 -23107,1,8,0,5,1,0,12,0,0,0,8,0,0,29,0 -23108,0,2,0,13,0,1,6,2,2,0,8,6,1,31,0 -23109,8,3,0,12,4,1,13,2,4,0,1,1,3,9,1 -23110,10,6,0,0,2,4,12,0,3,0,13,2,0,26,0 -23111,8,0,0,8,4,4,6,0,4,1,16,1,1,28,1 -23112,3,8,0,3,0,0,3,0,2,0,12,0,1,5,0 -23113,1,2,0,12,1,5,4,4,2,0,12,11,2,33,0 -23114,8,2,0,4,4,3,3,1,1,0,13,14,5,7,0 -23115,0,8,0,14,0,4,7,0,0,1,2,2,5,36,0 -23116,3,7,0,14,0,2,9,1,4,1,15,4,3,29,0 -23117,3,2,0,1,6,4,10,5,2,1,16,2,3,41,0 -23118,0,2,0,13,4,0,11,4,3,1,2,11,1,22,0 -23119,6,7,0,11,3,5,9,0,3,0,5,12,1,20,0 -23120,6,6,0,2,5,2,13,2,2,0,0,17,3,12,0 -23121,1,6,0,13,0,2,5,4,1,0,10,18,0,19,0 -23122,0,8,0,5,6,4,5,2,0,0,8,20,3,20,0 -23123,5,4,0,9,6,4,14,0,4,1,5,7,5,10,1 -23124,3,8,0,1,6,4,8,4,3,0,0,13,2,36,0 -23125,0,5,0,4,1,5,5,0,1,1,2,11,3,19,0 -23126,7,0,0,6,4,4,1,1,1,0,2,2,0,8,0 -23127,1,7,0,15,3,4,4,5,3,1,11,2,5,40,0 -23128,8,5,0,5,6,1,1,5,4,0,18,18,5,13,1 -23129,0,4,0,5,2,4,10,2,0,1,5,7,2,8,1 -23130,9,1,0,6,5,6,6,3,2,1,4,2,5,28,0 -23131,1,4,0,6,2,1,3,3,2,0,16,0,3,31,0 -23132,3,3,0,11,0,1,7,1,1,1,13,18,5,29,0 -23133,10,0,0,11,4,0,4,3,1,0,2,9,2,34,0 -23134,1,8,0,13,3,5,9,1,2,1,0,15,2,24,0 -23135,0,4,0,5,0,3,10,0,3,0,7,1,4,8,0 -23136,2,1,0,1,4,0,8,1,0,0,4,11,1,18,0 -23137,6,4,0,3,3,0,3,1,2,1,7,8,1,27,0 -23138,0,0,0,11,2,0,8,3,4,1,16,13,4,23,0 -23139,0,8,0,12,4,0,1,3,1,1,17,17,0,6,0 -23140,8,4,0,5,1,2,9,4,4,0,15,2,4,41,0 -23141,0,6,0,7,0,5,6,0,1,0,8,18,5,14,0 -23142,1,3,0,0,6,3,5,3,3,0,0,0,0,40,0 -23143,4,5,0,12,0,0,9,4,4,0,5,3,3,41,0 -23144,0,1,0,5,5,2,0,4,0,0,4,2,3,26,0 -23145,1,8,0,5,6,1,9,5,3,0,6,2,3,6,0 -23146,6,7,0,3,4,0,13,1,3,1,13,18,1,3,0 -23147,3,2,0,9,1,1,4,1,2,0,5,14,4,24,1 -23148,5,5,0,3,4,3,10,5,3,1,2,8,0,0,0 -23149,9,6,0,1,4,5,4,1,2,1,18,7,5,26,1 -23150,1,5,0,6,2,0,3,1,4,1,15,11,0,16,0 -23151,5,1,0,10,1,4,9,4,1,1,13,9,2,13,0 
-23152,0,3,0,3,0,0,9,0,0,1,15,17,3,25,0 -23153,2,5,0,11,6,2,8,4,3,1,13,16,0,30,0 -23154,6,3,0,1,0,0,5,3,0,0,10,11,2,2,0 -23155,3,7,0,6,5,5,2,4,4,0,2,6,1,4,0 -23156,4,5,0,14,4,2,0,5,0,1,5,10,5,3,1 -23157,6,3,0,13,1,0,1,4,1,0,2,2,0,29,0 -23158,2,5,0,0,3,6,4,4,4,1,2,6,2,37,0 -23159,1,1,0,10,2,3,3,3,2,1,2,6,5,39,0 -23160,5,1,0,5,1,3,11,3,0,1,5,7,2,14,1 -23161,5,8,0,10,3,3,5,5,2,1,2,11,0,19,0 -23162,6,2,0,8,1,2,12,5,3,0,5,0,1,21,1 -23163,4,8,0,5,0,1,2,0,3,1,12,17,2,23,0 -23164,10,4,0,3,3,0,6,3,4,1,13,14,3,9,0 -23165,6,0,0,5,6,3,0,5,3,1,13,2,2,17,0 -23166,0,4,0,1,0,6,2,2,3,0,4,6,2,26,0 -23167,5,5,0,10,6,2,5,2,1,1,18,6,3,31,1 -23168,7,1,0,13,0,2,13,1,1,1,18,10,5,2,1 -23169,2,4,0,14,0,2,0,2,0,1,8,18,2,9,0 -23170,3,0,0,6,6,4,0,2,0,0,8,2,1,10,0 -23171,10,8,0,15,4,5,3,3,4,1,8,15,0,3,0 -23172,9,3,0,1,6,2,0,4,1,0,9,5,5,16,1 -23173,2,2,0,11,2,4,9,4,3,1,10,9,1,18,0 -23174,7,3,0,5,6,0,0,5,0,1,18,7,2,1,1 -23175,1,3,0,10,3,5,10,0,2,1,0,2,4,23,0 -23176,5,8,0,6,1,5,5,5,0,1,10,11,0,5,0 -23177,8,7,0,15,4,2,6,3,0,1,18,11,0,9,0 -23178,2,6,0,7,4,3,10,1,0,1,1,18,3,11,0 -23179,0,3,0,1,4,5,6,2,1,1,17,11,3,28,0 -23180,6,4,0,6,0,0,13,4,0,0,12,18,1,24,0 -23181,10,5,0,11,0,1,11,3,2,0,1,12,0,20,0 -23182,9,3,0,4,2,3,7,2,2,0,8,2,3,3,0 -23183,7,1,0,10,6,4,4,1,2,0,18,13,1,3,1 -23184,3,1,0,10,0,3,14,3,4,0,8,6,5,3,0 -23185,8,7,0,5,6,4,6,5,3,1,2,6,5,37,0 -23186,0,1,0,3,0,5,11,0,0,1,6,2,3,14,0 -23187,1,2,0,14,4,2,7,5,0,1,1,10,1,23,1 -23188,10,4,0,14,4,5,2,1,3,0,3,1,3,1,1 -23189,6,6,0,4,5,0,7,1,0,1,11,6,1,23,0 -23190,1,6,0,13,1,6,7,3,1,1,15,2,5,22,0 -23191,5,2,0,5,2,6,7,5,3,0,2,6,0,12,0 -23192,0,8,0,0,1,2,8,2,4,0,10,6,5,33,0 -23193,0,0,0,10,1,5,9,2,0,1,2,3,0,3,0 -23194,0,5,0,13,2,0,9,4,4,0,13,0,1,8,0 -23195,10,4,0,2,5,5,6,3,0,0,3,19,1,38,0 -23196,6,5,0,5,0,5,9,4,0,0,6,13,4,8,0 -23197,7,5,0,7,0,5,6,2,0,1,12,2,0,8,0 -23198,4,4,0,7,6,6,7,1,1,0,10,11,4,20,0 -23199,2,4,0,7,1,5,3,4,2,1,0,12,1,35,0 -23200,4,7,0,1,0,0,2,4,4,1,12,18,0,3,0 -23201,2,6,0,1,3,1,0,5,3,0,0,9,2,40,0 -23202,5,3,0,9,0,4,5,2,1,0,2,18,0,26,0 
-23203,1,2,0,1,2,0,2,4,4,0,2,5,0,37,0 -23204,2,7,0,11,1,6,6,1,3,0,2,14,0,25,0 -23205,4,1,0,5,2,1,9,3,3,0,8,11,5,5,0 -23206,7,5,0,1,1,3,14,1,2,1,2,6,0,34,0 -23207,2,7,0,6,2,0,5,2,0,1,6,13,1,14,0 -23208,1,2,0,11,5,1,1,3,0,0,6,2,0,40,0 -23209,1,8,0,13,1,3,9,5,4,1,8,17,0,31,0 -23210,2,0,0,13,1,5,12,1,4,1,10,16,0,29,0 -23211,4,8,0,12,1,6,12,3,3,1,13,6,5,35,0 -23212,5,7,0,7,6,0,12,2,4,0,4,18,3,20,0 -23213,3,7,0,0,4,6,13,2,4,0,2,2,1,38,0 -23214,10,7,0,4,5,4,11,0,4,0,13,15,4,0,0 -23215,2,5,0,8,3,3,7,5,3,0,9,7,5,3,1 -23216,1,8,0,0,0,5,6,0,0,1,13,2,2,14,0 -23217,2,2,0,1,5,6,7,3,0,0,14,0,5,13,0 -23218,0,7,0,14,6,2,3,0,4,1,8,11,3,41,0 -23219,1,0,0,3,2,5,5,4,4,1,2,9,0,27,0 -23220,9,0,0,11,0,3,10,4,4,0,1,10,2,8,1 -23221,7,4,0,13,4,4,6,0,3,1,2,0,5,3,0 -23222,5,5,0,0,0,1,12,1,0,0,11,1,1,16,1 -23223,10,5,0,3,3,0,4,0,1,0,9,14,5,41,1 -23224,3,2,0,9,4,2,13,4,2,1,9,14,3,3,1 -23225,8,2,0,6,3,5,10,3,0,0,18,1,5,41,1 -23226,3,3,0,13,0,4,12,4,0,1,3,6,0,24,0 -23227,8,2,0,11,0,0,9,4,0,0,0,19,3,7,0 -23228,0,6,0,5,5,5,9,1,0,0,17,9,5,38,0 -23229,2,7,0,6,0,6,12,5,0,1,13,2,5,33,0 -23230,4,0,0,12,0,6,0,1,0,0,17,19,2,12,0 -23231,0,0,0,6,3,3,5,1,3,1,13,15,0,16,0 -23232,4,4,0,10,5,6,10,3,1,1,7,7,4,25,1 -23233,0,6,0,4,0,0,8,4,2,1,13,9,4,4,0 -23234,7,5,0,9,2,5,4,4,4,1,4,9,2,13,0 -23235,10,7,0,9,2,3,4,4,2,1,4,6,2,26,0 -23236,5,3,0,11,0,0,6,0,2,0,2,7,3,10,0 -23237,0,8,0,6,0,4,1,4,4,0,0,13,3,7,0 -23238,2,2,0,3,0,1,2,2,4,0,2,18,0,0,0 -23239,7,7,0,12,4,0,3,4,2,0,18,19,0,36,1 -23240,3,0,0,6,5,5,8,4,2,1,17,12,0,27,0 -23241,0,8,0,9,5,5,12,0,3,1,15,15,4,12,0 -23242,6,6,0,3,1,4,13,2,2,1,6,19,1,14,0 -23243,0,7,0,3,0,1,9,4,3,0,17,2,0,33,0 -23244,4,4,0,2,3,1,6,1,1,1,18,7,4,26,1 -23245,5,4,0,12,5,0,8,0,4,1,1,17,2,22,1 -23246,3,1,0,8,0,4,1,2,3,0,6,9,2,0,0 -23247,4,3,0,12,3,1,13,2,4,1,9,19,1,0,1 -23248,2,8,0,10,5,3,10,4,0,1,13,18,0,33,0 -23249,10,8,0,2,2,4,9,4,1,1,13,9,2,22,0 -23250,0,4,0,7,4,2,12,4,4,0,4,6,0,30,0 -23251,7,6,0,13,0,5,5,2,2,0,13,6,0,14,0 -23252,3,7,0,0,4,6,7,0,3,0,17,2,2,20,0 
-23253,0,7,0,12,0,5,13,0,3,0,8,0,1,35,0 -23254,8,6,0,9,5,1,5,0,1,1,11,7,4,33,1 -23255,2,7,0,13,0,5,7,2,0,1,13,0,2,3,0 -23256,0,3,0,8,3,6,12,2,0,1,13,11,0,0,0 -23257,7,7,0,13,0,1,13,0,4,0,14,1,2,34,1 -23258,0,6,0,6,0,4,14,1,0,0,12,16,4,23,0 -23259,0,1,0,12,6,6,8,3,0,0,2,18,1,14,0 -23260,5,7,0,8,6,3,12,1,2,0,8,19,4,20,0 -23261,9,7,0,10,2,0,1,3,2,0,13,6,3,30,0 -23262,1,0,0,6,6,4,11,2,0,1,2,2,2,18,0 -23263,1,7,0,8,0,2,14,3,3,0,1,2,5,12,0 -23264,3,8,0,8,0,5,5,1,4,1,13,2,1,1,0 -23265,10,1,0,13,4,1,11,5,0,1,18,5,1,30,1 -23266,4,8,0,12,0,1,9,2,4,1,0,11,1,19,0 -23267,9,7,0,9,6,1,0,2,2,1,13,9,1,1,0 -23268,10,8,0,1,0,3,8,1,0,0,8,20,4,23,0 -23269,3,1,0,6,2,3,5,3,1,0,2,11,4,3,0 -23270,2,0,0,15,0,3,7,4,3,0,8,17,1,0,0 -23271,0,7,0,14,6,3,4,3,1,0,8,0,5,21,0 -23272,6,4,0,13,6,6,8,2,2,0,2,11,1,12,0 -23273,4,4,0,14,4,5,14,1,2,1,18,10,4,13,1 -23274,2,6,0,2,1,5,8,0,4,0,8,4,5,35,0 -23275,9,2,0,4,3,4,1,1,1,0,7,1,5,2,1 -23276,8,7,0,12,3,2,0,1,0,0,16,16,2,4,1 -23277,8,2,0,10,0,4,0,5,2,0,7,11,4,9,0 -23278,0,2,0,6,4,0,6,2,4,1,13,11,2,32,0 -23279,8,1,0,12,6,1,2,0,4,0,18,19,0,7,1 -23280,5,8,0,1,5,0,11,3,0,0,5,7,4,6,1 -23281,1,4,0,15,2,5,11,5,3,1,2,6,0,0,0 -23282,5,5,0,4,3,6,9,2,2,1,15,8,4,40,1 -23283,5,8,0,0,5,0,14,0,3,1,10,15,1,38,0 -23284,6,2,0,9,0,1,11,0,2,1,12,10,1,7,1 -23285,6,3,0,1,4,6,1,3,3,1,10,2,1,1,0 -23286,1,6,0,1,1,4,3,0,2,1,2,11,5,27,0 -23287,0,4,0,13,2,1,8,2,2,1,8,20,2,11,0 -23288,0,2,0,4,6,0,8,3,1,1,8,11,4,14,0 -23289,7,8,0,5,2,6,12,4,3,0,10,12,4,37,0 -23290,2,5,0,12,0,2,0,0,3,1,2,3,2,6,0 -23291,5,6,0,14,2,5,6,5,2,1,11,5,0,36,1 -23292,7,3,0,8,6,1,5,5,1,1,18,19,4,4,1 -23293,7,1,0,13,6,0,2,4,4,1,2,0,3,41,0 -23294,9,8,0,5,6,0,5,4,2,1,4,12,0,19,0 -23295,1,6,0,15,2,4,9,4,3,0,17,2,2,30,0 -23296,6,0,0,3,6,5,11,4,0,1,13,13,5,6,0 -23297,0,7,0,14,0,3,14,2,0,1,2,11,3,33,0 -23298,7,0,0,13,0,3,6,2,1,0,8,11,0,27,0 -23299,5,3,0,9,3,1,6,1,4,1,18,7,5,36,1 -23300,6,2,0,7,4,0,7,0,1,1,5,8,3,41,1 -23301,1,2,0,1,0,0,5,2,2,0,11,6,5,14,0 -23302,6,1,0,11,2,5,3,1,2,0,6,5,1,35,0 
-23303,6,2,0,12,1,1,12,5,0,1,18,17,5,35,1 -23304,0,3,0,13,5,2,3,3,0,1,17,15,1,33,0 -23305,4,3,0,7,4,5,3,3,4,1,13,11,0,41,0 -23306,5,1,0,13,2,2,3,5,2,1,4,20,0,38,0 -23307,7,6,0,12,0,6,6,2,0,0,13,18,3,6,0 -23308,9,2,0,9,0,1,13,5,0,1,18,7,5,2,1 -23309,7,3,0,2,0,1,10,1,0,0,14,1,4,26,1 -23310,3,6,0,6,0,1,9,4,2,0,15,9,5,12,0 -23311,9,6,0,10,4,2,2,0,1,1,18,12,4,16,1 -23312,9,6,0,14,0,0,0,3,2,0,4,13,1,28,0 -23313,0,6,0,9,1,1,7,2,4,1,2,11,3,4,0 -23314,2,4,0,9,3,6,0,3,2,0,0,0,4,30,0 -23315,0,4,0,2,5,6,11,1,0,1,17,9,2,29,0 -23316,7,3,0,5,0,3,7,1,4,0,6,11,3,4,0 -23317,7,8,0,15,2,4,3,1,1,0,4,3,0,6,0 -23318,0,5,0,4,6,0,4,4,0,0,17,2,5,6,0 -23319,0,4,0,9,5,5,7,4,2,0,0,11,5,30,0 -23320,0,6,0,13,0,5,2,3,2,0,11,15,3,6,0 -23321,6,8,0,0,1,2,7,1,0,1,6,10,4,38,1 -23322,3,6,0,11,4,2,8,5,1,1,18,18,2,19,1 -23323,8,6,0,14,3,3,5,1,2,0,13,16,4,30,0 -23324,0,8,0,4,1,1,5,5,3,1,9,3,4,16,1 -23325,3,0,0,11,2,5,14,3,3,0,12,9,5,39,0 -23326,10,2,0,12,4,1,8,5,4,0,14,1,1,36,1 -23327,3,8,0,11,0,5,5,2,1,1,10,3,4,8,0 -23328,4,8,0,1,6,0,3,1,4,0,13,18,1,16,0 -23329,3,5,0,0,4,6,2,0,3,1,16,13,0,19,1 -23330,5,1,0,11,5,5,3,3,0,1,13,11,2,22,0 -23331,3,7,0,6,3,5,5,1,3,0,2,20,0,38,0 -23332,2,3,0,14,0,1,10,1,2,1,7,5,3,9,1 -23333,7,8,0,3,1,1,13,1,1,0,8,2,2,6,0 -23334,7,5,0,10,3,2,8,5,2,1,11,5,0,24,1 -23335,8,5,0,12,5,0,14,2,0,1,6,2,1,4,0 -23336,0,6,0,4,0,6,3,3,2,1,2,2,5,33,0 -23337,8,3,0,9,2,0,2,4,0,0,13,2,0,5,0 -23338,9,0,0,0,0,5,5,0,2,0,13,6,3,23,0 -23339,8,4,0,12,3,2,1,0,0,1,1,12,3,9,1 -23340,3,0,0,14,2,4,9,1,0,0,2,20,2,10,0 -23341,9,2,0,1,6,6,12,4,2,0,15,16,3,33,1 -23342,6,3,0,2,4,0,14,0,1,0,2,11,1,8,0 -23343,8,7,0,8,1,4,14,2,1,1,18,10,1,5,1 -23344,4,4,0,13,3,4,14,0,1,0,17,12,0,21,0 -23345,0,0,0,11,0,6,10,2,2,1,10,20,0,38,0 -23346,0,5,0,5,4,5,2,1,4,1,2,9,0,0,0 -23347,0,0,0,8,5,5,9,0,0,1,8,2,1,22,0 -23348,3,8,0,12,1,5,6,1,3,1,18,17,4,40,1 -23349,7,2,0,13,0,2,12,5,1,0,1,3,3,19,1 -23350,3,3,0,15,0,4,5,2,0,1,12,16,1,41,0 -23351,0,6,0,13,3,2,10,1,1,0,17,3,4,9,0 -23352,4,1,0,7,6,5,14,1,4,0,2,9,5,0,0 
-23353,0,0,0,5,1,1,8,3,1,1,6,1,0,21,0 -23354,3,7,0,12,4,5,0,5,4,1,3,16,1,35,1 -23355,5,4,0,9,4,2,11,1,1,1,5,14,5,10,1 -23356,7,4,0,15,1,4,5,5,3,1,5,8,5,36,1 -23357,7,2,0,13,4,6,4,5,4,1,9,10,3,5,1 -23358,9,0,0,4,6,5,14,3,4,0,12,19,5,1,0 -23359,5,0,0,4,4,6,3,1,1,1,1,7,5,13,1 -23360,6,4,0,5,2,3,8,3,0,1,2,18,5,4,0 -23361,3,3,0,1,6,3,11,2,0,1,15,0,0,17,0 -23362,2,0,0,10,5,0,12,5,0,0,0,16,4,27,0 -23363,5,8,0,1,0,1,2,2,3,0,8,18,2,21,0 -23364,10,8,0,14,0,5,6,2,4,1,17,11,0,4,0 -23365,10,1,0,13,0,1,5,3,0,0,4,4,0,35,0 -23366,1,4,0,11,4,6,4,0,2,0,9,13,1,13,0 -23367,2,3,0,3,0,6,1,3,3,1,2,11,1,1,0 -23368,0,2,0,9,4,0,0,3,2,0,2,11,4,4,0 -23369,9,6,0,11,0,0,2,5,3,0,2,20,0,30,0 -23370,10,6,0,13,1,5,6,1,0,0,15,2,1,14,0 -23371,5,1,0,9,4,0,13,0,0,1,16,10,4,39,1 -23372,0,7,0,2,0,4,1,4,2,1,2,11,3,31,0 -23373,0,5,0,1,4,0,1,4,3,0,10,0,4,25,0 -23374,2,5,0,2,6,5,9,3,4,0,8,11,1,17,0 -23375,5,8,0,0,0,4,6,3,0,1,13,15,2,12,0 -23376,10,3,0,6,1,2,3,0,4,1,1,8,1,37,1 -23377,0,0,0,8,0,2,3,1,2,0,0,13,5,29,0 -23378,1,2,0,1,4,4,3,3,2,1,8,15,2,21,0 -23379,0,0,0,5,5,3,2,5,2,0,10,11,4,19,0 -23380,1,8,0,15,5,3,8,0,4,0,8,11,3,22,0 -23381,0,1,0,10,5,0,8,1,3,1,15,2,2,1,0 -23382,2,0,0,4,5,1,6,4,0,1,0,18,4,13,0 -23383,3,3,0,9,0,5,13,0,0,0,6,2,0,12,0 -23384,5,6,0,12,1,2,10,5,3,1,6,18,1,19,0 -23385,0,4,0,3,3,6,12,3,0,1,2,6,0,38,0 -23386,1,0,0,1,4,6,1,2,4,1,12,11,3,5,0 -23387,5,7,0,15,2,1,11,1,3,1,0,13,0,30,0 -23388,0,3,0,5,6,3,14,2,1,0,2,0,4,13,0 -23389,10,6,0,3,1,5,8,2,4,0,1,2,4,17,0 -23390,8,8,0,11,0,2,12,0,0,1,1,0,4,2,0 -23391,6,8,0,0,1,4,2,0,4,1,1,14,3,16,1 -23392,10,2,0,10,6,6,5,4,4,1,2,9,0,25,0 -23393,5,7,0,8,3,5,10,5,4,1,16,10,2,32,1 -23394,3,7,0,4,1,4,12,1,0,0,0,6,0,31,0 -23395,0,6,0,8,6,3,9,3,3,1,10,6,2,18,0 -23396,1,8,0,6,6,2,8,3,3,0,17,17,2,1,0 -23397,2,5,0,11,3,2,14,0,4,0,13,6,4,26,0 -23398,0,2,0,15,5,6,4,0,2,1,2,9,0,16,0 -23399,3,2,0,6,0,0,2,3,0,1,6,15,3,24,0 -23400,7,1,0,13,3,5,4,0,3,1,2,13,1,0,0 -23401,9,8,0,13,3,0,3,3,1,0,3,11,2,22,0 -23402,5,4,0,1,2,1,4,3,4,0,9,11,0,35,0 
-23403,2,1,0,3,4,6,1,5,0,0,5,8,5,12,1 -23404,8,7,0,15,1,1,13,1,1,1,15,17,4,5,1 -23405,1,6,0,13,1,5,12,3,0,0,3,13,2,4,0 -23406,1,4,0,0,0,6,5,3,2,0,12,18,2,21,0 -23407,7,4,0,8,2,5,13,3,2,1,2,20,0,16,0 -23408,7,2,0,8,5,4,6,4,1,1,12,9,3,14,0 -23409,8,4,0,6,6,3,5,1,2,0,8,11,0,13,0 -23410,10,8,0,14,0,5,2,3,2,0,10,14,1,33,0 -23411,1,0,0,13,5,0,10,3,4,1,17,0,2,23,0 -23412,1,2,0,12,2,1,11,4,1,0,7,7,4,11,1 -23413,7,4,0,11,6,5,5,0,2,1,10,1,0,20,0 -23414,6,1,0,14,4,2,4,5,4,1,5,8,4,11,1 -23415,8,0,0,3,2,5,1,4,0,0,13,15,1,3,0 -23416,7,2,0,14,0,1,2,3,2,1,13,4,0,30,0 -23417,2,8,0,13,2,0,8,4,0,1,13,9,3,25,0 -23418,3,7,0,15,2,5,11,0,0,0,3,11,3,1,1 -23419,7,5,0,9,4,1,9,4,2,1,9,1,1,30,1 -23420,8,1,0,10,4,2,0,3,0,1,18,17,2,26,1 -23421,4,5,0,15,4,4,9,0,1,0,4,15,2,17,0 -23422,0,7,0,13,6,1,5,0,0,0,13,2,5,2,0 -23423,6,7,0,4,0,2,0,1,4,0,8,2,3,3,0 -23424,7,1,0,12,4,1,14,5,2,0,1,7,3,24,1 -23425,7,1,0,4,0,2,8,4,4,0,18,19,3,9,1 -23426,3,3,0,12,0,0,12,1,0,1,6,2,5,28,0 -23427,6,2,0,5,0,3,9,4,3,0,8,6,0,30,0 -23428,3,6,0,0,1,5,1,2,0,1,6,15,0,7,0 -23429,0,1,0,0,6,0,7,0,0,1,0,14,0,32,0 -23430,1,0,0,11,0,5,12,4,4,0,13,11,4,27,0 -23431,3,7,0,0,6,5,4,4,4,0,13,11,2,4,0 -23432,5,8,0,12,1,4,7,2,0,0,2,2,3,10,0 -23433,5,8,0,7,6,0,3,0,0,0,13,2,1,28,0 -23434,3,8,0,3,1,0,7,3,2,0,2,15,2,16,0 -23435,0,7,0,10,4,0,2,2,4,0,2,2,1,4,0 -23436,1,2,0,14,0,3,10,0,1,1,15,4,4,17,1 -23437,10,4,0,2,1,1,10,1,0,1,5,17,2,27,1 -23438,1,0,0,12,2,1,4,3,1,1,11,10,2,37,1 -23439,0,5,0,6,1,0,5,4,1,0,9,6,0,20,0 -23440,1,6,0,12,3,6,14,2,0,1,2,4,1,24,0 -23441,0,2,0,2,5,4,5,4,1,1,13,9,3,39,0 -23442,7,2,0,0,4,5,8,0,2,0,6,0,2,41,0 -23443,4,8,0,5,3,1,2,0,1,1,9,12,3,11,1 -23444,1,7,0,0,0,2,12,0,4,1,8,16,3,2,0 -23445,2,7,0,10,1,2,5,0,2,1,18,12,4,30,1 -23446,9,2,0,1,6,1,3,5,2,1,1,7,4,34,1 -23447,1,6,0,7,0,1,11,3,4,1,16,8,3,2,1 -23448,3,0,0,9,6,6,9,1,4,0,8,6,3,31,0 -23449,5,4,0,8,2,1,14,2,1,0,7,7,4,19,1 -23450,10,6,0,6,0,0,9,4,2,1,3,11,2,38,0 -23451,1,7,0,8,1,2,6,4,1,1,2,17,5,26,0 -23452,0,4,0,0,5,5,6,4,4,1,2,6,0,20,0 
-23453,0,7,0,1,4,4,6,2,3,0,0,14,5,16,0 -23454,8,1,0,10,3,1,4,3,4,0,0,18,3,27,0 -23455,1,4,0,8,2,2,1,2,0,1,5,19,5,19,1 -23456,6,8,0,9,3,0,1,1,0,0,16,9,0,39,0 -23457,6,3,0,6,4,2,0,3,2,1,18,7,1,28,1 -23458,10,8,0,14,3,0,10,5,3,0,18,7,3,26,1 -23459,6,3,0,5,2,4,6,4,3,1,6,2,4,6,0 -23460,4,7,0,5,0,3,8,3,1,0,13,20,0,21,0 -23461,5,5,0,14,0,1,12,1,0,1,9,12,5,21,1 -23462,7,0,0,4,1,0,6,3,2,1,2,11,4,17,0 -23463,8,3,0,3,4,0,13,0,4,0,18,17,3,17,1 -23464,8,6,0,2,2,0,11,0,2,0,17,15,3,33,0 -23465,0,8,0,4,5,0,0,4,0,0,13,13,0,24,0 -23466,7,5,0,1,5,3,10,2,0,0,5,5,4,9,1 -23467,7,8,0,2,1,5,3,0,0,0,9,17,1,7,1 -23468,3,4,0,6,0,4,8,2,3,0,7,9,5,10,0 -23469,7,6,0,11,4,5,8,4,4,1,8,8,5,3,0 -23470,8,4,0,9,3,5,10,5,2,1,18,7,4,26,1 -23471,10,7,0,2,0,4,14,4,0,1,4,6,1,1,0 -23472,9,1,0,9,0,2,13,1,0,0,18,7,0,22,1 -23473,1,4,0,3,5,5,12,3,4,1,13,12,5,30,0 -23474,4,4,0,14,0,6,12,1,4,0,12,1,2,25,0 -23475,1,8,0,10,1,6,10,3,1,1,14,17,0,5,1 -23476,7,2,0,9,3,4,4,5,0,1,5,5,4,37,1 -23477,10,4,0,10,2,6,2,5,1,1,3,1,1,22,1 -23478,7,4,0,7,5,6,13,3,1,0,13,10,0,19,1 -23479,2,3,0,15,0,0,8,0,4,1,13,11,5,1,0 -23480,5,8,0,15,2,4,5,4,0,1,8,13,3,32,0 -23481,10,0,0,3,6,3,13,4,3,1,2,2,3,20,0 -23482,8,4,0,7,6,1,0,3,1,1,8,9,4,37,0 -23483,8,8,0,5,5,1,2,2,4,0,7,7,1,18,1 -23484,5,8,0,2,5,0,0,0,0,1,2,6,5,39,0 -23485,10,6,0,14,1,5,1,3,1,1,7,19,1,31,1 -23486,8,6,0,0,3,5,3,3,1,1,13,18,2,38,0 -23487,3,2,0,13,6,2,10,0,0,0,10,10,0,26,0 -23488,3,6,0,14,6,6,8,1,1,0,13,2,1,10,0 -23489,4,3,0,1,6,5,0,5,1,1,8,11,2,18,0 -23490,0,2,0,13,0,1,13,1,0,0,2,11,0,4,0 -23491,3,5,0,9,6,0,9,0,1,1,13,9,0,1,0 -23492,7,2,0,1,6,1,10,3,3,1,18,5,0,31,1 -23493,0,4,0,0,4,5,9,4,0,1,16,11,4,41,0 -23494,4,4,0,9,0,5,13,3,4,1,2,13,0,8,0 -23495,6,1,0,12,5,1,14,5,3,1,16,7,2,3,1 -23496,1,1,0,7,2,4,10,0,0,1,1,12,3,5,1 -23497,4,4,0,5,2,4,11,2,2,1,11,17,4,13,1 -23498,1,5,0,7,3,1,11,5,1,0,17,10,4,9,1 -23499,8,5,0,9,3,5,6,4,2,1,8,11,4,32,0 -23500,10,7,0,1,0,1,8,0,0,0,13,2,3,5,0 -23501,0,2,0,5,4,2,7,5,3,1,4,4,0,34,0 -23502,2,2,0,12,3,0,2,0,1,1,13,6,4,38,0 
-23503,10,1,0,9,5,3,9,4,3,0,16,11,2,34,0 -23504,5,0,0,11,2,5,12,1,2,0,17,16,2,1,0 -23505,0,8,0,9,5,5,12,1,2,1,13,15,0,35,0 -23506,10,1,0,3,2,2,5,1,0,0,5,16,4,22,1 -23507,3,8,0,3,3,5,7,0,0,0,2,2,2,21,0 -23508,1,1,0,5,2,6,9,4,2,1,2,6,4,33,0 -23509,10,5,0,8,3,6,9,0,4,1,3,10,5,40,1 -23510,3,7,0,5,3,1,6,4,2,1,17,6,1,33,0 -23511,0,2,0,14,1,4,8,2,4,0,2,2,2,28,0 -23512,2,5,0,1,2,0,7,5,3,0,2,11,0,33,0 -23513,10,8,0,5,2,2,9,0,0,1,2,13,2,41,0 -23514,6,3,0,3,6,6,9,3,1,0,4,2,4,20,0 -23515,6,4,0,13,6,6,6,4,0,1,2,15,1,31,0 -23516,3,3,0,0,1,5,3,3,3,0,11,13,4,16,0 -23517,6,3,0,4,6,0,0,4,2,1,13,13,3,12,0 -23518,3,1,0,5,2,4,7,3,0,1,11,11,3,31,0 -23519,7,4,0,10,4,6,13,3,1,1,1,16,3,6,1 -23520,3,8,0,7,6,5,2,4,1,0,17,2,1,23,0 -23521,10,8,0,3,4,6,4,1,1,0,6,15,5,10,0 -23522,2,8,0,7,5,5,8,2,0,1,3,2,0,30,0 -23523,5,8,0,3,5,4,2,0,1,1,11,13,1,31,1 -23524,1,4,0,13,3,2,0,4,3,1,18,7,1,10,1 -23525,5,8,0,7,0,3,14,0,3,0,14,2,0,35,0 -23526,0,8,0,1,2,0,13,0,2,1,13,11,5,2,0 -23527,3,5,0,3,4,2,7,4,3,1,11,11,0,7,0 -23528,9,6,0,7,6,4,5,4,0,0,4,17,5,35,0 -23529,6,5,0,3,2,4,13,4,1,1,16,0,1,13,1 -23530,5,3,0,7,1,3,2,3,1,0,14,11,1,25,0 -23531,0,8,0,4,5,0,12,2,2,1,13,2,0,19,0 -23532,4,1,0,1,2,4,6,0,0,0,10,4,2,35,0 -23533,0,6,0,2,0,1,9,1,3,0,2,11,0,2,0 -23534,2,7,0,5,1,5,2,4,3,0,2,2,0,24,0 -23535,2,4,0,0,0,1,5,4,4,1,13,11,5,31,0 -23536,2,6,0,6,6,1,5,1,0,0,8,0,0,38,0 -23537,0,6,0,4,2,6,1,3,3,1,12,10,5,1,0 -23538,8,2,0,6,3,2,9,5,0,1,7,6,1,30,1 -23539,2,7,0,15,5,2,7,1,3,0,3,2,2,40,0 -23540,0,8,0,13,6,3,1,2,2,0,14,18,2,19,0 -23541,8,4,0,10,4,0,12,5,2,0,18,12,4,21,1 -23542,5,6,0,9,4,3,4,4,1,0,4,20,5,29,1 -23543,9,3,0,15,0,3,11,4,4,0,8,16,0,4,0 -23544,7,8,0,10,3,1,13,0,3,1,2,7,1,28,1 -23545,8,3,0,6,6,5,9,5,4,1,17,2,5,14,0 -23546,10,0,0,14,6,2,2,0,2,0,16,19,2,8,1 -23547,2,8,0,13,5,6,11,5,3,0,13,3,0,26,0 -23548,10,2,0,10,6,5,3,2,0,0,10,20,3,11,0 -23549,6,5,0,3,1,5,6,2,1,1,13,16,5,24,0 -23550,2,3,0,4,1,3,13,1,4,1,8,4,1,26,0 -23551,8,2,0,1,0,5,5,2,3,0,2,8,5,29,0 -23552,1,1,0,15,5,5,6,5,0,0,3,11,2,19,1 
-23553,4,6,0,2,3,4,0,3,0,0,3,9,2,4,0 -23554,7,5,0,0,6,2,4,0,0,0,1,19,2,3,1 -23555,0,2,0,2,6,1,7,3,3,1,13,6,1,33,0 -23556,4,2,0,4,2,3,3,0,3,1,13,3,1,16,0 -23557,7,5,0,1,4,3,1,0,4,1,1,17,1,5,1 -23558,3,3,0,10,6,0,10,1,4,1,18,5,5,7,1 -23559,8,3,0,4,1,5,12,4,2,1,6,18,2,40,0 -23560,8,3,0,13,2,0,9,4,0,0,2,11,4,33,0 -23561,7,4,0,11,2,3,9,5,0,1,9,1,2,4,1 -23562,0,3,0,3,3,1,5,4,2,0,13,5,1,27,0 -23563,0,5,0,8,4,0,3,4,3,1,12,6,5,1,0 -23564,1,2,0,1,1,4,0,2,2,0,2,13,0,25,0 -23565,8,1,0,9,5,0,5,2,4,1,7,13,4,3,1 -23566,9,0,0,13,1,5,7,1,0,0,8,12,0,29,0 -23567,4,1,0,15,5,0,5,0,0,1,10,8,3,36,1 -23568,2,6,0,13,5,4,9,3,0,1,8,0,4,14,0 -23569,0,2,0,0,0,0,11,3,0,0,8,0,2,32,0 -23570,10,4,0,14,3,5,5,5,0,1,2,4,5,10,0 -23571,7,7,0,5,0,6,13,3,1,0,10,11,5,14,0 -23572,2,3,0,6,4,1,4,3,4,1,1,7,2,25,1 -23573,9,8,0,6,1,0,12,4,1,1,13,11,1,38,0 -23574,9,5,0,7,1,4,2,3,2,0,8,11,2,9,0 -23575,1,8,0,3,5,0,2,2,0,0,17,15,0,33,0 -23576,8,5,0,11,1,2,14,2,3,0,2,2,0,17,0 -23577,7,1,0,7,6,6,9,5,2,0,18,20,3,18,1 -23578,0,1,0,5,5,2,8,3,3,0,2,2,0,36,0 -23579,10,6,0,10,0,0,4,5,2,1,14,14,0,6,1 -23580,0,4,0,8,2,0,8,3,2,1,13,11,5,11,0 -23581,4,7,0,11,3,4,12,5,3,1,8,2,3,29,0 -23582,0,2,0,11,0,2,14,4,0,0,2,2,4,16,0 -23583,5,7,0,4,0,0,2,3,2,1,13,20,0,27,0 -23584,6,3,0,3,1,5,8,3,1,1,18,17,4,41,1 -23585,6,0,0,0,6,3,4,1,3,0,2,11,1,1,0 -23586,3,5,0,1,0,4,0,2,2,1,0,11,2,10,0 -23587,10,7,0,10,0,3,5,2,1,0,7,7,2,35,1 -23588,5,4,0,10,0,4,7,2,1,0,2,15,4,30,0 -23589,9,5,0,1,5,2,1,1,4,0,11,12,5,18,1 -23590,3,6,0,13,1,2,7,3,3,1,1,7,4,34,1 -23591,4,7,0,4,1,1,11,2,0,1,10,18,2,27,0 -23592,1,2,0,15,0,1,4,3,4,0,13,12,5,37,0 -23593,2,7,0,0,1,6,6,2,4,1,13,3,2,23,0 -23594,3,1,0,6,1,5,2,2,1,0,1,6,5,36,1 -23595,6,6,0,10,5,1,11,5,1,1,14,8,4,37,1 -23596,5,6,0,12,6,5,8,3,3,0,4,2,3,16,0 -23597,0,7,0,11,5,0,11,5,1,1,16,4,4,5,0 -23598,4,1,0,0,6,4,14,4,1,1,10,11,4,22,0 -23599,7,1,0,11,3,3,3,3,0,1,5,11,0,29,0 -23600,6,4,0,13,4,0,7,2,4,1,9,13,2,10,0 -23601,1,6,0,9,2,3,8,2,0,1,13,12,2,29,0 -23602,1,7,0,5,6,0,13,3,1,1,7,7,0,18,1 
-23603,10,8,0,8,4,0,2,0,1,0,17,0,5,21,0 -23604,1,2,0,1,2,4,5,2,2,0,13,11,0,3,0 -23605,6,2,0,10,5,3,0,0,1,1,9,14,3,41,1 -23606,1,8,0,13,3,4,12,3,0,0,8,11,4,19,0 -23607,0,5,0,3,3,6,8,1,2,0,2,17,3,37,0 -23608,8,4,0,1,4,2,13,0,1,1,14,17,2,30,1 -23609,3,5,0,1,3,5,8,5,4,0,6,1,0,5,0 -23610,7,5,0,10,5,4,5,3,1,0,6,16,0,21,0 -23611,1,7,0,0,2,3,0,4,0,1,0,13,0,33,0 -23612,3,3,0,12,2,1,9,4,1,1,2,13,5,35,0 -23613,5,8,0,7,6,6,4,0,2,0,8,18,4,28,0 -23614,10,8,0,5,2,0,8,4,2,1,2,11,1,26,0 -23615,6,7,0,4,5,0,0,0,2,1,5,19,2,12,1 -23616,1,1,0,9,6,6,10,5,2,1,18,17,5,31,1 -23617,1,7,0,7,4,0,12,0,2,0,17,5,0,27,0 -23618,1,5,0,5,4,0,0,2,1,1,8,6,4,0,0 -23619,10,8,0,5,4,0,5,3,4,1,1,2,0,4,0 -23620,7,3,0,5,0,4,6,4,2,0,14,9,5,31,0 -23621,1,8,0,13,5,5,0,3,1,0,13,18,2,35,0 -23622,8,3,0,13,1,0,6,4,4,0,17,11,2,38,0 -23623,5,1,0,7,6,6,8,4,3,0,11,20,3,21,1 -23624,8,6,0,15,1,5,10,0,0,0,7,11,2,3,0 -23625,4,5,0,4,0,4,11,5,1,1,14,10,1,41,1 -23626,3,4,0,0,1,6,3,5,0,0,2,5,1,25,0 -23627,3,0,0,5,1,5,6,5,0,0,0,9,0,16,0 -23628,4,0,0,3,3,6,2,1,1,1,9,11,4,28,1 -23629,4,0,0,13,1,6,14,2,1,1,2,11,1,25,0 -23630,4,1,0,8,3,6,9,0,3,1,18,1,4,24,1 -23631,1,0,0,12,4,0,10,4,3,0,2,2,1,38,0 -23632,3,8,0,15,1,4,14,4,4,0,17,11,2,38,0 -23633,7,6,0,12,5,2,4,1,0,1,11,8,5,8,1 -23634,7,6,0,11,5,2,7,5,3,1,9,7,5,31,1 -23635,1,6,0,11,0,4,1,3,2,0,10,13,4,17,0 -23636,8,7,0,6,0,0,2,1,1,1,13,2,0,0,0 -23637,1,0,0,8,5,0,2,1,0,1,8,2,3,35,0 -23638,4,4,0,1,2,3,10,0,1,0,10,2,5,36,0 -23639,1,5,0,13,0,4,3,3,2,0,13,2,3,38,0 -23640,10,7,0,3,1,5,4,5,2,1,13,6,0,20,0 -23641,1,7,0,11,4,4,1,5,1,1,1,4,3,8,1 -23642,2,3,0,2,0,4,5,1,2,0,13,15,0,33,0 -23643,3,2,0,5,5,3,6,3,4,0,12,9,0,39,0 -23644,3,2,0,15,5,5,3,3,4,1,10,4,1,39,0 -23645,10,1,0,4,1,1,6,0,0,0,13,2,3,20,0 -23646,3,2,0,0,1,2,13,0,4,0,1,18,1,38,0 -23647,0,2,0,10,1,4,8,4,4,1,1,2,1,3,0 -23648,5,0,0,12,2,5,10,0,2,0,8,7,5,3,0 -23649,9,2,0,14,4,2,14,4,0,1,7,20,3,23,1 -23650,1,2,0,5,1,2,6,2,2,0,13,11,0,36,0 -23651,0,0,0,3,1,6,11,4,0,1,9,11,2,38,0 -23652,2,7,0,4,1,4,3,0,3,1,6,0,1,31,0 
-23653,6,3,0,4,4,0,8,4,0,0,8,2,0,5,0 -23654,1,1,0,4,1,1,7,0,2,1,13,6,1,14,0 -23655,3,7,0,15,6,0,0,3,3,0,13,20,3,37,0 -23656,0,2,0,15,1,4,1,0,1,0,2,11,2,13,0 -23657,6,6,0,4,6,2,1,2,2,0,17,1,1,9,1 -23658,2,6,0,6,3,2,11,0,1,1,18,7,0,41,1 -23659,6,7,0,4,1,2,11,1,1,0,18,18,3,25,1 -23660,3,7,0,3,6,3,13,0,4,1,12,16,3,29,0 -23661,10,0,0,11,4,0,2,4,3,1,10,6,0,31,0 -23662,7,2,0,13,6,5,12,5,3,1,8,6,1,29,0 -23663,1,4,0,15,2,4,13,3,1,0,2,20,0,38,0 -23664,6,2,0,11,3,1,8,4,1,0,13,0,3,27,0 -23665,6,4,0,10,5,4,13,4,3,0,13,11,5,28,0 -23666,2,7,0,3,0,3,12,4,3,1,12,0,3,29,0 -23667,3,3,0,3,1,4,9,1,3,1,13,11,0,23,0 -23668,5,2,0,11,5,3,14,3,4,1,3,11,5,19,0 -23669,0,6,0,11,6,5,1,1,2,1,6,2,0,3,0 -23670,9,6,0,2,4,6,9,1,2,1,8,13,2,14,0 -23671,2,4,0,0,4,6,14,3,0,1,6,15,3,38,0 -23672,1,7,0,13,1,1,4,0,2,0,16,15,0,32,0 -23673,2,0,0,5,3,4,12,3,4,0,13,0,2,35,0 -23674,7,0,0,5,4,2,3,5,3,0,3,4,0,21,1 -23675,9,2,0,3,6,3,14,5,0,1,18,12,2,32,1 -23676,10,5,0,0,6,1,11,1,2,1,9,10,4,9,1 -23677,0,8,0,13,0,5,0,3,3,0,10,6,5,25,0 -23678,10,7,0,7,2,0,9,0,0,0,0,2,5,32,0 -23679,0,4,0,8,0,5,2,3,0,1,4,11,1,21,0 -23680,3,6,0,1,0,0,4,2,0,0,8,2,5,30,0 -23681,0,0,0,13,4,0,4,2,0,1,6,11,5,31,0 -23682,10,2,0,0,1,5,8,0,4,1,13,18,0,33,0 -23683,1,8,0,13,4,5,0,3,1,0,6,12,3,41,0 -23684,0,2,0,6,2,2,9,2,0,1,5,16,3,8,0 -23685,7,5,0,9,6,1,7,5,0,1,18,17,4,19,1 -23686,3,3,0,7,2,4,8,4,0,0,2,0,5,38,0 -23687,1,1,0,3,2,3,0,3,3,1,0,2,2,35,0 -23688,2,5,0,13,4,6,9,4,2,0,2,11,1,32,0 -23689,7,3,0,13,3,5,5,5,0,1,4,18,3,25,0 -23690,2,3,0,13,4,4,2,2,2,0,15,0,5,20,0 -23691,5,8,0,6,0,3,8,1,4,1,8,0,4,6,0 -23692,7,8,0,2,5,0,8,2,0,1,18,8,1,36,1 -23693,0,2,0,4,5,1,8,1,4,0,6,6,2,19,0 -23694,6,1,0,10,4,2,1,0,4,1,7,5,0,27,1 -23695,5,1,0,3,0,0,9,1,1,0,13,2,0,35,0 -23696,3,4,0,2,2,1,6,5,0,1,5,3,5,20,1 -23697,8,2,0,3,4,3,6,2,3,1,3,12,0,22,0 -23698,5,4,0,7,4,5,8,3,1,1,16,0,1,20,0 -23699,2,6,0,6,6,0,9,0,1,0,2,3,4,20,0 -23700,8,8,0,4,0,3,6,4,3,0,18,6,3,3,0 -23701,9,2,0,12,0,4,7,4,4,1,8,12,0,32,0 -23702,3,7,0,0,6,4,14,2,2,0,3,6,4,38,0 
-23703,8,4,0,4,3,2,13,1,1,1,18,8,3,12,1 -23704,0,1,0,8,3,1,8,0,1,1,10,12,5,18,1 -23705,9,1,0,2,6,0,11,1,1,1,17,1,1,14,1 -23706,6,6,0,10,1,3,4,5,4,1,18,7,4,37,1 -23707,10,6,0,5,4,5,4,3,1,1,18,10,5,23,1 -23708,1,5,0,13,6,0,4,3,2,1,5,3,5,25,0 -23709,7,8,0,11,4,6,3,1,1,1,16,1,0,1,1 -23710,2,7,0,15,3,0,10,4,3,0,2,2,4,23,0 -23711,0,2,0,6,5,6,2,4,3,1,17,2,1,11,0 -23712,10,7,0,1,0,3,8,0,3,0,2,13,4,35,0 -23713,6,8,0,3,0,0,3,2,4,1,16,4,5,37,0 -23714,8,0,0,1,0,0,10,4,2,1,0,9,2,40,0 -23715,10,6,0,0,5,3,13,0,1,1,13,11,3,30,0 -23716,3,3,0,3,1,4,0,3,2,0,5,18,1,32,0 -23717,2,2,0,5,0,6,10,2,3,0,14,9,0,34,0 -23718,7,0,0,5,3,5,3,2,4,0,13,13,1,23,0 -23719,6,7,0,8,0,5,10,0,4,0,18,7,4,7,1 -23720,0,1,0,4,6,2,7,0,0,1,13,15,1,27,0 -23721,0,3,0,0,5,4,8,4,3,0,13,15,1,26,0 -23722,2,7,0,0,0,3,3,0,2,0,4,2,5,41,0 -23723,1,0,0,11,0,6,0,0,3,1,14,0,1,40,0 -23724,6,2,0,1,0,1,10,5,0,1,13,1,0,1,1 -23725,8,8,0,14,2,2,12,5,2,1,14,17,1,6,1 -23726,3,5,0,1,2,5,10,4,4,1,2,2,5,23,0 -23727,4,3,0,5,5,3,13,2,3,1,18,16,1,40,1 -23728,0,4,0,6,1,0,1,3,4,0,17,11,5,25,0 -23729,6,0,0,6,4,1,2,4,3,1,16,10,4,24,1 -23730,4,1,0,0,6,4,8,4,1,1,10,14,5,10,0 -23731,3,8,0,13,2,2,10,0,3,0,13,6,4,25,0 -23732,4,0,0,13,0,5,12,2,2,0,8,20,0,4,0 -23733,1,0,0,5,6,6,2,2,1,0,12,11,4,0,0 -23734,9,0,0,13,1,2,2,0,0,0,2,9,0,21,0 -23735,1,1,0,11,6,1,5,2,0,1,17,4,5,33,0 -23736,1,2,0,9,3,2,10,0,3,0,17,10,2,40,1 -23737,6,8,0,4,3,1,9,1,2,1,16,2,5,11,0 -23738,5,3,0,3,6,4,5,3,1,0,13,19,1,3,0 -23739,5,4,0,6,2,6,9,0,0,1,0,8,4,30,0 -23740,2,5,0,6,6,4,4,3,1,1,10,6,4,28,0 -23741,6,2,0,13,4,6,5,0,1,0,1,17,4,12,1 -23742,1,2,0,13,0,3,3,1,3,1,13,6,3,37,0 -23743,9,5,0,11,1,5,4,2,4,0,16,3,0,38,0 -23744,8,6,0,0,6,1,2,3,0,0,8,2,1,19,0 -23745,0,3,0,0,0,6,11,1,1,1,4,20,5,36,0 -23746,9,2,0,14,5,0,8,4,2,0,2,4,5,23,0 -23747,10,5,0,7,4,0,3,3,0,1,0,2,5,28,0 -23748,7,6,0,6,2,0,4,3,4,0,2,6,1,36,0 -23749,9,2,0,3,1,4,2,5,1,1,18,8,1,21,1 -23750,8,5,0,4,4,3,5,1,1,0,2,11,0,8,0 -23751,8,0,0,7,0,5,12,0,1,0,13,16,1,30,0 -23752,10,0,0,4,1,0,12,5,3,0,8,17,2,35,0 
-23753,7,0,0,10,4,0,4,0,4,1,0,7,2,30,1 -23754,2,1,0,9,2,2,1,3,1,0,14,13,1,18,0 -23755,0,4,0,8,6,3,9,5,4,1,17,4,3,8,0 -23756,1,4,0,15,0,5,8,3,3,1,15,1,2,29,0 -23757,1,6,0,0,0,5,0,4,1,0,2,16,0,19,0 -23758,0,3,0,6,5,5,5,3,3,0,12,20,2,5,0 -23759,5,5,0,8,1,2,6,0,0,1,11,3,5,25,1 -23760,0,5,0,1,1,4,5,5,2,0,8,0,3,8,0 -23761,10,3,0,15,6,0,3,3,0,0,6,6,2,4,0 -23762,3,0,0,6,5,3,2,5,3,0,7,5,2,31,1 -23763,5,7,0,8,2,2,14,1,1,1,18,14,1,28,1 -23764,3,8,0,15,2,3,5,1,0,0,4,15,2,11,0 -23765,0,7,0,15,4,4,4,3,3,1,1,2,4,16,0 -23766,2,7,0,8,6,4,2,0,3,1,13,2,1,40,0 -23767,8,6,0,4,4,0,7,0,4,0,2,9,2,28,0 -23768,2,3,0,12,6,4,6,0,4,0,15,11,0,3,0 -23769,0,2,0,13,3,6,8,1,4,1,13,11,5,7,0 -23770,5,4,0,11,1,0,14,0,1,0,13,0,0,28,0 -23771,3,4,0,3,1,6,9,4,2,1,13,18,1,37,0 -23772,1,8,0,4,5,0,9,4,2,0,13,15,5,37,0 -23773,6,3,0,12,2,2,7,5,0,0,11,4,0,4,0 -23774,3,3,0,2,2,2,8,3,0,1,0,19,0,29,0 -23775,1,8,0,7,4,3,1,2,0,0,13,4,5,26,0 -23776,7,0,0,3,0,5,10,2,2,1,5,11,4,28,0 -23777,0,2,0,1,0,1,4,4,1,1,17,2,1,4,0 -23778,4,3,0,1,0,1,7,4,4,1,17,16,4,36,0 -23779,4,6,0,9,1,1,3,0,3,0,0,9,5,3,0 -23780,10,4,0,14,1,1,4,1,3,0,3,10,3,27,1 -23781,0,2,0,3,5,5,3,3,2,0,2,15,0,25,0 -23782,2,0,0,12,1,3,2,2,3,0,13,2,1,30,0 -23783,10,0,0,10,4,1,6,1,2,1,16,10,3,41,1 -23784,7,5,0,2,0,5,1,2,3,0,2,2,4,30,0 -23785,6,5,0,4,2,2,2,0,2,0,18,1,4,13,1 -23786,7,6,0,15,4,1,3,1,4,0,16,2,0,3,0 -23787,9,8,0,4,3,0,8,2,0,1,4,11,4,9,0 -23788,0,4,0,8,3,0,9,2,1,1,8,11,4,13,0 -23789,2,8,0,8,6,4,11,2,3,1,16,3,4,39,0 -23790,6,8,0,10,6,4,8,0,1,0,5,9,3,28,1 -23791,7,8,0,15,5,2,11,3,3,0,14,3,1,11,1 -23792,0,3,0,3,0,0,12,3,2,0,2,11,1,16,0 -23793,1,8,0,14,3,0,7,2,3,0,2,11,5,3,0 -23794,3,4,0,12,2,5,5,5,0,0,7,2,0,24,0 -23795,7,8,0,4,6,2,4,3,3,0,9,1,5,25,1 -23796,8,8,0,8,6,3,2,5,1,1,17,13,2,17,0 -23797,9,5,0,7,3,5,1,4,3,1,11,7,5,31,1 -23798,5,3,0,5,1,1,11,5,3,0,18,14,3,8,1 -23799,1,7,0,6,1,3,14,1,1,1,8,20,1,11,0 -23800,1,6,0,6,2,5,6,3,4,0,13,2,5,6,0 -23801,6,6,0,10,5,2,6,4,4,1,9,1,4,13,1 -23802,8,7,0,4,1,5,4,5,4,1,7,8,5,19,1 
-23803,3,8,0,10,6,6,14,4,0,1,4,5,0,16,0 -23804,5,2,0,3,5,4,9,1,4,0,12,16,4,18,0 -23805,2,3,0,0,5,2,9,0,2,1,0,16,4,38,0 -23806,6,6,0,9,0,1,11,5,3,0,18,12,5,10,1 -23807,3,6,0,15,2,3,5,5,0,0,10,20,2,5,0 -23808,2,8,0,13,5,1,2,3,4,0,6,13,1,31,0 -23809,0,6,0,12,1,2,5,1,4,1,8,15,5,6,0 -23810,3,7,0,9,6,4,1,0,4,1,17,4,2,7,0 -23811,2,8,0,15,5,0,9,0,0,0,12,12,5,19,0 -23812,5,8,0,12,3,4,4,3,0,1,18,5,1,3,1 -23813,2,5,0,3,6,4,6,5,0,0,2,3,1,16,0 -23814,7,8,0,5,0,5,4,2,4,0,4,13,5,27,0 -23815,8,4,0,10,6,0,6,5,4,0,17,7,2,37,0 -23816,2,6,0,3,0,6,0,3,1,0,4,0,1,17,0 -23817,8,5,0,14,0,0,8,4,3,1,8,2,3,3,0 -23818,7,0,0,15,6,4,7,2,3,1,14,7,3,30,0 -23819,1,4,0,10,0,3,11,1,3,1,15,3,4,18,0 -23820,1,4,0,3,0,2,2,3,2,0,17,4,5,32,0 -23821,0,7,0,8,4,6,10,1,3,0,12,11,2,41,0 -23822,5,4,0,3,1,3,4,5,0,0,13,13,1,12,0 -23823,3,0,0,2,6,3,5,4,4,0,8,13,0,25,0 -23824,1,7,0,3,4,5,5,0,4,1,2,15,4,22,0 -23825,3,0,0,12,2,3,9,0,1,0,8,3,5,5,0 -23826,2,6,0,7,2,2,5,1,1,1,8,15,2,22,0 -23827,4,4,0,8,4,5,6,5,0,1,10,7,0,27,1 -23828,5,0,0,12,6,1,0,3,0,1,13,11,0,23,0 -23829,10,2,0,1,5,0,10,0,0,1,17,4,4,22,0 -23830,4,6,0,3,2,1,1,2,0,1,8,11,3,7,0 -23831,0,8,0,8,0,3,2,0,0,1,0,2,1,4,0 -23832,0,2,0,2,6,3,10,4,0,0,2,15,5,28,0 -23833,4,1,0,4,4,4,0,5,2,1,4,8,5,19,0 -23834,0,3,0,15,0,2,9,3,0,0,2,5,2,0,0 -23835,10,4,0,3,1,3,1,1,2,1,17,14,2,18,0 -23836,6,1,0,11,2,2,7,0,3,0,5,8,4,28,1 -23837,2,7,0,3,1,2,10,2,1,1,11,15,5,28,1 -23838,0,5,0,2,1,0,9,2,3,1,8,13,5,40,0 -23839,9,8,0,3,6,5,3,1,2,1,13,4,3,28,0 -23840,9,0,0,0,3,1,12,5,0,1,14,7,4,22,1 -23841,8,3,0,4,4,6,10,5,1,1,9,7,5,19,1 -23842,4,7,0,15,1,2,9,4,2,0,2,17,5,14,0 -23843,5,2,0,11,1,5,3,0,0,0,13,17,1,2,1 -23844,1,8,0,7,5,3,9,3,4,0,13,6,2,20,0 -23845,8,7,0,9,2,6,2,1,0,1,2,13,5,38,0 -23846,2,1,0,5,0,1,2,5,3,1,8,18,0,20,0 -23847,0,2,0,0,0,3,1,2,1,1,4,9,0,26,0 -23848,0,8,0,15,1,3,12,2,3,1,4,18,3,36,0 -23849,9,6,0,6,0,2,13,2,4,1,12,2,5,6,0 -23850,2,7,0,1,3,5,8,1,1,0,5,7,5,41,1 -23851,5,4,0,14,2,2,9,2,4,0,0,17,4,20,1 -23852,8,6,0,9,0,4,3,0,2,0,13,4,0,24,0 
-23853,4,0,0,3,6,3,6,0,1,1,7,12,1,0,0 -23854,6,0,0,0,4,3,14,2,2,0,13,11,2,29,0 -23855,8,1,0,3,1,1,14,5,1,1,6,8,1,36,1 -23856,1,4,0,5,1,1,12,0,2,0,2,11,0,19,0 -23857,0,0,0,4,5,0,6,2,2,1,0,2,3,22,0 -23858,1,1,0,3,4,5,9,4,1,1,2,2,5,11,0 -23859,0,6,0,3,5,1,12,1,4,0,13,2,2,12,0 -23860,6,8,0,5,5,2,11,2,1,0,9,6,2,13,1 -23861,1,4,0,3,6,4,9,2,0,0,17,11,5,37,0 -23862,0,5,0,5,3,0,9,3,4,1,8,2,3,32,0 -23863,10,5,0,3,5,3,8,1,0,0,13,9,1,27,0 -23864,1,4,0,6,2,6,3,0,4,0,13,18,4,13,0 -23865,8,2,0,3,1,0,13,4,4,0,2,9,2,19,0 -23866,3,4,0,1,6,0,8,3,2,1,17,15,0,39,0 -23867,0,8,0,8,1,3,5,2,3,0,0,6,5,23,0 -23868,0,2,0,10,3,5,6,2,0,1,2,2,0,17,0 -23869,5,4,0,10,0,4,5,4,4,0,14,3,0,16,0 -23870,9,1,0,8,3,4,0,1,2,0,3,19,1,19,1 -23871,0,3,0,11,5,0,1,1,0,0,8,15,3,8,0 -23872,6,3,0,12,0,0,0,5,3,0,12,4,2,28,0 -23873,10,6,0,3,0,6,5,4,0,1,13,2,4,18,0 -23874,1,4,0,1,0,5,6,0,3,0,4,2,4,0,0 -23875,4,8,0,1,6,0,9,5,2,1,2,2,5,40,0 -23876,4,3,0,1,5,6,7,2,3,0,17,10,4,22,0 -23877,1,8,0,0,1,3,12,2,1,0,18,8,4,21,1 -23878,6,6,0,1,0,5,0,5,3,1,15,2,4,39,0 -23879,8,1,0,6,1,4,8,1,0,0,13,2,0,26,0 -23880,0,0,0,9,6,5,2,1,2,0,2,2,1,39,0 -23881,3,3,0,8,2,5,10,3,1,1,13,13,5,35,0 -23882,8,7,0,3,6,6,9,4,3,1,11,5,5,20,1 -23883,10,2,0,6,4,5,5,5,1,1,13,2,1,8,0 -23884,10,1,0,10,4,2,13,2,4,0,18,1,5,1,1 -23885,1,7,0,10,1,5,4,4,1,0,8,6,5,32,0 -23886,4,3,0,13,6,5,1,1,2,0,8,0,0,23,0 -23887,10,8,0,3,6,5,6,0,0,0,10,18,1,35,0 -23888,1,6,0,5,0,5,7,2,3,1,2,11,1,8,0 -23889,9,5,0,7,3,0,0,0,1,1,3,10,4,3,1 -23890,10,7,0,12,5,1,7,1,4,1,14,8,3,3,1 -23891,9,2,0,15,2,0,4,3,3,0,2,5,4,2,0 -23892,0,2,0,4,5,5,14,5,3,1,8,9,2,8,0 -23893,3,8,0,2,4,0,3,0,0,1,10,4,5,19,0 -23894,0,3,0,4,6,4,5,4,2,1,14,9,1,12,0 -23895,1,8,0,7,0,1,11,1,0,0,2,6,2,29,0 -23896,10,5,0,6,1,6,4,4,4,0,14,1,3,27,1 -23897,6,0,0,12,0,5,11,4,0,0,13,2,1,17,0 -23898,10,8,0,11,0,3,3,2,3,1,4,11,3,40,0 -23899,1,2,0,6,0,2,1,3,1,0,10,13,1,29,0 -23900,7,7,0,3,5,1,5,2,3,0,13,4,1,37,0 -23901,2,7,0,3,5,6,9,3,0,0,10,15,0,10,0 -23902,9,4,0,3,2,5,6,3,1,1,6,13,0,7,0 
-23903,9,8,0,1,4,1,8,3,3,1,2,4,2,31,0 -23904,0,5,0,3,3,0,3,3,2,0,17,2,0,34,0 -23905,7,4,0,13,6,5,6,3,0,0,0,2,1,11,0 -23906,2,2,0,5,6,5,7,3,2,1,2,2,5,18,0 -23907,3,7,0,8,3,6,2,4,3,0,13,2,1,31,0 -23908,4,0,0,7,2,5,3,5,0,1,18,1,1,30,1 -23909,0,0,0,12,4,3,3,4,0,0,7,11,2,41,0 -23910,0,1,0,11,5,0,14,2,3,1,15,13,3,30,0 -23911,4,6,0,14,5,6,6,1,1,1,15,6,5,37,0 -23912,2,5,0,3,6,0,3,0,0,0,4,20,0,12,0 -23913,6,1,0,5,6,6,0,3,1,0,13,1,5,14,0 -23914,0,7,0,14,1,0,13,5,4,0,2,0,0,30,0 -23915,2,4,0,1,0,6,6,5,4,0,8,2,2,1,0 -23916,4,5,0,10,3,4,9,4,2,0,15,6,4,27,0 -23917,2,6,0,8,5,2,9,3,2,1,8,15,0,19,0 -23918,3,4,0,11,4,0,9,4,1,0,8,11,0,18,0 -23919,8,3,0,8,6,0,13,1,3,1,18,16,5,1,1 -23920,7,7,0,1,5,1,6,2,0,0,13,2,4,2,0 -23921,2,7,0,4,3,5,2,3,3,1,8,17,4,33,0 -23922,8,1,0,6,0,5,9,4,0,1,10,9,3,2,0 -23923,0,5,0,11,0,0,8,0,2,0,8,15,0,0,0 -23924,1,3,0,9,2,3,5,4,1,1,0,16,1,7,0 -23925,10,7,0,12,4,0,6,1,3,1,1,6,5,13,1 -23926,3,6,0,2,0,0,8,0,4,1,13,6,2,40,0 -23927,8,1,0,3,3,1,6,2,4,1,11,11,5,29,1 -23928,5,1,0,14,4,0,7,3,3,1,13,0,1,39,0 -23929,0,7,0,11,2,6,8,0,1,1,8,6,2,22,0 -23930,0,5,0,4,4,5,8,4,1,0,13,1,0,28,0 -23931,0,0,0,0,4,1,8,4,1,1,3,18,4,22,1 -23932,2,0,0,9,1,6,11,3,4,0,2,11,5,24,0 -23933,3,1,0,10,2,5,4,0,0,1,14,17,4,0,0 -23934,3,2,0,0,5,5,10,4,4,0,8,11,5,14,0 -23935,3,8,0,14,3,3,0,3,1,1,5,8,4,3,1 -23936,6,8,0,0,4,0,9,2,1,0,13,6,0,29,0 -23937,0,8,0,4,1,5,5,1,4,1,10,2,1,13,0 -23938,2,3,0,5,3,4,6,1,0,1,17,18,5,6,0 -23939,9,7,0,13,6,4,14,3,4,0,4,9,1,8,0 -23940,5,8,0,0,0,5,0,0,0,0,16,1,4,20,0 -23941,3,2,0,11,6,5,14,1,4,1,11,2,0,41,0 -23942,8,4,0,11,5,0,8,0,1,0,2,2,1,23,0 -23943,9,4,0,15,4,5,12,5,4,0,4,4,0,27,0 -23944,6,7,0,3,0,1,1,1,1,0,13,6,0,25,0 -23945,0,1,0,11,0,6,6,4,0,1,13,4,0,31,0 -23946,2,8,0,15,0,5,2,3,4,1,13,0,5,31,0 -23947,4,2,0,6,6,3,9,3,3,0,0,11,5,4,0 -23948,10,5,0,14,3,1,14,4,1,1,12,11,5,41,0 -23949,10,2,0,6,2,2,1,3,3,1,4,2,2,35,0 -23950,3,2,0,12,0,0,3,4,0,1,13,5,2,19,0 -23951,2,3,0,2,2,5,3,0,1,1,0,9,1,7,0 -23952,0,6,0,0,3,4,13,0,0,1,13,20,4,8,0 
-23953,1,6,0,3,1,6,5,1,1,0,13,2,0,4,0 -23954,1,5,0,14,3,4,8,1,3,1,13,2,5,40,0 -23955,2,4,0,14,2,5,13,0,2,0,2,15,0,29,0 -23956,2,6,0,4,0,4,1,1,0,1,2,6,1,17,0 -23957,9,4,0,5,0,6,0,5,0,1,8,9,3,11,0 -23958,5,6,0,4,5,4,10,0,2,1,13,19,3,28,0 -23959,3,5,0,13,6,3,4,5,2,1,8,4,1,37,0 -23960,0,8,0,3,6,5,8,3,4,0,13,2,1,12,0 -23961,1,7,0,14,1,4,6,1,1,0,7,4,0,20,0 -23962,10,0,0,11,5,5,7,3,4,1,1,17,4,8,1 -23963,0,3,0,11,0,6,8,3,2,0,0,18,1,40,0 -23964,4,2,0,10,3,0,10,3,3,1,10,6,0,37,0 -23965,3,6,0,5,0,2,6,2,0,1,2,2,1,38,0 -23966,0,7,0,10,6,2,10,3,2,0,6,2,4,0,0 -23967,6,6,0,15,6,6,6,3,2,1,13,0,2,27,0 -23968,1,0,0,4,5,2,0,3,1,0,8,18,1,13,0 -23969,5,4,0,15,1,2,1,0,2,0,1,16,5,38,0 -23970,9,1,0,2,0,5,2,2,1,1,12,13,0,6,0 -23971,4,1,0,10,5,1,0,0,3,1,11,1,1,23,1 -23972,9,1,0,7,6,6,0,1,2,0,18,13,4,38,1 -23973,3,5,0,2,5,3,0,3,3,1,6,2,3,4,0 -23974,10,3,0,5,1,5,11,0,1,0,17,20,3,3,0 -23975,0,3,0,13,2,4,1,2,4,0,8,0,0,28,0 -23976,0,5,0,12,0,2,1,1,4,0,15,2,5,24,0 -23977,6,2,0,3,5,5,13,0,1,0,0,14,2,19,0 -23978,1,7,0,13,2,6,6,4,1,0,2,9,0,12,0 -23979,2,8,0,9,1,1,2,2,2,1,4,2,3,34,0 -23980,8,5,0,6,0,3,12,4,4,0,1,16,3,16,1 -23981,2,5,0,9,4,5,10,0,4,0,9,10,3,21,1 -23982,0,6,0,0,2,6,2,3,0,0,2,12,5,11,0 -23983,2,2,0,6,3,1,9,3,4,0,13,2,4,5,0 -23984,0,0,0,13,2,6,4,4,3,1,17,11,0,40,0 -23985,3,4,0,2,0,0,13,1,4,0,2,11,3,29,0 -23986,10,2,0,2,0,1,8,4,0,1,17,9,0,31,0 -23987,6,5,0,11,2,1,4,2,0,1,1,5,3,16,1 -23988,6,6,0,0,2,1,0,5,1,1,9,10,3,23,1 -23989,9,6,0,8,5,6,9,2,3,0,0,2,0,9,0 -23990,7,4,0,6,2,3,5,1,0,1,2,9,2,18,0 -23991,1,6,0,5,2,0,3,3,1,0,0,15,0,25,0 -23992,8,2,0,11,6,0,10,3,3,0,9,11,2,6,0 -23993,10,8,0,4,3,3,8,0,1,1,3,12,1,37,1 -23994,5,1,0,5,0,0,8,4,0,0,2,16,0,1,0 -23995,0,0,0,3,6,3,9,3,2,1,13,2,1,36,0 -23996,7,6,0,4,0,0,6,5,3,1,18,2,4,32,1 -23997,7,6,0,12,3,5,11,4,0,1,13,2,2,39,0 -23998,4,0,0,1,2,0,10,1,3,1,8,6,0,23,0 -23999,7,6,0,13,1,5,2,3,1,1,13,2,5,28,0 -24000,9,6,0,1,5,2,10,2,2,1,2,11,2,16,0 -24001,10,1,0,9,2,3,1,2,4,0,7,17,3,22,1 -24002,3,8,0,6,0,3,8,4,3,1,1,9,3,41,0 
-24003,2,7,0,9,2,5,8,2,1,0,7,5,5,32,1 -24004,1,7,0,12,1,0,2,2,4,1,6,11,2,13,0 -24005,7,3,0,6,6,5,11,1,4,1,13,11,0,30,0 -24006,3,1,0,1,3,5,6,1,2,0,13,11,0,3,0 -24007,0,5,0,6,0,5,10,4,1,0,2,15,0,18,0 -24008,10,5,0,5,0,0,13,5,0,0,18,19,1,14,1 -24009,0,8,0,0,4,1,0,1,1,1,8,18,2,24,0 -24010,10,0,0,15,5,5,7,4,4,0,8,15,0,19,0 -24011,9,3,0,5,2,2,4,4,0,1,5,4,4,32,1 -24012,0,6,0,15,6,4,10,3,2,0,14,11,1,10,0 -24013,9,8,0,7,5,2,10,5,3,1,18,10,1,1,1 -24014,10,7,0,1,2,2,12,0,2,0,4,11,0,26,0 -24015,3,7,0,1,2,0,10,0,2,1,4,11,0,11,0 -24016,6,6,0,13,6,5,9,0,0,0,0,0,3,4,0 -24017,0,7,0,8,6,1,9,0,2,0,17,13,4,1,0 -24018,0,4,0,11,5,0,7,4,2,1,2,15,2,41,0 -24019,3,1,0,1,3,6,3,1,1,1,4,7,4,11,1 -24020,1,1,0,10,3,2,11,2,1,1,1,14,1,9,1 -24021,8,5,0,1,3,2,10,5,3,0,10,10,5,41,1 -24022,3,5,0,4,0,6,7,1,0,0,13,2,2,3,0 -24023,7,6,0,9,4,4,6,1,2,1,5,0,5,28,1 -24024,2,1,0,1,1,5,12,5,3,1,13,6,0,7,0 -24025,10,6,0,5,3,3,5,3,3,1,13,2,5,39,0 -24026,10,6,0,10,6,5,13,4,0,1,5,7,2,9,1 -24027,0,1,0,10,5,2,10,0,0,0,13,2,3,12,0 -24028,9,5,0,4,5,4,9,2,4,0,6,0,3,38,0 -24029,9,6,0,10,5,1,12,5,3,1,18,1,3,31,1 -24030,1,3,0,3,1,4,8,2,3,0,13,18,3,14,0 -24031,0,7,0,7,6,5,9,3,0,1,13,15,5,26,0 -24032,0,3,0,6,2,1,6,1,2,0,12,15,1,22,0 -24033,6,7,0,2,0,4,9,4,3,1,13,11,0,2,0 -24034,1,4,0,14,0,6,14,1,4,0,0,0,5,25,0 -24035,0,1,0,8,0,4,0,2,4,0,9,13,2,16,0 -24036,3,7,0,1,2,6,0,1,0,0,14,11,5,1,0 -24037,0,7,0,5,0,2,9,3,1,1,13,6,3,1,0 -24038,5,7,0,11,6,0,8,3,0,1,3,11,0,10,0 -24039,2,6,0,15,0,2,9,4,1,1,2,14,0,26,0 -24040,6,0,0,14,0,5,13,2,1,1,8,1,0,8,0 -24041,1,2,0,14,3,1,3,4,3,0,2,2,4,20,0 -24042,1,3,0,0,3,5,6,1,0,1,15,18,2,33,0 -24043,9,4,0,13,4,1,9,1,3,0,2,18,3,25,0 -24044,3,0,0,2,2,4,13,0,2,1,10,18,5,22,0 -24045,2,5,0,0,6,5,8,0,0,1,6,9,1,27,0 -24046,8,0,0,14,6,0,9,2,1,0,8,0,0,20,0 -24047,1,7,0,13,5,3,0,5,3,1,3,11,1,22,0 -24048,4,1,0,14,0,4,5,2,2,1,2,2,0,7,0 -24049,1,8,0,1,0,5,10,3,0,1,10,16,4,6,0 -24050,3,7,0,7,0,3,9,4,2,1,13,11,5,3,0 -24051,1,8,0,15,5,4,6,1,0,0,13,2,5,16,0 -24052,1,3,0,3,0,1,14,1,0,1,8,9,4,27,0 
-24053,2,6,0,9,1,6,9,5,4,1,18,17,4,28,1 -24054,4,4,0,5,3,6,10,0,0,1,16,9,5,1,0 -24055,5,3,0,2,5,3,12,5,2,1,14,13,4,23,1 -24056,3,7,0,13,1,4,2,2,0,1,2,6,1,9,0 -24057,3,3,0,7,2,0,9,4,3,0,13,18,4,35,0 -24058,6,1,0,10,5,0,2,0,1,0,5,7,5,14,1 -24059,1,1,0,0,0,5,0,5,1,1,13,13,5,8,0 -24060,10,0,0,12,6,5,7,2,2,0,6,2,4,18,0 -24061,10,3,0,4,1,3,2,3,0,0,5,17,4,29,1 -24062,5,4,0,1,1,6,10,5,3,1,4,3,1,39,0 -24063,9,2,0,1,2,0,11,0,1,1,8,8,4,10,0 -24064,7,2,0,10,1,3,14,3,3,0,5,6,2,22,1 -24065,5,8,0,15,6,6,0,0,0,1,7,14,4,40,1 -24066,4,4,0,3,6,0,6,4,0,1,2,11,5,3,0 -24067,8,5,0,8,1,0,5,2,0,1,8,11,1,18,0 -24068,3,7,0,7,2,4,14,0,1,0,14,5,0,17,0 -24069,4,0,0,3,4,2,12,5,2,0,18,7,1,23,1 -24070,1,4,0,5,5,6,8,5,2,1,13,2,2,26,0 -24071,8,2,0,14,1,2,9,0,3,1,7,1,2,8,1 -24072,1,6,0,13,5,4,6,3,4,1,8,0,3,31,0 -24073,8,2,0,7,0,0,3,2,2,0,2,15,1,4,0 -24074,7,7,0,3,0,3,6,4,3,1,13,11,2,8,0 -24075,5,0,0,10,5,5,7,0,1,0,13,2,2,13,0 -24076,1,6,0,13,6,6,5,4,1,0,8,9,4,16,0 -24077,7,5,0,6,3,4,2,5,0,0,5,19,3,8,1 -24078,4,1,0,8,2,6,1,4,1,1,2,6,1,22,0 -24079,2,7,0,5,5,2,8,0,3,0,10,2,0,31,0 -24080,1,4,0,8,0,4,1,1,4,1,6,11,3,29,0 -24081,4,7,0,2,1,6,1,4,0,0,2,11,0,12,0 -24082,0,8,0,7,3,4,4,3,3,1,2,2,1,8,0 -24083,0,7,0,8,3,4,11,2,0,1,1,18,4,6,0 -24084,10,8,0,4,2,4,1,3,1,1,5,2,5,22,0 -24085,2,8,0,11,2,1,4,3,2,0,13,9,5,33,0 -24086,1,2,0,10,0,0,7,0,2,0,5,0,3,16,0 -24087,1,6,0,4,6,3,9,3,2,0,14,18,2,34,0 -24088,0,5,0,7,0,5,3,2,1,0,8,11,0,6,0 -24089,1,3,0,8,0,3,10,0,0,1,8,11,3,20,0 -24090,10,8,0,11,1,0,14,4,0,0,6,7,1,13,1 -24091,0,5,0,7,2,0,10,2,0,0,13,9,4,20,0 -24092,3,6,0,13,2,3,6,4,2,0,10,2,0,31,0 -24093,2,8,0,7,5,4,8,2,2,0,17,2,0,19,0 -24094,1,8,0,8,0,3,8,5,0,0,2,13,3,35,0 -24095,5,5,0,15,0,1,9,3,3,1,2,11,1,4,0 -24096,6,8,0,15,3,1,4,1,1,1,9,17,0,7,1 -24097,0,3,0,0,3,2,6,1,1,0,7,2,5,19,0 -24098,6,1,0,8,4,6,11,4,2,0,5,7,1,6,0 -24099,10,7,0,10,6,3,1,2,0,0,3,7,2,38,1 -24100,9,8,0,14,4,2,7,2,4,1,9,17,3,30,1 -24101,6,7,0,6,2,1,9,4,0,1,5,4,5,39,1 -24102,0,6,0,12,1,4,3,0,0,0,13,2,0,31,0 -24103,0,5,0,5,0,0,8,3,2,1,7,0,0,21,0 
-24104,8,1,0,11,3,5,11,0,1,1,7,5,4,21,1 -24105,4,8,0,8,1,1,1,0,3,0,9,3,3,1,1 -24106,0,7,0,8,0,1,10,0,4,1,8,2,5,7,0 -24107,2,1,0,11,5,5,0,5,3,1,3,17,1,17,1 -24108,1,8,0,6,5,6,9,4,1,1,2,4,4,38,0 -24109,4,3,0,4,6,0,1,5,3,1,2,2,3,17,0 -24110,8,7,0,8,3,1,13,5,2,0,3,3,4,1,1 -24111,3,7,0,10,0,2,10,1,3,1,2,11,0,32,0 -24112,7,1,0,12,6,1,3,2,2,0,11,5,4,7,1 -24113,3,2,0,7,6,6,8,3,3,1,2,18,0,29,0 -24114,6,4,0,2,5,4,11,0,3,1,10,2,3,13,0 -24115,1,0,0,2,5,4,7,3,0,0,8,2,4,8,0 -24116,0,8,0,0,0,5,5,2,2,0,4,2,4,4,0 -24117,4,4,0,11,1,1,5,1,0,1,3,1,5,34,1 -24118,2,4,0,11,0,6,9,4,4,0,13,19,2,26,0 -24119,5,8,0,1,0,3,5,0,4,0,2,19,4,16,0 -24120,7,2,0,7,1,2,1,0,2,1,13,18,0,14,0 -24121,7,3,0,3,0,3,0,4,3,1,8,9,1,4,0 -24122,8,2,0,6,1,1,0,0,1,1,9,18,0,5,1 -24123,10,8,0,3,0,0,7,4,1,0,17,15,5,6,0 -24124,6,2,0,4,6,5,10,1,3,1,13,11,4,30,0 -24125,4,7,0,3,2,1,14,2,2,1,2,0,1,27,0 -24126,10,1,0,12,3,1,11,1,3,1,1,19,1,28,1 -24127,8,4,0,7,5,2,14,3,4,1,1,13,5,33,1 -24128,1,6,0,3,3,6,9,4,4,0,17,18,2,26,0 -24129,4,4,0,13,1,4,10,0,2,0,5,14,0,36,0 -24130,2,1,0,9,0,5,8,1,4,0,7,2,0,6,0 -24131,7,4,0,8,1,6,12,4,1,0,2,2,1,39,0 -24132,2,8,0,14,2,1,12,2,0,0,12,2,2,22,0 -24133,6,7,0,8,4,5,6,5,2,1,6,6,4,24,0 -24134,7,4,0,4,1,1,2,0,2,0,5,16,5,21,0 -24135,3,3,0,3,6,0,9,1,0,1,7,13,3,13,0 -24136,9,3,0,10,0,1,10,3,1,1,2,15,1,28,0 -24137,0,5,0,6,1,3,1,2,4,1,2,8,2,19,0 -24138,10,5,0,0,4,5,6,1,1,0,6,11,4,26,0 -24139,4,6,0,1,6,5,6,2,2,0,2,15,4,5,0 -24140,4,6,0,9,6,2,5,2,4,1,12,7,5,16,1 -24141,7,4,0,11,6,4,5,4,2,1,2,15,1,37,0 -24142,0,3,0,4,0,0,1,5,4,0,6,9,4,4,0 -24143,2,6,0,5,3,4,6,1,2,1,2,13,1,33,0 -24144,2,2,0,15,5,4,1,0,4,1,13,2,1,14,0 -24145,3,5,0,12,1,4,5,3,0,1,17,9,1,34,0 -24146,2,1,0,2,1,6,7,4,0,1,2,2,0,30,0 -24147,6,2,0,12,2,1,2,0,3,0,9,14,5,9,1 -24148,9,0,0,3,1,5,3,2,1,0,17,20,0,40,0 -24149,4,0,0,9,2,1,0,3,1,0,15,15,4,32,0 -24150,9,5,0,12,1,4,5,3,4,1,8,2,2,26,0 -24151,8,4,0,11,4,4,12,3,4,1,9,15,3,32,1 -24152,5,8,0,2,0,3,0,1,1,0,2,11,1,26,0 -24153,3,1,0,13,3,1,0,1,0,1,9,10,5,6,1 -24154,0,8,0,2,0,4,7,2,0,1,8,5,0,24,0 
-24155,9,6,0,6,1,4,8,3,3,1,2,2,4,14,0 -24156,6,7,0,3,0,1,2,2,2,1,6,11,2,33,0 -24157,0,6,0,15,3,2,8,2,0,0,12,11,3,5,0 -24158,3,6,0,8,0,5,8,2,0,1,0,2,4,29,0 -24159,5,2,0,14,1,0,9,1,2,1,4,7,2,31,1 -24160,6,1,0,0,0,1,9,4,4,1,2,15,4,38,0 -24161,1,1,0,9,1,1,12,3,1,0,13,6,3,35,0 -24162,2,6,0,11,0,5,14,5,2,0,8,9,3,21,0 -24163,3,8,0,3,6,0,13,5,0,0,7,4,1,38,0 -24164,0,2,0,7,1,2,7,3,3,1,2,2,0,8,0 -24165,6,3,0,1,6,0,12,5,4,1,4,11,5,4,0 -24166,7,6,0,1,0,0,8,0,2,1,5,17,3,17,0 -24167,6,8,0,12,4,5,5,1,4,0,15,14,2,13,1 -24168,0,2,0,14,1,0,14,4,2,1,4,16,3,34,0 -24169,8,4,0,0,2,0,5,0,2,1,18,14,5,12,1 -24170,1,7,0,10,3,4,11,1,0,0,8,15,1,27,0 -24171,3,7,0,8,0,5,2,2,4,0,0,6,2,35,0 -24172,2,4,0,1,5,5,1,4,3,0,8,17,4,4,0 -24173,5,1,0,13,0,5,1,2,0,1,2,9,1,25,0 -24174,4,2,0,1,3,4,6,4,3,1,17,11,3,13,0 -24175,9,6,0,1,1,2,6,3,3,0,1,13,0,8,0 -24176,7,5,0,5,5,2,11,2,0,1,9,5,4,9,1 -24177,1,1,0,2,6,4,6,3,1,0,17,12,3,23,0 -24178,8,8,0,15,4,1,4,3,4,1,9,11,5,34,1 -24179,9,8,0,1,1,2,10,4,3,1,3,9,0,40,0 -24180,10,4,0,6,5,0,1,3,1,0,13,6,4,16,0 -24181,6,4,0,8,4,1,2,0,1,1,9,13,2,35,1 -24182,8,8,0,3,2,2,3,0,1,1,11,10,2,29,1 -24183,2,1,0,4,0,4,6,3,1,0,13,6,0,14,0 -24184,9,1,0,13,6,3,2,5,1,1,8,2,4,31,0 -24185,9,2,0,12,3,4,13,4,0,1,2,0,5,33,0 -24186,9,4,0,2,5,2,10,0,1,1,18,5,2,20,1 -24187,2,8,0,6,1,5,9,3,1,1,2,18,4,25,0 -24188,2,7,0,7,5,1,8,2,4,0,13,11,0,10,0 -24189,6,7,0,13,0,5,0,1,3,1,4,18,4,3,0 -24190,2,5,0,0,0,0,13,1,3,0,13,4,1,40,0 -24191,4,7,0,7,1,4,9,4,4,1,8,13,4,37,0 -24192,0,5,0,7,1,0,10,4,0,1,17,20,3,23,0 -24193,10,4,0,5,0,5,11,3,2,0,4,0,1,2,0 -24194,1,6,0,9,0,6,0,3,4,0,6,2,5,39,0 -24195,3,4,0,3,1,0,3,1,4,0,2,18,1,33,0 -24196,4,1,0,9,3,0,10,3,4,1,9,7,4,16,1 -24197,9,7,0,12,0,0,2,1,0,1,8,0,5,37,0 -24198,0,5,0,15,0,5,3,3,3,0,4,3,1,9,0 -24199,2,2,0,3,4,1,5,5,0,1,12,10,3,11,1 -24200,3,8,0,13,4,4,7,0,0,1,6,15,1,7,0 -24201,2,3,0,8,0,1,6,2,4,1,2,4,2,40,0 -24202,1,7,0,8,5,5,4,3,1,0,8,15,0,23,0 -24203,10,2,0,3,4,2,5,1,3,1,13,14,0,34,0 -24204,7,1,0,7,2,2,14,0,1,1,5,20,3,1,1 -24205,3,8,0,3,1,6,6,3,3,1,0,6,4,4,0 
-24206,1,6,0,0,4,4,11,1,4,1,4,15,2,25,0 -24207,1,0,0,2,5,6,7,2,0,1,2,2,4,24,0 -24208,3,1,0,2,0,1,0,4,3,0,13,11,3,19,0 -24209,9,7,0,4,2,2,2,0,3,1,18,9,3,35,1 -24210,2,0,0,3,6,5,10,3,0,0,6,12,4,5,0 -24211,2,5,0,13,1,0,1,3,3,1,2,8,5,4,0 -24212,6,6,0,10,6,1,4,1,2,0,0,1,1,41,1 -24213,0,1,0,8,3,0,8,4,2,0,4,0,3,26,0 -24214,10,6,0,9,4,5,3,1,0,0,18,7,2,28,1 -24215,4,5,0,6,4,4,7,1,2,1,8,0,1,4,0 -24216,0,6,0,8,6,6,9,3,1,0,2,13,0,5,0 -24217,9,8,0,11,6,5,13,3,3,1,9,2,0,4,0 -24218,0,8,0,13,6,3,3,5,2,1,6,2,0,27,0 -24219,0,6,0,1,6,5,10,2,1,1,8,11,1,14,0 -24220,1,0,0,7,0,4,4,2,1,1,12,11,2,0,0 -24221,1,1,0,8,6,3,8,4,0,0,10,2,0,37,0 -24222,5,0,0,10,3,1,13,2,1,1,18,5,3,7,1 -24223,7,8,0,5,6,0,5,5,1,0,2,11,4,37,0 -24224,9,3,0,14,6,3,7,5,4,0,7,18,1,37,1 -24225,4,8,0,2,5,2,0,3,0,1,13,20,4,26,0 -24226,7,1,0,12,1,5,2,4,0,0,18,18,2,8,0 -24227,0,7,0,4,5,5,9,3,1,1,7,11,5,41,0 -24228,10,0,0,2,3,3,6,4,2,0,2,18,5,31,0 -24229,3,1,0,8,5,1,3,0,2,0,8,15,0,10,0 -24230,4,7,0,2,6,2,5,3,0,0,2,11,0,37,0 -24231,10,4,0,14,4,2,4,5,4,1,15,16,1,32,1 -24232,2,2,0,11,3,1,3,1,2,1,13,11,3,21,0 -24233,5,8,0,12,3,6,3,3,4,1,7,7,0,34,1 -24234,3,7,0,1,4,2,0,5,3,0,18,7,5,7,1 -24235,4,8,0,2,2,6,14,3,3,0,3,9,1,18,0 -24236,7,7,0,6,6,3,12,2,1,0,11,16,5,17,1 -24237,1,3,0,9,5,5,14,3,1,1,8,3,0,7,0 -24238,8,0,0,8,5,3,5,5,0,0,1,14,4,3,1 -24239,1,8,0,6,2,4,13,3,2,1,4,13,5,10,0 -24240,7,6,0,1,2,0,3,5,0,0,18,19,1,23,1 -24241,3,4,0,8,0,0,5,0,1,1,8,13,0,32,0 -24242,0,0,0,0,0,0,14,3,0,0,11,2,0,38,0 -24243,2,0,0,0,6,2,13,3,1,0,13,11,5,11,0 -24244,6,5,0,13,3,4,1,0,1,1,6,6,4,10,0 -24245,3,0,0,1,5,0,14,4,0,1,2,9,0,20,0 -24246,7,2,0,7,5,2,0,2,0,1,6,4,1,38,0 -24247,10,5,0,10,3,2,0,4,4,1,14,10,3,24,1 -24248,4,5,0,9,6,5,2,0,1,1,1,7,0,37,1 -24249,10,7,0,14,4,6,3,4,0,0,15,11,1,17,0 -24250,8,3,0,6,2,4,0,0,2,0,17,20,1,37,0 -24251,0,3,0,10,6,2,5,4,2,1,0,17,4,33,1 -24252,0,5,0,15,5,0,8,1,0,1,10,13,5,29,0 -24253,0,3,0,11,6,2,7,5,1,1,6,1,5,41,0 -24254,9,0,0,8,0,0,5,4,0,0,17,1,1,1,0 -24255,2,4,0,11,1,5,4,2,0,1,2,2,0,31,0 
-24256,9,8,0,6,2,0,5,4,3,0,2,2,1,3,0 -24257,0,5,0,3,2,3,5,4,4,1,2,15,4,26,0 -24258,5,3,0,3,1,3,8,4,1,0,6,13,4,11,0 -24259,5,4,0,3,6,6,5,3,0,0,4,16,0,29,0 -24260,9,8,0,1,2,0,1,1,2,1,8,20,0,23,0 -24261,2,3,0,15,1,0,8,1,2,1,17,18,1,27,0 -24262,1,2,0,3,5,1,0,4,3,1,2,13,2,22,0 -24263,8,2,0,9,2,0,1,5,0,1,18,7,4,22,1 -24264,10,7,0,2,5,0,1,5,2,1,9,19,2,14,1 -24265,8,1,0,12,6,3,3,1,0,0,13,6,3,18,0 -24266,9,6,0,9,2,0,7,2,4,1,0,18,5,17,0 -24267,3,7,0,3,1,2,7,2,2,0,5,8,0,29,0 -24268,0,0,0,9,2,5,7,1,4,1,0,18,4,28,0 -24269,0,3,0,0,2,1,3,3,0,0,2,2,4,26,0 -24270,4,5,0,13,5,3,8,0,2,0,0,11,1,33,0 -24271,2,1,0,4,0,3,0,2,3,1,13,8,1,25,0 -24272,0,4,0,5,4,3,6,2,0,0,16,18,5,13,0 -24273,4,3,0,15,0,2,14,5,2,0,4,0,1,26,0 -24274,1,3,0,8,1,3,3,2,2,1,0,9,1,26,0 -24275,2,3,0,1,0,6,5,1,0,0,2,0,3,4,0 -24276,10,0,0,0,3,0,6,2,1,1,14,2,2,35,0 -24277,3,5,0,12,2,1,6,4,0,1,3,19,5,5,1 -24278,10,4,0,10,2,5,10,3,1,1,18,19,2,18,1 -24279,8,8,0,4,2,5,14,5,4,1,8,17,4,30,1 -24280,9,1,0,15,2,0,14,4,3,1,13,9,3,28,0 -24281,8,6,0,3,1,5,2,1,0,1,2,13,0,13,0 -24282,2,4,0,8,6,5,6,1,2,0,9,3,3,24,1 -24283,1,4,0,12,6,4,2,3,0,0,17,13,1,37,0 -24284,10,4,0,3,6,5,6,2,4,0,1,2,1,14,0 -24285,0,6,0,1,5,2,0,1,0,0,13,15,4,6,0 -24286,10,2,0,0,0,1,1,0,0,0,13,0,1,7,0 -24287,5,1,0,12,2,3,9,1,0,0,5,5,5,4,0 -24288,10,2,0,4,4,1,5,4,3,1,11,10,1,40,1 -24289,1,3,0,5,0,2,2,2,1,1,2,13,0,6,0 -24290,0,3,0,5,2,5,11,1,4,0,8,13,4,37,0 -24291,9,6,0,6,4,2,10,3,4,1,7,10,4,36,1 -24292,8,7,0,0,0,3,7,1,2,0,13,17,2,22,0 -24293,7,6,0,12,1,3,5,0,4,0,9,17,2,20,1 -24294,2,3,0,10,6,6,4,0,3,0,3,10,4,7,1 -24295,0,6,0,1,0,3,4,1,0,0,8,13,2,30,0 -24296,3,4,0,6,4,0,0,1,0,1,2,4,0,5,0 -24297,4,5,0,7,1,6,6,1,3,0,7,15,3,31,0 -24298,6,1,0,5,4,1,7,5,3,0,7,4,3,4,0 -24299,9,0,0,10,2,5,13,5,2,1,14,20,4,0,1 -24300,0,8,0,0,5,6,2,5,0,1,14,15,0,4,0 -24301,4,6,0,11,2,6,1,2,4,0,0,15,3,36,0 -24302,0,3,0,1,1,6,11,5,0,1,8,11,0,26,0 -24303,3,8,0,6,6,5,3,1,1,1,13,11,1,18,0 -24304,3,2,0,12,4,6,2,0,2,0,8,9,2,2,0 -24305,0,4,0,15,1,5,8,4,2,0,8,11,0,33,0 
-24306,1,3,0,12,6,0,2,3,3,0,17,11,3,18,0 -24307,2,8,0,3,4,6,2,3,2,0,2,18,1,21,0 -24308,9,0,0,10,2,6,9,4,0,0,8,2,5,8,0 -24309,3,3,0,14,5,5,11,5,1,0,10,7,5,19,1 -24310,1,7,0,1,2,5,11,1,0,1,13,2,1,11,0 -24311,5,1,0,10,6,4,0,1,1,0,14,15,2,4,0 -24312,7,7,0,10,3,6,12,1,4,1,2,2,0,3,0 -24313,0,2,0,8,1,5,10,3,2,0,10,6,3,25,0 -24314,0,7,0,6,4,1,3,3,0,1,3,10,2,31,1 -24315,4,1,0,14,0,0,6,1,2,0,1,18,0,21,0 -24316,0,7,0,1,0,0,3,3,1,0,7,11,0,37,0 -24317,10,3,0,0,4,1,0,0,4,1,9,10,1,5,1 -24318,0,8,0,14,2,4,6,3,1,0,2,2,5,35,0 -24319,9,4,0,12,6,2,2,3,0,0,18,5,1,30,1 -24320,3,7,0,4,4,3,8,2,3,1,2,11,0,31,0 -24321,5,3,0,6,6,2,13,0,3,1,2,8,2,22,0 -24322,1,3,0,15,1,5,0,3,1,0,16,18,1,8,0 -24323,6,1,0,3,0,4,9,3,3,0,10,4,0,33,0 -24324,7,2,0,7,3,5,7,3,3,1,3,17,5,37,1 -24325,0,4,0,1,5,4,1,3,1,1,9,12,2,11,0 -24326,7,8,0,6,0,4,4,3,0,0,0,2,0,4,0 -24327,2,7,0,10,5,0,11,4,1,0,13,16,2,14,0 -24328,0,8,0,0,2,6,14,1,1,0,18,14,3,25,1 -24329,3,7,0,4,1,5,7,4,1,1,6,11,2,4,0 -24330,9,6,0,3,1,5,4,5,2,1,4,0,2,4,1 -24331,1,8,0,10,3,6,6,1,0,1,8,11,0,10,0 -24332,6,6,0,9,1,0,1,0,2,1,10,0,4,14,0 -24333,0,1,0,7,4,1,10,5,2,0,13,18,1,40,0 -24334,10,5,0,5,4,1,4,1,2,0,9,1,5,35,1 -24335,1,3,0,1,5,2,4,1,3,1,8,18,0,26,0 -24336,0,2,0,0,5,6,2,5,2,1,5,6,0,26,0 -24337,2,6,0,13,0,3,7,0,3,0,13,9,3,39,0 -24338,4,2,0,1,3,2,13,1,2,0,2,9,0,5,0 -24339,1,7,0,12,5,5,1,5,3,1,2,11,5,14,0 -24340,6,4,0,6,0,4,8,0,1,0,8,11,2,18,0 -24341,1,0,0,3,6,5,14,0,1,0,6,11,2,4,0 -24342,9,6,0,8,0,1,9,3,3,1,2,6,3,7,0 -24343,8,5,0,15,0,0,4,5,1,0,8,5,2,17,1 -24344,0,1,0,13,5,0,4,5,0,0,2,1,4,0,0 -24345,5,8,0,9,5,3,2,0,2,1,12,11,4,27,0 -24346,2,6,0,13,1,1,7,4,0,1,7,13,1,20,0 -24347,1,1,0,7,4,2,10,3,1,0,16,7,1,27,1 -24348,10,2,0,14,4,0,13,2,2,0,13,2,0,0,0 -24349,7,3,0,2,1,3,9,2,1,1,2,12,3,31,0 -24350,8,5,0,8,5,6,4,5,2,1,18,15,3,31,1 -24351,10,2,0,12,2,4,2,0,2,1,9,8,1,38,1 -24352,8,4,0,2,2,1,14,0,1,1,4,6,2,24,0 -24353,1,6,0,1,4,3,10,4,3,1,8,4,1,1,0 -24354,9,2,0,14,3,2,11,1,0,1,5,12,4,16,1 -24355,1,0,0,5,0,5,14,0,3,1,0,18,4,36,0 
-24356,4,4,0,13,2,0,0,5,3,1,2,2,5,29,0 -24357,1,6,0,0,5,6,8,4,0,0,3,11,1,0,0 -24358,0,5,0,15,4,4,4,2,1,0,15,16,2,24,0 -24359,6,5,0,14,6,1,8,3,1,0,15,4,0,21,0 -24360,2,7,0,10,6,3,5,1,2,0,7,17,5,17,0 -24361,6,1,0,3,5,6,8,2,1,1,18,10,2,0,1 -24362,5,5,0,12,2,3,11,0,1,0,18,19,2,9,1 -24363,1,5,0,4,6,4,0,3,4,0,2,2,5,41,0 -24364,8,1,0,7,5,1,4,1,3,1,18,8,0,23,1 -24365,7,6,0,3,3,0,8,4,1,0,13,20,1,12,0 -24366,5,2,0,4,4,4,14,1,1,0,9,8,3,13,1 -24367,2,2,0,6,5,2,8,1,3,0,15,2,0,7,0 -24368,2,7,0,15,0,2,4,3,2,0,8,16,1,40,0 -24369,1,0,0,6,1,5,5,1,2,0,2,0,2,4,0 -24370,0,3,0,5,2,0,0,4,0,1,15,1,3,35,0 -24371,8,4,0,12,4,5,12,3,0,0,16,8,1,31,1 -24372,5,0,0,9,2,0,7,2,1,1,13,19,0,10,0 -24373,10,6,0,15,0,1,10,1,0,0,17,2,3,6,0 -24374,6,6,0,0,0,0,3,0,2,0,17,20,0,7,0 -24375,4,3,0,10,0,4,9,2,3,1,8,2,2,0,0 -24376,4,3,0,2,0,3,9,0,4,1,16,4,1,23,0 -24377,8,6,0,7,6,2,3,4,3,0,13,0,1,17,0 -24378,4,8,0,4,0,6,5,0,2,1,18,7,3,39,1 -24379,6,2,0,4,1,1,0,5,4,1,12,16,4,17,1 -24380,4,4,0,5,1,3,8,5,2,0,9,17,0,35,1 -24381,5,2,0,1,2,4,6,2,2,1,4,2,5,16,0 -24382,0,2,0,2,3,0,1,4,3,0,17,2,0,41,0 -24383,2,2,0,1,5,3,3,0,3,0,4,1,1,7,0 -24384,0,7,0,4,3,2,9,2,3,0,17,12,5,22,0 -24385,10,1,0,8,1,5,7,3,4,0,17,18,2,35,0 -24386,4,4,0,8,6,6,9,2,2,0,13,6,0,10,0 -24387,9,8,0,6,6,5,0,0,2,0,0,9,3,1,1 -24388,3,6,0,1,1,0,0,0,2,1,11,18,4,17,0 -24389,7,0,0,1,2,0,6,1,1,1,3,19,0,1,1 -24390,7,3,0,12,6,6,0,3,2,0,13,3,1,33,0 -24391,7,0,0,11,0,5,6,3,4,1,8,9,3,31,0 -24392,4,3,0,7,2,1,4,1,1,0,16,13,5,0,0 -24393,0,7,0,13,3,0,4,0,3,1,17,14,4,3,0 -24394,5,4,0,14,3,5,5,1,2,1,2,18,5,16,0 -24395,3,8,0,10,1,6,0,3,3,1,2,3,2,23,0 -24396,6,0,0,5,4,4,7,4,3,1,2,18,0,1,0 -24397,0,1,0,14,1,3,6,5,4,0,6,13,1,30,0 -24398,0,6,0,12,6,3,0,1,4,1,9,3,0,19,1 -24399,3,2,0,1,3,2,3,0,4,1,5,20,4,6,0 -24400,7,2,0,15,2,4,3,5,0,1,18,5,0,1,1 -24401,4,4,0,2,3,0,9,1,1,1,0,11,2,10,0 -24402,1,4,0,5,1,4,6,4,1,0,17,16,3,33,0 -24403,0,5,0,6,2,6,5,4,4,0,2,0,3,30,0 -24404,7,6,0,12,6,0,6,1,1,1,1,19,1,34,0 -24405,0,6,0,3,1,3,14,4,3,0,2,18,3,2,0 
-24406,8,0,0,11,3,5,13,4,4,1,18,10,0,34,1 -24407,1,8,0,5,5,6,9,1,0,0,12,13,0,13,0 -24408,9,2,0,12,5,6,12,3,3,0,4,20,1,14,0 -24409,4,4,0,14,4,5,10,4,4,1,11,7,5,21,1 -24410,1,6,0,5,2,2,1,5,0,0,0,19,2,9,0 -24411,9,6,0,8,4,4,6,5,3,1,2,2,0,11,0 -24412,8,2,0,8,2,1,3,1,3,0,0,13,1,16,0 -24413,6,7,0,4,4,2,4,5,2,1,11,10,4,24,1 -24414,7,2,0,4,6,6,9,5,3,1,18,19,3,17,1 -24415,5,4,0,1,6,3,0,4,1,1,12,9,2,0,0 -24416,6,3,0,0,6,5,2,4,2,1,2,15,0,34,0 -24417,8,4,0,11,2,2,9,1,1,0,9,12,5,37,1 -24418,9,6,0,12,2,0,9,3,4,0,6,11,3,12,0 -24419,10,3,0,6,5,3,2,3,4,1,4,2,1,6,0 -24420,0,4,0,10,1,0,3,0,4,1,4,6,5,4,0 diff --git a/katabatic/test2.py b/katabatic/test2.py deleted file mode 100644 index abf3952..0000000 --- a/katabatic/test2.py +++ /dev/null @@ -1,57 +0,0 @@ - # Get demo data from GANBLR package -from pandas import read_csv -from katabatic import Katabatic -ganblr_demo_data = read_csv('https://raw.githubusercontent.com/chriszhangpodo/discretizedata/main/adult-dm.csv',dtype=int) -# print(ganblr_demo_data) - -X_train, X_test, y_train, y_test = Katabatic.preprocessing(ganblr_demo_data) - -print(X_test) - -# import pandas as pd -# from sklearn.linear_model import LogisticRegression -# from sklearn.neural_network import MLPClassifier -# from sklearn.ensemble import RandomForestClassifier -# from sklearn.preprocessing import OneHotEncoder -# from sklearn.preprocessing import OrdinalEncoder -# from sklearn.preprocessing import LabelEncoder -# from sklearn.pipeline import Pipeline -# from sklearn.metrics import accuracy_score - -# # Real Data -# real_data = pd.read_csv('cities_demo.csv') - -# # Synthetic Data -# synthetic_data = pd.read_csv('ganblr_output.csv') -# print("Columns Before : ", synthetic_data.columns) -# synthetic_data.pop('Unnamed: 0') -# synthetic_data.columns = ["Temperature", "Longitude", "Latitude","Category"] -# print("Columns After: ", synthetic_data.columns) - - -# X_real, y_real = real_data[["Temperature","Longitude"]], real_data["Category"] -# X_synthetic, y_synthetic = 
synthetic_data[["Temperature","Longitude"]], synthetic_data["Category"] #TODO: split x and y - -# # Prototype Evaluation Method -# def evaluate(X_real, y_real, X_synthetic, y_synthetic): -# # TODO: error handling in the case where feature names are missing/do not match -# # Encode the Real Data -# # ordinal_enc = OrdinalEncoder() -# # label_enc = LabelEncoder() -# # X_real, y_real = ordinal_enc.transform(X_real), label_enc.transform(y_real) - -# # categories = ["Category"] #['Continental','Subtropical','Tropical'] -# ohe = OneHotEncoder(handle_unknown='ignore') -# logreg = LogisticRegression() -# eval_pipeline = Pipeline([('encoder', ohe), ('model', logreg)]) -# eval_pipeline.fit(X_synthetic, y_synthetic) -# y_pred = eval_pipeline.predict(X_real) - -# return accuracy_score(y_real, y_pred) - -# result = evaluate(X_real, y_real, X_synthetic, y_synthetic) -# print("accuracy score: ", result) - - - - diff --git a/katabatic/training_datasets/Iris.csv b/katabatic/training_datasets/Iris.csv deleted file mode 100644 index 1bf42f2..0000000 --- a/katabatic/training_datasets/Iris.csv +++ /dev/null @@ -1,151 +0,0 @@ -Id,SepalLengthCm,SepalWidthCm,PetalLengthCm,PetalWidthCm,Species -1,5.1,3.5,1.4,0.2,Iris-setosa -2,4.9,3.0,1.4,0.2,Iris-setosa -3,4.7,3.2,1.3,0.2,Iris-setosa -4,4.6,3.1,1.5,0.2,Iris-setosa -5,5.0,3.6,1.4,0.2,Iris-setosa -6,5.4,3.9,1.7,0.4,Iris-setosa -7,4.6,3.4,1.4,0.3,Iris-setosa -8,5.0,3.4,1.5,0.2,Iris-setosa -9,4.4,2.9,1.4,0.2,Iris-setosa -10,4.9,3.1,1.5,0.1,Iris-setosa -11,5.4,3.7,1.5,0.2,Iris-setosa -12,4.8,3.4,1.6,0.2,Iris-setosa -13,4.8,3.0,1.4,0.1,Iris-setosa -14,4.3,3.0,1.1,0.1,Iris-setosa -15,5.8,4.0,1.2,0.2,Iris-setosa -16,5.7,4.4,1.5,0.4,Iris-setosa -17,5.4,3.9,1.3,0.4,Iris-setosa -18,5.1,3.5,1.4,0.3,Iris-setosa -19,5.7,3.8,1.7,0.3,Iris-setosa -20,5.1,3.8,1.5,0.3,Iris-setosa -21,5.4,3.4,1.7,0.2,Iris-setosa -22,5.1,3.7,1.5,0.4,Iris-setosa -23,4.6,3.6,1.0,0.2,Iris-setosa -24,5.1,3.3,1.7,0.5,Iris-setosa -25,4.8,3.4,1.9,0.2,Iris-setosa 
-26,5.0,3.0,1.6,0.2,Iris-setosa -27,5.0,3.4,1.6,0.4,Iris-setosa -28,5.2,3.5,1.5,0.2,Iris-setosa -29,5.2,3.4,1.4,0.2,Iris-setosa -30,4.7,3.2,1.6,0.2,Iris-setosa -31,4.8,3.1,1.6,0.2,Iris-setosa -32,5.4,3.4,1.5,0.4,Iris-setosa -33,5.2,4.1,1.5,0.1,Iris-setosa -34,5.5,4.2,1.4,0.2,Iris-setosa -35,4.9,3.1,1.5,0.1,Iris-setosa -36,5.0,3.2,1.2,0.2,Iris-setosa -37,5.5,3.5,1.3,0.2,Iris-setosa -38,4.9,3.1,1.5,0.1,Iris-setosa -39,4.4,3.0,1.3,0.2,Iris-setosa -40,5.1,3.4,1.5,0.2,Iris-setosa -41,5.0,3.5,1.3,0.3,Iris-setosa -42,4.5,2.3,1.3,0.3,Iris-setosa -43,4.4,3.2,1.3,0.2,Iris-setosa -44,5.0,3.5,1.6,0.6,Iris-setosa -45,5.1,3.8,1.9,0.4,Iris-setosa -46,4.8,3.0,1.4,0.3,Iris-setosa -47,5.1,3.8,1.6,0.2,Iris-setosa -48,4.6,3.2,1.4,0.2,Iris-setosa -49,5.3,3.7,1.5,0.2,Iris-setosa -50,5.0,3.3,1.4,0.2,Iris-setosa -51,7.0,3.2,4.7,1.4,Iris-versicolor -52,6.4,3.2,4.5,1.5,Iris-versicolor -53,6.9,3.1,4.9,1.5,Iris-versicolor -54,5.5,2.3,4.0,1.3,Iris-versicolor -55,6.5,2.8,4.6,1.5,Iris-versicolor -56,5.7,2.8,4.5,1.3,Iris-versicolor -57,6.3,3.3,4.7,1.6,Iris-versicolor -58,4.9,2.4,3.3,1.0,Iris-versicolor -59,6.6,2.9,4.6,1.3,Iris-versicolor -60,5.2,2.7,3.9,1.4,Iris-versicolor -61,5.0,2.0,3.5,1.0,Iris-versicolor -62,5.9,3.0,4.2,1.5,Iris-versicolor -63,6.0,2.2,4.0,1.0,Iris-versicolor -64,6.1,2.9,4.7,1.4,Iris-versicolor -65,5.6,2.9,3.6,1.3,Iris-versicolor -66,6.7,3.1,4.4,1.4,Iris-versicolor -67,5.6,3.0,4.5,1.5,Iris-versicolor -68,5.8,2.7,4.1,1.0,Iris-versicolor -69,6.2,2.2,4.5,1.5,Iris-versicolor -70,5.6,2.5,3.9,1.1,Iris-versicolor -71,5.9,3.2,4.8,1.8,Iris-versicolor -72,6.1,2.8,4.0,1.3,Iris-versicolor -73,6.3,2.5,4.9,1.5,Iris-versicolor -74,6.1,2.8,4.7,1.2,Iris-versicolor -75,6.4,2.9,4.3,1.3,Iris-versicolor -76,6.6,3.0,4.4,1.4,Iris-versicolor -77,6.8,2.8,4.8,1.4,Iris-versicolor -78,6.7,3.0,5.0,1.7,Iris-versicolor -79,6.0,2.9,4.5,1.5,Iris-versicolor -80,5.7,2.6,3.5,1.0,Iris-versicolor -81,5.5,2.4,3.8,1.1,Iris-versicolor -82,5.5,2.4,3.7,1.0,Iris-versicolor -83,5.8,2.7,3.9,1.2,Iris-versicolor 
-84,6.0,2.7,5.1,1.6,Iris-versicolor -85,5.4,3.0,4.5,1.5,Iris-versicolor -86,6.0,3.4,4.5,1.6,Iris-versicolor -87,6.7,3.1,4.7,1.5,Iris-versicolor -88,6.3,2.3,4.4,1.3,Iris-versicolor -89,5.6,3.0,4.1,1.3,Iris-versicolor -90,5.5,2.5,4.0,1.3,Iris-versicolor -91,5.5,2.6,4.4,1.2,Iris-versicolor -92,6.1,3.0,4.6,1.4,Iris-versicolor -93,5.8,2.6,4.0,1.2,Iris-versicolor -94,5.0,2.3,3.3,1.0,Iris-versicolor -95,5.6,2.7,4.2,1.3,Iris-versicolor -96,5.7,3.0,4.2,1.2,Iris-versicolor -97,5.7,2.9,4.2,1.3,Iris-versicolor -98,6.2,2.9,4.3,1.3,Iris-versicolor -99,5.1,2.5,3.0,1.1,Iris-versicolor -100,5.7,2.8,4.1,1.3,Iris-versicolor -101,6.3,3.3,6.0,2.5,Iris-virginica -102,5.8,2.7,5.1,1.9,Iris-virginica -103,7.1,3.0,5.9,2.1,Iris-virginica -104,6.3,2.9,5.6,1.8,Iris-virginica -105,6.5,3.0,5.8,2.2,Iris-virginica -106,7.6,3.0,6.6,2.1,Iris-virginica -107,4.9,2.5,4.5,1.7,Iris-virginica -108,7.3,2.9,6.3,1.8,Iris-virginica -109,6.7,2.5,5.8,1.8,Iris-virginica -110,7.2,3.6,6.1,2.5,Iris-virginica -111,6.5,3.2,5.1,2.0,Iris-virginica -112,6.4,2.7,5.3,1.9,Iris-virginica -113,6.8,3.0,5.5,2.1,Iris-virginica -114,5.7,2.5,5.0,2.0,Iris-virginica -115,5.8,2.8,5.1,2.4,Iris-virginica -116,6.4,3.2,5.3,2.3,Iris-virginica -117,6.5,3.0,5.5,1.8,Iris-virginica -118,7.7,3.8,6.7,2.2,Iris-virginica -119,7.7,2.6,6.9,2.3,Iris-virginica -120,6.0,2.2,5.0,1.5,Iris-virginica -121,6.9,3.2,5.7,2.3,Iris-virginica -122,5.6,2.8,4.9,2.0,Iris-virginica -123,7.7,2.8,6.7,2.0,Iris-virginica -124,6.3,2.7,4.9,1.8,Iris-virginica -125,6.7,3.3,5.7,2.1,Iris-virginica -126,7.2,3.2,6.0,1.8,Iris-virginica -127,6.2,2.8,4.8,1.8,Iris-virginica -128,6.1,3.0,4.9,1.8,Iris-virginica -129,6.4,2.8,5.6,2.1,Iris-virginica -130,7.2,3.0,5.8,1.6,Iris-virginica -131,7.4,2.8,6.1,1.9,Iris-virginica -132,7.9,3.8,6.4,2.0,Iris-virginica -133,6.4,2.8,5.6,2.2,Iris-virginica -134,6.3,2.8,5.1,1.5,Iris-virginica -135,6.1,2.6,5.6,1.4,Iris-virginica -136,7.7,3.0,6.1,2.3,Iris-virginica -137,6.3,3.4,5.6,2.4,Iris-virginica -138,6.4,3.1,5.5,1.8,Iris-virginica 
-139,6.0,3.0,4.8,1.8,Iris-virginica -140,6.9,3.1,5.4,2.1,Iris-virginica -141,6.7,3.1,5.6,2.4,Iris-virginica -142,6.9,3.1,5.1,2.3,Iris-virginica -143,5.8,2.7,5.1,1.9,Iris-virginica -144,6.8,3.2,5.9,2.3,Iris-virginica -145,6.7,3.3,5.7,2.5,Iris-virginica -146,6.7,3.0,5.2,2.3,Iris-virginica -147,6.3,2.5,5.0,1.9,Iris-virginica -148,6.5,3.0,5.2,2.0,Iris-virginica -149,6.2,3.4,5.4,2.3,Iris-virginica -150,5.9,3.0,5.1,1.8,Iris-virginica diff --git a/katabatic/training_datasets/cities_demo.csv b/katabatic/training_datasets/cities_demo.csv deleted file mode 100644 index aa0bbc2..0000000 --- a/katabatic/training_datasets/cities_demo.csv +++ /dev/null @@ -1,14 +0,0 @@ -Country,Capital City,Latitude,Longitude,Temperature,Category -Canada,Ottawa,45.4166968,-75.7000153,6.6,Continental -Ukraine,Kiev,50.43336733,30.51662797,8.4,Continental -Austria,Vienna,48.20001528,16.36663896,10.4,Continental -Croatia,Zagreb,45.80000673,15.99999467,10.7,Continental -North Korea,Pyongyang,39.0194387,125.7546907,10.8,Continental -Armenia,Yerevan,40.18115074,44.51355139,12.4,Continental -New Zealand,Wellington,-41.29998785,174.7832659,12.9,Subtropical -Syria,Damascus,33.500034,36.29999589,17,Subtropical -Kenya,Nairobi,-1.283346742,36.81665686,17.8,Subtropical -Taiwan,Taipei,25.03752,121.56368,23,Tropical -Saudi Arabia,Riyadh,24.64083315,46.77274166,26,Tropical -Indonesia,Jakarta,-6.174417705,106.8294376,26.7,Tropical -Philippines,Manila,14.60415895,120.9822172,28.4,Tropical \ No newline at end of file diff --git a/katabatic/utils/__init__.py b/katabatic/utils/__init__.py deleted file mode 100644 index e69de29..0000000 diff --git a/katabatic/utils/evaluate.py b/katabatic/utils/evaluate.py deleted file mode 100644 index 126260e..0000000 --- a/katabatic/utils/evaluate.py +++ /dev/null @@ -1,120 +0,0 @@ -import os -import pandas as pd -import json -from ..importer import * # Aiko Services module loader -import logging - -METRICS_FILE = os.path.abspath( - "katabatic/metrics/metrics.json" -) # Constant to 
retrieve metrics function table - - -# Accepts metric_name:str. Returns an instance of the selected metric. -# TODO: possibly update METRICS_FILE to a dict of dicts (to include type etc.. of each metric) -def run_metric(metric_name): - with open(METRICS_FILE, "r") as file: - metrics = json.load(file) - - if not metric_name in metrics: - raise SystemExit( - f"Metrics Function Table '{METRICS_FILE}' doesn't contain metric: {metric_name}" - ) - metric = metrics[metric_name] - - diagnostic = None # initialise an empty diagnostic variable - try: - module = load_module(metric) # load_module method from Aiko services - except FileNotFoundError: - diagnostic = "could not be found." - except Exception as exception: - diagnostic = f"could not be loaded: {exception}" - if diagnostic: - raise SystemExit(f"Metric {metric_name} {diagnostic}") - # Run Metric - # result = metric_name.evaluate() - # return result - return module - - -# evaluate_data assumes the last column to be y and all others to be X -def evaluate_data(synthetic_data, real_data, data_type, dict_of_metrics): - """ - Evaluate the quality of synthetic data against real data using specified metrics. - - This function assumes that the last column of the data is the target variable (y), - and all other columns are features (X). It then evaluates the performance of - the synthetic data using various metrics provided in `dict_of_metrics`. - - Parameters: - - synthetic_data (pd.DataFrame): The synthetic dataset to evaluate. - - real_data (pd.DataFrame): The real dataset to compare against. - - data_type (str): The type of data, either 'discrete' or 'continuous'. - - dict_of_metrics (dict): A dictionary where keys are metric names and values are - metric functions or classes that have an `evaluate` method. - - Returns: - - pd.DataFrame: A DataFrame with the metric names and their corresponding evaluation values. 
- """ - - # Input Validation - if not isinstance(synthetic_data, pd.DataFrame) or not isinstance( - real_data, pd.DataFrame - ): - raise ValueError("Both synthetic_data and real_data must be pandas DataFrames.") - - if synthetic_data.shape != real_data.shape: - logging.warning( - "Input shapes do not match: synthetic_data shape: %s, real_data shape: %s", - synthetic_data.shape, - real_data.shape, - ) - - # Reset Column Headers for both datasets - synthetic_data.columns = range(synthetic_data.shape[1]) - real_data.columns = range(real_data.shape[1]) - - # Split X and y, assume y is the last column. - X_synthetic, y_synthetic = synthetic_data.iloc[:, :-1], synthetic_data.iloc[:, -1] - X_real, y_real = real_data.iloc[:, :-1], real_data.iloc[:, -1] - - # Initialize the results DataFrame - results_df = pd.DataFrame(columns=["Metric", "Value"]) - - # Evaluate each metric - for metric_name in dict_of_metrics: - try: - # Evaluate the metric - metric_module = run_metric(metric_name) - result = metric_module.evaluate(X_synthetic, y_synthetic, X_real, y_real) - logging.info("Successfully evaluated metric: %s", metric_name) - except Exception as e: - logging.error("Error evaluating metric %s: %s", metric_name, str(e)) - result = None - - # Append the result to the results DataFrame - results_df = pd.concat( - [results_df, pd.DataFrame({"Metric": [metric_name], "Value": [result]})], - ignore_index=True, - ) - - return results_df - - -def evaluate_models(real_data, dict_of_models, dict_of_metrics): - results_df = pd.DataFrame() - for model_name, model in dict_of_models.items(): - model_results = evaluate_data( - real_data, - pd.DataFrame(model.generate(size=len(real_data))), - "continuous", - dict_of_metrics, - ) - model_results["Model"] = model_name - results_df = pd.concat([results_df, model_results], ignore_index=True) - - # results_df = pd.DataFrame() - # for i in range(len(dict_of_models)): - # model_name = dict_of_models[i] - - # run_model - return results_df diff --git 
a/katabatic/utils/preprocessing.py b/katabatic/utils/preprocessing.py deleted file mode 100644 index eedbecc..0000000 --- a/katabatic/utils/preprocessing.py +++ /dev/null @@ -1,2 +0,0 @@ -def preprosessing_method1(): - pass diff --git a/prototype/aiko_services/importer.py b/prototype/aiko_services/importer.py deleted file mode 100644 index 7289344..0000000 --- a/prototype/aiko_services/importer.py +++ /dev/null @@ -1,47 +0,0 @@ -# Usage -# ----- -# from aiko_services.utilities import * -# module_descriptor = "pathname/filename.py" # or "package.module" -# module = load_module(module_descriptor) -# module.some_class() -# module.some_function() -# -# To Do -# ----- -# - None, yet ! - -import importlib -import os -import sys - -__all__ = ["load_module", "load_modules"] - -if os.environ.get("AIKO_IMPORTER_USE_CURRENT_DIRECTORY"): - sys.path.append(os.getcwd()) - -MODULES_LOADED = {} - -def load_module(module_descriptor): - if module_descriptor in MODULES_LOADED: - module = MODULES_LOADED[module_descriptor] - else: - if module_descriptor.endswith(".py"): - # Load module from Python source pathname, e.g "directory/file.py" - module_pathname = module_descriptor - module = importlib.machinery.SourceFileLoader( - 'module', module_pathname).load_module() - else: - # Load module from "installed" modules, e.g "package.module" - module_name = module_descriptor - module = importlib.import_module(module_name) - MODULES_LOADED[module_descriptor] = module - return module - -def load_modules(module_pathnames): - modules = [] - for module_pathname in module_pathnames: - if module_pathname: - modules.append(load_module(module_pathname)) - else: - modules.append(None) - return modules diff --git a/prototype/aiko_services/process_manager.py b/prototype/aiko_services/process_manager.py deleted file mode 100644 index 2d2fb08..0000000 --- a/prototype/aiko_services/process_manager.py +++ /dev/null @@ -1,187 +0,0 @@ -#!/usr/bin/env python3 -# -# Aiko Service: Process Manager -# 
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ -# Create and destroy processes -# -# Usage -# ~~~~~ -# ./process_manager.py # run as an Aiko Service # TO BE COMPLETED -# -# ./process_manager.py --example python -# ./process_manager.py --example shell -# -# To Do -# ~~~~~ -# - Check out https://pypi.org/project/python-daemon -# -# - Use of threading versus multiprocessing -# - https://docs.python.org/2/library/multiprocessing.html -# - https://hackernoon.com/concurrent-programming-in-python-is-not-what-you-think-it-is-b6439c3f3e6a -# -# - Complete implementation as an Aiko Service -# - Keep running, even when there are no processes to manage -# - Share running processes -# - Handle process standard output and standard error - -import click -import importlib -import os -import select -import time - -from subprocess import Popen -from threading import Thread - -from aiko_services import * - -__all__ = ["ProcessManager"] - - -PROCESS_POLL_TIME = 0.2 # seconds -PROTOCOL_PROCESS_MANAGER = f"ServiceProtocol.AIKO/process_manager:0" - -# --------------------------------------------------------------------------- # - -# poll = select.poll # TODO: Is this required ? 
Not available on Windows - -class ProcessManager: - def __init__(self, process_exit_handler=None): - self.process_exit_handler = process_exit_handler - self.processes = {} - self.thread = None - - def __str__(self): - output = "" - for id, process_data in self.processes.items(): - pid = process_data["process"].pid - command = process_data["command_line"][0] - if output: output += "\n" - output += f"{id}: {pid} {command}" - return output - - def create(self, id, command, arguments=None): - command_line = [command] - file_extension = os.path.splitext(command)[-1] - if file_extension not in [".py", ".sh"]: - specification = None - try: - importlib.util.find_spec - except AttributeError: # Python < 3.4 - specification = importlib.find_loader(command) # pylint: disable=deprecated-method - else: - specification = importlib.util.find_spec(command) - if specification: - command_line = [specification.origin] - print(f'Module path resolved to: "{command_line}"') - - if arguments: - command_line.extend(arguments) - process = Popen(command_line, bufsize=0, shell=False) - self.processes[id] = { - "command_line": command_line, - "process": process, - "return_code": None - } - - if not self.thread: - self.thread = Thread(target=self.run) - self.thread.start() - - def delete(self, id, terminate=True, kill=False): - process_data = self.processes[id] - del self.processes[id] - process = process_data["process"] - if terminate: - process.terminate() - if kill: - process.kill() - if self.process_exit_handler: - self.process_exit_handler(id, process_data) - - def run(self): - while len(self.processes): - for id, process_data in list(self.processes.items()): - process = process_data["process"] - return_code = process.poll() - if return_code is not None: - process_data["return_code"] = return_code - self.delete(id, terminate=False, kill=False) - time.sleep(PROCESS_POLL_TIME) - -def process_exit_handler_default(id, process_data): - details = "" - if process_data: - command = 
process_data["command_line"][0] - return_code = process_data["return_code"] - details = f": {command} status: {return_code}" - print(f"Exit process {id}" + details) - -# --------------------------------------------------------------------------- # - -def topic_in_handler(aiko, topic, payload_in): - print(f"Message: {topic}: {payload_in}") - -# tokens = payload_in[1:-1].split() -# if len(tokens) >= 1: -# command = tokens[0] - -# if command == "task" and len(tokens) == 2: -# operation = tokens[1] -# if operation == "start": -# if aks.get_parameter("publish_parameters") == "true": -# for parameter_name in aks_info.parameters[0]: -# parameter_value = aks.get_parameter(parameter_name) -# payload_out = f"{parameter_name}: {parameter_value}" -# aks_info.mqtt_client.publish( -# aks_info.TOPIC_OUT, payload_out) - -# payload_out = payload_in -# mqtt_client.publish(aks_info.TOPIC_OUT, payload_in) - - return False - -# --------------------------------------------------------------------------- # - -def example_code(process_manager, example): - if example == "ls": - command_line = [ "ls", "-l" ] - process = Popen(command_line, bufsize=0, shell=False) - - if example == "python": - command = "./process_manager.py" - arguments = [ "--example", "ls" ] - process_manager.create("test_1", command, arguments) - - if example == "shell": - command = "/bin/sh" - arguments = [ "-c", "echo Start A; sleep 1; echo Stop A" ] - process_manager.create("A", command, arguments) - arguments = [ "-c", "echo Start B; sleep 2; echo Stop B" ] - process_manager.create("B", command, arguments) - arguments = [ "-c", "echo Start C; sleep 10; echo Stop C" ] - process_manager.create("C", command, arguments) - time.sleep(5) - process_manager.delete("C") - -# --------------------------------------------------------------------------- # - -@click.command() -@click.option("--example", type=click.STRING, help="Run example") -@click.option("--tags", "-t", type=click.STRING, help="Aiko Service tags") - -def 
main(example, tags): - process_manager = ProcessManager(process_exit_handler_default) - - if example: - example_code(process_manager, example) - else: - pass - # self.add_message_handler(self.topic_in_handler, self.topic_in) - # ServiceTags.parse_tags(tags) - # aiko.process.run(True) - -if __name__ == "__main__": - main() - -# --------------------------------------------------------------------------- # diff --git a/prototype/ganblr_adapter.py b/prototype/ganblr_adapter.py deleted file mode 100644 index 20e8254..0000000 --- a/prototype/ganblr_adapter.py +++ /dev/null @@ -1,13 +0,0 @@ -# pip install ganblr - -from katabatic_spi import KatabaticSPI - -class GanblrAdapter(KatabaticSPI): - def load_data(self, data_pathname): - print("GanblrAdapter.load_data()") - - def split_data(self, data_frame, train_test_ratio): - print("GanblrAdapter.split_data()") - - def fit_model(self, data_frame): - print("GanblrAdapter.fit_model()") diff --git a/prototype/katabatic_spi.py b/prototype/katabatic_spi.py deleted file mode 100644 index 86c2795..0000000 --- a/prototype/katabatic_spi.py +++ /dev/null @@ -1,11 +0,0 @@ -from abc import ABC - -class KatabaticSPI(ABC): - def load_data(data_pathname): - pass - - def split_data(data_frame, train_test_ratio): - pass - - def fit_model(data_frame): - pass diff --git a/prototype/mock_datagen_1.py b/prototype/mock_datagen_1.py deleted file mode 100644 index 969ba0d..0000000 --- a/prototype/mock_datagen_1.py +++ /dev/null @@ -1,11 +0,0 @@ -from katabatic_spi import KatabaticSPI - -class MockDataGen1(KatabaticSPI): - def load_data(self, data_pathname): - print("MockDataGen1.load_data()") - - def split_data(self, data_frame, train_test_ratio): - print("MockDataGen1.split_data()") - - def fit_model(self, data_frame): - print("MockDataGen1.fit_model()") diff --git a/prototype/mock_datagen_2.py b/prototype/mock_datagen_2.py deleted file mode 100644 index a5e35e0..0000000 --- a/prototype/mock_datagen_2.py +++ /dev/null @@ -1,11 +0,0 @@ -from 
katabatic_spi import KatabaticSPI - -class MockDataGen2(KatabaticSPI): - def load_data(self, data_pathname): - print("MockDataGen2.load_data()") - - def split_data(self, data_frame, train_test_ratio): - print("MockDataGen2.split_data()") - - def fit_model(self, data_frame): - print("MockDataGen2.fit_model()") diff --git a/prototype/prototype.json b/prototype/prototype.json deleted file mode 100644 index d28751d..0000000 --- a/prototype/prototype.json +++ /dev/null @@ -1,12 +0,0 @@ -{ - "#": "Katabatic Configuration File", - - "mock_datagen_1": { - "datagen_module_descriptor": "mock_datagen_1", - "datagen_class_name": "MockDataGen1" - }, - "mock_datagen_2": { - "datagen_module_descriptor": "mock_datagen_2", - "datagen_class_name": "MockDataGen2" - } -} diff --git a/prototype/prototype.py b/prototype/prototype.py deleted file mode 100644 index 7396266..0000000 --- a/prototype/prototype.py +++ /dev/null @@ -1,126 +0,0 @@ -#!/usr/bin/env python3 -# -# Declaratively dynamically load the configured module and instantiate class -# -# - Edit the CONFIGURATION_FILE and change parameters as needed -# -# - python -m venv venv_mock_datagen_1 -# - . venv_mock_datagen_1/bin/activate -# - pip install --upgrade click pip setuptools # plus DataGen dependencies -# - deactivate -# -# - python -m venv venv_mock_datagen_2 -# - . venv_mock_datagen_1/bin/activate -# - pip install --upgrade click pip setuptools # plus DataGen dependencies -# - deactivate -# -# - python -m venv venv_katabatic -# - . 
venv_katabatic/bin/activate -# - pip install --upgrade click pip setuptools # plus prototype dependencies -# - ./prototype.py evaluate mock_datagen_1 mock_datagen_2 -# -# - ./prototype.py run mock_datagen_1 # requires correct Python venv set-up -# - ./prototype.py run mock_datagen_2 # requires correct Python venv set-up -# -# Resources -# ~~~~~~~~~ -# - https://www.datacamp.com/tutorial/python-subprocess -# -# To Do -# ~~~~~ -# - Compare and graph all DataGen Model results - -import click -import json -import os -import subprocess -import sys - -from katabatic_spi import KatabaticSPI - -from aiko_services.importer import load_module -# from aiko_services.process_manager import ProcessManager - -NAME = "prototype" -CONFIGURATION_FILE = f"{NAME}.json" - -# --------------------------------------------------------------------------- # - -def run_datagen_model_process(datagen_model_name): -# command = f"./{NAME}.sh" -# arguments = [datagen_model_name] -# process_manager = ProcessManager(process_exit_handler) -# process_manager.create(0, command, arguments) - - command = [f"./{NAME}.sh", datagen_model_name] - try: - result = subprocess.run(command, check=True, shell=False, timeout=None) - except (FileNotFoundError, PermissionError) as error: - raise SystemExit(f"Couldn't run DataGen Model: {error}") - except subprocess.CalledProcessError as called_process_error: - error_code = called_process_error.returncode - raise SystemExit(f"Error code {error_code}: {' '.join(command)}") - -def run_datagen_model(datagen_model_name): - with open(CONFIGURATION_FILE, "r") as file: - configuration = json.load(file) - - if not datagen_model_name in configuration: - raise SystemExit( - f"Configuration '{CONFIGURATION_FILE}' doesn't have DataGen model: {datagen_model_name}") - configuration = configuration[datagen_model_name] - - try: - datagen_module_descriptor = configuration["datagen_module_descriptor"] - datagen_class_name = configuration["datagen_class_name"] - except KeyError as 
key_error: - raise SystemExit( - f"Configuration '{CONFIGURATION_FILE}' doesn't have: {key_error}") - - diagnostic = None - try: - datagen_module = load_module(datagen_module_descriptor) - datagen_class = getattr(datagen_module, datagen_class_name) - except FileNotFoundError: - diagnostic = "couldn't be found" - except Exception: - diagnostic = "couldn't be loaded" - if diagnostic: - raise SystemExit(f"Module {datagen_module_descriptor} {diagnostic}") - - data_gen_ml = datagen_class() - if not isinstance(data_gen_ml, KatabaticSPI): - raise SystemExit(f"{datagen_class_name} doesn't implement KatabaticSPI") - - data_gen_ml.load_data(None) - data_gen_ml.split_data(None, None) - data_gen_ml.fit_model(None) - -# --------------------------------------------------------------------------- # - -@click.group() - -def main(): - pass - -@main.command(help="Evaluate DataGen Models") -@click.argument("datagen_model_names", nargs=-1) -def evaluate(datagen_model_names): - print(f"[Katabatic evaluate {NAME} 0.2]") - for datagen_model_name in datagen_model_names: - print("------------------") - run_datagen_model_process(datagen_model_name) - -# TODO: Compare and graph all DataGen Model results - -@main.command(help="Run DataGen Model") -@click.argument("datagen_model_name") -def run(datagen_model_name): - print(f"[Katabatic run {NAME} 0.2]") - print(f" Parent process: {os.getppid()}, my process id: {os.getpid()}") - run_datagen_model(datagen_model_name) - -if __name__ == "__main__": - main() - -# --------------------------------------------------------------------------- # diff --git a/prototype/prototype.sh b/prototype/prototype.sh deleted file mode 100644 index b26e7ad..0000000 --- a/prototype/prototype.sh +++ /dev/null @@ -1,22 +0,0 @@ -#!/bin/bash - -datagen_model_name=$1 - -if [ "$datagen_model_name"x == "x" ]; then - echo "Usage: prototype.sh DATAGEN_MODEL_NAME" - exit -1 -fi - -echo "Run DataGen Model: \"$datagen_model_name\" Python virtual environment" - 
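The deleted run_datagen_model() above resolves a module descriptor and class name from the JSON configuration, imports the module, fetches the class with getattr(), and type-checks the instance against KatabaticSPI. A minimal sketch of that declarative-loading pattern using only the stdlib — "collections" and "OrderedDict" here are stand-in names for illustration, not Katabatic identifiers:

```python
import importlib

# Configuration entry shaped like prototype.json; the module and class
# names below are stdlib stand-ins, not real DataGen adapters.
configuration = {
    "datagen_module_descriptor": "collections",
    "datagen_class_name": "OrderedDict",
}

# Resolve the module by name, then the class by attribute lookup,
# mirroring the load_module()/getattr() steps in run_datagen_model().
datagen_module = importlib.import_module(
    configuration["datagen_module_descriptor"])
datagen_class = getattr(datagen_module, configuration["datagen_class_name"])

instance = datagen_class()
print(type(instance).__name__)  # prints: OrderedDict
```

Keeping module and class names in configuration, rather than hard-coded imports, is what lets the evaluate command iterate over arbitrary DataGen models without code changes.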
-venv_directory="venv_"$datagen_model_name - -if ! [ -d $venv_directory ]; then - echo "Python virtual environment does not exist: \"$venv_directory\"" - exit -1 -fi - -source $venv_directory/bin/activate -./prototype.py run $datagen_model_name - -echo "Exit DataGen Model: \"$datagen_model_name\"" diff --git a/requirements.txt b/requirements.txt deleted file mode 100644 index fb1826e..0000000 --- a/requirements.txt +++ /dev/null @@ -1,5 +0,0 @@ -pyitlib -tensorflow -pgmpy -sdv -scikit-learn==1.0 \ No newline at end of file diff --git a/running.log b/running.log deleted file mode 100644 index 059f517..0000000 --- a/running.log +++ /dev/null @@ -1,727 +0,0 @@ -INFO:sdv.metadata.single_table:Detected metadata: -INFO:sdv.metadata.single_table:{ - "columns": { - "Country": { - "sdtype": "id" - }, - "Capital City": { - "sdtype": "city", - "pii": true - }, - "Latitude": { - "sdtype": "latitude", - "pii": true - }, - "Longitude": { - "sdtype": "longitude", - "pii": true - }, - "Temperature": { - "sdtype": "numerical" - }, - "Category": { - "sdtype": "categorical" - } - }, - "primary_key": "Country", - "METADATA_SPEC_VERSION": "SINGLE_TABLE_V1" -} -INFO:SingleTableSynthesizer:{'EVENT': 'Instance', 'TIMESTAMP': datetime.datetime(2024, 8, 4, 6, 27, 13, 121308), 'SYNTHESIZER CLASS NAME': 'CTGANSynthesizer', 'SYNTHESIZER ID': 'CTGANSynthesizer_1.15.0_a92e7acfd9ff48078b5e5da3f1aa50ba'} -INFO:SingleTableSynthesizer:{'EVENT': 'Fit', 'TIMESTAMP': datetime.datetime(2024, 8, 4, 6, 27, 13, 121539), 'SYNTHESIZER CLASS NAME': 'CTGANSynthesizer', 'SYNTHESIZER ID': 'CTGANSynthesizer_1.15.0_a92e7acfd9ff48078b5e5da3f1aa50ba', 'TOTAL NUMBER OF TABLES': 1, 'TOTAL NUMBER OF ROWS': 13, 'TOTAL NUMBER OF COLUMNS': 6} -INFO:sdv.data_processing.data_processor:Fitting table metadata -INFO:sdv.data_processing.data_processor:Fitting formatters for table -INFO:sdv.data_processing.data_processor:Fitting constraints for table -INFO:sdv.data_processing.data_processor:Setting the configuration for the ``HyperTransformer`` for table -INFO:sdv.data_processing.data_processor:Fitting HyperTransformer for table -INFO:SingleTableSynthesizer:{'EVENT': 'Fit processed data', 'TIMESTAMP': datetime.datetime(2024, 8, 4, 6, 27, 13, 521329), 'SYNTHESIZER CLASS NAME': 'CTGANSynthesizer', 'SYNTHESIZER ID': 'CTGANSynthesizer_1.15.0_a92e7acfd9ff48078b5e5da3f1aa50ba', 'TOTAL NUMBER OF TABLES': 1, 'TOTAL NUMBER OF ROWS': 13, 'TOTAL NUMBER OF COLUMNS': 2} -INFO:rdt.transformers.null:Guidance: There are no missing values in column Temperature. Extra column not created. -INFO:SingleTableSynthesizer:{'EVENT': 'Sample', 'TIMESTAMP': datetime.datetime(2024, 8, 4, 6, 28, 13, 904723), 'SYNTHESIZER CLASS NAME': 'CTGANSynthesizer', 'SYNTHESIZER ID': 'CTGANSynthesizer_1.15.0_a92e7acfd9ff48078b5e5da3f1aa50ba', 'TOTAL NUMBER OF TABLES': 1, 'TOTAL NUMBER OF ROWS': 13, 'TOTAL NUMBER OF COLUMNS': 6} -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. 
-WARNING:tensorflow:5 out of the last 5 calls to .one_step_on_iterator at 0x7f9b1e4f6820> triggered tf.function retracing. Tracing is expensive and the excessive number of tracings could be due to (1) creating @tf.function repeatedly in a loop, (2) passing tensors with different shapes, (3) passing Python objects instead of tensors. For (1), please define your @tf.function outside of the loop. For (2), @tf.function has reduce_retracing=True option that can avoid unnecessary retracing. For (3), please refer to https://www.tensorflow.org/guide/function#controlling_retracing and https://www.tensorflow.org/api_docs/python/tf/function for more details. -INFO:katabatic.models.ganblr.ganblr_adapter:Initializing Ganblr Model -INFO:katabatic.models.ganblr.ganblr_adapter:Fitting Ganblr Model 
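The repeated pgmpy warning in running.log reports probability vectors that miss 1.0 by one or two units in the last place (~1e-16), a floating-point round-off artifact that pgmpy resolves by rescaling. A hypothetical sketch of that detect-and-renormalize step (not pgmpy's actual code):

```python
# A probability vector accumulated in binary floating point can miss 1.0
# by ~1e-16 (one unit in the last place), which is what the pgmpy warning
# reports before it rescales the values.

def renormalize(probabilities, tolerance=1e-9):
    """Rescale a near-normalized vector so it sums to (almost exactly) 1."""
    total = sum(probabilities)
    if abs(total - 1.0) > tolerance:
        raise ValueError(f"values sum to {total}, not close to 1")
    return [p / total for p in probabilities]

# 0.1 has no exact binary representation, so ten of them sum to
# 0.9999999999999999 rather than 1.0.
probabilities = [0.1] * 10
print(sum(probabilities) == 1.0)  # prints: False
rescaled = renormalize(probabilities)
```

The tolerance guard distinguishes harmless round-off (rescale silently) from a genuinely unnormalized distribution (raise), which is why the log entries are warnings rather than errors.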
-INFO:katabatic.models.ganblr.ganblr_adapter:Generating from Ganblr Model 
-WARNING:tensorflow:From c:\Users\Asus\AppData\Local\Programs\Python\Python39\lib\site-packages\keras\src\backend\common\global_state.py:82: The name tf.reset_default_graph is deprecated. Please use tf.compat.v1.reset_default_graph instead. 
- -WARNING:tensorflow:From c:\Users\Asus\AppData\Local\Programs\Python\Python39\lib\site-packages\keras\src\backend\common\global_state.py:82: The name tf.reset_default_graph is deprecated. Please use tf.compat.v1.reset_default_graph instead. - -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:tensorflow:From c:\Users\Asus\AppData\Local\Programs\Python\Python39\lib\site-packages\keras\src\backend\common\global_state.py:82: The name tf.reset_default_graph is deprecated. Please use tf.compat.v1.reset_default_graph instead. - -WARNING:tensorflow:From c:\Users\Asus\AppData\Local\Programs\Python\Python39\lib\site-packages\keras\src\backend\common\global_state.py:82: The name tf.reset_default_graph is deprecated. Please use tf.compat.v1.reset_default_graph instead. - -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: -2.220446049250313e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. 
-WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 2.220446049250313e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: -2.220446049250313e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: -2.220446049250313e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: -2.220446049250313e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: -2.220446049250313e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: -2.220446049250313e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: -2.220446049250313e-16. Adjusting values. 
-WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 2.220446049250313e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: -2.220446049250313e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 2.220446049250313e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 2.220446049250313e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: -2.220446049250313e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 2.220446049250313e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: -2.220446049250313e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. 
-WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: -2.220446049250313e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: -2.220446049250313e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:tensorflow:5 out of the last 386 calls to .one_step_on_iterator at 0x0000014453594D30> triggered tf.function retracing. Tracing is expensive and the excessive number of tracings could be due to (1) creating @tf.function repeatedly in a loop, (2) passing tensors with different shapes, (3) passing Python objects instead of tensors. For (1), please define your @tf.function outside of the loop. For (2), @tf.function has reduce_retracing=True option that can avoid unnecessary retracing. For (3), please refer to https://www.tensorflow.org/guide/function#controlling_retracing and https://www.tensorflow.org/api_docs/python/tf/function for more details. 
-WARNING:tensorflow:6 out of the last 387 calls to .one_step_on_iterator at 0x000001446BCF9EE0> triggered tf.function retracing. Tracing is expensive and the excessive number of tracings could be due to (1) creating @tf.function repeatedly in a loop, (2) passing tensors with different shapes, (3) passing Python objects instead of tensors. For (1), please define your @tf.function outside of the loop. For (2), @tf.function has reduce_retracing=True option that can avoid unnecessary retracing. For (3), please refer to https://www.tensorflow.org/guide/function#controlling_retracing and https://www.tensorflow.org/api_docs/python/tf/function for more details. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:tensorflow:5 out of the last 767 calls to .one_step_on_data_distributed at 0x000001446F8E31F0> triggered tf.function retracing. Tracing is expensive and the excessive number of tracings could be due to (1) creating @tf.function repeatedly in a loop, (2) passing tensors with different shapes, (3) passing Python objects instead of tensors. For (1), please define your @tf.function outside of the loop. For (2), @tf.function has reduce_retracing=True option that can avoid unnecessary retracing. For (3), please refer to https://www.tensorflow.org/guide/function#controlling_retracing and https://www.tensorflow.org/api_docs/python/tf/function for more details. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:tensorflow:6 out of the last 768 calls to .one_step_on_data_distributed at 0x000001446FAE7790> triggered tf.function retracing. 
Tracing is expensive and the excessive number of tracings could be due to (1) creating @tf.function repeatedly in a loop, (2) passing tensors with different shapes, (3) passing Python objects instead of tensors. For (1), please define your @tf.function outside of the loop. For (2), @tf.function has reduce_retracing=True option that can avoid unnecessary retracing. For (3), please refer to https://www.tensorflow.org/guide/function#controlling_retracing and https://www.tensorflow.org/api_docs/python/tf/function for more details. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: -2.220446049250313e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: -2.220446049250313e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:tensorflow:From c:\Users\Asus\AppData\Local\Programs\Python\Python39\lib\site-packages\keras\src\backend\common\global_state.py:82: The name tf.reset_default_graph is deprecated. Please use tf.compat.v1.reset_default_graph instead. - -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. 
-WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: -2.220446049250313e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 2.220446049250313e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: -2.220446049250313e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. 
-WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: -2.220446049250313e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 2.220446049250313e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: -2.220446049250313e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: -2.220446049250313e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: -2.220446049250313e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 2.220446049250313e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: -2.220446049250313e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. 
-WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 2.220446049250313e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: -2.220446049250313e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 2.220446049250313e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 2.220446049250313e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: -2.220446049250313e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: -2.220446049250313e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. 
-WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 2.220446049250313e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 2.220446049250313e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 2.220446049250313e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 2.220446049250313e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: -2.220446049250313e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 2.220446049250313e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: -2.220446049250313e-16. Adjusting values. 
-WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:tensorflow:From c:\Users\Asus\AppData\Local\Programs\Python\Python39\lib\site-packages\keras\src\backend\common\global_state.py:82: The name tf.reset_default_graph is deprecated. Please use tf.compat.v1.reset_default_graph instead. - -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 2.220446049250313e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: -2.220446049250313e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. 
Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 2.220446049250313e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: -2.220446049250313e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: -2.220446049250313e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: -2.220446049250313e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: -2.220446049250313e-16. 
Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: -2.220446049250313e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: -2.220446049250313e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: -2.220446049250313e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: -2.220446049250313e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: -2.220446049250313e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 2.220446049250313e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 2.220446049250313e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 2.220446049250313e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: -2.220446049250313e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. 
-WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. -WARNING:pgmpy:Probability values don't exactly sum to 1. Differ by: 1.1102230246251565e-16. Adjusting values. diff --git a/serve_ganblr.sh b/serve_ganblr.sh deleted file mode 100755 index 0d08261..0000000 --- a/serve_ganblr.sh +++ /dev/null @@ -1,22 +0,0 @@ -#!/bin/bash - -# Step 1: Setup environment (virtualenv or conda) -# For virtualenv (example): -# python3 -m venv ganblr_env -# source ganblr_env/bin/activate - -# For conda (example): -conda create --name ganblr_env python=3.9.19 -y -conda activate ganblr_env - -# Step 2: Install libraries in precise order -echo "Installing required libraries..." -pip install pyitlib -pip install tensorflow -pip install pgmpy -pip install sdv -pip install scikit-learn==1.0 -pip install fastapi uvicorn - -# Step 3: Serve the model API -uvicorn ganblr_api:app --reload --host 0.0.0.0 --port 8000 \ No newline at end of file diff --git a/setup.py b/setup.py deleted file mode 100644 index 870c698..0000000 --- a/setup.py +++ /dev/null @@ -1,30 +0,0 @@ -# % pip install wheel -# % pip install setuptools -# % pip install twine - -from setuptools import find_packages, setup - -setup( - name="katabatic", - packages=find_packages(include=["katabatic"]), - version="0.0.1", - description="An open source framework for tabular data generation", - author="Jaime Blackwell, Nayyar Zaidi", - install_requires=[ - "pandas", - "numpy", - "scikit-learn", - "pyitlib", - "tensorflow", - "pgmpy", - "sdv", - ], - extras_require={ - "dev": [ - # Add other development dependencies here - ], - }, - python_requires=">=3.9", - setup_requires=["pytest-runner"], - long_description=open("README.md").read(), -) diff --git a/static/css/CTGAN.css b/static/css/CTGAN.css new file mode 100644 index 0000000..cfe6a29 --- /dev/null +++ b/static/css/CTGAN.css @@ -0,0 +1,68 @@ +/* CTGAN Page Specific Styles */ +html, body { + height: 100%; + margin: 0; + padding: 
0; + +} +body { + display: flex; + flex-direction: column; + background-color: #f0f4f7; + color: #333; +} + +/* Main Content Styling */ +main { + flex: 1; + display: flex; + flex-direction: column; + justify-content: center; + align-items: center; + padding: 40px 20px; + text-align: center; +} +/* Paragraph Styling */ +p { + font-size: 1.4rem; + line-height: 1.6; + color: #555; + max-width: 900px; + margin-bottom: 30px; +} + +/* Button Styling */ +.custom-button { + display: inline-block; + padding: 12px 30px; + background-color: #5e503f; + color: #eae0d5; + text-decoration: none; + border-radius: 30px; + font-size: 1.1rem; + text-align: center; + margin-top: 30px; + transition: background-color 0.3s ease, transform 0.2s ease; +} + +.custom-button:hover { + background-color: #3c3b35; + transform: translateY(-3px); +} + +/* Footer Styling */ +footer { + background-color: #22333b; + color: #eae0d5; + padding: 20px 0; + text-align: center; + font-size: 1rem; + margin-top: auto; + box-shadow: 0 -2px 10px rgba(0, 0, 0, 0.15); +} + +footer p { + margin: 0; + font-size: 0.9rem; + color: #b0b0b0; +} diff --git a/static/css/Contact.css b/static/css/Contact.css new file mode 100644 index 0000000..219b5b6 --- /dev/null +++ b/static/css/Contact.css @@ -0,0 +1,77 @@ +/* ----------------- */ +/* Global Layout */ +/* ----------------- */ +html, body { + height: 100%; /* Ensure full height */ + margin: 0; /* Remove default margins */ + display: flex; + flex-direction: column; /* Arrange elements vertically */ +} + +body { + display: flex; + flex-direction: column; /* Arrange elements vertically */ + min-height: 100vh; /* Ensure the body takes up at least the full height of the viewport */ +} + +main { + flex: 1; /* Allow the main content to grow and take up the remaining space */ +} + +/* ----------------- */ +/* Contact Us Section */ +/* ----------------- */ +.Contact { + background: linear-gradient(135deg, #d8dee4, #7a8da1); /* Dark gradient background */ + color: #2e3435; /* 
Light text color */ + font-size: 1.3rem; + height: 100vh; + padding: 100px 20px; /* Increased padding around the section */ + text-align: center; /* Centering content */ + border-radius: 0px; /* Rounded corners for the section */ +} + +.Contact h2 { + font-size: 3rem; /* Increased header font size */ + color: #1f2120; /* Bright green for header */ + margin-top: 50px; + margin-bottom: 0px; /* Reduced space below the header */ + font-weight: 600; /* Bolder header text */ +} + +.Contact .form-container { + max-width: 600px; /* Reduced max width for better mobile visibility */ + margin: 0 auto; /* Center the form */ + display: flex; + flex-direction: column; /* Column layout for easier form input */ + gap: 20px; /* Gap between form fields */ + padding: 20px; + background-color: #8ca6b1; /* Dark background for the form */ + border-radius: 10px; /* Rounded corners for the form */ + box-shadow: 0px 4px 10px rgba(0, 0, 0, 0.397); /* Shadow for depth */ +} + +/* Mobile Responsiveness */ +@media (max-width: 768px) { + .Contact h2 { + font-size: 1.8rem; /* Slightly smaller font size for header */ + } + + .Contact .form-container { + padding: 15px; /* Reduce padding on smaller screens */ + } +} + +/* ----------------- */ +/* Footer Section */ +/* ----------------- */ +footer { + background-color: #212529; /* Dark footer background */ + color: #f8f9fa; /* Light text color */ + text-align: center; + padding: 20px; + font-size: 1rem; + width: 100%; /* Ensure footer spans the entire width */ + position: relative; + bottom: 0; +} diff --git a/static/css/Homepage.css b/static/css/Homepage.css new file mode 100644 index 0000000..3caaa9c --- /dev/null +++ b/static/css/Homepage.css @@ -0,0 +1,214 @@ + +/* ----------------- */ +/* Main Section */ +/* ----------------- */ +.Firstpage { + display: flex; + flex-direction: column; + justify-content: center; + align-items: center; + height: 100vh; + background: linear-gradient(135deg, #bcc6cc 100%); + color: #3b3b3b; + text-align: center; + 
padding: 0 20px; +} + +.Firstpage h1 { + font-size: 3.5rem; + font-weight: bold; + letter-spacing: 3px; + margin-bottom: 20px; + color: #2a374f; /* Darker shade for text */ + text-transform: uppercase; +} + +.Firstpage p { + font-size: 1.4rem; + margin-bottom: 30px; + color: #3b3b3b; + max-width: 600px; + line-height: 1.6; +} + +/* Error Message Styling */ +.error-message { + color: #d99292; /* Light red for error messages */ + font-size: 1.1rem; /* Slightly larger error message text */ + margin-top: 10px; /* Space above the error message */ + visibility: hidden; /* Hidden by default */ +} + +/* Mobile Responsiveness for Firstpage Section */ +@media (max-width: 768px) { + .Firstpage h1 { + font-size: 2.5rem; /* Adjusted header font size for smaller screens */ + } + .Firstpage p { + font-size: 1rem; /* Adjusted paragraph font size for smaller screens */ + } +} +footer { + background-color: var(--primary-color); + color: var(--text-color); + padding: 20px; + text-align: center; +} + +footer .social-icons a { + color: var(--text-color); + margin: 0 15px; + text-decoration: none; + font-size: 1.2rem; +} + +footer .social-icons a:hover { + color: var(--secondary-color); +} + +/* ----------------- */ +/* Footer Section */ +/* ----------------- */ +footer { + background-color: #1a2a33; /* Dark footer background */ + color: #eae0d5; /* Light footer text */ + text-align: center; /* Center align footer text */ + padding: 20px; /* Padding inside the footer */ + font-size: 1rem; /* Standard font size */ +} + +/* ----------------- */ +/* Upload Section */ +/* ----------------- */ +.upload-container { + display: flex; /* Flexbox for centering */ + justify-content: center; /* Center items horizontally */ + align-items: center; /* Center items vertically */ + flex-direction: column; /* Stack items vertically */ + margin-top: 30px; /* Space above the upload container */ +} + +.upload-container input[type="file"] { + display: none; /* Hide the file input */ +} + +.upload-container 
label { + background-color: #4f2a4e; /* Deep blue */ + color: #f8f9fa; + padding: 20px 50px; + font-size: 1.2rem; + cursor: pointer; + border-radius: 25px; + transition: background-color 0.3s ease, transform 0.3s ease; + box-shadow: 0 8px 12px rgba(60, 70, 102, 0.2); +} + +.upload-container label:hover { + background-color: #e1b07d; /* Amber hover effect */ + color: #2a2e4f; /* Dark text color on hover */ + transform: translateY(-5px); +} + +/* ----------------- */ +/* Model Selection Section */ +/* ----------------- */ +.selection-container { + display: flex; + flex-direction: column; + justify-content: center; + align-items: center; + height: 100vh; + background: linear-gradient(135deg, #bcc6cc 100%); /* Retaining background */ + color: #702323; /* Light color for text */ +} + +.selection-container h1 { + font-size: 2.5rem; + margin-bottom: 20px; + color: #233551; +} + +.dropdown-container { + text-align: center; /* Center align the dropdown */ + margin-bottom: 20px; /* Space below the dropdown */ +} + +.dropdown-container select { + width: 300px; + padding: 10px; + font-size: 1rem; + border: 1px solid #090f30; /* New border color */ + border-radius: 5px; + background-color: #8aa6ae; /* Deep blue background */ + color: #080808; /* Light text */ + box-shadow: 0 4px 10px rgba(0, 0, 0, 0.2); + cursor: pointer; + font-family:Arial, Helvetica, sans-serif ; +} + +.dropdown-container select:hover { + border-color: #e1b07d; /* Amber color on hover */ + +} + +/* ----------------- */ +/* Error Message for Model Selection */ +/* ----------------- */ +.error-message { + color: #800e0e; /* Red error color */ + font-size: 1.1rem; /* Slightly larger font size */ + margin-top: 10px; /* Space above the error message */ + visibility: hidden; /* Hidden by default */ +} + +/* ----------------- */ +/* Button Styling */ +/* ----------------- */ +.custom-button { + background-color: #2a2e4f; /* Deep blue background */ + color: #f8f9fa; /* Light text */ + padding: 15px 30px; + 
font-size: 1.2rem; + border: none; + border-radius: 25px; + cursor: pointer; + text-transform: uppercase; + letter-spacing: 1px; + transition: background-color 0.3s ease, transform 0.2s ease; + box-shadow: 0 4px 10px rgba(0, 0, 0, 0.2); +} + +.custom-button:hover { + background-color: #e1b07d; /* Amber color on hover */ + transform: translateY(-3px); +} + +.custom-button:active { + transform: translateY(1px); +} + +/* ----------------- */ +/* Remove Button Styling */ +/* ----------------- */ +#remove-btn { + background-color: #581f1f; /* Dark red background for remove button */ + color: white; /* White text */ + padding: 10px 20px; /* Padding inside the button */ + font-size: 1rem; /* Font size for the button text */ + border: none; /* No border */ + border-radius: 5px; /* Rounded corners */ + cursor: pointer; /* Pointer cursor on hover */ + margin-top: 10px; /* Space above the button */ + transition: background-color 0.3s ease, transform 0.3s ease; /* Smooth hover and active transitions */ + box-shadow: 0 4px 10px rgba(0, 0, 0, 0.2); /* Button shadow */ +} + +#remove-btn:hover { + background-color: #cc5555; /* Lighter red on hover */ + transform: translateY(-3px); /* Subtle lift effect */ +} + +#remove-btn:active { + transform: translateY(1px); /* Pressed down effect */ +} + diff --git a/static/css/about.css b/static/css/about.css new file mode 100644 index 0000000..e6a67d5 --- /dev/null +++ b/static/css/about.css @@ -0,0 +1,106 @@ + +header + .about-us { + margin-top: 100px; /* Adds space between the header and the info section */ +} +/* ----------------- */ +/* About Us Section */ +/* ----------------- */ +.about-us { + background: linear-gradient(135deg, #DAE2F8, #d6a4a4); /* Lighter gradient background */ + color: #0d161f; /* Dark text for readability */ + padding: 30px 20px; +} + +.about-us .container { + max-width: 1200px; + margin: 0 auto; + display: grid; + grid-template-columns: 1fr 1fr; + gap: 30px; + align-items: start; +} + +.about-us h1 { + font-size: 2rem; 
+ margin-bottom: 30px; + color: #28223f; + text-align: center; + grid-column: span 2; +} + +.about-us .info-card { + background-color: #adb5bd; /* Light gray-blue background for info cards */ + border-radius: 20px; + padding: 20px; + box-shadow: 0 8px 20px rgba(0, 0, 0, 0.3); + transition: transform 0.3s, box-shadow 0.3s; + display: flex; + flex-direction: column; + align-items: center; + text-align: center; +} + +.about-us .info-card:hover { + transform: translateY(-10px); + box-shadow: 0 1px 30px rgba(0, 0, 0, 0.63); +} + +.about-us .info-card h2 { + font-size: 1.5rem; + margin-bottom: 15px; + color: #28223f; +} + +.about-us .info-card p { + font-size: 1.2rem; + line-height: 1.6; + margin-bottom: 20px; +} + +/* Style for list items */ +.about-us .info-card ul.features { + list-style: none; /* Remove the default bullets; the ✔ markers below act as bullets */ + padding-left: 30px; + line-height: 1.6; + margin-top: 0; + text-align: left; +} + +.about-us .info-card ul.features li { + margin-bottom: 15px; + font-size: 1.2rem; + line-height: 1.6; +} + +.about-us .info-card ul.features li::before { + content: '✔'; + color: #f8b400; /* Bright yellow for the icon */ + margin-right: 10px; + font-size: 1.2rem; +} + +/* Mobile Responsiveness */ +@media (max-width: 768px) { + .about-us .container { + grid-template-columns: 1fr; + } + + .about-us h1 { + font-size: 1.5rem; + } + + .about-us .info-card { + padding: 30px; + } +} + +/* ----------------- */ +/* Footer Section */ +/* ----------------- */ +footer { + background-color: #212529; /* Dark footer background */ + color: #f8f9fa; /* Light text color */ + text-align: center; + padding: 20px; + font-size: 1rem; +} \ No newline at end of file diff --git a/static/css/glanblr.css b/static/css/glanblr.css new file mode 100644 index 0000000..aee9e98 --- /dev/null +++ b/static/css/glanblr.css @@ -0,0 +1,63 @@ +/* Glanblr Page Specific Styles */ +html, body { + height: 100%; /* Ensure full height */ + margin: 0; + padding: 0; + font-family: 'Arial', sans-serif; /* A clean font for the 
page */ +} + +body { + display: flex; + flex-direction: column; + background-color: #f0f4f7; /* Light background color for contrast */ + color: #333; /* Dark text for readability */ +} + +main { + flex: 1; /* Allow the main content to take available space */ + display: flex; + flex-direction: column; + justify-content: center; + align-items: center; + padding: 20px; +} + +p { + font-size: 1.1rem; + line-height: 1.6; + color: #555; + text-align: center; + max-width: 800px; + margin-bottom: 20px; +} + +.custom-button { + display: inline-block; + padding: 12px 25px; + background-color: #5e503f; /* Dark button color */ + color: #eae0d5; /* Light text */ + text-decoration: none; + border-radius: 5px; + font-size: 1rem; + text-align: center; + margin-top: 20px; + transition: background-color 0.3s ease; +} + +.custom-button:hover { + background-color: #3c3b35; /* Slightly darker on hover */ +} + +footer { + background-color: #22333b; + color: #eae0d5; + padding: 20px 0; + text-align: center; + font-size: 1rem; + margin-top: auto; + box-shadow: 0 -2px 5px rgba(0, 0, 0, 0.1); /* Subtle shadow for footer */ +} + +footer p { + margin: 0; +} diff --git a/static/css/meg.css b/static/css/meg.css new file mode 100644 index 0000000..0071859 --- /dev/null +++ b/static/css/meg.css @@ -0,0 +1,70 @@ +/* Meg Page Specific Styles */ +html, body { + height: 100%; + margin: 0; + padding: 0; + font-family: 'Arial', sans-serif; +} + +body { + display: flex; + flex-direction: column; + background-color: #f0f4f7; + color: #333; +} + +/* Main Content Styling */ +main { + flex: 1; + display: flex; + flex-direction: column; + justify-content: center; + align-items: center; + padding: 40px 20px; + text-align: center; +} + +/* Paragraph Styling */ +p { + font-size: 1.1rem; + line-height: 1.6; + color: #555; + max-width: 900px; + margin-bottom: 30px; +} + +/* Button Styling */ +.custom-button { + display: inline-block; + padding: 12px 30px; + background-color: #5e503f; + color: #eae0d5; + 
text-decoration: none; + border-radius: 30px; + font-size: 1.1rem; + text-align: center; + margin-top: 30px; + transition: background-color 0.3s ease, transform 0.2s ease; +} + +.custom-button:hover { + background-color: #3c3b35; + transform: translateY(-3px); +} + +/* Footer Styling */ +footer { + background-color: #22333b; + color: #eae0d5; + padding: 20px 0; + text-align: center; + font-size: 1rem; + margin-top: auto; + box-shadow: 0 -2px 10px rgba(0, 0, 0, 0.15); +} + +footer p { + margin: 0; + font-size: 0.9rem; + color: #b0b0b0; +} diff --git a/static/css/services.css b/static/css/services.css new file mode 100644 index 0000000..c4c8847 --- /dev/null +++ b/static/css/services.css @@ -0,0 +1,80 @@ +/* ----------------- */ +/* Services Section */ +/* ----------------- */ +.services { + background: linear-gradient(135deg, #7cabbc, #e7d6d6); + padding: 50px 50px; + margin-top: 80px; +} + +.services .container { + max-width: 1200px; + margin: 0 auto; + text-align: center; +} + +/* Header styling for the services section */ +.services h2 { + font-size: 2.5rem; /* Adjust size as needed */ + color: #131a1e; /* Change color if needed */ + margin: 20px 0px; /* Space below header */ +} + +.service-card h3 { /* Change 'h3' to any header tag you are using inside the card */ + font-size: 2rem; /* Adjust size as needed */ + color: #1a2a33; /* Color of the header */ + margin-bottom: 20px; /* Space below header */ +} + +.service-card { + background-color: #d4cdcd8f; + border-radius: 30px; + box-shadow: 0 4px 10px rgba(0, 0, 0, 0.1); + padding: 30px; + margin: 30px 0; + transition: transform 0.3s ease, box-shadow 0.3s ease; +} + +.service-card:hover { + transform: translateY(-10px); + box-shadow: 0 8px 20px rgba(0, 0, 0, 0.2); +} + +.service-card p { + font-size: 1.2rem; + color: #4d2c2c; + line-height: 1.9; + margin-bottom: 20px; +} + +.service-card ul { + list-style: none; + text-align: left; + padding: 0; + margin: 20px 0; +} + +.service-card ul li { + margin-bottom: 10px; 
+ font-size: 1.3rem; + color: #4d2c2c; +} + +.service-card ul li::before { + content: '✔'; + color: #1a2a33; + margin-right: 10px; +} + + + +/* ----------------- */ +/* Footer Section */ +/* ----------------- */ +footer { + background-color: #1a2a33; + color: #eae0d5; + text-align: center; + padding: 20px 0; + font-size: 1rem; +} diff --git a/static/css/stylesheet.css b/static/css/stylesheet.css new file mode 100644 index 0000000..c6b0daf --- /dev/null +++ b/static/css/stylesheet.css @@ -0,0 +1,86 @@ +* { + margin: 0; + padding: 0; + box-sizing: border-box; +} + +/* Universal Styles */ +body { + font-family: 'Segoe UI', Tahoma, Geneva, Verdana, sans-serif; /* Segoe UI with standard sans-serif fallbacks */ + background: linear-gradient(135deg, #f2f3f5 0%, #d1d8e0 40%, #bcc6cc 100%); /* New gradient background */ + color: #233742; /* Dark gray text */ + line-height: 1.6; +} + +/* ----------------- */ +/* Header Section */ +/* ----------------- */ +header { + background-color: #273544; /* Deep Charcoal Gray for a sophisticated look */ + padding: 10px 40px; + display: flex; + justify-content: space-between; + align-items: center; + color: #f3f4f6; /* Light text */ + box-shadow: 0 4px 12px rgba(60, 70, 102, 0.4); /* Shadow for depth */ + position: fixed; + top: 0; + left: 0; + width: 100%; + z-index: 1000; + border-bottom: 2px solid #a3b1c6; /* Lighter border underline */ +} + +header .logo a { + font-size: 2.5rem; + color: #f3f4f6; + text-decoration: none; + font-weight: bold; + letter-spacing: 2px; +} + +header h1 { + margin: 0; + font-size: 3.2rem; + color: #f3f4f6; + text-decoration: none; + font-weight: bold; +} + +header nav { + margin-top: 25px; + padding: 20px; +} + +header nav a { + text-decoration: none; + color: #f3f4f6; + margin: 0 20px; + font-size: 1.2rem; + text-transform: uppercase; + position: relative; + transition: color 0.3s, transform 0.3s; +} + +header nav a:hover { + color: #80abe0; /* Soft blue on hover */ + transform: translateY(-5px); +} + +header nav 
a::before { + content: ''; + position: absolute; + bottom: -5px; + left: 0; + width: 100%; + height: 2px; + background-color: #80abe0; /* Soft blue underline */ + transform: scaleX(0); + transform-origin: bottom right; + transition: transform 0.3s ease-out; +} + +header nav a:hover::before { + transform: scaleX(1); + transform-origin: bottom left; +} diff --git a/templates/Contact.html b/templates/Contact.html new file mode 100644 index 0000000..6195cb1 --- /dev/null +++ b/templates/Contact.html @@ -0,0 +1,42 @@ + + + + + + Contact - Katabatic + + + + + + +
+ + +
+ + +
+

Contact Us

+
+

Name: Katabatic

+

Number: 0123456789

+

Email: abcd@deakin.edu.au

+
+
+ + +
+ +
+ + diff --git a/templates/about.html b/templates/about.html new file mode 100644 index 0000000..3d6a0de --- /dev/null +++ b/templates/about.html @@ -0,0 +1,97 @@ + + + + + + About Us - Katabatic + + + + + + +
+ + +
+ + +
+
+
+
+

About Katabatic

+

+ Katabatic is an innovative open-source project spearheaded by Dr. Nayyar Zaidi and Mr. Jaime Blackwell, under the leadership of DataBytes, a cutting-edge firm specializing in AI and data science solutions. This project addresses a critical gap in the development and integration of tabular data generation algorithms, bringing together diverse methods in a seamless and unified platform. +

+

+ In the rapidly evolving world of artificial intelligence (AI) and machine learning (ML), high-quality data is essential for training robust models. However, the development of tabular data, which is integral to many machine learning tasks, remains a fragmented and complex process. Traditional data generation techniques often operate in silos, requiring different skill sets and workflows. To tackle this challenge, Katabatic aims to simplify and streamline the creation of synthetic tabular data by integrating existing algorithms, thus making it easier for researchers, data scientists, and AI practitioners to access, compare, and evaluate various methods in one unified environment. +

+
+
+ +
+
+

What is Katabatic?

+

+ Katabatic is a platform that consolidates several leading algorithms for synthetic tabular data generation into a cohesive ecosystem. Currently, algorithms like GANBLR, MEG, CT-GAN, TableGAN, and MedGAN each have their own frameworks and requirements, creating challenges for researchers who want to explore these tools in parallel or compare results across different models. By combining these disparate technologies, Katabatic simplifies the process of testing and evaluating synthetic data generation techniques, enabling users to generate customizable, high-quality datasets for a variety of machine learning applications. +

+

+ The platform is open-source, ensuring that it remains transparent, flexible, and accessible to the global AI community. Researchers can easily adapt and extend the platform to suit their specific needs, contributing to the growing field of synthetic data generation and sharing their advancements with the broader scientific community. +

+
+
+ +
+
+

Why Katabatic Matters?

+

+ As AI and ML continue to push boundaries in industries such as healthcare, finance, and autonomous systems, access to high-quality data is crucial for model development. However, acquiring real-world data is often challenging due to issues such as data privacy concerns, ethical dilemmas, or the sheer scarcity of relevant datasets. This is where synthetic data comes in, offering a powerful solution that mirrors real-world data characteristics without compromising privacy or security. +

+
    +
  • Simplified Workflow: Researchers can use a single environment to develop, test, and evaluate multiple algorithms, streamlining the typically fragmented process of synthetic data generation.
  • Easy Model Comparison: Compare algorithm performances effectively.
  • Customizable Data Generation: Tailor data outputs to specific project needs.
  • Open-Source Contribution: Foster innovation through collaboration.
  • High-Quality Datasets: Generate realistic datasets for AI/ML tasks.
+
+
+ +
+
+

Core Features

+
    +
  • Integrated Algorithms: Access a variety of top synthetic data generation algorithms like GANBLR, MEG, and CT-GAN.
  • Customizable Generation: Modify settings for tailored tabular data generation.
  • Open-Source and Extensible: Expandable to meet evolving project requirements.
  • Streamlined Evaluation: Quickly test, compare, and evaluate algorithms.
  • Enhanced Model Development: Train AI/ML models with high-quality synthetic data.
+
+
+ +
+
+

The Future of Synthetic Data Generation

+

+ Katabatic is shaping the future of synthetic data generation by providing a user-friendly, all-in-one platform for the development, testing, and evaluation of synthetic tabular datasets. As the platform evolves, new algorithms, features, and improvements will be added to further enhance its capabilities. By simplifying the synthetic data creation process and providing a robust environment for testing and evaluation, Katabatic is accelerating the development of AI and ML models, enabling innovative solutions across diverse industries. +

+
+
+
+
+ + +
+

© 2025 Katabatic. All Rights Reserved.

+
+ + diff --git a/templates/index.html b/templates/index.html index f8c464d..5bba6fb 100644 --- a/templates/index.html +++ b/templates/index.html @@ -1,49 +1,150 @@ - - - - - - GANBLR++ - - - -
-

GANBLR++ Model Trainer

-
- -
-
-

Upload Your Dataset

-
- - - - - - - -
- {% if error %} -

{{ error }}

- {% endif %} -
- -
-

Training Feedback

-
- -
-
-
-
-

© 2025 GANBLR++ Team

-
- - - + + + + + + Katabatic - Synthetic Data Solutions + + + + + + +
+ + +
+ + +
+
+

Innovative Synthetic Data Solutions

+

Empowering AI and ML development with high-quality, customizable synthetic data tailored to your needs.

+
+ + +
+ + + + + + + + +
+
+ + +
+

Select a Model

+ + + + + +
+ + + + + + diff --git a/templates/models/CTGAN.html b/templates/models/CTGAN.html new file mode 100644 index 0000000..df84cc8 --- /dev/null +++ b/templates/models/CTGAN.html @@ -0,0 +1,31 @@ + + + + + + CTGAN Model + + + + + +
+

CTGAN Model

+ +
+ +
+

Welcome to the CTGAN model page. This section explains everything you need to know about the CTGAN model and how it helps in synthetic tabular data generation.

+ Go Back +
+ +
+

© 2025 CTGAN Model. All Rights Reserved.

+
+ + diff --git a/templates/models/glanblr.html b/templates/models/glanblr.html new file mode 100644 index 0000000..82124c3 --- /dev/null +++ b/templates/models/glanblr.html @@ -0,0 +1,33 @@ + + + + + + Glanblr Model + + + + + +
+

Glanblr Model

+ +
+ + + +
+

Welcome to the Glanblr model page. This section explains everything you need to know about the Glanblr model and how it helps in synthetic tabular data generation.

+ Go Back +
+ +
+

© 2025 Glanblr Model. All Rights Reserved.

+
+ + diff --git a/templates/models/meg.html b/templates/models/meg.html new file mode 100644 index 0000000..a350f4a --- /dev/null +++ b/templates/models/meg.html @@ -0,0 +1,31 @@ + + + + + + Meg Model + + + + + +
+

Meg Model

+ +
+ +
+

Welcome to the Meg model page. This section explains everything you need to know about the Meg model and how it helps in synthetic tabular data generation.

+ Go Back +
+ +
+

© 2025 Meg Model. All Rights Reserved.

+
+ + diff --git a/templates/services.html b/templates/services.html new file mode 100644 index 0000000..1e33267 --- /dev/null +++ b/templates/services.html @@ -0,0 +1,61 @@ + + + + + + Services - Katabatic + + + + + + +
+ + +
+ + +
+
+

Our Services

+
+

Synthetic Data Generation

+

+ We provide top-notch synthetic data generation for AI and ML tasks using cutting-edge algorithms like GANBLR, CT-GAN, and MedGAN. Our solutions ensure high-quality tabular data tailored to your needs. +

+
    +
  • Multiple data generation algorithms
  • Customizable datasets
  • High-quality, realistic outputs
+
+
+

Data Quality Analysis

+

+ Leverage our platform for deep insights into your datasets, ensuring better quality and usability for AI training. +

+
    +
  • Comprehensive data profiling
  • Outlier detection
  • Actionable quality metrics
+
+
+
+ + + + + diff --git a/tests/testClick.py b/tests/testClick.py deleted file mode 100644 index 726138d..0000000 --- a/tests/testClick.py +++ /dev/null @@ -1,6 +0,0 @@ -import click - -@click.group() - -def run(): - return print("Run Method Running!rai") diff --git a/tests/testKatabatic.py b/tests/testKatabatic.py deleted file mode 100644 index f93138e..0000000 --- a/tests/testKatabatic.py +++ /dev/null @@ -1,20 +0,0 @@ -#from katabatic.katabatic import Katabatic #TODO: ideally simplify the import statement -from katabatic import * -from sklearn import datasets -import numpy as np - -# Test #1 involves generating some synthetic data using the Katabatic framework -# To do that we will gather demo data from scikit-learn... -iris = datasets.load_iris() -categories = iris.target_names -print("Categories: ", categories) -print("Feature Names: ", iris.feature_names) - -data = iris.data -target = iris.target - -# First instantiate a model: - -#model1 = Katabatic.run_model('ganblr') - -#model1.fit(data, target)