NPS Code Update #925
Merged
Google Cloud Build / data-pull-request-py (datcom-ci)
succeeded
Dec 22, 2023 in 6m 23s
Summary
Build Information
| Trigger | data-pull-request-py |
| Build | 01e41127-a354-488e-a105-f5825410d3d9 |
| Start | 2023-12-22T03:40:56-08:00 |
| Duration | 6m21.945s |
| Status | SUCCESS |
Steps
| Step | Status | Duration |
| --- | --- | --- |
| python_install | SUCCESS | 1m35.338s |
| python_test | SUCCESS | 4m21.587s |
| python_format_check | SUCCESS | 2m40.484s |
Details
starting build "01e41127-a354-488e-a105-f5825410d3d9"
FETCHSOURCE
hint: Using 'master' as the name for the initial branch. This default branch name
hint: is subject to change. To configure the initial branch name to use in all
hint: of your new repositories, which will suppress this warning, call:
hint:
hint: git config --global init.defaultBranch <name>
hint:
hint: Names commonly chosen instead of 'master' are 'main', 'trunk' and
hint: 'development'. The just-created branch can be renamed via this command:
hint:
hint: git branch -m <name>
Initialized empty Git repository in /workspace/.git/
From https://github.com/datacommonsorg/data
* branch 9f9a5d8e8dc82c4d029ea82ce7968f6839b4ff6f -> FETCH_HEAD
Updating files: 100% (3030/3030), done.
HEAD is now at 9f9a5d8 Readme update
BUILD
Starting Step #0 - "python_install"
Step #0 - "python_install": Pulling image: python:3.7
Step #0 - "python_install": 3.7: Pulling from library/python
Step #0 - "python_install": 167b8a53ca45: Pulling fs layer
Step #0 - "python_install": b47a222d28fa: Pulling fs layer
Step #0 - "python_install": debce5f9f3a9: Pulling fs layer
Step #0 - "python_install": 1d7ca7cd2e06: Pulling fs layer
Step #0 - "python_install": ff3119008f58: Pulling fs layer
Step #0 - "python_install": c2423a76a32b: Pulling fs layer
Step #0 - "python_install": e1c98ca4926a: Pulling fs layer
Step #0 - "python_install": 3b62c8e1d79b: Pulling fs layer
Step #0 - "python_install": 1d7ca7cd2e06: Waiting
Step #0 - "python_install": ff3119008f58: Waiting
Step #0 - "python_install": c2423a76a32b: Waiting
Step #0 - "python_install": e1c98ca4926a: Waiting
Step #0 - "python_install": 3b62c8e1d79b: Waiting
Step #0 - "python_install": b47a222d28fa: Verifying Checksum
Step #0 - "python_install": b47a222d28fa: Download complete
Step #0 - "python_install": 167b8a53ca45: Verifying Checksum
Step #0 - "python_install": 167b8a53ca45: Download complete
Step #0 - "python_install": debce5f9f3a9: Verifying Checksum
Step #0 - "python_install": debce5f9f3a9: Download complete
Step #0 - "python_install": ff3119008f58: Download complete
Step #0 - "python_install": c2423a76a32b: Verifying Checksum
Step #0 - "python_install": c2423a76a32b: Download complete
Step #0 - "python_install": e1c98ca4926a: Verifying Checksum
Step #0 - "python_install": e1c98ca4926a: Download complete
Step #0 - "python_install": 3b62c8e1d79b: Verifying Checksum
Step #0 - "python_install": 3b62c8e1d79b: Download complete
Step #0 - "python_install": 1d7ca7cd2e06: Verifying Checksum
Step #0 - "python_install": 1d7ca7cd2e06: Download complete
Step #0 - "python_install": 167b8a53ca45: Pull complete
Step #0 - "python_install": b47a222d28fa: Pull complete
Step #0 - "python_install": debce5f9f3a9: Pull complete
Step #0 - "python_install": 1d7ca7cd2e06: Pull complete
Step #0 - "python_install": ff3119008f58: Pull complete
Step #0 - "python_install": c2423a76a32b: Pull complete
Step #0 - "python_install": e1c98ca4926a: Pull complete
Step #0 - "python_install": 3b62c8e1d79b: Pull complete
Step #0 - "python_install": Digest: sha256:eedf63967cdb57d8214db38ce21f105003ed4e4d0358f02bedc057341bcf92a0
Step #0 - "python_install": Status: Downloaded newer image for python:3.7
Step #0 - "python_install": docker.io/library/python:3.7
Step #0 - "python_install": ### Installing Python requirements
Step #0 - "python_install": Installing Python requirements
Step #0 - "python_install": WARNING: Retrying (Retry(total=4, connect=None, read=None, redirect=None, status=None)) after connection broken by 'ProtocolError('Connection aborted.', ConnectionResetError(104, 'Connection reset by peer'))': /simple/google-cloud-logging/
Step #0 - "python_install": WARNING: Retrying (Retry(total=4, connect=None, read=None, redirect=None, status=None)) after connection broken by 'ProtocolError('Connection aborted.', ConnectionResetError(104, 'Connection reset by peer'))': /simple/py/
Step #0 - "python_install": DEPRECATION: ratelimit is being installed using the legacy 'setup.py install' method, because it does not have a 'pyproject.toml' and the 'wheel' package is not installed. pip 23.1 will enforce this behaviour change. A possible replacement is to enable the '--use-pep517' option. Discussion can be found at https://github.com/pypa/pip/issues/8559
Step #0 - "python_install": DEPRECATION: func-timeout is being installed using the legacy 'setup.py install' method, because it does not have a 'pyproject.toml' and the 'wheel' package is not installed. pip 23.1 will enforce this behaviour change. A possible replacement is to enable the '--use-pep517' option. Discussion can be found at https://github.com/pypa/pip/issues/8559
Step #0 - "python_install": DEPRECATION: frozendict is being installed using the legacy 'setup.py install' method, because it does not have a 'pyproject.toml' and the 'wheel' package is not installed. pip 23.1 will enforce this behaviour change. A possible replacement is to enable the '--use-pep517' option. Discussion can be found at https://github.com/pypa/pip/issues/8559
Step #0 - "python_install": DEPRECATION: easydict is being installed using the legacy 'setup.py install' method, because it does not have a 'pyproject.toml' and the 'wheel' package is not installed. pip 23.1 will enforce this behaviour change. A possible replacement is to enable the '--use-pep517' option. Discussion can be found at https://github.com/pypa/pip/issues/8559
Step #0 - "python_install": DEPRECATION: future is being installed using the legacy 'setup.py install' method, because it does not have a 'pyproject.toml' and the 'wheel' package is not installed. pip 23.1 will enforce this behaviour change. A possible replacement is to enable the '--use-pep517' option. Discussion can be found at https://github.com/pypa/pip/issues/8559
Step #0 - "python_install": DEPRECATION: rdp is being installed using the legacy 'setup.py install' method, because it does not have a 'pyproject.toml' and the 'wheel' package is not installed. pip 23.1 will enforce this behaviour change. A possible replacement is to enable the '--use-pep517' option. Discussion can be found at https://github.com/pypa/pip/issues/8559
Step #0 - "python_install":
Step #0 - "python_install": [notice] A new release of pip is available: 23.0.1 -> 23.3.2
Step #0 - "python_install": [notice] To update, run: pip install --upgrade pip
Finished Step #0 - "python_install"
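Editor's note: the two "Connection reset by peer" warnings in the python_install step are handled by pip itself, which retries each failed index request up to the configured limit. For context only, a minimal sketch of the same bounded retry-with-backoff pattern; the function name and defaults are illustrative and not part of the build:

```python
import time


def retry_with_backoff(func, attempts=5, base_delay=1.0):
    """Call func(), retrying on ConnectionError with exponential backoff."""
    for attempt in range(attempts):
        try:
            return func()
        except ConnectionError:
            if attempt == attempts - 1:
                raise
            time.sleep(base_delay * (2 ** attempt))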
Starting Step #1 - "python_test"
Starting Step #2 - "python_format_check"
Step #1 - "python_test": Already have image (with digest): python:3.7
Step #2 - "python_format_check": Already have image (with digest): python:3.7
Step #1 - "python_test": ### Running Python tests in util/
Step #2 - "python_format_check": ### Testing lint
Step #1 - "python_test": Installing Python requirements
Step #2 - "python_format_check": Installing Python requirements
Step #1 - "python_test":
Step #1 - "python_test": [notice] A new release of pip is available: 23.0.1 -> 23.3.2
Step #1 - "python_test": [notice] To update, run: pip install --upgrade pip
Step #2 - "python_format_check":
Step #2 - "python_format_check": [notice] A new release of pip is available: 23.0.1 -> 23.3.2
Step #2 - "python_format_check": [notice] To update, run: pip install --upgrade pip
Step #1 - "python_test": #### Testing Python code in util/
Step #2 - "python_format_check": #### Testing Python lint
Step #1 - "python_test": test_aggregate_dict (aggregation_util_test.AggregationUtilTest) ... ok
Step #1 - "python_test": test_aggregate_value (aggregation_util_test.AggregationUtilTest) ... ok
Step #1 - "python_test": test_config_map_with_override (config_map_test.TestConfigMap) ... ok
Step #1 - "python_test": test_load_config_file (config_map_test.TestConfigMap)
Step #1 - "python_test": Test loading of config dictionary from a file. ... ok
Step #1 - "python_test": test_set_config (config_map_test.TestConfigMap) ... ok
Step #1 - "python_test": test_update_config (config_map_test.TestConfigMap) ... ok
Step #1 - "python_test": test_add_counter (counters_test.TestCounters)
Step #1 - "python_test": Verify increment and decrement counters. ... Counters:
Step #1 - "python_test": test_inputs = 10
Step #1 - "python_test": test_process_elapsed_time = 0.00
Step #1 - "python_test": test_processed = 0
Step #1 - "python_test": test_start_time = 217.22
Step #1 - "python_test": ok
Step #1 - "python_test": test_counter_dict (counters_test.TestCounters)
Step #1 - "python_test": Verify counter dict is shared across counters. ... Counters:
Step #1 - "python_test": process_elapsed_time = 0.00
Step #1 - "python_test": processed = 0
Step #1 - "python_test": start_time = 217.22
Step #1 - "python_test": test_ctr = 1
Step #1 - "python_test": ok
Step #1 - "python_test": test_debug_counters (counters_test.TestCounters)
Step #1 - "python_test": Verify counters with debug string suffixes. ... Counters:
Step #1 - "python_test": test3_inputs = 10
Step #1 - "python_test": test3_inputs_test-case-2 = 10
Step #1 - "python_test": test3_process_elapsed_time = 0.00
Step #1 - "python_test": test3_processed = 0
Step #1 - "python_test": test3_start_time = 217.22
Step #1 - "python_test": ok
Step #1 - "python_test": test_set_counter (counters_test.TestCounters)
Step #1 - "python_test": Verify set_counter overrides current value. ... Counters:
Step #1 - "python_test": test2_lines = 1
Step #1 - "python_test": test2_lines_file1 = 1
Step #1 - "python_test": test2_process_elapsed_time = 0.00
Step #1 - "python_test": test2_processed = 0
Step #1 - "python_test": test2_start_time = 217.22
Step #1 - "python_test": Counters:
Step #1 - "python_test": test2_lines = 11
Step #1 - "python_test": test2_lines_file1 = 11
Step #1 - "python_test": test2_process_elapsed_time = 0.00
Step #1 - "python_test": test2_processed = 0
Step #1 - "python_test": test2_start_time = 217.22
Step #1 - "python_test": ok
Step #1 - "python_test": test_show_counters (counters_test.TestCounters) ... Counters:
Step #1 - "python_test": test-file-rows = 100
Step #1 - "python_test": test-process_elapsed_time = 0.00
Step #1 - "python_test": test-process_remaining_time = 1000000.00
Step #1 - "python_test": test-read-rows = 0
Step #1 - "python_test": test-start_time = 217.22
Step #1 - "python_test": Counters:
Step #1 - "python_test": test-file-rows = 100
Step #1 - "python_test": test-process_elapsed_time = 0.00
Step #1 - "python_test": test-process_remaining_time = 0.00
Step #1 - "python_test": test-processing_rate = 63584.51
Step #1 - "python_test": test-read-rows = 10
Step #1 - "python_test": test-start_time = 217.22
Step #1 - "python_test": ok
Step #1 - "python_test": test_dc_api_batched_wrapper (dc_api_wrapper_test.TestDCAPIWrapper)
Step #1 - "python_test": Test DC API wrapper for batched calls. ... ok
Step #1 - "python_test": test_dc_api_is_defined_dcid (dc_api_wrapper_test.TestDCAPIWrapper)
Step #1 - "python_test": Test API wrapper for defined DCIDs. ... ok
Step #1 - "python_test": test_dc_api_wrapper (dc_api_wrapper_test.TestDCAPIWrapper)
Step #1 - "python_test": Test the wrapper for DC API. ... ok
Step #1 - "python_test": test_dc_get_node_property_values (dc_api_wrapper_test.TestDCAPIWrapper)
Step #1 - "python_test": Test API wrapper to get all property:values for a node. ... ok
Step #1 - "python_test": test_download_file (download_util_test.TestCounters) ... ok
Step #1 - "python_test": test_prefilled_url (download_util_test.TestCounters) ... ok
Step #1 - "python_test": test_request_url (download_util_test.TestCounters) ... ok
Step #1 - "python_test": test_read_write (file_util_test.FileIOTest) ... ok
Step #1 - "python_test": test_file_get_estimate_num_rows (file_util_test.FileUtilsTest) ... ok
Step #1 - "python_test": test_file_get_matching (file_util_test.FileUtilsTest) ... ok
Step #1 - "python_test": test_file_load_csv_dict (file_util_test.FileUtilsTest) ... ok
Step #1 - "python_test": test_file_type (file_util_test.FileUtilsTest) ... ok
Step #1 - "python_test": test_file_write_load_py_dict (file_util_test.FileUtilsTest) ... ok
Step #1 - "python_test": test_aa2 (latlng2place_mapsapi_test.Latlng2PlaceMapsAPITest) ... ok
Step #1 - "python_test": test_country (latlng2place_mapsapi_test.Latlng2PlaceMapsAPITest) ... ok
Step #1 - "python_test": test_main (latlng_recon_geojson_test.LatlngReconGeojsonTest) ... ok
Step #1 - "python_test": test_basic (latlng_recon_service_test.LatlngReconServiceTest) ... /usr/local/lib/python3.7/concurrent/futures/thread.py:57: ResourceWarning: unclosed <ssl.SSLSocket fd=10, family=AddressFamily.AF_INET, type=SocketKind.SOCK_STREAM, proto=6, laddr=('192.168.10.2', 47920), raddr=('35.244.133.155', 443)>
Step #1 - "python_test": result = self.fn(*self.args, **self.kwargs)
Step #1 - "python_test": ResourceWarning: Enable tracemalloc to get the object allocation traceback
Step #1 - "python_test": ok
Step #1 - "python_test": test_filter (latlng_recon_service_test.LatlngReconServiceTest) ... /usr/local/lib/python3.7/concurrent/futures/thread.py:57: ResourceWarning: unclosed <ssl.SSLSocket fd=10, family=AddressFamily.AF_INET, type=SocketKind.SOCK_STREAM, proto=6, laddr=('192.168.10.2', 47922), raddr=('35.244.133.155', 443)>
Step #1 - "python_test": result = self.fn(*self.args, **self.kwargs)
Step #1 - "python_test": ResourceWarning: Enable tracemalloc to get the object allocation traceback
Step #1 - "python_test": ok
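Editor's note: the two ResourceWarnings above come from HTTPS connections left open by calls made inside worker threads. A hedged sketch of one way to avoid them, closing pooled sockets with a requests.Session context manager; the URL and function are placeholders, not the recon service's actual client code:

```python
import requests


def fetch_json(url, payload):
    # The session context manager closes pooled sockets on exit,
    # avoiding "unclosed <ssl.SSLSocket ...>" ResourceWarnings.
    with requests.Session() as session:
        response = session.post(url, json=payload, timeout=30)
        response.raise_for_status()
        return response.json()
```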
Step #1 - "python_test": test_dict_list_to_mcf_str (mcf_dict_util_test.TestMCFDict) ... ok
Step #1 - "python_test": test_drop_nodes (mcf_dict_util_test.TestMCFDict) ... ok
Step #1 - "python_test": test_get_dcid_node (mcf_dict_util_test.TestMCFDict) ... ok
Step #1 - "python_test": test_mcf_dict_rename_namespace (mcf_dict_util_test.TestMCFDict) ... ok
Step #1 - "python_test": test_mcf_dict_rename_prop (mcf_dict_util_test.TestMCFDict) ... ok
Step #1 - "python_test": test_mcf_dict_rename_prop_value (mcf_dict_util_test.TestMCFDict) ... ok
Step #1 - "python_test": test_mcf_to_dict_list (mcf_dict_util_test.TestMCFDict) ... ok
Step #1 - "python_test": test_node_list_check_existence_dc (mcf_dict_util_test.TestMCFDict) ... ok
Step #1 - "python_test": test_node_list_check_existence_node_list (mcf_dict_util_test.TestMCFDict) ... ok
Step #1 - "python_test": test_example_usage (mcf_template_filler_test.MCFTemplateFillerTest) ... ok
Step #1 - "python_test": test_pop_and_2_obs_with_all_pv (mcf_template_filler_test.MCFTemplateFillerTest)
Step #1 - "python_test": Use separate templates for Pop Obs, and use Obs template repeatedly. ... ok
Step #1 - "python_test": test_pop_with_missing_req_pv (mcf_template_filler_test.MCFTemplateFillerTest) ... ok
Step #1 - "python_test": test_require_node_name (mcf_template_filler_test.MCFTemplateFillerTest) ... ok
Step #1 - "python_test": test_unified_pop_obs_with_missing_optional_pv (mcf_template_filler_test.MCFTemplateFillerTest) ... ok
Step #1 - "python_test": test_place_id_resolution_by_name (state_division_to_dcid_test.PlaceMapTest) ... ok
Step #1 - "python_test": test_boolean_naming (statvar_dcid_generator_test.TestStatVarDcidGenerator) ... ok
Step #1 - "python_test": test_double_underscore (statvar_dcid_generator_test.TestStatVarDcidGenerator) ... ok
Step #1 - "python_test": test_ignore_props (statvar_dcid_generator_test.TestStatVarDcidGenerator) ... ok
Step #1 - "python_test": test_legacy_mapping (statvar_dcid_generator_test.TestStatVarDcidGenerator) ... ok
Step #1 - "python_test": test_measured_property (statvar_dcid_generator_test.TestStatVarDcidGenerator) ... ok
Step #1 - "python_test": test_measurement_constraint_removal (statvar_dcid_generator_test.TestStatVarDcidGenerator) ... ok
Step #1 - "python_test": test_measurement_denominator (statvar_dcid_generator_test.TestStatVarDcidGenerator) ... ok
Step #1 - "python_test": test_measurement_qualifier (statvar_dcid_generator_test.TestStatVarDcidGenerator) ... ok
Step #1 - "python_test": test_naics_name_generation (statvar_dcid_generator_test.TestStatVarDcidGenerator) ... ok
Step #1 - "python_test": test_namespace_removal (statvar_dcid_generator_test.TestStatVarDcidGenerator) ... ok
Step #1 - "python_test": test_prepend_append_replace (statvar_dcid_generator_test.TestStatVarDcidGenerator) ... ok
Step #1 - "python_test": test_quantity_name_generation (statvar_dcid_generator_test.TestStatVarDcidGenerator) ... ok
Step #1 - "python_test": test_quantity_range_name_generation (statvar_dcid_generator_test.TestStatVarDcidGenerator) ... ok
Step #1 - "python_test": test_soc_map (statvar_dcid_generator_test.TestStatVarDcidGenerator) ... ok
Step #1 - "python_test": test_soc_name_generation (statvar_dcid_generator_test.TestStatVarDcidGenerator) ... ok
Step #1 - "python_test": test_sorted_constraints (statvar_dcid_generator_test.TestStatVarDcidGenerator) ... ok
Step #1 - "python_test": test_stat_type (statvar_dcid_generator_test.TestStatVarDcidGenerator) ... ok
Step #1 - "python_test":
Step #1 - "python_test": ----------------------------------------------------------------------
Step #1 - "python_test": Ran 61 tests in 4.812s
Step #1 - "python_test":
Step #1 - "python_test": OK
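Editor's note: the verbose `test_name (module.Class) ... ok` lines above are the standard unittest text runner at verbosity 2. A minimal sketch of an equivalent local invocation; the start directory and file pattern are assumptions about how the build script discovers these tests:

```python
import unittest

# Discover and run the util/ test modules with verbose per-test output.
suite = unittest.defaultTestLoader.discover('util', pattern='*_test.py')
unittest.TextTestRunner(verbosity=2).run(suite)
```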
Step #1 - "python_test": ### Running Python tests in import-automation/executor
Step #1 - "python_test": Installing Python requirements
Step #1 - "python_test":
Step #1 - "python_test": [notice] A new release of pip is available: 23.0.1 -> 23.3.2
Step #1 - "python_test": [notice] To update, run: pip install --upgrade pip
Step #1 - "python_test": #### Testing Python code in import-automation/executor
Step #1 - "python_test": test_appengine_job_request (test.cloud_scheduler_test.CloudSchedulerTest) ... ok
Step #1 - "python_test": test_http_job_request (test.cloud_scheduler_test.CloudSchedulerTest) ... ok
Step #1 - "python_test": test_levels (test.dashboard_api_test.DashboardAPITest)
Step #1 - "python_test": Tests that the convenient logging functions set the right ... ok
Step #1 - "python_test": test_log_helper (test.dashboard_api_test.DashboardAPITest) ... ok
Step #1 - "python_test": test_log_helper_http (test.dashboard_api_test.DashboardAPITest)
Step #1 - "python_test": Tests that an exception is thrown if the HTTP request fails. ... ok
Step #1 - "python_test": test_log_helper_id (test.dashboard_api_test.DashboardAPITest)
Step #1 - "python_test": Tests that at least one of run_id and attempt_id ... ok
Step #1 - "python_test": test_log_helper_time (test.dashboard_api_test.DashboardAPITest)
Step #1 - "python_test": Tests that time_logged is generated if not supplied. ... ok
Step #1 - "python_test": test_update_attempt (test.dashboard_api_test.DashboardAPITest) ... ok
Step #1 - "python_test": test_update_run (test.dashboard_api_test.DashboardAPITest) ... ok
Step #1 - "python_test": test.file_uploader_test (unittest.loader._FailedTest) ... ERROR
Step #1 - "python_test": test.github_api_test (unittest.loader._FailedTest) ... ERROR
Step #1 - "python_test": test_clean_time (test.import_executor_test.ImportExecutorTest) ... ok
Step #1 - "python_test": test_construct_process_message (test.import_executor_test.ImportExecutorTest) ... ok
Step #1 - "python_test": test_construct_process_message_no_output (test.import_executor_test.ImportExecutorTest)
Step #1 - "python_test": Tests that _construct_process_message does not append ... ok
Step #1 - "python_test": test_create_venv (test.import_executor_test.ImportExecutorTest) ... ok
Step #1 - "python_test": test_run_and_handle_exception (test.import_executor_test.ImportExecutorTest) ... ERROR:root:An unexpected exception was thrown
Step #1 - "python_test": Traceback (most recent call last):
Step #1 - "python_test": File "/workspace/import-automation/executor/app/executor/import_executor.py", line 604, in run_and_handle_exception
Step #1 - "python_test": return exec_func(*args)
Step #1 - "python_test": File "/workspace/import-automation/executor/test/import_executor_test.py", line 64, in raise_exception
Step #1 - "python_test": raise Exception
Step #1 - "python_test": Exception
Step #1 - "python_test": ok
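Editor's note: the `ERROR:root:An unexpected exception was thrown` line above is produced deliberately; the test raises inside the wrapped function to check that the wrapper logs the exception instead of propagating it. A hedged sketch of that wrapper pattern; the return shape is an assumption, not the executor's actual signature:

```python
import logging
import traceback


def run_and_handle_exception(exec_func, *args):
    """Call exec_func, log any exception, and report success/failure."""
    try:
        return True, exec_func(*args)
    except Exception:
        logging.exception('An unexpected exception was thrown')
        return False, traceback.format_exc()
```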
Step #1 - "python_test": test_run_with_timeout (test.import_executor_test.ImportExecutorTest) ... ERROR:root:An unexpected exception was thrown: Command '['sleep', '5']' timed out after 0.1 seconds when running ['sleep', '5']: Traceback (most recent call last):
Step #1 - "python_test": File "/workspace/import-automation/executor/app/executor/import_executor.py", line 712, in _run_with_timeout
Step #1 - "python_test": env=env)
Step #1 - "python_test": File "/usr/local/lib/python3.7/subprocess.py", line 490, in run
Step #1 - "python_test": stdout, stderr = process.communicate(input, timeout=timeout)
Step #1 - "python_test": File "/usr/local/lib/python3.7/subprocess.py", line 964, in communicate
Step #1 - "python_test": stdout, stderr = self._communicate(input, endtime, timeout)
Step #1 - "python_test": File "/usr/local/lib/python3.7/subprocess.py", line 1732, in _communicate
Step #1 - "python_test": self._check_timeout(endtime, orig_timeout, stdout, stderr)
Step #1 - "python_test": File "/usr/local/lib/python3.7/subprocess.py", line 1011, in _check_timeout
Step #1 - "python_test": stderr=b''.join(stderr_seq) if stderr_seq else None)
Step #1 - "python_test": subprocess.TimeoutExpired: Command '['sleep', '5']' timed out after 0.1 seconds
Step #1 - "python_test": Traceback (most recent call last):
Step #1 - "python_test": File "/workspace/import-automation/executor/app/executor/import_executor.py", line 712, in _run_with_timeout
Step #1 - "python_test": env=env)
Step #1 - "python_test": File "/usr/local/lib/python3.7/subprocess.py", line 490, in run
Step #1 - "python_test": stdout, stderr = process.communicate(input, timeout=timeout)
Step #1 - "python_test": File "/usr/local/lib/python3.7/subprocess.py", line 964, in communicate
Step #1 - "python_test": stdout, stderr = self._communicate(input, endtime, timeout)
Step #1 - "python_test": File "/usr/local/lib/python3.7/subprocess.py", line 1732, in _communicate
Step #1 - "python_test": self._check_timeout(endtime, orig_timeout, stdout, stderr)
Step #1 - "python_test": File "/usr/local/lib/python3.7/subprocess.py", line 1011, in _check_timeout
Step #1 - "python_test": stderr=b''.join(stderr_seq) if stderr_seq else None)
Step #1 - "python_test": subprocess.TimeoutExpired: Command '['sleep', '5']' timed out after 0.1 seconds
Step #1 - "python_test": FAIL
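Editor's note: the TimeoutExpired traceback above comes from subprocess.run's timeout parameter; the test runs ['sleep', '5'] with a 0.1-second limit. A minimal sketch of the catch-and-log pattern around such a call; the names are illustrative:

```python
import logging
import subprocess


def run_with_timeout(args, timeout_seconds):
    """Run a command, converting a timeout into a logged failure."""
    try:
        return subprocess.run(args, capture_output=True,
                              timeout=timeout_seconds, check=False)
    except subprocess.TimeoutExpired:
        logging.error('Command %s timed out after %s seconds',
                      args, timeout_seconds)
        return None
```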
Step #1 - "python_test": test_are_imports_finished (test.import_service_test.ImportServiceTest) ... ok
Step #1 - "python_test": test_block_on_import (test.import_service_test.ImportServiceTest) ... ok
Step #1 - "python_test": test_fix_input_path (test.import_service_test.ImportServiceTest) ... ok
Step #1 - "python_test": test_format_import_info (test.import_service_test.ImportServiceTest) ... ok
Step #1 - "python_test": test_get_fixed_absolute_import_name (test.import_service_test.ImportServiceTest) ... ok
Step #1 - "python_test": test_get_import_id (test.import_service_test.ImportServiceTest) ... ok
Step #1 - "python_test": test_smart_import (test.import_service_test.ImportServiceTest) ... ok
Step #1 - "python_test": test_absolute_import_name (test.import_target_test.ImportTargetTest) ... ok
Step #1 - "python_test": test_is_import_targetted_by_commit (test.import_target_test.ImportTargetTest) ... ok
Step #1 - "python_test": test_parse_commit_message_targets (test.import_target_test.ImportTargetTest) ... ok
Step #1 - "python_test": test.integration_test (unittest.loader._FailedTest) ... ERROR
Step #1 - "python_test": test_download_file (test.utils_test.AppUtilsTest)
Step #1 - "python_test": Response does not have a Content-Disposition header. ... ok
Step #1 - "python_test": test_download_file_timeout (test.utils_test.AppUtilsTest)
Step #1 - "python_test": Raises requests.Timeout exception. ... ok
Step #1 - "python_test": test_get_filename (test.utils_test.AppUtilsTest) ... ok
Step #1 - "python_test": test_get_filename_raise (test.utils_test.AppUtilsTest) ... ok
Step #1 - "python_test": test_pacific_time_to_datetime (test.utils_test.AppUtilsTest)
Step #1 - "python_test": Tests that the string returned by pacific_time can be converted to ... ok
Step #1 - "python_test": test_pacific_time_to_datetime_then_back (test.utils_test.AppUtilsTest)
Step #1 - "python_test": Tests that the string returned by pacific_time can be converted to ... ok
Step #1 - "python_test": test_compare_lines (test.utils_test.TestUtilsTest) ... ok
Step #1 - "python_test": test_import_spec_valid (test.validation_test.ValidationTest) ... ok
Step #1 - "python_test": test_import_spec_valid_fields_absent (test.validation_test.ValidationTest) ... ok
Step #1 - "python_test": test_import_spec_valid_script_not_exist (test.validation_test.ValidationTest) ... ok
Step #1 - "python_test": test_import_targets_valid_absolute_names (test.validation_test.ValidationTest) ... ok
Step #1 - "python_test": test_import_targets_valid_manifest_not_exist (test.validation_test.ValidationTest) ... ok
Step #1 - "python_test": test_import_targets_valid_name_not_exist (test.validation_test.ValidationTest) ... ok
Step #1 - "python_test": test_import_targets_valid_relative_names (test.validation_test.ValidationTest) ... ok
Step #1 - "python_test": test_import_targets_valid_relative_names_multiple_dirs (test.validation_test.ValidationTest) ... ok
Step #1 - "python_test": test_manifest_valid_fields_absent (test.validation_test.ValidationTest) ... ok
Step #1 - "python_test":
Step #1 - "python_test": ======================================================================
Step #1 - "python_test": ERROR: test.file_uploader_test (unittest.loader._FailedTest)
Step #1 - "python_test": ----------------------------------------------------------------------
Step #1 - "python_test": ImportError: Failed to import test module: test.file_uploader_test
Step #1 - "python_test": Traceback (most recent call last):
Step #1 - "python_test": File "/usr/local/lib/python3.7/unittest/loader.py", line 436, in _find_test_path
Step #1 - "python_test": module = self._get_module_from_name(name)
Step #1 - "python_test": File "/usr/local/lib/python3.7/unittest/loader.py", line 377, in _get_module_from_name
Step #1 - "python_test": __import__(name)
Step #1 - "python_test": File "/workspace/import-automation/executor/test/file_uploader_test.py", line 22, in <module>
Step #1 - "python_test": from test import integration_test
Step #1 - "python_test": File "/workspace/import-automation/executor/test/integration_test.py", line 27, in <module>
Step #1 - "python_test": 'github_repo_owner_username': os.environ['GITHUB_AUTH_USERNAME'],
Step #1 - "python_test": File "/usr/local/lib/python3.7/os.py", line 681, in __getitem__
Step #1 - "python_test": raise KeyError(key) from None
Step #1 - "python_test": KeyError: 'GITHUB_AUTH_USERNAME'
Step #1 - "python_test":
Step #1 - "python_test":
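Editor's note: the loader ERRORs trace back to integration_test.py reading os.environ['GITHUB_AUTH_USERNAME'] at import time, which raises KeyError when the variable is not set in the CI environment. A hedged sketch of one way to guard such reads so unittest discovery still succeeds; the skip decorator and class are an illustration, not the repo's code:

```python
import os
import unittest

# Read the variable without raising, then skip the tests that need it.
GITHUB_AUTH_USERNAME = os.environ.get('GITHUB_AUTH_USERNAME')


@unittest.skipUnless(GITHUB_AUTH_USERNAME,
                     'GITHUB_AUTH_USERNAME is not set')
class IntegrationTest(unittest.TestCase):

    def test_placeholder(self):
        self.assertTrue(GITHUB_AUTH_USERNAME)
```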
Step #1 - "python_test": ======================================================================
Step #1 - "python_test": ERROR: test.github_api_test (unittest.loader._FailedTest)
Step #1 - "python_test": ----------------------------------------------------------------------
Step #1 - "python_test": ImportError: Failed to import test module: test.github_a
...
[Logs truncated due to log size limitations. For full logs, see https://console.cloud.google.com/cloud-build/builds/01e41127-a354-488e-a105-f5825410d3d9?project=879489846695.]
...
dcid', 'MetricTon')])
Step #1 - "python_test": [ 2023-12-22 11:45:26.079688 ] Error: warning_ignored_stat_var_duplicate_AL Invalid statVar {'typeOf': 'dcs:StatisticalVariable', 'measurementQualifier': 'dcs:Annual', 'populationType': 'dcs:Energy', 'statType': 'dcs:measuredValue'} for row OrderedDict([('Commodity Code', 'ZG'), ('Country or Area Code', '276'), ('Country or Area', 'Germany'), ('Transaction Code', '121'), ('Commodity - Transaction Code', 'ZG121'), ('Commodity - Transaction', 'Of which: biogasoline - Consumption by manufacturing, construction and non-fuel mining industry'), ('Year', '2018'), ('Unit', 'Metric tons, thousand'), ('Quantity', '5000'), ('Quantity Footnotes', ''), ('_File', '/workspace/scripts/un/energy/test_data/un_energy_input.csv'), ('_Row', 2834), ('Country_dcid', 'dcs:country/DEU'), ('Unit_dcid', 'MetricTon')])
Step #1 - "python_test": [ 2023-12-22 11:45:26.079770 ] Error: warning_ignored_stat_var_duplicate_AL Invalid statVar {'typeOf': 'dcs:StatisticalVariable', 'measurementQualifier': 'dcs:Annual', 'populationType': 'dcs:Energy', 'statType': 'dcs:measuredValue'} for row OrderedDict([('Commodity Code', 'ZG'), ('Country or Area Code', '32'), ('Country or Area', 'Argentina'), ('Transaction Code', '022'), ('Commodity - Transaction Code', 'ZG022'), ('Commodity - Transaction', 'Of which: biogasoline - Receipts from other sources'), ('Year', '2019'), ('Unit', 'Metric tons, thousand'), ('Quantity', '850000'), ('Quantity Footnotes', ''), ('_File', '/workspace/scripts/un/energy/test_data/un_energy_input.csv'), ('_Row', 2835), ('Country_dcid', 'dcs:country/ARG'), ('Unit_dcid', 'MetricTon')])
Step #1 - "python_test": [ 2023-12-22 11:45:26.079844 ] Error: warning_ignored_stat_var_duplicate_AL Invalid statVar {'typeOf': 'dcs:StatisticalVariable', 'measurementQualifier': 'dcs:Annual', 'populationType': 'dcs:Energy', 'statType': 'dcs:measuredValue'} for row OrderedDict([('Commodity Code', 'ZG'), ('Country or Area Code', '32'), ('Country or Area', 'Argentina'), ('Transaction Code', '06'), ('Commodity - Transaction Code', 'ZG06'), ('Commodity - Transaction', 'Of which: biogasoline - Stock changes'), ('Year', '2013'), ('Unit', 'Metric tons, thousand'), ('Quantity', '-2000'), ('Quantity Footnotes', ''), ('_File', '/workspace/scripts/un/energy/test_data/un_energy_input.csv'), ('_Row', 2836), ('Country_dcid', 'dcs:country/ARG'), ('Unit_dcid', 'MetricTon')])
Step #1 - "python_test": [ 2023-12-22 11:45:26.080402 ] Error: warning_ignored_stat_var_duplicate_AL Invalid statVar {'typeOf': 'dcs:StatisticalVariable', 'measurementQualifier': 'dcs:Annual', 'populationType': 'dcs:Energy', 'statType': 'dcs:measuredValue'} for row OrderedDict([('Commodity Code', 'ZG'), ('Country or Area Code', '32'), ('Country or Area', 'Argentina'), ('Transaction Code', '1221'), ('Commodity - Transaction Code', 'ZG1221'), ('Commodity - Transaction', 'Of which: biogasoline - Consumption by road'), ('Year', '2019'), ('Unit', 'Metric tons, thousand'), ('Quantity', '842000'), ('Quantity Footnotes', ''), ('_File', '/workspace/scripts/un/energy/test_data/un_energy_input.csv'), ('_Row', 2837), ('Country_dcid', 'dcs:country/ARG'), ('Unit_dcid', 'MetricTon')])
Step #1 - "python_test": [ 2023-12-22 11:45:26.080507 ] Error: warning_ignored_stat_var_duplicate_AL Invalid statVar {'typeOf': 'dcs:StatisticalVariable', 'measurementQualifier': 'dcs:Annual', 'populationType': 'dcs:Energy', 'statType': 'dcs:measuredValue'} for row OrderedDict([('Commodity Code', 'ZG'), ('Country or Area Code', '32'), ('Country or Area', 'Argentina'), ('Transaction Code', '122'), ('Commodity - Transaction Code', 'ZG122'), ('Commodity - Transaction', 'Of which: biogasoline - Consumption in transport'), ('Year', '2019'), ('Unit', 'Metric tons, thousand'), ('Quantity', '842000'), ('Quantity Footnotes', ''), ('_File', '/workspace/scripts/un/energy/test_data/un_energy_input.csv'), ('_Row', 2838), ('Country_dcid', 'dcs:country/ARG'), ('Unit_dcid', 'MetricTon')])
Step #1 - "python_test": [ 2023-12-22 11:45:26.080620 ] Error: warning_ignored_stat_var_duplicate_AL Invalid statVar {'typeOf': 'dcs:StatisticalVariable', 'measurementQualifier': 'dcs:Annual', 'populationType': 'dcs:Energy', 'statType': 'dcs:measuredValue'} for row OrderedDict([('Commodity Code', 'ZG'), ('Country or Area Code', '32'), ('Country or Area', 'Argentina'), ('Transaction Code', '12'), ('Commodity - Transaction Code', 'ZG12'), ('Commodity - Transaction', 'Of which: biogasoline - Final energy consumption'), ('Year', '2019'), ('Unit', 'Metric tons, thousand'), ('Quantity', '842000'), ('Quantity Footnotes', ''), ('_File', '/workspace/scripts/un/energy/test_data/un_energy_input.csv'), ('_Row', 2839), ('Country_dcid', 'dcs:country/ARG'), ('Unit_dcid', 'MetricTon')])
Step #1 - "python_test": [ 2023-12-22 11:45:26.080739 ] Error: warning_ignored_stat_var_duplicate_AL Invalid statVar {'typeOf': 'dcs:StatisticalVariable', 'measurementQualifier': 'dcs:Annual', 'populationType': 'dcs:Energy', 'statType': 'dcs:measuredValue'} for row OrderedDict([('Commodity Code', 'ZG'), ('Country or Area Code', '32'), ('Country or Area', 'Argentina'), ('Transaction Code', 'GA'), ('Commodity - Transaction Code', 'ZGGA'), ('Commodity - Transaction', 'Of which: biogasoline - Total energy supply'), ('Year', '2019'), ('Unit', 'Metric tons, thousand'), ('Quantity', '850000'), ('Quantity Footnotes', ''), ('_File', '/workspace/scripts/un/energy/test_data/un_energy_input.csv'), ('_Row', 2840), ('Country_dcid', 'dcs:country/ARG'), ('Unit_dcid', 'MetricTon')])
Step #1 - "python_test": [ 2023-12-22 11:45:26.080840 ] Error: warning_ignored_stat_var_duplicate_AL Invalid statVar {'typeOf': 'dcs:StatisticalVariable', 'measurementQualifier': 'dcs:Annual', 'populationType': 'dcs:Energy', 'statType': 'dcs:measuredValue'} for row OrderedDict([('Commodity Code', 'ZG'), ('Country or Area Code', '32'), ('Country or Area', 'Argentina'), ('Transaction Code', 'NA'), ('Commodity - Transaction Code', 'ZGNA'), ('Commodity - Transaction', 'Of which: biogasoline - Final consumption'), ('Year', '2019'), ('Unit', 'Metric tons, thousand'), ('Quantity', '842000'), ('Quantity Footnotes', ''), ('_File', '/workspace/scripts/un/energy/test_data/un_energy_input.csv'), ('_Row', 2841), ('Country_dcid', 'dcs:country/ARG'), ('Unit_dcid', 'MetricTon')])
Step #1 - "python_test": [ 2023-12-22 11:45:26.080937 ] Error: warning_ignored_stat_var_duplicate_AL Invalid statVar {'typeOf': 'dcs:StatisticalVariable', 'measurementQualifier': 'dcs:Annual', 'populationType': 'dcs:Energy', 'statType': 'dcs:measuredValue'} for row OrderedDict([('Commodity Code', 'ZG'), ('Country or Area Code', '40'), ('Country or Area', 'Austria'), ('Transaction Code', '03'), ('Commodity - Transaction Code', 'ZG03'), ('Commodity - Transaction', 'Of which: biogasoline - Imports'), ('Year', '2019'), ('Unit', 'Metric tons, thousand'), ('Quantity', '34000'), ('Quantity Footnotes', ''), ('_File', '/workspace/scripts/un/energy/test_data/un_energy_input.csv'), ('_Row', 2842), ('Country_dcid', 'dcs:country/AUT'), ('Unit_dcid', 'MetricTon')])
Step #1 - "python_test": [ 2023-12-22 11:45:26.081068 ] Error: warning_ignored_stat_var_duplicate_AL Invalid statVar {'typeOf': 'dcs:StatisticalVariable', 'measurementQualifier': 'dcs:Annual', 'populationType': 'dcs:Energy', 'statType': 'dcs:measuredValue'} for row OrderedDict([('Commodity Code', 'ZG'), ('Country or Area Code', '40'), ('Country or Area', 'Austria'), ('Transaction Code', '04'), ('Commodity - Transaction Code', 'ZG04'), ('Commodity - Transaction', 'Of which: biogasoline - Exports'), ('Year', '2019'), ('Unit', 'Metric tons, thousand'), ('Quantity', '49000'), ('Quantity Footnotes', ''), ('_File', '/workspace/scripts/un/energy/test_data/un_energy_input.csv'), ('_Row', 2843), ('Country_dcid', 'dcs:country/AUT'), ('Unit_dcid', 'MetricTon')])
Step #1 - "python_test": [ 2023-12-22 11:45:26.081166 ] Error: warning_ignored_stat_var_duplicate_AL Invalid statVar {'typeOf': 'dcs:StatisticalVariable', 'measurementQualifier': 'dcs:Annual', 'populationType': 'dcs:Energy', 'statType': 'dcs:measuredValue'} for row OrderedDict([('Commodity Code', 'ZG'), ('Country or Area Code', '604'), ('Country or Area', 'Peru'), ('Transaction Code', '07'), ('Commodity - Transaction Code', 'ZG07'), ('Commodity - Transaction', 'Of which: biogasoline - Transfers and recycled products'), ('Year', '2012'), ('Unit', 'Metric tons, thousand'), ('Quantity', '-6000'), ('Quantity Footnotes', ''), ('_File', '/workspace/scripts/un/energy/test_data/un_energy_input.csv'), ('_Row', 2844), ('Country_dcid', 'dcs:country/PER'), ('Unit_dcid', 'MetricTon')])
Step #1 - "python_test": [ 2023-12-22 11:45:26.081254 ] Error: warning_ignored_stat_var_duplicate_AL Invalid statVar {'typeOf': 'dcs:StatisticalVariable', 'measurementQualifier': 'dcs:Annual', 'populationType': 'dcs:Energy', 'statType': 'dcs:measuredValue'} for row OrderedDict([('Commodity Code', 'ZG'), ('Country or Area Code', '604'), ('Country or Area', 'Peru'), ('Transaction Code', '1214e'), ('Commodity - Transaction Code', 'ZG1214e'), ('Commodity - Transaction', 'Of which: biogasoline - Consumption by mining and quarrying '), ('Year', '2018'), ('Unit', 'Metric tons, thousand'), ('Quantity', '0'), ('Quantity Footnotes', ''), ('_File', '/workspace/scripts/un/energy/test_data/un_energy_input.csv'), ('_Row', 2845), ('Country_dcid', 'dcs:country/PER'), ('Unit_dcid', 'MetricTon')])
Step #1 - "python_test": [ 2023-12-22 11:45:26.081346 ] Error: warning_ignored_stat_var_duplicate_AL Invalid statVar {'typeOf': 'dcs:StatisticalVariable', 'measurementQualifier': 'dcs:Annual', 'populationType': 'dcs:Energy', 'statType': 'dcs:measuredValue'} for row OrderedDict([('Commodity Code', 'ZG'), ('Country or Area Code', '604'), ('Country or Area', 'Peru'), ('Transaction Code', '1214o'), ('Commodity - Transaction Code', 'ZG1214o'), ('Commodity - Transaction', 'Of which: biogasoline - Consumption not elsewhere specified (industry)'), ('Year', '2012'), ('Unit', 'Metric tons, thousand'), ('Quantity', '1000'), ('Quantity Footnotes', ''), ('_File', '/workspace/scripts/un/energy/test_data/un_energy_input.csv'), ('_Row', 2846), ('Country_dcid', 'dcs:country/PER'), ('Unit_dcid', 'MetricTon')])
Step #1 - "python_test": [ 2023-12-22 11:45:26.081436 ] Error: warning_ignored_stat_var_duplicate_AL Invalid statVar {'typeOf': 'dcs:StatisticalVariable', 'measurementQualifier': 'dcs:Annual', 'populationType': 'dcs:Energy', 'statType': 'dcs:measuredValue'} for row OrderedDict([('Commodity Code', 'ZG'), ('Country or Area Code', '604'), ('Country or Area', 'Peru'), ('Transaction Code', '1225'), ('Commodity - Transaction Code', 'ZG1225'), ('Commodity - Transaction', 'Of which: biogasoline - Consumption not elsewhere specified (transport)'), ('Year', '2012'), ('Unit', 'Metric tons, thousand'), ('Quantity', '12000'), ('Quantity Footnotes', ''), ('_File', '/workspace/scripts/un/energy/test_data/un_energy_input.csv'), ('_Row', 2847), ('Country_dcid', 'dcs:country/PER'), ('Unit_dcid', 'MetricTon')])
Step #1 - "python_test": [ 2023-12-22 11:45:26.081998 ] Error: warning_ignored_stat_var_duplicate_AL Invalid statVar {'typeOf': 'dcs:StatisticalVariable', 'measurementQualifier': 'dcs:Annual', 'populationType': 'dcs:Energy', 'statType': 'dcs:measuredValue'} for row OrderedDict([('Commodity Code', 'ZG'), ('Country or Area Code', '858'), ('Country or Area', 'Uruguay'), ('Transaction Code', '101'), ('Commodity - Transaction Code', 'ZG101'), ('Commodity - Transaction', 'Of which: biogasoline - Losses'), ('Year', '2013'), ('Unit', 'Metric tons, thousand'), ('Quantity', '600.0'), ('Quantity Footnotes', ''), ('_File', '/workspace/scripts/un/energy/test_data/un_energy_input.csv'), ('_Row', 2848), ('Country_dcid', 'dcs:country/URY'), ('Unit_dcid', 'MetricTon')])
Step #1 - "python_test": [ 2023-12-22 11:45:26.082120 ] Error: warning_ignored_stat_var_duplicate_BJ Invalid statVar {'typeOf': 'dcs:StatisticalVariable', 'measurementQualifier': 'dcs:Annual', 'populationType': 'dcs:Energy', 'statType': 'dcs:measuredValue'} for row OrderedDict([('Commodity Code', 'ZJ'), ('Country or Area Code', '246'), ('Country or Area', 'Finland'), ('Transaction Code', '04'), ('Commodity - Transaction Code', 'ZJ04'), ('Commodity - Transaction', 'Of which: bio jet kerosene - Exports'), ('Year', '2015'), ('Unit', 'Metric tons, thousand'), ('Quantity', '1000'), ('Quantity Footnotes', ''), ('_File', '/workspace/scripts/un/energy/test_data/un_energy_input.csv'), ('_Row', 2849), ('Country_dcid', 'dcs:country/FIN'), ('Unit_dcid', 'MetricTon')])
Step #1 - "python_test": [ 2023-12-22 11:45:26.082229 ] Error: warning_ignored_stat_var_duplicate_BJ Invalid statVar {'typeOf': 'dcs:StatisticalVariable', 'measurementQualifier': 'dcs:Annual', 'populationType': 'dcs:Energy', 'statType': 'dcs:measuredValue'} for row OrderedDict([('Commodity Code', 'ZJ'), ('Country or Area Code', '246'), ('Country or Area', 'Finland'), ('Transaction Code', '07'), ('Commodity - Transaction Code', 'ZJ07'), ('Commodity - Transaction', 'Of which: bio jet kerosene - Transfers and recycled products'), ('Year', '2015'), ('Unit', 'Metric tons, thousand'), ('Quantity', '-1000'), ('Quantity Footnotes', ''), ('_File', '/workspace/scripts/un/energy/test_data/un_energy_input.csv'), ('_Row', 2850), ('Country_dcid', 'dcs:country/FIN'), ('Unit_dcid', 'MetricTon')])
Step #1 - "python_test": [ 2023-12-22 11:45:26.082310 ] Error: warning_ignored_stat_var_duplicate_BJ Invalid statVar {'typeOf': 'dcs:StatisticalVariable', 'measurementQualifier': 'dcs:Annual', 'populationType': 'dcs:Energy', 'statType': 'dcs:measuredValue'} for row OrderedDict([('Commodity Code', 'ZJ'), ('Country or Area Code', '246'), ('Country or Area', 'Finland'), ('Transaction Code', 'GA'), ('Commodity - Transaction Code', 'ZJGA'), ('Commodity - Transaction', 'Of which: bio jet kerosene - Total energy supply'), ('Year', '2015'), ('Unit', 'Metric tons, thousand'), ('Quantity', '-1000'), ('Quantity Footnotes', ''), ('_File', '/workspace/scripts/un/energy/test_data/un_energy_input.csv'), ('_Row', 2851), ('Country_dcid', 'dcs:country/FIN'), ('Unit_dcid', 'MetricTon')])
Step #1 - "python_test": ok
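Editor's note: the `warning_ignored_stat_var_duplicate_*` lines above report rows whose generated statVar collapses to one that was already emitted, so the duplicate is skipped and counted. A hedged sketch of that dedup-and-count pattern; the key construction and counter name mirror the log but are assumptions, not the importer's actual logic:

```python
def dedup_statvars(statvars, counters):
    """Keep the first statVar per key; count the rest as duplicates."""
    seen = {}
    for statvar in statvars:
        key = statvar.get('dcid') or tuple(sorted(statvar.items()))
        if key in seen:
            counters['warning_ignored_stat_var_duplicate'] = (
                counters.get('warning_ignored_stat_var_duplicate', 0) + 1)
            continue
        seen[key] = statvar
    return list(seen.values())
```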
Step #1 - "python_test": test_process_containment (un.sdg.geography_test.GeographyTest) ... ok
Step #1 - "python_test": test_should_include_containment (un.sdg.geography_test.GeographyTest) ... ok
Step #1 - "python_test": test_write_place_mappings (un.sdg.geography_test.GeographyTest) ... ok
Step #1 - "python_test": test_write_un_containment (un.sdg.geography_test.GeographyTest) ... ok
Step #1 - "python_test": test_write_un_places (un.sdg.geography_test.GeographyTest) ... ok
Step #1 - "python_test": test_drop_null (un.sdg.process_test.ProcessTest) ... ok
Step #1 - "python_test": test_drop_special (un.sdg.process_test.ProcessTest) ... ok
Step #1 - "python_test": test_fix_encoding (un.sdg.process_test.ProcessTest) ... ok
Step #1 - "python_test": test_get_geography (un.sdg.process_test.ProcessTest) ... ok
Step #1 - "python_test": test_get_measurement_method (un.sdg.process_test.ProcessTest) ... ok
Step #1 - "python_test": test_process (un.sdg.process_test.ProcessTest) ... ok
Step #1 - "python_test": test_curate_pvs (un.sdg.util_test.UtilTest) ... ok
Step #1 - "python_test": test_format_description (un.sdg.util_test.UtilTest) ... ok
Step #1 - "python_test": test_format_property (un.sdg.util_test.UtilTest) ... ok
Step #1 - "python_test": test_format_title (un.sdg.util_test.UtilTest) ... ok
Step #1 - "python_test": test_format_variable_code (un.sdg.util_test.UtilTest) ... ok
Step #1 - "python_test": test_format_variable_description (un.sdg.util_test.UtilTest) ... ok
Step #1 - "python_test": test_is_float (un.sdg.util_test.UtilTest) ... ok
Step #1 - "python_test": test_is_valid (un.sdg.util_test.UtilTest) ... ok
Step #1 - "python_test": test_data_processing_small (us_bea.states_gdp.import_data_test.USStateQuarterlyGDPImportTest)
Step #1 - "python_test": Tests end-to-end data cleaning on a small example. ... ok
Step #1 - "python_test": test_data_processing_tiny (us_bea.states_gdp.import_data_test.USStateQuarterlyGDPImportTest)
Step #1 - "python_test": Tests end-to-end data cleaning on a tiny example. ... ok
Step #1 - "python_test": test_date_converter (us_bea.states_gdp.import_data_test.USStateQuarterlyGDPImportTest)
Step #1 - "python_test": Tests the date converter function used to process raw data. ... ok
Step #1 - "python_test": test_geoid_converter (us_bea.states_gdp.import_data_test.USStateQuarterlyGDPImportTest)
Step #1 - "python_test": Tests the geoid converter function used to process raw data. ... ok
Step #1 - "python_test": test_data_processing_tiny (us_bea.states_gdp.import_data_test.USStateQuarterlyPerIndustryImportTest)
Step #1 - "python_test": Tests end-to-end data cleaning on a tiny example. ... ok
Step #1 - "python_test": test_industry_class (us_bea.states_gdp.import_data_test.USStateQuarterlyPerIndustryImportTest)
Step #1 - "python_test": Tests industry class converter function that cleans out empty ... ok
Step #1 - "python_test": test_value_converter (us_bea.states_gdp.import_data_test.USStateQuarterlyPerIndustryImportTest)
Step #1 - "python_test": Tests value converter function that cleans out empty datapoints. ... ok
Step #1 - "python_test": test_preprocess (us_bjs.nps.import_data_test.TestPreprocess) ... /workspace/scripts/us_bjs/nps/preprocess_data.py:21: PerformanceWarning: DataFrame is highly fragmented. This is usually the result of calling `frame.insert` many times, which has poor performance. Consider joining all columns at once using pd.concat(axis=1) instead. To get a de-fragmented frame, use `newframe = frame.copy()`
Step #1 - "python_test": df["PVINF_Temp"] = df["PVINF"].apply(convert_nan_for_calculation)
Step #1 - "python_test": /workspace/scripts/us_bjs/nps/preprocess_data.py:22: PerformanceWarning: DataFrame is highly fragmented. This is usually the result of calling `frame.insert` many times, which has poor performance. Consider joining all columns at once using pd.concat(axis=1) instead. To get a de-fragmented frame, use `newframe = frame.copy()`
Step #1 - "python_test": df["PVOTHF_Temp"] = df["PVOTHF"].apply(convert_nan_for_calculation)
Step #1 - "python_test": /workspace/scripts/us_bjs/nps/preprocess_data.py:23: PerformanceWarning: DataFrame is highly fragmented. This is usually the result of calling `frame.insert` many times, which has poor performance. Consider joining all columns at once using pd.concat(axis=1) instead. To get a de-fragmented frame, use `newframe = frame.copy()`
Step #1 - "python_test": df["PVINM_Temp"] = df["PVINM"].apply(convert_nan_for_calculation)
Step #1 - "python_test": /workspace/scripts/us_bjs/nps/preprocess_data.py:24: PerformanceWarning: DataFrame is highly fragmented. This is usually the result of calling `frame.insert` many times, which has poor performance. Consider joining all columns at once using pd.concat(axis=1) instead. To get a de-fragmented frame, use `newframe = frame.copy()`
Step #1 - "python_test": df["PVOTHM_Temp"] = df["PVOTHM"].apply(convert_nan_for_calculation)
Step #1 - "python_test": ok
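Editor's note: the PerformanceWarnings above come from inserting the derived *_Temp columns one at a time, which fragments the DataFrame. A hedged sketch of the fix pandas suggests, building the columns in one pass and attaching them with a single pd.concat; the column names come from the log, but the refactor itself is an assumption:

```python
import pandas as pd


def add_temp_columns(df, convert_nan_for_calculation):
    """Attach all *_Temp columns at once to avoid frame fragmentation."""
    temp = pd.DataFrame({
        f'{col}_Temp': df[col].apply(convert_nan_for_calculation)
        for col in ('PVINF', 'PVOTHF', 'PVINM', 'PVOTHM')
    })
    return pd.concat([df, temp], axis=1)
```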
Step #1 - "python_test": test_filter_series (us_bls.cpi.generate_csv_mcf_test.TestGenerateCSVMCF) ... ok
Step #1 - "python_test": test_generate_statvar (us_bls.cpi.generate_csv_mcf_test.TestGenerateCSVMCF) ... ok
Step #1 - "python_test": test_invalid_series_id (us_bls.cpi.generate_csv_mcf_test.TestGenerateCSVMCF) ... ok
Step #1 - "python_test": test_parse_series_id (us_bls.cpi.generate_csv_mcf_test.TestGenerateCSVMCF) ... ok
Step #1 - "python_test": test_valid_series_id (us_bls.cpi.generate_csv_mcf_test.TestGenerateCSVMCF) ... ok
Step #1 - "python_test": test_clean_cdc_places_data (us_cdc.500_places.parse_cdc_places_test.TestParseCDCPlaces)
Step #1 - "python_test": Tests the clean_cdc_places_data function. ... ok
Step #1 - "python_test": test_brfss_asthma_extracted_data (us_cdc.brfss_aggregated_asthma_2016_2018.brfss_asthma_import_test.ProcessTest) ... ok
Step #1 - "python_test": test_clean_air_quality_data (us_cdc.environmental_health_toxicology.parse_air_quality_test.TestParseAirQuality)
Step #1 - "python_test": Tests the clean_air_quality_data function. ... ok
Step #1 - "python_test": test_clean_precipitation_data (us_cdc.environmental_health_toxicology.parse_precipitation_index_test.TestParsePrecipitationData)
Step #1 - "python_test": Tests the clean_precipitation_data function. ... ok
Step #1 - "python_test": test_spec_generation (us_census.acs5yr.subject_tables.common.acs_spec_generator_test.TestSpecGenerator) ... /workspace/scripts/us_census/acs5yr/subject_tables/common/datacommons_api_wrappers/datacommons_wrappers.py:147: ResourceWarning: unclosed file <_io.TextIOWrapper name='/workspace/scripts/us_census/acs5yr/subject_tables/common/datacommons_api_wrappers/prefetched_outputs/Person_dc_props.json' mode='r' encoding='UTF-8'>
Step #1 - "python_test": open(os.path.join(cache_path, f'{dcid}_dc_props.json'), 'r'))
Step #1 - "python_test": ResourceWarning: Enable tracemalloc to get the object allocation traceback
Step #1 - "python_test": /workspace/scripts/us_census/acs5yr/subject_tables/common/datacommons_api_wrappers/datacommons_wrappers.py:172: ResourceWarning: unclosed file <_io.TextIOWrapper name='/workspace/scripts/us_census/acs5yr/subject_tables/common/datacommons_api_wrappers/prefetched_outputs/Person_dc_props_types.json' mode='r' encoding='UTF-8'>
Step #1 - "python_test": open(os.path.join(cache_path, f'{dcid}_dc_props_types.json'), 'r'))
Step #1 - "python_test": ResourceWarning: Enable tracemalloc to get the object allocation traceback
Step #1 - "python_test": /workspace/scripts/us_census/acs5yr/subject_tables/common/datacommons_api_wrappers/datacommons_wrappers.py:200: ResourceWarning: unclosed file <_io.TextIOWrapper name='/workspace/scripts/us_census/acs5yr/subject_tables/common/datacommons_api_wrappers/prefetched_outputs/Person_dc_props_enum_values.json' mode='r' encoding='UTF-8'>
Step #1 - "python_test": 'r'))
Step #1 - "python_test": ResourceWarning: Enable tracemalloc to get the object allocation traceback
Step #1 - "python_test": ok
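Editor's note: the unclosed-file ResourceWarnings above point at `json.load(open(...))` calls whose file handles are never closed. An illustrative fix using a with-block; the path construction mirrors the log but is an assumption:

```python
import json
import os


def load_cached_props(cache_path, dcid):
    # The with-block guarantees the cached JSON file is closed.
    with open(os.path.join(cache_path, f'{dcid}_dc_props.json'), 'r') as fp:
        return json.load(fp)
```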
Step #1 - "python_test": test_find_columns_with_no_properties (us_census.acs5yr.subject_tables.common.acs_spec_validator_test.TestSpecValidator) ... ok
Step #1 - "python_test": test_find_extra_inferred_properties (us_census.acs5yr.subject_tables.common.acs_spec_validator_test.TestSpecValidator) ... ok
Step #1 - "python_test": test_find_extra_tokens (us_census.acs5yr.subject_tables.common.acs_spec_validator_test.TestSpecValidator) ... ok
Step #1 - "python_test": test_find_ignore_conflicts (us_census.acs5yr.subject_tables.common.acs_spec_validator_test.TestSpecValidator) ... ok
Step #1 - "python_test": test_find_missing_denominator_total_column (us_census.acs5yr.subject_tables.common.acs_spec_validator_test.TestSpecValidator) ... ok
Step #1 - "python_test": test_find_missing_denominators (us_census.acs5yr.subject_tables.common.acs_spec_validator_test.TestSpecValidator) ... ok
Step #1 - "python_test": test_find_multiple_measurement (us_census.acs5yr.subject_tables.common.acs_spec_validator_test.TestSpecValidator) ... ok
Step #1 - "python_test": test_find_multiple_population (us_census.acs5yr.subject_tables.common.acs_spec_validator_test.TestSpecValidator) ... ok
Step #1 - "python_test": test_find_repeating_denominators (us_census.acs5yr.subject_tables.common.acs_spec_validator_test.TestSpecValidator) ... ok
Step #1 - "python_test": test_check_column_map (us_census.acs5yr.subject_tables.common.column_map_validator_test.TestColumnMapValidator) ... ok
Step #1 - "python_test": test_column_ignore (us_census.acs5yr.subject_tables.common.common_util_test.TestCommonUtil) ... ok
Step #1 - "python_test": test_columns_from_CSVreader (us_census.acs5yr.subject_tables.common.common_util_test.TestCommonUtil) ... ok
Step #1 - "python_test": test_find_missing_tokens (us_census.acs5yr.subject_tables.common.common_util_test.TestCommonUtil) ... ok
Step #1 - "python_test": test_get_spec_token_list (us_census.acs5yr.subject_tables.common.common_util_test.TestCommonUtil) ... ok
Step #1 - "python_test": test_token_in_list (us_census.acs5yr.subject_tables.common.common_util_test.TestCommonUtil) ... ok
Step #1 - "python_test": test_tokens_from_column_list (us_census.acs5yr.subject_tables.common.common_util_test.TestCommonUtil) ... ok
Step #1 - "python_test": test_csv_file_input (us_census.acs5yr.subject_tables.common.data_loader_test.DataLoaderBaseTest) ... ok
Finished Step #2 - "python_format_check"
Step #1 - "python_test": test_zip_file_input (us_census.acs5yr.subject_tables.common.data_loader_test.DataLoaderBaseTest) ... ok
Step #1 - "python_test": test_generating_column_map_from_csv (us_census.acs5yr.subject_tables.common.generate_col_map_test.GenerateColMapTest) ... ok
Step #1 - "python_test": test_generating_column_map_from_zip (us_census.acs5yr.subject_tables.common.generate_col_map_test.GenerateColMapTest) ... ok
Step #1 - "python_test": test_geoIds_at_all_summary_levels (us_census.acs5yr.subject_tables.common.resolve_geo_id_test.ResolveCensusGeoIdTest) ... ok
Step #1 - "python_test": test_convert_column_to_stat_var (us_census.acs5yr.subject_tables.s2201.process_test.ProcessTest) ... ok
Step #1 - "python_test": test_create_csv (us_census.acs5yr.subject_tables.s2201.process_test.ProcessTest) ... ok
Step #1 - "python_test": test_create_tmcf (us_census.acs5yr.subject_tables.s2201.process_test.ProcessTest) ... ok
Step #1 - "python_test": test_csv_mcf_column_map (us_census.acs5yr.subject_tables.subject_table_test.TestSubjectTable) ... ok
Step #1 - "python_test": test_e2e (us_census.decennial.process_test.ProcessTest) ... ok
Step #1 - "python_test": test_bad_tmcf_variable_measured_two_equals_exception (us_census.enhanced_tmcf.process_etmcf_test.Process_ETMCF_Test) ... ok
Step #1 - "python_test": test_bad_tmcf_variable_measured_two_question_marks_exception (us_census.enhanced_tmcf.process_etmcf_test.Process_ETMCF_Test) ... ok
Step #1 - "python_test": test_csv_file_not_found_exception (us_census.enhanced_tmcf.process_etmcf_test.Process_ETMCF_Test) ... ok
Step #1 - "python_test": test_process_enhanced_tmcf_medium_success (us_census.enhanced_tmcf.process_etmcf_test.Process_ETMCF_Test) ... ok
Step #1 - "python_test": test_simple_opaque_success (us_census.enhanced_tmcf.process_etmcf_test.Process_ETMCF_Test) ... ok
Step #1 - "python_test": test_simple_success (us_census.enhanced_tmcf.process_etmcf_test.Process_ETMCF_Test) ... ok
Step #1 - "python_test": test_tmcf_file_not_found_exception (us_census.enhanced_tmcf.process_etmcf_test.Process_ETMCF_Test) ... ok
Step #1 - "python_test": test_process (us_eia.eia_860.main_test.TestProcess) ... ok
Step #1 - "python_test": test_cleanup_name (us_eia.opendata.process.common_test.TestProcess) ... ok
Step #1 - "python_test": test_process (us_eia.opendata.process.common_test.TestProcess) ... ok
Step #1 - "python_test": test_write_csv_county (us_epa.airdata.air_quality_aggregate_test.TestCriteriaGasesTest) ... ok
Step #1 - "python_test": test_write_csv_csba (us_epa.airdata.air_quality_aggregate_test.TestCriteriaGasesTest) ... ok
Step #1 - "python_test": test_write_tmcf (us_epa.airdata.air_quality_aggregate_test.TestCriteriaGasesTest) ... ok
Step #1 - "python_test": test_write_csv (us_epa.airdata.air_quality_test.TestCriteriaGasesTest) ... ok
Step #1 - "python_test": test_write_tmcf (us_epa.airdata.air_quality_test.TestCriteriaGasesTest) ... ok
Step #1 - "python_test": test_write_csv (us_epa.ejscreen.ejscreen_test.TestEjscreen) ... ok
Step #1 - "python_test": test_e2e (us_epa.facility.process_facility_test.ProcessTest) ... ok
Step #1 - "python_test": test_name_to_dcid (us_epa.ghgrp.gas_test.GasTest) ... ok
Step #1 - "python_test": test_process_direct_emitters (us_epa.ghgrp.process_test.ProcessTest) ... ok
Step #1 - "python_test": test_name_to_dcid (us_epa.ghgrp.sources_test.SourcesTest) ... ok
Step #1 - "python_test": test_parent_companies_e2e (us_epa.parent_company.process_parent_company_test.ProcessTest) ... ok
Step #1 - "python_test": test_svobs_e2e (us_epa.parent_company.process_parent_company_test.ProcessTest) ... ok
Step #1 - "python_test": test_e2e_superfund_funding_status (us_epa.superfund.site_and_funding_status.process_sites_fundingStatus_test.ProcessTest) ... ok
Step #1 - "python_test": test_e2e_superfund_sites (us_epa.superfund.site_and_funding_status.process_sites_test.ProcessTest) ... /workspace/scripts/us_epa/superfund/site_and_funding_status/process_sites.py:144: FutureWarning: The default value of regex will change from True to False in a future version.
Step #1 - "python_test": site_csv['siteName'] = site_csv['siteName'].str.replace(', Inc.', ' Inc.')
Step #1 - "python_test": ok
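Editor's note: the FutureWarning above concerns pandas changing the default of `regex` in `Series.str.replace`. Since ', Inc.' is a literal string, passing `regex=False` keeps the current behaviour explicit. A hedged sketch; the function wrapper is illustrative:

```python
import pandas as pd


def normalize_site_names(site_csv: pd.DataFrame) -> pd.DataFrame:
    """Replace the literal ', Inc.' suffix without regex interpretation."""
    site_csv['siteName'] = site_csv['siteName'].str.replace(
        ', Inc.', ' Inc.', regex=False)
    return site_csv
```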
Step #1 - "python_test": test_e2e_superfund_site_contamination (us_epa.superfund.site_contamination.process_sites_contamination_test.ProcessTest) ... ok
Step #1 - "python_test": test_e2e (us_epa.superfund.site_hazards.process_sites_hazards_test.ProcessTest) ... ok
Step #1 - "python_test": test_e2e (us_epa.superfund.sites.measurement_sites.generate_measurement_site_mcf_test.ProcessTest) ... ok
Step #1 - "python_test": test_e2e (us_epa.superfund.sites.tar_creek.process_contaminants_test.ProcessTest) ... ok
Step #1 - "python_test": test_main (us_fema.national_risk_index.generate_schema_and_tmcf_test.ProcessFemaNriFileTest) ... ok
Step #1 - "python_test": test_county_missing_trailing_zero (us_fema.national_risk_index.process_data_test.FormatGeoIDTest) ... ok
Step #1 - "python_test": test_county_no_change_needed (us_fema.national_risk_index.process_data_test.FormatGeoIDTest) ... ok
Step #1 - "python_test": test_tract_missing_trailing_zero (us_fema.national_risk_index.process_data_test.FormatGeoIDTest) ... ok
Step #1 - "python_test": test_tract_no_change_needed (us_fema.national_risk_index.process_data_test.FormatGeoIDTest) ... ok
Step #1 - "python_test": test_process_county_file (us_fema.national_risk_index.process_data_test.ProcessFemaNriFileTest) ... ok
Step #1 - "python_test": test_process_tract_file (us_fema.national_risk_index.process_data_test.ProcessFemaNriFileTest) ... ok
Step #1 - "python_test": test_preprocess (us_gs.earthquake.preprocess_test.USGSEarthquakePreprocessTest) ... ok
Step #1 - "python_test": test_compute_150 (us_hud.income.process_test.ProcessTest) ... ok
Step #1 - "python_test": test_get_url (us_hud.income.process_test.ProcessTest) ... ok
Step #1 - "python_test": test_process (us_hud.income.process_test.ProcessTest) ... ok
Step #1 - "python_test": test_output_mcf (world_bank.boundaries.country_boundaries_mcf_generator_test.CountyBoundariesMcfGeneratorTest) ... ok
Step #1 - "python_test":
Step #1 - "python_test": ----------------------------------------------------------------------
Step #1 - "python_test": Ran 172 tests in 204.299s
Step #1 - "python_test":
Step #1 - "python_test": OK
Finished Step #1 - "python_test"
PUSH
DONE