From 8abbc26b055c3b85939979c5c2670207080ce432 Mon Sep 17 00:00:00 2001 From: Libba Lawrence Date: Mon, 10 Jun 2024 07:54:55 -0700 Subject: [PATCH] [EG] GA Namespaces (#35831) * [EGv2] Build Release (#30325) * move old sdk under legacy * gen typespec code * naming changes from archboard * samples * update patch naming * update imports with new gen * update samples * update client naming on aio * update receive op * update async to close client * update receive() * update gen code * moving around samples * updating samples * update samples * update patch and samples * patch internalmodels * spacing * updating model patch * update patch models * add both models back * update docstring * update docs * updating patch for receive * old EG models * add reject samples * patch * update format * update patch * eventgrid_client exceptions * update test imports * update total sample * receive patch fix * add in more tests * update test file * remove locktoken model * remove LockToken in patch * remove event delivery delay * eg client exceptions * .8.5 generation, and deliveryCount * rename sample * update version for beta * changelog * updating for gen * regen * generate via commit * publish result * fix docstring * publish docstring * return type * publish result * return publish result -- is none * format * update Publish result model * deliverycount patch * update from main * add copyright * added to readme * remove from readme * force publish_result response * update patch tp unindent * cspell * update mypy.ini * import order * mark livetest * update operations init * rename async * mypy * ignore mypy * pylint * pylint * ignore pylint for now to avoid gen code errors * ignore samples until ARM setup * update patches * remove publish result * remove PublishResult * remove publishresult * comma Co-authored-by: swathipil <76007337+swathipil@users.noreply.github.com> * update publishResult * change to .value * gen code " to ' * remove comment * ran black * update changelog * 
update sample readme * gen code without query name * gen code * update tsp commit * remove publishresult * readme disclaimer * update changelog --------- Co-authored-by: swathipil <76007337+swathipil@users.noreply.github.com> * link to samples * remove comment * remove uneeded test * [EGv2] doc updates (#30483) * doc updates * doc update * doc * [EGv2] Eg typing/formatting (#30492) * mypy pylint * update samples * remove version disclaimer * Beta LiveTests (#30728) * add bicep file for tests * update output * update test * secret sanitization * refactor failing test * update conftest * update assets and sanitizers * update preparer loc * update conftest * conftest * update conftest * remove variables for now * update assets * update tests * try to update regex * update recordings * update conftest * update preparer * update test * update exception test * update tests * update asset * update conftest * pr comments * default needs to be eastus * import * [EGv2] generate with newer emitter (#31962) * generate with newer emitter * update tsp * regen * update tests * update tspconfig * cspell * version * update serialization * update assets * update mypy --------- Co-authored-by: swathipil <76007337+swathipil@users.noreply.github.com> * [EGv2] Binary mode (#32922) * fix merge * dont go to generated before
binary * update patch * update patches * eventgrid client patch * changes * add * update test * update tyoe checking * pass through binary_mode for now -- * update patch aio * add async func * update * sys * update kwargs * add Todo and start adding more tests * update * differentiate between binary and not * update binary * no base64 in binary mode * binary * try JSONEncoder on everything if not str/bytes * update test * update test * update changes * whitespace * space * remove commented * str serialize extensions? * xml test * encode extensions as object * update test * update extension serialization for deserialize * move flag to operation level * extra comma * dont raise httpresponse * update patch * accept dict cloud events * spacing * remove content_type check * add live test * remove live test mark * update * use env vars * update test * only run live test * comment * typo * error incorrect * start comments * update test * add sample * update tests * update docstrings to add clarity * update err message * remove generated cloud event * update sample * update * update samples to include dict * update patch * spacing * add comments * formatting * update doc * update tests * update tests * tests * skip tests for now * typo * add dict binary mode * update docstring * update patch to allow throw error * first pass at comments * update patch eror * nit --------- Co-authored-by: swathipil <76007337+swathipil@users.noreply.github.com> * [EventGrid] Ignite Release generate with new typespec (#32652) * regen * new api version * samples for new features * update test-resources.json * update operation samples * add samples * more sample * update tests * add mros * gen * fix changelog * update tests * update preparer * point at canary until release * update test deployment area * update * add * try this tests * update samples * livetest mark * update tests * eastus working?
* regen - removed azure refs in gen code * update comments * add other sample * update * remove stream - no response * update version and date for release --------- Co-authored-by: swathipil <76007337+swathipil@users.noreply.github.com> * update for release * add changelog * [EG] Regenerate beta (#35014) * update generation * version * skip * [EG] Beta One Client (#34973) * [EG] dont hardcode api_version on request (#34965) * dont hardcode api_version on request * pylint fixes * revert * api version * Update sdk/eventgrid/azure-eventgrid/azure/eventgrid/aio/_publisher_client_async.py * Update sdk/eventgrid/azure-eventgrid/azure/eventgrid/aio/_publisher_client_async.py * add sync side patches * aio patches * update readme samples * all samples use EGClient * update readme samples * fix imports * import issue * missing pathc * allow sas cred * typo * updates * sas * client * self serialize cloudevent * add bakc in * updates * update ptach * update * update exception logic * async w client * aio [atch * typo * import * update links * tests * raise error * content type * use more fake url * remove content type * mypy * update apiversion * content type * unitttests * update auth * updates * add level * update readme * update * binary mode * args, kwargs * remove auth * add sample comments * testing * move around readme * content type * update tests * docstring * cncf event * add more tests * update doc * update inits to prevent typing errors * ran blakc * fix pylint patch * changes * add all kwargs * indent * reviews * nit * name changes * options * options/result rename * Revert "options" This reverts commit fe0623a51126ae474a8465ecc60ac5e6aadae5c3. * Revert "options/result rename" This reverts commit 6d374222f1fd763e197d280f43cb1a511a16a0c5. * fix tests * remove or None * remove EGPubClient * remove options naming * Revert "remove EGPubClient" This reverts commit bf943640407239ae568c81d7ee0558305c3e30f7. 
* typeerror * update readme * readme nit * readme updates * add send operation samples * add datacontenttpye * typo * make Options bag models kwargs * remove models * import * exception * update changelog * shorten operation names * nit * [EG] Docstring/update changelog (#35108) * nits * Revert "shorten operation names" This reverts commit cd37161b28e56bfa805c8a00615a7e5cac073ab0. * remove broken link * edit * update readme (#35147) * beta version * [EG] Readme updates (#35152) * simplify readme * update all links to feature branch * spacing * try * type error to value error (#35164) * [EG] rename release_delay (#35172) * rename * valueError * update version * version * [EG] regenerate to fix gen code bug (#35327) * regenerate to fix gen code bug * update serialization code * update * pylint * update faulty tests * use _patch * use _patch * add test type * fix test + add version * ver (#35345) * typo (#35348) * typo (#35351) * typo * update * [EG] Archboard Feedback (#35738) * regen * remove all samples/tests before fixing * move all topic/sub to client level * update * updates * update samples * add other publisher tests * missing * content type * consumerclient * upload consumer tests * updates * update * changes * updates * rename * update * patch * test update * update tests * fix * updates snippets * update readme * try updating api_version * typo Co-authored-by: swathipil <76007337+swathipil@users.noreply.github.com> * typo2 Co-authored-by: swathipil <76007337+swathipil@users.noreply.github.com> * renames/docs from comments * regen * update patch * remove import * caps --------- Co-authored-by: swathipil <76007337+swathipil@users.noreply.github.com> * [EG] Update tests (#35833) * [EG] Update tests (#35752) * test * typo * update recordings * mark live * kwarg fix * updates * revert * kwargs * continue skip * rename * naming * remove _async * nit * typo * remove async * remove print (#35855) * [EG] update pyproject and samples (#35857) * update * pylint 
changes * update readme + version * regen * Update scripts/devops_tasks/test_run_samples.py * readme * typo * update * add tests * updates tests 2 * Revert "updates tests 2" This reverts commit 1b85d9e9ce915bb8c735b45c19d5cf01e5b675c7. * try * typeError * missing await * unused import --------- Co-authored-by: swathipil <76007337+swathipil@users.noreply.github.com> --- .vscode/cspell.json | 5 + sdk/eventgrid/azure-eventgrid/CHANGELOG.md | 36 +- sdk/eventgrid/azure-eventgrid/MANIFEST.in | 6 +- sdk/eventgrid/azure-eventgrid/README.md | 279 ++- .../azure-eventgrid/azure/__init__.py | 2 +- .../azure/eventgrid/__init__.py | 25 +- .../azure/eventgrid/_client.py | 183 ++ .../azure/eventgrid/_configuration.py | 132 ++ .../azure/eventgrid/_event_mappings.py | 501 ----- .../azure/eventgrid/_legacy/__init__.py | 19 + .../eventgrid/{ => _legacy}/_constants.py | 0 .../eventgrid/_legacy/_event_mappings.py | 533 +++++ .../{ => _legacy}/_generated/__init__.py | 0 .../{ => _legacy}/_generated/_client.py | 0 .../_generated/_configuration.py | 0 .../_generated/_operations/__init__.py | 0 .../_generated/_operations/_operations.py | 35 +- .../_generated/_operations/_patch.py | 0 .../{ => _legacy}/_generated/_patch.py | 0 .../_generated/_serialization.py | 55 +- .../{ => _legacy}/_generated/_vendor.py | 0 .../{ => _legacy}/_generated/aio/__init__.py | 0 .../{ => _legacy}/_generated/aio/_client.py | 0 .../_generated/aio/_configuration.py | 0 .../_generated/aio/_operations/__init__.py | 0 .../_generated/aio/_operations/_operations.py | 3 +- .../_generated/aio/_operations/_patch.py | 0 .../{ => _legacy}/_generated/aio/_patch.py | 0 .../{ => _legacy}/_generated/aio/_vendor.py | 0 .../_generated/models/__init__.py | 0 .../_generated/models/_models.py | 0 .../{ => _legacy}/_generated/models/_patch.py | 0 .../{ => _legacy}/_generated/py.typed | 0 .../azure/eventgrid/{ => _legacy}/_helpers.py | 57 +- .../{ => _legacy}/_messaging_shared.py | 6 +- .../azure/eventgrid/{ => _legacy}/_models.py | 0 
.../eventgrid/{ => _legacy}/_policies.py | 4 +- .../{ => _legacy}/_publisher_client.py | 50 +- .../_signature_credential_policy.py | 0 .../azure/eventgrid/_legacy/_version.py | 12 + .../azure/eventgrid/_legacy/aio/__init__.py | 9 + .../aio/_publisher_client_async.py | 41 +- .../azure/eventgrid/_model_base.py | 887 ++++++++ .../azure/eventgrid/_operations/__init__.py | 21 + .../eventgrid/_operations/_operations.py | 1268 +++++++++++ .../azure/eventgrid/_operations/_patch.py | 351 +++ .../azure-eventgrid/azure/eventgrid/_patch.py | 153 ++ .../azure/eventgrid/_serialization.py | 1998 +++++++++++++++++ .../azure/eventgrid/_validation.py | 50 + .../azure/eventgrid/_vendor.py | 35 + .../azure/eventgrid/_version.py | 11 +- .../azure/eventgrid/aio/__init__.py | 28 +- .../azure/eventgrid/aio/_client.py | 191 ++ .../azure/eventgrid/aio/_configuration.py | 136 ++ .../eventgrid/aio/_operations/__init__.py | 21 + .../eventgrid/aio/_operations/_operations.py | 1055 +++++++++ .../azure/eventgrid/aio/_operations/_patch.py | 281 +++ .../azure/eventgrid/aio/_patch.py | 148 ++ .../azure/eventgrid/aio/_vendor.py | 35 + .../azure/eventgrid/models/__init__.py | 29 + .../azure/eventgrid/models/_enums.py | 25 + .../azure/eventgrid/models/_models.py | 370 +++ .../azure/eventgrid/models/_patch.py | 89 + .../azure-eventgrid/azure/eventgrid/py.typed | 1 + sdk/eventgrid/azure-eventgrid/mypy.ini | 10 +- sdk/eventgrid/azure-eventgrid/pyproject.toml | 4 +- .../sample_authentication_async.py | 9 +- .../sample_consume_process_events_async.py | 136 ++ ...le_publish_cloud_event_using_dict_async.py | 56 +- .../sample_publish_cncf_cloud_events_async.py | 51 +- ..._publish_custom_schema_to_a_topic_async.py | 8 +- ...ample_publish_eg_event_using_dict_async.py | 40 +- ...ple_publish_eg_events_to_a_domain_async.py | 42 +- ...mple_publish_eg_events_to_a_topic_async.py | 25 +- ...s_to_a_topic_using_sas_credential_async.py | 24 +- ...nts_using_cloud_events_1.0_schema_async.py | 25 +- 
.../sample_publish_to_channel_async.py | 32 +- .../consume_cloud_events_from_eventhub.py | 9 +- ...consume_cloud_events_from_storage_queue.py | 13 +- ...eventgrid_events_from_service_bus_queue.py | 8 +- .../EventGridTrigger1/__init__.py | 17 +- ...ish_cloud_events_to_custom_topic_sample.py | 14 +- ...ish_cloud_events_to_domain_topic_sample.py | 16 +- ...sh_custom_schema_events_to_topic_sample.py | 9 +- ...vent_grid_events_to_custom_topic_sample.py | 17 +- ...ish_with_shared_access_signature_sample.py | 16 +- .../sync_samples/sample_authentication.py | 2 +- .../sample_consume_process_events.py | 129 ++ .../sync_samples/sample_generate_sas.py | 4 +- .../sample_publish_cloud_event_using_dict.py | 50 +- .../sample_publish_cncf_cloud_events.py | 36 +- ...sample_publish_custom_schema_to_a_topic.py | 6 +- .../sample_publish_eg_event_using_dict.py | 40 +- .../sample_publish_eg_events_to_a_domain.py | 38 +- .../sample_publish_eg_events_to_a_topic.py | 20 +- ..._events_to_a_topic_using_sas_credential.py | 20 +- ...sh_events_using_cloud_events_1.0_schema.py | 20 +- sdk/eventgrid/azure-eventgrid/setup.py | 96 +- .../azure-eventgrid/swagger/_constants.py | 68 +- .../swagger/postprocess_eventnames.py | 9 +- sdk/eventgrid/azure-eventgrid/tests/_mocks.py | 96 +- .../azure-eventgrid/tests/conftest.py | 74 +- .../tests/eventgrid_preparer.py | 35 +- .../tests/perfstress_tests/send.py | 51 +- .../tests/test_cloud_event_tracing.py | 52 +- .../azure-eventgrid/tests/test_cncf_events.py | 16 +- .../tests/test_cncf_events_async.py | 15 +- .../tests/test_eg_consumer_client.py | 138 ++ .../tests/test_eg_consumer_client_async.py | 140 ++ .../tests/test_eg_event_get_bytes.py | 171 +- .../tests/test_eg_publisher_client.py | 319 +-- .../tests/test_eg_publisher_client_async.py | 306 +-- .../azure-eventgrid/tests/test_exceptions.py | 43 +- .../tests/test_exceptions_async.py | 46 +- .../tests/test_serialization.py | 121 +- .../azure-eventgrid/tsp-location.yaml | 4 + 
sdk/eventgrid/test-resources.json | 88 +- sdk/eventgrid/tests.yml | 1 + 118 files changed, 10260 insertions(+), 1751 deletions(-) create mode 100644 sdk/eventgrid/azure-eventgrid/azure/eventgrid/_client.py create mode 100644 sdk/eventgrid/azure-eventgrid/azure/eventgrid/_configuration.py delete mode 100644 sdk/eventgrid/azure-eventgrid/azure/eventgrid/_event_mappings.py create mode 100644 sdk/eventgrid/azure-eventgrid/azure/eventgrid/_legacy/__init__.py rename sdk/eventgrid/azure-eventgrid/azure/eventgrid/{ => _legacy}/_constants.py (100%) create mode 100644 sdk/eventgrid/azure-eventgrid/azure/eventgrid/_legacy/_event_mappings.py rename sdk/eventgrid/azure-eventgrid/azure/eventgrid/{ => _legacy}/_generated/__init__.py (100%) rename sdk/eventgrid/azure-eventgrid/azure/eventgrid/{ => _legacy}/_generated/_client.py (100%) rename sdk/eventgrid/azure-eventgrid/azure/eventgrid/{ => _legacy}/_generated/_configuration.py (100%) rename sdk/eventgrid/azure-eventgrid/azure/eventgrid/{ => _legacy}/_generated/_operations/__init__.py (100%) rename sdk/eventgrid/azure-eventgrid/azure/eventgrid/{ => _legacy}/_generated/_operations/_operations.py (95%) rename sdk/eventgrid/azure-eventgrid/azure/eventgrid/{ => _legacy}/_generated/_operations/_patch.py (100%) rename sdk/eventgrid/azure-eventgrid/azure/eventgrid/{ => _legacy}/_generated/_patch.py (100%) rename sdk/eventgrid/azure-eventgrid/azure/eventgrid/{ => _legacy}/_generated/_serialization.py (98%) rename sdk/eventgrid/azure-eventgrid/azure/eventgrid/{ => _legacy}/_generated/_vendor.py (100%) rename sdk/eventgrid/azure-eventgrid/azure/eventgrid/{ => _legacy}/_generated/aio/__init__.py (100%) rename sdk/eventgrid/azure-eventgrid/azure/eventgrid/{ => _legacy}/_generated/aio/_client.py (100%) rename sdk/eventgrid/azure-eventgrid/azure/eventgrid/{ => _legacy}/_generated/aio/_configuration.py (100%) rename sdk/eventgrid/azure-eventgrid/azure/eventgrid/{ => _legacy}/_generated/aio/_operations/__init__.py (100%) rename 
sdk/eventgrid/azure-eventgrid/azure/eventgrid/{ => _legacy}/_generated/aio/_operations/_operations.py (99%) rename sdk/eventgrid/azure-eventgrid/azure/eventgrid/{ => _legacy}/_generated/aio/_operations/_patch.py (100%) rename sdk/eventgrid/azure-eventgrid/azure/eventgrid/{ => _legacy}/_generated/aio/_patch.py (100%) rename sdk/eventgrid/azure-eventgrid/azure/eventgrid/{ => _legacy}/_generated/aio/_vendor.py (100%) rename sdk/eventgrid/azure-eventgrid/azure/eventgrid/{ => _legacy}/_generated/models/__init__.py (100%) rename sdk/eventgrid/azure-eventgrid/azure/eventgrid/{ => _legacy}/_generated/models/_models.py (100%) rename sdk/eventgrid/azure-eventgrid/azure/eventgrid/{ => _legacy}/_generated/models/_patch.py (100%) rename sdk/eventgrid/azure-eventgrid/azure/eventgrid/{ => _legacy}/_generated/py.typed (100%) rename sdk/eventgrid/azure-eventgrid/azure/eventgrid/{ => _legacy}/_helpers.py (80%) rename sdk/eventgrid/azure-eventgrid/azure/eventgrid/{ => _legacy}/_messaging_shared.py (92%) rename sdk/eventgrid/azure-eventgrid/azure/eventgrid/{ => _legacy}/_models.py (100%) rename sdk/eventgrid/azure-eventgrid/azure/eventgrid/{ => _legacy}/_policies.py (94%) rename sdk/eventgrid/azure-eventgrid/azure/eventgrid/{ => _legacy}/_publisher_client.py (90%) rename sdk/eventgrid/azure-eventgrid/azure/eventgrid/{ => _legacy}/_signature_credential_policy.py (100%) create mode 100644 sdk/eventgrid/azure-eventgrid/azure/eventgrid/_legacy/_version.py create mode 100644 sdk/eventgrid/azure-eventgrid/azure/eventgrid/_legacy/aio/__init__.py rename sdk/eventgrid/azure-eventgrid/azure/eventgrid/{ => _legacy}/aio/_publisher_client_async.py (90%) create mode 100644 sdk/eventgrid/azure-eventgrid/azure/eventgrid/_model_base.py create mode 100644 sdk/eventgrid/azure-eventgrid/azure/eventgrid/_operations/__init__.py create mode 100644 sdk/eventgrid/azure-eventgrid/azure/eventgrid/_operations/_operations.py create mode 100644 sdk/eventgrid/azure-eventgrid/azure/eventgrid/_operations/_patch.py 
create mode 100644 sdk/eventgrid/azure-eventgrid/azure/eventgrid/_patch.py create mode 100644 sdk/eventgrid/azure-eventgrid/azure/eventgrid/_serialization.py create mode 100644 sdk/eventgrid/azure-eventgrid/azure/eventgrid/_validation.py create mode 100644 sdk/eventgrid/azure-eventgrid/azure/eventgrid/_vendor.py create mode 100644 sdk/eventgrid/azure-eventgrid/azure/eventgrid/aio/_client.py create mode 100644 sdk/eventgrid/azure-eventgrid/azure/eventgrid/aio/_configuration.py create mode 100644 sdk/eventgrid/azure-eventgrid/azure/eventgrid/aio/_operations/__init__.py create mode 100644 sdk/eventgrid/azure-eventgrid/azure/eventgrid/aio/_operations/_operations.py create mode 100644 sdk/eventgrid/azure-eventgrid/azure/eventgrid/aio/_operations/_patch.py create mode 100644 sdk/eventgrid/azure-eventgrid/azure/eventgrid/aio/_patch.py create mode 100644 sdk/eventgrid/azure-eventgrid/azure/eventgrid/aio/_vendor.py create mode 100644 sdk/eventgrid/azure-eventgrid/azure/eventgrid/models/__init__.py create mode 100644 sdk/eventgrid/azure-eventgrid/azure/eventgrid/models/_enums.py create mode 100644 sdk/eventgrid/azure-eventgrid/azure/eventgrid/models/_models.py create mode 100644 sdk/eventgrid/azure-eventgrid/azure/eventgrid/models/_patch.py create mode 100644 sdk/eventgrid/azure-eventgrid/samples/async_samples/sample_consume_process_events_async.py create mode 100644 sdk/eventgrid/azure-eventgrid/samples/sync_samples/sample_consume_process_events.py create mode 100644 sdk/eventgrid/azure-eventgrid/tests/test_eg_consumer_client.py create mode 100644 sdk/eventgrid/azure-eventgrid/tests/test_eg_consumer_client_async.py create mode 100644 sdk/eventgrid/azure-eventgrid/tsp-location.yaml diff --git a/.vscode/cspell.json b/.vscode/cspell.json index f324f4cc38f1..ccb1a4445f07 100644 --- a/.vscode/cspell.json +++ b/.vscode/cspell.json @@ -200,6 +200,7 @@ "centraluseuap", "creds", "ctoring", + "ctxt", "ctypes", "curr", "dateutil", @@ -209,6 +210,7 @@ "dependened", "deque", 
"deserialization", + "deserializers", "disablecov", "distilbert", "distilroberta", @@ -297,6 +299,8 @@ "mibps", "mgmt", "mhsm", + "mros", + "Nify", "mipsle", "mktime", "mlindex", @@ -449,6 +453,7 @@ "yarl", "SDDL", "dacl", + "wday", "whls", "aiter", "solft", diff --git a/sdk/eventgrid/azure-eventgrid/CHANGELOG.md b/sdk/eventgrid/azure-eventgrid/CHANGELOG.md index ef1aeefa75ad..2e65032ffaa0 100644 --- a/sdk/eventgrid/azure-eventgrid/CHANGELOG.md +++ b/sdk/eventgrid/azure-eventgrid/CHANGELOG.md @@ -1,14 +1,32 @@ # Release History -## 4.19.1 (Unreleased) +## 4.20.0 (Unreleased) ### Features Added - ### Breaking Changes +### Bugs Fixed +### Other Changes + +## 4.20.0b2 (2024-04-25) + +This is a Beta of the EventGridClient ### Bugs Fixed -### Other Changes +- Fixed serialization issues with CloudEvent and CNCF Cloud Event + +## 4.20.0b1 (2024-04-11) + +### Features Added + +- This is a Beta of the EventGridClient + - EventGridClient `send` can be used for both Event Grid Namespace Resources and Event Grid Basic Resources. + - Added a kwarg `level` in the EventGridClient constructor. The default value is `Standard` which creates a client for an Event Grid Namespace Resource. + +### Breaking Changes + +- Removed the `AcknowledgeOptions`,`ReleaseOptions`, `RejectOptions`, and `RenewLockOptions` models. `lock_tokens` can now be specified as a `kwarg` on the operation. +- Renamed `publish_cloud_events` to `send`. ## 4.19.0 (2024-04-10) @@ -16,7 +34,7 @@ - Added new enum values to `SystemEventNames` related to Azure Communication Services. -### Bugs Fixed +### Breaking Changes - Fixed a bug where the Api Version was being hardcoded to `2018-01-01` on any request sent to the service. @@ -32,6 +50,15 @@ This version and all future versions will require Python 3.8+. ### Features Added - Added new enums values to `SystemEventNames` related to Azure Storage and Azure VMware Solution. 
+## 4.17.0b1 (2023-11-09) + +### Features Added + +- Beta EventGridClient features were added on top of the last GA version of EventGrid. + - Added new features to the EventGridClient that support `publish_cloud_events`, `receive_cloud_events`, `acknowledge_cloud_events`, `release_cloud_events`, and `reject_cloud_events` operations. These features include a `renew_cloud_event_locks` operation, as well as a `release_with_delay` parameter on the `release_cloud_events` operation. + - The `lock_tokens` parameter in `reject_cloud_events`, `release_cloud_events`, and `acknowledge_cloud_events` was renamed to `reject_options`, `release_options`, and `acknowledge_options`. + - The `binary_mode` keyword argument on `publish_cloud_events` was added to allow for binary mode support when publishing single Cloud Events. + - Added new models to support these new operations on EventGridClient. ## 4.16.0 (2023-11-08) @@ -55,6 +82,7 @@ This version and all future versions will require Python 3.8+. ### Features Added +- Beta EventGridClient features were removed for this and future GA versions. - Added new enum values to `SystemEventNames` related to Azure Container Services. 
## 4.12.0b1 (2023-05-22) diff --git a/sdk/eventgrid/azure-eventgrid/MANIFEST.in b/sdk/eventgrid/azure-eventgrid/MANIFEST.in index a4f0c46bcd94..8aee6afa5284 100644 --- a/sdk/eventgrid/azure-eventgrid/MANIFEST.in +++ b/sdk/eventgrid/azure-eventgrid/MANIFEST.in @@ -1,6 +1,6 @@ -recursive-include tests *.py *.yaml -recursive-include samples *.py include *.md include LICENSE -include azure/__init__.py include azure/eventgrid/py.typed +recursive-include tests *.py +recursive-include samples *.py *.md +include azure/__init__.py \ No newline at end of file diff --git a/sdk/eventgrid/azure-eventgrid/README.md b/sdk/eventgrid/azure-eventgrid/README.md index 5771e39aba6f..f6d8bba39a6e 100644 --- a/sdk/eventgrid/azure-eventgrid/README.md +++ b/sdk/eventgrid/azure-eventgrid/README.md @@ -10,12 +10,24 @@ Azure Event Grid is a fully-managed intelligent event routing service that allow | [Samples][python-eg-samples] | [Changelog][python-eg-changelog] +## _Disclaimer_ + +This is a GA release of Azure Event Grid's `EventGridPublisherClient` and `EventGridConsumerClient`. `EventGridPublisherClient` supports `send` for Event Grid Basic and Event Grid Namespaces. `EventGridConsumerClient` supports `receive`, `acknowledge`, `release`, `reject`, and `renew_locks` operations for Event Grid Namespaces. Please refer to the [samples](https://github.com/Azure/azure-sdk-for-python/tree/main/sdk/eventgrid/azure-eventgrid/samples) for further information. + ## Getting started ### Prerequisites -* Python 3.7 or later is required to use this package. -* You must have an [Azure subscription][azure_subscription] and an Event Grid Topic resource to use this package. Follow this [step-by-step tutorial](https://docs.microsoft.com/azure/event-grid/custom-event-quickstart-portal) to register the Event Grid resource provider and create Event Grid topics using the [Azure portal](https://portal.azure.com/). 
There is a [similar tutorial](https://docs.microsoft.com/azure/event-grid/custom-event-quickstart) using [Azure CLI](https://docs.microsoft.com/cli/azure). +* Python 3.8 or later is required to use this package. +* You must have an [Azure subscription][azure_subscription] and at least one of the following: + * an Event Grid Namespace resource. To create an Event Grid Namespace resource follow [this tutorial](https://learn.microsoft.com/azure/event-grid/create-view-manage-namespaces). + * an Event Grid Basic resource. To create an Event Grid Basic resource via the Azure portal follow this [step-by-step tutorial](https://docs.microsoft.com/azure/event-grid/custom-event-quickstart-portal). To create an Event Grid Basic resource via the [Azure CLI](https://docs.microsoft.com/cli/azure) follow this [tutorial](https://docs.microsoft.com/azure/event-grid/custom-event-quickstart) + +### Event Grid Resources +Azure Event Grid Namespaces supports both pull and push delivery. Azure Event Grid Basic supports only push delivery. +More information on the two resource types can be found [here](https://learn.microsoft.com/azure/event-grid/choose-right-tier). + +**Note:** Azure Event Grid Namespaces only supports the Cloud Event v1.0 Schema. ### Install the package Install the Azure Event Grid client library for Python with [pip][pip]: @@ -24,31 +36,52 @@ Install the Azure Event Grid client library for Python with [pip][pip]: pip install azure-eventgrid ``` -* An existing Event Grid topic or domain is required. You can create the resource using [Azure Portal][azure_portal_create_EG_resource] or [Azure CLI][azure_cli_link] +* An existing Event Grid Basic topic or domain, or Event Grid Namespace topic is required. You can create the resource using [Azure Portal][azure_portal_create_EG_resource] or [Azure CLI][azure_cli_link] If you use Azure CLI, replace `` and `` with your own unique names. 
-#### Create an Event Grid Topic +#### Create an Event Grid Namespace ``` -az eventgrid topic --create --location --resource-group --name +az eventgrid namespace create --location --resource-group --name ``` -#### Create an Event Grid Domain +#### Create an Event Grid Namespace Topic ``` -az eventgrid domain --create --location --resource-group --name +az eventgrid namespace topic create --resource-group --namespace-name --name ``` ### Authenticate the client In order to interact with the Event Grid service, you will need to create an instance of a client. An **endpoint** and **credential** are necessary to instantiate the client object. + +The default EventGridPublisherClient created is compatible with an Event Grid Basic Resource. To create an Event Grid Namespace compatible client, specify `namespace_topic="YOUR_TOPIC_NAME"` when instantiating the client. + +```python +# Event Grid Namespace client +client = EventGridPublisherClient(endpoint, credential, namespace_topic=YOUR_TOPIC_NAME) + +# Event Grid Basic Client +client = EventGridPublisherClient(endpoint, credential) +``` + +EventGridConsumerClient only supports Event Grid Namespaces. +```python +# Event Grid Namespace Client +client = EventGridConsumerClient(endpoint, credential, namespace_topic=YOUR_TOPIC_NAME, subscription=YOUR_SUBSCRIPTION_NAME) +``` + +#### Using Azure Active Directory (AAD) Azure Event Grid provides integration with Azure Active Directory (Azure AD) for identity-based authentication of requests. With Azure AD, you can use role-based access control (RBAC) to grant access to your Azure Event Grid resources to users, groups, or applications. -To send events to a topic or domain with a `TokenCredential`, the authenticated identity should have the "EventGrid Data Sender" role assigned. +To send events to a topic or domain with a `TokenCredential`, the authenticated identity should have the "Event Grid Data Sender" role assigned. 
+To receive events from a topic event subscription with a `TokenCredential`, the authenticated identity should have the "Event Grid Data Receiver" role assigned. +To send and receive events to/from a topic with a `TokenCredential`, the authenticated identity should have the "Event Grid Data Contributor" role assigned. + +More about RBAC setup can be found [here](https://learn.microsoft.com/azure/role-based-access-control/role-assignments-steps). With the `azure-identity` package, you can seamlessly authorize requests in both development and production environments. To learn more about Azure Active Directory, see the [`azure-identity` README](https://github.com/Azure/azure-sdk-for-python/blob/main/sdk/identity/azure-identity/README.md). @@ -67,11 +100,17 @@ client = EventGridPublisherClient(endpoint, default_az_credential) -#### Looking up the endpoint +### Looking up the endpoint + +#### Event Grid Namespace +You can find the Namespace endpoint within the Event Grid Namespace resource on the Azure portal. This will look like: +`"..eventgrid.azure.net"` + +#### Event Grid Basic You can find the topic endpoint within the Event Grid Topic resource on the Azure portal. This will look like: `"https://..eventgrid.azure.net/api/events"` -#### Create the client with AzureKeyCredential +### Create the client with AzureKeyCredential To use an Access key as the `credential` parameter, pass the key as a string into an instance of [AzureKeyCredential][azure-key-credential]. @@ -94,50 +133,39 @@ client = EventGridPublisherClient(endpoint, credential_key) -> **Note:** A client may also be authenticated via SAS signature, using the `AzureSasCredential`. A sample demonstrating this, is available [here][python-eg-sample-send-using-sas] ([async_version][python-eg-sample-send-using-sas-async]). +> **Note:** A Basic client may also be authenticated via SAS signature, using the `AzureSasCredential`. 
A sample demonstrating this is available [here][python-eg-sample-send-using-sas] ([async_version][python-eg-sample-send-using-sas-async]). > **Note:** The `generate_sas` method can be used to generate a shared access signature. A sample demonstrating this can be seen [here][python-eg-generate-sas]. ## Key concepts -### Topic -A **[topic](https://docs.microsoft.com/azure/event-grid/concepts#topics)** is a channel within the EventGrid service to send events. The event schema that a topic accepts is decided at topic creation time. If events of a schema type are sent to a topic that requires a different schema type, errors will be raised. +### Event Grid Namespace -### Domain -An event **[domain](https://docs.microsoft.com/azure/event-grid/event-domains)** is a management tool for large numbers of Event Grid topics related to the same application. They allow you to publish events to thousands of topics. Domains also give you authorization and authentication control over each topic. For more information, visit [Event domain overview](https://docs.microsoft.com/azure/event-grid/event-domains). - -When you create an event domain, a publishing endpoint for this domain is made available to you. This process is similar to creating an Event Grid Topic. The only difference is that, when publishing to a domain, you must specify the topic within the domain that you'd like the event to be delivered to. +A **[namespace](https://learn.microsoft.com/azure/event-grid/concepts-event-grid-namespaces#namespaces)** is a management container for other resources. It allows for grouping of related resources in order to manage them under one subscription. -### Event schemas -An **[event](https://docs.microsoft.com/azure/event-grid/concepts#events)** is the smallest amount of information that fully describes something that happened in the system. When a custom topic or domain is created, you must specify the schema that will be used when publishing events. 
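As a companion to the `generate_sas` note above, here is a standard-library sketch of what an Event Grid shared access signature looks like. The `r=<resource>&e=<expiry>&s=<signature>` shape and the HMAC-SHA256 signing step are stated here as assumptions for illustration only; real code should call `azure.eventgrid.generate_sas` rather than this hypothetical `sketch_generate_sas` helper.

```python
# Hypothetical sketch of an Event Grid-style shared access signature.
# The SDK's azure.eventgrid.generate_sas is the supported API; this only
# illustrates the assumed r=<resource>&e=<expiry>&s=<signature> shape.
import base64
import hashlib
import hmac
import urllib.parse
from datetime import datetime, timedelta, timezone


def sketch_generate_sas(endpoint: str, shared_access_key: str, expiry: datetime) -> str:
    resource = urllib.parse.quote_plus(endpoint)
    expiry_str = urllib.parse.quote_plus(expiry.strftime("%m/%d/%Y %H:%M:%S"))
    unsigned = f"r={resource}&e={expiry_str}"
    # Sign the resource+expiry string with the base64-decoded access key.
    key = base64.b64decode(shared_access_key)
    signature = base64.b64encode(
        hmac.new(key, unsigned.encode("utf-8"), hashlib.sha256).digest()
    ).decode("utf-8")
    return f"{unsigned}&s={urllib.parse.quote_plus(signature)}"


sas = sketch_generate_sas(
    "https://contoso.eastus-1.eventgrid.azure.net/api/events",  # placeholder endpoint
    base64.b64encode(b"not-a-real-key").decode(),               # placeholder key
    datetime.now(timezone.utc) + timedelta(hours=1),
)
```

The resulting string can then be wrapped in an `AzureSasCredential`, as the SAS samples linked above demonstrate with the real `generate_sas`.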
- -Event Grid supports multiple schemas for encoding events. +#### Namespace Topic -#### Event Grid schema -While you may configure your topic to use a [custom schema](https://docs.microsoft.com/azure/event-grid/input-mappings), it is more common to use the already-defined Event Grid schema. See the specifications and requirements [here](https://docs.microsoft.com/azure/event-grid/event-schema). +A **[namespace topic](https://learn.microsoft.com/azure/event-grid/concepts-event-grid-namespaces#namespace-topics)** is a topic that is created within an Event Grid namespace. The client publishes events to an HTTP namespace endpoint specifying a namespace topic where published events are logically contained. A namespace topic only supports the CloudEvent v1.0 schema. -#### CloudEvents v1.0 schema -Another option is to use the CloudEvents v1.0 schema. [CloudEvents](https://cloudevents.io/) is a Cloud Native Computing Foundation project which produces a specification for describing event data in a common way. The service summary of CloudEvents can be found [here](https://docs.microsoft.com/azure/event-grid/cloud-event-schema). +#### Event Subscription -### EventGridPublisherClient -`EventGridPublisherClient` provides operations to send event data to a topic hostname specified during client initialization. +An **[event subscription](https://learn.microsoft.com/azure/event-grid/concepts-event-grid-namespaces#event-subscriptions)** is a configuration resource associated with a single topic. -Regardless of the schema that your topic or domain is configured to use, `EventGridPublisherClient` will be used to publish events to it. Use the `send` method publishing events. -The following formats of events are allowed to be sent: -- A list or a single instance of strongly typed EventGridEvents. -- A dict representation of a serialized EventGridEvent object. -- A list or a single instance of strongly typed CloudEvents. -- A dict representation of a serialized CloudEvent object. 
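Since a namespace topic accepts only the CloudEvent v1.0 schema, it can help to see the envelope fields that schema requires. The following is a minimal, standard-library sketch of a CloudEvents v1.0 JSON payload; the `source`, `type`, and `data` values are made-up examples, and in real code the SDK serializes `azure.core.messaging.CloudEvent` objects for you.

```python
# Minimal CloudEvents v1.0 JSON envelope built with the standard library.
# Field values below are illustrative; specversion, id, source, and type
# are the required context attributes in the CloudEvents v1.0 spec.
import json
import uuid
from datetime import datetime, timezone

cloud_event = {
    "specversion": "1.0",                  # required: CloudEvents schema version
    "id": str(uuid.uuid4()),               # required: unique per event
    "source": "/contoso/items",            # required: origin of the event
    "type": "Contoso.Items.ItemReceived",  # required: kind of event
    "time": datetime.now(timezone.utc).isoformat(),
    "datacontenttype": "application/json",
    "data": {"itemSku": "Contoso Item SKU #1"},
}

# Events are typically published as a JSON array, even for a single event.
payload = json.dumps([cloud_event])
```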
+### Event Grid Basic -- A dict representation of any Custom Schema. +#### Topic +A **[topic](https://docs.microsoft.com/azure/event-grid/concepts#topics)** is a channel within the Event Grid service to send events. The event schema that a topic accepts is decided at topic creation time. If events of a schema type are sent to a topic that requires a different schema type, errors will be raised. -Please have a look at the [samples](https://github.com/Azure/azure-sdk-for-python/tree/main/sdk/eventgrid/azure-eventgrid/samples) for detailed examples. +#### Domain +An event **[domain](https://docs.microsoft.com/azure/event-grid/event-domains)** is a management tool for large numbers of Event Grid topics related to the same application. They allow you to publish events to thousands of topics. Domains also give you authorization and authentication control over each topic. For more information, visit [Event domain overview](https://docs.microsoft.com/azure/event-grid/event-domains). +#### Event schemas +An **[event](https://docs.microsoft.com/azure/event-grid/concepts#events)** is the smallest amount of information that fully describes something that happened in the system. When a custom topic or domain is created, you must specify the schema that will be used when publishing events. - **Note:** It is important to know if your topic supports CloudEvents or EventGridEvents before publishing. If you send to a topic that does not support the schema of the event you are sending, send() will throw an exception. +Event Grid supports multiple schemas for encoding events. -### System Topics +#### System Topics A **[system topic](https://docs.microsoft.com/azure/event-grid/system-topics)** in Event Grid represents one or more events published by Azure services such as Azure Storage or Azure Event Hubs. For example, a system topic may represent all blob events or only blob creation and blob deletion events published for a specific storage account. 
The names of the various event types for the system events published to Azure Event Grid are available in `azure.eventgrid.SystemEventNames`. @@ -145,58 +173,38 @@ For complete list of recognizable system topics, visit [System Topics](https://d For more information about the key concepts on Event Grid, see [Concepts in Azure Event Grid][publisher-service-doc]. -## Event Grid on Kubernetes with Azure Arc +## EventGridPublisherClient -Event Grid on Kubernetes with Azure Arc is an offering that allows you to run Event Grid on your own Kubernetes cluster. This capability is enabled by the use of Azure Arc enabled Kubernetes. Through Azure Arc enabled Kubernetes, a supported Kubernetes cluster connects to Azure. Once connected, you are able to install Event Grid on it. Learn more about it [here](https://docs.microsoft.com/azure/event-grid/kubernetes/overview). +`EventGridPublisherClient` provides operations to send event data to a resource specified during client initialization. -### Support for CNCF Cloud Events +If you are using Event Grid Basic, regardless of the schema that your topic or domain is configured to use, `EventGridPublisherClient` will be used to publish events to it. Use the `send` method to publish events. -Starting with v4.7.0, this package also supports publishing a CNCF cloud event from https://pypi.org/project/cloudevents/. You would be able to pass a CloudEvent object from this library to the `send` API. +The following formats of events are allowed to be sent to an Event Grid Basic resource: +- A list or a single instance of strongly typed EventGridEvents. +- A dict representation of a serialized EventGridEvent object. +- A list or a single instance of strongly typed CloudEvents. +- A dict representation of a serialized CloudEvent object. -```python +- A dict representation of any Custom Schema. 
-from cloudevents.http import CloudEvent +The following formats of events are allowed to be sent to an Event Grid Namespace resource, when a namespace topic is specified: -event = CloudEvent(...) +* A list or a single instance of strongly typed CloudEvents. +* A dict representation of a serialized CloudEvent object. -client.send(event) -``` +Please have a look at the [samples](https://github.com/Azure/azure-sdk-for-python/tree/main/sdk/eventgrid/azure-eventgrid/samples) for detailed examples. + +## Event Grid on Kubernetes with Azure Arc + +Event Grid on Kubernetes with Azure Arc is an offering that allows you to run Event Grid on your own Kubernetes cluster. This capability is enabled by the use of Azure Arc enabled Kubernetes. Through Azure Arc enabled Kubernetes, a supported Kubernetes cluster connects to Azure. Once connected, you are able to install Event Grid on it. Learn more about it [here](https://docs.microsoft.com/azure/event-grid/kubernetes/overview). ## Examples The following sections provide several code snippets covering some of the most common Event Grid tasks, including: -* [Send an Event Grid Event](#send-an-event-grid-event) * [Send a Cloud Event](#send-a-cloud-event) * [Send Multiple Events](#send-multiple-events) -* [Send events as Dictionaries](#send-events-as-dictionaries) -* [Consume a payload from storage queue](#consume-from-storage-queue) -* [Consume from ServiceBus](#consume-from-servicebus) - -### Send an Event Grid Event - -This example publishes an Event Grid event. 
- -```python -import os -from azure.core.credentials import AzureKeyCredential -from azure.eventgrid import EventGridPublisherClient, EventGridEvent - -key = os.environ["EG_ACCESS_KEY"] -endpoint = os.environ["EG_TOPIC_HOSTNAME"] - -event = EventGridEvent( - data={"team": "azure-sdk"}, - subject="Door1", - event_type="Azure.Sdk.Demo", - data_version="2.0" -) - -credential = AzureKeyCredential(key) -client = EventGridPublisherClient(endpoint, credential) - -client.send(event) -``` +* [Receive and Process Events from Namespace](#receive-and-process-events-from-namespace) ### Send a Cloud Event @@ -208,8 +216,10 @@ from azure.core.credentials import AzureKeyCredential from azure.core.messaging import CloudEvent from azure.eventgrid import EventGridPublisherClient -key = os.environ["CLOUD_ACCESS_KEY"] -endpoint = os.environ["CLOUD_TOPIC_HOSTNAME"] +key = os.environ["EVENTGRID_KEY"] +endpoint = os.environ["EVENTGRID_ENDPOINT"] +topic_name = os.environ["EVENTGRID_TOPIC_NAME"] + event = CloudEvent( type="Azure.Sdk.Sample", @@ -218,7 +228,7 @@ event = CloudEvent( ) credential = AzureKeyCredential(key) -client = EventGridPublisherClient(endpoint, credential) +client = EventGridPublisherClient(endpoint, credential, namespace_topic=topic_name) client.send(event) ``` @@ -235,8 +245,9 @@ from azure.core.credentials import AzureKeyCredential from azure.core.messaging import CloudEvent from azure.eventgrid import EventGridPublisherClient -key = os.environ["CLOUD_ACCESS_KEY"] -endpoint = os.environ["CLOUD_TOPIC_HOSTNAME"] +key = os.environ["EVENTGRID_KEY"] +endpoint = os.environ["EVENTGRID_ENDPOINT"] +topic_name = os.environ["EVENTGRID_TOPIC_NAME"] event0 = CloudEvent( type="Azure.Sdk.Sample", @@ -252,95 +263,70 @@ event1 = CloudEvent( events = [event0, event1] credential = AzureKeyCredential(key) -client = EventGridPublisherClient(endpoint, credential) +client = EventGridPublisherClient(endpoint, credential, namespace_topic=topic_name) client.send(events) ``` -### Send events as 
dictionaries - -A dict representation of respective serialized models can also be used to publish CloudEvent(s) or EventGridEvent(s) apart from the strongly typed objects. +### Receive and Process Events from Namespace -Use a dict-like representation to send to a topic with custom schema as shown below. +Use EventGridConsumerClient's `receive` method to receive CloudEvents from a Namespace event subscription, then acknowledge, reject, release, or renew the locks. ```python import os import uuid import datetime as dt -from msrest.serialization import UTC from azure.core.credentials import AzureKeyCredential -from azure.eventgrid import EventGridPublisherClient - -key = os.environ["CUSTOM_SCHEMA_ACCESS_KEY"] -endpoint = os.environ["CUSTOM_SCHEMA_TOPIC_HOSTNAME"] +from azure.eventgrid import EventGridConsumerClient -event = custom_schema_event = { - "customSubject": "sample", - "customEventType": "sample.event", - "customDataVersion": "2.0", - "customId": uuid.uuid4(), - "customEventTime": dt.datetime.now(UTC()).isoformat(), - "customData": "sample data" - } +key = os.environ["EVENTGRID_KEY"] +endpoint = os.environ["EVENTGRID_ENDPOINT"] +topic_name = os.environ["EVENTGRID_TOPIC_NAME"] +sub_name = os.environ["EVENTGRID_EVENT_SUBSCRIPTION_NAME"] credential = AzureKeyCredential(key) -client = EventGridPublisherClient(endpoint, credential) +client = EventGridConsumerClient(endpoint, credential, namespace_topic=topic_name, subscription=sub_name) -client.send(event) -``` +events = client.receive(max_events=4) -### Consume from storage queue +release_events = [] +acknowledge_events = [] +reject_events = [] +for detail in events.value: + data = detail.event.data + broker_properties = detail.broker_properties + if data == "release": + release_events.append(broker_properties.lock_token) + elif data == "acknowledge": + acknowledge_events.append(broker_properties.lock_token) + else: + reject_events.append(broker_properties.lock_token) -This example consumes a message received from storage queue and deserializes it to a CloudEvent object. 
+# Renew all locks before settling +renew_tokens = [detail.broker_properties.lock_token for detail in events.value] +renew_result = client.renew_locks( + lock_tokens=renew_tokens, +) -```python -from azure.core.messaging import CloudEvent -from azure.storage.queue import QueueServiceClient, BinaryBase64DecodePolicy -import os -import json - -# all types of CloudEvents below produce same DeserializedEvent -connection_str = os.environ['STORAGE_QUEUE_CONN_STR'] -queue_name = os.environ['STORAGE_QUEUE_NAME'] -with QueueServiceClient.from_connection_string(connection_str) as qsc: - payload = qsc.get_queue_client( - queue=queue_name, - message_decode_policy=BinaryBase64DecodePolicy() - ).peek_messages() - - ## deserialize payload into a list of typed Events - events = [CloudEvent.from_dict(json.loads(msg.content)) for msg in payload] -``` - -### Consume from servicebus - -This example consumes a payload message received from ServiceBus and deserializes it to an EventGridEvent object. - -```python -from azure.eventgrid import EventGridEvent -from azure.servicebus import ServiceBusClient -import os -import json +release_result = client.release( + lock_tokens=release_events, +) - -# all types of EventGridEvents below produce same DeserializedEvent -connection_str = os.environ['SERVICE_BUS_CONN_STR'] -queue_name = os.environ['SERVICE_BUS_QUEUE_NAME'] +ack_result = client.acknowledge( + lock_tokens=acknowledge_events, +) - -with ServiceBusClient.from_connection_string(connection_str) as sb_client: - payload = sb_client.get_queue_receiver(queue_name).receive_messages() +reject_result = client.reject( + lock_tokens=reject_events, +) - - ## deserialize payload into a list of typed Events - events = [EventGridEvent.from_dict(json.loads(next(msg.body).decode('utf-8'))) for msg in payload] ``` -## Distributed Tracing with EventGrid +## Distributed Tracing with Event Grid -You can use OpenTelemetry for Python as usual with EventGrid since it's compatible with azure-core tracing integration. 
+You can use OpenTelemetry for Python as usual with Event Grid since it's compatible with azure-core tracing integration. Here is an example of using OpenTelemetry to trace sending a CloudEvent. -First, set OpenTelemetry as enabled tracing plugin for EventGrid. +First, set OpenTelemetry as enabled tracing plugin for Event Grid. ```python from azure.core.settings import settings @@ -368,7 +354,7 @@ trace.get_tracer_provider().add_span_processor( ) ``` -Once the `tracer` and `exporter` are set, please follow the example below to start collecting traces while using the `send` method from the `EventGridPublisherClient` to send a CloudEvent object. +Once the `tracer` and `exporter` are set, please follow the example below to start collecting traces while using the `send` method from the `EventGridClient` to send a CloudEvent object. ```python import os @@ -415,13 +401,15 @@ The following section provides several code snippets illustrating common pattern These code samples show common champion scenario operations with the Azure Event Grid client library. 
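The receive-and-settle sample earlier in the README routes lock tokens into release/acknowledge/reject buckets before calling the settlement operations. That routing logic can be exercised offline; the `ReceiveDetails`, `Event`, and `BrokerProperties` classes below are hypothetical stand-ins for the SDK models, used only so the bucketing can run without a live Event Grid namespace.

```python
# Offline sketch of the lock-token routing from the receive sample.
# These dataclasses are stand-ins for the SDK's models, not the real types.
from dataclasses import dataclass


@dataclass
class BrokerProperties:
    lock_token: str


@dataclass
class Event:
    data: str


@dataclass
class ReceiveDetails:
    event: Event
    broker_properties: BrokerProperties


def route_by_data(details):
    """Bucket lock tokens by the action encoded in each event's data."""
    buckets = {"release": [], "acknowledge": [], "reject": []}
    for detail in details:
        # Unknown actions fall through to reject, mirroring the sample's else branch.
        action = detail.event.data if detail.event.data in buckets else "reject"
        buckets[action].append(detail.broker_properties.lock_token)
    return buckets


sample = [
    ReceiveDetails(Event("acknowledge"), BrokerProperties("token-1")),
    ReceiveDetails(Event("release"), BrokerProperties("token-2")),
    ReceiveDetails(Event("unknown"), BrokerProperties("token-3")),
]
buckets = route_by_data(sample)
```

In real code, each bucket would then be passed as `lock_tokens` to the corresponding `release`, `acknowledge`, or `reject` call on `EventGridConsumerClient`.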
+#### Basic Event Grid Scenarios
+
 * Generate Shared Access Signature: [sample_generate_sas.py][python-eg-generate-sas]
 * Authenticate the client: [sample_authentication.py][python-eg-auth] ([async_version][python-eg-auth-async])
 * Publish events to a topic using SAS: [sample_publish_events_to_a_topic_using_sas_credential_async.py][python-eg-sample-send-using-sas] ([async_version][python-eg-sample-send-using-sas-async])
 * Publish Event Grid Events to a topic: [sample_publish_eg_events_to_a_topic.py][python-eg-sample-eg-event] ([async_version][python-eg-sample-eg-event-async])
-* Publish EventGrid Events to a domain topic: [sample_publish_eg_events_to_a_domain_topic.py][python-eg-sample-eg-event-to-domain] ([async_version][python-eg-sample-eg-event-to-domain-async])
+* Publish Event Grid Events to a domain topic: [sample_publish_eg_events_to_a_domain_topic.py][python-eg-sample-eg-event-to-domain] ([async_version][python-eg-sample-eg-event-to-domain-async])
 * Publish a Cloud Event: [sample_publish_events_using_cloud_events_1.0_schema.py][python-eg-sample-send-cloudevent] ([async_version][python-eg-sample-send-cloudevent-async])
 * Publish a Custom Schema: [sample_publish_custom_schema_to_a_topic.py][python-eg-publish-custom-schema] ([async_version][python-eg-publish-custom-schema-async])
@@ -451,7 +439,7 @@ This project has adopted the [Microsoft Open Source Code of Conduct][code_of_con
 [azure_cli_link]: https://pypi.org/project/azure-cli/
-[python-eg-src]: https://github.com/Azure/azure-sdk-for-python/blob/main/sdk/eventgrid/azure-eventgrid/
+[python-eg-src]: https://github.com/Azure/azure-sdk-for-python/tree/main/sdk/eventgrid/azure-eventgrid
 [python-eg-pypi]: https://pypi.org/project/azure-eventgrid
 [python-eg-product-docs]: https://docs.microsoft.com/azure/event-grid/overview
 [python-eg-ref-docs]: https://azuresdkdocs.blob.core.windows.net/$web/python/azure-eventgrid/latest/index.html
@@ -490,7 +478,8 @@ This project has adopted the [Microsoft Open Source Code of Conduct][code_of_con
 [python-eg-consume-samples]: https://github.com/Azure/azure-sdk-for-python/blob/main/sdk/eventgrid/azure-eventgrid/samples/consume_samples
 [python-eg-sample-consume-custom-payload]: https://github.com/Azure/azure-sdk-for-python/blob/main/sdk/eventgrid/azure-eventgrid/samples/sync_samples/sample_consume_custom_payload.py
+
 [cla]: https://cla.microsoft.com
 [code_of_conduct]: https://opensource.microsoft.com/codeofconduct/
 [coc_faq]: https://opensource.microsoft.com/codeofconduct/faq/
-[coc_contact]: mailto:opencode@microsoft.com
+[coc_contact]: mailto:opencode@microsoft.com
\ No newline at end of file
diff --git a/sdk/eventgrid/azure-eventgrid/azure/__init__.py b/sdk/eventgrid/azure-eventgrid/azure/__init__.py
index 0c36c2076ba0..d55ccad1f573 100644
--- a/sdk/eventgrid/azure-eventgrid/azure/__init__.py
+++ b/sdk/eventgrid/azure-eventgrid/azure/__init__.py
@@ -1 +1 @@
-__path__ = __import__('pkgutil').extend_path(__path__, __name__)  # type: ignore
+__path__ = __import__("pkgutil").extend_path(__path__, __name__)  # type: ignore
diff --git a/sdk/eventgrid/azure-eventgrid/azure/eventgrid/__init__.py b/sdk/eventgrid/azure-eventgrid/azure/eventgrid/__init__.py
index 1dc3655a13bb..7189e7b5a4f9 100644
--- a/sdk/eventgrid/azure-eventgrid/azure/eventgrid/__init__.py
+++ b/sdk/eventgrid/azure-eventgrid/azure/eventgrid/__init__.py
@@ -2,18 +2,27 @@
 # --------------------------------------------------------------------------
 # Copyright (c) Microsoft Corporation. All rights reserved.
 # Licensed under the MIT License. See License.txt in the project root for license information.
+# Code generated by Microsoft (R) Python Code Generator.
+# Changes may cause incorrect behavior and will be lost if the code is regenerated.
 # --------------------------------------------------------------------------
-from ._publisher_client import EventGridPublisherClient
-from ._event_mappings import SystemEventNames
-from ._helpers import generate_sas
-from ._models import EventGridEvent
+from ._patch import EventGridPublisherClient
+from ._patch import EventGridConsumerClient
 from ._version import VERSION
 
+__version__ = VERSION
+
+try:
+    from ._patch import __all__ as _patch_all
+    from ._patch import *  # pylint: disable=unused-wildcard-import
+except ImportError:
+    _patch_all = []
+from ._patch import patch_sdk as _patch_sdk
+
 __all__ = [
     "EventGridPublisherClient",
-    "EventGridEvent",
-    "generate_sas",
-    "SystemEventNames",
+    "EventGridConsumerClient",
 ]
-__version__ = VERSION
+__all__.extend([p for p in _patch_all if p not in __all__])
+
+_patch_sdk()
diff --git a/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_client.py b/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_client.py
new file mode 100644
index 000000000000..c6ce28f47f9a
--- /dev/null
+++ b/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_client.py
@@ -0,0 +1,183 @@
+# coding=utf-8
+# --------------------------------------------------------------------------
+# Copyright (c) Microsoft Corporation. All rights reserved.
+# Licensed under the MIT License. See License.txt in the project root for license information.
+# Code generated by Microsoft (R) Python Code Generator.
+# Changes may cause incorrect behavior and will be lost if the code is regenerated.
+# -------------------------------------------------------------------------- + +from copy import deepcopy +from typing import Any, TYPE_CHECKING, Union + +from azure.core import PipelineClient +from azure.core.credentials import AzureKeyCredential +from azure.core.pipeline import policies +from azure.core.rest import HttpRequest, HttpResponse + +from ._configuration import EventGridConsumerClientConfiguration, EventGridPublisherClientConfiguration +from ._operations import EventGridConsumerClientOperationsMixin, EventGridPublisherClientOperationsMixin +from ._serialization import Deserializer, Serializer + +if TYPE_CHECKING: + # pylint: disable=unused-import,ungrouped-imports + from azure.core.credentials import TokenCredential + + +class EventGridPublisherClient( + EventGridPublisherClientOperationsMixin +): # pylint: disable=client-accepts-api-version-keyword + """EventGridPublisherClient. + + :param endpoint: The host name of the namespace, e.g. + namespaceName1.westus-1.eventgrid.azure.net. Required. + :type endpoint: str + :param credential: Credential used to authenticate requests to the service. Is either a + AzureKeyCredential type or a TokenCredential type. Required. + :type credential: ~azure.core.credentials.AzureKeyCredential or + ~azure.core.credentials.TokenCredential + :keyword api_version: The API version to use for this operation. Default value is "2024-06-01". + Note that overriding this default value may result in unsupported behavior. 
+ :paramtype api_version: str + """ + + def __init__(self, endpoint: str, credential: Union[AzureKeyCredential, "TokenCredential"], **kwargs: Any) -> None: + _endpoint = "{endpoint}" + self._config = EventGridPublisherClientConfiguration(endpoint=endpoint, credential=credential, **kwargs) + _policies = kwargs.pop("policies", None) + if _policies is None: + _policies = [ + policies.RequestIdPolicy(**kwargs), + self._config.headers_policy, + self._config.user_agent_policy, + self._config.proxy_policy, + policies.ContentDecodePolicy(**kwargs), + self._config.redirect_policy, + self._config.retry_policy, + self._config.authentication_policy, + self._config.custom_hook_policy, + self._config.logging_policy, + policies.DistributedTracingPolicy(**kwargs), + policies.SensitiveHeaderCleanupPolicy(**kwargs) if self._config.redirect_policy else None, + self._config.http_logging_policy, + ] + self._client: PipelineClient = PipelineClient(base_url=_endpoint, policies=_policies, **kwargs) + + self._serialize = Serializer() + self._deserialize = Deserializer() + self._serialize.client_side_validation = False + + def send_request(self, request: HttpRequest, *, stream: bool = False, **kwargs: Any) -> HttpResponse: + """Runs the network request through the client's chained policies. + + >>> from azure.core.rest import HttpRequest + >>> request = HttpRequest("GET", "https://www.example.org/") + + >>> response = client.send_request(request) + + + For more information on this code flow, see https://aka.ms/azsdk/dpcodegen/python/send_request + + :param request: The network request you want to make. Required. + :type request: ~azure.core.rest.HttpRequest + :keyword bool stream: Whether the response payload will be streamed. Defaults to False. + :return: The response of your network call. Does not do error handling on your response. 
+ :rtype: ~azure.core.rest.HttpResponse + """ + + request_copy = deepcopy(request) + path_format_arguments = { + "endpoint": self._serialize.url("self._config.endpoint", self._config.endpoint, "str", skip_quote=True), + } + + request_copy.url = self._client.format_url(request_copy.url, **path_format_arguments) + return self._client.send_request(request_copy, stream=stream, **kwargs) # type: ignore + + def close(self) -> None: + self._client.close() + + def __enter__(self) -> "EventGridPublisherClient": + self._client.__enter__() + return self + + def __exit__(self, *exc_details: Any) -> None: + self._client.__exit__(*exc_details) + + +class EventGridConsumerClient( + EventGridConsumerClientOperationsMixin +): # pylint: disable=client-accepts-api-version-keyword + """EventGridConsumerClient. + + :param endpoint: The host name of the namespace, e.g. + namespaceName1.westus-1.eventgrid.azure.net. Required. + :type endpoint: str + :param credential: Credential used to authenticate requests to the service. Is either a + AzureKeyCredential type or a TokenCredential type. Required. + :type credential: ~azure.core.credentials.AzureKeyCredential or + ~azure.core.credentials.TokenCredential + :keyword api_version: The API version to use for this operation. Default value is "2024-06-01". + Note that overriding this default value may result in unsupported behavior. 
+ :paramtype api_version: str + """ + + def __init__(self, endpoint: str, credential: Union[AzureKeyCredential, "TokenCredential"], **kwargs: Any) -> None: + _endpoint = "{endpoint}" + self._config = EventGridConsumerClientConfiguration(endpoint=endpoint, credential=credential, **kwargs) + _policies = kwargs.pop("policies", None) + if _policies is None: + _policies = [ + policies.RequestIdPolicy(**kwargs), + self._config.headers_policy, + self._config.user_agent_policy, + self._config.proxy_policy, + policies.ContentDecodePolicy(**kwargs), + self._config.redirect_policy, + self._config.retry_policy, + self._config.authentication_policy, + self._config.custom_hook_policy, + self._config.logging_policy, + policies.DistributedTracingPolicy(**kwargs), + policies.SensitiveHeaderCleanupPolicy(**kwargs) if self._config.redirect_policy else None, + self._config.http_logging_policy, + ] + self._client: PipelineClient = PipelineClient(base_url=_endpoint, policies=_policies, **kwargs) + + self._serialize = Serializer() + self._deserialize = Deserializer() + self._serialize.client_side_validation = False + + def send_request(self, request: HttpRequest, *, stream: bool = False, **kwargs: Any) -> HttpResponse: + """Runs the network request through the client's chained policies. + + >>> from azure.core.rest import HttpRequest + >>> request = HttpRequest("GET", "https://www.example.org/") + + >>> response = client.send_request(request) + + + For more information on this code flow, see https://aka.ms/azsdk/dpcodegen/python/send_request + + :param request: The network request you want to make. Required. + :type request: ~azure.core.rest.HttpRequest + :keyword bool stream: Whether the response payload will be streamed. Defaults to False. + :return: The response of your network call. Does not do error handling on your response. 
+ :rtype: ~azure.core.rest.HttpResponse + """ + + request_copy = deepcopy(request) + path_format_arguments = { + "endpoint": self._serialize.url("self._config.endpoint", self._config.endpoint, "str", skip_quote=True), + } + + request_copy.url = self._client.format_url(request_copy.url, **path_format_arguments) + return self._client.send_request(request_copy, stream=stream, **kwargs) # type: ignore + + def close(self) -> None: + self._client.close() + + def __enter__(self) -> "EventGridConsumerClient": + self._client.__enter__() + return self + + def __exit__(self, *exc_details: Any) -> None: + self._client.__exit__(*exc_details) diff --git a/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_configuration.py b/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_configuration.py new file mode 100644 index 000000000000..93540c21e18c --- /dev/null +++ b/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_configuration.py @@ -0,0 +1,132 @@ +# coding=utf-8 +# -------------------------------------------------------------------------- +# Copyright (c) Microsoft Corporation. All rights reserved. +# Licensed under the MIT License. See License.txt in the project root for license information. +# Code generated by Microsoft (R) Python Code Generator. +# Changes may cause incorrect behavior and will be lost if the code is regenerated. +# -------------------------------------------------------------------------- + +from typing import Any, TYPE_CHECKING, Union + +from azure.core.credentials import AzureKeyCredential +from azure.core.pipeline import policies + +from ._version import VERSION + +if TYPE_CHECKING: + # pylint: disable=unused-import,ungrouped-imports + from azure.core.credentials import TokenCredential + + +class EventGridPublisherClientConfiguration: # pylint: disable=too-many-instance-attributes,name-too-long + """Configuration for EventGridPublisherClient. + + Note that all parameters used to create this instance are saved as instance + attributes. 
+ + :param endpoint: The host name of the namespace, e.g. + namespaceName1.westus-1.eventgrid.azure.net. Required. + :type endpoint: str + :param credential: Credential used to authenticate requests to the service. Is either a + AzureKeyCredential type or a TokenCredential type. Required. + :type credential: ~azure.core.credentials.AzureKeyCredential or + ~azure.core.credentials.TokenCredential + :keyword api_version: The API version to use for this operation. Default value is "2024-06-01". + Note that overriding this default value may result in unsupported behavior. + :paramtype api_version: str + """ + + def __init__(self, endpoint: str, credential: Union[AzureKeyCredential, "TokenCredential"], **kwargs: Any) -> None: + api_version: str = kwargs.pop("api_version", "2024-06-01") + + if endpoint is None: + raise ValueError("Parameter 'endpoint' must not be None.") + if credential is None: + raise ValueError("Parameter 'credential' must not be None.") + + self.endpoint = endpoint + self.credential = credential + self.api_version = api_version + self.credential_scopes = kwargs.pop("credential_scopes", ["https://eventgrid.azure.net/.default"]) + kwargs.setdefault("sdk_moniker", "eventgrid/{}".format(VERSION)) + self.polling_interval = kwargs.get("polling_interval", 30) + self._configure(**kwargs) + + def _infer_policy(self, **kwargs): + if isinstance(self.credential, AzureKeyCredential): + return policies.AzureKeyCredentialPolicy( + self.credential, "Authorization", prefix="SharedAccessKey", **kwargs + ) + if hasattr(self.credential, "get_token"): + return policies.BearerTokenCredentialPolicy(self.credential, *self.credential_scopes, **kwargs) + raise TypeError(f"Unsupported credential: {self.credential}") + + def _configure(self, **kwargs: Any) -> None: + self.user_agent_policy = kwargs.get("user_agent_policy") or policies.UserAgentPolicy(**kwargs) + self.headers_policy = kwargs.get("headers_policy") or policies.HeadersPolicy(**kwargs) + self.proxy_policy = 
kwargs.get("proxy_policy") or policies.ProxyPolicy(**kwargs) + self.logging_policy = kwargs.get("logging_policy") or policies.NetworkTraceLoggingPolicy(**kwargs) + self.http_logging_policy = kwargs.get("http_logging_policy") or policies.HttpLoggingPolicy(**kwargs) + self.custom_hook_policy = kwargs.get("custom_hook_policy") or policies.CustomHookPolicy(**kwargs) + self.redirect_policy = kwargs.get("redirect_policy") or policies.RedirectPolicy(**kwargs) + self.retry_policy = kwargs.get("retry_policy") or policies.RetryPolicy(**kwargs) + self.authentication_policy = kwargs.get("authentication_policy") + if self.credential and not self.authentication_policy: + self.authentication_policy = self._infer_policy(**kwargs) + + +class EventGridConsumerClientConfiguration: # pylint: disable=too-many-instance-attributes,name-too-long + """Configuration for EventGridConsumerClient. + + Note that all parameters used to create this instance are saved as instance + attributes. + + :param endpoint: The host name of the namespace, e.g. + namespaceName1.westus-1.eventgrid.azure.net. Required. + :type endpoint: str + :param credential: Credential used to authenticate requests to the service. Is either a + AzureKeyCredential type or a TokenCredential type. Required. + :type credential: ~azure.core.credentials.AzureKeyCredential or + ~azure.core.credentials.TokenCredential + :keyword api_version: The API version to use for this operation. Default value is "2024-06-01". + Note that overriding this default value may result in unsupported behavior. 
+ :paramtype api_version: str + """ + + def __init__(self, endpoint: str, credential: Union[AzureKeyCredential, "TokenCredential"], **kwargs: Any) -> None: + api_version: str = kwargs.pop("api_version", "2024-06-01") + + if endpoint is None: + raise ValueError("Parameter 'endpoint' must not be None.") + if credential is None: + raise ValueError("Parameter 'credential' must not be None.") + + self.endpoint = endpoint + self.credential = credential + self.api_version = api_version + self.credential_scopes = kwargs.pop("credential_scopes", ["https://eventgrid.azure.net/.default"]) + kwargs.setdefault("sdk_moniker", "eventgrid/{}".format(VERSION)) + self.polling_interval = kwargs.get("polling_interval", 30) + self._configure(**kwargs) + + def _infer_policy(self, **kwargs): + if isinstance(self.credential, AzureKeyCredential): + return policies.AzureKeyCredentialPolicy( + self.credential, "Authorization", prefix="SharedAccessKey", **kwargs + ) + if hasattr(self.credential, "get_token"): + return policies.BearerTokenCredentialPolicy(self.credential, *self.credential_scopes, **kwargs) + raise TypeError(f"Unsupported credential: {self.credential}") + + def _configure(self, **kwargs: Any) -> None: + self.user_agent_policy = kwargs.get("user_agent_policy") or policies.UserAgentPolicy(**kwargs) + self.headers_policy = kwargs.get("headers_policy") or policies.HeadersPolicy(**kwargs) + self.proxy_policy = kwargs.get("proxy_policy") or policies.ProxyPolicy(**kwargs) + self.logging_policy = kwargs.get("logging_policy") or policies.NetworkTraceLoggingPolicy(**kwargs) + self.http_logging_policy = kwargs.get("http_logging_policy") or policies.HttpLoggingPolicy(**kwargs) + self.custom_hook_policy = kwargs.get("custom_hook_policy") or policies.CustomHookPolicy(**kwargs) + self.redirect_policy = kwargs.get("redirect_policy") or policies.RedirectPolicy(**kwargs) + self.retry_policy = kwargs.get("retry_policy") or policies.RetryPolicy(**kwargs) + self.authentication_policy = 
kwargs.get("authentication_policy") + if self.credential and not self.authentication_policy: + self.authentication_policy = self._infer_policy(**kwargs) diff --git a/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_event_mappings.py b/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_event_mappings.py deleted file mode 100644 index bb1644791288..000000000000 --- a/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_event_mappings.py +++ /dev/null @@ -1,501 +0,0 @@ -# -------------------------------------------------------------------------------------------- -# Copyright (c) Microsoft Corporation. All rights reserved. -# Licensed under the MIT License. See License.txt in the project root for license information. - -# THE VALUES IN THE ENUM ARE AUTO-GENERATED. DO NOT EDIT THIS MANUALLY. -# -------------------------------------------------------------------------------------------- -from enum import Enum -from azure.core import CaseInsensitiveEnumMeta - -# pylint: disable=line-too-long -# pylint: disable=enum-must-be-uppercase -class SystemEventNames(str, Enum, metaclass=CaseInsensitiveEnumMeta): - """ - This enum represents the names of the various event types for the system events published to - Azure Event Grid. To check the list of recognizable system topics, - visit https://docs.microsoft.com/azure/event-grid/system-topics. - """ - # These names at the top are 'corrected' aliases of duplicate values that appear below, which are - # deprecated but maintained for backwards compatibility. 
- AcsChatMemberAddedToThreadWithUserEventName = 'Microsoft.Communication.ChatMemberAddedToThreadWithUser' - - ResourceWriteFailureEventName = 'Microsoft.Resources.ResourceWriteFailure' - - IoTHubDeviceDeletedEventName = 'Microsoft.Devices.DeviceDeleted' - - IoTHubDeviceDisconnectedEventName = 'Microsoft.Devices.DeviceDisconnected' - - ResourceDeleteFailureEventName = 'Microsoft.Resources.ResourceDeleteFailure' - - ResourceDeleteCancelEventName = 'Microsoft.Resources.ResourceDeleteCancel' - - AcsChatThreadParticipantAddedEventName = 'Microsoft.Communication.ChatThreadParticipantAdded' - - ResourceDeleteSuccessEventName = 'Microsoft.Resources.ResourceDeleteSuccess' - - EventGridSubscriptionValidationEventName = 'Microsoft.EventGrid.SubscriptionValidationEvent' - - ResourceWriteSuccessEventName = 'Microsoft.Resources.ResourceWriteSuccess' - - ResourceActionSuccessEventName = 'Microsoft.Resources.ResourceActionSuccess' - - ResourceWriteCancelEventName = 'Microsoft.Resources.ResourceWriteCancel' - - ResourceActionFailureEventName = 'Microsoft.Resources.ResourceActionFailure' - - AcsChatMemberRemovedFromThreadWithUserEventName = 'Microsoft.Communication.ChatMemberRemovedFromThreadWithUser' - - IoTHubDeviceConnectedEventName = 'Microsoft.Devices.DeviceConnected' - - EventGridSubscriptionDeletedEventName = 'Microsoft.EventGrid.SubscriptionDeletedEvent' - - AcsChatThreadParticipantRemovedEventName = 'Microsoft.Communication.ChatThreadParticipantRemoved' - - ResourceActionCancelEventName = 'Microsoft.Resources.ResourceActionCancel' - - IoTHubDeviceCreatedEventName = 'Microsoft.Devices.DeviceCreated' - - # Aliases end here - AcsAdvancedMessageDeliveryStatusUpdatedEventName = 'Microsoft.Communication.AdvancedMessageDeliveryStatusUpdated' - - AcsAdvancedMessageReceivedEventName = 'Microsoft.Communication.AdvancedMessageReceived' - - AcsChatMessageDeletedEventName = 'Microsoft.Communication.ChatMessageDeleted' - - AcsChatMessageDeletedInThreadEventName = 
'Microsoft.Communication.ChatMessageDeletedInThread' - - AcsChatMessageEditedEventName = 'Microsoft.Communication.ChatMessageEdited' - - AcsChatMessageEditedInThreadEventName = 'Microsoft.Communication.ChatMessageEditedInThread' - - AcsChatMessageReceivedEventName = 'Microsoft.Communication.ChatMessageReceived' - - AcsChatMessageReceivedInThreadEventName = 'Microsoft.Communication.ChatMessageReceivedInThread' - - AcsChatParticipantAddedToThreadEventName = 'Microsoft.Communication.ChatThreadParticipantAdded' - - AcsChatParticipantAddedToThreadWithUserEventName = 'Microsoft.Communication.ChatParticipantAddedToThreadWithUser' - - AcsChatParticipantRemovedFromThreadEventName = 'Microsoft.Communication.ChatThreadParticipantRemoved' - - AcsChatParticipantRemovedFromThreadWithUserEventName = 'Microsoft.Communication.ChatParticipantRemovedFromThreadWithUser' - - AcsChatThreadCreatedEventName = 'Microsoft.Communication.ChatThreadCreated' - - AcsChatThreadCreatedWithUserEventName = 'Microsoft.Communication.ChatThreadCreatedWithUser' - - AcsChatThreadDeletedEventName = 'Microsoft.Communication.ChatThreadDeleted' - - AcsChatThreadPropertiesUpdatedEventName = 'Microsoft.Communication.ChatThreadPropertiesUpdated' - - AcsChatThreadPropertiesUpdatedPerUserEventName = 'Microsoft.Communication.ChatThreadPropertiesUpdatedPerUser' - - AcsChatThreadWithUserDeletedEventName = 'Microsoft.Communication.ChatThreadWithUserDeleted' - - AcsEmailDeliveryReportReceivedEventName = 'Microsoft.Communication.EmailDeliveryReportReceived' - - AcsEmailEngagementTrackingReportReceivedEventName = 'Microsoft.Communication.EmailEngagementTrackingReportReceived' - - AcsIncomingCallEventName = 'Microsoft.Communication.IncomingCall' - - AcsRecordingFileStatusUpdatedEventName = 'Microsoft.Communication.RecordingFileStatusUpdated' - - AcsRouterJobCancelledEventName = 'Microsoft.Communication.RouterJobCancelled' - - AcsRouterJobClassificationFailedEventName = 
'Microsoft.Communication.RouterJobClassificationFailed' - - AcsRouterJobClassifiedEventName = 'Microsoft.Communication.RouterJobClassified' - - AcsRouterJobClosedEventName = 'Microsoft.Communication.RouterJobClosed' - - AcsRouterJobCompletedEventName = 'Microsoft.Communication.RouterJobCompleted' - - AcsRouterJobDeletedEventName = 'Microsoft.Communication.RouterJobDeleted' - - AcsRouterJobExceptionTriggeredEventName = 'Microsoft.Communication.RouterJobExceptionTriggered' - - AcsRouterJobQueuedEventName = 'Microsoft.Communication.RouterJobQueued' - - AcsRouterJobReceivedEventName = 'Microsoft.Communication.RouterJobReceived' - - AcsRouterJobSchedulingFailedEventName = 'Microsoft.Communication.RouterJobSchedulingFailed' - - AcsRouterJobUnassignedEventName = 'Microsoft.Communication.RouterJobUnassigned' - - AcsRouterJobWaitingForActivationEventName = 'Microsoft.Communication.RouterJobWaitingForActivation' - - AcsRouterJobWorkerSelectorsExpiredEventName = 'Microsoft.Communication.RouterJobWorkerSelectorsExpired' - - AcsRouterWorkerDeletedEventName = 'Microsoft.Communication.RouterWorkerDeleted' - - AcsRouterWorkerDeregisteredEventName = 'Microsoft.Communication.RouterWorkerDeregistered' - - AcsRouterWorkerOfferAcceptedEventName = 'Microsoft.Communication.RouterWorkerOfferAccepted' - - AcsRouterWorkerOfferDeclinedEventName = 'Microsoft.Communication.RouterWorkerOfferDeclined' - - AcsRouterWorkerOfferExpiredEventName = 'Microsoft.Communication.RouterWorkerOfferExpired' - - AcsRouterWorkerOfferIssuedEventName = 'Microsoft.Communication.RouterWorkerOfferIssued' - - AcsRouterWorkerOfferRevokedEventName = 'Microsoft.Communication.RouterWorkerOfferRevoked' - - AcsRouterWorkerRegisteredEventName = 'Microsoft.Communication.RouterWorkerRegistered' - - AcsRouterWorkerUpdatedEventName = 'Microsoft.Communication.RouterWorkerUpdated' - - AcsSmsDeliveryReportReceivedEventName = 'Microsoft.Communication.SMSDeliveryReportReceived' - - AcsSmsReceivedEventName = 
'Microsoft.Communication.SMSReceived' - - AcsUserDisconnectedEventName = 'Microsoft.Communication.UserDisconnected' - - ApiCenterApiDefinitionAddedEventName = 'Microsoft.ApiCenter.ApiDefinitionAdded' - - ApiCenterApiDefinitionUpdatedEventName = 'Microsoft.ApiCenter.ApiDefinitionUpdated' - - ApiManagementApiCreatedEventName = 'Microsoft.ApiManagement.APICreated' - - ApiManagementApiDeletedEventName = 'Microsoft.ApiManagement.APIDeleted' - - ApiManagementApiReleaseCreatedEventName = 'Microsoft.ApiManagement.APIReleaseCreated' - - ApiManagementApiReleaseDeletedEventName = 'Microsoft.ApiManagement.APIReleaseDeleted' - - ApiManagementApiReleaseUpdatedEventName = 'Microsoft.ApiManagement.APIReleaseUpdated' - - ApiManagementApiUpdatedEventName = 'Microsoft.ApiManagement.APIUpdated' - - ApiManagementGatewayApiAddedEventName = 'Microsoft.ApiManagement.GatewayAPIAdded' - - ApiManagementGatewayApiRemovedEventName = 'Microsoft.ApiManagement.GatewayAPIRemoved' - - ApiManagementGatewayCertificateAuthorityCreatedEventName = 'Microsoft.ApiManagement.GatewayCertificateAuthorityCreated' - - ApiManagementGatewayCertificateAuthorityDeletedEventName = 'Microsoft.ApiManagement.GatewayCertificateAuthorityDeleted' - - ApiManagementGatewayCertificateAuthorityUpdatedEventName = 'Microsoft.ApiManagement.GatewayCertificateAuthorityUpdated' - - ApiManagementGatewayCreatedEventName = 'Microsoft.ApiManagement.GatewayCreated' - - ApiManagementGatewayDeletedEventName = 'Microsoft.ApiManagement.GatewayDeleted' - - ApiManagementGatewayHostnameConfigurationCreatedEventName = 'Microsoft.ApiManagement.GatewayHostnameConfigurationCreated' - - ApiManagementGatewayHostnameConfigurationDeletedEventName = 'Microsoft.ApiManagement.GatewayHostnameConfigurationDeleted' - - ApiManagementGatewayHostnameConfigurationUpdatedEventName = 'Microsoft.ApiManagement.GatewayHostnameConfigurationUpdated' - - ApiManagementGatewayUpdatedEventName = 'Microsoft.ApiManagement.GatewayUpdated' - - 
ApiManagementProductCreatedEventName = 'Microsoft.ApiManagement.ProductCreated' - - ApiManagementProductDeletedEventName = 'Microsoft.ApiManagement.ProductDeleted' - - ApiManagementProductUpdatedEventName = 'Microsoft.ApiManagement.ProductUpdated' - - ApiManagementSubscriptionCreatedEventName = 'Microsoft.ApiManagement.SubscriptionCreated' - - ApiManagementSubscriptionDeletedEventName = 'Microsoft.ApiManagement.SubscriptionDeleted' - - ApiManagementSubscriptionUpdatedEventName = 'Microsoft.ApiManagement.SubscriptionUpdated' - - ApiManagementUserCreatedEventName = 'Microsoft.ApiManagement.UserCreated' - - ApiManagementUserDeletedEventName = 'Microsoft.ApiManagement.UserDeleted' - - ApiManagementUserUpdatedEventName = 'Microsoft.ApiManagement.UserUpdated' - - AppConfigurationKeyValueDeletedEventName = 'Microsoft.AppConfiguration.KeyValueDeleted' - - AppConfigurationKeyValueModifiedEventName = 'Microsoft.AppConfiguration.KeyValueModified' - - AppConfigurationSnapshotCreatedEventName = 'Microsoft.AppConfiguration.SnapshotCreated' - - AppConfigurationSnapshotModifiedEventName = 'Microsoft.AppConfiguration.SnapshotModified' - - AvsClusterCreatedEventName = 'Microsoft.AVS.ClusterCreated' - - AvsClusterDeletedEventName = 'Microsoft.AVS.ClusterDeleted' - - AvsClusterFailedEventName = 'Microsoft.AVS.ClusterFailed' - - AvsClusterUpdatedEventName = 'Microsoft.AVS.ClusterUpdated' - - AvsClusterUpdatingEventName = 'Microsoft.AVS.ClusterUpdating' - - AvsPrivateCloudFailedEventName = 'Microsoft.AVS.PrivateCloudFailed' - - AvsPrivateCloudUpdatedEventName = 'Microsoft.AVS.PrivateCloudUpdated' - - AvsPrivateCloudUpdatingEventName = 'Microsoft.AVS.PrivateCloudUpdating' - - AvsScriptExecutionCancelledEventName = 'Microsoft.AVS.ScriptExecutionCancelled' - - AvsScriptExecutionFailedEventName = 'Microsoft.AVS.ScriptExecutionFailed' - - AvsScriptExecutionFinishedEventName = 'Microsoft.AVS.ScriptExecutionFinished' - - AvsScriptExecutionStartedEventName = 
'Microsoft.AVS.ScriptExecutionStarted' - - ContainerRegistryChartDeletedEventName = 'Microsoft.ContainerRegistry.ChartDeleted' - - ContainerRegistryChartPushedEventName = 'Microsoft.ContainerRegistry.ChartPushed' - - ContainerRegistryImageDeletedEventName = 'Microsoft.ContainerRegistry.ImageDeleted' - - ContainerRegistryImagePushedEventName = 'Microsoft.ContainerRegistry.ImagePushed' - - ContainerServiceClusterSupportEndedEventName = 'Microsoft.ContainerService.ClusterSupportEnded' - - ContainerServiceClusterSupportEndingEventName = 'Microsoft.ContainerService.ClusterSupportEnding' - - ContainerServiceNewKubernetesVersionAvailableEventName = 'Microsoft.ContainerService.NewKubernetesVersionAvailable' - - ContainerServiceNodePoolRollingFailedEventName = 'Microsoft.ContainerService.NodePoolRollingFailed' - - ContainerServiceNodePoolRollingStartedEventName = 'Microsoft.ContainerService.NodePoolRollingStarted' - - ContainerServiceNodePoolRollingSucceededEventName = 'Microsoft.ContainerService.NodePoolRollingSucceeded' - - DataBoxCopyCompletedEventName = 'Microsoft.DataBox.CopyCompleted' - - DataBoxCopyStartedEventName = 'Microsoft.DataBox.CopyStarted' - - DataBoxOrderCompletedEventName = 'Microsoft.DataBox.OrderCompleted' - - EventGridMQTTClientCreatedOrUpdatedEventName = 'Microsoft.EventGrid.MQTTClientCreatedOrUpdated' - - EventGridMQTTClientDeletedEventName = 'Microsoft.EventGrid.MQTTClientDeleted' - - EventGridMQTTClientSessionConnectedEventName = 'Microsoft.EventGrid.MQTTClientSessionConnected' - - EventGridMQTTClientSessionDisconnectedEventName = 'Microsoft.EventGrid.MQTTClientSessionDisconnected' - - EventHubCaptureFileCreatedEventName = 'Microsoft.EventHub.CaptureFileCreated' - - HealthcareDicomImageCreatedEventName = 'Microsoft.HealthcareApis.DicomImageCreated' - - HealthcareDicomImageDeletedEventName = 'Microsoft.HealthcareApis.DicomImageDeleted' - - HealthcareDicomImageUpdatedEventName = 'Microsoft.HealthcareApis.DicomImageUpdated' - - 
HealthcareFhirResourceCreatedEventName = 'Microsoft.HealthcareApis.FhirResourceCreated' - - HealthcareFhirResourceDeletedEventName = 'Microsoft.HealthcareApis.FhirResourceDeleted' - - HealthcareFhirResourceUpdatedEventName = 'Microsoft.HealthcareApis.FhirResourceUpdated' - - IotHubDeviceConnectedEventName = 'Microsoft.Devices.DeviceConnected' - - IotHubDeviceCreatedEventName = 'Microsoft.Devices.DeviceCreated' - - IotHubDeviceDeletedEventName = 'Microsoft.Devices.DeviceDeleted' - - IotHubDeviceDisconnectedEventName = 'Microsoft.Devices.DeviceDisconnected' - - IotHubDeviceTelemetryEventName = 'Microsoft.Devices.DeviceTelemetry' - - KeyVaultCertificateExpiredEventName = 'Microsoft.KeyVault.CertificateExpired' - - KeyVaultCertificateNearExpiryEventName = 'Microsoft.KeyVault.CertificateNearExpiry' - - KeyVaultCertificateNewVersionCreatedEventName = 'Microsoft.KeyVault.CertificateNewVersionCreated' - - KeyVaultKeyExpiredEventName = 'Microsoft.KeyVault.KeyExpired' - - KeyVaultKeyNearExpiryEventName = 'Microsoft.KeyVault.KeyNearExpiry' - - KeyVaultKeyNewVersionCreatedEventName = 'Microsoft.KeyVault.KeyNewVersionCreated' - - KeyVaultSecretExpiredEventName = 'Microsoft.KeyVault.SecretExpired' - - KeyVaultSecretNearExpiryEventName = 'Microsoft.KeyVault.SecretNearExpiry' - - KeyVaultSecretNewVersionCreatedEventName = 'Microsoft.KeyVault.SecretNewVersionCreated' - - KeyVaultVaultAccessPolicyChangedEventName = 'Microsoft.KeyVault.VaultAccessPolicyChanged' - - MachineLearningServicesDatasetDriftDetectedEventName = 'Microsoft.MachineLearningServices.DatasetDriftDetected' - - MachineLearningServicesModelDeployedEventName = 'Microsoft.MachineLearningServices.ModelDeployed' - - MachineLearningServicesModelRegisteredEventName = 'Microsoft.MachineLearningServices.ModelRegistered' - - MachineLearningServicesRunCompletedEventName = 'Microsoft.MachineLearningServices.RunCompleted' - - MachineLearningServicesRunStatusChangedEventName = 'Microsoft.MachineLearningServices.RunStatusChanged' 
- - MapsGeofenceEnteredEventName = 'Microsoft.Maps.GeofenceEntered' - - MapsGeofenceExitedEventName = 'Microsoft.Maps.GeofenceExited' - - MapsGeofenceResultEventName = 'Microsoft.Maps.GeofenceResult' - - MediaJobCanceledEventName = 'Microsoft.Media.JobCanceled' - - MediaJobCancelingEventName = 'Microsoft.Media.JobCanceling' - - MediaJobErroredEventName = 'Microsoft.Media.JobErrored' - - MediaJobFinishedEventName = 'Microsoft.Media.JobFinished' - - MediaJobOutputCanceledEventName = 'Microsoft.Media.JobOutputCanceled' - - MediaJobOutputCancelingEventName = 'Microsoft.Media.JobOutputCanceling' - - MediaJobOutputErroredEventName = 'Microsoft.Media.JobOutputErrored' - - MediaJobOutputFinishedEventName = 'Microsoft.Media.JobOutputFinished' - - MediaJobOutputProcessingEventName = 'Microsoft.Media.JobOutputProcessing' - - MediaJobOutputProgressEventName = 'Microsoft.Media.JobOutputProgress' - - MediaJobOutputScheduledEventName = 'Microsoft.Media.JobOutputScheduled' - - MediaJobOutputStateChangeEventName = 'Microsoft.Media.JobOutputStateChange' - - MediaJobProcessingEventName = 'Microsoft.Media.JobProcessing' - - MediaJobScheduledEventName = 'Microsoft.Media.JobScheduled' - - MediaJobStateChangeEventName = 'Microsoft.Media.JobStateChange' - - MediaLiveEventChannelArchiveHeartbeatEventName = 'Microsoft.Media.LiveEventChannelArchiveHeartbeat' - - MediaLiveEventConnectionRejectedEventName = 'Microsoft.Media.LiveEventConnectionRejected' - - MediaLiveEventEncoderConnectedEventName = 'Microsoft.Media.LiveEventEncoderConnected' - - MediaLiveEventEncoderDisconnectedEventName = 'Microsoft.Media.LiveEventEncoderDisconnected' - - MediaLiveEventIncomingDataChunkDroppedEventName = 'Microsoft.Media.LiveEventIncomingDataChunkDropped' - - MediaLiveEventIncomingStreamReceivedEventName = 'Microsoft.Media.LiveEventIncomingStreamReceived' - - MediaLiveEventIncomingStreamsOutOfSyncEventName = 'Microsoft.Media.LiveEventIncomingStreamsOutOfSync' - - 
MediaLiveEventIncomingVideoStreamsOutOfSyncEventName = 'Microsoft.Media.LiveEventIncomingVideoStreamsOutOfSync' - - MediaLiveEventIngestHeartbeatEventName = 'Microsoft.Media.LiveEventIngestHeartbeat' - - MediaLiveEventTrackDiscontinuityDetectedEventName = 'Microsoft.Media.LiveEventTrackDiscontinuityDetected' - - PolicyInsightsPolicyStateChangedEventName = 'Microsoft.PolicyInsights.PolicyStateChanged' - - PolicyInsightsPolicyStateCreatedEventName = 'Microsoft.PolicyInsights.PolicyStateCreated' - - PolicyInsightsPolicyStateDeletedEventName = 'Microsoft.PolicyInsights.PolicyStateDeleted' - - RedisExportRDBCompletedEventName = 'Microsoft.Cache.ExportRDBCompleted' - - RedisImportRDBCompletedEventName = 'Microsoft.Cache.ImportRDBCompleted' - - RedisPatchingCompletedEventName = 'Microsoft.Cache.PatchingCompleted' - - RedisScalingCompletedEventName = 'Microsoft.Cache.ScalingCompleted' - - ResourceActionCancelName = 'Microsoft.Resources.ResourceActionCancel' - - ResourceActionFailureName = 'Microsoft.Resources.ResourceActionFailure' - - ResourceActionSuccessName = 'Microsoft.Resources.ResourceActionSuccess' - - ResourceDeleteCancelName = 'Microsoft.Resources.ResourceDeleteCancel' - - ResourceDeleteFailureName = 'Microsoft.Resources.ResourceDeleteFailure' - - ResourceDeleteSuccessName = 'Microsoft.Resources.ResourceDeleteSuccess' - - ResourceNotificationsHealthResourcesAnnotatedEventName = 'Microsoft.ResourceNotifications.HealthResources.ResourceAnnotated' - - ResourceNotificationsHealthResourcesAvailabilityStatusChangedEventName = 'Microsoft.ResourceNotifications.HealthResources.AvailabilityStatusChanged' - - ResourceNotificationsResourceManagementCreatedOrUpdatedEventName = 'Microsoft.ResourceNotifications.Resources.CreatedOrUpdated' - - ResourceNotificationsResourceManagementDeletedEventName = 'Microsoft.ResourceNotifications.Resources.Deleted' - - ResourceWriteCancelName = 'Microsoft.Resources.ResourceWriteCancel' - - ResourceWriteFailureName = 
'Microsoft.Resources.ResourceWriteFailure' - - ResourceWriteSuccessName = 'Microsoft.Resources.ResourceWriteSuccess' - - ServiceBusActiveMessagesAvailablePeriodicNotificationsEventName = 'Microsoft.ServiceBus.ActiveMessagesAvailablePeriodicNotifications' - - ServiceBusActiveMessagesAvailableWithNoListenersEventName = 'Microsoft.ServiceBus.ActiveMessagesAvailableWithNoListeners' - - ServiceBusDeadletterMessagesAvailablePeriodicNotificationsEventName = 'Microsoft.ServiceBus.DeadletterMessagesAvailablePeriodicNotifications' - - ServiceBusDeadletterMessagesAvailableWithNoListenersEventName = 'Microsoft.ServiceBus.DeadletterMessagesAvailableWithNoListeners' - - SignalRServiceClientConnectionConnectedEventName = 'Microsoft.SignalRService.ClientConnectionConnected' - - SignalRServiceClientConnectionDisconnectedEventName = 'Microsoft.SignalRService.ClientConnectionDisconnected' - - StorageAsyncOperationInitiatedEventName = 'Microsoft.Storage.AsyncOperationInitiated' - - StorageBlobCreatedEventName = 'Microsoft.Storage.BlobCreated' - - StorageBlobDeletedEventName = 'Microsoft.Storage.BlobDeleted' - - StorageBlobInventoryPolicyCompletedEventName = 'Microsoft.Storage.BlobInventoryPolicyCompleted' - - StorageBlobRenamedEventName = 'Microsoft.Storage.BlobRenamed' - - StorageBlobTierChangedEventName = 'Microsoft.Storage.BlobTierChanged' - - StorageDirectoryCreatedEventName = 'Microsoft.Storage.DirectoryCreated' - - StorageDirectoryDeletedEventName = 'Microsoft.Storage.DirectoryDeleted' - - StorageDirectoryRenamedEventName = 'Microsoft.Storage.DirectoryRenamed' - - StorageLifecyclePolicyCompletedEventName = 'Microsoft.Storage.LifecyclePolicyCompleted' - - StorageTaskAssignmentCompletedEventName = 'Microsoft.Storage.StorageTaskAssignmentCompleted' - - StorageTaskAssignmentQueuedEventName = 'Microsoft.Storage.StorageTaskAssignmentQueued' - - StorageTaskCompletedEventName = 'Microsoft.Storage.StorageTaskCompleted' - - StorageTaskQueuedEventName = 
'Microsoft.Storage.StorageTaskQueued' - - SubscriptionDeletedEventName = 'Microsoft.EventGrid.SubscriptionDeletedEvent' - - SubscriptionValidationEventName = 'Microsoft.EventGrid.SubscriptionValidationEvent' - - WebAppServicePlanUpdatedEventName = 'Microsoft.Web.AppServicePlanUpdated' - - WebAppUpdatedEventName = 'Microsoft.Web.AppUpdated' - - WebBackupOperationCompletedEventName = 'Microsoft.Web.BackupOperationCompleted' - - WebBackupOperationFailedEventName = 'Microsoft.Web.BackupOperationFailed' - - WebBackupOperationStartedEventName = 'Microsoft.Web.BackupOperationStarted' - - WebRestoreOperationCompletedEventName = 'Microsoft.Web.RestoreOperationCompleted' - - WebRestoreOperationFailedEventName = 'Microsoft.Web.RestoreOperationFailed' - - WebRestoreOperationStartedEventName = 'Microsoft.Web.RestoreOperationStarted' - - WebSlotSwapCompletedEventName = 'Microsoft.Web.SlotSwapCompleted' - - WebSlotSwapFailedEventName = 'Microsoft.Web.SlotSwapFailed' - - WebSlotSwapStartedEventName = 'Microsoft.Web.SlotSwapStarted' - - WebSlotSwapWithPreviewCancelledEventName = 'Microsoft.Web.SlotSwapWithPreviewCancelled' - - WebSlotSwapWithPreviewStartedEventName = 'Microsoft.Web.SlotSwapWithPreviewStarted' - - ContainerRegistryArtifactEventName = 'Microsoft.AppConfiguration.KeyValueModified' - - KeyVaultAccessPolicyChangedEventName = 'Microsoft.KeyVault.VaultAccessPolicyChanged' - - ContainerRegistryEventName = 'Microsoft.ContainerRegistry.ChartPushed' - - ServiceBusDeadletterMessagesAvailableWithNoListenerEventName = 'Microsoft.ServiceBus.DeadletterMessagesAvailableWithNoListeners' diff --git a/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_legacy/__init__.py b/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_legacy/__init__.py new file mode 100644 index 000000000000..1dc3655a13bb --- /dev/null +++ b/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_legacy/__init__.py @@ -0,0 +1,19 @@ +# coding=utf-8 +# -------------------------------------------------------------------------- +# 
Copyright (c) Microsoft Corporation. All rights reserved. +# Licensed under the MIT License. See License.txt in the project root for license information. +# -------------------------------------------------------------------------- + +from ._publisher_client import EventGridPublisherClient +from ._event_mappings import SystemEventNames +from ._helpers import generate_sas +from ._models import EventGridEvent +from ._version import VERSION + +__all__ = [ + "EventGridPublisherClient", + "EventGridEvent", + "generate_sas", + "SystemEventNames", +] +__version__ = VERSION diff --git a/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_constants.py b/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_legacy/_constants.py similarity index 100% rename from sdk/eventgrid/azure-eventgrid/azure/eventgrid/_constants.py rename to sdk/eventgrid/azure-eventgrid/azure/eventgrid/_legacy/_constants.py diff --git a/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_legacy/_event_mappings.py b/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_legacy/_event_mappings.py new file mode 100644 index 000000000000..59fbfed265e6 --- /dev/null +++ b/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_legacy/_event_mappings.py @@ -0,0 +1,533 @@ +# -------------------------------------------------------------------------------------------- +# Copyright (c) Microsoft Corporation. All rights reserved. +# Licensed under the MIT License. See License.txt in the project root for license information. + +# THE VALUES IN THE ENUM ARE AUTO-GENERATED. DO NOT EDIT THIS MANUALLY. +# -------------------------------------------------------------------------------------------- +from enum import Enum +from azure.core import CaseInsensitiveEnumMeta + + +# pylint: disable=line-too-long +# pylint: disable=enum-must-be-uppercase +class SystemEventNames(str, Enum, metaclass=CaseInsensitiveEnumMeta): + """ + This enum represents the names of the various event types for the system events published to + Azure Event Grid. 
To check the list of recognizable system topics, + visit https://docs.microsoft.com/azure/event-grid/system-topics. + """ + + # These names at the top are 'corrected' aliases of duplicate values that appear below, which are + # deprecated but maintained for backwards compatibility. + AcsChatMemberAddedToThreadWithUserEventName = "Microsoft.Communication.ChatMemberAddedToThreadWithUser" + + ResourceWriteFailureEventName = "Microsoft.Resources.ResourceWriteFailure" + + IoTHubDeviceDeletedEventName = "Microsoft.Devices.DeviceDeleted" + + IoTHubDeviceDisconnectedEventName = "Microsoft.Devices.DeviceDisconnected" + + ResourceDeleteFailureEventName = "Microsoft.Resources.ResourceDeleteFailure" + + ResourceDeleteCancelEventName = "Microsoft.Resources.ResourceDeleteCancel" + + AcsChatThreadParticipantAddedEventName = "Microsoft.Communication.ChatThreadParticipantAdded" + + ResourceDeleteSuccessEventName = "Microsoft.Resources.ResourceDeleteSuccess" + + EventGridSubscriptionValidationEventName = "Microsoft.EventGrid.SubscriptionValidationEvent" + + ResourceWriteSuccessEventName = "Microsoft.Resources.ResourceWriteSuccess" + + ResourceActionSuccessEventName = "Microsoft.Resources.ResourceActionSuccess" + + ResourceWriteCancelEventName = "Microsoft.Resources.ResourceWriteCancel" + + ResourceActionFailureEventName = "Microsoft.Resources.ResourceActionFailure" + + AcsChatMemberRemovedFromThreadWithUserEventName = "Microsoft.Communication.ChatMemberRemovedFromThreadWithUser" + + IoTHubDeviceConnectedEventName = "Microsoft.Devices.DeviceConnected" + + EventGridSubscriptionDeletedEventName = "Microsoft.EventGrid.SubscriptionDeletedEvent" + + AcsChatThreadParticipantRemovedEventName = "Microsoft.Communication.ChatThreadParticipantRemoved" + + ResourceActionCancelEventName = "Microsoft.Resources.ResourceActionCancel" + + IoTHubDeviceCreatedEventName = "Microsoft.Devices.DeviceCreated" + + # Aliases end here + AcsAdvancedMessageDeliveryStatusUpdatedEventName = 
"Microsoft.Communication.AdvancedMessageDeliveryStatusUpdated" + + AcsAdvancedMessageReceivedEventName = "Microsoft.Communication.AdvancedMessageReceived" + + AcsChatMessageDeletedEventName = "Microsoft.Communication.ChatMessageDeleted" + + AcsChatMessageDeletedInThreadEventName = "Microsoft.Communication.ChatMessageDeletedInThread" + + AcsChatMessageEditedEventName = "Microsoft.Communication.ChatMessageEdited" + + AcsChatMessageEditedInThreadEventName = "Microsoft.Communication.ChatMessageEditedInThread" + + AcsChatMessageReceivedEventName = "Microsoft.Communication.ChatMessageReceived" + + AcsChatMessageReceivedInThreadEventName = "Microsoft.Communication.ChatMessageReceivedInThread" + + AcsChatParticipantAddedToThreadEventName = "Microsoft.Communication.ChatThreadParticipantAdded" + + AcsChatParticipantAddedToThreadWithUserEventName = "Microsoft.Communication.ChatParticipantAddedToThreadWithUser" + + AcsChatParticipantRemovedFromThreadEventName = "Microsoft.Communication.ChatThreadParticipantRemoved" + + AcsChatParticipantRemovedFromThreadWithUserEventName = ( + "Microsoft.Communication.ChatParticipantRemovedFromThreadWithUser" + ) + + AcsChatThreadCreatedEventName = "Microsoft.Communication.ChatThreadCreated" + + AcsChatThreadCreatedWithUserEventName = "Microsoft.Communication.ChatThreadCreatedWithUser" + + AcsChatThreadDeletedEventName = "Microsoft.Communication.ChatThreadDeleted" + + AcsChatThreadPropertiesUpdatedEventName = "Microsoft.Communication.ChatThreadPropertiesUpdated" + + AcsChatThreadPropertiesUpdatedPerUserEventName = "Microsoft.Communication.ChatThreadPropertiesUpdatedPerUser" + + AcsChatThreadWithUserDeletedEventName = "Microsoft.Communication.ChatThreadWithUserDeleted" + + AcsEmailDeliveryReportReceivedEventName = "Microsoft.Communication.EmailDeliveryReportReceived" + + AcsEmailEngagementTrackingReportReceivedEventName = "Microsoft.Communication.EmailEngagementTrackingReportReceived" + + AcsIncomingCallEventName = 
"Microsoft.Communication.IncomingCall" + + AcsRecordingFileStatusUpdatedEventName = "Microsoft.Communication.RecordingFileStatusUpdated" + + AcsRouterJobCancelledEventName = "Microsoft.Communication.RouterJobCancelled" + + AcsRouterJobClassificationFailedEventName = "Microsoft.Communication.RouterJobClassificationFailed" + + AcsRouterJobClassifiedEventName = "Microsoft.Communication.RouterJobClassified" + + AcsRouterJobClosedEventName = "Microsoft.Communication.RouterJobClosed" + + AcsRouterJobCompletedEventName = "Microsoft.Communication.RouterJobCompleted" + + AcsRouterJobDeletedEventName = "Microsoft.Communication.RouterJobDeleted" + + AcsRouterJobExceptionTriggeredEventName = "Microsoft.Communication.RouterJobExceptionTriggered" + + AcsRouterJobQueuedEventName = "Microsoft.Communication.RouterJobQueued" + + AcsRouterJobReceivedEventName = "Microsoft.Communication.RouterJobReceived" + + AcsRouterJobSchedulingFailedEventName = "Microsoft.Communication.RouterJobSchedulingFailed" + + AcsRouterJobUnassignedEventName = "Microsoft.Communication.RouterJobUnassigned" + + AcsRouterJobWaitingForActivationEventName = "Microsoft.Communication.RouterJobWaitingForActivation" + + AcsRouterJobWorkerSelectorsExpiredEventName = "Microsoft.Communication.RouterJobWorkerSelectorsExpired" + + AcsRouterWorkerDeletedEventName = "Microsoft.Communication.RouterWorkerDeleted" + + AcsRouterWorkerDeregisteredEventName = "Microsoft.Communication.RouterWorkerDeregistered" + + AcsRouterWorkerOfferAcceptedEventName = "Microsoft.Communication.RouterWorkerOfferAccepted" + + AcsRouterWorkerOfferDeclinedEventName = "Microsoft.Communication.RouterWorkerOfferDeclined" + + AcsRouterWorkerOfferExpiredEventName = "Microsoft.Communication.RouterWorkerOfferExpired" + + AcsRouterWorkerOfferIssuedEventName = "Microsoft.Communication.RouterWorkerOfferIssued" + + AcsRouterWorkerOfferRevokedEventName = "Microsoft.Communication.RouterWorkerOfferRevoked" + + AcsRouterWorkerRegisteredEventName = 
"Microsoft.Communication.RouterWorkerRegistered" + + AcsRouterWorkerUpdatedEventName = "Microsoft.Communication.RouterWorkerUpdated" + + AcsSmsDeliveryReportReceivedEventName = "Microsoft.Communication.SMSDeliveryReportReceived" + + AcsSmsReceivedEventName = "Microsoft.Communication.SMSReceived" + + AcsUserDisconnectedEventName = "Microsoft.Communication.UserDisconnected" + + ApiCenterApiDefinitionAddedEventName = "Microsoft.ApiCenter.ApiDefinitionAdded" + + ApiCenterApiDefinitionUpdatedEventName = "Microsoft.ApiCenter.ApiDefinitionUpdated" + + ApiManagementApiCreatedEventName = "Microsoft.ApiManagement.APICreated" + + ApiManagementApiDeletedEventName = "Microsoft.ApiManagement.APIDeleted" + + ApiManagementApiReleaseCreatedEventName = "Microsoft.ApiManagement.APIReleaseCreated" + + ApiManagementApiReleaseDeletedEventName = "Microsoft.ApiManagement.APIReleaseDeleted" + + ApiManagementApiReleaseUpdatedEventName = "Microsoft.ApiManagement.APIReleaseUpdated" + + ApiManagementApiUpdatedEventName = "Microsoft.ApiManagement.APIUpdated" + + ApiManagementGatewayApiAddedEventName = "Microsoft.ApiManagement.GatewayAPIAdded" + + ApiManagementGatewayApiRemovedEventName = "Microsoft.ApiManagement.GatewayAPIRemoved" + + ApiManagementGatewayCertificateAuthorityCreatedEventName = ( + "Microsoft.ApiManagement.GatewayCertificateAuthorityCreated" + ) + + ApiManagementGatewayCertificateAuthorityDeletedEventName = ( + "Microsoft.ApiManagement.GatewayCertificateAuthorityDeleted" + ) + + ApiManagementGatewayCertificateAuthorityUpdatedEventName = ( + "Microsoft.ApiManagement.GatewayCertificateAuthorityUpdated" + ) + + ApiManagementGatewayCreatedEventName = "Microsoft.ApiManagement.GatewayCreated" + + ApiManagementGatewayDeletedEventName = "Microsoft.ApiManagement.GatewayDeleted" + + ApiManagementGatewayHostnameConfigurationCreatedEventName = ( + "Microsoft.ApiManagement.GatewayHostnameConfigurationCreated" + ) + + ApiManagementGatewayHostnameConfigurationDeletedEventName = ( + 
"Microsoft.ApiManagement.GatewayHostnameConfigurationDeleted" + ) + + ApiManagementGatewayHostnameConfigurationUpdatedEventName = ( + "Microsoft.ApiManagement.GatewayHostnameConfigurationUpdated" + ) + + ApiManagementGatewayUpdatedEventName = "Microsoft.ApiManagement.GatewayUpdated" + + ApiManagementProductCreatedEventName = "Microsoft.ApiManagement.ProductCreated" + + ApiManagementProductDeletedEventName = "Microsoft.ApiManagement.ProductDeleted" + + ApiManagementProductUpdatedEventName = "Microsoft.ApiManagement.ProductUpdated" + + ApiManagementSubscriptionCreatedEventName = "Microsoft.ApiManagement.SubscriptionCreated" + + ApiManagementSubscriptionDeletedEventName = "Microsoft.ApiManagement.SubscriptionDeleted" + + ApiManagementSubscriptionUpdatedEventName = "Microsoft.ApiManagement.SubscriptionUpdated" + + ApiManagementUserCreatedEventName = "Microsoft.ApiManagement.UserCreated" + + ApiManagementUserDeletedEventName = "Microsoft.ApiManagement.UserDeleted" + + ApiManagementUserUpdatedEventName = "Microsoft.ApiManagement.UserUpdated" + + AppConfigurationKeyValueDeletedEventName = "Microsoft.AppConfiguration.KeyValueDeleted" + + AppConfigurationKeyValueModifiedEventName = "Microsoft.AppConfiguration.KeyValueModified" + + AppConfigurationSnapshotCreatedEventName = "Microsoft.AppConfiguration.SnapshotCreated" + + AppConfigurationSnapshotModifiedEventName = "Microsoft.AppConfiguration.SnapshotModified" + + AvsClusterCreatedEventName = "Microsoft.AVS.ClusterCreated" + + AvsClusterDeletedEventName = "Microsoft.AVS.ClusterDeleted" + + AvsClusterFailedEventName = "Microsoft.AVS.ClusterFailed" + + AvsClusterUpdatedEventName = "Microsoft.AVS.ClusterUpdated" + + AvsClusterUpdatingEventName = "Microsoft.AVS.ClusterUpdating" + + AvsPrivateCloudFailedEventName = "Microsoft.AVS.PrivateCloudFailed" + + AvsPrivateCloudUpdatedEventName = "Microsoft.AVS.PrivateCloudUpdated" + + AvsPrivateCloudUpdatingEventName = "Microsoft.AVS.PrivateCloudUpdating" + + 
AvsScriptExecutionCancelledEventName = "Microsoft.AVS.ScriptExecutionCancelled" + + AvsScriptExecutionFailedEventName = "Microsoft.AVS.ScriptExecutionFailed" + + AvsScriptExecutionFinishedEventName = "Microsoft.AVS.ScriptExecutionFinished" + + AvsScriptExecutionStartedEventName = "Microsoft.AVS.ScriptExecutionStarted" + + ContainerRegistryChartDeletedEventName = "Microsoft.ContainerRegistry.ChartDeleted" + + ContainerRegistryChartPushedEventName = "Microsoft.ContainerRegistry.ChartPushed" + + ContainerRegistryImageDeletedEventName = "Microsoft.ContainerRegistry.ImageDeleted" + + ContainerRegistryImagePushedEventName = "Microsoft.ContainerRegistry.ImagePushed" + + ContainerServiceClusterSupportEndedEventName = "Microsoft.ContainerService.ClusterSupportEnded" + + ContainerServiceClusterSupportEndingEventName = "Microsoft.ContainerService.ClusterSupportEnding" + + ContainerServiceNewKubernetesVersionAvailableEventName = "Microsoft.ContainerService.NewKubernetesVersionAvailable" + + ContainerServiceNodePoolRollingFailedEventName = "Microsoft.ContainerService.NodePoolRollingFailed" + + ContainerServiceNodePoolRollingStartedEventName = "Microsoft.ContainerService.NodePoolRollingStarted" + + ContainerServiceNodePoolRollingSucceededEventName = "Microsoft.ContainerService.NodePoolRollingSucceeded" + + DataBoxCopyCompletedEventName = "Microsoft.DataBox.CopyCompleted" + + DataBoxCopyStartedEventName = "Microsoft.DataBox.CopyStarted" + + DataBoxOrderCompletedEventName = "Microsoft.DataBox.OrderCompleted" + + EventGridMQTTClientCreatedOrUpdatedEventName = "Microsoft.EventGrid.MQTTClientCreatedOrUpdated" + + EventGridMQTTClientDeletedEventName = "Microsoft.EventGrid.MQTTClientDeleted" + + EventGridMQTTClientSessionConnectedEventName = "Microsoft.EventGrid.MQTTClientSessionConnected" + + EventGridMQTTClientSessionDisconnectedEventName = "Microsoft.EventGrid.MQTTClientSessionDisconnected" + + EventHubCaptureFileCreatedEventName = "Microsoft.EventHub.CaptureFileCreated" + + 
HealthcareDicomImageCreatedEventName = "Microsoft.HealthcareApis.DicomImageCreated" + + HealthcareDicomImageDeletedEventName = "Microsoft.HealthcareApis.DicomImageDeleted" + + HealthcareDicomImageUpdatedEventName = "Microsoft.HealthcareApis.DicomImageUpdated" + + HealthcareFhirResourceCreatedEventName = "Microsoft.HealthcareApis.FhirResourceCreated" + + HealthcareFhirResourceDeletedEventName = "Microsoft.HealthcareApis.FhirResourceDeleted" + + HealthcareFhirResourceUpdatedEventName = "Microsoft.HealthcareApis.FhirResourceUpdated" + + IotHubDeviceConnectedEventName = "Microsoft.Devices.DeviceConnected" + + IotHubDeviceCreatedEventName = "Microsoft.Devices.DeviceCreated" + + IotHubDeviceDeletedEventName = "Microsoft.Devices.DeviceDeleted" + + IotHubDeviceDisconnectedEventName = "Microsoft.Devices.DeviceDisconnected" + + IotHubDeviceTelemetryEventName = "Microsoft.Devices.DeviceTelemetry" + + KeyVaultCertificateExpiredEventName = "Microsoft.KeyVault.CertificateExpired" + + KeyVaultCertificateNearExpiryEventName = "Microsoft.KeyVault.CertificateNearExpiry" + + KeyVaultCertificateNewVersionCreatedEventName = "Microsoft.KeyVault.CertificateNewVersionCreated" + + KeyVaultKeyExpiredEventName = "Microsoft.KeyVault.KeyExpired" + + KeyVaultKeyNearExpiryEventName = "Microsoft.KeyVault.KeyNearExpiry" + + KeyVaultKeyNewVersionCreatedEventName = "Microsoft.KeyVault.KeyNewVersionCreated" + + KeyVaultSecretExpiredEventName = "Microsoft.KeyVault.SecretExpired" + + KeyVaultSecretNearExpiryEventName = "Microsoft.KeyVault.SecretNearExpiry" + + KeyVaultSecretNewVersionCreatedEventName = "Microsoft.KeyVault.SecretNewVersionCreated" + + KeyVaultVaultAccessPolicyChangedEventName = "Microsoft.KeyVault.VaultAccessPolicyChanged" + + MachineLearningServicesDatasetDriftDetectedEventName = "Microsoft.MachineLearningServices.DatasetDriftDetected" + + MachineLearningServicesModelDeployedEventName = "Microsoft.MachineLearningServices.ModelDeployed" + + 
MachineLearningServicesModelRegisteredEventName = "Microsoft.MachineLearningServices.ModelRegistered" + + MachineLearningServicesRunCompletedEventName = "Microsoft.MachineLearningServices.RunCompleted" + + MachineLearningServicesRunStatusChangedEventName = "Microsoft.MachineLearningServices.RunStatusChanged" + + MapsGeofenceEnteredEventName = "Microsoft.Maps.GeofenceEntered" + + MapsGeofenceExitedEventName = "Microsoft.Maps.GeofenceExited" + + MapsGeofenceResultEventName = "Microsoft.Maps.GeofenceResult" + + MediaJobCanceledEventName = "Microsoft.Media.JobCanceled" + + MediaJobCancelingEventName = "Microsoft.Media.JobCanceling" + + MediaJobErroredEventName = "Microsoft.Media.JobErrored" + + MediaJobFinishedEventName = "Microsoft.Media.JobFinished" + + MediaJobOutputCanceledEventName = "Microsoft.Media.JobOutputCanceled" + + MediaJobOutputCancelingEventName = "Microsoft.Media.JobOutputCanceling" + + MediaJobOutputErroredEventName = "Microsoft.Media.JobOutputErrored" + + MediaJobOutputFinishedEventName = "Microsoft.Media.JobOutputFinished" + + MediaJobOutputProcessingEventName = "Microsoft.Media.JobOutputProcessing" + + MediaJobOutputProgressEventName = "Microsoft.Media.JobOutputProgress" + + MediaJobOutputScheduledEventName = "Microsoft.Media.JobOutputScheduled" + + MediaJobOutputStateChangeEventName = "Microsoft.Media.JobOutputStateChange" + + MediaJobProcessingEventName = "Microsoft.Media.JobProcessing" + + MediaJobScheduledEventName = "Microsoft.Media.JobScheduled" + + MediaJobStateChangeEventName = "Microsoft.Media.JobStateChange" + + MediaLiveEventChannelArchiveHeartbeatEventName = "Microsoft.Media.LiveEventChannelArchiveHeartbeat" + + MediaLiveEventConnectionRejectedEventName = "Microsoft.Media.LiveEventConnectionRejected" + + MediaLiveEventEncoderConnectedEventName = "Microsoft.Media.LiveEventEncoderConnected" + + MediaLiveEventEncoderDisconnectedEventName = "Microsoft.Media.LiveEventEncoderDisconnected" + + MediaLiveEventIncomingDataChunkDroppedEventName = 
"Microsoft.Media.LiveEventIncomingDataChunkDropped" + + MediaLiveEventIncomingStreamReceivedEventName = "Microsoft.Media.LiveEventIncomingStreamReceived" + + MediaLiveEventIncomingStreamsOutOfSyncEventName = "Microsoft.Media.LiveEventIncomingStreamsOutOfSync" + + MediaLiveEventIncomingVideoStreamsOutOfSyncEventName = "Microsoft.Media.LiveEventIncomingVideoStreamsOutOfSync" + + MediaLiveEventIngestHeartbeatEventName = "Microsoft.Media.LiveEventIngestHeartbeat" + + MediaLiveEventTrackDiscontinuityDetectedEventName = "Microsoft.Media.LiveEventTrackDiscontinuityDetected" + + PolicyInsightsPolicyStateChangedEventName = "Microsoft.PolicyInsights.PolicyStateChanged" + + PolicyInsightsPolicyStateCreatedEventName = "Microsoft.PolicyInsights.PolicyStateCreated" + + PolicyInsightsPolicyStateDeletedEventName = "Microsoft.PolicyInsights.PolicyStateDeleted" + + RedisExportRDBCompletedEventName = "Microsoft.Cache.ExportRDBCompleted" + + RedisImportRDBCompletedEventName = "Microsoft.Cache.ImportRDBCompleted" + + RedisPatchingCompletedEventName = "Microsoft.Cache.PatchingCompleted" + + RedisScalingCompletedEventName = "Microsoft.Cache.ScalingCompleted" + + ResourceActionCancelName = "Microsoft.Resources.ResourceActionCancel" + + ResourceActionFailureName = "Microsoft.Resources.ResourceActionFailure" + + ResourceActionSuccessName = "Microsoft.Resources.ResourceActionSuccess" + + ResourceDeleteCancelName = "Microsoft.Resources.ResourceDeleteCancel" + + ResourceDeleteFailureName = "Microsoft.Resources.ResourceDeleteFailure" + + ResourceDeleteSuccessName = "Microsoft.Resources.ResourceDeleteSuccess" + + ResourceNotificationsHealthResourcesAnnotatedEventName = ( + "Microsoft.ResourceNotifications.HealthResources.ResourceAnnotated" + ) + + ResourceNotificationsHealthResourcesAvailabilityStatusChangedEventName = ( + "Microsoft.ResourceNotifications.HealthResources.AvailabilityStatusChanged" + ) + + ResourceNotificationsResourceManagementCreatedOrUpdatedEventName = ( + 
"Microsoft.ResourceNotifications.Resources.CreatedOrUpdated" + ) + + ResourceNotificationsResourceManagementDeletedEventName = "Microsoft.ResourceNotifications.Resources.Deleted" + + ResourceWriteCancelName = "Microsoft.Resources.ResourceWriteCancel" + + ResourceWriteFailureName = "Microsoft.Resources.ResourceWriteFailure" + + ResourceWriteSuccessName = "Microsoft.Resources.ResourceWriteSuccess" + + ServiceBusActiveMessagesAvailablePeriodicNotificationsEventName = ( + "Microsoft.ServiceBus.ActiveMessagesAvailablePeriodicNotifications" + ) + + ServiceBusActiveMessagesAvailableWithNoListenersEventName = ( + "Microsoft.ServiceBus.ActiveMessagesAvailableWithNoListeners" + ) + + ServiceBusDeadletterMessagesAvailablePeriodicNotificationsEventName = ( + "Microsoft.ServiceBus.DeadletterMessagesAvailablePeriodicNotifications" + ) + + ServiceBusDeadletterMessagesAvailableWithNoListenersEventName = ( + "Microsoft.ServiceBus.DeadletterMessagesAvailableWithNoListeners" + ) + + SignalRServiceClientConnectionConnectedEventName = "Microsoft.SignalRService.ClientConnectionConnected" + + SignalRServiceClientConnectionDisconnectedEventName = "Microsoft.SignalRService.ClientConnectionDisconnected" + + StorageAsyncOperationInitiatedEventName = "Microsoft.Storage.AsyncOperationInitiated" + + StorageBlobCreatedEventName = "Microsoft.Storage.BlobCreated" + + StorageBlobDeletedEventName = "Microsoft.Storage.BlobDeleted" + + StorageBlobInventoryPolicyCompletedEventName = "Microsoft.Storage.BlobInventoryPolicyCompleted" + + StorageBlobRenamedEventName = "Microsoft.Storage.BlobRenamed" + + StorageBlobTierChangedEventName = "Microsoft.Storage.BlobTierChanged" + + StorageDirectoryCreatedEventName = "Microsoft.Storage.DirectoryCreated" + + StorageDirectoryDeletedEventName = "Microsoft.Storage.DirectoryDeleted" + + StorageDirectoryRenamedEventName = "Microsoft.Storage.DirectoryRenamed" + + StorageLifecyclePolicyCompletedEventName = "Microsoft.Storage.LifecyclePolicyCompleted" + + 
StorageTaskAssignmentCompletedEventName = "Microsoft.Storage.StorageTaskAssignmentCompleted" + + StorageTaskAssignmentQueuedEventName = "Microsoft.Storage.StorageTaskAssignmentQueued" + + StorageTaskCompletedEventName = "Microsoft.Storage.StorageTaskCompleted" + + StorageTaskQueuedEventName = "Microsoft.Storage.StorageTaskQueued" + + SubscriptionDeletedEventName = "Microsoft.EventGrid.SubscriptionDeletedEvent" + + SubscriptionValidationEventName = "Microsoft.EventGrid.SubscriptionValidationEvent" + + WebAppServicePlanUpdatedEventName = "Microsoft.Web.AppServicePlanUpdated" + + WebAppUpdatedEventName = "Microsoft.Web.AppUpdated" + + WebBackupOperationCompletedEventName = "Microsoft.Web.BackupOperationCompleted" + + WebBackupOperationFailedEventName = "Microsoft.Web.BackupOperationFailed" + + WebBackupOperationStartedEventName = "Microsoft.Web.BackupOperationStarted" + + WebRestoreOperationCompletedEventName = "Microsoft.Web.RestoreOperationCompleted" + + WebRestoreOperationFailedEventName = "Microsoft.Web.RestoreOperationFailed" + + WebRestoreOperationStartedEventName = "Microsoft.Web.RestoreOperationStarted" + + WebSlotSwapCompletedEventName = "Microsoft.Web.SlotSwapCompleted" + + WebSlotSwapFailedEventName = "Microsoft.Web.SlotSwapFailed" + + WebSlotSwapStartedEventName = "Microsoft.Web.SlotSwapStarted" + + WebSlotSwapWithPreviewCancelledEventName = "Microsoft.Web.SlotSwapWithPreviewCancelled" + + WebSlotSwapWithPreviewStartedEventName = "Microsoft.Web.SlotSwapWithPreviewStarted" + + ContainerRegistryArtifactEventName = "Microsoft.AppConfiguration.KeyValueModified" + + KeyVaultAccessPolicyChangedEventName = "Microsoft.KeyVault.VaultAccessPolicyChanged" + + ContainerRegistryEventName = "Microsoft.ContainerRegistry.ChartPushed" + + ServiceBusDeadletterMessagesAvailableWithNoListenerEventName = ( + "Microsoft.ServiceBus.DeadletterMessagesAvailableWithNoListeners" + ) diff --git a/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_generated/__init__.py 
b/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_legacy/_generated/__init__.py similarity index 100% rename from sdk/eventgrid/azure-eventgrid/azure/eventgrid/_generated/__init__.py rename to sdk/eventgrid/azure-eventgrid/azure/eventgrid/_legacy/_generated/__init__.py diff --git a/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_generated/_client.py b/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_legacy/_generated/_client.py similarity index 100% rename from sdk/eventgrid/azure-eventgrid/azure/eventgrid/_generated/_client.py rename to sdk/eventgrid/azure-eventgrid/azure/eventgrid/_legacy/_generated/_client.py diff --git a/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_generated/_configuration.py b/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_legacy/_generated/_configuration.py similarity index 100% rename from sdk/eventgrid/azure-eventgrid/azure/eventgrid/_generated/_configuration.py rename to sdk/eventgrid/azure-eventgrid/azure/eventgrid/_legacy/_generated/_configuration.py diff --git a/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_generated/_operations/__init__.py b/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_legacy/_generated/_operations/__init__.py similarity index 100% rename from sdk/eventgrid/azure-eventgrid/azure/eventgrid/_generated/_operations/__init__.py rename to sdk/eventgrid/azure-eventgrid/azure/eventgrid/_legacy/_generated/_operations/__init__.py diff --git a/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_generated/_operations/_operations.py b/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_legacy/_generated/_operations/_operations.py similarity index 95% rename from sdk/eventgrid/azure-eventgrid/azure/eventgrid/_generated/_operations/_operations.py rename to sdk/eventgrid/azure-eventgrid/azure/eventgrid/_legacy/_generated/_operations/_operations.py index 86ab83d13741..0ba43ddc6003 100644 --- a/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_generated/_operations/_operations.py +++ 
b/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_legacy/_generated/_operations/_operations.py @@ -83,7 +83,9 @@ def build_event_grid_publisher_publish_cloud_event_events_request( return HttpRequest(method="POST", url=_url, params=_params, headers=_headers, json=json, **kwargs) -def build_event_grid_publisher_publish_custom_event_events_request(**kwargs: Any) -> HttpRequest: +def build_event_grid_publisher_publish_custom_event_events_request( + **kwargs: Any, +) -> HttpRequest: _headers = case_insensitive_dict(kwargs.pop("headers", {}) or {}) _params = case_insensitive_dict(kwargs.pop("params", {}) or {}) @@ -110,7 +112,7 @@ def publish_events( # pylint: disable=inconsistent-return-statements events: List[_models.EventGridEvent], *, content_type: str = "application/json", - **kwargs: Any + **kwargs: Any, ) -> None: """Publishes a batch of events to an Azure Event Grid topic. @@ -129,7 +131,12 @@ def publish_events( # pylint: disable=inconsistent-return-statements @overload def publish_events( # pylint: disable=inconsistent-return-statements - self, topic_hostname: str, events: IO, *, content_type: str = "application/json", **kwargs: Any + self, + topic_hostname: str, + events: IO, + *, + content_type: str = "application/json", + **kwargs: Any, ) -> None: """Publishes a batch of events to an Azure Event Grid topic. @@ -148,7 +155,10 @@ def publish_events( # pylint: disable=inconsistent-return-statements @distributed_trace def publish_events( # pylint: disable=inconsistent-return-statements - self, topic_hostname: str, events: Union[List[_models.EventGridEvent], IO], **kwargs: Any + self, + topic_hostname: str, + events: Union[List[_models.EventGridEvent], IO], + **kwargs: Any, ) -> None: """Publishes a batch of events to an Azure Event Grid topic. 
@@ -241,7 +251,8 @@ def publish_cloud_event_events( # pylint: disable=inconsistent-return-statement _params = kwargs.pop("params", {}) or {} content_type: str = kwargs.pop( - "content_type", _headers.pop("Content-Type", "application/cloudevents-batch+json; charset=utf-8") + "content_type", + _headers.pop("Content-Type", "application/cloudevents-batch+json; charset=utf-8"), ) cls: ClsType[None] = kwargs.pop("cls", None) @@ -275,7 +286,12 @@ def publish_cloud_event_events( # pylint: disable=inconsistent-return-statement @overload def publish_custom_event_events( # pylint: disable=inconsistent-return-statements - self, topic_hostname: str, events: List[JSON], *, content_type: str = "application/json", **kwargs: Any + self, + topic_hostname: str, + events: List[JSON], + *, + content_type: str = "application/json", + **kwargs: Any, ) -> None: """Publishes a batch of events to an Azure Event Grid topic. @@ -294,7 +310,12 @@ def publish_custom_event_events( # pylint: disable=inconsistent-return-statemen @overload def publish_custom_event_events( # pylint: disable=inconsistent-return-statements - self, topic_hostname: str, events: IO, *, content_type: str = "application/json", **kwargs: Any + self, + topic_hostname: str, + events: IO, + *, + content_type: str = "application/json", + **kwargs: Any, ) -> None: """Publishes a batch of events to an Azure Event Grid topic. 
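The hunk above reformats a nested `kwargs.pop`/`headers.pop` chain that decides the request's content type. As a minimal sketch of that precedence (explicit `content_type` keyword wins, then a caller-supplied `Content-Type` header, then the CloudEvents batch default) — using plain dicts where the generated code uses azure-core's `case_insensitive_dict`:

```python
# Sketch of the content-type resolution in publish_cloud_event_events.
# Plain dicts stand in for case_insensitive_dict; both mappings are
# consumed, matching the pop() calls in the generated code.
DEFAULT_BATCH_TYPE = "application/cloudevents-batch+json; charset=utf-8"


def resolve_content_type(kwargs: dict, headers: dict) -> str:
    # Keyword argument beats header beats default.
    return kwargs.pop("content_type", headers.pop("Content-Type", DEFAULT_BATCH_TYPE))
```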
diff --git a/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_generated/_operations/_patch.py b/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_legacy/_generated/_operations/_patch.py similarity index 100% rename from sdk/eventgrid/azure-eventgrid/azure/eventgrid/_generated/_operations/_patch.py rename to sdk/eventgrid/azure-eventgrid/azure/eventgrid/_legacy/_generated/_operations/_patch.py diff --git a/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_generated/_patch.py b/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_legacy/_generated/_patch.py similarity index 100% rename from sdk/eventgrid/azure-eventgrid/azure/eventgrid/_generated/_patch.py rename to sdk/eventgrid/azure-eventgrid/azure/eventgrid/_legacy/_generated/_patch.py diff --git a/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_generated/_serialization.py b/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_legacy/_generated/_serialization.py similarity index 98% rename from sdk/eventgrid/azure-eventgrid/azure/eventgrid/_generated/_serialization.py rename to sdk/eventgrid/azure-eventgrid/azure/eventgrid/_legacy/_generated/_serialization.py index 842ae727fbbc..1e7a11b1d256 100644 --- a/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_generated/_serialization.py +++ b/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_legacy/_generated/_serialization.py @@ -63,7 +63,11 @@ import isodate # type: ignore -from azure.core.exceptions import DeserializationError, SerializationError, raise_with_traceback +from azure.core.exceptions import ( + DeserializationError, + SerializationError, + raise_with_traceback, +) from azure.core.serialization import NULL as AzureCoreNull _BOM = codecs.BOM_UTF8.decode(encoding="utf-8") @@ -73,7 +77,6 @@ class RawDeserializer: - # Accept "text" because we're open minded people... 
JSON_REGEXP = re.compile(r"^(application|text)/([a-z+.]+\+)?json$") @@ -115,7 +118,6 @@ def deserialize_from_text(cls, data: Optional[Union[AnyStr, IO]], content_type: raise DeserializationError("JSON is invalid: {}".format(err), err) elif "xml" in (content_type or []): try: - try: if isinstance(data, unicode): # type: ignore # If I'm Python 2.7 and unicode XML will scream if I try a "fromstring" on unicode string @@ -298,9 +300,17 @@ def __init__(self, **kwargs: Any) -> None: self.additional_properties: Dict[str, Any] = {} for k in kwargs: if k not in self._attribute_map: - _LOGGER.warning("%s is not a known attribute of class %s and will be ignored", k, self.__class__) + _LOGGER.warning( + "%s is not a known attribute of class %s and will be ignored", + k, + self.__class__, + ) elif k in self._validation and self._validation[k].get("readonly", False): - _LOGGER.warning("Readonly attribute %s will be ignored in class %s", k, self.__class__) + _LOGGER.warning( + "Readonly attribute %s will be ignored in class %s", + k, + self.__class__, + ) else: setattr(self, k, kwargs[k]) @@ -337,7 +347,11 @@ def _create_xml_node(cls): except AttributeError: xml_map = {} - return _create_xml_node(xml_map.get("name", cls.__name__), xml_map.get("prefix", None), xml_map.get("ns", None)) + return _create_xml_node( + xml_map.get("name", cls.__name__), + xml_map.get("prefix", None), + xml_map.get("ns", None), + ) def serialize(self, keep_readonly: bool = False, **kwargs: Any) -> JSON: """Return the JSON that would be sent to azure from this model. 
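The `Model.__init__` hunk above only reflows two `_LOGGER.warning` calls, but the pattern it wraps is worth seeing whole: unknown keyword arguments are logged and dropped, read-only attributes are logged and ignored, and everything else is set on the instance. A hypothetical, trimmed-down model class illustrating that flow (the `_attribute_map`/`_validation` contents here are invented for the example):

```python
import logging

_LOGGER = logging.getLogger(__name__)


class SketchModel:
    """Trimmed-down stand-in for _serialization.Model's kwargs handling."""

    # Invented maps for illustration; real models generate these.
    _attribute_map = {"name": {"key": "name"}, "id": {"key": "id"}}
    _validation = {"id": {"readonly": True}}

    def __init__(self, **kwargs):
        self.additional_properties = {}
        for k in kwargs:
            if k not in self._attribute_map:
                # Unknown attribute: warn and drop.
                _LOGGER.warning(
                    "%s is not a known attribute of class %s and will be ignored",
                    k,
                    self.__class__,
                )
            elif k in self._validation and self._validation[k].get("readonly", False):
                # Read-only attribute: warn and ignore.
                _LOGGER.warning(
                    "Readonly attribute %s will be ignored in class %s",
                    k,
                    self.__class__,
                )
            else:
                setattr(self, k, kwargs[k])
```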
@@ -486,7 +500,11 @@ def _classify(cls, response, objects): ) break else: - _LOGGER.warning("Discriminator %s is absent or null, use base class %s.", subtype_key, cls.__name__) + _LOGGER.warning( + "Discriminator %s is absent or null, use base class %s.", + subtype_key, + cls.__name__, + ) break return cls @@ -610,7 +628,6 @@ def _serialize(self, target_obj, data_type=None, **kwargs): serialized.update(target_obj.additional_properties) continue try: - orig_attr = getattr(target_obj, attr) if is_xml_model_serialization: pass # Don't provide "transformer" for XML for now. Keep "orig_attr" @@ -929,7 +946,11 @@ def serialize_iter(self, data, iter_type, div=None, **kwargs): if isinstance(el, ET.Element): el_node = el else: - el_node = _create_xml_node(node_name, xml_desc.get("prefix", None), xml_desc.get("ns", None)) + el_node = _create_xml_node( + node_name, + xml_desc.get("prefix", None), + xml_desc.get("ns", None), + ) if el is not None: # Otherwise it writes "None" :-p el_node.text = str(el) final_result.append(el_node) @@ -1155,7 +1176,12 @@ def serialize_iso(attr, **kwargs): if microseconds: microseconds = "." + microseconds date = "{:04}-{:02}-{:02}T{:02}:{:02}:{:02}".format( - utc.tm_year, utc.tm_mon, utc.tm_mday, utc.tm_hour, utc.tm_min, utc.tm_sec + utc.tm_year, + utc.tm_mon, + utc.tm_mday, + utc.tm_hour, + utc.tm_min, + utc.tm_sec, ) return date + microseconds + "Z" except (ValueError, OverflowError) as err: @@ -1532,7 +1558,8 @@ def failsafe_deserialize(self, target_obj, data, content_type=None): return self(target_obj, data, content_type=content_type) except: _LOGGER.debug( - "Ran into a deserialization error. Ignoring since this is failsafe deserialization", exc_info=True + "Ran into a deserialization error. 
Ignoring since this is failsafe deserialization", + exc_info=True, ) return None @@ -1811,7 +1838,11 @@ def deserialize_enum(data, enum_obj): if enum_value.value.lower() == str(data).lower(): return enum_value # We don't fail anymore for unknown value, we deserialize as a string - _LOGGER.warning("Deserializer is not able to find %s as valid enum in %s", data, enum_obj) + _LOGGER.warning( + "Deserializer is not able to find %s as valid enum in %s", + data, + enum_obj, + ) return Deserializer.deserialize_unicode(data) @staticmethod diff --git a/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_generated/_vendor.py b/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_legacy/_generated/_vendor.py similarity index 100% rename from sdk/eventgrid/azure-eventgrid/azure/eventgrid/_generated/_vendor.py rename to sdk/eventgrid/azure-eventgrid/azure/eventgrid/_legacy/_generated/_vendor.py diff --git a/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_generated/aio/__init__.py b/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_legacy/_generated/aio/__init__.py similarity index 100% rename from sdk/eventgrid/azure-eventgrid/azure/eventgrid/_generated/aio/__init__.py rename to sdk/eventgrid/azure-eventgrid/azure/eventgrid/_legacy/_generated/aio/__init__.py diff --git a/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_generated/aio/_client.py b/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_legacy/_generated/aio/_client.py similarity index 100% rename from sdk/eventgrid/azure-eventgrid/azure/eventgrid/_generated/aio/_client.py rename to sdk/eventgrid/azure-eventgrid/azure/eventgrid/_legacy/_generated/aio/_client.py diff --git a/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_generated/aio/_configuration.py b/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_legacy/_generated/aio/_configuration.py similarity index 100% rename from sdk/eventgrid/azure-eventgrid/azure/eventgrid/_generated/aio/_configuration.py rename to 
sdk/eventgrid/azure-eventgrid/azure/eventgrid/_legacy/_generated/aio/_configuration.py diff --git a/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_generated/aio/_operations/__init__.py b/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_legacy/_generated/aio/_operations/__init__.py similarity index 100% rename from sdk/eventgrid/azure-eventgrid/azure/eventgrid/_generated/aio/_operations/__init__.py rename to sdk/eventgrid/azure-eventgrid/azure/eventgrid/_legacy/_generated/aio/_operations/__init__.py diff --git a/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_generated/aio/_operations/_operations.py b/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_legacy/_generated/aio/_operations/_operations.py similarity index 99% rename from sdk/eventgrid/azure-eventgrid/azure/eventgrid/_generated/aio/_operations/_operations.py rename to sdk/eventgrid/azure-eventgrid/azure/eventgrid/_legacy/_generated/aio/_operations/_operations.py index 317fd4bfb161..ba6dc3ff67e5 100644 --- a/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_generated/aio/_operations/_operations.py +++ b/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_legacy/_generated/aio/_operations/_operations.py @@ -179,7 +179,8 @@ async def publish_cloud_event_events( # pylint: disable=inconsistent-return-sta _params = kwargs.pop("params", {}) or {} content_type: str = kwargs.pop( - "content_type", _headers.pop("Content-Type", "application/cloudevents-batch+json; charset=utf-8") + "content_type", + _headers.pop("Content-Type", "application/cloudevents-batch+json; charset=utf-8"), ) cls: ClsType[None] = kwargs.pop("cls", None) diff --git a/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_generated/aio/_operations/_patch.py b/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_legacy/_generated/aio/_operations/_patch.py similarity index 100% rename from sdk/eventgrid/azure-eventgrid/azure/eventgrid/_generated/aio/_operations/_patch.py rename to sdk/eventgrid/azure-eventgrid/azure/eventgrid/_legacy/_generated/aio/_operations/_patch.py diff 
--git a/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_generated/aio/_patch.py b/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_legacy/_generated/aio/_patch.py similarity index 100% rename from sdk/eventgrid/azure-eventgrid/azure/eventgrid/_generated/aio/_patch.py rename to sdk/eventgrid/azure-eventgrid/azure/eventgrid/_legacy/_generated/aio/_patch.py diff --git a/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_generated/aio/_vendor.py b/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_legacy/_generated/aio/_vendor.py similarity index 100% rename from sdk/eventgrid/azure-eventgrid/azure/eventgrid/_generated/aio/_vendor.py rename to sdk/eventgrid/azure-eventgrid/azure/eventgrid/_legacy/_generated/aio/_vendor.py diff --git a/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_generated/models/__init__.py b/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_legacy/_generated/models/__init__.py similarity index 100% rename from sdk/eventgrid/azure-eventgrid/azure/eventgrid/_generated/models/__init__.py rename to sdk/eventgrid/azure-eventgrid/azure/eventgrid/_legacy/_generated/models/__init__.py diff --git a/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_generated/models/_models.py b/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_legacy/_generated/models/_models.py similarity index 100% rename from sdk/eventgrid/azure-eventgrid/azure/eventgrid/_generated/models/_models.py rename to sdk/eventgrid/azure-eventgrid/azure/eventgrid/_legacy/_generated/models/_models.py diff --git a/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_generated/models/_patch.py b/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_legacy/_generated/models/_patch.py similarity index 100% rename from sdk/eventgrid/azure-eventgrid/azure/eventgrid/_generated/models/_patch.py rename to sdk/eventgrid/azure-eventgrid/azure/eventgrid/_legacy/_generated/models/_patch.py diff --git a/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_generated/py.typed b/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_legacy/_generated/py.typed 
similarity index 100% rename from sdk/eventgrid/azure-eventgrid/azure/eventgrid/_generated/py.typed rename to sdk/eventgrid/azure-eventgrid/azure/eventgrid/_legacy/_generated/py.typed diff --git a/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_helpers.py b/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_legacy/_helpers.py similarity index 80% rename from sdk/eventgrid/azure-eventgrid/azure/eventgrid/_helpers.py rename to sdk/eventgrid/azure-eventgrid/azure/eventgrid/_legacy/_helpers.py index e824426427d3..c0623fe16b58 100644 --- a/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_helpers.py +++ b/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_legacy/_helpers.py @@ -11,7 +11,10 @@ from urllib.parse import quote from azure.core.pipeline.transport import HttpRequest -from azure.core.pipeline.policies import AzureKeyCredentialPolicy, BearerTokenCredentialPolicy +from azure.core.pipeline.policies import ( + AzureKeyCredentialPolicy, + BearerTokenCredentialPolicy, +) from azure.core.credentials import AzureKeyCredential, AzureSasCredential from ._generated._serialization import Serializer from ._signature_credential_policy import EventGridSasCredentialPolicy @@ -24,6 +27,7 @@ if TYPE_CHECKING: from datetime import datetime + def generate_sas( endpoint: str, shared_access_key: str, @@ -52,19 +56,16 @@ def generate_sas( :caption: Generate a shared access signature. 
""" - full_endpoint = "{}?apiVersion={}".format( - endpoint, api_version - ) + full_endpoint = "{}?apiVersion={}".format(endpoint, api_version) encoded_resource = quote(full_endpoint, safe=constants.SAFE_ENCODE) encoded_expiration_utc = quote(str(expiration_date_utc), safe=constants.SAFE_ENCODE) unsigned_sas = "r={}&e={}".format(encoded_resource, encoded_expiration_utc) - signature = quote( - _generate_hmac(shared_access_key, unsigned_sas), safe=constants.SAFE_ENCODE - ) + signature = quote(_generate_hmac(shared_access_key, unsigned_sas), safe=constants.SAFE_ENCODE) signed_sas = "{}&s={}".format(unsigned_sas, signature) return signed_sas + def _generate_hmac(key, message): decoded_key = base64.b64decode(key) bytes_message = message.encode("ascii") @@ -77,18 +78,11 @@ def _get_authentication_policy(credential, bearer_token_policy=BearerTokenCreden if credential is None: raise ValueError("Parameter 'self._credential' must not be None.") if hasattr(credential, "get_token"): - return bearer_token_policy( - credential, - constants.DEFAULT_EVENTGRID_SCOPE - ) + return bearer_token_policy(credential, constants.DEFAULT_EVENTGRID_SCOPE) if isinstance(credential, AzureKeyCredential): - return AzureKeyCredentialPolicy( - credential=credential, name=constants.EVENTGRID_KEY_HEADER - ) + return AzureKeyCredentialPolicy(credential=credential, name=constants.EVENTGRID_KEY_HEADER) if isinstance(credential, AzureSasCredential): - return EventGridSasCredentialPolicy( - credential=credential, name=constants.EVENTGRID_TOKEN_HEADER - ) + return EventGridSasCredentialPolicy(credential=credential, name=constants.EVENTGRID_TOKEN_HEADER) raise ValueError( "The provided credential should be an instance of a TokenCredential, AzureSasCredential or AzureKeyCredential" ) @@ -102,7 +96,8 @@ def _is_cloud_event(event): except TypeError: return False -def _is_eventgrid_event(event): + +def _is_eventgrid_event_format(event): # type: (Any) -> bool required = ("subject", "eventType", "data", 
"dataVersion", "id", "eventTime") try: @@ -123,6 +118,7 @@ def _eventgrid_data_typecheck(event): "https://docs.microsoft.com/en-us/azure/event-grid/event-schema" ) + def _cloud_event_to_generated(cloud_event, **kwargs): if isinstance(cloud_event.data, bytes): data_base64 = cloud_event.data @@ -142,10 +138,11 @@ def _cloud_event_to_generated(cloud_event, **kwargs): datacontenttype=cloud_event.datacontenttype, subject=cloud_event.subject, additional_properties=cloud_event.extensions, - **kwargs + **kwargs, ) -def _from_cncf_events(event): # pylint: disable=inconsistent-return-statements + +def _from_cncf_events(event): # pylint: disable=inconsistent-return-statements """This takes in a CNCF cloudevent and returns a dictionary. If cloud events library is not installed, the event is returned back. @@ -156,11 +153,12 @@ def _from_cncf_events(event): # pylint: disable=inconsistent-return-statements """ try: from cloudevents.http import to_json + return json.loads(to_json(event)) except (AttributeError, ImportError): # means this is not a CNCF event return event - except Exception as err: # pylint: disable=broad-except + except Exception as err: # pylint: disable=broad-except msg = """Failed to serialize the event. 
Please ensure your CloudEvents is correctly formatted (https://pypi.org/project/cloudevents/)""" raise ValueError(msg) from err @@ -169,26 +167,21 @@ def _from_cncf_events(event): # pylint: disable=inconsistent-return-statements def _build_request(endpoint, content_type, events, *, channel_name=None, api_version=constants.DEFAULT_API_VERSION): serialize = Serializer() header_parameters: Dict[str, Any] = {} - header_parameters['Content-Type'] = serialize.header("content_type", content_type, 'str') + header_parameters["Content-Type"] = serialize.header("content_type", content_type, "str") if channel_name: - header_parameters['aeg-channel-name'] = channel_name + header_parameters["aeg-channel-name"] = channel_name query_parameters: Dict[str, Any] = {} - query_parameters['api-version'] = serialize.query("api_version", api_version, 'str') + query_parameters["api-version"] = serialize.query("api_version", api_version, "str") - body = serialize.body(events, '[object]') + body = serialize.body(events, "[object]") if body is None: data = None else: data = json.dumps(body) - header_parameters['Content-Length'] = str(len(data)) + header_parameters["Content-Length"] = str(len(data)) - request = HttpRequest( - method="POST", - url=endpoint, - headers=header_parameters, - data=data - ) + request = HttpRequest(method="POST", url=endpoint, headers=header_parameters, data=data) request.format_parameters(query_parameters) return request diff --git a/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_messaging_shared.py b/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_legacy/_messaging_shared.py similarity index 92% rename from sdk/eventgrid/azure-eventgrid/azure/eventgrid/_messaging_shared.py rename to sdk/eventgrid/azure-eventgrid/azure/eventgrid/_legacy/_messaging_shared.py index 2ddd520a8648..107ebc6d670d 100644 --- a/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_messaging_shared.py +++ b/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_legacy/_messaging_shared.py @@ -11,10 +11,10 @@ 
# ========================================================================== - import json -def _get_json_content(obj): # pylint: disable=inconsistent-return-statements + +def _get_json_content(obj): # pylint: disable=inconsistent-return-statements """Event mixin to have methods that are common to different Event types like CloudEvent, EventGridEvent etc. @@ -39,7 +39,7 @@ def _get_json_content(obj): # pylint: disable=inconsistent-return-statements return json.loads(next(obj.body)) except ValueError as err: raise ValueError(msg) from err - except: # pylint: disable=bare-except + except: # pylint: disable=bare-except try: return json.loads(obj) except ValueError as err: diff --git a/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_models.py b/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_legacy/_models.py similarity index 100% rename from sdk/eventgrid/azure-eventgrid/azure/eventgrid/_models.py rename to sdk/eventgrid/azure-eventgrid/azure/eventgrid/_legacy/_models.py diff --git a/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_policies.py b/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_legacy/_policies.py similarity index 94% rename from sdk/eventgrid/azure-eventgrid/azure/eventgrid/_policies.py rename to sdk/eventgrid/azure-eventgrid/azure/eventgrid/_legacy/_policies.py index 13a19d60bbcd..d4f2dcfd5638 100644 --- a/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_policies.py +++ b/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_legacy/_policies.py @@ -36,11 +36,9 @@ def on_request(self, request): return if ( - request.http_request.headers["content-type"] - == CloudEventDistributedTracingPolicy._CONTENT_TYPE + request.http_request.headers["content-type"] == CloudEventDistributedTracingPolicy._CONTENT_TYPE and traceparent is not None ): - body = json.loads(request.http_request.body) for item in body: if "traceparent" not in item and "tracestate" not in item: diff --git a/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_publisher_client.py 
b/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_legacy/_publisher_client.py similarity index 90% rename from sdk/eventgrid/azure-eventgrid/azure/eventgrid/_publisher_client.py rename to sdk/eventgrid/azure-eventgrid/azure/eventgrid/_legacy/_publisher_client.py index a16b69e5bf13..faba67dcbb52 100644 --- a/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_publisher_client.py +++ b/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_legacy/_publisher_client.py @@ -26,7 +26,7 @@ HttpResponseError, ResourceNotFoundError, ResourceExistsError, - map_error + map_error, ) from azure.core.messaging import CloudEvent @@ -34,7 +34,7 @@ from ._helpers import ( _get_authentication_policy, _is_cloud_event, - _is_eventgrid_event, + _is_eventgrid_event_format, _eventgrid_data_typecheck, _build_request, _cloud_event_to_generated, @@ -71,7 +71,7 @@ ListEventType = Union[List[CloudEvent], List[EventGridEvent], List[Dict]] -class EventGridPublisherClient(object): # pylint: disable=client-accepts-api-version-keyword +class EventGridPublisherClient(object): # pylint: disable=client-accepts-api-version-keyword """EventGridPublisherClient publishes events to an EventGrid topic or domain. It can be used to publish either an EventGridEvent, a CloudEvent or a Custom Schema. 
@@ -103,13 +103,13 @@ class EventGridPublisherClient(object): # pylint: disable=client-accepts-api-ver """ def __init__( - self, - endpoint: str, - credential: Union["AzureKeyCredential", "AzureSasCredential", "TokenCredential"], - *, - api_version: Optional[str] = None, - **kwargs: Any - ) -> None: + self, + endpoint: str, + credential: Union["AzureKeyCredential", "AzureSasCredential", "TokenCredential"], + *, + api_version: Optional[str] = None, + **kwargs: Any + ) -> None: self._endpoint = endpoint self._client = EventGridPublisherClientImpl( policies=EventGridPublisherClient._policies(credential, **kwargs), **kwargs @@ -139,13 +139,7 @@ def _policies(credential, **kwargs): return policies @distributed_trace - def send( - self, - events: SendType, - *, - channel_name: Optional[str] = None, - **kwargs: Any - ) -> None: + def send(self, events: SendType, *, channel_name: Optional[str] = None, **kwargs: Any) -> None: """Sends events to a topic or a domain specified during the client initialization. A single instance or a list of dictionaries, CloudEvents or EventGridEvents are accepted. 
@@ -218,25 +212,25 @@ def send( content_type = kwargs.pop("content_type", "application/json; charset=utf-8") if isinstance(events[0], CloudEvent) or _is_cloud_event(events[0]): try: - events = [ - _cloud_event_to_generated(e, **kwargs) - for e in events # pylint: disable=protected-access - ] + events = [_cloud_event_to_generated(e, **kwargs) for e in events] # pylint: disable=protected-access except AttributeError: ## this is either a dictionary or a CNCF cloud event - events = [ - _from_cncf_events(e) for e in events - ] + events = [_from_cncf_events(e) for e in events] content_type = "application/cloudevents-batch+json; charset=utf-8" - elif isinstance(events[0], EventGridEvent) or _is_eventgrid_event(events[0]): + elif isinstance(events[0], EventGridEvent) or _is_eventgrid_event_format(events[0]): for event in events: _eventgrid_data_typecheck(event) response = self._client.send_request( # pylint: disable=protected-access _build_request( - self._endpoint,content_type, events, channel_name=channel_name, api_version=self._api_version), - **kwargs + self._endpoint, content_type, events, channel_name=channel_name, api_version=self._api_version + ), + **kwargs ) - error_map = {401: ClientAuthenticationError, 404: ResourceNotFoundError, 409: ResourceExistsError} + error_map = { + 401: ClientAuthenticationError, + 404: ResourceNotFoundError, + 409: ResourceExistsError, + } if response.status_code != 200: map_error(status_code=response.status_code, response=response, error_map=error_map) raise HttpResponseError(response=response) diff --git a/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_signature_credential_policy.py b/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_legacy/_signature_credential_policy.py similarity index 100% rename from sdk/eventgrid/azure-eventgrid/azure/eventgrid/_signature_credential_policy.py rename to sdk/eventgrid/azure-eventgrid/azure/eventgrid/_legacy/_signature_credential_policy.py diff --git 
a/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_legacy/_version.py b/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_legacy/_version.py new file mode 100644 index 000000000000..c9812c54f54c --- /dev/null +++ b/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_legacy/_version.py @@ -0,0 +1,12 @@ +# coding=utf-8 +# -------------------------------------------------------------------------- +# Copyright (c) Microsoft Corporation. All rights reserved. +# Licensed under the MIT License. See License.txt in the project root for +# license information. +# +# Code generated by Microsoft (R) AutoRest Code Generator. +# Changes may cause incorrect behavior and will be lost if the code is +# regenerated. +# -------------------------------------------------------------------------- + +VERSION = "4.10.0" diff --git a/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_legacy/aio/__init__.py b/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_legacy/aio/__init__.py new file mode 100644 index 000000000000..0d2dce7aaea2 --- /dev/null +++ b/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_legacy/aio/__init__.py @@ -0,0 +1,9 @@ +# coding=utf-8 +# ------------------------------------ +# Copyright (c) Microsoft Corporation. +# Licensed under the MIT License. 
+# ------------------------------------ + +from ._publisher_client_async import EventGridPublisherClient + +__all__ = ["EventGridPublisherClient"] diff --git a/sdk/eventgrid/azure-eventgrid/azure/eventgrid/aio/_publisher_client_async.py b/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_legacy/aio/_publisher_client_async.py similarity index 90% rename from sdk/eventgrid/azure-eventgrid/azure/eventgrid/aio/_publisher_client_async.py rename to sdk/eventgrid/azure-eventgrid/azure/eventgrid/_legacy/aio/_publisher_client_async.py index 69f6654ab145..de3cbd309291 100644 --- a/sdk/eventgrid/azure-eventgrid/azure/eventgrid/aio/_publisher_client_async.py +++ b/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_legacy/aio/_publisher_client_async.py @@ -29,13 +29,13 @@ HttpResponseError, ResourceNotFoundError, ResourceExistsError, - map_error + map_error, ) from .._policies import CloudEventDistributedTracingPolicy from .._models import EventGridEvent from .._helpers import ( _is_cloud_event, - _is_eventgrid_event, + _is_eventgrid_event_format, _eventgrid_data_typecheck, _build_request, _cloud_event_to_generated, @@ -65,7 +65,7 @@ ListEventType = Union[List[CloudEvent], List[EventGridEvent], List[Dict]] -class EventGridPublisherClient: # pylint: disable=client-accepts-api-version-keyword +class EventGridPublisherClient: # pylint: disable=client-accepts-api-version-keyword """Asynchronous EventGridPublisherClient publishes events to an EventGrid topic or domain. It can be used to publish either an EventGridEvent, a CloudEvent or a Custom Schema. 
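Both the sync and async `send()` implementations end with the same error mapping: well-known HTTP status codes raise specific exception types, and any other non-200 response raises the generic HTTP error. A self-contained sketch of that check — the stand-in exception classes below are local substitutes for the real ones in `azure.core.exceptions`:

```python
# Local stand-ins for azure.core.exceptions types.
class HttpResponseError(Exception):
    pass


class ClientAuthenticationError(HttpResponseError):
    pass


class ResourceNotFoundError(HttpResponseError):
    pass


class ResourceExistsError(HttpResponseError):
    pass


# Mirrors the error_map dict built after send_request().
ERROR_MAP = {
    401: ClientAuthenticationError,
    404: ResourceNotFoundError,
    409: ResourceExistsError,
}


def raise_for_status(status_code: int) -> None:
    # Only a 200 is treated as success; everything else raises the
    # mapped exception, falling back to the generic HTTP error.
    if status_code == 200:
        return
    raise ERROR_MAP.get(status_code, HttpResponseError)(
        "request failed with status {}".format(status_code)
    )
```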
@@ -99,9 +99,7 @@ class EventGridPublisherClient: # pylint: disable=client-accepts-api-version-key def __init__( self, endpoint: str, - credential: Union[ - "AsyncTokenCredential", AzureKeyCredential, AzureSasCredential - ], + credential: Union["AsyncTokenCredential", AzureKeyCredential, AzureSasCredential], *, api_version: Optional[str] = None, **kwargs: Any @@ -114,14 +112,9 @@ def __init__( @staticmethod def _policies( - credential: Union[ - AzureKeyCredential, AzureSasCredential, "AsyncTokenCredential" - ], - **kwargs: Any + credential: Union[AzureKeyCredential, AzureSasCredential, "AsyncTokenCredential"], **kwargs: Any ) -> List[Any]: - auth_policy = _get_authentication_policy( - credential, AsyncBearerTokenCredentialPolicy - ) + auth_policy = _get_authentication_policy(credential, AsyncBearerTokenCredentialPolicy) sdk_moniker = "eventgridpublisherclient/{}".format(VERSION) policies = [ RequestIdPolicy(**kwargs), @@ -214,25 +207,25 @@ async def send(self, events: SendType, *, channel_name: Optional[str] = None, ** content_type = kwargs.pop("content_type", "application/json; charset=utf-8") if isinstance(events[0], CloudEvent) or _is_cloud_event(events[0]): try: - events = [ - _cloud_event_to_generated(e, **kwargs) - for e in events # pylint: disable=protected-access - ] + events = [_cloud_event_to_generated(e, **kwargs) for e in events] # pylint: disable=protected-access except AttributeError: ## this is either a dictionary or a CNCF cloud event - events = [ - _from_cncf_events(e) for e in events - ] + events = [_from_cncf_events(e) for e in events] content_type = "application/cloudevents-batch+json; charset=utf-8" - elif isinstance(events[0], EventGridEvent) or _is_eventgrid_event(events[0]): + elif isinstance(events[0], EventGridEvent) or _is_eventgrid_event_format(events[0]): for event in events: _eventgrid_data_typecheck(event) response = await self._client.send_request( # pylint: disable=protected-access - _build_request(self._endpoint, content_type, 
events, - channel_name=channel_name, api_version=self._api_version), + _build_request( + self._endpoint, content_type, events, channel_name=channel_name, api_version=self._api_version + ), **kwargs ) - error_map = {401: ClientAuthenticationError, 404: ResourceNotFoundError, 409: ResourceExistsError} + error_map = { + 401: ClientAuthenticationError, + 404: ResourceNotFoundError, + 409: ResourceExistsError, + } if response.status_code != 200: map_error(status_code=response.status_code, response=response, error_map=error_map) raise HttpResponseError(response=response) diff --git a/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_model_base.py b/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_model_base.py new file mode 100644 index 000000000000..5cf70733404d --- /dev/null +++ b/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_model_base.py @@ -0,0 +1,887 @@ +# coding=utf-8 +# -------------------------------------------------------------------------- +# Copyright (c) Microsoft Corporation. All rights reserved. +# Licensed under the MIT License. See License.txt in the project root for +# license information. 
+# -------------------------------------------------------------------------- +# pylint: disable=protected-access, arguments-differ, signature-differs, broad-except + +import copy +import calendar +import decimal +import functools +import sys +import logging +import base64 +import re +import typing +import enum +import email.utils +from datetime import datetime, date, time, timedelta, timezone +from json import JSONEncoder +from typing_extensions import Self +import isodate +from azure.core.exceptions import DeserializationError +from azure.core import CaseInsensitiveEnumMeta +from azure.core.pipeline import PipelineResponse +from azure.core.serialization import _Null + +if sys.version_info >= (3, 9): + from collections.abc import MutableMapping +else: + from typing import MutableMapping + +_LOGGER = logging.getLogger(__name__) + +__all__ = ["SdkJSONEncoder", "Model", "rest_field", "rest_discriminator"] + +TZ_UTC = timezone.utc +_T = typing.TypeVar("_T") + + +def _timedelta_as_isostr(td: timedelta) -> str: + """Converts a datetime.timedelta object into an ISO 8601 formatted string, e.g. 
'P4DT12H30M05S'
+
+    Function adapted from the Tin Can Python project: https://github.com/RusticiSoftware/TinCanPython
+
+    :param timedelta td: The timedelta to convert
+    :rtype: str
+    :return: ISO8601 version of this timedelta
+    """
+
+    # Split seconds to larger units
+    seconds = td.total_seconds()
+    minutes, seconds = divmod(seconds, 60)
+    hours, minutes = divmod(minutes, 60)
+    days, hours = divmod(hours, 24)
+
+    days, hours, minutes = list(map(int, (days, hours, minutes)))
+    seconds = round(seconds, 6)
+
+    # Build date
+    date_str = ""
+    if days:
+        date_str = "%sD" % days
+
+    if hours or minutes or seconds:
+        # Build time
+        time_str = "T"
+
+        # Hours
+        bigger_exists = date_str or hours
+        if bigger_exists:
+            time_str += "{:02}H".format(hours)
+
+        # Minutes
+        bigger_exists = bigger_exists or minutes
+        if bigger_exists:
+            time_str += "{:02}M".format(minutes)
+
+        # Seconds
+        try:
+            if seconds.is_integer():
+                seconds_string = "{:02}".format(int(seconds))
+            else:
+                # 9 chars long w/ leading 0, 6 digits after decimal
+                seconds_string = "%09.6f" % seconds
+                # Remove trailing zeros
+                seconds_string = seconds_string.rstrip("0")
+        except AttributeError:  # int.is_integer() raises
+            seconds_string = "{:02}".format(seconds)
+
+        time_str += "{}S".format(seconds_string)
+    else:
+        time_str = ""
+
+    return "P" + date_str + time_str
+
+
+def _serialize_bytes(o, format: typing.Optional[str] = None) -> str:
+    encoded = base64.b64encode(o).decode()
+    if format == "base64url":
+        return encoded.strip("=").replace("+", "-").replace("/", "_")
+    return encoded
+
+
+def _serialize_datetime(o, format: typing.Optional[str] = None):
+    if hasattr(o, "year") and hasattr(o, "hour"):
+        if format == "rfc7231":
+            return email.utils.format_datetime(o, usegmt=True)
+        if format == "unix-timestamp":
+            return int(calendar.timegm(o.utctimetuple()))
+
+        # astimezone() fails for naive times in Python 2.7, so make sure o is aware (tzinfo is set)
+        if not o.tzinfo:
+            iso_formatted =
o.replace(tzinfo=TZ_UTC).isoformat() + else: + iso_formatted = o.astimezone(TZ_UTC).isoformat() + # Replace the trailing "+00:00" UTC offset with "Z" (RFC 3339: https://www.ietf.org/rfc/rfc3339.txt) + return iso_formatted.replace("+00:00", "Z") + # Next try datetime.date or datetime.time + return o.isoformat() + + +def _is_readonly(p): + try: + return p._visibility == ["read"] # pylint: disable=protected-access + except AttributeError: + return False + + +class SdkJSONEncoder(JSONEncoder): + """A JSON encoder that's capable of serializing datetime objects and bytes.""" + + def __init__(self, *args, exclude_readonly: bool = False, format: typing.Optional[str] = None, **kwargs): + super().__init__(*args, **kwargs) + self.exclude_readonly = exclude_readonly + self.format = format + + def default(self, o): # pylint: disable=too-many-return-statements + if _is_model(o): + if self.exclude_readonly: + readonly_props = [p._rest_name for p in o._attr_to_rest_field.values() if _is_readonly(p)] + return {k: v for k, v in o.items() if k not in readonly_props} + return dict(o.items()) + try: + return super(SdkJSONEncoder, self).default(o) + except TypeError: + if isinstance(o, _Null): + return None + if isinstance(o, decimal.Decimal): + return float(o) + if isinstance(o, (bytes, bytearray)): + return _serialize_bytes(o, self.format) + try: + # First try datetime.datetime + return _serialize_datetime(o, self.format) + except AttributeError: + pass + # Last, try datetime.timedelta + try: + return _timedelta_as_isostr(o) + except AttributeError: + # This will be raised when it hits value.total_seconds in the method above + pass + return super(SdkJSONEncoder, self).default(o) + + +_VALID_DATE = re.compile(r"\d{4}[-]\d{2}[-]\d{2}T\d{2}:\d{2}:\d{2}" + r"\.?\d*Z?[-+]?[\d{2}]?:?[\d{2}]?") +_VALID_RFC7231 = re.compile( + r"(Mon|Tue|Wed|Thu|Fri|Sat|Sun),\s\d{2}\s" + r"(Jan|Feb|Mar|Apr|May|Jun|Jul|Aug|Sep|Oct|Nov|Dec)\s\d{4}\s\d{2}:\d{2}:\d{2}\sGMT" +) + + +def _deserialize_datetime(attr: 
typing.Union[str, datetime]) -> datetime: + """Deserialize ISO-8601 formatted string into Datetime object. + + :param str attr: response string to be deserialized. + :rtype: ~datetime.datetime + :returns: The datetime object from that input + """ + if isinstance(attr, datetime): + # i'm already deserialized + return attr + attr = attr.upper() + match = _VALID_DATE.match(attr) + if not match: + raise ValueError("Invalid datetime string: " + attr) + + check_decimal = attr.split(".") + if len(check_decimal) > 1: + decimal_str = "" + for digit in check_decimal[1]: + if digit.isdigit(): + decimal_str += digit + else: + break + if len(decimal_str) > 6: + attr = attr.replace(decimal_str, decimal_str[0:6]) + + date_obj = isodate.parse_datetime(attr) + test_utc = date_obj.utctimetuple() + if test_utc.tm_year > 9999 or test_utc.tm_year < 1: + raise OverflowError("Hit max or min date") + return date_obj + + +def _deserialize_datetime_rfc7231(attr: typing.Union[str, datetime]) -> datetime: + """Deserialize RFC7231 formatted string into Datetime object. + + :param str attr: response string to be deserialized. + :rtype: ~datetime.datetime + :returns: The datetime object from that input + """ + if isinstance(attr, datetime): + # i'm already deserialized + return attr + match = _VALID_RFC7231.match(attr) + if not match: + raise ValueError("Invalid datetime string: " + attr) + + return email.utils.parsedate_to_datetime(attr) + + +def _deserialize_datetime_unix_timestamp(attr: typing.Union[float, datetime]) -> datetime: + """Deserialize unix timestamp into Datetime object. + + :param str attr: response string to be deserialized. + :rtype: ~datetime.datetime + :returns: The datetime object from that input + """ + if isinstance(attr, datetime): + # i'm already deserialized + return attr + return datetime.fromtimestamp(attr, TZ_UTC) + + +def _deserialize_date(attr: typing.Union[str, date]) -> date: + """Deserialize ISO-8601 formatted string into Date object. 
+ :param str attr: response string to be deserialized. + :rtype: date + :returns: The date object from that input + """ + # This must NOT use defaultmonth/defaultday. Using None ensure this raises an exception. + if isinstance(attr, date): + return attr + return isodate.parse_date(attr, defaultmonth=None, defaultday=None) # type: ignore + + +def _deserialize_time(attr: typing.Union[str, time]) -> time: + """Deserialize ISO-8601 formatted string into time object. + + :param str attr: response string to be deserialized. + :rtype: datetime.time + :returns: The time object from that input + """ + if isinstance(attr, time): + return attr + return isodate.parse_time(attr) + + +def _deserialize_bytes(attr): + if isinstance(attr, (bytes, bytearray)): + return attr + return bytes(base64.b64decode(attr)) + + +def _deserialize_bytes_base64(attr): + if isinstance(attr, (bytes, bytearray)): + return attr + padding = "=" * (3 - (len(attr) + 3) % 4) # type: ignore + attr = attr + padding # type: ignore + encoded = attr.replace("-", "+").replace("_", "/") + return bytes(base64.b64decode(encoded)) + + +def _deserialize_duration(attr): + if isinstance(attr, timedelta): + return attr + return isodate.parse_duration(attr) + + +def _deserialize_decimal(attr): + if isinstance(attr, decimal.Decimal): + return attr + return decimal.Decimal(str(attr)) + + +_DESERIALIZE_MAPPING = { + datetime: _deserialize_datetime, + date: _deserialize_date, + time: _deserialize_time, + bytes: _deserialize_bytes, + bytearray: _deserialize_bytes, + timedelta: _deserialize_duration, + typing.Any: lambda x: x, + decimal.Decimal: _deserialize_decimal, +} + +_DESERIALIZE_MAPPING_WITHFORMAT = { + "rfc3339": _deserialize_datetime, + "rfc7231": _deserialize_datetime_rfc7231, + "unix-timestamp": _deserialize_datetime_unix_timestamp, + "base64": _deserialize_bytes, + "base64url": _deserialize_bytes_base64, +} + + +def get_deserializer(annotation: typing.Any, rf: typing.Optional["_RestField"] = None): + if rf and 
rf._format: + return _DESERIALIZE_MAPPING_WITHFORMAT.get(rf._format) + return _DESERIALIZE_MAPPING.get(annotation) + + +def _get_type_alias_type(module_name: str, alias_name: str): + types = { + k: v + for k, v in sys.modules[module_name].__dict__.items() + if isinstance(v, typing._GenericAlias) # type: ignore + } + if alias_name not in types: + return alias_name + return types[alias_name] + + +def _get_model(module_name: str, model_name: str): + models = {k: v for k, v in sys.modules[module_name].__dict__.items() if isinstance(v, type)} + module_end = module_name.rsplit(".", 1)[0] + models.update({k: v for k, v in sys.modules[module_end].__dict__.items() if isinstance(v, type)}) + if isinstance(model_name, str): + model_name = model_name.split(".")[-1] + if model_name not in models: + return model_name + return models[model_name] + + +_UNSET = object() + + +class _MyMutableMapping(MutableMapping[str, typing.Any]): # pylint: disable=unsubscriptable-object + def __init__(self, data: typing.Dict[str, typing.Any]) -> None: + self._data = data + + def __contains__(self, key: typing.Any) -> bool: + return key in self._data + + def __getitem__(self, key: str) -> typing.Any: + return self._data.__getitem__(key) + + def __setitem__(self, key: str, value: typing.Any) -> None: + self._data.__setitem__(key, value) + + def __delitem__(self, key: str) -> None: + self._data.__delitem__(key) + + def __iter__(self) -> typing.Iterator[typing.Any]: + return self._data.__iter__() + + def __len__(self) -> int: + return self._data.__len__() + + def __ne__(self, other: typing.Any) -> bool: + return not self.__eq__(other) + + def keys(self) -> typing.KeysView[str]: + return self._data.keys() + + def values(self) -> typing.ValuesView[typing.Any]: + return self._data.values() + + def items(self) -> typing.ItemsView[str, typing.Any]: + return self._data.items() + + def get(self, key: str, default: typing.Any = None) -> typing.Any: + try: + return self[key] + except KeyError: + return 
default + + @typing.overload + def pop(self, key: str) -> typing.Any: ... + + @typing.overload + def pop(self, key: str, default: _T) -> _T: ... + + @typing.overload + def pop(self, key: str, default: typing.Any) -> typing.Any: ... + + def pop(self, key: str, default: typing.Any = _UNSET) -> typing.Any: + if default is _UNSET: + return self._data.pop(key) + return self._data.pop(key, default) + + def popitem(self) -> typing.Tuple[str, typing.Any]: + return self._data.popitem() + + def clear(self) -> None: + self._data.clear() + + def update(self, *args: typing.Any, **kwargs: typing.Any) -> None: + self._data.update(*args, **kwargs) + + @typing.overload + def setdefault(self, key: str, default: None = None) -> None: ... + + @typing.overload + def setdefault(self, key: str, default: typing.Any) -> typing.Any: ... + + def setdefault(self, key: str, default: typing.Any = _UNSET) -> typing.Any: + if default is _UNSET: + return self._data.setdefault(key) + return self._data.setdefault(key, default) + + def __eq__(self, other: typing.Any) -> bool: + try: + other_model = self.__class__(other) + except Exception: + return False + return self._data == other_model._data + + def __repr__(self) -> str: + return str(self._data) + + +def _is_model(obj: typing.Any) -> bool: + return getattr(obj, "_is_model", False) + + +def _serialize(o, format: typing.Optional[str] = None): # pylint: disable=too-many-return-statements + if isinstance(o, list): + return [_serialize(x, format) for x in o] + if isinstance(o, dict): + return {k: _serialize(v, format) for k, v in o.items()} + if isinstance(o, set): + return {_serialize(x, format) for x in o} + if isinstance(o, tuple): + return tuple(_serialize(x, format) for x in o) + if isinstance(o, (bytes, bytearray)): + return _serialize_bytes(o, format) + if isinstance(o, decimal.Decimal): + return float(o) + if isinstance(o, enum.Enum): + return o.value + try: + # First try datetime.datetime + return _serialize_datetime(o, format) + except 
AttributeError: + pass + # Last, try datetime.timedelta + try: + return _timedelta_as_isostr(o) + except AttributeError: + # This will be raised when it hits value.total_seconds in the method above + pass + return o + + +def _get_rest_field( + attr_to_rest_field: typing.Dict[str, "_RestField"], rest_name: str +) -> typing.Optional["_RestField"]: + try: + return next(rf for rf in attr_to_rest_field.values() if rf._rest_name == rest_name) + except StopIteration: + return None + + +def _create_value(rf: typing.Optional["_RestField"], value: typing.Any) -> typing.Any: + if not rf: + return _serialize(value, None) + if rf._is_multipart_file_input: + return value + if rf._is_model: + return _deserialize(rf._type, value) + return _serialize(value, rf._format) + + +class Model(_MyMutableMapping): + _is_model = True + + def __init__(self, *args: typing.Any, **kwargs: typing.Any) -> None: + class_name = self.__class__.__name__ + if len(args) > 1: + raise TypeError(f"{class_name}.__init__() takes 2 positional arguments but {len(args) + 1} were given") + dict_to_pass = { + rest_field._rest_name: rest_field._default + for rest_field in self._attr_to_rest_field.values() + if rest_field._default is not _UNSET + } + if args: + dict_to_pass.update( + {k: _create_value(_get_rest_field(self._attr_to_rest_field, k), v) for k, v in args[0].items()} + ) + else: + non_attr_kwargs = [k for k in kwargs if k not in self._attr_to_rest_field] + if non_attr_kwargs: + # actual type errors only throw the first wrong keyword arg they see, so following that. 
+ raise TypeError(f"{class_name}.__init__() got an unexpected keyword argument '{non_attr_kwargs[0]}'") + dict_to_pass.update( + { + self._attr_to_rest_field[k]._rest_name: _create_value(self._attr_to_rest_field[k], v) + for k, v in kwargs.items() + if v is not None + } + ) + super().__init__(dict_to_pass) + + def copy(self) -> "Model": + return Model(self.__dict__) + + def __new__(cls, *args: typing.Any, **kwargs: typing.Any) -> Self: # pylint: disable=unused-argument + # we know the last three classes in mro are going to be 'Model', 'dict', and 'object' + mros = cls.__mro__[:-3][::-1] # ignore model, dict, and object parents, and reverse the mro order + attr_to_rest_field: typing.Dict[str, _RestField] = { # map attribute name to rest_field property + k: v for mro_class in mros for k, v in mro_class.__dict__.items() if k[0] != "_" and hasattr(v, "_type") + } + annotations = { + k: v + for mro_class in mros + if hasattr(mro_class, "__annotations__") # pylint: disable=no-member + for k, v in mro_class.__annotations__.items() # pylint: disable=no-member + } + for attr, rf in attr_to_rest_field.items(): + rf._module = cls.__module__ + if not rf._type: + rf._type = rf._get_deserialize_callable_from_annotation(annotations.get(attr, None)) + if not rf._rest_name_input: + rf._rest_name_input = attr + cls._attr_to_rest_field: typing.Dict[str, _RestField] = dict(attr_to_rest_field.items()) + + return super().__new__(cls) # pylint: disable=no-value-for-parameter + + def __init_subclass__(cls, discriminator: typing.Optional[str] = None) -> None: + for base in cls.__bases__: + if hasattr(base, "__mapping__"): # pylint: disable=no-member + base.__mapping__[discriminator or cls.__name__] = cls # type: ignore # pylint: disable=no-member + + @classmethod + def _get_discriminator(cls, exist_discriminators) -> typing.Optional[str]: + for v in cls.__dict__.values(): + if ( + isinstance(v, _RestField) and v._is_discriminator and v._rest_name not in exist_discriminators + ): # pylint: 
disable=protected-access
+                return v._rest_name  # pylint: disable=protected-access
+        return None
+
+    @classmethod
+    def _deserialize(cls, data, exist_discriminators):
+        if not hasattr(cls, "__mapping__"):  # pylint: disable=no-member
+            return cls(data)
+        discriminator = cls._get_discriminator(exist_discriminators)
+        exist_discriminators.append(discriminator)
+        mapped_cls = cls.__mapping__.get(data.get(discriminator), cls)  # pyright: ignore # pylint: disable=no-member
+        if mapped_cls == cls:
+            return cls(data)
+        return mapped_cls._deserialize(data, exist_discriminators)  # pylint: disable=protected-access
+
+    def as_dict(self, *, exclude_readonly: bool = False) -> typing.Dict[str, typing.Any]:
+        """Return a dict that can be serialized using json.dump.
+
+        :keyword bool exclude_readonly: Whether to remove the readonly properties.
+        :returns: A JSON-compatible dict
+        :rtype: dict
+        """
+
+        result = {}
+        if exclude_readonly:
+            readonly_props = [p._rest_name for p in self._attr_to_rest_field.values() if _is_readonly(p)]
+        for k, v in self.items():
+            if exclude_readonly and k in readonly_props:  # pyright: ignore
+                continue
+            is_multipart_file_input = False
+            try:
+                is_multipart_file_input = next(
+                    rf for rf in self._attr_to_rest_field.values() if rf._rest_name == k
+                )._is_multipart_file_input
+            except StopIteration:
+                pass
+            result[k] = v if is_multipart_file_input else Model._as_dict_value(v, exclude_readonly=exclude_readonly)
+        return result
+
+    @staticmethod
+    def _as_dict_value(v: typing.Any, exclude_readonly: bool = False) -> typing.Any:
+        if v is None or isinstance(v, _Null):
+            return None
+        if isinstance(v, (list, tuple, set)):
+            return type(v)(Model._as_dict_value(x, exclude_readonly=exclude_readonly) for x in v)
+        if isinstance(v, dict):
+            return {dk: Model._as_dict_value(dv, exclude_readonly=exclude_readonly) for dk, dv in v.items()}
+        return v.as_dict(exclude_readonly=exclude_readonly) if hasattr(v, "as_dict") else v
+
+
+def
_deserialize_model(model_deserializer: typing.Optional[typing.Callable], obj): + if _is_model(obj): + return obj + return _deserialize(model_deserializer, obj) + + +def _deserialize_with_optional(if_obj_deserializer: typing.Optional[typing.Callable], obj): + if obj is None: + return obj + return _deserialize_with_callable(if_obj_deserializer, obj) + + +def _deserialize_with_union(deserializers, obj): + for deserializer in deserializers: + try: + return _deserialize(deserializer, obj) + except DeserializationError: + pass + raise DeserializationError() + + +def _deserialize_dict( + value_deserializer: typing.Optional[typing.Callable], + module: typing.Optional[str], + obj: typing.Dict[typing.Any, typing.Any], +): + if obj is None: + return obj + return {k: _deserialize(value_deserializer, v, module) for k, v in obj.items()} + + +def _deserialize_multiple_sequence( + entry_deserializers: typing.List[typing.Optional[typing.Callable]], + module: typing.Optional[str], + obj, +): + if obj is None: + return obj + return type(obj)(_deserialize(deserializer, entry, module) for entry, deserializer in zip(obj, entry_deserializers)) + + +def _deserialize_sequence( + deserializer: typing.Optional[typing.Callable], + module: typing.Optional[str], + obj, +): + if obj is None: + return obj + return type(obj)(_deserialize(deserializer, entry, module) for entry in obj) + + +def _sorted_annotations(types: typing.List[typing.Any]) -> typing.List[typing.Any]: + return sorted( + types, + key=lambda x: hasattr(x, "__name__") and x.__name__.lower() in ("str", "float", "int", "bool"), + ) + + +def _get_deserialize_callable_from_annotation( # pylint: disable=R0911, R0915, R0912 + annotation: typing.Any, + module: typing.Optional[str], + rf: typing.Optional["_RestField"] = None, +) -> typing.Optional[typing.Callable[[typing.Any], typing.Any]]: + if not annotation or annotation in [int, float]: + return None + + # is it a type alias? 
+ if isinstance(annotation, str): + if module is not None: + annotation = _get_type_alias_type(module, annotation) + + # is it a forward ref / in quotes? + if isinstance(annotation, (str, typing.ForwardRef)): + try: + model_name = annotation.__forward_arg__ # type: ignore + except AttributeError: + model_name = annotation + if module is not None: + annotation = _get_model(module, model_name) + + try: + if module and _is_model(annotation): + if rf: + rf._is_model = True + + return functools.partial(_deserialize_model, annotation) # pyright: ignore + except Exception: + pass + + # is it a literal? + try: + if annotation.__origin__ is typing.Literal: # pyright: ignore + return None + except AttributeError: + pass + + # is it optional? + try: + if any(a for a in annotation.__args__ if a == type(None)): # pyright: ignore + if len(annotation.__args__) <= 2: # pyright: ignore + if_obj_deserializer = _get_deserialize_callable_from_annotation( + next(a for a in annotation.__args__ if a != type(None)), module, rf # pyright: ignore + ) + + return functools.partial(_deserialize_with_optional, if_obj_deserializer) + # the type is Optional[Union[...]], we need to remove the None type from the Union + annotation_copy = copy.copy(annotation) + annotation_copy.__args__ = [a for a in annotation_copy.__args__ if a != type(None)] # pyright: ignore + return _get_deserialize_callable_from_annotation(annotation_copy, module, rf) + except AttributeError: + pass + + # is it union? 
+    if getattr(annotation, "__origin__", None) is typing.Union:
+        # initial ordering: make `string` the last deserialization option, because it is often the most generic
+        deserializers = [
+            _get_deserialize_callable_from_annotation(arg, module, rf)
+            for arg in _sorted_annotations(annotation.__args__)  # pyright: ignore
+        ]
+
+        return functools.partial(_deserialize_with_union, deserializers)
+
+    try:
+        if annotation._name == "Dict":  # pyright: ignore
+            value_deserializer = _get_deserialize_callable_from_annotation(
+                annotation.__args__[1], module, rf  # pyright: ignore
+            )
+
+            return functools.partial(
+                _deserialize_dict,
+                value_deserializer,
+                module,
+            )
+    except (AttributeError, IndexError):
+        pass
+    try:
+        if annotation._name in ["List", "Set", "Tuple", "Sequence"]:  # pyright: ignore
+            if len(annotation.__args__) > 1:  # pyright: ignore
+
+                entry_deserializers = [
+                    _get_deserialize_callable_from_annotation(dt, module, rf)
+                    for dt in annotation.__args__  # pyright: ignore
+                ]
+                return functools.partial(_deserialize_multiple_sequence, entry_deserializers, module)
+            deserializer = _get_deserialize_callable_from_annotation(
+                annotation.__args__[0], module, rf  # pyright: ignore
+            )
+
+            return functools.partial(_deserialize_sequence, deserializer, module)
+    except (TypeError, IndexError, AttributeError, SyntaxError):
+        pass
+
+    def _deserialize_default(
+        deserializer,
+        obj,
+    ):
+        if obj is None:
+            return obj
+        try:
+            return _deserialize_with_callable(deserializer, obj)
+        except Exception:
+            pass
+        return obj
+
+    if get_deserializer(annotation, rf):
+        return functools.partial(_deserialize_default, get_deserializer(annotation, rf))
+
+    return functools.partial(_deserialize_default, annotation)
+
+
+def _deserialize_with_callable(
+    deserializer: typing.Optional[typing.Callable[[typing.Any], typing.Any]],
+    value: typing.Any,
+):
+    try:
+        if value is None or isinstance(value, _Null):
+            return None
+        if deserializer is None:
+            return value
+        if
isinstance(deserializer, CaseInsensitiveEnumMeta): + try: + return deserializer(value) + except ValueError: + # for unknown value, return raw value + return value + if isinstance(deserializer, type) and issubclass(deserializer, Model): + return deserializer._deserialize(value, []) + return typing.cast(typing.Callable[[typing.Any], typing.Any], deserializer)(value) + except Exception as e: + raise DeserializationError() from e + + +def _deserialize( + deserializer: typing.Any, + value: typing.Any, + module: typing.Optional[str] = None, + rf: typing.Optional["_RestField"] = None, + format: typing.Optional[str] = None, +) -> typing.Any: + if isinstance(value, PipelineResponse): + value = value.http_response.json() + if rf is None and format: + rf = _RestField(format=format) + if not isinstance(deserializer, functools.partial): + deserializer = _get_deserialize_callable_from_annotation(deserializer, module, rf) + return _deserialize_with_callable(deserializer, value) + + +class _RestField: + def __init__( + self, + *, + name: typing.Optional[str] = None, + type: typing.Optional[typing.Callable] = None, # pylint: disable=redefined-builtin + is_discriminator: bool = False, + visibility: typing.Optional[typing.List[str]] = None, + default: typing.Any = _UNSET, + format: typing.Optional[str] = None, + is_multipart_file_input: bool = False, + ): + self._type = type + self._rest_name_input = name + self._module: typing.Optional[str] = None + self._is_discriminator = is_discriminator + self._visibility = visibility + self._is_model = False + self._default = default + self._format = format + self._is_multipart_file_input = is_multipart_file_input + + @property + def _class_type(self) -> typing.Any: + return getattr(self._type, "args", [None])[0] + + @property + def _rest_name(self) -> str: + if self._rest_name_input is None: + raise ValueError("Rest name was never set") + return self._rest_name_input + + def __get__(self, obj: Model, type=None): # pylint: 
disable=redefined-builtin + # by this point, type and rest_name will have a value bc we default + # them in __new__ of the Model class + item = obj.get(self._rest_name) + if item is None: + return item + if self._is_model: + return item + return _deserialize(self._type, _serialize(item, self._format), rf=self) + + def __set__(self, obj: Model, value) -> None: + if value is None: + # we want to wipe out entries if users set attr to None + try: + obj.__delitem__(self._rest_name) + except KeyError: + pass + return + if self._is_model: + if not _is_model(value): + value = _deserialize(self._type, value) + obj.__setitem__(self._rest_name, value) + return + obj.__setitem__(self._rest_name, _serialize(value, self._format)) + + def _get_deserialize_callable_from_annotation( + self, annotation: typing.Any + ) -> typing.Optional[typing.Callable[[typing.Any], typing.Any]]: + return _get_deserialize_callable_from_annotation(annotation, self._module, self) + + +def rest_field( + *, + name: typing.Optional[str] = None, + type: typing.Optional[typing.Callable] = None, # pylint: disable=redefined-builtin + visibility: typing.Optional[typing.List[str]] = None, + default: typing.Any = _UNSET, + format: typing.Optional[str] = None, + is_multipart_file_input: bool = False, +) -> typing.Any: + return _RestField( + name=name, + type=type, + visibility=visibility, + default=default, + format=format, + is_multipart_file_input=is_multipart_file_input, + ) + + +def rest_discriminator( + *, + name: typing.Optional[str] = None, + type: typing.Optional[typing.Callable] = None, # pylint: disable=redefined-builtin +) -> typing.Any: + return _RestField(name=name, type=type, is_discriminator=True) diff --git a/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_operations/__init__.py b/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_operations/__init__.py new file mode 100644 index 000000000000..c716622cb722 --- /dev/null +++ b/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_operations/__init__.py @@ 
-0,0 +1,21 @@ +# coding=utf-8 +# -------------------------------------------------------------------------- +# Copyright (c) Microsoft Corporation. All rights reserved. +# Licensed under the MIT License. See License.txt in the project root for license information. +# Code generated by Microsoft (R) Python Code Generator. +# Changes may cause incorrect behavior and will be lost if the code is regenerated. +# -------------------------------------------------------------------------- + +from ._patch import EventGridPublisherClientOperationsMixin +from ._patch import EventGridConsumerClientOperationsMixin + +from ._patch import __all__ as _patch_all +from ._patch import * # pylint: disable=unused-wildcard-import +from ._patch import patch_sdk as _patch_sdk + +__all__ = [ + "EventGridPublisherClientOperationsMixin", + "EventGridConsumerClientOperationsMixin", +] +__all__.extend([p for p in _patch_all if p not in __all__]) +_patch_sdk() diff --git a/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_operations/_operations.py b/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_operations/_operations.py new file mode 100644 index 000000000000..b2094313f8e6 --- /dev/null +++ b/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_operations/_operations.py @@ -0,0 +1,1268 @@ +# pylint: disable=too-many-lines,too-many-statements +# coding=utf-8 +# -------------------------------------------------------------------------- +# Copyright (c) Microsoft Corporation. All rights reserved. +# Licensed under the MIT License. See License.txt in the project root for license information. +# Code generated by Microsoft (R) Python Code Generator. +# Changes may cause incorrect behavior and will be lost if the code is regenerated. 
+# -------------------------------------------------------------------------- +from io import IOBase +import json +import sys +from typing import Any, Callable, Dict, IO, List, Optional, Type, TypeVar, Union, overload + +from azure.core.exceptions import ( + ClientAuthenticationError, + HttpResponseError, + ResourceExistsError, + ResourceNotFoundError, + ResourceNotModifiedError, + map_error, +) +from azure.core.pipeline import PipelineResponse +from azure.core.rest import HttpRequest, HttpResponse +from azure.core.tracing.decorator import distributed_trace +from azure.core.utils import case_insensitive_dict + +from .. import models as _models +from .._model_base import SdkJSONEncoder, _deserialize +from .._serialization import Serializer +from .._validation import api_version_validation +from .._vendor import EventGridConsumerClientMixinABC, EventGridPublisherClientMixinABC + +if sys.version_info >= (3, 9): + from collections.abc import MutableMapping +else: + from typing import MutableMapping # type: ignore # pylint: disable=ungrouped-imports +T = TypeVar("T") +ClsType = Optional[Callable[[PipelineResponse[HttpRequest, HttpResponse], T, Dict[str, Any]], Any]] +JSON = MutableMapping[str, Any] # pylint: disable=unsubscriptable-object +_Unset: Any = object() + +_SERIALIZER = Serializer() +_SERIALIZER.client_side_validation = False + + +def build_event_grid_publisher_send_request(topic_name: str, **kwargs: Any) -> HttpRequest: + _headers = case_insensitive_dict(kwargs.pop("headers", {}) or {}) + _params = case_insensitive_dict(kwargs.pop("params", {}) or {}) + + content_type: str = kwargs.pop("content_type") + api_version: str = kwargs.pop("api_version", _params.pop("api-version", "2024-06-01")) + accept = _headers.pop("Accept", "application/json") + + # Construct URL + _url = "/topics/{topicName}:publish" + path_format_arguments = { + "topicName": _SERIALIZER.url("topic_name", topic_name, "str"), + } + + _url: str = _url.format(**path_format_arguments) # type: 
ignore
+
+    # Construct parameters
+    _params["api-version"] = _SERIALIZER.query("api_version", api_version, "str")
+
+    # Construct headers
+    _headers["content-type"] = _SERIALIZER.header("content_type", content_type, "str")
+    _headers["Accept"] = _SERIALIZER.header("accept", accept, "str")
+
+    return HttpRequest(method="POST", url=_url, params=_params, headers=_headers, **kwargs)
+
+
+def build_event_grid_publisher_send_events_request(  # pylint: disable=name-too-long
+    topic_name: str, **kwargs: Any
+) -> HttpRequest:
+    _headers = case_insensitive_dict(kwargs.pop("headers", {}) or {})
+    _params = case_insensitive_dict(kwargs.pop("params", {}) or {})
+
+    content_type: str = kwargs.pop("content_type")
+    api_version: str = kwargs.pop("api_version", _params.pop("api-version", "2024-06-01"))
+    accept = _headers.pop("Accept", "application/json")
+
+    # Construct URL
+    _url = "/topics/{topicName}:publish"
+    path_format_arguments = {
+        "topicName": _SERIALIZER.url("topic_name", topic_name, "str"),
+    }
+
+    _url: str = _url.format(**path_format_arguments)  # type: ignore
+
+    # Construct parameters
+    _params["api-version"] = _SERIALIZER.query("api_version", api_version, "str")
+
+    # Construct headers
+    _headers["content-type"] = _SERIALIZER.header("content_type", content_type, "str")
+    _headers["Accept"] = _SERIALIZER.header("accept", accept, "str")
+
+    return HttpRequest(method="POST", url=_url, params=_params, headers=_headers, **kwargs)
+
+
+def build_event_grid_consumer_receive_request(  # pylint: disable=name-too-long
+    topic_name: str,
+    event_subscription_name: str,
+    *,
+    max_events: Optional[int] = None,
+    max_wait_time: Optional[int] = None,
+    **kwargs: Any
+) -> HttpRequest:
+    _headers = case_insensitive_dict(kwargs.pop("headers", {}) or {})
+    _params = case_insensitive_dict(kwargs.pop("params", {}) or {})
+
+    api_version: str = kwargs.pop("api_version", _params.pop("api-version", "2024-06-01"))
+    accept = _headers.pop("Accept", "application/json")
+
+    # Construct URL
+    _url = "/topics/{topicName}/eventsubscriptions/{eventSubscriptionName}:receive"
+    path_format_arguments = {
+        "topicName": _SERIALIZER.url("topic_name", topic_name, "str"),
+        "eventSubscriptionName": _SERIALIZER.url("event_subscription_name", event_subscription_name, "str"),
+    }
+
+    _url: str = _url.format(**path_format_arguments)  # type: ignore
+
+    # Construct parameters
+    _params["api-version"] = _SERIALIZER.query("api_version", api_version, "str")
+    if max_events is not None:
+        _params["maxEvents"] = _SERIALIZER.query("max_events", max_events, "int")
+    if max_wait_time is not None:
+        _params["maxWaitTime"] = _SERIALIZER.query("max_wait_time", max_wait_time, "int")
+
+    # Construct headers
+    _headers["Accept"] = _SERIALIZER.header("accept", accept, "str")
+
+    return HttpRequest(method="POST", url=_url, params=_params, headers=_headers, **kwargs)
+
+
+def build_event_grid_consumer_acknowledge_request(  # pylint: disable=name-too-long
+    topic_name: str, event_subscription_name: str, **kwargs: Any
+) -> HttpRequest:
+    _headers = case_insensitive_dict(kwargs.pop("headers", {}) or {})
+    _params = case_insensitive_dict(kwargs.pop("params", {}) or {})
+
+    content_type: Optional[str] = kwargs.pop("content_type", _headers.pop("Content-Type", None))
+    api_version: str = kwargs.pop("api_version", _params.pop("api-version", "2024-06-01"))
+    accept = _headers.pop("Accept", "application/json")
+
+    # Construct URL
+    _url = "/topics/{topicName}/eventsubscriptions/{eventSubscriptionName}:acknowledge"
+    path_format_arguments = {
+        "topicName": _SERIALIZER.url("topic_name", topic_name, "str"),
+        "eventSubscriptionName": _SERIALIZER.url("event_subscription_name", event_subscription_name, "str"),
+    }
+
+    _url: str = _url.format(**path_format_arguments)  # type: ignore
+
+    # Construct parameters
+    _params["api-version"] = _SERIALIZER.query("api_version", api_version, "str")
+
+    # Construct headers
+    if content_type is not None:
+        _headers["Content-Type"] = _SERIALIZER.header("content_type", content_type, "str")
+    _headers["Accept"] = _SERIALIZER.header("accept", accept, "str")
+
+    return HttpRequest(method="POST", url=_url, params=_params, headers=_headers, **kwargs)
+
+
+def build_event_grid_consumer_release_request(  # pylint: disable=name-too-long
+    topic_name: str,
+    event_subscription_name: str,
+    *,
+    release_delay_in_seconds: Optional[Union[str, _models.ReleaseDelay]] = None,
+    **kwargs: Any
+) -> HttpRequest:
+    _headers = case_insensitive_dict(kwargs.pop("headers", {}) or {})
+    _params = case_insensitive_dict(kwargs.pop("params", {}) or {})
+
+    content_type: Optional[str] = kwargs.pop("content_type", _headers.pop("Content-Type", None))
+    api_version: str = kwargs.pop("api_version", _params.pop("api-version", "2024-06-01"))
+    accept = _headers.pop("Accept", "application/json")
+
+    # Construct URL
+    _url = "/topics/{topicName}/eventsubscriptions/{eventSubscriptionName}:release"
+    path_format_arguments = {
+        "topicName": _SERIALIZER.url("topic_name", topic_name, "str"),
+        "eventSubscriptionName": _SERIALIZER.url("event_subscription_name", event_subscription_name, "str"),
+    }
+
+    _url: str = _url.format(**path_format_arguments)  # type: ignore
+
+    # Construct parameters
+    _params["api-version"] = _SERIALIZER.query("api_version", api_version, "str")
+    if release_delay_in_seconds is not None:
+        _params["releaseDelayInSeconds"] = _SERIALIZER.query(
+            "release_delay_in_seconds", release_delay_in_seconds, "str"
+        )
+
+    # Construct headers
+    if content_type is not None:
+        _headers["Content-Type"] = _SERIALIZER.header("content_type", content_type, "str")
+    _headers["Accept"] = _SERIALIZER.header("accept", accept, "str")
+
+    return HttpRequest(method="POST", url=_url, params=_params, headers=_headers, **kwargs)
+
+
+def build_event_grid_consumer_reject_request(
+    topic_name: str, event_subscription_name: str, **kwargs: Any
+) -> HttpRequest:
+    _headers = case_insensitive_dict(kwargs.pop("headers", {}) or {})
+    _params = case_insensitive_dict(kwargs.pop("params", {}) or {})
+
+    content_type: Optional[str] = kwargs.pop("content_type", _headers.pop("Content-Type", None))
+    api_version: str = kwargs.pop("api_version", _params.pop("api-version", "2024-06-01"))
+    accept = _headers.pop("Accept", "application/json")
+
+    # Construct URL
+    _url = "/topics/{topicName}/eventsubscriptions/{eventSubscriptionName}:reject"
+    path_format_arguments = {
+        "topicName": _SERIALIZER.url("topic_name", topic_name, "str"),
+        "eventSubscriptionName": _SERIALIZER.url("event_subscription_name", event_subscription_name, "str"),
+    }
+
+    _url: str = _url.format(**path_format_arguments)  # type: ignore
+
+    # Construct parameters
+    _params["api-version"] = _SERIALIZER.query("api_version", api_version, "str")
+
+    # Construct headers
+    if content_type is not None:
+        _headers["Content-Type"] = _SERIALIZER.header("content_type", content_type, "str")
+    _headers["Accept"] = _SERIALIZER.header("accept", accept, "str")
+
+    return HttpRequest(method="POST", url=_url, params=_params, headers=_headers, **kwargs)
+
+
+def build_event_grid_consumer_renew_locks_request(  # pylint: disable=name-too-long
+    topic_name: str, event_subscription_name: str, **kwargs: Any
+) -> HttpRequest:
+    _headers = case_insensitive_dict(kwargs.pop("headers", {}) or {})
+    _params = case_insensitive_dict(kwargs.pop("params", {}) or {})
+
+    content_type: Optional[str] = kwargs.pop("content_type", _headers.pop("Content-Type", None))
+    api_version: str = kwargs.pop("api_version", _params.pop("api-version", "2024-06-01"))
+    accept = _headers.pop("Accept", "application/json")
+
+    # Construct URL
+    _url = "/topics/{topicName}/eventsubscriptions/{eventSubscriptionName}:renewLock"
+    path_format_arguments = {
+        "topicName": _SERIALIZER.url("topic_name", topic_name, "str"),
+        "eventSubscriptionName": _SERIALIZER.url("event_subscription_name", event_subscription_name, "str"),
+    }
+
+    _url: str = _url.format(**path_format_arguments)  # type: ignore
+
+    # Construct parameters
+    _params["api-version"] = _SERIALIZER.query("api_version", api_version, "str")
+
+    # Construct headers
+    if content_type is not None:
+        _headers["Content-Type"] = _SERIALIZER.header("content_type", content_type, "str")
+    _headers["Accept"] = _SERIALIZER.header("accept", accept, "str")
+
+    return HttpRequest(method="POST", url=_url, params=_params, headers=_headers, **kwargs)
+
+
+class EventGridPublisherClientOperationsMixin(EventGridPublisherClientMixinABC):
+
+    @distributed_trace
+    def _send(  # pylint: disable=protected-access
+        self, topic_name: str, event: _models._models.CloudEvent, **kwargs: Any
+    ) -> _models._models.PublishResult:
+        # pylint: disable=line-too-long
+        """Publish a single Cloud Event to a namespace topic.
+
+        :param topic_name: Topic Name. Required.
+        :type topic_name: str
+        :param event: Single Cloud Event being published. Required.
+        :type event: ~azure.eventgrid.models._models.CloudEvent
+        :return: PublishResult. The PublishResult is compatible with MutableMapping
+        :rtype: ~azure.eventgrid.models._models.PublishResult
+        :raises ~azure.core.exceptions.HttpResponseError:
+
+        Example:
+            .. code-block:: python
+
+                # JSON input template you can fill out and use as your body input.
+                event = {
+                    "id": "str",  # An identifier for the event. The combination of id and source
+                      must be unique for each distinct event. Required.
+                    "source": "str",  # Identifies the context in which an event happened. The
+                      combination of id and source must be unique for each distinct event. Required.
+                    "specversion": "str",  # The version of the CloudEvents specification which
+                      the event uses. Required.
+                    "type": "str",  # Type of event related to the originating occurrence.
+                      Required.
+                    "data": {},  # Optional. Event data specific to the event type.
+                    "data_base64": bytes("bytes", encoding="utf-8"),  # Optional. Event data
+                      specific to the event type, encoded as a base64 string.
+                    "datacontenttype": "str",  # Optional. Content type of data value.
+                    "dataschema": "str",  # Optional. Identifies the schema that data adheres to.
+                    "subject": "str",  # Optional. This describes the subject of the event in the
+                      context of the event producer (identified by source).
+                    "time": "2020-02-20 00:00:00"  # Optional. The time (in UTC) the event was
+                      generated, in RFC3339 format.
+                }
+        """
+        error_map: MutableMapping[int, Type[HttpResponseError]] = {
+            401: ClientAuthenticationError,
+            404: ResourceNotFoundError,
+            409: ResourceExistsError,
+            304: ResourceNotModifiedError,
+        }
+        error_map.update(kwargs.pop("error_map", {}) or {})
+
+        _headers = case_insensitive_dict(kwargs.pop("headers", {}) or {})
+        _params = kwargs.pop("params", {}) or {}
+
+        content_type: str = kwargs.pop(
+            "content_type", _headers.pop("content-type", "application/cloudevents+json; charset=utf-8")
+        )
+        cls: ClsType[_models._models.PublishResult] = kwargs.pop("cls", None)  # pylint: disable=protected-access
+
+        _content = json.dumps(event, cls=SdkJSONEncoder, exclude_readonly=True)  # type: ignore
+
+        _request = build_event_grid_publisher_send_request(
+            topic_name=topic_name,
+            content_type=content_type,
+            api_version=self._config.api_version,
+            content=_content,
+            headers=_headers,
+            params=_params,
+        )
+        path_format_arguments = {
+            "endpoint": self._serialize.url("self._config.endpoint", self._config.endpoint, "str", skip_quote=True),
+        }
+        _request.url = self._client.format_url(_request.url, **path_format_arguments)
+
+        _stream = kwargs.pop("stream", False)
+        pipeline_response: PipelineResponse = self._client._pipeline.run(  # pylint: disable=protected-access
+            _request, stream=_stream, **kwargs
+        )
+
+        response = pipeline_response.http_response
+
+        if response.status_code not in [200]:
+            if _stream:
+                response.read()  # Load the body in memory and close the socket
+            map_error(status_code=response.status_code, response=response, error_map=error_map)
+            raise HttpResponseError(response=response)
+
+        if _stream:
+            deserialized = response.iter_bytes()
+        else:
+            deserialized = _deserialize(
+                _models._models.PublishResult, response.json()  # pylint: disable=protected-access
+            )
+
+        if cls:
+            return cls(pipeline_response, deserialized, {})  # type: ignore
+
+        return deserialized  # type: ignore
+
+    @distributed_trace
+    def _send_events(  # pylint: disable=protected-access
+        self, topic_name: str, events: List[_models._models.CloudEvent], **kwargs: Any
+    ) -> _models._models.PublishResult:
+        # pylint: disable=line-too-long
+        """Publish a batch of Cloud Events to a namespace topic.
+
+        :param topic_name: Topic Name. Required.
+        :type topic_name: str
+        :param events: Array of Cloud Events being published. Required.
+        :type events: list[~azure.eventgrid.models._models.CloudEvent]
+        :return: PublishResult. The PublishResult is compatible with MutableMapping
+        :rtype: ~azure.eventgrid.models._models.PublishResult
+        :raises ~azure.core.exceptions.HttpResponseError:
+
+        Example:
+            .. code-block:: python
+
+                # JSON input template you can fill out and use as your body input.
+                events = [
+                    {
+                        "id": "str",  # An identifier for the event. The combination of id
+                          and source must be unique for each distinct event. Required.
+                        "source": "str",  # Identifies the context in which an event
+                          happened. The combination of id and source must be unique for each distinct
+                          event. Required.
+                        "specversion": "str",  # The version of the CloudEvents specification
+                          which the event uses. Required.
+                        "type": "str",  # Type of event related to the originating
+                          occurrence. Required.
+                        "data": {},  # Optional. Event data specific to the event type.
+                        "data_base64": bytes("bytes", encoding="utf-8"),  # Optional. Event
+                          data specific to the event type, encoded as a base64 string.
+                        "datacontenttype": "str",  # Optional. Content type of data value.
+                        "dataschema": "str",  # Optional. Identifies the schema that data
+                          adheres to.
+                        "subject": "str",  # Optional. This describes the subject of the
+                          event in the context of the event producer (identified by source).
+                        "time": "2020-02-20 00:00:00"  # Optional. The time (in UTC) the
+                          event was generated, in RFC3339 format.
+                    }
+                ]
+        """
+        error_map: MutableMapping[int, Type[HttpResponseError]] = {
+            401: ClientAuthenticationError,
+            404: ResourceNotFoundError,
+            409: ResourceExistsError,
+            304: ResourceNotModifiedError,
+        }
+        error_map.update(kwargs.pop("error_map", {}) or {})
+
+        _headers = case_insensitive_dict(kwargs.pop("headers", {}) or {})
+        _params = kwargs.pop("params", {}) or {}
+
+        content_type: str = kwargs.pop(
+            "content_type", _headers.pop("content-type", "application/cloudevents-batch+json; charset=utf-8")
+        )
+        cls: ClsType[_models._models.PublishResult] = kwargs.pop("cls", None)  # pylint: disable=protected-access
+
+        _content = json.dumps(events, cls=SdkJSONEncoder, exclude_readonly=True)  # type: ignore
+
+        _request = build_event_grid_publisher_send_events_request(
+            topic_name=topic_name,
+            content_type=content_type,
+            api_version=self._config.api_version,
+            content=_content,
+            headers=_headers,
+            params=_params,
+        )
+        path_format_arguments = {
+            "endpoint": self._serialize.url("self._config.endpoint", self._config.endpoint, "str", skip_quote=True),
+        }
+        _request.url = self._client.format_url(_request.url, **path_format_arguments)
+
+        _stream = kwargs.pop("stream", False)
+        pipeline_response: PipelineResponse = self._client._pipeline.run(  # pylint: disable=protected-access
+            _request, stream=_stream, **kwargs
+        )
+
+        response = pipeline_response.http_response
+
+        if response.status_code not in [200]:
+            if _stream:
+                response.read()  # Load the body in memory and close the socket
+            map_error(status_code=response.status_code, response=response, error_map=error_map)
+            raise HttpResponseError(response=response)
+
+        if _stream:
+            deserialized = response.iter_bytes()
+        else:
+            deserialized = _deserialize(
+                _models._models.PublishResult, response.json()  # pylint: disable=protected-access
+            )
+
+        if cls:
+            return cls(pipeline_response, deserialized, {})  # type: ignore
+
+        return deserialized  # type: ignore
+
+
+class EventGridConsumerClientOperationsMixin(EventGridConsumerClientMixinABC):
+
+    @distributed_trace
+    def _receive(  # pylint: disable=protected-access
+        self,
+        topic_name: str,
+        event_subscription_name: str,
+        *,
+        max_events: Optional[int] = None,
+        max_wait_time: Optional[int] = None,
+        **kwargs: Any
+    ) -> _models._models.ReceiveResult:
+        # pylint: disable=line-too-long
+        """Receive a batch of Cloud Events from a subscription.
+
+        :param topic_name: Topic Name. Required.
+        :type topic_name: str
+        :param event_subscription_name: Event Subscription Name. Required.
+        :type event_subscription_name: str
+        :keyword max_events: Max Events count to be received. Minimum value is 1, while maximum value
+         is 100 events. If not specified, the default value is 1. Default value is None.
+        :paramtype max_events: int
+        :keyword max_wait_time: Max wait time value for receive operation in Seconds. It is the time in
+         seconds that the server approximately waits for the availability of an event and responds to
+         the request. If an event is available, the broker responds immediately to the client. Minimum
+         value is 10 seconds, while maximum value is 120 seconds. If not specified, the default value is
+         60 seconds. Default value is None.
+        :paramtype max_wait_time: int
+        :return: ReceiveResult. The ReceiveResult is compatible with MutableMapping
+        :rtype: ~azure.eventgrid.models._models.ReceiveResult
+        :raises ~azure.core.exceptions.HttpResponseError:
+
+        Example:
+            .. code-block:: python
+
+                # response body for status code(s): 200
+                response == {
+                    "value": [
+                        {
+                            "brokerProperties": {
+                                "deliveryCount": 0,  # The attempt count for
+                                  delivering the event. Required.
+                                "lockToken": "str"  # The token of the lock on the
+                                  event. Required.
+                            },
+                            "event": {
+                                "id": "str",  # An identifier for the event. The
+                                  combination of id and source must be unique for each distinct event.
+                                  Required.
+                                "source": "str",  # Identifies the context in which
+                                  an event happened. The combination of id and source must be unique
+                                  for each distinct event. Required.
+                                "specversion": "str",  # The version of the
+                                  CloudEvents specification which the event uses. Required.
+                                "type": "str",  # Type of event related to the
+                                  originating occurrence. Required.
+                                "data": {},  # Optional. Event data specific to the
+                                  event type.
+                                "data_base64": bytes("bytes", encoding="utf-8"),  #
+                                  Optional. Event data specific to the event type, encoded as a base64
+                                  string.
+                                "datacontenttype": "str",  # Optional. Content type
+                                  of data value.
+                                "dataschema": "str",  # Optional. Identifies the
+                                  schema that data adheres to.
+                                "subject": "str",  # Optional. This describes the
+                                  subject of the event in the context of the event producer (identified
+                                  by source).
+                                "time": "2020-02-20 00:00:00"  # Optional. The time
+                                  (in UTC) the event was generated, in RFC3339 format.
+                            }
+                        }
+                    ]
+                }
+        """
+        error_map: MutableMapping[int, Type[HttpResponseError]] = {
+            401: ClientAuthenticationError,
+            404: ResourceNotFoundError,
+            409: ResourceExistsError,
+            304: ResourceNotModifiedError,
+        }
+        error_map.update(kwargs.pop("error_map", {}) or {})
+
+        _headers = kwargs.pop("headers", {}) or {}
+        _params = kwargs.pop("params", {}) or {}
+
+        cls: ClsType[_models._models.ReceiveResult] = kwargs.pop("cls", None)  # pylint: disable=protected-access
+
+        _request = build_event_grid_consumer_receive_request(
+            topic_name=topic_name,
+            event_subscription_name=event_subscription_name,
+            max_events=max_events,
+            max_wait_time=max_wait_time,
+            api_version=self._config.api_version,
+            headers=_headers,
+            params=_params,
+        )
+        path_format_arguments = {
+            "endpoint": self._serialize.url("self._config.endpoint", self._config.endpoint, "str", skip_quote=True),
+        }
+        _request.url = self._client.format_url(_request.url, **path_format_arguments)
+
+        _stream = kwargs.pop("stream", False)
+        pipeline_response: PipelineResponse = self._client._pipeline.run(  # pylint: disable=protected-access
+            _request, stream=_stream, **kwargs
+        )
+
+        response = pipeline_response.http_response
+
+        if response.status_code not in [200]:
+            if _stream:
+                response.read()  # Load the body in memory and close the socket
+            map_error(status_code=response.status_code, response=response, error_map=error_map)
+            raise HttpResponseError(response=response)
+
+        if _stream:
+            deserialized = response.iter_bytes()
+        else:
+            deserialized = _deserialize(
+                _models._models.ReceiveResult, response.json()  # pylint: disable=protected-access
+            )
+
+        if cls:
+            return cls(pipeline_response, deserialized, {})  # type: ignore
+
+        return deserialized  # type: ignore
+
+    @overload
+    def _acknowledge(
+        self,
+        topic_name: str,
+        event_subscription_name: str,
+        body: JSON,
+        *,
+        content_type: str = "application/json",
+        **kwargs: Any
+    ) -> _models.AcknowledgeResult: ...
+    @overload
+    def _acknowledge(
+        self,
+        topic_name: str,
+        event_subscription_name: str,
+        *,
+        lock_tokens: List[str],
+        content_type: str = "application/json",
+        **kwargs: Any
+    ) -> _models.AcknowledgeResult: ...
+    @overload
+    def _acknowledge(
+        self,
+        topic_name: str,
+        event_subscription_name: str,
+        body: IO[bytes],
+        *,
+        content_type: str = "application/json",
+        **kwargs: Any
+    ) -> _models.AcknowledgeResult: ...
+
+    @distributed_trace
+    def _acknowledge(
+        self,
+        topic_name: str,
+        event_subscription_name: str,
+        body: Union[JSON, IO[bytes]] = _Unset,
+        *,
+        lock_tokens: List[str] = _Unset,
+        **kwargs: Any
+    ) -> _models.AcknowledgeResult:
+        """Acknowledge a batch of Cloud Events. The response will include the set of successfully
+        acknowledged lock tokens, along with other failed lock tokens with their corresponding error
+        information. Successfully acknowledged events will no longer be available to be received by any
+        consumer.
+
+        :param topic_name: Topic Name. Required.
+        :type topic_name: str
+        :param event_subscription_name: Event Subscription Name. Required.
+        :type event_subscription_name: str
+        :param body: Is either a JSON type or a IO[bytes] type. Required.
+        :type body: JSON or IO[bytes]
+        :keyword lock_tokens: Array of lock tokens. Required.
+        :paramtype lock_tokens: list[str]
+        :return: AcknowledgeResult. The AcknowledgeResult is compatible with MutableMapping
+        :rtype: ~azure.eventgrid.models.AcknowledgeResult
+        :raises ~azure.core.exceptions.HttpResponseError:
+
+        Example:
+            .. code-block:: python
+
+                # JSON input template you can fill out and use as your body input.
+                body = {
+                    "lockTokens": [
+                        "str"  # Array of lock tokens. Required.
+                    ]
+                }
+
+                # response body for status code(s): 200
+                response == {
+                    "failedLockTokens": [
+                        {
+                            "error": {
+                                "code": "str",  # One of a server-defined set of
+                                  error codes. Required.
+                                "message": "str",  # A human-readable representation
+                                  of the error. Required.
+                                "details": [
+                                    ...
+                                ],
+                                "innererror": {
+                                    "code": "str",  # Optional. One of a
+                                      server-defined set of error codes.
+                                    "innererror": ...
+                                },
+                                "target": "str"  # Optional. The target of the error.
+                            },
+                            "lockToken": "str"  # The lock token of an entry in the
+                              request. Required.
+                        }
+                    ],
+                    "succeededLockTokens": [
+                        "str"  # Array of lock tokens for the successfully acknowledged cloud
+                          events. Required.
+                    ]
+                }
+        """
+        error_map: MutableMapping[int, Type[HttpResponseError]] = {
+            401: ClientAuthenticationError,
+            404: ResourceNotFoundError,
+            409: ResourceExistsError,
+            304: ResourceNotModifiedError,
+        }
+        error_map.update(kwargs.pop("error_map", {}) or {})
+
+        _headers = case_insensitive_dict(kwargs.pop("headers", {}) or {})
+        _params = kwargs.pop("params", {}) or {}
+
+        content_type: Optional[str] = kwargs.pop("content_type", _headers.pop("Content-Type", None))
+        cls: ClsType[_models.AcknowledgeResult] = kwargs.pop("cls", None)
+
+        if body is _Unset:
+            if lock_tokens is _Unset:
+                raise TypeError("missing required argument: lock_tokens")
+            body = {"lockTokens": lock_tokens}
+            body = {k: v for k, v in body.items() if v is not None}
+        content_type = content_type or "application/json"
+        _content = None
+        if isinstance(body, (IOBase, bytes)):
+            _content = body
+        else:
+            _content = json.dumps(body, cls=SdkJSONEncoder, exclude_readonly=True)  # type: ignore
+
+        _request = build_event_grid_consumer_acknowledge_request(
+            topic_name=topic_name,
+            event_subscription_name=event_subscription_name,
+            content_type=content_type,
+            api_version=self._config.api_version,
+            content=_content,
+            headers=_headers,
+            params=_params,
+        )
+        path_format_arguments = {
+            "endpoint": self._serialize.url("self._config.endpoint", self._config.endpoint, "str", skip_quote=True),
+        }
+        _request.url = self._client.format_url(_request.url, **path_format_arguments)
+
+        _stream = kwargs.pop("stream", False)
+        pipeline_response: PipelineResponse = self._client._pipeline.run(  # pylint: disable=protected-access
+            _request, stream=_stream, **kwargs
+        )
+
+        response = pipeline_response.http_response
+
+        if response.status_code not in [200]:
+            if _stream:
+                response.read()  # Load the body in memory and close the socket
+            map_error(status_code=response.status_code, response=response, error_map=error_map)
+            raise HttpResponseError(response=response)
+
+        if _stream:
+            deserialized = response.iter_bytes()
+        else:
+            deserialized = _deserialize(_models.AcknowledgeResult, response.json())
+
+        if cls:
+            return cls(pipeline_response, deserialized, {})  # type: ignore
+
+        return deserialized  # type: ignore
+
+    @overload
+    @api_version_validation(
+        params_added_on={"2023-10-01-preview": ["release_delay_in_seconds"]},
+    )
+    def _release(
+        self,
+        topic_name: str,
+        event_subscription_name: str,
+        body: JSON,
+        *,
+        release_delay_in_seconds: Optional[Union[str, _models.ReleaseDelay]] = None,
+        content_type: str = "application/json",
+        **kwargs: Any
+    ) -> _models.ReleaseResult: ...
+    @overload
+    @api_version_validation(
+        params_added_on={"2023-10-01-preview": ["release_delay_in_seconds"]},
+    )
+    def _release(
+        self,
+        topic_name: str,
+        event_subscription_name: str,
+        *,
+        lock_tokens: List[str],
+        release_delay_in_seconds: Optional[Union[str, _models.ReleaseDelay]] = None,
+        content_type: str = "application/json",
+        **kwargs: Any
+    ) -> _models.ReleaseResult: ...
+    @overload
+    @api_version_validation(
+        params_added_on={"2023-10-01-preview": ["release_delay_in_seconds"]},
+    )
+    def _release(
+        self,
+        topic_name: str,
+        event_subscription_name: str,
+        body: IO[bytes],
+        *,
+        release_delay_in_seconds: Optional[Union[str, _models.ReleaseDelay]] = None,
+        content_type: str = "application/json",
+        **kwargs: Any
+    ) -> _models.ReleaseResult: ...
+
+    @distributed_trace
+    @api_version_validation(
+        params_added_on={"2023-10-01-preview": ["release_delay_in_seconds"]},
+    )
+    def _release(
+        self,
+        topic_name: str,
+        event_subscription_name: str,
+        body: Union[JSON, IO[bytes]] = _Unset,
+        *,
+        lock_tokens: List[str] = _Unset,
+        release_delay_in_seconds: Optional[Union[str, _models.ReleaseDelay]] = None,
+        **kwargs: Any
+    ) -> _models.ReleaseResult:
+        """Release a batch of Cloud Events. The response will include the set of successfully released
+        lock tokens, along with other failed lock tokens with their corresponding error information.
+        Successfully released events can be received by consumers.
+
+        :param topic_name: Topic Name. Required.
+        :type topic_name: str
+        :param event_subscription_name: Event Subscription Name. Required.
+        :type event_subscription_name: str
+        :param body: Is either a JSON type or a IO[bytes] type. Required.
+        :type body: JSON or IO[bytes]
+        :keyword lock_tokens: Array of lock tokens. Required.
+        :paramtype lock_tokens: list[str]
+        :keyword release_delay_in_seconds: Release cloud events with the specified delay in seconds.
+         Known values are: "0", "10", "60", "600", and "3600". Default value is None.
+        :paramtype release_delay_in_seconds: str or ~azure.eventgrid.models.ReleaseDelay
+        :return: ReleaseResult. The ReleaseResult is compatible with MutableMapping
+        :rtype: ~azure.eventgrid.models.ReleaseResult
+        :raises ~azure.core.exceptions.HttpResponseError:
+
+        Example:
+            .. code-block:: python
+
+                # JSON input template you can fill out and use as your body input.
+                body = {
+                    "lockTokens": [
+                        "str"  # Array of lock tokens. Required.
+                    ]
+                }
+
+                # response body for status code(s): 200
+                response == {
+                    "failedLockTokens": [
+                        {
+                            "error": {
+                                "code": "str",  # One of a server-defined set of
+                                  error codes. Required.
+                                "message": "str",  # A human-readable representation
+                                  of the error. Required.
+                                "details": [
+                                    ...
+                                ],
+                                "innererror": {
+                                    "code": "str",  # Optional. One of a
+                                      server-defined set of error codes.
+                                    "innererror": ...
+                                },
+                                "target": "str"  # Optional. The target of the error.
+                            },
+                            "lockToken": "str"  # The lock token of an entry in the
+                              request. Required.
+                        }
+                    ],
+                    "succeededLockTokens": [
+                        "str"  # Array of lock tokens for the successfully released cloud
+                          events. Required.
+                    ]
+                }
+        """
+        error_map: MutableMapping[int, Type[HttpResponseError]] = {
+            401: ClientAuthenticationError,
+            404: ResourceNotFoundError,
+            409: ResourceExistsError,
+            304: ResourceNotModifiedError,
+        }
+        error_map.update(kwargs.pop("error_map", {}) or {})
+
+        _headers = case_insensitive_dict(kwargs.pop("headers", {}) or {})
+        _params = kwargs.pop("params", {}) or {}
+
+        content_type: Optional[str] = kwargs.pop("content_type", _headers.pop("Content-Type", None))
+        cls: ClsType[_models.ReleaseResult] = kwargs.pop("cls", None)
+
+        if body is _Unset:
+            if lock_tokens is _Unset:
+                raise TypeError("missing required argument: lock_tokens")
+            body = {"lockTokens": lock_tokens}
+            body = {k: v for k, v in body.items() if v is not None}
+        content_type = content_type or "application/json"
+        _content = None
+        if isinstance(body, (IOBase, bytes)):
+            _content = body
+        else:
+            _content = json.dumps(body, cls=SdkJSONEncoder, exclude_readonly=True)  # type: ignore
+
+        _request = build_event_grid_consumer_release_request(
+            topic_name=topic_name,
+            event_subscription_name=event_subscription_name,
+            release_delay_in_seconds=release_delay_in_seconds,
+            content_type=content_type,
+            api_version=self._config.api_version,
+            content=_content,
+            headers=_headers,
+            params=_params,
+        )
+        path_format_arguments = {
+            "endpoint": self._serialize.url("self._config.endpoint", self._config.endpoint, "str", skip_quote=True),
+        }
+        _request.url = self._client.format_url(_request.url, **path_format_arguments)
+
+        _stream = kwargs.pop("stream", False)
+        pipeline_response: PipelineResponse = self._client._pipeline.run(  # pylint: disable=protected-access
+            _request, stream=_stream, **kwargs
+        )
+
+        response = pipeline_response.http_response
+
+        if response.status_code not in [200]:
+            if _stream:
+                response.read()  # Load the body in memory and close the socket
+            map_error(status_code=response.status_code, response=response, error_map=error_map)
+            raise HttpResponseError(response=response)
+
+        if _stream:
+            deserialized = response.iter_bytes()
+        else:
+            deserialized = _deserialize(_models.ReleaseResult, response.json())
+
+        if cls:
+            return cls(pipeline_response, deserialized, {})  # type: ignore
+
+        return deserialized  # type: ignore
+
+    @overload
+    def _reject(
+        self,
+        topic_name: str,
+        event_subscription_name: str,
+        body: JSON,
+        *,
+        content_type: str = "application/json",
+        **kwargs: Any
+    ) -> _models.RejectResult: ...
+    @overload
+    def _reject(
+        self,
+        topic_name: str,
+        event_subscription_name: str,
+        *,
+        lock_tokens: List[str],
+        content_type: str = "application/json",
+        **kwargs: Any
+    ) -> _models.RejectResult: ...
+    @overload
+    def _reject(
+        self,
+        topic_name: str,
+        event_subscription_name: str,
+        body: IO[bytes],
+        *,
+        content_type: str = "application/json",
+        **kwargs: Any
+    ) -> _models.RejectResult: ...
+
+    @distributed_trace
+    def _reject(
+        self,
+        topic_name: str,
+        event_subscription_name: str,
+        body: Union[JSON, IO[bytes]] = _Unset,
+        *,
+        lock_tokens: List[str] = _Unset,
+        **kwargs: Any
+    ) -> _models.RejectResult:
+        """Reject a batch of Cloud Events. The response will include the set of successfully rejected lock
+        tokens, along with other failed lock tokens with their corresponding error information.
+        Successfully rejected events will be dead-lettered and can no longer be received by a consumer.
+
+        :param topic_name: Topic Name. Required.
+        :type topic_name: str
+        :param event_subscription_name: Event Subscription Name. Required.
+        :type event_subscription_name: str
+        :param body: Is either a JSON type or a IO[bytes] type. Required.
+        :type body: JSON or IO[bytes]
+        :keyword lock_tokens: Array of lock tokens. Required.
+        :paramtype lock_tokens: list[str]
+        :return: RejectResult. The RejectResult is compatible with MutableMapping
+        :rtype: ~azure.eventgrid.models.RejectResult
+        :raises ~azure.core.exceptions.HttpResponseError:
+
+        Example:
+            .. code-block:: python
+
+                # JSON input template you can fill out and use as your body input.
+                body = {
+                    "lockTokens": [
+                        "str"  # Array of lock tokens. Required.
+                    ]
+                }
+
+                # response body for status code(s): 200
+                response == {
+                    "failedLockTokens": [
+                        {
+                            "error": {
+                                "code": "str",  # One of a server-defined set of
+                                  error codes. Required.
+                                "message": "str",  # A human-readable representation
+                                  of the error. Required.
+                                "details": [
+                                    ...
+                                ],
+                                "innererror": {
+                                    "code": "str",  # Optional. One of a
+                                      server-defined set of error codes.
+                                    "innererror": ...
+                                },
+                                "target": "str"  # Optional. The target of the error.
+                            },
+                            "lockToken": "str"  # The lock token of an entry in the
+                              request. Required.
+                        }
+                    ],
+                    "succeededLockTokens": [
+                        "str"  # Array of lock tokens for the successfully rejected cloud
+                          events. Required.
+                    ]
+                }
+        """
+        error_map: MutableMapping[int, Type[HttpResponseError]] = {
+            401: ClientAuthenticationError,
+            404: ResourceNotFoundError,
+            409: ResourceExistsError,
+            304: ResourceNotModifiedError,
+        }
+        error_map.update(kwargs.pop("error_map", {}) or {})
+
+        _headers = case_insensitive_dict(kwargs.pop("headers", {}) or {})
+        _params = kwargs.pop("params", {}) or {}
+
+        content_type: Optional[str] = kwargs.pop("content_type", _headers.pop("Content-Type", None))
+        cls: ClsType[_models.RejectResult] = kwargs.pop("cls", None)
+
+        if body is _Unset:
+            if lock_tokens is _Unset:
+                raise TypeError("missing required argument: lock_tokens")
+            body = {"lockTokens": lock_tokens}
+            body = {k: v for k, v in body.items() if v is not None}
+        content_type = content_type or "application/json"
+        _content = None
+        if isinstance(body, (IOBase, bytes)):
+            _content = body
+        else:
+            _content = json.dumps(body, cls=SdkJSONEncoder, exclude_readonly=True)  # type: ignore
+
+        _request = build_event_grid_consumer_reject_request(
+            topic_name=topic_name,
+            event_subscription_name=event_subscription_name,
+            content_type=content_type,
+            api_version=self._config.api_version,
+            content=_content,
+            headers=_headers,
+            params=_params,
+        )
+        path_format_arguments = {
+            "endpoint": self._serialize.url("self._config.endpoint", self._config.endpoint, "str", skip_quote=True),
+        }
+        _request.url = self._client.format_url(_request.url, **path_format_arguments)
+
+        _stream = kwargs.pop("stream", False)
+        pipeline_response: PipelineResponse = self._client._pipeline.run(  # pylint: disable=protected-access
+            _request, stream=_stream, **kwargs
+        )
+
+        response = pipeline_response.http_response
+
+        if response.status_code not in [200]:
+            if _stream:
+                response.read()  # Load the body in memory and close the socket
+            map_error(status_code=response.status_code, response=response, error_map=error_map)
+            raise HttpResponseError(response=response)
+
+        if _stream:
+            deserialized = response.iter_bytes()
+        else:
+            deserialized = _deserialize(_models.RejectResult, response.json())
+
+        if cls:
+            return cls(pipeline_response, deserialized, {})  # type: ignore
+
+        return deserialized  # type: ignore
+
+    @overload
+    @api_version_validation(
+        method_added_on="2023-10-01-preview",
+        params_added_on={
+            "2023-10-01-preview": ["api_version", "topic_name", "event_subscription_name", "content_type", "accept"]
+        },
+    )
+    def _renew_locks(
+        self,
+        topic_name: str,
+        event_subscription_name: str,
+        body: JSON,
+        *,
+        content_type: str = "application/json",
+        **kwargs: Any
+    ) -> _models.RenewLocksResult: ...
+    @overload
+    @api_version_validation(
+        method_added_on="2023-10-01-preview",
+        params_added_on={
+            "2023-10-01-preview": ["api_version", "topic_name", "event_subscription_name", "content_type", "accept"]
+        },
+    )
+    def _renew_locks(
+        self,
+        topic_name: str,
+        event_subscription_name: str,
+        *,
+        lock_tokens: List[str],
+        content_type: str = "application/json",
+        **kwargs: Any
+    ) -> _models.RenewLocksResult: ...
+    @overload
+    @api_version_validation(
+        method_added_on="2023-10-01-preview",
+        params_added_on={
+            "2023-10-01-preview": ["api_version", "topic_name", "event_subscription_name", "content_type", "accept"]
+        },
+    )
+    def _renew_locks(
+        self,
+        topic_name: str,
+        event_subscription_name: str,
+        body: IO[bytes],
+        *,
+        content_type: str = "application/json",
+        **kwargs: Any
+    ) -> _models.RenewLocksResult: ...
+
+    @distributed_trace
+    @api_version_validation(
+        method_added_on="2023-10-01-preview",
+        params_added_on={
+            "2023-10-01-preview": ["api_version", "topic_name", "event_subscription_name", "content_type", "accept"]
+        },
+    )
+    def _renew_locks(
+        self,
+        topic_name: str,
+        event_subscription_name: str,
+        body: Union[JSON, IO[bytes]] = _Unset,
+        *,
+        lock_tokens: List[str] = _Unset,
+        **kwargs: Any
+    ) -> _models.RenewLocksResult:
+        """Renew locks for a batch of Cloud Events. The response will include the set of successfully
+        renewed lock tokens, along with other failed lock tokens with their corresponding error
+        information. Successfully renewed locks will ensure that the associated event is only available
+        to the consumer that holds the renewed lock.
+
+        :param topic_name: Topic Name. Required.
+        :type topic_name: str
+        :param event_subscription_name: Event Subscription Name. Required.
+        :type event_subscription_name: str
+        :param body: Is either a JSON type or a IO[bytes] type. Required.
+        :type body: JSON or IO[bytes]
+        :keyword lock_tokens: Array of lock tokens. Required.
+        :paramtype lock_tokens: list[str]
+        :return: RenewLocksResult. The RenewLocksResult is compatible with MutableMapping
+        :rtype: ~azure.eventgrid.models.RenewLocksResult
+        :raises ~azure.core.exceptions.HttpResponseError:
+
+        Example:
+            .. code-block:: python
+
+                # JSON input template you can fill out and use as your body input.
+                body = {
+                    "lockTokens": [
+                        "str"  # Array of lock tokens. Required.
+                    ]
+                }
+
+                # response body for status code(s): 200
+                response == {
+                    "failedLockTokens": [
+                        {
+                            "error": {
+                                "code": "str",  # One of a server-defined set of
+                                  error codes. Required.
+                                "message": "str",  # A human-readable representation
+                                  of the error. Required.
+                                "details": [
+                                    ...
+                                ],
+                                "innererror": {
+                                    "code": "str",  # Optional. One of a
+                                      server-defined set of error codes.
+                                    "innererror": ...
+                                },
+                                "target": "str"  # Optional. The target of the error.
+                            },
+                            "lockToken": "str"  # The lock token of an entry in the
+                              request. Required.
+                        }
+                    ],
+                    "succeededLockTokens": [
+                        "str"  # Array of lock tokens for the successfully renewed locks.
+                          Required.
+ ] + } + """ + error_map: MutableMapping[int, Type[HttpResponseError]] = { + 401: ClientAuthenticationError, + 404: ResourceNotFoundError, + 409: ResourceExistsError, + 304: ResourceNotModifiedError, + } + error_map.update(kwargs.pop("error_map", {}) or {}) + + _headers = case_insensitive_dict(kwargs.pop("headers", {}) or {}) + _params = kwargs.pop("params", {}) or {} + + content_type: Optional[str] = kwargs.pop("content_type", _headers.pop("Content-Type", None)) + cls: ClsType[_models.RenewLocksResult] = kwargs.pop("cls", None) + + if body is _Unset: + if lock_tokens is _Unset: + raise TypeError("missing required argument: lock_tokens") + body = {"lockTokens": lock_tokens} + body = {k: v for k, v in body.items() if v is not None} + content_type = content_type or "application/json" + _content = None + if isinstance(body, (IOBase, bytes)): + _content = body + else: + _content = json.dumps(body, cls=SdkJSONEncoder, exclude_readonly=True) # type: ignore + + _request = build_event_grid_consumer_renew_locks_request( + topic_name=topic_name, + event_subscription_name=event_subscription_name, + content_type=content_type, + api_version=self._config.api_version, + content=_content, + headers=_headers, + params=_params, + ) + path_format_arguments = { + "endpoint": self._serialize.url("self._config.endpoint", self._config.endpoint, "str", skip_quote=True), + } + _request.url = self._client.format_url(_request.url, **path_format_arguments) + + _stream = kwargs.pop("stream", False) + pipeline_response: PipelineResponse = self._client._pipeline.run( # pylint: disable=protected-access + _request, stream=_stream, **kwargs + ) + + response = pipeline_response.http_response + + if response.status_code not in [200]: + if _stream: + response.read() # Load the body in memory and close the socket + map_error(status_code=response.status_code, response=response, error_map=error_map) + raise HttpResponseError(response=response) + + if _stream: + deserialized = response.iter_bytes() + 
else: + deserialized = _deserialize(_models.RenewLocksResult, response.json()) + + if cls: + return cls(pipeline_response, deserialized, {}) # type: ignore + + return deserialized # type: ignore diff --git a/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_operations/_patch.py b/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_operations/_patch.py new file mode 100644 index 000000000000..0606157136ab --- /dev/null +++ b/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_operations/_patch.py @@ -0,0 +1,351 @@ +# ------------------------------------ +# Copyright (c) Microsoft Corporation. +# Licensed under the MIT License. +# ------------------------------------ +"""Customize generated code here. +Follow our quickstart for examples: https://aka.ms/azsdk/python/dpcodegen/python/customize +""" +import sys +from typing import ( + Any, + Callable, + Dict, + List, + Optional, + TypeVar, + Union, + TYPE_CHECKING, +) + +from azure.core.exceptions import ( + HttpResponseError, + ResourceNotFoundError, +) +from azure.core.messaging import CloudEvent +from azure.core.tracing.decorator import distributed_trace +from azure.core.pipeline import PipelineResponse +from azure.core.rest import HttpRequest, HttpResponse + +from ._operations import ( + EventGridPublisherClientOperationsMixin as PublisherOperationsMixin, + EventGridConsumerClientOperationsMixin as ConsumerOperationsMixin, +) +from ..models._patch import ( + ReceiveDetails, +) +from .. 
import models as _models +from ..models._models import ( + CloudEvent as InternalCloudEvent, +) +from .._validation import api_version_validation + + +from .._legacy import EventGridEvent +from .._legacy._helpers import _from_cncf_events, _is_eventgrid_event_format, _is_cloud_event +from .._serialization import Serializer + +if sys.version_info >= (3, 9): + from collections.abc import MutableMapping +else: + from typing import MutableMapping # type: ignore # pylint: disable=ungrouped-imports + +JSON = MutableMapping[str, Any] # pylint: disable=unsubscriptable-object +T = TypeVar("T") +ClsType = Optional[Callable[[PipelineResponse[HttpRequest, HttpResponse], T, Dict[str, Any]], Any]] +_SERIALIZER = Serializer() +_SERIALIZER.client_side_validation = False + +if TYPE_CHECKING: + from cloudevents.http.event import CloudEvent as CNCFCloudEvent + + +class EventGridPublisherClientOperationsMixin(PublisherOperationsMixin): + + @distributed_trace + def send( + self, + events: Union[ + CloudEvent, + List[CloudEvent], + Dict[str, Any], + List[Dict[str, Any]], + "CNCFCloudEvent", + List["CNCFCloudEvent"], + EventGridEvent, + List[EventGridEvent], + ], + *, + channel_name: Optional[str] = None, + content_type: Optional[str] = None, + **kwargs: Any, + ) -> None: # pylint: disable=docstring-should-be-keyword, docstring-missing-param + """Send events to the Event Grid Service. + + :param events: The event(s) to send. If sending to an Event Grid Namespace, the dict or list of dicts + should be in the format of a CloudEvent. + :type events: CloudEvent or List[CloudEvent] or Dict[str, Any] or List[Dict[str, Any]] + or CNCFCloudEvent or List[CNCFCloudEvent] or EventGridEvent or List[EventGridEvent] + :keyword channel_name: The name of the channel to send the event to. Event Grid Basic Resource only. + :paramtype channel_name: str or None + :keyword content_type: The content type of the event. If not specified, the default value is + "application/cloudevents+json; charset=utf-8". 
+ :paramtype content_type: str + + :return: None + :rtype: None + """ + if self._namespace and channel_name: + raise ValueError("Channel name is not supported for Event Grid Namespaces.") + + try: + if isinstance(events, dict): + events = CloudEvent.from_dict(events) + if isinstance(events, list) and isinstance(events[0], dict): + events = [CloudEvent.from_dict(e) for e in events] + except Exception: # pylint: disable=broad-except + pass + + if self._namespace: + kwargs["content_type"] = ( + content_type if content_type else "application/cloudevents-batch+json; charset=utf-8" + ) + if not isinstance(events, list): + events = [events] + + if isinstance(events[0], EventGridEvent) or _is_eventgrid_event_format(events[0]): + raise TypeError("EventGridEvent is not supported for Event Grid Namespaces.") + try: + # Try to send via namespace + self._publish(self._namespace, _serialize_events(events), **kwargs) + except Exception as exception: # pylint: disable=broad-except + self._http_response_error_handler(exception) + raise exception + else: + kwargs["content_type"] = content_type if content_type else "application/json; charset=utf-8" + try: + self._publish(events, channel_name=channel_name, **kwargs) + except Exception as exception: + self._http_response_error_handler(exception) + raise exception + + def _http_response_error_handler(self, exception): + if isinstance(exception, HttpResponseError): + if exception.status_code == 400: + raise HttpResponseError("Invalid event data. Please check the data and try again.") from exception + if exception.status_code == 404: + raise ResourceNotFoundError( + "Resource not found. " + "For Event Grid Namespaces, please specify the namespace_topic name on the client. " + "For Event Grid Basic, do not specify the namespace_topic name." 
+ ) from exception + raise exception + + +class EventGridConsumerClientOperationsMixin(ConsumerOperationsMixin): + + @distributed_trace + def receive( + self, + *, + max_events: Optional[int] = None, + max_wait_time: Optional[int] = None, + **kwargs: Any, + ) -> List[ReceiveDetails]: + """Receive Batch of Cloud Events from the Event Subscription. + + :keyword max_events: Max Events count to be received. Minimum value is 1, while maximum value + is 100 events. If not specified, the default value is 1. Default value is None. + :paramtype max_events: int + :keyword max_wait_time: Max wait time value for receive operation in Seconds. It is the time in + seconds that the server approximately waits for the availability of an event and responds to + the request. If an event is available, the broker responds immediately to the client. Minimum + value is 10 seconds, while maximum value is 120 seconds. If not specified, the default value is + 60 seconds. Default value is None. + :paramtype max_wait_time: int + :return: Receive Details. + :rtype: list[~azure.eventgrid.models.ReceiveDetails] + :raises ~azure.core.exceptions.HttpResponseError: + """ + + detail_items = [] + received_result = self._receive( + self._namespace, + self._subscription, + max_events=max_events, + max_wait_time=max_wait_time, + **kwargs, + ) + for detail_item in received_result.details: + deserialized_cloud_event = CloudEvent.from_dict(detail_item.event) + detail_item.event = deserialized_cloud_event + detail_items.append( + ReceiveDetails( + broker_properties=detail_item.broker_properties, + event=detail_item.event, + ) + ) + return detail_items + + @distributed_trace + def acknowledge( + self, + *, + lock_tokens: List[str], + **kwargs: Any, + ) -> _models.AcknowledgeResult: + """Acknowledge a batch of Cloud Events. The response will include the set of successfully + acknowledged lock tokens, along with other failed lock tokens with their corresponding error + information. 
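The `send` implementation above wraps a single event (or dict) in a list and, when publishing to a namespace topic, defaults the content type to the batched CloudEvents media type unless the caller supplied one. A minimal stand-alone sketch of that normalization; the helper name is illustrative and not part of azure-eventgrid:

```python
from typing import Any, List, Optional, Tuple

# Default `send` applies when publishing a batch to a namespace topic.
BATCH_CONTENT_TYPE = "application/cloudevents-batch+json; charset=utf-8"


def normalize_for_namespace(events: Any, content_type: Optional[str] = None) -> Tuple[List[Any], str]:
    """Wrap a single event in a list and pick the batch content type."""
    if not isinstance(events, list):
        events = [events]
    return events, content_type or BATCH_CONTENT_TYPE


batch, ctype = normalize_for_namespace({"type": "Demo.Event", "source": "/demo"})
print(len(batch), ctype)
```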
Successfully acknowledged events will no longer be available to be received by any + consumer. + + :keyword lock_tokens: Array of lock tokens of Cloud Events. Required. + :paramtype lock_tokens: List[str] + :return: AcknowledgeResult. The AcknowledgeResult is compatible with MutableMapping + :rtype: ~azure.eventgrid.models.AcknowledgeResult + :raises ~azure.core.exceptions.HttpResponseError: + """ + return super()._acknowledge( + topic_name=self._namespace, + event_subscription_name=self._subscription, + lock_tokens=lock_tokens, + **kwargs, + ) + + @api_version_validation( + params_added_on={"2023-10-01-preview": ["release_delay"]}, + ) + def release( + self, + *, + lock_tokens: List[str], + release_delay: Optional[Union[int, _models.ReleaseDelay]] = None, + **kwargs: Any, + ) -> _models.ReleaseResult: + """Release a batch of Cloud Events. The response will include the set of successfully released + lock tokens, along with other failed lock tokens with their corresponding error information. + Successfully released events can be received by consumers. + + :keyword lock_tokens: Array of lock tokens of Cloud Events. Required. + :paramtype lock_tokens: List[str] + :keyword release_delay: Release cloud events with the specified delay in seconds. + Known values are: 0, 10, 60, 600, and 3600. Default value is None, indicating no delay. + :paramtype release_delay: int or ~azure.eventgrid.models.ReleaseDelay + :return: ReleaseResult. The ReleaseResult is compatible with MutableMapping + :rtype: ~azure.eventgrid.models.ReleaseResult + :raises ~azure.core.exceptions.HttpResponseError: + """ + return super()._release( + topic_name=self._namespace, + event_subscription_name=self._subscription, + lock_tokens=lock_tokens, + release_delay_in_seconds=release_delay, + **kwargs, + ) + + @distributed_trace + def reject( + self, + *, + lock_tokens: List[str], + **kwargs: Any, + ) -> _models.RejectResult: + """Reject a batch of Cloud Events. 
The response will include the set of successfully rejected lock + tokens, along with other failed lock tokens with their corresponding error information. + Successfully rejected events will be dead-lettered and can no longer be received by a consumer. + + :keyword lock_tokens: Array of lock tokens of Cloud Events. Required. + :paramtype lock_tokens: List[str] + :return: RejectResult. The RejectResult is compatible with MutableMapping + :rtype: ~azure.eventgrid.models.RejectResult + :raises ~azure.core.exceptions.HttpResponseError: + """ + return super()._reject( + topic_name=self._namespace, + event_subscription_name=self._subscription, + lock_tokens=lock_tokens, + **kwargs, + ) + + @distributed_trace + @api_version_validation( + method_added_on="2023-10-01-preview", + params_added_on={"2023-10-01-preview": ["api_version", "content_type", "accept"]}, + ) + def renew_locks( + self, + *, + lock_tokens: List[str], + **kwargs: Any, + ) -> _models.RenewLocksResult: + """Renew locks for a batch of Cloud Events. The response will include the set of successfully + renewed lock tokens, along with other failed lock tokens with their corresponding error + information. Successfully renewed locks will ensure that the associated event is only available + to the consumer that holds the renewed lock. + + :keyword lock_tokens: Array of lock tokens of Cloud Events. Required. + :paramtype lock_tokens: List[str] + :return: RenewLocksResult. 
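The `acknowledge`, `release`, `reject`, and `renew_locks` wrappers all delegate to generated operations that assemble the same request body, as the `_reject`/`_renew_locks` implementations earlier in the diff show: a `lockTokens` array with `None`-valued keys filtered out, then JSON-encoded. A stand-alone sketch of that body construction, with an illustrative helper name:

```python
import json
from typing import List, Optional


def build_lock_token_body(lock_tokens: Optional[List[str]]) -> str:
    """Assemble the JSON body the consumer lock-token operations send."""
    if lock_tokens is None:
        raise TypeError("missing required argument: lock_tokens")
    body = {"lockTokens": lock_tokens}
    # The generated code drops keys whose value is None before encoding.
    body = {k: v for k, v in body.items() if v is not None}
    return json.dumps(body)


print(build_lock_token_body(["token-1", "token-2"]))
```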
The RenewLocksResult is compatible with + MutableMapping + :rtype: ~azure.eventgrid.models.RenewLocksResult + :raises ~azure.core.exceptions.HttpResponseError: + """ + return super()._renew_locks( + topic_name=self._namespace, + event_subscription_name=self._subscription, + lock_tokens=lock_tokens, + **kwargs, + ) + + +def _serialize_events(events): + if isinstance(events[0], CloudEvent) or _is_cloud_event(events[0]): + # Try to serialize cloud events + try: + internal_body_list = [] + for item in events: + internal_body_list.append(_serialize_cloud_event(item)) + return internal_body_list + except AttributeError: + # Try to serialize CNCF Cloud Events + return [_from_cncf_events(e) for e in events] + else: + # Does not conform to format + raise TypeError("Invalid event data. Please check the data is of Cloud Event type/format and try again.") + + +def _serialize_cloud_event(cloud_event): + data_kwargs = {} + + if isinstance(cloud_event.data, bytes): + data_kwargs["data_base64"] = cloud_event.data + else: + data_kwargs["data"] = cloud_event.data + + internal_event = InternalCloudEvent( + id=cloud_event.id, + source=cloud_event.source, + type=cloud_event.type, + specversion=cloud_event.specversion, + time=cloud_event.time, + dataschema=cloud_event.dataschema, + datacontenttype=cloud_event.datacontenttype, + subject=cloud_event.subject, + **data_kwargs, + ) + if cloud_event.extensions: + internal_event.update(cloud_event.extensions) + return internal_event + + +__all__: List[str] = [ + "EventGridPublisherClientOperationsMixin", + "EventGridConsumerClientOperationsMixin", +] # Add all objects you want publicly available to users at this package level + + +def patch_sdk(): + """Do not remove from this file. 
+ `patch_sdk` is a last resort escape hatch that allows you to do customizations + you can't accomplish using the techniques described in + https://aka.ms/azsdk/python/dpcodegen/python/customize + """ diff --git a/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_patch.py b/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_patch.py new file mode 100644 index 000000000000..b63baf75804a --- /dev/null +++ b/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_patch.py @@ -0,0 +1,153 @@ +# ------------------------------------ +# Copyright (c) Microsoft Corporation. +# Licensed under the MIT License. +# ------------------------------------ +"""Customize generated code here. +Follow our quickstart for examples: https://aka.ms/azsdk/python/dpcodegen/python/customize +""" + +from typing import List, Union, Optional, Any +from azure.core.credentials import AzureKeyCredential, AzureSasCredential, TokenCredential + +from ._client import ( + EventGridPublisherClient as InternalEventGridPublisherClient, + EventGridConsumerClient as InternalEventGridConsumerClient, +) +from ._legacy import ( + EventGridPublisherClient as LegacyEventGridPublisherClient, + SystemEventNames, + EventGridEvent, + generate_sas, +) +from ._serialization import Deserializer, Serializer + + +DEFAULT_STANDARD_API_VERSION = "2024-06-01" +DEFAULT_BASIC_API_VERSION = "2018-01-01" + + +class EventGridPublisherClient(InternalEventGridPublisherClient): + """EventGridPublisherClient. + + Sends events to a basic topic, basic domain, or a namespace topic + specified during the client initialization. + + A single instance or a list of dictionaries, CloudEvents or EventGridEvents are accepted. + If a list is provided, the list must contain only one type of event. + If dictionaries are provided and sending to a namespace topic, + the dictionary must follow the CloudEvent schema. + + :param endpoint: The host name of the namespace, e.g. + namespaceName1.westus-1.eventgrid.azure.net. Required. 
+ :type endpoint: str + :param credential: Credential used to authenticate requests to the service. Is either a + AzureKeyCredential type or a TokenCredential type. Required. + :type credential: ~azure.core.credentials.AzureKeyCredential or + ~azure.core.credentials.TokenCredential + :keyword namespace_topic: The name of the topic to publish events to. Required for EventGrid Namespaces. + Default value is None, which is used for EventGrid Basic. + :paramtype namespace_topic: str or None + :keyword api_version: The API version to use for this operation. Default value is "2024-06-01". + Note that overriding this default value may result in unsupported behavior. + :paramtype api_version: str + """ + + def __init__( + self, + endpoint: str, + credential: Union[AzureKeyCredential, AzureSasCredential, "TokenCredential"], + *, + namespace_topic: Optional[str] = None, + api_version: Optional[str] = None, + **kwargs: Any, + ) -> None: + self._namespace = namespace_topic + self._credential = credential + + if not self._namespace: + self._client = LegacyEventGridPublisherClient( + endpoint, + credential, + api_version=api_version or DEFAULT_BASIC_API_VERSION, + ) # type:ignore[assignment] + self._publish = self._client.send # type:ignore[attr-defined] + else: + if isinstance(credential, AzureSasCredential): + raise TypeError("SAS token authentication is not supported for the standard client.") + super().__init__( + endpoint=endpoint, + credential=credential, + api_version=api_version or DEFAULT_STANDARD_API_VERSION, + **kwargs, + ) + + self._publish = self._send_events + self._serialize = Serializer() + self._deserialize = Deserializer() + self._serialize.client_side_validation = False + + def __repr__(self) -> str: + return ( + f"" + ) + + +class EventGridConsumerClient(InternalEventGridConsumerClient): + """EventGridConsumerClient. + + Consumes and manages events from a namespace topic + and event subscription specified during the client initialization. 
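The publisher `__init__` above routes construction through two defaults: without a `namespace_topic` it builds the legacy basic client on the 2018 API version, with one it uses the 2024 GA namespace version, and an explicit `api_version` always wins. That selection in isolation; the helper name is illustrative only:

```python
from typing import Optional

# Constants as declared earlier in this patch.
DEFAULT_STANDARD_API_VERSION = "2024-06-01"
DEFAULT_BASIC_API_VERSION = "2018-01-01"


def resolve_api_version(namespace_topic: Optional[str], api_version: Optional[str] = None) -> str:
    """Mirror the api_version fallback in the client constructors."""
    if api_version:
        return api_version
    return DEFAULT_STANDARD_API_VERSION if namespace_topic else DEFAULT_BASIC_API_VERSION


print(resolve_api_version(None))        # basic (legacy) client
print(resolve_api_version("my-topic"))  # namespace client
```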
+ + :param endpoint: The host name of the namespace, e.g. + namespaceName1.westus-1.eventgrid.azure.net. Required. + :type endpoint: str + :param credential: Credential used to authenticate requests to the service. Is either an + AzureKeyCredential type or a TokenCredential type. Required. + :type credential: ~azure.core.credentials.AzureKeyCredential or + ~azure.core.credentials.TokenCredential + :keyword namespace_topic: The name of the topic to consume events from. Required. + :paramtype namespace_topic: str + :keyword subscription: The name of the subscription to consume events from. Required. + :paramtype subscription: str + :keyword api_version: The API version to use for this operation. Default value is "2024-06-01". + Note that overriding this default value may result in unsupported behavior. + :paramtype api_version: str + """ + + def __init__( + self, + endpoint: str, + credential: Union[AzureKeyCredential, "TokenCredential"], + *, + namespace_topic: str, + subscription: str, + api_version: Optional[str] = None, + **kwargs: Any, + ) -> None: + self._namespace = namespace_topic + self._subscription = subscription + self._credential = credential + super().__init__( + endpoint=endpoint, credential=credential, api_version=api_version or DEFAULT_STANDARD_API_VERSION, **kwargs + ) + + def __repr__(self) -> str: + return f"" + + +def patch_sdk(): + """Do not remove from this file. 
+ `patch_sdk` is a last resort escape hatch that allows you to do customizations + you can't accomplish using the techniques described in + https://aka.ms/azsdk/python/dpcodegen/python/customize + """ + + +__all__: List[str] = [ + "EventGridPublisherClient", + "EventGridConsumerClient", + "SystemEventNames", + "EventGridEvent", + "generate_sas", +] # Add all objects you want publicly available to users at this package level diff --git a/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_serialization.py b/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_serialization.py new file mode 100644 index 000000000000..f0c6180722c8 --- /dev/null +++ b/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_serialization.py @@ -0,0 +1,1998 @@ +# -------------------------------------------------------------------------- +# +# Copyright (c) Microsoft Corporation. All rights reserved. +# +# The MIT License (MIT) +# +# Permission is hereby granted, free of charge, to any person obtaining a copy +# of this software and associated documentation files (the ""Software""), to +# deal in the Software without restriction, including without limitation the +# rights to use, copy, modify, merge, publish, distribute, sublicense, and/or +# sell copies of the Software, and to permit persons to whom the Software is +# furnished to do so, subject to the following conditions: +# +# The above copyright notice and this permission notice shall be included in +# all copies or substantial portions of the Software. +# +# THE SOFTWARE IS PROVIDED *AS IS*, WITHOUT WARRANTY OF ANY KIND, EXPRESS OR +# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, +# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE +# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER +# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING +# FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS +# IN THE SOFTWARE. 
+# +# -------------------------------------------------------------------------- + +# pylint: skip-file +# pyright: reportUnnecessaryTypeIgnoreComment=false + +from base64 import b64decode, b64encode +import calendar +import datetime +import decimal +import email +from enum import Enum +import json +import logging +import re +import sys +import codecs +from typing import ( + Dict, + Any, + cast, + Optional, + Union, + AnyStr, + IO, + Mapping, + Callable, + TypeVar, + MutableMapping, + Type, + List, + Mapping, +) + +try: + from urllib import quote # type: ignore +except ImportError: + from urllib.parse import quote +import xml.etree.ElementTree as ET + +import isodate # type: ignore + +from azure.core.exceptions import DeserializationError, SerializationError +from azure.core.serialization import NULL as CoreNull + +_BOM = codecs.BOM_UTF8.decode(encoding="utf-8") + +ModelType = TypeVar("ModelType", bound="Model") +JSON = MutableMapping[str, Any] + + +class RawDeserializer: + + # Accept "text" because we're open minded people... + JSON_REGEXP = re.compile(r"^(application|text)/([a-z+.]+\+)?json$") + + # Name used in context + CONTEXT_NAME = "deserialized_data" + + @classmethod + def deserialize_from_text(cls, data: Optional[Union[AnyStr, IO]], content_type: Optional[str] = None) -> Any: + """Decode data according to content-type. + + Accept a stream of data as well, but will be load at once in memory for now. + + If no content-type, will return the string version (not bytes, not stream) + + :param data: Input, could be bytes or stream (will be decoded with UTF8) or text + :type data: str or bytes or IO + :param str content_type: The content type. + """ + if hasattr(data, "read"): + # Assume a stream + data = cast(IO, data).read() + + if isinstance(data, bytes): + data_as_str = data.decode(encoding="utf-8-sig") + else: + # Explain to mypy the correct type. 
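`RawDeserializer.JSON_REGEXP` above is deliberately permissive: it accepts `text/` as well as `application/` media types, plus any `+json` structured suffix such as `application/cloudevents+json`. The same pattern, exercised on its own:

```python
import re

# Same pattern as RawDeserializer.JSON_REGEXP in _serialization.py.
JSON_REGEXP = re.compile(r"^(application|text)/([a-z+.]+\+)?json$")

for ct in ("application/json", "text/json", "application/cloudevents+json", "application/xml"):
    print(ct, "->", bool(JSON_REGEXP.match(ct)))
```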
+ data_as_str = cast(str, data) + + # Remove Byte Order Mark if present in string + data_as_str = data_as_str.lstrip(_BOM) + + if content_type is None: + return data + + if cls.JSON_REGEXP.match(content_type): + try: + return json.loads(data_as_str) + except ValueError as err: + raise DeserializationError("JSON is invalid: {}".format(err), err) + elif "xml" in (content_type or []): + try: + + try: + if isinstance(data, unicode): # type: ignore + # On Python 2.7, the XML parser will scream if I try a "fromstring" on a unicode string + data_as_str = data_as_str.encode(encoding="utf-8") # type: ignore + except NameError: + pass + + return ET.fromstring(data_as_str) # nosec + except ET.ParseError as err: + # It might be because the server has an issue, and returned JSON with + # content-type XML.... + # So let's try a JSON load, and if it's still broken + # let's flow the initial exception + def _json_attempt(data): + try: + return True, json.loads(data) + except ValueError: + return False, None # Don't care about this one + + success, json_result = _json_attempt(data) + if success: + return json_result + # If I'm here, it's not JSON, it's not XML, let's scream + # and raise the last context in this block (the XML exception) + # The function hack is because Py2.7 messes up with exception + # context otherwise. + _LOGGER.critical("Wasn't XML or JSON, failing") + raise DeserializationError("XML is invalid") from err + raise DeserializationError("Cannot deserialize content-type: {}".format(content_type)) + + @classmethod + def deserialize_from_http_generics(cls, body_bytes: Optional[Union[AnyStr, IO]], headers: Mapping) -> Any: + """Deserialize from HTTP response. + + Use bytes and headers to NOT use any requests/aiohttp or whatever + specific implementation. 
+ Headers will be tested for "content-type" + """ + # Try to use content-type from headers if available + content_type = None + if "content-type" in headers: + content_type = headers["content-type"].split(";")[0].strip().lower() + # Ouch, this server did not declare what it sent... + # Let's guess it's JSON... + # Also, since Autorest was considering that an empty body was a valid JSON, + # need that test as well.... + else: + content_type = "application/json" + + if body_bytes: + return cls.deserialize_from_text(body_bytes, content_type) + return None + + +_LOGGER = logging.getLogger(__name__) + +try: + _long_type = long # type: ignore +except NameError: + _long_type = int + + +class UTC(datetime.tzinfo): + """Time Zone info for handling UTC""" + + def utcoffset(self, dt): + """UTC offset for UTC is 0.""" + return datetime.timedelta(0) + + def tzname(self, dt): + """Timestamp representation.""" + return "Z" + + def dst(self, dt): + """No daylight saving for UTC.""" + return datetime.timedelta(hours=1) + + +try: + from datetime import timezone as _FixedOffset # type: ignore +except ImportError: # Python 2.7 + + class _FixedOffset(datetime.tzinfo): # type: ignore + """Fixed offset in minutes east from UTC. + Copy/pasted from Python doc + :param datetime.timedelta offset: offset in timedelta format + """ + + def __init__(self, offset): + self.__offset = offset + + def utcoffset(self, dt): + return self.__offset + + def tzname(self, dt): + return str(self.__offset.total_seconds() / 3600) + + def __repr__(self): + return "<FixedOffset {}>".format(self.tzname(None)) + + def dst(self, dt): + return datetime.timedelta(0) + + def __getinitargs__(self): + return (self.__offset,) + + +try: + from datetime import timezone + + TZ_UTC = timezone.utc +except ImportError: + TZ_UTC = UTC() # type: ignore + +_FLATTEN = re.compile(r"(? 
None: + self.additional_properties: Optional[Dict[str, Any]] = {} + for k in kwargs: + if k not in self._attribute_map: + _LOGGER.warning("%s is not a known attribute of class %s and will be ignored", k, self.__class__) + elif k in self._validation and self._validation[k].get("readonly", False): + _LOGGER.warning("Readonly attribute %s will be ignored in class %s", k, self.__class__) + else: + setattr(self, k, kwargs[k]) + + def __eq__(self, other: Any) -> bool: + """Compare objects by comparing all attributes.""" + if isinstance(other, self.__class__): + return self.__dict__ == other.__dict__ + return False + + def __ne__(self, other: Any) -> bool: + """Compare objects by comparing all attributes.""" + return not self.__eq__(other) + + def __str__(self) -> str: + return str(self.__dict__) + + @classmethod + def enable_additional_properties_sending(cls) -> None: + cls._attribute_map["additional_properties"] = {"key": "", "type": "{object}"} + + @classmethod + def is_xml_model(cls) -> bool: + try: + cls._xml_map # type: ignore + except AttributeError: + return False + return True + + @classmethod + def _create_xml_node(cls): + """Create XML node.""" + try: + xml_map = cls._xml_map # type: ignore + except AttributeError: + xml_map = {} + + return _create_xml_node(xml_map.get("name", cls.__name__), xml_map.get("prefix", None), xml_map.get("ns", None)) + + def serialize(self, keep_readonly: bool = False, **kwargs: Any) -> JSON: + """Return the JSON that would be sent to server from this model. + + This is an alias to `as_dict(full_restapi_key_transformer, keep_readonly=False)`. + + If you want XML serialization, you can pass the kwargs is_xml=True. 
+ + :param bool keep_readonly: If you want to serialize the readonly attributes + :returns: A dict JSON compatible object + :rtype: dict + """ + serializer = Serializer(self._infer_class_models()) + return serializer._serialize(self, keep_readonly=keep_readonly, **kwargs) # type: ignore + + def as_dict( + self, + keep_readonly: bool = True, + key_transformer: Callable[[str, Dict[str, Any], Any], Any] = attribute_transformer, + **kwargs: Any + ) -> JSON: + """Return a dict that can be serialized using json.dump. + + Advanced usage might optionally use a callback as parameter: + + .. code::python + + def my_key_transformer(key, attr_desc, value): + return key + + Key is the attribute name used in Python. Attr_desc + is a dict of metadata. Currently contains 'type' with the + msrest type and 'key' with the RestAPI encoded key. + Value is the current value in this object. + + The string returned will be used to serialize the key. + If the return type is a list, this is considered hierarchical + result dict. + + See the three examples in this file: + + - attribute_transformer + - full_restapi_key_transformer + - last_restapi_key_transformer + + If you want XML serialization, you can pass the kwargs is_xml=True. + + :param function key_transformer: A key transformer function. + :returns: A dict JSON compatible object + :rtype: dict + """ + serializer = Serializer(self._infer_class_models()) + return serializer._serialize(self, key_transformer=key_transformer, keep_readonly=keep_readonly, **kwargs) # type: ignore + + @classmethod + def _infer_class_models(cls): + try: + str_models = cls.__module__.rsplit(".", 1)[0] + models = sys.modules[str_models] + client_models = {k: v for k, v in models.__dict__.items() if isinstance(v, type)} + if cls.__name__ not in client_models: + raise ValueError("Not Autorest generated code") + except Exception: + # Assume it's not Autorest generated (tests?). Add ourselves as dependencies. 
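A key transformer of the shape `as_dict` describes above takes the Python attribute name, its `_attribute_map` entry, and the value, and returns the key (or key list) to serialize under plus the value. A minimal sketch that emits the wire key instead of the Python name:

```python
def wire_key_transformer(key, attr_desc, value):
    """Hypothetical transformer: serialize under the RestAPI key from the
    attribute metadata rather than the Python attribute name."""
    return attr_desc.get("key", key), value
```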
+ client_models = {cls.__name__: cls} + return client_models + + @classmethod + def deserialize(cls: Type[ModelType], data: Any, content_type: Optional[str] = None) -> ModelType: + """Parse a str using the RestAPI syntax and return a model. + + :param str data: A str using RestAPI structure. JSON by default. + :param str content_type: JSON by default, set application/xml if XML. + :returns: An instance of this model + :raises: DeserializationError if something went wrong + """ + deserializer = Deserializer(cls._infer_class_models()) + return deserializer(cls.__name__, data, content_type=content_type) # type: ignore + + @classmethod + def from_dict( + cls: Type[ModelType], + data: Any, + key_extractors: Optional[Callable[[str, Dict[str, Any], Any], Any]] = None, + content_type: Optional[str] = None, + ) -> ModelType: + """Parse a dict using given key extractor return a model. + + By default consider key + extractors (rest_key_case_insensitive_extractor, attribute_key_case_insensitive_extractor + and last_rest_key_case_insensitive_extractor) + + :param dict data: A dict using RestAPI structure + :param str content_type: JSON by default, set application/xml if XML. 
+ :returns: An instance of this model + :raises: DeserializationError if something went wrong + """ + deserializer = Deserializer(cls._infer_class_models()) + deserializer.key_extractors = ( # type: ignore + [ # type: ignore + attribute_key_case_insensitive_extractor, + rest_key_case_insensitive_extractor, + last_rest_key_case_insensitive_extractor, + ] + if key_extractors is None + else key_extractors + ) + return deserializer(cls.__name__, data, content_type=content_type) # type: ignore + + @classmethod + def _flatten_subtype(cls, key, objects): + if "_subtype_map" not in cls.__dict__: + return {} + result = dict(cls._subtype_map[key]) + for valuetype in cls._subtype_map[key].values(): + result.update(objects[valuetype]._flatten_subtype(key, objects)) + return result + + @classmethod + def _classify(cls, response, objects): + """Check the class _subtype_map for any child classes. + We want to ignore any inherited _subtype_maps. + Remove the polymorphic key from the initial data. + """ + for subtype_key in cls.__dict__.get("_subtype_map", {}).keys(): + subtype_value = None + + if not isinstance(response, ET.Element): + rest_api_response_key = cls._get_rest_key_parts(subtype_key)[-1] + subtype_value = response.pop(rest_api_response_key, None) or response.pop(subtype_key, None) + else: + subtype_value = xml_key_extractor(subtype_key, cls._attribute_map[subtype_key], response) + if subtype_value: + # Try to match base class. 
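The case-insensitive extractors that `from_dict` defaults to behave roughly like this standalone sketch (simplified: no flattened-key handling):

```python
def ci_key_extractor(attr, _attr_desc, data):
    """Match the wanted attribute name against dict keys ignoring case,
    as the case-insensitive extractors above do."""
    lower = attr.lower()
    for key in data:
        if key.lower() == lower:
            return data[key]
    return None
```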
Can be class name only + # (bug to fix in Autorest to support x-ms-discriminator-name) + if cls.__name__ == subtype_value: + return cls + flatten_mapping_type = cls._flatten_subtype(subtype_key, objects) + try: + return objects[flatten_mapping_type[subtype_value]] # type: ignore + except KeyError: + _LOGGER.warning( + "Subtype value %s has no mapping, use base class %s.", + subtype_value, + cls.__name__, + ) + break + else: + _LOGGER.warning("Discriminator %s is absent or null, use base class %s.", subtype_key, cls.__name__) + break + return cls + + @classmethod + def _get_rest_key_parts(cls, attr_key): + """Get the RestAPI key of this attr, split it and decode part + :param str attr_key: Attribute key must be in attribute_map. + :returns: A list of RestAPI part + :rtype: list + """ + rest_split_key = _FLATTEN.split(cls._attribute_map[attr_key]["key"]) + return [_decode_attribute_map_key(key_part) for key_part in rest_split_key] + + +def _decode_attribute_map_key(key): + """This decode a key in an _attribute_map to the actual key we want to look at + inside the received data. 
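The `_get_rest_key_parts`/`_decode_attribute_map_key` pair above splits a flattened RestAPI key on unescaped dots, then unescapes `\.` inside each part. A self-contained sketch (`rest_key_parts` is an illustrative name):

```python
import re

# Split on dots NOT preceded by a backslash, as _FLATTEN does above.
_FLATTEN = re.compile(r"(?<!\\)\.")

def rest_key_parts(flat_key):
    """Split a flattened RestAPI key and decode escaped dots in each part."""
    return [p.replace("\\.", ".") for p in _FLATTEN.split(flat_key)]
```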
+ + :param str key: A key string from the generated code + """ + return key.replace("\\.", ".") + + +class Serializer(object): + """Request object model serializer.""" + + basic_types = {str: "str", int: "int", bool: "bool", float: "float"} + + _xml_basic_types_serializers = {"bool": lambda x: str(x).lower()} + days = {0: "Mon", 1: "Tue", 2: "Wed", 3: "Thu", 4: "Fri", 5: "Sat", 6: "Sun"} + months = { + 1: "Jan", + 2: "Feb", + 3: "Mar", + 4: "Apr", + 5: "May", + 6: "Jun", + 7: "Jul", + 8: "Aug", + 9: "Sep", + 10: "Oct", + 11: "Nov", + 12: "Dec", + } + validation = { + "min_length": lambda x, y: len(x) < y, + "max_length": lambda x, y: len(x) > y, + "minimum": lambda x, y: x < y, + "maximum": lambda x, y: x > y, + "minimum_ex": lambda x, y: x <= y, + "maximum_ex": lambda x, y: x >= y, + "min_items": lambda x, y: len(x) < y, + "max_items": lambda x, y: len(x) > y, + "pattern": lambda x, y: not re.match(y, x, re.UNICODE), + "unique": lambda x, y: len(x) != len(set(x)), + "multiple": lambda x, y: x % y != 0, + } + + def __init__(self, classes: Optional[Mapping[str, type]] = None): + self.serialize_type = { + "iso-8601": Serializer.serialize_iso, + "rfc-1123": Serializer.serialize_rfc, + "unix-time": Serializer.serialize_unix, + "duration": Serializer.serialize_duration, + "date": Serializer.serialize_date, + "time": Serializer.serialize_time, + "decimal": Serializer.serialize_decimal, + "long": Serializer.serialize_long, + "bytearray": Serializer.serialize_bytearray, + "base64": Serializer.serialize_base64, + "object": self.serialize_object, + "[]": self.serialize_iter, + "{}": self.serialize_dict, + } + self.dependencies: Dict[str, type] = dict(classes) if classes else {} + self.key_transformer = full_restapi_key_transformer + self.client_side_validation = True + + def _serialize(self, target_obj, data_type=None, **kwargs): + """Serialize data into a string according to type. + + :param target_obj: The data to be serialized. 
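Note the convention in the `validation` table above: each lambda returns True when the value *violates* the constraint. A small sketch of that semantics (the `violated` helper is illustrative, not SDK API):

```python
import re

# Each check returns True when the value VIOLATES the constraint,
# matching Serializer.validation above.
checks = {
    "min_length": lambda x, y: len(x) < y,
    "maximum": lambda x, y: x > y,
    "pattern": lambda x, y: not re.match(y, x, re.UNICODE),
}

def violated(value, rules):
    """Return the names of the rules the value fails."""
    return [name for name, limit in rules.items() if checks[name](value, limit)]
```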
+ :param str data_type: The type to be serialized from. + :rtype: str, dict + :raises: SerializationError if serialization fails. + """ + key_transformer = kwargs.get("key_transformer", self.key_transformer) + keep_readonly = kwargs.get("keep_readonly", False) + if target_obj is None: + return None + + attr_name = None + class_name = target_obj.__class__.__name__ + + if data_type: + return self.serialize_data(target_obj, data_type, **kwargs) + + if not hasattr(target_obj, "_attribute_map"): + data_type = type(target_obj).__name__ + if data_type in self.basic_types.values(): + return self.serialize_data(target_obj, data_type, **kwargs) + + # Force "is_xml" kwargs if we detect a XML model + try: + is_xml_model_serialization = kwargs["is_xml"] + except KeyError: + is_xml_model_serialization = kwargs.setdefault("is_xml", target_obj.is_xml_model()) + + serialized = {} + if is_xml_model_serialization: + serialized = target_obj._create_xml_node() + try: + attributes = target_obj._attribute_map + for attr, attr_desc in attributes.items(): + attr_name = attr + if not keep_readonly and target_obj._validation.get(attr_name, {}).get("readonly", False): + continue + + if attr_name == "additional_properties" and attr_desc["key"] == "": + if target_obj.additional_properties is not None: + serialized.update(target_obj.additional_properties) + continue + try: + + orig_attr = getattr(target_obj, attr) + if is_xml_model_serialization: + pass # Don't provide "transformer" for XML for now. 
Keep "orig_attr" + else: # JSON + keys, orig_attr = key_transformer(attr, attr_desc.copy(), orig_attr) + keys = keys if isinstance(keys, list) else [keys] + + kwargs["serialization_ctxt"] = attr_desc + new_attr = self.serialize_data(orig_attr, attr_desc["type"], **kwargs) + + if is_xml_model_serialization: + xml_desc = attr_desc.get("xml", {}) + xml_name = xml_desc.get("name", attr_desc["key"]) + xml_prefix = xml_desc.get("prefix", None) + xml_ns = xml_desc.get("ns", None) + if xml_desc.get("attr", False): + if xml_ns: + ET.register_namespace(xml_prefix, xml_ns) + xml_name = "{{{}}}{}".format(xml_ns, xml_name) + serialized.set(xml_name, new_attr) # type: ignore + continue + if xml_desc.get("text", False): + serialized.text = new_attr # type: ignore + continue + if isinstance(new_attr, list): + serialized.extend(new_attr) # type: ignore + elif isinstance(new_attr, ET.Element): + # If the down XML has no XML/Name, we MUST replace the tag with the local tag. But keeping the namespaces. + if "name" not in getattr(orig_attr, "_xml_map", {}): + splitted_tag = new_attr.tag.split("}") + if len(splitted_tag) == 2: # Namespace + new_attr.tag = "}".join([splitted_tag[0], xml_name]) + else: + new_attr.tag = xml_name + serialized.append(new_attr) # type: ignore + else: # That's a basic type + # Integrate namespace if necessary + local_node = _create_xml_node(xml_name, xml_prefix, xml_ns) + local_node.text = str(new_attr) + serialized.append(local_node) # type: ignore + else: # JSON + for k in reversed(keys): # type: ignore + new_attr = {k: new_attr} + + _new_attr = new_attr + _serialized = serialized + for k in keys: # type: ignore + if k not in _serialized: + _serialized.update(_new_attr) # type: ignore + _new_attr = _new_attr[k] # type: ignore + _serialized = _serialized[k] + except ValueError as err: + if isinstance(err, SerializationError): + raise + + except (AttributeError, KeyError, TypeError) as err: + msg = "Attribute {} in object {} cannot be 
serialized.\n{}".format(attr_name, class_name, str(target_obj)) + raise SerializationError(msg) from err + else: + return serialized + + def body(self, data, data_type, **kwargs): + """Serialize data intended for a request body. + + :param data: The data to be serialized. + :param str data_type: The type to be serialized from. + :rtype: dict + :raises: SerializationError if serialization fails. + :raises: ValueError if data is None + """ + + # Just in case this is a dict + internal_data_type_str = data_type.strip("[]{}") + internal_data_type = self.dependencies.get(internal_data_type_str, None) + try: + is_xml_model_serialization = kwargs["is_xml"] + except KeyError: + if internal_data_type and issubclass(internal_data_type, Model): + is_xml_model_serialization = kwargs.setdefault("is_xml", internal_data_type.is_xml_model()) + else: + is_xml_model_serialization = False + if internal_data_type and not isinstance(internal_data_type, Enum): + try: + deserializer = Deserializer(self.dependencies) + # Since it's on serialization, it's almost sure that format is not JSON REST + # We're not able to deal with additional properties for now. + deserializer.additional_properties_detection = False + if is_xml_model_serialization: + deserializer.key_extractors = [ # type: ignore + attribute_key_case_insensitive_extractor, + ] + else: + deserializer.key_extractors = [ + rest_key_case_insensitive_extractor, + attribute_key_case_insensitive_extractor, + last_rest_key_case_insensitive_extractor, + ] + data = deserializer._deserialize(data_type, data) + except DeserializationError as err: + raise SerializationError("Unable to build a model: " + str(err)) from err + + return self._serialize(data, data_type, **kwargs) + + def url(self, name, data, data_type, **kwargs): + """Serialize data intended for a URL path. + + :param data: The data to be serialized. + :param str data_type: The type to be serialized from. + :rtype: str + :raises: TypeError if serialization fails. 
+ :raises: ValueError if data is None + """ + try: + output = self.serialize_data(data, data_type, **kwargs) + if data_type == "bool": + output = json.dumps(output) + + if kwargs.get("skip_quote") is True: + output = str(output) + output = output.replace("{", quote("{")).replace("}", quote("}")) + else: + output = quote(str(output), safe="") + except SerializationError: + raise TypeError("{} must be type {}.".format(name, data_type)) + else: + return output + + def query(self, name, data, data_type, **kwargs): + """Serialize data intended for a URL query. + + :param data: The data to be serialized. + :param str data_type: The type to be serialized from. + :keyword bool skip_quote: Whether to skip quote the serialized result. + Defaults to False. + :rtype: str, list + :raises: TypeError if serialization fails. + :raises: ValueError if data is None + """ + try: + # Treat the list aside, since we don't want to encode the div separator + if data_type.startswith("["): + internal_data_type = data_type[1:-1] + do_quote = not kwargs.get("skip_quote", False) + return self.serialize_iter(data, internal_data_type, do_quote=do_quote, **kwargs) + + # Not a list, regular serialization + output = self.serialize_data(data, data_type, **kwargs) + if data_type == "bool": + output = json.dumps(output) + if kwargs.get("skip_quote") is True: + output = str(output) + else: + output = quote(str(output), safe="") + except SerializationError: + raise TypeError("{} must be type {}.".format(name, data_type)) + else: + return str(output) + + def header(self, name, data, data_type, **kwargs): + """Serialize data intended for a request header. + + :param data: The data to be serialized. + :param str data_type: The type to be serialized from. + :rtype: str + :raises: TypeError if serialization fails. 
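The quoting rule used by `url()` above can be sketched in isolation: percent-encode the whole value unless `skip_quote` is set, in which case only the braces are encoded (helper name is hypothetical):

```python
from urllib.parse import quote

def quote_path_value(value, skip_quote=False):
    """Sketch of the url() quoting behavior above."""
    if skip_quote:
        # Leave the value as-is except for braces.
        return str(value).replace("{", quote("{")).replace("}", quote("}"))
    # Fully percent-encode, including "/" (safe="").
    return quote(str(value), safe="")
```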
+ :raises: ValueError if data is None + """ + try: + if data_type in ["[str]"]: + data = ["" if d is None else d for d in data] + + output = self.serialize_data(data, data_type, **kwargs) + if data_type == "bool": + output = json.dumps(output) + except SerializationError: + raise TypeError("{} must be type {}.".format(name, data_type)) + else: + return str(output) + + def serialize_data(self, data, data_type, **kwargs): + """Serialize generic data according to supplied data type. + + :param data: The data to be serialized. + :param str data_type: The type to be serialized from. + :param bool required: Whether it's essential that the data not be + empty or None + :raises: AttributeError if required data is None. + :raises: ValueError if data is None + :raises: SerializationError if serialization fails. + """ + if data is None: + raise ValueError("No value for given attribute") + + try: + if data is CoreNull: + return None + if data_type in self.basic_types.values(): + return self.serialize_basic(data, data_type, **kwargs) + + elif data_type in self.serialize_type: + return self.serialize_type[data_type](data, **kwargs) + + # If dependencies is empty, try with current data class + # It has to be a subclass of Enum anyway + enum_type = self.dependencies.get(data_type, data.__class__) + if issubclass(enum_type, Enum): + return Serializer.serialize_enum(data, enum_obj=enum_type) + + iter_type = data_type[0] + data_type[-1] + if iter_type in self.serialize_type: + return self.serialize_type[iter_type](data, data_type[1:-1], **kwargs) + + except (ValueError, TypeError) as err: + msg = "Unable to serialize value: {!r} as type: {!r}." 
+ raise SerializationError(msg.format(data, data_type)) from err + else: + return self._serialize(data, **kwargs) + + @classmethod + def _get_custom_serializers(cls, data_type, **kwargs): + custom_serializer = kwargs.get("basic_types_serializers", {}).get(data_type) + if custom_serializer: + return custom_serializer + if kwargs.get("is_xml", False): + return cls._xml_basic_types_serializers.get(data_type) + + @classmethod + def serialize_basic(cls, data, data_type, **kwargs): + """Serialize basic builtin data type. + Serializes objects to str, int, float or bool. + + Possible kwargs: + - basic_types_serializers dict[str, callable] : If set, use the callable as serializer + - is_xml bool : If set, use xml_basic_types_serializers + + :param data: Object to be serialized. + :param str data_type: Type of object in the iterable. + """ + custom_serializer = cls._get_custom_serializers(data_type, **kwargs) + if custom_serializer: + return custom_serializer(data) + if data_type == "str": + return cls.serialize_unicode(data) + return eval(data_type)(data) # nosec + + @classmethod + def serialize_unicode(cls, data): + """Special handling for serializing unicode strings in Py2. + Encode to UTF-8 if unicode, otherwise handle as a str. + + :param data: Object to be serialized. + :rtype: str + """ + try: # If I received an enum, return its value + return data.value + except AttributeError: + pass + + try: + if isinstance(data, unicode): # type: ignore + # Don't change it, JSON and XML ElementTree are totally able + # to serialize correctly u'' strings + return data + except NameError: + return str(data) + else: + return str(data) + + def serialize_iter(self, data, iter_type, div=None, **kwargs): + """Serialize iterable. + + Supported kwargs: + - serialization_ctxt dict : The current entry of _attribute_map, or same format. + serialization_ctxt['type'] should be same as data_type. + - is_xml bool : If set, serialize as XML + + :param list attr: Object to be serialized.
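The `eval(data_type)(data)  # nosec` cast in `serialize_basic` above resolves the type-name string to the builtin and calls it. An explicit mapping does the same without `eval`; this is a sketch, not the SDK's implementation:

```python
# Explicit equivalent of eval("int")(value) for the four basic types.
_CASTS = {"str": str, "int": int, "float": float, "bool": bool}

def cast_basic(value, data_type):
    """Cast a value to one of the basic wire types by name."""
    return _CASTS[data_type](value)
```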
+ :param str iter_type: Type of object in the iterable. + :param bool required: Whether the objects in the iterable must + not be None or empty. + :param str div: If set, this str will be used to combine the elements + in the iterable into a combined string. Default is 'None'. + :keyword bool do_quote: Whether to quote the serialized result of each iterable element. + Defaults to False. + :rtype: list, str + """ + if isinstance(data, str): + raise SerializationError("Refuse str type as a valid iter type.") + + serialization_ctxt = kwargs.get("serialization_ctxt", {}) + is_xml = kwargs.get("is_xml", False) + + serialized = [] + for d in data: + try: + serialized.append(self.serialize_data(d, iter_type, **kwargs)) + except ValueError as err: + if isinstance(err, SerializationError): + raise + serialized.append(None) + + if kwargs.get("do_quote", False): + serialized = ["" if s is None else quote(str(s), safe="") for s in serialized] + + if div: + serialized = ["" if s is None else str(s) for s in serialized] + serialized = div.join(serialized) + + if "xml" in serialization_ctxt or is_xml: + # XML serialization is more complicated + xml_desc = serialization_ctxt.get("xml", {}) + xml_name = xml_desc.get("name") + if not xml_name: + xml_name = serialization_ctxt["key"] + + # Create a wrap node if necessary (use the fact that Element and list have "append") + is_wrapped = xml_desc.get("wrapped", False) + node_name = xml_desc.get("itemsName", xml_name) + if is_wrapped: + final_result = _create_xml_node(xml_name, xml_desc.get("prefix", None), xml_desc.get("ns", None)) + else: + final_result = [] + # All list elements to "local_node" + for el in serialized: + if isinstance(el, ET.Element): + el_node = el + else: + el_node = _create_xml_node(node_name, xml_desc.get("prefix", None), xml_desc.get("ns", None)) + if el is not None: # Otherwise it writes "None" :-p + el_node.text = str(el) + final_result.append(el_node) + return final_result + return serialized + + def 
serialize_dict(self, attr, dict_type, **kwargs): + """Serialize a dictionary of objects. + + :param dict attr: Object to be serialized. + :param str dict_type: Type of object in the dictionary. + :param bool required: Whether the objects in the dictionary must + not be None or empty. + :rtype: dict + """ + serialization_ctxt = kwargs.get("serialization_ctxt", {}) + serialized = {} + for key, value in attr.items(): + try: + serialized[self.serialize_unicode(key)] = self.serialize_data(value, dict_type, **kwargs) + except ValueError as err: + if isinstance(err, SerializationError): + raise + serialized[self.serialize_unicode(key)] = None + + if "xml" in serialization_ctxt: + # XML serialization is more complicated + xml_desc = serialization_ctxt["xml"] + xml_name = xml_desc["name"] + + final_result = _create_xml_node(xml_name, xml_desc.get("prefix", None), xml_desc.get("ns", None)) + for key, value in serialized.items(): + ET.SubElement(final_result, key).text = value + return final_result + + return serialized + + def serialize_object(self, attr, **kwargs): + """Serialize a generic object. + This will be handled as a dictionary. If object passed in is not + a basic type (str, int, float, dict, list) it will simply be + cast to str. + + :param dict attr: Object to be serialized. 
+ :rtype: dict or str + """ + if attr is None: + return None + if isinstance(attr, ET.Element): + return attr + obj_type = type(attr) + if obj_type in self.basic_types: + return self.serialize_basic(attr, self.basic_types[obj_type], **kwargs) + if obj_type is _long_type: + return self.serialize_long(attr) + if obj_type is str: + return self.serialize_unicode(attr) + if obj_type is datetime.datetime: + return self.serialize_iso(attr) + if obj_type is datetime.date: + return self.serialize_date(attr) + if obj_type is datetime.time: + return self.serialize_time(attr) + if obj_type is datetime.timedelta: + return self.serialize_duration(attr) + if obj_type is decimal.Decimal: + return self.serialize_decimal(attr) + + # If it's a model or I know this dependency, serialize as a Model + elif obj_type in self.dependencies.values() or isinstance(attr, Model): + return self._serialize(attr) + + if obj_type == dict: + serialized = {} + for key, value in attr.items(): + try: + serialized[self.serialize_unicode(key)] = self.serialize_object(value, **kwargs) + except ValueError: + serialized[self.serialize_unicode(key)] = None + return serialized + + if obj_type == list: + serialized = [] + for obj in attr: + try: + serialized.append(self.serialize_object(obj, **kwargs)) + except ValueError: + pass + return serialized + return str(attr) + + @staticmethod + def serialize_enum(attr, enum_obj=None): + try: + result = attr.value + except AttributeError: + result = attr + try: + enum_obj(result) # type: ignore + return result + except ValueError: + for enum_value in enum_obj: # type: ignore + if enum_value.value.lower() == str(attr).lower(): + return enum_value.value + error = "{!r} is not valid value for enum {!r}" + raise SerializationError(error.format(attr, enum_obj)) + + @staticmethod + def serialize_bytearray(attr, **kwargs): + """Serialize bytearray into base-64 string. + + :param attr: Object to be serialized. 
+ :rtype: str + """ + return b64encode(attr).decode() + + @staticmethod + def serialize_base64(attr, **kwargs): + """Serialize str into base-64 string. + + :param attr: Object to be serialized. + :rtype: str + """ + encoded = b64encode(attr).decode("ascii") + return encoded.strip("=").replace("+", "-").replace("/", "_") + + @staticmethod + def serialize_decimal(attr, **kwargs): + """Serialize Decimal object to float. + + :param attr: Object to be serialized. + :rtype: float + """ + return float(attr) + + @staticmethod + def serialize_long(attr, **kwargs): + """Serialize long (Py2) or int (Py3). + + :param attr: Object to be serialized. + :rtype: int/long + """ + return _long_type(attr) + + @staticmethod + def serialize_date(attr, **kwargs): + """Serialize Date object into ISO-8601 formatted string. + + :param Date attr: Object to be serialized. + :rtype: str + """ + if isinstance(attr, str): + attr = isodate.parse_date(attr) + t = "{:04}-{:02}-{:02}".format(attr.year, attr.month, attr.day) + return t + + @staticmethod + def serialize_time(attr, **kwargs): + """Serialize Time object into ISO-8601 formatted string. + + :param datetime.time attr: Object to be serialized. + :rtype: str + """ + if isinstance(attr, str): + attr = isodate.parse_time(attr) + t = "{:02}:{:02}:{:02}".format(attr.hour, attr.minute, attr.second) + if attr.microsecond: + t += ".{:02}".format(attr.microsecond) + return t + + @staticmethod + def serialize_duration(attr, **kwargs): + """Serialize TimeDelta object into ISO-8601 formatted string. + + :param TimeDelta attr: Object to be serialized. + :rtype: str + """ + if isinstance(attr, str): + attr = isodate.parse_duration(attr) + return isodate.duration_isoformat(attr) + + @staticmethod + def serialize_rfc(attr, **kwargs): + """Serialize Datetime object into RFC-1123 formatted string. + + :param Datetime attr: Object to be serialized. + :rtype: str + :raises: TypeError if format invalid. 
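`serialize_base64` above emits the base64url variant: standard base64 with padding stripped and `+`/`/` mapped to `-`/`_`. A standalone sketch of the same transformation:

```python
from base64 import b64encode

def b64url(data):
    """Mirror serialize_base64 above: padding stripped, URL-safe alphabet."""
    return b64encode(data).decode("ascii").strip("=").replace("+", "-").replace("/", "_")
```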
+ """ + try: + if not attr.tzinfo: + _LOGGER.warning("Datetime with no tzinfo will be considered UTC.") + utc = attr.utctimetuple() + except AttributeError: + raise TypeError("RFC1123 object must be valid Datetime object.") + + return "{}, {:02} {} {:04} {:02}:{:02}:{:02} GMT".format( + Serializer.days[utc.tm_wday], + utc.tm_mday, + Serializer.months[utc.tm_mon], + utc.tm_year, + utc.tm_hour, + utc.tm_min, + utc.tm_sec, + ) + + @staticmethod + def serialize_iso(attr, **kwargs): + """Serialize Datetime object into ISO-8601 formatted string. + + :param Datetime attr: Object to be serialized. + :rtype: str + :raises: SerializationError if format invalid. + """ + if isinstance(attr, str): + attr = isodate.parse_datetime(attr) + try: + if not attr.tzinfo: + _LOGGER.warning("Datetime with no tzinfo will be considered UTC.") + utc = attr.utctimetuple() + if utc.tm_year > 9999 or utc.tm_year < 1: + raise OverflowError("Hit max or min date") + + microseconds = str(attr.microsecond).rjust(6, "0").rstrip("0").ljust(3, "0") + if microseconds: + microseconds = "." + microseconds + date = "{:04}-{:02}-{:02}T{:02}:{:02}:{:02}".format( + utc.tm_year, utc.tm_mon, utc.tm_mday, utc.tm_hour, utc.tm_min, utc.tm_sec + ) + return date + microseconds + "Z" + except (ValueError, OverflowError) as err: + msg = "Unable to serialize datetime object." + raise SerializationError(msg) from err + except AttributeError as err: + msg = "ISO-8601 object must be valid Datetime object." + raise TypeError(msg) from err + + @staticmethod + def serialize_unix(attr, **kwargs): + """Serialize Datetime object into IntTime format. + This is represented as seconds. + + :param Datetime attr: Object to be serialized. 
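The output shape `serialize_iso` above produces (UTC-normalized, "Z"-suffixed ISO-8601) can be sketched with the stdlib alone; naive datetimes are assumed UTC, as the warning above notes. Fractional seconds are omitted here for brevity:

```python
import datetime

def iso_utc(dt):
    """Sketch of the serialize_iso output shape, without microseconds."""
    if dt.tzinfo is None:
        # Naive datetimes are treated as UTC, matching the warning above.
        dt = dt.replace(tzinfo=datetime.timezone.utc)
    dt = dt.astimezone(datetime.timezone.utc)
    return dt.strftime("%Y-%m-%dT%H:%M:%S") + "Z"
```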
+ :rtype: int + :raises: SerializationError if format invalid + """ + if isinstance(attr, int): + return attr + try: + if not attr.tzinfo: + _LOGGER.warning("Datetime with no tzinfo will be considered UTC.") + return int(calendar.timegm(attr.utctimetuple())) + except AttributeError: + raise TypeError("Unix time object must be valid Datetime object.") + + +def rest_key_extractor(attr, attr_desc, data): + key = attr_desc["key"] + working_data = data + + while "." in key: + # Need the cast, as for some reasons "split" is typed as list[str | Any] + dict_keys = cast(List[str], _FLATTEN.split(key)) + if len(dict_keys) == 1: + key = _decode_attribute_map_key(dict_keys[0]) + break + working_key = _decode_attribute_map_key(dict_keys[0]) + working_data = working_data.get(working_key, data) + if working_data is None: + # If at any point while following flatten JSON path see None, it means + # that all properties under are None as well + return None + key = ".".join(dict_keys[1:]) + + return working_data.get(key) + + +def rest_key_case_insensitive_extractor(attr, attr_desc, data): + key = attr_desc["key"] + working_data = data + + while "." 
in key: + dict_keys = _FLATTEN.split(key) + if len(dict_keys) == 1: + key = _decode_attribute_map_key(dict_keys[0]) + break + working_key = _decode_attribute_map_key(dict_keys[0]) + working_data = attribute_key_case_insensitive_extractor(working_key, None, working_data) + if working_data is None: + # If at any point while following flatten JSON path see None, it means + # that all properties under are None as well + return None + key = ".".join(dict_keys[1:]) + + if working_data: + return attribute_key_case_insensitive_extractor(key, None, working_data) + + +def last_rest_key_extractor(attr, attr_desc, data): + """Extract the attribute in "data" based on the last part of the JSON path key.""" + key = attr_desc["key"] + dict_keys = _FLATTEN.split(key) + return attribute_key_extractor(dict_keys[-1], None, data) + + +def last_rest_key_case_insensitive_extractor(attr, attr_desc, data): + """Extract the attribute in "data" based on the last part of the JSON path key. + + This is the case insensitive version of "last_rest_key_extractor" + """ + key = attr_desc["key"] + dict_keys = _FLATTEN.split(key) + return attribute_key_case_insensitive_extractor(dict_keys[-1], None, data) + + +def attribute_key_extractor(attr, _, data): + return data.get(attr) + + +def attribute_key_case_insensitive_extractor(attr, _, data): + found_key = None + lower_attr = attr.lower() + for key in data: + if lower_attr == key.lower(): + found_key = key + break + + return data.get(found_key) + + +def _extract_name_from_internal_type(internal_type): + """Given an internal type XML description, extract correct XML name with namespace. 
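The rest-key extractors above walk a nested dict along a dotted path, short-circuiting to None as soon as a level is missing. A simplified sketch (escaped-dot handling omitted):

```python
def get_flattened(data, dotted_key):
    """Walk a nested dict along a dotted RestAPI key, as the
    rest_key extractors above do."""
    parts = dotted_key.split(".")
    for part in parts[:-1]:
        data = data.get(part)
        if data is None:
            # A None at any level means everything underneath is absent too.
            return None
    return data.get(parts[-1])
```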
+ + :param dict internal_type: An model type + :rtype: tuple + :returns: A tuple XML name + namespace dict + """ + internal_type_xml_map = getattr(internal_type, "_xml_map", {}) + xml_name = internal_type_xml_map.get("name", internal_type.__name__) + xml_ns = internal_type_xml_map.get("ns", None) + if xml_ns: + xml_name = "{{{}}}{}".format(xml_ns, xml_name) + return xml_name + + +def xml_key_extractor(attr, attr_desc, data): + if isinstance(data, dict): + return None + + # Test if this model is XML ready first + if not isinstance(data, ET.Element): + return None + + xml_desc = attr_desc.get("xml", {}) + xml_name = xml_desc.get("name", attr_desc["key"]) + + # Look for a children + is_iter_type = attr_desc["type"].startswith("[") + is_wrapped = xml_desc.get("wrapped", False) + internal_type = attr_desc.get("internalType", None) + internal_type_xml_map = getattr(internal_type, "_xml_map", {}) + + # Integrate namespace if necessary + xml_ns = xml_desc.get("ns", internal_type_xml_map.get("ns", None)) + if xml_ns: + xml_name = "{{{}}}{}".format(xml_ns, xml_name) + + # If it's an attribute, that's simple + if xml_desc.get("attr", False): + return data.get(xml_name) + + # If it's x-ms-text, that's simple too + if xml_desc.get("text", False): + return data.text + + # Scenario where I take the local name: + # - Wrapped node + # - Internal type is an enum (considered basic types) + # - Internal type has no XML/Name node + if is_wrapped or (internal_type and (issubclass(internal_type, Enum) or "name" not in internal_type_xml_map)): + children = data.findall(xml_name) + # If internal type has a local name and it's not a list, I use that name + elif not is_iter_type and internal_type and "name" in internal_type_xml_map: + xml_name = _extract_name_from_internal_type(internal_type) + children = data.findall(xml_name) + # That's an array + else: + if internal_type: # Complex type, ignore itemsName and use the complex type name + items_name = 
_extract_name_from_internal_type(internal_type) + else: + items_name = xml_desc.get("itemsName", xml_name) + children = data.findall(items_name) + + if len(children) == 0: + if is_iter_type: + if is_wrapped: + return None # is_wrapped no node, we want None + else: + return [] # not wrapped, assume empty list + return None # Assume it's not there, maybe an optional node. + + # If is_iter_type and not wrapped, return all found children + if is_iter_type: + if not is_wrapped: + return children + else: # Iter and wrapped, should have found one node only (the wrap one) + if len(children) != 1: + raise DeserializationError( + "Tried to deserialize an array not wrapped, and found several nodes '{}'. Maybe you should declare this array as wrapped?".format( + xml_name + ) + ) + return list(children[0]) # Might be empty list and that's ok. + + # Here it's not a itertype, we should have found one element only or empty + if len(children) > 1: + raise DeserializationError("Find several XML '{}' where it was not expected".format(xml_name)) + return children[0] + + +class Deserializer(object): + """Response object model deserializer. + + :param dict classes: Class type dictionary for deserializing complex types. + :ivar list key_extractors: Ordered list of extractors to be used by this deserializer. 
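The wrapped/unwrapped branching in `xml_key_extractor` above comes down to two XML shapes for the same logical array: a wrapped array has a single container node whose children are the items, while an unwrapped array repeats the item node directly under the parent. A small illustration with hypothetical element names:

```python
import xml.etree.ElementTree as ET

# Wrapped: one <Colors> container, items are its children.
wrapped = ET.fromstring("<Colors><Color>red</Color><Color>blue</Color></Colors>")
# Unwrapped: the <Color> item node simply repeats under the parent.
parent = ET.fromstring("<Root><Color>red</Color><Color>blue</Color></Root>")

wrapped_items = list(wrapped)            # children of the wrapper node
unwrapped_items = parent.findall("Color")  # all repeated item nodes

[el.text for el in wrapped_items]    # ['red', 'blue']
[el.text for el in unwrapped_items]  # ['red', 'blue']
```

This is why the extractor raises `DeserializationError` when a wrapped array matches more than one node: a wrapped lookup expects exactly one container, then takes `list(children[0])`.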
+ """ + + basic_types = {str: "str", int: "int", bool: "bool", float: "float"} + + valid_date = re.compile(r"\d{4}[-]\d{2}[-]\d{2}T\d{2}:\d{2}:\d{2}" r"\.?\d*Z?[-+]?[\d{2}]?:?[\d{2}]?") + + def __init__(self, classes: Optional[Mapping[str, type]] = None): + self.deserialize_type = { + "iso-8601": Deserializer.deserialize_iso, + "rfc-1123": Deserializer.deserialize_rfc, + "unix-time": Deserializer.deserialize_unix, + "duration": Deserializer.deserialize_duration, + "date": Deserializer.deserialize_date, + "time": Deserializer.deserialize_time, + "decimal": Deserializer.deserialize_decimal, + "long": Deserializer.deserialize_long, + "bytearray": Deserializer.deserialize_bytearray, + "base64": Deserializer.deserialize_base64, + "object": self.deserialize_object, + "[]": self.deserialize_iter, + "{}": self.deserialize_dict, + } + self.deserialize_expected_types = { + "duration": (isodate.Duration, datetime.timedelta), + "iso-8601": (datetime.datetime), + } + self.dependencies: Dict[str, type] = dict(classes) if classes else {} + self.key_extractors = [rest_key_extractor, xml_key_extractor] + # Additional properties only works if the "rest_key_extractor" is used to + # extract the keys. Making it to work whatever the key extractor is too much + # complicated, with no real scenario for now. + # So adding a flag to disable additional properties detection. This flag should be + # used if your expect the deserialization to NOT come from a JSON REST syntax. + # Otherwise, result are unexpected + self.additional_properties_detection = True + + def __call__(self, target_obj, response_data, content_type=None): + """Call the deserializer to process a REST response. + + :param str target_obj: Target data type to deserialize to. + :param requests.Response response_data: REST response object. + :param str content_type: Swagger "produces" if available. + :raises: DeserializationError if deserialization fails. + :return: Deserialized object. 
+ """ + data = self._unpack_content(response_data, content_type) + return self._deserialize(target_obj, data) + + def _deserialize(self, target_obj, data): + """Call the deserializer on a model. + + Data needs to be already deserialized as JSON or XML ElementTree + + :param str target_obj: Target data type to deserialize to. + :param object data: Object to deserialize. + :raises: DeserializationError if deserialization fails. + :return: Deserialized object. + """ + # This is already a model, go recursive just in case + if hasattr(data, "_attribute_map"): + constants = [name for name, config in getattr(data, "_validation", {}).items() if config.get("constant")] + try: + for attr, mapconfig in data._attribute_map.items(): + if attr in constants: + continue + value = getattr(data, attr) + if value is None: + continue + local_type = mapconfig["type"] + internal_data_type = local_type.strip("[]{}") + if internal_data_type not in self.dependencies or isinstance(internal_data_type, Enum): + continue + setattr(data, attr, self._deserialize(local_type, value)) + return data + except AttributeError: + return + + response, class_name = self._classify_target(target_obj, data) + + if isinstance(response, str): + return self.deserialize_data(data, response) + elif isinstance(response, type) and issubclass(response, Enum): + return self.deserialize_enum(data, response) + + if data is None or data is CoreNull: + return data + try: + attributes = response._attribute_map # type: ignore + d_attrs = {} + for attr, attr_desc in attributes.items(): + # Check empty string. If it's not empty, someone has a real "additionalProperties"... 
+ if attr == "additional_properties" and attr_desc["key"] == "": + continue + raw_value = None + # Enhance attr_desc with some dynamic data + attr_desc = attr_desc.copy() # Do a copy, do not change the real one + internal_data_type = attr_desc["type"].strip("[]{}") + if internal_data_type in self.dependencies: + attr_desc["internalType"] = self.dependencies[internal_data_type] + + for key_extractor in self.key_extractors: + found_value = key_extractor(attr, attr_desc, data) + if found_value is not None: + if raw_value is not None and raw_value != found_value: + msg = ( + "Ignoring extracted value '%s' from %s for key '%s'" + " (duplicate extraction, follow extractors order)" + ) + _LOGGER.warning(msg, found_value, key_extractor, attr) + continue + raw_value = found_value + + value = self.deserialize_data(raw_value, attr_desc["type"]) + d_attrs[attr] = value + except (AttributeError, TypeError, KeyError) as err: + msg = "Unable to deserialize to object: " + class_name # type: ignore + raise DeserializationError(msg) from err + else: + additional_properties = self._build_additional_properties(attributes, data) + return self._instantiate_model(response, d_attrs, additional_properties) + + def _build_additional_properties(self, attribute_map, data): + if not self.additional_properties_detection: + return None + if "additional_properties" in attribute_map and attribute_map.get("additional_properties", {}).get("key") != "": + # Check empty string. 
If it's not empty, someone has a real "additionalProperties" + return None + if isinstance(data, ET.Element): + data = {el.tag: el.text for el in data} + + known_keys = { + _decode_attribute_map_key(_FLATTEN.split(desc["key"])[0]) + for desc in attribute_map.values() + if desc["key"] != "" + } + present_keys = set(data.keys()) + missing_keys = present_keys - known_keys + return {key: data[key] for key in missing_keys} + + def _classify_target(self, target, data): + """Check to see whether the deserialization target object can + be classified into a subclass. + Once classification has been determined, initialize object. + + :param str target: The target object type to deserialize to. + :param str/dict data: The response data to deserialize. + """ + if target is None: + return None, None + + if isinstance(target, str): + try: + target = self.dependencies[target] + except KeyError: + return target, target + + try: + target = target._classify(data, self.dependencies) # type: ignore + except AttributeError: + pass # Target is not a Model, no classify + return target, target.__class__.__name__ # type: ignore + + def failsafe_deserialize(self, target_obj, data, content_type=None): + """Ignores any errors encountered in deserialization, + and falls back to not deserializing the object. Recommended + for use in error deserialization, as we want to return the + HttpResponseError to users, and not have them deal with + a deserialization error. + + :param str target_obj: The target object type to deserialize to. + :param str/dict data: The response data to deserialize. + :param str content_type: Swagger "produces" if available. + """ + try: + return self(target_obj, data, content_type=content_type) + except: + _LOGGER.debug( + "Ran into a deserialization error. Ignoring since this is failsafe deserialization", exc_info=True + ) + return None + + @staticmethod + def _unpack_content(raw_data, content_type=None): + """Extract the correct structure for deserialization. 
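`_build_additional_properties` above is a set difference: every top-level payload key not claimed by the attribute map is treated as an additional property. A condensed sketch (simplified: it skips `_decode_attribute_map_key`, and the attribute map and payload are made-up examples):

```python
import re

_FLATTEN = re.compile(r"(?<!\\)\.")

# Hypothetical attribute map; only the first path segment of each key
# counts as "known" at the top level of the payload.
attribute_map = {
    "name": {"key": "name"},
    "size": {"key": "properties.size"},
    "additional_properties": {"key": ""},  # empty key is skipped
}
data = {"name": "a", "properties": {"size": 1}, "extra": 42}

known_keys = {
    _FLATTEN.split(desc["key"])[0]
    for desc in attribute_map.values()
    if desc["key"] != ""
}
additional = {key: data[key] for key in set(data) - known_keys}
# additional holds only the unclaimed key: {"extra": 42}
```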
+ + If raw_data is a PipelineResponse, try to extract the result of RawDeserializer. + if we can't, raise. Your Pipeline should have a RawDeserializer. + + If not a pipeline response and raw_data is bytes or string, use content-type + to decode it. If no content-type, try JSON. + + If raw_data is something else, bypass all logic and return it directly. + + :param raw_data: Data to be processed. + :param content_type: How to parse if raw_data is a string/bytes. + :raises JSONDecodeError: If JSON is requested and parsing is impossible. + :raises UnicodeDecodeError: If bytes is not UTF8 + """ + # Assume this is enough to detect a Pipeline Response without importing it + context = getattr(raw_data, "context", {}) + if context: + if RawDeserializer.CONTEXT_NAME in context: + return context[RawDeserializer.CONTEXT_NAME] + raise ValueError("This pipeline didn't have the RawDeserializer policy; can't deserialize") + + # Assume this is enough to recognize universal_http.ClientResponse without importing it + if hasattr(raw_data, "body"): + return RawDeserializer.deserialize_from_http_generics(raw_data.text(), raw_data.headers) + + # Assume this enough to recognize requests.Response without importing it. + if hasattr(raw_data, "_content_consumed"): + return RawDeserializer.deserialize_from_http_generics(raw_data.text, raw_data.headers) + + if isinstance(raw_data, (str, bytes)) or hasattr(raw_data, "read"): + return RawDeserializer.deserialize_from_text(raw_data, content_type) # type: ignore + return raw_data + + def _instantiate_model(self, response, attrs, additional_properties=None): + """Instantiate a response model passing in deserialized args. + + :param response: The response model class. + :param d_attrs: The deserialized response attributes. 
+ """ + if callable(response): + subtype = getattr(response, "_subtype_map", {}) + try: + readonly = [k for k, v in response._validation.items() if v.get("readonly")] + const = [k for k, v in response._validation.items() if v.get("constant")] + kwargs = {k: v for k, v in attrs.items() if k not in subtype and k not in readonly + const} + response_obj = response(**kwargs) + for attr in readonly: + setattr(response_obj, attr, attrs.get(attr)) + if additional_properties: + response_obj.additional_properties = additional_properties + return response_obj + except TypeError as err: + msg = "Unable to deserialize {} into model {}. ".format(kwargs, response) # type: ignore + raise DeserializationError(msg + str(err)) + else: + try: + for attr, value in attrs.items(): + setattr(response, attr, value) + return response + except Exception as exp: + msg = "Unable to populate response model. " + msg += "Type: {}, Error: {}".format(type(response), exp) + raise DeserializationError(msg) + + def deserialize_data(self, data, data_type): + """Process data for deserialization according to data type. + + :param str data: The response string to be deserialized. + :param str data_type: The type to deserialize to. + :raises: DeserializationError if deserialization fails. + :return: Deserialized object. 
+ """ + if data is None: + return data + + try: + if not data_type: + return data + if data_type in self.basic_types.values(): + return self.deserialize_basic(data, data_type) + if data_type in self.deserialize_type: + if isinstance(data, self.deserialize_expected_types.get(data_type, tuple())): + return data + + is_a_text_parsing_type = lambda x: x not in ["object", "[]", r"{}"] + if isinstance(data, ET.Element) and is_a_text_parsing_type(data_type) and not data.text: + return None + data_val = self.deserialize_type[data_type](data) + return data_val + + iter_type = data_type[0] + data_type[-1] + if iter_type in self.deserialize_type: + return self.deserialize_type[iter_type](data, data_type[1:-1]) + + obj_type = self.dependencies[data_type] + if issubclass(obj_type, Enum): + if isinstance(data, ET.Element): + data = data.text + return self.deserialize_enum(data, obj_type) + + except (ValueError, TypeError, AttributeError) as err: + msg = "Unable to deserialize response data." + msg += " Data: {}, {}".format(data, data_type) + raise DeserializationError(msg) from err + else: + return self._deserialize(obj_type, data) + + def deserialize_iter(self, attr, iter_type): + """Deserialize an iterable. + + :param list attr: Iterable to be deserialized. + :param str iter_type: The type of object in the iterable. + :rtype: list + """ + if attr is None: + return None + if isinstance(attr, ET.Element): # If I receive an element here, get the children + attr = list(attr) + if not isinstance(attr, (list, set)): + raise DeserializationError("Cannot deserialize as [{}] an object of type {}".format(iter_type, type(attr))) + return [self.deserialize_data(a, iter_type) for a in attr] + + def deserialize_dict(self, attr, dict_type): + """Deserialize a dictionary. + + :param dict/list attr: Dictionary to be deserialized. Also accepts + a list of key, value pairs. + :param str dict_type: The object type of the items in the dictionary. 
+ :rtype: dict + """ + if isinstance(attr, list): + return {x["key"]: self.deserialize_data(x["value"], dict_type) for x in attr} + + if isinstance(attr, ET.Element): + # Transform value into {"Key": "value"} + attr = {el.tag: el.text for el in attr} + return {k: self.deserialize_data(v, dict_type) for k, v in attr.items()} + + def deserialize_object(self, attr, **kwargs): + """Deserialize a generic object. + This will be handled as a dictionary. + + :param dict attr: Dictionary to be deserialized. + :rtype: dict + :raises: TypeError if non-builtin datatype encountered. + """ + if attr is None: + return None + if isinstance(attr, ET.Element): + # Do no recurse on XML, just return the tree as-is + return attr + if isinstance(attr, str): + return self.deserialize_basic(attr, "str") + obj_type = type(attr) + if obj_type in self.basic_types: + return self.deserialize_basic(attr, self.basic_types[obj_type]) + if obj_type is _long_type: + return self.deserialize_long(attr) + + if obj_type == dict: + deserialized = {} + for key, value in attr.items(): + try: + deserialized[key] = self.deserialize_object(value, **kwargs) + except ValueError: + deserialized[key] = None + return deserialized + + if obj_type == list: + deserialized = [] + for obj in attr: + try: + deserialized.append(self.deserialize_object(obj, **kwargs)) + except ValueError: + pass + return deserialized + + else: + error = "Cannot deserialize generic object with type: " + raise TypeError(error + str(obj_type)) + + def deserialize_basic(self, attr, data_type): + """Deserialize basic builtin data type from string. + Will attempt to convert to str, int, float and bool. + This function will also accept '1', '0', 'true' and 'false' as + valid bool values. + + :param str attr: response string to be deserialized. + :param str data_type: deserialization data type. + :rtype: str, int, float or bool + :raises: TypeError if string format is not valid. + """ + # If we're here, data is supposed to be a basic type. 
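The boolean branch of `deserialize_basic` above accepts more than Python's `bool`: real booleans, the ints `0`/`1`, and the strings `'true'`/`'false'`/`'1'`/`'0'` in any case. Extracted as a stand-alone sketch:

```python
def parse_bool(attr):
    """Coercion rules mirroring the 'bool' branch of deserialize_basic."""
    if attr in [True, False, 1, 0]:
        return bool(attr)
    if isinstance(attr, str):
        if attr.lower() in ("true", "1"):
            return True
        if attr.lower() in ("false", "0"):
            return False
    raise TypeError("Invalid boolean value: {}".format(attr))

parse_bool("True")   # True
parse_bool("0")      # False
parse_bool(1)        # True
```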
+ # If it's still an XML node, take the text + if isinstance(attr, ET.Element): + attr = attr.text + if not attr: + if data_type == "str": + # None or '', node is empty string. + return "" + else: + # None or '', node with a strong type is None. + # Don't try to model "empty bool" or "empty int" + return None + + if data_type == "bool": + if attr in [True, False, 1, 0]: + return bool(attr) + elif isinstance(attr, str): + if attr.lower() in ["true", "1"]: + return True + elif attr.lower() in ["false", "0"]: + return False + raise TypeError("Invalid boolean value: {}".format(attr)) + + if data_type == "str": + return self.deserialize_unicode(attr) + return eval(data_type)(attr) # nosec + + @staticmethod + def deserialize_unicode(data): + """Preserve unicode objects in Python 2, otherwise return data + as a string. + + :param str data: response string to be deserialized. + :rtype: str or unicode + """ + # We might be here because we have an enum modeled as string, + # and we try to deserialize a partial dict with enum inside + if isinstance(data, Enum): + return data + + # Consider this is real string + try: + if isinstance(data, unicode): # type: ignore + return data + except NameError: + return str(data) + else: + return str(data) + + @staticmethod + def deserialize_enum(data, enum_obj): + """Deserialize string into enum object. + + If the string is not a valid enum value it will be returned as-is + and a warning will be logged. + + :param str data: Response string to be deserialized. If this value is + None or invalid it will be returned as-is. + :param Enum enum_obj: Enum object to deserialize to. + :rtype: Enum + """ + if isinstance(data, enum_obj) or data is None: + return data + if isinstance(data, Enum): + data = data.value + if isinstance(data, int): + # Workaround. We might consider remove it in the future. 
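`deserialize_enum` above is intentionally lenient: it tries an exact value match, then a case-insensitive scan of the members, and finally falls back to returning the raw string (with a warning) rather than failing on unknown service values. The core of that ladder, as a sketch with a hypothetical enum:

```python
from enum import Enum

class Color(Enum):  # hypothetical enum for illustration
    RED = "Red"

def lenient_enum(data, enum_obj):
    try:
        return enum_obj(str(data))          # exact value match
    except ValueError:
        for member in enum_obj:
            if member.value.lower() == str(data).lower():
                return member               # case-insensitive match
    return str(data)  # unknown values deserialize as plain strings

lenient_enum("red", Color)      # Color.RED
lenient_enum("magenta", Color)  # "magenta"
```

Keeping unknown values as strings means an SDK built against an older enum does not break when the service adds new members.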
+ try: + return list(enum_obj.__members__.values())[data] + except IndexError: + error = "{!r} is not a valid index for enum {!r}" + raise DeserializationError(error.format(data, enum_obj)) + try: + return enum_obj(str(data)) + except ValueError: + for enum_value in enum_obj: + if enum_value.value.lower() == str(data).lower(): + return enum_value + # We don't fail anymore for unknown value, we deserialize as a string + _LOGGER.warning("Deserializer is not able to find %s as valid enum in %s", data, enum_obj) + return Deserializer.deserialize_unicode(data) + + @staticmethod + def deserialize_bytearray(attr): + """Deserialize string into bytearray. + + :param str attr: response string to be deserialized. + :rtype: bytearray + :raises: TypeError if string format invalid. + """ + if isinstance(attr, ET.Element): + attr = attr.text + return bytearray(b64decode(attr)) # type: ignore + + @staticmethod + def deserialize_base64(attr): + """Deserialize base64 encoded string into string. + + :param str attr: response string to be deserialized. + :rtype: bytearray + :raises: TypeError if string format invalid. + """ + if isinstance(attr, ET.Element): + attr = attr.text + padding = "=" * (3 - (len(attr) + 3) % 4) # type: ignore + attr = attr + padding # type: ignore + encoded = attr.replace("-", "+").replace("_", "/") + return b64decode(encoded) + + @staticmethod + def deserialize_decimal(attr): + """Deserialize string into Decimal object. + + :param str attr: response string to be deserialized. + :rtype: Decimal + :raises: DeserializationError if string format invalid. + """ + if isinstance(attr, ET.Element): + attr = attr.text + try: + return decimal.Decimal(str(attr)) # type: ignore + except decimal.DecimalException as err: + msg = "Invalid decimal {}".format(attr) + raise DeserializationError(msg) from err + + @staticmethod + def deserialize_long(attr): + """Deserialize string into long (Py2) or int (Py3). + + :param str attr: response string to be deserialized. 
+ :rtype: long or int + :raises: ValueError if string format invalid. + """ + if isinstance(attr, ET.Element): + attr = attr.text + return _long_type(attr) # type: ignore + + @staticmethod + def deserialize_duration(attr): + """Deserialize ISO-8601 formatted string into TimeDelta object. + + :param str attr: response string to be deserialized. + :rtype: TimeDelta + :raises: DeserializationError if string format invalid. + """ + if isinstance(attr, ET.Element): + attr = attr.text + try: + duration = isodate.parse_duration(attr) + except (ValueError, OverflowError, AttributeError) as err: + msg = "Cannot deserialize duration object." + raise DeserializationError(msg) from err + else: + return duration + + @staticmethod + def deserialize_date(attr): + """Deserialize ISO-8601 formatted string into Date object. + + :param str attr: response string to be deserialized. + :rtype: Date + :raises: DeserializationError if string format invalid. + """ + if isinstance(attr, ET.Element): + attr = attr.text + if re.search(r"[^\W\d_]", attr, re.I + re.U): # type: ignore + raise DeserializationError("Date must have only digits and -. Received: %s" % attr) + # This must NOT use defaultmonth/defaultday. Using None ensure this raises an exception. + return isodate.parse_date(attr, defaultmonth=0, defaultday=0) + + @staticmethod + def deserialize_time(attr): + """Deserialize ISO-8601 formatted string into time object. + + :param str attr: response string to be deserialized. + :rtype: datetime.time + :raises: DeserializationError if string format invalid. + """ + if isinstance(attr, ET.Element): + attr = attr.text + if re.search(r"[^\W\d_]", attr, re.I + re.U): # type: ignore + raise DeserializationError("Date must have only digits and -. Received: %s" % attr) + return isodate.parse_time(attr) + + @staticmethod + def deserialize_rfc(attr): + """Deserialize RFC-1123 formatted string into Datetime object. + + :param str attr: response string to be deserialized. 
+ :rtype: Datetime + :raises: DeserializationError if string format invalid. + """ + if isinstance(attr, ET.Element): + attr = attr.text + try: + parsed_date = email.utils.parsedate_tz(attr) # type: ignore + date_obj = datetime.datetime( + *parsed_date[:6], tzinfo=_FixedOffset(datetime.timedelta(minutes=(parsed_date[9] or 0) / 60)) + ) + if not date_obj.tzinfo: + date_obj = date_obj.astimezone(tz=TZ_UTC) + except ValueError as err: + msg = "Cannot deserialize to rfc datetime object." + raise DeserializationError(msg) from err + else: + return date_obj + + @staticmethod + def deserialize_iso(attr): + """Deserialize ISO-8601 formatted string into Datetime object. + + :param str attr: response string to be deserialized. + :rtype: Datetime + :raises: DeserializationError if string format invalid. + """ + if isinstance(attr, ET.Element): + attr = attr.text + try: + attr = attr.upper() # type: ignore + match = Deserializer.valid_date.match(attr) + if not match: + raise ValueError("Invalid datetime string: " + attr) + + check_decimal = attr.split(".") + if len(check_decimal) > 1: + decimal_str = "" + for digit in check_decimal[1]: + if digit.isdigit(): + decimal_str += digit + else: + break + if len(decimal_str) > 6: + attr = attr.replace(decimal_str, decimal_str[0:6]) + + date_obj = isodate.parse_datetime(attr) + test_utc = date_obj.utctimetuple() + if test_utc.tm_year > 9999 or test_utc.tm_year < 1: + raise OverflowError("Hit max or min date") + except (ValueError, OverflowError, AttributeError) as err: + msg = "Cannot deserialize datetime object." + raise DeserializationError(msg) from err + else: + return date_obj + + @staticmethod + def deserialize_unix(attr): + """Serialize Datetime object into IntTime format. + This is represented as seconds. + + :param int attr: Object to be serialized. 
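The fractional-second preprocessing in `deserialize_iso` above exists because `datetime` caps precision at microseconds: services may emit more than six fractional digits, so the string is trimmed to six before parsing. The truncation step, isolated for illustration:

```python
def truncate_fraction(attr):
    """Trim an ISO-8601 timestamp's fractional part to 6 digits, as
    deserialize_iso does before handing the string to isodate."""
    check_decimal = attr.split(".")
    if len(check_decimal) > 1:
        decimal_str = ""
        for digit in check_decimal[1]:
            if digit.isdigit():
                decimal_str += digit
            else:
                break  # stop at the timezone suffix (e.g. 'Z')
        if len(decimal_str) > 6:
            attr = attr.replace(decimal_str, decimal_str[0:6])
    return attr

truncate_fraction("2024-06-01T12:00:00.1234567891Z")
# "2024-06-01T12:00:00.123456Z"
```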
+ :rtype: Datetime + :raises: DeserializationError if format invalid + """ + if isinstance(attr, ET.Element): + attr = int(attr.text) # type: ignore + try: + attr = int(attr) + date_obj = datetime.datetime.fromtimestamp(attr, TZ_UTC) + except ValueError as err: + msg = "Cannot deserialize to unix datetime object." + raise DeserializationError(msg) from err + else: + return date_obj diff --git a/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_validation.py b/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_validation.py new file mode 100644 index 000000000000..752b2822f9d3 --- /dev/null +++ b/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_validation.py @@ -0,0 +1,50 @@ +# -------------------------------------------------------------------------- +# Copyright (c) Microsoft Corporation. All rights reserved. +# Licensed under the MIT License. See License.txt in the project root for license information. +# Code generated by Microsoft (R) Python Code Generator. +# Changes may cause incorrect behavior and will be lost if the code is regenerated. +# -------------------------------------------------------------------------- +import functools + + +def api_version_validation(**kwargs): + params_added_on = kwargs.pop("params_added_on", {}) + method_added_on = kwargs.pop("method_added_on", "") + + def decorator(func): + @functools.wraps(func) + def wrapper(*args, **kwargs): + try: + # this assumes the client has an _api_version attribute + client = args[0] + client_api_version = client._config.api_version # pylint: disable=protected-access + except AttributeError: + return func(*args, **kwargs) + + if method_added_on > client_api_version: + raise ValueError( + f"'{func.__name__}' is not available in API version " + f"{client_api_version}. Pass service API version {method_added_on} or newer to your client." 
+ ) + + unsupported = { + parameter: api_version + for api_version, parameters in params_added_on.items() + for parameter in parameters + if parameter in kwargs and api_version > client_api_version + } + if unsupported: + raise ValueError( + "".join( + [ + f"'{param}' is not available in API version {client_api_version}. " + f"Use service API version {version} or newer.\n" + for param, version in unsupported.items() + ] + ) + ) + return func(*args, **kwargs) + + return wrapper + + return decorator diff --git a/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_vendor.py b/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_vendor.py new file mode 100644 index 000000000000..e76c249bb07a --- /dev/null +++ b/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_vendor.py @@ -0,0 +1,35 @@ +# -------------------------------------------------------------------------- +# Copyright (c) Microsoft Corporation. All rights reserved. +# Licensed under the MIT License. See License.txt in the project root for license information. +# Code generated by Microsoft (R) Python Code Generator. +# Changes may cause incorrect behavior and will be lost if the code is regenerated. +# -------------------------------------------------------------------------- + +from abc import ABC +from typing import TYPE_CHECKING + +from ._configuration import EventGridConsumerClientConfiguration, EventGridPublisherClientConfiguration + +if TYPE_CHECKING: + # pylint: disable=unused-import,ungrouped-imports + from azure.core import PipelineClient + + from ._serialization import Deserializer, Serializer + + +class EventGridPublisherClientMixinABC(ABC): + """DO NOT use this class. It is for internal typing use only.""" + + _client: "PipelineClient" + _config: EventGridPublisherClientConfiguration + _serialize: "Serializer" + _deserialize: "Deserializer" + + +class EventGridConsumerClientMixinABC(ABC): + """DO NOT use this class. 
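The `api_version_validation` decorator introduced in `_validation.py` above gates a client method on the negotiated API version using plain string comparison, which works because Azure's date-stamped versions (e.g. `2024-06-01`) sort lexicographically. A condensed usage sketch with hypothetical client/config classes (the real decorator also checks per-parameter `params_added_on`):

```python
import functools

def api_version_validation(*, method_added_on=""):
    def decorator(func):
        @functools.wraps(func)
        def wrapper(self, *args, **kwargs):
            # Date-stamped versions sort lexicographically, so a plain
            # string comparison is enough here.
            if method_added_on > self._config.api_version:
                raise ValueError(
                    f"'{func.__name__}' is not available in API version "
                    f"{self._config.api_version}."
                )
            return func(self, *args, **kwargs)
        return wrapper
    return decorator

class FakeConfig:          # stand-in for the generated configuration
    api_version = "2023-10-01-preview"

class FakeClient:          # stand-in for a generated client
    _config = FakeConfig()

    @api_version_validation(method_added_on="2024-06-01")
    def renew_locks(self):
        return "renewed"
```

Calling `renew_locks()` on a client pinned to `2023-10-01-preview` raises `ValueError`; the same call succeeds once the client's `api_version` is `2024-06-01` or newer.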
It is for internal typing use only.""" + + _client: "PipelineClient" + _config: EventGridConsumerClientConfiguration + _serialize: "Serializer" + _deserialize: "Deserializer" diff --git a/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_version.py b/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_version.py index 5834df1ac978..c615ce560cb1 100644 --- a/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_version.py +++ b/sdk/eventgrid/azure-eventgrid/azure/eventgrid/_version.py @@ -1,12 +1,9 @@ # coding=utf-8 # -------------------------------------------------------------------------- # Copyright (c) Microsoft Corporation. All rights reserved. -# Licensed under the MIT License. See License.txt in the project root for -# license information. -# -# Code generated by Microsoft (R) AutoRest Code Generator. -# Changes may cause incorrect behavior and will be lost if the code is -# regenerated. +# Licensed under the MIT License. See License.txt in the project root for license information. +# Code generated by Microsoft (R) Python Code Generator. +# Changes may cause incorrect behavior and will be lost if the code is regenerated. # -------------------------------------------------------------------------- -VERSION = "4.19.1" +VERSION = "4.20.0" diff --git a/sdk/eventgrid/azure-eventgrid/azure/eventgrid/aio/__init__.py b/sdk/eventgrid/azure-eventgrid/azure/eventgrid/aio/__init__.py index 0d2dce7aaea2..bcf16d1a1ec6 100644 --- a/sdk/eventgrid/azure-eventgrid/azure/eventgrid/aio/__init__.py +++ b/sdk/eventgrid/azure-eventgrid/azure/eventgrid/aio/__init__.py @@ -1,9 +1,25 @@ # coding=utf-8 -# ------------------------------------ -# Copyright (c) Microsoft Corporation. -# Licensed under the MIT License. -# ------------------------------------ +# -------------------------------------------------------------------------- +# Copyright (c) Microsoft Corporation. All rights reserved. +# Licensed under the MIT License. See License.txt in the project root for license information. 
+# Code generated by Microsoft (R) Python Code Generator. +# Changes may cause incorrect behavior and will be lost if the code is regenerated. +# -------------------------------------------------------------------------- -from ._publisher_client_async import EventGridPublisherClient +from ._patch import EventGridPublisherClient +from ._patch import EventGridConsumerClient -__all__ = ["EventGridPublisherClient"] +try: + from ._patch import __all__ as _patch_all + from ._patch import * # pylint: disable=unused-wildcard-import +except ImportError: + _patch_all = [] +from ._patch import patch_sdk as _patch_sdk + +__all__ = [ + "EventGridPublisherClient", + "EventGridConsumerClient", +] +__all__.extend([p for p in _patch_all if p not in __all__]) + +_patch_sdk() diff --git a/sdk/eventgrid/azure-eventgrid/azure/eventgrid/aio/_client.py b/sdk/eventgrid/azure-eventgrid/azure/eventgrid/aio/_client.py new file mode 100644 index 000000000000..07db6072d782 --- /dev/null +++ b/sdk/eventgrid/azure-eventgrid/azure/eventgrid/aio/_client.py @@ -0,0 +1,191 @@ +# coding=utf-8 +# -------------------------------------------------------------------------- +# Copyright (c) Microsoft Corporation. All rights reserved. +# Licensed under the MIT License. See License.txt in the project root for license information. +# Code generated by Microsoft (R) Python Code Generator. +# Changes may cause incorrect behavior and will be lost if the code is regenerated. 
+# -------------------------------------------------------------------------- + +from copy import deepcopy +from typing import Any, Awaitable, TYPE_CHECKING, Union + +from azure.core import AsyncPipelineClient +from azure.core.credentials import AzureKeyCredential +from azure.core.pipeline import policies +from azure.core.rest import AsyncHttpResponse, HttpRequest + +from .._serialization import Deserializer, Serializer +from ._configuration import EventGridConsumerClientConfiguration, EventGridPublisherClientConfiguration +from ._operations import EventGridConsumerClientOperationsMixin, EventGridPublisherClientOperationsMixin + +if TYPE_CHECKING: + # pylint: disable=unused-import,ungrouped-imports + from azure.core.credentials_async import AsyncTokenCredential + + +class EventGridPublisherClient( + EventGridPublisherClientOperationsMixin +): # pylint: disable=client-accepts-api-version-keyword + """EventGridPublisherClient. + + :param endpoint: The host name of the namespace, e.g. + namespaceName1.westus-1.eventgrid.azure.net. Required. + :type endpoint: str + :param credential: Credential used to authenticate requests to the service. Is either a + AzureKeyCredential type or a TokenCredential type. Required. + :type credential: ~azure.core.credentials.AzureKeyCredential or + ~azure.core.credentials_async.AsyncTokenCredential + :keyword api_version: The API version to use for this operation. Default value is "2024-06-01". + Note that overriding this default value may result in unsupported behavior. 
+ :paramtype api_version: str + """ + + def __init__( + self, endpoint: str, credential: Union[AzureKeyCredential, "AsyncTokenCredential"], **kwargs: Any + ) -> None: + _endpoint = "{endpoint}" + self._config = EventGridPublisherClientConfiguration(endpoint=endpoint, credential=credential, **kwargs) + _policies = kwargs.pop("policies", None) + if _policies is None: + _policies = [ + policies.RequestIdPolicy(**kwargs), + self._config.headers_policy, + self._config.user_agent_policy, + self._config.proxy_policy, + policies.ContentDecodePolicy(**kwargs), + self._config.redirect_policy, + self._config.retry_policy, + self._config.authentication_policy, + self._config.custom_hook_policy, + self._config.logging_policy, + policies.DistributedTracingPolicy(**kwargs), + policies.SensitiveHeaderCleanupPolicy(**kwargs) if self._config.redirect_policy else None, + self._config.http_logging_policy, + ] + self._client: AsyncPipelineClient = AsyncPipelineClient(base_url=_endpoint, policies=_policies, **kwargs) + + self._serialize = Serializer() + self._deserialize = Deserializer() + self._serialize.client_side_validation = False + + def send_request( + self, request: HttpRequest, *, stream: bool = False, **kwargs: Any + ) -> Awaitable[AsyncHttpResponse]: + """Runs the network request through the client's chained policies. + + >>> from azure.core.rest import HttpRequest + >>> request = HttpRequest("GET", "https://www.example.org/") + + >>> response = await client.send_request(request) + + + For more information on this code flow, see https://aka.ms/azsdk/dpcodegen/python/send_request + + :param request: The network request you want to make. Required. + :type request: ~azure.core.rest.HttpRequest + :keyword bool stream: Whether the response payload will be streamed. Defaults to False. + :return: The response of your network call. Does not do error handling on your response. 
+ :rtype: ~azure.core.rest.AsyncHttpResponse + """ + + request_copy = deepcopy(request) + path_format_arguments = { + "endpoint": self._serialize.url("self._config.endpoint", self._config.endpoint, "str", skip_quote=True), + } + + request_copy.url = self._client.format_url(request_copy.url, **path_format_arguments) + return self._client.send_request(request_copy, stream=stream, **kwargs) # type: ignore + + async def close(self) -> None: + await self._client.close() + + async def __aenter__(self) -> "EventGridPublisherClient": + await self._client.__aenter__() + return self + + async def __aexit__(self, *exc_details: Any) -> None: + await self._client.__aexit__(*exc_details) + + +class EventGridConsumerClient( + EventGridConsumerClientOperationsMixin +): # pylint: disable=client-accepts-api-version-keyword + """EventGridConsumerClient. + + :param endpoint: The host name of the namespace, e.g. + namespaceName1.westus-1.eventgrid.azure.net. Required. + :type endpoint: str + :param credential: Credential used to authenticate requests to the service. Is either an + AzureKeyCredential type or an AsyncTokenCredential type. Required. + :type credential: ~azure.core.credentials.AzureKeyCredential or + ~azure.core.credentials_async.AsyncTokenCredential + :keyword api_version: The API version to use for this operation. Default value is "2024-06-01". + Note that overriding this default value may result in unsupported behavior. 
+ :paramtype api_version: str + """ + + def __init__( + self, endpoint: str, credential: Union[AzureKeyCredential, "AsyncTokenCredential"], **kwargs: Any + ) -> None: + _endpoint = "{endpoint}" + self._config = EventGridConsumerClientConfiguration(endpoint=endpoint, credential=credential, **kwargs) + _policies = kwargs.pop("policies", None) + if _policies is None: + _policies = [ + policies.RequestIdPolicy(**kwargs), + self._config.headers_policy, + self._config.user_agent_policy, + self._config.proxy_policy, + policies.ContentDecodePolicy(**kwargs), + self._config.redirect_policy, + self._config.retry_policy, + self._config.authentication_policy, + self._config.custom_hook_policy, + self._config.logging_policy, + policies.DistributedTracingPolicy(**kwargs), + policies.SensitiveHeaderCleanupPolicy(**kwargs) if self._config.redirect_policy else None, + self._config.http_logging_policy, + ] + self._client: AsyncPipelineClient = AsyncPipelineClient(base_url=_endpoint, policies=_policies, **kwargs) + + self._serialize = Serializer() + self._deserialize = Deserializer() + self._serialize.client_side_validation = False + + def send_request( + self, request: HttpRequest, *, stream: bool = False, **kwargs: Any + ) -> Awaitable[AsyncHttpResponse]: + """Runs the network request through the client's chained policies. + + >>> from azure.core.rest import HttpRequest + >>> request = HttpRequest("GET", "https://www.example.org/") + + >>> response = await client.send_request(request) + + + For more information on this code flow, see https://aka.ms/azsdk/dpcodegen/python/send_request + + :param request: The network request you want to make. Required. + :type request: ~azure.core.rest.HttpRequest + :keyword bool stream: Whether the response payload will be streamed. Defaults to False. + :return: The response of your network call. Does not do error handling on your response. 
+ :rtype: ~azure.core.rest.AsyncHttpResponse + """ + + request_copy = deepcopy(request) + path_format_arguments = { + "endpoint": self._serialize.url("self._config.endpoint", self._config.endpoint, "str", skip_quote=True), + } + + request_copy.url = self._client.format_url(request_copy.url, **path_format_arguments) + return self._client.send_request(request_copy, stream=stream, **kwargs) # type: ignore + + async def close(self) -> None: + await self._client.close() + + async def __aenter__(self) -> "EventGridConsumerClient": + await self._client.__aenter__() + return self + + async def __aexit__(self, *exc_details: Any) -> None: + await self._client.__aexit__(*exc_details) diff --git a/sdk/eventgrid/azure-eventgrid/azure/eventgrid/aio/_configuration.py b/sdk/eventgrid/azure-eventgrid/azure/eventgrid/aio/_configuration.py new file mode 100644 index 000000000000..db6bd48f5aa3 --- /dev/null +++ b/sdk/eventgrid/azure-eventgrid/azure/eventgrid/aio/_configuration.py @@ -0,0 +1,136 @@ +# coding=utf-8 +# -------------------------------------------------------------------------- +# Copyright (c) Microsoft Corporation. All rights reserved. +# Licensed under the MIT License. See License.txt in the project root for license information. +# Code generated by Microsoft (R) Python Code Generator. +# Changes may cause incorrect behavior and will be lost if the code is regenerated. +# -------------------------------------------------------------------------- + +from typing import Any, TYPE_CHECKING, Union + +from azure.core.credentials import AzureKeyCredential +from azure.core.pipeline import policies + +from .._version import VERSION + +if TYPE_CHECKING: + # pylint: disable=unused-import,ungrouped-imports + from azure.core.credentials_async import AsyncTokenCredential + + +class EventGridPublisherClientConfiguration: # pylint: disable=too-many-instance-attributes,name-too-long + """Configuration for EventGridPublisherClient. 
+ + Note that all parameters used to create this instance are saved as instance + attributes. + + :param endpoint: The host name of the namespace, e.g. + namespaceName1.westus-1.eventgrid.azure.net. Required. + :type endpoint: str + :param credential: Credential used to authenticate requests to the service. Is either an + AzureKeyCredential type or an AsyncTokenCredential type. Required. + :type credential: ~azure.core.credentials.AzureKeyCredential or + ~azure.core.credentials_async.AsyncTokenCredential + :keyword api_version: The API version to use for this operation. Default value is "2024-06-01". + Note that overriding this default value may result in unsupported behavior. + :paramtype api_version: str + """ + + def __init__( + self, endpoint: str, credential: Union[AzureKeyCredential, "AsyncTokenCredential"], **kwargs: Any + ) -> None: + api_version: str = kwargs.pop("api_version", "2024-06-01") + + if endpoint is None: + raise ValueError("Parameter 'endpoint' must not be None.") + if credential is None: + raise ValueError("Parameter 'credential' must not be None.") + + self.endpoint = endpoint + self.credential = credential + self.api_version = api_version + self.credential_scopes = kwargs.pop("credential_scopes", ["https://eventgrid.azure.net/.default"]) + kwargs.setdefault("sdk_moniker", "eventgrid/{}".format(VERSION)) + self.polling_interval = kwargs.get("polling_interval", 30) + self._configure(**kwargs) + + def _infer_policy(self, **kwargs): + if isinstance(self.credential, AzureKeyCredential): + return policies.AzureKeyCredentialPolicy( + self.credential, "Authorization", prefix="SharedAccessKey", **kwargs + ) + if hasattr(self.credential, "get_token"): + return policies.AsyncBearerTokenCredentialPolicy(self.credential, *self.credential_scopes, **kwargs) + raise TypeError(f"Unsupported credential: {self.credential}") + + def _configure(self, **kwargs: Any) -> None: + self.user_agent_policy = kwargs.get("user_agent_policy") or policies.UserAgentPolicy(**kwargs) 
+ self.headers_policy = kwargs.get("headers_policy") or policies.HeadersPolicy(**kwargs) + self.proxy_policy = kwargs.get("proxy_policy") or policies.ProxyPolicy(**kwargs) + self.logging_policy = kwargs.get("logging_policy") or policies.NetworkTraceLoggingPolicy(**kwargs) + self.http_logging_policy = kwargs.get("http_logging_policy") or policies.HttpLoggingPolicy(**kwargs) + self.custom_hook_policy = kwargs.get("custom_hook_policy") or policies.CustomHookPolicy(**kwargs) + self.redirect_policy = kwargs.get("redirect_policy") or policies.AsyncRedirectPolicy(**kwargs) + self.retry_policy = kwargs.get("retry_policy") or policies.AsyncRetryPolicy(**kwargs) + self.authentication_policy = kwargs.get("authentication_policy") + if self.credential and not self.authentication_policy: + self.authentication_policy = self._infer_policy(**kwargs) + + +class EventGridConsumerClientConfiguration: # pylint: disable=too-many-instance-attributes,name-too-long + """Configuration for EventGridConsumerClient. + + Note that all parameters used to create this instance are saved as instance + attributes. + + :param endpoint: The host name of the namespace, e.g. + namespaceName1.westus-1.eventgrid.azure.net. Required. + :type endpoint: str + :param credential: Credential used to authenticate requests to the service. Is either an + AzureKeyCredential type or an AsyncTokenCredential type. Required. + :type credential: ~azure.core.credentials.AzureKeyCredential or + ~azure.core.credentials_async.AsyncTokenCredential + :keyword api_version: The API version to use for this operation. Default value is "2024-06-01". + Note that overriding this default value may result in unsupported behavior. 
+ :paramtype api_version: str + """ + + def __init__( + self, endpoint: str, credential: Union[AzureKeyCredential, "AsyncTokenCredential"], **kwargs: Any + ) -> None: + api_version: str = kwargs.pop("api_version", "2024-06-01") + + if endpoint is None: + raise ValueError("Parameter 'endpoint' must not be None.") + if credential is None: + raise ValueError("Parameter 'credential' must not be None.") + + self.endpoint = endpoint + self.credential = credential + self.api_version = api_version + self.credential_scopes = kwargs.pop("credential_scopes", ["https://eventgrid.azure.net/.default"]) + kwargs.setdefault("sdk_moniker", "eventgrid/{}".format(VERSION)) + self.polling_interval = kwargs.get("polling_interval", 30) + self._configure(**kwargs) + + def _infer_policy(self, **kwargs): + if isinstance(self.credential, AzureKeyCredential): + return policies.AzureKeyCredentialPolicy( + self.credential, "Authorization", prefix="SharedAccessKey", **kwargs + ) + if hasattr(self.credential, "get_token"): + return policies.AsyncBearerTokenCredentialPolicy(self.credential, *self.credential_scopes, **kwargs) + raise TypeError(f"Unsupported credential: {self.credential}") + + def _configure(self, **kwargs: Any) -> None: + self.user_agent_policy = kwargs.get("user_agent_policy") or policies.UserAgentPolicy(**kwargs) + self.headers_policy = kwargs.get("headers_policy") or policies.HeadersPolicy(**kwargs) + self.proxy_policy = kwargs.get("proxy_policy") or policies.ProxyPolicy(**kwargs) + self.logging_policy = kwargs.get("logging_policy") or policies.NetworkTraceLoggingPolicy(**kwargs) + self.http_logging_policy = kwargs.get("http_logging_policy") or policies.HttpLoggingPolicy(**kwargs) + self.custom_hook_policy = kwargs.get("custom_hook_policy") or policies.CustomHookPolicy(**kwargs) + self.redirect_policy = kwargs.get("redirect_policy") or policies.AsyncRedirectPolicy(**kwargs) + self.retry_policy = kwargs.get("retry_policy") or policies.AsyncRetryPolicy(**kwargs) + 
self.authentication_policy = kwargs.get("authentication_policy") + if self.credential and not self.authentication_policy: + self.authentication_policy = self._infer_policy(**kwargs) diff --git a/sdk/eventgrid/azure-eventgrid/azure/eventgrid/aio/_operations/__init__.py b/sdk/eventgrid/azure-eventgrid/azure/eventgrid/aio/_operations/__init__.py new file mode 100644 index 000000000000..c716622cb722 --- /dev/null +++ b/sdk/eventgrid/azure-eventgrid/azure/eventgrid/aio/_operations/__init__.py @@ -0,0 +1,21 @@ +# coding=utf-8 +# -------------------------------------------------------------------------- +# Copyright (c) Microsoft Corporation. All rights reserved. +# Licensed under the MIT License. See License.txt in the project root for license information. +# Code generated by Microsoft (R) Python Code Generator. +# Changes may cause incorrect behavior and will be lost if the code is regenerated. +# -------------------------------------------------------------------------- + +from ._patch import EventGridPublisherClientOperationsMixin +from ._patch import EventGridConsumerClientOperationsMixin + +from ._patch import __all__ as _patch_all +from ._patch import * # pylint: disable=unused-wildcard-import +from ._patch import patch_sdk as _patch_sdk + +__all__ = [ + "EventGridPublisherClientOperationsMixin", + "EventGridConsumerClientOperationsMixin", +] +__all__.extend([p for p in _patch_all if p not in __all__]) +_patch_sdk() diff --git a/sdk/eventgrid/azure-eventgrid/azure/eventgrid/aio/_operations/_operations.py b/sdk/eventgrid/azure-eventgrid/azure/eventgrid/aio/_operations/_operations.py new file mode 100644 index 000000000000..e10006e75b52 --- /dev/null +++ b/sdk/eventgrid/azure-eventgrid/azure/eventgrid/aio/_operations/_operations.py @@ -0,0 +1,1055 @@ +# pylint: disable=too-many-lines,too-many-statements +# coding=utf-8 +# -------------------------------------------------------------------------- +# Copyright (c) Microsoft Corporation. All rights reserved. 
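The `_infer_policy` method in both configuration classes above dispatches on the credential type: an `AzureKeyCredential` becomes a `SharedAccessKey`-prefixed `Authorization` header policy, while anything exposing `get_token` becomes an async bearer-token policy. A minimal standalone sketch of that dispatch, using hypothetical stand-in classes (`KeyCredential`, `KeyPolicy`, `BearerPolicy`) rather than the real azure-core types:

```python
# Sketch of the credential-type dispatch used by _infer_policy above.
# KeyCredential, KeyPolicy, and BearerPolicy are hypothetical stand-ins
# for azure.core's AzureKeyCredential and its pipeline policy classes.


class KeyCredential:
    """Stand-in for a shared-access-key credential."""

    def __init__(self, key: str) -> None:
        self.key = key


class KeyPolicy:
    """Stand-in policy: sends the key in a prefixed header."""

    def __init__(self, credential: KeyCredential, header_name: str, prefix: str) -> None:
        self.header_name = header_name
        self.header_value = f"{prefix} {credential.key}"


class BearerPolicy:
    """Stand-in policy: fetches OAuth tokens for the given scopes."""

    def __init__(self, credential, *scopes: str) -> None:
        self.credential = credential
        self.scopes = scopes


def infer_policy(credential, scopes=("https://eventgrid.azure.net/.default",)):
    # Key credentials map to a "SharedAccessKey <key>" Authorization header.
    if isinstance(credential, KeyCredential):
        return KeyPolicy(credential, "Authorization", prefix="SharedAccessKey")
    # Token credentials are detected structurally: anything with get_token
    # is treated as a bearer-token credential, mirroring the generated code.
    if hasattr(credential, "get_token"):
        return BearerPolicy(credential, *scopes)
    raise TypeError(f"Unsupported credential: {credential!r}")
```

Note the structural `hasattr(credential, "get_token")` check rather than an isinstance check against a token-credential base class; that is why the generated configuration accepts any object implementing the token protocol, not just azure-identity types.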
+# Licensed under the MIT License. See License.txt in the project root for license information. +# Code generated by Microsoft (R) Python Code Generator. +# Changes may cause incorrect behavior and will be lost if the code is regenerated. +# -------------------------------------------------------------------------- +from io import IOBase +import json +import sys +from typing import Any, Callable, Dict, IO, List, Optional, Type, TypeVar, Union, overload + +from azure.core.exceptions import ( + ClientAuthenticationError, + HttpResponseError, + ResourceExistsError, + ResourceNotFoundError, + ResourceNotModifiedError, + map_error, +) +from azure.core.pipeline import PipelineResponse +from azure.core.rest import AsyncHttpResponse, HttpRequest +from azure.core.tracing.decorator_async import distributed_trace_async +from azure.core.utils import case_insensitive_dict + +from ... import models as _models +from ..._model_base import SdkJSONEncoder, _deserialize +from ..._operations._operations import ( + build_event_grid_consumer_acknowledge_request, + build_event_grid_consumer_receive_request, + build_event_grid_consumer_reject_request, + build_event_grid_consumer_release_request, + build_event_grid_consumer_renew_locks_request, + build_event_grid_publisher_send_events_request, + build_event_grid_publisher_send_request, +) +from ..._validation import api_version_validation +from .._vendor import EventGridConsumerClientMixinABC, EventGridPublisherClientMixinABC + +if sys.version_info >= (3, 9): + from collections.abc import MutableMapping +else: + from typing import MutableMapping # type: ignore # pylint: disable=ungrouped-imports +T = TypeVar("T") +ClsType = Optional[Callable[[PipelineResponse[HttpRequest, AsyncHttpResponse], T, Dict[str, Any]], Any]] +JSON = MutableMapping[str, Any] # pylint: disable=unsubscriptable-object +_Unset: Any = object() + + +class EventGridPublisherClientOperationsMixin(EventGridPublisherClientMixinABC): + + @distributed_trace_async + async def 
_send( # pylint: disable=protected-access + self, topic_name: str, event: _models._models.CloudEvent, **kwargs: Any + ) -> _models._models.PublishResult: + # pylint: disable=line-too-long + """Publish a single Cloud Event to a namespace topic. + + :param topic_name: Topic Name. Required. + :type topic_name: str + :param event: Single Cloud Event being published. Required. + :type event: ~azure.eventgrid.models._models.CloudEvent + :return: PublishResult. The PublishResult is compatible with MutableMapping + :rtype: ~azure.eventgrid.models._models.PublishResult + :raises ~azure.core.exceptions.HttpResponseError: + + Example: + .. code-block:: python + + # JSON input template you can fill out and use as your body input. + event = { + "id": "str", # An identifier for the event. The combination of id and source + must be unique for each distinct event. Required. + "source": "str", # Identifies the context in which an event happened. The + combination of id and source must be unique for each distinct event. Required. + "specversion": "str", # The version of the CloudEvents specification which + the event uses. Required. + "type": "str", # Type of event related to the originating occurrence. + Required. + "data": {}, # Optional. Event data specific to the event type. + "data_base64": bytes("bytes", encoding="utf-8"), # Optional. Event data + specific to the event type, encoded as a base64 string. + "datacontenttype": "str", # Optional. Content type of data value. + "dataschema": "str", # Optional. Identifies the schema that data adheres to. + "subject": "str", # Optional. This describes the subject of the event in the + context of the event producer (identified by source). + "time": "2020-02-20 00:00:00" # Optional. The time (in UTC) the event was + generated, in RFC3339 format. 
+ } + """ + error_map: MutableMapping[int, Type[HttpResponseError]] = { + 401: ClientAuthenticationError, + 404: ResourceNotFoundError, + 409: ResourceExistsError, + 304: ResourceNotModifiedError, + } + error_map.update(kwargs.pop("error_map", {}) or {}) + + _headers = case_insensitive_dict(kwargs.pop("headers", {}) or {}) + _params = kwargs.pop("params", {}) or {} + + content_type: str = kwargs.pop( + "content_type", _headers.pop("content-type", "application/cloudevents+json; charset=utf-8") + ) + cls: ClsType[_models._models.PublishResult] = kwargs.pop("cls", None) # pylint: disable=protected-access + + _content = json.dumps(event, cls=SdkJSONEncoder, exclude_readonly=True) # type: ignore + + _request = build_event_grid_publisher_send_request( + topic_name=topic_name, + content_type=content_type, + api_version=self._config.api_version, + content=_content, + headers=_headers, + params=_params, + ) + path_format_arguments = { + "endpoint": self._serialize.url("self._config.endpoint", self._config.endpoint, "str", skip_quote=True), + } + _request.url = self._client.format_url(_request.url, **path_format_arguments) + + _stream = kwargs.pop("stream", False) + pipeline_response: PipelineResponse = await self._client._pipeline.run( # type: ignore # pylint: disable=protected-access + _request, stream=_stream, **kwargs + ) + + response = pipeline_response.http_response + + if response.status_code not in [200]: + if _stream: + await response.read() # Load the body in memory and close the socket + map_error(status_code=response.status_code, response=response, error_map=error_map) + raise HttpResponseError(response=response) + + if _stream: + deserialized = response.iter_bytes() + else: + deserialized = _deserialize( + _models._models.PublishResult, response.json() # pylint: disable=protected-access + ) + + if cls: + return cls(pipeline_response, deserialized, {}) # type: ignore + + return deserialized # type: ignore + + @distributed_trace_async + async def _send_events( # 
pylint: disable=protected-access + self, topic_name: str, events: List[_models._models.CloudEvent], **kwargs: Any + ) -> _models._models.PublishResult: + # pylint: disable=line-too-long + """Publish a batch of Cloud Events to a namespace topic. + + :param topic_name: Topic Name. Required. + :type topic_name: str + :param events: Array of Cloud Events being published. Required. + :type events: list[~azure.eventgrid.models._models.CloudEvent] + :return: PublishResult. The PublishResult is compatible with MutableMapping + :rtype: ~azure.eventgrid.models._models.PublishResult + :raises ~azure.core.exceptions.HttpResponseError: + + Example: + .. code-block:: python + + # JSON input template you can fill out and use as your body input. + events = [ + { + "id": "str", # An identifier for the event. The combination of id + and source must be unique for each distinct event. Required. + "source": "str", # Identifies the context in which an event + happened. The combination of id and source must be unique for each distinct + event. Required. + "specversion": "str", # The version of the CloudEvents specification + which the event uses. Required. + "type": "str", # Type of event related to the originating + occurrence. Required. + "data": {}, # Optional. Event data specific to the event type. + "data_base64": bytes("bytes", encoding="utf-8"), # Optional. Event + data specific to the event type, encoded as a base64 string. + "datacontenttype": "str", # Optional. Content type of data value. + "dataschema": "str", # Optional. Identifies the schema that data + adheres to. + "subject": "str", # Optional. This describes the subject of the + event in the context of the event producer (identified by source). + "time": "2020-02-20 00:00:00" # Optional. The time (in UTC) the + event was generated, in RFC3339 format. 
+ } + ] + """ + error_map: MutableMapping[int, Type[HttpResponseError]] = { + 401: ClientAuthenticationError, + 404: ResourceNotFoundError, + 409: ResourceExistsError, + 304: ResourceNotModifiedError, + } + error_map.update(kwargs.pop("error_map", {}) or {}) + + _headers = case_insensitive_dict(kwargs.pop("headers", {}) or {}) + _params = kwargs.pop("params", {}) or {} + + content_type: str = kwargs.pop( + "content_type", _headers.pop("content-type", "application/cloudevents-batch+json; charset=utf-8") + ) + cls: ClsType[_models._models.PublishResult] = kwargs.pop("cls", None) # pylint: disable=protected-access + + _content = json.dumps(events, cls=SdkJSONEncoder, exclude_readonly=True) # type: ignore + + _request = build_event_grid_publisher_send_events_request( + topic_name=topic_name, + content_type=content_type, + api_version=self._config.api_version, + content=_content, + headers=_headers, + params=_params, + ) + path_format_arguments = { + "endpoint": self._serialize.url("self._config.endpoint", self._config.endpoint, "str", skip_quote=True), + } + _request.url = self._client.format_url(_request.url, **path_format_arguments) + + _stream = kwargs.pop("stream", False) + pipeline_response: PipelineResponse = await self._client._pipeline.run( # type: ignore # pylint: disable=protected-access + _request, stream=_stream, **kwargs + ) + + response = pipeline_response.http_response + + if response.status_code not in [200]: + if _stream: + await response.read() # Load the body in memory and close the socket + map_error(status_code=response.status_code, response=response, error_map=error_map) + raise HttpResponseError(response=response) + + if _stream: + deserialized = response.iter_bytes() + else: + deserialized = _deserialize( + _models._models.PublishResult, response.json() # pylint: disable=protected-access + ) + + if cls: + return cls(pipeline_response, deserialized, {}) # type: ignore + + return deserialized # type: ignore + + +class 
EventGridConsumerClientOperationsMixin(EventGridConsumerClientMixinABC): + + @distributed_trace_async + async def _receive( # pylint: disable=protected-access + self, + topic_name: str, + event_subscription_name: str, + *, + max_events: Optional[int] = None, + max_wait_time: Optional[int] = None, + **kwargs: Any + ) -> _models._models.ReceiveResult: + # pylint: disable=line-too-long + """Receive a batch of Cloud Events from a subscription. + + :param topic_name: Topic Name. Required. + :type topic_name: str + :param event_subscription_name: Event Subscription Name. Required. + :type event_subscription_name: str + :keyword max_events: Max Events count to be received. Minimum value is 1, while maximum value + is 100 events. If not specified, the default value is 1. Default value is None. + :paramtype max_events: int + :keyword max_wait_time: Max wait time value for receive operation in Seconds. It is the time in + seconds that the server approximately waits for the availability of an event and responds to + the request. If an event is available, the broker responds immediately to the client. Minimum + value is 10 seconds, while maximum value is 120 seconds. If not specified, the default value is + 60 seconds. Default value is None. + :paramtype max_wait_time: int + :return: ReceiveResult. The ReceiveResult is compatible with MutableMapping + :rtype: ~azure.eventgrid.models._models.ReceiveResult + :raises ~azure.core.exceptions.HttpResponseError: + + Example: + .. code-block:: python + + # response body for status code(s): 200 + response == { + "value": [ + { + "brokerProperties": { + "deliveryCount": 0, # The attempt count for + delivering the event. Required. + "lockToken": "str" # The token of the lock on the + event. Required. + }, + "event": { + "id": "str", # An identifier for the event. The + combination of id and source must be unique for each distinct event. + Required. + "source": "str", # Identifies the context in which + an event happened. 
The combination of id and source must be unique + for each distinct event. Required. + "specversion": "str", # The version of the + CloudEvents specification which the event uses. Required. + "type": "str", # Type of event related to the + originating occurrence. Required. + "data": {}, # Optional. Event data specific to the + event type. + "data_base64": bytes("bytes", encoding="utf-8"), # + Optional. Event data specific to the event type, encoded as a base64 + string. + "datacontenttype": "str", # Optional. Content type + of data value. + "dataschema": "str", # Optional. Identifies the + schema that data adheres to. + "subject": "str", # Optional. This describes the + subject of the event in the context of the event producer (identified + by source). + "time": "2020-02-20 00:00:00" # Optional. The time + (in UTC) the event was generated, in RFC3339 format. + } + } + ] + } + """ + error_map: MutableMapping[int, Type[HttpResponseError]] = { + 401: ClientAuthenticationError, + 404: ResourceNotFoundError, + 409: ResourceExistsError, + 304: ResourceNotModifiedError, + } + error_map.update(kwargs.pop("error_map", {}) or {}) + + _headers = kwargs.pop("headers", {}) or {} + _params = kwargs.pop("params", {}) or {} + + cls: ClsType[_models._models.ReceiveResult] = kwargs.pop("cls", None) # pylint: disable=protected-access + + _request = build_event_grid_consumer_receive_request( + topic_name=topic_name, + event_subscription_name=event_subscription_name, + max_events=max_events, + max_wait_time=max_wait_time, + api_version=self._config.api_version, + headers=_headers, + params=_params, + ) + path_format_arguments = { + "endpoint": self._serialize.url("self._config.endpoint", self._config.endpoint, "str", skip_quote=True), + } + _request.url = self._client.format_url(_request.url, **path_format_arguments) + + _stream = kwargs.pop("stream", False) + pipeline_response: PipelineResponse = await self._client._pipeline.run( # type: ignore # pylint: disable=protected-access + 
_request, stream=_stream, **kwargs + ) + + response = pipeline_response.http_response + + if response.status_code not in [200]: + if _stream: + await response.read() # Load the body in memory and close the socket + map_error(status_code=response.status_code, response=response, error_map=error_map) + raise HttpResponseError(response=response) + + if _stream: + deserialized = response.iter_bytes() + else: + deserialized = _deserialize( + _models._models.ReceiveResult, response.json() # pylint: disable=protected-access + ) + + if cls: + return cls(pipeline_response, deserialized, {}) # type: ignore + + return deserialized # type: ignore + + @overload + async def _acknowledge( + self, + topic_name: str, + event_subscription_name: str, + body: JSON, + *, + content_type: str = "application/json", + **kwargs: Any + ) -> _models.AcknowledgeResult: ... + @overload + async def _acknowledge( + self, + topic_name: str, + event_subscription_name: str, + *, + lock_tokens: List[str], + content_type: str = "application/json", + **kwargs: Any + ) -> _models.AcknowledgeResult: ... + @overload + async def _acknowledge( + self, + topic_name: str, + event_subscription_name: str, + body: IO[bytes], + *, + content_type: str = "application/json", + **kwargs: Any + ) -> _models.AcknowledgeResult: ... + + @distributed_trace_async + async def _acknowledge( + self, + topic_name: str, + event_subscription_name: str, + body: Union[JSON, IO[bytes]] = _Unset, + *, + lock_tokens: List[str] = _Unset, + **kwargs: Any + ) -> _models.AcknowledgeResult: + """Acknowledge a batch of Cloud Events. The response will include the set of successfully + acknowledged lock tokens, along with other failed lock tokens with their corresponding error + information. Successfully acknowledged events will no longer be available to be received by any + consumer. + + :param topic_name: Topic Name. Required. + :type topic_name: str + :param event_subscription_name: Event Subscription Name. Required. 
+ :type event_subscription_name: str + :param body: Is either a JSON type or a IO[bytes] type. Required. + :type body: JSON or IO[bytes] + :keyword lock_tokens: Array of lock tokens. Required. + :paramtype lock_tokens: list[str] + :return: AcknowledgeResult. The AcknowledgeResult is compatible with MutableMapping + :rtype: ~azure.eventgrid.models.AcknowledgeResult + :raises ~azure.core.exceptions.HttpResponseError: + + Example: + .. code-block:: python + + # JSON input template you can fill out and use as your body input. + body = { + "lockTokens": [ + "str" # Array of lock tokens. Required. + ] + } + + # response body for status code(s): 200 + response == { + "failedLockTokens": [ + { + "error": { + "code": "str", # One of a server-defined set of + error codes. Required. + "message": "str", # A human-readable representation + of the error. Required. + "details": [ + ... + ], + "innererror": { + "code": "str", # Optional. One of a + server-defined set of error codes. + "innererror": ... + }, + "target": "str" # Optional. The target of the error. + }, + "lockToken": "str" # The lock token of an entry in the + request. Required. + } + ], + "succeededLockTokens": [ + "str" # Array of lock tokens for the successfully acknowledged cloud + events. Required. 
+ ] + } + """ + error_map: MutableMapping[int, Type[HttpResponseError]] = { + 401: ClientAuthenticationError, + 404: ResourceNotFoundError, + 409: ResourceExistsError, + 304: ResourceNotModifiedError, + } + error_map.update(kwargs.pop("error_map", {}) or {}) + + _headers = case_insensitive_dict(kwargs.pop("headers", {}) or {}) + _params = kwargs.pop("params", {}) or {} + + content_type: Optional[str] = kwargs.pop("content_type", _headers.pop("Content-Type", None)) + cls: ClsType[_models.AcknowledgeResult] = kwargs.pop("cls", None) + + if body is _Unset: + if lock_tokens is _Unset: + raise TypeError("missing required argument: lock_tokens") + body = {"lockTokens": lock_tokens} + body = {k: v for k, v in body.items() if v is not None} + content_type = content_type or "application/json" + _content = None + if isinstance(body, (IOBase, bytes)): + _content = body + else: + _content = json.dumps(body, cls=SdkJSONEncoder, exclude_readonly=True) # type: ignore + + _request = build_event_grid_consumer_acknowledge_request( + topic_name=topic_name, + event_subscription_name=event_subscription_name, + content_type=content_type, + api_version=self._config.api_version, + content=_content, + headers=_headers, + params=_params, + ) + path_format_arguments = { + "endpoint": self._serialize.url("self._config.endpoint", self._config.endpoint, "str", skip_quote=True), + } + _request.url = self._client.format_url(_request.url, **path_format_arguments) + + _stream = kwargs.pop("stream", False) + pipeline_response: PipelineResponse = await self._client._pipeline.run( # type: ignore # pylint: disable=protected-access + _request, stream=_stream, **kwargs + ) + + response = pipeline_response.http_response + + if response.status_code not in [200]: + if _stream: + await response.read() # Load the body in memory and close the socket + map_error(status_code=response.status_code, response=response, error_map=error_map) + raise HttpResponseError(response=response) + + if _stream: + deserialized = 
response.iter_bytes() + else: + deserialized = _deserialize(_models.AcknowledgeResult, response.json()) + + if cls: + return cls(pipeline_response, deserialized, {}) # type: ignore + + return deserialized # type: ignore + + @overload + @api_version_validation( + params_added_on={"2023-10-01-preview": ["release_delay_in_seconds"]}, + ) + async def _release( + self, + topic_name: str, + event_subscription_name: str, + body: JSON, + *, + release_delay_in_seconds: Optional[Union[str, _models.ReleaseDelay]] = None, + content_type: str = "application/json", + **kwargs: Any + ) -> _models.ReleaseResult: ... + @overload + @api_version_validation( + params_added_on={"2023-10-01-preview": ["release_delay_in_seconds"]}, + ) + async def _release( + self, + topic_name: str, + event_subscription_name: str, + *, + lock_tokens: List[str], + release_delay_in_seconds: Optional[Union[str, _models.ReleaseDelay]] = None, + content_type: str = "application/json", + **kwargs: Any + ) -> _models.ReleaseResult: ... + @overload + @api_version_validation( + params_added_on={"2023-10-01-preview": ["release_delay_in_seconds"]}, + ) + async def _release( + self, + topic_name: str, + event_subscription_name: str, + body: IO[bytes], + *, + release_delay_in_seconds: Optional[Union[str, _models.ReleaseDelay]] = None, + content_type: str = "application/json", + **kwargs: Any + ) -> _models.ReleaseResult: ... + + @distributed_trace_async + @api_version_validation( + params_added_on={"2023-10-01-preview": ["release_delay_in_seconds"]}, + ) + async def _release( + self, + topic_name: str, + event_subscription_name: str, + body: Union[JSON, IO[bytes]] = _Unset, + *, + lock_tokens: List[str] = _Unset, + release_delay_in_seconds: Optional[Union[str, _models.ReleaseDelay]] = None, + **kwargs: Any + ) -> _models.ReleaseResult: + """Release a batch of Cloud Events. 
The response will include the set of successfully released + lock tokens, along with other failed lock tokens with their corresponding error information. + Successfully released events can be received by consumers. + + :param topic_name: Topic Name. Required. + :type topic_name: str + :param event_subscription_name: Event Subscription Name. Required. + :type event_subscription_name: str + :param body: Is either a JSON type or a IO[bytes] type. Required. + :type body: JSON or IO[bytes] + :keyword lock_tokens: Array of lock tokens. Required. + :paramtype lock_tokens: list[str] + :keyword release_delay_in_seconds: Release cloud events with the specified delay in seconds. + Known values are: "0", "10", "60", "600", and "3600". Default value is None. + :paramtype release_delay_in_seconds: str or ~azure.eventgrid.models.ReleaseDelay + :return: ReleaseResult. The ReleaseResult is compatible with MutableMapping + :rtype: ~azure.eventgrid.models.ReleaseResult + :raises ~azure.core.exceptions.HttpResponseError: + + Example: + .. code-block:: python + + # JSON input template you can fill out and use as your body input. + body = { + "lockTokens": [ + "str" # Array of lock tokens. Required. + ] + } + + # response body for status code(s): 200 + response == { + "failedLockTokens": [ + { + "error": { + "code": "str", # One of a server-defined set of + error codes. Required. + "message": "str", # A human-readable representation + of the error. Required. + "details": [ + ... + ], + "innererror": { + "code": "str", # Optional. One of a + server-defined set of error codes. + "innererror": ... + }, + "target": "str" # Optional. The target of the error. + }, + "lockToken": "str" # The lock token of an entry in the + request. Required. + } + ], + "succeededLockTokens": [ + "str" # Array of lock tokens for the successfully released cloud + events. Required. 
+ ] + } + """ + error_map: MutableMapping[int, Type[HttpResponseError]] = { + 401: ClientAuthenticationError, + 404: ResourceNotFoundError, + 409: ResourceExistsError, + 304: ResourceNotModifiedError, + } + error_map.update(kwargs.pop("error_map", {}) or {}) + + _headers = case_insensitive_dict(kwargs.pop("headers", {}) or {}) + _params = kwargs.pop("params", {}) or {} + + content_type: Optional[str] = kwargs.pop("content_type", _headers.pop("Content-Type", None)) + cls: ClsType[_models.ReleaseResult] = kwargs.pop("cls", None) + + if body is _Unset: + if lock_tokens is _Unset: + raise TypeError("missing required argument: lock_tokens") + body = {"lockTokens": lock_tokens} + body = {k: v for k, v in body.items() if v is not None} + content_type = content_type or "application/json" + _content = None + if isinstance(body, (IOBase, bytes)): + _content = body + else: + _content = json.dumps(body, cls=SdkJSONEncoder, exclude_readonly=True) # type: ignore + + _request = build_event_grid_consumer_release_request( + topic_name=topic_name, + event_subscription_name=event_subscription_name, + release_delay_in_seconds=release_delay_in_seconds, + content_type=content_type, + api_version=self._config.api_version, + content=_content, + headers=_headers, + params=_params, + ) + path_format_arguments = { + "endpoint": self._serialize.url("self._config.endpoint", self._config.endpoint, "str", skip_quote=True), + } + _request.url = self._client.format_url(_request.url, **path_format_arguments) + + _stream = kwargs.pop("stream", False) + pipeline_response: PipelineResponse = await self._client._pipeline.run( # type: ignore # pylint: disable=protected-access + _request, stream=_stream, **kwargs + ) + + response = pipeline_response.http_response + + if response.status_code not in [200]: + if _stream: + await response.read() # Load the body in memory and close the socket + map_error(status_code=response.status_code, response=response, error_map=error_map) + raise 
HttpResponseError(response=response) + + if _stream: + deserialized = response.iter_bytes() + else: + deserialized = _deserialize(_models.ReleaseResult, response.json()) + + if cls: + return cls(pipeline_response, deserialized, {}) # type: ignore + + return deserialized # type: ignore + + @overload + async def _reject( + self, + topic_name: str, + event_subscription_name: str, + body: JSON, + *, + content_type: str = "application/json", + **kwargs: Any + ) -> _models.RejectResult: ... + @overload + async def _reject( + self, + topic_name: str, + event_subscription_name: str, + *, + lock_tokens: List[str], + content_type: str = "application/json", + **kwargs: Any + ) -> _models.RejectResult: ... + @overload + async def _reject( + self, + topic_name: str, + event_subscription_name: str, + body: IO[bytes], + *, + content_type: str = "application/json", + **kwargs: Any + ) -> _models.RejectResult: ... + + @distributed_trace_async + async def _reject( + self, + topic_name: str, + event_subscription_name: str, + body: Union[JSON, IO[bytes]] = _Unset, + *, + lock_tokens: List[str] = _Unset, + **kwargs: Any + ) -> _models.RejectResult: + """Reject a batch of Cloud Events. The response will include the set of successfully rejected lock + tokens, along with other failed lock tokens with their corresponding error information. + Successfully rejected events will be dead-lettered and can no longer be received by a consumer. + + :param topic_name: Topic Name. Required. + :type topic_name: str + :param event_subscription_name: Event Subscription Name. Required. + :type event_subscription_name: str + :param body: Is either a JSON type or a IO[bytes] type. Required. + :type body: JSON or IO[bytes] + :keyword lock_tokens: Array of lock tokens. Required. + :paramtype lock_tokens: list[str] + :return: RejectResult. The RejectResult is compatible with MutableMapping + :rtype: ~azure.eventgrid.models.RejectResult + :raises ~azure.core.exceptions.HttpResponseError: + + Example: + .. 
code-block:: python + + # JSON input template you can fill out and use as your body input. + body = { + "lockTokens": [ + "str" # Array of lock tokens. Required. + ] + } + + # response body for status code(s): 200 + response == { + "failedLockTokens": [ + { + "error": { + "code": "str", # One of a server-defined set of + error codes. Required. + "message": "str", # A human-readable representation + of the error. Required. + "details": [ + ... + ], + "innererror": { + "code": "str", # Optional. One of a + server-defined set of error codes. + "innererror": ... + }, + "target": "str" # Optional. The target of the error. + }, + "lockToken": "str" # The lock token of an entry in the + request. Required. + } + ], + "succeededLockTokens": [ + "str" # Array of lock tokens for the successfully rejected cloud + events. Required. + ] + } + """ + error_map: MutableMapping[int, Type[HttpResponseError]] = { + 401: ClientAuthenticationError, + 404: ResourceNotFoundError, + 409: ResourceExistsError, + 304: ResourceNotModifiedError, + } + error_map.update(kwargs.pop("error_map", {}) or {}) + + _headers = case_insensitive_dict(kwargs.pop("headers", {}) or {}) + _params = kwargs.pop("params", {}) or {} + + content_type: Optional[str] = kwargs.pop("content_type", _headers.pop("Content-Type", None)) + cls: ClsType[_models.RejectResult] = kwargs.pop("cls", None) + + if body is _Unset: + if lock_tokens is _Unset: + raise TypeError("missing required argument: lock_tokens") + body = {"lockTokens": lock_tokens} + body = {k: v for k, v in body.items() if v is not None} + content_type = content_type or "application/json" + _content = None + if isinstance(body, (IOBase, bytes)): + _content = body + else: + _content = json.dumps(body, cls=SdkJSONEncoder, exclude_readonly=True) # type: ignore + + _request = build_event_grid_consumer_reject_request( + topic_name=topic_name, + event_subscription_name=event_subscription_name, + content_type=content_type, + api_version=self._config.api_version, + 
content=_content, + headers=_headers, + params=_params, + ) + path_format_arguments = { + "endpoint": self._serialize.url("self._config.endpoint", self._config.endpoint, "str", skip_quote=True), + } + _request.url = self._client.format_url(_request.url, **path_format_arguments) + + _stream = kwargs.pop("stream", False) + pipeline_response: PipelineResponse = await self._client._pipeline.run( # type: ignore # pylint: disable=protected-access + _request, stream=_stream, **kwargs + ) + + response = pipeline_response.http_response + + if response.status_code not in [200]: + if _stream: + await response.read() # Load the body in memory and close the socket + map_error(status_code=response.status_code, response=response, error_map=error_map) + raise HttpResponseError(response=response) + + if _stream: + deserialized = response.iter_bytes() + else: + deserialized = _deserialize(_models.RejectResult, response.json()) + + if cls: + return cls(pipeline_response, deserialized, {}) # type: ignore + + return deserialized # type: ignore + + @overload + @api_version_validation( + method_added_on="2023-10-01-preview", + params_added_on={ + "2023-10-01-preview": ["api_version", "topic_name", "event_subscription_name", "content_type", "accept"] + }, + ) + async def _renew_locks( + self, + topic_name: str, + event_subscription_name: str, + body: JSON, + *, + content_type: str = "application/json", + **kwargs: Any + ) -> _models.RenewLocksResult: ... + @overload + @api_version_validation( + method_added_on="2023-10-01-preview", + params_added_on={ + "2023-10-01-preview": ["api_version", "topic_name", "event_subscription_name", "content_type", "accept"] + }, + ) + async def _renew_locks( + self, + topic_name: str, + event_subscription_name: str, + *, + lock_tokens: List[str], + content_type: str = "application/json", + **kwargs: Any + ) -> _models.RenewLocksResult: ... 
+ @overload + @api_version_validation( + method_added_on="2023-10-01-preview", + params_added_on={ + "2023-10-01-preview": ["api_version", "topic_name", "event_subscription_name", "content_type", "accept"] + }, + ) + async def _renew_locks( + self, + topic_name: str, + event_subscription_name: str, + body: IO[bytes], + *, + content_type: str = "application/json", + **kwargs: Any + ) -> _models.RenewLocksResult: ... + + @distributed_trace_async + @api_version_validation( + method_added_on="2023-10-01-preview", + params_added_on={ + "2023-10-01-preview": ["api_version", "topic_name", "event_subscription_name", "content_type", "accept"] + }, + ) + async def _renew_locks( + self, + topic_name: str, + event_subscription_name: str, + body: Union[JSON, IO[bytes]] = _Unset, + *, + lock_tokens: List[str] = _Unset, + **kwargs: Any + ) -> _models.RenewLocksResult: + """Renew locks for a batch of Cloud Events. The response will include the set of successfully + renewed lock tokens, along with other failed lock tokens with their corresponding error + information. Successfully renewed locks will ensure that the associated event is only available + to the consumer that holds the renewed lock. + + :param topic_name: Topic Name. Required. + :type topic_name: str + :param event_subscription_name: Event Subscription Name. Required. + :type event_subscription_name: str + :param body: Is either a JSON type or a IO[bytes] type. Required. + :type body: JSON or IO[bytes] + :keyword lock_tokens: Array of lock tokens. Required. + :paramtype lock_tokens: list[str] + :return: RenewLocksResult. The RenewLocksResult is compatible with MutableMapping + :rtype: ~azure.eventgrid.models.RenewLocksResult + :raises ~azure.core.exceptions.HttpResponseError: + + Example: + .. code-block:: python + + # JSON input template you can fill out and use as your body input. + body = { + "lockTokens": [ + "str" # Array of lock tokens. Required. 
+ ] + } + + # response body for status code(s): 200 + response == { + "failedLockTokens": [ + { + "error": { + "code": "str", # One of a server-defined set of + error codes. Required. + "message": "str", # A human-readable representation + of the error. Required. + "details": [ + ... + ], + "innererror": { + "code": "str", # Optional. One of a + server-defined set of error codes. + "innererror": ... + }, + "target": "str" # Optional. The target of the error. + }, + "lockToken": "str" # The lock token of an entry in the + request. Required. + } + ], + "succeededLockTokens": [ + "str" # Array of lock tokens for the successfully renewed locks. + Required. + ] + } + """ + error_map: MutableMapping[int, Type[HttpResponseError]] = { + 401: ClientAuthenticationError, + 404: ResourceNotFoundError, + 409: ResourceExistsError, + 304: ResourceNotModifiedError, + } + error_map.update(kwargs.pop("error_map", {}) or {}) + + _headers = case_insensitive_dict(kwargs.pop("headers", {}) or {}) + _params = kwargs.pop("params", {}) or {} + + content_type: Optional[str] = kwargs.pop("content_type", _headers.pop("Content-Type", None)) + cls: ClsType[_models.RenewLocksResult] = kwargs.pop("cls", None) + + if body is _Unset: + if lock_tokens is _Unset: + raise TypeError("missing required argument: lock_tokens") + body = {"lockTokens": lock_tokens} + body = {k: v for k, v in body.items() if v is not None} + content_type = content_type or "application/json" + _content = None + if isinstance(body, (IOBase, bytes)): + _content = body + else: + _content = json.dumps(body, cls=SdkJSONEncoder, exclude_readonly=True) # type: ignore + + _request = build_event_grid_consumer_renew_locks_request( + topic_name=topic_name, + event_subscription_name=event_subscription_name, + content_type=content_type, + api_version=self._config.api_version, + content=_content, + headers=_headers, + params=_params, + ) + path_format_arguments = { + "endpoint": self._serialize.url("self._config.endpoint", 
self._config.endpoint, "str", skip_quote=True), + } + _request.url = self._client.format_url(_request.url, **path_format_arguments) + + _stream = kwargs.pop("stream", False) + pipeline_response: PipelineResponse = await self._client._pipeline.run( # type: ignore # pylint: disable=protected-access + _request, stream=_stream, **kwargs + ) + + response = pipeline_response.http_response + + if response.status_code not in [200]: + if _stream: + await response.read() # Load the body in memory and close the socket + map_error(status_code=response.status_code, response=response, error_map=error_map) + raise HttpResponseError(response=response) + + if _stream: + deserialized = response.iter_bytes() + else: + deserialized = _deserialize(_models.RenewLocksResult, response.json()) + + if cls: + return cls(pipeline_response, deserialized, {}) # type: ignore + + return deserialized # type: ignore diff --git a/sdk/eventgrid/azure-eventgrid/azure/eventgrid/aio/_operations/_patch.py b/sdk/eventgrid/azure-eventgrid/azure/eventgrid/aio/_operations/_patch.py new file mode 100644 index 000000000000..4cd5f859561d --- /dev/null +++ b/sdk/eventgrid/azure-eventgrid/azure/eventgrid/aio/_operations/_patch.py @@ -0,0 +1,281 @@ +# ------------------------------------ +# Copyright (c) Microsoft Corporation. +# Licensed under the MIT License. +# ------------------------------------ +"""Customize generated code here. 
+Follow our quickstart for examples: https://aka.ms/azsdk/python/dpcodegen/python/customize +""" +from typing import List, Union, Any, Optional, Callable, Dict, TypeVar, TYPE_CHECKING +import sys +from azure.core.messaging import CloudEvent +from azure.core.exceptions import ( + HttpResponseError, + ResourceNotFoundError, +) +from azure.core.tracing.decorator_async import distributed_trace_async +from azure.core.pipeline import PipelineResponse +from azure.core.rest import HttpRequest, AsyncHttpResponse +from ...models._patch import ReceiveDetails +from ._operations import ( + EventGridPublisherClientOperationsMixin as PublisherOperationsMixin, + EventGridConsumerClientOperationsMixin as ConsumerOperationsMixin, +) +from ... import models as _models +from ..._validation import api_version_validation + +from ..._operations._patch import ( + _serialize_events, +) + +from ..._legacy import EventGridEvent +from ..._legacy._helpers import _is_eventgrid_event_format + +if sys.version_info >= (3, 9): + from collections.abc import MutableMapping +else: + from typing import MutableMapping # type: ignore # pylint: disable=ungrouped-imports +JSON = MutableMapping[str, Any] # pylint: disable=unsubscriptable-object +T = TypeVar("T") +ClsType = Optional[Callable[[PipelineResponse[HttpRequest, AsyncHttpResponse], T, Dict[str, Any]], Any]] + +if TYPE_CHECKING: + from cloudevents.http.event import CloudEvent as CNCFCloudEvent + + +class EventGridPublisherClientOperationsMixin(PublisherOperationsMixin): + + @distributed_trace_async + async def send( + self, + events: Union[ + CloudEvent, + List[CloudEvent], + Dict[str, Any], + List[Dict[str, Any]], + "CNCFCloudEvent", + List["CNCFCloudEvent"], + EventGridEvent, + List[EventGridEvent], + ], + *, + channel_name: Optional[str] = None, + content_type: Optional[str] = None, + **kwargs: Any, + ) -> None: # pylint: disable=docstring-should-be-keyword, docstring-missing-param + """Send events to the Event Grid Service. 
+ + :param events: The event(s) to send. If sending to an Event Grid Namespace, the dict or list of dicts + should be in the format of a CloudEvent. + :type events: CloudEvent or List[CloudEvent] or Dict[str, Any] or List[Dict[str, Any]] + or CNCFCloudEvent or List[CNCFCloudEvent] or EventGridEvent or List[EventGridEvent] + :keyword channel_name: The name of the channel to send the event to. Event Grid Basic Resource only. + :paramtype channel_name: str or None + :keyword content_type: The content type of the event. If not specified, the default value is + "application/cloudevents+json; charset=utf-8". + :paramtype content_type: str + + :return: None + :rtype: None + """ + if self._namespace and channel_name: + raise ValueError("Channel name is not supported for Event Grid Namespaces.") + + # If a cloud event dict, convert to CloudEvent for serializing + try: + if isinstance(events, dict): + events = CloudEvent.from_dict(events) + if isinstance(events, list) and isinstance(events[0], dict): + events = [CloudEvent.from_dict(e) for e in events] + except Exception: # pylint: disable=broad-except + pass + + if self._namespace: + kwargs["content_type"] = ( + content_type if content_type else "application/cloudevents-batch+json; charset=utf-8" + ) + if not isinstance(events, list): + events = [events] + + if isinstance(events[0], EventGridEvent) or _is_eventgrid_event_format(events[0]): + raise TypeError("EventGridEvent is not supported for Event Grid Namespaces.") + try: + # Try to send via namespace + await self._publish(self._namespace, _serialize_events(events), **kwargs) + except Exception as exception: # pylint: disable=broad-except + self._http_response_error_handler(exception) + raise exception + else: + kwargs["content_type"] = content_type if content_type else "application/json; charset=utf-8" + try: + await self._publish(events, channel_name=channel_name, **kwargs) + except Exception as exception: + self._http_response_error_handler(exception) + raise 
exception + + def _http_response_error_handler(self, exception): + if isinstance(exception, HttpResponseError): + if exception.status_code == 400: + raise HttpResponseError("Invalid event data. Please check the data and try again.") from exception + if exception.status_code == 404: + raise ResourceNotFoundError( + "Resource not found. " + "For Event Grid Namespaces, please specify the namespace_topic name on the client. " + "For Event Grid Basic, do not specify the namespace_topic name." + ) from exception + raise exception + + +class EventGridConsumerClientOperationsMixin(ConsumerOperationsMixin): + + @distributed_trace_async + async def receive( + self, + *, + max_events: Optional[int] = None, + max_wait_time: Optional[int] = None, + **kwargs: Any, + ) -> List[ReceiveDetails]: + """Receive Batch of Cloud Events from the Event Subscription. + + :keyword max_events: Max Events count to be received. Minimum value is 1, while maximum value + is 100 events. If not specified, the default value is 1. Default value is None. + :paramtype max_events: int + :keyword max_wait_time: Max wait time value for receive operation in Seconds. It is the time in + seconds that the server approximately waits for the availability of an event and responds to + the request. If an event is available, the broker responds immediately to the client. Minimum + value is 10 seconds, while maximum value is 120 seconds. If not specified, the default value is + 60 seconds. Default value is None. 
+ :paramtype max_wait_time: int + :return: ReceiveDetails + :rtype: list[~azure.eventgrid.models.ReceiveDetails] + :raises ~azure.core.exceptions.HttpResponseError: + """ + + detail_items = [] + receive_result = await self._receive( + self._namespace, + self._subscription, + max_events=max_events, + max_wait_time=max_wait_time, + **kwargs, + ) + for detail_item in receive_result.details: + deserialized_cloud_event = CloudEvent.from_dict(detail_item.event) + detail_item.event = deserialized_cloud_event + detail_items.append( + ReceiveDetails( + broker_properties=detail_item.broker_properties, + event=detail_item.event, + ) + ) + return detail_items + + @distributed_trace_async + async def acknowledge( + self, + *, + lock_tokens: List[str], + **kwargs: Any, + ) -> _models.AcknowledgeResult: + """Acknowledge a batch of Cloud Events. The response will include the set of successfully + acknowledged lock tokens, along with other failed lock tokens with their corresponding error + information. Successfully acknowledged events will no longer be available to be received by any + consumer. + + :keyword lock_tokens: Array of lock tokens of Cloud Events. Required. + :paramtype lock_tokens: List[str] + :return: AcknowledgeResult. The AcknowledgeResult is compatible with MutableMapping + :rtype: ~azure.eventgrid.models.AcknowledgeResult + :raises ~azure.core.exceptions.HttpResponseError: + """ + return await super()._acknowledge(self._namespace, self._subscription, lock_tokens=lock_tokens, **kwargs) + + @distributed_trace_async + @api_version_validation( + params_added_on={"2023-10-01-preview": ["release_delay"]}, + ) + async def release( + self, + *, + lock_tokens: List[str], + release_delay: Optional[Union[int, _models.ReleaseDelay]] = None, + **kwargs: Any, + ) -> _models.ReleaseResult: + """Release a batch of Cloud Events. The response will include the set of successfully released + lock tokens, along with other failed lock tokens with their corresponding error information. 
+ Successfully released events can be received by consumers. + + :keyword lock_tokens: Array of lock tokens of Cloud Events. Required. + :paramtype lock_tokens: List[str] + :keyword release_delay: Release cloud events with the specified delay in seconds. + Known values are: 0, 10, 60, 600, and 3600. Default value is None, indicating no delay. + :paramtype release_delay: int or ~azure.eventgrid.models.ReleaseDelay + :return: ReleaseResult. The ReleaseResult is compatible with MutableMapping + :rtype: ~azure.eventgrid.models.ReleaseResult + :raises ~azure.core.exceptions.HttpResponseError: + """ + return await super()._release( + self._namespace, + self._subscription, + lock_tokens=lock_tokens, + release_delay_in_seconds=release_delay, + **kwargs, + ) + + @distributed_trace_async + async def reject( + self, + *, + lock_tokens: List[str], + **kwargs: Any, + ) -> _models.RejectResult: + """Reject a batch of Cloud Events. The response will include the set of successfully rejected lock + tokens, along with other failed lock tokens with their corresponding error information. + Successfully rejected events will be dead-lettered and can no longer be received by a consumer. + + :keyword lock_tokens: Array of lock tokens of Cloud Events. Required. + :paramtype lock_tokens: List[str] + :return: RejectResult. The RejectResult is compatible with MutableMapping + :rtype: ~azure.eventgrid.models.RejectResult + :raises ~azure.core.exceptions.HttpResponseError: + """ + return await super()._reject(self._namespace, self._subscription, lock_tokens=lock_tokens, **kwargs) + + @distributed_trace_async + @api_version_validation( + method_added_on="2023-10-01-preview", + params_added_on={"2023-10-01-preview": ["api_version", "content_type", "accept"]}, + ) + async def renew_locks( + self, + *, + lock_tokens: List[str], + **kwargs: Any, + ) -> _models.RenewLocksResult: + """Renew locks for a batch of Cloud Events. 
The server responds with an HTTP 200 status code if the + request is successfully accepted. The response body will include the set of successfully + renewed lockTokens, along with other failed lockTokens with their corresponding error + information. + + :keyword lock_tokens: Array of lock tokens of Cloud Events. Required. + :paramtype lock_tokens: List[str] + :return: RenewLocksResult. The RenewLocksResult is compatible with + MutableMapping + :rtype: ~azure.eventgrid.models.RenewLocksResult + :raises ~azure.core.exceptions.HttpResponseError: + """ + return await super()._renew_locks(self._namespace, self._subscription, lock_tokens=lock_tokens, **kwargs) + + +__all__: List[str] = [ + "EventGridPublisherClientOperationsMixin", + "EventGridConsumerClientOperationsMixin", +] # Add all objects you want publicly available to users at this package level + + +def patch_sdk(): + """Do not remove from this file. + `patch_sdk` is a last resort escape hatch that allows you to do customizations + you can't accomplish using the techniques described in + https://aka.ms/azsdk/python/dpcodegen/python/customize + """ diff --git a/sdk/eventgrid/azure-eventgrid/azure/eventgrid/aio/_patch.py b/sdk/eventgrid/azure-eventgrid/azure/eventgrid/aio/_patch.py new file mode 100644 index 000000000000..c82fcc0033d0 --- /dev/null +++ b/sdk/eventgrid/azure-eventgrid/azure/eventgrid/aio/_patch.py @@ -0,0 +1,148 @@ +# ------------------------------------ +# Copyright (c) Microsoft Corporation. +# Licensed under the MIT License. +# ------------------------------------ +"""Customize generated code here. 
+Follow our quickstart for examples: https://aka.ms/azsdk/python/dpcodegen/python/customize +""" + +from typing import List, Union, Any, TYPE_CHECKING, Optional +from azure.core.credentials import AzureKeyCredential, AzureSasCredential + + +from .._legacy.aio import EventGridPublisherClient as LegacyEventGridPublisherClient +from ._client import ( + EventGridPublisherClient as InternalEventGridPublisherClient, + EventGridConsumerClient as InternalEventGridConsumerClient, +) +from .._serialization import Deserializer, Serializer +from .._patch import ( + DEFAULT_BASIC_API_VERSION, + DEFAULT_STANDARD_API_VERSION, +) + +if TYPE_CHECKING: + # pylint: disable=unused-import,ungrouped-imports + from azure.core.credentials_async import AsyncTokenCredential + + +class EventGridPublisherClient(InternalEventGridPublisherClient): + """EventGridPublisherClient. + + Sends events to a basic topic, basic domain, or a namespace topic + specified during the client initialization. + + A single instance or a list of dictionaries, CloudEvents or EventGridEvents are accepted. + If a list is provided, the list must contain only one type of event. + If dictionaries are provided and sending to a namespace topic, + the dictionary must follow the CloudEvent schema. + + :param endpoint: The host name of the namespace, e.g. + namespaceName1.westus-1.eventgrid.azure.net. Required. + :type endpoint: str + :param credential: Credential used to authenticate requests to the service. Is either a + AzureKeyCredential type or a AsyncTokenCredential type. Required. + :type credential: ~azure.core.credentials.AzureKeyCredential or + ~azure.core.credentials_async.AsyncTokenCredential + :keyword namespace_topic: The name of the topic to publish events to. Required for EventGrid Namespaces. + Default value is None, which is used for EventGrid Basic. + :paramtype namespace_topic: str or None + :keyword api_version: The API version to use for this operation. Default value is "2024-06-01". 
+ Note that overriding this default value may result in unsupported behavior. + :paramtype api_version: str + """ + + def __init__( + self, + endpoint: str, + credential: Union[AzureKeyCredential, AzureSasCredential, "AsyncTokenCredential"], + *, + namespace_topic: Optional[str] = None, + api_version: Optional[str] = None, + **kwargs: Any, + ) -> None: + self._namespace = namespace_topic + self._credential = credential + + if not self._namespace: + self._client = LegacyEventGridPublisherClient( # type: ignore[assignment] + endpoint, credential, api_version=api_version or DEFAULT_BASIC_API_VERSION, **kwargs + ) + self._publish = self._client.send # type: ignore[attr-defined] + else: + if isinstance(credential, AzureSasCredential): + raise TypeError("SAS token authentication is not supported for the standard client.") + + super().__init__( + endpoint=endpoint, + credential=credential, + api_version=api_version or DEFAULT_STANDARD_API_VERSION, + **kwargs, + ) + self._publish = self._send_events + self._serialize = Serializer() + self._deserialize = Deserializer() + self._serialize.client_side_validation = False + + def __repr__(self) -> str: + return ( + f"" + ) + + +class EventGridConsumerClient(InternalEventGridConsumerClient): + """EventGridConsumerClient. + + Consumes and manages events from a namespace topic + and event subscription specified during the client initialization. + + :param endpoint: The host name of the namespace, e.g. + namespaceName1.westus-1.eventgrid.azure.net. Required. + :type endpoint: str + :param credential: Credential used to authenticate requests to the service. Is either a + AzureKeyCredential type or a AsyncTokenCredential type. Required. + :type credential: ~azure.core.credentials.AzureKeyCredential or + ~azure.core.credentials_async.AsyncTokenCredential + :keyword namespace_topic: The name of the topic to consume events from. Required. 
+ :paramtype namespace_topic: str + :keyword subscription: The name of the subscription to consume events from. Required. + :paramtype subscription: str + :keyword api_version: The API version to use for this operation. Default value is "2024-06-01". + Note that overriding this default value may result in unsupported behavior. + :paramtype api_version: str + """ + + def __init__( + self, + endpoint: str, + credential: Union[AzureKeyCredential, "AsyncTokenCredential"], + *, + namespace_topic: str, + subscription: str, + api_version: Optional[str] = None, + **kwargs: Any, + ) -> None: + self._namespace = namespace_topic + self._subscription = subscription + self._credential = credential + super().__init__( + endpoint=endpoint, credential=credential, api_version=api_version or DEFAULT_STANDARD_API_VERSION, **kwargs + ) + + def __repr__(self) -> str: + return f"" + + +def patch_sdk(): + """Do not remove from this file. + `patch_sdk` is a last resort escape hatch that allows you to do customizations + you can't accomplish using the techniques described in + https://aka.ms/azsdk/python/dpcodegen/python/customize + """ + + +__all__: List[str] = [ + "EventGridConsumerClient", + "EventGridPublisherClient", +] # Add all objects you want publicly available to users at this package level diff --git a/sdk/eventgrid/azure-eventgrid/azure/eventgrid/aio/_vendor.py b/sdk/eventgrid/azure-eventgrid/azure/eventgrid/aio/_vendor.py new file mode 100644 index 000000000000..4bf238ec554f --- /dev/null +++ b/sdk/eventgrid/azure-eventgrid/azure/eventgrid/aio/_vendor.py @@ -0,0 +1,35 @@ +# -------------------------------------------------------------------------- +# Copyright (c) Microsoft Corporation. All rights reserved. +# Licensed under the MIT License. See License.txt in the project root for license information. +# Code generated by Microsoft (R) Python Code Generator. +# Changes may cause incorrect behavior and will be lost if the code is regenerated.
+# -------------------------------------------------------------------------- + +from abc import ABC +from typing import TYPE_CHECKING + +from ._configuration import EventGridConsumerClientConfiguration, EventGridPublisherClientConfiguration + +if TYPE_CHECKING: + # pylint: disable=unused-import,ungrouped-imports + from azure.core import AsyncPipelineClient + + from .._serialization import Deserializer, Serializer + + +class EventGridPublisherClientMixinABC(ABC): + """DO NOT use this class. It is for internal typing use only.""" + + _client: "AsyncPipelineClient" + _config: EventGridPublisherClientConfiguration + _serialize: "Serializer" + _deserialize: "Deserializer" + + +class EventGridConsumerClientMixinABC(ABC): + """DO NOT use this class. It is for internal typing use only.""" + + _client: "AsyncPipelineClient" + _config: EventGridConsumerClientConfiguration + _serialize: "Serializer" + _deserialize: "Deserializer" diff --git a/sdk/eventgrid/azure-eventgrid/azure/eventgrid/models/__init__.py b/sdk/eventgrid/azure-eventgrid/azure/eventgrid/models/__init__.py new file mode 100644 index 000000000000..c687c94b6e9b --- /dev/null +++ b/sdk/eventgrid/azure-eventgrid/azure/eventgrid/models/__init__.py @@ -0,0 +1,29 @@ +# coding=utf-8 +# -------------------------------------------------------------------------- +# Copyright (c) Microsoft Corporation. All rights reserved. +# Licensed under the MIT License. See License.txt in the project root for license information. +# Code generated by Microsoft (R) Python Code Generator. +# Changes may cause incorrect behavior and will be lost if the code is regenerated. 
+# -------------------------------------------------------------------------- + +from ._models import AcknowledgeResult +from ._models import FailedLockToken +from ._models import RejectResult +from ._models import ReleaseResult +from ._models import RenewLocksResult + +from ._enums import ReleaseDelay +from ._patch import __all__ as _patch_all +from ._patch import * # pylint: disable=unused-wildcard-import +from ._patch import patch_sdk as _patch_sdk + +__all__ = [ + "AcknowledgeResult", + "FailedLockToken", + "RejectResult", + "ReleaseResult", + "RenewLocksResult", + "ReleaseDelay", +] +__all__.extend([p for p in _patch_all if p not in __all__]) +_patch_sdk() diff --git a/sdk/eventgrid/azure-eventgrid/azure/eventgrid/models/_enums.py b/sdk/eventgrid/azure-eventgrid/azure/eventgrid/models/_enums.py new file mode 100644 index 000000000000..207504715bf6 --- /dev/null +++ b/sdk/eventgrid/azure-eventgrid/azure/eventgrid/models/_enums.py @@ -0,0 +1,25 @@ +# coding=utf-8 +# -------------------------------------------------------------------------- +# Copyright (c) Microsoft Corporation. All rights reserved. +# Licensed under the MIT License. See License.txt in the project root for license information. +# Code generated by Microsoft (R) Python Code Generator. +# Changes may cause incorrect behavior and will be lost if the code is regenerated. 
+# -------------------------------------------------------------------------- + +from enum import Enum +from azure.core import CaseInsensitiveEnumMeta + + +class ReleaseDelay(str, Enum, metaclass=CaseInsensitiveEnumMeta): + """Supported delays for release operation.""" + + NO_DELAY = "0" + """Release the event after 0 seconds.""" + TEN_SECONDS = "10" + """Release the event after 10 seconds.""" + ONE_MINUTE = "60" + """Release the event after 60 seconds.""" + TEN_MINUTES = "600" + """Release the event after 600 seconds.""" + ONE_HOUR = "3600" + """Release the event after 3600 seconds.""" diff --git a/sdk/eventgrid/azure-eventgrid/azure/eventgrid/models/_models.py b/sdk/eventgrid/azure-eventgrid/azure/eventgrid/models/_models.py new file mode 100644 index 000000000000..a84b330a8112 --- /dev/null +++ b/sdk/eventgrid/azure-eventgrid/azure/eventgrid/models/_models.py @@ -0,0 +1,370 @@ +# coding=utf-8 +# pylint: disable=too-many-lines +# -------------------------------------------------------------------------- +# Copyright (c) Microsoft Corporation. All rights reserved. +# Licensed under the MIT License. See License.txt in the project root for license information. +# Code generated by Microsoft (R) Python Code Generator. +# Changes may cause incorrect behavior and will be lost if the code is regenerated. +# -------------------------------------------------------------------------- + +import datetime +import sys +from typing import Any, List, Mapping, Optional, TYPE_CHECKING, overload + +from .. import _model_base +from .._model_base import rest_field + +if sys.version_info >= (3, 9): + from collections.abc import MutableMapping +else: + from typing import MutableMapping # type: ignore # pylint: disable=ungrouped-imports + +if TYPE_CHECKING: + # pylint: disable=unused-import,ungrouped-imports + from .. 
import models as _models +JSON = MutableMapping[str, Any] # pylint: disable=unsubscriptable-object + + +class AcknowledgeResult(_model_base.Model): + """The result of the Acknowledge operation. + + All required parameters must be populated in order to send to server. + + :ivar failed_lock_tokens: Array of FailedLockToken for failed cloud events. Each + FailedLockToken includes the lock token along with the related error information (namely, the + error code and description). Required. + :vartype failed_lock_tokens: list[~azure.eventgrid.models.FailedLockToken] + :ivar succeeded_lock_tokens: Array of lock tokens for the successfully acknowledged cloud + events. Required. + :vartype succeeded_lock_tokens: list[str] + """ + + failed_lock_tokens: List["_models.FailedLockToken"] = rest_field(name="failedLockTokens") + """Array of FailedLockToken for failed cloud events. Each FailedLockToken includes the lock token + along with the related error information (namely, the error code and description). Required.""" + succeeded_lock_tokens: List[str] = rest_field(name="succeededLockTokens") + """Array of lock tokens for the successfully acknowledged cloud events. Required.""" + + @overload + def __init__( + self, + *, + failed_lock_tokens: List["_models.FailedLockToken"], + succeeded_lock_tokens: List[str], + ): ... + + @overload + def __init__(self, mapping: Mapping[str, Any]): + """ + :param mapping: raw JSON to initialize the model. + :type mapping: Mapping[str, Any] + """ + + def __init__(self, *args: Any, **kwargs: Any) -> None: # pylint: disable=useless-super-delegation + super().__init__(*args, **kwargs) + + +class BrokerProperties(_model_base.Model): + """Properties of the Event Broker operation. + + All required parameters must be populated in order to send to server. + + :ivar lock_token: The token of the lock on the event. Required. + :vartype lock_token: str + :ivar delivery_count: The attempt count for delivering the event. Required. 
+ :vartype delivery_count: int + """ + + lock_token: str = rest_field(name="lockToken") + """The token of the lock on the event. Required.""" + delivery_count: int = rest_field(name="deliveryCount") + """The attempt count for delivering the event. Required.""" + + +class CloudEvent(_model_base.Model): + """Properties of an event published to an Azure Messaging EventGrid Namespace topic using the + CloudEvent 1.0 Schema. + + All required parameters must be populated in order to send to server. + + :ivar id: An identifier for the event. The combination of id and source must be unique for each + distinct event. Required. + :vartype id: str + :ivar source: Identifies the context in which an event happened. The combination of id and + source must be unique for each distinct event. Required. + :vartype source: str + :ivar data: Event data specific to the event type. + :vartype data: any + :ivar data_base64: Event data specific to the event type, encoded as a base64 string. + :vartype data_base64: bytes + :ivar type: Type of event related to the originating occurrence. Required. + :vartype type: str + :ivar time: The time (in UTC) the event was generated, in RFC3339 format. + :vartype time: ~datetime.datetime + :ivar specversion: The version of the CloudEvents specification which the event uses. Required. + :vartype specversion: str + :ivar dataschema: Identifies the schema that data adheres to. + :vartype dataschema: str + :ivar datacontenttype: Content type of data value. + :vartype datacontenttype: str + :ivar subject: This describes the subject of the event in the context of the event producer + (identified by source). + :vartype subject: str + """ + + id: str = rest_field() + """An identifier for the event. The combination of id and source must be unique for each distinct + event. Required.""" + source: str = rest_field() + """Identifies the context in which an event happened. The combination of id and source must be + unique for each distinct event. 
Required.""" + data: Optional[Any] = rest_field() + """Event data specific to the event type.""" + data_base64: Optional[bytes] = rest_field(format="base64") + """Event data specific to the event type, encoded as a base64 string.""" + type: str = rest_field() + """Type of event related to the originating occurrence. Required.""" + time: Optional[datetime.datetime] = rest_field(format="rfc3339") + """The time (in UTC) the event was generated, in RFC3339 format.""" + specversion: str = rest_field() + """The version of the CloudEvents specification which the event uses. Required.""" + dataschema: Optional[str] = rest_field() + """Identifies the schema that data adheres to.""" + datacontenttype: Optional[str] = rest_field() + """Content type of data value.""" + subject: Optional[str] = rest_field() + """This describes the subject of the event in the context of the event producer (identified by + source).""" + + +class Error(_model_base.Model): + """The error object. + + All required parameters must be populated in order to send to server. + + :ivar code: One of a server-defined set of error codes. Required. + :vartype code: str + :ivar message: A human-readable representation of the error. Required. + :vartype message: str + :ivar target: The target of the error. + :vartype target: str + :ivar details: An array of details about specific errors that led to this reported error. + :vartype details: list[~azure.eventgrid.models._models.Error] + :ivar innererror: An object containing more specific information than the current object about + the error. + :vartype innererror: ~azure.eventgrid.models._models.InnerError + """ + + code: str = rest_field() + """One of a server-defined set of error codes. Required.""" + message: str = rest_field() + """A human-readable representation of the error. 
Required.""" + target: Optional[str] = rest_field() + """The target of the error.""" + details: Optional[List["_models._models.Error"]] = rest_field() + """An array of details about specific errors that led to this reported error.""" + innererror: Optional["_models._models.InnerError"] = rest_field() + """An object containing more specific information than the current object about the error.""" + + +class FailedLockToken(_model_base.Model): + """Failed LockToken information. + + All required parameters must be populated in order to send to server. + + :ivar lock_token: The lock token of an entry in the request. Required. + :vartype lock_token: str + :ivar error: Error information of the failed operation result for the lock token in the + request. Required. + :vartype error: ~azure.eventgrid.models._models.Error + """ + + lock_token: str = rest_field(name="lockToken") + """The lock token of an entry in the request. Required.""" + error: "_models._models.Error" = rest_field() + """Error information of the failed operation result for the lock token in the request. Required.""" + + @overload + def __init__( + self, + *, + lock_token: str, + error: "_models._models.Error", + ): ... + + @overload + def __init__(self, mapping: Mapping[str, Any]): + """ + :param mapping: raw JSON to initialize the model. + :type mapping: Mapping[str, Any] + """ + + def __init__(self, *args: Any, **kwargs: Any) -> None: # pylint: disable=useless-super-delegation + super().__init__(*args, **kwargs) + + +class InnerError(_model_base.Model): + """An object containing more specific information about the error. As per Microsoft One API + guidelines - + https://github.com/Microsoft/api-guidelines/blob/vNext/Guidelines.md#7102-error-condition-responses. + + :ivar code: One of a server-defined set of error codes. + :vartype code: str + :ivar innererror: Inner error. 
+ :vartype innererror: ~azure.eventgrid.models._models.InnerError + """ + + code: Optional[str] = rest_field() + """One of a server-defined set of error codes.""" + innererror: Optional["_models._models.InnerError"] = rest_field() + """Inner error.""" + + +class PublishResult(_model_base.Model): + """The result of the Publish operation.""" + + +class ReceiveDetails(_model_base.Model): + """Receive operation details per Cloud Event. + + All required parameters must be populated in order to send to server. + + :ivar broker_properties: The Event Broker details. Required. + :vartype broker_properties: ~azure.eventgrid.models._models.BrokerProperties + :ivar event: Cloud Event details. Required. + :vartype event: ~azure.eventgrid.models._models.CloudEvent + """ + + broker_properties: "_models._models.BrokerProperties" = rest_field(name="brokerProperties") + """The Event Broker details. Required.""" + event: "_models._models.CloudEvent" = rest_field() + """Cloud Event details. Required.""" + + +class ReceiveResult(_model_base.Model): + """Details of the Receive operation response. + + All required parameters must be populated in order to send to server. + + :ivar details: Array of receive responses, one per cloud event. Required. + :vartype details: list[~azure.eventgrid.models._models.ReceiveDetails] + """ + + details: List["_models._models.ReceiveDetails"] = rest_field(name="value") + """Array of receive responses, one per cloud event. Required.""" + + +class RejectResult(_model_base.Model): + """The result of the Reject operation. + + All required parameters must be populated in order to send to server. + + :ivar failed_lock_tokens: Array of FailedLockToken for failed cloud events. Each + FailedLockToken includes the lock token along with the related error information (namely, the + error code and description). Required. 
+ :vartype failed_lock_tokens: list[~azure.eventgrid.models.FailedLockToken] + :ivar succeeded_lock_tokens: Array of lock tokens for the successfully rejected cloud events. + Required. + :vartype succeeded_lock_tokens: list[str] + """ + + failed_lock_tokens: List["_models.FailedLockToken"] = rest_field(name="failedLockTokens") + """Array of FailedLockToken for failed cloud events. Each FailedLockToken includes the lock token + along with the related error information (namely, the error code and description). Required.""" + succeeded_lock_tokens: List[str] = rest_field(name="succeededLockTokens") + """Array of lock tokens for the successfully rejected cloud events. Required.""" + + @overload + def __init__( + self, + *, + failed_lock_tokens: List["_models.FailedLockToken"], + succeeded_lock_tokens: List[str], + ): ... + + @overload + def __init__(self, mapping: Mapping[str, Any]): + """ + :param mapping: raw JSON to initialize the model. + :type mapping: Mapping[str, Any] + """ + + def __init__(self, *args: Any, **kwargs: Any) -> None: # pylint: disable=useless-super-delegation + super().__init__(*args, **kwargs) + + +class ReleaseResult(_model_base.Model): + """The result of the Release operation. + + All required parameters must be populated in order to send to server. + + :ivar failed_lock_tokens: Array of FailedLockToken for failed cloud events. Each + FailedLockToken includes the lock token along with the related error information (namely, the + error code and description). Required. + :vartype failed_lock_tokens: list[~azure.eventgrid.models.FailedLockToken] + :ivar succeeded_lock_tokens: Array of lock tokens for the successfully released cloud events. + Required. + :vartype succeeded_lock_tokens: list[str] + """ + + failed_lock_tokens: List["_models.FailedLockToken"] = rest_field(name="failedLockTokens") + """Array of FailedLockToken for failed cloud events. 
Each FailedLockToken includes the lock token + along with the related error information (namely, the error code and description). Required.""" + succeeded_lock_tokens: List[str] = rest_field(name="succeededLockTokens") + """Array of lock tokens for the successfully released cloud events. Required.""" + + @overload + def __init__( + self, + *, + failed_lock_tokens: List["_models.FailedLockToken"], + succeeded_lock_tokens: List[str], + ): ... + + @overload + def __init__(self, mapping: Mapping[str, Any]): + """ + :param mapping: raw JSON to initialize the model. + :type mapping: Mapping[str, Any] + """ + + def __init__(self, *args: Any, **kwargs: Any) -> None: # pylint: disable=useless-super-delegation + super().__init__(*args, **kwargs) + + +class RenewLocksResult(_model_base.Model): + """The result of the RenewLock operation. + + All required parameters must be populated in order to send to server. + + :ivar failed_lock_tokens: Array of FailedLockToken for failed cloud events. Each + FailedLockToken includes the lock token along with the related error information (namely, the + error code and description). Required. + :vartype failed_lock_tokens: list[~azure.eventgrid.models.FailedLockToken] + :ivar succeeded_lock_tokens: Array of lock tokens for the successfully renewed locks. Required. + :vartype succeeded_lock_tokens: list[str] + """ + + failed_lock_tokens: List["_models.FailedLockToken"] = rest_field(name="failedLockTokens") + """Array of FailedLockToken for failed cloud events. Each FailedLockToken includes the lock token + along with the related error information (namely, the error code and description). Required.""" + succeeded_lock_tokens: List[str] = rest_field(name="succeededLockTokens") + """Array of lock tokens for the successfully renewed locks. Required.""" + + @overload + def __init__( + self, + *, + failed_lock_tokens: List["_models.FailedLockToken"], + succeeded_lock_tokens: List[str], + ): ... 
+ + @overload + def __init__(self, mapping: Mapping[str, Any]): + """ + :param mapping: raw JSON to initialize the model. + :type mapping: Mapping[str, Any] + """ + + def __init__(self, *args: Any, **kwargs: Any) -> None: # pylint: disable=useless-super-delegation + super().__init__(*args, **kwargs) diff --git a/sdk/eventgrid/azure-eventgrid/azure/eventgrid/models/_patch.py b/sdk/eventgrid/azure-eventgrid/azure/eventgrid/models/_patch.py new file mode 100644 index 000000000000..0772b43bb722 --- /dev/null +++ b/sdk/eventgrid/azure-eventgrid/azure/eventgrid/models/_patch.py @@ -0,0 +1,89 @@ +# ------------------------------------ +# Copyright (c) Microsoft Corporation. +# Licensed under the MIT License. +# ------------------------------------ +"""Customize generated code here. + +Follow our quickstart for examples: https://aka.ms/azsdk/python/dpcodegen/python/customize +""" +from typing import List, overload, Mapping, Any +from azure.core.messaging import CloudEvent +from ._models import ( + ReceiveDetails as InternalReceiveDetails, + BrokerProperties as InternalBrokerProperties, +) + + +class ReceiveDetails(InternalReceiveDetails): + """Receive operation details per Cloud Event. + + All required parameters must be populated in order to send to Azure. + + :ivar broker_properties: The Event Broker details. Required. + :vartype broker_properties: ~azure.eventgrid.models.BrokerProperties + :ivar event: Cloud Event details. Required. + :vartype event: ~azure.core.messaging.CloudEvent + """ + + @overload + def __init__( + self, + *, + broker_properties: "BrokerProperties", + event: "CloudEvent", + ): ... + + @overload + def __init__(self, mapping: Mapping[str, Any]): + """ + :param mapping: raw JSON to initialize the model. 
+ :type mapping: Mapping[str, Any] + """ + + def __init__(self, *args: Any, **kwargs: Any) -> None: # pylint: disable=useless-super-delegation + super().__init__(*args, **kwargs) + + +class BrokerProperties(InternalBrokerProperties): + """Properties of the Event Broker operation. + + All required parameters must be populated in order to send to Azure. + + :ivar lock_token: The token used to lock the event. Required. + :vartype lock_token: str + :ivar delivery_count: The attempt count for delivering the event. Required. + :vartype delivery_count: int + """ + + @overload + def __init__( + self, + *, + lock_token: str, + delivery_count: int, + ): ... + + @overload + def __init__(self, mapping: Mapping[str, Any]): + """ + :param mapping: raw JSON to initialize the model. + :type mapping: Mapping[str, Any] + """ + + def __init__(self, *args: Any, **kwargs: Any) -> None: # pylint: disable=useless-super-delegation + super().__init__(*args, **kwargs) + + +__all__: List[str] = [ + "ReceiveDetails", + "BrokerProperties", +] # Add all objects you want publicly available to users at this package level + + +def patch_sdk(): + """Do not remove from this file. + + `patch_sdk` is a last resort escape hatch that allows you to do customizations + you can't accomplish using the techniques described in + https://aka.ms/azsdk/python/dpcodegen/python/customize + """ diff --git a/sdk/eventgrid/azure-eventgrid/azure/eventgrid/py.typed b/sdk/eventgrid/azure-eventgrid/azure/eventgrid/py.typed index e69de29bb2d1..e5aff4f83af8 100644 --- a/sdk/eventgrid/azure-eventgrid/azure/eventgrid/py.typed +++ b/sdk/eventgrid/azure-eventgrid/azure/eventgrid/py.typed @@ -0,0 +1 @@ +# Marker file for PEP 561.
\ No newline at end of file diff --git a/sdk/eventgrid/azure-eventgrid/mypy.ini b/sdk/eventgrid/azure-eventgrid/mypy.ini index b8d3b2b62839..83bbdffd443c 100644 --- a/sdk/eventgrid/azure-eventgrid/mypy.ini +++ b/sdk/eventgrid/azure-eventgrid/mypy.ini @@ -1,12 +1,18 @@ [mypy] python_version = 3.7 -warn_return_any = True +warn_return_any = False warn_unused_configs = True ignore_missing_imports = True # Per-module options: -[mypy-azure.eventgrid._generated.*] +[mypy-azure.eventgrid._legacy.*] +ignore_errors = True + +[mypy-azure.eventgrid._operations.*] +ignore_errors = True + +[mypy-azure.eventgrid.aio._operations.*] ignore_errors = True [mypy-azure.core.*] diff --git a/sdk/eventgrid/azure-eventgrid/pyproject.toml b/sdk/eventgrid/azure-eventgrid/pyproject.toml index 57f5387d00e3..ab509fcf3611 100644 --- a/sdk/eventgrid/azure-eventgrid/pyproject.toml +++ b/sdk/eventgrid/azure-eventgrid/pyproject.toml @@ -1,5 +1,3 @@ [tool.azure-sdk-build] pyright = false -type_check_samples = false -verifytypes = false -strict_sphinx = true \ No newline at end of file +verifytypes = false \ No newline at end of file diff --git a/sdk/eventgrid/azure-eventgrid/samples/async_samples/sample_authentication_async.py b/sdk/eventgrid/azure-eventgrid/samples/async_samples/sample_authentication_async.py index cc1ccb15c6f2..1ee669f2ad63 100644 --- a/sdk/eventgrid/azure-eventgrid/samples/async_samples/sample_authentication_async.py +++ b/sdk/eventgrid/azure-eventgrid/samples/async_samples/sample_authentication_async.py @@ -44,14 +44,9 @@ from azure.eventgrid.aio import EventGridPublisherClient from azure.eventgrid import EventGridEvent -event = EventGridEvent( - data={"team": "azure-sdk"}, - subject="Door1", - event_type="Azure.Sdk.Demo", - data_version="2.0" -) +event = EventGridEvent(data={"team": "azure-sdk"}, subject="Door1", event_type="Azure.Sdk.Demo", data_version="2.0") default_az_credential = DefaultAzureCredential() endpoint = os.environ["EVENTGRID_TOPIC_ENDPOINT"] client = 
EventGridPublisherClient(endpoint, default_az_credential) -# [END client_auth_with_token_cred_async] \ No newline at end of file +# [END client_auth_with_token_cred_async] diff --git a/sdk/eventgrid/azure-eventgrid/samples/async_samples/sample_consume_process_events_async.py b/sdk/eventgrid/azure-eventgrid/samples/async_samples/sample_consume_process_events_async.py new file mode 100644 index 000000000000..a44dfe010460 --- /dev/null +++ b/sdk/eventgrid/azure-eventgrid/samples/async_samples/sample_consume_process_events_async.py @@ -0,0 +1,136 @@ +# -------------------------------------------------------------------------- +# Copyright (c) Microsoft Corporation. All rights reserved. +# Licensed under the MIT License. See License.txt in the project root for +# license information. +# -------------------------------------------------------------------------- +""" +FILE: sample_consume_process_events_async.py +DESCRIPTION: + These samples demonstrate sending, receiving, releasing, and acknowledging CloudEvents. +USAGE: + python sample_consume_process_events_async.py + Set the environment variables with your own values before running the sample: + 1) EVENTGRID_KEY - The access key of your eventgrid account. + 2) EVENTGRID_ENDPOINT - The namespace endpoint. Typically it exists in the format + "https://..eventgrid.azure.net". + 3) EVENTGRID_TOPIC_NAME - The namespace topic name. + 4) EVENTGRID_EVENT_SUBSCRIPTION_NAME - The event subscription name. 
+""" +import os +import asyncio +from azure.core.credentials import AzureKeyCredential +from azure.eventgrid.models import * +from azure.core.messaging import CloudEvent +from azure.core.exceptions import HttpResponseError +from azure.eventgrid.aio import EventGridConsumerClient, EventGridPublisherClient + +EVENTGRID_KEY: str = os.environ["EVENTGRID_KEY"] +EVENTGRID_ENDPOINT: str = os.environ["EVENTGRID_ENDPOINT"] +TOPIC_NAME: str = os.environ["EVENTGRID_TOPIC_NAME"] +EVENT_SUBSCRIPTION_NAME: str = os.environ["EVENTGRID_EVENT_SUBSCRIPTION_NAME"] + + +async def run(): + # Create a client + publisher = EventGridPublisherClient( + EVENTGRID_ENDPOINT, AzureKeyCredential(EVENTGRID_KEY), namespace_topic=TOPIC_NAME + ) + client = EventGridConsumerClient( + EVENTGRID_ENDPOINT, + AzureKeyCredential(EVENTGRID_KEY), + namespace_topic=TOPIC_NAME, + subscription=EVENT_SUBSCRIPTION_NAME, + ) + + cloud_event_reject = CloudEvent(data="reject", source="https://example.com", type="example") + cloud_event_release = CloudEvent(data="release", source="https://example.com", type="example") + cloud_event_ack = CloudEvent(data="acknowledge", source="https://example.com", type="example") + cloud_event_renew = CloudEvent(data="renew", source="https://example.com", type="example") + + # Send Cloud Events + await publisher.send( + [ + cloud_event_reject, + cloud_event_release, + cloud_event_ack, + cloud_event_renew, + ] + ) + + # Receive Published Cloud Events + try: + receive_results = await client.receive( + max_events=10, + max_wait_time=10, + ) + except HttpResponseError: + raise + + # Iterate through the results and collect the lock tokens for events we want to release/acknowledge/reject/renew: + + release_events = [] + acknowledge_events = [] + reject_events = [] + renew_events = [] + + for detail in receive_results: + data = detail.event.data + broker_properties = detail.broker_properties + if data == "release": + release_events.append(broker_properties.lock_token) + elif data == 
"acknowledge": + acknowledge_events.append(broker_properties.lock_token) + elif data == "renew": + renew_events.append(broker_properties.lock_token) + else: + reject_events.append(broker_properties.lock_token) + + # Release/Acknowledge/Reject/Renew events + + if len(release_events) > 0: + try: + release_result = await client.release( + lock_tokens=release_events, + ) + except HttpResponseError: + raise + + for succeeded_lock_token in release_result.succeeded_lock_tokens: + print(f"Succeeded Lock Token:{succeeded_lock_token}") + + if len(acknowledge_events) > 0: + try: + ack_result = await client.acknowledge( + lock_tokens=acknowledge_events, + ) + except HttpResponseError: + raise + + for succeeded_lock_token in ack_result.succeeded_lock_tokens: + print(f"Succeeded Lock Token:{succeeded_lock_token}") + + if len(reject_events) > 0: + try: + reject_result = await client.reject( + lock_tokens=reject_events, + ) + except HttpResponseError: + raise + + for succeeded_lock_token in reject_result.succeeded_lock_tokens: + print(f"Succeeded Lock Token:{succeeded_lock_token}") + + if len(renew_events) > 0: + try: + renew_result = await client.renew_locks( + lock_tokens=renew_events, + ) + except HttpResponseError: + raise + + for succeeded_lock_token in renew_result.succeeded_lock_tokens: + print(f"Succeeded Lock Token:{succeeded_lock_token}") + + +if __name__ == "__main__": + asyncio.run(run()) diff --git a/sdk/eventgrid/azure-eventgrid/samples/async_samples/sample_publish_cloud_event_using_dict_async.py b/sdk/eventgrid/azure-eventgrid/samples/async_samples/sample_publish_cloud_event_using_dict_async.py index d46585fac3e9..1c341e44d183 100644 --- a/sdk/eventgrid/azure-eventgrid/samples/async_samples/sample_publish_cloud_event_using_dict_async.py +++ b/sdk/eventgrid/azure-eventgrid/samples/async_samples/sample_publish_cloud_event_using_dict_async.py @@ -21,28 +21,54 @@ from azure.eventgrid.aio import EventGridPublisherClient from azure.core.credentials import 
AzureKeyCredential -topic_key = os.environ["EVENTGRID_CLOUD_EVENT_TOPIC_KEY"] -endpoint = os.environ["EVENTGRID_CLOUD_EVENT_TOPIC_ENDPOINT"] async def publish(): + # To Event Grid Basic + topic_key = os.environ["EVENTGRID_CLOUD_EVENT_TOPIC_KEY"] + endpoint = os.environ["EVENTGRID_CLOUD_EVENT_TOPIC_ENDPOINT"] + credential = AzureKeyCredential(topic_key) client = EventGridPublisherClient(endpoint, credential) # [START publish_cloud_event_dict_async] async with client: - await client.send([ - { - "type": "Contoso.Items.ItemReceived", - "source": "/contoso/items", - "data": { - "itemSku": "Contoso Item SKU #1" - }, - "subject": "Door1", - "specversion": "1.0", - "id": "randomclouduuid11" - } - ]) + await client.send( + [ + { + "type": "Contoso.Items.ItemReceived", + "source": "/contoso/items", + "data": {"itemSku": "Contoso Item SKU #1"}, + "subject": "Door1", + "specversion": "1.0", + "id": "randomclouduuid11", + } + ] + ) # [END publish_cloud_event_dict_async] -if __name__ == '__main__': + # To Event Grid Namespaces + topic_endpoint = os.environ["EVENTGRID_ENDPOINT"] + topic_key = os.environ["EVENTGRID_KEY"] + topic_name = os.environ["EVENTGRID_TOPIC_NAME"] + sub = os.environ["EVENTGRID_EVENT_SUBSCRIPTION_NAME"] + + credential = AzureKeyCredential(topic_key) + client = EventGridPublisherClient(topic_endpoint, credential, namespace_topic=topic_name) + + async with client: + await client.send( + [ + { + "type": "Contoso.Items.ItemReceived", + "source": "/contoso/items", + "data": {"itemSku": "Contoso Item SKU #1"}, + "subject": "Door1", + "specversion": "1.0", + "id": "randomclouduuid11", + } + ] + ) + + +if __name__ == "__main__": asyncio.run(publish()) diff --git a/sdk/eventgrid/azure-eventgrid/samples/async_samples/sample_publish_cncf_cloud_events_async.py b/sdk/eventgrid/azure-eventgrid/samples/async_samples/sample_publish_cncf_cloud_events_async.py index e2036f141e63..f777f4ffd67b 100644 --- 
a/sdk/eventgrid/azure-eventgrid/samples/async_samples/sample_publish_cncf_cloud_events_async.py +++ b/sdk/eventgrid/azure-eventgrid/samples/async_samples/sample_publish_cncf_cloud_events_async.py @@ -20,24 +20,47 @@ from azure.core.credentials import AzureKeyCredential from cloudevents.http import CloudEvent -topic_key = os.environ["EVENTGRID_CLOUD_EVENT_TOPIC_KEY"] -endpoint = os.environ["EVENTGRID_CLOUD_EVENT_TOPIC_ENDPOINT"] - async def publish(): - + + # To Event Grid Basic + topic_key = os.environ["EVENTGRID_CLOUD_EVENT_TOPIC_KEY"] + endpoint = os.environ["EVENTGRID_CLOUD_EVENT_TOPIC_ENDPOINT"] + credential = AzureKeyCredential(topic_key) client = EventGridPublisherClient(endpoint, credential) - await client.send([ - CloudEvent( - attributes={ - "type": "cloudevent", - "source": "/cncf/cloud/event/1.0", - "subject": "testing-cncf-event" - }, - data=b'This is a cncf cloud event.', + await client.send( + [ + CloudEvent( + attributes={"type": "cloudevent", "source": "/cncf/cloud/event/1.0", "subject": "testing-cncf-event"}, + data=b"This is a cncf cloud event.", + ) + ] + ) + + # To Event Grid Namespaces + + topic_endpoint = os.environ["EVENTGRID_ENDPOINT"] + topic_key = os.environ["EVENTGRID_KEY"] + topic_name = os.environ["EVENTGRID_TOPIC_NAME"] + + credential = AzureKeyCredential(topic_key) + client = EventGridPublisherClient(topic_endpoint, credential, namespace_topic=topic_name) + + async with client: + await client.send( + [ + CloudEvent( + attributes={ + "type": "cloudevent", + "source": "/cncf/cloud/event/1.0", + "subject": "testing-cncf-event", + }, + data=b"This is a cncf cloud event.", + ) + ] ) - ]) -if __name__ == '__main__': + +if __name__ == "__main__": asyncio.run(publish()) diff --git a/sdk/eventgrid/azure-eventgrid/samples/async_samples/sample_publish_custom_schema_to_a_topic_async.py b/sdk/eventgrid/azure-eventgrid/samples/async_samples/sample_publish_custom_schema_to_a_topic_async.py index 9c6001e4afae..2d7ccbb8cbcf 100644 --- 
a/sdk/eventgrid/azure-eventgrid/samples/async_samples/sample_publish_custom_schema_to_a_topic_async.py +++ b/sdk/eventgrid/azure-eventgrid/samples/async_samples/sample_publish_custom_schema_to_a_topic_async.py @@ -28,6 +28,7 @@ key = os.environ["EVENTGRID_CUSTOM_EVENT_TOPIC_KEY"] endpoint = os.environ["EVENTGRID_CUSTOM_EVENT_TOPIC_ENDPOINT"] + async def publish_event(): # authenticate client # [START publish_custom_schema_async] @@ -40,13 +41,14 @@ async def publish_event(): "customDataVersion": "2.0", "customId": uuid.uuid4(), "customEventTime": dt.datetime.now(UTC()).isoformat(), - "customData": "sample data" + "customData": "sample data", } async with client: # publish list of events await client.send(custom_schema_event) - + # [END publish_custom_schema_async] -if __name__ == '__main__': + +if __name__ == "__main__": asyncio.run(publish_event()) diff --git a/sdk/eventgrid/azure-eventgrid/samples/async_samples/sample_publish_eg_event_using_dict_async.py b/sdk/eventgrid/azure-eventgrid/samples/async_samples/sample_publish_eg_event_using_dict_async.py index a175bfec0d75..8909b5717b2b 100644 --- a/sdk/eventgrid/azure-eventgrid/samples/async_samples/sample_publish_eg_event_using_dict_async.py +++ b/sdk/eventgrid/azure-eventgrid/samples/async_samples/sample_publish_eg_event_using_dict_async.py @@ -26,35 +26,33 @@ topic_key = os.environ["EVENTGRID_TOPIC_KEY"] endpoint = os.environ["EVENTGRID_TOPIC_ENDPOINT"] + async def publish(): credential = AzureKeyCredential(topic_key) client = EventGridPublisherClient(endpoint, credential) # [START publish_eg_event_dict_async] - event0 = { - "eventType": "Contoso.Items.ItemReceived", - "data": { - "itemSku": "Contoso Item SKU #1" - }, - "subject": "Door1", - "dataVersion": "2.0", - "id": "randomuuid11", - "eventTime": datetime.utcnow() - } - event1 = { - "eventType": "Contoso.Items.ItemReceived", - "data": { - "itemSku": "Contoso Item SKU #2" - }, - "subject": "Door1", - "dataVersion": "2.0", - "id": "randomuuid12", - 
"eventTime": datetime.utcnow() + event0 = { + "eventType": "Contoso.Items.ItemReceived", + "data": {"itemSku": "Contoso Item SKU #1"}, + "subject": "Door1", + "dataVersion": "2.0", + "id": "randomuuid11", + "eventTime": datetime.utcnow(), + } + event1 = { + "eventType": "Contoso.Items.ItemReceived", + "data": {"itemSku": "Contoso Item SKU #2"}, + "subject": "Door1", + "dataVersion": "2.0", + "id": "randomuuid12", + "eventTime": datetime.utcnow(), } - async with client: + async with client: await client.send([event0, event1]) # [END publish_eg_event_dict_async] -if __name__ == '__main__': + +if __name__ == "__main__": asyncio.run(publish()) diff --git a/sdk/eventgrid/azure-eventgrid/samples/async_samples/sample_publish_eg_events_to_a_domain_async.py b/sdk/eventgrid/azure-eventgrid/samples/async_samples/sample_publish_eg_events_to_a_domain_async.py index 2973273fb180..48272cb30166 100644 --- a/sdk/eventgrid/azure-eventgrid/samples/async_samples/sample_publish_eg_events_to_a_domain_async.py +++ b/sdk/eventgrid/azure-eventgrid/samples/async_samples/sample_publish_eg_events_to_a_domain_async.py @@ -24,30 +24,30 @@ domain_key = os.environ["EVENTGRID_DOMAIN_KEY"] domain_hostname = os.environ["EVENTGRID_DOMAIN_ENDPOINT"] + async def publish(): credential = AzureKeyCredential(domain_key) client = EventGridPublisherClient(domain_hostname, credential) - await client.send([ - EventGridEvent( - topic="MyCustomDomainTopic1", - event_type="Contoso.Items.ItemReceived", - data={ - "itemSku": "Contoso Item SKU #1" - }, - subject="Door1", - data_version="2.0" - ), - EventGridEvent( - topic="MyCustomDomainTopic2", - event_type="Contoso.Items.ItemReceived", - data={ - "itemSku": "Contoso Item SKU #2" - }, - subject="Door1", - data_version="2.0" - ) - ]) + await client.send( + [ + EventGridEvent( + topic="MyCustomDomainTopic1", + event_type="Contoso.Items.ItemReceived", + data={"itemSku": "Contoso Item SKU #1"}, + subject="Door1", + data_version="2.0", + ), + EventGridEvent( + 
topic="MyCustomDomainTopic2", + event_type="Contoso.Items.ItemReceived", + data={"itemSku": "Contoso Item SKU #2"}, + subject="Door1", + data_version="2.0", + ), + ] + ) + -if __name__ == '__main__': +if __name__ == "__main__": asyncio.run(publish()) diff --git a/sdk/eventgrid/azure-eventgrid/samples/async_samples/sample_publish_eg_events_to_a_topic_async.py b/sdk/eventgrid/azure-eventgrid/samples/async_samples/sample_publish_eg_events_to_a_topic_async.py index f1924f54f66d..4ab76f2a81fa 100644 --- a/sdk/eventgrid/azure-eventgrid/samples/async_samples/sample_publish_eg_events_to_a_topic_async.py +++ b/sdk/eventgrid/azure-eventgrid/samples/async_samples/sample_publish_eg_events_to_a_topic_async.py @@ -24,21 +24,24 @@ topic_key = os.environ["EVENTGRID_TOPIC_KEY"] endpoint = os.environ["EVENTGRID_TOPIC_ENDPOINT"] + async def publish(): credential = AzureKeyCredential(topic_key) client = EventGridPublisherClient(endpoint, credential) - await client.send([ - EventGridEvent( - event_type="Contoso.Items.ItemReceived", - data={ - "itemSku": "Contoso Item SKU #1" - }, - subject="Door1", - data_version="2.0" - ) - ]) + await client.send( + [ + EventGridEvent( + event_type="Contoso.Items.ItemReceived", + data={"itemSku": "Contoso Item SKU #1"}, + subject="Door1", + data_version="2.0", + ) + ] + ) + + # [END publish_eg_event_to_topic_async] -if __name__ == '__main__': +if __name__ == "__main__": asyncio.run(publish()) diff --git a/sdk/eventgrid/azure-eventgrid/samples/async_samples/sample_publish_events_to_a_topic_using_sas_credential_async.py b/sdk/eventgrid/azure-eventgrid/samples/async_samples/sample_publish_events_to_a_topic_using_sas_credential_async.py index 26e4c745ddaf..c70bfd11aef4 100644 --- a/sdk/eventgrid/azure-eventgrid/samples/async_samples/sample_publish_events_to_a_topic_using_sas_credential_async.py +++ b/sdk/eventgrid/azure-eventgrid/samples/async_samples/sample_publish_events_to_a_topic_using_sas_credential_async.py @@ -23,21 +23,23 @@ sas = 
os.environ["EVENTGRID_SAS"] endpoint = os.environ["EVENTGRID_TOPIC_ENDPOINT"] + async def publish(): credential = AzureSasCredential(sas) client = EventGridPublisherClient(endpoint, credential) async with client: - await client.send([ - EventGridEvent( - event_type="Contoso.Items.ItemReceived", - data={ - "itemSku": "Contoso Item SKU #1" - }, - subject="Door1", - data_version="2.0" - ) - ]) + await client.send( + [ + EventGridEvent( + event_type="Contoso.Items.ItemReceived", + data={"itemSku": "Contoso Item SKU #1"}, + subject="Door1", + data_version="2.0", + ) + ] + ) + -if __name__ == '__main__': +if __name__ == "__main__": asyncio.run(publish()) diff --git a/sdk/eventgrid/azure-eventgrid/samples/async_samples/sample_publish_events_using_cloud_events_1.0_schema_async.py b/sdk/eventgrid/azure-eventgrid/samples/async_samples/sample_publish_events_using_cloud_events_1.0_schema_async.py index 8e238dfee5a1..1d3ea0337e33 100644 --- a/sdk/eventgrid/azure-eventgrid/samples/async_samples/sample_publish_events_using_cloud_events_1.0_schema_async.py +++ b/sdk/eventgrid/azure-eventgrid/samples/async_samples/sample_publish_events_using_cloud_events_1.0_schema_async.py @@ -24,21 +24,24 @@ topic_key = os.environ["EVENTGRID_CLOUD_EVENT_TOPIC_KEY"] endpoint = os.environ["EVENTGRID_CLOUD_EVENT_TOPIC_ENDPOINT"] + async def publish(): credential = AzureKeyCredential(topic_key) client = EventGridPublisherClient(endpoint, credential) - await client.send([ - CloudEvent( - type="Contoso.Items.ItemReceived", - source="/contoso/items", - data={ - "itemSku": "Contoso Item SKU #1" - }, - subject="Door1" - ) - ]) + await client.send( + [ + CloudEvent( + type="Contoso.Items.ItemReceived", + source="/contoso/items", + data={"itemSku": "Contoso Item SKU #1"}, + subject="Door1", + ) + ] + ) + + # [END publish_cloud_event_to_topic_async] -if __name__ == '__main__': +if __name__ == "__main__": asyncio.run(publish()) diff --git 
a/sdk/eventgrid/azure-eventgrid/samples/async_samples/sample_publish_to_channel_async.py b/sdk/eventgrid/azure-eventgrid/samples/async_samples/sample_publish_to_channel_async.py index 15a6526f0088..f12070c35a1e 100644 --- a/sdk/eventgrid/azure-eventgrid/samples/async_samples/sample_publish_to_channel_async.py +++ b/sdk/eventgrid/azure-eventgrid/samples/async_samples/sample_publish_to_channel_async.py @@ -22,26 +22,28 @@ from azure.core.credentials import AzureKeyCredential from azure.core.messaging import CloudEvent -topic_key = os.environ['EVENTGRID_PARTNER_NAMESPACE_TOPIC_KEY'] -endpoint = os.environ['EVENTGRID_PARTNER_NAMESPACE_TOPIC_ENDPOINT'] +topic_key = os.environ["EVENTGRID_PARTNER_NAMESPACE_TOPIC_KEY"] +endpoint = os.environ["EVENTGRID_PARTNER_NAMESPACE_TOPIC_ENDPOINT"] + +channel_name = os.environ["EVENTGRID_PARTNER_CHANNEL_NAME"] -channel_name = os.environ['EVENTGRID_PARTNER_CHANNEL_NAME'] async def publish(): credential = AzureKeyCredential(topic_key) client = EventGridPublisherClient(endpoint, credential) async with client: - await client.send([ - CloudEvent( - type="Contoso.Items.ItemReceived", - source="/contoso/items", - data={ - "itemSku": "Contoso Item SKU #1" - }, - subject="Door1" - ) - ], - channel_name=channel_name) + await client.send( + [ + CloudEvent( + type="Contoso.Items.ItemReceived", + source="/contoso/items", + data={"itemSku": "Contoso Item SKU #1"}, + subject="Door1", + ) + ], + channel_name=channel_name, + ) + -if __name__ == '__main__': +if __name__ == "__main__": asyncio.run(publish()) diff --git a/sdk/eventgrid/azure-eventgrid/samples/consume_samples/consume_cloud_events_from_eventhub.py b/sdk/eventgrid/azure-eventgrid/samples/consume_samples/consume_cloud_events_from_eventhub.py index 613794f04f20..e5933ec205ef 100644 --- a/sdk/eventgrid/azure-eventgrid/samples/consume_samples/consume_cloud_events_from_eventhub.py +++ b/sdk/eventgrid/azure-eventgrid/samples/consume_samples/consume_cloud_events_from_eventhub.py @@ -24,19 +24,20 
@@ CONNECTION_STR = os.environ["EVENT_HUB_CONN_STR"] EVENTHUB_NAME = os.environ["EVENT_HUB_NAME"] + + def on_event(partition_context, event): dict_event: CloudEvent = CloudEvent.from_json(event) print("data: {}\n".format(dict_event.data)) + consumer_client = EventHubConsumerClient.from_connection_string( conn_str=CONNECTION_STR, - consumer_group='$Default', + consumer_group="$Default", eventhub_name=EVENTHUB_NAME, ) with consumer_client: event_list = consumer_client.receive( - on_event=on_event, - starting_position="-1", # "-1" is from the beginning of the partition. - prefetch=5 + on_event=on_event, starting_position="-1", prefetch=5 # "-1" is from the beginning of the partition. ) diff --git a/sdk/eventgrid/azure-eventgrid/samples/consume_samples/consume_cloud_events_from_storage_queue.py b/sdk/eventgrid/azure-eventgrid/samples/consume_samples/consume_cloud_events_from_storage_queue.py index f8975d603a63..86fbfaf2b77e 100644 --- a/sdk/eventgrid/azure-eventgrid/samples/consume_samples/consume_cloud_events_from_storage_queue.py +++ b/sdk/eventgrid/azure-eventgrid/samples/consume_samples/consume_cloud_events_from_storage_queue.py @@ -20,17 +20,16 @@ import json # all types of CloudEvents below produce same DeserializedEvent -connection_str = os.environ['AZURE_STORAGE_CONNECTION_STRING'] -queue_name = os.environ['STORAGE_QUEUE_NAME'] +connection_str = os.environ["AZURE_STORAGE_CONNECTION_STRING"] +queue_name = os.environ["STORAGE_QUEUE_NAME"] with QueueServiceClient.from_connection_string(connection_str) as qsc: - payload = qsc.get_queue_client( - queue=queue_name, - message_decode_policy=BinaryBase64DecodePolicy() - ).peek_messages(max_messages=32) + payload = qsc.get_queue_client(queue=queue_name, message_decode_policy=BinaryBase64DecodePolicy()).peek_messages( + max_messages=32 + ) ## deserialize payload into a list of typed Events events: List[CloudEvent] = [CloudEvent.from_json(msg) for msg in payload] for event in events: - print(type(event)) ## CloudEvent + 
print(type(event)) ## CloudEvent diff --git a/sdk/eventgrid/azure-eventgrid/samples/consume_samples/consume_eventgrid_events_from_service_bus_queue.py b/sdk/eventgrid/azure-eventgrid/samples/consume_samples/consume_eventgrid_events_from_service_bus_queue.py index 9889ea87c112..25c6a3b49d05 100644 --- a/sdk/eventgrid/azure-eventgrid/samples/consume_samples/consume_eventgrid_events_from_service_bus_queue.py +++ b/sdk/eventgrid/azure-eventgrid/samples/consume_samples/consume_eventgrid_events_from_service_bus_queue.py @@ -23,14 +23,14 @@ import json # all types of EventGridEvents below produce same DeserializedEvent -connection_str = os.environ['SERVICE_BUS_CONNECTION_STR'] -queue_name = os.environ['SERVICE_BUS_QUEUE_NAME'] +connection_str = os.environ["SERVICE_BUS_CONNECTION_STR"] +queue_name = os.environ["SERVICE_BUS_QUEUE_NAME"] with ServiceBusClient.from_connection_string(connection_str) as sb_client: - payload = sb_client.get_queue_receiver(queue_name).receive_messages() + payload = sb_client.get_queue_receiver(queue_name).receive_messages() ## deserialize payload into a list of typed Events events = [EventGridEvent.from_json(msg) for msg in payload] for event in events: - print(type(event)) ## EventGridEvent + print(type(event)) ## EventGridEvent diff --git a/sdk/eventgrid/azure-eventgrid/samples/consume_samples/functionsapp/EventGridTrigger1/__init__.py b/sdk/eventgrid/azure-eventgrid/samples/consume_samples/functionsapp/EventGridTrigger1/__init__.py index b7a38ab6b5b3..56c82506fc64 100644 --- a/sdk/eventgrid/azure-eventgrid/samples/consume_samples/functionsapp/EventGridTrigger1/__init__.py +++ b/sdk/eventgrid/azure-eventgrid/samples/consume_samples/functionsapp/EventGridTrigger1/__init__.py @@ -5,16 +5,19 @@ import azure.functions as func from azure.eventgrid import EventGridEvent + def main(event: func.EventGridEvent): logging.info(sys.version) logging.info(event) - result = json.dumps({ - 'id': event.id, - 'data': event.get_json(), - 'topic': event.topic, - 
'subject': event.subject, - 'event_type': event.event_type - }) + result = json.dumps( + { + "id": event.id, + "data": event.get_json(), + "topic": event.topic, + "subject": event.subject, + "event_type": event.event_type, + } + ) logging.info(result) deserialized_event = EventGridEvent.from_dict(json.loads(result)) ## can only be EventGridEvent diff --git a/sdk/eventgrid/azure-eventgrid/samples/publish_samples/publish_cloud_events_to_custom_topic_sample.py b/sdk/eventgrid/azure-eventgrid/samples/publish_samples/publish_cloud_events_to_custom_topic_sample.py index 45eccfc14c91..fba40a378241 100644 --- a/sdk/eventgrid/azure-eventgrid/samples/publish_samples/publish_cloud_events_to_custom_topic_sample.py +++ b/sdk/eventgrid/azure-eventgrid/samples/publish_samples/publish_cloud_events_to_custom_topic_sample.py @@ -30,21 +30,20 @@ credential = AzureKeyCredential(key) client = EventGridPublisherClient(endpoint, credential) -services = ["EventGrid", "ServiceBus", "EventHubs", "Storage"] # possible values for data field +services = ["EventGrid", "ServiceBus", "EventHubs", "Storage"] # possible values for data field + def publish_event(): # publish events for _ in range(3): - event_list = [] # list of events to publish + event_list = [] # list of events to publish # create events and append to list for j in range(randint(1, 1)): - sample_members = sample(services, k=randint(1, 4)) # select random subset of team members + sample_members = sample(services, k=randint(1, 4)) # select random subset of team members data_dict = {"team": sample_members} event = CloudEvent( - type="Azure.Sdk.Sample", - source="https://egsample.dev/sampleevent", - data={"team": sample_members} - ) + type="Azure.Sdk.Sample", source="https://egsample.dev/sampleevent", data={"team": sample_members} + ) event_list.append(event) # publish list of events @@ -52,5 +51,6 @@ def publish_event(): print("Batch of size {} published".format(len(event_list))) time.sleep(randint(1, 5)) + if __name__ == "__main__": 
publish_event() diff --git a/sdk/eventgrid/azure-eventgrid/samples/publish_samples/publish_cloud_events_to_domain_topic_sample.py b/sdk/eventgrid/azure-eventgrid/samples/publish_samples/publish_cloud_events_to_domain_topic_sample.py index 4deab71ad6b5..ba9182a40beb 100644 --- a/sdk/eventgrid/azure-eventgrid/samples/publish_samples/publish_cloud_events_to_domain_topic_sample.py +++ b/sdk/eventgrid/azure-eventgrid/samples/publish_samples/publish_cloud_events_to_domain_topic_sample.py @@ -32,21 +32,18 @@ credential = AzureKeyCredential(domain_key) client = EventGridPublisherClient(domain_endpoint, credential) + def publish_event(): # publish events for _ in range(3): - event_list = [] # list of events to publish - services = ["EventGrid", "ServiceBus", "EventHubs", "Storage"] # possible values for data field + event_list = [] # list of events to publish + services = ["EventGrid", "ServiceBus", "EventHubs", "Storage"] # possible values for data field # create events and append to list for j in range(randint(1, 3)): - sample_members = sample(services, k=randint(1, 4)) # select random subset of team members - event = CloudEvent( - type="Azure.Sdk.Demo", - source='domainname', - data={"team": sample_members} - ) + sample_members = sample(services, k=randint(1, 4)) # select random subset of team members + event = CloudEvent(type="Azure.Sdk.Demo", source="domainname", data={"team": sample_members}) event_list.append(event) # publish list of events @@ -54,5 +51,6 @@ def publish_event(): print("Batch of size {} published".format(len(event_list))) time.sleep(randint(1, 5)) -if __name__ == '__main__': + +if __name__ == "__main__": publish_event() diff --git a/sdk/eventgrid/azure-eventgrid/samples/publish_samples/publish_custom_schema_events_to_topic_sample.py b/sdk/eventgrid/azure-eventgrid/samples/publish_samples/publish_custom_schema_events_to_topic_sample.py index c24b7c3fddb4..0ab11a428c5a 100644 --- 
a/sdk/eventgrid/azure-eventgrid/samples/publish_samples/publish_custom_schema_events_to_topic_sample.py +++ b/sdk/eventgrid/azure-eventgrid/samples/publish_samples/publish_custom_schema_events_to_topic_sample.py @@ -27,6 +27,7 @@ key = os.environ["EVENTGRID_CUSTOM_EVENT_TOPIC_KEY"] endpoint = os.environ["EVENTGRID_CUSTOM_EVENT_TOPIC_ENDPOINT"] + def publish_event(): # authenticate client credential = AzureKeyCredential(key) @@ -38,13 +39,13 @@ def publish_event(): "customDataVersion": "2.0", "customId": uuid.uuid4(), "customEventTime": dt.datetime.now(UTC()).isoformat(), - "customData": "sample data" + "customData": "sample data", } # publish events - for _ in range(3): + for _ in range(3): - event_list = [] # list of events to publish + event_list = [] # list of events to publish # create events and append to list for j in range(randint(1, 3)): event_list.append(custom_schema_event) @@ -55,5 +56,5 @@ def publish_event(): time.sleep(randint(1, 5)) -if __name__ == '__main__': +if __name__ == "__main__": publish_event() diff --git a/sdk/eventgrid/azure-eventgrid/samples/publish_samples/publish_event_grid_events_to_custom_topic_sample.py b/sdk/eventgrid/azure-eventgrid/samples/publish_samples/publish_event_grid_events_to_custom_topic_sample.py index 209d3ca48582..16247a4763bf 100644 --- a/sdk/eventgrid/azure-eventgrid/samples/publish_samples/publish_event_grid_events_to_custom_topic_sample.py +++ b/sdk/eventgrid/azure-eventgrid/samples/publish_samples/publish_event_grid_events_to_custom_topic_sample.py @@ -28,22 +28,20 @@ # authenticate client credential = AzureKeyCredential(key) client = EventGridPublisherClient(endpoint, credential) -services = ["EventGrid", "ServiceBus", "EventHubs", "Storage"] # possible values for data field +services = ["EventGrid", "ServiceBus", "EventHubs", "Storage"] # possible values for data field + def publish_event(): # publish events for _ in range(3): - event_list = [] # list of events to publish + event_list = [] # list of events to 
publish # create events and append to list for j in range(randint(1, 3)): - sample_members = sample(services, k=randint(1, 4)) # select random subset of team members + sample_members = sample(services, k=randint(1, 4)) # select random subset of team members event = EventGridEvent( - subject="Door1", - data={"team": sample_members}, - event_type="Azure.Sdk.Demo", - data_version="2.0" - ) + subject="Door1", data={"team": sample_members}, event_type="Azure.Sdk.Demo", data_version="2.0" + ) event_list.append(event) # publish list of events @@ -51,5 +49,6 @@ def publish_event(): print("Batch of size {} published".format(len(event_list))) time.sleep(randint(1, 5)) -if __name__ == '__main__': + +if __name__ == "__main__": publish_event() diff --git a/sdk/eventgrid/azure-eventgrid/samples/publish_samples/publish_with_shared_access_signature_sample.py b/sdk/eventgrid/azure-eventgrid/samples/publish_samples/publish_with_shared_access_signature_sample.py index 9e0c02c15914..fc57050087f3 100644 --- a/sdk/eventgrid/azure-eventgrid/samples/publish_samples/publish_with_shared_access_signature_sample.py +++ b/sdk/eventgrid/azure-eventgrid/samples/publish_samples/publish_with_shared_access_signature_sample.py @@ -34,21 +34,20 @@ credential = AzureSasCredential(signature) client = EventGridPublisherClient(endpoint, credential) -services = ["EventGrid", "ServiceBus", "EventHubs", "Storage"] # possible values for data field +services = ["EventGrid", "ServiceBus", "EventHubs", "Storage"] # possible values for data field + def publish_event(): # publish events for _ in range(3): - event_list = [] # list of events to publish + event_list = [] # list of events to publish # create events and append to list for j in range(randint(1, 3)): - sample_members = sample(services, k=randint(1, 4)) # select random subset of team members + sample_members = sample(services, k=randint(1, 4)) # select random subset of team members event = CloudEvent( - type="Azure.Sdk.Demo", - 
source="https://egdemo.dev/demowithsignature", - data={"team": sample_members} - ) + type="Azure.Sdk.Demo", source="https://egdemo.dev/demowithsignature", data={"team": sample_members} + ) event_list.append(event) # publish list of events @@ -56,5 +55,6 @@ def publish_event(): print("Batch of size {} published".format(len(event_list))) time.sleep(randint(1, 5)) -if __name__ == '__main__': + +if __name__ == "__main__": publish_event() diff --git a/sdk/eventgrid/azure-eventgrid/samples/sync_samples/sample_authentication.py b/sdk/eventgrid/azure-eventgrid/samples/sync_samples/sample_authentication.py index ea3e5b30522c..eb54049d0925 100644 --- a/sdk/eventgrid/azure-eventgrid/samples/sync_samples/sample_authentication.py +++ b/sdk/eventgrid/azure-eventgrid/samples/sync_samples/sample_authentication.py @@ -46,4 +46,4 @@ default_az_credential = DefaultAzureCredential() endpoint = os.environ["EVENTGRID_TOPIC_ENDPOINT"] client = EventGridPublisherClient(endpoint, default_az_credential) -# [END client_auth_with_token_cred] \ No newline at end of file +# [END client_auth_with_token_cred] diff --git a/sdk/eventgrid/azure-eventgrid/samples/sync_samples/sample_consume_process_events.py b/sdk/eventgrid/azure-eventgrid/samples/sync_samples/sample_consume_process_events.py new file mode 100644 index 000000000000..c10f8196fcea --- /dev/null +++ b/sdk/eventgrid/azure-eventgrid/samples/sync_samples/sample_consume_process_events.py @@ -0,0 +1,129 @@ +# -------------------------------------------------------------------------- +# Copyright (c) Microsoft Corporation. All rights reserved. +# Licensed under the MIT License. See License.txt in the project root for +# license information. +# -------------------------------------------------------------------------- +""" +FILE: sample_consume_process_events.py +DESCRIPTION: + These samples demonstrate sending, receiving, releasing, and acknowledging CloudEvents. 
+USAGE: + python sample_consume_process_events.py + Set the environment variables with your own values before running the sample: + 1) EVENTGRID_KEY - The access key of your eventgrid account. + 2) EVENTGRID_ENDPOINT - The namespace endpoint. Typically it exists in the format + "https://<YOUR-NAMESPACE-NAME>.<REGION-NAME>-1.eventgrid.azure.net". + 3) EVENTGRID_TOPIC_NAME - The namespace topic name. + 4) EVENTGRID_EVENT_SUBSCRIPTION_NAME - The event subscription name. +""" +import os +from azure.core.credentials import AzureKeyCredential +from azure.eventgrid.models import * +from azure.core.messaging import CloudEvent +from azure.core.exceptions import HttpResponseError +from azure.eventgrid import EventGridConsumerClient, EventGridPublisherClient + +EVENTGRID_KEY: str = os.environ["EVENTGRID_KEY"] +EVENTGRID_ENDPOINT: str = os.environ["EVENTGRID_ENDPOINT"] +TOPIC_NAME: str = os.environ["EVENTGRID_TOPIC_NAME"] +EVENT_SUBSCRIPTION_NAME: str = os.environ["EVENTGRID_EVENT_SUBSCRIPTION_NAME"] + + +# Create a client +publisher = EventGridPublisherClient(EVENTGRID_ENDPOINT, AzureKeyCredential(EVENTGRID_KEY), namespace_topic=TOPIC_NAME) +client = EventGridConsumerClient( + EVENTGRID_ENDPOINT, + AzureKeyCredential(EVENTGRID_KEY), + namespace_topic=TOPIC_NAME, + subscription=EVENT_SUBSCRIPTION_NAME, +) + + +cloud_event_reject = CloudEvent(data="reject", source="https://example.com", type="example") +cloud_event_release = CloudEvent(data="release", source="https://example.com", type="example") +cloud_event_ack = CloudEvent(data="acknowledge", source="https://example.com", type="example") +cloud_event_renew = CloudEvent(data="renew", source="https://example.com", type="example") + +# Send Cloud Events +publisher.send( + [ + cloud_event_reject, + cloud_event_release, + cloud_event_ack, + cloud_event_renew, + ] +) + +# Receive Published Cloud Events +try: + receive_results = client.receive( + max_events=10, + max_wait_time=10, + ) +except HttpResponseError: + raise + +# Iterate through the results and collect the
lock tokens for events we want to release/acknowledge/reject/renew: + +release_events = [] +acknowledge_events = [] +reject_events = [] +renew_events = [] + +for detail in receive_results: + data = detail.event.data + broker_properties = detail.broker_properties + if data == "release": + release_events.append(broker_properties.lock_token) + elif data == "acknowledge": + acknowledge_events.append(broker_properties.lock_token) + elif data == "renew": + renew_events.append(broker_properties.lock_token) + else: + reject_events.append(broker_properties.lock_token) + +# Release/Acknowledge/Reject/Renew events + +if len(release_events) > 0: + try: + release_result = client.release( + lock_tokens=release_events, + ) + except HttpResponseError: + raise + + for succeeded_lock_token in release_result.succeeded_lock_tokens: + print(f"Succeeded Lock Token:{succeeded_lock_token}") + +if len(acknowledge_events) > 0: + try: + ack_result = client.acknowledge( + lock_tokens=acknowledge_events, + ) + except HttpResponseError: + raise + + for succeeded_lock_token in ack_result.succeeded_lock_tokens: + print(f"Succeeded Lock Token:{succeeded_lock_token}") + +if len(reject_events) > 0: + try: + reject_result = client.reject( + lock_tokens=reject_events, + ) + except HttpResponseError: + raise + + for succeeded_lock_token in reject_result.succeeded_lock_tokens: + print(f"Succeeded Lock Token:{succeeded_lock_token}") + +if len(renew_events) > 0: + try: + renew_result = client.renew_locks( + lock_tokens=renew_events, + ) + except HttpResponseError: + raise + + for succeeded_lock_token in renew_result.succeeded_lock_tokens: + print(f"Succeeded Lock Token:{succeeded_lock_token}") diff --git a/sdk/eventgrid/azure-eventgrid/samples/sync_samples/sample_generate_sas.py b/sdk/eventgrid/azure-eventgrid/samples/sync_samples/sample_generate_sas.py index 4c8eb1141a91..deb0d72d6649 100644 --- a/sdk/eventgrid/azure-eventgrid/samples/sync_samples/sample_generate_sas.py +++ 
b/sdk/eventgrid/azure-eventgrid/samples/sync_samples/sample_generate_sas.py @@ -22,11 +22,11 @@ topic_key = os.environ["EVENTGRID_TOPIC_KEY"] endpoint = os.environ["EVENTGRID_TOPIC_ENDPOINT"] -#represents the expiration date for sas +# represents the expiration date for sas expiration_date_utc = datetime.utcnow() + timedelta(hours=10) signature = generate_sas(endpoint, topic_key, expiration_date_utc) # [END generate_sas] -print(signature) \ No newline at end of file +print(signature) diff --git a/sdk/eventgrid/azure-eventgrid/samples/sync_samples/sample_publish_cloud_event_using_dict.py b/sdk/eventgrid/azure-eventgrid/samples/sync_samples/sample_publish_cloud_event_using_dict.py index 25be49359a50..50c313c51fc3 100644 --- a/sdk/eventgrid/azure-eventgrid/samples/sync_samples/sample_publish_cloud_event_using_dict.py +++ b/sdk/eventgrid/azure-eventgrid/samples/sync_samples/sample_publish_cloud_event_using_dict.py @@ -16,9 +16,11 @@ "https://<YOUR-TOPIC-NAME>.<REGION-NAME>-1.eventgrid.azure.net/api/events". """ import os -from azure.eventgrid import EventGridPublisherClient +from azure.eventgrid import EventGridPublisherClient, EventGridConsumerClient from azure.core.credentials import AzureKeyCredential + +# To Event Grid Basic topic_key = os.environ["EVENTGRID_CLOUD_EVENT_TOPIC_KEY"] endpoint = os.environ["EVENTGRID_CLOUD_EVENT_TOPIC_ENDPOINT"] @@ -26,16 +28,38 @@ credential = AzureKeyCredential(topic_key) client = EventGridPublisherClient(endpoint, credential) # [START publish_cloud_event_dict] -client.send([ - { - "type": "Contoso.Items.ItemReceived", - "source": "/contoso/items", - "data": { - "itemSku": "Contoso Item SKU #1" - }, - "subject": "Door1", - "specversion": "1.0", - "id": "randomclouduuid11" - } -]) +client.send( + [ + { + "type": "Contoso.Items.ItemReceived", + "source": "/contoso/items", + "data": {"itemSku": "Contoso Item SKU #1"}, + "subject": "Door1", + "specversion": "1.0", + "id": "randomclouduuid11", + } + ] +) # [END publish_cloud_event_dict] + +# To Event Grid Namespaces +topic_endpoint =
os.environ["EVENTGRID_ENDPOINT"] +topic_key = os.environ["EVENTGRID_KEY"] +topic_name = os.environ["EVENTGRID_TOPIC_NAME"] +sub = os.environ["EVENTGRID_EVENT_SUBSCRIPTION_NAME"] + +credential = AzureKeyCredential(topic_key) +client = EventGridPublisherClient(topic_endpoint, credential, namespace_topic=topic_name) + +client.send( + [ + { + "type": "Contoso.Items.ItemReceived", + "source": "/contoso/items", + "data": {"itemSku": "Contoso Item SKU #1"}, + "subject": "Door1", + "specversion": "1.0", + "id": "randomclouduuid11", + } + ] +) diff --git a/sdk/eventgrid/azure-eventgrid/samples/sync_samples/sample_publish_cncf_cloud_events.py b/sdk/eventgrid/azure-eventgrid/samples/sync_samples/sample_publish_cncf_cloud_events.py index a54a7442a15a..ffb49eb077c2 100644 --- a/sdk/eventgrid/azure-eventgrid/samples/sync_samples/sample_publish_cncf_cloud_events.py +++ b/sdk/eventgrid/azure-eventgrid/samples/sync_samples/sample_publish_cncf_cloud_events.py @@ -19,19 +19,35 @@ from azure.core.credentials import AzureKeyCredential from cloudevents.http import CloudEvent +# To EventGrid Basic topic_key = os.environ["EVENTGRID_CLOUD_EVENT_TOPIC_KEY"] endpoint = os.environ["EVENTGRID_CLOUD_EVENT_TOPIC_ENDPOINT"] credential = AzureKeyCredential(topic_key) client = EventGridPublisherClient(endpoint, credential) -client.send([ - CloudEvent( - attributes={ - "type": "cloudevent", - "source": "/cncf/cloud/event/1.0", - "subject": "testing-cncf-event" - }, - data=b'This is a cncf cloud event.', - ) -]) +client.send( + [ + CloudEvent( + attributes={"type": "cloudevent", "source": "/cncf/cloud/event/1.0", "subject": "testing-cncf-event"}, + data=b"This is a cncf cloud event.", + ) + ] +) + +# To Event Grid Namespaces +topic_endpoint = os.environ["EVENTGRID_ENDPOINT"] +topic_key = os.environ["EVENTGRID_KEY"] +topic_name = os.environ["EVENTGRID_TOPIC_NAME"] + +credential = AzureKeyCredential(topic_key) +client = EventGridPublisherClient(topic_endpoint, credential, namespace_topic=topic_name) + 
+client.send( + [ + CloudEvent( + attributes={"type": "cloudevent", "source": "/cncf/cloud/event/1.0", "subject": "testing-cncf-event"}, + data=b"This is a cncf cloud event.", + ) + ] +) diff --git a/sdk/eventgrid/azure-eventgrid/samples/sync_samples/sample_publish_custom_schema_to_a_topic.py b/sdk/eventgrid/azure-eventgrid/samples/sync_samples/sample_publish_custom_schema_to_a_topic.py index abfaba2c0083..4650aae15d61 100644 --- a/sdk/eventgrid/azure-eventgrid/samples/sync_samples/sample_publish_custom_schema_to_a_topic.py +++ b/sdk/eventgrid/azure-eventgrid/samples/sync_samples/sample_publish_custom_schema_to_a_topic.py @@ -27,6 +27,7 @@ key = os.environ["EVENTGRID_CUSTOM_EVENT_TOPIC_KEY"] endpoint = os.environ["EVENTGRID_CUSTOM_EVENT_TOPIC_ENDPOINT"] + def publish_event(): # authenticate client credential = AzureKeyCredential(key) @@ -39,11 +40,12 @@ def publish_event(): "customDataVersion": "2.0", "customId": uuid.uuid4(), "customEventTime": dt.datetime.now(UTC()).isoformat(), - "customData": "sample data" + "customData": "sample data", } client.send(custom_schema_event) # [END publish_custom_schema] -if __name__ == '__main__': + +if __name__ == "__main__": publish_event() diff --git a/sdk/eventgrid/azure-eventgrid/samples/sync_samples/sample_publish_eg_event_using_dict.py b/sdk/eventgrid/azure-eventgrid/samples/sync_samples/sample_publish_eg_event_using_dict.py index 97b84b922592..b7154a0e07b8 100644 --- a/sdk/eventgrid/azure-eventgrid/samples/sync_samples/sample_publish_eg_event_using_dict.py +++ b/sdk/eventgrid/azure-eventgrid/samples/sync_samples/sample_publish_eg_event_using_dict.py @@ -25,33 +25,31 @@ topic_key = os.environ["EVENTGRID_TOPIC_KEY"] endpoint = os.environ["EVENTGRID_TOPIC_ENDPOINT"] + def publish(): # [START publish_eg_event_dict] credential = AzureKeyCredential(topic_key) client = EventGridPublisherClient(endpoint, credential) - event0 = { - "eventType": "Contoso.Items.ItemReceived", - "data": { - "itemSku": "Contoso Item SKU #1" - }, - 
"subject": "Door1", - "dataVersion": "2.0", - "id": "randomuuid11", - "eventTime": datetime.now(UTC()) - } - event1 = { - "eventType": "Contoso.Items.ItemReceived", - "data": { - "itemSku": "Contoso Item SKU #2" - }, - "subject": "Door1", - "dataVersion": "2.0", - "id": "randomuuid12", - "eventTime": datetime.now(UTC()) - } + event0 = { + "eventType": "Contoso.Items.ItemReceived", + "data": {"itemSku": "Contoso Item SKU #1"}, + "subject": "Door1", + "dataVersion": "2.0", + "id": "randomuuid11", + "eventTime": datetime.now(UTC()), + } + event1 = { + "eventType": "Contoso.Items.ItemReceived", + "data": {"itemSku": "Contoso Item SKU #2"}, + "subject": "Door1", + "dataVersion": "2.0", + "id": "randomuuid12", + "eventTime": datetime.now(UTC()), + } client.send([event0, event1]) # [END publish_eg_event_dict] -if __name__ == '__main__': + +if __name__ == "__main__": publish() diff --git a/sdk/eventgrid/azure-eventgrid/samples/sync_samples/sample_publish_eg_events_to_a_domain.py b/sdk/eventgrid/azure-eventgrid/samples/sync_samples/sample_publish_eg_events_to_a_domain.py index e66d26c7a36a..960f91aeae70 100644 --- a/sdk/eventgrid/azure-eventgrid/samples/sync_samples/sample_publish_eg_events_to_a_domain.py +++ b/sdk/eventgrid/azure-eventgrid/samples/sync_samples/sample_publish_eg_events_to_a_domain.py @@ -24,23 +24,21 @@ credential = AzureKeyCredential(domain_key) client = EventGridPublisherClient(domain_hostname, credential) -client.send([ - EventGridEvent( - topic="MyCustomDomainTopic1", - event_type="Contoso.Items.ItemReceived", - data={ - "itemSku": "Contoso Item SKU #1" - }, - subject="Door1", - data_version="2.0" - ), - EventGridEvent( - topic="MyCustomDomainTopic2", - event_type="Contoso.Items.ItemReceived", - data={ - "itemSku": "Contoso Item SKU #2" - }, - subject="Door1", - data_version="2.0" - ) -]) +client.send( + [ + EventGridEvent( + topic="MyCustomDomainTopic1", + event_type="Contoso.Items.ItemReceived", + data={"itemSku": "Contoso Item SKU #1"}, + 
subject="Door1", + data_version="2.0", + ), + EventGridEvent( + topic="MyCustomDomainTopic2", + event_type="Contoso.Items.ItemReceived", + data={"itemSku": "Contoso Item SKU #2"}, + subject="Door1", + data_version="2.0", + ), + ] +) diff --git a/sdk/eventgrid/azure-eventgrid/samples/sync_samples/sample_publish_eg_events_to_a_topic.py b/sdk/eventgrid/azure-eventgrid/samples/sync_samples/sample_publish_eg_events_to_a_topic.py index 9138a8ab1899..a0e3ab58b6dc 100644 --- a/sdk/eventgrid/azure-eventgrid/samples/sync_samples/sample_publish_eg_events_to_a_topic.py +++ b/sdk/eventgrid/azure-eventgrid/samples/sync_samples/sample_publish_eg_events_to_a_topic.py @@ -25,14 +25,14 @@ credential = AzureKeyCredential(topic_key) client = EventGridPublisherClient(endpoint, credential) -client.send([ - EventGridEvent( - event_type="Contoso.Items.ItemReceived", - data={ - "itemSku": "Contoso Item SKU #1" - }, - subject="Door1", - data_version="2.0" - ) -]) +client.send( + [ + EventGridEvent( + event_type="Contoso.Items.ItemReceived", + data={"itemSku": "Contoso Item SKU #1"}, + subject="Door1", + data_version="2.0", + ) + ] +) # [END publish_eg_event_to_topic] diff --git a/sdk/eventgrid/azure-eventgrid/samples/sync_samples/sample_publish_events_to_a_topic_using_sas_credential.py b/sdk/eventgrid/azure-eventgrid/samples/sync_samples/sample_publish_events_to_a_topic_using_sas_credential.py index 78ad100c40f9..6051d883fbd4 100644 --- a/sdk/eventgrid/azure-eventgrid/samples/sync_samples/sample_publish_events_to_a_topic_using_sas_credential.py +++ b/sdk/eventgrid/azure-eventgrid/samples/sync_samples/sample_publish_events_to_a_topic_using_sas_credential.py @@ -25,13 +25,13 @@ credential = AzureSasCredential(sas) client = EventGridPublisherClient(endpoint, credential) -client.send([ - EventGridEvent( - event_type="Contoso.Items.ItemReceived", - data={ - "itemSku": "Contoso Item SKU #1" - }, - subject="Door1", - data_version="2.0" - ) -]) +client.send( + [ + EventGridEvent( + 
event_type="Contoso.Items.ItemReceived", + data={"itemSku": "Contoso Item SKU #1"}, + subject="Door1", + data_version="2.0", + ) + ] +) diff --git a/sdk/eventgrid/azure-eventgrid/samples/sync_samples/sample_publish_events_using_cloud_events_1.0_schema.py b/sdk/eventgrid/azure-eventgrid/samples/sync_samples/sample_publish_events_using_cloud_events_1.0_schema.py index 9544f3fcd209..a0297f9950a3 100644 --- a/sdk/eventgrid/azure-eventgrid/samples/sync_samples/sample_publish_events_using_cloud_events_1.0_schema.py +++ b/sdk/eventgrid/azure-eventgrid/samples/sync_samples/sample_publish_events_using_cloud_events_1.0_schema.py @@ -26,14 +26,14 @@ credential = AzureKeyCredential(topic_key) client = EventGridPublisherClient(endpoint, credential) -client.send([ - CloudEvent( - type="Contoso.Items.ItemReceived", - source="/contoso/items", - data={ - "itemSku": "Contoso Item SKU #1" - }, - subject="Door1" - ) -]) +client.send( + [ + CloudEvent( + type="Contoso.Items.ItemReceived", + source="/contoso/items", + data={"itemSku": "Contoso Item SKU #1"}, + subject="Door1", + ) + ] +) # [END publish_cloud_event_to_topic] diff --git a/sdk/eventgrid/azure-eventgrid/setup.py b/sdk/eventgrid/azure-eventgrid/setup.py index 3a1acfc1aea8..ee29cfadd35d 100644 --- a/sdk/eventgrid/azure-eventgrid/setup.py +++ b/sdk/eventgrid/azure-eventgrid/setup.py @@ -1,74 +1,70 @@ -#!/usr/bin/env python - -#------------------------------------------------------------------------- +# coding=utf-8 +# -------------------------------------------------------------------------- # Copyright (c) Microsoft Corporation. All rights reserved. -# Licensed under the MIT License. See License.txt in the project root for -# license information. -#-------------------------------------------------------------------------- +# Licensed under the MIT License. See License.txt in the project root for license information. +# Code generated by Microsoft (R) Python Code Generator. 
+# Changes may cause incorrect behavior and will be lost if the code is regenerated. +# -------------------------------------------------------------------------- +# coding: utf-8 +import os import re -import os.path -from io import open -from setuptools import find_packages, setup +from setuptools import setup, find_packages + -# Change the PACKAGE_NAME only to change folder and different name PACKAGE_NAME = "azure-eventgrid" -PACKAGE_PPRINT_NAME = "Event Grid" +PACKAGE_PPRINT_NAME = "Azure Event Grid" # a-b-c => a/b/c -package_folder_path = PACKAGE_NAME.replace('-', '/') -# a-b-c => a.b.c -namespace_name = PACKAGE_NAME.replace('-', '.') +package_folder_path = PACKAGE_NAME.replace("-", "/") # Version extraction inspired from 'requests' -with open(os.path.join(package_folder_path, 'version.py') - if os.path.exists(os.path.join(package_folder_path, 'version.py')) - else os.path.join(package_folder_path, '_version.py'), 'r') as fd: - version = re.search(r'^VERSION\s*=\s*[\'"]([^\'"]*)[\'"]', - fd.read(), re.MULTILINE).group(1) +with open(os.path.join(package_folder_path, "_version.py"), "r") as fd: + version = re.search(r'^VERSION\s*=\s*[\'"]([^\'"]*)[\'"]', fd.read(), re.MULTILINE).group(1) if not version: - raise RuntimeError('Cannot find version information') + raise RuntimeError("Cannot find version information") -with open('README.md', encoding='utf-8') as f: - readme = f.read() -with open('CHANGELOG.md', encoding='utf-8') as f: - changelog = f.read() setup( name=PACKAGE_NAME, version=version, - include_package_data=True, - description='Microsoft Azure {} Client Library for Python'.format(PACKAGE_PPRINT_NAME), - long_description=readme + '\n\n' + changelog, - long_description_content_type='text/markdown', - license='MIT License', - author='Microsoft Corporation', - author_email='azpysdkhelp@microsoft.com', - url='https://github.com/Azure/azure-sdk-for-python', + description="Microsoft {} Client Library for Python".format(PACKAGE_PPRINT_NAME), + 
long_description=open("README.md", "r").read(), + long_description_content_type="text/markdown", + license="MIT License", + author="Microsoft Corporation", + author_email="azpysdkhelp@microsoft.com", + url="https://github.com/Azure/azure-sdk-for-python/tree/main/sdk", keywords="azure, azure sdk", classifiers=[ - "Development Status :: 5 - Production/Stable", - 'Programming Language :: Python', - 'Programming Language :: Python :: 3 :: Only', - 'Programming Language :: Python :: 3', - 'Programming Language :: Python :: 3.8', - 'Programming Language :: Python :: 3.9', - 'Programming Language :: Python :: 3.10', - 'Programming Language :: Python :: 3.11', - 'Programming Language :: Python :: 3.12', - 'License :: OSI Approved :: MIT License', + "Development Status :: 4 - Beta", + "Programming Language :: Python", + "Programming Language :: Python :: 3 :: Only", + "Programming Language :: Python :: 3", + "Programming Language :: Python :: 3.8", + "Programming Language :: Python :: 3.9", + "Programming Language :: Python :: 3.10", + "Programming Language :: Python :: 3.11", + "Programming Language :: Python :: 3.12", + "License :: OSI Approved :: MIT License", ], - python_requires=">=3.8", zip_safe=False, - packages=find_packages(exclude=[ - 'tests', - 'samples', - # Exclude packages that will be covered by PEP420 or nspkg - 'azure', - ]), + packages=find_packages( + exclude=[ + "tests", + # Exclude packages that will be covered by PEP420 or nspkg + "azure", + ] + ), + include_package_data=True, + package_data={ + "azure.eventgrid": ["py.typed"], + }, install_requires=[ "isodate>=0.6.1", - 'azure-core>=1.24.0', + "azure-core>=1.30.0", + "typing-extensions>=4.6.0", ], + python_requires=">=3.8", ) diff --git a/sdk/eventgrid/azure-eventgrid/swagger/_constants.py b/sdk/eventgrid/azure-eventgrid/swagger/_constants.py index c6a89faa3626..0a2cfc4503c7 100644 --- a/sdk/eventgrid/azure-eventgrid/swagger/_constants.py +++ b/sdk/eventgrid/azure-eventgrid/swagger/_constants.py @@ 
-29,44 +29,56 @@ "https://raw.githubusercontent.com/Azure/azure-rest-api-specs/main/specification/eventgrid/data-plane/Microsoft.Web/stable/2018-01-01/Web.json", "https://raw.githubusercontent.com/Azure/azure-rest-api-specs/main/specification/eventgrid/data-plane/Microsoft.HealthcareApis/stable/2018-01-01/HealthcareApis.json", "https://raw.githubusercontent.com/Azure/azure-rest-api-specs/main/specification/eventgrid/data-plane/Microsoft.ApiCenter/stable/2018-01-01/ApiCenter.json", - ] +] ####################################################### ### Used for backward compatibility. Don't change this ####################################################### backward_compat = { - 'AcsChatMemberAddedToThreadWithUserEventName': "Microsoft.Communication.ChatMemberAddedToThreadWithUser", - 'ResourceWriteFailureEventName': "Microsoft.Resources.ResourceWriteFailure", - 'IoTHubDeviceDeletedEventName': "Microsoft.Devices.DeviceDeleted", - 'IoTHubDeviceDisconnectedEventName': "Microsoft.Devices.DeviceDisconnected", - 'ResourceDeleteFailureEventName': "Microsoft.Resources.ResourceDeleteFailure", - 'ResourceDeleteCancelEventName': "Microsoft.Resources.ResourceDeleteCancel", - 'AcsChatThreadParticipantAddedEventName': "Microsoft.Communication.ChatThreadParticipantAdded", - 'ResourceDeleteSuccessEventName': "Microsoft.Resources.ResourceDeleteSuccess", - 'EventGridSubscriptionValidationEventName': "Microsoft.EventGrid.SubscriptionValidationEvent", - 'ResourceWriteSuccessEventName': "Microsoft.Resources.ResourceWriteSuccess", - 'ResourceActionSuccessEventName': "Microsoft.Resources.ResourceActionSuccess", - 'ResourceWriteCancelEventName': "Microsoft.Resources.ResourceWriteCancel", - 'ResourceActionFailureEventName': "Microsoft.Resources.ResourceActionFailure", - 'AcsChatMemberRemovedFromThreadWithUserEventName': "Microsoft.Communication.ChatMemberRemovedFromThreadWithUser", - 'IoTHubDeviceConnectedEventName': "Microsoft.Devices.DeviceConnected", - 
'EventGridSubscriptionDeletedEventName': "Microsoft.EventGrid.SubscriptionDeletedEvent", - 'AcsChatThreadParticipantRemovedEventName': "Microsoft.Communication.ChatThreadParticipantRemoved", - 'ResourceActionCancelEventName': "Microsoft.Resources.ResourceActionCancel", - 'IoTHubDeviceCreatedEventName': "Microsoft.Devices.DeviceCreated", + "AcsChatMemberAddedToThreadWithUserEventName": "Microsoft.Communication.ChatMemberAddedToThreadWithUser", + "ResourceWriteFailureEventName": "Microsoft.Resources.ResourceWriteFailure", + "IoTHubDeviceDeletedEventName": "Microsoft.Devices.DeviceDeleted", + "IoTHubDeviceDisconnectedEventName": "Microsoft.Devices.DeviceDisconnected", + "ResourceDeleteFailureEventName": "Microsoft.Resources.ResourceDeleteFailure", + "ResourceDeleteCancelEventName": "Microsoft.Resources.ResourceDeleteCancel", + "AcsChatThreadParticipantAddedEventName": "Microsoft.Communication.ChatThreadParticipantAdded", + "ResourceDeleteSuccessEventName": "Microsoft.Resources.ResourceDeleteSuccess", + "EventGridSubscriptionValidationEventName": "Microsoft.EventGrid.SubscriptionValidationEvent", + "ResourceWriteSuccessEventName": "Microsoft.Resources.ResourceWriteSuccess", + "ResourceActionSuccessEventName": "Microsoft.Resources.ResourceActionSuccess", + "ResourceWriteCancelEventName": "Microsoft.Resources.ResourceWriteCancel", + "ResourceActionFailureEventName": "Microsoft.Resources.ResourceActionFailure", + "AcsChatMemberRemovedFromThreadWithUserEventName": "Microsoft.Communication.ChatMemberRemovedFromThreadWithUser", + "IoTHubDeviceConnectedEventName": "Microsoft.Devices.DeviceConnected", + "EventGridSubscriptionDeletedEventName": "Microsoft.EventGrid.SubscriptionDeletedEvent", + "AcsChatThreadParticipantRemovedEventName": "Microsoft.Communication.ChatThreadParticipantRemoved", + "ResourceActionCancelEventName": "Microsoft.Resources.ResourceActionCancel", + "IoTHubDeviceCreatedEventName": "Microsoft.Devices.DeviceCreated", } additional_events = { - 
'ContainerRegistryArtifactEventName': 'Microsoft.AppConfiguration.KeyValueModified', - 'KeyVaultAccessPolicyChangedEventName': 'Microsoft.KeyVault.VaultAccessPolicyChanged', - 'ContainerRegistryEventName': 'Microsoft.ContainerRegistry.ChartPushed', - 'ServiceBusDeadletterMessagesAvailableWithNoListenerEventName': 'Microsoft.ServiceBus.DeadletterMessagesAvailableWithNoListeners' + "ContainerRegistryArtifactEventName": "Microsoft.AppConfiguration.KeyValueModified", + "KeyVaultAccessPolicyChangedEventName": "Microsoft.KeyVault.VaultAccessPolicyChanged", + "ContainerRegistryEventName": "Microsoft.ContainerRegistry.ChartPushed", + "ServiceBusDeadletterMessagesAvailableWithNoListenerEventName": "Microsoft.ServiceBus.DeadletterMessagesAvailableWithNoListeners", } -EXCEPTIONS = ['ContainerRegistryArtifactEventData', 'ContainerRegistryEventData', 'ContainerServiceClusterSupportEventData', 'ContainerServiceNodePoolRollingEventData', - 'EventGridMQTTClientEventData', 'AppConfigurationSnapshotEventData', 'HealthResourcesResourceEventData', 'AcsRouterJobEventData', 'AcsRouterWorkerEventData', 'AcsRouterEventData', - 'AvsClusterEventData', 'AvsPrivateCloudEventData', 'AvsScriptExecutionEventData', "AcsMessageEventData" - ] +EXCEPTIONS = [ + "ContainerRegistryArtifactEventData", + "ContainerRegistryEventData", + "ContainerServiceClusterSupportEventData", + "ContainerServiceNodePoolRollingEventData", + "EventGridMQTTClientEventData", + "AppConfigurationSnapshotEventData", + "HealthResourcesResourceEventData", + "AcsRouterJobEventData", + "AcsRouterWorkerEventData", + "AcsRouterEventData", + "AvsClusterEventData", + "AvsPrivateCloudEventData", + "AvsScriptExecutionEventData", + "AcsMessageEventData", +] NAMING_CHANGES = ["AcsMessageDeliveryStatusUpdatedEventName", "AcsMessageReceivedEventName"] diff --git a/sdk/eventgrid/azure-eventgrid/swagger/postprocess_eventnames.py b/sdk/eventgrid/azure-eventgrid/swagger/postprocess_eventnames.py index 5e96e2c8cd62..842afcc14797 100644 --- 
a/sdk/eventgrid/azure-eventgrid/swagger/postprocess_eventnames.py +++ b/sdk/eventgrid/azure-eventgrid/swagger/postprocess_eventnames.py @@ -6,13 +6,14 @@ from azure.eventgrid._generated import models from _constants import files, backward_compat, additional_events, EXCEPTIONS, NAMING_CHANGES + def extract(definitions): if not definitions: return tups = [] for event in definitions: - if event.endswith('Data') and event not in EXCEPTIONS: - key, txt = "Name".join(event.rsplit('Data', 1)), definitions[event]['description'] + if event.endswith("Data") and event not in EXCEPTIONS: + key, txt = "Name".join(event.rsplit("Data", 1)), definitions[event]["description"] if key in NAMING_CHANGES: key = key.replace("Acs", "AcsAdvanced") try: @@ -29,6 +30,7 @@ def extract(definitions): sys.exit(1) return tups + def generate_enum_content(tuples): print("# These names at the top are 'corrected' aliases of duplicate values that appear below, which are") print("# deprecated but maintained for backwards compatibility.") @@ -40,10 +42,11 @@ def generate_enum_content(tuples): for k, v in additional_events.items(): print(k + " = '" + v + "'\n") + definitions = {} for fp in files: data = json.loads(urlopen(fp).read()) - definitions.update(data.get('definitions')) + definitions.update(data.get("definitions")) tup_list = extract(definitions) tup_list.sort(key=lambda tup: tup[0]) generate_enum_content(tup_list) diff --git a/sdk/eventgrid/azure-eventgrid/tests/_mocks.py b/sdk/eventgrid/azure-eventgrid/tests/_mocks.py index d06c77a30a89..eb4ebc6f19ce 100644 --- a/sdk/eventgrid/azure-eventgrid/tests/_mocks.py +++ b/sdk/eventgrid/azure-eventgrid/tests/_mocks.py @@ -3,35 +3,35 @@ # storage cloud event cloud_storage_dict = { - "id":"a0517898-9fa4-4e70-b4a3-afda1dd68672", - "source":"/subscriptions/{subscription-id}/resourceGroups/{resource-group}/providers/Microsoft.Storage/storageAccounts/{storage-account}", - "data":{ - "api":"PutBlockList", - 
"client_request_id":"6d79dbfb-0e37-4fc4-981f-442c9ca65760", - "request_id":"831e1650-001e-001b-66ab-eeb76e000000", - "e_tag":"0x8D4BCC2E4835CD0", - "content_type":"application/octet-stream", - "content_length":524288, - "blob_type":"BlockBlob", - "url":"https://oc2d2817345i60006.blob.core.windows.net/oc2d2817345i200097container/oc2d2817345i20002296blob", - "sequencer":"00000000000004420000000000028963", - "storage_diagnostics":{"batchId":"b68529f3-68cd-4744-baa4-3c0498ec19f0"} + "id": "a0517898-9fa4-4e70-b4a3-afda1dd68672", + "source": "/subscriptions/{subscription-id}/resourceGroups/{resource-group}/providers/Microsoft.Storage/storageAccounts/{storage-account}", + "data": { + "api": "PutBlockList", + "client_request_id": "6d79dbfb-0e37-4fc4-981f-442c9ca65760", + "request_id": "831e1650-001e-001b-66ab-eeb76e000000", + "e_tag": "0x8D4BCC2E4835CD0", + "content_type": "application/octet-stream", + "content_length": 524288, + "blob_type": "BlockBlob", + "url": "https://oc2d2817345i60006.blob.core.windows.net/oc2d2817345i200097container/oc2d2817345i20002296blob", + "sequencer": "00000000000004420000000000028963", + "storage_diagnostics": {"batchId": "b68529f3-68cd-4744-baa4-3c0498ec19f0"}, }, - "type":"Microsoft.Storage.BlobCreated", - "time":"2020-08-07T01:11:49.765846Z", - "specversion":"1.0" + "type": "Microsoft.Storage.BlobCreated", + "time": "2020-08-07T01:11:49.765846Z", + "specversion": "1.0", } cloud_storage_string = json.dumps(cloud_storage_dict) cloud_storage_bytes = cloud_storage_string.encode("utf-8") # custom cloud event cloud_custom_dict = { - "id":"de0fd76c-4ef4-4dfb-ab3a-8f24a307e033", - "source":"https://egtest.dev/cloudcustomevent", - "data":{"team": "event grid squad"}, - "type":"Azure.Sdk.Sample", - "time":"2020-08-07T02:06:08.11969Z", - "specversion":"1.0" + "id": "de0fd76c-4ef4-4dfb-ab3a-8f24a307e033", + "source": "https://egtest.dev/cloudcustomevent", + "data": {"team": "event grid squad"}, + "type": "Azure.Sdk.Sample", + "time": 
"2020-08-07T02:06:08.11969Z", + "specversion": "1.0", } cloud_custom_string = json.dumps(cloud_custom_dict) cloud_custom_bytes = cloud_custom_string.encode("utf-8") @@ -39,25 +39,25 @@ # storage eg event # spell-checker:ignore swpill eventgridegsub egtopicsamplesub eg_storage_dict = { - "id":"bbab6625-dc56-4b22-abeb-afcc72e5290c", - "subject":"/blobServices/default/containers/oc2d2817345i200097container/blobs/oc2d2817345i20002296blob", - "data":{ - "api":"PutBlockList", - "clientRequestId":"6d79dbfb-0e37-4fc4-981f-442c9ca65760", - "requestId":"831e1650-001e-001b-66ab-eeb76e000000", - "eTag":"0x8D4BCC2E4835CD0", - "contentType":"application/octet-stream", - "contentLength":524288, - "blobType":"BlockBlob", - "url":"https://oc2d2817345i60006.blob.core.windows.net/oc2d2817345i200097container/oc2d2817345i20002296blob", - "sequencer":"00000000000004420000000000028963", - "storageDiagnostics":{"batchId":"b68529f3-68cd-4744-baa4-3c0498ec19f0"} + "id": "bbab6625-dc56-4b22-abeb-afcc72e5290c", + "subject": "/blobServices/default/containers/oc2d2817345i200097container/blobs/oc2d2817345i20002296blob", + "data": { + "api": "PutBlockList", + "clientRequestId": "6d79dbfb-0e37-4fc4-981f-442c9ca65760", + "requestId": "831e1650-001e-001b-66ab-eeb76e000000", + "eTag": "0x8D4BCC2E4835CD0", + "contentType": "application/octet-stream", + "contentLength": 524288, + "blobType": "BlockBlob", + "url": "https://oc2d2817345i60006.blob.core.windows.net/oc2d2817345i200097container/oc2d2817345i20002296blob", + "sequencer": "00000000000004420000000000028963", + "storageDiagnostics": {"batchId": "b68529f3-68cd-4744-baa4-3c0498ec19f0"}, }, - "eventType":"Microsoft.Storage.BlobCreated", - "dataVersion":"2.0", - "metadataVersion":"1", - "eventTime":"2020-08-07T02:28:23.867525Z", - "topic":"/subscriptions/faa080af-c1d8-40ad-9cce-e1a450ca5b57/resourceGroups/t-swpill-test/providers/Microsoft.EventGrid/topics/eventgridegsub" + "eventType": "Microsoft.Storage.BlobCreated", + "dataVersion": "2.0", + 
"metadataVersion": "1", + "eventTime": "2020-08-07T02:28:23.867525Z", + "topic": "/subscriptions/faa080af-c1d8-40ad-9cce-e1a450ca5b57/resourceGroups/t-swpill-test/providers/Microsoft.EventGrid/topics/eventgridegsub", } eg_storage_string = json.dumps(eg_storage_dict) @@ -65,14 +65,14 @@ # custom eg event eg_custom_dict = { - "id":"3a30afef-b604-4b67-973e-7dfff7e178a7", - "subject":"Test EG Custom Event", - "data":{"team":"event grid squad"}, - "eventType":"Azure.Sdk.Sample", - "dataVersion":"2.0", - "metadataVersion":"1", - "eventTime":"2020-08-07T02:19:05.16916Z", - "topic":"/subscriptions/f8aa80ae-d1c8-60ad-9bce-e1a850ba5b67/resourceGroups/sample-resource-group-test/providers/Microsoft.EventGrid/topics/egtopicsamplesub" + "id": "3a30afef-b604-4b67-973e-7dfff7e178a7", + "subject": "Test EG Custom Event", + "data": {"team": "event grid squad"}, + "eventType": "Azure.Sdk.Sample", + "dataVersion": "2.0", + "metadataVersion": "1", + "eventTime": "2020-08-07T02:19:05.16916Z", + "topic": "/subscriptions/f8aa80ae-d1c8-60ad-9bce-e1a850ba5b67/resourceGroups/sample-resource-group-test/providers/Microsoft.EventGrid/topics/egtopicsamplesub", } eg_custom_string = json.dumps(eg_custom_dict) eg_custom_bytes = eg_custom_string.encode("utf-8") diff --git a/sdk/eventgrid/azure-eventgrid/tests/conftest.py b/sdk/eventgrid/azure-eventgrid/tests/conftest.py index fa630f477a16..ac07bea363d3 100644 --- a/sdk/eventgrid/azure-eventgrid/tests/conftest.py +++ b/sdk/eventgrid/azure-eventgrid/tests/conftest.py @@ -23,26 +23,72 @@ # IN THE SOFTWARE. 
# # -------------------------------------------------------------------------- -import platform -import sys +import os import pytest -from devtools_testutils import test_proxy -from devtools_testutils.sanitizers import add_remove_header_sanitizer, add_general_regex_sanitizer, set_custom_default_matcher +from urllib.parse import urlparse +from devtools_testutils import ( + add_general_string_sanitizer, + test_proxy, + add_body_key_sanitizer, + add_header_regex_sanitizer, + add_oauth_response_sanitizer, + add_body_regex_sanitizer, +) +from devtools_testutils.sanitizers import ( + add_remove_header_sanitizer, + add_general_regex_sanitizer, + set_custom_default_matcher, +) -# Ignore async tests for Python < 3.5 -collect_ignore_glob = [] -if sys.version_info < (3, 5): - collect_ignore_glob.append("*_async.py") - collect_ignore_glob.append("test_cncf*") @pytest.fixture(scope="session", autouse=True) -def add_aeg_sanitizer(test_proxy): +def add_sanitizers(test_proxy): # this can be reverted to set_bodiless_matcher() after tests are re-recorded and don't contain these headers set_custom_default_matcher( - compare_bodies=False, excluded_headers="Authorization,Content-Length,x-ms-client-request-id,x-ms-request-id" + compare_bodies=False, + excluded_headers="Authorization,Content-Length,x-ms-client-request-id,x-ms-request-id", ) - add_remove_header_sanitizer(headers="aeg-sas-key, aeg-sas-token") + add_remove_header_sanitizer(headers="aeg-sas-key, aeg-sas-token, aeg-channel-name") add_general_regex_sanitizer( - value="fakeresource", - regex="(?<=\\/\\/)[a-z-]+(?=\\.eastus-1\\.eventgrid\\.azure\\.net/api/events)" + value="fakeresource", regex="(?<=\\/\\/)[a-z-]+(?=\\.eastus-1\\.eventgrid\\.azure\\.net/api/events)" ) + + add_oauth_response_sanitizer() + add_header_regex_sanitizer(key="Set-Cookie", value="[set-cookie;]") + add_header_regex_sanitizer(key="Cookie", value="cookie;") + + add_body_key_sanitizer(json_path="$..id", value="id") + + client_id = 
os.getenv("AZURE_CLIENT_ID", "sanitized")
+    client_secret = os.getenv("AZURE_CLIENT_SECRET", "sanitized")
+    eventgrid_client_id = os.getenv("EVENTGRID_CLIENT_ID", "sanitized")
+    eventgrid_client_secret = os.getenv("EVENTGRID_CLIENT_SECRET", "sanitized")
+    tenant_id = os.getenv("AZURE_TENANT_ID", "sanitized")
+    eventgrid_topic_endpoint = os.getenv("EVENTGRID_TOPIC_ENDPOINT", "sanitized")
+
+    eventgrid_endpoint = os.getenv("EVENTGRID_ENDPOINT", "sanitized")
+    eventgrid_key = os.getenv("EVENTGRID_KEY", "sanitized")
+    eventgrid_topic_name = os.getenv("EVENTGRID_TOPIC_NAME", "sanitized")
+    eventgrid_event_subscription_name = os.getenv("EVENTGRID_EVENT_SUBSCRIPTION_NAME", "sanitized")
+
+    eventgrid_cloud_event_topic_key = os.getenv("EVENTGRID_CLOUD_EVENT_TOPIC_KEY", "sanitized")
+    eventgrid_cloud_event_topic_endpoint = os.getenv("EVENTGRID_CLOUD_EVENT_TOPIC_ENDPOINT", "sanitized")
+
+    eventgrid_topic_key = os.getenv("EVENTGRID_TOPIC_KEY", "sanitized")
+    eventgrid_topic_endpoint = os.getenv("EVENTGRID_TOPIC_ENDPOINT", "sanitized")
+
+    eventgrid_partner_channel_name = os.getenv("EVENTGRID_PARTNER_CHANNEL_NAME", "sanitized")
+    eventgrid_partner_namespace_topic_endpoint = os.getenv("EVENTGRID_PARTNER_NAMESPACE_TOPIC_ENDPOINT", "sanitized")
+    eventgrid_partner_namespace_topic_key = os.getenv("EVENTGRID_PARTNER_NAMESPACE_TOPIC_KEY", "sanitized")
+
+    # Need to sanitize namespace for eventgrid_topic:
+    try:
+        eventgrid_hostname = urlparse(eventgrid_topic_endpoint).hostname
+        add_general_string_sanitizer(target=eventgrid_hostname.upper(), value="sanitized.eastus-1.eventgrid.azure.net")
+    except:
+        pass
+    add_general_string_sanitizer(target=client_id, value="00000000-0000-0000-0000-000000000000")
+    add_general_string_sanitizer(target=client_secret, value="sanitized")
+    add_general_string_sanitizer(target=eventgrid_client_id, value="00000000-0000-0000-0000-000000000000")
+    add_general_string_sanitizer(target=eventgrid_client_secret, value="sanitized")
+    
add_general_string_sanitizer(target=tenant_id, value="00000000-0000-0000-0000-000000000000") diff --git a/sdk/eventgrid/azure-eventgrid/tests/eventgrid_preparer.py b/sdk/eventgrid/azure-eventgrid/tests/eventgrid_preparer.py index b9a1e245af64..917325fbd094 100644 --- a/sdk/eventgrid/azure-eventgrid/tests/eventgrid_preparer.py +++ b/sdk/eventgrid/azure-eventgrid/tests/eventgrid_preparer.py @@ -3,20 +3,29 @@ from azure.mgmt.eventgrid.models import Topic, InputSchema, JsonInputSchemaMapping, JsonField, JsonFieldWithDefault -EVENTGRID_TOPIC_PARAM = 'eventgrid_topic' -EVENTGRID_TOPIC_LOCATION = 'westus' + +EVENTGRID_TOPIC_PARAM = "eventgrid_topic" +EVENTGRID_TOPIC_LOCATION = "westus" CLOUD_EVENT_SCHEMA = InputSchema.cloud_event_schema_v1_0 CUSTOM_EVENT_SCHEMA = InputSchema.custom_event_schema -ID_JSON_FIELD = JsonField(source_field='customId') -TOPIC_JSON_FIELD = JsonField(source_field='customTopic') -EVENT_TIME_JSON_FIELD = JsonField(source_field='customEventTime') -EVENT_TYPE_JSON_FIELD_WITH_DEFAULT = JsonFieldWithDefault(source_field='customEventType', default_value='') -SUBJECT_JSON_FIELD_WITH_DEFAULT = JsonFieldWithDefault(source_field='customSubject', default_value='') -DATA_VERSION_JSON_FIELD_WITH_DEFAULT = JsonFieldWithDefault(source_field='customDataVersion', default_value='') -CUSTOM_JSON_INPUT_SCHEMA_MAPPING = JsonInputSchemaMapping(id=ID_JSON_FIELD, topic=TOPIC_JSON_FIELD, event_time=EVENT_TIME_JSON_FIELD, event_type=EVENT_TYPE_JSON_FIELD_WITH_DEFAULT, subject=SUBJECT_JSON_FIELD_WITH_DEFAULT, data_version=DATA_VERSION_JSON_FIELD_WITH_DEFAULT) +ID_JSON_FIELD = JsonField(source_field="customId") +TOPIC_JSON_FIELD = JsonField(source_field="customTopic") +EVENT_TIME_JSON_FIELD = JsonField(source_field="customEventTime") +EVENT_TYPE_JSON_FIELD_WITH_DEFAULT = JsonFieldWithDefault(source_field="customEventType", default_value="") +SUBJECT_JSON_FIELD_WITH_DEFAULT = JsonFieldWithDefault(source_field="customSubject", default_value="") 
+DATA_VERSION_JSON_FIELD_WITH_DEFAULT = JsonFieldWithDefault(source_field="customDataVersion", default_value="") +CUSTOM_JSON_INPUT_SCHEMA_MAPPING = JsonInputSchemaMapping( + id=ID_JSON_FIELD, + topic=TOPIC_JSON_FIELD, + event_time=EVENT_TIME_JSON_FIELD, + event_type=EVENT_TYPE_JSON_FIELD_WITH_DEFAULT, + subject=SUBJECT_JSON_FIELD_WITH_DEFAULT, + data_version=DATA_VERSION_JSON_FIELD_WITH_DEFAULT, +) EventGridPreparer = functools.partial( - EnvironmentVariableLoader, "eventgrid", + EnvironmentVariableLoader, + "eventgrid", eventgrid_topic_endpoint="https://fakeresource.eastus-1.eventgrid.azure.net/api/events", eventgrid_topic_key="fakekeyfakekeyfakekeyfakekeyfakekeyfakekeyA=", eventgrid_domain_endpoint="https://fakeresource.eastus-1.eventgrid.azure.net/api/events", @@ -27,7 +36,11 @@ eventgrid_cloud_event_domain_key="fakekeyfakekeyfakekeyfakekeyfakekeyfakekeyA=", eventgrid_custom_event_topic_endpoint="https://fakeresource.eastus-1.eventgrid.azure.net/api/events", eventgrid_custom_event_topic_key="fakekeyfakekeyfakekeyfakekeyfakekeyfakekeyA=", + eventgrid_partner_channel_name="fakechannel", eventgrid_partner_namespace_topic_endpoint="https://fakeresource.eastus-1.eventgrid.azure.net/api/events", eventgrid_partner_namespace_topic_key="fakekeyfakekeyfakekeyfakekeyfakekeyfakekeyA=", - eventgrid_partner_channel_name="fake_channel_name" + eventgrid_endpoint="https://fakeresource.eastus-1.eventgrid.azure.net/api/events", + eventgrid_key="fakekeyfakekeyfakekeyfakekeyfakekeyfakekeyA=", + eventgrid_topic_name="faketopic", + eventgrid_event_subscription_name="fakesub", ) diff --git a/sdk/eventgrid/azure-eventgrid/tests/perfstress_tests/send.py b/sdk/eventgrid/azure-eventgrid/tests/perfstress_tests/send.py index 743046cc1edb..f08902d3aaec 100644 --- a/sdk/eventgrid/azure-eventgrid/tests/perfstress_tests/send.py +++ b/sdk/eventgrid/azure-eventgrid/tests/perfstress_tests/send.py @@ -1,17 +1,21 @@ -#------------------------------------------------------------------------- +# 
------------------------------------------------------------------------- # Copyright (c) Microsoft Corporation. All rights reserved. # Licensed under the MIT License. See License.txt in the project root for # license information. -#-------------------------------------------------------------------------- +# -------------------------------------------------------------------------- import asyncio from devtools_testutils.perfstress_tests import PerfStressTest -from azure.eventgrid import EventGridPublisherClient as SyncPublisherClient, EventGridEvent +from azure.eventgrid import ( + EventGridPublisherClient as SyncPublisherClient, + EventGridEvent, +) from azure.eventgrid.aio import EventGridPublisherClient as AsyncPublisherClient from azure.core.credentials import AzureKeyCredential + class EventGridPerfTest(PerfStressTest): def __init__(self, arguments): super().__init__(arguments) @@ -21,29 +25,23 @@ def __init__(self, arguments): endpoint = self.get_from_env("EG_TOPIC_HOSTNAME") # Create clients - self.publisher_client = SyncPublisherClient( - endpoint=endpoint, - credential=AzureKeyCredential(topic_key) - ) - self.async_publisher_client = AsyncPublisherClient( - endpoint=endpoint, - credential=AzureKeyCredential(topic_key) - ) + self.publisher_client = SyncPublisherClient(endpoint=endpoint, credential=AzureKeyCredential(topic_key)) + self.async_publisher_client = AsyncPublisherClient(endpoint=endpoint, credential=AzureKeyCredential(topic_key)) self.event_list = [] for _ in range(self.args.num_events): - self.event_list.append(EventGridEvent( - event_type="Contoso.Items.ItemReceived", - data={ - "services": ["EventGrid", "ServiceBus", "EventHubs", "Storage"] - }, - subject="Door1", - data_version="2.0" - )) + self.event_list.append( + EventGridEvent( + event_type="Contoso.Items.ItemReceived", + data={"services": ["EventGrid", "ServiceBus", "EventHubs", "Storage"]}, + subject="Door1", + data_version="2.0", + ) + ) async def close(self): """This is run after 
cleanup. - + Use this to close any open handles or clients. """ await self.async_publisher_client.close() @@ -51,7 +49,7 @@ async def close(self): def run_sync(self): """The synchronous perf test. - + Try to keep this minimal and focused. Using only a single client API. Avoid putting any ancillary logic (e.g. generating UUIDs), and put this in the setup/init instead so that we're only measuring the client API call. @@ -60,7 +58,7 @@ def run_sync(self): async def run_async(self): """The asynchronous perf test. - + Try to keep this minimal and focused. Using only a single client API. Avoid putting any ancillary logic (e.g. generating UUIDs), and put this in the setup/init instead so that we're only measuring the client API call. @@ -70,4 +68,11 @@ async def run_async(self): @staticmethod def add_arguments(parser): super(EventGridPerfTest, EventGridPerfTest).add_arguments(parser) - parser.add_argument('-n', '--num-events', nargs='?', type=int, help='Number of events to be sent. Defaults to 100', default=100) + parser.add_argument( + "-n", + "--num-events", + nargs="?", + type=int, + help="Number of events to be sent. Defaults to 100", + default=100, + ) diff --git a/sdk/eventgrid/azure-eventgrid/tests/test_cloud_event_tracing.py b/sdk/eventgrid/azure-eventgrid/tests/test_cloud_event_tracing.py index 0aad89a0d898..bf1ba4ac12fe 100644 --- a/sdk/eventgrid/azure-eventgrid/tests/test_cloud_event_tracing.py +++ b/sdk/eventgrid/azure-eventgrid/tests/test_cloud_event_tracing.py @@ -1,67 +1,63 @@ -#------------------------------------------------------------------------- +# ------------------------------------------------------------------------- # Copyright (c) Microsoft Corporation. All rights reserved. # Licensed under the MIT License. See License.txt in the project root for # license information. 
-#-------------------------------------------------------------------------- +# -------------------------------------------------------------------------- import pytest import json -from azure.core.pipeline import ( - PipelineRequest, - PipelineContext -) +from azure.core.pipeline import PipelineRequest, PipelineContext from azure.core.pipeline.transport import HttpRequest from azure.core.messaging import CloudEvent -from azure.eventgrid._policies import CloudEventDistributedTracingPolicy -from _mocks import ( - cloud_storage_dict -) +from azure.eventgrid._legacy._policies import CloudEventDistributedTracingPolicy +from _mocks import cloud_storage_dict + # spell-checker:disable _content_type = "application/cloudevents-batch+json; charset=utf-8" _traceparent_value = "00-4bf92f3577b34da6a3ce929d0e0e4736-00f067aa0ba902b7-01" _tracestate_value = "rojo=00-4bf92f3577b34da6a3ce929d0e0e4736-00f067aa0ba902b7-01,congo=lZWRzIHRoNhcm5hbCBwbGVhc3VyZS4" # spell-chcker:enable -class EventGridSerializationTests(object): +class EventGridSerializationTests(object): def test_cloud_event_policy_copies(self): policy = CloudEventDistributedTracingPolicy() body = json.dumps([cloud_storage_dict]) - universal_request = HttpRequest('POST', 'http://127.0.0.1/', data=body) - universal_request.headers['content-type'] = _content_type - universal_request.headers['traceparent'] = _traceparent_value - universal_request.headers['tracestate'] = _tracestate_value + universal_request = HttpRequest("POST", "http://127.0.0.1/", data=body) + universal_request.headers["content-type"] = _content_type + universal_request.headers["traceparent"] = _traceparent_value + universal_request.headers["tracestate"] = _tracestate_value request = PipelineRequest(universal_request, PipelineContext(None)) resp = policy.on_request(request) body = json.loads(request.http_request.body) - + for item in body: - assert 'traceparent' in item - assert 'tracestate' in item + assert "traceparent" in item + assert "tracestate" in 
item def test_cloud_event_policy_no_copy_if_trace_exists(self): policy = CloudEventDistributedTracingPolicy() - cloud_storage_dict.update({'traceparent': 'exists', 'tracestate': 'state_exists'}) + cloud_storage_dict.update({"traceparent": "exists", "tracestate": "state_exists"}) body = json.dumps([cloud_storage_dict]) - universal_request = HttpRequest('POST', 'http://127.0.0.1/', data=body) - universal_request.headers['content-type'] = _content_type - universal_request.headers['traceparent'] = _traceparent_value - universal_request.headers['tracestate'] = _tracestate_value + universal_request = HttpRequest("POST", "http://127.0.0.1/", data=body) + universal_request.headers["content-type"] = _content_type + universal_request.headers["traceparent"] = _traceparent_value + universal_request.headers["tracestate"] = _tracestate_value request = PipelineRequest(universal_request, PipelineContext(None)) resp = policy.on_request(request) body = json.loads(request.http_request.body) - + for item in body: - assert 'traceparent' in item - assert 'tracestate' in item - assert item['traceparent'] == 'exists' - assert item['tracestate'] == 'state_exists' + assert "traceparent" in item + assert "tracestate" in item + assert item["traceparent"] == "exists" + assert item["tracestate"] == "state_exists" diff --git a/sdk/eventgrid/azure-eventgrid/tests/test_cncf_events.py b/sdk/eventgrid/azure-eventgrid/tests/test_cncf_events.py index 6cf34da3e4e1..06b2fc949305 100644 --- a/sdk/eventgrid/azure-eventgrid/tests/test_cncf_events.py +++ b/sdk/eventgrid/azure-eventgrid/tests/test_cncf_events.py @@ -1,4 +1,3 @@ - import json from devtools_testutils import AzureRecordedTestCase, recorded_by_proxy @@ -9,6 +8,7 @@ EventGridPreparer, ) + class TestEventGridPublisherClientCncf(AzureRecordedTestCase): def create_eg_publisher_client(self, endpoint): credential = self.get_credential(EventGridPublisherClient) @@ -25,13 +25,14 @@ def test_send_cloud_event_data_dict(self, 
eventgrid_cloud_event_topic_endpoint): } data = {"message": "Hello World!"} cloud_event = CloudEvent(attributes, data) + def callback(request): req = json.loads(request.http_request.body) assert req[0].get("data") is not None assert isinstance(req[0], dict) assert req[0].get("type") == "com.example.sampletype1" assert req[0].get("source") == "https://example.com/event-producer" - + client.send(cloud_event, raw_request_hook=callback) @EventGridPreparer() @@ -42,13 +43,14 @@ def test_send_cloud_event_data_base64_using_data(self, eventgrid_cloud_event_top "type": "com.example.sampletype1", "source": "https://example.com/event-producer", } - data = b'hello world' + data = b"hello world" cloud_event = CloudEvent(attributes, data) + def callback(request): req = json.loads(request.http_request.body) assert req[0].get("data_base64") is not None assert req[0].get("data") is None - + client.send(cloud_event, raw_request_hook=callback) @EventGridPreparer() @@ -72,10 +74,12 @@ def test_send_cloud_event_data_str(self, eventgrid_cloud_event_topic_endpoint): "source": "https://example.com/event-producer", } data = "hello world" + def callback(request): req = json.loads(request.http_request.body) assert req[0].get("data_base64") is None assert req[0].get("data") is not None + cloud_event = CloudEvent(attributes, data) client.send(cloud_event, raw_request_hook=callback) @@ -98,8 +102,8 @@ def test_send_cloud_event_data_with_extensions(self, eventgrid_cloud_event_topic attributes = { "type": "com.example.sampletype1", "source": "https://example.com/event-producer", - "ext1": "extension" + "ext1": "extension", } data = "hello world" cloud_event = CloudEvent(attributes, data) - client.send([cloud_event]) \ No newline at end of file + client.send([cloud_event]) diff --git a/sdk/eventgrid/azure-eventgrid/tests/test_cncf_events_async.py b/sdk/eventgrid/azure-eventgrid/tests/test_cncf_events_async.py index febd44175a99..a4c34a8cafd9 100644 --- 
a/sdk/eventgrid/azure-eventgrid/tests/test_cncf_events_async.py +++ b/sdk/eventgrid/azure-eventgrid/tests/test_cncf_events_async.py @@ -1,4 +1,3 @@ - import json import pytest from devtools_testutils import AzureRecordedTestCase @@ -11,6 +10,7 @@ EventGridPreparer, ) + class TestEventGridPublisherClientCncf(AzureRecordedTestCase): def create_eg_publisher_client(self, endpoint): credential = self.get_credential(EventGridPublisherClient, is_async=True) @@ -28,13 +28,14 @@ async def test_send_cloud_event_data_dict(self, eventgrid_cloud_event_topic_endp } data = {"message": "Hello World!"} cloud_event = CloudEvent(attributes, data) + def callback(request): req = json.loads(request.http_request.body) assert req[0].get("data") is not None assert isinstance(req[0], dict) assert req[0].get("type") == "com.example.sampletype1" assert req[0].get("source") == "https://example.com/event-producer" - + await client.send(cloud_event, raw_request_hook=callback) @EventGridPreparer() @@ -46,13 +47,14 @@ async def test_send_cloud_event_data_base64_using_data(self, eventgrid_cloud_eve "type": "com.example.sampletype1", "source": "https://example.com/event-producer", } - data = b'hello world' + data = b"hello world" cloud_event = CloudEvent(attributes, data) + def callback(request): req = json.loads(request.http_request.body) assert req[0].get("data_base64") is not None assert req[0].get("data") is None - + await client.send(cloud_event, raw_request_hook=callback) @EventGridPreparer() @@ -79,11 +81,12 @@ async def test_send_cloud_event_data_str(self, eventgrid_cloud_event_topic_endpo } data = "hello world" cloud_event = CloudEvent(attributes, data) + def callback(request): req = json.loads(request.http_request.body) assert req[0].get("data_base64") is None assert req[0].get("data") is not None - + await client.send(cloud_event, raw_request_hook=callback) @EventGridPreparer() @@ -107,7 +110,7 @@ async def test_send_cloud_event_data_with_extensions(self, eventgrid_cloud_event attributes 
= { "type": "com.example.sampletype1", "source": "https://example.com/event-producer", - "ext1": "extension" + "ext1": "extension", } data = "hello world" cloud_event = CloudEvent(attributes, data) diff --git a/sdk/eventgrid/azure-eventgrid/tests/test_eg_consumer_client.py b/sdk/eventgrid/azure-eventgrid/tests/test_eg_consumer_client.py new file mode 100644 index 000000000000..07798d0b8b35 --- /dev/null +++ b/sdk/eventgrid/azure-eventgrid/tests/test_eg_consumer_client.py @@ -0,0 +1,138 @@ +# ------------------------------------------------------------------------- +# Copyright (c) Microsoft Corporation. All rights reserved. +# Licensed under the MIT License. See License.txt in the project root for +# license information. +# -------------------------------------------------------------------------- + +import logging +import sys +import os +import json +import pytest +import uuid +from msrest.serialization import UTC +import datetime as dt + +from devtools_testutils import AzureRecordedTestCase +from azure.core.messaging import CloudEvent +from azure.core.credentials import AzureKeyCredential +from azure.eventgrid import EventGridConsumerClient, EventGridPublisherClient +from eventgrid_preparer import ( + EventGridPreparer, +) + + +class TestEventGridConsumerClient(AzureRecordedTestCase): + def create_eg_publisher_client(self, endpoint, topic=None): + credential = self.get_credential(EventGridPublisherClient) + client = self.create_client_from_credential( + EventGridPublisherClient, credential=credential, endpoint=endpoint, namespace_topic=topic + ) + return client + + def create_eg_consumer_client(self, endpoint, topic, subscription): + credential = self.get_credential(EventGridConsumerClient) + client = self.create_client_from_credential( + EventGridConsumerClient, + credential=credential, + endpoint=endpoint, + namespace_topic=topic, + subscription=subscription, + ) + return client + + @pytest.mark.live_test_only + @EventGridPreparer() + def 
test_receive_data(self, **kwargs): + eventgrid_endpoint = kwargs["eventgrid_endpoint"] + eventgrid_key = kwargs["eventgrid_key"] + eventgrid_topic_name = kwargs["eventgrid_topic_name"] + eventgrid_event_subscription_name = kwargs["eventgrid_event_subscription_name"] + publisher = self.create_eg_publisher_client(eventgrid_endpoint, eventgrid_topic_name) + consumer = self.create_eg_consumer_client(eventgrid_endpoint, eventgrid_topic_name, eventgrid_event_subscription_name) + cloud_event = CloudEvent( + source="http://samplesource.dev", + data={"sample": "cloudevent"}, + type="Sample.Cloud.Event", + ) + publisher.send(cloud_event) + + received_event = consumer.receive(max_events=1) + assert len(received_event) == 1 + + for event in received_event: + ack_result = consumer.acknowledge(lock_tokens=[event.broker_properties.lock_token]) + assert ack_result.succeeded_lock_tokens == [event.broker_properties.lock_token] + + @pytest.mark.live_test_only + @EventGridPreparer() + def test_receive_renew_data(self, **kwargs): + eventgrid_endpoint = kwargs["eventgrid_endpoint"] + eventgrid_key = kwargs["eventgrid_key"] + eventgrid_topic_name = kwargs["eventgrid_topic_name"] + eventgrid_event_subscription_name = kwargs["eventgrid_event_subscription_name"] + publisher = self.create_eg_publisher_client(eventgrid_endpoint, eventgrid_topic_name) + consumer = self.create_eg_consumer_client(eventgrid_endpoint, eventgrid_topic_name, eventgrid_event_subscription_name) + + cloud_event = CloudEvent( + source="http://samplesource.dev", + data={"sample": "cloudevent"}, + type="Sample.Cloud.Event", + ) + publisher.send(cloud_event) + + received_event = consumer.receive(max_events=1) + assert len(received_event) == 1 + + for event in received_event: + renew_lock = consumer.renew_locks(lock_tokens=[event.broker_properties.lock_token]) + ack_result = consumer.acknowledge(lock_tokens=[event.broker_properties.lock_token]) + assert ack_result.succeeded_lock_tokens == 
[event.broker_properties.lock_token] + + @pytest.mark.live_test_only + @EventGridPreparer() + def test_receive_release_data(self, **kwargs): + eventgrid_endpoint = kwargs["eventgrid_endpoint"] + eventgrid_key = kwargs["eventgrid_key"] + eventgrid_topic_name = kwargs["eventgrid_topic_name"] + eventgrid_event_subscription_name = kwargs["eventgrid_event_subscription_name"] + publisher = self.create_eg_publisher_client(eventgrid_endpoint, eventgrid_topic_name) + consumer = self.create_eg_consumer_client(eventgrid_endpoint, eventgrid_topic_name, eventgrid_event_subscription_name) + + cloud_event = CloudEvent( + source="http://samplesource.dev", + data={"sample": "cloudevent"}, + type="Sample.Cloud.Event", + ) + publisher.send(cloud_event) + + received_event = consumer.receive(max_events=1) + assert len(received_event) == 1 + + for event in received_event: + release = consumer.release(lock_tokens=[event.broker_properties.lock_token]) + assert release.succeeded_lock_tokens == [event.broker_properties.lock_token] + + @pytest.mark.live_test_only + @EventGridPreparer() + def test_receive_reject_data(self, **kwargs): + eventgrid_endpoint = kwargs["eventgrid_endpoint"] + eventgrid_key = kwargs["eventgrid_key"] + eventgrid_topic_name = kwargs["eventgrid_topic_name"] + eventgrid_event_subscription_name = kwargs["eventgrid_event_subscription_name"] + publisher = self.create_eg_publisher_client(eventgrid_endpoint, eventgrid_topic_name) + consumer = self.create_eg_consumer_client(eventgrid_endpoint, eventgrid_topic_name, eventgrid_event_subscription_name) + + cloud_event = CloudEvent( + source="http://samplesource.dev", + data={"sample": "cloudevent"}, + type="Sample.Cloud.Event", + ) + publisher.send(cloud_event) + + received_event = consumer.receive(max_events=1) + assert len(received_event) == 1 + + for event in received_event: + reject = consumer.reject(lock_tokens=[event.broker_properties.lock_token]) + assert reject.succeeded_lock_tokens == 
[event.broker_properties.lock_token] diff --git a/sdk/eventgrid/azure-eventgrid/tests/test_eg_consumer_client_async.py b/sdk/eventgrid/azure-eventgrid/tests/test_eg_consumer_client_async.py new file mode 100644 index 000000000000..b3bd78a78fdd --- /dev/null +++ b/sdk/eventgrid/azure-eventgrid/tests/test_eg_consumer_client_async.py @@ -0,0 +1,140 @@ +# ------------------------------------------------------------------------- +# Copyright (c) Microsoft Corporation. All rights reserved. +# Licensed under the MIT License. See License.txt in the project root for +# license information. +# -------------------------------------------------------------------------- + +import logging +import sys +import os +import json +import pytest +import asyncio +import uuid +from msrest.serialization import UTC +import datetime as dt + +from devtools_testutils import AzureRecordedTestCase +from azure.core.messaging import CloudEvent +from azure.core.credentials import AzureKeyCredential +from azure.eventgrid.aio import EventGridConsumerClient, EventGridPublisherClient +from eventgrid_preparer import ( + EventGridPreparer, +) + + +class TestEventGridConsumerClientAsync(AzureRecordedTestCase): + def create_eg_publisher_client(self, endpoint, topic=None): + credential = self.get_credential(EventGridPublisherClient) + client = self.create_client_from_credential( + EventGridPublisherClient, credential=credential, endpoint=endpoint, namespace_topic=topic + ) + return client + + def create_eg_consumer_client(self, endpoint, topic, subscription): + credential = self.get_credential(EventGridConsumerClient) + client = self.create_client_from_credential( + EventGridConsumerClient, + credential=credential, + endpoint=endpoint, + namespace_topic=topic, + subscription=subscription, + ) + return client + + @pytest.mark.live_test_only + @EventGridPreparer() + @pytest.mark.asyncio + async def test_receive_data(self, **kwargs): + eventgrid_endpoint = kwargs["eventgrid_endpoint"] + eventgrid_key = 
kwargs["eventgrid_key"] + eventgrid_topic_name = kwargs["eventgrid_topic_name"] + eventgrid_event_subscription_name = kwargs["eventgrid_event_subscription_name"] + publisher = self.create_eg_publisher_client(eventgrid_endpoint, eventgrid_topic_name) + consumer = self.create_eg_consumer_client(eventgrid_endpoint, eventgrid_topic_name, eventgrid_event_subscription_name) + cloud_event = CloudEvent( + source="http://samplesource.dev", + data={"sample": "cloudevent"}, + type="Sample.Cloud.Event", + ) + await publisher.send(cloud_event) + + received_event = await consumer.receive(max_events=1) + assert len(received_event) == 1 + + for event in received_event: + ack_result = await consumer.acknowledge(lock_tokens=[event.broker_properties.lock_token]) + assert ack_result.succeeded_lock_tokens == [event.broker_properties.lock_token] + + @pytest.mark.live_test_only + @EventGridPreparer() + @pytest.mark.asyncio + async def test_receive_renew_data(self, **kwargs): + eventgrid_endpoint = kwargs["eventgrid_endpoint"] + eventgrid_key = kwargs["eventgrid_key"] + eventgrid_topic_name = kwargs["eventgrid_topic_name"] + eventgrid_event_subscription_name = kwargs["eventgrid_event_subscription_name"] + publisher = self.create_eg_publisher_client(eventgrid_endpoint, eventgrid_topic_name) + consumer = self.create_eg_consumer_client(eventgrid_endpoint, eventgrid_topic_name, eventgrid_event_subscription_name) + cloud_event = CloudEvent( + source="http://samplesource.dev", + data={"sample": "cloudevent"}, + type="Sample.Cloud.Event", + ) + await publisher.send(cloud_event) + + received_event = await consumer.receive(max_events=1) + assert len(received_event) == 1 + + for event in received_event: + renew_lock = await consumer.renew_locks(lock_tokens=[event.broker_properties.lock_token]) + ack_result = await consumer.acknowledge(lock_tokens=[event.broker_properties.lock_token]) + assert ack_result.succeeded_lock_tokens == [event.broker_properties.lock_token] + + @pytest.mark.live_test_only + 
@EventGridPreparer() + @pytest.mark.asyncio + async def test_receive_release_data(self, **kwargs): + eventgrid_endpoint = kwargs["eventgrid_endpoint"] + eventgrid_key = kwargs["eventgrid_key"] + eventgrid_topic_name = kwargs["eventgrid_topic_name"] + eventgrid_event_subscription_name = kwargs["eventgrid_event_subscription_name"] + publisher = self.create_eg_publisher_client(eventgrid_endpoint, eventgrid_topic_name) + consumer = self.create_eg_consumer_client(eventgrid_endpoint, eventgrid_topic_name, eventgrid_event_subscription_name) + cloud_event = CloudEvent( + source="http://samplesource.dev", + data={"sample": "cloudevent"}, + type="Sample.Cloud.Event", + ) + await publisher.send(cloud_event) + + received_event = await consumer.receive(max_events=1) + assert len(received_event) == 1 + + for event in received_event: + release = await consumer.release(lock_tokens=[event.broker_properties.lock_token]) + assert release.succeeded_lock_tokens == [event.broker_properties.lock_token] + + @pytest.mark.live_test_only + @EventGridPreparer() + @pytest.mark.asyncio + async def test_receive_reject_data(self, **kwargs): + eventgrid_endpoint = kwargs["eventgrid_endpoint"] + eventgrid_key = kwargs["eventgrid_key"] + eventgrid_topic_name = kwargs["eventgrid_topic_name"] + eventgrid_event_subscription_name = kwargs["eventgrid_event_subscription_name"] + publisher = self.create_eg_publisher_client(eventgrid_endpoint, eventgrid_topic_name) + consumer = self.create_eg_consumer_client(eventgrid_endpoint, eventgrid_topic_name, eventgrid_event_subscription_name) + cloud_event = CloudEvent( + source="http://samplesource.dev", + data={"sample": "cloudevent"}, + type="Sample.Cloud.Event", + ) + await publisher.send(cloud_event) + + received_event = await consumer.receive(max_events=1) + assert len(received_event) == 1 + + for event in received_event: + reject = await consumer.reject(lock_tokens=[event.broker_properties.lock_token]) + assert reject.succeeded_lock_tokens == 
[event.broker_properties.lock_token] diff --git a/sdk/eventgrid/azure-eventgrid/tests/test_eg_event_get_bytes.py b/sdk/eventgrid/azure-eventgrid/tests/test_eg_event_get_bytes.py index 1ae80ecde2c9..bbd5ad4ac0f4 100644 --- a/sdk/eventgrid/azure-eventgrid/tests/test_eg_event_get_bytes.py +++ b/sdk/eventgrid/azure-eventgrid/tests/test_eg_event_get_bytes.py @@ -6,9 +6,10 @@ import pytest import uuid from msrest.serialization import UTC -from azure.eventgrid._messaging_shared import _get_json_content +from azure.eventgrid._legacy._messaging_shared import _get_json_content from azure.eventgrid import EventGridEvent + class MockQueueMessage(object): def __init__(self, content=None): self.id = uuid.uuid4() @@ -19,31 +20,33 @@ def __init__(self, content=None): self.pop_receipt = None self.next_visible_on = None + class MockServiceBusReceivedMessage(object): def __init__(self, body=None, **kwargs): - self.body=body - self.application_properties=None - self.session_id=None - self.message_id='3f6c5441-5be5-4f33-80c3-3ffeb6a090ce' - self.content_type='application/cloudevents+json; charset=utf-8' - self.correlation_id=None - self.to=None - self.reply_to=None - self.reply_to_session_id=None - self.subject=None - self.time_to_live=datetime.timedelta(days=14) - self.partition_key=None - self.scheduled_enqueue_time_utc=None - self.auto_renew_error=None, - self.dead_letter_error_description=None - self.dead_letter_reason=None - self.dead_letter_source=None - self.delivery_count=13 - self.enqueued_sequence_number=0 - self.enqueued_time_utc=datetime.datetime(2021, 7, 22, 22, 27, 41, 236000) - self.expires_at_utc=datetime.datetime(2021, 8, 5, 22, 27, 41, 236000) - self.sequence_number=11219 - self.lock_token='233146e3-d5a6-45eb-826f-691d82fb8b13' + self.body = body + self.application_properties = None + self.session_id = None + self.message_id = "3f6c5441-5be5-4f33-80c3-3ffeb6a090ce" + self.content_type = "application/cloudevents+json; charset=utf-8" + self.correlation_id = None + 
self.to = None + self.reply_to = None + self.reply_to_session_id = None + self.subject = None + self.time_to_live = datetime.timedelta(days=14) + self.partition_key = None + self.scheduled_enqueue_time_utc = None + self.auto_renew_error = (None,) + self.dead_letter_error_description = None + self.dead_letter_reason = None + self.dead_letter_source = None + self.delivery_count = 13 + self.enqueued_sequence_number = 0 + self.enqueued_time_utc = datetime.datetime(2021, 7, 22, 22, 27, 41, 236000) + self.expires_at_utc = datetime.datetime(2021, 8, 5, 22, 27, 41, 236000) + self.sequence_number = 11219 + self.lock_token = "233146e3-d5a6-45eb-826f-691d82fb8b13" + class MockEventhubData(object): def __init__(self, body=None): @@ -53,7 +56,7 @@ def __init__(self, body=None): raise ValueError("EventData cannot be None.") # Internal usage only for transforming AmqpAnnotatedMessage to outgoing EventData - self.body=body + self.body = body self._raw_amqp_message = "some amqp data" self.message_id = None self.content_type = None @@ -66,7 +69,7 @@ def __init__(self, data=None): def __iter__(self): return self - + def __next__(self): if not self.data: return """{"id":"f208feff-099b-4bda-a341-4afd0fa02fef","subject":"https://egsample.dev/sampleevent","data":"ServiceBus","event_type":"Azure.Sdk.Sample","event_time":"2021-07-22T22:27:38.960209Z","data_version":"1.0"}""" @@ -81,14 +84,15 @@ def __init__(self, data=None): def __iter__(self): return self - + def __next__(self): if not self.data: return b'[{"id":"f208feff-099b-4bda-a341-4afd0fa02fef","subject":"https://egsample.dev/sampleevent","data":"Eventhub","event_type":"Azure.Sdk.Sample","event_time":"2021-07-22T22:27:38.960209Z","data_version":"1.0"}]' return self.data - + next = __next__ + def test_get_bytes_storage_queue(): cloud_storage_dict = """{ "id":"a0517898-9fa4-4e70-b4a3-afda1dd68672", @@ -112,73 +116,73 @@ def test_get_bytes_storage_queue(): obj = MockQueueMessage(content=cloud_storage_dict) dict = _get_json_content(obj) 
- assert dict.get('data') == { - "api":"PutBlockList", - "client_request_id":"6d79dbfb-0e37-4fc4-981f-442c9ca65760", - "request_id":"831e1650-001e-001b-66ab-eeb76e000000", - "e_tag":"0x8D4BCC2E4835CD0", - "content_type":"application/octet-stream", - "content_length":524288, - "blob_type":"BlockBlob", - "url":"https://oc2d2817345i60006.blob.core.windows.net/oc2d2817345i200097container/oc2d2817345i20002296blob", - "sequencer":"00000000000004420000000000028963", - "storage_diagnostics":{"batchId":"b68529f3-68cd-4744-baa4-3c0498ec19f0"} - } - assert dict.get('data_version') == "1.0" + assert dict.get("data") == { + "api": "PutBlockList", + "client_request_id": "6d79dbfb-0e37-4fc4-981f-442c9ca65760", + "request_id": "831e1650-001e-001b-66ab-eeb76e000000", + "e_tag": "0x8D4BCC2E4835CD0", + "content_type": "application/octet-stream", + "content_length": 524288, + "blob_type": "BlockBlob", + "url": "https://oc2d2817345i60006.blob.core.windows.net/oc2d2817345i200097container/oc2d2817345i20002296blob", + "sequencer": "00000000000004420000000000028963", + "storage_diagnostics": {"batchId": "b68529f3-68cd-4744-baa4-3c0498ec19f0"}, + } + assert dict.get("data_version") == "1.0" + def test_get_bytes_storage_queue_wrong_content(): - string = u'This is a random string which must fail' + string = "This is a random string which must fail" obj = MockQueueMessage(content=string) with pytest.raises(ValueError, match="Failed to load JSON content from the object."): _get_json_content(obj) + def test_get_bytes_servicebus(): obj = MockServiceBusReceivedMessage( body=MockBody(), - message_id='3f6c5441-5be5-4f33-80c3-3ffeb6a090ce', - content_type='application/cloudevents+json; charset=utf-8', + message_id="3f6c5441-5be5-4f33-80c3-3ffeb6a090ce", + content_type="application/cloudevents+json; charset=utf-8", time_to_live=datetime.timedelta(days=14), delivery_count=13, enqueued_sequence_number=0, enqueued_time_utc=datetime.datetime(2021, 7, 22, 22, 27, 41, 236000), 
expires_at_utc=datetime.datetime(2021, 8, 5, 22, 27, 41, 236000), sequence_number=11219, - lock_token='233146e3-d5a6-45eb-826f-691d82fb8b13' + lock_token="233146e3-d5a6-45eb-826f-691d82fb8b13", ) dict = _get_json_content(obj) - assert dict.get('data') == "ServiceBus" - assert dict.get('data_version') == '1.0' + assert dict.get("data") == "ServiceBus" + assert dict.get("data_version") == "1.0" + def test_get_bytes_servicebus_wrong_content(): obj = MockServiceBusReceivedMessage( - body=MockBody(data='random'), - message_id='3f6c5441-5be5-4f33-80c3-3ffeb6a090ce', - content_type='application/json; charset=utf-8', + body=MockBody(data="random"), + message_id="3f6c5441-5be5-4f33-80c3-3ffeb6a090ce", + content_type="application/json; charset=utf-8", time_to_live=datetime.timedelta(days=14), delivery_count=13, enqueued_sequence_number=0, enqueued_time_utc=datetime.datetime(2021, 7, 22, 22, 27, 41, 236000), expires_at_utc=datetime.datetime(2021, 8, 5, 22, 27, 41, 236000), sequence_number=11219, - lock_token='233146e3-d5a6-45eb-826f-691d82fb8b13' + lock_token="233146e3-d5a6-45eb-826f-691d82fb8b13", ) with pytest.raises(ValueError, match="Failed to load JSON content from the object."): dict = _get_json_content(obj) def test_get_bytes_eventhubs(): - obj = MockEventhubData( - body=MockEhBody() - ) + obj = MockEventhubData(body=MockEhBody()) dict = _get_json_content(obj) - assert dict.get('data') == 'Eventhub' - assert dict.get('data_version') == '1.0' + assert dict.get("data") == "Eventhub" + assert dict.get("data_version") == "1.0" + def test_get_bytes_eventhubs_wrong_content(): - obj = MockEventhubData( - body=MockEhBody(data='random string') - ) + obj = MockEventhubData(body=MockEhBody(data="random string")) with pytest.raises(ValueError, match="Failed to load JSON content from the object."): dict = _get_json_content(obj) @@ -186,43 +190,44 @@ def test_get_bytes_eventhubs_wrong_content(): def test_get_bytes_random_obj(): json_str = '{"id": 
"de0fd76c-4ef4-4dfb-ab3a-8f24a307e033", "subject": "https://egtest.dev/cloudcustomevent", "data": {"team": "event grid squad"}, "event_type": "Azure.Sdk.Sample", "event_time": "2020-08-07T02:06:08.11969Z", "data_version": "1.0"}' - random_obj = { - "id":"de0fd76c-4ef4-4dfb-ab3a-8f24a307e033", - "subject":"https://egtest.dev/cloudcustomevent", - "data":{"team": "event grid squad"}, - "event_type":"Azure.Sdk.Sample", - "event_time":"2020-08-07T02:06:08.11969Z", - "data_version":"1.0", + random_obj = { + "id": "de0fd76c-4ef4-4dfb-ab3a-8f24a307e033", + "subject": "https://egtest.dev/cloudcustomevent", + "data": {"team": "event grid squad"}, + "event_type": "Azure.Sdk.Sample", + "event_time": "2020-08-07T02:06:08.11969Z", + "data_version": "1.0", } assert _get_json_content(json_str) == random_obj + def test_from_json_sb(): obj = MockServiceBusReceivedMessage( body=MockBody(), - message_id='3f6c5441-5be5-4f33-80c3-3ffeb6a090ce', - content_type='application/cloudevents+json; charset=utf-8', + message_id="3f6c5441-5be5-4f33-80c3-3ffeb6a090ce", + content_type="application/cloudevents+json; charset=utf-8", time_to_live=datetime.timedelta(days=14), delivery_count=13, enqueued_sequence_number=0, enqueued_time_utc=datetime.datetime(2021, 7, 22, 22, 27, 41, 236000), expires_at_utc=datetime.datetime(2021, 8, 5, 22, 27, 41, 236000), sequence_number=11219, - lock_token='233146e3-d5a6-45eb-826f-691d82fb8b13' + lock_token="233146e3-d5a6-45eb-826f-691d82fb8b13", ) event = EventGridEvent.from_json(obj) assert event.id == "f208feff-099b-4bda-a341-4afd0fa02fef" assert event.data == "ServiceBus" + def test_from_json_eh(): - obj = MockEventhubData( - body=MockEhBody() - ) + obj = MockEventhubData(body=MockEhBody()) event = EventGridEvent.from_json(obj) assert event.id == "f208feff-099b-4bda-a341-4afd0fa02fef" assert event.data == "Eventhub" + def test_from_json_storage(): eg_storage_dict = """{ "id":"a0517898-9fa4-4e70-b4a3-afda1dd68672", @@ -246,17 +251,17 @@ def test_from_json_storage(): 
obj = MockQueueMessage(content=eg_storage_dict) event = EventGridEvent.from_json(obj) assert event.data == { - "api":"PutBlockList", - "client_request_id":"6d79dbfb-0e37-4fc4-981f-442c9ca65760", - "request_id":"831e1650-001e-001b-66ab-eeb76e000000", - "e_tag":"0x8D4BCC2E4835CD0", - "content_type":"application/octet-stream", - "content_length":524288, - "blob_type":"BlockBlob", - "url":"https://oc2d2817345i60006.blob.core.windows.net/oc2d2817345i200097container/oc2d2817345i20002296blob", - "sequencer":"00000000000004420000000000028963", - "storage_diagnostics":{"batchId":"b68529f3-68cd-4744-baa4-3c0498ec19f0"} - } + "api": "PutBlockList", + "client_request_id": "6d79dbfb-0e37-4fc4-981f-442c9ca65760", + "request_id": "831e1650-001e-001b-66ab-eeb76e000000", + "e_tag": "0x8D4BCC2E4835CD0", + "content_type": "application/octet-stream", + "content_length": 524288, + "blob_type": "BlockBlob", + "url": "https://oc2d2817345i60006.blob.core.windows.net/oc2d2817345i200097container/oc2d2817345i20002296blob", + "sequencer": "00000000000004420000000000028963", + "storage_diagnostics": {"batchId": "b68529f3-68cd-4744-baa4-3c0498ec19f0"}, + } def test_from_json(): diff --git a/sdk/eventgrid/azure-eventgrid/tests/test_eg_publisher_client.py b/sdk/eventgrid/azure-eventgrid/tests/test_eg_publisher_client.py index ac3cdd25a8a8..b3bdd2aad738 100644 --- a/sdk/eventgrid/azure-eventgrid/tests/test_eg_publisher_client.py +++ b/sdk/eventgrid/azure-eventgrid/tests/test_eg_publisher_client.py @@ -1,8 +1,8 @@ -#------------------------------------------------------------------------- +# ------------------------------------------------------------------------- # Copyright (c) Microsoft Corporation. All rights reserved. # Licensed under the MIT License. See License.txt in the project root for # license information. 
-#-------------------------------------------------------------------------- +# -------------------------------------------------------------------------- import logging import sys @@ -19,22 +19,25 @@ except ImportError: from urlparse import urlparse -from devtools_testutils import AzureRecordedTestCase, recorded_by_proxy +from devtools_testutils import AzureRecordedTestCase, recorded_by_proxy -from azure.core.credentials import AzureSasCredential +from azure.core.credentials import AzureSasCredential, AzureKeyCredential from azure.core.messaging import CloudEvent from azure.core.serialization import NULL from azure.eventgrid import EventGridPublisherClient, EventGridEvent, generate_sas -from azure.eventgrid._helpers import _cloud_event_to_generated +from azure.eventgrid._legacy._helpers import _cloud_event_to_generated from eventgrid_preparer import ( EventGridPreparer, ) + class TestEventGridPublisherClient(AzureRecordedTestCase): - def create_eg_publisher_client(self, endpoint): + def create_eg_publisher_client(self, endpoint, topic=None): credential = self.get_credential(EventGridPublisherClient) - client = self.create_client_from_credential(EventGridPublisherClient, credential=credential, endpoint=endpoint) + client = self.create_client_from_credential( + EventGridPublisherClient, credential=credential, endpoint=endpoint, namespace_topic=topic + ) return client @EventGridPreparer() @@ -42,25 +45,25 @@ def create_eg_publisher_client(self, endpoint): def test_send_event_grid_event_data_dict(self, eventgrid_topic_endpoint): client = self.create_eg_publisher_client(eventgrid_topic_endpoint) eg_event = EventGridEvent( - subject="sample", - data={"sample": "eventgridevent"}, - event_type="Sample.EventGrid.Event", - data_version="2.0" - ) + subject="sample", + data={"sample": "eventgridevent"}, + event_type="Sample.EventGrid.Event", + data_version="2.0", + ) client.send(eg_event) @EventGridPreparer() @recorded_by_proxy - def 
test_send_event_grid_event_fails_without_full_url(self,eventgrid_topic_endpoint): + def test_send_event_grid_event_fails_without_full_url(self, eventgrid_topic_endpoint): credential = self.get_credential(EventGridPublisherClient) parsed_url = urlparse(eventgrid_topic_endpoint) client = EventGridPublisherClient(parsed_url.netloc, credential) eg_event = EventGridEvent( - subject="sample", - data={"sample": "eventgridevent"}, - event_type="Sample.EventGrid.Event", - data_version="2.0" - ) + subject="sample", + data={"sample": "eventgridevent"}, + event_type="Sample.EventGrid.Event", + data_version="2.0", + ) with pytest.raises(ValueError): client.send(eg_event) @@ -69,17 +72,17 @@ def test_send_event_grid_event_fails_without_full_url(self,eventgrid_topic_endpo def test_send_event_grid_event_data_as_list(self, eventgrid_topic_endpoint): client = self.create_eg_publisher_client(eventgrid_topic_endpoint) eg_event1 = EventGridEvent( - subject="sample", - data=u"eventgridevent", - event_type="Sample.EventGrid.Event", - data_version="2.0" - ) + subject="sample", + data="eventgridevent", + event_type="Sample.EventGrid.Event", + data_version="2.0", + ) eg_event2 = EventGridEvent( - subject="sample2", - data=u"eventgridevent2", - event_type="Sample.EventGrid.Event", - data_version="2.0" - ) + subject="sample2", + data="eventgridevent2", + event_type="Sample.EventGrid.Event", + data_version="2.0", + ) client.send([eg_event1, eg_event2]) @EventGridPreparer() @@ -87,11 +90,11 @@ def test_send_event_grid_event_data_as_list(self, eventgrid_topic_endpoint): def test_send_event_grid_event_data_str(self, eventgrid_topic_endpoint): client = self.create_eg_publisher_client(eventgrid_topic_endpoint) eg_event = EventGridEvent( - subject="sample", - data=u"eventgridevent", - event_type="Sample.EventGrid.Event", - data_version="2.0" - ) + subject="sample", + data="eventgridevent", + event_type="Sample.EventGrid.Event", + data_version="2.0", + ) client.send(eg_event) @EventGridPreparer() @@ 
-99,11 +102,11 @@ def test_send_event_grid_event_data_str(self, eventgrid_topic_endpoint): def test_send_event_grid_event_data_bytes(self, eventgrid_topic_endpoint): client = self.create_eg_publisher_client(eventgrid_topic_endpoint) eg_event = EventGridEvent( - subject="sample", - data=b"eventgridevent", - event_type="Sample.EventGrid.Event", - data_version="2.0" - ) + subject="sample", + data=b"eventgridevent", + event_type="Sample.EventGrid.Event", + data_version="2.0", + ) with pytest.raises(TypeError, match="Data in EventGridEvent cannot be bytes*"): client.send(eg_event) @@ -112,12 +115,12 @@ def test_send_event_grid_event_data_bytes(self, eventgrid_topic_endpoint): def test_send_event_grid_event_dict_data_bytes(self, eventgrid_topic_endpoint): client = self.create_eg_publisher_client(eventgrid_topic_endpoint) eg_event = { - "subject":"sample", - "data":b"eventgridevent", - "eventType":"Sample.EventGrid.Event", - "dataVersion":"2.0", - "id": uuid.uuid4(), - "eventTime": datetime.now() + "subject": "sample", + "data": b"eventgridevent", + "eventType": "Sample.EventGrid.Event", + "dataVersion": "2.0", + "id": uuid.uuid4(), + "eventTime": datetime.now(), } with pytest.raises(TypeError, match="Data in EventGridEvent cannot be bytes*"): client.send(eg_event) @@ -127,16 +130,33 @@ def test_send_event_grid_event_dict_data_bytes(self, eventgrid_topic_endpoint): def test_send_event_grid_event_dict_data_dict(self, eventgrid_topic_endpoint): client = self.create_eg_publisher_client(eventgrid_topic_endpoint) eg_event = { - "subject":"sample", - "data":{"key1": "Sample.EventGrid.Event"}, - "eventType":"Sample.EventGrid.Event", - "dataVersion":"2.0", - "id": uuid.uuid4(), - "eventTime": datetime.now() + "subject": "sample", + "data": {"key1": "Sample.EventGrid.Event"}, + "eventType": "Sample.EventGrid.Event", + "dataVersion": "2.0", + "id": uuid.uuid4(), + "eventTime": datetime.now(), } client.send(eg_event) + @pytest.mark.live_test_only + @EventGridPreparer() + def 
test_send_event_grid_namespace(self, **kwargs): + eventgrid_endpoint = kwargs["eventgrid_endpoint"] + eventgrid_topic_name = kwargs["eventgrid_topic_name"] + client = self.create_eg_publisher_client(eventgrid_endpoint, eventgrid_topic_name) + eg_event = { + "subject": "sample", + "data": {"key1": "Sample.EventGrid.Event"}, + "eventType": "Sample.EventGrid.Event", + "dataVersion": "2.0", + "id": uuid.uuid4(), + "eventTime": datetime.now(), + } + with pytest.raises(TypeError): + client.send(eg_event) + ### CLOUD EVENT TESTS @EventGridPreparer() @@ -144,23 +164,46 @@ def test_send_event_grid_event_dict_data_dict(self, eventgrid_topic_endpoint): def test_send_cloud_event_data_dict(self, eventgrid_cloud_event_topic_endpoint): client = self.create_eg_publisher_client(eventgrid_cloud_event_topic_endpoint) cloud_event = CloudEvent( - source = "http://samplesource.dev", - data = {"sample": "cloudevent"}, - type="Sample.Cloud.Event" - ) + source="http://samplesource.dev", + data={"sample": "cloudevent"}, + type="Sample.Cloud.Event", + ) client.send(cloud_event) + @pytest.mark.live_test_only + @EventGridPreparer() + def test_send_cloud_event_data_dict_namespace(self, **kwargs): + eventgrid_endpoint = kwargs["eventgrid_endpoint"] + eventgrid_topic_name = kwargs["eventgrid_topic_name"] + client = self.create_eg_publisher_client(eventgrid_endpoint, eventgrid_topic_name) + cloud_event = CloudEvent( + source="http://samplesource.dev", + data={"sample": "cloudevent"}, + type="Sample.Cloud.Event", + ) + client.send(cloud_event) + + @pytest.mark.live_test_only + @EventGridPreparer() + def test_send_cloud_event_data_channel_name_namespace(self, **kwargs): + eventgrid_endpoint = kwargs["eventgrid_endpoint"] + eventgrid_topic_name = kwargs["eventgrid_topic_name"] + client = self.create_eg_publisher_client(eventgrid_endpoint, eventgrid_topic_name) + cloud_event = CloudEvent( + source="http://samplesource.dev", + data={"sample": "cloudevent"}, + type="Sample.Cloud.Event", + ) + with 
pytest.raises(ValueError): + client.send(cloud_event, channel_name="testchannel") + @pytest.mark.skip("https://github.com/Azure/azure-sdk-for-python/issues/16993") @EventGridPreparer() @recorded_by_proxy def test_send_cloud_event_data_NULL(self, eventgrid_cloud_event_topic_endpoint): client = self.create_eg_publisher_client(eventgrid_cloud_event_topic_endpoint) - cloud_event = CloudEvent( - source = "http://samplesource.dev", - data = NULL, - type="Sample.Cloud.Event" - ) - + cloud_event = CloudEvent(source="http://samplesource.dev", data=NULL, type="Sample.Cloud.Event") + def callback(request): req = json.loads(request.http_request.body) assert req[0].get("data") is None @@ -172,10 +215,10 @@ def callback(request): def test_send_cloud_event_data_base64_using_data(self, eventgrid_cloud_event_topic_endpoint): client = self.create_eg_publisher_client(eventgrid_cloud_event_topic_endpoint) cloud_event = CloudEvent( - source = "http://samplesource.dev", - data = b'cloudevent', - type="Sample.Cloud.Event" - ) + source="http://samplesource.dev", + data=b"cloudevent", + type="Sample.Cloud.Event", + ) def callback(request): req = json.loads(request.http_request.body) @@ -187,21 +230,17 @@ def callback(request): def test_send_cloud_event_fails_on_providing_data_and_b64(self): with pytest.raises(ValueError, match="Unexpected keyword arguments data_base64.*"): cloud_event = CloudEvent( - source = "http://samplesource.dev", - data_base64 = b'cloudevent', - data = "random data", - type="Sample.Cloud.Event" - ) + source="http://samplesource.dev", + data_base64=b"cloudevent", + data="random data", + type="Sample.Cloud.Event", + ) @EventGridPreparer() @recorded_by_proxy def test_send_cloud_event_data_none(self, eventgrid_cloud_event_topic_endpoint): client = self.create_eg_publisher_client(eventgrid_cloud_event_topic_endpoint) - cloud_event = CloudEvent( - source = "http://samplesource.dev", - data = None, - type="Sample.Cloud.Event" - ) + cloud_event = 
CloudEvent(source="http://samplesource.dev", data=None, type="Sample.Cloud.Event") client.send(cloud_event) @EventGridPreparer() @@ -209,10 +248,10 @@ def test_send_cloud_event_data_none(self, eventgrid_cloud_event_topic_endpoint): def test_send_cloud_event_data_str(self, eventgrid_cloud_event_topic_endpoint): client = self.create_eg_publisher_client(eventgrid_cloud_event_topic_endpoint) cloud_event = CloudEvent( - source = "http://samplesource.dev", - data = "cloudevent", - type="Sample.Cloud.Event" - ) + source="http://samplesource.dev", + data="cloudevent", + type="Sample.Cloud.Event", + ) client.send(cloud_event) @EventGridPreparer() @@ -220,10 +259,10 @@ def test_send_cloud_event_data_str(self, eventgrid_cloud_event_topic_endpoint): def test_send_cloud_event_data_bytes(self, eventgrid_cloud_event_topic_endpoint): client = self.create_eg_publisher_client(eventgrid_cloud_event_topic_endpoint) cloud_event = CloudEvent( - source = "http://samplesource.dev", - data = b"cloudevent", - type="Sample.Cloud.Event" - ) + source="http://samplesource.dev", + data=b"cloudevent", + type="Sample.Cloud.Event", + ) client.send(cloud_event) @EventGridPreparer() @@ -231,10 +270,10 @@ def test_send_cloud_event_data_bytes(self, eventgrid_cloud_event_topic_endpoint) def test_send_cloud_event_data_as_list(self, eventgrid_cloud_event_topic_endpoint): client = self.create_eg_publisher_client(eventgrid_cloud_event_topic_endpoint) cloud_event = CloudEvent( - source = "http://samplesource.dev", - data = "cloudevent", - type="Sample.Cloud.Event" - ) + source="http://samplesource.dev", + data="cloudevent", + type="Sample.Cloud.Event", + ) client.send([cloud_event]) @EventGridPreparer() @@ -242,30 +281,27 @@ def test_send_cloud_event_data_as_list(self, eventgrid_cloud_event_topic_endpoin def test_send_cloud_event_data_with_extensions(self, eventgrid_cloud_event_topic_endpoint): client = self.create_eg_publisher_client(eventgrid_cloud_event_topic_endpoint) cloud_event = CloudEvent( - source = 
"http://samplesource.dev", - data = "cloudevent", - type="Sample.Cloud.Event", - extensions={ - 'reasoncode':204, - 'extension':'hello' - } - ) + source="http://samplesource.dev", + data="cloudevent", + type="Sample.Cloud.Event", + extensions={"reasoncode": 204, "extension": "hello"}, + ) client.send([cloud_event]) internal = _cloud_event_to_generated(cloud_event).serialize() - assert 'reasoncode' in internal - assert 'extension' in internal - assert internal['reasoncode'] == 204 + assert "reasoncode" in internal + assert "extension" in internal + assert internal["reasoncode"] == 204 @EventGridPreparer() @recorded_by_proxy def test_send_cloud_event_dict(self, eventgrid_cloud_event_topic_endpoint): client = self.create_eg_publisher_client(eventgrid_cloud_event_topic_endpoint) cloud_event1 = { - "id": "1234", - "source": "http://samplesource.dev", - "specversion": "1.0", - "data": "cloudevent", - "type": "Sample.Cloud.Event" + "id": "1234", + "source": "http://samplesource.dev", + "specversion": "1.0", + "data": "cloudevent", + "type": "Sample.Cloud.Event", } client.send(cloud_event1) @@ -279,11 +315,11 @@ def test_send_signature_credential(self, **kwargs): credential = AzureSasCredential(signature) client = EventGridPublisherClient(eventgrid_topic_endpoint, credential) eg_event = EventGridEvent( - subject="sample", - data={"sample": "eventgridevent"}, - event_type="Sample.EventGrid.Event", - data_version="2.0" - ) + subject="sample", + data={"sample": "eventgridevent"}, + event_type="Sample.EventGrid.Event", + data_version="2.0", + ) client.send(eg_event) @EventGridPreparer() @@ -291,19 +327,19 @@ def test_send_signature_credential(self, **kwargs): def test_send_NONE_credential(self, eventgrid_topic_endpoint): with pytest.raises(ValueError, match="Parameter 'self._credential' must not be None."): client = EventGridPublisherClient(eventgrid_topic_endpoint, None) - + @EventGridPreparer() @recorded_by_proxy def test_send_custom_schema_event(self, 
eventgrid_custom_event_topic_endpoint): client = self.create_eg_publisher_client(eventgrid_custom_event_topic_endpoint) custom_event = { - "customSubject": "sample", - "customEventType": "sample.event", - "customDataVersion": "2.0", - "customId": "1234", - "customEventTime": dt.datetime.now(UTC()).isoformat(), - "customData": "sample data" - } + "customSubject": "sample", + "customEventType": "sample.event", + "customDataVersion": "2.0", + "customId": "1234", + "customEventTime": dt.datetime.now(UTC()).isoformat(), + "customData": "sample data", + } client.send(custom_event) @EventGridPreparer() @@ -311,26 +347,29 @@ def test_send_custom_schema_event(self, eventgrid_custom_event_topic_endpoint): def test_send_custom_schema_event_as_list(self, eventgrid_custom_event_topic_endpoint): client = self.create_eg_publisher_client(eventgrid_custom_event_topic_endpoint) custom_event1 = { - "customSubject": "sample", - "customEventType": "sample.event", - "customDataVersion": "2.0", - "customId": "1234", - "customEventTime": dt.datetime.now(UTC()).isoformat(), - "customData": "sample data" - } + "customSubject": "sample", + "customEventType": "sample.event", + "customDataVersion": "2.0", + "customId": "1234", + "customEventTime": dt.datetime.now(UTC()).isoformat(), + "customData": "sample data", + } custom_event2 = { - "customSubject": "sample2", - "customEventType": "sample.event", - "customDataVersion": "2.0", - "customId": "12345", - "customEventTime": dt.datetime.now(UTC()).isoformat(), - "customData": "sample data 2" - } + "customSubject": "sample2", + "customEventType": "sample.event", + "customDataVersion": "2.0", + "customId": "12345", + "customEventTime": dt.datetime.now(UTC()).isoformat(), + "customData": "sample data 2", + } client.send([custom_event1, custom_event2]) def test_send_throws_with_bad_credential(self): bad_credential = "I am a bad credential" - with pytest.raises(ValueError, match="The provided credential should be an instance of a TokenCredential, 
AzureSasCredential or AzureKeyCredential"): + with pytest.raises( + ValueError, + match="The provided credential should be an instance of a TokenCredential, AzureSasCredential or AzureKeyCredential", + ): client = EventGridPublisherClient("eventgrid_endpoint", bad_credential) @pytest.mark.live_test_only @@ -340,11 +379,11 @@ def test_send_token_credential(self, eventgrid_topic_endpoint): credential = self.get_credential(EventGridPublisherClient) client = EventGridPublisherClient(eventgrid_topic_endpoint, credential) eg_event = EventGridEvent( - subject="sample", - data={"sample": "eventgridevent"}, - event_type="Sample.EventGrid.Event", - data_version="2.0" - ) + subject="sample", + data={"sample": "eventgridevent"}, + event_type="Sample.EventGrid.Event", + data_version="2.0", + ) client.send(eg_event) @pytest.mark.live_test_only @@ -353,11 +392,11 @@ def test_send_token_credential(self, eventgrid_topic_endpoint): def test_send_partner_namespace(self, eventgrid_partner_namespace_topic_endpoint, eventgrid_partner_channel_name): client = self.create_eg_publisher_client(eventgrid_partner_namespace_topic_endpoint) cloud_event = CloudEvent( - source = "http://samplesource.dev", - data = "cloudevent", - type="Sample.Cloud.Event" - ) - + source="http://samplesource.dev", + data="cloudevent", + type="Sample.Cloud.Event", + ) + def callback(request): req = request.http_request.headers assert req.get("aeg-channel-name") == eventgrid_partner_channel_name diff --git a/sdk/eventgrid/azure-eventgrid/tests/test_eg_publisher_client_async.py b/sdk/eventgrid/azure-eventgrid/tests/test_eg_publisher_client_async.py index e4cbddb43008..760e7e81e9d6 100644 --- a/sdk/eventgrid/azure-eventgrid/tests/test_eg_publisher_client_async.py +++ b/sdk/eventgrid/azure-eventgrid/tests/test_eg_publisher_client_async.py @@ -1,8 +1,8 @@ -#------------------------------------------------------------------------- +# ------------------------------------------------------------------------- # Copyright (c) 
Microsoft Corporation. All rights reserved. # Licensed under the MIT License. See License.txt in the project root for # license information. -#-------------------------------------------------------------------------- +# -------------------------------------------------------------------------- import logging import asyncio @@ -18,22 +18,20 @@ from devtools_testutils import AzureRecordedTestCase from devtools_testutils.aio import recorded_by_proxy_async -from azure.core.credentials import AzureSasCredential +from azure.core.credentials import AzureSasCredential, AzureKeyCredential from azure.core.messaging import CloudEvent from azure.core.serialization import NULL from azure.eventgrid import EventGridEvent, generate_sas from azure.eventgrid.aio import EventGridPublisherClient -from azure.eventgrid._helpers import _cloud_event_to_generated +from azure.eventgrid._legacy._helpers import _cloud_event_to_generated -from eventgrid_preparer import ( - EventGridPreparer -) +from eventgrid_preparer import EventGridPreparer class TestEventGridPublisherClient(AzureRecordedTestCase): - def create_eg_publisher_client(self, endpoint): + def create_eg_publisher_client(self, endpoint, topic=None): credential = self.get_credential(EventGridPublisherClient, is_async=True) - client = self.create_client_from_credential(EventGridPublisherClient, credential=credential, endpoint=endpoint) + client = self.create_client_from_credential(EventGridPublisherClient, credential=credential, endpoint=endpoint, namespace_topic=topic) return client @EventGridPreparer() @@ -42,31 +40,30 @@ def create_eg_publisher_client(self, endpoint): async def test_send_event_grid_event_data_dict(self, eventgrid_topic_endpoint): client = self.create_eg_publisher_client(eventgrid_topic_endpoint) eg_event = EventGridEvent( - subject="sample", - data={"sample": "eventgridevent"}, - event_type="Sample.EventGrid.Event", - data_version="2.0" - ) + subject="sample", + data={"sample": "eventgridevent"}, + 
event_type="Sample.EventGrid.Event", + data_version="2.0", + ) await client.send(eg_event) - @EventGridPreparer() @recorded_by_proxy_async @pytest.mark.asyncio async def test_send_event_grid_event_data_as_list(self, eventgrid_topic_endpoint): client = self.create_eg_publisher_client(eventgrid_topic_endpoint) eg_event1 = EventGridEvent( - subject="sample", - data="eventgridevent", - event_type="Sample.EventGrid.Event", - data_version="2.0" - ) + subject="sample", + data="eventgridevent", + event_type="Sample.EventGrid.Event", + data_version="2.0", + ) eg_event2 = EventGridEvent( - subject="sample2", - data="eventgridevent2", - event_type="Sample.EventGrid.Event", - data_version="2.0" - ) + subject="sample2", + data="eventgridevent2", + event_type="Sample.EventGrid.Event", + data_version="2.0", + ) await client.send([eg_event1, eg_event2]) @EventGridPreparer() @@ -77,11 +74,11 @@ async def test_send_event_grid_event_fails_without_full_url(self, eventgrid_topi parsed_url = urlparse(eventgrid_topic_endpoint) client = EventGridPublisherClient(parsed_url.netloc, credential) eg_event = EventGridEvent( - subject="sample", - data={"sample": "eventgridevent"}, - event_type="Sample.EventGrid.Event", - data_version="2.0" - ) + subject="sample", + data={"sample": "eventgridevent"}, + event_type="Sample.EventGrid.Event", + data_version="2.0", + ) with pytest.raises(ValueError): await client.send(eg_event) @@ -91,11 +88,11 @@ async def test_send_event_grid_event_fails_without_full_url(self, eventgrid_topi async def test_send_event_grid_event_data_str(self, eventgrid_topic_endpoint): client = self.create_eg_publisher_client(eventgrid_topic_endpoint) eg_event = EventGridEvent( - subject="sample", - data="eventgridevent", - event_type="Sample.EventGrid.Event", - data_version="2.0" - ) + subject="sample", + data="eventgridevent", + event_type="Sample.EventGrid.Event", + data_version="2.0", + ) await client.send(eg_event) @EventGridPreparer() @@ -104,11 +101,11 @@ async def 
test_send_event_grid_event_data_str(self, eventgrid_topic_endpoint): async def test_send_event_grid_event_data_bytes(self, eventgrid_topic_endpoint): client = self.create_eg_publisher_client(eventgrid_topic_endpoint) eg_event = EventGridEvent( - subject="sample", - data=b"eventgridevent", - event_type="Sample.EventGrid.Event", - data_version="2.0" - ) + subject="sample", + data=b"eventgridevent", + event_type="Sample.EventGrid.Event", + data_version="2.0", + ) with pytest.raises(TypeError, match="Data in EventGridEvent cannot be bytes*"): await client.send(eg_event) @@ -118,28 +115,51 @@ async def test_send_event_grid_event_data_bytes(self, eventgrid_topic_endpoint): async def test_send_event_grid_event_dict_data_bytes(self, eventgrid_topic_endpoint): client = self.create_eg_publisher_client(eventgrid_topic_endpoint) eg_event = { - "subject":"sample", - "data":b"eventgridevent", - "eventType":"Sample.EventGrid.Event", - "dataVersion":"2.0", - "id": "123-ddf-133-324255ffd", - "eventTime": dt.datetime.utcnow() + "subject": "sample", + "data": b"eventgridevent", + "eventType": "Sample.EventGrid.Event", + "dataVersion": "2.0", + "id": "123-ddf-133-324255ffd", + "eventTime": dt.datetime.utcnow(), } with pytest.raises(TypeError, match="Data in EventGridEvent cannot be bytes*"): await client.send(eg_event) + @pytest.mark.live_test_only + @EventGridPreparer() + @pytest.mark.asyncio + async def test_send_event_grid_namespace(self, **kwargs): + eventgrid_endpoint = kwargs["eventgrid_endpoint"] + eventgrid_topic_name = kwargs["eventgrid_topic_name"] + client = self.create_eg_publisher_client(eventgrid_endpoint, eventgrid_topic_name) + eg_event = { + "subject": "sample", + "data": {"key1": "Sample.EventGrid.Event"}, + "eventType": "Sample.EventGrid.Event", + "dataVersion": "2.0", + "id": "id-1234", + "eventTime": dt.datetime.now(), + } + with pytest.raises(TypeError): + await client.send(eg_event) + @EventGridPreparer() @recorded_by_proxy_async @pytest.mark.asyncio async def 
test_send_cloud_event_data_dict(self, eventgrid_cloud_event_topic_endpoint): client = self.create_eg_publisher_client(eventgrid_cloud_event_topic_endpoint) cloud_event = CloudEvent( - source = "http://samplesource.dev", - data = {"sample": "cloudevent"}, - type="Sample.Cloud.Event" - ) + source="http://samplesource.dev", data={"sample": "cloudevent"}, type="Sample.Cloud.Event" + ) await client.send(cloud_event) + @EventGridPreparer() + @recorded_by_proxy_async + @pytest.mark.asyncio + async def test_send_cloud_event_data_str(self, eventgrid_cloud_event_topic_endpoint): + client = self.create_eg_publisher_client(eventgrid_cloud_event_topic_endpoint) + cloud_event = CloudEvent(source="http://samplesource.dev", data="cloudevent", type="Sample.Cloud.Event") + await client.send(cloud_event) @EventGridPreparer() @recorded_by_proxy_async @@ -147,10 +167,24 @@ async def test_send_cloud_event_data_dict(self, eventgrid_cloud_event_topic_endp async def test_send_cloud_event_data_str(self, eventgrid_cloud_event_topic_endpoint): client = self.create_eg_publisher_client(eventgrid_cloud_event_topic_endpoint) cloud_event = CloudEvent( - source = "http://samplesource.dev", - data = "cloudevent", - type="Sample.Cloud.Event" - ) + source="http://samplesource.dev", + data="cloudevent", + type="Sample.Cloud.Event", + ) + await client.send(cloud_event) + + @pytest.mark.live_test_only + @EventGridPreparer() + @pytest.mark.asyncio + async def test_send_cloud_event_namespace(self, **kwargs): + eventgrid_endpoint = kwargs["eventgrid_endpoint"] + eventgrid_topic_name = kwargs["eventgrid_topic_name"] + client = self.create_eg_publisher_client(eventgrid_endpoint, eventgrid_topic_name) + cloud_event = CloudEvent( + source="http://samplesource.dev", + data="cloudevent", + type="Sample.Cloud.Event", + ) await client.send(cloud_event) @EventGridPreparer() @@ -159,10 +193,10 @@ async def test_send_cloud_event_data_str(self, eventgrid_cloud_event_topic_endpo async def 
test_send_cloud_event_data_bytes(self, eventgrid_cloud_event_topic_endpoint): client = self.create_eg_publisher_client(eventgrid_cloud_event_topic_endpoint) cloud_event = CloudEvent( - source = "http://samplesource.dev", - data = b"cloudevent", - type="Sample.Cloud.Event" - ) + source="http://samplesource.dev", + data=b"cloudevent", + type="Sample.Cloud.Event", + ) await client.send(cloud_event) @EventGridPreparer() @@ -171,33 +205,28 @@ async def test_send_cloud_event_data_bytes(self, eventgrid_cloud_event_topic_end async def test_send_cloud_event_data_as_list(self, eventgrid_cloud_event_topic_endpoint): client = self.create_eg_publisher_client(eventgrid_cloud_event_topic_endpoint) cloud_event = CloudEvent( - source = "http://samplesource.dev", - data = "cloudevent", - type="Sample.Cloud.Event" - ) + source="http://samplesource.dev", + data="cloudevent", + type="Sample.Cloud.Event", + ) await client.send([cloud_event]) - @EventGridPreparer() @recorded_by_proxy_async @pytest.mark.asyncio async def test_send_cloud_event_data_with_extensions(self, eventgrid_cloud_event_topic_endpoint): client = self.create_eg_publisher_client(eventgrid_cloud_event_topic_endpoint) cloud_event = CloudEvent( - source = "http://samplesource.dev", - data = "cloudevent", - type="Sample.Cloud.Event", - extensions={ - 'reasoncode':204, - 'extension':'hello' - } - ) + source="http://samplesource.dev", + data="cloudevent", + type="Sample.Cloud.Event", + extensions={"reasoncode": 204, "extension": "hello"}, + ) await client.send([cloud_event]) internal = _cloud_event_to_generated(cloud_event).serialize() - assert 'reasoncode' in internal - assert 'extension' in internal - assert internal['reasoncode'] == 204 - + assert "reasoncode" in internal + assert "extension" in internal + assert internal["reasoncode"] == 204 @EventGridPreparer() @recorded_by_proxy_async @@ -205,24 +234,35 @@ async def test_send_cloud_event_data_with_extensions(self, eventgrid_cloud_event async def 
test_send_cloud_event_dict(self, eventgrid_cloud_event_topic_endpoint): client = self.create_eg_publisher_client(eventgrid_cloud_event_topic_endpoint) cloud_event1 = { - "id": "1234", - "source": "http://samplesource.dev", - "specversion": "1.0", - "data": "cloudevent", - "type": "Sample.Cloud.Event" + "id": "1234", + "source": "http://samplesource.dev", + "specversion": "1.0", + "data": "cloudevent", + "type": "Sample.Cloud.Event", } await client.send(cloud_event1) + @pytest.mark.live_test_only + @EventGridPreparer() + @pytest.mark.asyncio + async def test_send_cloud_event_channel_name_namespace(self, **kwargs): + eventgrid_endpoint = kwargs["eventgrid_endpoint"] + eventgrid_topic_name = kwargs["eventgrid_topic_name"] + client = self.create_eg_publisher_client(eventgrid_endpoint, eventgrid_topic_name) + cloud_event = CloudEvent( + source="http://samplesource.dev", + data="cloudevent", + type="Sample.Cloud.Event", + ) + with pytest.raises(ValueError): + await client.send(cloud_event, channel_name="testchannel") + @EventGridPreparer() @recorded_by_proxy_async @pytest.mark.asyncio async def test_send_cloud_event_data_none(self, eventgrid_cloud_event_topic_endpoint): client = self.create_eg_publisher_client(eventgrid_cloud_event_topic_endpoint) - cloud_event = CloudEvent( - source = "http://samplesource.dev", - data = None, - type="Sample.Cloud.Event" - ) + cloud_event = CloudEvent(source="http://samplesource.dev", data=None, type="Sample.Cloud.Event") await client.send(cloud_event) @pytest.mark.skip("https://github.com/Azure/azure-sdk-for-python/issues/16993") @@ -231,11 +271,8 @@ async def test_send_cloud_event_data_none(self, eventgrid_cloud_event_topic_endp @pytest.mark.asyncio async def test_send_cloud_event_data_NULL(self, eventgrid_cloud_event_topic_endpoint): client = self.create_eg_publisher_client(eventgrid_cloud_event_topic_endpoint) - cloud_event = CloudEvent( - source = "http://samplesource.dev", - data = NULL, - type="Sample.Cloud.Event" - ) + cloud_event
= CloudEvent(source="http://samplesource.dev", data=NULL, type="Sample.Cloud.Event") + def callback(request): req = json.loads(request.http_request.body) assert req[0].get("data") is None @@ -253,51 +290,49 @@ async def test_send_signature_credential(self, **kwargs): credential = AzureSasCredential(signature) client = EventGridPublisherClient(eventgrid_topic_endpoint, credential) eg_event = EventGridEvent( - subject="sample", - data={"sample": "eventgridevent"}, - event_type="Sample.EventGrid.Event", - data_version="2.0" - ) + subject="sample", + data={"sample": "eventgridevent"}, + event_type="Sample.EventGrid.Event", + data_version="2.0", + ) await client.send(eg_event) - @EventGridPreparer() @recorded_by_proxy_async @pytest.mark.asyncio async def test_send_custom_schema_event(self, eventgrid_custom_event_topic_endpoint): client = self.create_eg_publisher_client(eventgrid_custom_event_topic_endpoint) custom_event = { - "customSubject": "sample", - "customEventType": "sample.event", - "customDataVersion": "2.0", - "customId": "1234", - "customEventTime": dt.datetime.now(UTC()).isoformat(), - "customData": "sample data" - } + "customSubject": "sample", + "customEventType": "sample.event", + "customDataVersion": "2.0", + "customId": "1234", + "customEventTime": dt.datetime.now(UTC()).isoformat(), + "customData": "sample data", + } await client.send(custom_event) - @EventGridPreparer() @recorded_by_proxy_async @pytest.mark.asyncio async def test_send_custom_schema_event_as_list(self, eventgrid_custom_event_topic_endpoint): client = self.create_eg_publisher_client(eventgrid_custom_event_topic_endpoint) custom_event1 = { - "customSubject": "sample", - "customEventType": "sample.event", - "customDataVersion": "2.0", - "customId": "1234", - "customEventTime": dt.datetime.now(UTC()).isoformat(), - "customData": "sample data" - } + "customSubject": "sample", + "customEventType": "sample.event", + "customDataVersion": "2.0", + "customId": "1234", + "customEventTime": 
dt.datetime.now(UTC()).isoformat(),
+            "customData": "sample data",
+        }
         custom_event2 = {
-            "customSubject": "sample2",
-            "customEventType": "sample.event",
-            "customDataVersion": "2.0",
-            "customId": "12345",
-            "customEventTime": dt.datetime.now(UTC()).isoformat(),
-            "customData": "sample data 2"
-        }
+            "customSubject": "sample2",
+            "customEventType": "sample.event",
+            "customDataVersion": "2.0",
+            "customId": "12345",
+            "customEventTime": dt.datetime.now(UTC()).isoformat(),
+            "customData": "sample data 2",
+        }
         await client.send([custom_event1, custom_event2])

     @EventGridPreparer()
@@ -305,18 +340,18 @@ async def test_send_custom_schema_event_as_list(self, eventgrid_custom_event_top
     @pytest.mark.asyncio
     async def test_send_and_close_async_session(self, eventgrid_cloud_event_topic_endpoint):
         client = self.create_eg_publisher_client(eventgrid_cloud_event_topic_endpoint)
-        async with client: # this throws if client can't close
+        async with client:  # this throws if client can't close
             cloud_event = CloudEvent(
-                source = "http://samplesource.dev",
-                data = "cloudevent",
-                type="Sample.Cloud.Event"
-            )
+                source="http://samplesource.dev",
+                data="cloudevent",
+                type="Sample.Cloud.Event",
+            )
             await client.send(cloud_event)

     @pytest.mark.skip()
     @EventGridPreparer()
     @recorded_by_proxy_async
-    def test_send_NONE_credential_async(self, eventgrid_topic_endpoint):
+    def test_send_NONE_credential(self, eventgrid_topic_endpoint):
         with pytest.raises(ValueError, match="Parameter 'self._credential' must not be None."):
             client = EventGridPublisherClient(eventgrid_topic_endpoint, None)

@@ -328,24 +363,27 @@ async def test_send_token_credential(self, eventgrid_topic_endpoint):
         credential = self.get_credential(EventGridPublisherClient)
         client = EventGridPublisherClient(eventgrid_topic_endpoint, credential)
         eg_event = EventGridEvent(
-            subject="sample",
-            data={"sample": "eventgridevent"},
-            event_type="Sample.EventGrid.Event",
-            data_version="2.0"
-        )
+            subject="sample",
+            data={"sample": "eventgridevent"},
+            event_type="Sample.EventGrid.Event",
+            data_version="2.0",
+        )
         await client.send(eg_event)

     @pytest.mark.live_test_only
     @EventGridPreparer()
     @recorded_by_proxy_async
     @pytest.mark.asyncio
-    async def test_send_partner_namespace(self, eventgrid_partner_namespace_topic_endpoint, eventgrid_partner_channel_name):
+    async def test_send_partner_namespace(
+        self, eventgrid_partner_namespace_topic_endpoint, eventgrid_partner_channel_name
+    ):
         client = self.create_eg_publisher_client(eventgrid_partner_namespace_topic_endpoint)
         cloud_event = CloudEvent(
-            source = "http://samplesource.dev",
-            data = "cloudevent",
-            type="Sample.Cloud.Event"
-        )
+            source="http://samplesource.dev",
+            data="cloudevent",
+            type="Sample.Cloud.Event",
+        )
+
         def callback(request):
             req = request.http_request.headers
             assert req.get("aeg-channel-name") == eventgrid_partner_channel_name
diff --git a/sdk/eventgrid/azure-eventgrid/tests/test_exceptions.py b/sdk/eventgrid/azure-eventgrid/tests/test_exceptions.py
index 20ccd66d2d56..c6a6686b85b9 100644
--- a/sdk/eventgrid/azure-eventgrid/tests/test_exceptions.py
+++ b/sdk/eventgrid/azure-eventgrid/tests/test_exceptions.py
@@ -1,8 +1,8 @@
-#-------------------------------------------------------------------------
+# -------------------------------------------------------------------------
 # Copyright (c) Microsoft Corporation. All rights reserved.
 # Licensed under the MIT License. See License.txt in the project root for
 # license information.
-#--------------------------------------------------------------------------
+# --------------------------------------------------------------------------

 import os
 import json
@@ -21,6 +21,7 @@
     EventGridPreparer,
 )

+
 class TestEventGridPublisherClientExceptions(AzureMgmtRecordedTestCase):
     def create_eg_publisher_client(self, endpoint):
         credential = self.get_credential(EventGridPublisherClient)
@@ -34,12 +35,15 @@ def test_raise_on_auth_error(self, **kwargs):
         akc_credential = AzureKeyCredential("bad credential")
         client = EventGridPublisherClient(eventgrid_topic_endpoint, akc_credential)
         eg_event = EventGridEvent(
-            subject="sample",
-            data={"sample": "eventgridevent"},
-            event_type="Sample.EventGrid.Event",
-            data_version="2.0"
-        )
-        with pytest.raises(ClientAuthenticationError, match="The request authorization key is not authorized for*"):
+            subject="sample",
+            data={"sample": "eventgridevent"},
+            event_type="Sample.EventGrid.Event",
+            data_version="2.0",
+        )
+        with pytest.raises(
+            ClientAuthenticationError,
+            match="The request authorization key is not authorized for*",
+        ):
             client.send(eg_event)

     @pytest.mark.skip("Fix during MQ - skip to unblock pipeline")
@@ -48,13 +52,12 @@ def test_raise_on_auth_error(self, **kwargs):
     def test_raise_on_bad_resource(self, **kwargs):
         eventgrid_topic_key = kwargs.pop("eventgrid_topic_key")
         akc_credential = AzureKeyCredential(eventgrid_topic_key)
-        client = EventGridPublisherClient("https://bad-resource.westus-1.eventgrid.azure.net/api/events", akc_credential)
+        client = EventGridPublisherClient(
+            "https://bad-resource.westus-1.eventgrid.azure.net/api/events", akc_credential
+        )
         eg_event = EventGridEvent(
-            subject="sample",
-            data={"sample": "eventgridevent"},
-            event_type="Sample.EventGrid.Event",
-            data_version="2.0"
-        )
+            subject="sample", data={"sample": "eventgridevent"}, event_type="Sample.EventGrid.Event", data_version="2.0"
+        )
         with pytest.raises(HttpResponseError):
             client.send(eg_event)

@@ -63,15 +66,15 @@ def test_raise_on_bad_resource(self, **kwargs):
     def test_raise_on_large_payload(self, eventgrid_topic_endpoint):
         client = self.create_eg_publisher_client(eventgrid_topic_endpoint)
-        path = os.path.abspath(os.path.join(os.path.abspath(__file__), "..", "./large_data.json"))
+        path = os.path.abspath(os.path.join(os.path.abspath(__file__), "..", "./large_data.json"))
         with open(path) as json_file:
             data = json.load(json_file)
         eg_event = EventGridEvent(
-            subject="sample",
-            data=data,
-            event_type="Sample.EventGrid.Event",
-            data_version="2.0"
-        )
+            subject="sample",
+            data=data,
+            event_type="Sample.EventGrid.Event",
+            data_version="2.0",
+        )
         with pytest.raises(HttpResponseError) as err:
             client.send(eg_event)
         assert "The maximum size (1536000) has been exceeded." in err.value.message
diff --git a/sdk/eventgrid/azure-eventgrid/tests/test_exceptions_async.py b/sdk/eventgrid/azure-eventgrid/tests/test_exceptions_async.py
index 9d6ee172d99d..915e28bc2205 100644
--- a/sdk/eventgrid/azure-eventgrid/tests/test_exceptions_async.py
+++ b/sdk/eventgrid/azure-eventgrid/tests/test_exceptions_async.py
@@ -1,8 +1,8 @@
-#-------------------------------------------------------------------------
+# -------------------------------------------------------------------------
 # Copyright (c) Microsoft Corporation. All rights reserved.
 # Licensed under the MIT License. See License.txt in the project root for
 # license information.
-#--------------------------------------------------------------------------
+# --------------------------------------------------------------------------

 import os
 import json
@@ -23,6 +23,7 @@
     EventGridPreparer,
 )

+
 class TestEventGridPublisherClientExceptionsAsync(AzureRecordedTestCase):
     def create_eg_publisher_client(self, endpoint):
         credential = self.get_credential(EventGridPublisherClient, is_async=True)
@@ -37,12 +38,15 @@ async def test_raise_on_auth_error(self, **kwargs):
         akc_credential = AzureKeyCredential("bad credential")
         client = EventGridPublisherClient(eventgrid_topic_endpoint, akc_credential)
         eg_event = EventGridEvent(
-            subject="sample",
-            data={"sample": "eventgridevent"},
-            event_type="Sample.EventGrid.Event",
-            data_version="2.0"
-        )
-        with pytest.raises(ClientAuthenticationError, match="The request authorization key is not authorized for*"):
+            subject="sample",
+            data={"sample": "eventgridevent"},
+            event_type="Sample.EventGrid.Event",
+            data_version="2.0",
+        )
+        with pytest.raises(
+            ClientAuthenticationError,
+            match="The request authorization key is not authorized for*",
+        ):
             await client.send(eg_event)

     @pytest.mark.skip("Fix during MQ - skip to unblock pipeline")
@@ -52,13 +56,15 @@ async def test_raise_on_auth_error(self, **kwargs):
     async def test_raise_on_bad_resource(self, **kwargs):
         eventgrid_topic_key = kwargs.pop("eventgrid_topic_key")
         akc_credential = AzureKeyCredential(eventgrid_topic_key)
-        client = EventGridPublisherClient("https://bad-resource.westus-1.eventgrid.azure.net/api/events", akc_credential)
+        client = EventGridPublisherClient(
+            "https://bad-resource.westus-1.eventgrid.azure.net/api/events", akc_credential
+        )
         eg_event = EventGridEvent(
-            subject="sample",
-            data={"sample": "eventgridevent"},
-            event_type="Sample.EventGrid.Event",
-            data_version="2.0"
-        )
+            subject="sample",
+            data={"sample": "eventgridevent"},
+            event_type="Sample.EventGrid.Event",
+            data_version="2.0",
+        )
         with pytest.raises(HttpResponseError):
             await client.send(eg_event)

@@ -68,15 +74,15 @@ async def test_raise_on_bad_resource(self, **kwargs):
     async def test_raise_on_large_payload(self, eventgrid_topic_endpoint):
         client = self.create_eg_publisher_client(eventgrid_topic_endpoint)
-        path = os.path.abspath(os.path.join(os.path.abspath(__file__), "..", "./large_data.json"))
+        path = os.path.abspath(os.path.join(os.path.abspath(__file__), "..", "./large_data.json"))
         with open(path) as json_file:
             data = json.load(json_file)
         eg_event = EventGridEvent(
-            subject="sample",
-            data=data,
-            event_type="Sample.EventGrid.Event",
-            data_version="2.0"
-        )
+            subject="sample",
+            data=data,
+            event_type="Sample.EventGrid.Event",
+            data_version="2.0",
+        )
         with pytest.raises(HttpResponseError) as err:
             await client.send(eg_event)
         assert "The maximum size (1536000) has been exceeded." in err.value.message
diff --git a/sdk/eventgrid/azure-eventgrid/tests/test_serialization.py b/sdk/eventgrid/azure-eventgrid/tests/test_serialization.py
index 6c7fc1d79913..cebc392e4dea 100644
--- a/sdk/eventgrid/azure-eventgrid/tests/test_serialization.py
+++ b/sdk/eventgrid/azure-eventgrid/tests/test_serialization.py
@@ -1,108 +1,108 @@
-#-------------------------------------------------------------------------
+# -------------------------------------------------------------------------
 # Copyright (c) Microsoft Corporation. All rights reserved.
 # Licensed under the MIT License. See License.txt in the project root for
 # license information.
-#--------------------------------------------------------------------------
+# --------------------------------------------------------------------------
 import sys
 import pytest
 import base64

 from azure.core.messaging import CloudEvent
-from azure.eventgrid._helpers import _cloud_event_to_generated
+from azure.eventgrid._legacy._helpers import _cloud_event_to_generated
 from azure.eventgrid import SystemEventNames, EventGridEvent

 from _mocks import (
     cloud_storage_dict,
     cloud_storage_string,
     cloud_storage_bytes,
-    )
+)
+

 class EventGridSerializationTests:
     def _assert_cloud_event_serialized(self, expected, actual):
-        assert expected['source'] == actual['source']
-        assert expected['type'] == actual['type']
-        assert actual['specversion'] == '1.0'
-        assert 'id' in actual
-        assert 'time' in actual
+        assert expected["source"] == actual["source"]
+        assert expected["type"] == actual["type"]
+        assert actual["specversion"] == "1.0"
+        assert "id" in actual
+        assert "time" in actual

     # Cloud Event tests
     def test_cloud_event_serialization_extension_bytes(self, **kwargs):
         data = b"cloudevent"
         cloud_event = CloudEvent(
-            source="http://samplesource.dev",
-            data=data,
-            type="Sample.Cloud.Event",
-            extensions={'e1':1, 'e2':2}
-        )
-
-        cloud_event.subject = "subject" # to test explicit setting of prop
-        encoded = base64.b64encode(data).decode('utf-8')
+            source="http://samplesource.dev",
+            data=data,
+            type="Sample.Cloud.Event",
+            extensions={"e1": 1, "e2": 2},
+        )
+
+        cloud_event.subject = "subject"  # to test explicit setting of prop
+        encoded = base64.b64encode(data).decode("utf-8")

         internal = _cloud_event_to_generated(cloud_event)

         assert internal.additional_properties is not None
-        assert 'e1' in internal.additional_properties
+        assert "e1" in internal.additional_properties

-        json  = internal.serialize()
+        json = internal.serialize()

         expected = {
-            'source':'http://samplesource.dev',
-            'data_base64': encoded,
-            'type':'Sample.Cloud.Event',
-            'reason_code':204,
-            'e1':1,
-            'e2':2
+            "source": "http://samplesource.dev",
+            "data_base64": encoded,
+            "type": "Sample.Cloud.Event",
+            "reason_code": 204,
+            "e1": 1,
+            "e2": 2,
         }

         self._assert_cloud_event_serialized(expected, json)
-        assert expected['data_base64'] == json['data_base64']
-
+        assert expected["data_base64"] == json["data_base64"]

     def test_cloud_event_serialization_extension_string(self, **kwargs):
         data = "cloudevent"
         cloud_event = CloudEvent(
-            source="http://samplesource.dev",
-            data=data,
-            type="Sample.Cloud.Event",
-            extensions={'e1':1, 'e2':2}
-        )
-
-        cloud_event.subject = "subject" # to test explicit setting of prop
+            source="http://samplesource.dev",
+            data=data,
+            type="Sample.Cloud.Event",
+            extensions={"e1": 1, "e2": 2},
+        )
+
+        cloud_event.subject = "subject"  # to test explicit setting of prop

         internal = _cloud_event_to_generated(cloud_event)

         assert internal.additional_properties is not None
-        assert 'e1' in internal.additional_properties
+        assert "e1" in internal.additional_properties

-        json  = internal.serialize()
+        json = internal.serialize()

         expected = {
-            'source':'http://samplesource.dev',
-            'data': data,
-            'type':'Sample.Cloud.Event',
-            'reason_code':204,
-            'e1':1,
-            'e2':2
+            "source": "http://samplesource.dev",
+            "data": data,
+            "type": "Sample.Cloud.Event",
+            "reason_code": 204,
+            "e1": 1,
+            "e2": 2,
         }

         self._assert_cloud_event_serialized(expected, json)
         if sys.version_info > (3, 5):
-            assert expected['data'] == json['data']
+            assert expected["data"] == json["data"]
         else:
-            encoded = base64.b64encode(data).decode('utf-8')
-            expected['data_base64'] = encoded
-            assert expected['data_base64'] == json['data_base64']
-            assert 'data' not in json
+            encoded = base64.b64encode(data).decode("utf-8")
+            expected["data_base64"] = encoded
+            assert expected["data_base64"] == json["data_base64"]
+            assert "data" not in json

     def test_event_grid_event_raises_on_no_data(self):
         with pytest.raises(TypeError):
             eg_event = EventGridEvent(
-                subject="sample",
-                event_type="Sample.EventGrid.Event",
-                data_version="2.0"
-            )
+                subject="sample",
+                event_type="Sample.EventGrid.Event",
+                data_version="2.0",
+            )

     def test_import_from_system_events(self):
-        var = SystemEventNames.AcsChatMemberAddedToThreadWithUserEventName
+        var = SystemEventNames.AcsChatMemberAddedToThreadWithUserEventName
         assert var == "Microsoft.Communication.ChatMemberAddedToThreadWithUser"
         assert SystemEventNames.KeyVaultKeyNearExpiryEventName == "Microsoft.KeyVault.KeyNearExpiry"
         var = SystemEventNames.ServiceBusActiveMessagesAvailableWithNoListenersEventName
@@ -114,17 +114,20 @@ def test_import_from_system_events(self):

     def test_eg_event_repr(self):
         event = EventGridEvent(
-            subject="sample2",
-            data="eventgridevent2",
-            event_type="Sample.EventGrid.Event",
-            data_version="2.0"
-        )
-
+            subject="sample2",
+            data="eventgridevent2",
+            event_type="Sample.EventGrid.Event",
+            data_version="2.0",
+        )
+
         assert "EventGridEvent(subject=sample2" in event.__repr__()

     def test_servicebus_system_events_alias(self):
         val = "Microsoft.ServiceBus.DeadletterMessagesAvailableWithNoListeners"
-        assert SystemEventNames.ServiceBusDeadletterMessagesAvailableWithNoListenerEventName == SystemEventNames.ServiceBusDeadletterMessagesAvailableWithNoListenersEventName
+        assert (
+            SystemEventNames.ServiceBusDeadletterMessagesAvailableWithNoListenerEventName
+            == SystemEventNames.ServiceBusDeadletterMessagesAvailableWithNoListenersEventName
+        )
         assert SystemEventNames.ServiceBusDeadletterMessagesAvailableWithNoListenerEventName == val
         assert SystemEventNames.ServiceBusDeadletterMessagesAvailableWithNoListenersEventName == val
         assert SystemEventNames(val) == SystemEventNames.ServiceBusDeadletterMessagesAvailableWithNoListenerEventName
diff --git a/sdk/eventgrid/azure-eventgrid/tsp-location.yaml b/sdk/eventgrid/azure-eventgrid/tsp-location.yaml
new file mode 100644
index 000000000000..b956ab4af3a0
--- /dev/null
+++ b/sdk/eventgrid/azure-eventgrid/tsp-location.yaml
@@ -0,0 +1,4 @@
+cleanup: false
+commit: 871324004185488ac0beddf5cab7165e7432f317
+directory: specification/eventgrid/Azure.Messaging.EventGrid
+repo: Azure/azure-rest-api-specs
\ No newline at end of file
diff --git a/sdk/eventgrid/test-resources.json b/sdk/eventgrid/test-resources.json
index d5679be5c7a8..d54dbcb65e59 100644
--- a/sdk/eventgrid/test-resources.json
+++ b/sdk/eventgrid/test-resources.json
@@ -23,6 +23,9 @@
     }
   },
   "variables": {
+    "namespaceName": "[format('{0}-2', parameters('baseName'))]",
+    "topicName": "testtopic1",
+    "subscriptionName": "testsubscription1",
     "apiVersion": "2022-06-15",
     "eventGridTopicName": "[concat(parameters('baseName'), 'topic')]",
     "eventGridDomainName": "[concat(parameters('baseName'), 'domain')]",
@@ -35,8 +38,58 @@
     "partnerChannelName": "[concat(parameters('baseName'), 'partner-channel')]",
     "partnerTopicName": "[concat(parameters('baseName'), 'partner-topic')]",
     "eventGridDataSenderRoleId": "d5a91429-5739-47e2-a06b-3470a27159e7",
+    "eventGridDataContributorRoleId": "1d8c3fe3-8864-474b-8749-01e3783e8157"
   },
   "resources": [
+    {
+      "type": "Microsoft.EventGrid/namespaces",
+      "apiVersion": "2024-06-01-preview",
+      "name": "[variables('namespaceName')]",
+      "location": "[resourceGroup().location]",
+      "sku": {
+        "name": "Standard",
+        "capacity": 1
+      },
+      "properties": {
+        "isZoneRedundant": true,
+        "publicNetworkAccess": "Enabled"
+      }
+    },
+    {
+      "type": "Microsoft.EventGrid/namespaces/topics",
+      "apiVersion": "2024-06-01-preview",
+      "name": "[format('{0}/{1}', variables('namespaceName'), variables('topicName'))]",
+      "properties": {
+        "publisherType": "Custom",
+        "inputSchema": "CloudEventSchemaV1_0",
+        "eventRetentionInDays": 1
+      },
+      "dependsOn": [
+        "[resourceId('Microsoft.EventGrid/namespaces', variables('namespaceName'))]"
+      ]
+    },
+    {
+      "type": "Microsoft.EventGrid/namespaces/topics/eventSubscriptions",
+      "apiVersion": "2024-06-01-preview",
+      "name": "[format('{0}/{1}/{2}', variables('namespaceName'), variables('topicName'), variables('subscriptionName'))]",
+      "properties": {
+        "deliveryConfiguration": {
+          "deliveryMode": "Queue",
+          "queue": {
+            "receiveLockDurationInSeconds": 60,
+            "maxDeliveryCount": 10,
+            "eventTimeToLive": "P1D"
+          }
+        },
+        "eventDeliverySchema": "CloudEventSchemaV1_0",
+        "filtersConfiguration": {
+          "includedEventTypes": []
+        }
+      },
+      "dependsOn": [
+        "[resourceId('Microsoft.EventGrid/namespaces/topics', variables('namespaceName'), variables('topicName'))]"
+      ]
+    },
     {
       "type": "Microsoft.EventGrid/topics",
       "apiVersion": "[variables('apiVersion')]",
@@ -166,8 +219,17 @@
         "principalId": "[parameters('testApplicationOid')]",
         "scope": "[resourceGroup().id]"
       }
+    },
+    {
+      "type": "Microsoft.Authorization/roleAssignments",
+      "apiVersion": "2019-04-01-preview",
+      "name": "[guid(resourceGroup().id, parameters('testApplicationOid'), variables('eventGridDataContributorRoleId'))]",
+      "properties": {
+        "roleDefinitionId": "[resourceId('Microsoft.Authorization/roleDefinitions', variables('eventGridDataContributorRoleId'))]",
+        "principalId": "[parameters('testApplicationOid')]",
+        "scope": "[resourceGroup().id]"
+      }
     }
-
   ],
   "outputs": {
     "EVENTGRID_TOPIC_KEY": {
@@ -221,6 +283,30 @@
     "EVENTGRID_PARTNER_CHANNEL_NAME": {
       "type": "string",
       "value": "[variables('partnerChannelName')]"
+    },
+    "EVENTGRID_KEY": {
+      "type": "string",
+      "value": "[listKeys(resourceId('Microsoft.EventGrid/namespaces', variables('namespaceName')), '2024-06-01-preview').key1]"
+    },
+    "EVENTGRID_ENDPOINT": {
+      "type": "string",
+      "value": "[format('https://{0}', reference(resourceId('Microsoft.EventGrid/namespaces', variables('namespaceName')), '2024-06-01-preview').topicsConfiguration.hostname)]"
+    },
+    "EVENTGRID_TOPIC_NAME": {
+      "type": "string",
+      "value": "[variables('topicName')]"
+    },
+    "EVENTGRID_EVENT_SUBSCRIPTION_NAME": {
+      "type": "string",
+      "value": "[variables('subscriptionName')]"
+    },
+    "RESOURCE_GROUP": {
+      "type": "string",
+      "value": "[resourceGroup().name]"
+    },
+    "AZURE_SUBSCRIPTION_ID": {
+      "type": "string",
+      "value": "[subscription().subscriptionId]"
     }
   }
 }
diff --git a/sdk/eventgrid/tests.yml b/sdk/eventgrid/tests.yml
index dbf8b8add6d0..533a10696214 100644
--- a/sdk/eventgrid/tests.yml
+++ b/sdk/eventgrid/tests.yml
@@ -5,6 +5,7 @@ extends:
   parameters:
     ServiceDirectory: eventgrid
     BuildTargetingString: azure-eventgrid*
+    Location: eastus
     UseFederatedAuth: true
     MatrixReplace:
       - TestSamples=.*/true
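Aside: the `test_raise_on_large_payload` tests in this patch rely on the service rejecting batches over 1,536,000 bytes ("The maximum size (1536000) has been exceeded."). A client-side pre-check can avoid that round trip. The sketch below is illustrative only: the helper names are hypothetical, and approximating the on-the-wire size with `json.dumps` is an assumption, not part of the azure-eventgrid SDK.

```python
import json

# Batch size limit asserted by the tests above; treat as an
# approximation of the service-side check, not a contract.
MAX_BATCH_SIZE_BYTES = 1_536_000


def approximate_batch_size(events):
    """Approximate the serialized size of a list of event dicts, in bytes."""
    return len(json.dumps(events).encode("utf-8"))


def fits_in_one_batch(events):
    """Return True if the batch is likely to be accepted in a single send."""
    return approximate_batch_size(events) <= MAX_BATCH_SIZE_BYTES
```

A caller could split an oversized batch into smaller sends when this check fails, rather than catching `HttpResponseError` after the fact.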