test: enable BYOSA test for remote_function cloud function (#432)
This support was added in [PR#407](https://togithub.com/googleapis/python-bigquery-dataframes/pull/407), but the test was only verified locally, because the project `bigframes-load-testing` is latchkey-managed and would require some extra configuration to set up. This change goes one step further by enabling the test in the automated pipelines, targeting a different project, `bigframes-dev-perf`, which is easier to set up through the Cloud Console. Eventually the test should be moved to run entirely in `bigframes-load-testing` once the necessary configuration has been done through latchkey (internal issue 329339908 was created to track that work).
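
For reference, a minimal sketch of the bring-your-own-service-account usage this test exercises, based on the API calls in the diff below. The project and service account names here are placeholders; the actual test uses the values checked in under `tests/system/large/test_remote_function.py`.

```python
import bigframes

# Placeholder values; the test itself targets the bigframes-dev-perf project
# and a service account pre-created in that project.
project = "my-gcp-project"
gcf_service_account = "my-sa@my-gcp-project.iam.gserviceaccount.com"

# Point the session at the project where the cloud function will be deployed.
session = bigframes.Session(context=bigframes.BigQueryOptions(project=project))

# The cloud function backing the remote function is deployed to run as the
# supplied service account instead of the default one.
@session.remote_function(
    [int], int, reuse=False, cloud_function_service_account=gcf_service_account
)
def square_num(x):
    return x * x
```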

shobsi authored Mar 15, 2024
1 parent 1c3e668 commit 40ddb69
Showing 1 changed file with 18 additions and 8 deletions.
tests/system/large/test_remote_function.py (18 additions, 8 deletions)
@@ -1281,19 +1281,29 @@ def square(x):
     )


-@pytest.mark.skip("This requires additional project config.")
 @pytest.mark.flaky(retries=2, delay=120)
 def test_remote_function_via_session_custom_sa(scalars_dfs):
-    # Set these values to run the test locally
-    # TODO(shobs): Automate and enable this test
-    PROJECT = ""
-    GCF_SERVICE_ACCOUNT = ""
+    # TODO(shobs): Automate the following set-up during testing in the test project.
+    #
+    # For upfront convenience, the following set up has been statically created
+    # in the project bigframes-dev-perf via cloud console:
+    #
+    # 1. Create a service account as per
+    #    https://cloud.google.com/iam/docs/service-accounts-create#iam-service-accounts-create-console
+    # 2. Give necessary roles as per
+    #    https://cloud.google.com/functions/docs/reference/iam/roles#additional-configuration
+    #
+    project = "bigframes-dev-perf"
+    gcf_service_account = (
+        "[email protected]"
+    )

-    rf_session = bigframes.Session(context=bigframes.BigQueryOptions(project=PROJECT))
+    rf_session = bigframes.Session(context=bigframes.BigQueryOptions(project=project))
+
     try:

         @rf_session.remote_function(
-            [int], int, reuse=False, cloud_function_service_account=GCF_SERVICE_ACCOUNT
+            [int], int, reuse=False, cloud_function_service_account=gcf_service_account
         )
         def square_num(x):
             if x is None:
@@ -1316,7 +1326,7 @@ def square_num(x):
         gcf = rf_session.cloudfunctionsclient.get_function(
             name=square_num.bigframes_cloud_function
         )
-        assert gcf.service_config.service_account_email == GCF_SERVICE_ACCOUNT
+        assert gcf.service_config.service_account_email == gcf_service_account
     finally:
         # clean up the gcp assets created for the remote function
         cleanup_remote_function_assets(
