Google cloud spanner client fails when reading large columns #3998
Labels: api: spanner (Issues related to the Spanner API.)
Comments
Issue seems fixed by adding
to google.cloud.spanner.streamed
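For background (the exact change referenced in the comment above is not shown): Spanner's streaming API can split a single large value across multiple PartialResultSet messages, flagged via chunked_value, and google.cloud.spanner.streamed is responsible for stitching those pieces back together. A simplified, hypothetical sketch of that kind of merging, not the actual patch:

```python
# Simplified, hypothetical illustration of chunked-value merging; the real
# logic lives in google.cloud.spanner.streamed and covers more cases.
def merge_chunk(pending, chunk):
    """Combine a trailing partial value with the next streamed chunk."""
    if isinstance(pending, str) and isinstance(chunk, str):
        # Chunked strings are concatenated.
        return pending + chunk
    if isinstance(pending, list) and isinstance(chunk, list):
        # For arrays, the boundary elements may themselves be partial
        # (e.g. a long timestamp string split mid-element) and need merging.
        if pending and chunk and isinstance(pending[-1], (str, list)):
            merged = merge_chunk(pending[-1], chunk[0])
            return pending[:-1] + [merged] + chunk[1:]
        # Otherwise the two list pieces are simply concatenated.
        return pending + chunk
    raise ValueError('Cannot merge chunks of types %r and %r'
                     % (type(pending), type(chunk)))

# Example: merge_chunk(['2017-0'], ['9-21T00:00:00Z'])
#          -> ['2017-09-21T00:00:00Z']
```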
cc @tseaver @lukesneeringer
@lukesneeringer @tseaver Do you agree that this looks like a quick fix? If so, let's get it done before Beta. If it looks much more involved, then let's discuss more.
lukesneeringer pushed a commit to lukesneeringer/google-cloud-python that referenced this issue on Sep 21, 2017:
Fixes googleapis#3981 Fixes googleapis#3998 Closes googleapis#4009
tseaver pushed a commit that referenced this issue on Sep 21, 2017
Issue description
Consider the following table:
The table is populated with 100 rows, each containing a large array of 1000 timestamps.
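The original DDL and population code are elided above; a setup along these lines reproduces the shape described. The instance and database names are taken from the gcloud command further down, while the table and column definitions are assumptions rather than the reporter's actual schema:

```python
# Hypothetical setup sketch; the schema and column names are assumptions.
import datetime

from google.cloud import spanner

client = spanner.Client()
instance = client.instance('my-instance')
database = instance.database('my-db')

# Create a table whose rows carry a large ARRAY<TIMESTAMP> column.
database.update_ddl([
    """CREATE TABLE big_arrays (
           id INT64 NOT NULL,
           timestamps ARRAY<TIMESTAMP>
       ) PRIMARY KEY (id)"""
]).result()

# Populate 100 rows, each holding an array of 1000 timestamps.
now = datetime.datetime.now(datetime.timezone.utc)
with database.batch() as batch:
    batch.insert(
        table='big_arrays',
        columns=('id', 'timestamps'),
        values=[(i, [now] * 1000) for i in range(100)],
    )
```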
Trying to fetch these rows with
db.execute_sql('select * from big_arrays limit 100').consume_all()
results in the following error and stack trace:
Running the above query with the gcloud CLI works as intended:
gcloud spanner databases execute-sql my-db --instance=my-instance --sql='select * from big_arrays limit 100' >/dev/null
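A fuller, self-contained version of the failing Python call might look like the sketch below. It uses a snapshot rather than calling execute_sql on the database object directly, and drains the result set by iterating instead of the consume_all() call shown above; the rest is illustrative:

```python
# Sketch of the failing read path; instance and database names come from
# the gcloud command above, everything else is illustrative.
from google.cloud import spanner

client = spanner.Client()
database = client.instance('my-instance').database('my-db')

# Reading the rows back streams the large ARRAY<TIMESTAMP> values in chunks;
# this is where the reported failure occurred.
with database.snapshot() as snapshot:
    results = snapshot.execute_sql('select * from big_arrays limit 100')
    rows = list(results)  # drains the streamed result set
    print('fetched %d rows' % len(rows))
```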