
Query consistently times out #954

Closed

wagnerand opened this issue May 21, 2019 · 4 comments

Issue Summary

A summary of the issue and the browser/OS environment in which it occurs.

Steps to Reproduce

  1. Create a new query for AMO-DB: SELECT guid FROM addons WHERE guid IS NOT NULL;
  2. Execute

After about 30s, the progress changes to "Loading results".
After about 1 minute, execution aborts with "Error running query: failed communicating with server. Please check your Internet connection and try again."

Technical details:

  • Redash Version: sql.telemetry.mozilla.org
  • Browser/OS: Firefox 67 beta, macOS 10.14
  • How did you install Redash: I did not.

jezdez commented May 21, 2019

@jasonthomas Can we see in the Celery log what may cause this?

@jasonthomas (Member)

I didn't see anything specific in the logs; based on the Celery log, it looks like the query was successful:

May 20 22:05:05 ip-172-31-41-157.us-west-2.compute.internal docker-worker-adhoc[32089]: [2019-05-20 22:05:05,555][PID:102][INFO][ForkPoolWorker-66] task=execute_query state=executing_query query_hash=bcc6ad8e71a420c4f9fa342e0e6671ae type=mysql ds_id=25  task_id=60d245aa-cb7a-4b7d-ba98-5a2b55f081e9 queue=queries query_id=adhoc username=awagner
May 20 22:05:33 ip-172-31-41-157.us-west-2.compute.internal docker-worker-adhoc[32089]: [2019-05-20 22:05:33,257][PID:102][INFO][ForkPoolWorker-66] task=execute_query query_hash=bcc6ad8e71a420c4f9fa342e0e6671ae data_length=81117206 error=[None]
May 20 22:05:33 ip-172-31-41-157.us-west-2.compute.internal docker-worker-adhoc[32089]: [2019-05-20 22:05:33,259][PID:102][INFO][ForkPoolWorker-66] Inserted query (bcc6ad8e71a420c4f9fa342e0e6671ae) data; id=None
May 20 22:05:37 ip-172-31-41-157.us-west-2.compute.internal docker-worker-adhoc[32089]: [2019-05-20 22:05:37,948][PID:102][INFO][ForkPoolWorker-66] Updated 0 queries with result (bcc6ad8e71a420c4f9fa342e0e6671ae).
May 20 22:05:37 ip-172-31-41-157.us-west-2.compute.internal docker-worker-adhoc[32089]: [2019-05-20 22:05:37,973][PID:102][INFO][ForkPoolWorker-66] task=execute_query state=checking_alerts query_hash=bcc6ad8e71a420c4f9fa342e0e6671ae type=mysql ds_id=25  task_id=60d245aa-cb7a-4b7d-ba98-5a2b55f081e9 queue=queries query_id=adhoc username=awagne
May 20 22:05:37 ip-172-31-41-157.us-west-2.compute.internal docker-worker-adhoc[32089]: [2019-05-20 22:05:37,973][PID:102][INFO][ForkPoolWorker-66] task=execute_query state=finished query_hash=bcc6ad8e71a420c4f9fa342e0e6671ae type=mysql ds_id=25  task_id=60d245aa-cb7a-4b7d-ba98-5a2b55f081e9 queue=queries query_id=adhoc username=awagner
May 20 22:05:37 ip-172-31-41-157.us-west-2.compute.internal docker-worker-adhoc[32089]: [2019-05-20 22:05:37,975][PID:102][INFO][ForkPoolWorker-66] Task redash.tasks.execute_query[60d245aa-cb7a-4b7d-ba98-5a2b55f081e9] succeeded in 32.4271183042s: 6287624

I also ran this query; it succeeded and I can see the output:
https://sql.telemetry.mozilla.org/queries/62855/source

May 21 18:48:22 ip-172-31-41-157.us-west-2.compute.internal docker-worker-adhoc[32089]: [2019-05-21 18:48:22,657][PID:111][INFO][ForkPoolWorker-75] task=execute_query state=executing_query query_hash=bcc6ad8e71a420c4f9fa342e0e6671ae type=mysql ds_id=25  task_id=dee8eb92-5292-47e1-aa96-8c8507f70f62 queue=queries query_id=adhoc username=jthomas
May 21 18:48:50 ip-172-31-41-157.us-west-2.compute.internal docker-worker-adhoc[32089]: [2019-05-21 18:48:50,222][PID:111][INFO][ForkPoolWorker-75] task=execute_query query_hash=bcc6ad8e71a420c4f9fa342e0e6671ae data_length=81127194 error=[None]
May 21 18:48:50 ip-172-31-41-157.us-west-2.compute.internal docker-worker-adhoc[32089]: [2019-05-21 18:48:50,224][PID:111][INFO][ForkPoolWorker-75] Inserted query (bcc6ad8e71a420c4f9fa342e0e6671ae) data; id=None
May 21 18:48:54 ip-172-31-41-157.us-west-2.compute.internal docker-worker-adhoc[32089]: [2019-05-21 18:48:54,721][PID:111][INFO][ForkPoolWorker-75] Updated 0 queries with result (bcc6ad8e71a420c4f9fa342e0e6671ae).
May 21 18:48:54 ip-172-31-41-157.us-west-2.compute.internal docker-worker-adhoc[32089]: [2019-05-21 18:48:54,896][PID:111][INFO][ForkPoolWorker-75] task=execute_query state=checking_alerts query_hash=bcc6ad8e71a420c4f9fa342e0e6671ae type=mysql ds_id=25  task_id=dee8eb92-5292-47e1-aa96-8c8507f70f62 queue=queries query_id=adhoc username=jthomas
May 21 18:48:54 ip-172-31-41-157.us-west-2.compute.internal docker-worker-adhoc[32089]: [2019-05-21 18:48:54,897][PID:111][INFO][ForkPoolWorker-75] task=execute_query state=finished query_hash=bcc6ad8e71a420c4f9fa342e0e6671ae type=mysql ds_id=25  task_id=dee8eb92-5292-47e1-aa96-8c8507f70f62 queue=queries query_id=adhoc username=jthomas
May 21 18:48:54 ip-172-31-41-157.us-west-2.compute.internal docker-worker-adhoc[32089]: [2019-05-21 18:48:54,899][PID:111][INFO][ForkPoolWorker-75] Task redash.tasks.execute_query[dee8eb92-5292-47e1-aa96-8c8507f70f62] succeeded in 32.24933971s: 6292326

That being said, it took about 33 seconds for each of these Celery tasks to complete and save 1.5M+ rows from AMO-DB. Maybe we are hitting a gunicorn or nginx timeout?
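
For reference, I believe gunicorn's worker timeout defaults to 30 seconds and nginx's proxy_read_timeout to 60 seconds, which roughly matches the 30s / 1min marks reported above. If that turns out to be the cause, raising them would look something like this (upstream name, paths and values are illustrative, not our actual config):

    # nginx: illustrative location block, not our actual config
    location / {
        proxy_pass http://redash;
        proxy_read_timeout 300s;   # default is 60s
        proxy_send_timeout 300s;
    }

    # gunicorn: illustrative invocation; entry point assumed to be redash.wsgi:app
    gunicorn -b 0.0.0.0:5000 --timeout 300 -w 4 redash.wsgi:app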

jasonthomas self-assigned this Jun 5, 2019
@jasonthomas (Member)

Requests like these are going to be slow and may time out due to the volume of rows being returned and stored in Redash's database. I wasn't able to replicate this specific issue, so it is going to be difficult to troubleshoot. Can we limit the number of rows returned and/or aggregate the results here to reduce the amount of data being stored as part of the query results?
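
For example, depending on what the downstream consumer actually needs, something along these lines (purely illustrative) would shrink the result set considerably:

    -- if only the count is needed
    SELECT COUNT(guid) AS guid_count
    FROM addons
    WHERE guid IS NOT NULL;

    -- or cap the number of rows returned
    SELECT guid
    FROM addons
    WHERE guid IS NOT NULL
    LIMIT 100000;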

@wagnerand (Member, Author)

Unfortunately, in this specific case, I can't think of a workaround at the moment. The query is part of a tool that helps us get a list of all blocked add-ons. Since AMO doesn't store the information, we get all add-on blocklist patterns from kinto, get all add-on IDs from AMO and then check every ID against every pattern to see if it matches.
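
To illustrate the kind of matching involved: every GUID has to be tested against every pattern. If the kinto patterns happened to be MySQL-compatible regular expressions, that test could in principle be pushed into the query itself, one pattern at a time, so only matching GUIDs come back; purely hypothetical, and probably not practical, since the patterns live in kinto rather than in AMO-DB:

    -- hypothetical: run once per blocklist pattern fetched from kinto
    SELECT guid
    FROM addons
    WHERE guid IS NOT NULL
      AND guid REGEXP '^some-blocklist-pattern$';   -- placeholder pattern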
