Write SQS consumer looking for buildhub.json files #465
I created an S3 bucket and SQS queue to test that the consumer works. It is in the dev IAM, so @peterbe your dev creds should give you full access to it. region: us-west-2 I uploaded a few random files into S3 to make sure it works. It does. :) Example of CLI access:
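The original CLI snippet did not survive the extraction. A rough boto3 equivalent for poking at the test bucket and queue might look like this; the bucket/queue names are placeholders (the real ones live in the dev IAM), and `object_keys` is a helper invented for this sketch:

```python
# Sketch only: assumes boto3 is installed and dev IAM creds are configured.

def object_keys(list_response):
    """Pull object keys out of a ListObjectsV2-style response dict."""
    return [obj["Key"] for obj in list_response.get("Contents", [])]

def list_test_bucket(bucket, region="us-west-2"):
    """List keys in the S3 test bucket (hits AWS, needs dev creds)."""
    import boto3  # imported lazily so the pure helper above stays importable
    s3 = boto3.client("s3", region_name=region)
    return object_keys(s3.list_objects_v2(Bucket=bucket))

def receive_one(queue_url, region="us-west-2"):
    """Pull at most one message off the SQS queue, or None if it is empty."""
    import boto3
    sqs = boto3.client("sqs", region_name=region)
    resp = sqs.receive_message(QueueUrl=queue_url, MaxNumberOfMessages=1)
    messages = resp.get("Messages", [])
    return messages[0] if messages else None
```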
The Consumer should access the S3 buckets directly and avoid going through the CDN. We can ask ops to make sure the EC2 box running the Consumer has the right IAM permissions to:
Also the Consumer should be idempotent. So if the same buildhub.json message is processed twice, the kinto results will remain the same. Regular SQS queues have an "at least once" delivery guarantee, so messages could get duplicated.
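One common way to get that idempotency (a sketch, not necessarily what the project will do): derive the Kinto record id deterministically from the S3 coordinates, so a duplicate delivery overwrites the same record instead of creating a second one. The namespace UUID below is a made-up constant for illustration:

```python
import uuid

# Hypothetical namespace for this sketch; any fixed UUID works, as long as it
# never changes, so the same S3 object always maps to the same record id.
NAMESPACE = uuid.UUID("00000000-0000-0000-0000-000000000000")

def record_id(bucket, key):
    """Deterministic Kinto record id for an S3 object.

    Processing the same buildhub.json message twice produces the same id,
    so the second insert is an overwrite rather than a duplicate.
    """
    return str(uuid.uuid5(NAMESPACE, f"{bucket}/{key}"))
```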
I created a random S3 file generator: https://github.com/mostlygeek/s3-file-maker You can use this by running it in the background.
Why did you mention that? Mind you, if we still need the backfill (which I suspect we do), we'll probably do that by extracting the manifests from the S3 bucket anonymously. So I guess the ask is to set up this bucket to be publicly readable, just like the one we use for Mozilla builds.
Ah! I see. The message doesn't contain the body. Of course. Silly me. I still have to go and pick it up by its key and bucket name.
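Assuming the queue carries standard S3 event notifications (the usual bucket-to-SQS setup), pulling the bucket and key out of the message body looks roughly like this; the event shape is AWS's documented format, not anything buildhub-specific:

```python
import json
from urllib.parse import unquote_plus

def extract_s3_objects(message_body):
    """Yield (bucket, key) pairs from an S3 event notification body.

    The SQS message only carries the event metadata, not the file itself;
    the consumer still has to fetch the object by bucket and key.
    """
    event = json.loads(message_body)
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        # S3 URL-encodes object keys in event notifications.
        key = unquote_plus(record["s3"]["object"]["key"])
        yield bucket, key
```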
A new daemon (a synchronous script) that consumes an SQS queue (ARN from an env var) and only looks for files where
os.path.basename(uri) == 'buildhub.json'
It consumes, validates, inserts into kinto, and deletes the message. See the tokenserver code for an example.
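That loop can be sketched with the S3 fetch and Kinto insert injected as plain callables. Both stand-ins are invented for this sketch; the real daemon would wrap boto3 and a kinto client, validate the manifest before inserting, and delete the SQS message only after a successful insert:

```python
import json
import os.path

def is_buildhub_file(key):
    """Only buildhub.json files are interesting; everything else is skipped."""
    return os.path.basename(key) == "buildhub.json"

def handle_message(message, fetch, insert):
    """Process one SQS message: filter, fetch, insert.

    `fetch(bucket, key)` and `insert(record)` are injected so the sketch
    stays testable; the caller deletes the message once this returns.
    """
    event = json.loads(message["Body"])
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        if not is_buildhub_file(key):
            continue
        manifest = fetch(bucket, key)  # validate before inserting in real code
        insert(manifest)
```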