Allow for Metric Scraping interval to be defined overall & per-metric via YAML config #473

Conversation

@brandonh-msft (Contributor) commented on Mar 28, 2019

Fixes #258 & #259

This PR accomplishes the following:

  • Moves the metric scraping schedule from environment variables to YAML configuration under metricDefaults.scraping.schedule
  • Allows an individual metric to define <metric>.scraping.schedule, overriding the value specified in metricDefaults (see the YAML sketch below)
  • Requires metricDefaults.scraping.schedule as part of validation
  • Adds a unit test to verify that validation fails if no default schedule is given in the YAML file
  • Reads the YAML config file at startup and spins up a MetricScrapingJob instance for every metric defined, with the correct schedule (inherited from the default, or overridden in the metric definition). I confirmed in local testing that they all run in parallel on their correct schedules 🎉
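
For illustration, here is a minimal sketch of what a declaration using the new settings could look like. The metric entries and any fields beyond the scraping.schedule keys are hypothetical placeholders rather than Promitor's full metric schema; per this PR, the schedules are cron expressions:

metricDefaults:
  scraping:
    # Default cron schedule for all metrics; validation now requires this key.
    schedule: "*/5 * * * *"
metrics:
  - name: demo_metric_inheriting_default    # hypothetical metric
    # No scraping block here, so the default schedule above applies.
  - name: demo_metric_with_override         # hypothetical metric
    scraping:
      # Overrides metricDefaults.scraping.schedule for this metric only.
      schedule: "*/1 * * * *"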

In addition to these features, this PR changes the Scraper so it no longer requires MetricDefaults: the MetricDeclarationProvider now supports an applyDefaults parameter on its Get() method (false by default; up for debate IMO). When applyDefaults is set, the provider furnishes individual metrics with values from metricDefaults applied to any properties that have no value on the metric itself, isolating the scraper from needing to know how or when to do this, as sketched below.
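
To make the applyDefaults semantics concrete, here is a hedged sketch in the same YAML terms (the metric is hypothetical; only the applyDefaults parameter and the metricDefaults key come from this PR):

# Declared in the YAML file:
metricDefaults:
  scraping:
    schedule: "*/5 * * * *"
metrics:
  - name: demo_metric    # hypothetical; declares no scraping.schedule of its own

# Get(applyDefaults: true) would furnish demo_metric with
# scraping.schedule = "*/5 * * * *", copied from metricDefaults.
# Get() or Get(applyDefaults: false) returns demo_metric exactly as declared,
# with no schedule set.

Keeping applyDefaults false by default means existing callers see unchanged behavior unless they opt in.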

…don't push this logic down into the scrapers
…as a cron expression instead of System.TimeSpan
…icDefaults to metrics that don't have values for those properties
…metrics to illustrate customizability and new usage of metricDefaults for overall scraping schedule (vs old env variable)
@promitor-bot

Docker image for this PR was built and is available on Docker Hub.

You can pull it locally via the CLI:

docker pull tomkerkhove/promitor-agent-scraper-ci:pr473

Want to verify the new version? Run it locally:

docker run -d -p 8999:80 --name promitor-scraper-pr473 \
                         --env PROMITOR_AUTH_APPID='<azure-ad-app-id>' \
                         --env PROMITOR_AUTH_APPKEY='<azure-ad-app-key>' \
                         --volume C:/Promitor/metrics-declaration.yaml:/config/metrics-declaration.yaml \
                         tomkerkhove/promitor-agent-scraper-ci:pr473

@brandonh-msft (Contributor, Author)

forgot docs

@brandonh-msft (Contributor, Author)

Anybody know why the Snyk steps wouldn't be running for this right now?

cc @tomkerkhove

@brandonh-msft force-pushed the brandonh/feature/metric-scraping-interval-impl branch from 3d9aa3d to ca0c378 on March 28, 2019 at 20:54

@brusMX (Contributor) left a comment

Please consolidate the optional parameters and move resourceGroupName to the same section that you just created.

@brandonh-msft requested a review from brusMX on March 28, 2019 at 21:38

@brusMX (Contributor) left a comment

LGTM

@brandonh-msft (Contributor, Author)

@tomkerkhove not sure why, but we have one approving review and it still won't let us merge the PR. 🤷🏻‍♂️

@brandonh-msft (Contributor, Author)

@jsturtevant could you approve this and see if we can get "Squash & merge" to light up for us?

@tomkerkhove merged commit 133a4ae into tomkerkhove:master on Mar 29, 2019
@brandonh-msft deleted the brandonh/feature/metric-scraping-interval-impl branch on March 29, 2019 at 16:45
@tomkerkhove added this to the v1.0.0 milestone on Apr 1, 2019

@tomkerkhove (Owner)

Thanks, @brandonh-msft!
