
Build by service directory (part 2) #3647

Merged: 33 commits merged into Azure:master on Jun 12, 2019

Conversation

mitchdenny
Contributor

This PR changes the JavaScript build to build all of the data-plane packages under the service directory associated with the pipeline rather than having a pipeline per package.

@mitchdenny
Contributor Author

/azp run js - core - ci

@azure-pipelines

Azure Pipelines successfully started running 1 pipeline(s).

@mitchdenny
Contributor Author

/azp run js - core - ci

@azure-pipelines

Azure Pipelines successfully started running 1 pipeline(s).

@mitchdenny
Contributor Author

/azp run js - core - ci

@azure-pipelines

Azure Pipelines successfully started running 1 pipeline(s).

@mitchdenny force-pushed the build-by-service-part-2 branch from 1267027 to f2d04bd on June 11, 2019 at 16:48
@mitchdenny
Contributor Author

/azp run js - core - ci

@azure-pipelines

Azure Pipelines successfully started running 1 pipeline(s).

@mitchdenny
Contributor Author

js - core - ci is failing due to a flaky test; reported (#3694)

@bsiegel
Member

bsiegel commented Jun 12, 2019

> Rather than do that I decided to introduce a pattern for repo-specific tools where we have an eng/tools folder, and under there we place code/scripts that are used exclusively by pipelines attached to this repo. In this case I've created a small module called select-packages which contains the logic to search a service directory for package.json files and check for a flag in the package to see if it is a data-plane library (isDataPlane).

I might suggest an alternative, which is that Rush has the concept of "review categories". We can make a category that lets us distinguish between the two types of libraries. Then it's easy enough to just parse the rush.json file (or you can use @microsoft/rush-lib if you need anything more complex).

@mitchdenny
Contributor Author

> I might suggest an alternative, which is that Rush has the concept of "review categories". We can make a category that lets us distinguish between the two types of libraries. Then it's easy enough to just parse the rush.json file (or you can use @microsoft/rush-lib if you need anything more complex).

So is the suggestion to cross reference the name of the package loaded with what is in rush.json to determine whether it is a client lib?

@bsiegel
Member

bsiegel commented Jun 12, 2019

> So is the suggestion to cross reference the name of the package loaded with what is in rush.json to determine whether it is a client lib?

You could basically do the following:

```js
const dataPlanePackages = rushConfig.projects.filter((p) => p.reviewCategory === 'client');

for (const p of dataPlanePackages) {
  console.log(p.packageName);
}
```

My point was that I think the Rush config is the right place to define category metadata about our packages, rather than in the package.json, since it's only relevant to our build system and dev tooling.

P.S.: Right now only data plane projects are in the Rush workspace at all...

P.P.S.: I have an issue open in the Rush repo to allow grouping of packages for Rush actions, which basically does exactly what is described above; we would just promote the concept of "reviewCategories" to the more general "categories", which could then be targets for actions in addition to being used for reviewing dependencies.
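The rush.json-based selection suggested above could be sketched as follows (a minimal sketch; it assumes rush.json has already been parsed into an object, since the real file allows comments and is safer to read via @microsoft/rush-lib or a JSONC parser than raw JSON.parse):

```javascript
// Minimal sketch of selecting client libraries from a parsed rush.json
// config. Field names (projects, reviewCategory, packageName) follow
// rush.json's project entries.
function clientProjects(rushConfig) {
  return (rushConfig.projects || []).filter(
    (p) => p.reviewCategory === 'client'
  );
}
```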

@mitchdenny
Contributor Author

> You could basically do the following:

Right, that makes sense. I'd still need to filter by path, but it would work. I wonder whether we are abusing this feature, though; that would be my only concern: unintended side effects today, or down the road.
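The extra path filter mentioned here could be layered onto the category filter like this (a sketch only; the `projectFolder` layout `sdk/<service>/<package>` is an assumption about this repo, not taken from the thread):

```javascript
// Hypothetical combination of the reviewCategory filter with a path
// filter, keeping only client projects under one service directory.
// Assumes each rush.json project entry has a projectFolder shaped like
// "sdk/<service>/<package>".
function clientProjectsUnderService(projects, serviceDir) {
  const prefix = `sdk/${serviceDir}/`;
  return projects.filter(
    (p) => p.reviewCategory === 'client' && p.projectFolder.startsWith(prefix)
  );
}
```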

@mitchdenny
Contributor Author

/azp run js - storage-queue - ci

@azure-pipelines

Azure Pipelines successfully started running 1 pipeline(s).

@mitchdenny
Contributor Author

/azp run js - storage-file - ci

@azure-pipelines

Azure Pipelines successfully started running 1 pipeline(s).

@mitchdenny
Contributor Author

/azp run js - storage-blob - ci

@azure-pipelines

Azure Pipelines successfully started running 1 pipeline(s).

@mitchdenny
Contributor Author

/azp run js - service-bus - ci

@azure-pipelines

Azure Pipelines successfully started running 1 pipeline(s).

@mitchdenny
Contributor Author

/azp run js event-processor-host - ci

@azure-pipelines

No pipelines are associated with this pull request.

@mitchdenny
Contributor Author

/azp run js - event-processor-host - ci

@azure-pipelines

Azure Pipelines successfully started running 1 pipeline(s).

@mitchdenny
Contributor Author

@weshaggard could I get you to approve this? It's ready to merge; the failing build is because of a flaky test case in core-http.

@mitchdenny mitchdenny merged commit cdf431a into Azure:master Jun 12, 2019