- Install SBT
- Make sure to download JDK 11 if you don't have it
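A quick way to check that both prerequisites are on your PATH (output will vary by JDK vendor and sbt version; this is just a sanity-check sketch):

```bash
# Should report a Java 11 runtime, e.g. openjdk version "11.0.x"
java -version

# Starts sbt and prints the sbt version in use
sbt sbtVersion
```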
- Fork the repository on GitHub
- This is required if you would like to make PRs. If you choose the fork option, replace the clone link below with that of your fork.
- Git clone your fork, or the repo directly:

```bash
git clone https://github.com/Azure/mmlspark.git
```
- NOTE: If you would like to contribute to mmlspark regularly, add your fork as a remote named `origin` and Azure/mmlspark as a remote named `upstream`.
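For example, assuming your fork lives at `github.com/<your-username>/mmlspark` (replace the placeholder with your GitHub account name):

```bash
# <your-username> is a placeholder for your GitHub account
git clone https://github.com/<your-username>/mmlspark.git
cd mmlspark

# Register the main Azure/mmlspark repository as "upstream"
git remote add upstream https://github.com/Azure/mmlspark.git
git remote -v   # verify that both origin and upstream are configured
```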
- Run sbt to compile and grab datasets:

```bash
cd mmlspark
sbt setup
```
- Install IntelliJ
- Install the Scala plugin during installation
- Configure IntelliJ
- Open the mmlspark directory
- If the project does not automatically import, click on `build.sbt` and import the project
To use secrets in the build, you must be part of the mmlspark keyvault and Azure subscription. If you are MSFT internal and would like to be added, please reach out to [email protected].
The sbt build also provides tasks covering the rest of the developer workflow:

- Compiling the main, test, and integration test classes respectively (`compile`, `test:compile`, and `it:compile`)
- Running all mmlspark tests
- Running the scalastyle check
- Generating documentation for the Scala sources
- Creating a conda environment `mmlspark` from `environment.yaml` if it does not already exist; this env is used for Python testing, so activate it before using the Python build commands
- Removing the `mmlspark` conda env
- Compiling Scala, running the Python generation scripts, and creating a wheel
- Generating documentation for the generated Python code
- Installing the generated Python wheel into the existing env
- Generating and running Python tests
- Downloading all datasets used in tests to the target folder (`getDatasets`)
- Running `setup`, a combination of `compile`, `test:compile`, `it:compile`, and `getDatasets`
- Packaging the library into a jar
- Publishing the jar to mmlspark's Azure blob based Maven repo (requires keys)
- Publishing the library to the local Maven repo
- Publishing the Scala and Python docs to mmlspark's build Azure storage account (requires keys)
- Publishing the library to the Sonatype staging repo
- Promoting the published Sonatype artifact
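As a concrete sketch of a first build-and-test loop from the repository root: only `compile`, `test:compile`, `it:compile`, `getDatasets`, and `setup` are task names confirmed above; `test` is sbt's standard test task, and `conda activate` assumes the `mmlspark` env created by the build.

```bash
# One-time: compile main/test/it classes and download the test datasets
sbt setup

# Day-to-day Scala loop: recompile and run the test suites
sbt compile test:compile it:compile
sbt test

# For the Python build commands, activate the conda env created by the build
conda activate mmlspark
```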