This repository has been archived by the owner on Sep 13, 2023. It is now read-only.

Release 0.3.0 #397

Merged 21 commits on Oct 27, 2022
10 changes: 9 additions & 1 deletion .github/workflows/check-test-release.yml
@@ -7,6 +7,7 @@ on:

env:
MLEM_TESTS: "true"
MLEM_DEBUG: "true"

jobs:
authorize:
@@ -58,7 +59,7 @@ jobs:
# no HDF5 support installed for tables
- os: windows-latest
python: "3.9"
fail-fast: true
fail-fast: false
steps:
- uses: actions/checkout@v3
with:
@@ -67,6 +68,10 @@
- uses: actions/setup-python@v4
with:
python-version: ${{ matrix.python }}
- uses: conda-incubator/setup-miniconda@v2
with:
python-version: ${{ matrix.python }}
activate-environment: true
- name: get pip cache dir
id: pip-cache-dir
run: |
@@ -92,6 +97,9 @@
pip install pre-commit .[tests]
- run: pre-commit run pylint -a -v --show-diff-on-failure
if: matrix.python != '3.7'
- name: Start minikube
if: matrix.os == 'ubuntu-latest' && matrix.python == '3.9'
uses: medyagh/setup-minikube@master
- name: Run tests
timeout-minutes: 40
run: pytest
28 changes: 16 additions & 12 deletions .pre-commit-config.yaml
@@ -1,8 +1,8 @@
default_language_version:
python: python3
repos:
- repo: 'https://github.com/pre-commit/pre-commit-hooks'
rev: v4.0.1
- repo: https://github.com/pre-commit/pre-commit-hooks
rev: v4.3.0
hooks:
- id: check-added-large-files
- id: check-case-conflict
@@ -17,8 +17,8 @@ repos:
- id: mixed-line-ending
- id: sort-simple-yaml
- id: trailing-whitespace
- repo: 'https://github.com/pycqa/flake8'
rev: 4.0.1
- repo: https://github.com/pycqa/flake8
rev: 5.0.4
hooks:
- id: flake8
args:
@@ -28,16 +28,16 @@
- flake8-comprehensions
- flake8-debugger
- flake8-string-format
- repo: 'https://github.com/psf/black'
rev: 22.3.0
- repo: https://github.com/psf/black
rev: 22.10.0
hooks:
- id: black
- repo: 'https://github.com/PyCQA/isort'
rev: 5.10.1
hooks:
- id: isort
- repo: 'https://github.com/pre-commit/mirrors-mypy'
rev: v0.942
- repo: https://github.com/pre-commit/mirrors-mypy
rev: v0.982
hooks:
- id: mypy
additional_dependencies:
@@ -54,7 +54,11 @@ repos:
entry: pylint -v
language: system
types: [ python ]
# - repo: https://github.com/PyCQA/bandit
# rev: '1.7.0'
# hooks:
# - id: bandit
- repo: https://github.com/PyCQA/bandit
rev: 1.7.4
hooks:
- id: bandit
exclude: tests/
args:
- -iii # high level
- -lll # high confidence
7 changes: 4 additions & 3 deletions .pylintrc
@@ -170,7 +170,8 @@ disable=print-statement,
redefined-builtin, # TODO: https://github.com/iterative/mlem/issues/60
no-self-use, # TODO: https://github.com/iterative/mlem/issues/60 maybe leave it
import-outside-toplevel,
wrong-import-order # handled by isort
wrong-import-order, # handled by isort
cannot-enumerate-pytest-fixtures # TODO: https://github.com/iterative/mlem/issues/60

# Enable the message, report, category or checker with the given id(s). You can
# either give multiple identifier separated by comma (,) or put this option
@@ -369,7 +370,7 @@ indent-string='    '
max-line-length=100

# Maximum number of lines in a module.
max-module-lines=1000
max-module-lines=2000

# Allow the body of a class to be on the same line as the declaration if body
# contains single statement.
@@ -389,7 +390,7 @@ ignore-comments=yes
ignore-docstrings=yes

# Ignore imports when computing similarities.
ignore-imports=no
ignore-imports=yes

# Ignore function signatures when computing similarities.
ignore-signatures=no
120 changes: 34 additions & 86 deletions README.md
@@ -45,7 +45,7 @@ The main reason to use MLEM instead of other tools is to adopt a **GitOps approach

## Usage

This is a quick walkthrough showcasing deployment and export functionality of MLEM.
This is a quick walkthrough showcasing the deployment functionality of MLEM.

Please read the [Get Started guide](https://mlem.ai/doc/get-started) for a full version.

@@ -81,18 +81,34 @@ def main():

save(
rf,
"rf",
"models/rf",
sample_data=data,
)

if __name__ == "__main__":
main()
```

### Productionization

We'll show how to deploy your model with MLEM below, but first let's briefly mention all the
scenarios that MLEM enables with a couple of lines of code:

- **[Apply model](/doc/get-started/applying)** - load model in Python or get
prediction in command line.
- **[Serve model](/doc/get-started/serving)** - create a service from your model
for online serving.
- **[Build model](/doc/get-started/building)** - export model into Python
packages, Docker images, etc.
- **[Deploy model](/doc/get-started/deploying)** - deploy your model to Heroku,
Sagemaker, Kubernetes, etc.

### Codification

Check out what we have:

```shell
$ ls
$ ls models/
rf
rf.mlem
$ cat rf.mlem
@@ -153,51 +169,6 @@ model_type:
- null
- 3
type: ndarray
sklearn_predict:
args:
- name: X
type_:
columns:
- sepal length (cm)
- sepal width (cm)
- petal length (cm)
- petal width (cm)
dtypes:
- float64
- float64
- float64
- float64
index_cols: []
type: dataframe
name: predict
returns:
dtype: int64
shape:
- null
type: ndarray
sklearn_predict_proba:
args:
- name: X
type_:
columns:
- sepal length (cm)
- sepal width (cm)
- petal length (cm)
- petal width (cm)
dtypes:
- float64
- float64
- float64
- float64
index_cols: []
type: dataframe
name: predict_proba
returns:
dtype: float64
shape:
- null
- 3
type: ndarray
type: sklearn
object_type: model
requirements:
@@ -213,54 +184,31 @@
### Deploying the model

If you want to follow this Quick Start, you'll need to sign up on https://heroku.com,
create an API_KEY and populate `HEROKU_API_KEY` env var.

First, create an environment to deploy your model:

```shell
$ mlem declare env heroku staging
💾 Saving env to staging.mlem
```
create an API key and set the `HEROKU_API_KEY` env var (or run `heroku login` on the command line).
You'll also need to run `heroku container:login` to log in to the Heroku container registry.

Now we can [deploy the model with `mlem deploy`](https://mlem.ai/doc/get-started/deploying)
(you need to use a different `app_name`, since it's going to be published on https://herokuapp.com):

```shell
$ mlem deployment run mydeploy -m rf -t staging -c app_name=mlem-quick-start
⏳️ Loading deployment from .mlem/deployment/myservice.mlem
🔗 Loading link to .mlem/env/staging.mlem
🔗 Loading link to .mlem/model/rf.mlem
💾 Updating deployment at .mlem/deployment/myservice.mlem
🏛 Creating Heroku App example-mlem-get-started
💾 Updating deployment at .mlem/deployment/myservice.mlem
$ mlem deployment run heroku app.mlem \
--model models/rf \
--app_name example-mlem-get-started-app
⏳️ Loading model from models/rf.mlem
⏳️ Loading deployment from app.mlem
🛠 Creating docker image for heroku
🛠 Building MLEM wheel file...
💼 Adding model files...
🛠 Generating dockerfile...
💼 Adding sources...
💼 Generating requirements file...
🛠 Building docker image registry.heroku.com/example-mlem-get-started/web...
✅ Built docker image registry.heroku.com/example-mlem-get-started/web
🔼 Pushed image registry.heroku.com/example-mlem-get-started/web to remote registry at host registry.heroku.com
💾 Updating deployment at .mlem/deployment/myservice.mlem
🛠 Releasing app my-mlem-service formation
💾 Updating deployment at .mlem/deployment/myservice.mlem
✅ Service example-mlem-get-started is up. You can check it out at https://mlem-quick-start.herokuapp.com/
```

### Exporting the model

You could easily [export the model to a different format using `mlem build`](https://mlem.ai/doc/get-started/building):

```
$ mlem build rf docker -c server.type=fastapi -c image.name=sklearn-model
⏳️ Loading model from rf.mlem
🛠 Building MLEM wheel file...
💼 Adding model files...
🛠 Generating dockerfile...
💼 Adding sources...
💼 Generating requirements file...
🛠 Building docker image sklearn-model:latest...
✅ Built docker image sklearn-model:latest
🛠 Building docker image registry.heroku.com/example-mlem-get-started-app/web...
✅ Built docker image registry.heroku.com/example-mlem-get-started-app/web
🔼 Pushing image registry.heroku.com/example-mlem-get-started-app/web to registry.heroku.com
✅ Pushed image registry.heroku.com/example-mlem-get-started-app/web to registry.heroku.com
🛠 Releasing app example-mlem-get-started-app formation
✅ Service example-mlem-get-started-app is up. You can check it out at https://example-mlem-get-started-app.herokuapp.com/
```

## Contributing
2 changes: 0 additions & 2 deletions mlem/api/__init__.py
@@ -11,15 +11,13 @@
import_object,
init,
link,
ls,
serve,
)

__all__ = [
"save",
"load",
"load_meta",
"ls",
"clone",
"init",
"link",