Merge commit 'e51c556ec37945a5dedc88dcf2fbc939ad5c881a' into v1.5.1-release
axsaucedo committed Dec 16, 2020
2 parents f3d9341 + e51c556 commit ac05dac
Showing 15 changed files with 617 additions and 388 deletions.
2 changes: 1 addition & 1 deletion doc/source/examples/notebooks.rst
@@ -119,7 +119,7 @@ Production Configurations and Integrations

Example Helm Deployments <helm_examples>
Max gRPC Message Size <max_grpc_msg_size>
REST timeouts <rest_timeouts>
Configurable timeouts <timeouts>
Deploy Multiple Seldon Core Operators <multiple_operators>
Protocol Examples <protocol_examples>
Custom Protobuf Data Example <customdata_example>
File renamed without changes.
49 changes: 49 additions & 0 deletions doc/source/python/developer_notes.md
@@ -0,0 +1,49 @@
# Development Tips


## Running locally for testing

Sometimes it is useful to test your model locally without having to build an image with s2i or Docker.

This is easy to do with `seldon-core`, as installing it also installs the CLI command that starts the microservice.

Assuming we have a simple model saved in a `MyModel.py` file:
```python
class MyModel:

    def predict(self, X, features_names=None):
        """
        Return a prediction.

        Parameters
        ----------
        X : array-like
        features_names : array of feature names (optional)
        """
        print("Predict called - will run identity function")
        return X
```

We can start the Seldon Core microservice with:
```bash
seldon-core-microservice MyModel --service-type MODEL
```

Then, in another terminal, we can send `curl` requests to test the REST endpoint:
```bash
curl http://localhost:9000/api/v1.0/predictions \
-H 'Content-Type: application/json' \
-d '{"data": {"names": ["input"], "ndarray": ["data"]}}'
```
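
If you prefer to test from Python rather than `curl`, a short script along the following lines should also work (a minimal sketch assuming the `requests` package is installed; it posts the same payload to the same local endpoint):
```python
import requests

# Same payload as the curl example above
payload = {"data": {"names": ["input"], "ndarray": ["data"]}}

# POST to the REST endpoint of the locally running microservice (port 9000 by default)
response = requests.post("http://localhost:9000/api/v1.0/predictions", json=payload)

# The identity model above simply echoes the input data back
print(response.status_code)
print(response.json())
```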


Assuming that the `seldon-core` source code is available at `${SELDON_CORE_DIR}`, we can use `grpcurl` to send a gRPC request:
```bash
cd ${SELDON_CORE_DIR}/executor/proto && grpcurl \
-d '{"data": {"names": ["input"], "ndarray": ["data"]}}' \
-plaintext -proto ./prediction.proto 0.0.0.0:5000 seldon.protos.Seldon/Predict
```
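
Alternatively, the same gRPC call can be made from Python with the `seldon-core` package itself. The snippet below is only a sketch: it assumes the generated proto stubs and the `json_to_seldon_message` helper are importable from the paths shown, which may differ between versions:
```python
import grpc

from seldon_core.proto import prediction_pb2_grpc
from seldon_core.utils import json_to_seldon_message

# Build a SeldonMessage from the same JSON payload used in the grpcurl example
request = json_to_seldon_message({"data": {"names": ["input"], "ndarray": ["data"]}})

# The Python wrapper serves gRPC on port 5000 by default
with grpc.insecure_channel("0.0.0.0:5000") as channel:
    stub = prediction_pb2_grpc.SeldonStub(channel)
    response = stub.Predict(request)

print(response)
```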

The `grpcurl` tool can be obtained from the binaries released on [GitHub](https://github.com/fullstorydev/grpcurl) or installed with [asdf-vm](https://github.com/asdf-vm/asdf-plugins).
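
For instance, with a recent Go toolchain or with asdf available, one of the following should install it (treat these commands as a convenience sketch; check the linked repositories for the current instructions):
```bash
# Using the Go toolchain (Go 1.16+)
go install github.com/fullstorydev/grpcurl/cmd/grpcurl@latest

# Or using asdf-vm
asdf plugin add grpcurl
asdf install grpcurl latest
asdf global grpcurl latest
```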

See the [Python Server](./python_server.html#configuration) documentation for the available configuration options.
6 changes: 3 additions & 3 deletions doc/source/python/index.rst
@@ -17,6 +17,6 @@ You can use the following links to navigate the Python seldon-core module:
Create image with S2I <python_wrapping_s2i.md>
Create image with a Dockerfile <python_wrapping_docker.md>
Seldon Python server configuration <python_server.md>
Calling the Seldon API with the Seldon Python client <seldon_client.md>
Python API reference <api/modules>

Calling the Seldon API with the Seldon Python client <seldon_client.md>
Python API reference <api/modules>
Development Tips <developer_notes>
76 changes: 39 additions & 37 deletions examples/ambassador/canary/ambassador_canary.ipynb
@@ -25,7 +25,7 @@
"name": "stdout",
"output_type": "stream",
"text": [
"Error from server (AlreadyExists): namespaces \"seldon\" already exists\r\n"
"Error from server (AlreadyExists): namespaces \"seldon\" already exists\n"
]
}
],
@@ -42,7 +42,7 @@
"name": "stdout",
"output_type": "stream",
"text": [
"Context \"kind-kind\" modified.\r\n"
"Context \"kind-seldon\" modified.\n"
]
}
],
@@ -72,7 +72,7 @@
{
"data": {
"text/plain": [
"'1.5.0-dev'"
"'1.6.0-dev'"
]
},
"execution_count": 4,
@@ -97,7 +97,7 @@
},
{
"cell_type": "code",
"execution_count": 17,
"execution_count": 5,
"metadata": {},
"outputs": [],
"source": [
@@ -130,14 +130,14 @@
},
{
"cell_type": "code",
"execution_count": 18,
"execution_count": 6,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"seldondeployment.machinelearning.seldon.io/example created\r\n"
"seldondeployment.machinelearning.seldon.io/example created\n"
]
}
],
@@ -147,7 +147,7 @@
},
{
"cell_type": "code",
"execution_count": 19,
"execution_count": 7,
"metadata": {},
"outputs": [
{
@@ -172,7 +172,7 @@
},
{
"cell_type": "code",
"execution_count": 20,
"execution_count": 8,
"metadata": {},
"outputs": [],
"source": [
@@ -189,7 +189,7 @@
},
{
"cell_type": "code",
"execution_count": 21,
"execution_count": 9,
"metadata": {},
"outputs": [
{
@@ -204,12 +204,12 @@
" tensor {\n",
" shape: 1\n",
" shape: 1\n",
" values: 0.5531305252881753\n",
" values: 0.71309135369705\n",
" }\n",
"}\n",
"\n",
"Response:\n",
"{'data': {'names': ['proba'], 'tensor': {'shape': [1, 1], 'values': [0.08599584459790496]}}, 'meta': {}}\n"
"{'data': {'names': ['proba'], 'tensor': {'shape': [1, 1], 'values': [0.09942988397161616]}}, 'meta': {'requestPath': {'classifier': 'seldonio/mock_classifier:1.6.0-dev'}}}\n"
]
}
],
@@ -230,7 +230,7 @@
},
{
"cell_type": "code",
"execution_count": 25,
"execution_count": 10,
"metadata": {},
"outputs": [],
"source": [
@@ -280,14 +280,15 @@
},
{
"cell_type": "code",
"execution_count": 26,
"execution_count": 11,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"seldondeployment.machinelearning.seldon.io/example created\r\n"
"Warning: kubectl apply should be used on resource created by either kubectl create --save-config or kubectl apply\n",
"seldondeployment.machinelearning.seldon.io/example configured\n"
]
}
],
@@ -297,13 +298,14 @@
},
{
"cell_type": "code",
"execution_count": 27,
"execution_count": 12,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Waiting for deployment \"example-canary-0-classifier\" rollout to finish: 0 of 1 updated replicas are available...\n",
"deployment \"example-canary-0-classifier\" successfully rolled out\n",
"deployment \"example-main-0-classifier\" successfully rolled out\n"
]
@@ -323,7 +325,7 @@
},
{
"cell_type": "code",
"execution_count": 28,
"execution_count": 13,
"metadata": {},
"outputs": [
{
@@ -337,15 +339,15 @@
" tensor {\n",
" shape: 1\n",
" shape: 1\n",
" values: 0.08077854180978328\n",
" values: 0.8694648161558428\n",
" }\n",
"}\n",
"\n",
"Response:\n",
"{'data': {'names': ['proba'], 'tensor': {'shape': [1, 1], 'values': [0.0554153788429373]}}, 'meta': {}}"
"{'data': {'names': ['proba'], 'tensor': {'shape': [1, 1], 'values': [0.11433542420010957]}}, 'meta': {'requestPath': {'classifier': 'seldonio/mock_classifier:1.6.0-dev'}}}"
]
},
"execution_count": 28,
"execution_count": 13,
"metadata": {},
"output_type": "execute_result"
}
@@ -356,7 +358,7 @@
},
{
"cell_type": "code",
"execution_count": 29,
"execution_count": 24,
"metadata": {},
"outputs": [],
"source": [
@@ -367,34 +369,41 @@
" r = sc.predict(gateway=\"ambassador\",transport=\"rest\")"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Following checks number of prediction requests processed by default/canary predictors respectively."
]
},
{
"cell_type": "code",
"execution_count": 30,
"execution_count": 25,
"metadata": {},
"outputs": [],
"source": [
"default_count=!kubectl logs $(kubectl get pod -lseldon-app=example-main -o jsonpath='{.items[0].metadata.name}') classifier | grep \"/predict\" | wc -l "
"default_count=!kubectl logs $(kubectl get pod -lseldon-app=example-main -o jsonpath='{.items[0].metadata.name}') classifier | grep \"root:predict\" | wc -l "
]
},
{
"cell_type": "code",
"execution_count": 31,
"execution_count": 27,
"metadata": {},
"outputs": [],
"source": [
"canary_count=!kubectl logs $(kubectl get pod -lseldon-app=example-canary -o jsonpath='{.items[0].metadata.name}') classifier | grep \"/predict\" | wc -l"
"canary_count=!kubectl logs $(kubectl get pod -lseldon-app=example-canary -o jsonpath='{.items[0].metadata.name}') classifier | grep \"root:predict\" | wc -l"
]
},
{
"cell_type": "code",
"execution_count": 32,
"execution_count": 29,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"0.2948717948717949\n"
"0.32894736842105265\n"
]
}
],
@@ -406,27 +415,20 @@
},
{
"cell_type": "code",
"execution_count": 34,
"execution_count": 30,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"seldondeployment.machinelearning.seldon.io \"example\" deleted\r\n"
"seldondeployment.machinelearning.seldon.io \"example\" deleted\n"
]
}
],
"source": [
"!kubectl delete -f canary.yaml"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": []
}
],
"metadata": {
Expand All @@ -446,7 +448,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.6.8"
"version": "3.8.6"
},
"varInspector": {
"cols": {
@@ -479,5 +481,5 @@
}
},
"nbformat": 4,
"nbformat_minor": 1
"nbformat_minor": 4
}