#### Motivation
Replace `default_bucket` with `bucket` everywhere in this repo so that the storage secret keys are consistent with KServe.
#### Modifications
Replaced every instance of `default_bucket` with `bucket`.
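
For illustration, this is roughly what the `localMinIO` entry in the `storage-config` Secret from quickstart.yaml looks like after the rename (a minimal sketch; credential and endpoint values are placeholders, the point is the `default_bucket` -> `bucket` key):
```
apiVersion: v1
kind: Secret
metadata:
  name: storage-config
stringData:
  localMinIO: |
    {
      "type": "s3",
      "access_key_id": "<minio-access-key>",
      "secret_access_key": "<minio-secret-key>",
      "endpoint_url": "http://minio:9000",
      "bucket": "modelmesh-example-models"
    }
```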
#### Result
Tested the [quickstart install](https://github.com/kserve/modelmesh-serving/blob/main/docs/quickstart.md) after modifying [quickstart.yaml](https://github.com/kserve/modelmesh-serving/blob/6c86da9473d50de63f9ea3af8a4d7c223849547e/config/dependencies/quickstart.yaml#L127).

Pods are up and running:
```
kubectl get pods
NAME                                              READY   STATUS    RESTARTS   AGE
etcd-6fdc487479-m9pkx                             1/1     Running   0          32m
minio-6b5c846587-8bwdv                            1/1     Running   0          32m
modelmesh-controller-5cd8d68bc-9ls9p              1/1     Running   0          31m
modelmesh-serving-mlserver-1.x-66bb94dcf6-hvgzj   4/4     Running   0          26m
modelmesh-serving-mlserver-1.x-66bb94dcf6-qtdzw   4/4     Running   0          26m
```
Model deployed and the InferenceService is Ready:
```
kubectl get isvc
NAME                   URL                                                READY   PREV   LATEST   PREVROLLEDOUTREVISION   LATESTREADYREVISION   AGE
example-sklearn-isvc   grpc://modelmesh-serving.modelmesh-serving:8033   True
```
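For reference, the InferenceService manifest that was applied (reconstructed here from the `kubectl describe` output below; it matches the quickstart example):
```
apiVersion: serving.kserve.io/v1beta1
kind: InferenceService
metadata:
  name: example-sklearn-isvc
  namespace: modelmesh-serving
  annotations:
    serving.kserve.io/deploymentMode: ModelMesh
spec:
  predictor:
    model:
      modelFormat:
        name: sklearn
      storage:
        key: localMinIO
        path: sklearn/mnist-svm.joblib
```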
```
kubectl describe isvc example-sklearn-isvc
Name:         example-sklearn-isvc
Namespace:    modelmesh-serving
Labels:       <none>
Annotations:  serving.kserve.io/deploymentMode: ModelMesh
API Version:  serving.kserve.io/v1beta1
Kind:         InferenceService
Metadata:
  Creation Timestamp:  2024-05-28T07:19:00Z
  Generation:          1
  Resource Version:    5950
  UID:                 db71cf11-7842-4bc1-af97-647282e6b9b9
Spec:
  Predictor:
    Model:
      Model Format:
        Name:  sklearn
      Storage:
        Key:   localMinIO
        Path:  sklearn/mnist-svm.joblib
Status:
  Components:
    Predictor:
      Grpc URL:  grpc://modelmesh-serving.modelmesh-serving:8033
      Rest URL:  http://modelmesh-serving.modelmesh-serving:8008
      URL:       grpc://modelmesh-serving.modelmesh-serving:8033
  Conditions:
    Last Transition Time:  2024-05-28T07:25:07Z
    Status:                True
    Type:                  PredictorReady
    Last Transition Time:  2024-05-28T07:25:07Z
    Status:                True
    Type:                  Ready
  Model Status:
    Copies:
      Failed Copies:  0
      Total Copies:   1
    States:
      Active Model State:  Loaded
      Target Model State:
    Transition Status:  UpToDate
  URL:  grpc://modelmesh-serving.modelmesh-serving:8033
Events:  <none>
```
Inference request successful:
```
MODEL_NAME=example-sklearn-isvc
grpcurl \
-plaintext \
-proto fvt/proto/kfs_inference_v2.proto \
-d '{ "model_name": "'"${MODEL_NAME}"'", "inputs": [{ "name": "predict", "shape": [1, 64], "datatype": "FP32", "contents": { "fp32_contents": [0.0, 0.0, 1.0, 11.0, 14.0, 15.0, 3.0, 0.0, 0.0, 1.0, 13.0, 16.0, 12.0, 16.0, 8.0, 0.0, 0.0, 8.0, 16.0, 4.0, 6.0, 16.0, 5.0, 0.0, 0.0, 5.0, 15.0, 11.0, 13.0, 14.0, 0.0, 0.0, 0.0, 0.0, 2.0, 12.0, 16.0, 13.0, 0.0, 0.0, 0.0, 0.0, 0.0, 13.0, 16.0, 16.0, 6.0, 0.0, 0.0, 0.0, 0.0, 16.0, 16.0, 16.0, 7.0, 0.0, 0.0, 0.0, 0.0, 11.0, 13.0, 12.0, 1.0, 0.0] }}]}' \
localhost:8033 \
inference.GRPCInferenceService.ModelInfer
Handling connection for 8033
{
  "modelName": "example-sklearn-isvc__isvc-6b2eb0b8bf",
  "outputs": [
    {
      "name": "predict",
      "datatype": "INT64",
      "shape": [
        "1",
        "1"
      ],
      "contents": {
        "int64Contents": [
          "8"
        ]
      }
    }
  ]
}
```
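The `Handling connection for 8033` line above comes from a `kubectl port-forward` session running in a separate terminal, along the lines of the following (service and namespace names assumed from the URLs in the output above):
```
kubectl port-forward -n modelmesh-serving service/modelmesh-serving 8033:8033
```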
Closes #456
---------
Signed-off-by: Aayush Subramaniam <[email protected]>