diff --git a/android/README.md b/android/README.md
index 6c9b25669b9..b7be85388f7 100644
--- a/android/README.md
+++ b/android/README.md
@@ -16,7 +16,7 @@ In gradle, you can add the 5 modules in your dependencies:
```groovy
dependencies {
- implementation platform("ai.djl:bom:0.25.0")
+ implementation platform("ai.djl:bom:0.26.0")
implementation "ai.djl:api"
implementation "ai.djl.android:core"
diff --git a/android/pytorch-native/README.md b/android/pytorch-native/README.md
index 7ed09d5b448..504acc93cb4 100644
--- a/android/pytorch-native/README.md
+++ b/android/pytorch-native/README.md
@@ -124,7 +124,7 @@ cd ..
./gradlew compileAndroidJNI -Ppt_version=${PYTORCH_VERSION}
```
-`jnilib/0.25.0/android` folder will be created after build, and shared library will be uploaded to S3 in CI build
+The `jnilib/0.26.0/android` folder will be created after the build, and the shared library will be uploaded to S3 in the CI build
## Build PyTorch android library (.aar) and publish to Sonatype snapshot repo
@@ -138,7 +138,7 @@ cd ../../../android
# To avoid download jni from S3, manually copy them
mkdir -p pytorch-native/jnilib
-cp -r ../engines/pytorch/pytorch-native/jnilib/0.25.0/android/* pytorch-native/jnilib
+cp -r ../engines/pytorch/pytorch-native/jnilib/0.26.0/android/* pytorch-native/jnilib
./gradlew :pytorch-native:assemble
# publish to local maven repo (~/.m2 folder)
diff --git a/api/README.md b/api/README.md
index cf2bbfb55db..5b3ceaee02a 100644
--- a/api/README.md
+++ b/api/README.md
@@ -35,7 +35,7 @@ You can pull the DJL API from the central Maven repository by including the foll
<groupId>ai.djl</groupId>
<artifactId>api</artifactId>
- <version>0.25.0</version>
+ <version>0.26.0</version>
```
diff --git a/basicdataset/README.md b/basicdataset/README.md
index 11c217b5371..217f58d22b3 100644
--- a/basicdataset/README.md
+++ b/basicdataset/README.md
@@ -29,7 +29,7 @@ You can pull the module from the central Maven repository by including the follo
<groupId>ai.djl</groupId>
<artifactId>basicdataset</artifactId>
- <version>0.25.0</version>
+ <version>0.26.0</version>
```
diff --git a/bom/README.md b/bom/README.md
index c8e4e80d204..c98b9d1fbe1 100644
--- a/bom/README.md
+++ b/bom/README.md
@@ -22,7 +22,7 @@ will need to mention the type as pom and the scope as import) as the following:
<groupId>ai.djl</groupId>
<artifactId>bom</artifactId>
- <version>0.25.0</version>
+ <version>0.26.0</version>
<type>pom</type>
<scope>import</scope>
@@ -38,7 +38,7 @@ will need to mention the type as pom and the scope as import) as the following:
<groupId>ai.djl</groupId>
<artifactId>bom</artifactId>
- <version>0.25.0</version>
+ <version>0.26.0</version>
<type>pom</type>
<scope>import</scope>
@@ -65,7 +65,7 @@ will need to mention the type as pom and the scope as import) as the following:
- First you need to add the BOM to your build.gradle file as follows:
```
- implementation platform("ai.djl:bom:0.25.0")
+ implementation platform("ai.djl:bom:0.26.0")
```
- Then you import the desired DJL modules into your build.gradle file (no version is needed):
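For example, a minimal sketch of such a versionless block under the BOM (the engine module chosen here is only illustrative):

```groovy
dependencies {
    // The BOM pins every DJL version, so the modules below omit them
    implementation platform("ai.djl:bom:0.26.0")
    implementation "ai.djl:api"
    runtimeOnly "ai.djl.pytorch:pytorch-engine"
}
```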
diff --git a/djl-zero/README.md b/djl-zero/README.md
index 019fb155038..34acbac07b9 100644
--- a/djl-zero/README.md
+++ b/djl-zero/README.md
@@ -49,6 +49,6 @@ You can pull the module from the central Maven repository by including the follo
<groupId>ai.djl</groupId>
<artifactId>djl-zero</artifactId>
- <version>0.25.0</version>
+ <version>0.26.0</version>
```
diff --git a/docs/development/example_dataset.md b/docs/development/example_dataset.md
index 1b0f9b87b41..63583c2fdeb 100644
--- a/docs/development/example_dataset.md
+++ b/docs/development/example_dataset.md
@@ -24,8 +24,8 @@ api group: 'org.apache.commons', name: 'commons-csv', version: '1.7'
In order to extend the dataset, the following dependencies are required:
```
-api "ai.djl:api:0.25.0"
-api "ai.djl:basicdataset:0.25.0"
+api "ai.djl:api:0.26.0"
+api "ai.djl:basicdataset:0.26.0"
```
There are four parts we need to implement for CSVDataset.
diff --git a/docs/hybrid_engine.md b/docs/hybrid_engine.md
index 8a529770cfd..ddde08337ee 100644
--- a/docs/hybrid_engine.md
+++ b/docs/hybrid_engine.md
@@ -21,17 +21,17 @@ to run in a hybrid mode:
To use it along with Apache MXNet for additional API support, add the following two dependencies:
```
-runtimeOnly "ai.djl.mxnet:mxnet-engine:0.25.0"
+runtimeOnly "ai.djl.mxnet:mxnet-engine:0.26.0"
```
You can also use PyTorch or TensorFlow Engine as the supplemental engine by adding their corresponding dependencies.
```
-runtimeOnly "ai.djl.pytorch:pytorch-engine:0.25.0"
+runtimeOnly "ai.djl.pytorch:pytorch-engine:0.26.0"
```
```
-runtimeOnly "ai.djl.tensorflow:tensorflow-engine:0.25.0"
+runtimeOnly "ai.djl.tensorflow:tensorflow-engine:0.26.0"
```
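As a rough sketch of how these pieces combine in a hybrid setup (the choice of ONNX Runtime as the primary engine here is only illustrative):

```groovy
dependencies {
    implementation platform("ai.djl:bom:0.26.0")
    implementation "ai.djl:api"
    // Engine that runs the model but does not offer the full NDArray API
    runtimeOnly "ai.djl.onnxruntime:onnxruntime-engine"
    // Supplemental full engine that provides the missing NDArray operations
    runtimeOnly "ai.djl.pytorch:pytorch-engine"
}
```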
## How Hybrid works
diff --git a/docs/load_model.md b/docs/load_model.md
index 727d02fa1e0..653ba3e91d7 100644
--- a/docs/load_model.md
+++ b/docs/load_model.md
@@ -181,7 +181,7 @@ Here is a few tips you can use to help you debug model loading issue:
See [here](development/configure_logging.md#configure-logging-level) for how to enable debug log
#### List models programmatically in your code
-You can use [ModelZoo.listModels()](https://javadoc.io/static/ai.djl/api/0.25.0/ai/djl/repository/zoo/ModelZoo.html#listModels--) API to query available models.
+You can use [ModelZoo.listModels()](https://javadoc.io/static/ai.djl/api/0.26.0/ai/djl/repository/zoo/ModelZoo.html#listModels--) API to query available models.
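A minimal sketch of such a query, written here in Groovy for brevity (any JVM language works) and assuming at least one model zoo is on the classpath:

```groovy
import ai.djl.repository.zoo.ModelZoo

// Lists every artifact registered in the model zoos found on the classpath,
// grouped by application (image classification, object detection, ...)
ModelZoo.listModels().each { application, artifacts ->
    artifacts.each { artifact -> println "${application}: ${artifact}" }
}
```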
#### List available models using DJL command line
diff --git a/engines/ml/lightgbm/README.md b/engines/ml/lightgbm/README.md
index 9232077d6b9..74ab3eba411 100644
--- a/engines/ml/lightgbm/README.md
+++ b/engines/ml/lightgbm/README.md
@@ -36,13 +36,13 @@ LightGBM can only run on top of the Linux/Mac/Windows machine using x86_64.
## Installation
You can pull the LightGBM engine from the central Maven repository by including the following dependency:
-- ai.djl.ml.lightgbm:lightgbm:0.25.0
+- ai.djl.ml.lightgbm:lightgbm:0.26.0
```xml
<groupId>ai.djl.ml.lightgbm</groupId>
<artifactId>lightgbm</artifactId>
- <version>0.25.0</version>
+ <version>0.26.0</version>
<scope>runtime</scope>
```
diff --git a/engines/ml/xgboost/README.md b/engines/ml/xgboost/README.md
index 7c41cfbdbcd..df0a7897e3c 100644
--- a/engines/ml/xgboost/README.md
+++ b/engines/ml/xgboost/README.md
@@ -37,13 +37,13 @@ XGBoost can only run on top of the Linux/Mac machine. User can build from source
## Installation
You can pull the XGBoost engine from the central Maven repository by including the following dependency:
-- ai.djl.ml.xgboost:xgboost:0.25.0
+- ai.djl.ml.xgboost:xgboost:0.26.0
```xml
<groupId>ai.djl.ml.xgboost</groupId>
<artifactId>xgboost</artifactId>
- <version>0.25.0</version>
+ <version>0.26.0</version>
<scope>runtime</scope>
```
diff --git a/engines/mxnet/mxnet-engine/README.md b/engines/mxnet/mxnet-engine/README.md
index 58eba7ea094..66b2c98adc1 100644
--- a/engines/mxnet/mxnet-engine/README.md
+++ b/engines/mxnet/mxnet-engine/README.md
@@ -7,7 +7,7 @@ This module contains the Deep Java Library (DJL) EngineProvider for Apache MXNet
We don't recommend that developers use classes in this module directly. Use of these classes
will couple your code with Apache MXNet and make switching between engines difficult. Even so,
developers are not restricted from using engine-specific features. For more information,
-see [NDManager#invoke()](https://javadoc.io/static/ai.djl/api/0.25.0/ai/djl/ndarray/NDManager.html#invoke-java.lang.String-ai.djl.ndarray.NDArray:A-ai.djl.ndarray.NDArray:A-ai.djl.util.PairList-).
+see [NDManager#invoke()](https://javadoc.io/static/ai.djl/api/0.26.0/ai/djl/ndarray/NDManager.html#invoke-java.lang.String-ai.djl.ndarray.NDArray:A-ai.djl.ndarray.NDArray:A-ai.djl.util.PairList-).
## Documentation
@@ -33,7 +33,7 @@ You can pull the MXNet engine from the central Maven repository by including the
<groupId>ai.djl.mxnet</groupId>
<artifactId>mxnet-engine</artifactId>
- <version>0.25.0</version>
+ <version>0.26.0</version>
<scope>runtime</scope>
```
diff --git a/engines/mxnet/mxnet-model-zoo/README.md b/engines/mxnet/mxnet-model-zoo/README.md
index d97370fc56e..f32678944c0 100644
--- a/engines/mxnet/mxnet-model-zoo/README.md
+++ b/engines/mxnet/mxnet-model-zoo/README.md
@@ -27,7 +27,7 @@ You can pull the MXNet engine from the central Maven repository by including the
<groupId>ai.djl.mxnet</groupId>
<artifactId>mxnet-model-zoo</artifactId>
- <version>0.25.0</version>
+ <version>0.26.0</version>
```
diff --git a/engines/onnxruntime/onnxruntime-android/README.md b/engines/onnxruntime/onnxruntime-android/README.md
index c19594fc944..eba92b84288 100644
--- a/engines/onnxruntime/onnxruntime-android/README.md
+++ b/engines/onnxruntime/onnxruntime-android/README.md
@@ -6,13 +6,13 @@ This module contains the DJL ONNX Runtime engine for Android.
## Installation
You can pull the ONNX Runtime for Android from the central Maven repository by including the following dependency:
-- ai.djl.android:onnxruntime:0.25.0
+- ai.djl.android:onnxruntime:0.26.0
```xml
<groupId>ai.djl.android</groupId>
<artifactId>onnxruntime</artifactId>
- <version>0.25.0</version>
+ <version>0.26.0</version>
<scope>runtime</scope>
```
diff --git a/engines/onnxruntime/onnxruntime-engine/README.md b/engines/onnxruntime/onnxruntime-engine/README.md
index 249557f139e..b89b14f4473 100644
--- a/engines/onnxruntime/onnxruntime-engine/README.md
+++ b/engines/onnxruntime/onnxruntime-engine/README.md
@@ -37,13 +37,13 @@ for the official ONNX Runtime project.
## Installation
You can pull the ONNX Runtime engine from the central Maven repository by including the following dependency:
-- ai.djl.onnxruntime:onnxruntime-engine:0.25.0
+- ai.djl.onnxruntime:onnxruntime-engine:0.26.0
```xml
<groupId>ai.djl.onnxruntime</groupId>
<artifactId>onnxruntime-engine</artifactId>
- <version>0.25.0</version>
+ <version>0.26.0</version>
<scope>runtime</scope>
```
@@ -61,7 +61,7 @@ Maven:
<groupId>ai.djl.onnxruntime</groupId>
<artifactId>onnxruntime-engine</artifactId>
- <version>0.25.0</version>
+ <version>0.26.0</version>
<scope>runtime</scope>
@@ -81,7 +81,7 @@ Maven:
Gradle:
```groovy
-implementation("ai.djl.onnxruntime:onnxruntime-engine:0.25.0") {
+implementation("ai.djl.onnxruntime:onnxruntime-engine:0.26.0") {
exclude group: "com.microsoft.onnxruntime", module: "onnxruntime"
}
implementation "com.microsoft.onnxruntime:onnxruntime_gpu:1.14.0"
diff --git a/engines/paddlepaddle/paddlepaddle-engine/README.md b/engines/paddlepaddle/paddlepaddle-engine/README.md
index 749ebf4a937..6671cfbcd42 100644
--- a/engines/paddlepaddle/paddlepaddle-engine/README.md
+++ b/engines/paddlepaddle/paddlepaddle-engine/README.md
@@ -30,7 +30,7 @@ You can pull the PaddlePaddle engine from the central Maven repository by includ
<groupId>ai.djl.paddlepaddle</groupId>
<artifactId>paddlepaddle-engine</artifactId>
- <version>0.25.0</version>
+ <version>0.26.0</version>
<scope>runtime</scope>
```
diff --git a/engines/paddlepaddle/paddlepaddle-model-zoo/README.md b/engines/paddlepaddle/paddlepaddle-model-zoo/README.md
index 520dcd5808b..55d3c67fe50 100644
--- a/engines/paddlepaddle/paddlepaddle-model-zoo/README.md
+++ b/engines/paddlepaddle/paddlepaddle-model-zoo/README.md
@@ -26,7 +26,7 @@ from the central Maven repository by including the following dependency:
<groupId>ai.djl.paddlepaddle</groupId>
<artifactId>paddlepaddle-model-zoo</artifactId>
- <version>0.25.0</version>
+ <version>0.26.0</version>
```
diff --git a/engines/pytorch/pytorch-engine/README.md b/engines/pytorch/pytorch-engine/README.md
index d3451ef25d9..2bb210f572e 100644
--- a/engines/pytorch/pytorch-engine/README.md
+++ b/engines/pytorch/pytorch-engine/README.md
@@ -24,13 +24,13 @@ The javadocs output is built in the `build/doc/javadoc` folder.
## Installation
You can pull the PyTorch engine from the central Maven repository by including the following dependency:
-- ai.djl.pytorch:pytorch-engine:0.25.0
+- ai.djl.pytorch:pytorch-engine:0.26.0
```xml
<groupId>ai.djl.pytorch</groupId>
<artifactId>pytorch-engine</artifactId>
- <version>0.25.0</version>
+ <version>0.26.0</version>
<scope>runtime</scope>
```
@@ -113,7 +113,7 @@ export PYTORCH_FLAVOR=cpu
### macOS
For macOS, you can use the following library:
-- ai.djl.pytorch:pytorch-jni:2.0.1-0.25.0
+- ai.djl.pytorch:pytorch-jni:2.0.1-0.26.0
- ai.djl.pytorch:pytorch-native-cpu:2.0.1:osx-x86_64
```xml
@@ -127,7 +127,7 @@ For macOS, you can use the following library:
<groupId>ai.djl.pytorch</groupId>
<artifactId>pytorch-jni</artifactId>
- <version>2.0.1-0.25.0</version>
+ <version>2.0.1-0.26.0</version>
<scope>runtime</scope>
```
@@ -137,7 +137,7 @@ For macOS, you can use the following library:
### macOS M1
For macOS M1, you can use the following library:
-- ai.djl.pytorch:pytorch-jni:2.0.1-0.25.0
+- ai.djl.pytorch:pytorch-jni:2.0.1-0.26.0
- ai.djl.pytorch:pytorch-native-cpu:2.0.1:osx-aarch64
```xml
@@ -151,7 +151,7 @@ For macOS M1, you can use the following library:
<groupId>ai.djl.pytorch</groupId>
<artifactId>pytorch-jni</artifactId>
- <version>2.0.1-0.25.0</version>
+ <version>2.0.1-0.26.0</version>
<scope>runtime</scope>
```
@@ -162,7 +162,7 @@ installed on your GPU machine, you can use one of the following library:
#### Linux GPU
-- ai.djl.pytorch:pytorch-jni:2.0.1-0.25.0
+- ai.djl.pytorch:pytorch-jni:2.0.1-0.26.0
- ai.djl.pytorch:pytorch-native-cu118:2.0.1:linux-x86_64 - CUDA 11.8
```xml
@@ -176,14 +176,14 @@ installed on your GPU machine, you can use one of the following library:
<groupId>ai.djl.pytorch</groupId>
<artifactId>pytorch-jni</artifactId>
- <version>2.0.1-0.25.0</version>
+ <version>2.0.1-0.26.0</version>
<scope>runtime</scope>
```
### Linux CPU
-- ai.djl.pytorch:pytorch-jni:2.0.1-0.25.0
+- ai.djl.pytorch:pytorch-jni:2.0.1-0.26.0
- ai.djl.pytorch:pytorch-native-cpu:2.0.1:linux-x86_64
```xml
@@ -197,14 +197,14 @@ installed on your GPU machine, you can use one of the following library:
<groupId>ai.djl.pytorch</groupId>
<artifactId>pytorch-jni</artifactId>
- <version>2.0.1-0.25.0</version>
+ <version>2.0.1-0.26.0</version>
<scope>runtime</scope>
```
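For reference, a hedged sketch of pinning this Linux CPU flavor explicitly in Gradle (the usual alternative is to rely on the BOM and let DJL resolve the native and JNI packages automatically):

```groovy
dependencies {
    implementation platform("ai.djl:bom:0.26.0")
    runtimeOnly "ai.djl.pytorch:pytorch-engine"
    // Explicit Linux CPU native binaries plus the matching JNI layer
    runtimeOnly "ai.djl.pytorch:pytorch-native-cpu:2.0.1:linux-x86_64"
    runtimeOnly "ai.djl.pytorch:pytorch-jni:2.0.1-0.26.0"
}
```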
### For aarch64 build
-- ai.djl.pytorch:pytorch-jni:2.0.1-0.25.0
+- ai.djl.pytorch:pytorch-jni:2.0.1-0.26.0
- ai.djl.pytorch:pytorch-native-cpu-precxx11:2.0.1:linux-aarch64
```xml
@@ -218,7 +218,7 @@ installed on your GPU machine, you can use one of the following library:
<groupId>ai.djl.pytorch</groupId>
<artifactId>pytorch-jni</artifactId>
- <version>2.0.1-0.25.0</version>
+ <version>2.0.1-0.26.0</version>
<scope>runtime</scope>
```
@@ -228,7 +228,7 @@ installed on your GPU machine, you can use one of the following library:
We also provide packages for systems such as CentOS 7/Ubuntu 14.04 with GLIBC >= 2.17.
All the packages were built with GCC 7; we provide a newer `libstdc++.so.6.24` in the package, containing `CXXABI_1.3.9`, so the package can be used successfully.
-- ai.djl.pytorch:pytorch-jni:2.0.1-0.25.0
+- ai.djl.pytorch:pytorch-jni:2.0.1-0.26.0
- ai.djl.pytorch:pytorch-native-cu118-precxx11:2.0.1:linux-x86_64 - CUDA 11.8
- ai.djl.pytorch:pytorch-native-cpu-precxx11:2.0.1:linux-x86_64 - CPU
@@ -243,7 +243,7 @@ All the package were built with GCC 7, we provided a newer `libstdc++.so.6.24` i
<groupId>ai.djl.pytorch</groupId>
<artifactId>pytorch-jni</artifactId>
- <version>2.0.1-0.25.0</version>
+ <version>2.0.1-0.26.0</version>
<scope>runtime</scope>
```
@@ -259,7 +259,7 @@ All the package were built with GCC 7, we provided a newer `libstdc++.so.6.24` i
<groupId>ai.djl.pytorch</groupId>
<artifactId>pytorch-jni</artifactId>
- <version>2.0.1-0.25.0</version>
+ <version>2.0.1-0.26.0</version>
<scope>runtime</scope>
```
@@ -274,7 +274,7 @@ For the Windows platform, you can choose between CPU and GPU.
#### Windows GPU
-- ai.djl.pytorch:pytorch-jni:2.0.1-0.25.0
+- ai.djl.pytorch:pytorch-jni:2.0.1-0.26.0
- ai.djl.pytorch:pytorch-native-cu118:2.0.1:win-x86_64 - CUDA 11.8
```xml
@@ -288,14 +288,14 @@ For the Windows platform, you can choose between CPU and GPU.
<groupId>ai.djl.pytorch</groupId>
<artifactId>pytorch-jni</artifactId>
- <version>2.0.1-0.25.0</version>
+ <version>2.0.1-0.26.0</version>
<scope>runtime</scope>
```
### Windows CPU
-- ai.djl.pytorch:pytorch-jni:2.0.1-0.25.0
+- ai.djl.pytorch:pytorch-jni:2.0.1-0.26.0
- ai.djl.pytorch:pytorch-native-cpu:2.0.1:win-x86_64
```xml
@@ -309,7 +309,7 @@ For the Windows platform, you can choose between CPU and GPU.
<groupId>ai.djl.pytorch</groupId>
<artifactId>pytorch-jni</artifactId>
- <version>2.0.1-0.25.0</version>
+ <version>2.0.1-0.26.0</version>
<scope>runtime</scope>
```
diff --git a/engines/pytorch/pytorch-model-zoo/README.md b/engines/pytorch/pytorch-model-zoo/README.md
index bd237e7511a..f598dd2aecd 100644
--- a/engines/pytorch/pytorch-model-zoo/README.md
+++ b/engines/pytorch/pytorch-model-zoo/README.md
@@ -25,7 +25,7 @@ You can pull the PyTorch engine from the central Maven repository by including t
<groupId>ai.djl.pytorch</groupId>
<artifactId>pytorch-model-zoo</artifactId>
- <version>0.25.0</version>
+ <version>0.26.0</version>
```
diff --git a/engines/tensorflow/tensorflow-api/README.md b/engines/tensorflow/tensorflow-api/README.md
index c0b03a81d31..9e151a274a0 100644
--- a/engines/tensorflow/tensorflow-api/README.md
+++ b/engines/tensorflow/tensorflow-api/README.md
@@ -16,6 +16,6 @@ You can pull the TensorFlow core java API from the central Maven repository by i
<groupId>ai.djl.tensorflow</groupId>
<artifactId>tensorflow-api</artifactId>
- <version>0.25.0</version>
+ <version>0.26.0</version>
```
diff --git a/engines/tensorflow/tensorflow-engine/README.md b/engines/tensorflow/tensorflow-engine/README.md
index d4669c87b5a..5a6ac3e6da1 100644
--- a/engines/tensorflow/tensorflow-engine/README.md
+++ b/engines/tensorflow/tensorflow-engine/README.md
@@ -28,13 +28,13 @@ The javadocs output is built in the `build/doc/javadoc` folder.
You can pull the TensorFlow engine from the central Maven repository by including the following dependency:
-- ai.djl.tensorflow:tensorflow-engine:0.25.0
+- ai.djl.tensorflow:tensorflow-engine:0.26.0
```xml
<groupId>ai.djl.tensorflow</groupId>
<artifactId>tensorflow-engine</artifactId>
- <version>0.25.0</version>
+ <version>0.26.0</version>
<scope>runtime</scope>
```
diff --git a/engines/tensorflow/tensorflow-model-zoo/README.md b/engines/tensorflow/tensorflow-model-zoo/README.md
index 46667133f5f..975caa6df82 100644
--- a/engines/tensorflow/tensorflow-model-zoo/README.md
+++ b/engines/tensorflow/tensorflow-model-zoo/README.md
@@ -26,7 +26,7 @@ from the central Maven repository by including the following dependency:
<groupId>ai.djl.tensorflow</groupId>
<artifactId>tensorflow-model-zoo</artifactId>
- <version>0.25.0</version>
+ <version>0.26.0</version>
```
diff --git a/engines/tensorrt/README.md b/engines/tensorrt/README.md
index cbb71b16b8d..8100b615e24 100644
--- a/engines/tensorrt/README.md
+++ b/engines/tensorrt/README.md
@@ -28,13 +28,13 @@ The javadocs output is generated in the `build/doc/javadoc` folder.
## Installation
You can pull the TensorRT engine from the central Maven repository by including the following dependency:
-- ai.djl.tensorrt:tensorrt:0.25.0
+- ai.djl.tensorrt:tensorrt:0.26.0
```xml
<groupId>ai.djl.tensorrt</groupId>
<artifactId>tensorrt</artifactId>
- <version>0.25.0</version>
+ <version>0.26.0</version>
<scope>runtime</scope>
```
diff --git a/engines/tflite/tflite-engine/README.md b/engines/tflite/tflite-engine/README.md
index 5191e46534e..861a66f9aaa 100644
--- a/engines/tflite/tflite-engine/README.md
+++ b/engines/tflite/tflite-engine/README.md
@@ -24,13 +24,13 @@ The javadocs output is built in the `build/doc/javadoc` folder.
## Installation
You can pull the TensorFlow Lite engine from the central Maven repository by including the following dependency:
-- ai.djl.tflite:tflite-engine:0.25.0
+- ai.djl.tflite:tflite-engine:0.26.0
```xml
<groupId>ai.djl.tflite</groupId>
<artifactId>tflite-engine</artifactId>
- <version>0.25.0</version>
+ <version>0.26.0</version>
<scope>runtime</scope>
```
diff --git a/extensions/audio/README.md b/extensions/audio/README.md
index 2c040526122..95ed8c53a84 100644
--- a/extensions/audio/README.md
+++ b/extensions/audio/README.md
@@ -23,6 +23,6 @@ You can pull the module from the central Maven repository by including the follo
<groupId>ai.djl.audio</groupId>
<artifactId>audio</artifactId>
- <version>0.25.0</version>
+ <version>0.26.0</version>
```
diff --git a/extensions/aws-ai/README.md b/extensions/aws-ai/README.md
index a95484c8a73..16d412904c5 100644
--- a/extensions/aws-ai/README.md
+++ b/extensions/aws-ai/README.md
@@ -58,6 +58,6 @@ You can pull the module from the central Maven repository by including the follo
<groupId>ai.djl.aws</groupId>
<artifactId>aws-ai</artifactId>
- <version>0.25.0</version>
+ <version>0.26.0</version>
```
diff --git a/extensions/fasttext/README.md b/extensions/fasttext/README.md
index 7e763aeca3d..f0c60d39bf1 100644
--- a/extensions/fasttext/README.md
+++ b/extensions/fasttext/README.md
@@ -34,7 +34,7 @@ You can pull the fastText engine from the central Maven repository by including
<groupId>ai.djl.fasttext</groupId>
<artifactId>fasttext-engine</artifactId>
- <version>0.25.0</version>
+ <version>0.26.0</version>
```
diff --git a/extensions/hadoop/README.md b/extensions/hadoop/README.md
index 92b38b86d42..8a376e22d85 100644
--- a/extensions/hadoop/README.md
+++ b/extensions/hadoop/README.md
@@ -52,6 +52,6 @@ You can pull the module from the central Maven repository by including the follo
<groupId>ai.djl.hadoop</groupId>
<artifactId>hadoop</artifactId>
- <version>0.25.0</version>
+ <version>0.26.0</version>
```
diff --git a/extensions/opencv/README.md b/extensions/opencv/README.md
index cbe481002fe..c23e0c58532 100644
--- a/extensions/opencv/README.md
+++ b/extensions/opencv/README.md
@@ -23,6 +23,6 @@ You can pull the module from the central Maven repository by including the follo
<groupId>ai.djl.opencv</groupId>
<artifactId>opencv</artifactId>
- <version>0.25.0</version>
+ <version>0.26.0</version>
```
diff --git a/extensions/sentencepiece/README.md b/extensions/sentencepiece/README.md
index bdf01857c95..de28d5334df 100644
--- a/extensions/sentencepiece/README.md
+++ b/extensions/sentencepiece/README.md
@@ -23,6 +23,6 @@ You can pull the module from the central Maven repository by including the follo
<groupId>ai.djl.sentencepiece</groupId>
<artifactId>sentencepiece</artifactId>
- <version>0.25.0</version>
+ <version>0.26.0</version>
```
diff --git a/extensions/spark/README.md b/extensions/spark/README.md
index 3ae23bffc31..da3171ca008 100644
--- a/extensions/spark/README.md
+++ b/extensions/spark/README.md
@@ -34,7 +34,7 @@ You can pull the module from the central Maven repository by including the follo
<groupId>ai.djl.spark</groupId>
<artifactId>spark_2.12</artifactId>
- <version>0.25.0</version>
+ <version>0.26.0</version>
```
diff --git a/extensions/tablesaw/README.md b/extensions/tablesaw/README.md
index 48104b14995..b4287d9733d 100644
--- a/extensions/tablesaw/README.md
+++ b/extensions/tablesaw/README.md
@@ -25,6 +25,6 @@ You can pull the module from the central Maven repository by including the follo
<groupId>ai.djl.tablesaw</groupId>
<artifactId>tablesaw</artifactId>
- <version>0.25.0</version>
+ <version>0.26.0</version>
```
diff --git a/extensions/timeseries/README.md b/extensions/timeseries/README.md
index 3401aeaec8d..3ef6887825c 100644
--- a/extensions/timeseries/README.md
+++ b/extensions/timeseries/README.md
@@ -245,6 +245,6 @@ You can pull the module from the central Maven repository by including the follo
<groupId>ai.djl.timeseries</groupId>
<artifactId>timeseries</artifactId>
- <version>0.25.0</version>
+ <version>0.26.0</version>
```
diff --git a/extensions/timeseries/docs/forecast_with_M5_data.md b/extensions/timeseries/docs/forecast_with_M5_data.md
index 80412411a3c..7b8e1c78210 100644
--- a/extensions/timeseries/docs/forecast_with_M5_data.md
+++ b/extensions/timeseries/docs/forecast_with_M5_data.md
@@ -56,7 +56,7 @@ repositories {
}
dependencies {
implementation "org.apache.logging.log4j:log4j-slf4j-impl:2.17.1"
- implementation platform("ai.djl:bom:0.25.0")
+ implementation platform("ai.djl:bom:0.26.0")
implementation "ai.djl:api"
implementation "ai.djl.timeseries"
runtimeOnly "ai.djl.mxnet:mxnet-engine"
diff --git a/extensions/tokenizers/README.md b/extensions/tokenizers/README.md
index 8511de17eb1..2cdf4f19137 100644
--- a/extensions/tokenizers/README.md
+++ b/extensions/tokenizers/README.md
@@ -23,7 +23,7 @@ You can pull the module from the central Maven repository by including the follo
<groupId>ai.djl.huggingface</groupId>
<artifactId>tokenizers</artifactId>
- <version>0.25.0</version>
+ <version>0.26.0</version>
```
diff --git a/model-zoo/README.md b/model-zoo/README.md
index 67174fe9ee5..b8f2a8fd124 100644
--- a/model-zoo/README.md
+++ b/model-zoo/README.md
@@ -33,7 +33,7 @@ You can pull the model zoo from the central Maven repository by including the fo
<groupId>ai.djl</groupId>
<artifactId>model-zoo</artifactId>
- <version>0.25.0</version>
+ <version>0.26.0</version>
```