Commit
Merge pull request #505 from databrickslabs/releases/v_0.4.0
Releases/v 0.4.0
Milos Colic authored Jan 15, 2024
2 parents ad63f6e + 8791ef7 commit 1621d25
Showing 9 changed files with 89 additions and 19 deletions.
4 changes: 2 additions & 2 deletions .github/workflows/build_main.yml
@@ -28,7 +28,7 @@ jobs:
         uses: ./.github/actions/scala_build
       - name: build python
         uses: ./.github/actions/python_build
-      # - name: build R
-      #   uses: ./.github/actions/r_build
+      - name: build R
+        uses: ./.github/actions/r_build
       - name: upload artefacts
         uses: ./.github/actions/upload_artefacts
19 changes: 19 additions & 0 deletions .github/workflows/docs.yml
@@ -26,6 +26,25 @@ jobs:
         with:
           documentation_path: docs/source
           requirements_path: docs/docs-requirements.txt
+      - name: checkout v0.3.x archive
+        # Please do not change any step in here, even though it may look hacky
+        # This is the only way to emulate git archive --remote with actions/checkout
+        # git checkout gh-pages-v0.3.x is required to have a local branch for archiving
+        # git pull is optional, but it's a good practice to have the latest version
+        # git checkout gh-pages right after is required to go back to the working branch
+        # mkdir ./v0.3.x is required to create a directory for the archive
+        # git archive gh-pages-v0.3.x | tar -x -C ./v0.3.x is required to extract the archive
+        # in the right place
+        # git add --all is required to add the new files to the working branch
+        # git commit -am "Adding v0.3.x docs" is required to commit the changes
+        run: |
+          git checkout gh-pages-v0.3.x
+          git pull
+          git checkout gh-pages
+          mkdir ./v0.3.x
+          git archive gh-pages-v0.3.x | tar -x -C ./v0.3.x
+          git add --all
+          git commit -am "Adding v0.3.x docs"
       - name: Push changes
         uses: ad-m/github-push-action@master
         with:
2 changes: 1 addition & 1 deletion R/sparkR-mosaic/sparkrMosaic/DESCRIPTION
@@ -1,6 +1,6 @@
Package: sparkrMosaic
Title: SparkR bindings for Databricks Mosaic
-Version: 0.3.14
+Version: 0.4.0
Authors@R:
person("Robert", "Whiffin", , "[email protected]", role = c("aut", "cre")
)
2 changes: 1 addition & 1 deletion R/sparklyr-mosaic/sparklyrMosaic/DESCRIPTION
@@ -1,6 +1,6 @@
Package: sparklyrMosaic
Title: sparklyr bindings for Databricks Mosaic
-Version: 0.3.14
+Version: 0.4.0
Authors@R:
person("Robert", "Whiffin", , "[email protected]", role = c("aut", "cre")
)
2 changes: 1 addition & 1 deletion R/sparklyr-mosaic/tests.R
@@ -11,7 +11,7 @@ library(sparklyr)

spark_home <- Sys.getenv("SPARK_HOME")
spark_home_set(spark_home)
install.packages("sparklyrMosaic_0.3.14.tar.gz", repos = NULL)
install.packages("sparklyrMosaic_0.4.0.tar.gz", repos = NULL)
library(sparklyrMosaic)

# find the mosaic jar in staging
2 changes: 1 addition & 1 deletion docs/source/conf.py
@@ -22,7 +22,7 @@
author = 'Stuart Lynn, Milos Colic, Erni Durdevic, Robert Whiffin, Timo Roest'

# The full version, including alpha/beta/rc tags
release = "v0.3.14"
release = "v0.4.0"


# -- General configuration ---------------------------------------------------
49 changes: 49 additions & 0 deletions docs/source/index.rst
@@ -60,6 +60,54 @@ Mosaic provides:
* optimisations for performing point-in-polygon joins using an approach we co-developed with Ordnance Survey (`blog post <https://databricks.com/blog/2021/10/11/efficient-point-in-polygon-joins-via-pyspark-and-bng-geospatial-indexing.html>`_); and
* the choice of a Scala, SQL and Python API.

+.. note::
+   For Mosaic versions < 0.4.0, please use the `0.3.x docs <https://databrickslabs.github.io/mosaic/v0.3.x/index.html>`_.
+
+
+Version 0.4.0
+=============
+
+We recommend using Databricks Runtime 13.3 LTS with Photon enabled.
+
+.. warning::
+   The Mosaic 0.4.x series only supports DBR 13.x.
+   If run on a different DBR, it will throw an exception:
+
+   **DEPRECATION ERROR: Mosaic v0.4.x series only supports Databricks Runtime 13. You can specify `%pip install 'databricks-mosaic<0.4,>=0.3'` for DBR < 13.**
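
For context, getting a working setup on a supported runtime is a two-step notebook flow; a minimal sketch in Python, assuming the package's standard `enable_mosaic` entry point:

    # Cell 1 -- install the 0.4.x series (DBR 13.x):
    #   %pip install "databricks-mosaic>=0.4,<0.5"
    # On DBR < 13, pin the 0.3.x series instead, as the error message above advises:
    #   %pip install "databricks-mosaic<0.4,>=0.3"

    # Cell 2 -- register Mosaic's functions with the Spark session
    from mosaic import enable_mosaic

    enable_mosaic(spark, dbutils)  # `spark` and `dbutils` are provided by Databricks notebooks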

+As of the 0.4.0 release, Mosaic issues the following ERROR when initialized on a cluster that is neither Photon Runtime nor Databricks Runtime ML `ADB <https://learn.microsoft.com/en-us/azure/databricks/runtime/>`_ | `AWS <https://docs.databricks.com/runtime/index.html/>`_ | `GCP <https://docs.gcp.databricks.com/runtime/index.html/>`_ :
+
+**DEPRECATION ERROR: Please use a Databricks Photon-enabled Runtime for performance benefits or Runtime ML for spatial AI benefits; Mosaic 0.4.x series restricts executing this cluster.**
+
+As of Mosaic 0.4.0 (subject to change in follow-on releases):
+
+* Mosaic SQL expressions cannot yet be registered with `Unity Catalog <https://www.databricks.com/product/unity-catalog>`_ due to API changes affecting DBRs >= 13.
+* `Assigned Clusters <https://docs.databricks.com/en/compute/configure.html#access-modes>`_ : Mosaic Python, R, and Scala APIs.
+* `Shared Access Clusters <https://docs.databricks.com/en/compute/configure.html#access-modes>`_ : Mosaic Scala API (JVM) with Admin `allowlisting <https://docs.databricks.com/en/data-governance/unity-catalog/manage-privileges/allowlist.html>`_ ; Python bindings to Mosaic Scala APIs are blocked by Py4J Security on Shared Access Clusters.
+
+.. note::
+   As of Mosaic 0.4.0 (subject to change in follow-on releases):
+
+   * `Unity Catalog <https://www.databricks.com/product/unity-catalog>`_ : enforces process isolation, which is difficult to accomplish with custom JVM libraries; as such, only built-in (aka platform-provided) JVM APIs can be invoked from other supported languages on Shared Access Clusters.
+   * `Volumes <https://docs.databricks.com/en/connect/unity-catalog/volumes.html>`_ : along the same principle of isolation, clusters (both Assigned and Shared Access) can read Volumes via the relevant built-in readers and writers, or via custom Python calls that do not involve any custom JVM code (see the sketch just below).
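
To illustrate the Volumes point, reading through a built-in reader keeps all I/O on platform-provided code paths; a minimal PySpark sketch, with a hypothetical Volume path:

    # Load raw raster bytes from a Unity Catalog Volume with the built-in
    # binaryFile reader; the catalog/schema/volume names are made up for the example
    df = (
        spark.read.format("binaryFile")
        .load("/Volumes/my_catalog/my_schema/my_volume/rasters/")
    )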



+Version 0.3.x Series
+====================
+
+We recommend using Databricks Runtime 12.2 LTS with Photon enabled.
+For Mosaic versions < 0.4.0, please use the `0.3.x docs <https://databrickslabs.github.io/mosaic/v0.3.x/index.html>`_.
+
+.. warning::
+   The Mosaic 0.3.x series does not support DBR 13.x.
+
+As of the 0.3.11 release, Mosaic issues the following WARNING when initialized on a cluster that is neither Photon Runtime nor Databricks Runtime ML `ADB <https://learn.microsoft.com/en-us/azure/databricks/runtime/>`_ | `AWS <https://docs.databricks.com/runtime/index.html/>`_ | `GCP <https://docs.gcp.databricks.com/runtime/index.html/>`_ :
+
+**DEPRECATION WARNING: Please use a Databricks Photon-enabled Runtime for performance benefits or Runtime ML for spatial AI benefits; Mosaic will stop working on this cluster after v0.3.x.**
+
+If you are receiving this warning in v0.3.11+, you should begin planning for a supported runtime. We are making this change to streamline Mosaic internals and align them with future product APIs, which are powered by Photon. As part of this change, Mosaic has standardized on JTS as its default and supported vector geometry provider.

@@ -75,6 +123,7 @@ Documentation
    usage/usage
    models/models
    literature/videos
+   v0.3.x/index


Indices and tables
4 changes: 3 additions & 1 deletion python/mosaic/api/raster.py
@@ -927,7 +927,9 @@ def rst_fromcontent(raster: ColumnOrName, driver: ColumnOrName, sizeInMB: Column
"""

return config.mosaic_context.invoke_function(
"rst_fromcontent", pyspark_to_java_column(raster), pyspark_to_java_column(driver),
"rst_fromcontent",
pyspark_to_java_column(raster),
pyspark_to_java_column(driver),
pyspark_to_java_column(sizeInMB)
)
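
To show the reformatted binding in use, a hedged sketch (the DataFrame, its `content` column of raster bytes, and the GTiff driver are assumptions for the example):

    import mosaic as mos
    from pyspark.sql import functions as F

    # content: a binary raster payload (e.g. from a binaryFile read);
    # driver: the GDAL driver to decode it with; the last argument is the
    # sizeInMB parameter above (assumed to cap the size of each resulting tile)
    tiles = df.select(mos.rst_fromcontent("content", F.lit("GTiff"), F.lit(8)))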

MosaicContext.scala
@@ -711,16 +711,15 @@ class MosaicContext(indexSystem: IndexSystem, geometryAPI: GeometryAPI) extends
         ColumnAdapter(RST_Tessellate(raster.expr, resolution.expr, expressionConfig))
     def rst_tessellate(raster: Column, resolution: Int): Column =
         ColumnAdapter(RST_Tessellate(raster.expr, lit(resolution).expr, expressionConfig))
-    def rst_fromcontent(raster: Column, driver:Column): Column =
+    def rst_fromcontent(raster: Column, driver: Column): Column =
         ColumnAdapter(RST_FromContent(raster.expr, driver.expr, lit(-1).expr, expressionConfig))
-    def rst_fromcontent(raster: Column, driver:Column, sizeInMB:Column): Column =
+    def rst_fromcontent(raster: Column, driver: Column, sizeInMB: Column): Column =
         ColumnAdapter(RST_FromContent(raster.expr, driver.expr, sizeInMB.expr, expressionConfig))
-    def rst_fromcontent(raster: Column, driver:String): Column =
+    def rst_fromcontent(raster: Column, driver: String): Column =
         ColumnAdapter(RST_FromContent(raster.expr, lit(driver).expr, lit(-1).expr, expressionConfig))
-    def rst_fromcontent(raster: Column, driver:String, sizeInMB:Int): Column =
+    def rst_fromcontent(raster: Column, driver: String, sizeInMB: Int): Column =
         ColumnAdapter(RST_FromContent(raster.expr, lit(driver).expr, lit(sizeInMB).expr, expressionConfig))
-    def rst_fromfile(raster: Column): Column =
-        ColumnAdapter(RST_FromFile(raster.expr, lit(-1).expr, expressionConfig))
+    def rst_fromfile(raster: Column): Column = ColumnAdapter(RST_FromFile(raster.expr, lit(-1).expr, expressionConfig))
     def rst_fromfile(raster: Column, sizeInMB: Column): Column =
         ColumnAdapter(RST_FromFile(raster.expr, sizeInMB.expr, expressionConfig))
     def rst_fromfile(raster: Column, sizeInMB: Int): Column =
@@ -1015,11 +1014,12 @@ object MosaicContext extends Logging {

val isML = sparkVersion.contains("-ml-")
val isPhoton = sparkVersion.contains("-photon-")
val isTest = (
dbrMajor == 0
&& !spark.conf.getAll.exists(_._1.startsWith("spark.databricks.clusterUsageTags."))
)

val isTest =
(
dbrMajor == 0
&& !spark.conf.getAll.exists(_._1.startsWith("spark.databricks.clusterUsageTags."))
)

if (dbrMajor != 13 && !isTest) {
val msg = """|DEPRECATION ERROR:
| Mosaic v0.4.x series only supports Databricks Runtime 13.
@@ -1039,7 +1039,7 @@
         logError(msg)
         println(msg)
         throw new Exception(msg)
-    }
+        }
         true
     }
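
These Scala overloads mirror the Python raster bindings; a hedged Python sketch of the rst_fromfile variant (the `path` column name is an assumption for the example):

    import mosaic as mos
    from pyspark.sql import functions as F

    # Load rasters from a column of file paths; per the Scala defaults above,
    # sizeInMB = -1 (the lit(-1) case) loads each raster whole, while a
    # positive value is assumed to subdivide large rasters by size
    rasters = df.select(mos.rst_fromfile("path", F.lit(-1)))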
