Commit

Merge branch 'databrickslabs:main' into main
a0x8o authored Jan 12, 2024
2 parents 399528c + f734d66 commit 5e2afd6
Showing 107 changed files with 795 additions and 227 deletions.
3 changes: 2 additions & 1 deletion .github/actions/python_build/action.yml
@@ -10,9 +10,10 @@ runs:
- name: Install python dependencies
shell: bash
run: |
# - install pip libs
# note: gdal requires the extra args
cd python
pip install build wheel pyspark==${{ matrix.spark }} numpy==${{ matrix.numpy }}
pip install numpy==${{ matrix.numpy }}
pip install --no-build-isolation --no-cache-dir --force-reinstall gdal==${{ matrix.gdal }}
pip install .
- name: Test and build python package
17 changes: 8 additions & 9 deletions .github/actions/scala_build/action.yml
@@ -25,17 +25,16 @@ runs:
sudo apt-add-repository "deb http://archive.ubuntu.com/ubuntu $(lsb_release -sc)-security main multiverse restricted universe"
sudo apt-add-repository "deb http://archive.ubuntu.com/ubuntu $(lsb_release -sc) main multiverse restricted universe"
sudo apt-get update -y
# - install numpy first
pip install --upgrade pip
pip install 'numpy>=${{ matrix.numpy }}'
# - install natives
sudo apt-get install -y unixodbc libcurl3-gnutls libsnappy-dev libopenjp2-7
sudo apt-get install -y gdal-bin libgdal-dev python3-gdal
# - install gdal with numpy
pip install --no-cache-dir --force-reinstall 'GDAL[numpy]==${{ matrix.gdal }}'
sudo wget -P /usr/lib -nc https://github.com/databrickslabs/mosaic/raw/main/resources/gdal/jammy/libgdalalljni.so
sudo wget -P /usr/lib -nc https://github.com/databrickslabs/mosaic/raw/main/resources/gdal/jammy/libgdalalljni.so.30
#sudo wget -P /usr/lib -nc https://github.com/databrickslabs/mosaic/raw/main/resources/gdal/jammy/libgdalalljni.so.30.0.3
sudo apt-get install -y gdal-bin libgdal-dev python3-numpy python3-gdal
# - install pip libs
pip install --upgrade pip
pip install gdal==${{ matrix.gdal }}
# - add the so files
sudo wget -nv -P /usr/lib -nc https://raw.githubusercontent.com/databrickslabs/mosaic/main/resources/gdal/jammy/libgdalalljni.so
sudo wget -nv -P /usr/lib -nc https://raw.githubusercontent.com/databrickslabs/mosaic/main/resources/gdal/jammy/libgdalalljni.so.30
sudo wget -nv -P /usr/lib -nc https://raw.githubusercontent.com/databrickslabs/mosaic/main/resources/gdal/jammy/libgdalalljni.so.30.0.3
- name: Test and build the scala JAR - skip tests is false
if: inputs.skip_tests == 'false'
shell: bash
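
A quick way to sanity-check a GDAL install like the one above, once the action has run, is to query the installed bindings from Python (a minimal sketch; assumes the `osgeo` package installed by the `pip install gdal==...` step):

```python
# Minimal sanity check for the GDAL install performed above (illustrative only).
from osgeo import gdal

# Report the GDAL release the Python bindings are linked against; it should
# match the pinned version from the build matrix (e.g. 3.4.1).
print(gdal.VersionInfo("RELEASE_NAME"))
```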
6 changes: 3 additions & 3 deletions .github/workflows/build_main.yml
@@ -17,7 +17,7 @@ jobs:
strategy:
matrix:
python: [ 3.10.12 ]
numpy: [ 1.21.5 ]
numpy: [ 1.22.4 ]
gdal: [ 3.4.1 ]
spark: [ 3.4.0 ]
R: [ 4.2.2 ]
@@ -28,7 +28,7 @@ jobs:
uses: ./.github/actions/scala_build
- name: build python
uses: ./.github/actions/python_build
- name: build R
uses: ./.github/actions/r_build
# - name: build R
# uses: ./.github/actions/r_build
- name: upload artefacts
uses: ./.github/actions/upload_artefacts
2 changes: 1 addition & 1 deletion .github/workflows/build_python.yml
@@ -13,7 +13,7 @@ jobs:
strategy:
matrix:
python: [ 3.10.12 ]
numpy: [ 1.21.5 ]
numpy: [ 1.22.4 ]
gdal: [ 3.4.1 ]
spark: [ 3.4.0 ]
R: [ 4.2.2 ]
2 changes: 1 addition & 1 deletion .github/workflows/build_r.yml
@@ -14,7 +14,7 @@ jobs:
strategy:
matrix:
python: [ 3.10.12 ]
numpy: [ 1.21.5 ]
numpy: [ 1.22.4 ]
gdal: [ 3.4.1 ]
spark: [ 3.4.0 ]
R: [ 4.2.2 ]
2 changes: 1 addition & 1 deletion .github/workflows/build_scala.yml
@@ -12,7 +12,7 @@ jobs:
strategy:
matrix:
python: [ 3.10.12 ]
numpy: [ 1.21.5 ]
numpy: [ 1.22.4 ]
gdal: [ 3.4.1 ]
spark: [ 3.4.0 ]
R: [ 4.2.2 ]
2 changes: 1 addition & 1 deletion .github/workflows/pypi-release.yml
@@ -10,7 +10,7 @@ jobs:
strategy:
matrix:
python: [ 3.10.12 ]
numpy: [ 1.21.5 ]
numpy: [ 1.22.4 ]
gdal: [ 3.4.1 ]
spark: [ 3.4.0 ]
R: [ 4.2.2 ]
10 changes: 9 additions & 1 deletion CHANGELOG.md
@@ -1,4 +1,12 @@
## v0.3.14
## v0.4.0 [DBR 13.3 LTS]
- First release for DBR 13.3 LTS, which is Ubuntu Jammy and Spark 3.4.1. Not backwards compatible, meaning it will not run on prior DBRs; requires either a Photon DBR or an ML Runtime (__Standard, non-Photon DBR no longer allowed__).
- New `setup_fuse_install` function to meet various requirements arising with Unity Catalog + Shared Access clusters; removed the scala equivalent function, making artifact setup and install python-first for scala and Spark SQL.
- Removed OSS ESRI Geometry API for 0.4 series, JTS now the only vector provider.
- MosaicAnalyzer functions now accept Spark DataFrames instead of MosaicFrame, which has been removed.
- Docs for 0.3.x have been archived and linked from current docs; notebooks for 0.3.x have been separated from current notebooks.
- This release targets Assigned (vs Shared Access) clusters and offers python and scala language bindings; SQL expressions will not register in this release within Unity Catalog.
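
To make the python-first setup concrete, a hypothetical use of the new `setup_fuse_install` helper might look like the sketch below; the argument shown is an assumption for illustration, not the confirmed signature (see the 0.4.0 docs for the actual API):

```python
import mosaic as mos

mos.enable_mosaic(spark, dbutils)

# Hypothetical call: stage the Mosaic artifacts (e.g. the JAR) to a fuse-mounted
# location such as a Unity Catalog Volume so clusters can pick them up.
# The path and argument shown are illustrative assumptions only.
mos.setup_fuse_install("/Volumes/my_catalog/my_schema/my_volume/mosaic")
```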

## v0.3.14 [DBR < 13]
- Fixes for Warning and Error messages on mosaic_enable call.
- Performance improvements for raster functions.
- Fix support for GDAL configuration via spark config (use 'spark.databricks.labs.mosaic.gdal.' prefix).
2 changes: 1 addition & 1 deletion R/generate_R_bindings.R
@@ -52,7 +52,7 @@ build_column_specifiers <- function(input){
build_method<-function(input){
function_name <- input$function_name
arg_names <- lapply(input$args, function(x){c(x[1])})
#this handles converting non-Column arguments to their R equivalents
# this handles converting non-Column arguments to their R equivalents
argument_parser <- function(x){
if(x[2] == 'Int'){
x[2] <- "numeric"
58 changes: 44 additions & 14 deletions README.md
@@ -32,26 +32,53 @@ The supported languages are Scala, Python, R, and SQL.

## How does it work?

The Mosaic library is written in Scala to guarantee maximum performance with Spark and when possible, it uses code generation to give an extra performance boost.

The other supported languages (Python, R and SQL) are thin wrappers around the Scala code.
The Mosaic library is written in Scala (JVM) to guarantee maximum performance with Spark and when possible, it uses code generation to give an extra performance boost.

__The other supported languages (Python, R and SQL) are thin wrappers around the Scala (JVM) code.__

![mosaic-logical-design](src/main/resources/MosaicLogicalDesign.png)
Image1: Mosaic logical design.

## Getting started

We recommend using Databricks Runtime versions 11.3 LTS or 12.2 LTS with Photon enabled; this will leverage the
Databricks H3 expressions when using H3 grid system.
### Mosaic 0.4.x Series [Latest]

We recommend using Databricks Runtime version 13.3 LTS with Photon enabled.

:warning: **Mosaic 0.4.x series only supports DBR 13**. Running it on a different DBR will throw an exception:

> DEPRECATION ERROR: Mosaic v0.4.x series only supports Databricks Runtime 13. You can specify `%pip install 'databricks-mosaic<0.4,>=0.3'` for DBR < 13.

As of the 0.4.0 release, Mosaic issues the following ERROR when initialized on a cluster that is neither Photon Runtime nor Databricks Runtime ML [[ADB](https://learn.microsoft.com/en-us/azure/databricks/runtime/) | [AWS](https://docs.databricks.com/runtime/index.html) | [GCP](https://docs.gcp.databricks.com/runtime/index.html)]:

> DEPRECATION ERROR: Please use a Databricks Photon-enabled Runtime for performance benefits or Runtime ML for spatial AI benefits; Mosaic 0.4.x series restricts executing this cluster.

__Language Bindings__

As of Mosaic 0.4.0 (subject to change in follow-on releases)...

:warning: **Mosaic 0.3 series does not support DBR 13** (coming soon); also, DBR 10 is no longer supported in Mosaic.
* _Mosaic SQL expressions cannot yet be registered with [Unity Catalog](https://www.databricks.com/product/unity-catalog) due to API changes affecting DBRs >= 13._
* [Assigned Clusters](https://docs.databricks.com/en/compute/configure.html#access-modes): Mosaic Python, R, and Scala APIs.
* [Shared Access Clusters](https://docs.databricks.com/en/compute/configure.html#access-modes): Mosaic Scala API (JVM) with Admin [allowlisting](https://docs.databricks.com/en/data-governance/unity-catalog/manage-privileges/allowlist.html); _Python bindings to Mosaic Scala APIs are blocked by Py4J Security on Shared Access Clusters._

As of the 0.3.11 release, Mosaic issues the following warning when initialized on a cluster that is neither Photon Runtime nor Databricks Runtime ML [[ADB](https://learn.microsoft.com/en-us/azure/databricks/runtime/) | [AWS](https://docs.databricks.com/runtime/index.html) | [GCP](https://docs.gcp.databricks.com/runtime/index.html)]:
__Additional Notes:__

> DEPRECATION WARNING: Mosaic is not supported on the selected Databricks Runtime. Mosaic will stop working on this cluster after v0.3.x. Please use a Databricks Photon-enabled Runtime (for performance benefits) or Runtime ML (for spatial AI benefits).
As of Mosaic 0.4.0 (subject to change in follow-on releases)...

If you are receiving this warning in v0.3.11+, you will want to begin to plan for a supported runtime. The reason we are making this change is that we are streamlining Mosaic internals to be more aligned with future product APIs which are powered by Photon. Along this direction of change, Mosaic will be standardizing to JTS as its default and supported Vector Geometry Provider.
1. [Unity Catalog](https://www.databricks.com/product/unity-catalog): Enforces process isolation which is difficult to accomplish with custom JVM libraries; as such only built-in (aka platform provided) JVM APIs can be invoked from other supported languages in Shared Access Clusters.
2. [Volumes](https://docs.databricks.com/en/connect/unity-catalog/volumes.html): Along the same principle of isolation, clusters (both assigned and shared access) can read Volumes via relevant built-in readers and writers or via custom python calls which do not involve any custom JVM code.
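
As a small illustration of the Volumes note above, reads that go through built-in Spark readers (no custom JVM code) work from Python on either cluster mode; the catalog, schema, and volume names below are placeholders:

```python
# Illustrative only: read files staged in a Unity Catalog Volume with a
# built-in Spark reader, without invoking any custom JVM code.
df = (
    spark.read.format("binaryFile")
    .load("/Volumes/my_catalog/my_schema/my_volume/rasters/")
)
df.select("path", "length").show(5, truncate=False)
```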

### Mosaic 0.3.x Series

We recommend using Databricks Runtime version 12.2 LTS with Photon enabled.

:warning: **Mosaic 0.3.x series does not support DBR 13**.

As of the 0.3.11 release, Mosaic issues the following WARNING when initialized on a cluster that is neither Photon Runtime nor Databricks Runtime ML [[ADB](https://learn.microsoft.com/en-us/azure/databricks/runtime/) | [AWS](https://docs.databricks.com/runtime/index.html) | [GCP](https://docs.gcp.databricks.com/runtime/index.html)]:

> DEPRECATION WARNING: Please use a Databricks Photon-enabled Runtime for performance benefits or Runtime ML for spatial AI benefits; Mosaic will stop working on this cluster after v0.3.x.

If you are receiving this warning in v0.3.11+, you will want to begin to plan for a supported runtime. The reason we are making this change is that we are streamlining Mosaic internals to be more aligned with future product APIs which are powered by Photon. Along this direction of change, Mosaic has standardized to JTS as its default and supported Vector Geometry Provider.

### Documentation

@@ -114,21 +141,24 @@ import com.databricks.labs.mosaic.JTS
val mosaicContext = MosaicContext.build(H3, JTS)
mosaicContext.register(spark)
```

__Note: Mosaic 0.4.x SQL bindings for DBR 13 not yet available in Unity Catalog due to API changes.__
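
For comparison with the Scala registration above, the Python binding is enabled as follows (a minimal sketch; `enable_mosaic` is the entry point shown later in this commit):

```python
from pyspark.sql import SparkSession
import mosaic as mos

spark = SparkSession.builder.getOrCreate()

# Enable Mosaic; in a Databricks notebook, dbutils is predefined and passing it
# makes the display helpers available.
mos.enable_mosaic(spark, dbutils)
```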

## Examples

Here are some example notebooks, check the language links for latest [[Python](/notebooks/examples/python/) | [Scala](/notebooks/examples/scala/) | [SQL](/notebooks/examples/sql/) | [R](/notebooks/examples/R/)]:

| Example | Description | Links |
| --- | --- | --- |
| __Quick Start__ | Example of performing spatial point-in-polygon joins on the NYC Taxi dataset | [python](/notebooks/examples/python/QuickstartNotebook.py), [scala](notebooks/examples/scala/QuickstartNotebook.scala), [R](notebooks/examples/R/QuickstartNotebook.r), [SQL](notebooks/examples/sql/QuickstartNotebook.sql) |
| __Quick Start__ | Example of performing spatial point-in-polygon joins on the NYC Taxi dataset | [python](/notebooks/examples/python/QuickstartNotebook.ipynb), [scala](notebooks/examples/scala/QuickstartNotebook.ipynb), [R](notebooks/examples/R/QuickstartNotebook.r), [SQL](notebooks/examples/sql/QuickstartNotebook.ipynb) |
| Shapefiles | Examples of reading multiple shapefiles | [python](notebooks/examples/python/Shapefiles/) |
| Spatial KNN | Runnable notebook-based example using Mosaic [SpatialKNN](https://databrickslabs.github.io/mosaic/models/spatial-knn.html) model | [python](notebooks/examples/python/SpatialKNN) |
| Open Street Maps | Ingesting and processing with Delta Live Tables the Open Street Maps dataset to extract buildings polygons and calculate aggregation statistics over H3 indexes | [python](notebooks/examples/python/OpenStreetMaps) |
| NetCDF | Read multiple NetCDFs, process through various data engineering steps before analyzing and rendering | [python](notebooks/examples/python/NetCDF/) |
| STS Transfers | Detecting Ship-to-Ship transfers at scale by leveraging Mosaic to process AIS data. | [python](notebooks/examples/python/Ship2ShipTransfers), [blog](https://medium.com/@timo.roest/ship-to-ship-transfer-detection-b370dd9d43e8) |

You can import those examples in Databricks workspace using [these instructions](https://docs.databricks.com/notebooks/notebooks-manage.html#import-a-notebook).
You can import those examples in Databricks workspace using [these instructions](https://docs.databricks.com/en/notebooks/index.html).

## Ecosystem
Mosaic is intended to augment the existing system and unlock the potential by integrating spark, delta and 3rd party frameworks into the Lakehouse architecture.
Mosaic is intended to augment the existing system and unlock the potential by integrating [Spark](https://spark.apache.org/), [Delta Lake](https://delta.io/) and 3rd party frameworks into the Lakehouse architecture.

![mosaic-logo](src/main/resources/MosaicEcosystem.png)
Image2: Mosaic ecosystem - Lakehouse integration.
2 changes: 1 addition & 1 deletion pom.xml
@@ -278,7 +278,7 @@
<scala.version>2.12.10</scala.version>
<scala.compat.version>2.12</scala.compat.version>
<spark.version>3.4.0</spark.version>
<mosaic.version>0.3.14</mosaic.version>
<mosaic.version>0.4.0</mosaic.version>
</properties>
</profile>
</profiles>
2 changes: 1 addition & 1 deletion python/mosaic/__init__.py
@@ -4,4 +4,4 @@
from .models import SpatialKNN
from .readers import read

__version__ = "0.3.14"
__version__ = "0.4.0"
51 changes: 38 additions & 13 deletions python/mosaic/api/enable.py
@@ -10,7 +10,10 @@
from mosaic.utils.notebook_utils import NotebookUtils


def enable_mosaic(spark: SparkSession, dbutils=None) -> None:
def enable_mosaic(
spark: SparkSession, dbutils = None, log_info: bool = False,
jar_path: str = None, jar_autoattach: bool = True
) -> None:
"""
Enable Mosaic functions.
@@ -22,9 +25,25 @@ def enable_mosaic(spark: SparkSession, dbutils=None) -> None:
spark : pyspark.sql.SparkSession
The active SparkSession.
dbutils : dbruntime.dbutils.DBUtils
The dbutils object used for `display` and `displayHTML` functions.
Optional, only applicable to Databricks users.
Optional, specify dbutils object used for `display` and `displayHTML` functions.
log_info : bool
Logging cannot be adjusted with Unity Catalog Shared Access clusters;
attempting to do so will throw a Py4JSecurityException.
- True will try to setLogLevel to 'info'
- False will not; default is False
jar_path : str
Convenience when you need to change the JAR path for Unity Catalog
Volumes with Shared Access clusters
- Default is None; if provided, sets
"spark.databricks.labs.mosaic.jar.path"
jar_autoattach : bool
Convenience when you need to turn off JAR auto-attach for Unity
Catalog Volumes with Shared Access clusters.
- False will not register the JAR; sets
"spark.databricks.labs.mosaic.jar.autoattach" to "false"
- True will register the JAR; default is True
Returns
-------
@@ -34,7 +53,7 @@ def enable_mosaic(spark: SparkSession, dbutils=None) -> None:
- `spark.databricks.labs.mosaic.jar.autoattach`: 'true' (default) or 'false'
Automatically attach the Mosaic JAR to the Databricks cluster? (Optional)
- `spark.databricks.labs.mosaic.jar.location`
- `spark.databricks.labs.mosaic.jar.path`
Explicitly specify the path to the Mosaic JAR.
(Optional and not required at all in a standard Databricks environment).
- `spark.databricks.labs.mosaic.geometry.api`: 'JTS'
@@ -43,8 +62,20 @@ def enable_mosaic(spark: SparkSession, dbutils=None) -> None:
Explicitly specify the index system to use for optimized spatial joins. (Optional)
"""
# Set spark session, conditionally:
# - set conf for jar autoattach
# - set conf for jar path
# - set log level to 'info'
if not jar_autoattach:
spark.conf.set("spark.databricks.labs.mosaic.jar.autoattach", "false")
print("...set 'spark.databricks.labs.mosaic.jar.autoattach' to false")
if jar_path is not None:
spark.conf.set("spark.databricks.labs.mosaic.jar.path", jar_path)
print(f"...set 'spark.databricks.labs.mosaic.jar.path' to '{jar_path}'")
if log_info:
spark.sparkContext.setLogLevel('info')
config.mosaic_spark = spark
_ = MosaicLibraryHandler(config.mosaic_spark)
_ = MosaicLibraryHandler(config.mosaic_spark, log_info = log_info)
config.mosaic_context = MosaicContext(config.mosaic_spark)

# Register SQL functions
@@ -56,14 +87,8 @@ def enable_mosaic(spark: SparkSession, dbutils=None) -> None:

isSupported = config.mosaic_context._context.checkDBR(spark._jsparkSession)
if not isSupported:
print(
"""
DEPRECATION WARNING:
Please use a Databricks:
- Photon-enabled Runtime for performance benefits
- Runtime ML for spatial AI benefits
Mosaic will stop working on this cluster after v0.3.x."""
)
# unexpected - checkDBR returns true or throws exception
print("""WARNING: checkDBR returned False.""")

# Not yet added to the pyspark API
with warnings.catch_warnings():
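
Putting the new `enable_mosaic` keyword arguments from this diff together, a usage sketch for a Unity Catalog Volumes + Shared Access setup might look like the following (the Volume path is a placeholder):

```python
import mosaic as mos

# Sketch only: turn off JAR auto-attach and point Mosaic at a JAR staged in a
# Unity Catalog Volume, using the keyword arguments added in this change.
mos.enable_mosaic(
    spark,
    dbutils,
    log_info=False,        # avoid setLogLevel on Shared Access clusters
    jar_autoattach=False,  # sets "spark.databricks.labs.mosaic.jar.autoattach" to "false"
    jar_path="/Volumes/my_catalog/my_schema/my_volume/mosaic/mosaic.jar",
)
```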