Releasing 2.5.0
EnricoMi committed Mar 23, 2023
1 parent 32bff29 commit dffc744
Showing 5 changed files with 22 additions and 15 deletions.
7 changes: 7 additions & 0 deletions CHANGELOG.md
@@ -3,6 +3,13 @@ All notable changes to this project will be documented in this file.

The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/).

## [2.5.0] - 2023-03-23

### Added

- Add whitespace agnostic diff comparator. (#137)
- Add Python whl package build. (#151)
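
The whitespace agnostic comparator (#137) treats string values that differ only in whitespace as equal when diffing. The sketch below only illustrates that idea in plain PySpark with made-up column names; it is not the library's comparator API.

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, regexp_replace, trim

spark = SparkSession.builder.getOrCreate()

left = spark.createDataFrame([(1, "hello  world"), (2, "foo")], ["id", "text"])
right = spark.createDataFrame([(1, "hello world"), (2, "bar")], ["id", "text"])

# Collapse runs of whitespace and trim, so values that differ only in
# spacing compare as equal.
def normalize(df, column):
    return df.withColumn(column, trim(regexp_replace(col(column), r"\s+", " ")))

left_n = normalize(left, "text").withColumnRenamed("text", "text_left")
right_n = normalize(right, "text").withColumnRenamed("text", "text_right")

# Only id 2 is reported as changed; id 1 differs only in whitespace.
left_n.join(right_n, "id").where(col("text_left") != col("text_right")).show()
```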

## [2.4.0] - 2022-12-08

### Added
16 changes: 8 additions & 8 deletions README.md
@@ -61,7 +61,7 @@ has the following semantics: `spark-extension_{SCALA_COMPAT_VERSION}-{VERSION}-{
Add this line to your `build.sbt` file:

```sbt
-libraryDependencies += "uk.co.gresearch.spark" %% "spark-extension" % "2.4.0-3.3"
+libraryDependencies += "uk.co.gresearch.spark" %% "spark-extension" % "2.5.0-3.3"
```
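
Given the artifact naming scheme referenced in the hunk context above, a coordinate such as the one used throughout this diff decomposes as in this purely illustrative Python sketch (variable names here are assumptions, not project code):

```python
# Split a spark-extension Maven coordinate into the pieces named by the
# {SCALA_COMPAT_VERSION}, {VERSION} and Spark compatibility placeholders.
coordinate = "uk.co.gresearch.spark:spark-extension_2.12:2.5.0-3.3"

group_id, artifact_id, version = coordinate.split(":")
scala_compat_version = artifact_id.split("_")[1]            # '2.12'
lib_version, spark_compat_version = version.rsplit("-", 1)  # '2.5.0', '3.3'

print(group_id, scala_compat_version, lib_version, spark_compat_version)
```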

### Maven
@@ -72,7 +72,7 @@ Add this dependency to your `pom.xml` file:
<dependency>
<groupId>uk.co.gresearch.spark</groupId>
<artifactId>spark-extension_2.12</artifactId>
-<version>2.4.0-3.3</version>
+<version>2.5.0-3.3</version>
</dependency>
```

@@ -81,7 +81,7 @@ Add this dependency to your `pom.xml` file:
Launch a Spark Shell with the Spark Extension dependency (version ≥1.1.0) as follows:

```shell script
-spark-shell --packages uk.co.gresearch.spark:spark-extension_2.12:2.4.0-3.3
+spark-shell --packages uk.co.gresearch.spark:spark-extension_2.12:2.5.0-3.3
```

Note: Pick the right Scala version (here 2.12) and Spark version (here 3.3) depending on your Spark Shell version.
@@ -97,7 +97,7 @@ from pyspark.sql import SparkSession

spark = SparkSession \
.builder \
-.config("spark.jars.packages", "uk.co.gresearch.spark:spark-extension_2.12:2.4.0-3.3") \
+.config("spark.jars.packages", "uk.co.gresearch.spark:spark-extension_2.12:2.5.0-3.3") \
.getOrCreate()
```

@@ -108,7 +108,7 @@ Note: Pick the right Scala version (here 2.12) and Spark version (here 3.3) depe
Launch the Python Spark REPL with the Spark Extension dependency (version ≥1.1.0) as follows:

```shell script
-pyspark --packages uk.co.gresearch.spark:spark-extension_2.12:2.4.0-3.3
+pyspark --packages uk.co.gresearch.spark:spark-extension_2.12:2.5.0-3.3
```

Note: Pick the right Scala version (here 2.12) and Spark version (here 3.3) depending on your PySpark version.
@@ -118,7 +118,7 @@ Note: Pick the right Scala version (here 2.12) and Spark version (here 3.3) depe
Run your Python scripts that use PySpark via `spark-submit`:

```shell script
-spark-submit --packages uk.co.gresearch.spark:spark-extension_2.12:2.4.0-3.3 [script.py]
+spark-submit --packages uk.co.gresearch.spark:spark-extension_2.12:2.5.0-3.3 [script.py]
```

Note: Pick the right Scala version (here 2.12) and Spark version (here 3.3) depending on your Spark version.
@@ -128,7 +128,7 @@ Note: Pick the right Scala version (here 2.12) and Spark version (here 3.3) depe
There are plenty of [Data Science notebooks](https://datasciencenotebook.org/) around. To use this library,
add **a jar dependency** to your notebook using these **Maven coordinates**:

-uk.co.gresearch.spark:spark-extension_2.12:2.4.0-3.3
+uk.co.gresearch.spark:spark-extension_2.12:2.5.0-3.3

Or [download the jar](https://mvnrepository.com/artifact/uk.co.gresearch.spark/spark-extension) and place it
on a filesystem where it is accessible by the notebook, and reference that jar file directly.
@@ -144,7 +144,7 @@ Running your Python application on a Spark cluster will still require one of the
to add the Scala package to the Spark environment.

```shell script
-pip install pyspark-extension==2.4.0.3.3
+pip install pyspark-extension==2.5.0.3.3
```

Note: Pick the right Spark version (here 3.3) depending on your PySpark version.
2 changes: 1 addition & 1 deletion pom.xml
@@ -2,7 +2,7 @@
<modelVersion>4.0.0</modelVersion>
<groupId>uk.co.gresearch.spark</groupId>
<artifactId>spark-extension_2.13</artifactId>
-<version>2.5.0-3.3-SNAPSHOT</version>
+<version>2.5.0-3.3</version>
<name>Spark Extension</name>
<description>A library that provides useful extensions to Apache Spark.</description>
<inceptionYear>2020</inceptionYear>
10 changes: 5 additions & 5 deletions python/README.md
@@ -24,7 +24,7 @@ Running your Python application on a Spark cluster will still require one of the
to add the Scala package to the Spark environment.

```shell script
-pip install pyspark-extension==2.4.0.3.3
+pip install pyspark-extension==2.5.0.3.3
```

Note: Pick the right Spark version (here 3.3) depending on your PySpark version.
@@ -38,7 +38,7 @@ from pyspark.sql import SparkSession

spark = SparkSession \
.builder \
-.config("spark.jars.packages", "uk.co.gresearch.spark:spark-extension_2.12:2.4.0-3.3") \
+.config("spark.jars.packages", "uk.co.gresearch.spark:spark-extension_2.12:2.5.0-3.3") \
.getOrCreate()
```

@@ -49,7 +49,7 @@ Note: Pick the right Scala version (here 2.12) and Spark version (here 3.3) depe
Launch the Python Spark REPL with the Spark Extension dependency (version ≥1.1.0) as follows:

```shell script
-pyspark --packages uk.co.gresearch.spark:spark-extension_2.12:2.4.0-3.3
+pyspark --packages uk.co.gresearch.spark:spark-extension_2.12:2.5.0-3.3
```

Note: Pick the right Scala version (here 2.12) and Spark version (here 3.3) depending on your PySpark version.
@@ -59,7 +59,7 @@ Note: Pick the right Scala version (here 2.12) and Spark version (here 3.3) depe
Run your Python scripts that use PySpark via `spark-submit`:

```shell script
-spark-submit --packages uk.co.gresearch.spark:spark-extension_2.12:2.4.0-3.3 [script.py]
+spark-submit --packages uk.co.gresearch.spark:spark-extension_2.12:2.5.0-3.3 [script.py]
```

Note: Pick the right Scala version (here 2.12) and Spark version (here 3.3) depending on your Spark version.
@@ -69,7 +69,7 @@ Note: Pick the right Scala version (here 2.12) and Spark version (here 3.3) depe
There are plenty of [Data Science notebooks](https://datasciencenotebook.org/) around. To use this library,
add **a jar dependency** to your notebook using these **Maven coordinates**:

-uk.co.gresearch.spark:spark-extension_2.12:2.4.0-3.3
+uk.co.gresearch.spark:spark-extension_2.12:2.5.0-3.3

Or [download the jar](https://mvnrepository.com/artifact/uk.co.gresearch.spark/spark-extension) and place it
on a filesystem where it is accessible by the notebook, and reference that jar file directly.
2 changes: 1 addition & 1 deletion python/setup.py
@@ -17,7 +17,7 @@
from pathlib import Path
from setuptools import setup

-jar_version = '2.5.0-3.3-SNAPSHOT'
+jar_version = '2.5.0-3.3'
scala_version = '2.13.8'
scala_compat_version = '.'.join(scala_version.split('.')[:2])
spark_compat_version = jar_version.split('-')[1]
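
For illustration, a small sketch using only the values visible in this diff shows how the compatibility versions above resolve:

```python
jar_version = '2.5.0-3.3'
scala_version = '2.13.8'

# Keep only the major.minor part of the Scala version -> '2.13'
scala_compat_version = '.'.join(scala_version.split('.')[:2])

# Take the suffix after the dash as the Spark compatibility version -> '3.3'
spark_compat_version = jar_version.split('-')[1]

print(scala_compat_version, spark_compat_version)  # 2.13 3.3
```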
