
Commit ddd1ef5

Update for 0.18.0, move CICD configs to supported Spark versions (#680)
1 parent c42d6bc commit ddd1ef5

5 files changed: +13 −13


.github/workflows/test_spark_3_2_java_8.yml renamed to .github/workflows/test_spark_3_3_java_8.yml

Lines changed: 2 additions & 2 deletions
@@ -1,4 +1,4 @@
-name: Spark 3.2 / Java 8 / Scala 2.12
+name: Spark 3.3 / Java 8 / Scala 2.12
 on:
   push:
     branches: [master]
@@ -14,4 +14,4 @@ jobs:
       with:
         java-version: '[email protected]'
     - name: Build and test
-      run: sbt -Dspark.testVersion=3.2.4 ++2.12.15 clean mimaReportBinaryIssues test
+      run: sbt -Dspark.testVersion=3.3.4 ++2.12.15 clean mimaReportBinaryIssues test

.github/workflows/test_spark_3_4_java_11.yml

Lines changed: 2 additions & 2 deletions
@@ -1,4 +1,4 @@
-name: Spark 3.4 / Java 11 / Scala 2.13
+name: Spark 3.4 / Java 11 / Scala 2.12
 on:
   push:
     branches: [master]
@@ -14,4 +14,4 @@ jobs:
       with:
         java-version: '[email protected]'
     - name: Build and test
-      run: sbt -Dspark.testVersion=3.4.1 ++2.13.8 clean scalastyle test:scalastyle mimaReportBinaryIssues test
+      run: sbt -Dspark.testVersion=3.4.1 ++2.12.15 clean mimaReportBinaryIssues test

.github/workflows/test_spark_3_3_java_11.yml renamed to .github/workflows/test_spark_3_5_java_11.yml

Lines changed: 2 additions & 2 deletions
@@ -1,4 +1,4 @@
-name: Spark 3.3 / Java 11 / Scala 2.13
+name: Spark 3.5 / Java 11 / Scala 2.13
 on:
   push:
     branches: [master]
@@ -14,4 +14,4 @@ jobs:
      with:
        java-version: '[email protected]'
    - name: Build and test
-      run: sbt -Dspark.testVersion=3.3.3 ++2.13.8 clean mimaReportBinaryIssues test
+      run: sbt -Dspark.testVersion=3.5.1 ++2.13.8 clean scalastyle test:scalastyle mimaReportBinaryIssues test
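Each job's `run:` line drives the same build along a different axis of the support matrix: `-Dspark.testVersion` sets a JVM system property that build.sbt reads, and `++2.12.15`/`++2.13.8` pins the Scala cross-version for that job. A minimal sketch of the sbt-side lookup, mirroring the line changed in build.sbt below:

```scala
// build.sbt side of the CI flag: the -D system property overrides the
// default; with no -Dspark.testVersion the build falls back to Spark 3.5.1.
val sparkVersion = sys.props.get("spark.testVersion").getOrElse("3.5.1")
```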

README.md

Lines changed: 4 additions & 4 deletions
@@ -16,15 +16,15 @@ You can link against this library in your program at the following coordinates:
 ```
 groupId: com.databricks
 artifactId: spark-xml_2.12
-version: 0.17.0
+version: 0.18.0
 ```
 
 ## Using with Spark shell
 
 This package can be added to Spark using the `--packages` command line option. For example, to include it when starting the spark shell:
 
 ```
-$SPARK_HOME/bin/spark-shell --packages com.databricks:spark-xml_2.12:0.17.0
+$SPARK_HOME/bin/spark-shell --packages com.databricks:spark-xml_2.12:0.18.0
 ```
 
 ## Features
@@ -410,7 +410,7 @@ Automatically infer schema (data types)
 ```R
 library(SparkR)
 
-sparkR.session("local[4]", sparkPackages = c("com.databricks:spark-xml_2.12:0.17.0"))
+sparkR.session("local[4]", sparkPackages = c("com.databricks:spark-xml_2.12:0.18.0"))
 
 df <- read.df("books.xml", source = "xml", rowTag = "book")
 
@@ -422,7 +422,7 @@ You can manually specify schema:
 ```R
 library(SparkR)
 
-sparkR.session("local[4]", sparkPackages = c("com.databricks:spark-xml_2.12:0.17.0"))
+sparkR.session("local[4]", sparkPackages = c("com.databricks:spark-xml_2.12:0.18.0"))
 customSchema <- structType(
   structField("_id", "string"),
   structField("author", "string"),

build.sbt

Lines changed: 3 additions & 3 deletions
@@ -2,7 +2,7 @@ import com.typesafe.tools.mima.core.MissingClassProblem
 
 name := "spark-xml"
 
-version := "0.17.0"
+version := "0.18.0"
 
 organization := "com.databricks"
 
@@ -12,7 +12,7 @@ crossScalaVersions := Seq("2.12.15", "2.13.8")
 
 scalacOptions := Seq("-unchecked", "-deprecation")
 
-val sparkVersion = sys.props.get("spark.testVersion").getOrElse("3.4.1")
+val sparkVersion = sys.props.get("spark.testVersion").getOrElse("3.5.1")
 
 // To avoid packaging it, it's Provided below
 autoScalaLibrary := false
@@ -81,7 +81,7 @@ fork := true
 // Prints JUnit tests in output
 Test / testOptions := Seq(Tests.Argument(TestFrameworks.JUnit, "-v"))
 
-mimaPreviousArtifacts := Set("com.databricks" %% "spark-xml" % "0.16.0")
+mimaPreviousArtifacts := Set("com.databricks" %% "spark-xml" % "0.17.0")
 
 mimaBinaryIssueFilters ++= {
   Seq()
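The `mimaPreviousArtifacts` bump makes 0.17.0, rather than 0.16.0, the binary-compatibility baseline that `mimaReportBinaryIssues` checks in every CI job above. A hedged sketch of how an intentional breakage would be waived if one turned up; the exclude shown is illustrative only and not part of this commit:

```scala
import com.typesafe.tools.mima.core.{MissingClassProblem, ProblemFilters}

// The previous release is the baseline each new version is diffed against.
mimaPreviousArtifacts := Set("com.databricks" %% "spark-xml" % "0.17.0")

// Intentional breaks get explicit waivers; this commit leaves the list empty.
mimaBinaryIssueFilters ++= Seq(
  // e.g. ProblemFilters.exclude[MissingClassProblem]("com.databricks.spark.xml.SomeInternal")
)
```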
