updated version number to 3.3.1 (#469)
Aryex authored Jul 26, 2022
1 parent 99d5894 commit 300dbad
Showing 5 changed files with 6 additions and 6 deletions.
4 changes: 2 additions & 2 deletions CONTRIBUTING.md
@@ -72,7 +72,7 @@ cd connector
sbt assembly
```

-Running this will run all unit tests and build the jar to target/scala-2.12/spark-vertica-connector-assembly-3.3.0.jar
+Running this will run all unit tests and build the jar to target/scala-2.12/spark-vertica-connector-assembly-3.3.1.jar

## Step 4: Set up an environment
The easiest way to set up an environment is to spin up the docker containers for a sandbox client environment and single-node clusters for both Vertica and HDFS following [this guide.](https://github.com/vertica/spark-connector/blob/main/examples/README.md)
@@ -88,7 +88,7 @@ The next requirement is a spark application that uses the connector jar. Example
```shell
cd examples/basic-read
mkdir lib
-cp ../../connector/target/scala-2.12/spark-vertica-connector-assembly-3.3.0.jar lib
+cp ../../connector/target/scala-2.12/spark-vertica-connector-assembly-3.3.1.jar lib
sbt run
```

2 changes: 1 addition & 1 deletion examples/kerberos-example/README.md
@@ -20,7 +20,7 @@ sbt assembly
Then create a `lib` folder at `/kerberos-example` and put the spark connector that you assembled inside.
```
mkdir /spark-connector/examples/kerberos-example/lib
-cp /spark-connector/connector/target/scala-2.12/spark-vertica-connector-assembly-3.3.0.jar /spark-connector/examples/kerberos-example/lib
+cp /spark-connector/connector/target/scala-2.12/spark-vertica-connector-assembly-3.3.1.jar /spark-connector/examples/kerberos-example/lib
```
Then in the example's `build.sbt`, comment out the vertica-spark connector dependency.
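For illustration only, the commented-out dependency in that `build.sbt` might look roughly like the sketch below; the group ID, artifact name, and version are assumptions rather than values taken from this commit, so match them to what the example's `build.sbt` actually declares.

```scala
// Hypothetical sketch: the published connector dependency is disabled so that sbt
// resolves the locally assembled jar placed in the lib/ folder instead.
// libraryDependencies += "com.vertica.spark" % "vertica-spark" % "3.3.1"
```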

2 changes: 1 addition & 1 deletion examples/sparklyr/run.r
@@ -6,7 +6,7 @@ if (file.exists("done")) unlink("done")

# Create the Spark config and give access to our connector jar file
config <- spark_config()
-config$sparklyr.jars.default <- "../../connector/target/scala-2.12/spark-vertica-connector-assembly-3.3.0.jar"
+config$sparklyr.jars.default <- "../../connector/target/scala-2.12/spark-vertica-connector-assembly-3.3.1.jar"

# Submit a new Spark job that executes sparkapp.r with Spark version 3.1
spark_submit(master = "spark://localhost:7077", version = "3.1", file = "sparkapp.r", config = config)
2 changes: 1 addition & 1 deletion functional-tests/README.md
@@ -8,7 +8,7 @@ Configuration is specified with application.conf (HOCON format)
From the functional-tests directory, run the following commands:
```
mkdir lib
-cd ../connector && sbt assembly && cp target/scala-2.12/spark-vertica-connector-assembly-3.3.0.jar ../functional-tests/lib && cd ../functional-tests
+cd ../connector && sbt assembly && cp target/scala-2.12/spark-vertica-connector-assembly-3.3.1.jar ../functional-tests/lib && cd ../functional-tests
```
This will create a lib folder and then build and copy the connector JAR file to it.

2 changes: 1 addition & 1 deletion version.properties
@@ -1,2 +1,2 @@
-connector-version=3.3.0
+connector-version=3.3.1
