Downgrade cats dependency (#220)
* Downgrade cats dependency

* Update README.md

* Remove unnecessary slim jar file

* update readme
ravjotbrar authored Aug 30, 2021
1 parent eb956fb commit cf8c1bd
Showing 5 changed files with 19 additions and 5 deletions.
2 changes: 1 addition & 1 deletion connector/build.sbt
@@ -28,7 +28,7 @@ libraryDependencies += "org.scalactic" %% "scalactic" % "3.2.2" % Test
 libraryDependencies += "org.scalatest" %% "scalatest" % "3.2.2" % "test"
 libraryDependencies += "com.typesafe.scala-logging" %% "scala-logging" % "3.9.2"
 libraryDependencies += "org.scalamock" %% "scalamock" % "4.4.0" % Test
-libraryDependencies += "org.typelevel" %% "cats-core" % "2.3.0"
+libraryDependencies += "org.typelevel" %% "cats-core" % "2.1.1"

 Test / parallelExecution := false
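A note on the change above: the downgrade to 2.1.1 presumably aligns `cats-core` with the cats-kernel binaries already on the Spark application classpath (the `NoSuchMethodError` documented in `examples/README.md` points at `cats.kernel`). One way to keep the transitive cats modules from diverging again is sbt's `dependencyOverrides`; this fragment is an illustration, not part of the commit:

```
// Hypothetical build.sbt fragment: force all cats modules to the same 2.1.1
// line so that cats-core and cats-kernel cannot resolve to different versions.
dependencyOverrides ++= Seq(
  "org.typelevel" %% "cats-core"   % "2.1.1",
  "org.typelevel" %% "cats-kernel" % "2.1.1"
)
```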
@@ -154,7 +154,6 @@ class VerticaDistributedFilesystemWritePipe(val config: DistributedFilesystemWri
   fileStoreLayer.closeWriteParquetFile()
 }
-
 def buildCopyStatement(targetTable: String, columnList: String, url: String, rejectsTableName: String, fileFormat: String): String = {
   if (config.mergeKey.isDefined) {
     s"COPY $targetTable FROM '$url' ON ANY NODE $fileFormat REJECTED DATA AS TABLE $rejectsTableName NO COMMIT"
@@ -13,7 +13,7 @@

 package com.vertica.spark.util.cleanup

-import cats.implicits.toTraverseOps
+import cats.implicits._
 import com.vertica.spark.config.LogProvider
 import com.vertica.spark.datasource.fs.FileStoreLayerInterface
 import com.vertica.spark.util.error.{CleanupError, ConnectorError, ParentDirMissingError}
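The import swap above is presumably needed because the granular `cats.implicits.toTraverseOps` alias is not source-compatible with cats 2.1.1 (the selective syntax imports date from the cats 2.2.0 reorganization); the wildcard import brings the same traverse/sequence syntax into scope on the older version. A minimal illustration of that syntax, not taken from the connector:

```scala
import cats.implicits._ // wildcard works on cats 2.1.1; brings in traverse/sequence syntax

// sequence turns List[Either[E, A]] into Either[E, List[A]],
// short-circuiting on the first Left.
val results: List[Either[String, Int]] = List(Right(1), Right(2), Right(3))
val combined: Either[String, List[Int]] = results.sequence
// combined == Right(List(1, 2, 3))
```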
@@ -19,9 +19,7 @@ import java.sql.ResultSetMetaData

 import cats.data.NonEmptyList
 import cats.implicits._

-import scala.util.Either
-import cats.instances.list._
 import com.vertica.spark.config.{LogProvider, TableName, TableQuery, TableSource, ValidColumnList}
 import com.vertica.spark.util.error.ErrorHandling.{ConnectorResult, SchemaResult}
 import com.vertica.spark.util.error._
17 changes: 17 additions & 0 deletions examples/README.md
@@ -78,6 +78,23 @@ Once in the examples directory, run:
```
Example argument: basic-read-example/target/scala-2.12/spark-vertica-connector-basic-read-example-assembly-2.0.1.jar

### Using the Thin Jar
If you are using the thin jar and run into an error similar to the following, you may need to shade the cats dependency in your project:
`java.lang.NoSuchMethodError: 'void cats.kernel.CommutativeSemigroup.$init$(cats.kernel.CommutativeSemigroup)'`

This can be done by adding the following to your `build.sbt` file:

```
assembly / assemblyShadeRules := {
  val shadePackage = "com.azavea.shaded.demo"
  Seq(
    ShadeRule.rename("cats.kernel.**" -> s"$shadePackage.cats.kernel.@1").inAll
  )
}
```
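The `assemblyShadeRules` / `ShadeRule` API shown above is provided by the sbt-assembly plugin, and the `com.azavea.shaded.demo` package name is arbitrary (any unique prefix works). If the plugin is not already configured in your build, an `addSbtPlugin` line along these lines belongs in `project/plugins.sbt` (the version shown is an example; use whatever your build already pins):

```
// project/plugins.sbt -- sbt-assembly provides the `assembly` task and ShadeRule
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.15.0")
```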

### Note

There are additional prerequisites to run the S3, Pyspark, Sparklyr, or Kerberos examples. If you want to run these, please take a look at their respective README files.


