1.0.0 #83

Closed · wants to merge 10 commits
1 change: 1 addition & 0 deletions .travis.yml
@@ -23,3 +23,4 @@ env:
# SONA_PASS
- secure: oDu2dXzektYr/7K5nw7EL2qDUR5AhO4Uz6XGHoOQsN1gJiovdsa5nJeDHgo2YFGpJljyTo+lABbxpGIFQpcnKGIG9eAaXIyYpRlEiksTUnZdwIlCXkRMg2l9cUr30ZDOoVS8QpQbCDdogOSqJ+RUShSuiXR8Qi2e0RfrsVucgkNogQ6w1IoB9kV8CAYsnJVzi/oenTJZjEh5qrKiUALpkiHGjB9WSIHQ80sAO/rwnr88w++HcOIqgnvhJ3/Ig3N6201Slud5pF2yVz4MxzY8bedetqNil5ffosYiU7dladOiKTVj8efZPx0cGq0dhpAZFVhehlXyu4EA24NRgKYvAIc0xWVVm49IBaMpDDI/nh24uF9fBPt2+Apj5BY/ETpKS5tFqFaGkBjlL9KFL3l2DfnWC8AfTHlBXFlkH8tKPSN4so612QAmWuULtrVuQpV8DF40HNwJoR2Lyyy5aHrZtpdjHsp3OJI83QfCxH2yTYhes4eHAxi4ynZDSDolt6mrjx651mmlQCsJWJ5KdWHQwjqzgRP8q1/bCaDYdODhrz0K1JPl6YYA+dzwRP+rFeSQbzG0yGo12p7FZGpq36/Hq9C/HSw6WVDN3Lr8CUxZr1rDhtmAvaMJG5EyYDXpNGn9j2DJX76A1Ifu7KXCp8h+FTLPa1CIxJruNxEA6vFSdqA=
- SONA_USER=snowplow
- ENCRYPTION_LABEL: 1ec7ed6de651
1 change: 1 addition & 0 deletions .travis/deploy.sh
@@ -18,6 +18,7 @@ pwd

project_version=$(sbt version -Dsbt.log.noformat=true | perl -ne 'print $1 if /(\d+\.\d+\.\d+[^\r\n]*)/')
if [ "${project_version}" == "${tag_version}" ]; then
./.travis/deploy_docs.sh
sbt +publish
sbt +bintraySyncMavenCentral
else
17 changes: 17 additions & 0 deletions .travis/deploy_docs.sh
@@ -0,0 +1,17 @@
#!/bin/bash

ENCRYPTED_KEY_VAR="encrypted_${ENCRYPTION_LABEL}_key"
ENCRYPTED_IV_VAR="encrypted_${ENCRYPTION_LABEL}_iv"
ENCRYPTED_KEY=${!ENCRYPTED_KEY_VAR}
ENCRYPTED_IV=${!ENCRYPTED_IV_VAR}

git config --global user.name "$USER"
git config --global user.email "$TRAVIS_BUILD_NUMBER@$TRAVIS_COMMIT"

openssl aes-256-cbc -K $ENCRYPTED_KEY -iv $ENCRYPTED_IV -in project/travis-deploy-key.enc -out project/travis-deploy-key -d
chmod 600 project/travis-deploy-key

eval "$(ssh-agent -s)"
ssh-add project/travis-deploy-key

sbt ghpagesPushSite
7 changes: 7 additions & 0 deletions CHANGELOG
@@ -1,3 +1,10 @@
Version 1.0.0 (2019-x-x)
--------------------------
Make parsing errors type-safe (#75)
Add function for creating empty event (#81)
Remove Vagrant setup (#84)
Extend copyright notice to 2019 (#85)

Version 0.4.2 (2019-08-06)
--------------------------
Bump iglu-core to 0.5.1 (#73)
4 changes: 2 additions & 2 deletions README.md
@@ -16,7 +16,7 @@ Use this SDK with **[Apache Spark][spark]**, **[AWS Lambda][lambda]**, **[Apache

## Copyright and license

The Snowplow Scala Analytics SDK is copyright 2016-2017 Snowplow Analytics Ltd.
The Snowplow Scala Analytics SDK is copyright 2016-2019 Snowplow Analytics Ltd.

Licensed under the **[Apache License, Version 2.0][license]** (the "License");
you may not use this software except in compliance with the License.
@@ -33,7 +33,7 @@ limitations under the License.
[license-image]: http://img.shields.io/badge/license-Apache--2-blue.svg?style=flat
[license]: http://www.apache.org/licenses/LICENSE-2.0

[release-image]: http://img.shields.io/badge/release-0.4.2-blue.svg?style=flat
[release-image]: http://img.shields.io/badge/release-1.0.0-blue.svg?style=flat
[releases]: https://github.com/snowplow/snowplow-scala-analytics-sdk/releases

[setup-guide]: https://github.com/snowplow/snowplow/wiki/Scala-Analytics-SDK-setup
19 changes: 0 additions & 19 deletions Vagrantfile

This file was deleted.

7 changes: 6 additions & 1 deletion build.sbt
@@ -15,13 +15,18 @@ lazy val root = project.in(file("."))
.settings(Seq[Setting[_]](
name := "snowplow-scala-analytics-sdk",
organization := "com.snowplowanalytics",
version := "0.4.2",
version := "1.0.0-M1",
description := "Scala analytics SDK for Snowplow",
scalaVersion := "2.12.8",
crossScalaVersions := Seq("2.11.12", "2.12.8")
))
.enablePlugins(SiteScaladocPlugin)
.enablePlugins(GhpagesPlugin)
.settings(BuildSettings.buildSettings)
.settings(BuildSettings.publishSettings)
.settings(BuildSettings.mimaSettings)
.settings(BuildSettings.scoverageSettings)
.settings(BuildSettings.ghPagesSettings)
.settings(Seq(
shellPrompt := { _ => name.value + " > " }
))
46 changes: 45 additions & 1 deletion project/BuildSettings.scala
@@ -1,5 +1,5 @@
/*
* Copyright (c) 2016-2018 Snowplow Analytics Ltd. All rights reserved.
* Copyright (c) 2016-2019 Snowplow Analytics Ltd. All rights reserved.
*
* This program is licensed to you under the Apache License Version 2.0,
* and you may not use this file except in compliance with the Apache License Version 2.0.
@@ -19,6 +19,17 @@ import Keys._
import bintray.BintrayPlugin._
import bintray.BintrayKeys._

// Mima plugin
import com.typesafe.tools.mima.plugin.MimaKeys._
import com.typesafe.tools.mima.plugin.MimaPlugin

// Scoverage plugin
import scoverage.ScoverageKeys._

import com.typesafe.sbt.sbtghpages.GhpagesPlugin.autoImport._
import com.typesafe.sbt.site.SitePlugin.autoImport.makeSite
import com.typesafe.sbt.SbtGit.GitKeys.{gitBranch, gitRemoteRepo}

object BuildSettings {

// Basic settings for our app
@@ -59,4 +70,37 @@ object BuildSettings {
</developer>
</developers>)
)

// If the new version introduces breaking changes,
// clear out mimaBinaryIssueFilters and mimaPreviousVersions.
// Otherwise, add the previous version to the set without
// removing other versions.
val mimaPreviousVersions = Set()

val mimaSettings = MimaPlugin.mimaDefaultSettings ++ Seq(
mimaPreviousArtifacts := mimaPreviousVersions.map { organization.value %% name.value % _ },
mimaBinaryIssueFilters ++= Seq(),
test in Test := {
mimaReportBinaryIssues.value
(test in Test).value
}
)

val scoverageSettings = Seq(
coverageEnabled := true,
coverageMinimum := 50,
coverageFailOnMinimum := true,
coverageHighlighting := false,
(test in Test) := {
(coverageReport dependsOn (test in Test)).value
}
)

val ghPagesSettings = Seq(
ghpagesPushSite := (ghpagesPushSite dependsOn makeSite).value,
ghpagesNoJekyll := false,
gitRemoteRepo := "[email protected]:snowplow/snowplow-scala-analytics-sdk.git",
gitBranch := Some("gh-pages"),
excludeFilter in ghpagesCleanSite := "index.html"
)
}
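
To make the mimaPreviousVersions note in BuildSettings.scala above more concrete, here is a minimal, hypothetical sketch of how that value might evolve across releases; the version numbers are illustrative only and are not part of this PR:

// Once 1.0.0 is released and a binary-compatible 1.0.1 is being prepared,
// the previous version is added to the set so MiMa checks against it:
val mimaPreviousVersions = Set("1.0.0")

// A later binary-compatible release keeps accumulating entries:
// val mimaPreviousVersions = Set("1.0.0", "1.0.1")

// A breaking 2.0.0 clears the set (and mimaBinaryIssueFilters) again:
// val mimaPreviousVersions = Set.empty[String]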
51 changes: 51 additions & 0 deletions project/docs_init.sh
@@ -0,0 +1,51 @@
#!/bin/bash

# This script is used for initializing the gh-pages branch

PROJECT_NAME=snowplow-scala-analytics-sdk
[email protected]:snowplow/snowplow-scala-analytics-sdk.git

# Using a fresh, temporary clone is safest for this procedure
pushd /tmp
git clone $PROJECT_REPO
cd $PROJECT_NAME

# Create branch with no history or content
git checkout --orphan gh-pages
git rm -rf .

# Create an index.html file in order to redirect the main page
# to /latest/api
cat > index.html <<- EOM
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<title>Project Documentation</title>
<script language="JavaScript">
<!--
function doRedirect()
{
window.location.replace("latest/api");
}

doRedirect();
//-->
</script>
</head>
<body>
<a href="latest/api">Go to the project documentation
</a>
</body>
</html>
EOM

git add index.html

# Establish the branch existence
git commit --allow-empty -m "Initialize gh-pages branch"
git push origin gh-pages

# Return to the original working copy clone; we're finished with the /tmp one
popd
rm -rf /tmp/$PROJECT_NAME
5 changes: 5 additions & 0 deletions project/plugins.sbt
@@ -1 +1,6 @@
addSbtPlugin("org.foundweekends" % "sbt-bintray" % "0.5.3")
addSbtPlugin("com.typesafe" % "sbt-mima-plugin" % "0.5.0")
addSbtPlugin("org.scoverage" % "sbt-scoverage" % "1.6.0")
addSbtPlugin("com.typesafe.sbt" % "sbt-site" % "1.4.0")
addSbtPlugin("com.typesafe.sbt" % "sbt-git" % "1.0.0")
addSbtPlugin("com.typesafe.sbt" % "sbt-ghpages" % "0.6.3")
Binary file added project/travis-deploy-key.enc
Binary file not shown.
@@ -256,4 +256,18 @@ object Event {
*/
def parse(line: String): DecodeResult[Event] =
parser.parse(line)

/**
* Creates an empty event with all optional fields set to None.
* Only the required fields are given as arguments.
*/
def emptyEvent(id: UUID, collectorTstamp: Instant, vCollector: String, vEtl: String): Event =
Event(None, None, None, collectorTstamp, None, None, id, None, None, None, vCollector, vEtl, None, None, None,
None, None, None, None, None, None, None, None, None, None, None, None, None, None, None, None, None, None, None,
None, None, None, None, None, None, None, None, None, None, None, None, None, None, None, None, None, None,
Contexts(Nil), None, None, None, None, None, UnstructEvent(None), None, None, None, None, None, None, None, None,
None, None, None, None, None, None, None, None, None, None, None, None, None, None, None, None, None, None, None,
None, None, None, None, None, None, None, None, None, None, None, None, None, None, None, None, None, None, None,
None, None, None, None, None, None, None, None, None, None, None, None, None, None, None, None, None,
Contexts(Nil), None, None, None, None, None, None, None, None)
}
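
For context, a short usage sketch of the new emptyEvent helper, assuming the signature shown in the diff above; the collector/ETL version strings and the app_id field name are illustrative and not taken from this PR:

import java.util.UUID
import java.time.Instant

import com.snowplowanalytics.snowplow.analytics.scalasdk.Event

// Build a bare event carrying only the required fields; every optional
// field starts out as None (or an empty Contexts/UnstructEvent).
val stub: Event = Event.emptyEvent(
  UUID.randomUUID(),
  Instant.now(),
  "scala-stream-collector", // vCollector, illustrative value
  "stream-enrich"           // vEtl, illustrative value
)

// Optional fields can then be filled in via copy
// (app_id is assumed here from the canonical event model):
val withAppId = stub.copy(app_id = Some("my-app"))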
@@ -0,0 +1,156 @@
/*
* Copyright (c) 2016-2019 Snowplow Analytics Ltd. All rights reserved.
*
* This program is licensed to you under the Apache License Version 2.0,
* and you may not use this file except in compliance with the Apache License Version 2.0.
* You may obtain a copy of the Apache License Version 2.0 at http://www.apache.org/licenses/LICENSE-2.0.
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the Apache License Version 2.0 is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the Apache License Version 2.0 for the specific language governing permissions and limitations there under.
*/
package com.snowplowanalytics.snowplow.analytics.scalasdk
Contributor:
License header.


import cats.data.NonEmptyList
import cats.syntax.either._
import io.circe._
import io.circe.syntax._
import com.snowplowanalytics.snowplow.analytics.scalasdk.decode.Key

/**
* Represents an error that occurred while parsing a TSV event
*/
sealed trait ParsingError

object ParsingError {

/**
* Represents an error where the given line is not a TSV
*/
final case object NonTSVPayload extends ParsingError

/**
* Represents an error where the number of actual columns is not equal
* to the number of expected columns
* @param columnCount the mismatched column count in the event
*/
final case class ColumnNumberMismatch(columnCount: Int) extends ParsingError

/**
* Represents an error encountered while decoding the values in a row
* @param errors details of the errors encountered during decoding
*/
final case class RowDecodingError(errors: NonEmptyList[RowDecodingErrorInfo]) extends ParsingError

/**
* Gives information about the reason for an error encountered while decoding a value in a row
*/
sealed trait RowDecodingErrorInfo

object RowDecodingErrorInfo {
/**
* Represents cases where the value of a field is not valid,
* e.g. an invalid timestamp or an invalid UUID
* @param key key of the field
* @param value value of the field
* @param message error message
*/
final case class InvalidValue(key: Key, value: String, message: String) extends RowDecodingErrorInfo

/**
* Represents cases where an error is not expected while decoding a row.
* For example, while parsing the list of tuples into an HList in the
* RowDecoder, getting more or fewer values than expected is impossible
* due to the type check, so 'UnexpectedRowDecodingError' is returned for
* these cases. Such errors can effectively be ignored since they should never occur.
* @param error error message
*/
final case class UnexpectedRowDecodingError(error: String) extends RowDecodingErrorInfo

implicit val analyticsSdkRowDecodingErrorInfoCirceEncoder: Encoder[RowDecodingErrorInfo] =
Encoder.instance {
case InvalidValue(key, value, message) =>
Json.obj(
"type" := "InvalidValue",
"key" := key,
"value" := value,
"message" := message
)
case UnexpectedRowDecodingError(error: String) =>
Json.obj(
"type" := "UnexpectedRowDecodingError",
"error" := error
)
}

implicit val analyticsSdkRowDecodingErrorInfoCirceDecoder: Decoder[RowDecodingErrorInfo] =
Decoder.instance { cursor =>
for {
errorType <- cursor.downField("type").as[String]
result <- errorType match {
case "InvalidValue" =>
for {
key <- cursor.downField("key").as[Key]
value <- cursor.downField("value").as[String]
message <- cursor.downField("message").as[String]
} yield InvalidValue(key, value, message)

case "UnexpectedRowDecodingError" =>
cursor
.downField("error")
.as[String]
.map(UnexpectedRowDecodingError)
}
} yield result
}

implicit val analyticsSdkKeyCirceEncoder: Encoder[Key] =
Encoder.instance(_.toString.stripPrefix("'").asJson)

implicit val analyticsSdkKeyCirceDecoder: Decoder[Key] =
Decoder.instance(_.as[String].map(Symbol(_)))

}

implicit val analyticsSdkParsingErrorCirceEncoder: Encoder[ParsingError] =
Encoder.instance {
case NonTSVPayload =>
Json.obj("type" := "NonTSVPayload")
case ColumnNumberMismatch(columnCount) =>
Json.obj(
"type" := "ColumnNumberMismatch",
"columnCount" := columnCount
)
case RowDecodingError(errors) =>
Json.obj(
"type" := "RowDecodingError",
"errors" := errors.asJson
)
}

implicit val analyticsSdkParsingErrorCirceDecoder: Decoder[ParsingError] =
Decoder.instance { cursor =>
for {
error <- cursor.downField("type").as[String]
result <- error match {
case "NonTSVPayload" =>
NonTSVPayload.asRight
case "ColumnNumberMismatch" =>
cursor
.downField("columnCount")
.as[Int]
.map(ColumnNumberMismatch)
case "RowDecodingError" =>
cursor
.downField("errors")
.as[NonEmptyList[RowDecodingErrorInfo]]
.map(RowDecodingError)
case _ =>
DecodingFailure(
s"Error type $error cannot be recognized as Analytics SDK Parsing Error",
cursor.history).asLeft
}
} yield result
}
}
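
As a rough illustration (not part of the diff), the ParsingError codecs defined above could be exercised as follows; the field values are made up, and Key is assumed to be an alias for scala.Symbol, as the Symbol-based codec suggests:

import cats.data.NonEmptyList
import io.circe.parser.decode
import io.circe.syntax._

import com.snowplowanalytics.snowplow.analytics.scalasdk.ParsingError
import com.snowplowanalytics.snowplow.analytics.scalasdk.ParsingError.RowDecodingError
import com.snowplowanalytics.snowplow.analytics.scalasdk.ParsingError.RowDecodingErrorInfo.InvalidValue

// Encode a row-level failure to JSON with the implicit circe encoder
val error: ParsingError = RowDecodingError(
  NonEmptyList.one(InvalidValue(Symbol("event_id"), "not-a-uuid", "Cannot be parsed as UUID"))
)
val json: String = error.asJson.noSpaces
// {"type":"RowDecodingError","errors":[{"type":"InvalidValue","key":"event_id","value":"not-a-uuid","message":"Cannot be parsed as UUID"}]}

// Decode it back with the implicit decoder; an unrecognized "type" yields a DecodingFailure
val roundTripped: Either[io.circe.Error, ParsingError] = decode[ParsingError](json)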
Contributor:
Codecs are missing.

Contributor Author:
Will add them.
