Add system tests to storagetransfer samples.
Removes the PowerMock dependency which was having some bad interactions
with Mockito.
tswast committed Aug 22, 2016
1 parent 99b21cd commit d6e9032
Showing 16 changed files with 296 additions and 338 deletions.
2 changes: 2 additions & 0 deletions .gitignore
@@ -25,7 +25,9 @@ release.properties
dependency-reduced-pom.xml
buildNumber.properties

# Secrets
service-account.json
secrets.env

# intellij
.idea/
3 changes: 3 additions & 0 deletions .travis.yml
@@ -31,6 +31,9 @@ before_install:
- openssl aes-256-cbc -K $encrypted_37a4f399de75_key -iv $encrypted_37a4f399de75_iv
-in service-account.json.enc -out service-account.json -d && export GOOGLE_APPLICATION_CREDENTIALS=$TRAVIS_BUILD_DIR/service-account.json
GCLOUD_PROJECT=cloud-samples-tests || true
- openssl aes-256-cbc -K $encrypted_eb858daba67b_key -iv $encrypted_eb858daba67b_iv -in secrets.env.enc -out secrets.env -d
&& set +x && source secrets.env && set -x
|| true
# Skip the install step, since Maven will download the dependencies we need
# when the test build runs.
# http://stackoverflow.com/q/31945809/101923
Binary file added secrets.env.enc
Binary file not shown.
53 changes: 43 additions & 10 deletions storage/storage-transfer/README.md
@@ -13,18 +13,34 @@ These samples are used on the following documentation pages:

## Prerequisites

1. Set up a project on Google Developers Console.
1. Go to the [Developers Console](https://cloud.google.com/console) and create or select your project.
You will need the project ID later.
1. Set up a project on Google Cloud Console.
1. Go to the [Google Cloud Console](https://console.cloud.google.com) and
create or select your project. You will need the project ID later.
1. Enable the [Google Storage Transfer API in the Google Cloud
Console](https://console.cloud.google.com/apis/api/storagetransfer/overview).
1. Within Developers Console, select APIs & auth > Credentials.
1. Select Add credentials > Service account > JSON key.
1. Set the environment variable GOOGLE_APPLICATION_CREDENTIALS to point to your JSON key.
1. Add the Storage Transfer service account as an editor of your project
storage-transfer-5031963314028297433@partnercontent.gserviceaccount.com
1. Set the environment variable `GOOGLE_APPLICATION_CREDENTIALS` to point to
your JSON key.
1. Add the Storage Transfer service account as an editor of your project.
1. To get the email address used for the service account, execute the
[googleServiceAccounts.get REST
method](https://cloud.google.com/storage/transfer/reference/rest/v1/googleServiceAccounts/get#try-it).
That page should include a "Try It" section; if it does not, execute the method
in the [APIs
Explorer](https://developers.google.com/apis-explorer/#p/storagetransfer/v1/storagetransfer.googleServiceAccounts.get).
(A Java sketch for looking up this address programmatically appears after the
prerequisites list below.)

It should output an email address like:

```
storage-transfer-<unique-id>@partnercontent.gserviceaccount.com
```
1. Add this as a member and select the Project -> Editor permission on the
[Google Cloud Console IAM and Admin
page](https://console.cloud.google.com/iam-admin/iam/project).
1. Set up gcloud for application default credentials.
1. `gcloud components update`
1. `gcloud auth login`
1. `gcloud config set project PROJECT_ID`
1. `gcloud init`
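
If you prefer to look up this service account programmatically, the hedged sketch
below reuses the `TransferClientCreator` helper from these samples. It assumes the
generated v1 client exposes `googleServiceAccounts().get(projectId)` and that the
returned `GoogleServiceAccount` model has a `getAccountEmail()` accessor; treat it
as an unverified sketch rather than official sample code.

```java
package com.google.cloud.storage.storagetransfer.samples;

import com.google.api.services.storagetransfer.Storagetransfer;
import com.google.api.services.storagetransfer.model.GoogleServiceAccount;

/** Hypothetical helper: prints the Storage Transfer service account for a project. */
public final class ServiceAccountLookup {
  public static void main(String[] args) throws Exception {
    // Build the client with application default credentials, as the samples do.
    Storagetransfer client = TransferClientCreator.createStorageTransferClient();

    // Replace the placeholder with your own project ID.
    GoogleServiceAccount account =
        client.googleServiceAccounts().get("your-google-cloud-project-id").execute();
    System.out.println("Grant the Project -> Editor role to: " + account.getAccountEmail());
  }
}
```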

## Transfer from Amazon S3 to Google Cloud Storage

@@ -35,9 +51,21 @@ Creating a one-time transfer from Amazon S3 to Google Cloud Storage.
1. Go to AWS Management Console and create a bucket.
1. Under Security Credentials, create an IAM User with access to the bucket.
1. Create an Access Key for the user. Note the Access Key ID and Secret Access Key.
1. Set the `AWS_ACCESS_KEY_ID` and `AWS_SECRET_ACCESS_KEY` environment variables.
1. In AwsRequester.java, fill in the user-provided constants.
1. Run with `mvn compile` and
`mvn exec:java -Dexec.mainClass="com.google.cloud.storage.storagetransfer.samples.AwsRequester"`
1. Compile the package with
```
mvn compile
```
1. Run the transfer job with
```
mvn exec:java \
-Dexec.mainClass="com.google.cloud.storage.storagetransfer.samples.AwsRequester" \
-DprojectId=your-google-cloud-project-id \
-DjobDescription="Sample transfer job from S3 to GCS." \
-DawsSourceBucket=your-s3-bucket-name \
-DgcsSinkBucket=your-gcs-bucket-name
```
1. Note the job ID in the returned Transfer Job. (The `-D` flags above are plain
Java system properties; a sketch of how the sample reads them follows this list.)
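
The samples read these system properties through a small helper in
`TransferJobUtils.java`, which is not included in this excerpt. A hypothetical
sketch of what such a helper presumably looks like (names and message are
assumptions):

```java
// Hypothetical sketch only; the real helper lives in TransferJobUtils.java.
final class PropertyHelper {
  static String getPropertyOrFail(String propertyName) {
    String value = System.getProperty(propertyName);
    if (value == null || value.isEmpty()) {
      throw new IllegalArgumentException(
          "Missing required system property. Run with -D" + propertyName + "=VALUE");
    }
    return value;
  }
}
```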

## Transfer data from a standard Cloud Storage bucket to a Cloud Storage Nearline bucket
@@ -59,3 +87,8 @@ bucket for files untouched for 30 days.
1. In RequestChecker.java, fill in the user-provided constants. Use the Job Name you recorded earlier.
1. Run with `mvn compile` and
`mvn exec:java -Dexec.mainClass="com.google.cloud.storage.storagetransfer.samples.RequestChecker"`
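
The new `NearlineRequester` entry point shown further down in this diff can also be
called directly from Java. A minimal sketch, with placeholder project, bucket, and
schedule values (dates and times are interpreted in UTC):

```java
package com.google.cloud.storage.storagetransfer.samples;

import com.google.api.services.storagetransfer.model.TransferJob;

/** Hypothetical example of invoking the Nearline sample programmatically. */
public final class NearlineExample {
  public static void main(String[] args) throws Exception {
    // All argument values below are placeholders.
    TransferJob job =
        NearlineRequester.createNearlineTransferJob(
            "your-google-cloud-project-id",
            "Sample transfer job from GCS to GCS Nearline.",
            "your-gcs-source-bucket",
            "your-nearline-sink-bucket",
            "2000-01-01", // startDate, YYYY-MM-DD
            "00:00:00"); // startTime, HH:MM:SS
    System.out.println("Created: " + job.toPrettyString());
  }
}
```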

## References

- [Google Storage Transfer API Client
Library](https://developers.google.com/api-client-library/java/apis/storagetransfer/v1)
19 changes: 12 additions & 7 deletions storage/storage-transfer/pom.xml
@@ -35,7 +35,6 @@

<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<powermock.version>1.6.2</powermock.version>
</properties>

<dependencies>
@@ -47,15 +46,21 @@

<!-- Test Dependencies -->
<dependency>
<groupId>org.powermock</groupId>
<artifactId>powermock-module-junit4</artifactId>
<version>${powermock.version}</version>
<groupId>com.google.truth</groupId>
<artifactId>truth</artifactId>
<version>0.29</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.powermock</groupId>
<artifactId>powermock-api-mockito</artifactId>
<version>${powermock.version}</version>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>4.12</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.mockito</groupId>
<artifactId>mockito-all</artifactId>
<version>1.10.19</version>
<scope>test</scope>
</dependency>
</dependencies>
storage/storage-transfer/src/main/java/com/google/cloud/storage/storagetransfer/samples/AwsRequester.java
@@ -27,31 +27,19 @@
import com.google.api.services.storagetransfer.model.TransferSpec;

import java.io.IOException;
import java.util.logging.Logger;
import java.io.PrintStream;

/**
* Creates a one-off transfer job from Amazon S3 to Google Cloud Storage.
*/
public final class AwsRequester {

private static final String JOB_DESC = "YOUR DESCRIPTION";
private static final String PROJECT_ID = "YOUR_PROJECT_ID";
private static final String AWS_SOURCE_NAME = "YOUR SOURCE BUCKET";
private static final String AWS_ACCESS_KEY_ID = "YOUR_ACCESS_KEY_ID";
private static final String AWS_SECRET_ACCESS_KEY = "YOUR_SECRET_ACCESS_KEY";
private static final String GCS_SINK_NAME = "YOUR_SINK_BUCKET";

/**
* Specify times below using US Pacific Time Zone.
*/
private static final String START_DATE = "YYYY-MM-DD";
private static final String START_TIME = "HH:MM:SS";

private static final Logger LOG = Logger.getLogger(AwsRequester.class.getName());

/**
* Creates and executes a request for a TransferJob from Amazon S3 to Cloud Storage.
*
* <p>The {@code startDate} and {@code startTime} parameters should be set according to the UTC
* Time Zone. See:
* https://developers.google.com/resources/api-libraries/documentation/storagetransfer/v1/java/latest/com/google/api/services/storagetransfer/v1/model/Schedule.html#getStartTimeOfDay()
*
* @return the response TransferJob if the request is successful
* @throws InstantiationException
* if instantiation fails when building the TransferJob
@@ -60,43 +48,73 @@ public final class AwsRequester {
* @throws IOException
* if the client failed to complete the request
*/
public static TransferJob createAwsTransferJob() throws InstantiationException,
IllegalAccessException, IOException {
Date date = TransferJobUtils.createDate(START_DATE);
TimeOfDay time = TransferJobUtils.createTimeOfDay(START_TIME);
TransferJob transferJob = TransferJob.class
.newInstance()
.setDescription(JOB_DESC)
.setProjectId(PROJECT_ID)
.setTransferSpec(
TransferSpec.class
.newInstance()
.setAwsS3DataSource(
AwsS3Data.class
.newInstance()
.setBucketName(AWS_SOURCE_NAME)
.setAwsAccessKey(
AwsAccessKey.class.newInstance().setAccessKeyId(AWS_ACCESS_KEY_ID)
.setSecretAccessKey(AWS_SECRET_ACCESS_KEY)))
.setGcsDataSink(GcsData.class.newInstance().setBucketName(GCS_SINK_NAME)))
.setSchedule(
Schedule.class.newInstance().setScheduleStartDate(date).setScheduleEndDate(date)
.setStartTimeOfDay(time)).setStatus("ENABLED");
public static TransferJob createAwsTransferJob(
String projectId,
String jobDescription,
String awsSourceBucket,
String gcsSinkBucket,
String startDate,
String startTime,
String awsAccessKeyId,
String awsSecretAccessKey)
throws InstantiationException, IllegalAccessException, IOException {
Date date = TransferJobUtils.createDate(startDate);
TimeOfDay time = TransferJobUtils.createTimeOfDay(startTime);
TransferJob transferJob =
new TransferJob()
.setDescription(jobDescription)
.setProjectId(projectId)
.setTransferSpec(
new TransferSpec()
.setAwsS3DataSource(
new AwsS3Data()
.setBucketName(awsSourceBucket)
.setAwsAccessKey(
new AwsAccessKey()
.setAccessKeyId(awsAccessKeyId)
.setSecretAccessKey(awsSecretAccessKey)))
.setGcsDataSink(new GcsData().setBucketName(gcsSinkBucket)))
.setSchedule(
new Schedule()
.setScheduleStartDate(date)
.setScheduleEndDate(date)
.setStartTimeOfDay(time))
.setStatus("ENABLED");

Storagetransfer client = TransferClientCreator.createStorageTransferClient();
return client.transferJobs().create(transferJob).execute();
}

public static void run(PrintStream out)
throws InstantiationException, IllegalAccessException, IOException {
String projectId = TransferJobUtils.getPropertyOrFail("projectId");
String jobDescription = TransferJobUtils.getPropertyOrFail("jobDescription");
String awsSourceBucket = TransferJobUtils.getPropertyOrFail("awsSourceBucket");
String gcsSinkBucket = TransferJobUtils.getPropertyOrFail("gcsSinkBucket");
String startDate = TransferJobUtils.getPropertyOrFail("startDate");
String startTime = TransferJobUtils.getPropertyOrFail("startTime");
String awsAccessKeyId = TransferJobUtils.getEnvOrFail("AWS_ACCESS_KEY_ID");
String awsSecretAccessKey = TransferJobUtils.getEnvOrFail("AWS_SECRET_ACCESS_KEY");

TransferJob responseT =
createAwsTransferJob(
projectId,
jobDescription,
awsSourceBucket,
gcsSinkBucket,
startDate,
startTime,
awsAccessKeyId,
awsSecretAccessKey);
out.println("Return transferJob: " + responseT.toPrettyString());
}

/**
* Output the contents of a successfully created TransferJob.
*
* @param args
* arguments from the command line
*/
public static void main(String[] args) {
try {
TransferJob responseT = createAwsTransferJob();
LOG.info("Return transferJob: " + responseT.toPrettyString());
run(System.out);
} catch (Exception e) {
e.printStackTrace();
}
storage/storage-transfer/src/main/java/com/google/cloud/storage/storagetransfer/samples/NearlineRequester.java
@@ -27,30 +27,21 @@
import com.google.api.services.storagetransfer.model.TransferSpec;

import java.io.IOException;
import java.util.logging.Logger;
import java.io.PrintStream;

/**
* Creates a daily transfer from a standard Cloud Storage bucket to a Cloud Storage Nearline
* bucket for files untouched for 30 days.
* bucket for files untouched for 30 days.
*/
public final class NearlineRequester {

private static final String JOB_DESC = "YOUR DESCRIPTION";
private static final String PROJECT_ID = "YOUR_PROJECT_ID";
private static final String GCS_SOURCE_NAME = "YOUR_SOURCE_BUCKET";
private static final String NEARLINE_SINK_NAME = "YOUR_SINK_BUCKET";

/**
* Specify times below using US Pacific Time Zone.
*/
private static final String START_DATE = "YYYY-MM-DD";
private static final String START_TIME = "HH:MM:SS";

private static final Logger LOG = Logger.getLogger(AwsRequester.class.getName());

/**
* Creates and executes a request for a TransferJob to Cloud Storage Nearline.
*
* <p>The {@code startDate} and {@code startTime} parameters should be set according to the UTC
* Time Zone. See:
* https://developers.google.com/resources/api-libraries/documentation/storagetransfer/v1/java/latest/com/google/api/services/storagetransfer/v1/model/Schedule.html#getStartTimeOfDay()
*
* @return the response TransferJob if the request is successful
* @throws InstantiationException
* if instantiation fails when building the TransferJob
@@ -59,31 +50,58 @@ public final class NearlineRequester {
* @throws IOException
* if the client failed to complete the request
*/
public static TransferJob createNearlineTransferJob() throws InstantiationException,
IllegalAccessException, IOException {
Date date = TransferJobUtils.createDate(START_DATE);
TimeOfDay time = TransferJobUtils.createTimeOfDay(START_TIME);
TransferJob transferJob = TransferJob.class
.newInstance()
.setDescription(JOB_DESC)
.setProjectId(PROJECT_ID)
.setTransferSpec(
TransferSpec.class
.newInstance()
.setGcsDataSource(GcsData.class.newInstance().setBucketName(GCS_SOURCE_NAME))
.setGcsDataSink(GcsData.class.newInstance().setBucketName(NEARLINE_SINK_NAME))
.setObjectConditions(
ObjectConditions.class.newInstance().setMinTimeElapsedSinceLastModification("2592000s"))
.setTransferOptions(
TransferOptions.class.newInstance().setDeleteObjectsFromSourceAfterTransfer(true)))
.setSchedule(Schedule.class.newInstance().setScheduleStartDate(date)
.setStartTimeOfDay(time))
.setStatus("ENABLED");
public static TransferJob createNearlineTransferJob(
String projectId,
String jobDescription,
String gcsSourceBucket,
String gcsNearlineSinkBucket,
String startDate,
String startTime)
throws InstantiationException, IllegalAccessException, IOException {
Date date = TransferJobUtils.createDate(startDate);
TimeOfDay time = TransferJobUtils.createTimeOfDay(startTime);
TransferJob transferJob =
new TransferJob()
.setDescription(jobDescription)
.setProjectId(projectId)
.setTransferSpec(
new TransferSpec()
.setGcsDataSource(new GcsData().setBucketName(gcsSourceBucket))
.setGcsDataSink(new GcsData().setBucketName(gcsNearlineSinkBucket))
.setObjectConditions(
new ObjectConditions()
.setMinTimeElapsedSinceLastModification("2592000s" /* 30 days */))
.setTransferOptions(
new TransferOptions()
.setDeleteObjectsFromSourceAfterTransfer(true)))
.setSchedule(
new Schedule().setScheduleStartDate(date).setStartTimeOfDay(time))
.setStatus("ENABLED");

Storagetransfer client = TransferClientCreator.createStorageTransferClient();
return client.transferJobs().create(transferJob).execute();
}

public static void run(PrintStream out)
throws InstantiationException, IllegalAccessException, IOException {
String projectId = TransferJobUtils.getPropertyOrFail("projectId");
String jobDescription = TransferJobUtils.getPropertyOrFail("jobDescription");
String gcsSourceBucket = TransferJobUtils.getPropertyOrFail("gcsSourceBucket");
String gcsNearlineSinkBucket = TransferJobUtils.getPropertyOrFail("gcsNearlineSinkBucket");
String startDate = TransferJobUtils.getPropertyOrFail("startDate");
String startTime = TransferJobUtils.getPropertyOrFail("startTime");

TransferJob responseT =
createNearlineTransferJob(
projectId,
jobDescription,
gcsSourceBucket,
gcsNearlineSinkBucket,
startDate,
startTime);
out.println("Return transferJob: " + responseT.toPrettyString());
}

/**
* Output the contents of a successfully created TransferJob.
*
@@ -92,8 +110,7 @@ public static TransferJob createNearlineTransferJob() throws InstantiationException,
*/
public static void main(String[] args) {
try {
TransferJob responseT = createNearlineTransferJob();
LOG.info("Return transferJob: " + responseT.toPrettyString());
run(System.out);
} catch (Exception e) {
e.printStackTrace();
}
(Diffs for the remaining changed files, including the new system tests, are not shown.)
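
Given the new JUnit, Mockito, and Truth test dependencies in `pom.xml` and the
`run(PrintStream)` entry points added to the samples, a system test might look
roughly like the hedged sketch below (class name, captured output, and the asserted
substring are assumptions; running it also requires the credentials and `-D`
properties described in the README):

```java
package com.google.cloud.storage.storagetransfer.samples;

import static com.google.common.truth.Truth.assertThat;

import java.io.ByteArrayOutputStream;
import java.io.PrintStream;
import org.junit.Test;

/** Hypothetical system test sketch; the actual tests are in files not shown here. */
public class AwsRequesterSystemTest {
  @Test
  public void testRun_printsCreatedTransferJob() throws Exception {
    ByteArrayOutputStream bytes = new ByteArrayOutputStream();
    // Capture the sample's output instead of letting it go to System.out.
    AwsRequester.run(new PrintStream(bytes));
    // The sample prints the created TransferJob; check for a recognizable field.
    assertThat(bytes.toString()).contains("transferJobs/");
  }
}
```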
