Merge pull request #503 from GoogleCloudPlatform/bigquery
Merge gcloud-java-bigquery into master
ajkannan committed Dec 22, 2015
2 parents 9bcebd3 + a1299c7 commit 3f2799d
Showing 76 changed files with 15,603 additions and 0 deletions.
50 changes: 50 additions & 0 deletions README.md
@@ -15,6 +15,7 @@ This client supports the following Google Cloud Platform services:
- [Google Cloud Datastore] (#google-cloud-datastore)
- [Google Cloud Storage] (#google-cloud-storage)
- [Google Cloud Resource Manager] (#google-cloud-resource-manager)
- [Google Cloud BigQuery] (#google-cloud-bigquery)

> Note: This client is a work-in-progress, and may occasionally
> make backwards-incompatible changes.
@@ -214,6 +215,51 @@ while (projectIterator.hasNext()) {
}
```

Google Cloud BigQuery
----------------------

- [API Documentation][bigquery-api]
- [Official Documentation][cloud-bigquery-docs]

#### Preview

Here is a code snippet showing a simple usage example from within Compute Engine or App Engine. Note
that you must [supply credentials](#authentication) and a project ID if running this snippet elsewhere.

```java
import com.google.gcloud.bigquery.BaseTableInfo;
import com.google.gcloud.bigquery.BigQuery;
import com.google.gcloud.bigquery.BigQueryOptions;
import com.google.gcloud.bigquery.Field;
import com.google.gcloud.bigquery.JobStatus;
import com.google.gcloud.bigquery.LoadJobInfo;
import com.google.gcloud.bigquery.Schema;
import com.google.gcloud.bigquery.TableId;
import com.google.gcloud.bigquery.TableInfo;

BigQuery bigquery = BigQueryOptions.defaultInstance().service();
TableId tableId = TableId.of("dataset", "table");
BaseTableInfo info = bigquery.getTable(tableId);
if (info == null) {
  System.out.println("Creating table " + tableId);
  Field integerField = Field.of("fieldName", Field.Type.integer());
  bigquery.create(TableInfo.of(tableId, Schema.of(integerField)));
} else {
  System.out.println("Loading data into table " + tableId);
  LoadJobInfo loadJob = LoadJobInfo.of(tableId, "gs://bucket/path");
  loadJob = bigquery.create(loadJob);
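  // Poll until the load job completes. A fixed one-second sleep keeps this example short;
  // real code should also handle InterruptedException and back off between polls.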
  while (loadJob.status().state() != JobStatus.State.DONE) {
    Thread.sleep(1000L);
    loadJob = bigquery.getJob(loadJob.jobId());
  }
  if (loadJob.status().error() != null) {
    System.out.println("Job completed with errors");
  } else {
    System.out.println("Job succeeded");
  }
}
```
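
The snippet above assumes that the dataset `dataset` already exists in the project. If it does not, a
minimal sketch for creating it first (reusing the `DatasetInfo` builder shown in TESTING.md; the
dataset name is only an illustration) might look like this:

```java
import com.google.gcloud.bigquery.BigQuery;
import com.google.gcloud.bigquery.BigQueryOptions;
import com.google.gcloud.bigquery.DatasetInfo;

BigQuery bigquery = BigQueryOptions.defaultInstance().service();
// Create the dataset that the table in the example above lives in.
bigquery.create(DatasetInfo.builder("dataset").build());
```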

Troubleshooting
---------------

@@ -276,3 +322,7 @@ Apache 2.0 - See [LICENSE] for more information.
[resourcemanager-api]:http://googlecloudplatform.github.io/gcloud-java/apidocs/index.html?com/google/gcloud/resourcemanager/package-summary.html
[cloud-resourcemanager-docs]:https://cloud.google.com/resource-manager/
[cloud-bigquery]: https://cloud.google.com/bigquery/
[cloud-bigquery-docs]: https://cloud.google.com/bigquery/docs/overview
[bigquery-api]: http://googlecloudplatform.github.io/gcloud-java/apidocs/index.html?com/google/gcloud/bigquery/package-summary.html
30 changes: 30 additions & 0 deletions TESTING.md
@@ -5,6 +5,7 @@ This library provides tools to help write tests for code that uses the following
- [Datastore] (#testing-code-that-uses-datastore)
- [Storage] (#testing-code-that-uses-storage)
- [Resource Manager] (#testing-code-that-uses-resource-manager)
- [BigQuery] (#testing-code-that-uses-bigquery)

### Testing code that uses Datastore

@@ -103,5 +104,34 @@ You can test against a temporary local Resource Manager by following these steps

This method will block until the server thread has been terminated.

### Testing code that uses BigQuery

Currently, there isn't an emulator for Google BigQuery, so an alternative is to create a test
project. `RemoteBigQueryHelper` contains convenience methods to make setting up and cleaning up the
test project easier. To use this class, follow the steps below:
1. Create a test Google Cloud project.

2. Download a [JSON service account credentials file][create-service-account] from the Google
   Developers Console.

3. Create a `RemoteBigQueryHelper` object using your project ID and JSON key file.
   Here is an example that uses the `RemoteBigQueryHelper` to create a dataset.
```java
RemoteBigQueryHelper bigqueryHelper =
    RemoteBigQueryHelper.create(PROJECT_ID, new FileInputStream("/path/to/my/JSON/key.json"));
BigQuery bigquery = bigqueryHelper.options().service();
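// Use a randomly generated dataset name so test runs don't collide with existing datasets.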
String dataset = RemoteBigQueryHelper.generateDatasetName();
bigquery.create(DatasetInfo.builder(dataset).build());
```

4. Run your tests.

5. Clean up the test project by using `forceDelete` to clear any datasets used.
Here is an example that clears the dataset created in Step 3.
```java
RemoteBigQueryHelper.forceDelete(bigquery, dataset);
```
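
Putting steps 2 through 5 together, a minimal JUnit 4 skeleton might look like the sketch below. The
class name, test name, project ID, and key path are placeholders, and the import for
`RemoteBigQueryHelper` assumes it lives in a `com.google.gcloud.bigquery.testing` package alongside
the other remote test helpers.

```java
import com.google.gcloud.bigquery.BigQuery;
import com.google.gcloud.bigquery.DatasetInfo;
import com.google.gcloud.bigquery.testing.RemoteBigQueryHelper;

import org.junit.AfterClass;
import org.junit.BeforeClass;
import org.junit.Test;

import java.io.FileInputStream;

public class MyBigQueryIntegrationTest {

  private static final String PROJECT_ID = "my-test-project"; // placeholder project ID

  private static BigQuery bigquery;
  private static String dataset;

  @BeforeClass
  public static void setUp() throws Exception {
    // Steps 2-3: build a service object for the test project and create a scratch dataset.
    RemoteBigQueryHelper bigqueryHelper =
        RemoteBigQueryHelper.create(PROJECT_ID, new FileInputStream("/path/to/my/JSON/key.json"));
    bigquery = bigqueryHelper.options().service();
    dataset = RemoteBigQueryHelper.generateDatasetName();
    bigquery.create(DatasetInfo.builder(dataset).build());
  }

  @Test
  public void testSomethingAgainstBigQuery() {
    // Step 4: exercise the code under test against the scratch dataset here.
  }

  @AfterClass
  public static void tearDown() {
    // Step 5: delete everything the tests created in the scratch dataset.
    RemoteBigQueryHelper.forceDelete(bigquery, dataset);
  }
}
```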

[cloud-platform-storage-authentication]:https://cloud.google.com/storage/docs/authentication?hl=en#service_accounts
[create-service-account]:https://developers.google.com/identity/protocols/OAuth2ServiceAccount#creatinganaccount