Merge pull request #868 from mderka/dns-batch-merge
Dns batch merge
ajkannan committed Apr 6, 2016
2 parents c026e1a + b29df66 commit 2d2532f
Showing 172 changed files with 4,083 additions and 3,114 deletions.
29 changes: 15 additions & 14 deletions README.md
@@ -30,16 +30,16 @@ If you are using Maven, add this to your pom.xml file
<dependency>
<groupId>com.google.gcloud</groupId>
<artifactId>gcloud-java</artifactId>
<version>0.1.5</version>
<version>0.1.7</version>
</dependency>
```
If you are using Gradle, add this to your dependencies
```Groovy
compile 'com.google.gcloud:gcloud-java:0.1.5'
compile 'com.google.gcloud:gcloud-java:0.1.7'
```
If you are using SBT, add this to your dependencies
```Scala
libraryDependencies += "com.google.gcloud" % "gcloud-java" % "0.1.5"
libraryDependencies += "com.google.gcloud" % "gcloud-java" % "0.1.7"
```

Example Applications
@@ -84,8 +84,9 @@ Most `gcloud-java` libraries require a project ID. There are multiple ways to s
1. Project ID supplied when building the service options (see the sketch after this list)
2. Project ID specified by the environment variable `GCLOUD_PROJECT`
3. App Engine project ID
4. Google Cloud SDK project ID
5. Compute Engine project ID
4. Project ID specified in the JSON credentials file pointed to by the `GOOGLE_APPLICATION_CREDENTIALS` environment variable
5. Google Cloud SDK project ID
6. Compute Engine project ID
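For example, the first option above can be exercised by building the service options explicitly. A minimal sketch using the DNS service (the project ID string is a placeholder):
```java
import com.google.gcloud.dns.Dns;
import com.google.gcloud.dns.DnsOptions;

// Explicitly supplying a project ID takes precedence over the environment
// variable, App Engine, credentials-file, SDK, and Compute Engine defaults.
Dns dns = DnsOptions.builder()
    .projectId("my-project-id")
    .build()
    .service();
```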

Authentication
--------------
@@ -249,13 +250,13 @@ ZoneInfo zoneInfo = ZoneInfo.of(zoneName, domainName, description);
Zone zone = dns.create(zoneInfo);
```
The second snippet shows how to create records inside a zone. The complete code can be found on [CreateOrUpdateDnsRecords.java](./gcloud-java-examples/src/main/java/com/google/gcloud/examples/dns/snippets/CreateOrUpdateDnsRecords.java).
The second snippet shows how to create records inside a zone. The complete code can be found on [CreateOrUpdateRecordSets.java](./gcloud-java-examples/src/main/java/com/google/gcloud/examples/dns/snippets/CreateOrUpdateRecordSets.java).
```java
import com.google.gcloud.dns.ChangeRequest;
import com.google.gcloud.dns.ChangeRequestInfo;
import com.google.gcloud.dns.Dns;
import com.google.gcloud.dns.DnsOptions;
import com.google.gcloud.dns.DnsRecord;
import com.google.gcloud.dns.RecordSet;
import com.google.gcloud.dns.Zone;
import java.util.Iterator;
@@ -265,24 +266,24 @@ Dns dns = DnsOptions.defaultInstance().service();
String zoneName = "my-unique-zone";
Zone zone = dns.getZone(zoneName);
String ip = "12.13.14.15";
DnsRecord toCreate = DnsRecord.builder("www.someexampledomain.com.", DnsRecord.Type.A)
RecordSet toCreate = RecordSet.builder("www.someexampledomain.com.", RecordSet.Type.A)
.ttl(24, TimeUnit.HOURS)
.addRecord(ip)
.build();
ChangeRequest.Builder changeBuilder = ChangeRequest.builder().add(toCreate);
ChangeRequestInfo.Builder changeBuilder = ChangeRequestInfo.builder().add(toCreate);
// Verify that the record does not exist yet.
// If it does exist, we will overwrite it with our prepared record.
Iterator<DnsRecord> recordIterator = zone.listDnsRecords().iterateAll();
while (recordIterator.hasNext()) {
DnsRecord current = recordIterator.next();
Iterator<RecordSet> recordSetIterator = zone.listRecordSets().iterateAll();
while (recordSetIterator.hasNext()) {
RecordSet current = recordSetIterator.next();
if (toCreate.name().equals(current.name()) &&
toCreate.type().equals(current.type())) {
changeBuilder.delete(current);
}
}
ChangeRequest changeRequest = changeBuilder.build();
ChangeRequestInfo changeRequest = changeBuilder.build();
zone.applyChangeRequest(changeRequest);
```
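If needed, the status of the applied change can then be inspected. A minimal sketch, assuming `applyChangeRequest` returns the applied `ChangeRequest` and that the `status()` accessor declared on `ChangeRequestInfo` is available on it:
```java
ChangeRequest applied = zone.applyChangeRequest(changeRequest);
// A change request typically starts out PENDING and moves to DONE once the
// authoritative DNS servers have picked it up (status values assumed from
// ChangeRequestInfo.Status).
System.out.println("Change request status: " + applied.status());
```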
4 changes: 2 additions & 2 deletions RELEASING.md
@@ -7,8 +7,8 @@ Most of the release process is handled by the `after_success.sh` script, trigger
1. Run `utilities/update_pom_version.sh` from the repository's base directory.
This script takes an optional argument denoting the new version. By default, if the current version is X.Y.Z-SNAPSHOT, the script will update the version in all the pom.xml files to X.Y.Z. If desired, another version can be supplied via command line argument instead.

2. Create a PR to update the pom.xml version.
The PR should look something like [#225](https://github.com/GoogleCloudPlatform/gcloud-java/pull/225). After this PR is merged into GoogleCloudPlatform/gcloud-java, Travis CI will push a new website to GoogleCloudPlatform/gh-pages, push a new artifact to the Maven Central Repository, and update versions in the README files.
2. Create a PR to update the pom.xml version. If releasing a new client library, this PR should also update javadoc grouping in the base directory's [pom.xml](./pom.xml).
PRs that don't release new modules should look something like [#225](https://github.com/GoogleCloudPlatform/gcloud-java/pull/225). PRs that do release a new module should also add the appropriate packages to the javadoc groups "SPI" and "Test helpers", as shown in [#802](https://github.com/GoogleCloudPlatform/gcloud-java/pull/802) for `gcloud-java-dns`. After this PR is merged into GoogleCloudPlatform/gcloud-java, Travis CI will push a new website to GoogleCloudPlatform/gh-pages, push a new artifact to the Maven Central Repository, and update versions in the README files.

3. Before moving on, verify that the artifacts have successfully been pushed to the Maven Central Repository. Open Travis CI, click the ["Build History" tab](https://travis-ci.org/GoogleCloudPlatform/gcloud-java/builds), and open the second build's logs for Step 2's PR. Be sure that you are not opening the "Pull Request" build logs. When the build finishes, scroll to the end of the log and verify that the artifacts were successfully staged and deployed. You can also search for `gcloud-java` on the [Sonatype website](https://oss.sonatype.org/#nexus-search;quick~gcloud-java) and check the latest version number. If the deployment didn't succeed because of a flaky test, rerun the build.

2 changes: 1 addition & 1 deletion codacy-conf.json

Large diffs are not rendered by default.

6 changes: 3 additions & 3 deletions gcloud-java-bigquery/README.md
@@ -22,16 +22,16 @@ If you are using Maven, add this to your pom.xml file
<dependency>
<groupId>com.google.gcloud</groupId>
<artifactId>gcloud-java-bigquery</artifactId>
<version>0.1.5</version>
<version>0.1.7</version>
</dependency>
```
If you are using Gradle, add this to your dependencies
```Groovy
compile 'com.google.gcloud:gcloud-java-bigquery:0.1.5'
compile 'com.google.gcloud:gcloud-java-bigquery:0.1.7'
```
If you are using SBT, add this to your dependencies
```Scala
libraryDependencies += "com.google.gcloud" % "gcloud-java-bigquery" % "0.1.5"
libraryDependencies += "com.google.gcloud" % "gcloud-java-bigquery" % "0.1.7"
```

Example Application
2 changes: 1 addition & 1 deletion gcloud-java-bigquery/pom.xml
@@ -10,7 +10,7 @@
<parent>
<groupId>com.google.gcloud</groupId>
<artifactId>gcloud-java-pom</artifactId>
<version>0.1.6-SNAPSHOT</version>
<version>0.1.8-SNAPSHOT</version>
</parent>
<properties>
<site.installationModule>gcloud-java-bigquery</site.installationModule>
gcloud-java-bigquery/src/main/java/com/google/gcloud/bigquery/BigQuery.java
@@ -19,16 +19,15 @@
import static com.google.common.base.Preconditions.checkArgument;

import com.google.common.base.Function;
import com.google.common.base.Joiner;
import com.google.common.collect.ImmutableList;
import com.google.common.collect.Lists;
import com.google.common.collect.Sets;
import com.google.gcloud.FieldSelector;
import com.google.gcloud.FieldSelector.Helper;
import com.google.gcloud.Page;
import com.google.gcloud.Service;
import com.google.gcloud.bigquery.spi.BigQueryRpc;

import java.util.List;
import java.util.Set;

/**
* An interface for Google Cloud BigQuery.
@@ -43,7 +42,7 @@ public interface BigQuery extends Service<BigQueryOptions> {
* @see <a href="https://cloud.google.com/bigquery/docs/reference/v2/datasets#resource">Dataset
* Resource</a>
*/
enum DatasetField {
enum DatasetField implements FieldSelector {
ACCESS("access"),
CREATION_TIME("creationTime"),
DATASET_REFERENCE("datasetReference"),
@@ -56,24 +55,19 @@ enum DatasetField {
LOCATION("location"),
SELF_LINK("selfLink");

static final List<? extends FieldSelector> REQUIRED_FIELDS =
ImmutableList.of(DATASET_REFERENCE);

private final String selector;

DatasetField(String selector) {
this.selector = selector;
}

@Override
public String selector() {
return selector;
}

static String selector(DatasetField... fields) {
Set<String> fieldStrings = Sets.newHashSetWithExpectedSize(fields.length + 1);
fieldStrings.add(DATASET_REFERENCE.selector());
for (DatasetField field : fields) {
fieldStrings.add(field.selector());
}
return Joiner.on(',').join(fieldStrings);
}
}

/**
@@ -82,7 +76,7 @@ static String selector(DatasetField... fields) {
* @see <a href="https://cloud.google.com/bigquery/docs/reference/v2/tables#resource">Table
* Resource</a>
*/
enum TableField {
enum TableField implements FieldSelector {
CREATION_TIME("creationTime"),
DESCRIPTION("description"),
ETAG("etag"),
@@ -101,25 +95,19 @@ enum TableField {
TYPE("type"),
VIEW("view");

static final List<? extends FieldSelector> REQUIRED_FIELDS =
ImmutableList.of(TABLE_REFERENCE, TYPE);

private final String selector;

TableField(String selector) {
this.selector = selector;
}

@Override
public String selector() {
return selector;
}

static String selector(TableField... fields) {
Set<String> fieldStrings = Sets.newHashSetWithExpectedSize(fields.length + 2);
fieldStrings.add(TABLE_REFERENCE.selector());
fieldStrings.add(TYPE.selector());
for (TableField field : fields) {
fieldStrings.add(field.selector());
}
return Joiner.on(',').join(fieldStrings);
}
}

/**
@@ -128,7 +116,7 @@ static String selector(TableField... fields) {
* @see <a href="https://cloud.google.com/bigquery/docs/reference/v2/jobs#resource">Job Resource
* </a>
*/
enum JobField {
enum JobField implements FieldSelector {
CONFIGURATION("configuration"),
ETAG("etag"),
ID("id"),
@@ -138,25 +126,19 @@ enum JobField {
STATUS("status"),
USER_EMAIL("user_email");

static final List<? extends FieldSelector> REQUIRED_FIELDS =
ImmutableList.of(JOB_REFERENCE, CONFIGURATION);

private final String selector;

JobField(String selector) {
this.selector = selector;
}

@Override
public String selector() {
return selector;
}

static String selector(JobField... fields) {
Set<String> fieldStrings = Sets.newHashSetWithExpectedSize(fields.length + 2);
fieldStrings.add(JOB_REFERENCE.selector());
fieldStrings.add(CONFIGURATION.selector());
for (JobField field : fields) {
fieldStrings.add(field.selector());
}
return Joiner.on(',').join(fieldStrings);
}
}

/**
@@ -210,7 +192,8 @@ private DatasetOption(BigQueryRpc.Option option, Object value) {
* returned, even if not specified.
*/
public static DatasetOption fields(DatasetField... fields) {
return new DatasetOption(BigQueryRpc.Option.FIELDS, DatasetField.selector(fields));
return new DatasetOption(BigQueryRpc.Option.FIELDS,
Helper.selector(DatasetField.REQUIRED_FIELDS, fields));
}
}

@@ -279,7 +262,8 @@ private TableOption(BigQueryRpc.Option option, Object value) {
* of {@link Table#definition()}) are always returned, even if not specified.
*/
public static TableOption fields(TableField... fields) {
return new TableOption(BigQueryRpc.Option.FIELDS, TableField.selector(fields));
return new TableOption(BigQueryRpc.Option.FIELDS,
Helper.selector(TableField.REQUIRED_FIELDS, fields));
}
}

@@ -376,10 +360,8 @@ public static JobListOption pageToken(String pageToken) {
* listing jobs.
*/
public static JobListOption fields(JobField... fields) {
String selector = JobField.selector(fields);
StringBuilder builder = new StringBuilder();
builder.append("etag,jobs(").append(selector).append(",state,errorResult),nextPageToken");
return new JobListOption(BigQueryRpc.Option.FIELDS, builder.toString());
return new JobListOption(BigQueryRpc.Option.FIELDS,
Helper.listSelector("jobs", JobField.REQUIRED_FIELDS, fields, "state", "errorResult"));
}
}

@@ -402,7 +384,8 @@ private JobOption(BigQueryRpc.Option option, Object value) {
* returned, even if not specified.
*/
public static JobOption fields(JobField... fields) {
return new JobOption(BigQueryRpc.Option.FIELDS, JobField.selector(fields));
return new JobOption(BigQueryRpc.Option.FIELDS,
Helper.selector(JobField.REQUIRED_FIELDS, fields));
}
}

@@ -488,9 +471,10 @@ public static QueryResultsOption maxWaitTime(long maxWaitTime) {
Dataset getDataset(DatasetId datasetId, DatasetOption... options);

/**
* Lists the project's datasets. This method returns partial information on each dataset
* ({@link Dataset#datasetId()}, {@link Dataset#friendlyName()} and {@link Dataset#id()}). To get
* complete information use either {@link #getDataset(String, DatasetOption...)} or
* Lists the project's datasets. This method returns partial information on each dataset:
* ({@link Dataset#datasetId()}, {@link Dataset#friendlyName()} and
* {@link Dataset#generatedId()}). To get complete information use either
* {@link #getDataset(String, DatasetOption...)} or
* {@link #getDataset(DatasetId, DatasetOption...)}.
*
* @throws BigQueryException upon failure
@@ -558,9 +542,9 @@ public static QueryResultsOption maxWaitTime(long maxWaitTime) {
Table getTable(TableId tableId, TableOption... options);

/**
* Lists the tables in the dataset. This method returns partial information on each table
* ({@link Table#tableId()}, {@link Table#friendlyName()}, {@link Table#id()} and type, which
* is part of {@link Table#definition()}). To get complete information use either
* Lists the tables in the dataset. This method returns partial information on each table:
* ({@link Table#tableId()}, {@link Table#friendlyName()}, {@link Table#generatedId()} and type,
* which is part of {@link Table#definition()}). To get complete information use either
* {@link #getTable(TableId, TableOption...)} or
* {@link #getTable(String, String, TableOption...)}.
*
@@ -569,9 +553,9 @@ public static QueryResultsOption maxWaitTime(long maxWaitTime) {
Page<Table> listTables(String datasetId, TableListOption... options);

/**
* Lists the tables in the dataset. This method returns partial information on each table
* ({@link Table#tableId()}, {@link Table#friendlyName()}, {@link Table#id()} and type, which
* is part of {@link Table#definition()}). To get complete information use either
* Lists the tables in the dataset. This method returns partial information on each table:
* ({@link Table#tableId()}, {@link Table#friendlyName()}, {@link Table#generatedId()} and type,
* which is part of {@link Table#definition()}). To get complete information use either
* {@link #getTable(TableId, TableOption...)} or
* {@link #getTable(String, String, TableOption...)}.
*
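The BigQuery hunks above replace each enum's hand-rolled static `selector(...)` method with the shared `FieldSelector.Helper`. As a rough sketch of what such a helper presumably does, generalizing the removed per-enum code (this is an assumption based on the deleted lines, not the library source; the class name is hypothetical):
```java
import com.google.common.base.Joiner;
import com.google.common.collect.Sets;
import com.google.gcloud.FieldSelector;

import java.util.List;
import java.util.Set;

final class FieldSelectorHelperSketch {

  // Hypothetical generalization of the removed per-enum selector logic: the
  // required fields are always included, any caller-requested fields are added,
  // and the result is joined into a comma-separated "fields" parameter value.
  static String selector(List<? extends FieldSelector> required, FieldSelector... others) {
    Set<String> fieldStrings = Sets.newHashSetWithExpectedSize(required.size() + others.length);
    for (FieldSelector field : required) {
      fieldStrings.add(field.selector());
    }
    for (FieldSelector field : others) {
      fieldStrings.add(field.selector());
    }
    return Joiner.on(',').join(fieldStrings);
  }
}
```
`Helper.listSelector`, used in `JobListOption.fields`, presumably wraps the same joined selector (plus the extra `state` and `errorResult` fields) in the `etag,jobs(...),nextPageToken` envelope that the removed `StringBuilder` code produced.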
@@ -103,8 +103,8 @@ public Builder friendlyName(String friendlyName) {
}

@Override
Builder id(String id) {
infoBuilder.id(id);
Builder generatedId(String generatedId) {
infoBuilder.generatedId(generatedId);
return this;
}

(Remaining file diffs not loaded in this view.)