
Dynamo DB Mapper #35

Closed
millems opened this issue Jul 3, 2017 · 68 comments
Labels
1.x Parity dynamodb-enhanced feature-request A feature should be added or improved.

Comments

@millems
Contributor

millems commented Jul 3, 2017

Review the inherited state of V1 Dynamo DB mapper support and determine which changes are necessary for V2.

(Feel free to comment on this issue with desired changes).

@cjkent

cjkent commented Jul 13, 2017

Please consider supporting immutable classes in the mapper.

DynamoDbMapper requires all mapped classes to be mutable. Immutable domain objects are increasingly the norm in Java, and using DynamoDbMapper means creating mutable duplicates of all domain objects or giving up on the benefits of immutability.

@MikeFHay

Please remove PaginatedList, and replace uses of it with Stream.

java.util.List implementations are expected to have fast size() methods, but as far as I can tell there is no way to implement that for a DynamoDB scan or query. Currently PaginatedList will either load the entire scan result into memory on a size() call, or simply fail if configured as ITERATION_ONLY. This is a potentially surprising behaviour which can be made much more intuitive and explicit by returning a Stream and requiring users to collect(toList()) if they want a List.
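The suggestion above can be sketched in plain Java with no SDK types: wrap page fetches in an iterator and expose them through an unknown-size spliterator, so the Stream never needs a `size()` up front and pages load lazily as the stream is consumed. The `PageFetcher`/`Page` names are hypothetical stand-ins for a DynamoDB scan or query page call, not SDK API.

```java
import java.util.Collections;
import java.util.Iterator;
import java.util.List;
import java.util.NoSuchElementException;
import java.util.Spliterator;
import java.util.Spliterators;
import java.util.stream.Stream;
import java.util.stream.StreamSupport;

// Sketch: expose paginated results as a lazy Stream instead of a PaginatedList.
public final class PaginatedStream {

    // Hypothetical stand-in for one DynamoDB scan/query page call.
    public interface PageFetcher<T> {
        Page<T> fetch(String startToken);
    }

    public static final class Page<T> {
        final List<T> items;
        final String nextToken; // null when there are no more pages
        public Page(List<T> items, String nextToken) {
            this.items = items;
            this.nextToken = nextToken;
        }
    }

    // An unknown-size spliterator means the Stream never asks for size(),
    // so pages are fetched only as the stream is consumed.
    public static <T> Stream<T> stream(PageFetcher<T> fetcher) {
        Iterator<T> it = new Iterator<T>() {
            private Iterator<T> current = Collections.emptyIterator();
            private String token = "";
            private boolean done = false;

            @Override public boolean hasNext() {
                while (!current.hasNext() && !done) {
                    Page<T> page = fetcher.fetch(token);
                    current = page.items.iterator();
                    token = page.nextToken;
                    done = (token == null);
                }
                return current.hasNext();
            }

            @Override public T next() {
                if (!hasNext()) throw new NoSuchElementException();
                return current.next();
            }
        };
        return StreamSupport.stream(
            Spliterators.spliteratorUnknownSize(it, Spliterator.ORDERED), false);
    }
}
```

A caller who really wants a `List` then opts in explicitly with `collect(Collectors.toList())`, paying the full-materialization cost knowingly.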

@millems
Contributor Author

millems commented Jul 26, 2017

We're looking to make the automatic depagination provided by PaginatedList work for all services' paginated APIs with this story: #26.

+1 to avoiding accidentally blowing up your memory and holding up a thread by using something as simple as size(). Using Stream is a solid idea. We'll need to think about whether other pagination strategies for different APIs can have an efficient size() method. If so, we shouldn't expose it in as innocuous a form for services like Dynamo that make it expensive.

@leadVisionary

One of the things I'd call out here is that I think there's a problem with the com.amazonaws.services.dynamodbv2.datamodeling.StandardBeanProperties method of reflecting over an object. Say I have:

@DynamoDBTable(tableName = "Polyps")
public final class Polyp {
    private Set<String> endpoints = new ConcurrentSkipListSet<>();

    public Set<String> getEndpoints() {
        return endpoints;
    }

    public void setEndpoints(final Collection<String> endpoints) {
        if (endpoints != null && !endpoints.isEmpty()) {
            this.endpoints = new ConcurrentSkipListSet<>(endpoints);
        }
    }
}

In theory, even if DynamoDB only respects Sets, I should be able to handle the case where a setter allows a wider Collection type.

It looks like this is because of lines 140-146, the setterOf method, which does a "clobber get with set and look for the exact type" algorithm. Consequently, this method returns null, which later becomes a problem at line 111 of DynamoDBMapperFieldModel.

I think it should be checking if the return type has parent types, and if so trying to run up the tree.
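The widened lookup the comment proposes could look like the following sketch: instead of demanding a setter whose parameter type exactly equals the getter's return type, also accept one whose parameter type is assignable from it (e.g. `setEndpoints(Collection)` for a `getEndpoints()` returning `Set`). This is an illustrative helper, not the actual StandardBeanProperties code.

```java
import java.lang.reflect.Method;

// Sketch of a setter lookup that tolerates a wider parameter type.
public final class SetterLookup {

    public static Method setterOf(Class<?> beanClass, Method getter) {
        String name = "set" + getter.getName().replaceFirst("^(get|is)", "");
        Class<?> returned = getter.getReturnType();
        Method fallback = null;
        for (Method m : beanClass.getMethods()) {
            if (!m.getName().equals(name) || m.getParameterCount() != 1) continue;
            Class<?> param = m.getParameterTypes()[0];
            if (param.equals(returned)) return m;               // exact match wins
            if (param.isAssignableFrom(returned)) fallback = m; // widened match
        }
        return fallback;
    }

    // Sample bean mirroring the Polyp example: getter returns Set,
    // setter accepts the wider Collection type.
    public static class Bean {
        private java.util.Set<String> endpoints = new java.util.TreeSet<>();
        public java.util.Set<String> getEndpoints() { return endpoints; }
        public void setEndpoints(java.util.Collection<String> c) {
            endpoints = new java.util.TreeSet<>(c);
        }
    }
}
```

With the exact-type algorithm the lookup would return null for `Bean`; with the assignability fallback it finds `setEndpoints(Collection)`.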

@humanzz

humanzz commented Sep 12, 2017

With the current DynamoDBMapper implementation, when performing scans, there's no easy way to make the scan operation respect the DynamoDB table's read throughputs.
Workarounds combining https://aws.amazon.com/blogs/developer/rate-limited-scans-in-amazon-dynamodb/ and https://aws.amazon.com/blogs/developer/understanding-auto-paginated-scan-with-dynamodbmapper/ exist, but they are not straightforward.

Please provide a scan operation that is able to respect a table's read throughput (whether by explicitly specifying a percentage of throughput to use or an absolute value of read units to use).
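The core of such a rate-respecting scan can be sketched without any SDK types: after each page, use the consumed read capacity (which DynamoDB reports when the request sets ReturnConsumedCapacity) to compute how long to pause before the next page. The target-rate parameter and class name here are illustrative assumptions, not SDK API.

```java
// Sketch: throttle a paged scan to a target read rate. After each page we
// know the consumed read capacity units; pausing for consumed/targetRate
// seconds keeps the long-run consumption at or below the target.
public final class ScanThrottle {
    private final double targetReadUnitsPerSecond;

    public ScanThrottle(double targetReadUnitsPerSecond) {
        this.targetReadUnitsPerSecond = targetReadUnitsPerSecond;
    }

    // Milliseconds to sleep after a page that consumed the given read units.
    public long pauseMillisFor(double consumedReadUnits) {
        return Math.round(1000.0 * consumedReadUnits / targetReadUnitsPerSecond);
    }

    // Called between pages of a scan loop.
    public void afterPage(double consumedReadUnits) throws InterruptedException {
        long millis = pauseMillisFor(consumedReadUnits);
        if (millis > 0) Thread.sleep(millis);
    }
}
```

A mapper-level scan that accepted either an absolute rate or a percentage of the table's provisioned throughput could wrap its page loop with something like this.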

@ryonday

ryonday commented Sep 19, 2017

Compatibility Between DynamoDBMapper, DynamoDB and AmazonDynamoDB

So this is pretty irritating. Within the Java DynamoDB SDK, there are essentially three different ways of working with Dynamo; all of them have their specific uses for specific situations, yet it's really difficult to convert between the different sets of classes that each one uses:

  • DynamoDBMapper uses annotated domain classes.
  • AmazonDynamoDB uses various forms of Map<String, AttributeValue>.
  • DynamoDB uses Item and so forth.

Each of these requires irritating bespoke code to convert. Sure, DynamoDBMapper has two methods (marshallIntoObject and marshallIntoObjects), but there is no corresponding unmarshallIntoMap or anything like that. It should be simple to convert between domain classes and Map<String, AttributeValue>. It should be simple to convert between domain classes and Item. It should be simple to convert between any two of these.

Primarily, since DynamoDBMapper just doesn't handle Update Expressions at all, there is no elegant method of interop between DynamoDBMapper and the lower-level clients when you need to use them.

Unexpected and Confusing Behavior of DynamoDBMapper Type Conversion

This one is long.

At the very least, the documentation could use clarification here.

But the best case scenario is that the various methods of type conversion work together harmoniously, so that the representation of data in DynamoDB when using DynamoDBMapper can best match the user's intent and use case. Right now things are close, which for me is more irritating than if they were nowhere near at all.

I have an entire project full of nothing but code examples and JUnit tests to document and remind myself of the oddities of and tips and tricks for this system. I am more than happy to share them at any time.

Anyway, let's start from the top.

At Least Three Methods of Type Conversion (and one kind of)?

I count three methods of explicit type conversion in the DynamoDBMapper system, and one outlier.

  1. ConversionSchema
  2. DynamoDBTypeConverterFactory
  3. The DynamoDBTyped and DynamoDBTypeConverted family of annotations
  4. AttributeTransformer (Yes, the use case for this is somewhat tangential but there are some improvements to be made here too.)

Throughout this section, I'll use the java.time.Instant class as an example for two reasons:

  1. In the 1.11.x SDK, it has no native support.

  2. Crucially, it can have a number of valid representations depending on the context.

    • A String/S, for UTC timestamps.
    • A DynamoDB M for separating out different aspects of a point on the timeline.
    • A Long/Integer/N for Epoch milli
    • A Long/Integer/N for Epoch second (hello, DynamoDB TTL!)

    Each of these representations has a valid use case. Each of these could even conceivably be used within the same project or even DynamoDBMapper domain class.

ConversionSchema

From what I've gathered looking through the code, the ConversionSchema is intended to define the base "primitive", direct type mappings between Java types and DynamoDB types by doing a class-to-AttributeValue mapping. For instance, byte, integer and family map to AttributeValue.withN(thingie.toString()) and so forth.

So the neat thing is that you can add new primitive types to your mapper using a DynamoDBMapperConfig. Let's do it and default to a UTC timestamp!

public class InstantArgumentPal implements ArgumentMarshaller, ArgumentUnmarshaller {
    @Override
    public AttributeValue marshall(Object obj) { return new AttributeValue(obj.toString()); }

    @Override
    public void typeCheck(AttributeValue value, Method setter) { /* Why? When? */ }

    @Override
    public Object unmarshall(AttributeValue value) throws ParseException { return Instant.parse(value.getS()); }
}

I cannot for the life of me figure out when typeCheck is useful. Using it, you cannot tell the mapper "this Unmarshaller is not suitable for this AttributeValue for this Method" except by throwing an exception, which just kills your entire operation. Can this return a boolean?

Regardless, let's use our Pal!

// Omit getters, setters, equals, hashcode, etc etc
@DynamoDBTable(tableName = "instant_conversion_example")
public class InstantConversionExample {
    @DynamoDBHashKey(attributeName = "hash_key")
    private Instant hashKey;
    @DynamoDBAttribute(attributeName = "whatever")
    private String whatever;
 }

Later, let's set up a DynamoDBMapperConfig.Builder.

InstantArgumentPal pal = new InstantArgumentPal();
DynamoDBMapperConfig config = DynamoDBMapperConfig.builder()
        .withConversionSchema(ConversionSchemas.v2Builder("v2WithInstant")
                .addFirstType(Instant.class, pal, pal)
                .build())
        .build();

Let's give it a whirl!

@Test
public void go_go_gadget_instant() throws Exception {
    InstantConversionExample example = new InstantConversionExample()
        .setHashKey(Instant.now())
        .setWhatever("whatevs");
    mapper.save(example);
    InstantConversionExample example1 = mapper.load(InstantConversionExample.class, example.getHashKey());
    assertThat(example1).isEqualTo(example);
    //
    // Caused by: com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBMappingException: InstantConversionExample[hash_key]; only scalar (B, N, or S) type allowed for key
    //
    // (┛◉Д◉)┛彡┻━┻
}

sigh

@DynamoDBTable(tableName = "instant_conversion_example")
public class InstantConversionExample {
    @DynamoDBHashKey(attributeName = "hash_key")
    private String hashKey;
    @DynamoDBAttribute(attributeName = "whatever")
    private Instant whatever;

    public String getHashKey() {
        return hashKey;
    }
}

@Test
public void go_go_gadget_instant() throws Exception {
    InstantConversionExample example = new InstantConversionExample()
        .setHashKey("hash")
        .setWhatever(Instant.now());
    mapper.save(example);
    InstantConversionExample example1 = mapper.load(InstantConversionExample.class, "hash");
    assertThat(example1).isEqualTo(example);
    // IT WORKS NOW!
}

OK cool, so far so good. Basically neat but:

  • Why aren't ArgumentMarshaller and ArgumentUnmarshaller generic?
  • What is typeCheck good for?
  • The addFirstType is basically "add only", from what I've been able to tell.
  • Technically, I have defined Instant as a scalar in my first example, yet it does not work for hash keys.

The last point tells me that my own additions to the ConversionSchema are not first class citizens. They should be.

DynamoDBTypeConverterFactory

Let's play around with the DynamoDBTypeConverterFactory!

public class InstantStringConverter extends DynamoDBTypeConverter.AbstractConverter<String, Instant> {
    @Override
    public String convert(Instant instant) {
        return instant.toString();
    }
    @Override
    public Instant unconvert(String string) {
        return Instant.parse(string);
    }
}

OK so this seems a little cleaner. It's generic! Let's skip the set-up and tests this time and go directly to the spoilers:

  • Still cannot use Instant as a hash key
  • Basically works exactly the same as the ConversionSchema thingie but it's generic.

But what's the big problem here?

DynamoDBTypeConverterFactory Is Only Meaningful for String

What happens if I make, say, a DynamoDBTypeConverter.AbstractConverter<Long, Instant>? Nothing. No conversion happens.

This is especially important if I register, say, two converters:

.withTypeConverterFactory(DynamoDBTypeConverterFactory.standard()
        .override()
        .with(Long.class, Instant.class, new InstantLongConverter())
        .with(String.class, Instant.class, new InstantStringConverter())
        .build());

And annotate the Instant a certain way:

  @DynamoDBTyped(DynamoDBMapperFieldModel.DynamoDBAttributeType.N)
  private Instant instant;

Since I've registered a type converter for Instant for a Java type that directly corresponds to a DynamoDB N type, it stands to reason that the type converter factory would look for a converter for the domain class type that converts to a Java numeric type, right?

No. What happens is this:

  1. The TypeConverterFactory explicitly searches for a String converter for the domain class type
  2. The TypeConverterFactory attempts to coerce the output of the String converter to a number type.

That is to say, for @DynamoDBTyped(DynamoDBMapperFieldModel.DynamoDBAttributeType.N) to work, there must be a String converter registered for the type, and the output of that converter must be a Number formatted as a String.

This is extremely counter-intuitive. It severely limits the utility of DynamoDBTypeConverterFactory.
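The two-step behavior described above can be shown in miniature: run the registered String converter, then coerce its output into the N slot by checking it parses as a number. The names here are illustrative, not the SDK's.

```java
import java.math.BigDecimal;

// Sketch of the coercion path the comment describes: for @DynamoDBTyped(N),
// the value must survive both steps — String conversion, then numeric
// coercion. An ISO-8601 timestamp converter fails step 2.
public final class TypedNCoercion {

    public interface StringConverter<T> {
        String convert(T value);
    }

    public static <T> String toN(StringConverter<T> converter, T value) {
        String s = converter.convert(value);       // step 1: String converter
        new BigDecimal(s);                         // step 2: must parse as a number,
                                                   // else NumberFormatException
        return s;
    }
}
```

An epoch-millis converter passes; a converter emitting `Instant.toString()` blows up, which is exactly why a registered `<Long, Instant>` converter would be the natural fit here.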

But it gets worse. It turns out that ConversionSchema and DynamoDBTypeConverterFactory do not play well at all:

If a ConversionSchema Marshaller/Unmarshaller pair is registered for a type, the DynamoDBTypeConverterFactory converters for that type are completely ignored, as are any @DynamoDBTyped or @DynamoDBTypeConverted annotations on domain class members for that type.

Also, you cannot register a converter for a non-built-in scalar type (map, etc).

My final comment about this is that the scalar types for this system appear to be hardcoded, along with their marshallers and unmarshallers, into an Enum. This makes it impossible to extend the DynamoDB type system in a nuanced, elegant way.

The DynamoDBTyped and DynamoDBTypeConverted Family

These work OK; they are just really verbose, and honestly I wish I could just make more nuanced use of DynamoDBTypeConverterFactory.

AttributeTransformer

My only comment here is that you should be able to register these on a per-class basis, similar to the ConversionSchema and DynamoDBTypeConverterFactory marshallers and unmarshallers.

Summing Up

  • Documentation around the available type conversion options is limited and confusing. There is little or no guidance around which one to use in which situation.
  • The interoperation between the type conversion options is confusing and counter-intuitive.
  • The behavior of DynamoDBTypeConverterFactory is confusing, counter-intuitive and limiting.
  • Because of the above, DynamoDBTyped is of seriously limited utility for non-built-in scalar types.
  • DynamoDBTypeConverted with explicit converters works, but the verbosity is irritating.
  • There should be a way to define your own primitives/scalar types and map them to DynamoDB types.

AttributeValue Is Irritating

new AttributeValue().withN(someNumberType.toString()) // WHYYYYYYY

Please, please, please add static methods that do null checks.

  • AttributeValue.S(String s)
  • AttributeValue.N(Number n)
  • AttributeValue.N(String s)
  • AttributeValue.mk(Object) // Introspect argument in a documented way and "do the right thing"
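The requested null-checking factories could look like the following sketch. The `AttributeValue` here is a tiny stub standing in for the real `com.amazonaws.services.dynamodbv2.model.AttributeValue`, and the method names are illustrative, so the example is self-contained.

```java
import java.util.Objects;

// Sketch of the proposed static factories with eager null checks.
public final class Attr {

    // Minimal stub of the SDK's AttributeValue, for illustration only.
    public static final class AttributeValue {
        String s;
        String n;
        public String getS() { return s; }
        public String getN() { return n; }
    }

    public static AttributeValue s(String value) {
        Objects.requireNonNull(value, "S value must not be null");
        AttributeValue av = new AttributeValue();
        av.s = value;
        return av;
    }

    public static AttributeValue n(Number value) {
        Objects.requireNonNull(value, "N value must not be null");
        AttributeValue av = new AttributeValue();
        av.n = value.toString();
        return av;
    }

    // The "introspect the argument and do the right thing" variant.
    public static AttributeValue of(Object value) {
        Objects.requireNonNull(value, "value must not be null");
        if (value instanceof Number) return n((Number) value);
        if (value instanceof String) return s((String) value);
        throw new IllegalArgumentException("unsupported type: " + value.getClass());
    }
}
```

Failing fast with a named null check beats the silent `NullPointerException` you get today from `someNumberType.toString()` deep inside a request builder.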

@millems
Contributor Author

millems commented Sep 28, 2017

Issue reported external to github: @DynamoDBGeneratedUuid does not work for nested list objects.

@DynamoDBTable(tableName="XYZ")
class A {
    @DynamoDBAttribute
    List<B> listOfB;
}

@DynamoDBDocument
class B {
    @DynamoDBGeneratedUuid(DynamoDBAutoGenerateStrategy.CREATE)
    UUID id;
} 

@millems
Contributor Author

millems commented Sep 28, 2017

Issue reported external to github: It would be nice to be able to annotate either the get method OR the type for type-specific annotations like @DynamoDBTypeConverted. Currently, the method has to be annotated.

@millems
Contributor Author

millems commented Sep 28, 2017

Issue reported external to github: We should support automatic conversion of non-string key values in maps.

@millems
Contributor Author

millems commented Nov 7, 2017

Issue reported external to github: DynamoDBMapper support for multiple conditions on same attribute for Save/Delete

I should be able to do this without having to throw away DynamoDBMapper and use the API directly...

expectedAttributes being a Map (DynamoDBSaveExpression.java and DynamoDBDeleteExpression) is a problem.

It's a pretty common use case: "Upsert a record into DynamoDB as long as versionID < {myNewVersionId}" - to ensure not to override older records.

This fails if the record doesn't yet exist, so the intuitive solution is to change the conditional expression to a list of two - versionId.exists(false) OR versionId.LE(myNewVersionID);

This is simply not possible right now. I realize this is possibly because the conditions maps to a Legacy API (https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/LegacyConditionalParameters.Expected.html), but unfortunately the new and more powerful Condition Expressions (https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/Expressions.ConditionExpressions.html) are not supported through DynamoDBMapper.

So I suppose ultimately that's the request. Can DynamoDBMapper support multiple conditions on the same attribute, either via:

  • Generating ConditionExpressions from Expected attributes (that are in a List, not a map)
  • Accepting ConditionExpressions through DynamoDBMapper interface
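The first option — generating a condition expression from a list of conditions rather than a map — can be sketched as plain string assembly. The builder name and placeholder scheme below are hypothetical, but the output matches the DynamoDB expression syntax (`#name` placeholders for attribute names, `:value` placeholders for values).

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

// Sketch: build an OR-joined condition-expression string from a list of
// simple conditions, the shape the upsert-with-version use case needs.
public final class ConditionBuilder {
    private final List<String> clauses = new ArrayList<>();
    private final Map<String, String> names = new LinkedHashMap<>();
    private final Map<String, Object> values = new LinkedHashMap<>();

    public ConditionBuilder notExists(String attribute) {
        names.put("#" + attribute, attribute);
        clauses.add("attribute_not_exists(#" + attribute + ")");
        return this;
    }

    public ConditionBuilder lessOrEqual(String attribute, Object value) {
        names.put("#" + attribute, attribute);
        values.put(":" + attribute, value);
        clauses.add("#" + attribute + " <= :" + attribute);
        return this;
    }

    // A list, not a map, so the same attribute can appear in several clauses.
    public String expression() {
        return String.join(" OR ", clauses);
    }

    public Map<String, String> expressionNames() { return names; }
    public Map<String, Object> expressionValues() { return values; }
}
```

For the upsert case, `new ConditionBuilder().notExists("versionId").lessOrEqual("versionId", 7)` yields `attribute_not_exists(#versionId) OR #versionId <= :versionId`, which an expectedAttributes Map can never express because the attribute key collides.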

References of others with same problem:

@millems
Contributor Author

millems commented Nov 9, 2017

Issue reported external to github: DynamoDBMapper ignores inherited public getters from non-public classes

Suppose we have two classes, A, and B extends A. Class A declares a property p with a public getter getP() (annotated with @DynamoDBAttribute). When dynamoDBMapper.save() is called with an instance of class B: if class A is public, property p is persisted, but if class A is package-private, property p is ignored.

Sample code that reproduces the issue :

File A.java
----------------------
package com.amazon.ab;

[public] class A {
    private int p = 5;

    @DynamoDBAttribute
    public int getP() {
        return p;
    }
}
----------------------

File B.java
----------------------
package com.amazon.ab; // Same as A

public class B extends A {
}
----------------------

Test code:
File C.java
----------------------
package com.amazon.xyz; // Different than A & B

...

B b = new B();
dynamoDBMapper.save(b);

...
----------------------

Analysis :
The Java compiler and the JVM behave differently here. When a class inherits from another, it inherits all its public methods, and they remain public and accessible to everyone accessing the inheriting class. The only thing the Java compiler enforces is that the inheriting class can access the inherited class (they are in the same package in this case). Other classes in other packages can call public methods regardless of whether they were defined directly or inherited. This does not seem to be the case for the JVM: it appears not to allow calls to code that lives inside a package-private class from classes outside the package, even if the call comes through an inheriting public class in the same package. So, to work around this, the Java compiler generates a synthetic bridge method in the inheriting class that just calls the inherited method (using the #invokespecial instruction), so that classes in other packages never access anything in a class that is not public.

In simpler terms: when a public class inherits from a package-private class, the Java compiler generates synthetic bridge methods for the inherited public methods, while it doesn't if the inherited class is public.
Based on that, the root cause of this issue seems to be in StandardBeanProperties.java: the canMap() method, which is called for every method in a bean class, filters out bridge and synthetic methods without further checks.

@bhaskarbagchi

I see that this issue is still open. Do we have any support for this as of now?

@shorea
Contributor

shorea commented Aug 16, 2018

@bhaskarbagchi not yet. We plan on tackling the high level libraries shortly after we GA. We will update this issue as we have more information.

@varunnvs92
Contributor

Issue reported external to github:
It is not possible to use Java stream() on PaginatedList and honor the PaginationLoadingStrategy.LAZY_LOADING strategy. When you call stream() on a PaginatedList, the JDK creates the stream via a spliterator, which calls size() on the Collection. Since PaginatedList.size() loads all results into memory, the lazy-loading behavior is no longer honored. So customers have to sacrifice either lazy loading or Java 8 functional programming capabilities.

@danieladams456

danieladams456 commented Aug 31, 2018

I know the mapper is a higher level interface and currently just returns the data object to keep things simple. Would it complicate the usage too much to instead return an object with the data object and metadata properties, or do we have to use the lower level API if we need those? Useful ones that I have run into would be:

  • consumed capacity when doing a query
  • modified attributes when doing a save

Edit: I was just thinking about it, and save() returns void. It could return the old object instead, possibly behind a flag in the DynamoDBMapperConfig object.

@dagnir
Contributor

dagnir commented Sep 14, 2018

Related: #703

@KyleBS

KyleBS commented Oct 18, 2018

Not sure if anyone has mentioned it, but I'd like to see improved filter support for DynamoDBMapper in V2. My understanding is that there are two mechanisms for filtering right now:

https://github.com/aws/aws-sdk-java/blob/master/aws-java-sdk-dynamodb/src/main/java/com/amazonaws/services/dynamodbv2/datamodeling/DynamoDBScanExpression.java#L237-L240

and

https://github.com/aws/aws-sdk-java/blob/master/aws-java-sdk-dynamodb/src/main/java/com/amazonaws/services/dynamodbv2/datamodeling/DynamoDBScanExpression.java#L487-L490

The former is easier to program against but has limitations in its expressiveness. The latter solves this at the expense of having to generate the expression strings yourself. ExpressionSpecBuilder exists for the low-level client and can nearly be used directly with DynamoDBScanExpression, but has one major difference: DynamoDBScanExpression's withExpressionAttributeValues method takes Map<String, AttributeValue> where the *ExpressionSpec classes return Map<String, Object>.

https://github.com/aws/aws-sdk-java/blob/master/aws-java-sdk-dynamodb/src/main/java/com/amazonaws/services/dynamodbv2/xspec/ScanExpressionSpec.java#L75-L77

The Objects referred to there are the raw values, not AttributeValues. While that seems pretty close, I have not run across any exposed way to marshall from Object to AttributeValue. In this particular case, ConversionSchema contains all the necessary logic to handle this marshalling, but it can't be leveraged due to the tight method access:

public static AttributeValue toAttributeValue(final Object o) {
    AbstractMarshallerSet marshallerSet = new AbstractMarshallerSet(V2MarshallerSet.marshallers(), V2MarshallerSet.setMarshallers());
    ArgumentMarshaller marshaller = marshallerSet.getMemberMarshaller(o.getClass());
    return marshaller.marshall(o);
}

Even if such a toAttributeValue method did exist, it would still feel rather clunky to have to map over each object in the ScanExpressionSpec's value map and marshall them myself. It'd be great to see withFilterExpression/withScanFilter replaced with something more along the lines of withScanExpressionSpec for easy interop between DynamoDBMapper and DynamoDB.

@bmaizels
Contributor

However, we appear to be missing a test for your use-case. I'll repro and hopefully fix it. Stay tuned, and thanks for the catch.

Repro confirmed. Opened #1748

@vaibhav-walia

Issue reported external to github: DynamoDBMapper support for multiple conditions on same attribute for Save/Delete

I should be able to do this without having to throw away DynamoDBMapper and use the API directly...

expectedAttributes being a Map (DynamoDBSaveExpression.java and DynamoDBDeleteExpression) is a problem.

It's a pretty common use case: "Upsert a record into DynamoDB as long as versionID < {myNewVersionId}" - to ensure not to override older records.

This fails if the record doesn't yet exist, so the intuitive solution is to change the conditional expression to a list of two - versionId.exists(false) OR versionId.LE(myNewVersionID);

Any updates on this? This is indeed a very common use-case and causes a lot of pain!

@bmaizels
Contributor

bmaizels commented Apr 6, 2020

This fails if the record doesn't yet exist, so the intuitive solution is to change the conditional expression to a list of two - versionId.exists(false) OR versionId.LE(myNewVersionID);

Any updates on this? This is indeed a very common use-case and causes a lot of pain!

When I try to parse this issue in the context of the new DynamoDB Enhanced Client we are currently previewing in v2, I'm not sure it applies. The design of the enhanced client, especially around how conditional statements are handled, is somewhat different from the v1 mapper, but I don't understand the nuances of the v1 mapper and the issue as written here clearly enough to state with absolute certainty that the issue is fixed.

What I'd ask you to do is take a look at the new enhanced client in v2 and see if this issue is addressed; if not, I will take another look. The new enhanced client does support the 'newer' conditional expression syntax provided by the DynamoDB API.

@mdamak

mdamak commented Apr 16, 2020

Could we have an idea of when we will have support for immutable objects?
It is really frustrating that you don't consider this a priority. Meanwhile we are stuck with mutable objects. Really makes me want to switch to another db :(

@millems
Contributor Author

millems commented Apr 16, 2020

@mdamak I'm sorry this is such a big pain point for you. We have a lot of other pain points we're trying to address at this time, and our customer feedback shows those (like lack of TransferManager or Metrics in V2) are a lot higher priority for us to address. Once those are addressed, we definitely want to come back and add immutable support to the enhanced client.

If you'd be willing to invest the time into implementing immutable support, we'd be willing to help you with the design and code reviews.

@bmaizels
Contributor

Hi all,

The wait is finally over and we'd like to announce the launch of the DynamoDB Enhanced Client for the AWS SDK for Java 2.0.

https://github.com/aws/aws-sdk-java-v2/tree/master/services-custom/dynamodb-enhanced

This client enhances every DynamoDb operation by providing direct mapping to and from your data classes without changing the nature of those interactions for a low-friction and intuitive development experience.

If you'd like to learn about some of the other advantages this client offers, feel free to take a look at our launch blog:
https://aws.amazon.com/blogs/developer/introducing-enhanced-dynamodb-client-in-the-aws-sdk-for-java-v2/

Now we've launched, the work doesn't stop here. We've heard lots of great ideas that we haven't had time to implement. We will continue to build and improve this library, as well as look at other AWS services that could benefit from enhanced clients of their own.

As always we welcome your support, we hope the good ideas and contributions will keep coming!

@rcolombo

rcolombo commented Apr 29, 2020

Is it possible to implement the @DynamoDBAutoGeneratedTimestamp annotation for the DynamoDB Enhanced Client? https://docs.aws.amazon.com/AWSJavaSDK/latest/javadoc/com/amazonaws/services/dynamodbv2/datamodeling/DynamoDBAutoGeneratedTimestamp.html

This is something I find very useful since all of my tables have "created" and "updated" timestamp attributes.

Is there a way to do this with the EnhancedClient short of manually setting these fields on every write/update?

@bmaizels
Contributor

@rcolombo :

Is it possible to implement the @DynamoDBAutoGeneratedTimestamp annotation for the DynamoDB Enhanced Client?

The best way to do this in my opinion is to write it as an extension (see https://github.com/aws/aws-sdk-java-v2/blob/master/services-custom/dynamodb-enhanced/src/main/java/software/amazon/awssdk/enhanced/dynamodb/DynamoDbEnhancedClientExtension.java). This gives you the hooks you need to be able to do exactly what this used to do in the v1 mapper, and you can even design your own annotations for it.

For an example of how to write an extension like this, see the versioned record extension which is bundled by default : https://github.com/aws/aws-sdk-java-v2/blob/master/services-custom/dynamodb-enhanced/src/main/java/software/amazon/awssdk/enhanced/dynamodb/extensions/VersionedRecordExtension.java

And to see how this extension uses custom annotations, take a look at : https://github.com/aws/aws-sdk-java-v2/blob/master/services-custom/dynamodb-enhanced/src/main/java/software/amazon/awssdk/enhanced/dynamodb/extensions/annotations/DynamoDbVersionAttribute.java

Having said all that, if this all sounds like too much work, this is a feature we will likely get around to doing ourselves, assuming nobody submits a PR for it first. Thanks for +1'ing it.
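The shape of such an extension can be simulated outside the SDK: a custom annotation plus a reflection pass that stamps annotated Instant fields just before a write. Everything below (the annotation name, the helper, the bean) is a hypothetical, self-contained sketch; a real implementation would run this logic inside a DynamoDbEnhancedClientExtension rather than as a standalone utility.

```java
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
import java.lang.reflect.Field;
import java.time.Instant;

// Hypothetical sketch of the auto-generated-timestamp idea.
public final class AutoTimestamp {

    @Retention(RetentionPolicy.RUNTIME)
    @Target(ElementType.FIELD)
    public @interface Stamped {}

    // Fill every @Stamped Instant field with the write-time timestamp.
    public static void stamp(Object bean, Instant now) throws IllegalAccessException {
        for (Field f : bean.getClass().getDeclaredFields()) {
            if (f.isAnnotationPresent(Stamped.class) && f.getType() == Instant.class) {
                f.setAccessible(true);
                f.set(bean, now);
            }
        }
    }

    // Example bean with "updated"-style attribute.
    public static final class Record {
        @Stamped Instant updated;
        Instant untouched;
    }
}
```

In the real extension, the equivalent of `stamp` would transform the item map in the write hook, so callers never set these attributes by hand.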

@rcolombo

@bmaizels I'll give it a shot. If I have enough success I'll try to open a PR

Thanks for the quick response!

@bmaizels
Contributor

Immutables fans can now jump onto #1801. We're going to start peeling off issues here, with the goal of closing this issue once everyone's feedback and desires are accounted for in other places.

@millems
Contributor Author

millems commented Jun 22, 2020

We will be splitting this issue into the remaining open feature requests for the DynamoDbEnhancedClient. See the mentions above and below to follow-up on what issues you care about the most and +1 them!

@millems
Contributor Author

millems commented Jun 22, 2020

We think we've peeled off the remaining dynamodb-enhanced features here: dynamodb-enhanced

We might have missed some, because there were many feature requests that were very v1-specific, but I believe we've gotten them all already.

Please feel free to +1 the issues you want to put your vote behind us implementing (or ask on the issue if you want to take a stab at it yourself!).

We'll be resolving this issue, now.

@millems millems closed this as completed Jun 22, 2020
@gakinson

gakinson commented Aug 7, 2020

@rcolombo :

Is it possible to implement the @DynamoDBAutoGeneratedTimestamp annotation for the DynamoDB Enhanced Client?

The best way to do this in my opinion is to write it as an extension (see https://github.com/aws/aws-sdk-java-v2/blob/master/services-custom/dynamodb-enhanced/src/main/java/software/amazon/awssdk/enhanced/dynamodb/DynamoDbEnhancedClientExtension.java). This gives you the hooks you need to be able to do exactly what this used to do in the v1 mapper, and you can even design your own annotations for it.

For an example of how to write an extension like this, see the versioned record extension which is bundled by default : https://github.com/aws/aws-sdk-java-v2/blob/master/services-custom/dynamodb-enhanced/src/main/java/software/amazon/awssdk/enhanced/dynamodb/extensions/VersionedRecordExtension.java

And to see how this extension uses custom annotations, take a look at : https://github.com/aws/aws-sdk-java-v2/blob/master/services-custom/dynamodb-enhanced/src/main/java/software/amazon/awssdk/enhanced/dynamodb/extensions/annotations/DynamoDbVersionAttribute.java

Having said all that, if this all sounds like too much work, this is a feature we will likely get around to doing ourselves, assuming nobody submits a PR for it first. Thanks for +1'ing it.

I created an implementation of this extension via an annotation, which works the same way @DynamoDBAutoGeneratedTimestamp does in v1.11. Feel free to modify and integrate it into the SDK, or anyone can use it at their own discretion.

https://github.com/gakinson/dyanamodb-enhanced-DynamoDbAutoGeneratedTimestamp-annotation

Thanks,
Geoffrey K
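To illustrate the hook pattern these extensions use without pulling in the SDK, here is a minimal SDK-free sketch. The `WriteHook` interface and `AutoTimestampHook` class are illustrative stand-ins, not the real SDK types: the actual `DynamoDbEnhancedClientExtension#beforeWrite` hook receives a context object and returns a `WriteModification`, but the core idea is the same: the extension sees the item about to be written and returns a transformed copy with the timestamp attribute set.

```java
import java.time.Instant;
import java.util.HashMap;
import java.util.Map;

// Illustrative stand-in for the beforeWrite hook: takes the item about
// to be written, returns a transformed copy. Not the real SDK interface.
interface WriteHook {
    Map<String, String> beforeWrite(Map<String, String> item);
}

// Stamps a timestamp attribute on every write, the way
// @DynamoDBAutoGeneratedTimestamp did in the v1 mapper.
class AutoTimestampHook implements WriteHook {
    private final String attributeName;

    AutoTimestampHook(String attributeName) {
        this.attributeName = attributeName;
    }

    @Override
    public Map<String, String> beforeWrite(Map<String, String> item) {
        Map<String, String> transformed = new HashMap<>(item);
        transformed.put(attributeName, Instant.now().toString());
        return transformed; // the original item is left untouched
    }
}

public class AutoTimestampDemo {
    public static void main(String[] args) {
        Map<String, String> item = new HashMap<>();
        item.put("id", "42");

        WriteHook hook = new AutoTimestampHook("lastUpdated");
        Map<String, String> stamped = hook.beforeWrite(item);

        System.out.println(stamped.containsKey("lastUpdated")); // true
        System.out.println(item.containsKey("lastUpdated"));    // false
    }
}
```

A real extension would plug the same logic into the enhanced client's write path, so every `putItem`/`updateItem` through the mapped table gets the timestamp automatically.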

@juandyego
Copy link

juandyego commented Sep 18, 2020

Hello, I'm migrating a Java Lambda function from DynamoDB SDK 1 to DynamoDB SDK 2.
I have a table TABLE like this:
id as key
att1 as attribute
att2 as attribute

With SDK 2 version, I use a model class with annotations to map this table, MODELTABLE :
@DynamoDbBean
public class MODELTABLE {
String id;
String att1;
String att2;

@DynamoDbPartitionKey
@DynamoDbAttribute(value = "id")
...get / set for id
@DynamoDbAttribute(value = "att1")
...get / set for att1
@DynamoDbAttribute(value = "att2")
...get / set for att2
}

I want to get the rows in the table where att2 = value. How can I do this?
I tried to use the method below, but QueryConditional is mandatory, and I only want to filter by the att2 value.

public <T> Iterator<T> getDataByAttributeValues(DynamoDbTable<T> mapper, Map<String, AttributeValue> eav, String filterExpression) {
    QueryConditional queryConditional = QueryConditional
            .keyEqualTo(Key.builder().partitionValue("KEY_VALUE")
            .build());

    Expression scanExpression = Expression.builder()
            .expression(filterExpression)
            .expressionValues(eav)
            .build();

    QueryEnhancedRequest queryRequest = QueryEnhancedRequest.builder()
            .queryConditional(queryConditional)
            .filterExpression(scanExpression)
            .build();

    return mapper.query(queryRequest).items().iterator();
}

Thanks in advance.

@juandyego
Copy link


Forget my question: I have found how to do it. ScanEnhancedRequest is the solution:

public <T> Iterator<T> getDataByAttributeValues(DynamoDbTable<T> mapper, Map<String, AttributeValue> eav, String filterExpression) {

    Expression scanExpression = Expression.builder()
            .expression(filterExpression)
            .expressionValues(eav)
            .build();

    ScanEnhancedRequest scanRequest = ScanEnhancedRequest.builder()
            .filterExpression(scanExpression)
            .build();

    return mapper.scan(scanRequest).items().iterator();
}
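One caveat worth noting when choosing a scan over a query: a filter expression such as `att2 = :val` is applied by DynamoDB after each item is read, so the scan still examines (and consumes read capacity for) every item in the table. Conceptually it behaves like this SDK-free sketch, where the table is a list of attribute maps (item contents and attribute names are illustrative):

```java
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class ScanFilterDemo {
    // Conceptual equivalent of ScanEnhancedRequest with filter "att2 = :val":
    // every item in the table is read, then non-matching items are dropped.
    static List<Map<String, String>> scanWithFilter(
            List<Map<String, String>> table, String attribute, String value) {
        return table.stream()
                .filter(item -> value.equals(item.get(attribute)))
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<Map<String, String>> table = List.of(
                Map.of("id", "1", "att2", "foo"),
                Map.of("id", "2", "att2", "bar"),
                Map.of("id", "3", "att2", "foo"));

        List<Map<String, String>> matches = scanWithFilter(table, "att2", "foo");
        System.out.println(matches.size()); // 2
    }
}
```

If att2 is queried frequently, a global secondary index keyed on att2 would let the enhanced client use query() instead of a full scan.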

@juandyego
Copy link

juandyego commented Oct 19, 2020 via email

@sdesai1987
Copy link

sdesai1987 commented Jun 15, 2022

Hi Team,

I have two tables: one is Customer and the other is Users.

The Customer table has customerId as its primary key, and the Users table has userId as its primary key.

I want to update the data in both tables using the batchSave method.

List objectsToWrite = Arrays.asList(cust, users);

I tried with batchSave(objectsToWrite)

But I am getting the following exception:
Servlet.service() for servlet [dispatcherServlet] in context with path [/v1/biz-retention-nosql-worker] threw exception [Request processing failed; nested exception is com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBMappingException: class com.amazonaws.services.dynamodbv2.datamodeling.PaginatedScanList not annotated with @DynamoDBTable] with root cause
com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBMappingException: class com.amazonaws.services.dynamodbv2.datamodeling.PaginatedScanList not annotated with @DynamoDBTable

Both table classes are annotated with @DynamoDBTable.
