Hi all,

I found an inconsistency between the stub and the live implementations of the S3 service with regard to listing objects with a prefix. This small app highlights the issue:
```scala
package zio.s3

import zio._
import zio.s3._
import software.amazon.awssdk.regions.Region
import software.amazon.awssdk.auth.credentials.AwsBasicCredentials
import java.net.URI
import zio.nio.file.Path

object ListingObjectsWithPrefix extends ZIOAppDefault {

  // Run in the root of this repo
  // Run docker-compose up before running this app
  override def run = for {
    _ <- listInMinio
    _ <- listWithStub
  } yield ()

  def listInMinio =
    list("minio").provide(
      zio.s3
        .live(
          Region.CA_CENTRAL_1,
          AwsBasicCredentials.create("TESTKEY", "TESTSECRET"),
          Some(URI.create("http://127.0.0.1:9000"))
        )
    )

  def listWithStub =
    list("stub").provide(
      stub(Path("./test-data"))
    )

  def list(label: String) = for {
    objs <- listObjects(
              "bucket-1",
              ListObjectOptions(prefix = Some("dir1"), maxKeys = 10, delimiter = None, starAfter = None)
            )
    _ <- Console.printLine(s"Objects using $label: " + objs.objectSummaries.map(_.key).mkString(", "))
  } yield ()
}
```
This will print:
```
Objects using minio: dir1/hello.txt, dir1/user.csv
Objects using stub:
```
From my understanding of the AWS API, the Minio output is correct. I could also reproduce it using an actual S3 bucket.
I've looked into the code and believe I've found the cause of the issue. I'll create a PR shortly.
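For reference, the behavior I'd expect from the stub is plain `startsWith` matching on the full object key, which is what the AWS `ListObjectsV2` API does with `prefix` (no delimiter semantics involved here). A minimal sketch of that expectation, using hypothetical `keys` and `filterByPrefix` names that are not part of zio-s3:

```scala
object PrefixFilterSketch {
  // Hypothetical key list mirroring the test-data layout above
  val keys: List[String] = List("dir1/hello.txt", "dir1/user.csv", "dir2/other.txt")

  // Per the AWS ListObjectsV2 API, `prefix` matches any key that starts
  // with the given string, so "dir1" matches "dir1/hello.txt" etc.
  def filterByPrefix(keys: List[String], prefix: Option[String]): List[String] =
    prefix match {
      case Some(p) => keys.filter(_.startsWith(p))
      case None    => keys
    }

  def main(args: Array[String]): Unit =
    // Prints: dir1/hello.txt, dir1/user.csv
    println(filterByPrefix(keys, Some("dir1")).mkString(", "))
}
```

This is only an illustration of the expected semantics, not the stub's actual code; the stub currently returns no results for the `"dir1"` prefix in the repro above.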
hbibel added a commit to hbibel/zio-s3 that referenced this issue on Mar 23, 2023.