S3.GetObject no longer returns the result as a string #1877

Closed
igilham opened this issue Jan 6, 2021 · 109 comments
Labels: guidance (General information and guidance, answers to FAQs, or recommended best practices/resources), p2 (This is a standard priority issue)
@igilham commented Jan 6, 2021

Describe the bug
I'm using the GetObjectCommand with an S3Client to pull a file down from S3. In v2 of the SDK I can write response.Body.toString('utf-8') to turn the response into a string. In v3 of the SDK response.Body is a complex object that does not seem to expose the result of reading from the socket.

It's not clear if the SDK's current behaviour is intentional, but the change in behaviour since v2 is significant and undocumented.

SDK version number
3.1.0

Is the issue in the browser/Node.js/ReactNative?
Node.js

Details of the browser/Node.js/ReactNative version
v12.18.0

To Reproduce (observed behavior)

import { GetObjectCommand, S3Client } from '@aws-sdk/client-s3';

export async function getFile() {
  const client = new S3Client({ region: 'eu-west-1' });
  const cmd = new GetObjectCommand({
    Bucket: 'my-bucket',
    Key: '/readme.txt',
  });
  const data = await client.send(cmd);

  console.log(data.Body.toString('utf-8'));
}

Expected behavior
It should print the text of the file.

Additional context

data.Body is a complex object with circular references. Object.keys(data.Body) returns the following:

[
  "_readableState",
  "readable",
  "_events",
  "_eventsCount",
  "_maxListeners",
  "socket",
  "connection",
  "httpVersionMajor",
  "httpVersionMinor",
  "httpVersion",
  "complete",
  "headers",
  "rawHeaders",
  "trailers",
  "rawTrailers",
  "aborted",
  "upgrade",
  "url",
  "method",
  "statusCode",
  "statusMessage",
  "client",
  "_consuming",
  "_dumped",
  "req"
]
@igilham igilham added bug This issue is a bug. needs-triage This issue or PR still needs to be triaged. labels Jan 6, 2021
@trivikr (Member) commented Jan 6, 2021

This happens as data.Body is now of type Readable | ReadableStream | Blob

Body?: Readable | ReadableStream | Blob;

For your specific example, you can write a streamToString function to convert the stream to a string.

const { S3Client, GetObjectCommand } = require("@aws-sdk/client-s3");

(async () => {
  const region = "us-west-2";
  const client = new S3Client({ region });

  const streamToString = (stream) =>
    new Promise((resolve, reject) => {
      const chunks = [];
      stream.on("data", (chunk) => chunks.push(chunk));
      stream.on("error", reject);
      stream.on("end", () => resolve(Buffer.concat(chunks).toString("utf8")));
    });

  const command = new GetObjectCommand({
    Bucket: "test-aws-sdk-js-1877",
    Key: "readme.txt",
  });

  const { Body } = await client.send(command);
  const bodyContents = await streamToString(Body);
  console.log(bodyContents);
})();

@igilham Does this resolve your query?

@trivikr trivikr added guidance General information and guidance, answers to FAQs, or recommended best practices/resources. response-requested Waiting on additional info and feedback. Will move to "closing-soon" in 7 days. and removed bug This issue is a bug. needs-triage This issue or PR still needs to be triaged. labels Jan 6, 2021
@igilham (Author) commented Jan 6, 2021

Thanks, @trivikr. This works in my application but raises a few concerns about the library that are worth sharing:

  • There is no documentation for clients and the GetObjectCommand is not documented in the user guide or sample code. The project Readme file implies I could expect the same behaviour as SDKv2.
  • My IDE can't tell me what the type of response.Body is. It tells me that it's any. Perhaps the library configuration could be improved to export the correct type information.
  • It's nice to have options for data processing, but I shouldn't be forced to write boilerplate I/O code for the most common use case.
  • As noted below, I can't find an export of ReadableStream and Blob so it appears to be impossible to make this code type-safe.

For reference, I've rewritten the streamToString with the missing types added back in to comply with my team's linter settings.

import { Readable } from 'stream';

// Apparently the stream parameter should be of type Readable|ReadableStream|Blob
// The latter 2 don't seem to exist anywhere.
async function streamToString (stream: Readable): Promise<string> {
  return await new Promise((resolve, reject) => {
    const chunks: Uint8Array[] = [];
    stream.on('data', (chunk) => chunks.push(chunk));
    stream.on('error', reject);
    stream.on('end', () => resolve(Buffer.concat(chunks).toString('utf-8')));
  });
}

@trivikr (Member) commented Jan 6, 2021

There is no documentation for clients and the GetObjectCommand is not documented in the user guide or sample code. The project Readme file implies I could expect the same behaviour as SDKv2.

Documentation for the getObject operation lists GetObjectOutput.Body as Readable | ReadableStream | Blob.
API Reference: https://docs.aws.amazon.com/AWSJavaScriptSDK/v3/latest/clients/client-s3/classes/s3.html#getobject


My IDE can't tell me what the type of response.Body is. It tells me that it's any. Perhaps the library configuration could be improved to export the correct type information.

I'm using Visual Studio Code, and it shows type of response.Body as internal.Readable | ReadableStream<any> | Blob on hover.

Please create a new issue with details of your IDE and code if the problem persists.


@trivikr (Member) commented Jan 6, 2021

  • As noted below, I can't find an export of ReadableStream and Blob so it appears to be impossible to make this code type-safe.

For reference, I've rewritten the streamToString with the missing types added back in to comply with my team's linter settings. [...]

As this code runs on Node.js, you can pass Body as Readable as follows:

const bodyContents = await streamToString(Body as Readable);

@igilham (Author) commented Jan 6, 2021

Thanks for following up.

I didn't realise the methods and types were documented. I took the description on the client landing page (the "go to the README" note) to mean it was a dead-end. Perhaps improving the wording should be a separate issue.

I can't explain the IDE issue. I'm also on VSCode and it says it's an any. I find the IDE quite unstable though, so maybe it's just me.


@trivikr (Member) commented Jan 6, 2021

I didn't realise the methods and types were documented. I took the description on the client landing page (go to the README) to mean it was a dead-end. Perhaps improving the wording should be a separate issue.

I've created a documentation update request at #1878

@github-actions github-actions bot removed the response-requested Waiting on additional info and feedback. Will move to "closing-soon" in 7 days. label Jan 7, 2021
@igilham (Author) commented Jan 7, 2021

Thanks. I think that covers my remaining frustrations. I appreciate that it can take time for documentation elsewhere to catch up when a major version is released.

@leimd commented Jan 21, 2021

The code snippet above works in a Node.js environment; in the browser, you would have a ReadableStream instead of a Readable.

Here is my implementation of handling the ReadableStream:

const streamToString = (stream) => {
  return new Promise((resolve, reject) => {
    if (stream instanceof ReadableStream === false) {
      reject(
        "Expected stream to be instance of ReadableStream, but got " +
          typeof stream
      );
      return; // stop here; don't try to read from a non-stream
    }
    let text = "";
    const decoder = new TextDecoder("utf-8");

    const reader = stream.getReader();
    const processRead = ({ done, value }) => {
      if (done) {
        // all chunks read; resolve the promise with the accumulated text
        resolve(text);
        return;
      }

      // stream: true keeps multi-byte characters intact across chunk boundaries
      text += decoder.decode(value, { stream: true });

      // Not done, keep reading
      reader.read().then(processRead);
    };

    // start read
    reader.read().then(processRead);
  });
};

@ffxsam commented Jan 28, 2021

@trivikr Thanks for that link to the docs! I didn't even know they existed till just now.

@lambrospetrou commented Feb 9, 2021

The code snippet above works in a Node.js environment; in the browser, you would have a ReadableStream instead of a Readable. [...]

I also wasted lots of time on GetObject and the trifecta of its types. The fact that ReadableStream | Blob is browser-only and Readable is Node-only made it extremely annoying :)

The streamToString solution posted above works for Node.
For the browser, I found that using the Response object from fetch seems a shorter solution:

new Response(response!.Body, {});

This will return a Response object, which then lets us use any of its helper methods to convert to a string, buffer, JSON, etc. See more at https://developer.mozilla.org/en-US/docs/Web/API/Response#methods.

Full example:

const s3 = new S3({
  region: "us-east-1",
  credentials: {
    accessKeyId: "replace-it",
    secretAccessKey: "replace-it",
  },
});
const resp = await s3.getObject({
  Bucket: "your-bucket",
  Key: "your-object-key",
});
console.info(await new Response(resp.Body, {}).text())

It's quite unfortunate that everybody has to go through these hoops to get the content out of the response though. Especially considering that we have to do type checking with things like if (resp.Body instanceof Readable), or declare special interfaces to avoid differences between browser/Node.
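For completeness, the branching could be hidden behind a small helper, something like this (a rough sketch, not part of the SDK; bodyToString is a made-up name):

import type { Readable } from 'stream';

async function bodyToString(body: Readable | ReadableStream | Blob): Promise<string> {
  // browser: the fetch Response wrapper accepts both ReadableStream and Blob
  if (typeof window !== 'undefined') {
    return new Response(body as ReadableStream | Blob).text();
  }
  // Node.js: Body is a Readable, so collect the chunks and decode once
  const chunks: Buffer[] = [];
  for await (const chunk of body as Readable) {
    chunks.push(chunk as Buffer);
  }
  return Buffer.concat(chunks).toString('utf-8');
}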

@smiccoli
The streamToString solution posted above works for Node. For the browser, using the Response object from fetch seems a shorter solution. [...]

The use of Response looks like the neatest solution right now for JSON and text payloads.

@ffxsam commented Feb 10, 2021

I've been running into these pain points as well, including Lambda invocation. The payload returned is now a Uint8Array, so it takes a few hoops to get it into a usable format:

const payload = JSON.parse(Buffer.from(data.Payload).toString());
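In context, a fuller version might look like this (a sketch; assumes @aws-sdk/client-lambda, with a placeholder function name):

import { LambdaClient, InvokeCommand } from '@aws-sdk/client-lambda';

const lambda = new LambdaClient({});
// 'my-function' is a placeholder name
const data = await lambda.send(new InvokeCommand({ FunctionName: 'my-function' }));
// data.Payload is a Uint8Array in v3, so decode it before parsing
const payload = JSON.parse(Buffer.from(data.Payload!).toString());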

Whereas in the previous JS SDK, it was simply:

const payload = JSON.parse(data.Payload);

I don't understand this new direction with the SDK. I can't say I'm a fan. Maybe @trivikr can weigh in.

@trivikr (Member) commented Feb 10, 2021

Reopening, as a lot of customers have raised questions.
Tagging @AllanZhengYP for comment.

@moltar commented Mar 6, 2021

export interface GetObjectOutput {
    /**
     * <p>Object data.</p>
     */
    Body?: Readable | ReadableStream | Blob;

  // ... snip
}


This is throwing an error in a Node.js app, because the TS config does not load the DOM libs.

This results in the Body being set to any.
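Presumably, adding the DOM lib to tsconfig.json would make ReadableStream and Blob resolve (a sketch; note it also pulls browser globals into a Node project):

{
  "compilerOptions": {
    "lib": ["ES2020", "DOM"]
  }
}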


@kennu commented Mar 8, 2021

I'm also very confused about how to read S3 Body responses with SDK v3. The SDK documentation for GetObjectCommand does not describe how to do it, and the SDK examples are also missing it (awsdocs/aws-doc-sdk-examples#1677).

I would ask the AWS SDK team to include in the SDK a simple way to read S3 Body responses. We don't want to re-implement complicated event handlers and helper functions for this simple purpose every time we use GetObject in a project.

In v2 we could just say something like JSON.parse(response.Body?.toString()). Please make it as simple in v3. Stream-based processing is also important, but it should be only an alternative for the simple case for parsing small JSON objects.

For reference, I was able to do this in Node.js by utilizing node-fetch. I would like something like this to be included in the AWS SDK.

npm install node-fetch
npm install --save-dev @types/node-fetch

import { Response } from 'node-fetch'

const response = new Response(s3Response.Body)
const data = await response.json()

@m-radzikowski
A one-line alternative is to use the get-stream package, as posted here: #1096 (comment)

I understand the reason for returning a ReadableStream, but a built-in helper method would be nice. Reading the whole body into a string in memory is probably fine for 99% of cases.

If a helper method were part of the SDK, we could just call readStream(response.Body) and everyone would be happy, not having to add another dependency or 10 lines of boilerplate code to every new project.
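For reference, the get-stream version looks roughly like this (a sketch, assuming get-stream v6 on Node.js):

import getStream from 'get-stream';
import type { Readable } from 'stream';
import { S3Client, GetObjectCommand } from '@aws-sdk/client-s3';

const client = new S3Client({});
const { Body } = await client.send(
  new GetObjectCommand({ Bucket: 'my-bucket', Key: 'my-key' })
);
// get-stream reads the whole stream into memory and resolves with a string
const contents = await getStream(Body as Readable);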

@kuhe (Contributor) commented Oct 21, 2022

This is now documented in the root readme with an example: https://github.com/kuhe/aws-sdk-js-v3/tree/main#streams

You do not need to import sdkStreamMixin explicitly. As of that version, it is applied to stream objects in command outputs.

import { S3 } from "@aws-sdk/client-s3";

const client = new S3({});

const getObjectResult = await client.getObject({
  Bucket: "...",
  Key: "...",
});

// env-specific stream with added mixin methods.
const bodyStream = getObjectResult.Body; 

// one-time transform.
const bodyAsString = await bodyStream.transformToString();

// throws an error on 2nd call, stream cannot be rewound.
const __error__ = await bodyStream.transformToString();

@sfwhite commented Oct 21, 2022

So it looks like, going by the latest PR (https://github.com/aws/aws-sdk-js-v3/pull/3977/files), the recommended way to do this is now:

import { GetObjectCommand, S3Client } from '@aws-sdk/client-s3';
import { sdkStreamMixin } from '@aws-sdk/util-stream-node';

const s3Client = new S3Client({});
const { Body } = await s3Client.send(
  new GetObjectCommand({
    Bucket: 'your-bucket',
    Key: 'your-key',
  }),
);
const objectString = await sdkStreamMixin(Body).transformToString(); // this throws if Body is undefined

Took two solid years, but hey, we have an official solution.

@zbagley commented Oct 21, 2022

@sfwhite Thanks for the heads up on the throw. @kuhe Glad to see this in; it would probably be useful for the docs to note that, in general, this should be wrapped in a try { ... } catch { ... } for most use cases.
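Something like this, for example (a sketch, reusing getObjectResult from the example above):

try {
  // transformToString consumes the stream; it throws if called twice,
  // and stream/network errors surface here as well
  const bodyAsString = await getObjectResult.Body?.transformToString();
  // ... use bodyAsString
} catch (err) {
  // handle the failed read
  console.error(err);
}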

@samchungy (Contributor)
This is now documented in the root readme with an example: https://github.com/kuhe/aws-sdk-js-v3/tree/main#streams [...]

Looks like we need to check for Body being undefined though, or else we get Object is possibly 'undefined'. ts(2532). So:

import { S3 } from "@aws-sdk/client-s3";

const client = new S3({});

const getObjectResult = await client.getObject({
  Bucket: "...",
  Key: "...",
});

if (!getObjectResult.Body) {
  // handle not found
  throw new Error("Object not found");
}

// env-specific stream with added mixin methods.
const bodyStream = getObjectResult.Body; 

// one-time transform.
const bodyAsString = await bodyStream.transformToString();

// throws an error on 2nd call, stream cannot be rewound.
const __error__ = await bodyStream.transformToString();

@acommodari
With Node 18 introducing the Web Streams API, will this affect S3 download streams in any way?
To my knowledge, if you were working in Node you could assume the Body was always going to be a Readable.
Will it now also support ReadableStream?

@ryanblock
Hey @trivikr! Any updates or official word from your side on this? Haven't heard from you in this thread for over a year and a half, and it's especially relevant with Lambda nodejs18.x now out with SDK v3. Thanks! 💕

@ShivamJoker
@kuhe any idea how I can get this working with sharp? It's not taking the string, buffer, or stream.

@misantronic
@kuhe any idea how I can get this working with sharp? It's not taking the string, buffer, or stream.

this works for me with sharp:

async function streamToBuffer(stream: Readable): Promise<Buffer> {
    return await new Promise((resolve, reject) => {
        const chunks: Uint8Array[] = [];
        stream.on('data', (chunk) => chunks.push(chunk));
        stream.on('error', reject);
        stream.on('end', () => resolve(Buffer.concat(chunks)));
    });
}

const resp = await s3.send(new GetObjectCommand({ Bucket, Key }));

if (resp.Body) {
    resp.Body = (await streamToBuffer(resp.Body as Readable)) as any;
}

@ShivamJoker
Can't we just use the stream instead of converting it?

@misantronic
Can't we just use the stream instead of converting it?

I was trying to - no success. If you find a way, keep me posted.

@ShivamJoker
Okay, so after spending a few hours I got it right. This way we can pipe our S3 response body into sharp and later use .toBuffer() to push it to the bucket.

  const getObj = new GetObjectCommand({
    Bucket,
    Key: objectKey,
  });

  const s3ImgRes = await s3Client.send(getObj);

  const sharpImg = sharp().resize({ width: 500 }).toFormat("webp");

  // pipe the body to sharp img
  s3ImgRes.Body.pipe(sharpImg);

  const putObj = new PutObjectCommand({
    Bucket,
    Key: `converted/${objectKey.replace(/[a-zA-Z]+$/, "webp")}`,
    Body: await sharpImg.toBuffer(),
  });

  await s3Client.send(putObj);

But AWS team, please, please update your docs. I know there is a lot to update, but as a developer it's just so much of a struggle to use AWS services because of insufficient docs.

@adcreare commented Dec 2, 2022

Here is an example of how to download an object from S3 and write that as a file to disk, while keeping it as a stream. This example is typescript targeting node.

It seems silly to me that, after going to all this trouble to get a stream from AWS, we would then convert it to a buffer or string just to write it to disk.

I also agree with the sentiments expressed by others in this thread. It is crazy that getObject has become such a complicated operation in the V3 SDK compared with the V2 SDK and is going to trip many people up for years to come.

import type { Readable } from 'node:stream';
import { pipeline } from 'node:stream/promises';
import fs from 'node:fs'
import { GetObjectCommand, S3 } from '@aws-sdk/client-s3';

async function downloadFile(sourceBucket: string, sourceKey: string) {
  const s3 = new S3({});
  const s3Result = await s3.send(new GetObjectCommand({ Bucket: sourceBucket, Key: sourceKey }));
  if (!s3Result.Body) {
    throw new Error('received empty body from S3');
  }
  await pipeline(s3Result.Body as Readable, fs.createWriteStream('/tmp/filedownload.zip'));
}

@rdt712 commented Dec 6, 2022

Found an easy solution using transformToString if you want to parse a JSON file in S3.

import { S3, GetObjectCommand } from '@aws-sdk/client-s3'

const s3 = new S3({});

const getObjectParams = {
  Bucket: 'my-bucket',
  Key: 'my-object',
};
const getObjectCommand = new GetObjectCommand(getObjectParams);
const s3Object = await s3.send(getObjectCommand);

const dataStr = await s3Object.Body?.transformToString();

let data;
if (dataStr) {
  data = JSON.parse(dataStr);
}

@kesavab commented Dec 13, 2022

Found an easy solution using transformToString if you want to parse a JSON file in S3. [...]

When I use this, I get an error during parsing:

Uncaught SyntaxError: Unexpected token  in JSON at position 0
    at eval (repl:1:6)

The call to transformToString returns the JSON file contents.

@OlivierCuyp
@kesavab This means your dataStr is not a valid JSON string.
Did you try to log it with something like:

// ...
if (dataStr) {
  try {
    data = JSON.parse(dataStr);
  } catch (err) {
    console.log(err, dataStr);
  }
}

@ghost commented Feb 15, 2023

What's the next step after getting the object string if I want to download the file? Should I convert it into a blob file and return it to the frontend?

@kmordan24
TypeScript errors if you try to pipe Body, even though it is a stream:
Property 'pipe' does not exist on type 'SdkStream<Readable | ReadableStream<any> | Blob | undefined>'.   Property 'pipe' does not exist on type 'ReadableStream<any> & SdkStreamMixin'.

const { Body } = await s3Client.send(new GetObjectCommand({ Bucket: bucket, Key: key }));

if (!Body) {
  throw new Error('....');
}

Body.pipe(fs.createWriteStream(filePath))
    .on('error', (err: any) => reject(err))
    .on('close', () => resolve());

...

The only way around this is type casting - unless anyone has any other ideas.
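For example (a sketch of the cast, reusing Body and filePath from the snippet above; in Node.js the Body is a Readable at runtime):

import type { Readable } from 'stream';
import { pipeline } from 'node:stream/promises';
import fs from 'node:fs';

// casting narrows the union so TypeScript accepts the pipe;
// pipeline also takes care of the error/close handling from the original snippet
await pipeline(Body as Readable, fs.createWriteStream(filePath));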

Will there be a fix for this in a future release version?

@RanVaknin RanVaknin added the p2 This is a standard priority issue label Feb 17, 2023
@RanVaknin (Contributor)
Hi all,

It is extremely hard for us to keep track of issues that sit at the end of the issue queue on GitHub.

Since the OP's concern was addressed, and additional concerns were also addressed, I feel like we can close this.

If you still need help, whether it be to report a bug or to ask a question, please re-engage with us on a new thread/discussion.
That gives us more visibility into your concerns so we can answer in a timely manner.

Sorry for the inconvenience,
Ran~

@jsancho commented Feb 28, 2023

Okay, so after spending a few hours I got it right. This way we can pipe our S3 response body into sharp and later use .toBuffer() to push it to the bucket. [...]

can't thank you enough for this workaround :)

@j0k3r commented Mar 10, 2023

I found another solution, depending on how you want to get your content from S3, using stream/consumers on Node.js >= v16.7.0:

import consumers from 'node:stream/consumers'
import { S3 } from '@aws-sdk/client-s3'

const s3 = new S3()
const { Body } = await s3.getObject({ Bucket: "your-bucket", Key: "your-object-key" })
// as a buffer
const buffer = await consumers.buffer(Body)
// as a text
const text = await consumers.text(Body)
// as a json
const json = await consumers.json(Body)

@github-actions
This thread has been automatically locked since there has not been any recent activity after it was closed. Please open a new issue for related bugs and link to relevant comments in this thread.

@github-actions github-actions bot locked as resolved and limited conversation to collaborators Mar 25, 2023
@yenfryherrerafeliz (Contributor) commented Jun 26, 2023

Hi people, I just want to let you all know that we have introduced a new feature that allows us to convert the body of a bucket's object to a string by doing the following:

import {S3Client, GetObjectCommand} from "@aws-sdk/client-s3";

const client = new S3Client({
    region: 'us-east-2'
});
const response = await client.send(new GetObjectCommand({
    Bucket: process.env.TEST_BUCKET,
    Key: process.env.TEST_KEY
}));
console.log(await response.Body.transformToString('utf-8'));

This feature is available as of version 3.357.0.

Thanks!
