
paginateGetLogEvents should guarantee sane behavior or throw an error explaining how to call it #6631

Open
jedwards1211 opened this issue Nov 6, 2024 · 2 comments
Labels: bug (This issue is a bug.), p2 (This is a standard priority issue)

jedwards1211 commented Nov 6, 2024

Checkboxes for prior research

Describe the bug

Right now, we have to manually set startFromHead: true when calling paginateGetLogEvents, or the pagination ends unexpectedly.

#5559 (comment)

This kind of thing is a really irritating waste of time and should be improved. paginateGetLogEvents should set whatever default options are necessary to ensure sane behavior, or throw an error explaining how to call it in a way that actually works.
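
For reference, a minimal sketch of the workaround described above (client, log group/stream names, and time range are assumed to be defined elsewhere):

    // Passing startFromHead: true explicitly so the paginator walks forward from
    // the oldest event instead of stopping after a single page.
    for await (const page of paginateGetLogEvents(
      { client, stopOnSameToken: true },
      {
        logGroupIdentifier: logGroupName.replace(/:\*$/, ""),
        logStreamName,
        startFromHead: true,
        startTime: startTime.getTime(),
        endTime: endTime.getTime(),
      },
    )) {
      console.log(page.events?.length);
    }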

Regression Issue

  • Select this option if this issue appears to be a regression.

SDK version number

@aws-sdk/[email protected]

Which JavaScript Runtime is this issue in?

Node.js

Details of the browser/Node.js/ReactNative version

Node 20.10.0

Reproduction Steps

    import {
      CloudWatchLogsClient,
      paginateGetLogEvents,
    } from "@aws-sdk/client-cloudwatch-logs";

    // client, logGroupName, logStreamName, startTime, and endTime are set up elsewhere
    for await (const page of paginateGetLogEvents(
      {
        client,
        stopOnSameToken: true,
      },
      {
        logGroupIdentifier: logGroupName.replace(/:\*$/, ""),
        logStreamName,
        startTime: startTime.getTime(),
        endTime: endTime.getTime(),
      },
    )) {
      console.log(page.events?.length); // only the first page is ever logged
    }

Observed Behavior

the loop ends after getting one page

Expected Behavior

the loop goes through all pages between startTime and endTime

Possible Solution

If the underlying service can't be improved, then paginateGetLogEvents should set a default value of startFromHead: true when none is provided, so that it behaves in a sane manner. If paginating backward from most to least recent with startFromHead: false (or no value) can't work, then paginateGetLogEvents should just throw an error explaining that the option doesn't work.
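
As an illustration only, a hypothetical wrapper sketching the two proposed behaviors (the wrapper name and error message are made up, not part of the SDK):

    import { paginateGetLogEvents } from "@aws-sdk/client-cloudwatch-logs";

    // Hypothetical helper; paginateGetLogEventsSane is not a real SDK export.
    function paginateGetLogEventsSane(config, input) {
      if (input.startFromHead === false) {
        // Proposed option 1: fail loudly when the known-problematic setting is requested.
        throw new Error(
          "paginateGetLogEvents cannot reliably paginate with startFromHead: false; " +
            "pass startFromHead: true to iterate from oldest to newest."
        );
      }
      // Proposed option 2: default startFromHead to true when the caller omits it.
      return paginateGetLogEvents(config, { ...input, startFromHead: true });
    }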

Additional Information/Context

No response

@jedwards1211 jedwards1211 added bug This issue is a bug. needs-triage This issue or PR still needs to be triaged. labels Nov 6, 2024
@jedwards1211 jedwards1211 changed the title paginateGetLogEvents should be more straightforward to work with paginateGetLogEvents should guarantee sane behavior or throw an error explaining how to call it Nov 6, 2024
@zshzbh zshzbh self-assigned this Nov 6, 2024
@zshzbh zshzbh added p2 This is a standard priority issue response-requested Waiting on additional info and feedback. Will move to "closing-soon" in 7 days. and removed needs-triage This issue or PR still needs to be triaged. labels Nov 15, 2024
zshzbh (Contributor) commented Nov 15, 2024

Hey @jedwards1211 ,

Thanks for the feedback! Have you tried limiting the page size? How many log events are expected? I don't see this behavior, but maybe there aren't many logs in my target log stream. This is the code I have:

import { paginateGetLogEvents, CloudWatchLogsClient } from "@aws-sdk/client-cloudwatch-logs";

async function getAllLogEvents(logGroupName, logStreamName, startTime, endTime) {
  const client = new CloudWatchLogsClient({ region: "us-west-2" });

  const paginator = paginateGetLogEvents(
    {
      client,
      stopOnSameToken: true,
      // startFromHead: true, // I got the same result when startFromHead is set to false
      // pageSize: 1, // adjust as needed
    },
    {
      logGroupName: logGroupName.replace(/:\*$/, ""),
      logStreamName,
      startTime: startTime.getTime(),
      endTime: endTime.getTime(),
    }
  );

  const allEvents = [];

  try {
    for await (const page of paginator) {
      if (page.events) {
        // console.log("page-----", page.events)
        allEvents.push(...page.events);
      }
    }

    return allEvents;
  } catch (error) {
    console.error("Error fetching logs:", error);
    throw error;
  }
}

const startTime = new Date("2024-11-14T00:00:00Z");
const endTime = new Date("2024-11-16T00:00:00Z");
const res = await getAllLogEvents("test", "day1", startTime, endTime);
console.log(res);

And it returns all logs in the log stream -

➜  6631 node index.js
[
  {
    eventId: '38618180469885183388241784396081954239566890170596851712',
    ingestionTime: 1731699104386,
    message: 'day1',
    timestamp: 1731699103599
  },
  {
    eventId: '38618184999121931719414283441219152807385840054760505344',
    ingestionTime: 1731699307472,
    message: 'day1-1',
    timestamp: 1731699306697
  },
  {
    eventId: '38618185139973438393333699210894440561295687333203148800',
    ingestionTime: 1731699313871,
    message: 'day1-2',
    timestamp: 1731699313013
  },
  {
    eventId: '38618185322482737098108319011050540936608465762275426304',
    ingestionTime: 1731699322000,
    message: 'day1-3',
    timestamp: 1731699321197
  }
]

zshzbh (Contributor) commented Nov 15, 2024

I tried with a real log group & log stream.

The first couple of results I got with startFromHead: true -

[
 {
    eventId: '38618126044626616686675106431117400846423084883354451968',
    ingestionTime: 1731696667343,
    message: 'INIT_START Runtime Version: nodejs:20.v42\tRuntime Version ARN: arn:aws:lambda:us-west-2::runtime:XXXXXXXx\n',
    timestamp: 1731696663086
  },
  {
    eventId: '38618126714607904686130617514352486347627051971567484928',
    ingestionTime: 1731696702149,
    message: 'START RequestId: 9b673797-5a0f-412a-8f17-876e9c195e02 Version: $LATEST\n',
    timestamp: 1731696693129
  },
  {
    eventId: '38618126751448735754103206944169492934042145179447132161',
    ingestionTime: 1731696702149,
    message: '2024-11-15T18:51:34.781Z\t9b673797-5a0f-412a-8f17-876e9c195e02\tERROR\tInvoke Error \t{"errorType":"ResourceNotFoundException","errorMessage":"Stream test under account 471112623206 not found.","name":"ResourceNotFoundException","$fault":"client","$metadata":{"httpStatusCode":400,"requestId":"c1bf0b4d-ee84-606d-9e0e-913b1457d6ed","extendedRequestId":"S/dGNxp1EgsMp7KGqXN3qmpg1dshKUytLA82UJiCqrLNTfaLfc48Jq2dYcdfhqKLbeQyoHTcCBHtYhcIm8esJeib1rDfnM0d3MgnfZNdRbs=","attempts":1,"totalRetryDelay":0},"__type":"ResourceNotFoundException","stack":["ResourceNotFoundException: Stream test under account 471112623206 not found.","    at de_ResourceNotFoundExceptionRes (/var/runtime/node_modules/@aws-sdk/client-kinesis/dist-cjs/index.js:1383:21)","    at de_CommandError (/var/runtime/node_modules/@aws-sdk/client-kinesis/dist-cjs/index.js:1223:19)","    at process.processTicksAndRejections (node:internal/process/task_queues:95:5)","    at async /var/runtime/node_modules/@aws-sdk/node_modules/@smithy/middleware-serde/dist-cjs/index.js:35:20","    at async /var/runtime/node_modules/@aws-sdk/node_modules/@smithy/core/dist-cjs/index.js:165:18","    at async /var/runtime/node_modules/@aws-sdk/node_modules/@smithy/middleware-retry/dist-cjs/index.js:320:38","    at async /var/runtime/node_modules/@aws-sdk/middleware-logger/dist-cjs/index.js:34:22","    at async Runtime.handler (file:///var/task/index.mjs:14:18)"]}\n',
    timestamp: 1731696694781
  },
  {
    eventId: '38618126753232795369985656795492350395854014099925565442',
    ingestionTime: 1731696702149,
    message: 'END RequestId: 9b673797-5a0f-412a-8f17-876e9c195e02\n',
    timestamp: 1731696694861
  },
  {
    eventId: '38618126753232795369985656795492350395854014099925565443',
    ingestionTime: 1731696702149,
    message: 'REPORT RequestId: 9b673797-5a0f-412a-8f17-876e9c195e02\tDuration: 1731.70 ms\tBilled Duration: 1732 ms\tMemory Size: 128 MB\tMax Memory Used: 83 MB\tInit Duration: 319.96 ms\t\n',
    timestamp: 1731696694861
  },
]

The logs without startFromHead -

 {
    eventId: '38618131478024778347073190181465893278047963672007540736',
    ingestionTime: 1731696915748,
    message: 'START RequestId: d8673798-a34e-490b-9237-496507f82106 Version: $LATEST\n',
    timestamp: 1731696906728
  },
  {
    eventId: '38618131480946175968080701813007072371764899029290975233',
    ingestionTime: 1731696915748,
    message: '2024-11-15T18:55:06.859Z\td8673798-a34e-490b-9237-496507f82106\tERROR\tInvoke Error \t{"errorType":"ResourceNotFoundException","errorMessage":"Stream test under account 471112623206 not found.","name":"ResourceNotFoundException","$fault":"client","$metadata":{"httpStatusCode":400,"requestId":"e967a5d2-4004-f682-b6d6-3e98c7b10f14","extendedRequestId":"F3jtLMEn3UgszAnRYLsxOF/Wa1KTAg2tkdcddiaLGo/G+64RTRt04mVi3xXRgXcIHuFTW7I77ItXql33vzbBH28QSh+yb7Be","attempts":1,"totalRetryDelay":0},"__type":"ResourceNotFoundException","stack":["ResourceNotFoundException: Stream test under account 471112623206 not found.","    at de_ResourceNotFoundExceptionRes (/var/runtime/node_modules/@aws-sdk/client-kinesis/dist-cjs/index.js:1383:21)","    at de_CommandError (/var/runtime/node_modules/@aws-sdk/client-kinesis/dist-cjs/index.js:1223:19)","    at process.processTicksAndRejections (node:internal/process/task_queues:95:5)","    at async /var/runtime/node_modules/@aws-sdk/node_modules/@smithy/middleware-serde/dist-cjs/index.js:35:20","    at async /var/runtime/node_modules/@aws-sdk/node_modules/@smithy/core/dist-cjs/index.js:165:18","    at async /var/runtime/node_modules/@aws-sdk/node_modules/@smithy/middleware-retry/dist-cjs/index.js:320:38","    at async /var/runtime/node_modules/@aws-sdk/middleware-logger/dist-cjs/index.js:34:22","    at async Runtime.handler (file:///var/task/index.mjs:14:18)"]}\n',
    timestamp: 1731696906859
  },
  {
    eventId: '38618131481392190872051314275837786737217866259410583554',
    ingestionTime: 1731696915748,
    message: 'END RequestId: d8673798-a34e-490b-9237-496507f82106\n',
    timestamp: 1731696906879
  },
  {
    eventId: '38618131481392190872051314275837786737217866259410583555',
    ingestionTime: 1731696915748,
    message: 'REPORT RequestId: d8673798-a34e-490b-9237-496507f82106\tDuration: 149.57 ms\tBilled Duration: 150 ms\tMemory Size: 128 MB\tMax Memory Used: 85 MB\t\n',
    timestamp: 1731696906879
  },
  {
    eventId: '38618131725607651541160168312213130242609407671985373184',
    ingestionTime: 1731696926847,
    message: 'START RequestId: a0673798-b2d8-47ad-94da-18a21ab8bc0d Version: $LATEST\n',
    timestamp: 1731696917830
  },
  {
    eventId: '38618131728038432767800006234640523534328079076137238529',
    ingestionTime: 1731696926847,
    message: '2024-11-15T18:55:17.939Z\ta0673798-b2d8-47ad-94da-18a21ab8bc0d\tERROR\tInvoke Error \t{"errorType":"ResourceNotFoundException","errorMessage":"Stream test under account 471112623206 not found.","name":"ResourceNotFoundException","$fault":"client","$metadata":{"httpStatusCode":400,"requestId":"fe73764e-6679-aa67-a1c2-ed1bd01820f1","extendedRequestId":"hCP97Vufzz1vs+gMMmDayCYtk6GZJwfWdQSFHZeGshQ2y1tE6RdfXZGWAVlVlf1Zcx2+BLS0YTSL1q8HbeyjUeXKCLehR5xk","attempts":1,"totalRetryDelay":0},"__type":"ResourceNotFoundException","stack":["ResourceNotFoundException: Stream test under account 471112623206 not found.","    at de_ResourceNotFoundExceptionRes (/var/runtime/node_modules/@aws-sdk/client-kinesis/dist-cjs/index.js:1383:21)","    at de_CommandError (/var/runtime/node_modules/@aws-sdk/client-kinesis/dist-cjs/index.js:1223:19)","    at process.processTicksAndRejections (node:internal/process/task_queues:95:5)","    at async /var/runtime/node_modules/@aws-sdk/node_modules/@smithy/middleware-serde/dist-cjs/index.js:35:20","    at async /var/runtime/node_modules/@aws-sdk/node_modules/@smithy/core/dist-cjs/index.js:165:18","    at async /var/runtime/node_modules/@aws-sdk/node_modules/@smithy/middleware-retry/dist-cjs/index.js:320:38","    at async /var/runtime/node_modules/@aws-sdk/middleware-logger/dist-cjs/index.js:34:22","    at async Runtime.handler (file:///var/task/index.mjs:14:18)"]}\n',
    timestamp: 1731696917939
  },

The pagination doesn't end unexpectedly for me, but the order of the logs is different. Could you please share more information on:

  1. The number of log events
  2. The error message or logs you have

Thanks!

@github-actions github-actions bot removed the response-requested Waiting on additional info and feedback. Will move to "closing-soon" in 7 days. label Nov 16, 2024