Flush keepalive batches past keepalive limit #1100

Merged · 3 commits · Jul 10, 2024
5 changes: 5 additions & 0 deletions .changeset/shaggy-dryers-press.md
@@ -0,0 +1,5 @@
---
'@segment/analytics-next': minor
---

Flush large keepalive requests
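The size check this PR adds can be sketched as a self-contained snippet. `MAX_KEEPALIVE_SIZE`, `kilobytes`, and `passedKeepaliveLimit` mirror the `batched-dispatcher.ts` diff below; the `return size / 1024` line is an assumption, since the diff truncates the helper's body. The 64 figure matches the roughly 64 KiB budget the Fetch spec gives in-flight keepalive request bodies.

```typescript
// Sketch of the keepalive size check added in this PR; helpers mirror
// batched-dispatcher.ts (the kilobytes return line is assumed).
const MAX_KEEPALIVE_SIZE = 64

// Rough payload size: count the bytes of the URI-encoded JSON.
function kilobytes(buffer: unknown): number {
  const size = encodeURI(JSON.stringify(buffer)).split(/%..|./).length - 1
  return size / 1024
}

// Flush before the batch crosses the keepalive budget (64 KB minus a
// 10 KB safety margin), so keepalive fetches are not silently dropped.
function passedKeepaliveLimit(buffer: unknown): boolean {
  return kilobytes(buffer) >= MAX_KEEPALIVE_SIZE - 10
}

const smallBatch = [{ event: 'small event' }]
const bigBatch = [{ event: 'fat event', properties: 'x'.repeat(60000) }]

console.log(passedKeepaliveLimit(smallBatch)) // false
console.log(passedKeepaliveLimit(bigBatch)) // true
```

Measuring the URI-encoded JSON counts multi-byte characters correctly, which is why the helper does not simply use `JSON.stringify(buffer).length`.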
@@ -130,6 +130,31 @@ describe('Batching', () => {
expect(fetch).toHaveBeenCalledTimes(1)
})

it('sends requests if the size of events exceeds keepalive limits', async () => {
const { dispatch } = batch(`https://api.segment.io`, {
size: 600,
keepalive: true,
})

for (let i = 0; i < 250; i++) {
await dispatch(`https://api.segment.io/v1/t`, {
event: 'small event',
})
}
expect(fetch).not.toHaveBeenCalled()

// each fatEvent is ~1 KB, so 65 of them push the batch past the keepalive payload budget
for (let i = 0; i < 65; i++) {
await dispatch(`https://api.segment.io/v1/t`, {
event: 'fat event',
properties: fatEvent,
})
}

// fetch fires even though the 600-event batch limit was never reached
expect(fetch).toHaveBeenCalledTimes(1)
})

it('sends requests when the timeout expires', async () => {
const { dispatch } = batch(`https://api.segment.io`, {
size: 100,
17 changes: 15 additions & 2 deletions packages/browser/src/plugins/segmentio/batched-dispatcher.ts
@@ -5,9 +5,11 @@ import { onPageChange } from '../../lib/on-page-change'
export type BatchingDispatchConfig = {
size?: number
timeout?: number
keepalive?: boolean
}

const MAX_PAYLOAD_SIZE = 500
const MAX_KEEPALIVE_SIZE = 64

function kilobytes(buffer: unknown): number {
const size = encodeURI(JSON.stringify(buffer)).split(/%..|./).length - 1
@@ -23,6 +25,15 @@ function approachingTrackingAPILimit(buffer: unknown): boolean {
return kilobytes(buffer) >= MAX_PAYLOAD_SIZE - 50
}

/**
* Checks whether the payload is over, or approaching, the ~64 KB
* budget browsers allow for keepalive request bodies. When keepalive
* is enabled we flush before crossing it to prevent data loss.
*/
function passedKeepaliveLimit(buffer: unknown): boolean {
return kilobytes(buffer) >= MAX_KEEPALIVE_SIZE - 10
}

function chunks(batch: object[]): Array<object[]> {
const result: object[][] = []
let index = 0
@@ -67,7 +78,7 @@ export default function batch(
})

return fetch(`https://${apiHost}/b`, {
keepalive: pageUnloaded,
keepalive: config?.keepalive || pageUnloaded,
headers: {
'Content-Type': 'text/plain',
},
@@ -114,7 +125,9 @@ export default function batch(
buffer.push(body)

const bufferOverflow =
buffer.length >= limit || approachingTrackingAPILimit(buffer)
buffer.length >= limit ||
approachingTrackingAPILimit(buffer) ||
(config?.keepalive && passedKeepaliveLimit(buffer))

return bufferOverflow || pageUnloaded ? flush() : scheduleFlush()
}
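The updated flush decision above can be sketched as a pure predicate. `shouldFlush` and the `FlushState` fields are hypothetical names standing in for the dispatcher's buffer and config state; the real code schedules a delayed flush when this returns false.

```typescript
// Sketch of the flush decision in batched-dispatcher.ts after this PR.
// shouldFlush and FlushState are hypothetical names for illustration.
interface FlushState {
  bufferLength: number // events currently buffered
  limit: number // max events per batch (config.size)
  nearTrackingApiLimit: boolean // approachingTrackingAPILimit(buffer)
  nearKeepaliveLimit: boolean // passedKeepaliveLimit(buffer)
  keepaliveEnabled: boolean // config.keepalive
  pageUnloaded: boolean
}

function shouldFlush(s: FlushState): boolean {
  // The keepalive size limit only forces a flush when keepalive is on.
  const bufferOverflow =
    s.bufferLength >= s.limit ||
    s.nearTrackingApiLimit ||
    (s.keepaliveEnabled && s.nearKeepaliveLimit)
  return bufferOverflow || s.pageUnloaded
}

// With keepalive off, nearing the keepalive limit does not flush:
console.log(shouldFlush({ bufferLength: 250, limit: 600, nearTrackingApiLimit: false, nearKeepaliveLimit: true, keepaliveEnabled: false, pageUnloaded: false })) // false
// With keepalive on, the same state flushes immediately:
console.log(shouldFlush({ bufferLength: 250, limit: 600, nearTrackingApiLimit: false, nearKeepaliveLimit: true, keepaliveEnabled: true, pageUnloaded: false })) // true
```

This matches the test above: 250 small events stay buffered, and only once the fat events push the payload past the keepalive budget does the batch flush early.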