A robust, zero-dependency parser for Server-Sent Events (SSE) streams.
This library consumes a `ReadableStream` or `AsyncIterable` and parses it into event messages. It is written in TypeScript and optimized for modern runtimes such as Node.js, Bun, Deno, and browsers.
- Universal Support: Works with both the standard Web API `ReadableStream` and Node.js streams.
- 🧩 Azure OpenAI Compatible: Includes a specific flush mechanism to handle streams that don't end with a newline (common in Azure OpenAI / LangChain scenarios), ensuring no data is lost; see the sketch after this list.
- TypeScript: Fully typed with TS sources included.
- Lightweight: No runtime dependencies.
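As a quick illustration of that flush behavior, here is a minimal sketch. The hand-built stream stands in for an Azure OpenAI response, and the helper (introduced in the usage section below) is assumed to accept any `ReadableStream<Uint8Array>`:

```ts
import { convertEventStreamToIterableReadableDataStream } from 'event-source-parse'

const encoder = new TextEncoder()

// An SSE body whose final event is not newline-terminated,
// as Azure OpenAI streams sometimes are
const body = new ReadableStream<Uint8Array>({
  start(controller) {
    controller.enqueue(encoder.encode('data: first\n\n'))
    controller.enqueue(encoder.encode('data: last')) // no trailing newline
    controller.close()
  },
})

for await (const chunk of convertEventStreamToIterableReadableDataStream(body)) {
  console.log(chunk) // "first", then "last" -- the tail is flushed, not dropped
}
```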
```bash
# npm
npm install event-source-parse
# bun
bun add event-source-parse
# pnpm
pnpm add event-source-parse
# yarn
yarn add event-source-parse
```

The easiest way to consume a stream is with the helper function `convertEventStreamToIterableReadableDataStream`. It converts a raw SSE stream directly into an async iterable of data strings.

```ts
import { convertEventStreamToIterableReadableDataStream } from 'event-source-parse'
async function consumeStream() {
  const response = await fetch('https://api.openai.com/v1/chat/completions', {
    method: 'POST',
    body: JSON.stringify({ stream: true, /* ... */ }),
  })

  // Convert the raw stream into an iterable of data strings
  const stream = convertEventStreamToIterableReadableDataStream(response.body)

  for await (const chunk of stream) {
    console.log('Received chunk:', chunk)
  }
}
```
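With OpenAI-style chat completion streams, each data payload is a JSON-encoded chunk and the stream is terminated by a `[DONE]` sentinel, so a typical consumer parses each chunk before use. A minimal sketch, replacing the loop above (the response shape here is OpenAI's, not something this library imposes):

```ts
for await (const chunk of stream) {
  if (chunk === '[DONE]') break // OpenAI's end-of-stream sentinel
  const parsed = JSON.parse(chunk)
  // `choices[0].delta.content` is OpenAI's streaming chat shape
  console.log(parsed.choices[0]?.delta?.content ?? '')
}
```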
If you prefer working with standard Web Streams (e.g., for piping to other streams or using `getReader`), use `convertEventStreamToReadableDataStream`. This returns a `ReadableStream<string>`.

```ts
import { convertEventStreamToReadableDataStream } from 'event-source-parse'
async function consumeWithReader() {
  const response = await fetch('https://api.openai.com/v1/chat/completions', {
    method: 'POST',
    body: JSON.stringify({ stream: true, /* ... */ }),
  })

  // Returns a ReadableStream<string>
  const dataStream = convertEventStreamToReadableDataStream(response.body)
  const reader = dataStream.getReader()

  while (true) {
    const { done, value } = await reader.read()
    if (done) break
    console.log('Received data:', value)
  }
}
```
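Because the result is a standard `ReadableStream<string>`, it also composes with other Web streams. A small sketch, piping through a hypothetical uppercase `TransformStream` (reusing `response` from the example above):

```ts
const upper = new TransformStream<string, string>({
  transform(chunk, controller) {
    controller.enqueue(chunk.toUpperCase())
  },
})

const shouting = convertEventStreamToReadableDataStream(response.body).pipeThrough(upper)
```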
If you need full control over the parsing process (e.g., accessing the event ID, retry time, or custom event types), you can compose the parser functions manually.

```ts
import { getBytes, getLines, getMessages } from 'event-source-parse'
async function parseCustomStream(stream: ReadableStream) {
  // 1. Create a message handler
  const onMessage = (msg) => {
    console.log('Event:', msg.event)
    console.log('Data:', msg.data)
    console.log('ID:', msg.id)
  }

  // 2. Create the pipeline
  // getMessages -> processes lines into EventSourceMessage objects
  // getLines    -> processes raw bytes into lines
  const processLine = getMessages(onMessage)
  const processChunk = getLines(processLine)

  // 3. Start reading bytes from the stream
  await getBytes(stream, processChunk)
}
```
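For orientation, this is the wire format those functions parse. Per the SSE specification, fields are line-based and an event ends at a blank line, so a frame such as:

```
event: status
id: 42
data: ready

```

would reach `onMessage` with `event`, `id`, and `data` populated as logged in the handler above.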
The high-level helpers allow you to hook into specific events, such as metadata, without disrupting the main data flow.

```ts
import { convertEventStreamToIterableReadableDataStream } from 'event-source-parse'

// `response` is a fetch Response, as in the earlier examples
const stream = convertEventStreamToIterableReadableDataStream(
  response.body,
  (metadata) => {
    console.log('Received metadata:', metadata)
  },
)
```

This project uses Bun for development.

```bash
# Install dependencies
bun install
# Run tests
bun run test
# Run tests with coverage
bun run test:coverage
# Lint code
bun run lint
# Build library
bun run build
```

MIT