Producing and consuming chunked ndjson
Imagine you have some kind of long-running endpoint. Instead of waiting for it to finish and only then returning all the data, you can stream results as they are produced. In my case it was a custom app to search across all GitHub repositories in an organization, and as you can imagine such a search may take some time — but whenever we find the next item we can return it immediately, so the client can refresh itself on the go.
Here is an example of the backend part:
app.get('/search', async (req, res) => {
  const log = payload => {
    const message = JSON.stringify(payload)
    console.log(message)
    res.write(`${message}\n`)
  }
  // res.setHeader('Content-Type', 'application/x-ndjson')
  res.setHeader('Content-Type', 'text/event-stream') // chrome likes that more
  res.setHeader('Transfer-Encoding', 'chunked') // tell the client we are going to return a chunked response
  const timer = Date.now()
  const term = req.query.term
  log({ message: `search for '${term}' started` })
  // await the search, so the final log line and res.end() run only after it completes
  await someLongRunningSearchThatYieldsResults(term, result => {
    log({ message: `result found`, result })
  })
  log({ message: `search for '${term}' ended`, took: Date.now() - timer })
  return res.end() // because the response is chunked we need to close it manually
})
If you call this endpoint (no matter how — via browser or curl) you will see results appearing as they are found, something like:
{"message":"search for 'hello' started"}
{"message":"result found","result":{}}
{"message":"result found","result":{}}
{"message":"result found","result":{}}
{"message":"search for 'hello' ended","took":15000}
To consume such a response on the client side we need the following function:
async function ndjson(response, cb) {
  const reader = response.body.getReader()
  const decoder = new TextDecoder()
  let result = await reader.read()
  let buffer = ''
  while (!result.done) {
    // stream: true keeps multi-byte characters intact when they are split across chunks
    buffer += decoder.decode(result.value, { stream: true })
    let idx = buffer.indexOf('\n')
    while (idx !== -1) {
      const text = buffer.substring(0, idx)
      try {
        cb(JSON.parse(text))
      } catch (error) {
        console.warn(text) // not valid JSON, e.g. a partially transferred line
      }
      buffer = buffer.substring(idx + 1)
      idx = buffer.indexOf('\n')
    }
    result = await reader.read()
  }
  // flush whatever is left, in case the last line had no trailing newline
  buffer += decoder.decode()
  if (buffer.trim() !== '') {
    try {
      cb(JSON.parse(buffer))
    } catch (error) {
      console.warn(buffer)
    }
  }
}
and we can use it like so:
const url = new URL('/search', window.origin)
url.searchParams.set('term', event.target.elements.term.value)
const response = await fetch(url)
await ndjson(response, message => {
  console.log(message)
})
With that in place we can consume a chunked response, and because it is newline-delimited JSON we are transferring machine-readable data.
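One subtlety worth checking is that the reader survives a JSON line being split across two chunks — the buffer only emits on `\n`, so a partial line simply waits for the next chunk. A standalone sketch of that check, using the global `ReadableStream` (modern browsers, Node 18+) to fake a chunked body:

```javascript
// copy of the reader function from above, so this snippet runs on its own
async function ndjson(response, cb) {
  const reader = response.body.getReader()
  const decoder = new TextDecoder()
  let result = await reader.read()
  let buffer = ''
  while (!result.done) {
    buffer += decoder.decode(result.value, { stream: true })
    let idx = buffer.indexOf('\n')
    while (idx !== -1) {
      const text = buffer.substring(0, idx)
      try {
        cb(JSON.parse(text))
      } catch (error) {
        console.warn(text)
      }
      buffer = buffer.substring(idx + 1)
      idx = buffer.indexOf('\n')
    }
    result = await reader.read()
  }
}

// build a fake response whose body delivers NDJSON in arbitrary chunks
function fakeChunkedResponse(chunks) {
  const encoder = new TextEncoder()
  return {
    body: new ReadableStream({
      start(controller) {
        for (const chunk of chunks) controller.enqueue(encoder.encode(chunk))
        controller.close()
      },
    }),
  }
}

const response = fakeChunkedResponse([
  '{"message":"result found","res', // a line cut mid-object...
  'ult":{}}\n{"message":"ended"}\n', // ...completed by the next chunk
])

// both objects get parsed, because the partial line just waits in the buffer
ndjson(response, message => console.log(message.message))
```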