# File uploads with Node `fs.ReadStream`
## Use case
Upload a large file from disk in Node.js using a readable stream, without loading the entire file into memory.
## Smallest working example

```ts
import { createClient } from '@parcely/core'
import { createReadStream } from 'node:fs'

const http = createClient({ baseURL: 'https://api.example.com' })

await http.post('/upload', createReadStream('./big.zip'), {
  headers: { 'Content-Type': 'application/zip' },
})
```
## Axios equivalent

```ts
// axios (Node):
import axios from 'axios'
import { createReadStream } from 'node:fs'

const http = axios.create({ baseURL: 'https://api.example.com' })

await http.post('/upload', createReadStream('./big.zip'), {
  headers: { 'Content-Type': 'application/zip' },
})
```
The pattern is identical. Under the hood, parcely converts the Node `ReadStream` to a Web `ReadableStream` for use with `fetch`.
## Notes and gotchas

- You must set `Content-Type` manually when using a stream; parcely cannot infer it from a `ReadStream`.
- The Node `ReadStream` is converted to a Web `ReadableStream` under the hood. This is transparent to the caller.
- Stream errors during upload produce an `HttpError` with `code: 'ERR_NETWORK'`.
- Node file-path convenience helpers (e.g., `uploadFile('./big.zip')`) and chunked/resumable uploads are out of scope for v1 core and reserved for a future `@parcely/upload-node` subpackage.