fix: resolve document upload hang by fixing stream pipeline (refs #33)
All checks were successful
Deploy to Staging / Build Images (pull_request) Successful in 2m22s
Deploy to Staging / Deploy to Staging (pull_request) Successful in 29s
Deploy to Staging / Verify Staging (pull_request) Successful in 7s
Deploy to Staging / Notify Staging Ready (pull_request) Successful in 6s
Deploy to Staging / Notify Staging Failure (pull_request) Has been skipped
The upload was hanging silently because breaking early from a `for await` loop on a Node.js stream corrupts the stream's internal state; the remaining stream could not be used afterward.

Changes:
- Collect ALL chunks from the file stream before processing
- Use subarray() for the file type detection header (first 4100 bytes)
- Create a single readable stream from the complete buffer for storage
- Remove the broken headerStream + remainingStream piping logic

This fixes the root cause where uploads would hang after logging "Document upload requested" without ever completing or erroring.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
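The root cause described above is easy to reproduce in isolation. The sketch below is hypothetical (the `readHeaderOnly` helper and its inputs are illustrative, not the project's code): breaking out of a `for await` loop invokes the stream iterator's `return()`, which by default destroys the Readable, so the rest of the data can never be read.

```typescript
import { Readable } from 'node:stream';

// Hypothetical repro of the bug pattern: read only the first `limit`
// bytes by breaking out of the loop early. The break triggers the
// async iterator's return(), which destroys the underlying stream.
async function readHeaderOnly(stream: Readable, limit: number): Promise<Buffer> {
  const chunks: Buffer[] = [];
  let total = 0;
  for await (const chunk of stream) {
    chunks.push(chunk);
    total += chunk.length;
    if (total >= limit) break; // <- destroys the stream on loop exit
  }
  return Buffer.concat(chunks);
}
```

Any code that later tries to pipe the "remaining" part of that stream waits forever, which matches the silent hang after "Document upload requested".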
@@ -272,20 +272,15 @@ export class DocumentsController {
       });
     }
 
-    // Read first 4100 bytes to detect file type via magic bytes
+    // Collect ALL file chunks first (breaking early from async iterator corrupts stream state)
     const chunks: Buffer[] = [];
-    let totalBytes = 0;
-    const targetBytes = 4100;
-
     for await (const chunk of mp.file) {
       chunks.push(chunk);
-      totalBytes += chunk.length;
-      if (totalBytes >= targetBytes) {
-        break;
-      }
     }
+    const fullBuffer = Buffer.concat(chunks);
 
-    const headerBuffer = Buffer.concat(chunks);
+    // Use first 4100 bytes for file type detection via magic bytes
+    const headerBuffer = fullBuffer.subarray(0, Math.min(4100, fullBuffer.length));
 
     // Validate actual file content using magic bytes
     const detectedType = await FileType.fromBuffer(headerBuffer);
@@ -341,15 +336,9 @@ export class DocumentsController {
 
     const counter = new CountingStream();
 
-    // Create a new readable stream from the header buffer + remaining file chunks
-    const headerStream = Readable.from([headerBuffer]);
-    const remainingStream = mp.file;
-
-    // Pipe header first, then remaining content through counter
-    headerStream.pipe(counter, { end: false });
-    headerStream.on('end', () => {
-      remainingStream.pipe(counter);
-    });
+    // Create readable stream from the complete buffer and pipe through counter
+    const fileStream = Readable.from([fullBuffer]);
+    fileStream.pipe(counter);
 
     const storage = getStorageService();
     const bucket = 'documents';
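The replacement flow in the diff can be sketched as a standalone helper. This is a minimal sketch; names like `collectAndSplit` are illustrative, and `file` stands in for `mp.file`, the multipart file stream:

```typescript
import { Readable } from 'node:stream';

// Illustrative sketch of the fixed flow: drain the ENTIRE async-iterable
// stream, take a zero-copy view of the first bytes for magic-byte
// detection, then build a fresh Readable over the full buffer for storage.
async function collectAndSplit(
  file: AsyncIterable<Buffer>,
  headerSize = 4100,
): Promise<{ fullBuffer: Buffer; headerBuffer: Buffer; fileStream: Readable }> {
  const chunks: Buffer[] = [];
  for await (const chunk of file) {
    chunks.push(chunk); // never break early: that corrupts the stream's state
  }
  const fullBuffer = Buffer.concat(chunks);

  // subarray() returns a view, not a copy, so the header costs no extra memory
  const headerBuffer = fullBuffer.subarray(0, Math.min(headerSize, fullBuffer.length));

  // A brand-new Readable over the complete buffer is safe to pipe to storage
  const fileStream = Readable.from([fullBuffer]);
  return { fullBuffer, headerBuffer, fileStream };
}
```

Because the original stream is fully consumed before anything is piped, there is no half-consumed stream left to hang on, at the cost of buffering the whole upload in memory.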