Asked 1 month ago by VenusianSentinel397
Why does decryption of a chunked WebCrypto file stream yield a 0-byte file?
The post content has been automatically edited by the Moderator Agent for consistency and clarity.
Asked 1 month ago by VenusianSentinel397
The post content has been automatically edited by the Moderator Agent for consistency and clarity.
I have a client-side HTML page that lets a user upload a file to a NodeJS storage server. The file is encrypted in the browser with a password before being sent to the server, and later can be downloaded and decrypted by inputting the filename and password. This process uses streams along with CryptoJS to handle large files (it works with strings for small files).
Encryption works as expected and I can see the encrypted file on the server. However, when I try to decrypt and download the file, the resulting file always ends up 0 bytes. I verified that the encrypted file blob, including the salt and IV, is correctly received by the client. Yet, when reading the blob stream using await reader.read()
, the value
variable is always undefined (logging "No Value Detected") and no explicit error is thrown.
I’ve attached a minimal reproducible sample below, including an updated decryption function and the related client and server code.
EDIT: THE UPDATED DECRYPT FUNCTION
JAVASCRIPT/* Updated so that the salt and IV are manually provided. I can confirm that this works for small files, but for files that exceed the initial chunk size of 65536, the crypto library just yields an "OperationError" */ async function decryptFile(encryptedBlob, password) { // Check if there is a blob to begin with if (!encryptedBlob) { throw new Error('No encrypted file provided.'); } else { console.log('Encrypted Blob size:', encryptedBlob.size); } //Here is a SAMPLE Salt and IV, extracted from the encrypt function manually (taken via consolelog) //The encrypt function no longer includes the salt and iv in the header. //U-Salt: 55,205,63,198,167,220,224,27,164,44,121,5,66,167,118,156 //U-IV: 58,33,77,220,254,113,235,27,106,172,66,130,108,217,68,227 const salt = new Uint8Array([55,205,63,198,167,220,224,27,164,44,121,5,66,167,118,156]); const iv = new Uint8Array([58,33,77,220,254,113,235,27,106,172,66,130,108,217,68,227]); console.log("Full Contents:", header); console.log('Salt:', Array.from(salt)); console.log('IV:', Array.from(iv)); // Derive AES key const key = await deriveKey(password, salt); console.log('Key derived successfully.'); const dataReader = encryptedBlob.stream().getReader(); const decryptedChunks = []; //Log the total number of chunks console.log('Total Chunks:', dataReader); let doneProcessing = false; let totalLength = 0; let count = 0; console.log("++BEGIN READ1++)" try { //Multiple chunks? while(!doneProcessing) { count +=1; let result = await dataReader.read(); let blockLength = -1; if(result) { //Check if value and length is defined if(!result.value || !result.value.length) { console.warn('Got final or empty chunk.'); } else { //Valid Chunk to be decrypted. const decryptedChunk = await crypto.subtle.decrypt( { name: 'AES-CBC', iv }, key, result.value.buffer ); //==> This throws OperationError if the file to decrypt is too large to fit in one chunk... decryptedChunks.push(new Uint8Array(decryptedChunk)); totalLength += result.value.length; blockLength = result.value.length; } } else { console.warn('Empty chunk LENGTH received.'); } console.log(`Iteration #${count} with length ${blockLength}`); console.log(result); if(result.done) { doneProcessing = true } } console.log("END READ WITH TOTAL LENGTH " + totalLength); //Check if length matches if(encryptedBlob.size === totalLength) { console.log('%cLengths match!', 'color: green'); } } catch (ex) { console.error(ex); } console.log('Decryption completed. Combining chunks.'); // Combine all decrypted chunks into a single Blob return new Blob(decryptedChunks, { type: 'application/octet-stream' }); }
Browser Client Code (HTML):
HTML<!DOCTYPE html> <html lang="en"> <head> <meta charset="UTF-8"> <meta name="viewport" content="width=device-width, initial-scale=1.0"> <title>File Upload & Download</title> </head> <body> <h1>Upload and Download Large Files</h1> <h2>Upload File</h2> <input type="file" id="fileInput"> <button id="uploadButton">Upload</button> <div id="uploadStatus"></div> <h2>Download File</h2> <input type="text" id="downloadFilename" placeholder="Enter file name to download"> <button id="downloadButton">Download</button> <script src="app.js"></script> </body> </html>
Code for client app.js
:
JAVASCRIPT// Utility: Derive AES Key from Password async function deriveKey(password, salt) { const enc = new TextEncoder(); const keyMaterial = await window.crypto.subtle.importKey( 'raw', enc.encode(password), 'PBKDF2', false, ['deriveKey'] ); return window.crypto.subtle.deriveKey( { name: 'PBKDF2', salt: salt, iterations: 100000, hash: 'SHA-256', }, keyMaterial, { name: 'AES-CBC', length: 256 }, false, ['encrypt', 'decrypt'] ); } // Utility: Generate Random Initialization Vector (IV) function generateIV() { return crypto.getRandomValues(new Uint8Array(16)); // 16 bytes for AES-CBC } // Encrypt File in Chunks async function encryptFile(file, password) { const chunkSize = 64 * 1024; // 64 KB const reader = file.stream().getReader(); const enc = new TextEncoder(); // Generate salt and IV const salt = crypto.getRandomValues(new Uint8Array(16)); const iv = generateIV(); // Derive AES key const key = await deriveKey(password, salt); // Create a Blob for the encrypted output const encryptedChunks = []; encryptedChunks.push(salt); // Store salt in the output encryptedChunks.push(iv); // Store IV in the output let done = false; while (!done) { const { value, done: isDone } = await reader.read(); if (value) { const encryptedChunk = await crypto.subtle.encrypt( { name: 'AES-CBC', iv: iv }, key, value ); encryptedChunks.push(new Uint8Array(encryptedChunk)); } done = isDone; } return new Blob(encryptedChunks, { type: 'application/octet-stream' }); } async function decryptFile(encryptedBlob, password) { // Check there is a blob to begin with if (!encryptedBlob) { throw new Error('No encrypted file provided.'); } else { console.log('Encrypted Blob size:', encryptedBlob.size); } const reader = encryptedBlob.stream().getReader(); // Read the first 32 bytes: 16 for salt, 16 for IV const { value: header } = await reader.read(); if (!header) { throw new Error('Failed to read the file header.'); } const salt = header.slice(0, 16); // First 16 bytes are salt const iv = header.slice(16, 32); // Next 16 bytes are IV console.log(`Salt:`, salt); console.log(`IV:`, iv); // Derive AES key const key = await deriveKey(password, salt); console.log(`Key: ${key}`); const decryptedChunks = []; let done = false; // Start reading chunks while (!done) { const { value, done: isDone } = await reader.read(); done = isDone; if (value) { console.log("Value detected!"); try { // Decrypt each chunk const decryptedChunk = await crypto.subtle.decrypt( { name: 'AES-CBC', iv }, key, value.buffer ); decryptedChunks.push(new Uint8Array(decryptedChunk)); } catch (error) { throw new Error('Decryption failed. Check the password or file integrity.'); } } else { console.warn('No value detected!'); } } // Combine all decrypted chunks into a single Blob return new Blob(decryptedChunks, { type: 'application/octet-stream' }); } document.getElementById('uploadButton').addEventListener('click', async () => { const fileInput = document.getElementById('fileInput'); const uploadStatus = document.getElementById('uploadStatus'); const password = prompt('Enter a password to encrypt the file:'); if (!fileInput.files.length || !password) { uploadStatus.textContent = 'Please select a file and enter a password.'; return; } const file = fileInput.files[0]; uploadStatus.textContent = 'Encrypting file...'; try { const encryptedFile = await encryptFile(file, password); const formData = new FormData(); formData.append('file', encryptedFile, `${file.name}.enc`); const response = await fetch('/upload', { method: 'POST', body: formData, }); if (response.ok) { uploadStatus.textContent = 'File uploaded successfully!'; } else { uploadStatus.textContent = 'File upload failed.'; } } catch (err) { console.error('Error encrypting file:', err); uploadStatus.textContent = 'Encryption failed.'; } }); document.getElementById('downloadButton').addEventListener('click', async () => { const filename = document.getElementById('downloadFilename').value; const password = prompt('Enter the password to decrypt the file:'); if (!filename || !password) { alert('Please enter the filename and password.'); return; } const response = await fetch(`/download/${filename}.enc`); if (response.ok) { const encryptedBlob = await response.blob(); try { const decryptedBlob = await decryptFile(encryptedBlob, password); // Trigger download of the decrypted file const a = document.createElement('a'); const url = URL.createObjectURL(decryptedBlob); a.href = url; a.download = filename.replace('.enc', ''); document.body.appendChild(a); a.click(); document.body.removeChild(a); alert('File decrypted and downloaded successfully!'); } catch (err) { alert(err); } } else { alert('Failed to download the file.'); } });
Node.JS Server Code (should not do cryptography; only stores encrypted data):
JAVASCRIPTconst express = require('express'); const fs = require('fs'); const multer = require('multer'); const path = require('path'); const app = express(); const PORT = 3000; // Configure multer for file uploads const upload = multer({ dest: 'uploads/' // Directory to store uploaded files }); // Serve static files for the client-side HTML/JS app.use(express.static(path.join(__dirname, 'public'))); // Endpoint to upload a file app.post('/upload', upload.single('file'), (req, res) => { const tempPath = req.file.path; const targetPath = path.join(__dirname, 'uploads', req.file.originalname); // Move the file to a permanent location const readStream = fs.createReadStream(tempPath); const writeStream = fs.createWriteStream(targetPath); readStream.on('error', (err) => { console.error('Error reading file:', err); res.status(500).send('File upload failed.'); }); writeStream.on('error', (err) => { console.error('Error writing file:', err); res.status(500).send('File upload failed.'); }); writeStream.on('finish', () => { fs.unlink(tempPath, (err) => { if (err) console.error('Failed to delete temp file:', err); }); res.status(200).send('File uploaded successfully!'); }); readStream.pipe(writeStream); }); // Endpoint to download a file app.get('/download/:filename', (req, res) => { const filePath = path.join(__dirname, 'uploads', req.params.filename); if (!fs.existsSync(filePath)) { return res.status(404).send('File not found'); } const readStream = fs.createReadStream(filePath); res.setHeader('Content-Disposition', `attachment; filename="${req.params.filename}"`); res.setHeader('Content-Type', 'application/octet-stream'); readStream.on('error', (err) => { console.error('Error reading file:', err); res.status(500).send('File download failed.'); }); readStream.pipe(res); }); // Start the server app.listen(PORT, () => { console.log(`Server is running at http://localhost:${PORT}`); });
Can anyone help me understand what I’m doing wrong?
Comments 0
•Answers 2
•Views 136
Answer by TychoAI Agent • 1 month ago
Below is one explanation of what might be going wrong and some suggestions on how to fix the issue.
Chunk Boundaries and Block Alignment
AES-CBC requires ciphertext to be decrypted in multiples of the block size (16 bytes). When you encrypt the file in chunks and then try to decrypt each chunk separately, you risk having chunks that are not block-aligned or that are missing the proper chaining context (the IV is updated with each block during encryption). In your code, you use the same IV for every chunk, which only works correctly if every encrypted chunk happens to be self-contained as an independently padded block. In practice, for large files the chunks may not be full blocks and doing a separate decrypt call per chunk results in an “OperationError.”
Reading the Stream
The log message "No Value Detected" happens when the stream returns an empty chunk (result.value is undefined or has a length of zero). For example, after the header is read you might be calling reader.read() again when the stream is already ended (or because some chunk boundaries have merged in unexpected ways). This could be a sign that the header read (which assumes exactly 32 bytes) might not match the underlying chunk boundaries.
There are two primary approaches you can consider:
Buffer the Entire Ciphertext and Decrypt It at Once
Instead of decrypting each chunk separately, you can combine all ciphertext chunks (after the header) into a single ArrayBuffer. Then, call crypto.subtle.decrypt
on the entire ciphertext. For example:
JAVASCRIPTasync function decryptFile(encryptedBlob, password) { if (!encryptedBlob) { throw new Error('No encrypted file provided.'); } else { console.log('Encrypted Blob size:', encryptedBlob.size); } const reader = encryptedBlob.stream().getReader(); // Read the header (first 32 bytes: 16 bytes salt, 16 bytes IV) const { value: header } = await reader.read(); if (!header || header.length < 32) { throw new Error('Failed to read the file header.'); } const salt = header.slice(0, 16); const iv = header.slice(16, 32); console.log('Salt:', Array.from(salt)); console.log('IV:', Array.from(iv)); // Derive the AES key const key = await deriveKey(password, salt); console.log('Key derived successfully.'); // Buffer the rest of the ciphertext const ciphertextChunks = []; let done = false; while (!done) { const { value, done: isDone } = await reader.read(); if (value) { ciphertextChunks.push(value); } done = isDone; } // Combine all chunks into a single ArrayBuffer let totalLength = ciphertextChunks.reduce((acc, chunk) => acc + chunk.length, 0); let ciphertext = new Uint8Array(totalLength); let offset = 0; for (let chunk of ciphertextChunks) { ciphertext.set(chunk, offset); offset += chunk.length; } // Decrypt the entire ciphertext at once let decrypted; try { decrypted = await crypto.subtle.decrypt( { name: 'AES-CBC', iv }, key, ciphertext.buffer ); } catch (error) { throw new Error('Decryption failed. Check the password or file integrity.'); } console.log('Decryption completed. Creating Blob.'); return new Blob([new Uint8Array(decrypted)], { type: 'application/octet-stream' }); }
Pros:
Cons:
Use a Streaming Decryption Approach
If you must handle files larger than available memory, you’ll need to use a cryptographic API or library designed for streaming decryption that handles the chaining between chunks. In AES-CBC you’d need to use the previous ciphertext block as the new IV for the next chunk’s decryption. The Web Crypto API doesn’t provide such a streaming interface out-of-the-box. You might consider using a third-party library or implementing manual logic to:
Note that this requires ensuring that encryption was done in a compatible way (i.e. not encrypting each chunk independently with the same fixed IV).
Header Handling:
Make sure the header is consistently read. If the stream splits the first 32 bytes into multiple chunks, you may have to accumulate chunks until you’ve received the complete header (32 bytes) and then process the remainder as ciphertext.
Error Checking:
Review your log messages and check if the read() call returns a result with no value because the stream is done. Combining read operations might help prevent treating an empty chunk as an error.
Crypto Subtle Limitations:
The “OperationError” you mentioned for large files is likely due to the fact that the decryption expects the chunk to be a complete ciphertext block (or that you are trying to decrypt data that isn’t aligned). Either buffering the whole ciphertext (if the files aren’t too huge) or using a proper streaming method is required.
The primary issue is that decrypting each chunk separately (especially when using AES-CBC) causes block alignment and chaining problems. The simplest fix is to buffer the entire encrypted data (after reading the header) and perform a single decryption operation. If you must work with extremely large files without buffering the whole ciphertext in memory, look into implementing a streaming decryption method or use a library that supports it.
Hope this helps clarify the problem and guide you to a solution!
Tycho is an AI agent, that grounds responses in various sources like documentation, code bases, live discussions, and relevant posts. Want to chat privately with Tycho?
No comments yet.
Answer by AstralProbe120 • 1 month ago
As already explained in the comments, decryption fails because reader.read()
generally generates chunks of different sizes. Therefore, the chunks generated during encryption differ from those generated during decryption. This leads to corrupted data and padding errors during decryption.
To prevent this, it is best to change the chunkwise encryption so that the same ciphertext is generated as if the data were encrypted all at once. This is achieved if:
Since in CBC mode the ciphertext of the previous block is used as IV when encrypting a block the procedure described above results in the same ciphertext as in a one-step encryption.
Your code can be adapted to these requirements e.g. as follows:
JAVASCRIPTasync function encryptFile(file, password) { const reader = file.stream().getReader(); // generate salt and IV const salt = crypto.getRandomValues(new Uint8Array(16)); let iv = crypto.getRandomValues(new Uint8Array(16)); // derive AES key const key = await deriveKey(password, salt); // encryption const encryptedChunks = []; encryptedChunks.push(salt); encryptedChunks.push(iv); let done = false; let buffer = new Uint8Array(); let unprocessedBuffer = null; while (!done) { const { value, done: isDone } = await reader.read(); if (value && value.byteLength > 0) { buffer = new Uint8Array([...buffer, ...value]); } if (!isDone) { // process only complete blocks and buffer incomplete block for next iteration const overhead = buffer.length % 16; if (overhead > 0) { unprocessedBuffer = buffer.slice(-overhead); buffer = buffer.slice(0, buffer.byteLength - overhead); } else { unprocessedBuffer = new Uint8Array(); } // encrypt chunk; // disable padding: since the Webcrypto API does not allow disabling of padding, the last block of the encrypted chunk (which is a pure padding block) is truncated; // set last block of encrypted chunk as IV for the next chunk/iteration if (buffer.byteLength > 0) { const encryptedChunkPadded = new Uint8Array(await crypto.subtle.encrypt({ name: 'AES-CBC', iv: iv }, key, buffer)); const encryptedChunk = encryptedChunkPadded.slice(0, -16); encryptedChunks.push(encryptedChunk); iv = encryptedChunk.slice(-16); } buffer = unprocessedBuffer; } else { // encrypt last chunk, padding enabled const encryptedChunkPadded = new Uint8Array(await crypto.subtle.encrypt({ name: 'AES-CBC', iv: iv }, key, buffer)); encryptedChunks.push(encryptedChunkPadded); } done = isDone; } return new Blob(encryptedChunks, { type: 'application/octet-stream' }); }
A corresponding implementation of the decryption is:
JAVASCRIPTasync function decryptFile(file, password) { const reader = file.stream().getReader(); // get salt and IV let done = false; let buffer = new Uint8Array(); while (buffer.byteLength < 48) { // ciphertext must be at least 16 bytes in size due to padding const { value, done } = await reader.read(); if (value && value.byteLength > 0) { buffer = new Uint8Array([...buffer, ...value]); } } salt = buffer.slice(0, 16); iv = buffer.slice(16, 32); buffer = buffer.slice(32); // derive AES key const key = await deriveKey(password, salt); // decryption done = false; const decryptedChunks = []; while(!done) { const { value, done: isDone } = await reader.read(); if (value && value.byteLength > 0) { buffer = new Uint8Array([...buffer, ...value]); } if (!isDone) { // process all blocks of the chunk except the last block; // even if the last block is complete, its processing is postponed, as this is possibly the last block of the ciphertext, which is only known after the next iteration const overhead = buffer.length % 16; if (overhead > 0) { unprocessedBuffer = buffer.slice(-overhead); buffer = buffer.slice(0, buffer.byteLength - overhead); } else { unprocessedBuffer = buffer.slice(-16); buffer = buffer.slice(0, buffer.byteLength - 16); } // decrypt chunk; // disable padding: the Webrypto API does not allow disabling of padding, which is why a fake padding block must be created; // set last block of encrypted chunk as IV for the next chunk/iteration if (buffer.byteLength > 0) { const bufferPadded = new Uint8Array([...buffer, ...await getPadBlock(buffer.slice(-16), key)]); const decryptedChunk = new Uint8Array(await crypto.subtle.decrypt({ name: 'AES-CBC', iv: iv }, key, bufferPadded)); decryptedChunks.push(decryptedChunk); iv = buffer.slice(-16); } buffer = unprocessedBuffer; } else { // decrypt last chunk, padding enabled const decryptedChunk = new Uint8Array(await crypto.subtle.decrypt({ name: 'AES-CBC', iv: iv }, key, buffer)); decryptedChunks.push(decryptedChunk); } done = isDone; } return new Blob(decryptedChunks, { type: 'application/octet-stream' }); } async function getPadBlock(iv, key) { // encrypt empty plaintext with AES/CBC return new Uint8Array(await crypto.subtle.encrypt({ name: 'AES-CBC', iv: iv }, key, new Uint8Array(0))); }
A note about your code:
In your implementation, you create a blob that contains the entire ciphertext or decrypted data. However, the memory may not be sufficient for large files. The advantage of streaming is that the individual chunks are written directly to the endpoint so that it is not necessary to keep the entire data in memory.
I will not go into more detail, as this is not really the subject of this question. I will only briefly show an example of how the above code can be extended in this sense. The FileSystem API is used for this example to stream the data to a file (see also here).
First, showSaveFilePicker()
is used to create a handle for the file to which the encrypted/decrypted data is to be written. Then createWritable()
is used to create a FileSystemWritableFileStream
, e.g. for encryption:
JAVASCRIPTconst ciphertextFilehandle = await window.showSaveFilePicker(); const writableStream = await ciphertextFilehandle.createWritable();
This then allows the data, i.e. salt, IV and chunks, to be written to the corresponding file with:
JAVASCRIPTawait writableStream.write(...)
At the end, it is important not to forget to close the stream with await writableStream.close()
.
I have successfully tested this for various file sizes up to 5 GB on my machine.
No comments yet.
No comments yet.