Uploading large files via HTTP can be a challenge: long-running requests time out and users get frustrated.
Additionally, large uploads may not be possible if your hosting provider imposes HTTP file upload size restrictions (most do).
The solution is “chunking”.
Chunking involves breaking a single large file into multiple smaller pieces and uploading those pieces to the server one at a time. The chunks are then stitched back together on the server to recreate the original file.
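Conceptually, the chunking math is just a list of byte ranges. Here's a minimal sketch of the idea (the `makeChunkRanges` helper name is ours for illustration, not part of the code below):

```javascript
// Hypothetical helper: compute the [start, end) byte ranges that
// file.slice() would be called with, using a 1 MB chunk size.
const makeChunkRanges = (fileSize, chunkSize = 1024 * 1024) => {
  const ranges = [];
  for (let start = 0; start < fileSize; start += chunkSize) {
    ranges.push([start, Math.min(start + chunkSize, fileSize)]);
  }
  return ranges;
};

// A 2.5 MB file produces three chunks: 1 MB, 1 MB, and 0.5 MB.
console.log(makeChunkRanges(2.5 * 1024 * 1024).length); // 3
```

Note that the last range is clamped to the file size, so the final chunk is usually smaller than the others.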
We’ll need to create some JavaScript to break the file into chunks and then upload each chunk one by one.
We’ll start with the HTML form:
```html
<html>
<head>
    <title>Upload A Large File</title>
</head>
<body>
    <form action="path/to/upload/page" id="form">
        <input type="file" name="file" id="file"/>
        <input type="button" value="Upload" id="submit"/>
        <input type="button" value="Cancel" id="cancel"/>
    </form>
    <!-- JS Here -->
</body>
</html>
```
Now add the JavaScript.
Get references to our HTML buttons:
```javascript
// Buttons
const submitButton = document.getElementById("submit");
const cancelButton = document.getElementById("cancel");
```
Setup the tracking variables:
```javascript
// Array to store the chunks of the file
let chunks = [];
// Interval that drives the upload loop
let uploadTimeout;
// Boolean to check if a chunk is currently being uploaded
let uploading = false;
// Maximum number of retries for a chunk
const maxRetries = 3;
// AbortController to cancel the in-flight fetch request
let controller;
```
Create some utility functions:
```javascript
// Function to cancel the upload and abort the fetch requests
const cancelUpload = () => {
    clearUpload();
    if (controller) controller.abort();
    enableUpload();
};
// Function to clear/reset the upload variables
const clearUpload = () => {
    clearInterval(uploadTimeout);
    uploading = false;
    chunks = [];
};
// Function to enable the submit button
const enableUpload = () => {
    submitButton.removeAttribute("disabled");
    submitButton.setAttribute("value","Upload");
};
// Function to disable the submit button
const disableUpload = () => {
    submitButton.setAttribute("disabled","disabled");
    submitButton.setAttribute("value","Uploading...");
};
```
Add the cancel button event listener:
```javascript
// Event listener for the cancel button
cancelButton.addEventListener("click",() => {
    cancelUpload();
});
```
Add the submit button event listener:
```javascript
// Event listener for the submit button
submitButton.addEventListener("click",() => {
    const fileInput = document.getElementById("file");
    if (fileInput.files.length) {
        disableUpload();
        const file = fileInput.files[0];
        // Break the file into 1MB chunks
        const chunkSize = 1024 * 1024;
        const totalChunks = Math.ceil(file.size / chunkSize);
        // Build the chunk array with the start and end bytes that will be
        // sliced from the file and uploaded
        let startByte = 0;
        for (let i = 1; i <= totalChunks; i++) {
            const endByte = Math.min(startByte + chunkSize,file.size);
            // [chunk number, start byte, end byte, uploaded, retry count, error, promise]
            chunks.push([i,startByte,endByte,null,0,null,null]);
            startByte = endByte;
        }
        // Begin uploading the chunks one after the other
        uploadTimeout = setInterval(() => {
            if (!uploading) {
                // Prevent the next interval from starting until the current chunk is uploaded
                uploading = true;
                // Upload the first chunk that hasn't already been uploaded
                for (let i = 0; i < chunks.length; i++) {
                    if (!chunks[i][3] && chunks[i][4] < maxRetries) {
                        // Get the binary chunk (start byte / end byte)
                        const chunk = file.slice(chunks[i][1],chunks[i][2]);
                        const formData = new FormData();
                        formData.append("chunk",chunk);
                        formData.append("chunknumber",chunks[i][0]);
                        formData.append("totalchunks",totalChunks);
                        formData.append("filename",file.name);
                        // Create an AbortController to cancel the fetch request
                        // Need a new one for each fetch request
                        controller = new AbortController();
                        fetch(document.getElementById("form").getAttribute("action"), {
                            method: "POST",
                            body: formData,
                            signal: controller.signal
                        }).then(response => {
                            if (response.status == 413) {
                                cancelUpload();
                                alert("The chunk is too large. Try a smaller chunk size.");
                            } else {
                                response.json().then(data => {
                                    if (data && data.success) {
                                        // Uploaded successfully
                                        chunks[i][3] = true; // Mark the chunk as uploaded
                                        uploading = false; // Allow the next chunk to be uploaded
                                        if (chunks[i][0] == totalChunks) {
                                            // Last chunk uploaded successfully
                                            clearUpload();
                                            enableUpload();
                                            alert("File uploaded successfully");
                                        }
                                    } else {
                                        // Error uploading
                                        cancelUpload();
                                        alert("Error uploading chunk. Check the browser debug window for details.");
                                        console.error("Unhandled error uploading chunk");
                                    }
                                }).catch(error => {
                                    // Error parsing the response
                                    cancelUpload();
                                    alert("Error uploading chunk. Check the browser debug window for details.");
                                    console.error("Error:",error);
                                });
                            }
                        }).catch(error => {
                            // Error uploading
                            chunks[i][4]++; // Increment the retry count
                            chunks[i][5] = error; // Store the error
                            uploading = false; // Allow retrying this chunk
                            console.error("Error:",error);
                        });
                        break;
                    }
                }
            }
        },1000);
    } else {
        alert("No file selected.");
    }
});
```
In the code above we store the start and end bytes of each file chunk in an array we can loop through. The loop runs on an interval in the background so it doesn't block the browser (Web Workers would also work here, but I'm not very familiar with them). On each interval tick we check that no chunk is currently uploading, find the next chunk that hasn't been uploaded, slice its byte range from the file, and upload it.
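The "find the next chunk" step boils down to a single lookup over the chunk array. A hedged sketch (the `nextPendingChunk` helper is illustrative, not part of the listing above):

```javascript
// Each entry is [chunk number, start byte, end byte, uploaded, retry count, error, promise].
// Find the index of the first chunk that hasn't uploaded and has retries left.
const nextPendingChunk = (chunkList, maxRetries = 3) =>
  chunkList.findIndex((c) => !c[3] && c[4] < maxRetries);

const sample = [
  [1, 0, 1048576, true, 0, null, null],        // already uploaded
  [2, 1048576, 2097152, null, 3, "err", null], // exhausted its retries
  [3, 2097152, 2621440, null, 1, null, null],  // next to upload
];
console.log(nextPendingChunk(sample)); // 2 (array index of chunk number 3)
```

When `findIndex` returns -1 there is nothing left to upload, which is effectively what ends the interval loop in the listing above.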
Now we need to create the ColdFusion server-side code to accept the binary chunks and then stitch them all together into a complete file.
```cfml
<cfscript>
if (compareNoCase(cgi.request_method, "post") eq 0) {
    /* ColdFusion takes the binary data supplied in the form, writes it to a
       temporary file on the server, and provides the path to that temp file
       in the form.chunk variable. */
    binaryChunk = fileReadBinary(form.chunk);
    /* Make sure to clean up the temporary file because we are still ultimately
       going to be uploading a large file to the server. We don't want the
       server drive that holds the ColdFusion temp directory to fill up. */
    fileDelete(form.chunk);
    uploadedFile = expandPath("path/to/uploads/#form.filename#");
    // Overwrite the file if it already exists
    if (form.chunknumber eq 1 AND fileExists(uploadedFile)) fileDelete(uploadedFile);
    // Now we can append the binary data to the file we are uploading to the server
    uploadFile = fileOpen(uploadedFile,"append");
    // Write the binary data to the file
    fileWrite(uploadFile,binaryChunk);
    // Close the file
    fileClose(uploadFile);
}
</cfscript>
<cfcontent type="application/json" reset="true"><cfoutput>#serializeJSON({"success":true})#</cfoutput>
```
On the server side, ColdFusion accepts the binary chunk provided in the form variable and writes it to a temporary file in the ColdFusion temp directory. It's important to be aware of this because we can quickly fill up the server's drive with thousands of 1MB binary files. The code above reads the temporary file and then deletes it. The binary data is then appended to the uploaded file.
We return a JSON response with a success flag so the client-side JavaScript will upload the next chunk.
That’s it. Pretty simple.
It’s possible to expand on this example and provide things like progress and status to further enhance the user experience.
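For example, a progress indicator can be derived directly from the existing chunks array. This sketch assumes the same array layout as above, with the uploaded flag at index 3 of each entry:

```javascript
// Percentage of chunks marked as uploaded (index 3 in each chunk entry).
const progressPercent = (chunkList) => {
  if (!chunkList.length) return 0;
  const done = chunkList.filter((c) => c[3]).length;
  return Math.round((done / chunkList.length) * 100);
};

// With 2 of 4 chunks uploaded the UI would show 50%.
const demo = [
  [1, 0, 0, true, 0, null, null],
  [2, 0, 0, true, 0, null, null],
  [3, 0, 0, null, 0, null, null],
  [4, 0, 0, null, 0, null, null],
];
console.log(progressPercent(demo)); // 50
```

Calling this after each successful chunk upload and writing the result into a progress bar element would be enough for a basic status display.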
Click here to download the source files.
An information technology professional with twenty-five years of experience in systems administration, computer programming, requirements gathering, customer service, and technical support.