Uploading large files in chunks in ASP.NET MVC C# from JavaScript AJAX

We often need to upload files in an ASP.NET MVC C# application, but uploading a large file in one go raises several challenges: the UI can become unresponsive, and if the network fluctuates for a moment the upload breaks and the user has to start all over again.

In such cases, uploading the large file in chunks is the best approach and has several advantages:
  • You can show upload progress.
  • If the upload fails, you can resume it by uploading only the rest of the file.
  • You can also implement pause/resume functionality with a little extra code (not covered in this article).
Here I will show the whole process of uploading files in small chunks in an MVC C# application. For plain ASP.NET there are no major changes; you just need to implement the same code in an .ashx handler.

The given code covers four parts:
  • Drag and drop files
  • Slice the file (create chunks) using JavaScript
  • Upload these chunks (file blobs) to the server via AJAX requests (one chunk at a time)
  • Create the file and merge the uploaded chunks into it in MVC C#

Drag and drop multiple files onto the web page

$(document).ready(function () {    
  $('#dvDragFiles').on({
    'dragover dragenter': function (e) {
        $(e.currentTarget).css({ opacity: 0.5 });
        e.preventDefault();
        e.stopPropagation();
    },
    'drop': function (e) {
        console.log("Drop");
        //Upload file here
        onFileDrop(e);
        e.preventDefault();
        e.stopPropagation();
    },
    'dragexit dragend dragleave': function (e) {
        $(e.currentTarget).css({ opacity: 1 });
    }
  });

  function onFileDrop(evt) {
    $(evt.currentTarget).css({ opacity: 1 });
    try {
        var files = evt.originalEvent.dataTransfer.files;
        for (var i = 0; i < files.length; i++) {
            uploadFile(files[i]);
        }
    } catch (e) {
        //Ignore drops that do not contain files
    }
  }

});

Slice each file into small chunks; the chunk size is set by the "maxFileSizeKB" variable (100 KB by default in the given code, change it as required) and the chunks are stored in an array.

To make the file name unique and avoid overwriting existing data, a random number is appended to the file name; on the server you can strip this random number off whenever the original file name is needed.
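Below is a minimal C# sketch (not part of the article's code) of how the original name could be recovered on the server, assuming the client prepends the random number followed by an underscore as in the uploadFile function further down; the helper name is hypothetical.

private string GetOriginalFileName(string uniqueFileName)
{
    //The client builds names like "0.123456_report.pdf", so everything up to
    //and including the first underscore is the random prefix (assumption)
    int separatorIndex = uniqueFileName.IndexOf('_');

    //If no underscore is found the name was not prefixed; return it unchanged
    return separatorIndex < 0 ? uniqueFileName : uniqueFileName.Substring(separatorIndex + 1);
}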

Slice the file into small chunks (file blobs) using JavaScript

function uploadFile(file) {
    //max file chunk size set to 100 KB; change as per requirement.
    var maxFileSizeKB = 100;

    var fileChunks = [];
    var bufferChunkSizeInBytes = maxFileSizeKB * (1024);

    var currentStreamPosition = 0;
    var endPosition = bufferChunkSizeInBytes;
    var size = file.size;

    while (currentStreamPosition < size) {
        fileChunks.push(file.slice(currentStreamPosition, endPosition));
        currentStreamPosition = endPosition;
        endPosition = currentStreamPosition + bufferChunkSizeInBytes;
    }

    //Append random number to file name to make it unique
    var fileName = Math.random() + "_" + file.name;
    uploadFileChunk(fileChunks, fileName, 1, fileChunks.length);

}

Pass this array of chunks, the current part index, and the unique file name to the function below; the "uploadFileChunk" function calls itself recursively until the whole file is uploaded.

Below are two methods to upload the chunks to the server; our example follows the first one:
  • Proceed to upload the next chunk only when the previous chunk has been uploaded (i.e. a success response is received from the server).
  • Send all chunk upload requests at once and then wait for the response of each request. In this approach you need to append the total number of parts and the current part number to the file name, save each chunk as a separate file, and once all chunks are uploaded, merge them into one file and delete the separate chunk files (a server-side sketch of this merge step follows this list).
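The following is a minimal, hedged C# sketch of that merge step for the second approach. It assumes the chunks were saved as separate files using a hypothetical "<partNumber>.part_<fileName>" naming convention; the method name and naming scheme are illustrative only, not taken from the article's code.

private void MergeChunkFiles(string uploadPath, string fileName, int totalParts)
{
    string finalFilePath = Path.Combine(uploadPath, fileName);

    //Append every saved chunk file, in part order, into the final file
    using (FileStream output = new FileStream(finalFilePath, FileMode.Create))
    {
        for (int part = 1; part <= totalParts; part++)
        {
            string chunkFilePath = Path.Combine(uploadPath, part + ".part_" + fileName);

            using (FileStream chunk = File.OpenRead(chunkFilePath))
            {
                chunk.CopyTo(output);
            }
        }
    }

    //Delete the per-chunk files once the merged file has been written
    for (int part = 1; part <= totalParts; part++)
    {
        File.Delete(Path.Combine(uploadPath, part + ".part_" + fileName));
    }
}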

Uploading a file chunk to the server (ASP.NET MVC C#) via a JavaScript AJAX request

function uploadFileChunk(fileChunks, fileName, currentPart, totalPart) {
    var formData = new FormData();
    formData.append('file', fileChunks[currentPart - 1], fileName);

    $.ajax({
        type: "POST",
        url: '/FileManagement/UploadFileChunks',
        contentType: false,
        processData: false,
        data: formData,
        success: function (data) {
            if (totalPart >= currentPart) {
                console.log("uploading file part no: " + currentPart, " out of " + totalPart);
                if (data.status == true) {
                    if (totalPart == currentPart) {
                        //Whole file uploaded
                        console.log("whole file uploaded successfully");
                    } else {
                        //Show uploading progress
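                        //One possible progress value (sketch): the share of chunks
                        //confirmed uploaded so far; how it is displayed is up to the page
                        var percentUploaded = Math.round((currentPart / totalPart) * 100);
                        console.log("upload progress: " + percentUploaded + "%");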
                        uploadFileChunk(fileChunks, fileName, currentPart + 1, totalPart);
                    }
                } else {
                    //retry message to upload rest of the file
                    console.log("failed to upload file part no: " + currentPart);
                }
            }
        },
        error: function () {
            //retry message to upload rest of the file
            console("error to upload file part no: " + currentPart);
        }
    });

}

On the server the uploaded chunks are merged into the file: in the code below, opening the FileStream with FileMode.Append creates the file when the first chunk arrives and appends each subsequent chunk to it.

Server-side code to merge the uploaded file chunks

using System;
using System.IO;
using System.Web.Mvc;

//The controller class name must end with "Controller" so that the
//"/FileManagement/UploadFileChunks" URL used in the AJAX call resolves
public class FileManagementController : Controller
{
  [HttpPost]
  public JsonResult UploadFileChunks()
  {
    var files = Request.Files;

    if (files.Count > 0)
    {
        try
        {
            string filePath = Path.Combine(GetUploadPath(), files[0].FileName);

            using (FileStream fs = new FileStream(filePath, FileMode.Append))
            {
                var bytes = GetBytes(files[0].InputStream);
                fs.Write(bytes, 0, bytes.Length);
            }

            return Json(new { status = true });
        }
        catch (Exception ex)
        {
            return Json(new { status = false, message = ex.Message });
        }
    }

    return Json(new { status = false });
  }

  private byte[] GetBytes(Stream input)
  {
    using (MemoryStream ms = new MemoryStream())
    {
        //Copy the chunk's input stream into memory and return it as a byte array
        input.CopyTo(ms);
        return ms.ToArray();
    }
  }

  private string GetUploadPath()
  {
    var rootPath = System.Web.Hosting.HostingEnvironment.MapPath("~/UploadedFiles/" + Session.SessionID);

    if (!Directory.Exists(rootPath))
    {
        Directory.CreateDirectory(rootPath);
    }

    return rootPath;
  }
}
