Check and MD5 a file before uploading it with jQuery File Upload

jquery file upload jquery-file-upload blueimp


I am using the https://github.com/blueimp/jQuery-File-Upload library to upload files to the server. Before the upload starts, however, I want to compute the file's MD5 so I can send an AJAX request that checks for duplicate files. Is there any way to compute a file's MD5 before uploading it? Thanks and best regards.

Asked by boyxmen006 on 15 September 2017

1 Answer



Summary:

  • Add the spark-md5 code inside the 'add' callback, before calling data.submit(), which is what starts the upload.
  • You can also do other checks there, such as a file size check (see the sketch right after this list).
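A minimal sketch of such a size check inside the add callback is shown below; the 10 MB limit is an arbitrary example value and not something taken from the original answer:

      $('#fileupload').fileupload({
          add: function (e, data) {
              var file = data.files[0],
                  maxBytes = 10 * 1024 * 1024; // assumed example limit: 10 MB
              if (file.size > maxBytes) {
                  console.log(file.name + " is too large, skipping upload");
                  return; // data.submit() is never called, so nothing is uploaded
              }
              data.submit(); // within the limit: start the upload
          }
      });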

My stack:

Console log result:

uploading adobe_flash_setup_0906278883.exe 4522ae4ce9ee143b5b18dfa4a51b01b6
file name: adobe_flash_setup_0906278883.exe (1,518,959 bytes)
read chunk number 1 of 1
finished loading :)
computed hash: 3f38a0468b52a38c34385201de4746b0
placeholder call for data.submit();

Before my <script> tag:

<script src="https://unpkg.com/jquery@3.2.1/dist/jquery.min.js"></script>
<script src="https://unpkg.com/blueimp-file-upload@9.19.1/js/vendor/jquery.ui.widget.js"></script>
<script src="https://unpkg.com/blueimp-file-upload@9.19.1/js/jquery.iframe-transport.js"></script>
<script src="https://unpkg.com/blueimp-file-upload@9.19.1/js/jquery.fileupload.js"></script>
<script src="https://unpkg.com/spark-md5@3.0.0/spark-md5.min.js"></script>

Inside my <script> tag:

      $('#fileupload').fileupload({
                  url: 'https://mywebsite/blahblahblahblahblah',
                  paramName: '_file',
                  dataType: 'json',
                  type: 'POST',
                  autoUpload: true,
                  add: function(e, data) {
                      // _hashID is assumed to be defined elsewhere in the answerer's stack; it is not part of this snippet
                      console.log('uploading', data.files[0].name, _hashID);


                      var blobSlice = File.prototype.slice || File.prototype.mozSlice || File.prototype.webkitSlice,
                          file = data.files[0],
                          chunkSize = 2097152, // read in chunks of 2MB
                          chunks = Math.ceil(file.size / chunkSize),
                          currentChunk = 0,
                          spark = new SparkMD5.ArrayBuffer(),
                          frOnload = function(e) {
                              console.log("\nread chunk number " + (currentChunk + 1) + " of " + chunks);
                              spark.append(e.target.result); // append array buffer
                              currentChunk++;
                              if (currentChunk < chunks) {
                                  loadNext();
                              } else {
                                  // last chunk appended: the hash is complete, so start the upload
                                  console.log("\nfinished loading :)\n\ncomputed hash:\n" + spark.end());
                                  console.log("placeholder call for data.submit();");
                                  data.submit();
                              }
                          },
                          frOnerror = function() {
                              console.log("\noops, something went wrong.");
                          };

                      function loadNext() {
                          var fileReader = new FileReader();
                          fileReader.onload = frOnload;
                          fileReader.onerror = frOnerror;
                          var start = currentChunk * chunkSize,
                              end = ((start + chunkSize) >= file.size) ? file.size : start + chunkSize;
                          fileReader.readAsArrayBuffer(blobSlice.call(file, start, end));
                      };
                      console.log("file name: " + file.name + " (" + file.size.toString().replace(/\B(?=(?:\d{3})+(?!\d))/g, ',') + " bytes)\n");
                      loadNext();
                  },
                  progress: function (e, data) {
                      // usual stuff
                  },
                  done: function (e, data) {
                      // usual stuff
                  }
      });
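The code above only logs the computed hash. To actually check for duplicates, as the question asks, the hash could be POSTed to the server before submitting. The sketch below shows what the body of the else branch in frOnload could look like; the /check-duplicate URL and the { duplicate: true } response shape are assumptions for illustration, not part of the original answer:

      // all chunks read: finalize the hash, ask the server whether it already has this file,
      // and only call data.submit() when it does not (hypothetical endpoint and response shape)
      var computedHash = spark.end();
      console.log("finished loading, computed hash: " + computedHash);
      $.ajax({
          url: 'https://mywebsite/check-duplicate', // assumed endpoint, not from the original answer
          type: 'POST',
          dataType: 'json',
          data: { md5: computedHash, name: file.name, size: file.size }
      }).done(function (response) {
          if (response.duplicate) {
              console.log('duplicate file detected, skipping upload');
          } else {
              data.submit(); // no duplicate found: start the actual upload
          }
      }).fail(function () {
          console.log('duplicate check failed, uploading anyway');
          data.submit();
      });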
Answered by xemasiv on 22 October 2017