# SparkMD5
SparkMD5 is a fast JavaScript implementation of the MD5 algorithm.
This script is based on the JKM md5 library, which is the [fastest](http://jsperf.com/md5-shootout/7) implementation around. It is best suited for browser usage, since a native `nodejs` implementation might be faster.
NOTE: Please disable Firebug while performing the test!
Firebug consumes a lot of memory and CPU, and slows the test down by a great margin.
**[Demo](http://9px.ir/demo/incremental-md5.html)**
## Install
```sh
npm install --save spark-md5
```
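Once installed, the library can be loaded via CommonJS or AMD (see the improvements list below). A minimal CommonJS sketch, assuming the package's main export is the `SparkMD5` constructor:

```js
// Minimal CommonJS sketch (assumes the package's main export is the
// SparkMD5 constructor; in browsers, a SparkMD5 global is used instead).
var SparkMD5 = require('spark-md5');

console.log(SparkMD5.hash('Hi there')); // hex md5 of the string
```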
## Improvements over the JKM md5 library
* Strings are converted to UTF-8, like most server-side implementations
* Fixed computation for large amounts of data (overflow)
* Incremental md5 (see below)
* Support for array buffers (typed arrays)
* Functionality wrapped in a closure, to avoid global assignments
* Object-oriented library
* CommonJS (it can be used in node) and AMD integration
* Code passed through JSHint and JSCS
Incremental md5 performs much better when hashing large amounts of data, such as
files. You can read a file in chunks, using the FileReader and Blob APIs, and append
each chunk to the md5 computation while keeping memory usage low. See the example below.
## Usage
### Normal usage
```js
var hexHash = SparkMD5.hash('Hi there'); // hex hash
var rawHash = SparkMD5.hash('Hi there', true); // OR raw hash (binary string)
```
### Incremental usage
```js
var spark = new SparkMD5();
spark.append('Hi');
spark.append(' there');
var hexHash = spark.end(); // hex hash
var rawHash = spark.end(true); // OR raw hash (binary string)
```
### Hash a file incrementally
NOTE: If you test the code below using the `file://` protocol in Chrome, you must start the browser with the `--allow-file-access-from-files` argument.
Please see: http://code.google.com/p/chromium/issues/detail?id=60889
```js
document.getElementById('file').addEventListener('change', function () {
    var blobSlice = File.prototype.slice || File.prototype.mozSlice || File.prototype.webkitSlice,
        file = this.files[0],
        chunkSize = 2097152,                             // Read in chunks of 2MB
        chunks = Math.ceil(file.size / chunkSize),
        currentChunk = 0,
        spark = new SparkMD5.ArrayBuffer(),
        fileReader = new FileReader();

    fileReader.onload = function (e) {
        console.log('read chunk nr', currentChunk + 1, 'of', chunks);
        spark.append(e.target.result);                   // Append array buffer
        currentChunk++;

        if (currentChunk < chunks) {
            loadNext();
        } else {
            console.log('finished loading');
            console.info('computed hash', spark.end());  // Compute hash
        }
    };

    fileReader.onerror = function () {
        console.warn('oops, something went wrong.');
    };

    function loadNext() {
        var start = currentChunk * chunkSize,
            end = ((start + chunkSize) >= file.size) ? file.size : start + chunkSize;

        fileReader.readAsArrayBuffer(blobSlice.call(file, start, end));
    }

    loadNext();
});
```
You can see some more examples in the test folder.
## Documentation
### SparkMD5 class
#### SparkMD5#append(str)
Appends a string, encoding it to UTF-8 if necessary.
#### SparkMD5#appendBinary(str)
Appends a binary string (e.g.: string returned from the deprecated [readAsBinaryString](https://developer.mozilla.org/en-US/docs/Web/API/FileReader/readAsBinaryString)).
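A small sketch of pairing this with `readAsBinaryString` (the `file` variable is hypothetical, e.g. obtained from an `<input type="file">` element):

```js
// Sketch only: `file` is assumed to be a File obtained elsewhere
// (e.g. from an <input type="file"> change event).
var reader = new FileReader();

reader.onload = function (e) {
    var spark = new SparkMD5();

    spark.appendBinary(e.target.result); // binary string from readAsBinaryString
    console.log(spark.end());            // hex hash
};

reader.readAsBinaryString(file); // deprecated; prefer readAsArrayBuffer (see above)
```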
#### SparkMD5#end(raw)
Finishes the computation of the md5, returning the hex result.
If `raw` is true, the result is returned as a binary string instead.
#### SparkMD5#reset()
Resets the internal state of the computation.
#### SparkMD5#getState()
Returns an object representing the internal computation state.
You can pass this state to `setState()`. This is useful for resuming an incremental md5 computation.
#### SparkMD5#setState(state)
Sets the internal computation state. See: getState().
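A small sketch of pausing and resuming an incremental hash with these two methods (how you store the state in between is up to you):

```js
// Sketch: capture the state of an unfinished computation and resume it later.
var spark = new SparkMD5();
spark.append('Hi');

var state = spark.getState(); // plain object; could be persisted elsewhere

// ...later, possibly in a fresh instance...
var resumed = new SparkMD5();
resumed.setState(state);
resumed.append(' there');

console.log(resumed.end()); // same result as hashing 'Hi there' in one go
```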
#### SparkMD5#destroy()
Releases memory used by the incremental buffer and other additional resources.
#### SparkMD5.hash(str, raw)
Hashes a string directly, returning the hex result.
If `raw` is true, the result is returned as a binary string instead.
Note that this function is `static`.
#### SparkMD5.hashBinary(str, raw)
Hashes a binary string directly (e.g.: string returned from the deprecated [readAsBinaryString](https://developer.mozilla.org/en-US/docs/Web/API/FileReader/readAsBinaryString)), returning the hex result.
If `raw` is true, the result is returned as a binary string instead.
Note that this function is `static`.
### SparkMD5.ArrayBuffer class
#### SparkMD5.ArrayBuffer#append(arr)
Appends an array buffer.
#### SparkMD5.ArrayBuffer#end(raw)
Finishes the computation of the md5, returning the hex result.
If `raw` is true, the result is returned as a binary string instead.
#### SparkMD5.ArrayBuffer#reset()
Resets the internal state of the computation.
#### SparkMD5.ArrayBuffer#destroy()
Releases memory used by the incremental buffer and other additional resources.
#### SparkMD5.ArrayBuffer#getState()
Returns an object representing the internal computation state.
You can pass this state to `setState()`. This is useful for resuming an incremental md5 computation.
#### SparkMD5.ArrayBuffer#setState(state)
Sets the internal computation state. See: getState().
#### SparkMD5.ArrayBuffer.hash(arr, raw)
Hashes an array buffer directly, returning the hex result.
If `raw` is true, the result is returned as a binary string instead.
Note that this function is `static`.
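For example, a short sketch hashing a typed array's underlying buffer in one call (the sample bytes simply spell "Hi there"):

```js
// Sketch: hash an ArrayBuffer in a single static call.
var bytes = new Uint8Array([72, 105, 32, 116, 104, 101, 114, 101]); // "Hi there"

console.log(SparkMD5.ArrayBuffer.hash(bytes.buffer));       // hex hash
console.log(SparkMD5.ArrayBuffer.hash(bytes.buffer, true)); // raw hash (binary string)
```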
## License
The project is dual-licensed, with [WTF2](./LICENSE) as the master license and [MIT](./LICENSE2) as the alternative license.
The reason for having two licenses is that some entities refuse to use the master license (WTF2) due to
its crude language. If that's also your case, you can choose the alternative license instead.
## Credits
[Joseph Myers](http://www.myersdaily.org/joseph/javascript/md5-text.html)