Computing file hashes with node.js

  • I ran the code, which now displays number of calls to update(), and made another test with the following code:

       const crypto = require('crypto');
       const fs = require('fs');
       const shasum = crypto.createHash('sha256');
       fs.readFile(filename, function (err, data) {
           shasum.update(data);
           console.log(shasum.digest('hex') + '  ' + filename);
       });
    
    Here are the results:

      $ dd if=/dev/urandom of=xxx.data bs=1000 count=200000
      $ time sha256sum xxx.data
      e774e4c46ab832ec09dbfd1a944044651560c3fdc3c5e2e8b46c2ea7d54f6649  xxx.data
      
      real	0m2.087s
      user	0m2.000s
      sys	0m0.064s
      $ time node xxx-1.js xxx.data
      e774e4c46ab832ec09dbfd1a944044651560c3fdc3c5e2e8b46c2ea7d54f6649  xxx.data
      4883
      
      real	0m13.972s
      user	0m13.885s
      sys	0m0.264s
      $ time node xxx-2.js xxx.data
      e774e4c46ab832ec09dbfd1a944044651560c3fdc3c5e2e8b46c2ea7d54f6649  xxx.data
      
      real	0m14.043s
      user	0m13.433s
      sys	0m0.732s
    
    Personally I never make multiple calls to update(), but that doesn't seem to be the cause of the slowdown here. Also, per the docs, readFile() is async.

    PS: the timings of xxx-1.js and xxx-2.js are equivalent across several runs.
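    For reference, a stream-based variant keeps memory flat and still feeds update() in sizable chunks. This is a minimal sketch, not the poster's actual xxx-1.js or xxx-2.js; the sha256 algorithm and the two-space output format match the transcripts above, everything else (file handling, argument parsing) is assumed:

    ```javascript
    const crypto = require('crypto');
    const fs = require('fs');

    const filename = process.argv[2];
    const shasum = crypto.createHash('sha256');

    // Read the file in chunks; each chunk becomes one update() call.
    const stream = fs.createReadStream(filename);
    stream.on('data', (chunk) => shasum.update(chunk));
    stream.on('end', () => {
        // Same "hash  filename" layout as sha256sum.
        console.log(shasum.digest('hex') + '  ' + filename);
    });
    ```

    Splitting the input across several update() calls yields the same digest as hashing it in one piece, so chunk size only affects memory and call overhead, not the result.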

  • I don't know if it's just me, but I found the results difficult to read and understand. There's too much metadata in there that I don't need to know about. Summing it up with just the actual results would make the information easier to digest.

    No doubt it's useful information, but presented like this the effect is diluted.

  • Useless insanity! You can do the same in the shell and in every programming language worth using.

    Seriously folks...

  • Ouch, that is horrid; you can almost imagine it feeding the data to update() one boxed byte at a time.