[Pkg-javascript-commits] [node-yauzl] 02/03: Imported Upstream version 2.3.1

Andrew Kelley andrewrk-guest@moszumanska.debian.org
Sat May 16 20:03:33 UTC 2015


This is an automated email from the git hooks/post-receive script.

andrewrk-guest pushed a commit to branch master
in repository node-yauzl.

commit 2af71f1841838f8dc9b09c20c2c8437a9b60f690
Author: Andrew Kelley <superjoe30@gmail.com>
Date:   Sat May 16 20:00:33 2015 +0000

    Imported Upstream version 2.3.1
---
 .gitignore                                         |   3 +-
 .travis.yml                                        |   8 +++
 README.md                                          |  34 ++++++++-
 index.js                                           |  76 ++++++++++++---------
 package.json                                       |  13 ++--
 ...size mismatch for stored file 2147483647 5.zip} | Bin
 ...ata overflows file bounds 63 2147483647 308.zip | Bin 308 -> 308 bytes
 ...he stream expected 2048576 got only 1000000.zip | Bin 0 -> 1153 bytes
 ...he stream expected 82496 got at least 98304.zip | Bin 0 -> 1153 bytes
 .../Turmion Katilot/Hoitovirhe/Rautaketju.mp3      |   0
 .../Pirun nyrkki/Mista veri pakenee.mp3            |   0
 test/test.js                                       |  24 ++++++-
 12 files changed, 116 insertions(+), 42 deletions(-)

diff --git a/.gitignore b/.gitignore
index c2658d7..ccc2930 100644
--- a/.gitignore
+++ b/.gitignore
@@ -1 +1,2 @@
-node_modules/
+/coverage
+/node_modules
diff --git a/.travis.yml b/.travis.yml
new file mode 100644
index 0000000..4afc2a1
--- /dev/null
+++ b/.travis.yml
@@ -0,0 +1,8 @@
+language: node_js
+node_js:
+  - "0.10"
+script:
+  - "npm run test-travis"
+after_script:
+  - "npm install coveralls at 2 && cat ./coverage/lcov.info | ./node_modules/.bin/coveralls"
+
diff --git a/README.md b/README.md
index 6c0a1fc..80dce9d 100644
--- a/README.md
+++ b/README.md
@@ -1,5 +1,8 @@
 # yauzl
 
+[![Build Status](https://travis-ci.org/thejoshwolfe/yauzl.svg?branch=master)](https://travis-ci.org/thejoshwolfe/yauzl)
+[![Coverage Status](https://img.shields.io/coveralls/thejoshwolfe/yauzl.svg)](https://coveralls.io/r/thejoshwolfe/yauzl)
+
 yet another unzip library for node. For zipping, see
 [yazl](https://github.com/thejoshwolfe/yazl).
 
@@ -8,6 +11,7 @@ Design principles:
  * Follow the spec.
    Don't scan for local file headers.
    Read the central directory for file metadata.
+   (see [No Streaming Unzip API](#no-streaming-unzip-api)).
  * Don't block the JavaScript thread.
    Use and provide async APIs.
  * Keep memory usage under control.
@@ -129,6 +133,19 @@ If the entry is compressed (with a supported compression method),
 the read stream provides the decompressed data.
 If this zipfile is already closed (see `close()`), the `callback` will receive an `err`.
 
+It's possible for the `readStream` to emit errors for several reasons.
+For example, if zlib cannot decompress the data, the zlib error will be emitted from the `readStream`.
+Two more error cases are if the decompressed data has too many or too few actual bytes
+compared to the reported byte count from the entry's `uncompressedSize` field.
+yauzl notices this false information and emits an error from the `readStream`
+after some number of bytes have already been piped through the stream.
+
+Because of this check, clients can always trust the `uncompressedSize` field in `Entry` objects.
+Guarding against [zip bomb](http://en.wikipedia.org/wiki/Zip_bomb) attacks can be accomplished by
+doing some heuristic checks on the size metadata and then watching out for the above errors.
+Such heuristics are outside the scope of this library,
+but enforcing the `uncompressedSize` is implemented here as a security feature.
+
 #### close()
 
 Causes all future calls to `openReadStream()` to fail,
@@ -212,6 +229,20 @@ be sure to do the following:
 
 ## Limitations
 
+### No Streaming Unzip API
+
+Due to the design of the .zip file format, it's impossible to interpret a .zip file from start to finish
+(such as from a readable stream) without sacrificing correctness.
+The Central Directory, which is the authority on the contents of the .zip file, is at the end of a .zip file, not the beginning.
+A streaming API would need to either buffer the entire .zip file to get to the Central Directory before interpreting anything
+(defeating the purpose of a streaming interface), or rely on the Local File Headers which are interspersed through the .zip file.
+However, the Local File Headers are explicitly denounced in the spec as being unreliable copies of the Central Directory,
+so trusting them would be a violation of the spec.
+
+Any library that offers a streaming unzip API must make one of the above two compromises,
+which makes the library either dishonest or nonconformant (usually the latter).
+This library insists on correctness and adherence to the spec, and so does not offer a streaming API.
+
 ### No Multi-Disk Archive Support
 
 This library does not support multi-disk zip files.
@@ -235,8 +266,7 @@ and encrypted zip files will cause undefined behavior.
 Many unzip libraries mistakenly read the Local File Header data in zip files.
 This data is officially defined to be redundant with the Central Directory information,
 and is not to be trusted.
-There may be conflicts between the Central Directory information and the Local File Header,
-but the Local File Header is always ignored.
+Aside from checking the signature, yauzl ignores the content of the Local File Header.
 
 ### No CRC-32 Checking
 
diff --git a/index.js b/index.js
index e6f8cc9..f82d869 100644
--- a/index.js
+++ b/index.js
@@ -1,8 +1,9 @@
 var fs = require("fs");
 var zlib = require("zlib");
-var FdSlicer = require("fd-slicer");
+var fd_slicer = require("fd-slicer");
 var util = require("util");
 var EventEmitter = require("events").EventEmitter;
+var Transform = require("stream").Transform;
 var PassThrough = require("stream").PassThrough;
 
 exports.open = open;
@@ -41,7 +42,7 @@ function fromFd(fd, options, callback) {
   if (callback == null) callback = defaultCallback;
   fs.fstat(fd, function(err, stats) {
     if (err) return callback(err);
-    var fdSlicer = new FdSlicer(fd, {autoClose: true});
+    var fdSlicer = fd_slicer.createFromFd(fd, {autoClose: true});
     // this ref is unreffed in zipfile.close()
     fdSlicer.ref();
     fromFdSlicer(fdSlicer, stats.size, options, callback);
@@ -51,7 +52,7 @@ function fromFd(fd, options, callback) {
 function fromBuffer(buffer, callback) {
   if (callback == null) callback = defaultCallback;
   // i got your open file right here.
-  var fdSlicer = new FakeFdSlicer(buffer);
+  var fdSlicer = fd_slicer.createFromBuffer(buffer);
   fromFdSlicer(fdSlicer, buffer.length, {}, callback);
 }
 function fromFdSlicer(fdSlicer, totalSize, options, callback) {
@@ -188,6 +189,12 @@ function readEntries(self) {
 
     self.readEntryCursor += 46;
 
+    // validate file size
+    if (entry.compressionMethod === 0) {
+      var msg = "compressed/uncompressed size mismatch for stored file: " + entry.compressedSize + " != " + entry.uncompressedSize;
+      if (entry.compressedSize !== entry.uncompressedSize) return emitErrorAndAutoClose(self, new Error(msg));
+    }
+
     buffer = new Buffer(entry.fileNameLength + entry.extraFieldLength + entry.fileCommentLength);
     readFdSlicerNoEof(self.fdSlicer, buffer, 0, buffer.length, self.readEntryCursor, function(err) {
       if (err) return emitErrorAndAutoClose(self, err);
@@ -267,12 +274,13 @@ ZipFile.prototype.openReadStream = function(entry, callback) {
       // 30 - File name
       // 30+n - Extra field
       var localFileHeaderEnd = entry.relativeOffsetOfLocalHeader + buffer.length + fileNameLength + extraFieldLength;
-      var filterStream = null;
+      var compressed;
       if (entry.compressionMethod === 0) {
         // 0 - The file is stored (no compression)
+        compressed = false;
       } else if (entry.compressionMethod === 8) {
         // 8 - The file is Deflated
-        filterStream = zlib.createInflateRaw();
+        compressed = true;
       } else {
         return callback(new Error("unsupported compression method: " + entry.compressionMethod));
       }
@@ -288,8 +296,14 @@ ZipFile.prototype.openReadStream = function(entry, callback) {
         }
       }
       var stream = self.fdSlicer.createReadStream({start: fileDataStart, end: fileDataEnd});
-      if (filterStream != null) {
-        stream = stream.pipe(filterStream);
+      if (compressed) {
+        var deflateFilter = zlib.createInflateRaw();
+        var checkerStream = new AssertByteCountStream(entry.uncompressedSize);
+        deflateFilter.on("error", function(err) {
+          // forward zlib errors to the client-visible stream
+          checkerStream.emit("error", err);
+        });
+        stream = stream.pipe(deflateFilter).pipe(checkerStream);
       }
       callback(null, stream);
     } finally {
@@ -329,6 +343,28 @@ function readFdSlicerNoEof(fdSlicer, buffer, offset, length, position, callback)
   });
 }
 
+util.inherits(AssertByteCountStream, Transform);
+function AssertByteCountStream(byteCount) {
+  Transform.call(this);
+  this.actualByteCount = 0;
+  this.expectedByteCount = byteCount;
+}
+AssertByteCountStream.prototype._transform = function(chunk, encoding, cb) {
+  this.actualByteCount += chunk.length;
+  if (this.actualByteCount > this.expectedByteCount) {
+    var msg = "too many bytes in the stream. expected " + this.expectedByteCount + ". got at least " + this.actualByteCount;
+    return cb(new Error(msg));
+  }
+  cb(null, chunk);
+};
+AssertByteCountStream.prototype._flush = function(cb) {
+  if (this.actualByteCount < this.expectedByteCount) {
+    var msg = "not enough bytes in the stream. expected " + this.expectedByteCount + ". got only " + this.actualByteCount;
+    return cb(new Error(msg));
+  }
+  cb();
+};
+
 var cp437 = '\u0000☺☻♥♦♣♠•◘○◙♂♀♪♫☼►◄↕‼¶§▬↨↑↓→←∟↔▲▼ !"#$%&\'()*+,-./0123456789:;<=>?@ABCDEFGHIJKLMNOPQRSTUVWXYZ[\\]^_`abcdefghijklmnopqrstuvwxyz{|}~⌂ÇüéâäàåçêëèïîìÄÅÉæÆôöòûùÿÖÜ¢£¥₧ƒáíóúñѪº¿⌐¬½¼¡«»░▒▓│┤╡╢╖╕╣║╗╝╜╛┐└┴┬├─┼╞╟╚╔╩╦╠═╬╧╨╤╥╙╘╒╓╫╪┘┌█▄▌▐▀αßΓπΣσµτΦΘΩδ∞φε∩≡±≥≤⌠⌡÷≈°∙·√ⁿ²■ ';
 function bufferToString(buffer, start, end, isUtf8) {
   if (isUtf8) {
@@ -342,32 +378,6 @@ function bufferToString(buffer, start, end, isUtf8) {
   }
 }
 
-function FakeFdSlicer(buffer) {
-  // pretend that a buffer is an FdSlicer
-  this.buffer = buffer;
-}
-FakeFdSlicer.prototype.read = function(buffer, offset, length, position, callback) {
-  this.buffer.copy(buffer, offset, position, position + length);
-  setImmediate(function() {
-    callback(null, length);
-  });
-};
-FakeFdSlicer.prototype.createReadStream = function(options) {
-  var start = options.start;
-  var end = options.end;
-  var buffer = new Buffer(end - start);
-  this.buffer.copy(buffer, 0, start, end);
-  var stream = new PassThrough();
-  stream.write(buffer);
-  stream.end();
-  return stream;
-};
-// i promise these functions are working properly :|
-FakeFdSlicer.prototype.ref = function() {};
-FakeFdSlicer.prototype.unref = function() {};
-FakeFdSlicer.prototype.on = function() {};
-FakeFdSlicer.prototype.once = function() {};
-
 function defaultCallback(err) {
   if (err) throw err;
 }
diff --git a/package.json b/package.json
index 961df30..11e91f7 100644
--- a/package.json
+++ b/package.json
@@ -1,10 +1,12 @@
 {
   "name": "yauzl",
-  "version": "2.1.0",
+  "version": "2.3.1",
   "description": "yet another unzip library for node",
   "main": "index.js",
   "scripts": {
-    "test": "node test/test.js"
+    "test": "node test/test.js",
+    "test-cov": "istanbul cover test/test.js",
+    "test-travis": "istanbul cover --report lcovonly test/test.js"
   },
   "repository": {
     "type": "git",
@@ -24,7 +26,10 @@
   },
   "homepage": "https://github.com/thejoshwolfe/yauzl",
   "dependencies": {
-    "fd-slicer": "~0.2.1",
-    "pend": "~1.1.3"
+    "fd-slicer": "~1.0.1",
+    "pend": "~1.2.0"
+  },
+  "devDependencies": {
+    "istanbul": "~0.3.4"
   }
 }
diff --git a/test/failure/file data overflows file bounds 63 2147483647 308.zip b/test/failure/compressed uncompressed size mismatch for stored file 2147483647 5.zip
similarity index 100%
copy from test/failure/file data overflows file bounds 63 2147483647 308.zip
copy to test/failure/compressed uncompressed size mismatch for stored file 2147483647 5.zip
diff --git a/test/failure/file data overflows file bounds 63 2147483647 308.zip b/test/failure/file data overflows file bounds 63 2147483647 308.zip
index c64c3fa..5f63af9 100644
Binary files a/test/failure/file data overflows file bounds 63 2147483647 308.zip and b/test/failure/file data overflows file bounds 63 2147483647 308.zip differ
diff --git a/test/failure/not enough bytes in the stream expected 2048576 got only 1000000.zip b/test/failure/not enough bytes in the stream expected 2048576 got only 1000000.zip
new file mode 100644
index 0000000..d2c0b45
Binary files /dev/null and b/test/failure/not enough bytes in the stream expected 2048576 got only 1000000.zip differ
diff --git a/test/failure/too many bytes in the stream expected 82496 got at least 98304.zip b/test/failure/too many bytes in the stream expected 82496 got at least 98304.zip
new file mode 100644
index 0000000..48c0dc5
Binary files /dev/null and b/test/failure/too many bytes in the stream expected 82496 got at least 98304.zip differ
diff --git "a/test/success/unicode/Turmion K\303\244til\303\266t/Hoitovirhe/Rautaketju.mp3" b/test/success/unicode/Turmion Katilot/Hoitovirhe/Rautaketju.mp3
similarity index 100%
rename from "test/success/unicode/Turmion K\303\244til\303\266t/Hoitovirhe/Rautaketju.mp3"
rename to test/success/unicode/Turmion Katilot/Hoitovirhe/Rautaketju.mp3
diff --git "a/test/success/unicode/Turmion K\303\244til\303\266t/Pirun nyrkki/Mist\303\244 veri pakenee.mp3" b/test/success/unicode/Turmion Katilot/Pirun nyrkki/Mista veri pakenee.mp3
similarity index 100%
rename from "test/success/unicode/Turmion K\303\244til\303\266t/Pirun nyrkki/Mist\303\244 veri pakenee.mp3"
rename to test/success/unicode/Turmion Katilot/Pirun nyrkki/Mista veri pakenee.mp3
diff --git a/test/test.js b/test/test.js
index 7dae8b1..b177549 100644
--- a/test/test.js
+++ b/test/test.js
@@ -13,8 +13,15 @@ var pend = new Pend();
 // 1 thing at a time for better determinism/reproducibility
 pend.max = 1;
 
+var args = process.argv.slice(2);
+function shouldDoTest(testPath) {
+  if (args.length === 0) return true;
+  return args.indexOf(testPath) !== -1;
+}
+
 // success tests
 listZipFiles(path.join(__dirname, "success")).forEach(function(zipfilePath) {
+  if (!shouldDoTest(zipfilePath)) return;
   var openFunctions = [
     function(callback) { yauzl.open(zipfilePath, callback); },
     function(callback) { yauzl.fromBuffer(fs.readFileSync(zipfilePath), callback); },
@@ -26,13 +33,16 @@ listZipFiles(path.join(__dirname, "success")).forEach(function(zipfilePath) {
     var DIRECTORY = 1; // not a string
     recursiveRead(".");
     function recursiveRead(name) {
+      // windows support? whatever.
+      var name = name.replace(/\\/g, "/");
+      var key = addUnicodeSupport(name);
       var realPath = path.join(expectedPathPrefix, name);
       if (fs.statSync(realPath).isFile()) {
         if (path.basename(name) !== ".git_please_make_this_directory") {
-          expectedArchiveContents[name] = fs.readFileSync(realPath);
+          expectedArchiveContents[key] = fs.readFileSync(realPath);
         }
       } else {
-        if (name !== ".") expectedArchiveContents[name] = DIRECTORY;
+        if (name !== ".") expectedArchiveContents[key] = DIRECTORY;
         fs.readdirSync(realPath).forEach(function(child) {
           recursiveRead(path.join(name, child));
         });
@@ -102,6 +112,7 @@ listZipFiles(path.join(__dirname, "success")).forEach(function(zipfilePath) {
 
 // failure tests
 listZipFiles(path.join(__dirname, "failure")).forEach(function(zipfilePath) {
+  if (!shouldDoTest(zipfilePath)) return;
   var expectedErrorMessage = path.basename(zipfilePath).replace(/\.zip$/, "");
   var failedYet = false;
   var emittedError = false;
@@ -174,3 +185,12 @@ function listZipFiles(dir) {
   zipfilePaths.sort();
   return zipfilePaths;
 }
+
+function addUnicodeSupport(name) {
+  // reading and writing unicode filenames on mac is broken.
+  // we keep all our test data ascii, and then swap in the real names here.
+  // see https://github.com/thejoshwolfe/yauzl/issues/10
+  name = name.replace(/Turmion Katilot/g, "Turmion Kätilöt");
+  name = name.replace(/Mista veri pakenee/g, "Mistä veri pakenee");
+  return name;
+}
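
For readers of this archive, a minimal usage sketch of the error handling described in the README changes above, assuming the documented yauzl 2.3.x API (open, the "entry" event, openReadStream, entry.uncompressedSize); the archive name, size threshold, and output paths are illustrative, not part of the commit:

var fs = require("fs");
var path = require("path");
var yauzl = require("yauzl");

// Illustrative limit; pick a threshold appropriate for your application.
var MAX_UNCOMPRESSED_SIZE = 100 * 1024 * 1024;

yauzl.open("example.zip", function(err, zipfile) {
  if (err) throw err;
  zipfile.on("entry", function(entry) {
    if (/\/$/.test(entry.fileName)) return; // skip directory entries
    // heuristic check on the size metadata before reading
    if (entry.uncompressedSize > MAX_UNCOMPRESSED_SIZE) {
      console.warn("skipping oversized entry: " + entry.fileName);
      return;
    }
    zipfile.openReadStream(entry, function(err, readStream) {
      if (err) throw err;
      // zlib errors and the new byte-count mismatch errors are emitted here
      readStream.on("error", function(err) {
        console.error("error in " + entry.fileName + ": " + err.message);
      });
      readStream.pipe(fs.createWriteStream(path.basename(entry.fileName)));
    });
  });
});

Because yauzl now enforces uncompressedSize (emitting the "too many bytes"/"not enough bytes" errors added in this commit), the size check above can serve as a first-line guard against zip bombs.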

-- 
Alioth's /usr/local/bin/git-commit-notice on /srv/git.debian.org/git/pkg-javascript/node-yauzl.git


